Column schema of the split (ranges are observed min/max; "nullable" marks columns that may be empty):

| Column | Type |
|---|---|
| hexsha | string (length 40) |
| size | int64 (6 to 14.9M) |
| ext | string (1 class) |
| lang | string (1 class) |
| max_stars_repo_path | string (length 6 to 260) |
| max_stars_repo_name | string (length 6 to 119) |
| max_stars_repo_head_hexsha | string (length 40 to 41) |
| max_stars_repo_licenses | list |
| max_stars_count | int64 (1 to 191k), nullable |
| max_stars_repo_stars_event_min_datetime | string (length 24), nullable |
| max_stars_repo_stars_event_max_datetime | string (length 24), nullable |
| max_issues_repo_path | string (length 6 to 260) |
| max_issues_repo_name | string (length 6 to 119) |
| max_issues_repo_head_hexsha | string (length 40 to 41) |
| max_issues_repo_licenses | list |
| max_issues_count | int64 (1 to 67k), nullable |
| max_issues_repo_issues_event_min_datetime | string (length 24), nullable |
| max_issues_repo_issues_event_max_datetime | string (length 24), nullable |
| max_forks_repo_path | string (length 6 to 260) |
| max_forks_repo_name | string (length 6 to 119) |
| max_forks_repo_head_hexsha | string (length 40 to 41) |
| max_forks_repo_licenses | list |
| max_forks_count | int64 (1 to 105k), nullable |
| max_forks_repo_forks_event_min_datetime | string (length 24), nullable |
| max_forks_repo_forks_event_max_datetime | string (length 24), nullable |
| avg_line_length | float64 (2 to 1.04M) |
| max_line_length | int64 (2 to 11.2M) |
| alphanum_fraction | float64 (0 to 1) |
| cells | list |
| cell_types | list |
| cell_type_groups | list |

The rows below are pipe-separated in the same column order, with the `cells`, `cell_types`, and `cell_type_groups` lists expanded inline:
d0604a8ef3c4efaa3345b4e152ffb03ab00a4160 | 62,735 | ipynb | Jupyter Notebook | scratch/working/2.2.NoVPC-EFS-Train-Model.ipynb | gonsoomoon-ml/SageMaker-With-Secure-VPC | b3ccdede952e8a32a256cb1aab53d196e519f401 | ["MIT"] | 2 | 2021-02-01T00:48:25.000Z | 2021-08-02T09:43:27.000Z | scratch/working/2.2.NoVPC-EFS-Train-Model.ipynb | gonsoomoon-ml/SageMaker-With-Secure-VPC | b3ccdede952e8a32a256cb1aab53d196e519f401 | ["MIT"] | 1 | 2021-02-08T06:18:25.000Z | 2021-02-08T06:18:25.000Z | scratch/working/2.2.NoVPC-EFS-Train-Model.ipynb | gonsoomoon-ml/SageMaker-With-Secure-VPC | b3ccdede952e8a32a256cb1aab53d196e519f401 | ["MIT"] | 2 | 2021-02-04T08:23:14.000Z | 2021-02-25T07:13:11.000Z | 116.175926 | 19,813 | 0.635259 | [
[
[
"# [๋ชจ๋ 2.1] SageMaker ํด๋ฌ์คํฐ์์ ํ๋ จ (No VPC์์ ์คํ)\n\n์ด ๋
ธํธ๋ถ์ ์๋์ ์์
์ ์คํ ํฉ๋๋ค.\n- SageMaker Hosting Cluster ์์ ํ๋ จ์ ์คํ\n- ํ๋ จํ Job ์ด๋ฆ์ ์ ์ฅ \n - ๋ค์ ๋
ธํธ๋ถ์์ ๋ชจ๋ธ ๋ฐฐํฌ ๋ฐ ์ถ๋ก ์์ ์ฌ์ฉ ํฉ๋๋ค.\n---",
"_____no_output_____"
],
[
"SageMaker์ ์ธ์
์ ์ป๊ณ , role ์ ๋ณด๋ฅผ ๊ฐ์ ธ์ต๋๋ค.\n- ์์ ๋ ์ ๋ณด๋ฅผ ํตํด์ SageMaker Hosting Cluster์ ์ฐ๊ฒฐํฉ๋๋ค.",
"_____no_output_____"
]
],
[
[
"import os\nimport sagemaker\nfrom sagemaker import get_execution_role\n\nsagemaker_session = sagemaker.Session()\n\nrole = get_execution_role()",
"_____no_output_____"
]
],
[
[
"## ๋ก์ปฌ์ ๋ฐ์ดํฐ S3 ์
๋ก๋ฉ\n๋ก์ปฌ์ ๋ฐ์ดํฐ๋ฅผ S3์ ์
๋ก๋ฉํ์ฌ ํ๋ จ์์ Input์ผ๋ก ์ฌ์ฉ ํฉ๋๋ค.",
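\n\nThe upload itself is one call on the SageMaker session. A minimal sketch (the same call appears commented out in the next cell, which reuses an already-uploaded S3 URI instead):\n\n```python\n# Upload the local ./data directory to the session's default bucket\ndataset_location = sagemaker_session.upload_data(path='data', key_prefix='data/DEMO-cifar10')\nprint(dataset_location)\n```",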
"_____no_output_____"
]
],
[
[
"# dataset_location = sagemaker_session.upload_data(path='data', key_prefix='data/DEMO-cifar10')\n# display(dataset_location)\ndataset_location = 's3://sagemaker-ap-northeast-2-057716757052/data/DEMO-cifar10'\ndataset_location",
"_____no_output_____"
],
[
"# efs_dir = '/home/ec2-user/efs/data'\n\n# ! ls {efs_dir} -al\n# ! aws s3 cp {dataset_location} {efs_dir} --recursive",
"_____no_output_____"
],
[
"from sagemaker.inputs import FileSystemInput\n\n# Specify EFS ile system id.\nfile_system_id = 'fs-38dc1558' # 'fs-xxxxxxxx'\nprint(f\"EFS file-system-id: {file_system_id}\")\n\n# Specify directory path for input data on the file system. \n# You need to provide normalized and absolute path below.\ntrain_file_system_directory_path = '/data/train'\neval_file_system_directory_path = '/data/eval'\nvalidation_file_system_directory_path = '/data/validation'\nprint(f'EFS file-system data input path: {train_file_system_directory_path}')\nprint(f'EFS file-system data input path: {eval_file_system_directory_path}')\nprint(f'EFS file-system data input path: {validation_file_system_directory_path}')\n\n# Specify the access mode of the mount of the directory associated with the file system. \n# Directory must be mounted 'ro'(read-only).\nfile_system_access_mode = 'ro'\n\n# Specify your file system type\nfile_system_type = 'EFS'\n\ntrain = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=train_file_system_directory_path,\n file_system_access_mode=file_system_access_mode)\n\neval = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=eval_file_system_directory_path,\n file_system_access_mode=file_system_access_mode)\n\nvalidation = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=validation_file_system_directory_path,\n file_system_access_mode=file_system_access_mode)",
"EFS file-system-id: fs-38dc1558\nEFS file-system data input path: /data/train\nEFS file-system data input path: /data/eval\nEFS file-system data input path: /data/validation\n"
],
[
"aws_region = 'ap-northeast-2'# aws-region-code e.g. us-east-1\ns3_bucket = 'sagemaker-ap-northeast-2-057716757052'# your-s3-bucket-name",
"_____no_output_____"
],
[
"prefix = \"cifar10/efs\" #prefix in your bucket\ns3_output_location = f's3://{s3_bucket}/{prefix}/output'\nprint(f'S3 model output location: {s3_output_location}')",
"S3 model output location: s3://sagemaker-ap-northeast-2-057716757052/cifar10/efs/output\n"
],
[
"security_group_ids = ['sg-0192524ef63ec6138'] # ['sg-xxxxxxxx'] \n# subnets = ['subnet-0a84bcfa36d3981e6','subnet-0304abaaefc2b1c34','subnet-0a2204b79f378b178'] # [ 'subnet-xxxxxxx', 'subnet-xxxxxxx', 'subnet-xxxxxxx']\nsubnets = ['subnet-0a84bcfa36d3981e6'] # [ 'subnet-xxxxxxx', 'subnet-xxxxxxx', 'subnet-xxxxxxx']\n\n\n",
"_____no_output_____"
],
[
"from sagemaker.tensorflow import TensorFlow\nestimator = TensorFlow(base_job_name='cifar10',\n entry_point='cifar10_keras_sm_tf2.py',\n source_dir='training_script',\n role=role,\n framework_version='2.0.0',\n py_version='py3',\n script_mode=True,\n hyperparameters={'epochs' : 1},\n train_instance_count=1, \n train_instance_type='ml.p3.2xlarge',\n output_path=s3_output_location, \n subnets=subnets,\n security_group_ids=security_group_ids, \n session = sagemaker.Session()\n )\n\nestimator.fit({'train': train,\n 'validation': validation,\n 'eval': eval,\n })\n# estimator.fit({'train': 'file://data/train',\n# 'validation': 'file://data/validation',\n# 'eval': 'file://data/eval'})",
"train_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_count has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\n"
]
],
[
[
"# VPC_Mode๋ฅผ True, False ์ ํ\n#### **[์ค์] VPC_Mode์์ ์คํ์์ True๋ก ๋ณ๊ฒฝํด์ฃผ์ธ์**",
"_____no_output_____"
]
],
[
[
"VPC_Mode = False",
"_____no_output_____"
],
[
"from sagemaker.tensorflow import TensorFlow\n\ndef retrieve_estimator(VPC_Mode):\n if VPC_Mode:\n # VPC ๋ชจ๋ ๊ฒฝ์ฐ์ subnets, security_group์ ๊ธฐ์ ํฉ๋๋ค.\n estimator = TensorFlow(base_job_name='cifar10',\n entry_point='cifar10_keras_sm_tf2.py',\n source_dir='training_script',\n role=role,\n framework_version='2.0.0',\n py_version='py3',\n script_mode=True, \n hyperparameters={'epochs': 2},\n train_instance_count=1, \n train_instance_type='ml.p3.8xlarge',\n subnets = ['subnet-090c1fad32165b0fa','subnet-0bd7cff3909c55018'],\n security_group_ids = ['sg-0f45d634d80aef27e'] \n ) \n else:\n estimator = TensorFlow(base_job_name='cifar10',\n entry_point='cifar10_keras_sm_tf2.py',\n source_dir='training_script',\n role=role,\n framework_version='2.0.0',\n py_version='py3',\n script_mode=True, \n hyperparameters={'epochs': 2},\n train_instance_count=1, \n train_instance_type='ml.p3.8xlarge')\n return estimator\n\nestimator = retrieve_estimator(VPC_Mode)",
"train_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_count has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\n"
]
],
[
[
"ํ์ต์ ์ํํฉ๋๋ค. ์ด๋ฒ์๋ ๊ฐ๊ฐ์ ์ฑ๋(`train, validation, eval`)์ S3์ ๋ฐ์ดํฐ ์ ์ฅ ์์น๋ฅผ ์ง์ ํฉ๋๋ค.<br>\nํ์ต ์๋ฃ ํ Billable seconds๋ ํ์ธํด ๋ณด์ธ์. Billable seconds๋ ์ค์ ๋ก ํ์ต ์ํ ์ ๊ณผ๊ธ๋๋ ์๊ฐ์
๋๋ค.\n```\nBillable seconds: <time>\n```\n\n์ฐธ๊ณ ๋ก, `ml.p2.xlarge` ์ธ์คํด์ค๋ก 5 epoch ํ์ต ์ ์ ์ฒด 6๋ถ-7๋ถ์ด ์์๋๊ณ , ์ค์ ํ์ต์ ์์๋๋ ์๊ฐ์ 3๋ถ-4๋ถ์ด ์์๋ฉ๋๋ค.",
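\n\nTo read these numbers programmatically after the job finishes, here is a minimal sketch (assuming default boto3 credentials; `describe_training_job` returns both fields):\n\n```python\nimport boto3\n\n# Works after estimator.fit() has run in this notebook session\nsm_client = boto3.client('sagemaker')\ndesc = sm_client.describe_training_job(TrainingJobName=estimator._current_job_name)\nprint('Training seconds:', desc['TrainingTimeInSeconds'])\nprint('Billable seconds:', desc['BillableTimeInSeconds'])\n```",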
"_____no_output_____"
]
],
[
[
"%%time\nestimator.fit({'train':'{}/train'.format(dataset_location),\n 'validation':'{}/validation'.format(dataset_location),\n 'eval':'{}/eval'.format(dataset_location)})",
"2021-01-27 04:02:44 Starting - Starting the training job...\n2021-01-27 04:03:08 Starting - Launching requested ML instancesProfilerReport-1611720164: InProgress\n.........\n2021-01-27 04:04:29 Starting - Preparing the instances for training......\n2021-01-27 04:05:44 Downloading - Downloading input data\n2021-01-27 04:05:44 Training - Downloading the training image...\n2021-01-27 04:06:11 Training - Training image download completed. Training in progress..\u001b[34m2021-01-27 04:06:06,541 sagemaker-containers INFO Imported framework sagemaker_tensorflow_container.training\u001b[0m\n\u001b[34m2021-01-27 04:06:07,035 sagemaker-containers INFO Invoking user script\n\u001b[0m\n\u001b[34mTraining Env:\n\u001b[0m\n\u001b[34m{\n \"additional_framework_parameters\": {},\n \"channel_input_dirs\": {\n \"eval\": \"/opt/ml/input/data/eval\",\n \"validation\": \"/opt/ml/input/data/validation\",\n \"train\": \"/opt/ml/input/data/train\"\n },\n \"current_host\": \"algo-1\",\n \"framework_module\": \"sagemaker_tensorflow_container.training:main\",\n \"hosts\": [\n \"algo-1\"\n ],\n \"hyperparameters\": {\n \"model_dir\": \"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\",\n \"epochs\": 2\n },\n \"input_config_dir\": \"/opt/ml/input/config\",\n \"input_data_config\": {\n \"eval\": {\n \"TrainingInputMode\": \"File\",\n \"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n },\n \"validation\": {\n \"TrainingInputMode\": \"File\",\n \"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n },\n \"train\": {\n \"TrainingInputMode\": \"File\",\n \"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n }\n },\n \"input_dir\": \"/opt/ml/input\",\n \"is_master\": true,\n \"job_name\": \"cifar10-2021-01-27-04-02-44-183\",\n \"log_level\": 20,\n \"master_hostname\": \"algo-1\",\n \"model_dir\": \"/opt/ml/model\",\n \"module_dir\": \"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/source/sourcedir.tar.gz\",\n \"module_name\": \"cifar10_keras_sm_tf2\",\n \"network_interface_name\": \"eth0\",\n \"num_cpus\": 32,\n \"num_gpus\": 4,\n \"output_data_dir\": \"/opt/ml/output/data\",\n \"output_dir\": \"/opt/ml/output\",\n \"output_intermediate_dir\": \"/opt/ml/output/intermediate\",\n \"resource_config\": {\n \"current_host\": \"algo-1\",\n \"hosts\": [\n \"algo-1\"\n ],\n \"network_interface_name\": \"eth0\"\n },\n \"user_entry_point\": \"cifar10_keras_sm_tf2.py\"\u001b[0m\n\u001b[34m}\n\u001b[0m\n\u001b[34mEnvironment 
variables:\n\u001b[0m\n\u001b[34mSM_HOSTS=[\"algo-1\"]\u001b[0m\n\u001b[34mSM_NETWORK_INTERFACE_NAME=eth0\u001b[0m\n\u001b[34mSM_HPS={\"epochs\":2,\"model_dir\":\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\"}\u001b[0m\n\u001b[34mSM_USER_ENTRY_POINT=cifar10_keras_sm_tf2.py\u001b[0m\n\u001b[34mSM_FRAMEWORK_PARAMS={}\u001b[0m\n\u001b[34mSM_RESOURCE_CONFIG={\"current_host\":\"algo-1\",\"hosts\":[\"algo-1\"],\"network_interface_name\":\"eth0\"}\u001b[0m\n\u001b[34mSM_INPUT_DATA_CONFIG={\"eval\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"train\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"validation\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"}}\u001b[0m\n\u001b[34mSM_OUTPUT_DATA_DIR=/opt/ml/output/data\u001b[0m\n\u001b[34mSM_CHANNELS=[\"eval\",\"train\",\"validation\"]\u001b[0m\n\u001b[34mSM_CURRENT_HOST=algo-1\u001b[0m\n\u001b[34mSM_MODULE_NAME=cifar10_keras_sm_tf2\u001b[0m\n\u001b[34mSM_LOG_LEVEL=20\u001b[0m\n\u001b[34mSM_FRAMEWORK_MODULE=sagemaker_tensorflow_container.training:main\u001b[0m\n\u001b[34mSM_INPUT_DIR=/opt/ml/input\u001b[0m\n\u001b[34mSM_INPUT_CONFIG_DIR=/opt/ml/input/config\u001b[0m\n\u001b[34mSM_OUTPUT_DIR=/opt/ml/output\u001b[0m\n\u001b[34mSM_NUM_CPUS=32\u001b[0m\n\u001b[34mSM_NUM_GPUS=4\u001b[0m\n\u001b[34mSM_MODEL_DIR=/opt/ml/model\u001b[0m\n\u001b[34mSM_MODULE_DIR=s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/source/sourcedir.tar.gz\u001b[0m\n\u001b[34mSM_TRAINING_ENV={\"additional_framework_parameters\":{},\"channel_input_dirs\":{\"eval\":\"/opt/ml/input/data/eval\",\"train\":\"/opt/ml/input/data/train\",\"validation\":\"/opt/ml/input/data/validation\"},\"current_host\":\"algo-1\",\"framework_module\":\"sagemaker_tensorflow_container.training:main\",\"hosts\":[\"algo-1\"],\"hyperparameters\":{\"epochs\":2,\"model_dir\":\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\"},\"input_config_dir\":\"/opt/ml/input/config\",\"input_data_config\":{\"eval\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"train\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"validation\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"}},\"input_dir\":\"/opt/ml/input\",\"is_master\":true,\"job_name\":\"cifar10-2021-01-27-04-02-44-183\",\"log_level\":20,\"master_hostname\":\"algo-1\",\"model_dir\":\"/opt/ml/model\",\"module_dir\":\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/source/sourcedir.tar.gz\",\"module_name\":\"cifar10_keras_sm_tf2\",\"network_interface_name\":\"eth0\",\"num_cpus\":32,\"num_gpus\":4,\"output_data_dir\":\"/opt/ml/output/data\",\"output_dir\":\"/opt/ml/output\",\"output_intermediate_dir\":\"/opt/ml/output/intermediate\",\"resource_config\":{\"current_host\":\"algo-1\",\"hosts\":[\"algo-1\"],\"network_interface_name\":\"eth0\"},\"user_entry_point\":\"cifar10_keras_sm_tf2.py\"}\u001b[0m\n\u001b[34mSM_USER_ARGS=[\"--epochs\",\"2\",\"--model_dir\",\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\"]\u001b[0m\n\u001b[34mSM_OUTPUT_INTERMEDIATE_DIR=/opt/ml/output/intermediate\u001b[0m\n\u001b[34mSM_CHANNEL_EVAL=/opt/ml/input/data/eval\u001b[0m\n\u001b[34mSM_
CHANNEL_VALIDATION=/opt/ml/input/data/validation\u001b[0m\n\u001b[34mSM_CHANNEL_TRAIN=/opt/ml/input/data/train\u001b[0m\n\u001b[34mSM_HP_MODEL_DIR=s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\u001b[0m\n\u001b[34mSM_HP_EPOCHS=2\u001b[0m\n\u001b[34mPYTHONPATH=/opt/ml/code:/usr/local/bin:/usr/lib/python36.zip:/usr/lib/python3.6:/usr/lib/python3.6/lib-dynload:/usr/local/lib/python3.6/dist-packages:/usr/lib/python3/dist-packages\n\u001b[0m\n\u001b[34mInvoking script with the following command:\n\u001b[0m\n\u001b[34m/usr/bin/python3 cifar10_keras_sm_tf2.py --epochs 2 --model_dir s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\n\n\u001b[0m\n\u001b[34mTrain for 312 steps, validate for 78 steps\u001b[0m\n\u001b[34mEpoch 1/2\u001b[0m\n\u001b[34m#015 1/312 [..............................] - ETA: 34:31 - loss: 3.5045 - accuracy: 0.1094#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 7/312 [..............................] - ETA: 4:52 - loss: 3.1433 - accuracy: 0.1462 #010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 13/312 [>.............................] - ETA: 2:35 - loss: 2.9194 - accuracy: 0.1587#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 19/312 [>.............................] - ETA: 1:45 - loss: 2.7623 - accuracy: 0.1641#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 26/312 [=>............................] - ETA: 1:15 - loss: 2.6259 - accuracy: 0.1683#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 32/312 [==>...........................] - ETA: 1:00 - loss: 2.5445 - accuracy: 0.1753#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 39/312 [==>...........................] 
- ETA: 48s - loss: 2.4627 - accuracy: 0.1873 #010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 45/312 [===>..........................] - ETA: 41s - loss: 2.4148 - accuracy: 0.1951#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 51/312 [===>..........................] - ETA: 36s - loss: 2.3721 - accuracy: 0.2028#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 57/312 [====>.........................] - ETA: 31s - loss: 2.3383 - accuracy: 0.2057#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 64/312 [=====>........................] - ETA: 27s - loss: 2.2982 - accuracy: 0.2120#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 71/312 [=====>........................] - ETA: 24s - loss: 2.2635 - accuracy: 0.2171#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 78/312 [======>.......................] - ETA: 21s - loss: 2.2315 - accuracy: 0.2229#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 85/312 [=======>......................] - ETA: 19s - loss: 2.2051 - accuracy: 0.2268#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 92/312 [=======>......................] 
- ETA: 17s - loss: 2.1798 - accuracy: 0.2320#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 99/312 [========>.....................] - ETA: 16s - loss: 2.1550 - accuracy: 0.2371#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015106/312 [=========>....................] - ETA: 14s - loss: 2.1355 - accuracy: 0.2412#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015113/312 [=========>....................] - ETA: 13s - loss: 2.1166 - accuracy: 0.2458#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015120/312 [==========>...................] - ETA: 12s - loss: 2.0997 - accuracy: 0.2493#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015127/312 [===========>..................] - ETA: 11s - loss: 2.0852 - accuracy: 0.2542#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015134/312 [===========>..................] - ETA: 10s - loss: 2.0716 - accuracy: 0.2577#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015140/312 [============>.................] - ETA: 9s - loss: 2.0586 - accuracy: 0.2616 #010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015147/312 [=============>................] 
- ETA: 8s - loss: 2.0466 - accuracy: 0.2645#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015154/312 [=============>................] - ETA: 8s - loss: 2.0331 - accuracy: 0.2677#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015161/312 [==============>...............] - ETA: 7s - loss: 2.0210 - accuracy: 0.2723#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015168/312 [===============>..............] - ETA: 6s - loss: 2.0082 - accuracy: 0.2766#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015175/312 [===============>..............] - ETA: 6s - loss: 1.9988 - accuracy: 0.2790#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015181/312 [================>.............] - ETA: 5s - loss: 1.9901 - accuracy: 0.2804#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015188/312 [=================>............] - ETA: 5s - loss: 1.9790 - accuracy: 0.2836#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015195/312 [=================>............] - ETA: 4s - loss: 1.9695 - accuracy: 0.2856#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015202/312 [==================>...........] 
- ETA: 4s - loss: 1.9605 - accuracy: 0.2881#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015209/312 [===================>..........] - ETA: 4s - loss: 1.9531 - accuracy: 0.2906#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015216/312 [===================>..........] - ETA: 3s - loss: 1.9457 - accuracy: 0.2930#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015223/312 [====================>.........] - ETA: 3s - loss: 1.9350 - accuracy: 0.2959#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015230/312 [=====================>........] - ETA: 3s - loss: 1.9290 - accuracy: 0.2975#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015237/312 [=====================>........] - ETA: 2s - loss: 1.9219 - accuracy: 0.2991#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015244/312 [======================>.......] - ETA: 2s - loss: 1.9130 - accuracy: 0.3024#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015251/312 [=======================>......] - ETA: 2s - loss: 1.9066 - accuracy: 0.3046#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015258/312 [=======================>......] 
- ETA: 1s - loss: 1.9006 - accuracy: 0.3065#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015264/312 [========================>.....] - ETA: 1s - loss: 1.8959 - accuracy: 0.3079#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015271/312 [=========================>....] - ETA: 1s - loss: 1.8884 - accuracy: 0.3104#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015278/312 [=========================>....] - ETA: 1s - loss: 1.8834 - accuracy: 0.3122#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015285/312 [==========================>...] - ETA: 0s - loss: 1.8764 - accuracy: 0.3148#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015292/312 [===========================>..] - ETA: 0s - loss: 1.8714 - accuracy: 0.3172#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015299/312 [===========================>..] - ETA: 0s - loss: 1.8642 - accuracy: 0.3197#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015305/312 [============================>.] 
- ETA: 0s - loss: 1.8589 - accuracy: 0.3213#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015312/312 [==============================] - 10s 32ms/step - loss: 1.8530 - accuracy: 0.3232 - val_loss: 2.0282 - val_accuracy: 0.3226\u001b[0m\n\u001b[34mEpoch 2/2\u001b[0m\n\u001b[34m#015 1/312 [..............................] - ETA: 2s - loss: 1.4358 - accuracy: 0.4531#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 8/312 [..............................] - ETA: 2s - loss: 1.5428 - accuracy: 0.4131#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 15/312 [>.............................] - ETA: 2s - loss: 1.5658 - accuracy: 0.4026#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 22/312 [=>............................] - ETA: 2s - loss: 1.5621 - accuracy: 0.4116#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 29/312 [=>............................] - ETA: 2s - loss: 1.5536 - accuracy: 0.4181#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 36/312 [==>...........................] - ETA: 2s - loss: 1.5312 - accuracy: 0.4316#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 43/312 [===>..........................] 
- ETA: 2s - loss: 1.5190 - accuracy: 0.4391#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 50/312 [===>..........................] - ETA: 2s - loss: 1.5194 - accuracy: 0.4364#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 56/312 [====>.........................] - ETA: 2s - loss: 1.5234 - accuracy: 0.4351#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 63/312 [=====>........................] - ETA: 1s - loss: 1.5260 - accuracy: 0.4339#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 70/312 [=====>........................] - ETA: 1s - loss: 1.5249 - accuracy: 0.4376#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 77/312 [======>.......................] - ETA: 1s - loss: 1.5162 - accuracy: 0.4421#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 84/312 [=======>......................] - ETA: 1s - loss: 1.5111 - accuracy: 0.4443#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 91/312 [=======>......................] - ETA: 1s - loss: 1.5092 - accuracy: 0.4439#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 98/312 [========>.....................] 
- ETA: 1s - loss: 1.5105 - accuracy: 0.4430#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015105/312 [=========>....................] - ETA: 1s - loss: 1.5119 - accuracy: 0.4424#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015112/312 [=========>....................] - ETA: 1s - loss: 1.5089 - accuracy: 0.4440#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015119/312 [==========>...................] - ETA: 1s - loss: 1.5087 - accuracy: 0.4458#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015126/312 [===========>..................] - ETA: 1s - loss: 1.5124 - accuracy: 0.4441#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015132/312 [===========>..................] - ETA: 1s - loss: 1.5132 - accuracy: 0.4441#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015139/312 [============>.................] - ETA: 1s - loss: 1.5099 - accuracy: 0.4453#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015146/312 [=============>................] - ETA: 1s - loss: 1.5104 - accuracy: 0.4464#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015153/312 [=============>................] 
- ETA: 1s - loss: 1.5065 - accuracy: 0.4489#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015160/312 [==============>...............] - ETA: 1s - loss: 1.5054 - accuracy: 0.4499#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015166/312 [==============>...............] - ETA: 1s - loss: 1.5030 - accuracy: 0.4507#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015172/312 [===============>..............] - ETA: 1s - loss: 1.5006 - accuracy: 0.4514#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015179/312 [================>.............] - ETA: 1s - loss: 1.4972 - accuracy: 0.4527#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015186/312 [================>.............] - ETA: 0s - loss: 1.4946 - accuracy: 0.4536#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015193/312 [=================>............] - ETA: 0s - loss: 1.4922 - accuracy: 0.4547#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015200/312 [==================>...........] - ETA: 0s - loss: 1.4917 - accuracy: 0.4553#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015207/312 [==================>...........] 
- ETA: 0s - loss: 1.4904 - accuracy: 0.4556#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015214/312 [===================>..........] - ETA: 0s - loss: 1.4877 - accuracy: 0.4567#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015221/312 [====================>.........] - ETA: 0s - loss: 1.4865 - accuracy: 0.4576#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015228/312 [====================>.........] - ETA: 0s - loss: 1.4846 - accuracy: 0.4582#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015235/312 [=====================>........] - ETA: 0s - loss: 1.4813 - accuracy: 0.4593#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015242/312 [======================>.......] - ETA: 0s - loss: 1.4780 - accuracy: 0.4611#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015249/312 [======================>.......] - ETA: 0s - loss: 1.4757 - accuracy: 0.4621#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015255/312 [=======================>......] - ETA: 0s - loss: 1.4742 - accuracy: 0.4624#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015262/312 [========================>.....] 
- ETA: 0s - loss: 1.4709 - accuracy: 0.4642#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015268/312 [========================>.....] - ETA: 0s - loss: 1.4689 - accuracy: 0.4651#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015275/312 [=========================>....] - ETA: 0s - loss: 1.4664 - accuracy: 0.4662#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015282/312 [==========================>...] - ETA: 0s - loss: 1.4634 - accuracy: 0.4671#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015289/312 [==========================>...] - ETA: 0s - loss: 1.4600 - accuracy: 0.4679#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015296/312 [===========================>..] - ETA: 0s - loss: 1.4562 - accuracy: 0.4693#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015303/312 [============================>.] - ETA: 0s - loss: 1.4529 - accuracy: 0.4707#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015310/312 [============================>.] 
- ETA: 0s - loss: 1.4507 - accuracy: 0.4713#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015312/312 [==============================] - 3s 10ms/step - loss: 1.4498 - accuracy: 0.4717 - val_loss: 1.6843 - val_accuracy: 0.4161\u001b[0m\n\n2021-01-27 04:12:46 Uploading - Uploading generated training model\u001b[34m2021-01-27 04:12:39.226548: W tensorflow/python/util/util.cc:299] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.\u001b[0m\n\u001b[34mWARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/resource_variable_ops.py:1781: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.\u001b[0m\n\u001b[34mInstructions for updating:\u001b[0m\n\u001b[34mIf using Keras pass *_constraint arguments to layers.\u001b[0m\n\u001b[34mWARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/resource_variable_ops.py:1781: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.\u001b[0m\n\u001b[34mInstructions for updating:\u001b[0m\n\u001b[34mIf using Keras pass *_constraint arguments to layers.\u001b[0m\n\u001b[34mINFO:tensorflow:Assets written to: /opt/ml/model/1/assets\u001b[0m\n\u001b[34mINFO:tensorflow:Assets written to: /opt/ml/model/1/assets\u001b[0m\n\u001b[34m2021-01-27 04:12:42,835 sagemaker-containers INFO Reporting training SUCCESS\u001b[0m\n\n2021-01-27 04:13:16 Completed - Training job completed\nProfilerReport-1611720164: NoIssuesFound\nTraining seconds: 452\nBillable seconds: 452\nCPU times: user 1.59 s, sys: 1.44 ms, total: 1.59 s\nWall time: 10min 46s\n"
]
],
[
[
"## training_job_name ์ ์ฅ\n\nํ์ฌ์ training_job_name์ ์ ์ฅ ํฉ๋๋ค.\n- training_job_name์ ์๋ ํ๋ จ์ ๊ด๋ จ ๋ด์ฉ ๋ฐ ํ๋ จ ๊ฒฐ๊ณผ์ธ **Model Artifact** ํ์ผ์ S3 ๊ฒฝ๋ก๋ฅผ ์ ๊ณต ํฉ๋๋ค.",
"_____no_output_____"
]
],
[
[
"train_job_name = estimator._current_job_name",
"_____no_output_____"
],
[
"%store train_job_name",
"Stored 'train_job_name' (str)\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d06051dceb9e7c3e4941d98bb5860c0cd4d1b728 | 66,639 | ipynb | Jupyter Notebook | notebooks/test_0_lstm_shell_colab.ipynb | SPRACE/track-ml | 3af95fd014e98a5b11261dc5d618f34f82fdf84d | [
"MIT"
]
| null | null | null | notebooks/test_0_lstm_shell_colab.ipynb | SPRACE/track-ml | 3af95fd014e98a5b11261dc5d618f34f82fdf84d | [
"MIT"
]
| 10 | 2019-04-15T21:44:31.000Z | 2020-08-26T21:05:00.000Z | notebooks/test_0_lstm_shell_colab.ipynb | SPRACE/track-ml | 3af95fd014e98a5b11261dc5d618f34f82fdf84d | [
"MIT"
]
| 4 | 2019-04-12T19:04:16.000Z | 2020-01-14T13:30:44.000Z | 213.586538 | 47,300 | 0.647204 | [
[
[
"!pip install git+https://github.com/LAL/trackml-library.git\n!pip install plotly.express\n!pip install shortuuid",
"_____no_output_____"
],
[
"# Clonning the repository you can get the lastest code from dev branch \n#!git clone https://github.com/SPRACE/track-ml.git cloned-repo\n!git clone https://github.com/stonescenter/track-ml.git\n!ls",
"_____no_output_____"
],
[
"%cd track-ml/",
"/content/track-ml\n"
]
],
[
[
"# Running scripts with python shell #",
"_____no_output_____"
]
],
[
[
"#!pip install tensorflow==1.14.0\n#!pip install tensorflow-base==1.14.0\n#!pip install tensorflow-gpu==1.14.0\n\n%tensorflow_version 1.x",
"_____no_output_____"
],
[
"\n! python main_train.py --config config_default.json",
"_____no_output_____"
]
],
[
[
"# Plot Predicted Data #\n",
"_____no_output_____"
]
],
[
[
"import os\nimport json\nimport numpy as np\nimport pandas as pd\n\nconfigs = json.load(open('config_default.json', 'r'))\n\ncylindrical = configs['data']['cylindrical'] # set to polar or cartesian coordenates\nnormalise = configs['data']['normalise'] \nname = configs['model']['name']\n\nif cylindrical:\n coord = 'cylin'\nelse:\n coord = 'xyz'\n\npath1 = 'results/x_true_%s_%s.csv' % (name, coord)\npath2 = 'results/y_true_%s_%s.csv' % (name, coord)\npath3 = 'results/y_pred_%s_%s.csv' % (name, coord)\n\nprint('loading from .. %s' % path1)\nprint('loading from .. %s' % path2)\nprint('loading from .. %s' % path3)\n\ndf_test = pd.read_csv(path1, header=None)\ndf_true = pd.read_csv(path2, header=None)\ndf_pred = pd.read_csv(path3, header=None)\n\nprint('shape df_test ', df_test.shape)\nprint('shape df_true ', df_true.shape)\nprint('shape df_pred ', df_pred.shape)\n# concat\n#y_true = pd.concat([df_test, df_true], axis = 1, ignore_index = True)\n#y_pred = pd.concat([df_test, df_pred], axis = 1, ignore_index = True)\n\ny_true = np.concatenate([df_test, df_true], axis = 1)\ny_pred = np.concatenate([df_test, df_pred], axis = 1)\ny_true = pd.DataFrame(y_true)\ny_pred = pd.DataFrame(y_pred)\n#y_true.name = 'real'\n#y_pred.name = 'pred'\ny_pred.columns.name = 'pred'\ny_true.columns.name = 'real'\n\nprint('size y_true ', y_true.shape)\nprint('size y_pred ', y_pred.shape)",
"loading from .. results/x_true_lstm_xyz.csv\nloading from .. results/y_true_lstm_xyz.csv\nloading from .. results/y_pred_lstm_xyz.csv\nshape df_test (528, 12)\nshape df_true (528, 18)\nshape df_pred (528, 18)\nsize y_true (528, 30)\nsize y_pred (528, 30)\n"
],
[
"from core.utils.utils import *\nimport warnings\n\nN_tracks = 30\npath_html = ''\nname = configs['model']['name']\n\nfig = track_plot_xyz([y_true, y_pred], n_hits = 10, cylindrical = cylindrical, n_tracks = N_tracks, \n title='Track Prediction #10 Hit - Model %s (Nearest hits)' % name.upper())\n\nfig.show()",
"/usr/local/lib/python3.6/dist-packages/statsmodels/tools/_testing.py:19: FutureWarning:\n\npandas.util.testing is deprecated. Use the functions in the public API at pandas.testing instead.\n\n"
],
[
"",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
]
|
d06051e608c2bfcf56c74a60dc9f350d0d1b7d14 | 137,607 | ipynb | Jupyter Notebook | code/preprocessing_and_decomposition/Matrix_Profile.ipynb | iotanalytics/IoTTutorial | 33666ca918cdece60df4684f0a2ec3465e9663b6 | [
"MIT"
]
| 3 | 2021-07-20T18:02:51.000Z | 2021-08-18T13:26:57.000Z | code/preprocessing_and_decomposition/Matrix_Profile.ipynb | iotanalytics/IoTTutorial | 33666ca918cdece60df4684f0a2ec3465e9663b6 | [
"MIT"
]
| null | null | null | code/preprocessing_and_decomposition/Matrix_Profile.ipynb | iotanalytics/IoTTutorial | 33666ca918cdece60df4684f0a2ec3465e9663b6 | [
"MIT"
]
| null | null | null | 539.635294 | 128,352 | 0.946311 | [
[
[
"<a href=\"https://colab.research.google.com/github/iotanalytics/IoTTutorial/blob/main/code/preprocessing_and_decomposition/Matrix_Profile.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"## Matrix Profile\n",
"_____no_output_____"
],
[
"## Introduction\n\nThe matrix profile (MP) is a data structure and associated algorithms that helps solve the dual problem of anomaly detection and motif discovery. It is robust, scalable and largely parameter-free.\n\nMP can be combined with other algorithms to accomplish:\n\n* Motif discovery\n* Time series chains\n* Anomaly discovery\n* Joins\n* Semantic segmentation\n\nmatrixprofile-ts offers 3 different algorithms to compute Matrix Profile:\n* STAMP (Scalable Time Series Anytime Matrix Profile) - Each distance profile is independent of other distance profiles, the order in which they are computed can be random. It is an anytime algorithm.\n* STOMP (Scalable Time Series Ordered Matrix Profile) - This algorithm is an exact ordered algorithm. It is significantly faster than STAMP.\n* SCRIMP++ (Scalable Column Independent Matrix Profile) - This algorithm combines the anytime component of STAMP with the speed of STOMP.\n\n\nSee: https://towardsdatascience.com/introduction-to-matrix-profiles-5568f3375d90",
"_____no_output_____"
],
[
"## Code Example\n",
"_____no_output_____"
]
],
[
[
"!pip install matrixprofile-ts",
"Collecting matrixprofile-ts\n Downloading matrixprofile_ts-0.0.9-py2.py3-none-any.whl (24 kB)\nRequirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.7/dist-packages (from matrixprofile-ts) (1.19.5)\nInstalling collected packages: matrixprofile-ts\nSuccessfully installed matrixprofile-ts-0.0.9\n"
],
[
"import pandas as pd\n## example data importing\ndata = pd.read_csv('https://raw.githubusercontent.com/iotanalytics/IoTTutorial/main/data/SCG_data.csv').drop('Unnamed: 0',1).to_numpy()[0:20,:1000]",
"_____no_output_____"
],
[
"import operator\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom matrixprofile import *\n\nimport numpy as np\nfrom datetime import datetime\n\nimport matplotlib.pyplot as plt\nfrom matplotlib.colors import ListedColormap\nfrom sklearn import neighbors, datasets\n\n# Pull a portion of the data\npattern = data[10,:] + max(abs(data[10,:]))\n\n# Compute Matrix Profile\nm = 10\nmp = matrixProfile.stomp(pattern,m)\n\n#Append np.nan to Matrix profile to enable plotting against raw data\nmp_adj = np.append(mp[0],np.zeros(m-1)+np.nan)\n\n#Plot the signal data\nfig, (ax1, ax2) = plt.subplots(2,1,sharex=True,figsize=(20,10))\nax1.plot(np.arange(len(pattern)),pattern)\nax1.set_ylabel('Signal', size=22)\n\n#Plot the Matrix Profile\nax2.plot(np.arange(len(mp_adj)),mp_adj, label=\"Matrix Profile\", color='red')\nax2.set_ylabel('Matrix Profile', size=22)\nax2.set_xlabel('Time', size=22);",
"_____no_output_____"
]
],
[
[
"## Discussion\n\n\nPros:\n* It is exact: For motif discovery, discord discovery, time series joins etc., the Matrix Profile based methods provide no false positives or false dismissals.\n* It is simple and parameter-free: In contrast, the more general algorithms in this space\nthat typically require building and tuning spatial access methods and/or hash functions.\n* It is space efficient: Matrix Profile construction algorithms requires an inconsequential\nspace overhead, just linear in the time series length with a small constant factor, allowing\nmassive datasets to be processed in main memory (for most data mining, disk is death).\n* It allows anytime algorithms: While exact MP algorithms are extremely scalable, for\nextremely large datasets we can compute the Matrix Profile in an anytime fashion, allowing\nultra-fast approximate solutions and real-time data interaction.\n* It is incrementally maintainable: Having computed the Matrix Profile for a dataset,\nwe can incrementally update it very efficiently. In many domains this means we can effectively\nmaintain exact joins, motifs, discords on streaming data forever.\n* It can leverage hardware: Matrix Profile construction is embarrassingly parallelizable,\nboth on multicore processors, GPUs, distributed systems etc.\n* It is free of the curse of dimensionality: That is to say, It has time complexity that is\nconstant in subsequence length: This is a very unusual and desirable property; virtually all\nexisting algorithms in the time series scale poorly as the subsequence length grows.\n* It can be constructed in deterministic time: Almost all algorithms for time series\ndata mining can take radically different times to finish on two (even slightly) different datasets.\nIn contrast, given only the length of the time series, we can precisely predict in advance how\nlong it will take to compute the Matrix Profile. (this allows resource planning)\n* It can handle missing data: Even in the presence of missing data, we can provide\nanswers which are guaranteed to have no false negatives.\n* Finally, and subjectively: Simplicity and Intuitiveness: Seeing the world through\nthe MP lens often invites/suggests simple and elegant solutions. \n\nCons:\n* Larger datasets can take a long time to compute. Scalability needs to be addressed.\n* Cannot be used with Dynamic time Warping as of now.\n * DTW is used for one-to-all matching whereas MP is used for all-to-all matching.\n * DTW is used for smaller datasets rather than large.\n* Need to adjust window size manually for different datasets.\n\n*How to read MP* :\n* Where you see relatively low values, you know that the subsequence in the original time\nseries must have (at least one) relatively similar subsequence elsewhere in the data (such\nregions are โmotifsโ or reoccurring patterns)\n* Where you see relatively high values, you know that the subsequence in the original time\nseries must be unique in its shape (such areas are โdiscordsโ or anomalies). In fact, the highest point is exactly the definition of Time\nSeries Discord, perhaps the best anomaly detector for time series.\n",
"_____no_output_____"
],
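[
"A quick numeric check of the reading rules above: a minimal sketch, assuming the `mp` tuple computed by `matrixProfile.stomp` in the code example, which in matrixprofile-ts holds the profile values and the nearest-neighbor indices:",
"_____no_output_____"
],
[
"import numpy as np\n\n# Assumes mp from the code example above; stomp returns (profile, profile_index)\nprofile, profile_index = mp[0], mp[1]\n\nmotif_idx = int(np.argmin(profile))    # lowest value -> best repeated pattern (motif)\ndiscord_idx = int(np.argmax(profile))  # highest value -> most unique window (discord)\n\nprint('Motif: window starting at', motif_idx, 'matches the window at', int(profile_index[motif_idx]))\nprint('Discord (anomaly candidate): window starting at', discord_idx)",
"_____no_output_____"
],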
[
"##References:\n\nhttps://www.cs.ucr.edu/~eamonn/MatrixProfile.html (powerpoints on this site - a lot of examples)\n\nhttps://towardsdatascience.com/introduction-to-matrix-profiles-5568f3375d90\n\nPython implementation: https://github.com/TDAmeritrade/stumpy",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
]
]
|
d06059853171785ee6baaefaab157274881917e4 | 41,007 | ipynb | Jupyter Notebook | day37_ML_ANN_RNN.ipynb | DynamicEngine2001/Programming-Codes | 6f19cbca47eef4b059b723703b261545ab08fc5d | [
"MIT"
]
| null | null | null | day37_ML_ANN_RNN.ipynb | DynamicEngine2001/Programming-Codes | 6f19cbca47eef4b059b723703b261545ab08fc5d | [
"MIT"
]
| 1 | 2020-10-15T14:33:30.000Z | 2020-10-15T14:33:30.000Z | day37_ML_ANN_RNN.ipynb | DynamicEngine2001/Programming-Codes | 6f19cbca47eef4b059b723703b261545ab08fc5d | [
"MIT"
]
| 7 | 2020-10-05T13:05:35.000Z | 2021-10-18T17:06:50.000Z | 35.596354 | 6,416 | 0.472797 | [
[
[
"### Steps to build a Neural Network\n\n1. Empty Model (sequential/Model)\n2",
"_____no_output_____"
]
],
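[
[
"A sketch of the five steps before the Fashion-MNIST example (the layer sizes here are illustrative placeholders, not tuned values):",
"_____no_output_____"
]
],
[
[
"from tensorflow.keras.models import Sequential\nfrom tensorflow.keras.layers import Dense\n\n# 1. empty model\nmodel = Sequential()\n# 2. add layers\nmodel.add(Dense(units=64, input_shape=(100,), activation='relu'))\nmodel.add(Dense(units=10, activation='softmax'))\n# 3. compile\nmodel.compile(optimizer='adam', loss='sparse_categorical_crossentropy', metrics=['accuracy'])\n# 4./5. fit and evaluate (on real data):\n# model.fit(x, y, epochs=5)\n# model.evaluate(x_test, y_test)",
"_____no_output_____"
]
],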
[
[
"import tensorflow.keras.datasets as kd",
"_____no_output_____"
],
[
"data = kd.fashion_mnist.load_data()",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz\n32768/29515 [=================================] - 0s 7us/step\nDownloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz\n26427392/26421880 [==============================] - 13s 1us/step\nDownloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz\n8192/5148 [===============================================] - 0s 0us/step\nDownloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz\n4423680/4422102 [==============================] - 2s 0us/step\n"
],
[
"(xtrain,ytrain),(xtest,ytest) = data",
"_____no_output_____"
],
[
"xtrain.shape",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"plt.imshow(xtrain[0,:,:],cmap='gray_r')",
"_____no_output_____"
],
[
"ytrain[0]",
"_____no_output_____"
],
[
"xtrain1 = xtrain.reshape(-1,28*28)\nxtest1 = xtest.reshape(-1,28*28)",
"_____no_output_____"
],
[
"xtrain1.shape",
"_____no_output_____"
],
[
"from tensorflow.keras.models import Sequential\nfrom tensorflow.keras.layers import Dense",
"_____no_output_____"
],
[
"model_ann = Sequential()\nmodel_ann.add(Dense(units=128, input_shape=(784,), activation='relu'))\nmodel_ann.add(Dense(units=128, activation='relu'))\nmodel_ann.add(Dense(units=10, activation='softmax'))\nmodel_ann.compile(optimizer='adam',loss='sparse_categorical_crossentropy',metrics=['accuracy'])",
"_____no_output_____"
],
[
"model_ann.summary()",
"Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\ndense (Dense) (None, 128) 100480 \n_________________________________________________________________\ndense_1 (Dense) (None, 128) 16512 \n_________________________________________________________________\ndense_2 (Dense) (None, 10) 1290 \n=================================================================\nTotal params: 118,282\nTrainable params: 118,282\nNon-trainable params: 0\n_________________________________________________________________\n"
],
[
"1st layer = ",
"_____no_output_____"
],
[
"history = model_ann.fit(xtrain1,ytrain,epochs=10)",
"Epoch 1/10\n1875/1875 [==============================] - 8s 3ms/step - loss: 2.0255 - accuracy: 0.7341\nEpoch 2/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.6520 - accuracy: 0.7857\nEpoch 3/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.6020 - accuracy: 0.8008\nEpoch 4/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.5439 - accuracy: 0.8156\nEpoch 5/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.5079 - accuracy: 0.8259\nEpoch 6/10\n1875/1875 [==============================] - 7s 4ms/step - loss: 0.4657 - accuracy: 0.8365\nEpoch 7/10\n1875/1875 [==============================] - 7s 4ms/step - loss: 0.4380 - accuracy: 0.8442\nEpoch 8/10\n1875/1875 [==============================] - 5s 3ms/step - loss: 0.4248 - accuracy: 0.8483\nEpoch 9/10\n1875/1875 [==============================] - 5s 3ms/step - loss: 0.4050 - accuracy: 0.8524\nEpoch 10/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.3999 - accuracy: 0.8567\n"
],
[
"plt.plot(history.history['loss'])\nplt.plot(history.history['accuracy'])\nplt.grid()\nplt.\nplt.xticks(range(1,11))\nplt.xlabel('Epochs-->')\nplt.show()",
"_____no_output_____"
],
[
"ypred = model_ann.predict(xtest1)",
"_____no_output_____"
],
[
"labels.get(ytest[0])",
"_____no_output_____"
],
[
"ypred[0].argmax()",
"_____no_output_____"
],
[
"model_ann.evaluate(xtest1,ytest)",
"313/313 [==============================] - 1s 2ms/step - loss: 0.4793 - accuracy: 0.8335\n"
]
],
[
[
"### Churn Modelling",
"_____no_output_____"
]
],
[
[
"import pandas as pd",
"_____no_output_____"
],
[
"df = pd.read_csv('Churn_Modelling.csv')\ndf",
"_____no_output_____"
],
[
"df.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 10000 entries, 0 to 9999\nData columns (total 14 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 RowNumber 10000 non-null int64 \n 1 CustomerId 10000 non-null int64 \n 2 Surname 10000 non-null object \n 3 CreditScore 10000 non-null int64 \n 4 Geography 10000 non-null object \n 5 Gender 10000 non-null object \n 6 Age 10000 non-null int64 \n 7 Tenure 10000 non-null int64 \n 8 Balance 10000 non-null float64\n 9 NumOfProducts 10000 non-null int64 \n 10 HasCrCard 10000 non-null int64 \n 11 IsActiveMember 10000 non-null int64 \n 12 EstimatedSalary 10000 non-null float64\n 13 Exited 10000 non-null int64 \ndtypes: float64(2), int64(9), object(3)\nmemory usage: 1.1+ MB\n"
],
[
"df1 = pd.get_dummies(df)",
"_____no_output_____"
],
[
"df1.head()",
"_____no_output_____"
]
],
[
[
"### Recurrent Neural Network",
"_____no_output_____"
]
],
[
[
"import numpy as np",
"_____no_output_____"
],
[
"stock_data = pd.read_csv('stock_data.csv')",
"_____no_output_____"
],
[
"fb = stock_data[['Open']] [stock_data['Stock']=='FB'].copy()",
"_____no_output_____"
],
[
"fb.head()",
"_____no_output_____"
],
[
"fb = fb.values",
"_____no_output_____"
],
[
"fb.shape",
"_____no_output_____"
],
[
"x = []\ny = []\nfor i in range(20, len(fb)):\n x.append(fb['Open'].valuesfb[i-20:1].tolist())\n y.append(fb[i].tolist())\n",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0605f7486d4b21270c84051e4665d2a8b65650a | 16,303 | ipynb | Jupyter Notebook | Python_Core/Python Modules and Imports.ipynb | ValRCS/RCS_Python_11 | 157c8e08aaf9849341cadb50077fe65dead536fa | [
"MIT"
]
| 1 | 2019-07-11T16:25:15.000Z | 2019-07-11T16:25:15.000Z | Python_Core/Python Modules and Imports.ipynb | ValRCS/RCS_Python_11 | 157c8e08aaf9849341cadb50077fe65dead536fa | [
"MIT"
]
| 8 | 2020-01-28T22:54:14.000Z | 2022-02-10T00:17:47.000Z | Python Modules and Imports.ipynb | ValRCS/RCS_Data_Analysis_Python_2019_July | 19e2f8310f41b697f9c86d7a085a9ff19390eeac | [
"MIT"
]
| 2 | 2019-12-11T14:39:36.000Z | 2019-12-13T14:29:09.000Z | 22.674548 | 463 | 0.497577 | [
[
[
"## Python Modules",
"_____no_output_____"
]
],
[
[
"%%writefile weather.py\ndef prognosis():\n print(\"It will rain today\")",
"Writing weather.py\n"
],
[
"import weather\n",
"_____no_output_____"
],
[
"weather.prognosis()",
"It will rain today\n"
]
],
[
[
"## How does Python know from where to import packages/modules from?",
"_____no_output_____"
]
],
[
[
"# Python imports work by searching the directories listed in sys.path.",
"_____no_output_____"
],
[
"import sys\nsys.path\n",
"_____no_output_____"
],
[
"## \"__main__\" usage\n# A module can discover whether or not it is running in the main scope by checking its own __name__, \n# which allows a common idiom for conditionally executing code in a module when it is run as a script or with python -m \n# but not when it is imported:",
"_____no_output_____"
],
[
"%%writefile hw.py\n#!/usr/bin/env python\ndef hw():\n print(\"Running Main\")\n\ndef hw2():\n print(\"Hello 2\")\n\nif __name__ == \"__main__\":\n # execute only if run as a script\n print(\"Running as script\")\n hw()\n hw2()",
"Overwriting hw.py\n"
],
[
"import main\nimport hw",
"_____no_output_____"
],
[
"main.main()\nhw.hw2()",
"Running Main\nHello 2\n"
],
[
"# Running on all 3 OSes from command line:\n\npython main.py",
"_____no_output_____"
]
],
[
[
"## Make main.py self running on Linux (also should work on MacOS):\n \nAdd \n#!/usr/bin/env python to first line of script\n\nmark it executable using\n\n### need to change permissions too!\n$ chmod +x main.py",
"_____no_output_____"
],
[
"## Making Standalone .EXEs for Python in Windows \n\n* http://www.py2exe.org/ used to be for Python 2 , now supposedly Python 3 as well\n* http://www.pyinstaller.org/\n Tutorial: https://medium.com/dreamcatcher-its-blog/making-an-stand-alone-executable-from-a-python-script-using-pyinstaller-d1df9170e263\n\n Need to create exe on a similar system as target system! ",
"_____no_output_____"
]
],
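[
[
"For example, a typical PyInstaller invocation looks like this (a sketch; assumes PyInstaller has been installed with pip):",
"_____no_output_____"
]
],
[
[
"# pip install pyinstaller\n# pyinstaller --onefile main.py\n# The bundled executable ends up in the dist/ directory.",
"_____no_output_____"
]
],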
[
[
"# Exercise Write a function which returns a list of fibonacci numbers up to starting with 1, 1, 2, 3, 5 up to the nth.\nSo Fib(4) would return [1,1,2,3]",
"_____no_output_____"
]
],
[
[
"",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
],
[
[
"%%writefile fibo.py\n# Fibonacci numbers module\n\ndef fib(n): # write Fibonacci series up to n\n a, b = 1 1\n while b < n:\n print(b, end=' ')\n a, b = b, a+b\n print()\n\ndef fib2(n): # return Fibonacci series up to n\n result = []\n a, b = 1, 1\n while b < n:\n result.append(b)\n a, b = b, a+b\n return result",
"Writing fibo.py\n"
],
[
"import fibo",
"_____no_output_____"
],
[
"fibo.fib(100)",
"1 1 2 3 5 8 13 21 34 55 89 \n"
],
[
"fibo.fib2(100)",
"_____no_output_____"
],
[
"fib=fibo.fib",
"_____no_output_____"
]
],
[
[
"If you intend to use a function often you can assign it to a local name:",
"_____no_output_____"
]
],
[
[
"fib(300)",
"1 1 2 3 5 8 13 21 34 55 89 144 233 \n"
]
],
[
[
"#### There is a variant of the import statement that imports names from a module directly into the importing moduleโs symbol table. ",
"_____no_output_____"
]
],
[
[
"from fibo import fib, fib2 # we overwrote fib=fibo.fib",
"_____no_output_____"
],
[
"fib(100)",
"1 1 2 3 5 8 13 21 34 55 89 \n"
],
[
"fib2(200)",
"_____no_output_____"
]
],
[
[
"This does not introduce the module name from which the imports are taken in the local symbol table (so in the example, fibo is not defined).",
"_____no_output_____"
],
[
"There is even a variant to import all names that a module defines: **NOT RECOMMENDED**",
"_____no_output_____"
]
],
[
[
"## DO not do this Namespace collission possible!!",
"_____no_output_____"
],
[
"from fibo import *",
"_____no_output_____"
],
[
"fib(400)",
"1 1 2 3 5 8 13 21 34 55 89 144 233 377 \n"
]
],
[
[
"### If the module name is followed by as, then the name following as is bound directly to the imported module.",
"_____no_output_____"
]
],
[
[
"import fibo as fib",
"_____no_output_____"
],
[
"dir(fib)",
"_____no_output_____"
],
[
"fib.fib(50)",
"1 1 2 3 5 8 13 21 34 \n"
],
[
"### It can also be used when utilising from with similar effects:",
"_____no_output_____"
],
[
" from fibo import fib as fibonacci",
"_____no_output_____"
],
[
"fibonacci(200)",
"1 1 2 3 5 8 13 21 34 55 89 144 \n"
]
],
[
[
"### Executing modules as scriptsยถ",
"_____no_output_____"
],
[
"When you run a Python module with\n\npython fibo.py <arguments>\n \nthe code in the module will be executed, just as if you imported it, but with the \\_\\_name\\_\\_ set to \"\\_\\_main\\_\\_\". That means that by adding this code at the end of your module:",
"_____no_output_____"
]
],
[
[
"%%writefile fibbo.py \n \n# Fibonacci numbers module\n\ndef fib(n): # write Fibonacci series up to n\n a, b = 0, 1\n while b < n:\n print(b, end=' ')\n a, b = b, a+b\n print()\n\ndef fib2(n): # return Fibonacci series up to n\n result = []\n a, b = 0, 1\n while b < n:\n result.append(b)\n a, b = b, a+b\n return result\n\nif __name__ == \"__main__\":\n import sys\n fib(int(sys.argv[1], 10))",
"Overwriting fibbo.py\n"
],
[
"import fibbo as fi\nfi.fib(200)",
"1 1 2 3 5 8 13 21 34 55 89 144 \n"
]
],
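[
[
"Because of the `__main__` guard, fibbo.py can also be run as a script with an argument (a sketch; the `!` shell escape assumes a Jupyter environment):",
"_____no_output_____"
]
],
[
[
"!python fibbo.py 50",
"_____no_output_____"
]
],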
[
[
"#### This is often used either to provide a convenient user interface to a module, or for testing purposes (running the module as a script executes a test suite).",
"_____no_output_____"
],
[
"### The Module Search Path\n\nWhen a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:\n\n* The directory containing the input script (or the current directory when no file is specified).\n* PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).\n* The installation-dependent default.",
"_____no_output_____"
],
[
"Packages are a way of structuring Pythonโs module namespace by using โdotted module namesโ. For example, the module name A.B designates a submodule named B in a package named A. Just like the use of modules saves the authors of different modules from having to worry about each otherโs global variable names, the use of dotted module names saves the authors of multi-module packages like NumPy or Pillow from having to worry about each otherโs module names.",
"_____no_output_____"
]
],
[
[
"sound/ Top-level package\n __init__.py Initialize the sound package\n formats/ Subpackage for file format conversions\n __init__.py\n wavread.py\n wavwrite.py\n aiffread.py\n aiffwrite.py\n auread.py\n auwrite.py\n ...\n effects/ Subpackage for sound effects\n __init__.py\n echo.py\n surround.py\n reverse.py\n ...\n filters/ Subpackage for filters\n __init__.py\n equalizer.py\n vocoder.py\n karaoke.py\n ...",
"_____no_output_____"
]
],
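[
[
"Users of the package can then import its submodules, for example (a sketch against the hypothetical `sound/` layout above, assuming the package is on `sys.path`):",
"_____no_output_____"
]
],
[
[
"import sound.effects.echo        # loads the submodule sound.effects.echo\nfrom sound.effects import echo   # alternative: binds the submodule name directly",
"_____no_output_____"
]
],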
[
[
"The \\_\\_init\\_\\_.py files are required to make Python treat the directories as containing packages; this is done to prevent directories with a common name, such as string, from unintentionally hiding valid modules that occur later on the module search path. In the simplest case, \\_\\_init\\_\\_.py can just be an empty file",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d060798bb49284e8cdc2d12caf2fd61bd056185e | 237,327 | ipynb | Jupyter Notebook | notebooks/testing_multitask.ipynb | GJBoth/MultiTaskPINN | 8a9bb23b8bfc0d0f678090e015316dbd0cfbf024 | [
"MIT"
]
| null | null | null | notebooks/testing_multitask.ipynb | GJBoth/MultiTaskPINN | 8a9bb23b8bfc0d0f678090e015316dbd0cfbf024 | [
"MIT"
]
| null | null | null | notebooks/testing_multitask.ipynb | GJBoth/MultiTaskPINN | 8a9bb23b8bfc0d0f678090e015316dbd0cfbf024 | [
"MIT"
]
| 1 | 2022-02-24T04:27:25.000Z | 2022-02-24T04:27:25.000Z | 40.72186 | 198 | 0.505539 | [
[
[
"# General imports\nimport numpy as np\nimport torch\n\n# DeepMoD stuff\nfrom multitaskpinn import DeepMoD\nfrom multitaskpinn.model.func_approx import NN\nfrom multitaskpinn.model.library import Library1D\nfrom multitaskpinn.model.constraint import LeastSquares\nfrom multitaskpinn.model.sparse_estimators import Threshold\nfrom multitaskpinn.training import train, train_multitask\nfrom multitaskpinn.training.sparsity_scheduler import TrainTestPeriodic\n\nfrom phimal_utilities.data import Dataset\nfrom phimal_utilities.data.burgers import BurgersDelta\n\nif torch.cuda.is_available():\n device ='cuda'\nelse:\n device = 'cpu'\ndevice = 'cpu'\n\n# Settings for reproducibility\nnp.random.seed(0)\ntorch.manual_seed(0)\ntorch.backends.cudnn.deterministic = True\ntorch.backends.cudnn.benchmark = False\n\n%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"device",
"_____no_output_____"
],
[
"# Making dataset\nv = 0.1\nA = 1.0\n\nx = np.linspace(-3, 4, 100)\nt = np.linspace(0.5, 5.0, 50)\nx_grid, t_grid = np.meshgrid(x, t, indexing='ij')\ndataset = Dataset(BurgersDelta, v=v, A=A)\n\nX, y = dataset.create_dataset(x_grid.reshape(-1, 1), t_grid.reshape(-1, 1), n_samples=1000, noise=0.2, random=True, normalize=False)\nX, y = X.to(device), y.to(device)",
"_____no_output_____"
],
[
"network = NN(2, [30, 30, 30, 30, 30], 1)\nlibrary = Library1D(poly_order=2, diff_order=3) # Library function\nestimator = Threshold(0.1) # Sparse estimator \nconstraint = LeastSquares() # How to constrain\nmodel = DeepMoD(network, library, estimator, constraint).to(device) # Putting it all in the model",
"_____no_output_____"
],
[
"sparsity_scheduler = TrainTestPeriodic(patience=8, delta=1e-5, periodicity=50)\noptimizer = torch.optim.Adam(model.parameters(), betas=(0.99, 0.999), amsgrad=True, lr=2e-3) # Defining optimizer",
"_____no_output_____"
],
[
"train_multitask(model, X, y, optimizer, sparsity_scheduler, write_iterations=25, log_dir='runs/testing_multitask_unnormalized/', max_iterations=15000, delta=1e-3, patience=8) # Running",
"| Iteration | Progress | Time remaining | Loss | MSE | Reg | L1 norm |\n 2150 14.33% 396s -1.56e+01 1.40e-03 1.12e-07 1.55e+00 Algorithm converged. Stopping training.\n"
],
[
"network = NN(2, [30, 30, 30, 30, 30], 1)\nlibrary = Library1D(poly_order=2, diff_order=3) # Library function\nestimator = Threshold(0.1) # Sparse estimator \nconstraint = LeastSquares() # How to constrain\nmodel = DeepMoD(network, library, estimator, constraint).to(device) # Putting it all in the model",
"_____no_output_____"
],
[
"sparsity_scheduler = TrainTestPeriodic(patience=8, delta=1e-5, periodicity=50)\noptimizer = torch.optim.Adam(model.parameters(), betas=(0.99, 0.999), amsgrad=True, lr=2e-3) # Defining optimizer",
"_____no_output_____"
],
[
"train(model, X, y, optimizer, sparsity_scheduler, write_iterations=25, log_dir='runs/testing_normal_unnormalized/', max_iterations=15000, delta=1e-3, patience=8) # Running",
"| Iteration | Progress | Time remaining | Loss | MSE | Reg | L1 norm |\n 11300 75.33% 108s 1.38e-03 1.36e-03 1.72e-05 1.63e+00 Algorithm converged. Stopping training.\n"
]
],
[
[
"# Quick analysis",
"_____no_output_____"
]
],
[
[
"from phimal_utilities.analysis import Results\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nsns.set(context='notebook', style='white')\n\n%config InlineBackend.figure_format = 'svg'",
"_____no_output_____"
],
[
"data_mt = Results('runs/testing_multitask_unnormalized//')\ndata_bl = Results('runs/testing_normal_unnormalized//')\n\nkeys = data_mt.keys",
"_____no_output_____"
],
[
"fig, axes = plt.subplots(figsize=(10, 3), constrained_layout=True, ncols=2)\n\nax = axes[0]\nax.semilogy(data_bl.df.index, data_bl.df[keys['mse']], label='Baseline')\nax.semilogy(data_mt.df.index, data_mt.df[keys['mse']], label='Multitask')\nax.set_title('MSE')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\nax.legend()\n#ax.set_xlim([0, 8000])\n\n\nax = axes[1]\nax.semilogy(data_bl.df.index, data_bl.df[keys['reg']], label='Baseline')\nax.semilogy(data_mt.df.index, data_mt.df[keys['reg']], label='Multitask')\nax.set_title('Regression')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\nax.legend()\n#ax.set_xlim([0, 8000])\n\nfig.show()",
"_____no_output_____"
],
[
"fig, axes = plt.subplots(ncols=3, constrained_layout=True, figsize=(15, 4))\n\nax = axes[0]\nax.plot(data_bl.df.index, data_bl.df[keys['coeffs']])\nax.plot(data_bl.df.index, data_bl.df[keys['coeffs'][2]], lw=3)\nax.plot(data_bl.df.index, data_bl.df[keys['coeffs'][5]], lw=3)\nax.set_ylim([-2, 2])\nax.set_title('Coefficients baseline')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\n#ax.set_xlim([0, 8000])\n\nax = axes[1]\nax.plot(data_mt.df.index, data_mt.df[keys['coeffs']])\nax.plot(data_mt.df.index, data_mt.df[keys['coeffs'][2]], lw=3)\nax.plot(data_mt.df.index, data_mt.df[keys['coeffs'][5]], lw=3)\nax.set_ylim([-2, 2])\nax.set_title('Coefficients Multitask')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\n#ax.set_xlim([0, 8000])\n\nax = axes[2]\ntrue_coeffs = np.zeros(len(keys['unscaled_coeffs']))\ntrue_coeffs[2] = 0.1\ntrue_coeffs[5] = -1\n\nax.semilogy(data_bl.df.index, np.mean(np.abs(data_bl.df[keys['unscaled_coeffs']] - true_coeffs), axis=1), label='Baseline')\nax.semilogy(data_mt.df.index, np.mean(np.abs(data_mt.df[keys['unscaled_coeffs']] - true_coeffs), axis=1), label='Baseline')\nax.set_ylim([-5, 2])\nax.legend()\n\nfig.show()",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
]
|
d06079be15478f4e2f794b8f7c80c271870f6724 | 47,640 | ipynb | Jupyter Notebook | notebook/pytorch/nn_tutorial.ipynb | mengwangk/myinvestor-toolkit | 3dca9e1accfccf1583dcdbec80d1a0fe9dae2e81 | [
"MIT"
]
| 7 | 2019-10-13T18:58:33.000Z | 2021-08-07T12:46:22.000Z | notebook/pytorch/nn_tutorial.ipynb | mengwangk/myinvestor-toolkit | 3dca9e1accfccf1583dcdbec80d1a0fe9dae2e81 | [
"MIT"
]
| 7 | 2019-12-16T21:25:34.000Z | 2022-02-10T00:11:22.000Z | notebook/pytorch/nn_tutorial.ipynb | mengwangk/myinvestor-toolkit | 3dca9e1accfccf1583dcdbec80d1a0fe9dae2e81 | [
"MIT"
]
| 4 | 2020-02-01T11:23:51.000Z | 2021-12-13T12:27:18.000Z | 29.849624 | 124 | 0.558858 | [
[
[
"%matplotlib inline",
"_____no_output_____"
]
],
[
[
"\nWhat is `torch.nn` *really*?\n============================\nby Jeremy Howard, `fast.ai <https://www.fast.ai>`_. Thanks to Rachel Thomas and Francisco Ingham.\n\n",
"_____no_output_____"
],
[
"We recommend running this tutorial as a notebook, not a script. To download the notebook (.ipynb) file,\nclick `here <https://pytorch.org/tutorials/beginner/nn_tutorial.html#sphx-glr-download-beginner-nn-tutorial-py>`_ .\n\nPyTorch provides the elegantly designed modules and classes `torch.nn <https://pytorch.org/docs/stable/nn.html>`_ ,\n`torch.optim <https://pytorch.org/docs/stable/optim.html>`_ ,\n`Dataset <https://pytorch.org/docs/stable/data.html?highlight=dataset#torch.utils.data.Dataset>`_ ,\nand `DataLoader <https://pytorch.org/docs/stable/data.html?highlight=dataloader#torch.utils.data.DataLoader>`_\nto help you create and train neural networks.\nIn order to fully utilize their power and customize\nthem for your problem, you need to really understand exactly what they're\ndoing. To develop this understanding, we will first train basic neural net\non the MNIST data set without using any features from these models; we will\ninitially only use the most basic PyTorch tensor functionality. Then, we will\nincrementally add one feature from ``torch.nn``, ``torch.optim``, ``Dataset``, or\n``DataLoader`` at a time, showing exactly what each piece does, and how it\nworks to make the code either more concise, or more flexible.\n\n**This tutorial assumes you already have PyTorch installed, and are familiar\nwith the basics of tensor operations.** (If you're familiar with Numpy array\noperations, you'll find the PyTorch tensor operations used here nearly identical).\n\nMNIST data setup\n----------------\n\nWe will use the classic `MNIST <http://deeplearning.net/data/mnist/>`_ dataset,\nwhich consists of black-and-white images of hand-drawn digits (between 0 and 9).\n\nWe will use `pathlib <https://docs.python.org/3/library/pathlib.html>`_\nfor dealing with paths (part of the Python 3 standard library), and will\ndownload the dataset using\n`requests <http://docs.python-requests.org/en/master/>`_. We will only\nimport modules when we use them, so you can see exactly what's being\nused at each point.\n\n",
"_____no_output_____"
]
],
[
[
"from pathlib import Path\nimport requests\n\nDATA_PATH = Path(\"data\")\nPATH = DATA_PATH / \"mnist\"\n\nPATH.mkdir(parents=True, exist_ok=True)\n\nURL = \"http://deeplearning.net/data/mnist/\"\nFILENAME = \"mnist.pkl.gz\"\n\nif not (PATH / FILENAME).exists():\n content = requests.get(URL + FILENAME).content\n (PATH / FILENAME).open(\"wb\").write(content)",
"_____no_output_____"
]
],
[
[
"This dataset is in numpy array format, and has been stored using pickle,\na python-specific format for serializing data.\n\n",
"_____no_output_____"
]
],
[
[
"import pickle\nimport gzip\n\nwith gzip.open((PATH / FILENAME).as_posix(), \"rb\") as f:\n ((x_train, y_train), (x_valid, y_valid), _) = pickle.load(f, encoding=\"latin-1\")",
"_____no_output_____"
]
],
[
[
"Each image is 28 x 28, and is being stored as a flattened row of length\n784 (=28x28). Let's take a look at one; we need to reshape it to 2d\nfirst.\n\n",
"_____no_output_____"
]
],
[
[
"from matplotlib import pyplot\nimport numpy as np\n\npyplot.imshow(x_train[0].reshape((28, 28)), cmap=\"gray\")\nprint(x_train.shape)",
"_____no_output_____"
]
],
[
[
"PyTorch uses ``torch.tensor``, rather than numpy arrays, so we need to\nconvert our data.\n\n",
"_____no_output_____"
]
],
[
[
"import torch\n\nx_train, y_train, x_valid, y_valid = map(\n torch.tensor, (x_train, y_train, x_valid, y_valid)\n)\nn, c = x_train.shape\nx_train, x_train.shape, y_train.min(), y_train.max()\nprint(x_train, y_train)\nprint(x_train.shape)\nprint(y_train.min(), y_train.max())",
"_____no_output_____"
]
],
[
[
"Neural net from scratch (no torch.nn)\n---------------------------------------------\n\nLet's first create a model using nothing but PyTorch tensor operations. We're assuming\nyou're already familiar with the basics of neural networks. (If you're not, you can\nlearn them at `course.fast.ai <https://course.fast.ai>`_).\n\nPyTorch provides methods to create random or zero-filled tensors, which we will\nuse to create our weights and bias for a simple linear model. These are just regular\ntensors, with one very special addition: we tell PyTorch that they require a\ngradient. This causes PyTorch to record all of the operations done on the tensor,\nso that it can calculate the gradient during back-propagation *automatically*!\n\nFor the weights, we set ``requires_grad`` **after** the initialization, since we\ndon't want that step included in the gradient. (Note that a trailling ``_`` in\nPyTorch signifies that the operation is performed in-place.)\n\n<div class=\"alert alert-info\"><h4>Note</h4><p>We are initializing the weights here with\n `Xavier initialisation <http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf>`_\n (by multiplying with 1/sqrt(n)).</p></div>\n\n",
"_____no_output_____"
]
],
[
[
"import math\n\nweights = torch.randn(784, 10) / math.sqrt(784)\nweights.requires_grad_()\nbias = torch.zeros(10, requires_grad=True)",
"_____no_output_____"
]
],
[
[
"Thanks to PyTorch's ability to calculate gradients automatically, we can\nuse any standard Python function (or callable object) as a model! So\nlet's just write a plain matrix multiplication and broadcasted addition\nto create a simple linear model. We also need an activation function, so\nwe'll write `log_softmax` and use it. Remember: although PyTorch\nprovides lots of pre-written loss functions, activation functions, and\nso forth, you can easily write your own using plain python. PyTorch will\neven create fast GPU or vectorized CPU code for your function\nautomatically.\n\n",
"_____no_output_____"
]
],
[
[
"def log_softmax(x):\n return x - x.exp().sum(-1).log().unsqueeze(-1)\n\ndef model(xb):\n return log_softmax(xb @ weights + bias)",
"_____no_output_____"
]
],
[
[
"In the above, the ``@`` stands for the dot product operation. We will call\nour function on one batch of data (in this case, 64 images). This is\none *forward pass*. Note that our predictions won't be any better than\nrandom at this stage, since we start with random weights.\n\n",
"_____no_output_____"
]
],
[
[
"bs = 64 # batch size\n\nxb = x_train[0:bs] # a mini-batch from x\npreds = model(xb) # predictions\npreds[0], preds.shape\nprint(preds[0], preds.shape)",
"_____no_output_____"
]
],
[
[
"As you see, the ``preds`` tensor contains not only the tensor values, but also a\ngradient function. We'll use this later to do backprop.\n\nLet's implement negative log-likelihood to use as the loss function\n(again, we can just use standard Python):\n\n",
"_____no_output_____"
]
],
[
[
"def nll(input, target):\n return -input[range(target.shape[0]), target].mean()\n\nloss_func = nll",
"_____no_output_____"
]
],
[
[
"Let's check our loss with our random model, so we can see if we improve\nafter a backprop pass later.\n\n",
"_____no_output_____"
]
],
[
[
"yb = y_train[0:bs]\nprint(loss_func(preds, yb))",
"_____no_output_____"
]
],
[
[
"Let's also implement a function to calculate the accuracy of our model.\nFor each prediction, if the index with the largest value matches the\ntarget value, then the prediction was correct.\n\n",
"_____no_output_____"
]
],
[
[
"def accuracy(out, yb):\n preds = torch.argmax(out, dim=1)\n return (preds == yb).float().mean()",
"_____no_output_____"
]
],
[
[
"Let's check the accuracy of our random model, so we can see if our\naccuracy improves as our loss improves.\n\n",
"_____no_output_____"
]
],
[
[
"print(accuracy(preds, yb))",
"_____no_output_____"
]
],
[
[
"We can now run a training loop. For each iteration, we will:\n\n- select a mini-batch of data (of size ``bs``)\n- use the model to make predictions\n- calculate the loss\n- ``loss.backward()`` updates the gradients of the model, in this case, ``weights``\n and ``bias``.\n\nWe now use these gradients to update the weights and bias. We do this\nwithin the ``torch.no_grad()`` context manager, because we do not want these\nactions to be recorded for our next calculation of the gradient. You can read\nmore about how PyTorch's Autograd records operations\n`here <https://pytorch.org/docs/stable/notes/autograd.html>`_.\n\nWe then set the\ngradients to zero, so that we are ready for the next loop.\nOtherwise, our gradients would record a running tally of all the operations\nthat had happened (i.e. ``loss.backward()`` *adds* the gradients to whatever is\nalready stored, rather than replacing them).\n\n.. tip:: You can use the standard python debugger to step through PyTorch\n code, allowing you to check the various variable values at each step.\n Uncomment ``set_trace()`` below to try it out.\n\n\n",
"_____no_output_____"
]
],
[
[
"from IPython.core.debugger import set_trace\n\nlr = 0.5 # learning rate\nepochs = 2 # how many epochs to train for\n\nfor epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n # set_trace()\n start_i = i * bs\n end_i = start_i + bs\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n with torch.no_grad():\n weights -= weights.grad * lr\n bias -= bias.grad * lr\n weights.grad.zero_()\n bias.grad.zero_()",
"_____no_output_____"
]
],
[
[
"That's it: we've created and trained a minimal neural network (in this case, a\nlogistic regression, since we have no hidden layers) entirely from scratch!\n\nLet's check the loss and accuracy and compare those to what we got\nearlier. We expect that the loss will have decreased and accuracy to\nhave increased, and they have.\n\n",
"_____no_output_____"
]
],
[
[
"print(loss_func(model(xb), yb), accuracy(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Using torch.nn.functional\n------------------------------\n\nWe will now refactor our code, so that it does the same thing as before, only\nwe'll start taking advantage of PyTorch's ``nn`` classes to make it more concise\nand flexible. At each step from here, we should be making our code one or more\nof: shorter, more understandable, and/or more flexible.\n\nThe first and easiest step is to make our code shorter by replacing our\nhand-written activation and loss functions with those from ``torch.nn.functional``\n(which is generally imported into the namespace ``F`` by convention). This module\ncontains all the functions in the ``torch.nn`` library (whereas other parts of the\nlibrary contain classes). As well as a wide range of loss and activation\nfunctions, you'll also find here some convenient functions for creating neural\nnets, such as pooling functions. (There are also functions for doing convolutions,\nlinear layers, etc, but as we'll see, these are usually better handled using\nother parts of the library.)\n\nIf you're using negative log likelihood loss and log softmax activation,\nthen Pytorch provides a single function ``F.cross_entropy`` that combines\nthe two. So we can even remove the activation function from our model.\n\n",
"_____no_output_____"
]
],
[
[
"import torch.nn.functional as F\n\nloss_func = F.cross_entropy\n\ndef model(xb):\n return xb @ weights + bias",
"_____no_output_____"
]
],
[
[
"Note that we no longer call ``log_softmax`` in the ``model`` function. Let's\nconfirm that our loss and accuracy are the same as before:\n\n",
"_____no_output_____"
]
],
[
[
"print(loss_func(model(xb), yb), accuracy(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Refactor using nn.Module\n-----------------------------\nNext up, we'll use ``nn.Module`` and ``nn.Parameter``, for a clearer and more\nconcise training loop. We subclass ``nn.Module`` (which itself is a class and\nable to keep track of state). In this case, we want to create a class that\nholds our weights, bias, and method for the forward step. ``nn.Module`` has a\nnumber of attributes and methods (such as ``.parameters()`` and ``.zero_grad()``)\nwhich we will be using.\n\n<div class=\"alert alert-info\"><h4>Note</h4><p>``nn.Module`` (uppercase M) is a PyTorch specific concept, and is a\n class we'll be using a lot. ``nn.Module`` is not to be confused with the Python\n concept of a (lowercase ``m``) `module <https://docs.python.org/3/tutorial/modules.html>`_,\n which is a file of Python code that can be imported.</p></div>\n\n",
"_____no_output_____"
]
],
[
[
"from torch import nn\n\nclass Mnist_Logistic(nn.Module):\n def __init__(self):\n super().__init__()\n self.weights = nn.Parameter(torch.randn(784, 10) / math.sqrt(784))\n self.bias = nn.Parameter(torch.zeros(10))\n\n def forward(self, xb):\n return xb @ self.weights + self.bias",
"_____no_output_____"
]
],
[
[
"Since we're now using an object instead of just using a function, we\nfirst have to instantiate our model:\n\n",
"_____no_output_____"
]
],
[
[
"model = Mnist_Logistic()",
"_____no_output_____"
]
],
[
[
"Now we can calculate the loss in the same way as before. Note that\n``nn.Module`` objects are used as if they are functions (i.e they are\n*callable*), but behind the scenes Pytorch will call our ``forward``\nmethod automatically.\n\n",
"_____no_output_____"
]
],
[
[
"print(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Previously for our training loop we had to update the values for each parameter\nby name, and manually zero out the grads for each parameter separately, like this:\n::\n with torch.no_grad():\n weights -= weights.grad * lr\n bias -= bias.grad * lr\n weights.grad.zero_()\n bias.grad.zero_()\n\n\nNow we can take advantage of model.parameters() and model.zero_grad() (which\nare both defined by PyTorch for ``nn.Module``) to make those steps more concise\nand less prone to the error of forgetting some of our parameters, particularly\nif we had a more complicated model:\n::\n with torch.no_grad():\n for p in model.parameters(): p -= p.grad * lr\n model.zero_grad()\n\n\nWe'll wrap our little training loop in a ``fit`` function so we can run it\nagain later.\n\n",
"_____no_output_____"
]
],
[
[
"def fit():\n for epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n start_i = i * bs\n end_i = start_i + bs\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n with torch.no_grad():\n for p in model.parameters():\n p -= p.grad * lr\n model.zero_grad()\n\nfit()",
"_____no_output_____"
]
],
[
[
"Let's double-check that our loss has gone down:\n\n",
"_____no_output_____"
]
],
[
[
"print(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Refactor using nn.Linear\n-------------------------\n\nWe continue to refactor our code. Instead of manually defining and\ninitializing ``self.weights`` and ``self.bias``, and calculating ``xb @\nself.weights + self.bias``, we will instead use the Pytorch class\n`nn.Linear <https://pytorch.org/docs/stable/nn.html#linear-layers>`_ for a\nlinear layer, which does all that for us. Pytorch has many types of\npredefined layers that can greatly simplify our code, and often makes it\nfaster too.\n\n",
"_____no_output_____"
]
],
[
[
"class Mnist_Logistic(nn.Module):\n def __init__(self):\n super().__init__()\n self.lin = nn.Linear(784, 10)\n\n def forward(self, xb):\n return self.lin(xb)",
"_____no_output_____"
]
],
[
[
"We instantiate our model and calculate the loss in the same way as before:\n\n",
"_____no_output_____"
]
],
[
[
"model = Mnist_Logistic()\nprint(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"We are still able to use our same ``fit`` method as before.\n\n",
"_____no_output_____"
]
],
[
[
"fit()\n\nprint(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Refactor using optim\n------------------------------\n\nPytorch also has a package with various optimization algorithms, ``torch.optim``.\nWe can use the ``step`` method from our optimizer to take a forward step, instead\nof manually updating each parameter.\n\nThis will let us replace our previous manually coded optimization step:\n::\n with torch.no_grad():\n for p in model.parameters(): p -= p.grad * lr\n model.zero_grad()\n\nand instead use just:\n::\n opt.step()\n opt.zero_grad()\n\n(``optim.zero_grad()`` resets the gradient to 0 and we need to call it before\ncomputing the gradient for the next minibatch.)\n\n",
"_____no_output_____"
]
],
[
[
"from torch import optim",
"_____no_output_____"
]
],
[
[
"We'll define a little function to create our model and optimizer so we\ncan reuse it in the future.\n\n",
"_____no_output_____"
]
],
[
[
"def get_model():\n model = Mnist_Logistic()\n return model, optim.SGD(model.parameters(), lr=lr)\n\nmodel, opt = get_model()\nprint(loss_func(model(xb), yb))\n\nfor epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n start_i = i * bs\n end_i = start_i + bs\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\nprint(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Refactor using Dataset\n------------------------------\n\nPyTorch has an abstract Dataset class. A Dataset can be anything that has\na ``__len__`` function (called by Python's standard ``len`` function) and\na ``__getitem__`` function as a way of indexing into it.\n`This tutorial <https://pytorch.org/tutorials/beginner/data_loading_tutorial.html>`_\nwalks through a nice example of creating a custom ``FacialLandmarkDataset`` class\nas a subclass of ``Dataset``.\n\nPyTorch's `TensorDataset <https://pytorch.org/docs/stable/_modules/torch/utils/data/dataset.html#TensorDataset>`_\nis a Dataset wrapping tensors. By defining a length and way of indexing,\nthis also gives us a way to iterate, index, and slice along the first\ndimension of a tensor. This will make it easier to access both the\nindependent and dependent variables in the same line as we train.\n\n",
"_____no_output_____"
]
],
[
[
"from torch.utils.data import TensorDataset",
"_____no_output_____"
]
],
[
[
"Both ``x_train`` and ``y_train`` can be combined in a single ``TensorDataset``,\nwhich will be easier to iterate over and slice.\n\n",
"_____no_output_____"
]
],
[
[
"train_ds = TensorDataset(x_train, y_train)",
"_____no_output_____"
]
],
[
[
"Previously, we had to iterate through minibatches of x and y values separately:\n::\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n\n\nNow, we can do these two steps together:\n::\n xb,yb = train_ds[i*bs : i*bs+bs]\n\n\n",
"_____no_output_____"
]
],
[
[
"model, opt = get_model()\n\nfor epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n xb, yb = train_ds[i * bs: i * bs + bs]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\nprint(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Refactor using DataLoader\n------------------------------\n\nPytorch's ``DataLoader`` is responsible for managing batches. You can\ncreate a ``DataLoader`` from any ``Dataset``. ``DataLoader`` makes it easier\nto iterate over batches. Rather than having to use ``train_ds[i*bs : i*bs+bs]``,\nthe DataLoader gives us each minibatch automatically.\n\n",
"_____no_output_____"
]
],
[
[
"from torch.utils.data import DataLoader\n\ntrain_ds = TensorDataset(x_train, y_train)\ntrain_dl = DataLoader(train_ds, batch_size=bs)",
"_____no_output_____"
]
],
[
[
"Previously, our loop iterated over batches (xb, yb) like this:\n::\n for i in range((n-1)//bs + 1):\n xb,yb = train_ds[i*bs : i*bs+bs]\n pred = model(xb)\n\nNow, our loop is much cleaner, as (xb, yb) are loaded automatically from the data loader:\n::\n for xb,yb in train_dl:\n pred = model(xb)\n\n",
"_____no_output_____"
]
],
[
[
"model, opt = get_model()\n\nfor epoch in range(epochs):\n for xb, yb in train_dl:\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\nprint(loss_func(model(xb), yb))",
"_____no_output_____"
]
],
[
[
"Thanks to Pytorch's ``nn.Module``, ``nn.Parameter``, ``Dataset``, and ``DataLoader``,\nour training loop is now dramatically smaller and easier to understand. Let's\nnow try to add the basic features necessary to create effecive models in practice.\n\nAdd validation\n-----------------------\n\nIn section 1, we were just trying to get a reasonable training loop set up for\nuse on our training data. In reality, you **always** should also have\na `validation set <https://www.fast.ai/2017/11/13/validation-sets/>`_, in order\nto identify if you are overfitting.\n\nShuffling the training data is\n`important <https://www.quora.com/Does-the-order-of-training-data-matter-when-training-neural-networks>`_\nto prevent correlation between batches and overfitting. On the other hand, the\nvalidation loss will be identical whether we shuffle the validation set or not.\nSince shuffling takes extra time, it makes no sense to shuffle the validation data.\n\nWe'll use a batch size for the validation set that is twice as large as\nthat for the training set. This is because the validation set does not\nneed backpropagation and thus takes less memory (it doesn't need to\nstore the gradients). We take advantage of this to use a larger batch\nsize and compute the loss more quickly.\n\n",
"_____no_output_____"
]
],
[
[
"train_ds = TensorDataset(x_train, y_train)\ntrain_dl = DataLoader(train_ds, batch_size=bs, shuffle=True)\n\nvalid_ds = TensorDataset(x_valid, y_valid)\nvalid_dl = DataLoader(valid_ds, batch_size=bs * 2)",
"_____no_output_____"
]
],
[
[
"We will calculate and print the validation loss at the end of each epoch.\n\n(Note that we always call ``model.train()`` before training, and ``model.eval()``\nbefore inference, because these are used by layers such as ``nn.BatchNorm2d``\nand ``nn.Dropout`` to ensure appropriate behaviour for these different phases.)\n\n",
"_____no_output_____"
]
],
[
[
"model, opt = get_model()\n\nfor epoch in range(epochs):\n model.train()\n for xb, yb in train_dl:\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\n model.eval()\n with torch.no_grad():\n valid_loss = sum(loss_func(model(xb), yb) for xb, yb in valid_dl)\n\n print(epoch, valid_loss / len(valid_dl))",
"_____no_output_____"
]
],
[
[
"Create fit() and get_data()\n----------------------------------\n\nWe'll now do a little refactoring of our own. Since we go through a similar\nprocess twice of calculating the loss for both the training set and the\nvalidation set, let's make that into its own function, ``loss_batch``, which\ncomputes the loss for one batch.\n\nWe pass an optimizer in for the training set, and use it to perform\nbackprop. For the validation set, we don't pass an optimizer, so the\nmethod doesn't perform backprop.\n\n",
"_____no_output_____"
]
],
[
[
"def loss_batch(model, loss_func, xb, yb, opt=None):\n loss = loss_func(model(xb), yb)\n\n if opt is not None:\n loss.backward()\n opt.step()\n opt.zero_grad()\n\n return loss.item(), len(xb)",
"_____no_output_____"
]
],
[
[
"``fit`` runs the necessary operations to train our model and compute the\ntraining and validation losses for each epoch.\n\n",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\ndef fit(epochs, model, loss_func, opt, train_dl, valid_dl):\n for epoch in range(epochs):\n model.train()\n for xb, yb in train_dl:\n loss_batch(model, loss_func, xb, yb, opt)\n\n model.eval()\n with torch.no_grad():\n losses, nums = zip(\n *[loss_batch(model, loss_func, xb, yb) for xb, yb in valid_dl]\n )\n val_loss = np.sum(np.multiply(losses, nums)) / np.sum(nums)\n\n print(epoch, val_loss)",
"_____no_output_____"
]
],
[
[
"``get_data`` returns dataloaders for the training and validation sets.\n\n",
"_____no_output_____"
]
],
[
[
"def get_data(train_ds, valid_ds, bs):\n return (\n DataLoader(train_ds, batch_size=bs, shuffle=True),\n DataLoader(valid_ds, batch_size=bs * 2),\n )",
"_____no_output_____"
]
],
[
[
"Now, our whole process of obtaining the data loaders and fitting the\nmodel can be run in 3 lines of code:\n\n",
"_____no_output_____"
]
],
[
[
"train_dl, valid_dl = get_data(train_ds, valid_ds, bs)\nmodel, opt = get_model()\nfit(epochs, model, loss_func, opt, train_dl, valid_dl)",
"_____no_output_____"
]
],
[
[
"You can use these basic 3 lines of code to train a wide variety of models.\nLet's see if we can use them to train a convolutional neural network (CNN)!\n\nSwitch to CNN\n-------------\n\nWe are now going to build our neural network with three convolutional layers.\nBecause none of the functions in the previous section assume anything about\nthe model form, we'll be able to use them to train a CNN without any modification.\n\nWe will use Pytorch's predefined\n`Conv2d <https://pytorch.org/docs/stable/nn.html#torch.nn.Conv2d>`_ class\nas our convolutional layer. We define a CNN with 3 convolutional layers.\nEach convolution is followed by a ReLU. At the end, we perform an\naverage pooling. (Note that ``view`` is PyTorch's version of numpy's\n``reshape``)\n\n",
"_____no_output_____"
]
],
[
[
"class Mnist_CNN(nn.Module):\n def __init__(self):\n super().__init__()\n self.conv1 = nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1)\n self.conv2 = nn.Conv2d(16, 16, kernel_size=3, stride=2, padding=1)\n self.conv3 = nn.Conv2d(16, 10, kernel_size=3, stride=2, padding=1)\n\n def forward(self, xb):\n xb = xb.view(-1, 1, 28, 28)\n xb = F.relu(self.conv1(xb))\n xb = F.relu(self.conv2(xb))\n xb = F.relu(self.conv3(xb))\n xb = F.avg_pool2d(xb, 4)\n return xb.view(-1, xb.size(1))\n\nlr = 0.1",
"_____no_output_____"
]
],
[
[
"`Momentum <https://cs231n.github.io/neural-networks-3/#sgd>`_ is a variation on\nstochastic gradient descent that takes previous updates into account as well\nand generally leads to faster training.\n\n",
"_____no_output_____"
]
],
[
[
"model = Mnist_CNN()\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)\n\nfit(epochs, model, loss_func, opt, train_dl, valid_dl)",
"_____no_output_____"
]
],
[
[
"nn.Sequential\n------------------------\n\n``torch.nn`` has another handy class we can use to simply our code:\n`Sequential <https://pytorch.org/docs/stable/nn.html#torch.nn.Sequential>`_ .\nA ``Sequential`` object runs each of the modules contained within it, in a\nsequential manner. This is a simpler way of writing our neural network.\n\nTo take advantage of this, we need to be able to easily define a\n**custom layer** from a given function. For instance, PyTorch doesn't\nhave a `view` layer, and we need to create one for our network. ``Lambda``\nwill create a layer that we can then use when defining a network with\n``Sequential``.\n\n",
"_____no_output_____"
]
],
[
[
"class Lambda(nn.Module):\n def __init__(self, func):\n super().__init__()\n self.func = func\n\n def forward(self, x):\n return self.func(x)\n\n\ndef preprocess(x):\n return x.view(-1, 1, 28, 28)",
"_____no_output_____"
]
],
[
[
"The model created with ``Sequential`` is simply:\n\n",
"_____no_output_____"
]
],
[
[
"model = nn.Sequential(\n Lambda(preprocess),\n nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.Conv2d(16, 16, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.Conv2d(16, 10, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.AvgPool2d(4),\n Lambda(lambda x: x.view(x.size(0), -1)),\n)\n\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)\n\nfit(epochs, model, loss_func, opt, train_dl, valid_dl)",
"_____no_output_____"
]
],
[
[
"Wrapping DataLoader\n-----------------------------\n\nOur CNN is fairly concise, but it only works with MNIST, because:\n - It assumes the input is a 28\\*28 long vector\n - It assumes that the final CNN grid size is 4\\*4 (since that's the average\npooling kernel size we used)\n\nLet's get rid of these two assumptions, so our model works with any 2d\nsingle channel image. First, we can remove the initial Lambda layer but\nmoving the data preprocessing into a generator:\n\n",
"_____no_output_____"
]
],
[
[
"def preprocess(x, y):\n return x.view(-1, 1, 28, 28), y\n\n\nclass WrappedDataLoader:\n def __init__(self, dl, func):\n self.dl = dl\n self.func = func\n\n def __len__(self):\n return len(self.dl)\n\n def __iter__(self):\n batches = iter(self.dl)\n for b in batches:\n yield (self.func(*b))\n\ntrain_dl, valid_dl = get_data(train_ds, valid_ds, bs)\ntrain_dl = WrappedDataLoader(train_dl, preprocess)\nvalid_dl = WrappedDataLoader(valid_dl, preprocess)",
"_____no_output_____"
]
],
[
[
"Next, we can replace ``nn.AvgPool2d`` with ``nn.AdaptiveAvgPool2d``, which\nallows us to define the size of the *output* tensor we want, rather than\nthe *input* tensor we have. As a result, our model will work with any\nsize input.\n\n",
"_____no_output_____"
]
],
[
[
"model = nn.Sequential(\n nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.Conv2d(16, 16, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.Conv2d(16, 10, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.AdaptiveAvgPool2d(1),\n Lambda(lambda x: x.view(x.size(0), -1)),\n)\n\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)",
"_____no_output_____"
]
],
[
[
"Let's try it out:\n\n",
"_____no_output_____"
]
],
[
[
"fit(epochs, model, loss_func, opt, train_dl, valid_dl)",
"_____no_output_____"
]
],
[
[
"Using your GPU\n---------------\n\nIf you're lucky enough to have access to a CUDA-capable GPU (you can\nrent one for about $0.50/hour from most cloud providers) you can\nuse it to speed up your code. First check that your GPU is working in\nPytorch:\n\n",
"_____no_output_____"
]
],
[
[
"print(torch.cuda.is_available())",
"_____no_output_____"
]
],
[
[
"And then create a device object for it:\n\n",
"_____no_output_____"
]
],
[
[
"dev = torch.device(\n \"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")",
"_____no_output_____"
]
],
[
[
"Let's update ``preprocess`` to move batches to the GPU:\n\n",
"_____no_output_____"
]
],
[
[
"def preprocess(x, y):\n return x.view(-1, 1, 28, 28).to(dev), y.to(dev)\n\n\ntrain_dl, valid_dl = get_data(train_ds, valid_ds, bs)\ntrain_dl = WrappedDataLoader(train_dl, preprocess)\nvalid_dl = WrappedDataLoader(valid_dl, preprocess)",
"_____no_output_____"
]
],
[
[
"Finally, we can move our model to the GPU.\n\n",
"_____no_output_____"
]
],
[
[
"model.to(dev)\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)",
"_____no_output_____"
]
],
[
[
"You should find it runs faster now:\n\n",
"_____no_output_____"
]
],
[
[
"fit(epochs, model, loss_func, opt, train_dl, valid_dl)",
"_____no_output_____"
]
],
[
[
"Closing thoughts\n-----------------\n\nWe now have a general data pipeline and training loop which you can use for\ntraining many types of models using Pytorch. To see how simple training a model\ncan now be, take a look at the `mnist_sample` sample notebook.\n\nOf course, there are many things you'll want to add, such as data augmentation,\nhyperparameter tuning, monitoring training, transfer learning, and so forth.\nThese features are available in the fastai library, which has been developed\nusing the same design approach shown in this tutorial, providing a natural\nnext step for practitioners looking to take their models further.\n\nWe promised at the start of this tutorial we'd explain through example each of\n``torch.nn``, ``torch.optim``, ``Dataset``, and ``DataLoader``. So let's summarize\nwhat we've seen:\n\n - **torch.nn**\n\n + ``Module``: creates a callable which behaves like a function, but can also\n contain state(such as neural net layer weights). It knows what ``Parameter`` (s) it\n contains and can zero all their gradients, loop through them for weight updates, etc.\n + ``Parameter``: a wrapper for a tensor that tells a ``Module`` that it has weights\n that need updating during backprop. Only tensors with the `requires_grad` attribute set are updated\n + ``functional``: a module(usually imported into the ``F`` namespace by convention)\n which contains activation functions, loss functions, etc, as well as non-stateful\n versions of layers such as convolutional and linear layers.\n - ``torch.optim``: Contains optimizers such as ``SGD``, which update the weights\n of ``Parameter`` during the backward step\n - ``Dataset``: An abstract interface of objects with a ``__len__`` and a ``__getitem__``,\n including classes provided with Pytorch such as ``TensorDataset``\n - ``DataLoader``: Takes any ``Dataset`` and creates an iterator which returns batches of data.\n\n",
"_____no_output_____"
]
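,
[
"To make that summary concrete, here is a minimal sketch tying the four pieces together (``MyModel``, the tensors and the hyperparameter values are illustrative placeholders, not code from this tutorial):\n::\n    model = MyModel()                            # a nn.Module; its nn.Parameter weights are tracked for us\n    opt = optim.SGD(model.parameters(), lr=0.1)  # torch.optim will update those Parameters\n    train_ds = TensorDataset(x_train, y_train)   # a Dataset with __len__ and __getitem__\n    train_dl = DataLoader(train_ds, batch_size=64, shuffle=True)\n    for xb, yb in train_dl:                      # DataLoader yields (xb, yb) batches\n        loss = loss_func(model(xb), yb)\n        loss.backward()\n        opt.step()\n        opt.zero_grad()\n\n",
"_____no_output_____"
]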
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d0609a652c7b452c6379d7bee3d565c6749ab9c6 | 107,761 | ipynb | Jupyter Notebook | 2. CNN/5D_doubleip.ipynb | nikhil-mathews/MastersPr_Predicting-Human-Pathogen-PPIs-using-Natural-Language-Processing-methods | 78bbaaf5e4e52939a522fe14aedbf5acfd29e10c | [
"MIT"
]
| null | null | null | 2. CNN/5D_doubleip.ipynb | nikhil-mathews/MastersPr_Predicting-Human-Pathogen-PPIs-using-Natural-Language-Processing-methods | 78bbaaf5e4e52939a522fe14aedbf5acfd29e10c | [
"MIT"
]
| null | null | null | 2. CNN/5D_doubleip.ipynb | nikhil-mathews/MastersPr_Predicting-Human-Pathogen-PPIs-using-Natural-Language-Processing-methods | 78bbaaf5e4e52939a522fe14aedbf5acfd29e10c | [
"MIT"
]
| null | null | null | 107,761 | 107,761 | 0.918152 | [
[
[
"import pandas as pd\n#Google colab does not have pickle\ntry:\n import pickle5 as pickle\nexcept:\n !pip install pickle5\n import pickle5 as pickle\nimport os\nimport seaborn as sns\nimport sys\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom keras.preprocessing.text import Tokenizer\nfrom keras.preprocessing.sequence import pad_sequences\nfrom keras.layers import Dense, Input, GlobalMaxPooling1D,Flatten\nfrom keras.layers import Conv1D, MaxPooling1D, Embedding, Concatenate, Lambda\nfrom keras.models import Model\nfrom sklearn.metrics import roc_auc_score,confusion_matrix,roc_curve, auc\nfrom numpy import random\nfrom keras.layers import LSTM, Bidirectional, GlobalMaxPool1D, Dropout\nfrom keras.optimizers import Adam\nfrom keras.utils.vis_utils import plot_model\n\nimport sys\nsys.path.insert(0,'/content/drive/MyDrive/ML_Data/')\nimport functions as f",
"Collecting pickle5\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f7/4c/5c4dd0462c8d3a6bc4af500a6af240763c2ebd1efdc736fc2c946d44b70a/pickle5-0.0.11.tar.gz (132kB)\n\r\u001b[K |โโโ | 10kB 17.2MB/s eta 0:00:01\r\u001b[K |โโโโโ | 20kB 12.3MB/s eta 0:00:01\r\u001b[K |โโโโโโโโ | 30kB 8.9MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโ | 40kB 7.9MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโ | 51kB 3.5MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโ | 61kB 4.1MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโ | 71kB 4.7MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโโโ | 81kB 5.3MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโ | 92kB 5.0MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโ | 102kB 5.5MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโ | 112kB 5.5MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ | 122kB 5.5MB/s eta 0:00:01\r\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 133kB 5.5MB/s \n\u001b[?25hBuilding wheels for collected packages: pickle5\n Building wheel for pickle5 (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for pickle5: filename=pickle5-0.0.11-cp37-cp37m-linux_x86_64.whl size=219265 sha256=4db361ac18314b7f73b02e1617744cb0e9d5f6acde8a0877a30a5f2c9fdfcbcb\n Stored in directory: /root/.cache/pip/wheels/a6/90/95/f889ca4aa8b0e0c7f21c8470b6f5d6032f0390a3a141a9a3bd\nSuccessfully built pickle5\nInstalling collected packages: pickle5\nSuccessfully installed pickle5-0.0.11\n"
],
[
"def load_data(D=1,randomize=False):\n try:\n with open('/content/drive/MyDrive/ML_Data/df_train_'+str(D)+'D.pickle', 'rb') as handle:\n df_train = pickle.load(handle)\n except:\n df_train = pd.read_pickle(\"C:/Users/nik00/py/proj/hyppi-train.pkl\")\n try:\n with open('/content/drive/MyDrive/ML_Data/df_test_'+str(D)+'D.pickle', 'rb') as handle:\n df_test = pickle.load(handle)\n except:\n df_test = pd.read_pickle(\"C:/Users/nik00/py/proj/hyppi-independent.pkl\")\n if randomize:\n return shuff_together(df_train,df_test)\n else:\n return df_train,df_test\n\ndf_train,df_test = load_data(5)\nprint('The data used will be:')\ndf_train[['Human','Yersinia']]",
"The data used will be:\n"
],
[
"lengths = sorted(len(s) for s in df_train['Human'])\nprint(\"Median length of Human sequence is\",lengths[len(lengths)//2])\n_ = sns.displot(lengths)\n_=plt.title(\"Most Human sequences seem to be less than 2000 in length\")",
"Median length of Human sequence is 477\n"
],
[
"lengths = sorted(len(s) for s in df_train['Yersinia'])\nprint(\"Median length of Yersinia sequence is\",lengths[len(lengths)//2])\n_ = sns.displot(lengths)\n_=plt.title(\"Most Yersinia sequences seem to be less than 1000 in length\")",
"Median length of Yersinia sequence is 334\n"
],
[
"data1_5D_doubleip_pre,data2_5D_doubleip_pre,data1_test_5D_doubleip_pre,data2_test_5D_doubleip_pre,num_words_5D,MAX_SEQUENCE_LENGTH_5D,MAX_VOCAB_SIZE_5D = f.get_seq_data_doubleip(500000,1000,df_train,df_test,pad = 'pre',show = True)",
"MAX_VOCAB_SIZE is 500000\nMAX_SEQUENCE_LENGTH is 1000\nmax sequences1_train length: 5301\nmin sequences1_train length: 12\nmedian sequences1_train length: 327\n"
],
[
"EMBEDDING_DIM_5D = 15\nVALIDATION_SPLIT = 0.2\nBATCH_SIZE = 128\nEPOCHS = 5\nDROP=0.7\n\nx1 = f.conv_model(MAX_SEQUENCE_LENGTH_5D,EMBEDDING_DIM_5D,num_words_5D,DROP)\nx2 = f.conv_model(MAX_SEQUENCE_LENGTH_5D,EMBEDDING_DIM_5D,num_words_5D,DROP)\n\nconcatenator = Concatenate(axis=1)\nx = concatenator([x1.output, x2.output])\nx = Dense(128)(x)\nx = Dropout(DROP)(x)\noutput = Dense(1, activation=\"sigmoid\",name=\"Final\")(x)\nmodel5D_CNN_doubleip = Model(inputs=[x1.input, x2.input], outputs=output)\n\nmodel5D_CNN_doubleip.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])\n#plot_model(model5D_CNN_doubleip, to_file='model_plot.png', show_shapes=True, show_layer_names=False)\n\ntrains = [data1_5D_doubleip_pre,data2_5D_doubleip_pre]\ntests = [data1_test_5D_doubleip_pre,data2_test_5D_doubleip_pre]\n\n\nmodel5D_CNN_doubleip.fit(trains, df_train['label'].values, epochs=EPOCHS, batch_size=BATCH_SIZE,validation_data=(tests, df_test['label'].values))\nprint(roc_auc_score(df_test['label'].values, model5D_CNN_doubleip.predict(tests)))\n\n#asd\n",
"Epoch 1/5\n49/49 [==============================] - 9s 165ms/step - loss: 0.6156 - accuracy: 0.6580 - val_loss: 0.5116 - val_accuracy: 0.7761\nEpoch 2/5\n49/49 [==============================] - 8s 160ms/step - loss: 0.4532 - accuracy: 0.7840 - val_loss: 0.4213 - val_accuracy: 0.8210\nEpoch 3/5\n49/49 [==============================] - 8s 159ms/step - loss: 0.2253 - accuracy: 0.9179 - val_loss: 0.4269 - val_accuracy: 0.8223\nEpoch 4/5\n49/49 [==============================] - 8s 157ms/step - loss: 0.1099 - accuracy: 0.9620 - val_loss: 0.4274 - val_accuracy: 0.8296\nEpoch 5/5\n49/49 [==============================] - 8s 159ms/step - loss: 0.0653 - accuracy: 0.9793 - val_loss: 0.4897 - val_accuracy: 0.8236\n0.8995234264434628\n"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0609b9a0781386f0d189721804704e4449abfe0 | 255,014 | ipynb | Jupyter Notebook | Decision_tree_C5.O_CART.ipynb | anagha0397/Decision-Tree | 745b3ca72ac52a93ef947c130bf5cb60ddc20f65 | [
"MIT"
]
| null | null | null | Decision_tree_C5.O_CART.ipynb | anagha0397/Decision-Tree | 745b3ca72ac52a93ef947c130bf5cb60ddc20f65 | [
"MIT"
]
| null | null | null | Decision_tree_C5.O_CART.ipynb | anagha0397/Decision-Tree | 745b3ca72ac52a93ef947c130bf5cb60ddc20f65 | [
"MIT"
]
| null | null | null | 158.788294 | 169,820 | 0.860051 | [
[
[
"import pandas as pd \nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom sklearn.datasets import load_iris\nfrom sklearn.model_selection import train_test_split # for spliting the data into train and test \nfrom sklearn.tree import DecisionTreeClassifier # For creating a decision a tree\nfrom sklearn import tree # for displaying the tree\nfrom sklearn.metrics import classification_report # for calculating accuracy\nfrom sklearn import preprocessing # As we have applied encoding technique we have used this preprocessing library ",
"_____no_output_____"
],
[
"iris = pd.read_csv(\"iris.csv\", index_col = 0) # In order to set the index to 0 we have mentioned that index_col = 0",
"_____no_output_____"
],
[
"iris.head()",
"_____no_output_____"
],
[
"# Converting the species column to numbers so we will use encoding technique called as label encoder\n\nlabel_encoder = preprocessing.LabelEncoder() # This is called function calling\niris['Species'] = label_encoder.fit_transform(iris['Species'])",
"_____no_output_____"
],
[
"iris.head()",
"_____no_output_____"
],
[
"# Splitting the data in x and y for classification purpose, for performing any classification we are required to split the data first in input and output\n\nx = iris.iloc[:,0:4]\ny = iris['Species']",
"_____no_output_____"
],
[
"x",
"_____no_output_____"
],
[
"y",
"_____no_output_____"
],
[
"iris['Species'].unique() # for determining unique values",
"_____no_output_____"
],
[
"iris.Species.value_counts()",
"_____no_output_____"
],
[
"# Splitting the data into training and test dataset\n\nx_train, x_test, y_train, y_test = train_test_split(x,y,\n test_size=0.2,\n random_state=40)",
"_____no_output_____"
]
],
[
[
"### Building decision tree classifier using entropy criteria (c5.o)",
"_____no_output_____"
]
],
[
[
"model = DecisionTreeClassifier(criterion = 'entropy',max_depth = 3)\n",
"_____no_output_____"
],
[
"model.fit(x_train,y_train)",
"_____no_output_____"
]
],
[
[
"### Plotting the decision tree",
"_____no_output_____"
]
],
[
[
"tree.plot_tree(model);",
"_____no_output_____"
],
[
"model.get_n_leaves()",
"_____no_output_____"
],
[
"## As this tree is not visible so we will display it with some another technique",
"_____no_output_____"
],
[
"# we will extract the feature names, class names and we will define the figure size so that our tree will be visible in a better way",
"_____no_output_____"
],
[
"fn = ['SepalLengthCm',\t'SepalWidthCm',\t'PetalLengthCm',\t'PetalWidthCm']\ncn = ['Iris-setosa', 'Iris-versicolar', 'Iris-virginica']\nfig,axes = plt.subplots(nrows = 1, ncols =1, figsize =(4,4), dpi = 300) #dpi is the no. of pixels\ntree.plot_tree(model, feature_names = fn, class_names = cn, filled = True); # filled = true will fill the values inside the boxes",
"_____no_output_____"
],
[
"# Predicting the builded model on our x-test data",
"_____no_output_____"
],
[
"preds = model.predict(x_test)\npd.Series(preds).value_counts()",
"_____no_output_____"
],
[
"preds",
"_____no_output_____"
],
[
"# In order to check whether the predictions are correct or wrong we will create a cross tab on y_test data",
"_____no_output_____"
],
[
"crosstable = pd.crosstab(y_test,preds)\ncrosstable",
"_____no_output_____"
],
[
"# Final step we will calculate the accuracy of our model",
"_____no_output_____"
],
[
"np.mean(preds==y_test) # We are comparing the predicted values with the actual values and calculating mean for the matches",
"_____no_output_____"
],
[
"print(classification_report(preds,y_test))",
" precision recall f1-score support\n\n 0 1.00 1.00 1.00 8\n 1 1.00 0.92 0.96 13\n 2 0.90 1.00 0.95 9\n\n accuracy 0.97 30\n macro avg 0.97 0.97 0.97 30\nweighted avg 0.97 0.97 0.97 30\n\n"
]
],
[
[
"## Building a decision tree using CART method (Classifier model)",
"_____no_output_____"
]
],
[
[
"model_1 = DecisionTreeClassifier(criterion = 'gini',max_depth = 3)",
"_____no_output_____"
],
[
"model_1.fit(x_train,y_train)",
"_____no_output_____"
],
[
"tree.plot_tree(model_1);",
"_____no_output_____"
],
[
"# predicting the values on xtest data\n\npreds = model_1.predict(x_test)",
"_____no_output_____"
],
[
"preds",
"_____no_output_____"
],
[
"pd.Series(preds).value_counts()",
"_____no_output_____"
],
[
"# calculating accuracy of the model using the actual values",
"_____no_output_____"
],
[
"np.mean(preds==y_test)",
"_____no_output_____"
]
],
[
[
"## Decision tree Regressor using CART",
"_____no_output_____"
]
],
[
[
"from sklearn.tree import DecisionTreeRegressor",
"_____no_output_____"
],
[
"# Just converting the iris data into the following way as I want my Y to be numeric\n\nX = iris.iloc[:,0:3]\nY = iris.iloc[:,3]",
"_____no_output_____"
],
[
"X_train,X_test,Y_train,Y_test = train_test_split(X,Y, test_size = 0.33, random_state = 1)",
"_____no_output_____"
],
[
"model_reg = DecisionTreeRegressor()\nmodel_reg.fit(X_train,Y_train)",
"_____no_output_____"
],
[
"preds1 = model_reg.predict(X_test)\npreds1",
"_____no_output_____"
],
[
"# Will see the correct and wrong matches",
"_____no_output_____"
],
[
"pd.crosstab(Y_test,preds1)",
"_____no_output_____"
],
[
"## We will calculate the accuracy by using score method,this is an either way to calculate the accuracy of the model",
"_____no_output_____"
],
[
"model_reg.score(X_test,Y_test) # THis model.score function will first calculate the predicted values using the X_test data and then internaly only it will compare those values with the y_test data which is our actual data",
"_____no_output_____"
]
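,
[
"# Optional sanity check: for a regressor, .score returns the R-squared of the predictions,\n# so it should match sklearn.metrics.r2_score computed on the same data\nfrom sklearn.metrics import r2_score\nr2_score(Y_test, model_reg.predict(X_test))",
"_____no_output_____"
]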
],
[
[
"model_reg.score calculates r squared value and Aic value in background",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
]
]
|
d0609e31ed6dc92d0a56a47774098cabc15e3d07 | 17,609 | ipynb | Jupyter Notebook | Lectures/Lecture6/Intro to Machine Learning Homework.ipynb | alaymodi/Spring-2019-Career-Exploration-master | 2ca9b4466090d57702e97e70fa772535b2dc00f3 | [
"MIT"
]
| null | null | null | Lectures/Lecture6/Intro to Machine Learning Homework.ipynb | alaymodi/Spring-2019-Career-Exploration-master | 2ca9b4466090d57702e97e70fa772535b2dc00f3 | [
"MIT"
]
| null | null | null | Lectures/Lecture6/Intro to Machine Learning Homework.ipynb | alaymodi/Spring-2019-Career-Exploration-master | 2ca9b4466090d57702e97e70fa772535b2dc00f3 | [
"MIT"
]
| null | null | null | 43.803483 | 1,777 | 0.61991 | [
[
[
"# Homework",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n%matplotlib inline\nimport random\nimport numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\nfrom plotting import overfittingDemo, plot_multiple_linear_regression, overlay_simple_linear_model,plot_simple_residuals\nfrom scipy.optimize import curve_fit",
"_____no_output_____"
]
],
[
[
"**Exercise 1:** What are the two \"specialities\" of machine learning? Pick one and in your own words, explain what it means. `",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 2:** What is the difference between a regression task and a classification task?",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 3:** \n1. What is parametric fitting in your understanding?\n2. Given the data $x = 1,2,3,4,5, y_1 = 2,4,6,8,10, y_2 = 2,4,8,16,32,$ what function $f_1, f_2$ will you use to fit $y_1, y_2$? Why do you choose those?\n3. Why is parametric fitting somehow not machine learning?",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 4:** Take a look at the following residual plots. Residuals can be helpful in assessing if our model is overpredicting or underpredicting certain values. Assign the variable bestplot to the letter corresponding to which residual plot indicates a good fit for a linear model.\n\n<img src='residplots.png' width=\"600\" height=\"600\">",
"_____no_output_____"
]
],
[
[
"bestplot = 'Put your letter answer between these quotes'",
"_____no_output_____"
]
],
[
[
"**Exercise 5:** Observe the following graphs. Assign each graph variable to one of the following strings: 'overfitting', 'underfitting', or 'bestfit'.\n<img src='overfit-underfit.png' width=\"800\" height=\"800\">",
"_____no_output_____"
]
],
[
[
"graph1 = \"Put answer here\"\ngraph2 = \"Put answer here\"\ngraph3 = \"Put answer here\"",
"_____no_output_____"
]
],
[
[
"**Exercise 6:** What are the 3 sets we split our initial data set into?",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 7:** Refer to the graphs below when answering the following questions (Exercise 6 and 7).\n<img src='training_vs_test_error.png' width=\"800\" height=\"800\">\nAs we increase the degree of our model, what happens to the training error and what happens to the test error? ",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 8:** What is the issue with just increasing the degree of our model to get the lowest training error possible?",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 9:** Find the gradient for ridge loss, most concretely, when $L(\\theta, \\textbf{y}, \\alpha)\n= (\\frac{1}{n} \\sum_{i = 1}^{n}(y_i - \\theta)^2) + \\frac{\\alpha }{2}\\sum_{i = 1}^{n}\\theta ^2$\nfind $\\frac{\\partial}{\\partial \\hat{\\theta}} L(\\theta, \\textbf{y},\\alpha)$, you can have a look at the class example, they are really similar.",
"_____no_output_____"
],
[
"Your Answer Here",
"_____no_output_____"
],
[
"**Exercise 10:** Following the last part of the exercise, you've already fitted your model, now let's test the performance. Make sure you check the code for the previous example we went through in class.\n\n1. copy what you had from the exercise here.",
"_____no_output_____"
]
],
[
[
"import pandas as pd\n\nmpg = pd.read_csv(\"./mpg_category.csv\", index_col=\"name\")\n\n#exercise part 1\nmpg['Old?'] = ... \n\n#exercise part 2\nmpg_train, mpg_test = ..., ...\n\n#exercise part 3\nfrom sklearn.linear_model import LogisticRegression\nsoftmax_reg = LogisticRegression(multi_class=\"multinomial\",solver=\"lbfgs\", C=10)\nX = ...\nY = ...\nsoftmax_reg.fit(X, Y)",
"_____no_output_____"
]
],
[
[
"2. create the test data set and make the prediction on test dataset",
"_____no_output_____"
]
],
[
[
"X_test = ...\nY_test = ...\npred = softmax_reg.predict(...)",
"_____no_output_____"
]
],
[
[
"3. Make the confusion matrix and tell me how you interpret each of the cell in the confusion matrix. What does different depth of blue means. You can just run the cell below, assumed what you did above is correct. You just have to answer your understanding.",
"_____no_output_____"
]
],
[
[
"from sklearn.metrics import confusion_matrix\nconfusion_matrix = confusion_matrix(Y_test, pred)\nX_label = ['old', 'new']\ndef plot_confusion_matrix(cm, title='Confusion matrix', cmap=plt.cm.Blues):\n plt.imshow(cm, interpolation='nearest', cmap=cmap)\n plt.title(title)\n plt.colorbar()\n tick_marks = np.arange(len(X_label))\n plt.xticks(tick_marks, X_label, rotation=45)\n plt.yticks(tick_marks, X_label,)\n plt.tight_layout()\n plt.ylabel('True label')\n plt.xlabel('Predicted label')\nplot_confusion_matrix(confusion_matrix)\n# confusion_matrix",
"_____no_output_____"
]
],
[
[
"Your Answer Here",
"_____no_output_____"
]
],
[
[
"# be sure to hit save (File > Save and Checkpoint) or Ctrl/Command-S before you run the cell!\nfrom submit import create_and_submit\n\ncreate_and_submit(['Intro to Machine Learning Homework.ipynb'], verbose=True)",
"Parsed Intro to Machine Learning Homework.ipynb\nEnter your Berkeley email address: [email protected]\nPosting answers for Intro to Machine Learning Homework\nYour submission: {'exercise-1': 'Your Answer Here', 'exercise-1_output': None, 'exercise-2': 'Your Answer Here', 'exercise-2_output': None, 'exercise-3': 'Your Answer Here', 'exercise-3_output': None, 'exercise-4': \"bestplot = 'Put your letter answer between these quotes'\", 'exercise-4_output': None, 'exercise-5': 'graph1 = \"Put answer here\"\\ngraph2 = \"Put answer here\"\\ngraph3 = \"Put answer here\"', 'exercise-5_output': None, 'exercise-6': 'Your Answer Here', 'exercise-6_output': None, 'exercise-7': 'Your Answer Here', 'exercise-7_output': None, 'exercise-8': 'Your Answer Here', 'exercise-8_output': None, 'exercise-9': 'Your Answer Here', 'exercise-9_output': None, 'exercise-10-1': 'import pandas as pd\\n\\nmpg = pd.read_csv(\"./mpg_category.csv\", index_col=\"name\")\\n\\n#exercise part 1\\nmpg[\\'Old?\\'] = ... \\n\\n#exercise part 2\\nmpg_train, mpg_test = ..., ...\\n\\n#exercise part 3\\nfrom sklearn.linear_model import LogisticRegression\\nsoftmax_reg = LogisticRegression(multi_class=\"multinomial\",solver=\"lbfgs\", C=...)\\nX = ...\\nY = ...\\nsoftmax_reg.fit(X, Y)', 'exercise-10-1_output': None, 'exercise-10-2': '2. create the test data set and make the prediction on test dataset', 'exercise-10-2_output': None, 'exercise-10-3': '3. Make the confusion matrix and tell me how you interpret each of the cell in the confusion matrix. What does different depth of blue means. You can just run the cell below, assumed what you did above is correct. You just have to answer your understanding.', 'exercise-10-3_output': None, 'exercise-10-4': 'Your Answer Here', 'exercise-10-4_output': None, 'email': '[email protected]', 'sheet': 'Intro to Machine Learning Homework', 'timestamp': datetime.datetime(2019, 3, 18, 16, 46, 54, 7302)}\n\nSubmitted!\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d060a7328730ae52269f71692aeaffba9f64d9a6 | 15,673 | ipynb | Jupyter Notebook | _notebooks/2022-01-03-cs231n.ipynb | star77sa/TIL-Blog | 782a24bf0b2324a66024e984dd1c7f3536cd17b9 | [
"Apache-2.0"
]
| null | null | null | _notebooks/2022-01-03-cs231n.ipynb | star77sa/TIL-Blog | 782a24bf0b2324a66024e984dd1c7f3536cd17b9 | [
"Apache-2.0"
]
| 1 | 2021-07-24T16:33:20.000Z | 2021-07-24T16:43:02.000Z | _notebooks/2022-01-03-cs231n.ipynb | star77sa/TIL-Blog | 782a24bf0b2324a66024e984dd1c7f3536cd17b9 | [
"Apache-2.0"
]
| null | null | null | 36.79108 | 722 | 0.617559 | [
[
[
"# CS231n_CNN for Visual Recognition\n> Stanford University CS231n\n\n- toc: true \n- badges: true\n- comments: true\n- categories: [CNN]\n- image: images/",
"_____no_output_____"
],
[
"---",
"_____no_output_____"
],
[
"- http://cs231n.stanford.edu/",
"_____no_output_____"
],
[
"---\n# Image Classification\n\n",
"_____no_output_____"
],
[
"- **Image Classification:** We are given a **Training Set** of labeled images, asked to predict labels on **Test Set.** Common to report the **Accuracy** of predictions(fraction of correctly predicted images)\n\n- We introduced the **k-Nearest Neighbor Classifier**, which predicts the labels based on nearest images in the training set\n\n- We saw that the choice of distance and the value of k are **hyperparameters** that are tuned using a **validation set**, or through **cross-validation** if the size of the data is small.\n\n- Once the best set of hyperparameters is chosen, the classifier is evaluated once on the test set, and reported as the performance of kNN on that data.",
"_____no_output_____"
],
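[
"As a rough sketch of the nearest-neighbor idea above (`Xtr`/`ytr` are a hypothetical matrix of flattened training images and its label vector, `x` a single flattened test image; illustrative, not course code):\n\n```python\nimport numpy as np\ndistances = np.sum(np.abs(Xtr - x), axis=1)  # L1 distance to every training image\nprediction = ytr[np.argmin(distances)]       # label of the single nearest neighbor (k=1)\n```",
"_____no_output_____"
],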
[
"- Nearest Neighbor ๋ถ๋ฅ๊ธฐ๋ CIFAR-10 ๋ฐ์ดํฐ์
์์ ์ฝ 40% ์ ๋์ ์ ํ๋๋ฅผ ๋ณด์ด๋ ๊ฒ์ ํ์ธํ์๋ค. ์ด ๋ฐฉ๋ฒ์ ๊ตฌํ์ด ๋งค์ฐ ๊ฐ๋จํ์ง๋ง, ํ์ต ๋ฐ์ดํฐ์
์ ์ฒด๋ฅผ ๋ฉ๋ชจ๋ฆฌ์ ์ ์ฅํด์ผ ํ๊ณ , ์๋ก์ด ํ
์คํธ ์ด๋ฏธ์ง๋ฅผ ๋ถ๋ฅํ๊ณ ํ๊ฐํ ๋ ๊ณ์ฐ๋์ด ๋งค์ฐ ๋ง๋ค.\n\n- ๋จ์ํ ํฝ์
๊ฐ๋ค์ L1์ด๋ L2 ๊ฑฐ๋ฆฌ๋ ์ด๋ฏธ์ง์ ํด๋์ค๋ณด๋ค ๋ฐฐ๊ฒฝ์ด๋ ์ด๋ฏธ์ง์ ์ ์ฒด์ ์ธ ์๊น ๋ถํฌ ๋ฑ์ ๋ ํฐ ์ํฅ์ ๋ฐ๊ธฐ ๋๋ฌธ์ ์ด๋ฏธ์ง ๋ถ๋ฅ ๋ฌธ์ ์ ์์ด์ ์ถฉ๋ถํ์ง ๋ชปํ๋ค๋ ์ ์ ๋ณด์๋ค.",
"_____no_output_____"
],
[
"---\n# Linear Classification",
"_____no_output_____"
],
[
"- We defined a **score function** from image pixels to class scores (in this section, a linear function that depends on weights **W** and biases **b**).\n\n- Unlike kNN classifier, the advantage of this **parametric approach** is that once we learn the parameters we can discard the training data. Additionally, the prediction for a new test image is fast since it requires a single matrix multiplication with **W**, not an exhaustive comparison to every single training example.\n\n- We introduced the **bias trick**, which allows us to fold the bias vector into the weight matrix for convenience of only having to keep track of one parameter matrix.\nํ๋์ ๋งค๊ฐ๋ณ์ ํ๋ ฌ๋ง ์ถ์ ํด์ผ ํ๋ ํธ์๋ฅผ ์ํด ํธํฅ ๋ฒกํฐ๋ฅผ ๊ฐ์ค์น ํ๋ ฌ๋ก ์ ์ ์ ์๋ ํธํฅ ํธ๋ฆญ์ ๋์
ํ์ต๋๋ค .\n\n- We defined a **loss function** (we introduced two commonly used losses for linear classifiers: the **SVM** and the **Softmax**) that measures how compatible a given set of parameters is with respect to the ground truth labels in the training dataset. We also saw that the loss function was defined in such way that making good predictions on the training data is equivalent to having a small loss.",
"_____no_output_____"
],
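[
"A minimal sketch of the linear score function and the bias trick (`W`, `b`, `x` and `W_ext` are hypothetical; shapes assume 10 classes and 3072-dimensional CIFAR-10 images):\n\n```python\nimport numpy as np\nscores = W.dot(x) + b         # f(x) = Wx + b, W is 10x3072, x is 3072-dimensional\nx_ext = np.append(x, 1.0)     # bias trick: append a constant 1 to x\nscores_bt = W_ext.dot(x_ext)  # W_ext (10x3073) has b folded in as an extra column\n```",
"_____no_output_____"
],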
[
"---",
"_____no_output_____"
],
[
"# Optimization",
"_____no_output_____"
],
[
"- We developed the intuition of the loss function as a **high-dimensional optimization landscape** in which we are trying to reach the bottom. The working analogy we developed was that of a blindfolded hiker who wishes to reach the bottom. In particular, we saw that the SVM cost function is piece-wise linear and bowl-shaped.\n\n- We motivated the idea of optimizing the loss function with **iterative refinement**, where we start with a random set of weights and refine them step by step until the loss is minimized.\n\n- We saw that the **gradient** of a function gives the steepest ascent direction and we discussed a simple but inefficient way of computing it numerically using the finite difference approximation (the finite difference being the value of h used in computing the numerical gradient).\n\n- We saw that the parameter update requires a tricky setting of the **step size** (or the **learning rate**) that must be set just right: if it is too low the progress is steady but slow. If it is too high the progress can be faster, but more risky. We will explore this tradeoff in much more detail in future sections.\n\n- We discussed the tradeoffs between computing the **numerical** and **analytic** gradient. The numerical gradient is simple but it is approximate and expensive to compute. The analytic gradient is exact, fast to compute but more error-prone since it requires the derivation of the gradient with math. Hence, in practice we always use the analytic gradient and then perform a **gradient check**, in which its implementation is compared to the numerical gradient.\n\n- We introduced the **Gradient Descent** algorithm which iteratively computes the gradient and performs a parameter update in loop.",
"_____no_output_____"
],
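[
"The resulting loop, as a sketch (`loss_fun`, `data`, `weights`, `step_size` and `evaluate_gradient` are placeholders):\n\n```python\nwhile True:\n    weights_grad = evaluate_gradient(loss_fun, data, weights)  # analytic gradient, verified by a gradient check\n    weights += -step_size * weights_grad                       # update along the negative gradient direction\n```",
"_____no_output_____"
],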
[
"---",
"_____no_output_____"
],
[
"# Backprop",
"_____no_output_____"
],
[
"- We developed intuition for what the gradients mean, how they flow backwards in the circuit, and how they communicate which part of the circuit should increase or decrease and with what force to make the final output higher.\n\n- We discussed the importance of **staged computation** for practical implementations of backpropagation. You always want to break up your function into modules for which you can easily derive local gradients, and then chain them with chain rule. Crucially, you almost never want to write out these expressions on paper and differentiate them symbolically in full, because you never need an explicit mathematical equation for the gradient of the input variables. Hence, decompose your expressions into stages such that you can differentiate every stage independently (the stages will be matrix vector multiplies, or max operations, or sum operations, etc.) and then backprop through the variables one step at a time.",
"_____no_output_____"
],
[
"---\n# Neural Network - 1",
"_____no_output_____"
],
[
"- We introduced a very coarse model of a biological **neuron**\n\n- ์ค์ ์ฌ์ฉ๋๋ ๋ช๋ช **ํ์ฑํ ํจ์** ์ ๋ํด ๋
ผ์ํ์๊ณ , ReLU๊ฐ ๊ฐ์ฅ ์ผ๋ฐ์ ์ธ ์ ํ์ด๋ค.\n - ํ์ฑํ ํจ์ ์ฐ๋ ์ด์ : ๋ฐ์ดํฐ๋ฅผ ๋น์ ํ์ผ๋ก ๋ฐ๊พธ๊ธฐ ์ํด์. ์ ํ์ด๋ฉด ์๋์ธต์ด 1๊ฐ๋ฐ์ ์๋์ด\n\n\n- We introduced **Neural Networks** where neurons are connected with **Fully-Connected layers** where neurons in adjacent layers have full pair-wise connections, but neurons within a layer are not connected.\n\n- ์ฐ๋ฆฌ๋ layered architecture๋ฅผ ํตํด ํ์ฑํ ํจ์์ ๊ธฐ๋ฅ ์ ์ฉ๊ณผ ๊ฒฐํฉ๋ ํ๋ ฌ ๊ณฑ์ ๊ธฐ๋ฐ์ผ๋ก ์ ๊ฒฝ๋ง์ ๋งค์ฐ ํจ์จ์ ์ผ๋ก ํ๊ฐํ ์ ์์์ ๋ณด์๋ค.\n\n- ์ฐ๋ฆฌ๋ Neural Networks๊ฐ **universal function approximators**(NN์ผ๋ก ์ด๋ ํ ํจ์๋ ๊ทผ์ฌ์ํฌ ์ ์๋ค)์์ ๋ณด์์ง๋ง, ์ฐ๋ฆฌ๋ ๋ํ ์ด ํน์ฑ์ด ๊ทธ๋ค์ ํธ์ฌ์ ์ธ ์ฌ์ฉ๊ณผ ๊ฑฐ์ ๊ด๋ จ์ด ์๋ค๋ ์ฌ์ค์ ๋ํด ๋
ผ์ํ์๋ค. They are used because they make certain โrightโ assumptions about the functional forms of functions that come up in practice.\n\n- ์ฐ๋ฆฌ๋ ํฐ network๊ฐ ์์ network๋ณด๋ค ํญ์ ๋ ์ ์๋ํ์ง๋ง, ๋ ๋์ model capacity๋ ๋ ๊ฐ๋ ฅํ ์ ๊ทํ(๋์ ๊ฐ์ค์น ๊ฐ์๊ฐ์)๋ก ์ ์ ํ ํด๊ฒฐ๋์ด์ผ ํ๋ฉฐ, ๊ทธ๋ ์ง ์์ผ๋ฉด ์ค๋ฒํ๋ ์ ์๋ค๋ ์ฌ์ค์ ๋ํด ๋
ผ์ํ์๋ค. ์ดํ ์น์
์์ ๋ ๋ง์ ํํ์ ์ ๊ทํ(ํนํ dropout)๋ฅผ ๋ณผ ์ ์์ ๊ฒ์ด๋ค.",
"_____no_output_____"
],
[
"---\n# Neural Network - 2",
"_____no_output_____"
],
[
"- ๊ถ์ฅ๋๋ ์ ์ฒ๋ฆฌ๋ ๋ฐ์ดํฐ์ ์ค์์ ํ๊ท ์ด 0์ด ๋๋๋ก ํ๊ณ (zero centered), ์ค์ผ์ผ์ [-1, 1]๋ก ์ ๊ทํ ํ๋ ๊ฒ ์
๋๋ค.\n - ์ฌ๋ฐ๋ฅธ ์ ์ฒ๋ฆฌ ๋ฐฉ๋ฒ : ์๋ฅผ๋ค์ด ํ๊ท ์ฐจ๊ฐ ๊ธฐ๋ฒ์ ์ธ ๋ ํ์ต, ๊ฒ์ฆ, ํ
์คํธ๋ฅผ ์ํ ๋ฐ์ดํฐ๋ฅผ ๋จผ์ ๋๋ ํ ํ์ต ๋ฐ์ดํฐ๋ฅผ ๋์์ผ๋ก ํ๊ท ๊ฐ์ ๊ตฌํ ํ์ ํ๊ท ์ฐจ๊ฐ ์ ์ฒ๋ฆฌ๋ฅผ ๋ชจ๋ ๋ฐ์ดํฐ๊ตฐ(ํ์ต, ๊ฒ์ฆ, ํ
์คํธ)์ ์ ์ฉํ๋ ๊ฒ์ด๋ค.\n\n- ReLU๋ฅผ ์ฌ์ฉํ๊ณ ์ด๊ธฐํ๋ $\\sqrt{2/n}$ ์ ํ์ค ํธ์ฐจ๋ฅผ ๊ฐ๋ ์ ๊ท ๋ถํฌ์์ ๊ฐ์ค์น๋ฅผ ๊ฐ์ ธ์ ์ด๊ธฐํํฉ๋๋ค. ์ฌ๊ธฐ์ $n$์ ๋ด๋ฐ์ ๋ํ ์
๋ ฅ ์์
๋๋ค. E.g. in numpy: `w = np.random.randn(n) * sqrt(2.0/n)`.\n\n- L2 regularization๊ณผ ๋๋์์์ ์ฌ์ฉ (the inverted version)\n\n- Batch normalization ์ฌ์ฉ (์ด๊ฑธ์ฐ๋ฉด ๋๋์์์ ์ ์์)\n\n- ์ค์ ๋ก ์ํํ ์ ์๋ ๋ค์ํ ์์
๊ณผ ๊ฐ ์์
์ ๋ํ ๊ฐ์ฅ ์ผ๋ฐ์ ์ธ ์์ค ํจ์์ ๋ํด ๋
ผ์ํ๋ค.",
"_____no_output_____"
],
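[
"A sketch of the preprocessing and initialization recipe above (assuming a data matrix `X` with one example per row, with mean and std taken from the training split only):\n\n```python\nimport numpy as np\nX -= np.mean(X, axis=0)                  # zero-center the data\nX /= np.std(X, axis=0)                   # normalize each dimension\nw = np.random.randn(n) * np.sqrt(2.0/n)  # init for a ReLU neuron with n inputs\n```",
"_____no_output_____"
],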
[
"---\n# Neural Network - 3",
"_____no_output_____"
],
[
"์ ๊ฒฝ๋ง(neural network)๋ฅผ ํ๋ จํ๊ธฐ ์ํ์ฌ:\n\n- ์ฝ๋๋ฅผ ์ง๋ ์ค๊ฐ์ค๊ฐ์ ์์ ๋ฐฐ์น๋ก ๊ทธ๋ผ๋์ธํธ๋ฅผ ์ฒดํฌํ๊ณ , ๋ปํ์ง ์๊ฒ ํ์ด๋์ฌ ์ํ์ ์ธ์งํ๊ณ ์์ด์ผ ํ๋ค.\n\n- ์ฝ๋๊ฐ ์ ๋๋ก ๋์๊ฐ๋์ง ํ์ธํ๋ ๋ฐฉ๋ฒ์ผ๋ก, ์์คํจ์๊ฐ์ ์ด๊ธฐ๊ฐ์ด ํฉ๋ฆฌ์ ์ธ์ง ๊ทธ๋ฆฌ๊ณ ๋ฐ์ดํฐ์ ์ผ๋ถ๋ถ์ผ๋ก 100%์ ํ๋ จ ์ ํ๋๋ฅผ ๋ฌ์ฑํ ์ ์๋์ง ํ์ธํด์ผํ๋ค.\n\n- ํ๋ จ ๋์, ์์คํจ์์ train/validation ์ ํ๋๋ฅผ ๊ณ์ ์ดํด๋ณด๊ณ , (์ด๊ฒ ์ข ๋ ๋ฉ์ ธ ๋ณด์ด๋ฉด) ํ์ฌ ํ๋ผ๋ฏธํฐ ๊ฐ ๋๋น ์
๋ฐ์ดํธ ๊ฐ ๋ํ ์ดํด๋ณด๋ผ (๋์ถฉ ~ 1e-3 ์ ๋ ๋์ด์ผ ํ๋ค). ๋ง์ฝ ConvNet์ ๋ค๋ฃจ๊ณ ์๋ค๋ฉด, ์ฒซ ์ธต์ ์จ์ดํธ๊ฐ๋ ์ดํด๋ณด๋ผ.\n\n- ์
๋ฐ์ดํธ ๋ฐฉ๋ฒ์ผ๋ก ์ถ์ฒํ๋ ๊ฑด SGD+Nesterov Momentum ํน์ Adam์ด๋ค.\n\n- ํ์ต ์๋๋ฅผ ํ๋ จ ๋์ ๊ณ์ ํ๊ฐ์์ผ๋ผ. ์๋ฅผ ๋ค๋ฉด, ์ ํด์ง ์ํญ ์ ๋ค์ (ํน์ ๊ฒ์ฆ ์ ํ๋๊ฐ ์์นํ๋ค๊ฐ ํ๊ฐ์ธ๋ก ๊บพ์ด๋ฉด) ํ์ต ์๋๋ฅผ ๋ฐ์ผ๋ก ๊น์๋ผ.\n\n- Hyper parameter ๊ฒ์์ grid search๊ฐ ์๋ random search์ผ๋ก ์ํํ๋ผ. ์ฒ์์๋ ์ฑ๊ธด ๊ท๋ชจ์์ ํ์ํ๋ค๊ฐ (๋์ hyper parameter ๋ฒ์, 1-5 epoch ์ ๋๋ง ํ์ต), ์ ์ ์ด์ดํ๊ฒ ๊ฒ์ํ๋ผ. (์ข์ ๋ฒ์, ๋ ๋ง์ ์ํญ์์ ํ์ต)\n- ์ถ๊ฐ์ ์ธ ๊ฐ์ ์ ์ํ์ฌ ๋ชจํ ์์๋ธ์ ๊ตฌ์ถํ๋ผ.",
"_____no_output_____"
],
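[
"A sketch of the coarse-to-fine random search suggested above (the ranges are illustrative; sampling is done on a log scale):\n\n```python\nimport numpy as np\nfor trial in range(100):\n    lr = 10 ** np.random.uniform(-6, -3)   # learning rate\n    reg = 10 ** np.random.uniform(-5, 5)   # regularization strength\n    # train for 1-5 epochs with (lr, reg), keep the best, then narrow the ranges\n```",
"_____no_output_____"
],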
[
"---\n# CNN",
"_____no_output_____"
],
[
"- ConvNet ์ํคํ
์ณ๋ ์ฌ๋ฌ ๋ ์ด์ด๋ฅผ ํตํด ์
๋ ฅ ์ด๋ฏธ์ง ๋ณผ๋ฅจ์ ์ถ๋ ฅ ๋ณผ๋ฅจ (ํด๋์ค ์ ์)์ผ๋ก ๋ณํ์์ผ ์ค๋ค.\n\n- ConvNet์ ๋ช ๊ฐ์ง ์ข
๋ฅ์ ๋ ์ด์ด๋ก ๊ตฌ์ฑ๋์ด ์๋ค. CONV/FC/RELU/POOL ๋ ์ด์ด๊ฐ ํ์ฌ ๊ฐ์ฅ ๋ง์ด ์ฐ์ธ๋ค.\n\n- ๊ฐ ๋ ์ด์ด๋ 3์ฐจ์์ ์
๋ ฅ ๋ณผ๋ฅจ์ ๋ฏธ๋ถ ๊ฐ๋ฅํ ํจ์๋ฅผ ํตํด 3์ฐจ์ ์ถ๋ ฅ ๋ณผ๋ฅจ์ผ๋ก ๋ณํ์ํจ๋ค.\n\n- parameter๊ฐ ์๋ ๋ ์ด์ด๋ ์๊ณ ๊ทธ๋ ์ง ์์ ๋ ์ด์ด๋ ์๋ค (FC/CONV๋ parameter๋ฅผ ๊ฐ๊ณ ์๊ณ , RELU/POOL ๋ฑ์ parameter๊ฐ ์์).\n\n- hyperparameter๊ฐ ์๋ ๋ ์ด์ด๋ ์๊ณ ๊ทธ๋ ์ง ์์ ๋ ์ด์ด๋ ์๋ค (CONV/FC/POOL ๋ ์ด์ด๋ hyperparameter๋ฅผ ๊ฐ์ง๋ฉฐ ReLU๋ ๊ฐ์ง์ง ์์).\n\n- stride, zero-padding ...",
"_____no_output_____"
],
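[
"For reference, the spatial output size of a CONV layer follows from the input width `W_in`, filter size `F`, zero-padding `P` and stride `S` (the standard formula, stated here explicitly):\n\n```python\nW_out = (W_in - F + 2 * P) // S + 1  # e.g. (32 - 3 + 2*1) // 1 + 1 = 32 preserves the width\n```",
"_____no_output_____"
],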
[
"---\n# Spatial Localization and Detection",
"_____no_output_____"
],
[
"<img src='img/cs231/detect.png' width=\"500\" height=\"500\">",
"_____no_output_____"
],
[
"- Classification : ์ฌ์ง์ ๋ํ ๋ผ๋ฒจ์ด ์์ํ\n- Localization : ์ฌ์ง์ ๋ํ ์์๊ฐ ์์ํ (x, y, w, h)\n- Detection : ์ฌ์ง์ ๋ํ ์ฌ๋ฌ๊ฐ์ ์์๊ฐ ์์ํ DOG(x, y, w, h), CAT(x, y, w, h), ...\n- Segmentation : ์์๊ฐ ์๋๋ผ ๊ฐ์ฒด์ ์ด๋ฏธ์ง ํ์์ ๊ทธ๋๋ก.",
"_____no_output_____"
],
[
"- Localization method : localization as Regression, Sliding Window : Overfeat\n\n- Region Proposals : ๋น์ทํ ์๊น, ํ
์ค์ณ๋ฅผ ๊ธฐ์ค์ผ๋ก ๋ฐ์ค๋ฅผ ์์ฑ\n\n- Detection :\n - R-CNN : Region-based CNN. Region -> CNN\n - ๋ฌธ์ ์ : Region proposal ๋ง๋ค CNN์ ๋๋ ค์ ์๊ฐ์ด ๋งค์ฐ ๋ง์ด๋ ๋ค.\n - Fast R-CNN : CNN -> Region\n - ๋ฌธ์ ์ : Region Proposal ๊ณผ์ ์์ ์๊ฐ์ด ๋ ๋ค.\n - Faster R-CNN : Region Proposals๋ CNN์ ์ด์ฉํด์ ํด๋ณด์.\n \n - YOLO(You Only Look Once) : Detection as Regression\n - ์ฑ๋ฅ์ Faster R-CNN๋ณด๋ค ๋จ์ด์ง์ง๋ง, ์๋๊ฐ ๋งค์ฐ ๋น ๋ฅด๋ค.",
"_____no_output_____"
],
[
"---\n# CNNs in practice",
"_____no_output_____"
],
[
"- Data Augmentation\n - Change the pixels without changing the label\n - Train on transformed data\n - VERY widely used\n \n .....\n \n 1. Horizontal flips\n 2. Random crops/scales\n 3. Color jitter",
"_____no_output_____"
],
[
"- Transfer learning\n\n ์ด๋ฏธ์ง๋ท์ ํด๋์ค์ ๊ด๋ จ์๋ ๋ฐ์ดํฐ๋ผ๋ฉด ์ฌ์ ํ์ต์ ์ฑ๋ฅ์ด ์ข์์ง๋๊ฒ ์ดํด๊ฐ๋๋๋ฐ ๊ด๋ จ์๋ ์ด๋ฏธ์ง (e.g. mri๊ฐ์ ์๋ฃ์ด๋ฏธ์ง)์ ๊ฒฝ์ฐ๋ ์ฑ๋ฅ์ด ์ข์์ง๋๋ฐ ๊ทธ ์ด์ ๋ ๋ฌด์์ธ๊ฐ?\n\n -> ์๋จ์์ ์ฃ์ง, ์ปฌ๋ฌ๊ฐ์ low level์ feature๋ฅผ ์ธ์, ๋ค๋ก๊ฐ์๋ก ์์๋ ๋ฒจ์ ์ธ์. lowlevel์ ๋ฏธ๋ฆฌ ํ์ตํด๋๋๋ค๋ ๊ฒ์ ๊ทธ ์ด๋ค ์ด๋ฏธ์ง๋ฅผ ๋ถ์ํ ๋๋ ๋์์ด๋๋ค!",
"_____no_output_____"
],
[
"- How to stack convolutions:\n\n - Replace large convolutions (5x5, 7x7) with stacks of 3x3 convolutions\n - 1x1 \"bottleneck\" convolutions are very efficient\n - Can factor NxN convolutions into 1xN and Nx1\n - All of the above give fewer parameters, less compute, more nonlinearity\n \n ๋ ์ ์ ํ๋ผ๋ฏธํฐ, ๋ ์ ์ ์ปดํจํ
์ฐ์ฐ, ๋ ๋ง์ nonlinearity(ํํฐ ์ฌ์ด์ฌ์ด ReLU๋ฑ์ด ๋ค์ด๊ฐ๊ธฐ์)",
"_____no_output_____"
],
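[
"A quick parameter count behind the first bullet above (C input and output channels, biases ignored):\n\n```python\nC = 64\nparams_7x7 = C * C * 7 * 7              # 200704 weights for one 7x7 conv\nparams_3x3_stack = 3 * (C * C * 3 * 3)  # 110592 weights for three stacked 3x3 convs, same receptive field\n```",
"_____no_output_____"
],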
[
"- Computing Convolutions:\n - im2col : Easy to implement, but big memory overhead.\n - FFT : Big speedups for small kernels\n - \"Fast Algorithms\" : seem promising, not widely used yet",
"_____no_output_____"
],
[
"---\n# Segmentaion",
"_____no_output_____"
],
[
"- Semantic Segmentation\n - Classify all pixels\n - Fully convolutional models, downsample then upsample\n - Learnable upsampling: fractionally strided convolution\n - Skip connections can help\n\n...\n\n- Instance Segmentation\n - Detect instance, generate mask\n - Similar pipelines to object detection",
"_____no_output_____"
]
]
]
| [
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
]
|
d060b12257f61ab276eae7b6126d23a80f820034 | 196,953 | ipynb | Jupyter Notebook | colab/resnet.ipynb | WindQAQ/iree | 68fc75cbe6e4bdf175885c17d41f4d61a55c3537 | [
"Apache-2.0"
]
| null | null | null | colab/resnet.ipynb | WindQAQ/iree | 68fc75cbe6e4bdf175885c17d41f4d61a55c3537 | [
"Apache-2.0"
]
| null | null | null | colab/resnet.ipynb | WindQAQ/iree | 68fc75cbe6e4bdf175885c17d41f4d61a55c3537 | [
"Apache-2.0"
]
| null | null | null | 501.152672 | 185,484 | 0.926038 | [
[
[
"##### Copyright 2020 Google LLC.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");",
"_____no_output_____"
]
],
[
[
"#@title License header\n# Copyright 2020 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.",
"_____no_output_____"
]
],
[
[
"# ResNet\n\n[ResNet](https://arxiv.org/abs/1512.03385) is a deep neural network architecture for image recognition.\n\nThis notebook\n\n* Constructs a [ResNet50](https://www.tensorflow.org/api_docs/python/tf/keras/applications/ResNet50) model using `tf.keras`, with weights pretrained using the[ImageNet](http://www.image-net.org/) dataset\n* Compiles that model with IREE\n* Tests TensorFlow and IREE execution of the model on a sample image",
"_____no_output_____"
]
],
[
[
"#@title Imports and common setup\n\nfrom pyiree import rt as ireert\nfrom pyiree.tf import compiler as ireec\nfrom pyiree.tf.support import tf_utils\n\nimport tensorflow as tf\nfrom matplotlib import pyplot as plt",
"_____no_output_____"
],
[
"#@title Construct a pretrained ResNet model with ImageNet weights\n\ntf.keras.backend.set_learning_phase(False)\n\n# Static shape, including batch size (1).\n# Can be dynamic once dynamic shape support is ready.\nINPUT_SHAPE = [1, 224, 224, 3]\n\ntf_model = tf.keras.applications.resnet50.ResNet50(\n weights=\"imagenet\", include_top=True, input_shape=tuple(INPUT_SHAPE[1:]))\n\n# Wrap the model in a tf.Module to compile it with IREE.\nclass ResNetModule(tf.Module):\n\n def __init__(self):\n super(ResNetModule, self).__init__()\n self.m = tf_model\n self.predict = tf.function(\n input_signature=[tf.TensorSpec(INPUT_SHAPE, tf.float32)])(tf_model.call)",
"_____no_output_____"
],
[
"#@markdown ### Backend Configuration\n\nbackend_choice = \"iree_vmla (CPU)\" #@param [ \"iree_vmla (CPU)\", \"iree_llvmjit (CPU)\", \"iree_vulkan (GPU/SwiftShader)\" ]\nbackend_choice = backend_choice.split(\" \")[0]\nbackend = tf_utils.BackendInfo(backend_choice)",
"_____no_output_____"
],
[
"#@title Compile ResNet with IREE\n# This may take a few minutes.\niree_module = backend.compile(ResNetModule, [\"predict\"])",
"Created IREE driver vmla: <iree.bindings.python.pyiree.rt.binding.HalDriver object at 0x7fef48c98298>\nSystemContext driver=<iree.bindings.python.pyiree.rt.binding.HalDriver object at 0x7fef48c98298>\n"
],
[
"#@title Load a test image of a [labrador](https://commons.wikimedia.org/wiki/File:YellowLabradorLooking_new.jpg)\n\ndef load_image(path_to_image):\n image = tf.io.read_file(path_to_image)\n image = tf.image.decode_image(image, channels=3)\n image = tf.image.resize(image, (224, 224))\n image = image[tf.newaxis, :]\n return image\n\ncontent_path = tf.keras.utils.get_file(\n 'YellowLabradorLooking_new.jpg',\n 'https://storage.googleapis.com/download.tensorflow.org/example_images/YellowLabradorLooking_new.jpg')\ncontent_image = load_image(content_path)\n\nprint(\"Test image:\")\nplt.imshow(content_image.numpy().reshape(224, 224, 3) / 255.0)\nplt.axis(\"off\")\nplt.tight_layout()",
"Test image:\n"
],
[
"#@title Model pre- and post-processing\ninput_data = tf.keras.applications.resnet50.preprocess_input(content_image)\n\ndef decode_result(result):\n return tf.keras.applications.resnet50.decode_predictions(result, top=3)[0]",
"_____no_output_____"
],
[
"#@title Run TF model\n\nprint(\"TF prediction:\")\ntf_result = tf_model.predict(input_data)\nprint(decode_result(tf_result))",
"TF prediction:\n[('n02091244', 'Ibizan_hound', 0.12879108), ('n02099712', 'Labrador_retriever', 0.12632962), ('n02091831', 'Saluki', 0.09625229)]\n"
],
[
"#@title Run the model compiled with IREE\n\nprint(\"IREE prediction:\")\niree_result = iree_module.predict(input_data)\nprint(decode_result(iree_result))",
"IREE prediction:\n[('n02091244', 'Ibizan_hound', 0.12879075), ('n02099712', 'Labrador_retriever', 0.1263297), ('n02091831', 'Saluki', 0.09625255)]\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d060b4c997abb26f63478a8f14b093cdba70911a | 20,118 | ipynb | Jupyter Notebook | notebooks/art-for-tensorflow-v2-keras.ipynb | changx03/adversarial-robustness-toolbox | e21e0ff8ec5a88441da164c90376d63e07193242 | [
"MIT"
]
| 1,350 | 2020-07-14T08:06:55.000Z | 2022-03-31T19:22:25.000Z | notebooks/art-for-tensorflow-v2-keras.ipynb | bochengC/adversarial-robustness-toolbox | 031ffe4426678487de0cbcec5ad13f355e570bc8 | [
"MIT"
]
| 936 | 2020-07-14T03:33:00.000Z | 2022-03-31T23:05:29.000Z | notebooks/art-for-tensorflow-v2-keras.ipynb | bochengC/adversarial-robustness-toolbox | 031ffe4426678487de0cbcec5ad13f355e570bc8 | [
"MIT"
]
| 413 | 2020-07-16T16:00:16.000Z | 2022-03-29T10:31:12.000Z | 50.295 | 5,712 | 0.772641 | [
[
[
"# ART for TensorFlow v2 - Keras API",
"_____no_output_____"
],
[
"This notebook demonstrate applying ART with the new TensorFlow v2 using the Keras API. The code follows and extends the examples on www.tensorflow.org.",
"_____no_output_____"
]
],
[
[
"import warnings\nwarnings.filterwarnings('ignore')\nimport tensorflow as tf\ntf.compat.v1.disable_eager_execution()\nimport numpy as np\nfrom matplotlib import pyplot as plt\n\nfrom art.estimators.classification import KerasClassifier\nfrom art.attacks.evasion import FastGradientMethod, CarliniLInfMethod",
"_____no_output_____"
],
[
"if tf.__version__[0] != '2':\n raise ImportError('This notebook requires TensorFlow v2.')",
"_____no_output_____"
]
],
[
[
"# Load MNIST dataset",
"_____no_output_____"
]
],
[
[
"(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()\nx_train, x_test = x_train / 255.0, x_test / 255.0\n\nx_test = x_test[0:100]\ny_test = y_test[0:100]",
"_____no_output_____"
]
],
[
[
"# TensorFlow with Keras API",
"_____no_output_____"
],
[
"Create a model using Keras API. Here we use the Keras Sequential model and add a sequence of layers. Afterwards the model is compiles with optimizer, loss function and metrics.",
"_____no_output_____"
]
],
[
[
"model = tf.keras.models.Sequential([\n tf.keras.layers.InputLayer(input_shape=(28, 28)),\n tf.keras.layers.Flatten(),\n tf.keras.layers.Dense(128, activation='relu'),\n tf.keras.layers.Dropout(0.2),\n tf.keras.layers.Dense(10, activation='softmax')\n])\n\nmodel.compile(optimizer='adam',\n loss='sparse_categorical_crossentropy',\n metrics=['accuracy']);",
"_____no_output_____"
]
],
[
[
"Fit the model on training data.",
"_____no_output_____"
]
],
[
[
"model.fit(x_train, y_train, epochs=3);",
"Train on 60000 samples\nEpoch 1/3\n60000/60000 [==============================] - 3s 46us/sample - loss: 0.2968 - accuracy: 0.9131\nEpoch 2/3\n60000/60000 [==============================] - 3s 46us/sample - loss: 0.1435 - accuracy: 0.9575\nEpoch 3/3\n60000/60000 [==============================] - 3s 46us/sample - loss: 0.1102 - accuracy: 0.9664\n"
]
],
[
[
"Evaluate model accuracy on test data.",
"_____no_output_____"
]
],
[
[
"loss_test, accuracy_test = model.evaluate(x_test, y_test)\nprint('Accuracy on test data: {:4.2f}%'.format(accuracy_test * 100))",
"Accuracy on test data: 100.00%\n"
]
],
[
[
"Create a ART Keras classifier for the TensorFlow Keras model.",
"_____no_output_____"
]
],
[
[
"classifier = KerasClassifier(model=model, clip_values=(0, 1))",
"_____no_output_____"
]
],
[
[
"## Fast Gradient Sign Method attack",
"_____no_output_____"
],
[
"Create a ART Fast Gradient Sign Method attack.",
"_____no_output_____"
]
],
[
[
"attack_fgsm = FastGradientMethod(estimator=classifier, eps=0.3)",
"_____no_output_____"
]
],
[
[
"Generate adversarial test data.",
"_____no_output_____"
]
],
[
[
"x_test_adv = attack_fgsm.generate(x_test)",
"_____no_output_____"
]
],
[
[
"Evaluate accuracy on adversarial test data and calculate average perturbation.",
"_____no_output_____"
]
],
[
[
"loss_test, accuracy_test = model.evaluate(x_test_adv, y_test)\nperturbation = np.mean(np.abs((x_test_adv - x_test)))\nprint('Accuracy on adversarial test data: {:4.2f}%'.format(accuracy_test * 100))\nprint('Average perturbation: {:4.2f}'.format(perturbation))",
"Accuracy on adversarial test data: 0.00%\nAverage perturbation: 0.18\n"
]
],
[
[
"Visualise the first adversarial test sample.",
"_____no_output_____"
]
],
[
[
"plt.matshow(x_test_adv[0])\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Carlini&Wagner Infinity-norm attack",
"_____no_output_____"
],
[
"Create a ART Carlini&Wagner Infinity-norm attack.",
"_____no_output_____"
]
],
[
[
"attack_cw = CarliniLInfMethod(classifier=classifier, eps=0.3, max_iter=100, learning_rate=0.01)",
"_____no_output_____"
]
],
[
[
"Generate adversarial test data.",
"_____no_output_____"
]
],
[
[
"x_test_adv = attack_cw.generate(x_test)",
"C&W L_inf: 100%|โโโโโโโโโโ| 1/1 [00:04<00:00, 4.23s/it]\n"
]
],
[
[
"Evaluate accuracy on adversarial test data and calculate average perturbation.",
"_____no_output_____"
]
],
[
[
"loss_test, accuracy_test = model.evaluate(x_test_adv, y_test)\nperturbation = np.mean(np.abs((x_test_adv - x_test)))\nprint('Accuracy on adversarial test data: {:4.2f}%'.format(accuracy_test * 100))\nprint('Average perturbation: {:4.2f}'.format(perturbation))",
"Accuracy on adversarial test data: 10.00%\nAverage perturbation: 0.03\n"
]
],
[
[
"Visualise the first adversarial test sample.",
"_____no_output_____"
]
],
[
[
"plt.matshow(x_test_adv[0, :, :])\nplt.show()",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d060c2786199cbceb6556c2ec911dc0cd631e0d6 | 913,950 | ipynb | Jupyter Notebook | English/9_time_series_prediction/Prophet.ipynb | JeyDi/DataScienceCourse | 905a37bf3e9f77ea00b3678f7ddbdd78d5c3361a | [
"MIT"
]
| 6 | 2020-04-11T18:02:57.000Z | 2021-11-26T09:40:12.000Z | English/9_time_series_prediction/Prophet.ipynb | JeyDi/DataScienceCourse | 905a37bf3e9f77ea00b3678f7ddbdd78d5c3361a | [
"MIT"
]
| 1 | 2020-05-08T15:30:02.000Z | 2020-05-10T09:23:15.000Z | English/9_time_series_prediction/Prophet.ipynb | JeyDi/DataScienceCourse | 905a37bf3e9f77ea00b3678f7ddbdd78d5c3361a | [
"MIT"
]
| 3 | 2019-12-05T16:02:50.000Z | 2020-05-03T07:43:26.000Z | 891.658537 | 316,048 | 0.944994 | [
[
[
"# Prophet",
"_____no_output_____"
],
[
"Time serie forecasting using Prophet\n\nOfficial documentation: https://facebook.github.io/prophet/docs/quick_start.html",
"_____no_output_____"
],
[
"Procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It is released by Facebookโs Core Data Science team.\n\nAdditive model is a model like: \n$Data = seasonal\\space effect + trend + residual$\n\nand, multiplicative model: \n$Data = seasonal\\space effect * trend * residual$\n\nThe algorithm provides useful statistics that help visualize the tuning process, e.g. trend, week trend, year trend and their max and min errors.",
"_____no_output_____"
],
[
"### Data\n\nThe data on which the algorithms will be trained and tested upon comes from Kaggle Hourly Energy Consumption database. It is collected by PJM Interconnection, a company coordinating the continuous buying, selling, and delivery of wholesale electricity through the Energy Market from suppliers to customers in the reagon of South Carolina, USA. All .csv files contains rows with a timestamp and a value. The name of the value column corresponds to the name of the contractor. the timestamp represents a single hour and the value represents the total energy, cunsumed during that hour.\n\nThe data we will be using is hourly power consumption data from PJM. Energy consumtion has some unique charachteristics. It will be interesting to see how prophet picks them up.\n\nhttps://www.kaggle.com/robikscube/hourly-energy-consumption\n\nPulling the PJM East which has data from 2002-2018 for the entire east region.\n\n",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom fbprophet import Prophet\nfrom sklearn.metrics import mean_squared_error, mean_absolute_error\nplt.style.use('fivethirtyeight') # For plots",
"_____no_output_____"
],
[
"dataset_path = './data/hourly-energy-consumption/PJME_hourly.csv'\ndf = pd.read_csv(dataset_path, index_col=[0], parse_dates=[0])\nprint(\"Dataset path:\",df.shape)\ndf.head(10)",
"Dataset path: (145366, 1)\n"
],
[
"# VISUALIZE DATA\n# Color pallete for plotting\ncolor_pal = [\"#F8766D\", \"#D39200\", \"#93AA00\",\n \"#00BA38\", \"#00C19F\", \"#00B9E3\",\n \"#619CFF\", \"#DB72FB\"]\ndf.plot(style='.', figsize=(20,10), color=color_pal[0], title='PJM East Dataset TS')\nplt.show()",
"_____no_output_____"
],
[
"#Decompose the seasonal data\n\ndef create_features(df, label=None):\n \"\"\"\n Creates time series features from datetime index.\n \"\"\"\n df = df.copy()\n df['date'] = df.index\n df['hour'] = df['date'].dt.hour\n df['dayofweek'] = df['date'].dt.dayofweek\n df['quarter'] = df['date'].dt.quarter\n df['month'] = df['date'].dt.month\n df['year'] = df['date'].dt.year\n df['dayofyear'] = df['date'].dt.dayofyear\n df['dayofmonth'] = df['date'].dt.day\n df['weekofyear'] = df['date'].dt.weekofyear\n \n X = df[['hour','dayofweek','quarter','month','year',\n 'dayofyear','dayofmonth','weekofyear']]\n if label:\n y = df[label]\n return X, y\n return X",
"_____no_output_____"
],
[
"df.columns",
"_____no_output_____"
],
[
"X, y = create_features(df, label='PJME_MW')\n\nfeatures_and_target = pd.concat([X, y], axis=1)\n\nprint(\"Shape\",features_and_target.shape)\nfeatures_and_target.head(10)",
"Shape (145366, 9)\n"
],
[
"sns.pairplot(features_and_target.dropna(),\n hue='hour',\n x_vars=['hour','dayofweek',\n 'year','weekofyear'],\n y_vars='PJME_MW',\n height=5,\n plot_kws={'alpha':0.15, 'linewidth':0}\n )\nplt.suptitle('Power Use MW by Hour, Day of Week, Year and Week of Year')\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Train and Test Split",
"_____no_output_____"
],
[
"We use a temporal split, keeping old data and use only new period to do the prediction",
"_____no_output_____"
]
],
[
[
"split_date = '01-Jan-2015'\npjme_train = df.loc[df.index <= split_date].copy()\npjme_test = df.loc[df.index > split_date].copy()",
"_____no_output_____"
],
[
"# Plot train and test so you can see where we have split\npjme_test \\\n .rename(columns={'PJME_MW': 'TEST SET'}) \\\n .join(pjme_train.rename(columns={'PJME_MW': 'TRAINING SET'}),\n how='outer') \\\n .plot(figsize=(15,5), title='PJM East', style='.')\nplt.show()",
"_____no_output_____"
]
],
[
[
"To use prophet you need to correctly rename features and label to correctly pass the input to the engine.",
"_____no_output_____"
]
],
[
[
"# Format data for prophet model using ds and y\npjme_train.reset_index() \\\n .rename(columns={'Datetime':'ds',\n 'PJME_MW':'y'})\n\nprint(pjme_train.columns)\npjme_train.head(5)",
"Index(['PJME_MW'], dtype='object')\n"
]
],
[
[
"### Create and train the model",
"_____no_output_____"
]
],
[
[
"# Setup and train model and fit\nmodel = Prophet()\nmodel.fit(pjme_train.reset_index() \\\n .rename(columns={'Datetime':'ds',\n 'PJME_MW':'y'}))\n",
"_____no_output_____"
],
[
"# Predict on training set with model\npjme_test_fcst = model.predict(df=pjme_test.reset_index() \\\n .rename(columns={'Datetime':'ds'}))",
"_____no_output_____"
],
[
"pjme_test_fcst.head()",
"_____no_output_____"
]
],
[
[
"### Plot the results and forecast",
"_____no_output_____"
]
],
[
[
"# Plot the forecast\nf, ax = plt.subplots(1)\nf.set_figheight(5)\nf.set_figwidth(15)\nfig = model.plot(pjme_test_fcst,\n ax=ax)\nplt.show()",
"_____no_output_____"
],
[
"# Plot the components of the model\nfig = model.plot_components(pjme_test_fcst)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d060d44ea8124e137ea522b9965ec22ce7ada076 | 244,737 | ipynb | Jupyter Notebook | examples/nyquist_plots_examples.ipynb | EISy-as-Py/EISy-as-Py | 3086ecd043fce4d8ba49ec55004340a5444c0eb0 | [
"MIT"
]
| 5 | 2020-02-06T21:38:47.000Z | 2020-02-13T20:29:44.000Z | examples/nyquist_plots_examples.ipynb | EISy-as-Py/EISy-as-Py | 3086ecd043fce4d8ba49ec55004340a5444c0eb0 | [
"MIT"
]
| 2 | 2020-03-11T22:06:21.000Z | 2020-05-18T17:22:43.000Z | examples/nyquist_plots_examples.ipynb | EISy-as-Py/EISy-as-Py | 3086ecd043fce4d8ba49ec55004340a5444c0eb0 | [
"MIT"
]
| 4 | 2020-03-13T20:35:04.000Z | 2020-03-18T21:56:28.000Z | 715.605263 | 35,248 | 0.948516 | [
[
[
"from PyEIS import *",
"_____no_output_____"
]
],
[
[
"## Frequency range\nThe first first step needed to simulate an electrochemical impedance spectra is to generate a frequency domain, to do so, use to build-in freq_gen() function, as follows",
"_____no_output_____"
]
],
[
[
"f_range = freq_gen(f_start=10**10, f_stop=0.1, pts_decade=7)\n# print(f_range[0]) #First 5 points in the freq. array\nprint()\n# print(f_range[1]) #First 5 points in the angular freq.array",
"\n"
]
],
[
[
"Note that all functions included are described, to access these descriptions stay within () and press shift+tab. The freq_gen(), returns both the frequency, which is log seperated based on points/decade between f_start to f_stop, and the angular frequency. This function is quite useful and will be used through this tutorial",
"_____no_output_____"
],
[
"## The Equivalent Circuits\nThere exist a number of equivalent circuits that can be simulated and fitted, these functions are made as definations and can be called at any time. To find these, write: \"cir_\" and hit tab. All functions are outline in the next cell and can also be viewed in the equivalent circuit overview:",
"_____no_output_____"
]
],
[
[
"cir_RC\ncir_RQ\ncir_RsRQ\ncir_RsRQRQ\ncir_Randles\ncir_Randles_simplified\ncir_C_RC_C\ncir_Q_RQ_Q\ncir_RCRCZD\ncir_RsTLsQ\ncir_RsRQTLsQ\ncir_RsTLs\ncir_RsRQTLs\ncir_RsTLQ\ncir_RsRQTLQ\ncir_RsTL\ncir_RsRQTL\ncir_RsTL_1Dsolid\ncir_RsRQTL_1Dsolid",
"_____no_output_____"
]
],
[
[
"## Simulation of -(RC)-\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/RC_circuit.png' width=\"300\" />\n\n#### Input Parameters:\n- w = Angular frequency [1/s]\n- R = Resistance [Ohm]\n- C = Capacitance [F]\n- fs = summit frequency of RC circuit [Hz]",
"_____no_output_____"
]
],
[
[
"RC_example = EIS_sim(frange=f_range[0], circuit=cir_RC(w=f_range[1], R=70, C=10**-6), legend='on')",
"_____no_output_____"
]
],
[
[
"## Simulation of -Rs-(RQ)-\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/RsRQ_circuit.png' width=\"500\" />\n\n#### Input parameters:\n- w = Angular frequency [1/s]\n- Rs = Series resistance [Ohm]\n- R = Resistance [Ohm]\n- Q = Constant phase element [s^n/ohm]\n- n = Constant phase elelment exponent [-]\n- fs = summit frequency of RQ circuit [Hz]",
"_____no_output_____"
]
],
[
[
"RsRQ_example = EIS_sim(frange=f_range[0], circuit=cir_RsRQ(w=f_range[1], Rs=70, R=200, n=.8, Q=10**-5), legend='on')",
"_____no_output_____"
],
[
"RsRC_example = EIS_sim(frange=f_range[0], circuit=cir_RsRC(w=f_range[1], Rs=80, R=100, C=10**-5), legend='on')",
"_____no_output_____"
]
],
[
[
"## Simulation of -Rs-(RQ)-(RQ)-\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/RsRQRQ_circuit.png' width=\"500\" />\n\n#### Input parameters:\n- w = Angular frequency [1/s]\n- Rs = Series Resistance [Ohm]\n- R = Resistance [Ohm]\n- Q = Constant phase element [s^n/ohm]\n- n = Constant phase element exponent [-]\n- fs = summit frequency of RQ circuit [Hz]\n- R2 = Resistance [Ohm]\n- Q2 = Constant phase element [s^n/ohm]\n- n2 = Constant phase element exponent [-]\n- fs2 = summit frequency of RQ circuit [Hz]",
"_____no_output_____"
]
],
[
[
"RsRQRQ_example = EIS_sim(frange=f_range[0], circuit=cir_RsRQRQ(w=f_range[1], Rs=200, R=150, n=.872, Q=10**-4, R2=50, n2=.853, Q2=10**-6), legend='on')",
"_____no_output_____"
]
],
[
[
"## Simulation of -Rs-(Q(RW))- (Randles-circuit)\nThis circuit is often used for an experimental setup with a macrodisk working electrode with an outer-sphere heterogeneous charge transfer. This, classical, warburg element is controlled by semi-infinite linear diffusion, which is given by the geometry of the working electrode. Two Randles functions are avaliable for simulations: cir_Randles_simplified() and cir_Randles(). The former contains the Warburg constant (sigma), which summs up all mass transport constants (Dox/Dred, Cred/Cox, number of electrons (n_electron), Faradays constant (F), T, and E0) into a single constant sigma, while the latter contains all of these constants. Only cir_Randles_simplified() is avaliable for fitting, as either D$_{ox}$ or D$_{red}$ and C$_{red}$ or C$_{ox}$ are needed.\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/Randles_circuit.png' width=\"500\" />\n\n#### Input parameters:\n- Rs = Series resistance [ohm]\n- Rct = charge-transfer resistance [ohm]\n- Q = Constant phase element used to model the double-layer capacitance [F]\n- n = expononent of the CPE [-]\n- sigma = Warburg Constant [ohm/s^1/2]",
"_____no_output_____"
]
],
[
[
"Randles = cir_Randles_simplified(w=f_range[1], Rs=100, R=1000, n=1, sigma=300, Q=10**-5)\nRandles_example = EIS_sim(frange=f_range[0], circuit=Randles, legend='off')",
"_____no_output_____"
],
[
"Randles_example = EIS_sim(frange=f_range[0], circuit=cir_Randles_simplified(w=f_range[1], Rs=100, R=1000, n=1, sigma=300, Q='none', fs=10**3.3), legend='off')",
"_____no_output_____"
]
],
[
[
"In the following, the Randles circuit with the Warburg constant (sigma) defined is simulated where:\n- D$_{red}$/D$_{ox}$ = 10$^{-6}$ cm$^2$/s\n- C$_{red}$/C$_{ox}$ = 10 mM\n- n_electron = 1\n- T = 25 $^o$C\n\nThis function is a great tool to simulate expected impedance responses prior to starting experiments as it allows for evaluation of concentrations, diffusion constants, number of electrons, and Temp. to evaluate the feasability of obtaining information on either kinetics, mass-transport, or both.",
"_____no_output_____"
]
],
[
[
"Randles_example = EIS_sim(frange=f_range[0], circuit=cir_Randles(w=f_range[1], Rs=100, Rct=1000, Q=10**-7, n=1, T=298.15, D_ox=10**-9, D_red=10**-9, C_ox=10**-5, C_red=10**-5, n_electron=1, E=0, A=1), legend='off')",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d060eba8647218bcd5795351a84e4d41e7367041 | 155,060 | ipynb | Jupyter Notebook | nbs/examples/complex_dummy_experiment_manager.ipynb | Jaume-JCI/hpsearch | 168d81f49e1a4bd4dbab838baaa8ff183a422030 | [
"MIT"
]
| null | null | null | nbs/examples/complex_dummy_experiment_manager.ipynb | Jaume-JCI/hpsearch | 168d81f49e1a4bd4dbab838baaa8ff183a422030 | [
"MIT"
]
| 15 | 2021-12-02T15:00:37.000Z | 2022-02-22T17:53:50.000Z | nbs/examples/complex_dummy_experiment_manager.ipynb | Jaume-JCI/hpsearch | 168d81f49e1a4bd4dbab838baaa8ff183a422030 | [
"MIT"
]
| null | null | null | 197.277354 | 65,596 | 0.894589 | [
[
[
"#hide\n#default_exp examples.complex_dummy_experiment_manager\nfrom nbdev.showdoc import *\nfrom block_types.utils.nbdev_utils import nbdev_setup, TestRunner\n\nnbdev_setup ()\ntst = TestRunner (targets=['dummy'])",
"_____no_output_____"
]
],
[
[
"# Complex Dummy Experiment Manager\n\n> Dummy experiment manager with features that allow additional functionality",
"_____no_output_____"
]
],
[
[
"#export\nfrom hpsearch.examples.dummy_experiment_manager import DummyExperimentManager, FakeModel\nimport hpsearch\nimport os\nimport shutil\nimport os\n\nimport hpsearch.examples.dummy_experiment_manager as dummy_em\nfrom hpsearch.visualization import plot_utils ",
"_____no_output_____"
],
[
"#for tests\nimport pytest\nfrom block_types.utils.nbdev_utils import md",
"_____no_output_____"
]
],
[
[
"## ComplexDummyExperimentManager",
"_____no_output_____"
]
],
[
[
"#export\nclass ComplexDummyExperimentManager (DummyExperimentManager):\n \n def __init__ (self, model_file_name='model_weights.pk', **kwargs):\n super().__init__ (model_file_name=model_file_name, **kwargs)\n self.raise_error_if_run = False\n\n def run_experiment (self, parameters={}, path_results='./results'):\n \n # useful for testing: in some cases the experiment manager should not call run_experiment\n if self.raise_error_if_run:\n raise RuntimeError ('run_experiment should not be called')\n \n # extract hyper-parameters used by our model. All the parameters have default values if they are not passed.\n offset = parameters.get('offset', 0.5) # default value: 0.5\n rate = parameters.get('rate', 0.01) # default value: 0.01\n epochs = parameters.get('epochs', 10) # default value: 10\n noise = parameters.get('noise', 0.0)\n if parameters.get('actual_epochs') is not None:\n epochs = parameters.get('actual_epochs')\n \n # other parameters that do not form part of our experiment definition\n # changing the values of these other parameters, does not make the ID of the experiment change\n verbose = parameters.get('verbose', True)\n \n # build model with given hyper-parameters\n model = FakeModel (offset=offset, rate=rate, epochs=epochs, noise = noise, verbose=verbose)\n \n # load training, validation and test data (fake step)\n model.load_data()\n\n # start from previous experiment if indicated by parameters\n path_results_previous_experiment = parameters.get('prev_path_results')\n if path_results_previous_experiment is not None:\n model.load_model_and_history (path_results_previous_experiment)\n \n # fit model with training data \n model.fit ()\n \n # save model weights and evolution of accuracy metric across epochs\n model.save_model_and_history(path_results)\n \n # simulate ctrl-c\n if parameters.get ('halt', False):\n raise KeyboardInterrupt ('stopped')\n \n # evaluate model with validation and test data\n validation_accuracy, test_accuracy = model.score()\n \n # store model\n self.model = model\n \n # the function returns a dictionary with keys corresponding to the names of each metric. \n # We return result on validation and test set in this example\n dict_results = dict (validation_accuracy = validation_accuracy,\n test_accuracy = test_accuracy)\n \n return dict_results\n ",
"_____no_output_____"
]
],
[
[
"### Usage",
"_____no_output_____"
]
],
[
[
"#exports tests.examples.test_complex_dummy_experiment_manager\ndef test_complex_dummy_experiment_manager ():\n #em = generate_data ('complex_dummy_experiment_manager')\n \n md (\n'''\nExtend previous experiment by using a larger number of epochs\n\nWe see how to create a experiment that is the same as a previous experiment, \nonly increasing the number of epochs. \n\n1.a. For test purposes, we first run the full number of epochs, 30, take note of the accuracy, \nand remove the experiment\n'''\n )\n \n em = ComplexDummyExperimentManager (path_experiments='test_complex_dummy_experiment_manager', \n verbose=0)\n em.create_experiment_and_run (parameters = {'epochs': 30});\n reference_accuracy = em.model.accuracy\n reference_weight = em.model.weight\n\n from hpsearch.config.hpconfig import get_path_experiments\n import os\n import pandas as pd\n\n path_experiments = get_path_experiments ()\n print (f'experiments folders: {os.listdir(f\"{path_experiments}/experiments\")}\\n')\n\n experiments_data = pd.read_pickle (f'{path_experiments}/experiments_data.pk')\n print ('csv data')\n display (experiments_data)\n\n md ('we plot the history')\n from hpsearch.visualization.experiment_visualization import plot_multiple_histories\n\n plot_multiple_histories ([0], run_number=0, op='max', backend='matplotlib', metrics='validation_accuracy')\n\n md ('1.b. Now we run two experiments: ')\n\n md ('We run the first experiment with 20 epochs:')\n\n # a.- remove previous experiment\n em.remove_previous_experiments()\n\n # b.- create first experiment with epochs=20\n em.create_experiment_and_run (parameters = {'epochs': 20});\n\n print (f'experiments folders: {os.listdir(f\"{path_experiments}/experiments\")}\\n')\n\n experiments_data = pd.read_pickle (f'{path_experiments}/experiments_data.pk')\n print ('csv data')\n display(experiments_data)\n print (f'weight: {em.model.weight}, accuracy: {em.model.accuracy}')\n\n md ('We run the second experiment resumes from the previous one and increases the epochs to 30')\n # 4.- create second experiment with epochs=10\n em.create_experiment_and_run (parameters = {'epochs': 30}, \n other_parameters={'prev_epoch': True,\n 'name_epoch': 'epochs',\n 'previous_model_file_name': 'model_weights.pk'});\n\n experiments_data = pd.read_pickle (f'{path_experiments}/experiments_data.pk')\n print ('csv data')\n display(experiments_data)\n\n new_accuracy = em.model.accuracy\n new_weight = em.model.weight\n\n assert new_weight==reference_weight\n assert new_accuracy==reference_accuracy\n\n print (f'weight: {new_weight}, accuracy: {new_accuracy}')\n\n md ('We plot the history')\n plot_multiple_histories ([1], run_number=0, op='max', backend='matplotlib', metrics='validation_accuracy')\n \n em.remove_previous_experiments()",
"_____no_output_____"
],
[
"tst.run (test_complex_dummy_experiment_manager, tag='dummy')",
"running test_complex_dummy_experiment_manager\n"
]
],
[
[
"## Running experiments and removing experiments",
"_____no_output_____"
]
],
[
[
"# export\ndef run_multiple_experiments (**kwargs):\n dummy_em.run_multiple_experiments (EM=ComplexDummyExperimentManager, **kwargs)\n\ndef remove_previous_experiments ():\n dummy_em.remove_previous_experiments (EM=ComplexDummyExperimentManager)",
"_____no_output_____"
],
[
"#export\ndef generate_data (name_folder):\n em = ComplexDummyExperimentManager (path_experiments=f'test_{name_folder}', verbose=0)\n em.remove_previous_experiments ()\n run_multiple_experiments (em=em, nruns=5, noise=0.1, verbose=False)\n return em",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d061077a7df45ec4c5ebf5d8df4448f6c1df44f4 | 12,842 | ipynb | Jupyter Notebook | lessons/Workshop_13_OOP.ipynb | andrewt0301/python-problems | 57338611ac631f094e3fb78a6cccab8b6fd7b442 | [
"Apache-2.0"
]
| null | null | null | lessons/Workshop_13_OOP.ipynb | andrewt0301/python-problems | 57338611ac631f094e3fb78a6cccab8b6fd7b442 | [
"Apache-2.0"
]
| null | null | null | lessons/Workshop_13_OOP.ipynb | andrewt0301/python-problems | 57338611ac631f094e3fb78a6cccab8b6fd7b442 | [
"Apache-2.0"
]
| null | null | null | 42.664452 | 1,084 | 0.531926 | [
[
[
"# Workshop 13\n## _Object-oriented programming._\n\n#### Classes and Objects\n",
"_____no_output_____"
]
],
[
[
"class MyClass:\n pass\n\n\nobj1 = MyClass()\nobj2 = MyClass()\n\nprint(obj1)\nprint(type(obj1))\n\nprint(obj2)\nprint(type(obj2))\n",
"_____no_output_____"
]
],
[
[
"##### Constructor and destructor\n",
"_____no_output_____"
]
],
[
[
"class Employee: \n \n def __init__(self): \n print('Employee created.') \n \n def __del__(self): \n print('Destructor called, Employee deleted.') \n \nobj = Employee() \ndel obj \n",
"_____no_output_____"
]
],
[
[
"##### Attributes and methods\n",
"_____no_output_____"
]
],
[
[
"class Student:\n\n def __init__(self, name, grade):\n self.name = name\n self.grade = grade\n\n def __str__(self):\n return '{' + self.name + ': ' + str(self.grade) + '}'\n\n def learn(self):\n print('My name is %s. I am learning Python! My grade is %d.' % (self.name, self.grade))\n\n\nstudents = [Student('Steve', 9), Student('Oleg', 10)]\n\nfor student in students:\n print()\n print('student.name = ' + student.name)\n print('student.grade = ' + str(student.grade))\n print('student = ' + str(student))\n student.learn()\n",
"_____no_output_____"
]
],
[
[
"##### Class and instance attributes\n",
"_____no_output_____"
]
],
[
[
"class Person:\n\n # class variable shared by all instances\n status = 'student'\n\n def __init__(self, name):\n # instance variable unique to each instance\n self.name = name\n\n\na = Person('Steve')\nb = Person('Mark')\n\nprint('')\nprint(a.name + ' : ' + a.status)\nprint(b.name + ' : ' + b.status)\n\nPerson.status = 'graduate'\n\nprint('')\nprint(a.name + ' : ' + a.status)\nprint(b.name + ' : ' + b.status)\n\nPerson.status = 'student'\n\nprint('')\nprint(a.name + ' : ' + a.status)\nprint(b.name + ' : ' + b.status)\n",
"_____no_output_____"
]
],
[
[
"##### Class and static methods\n",
"_____no_output_____"
]
],
[
[
"class Env:\n os = 'Windows'\n\n @classmethod\n def print_os(self):\n print(self.os)\n \n @staticmethod\n def print_user():\n print('guest')\n\n\nEnv.print_os()\nEnv.print_user()\n",
"_____no_output_____"
]
],
[
[
"##### Encapsulation\n",
"_____no_output_____"
]
],
[
[
"class Person:\n\n def __init__(self, name):\n self.name = name\n\n def __str__(self):\n return 'My name is ' + self.name\n\n\nperson = Person('Steve')\nprint(person.name)\n\nperson.name = 'Said' \nprint(person.name)\n",
"_____no_output_____"
],
[
"class Identity:\n\n def __init__(self, name):\n self.__name = name\n\n def __str__(self):\n return 'My name is ' + self.__name\n\n\nperson = Identity('Steve')\nprint(person.__name)\n\nperson.__name = 'Said' \nprint(person)\n",
"_____no_output_____"
]
],
[
[
"##### Operator overloading\n",
"_____no_output_____"
]
],
[
[
"class Number:\n\n def __init__(self, value):\n self.__value = value\n\n def __del__(self):\n pass\n\n def __str__(self):\n return str(self.__value)\n\n def __int__(self):\n return self.__value\n\n def __eq__(self, other):\n return self.__value == other.__value\n\n def __ne__(self, other):\n return self.__value != other.__value\n\n def __lt__(self, other):\n return self.__value < other.__value\n\n def __gt__(self, other):\n return self.__value > other.__value\n\n def __add__(self, other):\n return Number(self.__value + other.__value)\n\n def __mul__(self, other):\n return Number(self.__value * other.__value)\n\n def __neg__(self):\n return Number(-self.__value)\n\n\na = Number(10)\nb = Number(20)\nc = Number(5)\n\n# Overloaded operators\nx = -a + b * c\nprint(x)\n\nprint(a < b)\nprint(b > c)\n\n# Unsupported operators\nprint(a <= b)\nprint(b >= c)\nprint(a // c)\n",
"_____no_output_____"
]
],
[
[
"#### Inheritance and polymorphism\n",
"_____no_output_____"
]
],
[
[
"class Creature:\n def say(self):\n pass\n\n\nclass Dog(Creature):\n def say(self):\n print('Woof!')\n\n\nclass Cat(Creature):\n def say(self):\n print(\"Meow!\")\n\n\nclass Lion(Creature):\n def say(self):\n print(\"Roar!\")\n \n\nanimals = [Creature(), Dog(), Cat(), Lion()]\n\nfor animal in animals:\n print(type(animal))\n animal.say()\n\n",
"_____no_output_____"
]
],
[
[
"##### Multiple inheritance\n",
"_____no_output_____"
]
],
[
[
"class Person:\n def __init__(self, name):\n self.name = name\n\n\nclass Student(Person):\n def __init__(self, name, grade):\n super().__init__(name)\n self.grade = grade\n\n\nclass Employee:\n def __init__(self, salary):\n self.salary = salary\n\n\nclass Teacher(Person, Employee):\n def __init__(self, name, salary):\n Person.__init__(self, name)\n Employee.__init__(self, salary)\n\n\nclass TA(Student, Employee):\n def __init__(self, name, grage, salary):\n Student.__init__(self, name, grage)\n Employee.__init__(self, salary)\n\n\nx = Student('Oleg', 9)\ny = TA('Sergei', 10, 1000)\nz = Teacher('Andrei', 2000)\n\nfor person in [x, y, z]:\n print(person.name)\n if isinstance(person, Employee):\n print(person.salary)\n if isinstance(person, Student):\n print(person.grade)\n",
"_____no_output_____"
]
],
[
[
"##### Function _isinstance_\n",
"_____no_output_____"
]
],
[
[
"x = 10\nprint('')\nprint(isinstance(x, int))\nprint(isinstance(x, float))\nprint(isinstance(x, str))\n\ny = 3.14\nprint('')\nprint(isinstance(y, int))\nprint(isinstance(y, float))\nprint(isinstance(y, str))\n\n\nz = 'Hello world'\nprint('')\nprint(isinstance(z, int))\nprint(isinstance(z, float))\nprint(isinstance(z, str))\n",
"_____no_output_____"
],
[
"class A:\n pass\n\n\nclass B:\n pass\n\n\nclass C(A):\n pass\n\n\nclass D(A, B):\n pass\n\n\na = A()\nb = B()\nc = C()\nd = D()\n\nprint('')\nprint(isinstance(a, object))\nprint(isinstance(a, A))\nprint(isinstance(b, B))\n\nprint('')\nprint(isinstance(b, object))\nprint(isinstance(b, A))\nprint(isinstance(b, B))\nprint(isinstance(b, C))\n\nprint('')\nprint(isinstance(c, object))\nprint(isinstance(c, A))\nprint(isinstance(c, B))\nprint(isinstance(c, D))\n\n\nprint('')\nprint(isinstance(d, object))\nprint(isinstance(d, A))\nprint(isinstance(d, B))\nprint(isinstance(d, C))\nprint(isinstance(d, D))\n",
"_____no_output_____"
]
],
[
[
"##### Composition\n",
"_____no_output_____"
]
],
[
[
"class Teacher:\n pass\n\nclass Student:\n pass\n\nclass ClassRoom:\n def __init__(self, teacher, students):\n self.teacher = teacher\n self.students = students\n\n\ncl = ClassRoom(Teacher(), [Student(), Student(), Student()])\n",
"_____no_output_____"
],
[
"class Set:\n\n def __init__(self, values=None):\n self.dict = {}\n\n if values is not None:\n for value in values:\n self.add(value)\n\n def __repr__(self):\n return \"Set: \" + str(self.dict.keys())\n\n def add(self, value):\n self.dict[value] = True\n\n def contains(self, value):\n return value in self.dict\n\n def remove(self, value):\n del self.dict[value]\n\n\ns = Set([1,2,3])\ns.add(4)\nprint(s.contains(4))\ns.remove(3)\nprint(s.contains(3))\n",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d0610aae49ed9cd42f0ff340a636706689eeaf03 | 16,728 | ipynb | Jupyter Notebook | examples/06_Scalable_GP_Classification_1D/KISSGP_Classification_1D.ipynb | phumm/gpytorch | 4e8042bcecda049956f8f9e823d82ba6340766d5 | [
"MIT"
]
| 1 | 2019-09-30T06:51:03.000Z | 2019-09-30T06:51:03.000Z | examples/06_Scalable_GP_Classification_1D/KISSGP_Classification_1D.ipynb | phumm/gpytorch | 4e8042bcecda049956f8f9e823d82ba6340766d5 | [
"MIT"
]
| null | null | null | examples/06_Scalable_GP_Classification_1D/KISSGP_Classification_1D.ipynb | phumm/gpytorch | 4e8042bcecda049956f8f9e823d82ba6340766d5 | [
"MIT"
]
| 1 | 2020-09-16T16:35:27.000Z | 2020-09-16T16:35:27.000Z | 55.026316 | 6,900 | 0.718675 | [
[
[
"# Scalable GP Classification in 1D (w/ KISS-GP)\n\nThis example shows how to use grid interpolation based variational classification with an `ApproximateGP` using a `GridInterpolationVariationalStrategy` module. This classification module is designed for when the inputs of the function you're modeling are one-dimensional.\n\nThe use of inducing points allows for scaling up the training data by making computational complexity linear instead of cubic.\n\nIn this example, weโre modeling a function that is periodically labeled cycling every 1/8 (think of a square wave with period 1/4)\n\nThis notebook doesn't use cuda, in general we recommend GPU use if possible and most of our notebooks utilize cuda as well.\n\nKernel interpolation for scalable structured Gaussian processes (KISS-GP) was introduced in this paper:\nhttp://proceedings.mlr.press/v37/wilson15.pdf\n\nKISS-GP with SVI for classification was introduced in this paper:\nhttps://papers.nips.cc/paper/6426-stochastic-variational-deep-kernel-learning.pdf",
"_____no_output_____"
]
],
[
[
"import math\nimport torch\nimport gpytorch\nfrom matplotlib import pyplot as plt\nfrom math import exp\n\n%matplotlib inline\n%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"train_x = torch.linspace(0, 1, 26)\ntrain_y = torch.sign(torch.cos(train_x * (2 * math.pi))).add(1).div(2)",
"_____no_output_____"
],
[
"from gpytorch.models import ApproximateGP\nfrom gpytorch.variational import CholeskyVariationalDistribution\nfrom gpytorch.variational import GridInterpolationVariationalStrategy\n\n\nclass GPClassificationModel(ApproximateGP):\n def __init__(self, grid_size=128, grid_bounds=[(0, 1)]):\n variational_distribution = CholeskyVariationalDistribution(grid_size)\n variational_strategy = GridInterpolationVariationalStrategy(self, grid_size, grid_bounds, variational_distribution)\n super(GPClassificationModel, self).__init__(variational_strategy)\n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n \n def forward(self,x):\n mean_x = self.mean_module(x)\n covar_x = self.covar_module(x)\n latent_pred = gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n return latent_pred\n\n\nmodel = GPClassificationModel()\nlikelihood = gpytorch.likelihoods.BernoulliLikelihood()",
"_____no_output_____"
],
[
"from gpytorch.mlls.variational_elbo import VariationalELBO\n\n# Find optimal model hyperparameters\nmodel.train()\nlikelihood.train()\n\n# Use the adam optimizer\noptimizer = torch.optim.Adam(model.parameters(), lr=0.01)\n\n# \"Loss\" for GPs - the marginal log likelihood\n# n_data refers to the number of training datapoints\nmll = VariationalELBO(likelihood, model, num_data=train_y.numel())\n\ndef train():\n num_iter = 100\n for i in range(num_iter):\n optimizer.zero_grad()\n output = model(train_x)\n # Calc loss and backprop gradients\n loss = -mll(output, train_y)\n loss.backward()\n print('Iter %d/%d - Loss: %.3f' % (i + 1, num_iter, loss.item()))\n optimizer.step()\n \n# Get clock time\n%time train()",
"Iter 1/100 - Loss: 0.070\nIter 2/100 - Loss: 14.834\nIter 3/100 - Loss: 0.977\nIter 4/100 - Loss: 3.547\nIter 5/100 - Loss: 8.699\nIter 6/100 - Loss: 6.352\nIter 7/100 - Loss: 1.795\nIter 8/100 - Loss: 0.188\nIter 9/100 - Loss: 2.075\nIter 10/100 - Loss: 4.160\nIter 11/100 - Loss: 3.899\nIter 12/100 - Loss: 1.941\nIter 13/100 - Loss: 0.344\nIter 14/100 - Loss: 0.360\nIter 15/100 - Loss: 1.501\nIter 16/100 - Loss: 2.298\nIter 17/100 - Loss: 1.944\nIter 18/100 - Loss: 0.904\nIter 19/100 - Loss: 0.177\nIter 20/100 - Loss: 0.297\nIter 21/100 - Loss: 0.916\nIter 22/100 - Loss: 1.281\nIter 23/100 - Loss: 1.024\nIter 24/100 - Loss: 0.451\nIter 25/100 - Loss: 0.111\nIter 26/100 - Loss: 0.246\nIter 27/100 - Loss: 0.593\nIter 28/100 - Loss: 0.733\nIter 29/100 - Loss: 0.526\nIter 30/100 - Loss: 0.206\nIter 31/100 - Loss: 0.087\nIter 32/100 - Loss: 0.225\nIter 33/100 - Loss: 0.408\nIter 34/100 - Loss: 0.413\nIter 35/100 - Loss: 0.245\nIter 36/100 - Loss: 0.091\nIter 37/100 - Loss: 0.096\nIter 38/100 - Loss: 0.210\nIter 39/100 - Loss: 0.273\nIter 40/100 - Loss: 0.210\nIter 41/100 - Loss: 0.104\nIter 42/100 - Loss: 0.064\nIter 43/100 - Loss: 0.117\nIter 44/100 - Loss: 0.173\nIter 45/100 - Loss: 0.159\nIter 46/100 - Loss: 0.093\nIter 47/100 - Loss: 0.056\nIter 48/100 - Loss: 0.077\nIter 49/100 - Loss: 0.115\nIter 50/100 - Loss: 0.115\nIter 51/100 - Loss: 0.078\nIter 52/100 - Loss: 0.050\nIter 53/100 - Loss: 0.061\nIter 54/100 - Loss: 0.083\nIter 55/100 - Loss: 0.086\nIter 56/100 - Loss: 0.062\nIter 57/100 - Loss: 0.045\nIter 58/100 - Loss: 0.053\nIter 59/100 - Loss: 0.064\nIter 60/100 - Loss: 0.065\nIter 61/100 - Loss: 0.050\nIter 62/100 - Loss: 0.040\nIter 63/100 - Loss: 0.046\nIter 64/100 - Loss: 0.052\nIter 65/100 - Loss: 0.051\nIter 66/100 - Loss: 0.041\nIter 67/100 - Loss: 0.037\nIter 68/100 - Loss: 0.041\nIter 69/100 - Loss: 0.044\nIter 70/100 - Loss: 0.042\nIter 71/100 - Loss: 0.035\nIter 72/100 - Loss: 0.034\nIter 73/100 - Loss: 0.036\nIter 74/100 - Loss: 0.037\nIter 75/100 - Loss: 0.033\nIter 76/100 - Loss: 0.030\nIter 77/100 - Loss: 0.030\nIter 78/100 - Loss: 0.033\nIter 79/100 - Loss: 0.031\nIter 80/100 - Loss: 0.029\nIter 81/100 - Loss: 0.028\nIter 82/100 - Loss: 0.028\nIter 83/100 - Loss: 0.028\nIter 84/100 - Loss: 0.026\nIter 85/100 - Loss: 0.025\nIter 86/100 - Loss: 0.025\nIter 87/100 - Loss: 0.025\nIter 88/100 - Loss: 0.025\nIter 89/100 - Loss: 0.024\nIter 90/100 - Loss: 0.022\nIter 91/100 - Loss: 0.022\nIter 92/100 - Loss: 0.022\nIter 93/100 - Loss: 0.022\nIter 94/100 - Loss: 0.020\nIter 95/100 - Loss: 0.021\nIter 96/100 - Loss: 0.020\nIter 97/100 - Loss: 0.019\nIter 98/100 - Loss: 0.018\nIter 99/100 - Loss: 0.019\nIter 100/100 - Loss: 0.017\nCPU times: user 6.33 s, sys: 9.66 s, total: 16 s\nWall time: 2.31 s\n"
],
[
"# Set model and likelihood into eval mode\nmodel.eval()\nlikelihood.eval()\n\n# Initialize axes\nf, ax = plt.subplots(1, 1, figsize=(4, 3))\n\nwith torch.no_grad():\n test_x = torch.linspace(0, 1, 101)\n predictions = likelihood(model(test_x))\n\nax.plot(train_x.numpy(), train_y.numpy(), 'k*')\npred_labels = predictions.mean.ge(0.5).float()\nax.plot(test_x.data.numpy(), pred_labels.numpy(), 'b')\nax.set_ylim([-1, 2])\nax.legend(['Observed Data', 'Mean', 'Confidence'])",
"_____no_output_____"
]
]
]
| [
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d0611888d467fde28d61f6cbbf1ef43118ee3058 | 221,378 | ipynb | Jupyter Notebook | notebooks/AB_tests/Understand Splitting Fraction.ipynb | mclaughlin6464/pearce | 746f2bf4bf45e904d66996e003043661a01423ba | [
"MIT"
]
| null | null | null | notebooks/AB_tests/Understand Splitting Fraction.ipynb | mclaughlin6464/pearce | 746f2bf4bf45e904d66996e003043661a01423ba | [
"MIT"
]
| 16 | 2016-11-04T22:24:32.000Z | 2018-05-01T22:53:39.000Z | notebooks/AB_tests/Understand Splitting Fraction.ipynb | mclaughlin6464/pearce | 746f2bf4bf45e904d66996e003043661a01423ba | [
"MIT"
]
| 3 | 2016-10-04T08:07:52.000Z | 2019-05-03T23:50:01.000Z | 201.987226 | 72,772 | 0.897519 | [
[
[
"import numpy as np\nimport astropy\nfrom itertools import izip\nfrom pearce.mocks import compute_prim_haloprop_bins, cat_dict\nfrom pearce.mocks.customHODModels import *\nfrom halotools.utils.table_utils import compute_conditional_percentiles\nfrom halotools.mock_observables import hod_from_mock, wp, tpcf, tpcf_one_two_halo_decomp\nfrom math import ceil",
"_____no_output_____"
],
[
"from matplotlib import pyplot as plt\n%matplotlib inline\nimport seaborn as sns\nsns.set()",
"_____no_output_____"
],
[
"shuffle_type = ''#'sh_shuffled'\nmag_type = 'vpeak'",
"_____no_output_____"
],
[
"mag_cut = -21\nmin_ptcl = 200\nmag_key = 'halo_%s%s_mag'%(shuffle_type, mag_type)\nupid_key = 'halo_%supid'%(shuffle_type)",
"_____no_output_____"
],
[
"PMASS = 591421440.0000001 #chinchilla 400/ 2048\ncatalog = astropy.table.Table.read('abmatched_halos.hdf5', format = 'hdf5')",
"_____no_output_____"
],
[
"cosmo_params = {'simname':'chinchilla', 'Lbox':400.0, 'scale_factors':[0.658, 1.0]}\ncat = cat_dict[cosmo_params['simname']](**cosmo_params)#construct the specified catalog!\n\ncat.load_catalog(1.0)\n#cat.h = 1.0\nhalo_catalog = catalog[catalog['halo_mvir'] > min_ptcl*cat.pmass] #mass cut\ngalaxy_catalog = halo_catalog[ halo_catalog[mag_key] < mag_cut ] # mag cut",
"_____no_output_____"
],
[
"def compute_mass_bins(prim_haloprop, dlog10_prim_haloprop=0.05): \n lg10_min_prim_haloprop = np.log10(np.min(prim_haloprop))-0.001\n lg10_max_prim_haloprop = np.log10(np.max(prim_haloprop))+0.001\n num_prim_haloprop_bins = (lg10_max_prim_haloprop-lg10_min_prim_haloprop)/dlog10_prim_haloprop\n return np.logspace(\n lg10_min_prim_haloprop, lg10_max_prim_haloprop,\n num=int(ceil(num_prim_haloprop_bins)))",
"_____no_output_____"
],
[
"mass_bins = compute_mass_bins(halo_catalog['halo_mvir'], 0.2)\nmass_bin_centers = (mass_bins[1:]+mass_bins[:-1])/2.0",
"_____no_output_____"
],
[
"cen_mask = galaxy_catalog['halo_upid']==-1\ncen_hod_sham, _ = hod_from_mock(galaxy_catalog[cen_mask]['halo_mvir_host_halo'],\\\n halo_catalog['halo_mvir'],\\\n mass_bins)\n\nsat_hod_sham, _ = hod_from_mock(galaxy_catalog[~cen_mask]['halo_mvir_host_halo'],\\\n halo_catalog['halo_mvir'],\\\n mass_bins)",
"_____no_output_____"
],
[
"cat.load_model(1.0, HOD=(FSAssembiasTabulatedCens, FSAssembiasTabulatedSats), hod_kwargs = {'prim_haloprop_vals': mass_bin_centers,\n #'sec_haloprop_key': 'halo_%s'%(mag_type),\n 'cen_hod_vals':cen_hod_sham,\n 'sat_hod_vals':sat_hod_sham,\n 'split':0.5})",
"_____no_output_____"
],
[
"print cat.model.param_dict",
"{'mean_occupation_satellites_assembias_split1': 0.5, 'mean_occupation_satellites_assembias_param1': 0.5, 'mean_occupation_centrals_assembias_split1': 0.5, 'mean_occupation_centrals_assembias_param1': 0.5}\n"
],
[
"#rp_bins = np.logspace(-1,1.5,20)\n#rp_bins = np.logspace(-1.1,1.8, 25)\n#rp_bins = np.loadtxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/rp_bins.npy')\nrp_bins = np.array([7.943282000000000120e-02,\n1.122018500000000057e-01,\n1.584893199999999891e-01,\n2.238721100000000130e-01,\n3.162277700000000191e-01,\n4.466835900000000192e-01,\n6.309573400000000332e-01,\n8.912509400000000470e-01,\n1.258925410000000022e+00,\n1.778279409999999894e+00,\n2.511886430000000114e+00,\n3.548133889999999901e+00,\n5.011872340000000037e+00,\n7.079457839999999891e+00,\n1.000000000000000000e+01,\n1.412537544999999994e+01,\n1.995262315000000086e+01,\n2.818382931000000013e+01,\n3.981071706000000177e+01])\n\nbin_centers = (rp_bins[:1]+rp_bins[:-1])/2",
"_____no_output_____"
],
[
"min_logmass, max_logmass = 9.0, 17.0\nnames = ['mean_occupation_centrals_assembias_param1','mean_occupation_satellites_assembias_param1',\\\n 'mean_occupation_centrals_assembias_split1','mean_occupation_satellites_assembias_split1']",
"_____no_output_____"
],
[
"#mock_wp = cat.calc_wp(rp_bins, RSD= False)\nMAP = np.array([ 0.85, -0.3,0.85,0.5])\n\nparams = dict(zip(names, MAP))\n#print params.keys()\n\nmock_wps = []\nmock_wps_1h, mock_wps_2h = [],[]\n#mock_nds = []\nsplit = np.linspace(0.1, 0.9, 4)\n#split_abcissa = [10**9, 10**13, 10**16]\n\n#cat.model._input_model_dictionary['centrals_occupation']._split_abscissa = split_abcissa\n#cat.model._input_model_dictionary['satellites_occupation']._split_abscissa = split_abcissa\nfor p in split:\n #params['mean_occupation_centrals_assembias_split1'] = p\n params['mean_occupation_satellites_assembias_split1'] = p\n #print params.keys()\n #print cat.model.param_dict\n cat.populate(params)\n #print cat.model.param_dict\n #cut_idx = cat.model.mock.galaxy_table['gal_type'] == 'centrals'\n mass_cut = np.logical_and(np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) > min_logmass,\\\n np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) <= max_logmass)\n #mass_cut = np.logical_and(mass_cut, cut_idx)\n #mock_nds.append(len(cut_idx)/cat.Lbox**3)\n mock_pos = np.c_[cat.model.mock.galaxy_table['x'],\\\n cat.model.mock.galaxy_table['y'],\\\n cat.model.mock.galaxy_table['z']]\n mock_wps.append(wp(mock_pos*cat.h, rp_bins ,40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1))\n #oneh, twoh = tpcf_one_two_halo_decomp(mock_pos,cat.model.mock.galaxy_table[mass_cut]['halo_hostid'],\\\n # rp_bins , period=cat.Lbox, num_threads=1)\n #mock_wps_1h.append(oneh)\n #mock_wps_2h.append(twoh)\n \nmock_wps = np.array(mock_wps)\nwp_errs = np.std(mock_wps, axis = 0)\n\n#mock_wps_1h = np.array(mock_wps_1h)\n#mock_wp_no_ab_1h = np.mean(mock_wps_1h, axis = 0)\n\n#mock_wps_2h = np.array(mock_wps_2h)\n#mock_wp_no_ab_2h = np.mean(mock_wps_2h, axis = 0)\n\n#mock_nds = np.array(mock_nds)\n#mock_nd = np.mean(mock_nds)\n#nd_err = np.std(mock_nds)",
"_____no_output_____"
],
[
"params",
"_____no_output_____"
],
[
"params = dict(zip(names, [0,0,0.5,0.5])) \ncat.populate(params)\nmass_cut = np.logical_and(np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) > min_logmass,\\\n np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) <= max_logmass)\n\nprint cat.model.param_dict\nmock_pos = np.c_[cat.model.mock.galaxy_table['x'],\\\n cat.model.mock.galaxy_table['y'],\\\n cat.model.mock.galaxy_table['z']]\nnoab_wp = wp(mock_pos*cat.h, rp_bins ,40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1)",
"{'mean_occupation_centrals_assembias_split1': 0.5, 'mean_occupation_centrals_assembias_param1': 0, 'mean_occupation_satellites_assembias_split1': 0.5, 'mean_occupation_satellites_assembias_param1': 0}\n"
],
[
"print np.log10(noab_wp)",
"[ 2.86479644 2.71375719 2.5690466 2.38344144 2.20022251 2.02959663\n 1.87899003 1.72579396 1.60715197 1.52219163 1.41551296 1.27122876\n 1.12708178 0.97127198 0.80736591 0.56640291 0.25364583 -0.18412503]\n"
],
[
"from halotools.mock_observables import return_xyz_formatted_array",
"_____no_output_____"
],
[
"sham_pos = np.c_[galaxy_catalog['halo_x'],\\\n galaxy_catalog['halo_y'],\\\n galaxy_catalog['halo_z']]\n\ndistortion_dim = 'z'\nv_distortion_dim = galaxy_catalog['halo_v%s' % distortion_dim]\n# apply redshift space distortions\n#sham_pos = return_xyz_formatted_array(sham_pos[:,0],sham_pos[:,1],sham_pos[:,2], velocity=v_distortion_dim, \\\n# velocity_distortion_dimension=distortion_dim, period=cat.Lbox)\n#sham_wp = wp(sham_pos*cat.h, rp_bins, 40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1)\nsham_wp = wp(sham_pos*cat.h, rp_bins, 40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1)\n\n#sham_wp = tpcf(sham_pos, rp_bins , period=cat.Lbox, num_threads=1)",
"_____no_output_____"
],
[
"sham_wp",
"_____no_output_____"
],
[
"len(galaxy_catalog)/((cat.Lbox*cat.h)**3)",
"_____no_output_____"
]
],
[
[
"np.savetxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/sham_vpeak_wp.npy', sham_wp)\n#np.savetxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/sham_vpeak_nd.npy', np.array([len(galaxy_catalog)/((cat.Lbox*cat.h)**3)]))",
"_____no_output_____"
],
[
"np.savetxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/rp_bins_split.npy',rp_bins )",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps):\n plt.plot(bin_centers, mock_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\nplt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"np.log10(mock_wps[-1])",
"_____no_output_____"
],
[
"plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps):\n plt.plot(bin_centers, mock_wp/sham_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\n#plt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\n#plt.loglog()\nplt.xscale('log')\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 15e0]);\nplt.ylim([0.8,1.2])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"plt.figure(figsize=(10,8))\n#for p, mock_wp in zip(split, mock_wps):\n# plt.plot(bin_centers, mock_wp/sham_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\nplt.plot(bin_centers, noab_wp/sham_wp, label = 'No AB')\n\n\n#plt.loglog()\nplt.xscale('log')\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 15e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps_1h):\n plt.plot(bin_centers, mock_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\n#plt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"/u/ki/swmclau2/.conda/envs/hodemulator/lib/python2.7/site-packages/matplotlib/axes/_axes.py:531: UserWarning: No labelled objects found. Use label='...' kwarg on individual plots.\n warnings.warn(\"No labelled objects found. \"\n"
],
[
"plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps_2h):\n plt.plot(bin_centers, mock_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\nplt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps_2h):\n plt.plot(bin_centers, mock_wp/noab_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\n#plt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"plt.plot(bin_centers, mock_wps[0, :])\nplt.plot(bin_centers, mock_wps_1h[0, :])\nplt.plot(bin_centers, mock_wps_2h[0, :])\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"plt.figure(figsize=(10,8))\n#avg = mock_wps.mean(axis = 0)\nfor p, mock_wp in zip(split, mock_wps):\n plt.plot(bin_centers, mock_wp/sham_wp, label = 'p = %.2f'%p)\n \nplt.plot(bin_centers, noab_wp/sham_wp, label = 'No AB', ls = ':')\n\n#plt.loglog()\nplt.xscale('log')\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 5e0]);\nplt.ylim([0.75,1.25]);\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)/\\xi_{SHAM}(r)$',fontsize = 15)\nplt.show()",
"_____no_output_____"
],
[
"sats_occ = cat.model._input_model_dictionary['satellites_occupation']\nsats_occ._split_ordinates = [0.99]",
"_____no_output_____"
]
],
[
[
"cens_occ = cat.model._input_model_dictionary['centrals_occupation']\ncens_occ._split_ordinates = [0.1]",
"_____no_output_____"
]
],
[
[
"print sats_occ",
"_____no_output_____"
],
[
"baseline_lower_bound, baseline_upper_bound = 0,np.inf\nprim_haloprop = cat.model.mock.halo_table['halo_mvir']\nsec_haloprop = cat.model.mock.halo_table['halo_nfw_conc']",
"_____no_output_____"
],
[
"from halotools.utils.table_utils import compute_conditional_percentile_values",
"_____no_output_____"
],
[
"split = sats_occ.percentile_splitting_function(prim_haloprop)\n\n# Compute the baseline, undecorated result\nresult = sats_occ.baseline_mean_occupation(prim_haloprop=prim_haloprop)\n\n# We will only decorate values that are not edge cases,\n# so first compute the mask for non-edge cases\nno_edge_mask = (\n (split > 0) & (split < 1) &\n (result > baseline_lower_bound) & (result < baseline_upper_bound)\n)\n# Now create convenient references to the non-edge-case sub-arrays\nno_edge_result = result[no_edge_mask]\nno_edge_split = split[no_edge_mask]",
"_____no_output_____"
]
],
[
[
"percentiles = compute_conditional_percentiles(\n prim_haloprop=prim_haloprop,\n sec_haloprop=sec_haloprop\n )\nno_edge_percentiles = percentiles[no_edge_mask]\ntype1_mask = no_edge_percentiles > no_edge_split\n\nperturbation = sats_occ._galprop_perturbation(prim_haloprop=prim_haloprop[no_edge_mask],baseline_result=no_edge_result, splitting_result=no_edge_split)\n\nfrac_type1 = 1 - no_edge_split\nfrac_type2 = 1 - frac_type1\nperturbation[~type1_mask] *= (-frac_type1[~type1_mask] /\n(frac_type2[~type1_mask]))",
"_____no_output_____"
],
[
"# Retrieve percentile values (medians) if they've been precomputed. Else, compute them.\n\nno_edge_percentile_values = compute_conditional_percentile_values(p=no_edge_split,\n prim_haloprop=prim_haloprop[no_edge_mask],\n sec_haloprop=sec_haloprop[no_edge_mask])\n\npv_sub_sec_haloprop = sec_haloprop[no_edge_mask] - no_edge_percentile_values\n\nperturbation = sats_occ._galprop_perturbation(\n prim_haloprop=prim_haloprop[no_edge_mask],\n sec_haloprop=pv_sub_sec_haloprop/np.max(np.abs(pv_sub_sec_haloprop)),\n baseline_result=no_edge_result)",
"_____no_output_____"
]
],
[
[
"from halotools.utils.table_utils import compute_conditional_averages",
"_____no_output_____"
],
[
"strength = sats_occ.assembias_strength(prim_haloprop[no_edge_mask])\nslope = sats_occ.assembias_slope(prim_haloprop[no_edge_mask])\n\n# the average displacement acts as a normalization we need.\nmax_displacement = sats_occ._disp_func(sec_haloprop=pv_sub_sec_haloprop/np.max(np.abs(pv_sub_sec_haloprop)), slope=slope)\ndisp_average = compute_conditional_averages(vals=max_displacement,prim_haloprop=prim_haloprop[no_edge_mask])\n#disp_average = np.ones((prim_haloprop.shape[0], ))*0.5\n\nperturbation2 = np.zeros(len(prim_haloprop[no_edge_mask]))\n\ngreater_than_half_avg_idx = disp_average > 0.5\nless_than_half_avg_idx = disp_average <= 0.5\n\nif len(max_displacement[greater_than_half_avg_idx]) > 0:\n base_pos = result[no_edge_mask][greater_than_half_avg_idx]\n strength_pos = strength[greater_than_half_avg_idx]\n avg_pos = disp_average[greater_than_half_avg_idx]\n\n upper_bound1 = (base_pos - baseline_lower_bound)/avg_pos\n upper_bound2 = (baseline_upper_bound - base_pos)/(1-avg_pos)\n upper_bound = np.minimum(upper_bound1, upper_bound2)\n print upper_bound1, upper_bound2\n perturbation2[greater_than_half_avg_idx] = strength_pos*upper_bound*(max_displacement[greater_than_half_avg_idx]-avg_pos)\n \n\nif len(max_displacement[less_than_half_avg_idx]) > 0:\n base_neg = result[no_edge_mask][less_than_half_avg_idx]\n strength_neg = strength[less_than_half_avg_idx]\n avg_neg = disp_average[less_than_half_avg_idx]\n\n lower_bound1 = (base_neg-baseline_lower_bound)/avg_neg#/(1- avg_neg)\n lower_bound2 = (baseline_upper_bound - base_neg)/(1-avg_neg)#(avg_neg)\n lower_bound = np.minimum(lower_bound1, lower_bound2)\n perturbation2[less_than_half_avg_idx] = strength_neg*lower_bound*(max_displacement[less_than_half_avg_idx]-avg_neg)\n\n",
"_____no_output_____"
],
[
"print np.unique(max_displacement[indices_of_mb])\nprint np.unique(disp_average[indices_of_mb])",
"_____no_output_____"
],
[
"perturbation",
"_____no_output_____"
],
[
"mass_bins = compute_mass_bins(prim_haloprop)\nmass_bin_idxs = compute_prim_haloprop_bins(prim_haloprop_bin_boundaries=mass_bins, prim_haloprop = prim_haloprop[no_edge_mask])\nmb = 87\nindices_of_mb = np.where(mass_bin_idxs == mb)[0]",
"_____no_output_____"
],
[
"plt.hist(perturbation[indices_of_mb], bins =100);\nplt.yscale('log');\n#plt.loglog();",
"_____no_output_____"
],
[
"print max(perturbation)\nprint min(perturbation)",
"_____no_output_____"
],
[
"print max(perturbation[indices_of_mb])\nprint min(perturbation[indices_of_mb])",
"_____no_output_____"
],
[
"idxs = np.argsort(perturbation)\nprint mass_bin_idxs[idxs[-10:]]",
"_____no_output_____"
],
[
"plt.hist(perturbation2[indices_of_mb], bins =100);\nplt.yscale('log');\n#plt.loglog();",
"_____no_output_____"
],
[
"print perturbation2",
"_____no_output_____"
]
]
]
| [
"code",
"raw",
"code",
"raw",
"code",
"raw",
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"raw",
"raw"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"raw"
],
[
"code",
"code",
"code",
"code"
],
[
"raw",
"raw"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d06118eed716b8b1c5ca1cd2af9f6d246ac13bb7 | 168,149 | ipynb | Jupyter Notebook | training/cyclone_model_svm.ipynb | etassone1974/UWA-Project-3 | af5608869ff1703a93855b084fb075d20df40a4a | [
"MIT"
]
| null | null | null | training/cyclone_model_svm.ipynb | etassone1974/UWA-Project-3 | af5608869ff1703a93855b084fb075d20df40a4a | [
"MIT"
]
| null | null | null | training/cyclone_model_svm.ipynb | etassone1974/UWA-Project-3 | af5608869ff1703a93855b084fb075d20df40a4a | [
"MIT"
]
| null | null | null | 391.044186 | 51,806 | 0.676091 | [
[
[
"from sklearn.model_selection import train_test_split\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.preprocessing import StandardScaler\nfrom sklearn.svm import SVC\nfrom sklearn.model_selection import GridSearchCV\nfrom sklearn.metrics import accuracy_score\nimport joblib\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
" # Read data into DataFrame from CSV file\n # cyclone_df = pd.read_csv(\"Cyclone_ML.csv\")\n cyclone_df = pd.read_csv(\"../data/Cyclone_ML.csv\")\n\n # Select features for machine learning and assign to X\n selected_features = cyclone_df[[\"SURFACE_CODE\",\t\"CYC_TYPE\", \"LAT\", \"LON\", \"CENTRAL_PRES\", \"MAX_WIND_SPD\", \"CENTRAL_INDEX (CI)\", \"WAVE_HEIGHT\"]]\n X = selected_features\n\n # Set y to compass direction of cyclone based on wind direction degree\n y = cyclone_df[\"WIND_COMPASS\"]\n # y = cyclone_df[\"MAX_REP_WIND_DIR\"]\n \n\n print(X.shape, y.shape)",
"(1691, 8) (1691,)\n"
],
[
"cyclone_df",
"_____no_output_____"
],
[
"X",
"_____no_output_____"
],
[
"y",
"_____no_output_____"
],
[
" # train test split\n X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)",
"_____no_output_____"
],
[
" X_scaler = StandardScaler().fit(X_train)\n X_train_scaled = X_scaler.transform(X_train)\n X_test_scaled = X_scaler.transform(X_test)",
"_____no_output_____"
],
[
" # Support vector machine linear classifier\n model = SVC(kernel='linear')\n\n # Fit the model to the training data and calculate the scores for the training and testing data\n model.fit(X_train_scaled, y_train)",
"_____no_output_____"
],
[
" training_score = model.score(X_train_scaled, y_train)\n testing_score = model.score(X_test_scaled, y_test)\n \n print(f\"Training Data Score: {training_score}\")\n print(f\"Testing Data Score: {testing_score}\")",
"Training Data Score: 0.23186119873817035\nTesting Data Score: 0.20094562647754138\n"
],
[
" predictions = model.predict(X_test_scaled)\n acc = accuracy_score(y_test, preds)\n print(f'Model accuracy on test set: {acc:.2f}')",
"Model accuracy on test set: 0.20\n"
],
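[
"# GridSearchCV is imported above but never used; this is a minimal, hypothetical tuning sketch, not part of the original notebook.\n# The parameter grid values below are illustrative assumptions.\nparam_grid = {'C': [0.1, 1, 10], 'kernel': ['linear', 'rbf']}\ngrid = GridSearchCV(SVC(), param_grid, cv=3)\ngrid.fit(X_train_scaled, y_train)\nprint(grid.best_params_)\nprint(f'Best cross-validation score: {grid.best_score_:.2f}')",
"_____no_output_____"
],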
[
"from sklearn.metrics import plot_confusion_matrix\nplot_confusion_matrix(model, X_test_scaled, y_test, cmap=\"Blues\")\nplt.show()",
"_____no_output_____"
],
[
"plot_confusion_matrix(model, X_train_scaled, y_train, cmap=\"Blues\")\nplt.show()",
"_____no_output_____"
],
[
"plt.savefig('../static/images/clrep_train_svm.png')",
"_____no_output_____"
],
[
"plt.savefig('books_read.png')",
"_____no_output_____"
],
[
"from sklearn.metrics import classification_report\nprint(classification_report(y_test, predictions,\n target_names=[\"E\", \"N\", \"NE\", \"NW\", \"S\", \"SE\", \"SW\", \"W\"]))",
" precision recall f1-score support\n\n E 0.17 0.60 0.27 73\n N 0.18 0.12 0.14 60\n NE 0.00 0.00 0.00 33\n NW 0.00 0.00 0.00 48\n S 0.00 0.00 0.00 58\n SE 0.22 0.34 0.27 65\n SW 0.00 0.00 0.00 25\n W 0.34 0.20 0.25 61\n\n accuracy 0.20 423\n macro avg 0.12 0.16 0.12 423\nweighted avg 0.14 0.20 0.14 423\n\n"
],
[
"joblib.dump(model, 'cyclone_SVM.smd')\nprint(\"Model is saved.\")",
"Model is saved.\n"
],
[
"joblib.dump(model, '../cyclone_SVM.smd')\nprint(\"Model is saved.\")",
"Model is saved.\n"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0611dc271fbaf941d70af46922cae2350029a02 | 415,066 | ipynb | Jupyter Notebook | _notebooks/2020-06-29-01-Showing-uncertainty.ipynb | AntonovMikhail/chans_jupyter | c2cd1675408238ad5be81ba98994611d8c4e48ae | [
"Apache-2.0"
]
| 8 | 2020-06-26T23:48:52.000Z | 2021-02-27T22:26:31.000Z | _notebooks/2020-06-29-01-Showing-uncertainty.ipynb | AntonovMikhail/chans_jupyter | c2cd1675408238ad5be81ba98994611d8c4e48ae | [
"Apache-2.0"
]
| 46 | 2020-06-30T00:45:37.000Z | 2021-03-07T14:47:10.000Z | _notebooks/2020-06-29-01-Showing-uncertainty.ipynb | AntonovMikhail/chans_jupyter | c2cd1675408238ad5be81ba98994611d8c4e48ae | [
"Apache-2.0"
]
| 26 | 2020-07-24T17:30:15.000Z | 2021-02-19T10:19:25.000Z | 275.608234 | 78,352 | 0.909212 | [
[
[
"# Showing uncertainty\n> Uncertainty occurs everywhere in data science, but it's frequently left out of visualizations where it should be included. Here, we review what a confidence interval is and how to visualize them for both single estimates and continuous functions. Additionally, we discuss the bootstrap resampling technique for assessing uncertainty and how to visualize it properly. This is the Summary of lecture \"Improving Your Data Visualizations in Python\", via datacamp.\n\n- toc: true \n- badges: true\n- comments: true\n- author: Chanseok Kang\n- categories: [Python, Datacamp, Visualization]\n- image: images/so2_compare.png",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nplt.rcParams['figure.figsize'] = (10, 5)",
"_____no_output_____"
]
],
[
[
"### Point estimate intervals\n- When is uncertainty important?\n - Estimates from sample\n - Average of a subset\n - Linear model coefficients\n- Why is uncertainty important?\n - Helps inform confidence in estimate\n - Neccessary for decision making\n - Acknowledges limitations of data",
"_____no_output_____"
],
[
"### Basic confidence intervals\nYou are a data scientist for a fireworks manufacturer in Des Moines, Iowa. You need to make a case to the city that your company's large fireworks show has not caused any harm to the city's air. To do this, you look at the average levels for pollutants in the week after the fourth of July and how they compare to readings taken after your last show. By showing confidence intervals around the averages, you can make a case that the recent readings were well within the normal range.",
"_____no_output_____"
]
],
[
[
"average_ests = pd.read_csv('./dataset/average_ests.csv', index_col=0)\naverage_ests",
"_____no_output_____"
],
[
"# Construct CI bounds for averages\naverage_ests['lower'] = average_ests['mean'] - 1.96 * average_ests['std_err']\naverage_ests['upper'] = average_ests['mean'] + 1.96 * average_ests['std_err']\n\n# Setup a grid of plots, with non-shared x axes limits\ng = sns.FacetGrid(average_ests, row='pollutant', sharex=False, aspect=2);\n\n# Plot CI for average estimate\ng.map(plt.hlines, 'y', 'lower', 'upper');\n\n# Plot observed values for comparison and remove axes labels\ng.map(plt.scatter, 'seen', 'y', color='orangered').set_ylabels('').set_xlabels('');",
"_____no_output_____"
]
],
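[
[
"# Aside (not part of the exercise): where the 1.96 multiplier comes from.\n# Assumes scipy is available; norm.ppf inverts the standard normal CDF.\nfrom scipy.stats import norm\nfor level in [0.90, 0.95, 0.99]:\n    # Two-sided interval: split the leftover tail probability between the two sides\n    print(f'{level:.0%} interval -> z = {norm.ppf(1 - (1 - level) / 2):.3f}')",
"_____no_output_____"
]
],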
[
[
"This simple visualization shows that all the observed values fall well within the confidence intervals for all the pollutants except for $O_3$.",
"_____no_output_____"
],
[
"### Annotating confidence intervals\nYour data science work with pollution data is legendary, and you are now weighing job offers in both Cincinnati, Ohio and Indianapolis, Indiana. You want to see if the SO2 levels are significantly different in the two cities, and more specifically, which city has lower levels. To test this, you decide to look at the differences in the cities' SO2 values (Indianapolis' - Cincinnati's) over multiple years.\n\nInstead of just displaying a p-value for a significant difference between the cities, you decide to look at the 95% confidence intervals (columns `lower` and `upper`) of the differences. This allows you to see the magnitude of the differences along with any trends over the years.",
"_____no_output_____"
]
],
[
[
"diffs_by_year = pd.read_csv('./dataset/diffs_by_year.csv', index_col=0)\ndiffs_by_year",
"_____no_output_____"
],
[
"# Set start and ends according to intervals\n# Make intervals thicker\nplt.hlines(y='year', xmin='lower', xmax='upper', \n linewidth=5, color='steelblue', alpha=0.7,\n data=diffs_by_year);\n\n# Point estimates\nplt.plot('mean', 'year', 'k|', data=diffs_by_year);\n\n# Add a 'null' reference line at 0 and color orangered\nplt.axvline(x=0, color='orangered', linestyle='--');\n\n# Set descriptive axis labels and title\nplt.xlabel('95% CI');\nplt.title('Avg SO2 differences between Cincinnati and Indianapolis');",
"_____no_output_____"
]
],
[
[
"By looking at the confidence intervals you can see that the difference flipped from generally positive (more pollution in Cincinnati) in 2013 to negative (more pollution in Indianapolis) in 2014 and 2015. Given that every year's confidence interval contains the null value of zero, no P-Value would be significant, and a plot that only showed significance would have been entirely hidden this trend.",
"_____no_output_____"
],
[
"## Confidence bands",
"_____no_output_____"
],
[
"### Making a confidence band\nVandenberg Air Force Base is often used as a location to launch rockets into space. You have a theory that a recent increase in the pace of rocket launches could be harming the air quality in the surrounding region. To explore this, you plotted a 25-day rolling average line of the measurements of atmospheric $NO_2$. To help decide if any pattern observed is random-noise or not, you decide to add a 99% confidence band around your rolling mean. Adding a confidence band to a trend line can help shed light on the stability of the trend seen. This can either increase or decrease the confidence in the discovered trend.\n\n",
"_____no_output_____"
]
],
[
[
"vandenberg_NO2 = pd.read_csv('./dataset/vandenberg_NO2.csv', index_col=0)\nvandenberg_NO2.head()",
"_____no_output_____"
],
[
"# Draw 99% interval bands for average NO2\nvandenberg_NO2['lower'] = vandenberg_NO2['mean'] - 2.58 * vandenberg_NO2['std_err']\nvandenberg_NO2['upper'] = vandenberg_NO2['mean'] + 2.58 * vandenberg_NO2['std_err']\n\n# Plot mean estimate as a white semi-transparent line\nplt.plot('day', 'mean', data=vandenberg_NO2, color='white', alpha=0.4);\n\n# Fill between the upper and lower confidence band values\nplt.fill_between(x='day', y1='lower', y2='upper', data=vandenberg_NO2);",
"_____no_output_____"
]
],
[
[
"This plot shows that the middle of the year's $NO_2$ values are not only lower than the beginning and end of the year but also are less noisy. If just the moving average line were plotted, then this potentially interesting observation would be completely missed. (Can you think of what may cause reduced variance at the lower values of the pollutant?)",
"_____no_output_____"
],
[
"### Separating a lot of bands\nIt is relatively simple to plot a bunch of trend lines on top of each other for rapid and precise comparisons. Unfortunately, if you need to add uncertainty bands around those lines, the plot becomes very difficult to read. Figuring out whether a line corresponds to the top of one class' band or the bottom of another's can be hard due to band overlap. Luckily in Seaborn, it's not difficult to break up the overlapping bands into separate faceted plots.\n\nTo see this, explore trends in SO2 levels for a few cities in the eastern half of the US. If you plot the trends and their confidence bands on a single plot - it's a mess. To fix, use Seaborn's `FacetGrid()` function to spread out the confidence intervals to multiple panes to ease your inspection.",
"_____no_output_____"
]
],
[
[
"eastern_SO2 = pd.read_csv('./dataset/eastern_SO2.csv', index_col=0)\neastern_SO2.head()",
"_____no_output_____"
],
[
"# setup a grid of plots with columns divided by location\ng = sns.FacetGrid(eastern_SO2, col='city', col_wrap=2);\n\n# Map interval plots to each cities data with coral colored ribbons\ng.map(plt.fill_between, 'day', 'lower', 'upper', color='coral');\n\n# Map overlaid mean plots with white line\ng.map(plt.plot, 'day', 'mean', color='white');",
"_____no_output_____"
]
],
[
[
"By separating each band into its own plot you can investigate each city with ease. Here, you see that Des Moines and Houston on average have lower SO2 values for the entire year than the two cities in the Midwest. Cincinnati has a high and variable peak near the beginning of the year but is generally more stable and lower than Indianapolis.",
"_____no_output_____"
],
[
"### Cleaning up bands for overlaps\nYou are working for the city of Denver, Colorado and want to run an ad campaign about how much cleaner Denver's air is than Long Beach, California's air. To investigate this claim, you will compare the SO2 levels of both cities for the year 2014. Since you are solely interested in how the cities compare, you want to keep the bands on the same plot. To make the bands easier to compare, decrease the opacity of the confidence bands and set a clear legend.",
"_____no_output_____"
]
],
[
[
"SO2_compare = pd.read_csv('./dataset/SO2_compare.csv', index_col=0)\nSO2_compare.head()",
"_____no_output_____"
],
[
"for city, color in [('Denver', '#66c2a5'), ('Long Beach', '#fc8d62')]:\n # Filter data to desired city\n city_data = SO2_compare[SO2_compare.city == city]\n \n # Set city interval color to desired and lower opacity\n plt.fill_between(x='day', y1='lower', y2='upper', data=city_data, color=color, alpha=0.4);\n \n # Draw a faint mean line for reference and give a label for legend\n plt.plot('day', 'mean', data=city_data, label=city, color=color, alpha=0.25);\n \nplt.legend();",
"_____no_output_____"
]
],
[
[
"From these two curves you can see that during the first half of the year Long Beach generally has a higher average SO2 value than Denver, in the middle of the year they are very close, and at the end of the year Denver seems to have higher averages. However, by showing the confidence intervals, you can see however that almost none of the year shows a statistically meaningful difference in average values between the two cities.",
"_____no_output_____"
],
[
"## Beyond 95%\n",
"_____no_output_____"
],
[
"### 90, 95, and 99% intervals\nYou are a data scientist for an outdoor adventure company in Fairbanks, Alaska. Recently, customers have been having issues with SO2 pollution, leading to costly cancellations. The company has sensors for CO, NO2, and O3 but not SO2 levels.\n\nYou've built a model that predicts SO2 values based on the values of pollutants with sensors (loaded as `pollution_model`, a `statsmodels` object). You want to investigate which pollutant's value has the largest effect on your model's SO2 prediction. This will help you know which pollutant's values to pay most attention to when planning outdoor tours. To maximize the amount of information in your report, show multiple levels of uncertainty for the model estimates.",
"_____no_output_____"
]
],
[
[
"from statsmodels.formula.api import ols",
"_____no_output_____"
],
[
"pollution = pd.read_csv('./dataset/pollution_wide.csv')\npollution = pollution.query(\"city == 'Fairbanks' & year == 2014 & month == 11\")",
"_____no_output_____"
],
[
"pollution_model = ols(formula='SO2 ~ CO + NO2 + O3 + day', data=pollution)\nres = pollution_model.fit()",
"_____no_output_____"
],
[
"# Add interval percent widths\nalphas = [ 0.01, 0.05, 0.1] \nwidths = [ '99% CI', '95%', '90%']\ncolors = ['#fee08b','#fc8d59','#d53e4f']\n\nfor alpha, color, width in zip(alphas, colors, widths):\n # Grab confidence interval\n conf_ints = res.conf_int(alpha)\n \n # Pass current interval color and legend label to plot\n plt.hlines(y = conf_ints.index, xmin = conf_ints[0], xmax = conf_ints[1],\n colors = color, label = width, linewidth = 10) \n\n# Draw point estimates\nplt.plot(res.params, res.params.index, 'wo', label = 'Point Estimate')\n\nplt.legend(loc = 'upper right')",
"_____no_output_____"
]
],
[
[
"### 90 and 95% bands\nYou are looking at a 40-day rolling average of the $NO_2$ pollution levels for the city of Cincinnati in 2013. To provide as detailed a picture of the uncertainty in the trend you want to look at both the 90 and 99% intervals around this rolling estimate.\n\nTo do this, set up your two interval sizes and an orange ordinal color palette. Additionally, to enable precise readings of the bands, make them semi-transparent, so the Seaborn background grids show through.",
"_____no_output_____"
]
],
[
[
"cinci_13_no2 = pd.read_csv('./dataset/cinci_13_no2.csv', index_col=0);\ncinci_13_no2.head()",
"_____no_output_____"
],
[
"int_widths = ['90%', '99%']\nz_scores = [1.67, 2.58]\ncolors = ['#fc8d59', '#fee08b']\n\nfor percent, Z, color in zip(int_widths, z_scores, colors):\n \n # Pass lower and upper confidence bounds and lower opacity\n plt.fill_between(\n x = cinci_13_no2.day, alpha = 0.4, color = color,\n y1 = cinci_13_no2['mean'] - Z * cinci_13_no2['std_err'],\n y2 = cinci_13_no2['mean'] + Z * cinci_13_no2['std_err'],\n label = percent);\n \nplt.legend();",
"_____no_output_____"
]
],
[
[
"This plot shows us that throughout 2013, the average NO2 values in Cincinnati followed a cyclical pattern with the seasons. However, the uncertainty bands show that for most of the year you can't be sure this pattern is not noise at both a 90 and 99% confidence level.",
"_____no_output_____"
],
[
"### Using band thickness instead of coloring\nYou are a researcher investigating the elevation a rocket reaches before visual is lost and pollutant levels at Vandenberg Air Force Base. You've built a model to predict this relationship, and since you are working independently, you don't have the money to pay for color figures in your journal article. You need to make your model results plot work in black and white. To do this, you will plot the 90, 95, and 99% intervals of the effect of each pollutant as successively smaller bars.",
"_____no_output_____"
]
],
[
[
"rocket_model = pd.read_csv('./dataset/rocket_model.csv', index_col=0)\nrocket_model",
"_____no_output_____"
],
[
"# Decrase interval thickness as interval widens\nsizes = [ 15, 10, 5]\nint_widths = ['90% CI', '95%', '99%']\nz_scores = [ 1.67, 1.96, 2.58]\n\nfor percent, Z, size in zip(int_widths, z_scores, sizes):\n plt.hlines(y = rocket_model.pollutant, \n xmin = rocket_model['est'] - Z * rocket_model['std_err'],\n xmax = rocket_model['est'] + Z * rocket_model['std_err'],\n label = percent, \n # Resize lines and color them gray\n linewidth = size, \n color = 'gray'); \n \n# Add point estimate\nplt.plot('est', 'pollutant', 'wo', data = rocket_model, label = 'Point Estimate');\nplt.legend(loc = 'center left', bbox_to_anchor = (1, 0.5));",
"_____no_output_____"
]
],
[
[
"While less elegant than using color to differentiate interval sizes, this plot still clearly allows the reader to access the effect each pollutant has on rocket visibility. You can see that of all the pollutants, O3 has the largest effect and also the tightest confidence bounds",
"_____no_output_____"
],
[
"## Visualizing the bootstrap\n",
"_____no_output_____"
],
[
"### The bootstrap histogram\nYou are considering a vacation to Cincinnati in May, but you have a severe sensitivity to NO2. You pull a few years of pollution data from Cincinnati in May and look at a bootstrap estimate of the average $NO_2$ levels. You only have one estimate to look at the best way to visualize the results of your bootstrap estimates is with a histogram.\n\nWhile you like the intuition of the bootstrap histogram by itself, your partner who will be going on the vacation with you, likes seeing percent intervals. To accommodate them, you decide to highlight the 95% interval by shading the region.",
"_____no_output_____"
]
],
[
[
"# Perform bootstrapped mean on a vector\ndef bootstrap(data, n_boots):\n return [np.mean(np.random.choice(data,len(data))) for _ in range(n_boots) ]",
"_____no_output_____"
],
[
"pollution = pd.read_csv('./dataset/pollution_wide.csv')\ncinci_may_NO2 = pollution.query(\"city == 'Cincinnati' & month == 5\").NO2\n\n# Generate bootstrap samples\nboot_means = bootstrap(cinci_may_NO2, 1000)\n\n# Get lower and upper 95% interval bounds\nlower, upper = np.percentile(boot_means, [2.5, 97.5])\n\n# Plot shaded area for interval\nplt.axvspan(lower, upper, color = 'gray', alpha = 0.2);\n\n# Draw histogram of bootstrap samples\nsns.distplot(boot_means, bins = 100, kde = False);",
"_____no_output_____"
]
],
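[
[
"# Reproducibility aside (not part of the exercise): the bootstrap resamples randomly,\n# so seeding NumPy first makes the interval repeatable across runs.\nnp.random.seed(42)\nboot_means_seeded = bootstrap(cinci_may_NO2, 1000)\nprint(np.percentile(boot_means_seeded, [2.5, 97.5]))",
"_____no_output_____"
]
],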
[
[
"Your bootstrap histogram looks stable and uniform. You're now confident that the average NO2 levels in Cincinnati during your vacation should be in the range of 16 to 23.",
"_____no_output_____"
],
[
"### Bootstrapped regressions\nWhile working for the Long Beach parks and recreation department investigating the relationship between $NO_2$ and $SO_2$ you noticed a cluster of potential outliers that you suspect might be throwing off the correlations.\n\nInvestigate the uncertainty of your correlations through bootstrap resampling to see how stable your fits are. For convenience, the bootstrap sampling is complete and is provided as `no2_so2_boot` along with `no2_so2` for the non-resampled data.",
"_____no_output_____"
]
],
[
[
"no2_so2 = pd.read_csv('./dataset/no2_so2.csv', index_col=0)\nno2_so2_boot = pd.read_csv('./dataset/no2_so2_boot.csv', index_col=0)",
"_____no_output_____"
],
[
"sns.lmplot('NO2', 'SO2', data = no2_so2_boot,\n # Tell seaborn to a regression line for each sample\n hue = 'sample', \n # Make lines blue and transparent\n line_kws = {'color': 'steelblue', 'alpha': 0.2},\n # Disable built-in confidence intervals\n ci = None, legend = False, scatter = False);\n\n# Draw scatter of all points\nplt.scatter('NO2', 'SO2', data = no2_so2);",
"_____no_output_____"
]
],
[
[
"The outliers appear to drag down the regression lines as evidenced by the cluster of lines with more severe slopes than average. In a single plot, you have not only gotten a good idea of the variability of your correlation estimate but also the potential effects of outliers.",
"_____no_output_____"
],
[
"### Lots of bootstraps with beeswarms\nAs a current resident of Cincinnati, you're curious to see how the average NO2 values compare to Des Moines, Indianapolis, and Houston: a few other cities you've lived in.\n\nTo look at this, you decide to use bootstrap estimation to look at the mean NO2 values for each city. Because the comparisons are of primary interest, you will use a swarm plot to compare the estimates.",
"_____no_output_____"
]
],
[
[
"pollution_may = pollution.query(\"month == 5\")\npollution_may",
"_____no_output_____"
],
[
"# Initialize a holder DataFrame for bootstrap results\ncity_boots = pd.DataFrame()\n\nfor city in ['Cincinnati', 'Des Moines', 'Indianapolis', 'Houston']:\n # Filter to city\n city_NO2 = pollution_may[pollution_may.city == city].NO2\n # Bootstrap city data & put in DataFrame\n cur_boot = pd.DataFrame({'NO2_avg': bootstrap(city_NO2, 100), 'city': city})\n # Append to other city's bootstraps\n city_boots = pd.concat([city_boots,cur_boot])\n\n# Beeswarm plot of averages with citys on y axis\nsns.swarmplot(y = \"city\", x = \"NO2_avg\", data = city_boots, color = 'coral');",
"_____no_output_____"
]
],
[
[
"The beeswarm plots show that Indianapolis and Houston both have the highest average NO2 values, with Cincinnati falling roughly in the middle. Interestingly, you can rather confidently say that Des Moines has the lowest as nearly all its sample estimates fall below those of the other cities.",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
]
|
d06127bfe2404b70d73671ba48a678224d198027 | 13,667 | ipynb | Jupyter Notebook | notebooks/find_a_shakemap.ipynb | iwbailey/shakemap_lookup | 75ac2739cba2a293f95a24feb1d7f57ebc0f834b | [
"MIT"
]
| 1 | 2020-10-16T07:31:58.000Z | 2020-10-16T07:31:58.000Z | notebooks/find_a_shakemap.ipynb | iwbailey/shakemap_lookup | 75ac2739cba2a293f95a24feb1d7f57ebc0f834b | [
"MIT"
]
| null | null | null | notebooks/find_a_shakemap.ipynb | iwbailey/shakemap_lookup | 75ac2739cba2a293f95a24feb1d7f57ebc0f834b | [
"MIT"
]
| null | null | null | 33.829208 | 299 | 0.524914 | [
[
[
"from shakemap_lookup import usgs_web",
"_____no_output_____"
],
[
"help(usgs_web.search_usgsevents)",
"Help on function search_usgsevents in module shakemap_lookup.usgs_web:\n\nsearch_usgsevents(searchParams, urlEndpt='https://earthquake.usgs.gov/fdsnws/event/1/query', maxNprint=30, isQuiet=False)\n Search the USGS for events satisfying the criteria and return a list of\n events\n \n IN:\n searchParams is a dict containing the search parameters used for the query\n urlEndpt [string] is the web address used for the search.\n \n API doc here... https://earthquake.usgs.gov/fdsnws/event/1/\n \n OUT:\n A list of events satisfying the conditions in a json structure\n\n"
]
],
[
[
"## Define our search parameters and send to the USGS \nUse a dict, with same names as used by the USGS web call.\n\nSend a query to the web server. The result is a list of events also in a dict format.",
"_____no_output_____"
]
],
[
[
"search_params = {\n 'starttime': \"2018-05-01\",\n 'endtime': \"2018-05-17\",\n 'minmagnitude': 6.8,\n 'maxmagnitude': 10.0,\n 'mindepth': 0.0,\n 'maxdepth': 50.0,\n 'minlongitude': -180.0,\n 'maxlongitude': -97.0,\n 'minlatitude': 0.0,\n 'maxlatitude': 45.0,\n 'limit': 50,\n 'producttype': 'shakemap'\n}\n\nevents = usgs_web.search_usgsevents(search_params)",
"Sending query to get events...\nParsing...\n\t...1 events returned (limit of 50)\n\t\t 70116556 : M 6.9 - 19km SSW of Leilani Estates, Hawaii\n"
]
],
[
[
"## Check the metadata \nDisplay metadata including number of earthquakes returned and what url was used for the query",
"_____no_output_____"
]
],
[
[
"for k, v in events['metadata'].items():\n print(k,\":\", v)",
"generated : 1575582197000\nurl : https://earthquake.usgs.gov/fdsnws/event/1/query?starttime=2018-05-01&endtime=2018-05-17&minmagnitude=6.8&maxmagnitude=10.0&mindepth=0.0&maxdepth=50.0&minlongitude=-180.0&maxlongitude=-97.0&minlatitude=0.0&maxlatitude=45.0&limit=50&producttype=shakemap&format=geojson&jsonerror=true\ntitle : USGS Earthquakes\nstatus : 200\napi : 1.8.1\nlimit : 50\noffset : 1\ncount : 1\n"
]
],
[
[
"## Selection of event from candidates",
"_____no_output_____"
]
],
[
[
"my_event = usgs_web.choose_event(events)\nmy_event",
"\nUSER SELECTION OF EVENT:\n========================\n 0: M 6.9 - 19km SSW of Leilani Estates, Hawaii (70116556)\nNone: First on list\n -1: Exit\n\nChoice: \n\t... selected M 6.9 - 19km SSW of Leilani Estates, Hawaii (70116556)\n\n"
]
],
[
[
"## Select which ShakeMap for the selected event",
"_____no_output_____"
]
],
[
[
"smDetail = usgs_web.query_shakemapdetail(my_event['properties'])",
"Querying detailed event info for eventId=70116556...\n\t...2 shakemaps found\n\nUSER SELECTION OF SHAKEMAP:\n===========================\nOption 0:\n\t eventsourcecode: 70116556\n\t version: 1\n\t process-timestamp: 2018-09-08T02:52:24Z\nOption 1:\n\t eventsourcecode: 1000dyad\n\t version: 11\n\t process-timestamp: 2018-06-15T23:02:03Z\n\nChoice [default 0]: \n\t... selected 0\n\n"
]
],
[
[
"## Display available content for the ShakeMap",
"_____no_output_____"
]
],
[
[
"print(\"Available Content\\n=================\")\nfor k, v in smDetail['contents'].items():\n print(\"{:32s}: {} [{}]\".format(k, v['contentType'], v['length']))",
"Available Content\n=================\nabout_formats.html : text/html [28820]\ncontents.xml : application/xml [9187]\ndownload/70116556.kml : application/vnd.google-earth.kml+xml [1032]\ndownload/cont_mi.json : application/json [79388]\ndownload/cont_mi.kmz : application/vnd.google-earth.kmz [17896]\ndownload/cont_pga.json : application/json [17499]\ndownload/cont_pga.kmz : application/vnd.google-earth.kmz [4362]\ndownload/cont_pgv.json : application/json [12352]\ndownload/cont_pgv.kmz : application/vnd.google-earth.kmz [3309]\ndownload/cont_psa03.json : application/json [24669]\ndownload/cont_psa03.kmz : application/vnd.google-earth.kmz [5843]\ndownload/cont_psa10.json : application/json [15028]\ndownload/cont_psa10.kmz : application/vnd.google-earth.kmz [3843]\ndownload/cont_psa30.json : application/json [7537]\ndownload/cont_psa30.kmz : application/vnd.google-earth.kmz [2254]\ndownload/epicenter.kmz : application/vnd.google-earth.kmz [1299]\ndownload/event.txt : text/plain [125]\ndownload/grid.xml : application/xml [3423219]\ndownload/grid.xml.zip : application/zip [493382]\ndownload/grid.xyz.zip : application/zip [428668]\ndownload/hazus.zip : application/zip [329755]\ndownload/hv70116556.kml : application/vnd.google-earth.kml+xml [1032]\ndownload/hv70116556.kmz : application/vnd.google-earth.kmz [127511]\ndownload/ii_overlay.png : image/png [25259]\ndownload/ii_thumbnail.jpg : image/jpeg [3530]\ndownload/info.json : application/json [2237]\ndownload/intensity.jpg : image/jpeg [60761]\ndownload/intensity.ps.zip : application/zip [139098]\ndownload/metadata.txt : text/plain [33137]\ndownload/mi_regr.png : image/png [35160]\ndownload/overlay.kmz : application/vnd.google-earth.kmz [25245]\ndownload/pga.jpg : image/jpeg [49594]\ndownload/pga.ps.zip : application/zip [89668]\ndownload/pga_regr.png : image/png [33466]\ndownload/pgv.jpg : image/jpeg [49781]\ndownload/pgv.ps.zip : application/zip [89389]\ndownload/pgv_regr.png : image/png [17605]\ndownload/polygons_mi.kmz : application/vnd.google-earth.kmz [43271]\ndownload/psa03.jpg : image/jpeg [49354]\ndownload/psa03.ps.zip : application/zip [90027]\ndownload/psa03_regr.png : image/png [18371]\ndownload/psa10.jpg : image/jpeg [49003]\ndownload/psa10.ps.zip : application/zip [89513]\ndownload/psa10_regr.png : image/png [31310]\ndownload/psa30.jpg : image/jpeg [48956]\ndownload/psa30.ps.zip : application/zip [89113]\ndownload/psa30_regr.png : image/png [18055]\ndownload/raster.zip : application/zip [1940448]\ndownload/rock_grid.xml.zip : application/zip [403486]\ndownload/sd.jpg : image/jpeg [45869]\ndownload/shape.zip : application/zip [1029832]\ndownload/stationlist.json : application/json [55083]\ndownload/stationlist.txt : text/plain [6737]\ndownload/stationlist.xml : application/xml [32441]\ndownload/stations.kmz : application/vnd.google-earth.kmz [7343]\ndownload/tvguide.txt : text/plain [8765]\ndownload/tvmap.jpg : image/jpeg [44223]\ndownload/tvmap.ps.zip : application/zip [273000]\ndownload/tvmap_bare.jpg : image/jpeg [48640]\ndownload/tvmap_bare.ps.zip : application/zip [273146]\ndownload/uncertainty.xml.zip : application/zip [211743]\ndownload/urat_pga.jpg : image/jpeg [45869]\ndownload/urat_pga.ps.zip : application/zip [51741]\nintensity.html : text/html [19291]\npga.html : text/html [19083]\npgv.html : text/html [19083]\nproducts.html : text/html [18584]\npsa03.html : text/html [20250]\npsa10.html : text/html [20249]\npsa30.html : text/html [20249]\nstationlist.html : text/html [127947]\n"
]
],
[
[
"## Get download links\nClick on the link to download",
"_____no_output_____"
]
],
[
[
"# Extract the shakemap grid urls and version from the detail\ngrid = smDetail['contents']['download/grid.xml.zip']\nprint(grid['url'])",
"https://earthquake.usgs.gov/archive/product/shakemap/hv70116556/us/1536375199192/download/grid.xml.zip\n"
],
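[
"# Hypothetical follow-up (not in the original notebook): fetch and unpack the grid programmatically.\n# Assumes the 'requests' package is installed; the output directory name is an illustrative choice.\nimport io\nimport zipfile\nimport requests\n\nresp = requests.get(grid['url'])\nwith zipfile.ZipFile(io.BytesIO(resp.content)) as zf:\n    print(zf.namelist())\n    zf.extractall('shakemap_grid')",
"_____no_output_____"
],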
[
"grid = smDetail['contents']['download/uncertainty.xml.zip']\nprint(grid['url'])",
"https://earthquake.usgs.gov/archive/product/shakemap/hv70116556/us/1536375199192/download/uncertainty.xml.zip\n"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d0613442e8a77054618c47832a0a30ce54d0c49d | 148,109 | ipynb | Jupyter Notebook | module3-permutation-boosting/LS_DS_233.ipynb | mariokart345/DS-Unit-2-Applied-Modeling | ecb7506dc3b08bb06c282937bdbc152332fb9b1b | [
"MIT"
]
| null | null | null | module3-permutation-boosting/LS_DS_233.ipynb | mariokart345/DS-Unit-2-Applied-Modeling | ecb7506dc3b08bb06c282937bdbc152332fb9b1b | [
"MIT"
]
| null | null | null | module3-permutation-boosting/LS_DS_233.ipynb | mariokart345/DS-Unit-2-Applied-Modeling | ecb7506dc3b08bb06c282937bdbc152332fb9b1b | [
"MIT"
]
| null | null | null | 68.064798 | 25,922 | 0.626633 | [
[
[
"<a href=\"https://colab.research.google.com/github/mariokart345/DS-Unit-2-Applied-Modeling/blob/master/module3-permutation-boosting/LS_DS_233.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"Lambda School Data Science\n\n*Unit 2, Sprint 3, Module 3*\n\n---",
"_____no_output_____"
],
[
"# Permutation & Boosting\n\n- Get **permutation importances** for model interpretation and feature selection\n- Use xgboost for **gradient boosting**",
"_____no_output_____"
],
[
"### Setup\n\nRun the code cell below. You can work locally (follow the [local setup instructions](https://lambdaschool.github.io/ds/unit2/local/)) or on Colab.\n\nLibraries:\n\n- category_encoders\n- [**eli5**](https://eli5.readthedocs.io/en/latest/)\n- matplotlib\n- numpy\n- pandas\n- scikit-learn\n- [**xgboost**](https://xgboost.readthedocs.io/en/latest/)",
"_____no_output_____"
]
],
[
[
"%%capture\nimport sys\n\n# If you're on Colab:\nif 'google.colab' in sys.modules:\n DATA_PATH = 'https://raw.githubusercontent.com/LambdaSchool/DS-Unit-2-Applied-Modeling/master/data/'\n !pip install category_encoders==2.*\n !pip install eli5\n\n# If you're working locally:\nelse:\n DATA_PATH = '../data/'",
"_____no_output_____"
]
],
[
[
"We'll go back to Tanzania Waterpumps for this lesson.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\n\n# Merge train_features.csv & train_labels.csv\ntrain = pd.merge(pd.read_csv(DATA_PATH+'waterpumps/train_features.csv'), \n pd.read_csv(DATA_PATH+'waterpumps/train_labels.csv'))\n\n# Read test_features.csv & sample_submission.csv\ntest = pd.read_csv(DATA_PATH+'waterpumps/test_features.csv')\nsample_submission = pd.read_csv(DATA_PATH+'waterpumps/sample_submission.csv')\n\n\n# Split train into train & val\ntrain, val = train_test_split(train, train_size=0.80, test_size=0.20, \n stratify=train['status_group'], random_state=42)\n\n\ndef wrangle(X):\n \"\"\"Wrangle train, validate, and test sets in the same way\"\"\"\n \n # Prevent SettingWithCopyWarning\n X = X.copy()\n \n # About 3% of the time, latitude has small values near zero,\n # outside Tanzania, so we'll treat these values like zero.\n X['latitude'] = X['latitude'].replace(-2e-08, 0)\n \n # When columns have zeros and shouldn't, they are like null values.\n # So we will replace the zeros with nulls, and impute missing values later.\n # Also create a \"missing indicator\" column, because the fact that\n # values are missing may be a predictive signal.\n cols_with_zeros = ['longitude', 'latitude', 'construction_year', \n 'gps_height', 'population']\n for col in cols_with_zeros:\n X[col] = X[col].replace(0, np.nan)\n X[col+'_MISSING'] = X[col].isnull()\n \n # Drop duplicate columns\n duplicates = ['quantity_group', 'payment_type']\n X = X.drop(columns=duplicates)\n \n # Drop recorded_by (never varies) and id (always varies, random)\n unusable_variance = ['recorded_by', 'id']\n X = X.drop(columns=unusable_variance)\n \n # Convert date_recorded to datetime\n X['date_recorded'] = pd.to_datetime(X['date_recorded'], infer_datetime_format=True)\n \n # Extract components from date_recorded, then drop the original column\n X['year_recorded'] = X['date_recorded'].dt.year\n X['month_recorded'] = X['date_recorded'].dt.month\n X['day_recorded'] = X['date_recorded'].dt.day\n X = X.drop(columns='date_recorded')\n \n # Engineer feature: how many years from construction_year to date_recorded\n X['years'] = X['year_recorded'] - X['construction_year']\n X['years_MISSING'] = X['years'].isnull()\n \n # return the wrangled dataframe\n return X\n\ntrain = wrangle(train)\nval = wrangle(val)\ntest = wrangle(test)",
"_____no_output_____"
],
[
"# Arrange data into X features matrix and y target vector\ntarget = 'status_group'\nX_train = train.drop(columns=target)\ny_train = train[target]\nX_val = val.drop(columns=target)\ny_val = val[target]\nX_test = test",
"_____no_output_____"
],
[
"import category_encoders as ce\nfrom sklearn.impute import SimpleImputer\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.pipeline import make_pipeline\n\npipeline = make_pipeline(\n ce.OrdinalEncoder(), \n SimpleImputer(strategy='median'), \n RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1)\n)\n\n# Fit on train, score on val\npipeline.fit(X_train, y_train)\nprint('Validation Accuracy', pipeline.score(X_val, y_val))",
"/usr/local/lib/python3.6/dist-packages/statsmodels/tools/_testing.py:19: FutureWarning: pandas.util.testing is deprecated. Use the functions in the public API at pandas.testing instead.\n import pandas.util.testing as tm\n"
]
],
[
[
"# Get permutation importances for model interpretation and feature selection",
"_____no_output_____"
],
[
"## Overview",
"_____no_output_____"
],
[
"Default Feature Importances are fast, but Permutation Importances may be more accurate.\n\nThese links go deeper with explanations and examples:\n\n- Permutation Importances\n - [Kaggle / Dan Becker: Machine Learning Explainability](https://www.kaggle.com/dansbecker/permutation-importance)\n - [Christoph Molnar: Interpretable Machine Learning](https://christophm.github.io/interpretable-ml-book/feature-importance.html)\n- (Default) Feature Importances\n - [Ando Saabas: Selecting good features, Part 3, Random Forests](https://blog.datadive.net/selecting-good-features-part-iii-random-forests/)\n - [Terence Parr, et al: Beware Default Random Forest Importances](https://explained.ai/rf-importance/index.html)",
"_____no_output_____"
],
[
"There are three types of feature importances:",
"_____no_output_____"
],
[
"### 1. (Default) Feature Importances\n\nFastest, good for first estimates, but be aware:\n\n\n\n>**When the dataset has two (or more) correlated features, then from the point of view of the model, any of these correlated features can be used as the predictor, with no concrete preference of one over the others.** But once one of them is used, the importance of others is significantly reduced since effectively the impurity they can remove is already removed by the first feature. As a consequence, they will have a lower reported importance. This is not an issue when we want to use feature selection to reduce overfitting, since it makes sense to remove features that are mostly duplicated by other features. But when interpreting the data, it can lead to the incorrect conclusion that one of the variables is a strong predictor while the others in the same group are unimportant, while actually they are very close in terms of their relationship with the response variable. โ [Selecting good features โ Part III: random forests](https://blog.datadive.net/selecting-good-features-part-iii-random-forests/) \n\n\n \n > **The scikit-learn Random Forest feature importance ... tends to inflate the importance of continuous or high-cardinality categorical variables.** ... Breiman and Cutler, the inventors of Random Forests, indicate that this method of โadding up the gini decreases for each individual variable over all trees in the forest gives a **fast** variable importance that is often very consistent with the permutation importance measure.โ โ [Beware Default Random Forest Importances](https://explained.ai/rf-importance/index.html)\n\n \n",
"_____no_output_____"
]
],
[
[
"# Get feature importances\nrf = pipeline.named_steps['randomforestclassifier']\nimportances = pd.Series(rf.feature_importances_, X_train.columns)\n\n# Plot feature importances\n%matplotlib inline\nimport matplotlib.pyplot as plt\n\nn = 20\nplt.figure(figsize=(10,n/2))\nplt.title(f'Top {n} features')\nimportances.sort_values()[-n:].plot.barh(color='grey');",
"_____no_output_____"
]
],
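[
[
"# Illustrative aside (not from the lesson): impurity-based importances split credit across correlated features.\n# The synthetic dataset below is a hypothetical example; n_redundant=1 makes one column a linear combination of the informative ones.\nfrom sklearn.datasets import make_classification\nX_demo, y_demo = make_classification(n_samples=1000, n_features=3, n_informative=2, n_redundant=1, random_state=0)\nrf_demo = RandomForestClassifier(n_estimators=100, random_state=0, n_jobs=-1).fit(X_demo, y_demo)\nprint(rf_demo.feature_importances_)  # the redundant column still absorbs a share of the importance",
"_____no_output_____"
]
],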
[
[
"### 2. Drop-Column Importance\n\nThe best in theory, but too slow in practice",
"_____no_output_____"
]
],
[
[
"column = 'wpt_name'\n\n# Fit without column\npipeline = make_pipeline(\n ce.OrdinalEncoder(), \n SimpleImputer(strategy='median'), \n RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1)\n)\npipeline.fit(X_train.drop(columns=column), y_train)\nscore_without = pipeline.score(X_val.drop(columns=column), y_val)\nprint(f'Validation Accuracy without {column}: {score_without}')\n\n# Fit with column\npipeline = make_pipeline(\n ce.OrdinalEncoder(), \n SimpleImputer(strategy='median'), \n RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1)\n)\npipeline.fit(X_train, y_train)\nscore_with = pipeline.score(X_val, y_val)\nprint(f'Validation Accuracy with {column}: {score_with}')\n\n# Compare the error with & without column\nprint(f'Drop-Column Importance for {column}: {score_with - score_without}')",
"Validation Accuracy without wpt_name: 0.8087542087542088\nValidation Accuracy with wpt_name: 0.8135521885521886\nDrop-Column Importance for wpt_name: 0.004797979797979801\n"
]
],
[
[
"### 3. Permutation Importance\n\nPermutation Importance is a good compromise between Feature Importance based on impurity reduction (which is the fastest) and Drop Column Importance (which is the \"best.\")\n\n[The ELI5 library documentation explains,](https://eli5.readthedocs.io/en/latest/blackbox/permutation_importance.html)\n\n> Importance can be measured by looking at how much the score (accuracy, F1, R^2, etc. - any score weโre interested in) decreases when a feature is not available.\n>\n> To do that one can remove feature from the dataset, re-train the estimator and check the score. But it requires re-training an estimator for each feature, which can be computationally intensive. ...\n>\n>To avoid re-training the estimator we can remove a feature only from the test part of the dataset, and compute score without using this feature. It doesnโt work as-is, because estimators expect feature to be present. So instead of removing a feature we can replace it with random noise - feature column is still there, but it no longer contains useful information. This method works if noise is drawn from the same distribution as original feature values (as otherwise estimator may fail). The simplest way to get such noise is to shuffle values for a feature, i.e. use other examplesโ feature values - this is how permutation importance is computed.\n>\n>The method is most suitable for computing feature importances when a number of columns (features) is not huge; it can be resource-intensive otherwise.",
"_____no_output_____"
],
[
"### Do-It-Yourself way, for intuition",
"_____no_output_____"
]
],
[
[
"#lets see how permutation works first \r\nnevi_array = [1,2,3,4,5]\r\nnevi_permuted = np.random.permutation(nevi_array)\r\nnevi_permuted",
"_____no_output_____"
],
[
"#BEFORE : sequence of the feature to be permuted \r\nfeature = 'quantity'\r\nX_val[feature].head()",
"_____no_output_____"
],
[
"#BEFORE: distribution \r\nX_val[feature].value_counts()",
"_____no_output_____"
],
[
"#PERMUTE\r\n\r\nX_val_permuted = X_val.copy()\r\nX_val_permuted[feature] =np.random.permutation(X_val[feature])",
"_____no_output_____"
],
[
"#AFTER : sequence of the feature to be permuted \r\nfeature = 'quantity'\r\nX_val_permuted[feature].head()",
"_____no_output_____"
],
[
"#AFTER: distribution \r\nX_val_permuted[feature].value_counts()",
"_____no_output_____"
],
[
"#get the permutation importance \r\nX_val_permuted[feature] =np.random.permutation(X_val[feature])\r\n\r\nscore_permuted = pipeline.score(X_val_permuted, y_val)\r\n\r\nprint(f'Validation Accuracy with {feature}: {score_with}')\r\nprint(f'Validation Accuracy with {feature} permuted: {score_permuted}')\r\nprint(f'Permutation Importance: {score_with - score_permuted}')",
"Validation Accuracy with quantity: 0.8135521885521886\nValidation Accuracy with quantity permuted: 0.7148148148148148\nPermutation Importance: 0.09873737373737379\n"
],
[
"feature = 'wpt_name'\r\nX_val_permuted=X_val.copy()\r\nX_val_permuted[feature] = np.random.permutation(X_val[feature])\r\nscore_permuted = pipeline.score(X_val_permuted, y_val)\r\n\r\nprint(f'Validation Accuracy with {feature}: {score_with}')\r\nprint(f'Validation Accuracy with {feature} permuted: {score_permuted}')\r\nprint(f'Permutation Importance: {score_with - score_permuted}')",
"Validation Accuracy with wpt_name: 0.8135521885521886\nValidation Accuracy with wpt_name permuted: 0.811952861952862\nPermutation Importance: 0.0015993265993266004\n"
],
[
"X_val[feature]",
"_____no_output_____"
]
],
[
[
"### With eli5 library\n\nFor more documentation on using this library, see:\n- [eli5.sklearn.PermutationImportance](https://eli5.readthedocs.io/en/latest/autodocs/sklearn.html#eli5.sklearn.permutation_importance.PermutationImportance)\n- [eli5.show_weights](https://eli5.readthedocs.io/en/latest/autodocs/eli5.html#eli5.show_weights)\n- [scikit-learn user guide, `scoring` parameter](https://scikit-learn.org/stable/modules/model_evaluation.html#the-scoring-parameter-defining-model-evaluation-rules)\n\neli5 doesn't work with pipelines.",
"_____no_output_____"
]
],
[
[
"# Ignore warnings\n \ntransformers = make_pipeline(\n ce.OrdinalEncoder(), \n SimpleImputer(strategy='median')\n )\n\nX_train_transformed = transformers.fit_transform(X_train)\nX_val_transformed = transformers.transform(X_val)\n\nmodel = RandomForestClassifier(n_estimators=50, random_state=42, n_jobs=-1)\nmodel.fit(X_train_transformed, y_train)",
"_____no_output_____"
],
[
"import eli5\r\nfrom eli5.sklearn import PermutationImportance\r\n\r\npermuter = PermutationImportance(\r\n model, \r\n scoring='accuracy',\r\n n_iter=5, \r\n random_state=42\r\n)\r\n\r\npermuter.fit(X_val_transformed,y_val)\r\n",
"_____no_output_____"
],
[
"feature_names = X_val.columns.to_list()\r\npd.Series(permuter.feature_importances_, feature_names).sort_values(ascending=False)",
"_____no_output_____"
],
[
"eli5.show_weights(\r\n permuter, \r\n top=None, \r\n feature_names=feature_names\r\n)",
"_____no_output_____"
]
],
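[
[
"# Aside (not in the lesson): scikit-learn's own permutation_importance accepts a fitted pipeline directly,\n# so the manual transform step above can be skipped. Assumes scikit-learn >= 0.22.\nfrom sklearn.inspection import permutation_importance\nresult = permutation_importance(pipeline, X_val, y_val, scoring='accuracy', n_repeats=5, random_state=42)\nprint(pd.Series(result.importances_mean, X_val.columns).sort_values(ascending=False).head())",
"_____no_output_____"
]
],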
[
[
"### We can use importances for feature selection\n\nFor example, we can remove features with zero importance. The model trains faster and the score does not decrease.",
"_____no_output_____"
]
],
[
[
"print('Shape before removing feature ', X_train.shape)",
"Shape before removing feature (47520, 45)\n"
],
[
"#remove features with feature importance <0\r\nminimum_importance = 0\r\nmask=permuter.feature_importances_ > minimum_importance \r\nfeatures = X_train.columns[mask]\r\nX_train=X_train[features]",
"_____no_output_____"
],
[
"print('Shape AFTER removing feature ', X_train.shape)",
"Shape AFTER removing feature (47520, 24)\n"
],
[
"X_val=X_val[features]\r\n\r\npipeline = make_pipeline(\r\n ce.OrdinalEncoder(), \r\n SimpleImputer(strategy='mean'), \r\n RandomForestClassifier(n_estimators=50, random_state=42, n_jobs=-1)\r\n)\r\n\r\n#fit on train, score on val \r\npipeline.fit(X_train, y_train)\r\nprint('Validation accuracy', pipeline.score(X_val, y_val))",
"Validation accuracy 0.8066498316498316\n"
]
],
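[
[
"# Practical follow-up (an aside, with the assumption that test-set predictions come next):\n# the same selected-feature mask must be applied to every split, so filter the test set too.\nX_test = X_test[features]\nprint('Test shape after removing features', X_test.shape)",
"_____no_output_____"
]
],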
[
[
"# Use xgboost for gradient boosting",
"_____no_output_____"
],
[
"## Overview",
"_____no_output_____"
],
[
"In the Random Forest lesson, you learned this advice:\n\n#### Try Tree Ensembles when you do machine learning with labeled, tabular data\n- \"Tree Ensembles\" means Random Forest or **Gradient Boosting** models. \n- [Tree Ensembles often have the best predictive accuracy](https://arxiv.org/abs/1708.05070) with labeled, tabular data.\n- Why? Because trees can fit non-linear, non-[monotonic](https://en.wikipedia.org/wiki/Monotonic_function) relationships, and [interactions](https://christophm.github.io/interpretable-ml-book/interaction.html) between features.\n- A single decision tree, grown to unlimited depth, will [overfit](http://www.r2d3.us/visual-intro-to-machine-learning-part-1/). We solve this problem by ensembling trees, with bagging (Random Forest) or **[boosting](https://www.youtube.com/watch?v=GM3CDQfQ4sw)** (Gradient Boosting).\n- Random Forest's advantage: may be less sensitive to hyperparameters. **Gradient Boosting's advantage:** may get better predictive accuracy.",
"_____no_output_____"
],
[
"Like Random Forest, Gradient Boosting uses ensembles of trees. But the details of the ensembling technique are different:\n\n### Understand the difference between boosting & bagging\n\nBoosting (used by Gradient Boosting) is different than Bagging (used by Random Forests). \n\nHere's an excerpt from [_An Introduction to Statistical Learning_](http://www-bcf.usc.edu/~gareth/ISL/ISLR%20Seventh%20Printing.pdf) Chapter 8.2.3, Boosting:\n\n>Recall that bagging involves creating multiple copies of the original training data set using the bootstrap, fitting a separate decision tree to each copy, and then combining all of the trees in order to create a single predictive model.\n>\n>**Boosting works in a similar way, except that the trees are grown _sequentially_: each tree is grown using information from previously grown trees.**\n>\n>Unlike fitting a single large decision tree to the data, which amounts to _fitting the data hard_ and potentially overfitting, the boosting approach instead _learns slowly._ Given the current model, we fit a decision tree to the residuals from the model.\n>\n>We then add this new decision tree into the fitted function in order to update the residuals. Each of these trees can be rather small, with just a few terminal nodes. **By fitting small trees to the residuals, we slowly improve fห in areas where it does not perform well.**\n>\n>Note that in boosting, unlike in bagging, the construction of each tree depends strongly on the trees that have already been grown.\n\nThis high-level overview is all you need to know for now. If you want to go deeper, we recommend you watch the StatQuest videos on gradient boosting!",
"_____no_output_____"
],
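[
"# A minimal from-scratch sketch (not the lesson's code) of the boosting idea quoted above:\n# repeatedly fit a small tree to the current residuals, then add it in with a small learning rate.\n# The synthetic data here is a hypothetical example; scikit-learn's DecisionTreeRegressor is assumed available.\nfrom sklearn.tree import DecisionTreeRegressor\n\nrng = np.random.RandomState(0)\nX_toy = np.sort(rng.uniform(0, 10, size=(200, 1)), axis=0)\ny_toy = np.sin(X_toy).ravel() + rng.normal(0, 0.2, 200)\n\nlearning_rate = 0.1\nprediction = np.zeros_like(y_toy)\nfor _ in range(100):\n    residuals = y_toy - prediction                      # what the ensemble still gets wrong\n    stump = DecisionTreeRegressor(max_depth=2).fit(X_toy, residuals)\n    prediction += learning_rate * stump.predict(X_toy)  # learn slowly\nprint('Toy boosting MSE:', np.mean((y_toy - prediction) ** 2))",
"_____no_output_____"
],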
[
"Let's write some code. We have lots of options for which libraries to use:\n\n#### Python libraries for Gradient Boosting\n- [scikit-learn Gradient Tree Boosting](https://scikit-learn.org/stable/modules/ensemble.html#gradient-boosting) โ slower than other libraries, but [the new version may be better](https://twitter.com/amuellerml/status/1129443826945396737)\n - Anaconda: already installed\n - Google Colab: already installed\n- [xgboost](https://xgboost.readthedocs.io/en/latest/) โย can accept missing values and enforce [monotonic constraints](https://xiaoxiaowang87.github.io/monotonicity_constraint/)\n - Anaconda, Mac/Linux: `conda install -c conda-forge xgboost`\n - Windows: `conda install -c anaconda py-xgboost`\n - Google Colab: already installed\n- [LightGBM](https://lightgbm.readthedocs.io/en/latest/) โย can accept missing values and enforce [monotonic constraints](https://blog.datadive.net/monotonicity-constraints-in-machine-learning/)\n - Anaconda: `conda install -c conda-forge lightgbm`\n - Google Colab: already installed\n- [CatBoost](https://catboost.ai/) โย can accept missing values and use [categorical features](https://catboost.ai/docs/concepts/algorithm-main-stages_cat-to-numberic.html) without preprocessing\n - Anaconda: `conda install -c conda-forge catboost`\n - Google Colab: `pip install catboost`",
"_____no_output_____"
],
[
"In this lesson, you'll use a new library, xgboost โย But it has an API that's almost the same as scikit-learn, so it won't be a hard adjustment!\n\n#### [XGBoost Python API Reference: Scikit-Learn API](https://xgboost.readthedocs.io/en/latest/python/python_api.html#module-xgboost.sklearn)",
"_____no_output_____"
]
],
[
[
"from xgboost import XGBClassifier\r\npipeline = make_pipeline(\r\n ce.OrdinalEncoder(), \r\n XGBClassifier(n_estimators=100, random_state=42, n_jobs=-1)\r\n)\r\n\r\npipeline.fit(X_train, y_train)\r\n",
"_____no_output_____"
],
[
"from sklearn.metrics import accuracy_score\r\ny_pred=pipeline.predict(X_val)\r\nprint('Validation score', accuracy_score(y_val, y_pred))",
"Validation score 0.7453703703703703\n"
]
],
[
[
"#### [Avoid Overfitting By Early Stopping With XGBoost In Python](https://machinelearningmastery.com/avoid-overfitting-by-early-stopping-with-xgboost-in-python/)\n\nWhy is early stopping better than a For loop, or GridSearchCV, to optimize `n_estimators`?\n\nWith early stopping, if `n_iterations` is our number of iterations, then we fit `n_iterations` decision trees.\n\nWith a for loop, or GridSearchCV, we'd fit `sum(range(1,n_rounds+1))` trees.\n\nBut it doesn't work well with pipelines. You may need to re-run multiple times with different values of other parameters such as `max_depth` and `learning_rate`.\n\n#### XGBoost parameters\n- [Notes on parameter tuning](https://xgboost.readthedocs.io/en/latest/tutorials/param_tuning.html)\n- [Parameters documentation](https://xgboost.readthedocs.io/en/latest/parameter.html)\n",
"_____no_output_____"
]
],
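[
[
"# Quick arithmetic check of the claim above (illustrative): early stopping fits at most n trees in one pass,\n# while a for loop or GridSearchCV over n_estimators refits the whole model for every candidate value.\nn_rounds = 1000\nprint('Trees fit with early stopping:', n_rounds)\nprint('Trees fit with a for loop over n_estimators:', sum(range(1, n_rounds + 1)))",
"_____no_output_____"
]
],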
[
[
"encoder = ce.OrdinalEncoder()\r\nX_train_encoded = encoder.fit_transform(X_train)\r\nX_val_encoded = encoder.transform(X_val)\r\n\r\nmodel = XGBClassifier(\r\n n_estimators=1000, # <= 1000 trees, depend on early stopping\r\n max_depth=7, # try deeper trees because of high cardinality categoricals\r\n learning_rate=0.5, # try higher learning rate\r\n n_jobs=-1\r\n)\r\n\r\neval_set = [(X_train_encoded, y_train), \r\n (X_val_encoded, y_val)]\r\n\r\nmodel.fit(X_train_encoded, y_train, \r\n eval_set=eval_set, \r\n eval_metric='merror', \r\n early_stopping_rounds=50) # Stop if the score hasn't improved in 50 rounds",
"[0]\tvalidation_0-merror:0.254167\tvalidation_1-merror:0.264394\nMultiple eval metrics have been passed: 'validation_1-merror' will be used for early stopping.\n\nWill train until validation_1-merror hasn't improved in 50 rounds.\n[1]\tvalidation_0-merror:0.241898\tvalidation_1-merror:0.252946\n[2]\tvalidation_0-merror:0.234891\tvalidation_1-merror:0.243687\n[3]\tvalidation_0-merror:0.229082\tvalidation_1-merror:0.237458\n[4]\tvalidation_0-merror:0.220013\tvalidation_1-merror:0.230892\n[5]\tvalidation_0-merror:0.213994\tvalidation_1-merror:0.227273\n[6]\tvalidation_0-merror:0.208481\tvalidation_1-merror:0.224579\n[7]\tvalidation_0-merror:0.204146\tvalidation_1-merror:0.221212\n[8]\tvalidation_0-merror:0.200989\tvalidation_1-merror:0.218687\n[9]\tvalidation_0-merror:0.198359\tvalidation_1-merror:0.218771\n[10]\tvalidation_0-merror:0.195602\tvalidation_1-merror:0.217088\n[11]\tvalidation_0-merror:0.192782\tvalidation_1-merror:0.215404\n[12]\tvalidation_0-merror:0.188952\tvalidation_1-merror:0.215572\n[13]\tvalidation_0-merror:0.185227\tvalidation_1-merror:0.213636\n[14]\tvalidation_0-merror:0.182218\tvalidation_1-merror:0.213721\n[15]\tvalidation_0-merror:0.177273\tvalidation_1-merror:0.210017\n[16]\tvalidation_0-merror:0.175947\tvalidation_1-merror:0.210185\n[17]\tvalidation_0-merror:0.173695\tvalidation_1-merror:0.209512\n[18]\tvalidation_0-merror:0.172264\tvalidation_1-merror:0.209764\n[19]\tvalidation_0-merror:0.169802\tvalidation_1-merror:0.207912\n[20]\tvalidation_0-merror:0.167487\tvalidation_1-merror:0.207744\n[21]\tvalidation_0-merror:0.165488\tvalidation_1-merror:0.206902\n[22]\tvalidation_0-merror:0.163721\tvalidation_1-merror:0.207492\n[23]\tvalidation_0-merror:0.162584\tvalidation_1-merror:0.208081\n[24]\tvalidation_0-merror:0.161322\tvalidation_1-merror:0.209091\n[25]\tvalidation_0-merror:0.159491\tvalidation_1-merror:0.207744\n[26]\tvalidation_0-merror:0.157218\tvalidation_1-merror:0.205892\n[27]\tvalidation_0-merror:0.155787\tvalidation_1-merror:0.205556\n[28]\tvalidation_0-merror:0.154714\tvalidation_1-merror:0.205808\n[29]\tvalidation_0-merror:0.153725\tvalidation_1-merror:0.205219\n[30]\tvalidation_0-merror:0.152399\tvalidation_1-merror:0.205556\n[31]\tvalidation_0-merror:0.150421\tvalidation_1-merror:0.204461\n[32]\tvalidation_0-merror:0.147938\tvalidation_1-merror:0.204461\n[33]\tvalidation_0-merror:0.14596\tvalidation_1-merror:0.203872\n[34]\tvalidation_0-merror:0.14476\tvalidation_1-merror:0.202862\n[35]\tvalidation_0-merror:0.14314\tvalidation_1-merror:0.202694\n[36]\tvalidation_0-merror:0.142361\tvalidation_1-merror:0.202525\n[37]\tvalidation_0-merror:0.140657\tvalidation_1-merror:0.201768\n[38]\tvalidation_0-merror:0.140488\tvalidation_1-merror:0.20202\n[39]\tvalidation_0-merror:0.139268\tvalidation_1-merror:0.201515\n[40]\tvalidation_0-merror:0.137668\tvalidation_1-merror:0.200673\n[41]\tvalidation_0-merror:0.136532\tvalidation_1-merror:0.201178\n[42]\tvalidation_0-merror:0.135206\tvalidation_1-merror:0.201515\n[43]\tvalidation_0-merror:0.134133\tvalidation_1-merror:0.201852\n[44]\tvalidation_0-merror:0.132155\tvalidation_1-merror:0.200842\n[45]\tvalidation_0-merror:0.13104\tvalidation_1-merror:0.19899\n[46]\tvalidation_0-merror:0.13064\tvalidation_1-merror:0.198401\n[47]\tvalidation_0-merror:0.129819\tvalidation_1-merror:0.197643\n[48]\tvalidation_0-merror:0.12883\tvalidation_1-merror:0.197559\n[49]\tvalidation_0-merror:0.127546\tvalidation_1-merror:0.197811\n[50]\tvalidation_0-merror:0.125926\tvalidation_1-merror:0.197054\n[51]\tvalidation_0-merror:0.124769\tvali
dation_1-merror:0.198906\n[52]\tvalidation_0-merror:0.123527\tvalidation_1-merror:0.198822\n[53]\tvalidation_0-merror:0.122748\tvalidation_1-merror:0.198737\n[54]\tvalidation_0-merror:0.121675\tvalidation_1-merror:0.197727\n[55]\tvalidation_0-merror:0.119823\tvalidation_1-merror:0.197222\n[56]\tvalidation_0-merror:0.119024\tvalidation_1-merror:0.19697\n[57]\tvalidation_0-merror:0.117887\tvalidation_1-merror:0.197054\n[58]\tvalidation_0-merror:0.117424\tvalidation_1-merror:0.19697\n[59]\tvalidation_0-merror:0.116814\tvalidation_1-merror:0.197727\n[60]\tvalidation_0-merror:0.115762\tvalidation_1-merror:0.197811\n[61]\tvalidation_0-merror:0.114836\tvalidation_1-merror:0.198064\n[62]\tvalidation_0-merror:0.113973\tvalidation_1-merror:0.198737\n[63]\tvalidation_0-merror:0.113215\tvalidation_1-merror:0.199158\n[64]\tvalidation_0-merror:0.112121\tvalidation_1-merror:0.198232\n[65]\tvalidation_0-merror:0.111301\tvalidation_1-merror:0.198569\n[66]\tvalidation_0-merror:0.110438\tvalidation_1-merror:0.198401\n[67]\tvalidation_0-merror:0.108607\tvalidation_1-merror:0.197896\n[68]\tvalidation_0-merror:0.107976\tvalidation_1-merror:0.198653\n[69]\tvalidation_0-merror:0.107113\tvalidation_1-merror:0.198232\n[70]\tvalidation_0-merror:0.105661\tvalidation_1-merror:0.197811\n[71]\tvalidation_0-merror:0.104314\tvalidation_1-merror:0.197727\n[72]\tvalidation_0-merror:0.103367\tvalidation_1-merror:0.198232\n[73]\tvalidation_0-merror:0.102925\tvalidation_1-merror:0.198232\n[74]\tvalidation_0-merror:0.101684\tvalidation_1-merror:0.198401\n[75]\tvalidation_0-merror:0.10061\tvalidation_1-merror:0.198064\n[76]\tvalidation_0-merror:0.099453\tvalidation_1-merror:0.198653\n[77]\tvalidation_0-merror:0.098653\tvalidation_1-merror:0.198401\n[78]\tvalidation_0-merror:0.098169\tvalidation_1-merror:0.198148\n[79]\tvalidation_0-merror:0.097138\tvalidation_1-merror:0.199242\n[80]\tvalidation_0-merror:0.096086\tvalidation_1-merror:0.198569\n[81]\tvalidation_0-merror:0.095686\tvalidation_1-merror:0.198401\n[82]\tvalidation_0-merror:0.094592\tvalidation_1-merror:0.198401\n[83]\tvalidation_0-merror:0.09354\tvalidation_1-merror:0.196801\n[84]\tvalidation_0-merror:0.093013\tvalidation_1-merror:0.196212\n[85]\tvalidation_0-merror:0.092066\tvalidation_1-merror:0.196633\n[86]\tvalidation_0-merror:0.09154\tvalidation_1-merror:0.197138\n[87]\tvalidation_0-merror:0.090951\tvalidation_1-merror:0.197306\n[88]\tvalidation_0-merror:0.090678\tvalidation_1-merror:0.197475\n[89]\tvalidation_0-merror:0.089289\tvalidation_1-merror:0.197643\n[90]\tvalidation_0-merror:0.088405\tvalidation_1-merror:0.19638\n[91]\tvalidation_0-merror:0.087584\tvalidation_1-merror:0.196465\n[92]\tvalidation_0-merror:0.086848\tvalidation_1-merror:0.195455\n[93]\tvalidation_0-merror:0.08609\tvalidation_1-merror:0.195791\n[94]\tvalidation_0-merror:0.085017\tvalidation_1-merror:0.19537\n[95]\tvalidation_0-merror:0.084722\tvalidation_1-merror:0.195455\n[96]\tvalidation_0-merror:0.08367\tvalidation_1-merror:0.196465\n[97]\tvalidation_0-merror:0.082828\tvalidation_1-merror:0.195707\n[98]\tvalidation_0-merror:0.082344\tvalidation_1-merror:0.197222\n[99]\tvalidation_0-merror:0.081692\tvalidation_1-merror:0.196886\n[100]\tvalidation_0-merror:0.08104\tvalidation_1-merror:0.197306\n[101]\tvalidation_0-merror:0.080661\tvalidation_1-merror:0.197306\n[102]\tvalidation_0-merror:0.080156\tvalidation_1-merror:0.197811\n[103]\tvalidation_0-merror:0.07944\tvalidation_1-merror:0.197727\n[104]\tvalidation_0-merror:0.078514\tvalidation_1-merror:0.197811\n[105]\tvalidation_0-merror:0.077378\t
validation_1-merror:0.197559\n[106]\tvalidation_0-merror:0.076915\tvalidation_1-merror:0.197222\n[107]\tvalidation_0-merror:0.075947\tvalidation_1-merror:0.197643\n[108]\tvalidation_0-merror:0.075568\tvalidation_1-merror:0.197896\n[109]\tvalidation_0-merror:0.075084\tvalidation_1-merror:0.197138\n[110]\tvalidation_0-merror:0.074832\tvalidation_1-merror:0.197896\n[111]\tvalidation_0-merror:0.073948\tvalidation_1-merror:0.19798\n[112]\tvalidation_0-merror:0.073316\tvalidation_1-merror:0.198148\n[113]\tvalidation_0-merror:0.072201\tvalidation_1-merror:0.198906\n[114]\tvalidation_0-merror:0.071738\tvalidation_1-merror:0.199495\n[115]\tvalidation_0-merror:0.070686\tvalidation_1-merror:0.19899\n[116]\tvalidation_0-merror:0.070497\tvalidation_1-merror:0.199327\n[117]\tvalidation_0-merror:0.070097\tvalidation_1-merror:0.19899\n[118]\tvalidation_0-merror:0.068603\tvalidation_1-merror:0.198148\n[119]\tvalidation_0-merror:0.068413\tvalidation_1-merror:0.197727\n[120]\tvalidation_0-merror:0.067908\tvalidation_1-merror:0.197559\n[121]\tvalidation_0-merror:0.067529\tvalidation_1-merror:0.197643\n[122]\tvalidation_0-merror:0.067066\tvalidation_1-merror:0.198064\n[123]\tvalidation_0-merror:0.066267\tvalidation_1-merror:0.198401\n[124]\tvalidation_0-merror:0.065951\tvalidation_1-merror:0.198232\n[125]\tvalidation_0-merror:0.065699\tvalidation_1-merror:0.198569\n[126]\tvalidation_0-merror:0.065509\tvalidation_1-merror:0.198232\n[127]\tvalidation_0-merror:0.065215\tvalidation_1-merror:0.197727\n[128]\tvalidation_0-merror:0.064857\tvalidation_1-merror:0.197559\n[129]\tvalidation_0-merror:0.064373\tvalidation_1-merror:0.197306\n[130]\tvalidation_0-merror:0.064036\tvalidation_1-merror:0.197306\n[131]\tvalidation_0-merror:0.063279\tvalidation_1-merror:0.197054\n[132]\tvalidation_0-merror:0.062816\tvalidation_1-merror:0.197306\n[133]\tvalidation_0-merror:0.062226\tvalidation_1-merror:0.197391\n[134]\tvalidation_0-merror:0.061785\tvalidation_1-merror:0.197054\n[135]\tvalidation_0-merror:0.061279\tvalidation_1-merror:0.196886\n[136]\tvalidation_0-merror:0.060795\tvalidation_1-merror:0.197054\n[137]\tvalidation_0-merror:0.059996\tvalidation_1-merror:0.197559\n[138]\tvalidation_0-merror:0.05947\tvalidation_1-merror:0.197391\n[139]\tvalidation_0-merror:0.059217\tvalidation_1-merror:0.197475\n[140]\tvalidation_0-merror:0.059112\tvalidation_1-merror:0.197475\n[141]\tvalidation_0-merror:0.058628\tvalidation_1-merror:0.197811\n[142]\tvalidation_0-merror:0.057912\tvalidation_1-merror:0.197306\n[143]\tvalidation_0-merror:0.057723\tvalidation_1-merror:0.197138\n[144]\tvalidation_0-merror:0.057534\tvalidation_1-merror:0.197391\nStopping. Best iteration:\n[94]\tvalidation_0-merror:0.085017\tvalidation_1-merror:0.19537\n\n"
],
[
"results = model.evals_result()\r\ntrain_error = results['validation_0']['merror']\r\nval_error = results['validation_1']['merror']\r\nepoch = list(range(1, len(train_error)+1))\r\nplt.plot(epoch, train_error, label='Train')\r\nplt.plot(epoch, val_error, label='Validation')\r\nplt.ylabel('Classification Error')\r\nplt.xlabel('Model Complexity (n_estimators)')\r\nplt.title('Validation Curve for this XGBoost model')\r\nplt.ylim((0.10, 0.25)) # Zoom in\r\nplt.legend();",
"_____no_output_____"
]
],
[
[
"### Try adjusting these hyperparameters\n\n#### Random Forest\n- class_weight (for imbalanced classes)\n- max_depth (usually high, can try decreasing)\n- n_estimators (too low underfits, too high wastes time)\n- min_samples_leaf (increase if overfitting)\n- max_features (decrease for more diverse trees)\n\n#### Xgboost\n- scale_pos_weight (for imbalanced classes)\n- max_depth (usually low, can try increasing)\n- n_estimators (too low underfits, too high wastes time/overfits) โ Use Early Stopping!\n- learning_rate (too low underfits, too high overfits)\n\nFor more ideas, see [Notes on Parameter Tuning](https://xgboost.readthedocs.io/en/latest/tutorials/param_tuning.html) and [DART booster](https://xgboost.readthedocs.io/en/latest/tutorials/dart.html).",
"_____no_output_____"
],
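[
"A minimal, hedged sketch of how these XGBoost knobs fit together in code -- `X_train`, `y_train`, `X_val`, and `y_val` are placeholders for your own train/validation split, not variables defined in this notebook:\n\n```python\nfrom xgboost import XGBClassifier\n\nmodel = XGBClassifier(\n    n_estimators=1000,    # set high; early stopping finds the real number\n    max_depth=3,          # xgboost trees are usually shallow\n    learning_rate=0.1,    # lower it if the model overfits\n    scale_pos_weight=1,   # raise for imbalanced binary classes\n)\nmodel.fit(X_train, y_train,\n          eval_set=[(X_val, y_val)],\n          early_stopping_rounds=50)\n```",
"_____no_output_____"
],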
[
"## Challenge\n\nYou will use your portfolio project dataset for all assignments this sprint. Complete these tasks for your project, and document your work.\n\n- Continue to clean and explore your data. Make exploratory visualizations.\n- Fit a model. Does it beat your baseline?\n- Try xgboost.\n- Get your model's permutation importances.\n\nYou should try to complete an initial model today, because the rest of the week, we're making model interpretation visualizations.\n\nBut, if you aren't ready to try xgboost and permutation importances with your dataset today, you can practice with another dataset instead. You may choose any dataset you've worked with previously.",
"_____no_output_____"
]
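,
[
"A hedged sketch for the permutation-importances task, using the `eli5` package (`model`, `X_val`, and `y_val` are placeholders for your own fitted estimator and validation split):\n\n```python\nimport eli5\nfrom eli5.sklearn import PermutationImportance\n\npermuter = PermutationImportance(model, scoring='accuracy', n_iter=5)\npermuter.fit(X_val, y_val)\neli5.show_weights(permuter, feature_names=list(X_val.columns))\n```",
"_____no_output_____"
]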
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
]
]
|
d06137aa460ed001913396642f28d2945f230d06 | 823,130 | ipynb | Jupyter Notebook | Irises_ML_Intro/Irises Data Analysis Workflow_06_2019.ipynb | ValRCS/RCS_Data_Analysis_Python_2019_July | 19e2f8310f41b697f9c86d7a085a9ff19390eeac | [
"MIT"
]
| 1 | 2019-07-11T16:25:15.000Z | 2019-07-11T16:25:15.000Z | Irises_ML_Intro/Irises Data Analysis Workflow_06_2019.ipynb | ValRCS/RCS_Data_Analysis_Python_2019_July | 19e2f8310f41b697f9c86d7a085a9ff19390eeac | [
"MIT"
]
| 8 | 2020-01-28T22:54:14.000Z | 2022-02-10T00:17:47.000Z | Irises_ML_Intro/Irises Data Analysis Workflow_06_2019.ipynb | ValRCS/RCS_Data_Analysis_Python_2019_July | 19e2f8310f41b697f9c86d7a085a9ff19390eeac | [
"MIT"
]
| null | null | null | 199.788835 | 142,296 | 0.894876 | [
[
[
"<h1><center>Introductory Data Analysis Workflow</center></h1>\n",
"_____no_output_____"
],
[
"\nhttps://xkcd.com/2054",
"_____no_output_____"
],
[
"# An example machine learning notebook\n\n* Original Notebook by [Randal S. Olson](http://www.randalolson.com/)\n* Supported by [Jason H. Moore](http://www.epistasis.org/)\n* [University of Pennsylvania Institute for Bioinformatics](http://upibi.org/)\n* Adapted for LU Py-Sem 2018 by [Valdis Saulespurens]([email protected])",
"_____no_output_____"
],
[
"**You can also [execute the code in this notebook on Binder](https://mybinder.org/v2/gh/ValRCS/RigaComm_DataAnalysis/master) - no local installation required.**",
"_____no_output_____"
]
],
[
[
"# text 17.04.2019\nimport datetime\nprint(datetime.datetime.now())\nprint('hello')",
"2019-06-13 16:12:23.662194\nhello\n"
]
],
[
[
"## Table of contents\n\n1. [Introduction](#Introduction)\n\n2. [License](#License)\n\n3. [Required libraries](#Required-libraries)\n\n4. [The problem domain](#The-problem-domain)\n\n5. [Step 1: Answering the question](#Step-1:-Answering-the-question)\n\n6. [Step 2: Checking the data](#Step-2:-Checking-the-data)\n\n7. [Step 3: Tidying the data](#Step-3:-Tidying-the-data)\n\n - [Bonus: Testing our data](#Bonus:-Testing-our-data)\n\n8. [Step 4: Exploratory analysis](#Step-4:-Exploratory-analysis)\n\n9. [Step 5: Classification](#Step-5:-Classification)\n\n - [Cross-validation](#Cross-validation)\n\n - [Parameter tuning](#Parameter-tuning)\n\n10. [Step 6: Reproducibility](#Step-6:-Reproducibility)\n\n11. [Conclusions](#Conclusions)\n\n12. [Further reading](#Further-reading)\n\n13. [Acknowledgements](#Acknowledgements)",
"_____no_output_____"
],
[
"## Introduction\n\n[[ go back to the top ]](#Table-of-contents)\n\nIn the time it took you to read this sentence, terabytes of data have been collectively generated across the world โ more data than any of us could ever hope to process, much less make sense of, on the machines we're using to read this notebook.\n\nIn response to this massive influx of data, the field of Data Science has come to the forefront in the past decade. Cobbled together by people from a diverse array of fields โ statistics, physics, computer science, design, and many more โ the field of Data Science represents our collective desire to understand and harness the abundance of data around us to build a better world.\n\nIn this notebook, I'm going to go over a basic Python data analysis pipeline from start to finish to show you what a typical data science workflow looks like.\n\nIn addition to providing code examples, I also hope to imbue in you a sense of good practices so you can be a more effective โ and more collaborative โ data scientist.\n\nI will be following along with the data analysis checklist from [The Elements of Data Analytic Style](https://leanpub.com/datastyle), which I strongly recommend reading as a free and quick guidebook to performing outstanding data analysis.\n\n**This notebook is intended to be a public resource. As such, if you see any glaring inaccuracies or if a critical topic is missing, please feel free to point it out or (preferably) submit a pull request to improve the notebook.**",
"_____no_output_____"
],
[
"## License\n\n[[ go back to the top ]](#Table-of-contents)\n\nPlease see the [repository README file](https://github.com/rhiever/Data-Analysis-and-Machine-Learning-Projects#license) for the licenses and usage terms for the instructional material and code in this notebook. In general, I have licensed this material so that it is as widely usable and shareable as possible.",
"_____no_output_____"
],
[
"## Required libraries\n\n[[ go back to the top ]](#Table-of-contents)\n\nIf you don't have Python on your computer, you can use the [Anaconda Python distribution](http://continuum.io/downloads) to install most of the Python packages you need. Anaconda provides a simple double-click installer for your convenience.\n\nThis notebook uses several Python packages that come standard with the Anaconda Python distribution. The primary libraries that we'll be using are:\n\n* **NumPy**: Provides a fast numerical array structure and helper functions.\n* **pandas**: Provides a DataFrame structure to store data in memory and work with it easily and efficiently.\n* **scikit-learn**: The essential Machine Learning package in Python.\n* **matplotlib**: Basic plotting library in Python; most other Python plotting libraries are built on top of it.\n* **Seaborn**: Advanced statistical plotting library.\n* **watermark**: A Jupyter Notebook extension for printing timestamps, version numbers, and hardware information.\n\n**Note:** I will not be providing support for people trying to run this notebook outside of the Anaconda Python distribution.",
"_____no_output_____"
],
[
"## The problem domain\n\n[[ go back to the top ]](#Table-of-contents)\n\nFor the purposes of this exercise, let's pretend we're working for a startup that just got funded to create a smartphone app that automatically identifies species of flowers from pictures taken on the smartphone. We're working with a moderately-sized team of data scientists and will be building part of the data analysis pipeline for this app.\n\nWe've been tasked by our company's Head of Data Science to create a demo machine learning model that takes four measurements from the flowers (sepal length, sepal width, petal length, and petal width) and identifies the species based on those measurements alone.\n\n<img src=\"img/petal_sepal.jpg\" />\n\nWe've been given a [data set](https://github.com/ValRCS/RCS_Data_Analysis_Python/blob/master/data/iris-data.csv) from our field researchers to develop the demo, which only includes measurements for three types of *Iris* flowers:\n\n### *Iris setosa*\n\n<img src=\"img/iris_setosa.jpg\" />\n\n### *Iris versicolor*\n<img src=\"img/iris_versicolor.jpg\" />\n\n### *Iris virginica*\n<img src=\"img/iris_virginica.jpg\" />\n\nThe four measurements we're using currently come from hand-measurements by the field researchers, but they will be automatically measured by an image processing model in the future.\n\n**Note:** The data set we're working with is the famous [*Iris* data set](https://archive.ics.uci.edu/ml/datasets/Iris) โ included with this notebook โ which I have modified slightly for demonstration purposes.",
"_____no_output_____"
],
[
"## Step 1: Answering the question\n\n[[ go back to the top ]](#Table-of-contents)\n\nThe first step to any data analysis project is to define the question or problem we're looking to solve, and to define a measure (or set of measures) for our success at solving that task. The data analysis checklist has us answer a handful of questions to accomplish that, so let's work through those questions.\n\n>Did you specify the type of data analytic question (e.g. exploration, association causality) before touching the data?\n\nWe're trying to classify the species (i.e., class) of the flower based on four measurements that we're provided: sepal length, sepal width, petal length, and petal width.\n\nPetal - ziedlapiลa, sepal - arฤซ ziedlapiลa\n\n\n\n>Did you define the metric for success before beginning?\n\nLet's do that now. Since we're performing classification, we can use [accuracy](https://en.wikipedia.org/wiki/Accuracy_and_precision) โ the fraction of correctly classified flowers โ to quantify how well our model is performing. Our company's Head of Data has told us that we should achieve at least 90% accuracy.\n\n>Did you understand the context for the question and the scientific or business application?\n\nWe're building part of a data analysis pipeline for a smartphone app that will be able to classify the species of flowers from pictures taken on the smartphone. In the future, this pipeline will be connected to another pipeline that automatically measures from pictures the traits we're using to perform this classification.\n\n>Did you record the experimental design?\n\nOur company's Head of Data has told us that the field researchers are hand-measuring 50 randomly-sampled flowers of each species using a standardized methodology. The field researchers take pictures of each flower they sample from pre-defined angles so the measurements and species can be confirmed by the other field researchers at a later point. At the end of each day, the data is compiled and stored on a private company GitHub repository.\n\n>Did you consider whether the question could be answered with the available data?\n\nThe data set we currently have is only for three types of *Iris* flowers. The model built off of this data set will only work for those *Iris* flowers, so we will need more data to create a general flower classifier.\n\n<hr />\n\nNotice that we've spent a fair amount of time working on the problem without writing a line of code or even looking at the data.\n\n**Thinking about and documenting the problem we're working on is an important step to performing effective data analysis that often goes overlooked.** Don't skip it.",
"_____no_output_____"
],
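[
"Accuracy is simply the fraction of predictions that match the true labels. A toy sketch with made-up labels (not our real data) to make the metric concrete:\n\n```python\nimport numpy as np\n\nactual = np.array(['setosa', 'versicolor', 'virginica', 'setosa'])\npredicted = np.array(['setosa', 'versicolor', 'setosa', 'setosa'])\n\nprint((actual == predicted).mean())  # 0.75 -- below our 90% target\n```",
"_____no_output_____"
],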
[
"## Step 2: Checking the data\n\n[[ go back to the top ]](#Table-of-contents)\n\nThe next step is to look at the data we're working with. Even curated data sets from the government can have errors in them, and it's vital that we spot these errors before investing too much time in our analysis.\n\nGenerally, we're looking to answer the following questions:\n\n* Is there anything wrong with the data?\n* Are there any quirks with the data?\n* Do I need to fix or remove any of the data?\n\nLet's start by reading the data into a pandas DataFrame.",
"_____no_output_____"
]
],
[
[
"import pandas as pd",
"_____no_output_____"
],
[
"\niris_data = pd.read_csv('../data/iris-data.csv')\n",
"_____no_output_____"
],
[
"# Resources for loading data from nonlocal sources\n# Pandas Can generally handle most common formats\n# https://pandas.pydata.org/pandas-docs/stable/io.html\n\n# SQL https://stackoverflow.com/questions/39149243/how-do-i-connect-to-a-sql-server-database-with-python\n# NoSQL MongoDB https://realpython.com/introduction-to-mongodb-and-python/\n# Apache Hadoop: https://dzone.com/articles/how-to-get-hadoop-data-into-a-python-model\n# Apache Spark: https://www.datacamp.com/community/tutorials/apache-spark-python\n# Data Scraping / Crawling libraries : https://elitedatascience.com/python-web-scraping-libraries Big Topic in itself\n\n# Most data resources have some form of Python API / Library ",
"_____no_output_____"
],
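[
"# A hedged sketch of how uniform the pandas IO API is across sources, using\n# an in-memory SQLite database (the table name and query are made up):\nimport sqlite3\n\nconn = sqlite3.connect(':memory:')\npd.DataFrame({'sepal_length_cm': [5.1, 4.9]}).to_sql('iris', conn, index=False)\npd.read_sql('SELECT * FROM iris', conn)",
"_____no_output_____"
],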
[
"iris_data.head()",
"_____no_output_____"
]
],
[
[
"We're in luck! The data seems to be in a usable format.\n\nThe first row in the data file defines the column headers, and the headers are descriptive enough for us to understand what each column represents. The headers even give us the units that the measurements were recorded in, just in case we needed to know at a later point in the project.\n\nEach row following the first row represents an entry for a flower: four measurements and one class, which tells us the species of the flower.\n\n**One of the first things we should look for is missing data.** Thankfully, the field researchers already told us that they put a 'NA' into the spreadsheet when they were missing a measurement.\n\nWe can tell pandas to automatically identify missing values if it knows our missing value marker.",
"_____no_output_____"
]
],
[
[
"iris_data.shape",
"_____no_output_____"
],
[
"iris_data.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 150 entries, 0 to 149\nData columns (total 5 columns):\nsepal_length_cm 150 non-null float64\nsepal_width_cm 150 non-null float64\npetal_length_cm 150 non-null float64\npetal_width_cm 145 non-null float64\nclass 150 non-null object\ndtypes: float64(4), object(1)\nmemory usage: 5.9+ KB\n"
],
[
"iris_data.describe()",
"_____no_output_____"
],
[
"iris_data = pd.read_csv('../data/iris-data.csv', na_values=['NA', 'N/A'])",
"_____no_output_____"
]
],
[
[
"Voilร ! Now pandas knows to treat rows with 'NA' as missing values.",
"_____no_output_____"
],
[
"Next, it's always a good idea to look at the distribution of our data โ especially the outliers.\n\nLet's start by printing out some summary statistics about the data set.",
"_____no_output_____"
]
],
[
[
"iris_data.describe()",
"_____no_output_____"
]
],
[
[
"We can see several useful values from this table. For example, we see that five `petal_width_cm` entries are missing.\n\nIf you ask me, though, tables like this are rarely useful unless we know that our data should fall in a particular range. It's usually better to visualize the data in some way. Visualization makes outliers and errors immediately stand out, whereas they might go unnoticed in a large table of numbers.\n\nSince we know we're going to be plotting in this section, let's set up the notebook so we can plot inside of it.",
"_____no_output_____"
]
],
[
[
"# This line tells the notebook to show plots inside of the notebook\n%matplotlib inline\n\nimport matplotlib.pyplot as plt\nimport seaborn as sb",
"_____no_output_____"
]
],
[
[
"Next, let's create a **scatterplot matrix**. Scatterplot matrices plot the distribution of each column along the diagonal, and then plot a scatterplot matrix for the combination of each variable. They make for an efficient tool to look for errors in our data.\n\nWe can even have the plotting package color each entry by its class to look for trends within the classes.",
"_____no_output_____"
]
],
[
[
"# We have to temporarily drop the rows with 'NA' values\n# because the Seaborn plotting function does not know\n# what to do with them\nsb.pairplot(iris_data.dropna(), hue='class')\n",
"C:\\ProgramData\\Anaconda3\\lib\\site-packages\\numpy\\core\\_methods.py:140: RuntimeWarning: Degrees of freedom <= 0 for slice\n keepdims=keepdims)\nC:\\ProgramData\\Anaconda3\\lib\\site-packages\\numpy\\core\\_methods.py:132: RuntimeWarning: invalid value encountered in double_scalars\n ret = ret.dtype.type(ret / rcount)\n"
]
],
[
[
"From the scatterplot matrix, we can already see some issues with the data set:\n\n1. There are five classes when there should only be three, meaning there were some coding errors.\n\n2. There are some clear outliers in the measurements that may be erroneous: one `sepal_width_cm` entry for `Iris-setosa` falls well outside its normal range, and several `sepal_length_cm` entries for `Iris-versicolor` are near-zero for some reason.\n\n3. We had to drop those rows with missing values.\n\nIn all of these cases, we need to figure out what to do with the erroneous data. Which takes us to the next step...",
"_____no_output_____"
],
[
"## Step 3: Tidying the data\n\n### GIGO principle\n\n[[ go back to the top ]](#Table-of-contents)\n\nNow that we've identified several errors in the data set, we need to fix them before we proceed with the analysis.\n\nLet's walk through the issues one-by-one.\n\n>There are five classes when there should only be three, meaning there were some coding errors.\n\nAfter talking with the field researchers, it sounds like one of them forgot to add `Iris-` before their `Iris-versicolor` entries. The other extraneous class, `Iris-setossa`, was simply a typo that they forgot to fix.\n\nLet's use the DataFrame to fix these errors.",
"_____no_output_____"
]
],
[
[
"iris_data['class'].unique()",
"_____no_output_____"
],
[
"# Copy and Replace\niris_data.loc[iris_data['class'] == 'versicolor', 'class'] = 'Iris-versicolor'\niris_data['class'].unique()\n",
"_____no_output_____"
],
[
"# So we take a row where a specific column('class' here) matches our bad values \n# and change them to good values\n\niris_data.loc[iris_data['class'] == 'Iris-setossa', 'class'] = 'Iris-setosa'\n\niris_data['class'].unique()",
"_____no_output_____"
],
[
"iris_data.tail()",
"_____no_output_____"
],
[
"iris_data[98:103]",
"_____no_output_____"
]
],
[
[
"Much better! Now we only have three class types. Imagine how embarrassing it would've been to create a model that used the wrong classes.\n\n>There are some clear outliers in the measurements that may be erroneous: one `sepal_width_cm` entry for `Iris-setosa` falls well outside its normal range, and several `sepal_length_cm` entries for `Iris-versicolor` are near-zero for some reason.\n\nFixing outliers can be tricky business. It's rarely clear whether the outlier was caused by measurement error, recording the data in improper units, or if the outlier is a real anomaly. For that reason, we should be judicious when working with outliers: if we decide to exclude any data, we need to make sure to document what data we excluded and provide solid reasoning for excluding that data. (i.e., \"This data didn't fit my hypothesis\" will not stand peer review.)\n\nIn the case of the one anomalous entry for `Iris-setosa`, let's say our field researchers know that it's impossible for `Iris-setosa` to have a sepal width below 2.5 cm. Clearly this entry was made in error, and we're better off just scrapping the entry than spending hours finding out what happened.",
"_____no_output_____"
]
],
[
[
"smallpetals = iris_data.loc[(iris_data['sepal_width_cm'] < 2.5) & (iris_data['class'] == 'Iris-setosa')]\nsmallpetals",
"_____no_output_____"
],
[
"iris_data.loc[iris_data['class'] == 'Iris-setosa', 'sepal_width_cm'].hist()",
"_____no_output_____"
],
[
"# This line drops any 'Iris-setosa' rows with a separal width less than 2.5 cm\n# Let's go over this command in class\niris_data = iris_data.loc[(iris_data['class'] != 'Iris-setosa') | (iris_data['sepal_width_cm'] >= 2.5)]\niris_data.loc[iris_data['class'] == 'Iris-setosa', 'sepal_width_cm'].hist()\n",
"_____no_output_____"
]
],
[
[
"Excellent! Now all of our `Iris-setosa` rows have a sepal width greater than 2.5.\n\nThe next data issue to address is the several near-zero sepal lengths for the `Iris-versicolor` rows. Let's take a look at those rows.",
"_____no_output_____"
]
],
[
[
"iris_data.loc[(iris_data['class'] == 'Iris-versicolor') &\n (iris_data['sepal_length_cm'] < 1.0)]",
"_____no_output_____"
]
],
[
[
"How about that? All of these near-zero `sepal_length_cm` entries seem to be off by two orders of magnitude, as if they had been recorded in meters instead of centimeters.\n\nAfter some brief correspondence with the field researchers, we find that one of them forgot to convert those measurements to centimeters. Let's do that for them.",
"_____no_output_____"
]
],
[
[
"iris_data.loc[iris_data['class'] == 'Iris-versicolor', 'sepal_length_cm'].hist()",
"_____no_output_____"
],
[
"iris_data['sepal_length_cm'].hist()",
"_____no_output_____"
],
[
"# Here we fix the wrong units\n\niris_data.loc[(iris_data['class'] == 'Iris-versicolor') &\n (iris_data['sepal_length_cm'] < 1.0),\n 'sepal_length_cm'] *= 100.0\n\niris_data.loc[iris_data['class'] == 'Iris-versicolor', 'sepal_length_cm'].hist()\n;",
"_____no_output_____"
],
[
"iris_data['sepal_length_cm'].hist()",
"_____no_output_____"
]
],
[
[
"Phew! Good thing we fixed those outliers. They could've really thrown our analysis off.\n\n>We had to drop those rows with missing values.\n\nLet's take a look at the rows with missing values:",
"_____no_output_____"
]
],
[
[
"iris_data.loc[(iris_data['sepal_length_cm'].isnull()) |\n (iris_data['sepal_width_cm'].isnull()) |\n (iris_data['petal_length_cm'].isnull()) |\n (iris_data['petal_width_cm'].isnull())]",
"_____no_output_____"
]
],
[
[
"It's not ideal that we had to drop those rows, especially considering they're all `Iris-setosa` entries. Since it seems like the missing data is systematic โ all of the missing values are in the same column for the same *Iris* type โ this error could potentially bias our analysis.\n\nOne way to deal with missing data is **mean imputation**: If we know that the values for a measurement fall in a certain range, we can fill in empty values with the average of that measurement.\n\nLet's see if we can do that here.",
"_____no_output_____"
]
],
[
[
"iris_data.loc[iris_data['class'] == 'Iris-setosa', 'petal_width_cm'].hist()\n",
"_____no_output_____"
]
],
[
[
"Most of the petal widths for `Iris-setosa` fall within the 0.2-0.3 range, so let's fill in these entries with the average measured petal width.",
"_____no_output_____"
]
],
[
[
"iris_data.loc[iris_data['class'] == 'Iris-setosa', 'petal_width_cm'].mean()",
"_____no_output_____"
],
[
"average_petal_width = iris_data.loc[iris_data['class'] == 'Iris-setosa', 'petal_width_cm'].mean()\nprint(average_petal_width)",
"0.24999999999999997\n"
],
[
"\n\niris_data.loc[(iris_data['class'] == 'Iris-setosa') &\n (iris_data['petal_width_cm'].isnull()),\n 'petal_width_cm'] = average_petal_width\n\niris_data.loc[(iris_data['class'] == 'Iris-setosa') &\n (iris_data['petal_width_cm'] == average_petal_width)]",
"_____no_output_____"
],
[
"iris_data.loc[(iris_data['sepal_length_cm'].isnull()) |\n (iris_data['sepal_width_cm'].isnull()) |\n (iris_data['petal_length_cm'].isnull()) |\n (iris_data['petal_width_cm'].isnull())]",
"_____no_output_____"
]
],
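[
[
"# Hedged alternative (requires scikit-learn >= 0.20): SimpleImputer does the\n# same mean fill. Sketch only -- it uses the overall column mean rather than\n# the per-class mean we computed above, so per-class imputation would need a\n# groupby first.\nfrom sklearn.impute import SimpleImputer\n\nnumeric_cols = ['sepal_length_cm', 'sepal_width_cm',\n                'petal_length_cm', 'petal_width_cm']\nimputed = iris_data.copy()\nimputed[numeric_cols] = SimpleImputer(strategy='mean').fit_transform(imputed[numeric_cols])\nimputed[numeric_cols].isnull().sum()",
"_____no_output_____"
]
],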
[
[
"Great! Now we've recovered those rows and no longer have missing data in our data set.\n\n**Note:** If you don't feel comfortable imputing your data, you can drop all rows with missing data with the `dropna()` call:\n\n iris_data.dropna(inplace=True)\n\nAfter all this hard work, we don't want to repeat this process every time we work with the data set. Let's save the tidied data file *as a separate file* and work directly with that data file from now on.",
"_____no_output_____"
]
],
[
[
"iris_data.to_json('../data/iris-clean.json')",
"_____no_output_____"
],
[
"iris_data.to_csv('../data/iris-data-clean.csv', index=False)\n\n",
"_____no_output_____"
],
[
"cleanedframe = iris_data.dropna()",
"_____no_output_____"
],
[
"iris_data_clean = pd.read_csv('../data/iris-data-clean.csv')",
"_____no_output_____"
]
],
[
[
"Now, let's take a look at the scatterplot matrix now that we've tidied the data.",
"_____no_output_____"
]
],
[
[
"myplot = sb.pairplot(iris_data_clean, hue='class')\nmyplot.savefig('irises.png')",
"_____no_output_____"
],
[
"import scipy.stats as stats",
"_____no_output_____"
],
[
"iris_data = pd.read_csv('../data/iris-data.csv')",
"_____no_output_____"
],
[
"iris_data.columns.unique()",
"_____no_output_____"
],
[
"stats.entropy(iris_data_clean['sepal_length_cm'])",
"_____no_output_____"
],
[
"iris_data.columns[:-1]",
"_____no_output_____"
],
[
"# we go through list of column names except last one and get entropy \n# for data (without missing values) in each column\nfor col in iris_data.columns[:-1]:\n print(\"Entropy for: \", col, stats.entropy(iris_data[col].dropna()))",
"Entropy for: sepal_length_cm 4.96909746125432\nEntropy for: sepal_width_cm 5.000701325982732\nEntropy for: petal_length_cm 4.888113822938816\nEntropy for: petal_width_cm 4.754264731532864\n"
]
],
[
[
"Of course, I purposely inserted numerous errors into this data set to demonstrate some of the many possible scenarios you may face while tidying your data.\n\nThe general takeaways here should be:\n\n* Make sure your data is encoded properly\n\n* Make sure your data falls within the expected range, and use domain knowledge whenever possible to define that expected range\n\n* Deal with missing data in one way or another: replace it if you can or drop it\n\n* Never tidy your data manually because that is not easily reproducible\n\n* Use code as a record of how you tidied your data\n\n* Plot everything you can about the data at this stage of the analysis so you can *visually* confirm everything looks correct",
"_____no_output_____"
],
[
"## Bonus: Testing our data\n\n[[ go back to the top ]](#Table-of-contents)\n\nAt SciPy 2015, I was exposed to a great idea: We should test our data. Just how we use unit tests to verify our expectations from code, we can similarly set up unit tests to verify our expectations about a data set.\n\nWe can quickly test our data using `assert` statements: We assert that something must be true, and if it is, then nothing happens and the notebook continues running. However, if our assertion is wrong, then the notebook stops running and brings it to our attention. For example,\n\n```Python\nassert 1 == 2\n```\n\nwill raise an `AssertionError` and stop execution of the notebook because the assertion failed.\n\nLet's test a few things that we know about our data set now.",
"_____no_output_____"
]
],
[
[
"# We know that we should only have three classes\nassert len(iris_data_clean['class'].unique()) == 3",
"_____no_output_____"
],
[
"# We know that sepal lengths for 'Iris-versicolor' should never be below 2.5 cm\nassert iris_data_clean.loc[iris_data_clean['class'] == 'Iris-versicolor', 'sepal_length_cm'].min() >= 2.5",
"_____no_output_____"
],
[
"# We know that our data set should have no missing measurements\nassert len(iris_data_clean.loc[(iris_data_clean['sepal_length_cm'].isnull()) |\n (iris_data_clean['sepal_width_cm'].isnull()) |\n (iris_data_clean['petal_length_cm'].isnull()) |\n (iris_data_clean['petal_width_cm'].isnull())]) == 0",
"_____no_output_____"
],
[
"# We know that our data set should have no missing measurements\nassert len(iris_data.loc[(iris_data['sepal_length_cm'].isnull()) |\n (iris_data['sepal_width_cm'].isnull()) |\n (iris_data['petal_length_cm'].isnull()) |\n (iris_data['petal_width_cm'].isnull())]) == 0",
"_____no_output_____"
]
],
[
[
"And so on. If any of these expectations are violated, then our analysis immediately stops and we have to return to the tidying stage.",
"_____no_output_____"
],
[
"### Data Cleanup & Wrangling > 80% time spent in Data Science",
"_____no_output_____"
],
[
"## Step 4: Exploratory analysis\n\n[[ go back to the top ]](#Table-of-contents)\n\nNow after spending entirely too much time tidying our data, we can start analyzing it!\n\nExploratory analysis is the step where we start delving deeper into the data set beyond the outliers and errors. We'll be looking to answer questions such as:\n\n* How is my data distributed?\n\n* Are there any correlations in my data?\n\n* Are there any confounding factors that explain these correlations?\n\nThis is the stage where we plot all the data in as many ways as possible. Create many charts, but don't bother making them pretty โ these charts are for internal use.\n\nLet's return to that scatterplot matrix that we used earlier.",
"_____no_output_____"
]
],
[
[
"sb.pairplot(iris_data_clean)\n;",
"_____no_output_____"
]
],
[
[
"Our data is normally distributed for the most part, which is great news if we plan on using any modeling methods that assume the data is normally distributed.\n\nThere's something strange going on with the petal measurements. Maybe it's something to do with the different `Iris` types. Let's color code the data by the class again to see if that clears things up.",
"_____no_output_____"
]
],
[
[
"sb.pairplot(iris_data_clean, hue='class')\n;",
"_____no_output_____"
]
],
[
[
"Sure enough, the strange distribution of the petal measurements exist because of the different species. This is actually great news for our classification task since it means that the petal measurements will make it easy to distinguish between `Iris-setosa` and the other `Iris` types.\n\nDistinguishing `Iris-versicolor` and `Iris-virginica` will prove more difficult given how much their measurements overlap.\n\nThere are also correlations between petal length and petal width, as well as sepal length and sepal width. The field biologists assure us that this is to be expected: Longer flower petals also tend to be wider, and the same applies for sepals.\n\nWe can also make [**violin plots**](https://en.wikipedia.org/wiki/Violin_plot) of the data to compare the measurement distributions of the classes. Violin plots contain the same information as [box plots](https://en.wikipedia.org/wiki/Box_plot), but also scales the box according to the density of the data.",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(10, 10))\n\nfor column_index, column in enumerate(iris_data_clean.columns):\n if column == 'class':\n continue\n plt.subplot(2, 2, column_index + 1)\n sb.violinplot(x='class', y=column, data=iris_data_clean)",
"_____no_output_____"
]
],
[
[
"Enough flirting with the data. Let's get to modeling.",
"_____no_output_____"
],
[
"## Step 5: Classification\n\n[[ go back to the top ]](#Table-of-contents)\n\nWow, all this work and we *still* haven't modeled the data!\n\nAs tiresome as it can be, tidying and exploring our data is a vital component to any data analysis. If we had jumped straight to the modeling step, we would have created a faulty classification model.\n\nRemember: **Bad data leads to bad models.** Always check your data first.\n\n<hr />\n\nAssured that our data is now as clean as we can make it โ and armed with some cursory knowledge of the distributions and relationships in our data set โ it's time to make the next big step in our analysis: Splitting the data into training and testing sets.\n\nA **training set** is a random subset of the data that we use to train our models.\n\nA **testing set** is a random subset of the data (mutually exclusive from the training set) that we use to validate our models on unforseen data.\n\nEspecially in sparse data sets like ours, it's easy for models to **overfit** the data: The model will learn the training set so well that it won't be able to handle most of the cases it's never seen before. This is why it's important for us to build the model with the training set, but score it with the testing set.\n\nNote that once we split the data into a training and testing set, we should treat the testing set like it no longer exists: We cannot use any information from the testing set to build our model or else we're cheating.\n\nLet's set up our data first.",
"_____no_output_____"
]
],
[
[
"iris_data_clean = pd.read_csv('../data/iris-data-clean.csv')\n\n# We're using all four measurements as inputs\n# Note that scikit-learn expects each entry to be a list of values, e.g.,\n# [ [val1, val2, val3],\n# [val1, val2, val3],\n# ... ]\n# such that our input data set is represented as a list of lists\n\n# We can extract the data in this format from pandas like this:\nall_inputs = iris_data_clean[['sepal_length_cm', 'sepal_width_cm',\n 'petal_length_cm', 'petal_width_cm']].values\n\n# Similarly, we can extract the class labels\nall_labels = iris_data_clean['class'].values\n\n# Make sure that you don't mix up the order of the entries\n# all_inputs[5] inputs should correspond to the class in all_labels[5]\n\n# Here's what a subset of our inputs looks like:\nall_inputs[:5]",
"_____no_output_____"
],
[
"all_labels[:5]",
"_____no_output_____"
],
[
"type(all_inputs)",
"_____no_output_____"
],
[
"all_labels[:5]",
"_____no_output_____"
],
[
"type(all_labels)",
"_____no_output_____"
]
],
[
[
"Now our data is ready to be split.",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import train_test_split",
"_____no_output_____"
],
[
"all_inputs[:3]",
"_____no_output_____"
],
[
"iris_data_clean.head(3)",
"_____no_output_____"
],
[
"all_labels[:3]",
"_____no_output_____"
],
[
"# Here we split our data into training and testing data\n\n(training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25, random_state=1)",
"_____no_output_____"
],
[
"training_inputs[:5]",
"_____no_output_____"
],
[
"testing_inputs[:5]",
"_____no_output_____"
],
[
"testing_classes[:5]",
"_____no_output_____"
],
[
"training_classes[:5]",
"_____no_output_____"
]
],
[
[
"With our data split, we can start fitting models to our data. Our company's Head of Data is all about decision tree classifiers, so let's start with one of those.\n\nDecision tree classifiers are incredibly simple in theory. In their simplest form, decision tree classifiers ask a series of Yes/No questions about the data โ each time getting closer to finding out the class of each entry โ until they either classify the data set perfectly or simply can't differentiate a set of entries. Think of it like a game of [Twenty Questions](https://en.wikipedia.org/wiki/Twenty_Questions), except the computer is *much*, *much* better at it.\n\nHere's an example decision tree classifier:\n\n<img src=\"img/iris_dtc.png\" />\n\nNotice how the classifier asks Yes/No questions about the data โ whether a certain feature is <= 1.75, for example โ so it can differentiate the records. This is the essence of every decision tree.\n\nThe nice part about decision tree classifiers is that they are **scale-invariant**, i.e., the scale of the features does not affect their performance, unlike many Machine Learning models. In other words, it doesn't matter if our features range from 0 to 1 or 0 to 1,000; decision tree classifiers will work with them just the same.\n\nThere are several [parameters](http://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html) that we can tune for decision tree classifiers, but for now let's use a basic decision tree classifier.",
"_____no_output_____"
]
],
[
[
"from sklearn.tree import DecisionTreeClassifier\n\n# Create the classifier\ndecision_tree_classifier = DecisionTreeClassifier()\n\n# Train the classifier on the training set\ndecision_tree_classifier.fit(training_inputs, training_classes)\n\n# Validate the classifier on the testing set using classification accuracy\ndecision_tree_classifier.score(testing_inputs, testing_classes)",
"_____no_output_____"
],
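[
"# Scale-invariance sanity check (a sketch, not part of the original analysis):\n# multiplying every feature by 1000 -- as if we had measured in micrometers --\n# should leave a decision tree's accuracy essentially unchanged.\nscaled_tree = DecisionTreeClassifier(random_state=1)\nscaled_tree.fit(training_inputs * 1000.0, training_classes)\nscaled_tree.score(testing_inputs * 1000.0, testing_classes)",
"_____no_output_____"
],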
[
"150*0.25",
"_____no_output_____"
],
[
"len(testing_inputs)",
"_____no_output_____"
],
[
"37/38",
"_____no_output_____"
],
[
"from sklearn import svm\nsvm_classifier = svm.SVC(gamma = 'scale')",
"_____no_output_____"
],
[
"svm_classifier.fit(training_inputs, training_classes)",
"_____no_output_____"
],
[
"svm_classifier.score(testing_inputs, testing_classes)",
"_____no_output_____"
],
[
"svm_classifier = svm.SVC(gamma = 'scale')\nsvm_classifier.fit(training_inputs, training_classes)\nsvm_classifier.score(testing_inputs, testing_classes)",
"_____no_output_____"
]
],
[
[
"Heck yeah! Our model achieves 97% classification accuracy without much effort.\n\nHowever, there's a catch: Depending on how our training and testing set was sampled, our model can achieve anywhere from 80% to 100% accuracy:",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"# here we randomly split data 1000 times in differrent training and test sets\nmodel_accuracies = []\n\nfor repetition in range(1000):\n (training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25)\n \n decision_tree_classifier = DecisionTreeClassifier()\n decision_tree_classifier.fit(training_inputs, training_classes)\n classifier_accuracy = decision_tree_classifier.score(testing_inputs, testing_classes)\n model_accuracies.append(classifier_accuracy)\n \nplt.hist(model_accuracies)\n;",
"_____no_output_____"
],
[
"100/38",
"_____no_output_____"
]
],
[
[
"It's obviously a problem that our model performs quite differently depending on the subset of the data it's trained on. This phenomenon is known as **overfitting**: The model is learning to classify the training set so well that it doesn't generalize and perform well on data it hasn't seen before.\n\n### Cross-validation\n\n[[ go back to the top ]](#Table-of-contents)\n\nThis problem is the main reason that most data scientists perform ***k*-fold cross-validation** on their models: Split the original data set into *k* subsets, use one of the subsets as the testing set, and the rest of the subsets are used as the training set. This process is then repeated *k* times such that each subset is used as the testing set exactly once.\n\n10-fold cross-validation is the most common choice, so let's use that here. Performing 10-fold cross-validation on our data set looks something like this:\n\n(each square is an entry in our data set)",
"_____no_output_____"
]
],
[
[
"# new text",
"_____no_output_____"
],
[
"import numpy as np\nfrom sklearn.model_selection import StratifiedKFold\n\ndef plot_cv(cv, features, labels):\n masks = []\n for train, test in cv.split(features, labels):\n mask = np.zeros(len(labels), dtype=bool)\n mask[test] = 1\n masks.append(mask)\n \n plt.figure(figsize=(15, 15))\n plt.imshow(masks, interpolation='none', cmap='gray_r')\n plt.ylabel('Fold')\n plt.xlabel('Row #')\n\nplot_cv(StratifiedKFold(n_splits=10), all_inputs, all_labels)",
"_____no_output_____"
]
],
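[
[
"# What k-fold cross-validation does, written out by hand as a sketch:\n# train on k-1 folds, score on the held-out fold, repeat k times.\nfold_scores = []\nfor train_idx, test_idx in StratifiedKFold(n_splits=10).split(all_inputs, all_labels):\n    fold_tree = DecisionTreeClassifier()\n    fold_tree.fit(all_inputs[train_idx], all_labels[train_idx])\n    fold_scores.append(fold_tree.score(all_inputs[test_idx], all_labels[test_idx]))\nnp.mean(fold_scores)",
"_____no_output_____"
]
],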
[
[
"You'll notice that we used **Stratified *k*-fold cross-validation** in the code above. Stratified *k*-fold keeps the class proportions the same across all of the folds, which is vital for maintaining a representative subset of our data set. (e.g., so we don't have 100% `Iris setosa` entries in one of the folds.)\n\nWe can perform 10-fold cross-validation on our model with the following code:",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import cross_val_score",
"_____no_output_____"
],
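[
"# Why stratification matters: each test fold should hold the three classes in\n# roughly equal numbers. A quick check on the first three folds:\nimport collections\n\nfor i, (train_idx, test_idx) in enumerate(\n        StratifiedKFold(n_splits=10).split(all_inputs, all_labels)):\n    print('Fold', i, collections.Counter(all_labels[test_idx]))\n    if i == 2:\n        break",
"_____no_output_____"
],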
[
"from sklearn.model_selection import cross_val_score\n\ndecision_tree_classifier = DecisionTreeClassifier()\n\n# cross_val_score returns a list of the scores, which we can visualize\n# to get a reasonable estimate of our classifier's performance\ncv_scores = cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10)\nplt.hist(cv_scores)\nplt.title('Average score: {}'.format(np.mean(cv_scores)))\n;",
"_____no_output_____"
],
[
"len(all_inputs.T[1])",
"_____no_output_____"
],
[
"print(\"Entropy for: \", stats.entropy(all_inputs.T[1]))",
"Entropy for: 4.994187360273029\n"
],
[
"# we go through list of column names except last one and get entropy \n# for data (without missing values) in each column\ndef printEntropy(npdata):\n for i, col in enumerate(npdata.T):\n print(\"Entropy for column:\", i, stats.entropy(col))",
"_____no_output_____"
],
[
"printEntropy(all_inputs)",
"Entropy for column: 0 4.9947332367061925\nEntropy for column: 1 4.994187360273029\nEntropy for column: 2 4.88306851089088\nEntropy for column: 3 4.76945055275522\n"
]
],
[
[
"Now we have a much more consistent rating of our classifier's general classification accuracy.\n\n### Parameter tuning\n\n[[ go back to the top ]](#Table-of-contents)\n\nEvery Machine Learning model comes with a variety of parameters to tune, and these parameters can be vitally important to the performance of our classifier. For example, if we severely limit the depth of our decision tree classifier:",
"_____no_output_____"
]
],
[
[
"decision_tree_classifier = DecisionTreeClassifier(max_depth=1)\n\ncv_scores = cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10)\nplt.hist(cv_scores)\nplt.title('Average score: {}'.format(np.mean(cv_scores)))\n;",
"_____no_output_____"
]
],
[
[
"the classification accuracy falls tremendously.\n\nTherefore, we need to find a systematic method to discover the best parameters for our model and data set.\n\nThe most common method for model parameter tuning is **Grid Search**. The idea behind Grid Search is simple: explore a range of parameters and find the best-performing parameter combination. Focus your search on the best range of parameters, then repeat this process several times until the best parameters are discovered.\n\nLet's tune our decision tree classifier. We'll stick to only two parameters for now, but it's possible to simultaneously explore dozens of parameters if we want.",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import GridSearchCV\n\ndecision_tree_classifier = DecisionTreeClassifier()\n\nparameter_grid = {'max_depth': [1, 2, 3, 4, 5],\n 'max_features': [1, 2, 3, 4]}\n\ncross_validation = StratifiedKFold(n_splits=10)\n\ngrid_search = GridSearchCV(decision_tree_classifier,\n param_grid=parameter_grid,\n cv=cross_validation)\n\ngrid_search.fit(all_inputs, all_labels)\nprint('Best score: {}'.format(grid_search.best_score_))\nprint('Best parameters: {}'.format(grid_search.best_params_))",
"Best score: 0.9664429530201343\nBest parameters: {'max_depth': 3, 'max_features': 2}\n"
]
],
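[
[
"# Grid Search is conceptually just nested loops; a hand-rolled sketch of what\n# GridSearchCV automates (plus refitting, bookkeeping, and parallelism):\nbest_score, best_params = 0.0, None\nfor depth in [1, 2, 3, 4, 5]:\n    for n_features in [1, 2, 3, 4]:\n        candidate = DecisionTreeClassifier(max_depth=depth, max_features=n_features)\n        score = np.mean(cross_val_score(candidate, all_inputs, all_labels, cv=10))\n        if score > best_score:\n            best_score = score\n            best_params = {'max_depth': depth, 'max_features': n_features}\nprint(best_score, best_params)",
"_____no_output_____"
]
],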
[
[
"Now let's visualize the grid search to see how the parameters interact.",
"_____no_output_____"
]
],
[
[
"grid_search.cv_results_['mean_test_score']",
"_____no_output_____"
],
[
"grid_visualization = grid_search.cv_results_['mean_test_score']\ngrid_visualization.shape = (5, 4)\nsb.heatmap(grid_visualization, cmap='Reds', annot=True)\nplt.xticks(np.arange(4) + 0.5, grid_search.param_grid['max_features'])\nplt.yticks(np.arange(5) + 0.5, grid_search.param_grid['max_depth'])\nplt.xlabel('max_features')\nplt.ylabel('max_depth')\n;",
"_____no_output_____"
]
],
[
[
"Now we have a better sense of the parameter space: We know that we need a `max_depth` of at least 2 to allow the decision tree to make more than a one-off decision.\n\n`max_features` doesn't really seem to make a big difference here as long as we have 2 of them, which makes sense since our data set has only 4 features and is relatively easy to classify. (Remember, one of our data set's classes was easily separable from the rest based on a single feature.)\n\nLet's go ahead and use a broad grid search to find the best settings for a handful of parameters.",
"_____no_output_____"
]
],
[
[
"decision_tree_classifier = DecisionTreeClassifier()\n\nparameter_grid = {'criterion': ['gini', 'entropy'],\n 'splitter': ['best', 'random'],\n 'max_depth': [1, 2, 3, 4, 5],\n 'max_features': [1, 2, 3, 4]}\n\ncross_validation = StratifiedKFold(n_splits=10)\n\ngrid_search = GridSearchCV(decision_tree_classifier,\n param_grid=parameter_grid,\n cv=cross_validation)\n\ngrid_search.fit(all_inputs, all_labels)\nprint('Best score: {}'.format(grid_search.best_score_))\nprint('Best parameters: {}'.format(grid_search.best_params_))",
"Best score: 0.9664429530201343\nBest parameters: {'criterion': 'gini', 'max_depth': 3, 'max_features': 3, 'splitter': 'best'}\n"
]
],
[
[
"Now we can take the best classifier from the Grid Search and use that:",
"_____no_output_____"
]
],
[
[
"decision_tree_classifier = grid_search.best_estimator_\ndecision_tree_classifier",
"_____no_output_____"
]
],
[
[
"We can even visualize the decision tree with [GraphViz](http://www.graphviz.org/) to see how it's making the classifications:",
"_____no_output_____"
]
],
[
[
"import sklearn.tree as tree\nfrom sklearn.externals.six import StringIO\n\nwith open('iris_dtc.dot', 'w') as out_file:\n out_file = tree.export_graphviz(decision_tree_classifier, out_file=out_file)",
"_____no_output_____"
]
],
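[
[
"# Rendering the .dot file to the PNG shown below needs the GraphViz binaries\n# on your PATH; the output path here is an assumption -- adjust as needed.\n!dot -Tpng iris_dtc.dot -o img/iris_dtc.png",
"_____no_output_____"
]
],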
[
[
"<img src=\"img/iris_dtc.png\" />",
"_____no_output_____"
],
[
"(This classifier may look familiar from earlier in the notebook.)\n\nAlright! We finally have our demo classifier. Let's create some visuals of its performance so we have something to show our company's Head of Data.",
"_____no_output_____"
]
],
[
[
"dt_scores = cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10)\n\nsb.boxplot(dt_scores)\nsb.stripplot(dt_scores, jitter=True, color='black')\n;",
"_____no_output_____"
]
],
[
[
"Hmmm... that's a little boring by itself though. How about we compare another classifier to see how they perform?\n\nWe already know from previous projects that Random Forest classifiers usually work better than individual decision trees. A common problem that decision trees face is that they're prone to overfitting: They complexify to the point that they classify the training set near-perfectly, but fail to generalize to data they have not seen before.\n\n**Random Forest classifiers** work around that limitation by creating a whole bunch of decision trees (hence \"forest\") โ each trained on random subsets of training samples (drawn with replacement) and features (drawn without replacement) โ and have the decision trees work together to make a more accurate classification.\n\nLet that be a lesson for us: **Even in Machine Learning, we get better results when we work together!**\n\nLet's see if a Random Forest classifier works better here.\n\nThe great part about scikit-learn is that the training, testing, parameter tuning, etc. process is the same for all models, so we only need to plug in the new classifier.",
"_____no_output_____"
]
],
[
[
"from sklearn.ensemble import RandomForestClassifier",
"_____no_output_____"
],
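[
"# The 'random subsets of training samples (drawn with replacement)' each tree\n# sees is a bootstrap sample -- a two-line sketch of what that means:\nbootstrap_idx = np.random.choice(len(all_inputs), size=len(all_inputs), replace=True)\nprint('unique rows in one bootstrap sample:',\n      len(np.unique(bootstrap_idx)), 'of', len(all_inputs))",
"_____no_output_____"
],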
[
"from sklearn.ensemble import RandomForestClassifier\n\nrandom_forest_classifier = RandomForestClassifier()\n\nparameter_grid = {'n_estimators': [10, 25, 50, 100],\n 'criterion': ['gini', 'entropy'],\n 'max_features': [1, 2, 3, 4]}\n\ncross_validation = StratifiedKFold(n_splits=10)\n\ngrid_search = GridSearchCV(random_forest_classifier,\n param_grid=parameter_grid,\n cv=cross_validation)\n\ngrid_search.fit(all_inputs, all_labels)\nprint('Best score: {}'.format(grid_search.best_score_))\nprint('Best parameters: {}'.format(grid_search.best_params_))\n\ngrid_search.best_estimator_",
"Best score: 0.9664429530201343\nBest parameters: {'criterion': 'gini', 'max_features': 3, 'n_estimators': 25}\n"
]
],
[
[
"Now we can compare their performance:",
"_____no_output_____"
]
],
[
[
"random_forest_classifier = grid_search.best_estimator_\n\nrf_df = pd.DataFrame({'accuracy': cross_val_score(random_forest_classifier, all_inputs, all_labels, cv=10),\n 'classifier': ['Random Forest'] * 10})\ndt_df = pd.DataFrame({'accuracy': cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10),\n 'classifier': ['Decision Tree'] * 10})\nboth_df = rf_df.append(dt_df)\n\nsb.boxplot(x='classifier', y='accuracy', data=both_df)\nsb.stripplot(x='classifier', y='accuracy', data=both_df, jitter=True, color='black')\n;",
"_____no_output_____"
]
],
[
[
"How about that? They both seem to perform about the same on this data set. This is probably because of the limitations of our data set: We have only 4 features to make the classification, and Random Forest classifiers excel when there's hundreds of possible features to look at. In other words, there wasn't much room for improvement with this data set.",
"_____no_output_____"
],
[
"## Step 6: Reproducibility\n\n[[ go back to the top ]](#Table-of-contents)\n\nEnsuring that our work is reproducible is the last and โ arguably โ most important step in any analysis. **As a rule, we shouldn't place much weight on a discovery that can't be reproduced**. As such, if our analysis isn't reproducible, we might as well not have done it.\n\nNotebooks like this one go a long way toward making our work reproducible. Since we documented every step as we moved along, we have a written record of what we did and why we did it โ both in text and code.\n\nBeyond recording what we did, we should also document what software and hardware we used to perform our analysis. This typically goes at the top of our notebooks so our readers know what tools to use.\n\n[Sebastian Raschka](http://sebastianraschka.com/) created a handy [notebook tool](https://github.com/rasbt/watermark) for this:",
"_____no_output_____"
]
],
[
[
"!pip install watermark",
"Requirement already satisfied: watermark in c:\\programdata\\anaconda3\\lib\\site-packages (1.8.1)\nRequirement already satisfied: ipython in c:\\programdata\\anaconda3\\lib\\site-packages (from watermark) (7.4.0)\nRequirement already satisfied: jedi>=0.10 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.13.3)\nRequirement already satisfied: backcall in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.1.0)\nRequirement already satisfied: pickleshare in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.7.5)\nRequirement already satisfied: setuptools>=18.5 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (40.8.0)\nRequirement already satisfied: colorama; sys_platform == \"win32\" in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.4.1)\nRequirement already satisfied: decorator in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (4.4.0)\nRequirement already satisfied: pygments in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (2.3.1)\nRequirement already satisfied: prompt-toolkit<2.1.0,>=2.0.0 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (2.0.9)\nRequirement already satisfied: traitlets>=4.2 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (4.3.2)\nRequirement already satisfied: parso>=0.3.0 in c:\\programdata\\anaconda3\\lib\\site-packages (from jedi>=0.10->ipython->watermark) (0.3.4)\nRequirement already satisfied: six>=1.9.0 in c:\\programdata\\anaconda3\\lib\\site-packages (from prompt-toolkit<2.1.0,>=2.0.0->ipython->watermark) (1.12.0)\nRequirement already satisfied: wcwidth in c:\\programdata\\anaconda3\\lib\\site-packages (from prompt-toolkit<2.1.0,>=2.0.0->ipython->watermark) (0.1.7)\nRequirement already satisfied: ipython-genutils in c:\\programdata\\anaconda3\\lib\\site-packages (from traitlets>=4.2->ipython->watermark) (0.2.0)\n"
],
[
"%load_ext watermark",
"The watermark extension is already loaded. To reload it, use:\n %reload_ext watermark\n"
],
[
"pd.show_versions()",
"\nINSTALLED VERSIONS\n------------------\ncommit: None\npython: 3.7.3.final.0\npython-bits: 64\nOS: Windows\nOS-release: 10\nmachine: AMD64\nprocessor: Intel64 Family 6 Model 158 Stepping 10, GenuineIntel\nbyteorder: little\nLC_ALL: None\nLANG: None\nLOCALE: None.None\n\npandas: 0.24.2\npytest: 4.3.1\npip: 19.0.3\nsetuptools: 40.8.0\nCython: 0.29.6\nnumpy: 1.16.2\nscipy: 1.2.1\npyarrow: None\nxarray: None\nIPython: 7.4.0\nsphinx: 1.8.5\npatsy: 0.5.1\ndateutil: 2.8.0\npytz: 2018.9\nblosc: None\nbottleneck: 1.2.1\ntables: 3.5.1\nnumexpr: 2.6.9\nfeather: None\nmatplotlib: 3.0.3\nopenpyxl: 2.6.1\nxlrd: 1.2.0\nxlwt: 1.3.0\nxlsxwriter: 1.1.5\nlxml.etree: 4.3.2\nbs4: 4.7.1\nhtml5lib: 1.0.1\nsqlalchemy: 1.3.1\npymysql: None\npsycopg2: None\njinja2: 2.10\ns3fs: None\nfastparquet: None\npandas_gbq: None\npandas_datareader: None\ngcsfs: None\n"
],
[
"%watermark -a 'RCS_April_2019' -nmv --packages numpy,pandas,sklearn,matplotlib,seaborn",
"RCS_April_2019 Wed Apr 17 2019 \n\nCPython 3.7.3\nIPython 7.4.0\n\nnumpy 1.16.2\npandas 0.24.2\nsklearn 0.20.3\nmatplotlib 3.0.3\nseaborn 0.9.0\n\ncompiler : MSC v.1915 64 bit (AMD64)\nsystem : Windows\nrelease : 10\nmachine : AMD64\nprocessor : Intel64 Family 6 Model 158 Stepping 10, GenuineIntel\nCPU cores : 12\ninterpreter: 64bit\n"
]
],
[
[
"Finally, let's extract the core of our work from Steps 1-5 and turn it into a single pipeline.",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\nimport pandas as pd\nimport seaborn as sb\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split, cross_val_score\n\n# We can jump directly to working with the clean data because we saved our cleaned data set\niris_data_clean = pd.read_csv('../data/iris-data-clean.csv')\n\n# Testing our data: Our analysis will stop here if any of these assertions are wrong\n\n# We know that we should only have three classes\nassert len(iris_data_clean['class'].unique()) == 3\n\n# We know that sepal lengths for 'Iris-versicolor' should never be below 2.5 cm\nassert iris_data_clean.loc[iris_data_clean['class'] == 'Iris-versicolor', 'sepal_length_cm'].min() >= 2.5\n\n# We know that our data set should have no missing measurements\nassert len(iris_data_clean.loc[(iris_data_clean['sepal_length_cm'].isnull()) |\n (iris_data_clean['sepal_width_cm'].isnull()) |\n (iris_data_clean['petal_length_cm'].isnull()) |\n (iris_data_clean['petal_width_cm'].isnull())]) == 0\n\nall_inputs = iris_data_clean[['sepal_length_cm', 'sepal_width_cm',\n 'petal_length_cm', 'petal_width_cm']].values\n\nall_labels = iris_data_clean['class'].values\n\n# This is the classifier that came out of Grid Search\nrandom_forest_classifier = RandomForestClassifier(criterion='gini', max_features=3, n_estimators=50)\n\n# All that's left to do now is plot the cross-validation scores\nrf_classifier_scores = cross_val_score(random_forest_classifier, all_inputs, all_labels, cv=10)\nsb.boxplot(rf_classifier_scores)\nsb.stripplot(rf_classifier_scores, jitter=True, color='black')\n\n# ...and show some of the predictions from the classifier\n(training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25)\n\nrandom_forest_classifier.fit(training_inputs, training_classes)\n\nfor input_features, prediction, actual in zip(testing_inputs[:10],\n random_forest_classifier.predict(testing_inputs[:10]),\n testing_classes[:10]):\n print('{}\\t-->\\t{}\\t(Actual: {})'.format(input_features, prediction, actual))",
"[4.6 3.4 1.4 0.3]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.9 3. 4.2 1.5]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[7.2 3. 5.8 1.6]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[6.7 2.5 5.8 1.8]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[6.7 3.3 5.7 2.5]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[4.9 3.1 1.5 0.25]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[6.3 3.4 5.6 2.4]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[5.1 3.3 1.7 0.5]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[4.9 2.4 3.3 1. ]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[6.3 3.3 4.7 1.6]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n"
],
[
"%matplotlib inline\nimport pandas as pd\nimport seaborn as sb\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split, cross_val_score\n\ndef processData(filename): \n # We can jump directly to working with the clean data because we saved our cleaned data set\n iris_data_clean = pd.read_csv(filename)\n\n # Testing our data: Our analysis will stop here if any of these assertions are wrong\n\n # We know that we should only have three classes\n assert len(iris_data_clean['class'].unique()) == 3\n\n # We know that sepal lengths for 'Iris-versicolor' should never be below 2.5 cm\n assert iris_data_clean.loc[iris_data_clean['class'] == 'Iris-versicolor', 'sepal_length_cm'].min() >= 2.5\n\n # We know that our data set should have no missing measurements\n assert len(iris_data_clean.loc[(iris_data_clean['sepal_length_cm'].isnull()) |\n (iris_data_clean['sepal_width_cm'].isnull()) |\n (iris_data_clean['petal_length_cm'].isnull()) |\n (iris_data_clean['petal_width_cm'].isnull())]) == 0\n\n all_inputs = iris_data_clean[['sepal_length_cm', 'sepal_width_cm',\n 'petal_length_cm', 'petal_width_cm']].values\n\n all_labels = iris_data_clean['class'].values\n\n # This is the classifier that came out of Grid Search\n random_forest_classifier = RandomForestClassifier(criterion='gini', max_features=3, n_estimators=50)\n\n # All that's left to do now is plot the cross-validation scores\n rf_classifier_scores = cross_val_score(random_forest_classifier, all_inputs, all_labels, cv=10)\n sb.boxplot(rf_classifier_scores)\n sb.stripplot(rf_classifier_scores, jitter=True, color='black')\n\n # ...and show some of the predictions from the classifier\n (training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25)\n\n random_forest_classifier.fit(training_inputs, training_classes)\n\n for input_features, prediction, actual in zip(testing_inputs[:10],\n random_forest_classifier.predict(testing_inputs[:10]),\n testing_classes[:10]):\n print('{}\\t-->\\t{}\\t(Actual: {})'.format(input_features, prediction, actual))\n return rf_classifier_scores",
"_____no_output_____"
],
[
"myscores = processData('../data/iris-data-clean.csv')",
"[5.1 3.7 1.5 0.4]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.8 2.7 4.1 1. ]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[5.7 3. 1.1 0.1]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.9 3. 5.1 1.8]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[5.4 3.4 1.7 0.2]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[4.7 3.2 1.6 0.2]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.4 3. 4.5 1.5]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[5.7 4.4 1.5 0.4]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5. 3.2 1.2 0.2]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[7.2 3. 5.8 1.6]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n"
],
[
"myscores",
"_____no_output_____"
]
],
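[
[
"# Illustrative check (added; not part of the original notebook): the conclusion\n# below cites a >90% accuracy target, so report the mean cross-validated score\n# from `myscores` directly.\nmyscores.mean()",
"_____no_output_____"
]
],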
[
[
"There we have it: We have a complete and reproducible Machine Learning pipeline to demo to our company's Head of Data. We've met the success criteria that we set from the beginning (>90% accuracy), and our pipeline is flexible enough to handle new inputs or flowers when that data set is ready. Not bad for our first week on the job!",
"_____no_output_____"
],
[
"## Conclusions\n\n[[ go back to the top ]](#Table-of-contents)\n\nI hope you found this example notebook useful for your own work and learned at least one new trick by reading through it.\n\n\n* [Submit an issue](https://github.com/ValRCS/LU-pysem/issues) on GitHub\n\n* Fork the [notebook repository](https://github.com/ValRCS/LU-pysem), make the fix/addition yourself, then send over a pull request",
"_____no_output_____"
],
[
"## Further reading\n\n[[ go back to the top ]](#Table-of-contents)\n\nThis notebook covers a broad variety of topics but skips over many of the specifics. If you're looking to dive deeper into a particular topic, here's some recommended reading.\n\n**Data Science**: William Chen compiled a [list of free books](http://www.wzchen.com/data-science-books/) for newcomers to Data Science, ranging from the basics of R & Python to Machine Learning to interviews and advice from prominent data scientists.\n\n**Machine Learning**: /r/MachineLearning has a useful [Wiki page](https://www.reddit.com/r/MachineLearning/wiki/index) containing links to online courses, books, data sets, etc. for Machine Learning. There's also a [curated list](https://github.com/josephmisiti/awesome-machine-learning) of Machine Learning frameworks, libraries, and software sorted by language.\n\n**Unit testing**: Dive Into Python 3 has a [great walkthrough](http://www.diveintopython3.net/unit-testing.html) of unit testing in Python, how it works, and how it should be used\n\n**pandas** has [several tutorials](http://pandas.pydata.org/pandas-docs/stable/tutorials.html) covering its myriad features.\n\n**scikit-learn** has a [bunch of tutorials](http://scikit-learn.org/stable/tutorial/index.html) for those looking to learn Machine Learning in Python. Andreas Mueller's [scikit-learn workshop materials](https://github.com/amueller/scipy_2015_sklearn_tutorial) are top-notch and freely available.\n\n**matplotlib** has many [books, videos, and tutorials](http://matplotlib.org/resources/index.html) to teach plotting in Python.\n\n**Seaborn** has a [basic tutorial](http://stanford.edu/~mwaskom/software/seaborn/tutorial.html) covering most of the statistical plotting features.",
"_____no_output_____"
],
[
"## Acknowledgements\n\n[[ go back to the top ]](#Table-of-contents)\n\nMany thanks to [Andreas Mueller](http://amueller.github.io/) for some of his [examples](https://github.com/amueller/scipy_2015_sklearn_tutorial) in the Machine Learning section. I drew inspiration from several of his excellent examples.\n\nThe photo of a flower with annotations of the petal and sepal was taken by [Eric Guinther](https://commons.wikimedia.org/wiki/File:Petal-sepal.jpg).\n\nThe photos of the various *Iris* flower types were taken by [Ken Walker](http://www.signa.org/index.pl?Display+Iris-setosa+2) and [Barry Glick](http://www.signa.org/index.pl?Display+Iris-virginica+3).",
"_____no_output_____"
],
[
"## Further questions? \n\nFeel free to contact [Valdis Saulespurens]\n(email:[email protected])",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
]
|
d061465d23ce5abccb3893326eb5add1159d5665 | 25,452 | ipynb | Jupyter Notebook | Deep_Learning_Specialization/04_Convolutional_Neural_Networks/02_Deep_Convolutional_Models_Case_Studies/02_Residual_Networks/Residual Networks - v2.ipynb | cilsya/coursera | 4a7896f3225cb84e2f15770409c1f18bfe529615 | [
"MIT"
]
| 1 | 2021-03-15T13:57:04.000Z | 2021-03-15T13:57:04.000Z | Deep_Learning_Specialization/04_Convolutional_Neural_Networks/02_Deep_Convolutional_Models_Case_Studies/02_Residual_Networks/Residual Networks - v2.ipynb | cilsya/coursera | 4a7896f3225cb84e2f15770409c1f18bfe529615 | [
"MIT"
]
| 5 | 2020-03-24T16:17:05.000Z | 2021-06-01T22:49:40.000Z | Deep_Learning_Specialization/04_Convolutional_Neural_Networks/02_Deep_Convolutional_Models_Case_Studies/02_Residual_Networks/Residual Networks - v2.ipynb | cilsya/coursera | 4a7896f3225cb84e2f15770409c1f18bfe529615 | [
"MIT"
]
| null | null | null | 47.220779 | 457 | 0.485424 | [
[
[
"empty"
]
]
]
| [
"empty"
]
| [
[
"empty"
]
]
|
d0614fd8c14825f845dd96eed5634b241da21e66 | 371,618 | ipynb | Jupyter Notebook | docs/source/_docs/pyNetLogo demo - SALib sequential.ipynb | jasonrwang/pyNetLogo | 01117c9d9c7d4d5681fe1e08f8862d6b64c9a4b7 | [
"BSD-3-Clause"
]
| null | null | null | docs/source/_docs/pyNetLogo demo - SALib sequential.ipynb | jasonrwang/pyNetLogo | 01117c9d9c7d4d5681fe1e08f8862d6b64c9a4b7 | [
"BSD-3-Clause"
]
| null | null | null | docs/source/_docs/pyNetLogo demo - SALib sequential.ipynb | jasonrwang/pyNetLogo | 01117c9d9c7d4d5681fe1e08f8862d6b64c9a4b7 | [
"BSD-3-Clause"
]
| null | null | null | 455.414216 | 209,950 | 0.92675 | [
[
[
"## Example 2: Sensitivity analysis on a NetLogo model with SALib\n\nThis notebook provides a more advanced example of interaction between NetLogo and a Python environment, using the SALib library (Herman & Usher, 2017; available through the pip package manager) to sample and analyze a suitable experimental design for a Sobol global sensitivity analysis. All files used in the example are available from the pyNetLogo repository at https://github.com/quaquel/pyNetLogo.",
"_____no_output_____"
]
],
[
[
"#Ensuring compliance of code with both python2 and python3\n\nfrom __future__ import division, print_function\ntry:\n from itertools import izip as zip\nexcept ImportError: # will be 3.x series\n pass",
"_____no_output_____"
],
[
"%matplotlib inline\n\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nimport pyNetLogo\n\n#Import the sampling and analysis modules for a Sobol variance-based sensitivity analysis\nfrom SALib.sample import saltelli\nfrom SALib.analyze import sobol",
"_____no_output_____"
]
],
[
[
"SALib relies on a problem definition dictionary which contains the number of input parameters to sample, their names (which should here correspond to a NetLogo global variable), and the sampling bounds. Documentation for SALib can be found at https://salib.readthedocs.io/en/latest/.",
"_____no_output_____"
]
],
[
[
"problem = { \n 'num_vars': 6,\n 'names': ['random-seed',\n 'grass-regrowth-time',\n 'sheep-gain-from-food',\n 'wolf-gain-from-food',\n 'sheep-reproduce',\n 'wolf-reproduce'], \n 'bounds': [[1, 100000],\n [20., 40.], \n [2., 8.], \n [16., 32.],\n [2., 8.],\n [2., 8.]]\n}",
"_____no_output_____"
]
],
[
[
"We start by instantiating the wolf-sheep predation example model, specifying the _gui=False_ flag to run in headless mode.",
"_____no_output_____"
]
],
[
[
"netlogo = pyNetLogo.NetLogoLink(gui=False)\nnetlogo.load_model(r'Wolf Sheep Predation_v6.nlogo')",
"_____no_output_____"
]
],
[
[
"The SALib sampler will automatically generate an appropriate number of samples for Sobol analysis. To calculate first-order, second-order and total sensitivity indices, this gives a sample size of _n*(2p+2)_, where _p_ is the number of input parameters, and _n_ is a baseline sample size which should be large enough to stabilize the estimation of the indices. For this example, we use _n_ = 1000, for a total of 14000 experiments.\n\nFor more complex analyses, parallelizing the experiments can significantly improve performance. An additional notebook in the pyNetLogo repository demonstrates the use of the ipyparallel library; parallel processing for NetLogo models is also supported by the Exploratory Modeling Workbench (Kwakkel, 2017).",
"_____no_output_____"
]
],
[
[
"n = 1000\nparam_values = saltelli.sample(problem, n, calc_second_order=True)",
"_____no_output_____"
]
],
[
[
"The sampler generates an input array of shape (_n*(2p+2)_, _p_) with rows for each experiment and columns for each input parameter.",
"_____no_output_____"
]
],
[
[
"param_values.shape",
"_____no_output_____"
]
],
[
[
"Assuming we are interested in the mean number of sheep and wolf agents over a timeframe of 100 ticks, we first create an empty dataframe to store the results.",
"_____no_output_____"
]
],
[
[
"results = pd.DataFrame(columns=['Avg. sheep', 'Avg. wolves'])",
"_____no_output_____"
]
],
[
[
"We then simulate the model over the 14000 experiments, reading input parameters from the param_values array generated by SALib. The repeat_report command is used to track the outcomes of interest over time. \n\nTo later compare performance with the ipyparallel implementation of the analysis, we also keep track of the elapsed runtime.",
"_____no_output_____"
]
],
[
[
"import time\n\nt0=time.time()\n\nfor run in range(param_values.shape[0]):\n \n #Set the input parameters\n for i, name in enumerate(problem['names']):\n if name == 'random-seed':\n #The NetLogo random seed requires a different syntax\n netlogo.command('random-seed {}'.format(param_values[run,i]))\n else:\n #Otherwise, assume the input parameters are global variables\n netlogo.command('set {0} {1}'.format(name, param_values[run,i]))\n \n netlogo.command('setup')\n #Run for 100 ticks and return the number of sheep and wolf agents at each time step\n counts = netlogo.repeat_report(['count sheep','count wolves'], 100)\n \n #For each run, save the mean value of the agent counts over time\n results.loc[run, 'Avg. sheep'] = counts['count sheep'].values.mean()\n results.loc[run, 'Avg. wolves'] = counts['count wolves'].values.mean()\n \nelapsed=time.time()-t0 #Elapsed runtime in seconds",
"_____no_output_____"
],
[
"elapsed",
"_____no_output_____"
]
],
[
[
"The \"to_csv\" dataframe method provides a simple way of saving the results to disk.\n\nPandas supports several more advanced storage options, such as serialization with msgpack, or hierarchical HDF5 storage.",
"_____no_output_____"
]
],
[
[
"results.to_csv('Sobol_sequential.csv')",
"_____no_output_____"
],
[
"results = pd.read_csv('Sobol_sequential.csv', header=0, index_col=0)",
"_____no_output_____"
],
[
"results.head(5)",
"_____no_output_____"
]
],
[
[
"We can then proceed with the analysis, first using a histogram to visualize output distributions for each outcome:",
"_____no_output_____"
]
],
[
[
"sns.set_style('white')\nsns.set_context('talk')\nfig, ax = plt.subplots(1,len(results.columns), sharey=True)\n\nfor i, n in enumerate(results.columns):\n ax[i].hist(results[n], 20)\n ax[i].set_xlabel(n)\n\nax[0].set_ylabel('Counts')\n\nfig.set_size_inches(10,4)\nfig.subplots_adjust(wspace=0.1)\n#plt.savefig('JASSS figures/SA - Output distribution.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/SA - Output distribution.png', dpi=300, bbox_inches='tight')\nplt.show()",
"_____no_output_____"
]
],
[
[
"Bivariate scatter plots can be useful to visualize relationships between each input parameter and the outputs. Taking the outcome for the average sheep count as an example, we obtain the following, using the scipy library to calculate the Pearson correlation coefficient (r) for each parameter:",
"_____no_output_____"
]
],
[
[
"%matplotlib\nimport scipy\n\nnrow=2\nncol=3\nfig, ax = plt.subplots(nrow, ncol, sharey=True)\nsns.set_context('talk')\ny = results['Avg. sheep']\n\nfor i, a in enumerate(ax.flatten()):\n x = param_values[:,i]\n sns.regplot(x, y, ax=a, ci=None, color='k',scatter_kws={'alpha':0.2, 's':4, 'color':'gray'})\n pearson = scipy.stats.pearsonr(x, y)\n a.annotate(\"r: {:6.3f}\".format(pearson[0]), xy=(0.15, 0.85), xycoords='axes fraction',fontsize=13)\n if divmod(i,ncol)[1]>0:\n a.get_yaxis().set_visible(False)\n a.set_xlabel(problem['names'][i])\n a.set_ylim([0,1.1*np.max(y)])\n\nfig.set_size_inches(9,9,forward=True) \nfig.subplots_adjust(wspace=0.2, hspace=0.3)\n#plt.savefig('JASSS figures/SA - Scatter.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/SA - Scatter.png', dpi=300, bbox_inches='tight')\nplt.show()",
"_____no_output_____"
]
],
[
[
"This indicates a positive relationship between the \"sheep-gain-from-food\" parameter and the mean sheep count, and negative relationships for the \"wolf-gain-from-food\" and \"wolf-reproduce\" parameters.\n\nWe can then use SALib to calculate first-order (S1), second-order (S2) and total (ST) Sobol indices, to estimate each input's contribution to output variance. By default, 95% confidence intervals are estimated for each index.",
"_____no_output_____"
]
],
[
[
"Si = sobol.analyze(problem, results['Avg. sheep'].values, calc_second_order=True, print_to_console=False)",
"_____no_output_____"
]
],
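[
[
"# Illustrative peek (added) at the structure returned by sobol.analyze: a dict of\n# numpy arrays holding the first-order, second-order and total indices plus their\n# confidence bounds.\n{key: np.shape(value) for key, value in Si.items()}",
"_____no_output_____"
]
],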
[
[
"As a simple example, we first select and visualize the first-order and total indices for each input, converting the dictionary returned by SALib to a dataframe.",
"_____no_output_____"
]
],
[
[
"Si_filter = {k:Si[k] for k in ['ST','ST_conf','S1','S1_conf']}\nSi_df = pd.DataFrame(Si_filter, index=problem['names'])",
"_____no_output_____"
],
[
"Si_df",
"_____no_output_____"
],
[
"sns.set_style('white')\nfig, ax = plt.subplots(1)\n\nindices = Si_df[['S1','ST']]\nerr = Si_df[['S1_conf','ST_conf']]\n\nindices.plot.bar(yerr=err.values.T,ax=ax)\nfig.set_size_inches(8,4)\n\n#plt.savefig('JASSS figures/SA - Indices.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/SA - Indices.png', dpi=300, bbox_inches='tight')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"The \"sheep-gain-from-food\" parameter has the highest ST index, indicating that it contributes over 50% of output variance when accounting for interactions with other parameters. However, it can be noted that the confidence bounds are overly broad due to the small _n_ value used for sampling, so that a larger sample would be required for reliable results. For instance, the S1 index is estimated to be larger than ST for the \"random-seed\" parameter, which is an artifact of the small sample size.\n\nWe can use a more sophisticated visualization to include the second-order interactions between inputs.",
"_____no_output_____"
]
],
[
[
"import itertools\nfrom math import pi\n\n\ndef normalize(x, xmin, xmax):\n return (x-xmin)/(xmax-xmin)\n\n\ndef plot_circles(ax, locs, names, max_s, stats, smax, smin, fc, ec, lw, \n zorder):\n s = np.asarray([stats[name] for name in names])\n s = 0.01 + max_s * np.sqrt(normalize(s, smin, smax))\n \n fill = True\n for loc, name, si in zip(locs, names, s):\n if fc=='w':\n fill=False\n else:\n ec='none'\n \n x = np.cos(loc)\n y = np.sin(loc)\n \n circle = plt.Circle((x,y), radius=si, ec=ec, fc=fc, transform=ax.transData._b,\n zorder=zorder, lw=lw, fill=True)\n ax.add_artist(circle)\n \n\ndef filter(sobol_indices, names, locs, criterion, threshold):\n if criterion in ['ST', 'S1', 'S2']:\n data = sobol_indices[criterion]\n data = np.abs(data)\n data = data.flatten() # flatten in case of S2\n # TODO:: remove nans\n \n filtered = ([(name, locs[i]) for i, name in enumerate(names) if \n data[i]>threshold])\n filtered_names, filtered_locs = zip(*filtered)\n elif criterion in ['ST_conf', 'S1_conf', 'S2_conf']:\n raise NotImplementedError\n else:\n raise ValueError('unknown value for criterion')\n\n return filtered_names, filtered_locs\n\n\ndef plot_sobol_indices(sobol_indices, criterion='ST', threshold=0.01):\n '''plot sobol indices on a radial plot\n \n Parameters\n ----------\n sobol_indices : dict\n the return from SAlib\n criterion : {'ST', 'S1', 'S2', 'ST_conf', 'S1_conf', 'S2_conf'}, optional\n threshold : float\n only visualize variables with criterion larger than cutoff\n \n '''\n max_linewidth_s2 = 15#25*1.8\n max_s_radius = 0.3\n \n # prepare data\n # use the absolute values of all the indices\n #sobol_indices = {key:np.abs(stats) for key, stats in sobol_indices.items()}\n \n # dataframe with ST and S1\n sobol_stats = {key:sobol_indices[key] for key in ['ST', 'S1']}\n sobol_stats = pd.DataFrame(sobol_stats, index=problem['names'])\n\n smax = sobol_stats.max().max()\n smin = sobol_stats.min().min()\n\n # dataframe with s2\n s2 = pd.DataFrame(sobol_indices['S2'], index=problem['names'], \n columns=problem['names'])\n s2[s2<0.0]=0. 
#Set negative values to 0 (artifact from small sample sizes)\n s2max = s2.max().max()\n s2min = s2.min().min()\n\n names = problem['names']\n n = len(names)\n ticklocs = np.linspace(0, 2*pi, n+1)\n locs = ticklocs[0:-1]\n\n filtered_names, filtered_locs = filter(sobol_indices, names, locs,\n criterion, threshold)\n \n # setup figure\n fig = plt.figure()\n ax = fig.add_subplot(111, polar=True)\n ax.grid(False)\n ax.spines['polar'].set_visible(False)\n ax.set_xticks(ticklocs)\n\n ax.set_xticklabels(names)\n ax.set_yticklabels([])\n ax.set_ylim(ymax=1.4)\n legend(ax)\n\n # plot ST\n plot_circles(ax, filtered_locs, filtered_names, max_s_radius, \n sobol_stats['ST'], smax, smin, 'w', 'k', 1, 9)\n\n # plot S1\n plot_circles(ax, filtered_locs, filtered_names, max_s_radius, \n sobol_stats['S1'], smax, smin, 'k', 'k', 1, 10)\n\n # plot S2\n for name1, name2 in itertools.combinations(zip(filtered_names, filtered_locs), 2):\n name1, loc1 = name1\n name2, loc2 = name2\n\n weight = s2.ix[name1, name2]\n lw = 0.5+max_linewidth_s2*normalize(weight, s2min, s2max)\n ax.plot([loc1, loc2], [1,1], c='darkgray', lw=lw, zorder=1)\n\n return fig\n\n\nfrom matplotlib.legend_handler import HandlerPatch\nclass HandlerCircle(HandlerPatch):\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize, trans):\n center = 0.5 * width - 0.5 * xdescent, 0.5 * height - 0.5 * ydescent\n p = plt.Circle(xy=center, radius=orig_handle.radius)\n self.update_prop(p, orig_handle, legend)\n p.set_transform(trans)\n return [p]\n\ndef legend(ax):\n some_identifiers = [plt.Circle((0,0), radius=5, color='k', fill=False, lw=1),\n plt.Circle((0,0), radius=5, color='k', fill=True),\n plt.Line2D([0,0.5], [0,0.5], lw=8, color='darkgray')]\n ax.legend(some_identifiers, ['ST', 'S1', 'S2'],\n loc=(1,0.75), borderaxespad=0.1, mode='expand',\n handler_map={plt.Circle: HandlerCircle()})\n\n\nsns.set_style('whitegrid')\nfig = plot_sobol_indices(Si, criterion='ST', threshold=0.005)\nfig.set_size_inches(7,7)\n#plt.savefig('JASSS figures/Figure 8 - Interactions.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/Figure 8 - Interactions.png', dpi=300, bbox_inches='tight')\nplt.show()",
"_____no_output_____"
]
],
[
[
"In this case, the sheep-gain-from-food variable has strong interactions with the wolf-gain-from-food and sheep-reproduce inputs in particular. The size of the ST and S1 circles correspond to the normalized variable importances.",
"_____no_output_____"
],
[
"Finally, the kill_workspace() function shuts down the NetLogo instance.",
"_____no_output_____"
]
],
[
[
"netlogo.kill_workspace()",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
]
]
|
d06165fcd0eb33f2d42dced43cf819c2024d6dbc | 698,867 | ipynb | Jupyter Notebook | Tennis_Time_Data_Visualization.ipynb | Tinzyl/Tennis_Time_Data_Visualization | 761964f37a7f524edf708a1174d9ee8f73334889 | [
"MIT"
]
| null | null | null | Tennis_Time_Data_Visualization.ipynb | Tinzyl/Tennis_Time_Data_Visualization | 761964f37a7f524edf708a1174d9ee8f73334889 | [
"MIT"
]
| null | null | null | Tennis_Time_Data_Visualization.ipynb | Tinzyl/Tennis_Time_Data_Visualization | 761964f37a7f524edf708a1174d9ee8f73334889 | [
"MIT"
]
| null | null | null | 107.783313 | 61,440 | 0.756047 | [
[
[
"import pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"players_time = pd.read_csv(\"players_time.csv\")",
"_____no_output_____"
],
[
"events_time = pd.read_csv(\"events_time.csv\")",
"_____no_output_____"
],
[
"serve_time = pd.read_csv(\"serve_times.csv\")",
"_____no_output_____"
],
[
"players_time",
"_____no_output_____"
],
[
"events_time",
"_____no_output_____"
],
[
"pd.options.display.max_rows = None",
"_____no_output_____"
],
[
"events_time",
"_____no_output_____"
],
[
"serve_time",
"_____no_output_____"
]
],
[
[
"## 1. Visualize The 10 Most Slow Players ",
"_____no_output_____"
]
],
[
[
"most_slow_Players = players_time[players_time[\"seconds_added_per_point\"] > 0].sort_values(by=\"seconds_added_per_point\", ascending=False).head(10)",
"_____no_output_____"
],
[
"most_slow_Players",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"player\", data=most_slow_Players)\nax.set_title(\"TOP 10 MOST SLOW PLAYERS\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Players\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 2. Visualize The 10 Most Fast Players",
"_____no_output_____"
]
],
[
[
"most_fast_Players = players_time[players_time[\"seconds_added_per_point\"] < 0].sort_values(by=\"seconds_added_per_point\").head(10)",
"_____no_output_____"
],
[
"most_fast_Players",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"player\", data=most_fast_Players)\nax.set_title(\"TOP 10 MOST FAST PLAYERS\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Players\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 3. Visualize The Time Of The Big 3",
"_____no_output_____"
]
],
[
[
"big_three_time = players_time[(players_time[\"player\"] == \"Novak Djokovic\") | (players_time[\"player\"] == \"Roger Federer\") | (players_time[\"player\"] == \"Rafael Nadal\")]",
"_____no_output_____"
],
[
"big_three_time",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"player\", data=big_three_time)\nax.set_title(\"TIME OF THE BIG THREE\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Players\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 4. Figure Out The Top 10 Surfaces That Take The Longest Time",
"_____no_output_____"
]
],
[
[
"longest_time_surfaces = events_time[events_time[\"seconds_added_per_point\"] > 0].sort_values(by=\"seconds_added_per_point\", ascending=False).head(10)",
"_____no_output_____"
],
[
"longest_time_surfaces",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"tournament\", hue=\"surface\", data=longest_time_surfaces)\nax.set_title(\"TOP 10 SURFACES THAT TAKE THE LONGEST TIME\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Tournament\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 5. Figure Out The Top 10 Surfaces That Take The Shortest Time",
"_____no_output_____"
]
],
[
[
"shortest_time_surfaces = events_time[events_time[\"seconds_added_per_point\"] < 0].sort_values(by=\"seconds_added_per_point\").head(10)",
"_____no_output_____"
],
[
"shortest_time_surfaces",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax = sns.barplot(x=\"seconds_added_per_point\", y=\"tournament\", hue=\"surface\", data=shortest_time_surfaces)\nax.set_title(\"TOP 10 SURFACES THAT TAKE THE SHORTEST TIME\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Tournament\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 6. Figure Out How The Time For The Clay Surface Has Progressed Throughout The Years",
"_____no_output_____"
]
],
[
[
"years = events_time[~events_time[\"years\"].str.contains(\"-\")]\nsorted_years_clay = years[years[\"surface\"] == \"Clay\"].sort_values(by=\"years\")",
"_____no_output_____"
],
[
"sorted_years_clay",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_clay)\nax.set_title(\"PROGRESSION OF TIME FOR THE CLAY SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ",
"_____no_output_____"
]
],
[
[
"## 7. Figure Out How The Time For The Hard Surface Has Progressed Throughout The Years",
"_____no_output_____"
]
],
[
[
"sorted_years_hard = years[years[\"surface\"] == \"Hard\"].sort_values(by=\"years\")",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_hard)\nax.set_title(\"PROGRESSION OF TIME FOR THE HARD SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ",
"_____no_output_____"
]
],
[
[
"## 8. Figure Out How The Time For The Carpet Surface Has Progressed Throughout The Years",
"_____no_output_____"
]
],
[
[
"sorted_years_carpet = years[years[\"surface\"] == \"Carpet\"].sort_values(by=\"years\")",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_carpet)\nax.set_title(\"PROGRESSION OF TIME FOR THE CARPET SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ",
"_____no_output_____"
]
],
[
[
"## 9. Figure Out How The Time For The Grass Surface Has Progressed Throughout The Years",
"_____no_output_____"
]
],
[
[
"sorted_years_grass = events_time[events_time[\"surface\"] == \"Grass\"].sort_values(by=\"years\").head(5)",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_grass)\nax.set_title(\"PROGRESSION OF TIME FOR THE GRASS SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ",
"_____no_output_____"
]
],
[
[
"## 10. Figure Out The Person Who Took The Most Time Serving In 2015",
"_____no_output_____"
]
],
[
[
"serve_time",
"_____no_output_____"
],
[
"serve_time_visualization = serve_time.groupby(\"server\")[\"seconds_before_next_point\"].agg(\"sum\")",
"_____no_output_____"
],
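[
"# One-step alternative (illustrative, added): as_index=False keeps 'server' as a\n# regular column, avoiding the separate reset_index() call used below.\nserve_time.groupby(\"server\", as_index=False)[\"seconds_before_next_point\"].sum()",
"_____no_output_____"
],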
[
"serve_time_visualization",
"_____no_output_____"
],
[
"serve_time_visual_data = serve_time_visualization.reset_index()",
"_____no_output_____"
],
[
"serve_time_visual_data",
"_____no_output_____"
],
[
"serve_time_visual_sorted = serve_time_visual_data.sort_values(by=\"seconds_before_next_point\", ascending = False)",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax = sns.barplot(x=\"seconds_before_next_point\", y=\"server\", data=serve_time_visual_sorted)\nax.set_title(\"PLAYERS TOTAL SERVING TIME(2015) \", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Player\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"### BIG THREE TOTAL SERVING TIME IN 2015",
"_____no_output_____"
]
],
[
[
"big_three_total_serving_time = serve_time_visual_sorted[(serve_time_visual_sorted[\"server\"] == \"Roger Federer\") | (serve_time_visual_sorted[\"server\"] == \"Rafael Nadal\") | (serve_time_visual_sorted[\"server\"] == \"Novak Djokovic\")]",
"_____no_output_____"
],
[
"big_three_total_serving_time",
"_____no_output_____"
],
[
"sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax = sns.barplot(x=\"seconds_before_next_point\", y=\"server\", data=big_three_total_serving_time)\nax.set_title(\"BIG THREE TOTAL SERVING TIME(2015) \", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Player\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Conclusion",
"_____no_output_____"
],
[
"### Matches are short when they are played on a Grass, Carpet or Hard Surface. Grass however has proved to let matches be way more short compared to the other 2. \n\n### Clay Surfaces have proved to make matches last so long. \n\n### In 2015, among the Big Three, Novak Djokovic took the shortest time serving followed by Rafael Nadal. Roger Federer took the longest time serving. Overall however, Roger Federer proved to have the shortest time serving over the past years, followed by Novak Djokovic. Rafael Nadal has proved to have the longest time serving over the past years, making the matches that he is involved in last longer.",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
]
]
|
d06168f1a481bfd64b39e2c63dac1b24eb7a07b8 | 125,628 | ipynb | Jupyter Notebook | notebooks/old/Salt_9-resne34-with-highLR-derper.ipynb | GilesStrong/Kaggle_TGS-Salt | b47a468ee464581f1b843fdf3bc1230222982277 | [
"Apache-2.0"
]
| null | null | null | notebooks/old/Salt_9-resne34-with-highLR-derper.ipynb | GilesStrong/Kaggle_TGS-Salt | b47a468ee464581f1b843fdf3bc1230222982277 | [
"Apache-2.0"
]
| null | null | null | notebooks/old/Salt_9-resne34-with-highLR-derper.ipynb | GilesStrong/Kaggle_TGS-Salt | b47a468ee464581f1b843fdf3bc1230222982277 | [
"Apache-2.0"
]
| null | null | null | 49.111806 | 26,436 | 0.587003 | [
[
[
"%matplotlib inline\n%reload_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"from fastai.conv_learner import *\nfrom fastai.dataset import *\nfrom fastai.models.resnet import vgg_resnet50\n\nimport json",
"_____no_output_____"
],
[
"#torch.cuda.set_device(2)",
"_____no_output_____"
],
[
"torch.backends.cudnn.benchmark=True",
"_____no_output_____"
]
],
[
[
"## Data",
"_____no_output_____"
]
],
[
[
"PATH = Path('/home/giles/Downloads/fastai_data/salt/')\nMASKS_FN = 'train_masks.csv'\nMETA_FN = 'metadata.csv'\nmasks_csv = pd.read_csv(PATH/MASKS_FN)\nmeta_csv = pd.read_csv(PATH/META_FN)",
"_____no_output_____"
],
[
"def show_img(im, figsize=None, ax=None, alpha=None):\n if not ax: fig,ax = plt.subplots(figsize=figsize)\n ax.imshow(im, alpha=alpha)\n ax.set_axis_off()\n return ax",
"_____no_output_____"
],
[
"(PATH/'train_masks-128').mkdir(exist_ok=True)",
"_____no_output_____"
],
[
"def resize_img(fn):\n Image.open(fn).resize((128,128)).save((fn.parent.parent)/'train_masks-128'/fn.name)\n\nfiles = list((PATH/'train_masks').iterdir())\nwith ThreadPoolExecutor(8) as e: e.map(resize_img, files)",
"_____no_output_____"
],
[
"(PATH/'train-128').mkdir(exist_ok=True)",
"_____no_output_____"
],
[
"def resize_img(fn):\n Image.open(fn).resize((128,128)).save((fn.parent.parent)/'train-128'/fn.name)\n\nfiles = list((PATH/'train').iterdir())\nwith ThreadPoolExecutor(8) as e: e.map(resize_img, files)",
"_____no_output_____"
],
[
"TRAIN_DN = 'train-128'\nMASKS_DN = 'train_masks-128'\nsz = 32\nbs = 64\nnw = 16",
"_____no_output_____"
]
],
[
[
"TRAIN_DN = 'train'\nMASKS_DN = 'train_masks_png'\nsz = 128\nbs = 64\nnw = 16",
"_____no_output_____"
]
],
[
[
"class MatchedFilesDataset(FilesDataset):\n def __init__(self, fnames, y, transform, path):\n self.y=y\n assert(len(fnames)==len(y))\n super().__init__(fnames, transform, path)\n def get_y(self, i): return open_image(os.path.join(self.path, self.y[i]))\n def get_c(self): return 0",
"_____no_output_____"
],
[
"x_names = np.array(glob(f'{PATH}/{TRAIN_DN}/*'))\ny_names = np.array(glob(f'{PATH}/{MASKS_DN}/*'))",
"_____no_output_____"
],
[
"val_idxs = list(range(800))\n((val_x,trn_x),(val_y,trn_y)) = split_by_idx(val_idxs, x_names, y_names)",
"_____no_output_____"
],
[
"aug_tfms = [RandomFlip(tfm_y=TfmType.CLASS)]",
"_____no_output_____"
],
[
"tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm",
"_____no_output_____"
],
[
"x,y = next(iter(md.trn_dl))",
"_____no_output_____"
],
[
"x.shape,y.shape",
"_____no_output_____"
],
[
"denorm = md.val_ds.denorm",
"_____no_output_____"
],
[
"def show_aug_img(ims, idx, figsize=(5,5), normed=True, ax=None, nchannels=3):\n if ax is None: fig,ax = plt.subplots(figsize=figsize)\n if normed: ims = denorm(ims)\n else: ims = np.rollaxis(to_np(ims),1,nchannels+1)\n ax.imshow(np.clip(ims,0,1)[idx])\n ax.axis('off')",
"_____no_output_____"
],
[
"batches = [next(iter(md.aug_dl)) for i in range(9)]",
"_____no_output_____"
],
[
"fig, axes = plt.subplots(3, 6, figsize=(18, 9))\nfor i,(x,y) in enumerate(batches):\n show_aug_img(x,1, ax=axes.flat[i*2])\n show_aug_img(y,1, ax=axes.flat[i*2+1], nchannels=1, normed=False)",
"_____no_output_____"
]
],
[
[
"## Simple upsample",
"_____no_output_____"
]
],
[
[
"f = resnet34\ncut,lr_cut = model_meta[f]",
"_____no_output_____"
],
[
"def get_base():\n layers = cut_model(f(True), cut)\n return nn.Sequential(*layers)",
"_____no_output_____"
],
[
"def dice(pred, targs):\n pred = (pred>0.5).float()\n return 2. * (pred*targs).sum() / (pred+targs).sum()",
"_____no_output_____"
]
],
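[
[
"# Tiny sanity check (added for illustration; assumes torch is available via the\n# fastai star-imports above and accepts plain tensors): predictions that threshold\n# to the exact target mask give a Dice score of 1.0.\nimport torch\npred = torch.tensor([0.9, 0.2, 0.8, 0.1])\ntarg = torch.tensor([1., 0., 1., 0.])\ndice(pred, targ)",
"_____no_output_____"
]
],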
[
[
"## U-net (ish)",
"_____no_output_____"
]
],
[
[
"class SaveFeatures():\n features=None\n def __init__(self, m): self.hook = m.register_forward_hook(self.hook_fn)\n def hook_fn(self, module, input, output): self.features = output\n def remove(self): self.hook.remove()",
"_____no_output_____"
],
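[
"# Minimal usage sketch (added for illustration; assumes a PyTorch version where\n# tensors can be passed to modules directly): hook a small module, run a forward\n# pass, then read the captured activations from .features.\nimport torch, torch.nn as nn\nlin = nn.Linear(4, 2)\nsf = SaveFeatures(lin)\n_ = lin(torch.randn(1, 4))\nprint(sf.features.shape)\nsf.remove()",
"_____no_output_____"
],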
[
"class UnetBlock(nn.Module):\n def __init__(self, up_in, x_in, n_out):\n super().__init__()\n up_out = x_out = n_out//2\n self.x_conv = nn.Conv2d(x_in, x_out, 1)\n self.tr_conv = nn.ConvTranspose2d(up_in, up_out, 2, stride=2)\n self.bn = nn.BatchNorm2d(n_out)\n \n def forward(self, up_p, x_p):\n up_p = self.tr_conv(up_p)\n x_p = self.x_conv(x_p)\n cat_p = torch.cat([up_p,x_p], dim=1)\n return self.bn(F.relu(cat_p))",
"_____no_output_____"
],
[
"class Unet34(nn.Module):\n def __init__(self, rn):\n super().__init__()\n self.rn = rn\n self.sfs = [SaveFeatures(rn[i]) for i in [2,4,5,6]]\n self.up1 = UnetBlock(512,256,256)\n self.up2 = UnetBlock(256,128,256)\n self.up3 = UnetBlock(256,64,256)\n self.up4 = UnetBlock(256,64,256)\n self.up5 = UnetBlock(256,3,16)\n self.up6 = nn.ConvTranspose2d(16, 1, 1)\n \n def forward(self,x):\n inp = x\n x = F.relu(self.rn(x))\n x = self.up1(x, self.sfs[3].features)\n x = self.up2(x, self.sfs[2].features)\n x = self.up3(x, self.sfs[1].features)\n x = self.up4(x, self.sfs[0].features)\n x = self.up5(x, inp)\n x = self.up6(x)\n return x[:,0]\n \n def close(self):\n for sf in self.sfs: sf.remove()",
"_____no_output_____"
],
[
"class UnetModel():\n def __init__(self,model,name='unet'):\n self.model,self.name = model,name\n\n def get_layer_groups(self, precompute):\n lgs = list(split_by_idxs(children(self.model.rn), [lr_cut]))\n return lgs + [children(self.model)[1:]]",
"_____no_output_____"
],
[
"m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)",
"_____no_output_____"
],
[
"learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]",
"_____no_output_____"
],
[
"learn.summary()",
"_____no_output_____"
],
[
"[o.features.size() for o in m.sfs]",
"_____no_output_____"
],
[
"learn.freeze_to(1)",
"_____no_output_____"
],
[
"learn.lr_find()\nlearn.sched.plot()",
"_____no_output_____"
],
[
"lr=1e-2\nwd=1e-7\n\nlrs = np.array([lr/9,lr/3,lr])",
"_____no_output_____"
],
[
"learn.fit(lr,1,wds=wd,cycle_len=10,use_clr=(5,8))",
"_____no_output_____"
],
[
"learn.save('32urn-tmp')",
"_____no_output_____"
],
[
"learn.load('32urn-tmp')",
"_____no_output_____"
],
[
"learn.unfreeze()\nlearn.bn_freeze(True)",
"_____no_output_____"
],
[
"learn.fit(lrs/4, 1, wds=wd, cycle_len=20,use_clr=(20,10))",
"_____no_output_____"
],
[
"learn.sched.plot_lr()",
"_____no_output_____"
],
[
"learn.save('32urn-0')",
"_____no_output_____"
],
[
"learn.load('32urn-0')",
"_____no_output_____"
],
[
"x,y = next(iter(md.val_dl))\npy = to_np(learn.model(V(x)))",
"_____no_output_____"
],
[
"show_img(py[0]>0.5);",
"_____no_output_____"
],
[
"show_img(y[0]);",
"_____no_output_____"
],
[
"show_img(x[0][0]);",
"_____no_output_____"
],
[
"m.close()",
"_____no_output_____"
]
],
[
[
"## 64x64",
"_____no_output_____"
]
],
[
[
"sz=64\nbs=64",
"_____no_output_____"
],
[
"tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm",
"_____no_output_____"
],
[
"m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)",
"_____no_output_____"
],
[
"learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]",
"_____no_output_____"
],
[
"learn.freeze_to(1)",
"_____no_output_____"
],
[
"learn.load('32urn-0')",
"_____no_output_____"
],
[
"learn.fit(lr/2,1,wds=wd, cycle_len=10,use_clr=(10,10))",
"_____no_output_____"
],
[
"learn.sched.plot_lr()",
"_____no_output_____"
],
[
"learn.save('64urn-tmp')",
"_____no_output_____"
],
[
"learn.unfreeze()\nlearn.bn_freeze(True)",
"_____no_output_____"
],
[
"learn.load('64urn-tmp')",
"_____no_output_____"
],
[
"learn.fit(lrs/4,1,wds=wd, cycle_len=8,use_clr=(20,8))",
"_____no_output_____"
],
[
"learn.sched.plot_lr()",
"_____no_output_____"
],
[
"learn.save('64urn')",
"_____no_output_____"
],
[
"learn.load('64urn')",
"_____no_output_____"
],
[
"x,y = next(iter(md.val_dl))\npy = to_np(learn.model(V(x)))",
"_____no_output_____"
],
[
"show_img(py[0]>0.5);",
"_____no_output_____"
],
[
"show_img(y[0]);",
"_____no_output_____"
],
[
"show_img(x[0][0]);",
"_____no_output_____"
],
[
"m.close()",
"_____no_output_____"
]
],
[
[
"## 128x128",
"_____no_output_____"
]
],
[
[
"sz=128\nbs=64",
"_____no_output_____"
],
[
"tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm",
"_____no_output_____"
],
[
"m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)",
"_____no_output_____"
],
[
"learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]",
"_____no_output_____"
],
[
"learn.load('64urn')",
"_____no_output_____"
],
[
"learn.fit(lr/2,1, wds=wd, cycle_len=6,use_clr=(6,4))",
"_____no_output_____"
],
[
"learn.save('128urn-tmp')",
"_____no_output_____"
],
[
"learn.load('128urn-tmp')",
"_____no_output_____"
],
[
"learn.unfreeze()\nlearn.bn_freeze(True)",
"_____no_output_____"
],
[
"#lrs = np.array([lr/200,lr/30,lr])",
"_____no_output_____"
],
[
"learn.fit(lrs/5,1, wds=wd,cycle_len=8,use_clr=(20,8))",
"_____no_output_____"
],
[
"learn.sched.plot_lr()",
"_____no_output_____"
],
[
"learn.sched.plot_loss()",
"_____no_output_____"
],
[
"learn.save('128urn')",
"_____no_output_____"
],
[
"learn.load('128urn')",
"_____no_output_____"
],
[
"x,y = next(iter(md.val_dl))\npy = to_np(learn.model(V(x)))",
"_____no_output_____"
],
[
"show_img(py[0]>0.5);",
"_____no_output_____"
],
[
"show_img(y[0]);",
"_____no_output_____"
],
[
"show_img(x[0][0]);",
"_____no_output_____"
],
[
"y.shape",
"_____no_output_____"
],
[
"batches = [next(iter(md.aug_dl)) for i in range(9)]",
"_____no_output_____"
],
[
"fig, axes = plt.subplots(3, 6, figsize=(18, 9))\nfor i,(x,y) in enumerate(batches):\n show_aug_img(x,1, ax=axes.flat[i*2])\n show_aug_img(y,1, ax=axes.flat[i*2+1], nchannels=1, normed=False)",
"_____no_output_____"
]
],
[
[
"# Test on original validation",
"_____no_output_____"
]
],
[
[
"x_names_orig = np.array(glob(f'{PATH}/train/*'))\ny_names_orig = np.array(glob(f'{PATH}/train_masks/*'))",
"_____no_output_____"
],
[
"val_idxs_orig = list(range(800))\n((val_x_orig,trn_x_orig),(val_y_orig,trn_y_orig)) = split_by_idx(val_idxs_orig, x_names_orig, y_names_orig)",
"_____no_output_____"
],
[
"sz=128\nbs=64",
"_____no_output_____"
],
[
"tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm",
"_____no_output_____"
],
[
"m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)",
"_____no_output_____"
],
[
"learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]",
"_____no_output_____"
],
[
"learn.load('128urn')",
"_____no_output_____"
],
[
"probs = learn.predict()",
"_____no_output_____"
],
[
"probs.shape",
"_____no_output_____"
],
[
"_, y = learn.TTA(n_aug=1)",
"_____no_output_____"
],
[
"y.shape",
"_____no_output_____"
],
[
"idx=0",
"_____no_output_____"
],
[
"show_img(probs[idx]>0.5);",
"_____no_output_____"
],
[
"show_img(probs[idx]);",
"_____no_output_____"
],
[
"show_img(y[idx]);",
"_____no_output_____"
],
[
"show_img(x[idx][0]);",
"_____no_output_____"
]
],
[
[
"# Optimise threshold",
"_____no_output_____"
]
],
[
[
"# src: https://www.kaggle.com/aglotero/another-iou-metric\ndef iou_metric(y_true_in, y_pred_in, print_table=False):\n labels = y_true_in\n y_pred = y_pred_in\n \n true_objects = 2\n pred_objects = 2\n\n intersection = np.histogram2d(labels.flatten(), y_pred.flatten(), bins=(true_objects, pred_objects))[0]\n\n # Compute areas (needed for finding the union between all objects)\n area_true = np.histogram(labels, bins = true_objects)[0]\n area_pred = np.histogram(y_pred, bins = pred_objects)[0]\n area_true = np.expand_dims(area_true, -1)\n area_pred = np.expand_dims(area_pred, 0)\n\n # Compute union\n union = area_true + area_pred - intersection\n\n # Exclude background from the analysis\n intersection = intersection[1:,1:]\n union = union[1:,1:]\n union[union == 0] = 1e-9\n\n # Compute the intersection over union\n iou = intersection / union\n\n # Precision helper function\n def precision_at(threshold, iou):\n matches = iou > threshold\n true_positives = np.sum(matches, axis=1) == 1 # Correct objects\n false_positives = np.sum(matches, axis=0) == 0 # Missed objects\n false_negatives = np.sum(matches, axis=1) == 0 # Extra objects\n tp, fp, fn = np.sum(true_positives), np.sum(false_positives), np.sum(false_negatives)\n return tp, fp, fn\n\n # Loop over IoU thresholds\n prec = []\n if print_table:\n print(\"Thresh\\tTP\\tFP\\tFN\\tPrec.\")\n for t in np.arange(0.5, 1.0, 0.05):\n tp, fp, fn = precision_at(t, iou)\n if (tp + fp + fn) > 0:\n p = tp / (tp + fp + fn)\n else:\n p = 0\n if print_table:\n print(\"{:1.3f}\\t{}\\t{}\\t{}\\t{:1.3f}\".format(t, tp, fp, fn, p))\n prec.append(p)\n \n if print_table:\n print(\"AP\\t-\\t-\\t-\\t{:1.3f}\".format(np.mean(prec)))\n return np.mean(prec)\n\ndef iou_metric_batch(y_true_in, y_pred_in):\n batch_size = y_true_in.shape[0]\n metric = []\n for batch in range(batch_size):\n value = iou_metric(y_true_in[batch], y_pred_in[batch])\n metric.append(value)\n return np.mean(metric)",
"_____no_output_____"
],
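[
"# Sanity check (added for illustration): a prediction identical to the ground-truth\n# mask scores an average precision of 1.0 across every IoU threshold.\ntoy_mask = np.array([[0, 1], [1, 1]])\niou_metric(toy_mask, toy_mask)",
"_____no_output_____"
],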
[
"thres = np.linspace(-1, 1, 10)\nthres_ioc = [iou_metric_batch(y, np.int32(probs > t)) for t in tqdm_notebook(thres)]",
"_____no_output_____"
],
[
"plt.plot(thres, thres_ioc);",
"_____no_output_____"
],
[
"best_thres = thres[np.argmax(thres_ioc)]\nbest_thres, max(thres_ioc)",
"_____no_output_____"
],
[
"thres = np.linspace(-0.5, 0.5, 50)\nthres_ioc = [iou_metric_batch(y, np.int32(probs > t)) for t in tqdm_notebook(thres)]",
"_____no_output_____"
],
[
"plt.plot(thres, thres_ioc);",
"_____no_output_____"
],
[
"best_thres = thres[np.argmax(thres_ioc)]\nbest_thres, max(thres_ioc)",
"_____no_output_____"
],
[
"show_img(probs[0]>best_thres);",
"_____no_output_____"
]
],
[
[
"# Run on test",
"_____no_output_____"
]
],
[
[
"(PATH/'test-128').mkdir(exist_ok=True)",
"_____no_output_____"
],
[
"def resize_img(fn):\n Image.open(fn).resize((128,128)).save((fn.parent.parent)/'test-128'/fn.name)\n\nfiles = list((PATH/'test').iterdir())\nwith ThreadPoolExecutor(8) as e: e.map(resize_img, files)",
"_____no_output_____"
],
[
"testData = np.array(glob(f'{PATH}/test-128/*'))",
"_____no_output_____"
],
[
"class TestFilesDataset(FilesDataset):\n def __init__(self, fnames, y, transform, path):\n self.y=y\n assert(len(fnames)==len(y))\n super().__init__(fnames, transform, path)\n def get_y(self, i): return open_image(os.path.join(self.path, self.fnames[i]))\n def get_c(self): return 0",
"_____no_output_____"
],
[
"tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(TestFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, test=testData, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm",
"_____no_output_____"
],
[
"m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)",
"_____no_output_____"
],
[
"learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]",
"_____no_output_____"
],
[
"learn.load('128urn')",
"_____no_output_____"
],
[
"x,y = next(iter(md.test_dl))\npy = to_np(learn.model(V(x)))",
"_____no_output_____"
],
[
"show_img(py[6]>best_thres);",
"_____no_output_____"
],
[
"show_img(py[6]);",
"_____no_output_____"
],
[
"show_img(y[6]);",
"_____no_output_____"
],
[
"probs = learn.predict(is_test=True)",
"_____no_output_____"
],
[
"show_img(probs[12]>best_thres);",
"_____no_output_____"
],
[
"show_img(probs[12]);",
"_____no_output_____"
],
[
"show_img(y[12]);",
"_____no_output_____"
],
[
"show_img(x[12][0]);",
"_____no_output_____"
],
[
"with open(f'{PATH}/probs.pkl', 'wb') as fout: #Save results\n pickle.dump(probs, fout)",
"_____no_output_____"
],
[
"probs.shape",
"_____no_output_____"
],
[
"def resize_img(fn):\n return np.array(Image.fromarray(fn).resize((101,101)))\n\nresizePreds = np.array([resize_img(x) for x in probs])",
"_____no_output_____"
],
[
"resizePreds.shape",
"_____no_output_____"
],
[
"show_img(resizePreds[12]);",
"_____no_output_____"
],
[
"testData",
"_____no_output_____"
],
[
"f'{PATH}/test'",
"_____no_output_____"
],
[
"test_ids = next(os.walk(f'{PATH}/test'))[2]",
"_____no_output_____"
],
[
"def RLenc(img, order='F', format=True):\n \"\"\"\n img is binary mask image, shape (r,c)\n order is down-then-right, i.e. Fortran\n format determines if the order needs to be preformatted (according to submission rules) or not\n\n returns run length as an array or string (if format is True)\n \"\"\"\n bytes = img.reshape(img.shape[0] * img.shape[1], order=order)\n runs = [] ## list of run lengths\n r = 0 ## the current run length\n pos = 1 ## count starts from 1 per WK\n for c in bytes:\n if (c == 0):\n if r != 0:\n runs.append((pos, r))\n pos += r\n r = 0\n pos += 1\n else:\n r += 1\n\n # if last run is unsaved (i.e. data ends with 1)\n if r != 0:\n runs.append((pos, r))\n pos += r\n r = 0\n\n if format:\n z = ''\n\n for rr in runs:\n z += '{} {} '.format(rr[0], rr[1])\n return z[:-1]\n else:\n return runs",
"_____no_output_____"
],
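[
"# Worked example (added for illustration): the 2x2 mask [[1,0],[1,1]] flattens in\n# Fortran (column-major) order to [1,1,0,1], i.e. a run of 2 starting at pixel 1\n# and a run of 1 starting at pixel 4, encoded as '1 2 4 1'.\nRLenc(np.array([[1, 0], [1, 1]]))",
"_____no_output_____"
],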
[
"pred_dict = {id_[:-4]:RLenc(np.round(resizePreds[i] > best_thres)) for i,id_ in tqdm_notebook(enumerate(test_ids))}",
"_____no_output_____"
],
[
"sub = pd.DataFrame.from_dict(pred_dict,orient='index')\nsub.index.names = ['id']\nsub.columns = ['rle_mask']\nsub.to_csv('submission.csv')",
"_____no_output_____"
],
[
"sub",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d06175055f3580333d9c8a5d91550bc2d22d341e | 33,696 | ipynb | Jupyter Notebook | bin/Jonghoo/altProj/proj/python/Music Recognition.ipynb | axcer2126/DINF | 04bc17c5c7835da77debfef4ae7acd62a769585a | [
"MIT"
]
| null | null | null | bin/Jonghoo/altProj/proj/python/Music Recognition.ipynb | axcer2126/DINF | 04bc17c5c7835da77debfef4ae7acd62a769585a | [
"MIT"
]
| null | null | null | bin/Jonghoo/altProj/proj/python/Music Recognition.ipynb | axcer2126/DINF | 04bc17c5c7835da77debfef4ae7acd62a769585a | [
"MIT"
]
| 8 | 2020-09-18T05:46:42.000Z | 2020-11-03T07:20:02.000Z | 37.069307 | 272 | 0.485785 | [
[
[
"# feature extractoring and preprocessing data\n# ์์ ๋ฐ์ดํฐ๋ฅผ ๋ถ์\nimport librosa\n\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# notebook์ ์คํํ ๋ธ๋ผ์ฐ์ ์์ ๋ฐ๋ก ๊ทธ๋ฆผ์ ๋ณผ ์ ์๊ฒ ํด์ฃผ๋ ๊ฒ\n%matplotlib inline\n\n# ์ด์์ฒด์ ์์ ์ํธ์์ฉ์ ๋๋ ๋ค์ํ ๊ธฐ๋ฅ์ ์ ๊ณต\n# 1. ํ์ฌ ๋๋ ํ ๋ฆฌ ํ์ธํ๊ธฐ\n# 2. ๋๋ ํ ๋ฆฌ ๋ณ๊ฒฝ\n# 3. ํ์ฌ ๋๋ ํ ๋ฆฌ์ ํ์ผ ๋ชฉ๋ก ํ์ธํ๊ธฐ\n# 4. csv ํ์ผ ํธ์ถ\nimport os\n\n# ํ์ด์ฌ์์์ ์ด๋ฏธ์ง ์ฒ๋ฆฌ\nfrom PIL import Image\n\nimport pathlib\nimport csv\n\n# Preprocessing\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.preprocessing import LabelEncoder, StandardScaler\nfrom sklearn.metrics import mean_squared_error\n\n#Keras\nimport keras\n\n# ๊ฒฝ๊ณ ๋ฉ์์ง๋ฅผ ๋ฌด์ํ๊ณ ์จ๊ธฐ๊ฑฐ๋ -> warnings.filterwarnings(action='ignore')\n# ์ผ์นํ๋ ๊ฒฝ๊ณ ๋ฅผ ์ธ์ํ์ง ์์ต๋๋ค = ('ignore')\nimport warnings\nwarnings.filterwarnings('ignore')",
"Using TensorFlow backend.\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint32 = np.dtype([(\"qint32\", np.int32, 
1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n"
],
[
"# ์ํ๋ ์ข
๋ฅ์ ์๊น๋ง ๋๊ฒจ์ฃผ๋ ๊ฒ\ncmap = plt.get_cmap('inferno')\n\nplt.figure(figsize=(10,10))\ngenres = 'blues classical country disco hiphop jazz metal pop reggae rock'.split()\nfor g in genres:\n pathlib.Path(f'img_data/{g}').mkdir(parents=True, exist_ok=True) \n for filename in os.listdir(f'./MIR/genres/{g}'):\n songname = f'./MIR/genres/{g}/{filename}'\n y, sr = librosa.load(songname, mono=True, duration=5)\n plt.specgram(y, NFFT=2048, Fs=2, Fc=0, noverlap=128, cmap=cmap, sides='default', mode='default', scale='dB');\n plt.axis('off');\n plt.savefig(f'img_data/{g}/{filename[:-3].replace(\".\", \"\")}.png')\n plt.clf()",
"_____no_output_____"
],
[
"header = 'filename chroma_stft rmse spectral_centroid spectral_bandwidth rolloff zero_crossing_rate'\nfor i in range(1, 21):\n header += f' mfcc{i}'\nheader += ' label'\nheader = header.split()",
"_____no_output_____"
],
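[
"# Added check (a small sketch, not in the original run): the header should contain\n# filename + 6 summary features + 20 mfcc columns + label = 28 column names.\nprint(len(header), header[:3], header[-3:])",
"_____no_output_____"
],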
[
"file = open('data.csv', 'w', newline='')\nwith file:\n writer = csv.writer(file)\n writer.writerow(header)\ngenres = 'blues classical country disco hiphop jazz metal pop reggae rock'.split()\nfor g in genres:\n for filename in os.listdir(f'./MIR/genres/{g}'):\n songname = f'./MIR/genres/{g}/{filename}'\n y, sr = librosa.load(songname, mono=True, duration=30)\n chroma_stft = librosa.feature.chroma_stft(y=y, sr=sr)\n spec_cent = librosa.feature.spectral_centroid(y=y, sr=sr)\n spec_bw = librosa.feature.spectral_bandwidth(y=y, sr=sr)\n rolloff = librosa.feature.spectral_rolloff(y=y, sr=sr)\n zcr = librosa.feature.zero_crossing_rate(y)\n mfcc = librosa.feature.mfcc(y=y, sr=sr)\n #rmse = mean_squared_error(y, y_pred=sr)**0.5\n rmse = librosa.feature.rms(y=y)\n to_append = f'{filename} {np.mean(chroma_stft)} {np.mean(rmse)} {np.mean(spec_cent)} {np.mean(spec_bw)} {np.mean(rolloff)} {np.mean(zcr)}' \n for e in mfcc:\n to_append += f' {np.mean(e)}'\n to_append += f' {g}'\n file = open('data.csv', 'a', newline='')\n with file:\n writer = csv.writer(file)\n writer.writerow(to_append.split())",
"_____no_output_____"
],
[
"# mfcc = ์ค๋์ค ์ ํธ์์ ์ถ์ถํ ์ ์๋ feature๋ก, ์๋ฆฌ์ ๊ณ ์ ํ ํน์ง์ ๋ํ๋ด๋ ์์น\n# = ๋ฑ๋ก๋ ์์ฑ๊ณผ ํ์ฌ ์
๋ ฅ๋ ์์ฑ์ ์ ์ฌ๋๋ฅผ ํ๋ณํ๋ ๊ทผ๊ฑฐ์ ์ผ๋ถ๋ก ์ฐ์
๋๋ค.\n# = MFCC(Mel-Frequency Cepstral Coefficient)๋\n# Mel Spectrum(๋ฉ ์คํํธ๋ผ)์์ Cepstral(์ผ์คํธ๋ด) ๋ถ์์ ํตํด ์ถ์ถ๋ ๊ฐ\n# \n# ์ดํดํ๊ธฐ ์ํด ๋จผ์ \n# - Spectrum(์คํํธ๋ผ)\n# - Cepstrum(์ผ์คํธ๋ผ)\n# - Mel Spectrum(๋ฉ ์คํํธ๋ผ) ๋ค์ ์์์ผ ํ๋ค.",
"_____no_output_____"
],
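[
"# A minimal sketch (added example, not part of the original analysis): compute the MFCC\n# matrix for a single clip to inspect its shape, assuming the same ./MIR/genres layout\n# used above ('blues' and the first file in it are arbitrary picks).\nsample_path = './MIR/genres/blues/' + os.listdir('./MIR/genres/blues')[0]\ny_demo, sr_demo = librosa.load(sample_path, mono=True, duration=30)\nmfcc_demo = librosa.feature.mfcc(y=y_demo, sr=sr_demo)\n# rows = 20 coefficients by default; columns = analysis frames over time\nprint(mfcc_demo.shape)",
"_____no_output_____"
],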
[
"data = pd.read_csv('data.csv')\ndata.head()\n\n# chroma_stft = ์ฑ๋_? , ํฌ๋ก๋ง ํ์ค\n# spectral_centroid = ์คํํธ๋ผ ์ค์ฌ\n# spectral_bandwidth = ์คํํธ๋ผ ๋์ญํญ\n# rolloff = ๋กค ์คํ\n# zero_crossing_rate = ์ ๋ก ํฌ๋ก์ฑ ๋น์จ\n# \n# mfcc[n] = ",
"_____no_output_____"
],
[
"data.shape",
"_____no_output_____"
],
[
"# Dropping unneccesary columns\ndata = data.drop(['filename'],axis=1)",
"_____no_output_____"
],
[
"genre_list = data.iloc[:, -1]\nencoder = LabelEncoder()\ny = encoder.fit_transform(genre_list)",
"_____no_output_____"
],
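[
"# Added sanity check (a small sketch, not in the original run): LabelEncoder assigns\n# integer ids to the genres in alphabetical order; this prints the genre -> id mapping.\nprint(dict(zip(encoder.classes_, range(len(encoder.classes_)))))",
"_____no_output_____"
],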
[
"scaler = StandardScaler()\nX = scaler.fit_transform(np.array(data.iloc[:, :-1], dtype = float))",
"_____no_output_____"
],
[
"X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)",
"_____no_output_____"
],
[
"len(y_train)",
"_____no_output_____"
],
[
"len(y_test)",
"_____no_output_____"
],
[
"X_train[10]",
"_____no_output_____"
],
[
"from keras import models\nfrom keras import layers\n\nmodel = models.Sequential()\nmodel.add(layers.Dense(256, activation='relu', input_shape=(X_train.shape[1],)))\n\nmodel.add(layers.Dense(128, activation='relu'))\n\nmodel.add(layers.Dense(64, activation='relu'))\n\nmodel.add(layers.Dense(10, activation='softmax'))",
"WARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:66: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.\n\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:541: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:4432: The name tf.random_uniform is deprecated. Please use tf.random.uniform instead.\n\n"
],
[
"model.compile(optimizer='adam',\n loss='sparse_categorical_crossentropy',\n metrics=['accuracy'])",
"WARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/optimizers.py:793: The name tf.train.Optimizer is deprecated. Please use tf.compat.v1.train.Optimizer instead.\n\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:3622: The name tf.log is deprecated. Please use tf.math.log instead.\n\n"
],
[
"history = model.fit(X_train,\n y_train,\n epochs=20,\n batch_size=128)",
"WARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/ops/math_grad.py:1250: add_dispatch_support.<locals>.wrapper (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\nInstructions for updating:\nUse tf.where in 2.0, which has the same broadcast rule as np.where\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:1033: The name tf.assign_add is deprecated. Please use tf.compat.v1.assign_add instead.\n\nEpoch 1/20\n800/800 [==============================] - 1s 1ms/step - loss: 2.1563 - acc: 0.2263\nEpoch 2/20\n800/800 [==============================] - 0s 45us/step - loss: 1.8502 - acc: 0.3887\nEpoch 3/20\n800/800 [==============================] - 0s 30us/step - loss: 1.6190 - acc: 0.4163\nEpoch 4/20\n800/800 [==============================] - 0s 28us/step - loss: 1.4466 - acc: 0.4863\nEpoch 5/20\n800/800 [==============================] - 0s 15us/step - loss: 1.3198 - acc: 0.5587\nEpoch 6/20\n800/800 [==============================] - 0s 11us/step - loss: 1.2189 - acc: 0.5663\nEpoch 7/20\n800/800 [==============================] - 0s 14us/step - loss: 1.1357 - acc: 0.5988\nEpoch 8/20\n800/800 [==============================] - 0s 14us/step - loss: 1.0649 - acc: 0.6450\nEpoch 9/20\n800/800 [==============================] - 0s 11us/step - loss: 1.0059 - acc: 0.6625\nEpoch 10/20\n800/800 [==============================] - 0s 13us/step - loss: 0.9525 - acc: 0.6925\nEpoch 11/20\n800/800 [==============================] - 0s 13us/step - loss: 0.9039 - acc: 0.7025\nEpoch 12/20\n800/800 [==============================] - 0s 12us/step - loss: 0.8633 - acc: 0.7150\nEpoch 13/20\n800/800 [==============================] - 0s 13us/step - loss: 0.8188 - acc: 0.7350\nEpoch 14/20\n800/800 [==============================] - 0s 14us/step - loss: 0.7868 - acc: 0.7425\nEpoch 15/20\n800/800 [==============================] - 0s 12us/step - loss: 0.7527 - acc: 0.7475\nEpoch 16/20\n800/800 [==============================] - 0s 12us/step - loss: 0.7272 - acc: 0.7575\nEpoch 17/20\n800/800 [==============================] - 0s 13us/step - loss: 0.7033 - acc: 0.7688\nEpoch 18/20\n800/800 [==============================] - 0s 12us/step - loss: 0.6679 - acc: 0.7737\nEpoch 19/20\n800/800 [==============================] - 0s 12us/step - loss: 0.6405 - acc: 0.7925\nEpoch 20/20\n800/800 [==============================] - 0s 11us/step - loss: 0.6022 - acc: 0.8125\n"
],
[
"test_loss, test_acc = model.evaluate(X_test,y_test)",
"200/200 [==============================] - 0s 115us/step\n"
],
[
"print('test_acc: ',test_acc)",
"test_acc: 0.73\n"
],
[
"x_val = X_train[:200]\npartial_x_train = X_train[200:]\n\ny_val = y_train[:200]\npartial_y_train = y_train[200:]",
"_____no_output_____"
],
[
"\nmodel = models.Sequential()\nmodel.add(layers.Dense(512, activation='relu', input_shape=(X_train.shape[1],)))\nmodel.add(layers.Dense(256, activation='relu'))\nmodel.add(layers.Dense(128, activation='relu'))\nmodel.add(layers.Dense(64, activation='relu'))\nmodel.add(layers.Dense(10, activation='softmax'))\n\nmodel.compile(optimizer='adam',\n loss='sparse_categorical_crossentropy',\n metrics=['accuracy'])\n\nmodel.fit(partial_x_train,\n partial_y_train,\n epochs=30,\n batch_size=512,\n validation_data=(x_val, y_val))\nresults = model.evaluate(X_test, y_test)",
"Train on 600 samples, validate on 200 samples\nEpoch 1/30\n600/600 [==============================] - 0s 341us/step - loss: 2.2897 - acc: 0.1167 - val_loss: 2.1786 - val_acc: 0.2650\nEpoch 2/30\n600/600 [==============================] - 0s 14us/step - loss: 2.1257 - acc: 0.3317 - val_loss: 2.0688 - val_acc: 0.3150\nEpoch 3/30\n600/600 [==============================] - 0s 15us/step - loss: 1.9839 - acc: 0.4133 - val_loss: 1.9469 - val_acc: 0.3100\nEpoch 4/30\n600/600 [==============================] - 0s 17us/step - loss: 1.8296 - acc: 0.4067 - val_loss: 1.8256 - val_acc: 0.3150\nEpoch 5/30\n600/600 [==============================] - 0s 13us/step - loss: 1.6836 - acc: 0.4150 - val_loss: 1.7084 - val_acc: 0.3500\nEpoch 6/30\n600/600 [==============================] - 0s 17us/step - loss: 1.5413 - acc: 0.4633 - val_loss: 1.6188 - val_acc: 0.4150\nEpoch 7/30\n600/600 [==============================] - 0s 12us/step - loss: 1.4307 - acc: 0.5100 - val_loss: 1.5584 - val_acc: 0.4350\nEpoch 8/30\n600/600 [==============================] - 0s 14us/step - loss: 1.3362 - acc: 0.5333 - val_loss: 1.5068 - val_acc: 0.4550\nEpoch 9/30\n600/600 [==============================] - 0s 11us/step - loss: 1.2556 - acc: 0.5433 - val_loss: 1.4741 - val_acc: 0.4700\nEpoch 10/30\n600/600 [==============================] - 0s 16us/step - loss: 1.2024 - acc: 0.5833 - val_loss: 1.4555 - val_acc: 0.4750\nEpoch 11/30\n600/600 [==============================] - 0s 12us/step - loss: 1.1433 - acc: 0.5983 - val_loss: 1.4419 - val_acc: 0.5200\nEpoch 12/30\n600/600 [==============================] - 0s 16us/step - loss: 1.0761 - acc: 0.6267 - val_loss: 1.4268 - val_acc: 0.5050\nEpoch 13/30\n600/600 [==============================] - 0s 12us/step - loss: 1.0272 - acc: 0.6500 - val_loss: 1.3786 - val_acc: 0.5450\nEpoch 14/30\n600/600 [==============================] - 0s 12us/step - loss: 0.9723 - acc: 0.6783 - val_loss: 1.3623 - val_acc: 0.5200\nEpoch 15/30\n600/600 [==============================] - 0s 16us/step - loss: 0.9400 - acc: 0.6867 - val_loss: 1.3410 - val_acc: 0.5750\nEpoch 16/30\n600/600 [==============================] - 0s 13us/step - loss: 0.8821 - acc: 0.7000 - val_loss: 1.3599 - val_acc: 0.5800\nEpoch 17/30\n600/600 [==============================] - 0s 12us/step - loss: 0.8603 - acc: 0.6983 - val_loss: 1.3182 - val_acc: 0.5850\nEpoch 18/30\n600/600 [==============================] - 0s 16us/step - loss: 0.8224 - acc: 0.7233 - val_loss: 1.2646 - val_acc: 0.5850\nEpoch 19/30\n600/600 [==============================] - 0s 14us/step - loss: 0.7897 - acc: 0.7367 - val_loss: 1.2845 - val_acc: 0.5650\nEpoch 20/30\n600/600 [==============================] - 0s 13us/step - loss: 0.7486 - acc: 0.7517 - val_loss: 1.3470 - val_acc: 0.5650\nEpoch 21/30\n600/600 [==============================] - 0s 14us/step - loss: 0.7371 - acc: 0.7533 - val_loss: 1.3236 - val_acc: 0.5850\nEpoch 22/30\n600/600 [==============================] - 0s 11us/step - loss: 0.7123 - acc: 0.7517 - val_loss: 1.2596 - val_acc: 0.5950\nEpoch 23/30\n600/600 [==============================] - 0s 13us/step - loss: 0.6772 - acc: 0.7767 - val_loss: 1.2605 - val_acc: 0.5850\nEpoch 24/30\n600/600 [==============================] - 0s 15us/step - loss: 0.6618 - acc: 0.7717 - val_loss: 1.2853 - val_acc: 0.5800\nEpoch 25/30\n600/600 [==============================] - 0s 14us/step - loss: 0.6487 - acc: 0.7800 - val_loss: 1.3147 - val_acc: 0.5900\nEpoch 26/30\n600/600 [==============================] - 0s 14us/step - loss: 0.6131 - acc: 0.8117 - val_loss: 
1.3265 - val_acc: 0.6000\nEpoch 27/30\n600/600 [==============================] - 0s 14us/step - loss: 0.5971 - acc: 0.8033 - val_loss: 1.2807 - val_acc: 0.6000\nEpoch 28/30\n600/600 [==============================] - 0s 12us/step - loss: 0.5631 - acc: 0.8283 - val_loss: 1.2866 - val_acc: 0.5800\nEpoch 29/30\n600/600 [==============================] - 0s 14us/step - loss: 0.5477 - acc: 0.8350 - val_loss: 1.2839 - val_acc: 0.5750\nEpoch 30/30\n600/600 [==============================] - 0s 13us/step - loss: 0.5210 - acc: 0.8550 - val_loss: 1.2990 - val_acc: 0.5900\n200/200 [==============================] - 0s 19us/step\n"
],
[
"results",
"_____no_output_____"
],
[
"predictions = model.predict(X_test)",
"_____no_output_____"
],
[
"predictions[0].shape",
"_____no_output_____"
],
[
"np.sum(predictions[0])",
"_____no_output_____"
],
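[
"# A minimal sketch (added example): map the most probable class id back to a genre name\n# using the LabelEncoder fitted earlier.\nprint(encoder.inverse_transform([np.argmax(predictions[0])]))",
"_____no_output_____"
],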
[
"np.argmax(predictions[0])",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d06181e012184dce95535cfa5bf9cade17ef5c3f | 1,702 | ipynb | Jupyter Notebook | Chapter07/Exercise104/Exercise104.ipynb | adityashah95/Python | 6b5ffc89f0abd77f07dd8049a4c1213ce56c4d84 | [
"MIT"
]
| 4 | 2020-01-06T12:07:00.000Z | 2022-03-22T04:03:49.000Z | Chapter07/Exercise104/Exercise104.ipynb | adityashah95/Python | 6b5ffc89f0abd77f07dd8049a4c1213ce56c4d84 | [
"MIT"
]
| null | null | null | Chapter07/Exercise104/Exercise104.ipynb | adityashah95/Python | 6b5ffc89f0abd77f07dd8049a4c1213ce56c4d84 | [
"MIT"
]
| 4 | 2019-11-25T10:39:30.000Z | 2020-02-22T07:26:40.000Z | 21.544304 | 136 | 0.536428 | [
[
[
"class Interrogator:\n def __init__(self, questions):\n self.questions = questions\n def __iter__(self):\n return self.questions.__iter__()",
"_____no_output_____"
],
[
"questions = [\"What is your name?\", \"What is your quest?\", \"What is the average airspeed velocity of an unladen swallow?\"]\nawkward_person = Interrogator(questions)\nfor question in awkward_person:\n print(question)\n\n",
"What is your name?\nWhat is your quest?\nWhat is the average airspeed velocity of an unladen swallow?\n"
]
]
]
| [
"code"
]
| [
[
"code",
"code"
]
]
|
d0618522ae84a467019ab9b7cc4372b10bffe018 | 125,489 | ipynb | Jupyter Notebook | examples/models/aws_eks_deep_mnist/aws_eks_deep_mnist.ipynb | welcomemandeep/seldon-core | a257c5ef7baf042da4b2ca1b7aad959447d5bd7d | [
"Apache-2.0"
]
| null | null | null | examples/models/aws_eks_deep_mnist/aws_eks_deep_mnist.ipynb | welcomemandeep/seldon-core | a257c5ef7baf042da4b2ca1b7aad959447d5bd7d | [
"Apache-2.0"
]
| null | null | null | examples/models/aws_eks_deep_mnist/aws_eks_deep_mnist.ipynb | welcomemandeep/seldon-core | a257c5ef7baf042da4b2ca1b7aad959447d5bd7d | [
"Apache-2.0"
]
| null | null | null | 35.711155 | 4,684 | 0.436118 | [
[
[
"# AWS Elastic Kubernetes Service (EKS) Deep MNIST\nIn this example we will deploy a tensorflow MNIST model in Amazon Web Services' Elastic Kubernetes Service (EKS).\n\nThis tutorial will break down in the following sections:\n\n1) Train a tensorflow model to predict mnist locally\n\n2) Containerise the tensorflow model with our docker utility\n\n3) Send some data to the docker model to test it\n\n4) Install and configure AWS tools to interact with AWS\n\n5) Use the AWS tools to create and setup EKS cluster with Seldon\n\n6) Push and run docker image through the AWS Container Registry\n\n7) Test our Elastic Kubernetes deployment by sending some data\n\nLet's get started! ๐๐ฅ\n\n## Dependencies:\n\n* Helm v3.0.0+\n* A Kubernetes cluster running v1.13 or above (minkube / docker-for-windows work well if enough RAM)\n* kubectl v1.14+\n* EKS CLI v0.1.32\n* AWS Cli v1.16.163\n* Python 3.6+\n* Python DEV requirements\n",
"_____no_output_____"
],
[
"## 1) Train a tensorflow model to predict mnist locally\nWe will load the mnist images, together with their labels, and then train a tensorflow model to predict the right labels",
"_____no_output_____"
]
],
[
[
"from tensorflow.examples.tutorials.mnist import input_data\nmnist = input_data.read_data_sets(\"MNIST_data/\", one_hot = True)\nimport tensorflow as tf\n\nif __name__ == '__main__':\n \n x = tf.placeholder(tf.float32, [None,784], name=\"x\")\n\n W = tf.Variable(tf.zeros([784,10]))\n b = tf.Variable(tf.zeros([10]))\n\n y = tf.nn.softmax(tf.matmul(x,W) + b, name=\"y\")\n\n y_ = tf.placeholder(tf.float32, [None, 10])\n\n cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))\n\n train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)\n\n init = tf.initialize_all_variables()\n\n sess = tf.Session()\n sess.run(init)\n\n for i in range(1000):\n batch_xs, batch_ys = mnist.train.next_batch(100)\n sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})\n\n correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))\n accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\n print(sess.run(accuracy, feed_dict = {x: mnist.test.images, y_:mnist.test.labels}))\n\n saver = tf.train.Saver()\n\n saver.save(sess, \"model/deep_mnist_model\")",
"Extracting MNIST_data/train-images-idx3-ubyte.gz\nExtracting MNIST_data/train-labels-idx1-ubyte.gz\nExtracting MNIST_data/t10k-images-idx3-ubyte.gz\nExtracting MNIST_data/t10k-labels-idx1-ubyte.gz\n0.9194\n"
]
],
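[
[
"# A minimal sketch (added example, not executed in the original walkthrough): how the\n# checkpoint saved above could be restored later in a fresh TF1 session. The paths mirror\n# the saver.save() call in the training cell.\nimport tensorflow as tf\n\nrestored_sess = tf.Session()\nrestorer = tf.train.import_meta_graph('model/deep_mnist_model.meta')\nrestorer.restore(restored_sess, 'model/deep_mnist_model')\n# the softmax output was given the name \"y\" when the graph was built above\ny_op = tf.get_default_graph().get_tensor_by_name('y:0')",
"_____no_output_____"
]
],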
[
[
"## 2) Containerise the tensorflow model with our docker utility",
"_____no_output_____"
],
[
"First you need to make sure that you have added the .s2i/environment configuration file in this folder with the following content:",
"_____no_output_____"
]
],
[
[
"!cat .s2i/environment",
"MODEL_NAME=DeepMnist\nAPI_TYPE=REST\nSERVICE_TYPE=MODEL\nPERSISTENCE=0\n"
]
],
[
[
"Now we can build a docker image named \"deep-mnist\" with the tag 0.1",
"_____no_output_____"
]
],
[
[
"!s2i build . seldonio/seldon-core-s2i-python36:1.5.0-dev deep-mnist:0.1",
"---> Installing application source...\n---> Installing dependencies ...\nLooking in links: /whl\nRequirement already satisfied: tensorflow>=1.12.0 in /usr/local/lib/python3.6/site-packages (from -r requirements.txt (line 1)) (1.13.1)\nRequirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.9)\nRequirement already satisfied: gast>=0.2.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.2.2)\nRequirement already satisfied: absl-py>=0.1.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.1)\nRequirement already satisfied: astor>=0.6.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.1)\nRequirement already satisfied: keras-applications>=1.0.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.7)\nRequirement already satisfied: six>=1.10.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.12.0)\nRequirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.1.0)\nRequirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.19.0)\nRequirement already satisfied: wheel>=0.26 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.33.1)\nRequirement already satisfied: tensorboard<1.14.0,>=1.13.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.13.1)\nRequirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.16.2)\nRequirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.7.0)\nRequirement already satisfied: tensorflow-estimator<1.14.0rc0,>=1.13.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.13.0)\nRequirement already satisfied: h5py in /usr/local/lib/python3.6/site-packages (from keras-applications>=1.0.6->tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.9.0)\nRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/site-packages (from tensorboard<1.14.0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.0.1)\nRequirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.6/site-packages (from tensorboard<1.14.0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.15.0)\nRequirement already satisfied: setuptools in /usr/local/lib/python3.6/site-packages (from protobuf>=3.6.1->tensorflow>=1.12.0->-r requirements.txt (line 1)) (40.8.0)\nRequirement already satisfied: mock>=2.0.0 in /usr/local/lib/python3.6/site-packages (from tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.0.0)\nRequirement already satisfied: pbr>=0.11 in /usr/local/lib/python3.6/site-packages (from mock>=2.0.0->tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (5.1.3)\nUrl '/whl' is ignored. 
It is either a non-existing path or lacks a specific scheme.\nYou are using pip version 19.0.3, however version 19.1.1 is available.\nYou should consider upgrading via the 'pip install --upgrade pip' command.\nBuild completed successfully\n"
]
],
[
[
"## 3) Send some data to the docker model to test it\nWe first run the docker image we just created as a container called \"mnist_predictor\"",
"_____no_output_____"
]
],
[
[
"!docker run --name \"mnist_predictor\" -d --rm -p 5000:5000 deep-mnist:0.1",
"5157ab4f516bd0dea11b159780f31121e9fb41df6394e0d6d631e6e0d572463b\n"
]
],
[
[
"Send some random features that conform to the contract",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n# This is the variable that was initialised at the beginning of the file\ni = [0]\nx = mnist.test.images[i]\ny = mnist.test.labels[i]\nplt.imshow(x.reshape((28, 28)), cmap='gray')\nplt.show()\nprint(\"Expected label: \", np.sum(range(0,10) * y), \". One hot encoding: \", y)",
"_____no_output_____"
],
[
"from seldon_core.seldon_client import SeldonClient\nimport math\nimport numpy as np\n\n# We now test the REST endpoint expecting the same result\nendpoint = \"0.0.0.0:5000\"\nbatch = x\npayload_type = \"ndarray\"\n\nsc = SeldonClient(microservice_endpoint=endpoint)\n\n# We use the microservice, instead of the \"predict\" function\nclient_prediction = sc.microservice(\n data=batch,\n method=\"predict\",\n payload_type=payload_type,\n names=[\"tfidf\"])\n\nfor proba, label in zip(client_prediction.response.data.ndarray.values[0].list_value.ListFields()[0][1], range(0,10)):\n print(f\"LABEL {label}:\\t {proba.number_value*100:6.4f} %\")",
"LABEL 0:\t 0.0068 %\nLABEL 1:\t 0.0000 %\nLABEL 2:\t 0.0085 %\nLABEL 3:\t 0.3409 %\nLABEL 4:\t 0.0002 %\nLABEL 5:\t 0.0020 %\nLABEL 6:\t 0.0000 %\nLABEL 7:\t 99.5371 %\nLABEL 8:\t 0.0026 %\nLABEL 9:\t 0.1019 %\n"
],
[
"!docker rm mnist_predictor --force",
"mnist_predictor\n"
]
],
[
[
"## 4) Install and configure AWS tools to interact with AWS",
"_____no_output_____"
],
[
"First we install the awscli",
"_____no_output_____"
]
],
[
[
"!pip install awscli --upgrade --user",
"Collecting awscli\n Using cached https://files.pythonhosted.org/packages/f6/45/259a98719e7c7defc9be4cc00fbfb7ccf699fbd1f74455d8347d0ab0a1df/awscli-1.16.163-py2.py3-none-any.whl\nCollecting colorama<=0.3.9,>=0.2.5 (from awscli)\n Using cached https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl\nCollecting PyYAML<=3.13,>=3.10 (from awscli)\nCollecting botocore==1.12.153 (from awscli)\n Using cached https://files.pythonhosted.org/packages/ec/3b/029218966ce62ae9824a18730de862ac8fc5a0e8083d07d1379815e7cca1/botocore-1.12.153-py2.py3-none-any.whl\nRequirement already satisfied, skipping upgrade: docutils>=0.10 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from awscli) (0.14)\nCollecting rsa<=3.5.0,>=3.1.2 (from awscli)\n Using cached https://files.pythonhosted.org/packages/e1/ae/baedc9cb175552e95f3395c43055a6a5e125ae4d48a1d7a924baca83e92e/rsa-3.4.2-py2.py3-none-any.whl\nRequirement already satisfied, skipping upgrade: s3transfer<0.3.0,>=0.2.0 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from awscli) (0.2.0)\nRequirement already satisfied, skipping upgrade: urllib3<1.25,>=1.20; python_version >= \"3.4\" in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from botocore==1.12.153->awscli) (1.24.2)\nRequirement already satisfied, skipping upgrade: python-dateutil<3.0.0,>=2.1; python_version >= \"2.7\" in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from botocore==1.12.153->awscli) (2.8.0)\nRequirement already satisfied, skipping upgrade: jmespath<1.0.0,>=0.7.1 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from botocore==1.12.153->awscli) (0.9.4)\nCollecting pyasn1>=0.1.3 (from rsa<=3.5.0,>=3.1.2->awscli)\n Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl\nRequirement already satisfied, skipping upgrade: six>=1.5 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from python-dateutil<3.0.0,>=2.1; python_version >= \"2.7\"->botocore==1.12.153->awscli) (1.12.0)\nInstalling collected packages: colorama, PyYAML, botocore, pyasn1, rsa, awscli\nSuccessfully installed PyYAML-3.13 awscli-1.16.163 botocore-1.12.153 colorama-0.3.9 pyasn1-0.4.5 rsa-3.4.2\n"
]
],
[
[
"### Configure aws so it can talk to your server \n(if you are getting issues, make sure you have the permmissions to create clusters)",
"_____no_output_____"
]
],
[
[
"%%bash \n# You must make sure that the access key and secret are changed\naws configure << END_OF_INPUTS\nYOUR_ACCESS_KEY\nYOUR_ACCESS_SECRET\nus-west-2\njson\nEND_OF_INPUTS",
"AWS Access Key ID [****************SF4A]: AWS Secret Access Key [****************WLHu]: Default region name [eu-west-1]: Default output format [json]: "
]
],
[
[
"### Install EKCTL\n*IMPORTANT*: These instructions are for linux\nPlease follow the official installation of ekctl at: https://docs.aws.amazon.com/eks/latest/userguide/getting-started-eksctl.html",
"_____no_output_____"
]
],
[
[
"!curl --silent --location \"https://github.com/weaveworks/eksctl/releases/download/latest_release/eksctl_$(uname -s)_amd64.tar.gz\" | tar xz ",
"_____no_output_____"
],
[
"!chmod 755 ./eksctl",
"_____no_output_____"
],
[
"!./eksctl version",
"\u001b[36m[โน] version.Info{BuiltAt:\"\", GitCommit:\"\", GitTag:\"0.1.32\"}\n\u001b[0m"
]
],
[
[
"## 5) Use the AWS tools to create and setup EKS cluster with Seldon\nIn this example we will create a cluster with 2 nodes, with a minimum of 1 and a max of 3. You can tweak this accordingly.\n\nIf you want to check the status of the deployment you can go to AWS CloudFormation or to the EKS dashboard.\n\nIt will take 10-15 minutes (so feel free to go grab a โ). \n\n*IMPORTANT*: If you get errors in this step it is most probably IAM role access requirements, which requires you to discuss with your administrator.",
"_____no_output_____"
]
],
[
[
"%%bash\n./eksctl create cluster \\\n--name demo-eks-cluster \\\n--region us-west-2 \\\n--nodes 2 ",
"Process is interrupted.\n"
]
],
[
[
"### Configure local kubectl \nWe want to now configure our local Kubectl so we can actually reach the cluster we've just created",
"_____no_output_____"
]
],
[
[
"!aws eks --region us-west-2 update-kubeconfig --name demo-eks-cluster",
"Updated context arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist in /home/alejandro/.kube/config\n"
]
],
[
[
"And we can check if the context has been added to kubectl config (contexts are basically the different k8s cluster connections)\nYou should be able to see the context as \"...aws:eks:eu-west-1:27...\". \nIf it's not activated you can activate that context with kubectlt config set-context <CONTEXT_NAME>",
"_____no_output_____"
]
],
[
[
"!kubectl config get-contexts",
"CURRENT NAME CLUSTER AUTHINFO NAMESPACE\n* arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist \n docker-desktop docker-desktop docker-desktop \n docker-for-desktop docker-desktop docker-desktop \n gke_ml-engineer_us-central1-a_security-cluster-1 gke_ml-engineer_us-central1-a_security-cluster-1 gke_ml-engineer_us-central1-a_security-cluster-1 \n"
]
],
[
[
"## Setup Seldon Core\n\nUse the setup notebook to [Setup Cluster](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html#Setup-Cluster) with [Ambassador Ingress](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html#Ambassador) and [Install Seldon Core](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html#Install-Seldon-Core). Instructions [also online](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html).",
"_____no_output_____"
],
[
"## Push docker image\nIn order for the EKS seldon deployment to access the image we just built, we need to push it to the Elastic Container Registry (ECR).\n\nIf you have any issues please follow the official AWS documentation: https://docs.aws.amazon.com/AmazonECR/latest/userguide/what-is-ecr.html",
"_____no_output_____"
],
[
"### First we create a registry\nYou can run the following command, and then see the result at https://us-west-2.console.aws.amazon.com/ecr/repositories?#",
"_____no_output_____"
]
],
[
[
"!aws ecr create-repository --repository-name seldon-repository --region us-west-2",
"{\n \"repository\": {\n \"repositoryArn\": \"arn:aws:ecr:us-west-2:271049282727:repository/seldon-repository\",\n \"registryId\": \"271049282727\",\n \"repositoryName\": \"seldon-repository\",\n \"repositoryUri\": \"271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository\",\n \"createdAt\": 1558535798.0\n }\n}\n"
]
],
[
[
"### Now prepare docker image\nWe need to first tag the docker image before we can push it",
"_____no_output_____"
]
],
[
[
"%%bash\nexport AWS_ACCOUNT_ID=\"\"\nexport AWS_REGION=\"us-west-2\"\nif [ -z \"$AWS_ACCOUNT_ID\" ]; then\n echo \"ERROR: Please provide a value for the AWS variables\"\n exit 1\nfi\n\ndocker tag deep-mnist:0.1 \"$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/seldon-repository\"",
"_____no_output_____"
]
],
[
[
"### We now login to aws through docker so we can access the repository",
"_____no_output_____"
]
],
[
[
"!`aws ecr get-login --no-include-email --region us-west-2`",
"WARNING! Using --password via the CLI is insecure. Use --password-stdin.\nWARNING! Your password will be stored unencrypted in /home/alejandro/.docker/config.json.\nConfigure a credential helper to remove this warning. See\nhttps://docs.docker.com/engine/reference/commandline/login/#credentials-store\n\nLogin Succeeded\n"
]
],
[
[
"### And push the image\nMake sure you add your AWS Account ID",
"_____no_output_____"
]
],
[
[
"%%bash\nexport AWS_ACCOUNT_ID=\"\"\nexport AWS_REGION=\"us-west-2\"\nif [ -z \"$AWS_ACCOUNT_ID\" ]; then\n echo \"ERROR: Please provide a value for the AWS variables\"\n exit 1\nfi\n\ndocker push \"$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/seldon-repository\"",
"The push refers to repository [271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository]\nf7d0d000c138: Preparing\n987f3f1afb00: Preparing\n00d16a381c47: Preparing\nbb01f50d544a: Preparing\nfcb82c6941b5: Preparing\n67290e35c458: Preparing\nb813745f5bb3: Preparing\nffecb18e9f0b: Preparing\nf50f856f49fa: Preparing\n80b43ad4adf9: Preparing\n14c77983a1cf: Preparing\na22a5ac18042: Preparing\n6257fa9f9597: Preparing\n578414b395b9: Preparing\nabc3250a6c7f: Preparing\n13d5529fd232: Preparing\n67290e35c458: Waiting\nb813745f5bb3: Waiting\nffecb18e9f0b: Waiting\nf50f856f49fa: Waiting\n80b43ad4adf9: Waiting\n6257fa9f9597: Waiting\n14c77983a1cf: Waiting\na22a5ac18042: Waiting\n578414b395b9: Waiting\nabc3250a6c7f: Waiting\n13d5529fd232: Waiting\n987f3f1afb00: Pushed\nfcb82c6941b5: Pushed\nbb01f50d544a: Pushed\nf7d0d000c138: Pushed\nffecb18e9f0b: Pushed\nb813745f5bb3: Pushed\nf50f856f49fa: Pushed\n67290e35c458: Pushed\n14c77983a1cf: Pushed\n578414b395b9: Pushed\n80b43ad4adf9: Pushed\n13d5529fd232: Pushed\n6257fa9f9597: Pushed\nabc3250a6c7f: Pushed\n00d16a381c47: Pushed\na22a5ac18042: Pushed\nlatest: digest: sha256:19aefaa9d87c1287eb46ec08f5d4f9a689744d9d0d0b75668b7d15e447819d74 size: 3691\n"
]
],
[
[
"## Running the Model\nWe will now run the model.\n\nLet's first have a look at the file we'll be using to trigger the model:",
"_____no_output_____"
]
],
[
[
"!cat deep_mnist.json",
"{\n \"apiVersion\": \"machinelearning.seldon.io/v1alpha2\",\n \"kind\": \"SeldonDeployment\",\n \"metadata\": {\n \"labels\": {\n \"app\": \"seldon\"\n },\n \"name\": \"deep-mnist\"\n },\n \"spec\": {\n \"annotations\": {\n \"project_name\": \"Tensorflow MNIST\",\n \"deployment_version\": \"v1\"\n },\n \"name\": \"deep-mnist\",\n \"oauth_key\": \"oauth-key\",\n \"oauth_secret\": \"oauth-secret\",\n \"predictors\": [\n {\n \"componentSpecs\": [{\n \"spec\": {\n \"containers\": [\n {\n \"image\": \"271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository:latest\",\n \"imagePullPolicy\": \"IfNotPresent\",\n \"name\": \"classifier\",\n \"resources\": {\n \"requests\": {\n \"memory\": \"1Mi\"\n }\n }\n }\n ],\n \"terminationGracePeriodSeconds\": 20\n }\n }],\n \"graph\": {\n \"children\": [],\n \"name\": \"classifier\",\n \"endpoint\": {\n\t\t\t\"type\" : \"REST\"\n\t\t },\n \"type\": \"MODEL\"\n },\n \"name\": \"single-model\",\n \"replicas\": 1,\n\t\t\"annotations\": {\n\t\t \"predictor_version\" : \"v1\"\n\t\t}\n }\n ]\n }\n}\n"
]
],
[
[
"Now let's trigger seldon to run the model.\n\nWe basically have a yaml file, where we want to replace the value \"REPLACE_FOR_IMAGE_AND_TAG\" for the image you pushed",
"_____no_output_____"
]
],
[
[
"%%bash\nexport AWS_ACCOUNT_ID=\"\"\nexport AWS_REGION=\"us-west-2\"\nif [ -z \"$AWS_ACCOUNT_ID\" ]; then\n echo \"ERROR: Please provide a value for the AWS variables\"\n exit 1\nfi\n\nsed 's|REPLACE_FOR_IMAGE_AND_TAG|'\"$AWS_ACCOUNT_ID\"'.dkr.ecr.'\"$AWS_REGION\"'.amazonaws.com/seldon-repository|g' deep_mnist.json | kubectl apply -f -",
"error: unable to recognize \"STDIN\": Get https://461835FD3FF52848655C8F09FBF5EEAA.yl4.us-west-2.eks.amazonaws.com/api?timeout=32s: dial tcp: lookup 461835FD3FF52848655C8F09FBF5EEAA.yl4.us-west-2.eks.amazonaws.com on 1.1.1.1:53: no such host\n"
]
],
[
[
"And let's check that it's been created.\n\nYou should see an image called \"deep-mnist-single-model...\".\n\nWe'll wait until STATUS changes from \"ContainerCreating\" to \"Running\"",
"_____no_output_____"
]
],
[
[
"!kubectl get pods",
"NAME READY STATUS RESTARTS AGE\nambassador-5475779f98-7bhcw 1/1 Running 0 21m\nambassador-5475779f98-986g5 1/1 Running 0 21m\nambassador-5475779f98-zcd28 1/1 Running 0 21m\ndeep-mnist-single-model-42ed9d9-fdb557d6b-6xv2h 2/2 Running 0 18m\n"
]
],
[
[
"## Test the model\nNow we can test the model, let's first find out what is the URL that we'll have to use:",
"_____no_output_____"
]
],
[
[
"!kubectl get svc ambassador -o jsonpath='{.status.loadBalancer.ingress[0].hostname}' ",
"a68bbac487ca611e988060247f81f4c1-707754258.us-west-2.elb.amazonaws.com"
]
],
[
[
"We'll use a random example from our dataset",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n# This is the variable that was initialised at the beginning of the file\ni = [0]\nx = mnist.test.images[i]\ny = mnist.test.labels[i]\nplt.imshow(x.reshape((28, 28)), cmap='gray')\nplt.show()\nprint(\"Expected label: \", np.sum(range(0,10) * y), \". One hot encoding: \", y)",
"_____no_output_____"
]
],
[
[
"We can now add the URL above to send our request:",
"_____no_output_____"
]
],
[
[
"from seldon_core.seldon_client import SeldonClient\nimport math\nimport numpy as np\n\nhost = \"a68bbac487ca611e988060247f81f4c1-707754258.us-west-2.elb.amazonaws.com\"\nport = \"80\" # Make sure you use the port above\nbatch = x\npayload_type = \"ndarray\"\n\nsc = SeldonClient(\n gateway=\"ambassador\", \n ambassador_endpoint=host + \":\" + port,\n namespace=\"default\",\n oauth_key=\"oauth-key\", \n oauth_secret=\"oauth-secret\")\n\nclient_prediction = sc.predict(\n data=batch, \n deployment_name=\"deep-mnist\",\n names=[\"text\"],\n payload_type=payload_type)\n\nprint(client_prediction)",
"Success:True message:\nRequest:\ndata {\n names: \"text\"\n ndarray {\n values {\n list_value {\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 
0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values 
{\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.3294117748737335\n }\n values {\n number_value: 0.7254902124404907\n }\n values {\n number_value: 0.6235294342041016\n }\n values {\n number_value: 0.5921568870544434\n }\n values {\n number_value: 0.2352941334247589\n }\n values {\n number_value: 0.1411764770746231\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.8705883026123047\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9450981020927429\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.7764706611633301\n }\n values {\n number_value: 0.6666666865348816\n }\n values {\n number_value: 0.2039215862751007\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.26274511218070984\n }\n values {\n number_value: 0.44705885648727417\n }\n values {\n number_value: 0.2823529541492462\n }\n values {\n number_value: 0.44705885648727417\n }\n values {\n number_value: 0.6392157077789307\n }\n values {\n number_value: 0.8901961445808411\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.8823530077934265\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9803922176361084\n }\n values {\n number_value: 0.8980392813682556\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.9960784912109375\n }\n values {\n number_value: 0.5490196347236633\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n 
number_value: 0.0\n }\n [... the remaining values of the flattened 28x28 MNIST image carried in the gRPC request payload, mostly 0.0 with the digit's stroke pixels between 0 and 1, elided for brevity ...]\n }\n }\n }\n}\n\nResponse:\nmeta {\n puid: \"l6bv1r38mmb32l0hbinln2jjcl\"\n requestPath {\n key: \"classifier\"\n value: \"271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository:latest\"\n }\n}\ndata {\n names: \"class:0\"\n names: \"class:1\"\n names: \"class:2\"\n names: \"class:3\"\n names: \"class:4\"\n names: \"class:5\"\n names: \"class:6\"\n names: \"class:7\"\n names: \"class:8\"\n names: \"class:9\"\n ndarray {\n values {\n list_value {\n values {\n number_value: 6.839015986770391e-05\n }\n values {\n number_value: 9.376968534979824e-09\n }\n values {\n number_value: 8.48581112222746e-05\n }\n values {\n number_value: 0.0034086888190358877\n }\n values {\n number_value: 2.3978568606253248e-06\n }\n values {\n number_value: 2.0100669644307345e-05\n }\n values {\n number_value: 3.0251623428512175e-08\n }\n values {\n number_value: 0.9953710436820984\n }\n values {\n number_value: 2.6070511012221687e-05\n }\n values {\n number_value: 0.0010185304563492537\n }\n }\n }\n }\n}\n\n"
]
],
[
[
"### Let's visualise the probability for each label\nIt seems that it correctly predicted the number 7",
"_____no_output_____"
]
],
[
[
"for proba, label in zip(client_prediction.response.data.ndarray.values[0].list_value.ListFields()[0][1], range(0,10)):\n print(f\"LABEL {label}:\\t {proba.number_value*100:6.4f} %\")",
"LABEL 0:\t 0.0068 %\nLABEL 1:\t 0.0000 %\nLABEL 2:\t 0.0085 %\nLABEL 3:\t 0.3409 %\nLABEL 4:\t 0.0002 %\nLABEL 5:\t 0.0020 %\nLABEL 6:\t 0.0000 %\nLABEL 7:\t 99.5371 %\nLABEL 8:\t 0.0026 %\nLABEL 9:\t 0.1019 %\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d061b18c3c6373ce1ad6bf555b22d3a7975a9251 | 25,479 | ipynb | Jupyter Notebook | introduction-to-Tiny-AutoML.ipynb | thomktz/TinyAutoML | 74d9d806ac31795dbf1c4fd60755b0bf9a7c4124 | [
"MIT"
]
| null | null | null | introduction-to-Tiny-AutoML.ipynb | thomktz/TinyAutoML | 74d9d806ac31795dbf1c4fd60755b0bf9a7c4124 | [
"MIT"
]
| null | null | null | introduction-to-Tiny-AutoML.ipynb | thomktz/TinyAutoML | 74d9d806ac31795dbf1c4fd60755b0bf9a7c4124 | [
"MIT"
]
| null | null | null | 68.308311 | 16,274 | 0.820401 | [
[
[
"# **Introduction to TinyAutoML**\n\n---\n\nTinyAutoML is a Machine Learning Python3.9 library thought as an extension of Scikit-Learn. It builds an adaptable and auto-tuned pipeline to handle binary classification tasks.\n\nIn a few words, your data goes through 2 main preprocessing steps. The first one is scaling and NonStationnarity correction, which is followed by Lasso Feature selection. \n\nFinally, one of the three MetaModels is fitted on the transformed data.\n\nLet's import the library !",
"_____no_output_____"
]
],
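As a quick aside before installing: the three stages described above can be pictured with plain scikit-learn. This is a conceptual sketch only: it is *not* TinyAutoML's API, and the Lasso-based selector and the final classifier are stand-ins for the library's internal preprocessing and MetaModel steps.

```python
# Conceptual stand-in for TinyAutoML's flow (plain scikit-learn, NOT the
# library's actual API): scale, then Lasso feature selection, then a classifier.
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LassoCV
from sklearn.ensemble import RandomForestClassifier

conceptual_pipeline = Pipeline([
    ("scaler", StandardScaler()),                  # scaling step
    ("lasso_select", SelectFromModel(LassoCV())),  # keeps features with non-zero Lasso coefficients
    ("meta_model", RandomForestClassifier()),      # stand-in for the fitted MetaModel
])
```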
[
[
"%pip install TinyAutoML==0.2.3.3",
"_____no_output_____"
],
[
"from TinyAutoML.Models import *\nfrom TinyAutoML import MetaPipeline",
"_____no_output_____"
]
],
[
[
"## MetaModels\n\nMetaModels inherit from the MetaModel Abstract Class. They all implement ensemble methods and therefore are based on EstimatorPools.\n\nWhen training EstimatorPools, you are faced with a choice : doing parameterTuning on entire pipelines with the estimators on the top or training the estimators using the same pipeline and only training the top. The first case refers to what we will be calling **comprehensiveSearch**.\n\nMoreover, as we will see in details later, those EstimatorPools can be shared across MetaModels.\n\nThey are all initialised with those minimum arguments :\n\n```python\nMetaModel(comprehensiveSearch: bool = True, parameterTuning: bool = True, metrics: str = 'accuracy', nSplits: int=10)\n```\n- nSplits corresponds to the number of split of the cross validation\n- The other parameters are equivoque\n\n\n**They need to be put in the MetaPipeline wrapper to work**",
"_____no_output_____"
],
[
"**There are 3 MetaModels**\n\n1- BestModel : selects the best performing model of the pool",
"_____no_output_____"
]
],
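To make the signature above concrete, here is a minimal sketch that simply restates the documented defaults as explicit keyword arguments, using the `BestModel` and `MetaPipeline` names imported earlier:

```python
# Sketch: any MetaModel must be wrapped in MetaPipeline to work; the
# keyword values below just spell out the documented defaults.
model = MetaPipeline(BestModel(comprehensiveSearch=True,
                               parameterTuning=True,
                               metrics='accuracy',
                               nSplits=10))
```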
[
[
"best_model = MetaPipeline(BestModel(comprehensiveSearch = False, parameterTuning = False))",
"_____no_output_____"
]
],
[
[
"2- OneRulerForAll : implements Stacking using a RandomForestClassifier by default. The user is free to use another classifier using the ruler arguments",
"_____no_output_____"
]
],
[
[
"orfa_model = MetaPipeline(OneRulerForAll(comprehensiveSearch=False, parameterTuning=False))",
"_____no_output_____"
]
],
[
[
"3- DemocraticModel : implements Soft and Hard voting models through the voting argument",
"_____no_output_____"
]
],
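Before instantiating it, a small self-contained illustration of what the two voting modes mean (plain NumPy, not TinyAutoML internals; the probabilities are made up):

```python
import numpy as np

# Three pool members score one sample in a binary task: each row is
# [P(class 0), P(class 1)] for one classifier.
probas = np.array([[0.90, 0.10],
                   [0.40, 0.60],
                   [0.45, 0.55]])

hard_votes = probas.argmax(axis=1)            # per-model classes: [0, 1, 1]
hard_pred = np.bincount(hard_votes).argmax()  # majority vote, so class 1

soft_pred = probas.mean(axis=0).argmax()      # mean probas [0.58, 0.42], so class 0

print(hard_pred, soft_pred)  # 1 0: the confident first model sways soft voting
```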
[
[
"democratic_model = MetaPipeline(DemocraticModel(comprehensiveSearch=False, parameterTuning=False, voting='soft'))",
"_____no_output_____"
]
],
[
[
"As of release v0.2.3.2 (13/04/2022) there are 5 models on which these MetaModels rely in the EstimatorPool:\n- Random Forest Classifier\n- Logistic Regression\n- Gaussian Naive Bayes\n- Linear Discriminant Analysis\n- XGBoost\n\n\n***\n\n\nWe'll use the breast_cancer dataset from sklearn as an example:",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nfrom sklearn.datasets import load_breast_cancer\n\ncancer = load_breast_cancer()\n \nX = pd.DataFrame(data=cancer.data, columns=cancer.feature_names)\ny = cancer.target\n\ncut = int(len(y) * 0.8)\n\nX_train, X_test = X[:cut], X[cut:]\ny_train, y_test = y[:cut], y[cut:]",
"_____no_output_____"
]
],
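A hedged aside on the split above: the simple 80% cut keeps row order, so it does not guarantee that both halves preserve the class balance of the full dataset. scikit-learn's `train_test_split` with `stratify` does:

```python
# Alternative split (sketch): stratification preserves the class ratio
# in both the train and test sets.
from sklearn.model_selection import train_test_split

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)
```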
[
[
"Let's train a BestModel first and reuse its Pool for the other MetaModels",
"_____no_output_____"
]
],
[
[
"best_model.fit(X_train,y_train)",
"INFO:root:Training models\nINFO:root:The best estimator is random forest classifier with a cross-validation accuracy (in Sample) of 1.0\n"
]
],
[
[
"We can now extract the pool",
"_____no_output_____"
]
],
[
[
"pool = best_model.get_pool()",
"_____no_output_____"
]
],
[
[
"And use it when fitting the other MetaModels to skip the fitting of the underlying models:",
"_____no_output_____"
]
],
[
[
"orfa_model.fit(X_train,y_train,pool=pool)\ndemocratic_model.fit(X_train,y_train,pool=pool)",
"INFO:root:Training models...\nINFO:root:Training models...\n"
]
],
[
[
"Great ! Let's look at the results with the sk_learn classification report :",
"_____no_output_____"
]
],
[
[
"orfa_model.classification_report(X_test,y_test)",
" precision recall f1-score support\n\n 0 0.96 1.00 0.98 26\n 1 1.00 0.99 0.99 88\n\n accuracy 0.99 114\n macro avg 0.98 0.99 0.99 114\nweighted avg 0.99 0.99 0.99 114\n\n"
]
],
[
[
"Looking good! What about the ROC Curve ?",
"_____no_output_____"
]
],
[
[
"democratic_model.roc_curve(X_test,y_test)",
"_____no_output_____"
]
],
[
[
"Let's see how the estimators of the pool are doing individually:",
"_____no_output_____"
]
],
[
[
"best_model.get_scores(X_test,y_test)",
"_____no_output_____"
]
],
[
[
"## What's next ? \n\nYou can do the same steps with comprehensiveSearch set to True if you have the time and if you want to improve your results. You can also try new rulers and so on.",
"_____no_output_____"
],
[
"Maria, Thomas and Lucas.",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
]
|
d061c8abae2ad27910746ac39a22938acf0dd2af | 6,121 | ipynb | Jupyter Notebook | notebooks/Python3-Language/06-OOP/01-Class Basics.ipynb | binakot/Python3-Course | c555fc7376c45f4b2dedb6d57363c0070831c1e1 | [
"MIT"
]
| null | null | null | notebooks/Python3-Language/06-OOP/01-Class Basics.ipynb | binakot/Python3-Course | c555fc7376c45f4b2dedb6d57363c0070831c1e1 | [
"MIT"
]
| null | null | null | notebooks/Python3-Language/06-OOP/01-Class Basics.ipynb | binakot/Python3-Course | c555fc7376c45f4b2dedb6d57363c0070831c1e1 | [
"MIT"
]
| null | null | null | 22.925094 | 318 | 0.525404 | [
[
[
"#OOP allows to create their own objects that have methods and attributes\n#you already used objects (list and such)\n#You can represent the whole world by classes, attributes and methods\n#classes can contain data about itself\n\n#functions are not enough to program large programs",
"_____no_output_____"
],
[
"#also keep in mind that sometimes developers use \"object\" and \"class\" interchangeably",
"_____no_output_____"
],
[
"numbers = [1,2,3] #create/instantiate a built-in object\ntype(numbers)",
"_____no_output_____"
],
[
"class Character():\n pass",
"_____no_output_____"
],
[
"unit = Character() #create instance\ntype(unit)",
"_____no_output_____"
],
[
"#each class has a constructor\n#and by the way a class can have more than constructor\n#(explain that many objects have attributes that have to be initialized, \n#\"name\" for a person ; \"unique id for a credit card\", \n#so constructor is something that makes sure that an object can not be created in an invalid state)",
"_____no_output_____"
],
[
"class Character():\n \n def __init__(self, race): #self represent an instance\n self.race = race",
"_____no_output_____"
],
[
"unit = Character() #client create an instance, and uses unit to work with object while self is used by the class developer",
"_____no_output_____"
],
[
"unit = Character('Elf')",
"_____no_output_____"
],
[
"type(unit)",
"_____no_output_____"
],
[
"unit.race",
"_____no_output_____"
],
[
"#get back to class and demonstrate that \"race\" param can be renamed to \"unit_type\"\n#and say that self.race is an attribute it's like defining a variable within a class\n#and say that we can give any name for race and then call it by new name from a client's side\n#then revert back changes and say that this style is a standard",
"_____no_output_____"
],
[
"class Character():\n \n def __init__(self, race, damage = 10, armor=20):\n self.race = race\n self.damage = damage\n self.armor = armor",
"_____no_output_____"
],
[
"unit = Character('Ork', damage=20, armor=40) #though not obliged to write arg names and even pass anything\nprint(unit.damage)\nprint(unit.armor)",
"20\n40\n"
]
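As an aside to the constructor notes above: Python has a single `__init__`, and the idiomatic substitute for "multiple constructors" is a `@classmethod` factory. A minimal sketch (the `elf` factory and its preset values are invented for illustration):

```python
class Character:

    def __init__(self, race, damage=10, armor=20):
        self.race = race
        self.damage = damage
        self.armor = armor

    @classmethod
    def elf(cls):
        # alternative "constructor": presets for a common case
        return cls('Elf', damage=15, armor=10)

unit = Character.elf()
print(unit.race, unit.damage, unit.armor)  # Elf 15 10
```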
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d061eb51c53566b0ba0e39649cbe53175a51de39 | 36,626 | ipynb | Jupyter Notebook | 2_Training.ipynb | siddsrivastava/Image-captionin | 683d06c674c737a71957a7f087b726220bce262b | [
"MIT"
]
| 10 | 2020-05-11T02:36:36.000Z | 2022-03-22T22:51:22.000Z | 2_Training.ipynb | srivastava41099/Image-captioning | 683d06c674c737a71957a7f087b726220bce262b | [
"MIT"
]
| null | null | null | 2_Training.ipynb | srivastava41099/Image-captioning | 683d06c674c737a71957a7f087b726220bce262b | [
"MIT"
]
| 3 | 2021-04-12T10:41:48.000Z | 2021-06-18T02:17:53.000Z | 61.247492 | 734 | 0.616475 | [
[
[
"# Computer Vision Nanodegree\n\n## Project: Image Captioning\n\n---\n\nIn this notebook, you will train your CNN-RNN model. \n\nYou are welcome and encouraged to try out many different architectures and hyperparameters when searching for a good model.\n\nThis does have the potential to make the project quite messy! Before submitting your project, make sure that you clean up:\n- the code you write in this notebook. The notebook should describe how to train a single CNN-RNN architecture, corresponding to your final choice of hyperparameters. You should structure the notebook so that the reviewer can replicate your results by running the code in this notebook. \n- the output of the code cell in **Step 2**. The output should show the output obtained when training the model from scratch.\n\nThis notebook **will be graded**. \n\nFeel free to use the links below to navigate the notebook:\n- [Step 1](#step1): Training Setup\n- [Step 2](#step2): Train your Model\n- [Step 3](#step3): (Optional) Validate your Model",
"_____no_output_____"
],
[
"<a id='step1'></a>\n## Step 1: Training Setup\n\nIn this step of the notebook, you will customize the training of your CNN-RNN model by specifying hyperparameters and setting other options that are important to the training procedure. The values you set now will be used when training your model in **Step 2** below.\n\nYou should only amend blocks of code that are preceded by a `TODO` statement. **Any code blocks that are not preceded by a `TODO` statement should not be modified**.\n\n### Task #1\n\nBegin by setting the following variables:\n- `batch_size` - the batch size of each training batch. It is the number of image-caption pairs used to amend the model weights in each training step. \n- `vocab_threshold` - the minimum word count threshold. Note that a larger threshold will result in a smaller vocabulary, whereas a smaller threshold will include rarer words and result in a larger vocabulary. \n- `vocab_from_file` - a Boolean that decides whether to load the vocabulary from file. \n- `embed_size` - the dimensionality of the image and word embeddings. \n- `hidden_size` - the number of features in the hidden state of the RNN decoder. \n- `num_epochs` - the number of epochs to train the model. We recommend that you set `num_epochs=3`, but feel free to increase or decrease this number as you wish. [This paper](https://arxiv.org/pdf/1502.03044.pdf) trained a captioning model on a single state-of-the-art GPU for 3 days, but you'll soon see that you can get reasonable results in a matter of a few hours! (_But of course, if you want your model to compete with current research, you will have to train for much longer._)\n- `save_every` - determines how often to save the model weights. We recommend that you set `save_every=1`, to save the model weights after each epoch. This way, after the `i`th epoch, the encoder and decoder weights will be saved in the `models/` folder as `encoder-i.pkl` and `decoder-i.pkl`, respectively.\n- `print_every` - determines how often to print the batch loss to the Jupyter notebook while training. Note that you **will not** observe a monotonic decrease in the loss function while training - this is perfectly fine and completely expected! You are encouraged to keep this at its default value of `100` to avoid clogging the notebook, but feel free to change it.\n- `log_file` - the name of the text file containing - for every step - how the loss and perplexity evolved during training.\n\nIf you're not sure where to begin to set some of the values above, you can peruse [this paper](https://arxiv.org/pdf/1502.03044.pdf) and [this paper](https://arxiv.org/pdf/1411.4555.pdf) for useful guidance! **To avoid spending too long on this notebook**, you are encouraged to consult these suggested research papers to obtain a strong initial guess for which hyperparameters are likely to work best. Then, train a single model, and proceed to the next notebook (**3_Inference.ipynb**). If you are unhappy with your performance, you can return to this notebook to tweak the hyperparameters (and/or the architecture in **model.py**) and re-train your model.\n\n### Question 1\n\n**Question:** Describe your CNN-RNN architecture in detail. With this architecture in mind, how did you select the values of the variables in Task 1? If you consulted a research paper detailing a successful implementation of an image captioning model, please provide the reference.\n\n**Answer:** I used a pretrained Resnet152 network to extract features (deep CNN network). 
In the literature other architectures like VGG16 are also used, but Resnet152 is claimed to diminish the vanishing gradient problem. I'm currently using 2 LSTM layers (training already takes a lot of time); in the future I will experiment with more layers.\nvocab_threshold is 6; I tried 9 (which yields a smaller vocabulary), but training seemed to converge faster with 6. Many papers suggest a batch_size of 64 or 128; I went with 64. embed_size and hidden_size are both 512. I consulted several blogs and well-known papers such as \"Show, Attend and Tell - Xu et al.\", although I did not use attention here.\n\n\n### (Optional) Task #2\n\nNote that we have provided a recommended image transform `transform_train` for pre-processing the training images, but you are welcome (and encouraged!) to modify it as you wish. When modifying this transform, keep in mind that:\n- the images in the dataset have varying heights and widths, and \n- if using a pre-trained model, you must perform the corresponding appropriate normalization.\n\n### Question 2\n\n**Question:** How did you select the transform in `transform_train`? If you left the transform at its provided value, why do you think that it is a good choice for your CNN architecture?\n\n**Answer:** I left the transform at its provided value. Empirically, these parameter values worked well in my past projects.\n\n### Task #3\n\nNext, you will specify a Python list containing the learnable parameters of the model. For instance, if you decide to make all weights in the decoder trainable, but only want to train the weights in the embedding layer of the encoder, then you should set `params` to something like:\n```\nparams = list(decoder.parameters()) + list(encoder.embed.parameters()) \n```\n\n### Question 3\n\n**Question:** How did you select the trainable parameters of your architecture? Why do you think this is a good choice?\n\n**Answer:** Since the ResNet was pretrained, I trained only the embedding layer of the encoder and all layers of the decoder. The pretrained ResNet is already well suited to feature extraction, so only the other parts of the architecture need to be trained.\n\n### Task #4\n\nFinally, you will select an [optimizer](http://pytorch.org/docs/master/optim.html#torch.optim.Optimizer).\n\n### Question 4\n\n**Question:** How did you select the optimizer used to train your model?\n\n**Answer:** I used the Adam optimizer, since in similar past projects it gave me better performance than SGD. I have found Adam to outperform vanilla SGD in almost all cases, which aligns with intuition.",
"_____no_output_____"
]
],
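As a hedged illustration of the encoder idea described in the answers above (the project's real implementation lives in `model.py`, which is not shown here): a pretrained ResNet with its classification head removed becomes the feature extractor, and a fresh linear layer produces the embedding.

```python
# Sketch only, not the project's EncoderCNN; just the idea behind it.
import torch.nn as nn
import torchvision.models as models

resnet = models.resnet152(pretrained=True)                # pretrained feature extractor
backbone = nn.Sequential(*list(resnet.children())[:-1])   # drop the final fc layer
embed = nn.Linear(resnet.fc.in_features, 512)             # trainable embedding (embed_size=512)
```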
[
[
"import nltk\nnltk.download('punkt')",
"[nltk_data] Downloading package punkt to /root/nltk_data...\n[nltk_data] Unzipping tokenizers/punkt.zip.\n"
],
[
"\nimport torch\nimport torch.nn as nn\nfrom torchvision import transforms\nimport sys\nsys.path.append('/opt/cocoapi/PythonAPI')\nfrom pycocotools.coco import COCO\nfrom data_loader import get_loader\nfrom model import EncoderCNN, DecoderRNN\nimport math\n\n\n## TODO #1: Select appropriate values for the Python variables below.\nbatch_size = 64 # batch size\nvocab_threshold = 6 # minimum word count threshold\nvocab_from_file = True # if True, load existing vocab file\nembed_size = 512 # dimensionality of image and word embeddings\nhidden_size = 512 # number of features in hidden state of the RNN decoder\nnum_epochs = 3 # number of training epochs\nsave_every = 1 # determines frequency of saving model weights\nprint_every = 100 # determines window for printing average loss\nlog_file = 'training_log.txt' # name of file with saved training loss and perplexity\n\n# (Optional) TODO #2: Amend the image transform below.\ntransform_train = transforms.Compose([ \n transforms.Resize(256), # smaller edge of image resized to 256\n transforms.RandomCrop(224), # get 224x224 crop from random location\n transforms.RandomHorizontalFlip(), # horizontally flip image with probability=0.5\n transforms.ToTensor(), # convert the PIL Image to a tensor\n transforms.Normalize((0.485, 0.456, 0.406), # normalize image for pre-trained model\n (0.229, 0.224, 0.225))])\n\n# Build data loader.\ndata_loader = get_loader(transform=transform_train,\n mode='train',\n batch_size=batch_size,\n vocab_threshold=vocab_threshold,\n vocab_from_file=vocab_from_file)\n\n# The size of the vocabulary.\nvocab_size = len(data_loader.dataset.vocab)\n\n# Initialize the encoder and decoder. \nencoder = EncoderCNN(embed_size)\ndecoder = DecoderRNN(embed_size, hidden_size, vocab_size)\n\n# Move models to GPU if CUDA is available. \ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\nencoder.to(device)\ndecoder.to(device)\n\n# Define the loss function. \ncriterion = nn.CrossEntropyLoss().cuda() if torch.cuda.is_available() else nn.CrossEntropyLoss()\n\n# TODO #3: Specify the learnable parameters of the model.\nparams = list(decoder.parameters()) + list(encoder.embed.parameters())\n\n# TODO #4: Define the optimizer.\noptimizer = torch.optim.Adam(params, lr=0.001, betas=(0.9,0.999), eps=1e-8)\n\n# Set the total number of training steps per epoch.\ntotal_step = math.ceil(len(data_loader.dataset.caption_lengths) / data_loader.batch_sampler.batch_size)",
"Vocabulary successfully loaded from vocab.pkl file!\nloading annotations into memory...\nDone (t=1.07s)\ncreating index...\n"
]
],
[
[
"<a id='step2'></a>\n## Step 2: Train your Model\n\nOnce you have executed the code cell in **Step 1**, the training procedure below should run without issue. \n\nIt is completely fine to leave the code cell below as-is without modifications to train your model. However, if you would like to modify the code used to train the model below, you must ensure that your changes are easily parsed by your reviewer. In other words, make sure to provide appropriate comments to describe how your code works! \n\nYou may find it useful to load saved weights to resume training. In that case, note the names of the files containing the encoder and decoder weights that you'd like to load (`encoder_file` and `decoder_file`). Then you can load the weights by using the lines below:\n\n```python\n# Load pre-trained weights before resuming training.\nencoder.load_state_dict(torch.load(os.path.join('./models', encoder_file)))\ndecoder.load_state_dict(torch.load(os.path.join('./models', decoder_file)))\n```\n\nWhile trying out parameters, make sure to take extensive notes and record the settings that you used in your various training runs. In particular, you don't want to encounter a situation where you've trained a model for several hours but can't remember what settings you used :).\n\n### A Note on Tuning Hyperparameters\n\nTo figure out how well your model is doing, you can look at how the training loss and perplexity evolve during training - and for the purposes of this project, you are encouraged to amend the hyperparameters based on this information. \n\nHowever, this will not tell you if your model is overfitting to the training data, and, unfortunately, overfitting is a problem that is commonly encountered when training image captioning models. \n\nFor this project, you need not worry about overfitting. **This project does not have strict requirements regarding the performance of your model**, and you just need to demonstrate that your model has learned **_something_** when you generate captions on the test data. For now, we strongly encourage you to train your model for the suggested 3 epochs without worrying about performance; then, you should immediately transition to the next notebook in the sequence (**3_Inference.ipynb**) to see how your model performs on the test data. If your model needs to be changed, you can come back to this notebook, amend hyperparameters (if necessary), and re-train the model.\n\nThat said, if you would like to go above and beyond in this project, you can read about some approaches to minimizing overfitting in section 4.3.1 of [this paper](http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7505636). In the next (optional) step of this notebook, we provide some guidance for assessing the performance on the validation dataset.",
"_____no_output_____"
]
],
[
[
"import torch.utils.data as data\nimport numpy as np\nimport os\nimport requests\nimport time\n\n# Open the training log file.\nf = open(log_file, 'w')\n\nold_time = time.time()\nresponse = requests.request(\"GET\", \n \"http://metadata.google.internal/computeMetadata/v1/instance/attributes/keep_alive_token\", \n headers={\"Metadata-Flavor\":\"Google\"})\n\nfor epoch in range(1, num_epochs+1):\n \n for i_step in range(1, total_step+1):\n \n if time.time() - old_time > 60:\n old_time = time.time()\n requests.request(\"POST\", \n \"https://nebula.udacity.com/api/v1/remote/keep-alive\", \n headers={'Authorization': \"STAR \" + response.text})\n \n # Randomly sample a caption length, and sample indices with that length.\n indices = data_loader.dataset.get_train_indices()\n # Create and assign a batch sampler to retrieve a batch with the sampled indices.\n new_sampler = data.sampler.SubsetRandomSampler(indices=indices)\n data_loader.batch_sampler.sampler = new_sampler\n \n # Obtain the batch.\n images, captions = next(iter(data_loader))\n\n # Move batch of images and captions to GPU if CUDA is available.\n images = images.to(device)\n captions = captions.to(device)\n \n # Zero the gradients.\n decoder.zero_grad()\n encoder.zero_grad()\n \n # Pass the inputs through the CNN-RNN model.\n features = encoder(images)\n outputs = decoder(features, captions)\n \n # Calculate the batch loss.\n loss = criterion(outputs.view(-1, vocab_size), captions.view(-1))\n \n # Backward pass.\n loss.backward()\n \n # Update the parameters in the optimizer.\n optimizer.step()\n \n # Get training statistics.\n stats = 'Epoch [%d/%d], Step [%d/%d], Loss: %.4f, Perplexity: %5.4f' % (epoch, num_epochs, i_step, total_step, loss.item(), np.exp(loss.item()))\n \n # Print training statistics (on same line).\n print('\\r' + stats, end=\"\")\n sys.stdout.flush()\n \n # Print training statistics to file.\n f.write(stats + '\\n')\n f.flush()\n \n # Print training statistics (on different line).\n if i_step % print_every == 0:\n print('\\r' + stats)\n \n # Save the weights.\n if epoch % save_every == 0:\n torch.save(decoder.state_dict(), os.path.join('./models', 'decoder-%d.pkl' % epoch))\n torch.save(encoder.state_dict(), os.path.join('./models', 'encoder-%d.pkl' % epoch))\n\n# Close the training log file.\nf.close()",
"Epoch [1/3], Step [100/6471], Loss: 4.2137, Perplexity: 67.6088\nEpoch [1/3], Step [200/6471], Loss: 3.9313, Perplexity: 50.97528\nEpoch [1/3], Step [300/6471], Loss: 3.5978, Perplexity: 36.5175\nEpoch [1/3], Step [400/6471], Loss: 3.6794, Perplexity: 39.6219\nEpoch [1/3], Step [500/6471], Loss: 3.0714, Perplexity: 21.5712\nEpoch [1/3], Step [600/6471], Loss: 3.2012, Perplexity: 24.5617\nEpoch [1/3], Step [700/6471], Loss: 3.2718, Perplexity: 26.35966\nEpoch [1/3], Step [800/6471], Loss: 3.3748, Perplexity: 29.2185\nEpoch [1/3], Step [900/6471], Loss: 3.1745, Perplexity: 23.9146\nEpoch [1/3], Step [1000/6471], Loss: 3.2627, Perplexity: 26.1206\nEpoch [1/3], Step [1100/6471], Loss: 2.8865, Perplexity: 17.9312\nEpoch [1/3], Step [1200/6471], Loss: 2.9421, Perplexity: 18.9562\nEpoch [1/3], Step [1300/6471], Loss: 2.7139, Perplexity: 15.0875\nEpoch [1/3], Step [1400/6471], Loss: 2.6474, Perplexity: 14.1176\nEpoch [1/3], Step [1500/6471], Loss: 2.6901, Perplexity: 14.7331\nEpoch [1/3], Step [1600/6471], Loss: 2.6551, Perplexity: 14.2267\nEpoch [1/3], Step [1700/6471], Loss: 2.9028, Perplexity: 18.2242\nEpoch [1/3], Step [1800/6471], Loss: 2.5633, Perplexity: 12.9791\nEpoch [1/3], Step [1900/6471], Loss: 2.7250, Perplexity: 15.2564\nEpoch [1/3], Step [2000/6471], Loss: 2.5907, Perplexity: 13.3396\nEpoch [1/3], Step [2100/6471], Loss: 2.7079, Perplexity: 14.9985\nEpoch [1/3], Step [2200/6471], Loss: 2.5242, Perplexity: 12.4809\nEpoch [1/3], Step [2300/6471], Loss: 2.5016, Perplexity: 12.2019\nEpoch [1/3], Step [2400/6471], Loss: 2.6168, Perplexity: 13.6915\nEpoch [1/3], Step [2500/6471], Loss: 2.6548, Perplexity: 14.2225\nEpoch [1/3], Step [2600/6471], Loss: 2.4738, Perplexity: 11.8673\nEpoch [1/3], Step [2700/6471], Loss: 2.4797, Perplexity: 11.9380\nEpoch [1/3], Step [2800/6471], Loss: 2.6574, Perplexity: 14.2598\nEpoch [1/3], Step [2900/6471], Loss: 2.3054, Perplexity: 10.0281\nEpoch [1/3], Step [3000/6471], Loss: 2.5392, Perplexity: 12.6694\nEpoch [1/3], Step [3100/6471], Loss: 2.6166, Perplexity: 13.6890\nEpoch [1/3], Step [3200/6471], Loss: 2.2275, Perplexity: 9.27642\nEpoch [1/3], Step [3300/6471], Loss: 2.5271, Perplexity: 12.5177\nEpoch [1/3], Step [3400/6471], Loss: 2.3050, Perplexity: 10.0246\nEpoch [1/3], Step [3500/6471], Loss: 2.0236, Perplexity: 7.56542\nEpoch [1/3], Step [3600/6471], Loss: 2.1614, Perplexity: 8.68294\nEpoch [1/3], Step [3700/6471], Loss: 2.3635, Perplexity: 10.6284\nEpoch [1/3], Step [3800/6471], Loss: 2.3958, Perplexity: 10.9773\nEpoch [1/3], Step [3900/6471], Loss: 2.1591, Perplexity: 8.66344\nEpoch [1/3], Step [4000/6471], Loss: 2.3267, Perplexity: 10.2446\nEpoch [1/3], Step [4100/6471], Loss: 3.1127, Perplexity: 22.4825\nEpoch [1/3], Step [4200/6471], Loss: 2.3359, Perplexity: 10.3392\nEpoch [1/3], Step [4300/6471], Loss: 2.3215, Perplexity: 10.1912\nEpoch [1/3], Step [4400/6471], Loss: 2.2369, Perplexity: 9.36462\nEpoch [1/3], Step [4500/6471], Loss: 2.2770, Perplexity: 9.74746\nEpoch [1/3], Step [4600/6471], Loss: 2.2351, Perplexity: 9.34757\nEpoch [1/3], Step [4700/6471], Loss: 2.2890, Perplexity: 9.86499\nEpoch [1/3], Step [4800/6471], Loss: 2.2736, Perplexity: 9.713991\nEpoch [1/3], Step [4900/6471], Loss: 2.5273, Perplexity: 12.5202\nEpoch [1/3], Step [5000/6471], Loss: 2.1436, Perplexity: 8.52971\nEpoch [1/3], Step [5100/6471], Loss: 2.2414, Perplexity: 9.40672\nEpoch [1/3], Step [5200/6471], Loss: 2.3917, Perplexity: 10.9318\nEpoch [1/3], Step [5300/6471], Loss: 2.2926, Perplexity: 9.90097\nEpoch [1/3], Step [5400/6471], Loss: 2.0861, Perplexity: 
8.05366\nEpoch [1/3], Step [5500/6471], Loss: 2.0797, Perplexity: 8.00241\nEpoch [1/3], Step [5600/6471], Loss: 2.5135, Perplexity: 12.3480\nEpoch [1/3], Step [5700/6471], Loss: 2.0843, Perplexity: 8.03936\nEpoch [1/3], Step [5800/6471], Loss: 2.4332, Perplexity: 11.3950\nEpoch [1/3], Step [5900/6471], Loss: 2.0920, Perplexity: 8.10140\nEpoch [1/3], Step [6000/6471], Loss: 2.3367, Perplexity: 10.3468\nEpoch [1/3], Step [6100/6471], Loss: 2.9598, Perplexity: 19.2937\nEpoch [1/3], Step [6200/6471], Loss: 2.0285, Perplexity: 7.60297\nEpoch [1/3], Step [6300/6471], Loss: 2.6213, Perplexity: 13.7538\nEpoch [1/3], Step [6400/6471], Loss: 2.0924, Perplexity: 8.10440\nEpoch [2/3], Step [100/6471], Loss: 2.1729, Perplexity: 8.783715\nEpoch [2/3], Step [200/6471], Loss: 2.1168, Perplexity: 8.30481\nEpoch [2/3], Step [300/6471], Loss: 2.2427, Perplexity: 9.41848\nEpoch [2/3], Step [400/6471], Loss: 2.5073, Perplexity: 12.2721\nEpoch [2/3], Step [500/6471], Loss: 2.1942, Perplexity: 8.97323\nEpoch [2/3], Step [600/6471], Loss: 2.2852, Perplexity: 9.82738\nEpoch [2/3], Step [700/6471], Loss: 2.0216, Perplexity: 7.55076\nEpoch [2/3], Step [800/6471], Loss: 2.0080, Perplexity: 7.44841\nEpoch [2/3], Step [900/6471], Loss: 2.6213, Perplexity: 13.7540\nEpoch [2/3], Step [1000/6471], Loss: 2.2098, Perplexity: 9.1141\nEpoch [2/3], Step [1100/6471], Loss: 2.3376, Perplexity: 10.3568\nEpoch [2/3], Step [1200/6471], Loss: 2.1687, Perplexity: 8.74662\nEpoch [2/3], Step [1300/6471], Loss: 2.4215, Perplexity: 11.2623\nEpoch [2/3], Step [1400/6471], Loss: 2.2622, Perplexity: 9.60387\nEpoch [2/3], Step [1500/6471], Loss: 2.0793, Perplexity: 7.99915\nEpoch [2/3], Step [1600/6471], Loss: 3.0006, Perplexity: 20.0976\nEpoch [2/3], Step [1700/6471], Loss: 2.1184, Perplexity: 8.31816\nEpoch [2/3], Step [1800/6471], Loss: 2.0555, Perplexity: 7.81114\nEpoch [2/3], Step [1900/6471], Loss: 2.4132, Perplexity: 11.1696\nEpoch [2/3], Step [2000/6471], Loss: 2.4320, Perplexity: 11.3817\nEpoch [2/3], Step [2100/6471], Loss: 2.6297, Perplexity: 13.8692\nEpoch [2/3], Step [2200/6471], Loss: 2.2170, Perplexity: 9.18001\nEpoch [2/3], Step [2300/6471], Loss: 2.1038, Perplexity: 8.19712\nEpoch [2/3], Step [2400/6471], Loss: 2.0491, Perplexity: 7.76052\nEpoch [2/3], Step [2500/6471], Loss: 1.9645, Perplexity: 7.13170\nEpoch [2/3], Step [2600/6471], Loss: 2.3801, Perplexity: 10.8063\nEpoch [2/3], Step [2700/6471], Loss: 2.3220, Perplexity: 10.1963\nEpoch [2/3], Step [2800/6471], Loss: 2.0542, Perplexity: 7.80050\nEpoch [2/3], Step [2900/6471], Loss: 1.9378, Perplexity: 6.94348\nEpoch [2/3], Step [3000/6471], Loss: 1.9138, Perplexity: 6.77860\nEpoch [2/3], Step [3100/6471], Loss: 2.2314, Perplexity: 9.31325\nEpoch [2/3], Step [3200/6471], Loss: 2.1790, Perplexity: 8.83758\nEpoch [2/3], Step [3300/6471], Loss: 2.7974, Perplexity: 16.4013\nEpoch [2/3], Step [3400/6471], Loss: 2.2902, Perplexity: 9.87657\nEpoch [2/3], Step [3500/6471], Loss: 2.0739, Perplexity: 7.95541\nEpoch [2/3], Step [3600/6471], Loss: 2.4700, Perplexity: 11.8226\nEpoch [2/3], Step [3700/6471], Loss: 2.0761, Perplexity: 7.97370\nEpoch [2/3], Step [3800/6471], Loss: 2.0085, Perplexity: 7.45224\nEpoch [2/3], Step [3900/6471], Loss: 2.0280, Perplexity: 7.59929\nEpoch [2/3], Step [4000/6471], Loss: 2.0487, Perplexity: 7.75750\nEpoch [2/3], Step [4100/6471], Loss: 2.0105, Perplexity: 7.46732\nEpoch [2/3], Step [4200/6471], Loss: 2.3099, Perplexity: 10.0733\nEpoch [2/3], Step [4300/6471], Loss: 1.8471, Perplexity: 6.34158\nEpoch [2/3], Step [4400/6471], Loss: 1.9144, Perplexity: 
6.78305\nEpoch [2/3], Step [4500/6471], Loss: 2.3026, Perplexity: 10.0001\nEpoch [2/3], Step [4600/6471], Loss: 2.0366, Perplexity: 7.66411\nEpoch [2/3], Step [4700/6471], Loss: 2.4918, Perplexity: 12.0830\nEpoch [2/3], Step [4800/6471], Loss: 2.0035, Perplexity: 7.41520\nEpoch [2/3], Step [4900/6471], Loss: 2.0007, Perplexity: 7.39395\nEpoch [2/3], Step [5000/6471], Loss: 2.0057, Perplexity: 7.43157\nEpoch [2/3], Step [5100/6471], Loss: 2.0654, Perplexity: 7.88811\nEpoch [2/3], Step [5200/6471], Loss: 1.8834, Perplexity: 6.57597\nEpoch [2/3], Step [5300/6471], Loss: 1.9578, Perplexity: 7.08400\nEpoch [2/3], Step [5400/6471], Loss: 2.1135, Perplexity: 8.27759\nEpoch [2/3], Step [5500/6471], Loss: 1.9813, Perplexity: 7.25206\nEpoch [2/3], Step [5600/6471], Loss: 2.1926, Perplexity: 8.95865\nEpoch [2/3], Step [5700/6471], Loss: 2.2927, Perplexity: 9.90207\nEpoch [2/3], Step [5800/6471], Loss: 2.3188, Perplexity: 10.1636\nEpoch [2/3], Step [5900/6471], Loss: 1.9937, Perplexity: 7.34238\nEpoch [2/3], Step [6000/6471], Loss: 1.8804, Perplexity: 6.55632\nEpoch [2/3], Step [6100/6471], Loss: 1.8708, Perplexity: 6.49346\nEpoch [2/3], Step [6200/6471], Loss: 1.9785, Perplexity: 7.23204\nEpoch [2/3], Step [6300/6471], Loss: 2.1267, Perplexity: 8.38739\nEpoch [2/3], Step [6400/6471], Loss: 1.8215, Perplexity: 6.18116\nEpoch [3/3], Step [100/6471], Loss: 1.9881, Perplexity: 7.301406\nEpoch [3/3], Step [200/6471], Loss: 2.2102, Perplexity: 9.11727\nEpoch [3/3], Step [300/6471], Loss: 1.9104, Perplexity: 6.75575\nEpoch [3/3], Step [400/6471], Loss: 1.8180, Perplexity: 6.15938\nEpoch [3/3], Step [500/6471], Loss: 2.5038, Perplexity: 12.2288\nEpoch [3/3], Step [600/6471], Loss: 2.0724, Perplexity: 7.94375\nEpoch [3/3], Step [700/6471], Loss: 2.0264, Perplexity: 7.58681\nEpoch [3/3], Step [800/6471], Loss: 1.9343, Perplexity: 6.91936\nEpoch [3/3], Step [900/6471], Loss: 1.9347, Perplexity: 6.92228\nEpoch [3/3], Step [1000/6471], Loss: 2.6768, Perplexity: 14.5382\nEpoch [3/3], Step [1100/6471], Loss: 2.1302, Perplexity: 8.41696\nEpoch [3/3], Step [1200/6471], Loss: 1.9754, Perplexity: 7.20958\nEpoch [3/3], Step [1300/6471], Loss: 2.0288, Perplexity: 7.60478\nEpoch [3/3], Step [1400/6471], Loss: 2.1273, Perplexity: 8.39242\nEpoch [3/3], Step [1500/6471], Loss: 2.6294, Perplexity: 13.8661\nEpoch [3/3], Step [1600/6471], Loss: 2.6716, Perplexity: 14.4634\nEpoch [3/3], Step [1700/6471], Loss: 1.8720, Perplexity: 6.50130\nEpoch [3/3], Step [1800/6471], Loss: 2.3521, Perplexity: 10.5080\nEpoch [3/3], Step [1900/6471], Loss: 2.0034, Perplexity: 7.41405\nEpoch [3/3], Step [2000/6471], Loss: 2.0006, Perplexity: 7.39337\nEpoch [3/3], Step [2100/6471], Loss: 2.0902, Perplexity: 8.08620\nEpoch [3/3], Step [2200/6471], Loss: 3.3483, Perplexity: 28.4533\nEpoch [3/3], Step [2300/6471], Loss: 2.0799, Perplexity: 8.00390\nEpoch [3/3], Step [2400/6471], Loss: 2.1215, Perplexity: 8.34411\nEpoch [3/3], Step [2500/6471], Loss: 1.9870, Perplexity: 7.29389\nEpoch [3/3], Step [2600/6471], Loss: 2.1111, Perplexity: 8.25726\nEpoch [3/3], Step [2700/6471], Loss: 1.8926, Perplexity: 6.63631\nEpoch [3/3], Step [2800/6471], Loss: 2.0022, Perplexity: 7.40557\nEpoch [3/3], Step [2900/6471], Loss: 1.9249, Perplexity: 6.85467\nEpoch [3/3], Step [3000/6471], Loss: 1.8835, Perplexity: 6.57626\nEpoch [3/3], Step [3100/6471], Loss: 2.0569, Perplexity: 7.82189\nEpoch [3/3], Step [3200/6471], Loss: 1.8780, Perplexity: 6.54040\nEpoch [3/3], Step [3300/6471], Loss: 2.3703, Perplexity: 10.7010\nEpoch [3/3], Step [3400/6471], Loss: 1.9703, 
Perplexity: 7.17267\nEpoch [3/3], Step [3500/6471], Loss: 1.9115, Perplexity: 6.76300\nEpoch [3/3], Step [3600/6471], Loss: 2.2174, Perplexity: 9.18364\nEpoch [3/3], Step [3700/6471], Loss: 2.4291, Perplexity: 11.3490\nEpoch [3/3], Step [3800/6471], Loss: 2.3135, Perplexity: 10.1093\nEpoch [3/3], Step [3900/6471], Loss: 1.9082, Perplexity: 6.74124\nEpoch [3/3], Step [4000/6471], Loss: 1.9494, Perplexity: 7.02424\nEpoch [3/3], Step [4100/6471], Loss: 1.8795, Perplexity: 6.55057\nEpoch [3/3], Step [4200/6471], Loss: 2.0943, Perplexity: 8.12024\nEpoch [3/3], Step [4300/6471], Loss: 1.9174, Perplexity: 6.80361\nEpoch [3/3], Step [4400/6471], Loss: 1.8159, Perplexity: 6.14634\nEpoch [3/3], Step [4500/6471], Loss: 2.1579, Perplexity: 8.65335\nEpoch [3/3], Step [4600/6471], Loss: 2.0022, Perplexity: 7.40562\nEpoch [3/3], Step [4700/6471], Loss: 2.0300, Perplexity: 7.61381\nEpoch [3/3], Step [4800/6471], Loss: 1.9009, Perplexity: 6.69223\nEpoch [3/3], Step [4900/6471], Loss: 2.4837, Perplexity: 11.9857\nEpoch [3/3], Step [5000/6471], Loss: 2.0528, Perplexity: 7.79005\nEpoch [3/3], Step [5100/6471], Loss: 1.9514, Perplexity: 7.03869\nEpoch [3/3], Step [5200/6471], Loss: 1.8162, Perplexity: 6.14836\nEpoch [3/3], Step [5300/6471], Loss: 2.0564, Perplexity: 7.81761\nEpoch [3/3], Step [5400/6471], Loss: 1.8345, Perplexity: 6.26224\nEpoch [3/3], Step [5500/6471], Loss: 2.2075, Perplexity: 9.09278\nEpoch [3/3], Step [5600/6471], Loss: 1.8813, Perplexity: 6.56204\nEpoch [3/3], Step [5700/6471], Loss: 1.8286, Perplexity: 6.22503\nEpoch [3/3], Step [5800/6471], Loss: 1.8301, Perplexity: 6.23444\nEpoch [3/3], Step [5900/6471], Loss: 1.9318, Perplexity: 6.90176\nEpoch [3/3], Step [6000/6471], Loss: 1.9549, Perplexity: 7.06348\nEpoch [3/3], Step [6100/6471], Loss: 1.9326, Perplexity: 6.90775\nEpoch [3/3], Step [6200/6471], Loss: 2.0268, Perplexity: 7.58943\nEpoch [3/3], Step [6300/6471], Loss: 1.8465, Perplexity: 6.33754\nEpoch [3/3], Step [6400/6471], Loss: 1.9052, Perplexity: 6.72096\nEpoch [3/3], Step [6471/6471], Loss: 2.0248, Perplexity: 7.57506"
]
],
[
[
"<a id='step3'></a>\n## Step 3: (Optional) Validate your Model\n\nTo assess potential overfitting, one approach is to assess performance on a validation set. If you decide to do this **optional** task, you are required to first complete all of the steps in the next notebook in the sequence (**3_Inference.ipynb**); as part of that notebook, you will write and test code (specifically, the `sample` method in the `DecoderRNN` class) that uses your RNN decoder to generate captions. That code will prove incredibly useful here. \n\nIf you decide to validate your model, please do not edit the data loader in **data_loader.py**. Instead, create a new file named **data_loader_val.py** containing the code for obtaining the data loader for the validation data. You can access:\n- the validation images at filepath `'/opt/cocoapi/images/train2014/'`, and\n- the validation image caption annotation file at filepath `'/opt/cocoapi/annotations/captions_val2014.json'`.\n\nThe suggested approach to validating your model involves creating a json file such as [this one](https://github.com/cocodataset/cocoapi/blob/master/results/captions_val2014_fakecap_results.json) containing your model's predicted captions for the validation images. Then, you can write your own script or use one that you [find online](https://github.com/tylin/coco-caption) to calculate the BLEU score of your model. You can read more about the BLEU score, along with other evaluation metrics (such as TEOR and Cider) in section 4.1 of [this paper](https://arxiv.org/pdf/1411.4555.pdf). For more information about how to use the annotation file, check out the [website](http://cocodataset.org/#download) for the COCO dataset.",
"_____no_output_____"
]
],
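As a starting point for this optional step, here is a minimal hedged sketch of a BLEU computation with NLTK rather than the official coco-caption toolkit. The `predictions` and `references` dictionaries are hypothetical placeholders you would populate from your validation data loader; `punkt` was already downloaded at the top of this notebook.

```python
from nltk.tokenize import word_tokenize
from nltk.translate.bleu_score import corpus_bleu

# Hypothetical toy data standing in for model output and COCO annotations.
predictions = {1: 'a man is riding a horse'}             # image_id -> generated caption
references = {1: ['a man is riding a horse',             # image_id -> ground-truth captions
                  'a person rides a brown horse']}

list_of_references = []  # per image: a list of tokenized reference captions
hypotheses = []          # per image: the tokenized generated caption
for image_id, caption in predictions.items():
    list_of_references.append([word_tokenize(ref.lower()) for ref in references[image_id]])
    hypotheses.append(word_tokenize(caption.lower()))

print('BLEU-4:', corpus_bleu(list_of_references, hypotheses))  # 1.0 on this toy example
```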
[
[
"# (Optional) TODO: Validate your model.",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d062038e41a45a23deb4f42e109f95ea47b719ff | 658,049 | ipynb | Jupyter Notebook | mark/export2db.ipynb | shiniao/baozheng | c1ec89e5ce2395abad089f1e7b92c3b31842e9f8 | [
"MIT"
]
| 1 | 2021-02-19T14:51:43.000Z | 2021-02-19T14:51:43.000Z | mark/export2db.ipynb | shiniao/baozheng | c1ec89e5ce2395abad089f1e7b92c3b31842e9f8 | [
"MIT"
]
| null | null | null | mark/export2db.ipynb | shiniao/baozheng | c1ec89e5ce2395abad089f1e7b92c3b31842e9f8 | [
"MIT"
]
| null | null | null | 61.534412 | 1,500 | 0.41543 | [
[
[
"import pandas as pd\nimport mysql.connector\nfrom sqlalchemy import create_engine",
"_____no_output_____"
],
[
"df = pd.read_csv('/Users/zhezhezhu/.bz_datasets/datasets_shiniao_create_dataset_test.txt',\n sep='\\t')\ndf",
"_____no_output_____"
],
[
"df.memory_usage(index=False, deep=True).sum()\n\nMYSQL_USER \t\t= 'root'\nMYSQL_PASSWORD \t= '19(zhezhezhu)95'\nMYSQL_HOST_IP \t= '127.0.0.1'\nMYSQL_PORT\t\t= \"3306\"\nMYSQL_DATABASE\t= 'datasets'\nengine = create_engine('mysql+mysqlconnector://'+MYSQL_USER+':'+MYSQL_PASSWORD+'@'+MYSQL_HOST_IP+':'+MYSQL_PORT+'/'+MYSQL_DATABASE, echo=False)\n",
"_____no_output_____"
],
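Once the chunked load in the next cell has finished, a quick sanity check with standard `pandas.read_sql` against the `engine` created above can confirm the rows landed (a sketch, assuming the `spam_message` table exists):

```python
# Verify the export: total row count plus a small preview.
row_count = pd.read_sql('SELECT COUNT(*) AS n FROM spam_message', con=engine)
sample = pd.read_sql('SELECT * FROM spam_message LIMIT 5', con=engine)
print(row_count)
print(sample)
```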
[
"idx = 1\nfor df in pd.read_csv('/Users/zhezhezhu/projects/data_source/spam_message/spam_message.txt',\n names=[\"classify\",\"content\"],\n chunksize=500,\n sep='\\t'):\n df.to_sql(name=\"spam_message\", con=engine, if_exists='replace', chunksize=500)\n print(str(500 * (idx+1))+\" Processed\")",
" classify content\n0 0 ๅไธ็งๅฏ็็งๅฏๆง้ฃๆฏ็ปด็ณปๅ
ถๅไธไปทๅผๅๅๆญๅฐไฝ็ๅๆๆกไปถไนไธ\n1 1 ๅๅฃ้ฟ็ๆฝๆฐๆฅ็ฌฌไธๆน้้ๆฅ่ฃ
ๅฐๅบๅฆ๎ ๎ ๎ ๆฅๆ่ฑๅผๆทๅฅณ่ฃใๅฐ่่ฒๅ
ฌไธป่กซ๎ ...\n2 0 ๅธฆ็ปๆไปฌๅคงๅธธๅทไธๅบๅฃฎ่ง็่ง่ง็ๅฎด\n3 0 ๆๅๅ ไธๆ็ๆณๅฐฟ็ณป็ป็ป็ณ็ญ\n4 0 23ๅนดไป็ๅๆๅๆฅ็้บป้บป็ๅซๅฆ\n1000 Processed\n classify content\n500 0 ่ก่ๅ็ด ๅขๅ 3ๅใ็ปด็็ด Bl2ๅขๅ 4ๅใ็ปด็็ด Cๅขๅ 4\n501 0 ้ซ็ฎก้ฝ้่ฆKPIๅฐฑๆฒก่ตๆ ผๅ้ซ็ฎก\n502 0 ๆคๅฃซไธๆฃๆฅๆๅผๆไน็้ฝ่ๆ่ฟๆ ทไบ\n503 1 x.x-x.xๆฅๅผ ๅฎถ่พน่ๅฎ๏ผๆข็พ็็ฉบ่ฐ๏ผ ้ขๅญxxๅ
๏ผๆไฝ=xxxๅ
๏ผๆ้ซ=xxxxๅ
๏ผ้ข็บฆ...\n504 0 ็ซ็ฎญไผ่ตๆๆป็ป๏ผๅฏๅฒๅป่ฅฟ้จๅ ๅๆงๆไธๅฒๆ\n1000 Processed\n classify content\n1000 1 ๅฒๆพ็ตๅจ้ฉไฟๅญฆ้ๅไธ็ญ๏ผๆ่ฐขๆๅ้ขๅฏผ็ๆฏๆ๏ผ็ฉบ่ฐๆนๅไปทไพๅบxxxๅฅ๏ผx.xxๅน็ฉบ่ฐxxxxๅ
...\n1001 0 ๅฆๆ่ฏด่ฎคไบไบฒๆๅฐฑไปฃ่กจ็่
่ดฅ\n1002 0 /็ทๅญๅจๆดพๅบๆๆๅผบๅฅธๅฅณๅญฉ่ขซๆๆถ่ฃคๅญ่ฟๆฒกๆไธ\n1003 0 15ๅฒไปฅไธๆฏๆฅ1ๆฌก2็ฒ้ๆถๆ็จ\n1004 0 ๆ่ต่
ไพ็ถๅจๅๅไบๆๆงไธไพง\n1000 Processed\n classify content\n1500 1 ๆค่คๅxๆ ๆดๅคดๆฐดxๆ๏ผๆปกxxxๅ
้xxๅ
่ถ
ๅธๆต็จๅท.xๅทไธxๅท็ๅบไบ่ถ
ๅธใ\n1501 0 ๅๅฒๆนไฝไบๆตๆฑ็ๆญๅทๅธๆทณๅฎๅฟๅขๅ
\n1502 0 ๅคๅญฃ่กขๅท็ๅคฉไบฎ็ๆฉ็ปไบๅคฉไบฎไบ็ญไธๅๅฎถ่ดตๅท้ป่ฅฟๅ~ๆตๆฑไธฝๆฐดๅธๆๆไนๆ ๅ้กพ้ฉฌไธๅ่นๅ้ๅๅฝ่ฝ่ตถไธ...\n1503 0 ๆนๅ้ซ้็ๅฐพๆถๅกๆนๅผ่
่ดฅๆก\n1504 0 05ไบฟ็พๅ
็ไปทๆ ผๅๆบๅฉๅฎๆๆณๅ ๆฏๅก้ๅขๅบๅฎ่จๅฐ่ฟช็ฆ้็ฟ50%่กๆ\n1000 Processed\n classify content\n2000 0 ๆฏๅคฉๆไธๅฐไบ11็นๅทฆๅณๅฎถ้็ฝ้ๆ
ขๅพๆ่ฆ็ ธๆๆบ\n2001 0 ไปๅนดไธปๆ็้ฒๆๅท้พ้ฉๅฝ่ถ
็ซ็Re๏ผcipeๆฐดๆถ้ฒๆ็ฅๅจๆฐดๆถๅท้พ\n2002 1 ไธบ็ญ่ฐขๆฐ่ๅฎขๆทๅฏนๆๅบ็ๅคงๅๆฏๆ๏ผๆญฃๆๅไบๅฐๆฌๅบๅ็ๅฎฝๅธฆไธๅกๅๅ ๆดปๅจๆบๅ่
ๅฏๅญๆญค็ญไฟกไผๆ xxๅ
...\n2003 0 ไธบ้ๅถๆฐด็ฎฑ็ญ4ๅฎถไผไธไธๆฅๅพ
ๆน่ฐๆด็ฝฎๆขๅๅฐ46\n2004 0 12342ๆ็คบๅ12345ๆ่ฏ\n1000 Processed\n classify content\n2500 1 Duang...?้ฆๅ
ฐๆไธบๅบ็ฅx.xๅฆๅฅณ่๏ผ็นๅซๆจๅบไผๆ ๅ
ๅผๆดปๅจ๏ผๅ
็่ถๅค้็่ถๅค๏ผๅ้ขๆ้...\n2501 1 ไธ่ฟชๅ็พ่พพๅนฟๅบ้
ๅบไบๆฅผ้ๆฑคSPAไผๆ ๏ผๆบๆๅ
จไฝๅๅทฅๅจๅ
ๅฎตไฝณ่ๅฐๆฅไน้
๏ผ็ฅๆจๅ
ๅฎต่ๅฟซไน๏ผๅๅฎถ...\n2502 0 โโใ็ฌฌไธๆนๆฏไปๆๅคฑP2Pๆฒๆท่ต้้ถ่กๆข้ฑๆขๅฐ็\n2503 0 ไธบไฝโไธญๅฝๅฅฝๅฃฐ้ณโ็้ๆๆดไฝไธๅฆโ่ถ
ๅฅณๅฟซ็ทโ็บข\n2504 1 ๆต ๆฐด*ๆฐๅ้ๅฎ้จๆญ็ฅๆฐ่ๅฎขๆทๅ
ๅฎต่ๅฟซไน๏ผๆ้ ๅฝๅ
้ฆๅฎถ็ตๅๅ่ดธๅธๅบ๏ผ็ง้ๆตๆไพ๏ผ่ฝปๆพๅๆฟไธ๏ผไธ...\n1000 Processed\n classify content\n3000 0 ใxxไธ็ญๆญ่ง้ขใ่ฟฝๆ่ฟไปถๅฐไบ\n3001 0 18ๅฒๅฐไผ\"่ชๅญฆ\"ๆ้ปๅฎข็ๅท160ไธไบบไฟก็จๅก\n3002 0 ๅจ่ฏขๅฟๆฟ็ไบบ่ถๆฅ่ถๅคไบโฆๅคงๅฎถ่ฟๆฅ็ธ็ฅ็ธ่ฏ\n3003 0 ๅฐ็พ่ๆฉ้ซๆธ
็พๅบฆ็ฝ็ไธ่ฝฝ้พๆฅ\n3004 0 ๆๅฏไปฅๅ
่ดน่ฝฌ่ต ่ฟ2ไธๅ
็ๅฅๅญฆ้\n1000 Processed\n classify content\n3500 0 ็ฟ้ธฟ่ฝ็ถๆฒกๅฏผ่ด่ฟไบๆธฏ็้ด้จ\n3501 1 ๅง่ดๅจๅ ไนณ่
บ็่ตฐไบ๏ผ่ฟๆฌพไบงๅๅด็ซไบ๏ผๆฏๅคฉๅช้x.xxๅ
๏ผไธๅนดxxxๅ
๏ผๅฐฑๅฏไปฅๆฏไธไธชๅฅณๆง่ทๅพx...\n3502 1 ๆ่ฐขๆจ่ด็ตๅๅณฐ็ณ้ถ--ๆจ่บซ่พน็่ดทๆฌพๆ
ไฟๆดไฝๆนๆก่งฃๅณไธๅฎถ๏ผๆไปฌๅฐๅง็ปไธบๆจๆไพไธไธใไพฟๆท่ดดๅฟ็ๆ...\n3503 0 ็ฎๅฝฑใๆ่ใๅปบ็ญใๅบๅ
ธใๅคง่กใไบบ็พค\n3504 0 ่่ฏทๆตๆฑ็ๆดๅฝขๅคๅๅงไบบ้ฉฌๅฅๆๆๆ
ไปปๅ่ช้ข้ฟ\n1000 Processed\n classify content\n4000 0 ๆไปฌไผผไนๆฏ่ขซๅฆๅงๅจๆฝ่งๅไธๆดป็็็ๅฝ\n4001 0 ๅฐ้่ฟ็ตๅญ่ญฆๅฏๆๆ่ฟ่ก็ฎก็\n4002 0 ๅฐฑๅจๅไบฌ้ คๅ คๆธฏ็2015NBANATION็ฑ็ๅๅบฆๅไบฌๅ ด\n4003 0 ็ฐๅจ่งๅพ้ปๅๅกไนไธ้ฃไน่ฆไธ้ฃไน้
ธไบ\n4004 1 ๅฉ็ๆตฎๅฐ้กถ๏ผๅญๆฌพๆๅ็ฎ๏ผ่ชxxxxๅนดxๆxๆฅ่ตท๏ผๆ่กๅญๆฌพๅฉ็็ปไธไธๆตฎxx%๏ผ่ฎฉๆจๅฐไบซๆ้ซๅญๆฌพ...\n1000 Processed\n classify content\n4500 1 ๆ้ณๅ
ฌๅญ๏ผ้ซๅฑ่งๆฏxๅฑ
ๅฎค๏ผ่ง้็นๅซๅฅฝ๏ผๆ ไปปไฝ้ฎๆก๏ผ้ๅธธๅฎ้ๅ้ขไธไธด่ก๏ผๆปไปทไฝ๏ผไฝไบๅธๅบxxไธ...\n4501 0 It'samazinghow้ข่ฒ\n4502 1 ๅปบ่ก๏ผxxxx xxxx xxxx xxxx xxxๆจไธฝๅจ\n4503 0 ๆฅไธๆฅ็ไปปๅกๅฐฑๆฏ่ฐๆฅๅพๅท็ๅคฉๆฐ็ฏๅข\n4504 0 โขๆไธๅ้
ธๅฅถๆๅฉไบ่กฅ้๏ผๆ้ด12็น่ณๅๆจๆฏไบบไฝ่ก้ๅซ้ๆไฝ็ๆถๅ\n1000 Processed\n classify content\n5000 1 ใๆ ผๅ
ฐ็ๅผๅ
ฐ่้จไธๆนไธๆใไบฒ็ฑ็ๅฅณ็ฅx.xๅฅณไบบ่๏ผๅ
จๅบไธไปถxๆ๏ผไธคไปถxๆ๏ผไธไปถxๆๅ้ไปทๅผx...\n5001 0 ๆๅบ็ ็ฉถ็้ถๆฎต็็ฌฌไธไปฝ็ฎๅ\n5002 0 ๅ
ถไปๆๆบๆๅบๆฅ็็
ง็้ฝๆบๅฅฝ็็\n5003 0 ็ฑๆฑ่ฅฟ็ๆ
ๆธธ่งๅ็ ็ฉถ้ขไธปๅ\n5004 0 ๆๅฆ่ฏดๆ่ไธ้ๅ~ๆไปฅๅฆๆๆๅๆฌข็ๆๅๅฏไปฅๆพๆไนฐ\n1000 Processed\n classify content\n5500 0 ่ไธๅดๆฑๅบ็ณป็ป่ทๅญๅบไธไธๆ ท\n5501 0 ้ ๅฅนๆฏๆๅชๆ2000ๅค็ๅทฅ่ณ็ถญๆ\n5502 0 ๆ ้ก็็ฑ็น้86ๅฒ็ๆๆ่ๅ
ตๅดๆ่ๅ
็ๆฅๅฐ็็ฑ\n5503 0 ็ๅฎๅ
ๆๆไบบๆๆ็ฅ้่ฎพ่ฎกๅธๅฏไปฅๅ่ทฏๅบ้ๅๆฌไปๆฒกๅญฆ่ฟ็ไธไธๅฐฑๅฏไปฅๆ
ไปป\n5504 1 ๅคง็พ็
ไฟ้ฉไบงๅ้ฟไฟๅฎๅบทๅฐไบxxxxๅนดxxๆxxๆฅ่ตทๆญฃๅผๅๅฎ๏ผๆฌไบงๅ็น่ฒๆฏไฟ่ดนไฝ๏ผไฟ้้ซ๏ผไธๆฌก...\n1000 Processed\n classify content\n6000 0 ็ฑๅฟ1็ญไปๅคฉไธ็ๆฏ็ปๅพ็ปไธ่ฒ่ฏพ\n6001 0 ไธญๅฝๅฅฝๅฃฐ้ณๆจๆ26ๅฒๆนๅ้ฃไธช็ท็้ปๆบ\n6002 0 ็ฌฌไธ้ถๆฎต๏ผ7ๆ31ๆฅ่ณ8ๆ6ๆฅ\n6003 0 ๅๅ
ฌไบคๅปๆดๅฎนๅป้ขไธๅๅป็่ฏดไฝ ่ฟๅดๅฝฆ็ฅ็่ธ่ฟๆด\n6004 0 ๆๅกๅฎขๆท็็ๅฟตไน่ฎฉๅไธบ้ๆธๆไธบไธญๅฝไบบๅผไปฅไธบ่ฑช็ๅฝไบงๅ็\n1000 Processed\n classify content\n6500 1 ๆทฎๅ็งปๅจxGๆๆบโxโๅ
่ดญ๏ผ่ฃธๆบ็ด้้ซ่พพxxxๅ
ใๅไธบPxๅไปทxxxx๏ผ็ฐๅบxxxx๏ผx.x...\n6501 0 ๅบไบTwitterๅบๅคง็็จๆท็พคๆฅๅ้ ไปฅ่ฏ่ฎบไธบไบฎ็น็้ณไนๅ
ๅฎนๆ็ดขๅไบซๆๅก\n6502 0 xๅ็ทๅญๅจๅฏนๆนๆ็ปไบคๅบๆๆบๅๆ็ ธๅฐ่ฝฆ\n6503 0 ๅๆๆถ้ดๅฏไปฅๅบๅปๆ
ๆธธไบ~ๆๅฐๅบ่ฆไธ่ฆๅปไบๅๅข\n6504 1 ๆจๅฅฝ๏ผๆๆณๅจ่ฏขๅ
ณไบ ไน่ๅฐๅบ ๅๅ้้ ๅพๆฟ็้ซ ๅฐ้ๅฃ ๆฅๅฎ ๅฎไปทxxx.xไธ/ๆxๅฎคxๅ
...\n1000 Processed\n classify content\n7000 0 ไปๅคฉๆไธๅๆๅคฉๆไธๆฏFTISLAND้ฆๅฐๆผๅฑไผ\n7001 0 ๅฑฑไธ่ฝ้ฉฌๅฅณๆณๅฎ่ขซๆดๆขฐๅ
ท็งฐไธ่ฝ่ก่ตฐๅป้ข็งฐ่ฏ็
้ๅฐ่ฟๆ ท็ไบบไนๆฏๆ ่ฏญๅๅไบ\n7002 1 ๆ่ฐขๆจ่ด็ตๆๆๅๆตทๆฑฝๆๅกๆ้ๅ
ฌๅธ๏ผๆๅ
ฌๅธๆฏ็ฆ็ฐๆดๅงๅๆฑฝ่ฝฆๆๆๆๅกๅ๏ผๅ็ง้ฝๅ
จ๏ผๅทฅ่บ็ฒพๆน๏ผไปทๆ ผ...\n7003 0 ๅชๆๅจไธไฟๅฑ้ข้ญๅฐ่ดจ็็ๆถๅๆ่ฝ็กฎๅฎ่ชๅทฑ็ๅ
ๅฟ\n7004 1 ๅฏๅฐ็ไบๆฅผโ้น้นคโๅฎๆจๅฎถๅ
ทๅฐไบxๆxๆฅๆฉx็น่ณๆx็นไธพ่กๅคงๅไฟ้ๆดปๅจ๏ผ้น้นคๅฎถๅ
ทๆฏๅๅฎถ็ด่ฅๅบ๏ผ...\n1000 Processed\n classify content\n7500 0 ๅๅทๅ
ฌๅฎ็ๅไพฆ่ญฆ็ฌ้ฝๅฐๆฅๅ่ฝๅ่ฎค่ฏ่ๆ ธ\n7501 0 WVๅๆฌกไธไบ้ฑๆฑๅซ่งๆฐ้ปๅๅธไผ\n7502 0 ๆ็ๅพฎ่ฝฏ่ดฆๆท่ฝ็ปๅฎๅ ๅฐ็ต่\n7503 0 ๅฏๆๅๅฟ่ฟๆฏ่ดฟ่ตไธไธโฆ็ดซ่ฑๆๆๆ ท็ๆ
ไบไนไปๆฅๆฒกๆโฆๆๆๅ็็ไนๆญปไบโฆโฆๅโฆโฆ\n7504 0 Nbaๆฌ ๅ่ฏบๆฏๅฉไธไธชmvp\n1000 Processed\n classify content\n8000 0 ไนๅไป็ฑไบไฝฟ็จๆฏๅๅๆปฅ็จๆดๅ\n8001 0 ๅฅ่บซๅ6ๅคงไธ้โโ3ใๅคด็ๅคดๆไธ่ฌๅไธไบๅง็ๅจไฝๆถ\n8002 0 ๅไบฌๆธฏไธญๅ Aๅฎๅฉๆฅไธ่ฏบ็็ฉๅๆนๆฑ้็ดซ้ซ่ฏไธ\n8003 0 ่ฏ็ไผๅฏๅจๅ่่ตๆฏๆ็ณ้ฎ่ทฏ\n8004 0 ็็งๅๆฐ็บฟๅๅซไธบ740ๅใ690ๅใ631ๅใ625ๅ\n1000 Processed\n classify content\n8500 1 ๅ
่ดนๅจ่ฏขxxxxxxxxxxxๆฐ็ๆบ้บปๆงๅถๅจ๏ผ่ตทๆๅฅฝ็๏ผไธๅฎ่ฃ
๏ผๅฏ็ฐๅบ็ๆๆ๏ผๆปกๆๅไนฐใt\n8501 0 16ๅฒ็ๅฎๅฐๅฅณ็้ญๅ ไผฏๅผบๅฅธๅฎถๅฑๆฅๆก่ญฆๆนไปๅ
ฅๅฎไน ็ๅๆญ่ฎฐ่
ๅต็ณ8ๆ6ๆฅ\n8502 0 ็พๅฆใๅฟๆบใๆฝ่งๅใ็ปฟ่ถๅฉ็ญๆไธบ็ฝ็บข็ไปฃๅ่ฏ\n8503 0 ๅๆฎตๆถ้ดไนฐไบๅฐ็ฑณNoteๅ
จ็ฝ้ๅ็ฐๅช่ฝ็จ็ตไฟกไธ็ฝ\n8504 0 ๅฐไธ้ฃๆบๅๅฅนไปฌๅไนๆฒกๅ่่ฟๅคฉไบ\n1000 Processed\n classify content\n9000 0 ็ปไปไปฌ็ไธ็่ก็ฅจๆฏๅคไน็ๅผ้ฑ\n9001 0 ๅผ ็ดซๅซฃ่ฏด็ๆฏ่ง็ๅฅๆไธ้ข็ปๅฅนไธ็พไธๅพๅฎนๆ็ๆฒก่ง่ฟliyingxin้ฃๆ ท็ไบบ่ฏดๅๆ็ปๅฉๆฏๅทฒๅฉ็...\n9002 0 ไธๆฅผ็้ฒ็็ชๆฏๅฐๅทๅคฉ็ถ็ๆขฏๅญ\n9003 0 ๅฐๅจ็ตๆขฏ้็่
นๆณป็ทๅฐผ็ๆกไธ่ตทๅ\n9004 0 Sprintๅ
ฌๅธๅจๅ
จ็็บฆๆ7ไธๅ\n1000 Processed\n classify content\n9500 0 ๆตๆฑๆ ่ฏ่ๆฟๆฌ ๅทฅไบบๅทฅ่ตๅๅนดไธ็ป\n9501 0 ๆฑๅฑฑไนไนๆญๅทๅไบฌๅ้็ฆๅปบๆตฆๅๅตๅต\n9502 0 ่ฝฌๆญๅฐ่
พ่ฎฏๅพฎๅๆฟๅพท็ฌฌ4ไธชๅ
ๅฎ่ฝ้ฉฌ\n9503 0 7ๆ29ๆฅ้ฟๅๆณฐๆๆๅฆๅ่ดดๆ้ๅฎไปๆฅ็ฎๆ 800\n9504 0 ่ๅคดๅฏๆตๆ็
ๆฏใ็่ใๅ็็็ฉๅๅฏ็็ฉ\n1000 Processed\n classify content\n10000 0 ๅพฎ่ฝฏไธบไบ้ๆฐไบๅคบ็งปๅจๅธๅบไปฝ้ข\n10001 0 ็ฌฌไธๆถ้ดๆฟ่ตทๆๆบ็นๅผๅพฎไฟก้ขๅ็บขๅ
ไธๆฐๅตๆ\n10002 1 ไฝ ๅฅฝ๏ผๆๆฏๅๅๆ็ต่ฏ่ฟๆฅ็ๅพทๅฝๆฑๆฏๆ ผ้
ๆดๅ
ทๅฐๅด๏ผๆฑๆฏๆ ผ้
xxxๅ
จๅนดๆไฝไปทๅคงไฟ๏ผ่ฑๆด๏ผ้พๅคดๅฅ็ป...\n10003 0 ๅซๅๅฐๅทๆปๆฏ่งไธๅพไบบๅฎถๆฏไฝ ๅฅฝ\n10004 0 ๆๆๆฏๅคฉ้ฝๆฆๅฅฝ้ฒๆ้ๅๅบ้จ\n1000 Processed\n classify content\n10500 0 ไธๅฎ้ฝๅป่ฟไบๅง~ไฝ่ฏด่ตทๅนฟๅท็ๆตทๆด็็ฉๆ ๆฌ้ฆใๅฒญๅ่ฏ็งๆฎๅบๅฐใ่ถณ็ๆบๅจไบบๅฎ้ชๅฎคใ่ฐ่น็งๆฎๅบๅฐโฆ...\n10501 0 ๆฏไธๆญฅ้ฝไธๆ็ฝ่ตฐ้ๆฏๆๆ่ฟ็ๅฟๆ
ๅคงๅฎถๅจๆฏๅ้ๆฎต็ๅชๅ้ฝไธๆ็ฝ่ฒป็ๅ ๆฒน๏ฝ๏ฝ๏ฝ๏ฝๆ่ฌไนๅ่ชๅทฑ็ๅช...\n10502 0 ๅ็ฐไบไธไธชๅiwanna็ธไผผ็ๆๆบๆธธๆ\n10503 0 ๆณจๅฎ่ฆๅ ๅปโฆโฆ็ต่ไธญๆฏๆ ผๅผๅไธญโฆโฆ้ฃไบ่ฎฐๅฝ\n10504 0 ้ฃไนๅไธบG8็้
็ฝฎๆไนๆ ทๅข\n1000 Processed\n classify content\n11000 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ 46bhdxไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n11001 0 ๆฏๆ
ๆธธๅบฆๅใไผ้ฒๅจฑไนใๆๅ่้คไธ้
็็ๆณ้ฃๅ\n11002 0 ๅฐฑ่ฟๆ ท่ขซๆฟๅบ้จ้จ็ไธไฝไธบ็ปๆฏๆไบ\n11003 0 ๆๆฅไปๅคฉไธๅๆฌๆฅๆณๅ้ฃๆบๅๅป็\n11004 0 ๅทฅไฟก้จๅ
ฌๅธ2015ไธญๅฝ้ฆๆนๆบ่ฝๅถ้ ไธ้กน้กน็ฎ\n1000 Processed\n classify content\n11500 1 ๆตทๅฐ็ฉบ่ฐๅฏๅฉ่ณ็ปๆจๆไธชๆๅนด๏ผ็ฅๆจๅฎถๅบญๅ็พๅฆๆ๏ผๅฟๆ
็พๅ
ๆปก้ข๏ผๅฅๅบท็พ็พๅพๆ๏ผๅ่ฏๆจไธชๅฅฝๆถๆฏ๏ผๆตท...\n11501 0 ๅญฉๅญไปฌๆฏๅคฉๆณ็็ต่ๆๆบๆธธๆ\n11502 0 ๅกซๅ้่ฏท็ ๏ผ548903ๅฏ้ข3ๅ
็บขๅ
\n11503 0 ไปไนๆ
ๅตใ19ๅฒๅผฑๆบๅฅณๅๅนดๅ
ๅฎๆฝ็็ช20ไฝ่ตท่ขซๆ่ทไธๅ็ฝ็ฆๅปบ็ฌฌไธ้จๆทใ\n11504 0 ๅ
ถไธญๆ้่ฆ็ๆฏ่่ค็ป่่ๅๅผ่ตท็่่ค่ๅๆพๅผ\n1000 Processed\n classify content\n12000 0 ๆฒก็จๆทๆ็ปด็ปๅฏน่ขซไบ่็ฝๆถไปฃๆทๆฑฐ\n12001 0 ๅไบซๅ็ฌๆถ่ดนๆปก2000ๅ
่ต LASANTEๆฅๅ
้คๅ
้ธกๅฐพ้
ไนฐไธ้ไธๅธไธๅผ \n12002 0 ๅผๅงๅฎ่ดไธ็ด่ดจ็ๆๅฎถ็็ด้ฎ\n12003 0 ๅฅๆช็ๆฏ~~ไปไปไนๆถๅๅป้ฃไธ็ญๅฆ\n12004 0 ็ธๆฏๅปๅนดๆไปฌๅจๅฆๅ
ๆน้ขๅไบไปไนๆน่ฟ\n1000 Processed\n classify content\n12500 0 ๅซๆๅคๅคฑๆโฆmaybe่ฟๆฏๅจๆณฐๅฝ็ๆๆญฃๅฎๅง\n12501 0 ไฝ ็จไบ13ไบฟไบบ็้ฑๅฏ่ตทๆฅไบไฝ ไปฌ8000ไธ\n12502 0 ๅฟ็บงไปฅไธไบบๆฐๆฟๅบๅบๅฝๅปบ\n12503 0 ็ไบๅฅฝๅฃฐ้ณ่งๅพๅจๆฐไผฆๅคชๅฏ็ฑไบ\n12504 0 ่ถ
1ไธ่ต็ๆ็งไนฆ่ฌ็ๅฎถ่ฃ
ๅพฎไฟก่ฅ้\n1000 Processed\n classify content\n13000 0 โไธญๅฝ้ฃ่ฏ็็ฎกโAppๅ
จๆฐๅ็บงไธ็บฟ\n13001 0 1998ๅนด่ฟๆพ่ขซๆน็ผๆๅๅ็ต่งๅง\n13002 0 ้ไผดๅจๆ??้ฝๆฏ่ๅทๅฉ็บฑๅๅคงๅบ่ดญๅ
ฅ\n13003 0 ็ฑ็ฝๆๆถๅฐใ็ฝๆๆฟไบงใไฝฟ้ฆๅฃนๅท้ข่ๅไธพๅ็โไธญๅฝ่ชไฟกๆๅบฆๆถ่ฃ
showโๅจไบฌไธพ่ก\n13004 0 ่ฟ็งๅจ็ฉๆฌๆง็้ๆงๅบๅ ไธๆถ็้ฒๅบ็ฐ็็้ข็ฎ\n1000 Processed\n classify content\n13500 0 ๆๅ็ ็ฉถ็ๅ
ฅๅญฆ่่ฏๆ็ปฉๆๆฏไป้ซไบ่ฟ100ๅ\n13501 0 ๅฐ้ๆฏไธๆฏๅ่ฏฅๆขๅคๅ2RMBไบ\n13502 1 ๆฐๅนดๅฅฝ๏ผๆๆฏๅคง่ช็ถๆจ้จ๏ผ็งป้จ๏ผๅข็บธ็๏ผๆไปฌๅๅฎถxๆxๅท-xxๅทๅ
จ้จx.xๆ๏ผ่ฏทๆจไธๅฎถไบบไธๅฎ่ฆ...\n13503 1 ๅ
ๅฎต่ๅฟซไน๏ผ ๆฐๅบ้
ฌๅฎพ:้ท่ฟชๆฃฎๆฌๅฐๆฐ่ฑๆ ทx/xๅท่ตท๏ผไปปไฝไธๅบๆด้
๏ผๅๅฃซ๏ผ่ๅธฆ๏ผXO๏ผไนฐไธ็ถ้...\n13504 0 ไธบไบ่ฟไธช็ต่งๅงๅ
ไบ่
พ่ฎฏไผๅ\n1000 Processed\n classify content\n14000 0 RnD็ท็ผๆบ่ฝๅฝฉ็ปๅ็ฉๅบไบๆฐ่ฎพ่ฎก\n14001 0 58ไนฐๅจๅๆฅๅค3400่ก002066\n14002 0 ๆฏไผฆๆฆไบคๅไนๅขๅBBCไบคๅไนๅข็้ฉปๅขๅบๅฐ\n14003 0 ๆฌๆฅๅ
ญ็นไบๅ็้ฃๆบๆๆถๆๅๅปถ่ฏฏๅฐๅไธ็นไบโฆ็ๆฃ\n14004 0 ๅบๅปบไบคๅง๏ผ1๏ผๅฎๅฑฑๅบๅ
ฌๅ
ฑ่ช่ก่ฝฆ็ฝ็นๆ็
งๆนไพฟใ้้ๅ็ป็ญนๅ
ผ้กพๅๅๅๆนไพฟๅ็น็ญๅ ็ด \n1000 Processed\n classify content\n14500 1 ็ฏ็ๆฅๅ๏ผๅ้ขๆ้๏ผๅ
จๅฝๆฅๅ๏ผ้ป็ฝๆทๅ็xxwไปฅไธไฟก็จๅก๏ผxxๅคฉไธๅก๏ผ้่ฆๅ็็่็ณปๆ๏ผไธญไป...\n14501 0 ๅฅฝๆถๆฏ๏ผ่ฝๅๅ็ปต็ซน้ฃๆบๅฟซ้ๅไฝๆๅ\n14502 0 ๅฏๆๆบๅพฎๅ้ๆฒกๆพ็คบๅบๆฅโฆโฆไบๆฏๆๅพฎๅๅธ่ฝฝไบๅ้ไธไบไธๆฌกโฆโฆ็ปๆ่ฟๆฌก้ๅญฆไธ่กจๆ
ๆฒกๆ\n14503 0 ๆ
้ฟUI่ฎพ่ฎกใไบคไบ่ฎพ่ฎกใ็ตๅ่ฎพ่ฎกใVI่ฎพ่ฎก\n14504 0 ็ฎๅๅทฒ็ปๅฎๆ8500ไธ็พๅ
C่ฝฎ่่ต\n1000 Processed\n classify content\n15000 0 ๆไธๅกไบบ่ฝ่ทณๅฎๅ15ๅ้็็ญ่บซๅฐฑ็ดฏๅฐไธ่ฆไธ่ฆ็\n15001 0 ็ฑSEGAๅผๅใBBๅทฅไฝๅฎค็ไฟฎ็ผๅง\n15002 0 ไธ่ฎฟCassia่ตต็ฆๅ๏ผๅไบ20ๅนดWiFi\n15003 0 ๆบง้ณๅธๆฐ่ฑกๅฐ16ๆฅ8ๆถ30ๅ\n15004 0 ๅฝๅนด่ฏดๆณ่ๅไบฌ่ชๅคฉ่ช็ฉบๅคงๅญฆ็\n1000 Processed\n classify content\n15500 0 ่ช็ฑ็ๅๆตท่ถๆฏๅทฒ็ปไธ็ซ็ฎญ่พพๆ็ญพ็บฆๅ่ฎฎ\n15501 0 ๆฐๅ1933้ๆฑไธ่พพ๏ผ็ๆ1886\n15502 0 ๅๅๅฅนๅ
ณไบ็ฏๆ็ฉๆๆบ็ฉๅจไบ\n15503 0 xxไปถไฝ ไธ็ฅ้ๅ
ณไบ่ดพๆฏๆฑๆฏไผฏ็ไบๆ
\n15504 0 ๅขจๅฐๆฌboxhillsouthๅ
จๆฐtownhouseๅ้ดๅบ็ง$xxx/monthxxmin่ตฐ...\n1000 Processed\n classify content\n16000 0 ้ฃๆถๅๅฐ็ๅทฒ็ป่ฟๅ
ฅๆบๅจไบบๆถไปฃ\n16001 0 ๆณๅทๅธ้ฑผๅณฐๅบไบบๆฐๆณ้ขๅฎก็ปไธไปถไนฐๅๅๅ็บ ็บทๆกไปถ\n16002 1 ๅฎถ้ฟๆจๅฅฝไนๆฉ็พๆฏๆฅๅญฃ็ญๅทฒ็ปๅผๅงๆฅๅ๏ผๅฏ้ๆถๆฅๅๅ
่ดน่ฏๅฌ๏ผๆๅๆฅๅๅฏ้ๅฝไผๆ ใๅฐๅ๏ผ้ญๅxxๅท...\n16003 0 ไธๆฏไฝ ไปฌๆฟๅบ้ๅ็ซ่กจ่ฏๆไฝ ไปฌไธบไบบๆฐๆๅก็ๅฐๆน\n16004 0 ๅธๆๅคงๅฎถๅฏไปฅๅ็ฌๅ็ปๅฅน็บขๅ
\n1000 Processed\n classify content\n16500 1 TOTO้ฆๅฑๆบ่ฝๅซๆตด่ๅผๅงไบ๏ผๆบ่ฝ้ฉฌๆกถ็ๅช้xxxxๅ
๏ผๅญ็ญไฟกๅฏไบซๆๅxxๆ๏ผๆ ้ๅบๅฝ๏ผๅพฎไธช...\n16501 0 ๅไบฌใไธๆตทใๆตๆฑๆ่ตไบไปถๆฐ้ๅขๅน
่ถ
่ฟxxx%\n16502 0 ไธบไปไน็ฐๅจ็ๆๆบๅฐบๅฏธ้ฝๅพx\n16503 0 xๆxxๆฅ่ตทๆฅ็
งไธปๅๅบ็ฆ่กไธ่ฝฎ่ฝฆใๅ่ฝฎไปฃๆญฅ่ฝฆ\n16504 1 ๅงๆจๅฅฝ๏ผๆๆฏ่็ต็พๅฎนไผๆ็ๆฅ ๆฅ ๏ผไธๅ
ซ่ๅณๅฐๅฐๆฅ๏ผๆๅบไธบไบ็ญ่ฐข่้กพๅฎข้ฟๆไปฅๆฅๅฏน่็ต็พๅฎนไผๆ็ๆฏ...\n1000 Processed\n classify content\n17000 0 ่่ต็ง่ตๅ
ฌๅธ่ฎพ็ซๅญๅ
ฌๅธไธ่ฎพๆไฝๆณจๅ่ตๆฌ้ๅถ\n17001 0 ็ฑ่ดฏๆฑ่ๅธธๅทๅจไน่ฎธๅทฒ็ป่ฎฐไธ่ตทไป็\n17002 0 ็ฝๅคฉ้กถ็ไธญๆๅคฉ่ฟ่ฆ่ทๅป้ขๅทฎ็น่่ฑ็ๆ่ฟๅป\n17003 0 ๅ้ฆๆธฏ็ตๅฏฉๆณ้ข้ฆๅธญๆณๅฎ้ฆฌ้็ซๅๅฅ็้ ็ไปฃ่กจๅๆ้ข\n17004 0 ไป30ๅคๅนดๆฅๅ
ฑๅไฝ็ๅ ไธๅน
ไฝๅ\n1000 Processed\n classify content\n17500 0 ๅฐคๅ
ถๆฏๅผๅฏไบไนกๆๆ
ๆธธ็ๅ
จๆฐๆถไปฃ\n17501 0 1958ๅนด8ๆ็ฑๆตๆฑๅปๅญฆ้ขไปๆญๅทๅ่ฟ่ณ\n17502 1 ๅพๅทๅฐ็พๆบๆขฐ่ฎพๅคๆ้ๅ
ฌๅธใ่ๅๅฐๅบไธไธ้ๅปๆบ๏ผๆฟๅ
ๆบ๏ผๆจๅทฅๆบ้ๅฎๅใๅฐๅ๏ผๅพๅทๅธไบ้พๅบไธๆฒณๅคด...\n17503 0 ๅฎ้ตๅฟๆณๆฒณๆดพๅบๆๆๅบ่ฝไบๅฟ็ซฅ็่ญฆๅฏๅฅฝๆ ท็\n17504 0 ็ๅฎถ่ฑช็ๆฏ้ป็ๅฐฑๅทฎไธชๆไบฎๅฐฑๅฏไปฅๆผๅ
ๆฏไบ\n1000 Processed\n classify content\n18000 0 ๆฏไธไปถIGNISไบงๅ็ฑๅ
่ๅคๅๆธ้็ๆบไบ่ช็ถ็ๆทณๆด่่ฑไฟ็ๆฐ่ดจ\n18001 1 ๆบ็ๅ ไธบๆปก่ถณๆฐ่ๅฎขๆท้ๆฑ็นๆทปๅ ๏ผ็ซๆๆ้กน็ฎ๏ผๆดปๅจๆ้ดๆค็ฉๅฅๅบทๆๅๅช้xxๅ
๏ผๅฅๅบท้ถ็ท็ซๅๅช้...\n18002 0 ่ฟๆฏbabyๅ็่ฃ
ไฟฎๅฎถ็ๅทฅ็จ่กจ\n18003 0 ๅนถๅฏนๅบ็ง็ฉไธไบบๅ่ฟ่กๆถ้ฒๅฎๅ
จๅน่ฎญ\n18004 0 ่ฝ่ฎฉไธไธช57ๅฒ็ไบบ็่ตทๆฅๅ40ๅฒไธๆ ท\n1000 Processed\n classify content\n18500 0 ้ข็บฆไบๅฐ็ฑณNOTEๅ
จ็ฝ้ๅไธบ่ฃ่7ๅ
จ็ฝ้่ฃ่็
็ฉ4X\n18501 0 ๆ ้กไนฐไบๆๆฟๅ
ฌ็งฏ้่ดทๆฌพ30ไธ่ฏไผฐ่ดน่ฆ5ๅไธบไปไน่ฟไน้ซ่ฟๆ ทๆถ่ดนๅ็ๅ\n18502 0 ไธๆฏๅถๅฎไบไบบๆฐๆณๅบญ่ฎพ็ฝฎๆน้ฉๅฎๆฝๆนๆก\n18503 0 ไปๅนด่ฟ่ฆๆ้ xxxๅฅ็พค็งโๆ ทๆฟๆฟโ\n18504 0 ๅพฎ่ฝฏOfficeforMac2016ไธ็บฟไบ\n1000 Processed\n classify content\n19000 0 I'mnotallowed๏ฝไธๅนดๅคๅๅฟ่ฎฐๅ ไธบๅฅๆ็ฆๆญขๅฅน็ฉ\n19001 0 ไฝๆตๆฑไบค้็ตๅฐ93่ฟๅจ้ฃๅฟๆ
ไฝ็ซๆ
\n19002 0 ็ฎๅ่ๅท็ปง็ปญๅๅคง้้ซๅ็ๆงๅถ\n19003 0 ไธญๅฝไธไธ็ไบๅฅๅบทๅจ่ฏข็ฝ็ซ\n19004 0 ๆตๆฑๅคงๅญฆๅฎ้ชๅฎค็ปไฝ ๆๆๅจ็็ญๆก\n1000 Processed\n classify content\n19500 0 ไธไธชๅ
ฌๅนณๅ
ฌๆญฃ็ๆฟๅฐไบง็จๆฏ่ฐ่ๅฑ
ๆฐๆถๅ
ฅๅ้
ไธๅ
ฌๆไธบ้่ฆ็ๅทฅๅ
ท\n19501 0 ไปๅคๅฐไปๅคๆก็ๅคๆๅ ไธช่ฝ็ฟป่ฟๆฅ็\n19502 0 ๆฐ้็ๅคงๅฎถๅฏน่ฃ
ไฟฎ่ฎพ่ฎก้ฃๆ ผ็้ๆธ็ปๅ\n19503 0 ๅๅ ๅฅฝๅฃฐ้ณdeๆฐธ่ฟ้ฝๆฏ่ฟๆ ท่ฏด\n19504 0 ๆฌๅบ้ๅฎ็ไบงๅๅไธบโๅฝ็ไธ่ฅโๅฎไฝๅบๆญฃๅ\n1000 Processed\n classify content\n20000 0 ๆ่ต่ชๅทฑๆๆฏๆๅฅฝ็ๆ่ต\n20001 0 infiniteBAD้ข็ด็\n20002 0 ็จๅๅฝข่ฎพ่ฎกๅฐๅงๆฆปไฝไบ่ฝฌ่งๅปถไผธ\n20003 0 ๅฅฝๅคๆณๅพๆๆ่งๅฎ็ๅณๅจไฟ้้ฝไฟ้ไธไบ\n20004 1 ็ถไธบๆจๆไพ่พ้ซๆถ็็็่ดข๏ผๆฏๆจ่ต้ไฟๅผ็ๆๅฅฝ็ฎกๅฎถใๅฆๆจๆ้่ฆ๏ผ่ฏท็ต่ฏๅจ่ฏขๆๅฐๆ้ขๅจ่ฏขใๆ่กๆฏ...\n1000 Processed\n classify content\n20500 0 QQๆญๆพๅจๅฏไปฅๅฌไฝๆฏ้ผ ๆ ็ขฐไธ็ๆญๆพๅจ\n20501 0 ๅคๅฐๅ
้กตไฟฉไบบๅๅผ็่ฎพ่ฎกๆฃๆฃ็\n20502 0 ๆฌไบบๆฑๅ็งๆฑๅคๆกฅไธๅคฉไธไธๅกๅฏบ้่ฟ็็ๆฟๅญ\n20503 0 ็ญ็็ฅ่ดบๆๆ กๅฅณ่ถณ่ทๅพxxxxๅนด่ๅทๅธโๅฏๅฃๅฏไนๅง่ๆๆฅโๆฏ้ๅฐๅนดๆๆ่ถณ็่ต้ซไธญๅฅณๅญ็ป็ฌฌไธๅ\n20504 0 ๆฟ้ๆๆๆๆ่ตๆ ็ๆ่ต้จๆง้่ณ1000ๅ
\n1000 Processed\n classify content\n21000 0 ไธๆฏ้้็พๅบฆๅฐฑๅฏไปฅๅฝ่ชๅชไฝ็\n21001 0 ้่บซๅจ็ฐ้ดๅฐ่ทฏไธญ้ ็ๅซๆๅฏผ่ชๅฏปๆพ\n21002 0 ๅคงไผไนๆฟๆ็ไธๅบๅคงๆ/ไฝๆๅไผ้ฆๆฌกๅฐฑโ้ๆฐธไฟก่ขซไธพๆฅโ่กจๆ\n21003 0 ่ฆๆดๆฐ็
ๆฏๅบๆๆฏโฆโฆไบๆฏไปๆจๆๅผ็ต่ๅ่ฟๆฒกๆ่ฟ่กไปป\n21004 0 ็ฐๅจ็่
พ่ฎฏๅฎๅ
จ็ณปๆฐ็็ๆฏๅคชไฝไบ\n1000 Processed\n classify content\n21500 0 ็ญไฝ ๅๅคฉไธ้ฃๆบๆๅโๅผๅผโไฝ ๅๆ็ๅฎ่ด็็\n21501 0 ๅฎ็พ้พๅๆฌขๆณๅฝๆขงๆก้ฃๅนด่ไป็ณๅจๆดไธชๅไบฌๅ็งๆปกไบๆขงๆกไฝ ่ฅๆฏๅๆฌขๅๅฑๆๆฟๆฏๅคฉ้ฝไธบไฝ ๆ\n21502 0 labolaboๅ้ๅป็ๆฏๅญๆถๆๆฐด\n21503 0 ไฝ่ฐ้ฝ็ฅ้่ฟๅชๆฏxๅนด็ญๆ็็ผๅ\n21504 0 ๆไนๅๅ
็กฎ่ฎค่ชๅทฑ็ipๅฐๅๆขไบๆฒก\n1000 Processed\n classify content\n22000 0 xxๅนดxๆๅๆ่ฎฒ่นๅ็ปไปไบบๅๆฝ้\n22001 0 ๅฐฑๆฏ่งฃๆพ่ทฏๅฒ่ฟไธๆตท่ฏๅธๅคงๆฅผ\n22002 0 ใ143ไธไบบๅจไผ ็ใๅฎๆๆฏๅค็ชๅ ๅทฅ็นๆญ็งๆณกๆคๅค็ชๆ ๅๆทปๅ ๅ\n22003 0 ๅซ่ฎฉ่ฝฆๆไธบไผคๅฎณไปไบบ็ไฟๆคไผ\n22004 0 ๅธๆฐๆฅ็ต่กจๆฌๅจๅ
ญๅๅบๅคงๅ่ก้็ๆฌๅญๅ
ฌไบค็บฟๆๆๅ
ฌไบคๅธๆบ้ฝไผไธปๅจ้ฟ่ฎฉ่กไบบ\n1000 Processed\n classify content\n22500 0 ่ฟไผๆๅผ็ต่่ฏฅๅพไธ็นไนไธๆ\n22501 1 ๅฐๆฌ็ๅฎขๆทๆจๅฅฝ๏ผx.xxๆฅ๏ผไธญๅxx๏ผxx๏ผๅปบๆๅฎถๅฑ
ๅ็ๅๅฎถ็ด้ๆดปๅจ๏ผไธๆจ็ธ็บฆ่ฟๅทๅฝ้
ๅคง้
ๅบ...\n22502 0 ๆฃ้ซ็ญใ่
นๆณปใ่็ใ่พ็ใ่ๅ็ใ่็ณ็ไนไบบๅฟ้ฃ\n22503 0 ๆ่ฐๅพๅทไบบ่ฟๅๅ้
็ๅธๅญๅๅ่กจไธ้\n22504 0 TPP่ฐๅคๆไฟๆๅคๅฝๆพๅฎฝๅค่ตๆ่ต้ๅถ\n1000 Processed\n classify content\n23000 0 ่ฝๆไพๅค่พพ100ๅ็ๆทฑๅบฆ่ฆ็ๅข็ๅๅๅไปฅไธ็่ๆฅ่ฝๅ\n23001 0 ไธไธๅญๅฐฑๆ่ฃ่6plus็ๅ
่็ปๆฉ็ๆไบ\n23002 0 ๆคๅฃซ็ปๆๆ้็ๆถๅ่ฏด๏ผโไธบไปไนๆฏๆฌก่ฟ็ง่ๆๆฏ็ๆดปๅฐฑ่ฝฎๅฐๆๅข\n23003 0 ่ฎฒ่ฟฐๆฏ้ๆดฒ่ฟ็ญ็ๆ
ไบโฆๅๅๅๅ\n23004 0 ไน้พไปๅฝๅฎพ1ๅทๆ ทๆฟ้ด่ตๆๅพ็\n1000 Processed\n classify content\n23500 0 ๅๆฏๅไบฌๅธไธๆตทๆทฑๅณ่ๅท็ๆฅ็็
\n23501 0 ่ฟๅ
ถไธญๅ
ๆฌไบ43ๅฎถๅๅ่กใ37ๅฎถๅๅ่กๅๅไฟก็คพไปฅๅๅๅ้ถ่กใๅไธ้ถ่กใๆฑไธฐ้ถ่ก็ญ3ๅฎถๅค่ต่ก\n23502 0 ่ฏฅๆฌพๅนณๆฟ็ต่็ๅๅๆฏๅปๅนดCESๅฑไผไธๅ็กๆพๅฑ็คบ่ฟ็ๅนณๆฟ็ต่TransformerAio\n23503 0 ไธญๅฝๅๅๆๅญฆIPๅจ็ป่ฟๅๅ ๅนด็็งฏ็ดฏๅๅจไปๅนดๅ
จ้ข็ๅ\n23504 0 ๅฅน่ฏดๆ็ธ็ธๆ้ฃๆบ้ฝๅผ่ตฐไบๆๆไนๅๅปบๅฎๅฆ\n1000 Processed\n classify content\n24000 0 ๅฅณๅฃซ็ฑไบๅญๅฎซๅๅคง่ ไธๅฑ่ ่็ธ้\n24001 0 ๅ
ณไบ้ญ็ฝๅจ่ฏข้ฎ้ขๆฑๆป้ญ็ฝๆฏ็บฏๅคฉ็ถๆๅ\n24002 0 ๅไธบmeta7ไธขๆฐด้ๆฒก้ฎ้ข\n24003 0 ๅ
ฌๅธ\"ๅฝ็\"ๅ็่ขซ่ฎค่ฏไธบๆ้ซ็ญ็บงโไบๆๅ็\n24004 0 ๅฟ็ๅจ่ฏขๅธโฆๆฒ็ๅฎๆๅผๅงๅฝโฆๆณๅญฆไน \n1000 Processed\n classify content\n24500 0 ๆๅๅฐ็จไบ่ฟไนไน
็ต่ๆป่ฏฅๅญฆ็น็ปดไฟฎๅธธ่ฏไบ\n24501 0 ๆพ็ปไฝ่ฟ็ๅฐๆน็็ซๅชไปไปฌ้ฝ่ฟๅจ\n24502 0 ไธญๅฝๅฅฝๅฃฐ้ณ็ซ็ถๆไบบๅฑๅๅฑฑๅ\n24503 1 ไบฒไปฌ.็พ่ตๅทจๅฅถ็ฒ่ดญไธๅฌ็ซๅxๅ
'๏ผไธๅฌ็ซๅxxๅ
.ๅ
ญๅฌ็ซๅxxๅ
.๏ผๅ
จๅบๆปกxxx้่ดญ้xxๅ
...\n24504 0 Vx็ฆๅทๆฟไบงๅๅคๆๅฅ16\n1000 Processed\n classify content\n25000 0 ้ซ็่่็ถๅๅฎถๆฑ่ๅพๅทๅฎๅ็ป็็งๆๆ้ๅ
ฌๅธ่่็ถๅๅฎถ\n25001 0 ็ต่ไธๆญ10ๅ้ๅฐฑ่ตฐ็ฅๅปๅนฒๅซ็ไบ\n25002 0 ๆฅผไธๅฎๆฅผๅฐๅง่ทณ่ๅขๅฑ
็ถ้ฝๆฏไธไธชๅจไฝ\n25003 0 ๅฅน่ฏดๅปๅป้ข้ฎ้ฎๅป็ๅฐๅบไปไน็
\n25004 0 ไนฐๆฉ้ฅญ็ไธๅๅคงๅฆ่ฎฉNไบบๆ้้ผๆๅ้ฃ\n1000 Processed\n classify content\n25500 0 ๆจไธพไธไฝๅฅ็พๅ
็็ถๅไธๆชๅปๆฏ\n25501 1 xxxxxxๅ
(RMB)ๅ่นๆ็ฌ่ฎฐๆฌ็ต่ไธๅฐ!่ฏทๅๆถ็ป้ๆดปๅจ็ซ:gszwa.cc ๆฅ็้ขๅ...\n25502 0 ๅฝๆฐ้ฉๅฝๅ74ๅไธๆฅๆฌ11ๅๆฟๆๆฅๅๅฑฑ\n25503 0 ๆ ้กๆฐธไธญ็งๆๆ้ๅ
ฌๅธๅจๅไบฌไธๅฝๅฎถ็ป่ฎกๅฑ็ญพ็ฝฒไฟกๆฏๅๅไฝๆกๆถๅ่ฎฎ\n25504 0 ็ๅบ่ฏฅไปๆณฐๅทๅๆฅ็ๆไธๅฐฑๅบๆฅ็\n1000 Processed\n classify content\n26000 0 ๆฌๅบๅ้ไปทๅผxxๅ
็ๆฒๆตด้ฒไธ็ถ\n26001 0 ็ธๆฏไนไธๅไธบๅๅฐ็ฑณๅทฒ็ปๅพๆฅ่ฟ\n26002 0 ๅ ๆญคๅพๅฎนๆๅผ่ตทๅผๅธ็ณป็ป็พ็
\n26003 0 ๆนๅคง้ๅข็ญพ็บฆๅๅปบ16ไบฟๅ
ๅ
ไผๅ็ต้กน็ฎ\n26004 0 ่พพๅฎๆบ่ฝไธไธๅฐๅธไบบๆฐๆฟๅบใๆทฑๅณๅธไผ่งๆๆฏๆ้ๅ
ฌๅธๅฐฑไธๅฐๅธๆบๆ
งๅๅธPPPๆ่ต้กน็ฎ่พพๆไบๅไฝๆๅ\n1000 Processed\n classify content\n26500 0 ่ๅท็ฑ่ๅๆณๅญฆๅฎถๅชๅพๆฅๅฅฅๅๅ ๅฏนไบๆๆฅๆฌๆ็ฏๅฎกๅค็ไบฒ่บซ็ปๅ\n26501 0 ๅไบฌ้ๆฑๅไผไผไธๅฎถๅไฝ้ขๅฏผ่
ไธดๆๅฏผ\n26502 0 ๅ็ฐ่ฟๆณๆฅ้ๅญฆ็็่ฝฆ่พๅๆถๅๅ
ฌๅฎไบค็ฎก้จ้จไธพๆฅ\n26503 0 ๅธฆๆฅไบ10ๆฌพ้ฆๆฌก็ปๅบ็ไฟๆถๆท่ฝฆๅๅ500็น้ขๅคๆๅฐฑๅ\n26504 0 2ใๆฝๅทฅ่ฎฒๆญฅ้ชคๅคง็็ณ่ๆฏๅขๆฝๅทฅๆนๆณไป็ป\n1000 Processed\n classify content\n27000 0 ไธไธ็ญๅไบฌ็้ฃๆบๅไปๅฆ็ไปๅ
ญ็นๅปถ่ฟๅฐๅ
ซ็นๆด็ๆๅๆจๅฐๅฎถๅคงๅทดไธ่ฟ็ผ้ฎ้ฎ็็้ฑไธขไบ\n27001 0 ่ฏๅฎๆๅฎ็้็~่ดญไนฐ่ฏฆๆ
ๅๆป็ฅ\n27002 0 ๆไปฌไธ่ตทๆคๅไบฌๅฐ้ไธๅท็บฟ็ๆฅๅญ\n27003 1 ๅจไธๅฅฝ๏ฝ xxxxๅนดๆณฐๅบท่ดขๅฏ้้ๆจๅบ ็จณๅฅxๅทๅ ๅ่ถxๅทxๆ ไธคๆฌพไผ่ดจไบงๅ๏ผ็จณๅฅxๅท่ตท็นx...\n27004 0 ๆ้ซไบบๆฐๆณ้ขๅฐๅ้็ฅ่ฆๆฑ่ฎค็่ดฏๅฝปๆฐ\n1000 Processed\n classify content\n27500 0 ๆฒฃไธๆฐๅๆ่ฒๅ
ๅงๅฌๅผ2015ๅนดๅ
้ฃๅปๆฟๅปบ่ฎพๆจๆฒป็ๆ่ฒไนฑๆถ่ดนๅทฅไฝไผ่ฎฎ\n27501 0 ๅๆ้ฃไฟฎ็ๅๅจ่ณ่ไธๅไบไธคไธ\n27502 0 ไบ้ฉฌ้ๆถ่ดญGoodreads\n27503 0 ๆฐๆณ่ง็ๅฎๆฝๅฏนๆ่พๅพ้ฉๅฝไบงๅ็ไผไธๅฝฑๅ้ๅคง\n27504 0 ๅฝไธๅ้ฝไบๆไบไธๆฟ่ฟๆ่ฆ่ฎฐๅพ้ฃๆบไนๆฏ้้ฃ่่ตท้ฃ\n1000 Processed\n classify content\n28000 0 ๆ็ฅ้ๆๅพๅคไบบๅจ่ดจ็ๆไปฌ็ไบงๅ\n28001 0 ๆฎๆ้ซๆฃๅฏ้ข็ฝ็ซ7ๆ30ๆฅๆถๆฏ๏ผๆฅๅ\n28002 0 PE็ณปๅธๅ็ฉๆณ๏ผ็ชๅบๆ่กๆๅผ่ฅไธ้จ\n28003 0 ๆๅ่ชไปฅๅๅไนไธ่ฆๆบๅญฃ็ๆถๅๅปๆ
ๆธธไบไฝ ้ ๅฆ้จ้ผๆตชๅฑฟๆๅคๅฐไบบๅๅ\n28004 0 ไป่ฏด60ๅจๅฒๅ้ขๅฐ็้ไผ้่ฟๆฏๅซไปไน็\n1000 Processed\n classify content\n28500 0 ๅ
ฌไบคๅพๅๅผไบxๅ
ฌ้็ป็ซไบคๆกฅๆดๅ่ฝฌๅผฏ\n28501 0 ๅจ้ฃๆบไธๅฏ่ฝไนไผ่ขซ่ฆๆฑๅ
ณ้ญ\n28502 0 ๅๅคดๆด็ไธๆๅ้ฃๆบ็ๆถๅ้่บซๆบๅธฆ็ไธ่ฅฟ\n28503 0 ่ฟๆณไบๆ ่ฎบไปไน็็ฑ้ฝ่ฆๆฅๅๅถ่ฃ\n28504 0 ๅฝๆไปฌ็ๅฐ้ฃไบ่ขซไบบ่ดจ็็ๆขฆๆณไธๅคฉๅคฉๅๆ็ฐๅฎ็ๆถๅ\n1000 Processed\n classify content\n29000 1 ใxxxxๆไธบ็ฆ็ใ็พ้
็พๅฎนไผๆ็ฎไธๆ่ๆฏ๏ผไธไธชๆๅไธๆฌก๏ผไบๅ้ๆไฝๅฏ่พพๅฐๅไธชๆ็น็ฉด็ๆๆ๏ผ...\n29001 0 ไธๅๅๆฐใไธๅ็ฏ่ง่ฃๅคๆๅฟๆฏไปไนโฆโฆๆฐๅฅๆ่ถฃ็่ฏพ่ฎฉๅญฉๅญไปฌๅๆๆฐ้ฒ\n29002 0 ๅฐๆๅๆๅไบ24ๆไฝ ๅไธๆฟๆๅๅฆไบ\n29003 0 ๅพ้ฆ~ๅฐๅ๏ผๅฐๅบๅบๅฏๅบ่ฅฟ่ทฏ็ๅบไบ่ดญ็ฉไธญๅฟ่ฅฟไพง้ณๅ
ๅฐๅธฆๅบๅ\n29004 1 ไธๆๅฅณ็ๆ๏ผ่ด่ฑๆ ทๅฅณไบบ๏ผๆๅฅฝ็็คผ็ฉ็ปไบฒ็ฑ็่ชๅทฑ๏ผๆๅคงๅๅบฆๆดปๅจๆฅ่ขญ๏ผx.x~x.x่ดญ็พฝ่ฅฟไบงๅๆปก...\n1000 Processed\n classify content\n29500 0 ๅฎณ็ๆๆฟ็ๆๆบไธ็ด็ญไธ็ด็ญ\n29501 0 ่ญฆๅฏ่้ปๆ็คบๆจ๏ผ่ช่งๅๅฐโๅ้
ไธๅผ่ฝฆ\n29502 0 ่็พๅไน80็ไบบ็ไบๅฅๅบท้ฝๆฏๅ ไธบๅไบ็ป็ปไธ็
่ๅฏผ่ด็\n29503 0 ๅฏๆ นๆฎๆฏไฝ้กพๅฎข็่ฆๆฑ่ฎพ่ฎกไธๅฑๅฎๅถๆฌพ็พ็ฒ\n29504 0 ๆดๆ็ถ่ฎพ่ฎกๅธAlexanderPurcellๅ้ดๆไบไธญไฝฟ็จ็ๆฐด้ท\n1000 Processed\n classify content\n30000 0 ๅนฟๅทๅฐ้้ขๆตๆ้ดๆ้ซๆฅๅฎขๆต้ๅฐ่ถ
600ไธไบบๆฌก\n30001 0 ็ถ่ฟ็ปญไธคๅคฉ้ชๅฆๅฆๅปๅป้ข็็
\n30002 0 ้ๅบๅธ็ฌฌไบไธญ็บงไบบๆฐๆณ้ขๅฏน่ฏฅ็บ ็บทไฝๅบ็ปดๆๅๅค็ไบๅฎกๅคๅณ\n30003 0 ๅฐๅทโๅฟตๆงโไธไธชๅๆๅฏนไธๅฎถๅบโไธๆโ3ๆฌก\n30004 1 ๅฐไฝณๅ็ๅฅณ่ฃ
ๆๆฃ็ฅๆจๅ
ๅฎต่ๅฟซไน๏ผไธๅ
ซๅฅณ็ฅ่ๆฏๆไปฌ่ชๅทฑ็่ๆฅ๏ผ็ฑ่ชๅทฑๅฐฑไธบ่ชๅทฑไนฐไปถๆฐ่กฃๅง๏ผๅ
ญใไธ...\n1000 Processed\n classify content\n30500 0 ๆฒกๆๅๆถ่ช็ญๆฒกๆๅปถ่ฏฏๅทฒ็ปๅไธ้ฃๆบๅฎๅฟ็ญๅพ
ๅๅฎถไฝ ไปฌ่ฆๆณๆๆไนไผๆณไฝ \n30501 0 ๅบๅฏน้่โโ่กๅธๆบไผๅฐ\n30502 0 ๅไบฌ315ๆถ่ดน่
ๆ่ฏ็ฝไธๅ็งๆ่ฏๅฐ็ฑณๅ
ฌๅธ็\n30503 0 MT6735็ญๅนณๅฐ็Layoutๅทฅไฝ\n30504 0 ็ๅฅถ็็ฅๅฅๅๆ๏ผ1็ๅฅถ+้ข็ฒ=ไผ่ดจ้ข่\n1000 Processed\n classify content\n31000 1 ๅฎๅฎพๆๆๅ
ฌๅธ่ๅฐ็ฒๆๅ้ข็ฅๆจ๏ผๅ
ๅฎต่ๅฟซไน๏ผๆ้่ฆ่ต้็ๆๅๅฏไปฅ็ต่ฏ่็ณปๆๅๅค่ตๆไบ๏ผๆๅฟซxๅคฉ...\n31001 1 ^้ฟๆ่ฏไฟกๅจๆฌๅธไฝๅ็ฑป่ตๆ ผ่็งฐ๏ผไปฅๅๅฐ /็ซ ใ็ใ โฆโฆ็ญใ็ฅฅ๏ผx x x x x x x ...\n31002 0 PS๏ผ่ๆๆ็คบ๏ผๅ่ฉ่่ๅจ่บซๅ\n31003 0 ๆฅ่ชไธๆตทใๅนฟไธ็ญๅฐ็8ๅฎถๆบๅจไบบ้็น้กน็ฎไธ้ฟๆฒ้จ่ฑ็ปๅผๅบ็ญพ่ฎขๆ่ตๅ่ฎฎ\n31004 0 ่ฎพ่ฎกไบ่ฟๆฌพไผธ็ผฉๅผ็ตๆบๆฅ็บฟๅจ\n1000 Processed\n classify content\n31500 0 ่ฟๆฏๅจๅไบบ่ดฉๅญไธ็จๅคๆญปๅๆๆๅ\n31501 0 ๅจๅฎณๆ็ญ็่ฃๅค็ๅคๅณๅง\n31502 0 ๆคๅฃซ้ฟไผฐ่ฎกๅไปๅฐๆนพๅ่งๅๆฅ\n31503 1 ๅฎๆฏ็ฐๆคๅช่ฆxไธ๏ผใๆฅ ๆฑ ๅใ่ถ
ไฝๆธไฟฏไป
้xไธ่ตท๏ผไนฐๆคๆดๅฏๅๅ xๅนดๅ
็่ถ
ๅผ็นๆต๏ผๆคๆบๆ ้...\n31504 0 ไฝ ไธๆ
ๆฟ่ฑ20ๅ
ๅผๅผ ๅญๆฌพ่ฏๆๆฒกไบบ้ผไฝ \n1000 Processed\n classify content\n32000 0 92ๅๅๅ็งฐ๏ผAmazonๅๅๅ็งฐ๏ผNewChapterZyflamendNighttime\n32001 0 ๆๅๅฐๅคงไปvcr้่ทณๅบๆฅ็ๆถๅๅฟไธไฝๆณช็ฎ\n32002 0 xxx่ฎคไธบ็พๅบฆๆปฅ็จRobotsๅ่ฎฎ\n32003 0 ๆ่ฑxxxๅ
้ฎ่ดญไบไธๅฐๆธ
ๅๅๆนๅนณๆฟ็ต่\n32004 0 googใไธญๅฝ็ๅญฆ็ๅ ๅ่ๆก่ขซๅคๆ ๆ\n1000 Processed\n classify content\n32500 0 ๆฌๅฝๆ่ต่
ๅจ่ดญไนฐ่ก็ฅจใๅบๅธใ้่่ก็ๅๅ่ฎค่ดญ่กๆๆน้ขๅๅฐไธๅฎ้ๅถ\n32501 0 ่ไปๅนด4ๆๆไธๅธ็AppleWatchๆๅๆซๅฐพ\n32502 0 ๅบ่ชDonaireArquitectos\n32503 0 ๅๅณๆ่
่ดฅๅๅญๆธ
้คๅบๅ
ใๅ้ใๅนฒ้จ้ไผ\n32504 1 ๆญฆ่ฟ็บขๆ็พๅฏ้พ\"็ฎญ็ๅซๆตด\"ๆฐๅนด้ๅฅฝ็คผ๏ผๅๅ ๆดปๅจ่
๏ผๅ
่ดน้่ง้๏ผๅปถ้ฟ่ดจไฟไธๅนด๏ผๆดๆxxxๅ
้ฉฌๆกถ...\n1000 Processed\n classify content\n33000 0 ๆๅฅฝ่ฑ1ๅ้ๆถ้ดๆฃๆฅไธไธ่ฝฆๅบไธใ่ฝฆ่ฝฎไธๆๅๅจๆบไธๆๆฒกๆๅฐๅจ็ฉ\n33001 0 ้บฆ่ฟชNBA็ๆถฏๅฏ็พ้ๆผMV\n33002 0 127ๅ้ขๅพ้ๅนดไบ็ถๆๅบๅฐ่ฟๅ
ฅๅไฝๆฃ็งๅฎคๆฅๅไฝๆฃ\n33003 0 ๆฒณๅ|่ฎธๆๅธไธญ็บงไบบๆฐๆณ้ขๅฎกๅคไธๅกๅบญๅฎคไบบๅๅๅใ่็ณปๆนๅผ\n33004 0 ๅช่ฆไฝ ็กฎๅฎๆถ่ดง่ฟๅฉ็็ฐ้ๅฐฑๆๅฐไฝ ๆฏไปๅฎ\n1000 Processed\n classify content\n33500 0 ๅฑ
็ถๆฏ99ๅนด8ๆ2ๆฅ็็ๆฅโฆโฆ\n33501 0 ๅฎ่ขซๅ็ฅๆฎๅฟ็็็ธไนไธๆฟๅฌๆธฉๆ็ๆฌบ็\n33502 0 ็ฉฟ็cosๆ็ไบๆฌกๅ
ๅฐไผไผด่ขซ่ฏดไฝ็ฉฟ็ๆด้ฒ\n33503 0 ่ฏทๅ
ถไปๅไฝ่็ณปๅๅ 18ๆฅๆ่ไผ\n33504 0 ๅธธๅทๅธๆ ๆฐๅทฅ็จไนไธ็ๅฐๅนดๅฎซไธๆๅทฅ็จๆญฃๅจ็ดงๅผ ๅปบ่ฎพไธญ\n1000 Processed\n classify content\n34000 0 ๆ่งๅพๆไบไบบ็็ๆฏๅคงๆฆๅชๆไบฒๅฆ็็ธๆๆฏ็็ๅ
ถไปๅช่ฆไธๆฏๅฅฝไบๅฝไบไบบๅๅบๅฐฑๅ
จๆฏๅ็\n34001 0 ่ๆ้ฒ็ๆ ธๆญฆๅจๆฏๅจ่พนๆๆๅฝๅฎถ้ฝไธ่ฝๆฅๅ็\n34002 0 ๅๆไธช้ผๆ ทๅจ่ๅคฉๆ่ฏดๆๅจ่ตค่ๆท่ฏๆๅพๆ็็ธ\n34003 0 ไธๆ ไฝๅฎถไธๅ
ฌๅญ้ๆ่ฎฐๅบไบบ็็ฌฌไธๆกถ้\n34004 0 ๆธ
ๆฟๅบๅๅฐๆไน้ด็ๅ
ณ็ณปไนๅจๅ็ๅพฎๅฆ็ๅๅ\n1000 Processed\n classify content\n34500 0 ๅไธบๆถ่ดน่
ไธๅกไธๅๅนดๆถๅ
ฅ90\n34501 0 ๆถๆ่ฑๅ้ชจๅฏปๆพxxxxChinaJoyๆ็พshowgirl\n34502 0 ๅไธบๆๅๅฎณ/ๅไธบ่ฃ่7ๅ
จ็ฝ้็ๅผ็ฎฑ๏ผๅ
่ฃ
็็็นๅ่ถณ\n34503 0 0ๅ
ฌๅฏๅฏนๅ
จ้จๆบ่ฝๅฎถๅฑ
ๅฎ็ฐโไธ้ฎๅโๆไฝ\n34504 1 ๆจๅฅฝ๏ผๆๆฏๅคฉๅฎๅฉๅบ้จ๏ผๆ่ฐขๆจๅฐไบๆไปฝ็ๅๅฎด้ๆฉไบๅคฉๅฎ้
ๅบ๏ผๅฆๆๅฉๅบๅ
ฌๅธๆจ่ฟๆฒกๆ้ๅฎ๏ผ่ฏทไธๆไปฌ...\n1000 Processed\n classify content\n35000 1 ๅปบ่ฎพ้ถ่กxxxx xxxx xxxx xxxxๆทๅ:้ไบฎ\n35001 1 ๆจๅฅฝ๏ผๆๆฏไธๆตทไผ็ๆ่ต็ฎก็ๆ้ๅ
ฌๅธ็ไธๅกๅๅฐ็๏ผๆๅ
ฌๅธไธไธไปฃๅๅฐ้ข่ดทๆฌพ๏ผไฟก็จๅก๏ผๅชๅญไธชไบบ่บซไปฝ...\n35002 1 ๆญๅๅ่ดข๏ผ็ฅๆจ่บซไฝๅฅๅบท๏ผไธไบๅฆๆใ ๆไปฌๆฏๅๆดไฝๆฉฑๆ๏ผ่กฃๆ็ๅๅฎถใๅนดๅ็นไปท่กฃๆxxxๅ
/ๆน๏ผ...\n35003 0 ่ฅฟ่ตต่กใไธญๅคง่กๅๆฎต9ๆไปฝๅณๅฐๅ
จ็บฟๅผ้่ฟ่ก\n35004 0 ๆๅฒ็ฏ็็ฝโๆธฏๅชไธญๅคฎไธฅๆฅๅฝไผๅๅฎถไผ้ๅ ๅคงๆดๆฒปไฟกๅทโ\n1000 Processed\n classify content\n35500 0 ไธ่ง้ๆฅๅ็ฐๆๆบ้ๅคไบไธๅผ ็
ง็\n35501 0 ๅป็ๅไป่ฎฒ่งฃๆ็็
ๆ
ๅๆๆ็็ถๅตไปๅจๆ่พนๆฟไธชๆฌๅผๅงๅผๅง็ๅ\n35502 0 ้่ฟCarrobot่ฝฆ่ๅ่ฝฆ่ฝฝๆบ่ฝ็ณป็ป\n35503 1 ๆจๅฅฝ๏ผ่ดขๅ่ดขๅฏๅจxๆxๆฅ๏ฝxๆxxๆฅๆจๅบโๅฅณไบบ่้ฆจ้ฆๅผๅ็คผโๅจๆดปๅจๆ้ดๅกๆฏ่ดญไนฐxไธๅไปฅไธ็่ดข...\n35504 0 ๆตๆฑๅฎๆณขๅคฉ็ซฅ็ฆ
ๅฏบๅฐไธพๅโๅคฉ็ซฅๅฑฑไผ ็ป็ฆ
ไธโๆณไผ\n1000 Processed\n classify content\n36000 0 ๅ
ฌๅธ่ฆ่ฝๅๅพฎ่ฝฏไธๆ ทๆๆดๆพก็ๅฐๆนๅฐฑๅฅฝไบๆ่ฏๅฎ่ฟๅคๅจ้้ข็ปๅพๅๅๅๅๅๅ\n36001 0 ๅกๆฏไธๆฟๅบๅฉ็ไบง็ๅฒ็ช็้ฝ่ฆๆๅป\n36002 0 ๅๆฅๆไธๅๆฌข่ๅญ็็็ธๅฐฑๆฏๆๆฏไธไธช่ๅญ\n36003 0 ๆๅผ็ต่่ขซ่นฆๅบๆฅ็ไธ็พๅคๆกๅทฅไฝไฟกๆฏๅๅฐไบโฆโฆ\n36004 0 ๆฑ่ๆตทๅฎๅ
ฌๅฎๅฑไธๅๆญฃๅผๆฐ่ญฆๅซๆ้พ\n1000 Processed\n classify content\n36500 0 ้ฆๆน100่พๆๅ
ฅ่ฟ่ฅ็็บฏ็ตๅจๅบ็ง่ฝฆ\n36501 0 ๅฎฟ่ฟๆณๆดชๅฝๅฐๅๆ็ฉๆต็็
้ๆฏ็ฉๅขฉๅฒๆๅ็ๅๅ็ๅๅ ไนไธ\n36502 0 ็ผๆตทๅธๆ่ฒๅฑๅจๅธๆฟๅบ้จๆท็ฝ็ซๅๅธไบๅ
ฌ็คบ\n36503 1 ๅฐๆฌ็ๆฐ่้กพๅฎขๆจไปฌๅฅฝ๏ผๅๆฑไธ่ฝฌ็ๆฑ้่ถ
ๅธ่ดๅ ็พๅฅถ็ฒๆx.xๆไบ๏ผๆถ้ดๆฏxๆx.x.x.ๅท๏ผ่ฟ...\n36504 0 ๆฒกๆ็ๆฒกๆ้
ๆฒกๆๆๆบ็ๆไธ\n1000 Processed\n classify content\n37000 0 ็ฐๅจ็G2ไบฌๆฒช้ซ้ๅๅ็ๅ ฐๆกฅๆๅกๅบไปฅๅS38ๅธธๅ้ซ้ๅๅ็ๆปๆนๆๅกๅบใ่ทๅถๅฑฑๆๅกๅบ\n37001 0 ๆนไบบ้ข่กNBA็้ไปทๅผๆ่กๆฆ\n37002 0 ๅฅฝๅคๅฐไผๆๅจๆๆบ่ฏๅพโ่ฑๅ
โๆถ\n37003 0 ่ไธๆฏๅฏนไบ่ดขๅฏๅๅบ่บซ็่ๆ\n37004 0 ๅ็ฎก็ง้ฟ็ด่ฟท็ฝๆธธไธๅนด่ฑ1500ไธๆๅธๆฏๅ็งไปๅ
ฅๅ
ๆๅ็็ๅ็\n1000 Processed\n classify content\n37500 0 G25้ฟๆทฑ้ซ้ๅๅๅฎๆทฎๅไบฌๆฎตไป็ซน้่ณ็ซน้ไบ้้้ๅๆถ\n37501 0 ๅทฒ่ขซๆทฑๅไบๅๆๅฐไธ4000็ฑณๆทฑๅค้ฟ่พพ1400ไธๅนด\n37502 1 ไฝ ๅฅฝ๏ผ้ฅฎ้ฉฌ่กๅฅๅผ่ฟไธๅ
ซ่๏ผ็นๅซๆจๅบ่ถ
ๅผ็ๅฝฉ่ฒๆก็บนๆฏ่กซๆๅxxxใ้ฆๅๅฝฉ่ฒๅคๆญๆๅxxxใๆถๅฐ...\n37503 0 ไฟ็ฝๆฏๅฝ้
ๅไบๆฏ่ตไธป่ฃๅคๅพท็ฑณ็น้ยทๆๅฐๅทดๅ็งๅฐๅฐ่กจ็คบ\n37504 0 ็็ๆฏ่ฆ่ขซ่ฑๅ้ชจ็ต่งๅง็ป่ๆญปไบ\n1000 Processed\n classify content\n38000 0 ๆๆ็ปไบโๅผ ไธนๅณฐ้ฅฐไธๆนๅฝงๅฟโ่ฟไธช้้กน\n38001 0 ้ฟๅผ่ฅฟ้จๅคชๅค้กถ็บง1ๅทๆ2ๅทไฝ็ๅฏนๆ\n38002 0 ๆไป2015ๅนดๅผๅงๅฐไธๅจ้ผป็ไธ็ดๆฒก็ฏ\n38003 0 ๅจๅไบฌ๏ผๅไพๅฅณๆๅงๅจไธ่ตท\n38004 1 ไฝ ๅฅฝ๏ผ็พๅฅณใ้ฉฌไธx.xๅฟซๅฐไบ๏ผๆไปฌๅฎถๆฅๆฌพๅ
จ้ขไธๅธ๏ผๆๅพๅคๆฌพ๏ผ้ๅไฝ ใไฝ ๆฅ็็ใ่ไธ ็ฐๅจ๏ผๅ
ซ...\n1000 Processed\n classify content\n38500 0 ้ฟๆ็ต่ๅ็ฉบ่ฐๅทฅไฝ็็ฎ่ค็ผบๆฐด\n38501 0 ANNAๆฐไธไปฃDD้่ขซ็งฐไธบโๅจๆๅ
จๆไบงๅโ\n38502 0 โโๆฒชๆ่ท1%็ไธญๅคฑ3900ๅทจ้็็ธ\n38503 0 50ๅฒๅ้ไบใ46ๅฒ้ญ่ผๆไธๆณ็\n38504 0 ๆๅก่ๅดโโ่ฝฆ้ฉ๏ผไบคๅผบ้ฉใๅไธ้ฉ\n1000 Processed\n classify content\n39000 0 xยทxx่ๅท่ชๅจๆถๆขฏไบๆ
ๆถไบไผไธ\n39001 0 ๅฌดๆญๅฑQAQโๆทๆไน็ฅธ็ปตๅปถ่ณไป\n39002 1 ็ฒพ้่ดงๅxๆไผๆ ๏ผๅฟซๅฟซๆฅochirlyๅบ้บ้่ดญๅง๏ผๆๆฏๅฏผ่ดญๅจๅจๆๅพ
ๆจ็ๅ
ไธด๏ผ\n39003 0 ่ฑๅ้ชจๆฏๆๆดป่ฟไนๅคงๅฏไธไธ้จ็จๅฟๅป็ฑ็ไฝๅ\n39004 1 ๅบๅ
ๅฎต้ฟๅไธบไบๅ้ฆๆฐ้กพๅฎข๏ผๅกๅจๅ
ๅฎตๅฝๅคฉไธๅพ่ฃ
ๆฝขๅๆxๆ๏ผไธๅพ่ฝฆๅๅฝๆๆ่ดง๏ผๆดๆๅฅฝ็คผไธๆญ๏ผๆๅ...\n1000 Processed\n classify content\n39500 0 5ๅ่ฒ็ช็ถๅ็ดซๆๅ้ป่ฏดๆๆทค่กๆปๅฟ\n39501 0 ไฝ ่ฏด่
พ่ฎฏไธชๅปๅไธบๅฅ่ฆๆไธช่ฎฟ้ฎ่ฎฐๅฝโฆโฆไฝๆฏ่ฟๆฏๅฌไฝ ็\n39502 0 ไธ่ฟ็็ธๆๆฏ็ๆญฃ็\"ๆๆ
ไบบ็ปๆ็ทๅฑ\"ๅ\n39503 0 2015ๅนด7ๆ18ๆฅCA1541ๅๅฎไธๅ14\n39504 1 ๏ผ่กฃๆๆไฝๅฎๆจๅคๅฑๆฟ้้ๅทฅๅไปทๆข่ดญ๏ผๅไบๅๅ่ฟไบซๆ้ๅกไผๅๆไธๆ็็นๆ ๏ผ้ๅธธ้ๅธธ็ๅฎๆ ๏ผๅฎถ้...\n1000 Processed\n classify content\n40000 0 91โฌไธ้ฃๆบๅ่ฟไธชไปทๆ ผๅชๅค1ไธช้ฟๆฃไธๆๆฒป็ฉไปทไฝไบบไธๅคๅฏๆๆๆฏไธๅฟซไน็ๅนด่ฝปไบบ\n40001 0 xxxxxๅฐฑ่ฟไนๆฌบ้ชๆถ่ดน่
ๅ\n40002 1 ๅ่ฟ x x ๅฆๅฅณ่ ๆฐด่ญ่็พๅฎน็พไฝๅ
ณ็ฑๅฅณๆงๅ่ไปฌ๏ผ็นๆจๅบไธบๆxๅคฉ็ๆดปๅจxใๆๆๆค็้กน็ฎ...\n40003 0 ๅขจ่ฅฟๅฅๅๅจๅ
จๅธ21ไธชๅฐ้็ซๅปบ็ซไบ30ไธชๅฅๅบท็ซ\n40004 0 ๆฏๅคฉๅ4ๆฏ่ถๆฏๅ8ๆฏ็ฝๅผๆฐดๆดๆ็ไบ่บซไฝๅฅๅบท\n1000 Processed\n classify content\n40500 0 ๅ
จๅฝ้ฆๅฎถ็ๆญฃๆไนไธ็360โๅ
จๅผๆพๅผๅจๆฟ่ฟ่ฅๆจกๅผ็ๅกๅฏผ่
\n40501 0 Ialwayshavetๅฐ้ๆธๅ
ณ้ญ่ชๅฎถ็้ณไน\n40502 0 ๅฐๆถไปฃ4๏ผ็ต้ญๅฐฝๅคดโ
โ
โ
่ฟ่ก\n40503 0 ๅไปฃไธญๅคฎๆฟๅบ่ฎพๅๅบญใๅฎ่ฅฟ็ญ้ฝๆคๅบ\n40504 0 com้
่ฏปๅ
จๆ่ฏทๆณๅณ่พน\n1000 Processed\n classify content\n41000 1 ไบฒ็ฑ็ๅงๅฆนไปฌ๏ผxๆไปฝๆฐๅนดๅผ็ซฏ๏ผ็ซ็ณๅฏไบงๅ้ๅคๅค๏ผ่ดญไนฐไบงๅๆฏๆปกxxxxๅ
่ท่ต ไปทๅผxxxๅ
ไบงๅ๏ผ...\n41001 0 comLogo็ฎๅฝๅ
ฌๅธ็ฎไป็ง่ตๆๅก็ง่ตๅ็ฑปๆๅๆกไพๅไฝ่ฏฆๆ
ๅ
ฌๅธ็ฎไป็ง่ตๆๅก็ง่ตๅ็ฑปๆๅๆกไพๅ...\n41002 0 ไธๅนดๅ
็็ค็่ฝๅปๆ90%ไธคๅนดๅ
็็ค็่ฝๆๆๆทกๅ\n41003 0 ไธ่ฎค็ๅธๆณ่ฏ่ฎผ้ดๅฎๅฎข่งๅค็\n41004 0 ไธๅๆฏEMCๅSAP็ๅ
ฑๅๅฎขๆทๅพๆไปฃ่กจๆง\n1000 Processed\n classify content\n41500 0 ไฝๆฏๅทๅฅฝไปฅๅๆๆบๅ
็ฝฎๅญๅจไธ่ฝ่ฏๅซ\n41501 0 ๆฎWindowsInsiders่ด่ดฃไบบGabeAu\n41502 0 ๅ ไธบๅฝๅนดๅปบ็ญ็่ฟ็จไธญๅ็ไบๅพๆ\n41503 1 ไปๅคฉๆจๅบ้้็็่ดขไบงๅ๏ผๅฉ็x.x%๏ผๆ่ตๆxxๅคฉ๏ผx.xx~x.xx๏ผ๏ผๅ
ๅฐๅ
ๅฎใ\n41504 0 ไธๆตท60ๅนดไธ้็ๅฐ้ฃ่ฎฉๆ็ป่ตถไธไบ\n1000 Processed\n classify content\n42000 0 ๆญคๆฌกๆจๅบ็บฟ่ทฏไปทไฝๅๅจ20ๅ
ไปฅๅ
\n42001 0 ไฝๆฏ80๏ผ
ไปฅไธ็็ธไผผๅบฆๆฏ็ตๅฐๆฒ้ฎ้ข็\n42002 0 ๅๅไธๅฏน็งฐ็่ฎพ่ฎกๅไธชๆงๅ่ถณ\n42003 0 0255โฐไธๅพ่ฐๆดไธบๆ็
งๆไบค้้ข0\n42004 0 ๅไธบไฝ้ชไธญๅฟ้ๅ็ง็้ข็่ฏญ่จๆฏ่ฑๆ็\n1000 Processed\n classify content\n42500 0 ใใๅจๆพณๅคงๅฉไบG20ๅณฐไผ็ๆฐ้ปๅๅธไผไธ\n42501 0 ๆถๅฐไธๅฐ็ญๅฟ่ฏป่
ๆถ้ธฆ่ฎพ่ฎก็ๅฐ้ขไฝๅ\n42502 0 ๅ
ถๅฎๆๅฐฑๆฏๆณๅปๅไบฌๅฆๅๅๅๅ\n42503 0 ๆ่ๅญ็ๆ่็ต่ฏๆฆๆชไบๅงๆงฝ\n42504 1 ็บขๅฆๅ่ฅ็ฆ่บซๆญฃๆไบๅๆญฃๅผๅผ้จ๏ผไธบๅ้ฆ้กพๅฎข็นไธพๅๅคงๅไผๆ ๆดปๅจ๏ผ๏ผxxxๅ
xxๆฌก๏ผๅธฆไธๅ้กพๅฎขๅ x...\n1000 Processed\n classify content\n43000 1 ๆจๅฅฝ๏ผๆๆฏๅๅๆ็ต่ฏ็ปไฝ ็ๆท่ถไฟก็จ่ดทๆฌพๅ
ฌๅธ็ๅฎขๆทๅฐ้กพ๏ผๆไปฌๅ
ฌๅธ่ดทๆฌพๆ ๆตๆผ๏ผๆ ๆ
ไฟ๏ผ่ฟๆฏๆ็ๅท...\n43001 0 ไธๆณๅๅญๅธธไผ็็ช็็ก็ๆ
ๅฎข\n43002 0 ๆ็ฐๅจๆฏๅคฉๅฐฑๆฏ็ฉไธช็ต่ๆไธช็ต่ฏ\n43003 0 ไปๆฉ้็ง็ฎๅๆฒกๆ็จไปปไฝ่ฏ็ฉๅญฉๅญ็ฒพ็ฅ็ถๆ่ฏๅฅฝ่ฟ้ฃๆญฃๅธธไปๅคฉๅปๅป้ข้ช่ก\n43004 0 5ไธชไบบๆไธไธช้ฃๆบๆไธๆญป==\n1000 Processed\n classify content\n43500 0 ่ฆๆฑๅๅ
ฌๅธๆ่ช็ญ้็25%่ฟ่ก่ฐๅ\n43501 0 ๆ่ดง่ตๆๅ็ๅๆ๏ผ็ป็ๅ่ฝๅจ850ๅๅคๅๅคๆไฝ\n43502 0 ๆ่ฆ่่่นๆๅไธบ้ญ
ๆๅชๆฏไบ้
ทๆดพไน่งๅคง็ฅoppovivo\n43503 0 ๅๆ็ๅฐๅปบๅถๅบฆ็ฑไบ่
่ดฅๅไธๅพไบบๅฟ่ๅดฉๆบ\n43504 0 ไธ่ฏๅฐๅจxxไธชๆๅ
ๅ่ณxxxx็น\n1000 Processed\n classify content\n44000 0 ้ข่ฎกๆชๆฅxๅฐๆถๅ
ๅธธๅทใ้ๅใๆบง้ณๅคง้จๅๅฐๅบๅฏ่ฝไผๅบ็ฐ้ท็ตใ้ท้จๅคง้ฃใ็ญๆถๅผบ้ๆฐด็ญๅผบๅฏนๆตๅคฉๆฐ\n44001 0 ๅฏ่ฝๅพๅคไบบ้ฝไผๆณๅฐ้่ๅ็ตๅ\n44002 0 ๅพฎ่ฝฏ๏ผ้ซ้ๆๅคช้ณไฝ ๆฏไธๅคงไบบๆดๅคฉๅชๆณๅๆ้ฑ\n44003 0 KTVๆฏไธไธชๅ ไนๆฒกๆๆ ธๅฟ็ซไบๅ็ไบงไธ\n44004 0 BaitollahAbbaspour็ๆฒป็่ดน็จ็ปไบๆ็็ฎไบ\n1000 Processed\n classify content\n44500 0 7ๆ10ๆฅ่ณ15ๆฅ้ฟๆตทๅฟๆ
ๆธธๅฑๅจๅคง่ฟ็ฒพ้ๆ ธๅฟ็คพๅบไธพๅไบๅๅบๅคงๅๆจไปๆดปๅจ\n44501 0 ไธฅๆ ผ็้่้ฃๆงๆ ๅๆง่กๅฐฑOK\n44502 0 ๅๅ 2015ๅนดๆดๆ็ถไธ็็นๆฎๅฅฅๆๅนๅ
่ฟๅจไผ็ๅไปฃ่กจๅข้็ปญๆต่พพๆดๆ็ถ\n44503 0 ๆไธๆพ่ดจ็่ชๅทฑๅ็ๆฏๅฆๆญฃ็กฎ\n44504 0 ๅฆไปIPOๆณจๅๅถ็ๆพๅผ็ปๅๅ่กไธๅธๅธฆๆฅไบๆฐ็ๆบไผ\n1000 Processed\n classify content\n45000 0 ๅฅณไธป็ฌฌไธๆฌก่ขซ้17้ข้้ญ้+่ขซๅบ101ๅ+่ขซ็ปๆ
ๆฑ ๆฐดๆฏๅฎนๆฏๅฃฐ+ๆตๆพ่ฎ่ๆถ\n45001 0 TA่ฏด็่ฟไธค็ฑปไบบ็ฉถ็ซ่ฏฅๅฆไฝๅบๅ\n45002 1 ๅนณๅฎๆ่ดท๏ผ้ๅฏน่ฝฆไธป๏ผๆฟไธป๏ผๆณไบบ๏ผไธ็ญๆ๏ผๅฏฟ้ฉๆไฟไบบๆไพๆ้ซxxไธไฟก็จ่ดทๆฌพ๏ผๆๆฏx.x%๏ผ่ฏฆๆ
...\n45003 0 S96ๅฎฟ่ฟๆฏ็บฟๅๅๅจๅฎฟ่ฟๅค้้ใ้่ฝฆๅๆถ\n45004 0 ่ก้ๅ็ฎก็งใๆงๆณไธญ้็ป็ป5ไบบ\n1000 Processed\n classify content\n45500 0 ็ปผๅๆงๆณๅคง้ใๅ็ฎกใๅทฅๅใ้ฃๅฎๅ็ญๆงๆณไบบๅไปๅญฆ้ข่กไธๅปบ่ฎพ่ทฏไบคๅๅฃๅไธ่ฟ่กๆธ
็\n45501 0 ๅจไธๅฐๆนพๅฐๆถๅซๅไธๆฌบ่ฏๅ
ๅฐ่ๆฟๅถๅฝฆ่ฃไธๅฎถ้ฃ้ๅๅคง้\n45502 0 ไปๅนด10ๅทๅฐ้ฃโ่ฒ่ฑโไบไปๅคฉ12ๆถ15ๅ็ป้ๅนฟไธ้ไธฐ\n45503 0 ๆๆๆญๅณๅฎๆ่ฑๅ้ชจๅๅ้จๅ็ๅฐ่ฏด็ไบ\n45504 0 ่ฟๆฏๅจ็ฝฎไธ่
ๅจ่ดญๆฟ่
ไนๅๅฟ
้กป่ฆไบ่งฃ็ไบๆ
็ไธค้กน้่ฆ่่ๅ ็ด \n1000 Processed\n classify content\n46000 0 ้ญๆฒซ่ฅๅ
็ๆพไธบๆ ้กๅคชๆน้ผๅคดๆธ่ฏด่ฟ่ๅ็ไธๅฅโๅคชๆนไฝณ็ปๅค\n46001 0 ๆ็ฑ็ๅฐฑๆฏๅ่ดธ่กๅๆ
ไบบๆนไบโฆ\n46002 0 ้ฆๆนๅฎๆน่ฎคๅฎ็โๆฑ่่ๅญๅทโไผไธๆปๅ
ฑ่พพๅฐ176ๅฎถ\n46003 0 ไธๆฏๆ่ฏดโฆโฆไธๆฌกๅ็ฎกไบmyไฟฑๅฉ้
ฑ้ฝไธญไผค็ๅไบ็ป่ท้นคๆฏๆธๆธ่กไธ\n46004 0 ๆฒกๆ้ฃ่ฝฆๅ
ๆฒกๆๅฐๅทๆฒกๆๅๅไธ่ทฏๅๅๅด่ถ่ตฐ่ถ้พ็ ด่ฎฐๆงๅฐฑๅทฎๆฒกๆ่ชๅทฑ็ๅคดๅผไธขไบๅฅฝไธๅฎนๆไธไบ่ฝฆๅๅๆผซ...\n1000 Processed\n classify content\n46500 0 ๅพฎ่ฝฏ็OneDriveๆถ้ดๆพ็คบไธๆฏๆฌๅฐๆถ้ด\n46501 0 ๅธธๅทไน้พๅธๆๅฐๅญฆ็ๅง่ณๅไปๅ็ไธฝ\n46502 0 ็ถๅไธ็นๅทฆๅณๅฐ็ๅฎฟ่ฑซๅบๅฐ็จๅฑไธ็่ฝฆ\n46503 1 ๅฎถ้ฟๆจๅฅฝ๏ผๆฐๅนดๅฟซไน๏ผไธญ่ๅณๅฐๅฐๆฅ๏ผๅจๆญค็ดงๅผ ๆถๆ๏ผ่ฟๆฏๆ่ฒ็นๅผ่ฎพๅ
จๆฅๅถ่ฏพ็จ๏ผไธบๆจๅญฉๅญไธญ่ไฟ้ฉพๆค...\n46504 0 ๆไธไธชๅฐๅทๆฏๅคฉ้ฝๆญฃๅคงๅ
ๆ็ๅทๅฌๆไปฌ่ฏด่ฏ\n1000 Processed\n classify content\n47000 0 ๅช่งไปๆไบไธไธช็บธ้ฃๆบ่ช่จ่ช่ฏญ็่ฏด๏ผโๅฐ้ฃๆบๅๅฐฑ็ไฝ ็ๅฆ\n47001 1 ๆฐๅนดๅฅฝ๏ผๆๆฏไธไธๅ็้่กไปฃๆญ็ๆฑค่๏ผๆจไนๅๅ็ไธไบๆ่ขซๆ็ไปฃๆญ้ฝๅฏๅ็๏ผ็ซ็ไฝ่ณxๅ๏ผ้ขๅบฆ้ซ...\n47002 0 ๅฎๆด็๏ผๅฅฝๅฃฐ้ณๅผ ็ฎๅ็งHIGHๆญๅฏไบไปฃ็พไธ่ฑช่ฝฆๆๅฆนไบๅจๆ็ฅจ๏ผไฝ ่ฎคไธบ่ฐๆด้ๅๅ็ทๅ\n47003 0 ไธ็ ไธไผ็็ๅฎไบ่ฑๅ้ชจๅฐ่ฏด\n47004 0 **ๅๅฐ้็ฎก็ไบบๅๅฏนๅ
จ่ซๆฏ็ง็ๅฐ้็บฟๅฑๅผไบไธๅบๅฐๆฏฏๅผ็ๆ็ดข\n1000 Processed\n classify content\n47500 0 7ๆ9ๆฅๆถๆฏๅพฎ่ฝฏๆๆบไธๅกๅคฑ่ดฅๅๅณๅฎ่ฃๅ7800ไบบ\n47501 0 ๅฏนๅค้่ฟฐโไบ่็ฝ+โๅๅไธๆบ้ใ่
พ่ฎฏๆ็ฅ\n47502 0 ??????????????????????\n47503 0 ๅ้ฅญๆถ็็ต่งๅจๆผ่ญฆๅฏ2013\n47504 0 ้ซไฟๆนฟ็ดง่ดๆฏๅญ้้่็ผๆๆ่ฅๅ
ป100%็บฏ็ซน็บค็ปด้ข่ๆ ่ๅค็ๅๅธฆๆธฉๅบฆ่ฎกๆพๅ
ฅๅฐ็ฎฑ10ๅ้ๅทฆๅณๆธฉๅบฆ...\n1000 Processed\n classify content\n48000 0 ๅจๅฝๅฎถ้ฃๅ่ฏๅ็็ฃ็ฎก็ๆปๅฑๅ
จ\n48001 0 ๆจๅคฉๅ ไธชไบบๅจ่ฏขๅ
ณไบ็บค่
ฐๅป้ค่ณ่\n48002 0 ไน่ฎธๅ ไธบๅฝๅ
ๆฒกๆgoogleๆฅไธๅฐไฟกๆฏๆบๅง\n48003 0 2ๅฃฎ้ณไธ่ฝๅฌๅๆงๆฌฒ๏ผๆงๆฌฒ่ท็ฒพ็ฅ็ถๆไธ้ๆฟ็ด ๆฐดๅนณๆๅ
ณ\n48004 0 ๅด็ปไบๅๆๅพไน
ไปฅๅไธ็ต่่ฏพ\n1000 Processed\n classify content\n48500 0 ไฝๅตๆๅๅก็็ญพๅๆฏNๅนดๅ็็ญพๅ\n48501 0 ๅฅฝๅฅๅฟๆดๅผบ็ๆๅๆณ็ฅ้็็ธ\n48502 0 ๆๆๆบ้ๆไปๅคฉๅๆดๆฐ็้ฃ้\n48503 0 ไบบไบบ้็ฅ็ไบ่็ฝ้่ไฟกๆฏๅฎๅ
จๅบ็ก\n48504 0 ๆฒป็็พ็
็ๆๆๆฏ่พ่ฟ
้ๅๆพ่\n1000 Processed\n classify content\n49000 0 ๆไธ้ฃๆบๅไบๅฎถๅฐฑๆถๅฐ็็ๆฅๅ
่ฃน่ฟไบบๆฏ่ฐไนๅคชไปๅฆๆๆไบๅง็ๅฐๆฝฎๆฐด็ฎด่จ็ณปๅๅไพง้ข็Itsdrea...\n49001 0 โโ่ฟๆๅฅฝๅฃฐ้ณ็กฎๅฎๆฏๆๆๆ็ฆปไนกๅซไบๅจๅคๆๆผ็ๆธธๅญ็ๅฟ้ฝ่ไบไธ้\n49002 0 ไธๆตทไปๆตๅป้ข่ก็ฎกๅค็งไธๅฎถๆไฝ ่ฟ็ฆป้่ๆฒๅผ \n49003 0 ไปฅๅๆ้
ๅ้ฉพ้ฉถใ่ถ
ๅ20%ใ่ถ
้50%ใ้ซ้ๅ
ฌ่ทฏ่ถ
้20%ๆ12ไธชๆๅ
ๆไธๆฌกไปฅไธ่ถ
้่ฟๆณ่ฎฐๅฝ...\n49004 0 5ๆฏ็ฑๆฌ็ง็็ปๆ็้ไผๅฐๅ่ตดไธ้ณใๅๅฒๆนใไฝๅงใไธฝๆฐดๅ่กขๅทๆไพๅป็ๆๅก\n1000 Processed\n classify content\n49500 0 ็ๅฐๅ่พ่ญฆ่ฝฆ็ป่ฟๆผๅ้กฟไธญๅ\n49501 0 ๆฏไบฒ่็กฌๅ็ถไบฒๆๅฎๅๅฐๅ\n49502 0 ๅผๅผ็ฐๅจๅๅจๅ้ๅฒ็้ฃๆบไธ\n49503 0 ๆฌ้จไป่ฟไบ็ฎ้ๅคบๅพ็ฌๅญคไนๅๆฎ็ซ ๅ\n49504 0 ่นๆๅIBMไป็ถๆฏ็งๆ็ๆๅฅ็น็็ปๅ\n1000 Processed\n classify content\n50000 0 ๆพๅคงๆ ๆฎ็ญITๅคง้ณ็ๅธฆๅจๆๅบ\n50001 0 ไบค่ญฆๆ้๏ผ้
้ฉพไผๆๆไธ่ฌไบค้ๅคฑไฟก\n50002 0 ไบ6ๆ30ๆฅ20ๆถๅจ่ฅฟๅๅคชๅนณๆดไธ็ๆ\n50003 0 ไฝ้xxๆค่บซ้ซxxcm็ฎๆญฃๅธธไน\n50004 1 ๆฐๅนณๆๅพทไป่ฏๆฟไธๅ
ซๅฆๅฅณ่็บฆๆ ๅฆ๏ผ ๆดปๅจๅฐไบxๆxๆฅๅฐxๆxxๆฅ่ฟ่ก๎ท๏ผๆดปๅจไธ:ไนฐ่ต ๅคงไผ...\n1000 Processed\n classify content\n50500 0 ้คไบๆถๅITๆๆฏ็้ซ้ขไบคๆไปฅๅค\n50501 1 ่ฏๆๅฎขๆทๆฑ่ดญๅๅไธๅฑ
ๅฎค\n50502 0 ๆๆญฃๅจ็15ๅฒๅฅณๅญฉ้ญๆฅๅคๆง่ฝฎ*\n50503 0 ไธไธชๅป้ขไธ็ญ็ๆฃ้ฃไฟฉ่ญ้ฑๅฐฑไธ็ฅๅคฉ้ซๅฐๅไบ\n50504 0 ๅจๆนๅๅซ่งๅๆฑ่ๅซ่งๅทฒ็ป็ๅฐๅฝ\n1000 Processed\n classify content\n51000 0 xx%็พ็ป+xx%็พๆฏ+xx%็ไธ\n51001 0 ๅฆไฝ ไปๅฆ็่ขซๅผบๅฅธไบ่ขซไบบๆๆญปไบไฝ ๆๆดป่ฏฅๅขๅ ไธบไฝ ไธ่พๅญๅฐฑๆฏไธชๅป้ผ\n51002 0 Malloryzๆฏไธชๅๆๅๅนด็ๆฏไธ็\n51003 0 ๅจๆบๅบ็ๅฐๅไธบ็ๅนฟๅๅพ่งๅพ่ฎ็ฌฆๅ่ฟๅฅ่ฏ\n51004 0 ่ขซ้ข้ขๅ็็็ชๆกไปถๆ
็ไธๅพๅฎๅฎ\n1000 Processed\n classify content\n51500 1 ๆฏๅ
ด็พ่ดง็ฅ๏ผ็พไธฝ๏ผๅฅฝ่ฟ๏ผๅฟซไน๏ผๅ่ฏ๏ผ็ญๆ
๏ผๅฅๅบท็ๆจไธๅ
ซๅฆๅฅณ่ๅฟซไน๏ผๆๆ็พ็็ฅ็ฆ้็ปๆจ๏ผxๆx...\n51501 1 ใ่ๅฎขๆท็ฆๅฉๆฅไบใไบฒ๏ผๅ
ๅฎต่ๅฟซไน๏ผใๆทๅฎๅบ๏ผ็ฑ็พๅฎ่ดๅฉด็ซฅ้ฆใ้ๆจxxๅ
ๅ
ๅฎตๅธ๏ผๆฅๆฌพๆ ้จๆงไฝฟ็จ...\n51502 0 ๅไบบ็ธ่ต ็ไธๅชๆฅ่ชๆ ้ก็่ถ
ๅคงๆฐด่ๆก\n51503 0 ๆณๅช็งฐ้ฆๆธฏๅ้บ็ง้ๅคช่ดตLVๅBurberry้ฝๅไธๆถโๅๅบ็ฝ\n51504 0 Bravo~ไธไธๅๅฌๅฎๅฝ้
็งๆณ\n1000 Processed\n classify content\n52000 1 ๏ฝxไธ่ฃ
ไฟฎ่ดน็จxใ็ฐๅบไธคๅๅนณ็ฑณๆๆๅฑๅ
ๅไธ็งไธๅ้ฃๆ ผ็ๅฎๆฏๆ ทๆฟๆฟ็ญๆจๆฅๅ่งxใๆฅ่ฎฟๅณๆ็คผๅไธ...\n52001 0 ๆๅคฉๆไธๅ
ซ็นๅฐไน็นๆข็บขๅ
\n52002 1 ้ซไธฐๅฝ้
ๅฎถๅฑ
๏ผxๆxxๅทxxๅทไธพๅ่ฏไฟกx.xxๆๆฉๅๆฅๆดปๅจ๏ผๅฐๅบๆ็คผๆดปๅจ๏ผๅ
ๅฎน๏ผx:ๆ้ถๆขฏ้...\n52003 0 Saigon็้นๅธๅบๅฐฑๆฏ็ฌฌไธ้ก\n52004 0 ๆฌ่ฏท่ฐ
่งฃ๏ผ่ฝฌๅ
จๆฐBillinghamHadleyDigital่ฑๅฝๅ่ฃ
\n1000 Processed\n classify content\n52500 1 ๆฑ่็ง่๏ผๆฌๅธฎ่ๅๅฎถ่๏ผๅทฅ่ตๅๅๅฐๅๅไบ๏ผๅ ๆ็๏ผไธ่ฏ่๏ผ่ฆๆไผๆฏ๏ผๆ้่ฆ็่ฏท่็ณปๆxxxx...\n52501 0 ่ขซๅ็็ถๆฏๆฅ็พๅ่ดฟ่ตๅๅฎณไบบๅ่ฏไบบ\n52502 0 ๆฏๅฏนไบ้ฉฌ้ไบ่ฎก็ฎๅนณๅฐๆๅ
ด่ถฃ็็ธๅ
ณๅไป่ทๅไบ้ฉฌ้ไบ่ฎก็ฎๅนณๅฐ่ต่ฎฏ็้่ฆๆฅๆบ\n52503 0 ๅฅฝๅฃฐ้ณ่ขซๆฑๅฉ็ๅฅณ็็ฌ่ตทๆฅๆไนๅๆดๅฎนๅคฑ่ดฅไบ\n52504 1 ไธๅฉ้ฉพๆ กxxๅฅณไบบ่ไผๆ ๆดปๅจ่ฟ่กไธญใxๆxๆฅ๏ฝxxๆฅๅฅณๅฃซๆฅๅxxxxๅ
๏ผ็ทๅฃซๆฅๅxxxxๅ
๏ผไธ...\n1000 Processed\n classify content\n53000 0 ็นๅซๅจ่ฟ่กๆฐๆฟ่ฃ
ไฟฎ็ๆถๅๅฐฝ้ไผ้็จ็ฏไฟๆๆ\n53001 0 FIBAๅฎ็ฝไธๆ ไฝๅฎถEnzoFlojo่ฏ่ฎบไบ2015ๅนด้ฟๆฒไบ้ฆ่ตไธๆๆฝๅ็ๅ็ๆฐๆ\n53002 0 ไธไบ้ฉฌ้็พๅฝ็ด้ฎไธญๅฝ็ๅ็ฑปๅฎ็ฐๅฎ็พๅฏนๆฅ\n53003 0 ๅไบฌๆฟๅฐไบงๅธๅบ่กจ็ฐ้ข่ทๅ็ฑปๅๅธไธๆ ้็\n53004 0 sanE็ปๆๆ่งๆ็นๅไธๅๅฆ\n1000 Processed\n classify content\n53500 0 ่ถณไธๅบๆท+Q844930494่่ไธๅๆฌกไธๅฆๅปๅไธๆฌก\n53501 0 ๅๅๅๅๅจไนฐ็ต่็ๅฐๆน็่งๅๅกๅฅถไบๅๅๅๅๅ\n53502 0 โxยทxxๅฅณๅฐธๆกโ็ไธๆก็ป้่ฆๆๅๅฏๅฟๆ\n53503 0 ๅ
จๅฝxๅคฉๅ็x่ตทโ็ตๆขฏๅฌไบบโไบๆ
\n53504 0 ่ไธๅ
ๆฌ้คๅ
่ฃ
ๆฝขใๆๅกใ่ฎพๆฝ็ญ\n1000 Processed\n classify content\n54000 0 ๅ้ๅขๆป่ฃ็ธๅค็ๆฅๅญๆฏๆๆๆธฉๆๆ็\n54001 0 ๆ็ปๆณ้ขๅคๅณๅๅฎถ่ตๅฟ่ฏฏๅทฅ่ดน500ๅ
\n54002 1 ไนไธๅ้ฝ่ฃ
้ฅฐxxx้ถๅฉๆถฆx้ๅคง็คผ๏ผๅฎถๅ
ทใๅฎถ็ต็ญ๏ผ็ฏ็โๆขโๅฎๆ ใๆถ้ดxๆxๆฅ--xๆxxๆฅใ...\n54003 1 ๅบ้บไปปๆไบงๅๆปกxxxๅ
ๅณไบซx.xๆไผๆ ๅฆ๏ผๆฌข่ฟๆจๅฐๅบ้่ดญ^_^๏ผโ็็ธ็ๅฎ่๏ผ่็ณปไบบๅ็ต่ฏ๏ผ...\n54004 1 ๆไพๆ ๆตๆผๆ ๆ
ไฟไฟก็จ่ดท ่งฃๅณ่ต้้ๆฑ ่ฝฆ่ดท ๆฟ่ดท ๆ ๆฟๆ ่ฝฆไฟก็จ่ดท ๆ็ปญ็ฎไพฟๅฟซๆท xๅคฉไธๆฌพ ้...\n1000 Processed\n classify content\n54500 0 ๅปบ่ฎฎๅๆ่กฅๅ
้
ต็ด ๆฅๅธฎๅฉ่บซไฝๅ่กก\n54501 0 ๅ
จ้ขๆจๅจๅๆๅซๆๅฏผ่ชๅบ็จไบงไธๅๅฑ\n54502 0 ไพๅฆ2ไบฟ็พๅ
ๆถ่ดญ็ฝ็ปๅฎๅ
จๅ
ฌๅธAorato็ญ\n54503 0 ๆ็ฝๅๅๅธ่ดจ็โไฝ ่ฏๅฎๆฏไป็ๅไผ\n54504 1 ๆ่ฟ่พนๆฏๅ็ๆ ๆตๆผไฟก็จ่ดทๆฌพ็ๅฐๆจใๅฆๆๆ้่ฆ็่ฏๅฏไปฅ่ทๆ่็ณป\n1000 Processed\n classify content\n55000 1 ๅผๆท่ก๏ผไธญๅฝๅปบ่ฎพ้ถ่ก ๅกๅท:xxxx xxxx xxxx xxxx xxx.ๅผ ็\n55001 0 Twitterๅธๅผ็บฆ็ป่ฏญ็ๅฐ็ซๅช\n55002 0 ๅซ็ปด็็ด Cใ็บค็ปด่ฝไฟ่ฟ่ ่่ ๅจ\n55003 0 ่ฟ็ไธ็งๅไธญ้้ๅขไธคๅช่ก็ฅจ\n55004 0 ๅพๆฉ็กๅฐๅๆจๅฐฑ้ไธๆฅผๅผไธคๅ้ขๅ
ๅ ็น็ผๅฅถๅ็ถ็ๅๅคไธๅบ็ฟปไธ็ฟปๆๆบ็ธ็ๆไธชๅๆฌ \n1000 Processed\n classify content\n55500 0 ๆๆบๆขๅคๅบๅ่ฎพ็ฝฎไบ่ฟๆๅฏ่ฝๆขๅค้้ข็็ญไฟกๅ\n55501 0 ๅฅฝ่ฒ้ณ็่ๅฐๆไพๅพๅคๆทๆฃ้ณๆจๅคขๆณ็ไบบไพๅฐ้่ฃก้ๅ่ๅฐๆไพไปๅๆฉๆๅณไพฟๆฒๆๅฐๅธซไบฎ็ไฝ้ไปฝๆทๆฃๅคข...\n55502 0 ๅจ็ๆธ
็ๆดป็็็ธไนๅ่ฟไพๆง็ญ็ฑ็ๆดป\n55503 0 ไนๆๅธๆณ้ขไธๅฎกๅคๅณ้ฉณๅไบ้ๆ็่ฏ่ฎผ่ฏทๆฑ\n55504 0 โโฌโฌไนๅฑฑ็ตๅๅคๆฆๅคๅๆฑ่่ๅคฉๅทฅๅคง้ซๆฐ*STไปชๅ็ฒพ่พพ่กไปฝ็ฅๅฅๅถ่ฏไธญๅคฎๅๅบ้ฉฌๅบ้พ\n1000 Processed\n classify content\n56000 0 ่ฟไธไธชไบบๆฏ้่ฟไปไนๆนๅผ่ฏๆ่ชๅทฑ\n56001 0 ็ฑๆทฎๅฎๅธ่ช้็ฎก็ๅคๅ
ญ็นไบๅๅๅธ็่น้ธ้่ชๅๅพ
้ธไฟกๆฏ\n56002 0 ๅ่
ๅคงๅฒๅ
ณๆฏๆฑ่ๅซ่ง็ไธๆกฃๆฐดไธๅฒๅ
ณ่็ฎ\n56003 0 ไธๅคง็่ด็ธๅ
ณ็่ฏๅธๅธๅบ้ด้ฟๆๅฝๅบ๏ผๅนณๅผ\n56004 0 ๅๆฟๆฅ็ๆฑ่50%ไปฅไธๅบๅฝ้่ไธๅก้็้พๅคดๅไฝๅฅฝ่ฃๅนธ\n1000 Processed\n classify content\n56500 0 ่ฝ่ฏดๆธ
ๆฅๅๅนดๅจ้ฟ้ๅทดๅทดๅไปไนไบๅ\n56501 0 ไธญๅฝ้ฆๆฌพไธค่ฝฎ็ตๅจๆฑฝ่ฝฆๅฎๆA่ฝฎ1000ไธ็พๅ
่่ต\n56502 0 ๅฑฑ่ฅฟๆณ้ข้ข้ฟ่ฐ็ ็ฃๅฏผๆถ่ฏไฟก่ฎฟไธ้กนๆฒป็ๅทฅไฝ\n56503 0 ๅฌ่ฏดๅญฆๆ กๆ600ๅ
/ๆ/ๅบไฝ็็ฉบ่ฐๆฟ\n56504 1 ่ณไบฒ็ฑ็้กพๅฎข.ๅ
ๅฎตไฝณ่.ไธๅ
ซๅฆๅฅณ่xxๅนด็งๅฌ่ดงๅๆฌๅบๅญฃๆชๅคงๆธ
่ดงๅ
จๅบไธๆๅคงๆข่ดญไธ้่ฟๆบไผ๏ผๅฐๅ...\n1000 Processed\n classify content\n57000 0 ่ฟๅฅ่ฏๆทฑๆทฑ็ๅฐๅจๅไธบไบบ็ๅฟ้\n57001 0 ๆ ๆฎๅฝๅๅทฒไธๅ้ขๆไธไธ่ดขๅนดไผๅฎ็ฐๅข้ฟ\n57002 0 ๅๅฆๆฐด่ฝไธบ่่ค่กฅๅ
่ฟไบ็ฎ่คๆๆ็ไฟฎๅคๆๅ\n57003 0 ๅๅคฉๅ ็ด ่ฝ็ถๅชๅ ๅฐ20%๏ฝ30%\n57004 0 ้ค9ๅฎถๅไธๆฟๅ
ฌๅธ่ทๅพ็คพไฟๅขๆๅค\n1000 Processed\n classify content\n57500 0 ๆฑ่ๅซ่งๅ่
ๅคงๅฒๅ
ณ็ซ็ถๆพไบSUJUDevil\n57501 0 ่ฟไบๆฅๆxxxๅชไธช่ก่ขซๆบๆ่ฏ็บงไธบไนฐๅ
ฅ\n57502 0 ไธบๅฅ่ฏด่ฏฅ็กไบๅขๅ ไธบๆๆบๆฒก็ตๅฆ\n57503 0 ไปๅจ็คพไบคๅชไฝไธๅ่กจไบx็ฏ่ขซ่ฎคไธบๆฏ่ฏๆฏๆณฐๅฝๅฝ็็ๆ็ซ \n57504 0 ็ขงๆฐดไบๅคฉๅฐๅบxxๅทๆฅผไธๅๅ
ไธๆฅผไธๆทๅ็ง็ค็่ฝฆไธๅธธๅนดๆพ็ๆฐ็ฝๅจไธ้ข\n1000 Processed\n classify content\n58000 1 (ๆๅ-ๆ่ฟๅฅฝๅ๏ผๆ ไธ ๅฅฝ ็ ่ถ ๅถ๏ผ่ฟ ๆฏ ๅ ๆฅ ็ ๅณ ้ใไฝ ๆ็ใxxxไธฝxxx...\n58001 1 ๅ้นค็ฑณๅ
ฐ็บณๆถๅฐ้จ๏ผx.xxโไธบๅฅๅบท๏ผๆ ไปปๆงโๆดปๅจ๏ผไบซๅๅฎถ็ฏไฟ่กฅ่ดดxxxๅ
/ๆจ๏ผๅฎ้ๅขๅผไผๆ ๆปก...\n58002 0 ไธ็ท็้ข็่ๅฉๅญฉๅญๅจxๅท็บฟไน่ฎจ\n58003 0 ๅกๅฐ่ตๅฎซๅปบ็ญๅธๅฑไธฅๅฏใๅ่ฐ\n58004 0 ้ไธญๅฏน่พๅบ้ๆณ่ฟๆณๅปบ็ญๅทฅ็จ่ฟ่กไบ้ๆณๆดๆฒป\n1000 Processed\n classify content\n58500 0 ็ซ ็้็A็ฑปๆๅไปฅๅๅ็พๅคไผฆๅค่ฑๅฝๅคงๅคงๅฐๅฐๆๅไธๅคงๅ \n58501 0 ไธไบ้ถ่กๅๅธๅๅฏ่ฝๅฐ้ขไธด้ฃ้ฉ\n58502 0 ๅพฎ่ฝฏCEO่จ่ไบ็บณๅพทๆๅจไบ่ฎก็ฎ็้่ทฏไธ่ถ่ตฐ่ถๅๅฎ\n58503 0 ่ฟไบๅ
ฌไบค่ฝฆๅฐๅๅธๅจ7่ทฏใ33่ทฏใ303่ทฏใ405่ทฏใ407่ทฏใ414่ทฏใ612่ทฏใ708่ทฏ8...\n58504 0 ๅๆ็ญ่พฉๆ่งๅฎ๏ผๆๆ่ดน็จไธๅบ็ฑๆๆฏไป\n1000 Processed\n classify content\n59000 0 ๅๆถๅซๆณชๅบไธไธชBoseQCxx้ๅช่ณๆบๅจ็พๅฝไบ้ฉฌ้ไธไนฐ็\n59001 0 ้ช่ก3000ๅ็ฑณ็20ไฝไธญ้ฉๅฐไผไผดไปฌ\n59002 0 ๆไธๆถไบไธๅฑๅธๆฐ่่ๆๆฐงๅ็ฒพๅ\n59003 0 ไธ่ฝฆๅธๆบ็ไธๅคฉ๏ผๆถๅ
ฅๅๅฐ่่่ฝฌ่ก\n59004 0 ๅๅๅๅๅๅๅๆณ่ตท็ตๆขฏไบไปถ่ฟๆฏๆณไน็ฐๅจๅๅจไธ่ตทไธ็นไธ็ๆฅๅๅฎถ้ฟๅบฆ็งฏ็ดฏ็็นๅซ่ธๅฎ\n1000 Processed\n classify content\n59500 0 ๅญๅฎซ่่
บ็่ฟไธๅ่ฏๅจๆไปฌ่บซ่พน้ข็นๅฐๅบ็ฐ\n59501 0 ๆๅคงๅฟ้ข้จ้ๅฎๆตท146ๆฏซ็ฑณใไธ้จ144ๆฏซ็ฑณใๆคๆฑๅบ136ๆฏซ็ฑณ\n59502 0 ๆฟๅฐไบงๆดป่ท็ป็ปๆต่ฟ่กๅธฆๆฅ็ๆญฃ่ฝ้\n59503 0 ๅธฎๅฟ่ฝฌไธ่ๆ็ปฟๅไบฌ็ซ็็ฅจไธคๅผ ๅฅฝๅ\n59504 0 ๅฐ้ไฟกๅท็ฒๅบๆบๅคโฆโฆๅ้ต็ซๅฐๆฒๆฒณ้ซๆๅญ\n1000 Processed\n classify content\n60000 0 2014ๅนด้ซๆทณๅบ่งๆจกไปฅไธๅทฅไธไผไธไธๅ
ๅทฅไธๅขๅ ๅผ่ฝ่ไธบ0\n60001 0 ๅๅๅๅบ่ๅคงๅๆ่ฏดๆตๆฑๆไธชๅคงไธ็จๅพฎๅๅข้ๆจกๅผไธไธชๆ่ตxxๅคไธ\n60002 0 ไธๅพๅทๅฝๅ็ฉไธๅคง่กไฟๆดๅ้ฒ่ๅพ็ฅ\n60003 0 ๆณๆณๆฌๆฅ่ฟๅจๅซๅฐบ็26ๆฅผ้ฃ้ด้คๅ
\n60004 0 ๆ7ๆ28ๆฅๅ็ปๆๅคง่ฟไธญๅฟๅป้ข่็ฆ็พค็ป็ป็็ฌฌไบๆ็พๅคฉ่ฟๅจๅๅฅฝ็ปๆ\n1000 Processed\n classify content\n60500 0 ๆข
ๅๆฑๅ้ๆฒๆนพKTVๅ็ๆ้ฉไธๅน\n60501 0 ๅจๅไบฌไธพๅ็ไปฅๆๅคงๅฉ็ฑณๅ
ฐไธๅไผๅไบฌๅจไธบไธป้ข็โๅไบฌ็ฑณๅ
ฐๅๅไบๅจ็งโไธ\n60502 1 .xx.xx.xx.xx~xxxๆ:ๅฟ
ไธญใ ่:้ฉฌ~ๅฟ
ไธญx็ :xx.xx.xx-xxxๆ:ๅ
้จ...\n60503 0 ไธบไปไนๆทๅฎappๅณไธ่งไธไธช่ๆ\n60504 0 ็ณ็ฎ็ต็งๆดพๅบๆๆฐ่ญฆๆฅๅฐๆฅ่ญฆ\n1000 Processed\n classify content\n61000 0 ๆฏๅผๅพๆฏไธชไบบไธบไน้ชๅฒ็่ดขๅฏ\n61001 0 ๆไบบ่ฆๅไบฌ้้นฐๅข็ฑณๅ่ฅฟๆธธ่ฎฐไนๅคงๅฃๅฝๆฅ็็ฅจไน\n61002 0 longtimenoseehey๏ฝ\n61003 0 ไธๅฎถ่ๅฐไปๆญคไธๅป้ขใ่ฏ็ฉใ็
็ๆ ็ผ\n61004 0 ่ขๅงๅงๆฎๆผไบ้ฅฑๅ่ดจ็ไฝๅง็ปๅฏนๆผๆๆ็็ญๅฟฑไนๅฟ็ๅฅณๆผๅ\n1000 Processed\n classify content\n61500 0 ๆฑฝ่ฝฆไนๅฎถ่ฝๅฆๅฐไธๅกๆจกๅผ้กบๅฉ่ฟๆธกๅฐไบๆ่ฝฆ้ขๅ\n61501 0 ๅ
ฉๅฒธๅๅฐ้ๅนด่้ฆๅไบฌๆ
็ทฌๆญดๅฒ\n61502 0 ๆๅจๅไบฌ่ทฏไบจๅพๅฉไบงๅๅทฒ้ๆฐไธๆโฆ\n61503 1 ๅฎขๆทๆไพไผ่ดจ็่ดท ๆฌพๅจ่ฏขๆๅกใ ๆไปฌ็่ดท ๆฌพไบงๅๅฉ็ๆ ไฟก*่ดท๏ผๆ่ดน็x.xx-x.xx%...\n61504 0 ๅ
็ๅ
ฌๅธ้ไฝๆ
ๆธธ5็นๅคๅฐฑๅบๅ\n1000 Processed\n classify content\n62000 0 ไปๆฅๆท่ดขไธๅบๆฑ็ง่ตๆ้ๅ
ฌๅธๆญฃๅผ่พพๆๆ็ฅๅไฝ\n62001 0 ๅไบฌๅๅคงๅป้ข้็น็งๅฎคใๅป็\n62002 0 GoogleDriveๅ
ๅฎนไบบๅฎกๆฅ่ฟๆฏๅคช็ไบ\n62003 0 ไบใๆไปฌ่ฎคไธบ้ๆณๆ้คไธๆดๆนๆๅ ็ๅๅญๆถ\n62004 0 ไปๅนด้กบๅฉ่ขซๆตๆฑไผ ๅชๅคงๅญฆๅฝๅ็่ดบๆ็ดซๅ่ขซๅๆๅคงๅญฆๅฝๅ็ๆด้
ๅฎๆๆๆฝๅบไผๆฏๆถ้ดๆฅไธบๅญฆๅผๅญฆๅฆนไปฌๆๅฏผไธไธ\n1000 Processed\n classify content\n62500 0 ไธๅนดๅ็ซๅฟ่ฆ่ๅฐไบบๅคงๆฅๆพไฝ \n62501 0 ไธ่ฟ้ๅฃso?hoๆๅไผคไบบๆก่ฟๆฏๆดๅฏๆ\n62502 0 ๆๅๆ็ฎๆผๅๆๆๅไธๆ่่พฆไธๅๆๆฉ่พฒ็จฎๆค่พฒๅ ดไนๆ
\n62503 0 ่ไธไธๅๆฎ้ๆๅ้ฃๆ ทxๅจๅทฆๅณๅฐฑๅพๅฟซๅฅฝ่ฝฌ\n62504 0 ๅ
จๆฐApivitanewBeeRadiantไบฎ่ๆ็บไฟฎ่ญท็ณปๅ\n1000 Processed\n classify content\n63000 0 ็ฐๅจๅทฒ็ปๆไบบ้ขๅฎdcไบ้ฃๅฐฑๅ
ๅผๅงๆๅๅคฉ็ๅ\n63001 0 ๅฐ้ไธ็ๆธฉๅบฆๅๆทฑๅณๅ็ๆธฉๅบฆ็ฎ็ดๅคฉๅฃคไนๅซ\n63002 0 ่ฟ่ฆๅจๆๅฎไธคๅคฉguyongไธๅฐ่ๅท\n63003 1 โไฝ ็ๅญๅจใ็ๅฝ็ๆไนโๅฅณไบบ่ๅฐๆฅไนๅณ๏ผ่ฏ็ฏไธๆ็น้ๆจๅฐๅบ้่ดญ๏ผ็นๆจๅบๅฌๆฌพxๆใๆฅๆฌพxๆ๏ผๆ...\n63004 0 ๅนด็ปๅฅ10000ๅ
ๅทฆๅณ+ๅ
ไฝๅ
ๅ+ไบ้ฉไธ้\n1000 Processed\n classify content\n63500 1 ๆจๅฅฝ๏ผๆๆฏๅฑฑๆนๆนพ็็ฝฎไธ้กพ้ฎๅผ ่ณ๏ผๅฑฑๆนๆนพ็ฐๅจๆฐๆจๆฟๆบ็ซ็็ญ้ไธญ๏ผ่ดญๆฟๅค้ไผๆ ๏ผๆๅพ
ไธๆจ็ๅๆฌก่ง...\n63501 0 DMๆๆบu็16gu็ๅๆๅคดotg็ต่ไธค็จๅๆ้ๅฑ่ฟทไฝ ่ฝฆ่ฝฝ้ฒๆฐดๅ
้ฎ\n63502 0 ่ฝฆๅไป็ป๏ผๅฅ้ฉฐs็บงๅ ช็งฐๆ่้ใ่ฑชๅ็ไธญๅคงๅ่ฝฟ่ฝฆ\n63503 0 ็ฎ่ค่กฐ่็ฑ็บน็ไบง็ๆฏๅ ไธบ่ถๅ่็ฝๆตๅคฑ\n63504 0 ๅ้กพๅๅฅฝๅ ๆฎตxxใxxๅนดไปฃๅ
NBA่ง้ข\n1000 Processed\n classify content\n64000 0 ็ปๆ่ฑๅฎๅก้ๆๅ็xxๅไนๅๆๅฐฑๅๆปไบ\n64001 0 ๅพๅฐไบไธไธชๅไธบ็ๅฎๅๆๆบ็ปไบไป\n64002 1 ๆฐๅนดๅฅฝ๏ผ้ถ่กๆฐๆฟ็ญ๏ผๅช่ฆๆไฟ้ฉๅ\\ๆๆญใ่ญๆฌๆฟ\\ๅฐ่ฝฆไนไธ๏ผๅญ่บซไปฝ\\่ฏๅณๅฏๅ็๏ผๆๆฏไป
xๅ-x...\n64003 0 1992ๅนดไปๅไบฌ็ฉบๅๅธไปค้จๅ่ฎญๅค่ฝฌไธๅ่ฟๅ
ฅๆฑ่็ๆตท้จๅธๅฎก่ฎกๅฑๅทฅไฝ\n64004 0 ็ฝ้กต็ซฏ่ฟๅฟ
้กปๆๆบไธ่ฝฝappๆซๆ็ป้\n1000 Processed\n classify content\n64500 0 ๆณๅฎไธ่ฆๆไธบ็ซฏไบบ็ขๆไบบ็ฎก็ๆณๅฎ\n64501 0 โsuper189็ตไฟกๆฌขgo่โ็ผบ็ฟผไธๅฏๅๆฌกๆฅ่ขญ\n64502 0 ๆไธบไปไนไฝๆญปๅๅผhhhhhhhhhhhhhhhๅฟซ็ฌๆญปไบhhhhhhhhhh\n64503 0 ไธญๅฝๆฑ่็ๆทฎๅฎๅธๆทฎ้ดๅบ็่ฅ้\n64504 0 ๆฎๆตๆฑ็ๆ
ๆธธๅฑๆๆฐ็ป่ฎกๆฐๆฎ๏ผๆตๆฑxxxxๅนดไธๅๅนดๆฅๅพ
ๅฝๅ
ๅคๆธธๅฎข็บฆx\n1000 Processed\n classify content\n65000 0 ๅๅ็พ็ต7ๆไธบๅนฟๅคงๅญฆๅๅผๆพๅ
่ดน็่ทจๅข็ตๅ่ฏๅฌ่ฏพ็จ\n65001 0 ไธญๅคฎ็ฉบ่ฐๅฃฐ้ณๅคง็่ท็ซ็ฎญๅๅฐไผผ็\n65002 0 50ไธ็่ทฏ่VS15ไธ็้้ฃ็็ธ่ฎฉไฝ ๅคงๅ่ก\n65003 0 comๅๅก้จ้ขๅฏผใๅฝๅฎถๅทฅๅๆปๅฑ้ขๅฏผใไบบๆฐๆฅๆฅ้ขๅฏผไธญๅฝๅไธ่ๅไผ้ขๅฏผ็ญ็ญๅๆฅๅฐๅบ็ฅ่ดบๅๅ่จ\n65004 0 ่ฝ่ฝฝ็16ไบบไปฅ250ๅ็ฑณ/ๅฐๆถ็้ๅบฆ้ฃ้ฉฐ\n1000 Processed\n classify content\n65500 0 wifi็ต่็ฉบ่ฐๅฐ็ฎฑๅ ๅผ ๅบ\n65501 0 ๅทฒ็จไบ้ต้ฃๆๆๆบๅ็ซ็ฎญๅๅจๆบๅท็ฎกไธญ\n65502 0 ไนๆฏgoogleๅบไบlinuxๆๆบๆไฝ็ณป็ป็ๅ็งฐ\n65503 0 ๆฟๅฐไบง็จ็ซๆณๅ็จฟๅทฒๅบๆฌๆๅ\n65504 0 ๅไธบๅฐๅฅๅฐฑ่ตฐ่ฟๅป่ฏด๏ผโๅ
็\n1000 Processed\n classify content\n66000 0 ๅๆฌขไธๆน่งๅพๅ่ฑๅ้ชจๅพ่ฌ้
\n66001 0 ไธไธชไบบๅปๅไบฌๅไบ็ด ้ขๆฒกๆ็ขฐๅฐๆไผ็ไฝ ็งฆๆทฎๆฒณ่ฟๆฏ้ฃๆ ท็ดข็ถๆ ๅณ\n66002 1 ๅฅณไบบ่ๆฅไธดไน้
๏ผ็นๅๅนฟๅคงๅฅณๆงๆๅๆจๅบไผๆ ๆดปๅจ๏ผๅฟซไนๅฅณไบบ่๏ผVIP้กพๅฎขๆๅกไบซๅxๆไผๆ ใๅฆๆๆฒก...\n66003 0 ๅ่ฆๅผๅงmyperfectday\n66004 0 ไธๅๅฐ้ๅฐฑๆณ้ช่กๅไบฌๅฐ้ไธๅ้ไธ่ถไฝ ่ฟ่ถๆฒก่ตถไธ่ตถไธ่ถๅฐฑ็ไธ่งไฝ ๅฆๆๅไธ็ผไบ่ฟๆฏๆไน็\n1000 Processed\n classify content\n66500 0 2013ๅนด11ๆๅ็จๅไธๅนดๅคไธ็ดไธๆ่ฟ่ฝฆ\n66501 0 ไธๆฆๆ่ตทๆฅๅฐฑๆฏๆฑฝ่ฝฆ้็Windows\n66502 0 ๆจๅฝฉ้ๅฐฑๅๅฐๅฐๆฟ้ด็ต่ๅ็่ฝฌๆค
ไธ\n66503 0 ไธบไปไนๆฑ่็งปๅจ็ฝไธ่ฅไธๅ
ๆฒกๆๅ็บฆๆบ\n66504 0 ้ขๅฏน่ดจ็๏ผไป็ฐๅจๆไนๅฏ่ฝ็ก\n1000 Processed\n classify content\n67000 0 ้่งๅ
ถๅ/็ไผผMH370ๆฎ้ชธๅฐๅ็ฐไธญๅฝ็ฟๆณๆฐด็ถ\n67001 0 ้ฃๆบไปๅพ็บธไธ่ฝๅฐ้่ฆไธคไปฃไบบ\n67002 0 ๆ็ไธ็ธไฟกๅๅจๆๆบๅฑๅนๅ็ไฝ ไปฌไธไธชไบไธช้ฝๆฏๅๅ
จๅ็พ็ไบบ\n67003 0 ๆๅจๆทฎๅฎๅ
ซไปๅฐ้่ฟ้่ฟ็ๆ่ฝฆ็ๆๅๅ\n67004 0 ๅกๆๆผ่ชไป็Baupostๅบ้1982ๅนดๅ็ซไปฅๆฅ\n1000 Processed\n classify content\n67500 0 Windows็จๆทๅทฒ็ปๅฏไปฅๆถๅฐๆญฃๅผ็ๆดๆฐ็ๆจ้ไบ\n67501 1 ๅฐๆฌ็ๅญฆ็ๅฎถ้ฟ๏ผ่ด่ฟๆ่ฒ็ฅๆจ้ๅฎถๅนธ็ฆ๏ผ็พๅนดๅคงๅ๏ผ่ด่ฟๆ่ฒๅไธญๅๅญฆ็งๆฅไธ็ง้ไธ็ง๏ผๅ
จ้ขๆๅๆ็ปฉ...\n67502 1 ๆฐๆ็ปดๅฅฅๆฐไธๅฎถๆฌๅจๅๅณxๆxๅทxx:xxๅผๅง่ตๅๆจกๆ้ขๅๅ่่ฏๆๅทงๅ
่ดน่ฎฒ่งฃๅน่ฎญใๆดปๅจๆ็ปญไธคๅจ...\n67503 0 ๅด่ขซHTC้่ฟไธๅฉๆไบบๅจไธๅฉๅฎกๆฅ่ฟ็จไธญๆไธๅ
ฌๅนณ่กไธบ็ไธพ่ฏ่ๅฏผ่ดๆปก็็่พ็ไบๆ
\n67504 0 ๅไบฌๅธ็พๅไผๅๅฑฑๆฐด็ปๅฎถไฝๅ\n1000 Processed\n classify content\n68000 0 ๆฌๆฅไปๅคฉๆณๅป็McQ็ๅฑ่งโฆ็ปๆๅ็ฉ้ฆๅผ้จไธๅฐๅๅฐๆถ็ฅจๅฐฑๅๅ
ไบโฆๅชๆๅป็้ๅฑโฆๅจV&\n68001 0 ็ถๅไธไบไธๆ็็ธ็ๅคๅฝไบบไนไผ็่จๅคง่ๅๆงฝ\n68002 0 ็ปๅคงๅฎถๅไบซ\"ๅ่ฅๅผไน็ป็ฌไนๅ\"\n68003 0 Whataretheyๅผๅฅๅ\n68004 0 ๅชๆๆฟๅบๆๆ่ฝๅๅปๅ
ณ็ฑ้ฃไบๅคฑ็ฌๅฎถๅบญ\n1000 Processed\n classify content\n68500 0 ๅๅฐๆดๆฒปๆฏๅๅฑ็ฐไปฃๅไธใไฟ่ฟๅๆฐๅขๆถ็ๆๆ้ๅพ\n68501 0 6ใๅ
ๆฟๅนฒ้จใๅ
ฌ่ไบบๅ้ๅพท่ดฅๅใ็ๆดป่
ๅๅ ่ฝ\n68502 0 ้ฝๅพ้ ็ๅฝๅ็ณๆ่ฝๆญฃๅธธๅผๅฃ่ฏด่ฏ\n68503 0 ๅคฎไผๆน้ฉใๅๅทฅ่ชๅคฉใๆบๅจไบบ็ญๅผบ่
ๅฝๆฅ\n68504 0 ๆฐๅๅ
ซไปไนๅญ็ฒๅฐ็็ธไบไปถไธญๆ432ไฝ็งไผคๆฃ่
ไปๅจไฝ้ขๆฒป็\n1000 Processed\n classify content\n69000 1 ............ๅบๆ็คผใ่ฟ่ดญๆ็คผ๏ผๆปกไธๅ่ฟxxx๏ผๆ็ดไพๅก๏ผๆ่ฝๅๅ ๆดปๅจ๏ผๅฏไฝxxx...\n69001 0 ไธ่ฌ็ทไบบ็ไฝๅ
ๅคง็บฆๆ300ไบฟไธช่่ช็ป่\n69002 0 ไธญๅฝๅฅฝๅฃฐ้ณ้้ข้ฝๆฏ้ป้พๆฑ็ๆญๆๅ\n69003 0 ๅฟซๆฅๅญฆๅงโโ็ฑ่่ๅๆ ๆชฌๅฒ่ฐ่ๆ็่ถ้ฅฎ\n69004 0 ๆฑ่็ไธญๅบ็่ฟxๆณจไธ็ญๅฅๅๅซ่ขซ่ฟไบๆธฏใๆฌๅทๅๅฎฟ่ฟๅฝฉๆฐไธญๅพ\n1000 Processed\n classify content\n69500 0 ๅไธ้ๅขๅฎฃๅธไธๅฝๅ
ๆๅคง็ๅจ็บฟๆ
ๆธธไผไธๆบ็จๆ
่ก็ฝ่พพๆๆ็ฅๅไฝ\n69501 1 ๅทๅค.ๅทๅค.ไธ็จๅปๆพณ้จ.็ฝไธๅผๆทไน่ฝ็ฉ๏ผOKๅจฑ.ไนๅ
ฌๅธ๏ผ่ฟๆฅๆฐๅฎขๆท๏ผๅผๆทๅฐฑ้xxxๅ
๏ผ็ฉๅฐx...\n69502 0 2015ๅนด่ฏๅธไปไธ่ตๆ ผ่่ฏ้ขๅบ๏ผd\n69503 0 20ๅไบฌๆๅทฆไธ็ใๅณไธ็ใๅทฆ็ใๅณ็ใๅทฆไธ็ใๅณไธ็\n69504 0 ๅ
ฑ่ฎกๅๆพๆถๆๆธ
ๅ้ฅฎๆ8735็ฎฑ\n1000 Processed\n classify content\n70000 0 โโโโโโๅฟๅญ\n70001 0 ๅฅฝๅฃฐ้ณๅญฆๅ้้ขๅชไธชๆฏ่็ฎ็ปไบฒ่ช่ฏท็\n70002 0 ๆดๆฒกๆๅนผ็จ่่ฃ
้ผๆ ผ็็ๆดปไฝ้ฃ\n70003 0 ๅทด้ต็ณๅใ้ฟๅฒญ็ผๅๅๅฒณ้ณๆถ้ฒๆฏ้ใๆนๅ็ๆถ้ฒๆป้้ฟๆฒๆๅบๆๆ
ไฟ้ๅคง้็ญ12ๆฏๆถ้ฒ้่ๅๅ
ฌๅฎใ...\n70004 0 ๆๅปๅฐผ็็็บขๅ
้ๅฑๅฐๅบ่ฝไธ่ฝๅ
ๆข\n1000 Processed\n classify content\n70500 0 ๅฒญๅ
ๆไฝไบๅฎๆกฅ้ๆฟๅบ้ฉปๅฐๅxๅ
ฌ้ๅค\n70501 0 ๅช่ฝไธๅคงๆฉ็่ฑๅ้ชจ็ฌฌไบ้ไบ\n70502 0 ็ฐ่ตทๆๅฝๅนดๅฝๅๆฒกๆ้ฑๅฐฑๆๆๆบๅฝไบ\n70503 0 ***********็ฐๅจๆงๅถ้ฃ้ฉๅพ้่ฆๅ***่ดตๅท่
ๅฐๆตทๅฒๅปบ่ฎพ็ณๅฒ็บธไธ้ป็กไธนไธญ่ช็ตๅญๅค็ซน็บบ...\n70504 0 xใ็พคๅ้ขๆฌก่ฟ่ก่ฐๆด๏ผๅพฎๅ็ญพ็บฆ่ชๅชไฝใๅชไฝ่ฎค่ฏ็จๆทไธบxๅคฉxๆฌก\n1000 Processed\n classify content\n71000 0 ๆไปฌๆฏ้ ๅ
่กฃ็ฌ็น็่ฎพ่ฎกๅฐ่ธ้จๆ่พน็่่่้ๅจไธ่ตท\n71001 0 ledๆถๆ ผๅฐ่ฑๅฎ ็ฉๅๅ
้กนๅๅค้ด้็็็ซๅช็ฅๅจ\n71002 0 ็ฅไบงๆณ้ข้ข้ฟๅดๅๆๆๅบ่ฆๆฑ\n71003 0 ไพๆงๆฒๆตธๅจๆ็ต่ๆไบ็็่ฆไนไธญโฆไปๅคฉๆฉไธๆ็ต่ๆถ่ฟ็ๅญ้ไพ่ตทๆฅ\n71004 1 ใ็้ไธป้ขใ๏ผๆฐ่ถไธๅธใ็ฝๅคฉxx็นๅฐxx็น่ฌไธ้คธไธ๏ผๅ
จๅคฉไบไบบๅ่กไธไบบๅๅ๏ผๅฆๆๅ
จๅคฉๆ็ฐ้็ฎใ...\n1000 Processed\n classify content\n71500 0 ๆฏไธๆญฅ้ฝไธๆฏๆไบไฝ ๆๆๅพ
็้ฝไธๆฏ็็ธ็ๅ
จ้จๅฝ่ฟ็ๅฎๆๆปไผๆ็ผ็ฑๆๆ่ฟไธๅฆๆไป้้่ๅฎ\n71501 0 7ๆ18ๆฅ08ๆถ่ณ19ๆฅ08ๆถ\n71502 0 ่ขซๅฑฑไธ็่ๅๅธไธญ็บงๆณ้ขๅคๅคๆๆๅพๅๅๅๅนด\n71503 0 ๅฆๅ้ฃๆบๆ ๆณๅจๅฉไฝ็่ท้ไธๅไธๆฅ\n71504 0 ่ขซ้ๅท้่ทฏๅ
ฌๅฎๅฑๆด้ณๅ
ฌๅฎๅคๆด้ณไธ็ซๅ
ฌๅฎๆดพๅบๆๆฃๆผๅ\n1000 Processed\n classify content\n72000 0 ๅ้ๅธๅๅธ้ท็ต้ป่ฒ้ข่ญฆ|้ท็ฅๅๆฅไบ\n72001 0 ไปฅ็ฝๅคด็ฒๅบใ้ปๅคด็ฒๅบใ็ๆงไธ็นใ่็ฑใ็ป่ใๅ่ฟ็ญไธบไธป่ฆ่กจ็ฐ\n72002 0 ๅผๅไปไธ่ฏฅๆ่ญฆๅฏๅ็่ขไฝๅฒ็ชๅนถ่ขซๆฃ็xxๅฐๆถ\n72003 0 ๅฅถๅด่ฎพ่ฎกๅพ็ธๅฝๆ็ฑ~้ธๆฐๅค้ฒๆๆจๆ\n72004 0 ๆญค็ชๅๆน้ๆฉๅจSUPER็ปๅ
ธLunettesII้ๅไธๆฅๆดๅๆ็ตๆ\n1000 Processed\n classify content\n72500 0 ๅ
ๅ้ชๅ53ๅๅคๅฐๆท็ฑๅฎถ้ฟ้ฑๆฌพๅ
ฑ่ฎก46\n72501 0 ๅ
ป่ไฟ้ฉๆฏๅฏไปฅๅผๅฐ่ฝฌ็งปๅนถ็ดฏ็งฏ็\n72502 0 MRSOH่ขซๆ้ผ่ฟซ่ดฑๆๆๅ็ฌ็็ๅค่ดฑ\n72503 0 ่ฏ็ฐๆดพๅบๆไธ้ซๆฐๅบๅปบ่ฎพๅฑ่ๅๆงๆณ\n72504 0 ๆๅฐฑๆข่ฎฉไฝ ไปฌๅ่ช้ฃๆบ้ฝ้ฃไธ่ตทๆฅ\n1000 Processed\n classify content\n73000 0 ๅฏ็คบ๏ผ่ฟไน่ฎธไธๅ
ๆฏไธ็ง้ๅฎ็ๆๆณ\n73001 0 ๅ ๆฟๅคง่ๅ็ๆญ่ฝฆๆ
่กๆบๅจไบบโHitchBOTโๅจ็พๅฝ่ดนๅไธๅนธ้ๅฎณ\n73002 0 3BeautyQuotient็พๅ\n73003 0 ้ๆธๆๅฐฑๅบ้ๅฎๅคไธญๅฎๆธๆ\n73004 0 ๅดๆฏๆฏๆป่ขซ่ดจ็โไฝ ไธ็ญๆๅจๆซ้ฝๅจๅฎถๅนฒไปไน\n1000 Processed\n classify content\n73500 0 ๆฏไธบไบๆทกๅฎ็ไปๅฎน็ๅฅฝๅฅฝ็็็่ฟไธชไธ็\n73501 0 ไฝฟ็จ้ฒๆ้็ฆปไบงๅx็ฎ่คๅทฒ็ปๆฏไบๅฅๅบทๆๆ่โ่ๅฟๅ่่คไฟฎๅค่ฐ็\n73502 0 ้ธกไธๅฟไบบๆฐๆณ้ข้ข้ฟๆฅๅพ
ๅฎค้\n73503 0 ไปๅคฉ05ๆถไฝไบๆตๆฑ็ๆธฉๅฒญๅธไธๅๆนๅ550ๅ
ฌ้็ไธๆตทๆตท้ขไธ\n73504 0 1ๅฅณไบบไธบ็ฑๅบๅขๅบๅขไธๆฌกๅฐฑๅคๅฅณไบบไบซ็จไธ็็ทไบบไธบ็็ๅบ่ฝจ็ทไบบๅทๆ
ๅฐฑๅๅธๆฏ\n1000 Processed\n classify content\n74000 0 66%็็ต่็ตๅจไธๅๅไธๆถๅฝปๅบ็ฆปๆ่ๅป\n74001 0 ็่ณ็ๅฐๆ็ฝไธไผ ไบๆ็่ณ่ๅฅฝไบๅฅนๅฐฑๅ้ฃ็ง็
ง็ไธๅป่ฟ่ฎพ่ฎกๆๅฎณๆๆญปๅๆไบบๅฟๅฝ\n74002 0 ๅๅทiphone6ไปฃ4\n74003 0 ไธ้ข่ฏท็ฝๅ็ไธฝไบๅฆไบงๅป้ข็ๅป็ไธบๆจไป็ป\n74004 0 ่ฏ็ไผๅฎกๆ ธ้่ฟไบ็ณ้พ็ตๆขฏ็IPO็ณ่ฏท\n1000 Processed\n classify content\n74500 0 ไธบไบๅๅคไธ็ ดๅๆฏไบฒๅคงไบบ็็พๆขฆ\n74501 0 ไธ่ฟๅฐๅทไปฌๅฏ่ฝไผๅฟ่ฎฐ่นๆๆๆบ่ฟๆๅฎไฝๆๅก\n74502 0 ็จ็พๅบฆๅ็จGoogleๅฏนๆฏ็ๆฅๆ็ๆๆๆ\n74503 0 ๅฎไบไปๆฅๅๆจๆต่พพ้ๅบ็ญๆๅ็ๅๅฐ้็ปๅไบฌ็ปง็ปญๅไธ้ฃ่ถๅคชๅนณๆด\n74504 0 ๆพ่ๅท้่พพๅผ้ๅ
ฌๅธไธไธ้
ๆฑฝ่ฝฆ่ฏ็้ฅๅ้ฅๆงโๅทฅไธๅญๅบโๆน่ฅฟโๅฟซ็น8ๅ็ฑปไฟกๆฏ็ฝ\n1000 Processed\n classify content\n75000 0 ๆๅคฉๆฉไธ็้ฃๆบ่ฆๅปๆๅฐผ็ไบบ็ฐๅจ่ฟๆฒกๆถไธ่ฅฟโฆ่ฟๅจ็ฃจๆดๅทฅ\n75001 0 ๆ่งไธญๅฝ็ฐๅจๆฏไธไธชๆณๅพใ้ๅพทๅๅคฑๆ็็คพไผ\n75002 0 ่ง้ข่
พ่ฎฏ้ณ้ข51singUPไธป๏ผๅคฉ้ญๆๆไธป\n75003 0 ๅทฅไธๅญ็ๅก่ฝฆๅธๆบdalaoไปฌๅพๅๆฌขๅๅจๅ
ฌไบค็ซ็ๅ้ข\n75004 0 COM็พๅฆ็ฝ็ซไบๆไปฝ่ทๅพๆธ
ๆด็ฑป็ๅ ๅ\n1000 Processed\n classify content\n75500 0 ๆ็ๅงๅงไฝไธบ็ฅฅ่ๆณ้ข็ไปฃ่กจๅไธไบๆญคๆฌกไผ่ฎฎ\n75501 0 ๆฏไธชไบบ้ฝๆ็ๆๆบๅฏนๅๅไธไธช็ตๆขฏ้จๆฏๆฌก็ตๆขฏ้จๅผ้ฝไผๅฌๅฐไธๅฃฐๅนๆฏ็ถๅๅคงๅฎถไธ่ตทๅๅๅคง็ฌ\n75502 0 ๆฑ้ฝๆฐดๅฉๆข็บฝๆฏๅๆฐดๅ่ฐไธ็บฟๅทฅ็จ็ๆบๅคด\n75503 1 ๅฐๆฌ็ๅฎขๆท๏ผๆจๅฅฝ๏ผ่ฏทๅ
่ฎธๆๅๆง็ๆๆฐใ้ฉฌๅฏๆณข็ฝ็ฃ็ ใๅธ่บๅทฅๅ็ชๅธ๏ผๅฎพๅทๆๅ
ทๅฎๅ็ไธคๅคงๅ็ๅผบๅผบ...\n75504 0 ๅทๆธ
็่ก็ฅจ่ฎฒๅบงไปๅคฉๆฏๆ่ๅธ่ฎฒ่ฏพ\n1000 Processed\n classify content\n76000 1 ๅไฝๅฎถ้ฟๆฐๅนดๅฅฝ:ๆฅๅญฃ็ญๅทฒๅผๅงๆฅๅไบๅฆ๏ผ็ฐๅจๆฅๅ้ฝๆไผๆ ๏ผๅนถ่ต ้็ปๅ
ท๏ผๅๅนดๆไธๅนดไธๆฅไผๆ ๆดๅค๏ผ...\n76001 0 ๅจ8ๆ10ๆฅไนๅ้่ฟ้กพๅฎข็ๆๆ่ดน็จ\n76002 0 ๅซ็ฌ็นๆดไบฎๆๅSylodent\n76003 0 ๆฑฝ่ฝฆ็่กจ็่ฆๆฏ่ฎพ่ฎกๆ่ฟๆ ทไผไธไผๆฏ่พๆๅจๆ
ๅ\n76004 0 ๆไปฅๆ่งๅพ่ฟไธช็นไฝไธๆฟๅบๅบๆๆค็ๆ็ๆง็ๅบ็ก\n1000 Processed\n classify content\n76500 0 ่ฐทๆญ็ผ้ใiPhonexSใ็พไธ็บขๅ
ๅ
่ดนๆฟ\n76501 0 ๅ
ฌๅ๏ผxxๆxxๆฅๆ็คบๅๅบxxxๅช\n76502 0 ไบ่็ฝไธๅ
ณไบSEO็ๆ็จ้ๅคๅฏ่ง\n76503 0 โๅข่ดท็ฝCEOๅๅๅฏน่ฎฐ่
ๅฆๆญค่กจ็คบ\n76504 1 ๅๅคง่ดขxxxxๅนด-็ฌฌxxxๆ๏ผไธไบไธๅคดๅฅฝ็ๆบ๏ผๅๆฐ็นๆณข็็บข็ปฟใ้๏ผ้พ่็็ชไธญxxxxๅนด-็ฌฌx...\n1000 Processed\n classify content\n77000 1 ๆฐๅนดๅฅฝ่ๆฟๅจ๏ผๆๆฏๅนฟไธไฝๅฑฑๅธ้ถๅผบไผไธ๏ผๅฏๅผบ้ถ็ท๏ผไธๅก็ป็๏ผๆธธ้ซๆบ๏ผๆไปฌๅ
ฌๅธๆฏๅๅๅๆ๏ผๅฑๅ
ๅฐ...\n77001 0 UPไธป๏ผไธ่ด่ดฃไปป็ตๅฝฑ็ ็ฉถ้ข\n77002 0 ไปๅนดๆๆๆผๅบๅญฃ็ญ่ตทๆฅๆฒกๅผๅๅธไผๅทฒๅ้จ็ฅจ่ฟๅ
ซๆ\n77003 0 GCPD่ท็่็ทๅฑ่กๅ้ข่ฟฝ้ฃๅ ๅนด\n77004 1 ๆจๅฅฝ๏ผๆถๅฐๅฅณๅๅ
ฐๅบๅบx.xๅฆๅฅณ่ไธบๆจๅๅคไบไธไปฝ็คผ็ฉ๏ฝ่ฏทๆจxๆxๅทๆๅก่ฟๅบ้ขๅ๏ฝxๆxๅทๅฐxๆ...\n1000 Processed\n classify content\n77500 0 ่ฟๅบงไฝไบๅพทๅฝๅ้จsuurhusen็ๆๅ ๅฐๅกๅทฒ่ขซๆญฃๅผๅๅ
ฅๅๅฐผๆฏ็บชๅฝไธ็ๅคงๅ
จ\n77501 1 ๆจๅฅฝ๏ผๆฌฃๅฅ้ค็ค่ๅฉๅบไธๆๅ
ซๆฅ้ๅฏนๆฐ้กพๅฎข็นๆ ๆดปๅจๅไปทxxxๅ
ไฝ้ชๆดปๅจxๆxๆฅไธๅๆซๅพฎไฟกๅช้xx...\n77502 0 ๆไบ่ขซๅซ็ๅป็ๅฎฃๅคไธ่ฝไฟฎๅค็็ผ็\n77503 0 ๆตๆฑ้ๆๅฟๅผๅฑๆๆๅฒๆๅพ้\n77504 0 ้ซๆทณไบค่ญฆๅคง้ๆทณๆบชไธญ้ไธปๅจไผๅๅบๅ็ฎก้จ้จๅผๅฑๆฐดๆๆ่ดฉๅ ้็ป่ฅ้ไธญๆดๆฒป\n1000 Processed\n classify content\n78000 1 ็บข็ผจไธๅๅนผๅฟๅญ็ฐๆญฃ็ซ็ญๆฅๅไธญใ็ญๅฟฑๆฌข่ฟๆจๅๅฎๅฎ็ๅฐๆฅ๏ผๅ่งๆฌๅญ๏ผๅฐฑ่ฏปๆฌๅญใๅญฆไฝๆ้๏ผๆฌฒๆฅไป้...\n78001 0 ไธๅฑฑๅฟๆฟๅบ2015ๅนด็ฌฌไบๆฌกๅธธๅกไผ่ฎฎๅๆไบค่ญฆๅคง้ๆๅบ็ๅผๅฑ้่ทฏไบค้ไบๆ
ๆๆตๆๅฉ่ดฃไปปไฟ้ฉๆนๆก\n78002 0 ไฟ่ฟ่็ป่ๅ็ๅนถไฟๆค่็ป่\n78003 0 ๅ่ฝxใ่ฝ่งฃๆ่ฅใ่ซ้ๆฏใๅฐ่ใไบญ้ฟใ้่ใ็กซ็ฃบๆฏใ่ฏธ่ๆฏ\n78004 0 2015ๅนด7ๆ10ๆฅๅ7ๆ20ๅฎๅพฝ็ๆทฎๅๅธๅคๅฐๅฟๅคๅฐ้ๆฐๆน็คพๅบ่ฟๆณๅผบๆ็พๅงๆฟๅฑไธ็ป่กฅๅฟ\n1000 Processed\n classify content\n78500 0 ๅ ๆฒนๅ ๆฒนๅ ๆฒนๅ ๆฒนๅ ๆฒน\n78501 1 ้นๆฐๆฅ๏ผๆขๅฝฉๅคดใ็ๅๆ่ฃ
้ฅฐๆฅๅญฃxxๅฅ็ฒพๅๆ ทๆฟ้ดๅพ้ไธญ...ๅฐไบxๆxๆฅไธๅx:xx-x็น๏ผ้ณ...\n78502 0 2ใ่ด่ดฃๅฎขๆท็ๅผๅใๅฎไฝๅบๅๅฎถๆๅๅๆฐๅฎขๆท่ฐๅคๅทฅไฝ\n78503 0 ๆณ็ฅ้ๅจ่ฏข่
ๅฆไฝ่ฟฝๅๆๅๅ\n78504 0 ๆ่ฏไธธ็ๅ
่ฃ
่ฎพ่ฎกๆไธๅชๅฐ็ช\n1000 Processed\n classify content\n79000 0 ไปไปฌๅ็ๅฆๅ
ใๆ็่ฟซๅป็ฎใๆฟ็ๆญชๆๅญ\n79001 1 ๅผๆญคไธๅ
ซๅฅณ็ฅ่๏ผๆฐ็็น้บฆไธญๆๆดปๅจๅคๅค๏ผๅกๅจๆฌๅบ่ดญ็ฉๅณ้็ฒพ็พ็คผๅ๏ผๆฐ้ๆ้ๅ
ๅผๆ็คผ ๅ
xxxx...\n79002 0 ๅขๅผ125ใ128ใ133่ทฏๅค็ญ่ฝฆ\n79003 1 ็ดงๆฅ้็ฅ๏ผๅผๆญคไธๅ
ซไน้
๏ผ็นๅๅบๅๆ
ๆ็คบ๏ผๆณฐๅๅฝ้
Lilyๅฅณ่ฃ
ๅ
จๅบๆปกxxxๅxx๏ผๆปๆไธๆฌพ้ๅ...\n79004 0 ๆฏๅ ไธบๆๅๆ ท้่ไบไธไธช็็ธ\n1000 Processed\n classify content\n79500 0 ๆนๅ80ๅๅฅณๅฏๅธ้ฟ็ๅฟใๅนฟไธ27ๅฒๅฏๅฟ้ฟๆฑชไธญๅๅ่ฟๅ
ฅๅ
ฌไผ่ง้\n79501 0 ไธบMacไธLinuxๅนณๅฐๆไพๅบไบ\n79502 0 ไธๅฐๅฟๆๅฟๅญไป็ธ็็ต่ๅผ็ฏไบ\n79503 0 ็จๆๆบๅฏ็ง่ฟ่ช่ก่ฝฆๅคชๆน้ฉฟ็ซ่ฅฟๅฑฑๆฎตๅฏๅจ่ฟ่ฅ\n79504 0 xxxxๅนดxๆxๆฅๅๆจx็น้ๅจ็ๅฝ็งๅญฆ้ขๅฐ้ไป้ฟ้ๅคงๅทดไธ่ฝฆๅๆๆฅไธ่พๅบ็ง่ฝฆ\n1000 Processed\n classify content\n80000 1 (x/x)ๆ่ฐข่ด็ตๅไบฌๆฐไพจ่ฏบๅฏ็น้ฅญๅบใๅญๆญค็ญไฟกๆฏๅฏ็นไปทไบซๅxxๅ
้่ชๅฐไธxxxx็ฑณ็ๅคฉ็ถๆธฉๆณ...\n80001 0 50ใ60ๅฒ็ๅฅณไบบๆฏ้ซๅฐๅคซ?\n80002 0 ๅๆๅ้ซ้ๅฐๅธธๅทไธๅฐ่็บฆไผ\n80003 0 SMไฝ ไธบไปไน่ฟ่ฝ่ฟไน็็ดๆฐๅฃฎๅฐ่ตท่ฏ\n80004 0 ไธญ็บฟ่ณๅฐ่ฆ่งฆๅ3100่ณ3072็นๆๆดไฝ2800็น\n1000 Processed\n classify content\n80500 0 ไธป่ฅๆฐดๅค็่ฎพๅคใROๅๆธ้ใๆบๆขฐ่ฟๆปคๅจ็ญ\n80501 0 ๅฝๅ็งๆ็ญนๅๆ่ตๅฝๅค้ซ็ซฏ้้ด้ๆญฃๆๆๆ\n80502 0 ๆๆใ2015้ ็ด็ๆดปๅฑใ้็ฅจ\n80503 0 ็ฌฌไธ่พTxxไธ็จๅค่ฏด่ฟๅผฏไธๅ้\n80504 0 ้คๅค่ฟๆ2GRAMๅ16GROM\n1000 Processed\n classify content\n81000 0 ๆๅก็ญ็บฟxxxxxxxxxxxๆนๅฉท\n81001 0 ๅ
ถไธญx่ตทไธฅ้็ไบค้ไบๆ
้ ๆxไบบๆญปไบก\n81002 0 ๆๆบๅ้ฉฑ้ญๅธๅจไธๅ้ๅ
ๅๆถๅ่ดงไบ\n81003 0 HoloLens่ฟๅคๅจๅผๅ้ถๆฎต\n81004 0 ๅนถไธ่กจ็คบ่ฆๅป่ฑๅ้ชจ็ต่งๅง้ฃ้ๆดไธไธ่โฆโฆโฆโฆโฆโฆโฆโฆ\n1000 Processed\n classify content\n81500 0 ่ฐทๆญ่กๆฏๅทฒๆๆ่ฟไธ็ไธไธๅฐๅฅๅผ็ๆฏ่ฒ\n81501 0 ๆถ็ฉบ็ฉฟ่ถใไฝ็ป่็ฌ็ซๅ
้ใๅ็ฉ่ดจๆญฆๅจใ็ๅคง่ก็้ฃ่กๆฑฝ่ฝฆ้ฝๆฒกๆๅบ็ฐ\n81502 0 ๆฌข่ฟๅ ๅ
ฅๅพฎ่ฝฏWindows\n81503 0 ๆฅไบ่ฟไนๅคๆฌกๅฐฑไธญๆ่ฟ็ๅน้ฃๆบๅธฆ่ๅ
ๆถๅฐๅๆดๆฐๅ
ๆณๅธฆๅๅฎถ\n81504 0 ๅนถๅฏนๅ
ถ่ถ
ๅ่ฟๆณ่กไธบๅค200ๅ
็ฝๆฌพ\n1000 Processed\n classify content\n82000 0 ่ขซๅ๏ผๆๆฒกๆ่ฝๅๆฏไปๅขๅ ่ดน\n82001 1 ๅ่ก๏ผxxxxxxxxxxxxxxxxxxxๅขๅคฉๅฟ \n82002 0 ๅฝๅนดๆฟๅบๆฏ6ๆ็ปๆๅผ็็ฆป่่ฏๆ\n82003 0 ๅไบฌๅคๅฎถๅ
ฌๅญๆฏๅบ้ฝๅๅคไบ่ฟๆฐๅนด็่ๅบๆดปๅจ\n82004 0 ่ฎค่ฏไฟกๆฏไธบโๆทฎๅฎๆฐๆฃฎ็ตๅญๅๅกๆ้ๅ
ฌๅธๆป็ป็ๅฉ็โ\n1000 Processed\n classify content\n82500 0 โโๆญๅทๆขฆๆนๅฑฑๅบๆธ
ๅนฝๅฆ็ป็ไบบ้ดไปๅข\n82501 0 ๆฒณๅๅ
จ็ๆณ้ขๅฎก็ป็ฏๅข่ตๆบ็ฑปไธๅฎกๆกไปถๅ
ฑ่ฎก720ไฝไปถ\n82502 1 xxxx xxxx xxxx xxxx xxxๅปบ่ก๏ผๆทๅ๏ผไธๅ\n82503 0 ่ฟไธชโstaycoolโ็็ๅบ่ฏฅๆพๅจๅฐ้็ซ้ไน\n82504 0 ไธๅคงๅฎถๅช่ฝๅไบซๅๅๆฎตๅๅๆฎตๅคช่ก่
ฅ่ฟๆฏไธ็็ไธบๅฅฝ\n1000 Processed\n classify content\n83000 0 ่นๆใyoutobe่ฟ็ปง็ปญๆดป็ๅฅฝๅฅฝ็\n83001 0 ๆ็ฅ้ๆ่็็็ธไธ่ฌ้ฝๆฉ่ๅจๆๆผไบฎ็่กจ้ข้\n83002 0 ๅนถไธๅฝปๅบๆๅผๆจกไปฟๆง็MacOSๅค่ง็่ฎพ่ฎก\n83003 0 ๅฎถ้็ต่ไธๅฎๅพฎๅๅฟ่ฎฐ้ๅบไบ\n83004 0 ็ตๆขฏไธๅคนๆญปๅ ไธชไบบๅฐฑๆฒกไบบๆฅไบงๅ่ดจ้้ฃๅไธไธญๆฏๅฐฑๆฒกไบบ่ฏด้ฃๅๅฎๅ
จไบบไธบไปไนๆป่ฆไปๅบไปฃไปทไนๅๅๅปๅผฅ่กฅ่ฟ...\n1000 Processed\n classify content\n83500 1 ็ฆๅปบๅคงๅ็ฉๅ
ทๅๅคง้ๆ่xx๏ฝxx๏ผxxxx.x.xๅ๏ฝxxxx.x.xๅ๏ผๅจๅฒ็ทๅฅณไฝไธๅxx...\n83501 0 ๅๆ2=1+1ไบโฆโฆ่ฏดไฟ็นไฟกๅฟๅฐฑๆฏ่ต้ฑๆๅบ\n83502 1 ใxๆxๆฅไธxๆxๆฅๅ
จๅบๅไปถๅ
ซๆ๏ผ้้ๆฅๆขๅงไบฒไปฌ๏ผไธ่พพไฝฐ่้ๅจๆญค็ญๅๆจ็ๅ
ไธดใ\n83503 1 ๅฆน๏ผๅๅฎๅฅถ็ฒxๆxxๆฐๅจ็ฑๅฉดๅฎคไธๆถ้ๅฎไบ๏ผไฟ้ๅๅบฆๅพๅคง่ดญไนฐx็ฎฑxๅฌ็ซๅxOOๅ
๏ผๅจๅ
ถไปๅฎๅฎๅบ...\n83504 1 ๅคงๅจไฝ๏ผ็ๆ ผๅ๏ผๆ ผๅ็ฉบ่ฐๅ
จๅนดๆๅบไปท๏ผๅ้ขๅญxxๅ
ๆตxxxๅ
๏ผๆดปๅจๆถ้ดxๆxไธxๆฅ๏ผๅฑ้ฆๅนฟๅบ่ฝฌ...\n1000 Processed\n classify content\n84000 0 ๆๅฐๅฆๅ
ไฝ่
ๆฏyax\n84001 1 ๆ ๅทๅ่ไฟกไปฃๆ้ๅ
ฌๅธxๆxๅทๆญฃๅผไธ็ญ๏ผๆ้่ฆ่ต้ๅจ่ฝฌ็ๆๅๆฌข่ฟๆฅ็ตxxxxxxxxxxx็็ป็ใ\n84002 0 ๅฆ้จๅทฅๅๆ
ๆธธๅญฆๆ ก14็บงๅฐๅ
ฌไธพ\n84003 0 2014ๅนดๅๅไบฌ**ๅญฆ้ขๅงๆๆ้ ็บชๅฟตไฝฉๅ\n84004 0 ๆดๆidea็ๅฐๅข้ๅฐไธปๅฏผVR\n1000 Processed\n classify content\n84500 0 ๆฑ่160ๅฎถไธๅธๅ
ฌๅธ่ๅๅฃฐๆ\n84501 0 ๅ้ ๅๅๅๆน่ฟ่ฎพ่ฎกๆฏไธไธช้่ฆ็ๅทฅๅ
ท\n84502 0 ่ฏดๅฅฝ่ฟไฝ ็้ซ้ผๆ ผๆๆบๅคง็ๆฅไบ\n84503 0 ็ฌฌไธๆกไพฟๆฏโโๆบๅจไบบไธๅพไผคๅฎณไบบ็ฑป\n84504 0 ๆไธชไบบ่ฃ
้ผ่ฏด่ชๅทฑๆฏๆ้ๅข็ปงๆฟไบบ\n1000 Processed\n classify content\n85000 0 ๆๅ
ด่ถฃ็ๅ ๆฃ1056527483\n85001 0 ๆ
ไธๅนณๅฐฑๆฏๅฒไธๆโtoughโ็ๅฎถไผ\n85002 0 ไนๅ็ ด็ ด็ๅฐ้ๅปTourMontparnasse\n85003 0 ไธญๅฝๅฅฝๅฃฐ้ณ็ฌฌๅๅญฃๆฒกๆไปไนๅฅฝๆญๆ\n85004 1 ไธ้ญ
ๅจฑไนไผๆๆฐๅนดๆๆฉๅ้ฆๅ
จๅบๅค้
ไนฐx็ฎฑ้xxๆฏ๏ผ็บข้
ไนฐxๆฏ้xๆฏ๏ผๅฆๅคๆฏ้ดๅ
ๅขไผ็ปๆจๅธฆๆฅไธไธช...\n1000 Processed\n classify content\n85500 0 xxxๅคๅๅฎถ้ฟๅๅฐๆๅๅฐๅบๅๅ \n85501 0 ๅฅฝๅAmazonไธญๅฝๅผ้ๅ
จ็่ดญไนๅ\n85502 1 ๆฒชๅคช่ทฏ่ฟ็บฌๅฐ่ทฏๆฒฟ่ก่ฝฌ่งไฝ็ฝฎ๏ผxxx.xxๅนณ็ฑณ๏ผ็ง้xxไธ๏ผๅนด๏ผ็งๅฎขไธบ:ไธญๅฝ็งปๅจ๏ผ่ฏๅ่ถ
ๅธ๏ผ้ท...\n85503 0 ๅ็ตใๆฑ็จปใไธญ่ฏ็ซไฝๅๅฑๆจกๅผ\n85504 0 ๆฉไธ7๏ผ30ๅๅฐๅธๅฐๆดฅๆฑฝ่ฝฆ็ซๅ่ฝฆๅป็ฆพๆจ\n1000 Processed\n classify content\n86000 0 ๆฌไนฆๆฏไธๆฌ่ฎฐๅฝๆฐไธญๅฝ20ๅนด่ตๆฌๅธๅบๅๅฑๅฒ็่ไฝ\n86001 0 ๅทฒไป่ถ
ๅธๅญๆ็ตๆขฏๅจๅญๆๅฐๅๆ\n86002 0 ๅพฎไฟก่ฆๅๆบ่ฝๆบๅจไบบโๅฐๅพฎโ\n86003 0 ็ฉๅ็ๆบๅจไบบๅฟซๆฅๅด่งๆ็็ฒพๅฝฉๅพฎ่ง้ข\n86004 0 ๆตๆฑ็ๅฎ็ๅญๅฅณๅฃซๅธฆ2ๅฒๅฟๅญๅฐๅ
ฌๅญ็ฉ่\n1000 Processed\n classify content\n86500 0 ่ฎพ่ฎกๅธๅฏน่ฟๅป็ๅทๅญ่ฟ่กไฝฟ็จไธใ้ ๅไธ็่็ฉถๅ่ฟ่กๆน่ฏ่ฎพ่ฎก\n86501 1 ่ฏๅ้ถ่กๅกๆๅญๆใๅฆ้้ข็บฆ่ฏท่ณ็ต่ฏ๏ผ้ข็บฆๅๆ ้กปๆ้ๅฎ่ฃ
๏ผๆฅ่ฃ
ๅฐๅ๏ผ้ถๆณๅ่ทฏๅนฟ็ตๅคงๆฅผไพง๏ผๅณๅธๆฟ...\n86502 0 ไบๅๅ็ฎกๆฎๅๆงๆณไธญ้้ไธญๅ้ๅ
จๅๅบๅจ\n86503 0 ่ฐไธๆณไธ็จๆ่ตๅฐฑๅฏไปฅ่ต้ฑ็็ๆ\n86504 0 ๆผๅฑไผๅ้็ไธญ่ฟๆๆต้ผปๆถ่ฟไธ็็ถๅ\n1000 Processed\n classify content\n87000 1 ใๅฐๅทดๆฏไบๅๅฑฑๆฌขไน่ฟๅ
ๅฎตใxๆxๆฅ-xๆฅ๏ผๅฐ็ฐๅบ็็ฏ่ฐใๅ
ๅ
ๅฎตใ่ตข็คผๅใxๅทๆฅผไธญๅบญๆฏ่งๅทจ่้ข...\n87001 0 ๅจๆบๅบ้ๅฐไบTimeZ็ปๆไธ่ฎค่ฏ\n87002 0 ๅฐผ็ๅT^TT^Tๆๆฏๅฌๅๆ้ฎ้ขไธๆฏ้ผปๅญๆ้ฎ้ขๅT^Tไธบๅฅ็ปไบๆไธไธช้ผป็็่ฏT^Tไฝ ๅจ้ๆๅ\n87003 0 ไปไนๆ่ดงๆๆFOF็ฆปๅฒธๅบ้QDII\n87004 0 ็งไธ้ๅดๆฟ่ฎค่ฟไบ่ก็ฅจๆฏๅๅพ\n1000 Processed\n classify content\n87500 0 ๅจ่ฟๆ ทไธไธชๆๅพๆ็็ธ็ไธ็ไธญ\n87501 0 ๆ้ซไบบๆฐๆณ้ขๅทฒ็ป้ข้ขๅๅบๆ็กฎ่กจๆ\n87502 0 ไป่ฝไธ่ฝๅฎ็ฐๅผ้ฃๆบ็ๆขฆๆณๅข\n87503 1 xxxๆ็ฒพ็ไธชๅไฝ๏ผไบบ่ฝๅฐ๏ผๆๅๅคง๏ผๆ็คบ๏ผๅ่ฟๅๅบ๏ผ้ไฝ ไธๅฅ่ฏ๏ผ่บซๆ็ปๆๆฌๅๆนใ[่ๅ
ฌ:xx...\n87504 0 ๆ ่็่ชๅทฑๅพฎๅ็็ฒไธ็ถๅๅ็ฐๆไธช่ดฆๅทๆพ็คบๆๆบ่็ณปไบบ๏ผXXXไป็ๅคงๅ\n1000 Processed\n classify content\n88000 0 8ๆนๆฌกไบงๅ้ๆณๆทปๅ ็ฆ็จ็ฉ่ดจใ่ฟ่งไฝฟ็จ้็จ็ฉ่ดจโๆฐฏๅไป็ดขไธ้
ธ้
ฏๅฑไบ็ณ็ฎ่ดจๆฟ็ด ็ฑป็ฉ่ดจ\n88001 1 ไธ็ๅ ๅฅณไบบ่ๅๅค็พไธฝ๏ผใๅฎน่พฐๅฅๅผใๆๅ็ฅๆจๅฅณไบบ่ๅฟซไน๏ผๆไปฌ็นไธบๆจๅๅคไบๅคง้ๆฅๆฌพๅ่ๆฅๅฅ่ฃ
๏ผๅฟซ...\n88002 0 ็ๅฎไบไธญๅฝๅฅฝๅฃฐ้ณๅๆฅ็็็พๅฝ็ๅง~็พๅฝ็ๅฏผๅธ่ฝฌ่บซๅคชๅท้ไบ\n88003 0 ไธๆพๅบ็
ๆฏๅฐๅบๆฏ่ฐไผ่ฟ่กๆฐธๆ็ๆธธๆ\n88004 0 ไธๅไธญๅนดๅๅบไฟๆดๅทฅ่ขซB2่ณB1ๅฑ็่ชๅจๆถๆขฏๅคนไฝ่
ฟ้จ\n1000 Processed\n classify content\n88500 0 ไธๅปๆ็ต่ฏๅซ็ฉไธๆ่
็ปๅฎถไบบๆฅ่งฃๅณ\n88501 0 ็ฌ่ตทๆฅๅผ็ต่ๅ
ๅผๅ
็ตไนๅโฆโฆ่ฆๆฏๆๆธธไธ็ดๆฅ่ฝๅ
ๅผๅฐฑๅฅฝไบโฆโฆ\n88502 0 ๆ้่ฆ็ๆฏๆฏไธชไบบ้ฝๅฏนๆ
ๆธธๅฎๆ้ๅธธๆปกๆ\n88503 0 ็ฑ่ฑชๅ็็ฉบๅฎขA330โ200ๅ้ฃๆบๆฏๅจไธใไธใไบๆง้ฃ\n88504 0 ๅฏไปฅๅSpringๆกๆถๆ ็ผ้ๆ\n1000 Processed\n classify content\n89000 0 ่ฟๅไธๅกๆญฃ้ๆธ่ขซ้จๅๅบ้ๅ
ฌๅธๅ้\n89001 1 ่้ๅ
ฌๅธ้ซ่ณ็บข็ฅๆจๅคฉๅคฉๆไธชๅฅฝๅฟๆ
๏ผ็ฐๆจๅบใ้ขๅญxxxๅ
ใๆไบคxxๅ
ๅฏไบซxxMๅ
็บคๅ
่ดน็จใๅซx...\n89002 0 ไฝ ๅฆ่ฑๅ้ชจ27้้่ฟๆฏไปไน้ฌผ\n89003 0 ๅไธ้ข่ขๅฅณๅผบไบบไบ่็ฝ+่ท้ๅ
ๅฟ็ๅฃฐ้ณ\n89004 0 ๅฎไน ็็ๆ่ถๆฅ่ถๆฅๆฒกๆๅฎขๆท\n1000 Processed\n classify content\n89500 0 ๆนๅ่ญฆ่ฝฆๅคฑๆงๆไผคไธคไบบๆญฃๆฅๆฏๅฆ้
้ฉพ\n89501 0 ๅๅจไธป็้ขๅฌ็BGMๆดไธชไบบ้ฝ่ขซๆฒปๆไบ\n89502 0 ๆญๅทๆงไปทๆฏๆ้ซ่ฃ
ไฟฎๆๅกๅนณๅฐ\n89503 0 ็ฐไปฃๅปบ็ญๅไธ้็ๆฌงๅผๅปบ็ญๅ ็งฏๅบๆฒกๆ็ต้ญ็ๆตฎๆฌข\n89504 0 Alice็ๆๆบ้ๅฃฐ?ๆๆญฃๅจๆถๅฌP\n1000 Processed\n classify content\n90000 1 ๏ผxxx่ไผ้ฃๆดๆฅไบ๏ผไฝ ๅๅฅฝๅๅคไบๅ๏ผ็ญๅฐๅฅๅ
จๅฑๅฎๅถๅฎถๅ
ท่ฏ้ๆจๅๆฅๅ้ดใ่ฎพ่ฎก็ฑไฝ ๏ผๅ
จๆๅไธปใ...\n90001 1 ่ฏไฟกxๆ๏ผๆๆฉไธๅ
่ดญโ้พ้จไฝณๅฑ
ไน๏ผๆฉฑๆใ่กฃๆใ้จไธใ่ฝฏๅ
๏ผ้ๆ ่๏ผๆฐๅนด็บขๅ
้ไธๅ๏ผไนฐไธ้ไธใ...\n90002 0 S20ๅคๅๅพ่นๆกฅๆบๅบๆนๅ่ฟ็ๅ่ทฏไธๅฃ่ฝฆๆต้ๅคง\n90003 0 ่ฟไธชไธคไบบ็็ชๅขไผ่ขซๆฒณไธๅ
ฌๅฎๅๅฑๆๅๆ่ท\n90004 0 xxๅฐๆถๅ
ๆๅฟ้ต้ฃๅฏ่พพx็บงไปฅไธ\n1000 Processed\n classify content\n90500 0 xx%็ๅๅฎไบบๅฎถ้็้้ฝไธๅฎๅ
จ\n90501 0 xxใๅป็ๅจๆขฐ็งๆฎๆดปๅจ่ตฐ่ฟๆฑๅคง\n90502 0 ๆจๆไธๅฎถ้่ฟๅฐๅทๆๆบ็ต่ไปไน้ฝ่ขซๅทไบ\n90503 0 ไธบไบๆต่ฏ่ฃ
ไฟฎๅ็ๆฐ่ฏๅฌๅฎคๆๆ\n90504 0 ๆๅฎถๅฐ็็็ปๅฐ็
ๆฏๆฒป็็ป้ช\n1000 Processed\n classify content\n91000 0 ๆๆ็ผด็บณๅป็ไฟ้ฉ็ๅๅทฅๅฎไบ2015ๅนด8ๆ2ๆฅไธๅๅจไธญๅฟๅป้ขไบ้จ่ฏ่ฟ่ก่ฒ้พๅฆๅฅณๆฃๆฅ\n91001 0 10ๆฏๆ่บ้็บท็บทๆฟๅบไบ่ชๅทฑ็โ็ๅฎถโ่็ฎ\n91002 0 ไธไธๅคๅๆน่ฅฟโๅๅชโๅจไบ้ถๅนดไปฃ่ตฐไธไบ่ชๆๆ่ตไน่ทฏxxxxๅนด\n91003 0 ๆ็ๅๆฅๆๆๅฎถไบบ็ๆ่ง็ฌฌไธๅคฉไธญ่ฏ่ฟๅฅฝๅชๆฏ้บป็ฆ็นๆฒกๆณ่ฑกไธญ้ฃไน้บป็ฆ\n91004 0 ไธๆ่็ฎไธ่ฎบๅฎๅ
ๅน็ฉถ็ซๅจๅช้\n1000 Processed\n classify content\n91500 0 ่ๅๅป้ขๅฐฑๅจ้่ฟไฝๆฏ120ๆฅ็ๅพๆ
ข\n91501 1 ๆ่ฐข่ด็ตๆฝๅๅฆๅฎถไธ้ฃไธ่กๅบ๏ผ้
ๅบไฝไบๅฅๆๅบไธ้ฃไธ่กxxxๅท-ไธ็บชๆณฐๅๆญฃๅฏน้ข๏ผ็ฐๆจๅบไผๆ ไฟ้ไปท...\n91502 1 ๆจๅฅฝ๏ผ้พๆนพ่ฟๅจไผๆ๏ผๅฐไบxๆxๆฅxx็น๏ผไธพๅไธๅ
ซๅฅณไบบ่ๆดปๅจ๏ผ็น่ฏทๅ่พฐไธญๅป้ขไธปไปปๅปๅธๅฐๅบไธบๅx...\n91503 0 ๆ้ซไบบๆฐๆณ้ขๅ
ณไบๅฎก็ๆไธๆง่กๅคๅณใ่ฃๅฎๅไบๆกไปถ้็จๆณๅพ่ฅๅนฒ้ฎ้ข็่งฃ้\n91504 0 ๅฑ
็ถๆไบบ่ฏดๅถๅๆฅไบๆ้ฃๆบๆ่ขญๆ้ๆๆๆ็ๅฆๅๆ้ๆๆๅฐฑๆฏไธชๆ่ขญ็่็ฎ้ฝๆฏๆ้ฉ็ปผ้ฎ้ขๆฏๅคฎ่ง่ฟไนฐ...\n1000 Processed\n classify content\n92000 0 GoogleChrome้ๅบฆๆต่ฏๆ
ข้้ๅคดๅ่ฏไฝ ๆๅผgoogle็ๆถๅๅ็ไบไปไน\n92001 0 ่ฟ็งๅๆณๆฏๅฆ็ฌฆๅๅ
ฌๅฎๅ
ฅๆทไธ่ฝๅ่ฎกๅ็่ๆ้ฉ\n92002 0 ็ฝๅญ็ป่ฑๅ้ชจๅๆฌก่ฟๅๅก้ดๆจๅฑ\n92003 0 ๅ
ถไธญๆตๆฑๅคงๅญฆๅธธๅกๅฏๆ ก้ฟๅฎๆฐธๅๅๅๅฐๆปจๅทฅไธๅคงๅญฆๅฏๆ ก้ฟ้ฉๆฐๆๅๆฅ่ชๆฉ้ณ\n92004 0 ๆตๆฑๅซ่ง็่็ฎๅช่พ่ถๆฅ่ถ็ไธๆ\n1000 Processed\n classify content\n92500 0 8ๆ12ๆฅๅจไธไธๅๆฑๅๅฐไธๅๅบ้ข่ฏ\n92501 0 xxxx้พๅฉๅคไธๆไฝ็ๆๅไฝ็งฆๅจๆไฝ็ฅไบไธ่ฟๅฟๆๅจ่ดขๅฏๅผๆ่ฟๅไน็็บข้็ๅธฆddpraๅกๅฝๅ
็ฐ...\n92502 0 ๅไฟ้ฉๆ ไน่ฃ
ๅคไบไธ็ป้ๅ่ฝๆง็ๆฐดๅนณๆ ผๆ
\n92503 0 xxxxๅนดxๆxxๆฅๅธๆฐๅ้่ฎฟ่ฑ้ธๅญไธๅฎถๆธ ๆ็ๅพๅท็ต่งๅฐๅพๅทๆฐ้ป็ไฝณไฝณใ่ๅ็่ตๆธ ๆๅคงๅธ๏ผ่ฑ...\n92504 0 ๆๅฆๅฎถๆง็ต่ๅฏ็ ๅฟไบๆไน้ฝๆไธๅผๆไนๅ\n1000 Processed\n classify content\n93000 0 ๅปๅนด็้ๆฑๅไปๅนด็ๅฅฅไฝๅฎๅ
จๅไธคไธช็้\n93001 0 ๆถๅง่ดชๆฑกๆฌพ้กนไบคๅธๆณๆบๅ
ณๅไบ็ซๆก\n93002 0 ๆๆบ้ฉฌไธๆฒก็ตไนไธ่ฝ่ฐทๆญ่ตฐๅๅป่ฟๅฅฝๆๅธฆไบ500ไธ็ถๆๅฐฑๅฎไบ\n93003 0 ๅฉ็จๆ้ดๅๅๆ้ป็ ่ก็ฅจไธๅกๅๅธๅบ่กๆ
\n93004 0 ๆฌๅทๆฉ้ค่กไธ็็น็น้พ้ๅฐฑๆฏๆๅกๆๅบฆๆๅทฎ\n1000 Processed\n classify content\n93500 0 ๆตๆฑๆ
ๆบชไธไบค่ญฆๅจๆงๅคไธญ่ขซๅทฅ็จ่ฝฆ็ขพๅ่ดๆญป\n93501 0 ไธช่กๅฉๅฅฝ๏ผๅฅ็้ๅขๅๅนดๆฅไธ็ปฉๅขไบๆๆxx่ฝฌxx\n93502 0 ๅๅคไบ็ไธไธชๅ้็ๅฐฑๆฏ็ผ็ๆฒกๅๆณๅฏน็ฆ\n93503 0 ๅผๅญฆ้ซไธๅ ๆฒนlightupthedark??\n93504 0 ๆณฐ็ฎไธๅบไฟฎๅจ้ฃๆบไธๅถ้โฆ็ท็ทๅช่พ็่ง้ข๏ผ\n1000 Processed\n classify content\n94000 1 ๆ็ฟๆ่ฒๆฐๅญฆๆๆฅๅๅผๅงไบ๏ผๅ
ๅฎต่ๅๆฅๅไปทๆ ผไผๆ ๏ผๅนถๆ็ฒพ็พ็คผ็ฉ็ธ้ใ\n94001 0 ๅ็้ฝๅๅๆถๅไธญ่ฏไธๆ ทๆถๅฟไบ\n94002 0 ็จ่ดฟ่ตใๅๅ็ๆนๅผๆฏไบๅฎไบบ\n94003 0 ๆฅๆฌkissme็ซๆฏ่๏ฝ??ๅคงๅ้ผ้ผ\n94004 1 ๅฅๅxxๅนด่ๅบ(่บ่คๅฑ
๏ผ๏ผ็ฟก็ฟ ่คๅจ๏ผ ็พๅนดๅๆดๆด.ๅ ๅบ้ข่ฃ
ไฟฎ.ๅ
จๅบ็นไปทๅค็ใ ๆปกxxxx้x...\n1000 Processed\n classify content\n94500 0 ็ปๆ่ฐไบๅฟซ2g็psdไนไธๆขๅ
ณ\n94501 0 VillasofPinecrestๅฐๅบๆฌกๅงๆ็ง\n94502 0 ็ถๅ่
พ่ฎฏๅฐฑ่ซๅๅ
ถๅฆๅฐ่ฏดๆๅๅธ่ฏ้ชไฟกๆฏ\n94503 0 ??????????????????\n94504 1 ๅไบฌๅฎ็พๅฎถๅญ่ฃ
้ฅฐๆฐๆฅ้ๆ ๏ผๅ
จๅ
ไฝ่ณxxx/ๅนณ็ฑณ๏ผๆดๆๅค้กนๅคง็คผ็ธ้ใ่ฏฆๆ
่ฏทๅจ่ฏขxxxxxxxx...\n1000 Processed\n classify content\n95000 0 ๅ
ถ่ๅ็่ฝปๅทฅไธใๆ
ๆธธไธใ้
ๅบไธๅๅจฑไนๅบไฝฟๆพณ้จ้ฟ็ไธ่กฐ\n95001 0 comๆฐ้็ถๆข
้พ็ฅ่ฒ็้ซๆธ
็พๅบฆไบ็ฝ็ไธ่ฝฝ\n95002 0 ไธญๅฝ18ๅฒไปฅไธๆไบบ้ซ่กๅๆฃ็
็ไธบ18\n95003 0 ๆๆๆบๆถไธๆถๅฐฑๅกไธๅๆไนๅฐฑไฟๅญไธไธๅขๅช่ฝๅตๅตไบ๏ฝgood\n95004 0 ๆตฆๅฃๆฃๅฏ้ขๆฏ7ๆ20ๆฅๅๆตฆๅฃๆณ้ขๆ่ตทๅ
ฌ่ฏ็\n1000 Processed\n classify content\n95500 0 2015ๅนด7ๆ25ๆฅ่ณ8ๆ1ๆฅ้บๆธธๅฟๅจๆฒณๆปจๅ
ฌๅญไธพ่ก็ฌฌไบๅฑๅ็นไบงๅๅฑ้ไผ\n95501 0 ๅจๅซ็ๅฐ็ๅบ็ฐ็ๅ็ๆไธ่ฏดโๆฒกๆจๆด่ฟ่ฝ็ๅโ4\n95502 0 ๆตๆฑไธญๅ้จๅฐๅบๅทฒ็ปๅบ็ฐ้ทๆดๅคฉๆฐ\n95503 0 ็ฝๆๆฑ่ๅฆ็ๅธๅฅณๆฐ่ญฆๅฎถๅฑๆณ้ฒๆๆๅณฐใๆจๆดไธคๅ่บไบบ็่บซไปฝ่ฏๅทใๆพ็จๅใๆท็ฑๅฐ็ญไฟกๆฏ\n95504 0 ๆคๆฑไฝ่ฒ้ฆๅฐไธพ่ก2015้บฆ่ฟชไธญๅฝ่กโ็ปๆไธๆโๅฐๅท็ซ็ฏฎ็่ต\n1000 Processed\n classify content\n96000 0 ็ฎๅญๅบง็่ๅฆๅคช้ธ้่ฎฉๆๅจๅฎถๆๆซๅซ็ๅฐฑ็ฎไบ็ฐๅจ่ฟ่ฆๆๅ้ฅญ็ญๅฅนๅๆฅไนไธ็็ๆๆฏ้ฃ็ง่ขซๅผบๆๅๅ็ไบบๅ\n96001 0 ๅนณๆถ็่ฟ็ๅฎถๅฑ
่ฃ
ไฟฎๅฏนไบ็ฏๅขๅ่ฎพไผๅธฆๆฅๅพๅค็Idea\n96002 0 ๅๆฌพๅฐๆๅ่ตตๆไป
ๅฝ่ฟ3ไธๅ
ไฝไธ็ๆช่ฟ\n96003 0 PPLๅคงๆ็ฌๆ่ตๅฅไฝ็ผ่พพ็ฝ่ฟไธชๅๅญ่ตทๅพๅคชๅไบ\n96004 0 ๅๅฐๅฃ่
ๅป้ขๅฐฑ็ขฐๅฐๅฐๅงๅจๅ ไธบๅฎณๆๆ็ๆๅไบ\n1000 Processed\n classify content\n96500 0 ๆตๆฑไนไนๆไธชๅฆๅฆไธๅฐๅฟๆ่ชๅทฑ็ๅฟๅญ้ๅจๅฎ้ฉฌ่ฝฆ้ไบ\n96501 0 ๅไบซnmgtxl็ๅๆๅพ็๏ผ็บข็ณๅดๆ
ๆธธ้ฃๆฏๅบ\n96502 0 ็พ่ๆฑ็็ฉไธๅ
จๆฏ็ๅจๅ
ป็ๅฉๅญ็็\n96503 0 ๅไบไบๅคฉJRๆๆฏๅไนๆฒกๆณๅจ้ญ้ฝๅๅฐ้โฆโฆๅฆๅคๅจไธไบฌ้
ๅบไฝๅพๅคช่ๆ\n96504 0 ่ฏดๅจๅฎถๅซๅฐ็ๅฎไธบ๏ผ่niangmen\n1000 Processed\n classify content\n97000 0 ๅฅฝๅฃฐ้ณไปๅคฉๆไธๅ ็น้ๅผๅง็ดๆญ\n97001 0 ๆๅฅฝ็crystalinjectorๆฐดๆถๆฐดๅ
ไปช็ๅฃซๅ่ฃ
้ฉฌ่พพๆบ่ฏ\n97002 0 ่ๆฏๆๆๆ้ฝ็น็นๆปดๆปด่ฎฐๅจๅฟ้\n97003 0 ๅธๆ้่ฟ่ฟ็งๅฝขๅผๆถ่ตทๅ็ฎกไธๅธๆฐๆฒ้็ๆกฅๆข\n97004 0 MFAไธไธๅญฆไฝๅ้ๅธธๆ่ฏด็โ็กๅฃซใๅๅฃซๅญฆไฝโๆไปไนไธๅ\n1000 Processed\n classify content\n97500 1 ใไฟๅชณๅฆ็ซ้
ใ้
ฌๅฎพๆดปๅจๅผๅง๏ผๅ
จๅบ่ๅx.xๆ๏ผๅฑฑๅๅฝๅฎดใ้ช่ฑๅค้
ๅx้x๏ผ็ต่ฏ๏ผxxxxxx...\n97501 0 ๅฎ้ฉฌ็ญ4Sๅบ็ซๅทฅไฝ็ไธไธๆๆฏ้ชจๅนฒๅ่ตๆทฑ็ฎก็ไบบๅ\n97502 0 2015ๅนดๆๆ กๆฑ่ๆฌไธ็็งๆๆกฃ็บฟ345\n97503 0 ๅฅฝๅฃฐ้ณๆ็พๅๅฃฐๅๅฑๅผนๅฅ่นๆๅญ็ปๅ้ฃ้ก็นๆ็ฝ\n97504 0 S38ๅธธๅ้ซ้็ฑๅ่ฅๅพๅธธ็ๆนๅๅฎๅธธๆฎตๅจ137Kๅคๆฝๅทฅ็ปๆ\n1000 Processed\n classify content\n98000 1 โๅฐๅๅๅฎๅๅนๅ
ป่ฏพ็จโๆขๆฅไธญ๏ผๅๅธๅฎๅถ้็นไธญๅญฆๅๆ กๅค่่ฎกๅ๏ผ็ฒพ่ฎฒๆฐๅญฆใ่ฑ่ฏญใไฝๆ็ฅ่ฏๆจกๅ๏ผๅฟซ...\n98001 0 ๆท
ๅทๅฟๆณ้ข็ป็ป็ไธๅบๅไบๅฎกๅคๆญฃๅจ็ฐๅบๅผๅบญ\n98002 0 ไธๆญฃ็คพๅบๅผๅฑไบๆณๅพ็ฅ่ฏ่ฎฒๅบง\n98003 0 ๆฑ่ๆๅๆณจๅๅฐ็ๆ ๅฟ188ไปถ\n98004 0 ๆธธไพ ๆฑฝ่ฝฆๅจๅไบฌๅๅธไบๆธธไพ X\n1000 Processed\n classify content\n98500 0 ๅ ไธบ่ฟๆ ท็ไบบไผ่
่ดฅไฝ ็็ๆณไบบ็\n98501 0 ไนๅฏๅจ่ฏขๆบๅบ้ฎ่ฎฏ็ต่ฏxxxxxxxx\n98502 0 ๆ่ตๆบๆPiperJaffrayๆฅๅๅฏน่ถ
่ฟ800ๅ็พๅฝๆถ่ดน่
่ฟ่กไบ่ฐๆฅ\n98503 0 ๆๅคๅฐไบบๆฏๅฒ็็ฝๅญ็ปๅป็็โ่ฑๅ้ชจโ\n98504 0 ๅคงๅคๆฐไบบไผๅๅฏนๆฟๅฐไบง็จๆน้ฉ\n1000 Processed\n classify content\n99000 0 HealthyCare่่ถ็่120g\n99001 0 ๆๅฆๆ็็ๅฏไปฅๆขๆๆบ็่ฏ\n99002 0 ๆจๅถไน้ธฆcosไฝๅSDๅจๅจ่็ฟปไบ\n99003 0 ๆ้ขๆต็ซ็ฎญๅพๅฟซ็ญพ็บฆ่ฝ้ๆฐ็งๅจๅปๅงๆฏ\n99004 0 ้ฅญๆญๅญAไธบไบๅฎๆ
ฐ้ฅญๆญๅญBๆ็ๆ่ฏด๏ผๆฏไธๅญฆๆ กๅทฎๆฒกไปไนไธๅฅฝๅ\n1000 Processed\n classify content\n99500 0 ็ฐๅจ่ฟๆฌพ้ฎ็ๅฏไปฅๅจAmazonๅๅพฎ่ฝฏ็ฝไธๅๅบไธญ่ดญไนฐ\n99501 0 ๅฆ่ชMFใๅฑฑ่ชSCใๆทฑ่ชZHใๆตท่ชHUใ้ฆ่ชJDใๅท่ช3Uใไธ่ชFMใๆ้ฝ่ช็ฉบEUใๆฒณๅ่ช็ฉบ...\n99502 0 ๅญฆ่ฝฆๆๅ ๅคฉๅฟ่ฎฐๆถ้ฒๆ้ๅ
จ้ปๅฎ\n99503 0 ๆๆฐไธช่ช่ก่ฝฆๆๆบ็ญ็ญ็ญ็ญ็ญ็ญ็ญ็ญ็ญ็ญโฆ่พ้ฝไธขๅฎไบ\n99504 0 ๆไปฌๅคงๆณ้ณ็้ณไน่ไนๅพ้ซๆ ผ้ผ็\n1000 Processed\n classify content\n100000 0 ๆจๅคฉๆไธๆไปๆฌๅทๅ็ซ่ฝฆ็ป่ฟxxไธชๅฐๆถ็้ข ็ฐธๅฐ่พพไบๆฝขๅท\n100001 0 ่ไธๆฏๅฝปๅบ็ๅroot่ฎพ่ฎก\n100002 0 8ๆ8ๅทๆฑไธชๅฆๅจๆ่
ๆๅฝฑๆ่
ๅๅค\n100003 0 ๅไธบๅทฒๆไธบไธ็็ฌฌไธๅคงๆๆบๅถ้ ๅ\n100004 0 ๆตๆฑ็ๅฐๅทๅธๆคๆฑๅบๆฐไธ็บชๅๅๅๆฎฟ้ถๆๅนฒ้จๆไบบ๏ผ่ฏดไปๆฏๆๆฏไนฆๆณๆไฝ ๅฐฑๆไฝ \n1000 Processed\n classify content\n100500 0 ไธญๅฝ่ฏๅธ็ฝ๏ผไธญ้ๅปบๆๆ ๆฐไธๆน้่ทฏๅฎข่ฝฆ\n100501 0 ่พๆฃฎ่ฑชๅจๅฐ1944ๅนด็งๅคฉ้่ฏฏๅฐ้ปๆญขไปๅ
ณ้ญโๆณ่ฑๆฏ็ผบๅฃโ\n100502 0 ๆบไผๅทฒ็ป่ฟ็ฆปไฝ ๆๅ็็ง่ฏๅฐฑๆฏๅคไปๅบ\n100503 0 ๅ
่ดนๅไบซๆตๆฑ50ไปฝๅไบซๅฐ่ฑ็งๅญ\n100504 0 ๅณไพฟๆฏๆBUGไน่ฝ่ฎฉ็ฉๅฎถไฝ้ช5D็ๅฟซๆ\n1000 Processed\n classify content\n101000 0 ๆฟๅบๆๅธๆถ่ดญๅฏน็จป็ฑณไปทๆ ผๅฝฑๅๆๅคๅคง\n101001 0 ็ปๆ่ฏ็ไผ็ๅๆณๅดๆฏๆปฅ็จ่กฅ่ฏ\n101002 0 43ๅฒ็้น่ๅจๆฑๅฃไธ็พๅฎนไผๆๅ็ๅ่ฅๆๅก\n101003 0 โGartnerๅฏๆป่ฃJohnMorency่ฏด้\n101004 0 ้ฒๆ่กฃใ้ฒๆ้ใ้ฎ้ณไผ็ญ้ฒๆ็จๅ
ทๅ่ฏฅๅคงๆพ่บซๆไบ\n1000 Processed\n classify content\n101500 0 ไธๆถๅณ็ฝๆฏๅ ไธบๆ้ฒๆ้ฎ็็ไฝ็จ\n101501 0 ๆ็ๆญฃ็็ทไบบ็ๅฎไบๅฅฝๅๆฌขๆฐๅ
ต่ฟ็็ญ้ฟๅ\n101502 1 ็นๅคงๅ่ฎฏ๏ผ ้ๅฃ่พ็
็ฐๅจๆจๅฒ๏ผไธ่ฌ็
้ฅฎๅ่ฑชๅ็
้ฅฎ๏ผ ไธ่ฌ็
้ฅฎ๏ผๅฐๅ
xxx๏ผไธญๅ
xxx๏ผๅคงๅ
x...\n101503 0 3ไบฟๅ
ๅฎๆฝ6ๆก15ๅ
ฌ้ไบ็บงๅ
ฌ่ทฏๆฅ็บฟ\n101504 0 A่ก็่กๆๆ่ดงๅญๅจๅทจๅคง็ๅ็ฉบๆผๆด\n1000 Processed\n classify content\n102000 0 ็ๆฅ้ฃไธชๆณ้ขๅฏ้ข้ฟไนๅ็กฎๅฎ้ฎๅบไบไปไปฌ่ฆๅผๆญปๅ ไธชๅฐๅ็่ฏๆฎ\n102001 0 ๅจๅ
ถๆฌ็ใ็งๅนปใๆบๅจไบบไผฆ็่กจ้ขไธ\n102002 0 xใๆง่กๅๆ๏ผ้ขไบบ่ฐๆง่กๅไธบๅๅๆ\n102003 0 ไธบๆฏ้ช่ฏ็ ๅพๆไธไธไธชๆๆบๅทๅ\n102004 1 ไธๅ
ซๅฆๅฅณ่ๅฟซๅฐไบ๏ผ่ฟๅฏๆฏๆไปฌๅฅณไบบ็่ๆฅๅฆ๏ผๅจ่ฟ้ๆๅ็ฅๅคงๅฎถ่ๆฅๅฟซไน๏ผไธบไบๆ่ฐขๆฐ่้กพๅฎขๅฏนๆฌ็ๆฏ...\n1000 Processed\n classify content\n102500 0 ็ต่้่ฆ่ดด่??ๆๆบ้่ฆ่ดด่??่ฝฆๆด้่ฆ่ดด่??้ฃไฝ ็่ธๅข??ไธบไปไนไธ็ป่ธ่ดด่??ไธๅคฉไธ่ดด่\n102501 0 ๅจ็ป่ๅ็ป็ปๆฐดๅนณไธๅฏๅจ็ฎ่คไธปๅจไฟฎๆค็จๅบ\n102502 0 ๅชๅฎ็ไธๅฃฐ่ณๆบๅพๆฏ่ณๆตdaung~ๆไบ่ๆณฅ้ฉฌ\n102503 0 ๅๅไบฌ้ๆตทๆธฏ่น่ถ็ฎก็ๆ้ๅ
ฌๅธๆตๅๅๅ
ฌๅธ้ชๅไธญไป่ดนๅๅ
ถไป่ดน็จ็ญๆฐไธๅ
\n102504 0 โ่ฐข่ฐขโ็ฌฌไบๆ็ฒ้ฎ๏ผไฝ ็่ฝๅคฉๅคฉ้
ท่ทๅ\n1000 Processed\n classify content\n103000 1 ่ถ
ๅผ็ฆๅฉ๏ผๅก้ขไบคxxๅ
่ฎค็ญน๏ผๅฏๆต่ดญๆบๆฌพ๏ผๅฟ
้กปๆพๆๆฅๅๅฆ๏ผไธ็ถ่ฎค็ญนไธๅฐๅ๏ผ๏ผ๏ผ็ซ้xxxๅ
ๅๅบ...\n103001 0 ไธๅพๆฏๅไบฌๅธไธญ่ฅฟๅป็ปๅๅป้ข็ฎ่ค็งๅฏไธปไปปไธญๅปๅธๆ้ฟๆ\n103002 0 ๆ่ฟไธ็ดๅบ็ฐๆๆบ้ๅจๅกๅก็ๅนปๅฌ\n103003 0 ๅฏนไธชๆง่ฟ่ก้็ปๅๅฏไปฅ็ๅฐๅฆไธ็ปๅ\n103004 0 6ใ็ฎ่คๅฏน็ๆฅๅ็ต่่พๅฐๅๆ ๅผบ็\n1000 Processed\n classify content\n103500 0 ๅนซๆๅๅ็ป้ขMScMoneyBanking&\n103501 0 ๆฑ่ฅฟ็ๅๆๅฟไบบๆฐๆฃๅฏ้ขไปฅๆฒ่ฏๅ็ดข็ฝชๅฏน็ฏ็ฝชๅซ็ไบบๅผ ๆๆนๅ้ฎๆ\n103502 1 ไบฒ็ฑ็ไผๅๆๅ๏ผๅ
ๅฎตๅฟซไนๅข๏ผๅ ๆฅ่ๆ้ดๆฌๅบไบบๆ็ดง็ผบ๏ผๅฏนๆจๆๆๆ ๆ
ข๏ผ่ฟ่ฏท่ฐ
่งฃ๏ผไธบ่กจๆญๆ๏ผๆจๅฐๆ...\n103503 0 ๆผๆ็ๅๆถไนๅจๆฏๆ กๅธฆ็ ็ฉถ็\n103504 0 ่ขซๆๅ
็่ฟๆณๅนฟๅๆถๅไฟๅฅ้ฃๅใๅคๆน่ฏใ้ๅคๆน่ฏใๅป็ๅจๆขฐๅๅคง็ฑปๅซ\n1000 Processed\n classify content\n104000 0 ็็ไฝฉๆๅชไฝ็bb็ๆฐดๅนณไปฅๅๅผๅฏผไธๆ็พคไผ็่่ฎบ่ฝๅ\n104001 0 ๆไปฅ่ฏดๅซ่ฎฉๅซไบบ็ขฐไฝ ็ๆๆบ\n104002 0 ไธ็ถไฝ ๆฃๅฏ้ขๆณ้ขๆไธชๆฏๅๆณ\n104003 0 Upไธป๏ผ็ด ๅนด้ฆๆถไธถๅ่งๆฅ่ชAcFunๆ็ซ ้ข้\n104004 0 ไธญๅๅๅฎถๅๅฎ้ฅญ13็นๆๅผ็ต่\n1000 Processed\n classify content\n104500 0 ไธๅคงๆฉ่ทๅฐๆทฎ้ดๅจ่ฝฆ้็ก่งๆไนๆฏ้ไบ\n104501 0 ่ฎฉๅไธๆฏ้ฃๆบ้ค็ๆ
ๅฎขไปฌๆ
ไฝไปฅๅ ช\n104502 0 ๅฏนๅไบฌ็็ฌฌไธๅฐ่ฑกไธๅฎๆฏๅไบฌ็ๆฐด้ธญใ้้ต้ธญ่ก็ฒไธ็ญ็ญ\n104503 0 ๆ้ฃๅฐๅฅณๆตๆฑๅซ่งๅ
ๆๆไบบ\n104504 0 ๆฑ่้ๅฎๆจ15้กนๅฅฝไบบๅ
่ดนๆฟ็ญๅๆฟๅฏ่ทๆฟๆฒปไผๅพ
\n1000 Processed\n classify content\n105000 0 ่ฟๆฏไปฅๅฝๅ
ๆบๆไธบไธปๅพ็ปๅค่ตๅฑ ๆ่กไธบ\n105001 0 ็ป่ฟๆถ้ด็ๆด็คผ็ป็ฉถ็็ธไผ่ตค่ฃธ่ฃธ็็คบไบไบบๅ\n105002 0 ไธๅธๅ
ฌๅธไธๆน่ดขๅฏ็ฝๆ่UIๆฑๆฉๆฃ\n105003 0 ๆ่
ๆฏๅไบฌ่ฅฟๅฎๆ้ฝๅฏไปฅๅๅฅฝๅคๅฅฝๅคๅฐๅ็ๅฐๆน\n105004 1 ไบฒ็ฑ็ๅฎขๆท๏ผๅ ้ฒ่ฅฟ่ฅ็ๆดๆฒณๅบ็ง็บฆๅฐๆๅทฒๆญไธ้ญๅบ๏ผ่ฏๆ้่ฏทๆจๆฅ้ปๆณฅ็ฃ
ๅบ(็ดซ็ฆ่ทฏ้็ๆปกๅ ๅฏน้ข)ๅฐฑ...\n1000 Processed\n classify content\n105500 0 GDPใๅ
ฌๅ
ฑ่ดขๆฟ้ข็ฎๆถๅ
ฅใ่งๆจกๅทฅไธๅขๅ ๅผใๅบๅฎ่ตไบงๆ่ตใ็คพไผๆถ่ดนๅ้ถๅฎๆป้ขๅขๅน
ๅ้ซไบๅ
จๅฝๅ
จ็...\n105501 0 ่ฐทๆญๅญฆๆฏๆฐๆฎๅบๆถๅ
ฅ่ๅด้ๅธธๅนฟๆณ\n105502 0 ๅ็้ฝไธๆฏ้ฃๆบๆๆไน่งๅพๅๆฏๅ็ซ่ฝฆๅขๅข้ฃไน็ดฏ้ฃไน็ดฏ้ฃไน็ดฏ\n105503 0 ๆญฃๅ่็น่ฏ้ฒๆ้
ถๆฐดๆ้
ต็ด ็ฆ่บซไธฐ่ธ็พ็ฝๆฐดๆ้
ต็ด ๆ ๅฏไฝ็จๆ ๆ้ๆฌพ\n105504 0 ่ฟๆ ทๅฐฑไธไผไธ่ตทๅบๅฐฑๆๆๆบ็ ธไบ\n1000 Processed\n classify content\n106000 0 ้้ไธพ่กxxxxๅฑๅญฆ็ๆฏไธๅ
ธ็คผ\n106001 1 ๆจๅฅฝๅๅชๅฐ่ฑกไธ็บช่ๅๅบ๏ผๅจx.x่ๆฅไธดไน้
ๆ่ธx.xๆ่ตท๏ผๅฎถๅฑ
ๆxๆ๏ผๆปกxxxๅ
้ๅ
่ฃคไธๆก๏ผ...\n106002 0 ๅจๅฐ้ไธ็ๅฐไธไธช่ๅคชๅคชๅ ็
็็ผๅพไธ็ดๆต็ผๆณช\n106003 0 ไธ็ไธๆๅคง็ๆฒๅงๆ็็ต่ๅกไฝไบ\n106004 1 ๅฅณๅฃซๅณๅฏ้ขๅxxxxๅ
้กน็ฎๅกไธๅผ ๅ ็ฒพ็พ็คผๅไธไปฝ๏ผๅญ็ญไฟก้ขๅ๏ผๅๆถๅบๅ
่ฟๆๆดๅคๅญ้ไผๆ [็ธๅผน]...\n1000 Processed\n classify content\n106500 0 ๆๅๅผ็้้ฃ็ฎญๅโๆฌๅท้ฆๅฎถไธไธๅฐ็ฎญ้ฆ\n106501 0 ๆฑฝ่ฝฆๅธธ่ฏโโ่ฝฆๅฑ่กไธ็้ฃไบไบๅฟซ้ไบ่งฃ่ฝฆๅฐพๆ ็ๅซไน\n106502 0 ไธป้กตๅฆๅฐไธๆญpoไธ่ฝฆ้่ฟๆๆฅๅธธ\n106503 0 /ๅฏ็ฑ/ๅฏ็ฑ/ๅฏ็ฑ/ๅฏ็ฑ\n106504 0 ็ไนณ่พฃๅฆนๆๅๆขๅซๆฒก็ป้ชๅคๅค็็่ฟนๆ่ธๆ็ถๆฏๆ ่็\n1000 Processed\n classify content\n107000 1 ๅญฆๆฟๆ่ฒ็่ๅฒๅบๆผ้ข็ญๅฐไบxๆxxๅทๅผ่ฏพ๏ผ็่ไนๅ็ๆๅไธไธชๅ่ฝดๆงๅฐ้ญ้่ฎญ่ฅไบ๏ผๆๅๆๅฟซ๏ผๅฐ...\n107001 0 ๆๅชไฝ7ๆฅๅ็ป็ๆ ้ก่ญฆๆนๆ่ท16ๅ่ๅ็กๆพๆบๅบ่ดง่ฟ็ซๅ
้ฌผไธๆๆฐ้ปไฟกๆฏๅคฑๅฎ\n107002 0 ่ๅๅๅฐๆฟๅบๅ
จๅๆจ่ฟ1786ไธชๆถ็ซๆ ๅปบ่ฎพๅทฅไฝ\n107003 0 ๆ้ฝๅๅคๆ110็ๆฏไธๆฏๆไบบ็ปๆถไบ\n107004 0 ็ฎ็บฆ็ไธไปถๆก็บนTๆคไธ็ฝ่ฒ็ญ่ฃค็ธๆญๅบๅคๆฅๆธ
็ฝๆฐๆฏ\n1000 Processed\n classify content\n107500 0 2ใๅไธบโ็งฐ5Gๅๅฑๅทฒๅฐๅ
ณ้ฎ่็น\n107501 0 ๅๅคฉไธๅฐๅฅๅธฎๆไฟฎๅๅ
ฌๅฎค็ต่\n107502 1 ๅนณๅฎๅ
ป่ๅๆ่ฒไฟ้ฉ๏ผๅชๅญไธๅนด๏ผๅญๆปกๅฐฑ้ข๏ผๅนดๅนด้ข๏ผ้ข่ณ็ป่บซ๏ผๆฌ้ๅจ๏ผๆฌ้ๆถจ๏ผ่พ่ฆไธๅนด๏ผๅนธ็ฆไธ่บซ...\n107503 0 3้ๆฑAnimeCityๅคๆฅ็ฅญ\n107504 0 ๅ
ถไธญWinxxไธXbox็ๅๅ่ๅๆดๆฏ่ฎฉๅนฟๅคง็ฉๅฎถๅ
ดๅฅไธๅทฒ\n1000 Processed\n classify content\n108000 0 ๆไปคไป่ๅฅฎๆฏ่NBA็ๆๆๆธ่ฑชๅจ็ๅ ดไธ่ผ้\n108001 1 ่ฏ็ๅ ่ท่ฑๆฑ ๅบไธบๅบ็ฅๅฅณๆงๅ่่ๆฅ๏ผ๏ผไธๅ
ซๅฆๅฅณ่๏ผxๆxๆฅ-xๆxๆฅ๏ผๅ
จๅบๆปกxx้xx๏ผไนฐๆปกx...\n108002 0 ็ญพๅไนOKไบโๆๅคฉๅฏไปฅๅ่ดงๅ
ๅไธ้จๅไบ\n108003 0 ็งๆฏๆฅๅๅฆๅ
ใ่ฃ
็ฒ่ฝฆ12่พโฆโฆไปๅฐฑๆฏๆๆตทๅฑฑ\n108004 0 ๅฐฑๆฏ็ฑณ่ฒ่กฃๆ็้ข่ฒๅฆ~ๅ็ฐไบไน\n1000 Processed\n classify content\n108500 0 24ๅฐๆถๆๅก็็ฉบ่ฐๆฏๅฐๅๅค10็นไปฅๅๅฐฑๅ
ณ้ญ\n108501 0 FX็ผ่ฏๆฐดๆฏ้ๅฏนๆถ้ค็ผ็ๅ
่กไปฅๅๅฏน้
ๆด้ๅฝข็ผ้ๆ็จ็ผ่ฟๅบฆไบง็็็ฒๅณ้ๅธธๆๆๆ็ไธๆฌพ็ผ่ฏๆฐด\n108502 0 ไธๆๅฟไบบๆฐๆฃๅฏ้ขๆๆดพไปฃๆฃๅฏๅ่ฆๅ
้กๅบๅบญๆฏๆๅ
ฌ่ฏ\n108503 0 win10็PINไนๆฏไธชbugๆดๆฐๅฎ็จไบไธๅฐไธๅคฉๅฐฑ้ๅฏไบNๆฌก่ไธ่ฟ้ๅฏ้ฝๅบ้ฎ้ข่ฆๆถ้ไฟกๆฏๅๅ\n108504 0 ๆญคๆฌกๅคฑไบ็้ฃๆบ่ช็ญๅทไธบ4u9525\n1000 Processed\n classify content\n109000 0 ็ปไบไธ่ถ
่ฟ60ไธๅ
็ไธๆฌกๆง่ตๅฉ็ญโฆ\n109001 0 ๅพๅทๅทฅ็จๅญฆ้ขๅๅทฅ้ข้่็ๆไผๆฑๆท่ๅธ่ฑๅฃ็ง็ๅฟๅ
จ็จๆ ๅฐฟ็นๅผๅพไธ็\n109002 0 ไธๆฌกๅฐ้ไธไธ็ท็่กๆ็ฎฑๆกไฝไบ่ทฏ\n109003 0 ไปไปฌๅฏ่ฝ็ๅฐcoconutmilkๅฐฑ่ฎคไธบๆฏ่ฝๅ็ๆคฐๅฅถ\n109004 0 ๆๆถๅ่งๅพ่ก็ฅจๅฐฑๆฏ่ชๅทฑ็ๅฟๅญ\n1000 Processed\n classify content\n109500 0 ไธไธชๅฅณๅญฉ้ฟๆ็จxๅxไธ็็้ข่\n109501 0 ่ฎฉๆๅฟซ็นไผ ๆปก100ไธชiphone6ๆๆบๅฃณๅง\n109502 0 xxๆฅๆฌๆ็ปๅฎถ็ฉบๅฑฑๅบ่ช็ไปฃ่กจไฝๅๆงๆๆบๅจไบบ\n109503 0 ๅๅฟๆฏไบบ็ไธญๅฎ่ดต็่ดขๅฏไฝ ่งๅพ็่ฆ้ฃๅฐฑ้ๆฉ้บปๆจไฝ ่งๅพๅผๅฟๅฐฑ่ๅจๅฟ้ๆถ้ดไผๆนๅๅพๅค\n109504 0 ไธ่พๅท็ไธบ8889็ๆณๆๅฉไธไธ่พๅท็ไธบ9888็ๅฎพๅฉ็ธๆ\n1000 Processed\n classify content\n110000 0 ๅไธบไนๆจๅบไบLITEๅTalkBandB2\n110001 0 ๅไบฌๅคงๅญฆๅฐ็็งๅญฆไธๅทฅ็จๅญฆ้ขๅปบ่ฎพ้ขๅฐ่ดจๅ็ฉ้ฆ็ ่ฎจไผ\n110002 0 ๅๅฏไปฅๅปๆ้กถ็บง็shoppingmallๅคชๅคๆฑ่ดญ็ฉๅจฑไน\n110003 0 ๆฟๅพ็ถๅ
็ๆฏ้้็็NokiaNxxxๅ\n110004 0 ๆไปฌ็ๆ้ฒ็็ผ็ฅๅๆฌง็พๅฝๅฎถ็ไบบ็ๆไปฌ็็ผ็ฅ็ฅไผผ\n1000 Processed\n classify content\n110500 0 ๆณฐๅท่ฎก้้ๅฏนๆฌๅฐไบงไธ็็ญ็น\n110501 0 ไฝๆฏๆฐด่กจ่ขซ็็ช็ไบๆ
ไนไธๆฏ็ฌฌไธๆฌกๅ็ไบ\n110502 0 BvlgariMonete15ๆๆฐๆฌพๅๅฑๅ
้ๅธฆๅ
่ข่ถ
็บงๅฎ็จ็้ๅฑ่ฎพ่ฎก้ๆฏๆฃๅพ็นๅซ็ๅค็ฝ้ฉฌ้ฃๆ ผ...\n110503 0 DreamByๆขฆๆถๅ
ๅฉ็คผ้กพ้ฎๅฎๆนๅพฎๅ|ๅ็ป็ฝๅฉ็คผ็ตๆใ้ฃๅฐใๆป็ฅๅๅฎข\n110504 0 ่ฏฅxๅ็ฏ็ฝชๅซ็ไบบๅทฒ้กบๅฉ็งปไบค่ฅฟๅฎ็งฆ้ฝ่ญฆๆน\n1000 Processed\n classify content\n111000 0 ๅๅจๆดพๅบๆ็่ตๆ็ๆถๅๅไธ่ฏดๆๅซ้ท้ๅข\n111001 0 ไธดๅนณ็ซ่ๅ้ฉป็ซๆดพๅบๆ็ป็ปๆค็ซ้ๅผๅฑๅๆ้ฒ็ๆผ็ป\n111002 0 Orbisๆจๅบ่ฝป็้พๆ่็ณปๅ=U\n111003 0 ๅพฎไฟก็พค้ๅจ็พค่็ตๆขฏๆญปไบบ็ไบ\n111004 0 ไธ่พ็็
งไธบ้ปR03**็่ญฆ่ฝฆๅฐไธๅ50ๅคๅฒ็็ทๅญๆๅ\n1000 Processed\n classify content\n111500 0 ็จๆ้
ใ่่ใๅง่ใ็่
ๅถๅๅฟ\n111501 0 ๆตๆฑ็ๅ
ฌๅฎๅ
็ฝ่ญฆๆป้ๆปๅทฅ็จๅธ่กๆไป็ป\n111502 0 ่กๆฟๆณๆ็ง็้ซๅฐ่พ
ๅฏผๅไปฅไธบๆ็ปฉ็ป้ไบ\n111503 0 ๅ่ฐ็็คพไผๅ็ฎกๅฐฑๅฟ
้กปๆๆๆงๆณ\n111504 0 nๆฌกไนฐไธ่ฅฟ็ฌฌไธๆฌก่ขซๆทๅฎๅๅฎถ้ชๆฐ\n1000 Processed\n classify content\n112000 0 ็็ๅ็งๅค่ฒๅค้ฆ็ๅปบ็ญไธๅฐๆกฅๆตๆฐด็ๆฏ่ด\n112001 0 xใๅ้จๅซ็ฑปใๅพๆๅนถ่็ๅฑ็คบ้กน็ฎๆจไป\n112002 0 ่ฎพ่ฎกๅธHyunJuParkๅฐ็บข็ปฟ็ฏ่ฟ็จๅฐๅฐ้้จไธ\n112003 0 ๆญ่ฝฝๅบ็ง่ฝฆๅบๅคๆ่
ๅฐๅๅคง็ซ่ฝฆ็ซ\n112004 0 xๅณๅฒ็้ฃ็ฉ๏ผๆด่ฑๅๅคง่ๅ้ฒ็\n1000 Processed\n classify content\n112500 0 ferragamo่ฒๆๆ ผๆ
้ซ่ท้็ฒ็บข่ฒๆญฃ็็็ฎ็ไบๆๆๅๆฌข็็งไฟกxx็ ๆฌง็พ็้ๅญๅๅคงไธ็ \n112501 0 ไนๆ ๅฟ็ๆๅฝๅคง้ฃๆบ้กน็ฎ่ฟๅ
ฅๆถ่ทๆ\n112502 0 ๅทฒๅคงๅคง่ถ
ๅบไบ้ฃๆบ่ฎพ่ฎกๅถ้ ่
็้ขๆณ\n112503 0 ่ๅผ็ฅๅๅคๅๅๅฆๅๅฐฑ็ฎไฝ ไธ่ฏดๅฅนๅไธ่ฏพไธๅฐฑไป็ป่ชๅทฑ่ฏดๅฅน่ชๅทฑๆฏไธ้ซไธญ็ซ็ฎญ็ญๅพไบๅ\n112504 0 ๅบ็ฐไบ่ฏๅธไบคๆๅจๅผ็ๅๆพ่่ตฐ้ซ\n1000 Processed\n classify content\n113000 0 com้ๅบๆ่ๆฑ่ๅไบบ็พค2๏ผ128245580\n113001 0 ่็ณป็ต่ฏxxxxxxxxxxx\n113002 0 ๅดๅปๆญปๅจๅคๅคฉ~ๅไฝ็ซฅ้ไปฌๅจๅฎถ่ฟๅฅฝๅ\n113003 0 ๆ่ฆๅๅไบฌไฝ ๆฑ็ๆๅญ็ๆ ทๅญ\n113004 0 ้ฒ้ๆถๆฟ็่ฃ
ๆๆบ้ฅๅ็ญๅฐ็ฉๆๅฟ\n1000 Processed\n classify content\n113500 0 ๅพ็ๅฐบๅฏธไธๅฐไบxxxxxxxpx่ฟไธชๆฏๆไนๅไบไน\n113501 0 ไปๅๅบงๆ่กจ็ฑSamFreeman่ฎพ่ฎก\n113502 0 ๅฎไปฃ่กจ็ๅฝไฝ ๆ่ต่ก็ฅจๅ็ๆฅๆๆๅฉๆฏๅคๅฐ\n113503 0 ใIBM่ฎพ่ฎก่ฏญ่จ๏ฝๅจ็ป้จๅใ\n113504 0 ็ญไฝ ่่่ฟๅปไบบๆๅฃไธๅ ตไธคๅฐๅฆๅ
ๆถ่ตทๆฅๆ นๆฌ่ฝไธ่ฟๅป็ๅ\n1000 Processed\n classify content\n114000 1 ๆฅๅญฃๅคง้
ฌๅฎพ๏ผๅๅฎถ่ฎฉๅฉ๏ผๅฎๆจๅคๅฑๆฟ๏ผๅไปทxxxๅ
/ๅนณๆน๏ผ็ฐไปทxxxๅ
/ๅนณๆน๏ผๆฌข่ฟๆฐ่้กพๅฎขๆฅๅบๅจ...\n114001 0 ่็ธๅฏนๅบ็ๆฑๅฎๆฟๅ้ข่ฎกๆ11ๅฎถๆฅผ็ๅฐๅจไธๅๅนดๆถๅฎ\n114002 0 ่ฑๅ้ชจๅนถไธๆฏ็ฝๅญ็ป็็ๆญปๅซ\n114003 0 ๆฌๆบๅฑๅนๅฏ่ฝไฝฟ็จไบโIDๆ ่พนๆกโ่ฎพ่ฎก\n114004 1 ๅง้พๆฐ่กๅงๅฎ็ฅๆจๅ
ๅฎต่ๅฟซไน๏ผ็นๆจๅบVIPๅก๏ผไธไธๆตไธคไธใๅฐๅ:x่ทฏ่ฝฆ็ป็น็ซ๏ผ่ฏฆ่ฏข:xxxxx...\n1000 Processed\n classify content\n114500 0 ๆตๆฑๆไนๅนดๅนด่ฟไธชๆถๅ้ฝๆๅฐ้ฃๅ\n114501 0 ๅฐ้ฃ้ไธดๅฏ้ณๅฐฑๅจไธไน
ๅๅจๆตๆฑ็ๅฏ้ณๅธๅบๅฃ้ๆ็ฐ็บฏ็ฝๅคๅฐ้พ้้ข็คบ็ไปไน\n114502 0 ๅ
ณ็ฑๆๆบ็็ณ็จๆทไปไฝ ๆๅ่ตท\n114503 0 ่ฝๅ
่ดนๆๆๆบ้็
ง็่ชๅทฑๆญฃๅจ้ผๆฃ\n114504 0 ไฝฟๅพMustang็็ผ็ฅๆดๅ ็ๅฉ\n1000 Processed\n classify content\n115000 0 ๆฟๆฅDIY่็ณ้ฅผๅนฒๅทงๅ
ๅ่ๆๆซ่จๆๅปๅธไธๅๅคซ้ฅผๆพ้ฅผๅฏฟๅธๆๅบๆดปๅจ\n115001 0 Googleๅฐๅพๆพ็คบไธ่ฟๅฐฑๆๅ
ฌไบค่ฝฆ\n115002 0 ไปๅคฉๆถๆพ็ต่็ๅฐๅฅฝๅคไนๅ็็
ง็ๆปกๆปก็้ฝๆฏๅๅฟ\n115003 0 xๆ xxๅฑใxๆ xxๅฑใxๆ xxๅฑ\n115004 0 ๆๅๆ่ทๆถๅซ็็ชๅนถ้่ทไธคๅนด็ฏ็ฝชๅซ็ไบบ\n1000 Processed\n classify content\n115500 1 ๅ
็/ๅฅณๅฃซๆจๅฅฝ๏ผ็พๅฎๅฑ
x/xโx/xxๅๅบๅปบๆๅๅไฟ้ๆดปๅจ๏ผไฝ่ณxยทxๆใ็ญพ็บฆ็พๅฎๅฑ
่ฃ
ไฟฎๅฎขๆท...\n115501 0 ๆดพๅบๆๅผๅๆฐๆฟๅฑๅคไธบๆฐๅๅฎไบโๅผ่ตท็ฝ็ป็ญ่ฎฎ\n115502 1 ๆจๅฅฝ๏ผ้ๅธธๆ่ฐขๆจ็ๆฅ็ต๏ผๆ้ฆๅๅญ่ฅ้ไธญๅฟๅ
ๆxๆฅ็ๅคงๅผๆพ๏ผๆๅๆดปๅจ้ๆ ้ณๆฐ๏ผ็ฒพ็พ็คผๅไปปๆจไบซ๏ผ...\n115503 0 xxxxๅนดxๆxxๆฅๅ่ฅๅธไธญ็บงไบบๆฐๆณ้ข็นๅคง่ดฉๅๆฏๅๆก\n115504 0 ๅฐฑๆฏๆ็ๆ็ๅคๅ ้ผป็ๅไธไบ\n1000 Processed\n classify content\n116000 0 ไธๅจ360ๆ่ฒ้ๅขๆฐ่ฅฟๅ
ฐ้็็ๅญฆไธๅฎถๅ้ข่ๅธไธบๅคงๅฎถๆไพไบๆฐ่ฅฟๅ
ฐๅ
ซๆๅ
ฌ็ซๅคงๅญฆไธญๅ
ถไธญๅๆๅคงๅญฆ็่ฑ...\n116001 0 ๆพไบไธๅ ็็ฑๅฐฑๆฏ้ปๆญข่ชๅทฑ่ฎๅฅฝ\n116002 0 u/b/e/rๅฎๅจๆฏไธไธชๆๅๅพaggressive็ๅ
ฌๅธ\n116003 0 ๅพฎ่ฝฏๆXboxMusicๆดๅไธบใGrooveMusicใไบ\n116004 0 ใใๅจ่็ ็ฉถ็็กๅฃซๅญฆไฝ่ฏไนฆ\n1000 Processed\n classify content\n116500 0 ๅจ่ฎคๆธ
็ๆดป็็็ธไปฅๅไพ็ถ็ญ็ฑ็ๆดป\n116501 0 ๆๅคฉ่ฆ่ชๅทฑๅๅฐ้ๅปๆพๅฐไผไผดๅฟไบ\n116502 0 ๆฏๅ
็ฝฎไบไฝ Androidๆๆบไธญ็ไธไธช่ฟทไบบ็ๅฅณๅญฉๅฟ\n116503 0 ๆดชๆณฝๅฟไธๆฒณ้ๅซ็้ขๅด็ปโๆๅป่็้ข้ฒๅ
่กโ่ฟไธไธป้ขๅผๅฑ้ข้ฒ่็็ๅฎฃไผ \n116504 0 D้ถ็ท่ฟ้ฉปA8้ซ็ซฏ่ฎพ่ฎก่ฃ
้ฅฐ้ๅข\n1000 Processed\n classify content\n117000 0 ๆ่/ๆตๆฑไธๅฑฑไธญ้จๅคฉๅธธๅฌๅฐ็พๆจๅซ81ๅช็พ็ฆปๅฅๅคฑ่ธช\n117001 0 ่ฟ2ๅฎๅๅฐๆไธบไธญๆฐๆบๆ
งๅ้กน็ฎ็จๅฐ\n117002 0 ไบใ็ปๆฅ่ฏๆ ้กๆฐธไธญ็งๆๆ้ๅ
ฌๅธๅคๆน่กไธ็พๆ
ๅคงevermoresoftwareๅ
ฌๅธ็ๆณๅฎไปฃ่กจไบบ...\n117003 0 ๅฐ็ ๅญ่ฟ่ฎฉไธ่ฎฉๆๆบๆดปไบโฆโฆโฆโฆ\n117004 0 ๆฒณๅ็ๆฃๅฏ้ขๅ
็ปๅฏไนฆ่ฎฐใๅธธๅกๅฏๆฃๅฏ้ฟๅผ ๅฝ่ฃ้ๆฅ็ๆฃๅฏ้ข2015ๅนด็ฑๆฐๅฎ่ทตๆๅกๆฟ่ฏบ็ๅ
ทไฝๅ
ๅฎนๅ...\n1000 Processed\n classify content\n117500 0 ่ขซไบบๅผบๅฅธๆๅธฆๆฎดๆ่ดๆฎ่ฟๆญปไธไบๆฏๅคฉ็ฒพ็ฅ่ไฝๅ้ๆ็ฃจๆๆฏ็็่ฎฉไปไปฌ็่งฃไธบไปไนไผๅฆๆญค็ปๆ\n117501 0 ๆฑ่็13ไธชๅๅธ2015ไธๅๅนดGDPๆๅๆฐ้ฒๅบ็\n117502 0 YoYoๆฐๅญๆญ่ไผ่ฎฉๆจ็ฉ็็ฑไธ้ๆ\n117503 0 ไธบไบ่งไฝ ๅฏไปฅๅ15ๅฐๆถ้ฃๆบไธ็ฎกไธ้กพ\n117504 1 ใ็พไนๆดปใ็ฅๅไผๅๅ
ๅฎต่ๅฟซไน๏ผ็พไธฝๆดปๅจๅณๆฅ่ตท่ณxๆxๆฅ๏ผๆปกxxxๅxx๏ผๅ้xxๅ
ๆต็จๅธ๏ผๆฌข...\n1000 Processed\n classify content\n118000 0 ้ฎๅๆๅพ็ฅๆฏไธไธช6ๅฒ็ๅฐๅฅณๅญฉๅคฑ่ถณไปๅๅ
ซๆฅผๆไธๆฅ\n118001 0 ็ฐๅทฒๅนฟๆณๅบ็จไบ่ช็ฉบใๆฑฝ่ฝฆใๆจกๅ
ท็ญๆบๆขฐๅ ๅทฅ็ๅไธช้ขๅ\n118002 0 ๅๅจๆน่ฃ
jeep้กถไธไธ่ทฏไบ็ธ่ฟฝ้ๆถ\n118003 0 ่ฟๆๅไธชไฝๅฏนๆฟๅบ็ๆตๆ่พ็ปไบไบบๆง็ๅทๆผ ่ฟไธ็น้ฝๆพๅพ้ๆง\n118004 1 ๆ่ถ
ๅผ็บขๅ
็ญๆจๆฅๆฟ๏ผๆๅ้ขๅญ ๅฐฝไบซๅฎๆ ๏ผ้่ฟไธๅคฉๅ็ญไธๅนด๏ผๅฐๅ๏ผไธๆตทๆฎ้ๅบๆพณ้จ่ทฏxxxๅทๆๆ...\n1000 Processed\n classify content\n118500 0 ๅพฎ่ฝฏๅๅฏไปฅๅคง่ตไธ็ฌไผไธ่ดน็จ\n118501 0 ็ฎๆตๆๆบ่ฟ่งxxx+ๅบฆ??\n118502 1 ๅๅ็ฌฌไธไธญๅฟ๏ผ้ฝๅธ้ธๆกๆบ๏ผ่ฟๆดๅ
ฌ้ฆ็ปฟๅญโโ็ปฟๅใ่ฟๆดใๆต้็ฒพ่ฏ้็ฎxx-xxxใก็ฒพ่ฃ
ๅ
ฌๅญๅคงๅฎ
...\n118503 1 ็ฑไฝณ่ถ
ๅธ้บ้ขๅฐๆ๏ผๅๅบไธๅนฒไบ๏ผ็็ฑๅๅญ็ปฃ@้ป็ณ็ปๆๅๅ ๅคฉๅ
จ้จxๆไบๆฌๆธ
ไปๅคงๅค็๏ผไธๅพ่ต้ฑ๏ผๅช...\n118504 0 ๅทฅ่ตๆฏๆฅ็ป็ๆณๅๅฑๆณๆนๅๆบไผๆฐธ่ฟ็็ปๆๆๅๅคๅไบๆผๆ็ไบบๅทๅๆฏไธ้่ฆๅซไปไธๅ้ฑ็\n1000 Processed\n classify content\n119000 0 ๆๆ็ฎๅ
ๅปๅ
ถไปๅฝๅฎถๆ
ๆธธxxๆไปฝๅๅจ็พๅฝ\n119001 0 ๅ
่ฃคๆฏ่ฃคๅญ่ธ็ฝฉๆฏ่กฃๆ่ฃๅญๅฐฑๆฏ่ฃๅญไฝไธๆฏ่กฃๆ\n119002 0 ๅๅชไฝ็ๆ้ฉๅฝ่บไบบ้ๆ็ๅๅๅฅณๅ\n119003 0 ็ฝๅธฆๅผๅธธๅฆ็ง็พ็
ใไธๅญ\n119004 0 ๆปดๆปดๆ่นๅฟซ่นไธ่น้กบ้ฃ่น็ป็ปๆญ่\n1000 Processed\n classify content\n119500 0 ไธ็พๅบฆไบคๅฅฝโฆไปฅๅๅฐฑๆฏไบ่็ฝ็ๆถไปฃไบ\n119501 0 PS๏ผๅฏ่งๅไบฌๅธๅฏนไบๆฑ่็็ไฝๅ
ถ้่ฆ\n119502 0 ่ฟไธช่
พ่ฎฏๆตฎไบ่ก่ท็บญ่ฃ
็ๅฐ่ฎๅ็\n119503 0 Google็จๆบๅจไบบๆฅๆต่ฏ่งฆๆธๅฑๆถๅปถ\n119504 0 ่ฝ่ฏด็ฌฌไธๆฌก็จ็ต่็yyๅ่ต\n1000 Processed\n classify content\n120000 0 ๅ
ญๅๅบ้้ขไปฅไธๅไฝๅฎ็ฐ็คพไผๆถ่ดนๅ้ถๅฎๆป้ขxxxxxxไธๅ
\n120001 0 ๆๆ็ฅ้ไธบไปไนๆ้ฃๆบๅซๅhuiๆบ\n120002 0 ๅฏๅฎ่ฃ
่
พ่ฎฏๆๆบ็ฎกๅฎถๆฆๆช่ฏ้ช็ญไฟก\n120003 0 ไฝ่ฏด็็ๅฆๆๆไธปbabyไธ็ไบบ็ง่็ฎๅๅฉ็คผ่ฟๆฏ่ฎๆ็็น็\n120004 0 ๆฒๆบๅฟๅธๆณๅฑไผ ่พพ็ๅ
ๅธๅฑๅๅนดๅทฅไฝไผ่ฎฎ็ฒพ็ฅ\n1000 Processed\n classify content\n120500 0 ๅทไบ่ฐทๆญๆฏไธๆฏ้คไบ็ฟป่ฏไปฅๅค็้ฝไธ่ฝ็จไบ็พๅบฆๆฅๅบๆฅ็ๅพ็ๅ
จๆฏshi\n120501 0 ไนๅฐฑๆฏๅฎๅฏ็จไฝ็ฎ่ค็งๅป็ๅคๆนๅ่ฏๅไฝฟ็จ็ๆค่คๅ\n120502 0 ็่ฑๅ้ชจ็้ฝๅๆฌขไธไบ็ฝๅญ็ป\n120503 0 ๅๅฆAๆขฆstandbyme\n120504 0 ๆไปฌ็ๆบๅจไบบโๅธๅ
โๆญฃๅฎ้ๅฐโ็ซโๅจๅจๆฟ้\n1000 Processed\n classify content\n121000 1 ไฝ ๅฅฝใ่ๅทๅฅฅ็น่ฑๆฏtheoryไปxๆxๅทๅฐxๆxๅทๅ
จๅบxๆใ่ฐข่ฐข\n121001 0 ๅไนๆฒกๆณๅๅจ็ต่ๅ็ไฝ ไปฌ็็ดๆญ\n121002 0 ๅ
ๆฌไบฌไธๅ้ฟ้ๅทดๅทดๅคง็ตๅ้่ๅ\n121003 0 ice็ๅฏนๆกไปถๅๆๅๅ่ฏฆๅฐฝ\n121004 0 2็ๆๅจๅฎถๅผ็็ฉบ่ฐๆง็่ฅฟ็\n1000 Processed\n classify content\n121500 0 ๅฅฝๅฃฐ้ณ่ๅฐไธ้ฃไบๆ่ณไบๆ็ๅฅฝๅฃฐ้ณ๏ผ็ฌฌไธๅญฃ็ๅผ ็ฎ\n121501 0 ้ฃ้ก้ฉๅฝ็่่้ปๆฒนๆไปๅฐๅ
35G\n121502 0 ๅไธบโ่ฟๅปโๅ
ฌๆไบๆๆไบ่็ฝๅทจๅคด|ๅไธบโ่ฟๅปโๅ
ฌๆไบๆๆไบ่็ฝๅทจๅคด2015ๅนด07ๆ31ๆฅ03\n121503 0 ๅไบฌๅ็ฏฎ้
้็ตๅจๅ็ฏฎ้
้ๆฐดๆณฅ้
้็ง่ตๅบ็งๅ็ฏฎ้
\n121504 1 ใTATAๆจ้จใไบฒx.xxๅ
จๅฝ่ๅจๅทฅๅ็ดไพๅฎๆจๅคๅ*/ๆฌพxxxxๅ
๏ผๅxxๅๅญๆญค็ญไฟกๅฏๆต/x...\n1000 Processed\n classify content\n122000 0 ๅฉ้ๅ
ฌๅฎๅๅฑๆๅ็ ด่ทไธ่ตทไปฅโๅ
ๅฐๅงโไธบ็ฑๅฎๆฝ่ฏ้ช็็ฏ็ฝชๅขไผ\n122001 0 ๆๆไปฌไน้ฝๅจ็จๆไปฌๅฆๅ็้
ต็ด ๆด้ข็ฒ??\n122002 0 ไธฅๅๆๅป็ ดๅ็ตไฟก่ฎพๆฝ็ฏ็ฝช่กไธบ\n122003 0 ่ฟๆ ทไธไธชๆไธๆฅๅฐฑๅพ5800ๅ
ไบบๆฐๅธ\n122004 0 ้ฉพ้ฉถๅ้ฉพ้ฉถ็C***ๅท้ๅๅๆ่ก้ฉถ่ณไบไฟ้ซ้xxxๅ
ฌ้ๅคๆถๅ ๆ่ฝฆ็็ฏๅ
ไฟกๅทใๅถๅจใ่ฟๆฅใๅฎๅ
จ...\n1000 Processed\n classify content\n122500 0 ็ฎๅๅฒๅ
ๆฏ่พ็ญ้จ็็งๆฟๅบๅๅฆๅๅใ็ๆฏใ็ซ่ฝฆ็ซ็ญ\n122501 0 ๅจๆญคๆณๅฎๅฑ็คบไธไบ็ๅฎ็ๅ่ดท็บ ็บทๆกไพ\n122502 0 ไธๅ35ๅฒ็ๅๅบไฟๆดๅทฅ่ขซB2่ณB1ๅฑ็่ชๅจๆถๆขฏๅคนไฝ่
ฟ้จ\n122503 0 ๆณ้ฉฌไปฃ่กจ่ฎจ่ฎบ้ฃๆบๆฎ้ชธ้ดๅฎไบๅฎๆฏ้ๆฑๆฏๅผๅงๆๅฏป๏ผญ๏ผจxxx\n122504 1 ้่ฏ่ฝฎ่ๅ
ฌๅธ่ฐญๅฐๆ็ฅๅไฝ่ๆฟ็พๅนดไธไบๅคงๅใ่ดขๆบๅนฟ่ฟ๏ผๅ
ฌๅธไธป่ฅไธๅ
ๅ
จ้ข่:ๅ
จ็่กใๅทฅ็ฟๅใๅกไธ...\n1000 Processed\n classify content\n123000 0 ๆฌๅญๆฑๅคง้ๅฐๆฐๅข29ๅค่ฟ่ก้้\n123001 0 ไธญๅฝ้ๅทฅไปฅๅไธญๅฝๅซๆ็ญ้ฝๆฏๅคฉๅพทๆ็ปญ็ๅฅฝ็ๅ็ง\n123002 0 ๆๆ็ตๆขฏ้ฝๆ่ฟไธช็บข่ฒ??ๆ้ฎ\n123003 0 ๅไฝ้ไฟกไปถๆๅฟ็็ฉไธๅๅฐๅงๅง่ฎค่ฏๆไบ\n123004 0 ๅฎ่ฅฟๅธ้ๆฅ10่ตทๅ็ๅจ็พคไผ่บซ่พน็โๅ้ฃโๅ่
่ดฅ้ฎ้ข\n1000 Processed\n classify content\n123500 0 Win10Mobile็ๅพฎ่ฝฏๅฐๅจ็นๆ๏ผ็ฅๆไธ็ๆถ้ด\n123501 0 ๅๆๆๅจๆฟ้ด้็ฉๆๆบๅขๅฐฑๅชๅฌๅค้ขไธๅฐๅงๅจๅคงๅ๏ผๆ่ฆๅบไธฝ่ๅธๅธ\n123502 0 ๅป็ไธๅผๅง่ฎคไธบไปๅชๆฏ่ขซ่ซๅญๅฌไบ\n123503 0 ๆๆๆไธญ็่ก็ฅจไปๅคฉไธๅๅญๅจๅ่ฝ็ๅฏ่ฝ\n123504 0 ๅจ7ๆ7ๅทๆฌไบบๅๆฌกๆฅ้ๅคง่ไธ่ๅ
ต22ๅนดๅ ไธบๆฟๅบๅฎๅไธขๅคฑๆกฃๆก้ ๆ่ๅ
ตๅฆป็ฆปๅญๆฃๅ\n1000 Processed\n classify content\n124000 0 ๅ็ๅฐ้ฃไธช็ท็ๅจๅฅฝๅฃฐ้ณ่ๅฐไธๆฑๅฉ\n124001 0 ่ฑๅฝ้่ทฏ้็ฅจ๏ผๅ
่ดน่ต ้ไน่ฝฆๅคฉๆฐ\n124002 0 ๆ้ๅฝ็้ฒๆ็ณปๆฐๆฏไปไบSPF15ๅฐSPF30ไน้ด\n124003 0 ๅนถ็่ตฐ็ฌฌ328็ชๅฝฉๅกไพๅ
ป่ฉ่จๅ็ญ\n124004 1 ๅฐๆฌ็ๅฎถ้ฟๆจๅฅฝ๏ผ้ณๅ
่บๆฏ้ฆๅผๅญฆๅฆ๏ผๅจไธ่ณๅจไบไธญๅฐๅญฆ็ไฝไธ่พ
ๅฏผๅทฒๆไธไธ็่ๅธ่พ
ๅฏผ๏ผ็พๆฏๅไนฆๆณ็...\n1000 Processed\n classify content\n124500 0 ไธ่ตทๅๅฟ็้ฃไบๅนด็็น็นๆปดๆปด\n124501 0 ๅๆฅๆ้ๅฐฑๆๆ่ ขๅ็ฏ็ฝช่กไธบ\n124502 0 ้ฃไนไบ่็ฝ+่กๅจ่ฎกๅ่ฆๅฆไฝๅพไปฅๅฎ็ฐๅข\n124503 0 ่ๆ นๆฎๆฐไฟฎๆน็ๅฐไบค้ๆณ่ง่งๅฎ\n124504 0 ๅไบฌ่ณๅฎๅปๆฎต็บฆ80ๅ
ฌ้ไธไบฌๅๅ้
ๅ
ฑ็บฟ\n1000 Processed\n classify content\n125000 0 ็ถๅไปๅฐฑไป็ต่้ๅบๆฅไบโโฝโ\n125001 0 ่ฐ่ฟๆข่ฏด่ฝฌๅบๅ ๅชๆ่ซๅญไธๆไบบ\n125002 0 ๆณฐๅทๅฐฑๆฒกๆๅชๅฎถๆฅๆๆ็บณ่ฑๆๅฏฟๅ้
ๆๅกๅไธไผไนฑ้ผ้ผ็ๅ??\n125003 0 ่ฟๆฌก็ๅพ็ๆฎ่ฏดๆฅ่ชiPhone็ธๅ
ณไบงไธ้พ\n125004 0 ๆ็จ็พๅบฆ่ง้ขๆๆบ็็ไบโ็พๅฅณไธบๅๅฟซ่บฒ่ฟๅฐๆ้โ\n1000 Processed\n classify content\n125500 0 nuru็็ฟป่ฏ่ฟๆ้
้ณไนๅฅฝๅฏ็ฑ\n125501 0 ๆ ก้ฟใๅฏๆ ก้ฟๅ ็ฏ็ฉๅฟฝ่ๅฎ็ฝช\n125502 0 8ๆ7ๆฅๆ่ฐขไธๅธๆ่ฐข่ถ็จฃๆ่ฐข่ฉ่จ็ปง็ปญๅฝไธชๅฅฝไบบๆฅ็ญไฝ ไปฌไนๆ้ๅคงๅฎถๅ็ตๆขฏๆณจๆๅฎๅ
จๆณจๆๅฎๅ
จ\n125503 0 ็ซ็ณๅฏ่ๅท่ตๅบ็พๅ็พไบบๅญฃ\n125504 0 ไปฅๅ้่้ซๅไธญ้็ซโๅๅคๆๅฑฑๆช้ฃๆน้ฃๆฒ\n1000 Processed\n classify content\n126000 0 ๅจๆฅๅไธญ่งใๆนๅ็่ตท็นใไธ่ฎฟๆถ\n126001 0 ๅจ่พน็ฏ็ป็ๅทด้ปๆฅๅคฉๅไธไธญๅฟ\n126002 0 ๆ้บปๅฐ่พไบ่ฟๆๅๅคๅๅ็บขๅ
\n126003 0 ๆๅ็จ็พๅบฆ้ฑๅ
xๅ้ฑๅ
xๅ
่ฏ่ดนไบ\n126004 1 ๅฐ่ดต็ๅฎขไบบ๏ผไบฌ้ฝ่่็ฅๆจโไธๅ
ซ็พๅฅณ่โๅฟซไน๏ผไธบๆๆฉๅ้ฆ๏ผ็นๅซไธบๆจๆจๅบโ่ฐขๅคฉ่ฐขๅฐ๏ผๆจๆฅๅฆ๏ผโ็ญ...\n1000 Processed\n classify content\n126500 0 ่ฎฉไธไธช20ๅ ๅฒ็ๅฐๅงๅจๅจๅ
ป่้ขไธๅนฒๅฐฑๆฏ28ๅนด\n126501 0 ๅไธ้ฃๆบๅฐฑๅฌๅฐๆ่ช็ญๅ ไธบๅคฉๆฐๅๅ ๅๆถ\n126502 0 ้ถๆณฐ0571โ86234738\n126503 1 ๏ผ่ฎฉไฝ ้ๆถๆฅๅฌๅซไบบ็ต่ฏๅ็ๅฐๅพฎไฟกQ-Q่ๅคฉ่ฎฏๆฏโๅฏนๆนไธไผๅ็ฐ๏ผ๏ผ่ฏฆ๏ผxxxxxxxxxxx๏ผ\n126504 0 innisfreeๅnaturerepublic้ฒๆ้xๅฎ้ฒ้ฒ้ฉฑ่ๅท้พๆไบบๅฟ็ซฅๅฏ็จx\n1000 Processed\n classify content\n127000 0 ๅค้ข็ๅ็ง่ฃ
ไฟฎๅฃฐๆๆๅต้ไบ\n127001 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ 9pd9u8ไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n127002 0 ้ๆถ้ๅฐๆ็
ง็ไธไผ ๅฐๆๆบ็ญ็งปๅจ่ฎพๅค\n127003 0 ่ๅณๅฐๅจ9ๆไปฝๅๅธ็2016ๆฅๅค็ณปๅๅๆฏไปๅจ่ๆ้ด่ฎพ่ฎก็ๆๅไธไธช็ณปๅ\n127004 1 ๅฐๆฌ็ๅฎขๆทๆจๅฅฝ๏ผๆญฃๅ่ฃ
้ฅฐๅไธๆฅผ็พๅฟๅฎถ็พๆจ้จ็ฅๅ
ๅฎต่ๅฟซไน๏ผๅๆถx.xx็โไนไบซๅฅ้คโไผๆ ๆดปๅจไน...\n1000 Processed\n classify content\n127500 0 ๅคง็บฆๆ127็งๆถๅๅฐๆไฝ็ณป็ปๅๆ ็บฟ้ไฟก็ไธๅฉ่ขซ็จไบๅฎๅๆๆบ\n127501 1 ็ฐๆๅค่ฝๅฐฟ็ด ๅฐ่ดง๏ผxxkg่งๆ ผ๏ผ้ป่ฒ้ข็ฒ๏ผๅ้ๆๅ
่ฃ
๏ฝ๏ฝไปทไฝxxxx๏ฝ่ฆๅพ่ฏท่็ณป๏ผๅฎ็น้ๅฎ ...\n127502 0 ่ฟๆฏ็ปง็ปญๆดๅๆๆณ็กฎๅฎๅฏไปฅๅฝๅบๅปๆฏ\n127503 0 5ใ้ฅฎ้ฃไธ่ฆๅๅฐไฝ่่ชๅ้ซ็บค็ปด็ธ็ปๅ\n127504 0 ๅๅคฑ็ ๆๆฉ่ฟๆlabtest\n1000 Processed\n classify content\n128000 0 ่ถ้ฑ3ๅ
ๅๅคฉโฆ็็ไปฅไธบ่็พๅง็้ฑๆฏๅ
่ฏ่ดน้็ๅ\n128001 0 ไธๆฆไธๅพฎ่ฝฏ็็ฆไธๅ่ฎฎๅจ2016ๅนดๅฐๆ\n128002 0 ็็ๅบ้NBA็ๆถฏ่ฟไธ่ทฏ็ปๅ็ไผค็\n128003 0 ่ท็พๅบฆipadๅฎขๆท็ซฏๅฃ็บธ\n128004 0 ๆฅ่ชๅฎๅพฝ็็xxๅ็ฅๅไธๅฎถๆฅๅฐๅฎฟๅทๅธไธบๅฝๅฐ็ไผไธใๅไฝๅผๅฑๆบๅๆๅก\n1000 Processed\n classify content\n128500 0 MicrosoftๅฐVRๆ่กไฟกๅฟๅ่ถณ\n128501 0 ้ซๅฐๅคซ็ๅบๆๅคงๅธๆฐดๅจๅไบฌ็ๅบ็จๆฐด็ธๅฝไธคไธชๅบ\n128502 0 ไปฅๅผบๅฅธๅฆๅฅณ็ฝช่ขซๅคไบ3๏ฝ10ๅนดโฆโฆๅฅฝๅๆบ็ญ็\n128503 0 ๅฑ
็ถๅจ่
พ่ฎฏ่ง้ข็่ง่ชๅทฑ็็
ง็็็ๅบ็ฐๅจ7ๆ5ๆฅๆฑ ๆๆญ้ฉๅฝไธ่ดธๅคฉ้ถ็ๆฅไผ็ๅคงๅฑๅนไธ\n128504 0 17็บงๅฐ้ฃโ็ฟ้ธฟโ่ฆ็ป้ๆตๆฑไบ\n1000 Processed\n classify content\n129000 0 ไน่ฎธๅฅณ็ๆ้พๅคๅคงๅญฆไผๅญๅจๆฝ่งๅไน\n129001 1 xxไธๆฌงๅ
่ดญๆฟ้่ฅฟ็ญ็็ปฟๅก๏ผxxไธไบบๆฐๅธ่ตท็งปๆฐๅพทๅฝ๏ผไธไบบๅ็ๅ
จๅฎถไบซๅๅ็ญๅพ
้๏ผๅญๅฅณไบซๅๆฌงๆดฒๆ...\n129002 0 ๆฑ่ไบบ่ฟไน่ฝๅโฆโฆ็ฌฌไธๆฏๅๅฎ็ดๆฅๅฐฑๆฟๅ้
ๅจๅนฒ\n129003 0 ่ญฆๅฏไนๆฒกๅ ไธชๆฏๅคฉๅคฉๅญ่ฏๅฟๅไบ็\n129004 0 ไธ่ทฏๆ่ตไธ่ทฏๆถ็ๅฑไบๆ้ฟ็บฟ\n1000 Processed\n classify content\n129500 0 ๅไบฌ้ๅฅฅไผ่ทณๆฐดๆฏ่ต็ทๅญ3็ฑณๆฟ็ๅ ๅๆจๅคฉๅบ็\n129501 0 ไธพๆฅ่
โ้ๆญฃไนโๅจๆจๆ7ๆถๆๆฐๅ็ปๅชไฝ่ฎฐ่
็ๆๆไธญ็งฐ\n129502 0 ็ฑ็ๅ่ดจ็ๅฑไธๅฑๅไฝ็ๅๅธ่ฎก้ๆๆฟ\n129503 0 ่ฏท็ป่ฟๅธธ็ๆฎตๅฝ็้็่ฝฆ่พๅ้ๆ
ข่ก\n129504 0 ไธ่ฌ็่ฐทๆญ็ดๆญไธ่พนๆ ้loop่ง้ธๆด็ๅจ้ฃๅ ๅ ๆฏซๆ ่ฟๅๆhhh\n1000 Processed\n classify content\n130000 1 ไบฒ็ฑ็VIP๏ผMASFER-SUๅฅณไบบ่ๆดปๅจ็ซ็ญๆฅ่ขญๅฆ๏ผx.x๏ฝx.xๆ้ดxxๅนดๆฅๅญฃๆๆๅๅๅ...\n130001 1 ้ๅบๆ้กบๅฅ้ฉฐไธๅทxSๅบๅฐไบๆฌๅจๆซ๏ผxๆxใxๆฅ๏ผๅจๅฑๅ
ไธพ่กโไธๅ
ซๅฆๅฅณ่โ็นๆ ๆดปๅจ๏ผๅฏ่ฝๆจๆๅ...\n130002 0 ไบ้ฉฌ้ๆ่ๅ
ๅน๏ผBarRaiser็ๅทๅบ่่
\n130003 0 ๅญฆไผไบๆ ๆชฌ็ๅทๅๆนๆณๆ่ง่ชๅทฑ็้ผ็ไธๅพไบๆฒกๆ็ต่็ๅฅฝๅคๅๅคชๆ ่ไบๆปๆณๆพ็นไบๅ\n130004 0 ไปฅๅพ้ฃๆฃๅจๅ ๆฒน็ซๅ
ๆตๆต็ๆฑฝๆฒนๅณๅฆไปๅทฒ็ปไธๅปไธๅค่ฟไบ\n1000 Processed\n classify content\n130500 0 ไธฅๅๆๅปๆถๆชๆถ็่ฟๆณ็ฏ็ฝช่กไธบ\n130501 0 ไธๅฎถ๏ผๅฏนๅๅบๅฃ้ๅ็ซ็ฎญๅๅจๆบไธ่ฟๅๅ็จ็ซ็ฎญๆๆฏไธๆฉๆฃๅถๅบฆ/Sputnikไธญๅฝโๆฐ้ป\n130502 0 ๆไปฅ่ดดๅฟ็ๅฐYO่ๅ็พๅบฆๆๆบๅฉๆ็ปๅคงๅฎถ่ฐ\n130503 0 ๅพฎ่ฝฏ็กฎ่ฎคๅ
ฌๅ
ฑ้ข่ง็ๅฐๅจxๆๅบๅฐๆฅ\n130504 0 ไธ่ง้ๆฅๅ็ฐ็ต่้็่ๅคๅนด็ๆทปๅฏๅฎ็ๅพๅ
ๆฒกไบ่ฟ็ฉๆฏ\n1000 Processed\n classify content\n131000 0 ||ๆๅจๆฐงๆฐๅฌไนฆๆถๅฌโ012้ฟ็ไธไปโ\n131001 0 ๅฝๅA่ก่่ต็่งๆจกไป็ถๆๆพ้ซไบๅฝ้
ๆฐดๅนณ\n131002 0 ๅธฝๅญ้ฟ่ข้ฒๆ้andๅ
ญ็ฅๅทฒๅๅคๅฅฝ\n131003 0 ้่ฆๅ ๅซๆๅท๏ผxiaovvvjian\n131004 0 ไปปไฝ่ช็ฑ้ฝๅฟ
้ ๅๆณๅพ้ๅถ็\n1000 Processed\n classify content\n131500 0 #NAME?\n131501 0 1ๆฏไบบๅ่ตๆฌๆตๅ่ฟๅบฆๆฟๅบๅนฒ้ข\n131502 0 ๅจไบ่งฃๅ่ฎค่ฏๅคงไผๆฑฝ่ฝฆ็ๅๆถ\n131503 0 ๆ่ฟๅๅ
็ฏไผ ไธๆฎต่ฏ๏ผไธ็ไธๆ้ฅ่ฟ็่ท็ฆปๆฏ๏ผไฝ ๆ็ฅๆๅจๅWV\n131504 0 ๅพๅคๆฌก็ๅฐๅฐๅญฉๅจ็ตๆขฏไนฑ่ท่นฆ่ทณ\n1000 Processed\n classify content\n132000 0 ็้ฒ็ตๅ็็ญๅบฆไป2014ๅนดๅผๅงไพฟๆฏๆๅขๆ ๅ\n132001 0 LXๆ็็็็ธๅฐฑๆฏๆจๅบ่ฟท้พ่ง้ๅคฉ\n132002 0 ๅญฆไผไบๆ ๆไบไบๅๅผ่ไฝๅ๏ฝ๏ฝ๏ฝ\n132003 0 ็ฌฌไธๆฌกๅ้ฃๆบๆไปไน้่ฆๆณจๆ็ๅ\n132004 1 ๆตๆฑๅบ่ฏไธญๆ ไบงๅ็ฌไธๅณ้ข็ฒ๏ผ็ๅข่กฅ๏ผไธดๅบ่ฎคๅฏๅบฆ้ซ๏ผxg*xx่ข/็๏ผไธญxx.xxๅ
๏ผๅ
จ็้้...\n1000 Processed\n classify content\n132500 1 ๆจๅฅฝ๏ผๆๆฏๅผ ๅฎถๆธฏๅ่็ๅฐ้๏ผxๆๅ
ฌๅธๅผๅนดๅทจๆ ๆดปๅจ๏ผx.ๆฐๅฎขๆทๅฐไฟๅ
ป๏ผๆบๆฒน๏ผๆบๆปค๏ผๅทฅๆถ๏ผๆฏxS...\n132501 0 GD็ฅ้ๆไปฌๅบๆดๅทฎ่ฟๅธฆๅจ็ฅ้ๆไปฌไธ่ฝ็ซๅฐฑๅซๆไปฌ็ซ่ตทๆฅๆ็็ๅพๅฟ้
ธ\n132502 0 ๆฑๅฉไบ่
พ่ฎฏ่ง้ข็ๅๅญฆไนๆ ๆ\n132503 0 07ไบฟ็พๅ
็งๆๅๅ
จ็็ฌฌไธๅ็ฌฌๅๅคงๆๆบ่ฏ็ๅๅฑ่ฎฏไธ้่ฟช็ง\n132504 0 ไปๆจๆธฌๆฅ่ฒดๅฆ็ด165ๅ
ฌๅใ60ๅ
ฌๆค\n1000 Processed\n classify content\n133000 1 ๆ่ฐข่ด็ตๆฝฎๅทๅ
จๅ
ด็ๅ
ท๏ผๆฌๅ
ฌๅธไธป่ฆ็ป่ฅๅฝๅ
ๅคๅ็งๅ็็ๅ
ท๏ผ็ญๆฐดๅจ๏ผๆฝๆฒน็ๆบ๏ผๅฎถ็จ็ตๅจๅๅ็งๅๅฑ...\n133001 0 ๆณๅจๅไบฌ็ปง็ปญไบฒไธด็ๅข็ฌ่ฎฐ2\n133002 0 โ็ต่็ปผๅๅพโๆฏๆ่ฟๅ ๅนดๆๅบ็ไธไธช็พ็
็ๅ็พค\n133003 0 ็ๅถๆฅๆบไธๅๆณๅธฎๅฟ่ฟ่พๆๅ
ฑ็ฏ\n133004 0 MKๅ
จ็็ฎ่ณๆตๅ
ไธค่พน่ฑๆ ผ่ฎพ่ฎก\n1000 Processed\n classify content\n133500 0 ๅคฉๅชๆ่งๅพ่ชๅทฑๆ็น้
ท็บขๅ
ๆๆฐๆฏไธๆฏๅพๅๅฎณ\n133501 0 ไฝ ไปฌ็ๅฐพๅทดๅจ2015ๅนด7ๆ8ๆฅ้ฒไบๅบๆฅ\n133502 0 ๆไปฌ็ๅ็ฎกๅๅๅฐฑๆไธ็ๆ
ๅง\n133503 0 ้ข็งฏๆฎตxxๅนณ็ฑณโโxxxๅนณ็ฑณไธ็ญ\n133504 0 ็นๅซ่ฆๆ่ฐขไธไธ่ถ็งๆณ้ขไนฆ่ฎฐๅ็้ซ่ถ
ไธไธ็ด ๅ
ป\n1000 Processed\n classify content\n134000 0 ๅไบฌโฆโโๅไบซ่ช้ฝๅธๅฟซๆฅiPhoneๅฎขๆท็ซฏ\n134001 0 ๆฑ่13ไธช็่พๅธ18ไธชๅ
ฌ่ฏๅคๅทฒๅผ่ฎพ6ๅคง็ฑป36็งๅ
ฌ่ฏไบ้กน็ฝไธๅ็\n134002 0 ไธญๅฝๆฟๅบไปไนๆถๅๆ่ฝ่ฎฉๅฅน็ไบบๆฐๆดปๅพๆๅฐไธฅไธ็น\n134003 0 ่ไธ้จๅๆฑฝ่ฝฆๆธ
ๆฐๅไบงๅไธญๆทปๅ ็ๅฃ่ดจ้ฆ็ฒพๆฌ่บซไนๅญๅจๅฑๅฎณ\n134004 0 ไธ่ตทๆฅๅฌๅฌๅง๏ฝOpening\n1000 Processed\n classify content\n134500 0 ๆฐๅใไบบๆฐๅๅๅฐๅ
ๆฅ้ๅขๆๅฑ็ฝ็ซๆๅ่ต็ฝ็ซๅ ๆฎๅคงๅคด\n134501 0 ไธๆฌพP8+Mate7ไธป้ขโโไผผๆฐดๆตๅนดไนๅๅญไบบ็\n134502 0 ๅคงๅ้ๅข็้ๅฎ็น้ๅๅ
จไธญๅฝ11ไธช็ไปฝ\n134503 0 liyingxin่ฎพ่ฎกๆๅๆๆๅฎๅธ็ไบๆ
ๆณ้ขๅทฒ็ปๅค็ไบไป่ฟๅทฎๆไธไธช้ซๅฐๅคซ็ๆไธไธช้ฆๆฐดไธไธชๆๆบๆฒก็ปๆ\n134504 1 ๅฅฝๆถๆฏ๏ผ??????ไธๅคๅ
ฌๅญๅคฉๅฐๅซๅข
๏ผไธๆตทๅฏไธๅปบๅจๅ
ฌๅญไนไธญ๏ผๅๆฅxxxxไบฉ้กพๆๅ
ฌๅญ๏ผ่ๆๅ
จๆฐด...\n1000 Processed\n classify content\n135000 0 ่ขซ็งฐไธบ้ฟ้\"ๆๅฑๅฟ\"็ๅไผไบบ\n135001 0 15ไธช้กน็ฎๅ
จ้จ่พพๅฐๅบๆถๅปบ่ฎพ่ฟๅบฆ\n135002 0 ๆฅๅฎพๅธๅ
ฌๅฎๅฑไบค่ญฆๆฏ้ไบๅคง้ๆฐ่ญฆ้่ฟ็ฐๅบ่ฐๆฅ\n135003 0 ๅฝไบงๆๆบ็ๅฉๆ
ๅตๆๅฅฝ็ๅบ่ฏฅๆฏvivoๅOPPO\n135004 0 ๆฏๅฆไปไนๆๆๅ
จๆญๆฏๅฆๅ
จ่บซไธไธ่กๆทๆท้ฝๆ็ป็นๅ\n1000 Processed\n classify content\n135500 0 ยทๅ ไธบไธๆฌกไธ่ตท้ไธๆตท่ช็ถๅ็ฉ้ฆ็ๆถๅๆฒก่ฝไนฐไธ็น\n135501 0 ๅพๅคๆถๅ่งฃๅณ้ฎ้ขๅฏ่ฝๆฏIBM\n135502 0 โnmbๅคฉๆนๅๆไบบๅจ็ไฝ ็ธๅ้็ไธ็
ง\n135503 0 ไธๅผ ไปทๅผ329็่กฃๆๅธๆ่ฟ่ฐๆณไนฐ่กฃๆ\n135504 0 ไฝๆญๅฅฝๅฃฐ้ณ200ๅผบ42ๅทๅพ
ๅฎ้ๆ\n1000 Processed\n classify content\n136000 1 ๆ็ๆฐๅไฝ็ฎไป๏ผ่ฏทๅไฝไบฒๅๅคๅคๆฏๆ๏ผๅฑฑๆตทๅคง้
ๅบไปฅๅ
ถๅพๅคฉ็ฌๅ็ๅฐ็ไฝ็ฝฎ้่ธๅคฉๅคๆไนๅ๏ผๆ็งๆๅ...\n136001 0 ่ไปฅ้ฟ้ใไบฌไธไธบไปฃ่กจ็ไบ่็ฝ็ตๅๅนณๅฐ\n136002 0 Google็ไธ่ฅฟไนๆด็ฑๅๅฐๅไธค่
\n136003 0 ๆ่งๅพ่ฑๅ้ชจ่ฟ้จ็ต่งๅง่ทๆไปฌ่ฏดๆไบไธไปถไบ้ฃๅฐฑๆฏ16ๅฒๅฐฑๅฏไปฅ่ฐๆ็ฑไบ่ไธ็ฑ็ๆฏๆๆๆๅฟๆๅฐไฝๆ...\n136004 1 ไฝ ๅฅฝ๏ผๆฌไบบๆฏๅ่ฃ
ไฟฎ่ฎพ่ฎก๏ผๆๆ ทๆฟ้ดๅฏๅ่ง๏ผไปทๆ ผๅบ๏ผ่ดจ้ๆไฟ่ฏๅฎๅๆไฟ่ฏ๏ผๅ
่ดน่ฎพ่ฎก็ฐๅบๆฅไปท๏ผๆๆ...\n1000 Processed\n classify content\n136500 0 ๅ็ฎกๆฐๅฎๆฑไธญ้ๅจๅๅบ่ๅดๅ
ๆๆฅๅ็ฑปๅฏ่ฝๅญๅจ็ๅฎๅ
จ้ๆฃ\n136501 0 ่ฆๅบๅคงไบไบ~ๅฆๆไธ่ตถๅฟซ่ฟ่กๆฒป็็่ฏ็ฑไบไฝ ็ๆคๆไฝ ๅจๅด็ไบบไผๆๅฐๅพ็ฒ็ดฏ\n136502 1 (x/x)ๆจๅฅฝ๏ผๆ่ฐข่ด็ตไธๆตท่ๅคฉๆกไธ๏ผๆฌๅ
ฌๅธไธไธๆไพไธชๆง้ๅ้็ธๆกๅฎๅถไธๅกใ่ฏฆๆ
ๆ็ปๅ
ฅwww...\n136503 0 ไผ็ๅทๆฐ่ฑกๅฐxๆxๆฅxxๆถๅๅธๅคง้ฃ่่ฒ้ข่ญฆไฟกๅท\n136504 0 ๅธธๅทๅธไบๆด็บบ็ปๆบๆขฐๆ้ๅ
ฌๅธ่ฃไบ้ฟ็ๆๅ
ถๅจ็ข็ฃจๅฆไฝ็จไธๅฅ็ณป็ปๆ็ไบง่ท่ธชใไปๅจ็็นใ่ฎพๅค็ถๅตใ่ดจ...\n1000 Processed\n classify content\n137000 0 ๅๅๅๅๅๅๅๅๆฑ่ๅฐ็ปงๆฟ่
ไปฌ็้
้ณ็ฎ็ดไบๅๅๅๅๅๅๅๅ\n137001 0 ไธไธชๅฅณไบบ็ๆบๆ
งๆฏ็ฏๅขๆ้ ๅบๆฅ็\n137002 0 ่้ปๆๆถๅไผๅผ่ตทๅคฑๆไธๆ ๆณๆฒปๆ\n137003 0 ไฝ ๅฏ่ฝไผ่ฏด๏ผไพๆฟๅญ3000ๅ
/ๆ\n137004 0 ๅๆฌไปฅไธบโๆฝ่งๅโ่ฟไธช่ฏ่ขซ่ฏดไบ่ฟไนๅคๅนด\n1000 Processed\n classify content\n137500 0 ่ฟชๅฐๅพทไธฝๅluxelabๆบๆ้ฆๆธฏไธๆไธๅธ\n137501 0 ๅๅๅจๅ
ฌไบค่ฝฆไธ่งๅฐไธคไธชๅฐๅท\n137502 0 ไธๅฐๅคฉ็1ๅ
ฌ้ๅคๅ็ไธค่ฝฆ่ฟฝๅฐพไบๆ
\n137503 0 ่ๅทๅฐ่ฑกไธ้๏ฝไปๅคฉๅจ้ซ้็ฉฟไบ้ฟ่ฃค\n137504 0 ๆๅคง็่ดญ็ฉไธญๅฟ่ฟชๆ่ดญ็ฉไธญๅฟ\n1000 Processed\n classify content\n138000 0 ็ปๅ็ๆฏๆๅไธๅผ ๆตๅฐๅพๅท็็กฌๅบง่ฎฉๆ็ปๆขๅฐไบ\n138001 0 ไฝฟๆฐ่ฝฆๆฏZx็่ตทๆฅๆดๅ ้ธๆฐ\n138002 0 ๆๆๅ็ๆ่ถ
่ฟ้ป้ปไธ้ฉ้ฟ้\n138003 0 ็็ฌฌ29ๅฑๅคๅญฃๅฅฅๆๅนๅ
่ฟๅจไผๅจๅไบฌๅผๅน\n138004 0 ้ฝ่กจๆNormanๆฏไธชๆ็ฒพ็ฅ้ฎ้ข็ๅญฉๅญ\n1000 Processed\n classify content\n138500 1 ไธบๅ้ฆๅฎขๆทๅฏนๅ
ฌๅธ็ๆฏๆไธๅ็ฑ๏ผๅ
ฌๅธ็นๆจๅบไธๆฌกๆงไธไปxx%ๆฟ็ญ๏ผๆชๆญขๅฐxๆxxๆฅ๏ผๅๆถ่ฟๅฏๅๅ ...\n138501 0 ๅๅฐๅดๅๅ้ฟๅงจไธ่ตทๅปๆตๆฑ่ฝฌ่ฝฌ\n138502 0 ่ฟ่ฆไธๆผไธๆฌกโๆปจๆตทๅๅธโๅ\n138503 0 ๆไธบๅไบฌๆ
ๆธธ็ง้ญ
ๅๆฅ่ชๅฎๆไธบไฝ ๆๅๆๅปบ้ ็ๅป็ป็ณ็ช\n138504 0 ่ฑ่ดน560ๅ
ไนฐๅๆฅไธไธชไธๅฏไปฅไฝฟ็จ็็ฉๅ\n1000 Processed\n classify content\n139000 0 ไธๅๆxxxๅท้ฃๆบ็้ข้ไน่บฏ่ขซ็ธๅพ็ฒ็ข\n139001 0 ็ไปฅไธบ่ชๅทฑbbไธคๅฅ็ฝๅญ็ปๅฐฑไผไธ็ฑ่ฑๅ้ชจไธๆ ท็\n139002 0 ไฝ ไปฌ้ฝ้ฃไนhighไผฐ่ฎก็ต่ๅ็ๅฅนๅจๅทๅทๆ็ฝ่ๅทฒๅง\n139003 0 ๆ็ใๆๆฐงๅใๅฏนๅฅณๆงๅ
ๅๆณ่ฐๅ
ปๅคงๆ็ๅค\n139004 1 ็ช้ธก็พ้ผ x่ไธญ๏ผไธไธญไธ็ช็พ้ผ ๏ผไบไธญไบ็ช้ผ ๏ผไธ่ไธญ็ช๏ผๅฟ
ไธญ๏ผไธๆฌก้ไบบ็พ็ๅ
จไธญ\n1000 Processed\n classify content\n139500 0 ๆบ่ฝๅไธๅๆๅทฅๅ
ทPowerBIๅฐๅจ7ๆ24ๆฅ่ฑ็ฆป้ข่ง็ๆฌ็็ถๆ\n139501 0 ไธๅฐ็ต่ไธๆฏ่ถๅฐฑ่ฝ่ฟไนๅไธไธๅ\n139502 0 ็ๅฐ่ฟไธๅ ๅ ็่
่ดฅ้ชจๅคด้พไปฅๆณ่ฑก่ฟๆฏ็จๆฅๅ้็็\n139503 0 ้ขยท็ฑณๅ
ๆดไปๆงๅถไบๆฐ็โฆโฆ\\nSometimeswhenIๆปฉ\n139504 0 ๆฑๅๅ
ฌๅฎๅๅฑๆฐ่ญฆ่ฟๆฅๅฑๅผ่กๅจ\n1000 Processed\n classify content\n140000 1 xๆxๆฅ่ณxๆฅๅ่กๅคๆฌพๆญฃๆๅไบๅ
ๅฎตไฝณ่ไธๅฑ็่ดขไบงๅ๏ผ่ดญไนฐ่ตท็นxไธ่ตท๏ผๆxxๅคฉ๏ผxxๅคฉ๏ผxxx...\n140001 0 ๅๆๅฟๆณ้ขไธๅฎกไปฅ่ดชๆฑก็ฝชๅคๅคๅๆๆๆๆๅพๅ1ๅนด\n140002 0 ไบ้ฉฌ้่กจ็คบๅ
ถๅทฒ็ป็ญพ็บฆ่ฅฟ็ญ็Iberdrolaๅ
ฌๅธๆฅๅปบ่ฎพๅ็ฎก็ๅจ็พๅฝๅๅก็ฝๆฅ็บณๅท็้ฃๅๅ็ตๅบ\n140003 0 G25้ฟๆทฑ้ซ้็ฑ่ฟไบๆธฏๅพๆญๅทๆนๅๅฎๆญๆฎตไปK2125+228่ณK2125+322ๅคๆฝๅทฅ็ปๆ\n140004 0 ๆไฟฉ่ฟๆขฆๆณไธญๅฝฉๆ่ต่ฎฉไปไฟฉๅไฝ\n1000 Processed\n classify content\n140500 0 ๅ
ณไบelegance็่ฟๆฌพๆฃ็ฒ\n140501 0 โโๅป็ๅ่ฏๅฅนๅๆฐดไธ่ฆ็ดๆฅๅฝ\n140502 0 ็ฐๅจ็็ฌๅค้คไบ็ฉๆๆบ่ฟ่ฝๅนฒไปไนๅข่ฟๅบ็ฉไบบ็ฑปๅฟซ่ฆๅๆๆบๅจไบบไบ\n140503 0 ๆฏ็นๅ
ป็้
ไธๅฌ็
ๅคๆฒปๅฌ็
ๅคๆฒป๏ผๆ็
งไธญๅป็่ฎบ\n140504 0 ไปไนๆ
ๆธธไปไน้ฌผๅๅณ่ตๆ นๆฌๆฒกๅบ้จๅฅฝๅ\n1000 Processed\n classify content\n141000 1 ๅฎถ้ฟ๏ผไฝ ๅฅฝ๏ผ็ฐๅ
ญๅนด็บงๆฐๅญฆๅฒๅบ็ญๅฐไบxๆx-xๆฅๅผๅงๆฅๅๆณจๅ๏ผxๆxๆฅๆญฃๅผไธ่ฏพใๅฐ็ญๆๅญฆ๏ผๅ้ข...\n141001 0 ๅไบฌๆฏๅฉด/ๅฟ็ซฅ็จๅไฟกๆฏ\"ๆฏๅฉด/ๅฟ็ซฅ็จๅ\n141002 0 ้จๅฃ็่็ณๅบ้ๆฐ่ฃ
ไฟฎๅๅผไธไบ\n141003 0 ็พๅบฆๆพ็่ฟไน็
ๆ็่ดดๅงไธ็ฎก\n141004 0 ๆฐ็ท็ฅๆ็ง่ตซ/ๆๆด่ตซ12ๅนดไธ็ปผ่บ\n1000 Processed\n classify content\n141500 0 ๆฑๆน็ๅณ้่ฟๆฎ็็ใๅป้ข้ไธ่ตท็ฉๅๆบ็ๅฐไผไผดไปฌใ\n141501 0 ็ฉๅซๆ็ๆๅๅ ่ตทๆฅcomecomelet'sgo\n141502 0 ๆตๆฑๆญๅทๅธๆๅคงๅญฆๅบๅฑๆฏไธ็ๅฐๆฝๆฅๅฐๆญๅทๅธๅ
ฌๅฎๅฑ็ปๆตๆๆฏๅผๅๅบๅๅฑ้ๆฒๆนๆดพๅบๆๆฅๆก\n141503 0 ๅทฆๅณๅพๆๆๆบๅฏไปฅๆงๅถไธป่งๅทฆๅณ็งปๅจ\n141504 0 ๅ
ฌๅฎๆ็บฟๅฐคๅ
ถๆฏ้ฆ้ฝ็ๅ
ฌๅฎๅนฒ่ญฆ\n1000 Processed\n classify content\n142000 0 ่ดฟ่ตๅ็ปไฝ ็็ๆ้ฝๆฏไบๆฌ็ๆ\n142001 0 ๅฆๆไฝ ๆถๆนๅฎ้ฒๆ่งๅพๅพๆฒน่
ป็่ฏ\n142002 0 13ๅฒไธไธชไบบๅป่ถ
ๅธไนฐไธไธชๆๆ็็ๅฅถๅ้ขๅ
\n142003 0 ๅ
ๅ้่ฟISO9001ๅฝ้
่ดจ้็ฎก็ไฝ็ณป่ฎค่ฏๅISO14001ๅฝ้
็ฏๅข็ฎก็ไฝ็ณป่ฎค่ฏ\n142004 0 **ๆฐๆนๅไธ้ถๆถฆๆ่ตไธญๅฑฑๅ
ฌ็จๆดฅๆปจๅๅฑๅไธ่ตๆฌ้็ญๅป็ๆกๅพท็ฏๅขๅไบฌไธญๅๅๅทดไผ ๅชๅผบ็ๆง่ก็ณ้ๅฐ้\n1000 Processed\n classify content\n142500 0 ๅคๅๅๅบ็ๅปบ็ญ็นๅซๆๅผๅ้ฃๆ
\n142501 0 ๆฏๅคฉๅๅ
ฌไบคๅๅฐ้็ๆถ้ดๅๆไนฆ็ไบไบ\n142502 0 Win10Mobile้ข่ง็10166่ทๅพๆดๆฐๆจ้\n142503 0 ๅฏนๆถ่ดญไบบ็้ๅถ๏ผ1่ดๆๆฐ้ข่พๅคงๅบๅก\n142504 0 ่บซ่พนไธไธชๅไบ็ๅฐxxxxxๅ็ญไฟก่ฏด็งฏๅๅค่ฝๅ
ๆขๅ ็พ็ฐ้\n1000 Processed\n classify content\n143000 0 ๅไธชๅฐ้่ขซๆญปๅๆ็ฏไธไบไธ็ดๅฐพ้ๆ\n143001 0 ๅไธไธชๅคๆก/ไบๅๅนผๅฟๅญๆๆฏๆก13ๅนดๅๅๅฎก่ญฆๆน่ฏๆฎ็้ ๅ\n143002 0 ledๅๅคง็ฏไธๅ่ฟๆฐๆ ผๆ
่ไธบไธไฝ\n143003 0 ่ฟไบๆธฏๅธๆฐ่ฑกๅฐxxๆฅxxๆถๅๅธ\n143004 0 ๆฏๆฐดไธไนๅญไธไธๅนดxxxๅคฉ้ฝๅฏไปฅไฝ้ชๅฐ้ชๅฝ้ฃๆ
็โๅฐ้ชไนๅญโไปฅๅ่ดญ็ฉไธญๅฟ็ญ็ปๅไธไฝ็ๅคงๅ็ปผๅๅจฑไน่ฎพๆฝ\n1000 Processed\n classify content\n143500 0 ่ถ
ไธๆ็ฑ้ๆบๅจ่ฝฆ่ฟๆณๆ่ด\n143501 0 ้พ้่ฟๆฏ่ฆๅจๅฐ้ๅฃๅฟ็ตๆๅบๅ\n143502 0 ไปๅคฉ่ๅฎๅฆochemๆดๅฎๅฆ็่ฟๆbiochemไธ้จ\n143503 0 ๆฉไบๅนดๆตๆฑๅซ่ง็ๆ็ฑ่ฎฐๆญ่ฏๅฐฑๆฏๅฆๆญค\n143504 0 ๅ
ฑๅไธพๅ็xxxxๅนดโ็ฒ้ชจๆๆฏโๅ
จๅฝJava็จๅบ่ฎพ่ฎกๅคง่ต\n1000 Processed\n classify content\n144000 0 SherburnAeroClubๅผ้ฃๆบๅฝ\n144001 0 ๅๅฝข็ฉๅ
ท่ถ
ๅ้ๅ4ๅ้้ป่็ซ็ญ้ข็ดข้้ธ็้พๆ้พๆบๅจไบบๅฟ็ซฅ็คผ็ฉ\n144002 0 ไป็ซ่ฝฆ็ซ็ๅไบฌๆตท็ๆฏ่ฒ็กฎๅฎๅพ็พ\n144003 0 ไธ่ง็ก้ไธไบไธชๅๆๅทไธๆๆบๅ็ฐๆปกๅฑ้ฝๆฏ่ฏ่กฃ้ดๅโฆ\n144004 0 ๅช่ฝ้ ้ขๅ็ๅๅ็งmvๆฅๅบฆๆฅๅ\n1000 Processed\n classify content\n144500 0 ๅๆ่ๆ็ฅๅฅ็ๆฏไฝ ไธ็ฅ้ไผ้ๅฐไปไนๆ ท็ๅ้ไบบ\n144501 0 19ๅฒ็nino็ๆฏๅซฉ็ๆๆณ็ฏ็ฝช\n144502 0 ไป
ๆๅฝ้ฒๅๅทฅใ้ค้ฅฎๆ
ๆธธๅๅปบๆ็ญๆฟๅๅบโฆ\n144503 0 ่ฏไฟก็ป่ฅไบงๅไผ่ดจๆๆฏ้ฟไน
็ๅญไน้\n144504 0 ๆ่ต็่ดขๆไบๅพฎไฟกๆๅๅๆดฅๆดฅไน้็่ฏ้ข\n1000 Processed\n classify content\n145000 0 whateveryouwriteโwhitepapers\n145001 0 ๆไนไธไธบไบๆฅๆ็็ธ่ๅ ็ญๅข\n145002 0 ่ฏไบๅ
ฌๅธไบๅ็ไธ็ๅไธใๅทฅๅ
ทใไบๅจๅ่ต่ฎฏ็ญxๅคงๅนณๅฐ\n145003 0 ไปฅๅ่ฅฟๅฎๅ
ฌๅฎ่งไบๆด้ณไบค่ญฆ็ดๆฅๅผๆญป\n145004 0 ๅ
ซใๆ้คไธ่่ช่็ๅ
ณ็ณปๆไปฌๆ้ค่ฅๅๅคชๅฅฝ\n1000 Processed\n classify content\n145500 0 ๅฐฑๆฏ้ฟ้้ๅขๅบ็ไธๆฌพๆ็ปๆฏไปๅฎ็่ฝฏไปถ\n145501 0 ่ไธๅฏไปฅ็จๅจๆฉไธๅๅฆๆฐดๅ็ดๆฅไฝฟ็จ\n145502 0 ็นไปท248็พๅฝๅปๅธๆจ่NO1ๆฐดๅฎๅฎcoppertone่ถ
ๅผๅฅ่ฃ
ๅ
ๅซ1\n145503 0 ๆพ็ถๅจNBAๆ็็ๅจๅงๆฏๆฅๅไบๆก่จช\n145504 0 ๅคงๅคๆฐ็็็
ไบบ้ฝๆไธ็งไธ่ด็ๆงๆ ผ็ฑปๅ\n1000 Processed\n classify content\n146000 0 SAP้็่ฑๆๆฏไธๆฏๅฐๅบฆ้ฟไธๅ็\n146001 1 ๆ่ฐขๆจ่ด็ตๅฎๆณขๅ
้่ฑ่ฒๅฐผ่ฟช๏ผๆๆฏ้ๅฎ้กพ้ฎ่ฃ้พๅจ๏ผๆจๅฏไปฅๅซๆๅฐ่ฃใๆ็่็ณปๆนๅผๆฏ๏ผxxxxxx...\n146002 0 ไฝๆถๆ่ฝไน้ค่
่ดฅ็ปๅญฉๅญ่ฎจๅๅ
ฌ้ใใใใใใ\n146003 0 ็ฎๅ5ๅๆถๆกๆๅๅทฒ่ขซๅฎๅบ่ญฆๆนๅไบๆ็\n146004 0 ๆฏๆฐๅจๆๅ
ไธ็ปฉๆ ๅฟง็่ก็ฅจๅฏไปฅๅคง่ไนฐๅ
ฅๅนถไธญ็บฟๆๆ\n1000 Processed\n classify content\n146500 0 ๆธฏ่กๆฎ่ท๏ผ้ฟ้็ณป่ทๅน
่ถ
10%\n146501 0 ๅไบฌใๅพๅทๅ่ๅ ๅพๅ
ญๅฎๆนๅ็่ฝฆ่พ็ป้่ฅฟๆข็บฝ็ฑๅ่ฅ็ปๅ้ซ้ๅ็ฏๆฎต็ป่ก\n146502 0 ๅๆฌขๅฅนๅฐฑๅผบๅฅธๅฅนๅใๅ็ฝๆไปไน็จใๆไธๅฐๅฐฑไธ่ฏๅใ็ฟป่ธไบๅฐฑๅ่ฃธ็
งใๅคงไธไบ่นฒ็็ฑใไฝ ่ฟ็็ฑ้ฝไธๆข...\n146503 0 ๅฅๅ๏ผ1ใ่ทฏ้ไนๆฑฝ่ฝฆๅฟ็ซฅๅฎๅ
จๅบงๆค
\n146504 0 ๆ ๆฎๅทฒ็ปๅผๅงๆๅ้ขๅฎ้จๅ่ฃ
่ฝฝWindowsxxๆไฝ็ณป็ป็PCๆฐๅ\n1000 Processed\n classify content\n147000 0 ็ฑStephenNickel่ฎพ่ฎก็ผ้ \n147001 0 ๆบๅ
ธ่ดขๅฏๅจๅฑฑ่ฅฟ็ต็ณๆๅไธพๅไบ็่ดข่ฎฒๅบงๆดปๅจ\n147002 0 NBAๆฐ่ตๅญฃ็่ต็จ่ๆกๆญฃๅจๅไธช็้ไธญไผ ๆญ\n147003 0 ๅๅๆณ้ขๅ็็ๆตๆฑๆฎ็ฟไธ้้ขๆ้ๅ
ฌๅธ็ ดไบงๆธ
็ฎไธๆกๅฌๅผ็ฌฌไธๆฌกๅบๆไบบไผ่ฎฎ\n147004 0 ๆฌๆๅฐๆฟๅฐฑๆฅ็็นๅฐ้5็บฟๆฒฟ็บฟๆๅๅผๆฅผ็\n1000 Processed\n classify content\n147500 1 ๆพๆพ็พๅฎน้ข่ฅฟ่ๅบๆธฉ้ฆจๆ็คบ๏ผๅผ้จ็บข๏ผไธๅ
ซ่้็ฐ้๏ผๆปกxxxxๅ
้xxxxๅ
๏ผ็บน็ปฃxๆไผ๏ผ่ฟๅบๆ...\n147501 0 ้ข่ฎกไธญๅฝ็ๆฑฝ่ฝฆไบง้ๅฐ2020ๅนดๅฐ่พพๅฐ3000ไธ่พ\n147502 1 ๅฎถ้ฟๆจๅฅฝ๏ผๅฐๆ ่ฑ่ฏญๅญฆๆ กๆฐๅญฆๆๆฅๅๅทฒๅผๅงใxๆxๆฅๆญฃๅผๅผ่ฏพใๅผ่ฎพ็ง็ฎ:่ฑ่ฏญ๏ผ่ฏญๆ๏ผๆฐๅญฆ๏ผ็ฉ็๏ผ...\n147503 0 ็ฐๅจๅปๅคง้ๆไบๅฟๅธๅบๆณ้ขไธถๅบๅฑๆณ้ขไปฅๅๆดพๅบๆๅไบ\n147504 0 600ๅคๅๅคๆฅๅกๅทฅไบบๅๅฎ็ฝฎๅจ่ฟ้\n1000 Processed\n classify content\n148000 0 ๆปจๆตทๆฐๅบๅทฒ้ไธญๆธ
็ๆต่xxxxๅจ\n148001 0 ๅฎๅฐฑๅๆฏไธ็ๅปบ็ญๆต่ก่ถๅฟ็้ฃๅๆ \n148002 0 ่ๅธ้ฎไบไธๅฅ๏ผโnobody\n148003 0 ๅไบฌ็ๆญฆๅบไฝๅปบๅฑๅฏ่ฐ็ ๅ้็ฑๅนณ่ฆไฝ้
ๅบ\n148004 0 ๆ่ฟไธๅๅพฎๅๆฏๅ ไธบๆๆๆบๅไบ\n1000 Processed\n classify content\n148500 0 ไบบ็็็็ธๆปๆฏๅจๆๅญ็่ๅ\n148501 1 ไฝ ๅฅฝ๏ผไบบๅๆ่ฒ่ฟๆๅผ่ฎพไผ่ฎก(ๅณๅจ)ๆ่ฝ็่ฎบๅน่ฎญใๅฎๅกๆไฝๅน่ฎญ๏ผๅญฆ่ดนไผๆ ๏ผๅฆๆ่ฑ่ฏญ็ญๅค่ฏญ๏ผๅฆๆ...\n148502 0 ๆฒณๅ่ๆๆจกๅผๅฏนไธญๅฐไผไธๅๆๅชไบๅฅฝๅคๅข\n148503 0 ๆญคๅB่ฝฎ็ๆ่ตๆน่ๅ็ญๆบใ้กบไธบ่ตๆฌๅ
จ้จ่ทๆ\n148504 0 xxxxxxxx๏ผไฝ็
๏ผโๆไปฅๅฐฑๆฏๆๆ็ๅฅณ็้ฝ็ฑ็็ฝๅญ็ป\n1000 Processed\n classify content\n149000 0 ่ขซ่ฟ่พ๏ผcecolisabienvoyagรฉ่ฟไปถๅ
่ฃน็ปๅพ่ตท่ฟ่พ3\n149001 0 ็ถๅ้ฒๅบๅธไบๅฐ่ดฉ็้ฃ็งๆปก่ถณ\n149002 0 ้ซ่ท็ญ้ด+้ป่ฒ้พๆกๅ
็ผๅฐฑ็ฑไธ็่ๅฟ่ฃๅญ\n149003 1 ไฝ ๅฅฝ๏ผๆ่ฟ่พนๅ็ไฟก็จ่ดทๆฌพ็๏ผๆ้่ฆๅฏไปฅ่็ณปๆใๆฑ็ป็ใ็ต่ฏ๏ผxxxxxxxxxxx ๅฐๅ๏ผ...\n149004 0 ๅ
จๅฝๅฐฑๅ็ไบ4่ตท็ตๆขฏๅฎๅ
จไบๆ
\n1000 Processed\n classify content\n149500 0 ๆฉไธ่ตทๆฅๅ็ฐๆๆบ้้ขๆ่ฟๆ ทไธไบ็
ง็\n149501 0 ๅฅณๅฟๅจ่ๅทไนๅญ็ฉไบไธชๅพๅบๆฟ็ๆธธๆ\n149502 0 ่ฏฆๆ
่ฏท่ฏข0450930207Yolanda\n149503 0 ๅฐๆนพๅฐบๅ
ซๅๅฎถ้ฆๆฌกๅไบฌโๅฐบๅ
ซ็ฆ
ๅฟโไนๆ
\n149504 0 ๅทจ้ข้่ดญไบIBM็ไฟกๆฏใไบบๅ็ฎก็ๅจ่ฏขๆๅกๆน\n1000 Processed\n classify content\n150000 0 ไธไผ่ขซๅไบไผคๅฎณ/่ญฆๆน๏ผไธ่ไฟๅฎๅฅณๅฟ้ญ่ฝฎๅฅธๅๅนถๆชๅบ่ตฐไฟๅฎ้็ไบๅฎ\n150001 0 ็ดซ่ฏไธญๅฏๅซ็็ปด็็ด Aๅฏไปฅๆนๅ่งๅๅ็ฎ่ค็็ฒ่ไธ็ฎ็ป่\n150002 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ xskxxeไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n150003 0 ๅณไฝฟๆถๆง่ฟ็คๅจ่ฟๅ้ฟๅคงๅฐๅ่ฟซ่บซไฝๅจๅฎใ่ก็ฎก็ญๅๆ ็ผ็ๆ\n150004 0 ๆจไธไธชๅซwikipaintings็ๅบ็จ\n1000 Processed\n classify content\n150500 0 ้ฑ็ฉ่ฏ็ไบๆ ้ก้ธฟๅฑฑไธๆฟๆกฅไธไธชไนฆ้ฆ้จ็ฌฌ\n150501 0 tiger้ๅญ็ฑไบไพ่ดง่ๆฟๅปๆณฐๅฝๆ
ๆธธไบ\n150502 0 ็ปๆ็พๅบฆๅฐๅพๆพ็คบ่ฟๆไธๅฐๆถ\n150503 1 ๆฌงๆดพ้ซ็ซฏๅ
จๅฑๅฎๅถxxxๆ้ด๏ผๆฌงๆดพๆดไฝๆฉฑๆ๏ผ่กฃๆ๏ผๆจ้จๅ
จๅบx.x ๆ๏ผๆดๆๅ
ฌๅธ็นไพๆฉฑๆๅ ็ตๅจๅช...\n150504 0 ไธๅธๆ็ๆฏๆๅธๆ้ฃๆบๆญฃๅธธ้ฃ\n1000 Processed\n classify content\n151000 0 ๆญฆๆฑๅธๅ
ฌๅฎๅฑไธ่ฅฟๆนๅบๅๅฑไบค้ๅคง้่ฝฆ็ฎกไธญ้ๆฐ่ญฆๅญๆๆฅๅฐๆๆๅฐๅฟๅคไปค่ฅ่พนไธๅฐๆๅไปฌๅๆธธๆ\n151001 0 ้ฆๅฐๅธ้ฟ่ฎฟๅโๆฝๅฎขโ้ฉๆตๆๆๅฉๅๆ
ๆธธๅฎฃไผ \n151002 0 ๅนถ้่ฏทๅฐ็ฐไปปๅพฎ่ฝฏๅ
จ็ๆง่กๅฏๆป่ฃ็ๆฒๅ้ณๅๅฃซ่ดๅผๅน่ฏ\n151003 0 ็ๅฎไนฑๆญฅๅฅ่ฐญ็ฌฌไบ้ๆ็ๆญฃๅผๅงๅๆฌข่ฟ็ช็็็กฎๅฎๆ นๆฌไธๆฏๆจ็็ชๅฎๆไบบ็ๅฟ็็็ๅป็ป็้ๅธธ็ป่ดๆ่ง...\n151004 0 ๆฅๅๆถ้ด๏ผxxxxๅนดxๆxxๆฅใxxๆฅใxxๆฅๅ
ฑไธๅคฉ\n1000 Processed\n classify content\n151500 0 ๅป็็งฐไปๆช่ฑ็ฆป็ๅฝๅฑ้ฉโฆโฆ่ฏฅไฝๆทไนๅๆพไธคๆฌกๅ ่ฝ่ฑ็\n151501 0 ๅฝๅ้็ๅไธบไฝไธบๆ็็ฌฌไธๆฌพๆบ่ฝๆๆบ็ๆฏๆฒก้้\n151502 1 ่็ณปQQxxxxxxxxxxใๅฐ็ซ ๏ผ็็
ง๏ผๅปบ็ญๅ
ซๅคงๅ๏ผไธไบ็บงๅปบ้ ๅธ๏ผ่ฅไธๆง็
ง๏ผๅปบ็ญ๏ผๅไธ๏ผๅนฟ...\n151503 0 ๅ
ซๆไปฝ่ฅฟๅฎๅบ็ง่ฝฆ่ตทๆญฅไปทไธๅๅฐๅๅ
\n151504 0 ๆๆบๅๅไบๅๆฒกwifiๅไธ่ฝ็จๆต้ๅฐฑ่ฟqq้ฝ็ปไธไบ\n1000 Processed\n classify content\n152000 0 ไธญๅฝ็ๅ็ฎกๆๆถไปฟไฝๅฐฑๆฏๆดๅจ้\n152001 0 ใๆฌพๅผใๆฌง็พ่ฅฟๆตทๅฒธใ้ข่ฒใ่่ฒ้ป่ฒใ็ ๆฐใ50524854ใไปทๆ ผใ?38ๅ
้ฎ\n152002 0 ๆ ๆไธญ็ๅฐไบMazdaMaxxAuto่งๅพไนๅพไธ้\n152003 0 4ไธช11ๅฒ็ๅญฉๅญๅฐ้ฏ้ธๅธๆฐๅธไธไธญๅๆๆธธๆณณ้ฆ็ฉ่\n152004 0 ไธ็ไธๆ่ฟๆ ทไธ็พคๅฅณไบบไธๅๅคงๆฌพ\n1000 Processed\n classify content\n152500 1 ไฝ ๅฅฝ๏ผๅฆ้ๅ&ๆญฃ่ฏทๆxxxxxxxxxxxๆๆฐ\n152501 1 xxxxๅ
ๅณ่ต xxxๅ
ไบงๅไธไปถ๏ผๆปกxxxxๅ
ๅณxxxๅ
ไบงๅไธไปถ๏ผ่ฏฆ่ฏขxxx-xxxxxxxx...\n152502 1 ไธๅ
ซๅฆๅฅณ่ๅฐๅฆ?? ่็ฝๅ
ฐๅ
ฌๅธไธบๅ้ฆๆฐ่ไผๅ๏ผๅๅบไปฅไธๆดปๅจใๅกๆฏไผๅ่ดญไนฐ่็ฝๅ
ฐๅฅ็ไธๅพxๆ...\n152503 0 ๆไปฅ่ฝฌ่ฎฉๅง~~ๆๅ
ด่ถฃ็่ตถ็ดง่็ณปๆ\n152504 0 ChloeGoldie2015็งๅฌๆๆฐๆฌพๅคง็บข่ฒ็ฐ่ดง\n1000 Processed\n classify content\n153000 0 โ
Dragon'Sโ150807๏น้ไธญโ้ฆๅฐ>\n153001 0 02่ฟๅช้ชไผดไบๆ6ๅนด็ๅฐๅฎถไผ็ฑไบๆ็็ๅฟฝ\n153002 0 ้ฃๆบๅจๅๆนๅไธ็ฉบๅไบไธช็พไธฝ็ๅผง็บฟ\n153003 0 ๆไฝ ๅจps้ๆๆ็ป้ป็ฝ็บฟๆ็จฟ่ฐๆ่ฝ็ดๆฅๅจ็ต่ไธ่ฒ็ๅพ็จฟ\n153004 0 ๆฅๅๅไบฌ้ซๆทณๅบ็ ๅข้ๅๆฐ่ต้ไบๅฉ็คพๅๆญข่ฅไธๅนถ่ขซ็ซๆก่ฐๆฅ\n1000 Processed\n classify content\n153500 0 ๅไบฌๅธไบบๆฐๆฃๅฏ้ข็ฌฌไบๅ้ข่ตท่ฏไนฆๆๆง\n153501 0 ๆ็้ไธ่งๅ่ฝไธปๆฒปๅ้็จ่ๅด๏ผๆ็้ไธ่งๅ่ฝไธปๆฒปๅ้็จ่ๅดๆ นๆฎไธดๅบ่งๅฏ๏ผๆ็้ไธ่งๆฒป็ๆนๆกๅฏน...\n153502 0 ่ฟไธคๅคฉ็ป็ปๆๅฅฝๅ ๆ นHB้
็ฌ้ฝ็จ็ๅทฎไธๅคไบ\n153503 0 ๅไบ9756ไธชๅธๅญโฆโฆๆ่งๅพๆๆๆถๅ็็็น่ฏๅ โฆโฆ็นๅซโฆโฆ่ฏๅ โฆโฆ\n153504 0 ็ฅ็็็ๅฐ้ปๅคด็ฒๅบ่ไธ็็\n1000 Processed\n classify content\n154000 0 150708ๅทด้ปๆด้ซไนๆบๅบๅบๅข&\n154001 0 ไบบ็ไธโๆฒป็ไบไฝ ่พๆฐ็ไบบๆฏไฝ ็ฑ็ไบบ\n154002 0 ๆๆบๆฒก็ต่็ณปไธไธไนไผๆ็บฆๅฎไธ็ฆปไธๅผ\n154003 0 ๅธๆๅฝๆ็็่บบๅจไบๅป้ข็ๅขๅคฉ\n154004 0 /้คไบไปฃๅทฅNexusๆๆบๅไธบๅฏ่ฝ่ฟ็ปGoogleโ็โไบๅ่กจ\n1000 Processed\n classify content\n154500 0 ๆฅๆฌ็ด้ฎไปฃ่ดญๆฅๆฌๅฟ
ไนฐ่ฏๅๆฆไธๆๅๆฅๆฌNichibanๆธฉๆ้็่ดด็ฉดไฝ่ดด่
ฐ็่ฉ็่่็ๅ
ณ่็ๅซ็...\n154501 0 ๆฌๆฅไธไธช็พไธฝ็้่ฅฟๅด่ขซไธไธชๅทฅๅๆๆ่ฟๆ ท\n154502 0 ็็ธๆปๆฏ่ฆๆฏๆไปฌๆณ่ฑกๆฎๅฟ่ฎธๅค\n154503 0 ็ปไบ่ฆๆขๆ่Bๅฆ๏ฝๆฉ็ฅ้ไฝ ่ฆๆๆ็\n154504 0 ๅ่ฟ30ไธชๅจๆทๆฝๅญ็ฐ้700ๅคไธๅ
\n1000 Processed\n classify content\n155000 1 ็ฎๅก๏ผๆฅ็ซๆๆๆบๅ็่ฅ้ๅฎๆจ:ๅฎขๆท่ณไธ๏ผๅ่ดจไฟ่ฏ๏ผๆๆฌไฝ๏ผไฝๆฒน่๏ผ้ซๆ็๏ผๅๆฅๅฟซ๏ผๅณๆฅ่ตท่ดญๆบ...\n155001 0 ไฝ ไนๆฅไธบๅๆฌข็ๆญๆฒๅ ๆฒนๅง\n155002 0 ๆๆบ็ฉไน
ไบ็ๅข็ไธๆฏ็ผ็็็ไบ่ๆฏๅคงๆๆ่
นๅทๅฑๅท็ไบ็ดฏ\n155003 0 ๆไบไบบไธ็้ฝๅจๆธ็ดข็ฑ็็็ธ\n155004 0 ๆฆ่ฏ้ฃๅๆฒน่่ฑ่่ๆถฆ่ๅญๅฆๅฏ็จไธป่ฆไฝ็จๆฏไธบๅๅ้ไฝๆฐดๅๆไพๅฑ้\n1000 Processed\n classify content\n155500 0 ่ฏฅๅ
ฌๅธ็IPO็ณ่ฏทๅ่ท่ฏ็ไผ้่ฟ\n155501 0 ๅทไธ่ฅฟ็้ฝ่ฏฅ็ผๆญปๅๅโฆๅณๅจไบบๆฐ่กๆฑ้ฑๆ่
่กๆฑ้ฑไนฐ็ไธ่ฅฟ\n155502 0 ไฝ้ฃ่ฆ็ญๅฐไฝ ๅ ๅฐ่ฎธ่่่ฐๅ\n155503 0 ่ฅฟ่ๆฟๅใๆฑฝ่ฝฆๆด่ฝฆใ้ถ่กๆฟๅๆถจๅน
ๆๅฐ\n155504 0 ไฝ่่็ๅๆ่ฟ่ฟไธๆญข่ฟ็นๅ\n1000 Processed\n classify content\n156000 0 ้่ฅฟ็ๆดๅทๅฟไบบๆฐๆณ้ขๅฏน่ขซๅไบบ็งฆ้ท้ๆณๆๅฎณ็่ดตใๆฟๅฑ้็ๅจ็ฉไธๆกไพๆณๅผๅบญๅฎก็ๅนถๅฝๅบญไฝๅบๅฎฃๅค\n156001 0 xxxxๅนดxๆxxๅทๅ ๆจxๅทๆฅผ\n156002 0 ๅไธบๅทฒๆ็กฎๅฐๅจ5G็ ๅไธๆๅ
ฅ6ไบฟ็พๅ
\n156003 0 ็บข็นๅชๅไน่ฆ150ๅทฆๅณ็ไปทๆ ผๆ่ฝๅ
ฅๆ\n156004 0 ไธ่ฆๅฏนไธไธชๅชๅ่
่ไธๅฎๅ้ฒ่
ๅถๅบฆ็ๆฟๆๆฑๆๅธๆ\n1000 Processed\n classify content\n156500 0 ๆดไฝ่่จไธๅ้ฝๅพๅฅฝๅพๆๆฐ็ๆดป่ฑ็ถๅผๆ็ๆ่ง\n156501 0 ้่ฆ็่ฏ่ฏทๆไพไฝ ็็พๅบฆไบ่ดฆๅท็ปๆ\n156502 0 ๆฏๆบ่ฅฟ่ทฏๅฃณ็ๅ ๆฒน็ซๅพๆญฆ่ญฆๆฏ้ๆนๅไธ่พ่ฝฟ่ฝฆๆผๆฒน\n156503 0 ๅ
ฑๅ5973ไบบๆฏไปๅคง็
ไฟ้ฉ่กฅๅฟ้้ข3165\n156504 0 ็ฉ้ฅๆงๆจกๅ้ฃๆบ่ฆ่่ฏโ้ป้ฃโๅฏ่ฝไผ่ขซ่กๆ\n1000 Processed\n classify content\n157000 0 ไธๅx็นๆๅบ็ปๆๅธฆๅฅนๅจๅค้ขๅ่ฟ้ฅญๅๅฟๆณ่ฆๅ
ๅธฆๅฅนๅปๅช้ไผๆฏไธไธ\n157001 0 ๆญคๅปบ็ญ็ไธๅคง็น่ฒๆฏ็ป็ๅข็่ฟ็จ\n157002 0 ๆๆๅจไบบๅฟ็ๆฏๆบๅจไบบๅๅนดๅฆไธๆฅ็ๅฎๅๆ่ฎธๆๅ้่ฟข่ฟข่ตถ่ฟๆฅๅชไธบ็ไฝ ไธ็ผๆ่ฎธๆๆฏ็ญไธๅๅชไธบไฝ ่ฟ็ฆป...\n157003 0 ไฝไธบๅคงbossๅๆๅฆๆไธชๆถ่ฟๅจๅ่\n157004 0 xxxxๆฌพ่ทฏ่ๅ็ฐ็ฅ่ก้็จไบ่ทฏ่ๅ
ฌๅธๆๆฐ็ๅๅน
่็ชไธญ็ฝ่ฎพ่ฎก\n1000 Processed\n classify content\n157500 1 ใๆญฆๆฑ[็ซ็ฐ]้ๅใxๆx-xๅท๏ผๅธไธญๅฟๅฝฉๅฆๆดปๅจ๏ผ่ฆๅxxx+ๆผไบฎใ็ญๆ
็็คผไปช๏ผๅ
้ฅญ๏ผxๅคฉๅ
ฑ...\n157501 0 ไปๅนด7ๆ28ๆฅๆฏ็ฌฌไบไธชโไธ็่็ๆฅโ\n157502 0 ๆๅฐๅญฉๅจๅฐ้ไธๅท็บฟๅธธ็่ทฏ็ซๆๅฐฟ\n157503 0 ้ๅนดๅคง่กๆฐๅข4ไธชใๅ้ตๅคง่กๆฐๅข6ไธช\n157504 1 ไธญๅฝ้ถ่กๅฎถๅฑ
่ฃ
ไฟฎๅๆๆๆ ๆตๆผๅๆถไธบๆจ่งฃๅณ่ต้ๅจ่ฝฌ้พ้ข๏ผ่ฏฆ่ฏข่ด็ตxxxx-xxxxxxx ...\n1000 Processed\n classify content\n158000 0 ๅพฎ่ฝฏ็ๆก้ขๅ็ไนๆฏ้ไบ\n158001 0 ๅนฒไบๅไบ่ฆ่ขซไธพๆฅๆฒก้ฎ้ขๅฏไปฅ่ฏด่ชๅทฑๆธ
็ฝไบไบWWWไฝๆฏ่ฆไปฅๆฟๅคช้็ๅ่ช่ตท่ช็่ฏ\n158002 0 ไธ็บฟ่ถ3ไบบไธๅค็ฌๅบ็ชๆทๅผๆๅดไธๆ
ๅคฑ่ถณ\n158003 0 7/30ๅปๅธธๅท่ซๅๅฅๅฆ้ไบไธๅ\n158004 0 ่ท้ฉๅฝ็ไนๆฅ่่้ปๆฒนๅฏนๆฏไบ\n1000 Processed\n classify content\n158500 0 ไปฅๆๅ
ทๅทดๆธไผ ็ปๅปบ็ญ็น่ฒ็ๅ่ๆฅผไธบไธป\n158501 0 ๆๆฉๅ็็ฒพ่ฑๅ่พๅฏนEHERDERๅ็็ๆๅฏผ\n158502 0 2011ไธญๅฝ่่ฏ่ๅจไปชๅพๅผๅน\n158503 0 ๆตๆฑๅซ่งๅ
ไธชโๆ้ๅจๆฐไผฆโ่ฎไธ้็ๅฆ\n158504 0 ๆกไปถๅคๅณๅ่ขซๆง่กไบบไธ็ดไธ่ฝไธๆ\n1000 Processed\n classify content\n159000 0 ไธๅฎ่ฆ็ปๅธธไธไฝ ไปฌicloud้ฎ็ฎฑ\n159001 0 ไฟๆฟๅบๅๅคฎ่กๅจ2014ๅนดๅบ่ณ2015ๅนดๅ้ๅ็ไธ็ณปๅๆชๆฝ็จณๅฎไบ้่ๅธๅบๅฝขๅฟ\n159002 0 xใๆฌไบบๆ้่ฟๅฎถไบบๅทจ้ขๅ่ดฟ\n159003 0 ็ผๅท1019167็ๅฐๅธ
ๅฅๅธๆบ\n159004 0 ่ฎฉ็ไปๅ ๅๅป็ๆฅ็ปไปๅไพ่ก3ใๆไป็ฒโไฝฟ่่คๆถฆ\n1000 Processed\n classify content\n159500 0 ๅคฎ่งๅๆๅ
ๅฝๅ
โ็บข็ณโๅ
ๅน\n159501 0 ticwatch็้ณๅ
ๅฑ้ณๅ
ไธ็่กจ็ฐ่ฟๆฏ้ๅธธไธ้็\n159502 0 ็ๅๅฟ้ฆๆ้ญๆฏๅน่ฎญ็ญๅญฆๅๆ่บ่กจๆผๅจๅฟๅ
ๆ กๆๅฎคไธพ่ก\n159503 0 ้ญ่ใๅคง่ๅๅฐ่ฑx็ง่็ไผผไธ่ตท็ผ\n159504 0 โไธๆฏ่ฏดๆณๅพ้ขๅไบบไบบๅนณ็ญๅ\n1000 Processed\n classify content\n160000 0 ๆๅๅ็ไบๅบEGๅEHOME็ๆฏ่ต\n160001 0 ไธบไฝ่ฟๅปไผ็ง็ๅป็ๅซ็ๅถๅบฆๅคๅ่ดฃ้พ\n160002 0 ๅจๆฐไผฆ่ๅทๆผๅฑไผๅๅจๆฐไผฆๅๆๆผๅฑไผ็ๅฐ380ๆๅๅ ๅผ ็ฐ็ฅจ\n160003 0 ๅฎถ้ฟๅธฆๅ
ถๅฐๅป้ขๆฃๆฅๆถๅ็ฐ๏ผๅญฉๅญๅฟ่ไธ็ซ็ถๆ็ไธๆ น้็ถ็ฉไฝ\n160004 0 ๆฏๆฌก็ๅฐๅซไบบ็่ดจ็้ฝ่ฆๅจๅฟ้้ปๅฟตโไบบไธไธบๅทฑๅคฉ่ฏๅฐ็ญไบบไธไธบๅทฑๅคฉ่ฏๅฐ็ญโ\n1000 Processed\n classify content\n160500 0 ็ป่ฎพ่ฎกๅธWoody็ฏกๆนๅๆทฑๅพๆๅธๅ่ทฏ้ชๅนดๅ็ฑ\n160501 0 A่กๆด่ท็็ธ๏ผ้็ฐๅฝ้
ๅฏนๅฒๅบ้ๆๆณๆๅ
\n160502 0 x่ฑๅฏธJDIๅ
จ้ซๆธ
INCELLๅฑๅน\n160503 0 ่ฟไธช้ฎ้ขไธ่ฌๅ็ๅจไปฅไธๅบๆฏ๏ผๆ็ง็ๅฅณๆๅ็ไธๆฏไฝ ้ซ่ฐ้่ฎบ\n160504 0 xxๆฅๅ
จ็ๅคง้จๆ้ซๆฐๆธฉๅฐๅ่ณxxโไปฅไธ\n1000 Processed\n classify content\n161000 0 ไนๆตทๅธๅ
ฌๅฎๅฑๆปจๆฒณๅ
ฌๅฎๅๅฑๅๅฉๅธๆธๆฟ็ฎก็็ซๆๆไฝไธบ\n161001 1 ็พๅฅณไปฌไธๅ
ซ่ๅฟซไนใๅบ็ฅไธๅ
ซ่ๅกๅจๆฌๅบ่ดญไนฐxxxๅ
ไบงๅๅฐฑ็ปxxๅ
ๆด้ขๅฅถไธ็ถใไนฐxxxๅ
ไบงๅๅฐฑ็ป...\n161002 0 Blackmoresๆพณไฝณๅฎๆทฑๆตท้ฑผๆฒน่ถๅ1000mg400็ฒ็นไปท220\n161003 0 ๅซๅฟ่ฎฐๆจ่็ๆถๅ่ฎฉๅฏนๆน่พๅ
ฅๆจ่ไบบๆๆบๅท็ \n161004 0 ๆด้ณ็้ไนกๅผบๆจๅไธไฟ้ฉๅทฅไฝไบบๅ๏ผโไธไบคไธ่กโ\n1000 Processed\n classify content\n161500 0 ็ฆๆญปไบๅไธชๅฝ็ซ่ฝฆๆฑฝ่ฝฆ้ฃๆบ้ฝ่ฆๅไธ้โฆโฆโฆโฆ\n161501 0 FESCOๅๆฌกๅ
ไธดhaolleeCafeๅๅก\n161502 0 ๆๅฐฑๅตๅตๅตๅตๅตๅตไบๆ่ฟๆฏๅ
ฑไบงไธปไน็ๆฅ็ญไบบๅข\n161503 0 ๅจWVไผๅ็ฝ็ซๅช่ฎข่ดญๆ
ๆธธๅฅ้คใๆบ็ฅจๆ้
ๅบ็ญ\n161504 0 ไบ็นไธๅไบ็้ฃๆบ่ฟๆๅจไบๆไปฌไนๅ้ฃ่ตฐไบ\n1000 Processed\n classify content\n162000 0 ๅไบซ่กๆคๅฃซๆๆฏไธ่ก็คผ็ๅๆๅพ็๏ผๅคๅฝๅฏไบไปฃ\n162001 0 ๆจๅคฉๆไธ9็น45้ฃๆบๅปถ่ฏฏๅฐ3็น45\n162002 0 Bigbangๅไบฌๆผๅฑไผ\n162003 0 ่ฏฅ่ฝฆ่พๅฐๆญ่ฝฝๅ
่ฟ็LED็
งๆๆๆฏ\n162004 0 ๅ
จๅฝๅฐไป8ๆ15ๅท่ตทไฝฟ็จๆฐ็\"ๅนณๅฃคๆถ้ด\"\n1000 Processed\n classify content\n162500 0 10ๆฌพๆ็ฎ้ฃๆ ผ่ฎพ่ฎก็่
่กจ\n162501 0 ่ดจ็็คพไผๅจ่ๅฐ็ๆฏๆดไธชๅฝๅฎถๅ็คพไผ็ๅบๆฌๆๆถ\n162502 0 7๏ผๆฐ็ๅพ็ฆๆ๏ผ่พน้ฒไฝๆๅฐๆจๅฑ\n162503 1 ไฝ ๅฅฝใๆๆฏๅๅค้ถ่ก็ๅผ ็ป็ใๆ่ดน็จx-x.xใ้ขๅบฆxxไธ๏ผๅช้ๆไพ่บซไปฝ่ฏ๏ผๅทฅไฝ่ฏๆ่ฏขxxxx...\n162504 0 ็ฑ6ไธช่ฑๅฝไบบๅไธไฝๆฅๆฌไบบๅ
ฑๅๅถๆ\n1000 Processed\n classify content\n163000 1 ็ๆฃ่
็ซ้ฉฌ่งๆ๏ผๅ
่ดน่ฏๅ๏ผๅ็งไผๆ ๆดปๅจ่ฟ่กไธญ็ฌๆฌข่ฟ่ฏฆ่ฏข๏ผๅฅๅบท็ญ็บฟxxxx-xxxxxxxx ...\n163001 0 ๅฆๅ
ๅฝขๆไธไธคไธชๅๅฑฅๅธฆ่ฟๅฏไปฅๅพๅ็ฟปๆ\n163002 0 ๆ ๆ่ต้่ขซๆ่ทไบใ่่ต่ต้่ขซๅๆญปไบใๅฏนๅฒ่ต้่ขซๆผๆไบใๆปๅคด่ต้้ๅฒธ่ง็ซไบใๅชๅฉๅฝๅฎถ้ๅๆญปๅคๅคดไบ\n163003 0 ๆๅฐฑๆณ่
พ่ฎฏๆฏไธๆฏไธๆฏๆฌไบบไบ\n163004 0 ่ฝ็ๅฐไธไธชๅฐ้ฃๆบๆญฃๅจไบๅฑไธ้ฃ่ก\n1000 Processed\n classify content\n163500 0 ๅฎฟ่ฟๆชๆฅ็ผ็ปโไธ็บตไธคๆจชโ้่ทฏ็ฝ\n163501 0 ๅจ้ฃๆบไธ็ๆฅ็บธๆถ็ๅฐไบๆๅฎถๅฎๅฎ~\n163502 0 ่ฟๆๆ่ฟไธไธชๅจ4sๅบไนฐๅฐไบๆ่ฝฆไธ่ฏ\n163503 0 ๆจ่่ๅธๅกไน็ซๆ่่ๅธๅกไน็ซ\n163504 0 ่ๆฏๅฐ่ฝฆ้ๅ่ตท็น็ญๆฅ่ญฆๅฏๅค็\n1000 Processed\n classify content\n164000 1 ๅฅฝๆถๆฏ๏ผ xๆxx xx xxๅท ๆณฐๅฝๆธ
่ฟๅ้ฃxๅคฉxๆ ็นไปทxxxxๅ
ไธๅซxxx่ฝ...\n164001 0 ไผไฟฎๆฌไผไฟฎๅๅไนๅๅจๆญคๅๆๅไธๆฌกๅบ่ดฉ\n164002 0 ไฝๆฏxxๅนด้ฃไผxxxxไธๅ็ฐๅจ่ฏๅฎไธๆฏไธไธชๆฆๅฟต\n164003 0 ๅทๅคๅทๅค๏ฝAmiๅฐๅจๅจ้ถๅท็บฟๅไฟ้ๅฆ\n164004 0 /ๆฏ่่ญฆๅฏๅปๆฏๆธๆฐๆก๏ผๆญป่
ไธญxๆช\n1000 Processed\n classify content\n164500 0 ็ถๅตๅ1997ๅนดไบๆดฒ้่ๅฑๆบ็ๅไนๅ้ๅธธ็ธไผผ\n164501 0 ่ฝฆ่ฝฝKTVไปไน็ๆ็ถๆฏๆๅฆน็ฅๅจๅ~~่ๅธๆบไน่ฏทๅธฆไธๆ\n164502 0 ็ผ่พไป็ปๅ
ดๆท็ฒๆฏๆฏ่พพ4Sๅบ่ทๆ๏ผๅบๅ
ๆๅจ็ฐ่ฝฆ้ๅฎ\n164503 0 ๆฏไธชๅจไบ12็น้ฝๅฎๅจ็ต่ๅๅฏๆฏๆฏไธๆฌก้ฝๆขไธๅฐ\n164504 0 ๆ็ฝๅๅปๆ้ฃๆบๅบๅๆบ่พพไธๅ\n1000 Processed\n classify content\n165000 0 xxๅท้ฃๅคฉๆๅชๅปไบATMๆบๅป่ฟ้ฑ\n165001 0 ๅฐฑๅจ็ฆๆฅ็ๆถๅๅฐๅท่ฃ
ไฝ่ชๅทฑๆฏๆณๅฝไบบๆฅๆ
ๆธธ็\n165002 0 ๅญๅฅๆฆๆฌๆด้จ่ฑๅ้ชจ็ไธป่ฆไบบ็ฉๅ้ชจ\n165003 0 ็ญๆไธๆกไธๆ็็ธ็้ๆดฒๅธ้ฑผ\n165004 0 ๆๆ็ปไบโๆ็ฅ่ฑๅ้ชจๆฏ่ชๅทฑ็ๆญปๅซ\n1000 Processed\n classify content\n165500 0 ็ฑไบ่กๆฟๆไนฆๅจๆณ้ขไธ่ตทไฝ็จ\n165501 0 ๅทฒ็ปๅฟซ้ชไผดtfboys2ๅนดไบ\n165502 0 ๆข็ถ้ๆฉ่ฟๆณๅฐฑไธๅบ่ฏฅไบซๅๆๅฉ\n165503 1 ใๅxxๅฅๅ
ๅฐๅ
ๅพๆฏไบบ้ๅคไธๅฅ๏ผ้ขๅฆ็ๆค่ค้บ่ไปทไนฐไธ็ถ้ๅๆฌพไธ็ถ๏ผไนๆฏ้้xxๅฅ๏ผๆฐ้ๆ้ๅ...\n165504 0 ่
พ่ฎฏๆฟไบงไปๆๆๆฟไบงไฟกๆฏ็ฝๅ
ฌๅธ็้ๆฟๅ
ฌๅ็ป่ฎก\n1000 Processed\n classify content\n166000 0 ไธญๅฝๆฟๅฐไบงๆฅ8ๆ3ๆฅๆฅ้๏ผๆฒณๅๆฐ็ฐ็ฝฎไธๅฏๅธธ็็ๆฅ\n166001 0 ่ฟๅฐฑๆฏ้ๅฐๅคๆฏไธญ้ขๆณๅฎๆณๅพๆไนฆ้ ๅ้ทๅฎณๅฝไบไบบ็ๆฏ้ๅฐๅคๆฏไธๅคงไธ้ป\n166002 0 ๅฅๅญ่ฎพ่ฎกไธ็้ฎ้ขๅจไป้ฃ้้ฝๆฏไธๆฏ้ฎ้ขๅฆ\n166003 0 ๆๅไบไบๅๅ ๅนด็ๆคๅฃซโฆโฆโๆ็ฒ่ฏด๏ผโๅคชๅฅฝไบ\n166004 0 ๅพๅทไบ้พๅบ็คพๅบๅป้ข็ปๅฎๅฎๆฅ็งๅฟซๅฐๆ็่ช่ดน่ฒ่\n1000 Processed\n classify content\n166500 1 ๆ้ฝๆฅ้ไธๅ็ๅญ๏ผๅฎถๅธธ่๏ผไนกๆ่้ฃ็งๅฝขๅผ็๏ผๅฐๆนๅชๆxxxๅคไธชๅนณๆน๏ผๅฐๅๅจไบ็ฏ่ทฏไธไบๆฎต้ๅฑ
ๅฏบ...\n166501 0 ็็ชไบๅ
ฌๅธ็ฉๅ่ฟๆ่ตทๅณๅจไปฒ่ฃ\n166502 0 ้ฃๅ452bๆ็ๅคง็บฆ้่ฆไธๅไธๅนด\n166503 0 ๅฎๅๅพ็ป่ฆไบ่็พๅง่ฆๆฑๆๅ
ณ้จ้จๆฅไธฅๆฅ\n166504 0 ๅๅท็ 15173253196ไฝฟ็จๅฐๆฌๆๅบ\n1000 Processed\n classify content\n167000 0 7ๆ13ๆฅ่ชๆฒปๅบๅ
ๅง็ฌฌไธๅทก่ง็ปๅ่ฅฟไนกๅกๅบๅ้ฆๅทก่งๆ
ๅต\n167001 0 ๅฝๅนดSurfaceRT็ๅฉๆขฆๆฏๅฆไผๅปถ็ปญ\n167002 0 ๅ่ตฐไธๆฌกๅไบฌๅ๏ฝ่ฟๆฌกๆฒกๆถ้ด้ๅไบฌๅคงๅฑ ๆ็บชๅฟต้ฆไบ๏ฝไธๆฌกๆๆบไผๆๆณๅๅปไธๆฌก\n167003 0 ่ฟไธคๅคฉๆๅๅ้ๅคดๆ้ฒๆๅข่ดญๆดปๅจๅฆ\n167004 0 ๆฐๅผๆฅไปฃๅพฎๅบ~ๅไฝๅคง็พๅฅณไปฌๆ้ฎ้ขๆฌข่ฟๅจ่ฏขๅฆ\n1000 Processed\n classify content\n167500 0 ไป่ฎฉ23ๅทๆไธบ้ฃ้กๅ
จ็ใๅฎถๅปๆทๆ็ๅท็ \n167501 0 ไปไปฌๆ็พค็ป้ๅฐ่ดจ็่ฟไธชๆฐๆ็ไธๅ\n167502 0 ๅจๆฆๅๅบๆขๅซ3ๅฎ็่ฟๆณ็ฏ็ฝชไบๅฎ\n167503 0 ๅฐฑๆ
ไฝ้้~~~ไฝไธบไธๅๆ่ดฃไปปๅฟๆปดโๆ็ๆโ\n167504 0 ่ฟๅฏๅจNetSuiteใEloquaๅMarketo็ญ็ณป็ปไธญ้กบ็
่ฟ่ก\n1000 Processed\n classify content\n168000 0 ๅๅฎไธป่ฅไธๅก๏ผ็ณๅๅๅทฅๅๅค็่ชๅจๅๆๅฅ่ฎพๅค็็ ๅใ็ไบงๅ้ๅฎ\n168001 0 ไธญ้ด้ฃๆบ่ฟๅปถ่ฏฏไบ11ไธชๅฐๆถๅ็งๅๆถ\n168002 0 ๅฎ้ฆ้ๅขใๅฎ้ฆๅ
ฅไธปๆ่กๆฐ็ไธๆฟ่ฟ้ฃๆ ผ็ธๅ
ณๅคง้ๆงไป
ๅทฎไฟกๆใ\n168003 0 ไธๅ่ๅxxๅ็ช่ณๅฎๅพฝ็่ๅ ๅธๆๅคงๅญฆๅญฆ็ๅฎฟ่ๆฅผๅ
\n168004 0 โ้ขๅผโๆไบ่ฟๆฌพๆๆบ็ไธป่ฆๅ็น\n1000 Processed\n classify content\n168500 0 MLGB่ฏดๅฅฝ็LGDๅ
ๆๅขCDECๅ ๆฒน~\n168501 0 ๅญฆไน ่ฃ
็ต่็ปๆ็ป่ฃ
ๆ็ ๅคดไบ\n168502 0 ๆๅไธไธ็่ดชๅฎๅฐๆญปๆ ่ฌ่บซไนๅฐไนๅฐ\n168503 0 NBA็้LOGOๅๅ้ๆ็\n168504 0 ้ข่ฎกxxxxๅนดไธๅๅนด่ฏฅๅธๅบ่งๆจกๅฐ่พพxxxxx\n1000 Processed\n classify content\n169000 0 ๆๆถๅ
ฅxwไปฅไธๅฏ่่ๆฌง็พไธญ็ซฏๆธธ\n169001 0 ่ชxxxxๅนดxxๆๅๅ ๅทฅไฝไปฅๆฅ\n169002 0 ๅฆไฝ้ๆฉๆๆฐๆ่ดงไธๆๆฐETF\n169003 0 ้ๅฎๅฟๆ่ฒ็ณป็ป2015ๅนดๆๆ้ขๅ็คพไผๅ
ฌๅผๆ่ๆๅธ196ๅ\n169004 0 ไธๆณๅๆฏไธๆณๅๅๆญฃๆ้ฝๅฅฝๆๆ่ฏด\n1000 Processed\n classify content\n169500 0 ๆตๆฑ้ๅ็่ฑ็็ซนๅบ็็ธไบๆ
\n169501 1 ๆจๅฅฝ๏ผๆๆฏๆตทๆดๅๅฒๅพ็ฝฎไธ้กพ้ฎๅฎ่พ๏ผๆๅคฉๆไปฌๆจๅบไบๅๅฅ็นไปทๆฟ๏ผๆจๆๅคฉ่ฟๆฅๅฏไปฅ็ไธ๏ผ่ฎฐๅพ่ฟๆฅๆพๅฎ...\n169502 0 ๅๅฆ่ฑๅ้ชจไน็ฉๆๅๅใๆๅไธไธชไบฎ็ไบ\n169503 0 ็พ็ฝ็ฅๆๆ็จๆนๆณ๏ผ1็ฒ/ๆฅ้้คๆ็จ\n169504 0 ๅฎคๅ
่ตฐ้็่ฎพ่ฎก็ตๆๆบ่ชไบโๅฑฑๆดโ\n1000 Processed\n classify content\n170000 0 ไบ่็ฝๅป็ๆชๆฅ็6ไธช่ถๅฟ\n170001 0 ไธ้~~~่ฟ2็ถๆญฃๅฅฝ้ๅๆ
ๆธธๆถๅๅธฆ\n170002 0 ๅฐฑๆฏๅจ่ฎคๆธ
็ๆดป็็ธไนๅไป็ถ็ญ็ฑ็ๆดป็ฝๆผ็ฝๅ
ฐ\n170003 0 ่ถฃๅป้ขๆๅฎขๆท็ซฏๅญๅจxssๅฏๅฏผ่ด่ฟๅๅฎถๅป้ขๅ600w็จๆทไฟกๆฏๆณ้ฒ\n170004 0 ๆๅฟไธๅขๅ
ๅคๅฎขๅ็ญพ่ฎข20ไธช้กน็ฎ\n1000 Processed\n classify content\n170500 1 ้กพๅฎข๏ผไฝ ๅฅฝ๏ผ่ดๅ ็พๅฉดๅนผๅฟๅฅถ็ฒไบxๆxๆฅไธxๆxๆฅไธไบๆ๏ผ็นไปท้คๅคใๅๅ
ดๆฐธ่พxๅบ่ดๅ ็พ\n170501 1 ไปๅคฉๆฏๅ
ๅฎต่๏ผ็ฅๅไฝ่ๆฟ็ๆๅ
ด้๏ผ่ดขๆบๆปๆป๏ผ๎ฏ๎ฏใ ไผๅฉๅฅถ็็ฐ็นไพxxxๅ
/ไปถ๏ผ้่ฆ...\n170502 1 ๅฐๆฌ็ๅฎขๆท๏ผๆจๅฅฝ๏ผx.x--x.xxๆฅๅๅฎไฟๆฌxxxๅคฉ๏ผx.x%๏ผไฟๆฌxxxๅคฉx.x%๏ผไฟๆฌ...\n170503 0 ๆฎFIBAไบๆดฒ็ฏฎ็ไธๅฎถEnzoFlojoๅจtwitterไธ้้ฒ\n170504 0 ๅจๅ่พๅบๅ
8ๅฎถๅฑ
ๅงไผ็1ๆโ3ๅฒๅ
ฑ14ไธชๅนด้พ็ป196ๅๅฟ็ซฅๅไธๆดปๅจ\n1000 Processed\n classify content\n171000 0 xxๅ็พๅฅณๅฏๅฑ้ฟ่ฃธ่พๆฑ่ๆญๅฎๅบๆฝ่งๅ\n171001 0 ้ฃไบไปฃ่กจ็80ๅ็ถๆฏ็ซฅๅนดๆถๅ
็ๆธธๆ\n171002 0 ่ช็ถไฟฎ้ข้ฒๆฐด้ฒๆ้็ฆปโฆๆ็โๅบ้จ้โ\n171003 0 ๅพๆณๅฟซ่ทณไธ้ฃๆบๅพๆณ็ซๅปๅฐ่พพ\n171004 0 ๆๆฒกๆๆๆฑ่้ซไธญ็่ฏญๆ็ฌ่ฎฐ\n1000 Processed\n classify content\n171500 0 ๅธๅๅบไบบๆฐๆณ้ขๆกไปถๅ็ๆฐๅ็ฐๅคงๅน
ๅข้ฟ็่ถๅฟ\n171501 0 ้ฟ่ช่ญฆๆน2ๆๅคๆถ้ดๆฅ่ท40ๅ่ฟๆณๆถๆฐดไบบๅ\n171502 0 ๆๆญฃๅจ็ๆตๆฑ4ๅฒๅฅณๅญฉๆๅ
ฅ็ณ็ฐๆฑ 90%่่ค็งไผค็ๅฝๅๅฑโโไธญๅฝ้ๅนด็ฝ่งฆๅฑ็\n171503 0 ๆทฎๅฎๆทฎๆตทๅ่ทฏๅ่งฃๆพไธ่ทฏๅๅญ่ทฏๅฃๅ\n171504 0 ็ตๆขฏ่กไธๆฏๅฆไนๆฏ่ฃ
ๅคๅถ้ ไธ\n1000 Processed\n classify content\n172000 0 ็งฐๆๆบๅ็ฝ่ทฏ้ฎ็ฎฑๆถๅฐๅ ไธชๆถไธ็ญ้จโ็ไบบ็งโ่็ฎ็ป็ไฟกๆฏๅ็ต่ฏ\n172001 0 ่ฝๅธฆ็นๅไธๅณไฝๆฏไบบxxๅ
็้จ็ฅจ่ฟ่ฝๆฅๅ\n172002 0 ๆฉๆจๆณ็จ็ต่็ต่ๅไบ็ถๅไธญๅๆๅๅ็ฐๆๆบๅ
ณๆบ้ฎๆไธๅจไบ\n172003 0 ๆ็ไปไปฌไนๆฏ่ตฐ็ง้ๅข็ไธไปฝๅญ\n172004 0 ๅไธบๅๅไฝไผไผด้็นไผ ้ISVๆดไฝๅไฝ็ญ็ฅ\n1000 Processed\n classify content\n172500 0 ๅจๆซไธๅๅฐไธๆฑๆปจcolletteๅฐๅ\n172501 0 ๅฎซๆ็QQ็พคๅ่ดดๅงๅๅฑๅทฒ็ปๅๅๅฃฎ\n172502 0 ๅฆๅญๅฎซ่็คใๅญๅฎซๅ
่็
ๅใไนณ่
บ็
ๅ็ญ\n172503 0 ็พๅ
็่ดขไบงๅๅนดๅๆถ็็่พไฝ\n172504 0 visualstudio2012ๅ\n1000 Processed\n classify content\n173000 0 BornFreeDecoBottleGiftSetๅฅถ็ถ่ถ
ๅผ็คผ็ๅฅ่ฃ
$22\n173001 0 ็ๅฎ่ฑๅ้ชจๆๆณๅฏนๆ้ก้่ฏดโ่ง่ฟ่ชๆ็\n173002 0 ไปไปฌไธๅๅฐ่ดจ็ใๆฟๅฑใ้ผๅจไปไบบ\n173003 0 ่ฏฅๅฐๅบ็ตๆขฏ็ช็ถไป20ๆฅผๆป่ณ14ๆฅผ\n173004 0 ๅๆ็ๅฐไธไธชไบบ่ฟๅจ่ๅทๆฒณ่พนไธ่ทๆญฅ\n1000 Processed\n classify content\n173500 0 ไธญๅฝ่ฟๆด601919ๆฏไปๅคฉไธญๅญๅคดๆๅผบ็็ฅจ\n173501 0 ่ฟ่ฝฎ่่ต็ฑCharlesRiverVentures้ขๆ\n173502 0 ๅ็ฐ็พๅบฆๅฐๅพ็ๆธ
ๆฐๅ่ฝ~share\n173503 0 ไธคไธชdealๅๆถlive็ๅฟ่ฆไบบๅฝ\n173504 0 ๆทก็ปฟ่ฒๆถฒไฝๆฏๆฅ้ด็ฒพๅๆฐด่ดจๅฐ่ฝป่ๅ็ฝ\n1000 Processed\n classify content\n174000 0 ๆฑ่ๅซ่ง็็ท็ฅๅผ ๆฐๅฆไฝๅ่บซๅ่่ฏๅ \n174001 0 ๅคๆฆๆๆฏๆกไธคๅฎกๅๅคๆญปๅ็็ปๆ่ฎฉไปๆชๅฟ\n174002 0 ๅช่ฆๆฅๅคไบๆถๆๅๅค็ปฉๅทฎ่กไบๆ่ก็้ฃไบ้ปๅบ\n174003 0 X้ๅบฆ็ซ่ตทๆฅwwใฆใซใใฉใใณใๅคงๅฅฝใใชใฎ\n174004 1 ๅฎๅฎๅฎถ้ฟ๏ผๆจๅฅฝ๏ผๆๆฏๅฏ็ๅฎ่ดๆฉๆไธญๅฟ็ๅข่ๅธ๏ผๆไปฌๅญฆๆ กๆ่ฟๅจๅไผๆ ๆดปๅจ๏ผๆ็ญๅฏไปฅๅ
่ดน่ฏๆไธไธช...\n1000 Processed\n classify content\n174500 0 ๅฐฑๆฏๆฐ็ๆผๅๅๆ่ตๆนๆ่ฏดไธๅบๅฃ็ๅฉ็ๅ
ณ็ณป\n174501 0 ๆๆ่ๅฐไธไธช้ฎ้ขไธไธชไบบๅจๆๆ
้ๆขๅฝ่ฟๅจๅๅๅฝ่ฃๅคๅฅนๅฏนๆ่ฏดๆไปฌไธ่ตท่ทๆญฅๅง่ท็ๆถๅ่ฏดไธๅฌไฝ ไธ่ฝ่ฟ...\n174502 1 ไธไธๅ็ๆฟไบงๆตๆผ่ดทๆฌพใๆ ๆตๆผ่ดทๆฌพใ็้พ่ดทๆฌพ๏ผๅๅนดๆถ็xx%่ตท็ๅ็ง็ฑปๅ็็่ดขไบงๅ๏ผPxPใๅบ...\n174503 1 xๆxไธxๆฅๅฆไธๆๅณฐๅคงไธ็ๆ้ฅฐๆปกxx๏ผxxx๏ผxxxๅ
ๅxxๅ
ๅๆฅๅxxๅ
ๅธ๏ผๅ
จๅบ่ดญ็ฉๆปก้ข่ต ...\n174504 0 ็่ฟไธๅญฃๅฅฝๅฃฐ้ณๆ่งๆฒกๆๅไธไธๅญฃ่ฎฉไบบๅฐ่ฑกๆทฑๅป็ๅญฆๅ\n1000 Processed\n classify content\n175000 0 ๆ็ซ็ถๅจ็ๆฑ่ๅซ่งไธญๆ้
้ณ็็็ทไธปๅซ้่ฐญ่ฟๆฏ้ๆฝญ็โฆโฆ็ปงๆฟ่
ไปฌ\n175001 0 ๅฝฉ็ฅจๆๆณจๆถ้ดไธบ7ๆ27ๆฅไธๅ5็น44ๅ\n175002 0 ๆฟๆจๅๅฎถไบบๅฅๅบทๅนธ็ฆโฆโฆ\n175003 0 ไฝ ไปฌไปๅต็ๆฏๆฅๆขๅซ็่ฟๆฏๆฅ้ๆ็\n175004 0 ๆฑฝ่ฝฆๆน่ฃ
13063876999\n1000 Processed\n classify content\n175500 0 ็ฝ้??ๅไบฌๅฐๅบๅฏไปฅ่ชๅ็ๆฅ\n175501 0 abuseofpowerๅฐฑๆฏใๆฟซ็จๆฌๅใ\n175502 1 x.x่ๅฝๅคฉๆฌง้
้กฟๆค่คๅๅ
จๅบx.xๆ๏ผ่ๅฎขๆทๅฝๅคฉ่ฟๅบ่ฟๅฏ้ขๅ็ฒพ็พ็คผๅไธไปฝใๅฟซไนx.x่๏ผๆฌขไน...\n175503 1 ไบฒ็ฑ็ไผๅ๏ผๆฐๅนดๅฅฝ[็ซ็ฐ]ๆผๅฆฎ่ฌๅ
่กฃๅ็็ๅบไบไธๆxๆxๆฅ๏ฝxๆxๆฅไธพ่กๅ
่กฃ็นๆ ๆดปๅจ๏ผไธคไปถx...\n175504 0 ่ฆๆฟๆดปๆๆบๅกๅฐฑ่ฆๅ
ๅผ1200ๅ
\n1000 Processed\n classify content\n176000 0 ไธๅxxxๅนณๆน็ฑณ็็ปฟๅๅธฆ่ขซไบบๆฏๅ\n176001 0 ไปไปฌๆๅจ็ๅฐๅบๅ
ฑๅ็็ตๆขฏๆ
้ๅฐ่ฟ400ๆฌก\n176002 0 ็ตๆขฏๅไบบไบไปถๅๅ ๅจๅคฎ่ง่ฎฐ่
็ป่ด้่ฎฟไธๆฐด่ฝ็ณๅบ\n176003 1 ๅฎ๏ผๆจๅ
ฐๅฑฑxAๆฏๅบๅ
ใ้ป้ๅฐๆฎต้จ้ขๆฟxxxๅนณๆน๏ผxxxไธ๏ผไธค่ฏ้ฝxxxxxxxxxxx\n176004 0 ็็ๆณ่ฏด่ชๅทฑๅฅฝ็ฌจๅบ้จๆฒกๆๅธฆ้ฅๅ่ไธๅบ้จ่ฟๆฒกๆ็ปๆๆบๅ
็ต\n1000 Processed\n classify content\n176500 0 ไบๅๆบๅบ้ๅขใๆๆๆบๅบใ็ๆฐ็จๆบๅบๅ
ฌๅฎๅฑ็ญๅๆๆดๅไฝ็ซๅณ่ตถ่ตด็ฐๅบ\n176501 0 ็ๅๆๆฅ่ฎฐ่
้่ฎฟไบๅธๆ็่่ฏไธญๅฟ็ธๅ
ณ่ด่ดฃไบบ\n176502 0 ไนๆฏไธๅคไธๅฏๅคๅพ็ๆ
ๆธธๆฏๅบ\n176503 0 6ใ่งไบบๅๆณ๏ผ่ฎฒไฟก็จใๆ ๅฎๆฐใๆๆก็ใๅฐๅคง่ฏ\n176504 1 ๆฐๅนดๅฅฝ๏ผๆๆฏๆน่ฅฟ็งๅจ็็ฝ็ป็ใ็ฅๆจๅจๆฐ็ไธๅนด้ๅฟๆณไบๆ๏ผไธไบๅฆๆ๏ผๆญๅๅ่ดข๏ผ็ฐๅจๆฌ้คๅ
ไปฅๅผไธ...\n1000 Processed\n classify content\n177000 0 ๅนถๅจ่ฏ่ฎผ็ปๆๅบๆฅๅไธๅๅๅบ\n177001 0 ็ฒ้ชจๆ็โ่กโๆฏไธไธชๅๅญ่ทฏๅฃ\n177002 0 ๅพฎ่ฝฏๅ
ฌๅธWindowsxxๅฎถๅบญ็ไปทๆ ผไธบxxx็พๅ
\n177003 0 ไธ็งๆ้จไบบ็็ณๅจ็ฌฌไธๅฑๆทฑๅณๅฝ้
ไฝ็ขณๅ่ฎบๅไธ็งฐ\n177004 0 ไนๆฏๅฝๅฎถๆ
ๆธธๆปๅฑ็กฎๅฎ็ๅ
จๅฝๆ
ๆธธๆฏ็นไนไธ\n1000 Processed\n classify content\n177500 0 2๏ผๅ
ฌ่ฏๅ้ๅฝๆไธบไปไนไปๆฅไธๅจๆญคไธบๅฝไบไบบๅๅ
ฌ่ฏ\n177501 0 6็น่ตทๆฅ็ๅฐig็ฌฌไธๆ็ขพๅ็ดๆฅๅบๅปๅป้ข่พๆถฒไบๅฟซๅฐไบๅ็ฐ็ฟป็ไบๅฟๆณ็จณไบ่พ็ๆถๅๆฒกwifi็ดๆฅไนฐ...\n177502 0 ๅ ไธบๆ้ฉฌไธๅฐฑ่ฆไน้ฃๆบไบๆไปฅไผๆฏๅไฝๆๅฌๅฐๅข\n177503 0 ๅ้กใฎTwitterใงใ\n177504 0 ่คถ็ฑ็่ฎพ่ฎก็พๆญๅๅ
ทๆๅฐๅฅณๆ\n1000 Processed\n classify content\n178000 0 insider็จๆทๅฏไปฅ้ๆฉๅ็บงๅฐwindows10\n178001 1 ๅ
่ดนๅจ่ฏขxxxxxxxxxxxๆๆฐ็ๆบ้บปๆงๅถๅจ๏ผไธๅฎ่ฃ
๏ผ่ตทๆๆฟๅฅฝ็๏ผๅฏ็ฐๅบ่ฏ็จ๏ผๆปกๆๅไนฐใq\n178002 0 ๆจๅคฉ่งๅฐ็่
่ดฅ็ๆฏๆๅฐๆไบ\n178003 0 4ใไธ่ฆ็ขฐๆ่ช่่ตๅซ็็ๅนณๅฐๅฏนไบP2P\n178004 0 ไฟกๆฏๆฅๆบ๏ผ็ฆๅปบ้ซ่ไฟกๆฏๅนณๅฐ\n1000 Processed\n classify content\n178500 0 โใใๅฐ็ฝๆฐๅ้ฎ๏ผโ็ๆๅข\n178501 0 ๆ่
พๅฐ็ฐๅจ็ปไบๆpreview็ไบไธ้\n178502 0 ่ฏทๆไฝๆถๆบ็ๅฎ่ดไปฌๆฅๅจ่ฏขๅง'\n178503 0 ่ด่ดฃๅบไบffmpeg็่ง้ข่งฃ็ ๆญๆพๅๆ ผๅผๅ
ผๅฎน้้
\n178504 0 ๅจ่ฟ้ฃ้ซ้5KMๅคๅ ๅฎๆฝ้ฅฎ้
ๅ้ฉพ้ฉถๆบๅจ่ฝฆ็่ฟๆณ่กไธบ\n1000 Processed\n classify content\n179000 0 ่ๆฏๅจไบๆณๅฎ็ไบบๅใไบบๆ ผๅๅไบบ็่ฏ็ฅ\n179001 0 ๅ
ทไฝๆง่ฝๅฏไปฅ่ช่ก็พๅบฆ๏ฝๅ ๆขๆๆบๆไปฅ้ฒ็ฝฎ็\n179002 0 ่ฎฒ่ฟฐไบxxๅฒ็ไนๅคฉๆดพๆผซ็ปๅฎถ็้กฟๅ ๆฃ็็่บซๅคไบบ็ๆ่ฐ้พ็ๆถๅปไฝๅๆ ทๅฏน็ๅฝ่ฟๅพฎ็ฌ็ๆ
ไบ\n179003 0 ่ฟๆๅฝ็ใๆๆกไฝ็ใๆฅๆงๅฝๅ็ใ้ผป็็ญ\n179004 0 ่ฝปไปๆ่ต่
ๅฏ้ๅบฆๆๆกไธช่กๆฟๅๆบไผ\n1000 Processed\n classify content\n179500 0 ่ฟๆ ทไธ่ฏพไธไฝไพตๅ ไบๅญฆ็ๆๆไผๆฏ\n179501 0 ่ฟไนๅคงไบ่ฟ็งๆ
ๅตไธ็ฎๅฆๅฆ่ฎฉไป่ทไปไธ่ตท็ก\n179502 0 ไฝไฝ่ขซ็ธ็ธๅธฆๅฐๅไบฌ็ญๆฅ่งๅๅปไบ\n179503 0 ๅฆๆไฝ ่ฎคไธบRolfBuchholz็ๅคงๅคๆฐ็ฉฟๅญๅจๅคด้จๅ้ข้จ\n179504 0 ไปไปฌๅฐๅๅซ่ทๅพ็ฑๅพๅทๆ
้ญไฝๆฃไธญๅฟๆไพ็\n1000 Processed\n classify content\n180000 1 ๆถฟๅทใ็ๆดฅ่ฑๅญใ๏ผๆๅคฉๅผ็๏ผ้ฆไปxไธ๏ผๆ ๆฏๅซไปไธๅนด๏ผๅ
้จๅๅทฅ้ๆฟ๏ผไปทๆ ผ้ฝๆฏๆไฝ็๏ผๅไปทxxxx\n180001 0 ๆ็็ต่ๅ
้ฉฑๅๅๅๅไบไบไบไบ\n180002 0 ้ฒๆๅฎไน็พ็ฝใๆ็ฑใ้ฒ่กฐ่ไนๆ นๆฌ\n180003 0 ๆณฐๅฝTHANN็ดซ่ๅ่&\n180004 1 ไบฒ๏ผ ไธๆไฟ้โโๅนปๆดปๆฐ็ ๆถๅ
็คผ้๏ผ ไธใๅๅผ ่ฎขๅ่ดญไนฐไปปๆๅนปๆถๆๅนปๆถไฝณไบงๅๆฏๆปกxxxxๅ
๏ผ...\n1000 Processed\n classify content\n180500 0 ๅพฎ่ฝฏๅป้ผๅพฎ่ฝฏๅป้ผๅพฎ่ฝฏๅป้ผ\n180501 0 ่ฟ้ขๅซๆๆ่ฎธๆ่ฝๅๆฏๆ็ๅฝ็็ๅญ\n180502 1 ๅ็ณ่ตๆฌ-่พๆนๅค็ญ็ฅๅฏนๅฒๅขๅผบxๅท่ตไบง็ฎก็่ฎกๅๅฐไบๆฌๆไธญๆฌๅๅฎ๏ผ่ตท็นxxxไธ๏ผ้ขๆๅนดๅๆถ็xx...\n180503 0 ไธ็ธๅฃฐ็จๅฎ้ฉๅธ
ๅคซๅฆปไน้ด\n180504 0 ๅซๆ๏ผcharles837668\n1000 Processed\n classify content\n181000 0 ๆฅไธบไปไปฌไธไฝไธบ็่กไธบไปๅบไปฃไปท\n181001 0 ่็ญน่กIBMๅ่ๅๆๆฏใ่นๆใๅพฎ่ฝฏ\n181002 0 ๅกๅฐไบๅฆๅ
ๆ่กจๅฎขๆทๅฎๅถ่กจๅธฆไนๆฏๅ
จ้็ๅ้ๆๅ่ถณ\n181003 0 ็ถ่็็ๆๆบๅทๆฏไธไธชๅทฒ็ปไธ็จ็็ต่ฏ\n181004 0 ็ปๅไบฌ็ตๅๅ
ฌๅธ็ๅทฅไบบไปฌ็น่ต\n1000 Processed\n classify content\n181500 0 ็ธไบไน้ด่ฐ่ฎบ็ตๆขฏ็่ฎพ่ฎกใๅ็ใ็ปๆใ็ปดไฟ\n181501 1 ไธ็ไธๆๆ็ไบ็ฉ้ฝไผไธบ็พไธฝ่ไผ้
็ๅฅณไบบๆ่
ฐ๏ผๅ
ไธฝ็ผๅจไธๅบงๆ้ ็พไธฝไผ้
ๅฅณไบบ็ๆฎฟๅ ๏ผ??ๅจx'x...\n181502 0 ไบฆ่กจ็คบ็็บชๅฟต20ไธ็บช30ๅนดไปฃ้ฃ้กๅไบฌ็ๅไบฌ็ฌฌไธๆก่ฝจ้ไบค้โไบฌๅธ้่ทฏโ\n181503 0 ๆ่ง่ฑๅ้ชจๅ่ฟๆ ทไธไธชๆๆๅ้\n181504 0 ๆฅ่ญฆ19ๅ้่ญฆๅฏ่ฟๆฒกๅๆณๅบ่ญฆ\n1000 Processed\n classify content\n182000 0 ไบๅ็ๅงๅๅฏไนฆ่ฎฐไปๅ่ขซๅๅผ\n182001 0 ็นๅซๆฏๅไธบไฝ ๅฏนๅพ่ตทไฝ ไธ็500ๅผบ็ๅคด่กไน\n182002 0 MSCIๆๆๆชๆA่ก็บณๅ
ฅๅ
ถๆฐๅ
ดๅธๅบๆๆฐ็ๅณๅฎ\n182003 0 7่ฑๅ่ฟๅคใๅคด็ฎๅฑ่ฟๅค๏ผ็ผบ็ปด็็ด \n182004 0 ็ถๅ้กบไธฐๅฏๅบๅฐไปไธ็จ่ดๆ
ไธๅ้ฑ\n1000 Processed\n classify content\n182500 0 ไบๆฏๆ้คไบ็่ๆๅฐ็ๅคฉๅคฉๅไธ่ฟๆฏไป็ฝไธ็\n182501 0 ๆ่ฏๅๆขๅธไบบๆฐๆฃๅฏ้ขๆฐ่กๅคๅฐธไฝ็ด ้ค็็ฃไธไฝไธบๆๅผ่ดซๆฐ็็ผบๅพท่กไธบ\n182502 0 ๅผบ็ๅผๅๅป้ขไธบ70ๅฒไปฅไธ่ไบบ่ฎพ็ซๆๅท\n182503 0 ๆฟๅบไธๆนๆ ้ก้ณๅฑฑ็ๆฐด่ๆก??ๆๆฟไบไธ็ฎฑ\n182504 0 ๆๆฐไพ็ถๆๅๆฝxxxx็น้่ฟๆฏๆ็ๅฏ่ฝ\n1000 Processed\n classify content\n183000 0 ๆไธไธๅๆฒกๆณ็gmailๆฒกๆณ็จgoogle็ฎ็ด็ฆ่บๆญป\n183001 0 2ใๅ
จๅฝๅฏผๆธธ๏ผ็ฝไธๆณจๅๆถ้ด๏ผ6ๆ25ๆฅโ10ๆ9ๆฅ\n183002 0 โโๅ ไธบไปๅคฉๆฏๆไปฌๆฅๆ็ๆไธ่ดขๅฏ\n183003 0 ็ๅฐๆ็ฅจๅก่ฏดๅไบฌ1212โฆโฆโฆๆ่ง่ชๅทฑๅจ่ฏพไธๅฐฑ่ฆๅญๅบๅฃฐไบโฆโฆโฆ\n183004 0 ๆตทๅฃๅธๅ
ฌๅฎๅฑไบค่ญฆๆฏ้้ๅฏนๅธๆฐๅๆ ๅจๆตท็งไธ่ทฏๅฝฉ่นๅคฉๆกฅใไน้พ่ฅฟ่ทฏๆกฅใไธๆนไธ่งๆฑ ๅคง่ฝฌ็่ฎพ็ฝฎไบไธๅค้...\n1000 Processed\n classify content\n183500 0 FuturesandOtherDerivativesโtheseventheditionofโฆ\n183501 0 #NAME?\n183502 0 xไบฟๆปไปทๅทๆฐๆปจๆนๅๆปไปทๅฐ็็บชๅฝ\n183503 0 ไธญๅๆถๅฅฝ้ฒๆๆ็ไผๅจๅค้ข่ตฐไบxxๅ้\n183504 1 ๅฐๆฌ็ๅฎถ้ฟ๏ผๆจๅฅฝ๏ผๆไปฌๆฏๆดๆฐ้็ณๆ่ฒ๏ผๅฌๅปๆฅๆฅ๏ผๆฅๅญฃๅจๆซ่กฅไน ็ญๆญฃ็ซ็ญๆฅๅไธญ๏ผๆฌๅจๅจๆซๆฅๅๅฏไบซ...\n1000 Processed\n classify content\n184000 0 ็ฆๅปบใๆตๆฑๅคง้จใๆฑ่ฅฟๅคง้จใๆนๅไธ้จใๅนฟไธไธ้จใๅฎๅพฝๅ้จๅๅฐๆนพ็ญๅฐๆๅคงๅฐๆด้จ\n184001 0 ไธบ่ฟไธๆญฅๅ ๅผบๆณ้ขๆๅๅปบ่ฎพๅทฅไฝ\n184002 0 ๅจmysqlไธญไธๆฏๆmergeinto\n184003 0 ๅฐๅปบไปๅคฉไผไธ่ต้ฑไบ้ฝๆ่บซๆฟๅฐไบง\n184004 0 ่
่ดฅไปฝๅญไปๅ
้จ็ซ็ช่ฟ็งๅๅฟๅ\n1000 Processed\n classify content\n184500 0 ไฝ ไปฌ่ฟไบๅๅชๆขไธ่ฅฟๅๅฐฑ็ฎไบ่ฟๅถ้ ๅช้ณ\n184501 0 ๅชๆๆไปฌ็ๅทจๆๅฐๅฐๅจ้ฃๆบไธ่ดด้ข่\n184502 0 ไนๅ็่ฟไปไธไธไธช่็ฎๅฟซๆฌ่ฟๆฏไปไน\n184503 0 ไบบไปฌๅธธ่ฏดwexin๏ผๅซๆGDxxxxxxxx\n184504 0 ไฝ ไปๅจ็ไธๅนณๅ็็็ธ่ฏๅฎๆฏๅไบบ็\n1000 Processed\n classify content\n185000 0 ไธ็ถไผ็ ดๅ่็ๆต็่ฅๅ
ปๆๅ\n185001 0 ๅไบxxxx่ก็ฅ็ซๅไธไบๅบ้\n185002 0 ้ฎ้ขๆฅไบ๏ผNBA็ๆ่ฐๆดๆ็ปๆฒปๅ\n185003 0 ๅจ็ๅฐๅ่
ๆฝๅ็ปๆตๆๅคฑ387ไบฟๅๆถ็ผด201ไบฟ่ฟ็บชๆๅพ\n185004 0 ไปๅไบฌๅธ็ป่ฎกๅฑๅฎๆน็ฝ็ซ่ทๆ\n1000 Processed\n classify content\n185500 0 ๅ็ฎ็ฌ้ด้ฃๆบ่ณโฆโฆไธปๆไบบๆไธไฝ็ฌไบ\n185501 0 ๅซๆ๏ผbbyyccppๆข่ฆๅฏ็ฉ\n185502 0 ไปฅๆฐๅ
ดไธๆไธบๅผ้ขโๆตๆฑๆๅกโๆ่ตทๅ่พนๅคฉ\n185503 0 ๅฑ
ไฝ้ข็งฏ3259ๅฐบ/303ๅนณ็ฑณ\n185504 0 ๅฏไปฅ็บๆๅฐ่ช็ฉบๆฏ่ฐใๆๅ่ฐๅๅทกๆด่ฐ็ญ\n1000 Processed\n classify content\n186000 1 ๅปบๆๅฎถๅฑ
ๅๅคงๅ็ๅทฅๅ่็ๆจๅบxxxไฝๆฌพโไนฐๆญไปทโไบงๅไบxๆxxๆฅไธญๅxx:xx่กขๅท้ฅญๅบ้้ๆข...\n186001 0 600ๅคๆทไธๅๆๆ่ฟ็ๆ
ๅตไธๆฟๅบๅคๅฐไปถ\n186002 0 ่ฟๆถ้ฃๆบ็บฆๅไบบๆฐๅธ7000ไธๅ
\n186003 0 ไฝๆฏ็ฃๅฏ้ขxๆฅไปฅ้ญๆฐธ้ๆไปป่กไปฝๆ้ๅ
ฌๅธ่กๆฑ\n186004 0 ๅฅนไปฌๅฐ้ชไผดไฝ ็ดๅฐๆฐธ่ฟUPไธป๏ผๅฉ่ดๅฐ้้ญ\n1000 Processed\n classify content\n186500 0 ้ข่ฎก่ฏฅ่ช็ญๅฐไบxๆxxๆฅๅๅ้ฆ้ฃ\n186501 1 ๆจๅฅฝ๏ผ้่ดญ้ฆ้xxxxๅนฟๅท๏ผๆฐ๏ผๆตไฝๅฑ๏ผxxxๅฎถๆตไฝไผไธๅๆฅ่ชxxๅคไธชๅฝๅฎถ็ไธไธ่งไผ้ฝ่๏ผไธญ...\n186502 0 ๆธฃๅ่ฝฆๅธธ่ง็่ฟๆณ่กไธบไธป่ฆๆ6็ง๏ผ1ใไธๆ็
ง่งๅฎ็ๆถ้ดใ็บฟ่ทฏ่ก้ฉถ\n186503 0 ไธญๅๆถๅ่พๅ่ฏๆไธๅไธคๅคฉๅฐๅไฝ\n186504 0 ๅพ่
ๅฑ
็ถๅ ไธบ่ขซไธพๆฅๅๆๅไบๆไธๆญฃๅฝๅ
ณ็ณป่ๆค\n1000 Processed\n classify content\n187000 0 ไธญๅฝๅปบ็ญ้ๅฑ็ปๆๅไผๅปบ็ญ้ข็ปๆๅไผๅฏไผ้ฟ่ก่ฒ็ง่ฟๆฅ่กจ็คบ\n187001 1 ็ไผฝ่ฟๅจ๏ผๅก่บซๅ
ป็โฆโฆๆๆๅๅ็ผ่งฃ็ฒๅณ๏ผๅขๅ ๆดปๅใ่ฐ่่บซๅฟๅนณ่กก๏ผ่ฎฉๆจๆดๅฅๅบท_ๆดๆๆฆ!ๅจๅฐผๆฏ่ฑ...\n187002 0 ่ฟๆฏไธๆ ไฝไบๆฏๅฉๆถrotselaarๅฏๆไธญ็ไฝๅฎ
\n187003 0 ่ฝจ้ไบค้xๅท็บฟไธๆๅทฅ็จๅ
จ้ฟxxๅ
ฌ้\n187004 0 ไปๅคฉ่ขซๅๅปไบ่ฟๅฅฝไฝ ๅจ้ฆๆธฏๆๆฒกๆ็ขฐ่งไฝ ๅฆๆ็ๆฏไฝ ่ฏฅๆไนๅฏน่ฏๆท่ก\n1000 Processed\n classify content\n187500 0 ๅจๅๆขฆ่ฎพ่ฎกๆป้จ็ญพๅฎๅไฝๅ่ฎฎ\n187501 0 ๅ
ฌๅ
ฌๅพๅบๅปๆๅทฅไธบๆญคๅพไบ็็\n187502 1 xxx xxxๆ็บข่ ๅ่ก:xxxx xxxx xxxx xxxx xxxๆ็บข่\n187503 0 ๆฐๅฎพๅฟๆณ้ขไบคๆตๅญฆไน ้น็ขงๅๅ
่ฟไบ่ฟนไฝไผ\n187504 0 ไธ็้ฆๆ3Dๆๅฐ็ซ็ฎญElectronๅฐไบ2015ๅนดๅบๅๅฐ\n1000 Processed\n classify content\n188000 0 ไปๅฟไธๅคฉๆ่
พ่ฟ็ต่ๆ่
พ็ๅฅฝๆฌขไนๅ\n188001 0 ๆตๆฑ้ๆธฉ้้ๆบ่ฝฆ่ฝฆ่พๅคงไธ็ญๆฏไธๅ
ธ็คผๅจๆตๆฑๅธ่ๅคงๅญฆไธพ่ก\n188002 0 Doomๅญฆ้ข็ๅนด็ปๆต่ฏๅๆฅๆฏๅคง้ๆๆจกๅผ\n188003 0 ๅ
ฐๆกๅๅฎถๅบญๅซๅข
ๆ
้ฆๆฌข่ฟไฝ ็นๅป้พๆฅๆญๆพ่ถ
้
ทH5ๅคง็>\n188004 0 โกๆฏๅคฉ็ญพๅฐ้ข้ฑโขๅทไปปๅก่ต้ฑ\n1000 Processed\n classify content\n188500 0 ๅฅ็บฆ็ฒพ็ฅๅบ่ฏฅ้ซไบๆณๅพ่ดขไบง็็้\n188501 0 ๅคงๅญฆ็1ๅฐ5ไธ็ๅฐ้ข่ดทๆฌพๆถ211ไธ985้ขๆ ก็ๅๅ
ถ้ๅฑ้ขๆ ก็ไธ็งๆฌ็ง็ ็ฉถ็ๅๅฃซ็่ดทๆฌพ\n188502 0 ๅ็ฏๆกฅๅพไธ็ซ็พ็็นๅซ็ๅคงๅพๅๅๅๆ็ๅฐๅ
ญ่พๆถ้ฒ่ฝฆๅๅ็่ฟๅปไบๅธๆฟๅบ้่ฟๅ ไธช่ทฏๅฃ็บข็ปฟ็ฏ้ฝๅไบไบค...\n188503 0 ๅฆๆไธๆฏ็ต่ๅ
ๅคชๅคๆไธไผๅบๅฆ\n188504 0 ๆณ้ข๏ผๅๅๅนถ้ๅ ็ๆดป้่ฆ่่ดญไนฐๅๅ\n1000 Processed\n classify content\n189000 0 XboxOne้่ฆๅฎๅ
จๆฏๆ้ฎ็ๅ้ผ ๆ \n189001 0 ไฝ ๆญฃๅจ่ดจ็ๆ็ๆถๅๅซไบบๅทฒ็ปๅผๅง่ต้ฑไบ\n189002 0 ๅจๆๆฅ่ฟๅคฉๅ ็ๅฐๆนๅฏนๅฅน่ฏดโๆ็ฑไฝ \n189003 0 ๆ็ฐๅจ็ฎ็ดๅฐฑๆฏ่
พ่ฎฏๅ
ฌๅธ็ๅคง็ฒไธ\n189004 0 ๅ่OTGๆฐๆฎ็บฟ้็จๅฐ็ฑณไธๆๅไธบๆๆบ่ฝฌๆฅ็บฟmicroUSB่ฝฌๆข็บฟ\n1000 Processed\n classify content\n189500 0 ่ๅทๆฏๅคฉๆฐๅข57ไพ็็็
ไบบ\n189501 0 ๅฎๅบๆฐ่ฑกๅฐ7ๆฅ7ๆถๅๅธๅคฉๆฐ้ขๆฅ\n189502 1 ่ฆๅๅทฅๅๆณจๅใ็จๅกไปฃ่ดฆใ่่ต่ดทๆฌพใๅๆ ไธๅฉใๅ็ฑป่ต่ดจใ่ต้่ฟๆกฅใPOSๆบใ็ปๆฅๅป็ซ ็ญ็ญ๏ผๆๅก...\n189503 0 13588288645QQ2367436628ๅธๅบๅคงๅฅฝ\n189504 0 ๆๆฏๅป้ข้ข้ฟ้ๆๅฝฌ็ซๅปๅฏๅจ็ดงๆฅๆๆคๆชๆฝ\n1000 Processed\n classify content\n190000 0 ๅ
ถไฝ่ฝฆๅ่ฟ้
ๅคไบLEDๆฅ้ด่ก่ฝฆ็ฏ\n190001 0 6ๆ26ๆฅ้ฃๅคฉๆฏไนฐไบB็ซ็ๆจๅนฟไฝ\n190002 0 ๆจๅคฉๆถๅฐไปๆตๅป้ข้ซไธปไปป็้่ฏท\n190003 0 ๅชไธชๆณ้ขๅชไธชๆณๅฎๆขๅๆๅพ็ป\n190004 0 ๅซ่ฎฉ้่็็ฎกๅฝฑๅไบ่็ฝ็ปๆต\n1000 Processed\n classify content\n190500 0 ๅฐ็ๆฏ็บน้ญๅๆฃ็ผ้ข้ข้ๆๅบ่ฒๅนฒๅๆฒนๆงๅ\n190501 0 ่พไบๅทฅไฝๅปๅไบฌๆ่
้ๅบ็ๆดป\n190502 0 ๅป็่ฏดๅธฆไฟๆๅจๅฐฑๅฏไปฅ็ซๆญฃ่ฟๆฅไบ\n190503 0 ๆฑ่็ๅฆ็ๅธๆฑๅฎ้่ๅธๆๅพๅงๆๆฐ่ขซไบบๆๅฎณๅจ่ชๅฎถๅๆๅ
\n190504 0 ๆ็ฝ็ฏๅฅๆทบ่ๆญ่ฉๅซไบบๅฟ่
ๆ\n1000 Processed\n classify content\n191000 0 ๅ็ฐ็
ๆฏๅไบซๅพฎๅ่ฟๆ็นๅซ็งฏๅๅฅๅฑ\n191001 0 ๆ็ๅพๅค่ก็ฅจ้่ๅๆๅธ่ฆๅฃๅฉๅฟ็ๅๆฃๆท\n191002 0 Windowsๆๆ่ฅๆถไธ้8%็ญ\n191003 0 ็ฅ้็็ธ็ๆ็ป็ๅฟ็ฌ้ดๅผ้ๅชๅฆๅฐๅ\n191004 0 ่ตท็ ๆ
ๆธธๅๆฅๆๅๅจๅฎถ่ฆ่ฟๅพๅ
ๅฎ\n1000 Processed\n classify content\n191500 0 170g$118้ฉ็จๆผๅ
ญๅๆ่ตทๅฏถๅฏถ\n191501 0 ๅจ่ๅทๆ้ค้ฅฎๅบๅ็ไธไปฝๅจๅธๅ
ผๅคๅๅทฅไฝ\n191502 1 ่ชไฟก็็ๅฝๆ็พไธฝ๏ผไธๅนดไธๅบฆ็ๅฆๅฅณ่ๅฐไบ๏ผ่ญ่็พๅฎน็ฅ็ฆ็พๅฅณ่ชไฟกๆฝๆดๆผไบฎ็พไธฝ๏ผๅผๆญค็พๅฅฝ่ๆฅ๏ผ่ญ่...\n191503 0 ๅฉไธ0025ๅ326่ตๅคง็็ฟป็บข\n191504 0 /ๅทฅไฟก้จ่ๅฉ๏ผ็นๅซ่ฆ้่งๆๅฅฝ็ฝ็ปๆ้้่ดนๅทฅไฝ\n1000 Processed\n classify content\n192000 0 ้ฃๆบๅจไธๆตทไธ็ฉบ็ๆไบไธไธชๅๅฐๆถๅ\n192001 0 ่ณๅฐๅๅนดๅ
ไธๆณ่่win10\n192002 0 ๅนถๅฐฑๆฐๅธธๆไธ้ๅข็ป่ฅ็่ฐๆดไผๅๅ็บงๅขๆ่ฟ่กไบไบคๆต\n192003 0 โ็ๅฌโไธ้ปๅถๅท็พๆฅๅ็ๅ
ณ็ณป\n192004 0 ๅฐ็ฎๅ่ฟๆฒก่ง่ฟๅช่ดชๆฑกๅ่ดฟไธๆพๅฐไธ็\n1000 Processed\n classify content\n192500 0 ๆ ธๆญฆๅจ็ๅๆๆๅฉไบ็ปดๆคไธ็็ๅๅนณ\n192501 0 ๅฏ่ฝๆฏ้ฃไธชไธปไบบ็ๅ
จ้จ่ดขๅฏ\n192502 0 ๆๆถๅจ็ซ่ฝฆ็ซๆๅฐ้ไธ็ๅฐๅฐๆฅ็ซฅๅๆฅๆๅ
ด่ถฃ็ไบบๅพๅฐๅฟไธไฝๆฏๆไธไธๅดๅ็ฐๅ2ๅ
็ๆฅ็บธๅไปทๅชๆ0\n192503 0 ๅๆโ้ฆ่โ่ฐทๆฅ็ซ่ขซๆฅๅ่
38ๅคฉๆไธโ9่โ\n192504 0 ็ดๆฅ็พๅบฆไบ้พๆฅ็งไฟก็ฉ็ปไฝ ไปฌ\n1000 Processed\n classify content\n193000 0 ๆผๅฑไผๅ้็ไนไธๆฏๅไธไธ้ฅญ\n193001 0 ๅจๅบฆๅๅๅฐ้ฉๅฅขๆทซ้ธ็PIPPOๆธพ็ถไธ็ฅ่ชๅทฑๅจ็ฑณ่ญ็ๅฎถ่ฎๆไบๅฐๅท็็ฎๆจ\n193002 0 Gxไบฌๆฒช้ซ้็ฑไธๆตทๅพๅไบฌๆนๅๆ ้กๆฎตไปKxxxx+xxx่ณKxxxx+xxx\n193003 0 1500ๅ
่ฝไนฐๅฐ็โๅฅฝ่ฎพ่ฎกโไบงๅๅๆจ่็็ฑ\n193004 0 ๅ็ฐไบบๆๆฒๅฌ็ไบไธๆฏ่ขซๅผบๅฅธ\n1000 Processed\n classify content\n193500 0 ๅ่็จ3Gๆๆฉ็ถฒ็ตกๆๅๅฏไปฅ็็ธ\n193501 1 ๆ่ฐข่ด็ต้ช่ฒๅธ่บ๏ผๆไปฌไปฅไผ่ดจ็ไบงๅ๏ผไผๆ ็ไปทๆ ผ๏ผๅฎๅ็ๅฎๅๆๅก๏ผ่ฐ่ฏๆๅพ
ๆจ็ๅ
ไธด๏ผxxxxๅนด...\n193502 0 ้ข่็ฒพๅไธญxx%้ฝๆฏ่่ๅๆถฒ้ๅไปปไฝ็ฑปๅ็ฎ่ค\n193503 1 ๅคงๅฎถๅฅฝ๏ผไธ่ๅๅคงๅธ่ไธๅฆๆขๅพๅฐไธบๆจๅธฆๆฅๅ
จๅ
่ดน็ๅๅธ่ง้ขๅ
ฌๅผ่ฏพใๆบไผๅฐฑๅจ็ผๅ๏ผๆๆกๆบไผ๏ผ่ตฐๅๆ...\n193504 0 apๆๅ ้ฃ้ฉฌsax่ฒๆฑ็7\n1000 Processed\n classify content\n194000 0 ๅฐ้ๅ
ฌๅฎๆ้๏ผๅ
่ฃน่ฏท่ฟๅฎๆฃ\n194001 0 ็ฎๆฏshortcoveringๅธฆๆฅ็ๅ
ฌๅนณ\n194002 0 ๅฎๆฑๅ็ฎกๅๆๅ
็คพๅบๅ
ฑๅไธพๅไบโไบๅฝๅ็ฎกๅฐไนๅทฅโไฝ้ชๆดปๅจ\n194003 0 3้ถๆๅธ้ณๆฟ็ๅขๅฃ4ไธๅฅๅฎ็จ็ๅฎถ่ฃ
\n194004 0 ไฝๅตๅฆไป่ก็ฅจ็ฉ็ไธไป
ไป
ๆฏๆๆฏ้ข\n1000 Processed\n classify content\n194500 0 ๆ็ๆๅฝฑๅธๆฉๆไฝๅยทBy้
้ฌผๅฆๅ
\n194501 0 ๅ่กจไบBMJๆๅฟไธ็ไธ็ฏๆ็ซ ็ปผ่ฟฐไบ้ด่ดจๆง่บ็พ็
็่ฏๆญไธๆฒป็\n194502 0 ๅ่
่ฟ็งๅฉๅฝๅฉๆฐ็ไบๅฆๆ่ฟๆไบบๅซไธๅฅฝ\n194503 0 ๅฝๆๆบ็ฆปไฝ ไปฌ10mๅทฆๅณ่ท็ฆปๆถ้ๅธธไฝ ไปฌไผ๏ผ\n194504 0 ็ฎๅญไผ+่ตๆ็ญๅๅญฆไผ็คพไผๅ
ฌ็+ๅคงๅญฆ็ๅไธ่ฟๅฐฑๆฏไธญๅฝ็ฎๅญ่ไผ่ตๆๆๅก้โ้ๅนดๅๅฎข่ฅโ\n1000 Processed\n classify content\n195000 0 ๅทฅไฝๅๅฟไน่ฆๅจ~1Gๅ
จๅฝๆต้ๅๅนดๅ
6ๆ่ตท\n195001 0 ไธ้ฉ้ฟ้ๅฉไฝ ๅปบ็ซๅฅๅบท็็ๆดปๆนๅผ\n195002 0 ๆฐ้ๅนถไธๅคๆๆถๆไนไธๆ็ฎๅไบ\n195003 0 mokabros้็่็ฝ่ดจ่กฅ็ป็ซๆฒๆ้้ขๆๅๅฐๆถๅๆๅคด้็ไธ่ฅฟโฆ\n195004 0 ็NBA็ๆ่ขซๅฒๆๆนๅ็็่ฟน\n1000 Processed\n classify content\n195500 0 ๅ้็ไธๅคช่็ๅคฉๅไธๅคช็ฝ็ไบ\n195501 0 ็ฉบ่ฐ่กฃ็ฉฟ้ฝๅพๅ้๏ฝ่ๅกๅ
ถ2่ฒ\n195502 0 ๆ็ฐๅจ2็ฑณไบๅทฒ็ปไธไธๆญฅๅฐฑๆฏ่ฟๅ
ฅๅไฝ็ถๅๅปๆnba\n195503 0 ๆๆ้ๆ
ๅๆๅปๅ็ฑป่ฟๆณ็ฏ็ฝชๆดปๅจ\n195504 1 ใ็บคๅงฟไพไบบใ้ญ
ๅๅฅณไบบ่๏ผๅ
ณ็ฑๅฅณๆงไนณ่
บๅฅๅบท๏ผๅฒๆ ๅไพไผๆ ้็ปๆจใไฟๅ
ปๅๆ่ธ็ฌฌไธไปถๆญฃไปท๏ผ็ฌฌไบไปถๅ...\n1000 Processed\n classify content\n196000 0 ้กถ็บงWHOO/ๅๅคฉๆฐไธนๅๆณซ็ๅๅฅ็\n196001 0 maoliqiusi็photowhere\n196002 1 ๆจๅฅฝ๏ผไฝๆ็ฝๅฎขๆๅฐๆ๏ผๅ
ซๆก็ปฟๅ้พๅบญๆฐดๅฒธ-ๅคฉ่ช่ฑๅญ๏ผ้พๅบญๆฐๆฅ็นๆ ๆ้ซไผๆ x.xไธ๏ผๅคฉ่ชไนฐๆฟ็ซๅ...\n196003 0 ็ฌ่ฎฐ่ถ
่ฟ1ๅคฉๅบๆฌๅฐฑๆฒกๆๅคชๅคงๅญๅจๆไน\n196004 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ g4426jไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n1000 Processed\n classify content\n196500 0 ๅฏนไธญๅฝๆดไฝ็ๅถ้ ไธใๅปบ็ญไธใไปฅๅ่ฃ
ๅคๅถ้ ไธ็ๆฐดๅนณๆฅ่ฏด\n196501 0 ๆฏๅคฉไธไธ็ๆ
ๆธธ่
ๅๅพ่ฟ้้ช้ช้ฉผใ็ๆฅ่ฝๅฃฎ็พ\n196502 0 ๆๆๅ่ฑๅ้ชจ็ๆถๅ่ฟๅพๅนด่ฝป\n196503 0 ไปปๅฟๅผบ๏ผโ่ฐฃยท่จๅ้ผ็็ธโ\n196504 0 ไบๆฏ็ฐๅจๅฐๅพฎไผไธ่่ต้ๆฑๅพ้ซ\n1000 Processed\n classify content\n197000 0 ไธๅบ่ทจ่ถไธค็็็ฑๅฟๅป็ๆๅฉไธญๅฝๆๆ็ฝ็ฑไธญๅคฎๅฎฃไผ ้จใไธญๅคฎๆๆๅไธปๅ\n197001 0 ๅฏนไบ้ฟๆไพ่ตๆ่ต้ฉฑๅจ่้ซๆ ๆ็ไธญๅฝ\n197002 0 ่ทฏ้ฝๅ ไธบๅฐ่ดฉไปฌไนฑๆๅๅพใๅฐ่ๆฐดๅๅพ้ฃไน่\n197003 0 ่ฎฉๆไปฌไธ็ชฅQQ้กถๅฐไผๅ็โๅฎๅโ๏ผๅนด้พไธ\n197004 1 ไบซxxMๅ
ๅฎฝๅธฆๅๆฐๅญ็ต่งๅ
่ดน็จ๏ผ็งปๅจ่็จๆทๅญ่ฏ่ดนๆๅคๅฏ้xxxๅ
่ฏ่ดน๏ผ็ฐๅบๆดๆๆๆบๅคงไผๆ ๏ผๅฎ...\n1000 Processed\n classify content\n197500 0 ๅ็จGoogle็ฟป่ฏๆๆๅคงๅฉ่ฏญ็ฟป่ฏๆไบ่ฑๆๅไธญๆ\n197501 0 sugaryๅจไธญๅญฆ็ๆถๅๅฐฑ้ฟๅพๅฏ็ฑๆผไบฎ\n197502 0 ๆฒณๅ้ ่กๅนฒ็ป่ๆ็ฎไบบๆฐ่พพ500ไพๅ
จๅฝ็ฌฌไธ\n197503 0 ้ฉฌ้พ็NBA็ๆถฏๅฐฑๅงไบ้ฉฌๅบๆ็ปๆณขๆณข็ปดๅฅ\n197504 0 ๅๅจ้ๅท่ฟชๆ่นๆๆๆบไธๅๆๅฐไบๅ
่ดนๅฅฝไธไธ\n1000 Processed\n classify content\n198000 0 ็ถๅ้ฃๆฌกๅท53ๆๅๅผไฝ ไนๅๅผๅธฎๆๅทๆญปไบๅฅฝๅคไบ่ทณไบๅฅฝๅคไธน็ปไบๆฏ่ฟไบ\n198001 0 xๅจๅฒDAYxxx๏ผๆจๆ้ฃๆบๅๆ็น\n198002 0 ๆจๆ็จไบ้ช่ฑ็งๆด้ขๅฅถ้ฟ3็ฒ็ฒๅบไบๆ็
ๅฆ็ฆๆญป\n198003 0 ็ช็ดๅ็ฎกๅคง้ๅจๅบๅธๅฎนๅธๆฟ็ฎก็ๅฑๆงๆณๅคง้็ๅธฆ้ขไธ\n198004 0 ๅ ไธบ้ฃๆบๅปถ่ฏฏๆ ๆ้ดๆๅฐไบ็ๆฏ๏ฝ\n1000 Processed\n classify content\n198500 0 ๅไธๅ
ฌๅญๆๅไนไฝๆ่ขญๅคง้ฃๅฎ่ฟ็ไฝๅๅไฝ็ๅฎๅงๅ่ฏทไฟๆๅนณ้่ฎฉๆไปฌไธ่ตท่ฟๅ็็ธ\n198501 0 ๆตๆฑ็่กขๅทๅๅฐๅขๅ ไบไธๅฐ้ถๆใ็บขๆซ็ญๅฝฉๅๆค็ฉ\n198502 1 x.x็พไธฝๅฅณไบบ่๏ผๅฅฝๅๅค่ถ
ๅธไบฌๆถฆ็็ ๆค่คไธๆๅพๆ
ๅ้ฆๆฐ่้กพๅฎน๏ผๅ
จๅบไฝ่ณxๆ๏ผๅก่ฟๅบ็ไผๅๆถ่ดน...\n198503 0 ๅๆนๅฐไปฅ้ฟ้ไบๅ้ฟ้ๆฐๆฎๅนณๅฐไธบๅบ็ก\n198504 0 ใ96ไธ็ญๆญ่ง้ขใๅฐ่่ชๆพๅฆๅฆไธญๅฝ็ฌฌไธ้จๆฐดๅขจๅจๆผซ\n1000 Processed\n classify content\n199000 0 ่ไฝ ไปฌ่ฟ็พค็จๆๆบๅฑๅคงๅฐๅ่ๅคงใ่ไบใ่ไธ็ไบบ\n199001 0 26ๅฒๅฎๅพฝ็ทๅญๆๆฟๆฐดๆๅๅจๅนฟ็่ทฏไธไบๆฌง่ทฏไธๅคงๅฆๅ
่ชๆ
่
น้จไธๅ\n199002 0 xxๅฒๆถๅ ่ชๆ
งๅๆไธบไธไปฃๅฅณ็ๆญฆๅๅคฉ็้็จ\n199003 0 ไปๅคฉไพ็ถ้ฃไนๅนณ้ๆ่ฟๆฏๅพๆณๆๅฎๅ็ฎกๆๅฎๆ็ฆปๅผ็้ฃๅบงๅโๅทดๅ\n199004 1 ๅฐๆฌ็ไผๅๆฐๅนดๅฅฝ๏ผๅๅฏๆๅฎไธๅๅบๅ
จไฝๅไบ็ฅๅคงๅฎถๆฐๅนดๅฟซไนใไธไบๅฆๆใๅๅฏๆๅฎไธๅไปxๆxๅท่ณx...\n1000 Processed\n classify content\n199500 0 ๆชๆฅๆฏไฝ ็โฆโฆNBAๅๅคง็ปๅ
ธๅๅพ่ฏ\n199501 0 ๅ
ทไฝๅจ่ฏขๅป็่ฏฅๅฆไฝ่ฟ่กๅๅ
ปๆๆฏๆๆบไนไธพ\n199502 1 ๆฐๅนดๅฅฝ๏ผๅนฟๅทๅฐ้ซๆจๅทฅ็ฅๆจๅผๅทฅๅคงๅ๏ผ็ๆๅ
ด้ใ๏ผๅฆๆจๅผๅทฅๆถไปชๅจ่ฎพๅคๆๆ
้ๅฏไธบๆจ็ปดไฟฎใๅฆ้ๆทปๅ ไบ...\n199503 1 xxxx็ -็ฉถ-็ MBA่-็๏ผ้ๆๅ่พพๆจๆๆฟ๏ผไธไธๆๅๆฅ่ฏข๏ผๆ/ๅๆ ทๆฌๆ ธ/ๅฎใๆฃๆฃ:xx...\n199504 0 ๅขๅ
ๅค่ๅคดๅพ็ปๆฏๅจ้xxxๅ
ๅฐๅฅณๅญๅฐ้ฆๆธฏๅๆทซxxxxๅนดxๆxๆฅๆฅ้\n1000 Processed\n classify content\n200000 0 It'sRonniๆefromtwitter\n200001 0 ็ฅๅๆถๅ่่2015ๅนดๆฐ่ไธๅธ\n200002 0 ๅธๆฐ้ๅ
็่ฑ7ไธๅคๅ
ๆฟไธๅไบฌๅ็ซ็ไธ้ดๅฐไธๅ้บ14ๅนด็็ป่ฅๆ\n200003 0 ๅฝๅฎถๆ่ตๅ ๅไธ็ๆฐดๅฉ้ฝๆฒกๆไฝ็จ\n200004 1 ๏ผ็งๅ็ญๆจๆฅๅฆ๏ผๅฆๅคๆฌๆๅไบๅคฉๆไบงๅไบๆไผๆ ๏ผๆฑๆน็พ็ฝๅ็งๅฆฎๅฟๅ
่กฃๆดๆๅคงไผๆ ๏ผๅฟๅจไธๅป่กๅจใ...\n1000 Processed\n classify content\n200500 0 ไปๅไธๅๆญๅ
ฌไบค่ฝฆ่ฟๆฅไธ้จ็ซ\n200501 0 2013้ป็ๅ
ไผไธไผ่ถๆฅ่ถๅฐๅ\n200502 0 ไธๆทๅฎใ็พๅบฆใไบฌไธใไบ้ฉฌ้้ฝๆตๅๅฐ่ทๅ
ๅญๅจ\n200503 0 ็ๆญฃ็ๅๅฃซๆขไบไธๅฌๅป็็่ฏ\n200504 0 ่่ขซ่ ้ๅ
็็ป่ๅ่งฃไบง็ๅคง้็ๆฐไฝ\n1000 Processed\n classify content\n201000 0 PACEMAN็ๅค่ง่ฎพ่ฎกๅไนๅ็ๆฆๅฟต่ฝฆไฟๆไบ้ซๅบฆไธ่ดๆง\n201001 0 ๆๅจๅฟ
่ๅฎข็ญพๅฐๅไบฌ้ๅฑฑ้คๅ
็ญพๅฐ๏ผ็ญพๅฐๅฆ\n201002 0 ๆฅๆฌๅๆฅ่ต8ๅทๅซๆๆไธๅฐ้ฃ่่ฟช็ฝๅ็\n201003 0 โๆไธๅฌ่ฟๆด่พๆฐๅฐฑไธๆฅไบๆ็ดๆฅไนฐไบๅๅ ไธชๆฐขๆฐ็ๆฐ็ๆๆๆ่ขไธไบ\n201004 0 ๆๅไธๅฐxxxxShelbyGTxxxๆ็ฏท็ๅฐไธบๆ
ๅๅบ้่ๆๅ\n1000 Processed\n classify content\n201500 0 ๆญคไธพๅจๆฅๆฌๆบๆๆ่ต่
ไธญๅฐๅฑ้ฆๆฌก\n201501 0 ่ฟๆ ท็็ตๆขฏๆไน่ฎฉไธไธปๆพๅฟไนๅ\n201502 0 f็พๅบฆไบch็พๅบฆไบ่ตๆบdzไบ็่ตๆบi็พๅบฆไบ่ตๆบ\n201503 0 ๆตๅไธ้ซๅฑไฝๅฎ
็ตๆขฏๅ
็ฐ็ทๅฐธไบๅๆฅผๆฟๆ ไบบๅฑ
ไฝ\n201504 0 ๅไบฌๅธๅ
ฌๅญ็ปฟๅฐๅไผ้ขๅฏผๅญๅบ็บขๅธฆ้ขๅไบฌๅธ16ไธชๅบๅฟ็ๅ
ฌๅญ็ฎก็่ด่ดฃไบบ้ๅฏนไธๅฏฟๅ
ฌๅญ้ไธญ็ฎก็ๅนฟๆญ้ณๅ...\n1000 Processed\n classify content\n202000 0 ๆ็ฒพ่ดไบบๅๆๅฝฑๅฐฑๆฅๅทฆๅฒธๅฏ็พๆๅฝฑ่ง่งๆบๆ\n202001 0 ่ฟๆๅณ็่ฟๆๆ่ตๆฏๅบๅๅฐฑไธๅฒไฝๅขๅ ๅฟๅคดๆๅฐๆ็ปญ\n202002 0 ๆๅบๅฌๅผ2016ๅนด้จ้จ้ข็ฎ็ผๅถๅธ็ฝฎไผ\n202003 0 ๆฎNBAๅณ่ตๅ่ฎฎไธๅฎถLarryCoon็งฐ\n202004 0 ๆ่ดจ๏ผๅฃณๅญ่้ขไธบ็กฌๅฃณ่พน่พนๆฏ่ฝฏ็้็จ๏ผiphone6/6plusๅกๅไผ่ดด็บธ+ไธไธชๆๆบๅฃณ่ชๅทฑDI...\n1000 Processed\n classify content\n202500 0 ้ฃ็ฉ่
่ดฅๅพๆฏ่พๅฟซๅๅฐๅ่ดจ็้ฃ็ฉๅ ็ๆฏ่พ้ซ\n202501 0 ่ๅ็ไน่็ฅๅบๆ้พ3000ๅนด็ๅๅฒ\n202502 0 ็ถ้่ๅฎถไบบๅๆๅๅคๆฌก็็ฅ็ฆฑๅ่จ่ซ\n202503 0 ๆฐธ่ฟไธ็จไฝๆธฉ่ฎกๆฐธ่ฟไธๅปๅป้ขๆฐธ่ฟไธๅ่ฏไบบ็ไธๅคงๆฟๆ่ฟไธ้กนๆๆฏๅคงไบๅฅฝๅ\n202504 0 ็่งไธๅฐๅทๆญฃๅๅคๆๅทๆฅ็ๆๆบๅพๅฃ่ข้่ฃ
\n1000 Processed\n classify content\n203000 0 ๅคง้ๆดพๅบๆๆฐ่ญฆ้ญๅฉๅธฆ้ข่พ
่ญฆ็ฐๅฎถ้ฃใ่ๆทใๆนๆๆกๅทก้ฒไธญๅ็ฐๆบๅๅป้ขๅ
ๅ็ไบๅต\n203001 0 41ๅนดไปฅๆฅ็ป้ๅๅๆๅผบ็็ญๅธฆๆฐๆ\n203002 0 amazonprimedayๆฎ่ฏดๆๆฃๆฏ้ปไบๆด็ปๅ\n203003 0 ไธๆดๅ่งฆๆธOLEDๆพ็คบๅฑ่ดฏ้ๆดไธชไปช่กจๅฐ\n203004 0 ๆ้ฑๅฏไบไปฃๅ่ฃ็ง่ฝฌ็ๆฏ็\n1000 Processed\n classify content\n203500 1 ๅๅข่ดญๆดปๅจใ๏ผๅกxๆxๆฅ่ณxๆxๆฅๅฐๅบๅฎขๆท้ฝไธไป
ๆฅๅบๅณๆ็ฒพ็พ็คผๅ็ธ่ต ๏ผ่ฟๅฏไบซๅๆฏๅนด้พๅพ็ใ็น...\n203501 0 ไปไธไธ็บช90ๅนดไปฃ็้ญๆฏๅธใไนไธนโฆโฆ\n203502 0 bananaๅฐ้ปไผ้ซๅฏ้ฒๆๆถๅฑ\n203503 0 ไธญๅฝไบ่็ฝ่ฟๆณๅไธ่ฏไฟกๆฏไธพๆฅไธญๅฟใๅๅฐ็ฝไฟกๅไธพๆฅ้จ้จใๅไธป่ฆ็ฝ็ซ้่ฟๅปบ็ซๅคๅ
ไธพๆฅๆธ ้ใๆฉๅ
ๅ
ฌ...\n203504 0 ๆ่ต่
่ฎพ็ฝฎไบๆญขๆ่ๆฒกๆๆง่ก็ไพๅญๆฏๆฏ็ๆฏ\n1000 Processed\n classify content\n204000 0 ่ฏฅ่ช็บฟๅ
ฑๆๅ
ฅ11่่ฟๅ็บฆ11000ๆ ็ฎฑ็้่ฃ
็ฎฑ่น่ถ\n204001 0 ๆไบ้ท้่ง้ข๏ผ่กๆบ่ฝฐ็ธๆบ้ฃ็จๆๅๆปๅปๆบ\n204002 0 6็นๅ็ๆปจๆตทๅฎข่ฟ่ฝฝ็ๆบๅฆๅจไฟฉไธ่ทฏๅฅๅ้ฆ้ฝๆบๅบไธๅท่ช็ซๆฅผ\n204003 0 GoogleShoppingๆ็ดขไผๅๆๅทงๅๆ\n204004 0 ่ฟไธชๅธๅบ็ปๅฐๆไธบๆ่ต่
็็ป่ๆบ\n1000 Processed\n classify content\n204500 0 ๆๆๅจ้ฃๆบไธๅทๆ็ไธๆขๅ่ฏๅฆๅฆๅช่ฝ่ฏดๆๆๆฏฏๅญไปๅคด็ๅฐ่ไปไนๆ่งไนๆฒกๆ\n204501 0 7ๆ10ๆฅ้ถๅๅฐๅ้็ๅฐไผไผดไปฌๅจไธญๅฝๅทฅๅ้ถ่ก่ฟ่กไบ่ฐ็ \n204502 0 ๆๅ็่
พ่ฎฏๆฅ้โโๆญ็ง้ๆดฒ้จ่ฝโๅทๅฆป่๏ผๅฅฝๅฏ้ๆๆ้ๅคๅไธๅคซ\n204503 0 ็ตๆขฏไธบๆฑ่โ็ณ้พโ็็ตๆขฏ\n204504 0 ไปฅๅEGF่ฟไธช่ฃ่ท่ฏบ่ดๅฐๅฅๆปดๆๅ\n1000 Processed\n classify content\n205000 0 ๆไธบteamleaderๅ
ถๅฎๅฐฑๆฏ่ฏดๅชๆไฝ ๅนฒๆดปๅฟ\n205001 0 ๅพฎ่ฝฏ่ฟๆไนๅจๅ ็ดงๅฎฃไผ ็ๆญฅไผ\n205002 0 ่ฏทไธ่ฆ่ฟไนๆฒๅฌๆๅๅฐ็ฌไธ่ตทๆฅ\n205003 0 CSDAๆฑ่ๅๅงๅไบฌ่บๅ็ฅๅฝขๆๅ่บๆฏไบคๆตๆ้ๅ
ฌๅธๆฟๅ\n205004 0 ๆๅซ็่ฆๆฑ๏ผ1ใไบๅฎ็ซฏๆญฃ\n1000 Processed\n classify content\n205500 0 ็ฑ้ๅบๅฟ็ซฅๆๅฉๅบ้ไผ่ตๅฉ็ๅ
ฌ็ๅๆ้กน็ฎโๆค่่กๅจโโๅไบฒๅฎถๅบญๅฟ็ซฅๅ
ณ็ฑ้กน็ฎโๅจๆๅบๆญฃๅผๅฏๅจ\n205501 0 ไนฑๆ้ฟๅญ่ฏๆไธบๅฏผ่ดๆฐ็ๅฟ็ผบ้ท\n205502 0 ไธไธชๅไธๆจกๅผๆฏ่ฟ่กไธไธชๅ
ฌๅธ็ๆนๆณ\n205503 0 ๅฅนๆ็ฅ้่ชๅทฑ่ขซไบบไปฅxไธๅ
็ไปทๆ ผๅ็ป่ฟไธช็ทไบบๅ่ๅฉ\n205504 0 ่ฟๅซ็ผๅๅๅๅๅไปๅคฉๅ
ฌๆผๅ ๆฒน\n1000 Processed\n classify content\n206000 0 3็ฎๆ ็ญนๆฌพ้้ข๏ผ100000ๆๅฉๅฏน่ฑก๏ผๅงๅ๏ผ่ตตๆญฃๅ
ธๆงๅซ๏ผ็ทๅบ็ๆฅๆ๏ผ2015/1/1่็ณปไบบ๏ผ...\n206001 0 ๅฅน้่ฆ็ญ็็ไธญๅ็่ก็ฅจ่็ฎ\n206002 0 ๅ
ๆฌLinuxไธป้ขใๅฅฅๅทด้ฉฌๆๆช่กจๆ
ใไปฅๅไธ็ณปๅGoogleEmoji่ดด็บธ็ญ็ญ\n206003 0 ้ๅฎๅฟ2015ๅนดโ็นๅฒ่ฎกๅโๅ
ฌๅผๆ\n206004 0 ๅจๅ้ๅๅฟๅธๅบไธญ็ๅ
่ถ
ๅไบฟ\n1000 Processed\n classify content\n206500 0 ๅ
จๆฃ็ไป้ขๆ่ฎพ่ฎกๆ็บฟ็ผๅถๅฏน็งฐๆฌพๅผๅๅฑ่ฃค่
ฐ็ฟป่พน่ฎพ่ฎก\n206501 1 ๆจๅฅฝ๏ผๆๆฏ่ๅทๅธธๆฅ่คๅป็็พๅฎนๅฎขๆไธญๅฟ็ฐๅป็๏ผๆ้ขxๆ็นๆ ๆดปๅจๅฒ็็ปๅบ๏ผ่
ไธ่ฑๆฏ็นไปทxxxๅ
ใ...\n206502 0 ๅไบฌ็ฐไฟ็ฑ็ท่ญๅฏฆๆๅค้ๆฝๅทฅ่จฑๅฏ\n206503 0 AIVA็็ฅๅฐๅฅ่ฃ
ๅทฒๆฏไธๅฏ็ผบๅฐ็ๆฅๅธธ่ฃ
ๅคไบ\n206504 0 ๆๅพๆณ้ฎ้ฎๅฎถ้ฟๅป้ข็้ฃ้กนๆฃๆฅๅญฉๅญๆ้ฎ้ข\n1000 Processed\n classify content\n207000 1 ่ฟx?x๏ผ็นๆจๅบไนฐ้ๆดปๅจใ่น็ไบๆฅผ็ฌฌไธๅฎ่ดไธๆ\n207001 0 ๅจ่ฏฅ็ณปๅ็็ฌฌ4้จๅไนๅฐฑๆฏๆๅไธ้จๅไธญ\n207002 0 ๅฎๅๅพ็ปๅฎๅชๅพ็ปๅฏไธ่ฟไธไปฃไบบ็้พไปฅ่ถ
่ถ็ๆฏ็งๅฟ้พไปฅๆ่ฑ็ๆฏ็ฉ่ดจ็คพไผๆข็ถ่ฟ็ฆปๆดไธๆฟๅๅป้ ่ฟ\n207003 0 ่ๅฎๆ่ดญๅๅไธบๅๅ้ฝๆฒกๆๅ็ฐไธ่ฃ่ๆ้ฃ่็ธๅ
ณ็ๆดปๅจๅข\n207004 0 ๆปจๆฒณๆดพๅบๆๆฐ่ญฆๅผ ็จๆฆไปๅญฆๆณ\n1000 Processed\n classify content\n207500 0 4็ๆถๅ็จgooglenowๆ ๆณๆญๆพ่ฟ้ฆๆญ\n207501 0 trunatureๅถ้ป็ด ๏ผ็็ฑณ้ป็ด \n207502 0 ๆ่ๅฅ่บซๆ็ป็ๆกไปถๆฏ๏ผๆณๆๆ้ซ่ช\n207503 0 ๆฑฝ่ฝฆ้
ไปถ่กไธ็ปๅฐๅฎ็ฐ็ตๅญๅๅกๅๅทฒ็ปๆฏไธๅ
ๅ
ฌ่ฎค็่ถๅฟ\n207504 0 2015ๅนด7ๆ31ๆฅๆๅจๅฐ่ฑกๆฑๅๅ้จๅฃ่ฟ็ซ ๅ่ฝฆ\n1000 Processed\n classify content\n208000 0 ๆๅฐฑ็ฅ้ๆๅปไธไบๅไบฌๆพไฝ ็ฉไบ\n208001 0 ็ๅฅๅจๅไฝ ่ฏดๅฃฐ๏ผโๆฉๅฎ\n208002 0 ็ถ่็ฐๅจ่ฟๆฟ็ๆๆบๅฏนไฝ ๆทปๅฑ\n208003 0 ไปๅคฉๆๆฑ่้ถ่ก้็3000ๅ้ฑๅ
จๅๅ
ไบ\n208004 0 ๆถๆโๅคงๅ็ฎกโ็ฎก็ๆ ผๅฑ็ๅๆถ\n1000 Processed\n classify content\n208500 0 ไฝ็ต่ๆพ็คบๆฒกๅบๅญ็่ฏ็้ฃ้จๅ้ฑๅฐฑ่ฟไบๅฐไบบ็่
ฐๅ
ไบ\n208501 0 ไฝๆฏไป่ฏด๏ผIfightlikehelltopayaslittleaspossibleไปๆผๅฝๅฐไบค็จ\n208502 0 ๆ้ ๅๆฅ็ญๅๆๅ็ฎกๅๆฅ่ฟไน็ฝ\n208503 0 ๆๅทฒ็ปๆฒกๆๅจๅไบฌๅๆฌก็่ๅคฉๆฏ่ตๆถ้ฃไนๆฟๅจๅๅ
ดๅฅไบ\n208504 0 ไปๅนด3ๆ่ณ7ๆไธๆฌๆนๅ็ๅป็ๅจๆขฐไธ้กนๆดๆฒปโๅๅคด็โๆๆๆพ่\n1000 Processed\n classify content\n209000 0 ๅพๆๆบ้ๅ ็นๆๅง๏ผโๅคฉๅคฉๆจก่่
ฐๆๅ
ฌๅกๅโๅๅ
ฅๆ\n209001 0 ไธไธ็ญ่ทฏไธใๅฐ้้ใ้คๆกไธ\n209002 0 fresh็ไบงๅไธ็ดๅๆๆๅๅคฉ็ถๅผๅพไธ่ฏ\n209003 0 ๅฐfantasticbaby\n209004 0 ๆฐธๅฎๅบๆณ้ขๅจๆทๅฎ็ฝๅธๆณๆๅๅนณๅฐไธญๆๅๆๅไบไธๅคไฝไบ้พๅฒฉๅธๆฐ็ฝๅบ้พ่
พไธญ่ทฏ็ๆฟไบงๅ่ฝฆๅบ\n1000 Processed\n classify content\n209500 0 ๅถไฝๆถ้ด2015ๅนด8ๆๆถ้ฟ\n209501 0 ๅคฉไธไธบไปไน่ฆๆ้ฃไนๅค็ๅฏๆ็ไบบ\n209502 0 ๆๆฒกๆ้ฟๆฒ็ๆฉๅญๅปๅไบฌ้ฆ็ซ็ๆฑ้ชๅ\n209503 0 ๅฐคๅ
ถๅจๆไปฌ้ฟ้ฃๅฐๅบ่ๅทๆฒณๆฒฟๅฒธๅฐๅ\n209504 0 ๅกๆฏ็ไธไธ่ฐทๆญๅๅฟ
ๅบไธญๆๆ็ดข็\n1000 Processed\n classify content\n210000 0 ่ฆๆณจๆๆธ
็ๅฅฝcookiesๅๆต่งๅจ็ผๅญ่ฟไบไผ้ไฝไฝ ็ต่่ฟ่ฝฌ้ๅบฆ็ไธ่ฅฟ\n210001 0 ่ณบ็็่ถ
็บง็ปๅๅช่ฝ่ฏด้ๆฉ็็ๅพ้่ฆไธ้จๆๆบ\n210002 0 ไธ็ฅๆฏๆๅฑฑ่ฟๅฐๅฟๆ็ตๆฐ่ฟๆฏๆ็็ๆฏ้ๅ็ฌๅฑ
\n210003 0 ๅไธไฝ23ๆฅผ็็ฅ็ๅค้
่ไธญๅนด็ทไบบๅๅไธช็ตๆขฏ\n210004 0 ๅธไปชๆขฆไน้ๅข้ไพไผๅจๅไบฌๆๅ
ๅคง้
ๅบๅฆๆๅฌๅผ\n1000 Processed\n classify content\n210500 0 ไธบๅฅๅไธบ็ๆๆบ็ฐๅจๆ ๆณไฝฟ็จไธๆตทๅ
ฌไบค็APP\n210501 0 ่ขซ็งฐไธบๅไบฌไฝ้ๆๅคงใๅ็ๆๅ
จใไธๆๆๅฎๆด็ๅคๅๅ้ซ็ซฏๅฅฅ็น่ฑๆฏ\n210502 0 ๅฏปๅฏป่ง
่ง
็ฑๅฅฝๆขๆฅๆขๅปpartyๅปๆฅๅปๅป\n210503 0 4ใๅฝๅๅฐ็ตๅท็ ๆพ็คบไธบไนฑ็ ็ๆช็ฅ็ต่ฏๆถ\n210504 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ 482h5eไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n1000 Processed\n classify content\n211000 1 ไธญๅฝไบบๅฏฟ้ฒ็้ฉๅฏไปฅ่ฃธๅๅฆ๏ผๅทจๅคงๅ้ฆ๏ผ้พๅพๆบ้๏ผๅฐๆๅ
ฅๅคงไฟ้๏ผๆ้ซไฟ้xxไธ๏ผๅช้็ผด่ดนxxๅฒๅฅณ...\n211001 0 ้จ่ฑ่ญฆๆนๅฐๆๅ่ฟฝๅ็x่พ้ขๅ
่ฝฆไบค่ฟ็ปๅคฑไธป\n211002 0 ไปฅๅๆ
ๆธธๅๆ้ฉๅฝใๆ
ๆธธๆถ่ดซใๆ
ๆธธไฝๅถๆน้ฉๅๆฐ็ญๅทฅไฝๆ
ๅต\n211003 0 ๅฐ่ญฆๅฏๆญปๆไนๅๅป็ชๅฃๆฝไบๆ น็ๅท้ๅท้\n211004 0 ๆญๅท21ๅฒๅฅณๅญฉ่ขซ็ตๆขฏๅคนไฝ่บซไบกๅคด่บซๅๅคไธคๆฅผๅฑ\n1000 Processed\n classify content\n211500 0 ไธบไปไนไธ็ไธไผๆๅฐๅท่ฟ็งๅๆ็ๅญๅจๅฆ็ๅๆญปไบบไบ\n211501 1 ็ใๆฅๆ้ซๆๆฌพxxxx่ฌ๏ผxๅ้ๅ
งๅฐ่ณฌใ ?ๅฐ็๏ผ็พๅฎถๆจใ้พ่้ฌฅใ่ผช็คใ่กๅฏถใ็็ใๆๆๅฝฉ...\n211502 0 ้พ้ๆฏ่ดง่ฝฆๅธๆบ่ฟๅฐๅท้ฝไธๅฆ\n211503 0 ๆๅๅฐ้ฆ้ฝๆฉ้ซๅณฐๅฐ้ๆปกๆปก็ๆๆ\n211504 0 ็ตๆขฏๅไบบไนๅๅๅบ้ฝ่ฟไน่ฐจๆ
ไบๅ\n1000 Processed\n classify content\n212000 0 ๆ็ปๆๆฏ่็
็ฏไบๅปๅป้ขๆ้ไบ\n212001 0 ็ปๆgoogleplay็ฟปไบๅขไนๆ
ข็่ฆๆญป\n212002 0 ๆๆ่ฑๅญ็คพๅบ่ๅๆฌๅทๅนฟ็ต้่ฏทๆฌๅทๆตทไบๅฑๅฎๅ
จๆๆคไธ่่ฎฒๅธๅญฃ่ไธบ็คพๅบ้ๅฐๅนดไธพๅไบไธๅบๅฎๅ
จๆๆค็ฅ่ฏ่ฎฒๅบง\n212003 0 12ๅฑๅจไนๆ กๅญฆๅปบ็ญไธไธๆพๅทฅไฝไธญ\n212004 0 ๅบ่ฏฅๆ็ซไธ้จ็็ป็ปๆฅๅคๅฎๆๅขไฝไธญ่ฟ่ง่กไธบ\n1000 Processed\n classify content\n212500 1 ๅฐๆฌ็ไธไธป๏ผๆ่ฐขๆจ้ๆฉไธญๅฝๅฅฝๅปบๆ่็ไธพๅ็โ็ปๅฏน้ๆฉโๅคงๅ่ฎฉๅฉๆดปๅจ๏ผ่ฏทๆจไฟ็ฎกๅฅฝ่ฃ
ไฟฎๆค็
ง๏ผๆดปๅจ...\n212501 0 ไปๆณ้ขๅคง้ข้ๅฒๅบไธ็พค้ป่ขๆฑๅญ\n212502 0 ๅๅๅไบๅผๅฏ็ธไพไธบๅฝๆฐๆฒไนๆ
ไบ\n212503 0 ้กถๆฐๆไธๅ็๏ผๅบทๅธๅ
ๅพทๅ
ๅฃซๅณๅ
จfamilymartไพฟๅฉๅบ\n212504 0 ไฝฟๅพๅ
จๅนดๆฐดๅฉๆ่ตไปปๅกๅจๅๆฅๅบ็กไธๅๆ้ซxx%\n1000 Processed\n classify content\n213000 0 ๆ๏ผๆฟ็ๆๆบไฝ ่ฟๆไนๆฑ็ๅฆๅฆ็กๅ\n213001 1 xxxxxxxxxxxxxxxxxxxx-ใๆฅไผไปฝใๅฐๆฌไผๅ๏ผโๆข
โไธฝๅฅณไบบ่๏ผๆปก้้ข้่ดดๅฟๅฅฝ...\n213002 1 ๅฅฅๆฏๅฏ็บณๅฎขๆฟx.xๆ๏ผไฝ่ณxxxๅ
้ๆฉ้คๅคๅฎต๏ผ้็นๆฟxxxๅ
่ตท๏ผๅฐๅ๏ผๅค็ ๅคงๆกฅๅคด๏ผ่ฎขๆฟ็ญ็บฟx...\n213003 1 ๆถๅฐๅท็ :xxxxxxxxxxx๏ผ็ญไฟกๅ
ๅฎน:ๆพณ้จๅจๅฐผๆฏไบบๅจฑไนๅบ็ฐๅทฒๅผ้็ฝไธๅๅฝฉwww.xxx...\n213004 0 ๅนณ่ฐทๅบ้ฃ่ฏ็ๅฑ้่ฟๆกไปถๅฎกๆ ธไผ่ฎฎ\n1000 Processed\n classify content\n213500 0 ไธใๅไธ่ฎกๅไนฆๅฐๅบๆๅค้่ฆ\n213501 0 ๅ
จๅบ็ดฏ่ฎกๆถ่ดนๆปกxxxx้็พ็ฝๆๆฉๆ้ฅผไธ็\n213502 0 ๅฐ้็ๅพไนฆ็็่ฒไผผ่ดจ้ไธ้\n213503 0 ไธไบคๆๆไนๅก็ฃไฟๅธๅๅๅฅฝๆ็ฅจ็ณป็ป\n213504 0 ๆฏๅคฉๆ็บฏ้ฒๆฐด็ๆถๅๆ10ๅ้็่ธ\n1000 Processed\n classify content\n214000 0 ่
พ่ฎฏๆฟไบงๅจไธญๅ
ณๆๅไธๅคง่ก็ICๅๅกไธพๅไบไธๅบๆๅ
ณๆฟไบงไผ็ญน็ๅคงๅๅคด่้ฃๆด\n214001 0 8ๆ11ๆฅ่ๅท้ซๆฐๅบไบบๆๅธๅบ็ฐๅบๆ่ไผ\n214002 0 ๆฑฝ่ฝฆ็พๅฎน่ฝฆๅ
ๆดๅฐๆฑฝ่ฝฆ็พๅฎน\n214003 0 ๆๆๆๆๅฎๆๆด็็ต่ๆไปถ็ไน ๆฏ\n214004 0 ๅจๆฌๅท่ฟไนๅคๅคฉ็ฌฌไธๆฌกๅ่ฟไนๅฅฝ\n1000 Processed\n classify content\n214500 0 ๅถ่ฎข\"ไฟ่ฟไบบๆๅๆฐๅไธ14ๆก\"\n214501 0 ๅฏนๅ
ถไป3ๅๆๅนฒ้จไปฅ่ดชๆฑก็ฝชๅๅคๅคๆๆๅพๅไธๅนด\n214502 0 ้่ขซๆ่ทไบๅ่ญฆๅฏ้ฎไธบไปไนไฝ ๆขๅฎๅทงๅ
ๅๅ่ฟ่ฆๅๆฅๆข้่บซๅฌ\n214503 0 ไธญๅฝ็งปๅจๅฐฑๆ4Gๅก็ปๆๅฏ่ฟๆฅไบ\n214504 0 ๅ ไธบๅผนๅฑ้่ฏ่ฎบ่ฑๅ้ชจ่ธๅฐ่ธๅคงๅต่ตทๆฅไบ\n1000 Processed\n classify content\n215000 0 ไฝๆฏๆไปฌxxๅท้็ฅๅๅ
ๆๅๆธ
ไป็่ฟไบๆธฏxxxxxxๆๅ่ทๅฉ่ถ
่ฟ็พๅๅไบ\n215001 0 ้ ๆๅฎซๅด้ชๅฑ่งไธๅๅฏนไธญๅฝไบบๅผๆพ\n215002 0 ไธไป
้ๅพท่ดฅๅ่ฟ่ฟๆณใ่ฎฐๅพไปๅนดๅจ้ฆๆธฏ่ทฏ่พนไนๆฏไธๅฏนๅคง้้ๅนดๅ
ฌ็ถๅจ่ทฏ่พน้ๆ\n215003 0 ไปๅคฉๆฉไธ็น่ฏ็300477ไธๅไนๅฅๅๆๆฟ\n215004 0 ๅ
จๅคฉๆ้ซๆธฉๅบฆ๏ผๆฌ็่ฅฟๅ้จๅฐๅบ30โๅทฆๅณ\n1000 Processed\n classify content\n215500 0 ้ฉพ้ฉถๅ้ฉพ้ฉถๅA***ๅท้ๅๅๆ่ก้ฉถ่ณไบไฟ้ซ้136ๅ
ฌ้ๅคๆถๅ ๆ่ฝฆ็็ฏๅ
ไฟกๅทใๅถๅจใ่ฟๆฅใๅฎๅ
จ...\n215501 0 ๆฏๅคฉๅจInsไธๆๆๆฐ็ๅคง็ๅๅ\n215502 0 ็ฐๅจๆไธๆฌพๅซใPieceใ็ๅค่ฎพ\n215503 0 ๆ็นไนๅจๅฎถ้้คไบ็ฉๆๆบๅฐฑๆฏ็ก่ง็็นไนๆๆญปไบ\n215504 0 ไธ็ดไปฅไธบๅ
ฌไบคๆฏๅฐ้ๆด้ๅๅๅ\n1000 Processed\n classify content\n216000 0 ๅจๆๆ่ๅฉ70ๅจๅนด็้่ฆๆถๅป่ฟๆ ็ๆฏไธชๆไน้ๅก็ๆดปๅจ\n216001 0 ่พๅ
ฅ้กพๅฎข็ๆๆบๅทๅ็็กฎ่ฝๆฅๅฐๆๅจๅฐๅบ\n216002 0 ๅพ้พๅจxๆไปฝ็ๅไบฌๅๆพๅฐๅไบบ่งๆจกไปฅไธ็ๅ้ๅๅธไผๅบๅฐ\n216003 0 ๅฝๅณๅฐฑ็พๅบฆไบไธไธ่่ ไผไธไผๅธ่ก\n216004 0 ่ฟไบๆธฏๅธๆฐ่ฑกๅฐ24ๆฅ10ๆถ30ๅๅๅธ\n1000 Processed\n classify content\n216500 1 ๅฅฝๆถๆฏ๏ผ่้xxๅ
ๅ
้ๅฎฝๅธฆๅ
่ดนไฝฟ็จไบ๏ผ็จ่้ๅฎฝๅธฆ๏ผๅธฆๅฎฝ็ฌไบซ๏ผ็ฝ้ๅฟซ๏ผไธ็ฝไธๆ็บฟ๏ผ็ฝ็ป็จณๅฎ๏ผๅฏ...\n216501 0 ๆจๅคฉ็ปซๆจไธ็นๅ้ฃๆบๆ็น็ฑ้ๅบๆญๅท่ทฏไธ\n216502 0 ่ขซx๏ผๅฏน้่ทฏไบค้ไบๆ
่ฎคๅฎไนฆๆ ๅผ่ฎฎ\n216503 0 ๅ ไธบไปๅคฉๅจๆ ้กๆไปฅไปไปฌไธๆฏJhonๅMaryไบ\n216504 0 โ็ๅ็ญ๏ผโ็ๅฝไธ่ฝๆฟๅไน่ฝป\n1000 Processed\n classify content\n217000 0 ๅพฎ่ฝฏWinxxๅฐๆ็ป่ชๅฎถ้ณไนๆตๅชไฝๅ Xboxๅ็ๆฅๅ\n217001 0 ๅบ้ๆถจไบๅฟๆ
ๅคงๅฅฝๅ ไธๆๅคฉไธไธ็ญ\n217002 0 ๅไธบmate7ๆๆบๅฅhuaweimate7ๆๆบๅฃณไฟๆคๅฃณ่ถ
่็ซไฝๆ็ฉบไธชๆง้ฉ็\n217003 0 Limbergๆฏไธไฝๆฐๅบ็ๆดๅฝขๅค็งๅป็\n217004 0 ่ตถ็ๅฎ ็ฉๅป้ขๅป็ปๅฅถๆฒนๆ่ฒ่\n1000 Processed\n classify content\n217500 0 ๅๅฒไธ็ฌฌ5ไธชๅปบๆๅฐ้็ๅๅธ\n217501 0 ๆ ้กๅๅ
่ๆๆฏๅคๅฐ้ฑโโๆ ้กๅๅ
่ๆๆฏ้ๆฉๆ ้ก่นๆกฅๅป้ข\n217502 0 ๆฒณๅๅฐฑ่ฟ็น็ฒฎ้ฃใๆฟๅฐไบงๅ้ฟๅๆฑฝ่ฝฆๅๅพๆ ๆ ็\n217503 1 ๅฐๆฌ็ไผๅ:ไฝ ๅฅฝ๏ผ โไธๆๅฅณไบบๅคฉ๏ผไปๅคฉไฝ ๆๅคง! โไบฒ๏ผๅธญๆกฅ้ฝๅธๆไบบๅๅ่ฟไธๅ
ซๅฆๅฅณ่๏ผๅ
จๅบๅ
ซๆ...\n217504 0 ๆณ้ขๅ
ฑๅฏนๅนฟ่ฏ็่ๅไธๅ ๅคๅฎ็7่ตทๆกไปถ่ฟ่กๅฎฃๅค\n1000 Processed\n classify content\n218000 1 ไผไบบไธฝๅฆ้็บขๅ
ๅฆ๏ผๅผๅ
ๅฎต๏ผๅฅณไบบ่ๅ่ๆฅไธดไนๅญฃ๏ผไธบๅ้ฆๅไฝไบฒไปฌ๏ผ็นๆจๅบxxโxxxๅ
ๅคงๅฅ็บขๅ
๏ผ...\n218001 0 ่ฏฅๅบ19ๆกๅ่ทฏๆๆกฃๅ็บงๅทฅ็จๅทฒ่ฟๅบๆฝๅทฅ11ๆก\n218002 0 ๆฅ็ขไธญ่ฏ็ฎ็ดๆฒไบบๅฟ่พ้ฆ้ฃไธ้ๅ\n218003 0 ๆไปๅคฉๅจๅไธบ็ฝ็็ญพๅฐ่ทๅพไบ112Mๅ
่ดนๆฐธไน
ๅฎน้\n218004 1 ้xxxxๅ
๏ผๅ
ถๅฎๅฝ้
ๆ ็้x.xx็xxxxๅ
ๅฎ้ขๅ้ฉฌ้ข้ฝๅฏไปฅๅๆ.ๅฆๆ้่ฆๅฏ็ต่ฏ่ฎข่ดญ....\n1000 Processed\n classify content\n218500 0 ๆฑๅฎๆฑฝ่ฝฆๅฎข่ฟ็ซ็็ฏๅขไนๅคชๅทฎไบ\n218501 0 ๆ็จ็ๅฐฑๆฏๅไธบ็ๆๆบๅพ
ๆบๆถ้ด้ฟๅๅบไนๆบๅฟซ็ๆๅพๅๆฌข\n218502 0 ๅฟไบบๅคงไธปไปปๆ็ฆๆฐ่ตด้ฆๅบไพฟๆฐๆๅกไธญๅฟๅกฌ่พนๆไบ็ปๅ
ๆถๆท็จๆฐๆฐๅฎถ่ตฐ่ฎฟ\n218503 0 ็ๅ
ฌๅฎๅ
ไบค่ญฆๆป้ๅฏๆป้้ฟ่ดพไธญๅนณไธ่กๅจๅฟๅ
ฌๅฎๅฑๅฏๅฑ้ฟใไบค่ญฆๅคง้้ฟ่ตตๆ็้ชๅไธๆทฑๅ
ฅๆ้ๆฃๆฅๆๅฏผๅ...\n218504 0 ๆฏๆไธไบๅจ่ๅAPP่ฝฏไปถSnapchatไธไธไบ็ซๅฏ็ๅญฉๅญ็ๅพ็ๆชๅPOไธๅป\n1000 Processed\n classify content\n219000 0 ๆ็ต่ๅไบ่ฟๆฏไฝ ไปฌQQๅบ้ฎ้ขไบ\n219001 0 ๆณๅฝๅไบงๆฎ็ฝๆบๆฏๅฟๅฝขๅคฉ็ถๆค็ฉ็ฒพๆฒนๆๅทฅ้ฆ็็คผ็่ฃ
100g*4ๅ็พๅฝไบ้ฉฌ้$15\n219002 0 ๅบ้ๅจๆNBAไนๅ่ขซ่ฏไปท่บซๆ็ฆๅฐๆๆๆไธบ้ฆๅ\n219003 0 ไฝ ๆฏๅฆ่ฟๅฎๅจ็ต่้ขๅ้่ดญ็ๆฏๅๅบไปทๆ ผๅฎๆ ็็ตๅจ\n219004 0 ๏ผๆพ็ถWindowsPhone้ๅฐ้บป็ฆไบ\n1000 Processed\n classify content\n219500 0 ๆฌข่ฟไบฒไปฌ้่ดญ่ฏไฟก็ป่ฅ\n219501 0 NBA็ๆๆตท่พนไผๅ็็ธ็ๆป่ฝๅธๅผ็่ฟท็็ฎๅ
\n219502 0 โ120โๆฅๆๅ่ตถๅฐๅๆตไบไฝๆธฉ\n219503 0 ๅฐ้ๆไธชๅฐๅฅณ็็ดๆฅๆคๆ่ฟๅป\n219504 0 LinkedInๆถๆๆผๅๅๅฒ่งฃๆ\n1000 Processed\n classify content\n220000 1 ไฝๅ
ญ้ไธ็ญไฟ้ๆดปๅจใ ๆตทๅ้
ๅบๅ
จไฝๅๅทฅๆๅพ
ๆจ็ๅ
ไธด๏ผ่ฎขๆฟ็ญ็บฟ๏ผxxx-xxxxxxxx\n220001 0 2015ๅนด็ฌฌๅๅทๅฐ้ฃ็ฟ้ธฟ10ๅทๅๅ็ป้ๆตๆฑ\n220002 0 ๆปฅ็จ่ๆใๆณ้ฒไธชไบบ้็งๆดๆฏ่บซไธบๆฐ่ญฆไธๅฏๅ่ฐ
็่ฟ้\n220003 0 ไธๅฝๆถ็ไธญๅคฎๆฟๅบๅไฝๆๆฏๅไนฑ\n220004 1 ๅๆ กไน่ทฏไป็ซๆไฟฎไธ่ตทๆญฅ๏ผๅปๅนดๆๆ กๆไธคๅๅคๅญฆ็ๆๅๅๅ
ฅๅๆ ก้ซไธญ๏ผๆฌข่ฟๅ ฑๅๅๅ ๅจๆซๅญฆไน ๏ผๆไปฌๅฐๅ
จ...\n1000 Processed\n classify content\n220500 1 ไบฒ็ฑ็ๅฎถ้ฟ:่ด่ดๆ็ฎกๆ้ด่พ
ๅฏผ็ญๅผๅง็ญๅๅฆ๏ผๆฌข่ฟๅนฟๅคงๅฎถ้ฟๆฅ็ตๆฅๅญๅจ่ฏข๏ผxxxxxxxxxxxไฝ\n220501 0 24โ2713880376289\n220502 0 ๆๆไธญๅไผๅๅไธ็ฐๆFPS็ฝ็ปๆธธๆ็ธๅ็็ฌฌไธไบบ็งฐ่ง่ง\n220503 0 ๆ ๆๅบ้ๅๆไบๆ่ต่
็ๅฉๆขฆ\n220504 0 ๆๅคงๅฉ่ฎพ่ฎกๅทฅไฝๅฎคgumdesign่ฎพ่ฎก็โmastroโๆฏไธไธชๅฐๅๅฎถๅ
ท\n1000 Processed\n classify content\n221000 0 ๅ
จ็้ถ่กไธ้่ๆบๆ่ตไบงๆป้้ฆๆฌก็ช็ ด3ไธไบฟ\n221001 0 ไฝ ๅช่ด่ดฃๅผๅฟๅฐฑๅฅฝ'ๅฝๆถไธๆ็็ธ็ๆ\n221002 0 ๆไนไธ็ฅ้่ฟ็ฌxF็ๅ้่ฝไธ่ฝๅค็ช็ ดไธ้ข็้้ๅๅ\n221003 0 Daianaๆฏ็ฑSoupStudio่ฎพ่ฎก็ไธ็่ฝๅฐ็ฏ\n221004 0 ไปไปฌ่ชๅทฑๆณก็้ปๆพ้ฒ่่้
ไนๆฏๅคชๅฅฝๅ\n1000 Processed\n classify content\n221500 0 8็น1ๆฐช๏ผFacbookไนๅ็งปๅจ่ง้ข็ดๆญไบ\n221501 0 ไฝไบๆตๆฑ็ๆ
ๆบชๅธๆ่ตท้ๅทฅไธ่ทฏxxๅทไฝฐไฝณ็ตๅจๅ็ชๅๅคง็ซ\n221502 0 ็ฆๅทๅธไธญ็บงๆณ้ขๆไธฝๅจๆณๅฎๅจ20l5ๆฆ่ก็ปๅญ็ฌฌ306ๅทๆกๅฝๅฎกๅค้ฟ็บฆไธ่ฏไบบๆไป้่ฐ่ฏ\n221503 0 ไฝ ๆฅๆไบ่ฟ็ๅฐฑๆฏๆ่ตๅๆถ็็ไบบ้ฟ่ฝด็7ๅบงๅธๅฑ/ๆญไธ็ผธmpareyourselfwithothๅฅฝ่ฏด็\n221504 0 ๆๆถๅบๅนธ่ชๅทฑ่ฟๆฏ่ด่ดฃ้ฃๆบ็บฟ่ทฏ\n1000 Processed\n classify content\n222000 0 ๅฐ็งฆๅ ๆถๅซ่ๅกไพตๅ ๆก่ขซๆฒฑๆฒณๆดพๅบๆไพๆณๅไบๆ็\n222001 0 ๆฉๆฝๅทๆฃๅฏๆบๅ
ณไพๆณๅฏนๆๆณฝๆๅณๅฎ้ฎๆ\n222002 0 ๆฏๅป็ๅจๅฏนๆ็ๅค่ฟ่ก้บป้ๅๅผๅง่ฟ่ก็\n222003 0 ้ๅทafp้ขๆ็ญๅฐๆญฃๅผๅผ่ฏพใใxxxxๅนดxๆxxใxxๆฅ\n222004 0 ๆๆๆบๅ็ต่็็ๅพฎๅๆจกๆฟ้ฝๆขไบๆฐ็\n1000 Processed\n classify content\n222500 0 xๆๅธธๅทๆบๅบ่ฟๆฅไบๆ่ฟ้ซๅณฐ\n222501 0 ไปๅคฉไธญๅๅจๅป้ข็ๅฐๅฐไพๅฅณไบ\n222502 0 ไปๅนด็ฝไธ้ถ่ก็ดๆฅ่ขซไบบ็ๅทๆถ่ดน3w4\n222503 0 ่ฏฅ่ทฏๆฎตๆพไบxxxxๅนดxxๆxxๆฅxxๆถxxๅๅ ๅฑฑ่ฅฟๆนๅๅๆไธป็บฟ็ซKxxx+xxxๅค่ฝฆๆต้ๅคง\n222504 0 ๆ่ฎธๆฏ็ต่ๅ
ๅญๅจ็็
ๆฏๅฏผ่ด่ฟไธช็ฝ้กตๆปๆฏ้พไปฅๆๅผ\n1000 Processed\n classify content\n223000 1 ๆจๅฅฝ๏ผๆๆฏๅ่ทไฝ ้่ฏ็ๅนณๅฎไฟก็จ่ดทๆฌพ้ญ็ป็๏ผๆๆ็ฎๅ๏ผx-xๅคฉๅฐ่ดฆ๏ผ้ๅ้่ฟ๏ผๅ
ฌๅธๅฐๅ๏ผ้กบๅค่ทฏ...\n223001 0 ไฝ ไธๅฆจๅๅคๅ่ฟ3ๆ่ฟ้ถ่ฟๅจๆฅ้ๅก่บซๅฝขๅฆ~\n223002 0 ๆๅจโ้ข็ดๆๅตๅฐธโๆ่ถดไบ255ๅชๅตๅฐธ\n223003 0 MQTTๆญฃๅผๆไธบๆจ่็็ฉ่็ฝไผ ่พๅ่ฎฎๆ ๅ\n223004 0 ้จๅ็ฌฌไธๆน็จๅบไธญๅ
ๆ ๅจๅฎไฝๅฐๆๅญ่พๅ
ฅๆกๅไธ่ชๅจๅผนๅบ่งฆๆง้ฎ็\n1000 Processed\n classify content\n223500 1 ๅฟ้ฆๅจๆxxxxโ็พๅๆ ทๆฟๆฟๅพ้ๆดปๅจโไปทๆ ผ้ข็บฑๅณๅฐๆญๆ๏ผๆจๅๅคๅฅฝไบๅ๏ผๅฟ้ฆๅฐ่ฉนๅๅ
จๅไบxๆฅๅณ...\n223501 0 ๅทฅไธๆบๅจไบบๅจ้ๅๆซ็ซฏๅทฅๅ
ทๆถๅบ่ฏฅๆ่้ฃไบ้ฎ้ข\n223502 0 ไปไปฌไธคไบบๅจ้ฃๆบไธไธบไธๅๅฟ่บ้ชคๅ็2ๅฒ็ท็ซฅ่ฟ่กๆฅๆ\n223503 0 ็่ณ้ฒ่ฝ+ๅฅฝ่ฟๆฐ่ฝ้ ๅฐฑๆไบบ็่พ็
\n223504 0 ๅปถๆ็ๅคๅปบ็ญๅคๅคไฝ็ฐ็ๅไบบ็่ฟ็ง็ฅ็ฅท\n1000 Processed\n classify content\n224000 0 ๆตๆฑๅฎๅ่ฑชๅไธๆฅๆธธๅฐๅๆๅๆๅบฆๅๆไธๆฅๆธธ\n224001 0 ๅฏๅฃซๅบท็ญพ็ฝฒๅ่ฎฎๅฐๅจๅฐๅบฆๆ่ตxxไบฟ็พๅ
ๅปบๅ\n224002 0 HR่ตซ่ฒๆๆ่ดไน็พ่ๅ็ผ้15ml\n224003 0 ๆ ๆฎๆฟ่ฏบๅจไธๅบทๆๅๅนถๅๆ้ซceo่ฒๅฅฅ่็บณ็่ช่ต\n224004 0 ้ๆฑๆ ็ๆฅๅไธน้ณๆ ็ๆฅๅๅไบฌๆ ็ๆฅๅ\n1000 Processed\n classify content\n224500 0 ๆน้ฉๆนๆกๅพๅฐไบๅธๆณ้จ็่ฎคๅฏ\n224501 1 ไบฒ็ฑ็ๅไผๅxๆxๆฅไธxๆxๆฅไธๅนดไธๅบฆ็ๆฅๅญฃๅๅฆๅ่๏ผๆญฆๆ้ถๆณฐๅๅบๆดปๅจไนฐxxxx้xxx๏ผx...\n224502 0 ้ฟ้ๅทดๅทดๆๆ่ฟ
้ๅๅ
ฅๅฅขไพๅใ่ฝปๅฅข\n224503 0 ใๆฏๅฅณๆดพๅบๆๅ
่ขซๆๅฎถๅฑๆพxๆฌกๆฅ่ญฆ่ญฆๆน็งฐๅฎถๅกไบใๆดพๅบๆ\n224504 0 21ไธ็บชๆ่ฒ็ ็ฉถ้ขไธญๅฐๅญฆๆ่ฒ็ ็ฉถไธญๅฟใๆฌๅทๅธๆข
ๅฒญๅฐๅญฆๆฟๅ\n1000 Processed\n classify content\n225000 0 ๅ
ดๆญฃ็บ ้ๅฎๅฎๅคงๅญฆๆ กๆ ก้ฟไปป็ง็บขไธไธชๅฎๅฎๅฐ็ไบบๅไปปไฝไธไปถไบๅฟ\n225001 0 โ้ถ่ท็ฆปโๆๅๆณ้ข็ๆๅๆฐๅด\n225002 0 ้ๅฒใๅนฟๅทใๅคง่ฟใๅไบฌ็ญๅฐไนๅฎๆฝไบ็ฆปๅฉ้ๅท\n225003 0 ๆดไธป่ฆ็่ฟๆฏ็ฑไบๆธ
ๆฟๅบ่
่ดฅๅไฟๅฎ\n225004 0 B่ถ
ๅธๅ
ๆ่ฎถ้๏ผไฝ ็ป่้ฃไนๅคงไธบไปไน่ๅญ้ฃไน็ป\n1000 Processed\n classify content\n225500 0 ็ฌฌ61่ฎฒ๏ผScalaไธญ้ๅผๅๆฐไธ้ๅผ่ฝฌๆข็่ๅไฝฟ็จๅฎๆ่ฏฆ่งฃๅๅ
ถๅจSparkไธญ็ๅบ็จๆบ็ ่งฃๆ็พๅบฆไบ๏ผ\n225501 0 ไปๆฅๆฒก็จ่ฟiPhoneๅคฉๅคฉ้
ธiPhone\n225502 0 ็ฆ่ฝ่่ต็ง่ตๅ
ฌๅธโๆฐไธๆฟโๆ็ไปชๅผๅจๅไบฌไธพ่ก\n225503 0 ๅฏนไธๆฏๅผบๅฅธๆฏ่ฝฎๅฅธโฆโฆ็ผๅง้ฟ็นๅฟๆๅโฆโฆ\n225504 1 ไฝ ๅฅฝๆๆฏ้ณๅ
ไฟก่ดท็ๅฐๅง๏ผ้ณๅ
ๅฎกๆนๅพไฟก๏ผๅ
ๅคง้ถ่กๆพๆฌพx-xๅคฉๅฐ่ดฆ๏ผๆ ๆตๆผๆ ๆ
ไฟ๏ผ็ต่ฏxxxxx...\n1000 Processed\n classify content\n226000 0 ๅ่ฏไฝ ไปฌๆ็ฝ็ๅขๆๅพๆ็็ธ\n226001 0 ไธๆตท็ไธญๅฝๅฝ้
ๆบๅจไบบๅฑไผไธ\n226002 0 ไฝไบ2015ๅนด8ๆ2ๆฅๅๆ้ฃ\n226003 0 ไปๅคฉๆๅ็ฐๅฅฝไน
ไธ็่ฃ
ไฟฎๅพๅบไบ\n226004 0 ไปๅคฉไธญๅๅปๅ็ๆ ๅๅพๅท็ปๅฃๆฏๆๅ่ฟๆๅฅฝๅ็้ฅญๅบ\n1000 Processed\n classify content\n226500 0 ๆดๆดๅฉ่กฃๅ ก่ถ
ๅผนๅ2015ๅค่ฃ
ๆฐๅไธๆๅๆฌพๆฌง็พๆข็ผๆ่ฒๅฐๆช\n226501 0 ๅฏๅฏนๅๆๅคงxxๅ้ซๅฐๅคซๅบๅฐๅฉ็ๅฟๆฟ่
ๆฅ่ฏดๅฐฑไธๆฏ่ฟๅไบไบ\n226502 0 ไธไธชๆ็ๆๆไบคๅฒๆฅๅจ8ๆ21ๆฅ\n226503 0 ๆตทๅฃไธญ้ขโๆณๅฎๆๆณๅฎโๅคง่ฎฒๅ ๅจไธญ้ขไธๆฅผๅคงๅฎกๅคๅบญๅผ่ฏพ\n226504 0 ๆ่ฟๆฏ้ฒๆ?sofina?็ฒ้ฅผ\n1000 Processed\n classify content\n227000 0 ่ฏด่ชๅทฑๅฐ่พพไบTOPOFTHEWORLD\n227001 0 ่ฎคไธบๆบๅจไบบๅจ่่ฎบๆๆพ่ขซๅคธๅคงไบ\n227002 0 ็ปๅคงๅฎถๅไบซโYouAreBeautiful\n227003 0 ่ฎฐๅพๅฐๆถๅๅฐๅทๅท็ไธ่ฌ็จ็ปณๅญ\n227004 0 ๆไนๆณ่ฆ็บขๅ
ๅๅฐๆทๆทไบๆฏ็ๅฐๅธไธ\n1000 Processed\n classify content\n227500 0 ๅฏนไบๆ่ฟ็ง็ๆฏไบSMTMๅUnprettyRapStar่ฟ็งๆปกๆฏๅๅฃฐ็Hippopๆฏ่ต็ไบบ\n227501 0 ๅๅจ็ต่้ขๅๅทฅไฝๅฐฑๆณ็ๅบๅป่ท\n227502 0 8/4ๆธฏ็iPadair\n227503 1 ๆณๅทๅธๅคฉๅๅฒ็นๆ็ๅบ\n227504 0 ๆฏ่พ่ฝฆ้ฝไธฅๆ ผๆ็
งx๏ผxx็ๆฏไพๅถไฝ\n1000 Processed\n classify content\n228000 0 ่ๆฟๅบๅจ้ๅปบไฟกๅฟไธๅๅบฆไธ่ถณ\n228001 0 ๅไบฌๅทฅๅ้จ้จๅ่กจ็คบไผ่่ๆฝๆฃ\n228002 0 ๆๆจไธป่ฟๆ ท่ชช็twitter\n228003 0 ไธ่ฎฉๆๆบๆงๆฏไฝ ็่ๆฑใ็ก็ ไปฅๅๆๅๆฐดๅนณ\n228004 0 ็พๅบฆ็ๆถๅฝๆฐ้ใๆถๅฝๆ็ใๅ
ณ้ฎ่ฏๆๅ่กจ็ฐไนไธๅคง็ธๅ\n1000 Processed\n classify content\n228500 1 ๅณๆฅ่ตท่ณxๆxxๅท๏ผๆขๅญฃๆธ
ไป๏ผๆผๅคฉ้จๆ้ฅฐๅ
จๅบๅฏนๆโโๆๅฉๆผๅคฉ้จใ\n228501 0 ๅฟๆถๆขฆๆณ็ไผไผดๅๅฆAๆขฆๅๆฌก่ขซๆฌไธ่งๅน\n228502 0 NoFilter๏ผๅซไปฅไธบๆ็ฆไบ\n228503 0 ไธๆฌกไฟๅๅ
ฉๆฌกไฟๅnๆฌกไฟ็ๆพๅฟๅไธๆฌกไบ็ๅไบบ็็ณปๅพ้ฉๅ่ฆ้ปๅๆๅ\n228504 0 ไธ่ฟ่ฃ
ๆฝขๅๅนฟๅไธๆ ทๅพ็ฒๅซฉๅพๅฐๅฅณๅข\n1000 Processed\n classify content\n229000 0 2015ๅนด6ๆ้ๅฒๅธๆฐๅผๅทฅ่ฟไบฟๅ
ไบงไธ็ฑป้กน็ฎ48ไธช\n229001 0 ่ขซๅไบบ่กๆถฒ้
็ฒพๅซ้ไธบ178ๆฏซๅ
/100ๆฏซๅ\n229002 0 ไธฐๅฟไพ็ตๅ
ฌๅธโ้่่โๅ
ๅๆๅก้่ตฐ่ฟ็คพๅบ\n229003 0 ๅไบฌไธๅ็ฎก้ๅๅไฟก้ฎๅธ้ฟ๏ผๆไปฌ็ๅๅธ่ฝ็ฎก็ๅฅฝๅ\n229004 0 ่ฏฅๆกๆถๆก็้้ขๅ
ฑ่ฎก400ไฝไธๅ
\n1000 Processed\n classify content\n229500 0 ่ฟ็งๆบๅจไบบๅจ็ฐๅฎ้ๅทฒ็ป่ขซๅฝๅค็ๆๅฎขDIYๅบๆฅไบ\n229501 0 ไปฃ่กจxxไธชๅท็xxๆ นๅทๆณๆฐดๆฑๆฏไธๅคงไบฎ็น\n229502 0 ็ฑไบ่ทฏ้ขๅๅก้ ๆDN200ไพๆฐด็ฎก็บฟ็็ฎก\n229503 0 ๅฝJ็ปA่ฏ่ฏด่ชๅทฑๆ
ๆธธ็ปๅ็ๆถไปๅด็ก็็้ฃๆถ่ตท\n229504 1 ๆฌๅ
ฌๅธ้ฟๆๆๆ๏ผไปฃๅผไบๅๅๅฐๆญฃ่งๆบๆๅ็ฅจ๏ผๅฝ็จ๏ผๅฐ็จ๏ผๅขๅผ๏ผๆๆๆฌพ๏ผๅทฅ็จๆฌพ๏ผๆๅกไธ็ญ๏ผ็ฝไธๅฏ...\n1000 Processed\n classify content\n230000 0 ็ฌๅค็็ต่ๆ่
ๅทฅไฝไนๅไฝ ๅฆๆ็
ง็
ง้ๅญ\n230001 0 1986ๅนด่ๅๅฝๆ็งๆ็ป็ปๅฐๅ
ถๅไธบไธ็่ช็ถ้ไบง\n230002 0 ไธๅๆไปฌๆ็
ง่ๅธ่ฆๆฑ็ฉฟ็ๅญๆๅธฆ็่ฑๅๅปไบ\n230003 0 ่ทๆ่ฏดๅๅ้ฃๆบไธๆฅๆณๅป่ฝฌ้ซ้\n230004 0 ๆ นๆฎRealGM่ฎฐ่
ShamsCharaniaๆฅ้\n1000 Processed\n classify content\n230500 0 ่ช้ๅบๅทก่ช็ณป็ปใๅฎ้ฉฌConnectedDriveไบ่้ฉพ้ฉถ็ณป็ปไนๅฐๅบ็ฐ\n230501 0 โๅป็็ดๆฅๅ่ฏไป๏ผโๆ้ฃ้ฑไฝ ่ฟๆฏ็็็ผ็ๅง\n230502 0 ๆ่ฟๅๆๅไธญ่ฏๅคด้จ็ฎ็ๅฅ่ฟน่ฌ็ๅฅฝไบ\n230503 0 xxxxๅปถ่พนๅทๅฅ็พ้ฆๆ ่ตxxๅ
ฌๆคไปฅไธ็บง่ช็ฑๅฑ็คบ\n230504 0 ๅ็ฎก้ๅ็ซๅณ้ๅๆชๆฝๅฐ้ขไธ็ปณๆๅผ\n1000 Processed\n classify content\n231000 0 ๅๅท่ช็ฉบxuxxxx้ข่ฎกๆ็นxๅฐๆถ\n231001 0 ยทๅนฟๅทๅธๆฐๆ10ๅ
้บปๅฐ่ขซๆ็ๆณ้ข็งฐ่ต่ต่พๅคง\n231002 0 ่ๅทๅฝ้
็งๆๅญไธพๅ็ไธๅบๆบ่ฝ็ๆดปๆฐๅๅๅธไผไธ\n231003 0 ็ๆฏๅคชไธๅฅฝไบๅฐคๅ
ถๆฏ้ผป็็ซ้้่ฆ็ฏ็ๆๆ\n231004 0 T315I็ชๅ้ณๆงๆฃ่
ๅฏน็ฌฌไธไปฃๅ็ฌฌไบไปฃTKI่่ฏ\n1000 Processed\n classify content\n231500 1 ่ฑๅ
xๆฅผๆฐด็ๆฐๆฅ็ฎ็คผ็ต่ฏ่ฎขๆฟๆไผๆ ๏ผไธไบบไผๆ xx๏ผไธคไบบๅ่กไธไบบไผๆ xxxใใๆฌข่ฟๅๆฅๅ่ถใใ\n231501 0 ็พ็
็ไธๅปๅฐฑๅไธไธชๅพ้ฅ่ฟ็ไธ่ฅฟ\n231502 0 ไฝๆฟๅ็ฎกไธๅๅฎถไน้ดๅปบ็ซๅ่ฐ\n231503 0 ๆ่ตๅฎ้ธๅฎๆดๆ็ฐ้ๅคงๅฅ็ญๆจๆฟๅฆ\n231504 1 ๆ่ฝๅจๅซๅ้กถ@้ฟๅฏฟไธๅๅบ๏ผๆฌขๅบx.xx๏ผๆ่ฝๅ้กถ ๏ผๅ
จๅบx.xๆ๏ผ้ค็นไปทๅค๏ผ๏ผๆๅๅคๅค๏ผ็คผๅ...\n1000 Processed\n classify content\n232000 0 ๆจๆไธๅๆขฆๆขฆๅฐๆๆบ็ซๅพไธๅพไบ\n232001 0 ๅชๅ็กฎไฟๆฌๅบๅ็ฎกๆงๆณๆก็บฟโ็ญ็บฟไธ็ญโ\n232002 0 ๆไปฌTF็็ฒไธๅ่ขซๆญค่็ฎ็ปไพฎ่พฑ็ๆๆ็ฒไธไปฌ่ฆๆฑๅ
ฌๅผ้ๆญ\n232003 0 ่ฏฅๅปๆดๆฒปไบค้ๅดๆ็ฒพๅๆตช่ดนๅจ่ฟไบไธ\n232004 0 xxExofficioCafenistaJacquardๅฅณๅฃซ็พๆฏๆทท็บบ้ๅนฒไฟๆ้็ป่กซ็พๅฝไบ้ฉฌ้...\n1000 Processed\n classify content\n232500 0 ๆ็็ๆ ๅฟไนไธๅฐฑๆฏไธๅๆณๅปๅนฒๆถๆ่
ๆนๅคไปไบบ็ไปทๅผ่ง\\n0\\tไปๅนด็ฌฌ13ๅทๅฐ้ฃโ่่ฟช็ฝโๆฅๅฟๆฑน...\n232501 1 ไบๅจ๏ผๅฅๅฅไธ็จๅจ๏ผๅฐๅฆนๅ
จ่ชๅจ๏ผ็ฉๆณ็ฌ็น๏ผๅๆๆชๆ๏ผๅธฆ็ปๆจไธไธๆ ท็ไฝ้ช๏ผๅฐๅ:ๆญๅทๅๅฏไบงๅ็ฉๆต...\n232502 0 ๅ็ฐ็พๅบฆไผๅ็กฎๅฎไธ่ฐทๆญไผๅๆๆฏ่พๅคง็ๅทฎๅซ\n232503 0 ๅคง้็WindowsPhone็ฒไธๆถๅ
ฅ่ฎบๅๅนถ่กจ็คบๆ่ฎฎ\n232504 0 ๅป้ขๆญฃๅผ้็จไฟๅฅๅๆฟไปฃๆไบ่ฏ็ฉๆฒป็็พ็
\n1000 Processed\n classify content\n233000 0 ๆๆบ็ธๆบไน้ฝๆไธๅบ้้ข็็พๅๅฃฎ่ง\n233001 0 ๆไปฌๅ
จๅฎถๆ่ฟ้ฝๅจ่ฑๅ้ชจโฆไนฐ่ๅๆฅ่ฟ้จ็่งๅผ็ธ\n233002 0 ?ๆฑๅฎไนๆ
?็งฆๆทฎๆฒณ็ๅพกๅบญ่ณ\n233003 0 ้ข่ฎกๅคง็บฆๆ6000ๅ่ๅๅฐ่ขซ่งฃ้\n233004 0 ไผ่ฎฉๅพๅคๅฐไผไผดๆจไธๅพ24ๅฐๆถๆณกๅจๆฐด้้ฟๆ\n1000 Processed\n classify content\n233500 0 19ๅฒ็ๆชๅจๅฅณๅญฉNannไบบ็งฐโSnapchatๅฅณ็โ\n233501 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ?ๅค่กไนฆ็xx\n233502 0 ่ฐข่ฐขๅไฝ็ๆๆๆฏ้็็่ถณ็ไบบๅฅฝๅฟไบบๅพฎไฟก่ฝฌๅธ่ฏท่ฝฌๅฐ็ฎๅ้ชไผดๅจๅป้ข็ๅดๅพฎ\n233503 0 ่ฃ่7ๅงๅฐฑๆ้ฃ็นๅพฎ่็็งฏ่ๆๆฟๅไธ่ตท\n233504 0 ่ฟไธๅๅ่ฎฉ้จๅWindows็จๆทๆๅฐๆ
ๅฟง\n1000 Processed\n classify content\n234000 0 ๆฒกๆๅทฅๅๆง็
งๅฐฑๅผไธไบ้ฟ้ๅทดๅทด่ฏไฟก้ๅไธไบ่ฏไฟก้ๅ\n234001 0 ๆไธไผ ไบโๆฐธไปๅฟไบบๆฐๆณ้ข2015ๅนดโๅบญๅฎก็ซ่ตโๆดปๅจๅ่ต\n234002 0 ๅ็ฑณ้ธกๆฏ็งๅๅฎพ่ชๆๅ็ฑณๅคฑๆๅฅฝๅ้็็็ธ่ถ
ๆธ
>\n234003 0 ๅฎนๆไฝฟ3ๅฒไปฅไธ็ๅฟ็ซฅไธๆๅ
ฅ็กๅๅญ้นไธๅฎ\n234004 0 ๆจๆ ็ปๅๅฏนๆฏ๏ผ็ฟป็Twinsๅๅฃฐ่็ฟปๅฏผๅธ\n1000 Processed\n classify content\n234500 1 ไธญๅฝๅไธ้ถ่กๅธๅท๏ผxxxx.xxxx.xxxx.xxxx.xxx ็ๆณข\n234501 0 ใใ1978ๅนด10ๆ่ณ1982ๅนด7ๆ\n234502 0 ่ๅฐไธ็ฉบ้ดๅไธไธๆกๅฐ้็บฟ่ทฏไบ็ธ่ฟ้\n234503 0 ไธๅทๆฅผๅจๆตๆณจ่ฅฟ่พน็็ตๆขฏๅบๅ\n234504 0 ๅๆถไนๅถ้ ้ฃๆบๅๅ็งๅๅจๆบ\n1000 Processed\n classify content\n235000 0 ใใ1ใๅๆถ้ฟๅ
็
ๅ ็พ็
็ไบง็้ฝๆไธๅฎ็ๅๅ \n235001 0 ็็ไธๆถ้ฃๆบไธ้ชไธ้ชๅฐ้ฃ่ฟ\n235002 0 ๆณๅฎ๏ผๅฅฝๅ้ฃไธชๆๅฉๆฏไปไนๆๆ\n235003 1 ใไธๆๅผๆฅ๏ผไธ้ฃๆฌ็ฐๅผๅ
ๅบ้็คผไบใๅก่ฟๅบๅ็ไฟ้ฉไธๅก๏ผๅณๆๆบไผ่ท่ต ่ๆฐ้จๆธ
ๆดใๅ่ฝฎๅฎไฝใๅทฅๆถ...\n235004 0 ๆธ
็ไนฑๅ ็ฉๅ ๆxๅคใ่ฟ็ซ ๅนฟๅๆ่ดดxxๅค\n1000 Processed\n classify content\n235500 0 ๆดไธชๅๅญๆฅผ่ขซ่ๅคฉ็ฝไบโๅด็ปโ\n235501 0 ๅไบฌๅคง้จไน้พไปฅๆต็ญๆๆปๆญ่ฟท็ญๆ
\n235502 0 ไธๅฐIndiegogo็ๆๆไผ็ญนไบงๅ็ญๅทฒ็ป็ป้\n235503 0 ๆฅๆฌHELLOKITTY้้่็ ๆๆถฆไฟๆนฟๅๅฆๆฐด่่คๅนฒ็ฅ\n235504 0 ้ๅไธ่ญฆๆนๅฑๅผไบ้ฟ่พพ20่ฑ้็่ฟฝ้\n1000 Processed\n classify content\n236000 0 ็ปๆๆๅคๅฐๅฅฝ็ฉๅนถไธget็บขๅ
็+็็็็งฐๅท\n236001 0 ็ถ่mips็ๆ็ๅๆปไธ่ขซไพต่็ปๅๆญฃๆไบไธๅคงๅ\n236002 0 ็ฑ้ซๅ่ดจ็ฒพๅๆฅผ็โโๆฝๅๆๅคงๅ้ฝใๆฝๅ็ฟก็ฟ ๅๅบญๆบๆ้ณไนๅนฟๆญFMxx\n236003 0 ็ฎ่ค่ฟๆๅ็บขๅทไธๅท่ฝๆๆ็ผ่งฃ\n236004 0 ๅซๅข
=villa=cottage\n1000 Processed\n classify content\n236500 0 ็ฎๅ5ๅนดๅทฒ็ดฏ่ฎกๆๅ
ฅไธ็พไบฟๆจ่ฟไบค้ๅปบ่ฎพ\n236501 0 ๅจ่ฟ้ๆๅ็ง้ซ็บง็ๆบๅจไบบไธๅฎไบบ่ดน่งฃ็ๆฏ็ญ็ฐ่ฑก\n236502 0 ps๏ผไฝๅบๅฎขไบบๅ
่ดน่ต ้ๆฌๅทๆ
ๆธธๅ
ฌไบคๅกๅฆ\n236503 0 ่ฟๆฎตๆถ้ด็็ต่ๆก้ขๅ
จ้ฝๆฏๅฅน\n236504 0 ๆณ้ขไธๅฎกๅคๅณ่ขซๅ้ฑๆ็ฏๆ
ๆไผคๅฎณ็ฝช\n1000 Processed\n classify content\n237000 0 โ็ฟ้ธฟโๅทฒ็ป้ๆตๆฑ่ๅฑฑ?ไธๆตท่ฝฌ็งปๆค็ฆปxx\n237001 0 ๆๆบๅ้ขๅฐฑๅฐไบไธไธชๆ ๅฟ็่่ๅทฒ\n237002 0 ๅป็็พๅฎนๅ็ไธๅๅๆฏๆดๅฎน\n237003 0 ไธไธชไพ้ ๅๆฅ่
ๆๅ
ฅ็ปด็ณปๅ
ๅ
ฅ่
็ๅฑๅทฒ็ป่ฏดๅพๅด็\n237004 0 ๅ้่ๅ
ต่ชๆ50ไธ่ฎพๆฅๅๅบ้\n1000 Processed\n classify content\n237500 0 ๅพทๅฝๆฐไผๅจ็บณ็ฒนๆฟๆฒป่
่ดฅ้ฎ้ขไธ็่ฎค่ฏไธไป
้ผ ็ฎๅฏธๅ
ใ่ชๆๆฌบ้ช\n237501 0 ๆณๅพใ่กๆฟๆณ่ง่งๅฎๅนฟๅไธญๅบๅฝๆ็คบ็ๅ
ๅฎน\n237502 0 ๅ่
่ณๅฐไผไฝฟGDPไธ้ไธไธช็พๅ็น\n237503 0 ่ฟๆถๅป้ขไธไฝๅพท้ซๆ้็่ไธปไปป่ฏด๏ผๅฑ\n237504 0 ๆไบๅฐๅธๆฐๆน้ปไบๅไบฌไบบ็ๅฝข่ฑก\n1000 Processed\n classify content\n238000 0 ่ฟๆฏ่ช1985ๅนดไปฅๆฅๅฝๅฐ้ญๅๆๅคง็ไธๆฌก่็พ\n238001 0 ็ๆณๆไผธๅฐ็ต่้ฃๅคดๅคงๅดๅทด็ฉไปไธซ็\n238002 0 ๅจๆตๆฑ็ๅฎๆฑ็ฐๅไธ่กๅ่ไธญ่ทฏไบคๅๅฃไธๆผไบโ็ขฐ็ขฐ่ฝฆโๅคงๆ\n238003 0 ๆๅๅฑๅๅๆ้ ไบไธไธช้ฟ็ธ้
ทไผผๆฅๆฌ้ฆ็ธๅฎๅๆไธ็ๆบๅจไบบ\n238004 0 ้ช่ฑ็งๆป้ดๆฐดไนณๅฅ็ๅ ไธบ็็ๆ็ฏๅข็ๅฝฑๅไพฟๅบ็ฐ้ด่็ๆ
ๅต\n1000 Processed\n classify content\n238500 0 ๅคง็ๆณฐๅฝMistine็พฝ็ฟผ็ฒ้ฅผ\n238501 0 G2501ๅไบฌ็ปๅ้ซ้็ฑๅ
ญๅๅพๅไบฌๆนๅๅไบฌๅๆกฅๆฎตไป44K่ณ45K\n238502 0 ๅ
ถๅฎ่ฟๅฎถๅบๆฏlonelyplanetๆจ่็้ค้ฆ\n238503 0 ๅๅๅๅนด้้่ถ
็พๅฝๆๅฝๆ็ฌฌ1ๅคงๆฐ่ฝๆบ่ฝฆๅธๅบ\n238504 0 ๅฎ่ๅงๅงๅ ็น็้ฃๆบๅ\n1000 Processed\n classify content\n239000 0 9ๅ็ฏ็ฝชๅซ็ไบบ้ฝๅ ๆถๅซ่ฏ้ช็ฝช่ขซๅไบฌ้จ่ฑ่ญฆๆนไพๆณๅไบๆ็\n239001 0 ่ตๅพไธๅคๆฐๆไธๅคฉ80ๅ
ๅฐ100ๅ
่ฟๆฏๆไฟ่ฏ็\n239002 0 ๅฎ็ฃ็้ญ้ๅฑไบ่ดต้ๅฑๆ่ตๅคฑ่ดฅ\n239003 0 ๅฏน่ฟๆณไผไธไธ่ฝๆฏๅ
ปๅฟ
้กปไธฅๆฉ\n239004 0 ๆณฐๅทไธ็ฆ่น่ถ่นๅๅ ่ถ
้ๅ่ฝฝๅฏผ่ดไธๆญปไธ้ไผค\n1000 Processed\n classify content\n239500 0 ไฝ ็ๅญฉๅญไนๆๆบไผๅฆ่ง้ข้พๆฅ๏ผ\n239501 0 ไปgoogle็ๅซๆๅพ็ไธๆธ
ๆฐๅฏ่ง่ๅฎถ็ๅฐ้ขๅ่ๅญ\n239502 0 ๆถๅฐๅไธป๏ผTiphaine็ๆททๆญ้ฃๆ ผๆ็็นๅซ็ไธชๆง\n239503 0 ๅฆจ็ขๅ
ฌๅก+ไธ็ณปๅฎๅ
จๅธฆ+ๆฐไนฑๅ
ฌๅ
ฑๅฎๅ
จ\n239504 0 ๆๆบ็จไบไธๅนดๆฒก่ขซๆขไนๆฏไธ็ฎๅ\n1000 Processed\n classify content\n240000 0 ๆธๅฐไบๆฐ็MicrosoftEdge\n240001 0 ๅนถๅฐไบ8ๆ29ๆฅ่ตดๅไบฌๅๅ ๅ
จ็่ๅใ่ๅ็ฒพๅ่็ฎ็่ง้\n240002 0 V่ธ่ฏไปท็พๆผๅบไบๆดป็็็ๅผ ่ตท็ตๆฏ่ฎคๅฏ\n240003 0 โๅคฑ่โxไธชๅคๆ็้ๅฏ็ซ ็ญxไบบๅ ๆถๅซ้ๆณๅธๆถๅ
ฌไผๅญๆฌพ็ฝช่ขซๆ่ท\n240004 0 xใๆบไผไธๅฏ่ฝๆฐธ่ฟ้ชไผดไฝ ไธ่พๅญ\n1000 Processed\n classify content\n240500 0 ๆฌ้จไป่ฟไบ็ฝ้นญๅๅคบๅพ็ฉบๆๆณๆฎ็ซ ไธ\n240501 0 ๅ
ถไธญๅค้ชจ้ชผๆบๅจไบบๅทฒ็ปไธๅ
ซไธๅบทๅคไธญๅฟ่ฟ่กๅไฝ\n240502 1 ๅฐๆฌ็ๅฎถ้ฟ๏ผๆฐๅนดๅฅฝ๏ผ้ฟๅขจ้ฑผๆ่ฒๆฅๅญฃๆ่ชไน ่ฏพ็จๅฐไบxๆxๅท๏ผๆๅคฉ๏ผๆญฃๅผๅผ่ฏพ๏ผxๆxxๆฅๆฅๅๅไธช...\n240503 1 ไนๆฒป็ฝๅฐผไบ่ฎฉๅ็xxใๅฅ็ฝ้ฃxxใ้กๆฐธๅนณๆxxใ็ๆฃฎxxใ้ฟๅฐๅทดๅก็น่ฎฉๅนณๅxxใ้ฆๅผๆ็ธๆฟx...\n240504 0 ไธญๅ
ณๆ่ก้่ช2012ๅนดๅผๅฑ็ฝๆ ผๅ็คพไผ็ฎก็\n1000 Processed\n classify content\n241000 0 ๅฐๅบฆ้่ไธญๅฟ็้คๅ
ๅๅญๆฏไปฅ็บณ็ฒนๅพทๅฝ้ข่ข้ฟ้ๅคซยทๅธ็นๅๅ็บณ็ฒนๅ
ๅพฝๅฝๅ\n241001 0 ๆminiatureVersaillesไน็จฑgeHerrenchiemsee\n241002 0 ๅคฉๅบ่ฝฏไปถๅญC12็้ซๅฑ็ตๆขฏไธ็ญๆถ้ดๅไบ\n241003 0 ็็้ฃๆบๆฏๅฆไฝ่ฟ่กๅฎๆฃใ็ปดไฟฎ็\n241004 0 ๅทฒ็ป่ขซ70ๅคไธชๅฝ้
ๆ่ฃ
ๅ็ๆตๅถ\n1000 Processed\n classify content\n241500 0 ๅๆน่ขซๅนๅค58ๆฌก็ฏ่งๅไธๆฌ่ตๅญฃๅญฃๅ่ตไนๆ\n241501 0 ๅ้ๅคง่ก็ฑๆๅๅ
ฌๅญๅพๅฎฃๅ่กๆนๅ\n241502 0 ๅ ไธบไฝ ๅซ่ตตๆๆถๆไปฅ็ไธน้ณไธไผ็ฆปๅผไฝ \n241503 0 ๆฅ่ชFM797198ๅคฉๆตทๆ ่ด็ๅฌ้ณ็ญ\n241504 0 ็ ดwin10Aๅกๆไบไผไบบ๏ผ่พฃ้ธกAๅก้ฉฑๅจ\n1000 Processed\n classify content\n242000 0 ไฝ ไปๅฆ็จไธชๅไธบๅคงๅกๆไธๅฑไธ\n242001 0 ๅฅณไธปๆญๅฎๅไธพๆฅ่ขซๅฎๅๅ
ๅ
ป4ๅนดๅ้ญๆๅผ\n242002 0 ๅๆญฃ็ปๅฎไบQQๅนณๆถ้ฝๆฏ็จQQ็ป\n242003 0 ไปๅคฉๅจๅนฟไธ็ต็ฝๆปจๆตทๆฐๅบๅ็พๆ\n242004 0 ๅฐฑ่ฑกไธ้จ็็ต่งๅงๅ่
่ฟๅฑๅๆ
่่ถๆฅ่ถ่่ฏ็ฆปๅฅ\n1000 Processed\n classify content\n242500 0 ๅฎๅบ่
่ดฅใๆฐ้ดๆถๆงไบไปถใๅฎถๅบญ้นๅง้ขๅ\n242501 0 ็ขงๆฐดๆบ930ๆใๆบๅจไบบ563ๆใ่่ฒๅ
ๆ 1661ๆ\n242502 0 ไบบไฝๅนฒ็ป่ๆฏๅฝ้ๅป็ๅ็ไธ็็ๆๅท
ๅณฐ\n242503 1 ้ฒๅฐฑ่ก๏ผcx-xr็ฐ่ฝฆๅ
่ถณ๏ผ่ตๆบๆ้๏ผๆฌข่ฟๆข่ดญ\n242504 0 ไปๅฑฑๅบๆณ้ขๅคไปคๅนผๅฟๅญๅบไป็ปๆๅ
็ไธๅฎ็ๅ ็ญ่ดน\n1000 Processed\n classify content\n243000 0 ๅธฎไฝ ไนฐ็้พๆฅๅคๅถๅฐ๏ผ้ฟ้ๅฆๅฆๆทๅฎๅฎข\n243001 0 ๅคฉไบฎไบๆ่ฏฅ็ก่งไบๅฎๅคง่ๅท\n243002 0 7ๆ30ๅทๆ้นค็ๆฅๅฟซไนไบๅนดไบ\n243003 0 ไธ่ฟๅ้ฟ้ๅฝฑไธ็ๅไฝ่ฟๆฏๅผๅพๆๅพ
\n243004 0 ๅจๆฑ่ๅคชไปๅฟๆพๆ็ๅฎถ็ๅคง็ฒฎไป\n1000 Processed\n classify content\n243500 1 ๆๅ
ฌๅธไปฃๅผๅ็งๆญฃ่งๅ็ฅจ๏ผ่ฏทๅไฟ็๏ผไปฅๅคๆฅ็จ๏ผ\n243501 0 ่ฟๆฏๆจๆฅ14ๆถ่ณไปๆฅ21ๆถ็ๆๅ\n243502 1 ๅฐๆฌ็้กพๅฎข๏ผๆจๅฅฝ๏ผๅๅคง้้ฃๆนพ่ถ
ๅธๅจxๆxๆฅ่ณxๆxxๆฅๆๆตชๅฅๆด่กฃๆถฒๆดปๅจใ๏ผ็นไปทไบงๅๅฆไธ๏ผxx...\n243503 0 ๆๆๅ็งฐๅฅๆฏ่ฎพ่ฎก้ข้ๆ่ฝ่ฏด็\n243504 0 ็ช็ถๅ็ฐtaylor็ๅทฎไธๅค้ฝๆฏๆ ๆ้ณ่ดจ\n1000 Processed\n classify content\n244000 0 ๅๅผๅงไนฐไธๅฐๅฐ็ฑณๆๆบๆไนฐ่ๆณkxxx\n244001 0 ็ๅฐๆฅๆฌๆฟๅบๅฏนไบ็พๅฝ็็ๅฌ่กไธบๅชๆฏ่กจ็คบ้ๆพ่ไธๆขๆ่ฎฎ\n244002 0 xๅ
ใ้ฒๆๆๅฎถๅฑ่ท่ตxxxxxxx\n244003 0 10ๅคงNBA็ๆๆฐธๆ่ขซ้ป็น\n244004 0 GHairRecipe่่ๆฐดๆๆ ็ก
ๆฒน้
ๆนๆดๅๆฐด/ๆคๅ็ด \n1000 Processed\n classify content\n244500 0 ๅคงๅฎถไนๅ็ตๆขฏ็ๆถๅ่ฆๆณจๆๅฎๅ
จ\n244501 0 ๆญคๆฌกxxxๆกๅ่ฝฆ็ฎก็็คบ่ๅคง่ก้็นๆดๆฒปๅบๅไธป่ฆๅไธบโ็ฆๅไธฅ็ฎก่กโๅโๅ่ฝฆๅ
ฅไฝ่ง่่กโ\n244502 0 ๅ็่ฐฆๆฏ2008ๅนดๅๅคงๆฐ้ปไบบ็ฉ\n244503 0 ้ฆ้
ๆดพๅบๆๆฐ่ญฆๅฏน่พๅบๅฑ
ๆฐ็้พๆทๅฃ่ฟ่ก่ฐๆฅ\n244504 0 ๆ้ค๏ผๅ็ข็ฑณ้ฅญใ่ ่่ใไธ็ฒ็ปด็็ด ใไธ็ฒDHA\n1000 Processed\n classify content\n245000 0 ๅ ไนๅ็BใๅไธๆฟBใๅทฅไธxBใ็
ค็ญBๅบใ่ฏๅธB็บง\n245001 0 ๅฐ่ทๅพ็ฑๅพๅทๆ
้ญไฝๆฃไธญๅฟๆไพ็ใไปทๅผ500ๅ
็่ดตๅฎพไฝๆฃๅกไธๅผ \n245002 0 ๅ ๆฒนๅฎไน ็็ๅฐxx้ๆๆ็ฅ้้ไธปไปปๅ้ฃไธช่ฐๆฏไธๅฎถ\n245003 0 ่ฆ็ญๅไธชๅฐๆถๅ้ฃๆบๅๆฌก่ตท้ฃ\n245004 0 ๆ่ฏด่ฏๆๆญฃๅจๅผบ่ฃ
ไธๅบ่ฏด่ตฐๅฐฑ่ตฐ็ๆ
่ก้ฃๆบๅปถ่ฏฏไปจ็นๆ่กจ็คบๅพๅผๅฟ\n1000 Processed\n classify content\n245500 0 ๅ
ถๅฎ่ทฏไบบ่ดจ็ๅ่ฝฆๆไบบ่
ๆฏ่ไบ่
\n245501 0 ๆๆญฃๅจไธ็้จ็ๆ ้กไน่ฎจ็็ๆดป็ๆๅฉ\n245502 0 ็ฐๅจ็่ญฆๅฏ้ฝๆฏ่ฟไนไธไฝไธบ็ๅ\n245503 0 ๅฐฑๆณ็ฅ้้ฃๆบๆ็นไธ่ตท้ฃๅฐฑ่ฎฉๆ
ๅฎขไธ็ดๅๅจๆบ่ฑ้ไน\n245504 0 ่พ่ฝฌไปๅไบฌ้ฃๅฐๅไบฌๅไปๅไบฌ้ซ้ๅฐไธๆตท\n1000 Processed\n classify content\n246000 0 ๆๅฏๆถ็ๆฏ่ฟๅซๆไบxxxไธ้ฟ้็DNS\n246001 0 ๅฏๅฑ้ฟ็งๆฟๅญไธๅ
ซๅไธๅ่ดฟ็ฐ้ๅบไบๅฅ่ฏ\n246002 0 ้ๅฑฑๅๆตท่ญฆๅฏๅ
ๅๆณๆๆฅ้ๅจ่ฏฅ็ทๅญๆๆบไธญๅ็ฐๅคง้ๅนด่ฝปๅฅณๆง่บซ็ฉฟๆฏๅบๅฐผ็็
ง็\n246003 1 ไฝ ๅฅฝ ๆๆฏๅๆๅไฝ ่็ณป็้ผ้ผๆฅ็่ฃ
้ฅฐๅ
ฌๅธ็ๆๅ
ไผ๏ผๅ
ฌๅธ็ฐๆจๅบ๏ผ่ฟๅบๅฏไบจๆฐๅนดๅผ้จ็บข็คผๅ
xxxx...\n246004 0 ๆข็ถ้ฝๅทฒ็ปๆพๅไบๅฐฑๆๆกๅฅฝๆพๅๅชๅฑไบ่ชๅทฑ็ๆถ้ดไธๆณจไบไธไบๆๆไน็ไบไธขๆๆๆบ\n1000 Processed\n classify content\n246500 0 ๆๆฉๅ
จ็งฐไธบๆฏSpecialWeaponsAttackTeam\n246501 0 ๅฅฝๅคๅฎขไบบ้ฎ่ฟไธชpolaๆๅ
้ๅฎๅฅ่ฃ
\n246502 0 ๆ ้ขๅฎๅฎไธNHL้็งๆๅ็ฑไฝ่ฒ็ๅฎถๅบญๅ ๆฒน\n246503 0 ็ฐๅธธๅทๅฏ่ฑชๆฒๅฐๆฒ่ฟๅจ่ฝฟ่ทS60L่ฏ้ฉพ่ฝฆ้ๅฝนๅฆ\n246504 0 5ใ้ฅญๅไธๆฏ่่ๆฐดๆๅถ่้
ธ\n1000 Processed\n classify content\n247000 0 ไฝ็็ธๆปๆฏๆฏไฝ ๆณ็่ฆๆฎ้
ทๅพๅค\n247001 1 ๆฐๅนดๅฅฝ๏ผ้ปๆตฉ้ฃๅๅบ็ปๆจๆไธชๆๅนด๏ผๆ่ฐขๆจๅปๅนดๅฏนๆฌๅบ็ๆฏๆ๏ผไปๅณๆฅ่ตท๏ผๅจๆฌๅบ่ดญ็ฉๆปกxxๅ
๏ผๅณๅฏ้...\n247002 0 ๆพณๆดฒColoxyloraldrops็ผ่งฃๅฉดๅนผๅฟไพฟ็ง/ไธ็ซๅฃๆๆปดๅ30ml\n247003 0 ไฝ ้ซ้พโฆโฆblablablaโฆโฆๅฅฝๅคๅ ็ด ็\n247004 0 ๆ็
ค้ๅขๅคงๅๅฎๆฝโๆไฝๆ ่ใๆๅๅ
ป่โไธบ่ๆๅกๆฐๆ็ฅ\n1000 Processed\n classify content\n247500 1 ไฝ ๅฅฝ๏ผๆไปฌๅฟๅฎขๅบ็่ฑ็็ฐๅจๅๆดปๅจ๏ผๅๅบฆ่ฟๆฏ่ฎๅคง็๏ผๅ่ฃ
ๆฅๆฌ่ฑ็็บธๅฐฟ่ฃค้กบไธฐ็ด้๏ผไธๅ
ซ็นๆ Lxx...\n247501 0 ็ฑ้กๅฑฑๆไฝๅฑๅ้กๅฑฑ็ฐไปฃๅไธๅ่งๅญ่ๅไธพๅ็้กๅฑฑๅบ2015ๅนดๅฐๅฟๆธธๆณณ้่ฏท่ตๅจๅๅๅญๆธธๆณณ้ฆๅๆปก่ฝๅน\n247502 1 ๅนฟ็ปฟ็ฏไฟๅจ่ๅฎๅฟ่ๆฐด่ทฏไธ๏ผ็ฐๆไธๅกๅ(xxๅนด่ช่พพxxxxxxๅ
+)๏ผ็ฝ็ป่ฅ้ๅ(xxๅนด่ช่พพx...\n247503 0 ๅชๆ่ฑๅ้ชจๅฎ็ปๆๅๆญ่กๅธไนๅฐฑๆขๅคๆญฃๅธธไบ\n247504 0 ๅไบซๅฝญๅๅผบไธ่ฏๅธ็ฉ็ๅญฆ็ๅๆๅพ็๏ผ20150807ๅฝๅ่กๅธๅๅคงๅจๆK็บฟๆๆฏๅๆ\n1000 Processed\n classify content\n248000 0 ๆดไธ็ธไฟก่ฟๆฏๅจไธบTesiroๆๅนฟๅ\n248001 0 ไธบไปไน6p็ณป็ปๅ็บงๆb243ๅ\n248002 0 PS๏ผๆๅฏๆฒกๆๅจไบบๆฐๅนฟๅบๅ็ธ้ธก\n248003 0 ไบบ้ฃๅบๅป็ ธไธญไบไธ่พ่ญฆ่ฝฆๆไบ\n248004 0 ่ฎพ่ฎกๅธๆฒกhowilldisappear\n1000 Processed\n classify content\n248500 0 ็ไธญๅฝๅฅฝๅฃฐ้ณๆ่ฎฉไบบๆๅจ็ไธๆฏๅฑ็ๆๅคๅจไบบ\n248501 0 ๅๅ
้ฝ้ธ่กๅคด่ฟไบไธไนฐไธๅฐๆบ็ฅจ้ญ่ดจ็\n248502 0 ๆไธๅๅๅ
ถๅ็ดขxๅ้ฑ็่ญฆๅฏๅ่ญฆๅฏๅ
ๅฏๅ
้ฟๆไธไบๅฐ\n248503 0 ๅฎ้
ๆ็ป่พพๅฐ21ๅบฆ็็ซๅผ็ฉบ่ฐๅฏนๅฒ\n248504 0 ็ ็ฉถๅ็ฐๆฏๅคฉๅ3ๆฏๆๆดๅคๅๅกๅฏไฝฟ2ๅ็ณๅฐฟ็
้ฃ้ฉ้ไฝ37%ไปฅไธ\n1000 Processed\n classify content\n249000 0 ็ปๆไป้ฉฌไธๆๅผ็ฎกๅฎถ็็ต่่ฏๆ\n249001 0 ใJongSukใ150725ๅพ็โLOCK&\n249002 0 ๅๅฎไธญๅ12ๆ่ตท้ฃๅๅพ็พๅ่ๅ ๅฅ\n249003 0 ๆฏไธๆฏไฝ ๅฏน\"ๆๅ\"่ฟไธช่ฏ็ๅฎไฝๆ้ฎ้ขๅข\n249004 0 xๆฑ็๏ฝxxxๅฐๆ๏ฝmoto็ญ่ฃค๏ฝๅชๆxxx\n1000 Processed\n classify content\n249500 1 ๆฐๅนดๅฅฝ๏ผๆฌๅบไปๅคฉๆญฃๅฅฝๅผ้จ๏ผๆฌๅบไปฅๅท้ธ้จใๆ้ฃ้จใ้ๅ้ๅๆ้จใ็ฝๅ้จใไผธ็ผฉ้จใ่ฝฆๅบ้จ็ญ็ญๆนๅ...\n249501 0 TFBOYSๆญๅๅ
ฉ้ฑๅนดๅฟซๆจๆฅไธไพ้ๆๆฏๅๅๅนดไฝ ๅ่ฆๅ ๆฒน\n249502 0 ๆ่ฟไฝ ็่ฑๅ้ชจๅๅ ๆฒนๅฎไน ็ไบๅ\n249503 0 ๅฟ
้กป้่ฟๅพฎ่ฝฏๆต่งๅจๆ่ฝๆๅผ็ฝ้กตๅพ็\n249504 0 ไธๆตทๅธ้ต่กๅบไบบๆฐๆณ้ขๆฐไบ่ฃๅฎไนฆ\n1000 Processed\n classify content\n250000 0 ๅฐๆทๅ่ฎพ่ฎกไน่ฝ้่ฟ่ฝฏๆจ่ๆฏๅขใๅฐๆฏฏๆ้ ๅซๅข
็่ฑชๅๆ\n250001 0 ๅ้บ่่ไบไธๆไธๆไธ็้ฃๆบๅๅฎถๆฌง่ถ็ไนฆไนฆๅปๅฏ๏ฝ\n250002 0 Googleๆฏไธชๆๆฅ่ฏข้1000ไบฟๆฌก\n250003 0 ็ถๅๅป็ๅ่ฏด่ฟ่ฆๆไธค้ขๅฐฝๅคด็\n250004 1 ๅฃไฝๆตท้ญๆฅๅ็ฅๆจ๏ผไธๅ
ซๅฆๅฅณ่ๅฟซไน๏ผ็ฐๆดปๅจๅฆไธ๏ผไผ่ด่ฏ็ฌฌไธๆฌพๆญฃไปท๏ผ็ฌฌไบๆฌพx.xๆ๏ผ่ช็ถๅ ๆ็งๆ...\n1000 Processed\n classify content\n250500 0 ไปๆๆ่ฑๅ้ชจ็ไปๆๆ่ฑๅ้ชจ็ไปๆๆ่ฑๅ้ชจ็\n250501 0 ๅ็ณป็ๆ่ขซๅซ็ฏๅฎถๅฑ็จxxไธๆถไนฐ\n250502 0 G15wๅธธๅฐ้ซ้็ฑๆญๅทๅพๅธธ็ๆนๅ68K้่ฟไบๆ
็ฐๅบๅค็็ปๆ\n250503 0 x็บงๅฐ้ๅฝๅฐ็ฝๆฐ่ฐไพ็งฐโๅฎถๅธธไพฟ้ฅญโ\n250504 0 ๆช6000ๅคๆฏโโ่ๆณๆไบบ้ ่ฐฃ่ฏด๏ผไธญๅฝ8ๅนดๆๆ\n1000 Processed\n classify content\n251000 0 ้ฟ้ๅปๅๆ
่ก่้บปไฟก็จ็งฏๅๆฐๅ ๅก็ญพ่ฏๆ็ญพ\n251001 0 ไฝฟ็จๆๆบ็ฝ็ป่ฟๆฅๆฏWiFiๆดๅฎๅ
จ\n251002 0 ๅๅคMKLC83้ขๅ80็พๅ
ๅทๅก้\n251003 0 ไธๅฎกๆณ้ขไปฅ้ๆณๅธๆถๅ
ฌไผๅญๆฌพ็ฝช\n251004 0 ๆTไน่ฆๅๅๅฐ่ฟไธช็ๅญๅบ้ฃๆบๆฏๆน\n1000 Processed\n classify content\n251500 0 ็ฐไปฃไธญๅผ่ฃ
ไฟฎๆกไพโโ่ฅ้ ๆธฉ้ฆจ็ๅฎถ็ฐไปฃไธญๅผ่ฃ
ไฟฎ้ฃๆ ผๆกไพ่ๅไบๅบ้ไธไผ้
ๅ้ๆฐ่ดจ\n251501 0 โๅทดๅ้ฉฌไธๆคๅฃซ้ฒจๆธธๆณณโๆ็็งๆไฝๅ\n251502 0 ไป็ป่ๅฌๅฐ่ดฉไธ้ๅไธ้็ๅๅ\n251503 0 ๅๅป็็พๅบฆ็ฅ้ๅๅ็็ฉไปทๅทฒ็ป้ฟๅฐไบๅคฉไปท\n251504 1 ๅฐๆฌ็็จๆทๆฐๅนดๅฅฝ๏ผไฟก้ณ็งปๅจๅ
ๅฎตไฝณ่ๆ้ดxๆxๆฅ-xxๆฅ๏ผ้ๅฏนๆจ่ฟๆ ท็xGๆๆบๅฎขๆท๏ผๆจๅบ็ผดxx...\n1000 Processed\n classify content\n252000 0 ๆฅๆwikiๆจ่็้
็ฝฎๆฏใ้ๅทก+้ฃๆบๆฌงๆ น็ตๆข\n252001 0 51ๅฒๅจๆกไบไผธๅบๆดๆๆฅๅฐไบบๆฐ่ทฏ็็ฎ่กๅฑ็ฎ่ก\n252002 0 ๆ็พๅทฒ็ปไธ็พๅบฆๅไฝๆจๅบไบโๆ็พ็พๅบฆๅกโ\n252003 0 ๆ็ๆๆไธๆ็็ธ็ไบบ้ๆ่ฟไธชๅฝ่่ๅไบๅ\n252004 0 ๆฌ่ฝฎๅฏนๆฑ่่ๅคฉ็ๅๆถๆ้ฒๅฎๅดๆฆๅ่ฟไธๅน
ไธๅพไบ็ๆ ทๅญ\n1000 Processed\n classify content\n252500 1 xๆxๆฅๆฟ่ถ
็ฝๅฎถ่ฃ
ๅข่ดญxxxๅฐ้ซxๅบๆฐด่ฑๆด็ญNๅค่ถ
ไฝไปทไบงๅxๆ่ตท๏ผไธ้้ๆข ่ดญ๏ผๆฅๅฐฑ้ๆดไธฝ้
...\n252501 0 ็ปๆ่ฟๆฌกๅ็จ่ช็ญๅๅคฉๅ
ๆจ่ฟไบไธๆฌก\n252502 0 ่ขซๅ่ฅๅธๅบ้ณๅบๆณ้ขไธๅฎกๅคๅคๆๆๅพๅไธๅนด\n252503 0 ๅฏๆ็ๆฏ็ฒ้ชจๆ้ๆๅฑๅ
็ฑไบ็ญนๅคไธ่ถณๆฒกๆๅผ\n252504 0 150710ASBeetwitterๆฐๆดปๅจๆตทๆฅ\n1000 Processed\n classify content\n253000 0 ๅฏนๅฝๅ
่ญฆๅฏๅ็ฎก่ดช่ตๆๆณๆฌบๅ่็พๅงๅด่ง่ไธ่ง\n253001 0 ๆ็จbabyskinๅฐฑไธไผ่ฟๆ\n253002 0 ๅพฎไฟก็ต่็็จๆท็ฐๅทฒๅฏไธ่ฝฝๆดๆฐ\n253003 0 ๅๆฌขๅบ็ง่ฝฆ้ๆฒๅ็radio\n253004 0 ๅจP2P็ฝ่ดทๅนณๅฐๆไพๆ
ไฟไธๅกๅนถๆถๅไธๅฎ็ๆ
ไฟ่ดนใๆๅก่ดน็ญ\n1000 Processed\n classify content\n253500 0 ็ฅ้็ฝๅญ็ปไธบไปไนๅฎ่ด่ฑๅ้ชจ\n253501 0 4็ทไบบๅฏนไบ่กฅ่พใ่่ช่5่ดซ่กไบบ็พค้ฟ่ถๅๆฒปๅ็ง่ก็6ไบๅฅๅบทไบบ็พค้ฟ่ถ่ฝๆๆๆๅๆบไฝ็ฝ็ป่่ฟไธชๅ
็ซ...\n253502 0 ่กจ็ฎๅฑ็็ป่้ด่ดจๅ็็ฎๅฑ็ๅผนๅๆฐดๅ่ถๅซ้่ถๅ
่ถณ\n253503 0 BBๅฎถ็ญๅไบงๅๅฌๅคฉๆๅฅฝ็จ็่บซไฝไนณ\n253504 0 ๅฐๆไธบไธญๅฝ็ฌฌไธๅฎถไธๅธ้่ไบ่็ฝๅ็ดๆ็ดขๅผๆ\n1000 Processed\n classify content\n254000 0 ไฝ ็ฐๅจๅฑ
็ถ่ฟๅๅคงๅๆ่ฑๅ้ชจ้ฝ็ไบ\n254001 0 ๆๅทฒ็ป้ขๆB็ซๅฐๅญฆ็่ฆๆ่ฟ้จ้ชๆ็ไบ\n254002 0 ๅ็ฉ้ฆๅปบ็ญ้ข็งฏ2500ๅนณๆน็ฑณ\n254003 0 ๅงๆ
้่ฝฌโๅตๅฐธ่โๆไธบๅๆฐ้ปโโ้ฃ้ๆฏ็็ธ\n254004 0 ๆฅ่ชFMxxxxxxx็ๆฐดๆ ้ฆโ่พ\n1000 Processed\n classify content\n254500 0 ๅ
ถไธญๅ
ฌๅ
ฑ่ดขๆฟ้ข็ฎๆถๅ
ฅxxxxxxไธๅ
\n254501 0 ๆฟๆๆ็ฉๆฏๆๆๅ็พๆปกๅนธ็ฆๅ็ฅฅ\n254502 0 ๅฐ่ทๅ
ฐไบๅไบ9ไธชๅฐๆถ้ฃๆบๅพ
ไผ่ฟ่ฆ่ฝฌๆบ็ฐๅจๆฏ่ทๅ
ฐๆถ้ด15\n254503 0 ็บฆ20ๅ้ๅๅไธ้ขๅทพๅๆ๏ผๅฏไปฅๆๆๆนๅ่ธ้จ่ฒๆ\n254504 0 ไฝ ไปฌ็APP้ข่ฎก้่พพๆถ้ด่ฝ้ ่ฐฑ็นไน\n1000 Processed\n classify content\n255000 0 ไธ้ฃๆบๆๅฐฑๅธฆไฝ ไปฌๅปๅ้ๅบ็ฌฌไธไธฒไธฒ\n255001 0 ่ณๆญคๆฒฟ็บฟ14ๅบงๆฐๅปบ็ซ่ฝฆ็ซไธปไฝๅ
จ้จๅปบๆ\n255002 0 ๅฅฝๅฅฝไฟๆค่ชๅทฑ0็ฑไฝ ็An\n255003 0 ้ฏ้ธไธญ็บงไบบๆฐๆณ้ขๅฎฃๅคไธ่ตท็ณปๅ็็ชใๆขๅซ็พๅชๆก\n255004 0 Dxxไปๅคฉๅปๆไน่็ซ่็ฌฌไบ้\n1000 Processed\n classify content\n255500 0 ๆ็ต่้ฝๅ
ณไบ่ฟๆณๆฅ็ๅถๅๆฅไบ\n255501 0 1็จ360ๆดๆฐwin10ไธๆฏๅ็ๅๆขๅคๅบๅ\n255502 0 ๅณๆฑๅบๆฟๅบๅทฒไผๅ็ธๅ
ณ้จ้จ็ ็ฉถ็ ๅๆดๆฒป้ฎ้ข\n255503 0 ๅไธบไผ ่พ่ฎพๅคๅธธ่งๅ่ญฆๅซไนๅๅค็ๆนๆณ\n255504 0 ๆญฃๅผๅฝๅไธบMOUNTPAVILIA\n1000 Processed\n classify content\n256000 0 ๆ็ ดไบไปๆบ่ฝๆๆบ่งๅบฆๅบๅ็ไผ ็ป่็ตๆ่ทฏ\n256001 0 \\nๅฐ็ข่ฑ็่ฃๅญไธๅพไธๆๆฉไธๅฐไบๅ้็ชไธญๆฏๆๅบๆฅ\n256002 1 ใๆธฉ้ฆจๆ็คบใๆๅธๆ่ฒๅๆๆ กๅบๅฐไบxๆx๏ผxๆฅไธพ่กไธญ่็ถๅ
็ญ็คบ่่ฏๅฌ่ฏพ๏ผๅๅญฆๅฏไปปๆ้ๆฉๆฒกๆๆฅ่ฏป...\n256003 0 ๆๆฒก็ปๅฉ้พ้ๅไบฌไธ็ฝๆฅไธๅบๆฅๅ\n256004 0 ๅไบฌๆฐดๆธธๅๅฏน้ข็ไธๅฎถ้้ต้ธญ่ก็ฒไธๆฑคๅไธๅซๅป\n1000 Processed\n classify content\n256500 0 ไฟๅฎๆๅฐๅท่ฟๆฏๆญฃไน่กไธบ้ญๅฐๆฅๆ\n256501 0 ไปๅคฉๅจๆตๆฑๅคงๅญฆไธบไผไธๅฎถๅไบซ็งปๅจไบ่็ฝๆถไปฃ็ๅไธๅ้ฉๅๅๆไผ ๆญ็ฎก็\n256502 0 ๆธธๆไธญ็โ็ฏ็ๅฝฑไธUniversalโๆฏซไธไธบ่ฟ\n256503 0 ไปๅนดๅพไบๅๆไธๆฎตๅ่ฃๅค่ฏ็่่ฏ่ดน่ตๅบๆฅ\n256504 1 ๅนณๅฎ้ถ่กๅ่ฅๅ
ฅ้ฉปโฆๅนณๅฎๆ่๏ผไฝ ๆณๅ ๅ
ฅๅ๏ผไธ๏ผ ๆๅก้กน็ฎ๏ผx๏ผไปฃ่กจๅ
ฌๅธไธบๅนณๅฎ่ๅฎขๆทๅ็็...\n1000 Processed\n classify content\n257000 0 5๏ผไธๅธธ่ง็่็ฝ่ดจๆฐจๅบ้
ธๆฏๅจ่็ฝ่ดจๅๆๅไฟฎ้ฅฐๅฝขๆ็\n257001 0 ไธบไปไนๅป็ไฟ้ฉ่ถๆฅ่ถ้ซๅปๅนดไธไบบ60็ฐๅจไธไบบ90ๆๅฎถๆ18ๅฃไบบๅทฎไธๅค่ฆ2000ๅ
้ฃ้ๆฅ้ฑไบค\n257002 0 intelligent่ฟๆฏsmart\n257003 0 ่ฟๅจๅ
ฌไบคไธ่ขซๅฐๅทๆๆๆๆบๅท่ตฐไบ\n257004 0 ไฝๆฏไบๅฎ็็็ธๆฏๅ่ดข็ๆบไผๆปๆฏ็ป้ฃไบไธ็ฅ้ๆดๅคๅฐฑ่ฝๆๆบ็่กๅจ็ไบบ\n1000 Processed\n classify content\n257500 0 ๅบๅฎๆณ้ข็งฏๆๅผๅฑไบๆฐๅไบๅฎกๅคๅบญๅฎก่งๆฉๆดปๅจ\n257501 0 ๆฅ็ง็งไฝ ็ๆๅบง้ๅไนฐ่ก็ฅจๅ\n257502 0 ไบๅๅ็ฎกๆคๅฝๆงๆณไธญ้ๅฏน็ฅฅไบ็พ้ฃๅๅจ่พนๅบๅค็ป่ฅ\n257503 0 ไนไปฅๆ้ซๆธฉ23โ็จณๅฑ
ๅ
จๅธๆๅ็ฝ็ฌฌไธๅ\n257504 0 ่ฟ็ณป็ปไนๅๆๆusb่ฎพๅคๆ ๆณไฝฟ็จ\n1000 Processed\n classify content\n258000 0 ๅ็บงwin10ๅ็ๅ ไธช้ฎ้ข๏ผไบฎๅบฆ่ฐ่ๅคฑๆ\n258001 0 ๅฝผๅพยท่ๅฐ่ฎคไธบไธ่ฎบๆฏๆ่ต่ฟๆฏ็ๆดป้ฝ้ตๅพชๅฅๆฌกๆณๅ\n258002 0 ๅๅฐๅไบฌๆ่งๆดไธชไบบๅๆดปไบ่ฟๆฅ\n258003 0 ๆๆถ่ฟไผๅฐๆๆๅคๆโฆโฆโฆโฆไธๆฏ้ฝไธๅผ\n258004 0 ๅนธๅฅฝๆๆบๅจไบบ็ปๆๅฝ็ฟป่ฏๅๅฝๆ็ไฟๅง\n1000 Processed\n classify content\n258500 0 ไบคๆฟๆถ็ฒพ่ฃ
ไฟฎๆช่พพ้ ไปท็ๆ ๅ\n258501 0 ไธญ้ฟๆๆ่ต่
ๅทฒ็ปๅผๅง็ฆปๅบไบ\n258502 0 ่ฎพ่ฎกไธไธช็จๆฅไบซๅ็ๅซๆตด้ดๆฏไธชๅพ่ต็้ๆฉ\n258503 0 ไปฅๅ่ดฟ็ฝชใ็บตๅฎน้ป็คพไผๆง่ดจ็ป็ป็ฝช็ญ\n258504 0 ๅคช็ฉบๆฃๅฎๅ
ฐๆ่ฒๅ
ฐ็ซๆก่ฎพ่ฎก่ง่งๆตๆ่ถ
่ต\n1000 Processed\n classify content\n259000 1 โๅ่ดตโไปๆไธคๅคฉๅค้
xxx/ไบบๆ ้็
้ฅฎ๏ผๆฌๅ
ฌๅธๆฐๆไธๆนxxๅ็็พๅฅณ๏ผไธชไธชๆงๆ้ฃ้ช่ฟๆๆดๅคๆๅ...\n259001 0 ็งฏๆๅฏนๆฅไบฌไธใ้ฟ้็ญ็ฅๅ็ตๅไผไธ\n259002 0 2ใๅบ่ฏฅๅฏไปฅ่ฏดๆฏโ่ฟๆณ่ฟ่งไบบๅโ\n259003 0 ้ฃๆๅไธๅฆๆๆๆๆบๅกๆ่ฑๅฌๆญ็็ไนฆๅๅๅ\n259004 0 ๅๅ ๆฒน็ซ็ฆๆญขๅฏนๆ ็ๆ ่ฏ็ๅ็จไธ่ฝฎ่ฝฆใๆฉๆ่ฝฆๅ ๆฒน\n1000 Processed\n classify content\n259500 0 ไบค้ๆๅผ๏ผๆญไนๅฐ้ไบๅท็บฟ่ณ็ๅพท็ซDๅบๅฃไธ\n259501 0 ๆๅๅจ่ก้ๅฃๅผ็ๅบiphoenๆๆบ\n259502 0 ็็ๅพ้ฌผๅไธญๅฝๅฅฝๅฃฐ้ณ็ฌฌไบ้ไธญๆจๅฎๅฟไปๅฑๆญๅฐ็น่ฏ้ฃ่ฑ็ๅคดๅไธไผ็ดไธไผๅท\n259503 0 ็ดขๅฐผPS4็้้ไธ็ดๆฏๅพฎ่ฝฏXboxOneๅคง\n259504 0 ็ฆ็ปตๅบไบบๆฐๆณ้ขไปฅไฟก็จๅก่ฏ้ช็ฝชไธๅฎกๅคๅคๆขๆๆๆๆๅพๅไบๅนด\n1000 Processed\n classify content\n260000 0 ๆฏๅคฉ้ฝๅจ่ดจ็ไบบ็ไธญๅบฆ่ฟ??\n260001 1 ๅฐ้็ฅๆจๅ
ๅฎต่ๅฟซไน๏ผ่ๅๆ้่ฆๅไฟก็จๅก๏ผ่ดทๆฌพ็ๅฏไปฅ็ดๆฅ่็ณปๆใ\n260002 1 ๆจๅฅฝ๏ผๆๆฏ้ข้ณ่ทฏ๏ผๅๅก็็น๏ผไธญๅ้
ๅฃซๅฉๅฏผ่ดญ๏ผ็ฐ้
ๅฃซๅฉๅฅถ็ฒๅ
จๅบxๆๆถ้ดไธบx๏ฝxๆฅ๏ผๆฌข่ฟๆจๆฅ้่ดญ...\n260003 0 ไธ็ญ็ๅฐๆน็ตๆขฏๅไบ๏ฝ๏ฝ๏ฝๅไบโฆโฆโฆๆฌๅฎๅฎไธๅฃๆฐ็ฌไบๅๅ ๆฅผ\n260004 0 ๅคธไบๅบ็ง่ฝฆๅธๅ
ไธๅฅๆๆฏๅฅฝ็ปๆๅฐฑๆๆฌขไบ้ฉฌ่ทฏๆฏไฝ ๅฎถๅ\n1000 Processed\n classify content\n260500 0 ๅฆๅฆ่ฆ็ปไปๆๅๅฉ็คผ่ฃ
ไฟฎๆฟๅฑ\n260501 0 ๅชๅๆ้ ๆตๆฑ็็ฌฌไธ็งๆๆฐ้ป้จๆท\n260502 0 ้ๅข็ๅๅฑ้ๅบฆๅๅณไบไบบๆๆขฏ้็ๅปบ่ฎพๆ
ๅต๏น\n260503 0 ้ๅฎๅบๆ
ๆธธๅฑ็ธๅ
ณ่ด่ดฃไบบ้ชๅ\n260504 0 ไฝ21ๅฒ็็ๆฅๆฟๆ็ฐๅจ่ฎธไธๅบ่ฏฅไธ็ฎๅคชๆ\n1000 Processed\n classify content\n261000 0 ๆ็ฅ้้้ฃXxๅค่งๅฝๅ
ไธๅฉ็็ณ่ฏท่ฆๆฉไบ่ทฏ่ๆๅ
\n261001 0 ่ฟ่ฎฉ็พๅบฆ็ๅๅฑไธๅๅๅถไบๆไธช้ซ็ฎก็ๅป็\n261002 0 ่ฑกๅพTIFFANY็ปๅ
ธๅ
็ด ็ๆๆ\n261003 0 ่ฟๆฌกๆตๅฏiSuperS3็่ถ
็ช่พน่ฎพ่ฎกๆก่พพๅฐไบ็2\n261004 0 Burt'sBeesMamaBeeBellyButterๅฐ่่ๅซ็ปดE้ฒๅฆๅจ ็บน้\n1000 Processed\n classify content\n261500 0 ไนๅๆปก่ธ็็ๅ็ฒๅบ็ๆๅฐฑๆฏไธไธชๆ่ฎญ่ฟๅฅฝ็ฐๅจๅผฅ่กฅๅๆฅไบ\n261501 0 ็ถๅ็ๅฐๆ้ฃๆฅผ็40ๅคๆก่ฏไปท37ใ8ๆก้ฝๆฏๅทฎ่ฏ\n261502 0 ่ฏๅธ็ฏ็ฝชๅชๆๅป๏ผxๅ
ๅนไบคๆ\n261503 0 ้ฃไธไผ้ชๆปดๆปดๆนฟๅๅ็็็ๆจไธๅฟ็น\n261504 0 ๆนไธชๆๆบไธ็ๆธฃๆธฃๆๆปๆๅฐๆ็
ง่ชๅจไบบๅทฅ็งป่ฝดโฆ\n1000 Processed\n classify content\n262000 0 ่ฎค่ฏไฟกๆฏไธบโๅ้้ๅ
ฐๅ่ดธๆ้ๅ
ฌๅธ็ฝ็ปๆๆฏ่ด่ดฃไบบโ\n262001 0 ็็ๅจ่
พ่ฎฏ่ง้ข้ชๆๅบฆ่ฟไบ1ๅฐๆถ\n262002 0 ไปๅนดๅไบฌๅนณๅฎๅคๅๅฃ่ฏๅค้ฝๆฏโๅฐๅปๅคโ\n262003 0 ้ฝ้ฝๅๅฐ้่ทฏ่ฟ่พๆณ้ขๅ
จไฝๅนฒ่ญฆ็ป่ฟ1ไธชๅคๅฐๆถ็่ฝฆ็จๆฅๅฐ็็บง็ฑๅฝไธปไนๆ่ฒๅบๅฐโโๆฑๆกฅๆๆ็บชๅฟต้ฆ\n262004 0 ไพ็ถๆฏๅ
ธๅ็่ๅทๅญๆ้ฃๆ ผ\n1000 Processed\n classify content\n262500 0 ๆฅๅๆ ๅบๆญปๅ็ฏๅฏนๆญปไบกโ
โ
โ
ๆดๅค่ฏฆๆ
๏ผ\n262501 0 ๆญๅทไธ็ตๆขฏๅโๅไบบโ๏ผไธๅไฝๅจ16ๆฅผ็ๅฅณๅญ\n262502 1 ็ฐๆๅ ๅฅ้ปๆฒณๆบๅฝ้
ๅ็ๆฟๅญ๏ผ็ฐๅทฒๅข่ดญไปทๅฏนๅคๅบๅฎ๏ผ้ข็งฏxxx-xxx-xxx-xxx็ญไธๅ็ๆท...\n262503 0 ใๅธๅ๏ผ็็ฉๅป่ฏ้่กๅธไธ็ปฉๆ้ฟ็กฎๅฎx่กๆๆณขๆพใ\n262504 0 Mulancyๅๆฐธไน
ๅๅฆ็ฌฌๅ
ญๆๅ
จ็ง็ญ็ๅญฆๅไปฌๅฟซไน่้ค\n1000 Processed\n classify content\n263000 0 ็ๅฟๅบไผๅทฒๆถๅฐ็ฑๅฟๆ706540ๅ
\n263001 0 ไนๅไปๅคฉไธๆ ท้ซๆธฉxxๅบฆๆไปฌ่ๆฑๆฝๅทฅ้ไผไธๅx็นๅผๅทฅๅฐๆx็นxx\n263002 0 ๅไธๆฟ้ๅxxxx็นๅคงๆถจ้พx%\n263003 0 12ๆฅ10086ๅๅค่ฏดๅๅฐๆพ็คบๆๅไบ\n263004 0 ๅไบซไธไธชๅๆ๏ผ็ฉๅฆๅ
ไธ็ๆ้ผ ๆ ็ฉๅไบ\n1000 Processed\n classify content\n263500 0 ็ฌฌไบๅๅฒไปฅๅ็ๅฉดๅฟไธๅฎๅ้ธก่ๆธ
\n263501 0 ๆฌๆก่ฃๅคๆไนฆ้ๅๅฐๅจไธญๅฝๆณ้ข่ฃๅคๆไนฆๅฎๆน็ฝ็ซๅ
ฌๅธ\n263502 0 ๅไบฌๅฐ้S1ๅท็บฟไบๆใS7ๅท็บฟ้ข่ฎก2017ๅนดๅปบๆ\n263503 0 ๅจๆฐไธๆฟไธๅธ็ๅธๅๅ
ฑๆ4ๅฎถ\n263504 0 ้ฃไบๅ
็ฝฎ้็ต็ๆๆบใpsp็ญ่ฎพๅคไน่ฝ้่ฟๅฎ็ดๆฅไพ็ตไบ\n1000 Processed\n classify content\n264000 0 ๅไบฌ่็ซฅๅ
ปๆฏๆๅพ็ด๏ผไปๆฒกๆณ็ๆญฃไผคๅฎณ่ฟๅญฉๅญ\n264001 0 ๅฐๆ็ซไธๆฏ็ฑ120ๅไบค่ญฆใ็น่ญฆ็ญไบบๅ็ปๆ็็นๅซๆงๆณ้ไผ\n264002 0 ็ฝๅๆฏ่ฟไธช่ฏดๆฏๆตๆฑ้ถ่กๅกๅทๆฏ่ฟไธช\n264003 0 ไฝไบๆตๆฑๅคฉๅฐๅนณๆกฅ็ๅฐ็ๅงไธฐๆบช็\n264004 0 ็ๆ่ฑๆฆๆๅณๆฏ่ฑๆฏๅคฉไปท่ตๅฟ่ขซๅ
็ฝๅ่ดจ็็ไฝ\n1000 Processed\n classify content\n264500 0 ไธญๅคฎ็ต่ง12ๅฐ็คพไผไธๆณ่็ฎไธญ็ๅฅณๅฉฟ็ถๅๅฒณๆฏๅๆฟไบงๅ
ฌๅธ็ๅค็ฝๆฏไธๆญฃ็กฎ็\n264501 0 ๅ่
ๆไบ่ๅฉๅฒๆๅคด่่ๅฟไบๆ้ซๆฐ็\n264502 0 ๆณๅป่ฟๅกไฝๆdrsebagh็VC็ฒไนฐไบ\n264503 0 ๆ่ฐโๆฒก่ฏๆฎโไธ่ฟๆฏไธไฝไธบ็ๆ่ฏ็ฝขไบ\n264504 0 ็กฎๅฎ็่งๅ็ฎกๆฟ็งค็ ฃๅปไธญ้ๆญฃๅ ่ธ้จ\n1000 Processed\n classify content\n265000 0 ็ช้ผป่ดดไธป่ฆๆฏไปฅๅป้ปๅคด็ฒๅบไธบไธป\n265001 0 ๅ็็พๅณxๅฐๆถ่ฝ่ทxxxxๅ
ฌ้ไบบ่ฟๆฏ้ฃไธชไบบ\n265002 0 ๅฐฑๅ่ฅฟๅฎ่ญฆๅฏๆๆฒณๅ่ญฆๅฏไธๆ ท\n265003 0 5ๆไปฝๅไนฐ็ๅ
ซๅๅ้ฑ็้
้ฉฌๅๆฉๆ่ฝฆๆฒกไบ\n265004 0 ๆถ่ดน่
ๆ็ไฟๆคๆณ็ญๆณๅพ็ปดๆค่ชๅทฑ็ๅๆณๆ็\n1000 Processed\n classify content\n265500 0 ไปฅๆญคไผๅๅ
ฌๅธ็ๆ่ต็ปๆใๆๅๅ
ฌๅธ็ปผๅ็ซไบๅ\n265501 0 ๅป็ๅๅจ็
ๅไธ็ไธ่ฅฟๅซๅญไน\n265502 0 ็ทๅฅณ้ฝ่ฆๆค่ไปฅ่กไธบๆฌไปฅ่ไธบๅคฉ่ๆๅคๆธ
่ๆๅคๆธ
\n265503 0 ไธๆณๅๅญๅฅ็จๆนๆฝญๅธๆฃๅฏ้ขๅ่ดชๅฑ็ต่ฏไผๅพ่ฏ้ช\n265504 1 ๅนฟไธๆธ
่ฟ้พๆนๅฅ็ณๆๅไบงไธๅญๅฐไบxxxxๅนดxๆxๆฅ--xๆxxๆฅไธพๅๅฅ็ณๅ่งไผ๏ผ็ฐๆๅฐ้ๆทๅคๅฑ...\n1000 Processed\n classify content\n266000 0 ๆ่ทๅบๅฎถ็บ ็บทๅฐฑๆฏๆๆๅบ่ฏๆฎไนๅๅบๅฎถๅฑ้ฝไธๆพไธไธช\n266001 0 ๅค้ข้ฃๅฎๅพๅๅผไบๅ ๆถ่ฝฐ็ธๆบ\n266002 0 ๆ็ฝๅๆๅ
็พๅบฆใ360ใไบฌไธ็ญไบ่็ฝๅ
ฌๅธๅ ็ญไธๆ็บท็บทๅจๅๅ
ฌๆฅผไธไธพ็โๆ่ฆๆพๅโ\n266003 1 ๆไปฌๆๅฎถ่ฏๅ่ฎฐ้ปๅ๏ผๅจๅจๆตฆไธ่พพๅนฟๅบๆ๏ผๆ้ ็ๅๅธๅ็ฒพๅๆฐดๅฒธๅซๅข
๏ผxAๆๅณๅฐ็ๅคงๅผ็๏ผ่ถ
ๅบๅๆ...\n266004 0 GooglePlay้ฃไธๅฅไธ่ฅฟ้ฃๆ ท่ฎพ่ฎกไธขๅฐๆฟๅญไธ็จๆ่งๅพไธ่ๆ\n1000 Processed\n classify content\n266500 0 ๅๅค็ฎฑ้ฝๆพไธไธ้ข~็งๆ็่ฑกๅฐไฟๆธฉๆฏๅพ็พไธฝๅ\n266501 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ iq2397ไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n266502 0 ๅฟไธไฝไธๅ ๅผ ๅๅฏผๅ
ฅ็ต่็ๅๅพ\n266503 0 3ใ็ต่ๅจๅดๅ ๆพไนฆๆฌใๆๅฟ็ญๅฏ็็ฉ\n266504 0 ๅจ็่ๅฑฑ้กถ้ฃๆฅไธๆถ้ฃๆบ็ฎๆต่ฝๆธฉๅท็้ฃ็้ๅธธไฝ\n1000 Processed\n classify content\n267000 0 15874109600ๅญฉๅญๅฆๅฆ\n267001 0 ่ดจ็ไปไน็่ฏไนไผๅพ็ด็ฝๅฐ่ฏด\n267002 0 ๅธธๅธธๅจๅฐ้้้ๅฐๆๅฅณๆงไนๅฎขๅไธ่ฅฟใๅไธ่ฅฟ\n267003 1 ๅฐๆฌ็ไผๅๆฐๅนดๅฅฝ๏ผๆฐๅนดๆฐๆฐ่ฑก๏ผไธบๆ่ฐขๆฐ่้กพๅฎขๅฏนใๅ้ฆใ็ๆฏๆ๏ผๅกxๆไปฝๅฐๅบๅๆค็xๆฌกไปฅไธ้ไปท...\n267004 0 ๅ็ๆถๅๅฏๆพไธๅบ่่ๆ่
็บข็ณ\n1000 Processed\n classify content\n267500 0 ๅจๅไบฌ่ฟๆ่ฟๆ ท็ๅฏๆๅธ็็
งๅ\n267501 0 ๅๅท็ๆ่ฒๅ
ๅ
ณไบๆณธๅทๅปๅญฆ้ขๆดๅไธบๅๅทๅป็งๅคงๅญฆ็่ฎบ่ฏๆฅๅ\n267502 0 ็ฐๅจๅชๆๅฐ้็ซๆ่พน็็ ดๅ่ฝฆๅบ\n267503 0 7ๆ26ๅทๆฏไธช่ฎฉ้ๆฐธไฟกโๅคงๅธโๅพๅคด็็ๆฅๅญ๏ผๆไบบไปฅๅฎๅไธพๆฅ็ๅไน\n267504 0 ๆๅจๆๆบ้
ท็ๅ็ฐโๅฐไนไนbiuโ็็ฒพๅฝฉ่กจๆผ\n1000 Processed\n classify content\n268000 0 com็ฐๆMadpaxFullๅฝฉ่ฒๅบ็ฌๆฝฎ่ๅ
\n268001 0 ๅๆถๆฏ้ฎๅๆฒกๆๅ้ฟ้ๅทดๅทด็\n268002 0 ๆฒชๆไธไป
ๅจ3600็น้่ฟๆจช็่พพ5ๅคฉ\n268003 0 ๅฎฃๅธๅฏๅจไธ็ณปๅ็พๅบฆๅ
้จ้กน็ฎๅฏนๅคๅผๆพๅธๅผๆ่ต่
็โ่ชๆฏ่ฎกๅโ\n268004 0 ๆตๆฑๆไธชไปไน่ชๅฐๅๅคๆณๆถๅไบฌ้ๅฝไผ้ฃๆ ท็บข\n1000 Processed\n classify content\n268500 0 ๅขๆๅบxxxxๅนดๆถๅพๅนธ่ฟๅฐๆพๅฐไบไปๅจๅคชไป็ๅฅๅฅๅงๅง\n268501 0 ๅฏ้ข้ฟ่ๅฐไบๅจไธญๅฝ้่ๅ็ฉ้ฆ็ๅท็ไบ้ฟ็้ชๅไธ\n268502 0 ๅจ่ฏฅๅๅ้ฅญ็ๆถๅ้ฃๆบๆ็น็็ๅคช็่ฆไบ\n268503 0 ๆฏๅฆๅญฆ่
ๅๆณๅฎๅ่ฟๆณๆณๅฎ็ๅบๅซ\n268504 0 ๆ่ฟ็็่ฟไธชๅ ๆฒนๅฎไน ็ๆบๆๅจ็\n1000 Processed\n classify content\n269000 0 ็ฌ็น็ๅไธคไปถ่ฎพ่ฎกยทๅจ้ณๅ
ไธ่ฅ้่ฅ็ฐ\n269001 0 399006็ไธๅคงไธป้ขๅฐฑๆฏไบ่็ฝ้่\n269002 0 ไธญไฟก็ฉไธ้ฟๆฅๅ
ฌๅธ2015ๅนดๅบฆไธญๆๅทฅไฝไผ่ฎฎๅฌๅผ\n269003 0 ้คๅป้ฆๅฐพไธคไฝๆญๆโ้ปๅคฉ้น
โๅโ็ผ็โ\n269004 0 ไปชๅพๅฝไฟกๅฝฑๅ2015ๅนด7ๆ20ๆฅๆๆฐๅฝฑ่ฎฏ\n1000 Processed\n classify content\n269500 0 ไฝๅฆปๅญๅฑ
็ถๅฉ็จ้่ฟ่ฐทๆญ่กๆฏๅฐๅพๆๆ็3D็
ง็\n269501 0 Xไฟก่ฏๅธX้ฆๅธญ็ญ็ฅๅๆๅธๆๆฐๅคง็็ๆณ\n269502 0 ๆ็ฉบๆพๆฟไธๆณจ็ฌๅฎถไบๆๆฟๆฟๆบ\n269503 0 ๅฏๆๆบ็ดๆฅๆๅฐ็
ง็็ๆๅฐๆบ\n269504 0 ๅๆฅๅ้ซๅคงไธ็ๅป็ๅฝข่ฑกๅ
จๆฏไบ\n1000 Processed\n classify content\n270000 0 ๅฐ้ไธๆ่พน็ๅฅไปฌๆฟ็ๅพฎๆ็ฌ่ฎฐๆฌๆๅผๅจ็ผ็จ็้ข\n270001 0 ็ธๅ
ณ้ขๅไธๅฎถๅฐฑ4ไธชๆๅธธ่ง็ๆธธๆณณๅฐ็น\n270002 0 ๅซๆๅ่ชๆ็
งไธบๆไบไบๅฏนไฝ ๆไบๅพฎ็ไธไปฝ้ปๅฅใไธไปฝๅนณๆทกใ\n270003 0 ่ฟไธชๅคฉๅบ้จๅไธ็ฑๆไผ็ดๆฅๆๆ็ขณ\n270004 0 ๅฐๅฅ่กไปปๅ
ๆพๆฅๅคxxๅชๅคง่่\n1000 Processed\n classify content\n270500 0 ๅปบ่ฎฎๅฏน็ขฐ็ท็ไบบ่ฟ่กๅ
ฌๅฎๅฑๅคๆก\n270501 0 ้พ้ๅไธบไธ็ฅ้่ฃ่7็ฐๅจๅพ้พๅๅ\n270502 0 ไธ่ทฏไธ็่ฎฝๅบๅฒ็ฌ่ดจ็๏ฝๆๆถไฝ ไผไผคๅฟ้พ่ฟไฝๆดๅค็ไพฟๆฏไธ็ฌ่่ฟ\n270503 0 ไธป่งๆๅป็็ๆฎๆผ่
ๆฏ้ฆๆธฏๆผๅAngie\n270504 0 ็ฌ้พๆไบบๅบๅ ๆฏ็พๅไน็พ็Ox็ณป\n1000 Processed\n classify content\n271000 0 ไธซไธซ540ๅคฉ๏ผๅคๅฉ็็ผ่ฆๅปๆ็\n271001 0 ไธญๅฝๆณๅพ่งๅฎไธๅคซไธๅฆปๅโฆๆณๆณ่ฆๆฏๅ็่ฟไบไบ\n271002 0 ไบๆฅๆๆง็ๆๆบ่ๅคฉ่ๅฐๆฒก็ต\n271003 0 ไฝๆฏ็ตๆขฏๅฃๅๆๆ ่ฏญโ่ฏทๅฟ่ก่ตฐโ\n271004 0 ๆๆขๆ่ตไปburpeeไธๅฃๆฐไธไผ่ถ
่ฟ1โฆ\n1000 Processed\n classify content\n271500 0 ็ทๅญฉไธบ่นญWiFiไธ็ฝ็ฌ้ฒๆค็ฝๅจ้่ฝจไธ็ฉๆๆบt\n271501 0 TCLๆๆบ่ฏทไบๅฝ็บขๆๆ้ๅๅๅไปฃ่จ\n271502 0 ๆช่ณ7ๆ17ๆฅไพ็ถๅ
ฑๆถๆๆฌพ183089\n271503 0 ็ญ๏ผ็ปๆต็ฐไปฃๅใๆฟๆฒป็ฐไปฃๅใ็คพไผ็ปๆ็ฐไปฃๅใๆๅ็ฐไปฃๅใ็ฏๅข็ฐไปฃๅใไบบ็็ฐไปฃๅ\n271504 0 ๅฐ่้
ๅบ้ๅขๅจ่ฅ่ฟๅฏๆป็ๅ
ผๅธๅบ้ๅฎๆป็ๆน้ธๅ
็ๅธฆ้ขไธ\n1000 Processed\n classify content\n272000 0 ๅพๅคๅฅณๆงๆฃไบๅฆ็ง็พ็
ๅ้ฆ้็ๅฐฑๆฏๅป็่ฅฟๅป\n272001 0 ่ฐทๆญ่ฆๆSEOไธๅฎถ็ป่ชๅทฑๅๆ็ดขๅผๆไผๅ|ไปฅๆ็งๆ\n272002 0 ๅซ็ฏๆๅไธค่ฝฟ่ฝฆๅ้่ฑ่ฝฆไธปๆๅคฑ่ฐๆฅ่ตไป\n272003 1 ไฝ ๅฅฝ๏ผๆๅธๆโ%๏ผ็_้
โฎใใ ๅฆ้๏ผxxx xxxx xxxxๆ็\n272004 0 Twitterๆฏๅคฉไธๅฐๆจ้ไปๆชไธญๆญ\n1000 Processed\n classify content\n272500 0 ๆฏๅคฉ่ฑๅ้ชจๆๆผๅงๆ
้้ขๆๅงๅง็ๅ
จ้จ้ฝๆชๅฑไบ\n272501 0 ๅฟ็็ฝชๆ ้กๆ็่ถไธ็ผๅฐฑ็ๅบๆฅๆน\n272502 0 ๆ่ขซไธญๅฝๅผ็ๅฉๅงปโๅผบๅฅธโไบ\n272503 0 TotheFirstYearใTomyMr\n272504 0 ๆฏๆฅๆฌๅฆไบง็งๅป็ๆจ่็ไบงๅ~\n1000 Processed\n classify content\n273000 0 ่ฟๅนดๆฅไธญๅฝๆบๅจไบบๅธๅบไฟๆ้ๅฟซ้ๆๅ\n273001 0 ๆ่ฝ็ถๅธๆwindows10ๆฏwinๆๆบ็ๅคงๆบ้\n273002 0 ไปๅคฉๅปๅDuffy้ฃๆบ็ไบบ่ฟ็ๅคโฆโฆไธๅผๆจ็น่ขซๅทๅฑ\n273003 1 โx.x่ณx.xๅทไผๆฆๅฑ
ๅฎถ่ฃ
็ๆดป้ฆ๏ผ่ฏ้ๆจๅๅ โๆ ๆฐๅฎถ่ฃ
ๅปบ.ๆๅคง่กฅ.่ดดๆดปๅจโใๆฅๅบๅฐฑ้็ฒพ็พ็คผ...\n273004 0 ๅฝๅคๆฑฝ่ฝฆๅถ้ ๅๅชๆฟๆๆไธไบ่ฟๆถ็ๆๆฏๅ็ปๅฝๅ
็ๅ่ตๅๅ\n1000 Processed\n classify content\n273500 0 ๆฐ่ฅๅป้ขใ็ง็ซ่ฏๆๅทฒๅ ๆฎไบๅๅฃๆฑๅฑฑ๏ผ็ฎๅๅ
จๅฝๅทฒๆ่ฟ6\n273501 1 ไปๅฑฑๆบๆฐดๅทฒๆxxไฝไธไธปๅจๆๅธ็ญพ็บฆ๏ผ็ฐๅผๆพxๅฅๅๆจๅๆถๅๆจฃๆฟๆฟ๏ผไพๆจๅ่งๅ้ดใๅจ่ฉข๏ผxxxxx...\n273502 0 ๅๅ
ดๆฅๆฅๆฅไธไผ ๅช้ๅขๆฟๅ็2011ๅ
จๅฝ่ฎฐ่
็ซๅทฅไฝไผ่ฎฎไปฃ่กจไธ่ก40ๅคไบบ\n273503 0 ้ๅทๆณ้ข็บชๆฃ็ป็ๅฏๅฎค็ปๆ็ฃๆฅ็ปๆทฑๅ
ฅ่ฏฅ้ขๅไธญๅฑ้จ้จ\n273504 0 ๆฒชๆ้ซ้้คไบG82ๆฌก่ฝฆ็ฅจ็ดงไฟๅค\n1000 Processed\n classify content\n274000 0 ๆฐๆถไปฃไบ่็ฝ้่็่ๆ่ดขๅฏๆฏๅฆๅฏ่ฝๆฟไปฃ่ชๅคไปฅๆฅ็้ป้่ดขๅฏไปทๅผ่ง\n274001 0 ใๅฆไฝๅไธๅ็ๆญฃ็ๅๅ่ฎพ่ฎกๅธใ\n274002 1 ไบฒ็ฑ็ไผๅไฝ ๅฅฝ๏ผๆๆฏSmๆฒๅฐ็่ช็ถๅ ไธๆ็๏ผๆไธชๅฅฝๆถๆฏๅ่ฏไฝ ๏ผๆไปฌx.x่่ฆๆไธๅจๆดปๅจxๆ่ตท...\n274003 1 ๅ
่ดน่ฃ
ไฟฎ๏ผxไธๅทจๅฅ๏ผ้ฆจๅฑ
ๅฐxๆฅโๅ
่ดนๆด่ฃ
ๆข็ญพไผโ็ปๆxไธ๏ผ่ฃ
ไฟฎไธ่ฑ้ฑ๏ผๅๆฝiphonex๏ผๆฅ...\n274004 0 ๅฐฑๆฏไธๆฌก้ข่จไบๆดฒ้่ๅฑๆบๅบๅ็็ทไบบ\n1000 Processed\n classify content\n274500 0 ็ไธ้้ขๅไธ็บงๆฏๅจๅฑฑไธๆ็้ฃไธช~้ฃๆตๆฑ้ฃๆๅข\n274501 0 ไฝ ๅฐฑๅฏไปฅๅๅฉ่ฎพ่ฎกๅธWAGAiiๅธฆๆฅ็ๅกๆ่ฑ\n274502 0 ไนๅ้ฉฌไบ็ ธ10ไบฟๆฅๆจๅนฟๆปดๆปดๆ่ฝฆๆจ้่ฟไบ็พๅบฆ็ ธๅ ไบฟๆฅๆจๅนฟ็พๅบฆ้ฑๅ
ๆจๅ้่ฟไบ่ฟๆณ้่ฟ่ฟๆฌกๅ่ดทๅฎๅ\n274503 0 ไธญๅฝๅๆตทๅๆผ้ๅจๆฌง็พ่ฅฟๅช้ข ๅ้ป็ฝๆฑก่่งฃๆพๅๆญฆๅๆซๅโฆโฆๅฆๆ่ฟไนๆฏโๆญฆๅๆซๅโ\n274504 0 xxๅนดๅจๅพทๅฝๅผๅงๅebayใxxๅนดๅไบ้ฉฌ้\n1000 Processed\n classify content\n275000 0 ๅผไบไธๅฐๆถไธๅฎถๅจ่ฏขไผไธ็พคไบบๆฟ็ๅฐธไฝ็
ง็็ ็ฉถๅๅคฉ่ฏ่ฏดๆณๅป่ฟไธช่ไธ็ๆฏๅฑ็ไบๆ่งๆดไธชไบบ้ฝๅจๅๅ
ๅๅๅ\n275001 0 1ไธชๅฐๅท20ๅนดๅๅท่ตฐ2ไปถๅค็ฉๅๅณๅ่ฟ่ฟ่ฟ\n275002 0 WindowsPhoneๅฐฑๅๆฐ่ฝๆบ่ฝฆ\n275003 0 ๅจๆทฎๅฎไธป่ฆๅนฒ้็ซๅ ่ฝฆๆด็ๆๅๆโ็ฆโๅฆ\n275004 0 ไปๆ้ฝๅธๅ็ฎกๅงๆงๆณๆป้่ทๆ\n1000 Processed\n classify content\n275500 0 ๆธๆ+่่ฑ๏ผ่ฝไฝฟ็ผ็่ฝปๆพใๆไบฎ\n275501 1 ๅฎถ้ฟไฝ ๅฅฝ๏ผ้ขๅ
ๆ่ฒๆฅๅญฃๆจๅบ่ถ
ไฝไปทไฝไธ่พ
ๅฏผ็ญxxxๅ
๏ผ้ไธไธช็ญ๏ผ้ขๆปกไธบๆญขใๆฌข่ฟๅๆฅๅจ่ฏขๆฅๅใ\n275502 0 ่ฟๆฏ20ๅฒ็ๅฅณๅญฉๅญฃๆฅๅจ็ๅนธ็ฆ่ง\n275503 0 ็ฌ็ซๅฏปๆพ้ฃ็ฉใๆฐดๆบใ่ฏๅใ่กฅ็ป\n275504 0 ๅ ๅ
ฅๆธๆใๅฐ็ณๅ็
ฎxxๅ้\n1000 Processed\n classify content\n276000 0 ๆฝๅปทๆไธ็ฎกไป่ตท่ทณ้ซๅบฆ็ฉบไธญๅงฟๆ็ไฟๆ่ฟๆฏๅฐๅ
ฅๆฐดๆถๅฏนๆฐด่ฑ็ๆงๅถ้ฝๆฏๆๅๅคบๅ ็ๆๅคงๅฉ้ๆ่ฆๅ็ๆดๅฅฝ\n276001 0 ไธ่พพ็พ่ดงๅ
ณ้ญๅ
จๅฝ40ไฝๅฎถ้จๅบ็ๆถๆฏ\n276002 0 ๆฅๅ็ญ็บฟ๏ผ18051024703\n276003 0 ๅ็ๅ็่ฎพ่ฎกๅ็็ด ๆๅ็ๆจกๆฟๅ
ฌๅธๅ็ๅๅกๅ็ๅนฟๅๅ็ไผไธๅ็็งๆๅ็ๅๆๅ็ๅไธๅ็้็จๅ็...\n276004 0 ๅฏ่ฝๅ ไธบxxไธๆฅไปทๆฏ่พ้ซไธ็ดๅฐๆชๆไบค\n1000 Processed\n classify content\n276500 0 ๅฝๆๅผๅง่ดจ็ไธไปถไบ็ๆถๅๅ็กฎ็ๆปๆฏ้ฃไน้ซ??\n276501 1 ๅ
็๏ผๆจๅฅฝ๏ผๆๆฏๅ่ทๆจ่็ณป่ฟ็ๅนณๅฎๆ่ดทๅฎขๆท็ป็ๅปๅ
ด่๏ผๅ็ๅ่กใๅ
ๅคงๆ ๆตๆผ่ดทๆฌพxโxxxไธ๏ผ...\n276502 0 ่ฟไน่ฎฉ็ซ็ฎญ็้ตๅฎนๅพไปฅ่ฟไธๆญฅๅฎๅ\n276503 0 ไปฅๅๅๅฐ้ๆๅปๅๅบๅฟ
้กป่ตฐๆฅผๆขฏ\n276504 0 ็็ต่ง็ฉ็ต่ๆๆบๆธธๆๆบipadๅๅค\n1000 Processed\n classify content\n277000 0 ๆนๆนๅ
ฌ็คพ่ณฝ้่ตฐ่กไผๅไบฌ็ซ้กบๅฉๅฎๆ\n277001 0 ่ฎพ่ฎกๅธGonglueJiangๆณๅฐไบไธไธชๅฅฝ็่งฃๅณๅๆณ\n277002 0 13ๅฒ็ๆนๅๅฐๅฅณๅญฉๅฐๆ่ขซๅธฆๅฐ่ๅๅๆไธๅคง้พ็ทๅญๅฎถ\n277003 0 ๅจๅฆ้จๅฐๅบไธญ็ณๅๆฃฎ็พๅ ๆฒน็ซ\n277004 0 ๆๅๆฌข่ฑๅ้ชจๅๆ้ฃ่ฟไธๅฏนๅฟๅๅๅๅ\n1000 Processed\n classify content\n277500 0 ้ฝๅฏ่ฝ้่ฟไนฆๆฌใ็พๅบฆๅๅ็ฅ่ฏๅคๅญฆๆฅ\n277501 0 ๆๆบๅก็ๅๆญป็งไฟกๅไธไบๆ็นๅๅปๅ็จ้ฃไธชๅๅคไฝ ไปฌ้ฃไธชๆๆบๆฒกๅธฆๅบๆฅไฝ ไปฌ็ๆต้ๆๅทฒ็ปๅฒไบ\n277502 0 ไธชๅซไธๆณๅ่ดฉ่ฟ่ง้ๅฎๆณจๅฐๆดๅฎน็จ้ๆ่ดจ้
ธ้ ไบงๅ\n277503 1 ๆๅคฉๆฐๅฐOKยทไบๆ ไธๆไธ้ ็ๅ ๅ
จ็ ้ๅคงไปไผ๏ผ่ดงๆบ็ดง็ผบ๏ผๆ็ดงๆถ้ด๏ผ้ข่ดญ่ฏท้็ต๏ผ\n277504 0 2๏ผๅซ็จ่ชไปฅไธบ็ๆ ๅ่ฆๆฑ้ๅถๅฅณๆง\n1000 Processed\n classify content\n278000 0 โฆไฟก็จๅกๆป็บณ้็่ฎก็ฎๆนๅผ้่ฎ็ฒๆด\n278001 0 ๅ่ฑๅ้ชจๅง็ป่ฟๆ ทๅฉ็จ่ช่บซๅฝฑๅๅ\n278002 0 ๆๅๅๆๆๆบๅฃ็บธๆขๆไบ่ฟไธช๏ผBretAdeeopensoneofhis72\n278003 0 ๅจ่ฏขๅฏๅ ๆๅพฎไฟกๅท๏ผ1239847679้ๆฆ\n278004 0 ่ก่ๅธ็็
ง็5็นๆ็6็น็ง็ง่ฟ็ฉฟ่ถฟๆๆฟ่ทๆๅ็ฉ\n1000 Processed\n classify content\n278500 0 ๆฒชๆๅทฒ็ป่ทๅ7ๆ8ๆฅใ7ๆ9ๆฅ้่ฟไฝ็ฝฎ\n278501 1 (x/x)ๆ่ฐขๆจ่ด็ตไธๆตทๅ
็ฅ๏ผๅ
็ฅๅฏ็น่ๆ/็ฑๆ่ๆๆฐ้ฒ้ข่ดญๅฏ๏ผๅณๆฅ่ตท-xๆxxๆฅๆ้ด้ข่ดญๅณ...\n278502 0 xๅ้ญ
ๅๅ
จ้ข่งฃๆ|ETtodayๅฝฑๅๆฐ่|ETtodayๆฑๆฃฎๆฐ่้ฒๆๆฉ็|ETtodayMo...\n278503 1 ใๆใโฅใๆตใ ใๅขใโฅใๆฃใ ใๅคใโฅใIx%ใ xxx xxxx xxxxๅโ\n278504 0 ๅฝๆ็ฉไธๆๅกไผไธใ็คพๅบ็ปๅปบ็ฉไธๆๅกๆบๆๆไธไธ็ฉไธไผไธ่ฟ้ฉป็ฎก็\n1000 Processed\n classify content\n279000 0 ไธไธชๅท็ ๅซ๏ผ15298601091ๅ็ๆฏไธไธๆ ๆตๆผๅ
ๆ
[Cell output (truncated): progress log from a binary text-classification pass over a Chinese short-message dataset (Weibo-style posts and promotional SMS). Each block prints five sample rows in the form "<index> <label> <text>", followed by the marker lines "1000 Processed" and " classify content"; the surviving log covers sample indices roughly 279001 to 384004. Label 1 rows appear to be promotional SMS and label 0 rows ordinary posts; for example, row 279003 reads: 0  I'm at ๅœฐ้“็ ๆฑŸ่ทฏ็ซ™ ZHUJIANGLU Station in ๅ—ไบฌ. The raw rows are mojibake-damaged (multi-byte UTF-8 characters were split across line breaks) and the log breaks off mid-row.]
ทๆฃไฟฑไน้จ็ๆถๅ\n1000 Processed\n classify content\n384500 0 ๆไธๅๆขฆๆขฆๅฐๅ้ฃๆบ้ฃๆบ่ขซ็ผ
็ธๅๆดพๅๅซๆๆ่ทณๆบ่ทไบ็ถๅๅฐๅค้ฎ่ทฏไฝๆฏ่ฏญ่จไธ้่ฑ่ฏญไนๅคชๆธฃๆฒกไบบๅธฎๆ\n384501 0 ๆตชๅญๅๅคด๏ผNBA็ๆ่ดๅ
ๆฏๅฆไฝ่ดฅๆไบฟไธๅฎถ่ดขๆฒฆ่ฝๅฐๅๅๅก็\n384502 0 ไปๅนดxxๅฒ็ๅๆๆพๅ ่ดชๆฑกใ็็ชใ็ฅไบตๅฅณ็ซฅ่ขซๅคๅๅ
ฅ็ฑ\n384503 0 ๅพฎ่ฝฏๅทฒ็ปๅฐiPhoneไธ็XboxMusicๅบ็จๆนๅไธบGroove\n384504 0 ๅๅพ็ฅไธไธชๅๅญฆๅจ่
พ่ฎฏๆป้จไธ็ญ\n1000 Processed\n classify content\n385000 0 4ใๆณจๆ้ฅฎ้ฃๅซ็ๅๆถๅ้้็พ็
ๅ็\n385001 0 ๆไปไฝ ่ฟๆฒๆๅฐฑ็ๅบๆฅไฝ ่ดชๅฟ\n385002 0 ๆๅปบ่ฎฎๆฟๅบๅ
ฌๅฎๆบๅ
ณๅจๆ้ฝๅธๅฏไปฅไธฅๆไธๆฌก\n385003 0 ๆฏไธๆฏ็ฏ็ฝชๆกไพๆฏๆๆฐๅบๆฅ็\n385004 0 ่ถๆถ้ฒๆ้็ๆถ้ดๅปๆฏ้ฅฌไบไธ่ฟไธช็ฉๆ\n1000 Processed\n classify content\n385500 0 ๅป้ข็ตๆขฏๅฎๅ
จ็ฎก็ไบบๅๅ็ตๆขฏ็ปดไฟไบบๅๅฏนๆฏ้จ็ตๆขฏ็\n385501 0 ๅๅ
ญๅนดๅ่ฎฉไปๅปๅๅ ๅฅฝๅฃฐ้ณ็ฌฌไบๅๅญฃ\n385502 0 ๆฒกๆ้ฒๆ้็ๅคๅคฉ??????\n385503 0 ไฝ ไนๅฟซๆฅ่กจๆๅง~้ๅบไฝ ๆ็ฑ็ๆบๅจไบบๅค่ง\n385504 0 ๆๅค็ๆฏCCTVx็ปไบๅพๅค้ฑๆไปฅไธญๅฝๅ
็ด ๅพๅค\n1000 Processed\n classify content\n386000 0 ็ๅข็ฌ่ฎฐ็พๅบฆไบ\n386001 0 ๅคดๅๆฏ็ถๆฏๅบๅ ๅฏไธไฟๅญๆถ้ดๆ้ฟ็่ฝฝไฝ\n386002 0 ๆฏ็ฎฑๅฅถ็ฒๆฟๅบไธฅๆ ผ็็ฎกโฆๆไปฅๅช่ฆๆฏ็ด้ฎ\n386003 1 ๆจๅฅฝ๏ผๅนฟๅท้ซไฟๅๆฑฝ่ฝฆ้
ไปถๆฌข่ฟๆจๆฅ็ตๅ่ฏข๏ผไธป่ฅ๏ผๅฅ้ฉฐ.ๅฎ้ฉฌ.ๅ่ฃ
ๆ่ฝฆไปถใ็ไผๅๅธ.่ดงๅฐไปๆฌพ.็ต...\n386004 1 ่ฟไธ๏ผๅ
ซ่๏ผๆฐดไธญ่ฑ็พๅฎน้ข่ๆไบงๅๆปๅ
ฌๅธ๏ผๆดปๅจๆ้ด๏ผ่ดญไนฐๆค่คๅๆจๅบไธ็ณปๅไผๆ ๆดปๅจ๏ผๅคไนฐๅค้๏ผ้...\n1000 Processed\n classify content\n386500 0 xxๆฅPxPๆไบคๆฐๆฎไธ่ง่กจ๏ผๅ็งฐๆถ้ดๅ ๆๆไบค้ๆไบค้ๅนณๅๅฉ็ๆ่ตไบบๆฐไบบๅๆ่ต้้ขๅนณๅๅๆฌพๆ้...\n386501 0 ่ดขๅฏๆขๆบโๅ
จๅฝ็ไปฃไผ่ฎฎๅฌๅผ\n386502 0 ็ป็ปxxไฝ็งๆฐ่ฃ
ๅค่ฟ่กๅฎๆๅฎ็ๅฎไฟฎ่ฎญ็ป\n386503 0 ๆณฐๅฝmistine็พฝ็ฟผ็ฒ้ฅผๅChanel็ฒ้ฅผๆไธๆผ่ไปทๆ ผๅดๅนณๆฐๅพๅคmistineๅบๆฑ้ฝไธไผ่ฑ...\n386504 0 ๆฟๅฑ็่ฃ
้ฅฐ่ฃ
ไฟฎไธๅพๅฝฑๅๅ
ฑๆ้จๅ็ไฝฟ็จ\n1000 Processed\n classify content\n387000 0 ๅ้ฃๆบๆ ็ๆฏๅ็ปๅไธๆฌก็ไธๆญป\n387001 0 ๆญป่
ๅฎถๅฑๅผๅงๆญๆฒไบๆ
็็็ธ\n387002 0 ่นๆ6plusๆๆบๅฃณiPhone6plus้ๅฑ่พนๆก5\n387003 0 ๅฟ็็ฉ็โ่ฏๅธ็ๅๅธฎไฝ ๆจๅผ็็งๆๅญฆ็่ฟท้พ\n387004 0 ่พ่ฆ็ฉ็ๆธธๆ่ฆๅ31็บงไบ่ฆๅฐๆๆณๅฎๆ็้ฃไธชไปปๅกไบ\n1000 Processed\n classify content\n387500 0 twitter/morinorom๏ผใจใใจใจโฆใจใชใ็ฉบใใฆใใโฆ\n387501 0 ๆ ่ฎบไฝ ๆ่ตไธๅปๅคๅฐ่ดขๅใ็ฉ็ๅไบบๅ\n387502 0 ๅ ๆๅซๆ๏ผT1523349912ๅๆไธ่ตทๅ็ฝ่ๅง\n387503 0 ๅจๆญฃๅจไธพ่ก็GDC2014ๅฑไผไธ\n387504 0 โ่ฑๅ้ชจไธๅทดๅทฎ็นๆฒกๆไธๆฅ๏ผโ่ฟไนๅคไบบ้ฝๆฏๆฅๅๅผๆฝๅ้ฎ้ฎ้ข็ไน\n1000 Processed\n classify content\n388000 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ?ๆฝ3\n388001 0 ่ฐ่ฏดๅฎไฝ่ฝๆพๅฐๆๆบๅจๅช้็\n388002 0 ่ฟไธชๆฐๅญๅฐไบ2014ๅนดๅ็ฟปไบ20ๅ ๅ\n388003 0 ๆฒกๆ็ป่ฟๅปๆไฟฎๅชๅ่ฎพ่ฎก็่ๆจ่ฌๅ็น่\n388004 0 โ้ๆญฃไนโไธพๆฅไธไบโๅฑๆถๆ่ฏๆฏโ\n1000 Processed\n classify content\n388500 0 ่ฟๆฅไฟโๅซๆโๆฐ้ป้่ฎฏ็คพๆฅ้\n388501 0 ไบบๅ60ๅนณๆน็ฑณๅ
ๅพไธบไธปๆตๆ่ง\n388502 0 ๅนถไพ็
ง็ธๅ
ณ่งๅฎไพๆณ็ปไบๅค็ฝ\n388503 0 ||ๆๅจๆฐงๆฐๅฌไนฆๆถๅฌโ็ๅข็ฌ่ฎฐxไบ้กถๅคฉๅฎซxxxโ\n388504 0 ๅฝไธญๅ
ๆฌ้ซ่พพ711็ฑณ็ๅ
จ็ๆ้ซไฝๅฎ
ๅคงๅฆโ่ฟชๆไธๅทโ\n1000 Processed\n classify content\n389000 0 ๅพฎ่ฝฏๅๅบ่ฟไธ็ถๅต๏ผ่ฟๅฏ่ฝๆฏXboxOneไธๆฐDRM็ญ็ฅๅผๅไธไธชBug\n389001 0 ๅๅฅฝไธ่พพๆๅๆ
ๆธธๅ้กน็ฎๅพๅฐๆ่ฟๆซๅฐพ้จๅๅ่ฎฎ็ญพ่ฎขๅทฅไฝใๅๅฐไธๆธ
ๅทฅไฝ\n389002 0 ๅไธบๆไธ่ฃ่ๅฐๅจๆฒ็น้ฟๆไผฏ้ฆ้ฝๅๅธๅๆ ทๆกฃๆฌก็ๆฐๅๆฌๆ็ป\n389003 0 ่ฏทไฟฎๆฐดๅฟๆฟๅบใไฟฎๆฐดๅฟๅ
ฌๅฎๅฑไปฅๅ็ธๅ
ณ็ๅไฝไธฅๆฉๆดๅไผคๅป็ไบบๅ\n389004 1 ๅฐๆฌ็ๅฎขๆทๆจๅฅฝ๏ผๆๅคๅฏๅฟซ้ๅ็ๅคง้ขใxx-xxxไธใไฟก็จๅก๏ผๅๆๆ ่ดน็จ๏ผๆ ๆท็ฑ่ฆๆฑ๏ผๅๅฏๅฅ็ฐ...\n1000 Processed\n classify content\n389500 0 ๆไบบไปฅ็ซ็ฎญ่ฌ็้ๅบฆๆ้ฃ็ฉๅกๆปกๅฐ่ชๅทฑ็่\n389501 0 ๆๆถๅๅพๆๆ็ๆ่งๅทฎไธๅคไธคไธไธชๅฐๆถๅๅฅถไธๆฌก\n389502 0 ๏ผปLittleLostProject๏ผฝๅจๅคง่กไธๆไปฌๅฏไปฅๅ็ฐๅพๅคๅบๅผ็็ฉๅ\n389503 0 ๅ็ฑป็ตๅจๆฑฝ่ฝฆๅจๆๅฑฑๅธๅจ่พนๅ
็ตโ็ปญ่ชโ้ๆฑ\n389504 0 ไธไธชๆๆบไธ็จ่นๆไบ็จๅไธบๅไธบๅฐๅนดๆๆฏๆๅฅฝ็ๅฐๅนด\n1000 Processed\n classify content\n390000 0 ๅ
ถไธญไธญ็ขณ้ฌ้fecr55c200ๆๆ ไปทๆ ผไธบ11200ๅ
/60ๅบไปท\n390001 0 ๆฅ้ไธ็
ไพ๏ผๆฃ่
ๆๅญๅฎซ็ ด่ฃๅฒ\n390002 0 ็พๅบฆ่ดดๅงโ็พๅฝโไบๅญ้ฝๆๆๆ่ฏไบ\n390003 0 ็จๆฅๅฐJava็ๅฏน่ฑก่ฝฌๅไธบJSON\n390004 0 ่ฟ่ฎฉๆ่ดจ็ๆไนๅ็็้ฝๆฏไบๅฅ\n1000 Processed\n classify content\n390500 0 ๆไปฌ็ง้ๅฐฑๆฏๆจ่บซ่พน็ไฟๆคไผ\n390501 0 ไธๅบ่ญฆๆน็ ด่ทไธ่ตท็็ชๅผไธ่ฑ็ฏฎ็ๆกไปถ\n390502 0 186*3230ไปๆฌพๆๅ่ทๅพไธไธช5ๅ
็บขๅ
\n390503 0 ไธๆตทไธ้ฉดๅๅจๆตๆฑไปๅฑ
ๅฟๆฑๅ้็ๅธโ็้โๆถ\n390504 1 ๅพทๅฝ่ฒๆๆ ผๅฐๅฐๆฟx.xxไฟ้ ๅไผๅ็ใๅฎๆ ไปทๆ ผใ่ฟ็กฌ่ดจ้ใ่ฏๅฅฝ็ฏไฟใไผ่ดจๆๅกใ ๅฐๅ๏ผ...\n1000 Processed\n classify content\n391000 1 $ๆจๅฅฝ๏ผๆฐๅนดๅฟซไนใ๏ผๅช่ฆโ้ณ ๅฐฑโ้โ็พๅไธๅใๅบxๅ้ๅฐๆท๏ผ๏ผ่ฏฆๆ
ไธใxxxxxw.CCใ\n391001 0 ๆฒกๆๆ็ฝ็็ธๅฐฑไนฑ็ฉไบบๅดๅทดๅญ็ๅฅณไธป่ง\n391002 0 ้ปๆๆๆฏ่ขซ่ดจ็่บซ้ซๆๅค็็ทๆผๅไนไธ\n391003 0 ่ฟ่ฝๅบ็จๆบ่ฝๅ็่ดขๆจกๅๆตๆ้ฃ้ฉ\n391004 0 ๅไบฌ็งฆๆทฎ่ญฆๆนๆฅๅฐไธไธชๅฐไผๅญๆฅ่ญฆ็งฐ\n1000 Processed\n classify content\n391500 0 ๅไธบไธๆฌพๆฐๆบๆๅ
้
Kirinxxxๅค็ๅจ\n391501 0 ไธๅพไธๅพ่ฏดไธๅฅ่ฑๅ้ชจ่ฟไนๅป็ผบ\n391502 0 ๆๅไธ้ข่ฟ่ฆ็ญๅฐ่ฑๅ้ชจๆฅไบๆ่ทๅฅนไธ่ตทๅ็ๅๆ้ก้ๆฒกๆ็ณ\n391503 0 ๅฆ็ไปฅๅๅฐฑๆ็็พๅบฆๅงๅไธ่ฟๆญฅๅฐฑไผๅๅฑๅป้ผ\n391504 0 ๆ ้กๅธไธญ็บงไบบๆฐๆณ้ขๅฌๅผๆฐ้ปๅๅธไผ\n1000 Processed\n classify content\n392000 0 ็ธไฟก็่ดขไบงๅๆๅๅฎๆฟ็ธไฟก็่กๆด้ ่ฐฑ\n392001 0 ่ชๅฐ้xๅท็บฟไบๆๅๅจ่พนๅคไธช็ซไบคๆกฅๆฝๅทฅไปฅๆฅ\n392002 0 ๅคชๅคๅคชๅค็่ดชๅฎๆฑกๅ้ไฝๆๅพๅคๅพๅค็ๅๆณไผๅป่งฃๅณ\n392003 0 ๅค็ป่ดธๅนฟๅบ9ๆฅผ็็ตๆขฏ้จ็็ธไบ\n392004 0 ๆๆๅฏนๆUVAUVB้ฒๆญข้ปๆไบง็\n1000 Processed\n classify content\n392500 0 ๆ
ๆธธๅฐไบงๅๅฑๆธ้ฒ้่ๅๆๅบฆๅๆจกๅผๆๆชๆฅ่ถๅฟ\n392501 1 ~ๆๅ๏ผผๆ่ฟๅฅฝๅ๏ผๆ ไธ ๅฅฝ ็ ่ถ ๅถ๏ผ่ฟ ๆฏ ๅ ๆฅ ็ ๅณ ้ใไฝ ๆ็ใxxxไธฝxxx...\n392502 0 thetruthcomestolight็็ธ่ฟๆฉไผๅคง็ฝ\n392503 0 ๅ
่ฟไธชNIVEAๅฐฑๅทฒ็ปๆฏ็ฌฌไธ็ถไบ\n392504 1 ๅฐๆฌ็ไผๅๆจๅฅฝ๏ผโไธๅ
ซโๆฅไธด๏ผๆ่ธๅ้
ๅฅ็ญ่ฃค็งๆไปทx.xๆ๏ผๆญคๆดปๅจxๆxๆฅ็ปๆ๏ผๅฟซ่กๅจ่ตทๆฅ๏ผ...\n1000 Processed\n classify content\n393000 0 ๆญฃๅจไธๆตทๅๅ ๅไธๆดปๅจ็็งๆฏๅฐฑๆพ่ตดๆญๅทๅฏไผ้ฉฌไบ\n393001 0 ้คไบๅฐๅจๅฐผๆฏ50็ฑณๆฐด้ๅๅผ้ธๆพๆฐด่ฟๆกฅๅค\n393002 0 ๆป็ฟไผ้่ฝๅคฑ่ดฅ็ดๆฅ่ขซๆฑฝ่ฝฆ็ปๆไธ\n393003 0 2015ๅๆๆฐ่ชๆบๅบ้ๅขๅ
ฌๅธๆ่11ไบบ\n393004 0 ๅญฆไบไธๅฅๆฌๅท่ฏ~็ๅ
ๆ็ฑไฝ ~ๅๅ~ๆฏๅคฉ้ฝๅพๆณไฝ ~็ฑไฝ ~\n1000 Processed\n classify content\n393500 1 ่ฎพ่ฎกๅธไธๅฏนไธไธๆจไบคๆต๏ผ$ๅฝๅคฉๅฎๆ ทๆฟ้ดๅฏไบซไปฅไธไผๆ ๏ผ$x๏ผ่ต ้ๅฎถๅ
ทๅฎถ็ต๏ผๆฒๅ๏ผ้คๆก๏ผ็ๆบ๏ผ็ถๅ
ท...\n393501 0 ็็ๆฅ้ไธๅผ ๅไบฌbigbangๆผๅฑไผ้จ็ฅจ\n393502 0 ไฝๆ กๅญๅ
็็น่ฒๅปบ็ญ็พคๅคง้จๅๅพไปฅไฟๅญ\n393503 0 46ไธชๅธ็บง้จ้จๆๅไบ้กน่ดฃไปปๆธ
ๅๅทฒ้่ฟ็ๅๆบๆ็ผๅถ็ฝๅ็คพไผๆญฃๅผๅ
ฌๅธ\n393504 0 ๆฑ่็็ด ่ดจ็ทsๆๅพ
ไธไธชๅฌ่ฏๆฟๆ้ฟๆๆ
ขๆ
ขๅๅฑ็ๆฟๆไธๆญฅไธๆญฅ่ขซ่ฐๆ็mๅๆฌขๅค้ฃๆ่
ๅคๅ
ธๆๅญฆ็ๅฆนๅญๆๅฅฝ\n1000 Processed\n classify content\n394000 0 ไนๆตไธ่ฟ2000ไธไบบ็่ฟๆณไนฑ็บช\n394001 0 ไธบไปไนไฝ ไปฌไธ่ฏด้ญ
ๆmx5ๆ็ฌฌไธๅๆ ้ฝๆฏ้ฉฌ่ช่พพmx5ๅข\n394002 0 ไธๅจkingcountryๅฎฃๅธๅงๅไผๆ่ฎฎ่งฃๅณๅฏนไบๆ็ปญๅข้ฟ็ๅฐๅนดๅธๆณๅถๅบฆไธญ็งๆๅทฎ่ท้ฎ้ข\n394003 0 ๆฌๅทๅธๅบๅๆฑ้ฝๅบๅ
ฑๆไบคๅๅๆฟ77ๅฅ\n394004 0 ๅผบๅฅธ็ฏไธๅฎถไพต็ฏ็ๅฆ็ง๏ผ1ๅป็้่ฟฐๅฅๅบทๆญฃๅธธ\n1000 Processed\n classify content\n394500 0 ่ๅจๅ่ฐ็ไธญๅ
ฌๅฎ่งฃๆๅฅณๅคงๅญฆ็ๆๅ\n394501 1 ๆๅคๅฏไปฅๅ
่ดนๅ็ๆฉๆ่ฝฆ้ฉพ้ฉถ่ฏใ้กปๅ่ฏ็ไบฒๆๅฅฝๅ่ฏท่ทๆ่็ณป๏ผ\n394502 0 ๅฎๆพ่ฟ็ปญ17ๅนด่่็พๅฝไบ้ฉฌ้็
้ไนฆๆ่กๆฆ\n394503 0 ๅคง็ญๅคฉ็ไธไธชไบบๅปๅไบฌๅนฟ็ต่ทไบไธช่
ฟ\n394504 0 ๅฐบๅฏธ20ใ24ใ26ใ29่ญๆฏ็ฒๆๆฐ่ฒไฟ็ๅบๆธธๅญฃๅธฆไธๅฟ็ฑ็ไปๅบฆ่ฟๅฎ็พๅๆ\n1000 Processed\n classify content\n395000 0 ็ซ็ถๆขๅฏน่ชๅจ็ๆงๆฐๆฎๅผ่ไฝๅ\n395001 0 ๅไปๆดพๅบๆๅบๆฅโฆโฆๅณๅฎ่ฟๆฏๅ
ไธๆตๆตชไบ\n395002 0 ๆฅ่ชๅ่ฏzappyๅธๆๆฏไธชๆฅๆZ11็ๅฅณ็้ฝ่ฝ่ฎฉไบบๆๅๅฐๅฅนๆฐธ่ฟ็ๆดปๆณผใ็พไธฝ\n395003 0 5ๆถB๏ผ29่ฝฐ็ธๆบ็ปๆ็็ชๅป้ๅฐๅๅญๅผนโ่ๅญโๆๅฐ้ฟๅดๅธไธญๅฟ\n395004 0 ไธบไฝ่ฟฝๆๅชๆฏ่ฎถๅผไบ็ฒไธไธๅถๅไน้ด็้ฃ็งๅ
ณ็ณปไธ้ข็ๅฟไธ้ขๅๆๅๆไบ่ฟไธไธๆๆ้พ้็่ฑๆ่ชไปฅไธบๆฏ...\n1000 Processed\n classify content\n395500 0 ๅพฎ่ฝฏๅฏนไบWindowsUpdateๆจกๅผ็่ฐๆดๅดๅผๅไบๅค็็ไธๆปก\n395501 0 ้ข่ฎกๅนดๅ
xxxxๅคๅฅไฟ้ๆฟๅฏ้็ปญไบคไปไฝฟ็จ\n395502 0 ๅฝไฝ ๆฅๅฐๅป้ข็ๆถๅไฝ ๆ่ฝๆๅๅฐ\n395503 0 ็30ๅคๅฎถไผไบไธๅไฝใ130ๅคไบบใๆบ4000ๅคไปถๅฑๅๅๅ ๅจๆทฑๅณไธพๅ็็ฌฌไนๅฑๆๅไผ\n395504 0 ็งๆฟไนฐๆฟๅฐ้ๅธฎๅฟ18170265496่ฅฟไธ่ทฏๅไฝณๆฟไบงๅๆจๅฎถ็ๆขฆๆณ\n1000 Processed\n classify content\n396000 0 ๅ ไธบ็พๅบฆ่ฟไธชๆ็ดขๅผๆ็ปไบ่็ฝๅๅคง็จๆทๅธฆๆฅไบๅพๅคง็ๆนไพฟ\n396001 0 2015ๅนดไธๅๅนดๅค็็ณ่ฏไธพๆฅ75่ตท\n396002 0 ๆๅฅฝๅฌ็ๅฃฐ้ณๅฑ
็ถๆฒกๆๅฏผๅธ่ฝฌ่บซ\n396003 0 ๅฐๅนด็ฅๆข็ไปๆฐๅฐ็ฝ่ธไธ้ๅๆผๅ
่ณ\n396004 0 ๅไบฌๅคช็พไบๆ็ผๅ็ไธคไธชๅป้ผๆปๆฏ่ฝๅ่ง็\n1000 Processed\n classify content\n396500 0 ้ฝๆฏๅ ไธบๅจไธไบ่งฃไบๆ
็็ธๆถๅฐฑๅฆๅ ่ฏ่ฎบ\n396501 0 ๅฐ็ผ้้ๆจ่4็งๆฐๆฌพๆฉ้ค๏ผๆๅถ่้ฅผ้
ๆตทๅ้ป็ฏ็ฌผ่พฃๆค\n396502 0 ๅจ็ต่ๅๅๅ3ไธชๅฐๆถโฆโฆ\n396503 0 ็้ฟๅฆนๅฆนๆฅๅฅฝๅฃฐ้ณๆไธ็ง็ๆ่ช็ฌ่ชๅปๆฝ็ณๅฑน็sohoๆพๅทฅไฝ็่ตถ่\n396504 0 ้ฃไธช่
่ดฅ็ๅฟๆๅปทๅทฒ็ปๅฎๅ
จๆฒกๆ็ๅญไธๅป็่ฝๅไบ\n1000 Processed\n classify content\n397000 0 ๆ นๆฎ็ฐๅบๆ
ๅต่ญฆๅฏๅ่ญฆ่ฝฆๆฏๆ็ดไปป\n397001 0 keithๅฐckๆฐๆฌพๅฅณๅ
ๆฌง็พ้ฃ่ฝฆ็ผ็บฟ้ฟๆฌพ้ฑๅ
ๆๆฟๅ
็ฒ่ฒ้ป่ฒ็ฝ่ฒๅไปท269ๆๆฃ165ๅ
้ฎ\n397002 0 IMAX่ฆ็ๅปบ็ญๆฏไธๅบงๆๅ
ถๅฎ\n397003 0 ไปๆไธๅ
ณ็ช็ถไธไบไธๅบๅพๅคงๅพๅคง็้จ\n397004 0 Pๅพไบ่ฟๆฏๅพไธโฆโฆ่ทไฝ ไปฌๅคงๅฉๅงๅญ่็ฆไธๆ ท\n1000 Processed\n classify content\n397500 0 PhytoTreeไบบๅฐ้ป็ๅๆไบไธ็งๆค่คๅ\n397501 0 ไธ็็ฅๅๆ
่ก็ฎฑๅ็RIMOWAๅฑ็คบไบๆญฃๅจๅคๆดปไธญ็๏ผไธ็ไธ้ฆๆฌพๅ
จ้ๅฑๅฎขๆบๅฎนๅ
ๆฏF\n397502 0 ไฝ ๅฝๆฟๅบไธไป
ไป
ๆฏ่ ข็้ฎ้ขไบ\n397503 0 ๅฎฟ่ฟๆทฎๆตทๆๅธๅญฆ้ข้่ฟไธๅคๆถตๆดๆถ\n397504 0 ่ฎพ่ฎกๅธ้่ฟๅฏน้ๆ็ๆคๅๅๅฝขๆฅ็ข็ข็ๅบๅฎไฝๅ
ถๅฎไธค็งๆๆๅนถๅฝขๆไบไธไธชๆ่ถฃ็ๅฐ็ฏ็ฏ็ฝฉ้ ๅ\n1000 Processed\n classify content\n398000 0 ็ฏ็ฝชๅฟ็S4E19ไธญ็็ฝช็ฏๆฏๆฎๅ
้็Jasperๆๅๆผ็ปไบไธไฝๅ้ไบบๆ ผ็ฝช็ฏ่ไธ่ฟๆฏๅฅณไบบ็็ฌฌไบ...\n398001 0 ๅฝๅ่กๅ ๆญ็นcom+\\็งปๅจ็ผ่พๅบๆไธๆนcom+ไธ็งปๅจ็ผ่พๅบๆไธๆนcom+ไธ็งปๅจๅ
ๆ \n398002 0 ไธไป
ๅ ไธบๅฎณๆๆฅๆ็็ธ็่ฐ้พๅฐ่ฆ\n398003 0 ๅๅๅฝๆถ่ฟ็
ง็่ฟๆฏๆๆๆบๆ็ๆ้ฝๆฒกไบ\n398004 0 ๆณฐๅฝ่ญฆๆนๆๆไบ4ๅๅฅณๅญๅ1ๅ็ทๅญ\n1000 Processed\n classify content\n398500 0 ๅฏน้P8็จๆท่่จๅๅฆไฝๅฎ็ฐๆตๅ
ๅฟซ้จๅข\n398501 0 ๅไธๅนฟๅ๏ผไปทๆ ผ1500โโ2500\n398502 0 ็ๅ
ฐ๏ผๅฐๆธ
็ๆดๅๆฌข็ๅ
ฐ็บฏๆดๅฑไบไฝ \n398503 0 ๅไบฌ่ๅบทไธญๅปๅป้ขๅ
จๅฝๅไฝณ้็น้ข็ฅ็ปไธ็งๅป้ข\n398504 0 1985ๅนดๆๆฅๆไบ่ๅฉ40ๅจๅนด\n1000 Processed\n classify content\n399000 0 ๅชๆฏๆไปฌ็ๆฑฝ่ฝฆ่ฎฉๅฐๅทๅท่ตฐไบ\n399001 0 ๆไปฅ่ฎพ่ฎกไบ่ฟไธชๅฉ็จๆณๆก็ผๅถๅบๆฅ็ๆ็นๅ่บๆฏๅ็็งๅ\n399002 0 ไธๆๅพๅฅฝๅฎ้ธไฝ ๆฏๆณ็้ฟ็ไนๅธ
็้ไพฟๆๅญๆฌพๅผ็ปๆ่ฏดๆๅป็ปไฝ ๆฅ็พๅบฆๆๆฏ\n399003 0 ๆ็็พๅฎน้ข็ไบๆๅจ่ฏข็็
ง็ๅฐฑ่ฏดไธ\n399004 0 ๆฑไธ่ตทๅป็exoไบๅทกๅไบฌๅบ็ๅฐไผไผด\n1000 Processed\n classify content\n399500 0 ๆฟๅๆงๆๆบ็ฟปๅฐไบไปฅๅๅญ็ไธๅคงๅ ๅ ่ฒ็ๅพ\n399501 0 8ๆ8ๆฅๅๆๆ่ฆๅฎๆ็ตๆขฏๅฎๅ
จๅคงๆฃๆฅ\n399502 0 ไธคๅฐๆถๅ
ๆๅๆฅๅคไธ่พ่ฟๆณ่ถ
้่ฝฆ่พ\n399503 0 ๆฒณๅไปๅนดไผๅ
ๆดๆฒป้้จๅบ้็นๆๅบ็ฏๅข\n399504 0 ๆญ้
ไธญ่ฏ่ฏๆใไธ็ญ้
ฑๆฒนๅ็นๆฎ้
ๆ\n1000 Processed\n classify content\n400000 1 ๅฐๆฌ็ๅฎถ้ฟๆๅ๏ผๆฐๅนดๅฅฝ๏ผxxxxๅนดๆฅๅญฃ็พๆฏ่ฏพ็จxๆxๆฅๅณๅฐๅผ่ฏพ๏ผๆฌข่ฟๅธฆๅญฉๅญๅๆฅๅจ่ฏขใ่ฏ่ฏพๆๆฅ...\n400001 0 ๆฏๆฌก็ๅฐ้ฃไบๅปไธญๅฝๅฅฝๅฃฐ้ณ็ๅญฆๅๅผๅง่ฏด่ชๅทฑ็ๆฒๆจๆ
ไบ็ๆถๅๆ็ๅฐดๅฐฌๆๆง็ๅฐฑๅฟไธไฝ่ฆ็ฏๅฆ\n400002 0 ๆทฑๅณๅธๅ็ฎกๅฑๅทฒ้จ็ฝฒไปๅนด็ๆดปๅๅพๅ็ฑปๅ้ๆชๆฝ\n400003 0 4ใๅฐๆฆ็็โ้ปๅคฉ้น
โไบไปถๅจ่ตๆฌๅธๅบไธๅนถไธๅฐ่ง\n400004 0 ๆ่ฎค่ฏ็ๅฎฟ่ฟไบบไธๆฏ่ฟๆ ท็ๅ\n1000 Processed\n classify content\n400500 1 ๅงๆจๅฅฝ๏ผไธๅ
ซๅนธ็ฆๅฅณไบบ่๏ผxๆxๆฅไธxๆxๆฅไนๆฑ่็xๆฅผ็ปดๆ ผๅจไธVไธGRASS็นๆจๅบๆฅ่ฃ
xxx...\n400501 0 ๅ
จๅธๅ
ฑๆ้คๅ็ฑป่ฟๆณๅปบ็ญ3377ๆ \n400502 0 ๅๆถๅฐไบบๅฎถไธไฝๅฆนๅญ้ฃๆบ็่ฃ
ไนฆ่็ฃจ็ฝ็
ง็โฆโฆ่ฎฉๅฆนๅญๆพๅฎขๆๅปไบ\n400503 0 ๅ
จ่บซๅฟๆ็ปๅไบฌๆบๅบ็wifi\n400504 0 ๅนถๅจ2000ๅนดๅไธไบ็พๅบฆๆฉๆๅไธ\n1000 Processed\n classify content\n401000 0 ๅ็ฐ่ฏฅ่ฝฆๆฃ้ชๅๆ ผ่ณxxxxๅนดxxๆๆๆใไธๆxxๆก้็ฐๅบๆงๆณ่ฟๆณๆชๅค็่ฎฐๅฝ\n401001 0 โONENIGHTๅ
ณ็ฑ่ช้ญ็ๅฟ็ซฅๆ
ๅๆๅฎดโๅจๆญไธพ่กxๆxxๆฅๆ\n401002 0 ๆฃๆฅ็ป่ฎค็ๆฅ้
ไบๅฟๅฑ2015ๅนดไธๅๅนดๅ็็้จๅๆถๆๆกไปถๅทๅฎ\n401003 0 ๅไบฌๆๆ็ๅคงๅญฆ้ฝๅๆไบโๆฒณๆตทๅคงๅญฆโ\n401004 0 ็พๅฝไบบไธๆญง่ง่ไบบๅจไธญๅฝ่ฆๆฏไธไธชๅคง็ทๅปไธๅญฆ\n1000 Processed\n classify content\n401500 0 ๅจๆต้พ่ไฝๅปบ็ญ็ฉ็ๆฏไธชๆฉๆจ\n401501 0 ๅ
จ็50ๆทๅฎถๅบญ่ท้ๆฑ่โๆ็พๅฎถๅบญโ\n401502 0 Sorryใๆไธๆฏ่ญฆๅฏ็ๅพฎๅ็ๅ
ฌๅธ\n401503 0 ๆๆฏไธๆฏๆๅฟ็็พ็
ไธ่ขซๅฌๅฐฑ็นๅซๅๆถๅฅน\n401504 0 ๆฅๅฆ้จไธญๅฑฑๅป้ขๆขๅๅฅนๆ้็่กๅ\n1000 Processed\n classify content\n402000 0 ็พๅๅบๅจxxxๅคๅๅฃซๅ
ตใxx่พๆ่ฝฆๅๆฐๆถ็ดๅ้ฃๆบ็ช็ถๅฐไผๅ้จๅๅธๆฉ่ๅฐ็ไธๅบงๅซๅข
ๅขๅขๅ
ๅด\n402001 0 ๆ็จ็พๅบฆ่ง้ขๆๆบ็็ไบโ่้ฆ็ค่ๅญโ\n402002 1 ๆ่ฐขๆจ่ด็ตๆญฅๆญฅ้ซๅทๆน่้ฆ๏ผๆฌๅบไธบๆท็น็นๆๅฎๅไฝๅๅฎถ๏ผ็จๆท็น็น็นๅคๅๆๆฏๅ็ฎใๅฐๅบๅ ้ฃๅไบซไผ/...\n402003 0 ๅ
ฌๅธๅฎๆจ๏ผๅง็ปๅๆโไธ็็บง็ๅป็ๆๆฏ็ๅบๅฑไบๅ
จไธ็็ๆฃ่
โ่ฟไธ็ๅฟต\n402004 0 ไปปๆจๅฐฝๆ
ไบซๅ2015ๆฌพ็บณๆบๆทไผ6SUVๆญ่ฝฝ็ๅ
จ่ฝๅคง่\n1000 Processed\n classify content\n402500 1 ไธบ็ฅ่ดบๅพๅทฅๆๆๆบ่ดตๅท้้็ช็ ดxxxxๅฐ๏ผๅฎ้ถๆบๆๅๅฎถไบxๆxxๆฅๅจ่ดต้ณไธพๅๅคงๅ่ฎข่ดงไผ๏ผxx้...\n402501 0 ไบบไปฌๅธธ่ฏดwexin๏ผgjn19940907\n402502 0 ๅ ๅฟซๅปบ็ญไธๆฐๆๆฏๅจๆฝๅทฅๅทฅ็จไธญ็ๆจๅนฟๅบ็จ\n402503 0 onedrive็ญ็ญๆๅก่ฟๆฅๆๅกๅจไนๆ
ข่ฎฉไบบๅฎๅจๅไธไบ\n402504 0 ่ฝไฟๆคๅฉดๅฟๅ
ๅ็ป่ๅ็
ๆฏ็ไพตๅฎณ\n1000 Processed\n classify content\n403000 0 ๅฐ้ณๅฟ็พๅฎณๆงๅคฉๆฐๅณ็ญๆๅกๅนณๅฐxxๅนดxๆxxๆฅxxๆถxxๅๅๅธ\n403001 0 ๆไปฌๅฐฑ่ฝ็ฅ้ๅฎ้ฉฌๅฏนi็ณปๅ็้่งไบ\n403002 0 โๅคง็พไธ้ฅถ\"ๆฏๅพท้้ถ็ท่บๆฏๅฑๅจไธ้ฅถๅธๅฑๅบ\n403003 0 ๆๅๆฌข็ๆฅๆฌไบบๅ็ๅไธไนฆ็ฑ\n403004 1 ๅนฟไธๆฑ็ฅฅ็ๅฎไธๆ่ตๆ้ๅ
ฌๅธๆญ็ฅๅไฝ็พๅนดๅ็ฅฅ๏ผไธไบๅฆๆใๆฌๅ
ฌๅธไธไธ่ฝฆ่ดทๆฟ่ดท๏ผๆผ่ฏไธๆผ่ฝฆ๏ผไบๆ่ฝฆ...\n1000 Processed\n classify content\n403500 0 cn่ฐทๆญAndroidWearๅนณๅฐๆฐไธ่ฝฎๆดๆฐๅฐไธบๆบ่ฝๆ่กจไธๆไผ็ง็็จๆท็้ขๅธฆๆฅ็นๅปๆๅฟๅ่ฝ\n403501 0 ๆๆๅๆฌข็ไบๆ
ๅฐฑๆฏๅฑๆญๆๆฏๅคฉ้ฝ่ฆๅ็ไบๆ
ๅฐฑๆฏๅฑๆญๆๆๆณๅ็ไบๆ
ๅฐฑๆฏๅฑๆญๅฑไฝ ๅฆนๅ\n403502 0 ๆ่ฟๅฅฝๅคๅไผๅจๅทๅคง็็พๅบฆ่ดดๅง้ๅไบๆๆฐ่ดด\n403503 0 29้ฟ่ฏบๅพทๆฝ็ฆ่พๆ ผๆไธ้ๆSMAP็ป่ฑUPไธป๏ผshiyo29\n403504 0 ๅนฒ่ๅฐฑๆ็ต่pad็็ต็จๅ
ๅฅฝไบ\n1000 Processed\n classify content\n404000 0 ๅธๆไฝ ่ไธ็ ็ฉถ็ๆพๅฐๅฅฝๅทฅไฝ\n404001 0 ไธ้ฆๆฑฝ็ง่ตๅ ไนไธๅญๅจ็ซไบๅ
ณ็ณป\n404002 0 ๆ่ฐขๅป็ๆ่ฐขๅธฎๅฉๆ็ๆฏไธไฝ\n404003 0 Vertuๆๅบง็ณป็ปๅฎๅๆบ่ฝๆฏๆๆๆApp่ฝฏไปถ\n404004 0 ๆชๅฐไธๆฏไธ็ง้ขๆๅ่งฃ่ฑ่ฟไธๅปไฝ ๆฅไธด้ฝๆฏไบบ้ดๆ็พ\n1000 Processed\n classify content\n404500 0 ไบ่็ฝNๅคๆ็งไนฆ้ฝๅฏไปฅ็ดๆฅๆไบ\n404501 0 ไผผไน่ดจ็ๅซไบบๅฏน่ชๅทฑ็ๅฟ ่ฏๅทฒ็ปๆไธบไบๆฌๆง\n404502 0 ๆไปฌ็ๅชไฝๅฆๆๆฅๅฏผ็็ธๅฐฑไผไธข้ฅญ็ข\n404503 0 ่ฟๆฌก่ฎจ่ฎบๆฏๅจxxๆxxๆฅๅผๅง็\n404504 0 9ๆ็่ง้ฃๆขฆไธญไบบ่ตฐๅบๅฝฉ่ฒ็ๆฟ้ด\n1000 Processed\n classify content\n405000 0 ็ฟไบๆดพๅบๆ็ๆฐ่ญฆๅ้ๅไปฌๅทก้ปๅฐๆญค่ฟ
้ๆธ
้้่ทฏ\n405001 0 962269ๆฟๅฐไบง็ญ็บฟ่ฏดไบ\n405002 0 ็ป็ปxxๅๆฐดๅฉ่ๅทฅๅญๅฅณๅ
ๅๅฐๅๆนพๆฐดๅใ่้ฉฌๆฒณๆฑกๆฐดๅค็ๅใๅๆฐดๅ่ฐ่งฃๅฐ็ซใๆฝๅฎๆนๆนฟๅฐๅ
ฌๅญใๆฐๅ...\n405003 0 ๅคงๅฐๆฑฝ่ฝฆใ้ซ้ใ้ฃๆบ็ญ่กไธ\n405004 0 ๆธๅๆบไธ็น้ฝไธๆณๅผ็ต่ๆ่ฟๆฏๆไนไบ\n1000 Processed\n classify content\n405500 0 ??eventไปฒๆ่ช็ถ็ๆณ้ๅฅณ้ซ็ๅนซๆ่จบๆฒป\n405501 0 ๅ
ถ่ฎพ่ฎก็ปผๅไบ่ฅฟๅผๆ่ฃ
ไธไธญๅผๆ่ฃ
็็น็น\n405502 0 ๅธ็ธๅ
ณๆ
ๆธธๅ
ฌๅธๅๅ็ๅชไฝๅๅฎพๅ
ฑ200ๅคไบบๅๅ ไบๅผๅนๅผ\n405503 1 ไบฒใๅ
ดไนโ้ช่ปโไธๅๅบๅจโไธๅ
ซโ่ไธพๅไธๆฌกๅฅณๆงๅ
ณ็ฑๆดปๅจ๏ผxๆxๆฅไธxๆxๆฅไธๅคฉๅ
จๅบx.xๆ๏ผ...\n405504 0 ๆฏ็ซไธญๅ
ดๅไธบไธญๅ
ดๅไธบไน\n1000 Processed\n classify content\n406000 0 ๆๆ็บข้
ๅ็ถ่ฟๅฃ่ดง็ไปทๅฎ\n406001 0 ๆๅ5ไธชๅ่กฐๆๅๅฏๅๅฅถโฆโฆ่ฎคไธบๅช่ฆๆ่ฏๅฐฑไธ่ฝๅบไนณ\n406002 0 ๅญฆsapๅญฆ็ๆ่ๅญ้ฝๅบ่กไบ\n406003 0 ไธจ้นค้็บๅจไธจ0710่ฎจ่ฎบโๅ็บธ้นคๅฏไธๆฏ็บธ็ณ็\n406004 0 ๆๅๅคฉ็ฉบๆฏ้ฃๆบๆๅปๅ็่ช็บฟ\n1000 Processed\n classify content\n406500 0 โไธ้้ค้ฅฎ7ๆ่กจๅฝฐๆจ8ๆๅฏๅจๅคงไผโๅจๆๆฏๅๅกๅคงๅฆ29ๅฑไผ่ฎฎๅ
้้ไธพ่ก\n406501 0 ่งๅ่กจ็คบNBA่ช็ฑ็ๅๅธๅบไนๅ็็ผๅฒๆไปฅๅโ็ ้ฒจโ็ญ่งๅ้ข่ฎกไธไผๅ็ๅๅ\n406502 1 ๆจๅฅฝ๏ผๆๅ
ฌๅธๅฏไปฃๅผไผๆ ๅ็ฅจ๏ผ้ชๅไปๆฌพ๏ผ๏ผๅฏไปฃๅผ๏ผๆตๆฃๅขๅผ็จๅ็ฅจ๏ผๅฐ็จๅ็ฅจ๏ผๅนฟๅ่ดน๏ผๅปบ็ญๅทฅ็จ๏ผ...\n406503 0 ๅณฐๅณฐๅ ๆฒนๅณฐๅณฐๅ ๆฒน๏ผๆญคๅพ็็ฑๆฌไบบๅถไฝ\n406504 0 ไฝ ็ๆ่ไธ็ ็ฉถ็ไปฟไฝๅพ่ฝปๆพ\n1000 Processed\n classify content\n407000 0 ๅจ่ตฐ่ฎฟ้จๅๅๅฉๆฑฝ่ฝฆ4Sๅบๅ่ทๆ๏ผๆฐๆฌพGC7ๆๆๅจๆๅนดๆจๅบ\n407001 0 ไปๅคฉไธ็ญๅๅฎถๅ่ฝฆๅป่ดขๅฏ็ปๆปดๆปดไนฐ็ฏฎ็้\n407002 0 ๆไธญๆฅๆฅ๏ผๅทฆๆๅฟๅผบๅ็ตๆขฏๅฎๅ
จ็็ฎก\n407003 0 ๆๅป้ข็่งๅฎ้ข็บฆๆๅท็็
ๆฏๆฅไธๅไบ\n407004 0 2015ๅนด8ๆ2ๆฅๆไธไฝๅ
ญๅๅฒ่ๅคชๅคชไปไธๆตทๅบๅๅปๅบ่พ
\n1000 Processed\n classify content\n407500 0 ๅฏๆๅๆ่ฟๆฏ้ฃไธชๆ้บป็ฆ็ๅญฉๅญ\n407501 0 ๅฎๆถๆญๆฅ๏ผไธๆน่ฏๅธ่งฃ่ฏป็ฌฌไธๆนๆฏไป่งๅฎ\n407502 0 ๆข็บขๅ
ๆธธๆๅฐ่ฏดๆฝ็sex\n407503 0 ๅ็กใ้ฟ้่ฝฆใ็ซ่ฝฆใ้ฃๆบใๆ
ๆธธใๅบๅทฎ็ๅฟ
ๅคๅๅ\n407504 0 4ใๆฅผๅธๆ็นๆพ็ฐ่ฟ\"ไธ้\"ๅๆไธๅๅนดๅธๅบๆๅไธ\n1000 Processed\n classify content\n408000 0 ไปฃๅ5ๅน
่ตทๅ
้ฎๅฐ่ข่ฟๅ
ฅ็ฏ็่ฎขๅๆจกๅผไปฃๅ\n408001 0 ๆตๆฑไธ้ๅญฆ้ข็็ฉบ่ฐไฝฟ็จ่ดนไธๅนดxxxๅ
\n408002 0 ๅพๅท1000mlๆด้
็ถๆฑ่ๅพๅทๅฎๅ็ป็็งๆๆ้ๅ
ฌๅธๆด้
็ถใๅ้
ๅจ\n408003 0 ไฟๅ
ป็ฎ่ค็ป่็ป็ปๆๆๅ
ถ็ฅๅฅ็ๅๆ๏ผ่กฅๅ
็ฎ่ค็ป่็่ฅๅ
ป\n408004 0 2000ๅนด9ๆ็ปๅธๆฟๅบไผ่ฎฎ็ ็ฉถๅๆ\n1000 Processed\n classify content\n408500 0 SuitSuit่ฎพ่ฎก็ๆ
่ก็ฎฑๅฐ่ฎฉไฝ ่ฑ้ขไบไบบ็พคไนไธญ\n408501 0 ๆ็ไธไฝไธบๆฐๆฐๆๅ
จไบๅๆฟไธฅ่็ไฝๅถ\n408502 0 ๆตท้่ฏๅธH่กๅค็ไธๅบฆ่ท17%้ญๅบ้ๆๅฎ\n408503 0 ๆปฅ็จไฝ้ฆไน็พๅๆๅฏๅฏผ่ด่ๆๅฎณๅๆฅๆงๅ็ญ\n408504 0 ๆไบไบบ้ ่ฟ็ง็ซ่
ป็ๅ
ณ็ณปๅพไบๅฟ\n1000 Processed\n classify content\n409000 0 โ่ญฆๅฏ๏ผโ่ฟๆฏๅฏนๅฒๅจๆๅฅฝ็ๆฉ็ฝ\n409001 0 ๅก่บซ็ฑ5043ๅ็ญๅๅฐ็ป็้บๆ\n409002 0 ๆปก่ๅญ้ฝๆฏโ้ฟ้้้ฟ้้โ้ญๆงไบ\n409003 0 ๆๆบๆ็ตๆWiFiๆ็ฉบ่ฐๆ้ถ้ฃๆถ้ดๅทฒ่ฟๅปไธๅไนไธ\n409004 0 ้ๅๅฐๅบๆง่กๆถ้ดไธบ6ๆ15ๆฅ่ณ8ๆ15ๆฅ\n1000 Processed\n classify content\n409500 0 ่ฟฝๅ ไบๅ ไธชๆๅฐฑ~ๅฆๅค่ฏ่ฏไนๅ่งฃ้ไธไบ็้ฃไธชๆฏๆฅๆๆๆๅฐฑไฟฎๅคไบๆฒก\n409501 0 ๅถๅฐๆฅๅ ๆถ้ฃๆบ็้ฃ่กๅช้ณไธ็ฅ้ๆฏ็ฉบๅ่ฟๆฏไนไนๆบๅบ็้ฃๆบ\n409502 0 ้ฆ็ๅ็ๅ็็ต่็ญไธๅบไฟฑๅ
จ\n409503 0 3ใๆๅธธๅผๅค่ฝฆโโไน
ๅผๅค่ฝฆไฝฟ็้ฟๆฟ็ด ๅ่พไธ่
บ็ฎ่ดจๆฟ็ด ๅๆณ็ดไนฑ\n409504 0 ่ง่ฏไบ้ๆฑ79ๅฒ็ๆจๆญฃ้พ่ไบบไธ่ไผด55ๅนด็็ๆ
็ธๅฎ\n1000 Processed\n classify content\n410000 0 ่งๅพๆฏไธๆถ้ฃ่ฟ็้ฃๆบๆธ
ๆฐๅฐ็่ณ่ฝ็ๅฐๆบๅฐพ็่ช็ฉบๅ
ฌๅธ็ๅๅญ\n410001 0 avastๆจๆๆๆ็ต่้ๅฏๅๅฑ
็ถ่ชๅทฑๅคๆดปไบ\n410002 0 ๆไปฅ่
พ่ฎฏไผๅนๅพๅค็้ฃ้ฃๅธฎๅฉๆๅท็ฝไธๅป\n410003 0 ๅ ไธบ่่็ป่็็็ฉๅๆๅพๆดป่ท\n410004 0 ๆไบบ่ฏดๆ
ๆธธๆฏไป่ชๅทฑๅ่
ปไบ็ๅฐๆนๅฐๅซไบบๅ่
ปไบ็ๅฐๆน\n1000 Processed\n classify content\n410500 0 ็พๅฝไธบ้ฟๅ
ๆฝ่ๆๆฏ่ฝๅ
ฅ่่\n410501 0 ๆบๅจไบบๆฏ่ต่็ฟปไบๅธๆฐๆๅพ
ๆๅกๅๆๆดๆบๅจไบบๅจ็ๆดปไธญๅบ็จ\n410502 0 ๅจ็ซ็ฎญ้่กๆดๆนไบบ็ไธๅบๆฏ่ตไธญ\n410503 1 ๅๆตฉๆฑฝ่ฝฆๆฐๆฅ็นๆ ๏ผๅฐๅบ็ซๅxxxๅ
็ฐ้๏ผไปทๆ ผไผๆ ๅ่ดจไฟ่ฏ๏ผ้ข็บฆๅฝญ็่นๆๅ ๅพฎไฟกxxxxxxxx...\n410504 1 xxxxxxxxxxxxxxxxxxx้ฎๆฟๅดๅฐๅจฃใๆไบๅไธชไฟกๆฏ่ฏดๅ\n1000 Processed\n classify content\n411000 0 ็ฉบ่ฐ่ฅฟ็็ต่็ๆฏๆณไธๅบ่ฟๆๆฏ่ฟๆด็ฝ็ๅจๆซ้
็ฝฎ\n411001 1 ๅก่ดญไนฐๆฌง่ฑ้
xxxmlๆดๆคไบงๅ้ๆฌง่ฑ้
ๆ
่ก่ฃ
xxmlๆxmlๆดๆคไบงๅ้้ๅ
ถไธ๏ผ้ๅฎไธบๆญข ๅก...\n411002 0 ไป่ฟๆฅ่ขซๆๅฐๅAngealababyๅจ็ๅ ด่ๅคฉ\n411003 0 ็ฐๅจๆญฃๅจไธๆตทๆฐๅฝ้
ๅฑ่งไธญๅฟไธพๅ\n411004 0 3ๅท็บฟใ6ๅท็บฟใ7ๅท็บฟใ8ๅท็บฟใๆบๅบ็บฟ็ญ9ๆกๅฐ้็บฟ่ทฏๅๆถๅจๅปบ\n1000 Processed\n classify content\n411500 0 ๅ่ฑๆๅฎข๏ผๆ
ไบบ่ๅพๅทๅธๆทฎๆตท้ฃๅๅๅ็ๅคง้ข็งฏ็ซ็พ\n411501 1 ๅๅซไธญไนๅๆจกๆบ๏ผๅคฎ่งxxxxๅ ๅคๅฐๅซ่งๅ
ฑxxๅฅ่็ฎ๏ผๆนๅxxxๅ
ใxxๅฐ้xๅฐ๏ผไธๆๅกไธๅฎไฝ...\n411502 0 ่ขซ่ฟไฝ่ญฆๅฏ่้ป็ฝไบxxxๅ้ฑ\n411503 0 ๆณ็ไธไธชไบบไธๆไธๅ3่ถ้ฃๆบ\n411504 0 ไปๅนด็็จป็ฐ็ปไปxๆๅผๅง่ฟ่ก็งๆค\n1000 Processed\n classify content\n412000 0 ๆไธฐๅฏไบ็็ผฉ?ๆฝ่ฑกๅ ไฝๆ็ป\n412001 1 ๅกๅฝๅคฉๅฐๅบ่ฃ
ไฟฎ็ไธไธป๏ผๅๅฏ้ขๅ้ฃ็จๆฒนไธๆกถๅฆๅฏไบซๅๅๅ้ข็x%็่ฃ
ไฟฎ่กฅ่ดด๏ผๅนถ่ต ้xxxxๅ
่ๅฎ...\n412002 0 ๅฐๅฅณไปๅคฉ่้่ฟ้จไฝๅ็้ณไน็ฑไน
็ณ่ฎฉ่ด่ดฃ\n412003 0 ่ฟไบไบบ้ฝๆฏๆตๆฑๅฎๆณขไธ็พค้ชๅญๅขไผ\n412004 0 ๆๅผ็ต่ไธ็็ซๆฏ่ฟ่ฌๅๆจ็ๆฏ่ฑก\n1000 Processed\n classify content\n412500 0 ๆฌๅท่ญฆๆนๆญฃๅผๅฏนๅคๅๅธๆถๆฏ็งฐ\n412501 0 ๆๅจๅณๆถPK่ตไธญ็ปไบๆ่ไบ้ๅท็้\n412502 0 ๅ็ฐ็ต่้ๅฑ
็ถๆ่ฟๅผ โฆๆ่งไธไธๅญๅๅฐไบๅไธญโฆ็ช็ถๆณๅฏไธไธไธชๆ่ฟ็้บ่ไบโฆ็ฐๅจๆณๆณๆ่ตทๆฅ็็็ฑ็ๅป\n412503 0 8ๆ7ๆฅ่กๆๆ่ดงๆไฝ็ญ็ฅ\n412504 0 ๆๅบxๅฎถๅ
ฌๅธ็xไธชไบงๅๆฆไธๆๅ\n1000 Processed\n classify content\n413000 0 ๅค็ฑไบฌๅงๆฏๅบๅฐผๅฆ้จๆผณๅทๆณๅท็ณ็ฎๆๆฑๆตๆฑๆธฉๅทๆญๅท้ฟๆฒๅไบฌๅไบฌๆญฆๆฑๆ้ฝๅค็ฑๆจก็น่่นไน้18965...\n413001 0 ๅไธบ็ปๅธธๆฑๆจ่ชๅทฑๅจ็พๅฝ้ญๅฐไธๅ
ฌๅนณ็ๅฆ้ญๅโโๅไธบ่ฟไปๆช่ฝไป็พๅฝไธป่ฆ็ฝ็ป่ฟ่ฅๅ่ตขๅพไธๅ็ฝ็ป่ฎพๅคๅ็บฆๅข\n413002 1 ๆจๅฅฝ๏ผๆๆฏๅฐๅ๏ผๆฒงๅทๅ็ฏ้ชไฝๅ
ฐxSๅบ็๏ผๅฑไปฌๅ
ซๅทๅๅฎถๆๆน็นไปท่ฝฆ๏ผไผๆ ๆฏ่พๅคง๏ผ่ไธ่ฟๅบๅฐฑๆ็ฒพ็พ...\n413003 0 ๅฃ่
ๅ
ๅพฎ็็ฉ่
่ดฅๆถๅๅฃ่
ๆป็็ฉ่ดจไบง็ๆฅๅๆง็กซๅ็ฉ็ญๅผๅณ็ฉ่ดจๆฏๅฏผ่ดๅฃ่ญ็ไธป่ฆๆๅ\n413004 0 ๆไธชๆๅๆฏๆธ
ๅ็ๆณๅญฆ็ ็ฉถ็ๅฆ\n1000 Processed\n classify content\n413500 0 ๆๆบ้พ่ฟ็ๆ็พๅบฆไบๆญฃ็กฎ็ๆด่ธๅพ็็ปๅคงๅฎถๅ่ไธ\n413501 0 ๅ่ฝๆๅ24ๅฐๆถๅจๆ ้กๅ็ๅผๆบๆ็ปญ\n413502 0 ้พๆฅๆพๅจ่ฏ่ฎบ้้ขๅฆ~่ฏทๅซๆๆดป้ท้~ไปฅๅๅฐฑไธ่ฆๅ้ฎๆๅฆ\n413503 0 ๅฆๆ่ก็ฅจ็่ตฐๅฟ่ทไฝ ็ๅคดๅฏธ็ธๅ\n413504 0 ่ดๅกๅทฒ็ปๆดไบไธชๆไบ\n1000 Processed\n classify content\n414000 1 ไฟก็จ็คพ๏ผxxxx๏ผxxxx๏ผxxxx๏ผxxxx๏ผxxxx๏ผxxๅๆๆ\n414001 0 ็พๅงๅคง่ๅฐxxxxๆถๅคๆๆฒๆ่บๆไผๅจๆฐๆณฐๆปจๆนๅนฟๅบๆๅผๅบๅน\n414002 0 ่ญฆๅฏๅๅๅฐฑไผๆไฝ ๅฎถ้นฟๆๆพๆฅ\n414003 0 ๅฐฑๅฅฝๅๅบๅป็ฉ่ขซๅฐๅทๅทไบๅไบบๆไบๆฅ่ญฆไธ็ฎก็ถๅ่ญฆๅฏ้ฎๆไปฌๅจไธๅจไธๆ ท\n414004 0 ๅฐไธ้ขๅ็ปct่ฏๆญไธบ่็ฝ่ไธ่
ๅบ่ก\n1000 Processed\n classify content\n414500 0 ๆไปฅๅๆฌข่กฃ่กฃ็?ๆๅซๆchangliang8899\n414501 0 ๅจ1860ๆๅๅๆๅญไธบๅฎถไนกไบบๆฐ็ฎไธๅ้ข็ด็พๅฆไน้ตโโโ็ด็ณปๆ
้โไธญๆณ้ข็ด้ณไนไผโ\n414502 0 ่ๅทๅคงๅญฆๆๅญฆ้ข้่ตฐๅดๆฑโ่ๅฐโ่ทฏ็ๅฎ่ทตๅข้ๆฅๅฐๆฑๅๅค้้ๆณฝ\n414503 0 ๆณๆพๆๅไปฌๅไธช่
พ่ฎฏ่ง้ข็ๅฅฝ่ฑๅไผๅ่ดฆๅท็จ\n414504 0 ๆฅผไธๆฅผไธๅๆถ่ฃ
ไฟฎๆฏไธ็งๆๆ ท็ไฝไผ\n1000 Processed\n classify content\n415000 0 ไปๅx~xๅนดๆ็ปๆดปไธๆฅ็ๆฏๆๅฐๆฐ็ๅๅ\n415001 0 ๆฏ็ซๆฒก่ฏๆ็ๆฏๆๆฏ็ซๆ่ฟ่ฎฐๅพๅฝๅนดJS่ฏด่ฟๅฐฑ็ฎๅฅนๅปไบๅซ็ๅๅธๆไนๅฏไปฅๅ้ฃๆบๅธธ่ฟๅป็ๅฅนๆๅฐฑๅไธๅฐ...\n415002 0 ๅจ้ฉฌๅๅซๅข
ไธพๅๅฉ็คผๅฎไผๅ่ฟๅบงๅปบ็ญๆฌ่บซไธๆ ท็ซฅ่ฏ\n415003 0 ๆณๅพไบ่งฃๆดๅค็ฅ่ฏโโโviaไบบๆฐๆฅๆฅ\n415004 0 ็ฉๅ
ทๅ็ๅ็ใ้ฒๆใๅบๅใ่ๅธญใๅฎ่ด็่ขซๅญ\n1000 Processed\n classify content\n415500 0 ไธญ็ฒฎ้ๅข็็ธๅ
ณไธๅธๅ
ฌๅธๅฆไธญ็ฒฎๅฑฏๆฒณ\n415501 0 ๆไปฌไฟฉๆฉไธไธ่ตทๅ้ฅญ็ถๅๅปไธ็ญไธ็ญๅๆฅๅไธช็พ็พ็ๆ้ค็ถๅ็ชๅจไธ่ตท็็ต่ง็ฉ็ต่ๅฆ\n415502 0 ๆญปๅๅไธ่พๅญ็ปไปๅฎถๅ็ๅ้ฉฌ\n415503 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ n4xen3ไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n415504 0 3ใๆบๅถ้ๆงไธ่ฝๆฟๅฑไบบๅฟ\n1000 Processed\n classify content\n416000 0 ๅจ่พน่ขซไผไธไธฐ้่ฎพ่ฎก็LๅTodๆฅผๆๅ
ๅด\n416001 0 ๆฌๆกไพๆณ้็จ็ฎๆ็จๅบ่ฟ่กๅฎก็\n416002 0 ๅธธๅทๅธ้ๅๅบๆฐ่ฑกๅฐ7ๆ27ๆฅ16ๆถๅๅธ็้ซๆธฉ่ญฆๆฅๅๅคฉๆฐ้ขๆฅ๏ผไปๅคฉๅค้ๅฐๆๅคฉๅคไบๅฐๆด\n416003 1 ๆจๅฅฝ๏ผๆทฑๅณๅธๆก่ฑ่ถ็ฎกไบ้ๅ่กๆฌข่ฟๆจ็ๆฅ็ต๏ผๆๅ่ก็ซญ่ฏไธบๆจๆไพ๏ผ่ๅกใๆทฑๅก่ถ็ฎกๅๅ็ฑปๅ็่ถ็ฎก๏ผ...\n416004 0 ไธ่ฟๅพๅคไบบ็่ตทๆฅๆณ้ ่ญฆๅฏๅๅๅฑ ๆ่งฃๅณไธๅ\n1000 Processed\n classify content\n416500 0 ๆๆฉๅฐฑๅฏไปฅ็่ฑๅ้ชจๅฆๆ่ฆๅป่กฅไปๆ็ๅฅ่ฉ่ฏดๅฆ้ๆๆ็ฑไฝ ไนไนๅ\n416501 0 ๅฐๆดๅฝขๅป้ข่ฆๆฑๅป็ไธบๅ
ถๅๆฌงๅผๅ็ผ็ฎ\n416502 0 ๅฎ็พ่็ซไฝๆ่ง็ๆผๆฅ่ฎพ่ฎก้ฃๆ ผ\n416503 0 xๆxxๅทไธพ่กๆดปๅจ็้ๅๆ ธMX\n416504 0 ๆไปฌๆฌข่ฟๅคงๅฐๆๅๅๅ ~ไธ่ฆๅ็น่ฑซ\n1000 Processed\n classify content\n417000 1 ๅซๅปๆพณ้ๅคช้บป็ฆ๏ผๅฐฑๆฅxxxxxx๏ผcโm ๆๅค็ๆๅ็ญ็ไฝ ใๆฐๅนด้ๅฅฝ่ฟ๏ผๆฅไบๅฐฑๅฏๅพxxใๆปกไบ...\n417001 0 ๅพฎ่ฝฏไปๅคฉๆญฃๅผๅฎฃๅธไผๅจๆชๆฅๅ ไธชๆ้ๆญฅ่ฃๅx\n417002 0 ๆบๅจไบบๅฏๅจๆค็่ฎพๆฝๅ
ๆ นๆฎๅ
ฅไฝ่
็ๆฐๆฎไฝๅบโ่ฆๆฏ่บซไธๅช้็ผ่ฏทๅ็ฅโใโๅฐ้่กๅ็ๆถ้ดไบโ็ญๆ้\n417003 1 ๅฅๅฅไฝ ๅฅฝ๏ผๆๅซๅฐ็ฒ๏ผๅๅท็๏ผxxไบ๏ผ็ฎ่ค็ฝ๏ผๅ ไธบๅฎถ้ๅฐ้พ่ท็ๆๅๅบๆฅๆๅทฅ๏ผ่ฟๆฒกๆๅค่ฟๆๅ๏ผๅฌ...\n417004 0 ๅพฎ่ฝฏ้ญๆทกๅโฆไธ่พนๅ ็ๅธฆๅฎฝไธ่พนไธไธไธ่ฅฟ\n1000 Processed\n classify content\n417500 0 ่่ค็็ป่ไธไผๅๅ
ถไปๅญฃ่้ฃ่ฌๆดป่ท\n417501 1 ๆจๅฅฝ๏ผๆๆฏๅๅฑฑๆๅจ ๅผ ๅปบ็ ๅๅฑฑๆๅจ่ฟๅบๅฃ่ดธๆๆ้ๅ
ฌๅธ ไธป่ฅ๏ผๆดฅ่ฅฟHๅ้ขไธ็บงไปฃ็\n417502 0 ไฝ่ฝฌ่ๆ้ไป
ๅฉ็3000ๅ
ๅฐฑ่ขซๆฟ่ตฐ\n417503 0 ๅ
จๅธxxxๅๆฃๅฏๅนฒ่ญฆๅๅ ๅ้ขๅถๆฃๅฏๅฎ้ไปป่่ฏ็ฌ่ฏ\n417504 0 ไฝ็ผฉ็ญ็งๆๅฐ2020ๅนด10ๆ26ๆฅ\n1000 Processed\n classify content\n418000 0 10086็ญๅคๅ ไธชๆๅ็็งฏๅๅ
ๆข้้ขไธ้\n418001 0 ็ฎๅ่ฏฅๅฝๆฟๅบๆญฃๅฏปๆฑๅฐ้่ฟๅฃๅ
ณ็จไธ่ฐไธๅ่ณ10%\n418002 0 ๆฏ่ฐๆๆฅๅจ่
พ่ฎฏๅฐฑๅฃ่ฟนๆๆ็ไบบ\n418003 1 ไฝ ๅฅฝ๏ผๆๆฏ่ดตๅทๆ็พๅ
ฌๅธ๏ผๆๅ
ฌๅธ้ฟๆๆจๅบ๏ผ็ป่ๆดป่ฝ๏ผๆน้ฃ็ฅๆ๏ผ็ฑไบบ่ก่็ฑฝ๏ผไธฝ็ปฃๅช๏ผไธไธ็ผใไธไธ...\n418004 0 ๆฒก็ๅคฉๅคฉๅ่
ไบบๆฐ็ๆดปๅฐฑๅคๅฅฝไบ\n1000 Processed\n classify content\n418500 0 ๅจWhatsAppๅจiPhoneใๅฎๅใ่ฏบๅบไบใ้ป่ๅๅพฎ่ฝฏๆๆบๅนณๅฐไธ้ฝๅฏไปฅไฝฟ็จ\n418501 0 ็ทๅญ่นฒ5ๅนดๅค็ฑๅ็จๆผซ็ปๆ็ปๅ่ฎฏ๏ผ่ขซๅคไบบ็ ๆๆ็ฃจ\n418502 0 ไปฅ็กฎไฟ็ซๆ
ๆฉๅ็ฐใๆฉๆงๅถใๆฉๆ็ญ\n418503 0 ไปปไฝๆ่ต้ฝ้ๅ
ทๅคๆบๆ
งๆง็ๅฟ่ๅ\n418504 1 ๆๅธไปฃๅๅๆทยทๅฃๆฌโๆฏ.ไธๆฌ๏ผ่บซ*ไปฝๆญฃ๏ผ้ฉพ๏ผ่ก#้ฉถๆญฃ็ญไธๅๆๆๆญฃ&ไปถ๏ผ่็ณป๏ผโโโโโโโโ...\n1000 Processed\n classify content\n419000 0 ๆฝๅๆดไธๅฅ๏ผ็ฌๆฝๆฐดๆถ่ดขๅฏๆๅฐฑ็พๅนดไผ ๅฅๅฎถๆ\n419001 0 FATEๅ้็๏ผ็APHๆป่งๅพ็ๆฏๆกๆฏๆฌก่ฆๆปๆๅซๅฎถ็ๆถๅๅฐฑ่ฆๅฌๅคไธช้ชๅฃซ็ๅๆพไธชๆๅฝฑๅฅ็ไบโฆโฆ\n419002 0 ๆพ็ดง่
ฐ็่ฎพ่ฎกๆด่ดดๅ่บซไฝๆฒ็บฟ\n419003 0 ๆไปฌๅจ7ๆ20ๅท่งฆๅฐ196็พๅ
็็ฎๆ ไปท\n419004 0 ่ฝไธ่ฝๅ
่ฏไผฐไธไธ่ชๅทฑ็่กไธบ\n1000 Processed\n classify content\n419500 0 ๅคชๅปๆฏไบไธ่ฐไบ็ฑไธๅจๅๅๅegๅ ๆฒนๅง\n419501 0 2ใ็ฝ่ฒๅ้ๆ็ผๅฑ๏ผๅฏ่ฝๆฃๆฅๆง็
ๆฏๆง็ป่็\n419502 1 xxxx้ญ
ๅไธๅ
ซๆจ็บฆๅ๏ผๆฐ็ปๅ
ธ็บฆๆจxๆxๆฅ่ณxๆฅๆๅไธๆญ.ๅฎๆ ๅคๅค๏ผๅ
จๅนดไป
ๆไธๆฌก๏ผๆไปฌ็ญไฝ ๏ผ...\n419503 0 ๆไธๅบ้จไธ็จ้ฒๆๅhhh๏ฝ\n419504 0 ้ๅฐ้ๆฒ็57ๅฑค็ๆดๅๆตทๅฒธๅๅๆฏ\n1000 Processed\n classify content\n420000 0 ็ไปฅไธบ่ชๅทฑๅคงๅๅธไบๆฟๅบๆ ่ฝๅฐฑๆณ็ไธๅๅๅฐ่งฃๅณ้ฎ้ขๆๆฒกๆๆณ่ฟไบค้้ฎ้ขไธๅฅฝๆฏๅ ไธบไฝ ไปฌ็่งๅไธๅฅฝ่...\n420001 0 ๆฌ ๅบ่ฆ้ฑๅฑ
็ถ้ผๅพๆไธ็พๅบฆไบ\n420002 1 ๅฅถ็ฒ๏ผไธๅพไธๆไผๆ ๏ผๅกไธๆฌกๆง่ดญ็ฉๆปก็พๅ
็้กพๅฎข๏ผๅฐไผ่ต ้็ฒพ็พ็คผๅไธไปฝ๏ผๅไธไธ่ฆ้่ฟๅ๏ผ๎ๅฐๅ๏ผ...\n420003 0 ่ฟๆฏไธคๅนดๆฅๆตฆๅฃไบๆๆฟๆๆไบค้้ฆๆฌกๆไธบๅ
จๅธ็ฌฌไธ\n420004 1 ๆจๅฅฝ๏ผๆๆฏๅๅ่ทๆจ่็ณป็ๆนๅไธญ่ดตๅ
ด็ๅฐๆฑ๏ผๅ
ฌๅธๅฐๅๅจๅ
่ฐทๆญฅ่ก่กไธ็ๅๅนฟๅบxxๆฅผ๏ผๅฆๆๆจ้่ฆ่ต...\n1000 Processed\n classify content\n420500 0 ๅๅญๅซโTheBeatlesPubโไฝๆฏไธๆ้ฝๆฒกๅฌๅฐไธ้ฆBeatles\n420501 0 ไฝๆฏๅนถ้ๆๆ็Lumiaๆๆบ้ฝๅฏไปฅๅจ็ฌฌไธๆถ้ดๅ
ๅ็บง\n420502 0 ็ฑไบๅจFirefoxๆต่งๅจไธๅ็ฐไธๅคไธฅ้็บงๅซๅฎๅ
จๆผๆด\n420503 0 ๆฅ่ทๆๆบ70ไฝ้จใ้ถ่กๅก70ไฝๅผ ใ่ฑชๅๆฑฝ่ฝฆ5่พ\n420504 0 ๅคๆ็ซๅฝฉ้ญๆณ้็ฆป้50mlไฟๆนฟๅฆๅไนณๆไบฎ็พ็ฝHERA่ตซๆ้ญๆณๅฆๅไนณ\n1000 Processed\n classify content\n421000 0 ๅคงๅ ็ดขๅฐๅจๅ้ๅๅ NBA้ๆดฒ่กจๆผ่ต\n421001 1 ็พไธฝๅฅณไบบ่ๅฐๅฆ๏ผxๆxๆฅๅฝๅคฉๅฐๆฏๆๅนณ็ๆดป้ฆๆถ่ดน็็พๅฅณ้ฝๅฏๅพ็ฒพ็พ็คผๅไธไปฝๅฆ๏ผ๏ผๆฏๆๅนณ็ๆดป้ฆๅ
จไฝ...\n421002 0 ไธๅจๆ่ต่
ไปๆฐๅ
ดๅธๅบๅบ้ๅ่ตๅxx\n421003 0 WindowsPhone่ฎพๅคxๅนดๆป้้่พพๅฐไธไบฟๅฐ\n421004 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ?Bad้ขๅ็้ณ้ข\n1000 Processed\n classify content\n421500 0 ่ฏฅๅธโๅพฎๅฏยท้ๅฎขไนๅฎถโๅบ็ง่ฝฆ็ฑๅฟ่ฝฆ้็ๅธๆบไปฌๅพ็ฅๆญคไบๅ\n421501 0 xใ็็บง้จ้จๅฝขๅ่่ฎพไธไฝไธบ\n421502 0 ่ฏท่ฎคๅ๏ผsozu1991้่ฏๅฟๆฐไธพๆฅๆญปๅ
จๅฎถ\n421503 0 ไธบๅฎๆธ
ๅป็
งๆฑ้ๆ
ๅฑ
็บต็ซ่ขซๅผบๆ\n421504 0 ๆๅฐฑ่ตทๅบๆ็ต่ๅๆๆบ้็่ฝฏไปถ้ฝๆดๆฐไธ้\n1000 Processed\n classify content\n422000 1 ๅๅบ่ฟๅฃๅๅฆๅบ๏ผ่ดญไนฐๅๅฆๅ๏ผๆ้ซๅฏ่ฟ็ฐxxxxๅ
็่ฟๅฃๅๅฆๅ็ฐ้ๅธ๏ผ่ฟๆไบ็บงๅฅฝ็คผๅ ่ต ๏ผๆๅพ
ๆจ...\n422001 0 ๆฎ้ขๆต๏ผ2015ๅนดๅ
จ็ๆๆบๆ็พ่
ๅฐ่พพๅฐ2\n422002 0 3๏ผๆค้
ธ้
ถๅๆจ่็ณ้
ถ็ไฝ็จๆบ็\n422003 0 ไปๅนดๆ้คไบไฝ ๅฎถkindleไนฐไบๅ ๆฌไนฆๅคๅฐฑๆฒกไนฐ่ฟไปปไฝไธ่ฅฟ\n422004 0 ใ็ฅๅทไธ่ฝฆ่ขซ็บฆ่ฐ็ง่ฝฆ้
ๅธๆบๅ
่ดนๆฅ้ๆบ่ขซๆ่ฟๆณใ\n1000 Processed\n classify content\n422500 0 ็ซๆ
ๅนณ็จณ้ข็คบๅ็
้ซๅณฐๆๅทฒ่ฟ\n422501 0 xๆxๆฅไฝ ไปฌ่ฟๆฅไบไฝ ไปฌ็ๅไธไธชๆๅ\n422502 0 ไธ่ฟ่ฟๅญฃ็่ตทๆฅๅๆฏๅพhighๅฟๅฟๅๅฟ\n422503 0 ๅๆโ้ฃ้ฃ้จ้จๆๆๅฏๅฏๅคๅคๅฏปๅฏป่ง
่ง
\n422504 0 ๆนๅ่ๅทๅฎ่ฏ็พ่ดงๆๆถ็ตๆขฏๅ็ไบๆ
\n1000 Processed\n classify content\n423000 0 ๅพฎ่ฝฏ่ฆ่ตท่ฏไธญๅฝๅฆๆๆๅ่ฆ่ตๅ ็พไบฟ\n423001 0 chrome็จๅคไบๆๅญ่ช้ฝ่งๅพๅฐไบๅข\n423002 0 ไธญๅฝ้ถ่กๅฐๅจ8ๆไธๆฌ็ป็ป่ฟ่ก็ปไธ็ฌ่ฏ\n423003 0 ๅๅๅป้ข็ฒพๅฟๅ็ๅบ็ไธค็งๆฐๅๆด้ขๅฅถ\n423004 0 Nxxxxx่ฟๆฌพ้ณๅ็Sheltonไธญๅทๆ่ขไปฅไผ้
่่็จ็DamierEbรจneๅธๅธๅถๆ\n1000 Processed\n classify content\n423500 0 ๆฉๆ็่ตทๆฅๆๆบไผๅO2O็ๅฃ็ข็ฝใ็ฑๅธฎ็ฝใๅคงไผ็น่ฏ็ฝ็ฐๅจๅชๆๅคงไผๆดป็่ฟไธ้\n423501 0 ๅไธบP8้ซ้
็ๆๆบ่ฟๆไพไธ้ข500ไธๅ็ด ๅ็ฝฎๆๅๅคดๅไธ้ข1300ไธๅ็ด ๅ
ๅญฆ้ฒๆๅ็ฝฎๆๅๅคด\n423502 0 ไธๆตทๅไบฌ่ทฏๅคๆปฉ็ฝๅคฉ่ทๅ
ถไปๅคงๅๅธไธไธชๆ ท\n423503 0 ่ฟไบ้ๆพๅทฅ็จๅคงๅคๅ ไธบ่ฃ
ไฟฎ่ฟ็จ็่ฏฏๅบๆๅฏผ่ด็\n423504 0 ็ฐๅจๅคงๅฎถ้ฝ็ฅ้็5ๆฅ็บฟใ10ๆฅ็บฟ\n1000 Processed\n classify content\n424000 0 ๆฏๆฌก่ธ่ฟๅป้ขๅคง้จๅฟๆ
้ฝๆฏๆฒ้็\n424001 0 ๆฅ่ชFireEye็็ ็ฉถๅไป็ปไบไธ็งโๆ็บนไผ ๆๅจ็่งๆปๅปโ็ๆนๆณ\n424002 0 ๅ
ถๅฎๆไธชcba็ไธปๅบๆพ่ๅทๆดๅฅฝ็\n424003 0 ๅพฎ่ฝฏๅ
ฌๅธ2015ไบๅญฃๅบฆไธ็ปฉๆฅๅ\n424004 0 ๅ
จ้้้พ้ๅนดๆฅๅฐๅธธๅนณๅป้ขๅๅ ไบไฝๆฃ\n1000 Processed\n classify content\n424500 0 ็ปๅคงๅฎถๅไบซโOfficeMacxxxxspx\n424501 0 ๆฐๅนฒๅฟไบบๆฐๆณ้ขๅฎก็ไบไธ่ตทๆฐ้ด\n424502 0 ๅพฎ่ฝฏWin10็ณป็ปU็ๅค่งๆๅ
\n424503 1 (x/x)ๆ่ฐขๆจ่ด็ตไฟๅฉๅฝ้
ๅฝฑๅ.ๆฌๆxxxๅ
ๅณๅฏๅ
ฅไผ\n424504 0 ไป
ๆตๆฑ็ปๅ
ดไผไธ่ดขไบงไฟ้ฉๆฅๆก1300ๅค่ตท\n1000 Processed\n classify content\n425000 0 QQxxxxxxxxxiphoneๅ
จ็ๅซๆGPSๅฎไฝ้ช็บธ\n425001 0 ่ๆๅจๅฒๅนดๅนณๅ72818ๅ
ๆฅ็ฎ็่ฏ\n425002 1 ไฟก้ณๅธๅฎๆบไพๆฐด่ฎพๅคๅถ้ ๆ้ๅ
ฌๅธๆป็ป็ๆๆ ๆๆบๅ
จไฝๅๅทฅ็ฅๆจๅจๆฐ็ไธๅนด้็ๆๅ
ด้๏ผๅ
ฌๅธไธป่ฅๅ็งๅ...\n425003 0 ๆจๅคฉ่ฟๅจ่ดจ็ๆ็ไบบ็ฑปๆๅ็งฐๅกไธ่ฝฆๆฏไธ้กน่ฟๅจ\n425004 0 ๅจ่ฟไธช็่ฑๅ้ชจ็ๅนด็บชๆ่ฟทๅคฑๅจโฆ\n1000 Processed\n classify content\n425500 0 ๅฏนๆๆฅ่ฏดๆๅฎ็พ็ๆณไธญ็ๅคๅคฉๆฏ่ฟๆ ท็\n425501 0 ็ด้937็ดๆญ่ดด๏ผ19ๅฒ็ๅฐ็14ๅฒๅทฆๅณๅฐฑ่พๅญฆ้็ถไบฒๅฐ่ๅทๆๅทฅ\n425502 0 Dior่ฟชๅฅฅ็ๆ100ml้ฆ็ฒพๅ7\n425503 0 ๅจๆฐไผฆ่ๅท็ซๆผๅฑไผ้จ็ฅจ้ฝไนฐไธๅฐ\n425504 0 ๅ10็นๅ15็นไฟฉๆถๆฎตๅๅซ7ๆ็งๆๅไธบ่ฃ่4Xใๅฐ็ฑณ4ใiPhone6\n1000 Processed\n classify content\n426000 0 ๅบๅฐ็ไธ่งๅฝข้ฃๆบๅจๅคดไธ็ปๆฅ็ปๅป\n426001 0 ๅฝๆถไบไธๅคง้่ฟๅจไฟฎๅฐ้ๅช่ง่ไนฑ\n426002 0 ๅฆๆ้ชๅฟๅญไน่ฝๅ็่ฑๅ้ชจไธๆ ทๆๅ
ด่ถฃๅฐฑๅฅฝไบ\n426003 0 ่ฟๆถๅชๆ่ก็ฅจ่บฒๅจ่ง่ฝ้ท้ทไธไน๏ผไฝ ไปฌ็ๅนธ็ฆ\n426004 0 ๆฏๆAndroidใLinuxใWindows10\n1000 Processed\n classify content\n426500 0 ๅจๅป็ก
่ฐท็่ทฏไธๆๆๆบไธขไบๆฏ็งๆๆ ท็ไฝ้ช\n426501 0 ไพฆๆฅไบบๅๅจๅฎกๅค้ถๆฎตไฝไธบ่ฏไบบๅบ็ฐ\n426502 0 ๅไธ้ฃๆบๅฐฑ่ขซ็ผๅ็ไธๅนๆๅไบ๏ผๆฑๅ้ๆตฆๆบๅบๅฐไธไธชไธ้จ้้็็ป่ฟๅฏนๆฐไบบ\n426503 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ 95725eไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n426504 0 ่ฆๆไผxๅฒไปฅไธ็ๅญฉๅญไธไบๅบๆฌ้็ๆ่ฝ\n1000 Processed\n classify content\n427000 0 ้ฟ้ๅทดๅทดๅๆๆทๅฎไบไธ้จไธญ่ฅฟ้จๅคงๅบ็ป็ๅดไผๆๅบ้ๅฐๆๅฟๅๆไธ็บชไธญๅฟใไบฟ่ไธไธๅธๅบใ็่ง็คพๅบ\n427001 0 ๅธฆ็ไผ้ช็บขๅ
็ๆฃๆฃ็police้ฝ่ฟๅปไบๅ\n427002 0 ไธ่ตทๅถ้ ๆญฃไน็ๅทจๅคงๆบๅจไบบๅง~\n427003 0 ๅ ๅ
ฅ็ซ็100ใๅช่ฆไฝ ่ๅพไธไธช228\n427004 0 ๅ
ๆฅ็็BB้็ๆไปฝๅๆๅ็ธๅ
ณๆฅๅ\n1000 Processed\n classify content\n427500 0 ้่ฆไบ่ฆ่ฏดไธ้ๅฐๅบ่ฐ็ฎกๅฐๅบ่ฐ็ฎกๅฐๅบ่ฐ็ฎก\n427501 0 ไธๅคฉๆฐๆฌกไฝไผๅฐ็พๅบฆๅ่ฐทๆญ็ๅทฎ่ทโฆๅ
ๆ็ณๅฟ
่พพ\n427502 0 ไธญๅฝ็บบ็ปๅทฅไธ่ๅไผๅจ่ฟไบๆธฏ็ป็ปๅฌๅผไบ็ฑไธญๅค็ฅ้นฐ็ขณ็บค็ปดๆ้่ดฃไปปๅ
ฌๅธใไธๅๅคงๅญฆใๆฑ่้นฐๆธธ็บบๆบๆ้...\n427503 0 ๅฐฑ็ฎๅๅ ๆฒนๅคงๆฆไธไผๆผ้ฃไนๅคไบ\n427504 0 ่ๅคๅๆฑ่่ๅคฉ้ๅๆๅๆถๅปๆ
็ปช้ๅธธๆฟๅจ\n1000 Processed\n classify content\n428000 0 ็ปๅธธๆ
ๆธธๅบๅทฎ็ใไน้ฃๆบใ่ฝฎ่นใๅๆฑฝ่ฝฆ\n428001 0 ๅไฟๆฃ่
ๅจๅป่ไฝๅ
ๅป็ๆบๆ้ดๅ็บง่ฏ็\n428002 0 xxxxๅไธบPxโฆโฆๅชไธชๆฏไฝ ็่\n428003 0 ๆๅบๆญฃๅธธ็DNAๆไผคๅบ็ญ่ฝๅๅจ้ ่กๅนฒ็ป่็ปดๆๅ
ถ่ชๆๆดๆฐ่ฝๅไธญๆฎๆผไธๅฏๆ็ผบ็ไฝ็จ\n428004 0 ไธบๅฅ่ฟๆ ท็ๅฎณ็พคไน้ฉฌไธๅคๆญปๅ\n1000 Processed\n classify content\n428500 0 ๆๅฎขๆฉ็น๏ผIBM่ฟ็ปญxxๅญฃๆถๅ
ฅไธๆป\n428501 0 ๆฑ่ๅ็ปฟ็็ฉ็งๆ่กไปฝๆ้ๅ
ฌๅธๅจๅไบฌๅ
จๅฝไธญๅฐไผไธ่กไปฝ่ฝฌ่ฎฉ็ณป็ปๅ
ฌๅธไธพๅโๆฐไธๆฟโๆ็ไปชๅผ\n428502 0 ๆๅฑ
็ถ่ฎฐ้ๆๆ่
่็็ๆญๅบๆถ้ด\n428503 0 ไธไบๅญฆ้ข2015ๅนดๅจๆฑ่็่บๆฏใไฝ่ฒ็ฑปไธไธไบๅฟๆฟๆๆกฃ22ไบบ\n428504 0 ็ๅฐ่ฟ็งๆๆๅไบฌๅฝข่ฑก็่กไธบๅฐฑๅพ่ฎจๅ\n1000 Processed\n classify content\n429000 0 ่ฎพ่ฎก่
็จ่ๅ็่ๆฏ็ชๅบๆญๆฒ็ๅ็งฐ\n429001 0 ไป
3็งๅฐฑๅฐ็บฆ13ๅจ็ๅ
ฌไบค่ฝฆๆฌไบ่ตทๆฅ\n429002 0 ๅฝๅบๆ่ท็ๆใๆฒๆ็ญ9ๅๅซ็ไบบ\n429003 0 ไบค10ๅนดไฟ20ๅนด่ฏด็ปไฝ ๅนดๅบๅฎๆถ็12%\n429004 0 ๅ็งๆฑๆณๅทๅธไธญๅป้ขๅฏไธปไปปๅปๅธๆฃ่
ๆปกๆๅบฆ100%ๆ
้ฟไธญ่ฅฟๅป็ปๅๆฒป็็็ฎใ่็ใ่ๅจ่่ฟใ่่ฃใ...\n1000 Processed\n classify content\n429500 1 ไบฒ็ฑ็ๅธ็็พ้กพๅฎขๅผๅนดๆๅคง็คผๅฆ๏ผๆบไธๅฏๅคฑ๏ผๆถไธๅจๆฅ๏ผๅกๅจxๆxOๆฅๅๅฐๅบ้กพๅฎขๅๅฏไบซๅๆฝๅฅไธๆฌก๏ผ...\n429501 0 ๅพฎ่ฝฏไธ่ฏบๅบไบๅ้ๆฌ้ณLumiaไฝๅปไฝไป\n429502 0 ไปๅคฉๆไปฌๅฐฑไธ่ตท่ตฐ่ฟHAMANN\n429503 0 ๅจ19ๆฅผไธๆพๅฐไบ้ฃไธชๆ่ฐ็็ธไบฒๅคงไผ็็็ธ\n429504 0 ไนๆฏๅไปๅ็ฎกๅฑ้ฟๆๅๆ็้็นๅทฅไฝ\n1000 Processed\n classify content\n430000 0 ๆไปฅ่ฏดGoogle่ทBaidu็ๅทฎ่ทๆ็นๅคง\n430001 0 ๆฑ่ๅซ่ง้
้ณ็็ปงๆฟ่
ไปฌ็ฎ็ดๅญ็\n430002 1 ใๆณฐ็ฆพๅฆ้จ้ขๅญยท้ฆ็บใๆตทๆฒง็ฌฌไธๆตทๆฏ้ซๅฑ๏ผๅฐ้ๅฃ๏ผๅ
ฌๅญๆ๏ผๅๆ กๅญฆๅบๆฟ๏ผxxๅ้่ฟๅฒ๏ผxx-xx...\n430003 0 ๆฒณๅๅบๅบๆฟๅบๅคฉๅคฉ่ฟไน็ญ้นๅ\n430004 1 ๆจๅฅฝ๏ผๆๆฏๅฝๅๅๅญไธ้ฃ้ช้้พ้ๅฎ้กพ้ฎ่ดพๆตท้ซ๏ผๅพๆ่ฐขๆจๅฏนไธ้ฃ้ช้้พ็ๅ
ณๆณจ๏ผๆฌๅจๅจๅ
ญๅจๆฅไธบไบๅบ็ฅ...\n1000 Processed\n classify content\n430500 0 ไธญ็ฒฎ้ๅขๅไธไธชๆไบง็็ๅฉๆฏๆฏๅบ็บฆไธบ19\n430501 0 ็ญ่ฃค็ๅ่ฃๆ่ฎพ่ฎกไน้ๅธธ็นๅซ\n430502 0 ๅฐๅฎถไผ่ฝๅคๅฎ็ฐ170ยฐๅนฟ่งๆซๆ\n430503 0 ็กฎๅฎๆๅธ20ๅฐ็ณ้พ็ตๆขฏไธญๆฒกๆๅ็ฐโๅไบบโ็ตๆขฏๅๅท\n430504 0 ๅดไบฆๅก่ฏ่ฎผ็ป่ๆๅ
๏ผๅ
ฌๅธ่งๆไธบๆบๅจ้ถไปถ\n1000 Processed\n classify content\n431000 0 ็ฌฌ10้๏ผๅทฅๅไนไผๆๅๅฃคๆฑกๆ้ฎ้ขๅ\n431001 0 ๅคงๅๅค็ๆ็จๆฐดๆๆๆบๆ็ๅค้ด้ฃ่ก\n431002 0 ๅฆๆไฝ ่งๅพๅฏนๆๆบๅ็ต่ๅผบ่ฟซ็็ๆๆฏๅๆฌขๆฐ้ฒไบ็ฉ\n431003 0 ้ไผๆไบ้จ่ฑๅฐๅบๆฟๆกฅๆฐๅ็ฎกๅงไผๆฐๆ็คพๅบ็็คพๅบๆฒปๅฎๅฟๆฟ่
\n431004 0 ๅฆๆไฝ ็่ฝฆ้
ๅคไบECOๅๅจๆบๅฏๅๅ่ฝ\n1000 Processed\n classify content\n431500 0 ็ฑๅไบฌๅพไธๆตทๆนๅ่ๅทๆฎตxxxxK้่ฟ็ฐๅบ่ฝฆๅค็ผ่ก\n431501 0 ๅฟซtm่ขซ็ต่ๆฐๅญไบไปไน็ฉๆๅฟๅ็ญไบ20ๅ้ๅจๅญๅคฑ่ดฅ\n431502 0 ๆไปฅ่ฏด้ฃไธชๅทๆฏไธๆฏๆฒกๆญปๅๅชๆ็ป่บซ็็ฆ\n431503 0 ่ฟๆถไธไธชๆฃ่ฑ็ณๅฐ่ดฉๅธๅผไบๅฐๆๅไปฌ็ๆณจๆ\n431504 0 ๆฌข่ฟ็ฑๆถๆใ็ฑ็ฏใ็ฑ็ฉใ็ฑๅ้ฉใ็ฑๅ\n1000 Processed\n classify content\n432000 0 bigbangๅไบฌๅบ1280\n432001 0 ไธ่ฆ็ฒ็ฎไธ่ฆๅปไนฑๆจๅนฟๅ่่ขซ้ช่งๅพๆไปฌๅ
ถไปๆจๅนฟๆไนๆไนๆ ท\n432002 0 ็ญๅฐพ็่ฎพ่ฎกไฝฟๆดไธช่ฝฆ่บซๆดๅ ็ฒพ่ด\n432003 0 ๆฒณๅๆณ้ขๅจๅ
จ็ๅผๅฑๆๅปๆไธๆง่กๅคๅณใ่ฃๅฎ็ญ\n432004 0 ๆ่ฟ่พนไธmicrosoftๆ้ฝ่ฆๆvpnไฝ ไปฌๆฏ็ญ็ๆไปฌ้ฝๅปๅทฅไฟก้จๆ่ฏไน\n1000 Processed\n classify content\n432500 0 2018ๅนดๅบๅๅฐ่ดๅไบ5Gๆ ๅๅๅถๅฎ\n432501 0 x็ๅฅถไฝ่xxๅธไธ็ฒx็bb้ข่x้ฒๆ็พ็ฝๅท้พ\n432502 0 ็ฑ็ฌๆ่ดฃไปปๅฟไธ่ฟๅฟๅจๅไบฌๆ็จณๅฎๅทฅไฝ\n432503 0 ๆ็ฝๅๆคๆ้ฎ้๏ผๅฅฝๅฃฐ้ณๆฏๅคๅฝ็\n432504 0 ่ฒ็นยทๆณฐๆฏ็น็ฝ่ยทๅๆๆธฉ\n1000 Processed\n classify content\n433000 0 ไนๅพโโโไบ่งฃ้ฃไบๆฑฝ่ฝฆ็้่ๅ่ฝ๏ฝ\n433001 1 ไบฒ็ฑ็ๅงๅงๆจๅฅฝ๏ผไธๅ
ซ่ๅฐๅฐ๏ผใ็่ฑ่ๅฐใๅฐๆผ๏ผๆๅ็ฅๆจ่ๆฅๅฟซไนๆฐธ่ฟๅนด่ฝปๆผไบฎใxยทxโx.xๆ...\n433002 0 ็ๆณ17ๅฑๅฅณ็ๆดๆ้ดๆกๅฐไธไธชๅทฅๅก\n433003 0 ไฝฟ็จ้ฟๆๆฏๅ ็TourSaver้
ท่\n433004 0 ไนฐไธไฝไบ็ฝๅ
็ไผฆๆฆ่ญฆๅฏๅฑๆงๆป้จ\n1000 Processed\n classify content\n433500 0 ๅคชๅๅธไนไธๅฐๅญฆ2015ๆ่ๆๅธ็ไผผๅ
้จๆไฝ\n433501 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ\n433502 0 ๆๅพๅคๅฐๅๅบ้บ่ฃ
ไฟฎๅพ็ฒพ่ด็\n433503 0 ่งฃๆพๅๆปๅป้ขๅฟๅ
็ง้้ตๅฒฑๆๆๅไบ้ขไธบโSTEMIๅNSTEMIๆฒป็็ญ็ฅ่งฃๆโ็็ฒพๅฝฉๆฅๅ\n433504 0 ไธ้ขๅบๅ ็ชๅ็่ก่๏ฝ??\n1000 Processed\n classify content\n434000 0 ็พๅบฆๆ็ฅ้ๆๅงๅง็ๆผๅ็ๅๅญๆๆฏ้ฉฌๅฏ\n434001 0 ๅจๆ
ๆธธ็ๅๆถไนๆๅฅฝๅค็พๅณ็ๅฐๅๅๅฝๅฐๆตท้ฒ\n434002 0 ๅฟฝ็ฅwinphone็จๆท็็ปๆต่ฝๅ\n434003 0 ไผๅคๆ็ฒ็บท็บทๅๅธๆจๅปiOS7ไนไธ\n434004 0 ๅฌ้ตๅบๆณ้ขๅจๅ
ซไธๅๅคๅฌๅผๆถๅ็ปดๆๆฐ้ปๅๅธไผ\n1000 Processed\n classify content\n434500 0 xxx็ไธคๅผ xxๅบxๆไฝ็ฝฎ้ๅธธๅฅฝ้่ฏๅฟๆฐ\n434501 0 ่ดตๅทไบบๅฌไบๆฐๅฐ่ฏด\"่ดต้ณๆๅบง้้ผๆฅผ\n434502 0 ๆฑไธๅผ bigbangๅไบฌๅบ็ฅจ\n434503 0 ANGLEๅ
ฌไธปๅฎ็พ็่ๅไบ้บป้บป็็พ่ฒไธ็ฒ็ฒ็ๅธ
ๆฐ\n434504 0 ไธ็ฅไธ่งๅญฆC่ฏญ่จๅทฒ็ปไธคไธชๆๆไบ\n1000 Processed\n classify content\n435000 0 ็ฝ้ขไธฝไบบ่นๆๅนฒ็ป่้ข่ไธญๅฝ็ฌฌไธๆฌพไธๆณจ่งๅน่พๅฐ่็ ๅ็้ข่ๆค่ค่พพไบบๆๅ็ฑ็ไธๆฌพ้ข่\n435001 0 ๅ
จ้จๅปบ็ญไปฅๆฌงๅผๅไฟ็ฝๆฏ้ฃๆ ผไธบไธปไฝ\n435002 0 ้ๅฐไธไธชไผremix็ๆๆ่็ธ\n435003 0 ็ฐๅจ็งฏๆๆฒ้ๆดพๅบๆ้ฃ่พนๆพๅฃไบ\n435004 0 ๆๅผ็ต่ๅๅฟไธไธๆ็้จ้็ๆดปโฆไธคๅนดๆถ้ดๅพๅ
ๅฎ\n1000 Processed\n classify content\n435500 0 ๅๅๅพ็ฅxxๆๆๅไบฌๅบๅฐๅ
ดๅฅๅคด่้ขไผฐไบไธไธ็็็ธๅฆ็้ฑๆด็ผ็ๅฏ่กๅบฆ\n435501 0 ๅคงไธญๅ้กถ็ๅคช้ณๅปไบ่บบๆฟไบงไบคๆไธญๅฟ\n435502 0 ไปฅๅๅฏน็็ต่ๅพ่ๅถไบโฆ่ฉ่ๅฅฝ็โฆ\n435503 0 ๆไปฌๅป้ขไปๅนด่ฟไบๅๅ ไธชๅๅฃซ\n435504 0 ้ฝๆไธ้ขๅๅพๅคงๆตท็ๅฟ~่ฟไธชๅคๅคฉ\n1000 Processed\n classify content\n436000 0 Whooๅๅ
จๅฝ็ฌฌไธๅฎถไฝ้ชๅผไธๆ\n436001 0 ๅ ๅผบไฟก็จ็็ฎกไธญๅฝ่งๅๅปบ่ฎพโๅ
จๅฝไธๅผ ็ฝโ\n436002 0 ๆฟๅฐไบงๅผๅๅ
ฌๅธ็ๆณไบบ่ฟๆฏ้ๅฐๆฟ็ฎกๆๆ้ฟ\n436003 0 ๅ็ฌๅ
ๅผๆปก100ๅ
ๅณ้10ๅ
่ฏ่ดน\n436004 0 ไธ็จๅฐ็ฑณๅไธบ็ๅนด้ไบบๅฟๆๆฏ่็\n1000 Processed\n classify content\n436500 0 ็ปๆๅป็ๅฌไบไธ่ๅฟๅฐฑๅผไธๅ ๅๅญๅซๆ็ผด่ดน\n436501 0 ็พๆนๅฏนๅพทๆฟๅบ้จ้จ็็ๅฌๆดปๅจๅฐฑๅทฒ็ปๅผๅง\n436502 0 ไปๅนดxๆๆๅฎ็ไธไปฝ่ฝฏไปถๅบๅฃ็็ฎกๆณ่ง\n436503 0 ็ฎๅๅฏๅจๅบ่งๅ้ข็งฏไธบ10ๅนณๆนๅ
ฌ้\n436504 0 ๆญๅทๅๆฒๅป้ข็ฎก็ๆ้ๅ
ฌๅธๆ่ๅธๅบๆจๅนฟไธป็ฎก\n1000 Processed\n classify content\n437000 0 ๆๅฎๆฟ่ฟๆ ท็็็ธไธ่พๅญ่ขซๆไธบ่ขซๆถๅ
ๆฉๅ็็งๅฏ\n437001 1 ๅฅฅๅๅฉๆจๅผ้จ็บข๏ผ็บข๏ผ็บข๏ผๅกxxxxๅนดxๆxๆฅๅ่ฟๅบๅฎขๆทๅๆ็บขๅ
่ต ้๏ผๆดๆๅฎ้ๅๅๆตๆปก้ข้ๆบ่ฝ...\n437002 0 ๅคงๅ
ณๆดพๅบๆๅฐๆฝๅๅฎถไธญๅ้ฑ็ๅชๆๆ่ทๅฝๆกใไพๆณๅๆ\n437003 1 ใๆ ่พพๆฑฝ่ฝฆใxๆไผๆ ๆดปๅจ๏ผๆฅๅบๆขๆๅ
่ดนๅฎๅ
จๆฃๆตๆๅก๏ผไฟๅ
ปๆดๆขๆบๆฒน่ต ้ๆบๆฒนๆ ผไธไธช๏ผ่ดญไนฐ็ฒพๅๆปกx...\n437004 0 celine้ฆ้ๅๆทปๆฐ่ฒ็ดฐ็ฏไป้บผ็้ฝๆฃๆฃ็ๆญค่ฒๆดๆฏ็งๅฌ่ชฟ่ชฟๅคงๆฐฃๅปไธๅผตๆไฝ่ชฟๅ่็ๅฅฝๆญ้
็้ก่ฒ...\n1000 Processed\n classify content\n437500 0 ๆ่งๅๅผบๅฅธ็ฏ่ฏดๅผบๅฅธไธๆฏไธบไบๆงๆฌฒ\n437501 0 ไบ้ฉฌ้ๅ
ฌๅธ่่ต10ไบฟ็พๅ
ๆถ่ดญไบ่ง้ข็ฝ็ซTwitchไปฅๅ ๅผบๆธธๆ่ง้ข็ธๅ
ณไธๅก\n437502 0 ไธบไปไน่ฆ้ๅฅนๅปๅป้ขโ่ฏฅไผ็ฃ\n437503 0 ไฝๅฑฑๅธ็ฆ
ๅๅบโๅผๅฟๅคง่ฏๆฟๆพๅคๅๅบ\n437504 0 ๅฝๆ็ๆๆ ๅบๆฅๅ่ฟๆฏๆๅๅฐฑๅกซๅฅฝๅข\n1000 Processed\n classify content\n438000 0 ไธๆตทๅ็พๅป็็พๅฎนๅป้ข็ๅถไธฝ่ๅป็ๆไนๆ ท\n438001 0 ไธไบไธๅคฉ็่ฏพ็ปไบ็ฌๅฐๆพๅญฆ็ซ้ฉฌ่ตถๅป็\n438002 0 ่ฃๅคๆๅฟไนๅฅฝ่ฟไบ้ฝๆฏไธ็งไผ ๆฟ\n438003 0 ๆ ๆฎ้็นไธบไธญๅฝๅป็ๆบๆๆไพไธค็งๆนไพฟ\n438004 0 7ๆๅนฟๅทไธญๅฟๅ
ญๅบไบๆไฝๅฎ
ไบคๆๆถจๅน
ๅพฎๅผฑไธ่ถณ1%\n1000 Processed\n classify content\n438500 0 ๅๆๆๆ็ฅๆงๆ่ตๅจๆ้ไธด\n438501 0 ไนๆฏ่ขซLPๅไบฌๆผๅฑไผ็่ฏ่ฎบๅผ้ไบโฆ1ไธชๅๅฐๆถๅซ็ญ\n438502 0 ่ฎค่ฏไฟกๆฏไธบโๆฑ่็ๆตท้จๅธๅ
ฌๅฎๅฑๆฐ่ญฆโ\n438503 0 ๆฅๆฌไธๆถๅฐๅ้ฃๆบ่ตท้ฃๅไป
ไธๅฐxๅ้ไพฟๅ ่ฝ่ณไธไบฌ้ฝ่ฐๅธๅธๆฐๅฎ
\n438504 0 ไบบๅงๅจๆพๆ็ๆถๅๆฌไธชๅคด้ฝไผๅ็ฐ่ฟไธชๅฐ้้ฝๅจ็ปไฝ ๆฒ่ญฆ้ๆ็นๆณๅ\n1000 Processed\n classify content\n439000 0 ๅๆไฝฟๆไธญๅฅณๆง็จBๅป่ดฟ่ตCๆขD\n439001 0 ไปๅนดไธๅฎ่ฆๅปๅไบฌ็ๆๅฟ็่ทจๅนด\n439002 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ jz4r5pไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n439003 0 ๆฅๆ่ทฏ175ๅท็ๅบ้บ็ฎๅๆญฃๅจ่ฃ
ไฟฎไธญ\n439004 0 ๆ่ฟๆ็ฑๆ็ๅฐฑๆฏๅไฟ้ฉๅ
ฌๅธไบ\n1000 Processed\n classify content\n439500 0 metrostationๅฐ้็ซ\n439501 0 ็ๆฟๅบ้จ้จๅๅธๆฟๅบ้จ้จ้ฝๆฅๅบไธๅฐ็่ดชๅฎๆฑกๅ\n439502 0 ไธญๅฝๅปบ็ญ็ฌฌไบๅทฅ็จๅฑๆ้ๅ
ฌๅธๅ
ฌๅธ็ฎไป\n439503 0 ๆฐพๆฐด้2015ๅนดๅค็งๅญฃๅพๅ
ตไฝๆฃๅทฅไฝๆญฃๅผๅฏๅจ\n439504 0 ่ฏทไธ่ฆ่ขซไธญๅฝ้ฃๅ ไฝ่ฝฌๅบๅ ๅคง่
้ขๅฃซไธๅฎถๅ
ๆฌๅไธ้จๆญฃๅฏ้จ้ฟ่ไฝ็ผ\n1000 Processed\n classify content\n440000 0 ๆๅช่ฝ่ฏด้ไธxxxไธๆ ท็่ฃๅค\n440001 0 ๆๅ่กจไบๆ็ซ โโๆฑ่็ขฐไธโ้ปไธญไปโๆไนๅ\n440002 0 ๆฅ่ชๆฑ่็็ๆถๅฒๆบ่ฝฆ้ๅๅธๅจ้ๅๅบ็ๅๆก้่ทฏ\n440003 0 ็ๆฒ็ฌๅฎถ้็จๅ
จ็500ๅผบๆ ็ฒ้+ๅ่งฃ็ฒ้็็ฏไฟๆฐ่พ
ๆๅฆ\n440004 0 ไปฅๅ้ฝ่ฟ่ฆไธ็ดๅจ็ต่ไธ็ฟป็ฟป็ฟป็ฟป็ฟป็ฟป\n1000 Processed\n classify content\n440500 0 ๆๅฟ้ฃ่ฏ็ๅฑ่ฟ่กไบ็ตๆขฏๅฎๅ
จๅคงๆฃๆฅ\n440501 0 xxๅนดๅ
ๆๅบ็่งฃ็บฆไฝ ไปฌๆนๆข็ถๅๆ้ฃ่ฟ่ทๆๅฎถๆไปไนๅ
ณ็ณป\n440502 1 ไฝณ่ด่พ็น็พๅฅถ็ฒxๆxๅทๅๅ็งฏๅๅฆ๏ผ่ดญไนฐxๅฌๅฐฑๅฏไปฅๅ
ๆข็ธๅ็ไบงๅxๅฌไบ๏ผๅฆๆๅฅฝ็คผ็ธ้๏ผๅ้จไปฅๆๆฏๅฉดๅบ\n440503 0 ่ฎค่ฏไฟกๆฏไธบโ่
พ่ฎฏๆๅญฆไฝๅฎถโ\n440504 0 ๅไธญ่ฏ็ๆฅๅญไธ่ฝๅๅไธ่ฝๅ่พฃ\n1000 Processed\n classify content\n441000 0 comๆ ้กๅฏไธไธๆณจ่ฎพ่ฎกๅน่ฎญ็ๆๅจๆบๆCGๆ็ป่ฎฒๅธ๏ผ8ๅนดCGๆ็ป็ป้ช\n441001 0 10086ๅ่ฏๆ็ณป็ปๅ็บงๆฒกๅๆณๅธฎๆณๆๅผ้\n441002 0 2015ๅ
จ็ๆฏๅบๅฐผๅฐๅงไธญๅฝๅคง่ตๅผ่ตๆๅฐ16ๅฒ\n441003 0 ๅฅฝๅๆๅพๅคgnๅ ไธบๆไบๆฒกๅๆณๆฅ\n441004 0 ๆตๆฑๆฏไธไธไธชไบบไธญๅฐฑๆ801ไธช่ๆฟ\n1000 Processed\n classify content\n441500 1 ๆฅๆฅๆฐ็ฝ๏ฟฅ่ฒธ็ฅๅบๅบ๏ฟฅๅ
ๅฎตๅทฒ่ฟ๏ฟฅๅจ็ๆทฑๅนฟ๏ผๆฟ$ๅฑ๏ฟฅ่ฒธ๏ฟฅๆญไปๆ ๅฏนๆใๅนดๅไฝ่ณx/ๅใ้ซ่ฏ้ซ$่ฒธ....\n441501 0 ๅฅนๅฐๅๅพๅๅฐๆปจๅๅ 7ๆ23ๆฅๅผๅน็2015ๅนดๅ
จๅฝ้ๅบฆ่ฝฎๆป้ฆๆ ่ต\n441502 0 ๅไปทxxxใxxx็้ฑๅ
็ฐๅจๅชๅxxๅ
\n441503 0 ไบฌๆดฅ็ฟผใไฝ่ฒใๆฐดๅก็ญๆฟๅ่ทๅน
ๅฑ
ๅ\n441504 0 ๆฏๅฐ่ฑ่งใ็ช่ใๅ่ฑใ่ฅฟ็บขๆฟใ่ๅญไพๆฌกๅ
ฅ้
\n1000 Processed\n classify content\n442000 0 ไปๅคฉๅฌๅป็่ฏด้คไบ็็ช่ใ็็่ใ้ฒจ้ฑผใไธๆ้ฑผไธ่ฝๅๅ
ถไป้ฝๆฒก้ฎ้ข\n442001 0 ๅๅบๅพๅ่ถ
่ฟ100ๅ็ๅนถไธๅจๅฐๆฐ\n442002 0 ไธญๅฝๅคฉๆๅญฆไผxxxxๅนดๅบฆๅซๆๆฟๅ
ๆต่ทๆๆฏไธๅบ็จ็ ่ฎจไผๅจไบๅๆพๆฑๅฌๅผ\n442003 0 ๅทๆณ้ขไปฅๆดๆฒปๅๆนๆกไปถไธบไพง้็นๅ็ช็ ดๅฃ\n442004 0 6ๆไปฝ็่ดน็จๅทฒๅจ7ๆ24ๅท็ๆถๅ็ผด็บณ\n1000 Processed\n classify content\n442500 0 ๆฑ้ฎ๏ฝๅปๅชๅๆฌๅทๆฏๅบ่ๅนดๅก\n442501 0 ่ฝ็ถ้ฝๅจๆฑ่่ฟๆฏ่ฆxxxๅคๅ
ฌ้็\n442502 0 ้่ฟๅฑ
ๆฐ่ฏฏไปฅไธบๆฏๅ็ๅผบๅฅธๆกๆฅ่ญฆ\n442503 0 ่ฏดไธๅฎ่ฟ่ฝ้่ง็ปไธช50็ปๆๅฑไธชๅฐๆฒๅข\n442504 0 ไบไปๅคฉ็ฝๅคฉๅจๆตๆฑๆธฉๅฒญ่ณ่ๅฑฑไธๅธฆๆฒฟๆตท็ป้\n1000 Processed\n classify content\n443000 1 ไฝ ๅฅฝๅง ๆๆฏๆผฏๆฒณๆฐ็็น็ๅ
ฐๆฒนๅฐๅผ ๅฑไปฌไธๅ
ซ่ๆดปๅจๆๅคฉๅผๅงไบ ๆปกxxxๅxxๆปกxxxๅxxx...\n443001 0 ไธบไฝ่ฟไบง็ไบ30ๅคๅ
็ๆผซๆธธ้่ฏ่ดน็จ\n443002 0 ็ฎ่ค็ฒ็ณ็ๅๅ ไธ๏ผๅจๅนฒ็ฅ็ๅฌๅญฃ\n443003 0 ๆฑ่็ๅ
ไธๅฎถ็ป่ฐ็ ๅฎฟ่ฟๆฐๅญๅๅ็ฎกๅทฅไฝ\n443004 0 nkxxxๅmonxxxๅ็็ฑณ่ฝฌๅบๅ ็็ฑณ็ญ\n1000 Processed\n classify content\n443500 0 ็ต่งๅฐ่ขซๆไปฌๆฟๅ
ไบๆ่ฐ้ฝไธ็บฆ\n443501 0 ้ๅปๅไบฌๅ็ฉ้ฆๆๅฉไธ็้จๅ็ๅฎ\n443502 0 MININSCE2015็จๆฟๆ
\n443503 0 ็ฆปๅผ้ฉฌ็ดฏ็ๆถๅ้ฃๆบไธๆๅฐไบไบไธญๅฝฉ่น\n443504 0 ้ข่ฎกxๆๅบ่ณxxๆๅ่ฝๅผๅญ่ฟๅฎข\n1000 Processed\n classify content\n444000 0 ๅฐ้ข่ดทๆฌพไฟ่ฏไฟ้ฉๆฏๆ็่ดทๆฌพๅฏน่ฑกไธบไธ็ฑปไบบ็พค\n444001 0 ่ฟไบๅปบ็ญ่ฎพ่ฎก่กไธ็ๆฝ่งๅๅคชๅฏๆ\n444002 0 ๅ่ฏไฝ ไปฌไธไธช็งๅฏ๏ผๆด่ธ็ๆถๅ\n444003 0 7ๅท็บฟๅนฟๆธ ้จๅฐไน้พๅฑฑๆฒฟ็บฟไธ่้ๆไนๆฒกๆๆๆบไฟกๅท\n444004 0 ๅจๆผซPSPๆ
ๆธธไพตๆฒกไบ็ๆณ็ๅ้ฑผ\n1000 Processed\n classify content\n444500 1 ๆดปๅจใ้ไฝ ไธๆฌกๅฅๅบท็พไธฝไนๆ
ใๆดปๅจๆ้ดๅ้ขๅฑฑๆฏๅบๅฐๅฏนๆๆๅฅณๆงๆๅๅฎ่กๅ
่ดนๅผๆพ๏ผ้ชๅ็ทไผดๅช้xx...\n444501 0 ๅทด่ฅฟๅๅฝ่ๅผบๅฅธๅฅณๆ่ขซๅคๅ
ฅ็ฑ32ไธชๆ\n444502 0 ๅฎฟๅๅบๅณๅจ่ฅฟๅททๅฎนๅจ่ฒๅฝฉ้จๅฃๆฒฟ่กๆพๆ\n444503 0 ไธญๅฝๅฅฝๅฃฐ้ณ้ฝๆฏๅ่ฑช็ฉ็ๆธธๆ\n444504 0 ่ฎธๅคๆฐๆฌพๆๆบๅคๅฃณ้
ไปถไปไน็ๆ้ฝๆฏๅจๅพฎไฟกๆดๆฐ็\n1000 Processed\n classify content\n445000 0 ไฝ ๆๆพๅฐ็ๅฐ้ฅฐๅๅบ่ฃ
ไฟฎๅพ็ๅๆฏไปไนๆ ท\n445001 0 ่ฟไธชๆฌไธๅบๆฌ้ฝๆฏ้่ฝฌ่ฃๅคๅFFไนๆฏ่ฎๆผโฆโฆ\n445002 0 ่ดผTheboytoldmethatIwasathiefwhostolehismemory\n445003 0 ๅ่
่ฝฎๅฅธๅ ๆๅๅฏ็ผๆญปๅๅฐๆ ๆ็่ฏ่ฟฝ่ฏๆถๆๅบ่ฏฅๆฒก่ฟไฝๆไบบไผฐ่ฎกๆๅฐ้พ\n445004 0 ๅฐฑๆฏๆๅฎถ็็บฏๅคฉ็ถๆๆฑFcup??\n1000 Processed\n classify content\n445500 0 ๅ ไธบๆปๆๆถฒไฝ้ฒๆ้ไธๅฐๅฟ่นญๅฐ่กฃๆไธ\n445501 0 ็BH9785็ฝ่ฒ้ฟๅฎๅฐๅๆฑฝ่ฝฆ้ๅพ่ๆนๅธๆขๆ\n445502 0 2015ๅนดๆตๅนดไธบไบ้ป็
็ฌฆๆๅ
ฅๅฎซ\n445503 0 8ๆ2ๅท้ฃๆบ้ฉๅฝๆฐ็ฝๅ
็จๅบไบบ่่ๅ\n445504 0 makeupforeverHDๆฃ็ฒ\n1000 Processed\n classify content\n446000 0 ้ๅ12~60ๅฒ็ๆๆๅฅ่บซไบบ็พค\n446001 0 ๆฌ้จไปๆๅฑฑๅฎ่ฟๅฟซๅๅธฎๅคบๅพๅๅๅคงๆณๆฎ็ซ ไบ\n446002 0 ็ฎๅA่กไธญ็ณๅๅ
ทๅคไบๅทจๅคง็ไปทๅผๆ่ตๆบไผ\n446003 0 ๅๅฐ็็ฐ่ดง๏ผDHCๅ่ใDHC็ฆ่
ฟไธธใๆฐ่ฐท้
ต็ด ๅ ๅผบ็ใ่ฑ็็ผ็ฝฉ\n446004 0 ่ฏไผฐๅฐไผฆๆฆๅคงๅญฆๅฝ็ๅญฆ้ข็ๅ
ฅๅญฆ็?\n1000 Processed\n classify content\n446500 0 ็พๅบฆๆไพๅ
ๅฎนใๆๆฏใไบบๅใ่ฟ่ฅ่ตๆบ่ฟ่ก็ฌๅฎถ่ฟ่ฅ\n446501 0 ไปxๆๅๅฐฑๆขไฝ ไปฌ้ฃไธชๅไบฌๅฐๅบ็็บขๅ
\n446502 0 ไนๆฏไฟโxxxxๅฝ้
ๅไบๆฏ่ตโ็ๅผๅนๅผ\n446503 1 ๆ็ ไธป๏ผxx.xx.xx.้ฒ.xx.xx.xx.xx.xx.xx.xx.xx.ๅฎถ่ไธญ๏ผ\n446504 0 ๅ
ฌๅธ่ก็ฅจๅฐไบ7ๆ13ๆฅ่ตทๅค็\n1000 Processed\n classify content\n447000 1 ไบฒ็ฑ็ๅง๏ผๆจๅฅฝ๏ผๆๆฏWHOOๅไธๆ็็พๅฎน้กพ้ฎไปปๅญใ็ฐๅจๆไปฌๅๅบๆโไธๅ
ซโๅฆๅฅณ่็ๆดปๅจ๏ผ็ฐๅจๆ...\n447001 0 ๅฏ่ฝไบxxๆฅๆฉๆจไปๆ็ไธๅ้จๅ
ฅๅข\n447002 0 ๆ็ฅ้่ฑๅ้ชจไนๆฏไธ็่ฏๅฟไบ\n447003 0 ๆไปฅๅญๅฎซ็ไฟๅ
ปๅฐฑๆพๅพๆฏ่บซๆๅ็ฎ่ค้่ฆๅญๅฎซไฟๅ
ปๅ็ญ็พ้ช่ฒ่ดดไฝ ๆๆญฃ็กฎ็้ๆฉ\n447004 0 ไปๅคฉๆฏๆนๅ่ๅท็ตๆขฏไบไปถๆญป่
็ๅคดไธ\n1000 Processed\n classify content\n447500 0 ๅๆๅฐ่ฝฌๅไธบไฝๅบ็ไผๆๅไธ\n447501 0 ๅฎๅๆๆบๅฏไปฅไธไธไธช็ฉบ่ฐ็ฒพ็ต\n447502 0 3ใ็จๅฐๅป่ฟ็ๅๅฆๆฐดๆ่ธ่\n447503 1 ไบฒ๏ผๆไธๅฅฝ๏ผๆทฑๅๅคฉ่นๅทง่ฟชๅฐๆ ็ฅไฝ ๅ
ๅฎตๅฟซไน๏ผ xๆxๆฅ่ณxๆxๆฅๆดปๅจๅผๅงๅฆ๏ผๅ
จๅบxไปถๅณๅฏไบซๅx...\n447504 0 ๅฝๆๆๆตท้จๅธๆฎ็พไบบ่ๅไผ็ป็ป\n1000 Processed\n classify content\n448000 0 ๅๆๅฝๅคๅปบ็ญไธญ็ๅๅฃซ้ๅๅฏนๆฏ\n448001 0 ไน่ง็1080pๆฐๆฐ่ฏดๆๅฎๆฏ่ฝฌ็ ็\n448002 0 ๅๅฐผ่ฏๆตดๅฏปๆ นไนๆ
่ตฐ่ฟ้ฟ่
็งๅคๆ่ฝ\n448003 1 <ๆฐ - ่ก - ไบฌ>๏ผ้ฆ ๆฌก ๅญ ็ซ ้ . xoo%๏ผๅฐๅ๏ผx x x x x x...\n448004 0 ้ซๅ่ดจๅ่ๅทฒๆไธบๆฌง็พ็ญๅ่พพๅฝๅฎถ่ฌ่ๆถ่ดน็ไธปๆต\n1000 Processed\n classify content\n448500 0 ่ฎพ่ฎก๏ผramonaenache\n448501 0 ็ๆ็่ฃ
ๆฝขไผผๆพ็ธ่ฏ็BGMๅๅณ้\n448502 0 ่ฟๅฐฑไผ้ ๆๅคงๅฎถๅจๆพSEOไผๅๅ
ฌๅธ็ๆถๅๆฏ่พๅฐๆ\n448503 1 ๆจๅฅฝ๏ผๆๆฏๅทด็นๅฉๆฉฑๆๅฏผ่ดญ๏ผxๆxๆฅ-xๆxxๆฅๆไปฌๅทด็นๅฉๅฎๆจๅฎถๅ
ทๅฎๅถๆฉฑๆ๏ผๅๅฎถๅคงๆพไปท๏ผไปทๆ ผไฟ...\n448504 0 ๅคฉๅๅๅ็ๅฐๅฆ่่ง็ๅนฟๅๅๅพๆๆๆบๅทฎ็นๆไบ\n1000 Processed\n classify content\n449000 0 ไฝ ่บซ่พน่ฟๆ5ไธช้ๆถๆถไฝ ๅฝ็ไธ่ฅฟ\n449001 0 ไธไธชไบบๅฐฑๆฏไธๅชๆผๆณ็้ฃ็ญๆปๆฏ่ก็ไนกๆๅจๆททๆฒไธๅฟ็ขไธญ\n449002 0 ๅ
ถๅฎ็ปไธชๅๅพๅพฎ่ฝฏ็ๅปบ่ฎฎ๏ผๅฆๆๆๆ็wpๆๆบ้ฝ่ฝๅ็บงๅฐwin10\n449003 0 xxxxๆฌพๆฐ่่พพๆฏ็ฌฌไธไปฃ่่พพ่ฝฆๅ\n449004 0 ้ข่ฎกๅฐไบxxxxๅนดไธๅๅนดๅผไธ\n1000 Processed\n classify content\n449500 0 ไปๅนดๅคงๅๅนดๆฒกๆๅบ้จๆ
ๆธธ่ฟไธๆฌก\n449501 0 ๅจamazonไนฐไบไธชu็ใไธ้ข้ฝๆฏไธญๆ่ฏดๆ\n449502 0 โitlittleprofitsthatanidleking\n449503 0 UtenaPuresไฝๅคฉๅ
ฐ่็ๆต็ปๅฐฟ้
ธไฟๆนฟ้ข่ๅๆฅๅฆ\n449504 0 ็ปๅคงๅฎถๅไบซ\"killmehealme\"\n1000 Processed\n classify content\n450000 0 ๅ ไธบๅฎไฝ่ฟๅจไปปไธๅๆ่
่ดฅๆฒกๅ็ฐ\n450001 0 ไปๆฏไธๆฌกไธ้ๆ้ฃๆบ็็็ฑ้ฝๆฏๆไธๆณ็็ไฝ ๅญ\n450002 0 ไฝ็ฝฎๅจMONTEREYPARK\n450003 0 ไธญๅฝๆฟๅฐไบงๅธๅบ้ขไธดๅๅไธฅๅณป็่้ช\n450004 0 ็ฐๅจๆ่ตไธๅ่ฟๅจ้ๆฏ่ดญไนฐไธๅ้ซ่ท้ๆดๅ็ฎ\n1000 Processed\n classify content\n450500 0 ้ป่ฒ๏ฝๆฐๆฌพAsh่ฟๅจ้้็จ็ปๅ
ธ่ฟๅจ้ฃๆฌพๅผ่ฎพ่ฎก\n450501 0 ๅฎไน ็ๅคชๅฅฝ็ไบ้ๆบ็่กไธบๅธ
ๅฐไธ่ก\n450502 1 ๆ่กๆๆฐๆจๅบๅนธ็ฆๆ่ดทไฟก็จ่ดทๆฌพ๏ผๆ ้ๆตๆผๆ
ไฟ๏ผๆ็ปญ็ฎไพฟ๏ผๅฎกๆนๅฟซๆท๏ผๅนด็ปผๅๆๆฌx.xx%ๅทฆๅณ๏ผ่ฏฆ...\n450503 0 xๅxx็งๅผๅงๆ~ๅ้ขๆไธๆฅ็ไธ่ฅฟๆฏไปไน\n450504 0 ๅจๅคๆฅ้็่ฏปไนฆ็็ญๆ
~ๆๆฐๅฅฝไนฆ\n1000 Processed\n classify content\n451000 1 ๆฐๆฅๅฅฝ๏ผๆๆฏๅ่ฅๆถฆๅฎๅฐๅทๅ๏ผๆฌๅๆฟๆฅ้ป็ฝๅฝฉ่ฒๅฐๅท๏ผๅ่ดงๅๆถไปทๆ ผๅฅฝ่ฏด๏ผๅฆ้ๅฏ่็ณปqqxxxxx...\n451001 0 ๆนๆฝญไธญ้ขๅฏนๅ็ๅจ12ๅนดๅ็โๆนๆฝญๅคงๅญฆ็ ็ฉถ็ๆไบบๆกโไธๅฎกๅฎฃๅค๏ผๅคๅณ่ขซๅไบบๆพ็ฑไบๆ ็ฝช\n451002 0 ไปไปฌๆฅๅฐๆตๆฑ็พๅคงๆป้จๅๅ ่ถ
็บงๅข่ดญไผ\n451003 0 ๆ่งๅพๆๅฏ่ฝๆ็ฒพ็ฅ็
่ขซๅฎณๅฆๆณ็็ฆ่็่ฟๆ่ฝปๅพฎ็ฒพ็ฅๅ่ฃๅๆดๅๅพๅๆฑๆจ่ๅฟ็ๅป็\n451004 0 ็งฐ็ฆปWindowsXP้ไผ่ฟๆ95ๅคฉ\n1000 Processed\n classify content\n451500 0 1000ไปฅไธ่
ๅ้ๅค้ๅฅ่บซ่ณๅฐVIPๅกไธๅผ ๅ\n451501 1 ๅ้ผป๏ผๅ่็พๅฅณไบบๆฅๅ
ไนๆณ๏ผ่ฟ้ข็บขๅ
xx-xxxๅ
ๆผๆๆฐ๏ผๅ
่ดน่ฑๆฏไธๆป่ฟๅคใๆถ่ดนๅฐฑๆ่ฑช็คผ๏ผ็ฑ็ฏ...\n451502 0 ๆๆๅคฉไธๅ่ฟ่ฆๅ้ฃๆบๅๅทซๆบช\n451503 0 ๆตๆฑ้ๅๅ็ไบไธ่ตทๅ่ฝฆๅผๅ็่ฝฆ็ฅธ\n451504 0 ๅธธๅทๅธ่ฅฟ็ปๅ้ซ้ๆฑๅฎๆนๅ่ฝฌๆญฆ่ฟ็ปๅๅบๅ้ๆ ็บฟๆฝๅทฅ็ปๆ\n1000 Processed\n classify content\n452000 0 ๆฉ็พฏๅบง๏ผโ
โ
ๆฑๆ็พไนๆตๆฑๅ
ญๅคงๅฒๅฑฟ\n452001 0 ็ๆ่ขซๆๆฟ่ฏบโๅค้นๅๆญปๅ้
ฌ่ฐข500ไธโ\n452002 0 ๅจไป100ๅฒๆถๆพๅ ๅจไธญๅปไธญ่ฏๆน้ข็ๆฐๅบๆๅฐฑ่ทๆฟๅบ็็นๅซๅฅๅฑ\n452003 0 ๅพๆๆพ7ๆ็ๆๅไธๅจๅชๅฉไธๆฌกๆฐ่ก\n452004 1 ๅฐๆฌ็ไผๅๆจๅฅฝ๏ผๆๆฏไธไบ็ฏ็พ็็่ฑ้
ไธๆ็็พๅฎน้กพ้ฎ๏ผไธๅ
ซๅฆๅฅณ่ๅฐๆฅ็่ฑ้
ๆฐๅๆฐดx.xๆ๏ผ็นๆ ...\n1000 Processed\n classify content\n452500 1 ๅฐๆฌ็ๅฎขๆท๏ผ้ฆๅ
ๆไปฃ่กจๅพก้ฆๆ่ฐขๆจไธ็ดไปฅๆฅ็ๆฏๆ๏ผๆฌๅบๅฐๅจๆฅ่ไธพๅๅ
จๅบ้
ฌๅฎพๆดปๅจ๏ผๆถ่ดนๆปกxxxๅ
...\n452501 0 ๅไบฌ็ต่ฐทๅฏบไธไธ่ค็ซ่ซ้ฃ่็พๅฆ็นๆโโๅจๅไบฌ็ดซ้ๅฑฑไธ\n452502 1 ๆฌๅบๅบ็พไธฝx.xๅฆๅฅณ่๏ผๆทปๅงฟๅๅฆๅๅบๅ
จๅบไนฐๆปกxxxๅ
้xxxๅ
่ไธ้ข่๏ผๆดปๅจๆถ้ดx.xโโx...\n452503 0 ๅด็ปๅธๅงใๅธๆฟๅบ็ๅทฅไฝ้็น\n452504 0 ๆๆ
ๆฟๆไธชๆ ๅฝ็ๅป็ๆๅธๆๅฐๅญฆ\n1000 Processed\n classify content\n453000 1 ใ่ฑ็นๅฆฎไธใๆญฆๅนฟ่ฑ็นๅฆฎไธไธ.ๅ
ซ้ๆ :ๅณๆฅ่ตท่ณx.xxๅ
จๅบxxๅ
่ตท๏ผไฝ่ณx.xๆ๏ผ้้็ปๅ
ธๆฌพๅ
จ...\n453001 1 ๅฏปๆพ้กน็ฎๅไฝ๏ผ่ดขๅๅฝๅ
ไธญๅฐๅไผไธๅๅฑ๏ผๆฐๅปบใๆฉๅปบใๆฐ้กน็ฎ็ ๅ๏ผๅไธๅไฝ๏ผๆ็ปญ็ฎๅ๏ผๅฐไฝๅฟซ.้...\n453002 0 ๅจPowerSystemsไธ\n453003 0 ่ณๅฐๅ
็ ธxxxx่ฌๅ
โฆโฆ่ณๆบ็จ็ผบใๆฌๅๅคฑ็ฏ\n453004 0 ไฝๅซๅข
็ไธไธๅฎๅผๅฅฝ่ฝฆ่ๅธๅบไนฐ่็่ๅคชๅคชๆฒกๅๆฏๆไธช้ขๅฏผ็ๅฎถๅฑไฝ ๆฅ่งฆ็ไบบๅคไบๅฐฑไผๅ็ฐ่ฝๅๅผ็ไธไธ...\n1000 Processed\n classify content\n453500 0 ไธ่ฟฐไธคไธชๅบ้6ๆๅบ็่ก็ฅจไปไฝๅพ่ฝป\n453501 0 ้ฆๅ
๏ผๆไปฌ่ฆๅจๆก้ขไธๆฐๅปบไธไธชๆไปถๅคน\n453502 0 ๆฅ็
ง็ๅๆ็กฎๅฎไธ่ฝ่ทๅไบฌๆฏๅ\n453503 0 ็ฌฌๅๅญฃไธญๅฝๅฅฝๅฃฐ้ณ็ฐๅบๆผๅฑๅจๆฐไผฆ็่ๅไฝ\n453504 0 ๅฎๆๅจ่ช็นๅซ็็ๅฅ380km/h\n1000 Processed\n classify content\n454000 0 ๅฐๅท้ ็ๆ่ขญๅบๅ็ฐๅจ่ฝฌๆ็ตๅฝฑๅ\n454001 0 ๅจ่ฃ่ฐๅดๅฅๅฐดๅฐฌไบๅจๆ็ฅจ๏ผไฝ ๆฏๆ่พๅ
ๆๅฐ้่ฟๅฅฝๅฃฐ้ณๅ\n454002 1 ๆฐๅนดๅฅฝ๏ผๆญๅๅ่ดข๏ผๆๆฏๅๆๅ
ดๅ
่ฃ
ๅจๆ็ๅฐ้๏ผๆๅธๅทฒๆญฃๅธธไธ็ญ๏ผๅจๆฐ็ไธๅนด้ๅธๆๆดๅฅฝ็ไธบๆจๆๅก๏ผ...\n454003 0 viaๆญฆๆฑๆๆฅ~ๅไปฌๆณ่ฏด็นๅฅ\n454004 0 ไธ่ตทๅบ้จ่ขซ่ฏดๅ่ขซๆๅธฆ/ๅฎถ้ฟๅธฆๅฐๅญฉ\n1000 Processed\n classify content\n454500 1 ๅไฝๆๅ๏ผๆฌ็ฌ่ๆไพไผ่ดจๆฅ็ณป็บฏ็ง็ง็ฐ็ฌ๏ผ่ต็บง๏ผ๏ผๅฎถๅ
ป๏ผๆ้่ฆ็ๆๅๅฏ่็ณปๅพฎไฟกxxxxxxxxx\n454501 1 ๆจๆฆ็ฎ่ฎฏ:ๆจๆธ
ๅฟ ๆๆๅๅถไปฒๆณๆๆๅฐๅจ้ๅธไผ ๆ่็ ๅ็งๅคไน ็็ฌๅฎถ็ง็ฌ ๅฐ็น:ๅผๅพทๆฅผxxxxx ...\n454502 0 ๅ้ฃไธชๅผบๅฅธๆไบบ็ฏ้ฝๆฒกๆๆณๅฐ\n454503 0 ่ฟๆณๅๆดพไบบๅฐฑๅพ็ฅ็ๆ็้็งๆฅๆดพไบบ้ฎใๆฅๆๅฌ็ๆฏไป\n454504 0 Gxไบฌๆฒช้ซ้็ฑไธๆตทๅพๅไบฌๆนๅๆ ้กๆฎตไปxxxxK่ณxxxxKๆฝๅทฅ็ปๆ\n1000 Processed\n classify content\n455000 0 ๆฏไธๅค็ป่้ฝๅ ๆญค่่่ ๅฏธๆญ\n455001 0 tagๅๅผนๅนไธ่ฆๅคช็็ธwww\n455002 0 ๆไปฅๅฎคๅ
่ฎพ่ฎก่ถๆฅ่ถ่ถๆฅๅๆ็่ฝฏ่ฃ
ๆญ้
ๆฅๅ็ฐ็ๅฎคๅ
่ฎพ่ฎกๆๆ\n455003 0 ๅฏๅ
่ดน่ทๅพ36ๅจๅนดๅบๅ
่ดนๅฎๅถ่ฅฟๆ็คผๆไธๅฅๅฆ\n455004 0 ๅจ360ๅ่
พ่ฎฏไน้ดๆ้ๆฉไบ่
พ่ฎฏ\n1000 Processed\n classify content\n455500 0 ่ฆๆฑ่ฟxxxxๅๅๆไธญๅญฆ็็ปไธๅฐฑ่ฏปๅฟๅๆฐๆ้ ็ๅดๆไธญๅญฆ็ๅธๆ\n455501 0 ้ๅๅธๆฃๅฏ้ขๆฃๅฏ้ฟ้ซ่ฟๅๅจๅธๅฝๅ่ตๆบๅฑๅผๅฑโๅญฆๅฅฝๆณ\n455502 1 ไผ่ฎฎๆ้ดxxๅ้ขๆบไป
้xxxxๅ
๏ผไป
้ไผ่ฎฎๅฝๅคฉ๏ผ๏ผๆขๅฐๅณๆฏ่ตๅฐ๏ผ็่ฏ็ๆๅพ
ๆจ็ๅคง้ฉพๅ
ไธด๏ผ็ฅไธ...\n455503 0 ๆณ้ขๅบๅฝ็ปง็ปญๅฎก็ๆฐ้ดๅ่ดท็บ ็บทๆกไปถ\n455504 0 ไบบๅฟ้ฉๆถไธญๅฝ็คพไผๅคช่
่ดฅ\n1000 Processed\n classify content\n456000 0 ไธๅคฉไธๅคโโไธๆๅฒใฎHigh็ฟป่็ฟป่ชๅฉๆธธ\n456001 0 ็ทๅญฉ๏ผโ้พ้ไฝ ็้ฃๆบๆฏ็็โ\n456002 0 ๅฏนไบๅฆไฝไพๆณ่ฎข็ซ้ๅฑๆฏซๆ ไบ่งฃ\n456003 1 ๆ่กๅฉ็ไปๆฌๆฅ่ตทไธๆตฎxx%๏ผๆ้้ขๅๆ้่ฆๆฑ๏ผ๏ผๅไฝๅฐๆฌ็ๅฎขๆทๅฆๆ้่ฆ๏ผๅฏ็ต่ฏ่็ณปใๆ่ฐขๆจๅฏน...\n456004 0 ้ฃๆบๅไธๅคฉๅฐฑๅบไบ่ฝฌไบไธไธชๅไธๆฅไบ\n1000 Processed\n classify content\n456500 0 ๅฎๅ
ดๆฐๅคฉๅฐๅนฟๅบๅฐ้ฉฌๅๅบๅผๅทฅๅคงๅ\n456501 0 ่ฟ้ๆพๆฏๆฑชๆฟๆไผช้ฝๅไบฌๅ็ไธดๆถๅๅ
ฌๅค\n456502 0 ๅคงๅฎถ็็ๅผ ่ๅจๆฌๆ24ๅฅฝ็ด็ฝ็ๅๆจ่ๅคงๅฎถ่ท่ธช็ไธช่กไธญๅฝๅซๆ\n456503 1 ๅฐๆฌ็ๅฎขๆท:ๆฌงๆดพๆฐ่ฟx.xxๅทฅๅๅคงไฟ้!็นไปทxxxxๅ
:่ถ
้ฟx็ฑณๆฉฑๆ๏ผ็ณ่ฑ็ณๅฐ้ข๏ผ้
็ญ้ๅธ็...\n456504 0 ่ขซ้ชไบๅคๅนดblxไนsaybyeไบ\n1000 Processed\n classify content\n457000 0 ๅๆฌข้ฃ้็่ๅปบ็ญๅฑ้กถไธ็็ปฟๆ \n457001 0 ไธไธชๆ่็พๅง็ๅฑ
ไฝๆๅฅๅบทๆๅๅๆ่ฒๆๆฟๆฅๆๅจ็ปๆต็ๆฟๅบ\n457002 0 ๅๅฒธๆณ้ข็ๅญไผ้ข้ฟ็็ญๅญๆๅๅฐๆฑๅๆณ้ขไบคๆตๅบง่ฐ\n457003 0 ไปฅๅ็ต่่ฆi7็ๅไธไธ่ฆไนฐ็ฌ่ฎฐๆฌ\n457004 0 2ๆๅฃๅ
ฌ่ทฏๆๅฃๅ
ฌ่ทฏๆฏๅจๆฌๅดๅณญๅฃไธๅผๅฟ่ๅบ็\n1000 Processed\n classify content\n457500 0 ไนฐๆๆบๅฅนไธๅคซๆ็ต่ฏๅ ้ค่็ณปไบบ\n457501 0 ๆๅ็ๆฎ่ฏดๅผๆ็็นๆ~่ฏๅฟๆฅ่ฏด\n457502 0 ไนๆๅ
จๅฝ100ๅบๅทกๅโ็ฌๅๅไธๆจกๅโๅไบซ\n457503 0 xxๅ
ๅฐฑๅจไบ้ฉฌ้ไนฐๅฐไบ่ฟๆฌ็Kindle็\n457504 0 MKCๅจ้ๆใ้ ๅทฅใ่ฃ
้
ๅ้
็ฝฎๆน้ข้ฝๅฐ่ถ
่ถ็ฆ็น็ฟผ่\n1000 Processed\n classify content\n458000 0 ้่่ๅซไนฑๅ่ฝปๅไผค่บซ้ๅไธงๅฝ\n458001 0 ๅ่
ๅฒไธๆไบไธๅฅ็ฉบ่ก่ก็ๅฃๅท\n458002 0 ๅผๅๅฏนๅๅ้ฃๅ่
ๅคโๆญปๅโ\n458003 1 ไบฒ็ฑ็ๅงๅงไธๅ
ซๅฆๅฅณ่ๅณๅฐๅฐๆฅ๏ผ็ฅไฝ ่ๆฅๅฟซไนใ่ถๆฅ่ถๅนด่ฝปๆผไบฎ๏ผๆฌๅบ้ข้จๆค็ๅ
จ้จๆx.xๆ ๅ
ป็...\n458004 0 โๅผๆฌ็ๆฐ็ฒพ็ฅไผ ๆฟๅช็บธ่บๆฏโ้ณๅทๅช็บธไผ ๆฟไบบๅน่ฎญ็ญๅผ็ญ\n1000 Processed\n classify content\n458500 0 ๅไบฌไนๆพๅบ็ฐ่ฟๆฐด่ดจๆง็ผบๆฐด่ๆฌ่ฟๅๆฐดๅฃ็ๆ
ๅต\n458501 0 ๅ
ฑๅๆข่ฎจๆตๆฑ็ๅปบ็ญๆถๆ็ไบง็ฐ็ถ\n458502 0 ่ๆไปฌๅฐฑๅๆบๅจไบบไธไบๅๆกไธๆ ท\n458503 0 ๆฒ้ณๅธๅฐ้่งๅๅพๆฐ้ฒๅบ็ๅฆ\n458504 0 ๆ่ฐขxไฝๅไป็พๅฝWPSๅพฎ่ฝฏๅ
จ็ๅไฝไผไผดๅคงไผๅๆฅ\n1000 Processed\n classify content\n459000 0 ไธๆๅฎไน ็่ต้ฑ่ต็ป้ช๏ผๆ่ชๅจ2000ๅ
ไปฅไธ\n459001 0 ๆๆบๅทๅฐพๅทๅฏไปฅๆด้ฒไฝ ็ๅนด้พ\n459002 0 ๆไธช่ฟ็น่ญฆๅ็ฎกไบค่ญฆ้ไฝๅบๅจ\n459003 0 ็ฐๅจๆๅจๅดๅฎๅฏบไบ่ ๅคงๅฆ็ๆ ผๆๅธๅฐ็่ฑ่ฏญๆบๆๅไผ ้\n459004 0 0LEๆ ๅ็iPhoneๆAndroidๆบ่ฝๆๆบไธๅฎ่ฃ
้
ๅฅๅบ็จ\n1000 Processed\n classify content\n459500 0 ๆฌ้จไปๆฏๅพท็ฅ้พ็ๆญฆๅๅคบๅพๅธๆๅคงๆณๆฎ็ซ ๅ\n459501 0 ๅๆฅๆฅไบไธไธชๅซๅ่
พ่ฎฏ็ๅไบบๆ่ตฐไบๆฏไผ้น
\n459502 1 ๆตทๅฎๆฐๅๆ
ไฟ๏ผไผไธ่ฟ่ดทใๅบไปๆฌพ็ธๅ
ณไธๅก๏ผ่ฏทๆๅไธคๅคฉ้ข็บฆ๏ผๅฐๅ๏ผๆตทๅฎๅธๆฟๅบๅ๏ผๆ็คผ่ทฏ๏ผๅฐไธๅๅญ...\n459503 0 ่ฑ่ฏญๅๅฐThefirstdayatschool\n459504 0 ็ฅไฝ ๆๅยทยทยทยทๅตๅตยทยทยท็ธไฟกไฝ ่ชๅทฑๅฏไปฅ็\n1000 Processed\n classify content\n460000 0 ๆๆบๅๅคง้จไปถ็ธ็ปงๅผๅงๅคฑ็ตๅผๅธธ่ฎค็็ๅผๅงๆ่ๆขๅชไธชๆๆบๆฏ่พๅฅฝ\n460001 0 ่ฏดๅคไบ้ฝๆฏๆณช่ฏ่ฏดๅจๆฅๅพๅท็่ทฏไธ\n460002 0 ๅจmy่ฑ็่ฏ้ขๅด่งๅพๅ
ฅ่ฟทไนฐไบไธชๅฐๆทๆทๅๅฐฑๆ่ฆๅธฆๅป้ขๅป็ไธ่ฅฟ่ฝๅจไบบๅฎถๅบ้ไบ\n460003 0 ๆๆ็็ฏ็ฝชๅ ๅญ้ฝๅฟซๆๅบๆฅไบ\n460004 0 ไธบไบๅไธไธช\"็ฎ็ปไธ็ฅ้่ฝๅฆๅๆไธๅป็ไฝ \"ๅฅ่บซๆด่ๅฐ่ง้ข\n1000 Processed\n classify content\n460500 0 ๆๅซๆณๅฎ็ๅฐไธฅๅฐฑๆฏๆๅซๆณๅพๅๅนฟๅคงไบบๆฐๅ
ฑๅๆๅฟ็ๅฐไธฅ\n460501 0 ๅฆๆไฝ ่ฟๅจ็จๆฏๅทพ้ๆๆฆๆฆ่ธไฝ ๅฐฑ็ญ็ๆฏๅญ่ถๆฅ่ถๅคงๅง\n460502 0 ๆฏๅคฉๆ้ๅไธญ่ฏๅชๆณๅฅฝ็น่ฎฉๆ็กไธชๅฎ็จณ่ง\n460503 1 xxๆ:ใๅ
ญ่ไธญ็นใโโ็พ้พ็ช็้ธก็ดโโ้:๏ผxxๅ\n460504 0 ๆๅ็็ๅฑไบๅบๆฌๅป็ไฟ้ฉๆฅ้\n1000 Processed\n classify content\n461000 0 ๆฌ้จไป้ปๆฑ็ๆญฆๅๅคบๅพๅฏๅฐ็ปตๆๆฎ็ซ ไธ\n461001 0 ๅๅป้ขๅฃไบ~็ถ่่ฟๅบ็ๅพๆๅพไธๅผๅฟ\n461002 0 ๆ็ฃๆๆบ้ๅฃฐ่ฟ็็ๆฏ้ฃ้ฆๆญๅฆwwwwwwww\n461003 0 ๆ นๆฌๅๅ ๆฏๆฟๅบๅฝ็ๅฉๅญๅๆถๅด่ฆๅป็ๅธฎไป็ซ็ๅ\n461004 0 ็ผ็พ็็ธ่ๆฏ่ฆช่2ๅญๅฃ่ฟฐไผผไนใๅฎๅ
จไธๅใ\n1000 Processed\n classify content\n461500 0 ๆณ็ฅ้ๆจ็ๅบ็จๆๆฒกๆ้ป็ฝ่พนๅ\n461501 0 ่ฎค่ฏไฟกๆฏไธบโFEELONEๆจก็น็ป็บชไบบโ\n461502 0 ROSESHIREHEZ็่ฃ
็ซ็ฐ่ฑๆ\n461503 0 ๅๆ่ฒไผผๅฌๅฐ้ฃๆบไปๆฅผไธ้ฃ่ฟ\n461504 0 ่ณๅฐ่ฆๅจ็พๅบฆไธๅ้ ไธไธชๅฑไบ่ชๅทฑ่ฃ่\n1000 Processed\n classify content\n462000 0 ๅธๆไธๆฏไผค็
ๅๅ viatwitter\n462001 1 ๅฎๅ่ฏ็ ่ถๅ
็ฌๆ็โ้่ฑโโโๅญฆๅโๅ ็ชๆฃๅ่โ่ฝๆๆ่ฐ่ไบบไฝๆฐ้ไปฃ่ฐข๏ผๅนถๆๅ่ฅๅป่ใๅ้กบ่ ...\n462002 1 ็ณๆๅฎไฟกๆฎๆ ไฟก็จๅๆฌพๅ
ฌๅธ๏ผๅ็ๆ ๆตๆผไฟก็จๅๆฌพ๏ผไธ้จๆๅก๏ผxๅๅฐxxไธ๏ผๆๅฟซๅฝๅคฉ้่ฟๆฌข่ฟๆฅ็ตๅจ...\n462003 0 ๆพณๆดฒๆ
ๆธธๅฑๆฅๅๅจๅฎๆน็คพไบค็ฝ็ปไธๅผ ่ดดไบไธๅผ ่ข้ผ ็็
ง็\n462004 0 ็ฑๆญค่ไบง็็็พๅบฆๅไธๆณ่ฑก็ฉบ้ดๅทจๅคง\n1000 Processed\n classify content\n462500 0 ไธบๅฅฝๆฟๅญ้
ๅฅฝ่ฃ
ไฟฎโโไปไนๆฏๅฅฝ่ฃ
ไฟฎ\n462501 1 ๅดๆฑ่กๆฟๆ ธๅฟๅบๅใไบจ้้ฟๅฎๅบxx--xxxๅนณ็ฑณใ่ฝป่ฝจๅญฆๅบๅ็ฐๆฟ๏ผๆจๆณไธๅฐ็ไผๆ ๅ่ถ
ๅผ ๆฅๅๆ...\n462502 0 ไฝ้่ฟๆปฅ็จๆฟๅบ่กฅ่ดดใๆฟๅบๆๆตๅๆถๆ็ญๆๆฎต\n462503 0 ๅนฟๅทๅ็ซๅฑ
็ถ่ฟๆ่ฃ
่ๅ่ฎฉไบบ็ญพๅ้ช้ฑ็\n462504 0 2015ๆฐๆฌพๅคๅฅ็ท็ง่ฃ
้ฉ็ๆฝฎไผ้ฒๅคนๅ
็ทๅฃซๆฅ็ง่ๆฌพไฟฎ่บซไธ่กฃ้ๅนด็ท่ฃ
\n1000 Processed\n classify content\n463000 0 ๅจไธ็ๆๆๅจ็CPUๅบๅๆต่ฏSPECCPU2006ไธญ\n463001 0 ็ดฏ่ฎกๆ8000ไฝไธชๅฎถๅบญๆฅๅๅๅ ๆดปๅจ\n463002 0 ไธ็จๅผ่ฏๆนใใ็ฝ่่ๅๆฑค\n463003 0 ๅทฎ็่ดงๅๆxxไธช่ๅญๆไนไธๆขๅ\n463004 1 ไบฌๅฎ่ฝฉๅฅๅบทๅ
ป็ไผๆใไธป่ฅ๏ผ่ถณๆตดๆๆฉใTEL:xxxxxxxใๅฐๅ๏ผๆฐๅตๆฐ็่ทฏxxๅทๆฆๅฎๅนฟๅบๅ...\n1000 Processed\n classify content\n463500 0 ้่ฆ็ไบ่ฏดไธ้๏ผๅ ๆฒนใๅ ๆฒนใๅ ๆฒน\n463501 0 ๅฎฟ่ฟๅช้ๆๅฅฝๅ็ๅฅฝๅ็ๅฅฝๅ็ๅฅฝๅ็ๅฅฝๅ็\n463502 0 ๅ ๆฒนๅฆ่ฟๆๅไธชๅฐๆถ็ๆถ้ดๅฏไปฅ็ปง็ปญ\n463503 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ?02\n463504 1 ๆญใ็นๆญๅๅๆพ๏ผๅ่ฝ่ดนxxๅ
/ๆ๏ผxๅนดไป
้xxxๅ
๏ผ่ฟ่ต ้็ฝ็ปๆบ้กถ็ใ่ฏฆ่ฏขๅๆปฆ่้xxxxx...\n1000 Processed\n classify content\n464000 0 ๅถ็ถๆกๅฐ็ๅพๅผๆญฃๅฅฝ้ฝๆฏ้ซ่่ฟๆฏๅธธๅท็ๅฅฝๅทงๅฅฝๅทงๅพๅผ้ซ่ๅฎไนฐไบๅคๆไบบๅผ็ตๅฝฑ็บง็ฉๅไธโฆโฆไธ็ดๅจๆชๅพโฆโฆ\n464001 0 ๅฐไฝ ็็ๅฝๆ่ตไบๅคงไผ่บซไธๆถ\n464002 0 ไฝๆดไธชๆปจๆตทๆฐๅบ็ๆๆฐๅนถๆฒกๆๅผๅจ\n464003 1 ็ป็ไฝ ๅฅฝ๏ผ็ๆๅ
ด้๏ผไธไบๅฆๆใๆๅ
ฌๅธๅฏไปฅๅ
่ดนๅ็ๅ
ๅคง๏ผไบค้๏ผๅ่ก๏ผๆตฆๅไฟก็จๅกใ็งปๅจๅธฆ็งฏๅๆ้ข...\n464004 0 ไธไธชไธๆณๅไธดๅบๅป็็ไธดๅบไธไธๅญฆ็็ๅฎไน ๆฅ่ฎฐ๏ผ2015ๅนด8ๆ6ๆฅ\n1000 Processed\n classify content\n464500 0 ๅ ไธบ่ช3ยท30ๆฟๅฐไบงๆฐๆฟไปฅๆฅ\n464501 1 xๅ
็งๆฑๆฑๅบไธๅฅๆฟ๏ผใ็ฆๆๅๅบใxxxxๅ
/ๅนณ่ตท็งxx-xxxๅนณ้้ๆฟๆบ๏ผๅๅฐ้ๅญฆๅ
ปๆฟ๏ผๆดปๅจ...\n464502 0 ๅๅพๅๆถฉๅ็ๆฅ่ๅ
ๆณ้
ๆผ\n464503 1 ๆฅๅบๆถ่ดน็่ฏๅ
จๅบxxๆไผๆ ๏ผๅ่ฟๅบๆฑค่ฃๅๅฅ้ๅฐๅง\n464504 0 ๅฎๅๆบๅจไบบโ้ณๆฌโไบไธๆตท้ซๅฒๅฑ็ปๅบๅๅฎณ\n1000 Processed\n classify content\n465000 0 ๅๅฉไฝ ๆ้ 13ๅ็พๅ
ๅธๅ ด็็ถฒ่ทฏไบๆฅญ\n465001 0 ไฝไบๅขจๅฐๆฌๅคงๅญฆๅRMIT็ๅฎถ็ๅทฅๅคงๅญฆไน้ด่ฑชๅๅ
ฌๅฏ\n465002 0 ๅ็ฎก็้ฆ่ฆไปปๅกไธๆฏๆๆฐดๆๆ\n465003 0 ๅ
ถๅฎๆ็ฅ้ๆๅทฒ็ป่ตขไบๅฏๆฏๆๅฐฑๆฏๆณๆ็ ด็ ้
้ฎๅฐๅบๆๅฐฑๆฏๆณๆ็็ธๆฟๅบๆฅ่ฎฉไปๅๅฃๆ ่จๅฏ่ฟๆ ทๅฏนๆ่ชๅทฑ...\n465004 0 ๅซๆๅไธๆฌกไฝฟๅบไบๆบๆขฐๅTvZ\n1000 Processed\n classify content\n465500 0 ็ๅฐๅธๅ
ฌๅฎๅฑ็ปไธ้จ็ฝฒๅ
จๅธๅ
ฌๅฎๆบๅ
ณๅ
จ้ขๅฏๅจไธ็บงๅทก้ป้ฒๆงๅทฅไฝ\n465501 0 ๅผ ๅธๅ
้ฉพ้ฉถ46่ทฏๅ
ฌไบค่ฝฆ่ก่ณ่ชๆตท่ทฏไธ้ๆฒณ็ซ\n465502 0 ๆฏๅน
็ซ็ถไปฅ1000็พๅ
[Cell output truncated: batch text-classification log covering records 465503 through 568504. Each record is printed as "<index> <label> <text>": a running index, a binary label (0 or 1), and a Chinese text snippet with digits masked as "x". A "1000 Processed" / "classify content" progress marker follows every 1,000 records. The raw dump arrived byte-garbled in this copy, so it is elided here.]
่ดนๆฐธไน
ๅฎน้\n1000 Processed\n classify content\n569000 0 5ๅคฉๅ่ๅญไธๅฟฝ็ถ้ฟๆ็็็บข็นๆฌกๆฅๆถ้\n569001 0 xxxxๅนดxๆxxๆฅๆฉไธxx็นๅจๅ
ๆฑๅคง่ช็ถ็ป็ปไธๅคๆ
ไบบ่ๅฟซไน็ธไบฒ็ง็ค่ชๅฉ\n569002 0 ๆฅ่ชๆทๅ็ๆจก็นCharissa\n569003 0 ๆตท้จๆปจๆตทๆฐๅบ็ตๅๆขไฟฎ็ฐๅบ\n569004 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ?ๆจๆดๅฑๆไปฌ็ๆญ\n1000 Processed\n classify content\n569500 0 ไน่ฎธไฝ ไธๅคฉๅชๆ50ๅๆ่
ไธไธค็พๅ\n569501 0 ๅ้ฒๅพ็ฌฌไบๆดพๅบๆ้ไธญๅผๅฑๆๅป็ตไฟก่ฏ้ชไธ้ขๅฎฃไผ ๆดปๅจ\n569502 0 ๆฉๆฆ๏ผๆฉๆฆๅถ็ฒพๅๆๅฉ็ป่ๅฏนๆๅค็ไพตๅฎณ5\n569503 0 ๆ ๆฎๅฐๅฉ็จ่ฟไธๆบไผๆขๅคบๆดๅฐ็ๅฎขๆท\n569504 1 ้ชไพๆฐๅฉทๅฉท็ฅๅคงๅฎถๅ
ๅฎต่ๅฟซไนไธไบๅฆๆ่ดขๆบๆปๆปใๆๅ็ฅไบฒไธๅ
ซๅฆๅฅณ่ๅฟซไนใๆฌๅบ็นๆจๅบๅ
xxxx้้ข...\n1000 Processed\n classify content\n570000 0 ๆ่ฎธไป้ฃๆถ่ตท่ฟ้ฆโฆๅไบซๅๆฒ\n570001 0 ๅๅบๆฅๅฅฝๅคๅคงๅฑๆๆบๅฅฝๅๆฌขๅช่ฝ็็\n570002 0 ๅง้บฆๆถ็ซ็ฎญๅๅคงๅฏนๆ๏ผ็ตๅฃซๅฑ
้ฆๅฐ็้ฉฌๅบๆไปฝ\n570003 0 ๆฌๅธๆ5ๅฎถๅป้ข็ๆฐๅปบ้กน็ฎ่ขซ็บณๅ
ฅ20้กนๆฐๅฟๅทฅ็จ\n570004 0 ่ฎฉๆ่ต่
ๆไบๆดๅค้ๆฉ็ๆบไผ\n1000 Processed\n classify content\n570500 0 ไปไผฐๅผใๅฎไฝ็ปๆตๆตๅจๆง่ถๅฟๅ็ปๆตๅๆฏ็\n570501 0 ๅๅๅป็ฝไธๅ็ไบไธ่พนaๅคง็ๅ่ฅๅผๅๅฐๅฐ็ฝ็่ฑๅ้ชจ\n570502 0 ๅบๅฑฑ็พไธฝ็ๆฌงๅผๅซๅข
ๆฏๅปบ็ญๅจ่ฟไปฃไธญๅฝๅฑ่พฑๅฒ็ๅบๅฒฉไธ\n570503 0 ๅไบๆกไปถๅ็ๆฐๅ้่ทฏไบค้ไบๆ
ๆฐๅๆฏๅคงๅน
ไธ้\n570504 0 ๅไธๆจกๅผๆๅณ็ไธไธชๅ
ฌๅธๆฏๅฆไฝ้่ฟๅจไปทๅผ้พไธญๅฎไฝ่ชๅทฑ\n1000 Processed\n classify content\n571000 0 ๆๅไปฌๅฏไปฅๅ ๅฅนๅซๆ๏ผlemon940422\n571001 0 ๅฎถ็ตไบงไธไธๆฟๅฐไบงๅธๅบ็ธๅ
ณๆง่พ้ซ็ๅจ็ต่กไธๆๆๅๆฌก่ฟๆฅๅฟซ้ๅข้ฟๆ\n571002 0 ๅจไธญๅฝ่ฟๆ ท็ฌฌไบๅญฃๅบฆGDPๅๆฏๅข้ฟ7%\n571003 0 ็พๅบฆ้ณไนไธบๆฏไนไธ็ป100%ๆ้ๅ\n571004 0 ไน่งๆญฃๅจไธๅพฎ่ฝฏใ็ดขๅฐผ็ญๅๅ่ฟ่กๆฅ่งฆ\n1000 Processed\n classify content\n571500 0 ๅจ่ฏขๅไธๅ็็ปๆ็่จๅฐฑ่ก้ไบ้ฝไผๅๅค??\n571501 0 ๅ็ฌฌไธๆฌก่ฃ
ไฟฎๆฐๆฟๅ็นๆตๆฐด่ดฆ\n571502 1 ็พไธฝๅฅณไบบ่๏ผๅฟซไนๅคงๆดพ้ใๅไปทxxxๅ
็็่ฑ้
่กฅๆฐด็พ็ฝ็ณปๅxxxๅ
ๅคงๆข่ดญใไป
xๆxๆฅxๆฅ๏ผ้พๆ...\n571503 0 ๅๅซ้ซไบๅ
จๅฝใๅ
จ็10ไธช็พๅ็นๅ4ไธช็พๅ็น\n571504 0 ๆๆบ้ๅ ไน้ฝๆๅฆนๅฆน็็พ็
งไบ\n1000 Processed\n classify content\n572000 0 ๅฐไธ่พๅถๅผ่ญฆ่ฝฆๅดๅ ตๅจๅป้ข้จๅ\n572001 0 ๆ่ฆๅฒ่
พ่ฎฏ่ง้ขไธ็พๅนด็ไผๅ\n572002 0 ๆฏ่ๆฃๅฏๅฎ/็ทๅญ็ๅผบๅฅธ13ๅฒๅนผๅฅณๆฃๅฏ้ข๏ผๅๆน่ชๆฟๆฆไฟๆคๆก\n572003 1 ไฝ ๅฅฝ๏ผ้ธฟ่พพ็ท็ ๅ็ฅไฝ ๆฐๅนด่ๆง!็ณๆๆๅคง้ด็็ท็ ๅไฝไบ้ปๅฎถ่็บข็ปฟ็ฏๅ ๅๅ ๆฒน็ซๆ๏ผๅ็ง้ฝๅ
จ๏ผๅ
จๅบ...\n572004 0 xxๅนด้็งฏๆๅไฝไธๅ
ๅ
จ้จ็จไบๆ
ๅ\n1000 Processed\n classify content\n572500 0 ไธไบๅฅฝๅ ๆฌก่
พ่ฎฏๆฐ้ป็้ๅคงไธ้้ข\n572501 0 ็้ฃ่ฏ็ใๅ
ฌๅฎใๅทฅๅ็ญ12ไธช็ธๅ
ณ่่ฝ้จ้จ\n572502 0 ๆจชๆฒฅไบบๆฐๅป้ขๆปฅ็จ้บป้ๆณจๅฐๅพ
ไบงๅฅๅบทๅญๅฆ\n572503 0 2011ๅนด่ณ2014ๅนด็ดฏ่ฎกๅฎๆๅ
จ็คพไผๅบๅฎ่ตไบงๆ่ต872\n572504 0 ๆจๅฟๅจๅฐ้็ๅฐไธไธชๅฆๅฅณๅธฆ็ไธไธชๅฐๅญฉ\n1000 Processed\n classify content\n573000 0 ๆๅฐฑ็ดๆฅ่็ณปไบ้ฉฌ้่ดญ็ฉๅฎๅฟไฟ้\n573001 0 ๅไบฌๆฟๅฐไบงๅผๅๆ่ตๅๆฏๅข้ฟ22\n573002 0 ๅฐๆตทๆนพ็กฎๅฎๆบ้ๅๆ่ต่ฎฉๅฎถ้ไบบๆฅ่ฟๅฌ็ๅ
ป็\n573003 0 ๅ่ธ่ฎพ่ฎกๅ้ดไบๅคๆ็น็ไธไบๅ
็ด \n573004 0 ๆงๆถ่พๅคๅค็ฑๅ่ดฉๅฑ
ๆญคไธๅธฆ่ฐ็\n1000 Processed\n classify content\n573500 0 ไฝๆฏไธบไปไนๅปบ็ญๅฅ็้ฝ้ฃไนๅค็บงไบๅข\n573501 0 ไฝ ไธๅป่ฟๆณ่ญฆๅฏไผ็จๆชๆ็ไฝ \n573502 0 ๅฏนๆๅ็ๅฆปๅญๅฎๆฝๅผบๅฅธxxไฝๆฌก\n573503 0 ๅพไธบๆตๆฑๆธฉๅฒญๅธ็ณๅก้ๆฒฟๆตทๆ่ตท็พ้ฃๅคงๆตช\n573504 0 ๅปไฝ ๅฆ็็ ด็ต่ๅงๆงฝๆ่ฆ่ทณๆฅผ\n1000 Processed\n classify content\n574000 0 ็นๅซๆฏๅจไธไบ็ฅๅไผไธๆ่ๅคงๅญฆ็็ฒพ่ฑๅ
ๆฌๆ่้ๅฎไบบๅ็ๆถๅ\n574001 0 WPๆๆบ้้ๆช่พพๅฐ้ขๆๅพฎ่ฝฏๅฐๅ่ฃๅ7800ไบบ\n574002 0 ๅ่ฝๅจ่ทๅไบฌ50ๅ
ฌ้็ๅๅฒๆ\n574003 0 2000ไฝๆฅๅๅจ6ๆถ่ฝฐ็ธๆบ็ๆฉๆคไธไปไธๆตท็ดๆๅๅๅฟๅ\n574004 0 ๆจๅจ็ไปฅไธๅฐๆนๆณ้ขใๆฃๅฏ้ขไบบ่ดข็ฉ็ปไธ็ฎก็โ\n1000 Processed\n classify content\n574500 0 ๅฐไปฅ15km/h้ๅบฆๅ่ฅฟๅๅๅๆนๅ็งปๅจ\n574501 0 ่ฟๅๅ
ฌๆฅผๅฐฑๅ้ฌผๆฅผไธๆ ทโฆโฆ\n574502 1 ๆจๅฅฝ๏ผๆๆฏๆดไฟก่ดทๆฌพๅจ่ฏข็ๅฐ้ป๏ผๆๅ
ฌๅธไธป่ฆไธบๅฎขๆทๅ็: ๆฟไบง/ๆฑฝ่ฝฆๆตๆผ่ดทๆฌพใไธญๅฐไผไธ่ดทๆฌพใไธชไบบ...\n574503 0 ไปๅคฉๆฉไธ้ๆๅฐ็ฑณ4ๆๆบๅ็บงไบไธๆๆฐ็็ณป็ป\n574504 0 ้ข้ฟๆฟๅจๅฐๆก็ๆคๅฃซๅฆนๅฆนๆ่ฏด๏ผโๅคช่ฐข่ฐขไบ\n1000 Processed\n classify content\n575000 0 ๆ ๅๆ๏ผๆๅใๅฝๅ็ผ็ใๅคด็ใๅฐฟ้ขใๆฑๅคใๅฎซๅฏ\n575001 0 ๅๅๅๅๅๅฅฝๆณๅบๅปๆ
ๆธธ้คไบๆๅฆๅฐฑๆฒกไธไบบๆ็ฉบ็\n575002 0 ็็ๅฅฝๅจๆฃฎ~ๆๅจ่ดขๅฏ้ณไน็น็นๆ
็พค้\n575003 0 ไปไปฌๅบ่ฏฅไธบๆปฅ็จไธชไบบๆ
ๆ่ๅฐ่ญฆๅฏๅฑ่ช้ฆ\n575004 0 ๆดไธชๆญๆพๅจ่ฎพ่ฎกๆไธไธช็ซๆนไฝ็ๅฝข็ถ\n1000 Processed\n classify content\n575500 0 ่ฟๆฌพไธ็ฒ็็ฑณ็็ง็คๅคน่ฎพ่ฎกๅ็\n575501 1 ๅๆน่ฑๅญxๆฅผxxๅนณๅทฆๅณ\n575502 0 ็ไบค่ญฆๆป้ใ้ปๅๅทๅ
ฌๅฎๅฑไธป่ฆ\n575503 0 ็ฎๅ็ฌฌ6่ฝฎๆๅไธๅบๆฒณๅvsๅนฟไธๅณๅฐๅผๅง\n575504 0 ๆๅ่ฝฎ่นๆถ่ดญๆทกๆฐดๆฒณ่ฐทๆๆฅๆ็ๅ่ไบๆ่ถ
ๅคงๅ้็ฟ็ณ่ฟ่พไธ็จ่น่ถ\n1000 Processed\n classify content\n576000 0 ่ฝ่ๅฟ็ญๅพ
ไธๅฐไบๅนดโโ้ๅๅๆ่ตๅฎถ\n576001 0 ๆฟๅ๏ผ่ทๅฑ้ฃๆ ผ๏ผ็พๅผ่ฎพ่ฎกๅธ๏ผ็ซฅ็ๅบ่ฃ
้ ไปท๏ผ&\n576002 0 ๅช้ๅ ๆฒนๅๆ่ตทๅ ๆฒนๆชๅฐฑ่กไบ\n576003 0 ๅธธๅท่ฅฟ็ปๅ้ซ้ๅฎๆฑๆนๅK102โK93ๅคไปๆฅๆตๅจ่ทฏ้ขไฟฎ่กฅๆฝๅทฅ็ปๆ\n576004 0 ๅฐฑๆฏ็ปๅคๆญปๅไบบๅฎถไน็ฅ้่ฐด่ดฃ่ฐด่ดฃ\n1000 Processed\n classify content\n576500 1 xxxxๅ
/ๅนณ็ฑณ่ตทๆๅบๅผๅ
ไธ่ทฏๅฐ้ๅ็ฐๆฟ๏ผๆๅบ็ญ็บฟ:xxxxxxxxxxxx/xxxxxxxx\n576501 0 ๅไบฌๅฐ้2ๅฅณๅญๆขๅบง็็ฒๅฃไธๆผโๆ่กฃๅคงๆโไธขไบบ\n576502 0 ไธๆ่พๅปๆฟๅฐไบง้ๅฎไธป็ฎก่ๅกโฆ็ถๅๅข\n576503 0 ไธๅไธปๆฒปๅป็่ฏทไธปไปปๅปๅธๅธฎๅฎๅฆ่ฏๆญๅ\n576504 0 ๆพๆงๆXboxๅจๆฅๆฌ็็ธๅ
ณไบๅก้ฟ่พพ8ๅนดไนไน
็ๆณๆฐดๆฌๅทฒๅจไธคๅคฉๅๆญฃๅผไปๅพฎ่ฝฏๅ
ฌๅธ็ฆป่\n1000 Processed\n classify content\n577000 0 ็จ็ปๆถ็ๆณๅพๆฅๅๅถไบบๆฐๅๅญฉๅญ\n577001 0 ๆณ็กไธ็ดๆฒก็ก็ๅๅ ๆฏ็พค้ๆไบบๅคงๅๅคๅจๅ็บขๅ
\n577002 0 applewatchๆ็ดขๆต้ๅชๆiPod็1/2\n577003 0 x็พๅ
/็ๅธๅ ๆ่ต่
ๆ
ๅฟง็พๅ
ๆ็ปญๅๅผๅ็พ่ๅจไผๅจๆชๆฅๅ ไธชๆๅ
ๅ ๆฏ็ๅๆฏ\n577004 0 ไบ่็ฝ+ๅไธ่ๅ่ๆ็็ตๅๅนณๅฐ\n1000 Processed\n classify content\n577500 1 xxxxxxoxxxxxxxxxoxoๅ่กๅผ ็ๅ
ฐ\n577501 0 ไนไธ่ฝไธบไบๅ้กฟTwitter็ๅทฅไฝ้คๆผๆด่ฟๆตท\n577502 0 ๅป็่ช่จ่ช่ฏญๅฐ่ฎฒๆ ้ก่ฏ๏ผๅฐ็ดไฝฌ\n577503 0 ่ฎฉๆจๅจๆณฐๅทไธๆ ทๆๅไธๅ็้ฃๅณ็็คไธฒ\n577504 0 ไปๆ22็นๅคฉๆดฅๆฐ้ปๅนฟๆญFM97\n1000 Processed\n classify content\n578000 0 ๆ่ตซๅบใ่พ็ปดใIAM27VIPๅกๅฏไบซๅๆไธๅจ9ๆ\n578001 0 ๅ ไธชๅฏๆปๅธไปค้ฝๆฏ่ดชๆฑก็ฏ็ๅ้\n578002 0 ๆ่ฐ่ขซๆฅๅค็่
่ดฅๅฎๅ็
งๆ ทๅฏไปฅไธไธ\n578003 0 ๅๆฐดๆๆไธ็ด็ธไฟกๅชๅ็ๆฑๆฐดไธๅฎไผๅพๅฐๆถ่ทๆๅฐฑๆฏๆไบบ็็ๆๅคง้กน็ฎ็ฎก็ๅๆ่ต่ชๅทฑๆฏไธ่พๅญ็ไบๅฟ\n578004 0 ้ๅ็ฌฌxx้ๅขๅๆๅข้่ฅๆ็ช้ญxx็บงๅผบ้ฃๅๆด้จ่ขญๅป\n1000 Processed\n classify content\n578500 0 ๅฐฑๆฏ็ฝ็ซไธๆๅผๅ
ณไบ่ฑๅ้ชจ็้ฃไธชไบ\n578501 1 ่ณๅจๆฅใ็น้ๆจๅๅฎถไบบๅๆฅๅ่ง่ฅฟๅๆๅคง็ๅฎถ่ฃ
ไฝ้ช้ฆ๏ผ็ฒพๅ่ฎพ่ฎกๆนๆก๏ฝ๏ฝx.ๅกๅฐๅบไธไธปๅๅฏ็ซๅณๆฝๅ...\n578502 0 ๅธธๅทๅธๆฐ่ฑกๅฐxๆxxๆฅxxๆถ็ปง็ปญๅๅธ้ซๆธฉ้ป่ฒ้ข่ญฆไฟกๅท\n578503 0 ๅไบฌ้ฆๅฎถๅไบบ่กๅ้ฆๆธ
ๅๆฅ่ขญ\n578504 0 ๅไธบP8้ๆฅ็ๆบ่ฝๅฎๅๆๆบ16Gไผๅๅข่ดญ๏ฟฅ1\n1000 Processed\n classify content\n579000 0 ็ธๅฏนไบๅไบฌๅฐ้็ๅ
จ็จ็งปๅจ4Gไธๆ็บฟ\n579001 0 ๆ นๆฎ้ๅฟๆณ้ขๆฐไบๅคๅณไนฆไปฅๅxxxxไฟๆฐไธ็ปๅญ็ฌฌxxๅทๅคๅณไนฆๅฏไปฅ็กฎๅฎ้ฉฌๆตทๅณฐ็ณปๆฐดๆณฅๅก็ฝๅ็็ฎก็่
...\n579002 0 UPไธป๏ผไผๅผๅฟ่ฆ่ๅ
ๅๅๅๅ\n579003 0 9ๆฅๅไบฌๆไปฌ่ฟๆๅฐ้็ฅจ้่ฆ็งไฟก่ดญ็ฅจ่ฐจๆ
ๅ็ฅจ\n579004 0 ๆต้ฒๅคๅคๆฐๆฏ่กฃ้ข็ณป่ด่ถ็ปๅค็\n1000 Processed\n classify content\n579500 0 ไธพไธชไพๅญไธคไธชๅฐๅญฆ็ๅจ็ฉๆธธๆๆๆฒณๆฏ่ต\n579501 0 ไธๅนดๅๅไธบ่ฃ่ๆฒกๆๅฏนๅฐ็ฑณM1ๆๆๅจ่\n579502 0 ๅนถไธๆฏipsๆ่ดจ็ๅค็น่งฆๆงๅฑๅน\n579503 1 ้
็ฒไปฌๅๆฏไธๅนดโไธๅ
ซๅฆๅฅณ่โๅฐๆฅไน้
ไธฝๆฐด็ปงๅ
่ก้
่นไบๅญฃ็นๆจๅบ็งๅฌๆฌพไฝ่ณxๆ่ตท๏ผๆฐๅxไธxๆๅจ...\n579504 0 ๅจๅ
จๅฝ27ไธชโ็ฑๅฟ้ฉฟ็ซโ็ไธๆไบบ็ฌ้ด\n1000 Processed\n classify content\n580000 0 ๅๆไบๅพไน
็ฐๅจๅทฒๅผๅง่ดจ็่ชๅทฑ\n580001 0 ๅๅทๅไธบSใMใLใXL๏ฝๅ
ทไฝๅๅทๅฏไปฅๅจ่ฏขๆ\n580002 0 ๅไบซBloodzBoi\n580003 1 ๅฐๆฌ็้กพๅฎขๆจๅฅฝ๏ผๅ
จๅๅคงๅๅบๅบๆดปๅจๅฎไบxๆxxไธxxๆฅ้้ไธพ่ก๏ผๅๅฎถ็ด้ยทๅ
จๅนดๆไฝ๏ผๆไปฌ้้ๆฟ...\n580004 0 ๆฟๅบ่ชๅทฑ็ฝ่ฒiPhone้็ปๅญๅฐๅญฉ\n1000 Processed\n classify content\n580500 1 ๆจๅฅฝ๏ผๆๆฏๆๅxxๆ้ๅฅไธฐ ้ๆป็ๅฉ็๏ผๆๅธ่ดๅไธบๆๅๅฎถไบบๆไพไธช่ดท๏ผ็ป่ฅ่ดท็ญๅ
จๆนไฝ็้่ๆๅก...\n580501 0 ๆๅจ็ๅฐๆถไปฃ4็ๆถๅไธ็ดๅจๆณโ4ไธชไบฟโๆฒกไบๆๆไปไน็ฝช\n580502 0 ๆฏๆฌกๅ้ฃๆบๅฐฑๆปๆไบๅป้ผ็ไบบ\n580503 0 ๅผ็ๅฝๆฅๅฐๆ320ๅฅไฝๅฎ
ๆชๆจๅบ\n580504 0 ็จๅบ็ฟๅฏนๅ่บซ็้ ๆไบ3ๅไปฃ็ ๅฑๆงไผคๅฎณ\n1000 Processed\n classify content\n581000 0 ๅจ้ฃๆบไธ็ๆไฝ้็ฌฆๅ้ฃไธชๅนด็ๅฎ้
ๆ
ๅต\n581001 0 ไผไผดไปฌ็ๅ ๆฒนๅๅไนๅ
ๆฅ็ๅ
จๅบ\n581002 0 ไธญๅฝๅปบ็ญ้ๅฑ็ปๆๅไผๆๅๅไฝ\n581003 0 ๅพฎ่ฝฏไธบๆฏไธๆShellIconOverlayIdentifiersๆๅฅฝไธ็น\n581004 0 ้ๅทๆ ๅฟๅปบ็ญโ็็ฑณ็ฉโJWไธ่ฑช้
ๅบไธๆฅผ\n1000 Processed\n classify content\n581500 0 ๅๆฏ็ไบๅๅคฉๅไบฌๅซ่ง็ๆฏๆฒกไฟก่ชๅฎๅ
จไธบไบๆถ่ง็\n581501 1 ๅฐๆฌ็ไผๅไฝ ๅฅฝ๏ผๆ ้กๅ
ซ็พไผดไธๆฅผPOZO๏ผไผฏๆ๏ผๅฅณ่ฃ
ไธๅ
ซ่ๅ
จๅบๆฐๅๆฅ่ฃ
xxxๅxx๏ผ่ไผๅๆถ่ดน...\n581502 1 ไนๅฎพ็พ่ดง็ฎๅฐๅกไธนๅ
่กฃไธๆใใ็ฐๆจๅบx.x่ๆดปๅจใ็ทใๅฅณๆฌพๅฅ่ฃ
ใไฟๆ่ฃคใๆฐๆฌพ็ง่ฃค่ถ
ไฝไปทx.xๆ...\n581503 0 ๅบไบๆ ๅPCๆถๆๅLinuxๆไฝ็ณป็ป\n581504 0 ๅๅจๆ็พๅพๅ่ดธๆ้ๅ
ฌๅธๆๅฐไบๅ
่ดนๅฅฝไธไธ\n1000 Processed\n classify content\n582000 0 ่ฟ่พนๅปบ็ญ้ฝๆฏๆ็นๅๆฌข็้ฃ็ง้ฃๆ ผ\n582001 0 ไฝ้
็xGRAM+xxGROM\n582002 0 ไฝไบ้่ฅฟ็ไนพๅฟๅฟๅไปฅๅๆขๅฑฑไธ็ไนพ้ต\n582003 0 ๅไปทxxxxxๅ
/ใกๅไบฌๆฒณ่ฅฟๆฐๆฟไปทๅ่ฎฐๅฝ\n582004 0 ็ทๅญๅ17ๅนดๅค็ฑ่ท่ต160ไธๆๅ่ฑชๅชๅฉๆไบฒ่ธ็ ด้จ24ๅฒๆถ\n1000 Processed\n classify content\n582500 0 ๅๅฑฑๅท็ฆๆฏๅงxxไธชๆๅๅไฝๅฐๆดพๅบไธไบบ\n582501 0 ไธๆณ่ท็ไบบๆๅคชๅค็็ปๆต็บ ็บท\n582502 0 ๆๆๅ่ฏดๅฅนๆ่ฝฐ็ธ่ฝฏไปถๆ่ฏดไธไฟก็จๆๅท็ ่ฏ่ฏ็ปๆโฆๆๆๆบๅก็็ฐๅจๆ่ฝ็จ\n582503 0 ๆ้ซ่งไผๅฏนๆๅ็ๅฎกๅคๆๅพ
ๅผ\n582504 0 ๅๅฎถ่ตๅฉ็ญ็บฟ๏ผ13608870413\n1000 Processed\n classify content\n583000 0 ่ฎค่ฏไฟกๆฏไธบโๆตๆฑ้พๆนซๆ้ฅฐๆ้ๅ
ฌๅธ็ฉๆต็ป็โ\n583001 0 ใๅ็บงๅบ้ไธบไปไนๆไธๆ้ฃ้ฉใ\n583002 0 ๆ็ปๅ่ดง4ใTSTไธ็ถWไน่ฝๅ
ฌๅธไปฃๅ\n583003 0 ๅฅฝๆฃๅฅฝๆฃๅๆๆฒกๆๆ่ๅๆไปๅนดๅๅคงๅญฆๆฏไธ\n583004 0 ็ตๆขฏๅฎๅ
จไธๅฎนๅฟฝ่งๅๆไปฌๅฐๅบ็ตๆขฏไธๆผๆ้ญๆชๅฎๅ\n1000 Processed\n classify content\n583500 0 ๅฆๅค่ฟ็บข็ฑณ2็็ปญ่ช้ฝๆฏๆธๅฆน930ๅฅฝ\n583501 0 ไธไธชๆฟๅฐไบงๅผๅๅ็ๅปบ่ฎฎ๏ผๅนด่ฝปไบบไธ่ฆๅ่ดทๆฌพไนฐๆฟไบ\n583502 0 ไธ้ฆๆญๅฑๅฐไธๅไน็ปๆ30็งๅนฟๅ\n583503 0 ๆๆจๆตๅฏ่ฝๆฏๆๆฏๆง่ฟ่ง็ๅฏ่ฝๆงๅคไบ\n583504 0 ๅฐคๅ
ถๆฏไบบๆฐๅธๆฏๅฆ็บณๅ
ฅSDR่ดงๅธ็ฏฎๅญ\n1000 Processed\n classify content\n584000 0 InternetExploreไฟ็งฐIE\n584001 0 ็ตๆขฏๅถ้ ๅไปฅๅ่ด่ดฃ็ตๆขฏ็ปดไฟฎ็ๅ
ฌๅธๅบ่ฏฅๆฟๆ
่ฟๅธฆ่ดฃไปป\n584002 0 ไปฅ538ๅ็ๆ็ปฉ่ขซๆฑ่ๅคงๅญฆๅฝๅ\n584003 0 ่ฟๆฅไฝๅฑฑไธ็ทๅญๅ ไธบ็ตๆขฏๆ
้\n584004 0 ๆพณๅฆ้ฆ้็ไธๆฌพๆญขๅณ็ณๆตๅฐ้่ๆฌพๅๅฐ็ปฟๅถๆฌพ\n1000 Processed\n classify content\n584500 1 ๅฅฝๆถๆฏ๏ผx.xๅ้ฆๅนฟๅคง่้กพๅฎข๏ผxๆไปฝๅฐๅบไธๆฌกๅณๅฏ่ทๅพไปทๅผxxxๅ
bx้ๅฝข้ข่x็๏ผๆฐ้ๆ้ๅ
...\n584501 0 ๅไบฌ่ฅฟ็ซๅฐ้็ซ็ๆฏ้็ปผๅคๆๅพๅซไบบ็ๅฟ่ชๅทฑ่บซๅจๅ
ซ็พไธชๅบๅฃ็ๆฐๅฎฟโฆ\n584502 0 ๆฅๅๆชๆญขๆถ้ดxxxxๅนดxๆxๆฅ\n584503 0 ๅๅฝไปฅๅ่ฟทไธ่ฑๅ้ชจใๅฏๆฏๆดๆฐๅฅฝๆ
ข\n584504 0 ๆๆญคๅปๅไบฌๅบ่ฏฅๆฏ่ฝ็ๅฐๆตทไบๅง\n1000 Processed\n classify content\n585000 0 ๅฏปๆพๆฑ่ๆๅฝฑๅคงๅธไธๅๆฌไบบๆณๅๅฉ็ไปฅๅไนๆฏไธๅๅฐๆๅฝฑๅธ\n585001 0 ๆ
ๆธธไธป็ฎก้จ้จ้ผๅฑ็ตๆดปๅฎๆๅทฅไฝๆถ้ด\n585002 0 ๅผๅบ็ๆไฟก็จๅก็้่ฆๅposๆบ็ๆๅไปฌ็่ฟๆฅใๆฎ้ๅทๅกไฟก็จๅกๅฅ็ฐ\n585003 0 ่ฟๅฐไฟๆถๆท้็จไธๅฏน4่ฑๅฏธ็ดๅพ็ๆๆฐ็ฎก\n585004 1 ๆจๅฅฝ๏ผๆๆฏๅ็ปๆจๆ็ต่ฏ่ฏๅคง่ดขๅฏ็ๅฎขๆท็ป็: ่กๅญๅญxxxxxxxxxxx ๅ็ๆ ๆตๆผไฟก็จ่ดทๆฌพ...\n1000 Processed\n classify content\n585500 0 ๆ่ฐขๅไบฌๅธๅ็ฎกใๅฐ็จ้จ้จๅคงๅๆฏๆ80ๅฎถๅ่ฝฆๆถ่ดน็ณป็ปๅๅฎถๅนถ่ฉๆปๅ
ณ2000ๅคๅฎถๅ่ฝฆๅบ็ป้ชๅไฝ็้ผๅ้
ๅ\n585501 0 7ๆ14ๆฅๅณ5ๅคฉๅๆฑ่ๅฎฟ่ฟๆณๆดชไธๅๅไบๅญฆ็่ขซๅๅญฆๅดๆฎด่ณๆญป\n585502 0 ๅฐๅทๆปๆฏๅๆๆ ็ผ้ธกไนๅ็ๅฅณๆงไผธๅบๅๆ\n585503 0 ๆฝ่งๅๅฏๅๅฎณไฝ ไธๆๅ\n585504 0 ่ฝฆไธ่ฏทไธ็พๅฅณๆๆบๅธฎ็ๅ
ๅผxxxๅ
ๆๅฝๅบไป็ฐ\n1000 Processed\n classify content\n586000 0 ๅ
ถๅจๅทด้ป่ฏๅธไบคๆๆ่ดญไนฐไบ็ฑ้ฉฌไป้ๅข็ไธ่ก่กไปฝ\n586001 0 ๅบ่ฏฅๆฏๅฟ
้กปๅไธ่ฐๆฅ/้ฉฌ่ช็ไผผ้ฃๆบๆฎ้ชธ่ฟๆตๆณๅฝไธญๆน่กจๆ\n586002 0 ้ฃๆบๆๅไธๆฌก่ฏ็้็ๆญปๅ
ณๅคดๅ\n586003 0 ้ผๆ ผๅฐthanbigger\n586004 1 ็นๅคงๅ่ฎฏ๏ผ่ไธๅคฉๅฐๅค่ฒไธๆๅ
้ไฝณ่ไธๅ
ซๅฆๅฅณ่ๅ่ๅๅบๅทจๆตๅ้ฆไฟ้ๆดปๅจxๆx๏ฝxๅทๅๅคฉVIPๅฐ...\n1000 Processed\n classify content\n586500 0 ๅปๅนดไธๅนดๆๆไบ่ฟไธๅผ ้ๆฑๅๅค้ฃๆฏ\n586501 0 ๅพฎ่ฝฏ่็ๆๅ ๅผ้ฎ็ๆญฃๅผๅผๅ\n586502 0 ่ฏดๆฏๆๆ9000ๅค็ๆฟไบง็จ่ฆ้็ปๆ\n586503 0 ไปๅคฉ4000๏ผๅทฒ็ปๅนฒไบไธค็ฝๆฐงๆฐ\n586504 1 ๅๅ๏ผๅไปทx-x.xไธ๏ผๆปไปทxx-xxไธ๏ผๆฌข่ฟๅฎๅฐ่ๅฏ๏ผ่ฟๆฏๆๆๆบๅท็ ๏ผๅฆๆๆนไพฟๅฏไปฅๅญไธใ็ฅ...\n1000 Processed\n classify content\n587000 1 ไฟ่ตข็ฌฌไธๅบ๏ผ xx:xx ่ท็ฒๅๅ ้ๆธฉvs้ฟ่ดพๅ
ๆฏ๏ผ็ฑๅ็ไธ็ๅๆ็ๅฅฝไธป้่ๅบ๏ผๆจ่ๅๅ ้ๆธฉ...\n587001 0 ่ดตๅท็ๆฃๅฏ้ขๅฌๅผๅ
จ็ๆฃๅฏๆบๅ
ณๅ
้ฃๅปๆฟๅปบ่ฎพ็ชๅบ้ฎ้ขไธ้กนๆดๆฒปๅจๅไผ\n587002 0 ไปๆฉไธ็ไบๅ ๆกๅไบบ็ตๆขฏ่ง้ข\n587003 0 ๆๅ
ณไปฃๅminitabanalysis็ไบๅฎ\n587004 1 --ๅซๅข
้้ๅทฅ่บๅฎๆฏๅฑ็คบ่ก็ๅคงๅผๆพ๏ผ้่ฝๅทฅ็จ้ๅธธ่บๆฏ็ๅฑ็ฐๅจไฝ ็ผๅโฆ็ญ็ญ๏ผ่ฏฆ็ปไบ่งฃใๆฌข่ฟ่ด็ต...\n1000 Processed\n classify content\n587500 0 ไปๅนด12ๆ็ณป็ป้่ฟ่ฎค่ฏๅไบๅๅจไธญๅฝ็ณๅๆดไธชๅทฅ็จๅปบ่ฎพๆฟๅๅฎ็ฐ็ตๅญๆๆๆ \n587501 0 ๅคง่ฟๅฐ่ฑกโฆโฆไบค้็ฏไบค้ๅฎๅจไธ่ฝๆญ็ปด\n587502 0 ๆ ๆฑฝ่ฝฆ+ๆฒน็ฎฑ30ๅ=ๆคๅไธๅขจๅจhhhhhhhhhๅๅๅๅๅๅๅๅๅๅๅๅ\n587503 0 ๅๅๅ
ฌๅธ่ฟไบๆธฏ็ขฑๅ็ๅ ๅบๅปบๅๆถ่ฎพ่ฎกไธบ้ฒๅคฉๅญๆพ\n587504 0 ็ฝๅ่ดจ็ไธบไฝๆฐ้ด่ดฆๅทๅ่ญฆๆน้ๆฅ\n1000 Processed\n classify content\n588000 0 ๅพๅทๆไธชๅฐๆนๅฏไปฅไบซๅ็ขงๆตท่ๅคฉ\n588001 0 ไฝๆฏ็ฐๅจ็ฎ็ดๅฐฑๅๆฎ็พๅฟ็ซฅไธๆ ท\n588002 0 ็ๆดปไธญไนไผๆๅพๅคๆฒป็็พ็
็ๆนๆณ\n588003 0 600308ๅๆณฐ่กไปฝ๏ผๅฐๅๅฝๅ
ๅคไธๆต็็ง็ ้ขๆใ้ๅขๅ
ฌๅธๅไฝ\n588004 1 xxxxxxxxxxxxxxxxxxx ๅทฅ่ก ๅขๅฅ่ดค\n1000 Processed\n classify content\n588500 0 ไธญ่น้ๅทฅ๏ผ่ชๆฏๅๆ ธๆฝ่็ญ้็นๅทฅ็จไปปๅก่ฟๅฑ้กบๅฉ\n588501 0 ๆๅฐ่ฟไธช่ฏๆปก่ๅญ้ฝๆฏNBAๅSD็็ป้ข\n588502 0 ๅป็ต่ไธๆพๅฑ
็ถไนๆฒกๆโ็้ญๅ\n588503 0 ๅตๅต่ฟ็ตๆขฏๆไฟๆ่ท็ฆปๆ็
ง็ฆปๅพ่ฟๆฏๅ ไธบๆไบ็ฆ่ท\n588504 0 ๅฐไผๅคๅฎ้ณ่ฟ็คๅป้ข่ต่ฎฏ่ฟ่กๆต็ผฉ\n1000 Processed\n classify content\n589000 0 ่ไธ่ถๆฅ่ถๆ
ๅฟไธพๆฅไผๆไธบๆๅป็ซไบๅฏนๆ็ๆๅๆๆฎตไบโฆโฆ\n589001 0 ๅพฎ่ฝฏๆบ่ฝๅฉ็Cortanaๆๅๆณ้ฒ\n589002 0 ๆทฑ่ช้ฃๆบ็บต็ซๆกๅคชไปๅฆๆ้ฉไบ\n589003 0 ๆจๆ็ฌฌไธๆฌก้้
ๅจๅ
ฐๅท็กๅคงๅๅคฉๅบ้จๅทฎ็น่ขซๅฐๅทๅท้ฑๅ
ไธคๅป้ผๅคๆธธไธญๅฑฑๆกฅๅๆฏไบฒๆฒณ็ฎ็ดไธ่ฝๅจไธฐๅฏ็ไธๅคฉ\n589004 0 ไธๅจๅปๅ็้กฟๆiphone่ณๆบไธขไบ\n1000 Processed\n classify content\n589500 0 ไปๅฏ่ฝๆฏNBAๅๅฒไธๆๅ
ทไธชๆง\n589501 0 ๆถ้ด่ฎฉไฝ ็ฌๅฐไบ็็ธๅดๆฒกๆ่กฅๅฟ\n589502 0 ้ฃไนไปไปฌ้ฝๆ่ตไบๅชไบๆๆฏๅ
ฌๅธ\n589503 0 ่ฏฅ่ทฏๆฎตๆพไบxxxxๅนดxxๆxxๆฅxxๆถxxๅๅ ้ซ้ไบค่ญฆ็ฎกๅถ\n589504 0 ๆ่ฟๆด่ฟๅฑๅฐ็จ็ต่ๆฃ็ญๅฃ\n1000 Processed\n classify content\n590000 0 ๆญฆๆฑไธญ้ขไบๅฎกๅคๅณ่ฟๅ็ทๅญไพต็ฏไปไบบๅ่ชๆ\n590001 0 ๆฎ่ฏดๅฎถ้่ฃ
ไฟฎๆ่ฟๆ ท่ๅ
ฌ้ฝๆฟๆๅๅฎถไบ\n590002 0 ่ก็ฅจไบ้ฑใๅๆฌข็ไบบๅไธ็ๆ\n590003 0 โโๆฑ่ๆฐๆฒๅธๅงไนฆ่ฎฐ่ตต็ซ็พค\n590004 1 ๅฎฝๅธฆๅ็บง๏ผๅ
็บคๅ
ฅๆท๏ผๆฐๆฅๆ็คผ๏ผxxๅ
xxxxๅ
ไธคๅนด๏ผๆๅxxๅ
ใๆฒณ่ฅฟๅฉๆฐ้็ซ๏ผๅฎ่ฃ
็ต่ฏxxx...\n1000 Processed\n classify content\n590500 0 ๅฃ็ฝๅ
ฐ็ทๅฅณ้็จๆฌพๆๆ่ๅ
่ไธๅฎๅบๅทฎๆ
ๆธธๅไนไธ็ฆๆผๅฅ้ฝ่ฝ่ฃ
ๆ้่ฆ็ๆฏ่ถ
็บง้กถ็บง่ดจ้่ฟไธๆๅฎๅ
จไธๆฏ...\n590501 0 ๅธไบบๆฐๆฃๅฏ้ขๅ
็ปๅฏไนฆ่ฎฐใๅฏๆฃๅฏ้ฟไบๅคฉๆๅบๅธญไปชๅผๅนถ่ฎฒ่ฏ\n590502 0 ๆ่ฟ้
ๅ
ตๆผไน ็้ฃๆบไปๆๅฎถ้่ฟ่ฟ\n590503 0 ็ฉบไธญ็ดๅ้ฃๆบ็ฏๆธธ้ป้ๆตทๅฒธ็ญๆๆๆ็ๆดปๅจ\n590504 0 ๆฐ็่ฟๅ ไบ่่ๅณ้ๅฅฝ้ป็ไธๆฏๅ\n1000 Processed\n classify content\n591000 0 ่ฎธๅคๆๆบๅฎ็ฐไบ็ตๆตๆฃๆต็ๆง\n591001 0 ๅ ไธบ่ฃ
ไฟฎไนฐๅฎถๅ
ทๅทฒๅตไบๅฅฝๅ ๆถไบ\n591002 0 Ins็padgram็ๆฌไธญๅซๆๅฎไฝๆๆฏๅซ้ๅฅฝ้ซๅฝไธไบๅฑ้ฝไธๆธ
ไบๆฅ??\n591003 0 ไธ้ฃๆบๆถๅ็ๅฟๆ
่ฟๆฏ่ท็ปๆบๆถๅไธๆ ทๅญค็ฌ\n591004 0 ๅปบ็ฏๅธซGn?dingerArchitects\n1000 Processed\n classify content\n591500 0 ่ฟไธๅ้ฝ้้็็ญๅพ
ไบ100ๅคๅนด\n591501 1 ๅฎๆฏๅถ่กฃๅไธๆณจ็ไบง็ไป่ฃค๏ผไธปไพๅนฟๅทๆฒๆฒณ\n591502 0 ๅคง็ฑไปๅคฉ่ฏท็่ฃๅค๏ฝ๏ฝๅคฉๆดฅ้ๆฃๆฃๅ๏ฝ๏ฝ\n591503 0 ็บฟไธๆดๆฏไปฅๅไธบๅไนๅ
ไธๆทฑๅณๅคๅฎถ็ตๅฝฑ้ข\n591504 0 ๅไบฌๅฎๅฎถๅฎถๅฑ
ๅคไป็ๆฟๅ
ๅไธๆตทๆตทๅๅฝ้
่ฟ่พไปฃ็ๆ้ๅ
ฌๅธๅฏนไบๅคไปๅๅทฅๅทฅ่ตๅฐๅ\n1000 Processed\n classify content\n592000 0 ๅ่ฟๅๅก็็ๅฏน้ข้ฃๆบ็ฐๆฅ็ฐๅป\n592001 0 ๅพๅทๅ
จ้ขไบๅญฉๆฟ็ญๆๅฟซๅฏ่ฝๅนดๅ
ๅฎๆฝ\n592002 0 ๆไปฌ่ฟๆไธไธชO2O็็บฟไธๅนณๅฐๅซ็นๆฅๅฒ\n592003 0 ไธ็ถ่ฟไผ้ชๆถ็็ฏ็ฝชๅๅญๆฉๅฐฑๆ้ฎ้ข็
ฝๅจ่ตทๆฅไบ\n592004 0 ่กไธๆดไฝๆ็ๅคงๅน
ๆๅๅๅทฅๆฟๅไธญ้ฟๆๆ่ตไปทๅผๅธๆพ\n1000 Processed\n classify content\n592500 0 FCๅฎ้ธก่ฝฐ็ธๆบ็ๅ
จๅฎถ็ฆ\n592501 0 ๅฐฑๆฏ13ๅฒ็ถๅๆๆฏไธ้ๆฉ็ปๅฎถๅบญ็ไบ้่ฟๆๅๆฌขJay็้ฃไธชไผข\n592502 0 ๅไบฌใๆ ้กใ่ๅทไธๅธๅทฒๆๅฐ้\n592503 0 ้ป้พๆฑ็ไฝณๆจๆฏๅธไบบยท่งไธๆโ\n592504 1 ๅ
้จๆธ ้ไธไธๆไฝ(ๅ
ฌๅๅก+ไบๆฅญ็ทจ+ๆ ๆญๆฟ+ๅฟๅๅ+่ฝฆ+ไฟๅ)ๅๆฏไฝ่ณxๅ๏ผ็ฎๅ๏ผไฝๆฏ๏ผ้ซๆ...\n1000 Processed\n classify content\n593000 1 ๅพทไฝ้ซๆฆไธพxxxxxxxxxxx๏ผๅคๆปฉ็ปฟๅฐๅไบบๅ๏ผๅๅ้xๆฟ๏ผxๆขฏxๆทๅพๆฟ็xx%๏ผ้ข็งฏxx...\n593001 0 ๅฏๆฏไน่งๆๆบๅฐไบๆๅฐฑๆๆ6็ปๆๅฆไบ\n593002 0 ไฝ ่ฏดๆไธๆฏ่ฏดไธไธชๆๆฅๆฌๅทๅ\n593003 0 ๅบ้ๆ ่งฃไธๅ้ข่กNBAๆปๅณ่ตๅไฝณ็\n593004 0 ๅ่ฟๆฅๅฐฑๆฏไนฐไฟ้ฉ็้กบๅบใ\n1000 Processed\n classify content\n593500 0 ๅฎ็ฐไบๅฎฟ่ฟๅธ็ธๅ
ณ้ขๅ้ถ็็ช็ ด\n593501 0 ไธ้ๅฐไปปๅก็ต่ๅฐฑๅผๅง่ฃ
ๆญปๅตๅตๅตๅต\n593502 0 ้ๅฎถๆๅ็นๅฐ้่ฏทไบๆๅฑฑๆๅฝฑๅไผ็็ไผๆ่ๅธๅๆฅ้ๅฎถไธบไผๅไปฌ่ฟ่กๆๅฝฑไธ้ข่ฎฒๅบง\n593503 0 ??ไธไน
ๅๅท็ๆๅๅโไผฐๅผ6ไบฟโ็ไบ่ง้พ\n593504 0 ๅไฟ้ฉๆ ไธฐๅฏ็็บฟๆกๅธๆพ่ฟๅจๆ\n1000 Processed\n classify content\n594000 1 ไฝ ๅฅฝ๏ผ ไธ็ฑปไบบๅๅฒไฝ่ฏไนฆ่่ฏ ๅ
่ฟใไปฅๅๅฏไปฅ็ดๆฅ่ทๆ่็ณปใๅฐๅๆญๅทๅธๆๅทฅ่ทฏxxxๅทๅ้จaๅบงx...\n594001 0 ๆญฆๆฑๅฐ้ไธคๅฅณๅญๆขๅบงไธๆผโๆฏๅๆ่กฃๅคงๆโ\n594002 0 ๅ่ๅฎๅฎๅ
ซไธชๅๆๅฆไธไธชๅฎๅฎๆฒกๆ่ฟๆ ท็ๆ
ๅต\n594003 0 ๅพๅทๆฒๅฟไบค่ญฆๅคง้้พๅบๅ
ฌๅฎๆฃๆฅ็ซๆฐ่ญฆๅจไพ่กๆฃๆฅๆถ\n594004 0 ๅๅคง็ฏ้็จไบไธปๅจๅผLED็ฏๆบ่ฎพ่ฎก\n1000 Processed\n classify content\n594500 0 ๅฎๅ็่ฟ็จๅป็ๅจ่ฏขๆๅกๅนณๅฐ\n594501 0 ๆฅ่ชๅ
จๅฝ็109ๆฏไปฃ่กจ้ๅๅฐ็ซๆๅฑ็ปๆดป\n594502 0 ็ถๅๅๅพฎๅๆๅบๆๆบ้็็ฌฌ7ๅผ ็
ง็\n594503 0 ็ผๆณชๆตไบไธๆฅ~ๆๆ็ฝ่ชๅทฑ่ฟๆฏๅฐๅญฉ\n594504 0 e็ๆกถๅๆไนๅฐฑๅช่ฝๆ่ไธ้ด้ดไบบ\n1000 Processed\n classify content\n595000 0 ๆฐๆฌพๆฌง็พๅคง็ๅๆฌพๅฎๆๅพไธๅฆ่ถ
็บงๅฅฝ็๏ฝsmๅพๆพ่บซๆๆฐ่ดจๅๅชๅ
้ฎ๏ฝ\n595001 0 ่ง้ขๆฅ่ชๅผ ๅฝ่ฃxxๅนดๅคๆฅไผฏ็ตๆผๅฑไผ\n595002 1 ๏ผๅ
จๆฌพ่ฝฆxๅx๏ผ่ดทๆฌพ่ฝฆxๅโฆโฆๆฌข่ฟๆจๆฅ็ตๅจ่ฏขxxxxxxxxxxxไบใไนๅฏๆทปๅ ๅพฎไฟก่ดฆๅทxxx...\n595003 0 ็ฌฌไธๆฌก่งๅฐไฝ ็ๆถๅๆฒกๆขๅคบไฝ ็ๅคง่็ๆฏๅคชๅฅฝไบ\n595004 0 ๆญป็ฅ็็พๅบฆ่ฟๆปๆๅๆๅบๆฌบ่ดๆๅฐๅ\n1000 Processed\n classify content\n595500 0 ่ฟ่xxxx่ฝฆไฝๆฑฝ่ฝฆๆป่ฃ
่น\n595501 0 ไฝ็
่ๅธ็ฅจ่ฝไธ็ฎ่ฟๆไนใๅฅฝ้พ่ฟ\n595502 0 ๆ
ๆธธๅฐไบงๆชๆฅๆฟๅฐไบงๅธๅบๅๅฑๆนๅ\n595503 0 ไบบๅฝขๆข็ขผๆฎบไบบๆก็ๅ
ๆ~ๆๅพ็ตๆผๆญปไบ~ๅฃไบบ็ไธๅ ด้ฝไธๆๅคชๅฅฝ~ๅๅฎฎ้ๅจ้้จๆฒๆผๅพๅพๅฅฝ~ๅฎณๆ้็้็ฝต\n595504 0 ๆๆ่ก็ฅจๅ
จๆๆ ่กไธ่บซ่ฝป\n1000 Processed\n classify content\n596000 0 ่ฟๅชๆบๅจไบบ็ฉๅถๅฐฑ็ฆปๅฎถๅบ่ตฐไบ\n596001 0 xใไธบๅญฆ้ขๅ้กนๆดปๅจ็ๅผๅฑๆไพ็ธๅบ็็ป่ดนใ็ฉ่ดจไฟ้\n596002 0 ๅๆณ๏ผ1ใๆดๅๆฉๅญๅจ็ๆฐดไธญๆตธๆณกไธไผ\n596003 0 ๆฉๅนดๆฏไธไบๅไบฌ่บๆฏๅญฆ้ข็พๆฏ็ณป\n596004 0 ไปๆฏไธไธช่ๅไบบไปๆฒกๆไปปไฝ่ดชๅฉช\n1000 Processed\n classify content\n596500 1 ๅฅฝๆถๆฏ๏ผๅๆๅ
้ผไฝณๅฎๆฃฎ้
xSๅบ่ฟๆฅๅทจ ็ฎ๏ผไฝณๅฎVxxๅ
จ็ณปๆ้ซ็ด้xxxxๅ
๏ผไฝณๅฎVxxๆฐ่ฝฆ็ซ...\n596501 0 ็ไบ่ฟๆฏๆไบ่ฎธๆ่งฆCBAไนไธๆๆดๅ ้พไปฅๆณ่ฑกNBAๆๅคๅๆๅชๆณ่ฏดๆๆฏไธไธช็ฏฎ็็ฑๅฅฝ่
\n596502 0 ๆฉไธ็ฅ่ฏดไบๅคๅฐ่ฟๆณ็ๅ
ๅฎนไบ\n596503 0 ๆ ้กไบบๆๅฅไฟ่ฏ๏ผๅฐๆ้้ป้ณ่ตไบบๅ\n596504 0 ๆ่ๅ็ๅๆฐดๆนๆณใ็็ ๆณใ้พๆฒณๆณ็ญๆฐด่ตๆบ\n1000 Processed\n classify content\n597000 0 ๅฝๆงๆณๆบๅ
ณ่ท่ฟๆณๅไฝ็ฉฟไธๆก่ฃคๅญ็ๆถๅ\n597001 0 8็น็้ฃๆบไธ็ดๆจ่ฟๅฐ11็น\n597002 0 ๅฃฐๆงๆชๅพๆจกๅผโโๅๅฉๆๆบ็้บฆๅ
้ฃ่ฟ่กๅๅซ\n597003 0 ๅผๅๅบๆณ้ข่ขซ็กฎๅฎไธบๅ
จ็ๅธๆณๅ
ฌๅผๅทฅไฝ็็บง็คบ่ๅไฝ\n597004 0 ๆๆๆๆ่ง้ขไผ ๅฐ็พๅบฆไบ็ถๅๆๆบ้็ๅ
จๅ ๆไบ็ฐๅจ้ฎ้ขๆฅไบ\n1000 Processed\n classify content\n597500 0 ๆ่ทๆฝ้x่ณxๅนด็ฝไธ้็ฏxxxๅ\n597501 0 ไธๅๅบๆฟๅบๆไธฝ็ผๅฏๅบ้ฟไธ่กๅฐๆ่ๆ กๅบๅฏนๆ็ๆ
ๅต่ฟ่กไบ็ฐๅบๆๅฏผ\n597502 0 ไปฅๅๆ่ฆๆฏๅนฒๆคๅฃซ่ฟไธช่กไธๆ่ฆ็จๅไธชๆๅทฅ่ตๆฅไนฐไฟ้ฉไธไธ้ญ็ ไบๆ่ฏปๅป็ๆๆฌ่ฟๆฒกๆถๅๅฐฑๆญปไบๆๅฎถ้ไบบ...\n597503 0 ็ฌ่ฃ=้ๆ=ๅจๆ=ๅผบๆ=ๆดๅ=ๅไบบๆง=ๆฟๆฒป่
่ดฅ\n597504 0 ไฝ ๆฟๅๅฏนๅฐๅทๆฝๆดๅฅๅคบๅฐๅท็็ๅฝ\n1000 Processed\n classify content\n598000 0 ่บซไฝๆฏ้ฉๅฝๆฌ้ฑๅฐฑๆๆกๅจไฝ ่ชๅทฑๆไธญ\n598001 1 ใๆพณ้จ้ๆฒ่ตๅบ็ด่ฅใๅซๅปๆพณ้จ่ตๅบๅคช้บป็ฆไบ๏ผ็ฝไธๅผๆทๅฐฑ่ฝ็ฉ๏ผๆๆๆพณ้จใ่ฒๅพๅฎพๆฟๅบ้ขๅๅๅฝฉๆง็
งใ...\n598002 0 ๅณๅจๅไธ็ฉบ้ดไธๅฑ
ไฝ็ฉบ้ดไธญๆๆๅฏ็งปๅจ็ๅ
็ด ็ป็งฐ่ฝฏ่ฃ
\n598003 0 ไบฒๆณ้ป======็น้ฃ็ฅจ็่งๅพๅ้ๅฏไปฅๅ ๆๅฅฝๅ๏ฝๅธฆไฝ ่ฟ้ข้ๅฐไบ่งฃๆธ
ๆฅๅ\n598004 0 ไปไปฅไธบๆฏๅ ไธบไป็ๆญฃไนๆณๅพๆ่ฏ\n1000 Processed\n classify content\n598500 0 ๅฆๆๅฐฑๅไธๆก่ท็ตๆขฏๆๅ
ณ็wbๅฐฑ่ขซๆฐดๅDTไบ\n598501 0 ็ไบไฝ ้ๆฌก็้จฐ่จ่ฆ้ ป็ดๆญ็ๆผๅฑๆ\n598502 0 ๆทฑๅณ้พๅฒๅๅฑ้พๆฐๆดพๅบๆๆฐ่ญฆๆๆๅคๆฅไบ็ซๅ\n598503 0 ่ฝ็ถๅฌๅฐไบ้จๅฃฐ็ถ่ๆ็ต่้ปๅฑไบ\n598504 0 ่่ๆฏๆๅจ็ฆๅทๅ่ฟไธๆฑ่ๅฃๅณๆไธบๆฅ่ฟ็\n1000 Processed\n classify content\n599000 0 ๆไปฅๅข้ข็่ฃ
้ฅฐๆฏ่ฃ
ไฟฎ่ฟ็จไธญ้่ฆ็ๆฝๅทฅ็ฏ่ไนไธ\n599001 0 ไธๆ่ฟไปฅไธบๆฏ้ฟ้ใ่
พ่ฎฏ็็ไบฒ่ชๆๅไปๅ
ฅ็\n599002 0 ๅฏนไบๆฐๆฒๅธ้ปๅ ไธญๅญฆ8ๅนด็บงๅญฆ็็ๅญ่ฒๆฅ่ฏด\n599003 0 ๆๅพๅฅฝๅๅ่งไธชๆ
ๆธธๆฏ็นไธๆ ท\n599004 0 ๅๆ
ๆ้๏ผ่ๅทๅคงๅญฆ็ฌๅข
ๆนๆ กๅบ็ณ้บๅพไนฆ้ฆ401้
่งๅฎคๅจ7ๆ13ๆฅๆ22๏ผ00่ฟ่กไบๆธ
ๅบ\n1000 Processed\n classify content\n599500 0 ไบบ็็่ดขๅฏไธๆฏไฝ ๆฅๆๅคๅฐ่ดงๅธๆฅๆๅคๅฐไธๅจไบง่ๆฏไฝ ๅธฎๅฉไบๅคๅฐไบบๆฏๆไบๅคๅฐไบบๅฝฑๅไบๅคๅฐไบบๅจไฝ ็ฆปๅผ...\n599501 0 ๅฝข่ฑกไผผๅๆจก/้ผๆ/้ผๆ/้ผๆ\n599502 0 ็่ฆ่ฎฉๆ็ญพๅญๆ็ๆholdไธไฝ\n599503 0 ๅฎไปฌๅฐๅจๅก่ฝฆๅ้ฃๆบไธๅบฆ่ฟ30ไธชๅฐๆถ\n599504 0 ๅฝปๅค็็ญ้ฃๆบๆๆ็ปไบ็ปไธไบๅฅๅท\n1000 Processed\n classify content\n600000 0 ไฝฟ่่ๅบ็ฐ้ตๅๆงๅ็ๆๆงๆถ็ผฉ\n600001 1 ไบๆฌๆธ
ไป๏ผไธไปถไธ็๏ผๅ
จๅบๅๅไฝ่ณxๆ๏ผ่ฏทไบฒไปฌ็ธไบ่ฝฌๅ๏ผๅนถ็ปไบๆ็ฒพ็ฅไธ็ๆฏๆ๏ผ่ฐข่ฐขๆ ้กพ๏ผ\n600002 0 ้ข้จๅจUVๅ
ไธๅฎๅ
จๆด้ฒๅบ็็ฎๅฑ็ฐๅจๅๆฝๅจๅญๅจ็้ฎ้ข\n600003 0 ๅฏนไบ็ฐๅจ็ญ่ฎฎ็ๆๆๅๆฟไบง็จ็ๅพ็ผด\n600004 0 ๆบๅจไบบไปฌ่ซๅๆณๆๆณช็นๅๅๅๅ\n1000 Processed\n classify content\n600500 0 ็คพๅบๆฏๆๆณๅพๅจ่ฏขๆๅกๅๆฌข่ฟxๆxxๆฅ\n600501 0 ๅฅฝๅฌใๅ่ๅทๅ่ใ้ฆใๅฅฝๅใๅ่ๅทๅ้
ใ้ใๅ้ฆ็พ\n600502 0 ๅจไธๅฎถๆๆฐๆ่ตๅๅพๅพ็็ๅบ้ๅ
ฌๅธ้\n600503 0 ไปไปฌๅฐฑ็ๆๆไบๆๆ็ๆญฃ็้ข็ฎ\n600504 0 โโๆๅชๆฏๆฆไบ้ฒๆ่ๅทฒไฝ ็ๆฏๅญ้ปๅคดๅบๆฅไบๅ ไธบๆๆฒกๆ็ฒ\n1000 Processed\n classify content\n601000 0 ๅทฅไฟก้จๅฐ้็นๆจ่ฟๅทฅไธๆบๅจไบบๅจๆฐ็็ญๅฑ้ฉไฝไธ่กไธ\n601001 0 ๆ็ต่็ง็จ็ๆไปถๅคน่ฟๅซๅๅณๅ
ดๆๆ\n601002 0 ่พๅ
ฅๆ็ๆจ่ไบบไปฃ็ โ44sxs5โๅฐฑๅฏไปฅ่ทๅพ่ถ
ๅผ้ญๅนปๅก\n601003 0 ็พ่คถ่ฑ่พน่ฃ่ฃค็พๆญๆพ็ฆ่ฃ่ฃค่ฎพ่ฎก้ฒ่ตฐๅ
ๅฆๅฐบ็ S้ฟxx่
ฐๅดxxM้ฟxx่
ฐๅดxxL้ฟxx่
ฐๅดxx\n601004 0 ๅ
จ็ๅ
ฑๆฅๅค้็นไบค้่ฟๆณ่กไธบ448019่ตท\n1000 Processed\n classify content\n601500 1 ไบฒไปฌไธๅนดไธๅบฆ็x.xๅฅณไบบ่ๅๅฐๅฏ๏ผไธบๆ่ฐขไบฒไปฌไธ็ดๅฏนไบฌๆถฆ็็ ็ๆฏๆใๆๅธๅฐไปxๆxๅทๅฐxๆxๅท...\n601501 0 1958ๅนดไนพ้็ๅฐๅฎซๅ
ฅๅฃๅทฒ็ปๆพๅฐไบ\n601502 0 ๅฎๆดๅฎๆๅฑฑไธๅ็ฎกๆ ก้จๅฃๆ6ๆฌ่ไบบ้ญๅไธญ็ๅดๆฎด\n601503 0 ้ฃๆบไธ็ไนๅฎขๅจ็ปๅไบๅง็้ข ็ฐธใๅ็ดไธ้่ฟxๅ้ไนๅ\n601504 0 ๅป็่ฏดๆฏ็กๅพๅคชๆๅ
็ซๅๅคชๅทฎๆไผ็
ๆฏๆๆ็\n1000 Processed\n classify content\n602000 0 ็ฆปๅฎถไธ้็xxๅฒๅฐๅนดๅจ่ฟไธๅปๅพๅฐไบๆๆๅๆฅ\n602001 0 ๆฑ่ๅไบฌๅธ็งฆๆทฎๅบ้ฟๅนฒๅฏบโไฝ้กถ่ๅฉ\n602002 0 ไธๆฌก่ฟๆไธชๅจๅฐ้้็จ็ฌ็ป็\n602003 0 ไปไปฌไผ้่ฟxxๅ้ๆ็่ๅฐ่กจ็ฐ่ฟ่ก็ฐๅบๆๅ\n602004 1 ๆจๅฅฝ๏ผๆฌข่ฟ่ด็ต่ฏไฟกๅผ้ๆๅกไธญๅฟ๏ผๆๅก้กน็ฎ๏ผๅผ้๏ผไฟฎ้๏ผๆข้๏ผๆข้่ฏ๏ผ้ๅฎๅๆฌพ่ถ
B็บง้่ฏใๅ
ฌๅฎ...\n1000 Processed\n classify content\n602500 0 ็ฌฌไบๅญฃๅบฆ่นๆMac็ต่็ๅ
จ็ๅบ่ดง้ไธบ510ไธๅฐ\n602501 0 ไธญ่ชๅ
็ตไธๆบๆๅไนฐๅ
ฅxxxxไธๅ
\n602502 0 7ใไธๆฏ้ฃ้ฉๅฏๆงๅฉๆถฆๅฎนๆ็ฟปๅ็่ก็ฅจ\n602503 0 ่็ณป็ต่ฏ๏ผ133681XXXX\n602504 0 ไปๅนด็ไปปๅก๏ผ1ๅฎๆๆถ่ดญๅ
ฌๅธ็ไบๆ
\n1000 Processed\n classify content\n603000 0 6ใๅธธๅทๅธ้ซๆฐๆๆฏไบงๅ่ฎคๅฎ่ฏไนฆ5ไปฝ\n603001 0 ๅ
ๅญ่ขซๆๅฐ็ๅฐ็นๆฏUnterdenLinden41\n603002 0 ้ป่ๅฐไธบ่ฐทๆญLollipopx\n603003 0 ไปๅนด้ฆๅฑ5000ๅๅๆ่ฎขๅๅฎๅ็ๅทฒๆฏไธ็ฆปๆ ก\n603004 0 bluedๆๆบ็ปๅฎๆถไธๅฐ็ญไฟก\n1000 Processed\n classify content\n603500 1 ่ๅค็ฅฅ็ธ็บฆไธๅ
ซ๏ผๆฌๅท้ๆๆง่ดนๅ
จๅ
ใ้ป้ๆข้ป็ณๆๆง่ดนๅ
จๅ
ใ้ถ้ฅฐxๆ๏ผ่ดญ็ ็ณๆปกxxxxๅ
่ฟxxx...\n603501 0 ๆฐดๅนณๅคง่ดไธChromeๆไปถ็บขๆ็ฑปไผผ\n603502 0 4ๅฒไปฅไธไฝ่บซ้ซๆช่ถ
่ฟๅด่ฃๆฟ็ๅฟ็ซฅ\n603503 0 ๅน็ฉบ่ฐๅปๅป้ขๆๅคช้ณๅน็ฉบ่ฐๅปๅป้ขๆๅคช้ณโฆโฆๅจ่ๅคๅง\n603504 1 ้ซโๆฐๅญฆ่ฑ่ฏญ็ฉ็ๅๅญฆๅฐ็้บๅฏผๆ็๏ผๅๆ ๅๅธไธป่ฎฒ.ๆๅ้ข็บฆ.ๆฌข่ฟๆฅ็ตๅจ่ฏข่ฏๅฌ๏ผๆฌๅจๆๆๆฅๆฉไธๅ
ซ...\n1000 Processed\n classify content\n604000 0 ็พๅฐๆฌไบบๅไบฌไธปๅๅบ็ๆฟๅญๅบๅฎ\n604001 0 ่็ฌฌไธๆนๆฏไปๆฏไบ่็ฝ้่็ๆ ธๅฟ่ฆ็ด \n604002 0 ็ฐ้ๆถ็A/Bไธๆฅๅนดๅๆถ็็2\n604003 0 ๅ็ฐๆฒน็ๅคง่พๅฑ
็ถๅฐฑๆฏๆฑ่็้บป่พฃๅฐ้พ่พ\n604004 0 ไปๅคฉๅ็ไบไธชๆฏ่่ญฆๅฏๅปๆฏ่ขญ่ญฆไบบ็ๆฐ้ป\n1000 Processed\n classify content\n604500 0 2015ๅนดไธนๆฃฑๅฟไบไธๅไฝๅ
ฌๅผ่่ฏๆ่ๅทฅไฝไบบๅ้่กฅไฝๆฃไบ้กน็ๅ
ฌๅ\n604501 0 ไธป่ฆๆฏ้ข่ฒๅพ็พไธ็ฝไธ้ป็ๆผๆฅ็ฎ่้็็พไธฝ้ฃๅ
\n604502 0 ๅจไปๅxๆฒกๆๆพไนๅไนๆๅพๅคไบบ่ดจ็ๅๅซฃ\n604503 0 MYไธ็ด้ป้ป็็้ผไธ้ ๅนฟๅๅไบบๆฐๅ็ๆ\n604504 0 ๆฌๅทไบค่ญฆๅค็ไบๆ
้ๅบฆ็ๅฏไปฅ\n1000 Processed\n classify content\n605000 0 ่ฑๅ้ชจ่ฟ้จๅงๅฐฑๅ่ฏไบๆไปฌไธไธช้็\n605001 0 ็พคๅท166480707้ช่ฏ988ๅฟ
ๅกซ\n605002 0 ๅฐ้ไธไธไธชๅฐไธ็น็ธไธญๆ็้ขๅ
ไบ\n605003 0 1็นไปๅป้ขๅบๆฅ็ผ็ๅทฒ็ป็ไธๅผไบ\n605004 0 ไนๆๅธๆณ้ขไธๅฎกๅคๅณ้ฉณๅไบ้ๆ็่ฏ่ฎผ่ฏทๆฑ\n1000 Processed\n classify content\n605500 0 ๅป็่ฎฉๆ32ๅจๅผๅงๅ่ไฝ็บ ๆญฃๆ\n605501 0 ๆๅทฒ็ปๅจๅไบฌๆบๅบๆไบ100ๅผ ่ชๆ\n605502 0 ็ผ
็ธๆฟๅบๅนฒๅพๆผไบฎ๏ผxใๅจๆ
ไบ่ฟๆณไผๆจ่
\n605503 0 ๆ่ฐขๅ ไฝๅฐไผไผดๅธฆๆๅป็็ต่\n605504 0 ๆๆบๆ็ตๅๅธฆไบๅ
็ตๅฎๅบ้จ็ๆถๅ\n1000 Processed\n classify content\n606000 0 ๆๆบๅฌๅฐไธๆญ็่ฏญ้ณๅ็ดง็ฎๅ\n606001 0 ็จ็ธๆบๆไธไบ้ฅๆง้ฃๆบๅจๅคด้กถ็ๆ็็ป้ข\n606002 0 ๅไธๅฏUrbeats็ฐ่ฒๆฌพ่ดญไบไบ้ฉฌ้่ณๆบๆฒกๆ่ดจ้้ฎ้ขๆฒกๆๅ้ณ็บฟๆงๆญฃๅธธๅ
่ฃ
็้
ไปถ้ฝๅจๅฏๆฏๆๅนณๆถ...\n606003 0 ไฝๆฟ่ไธ็ ็ฉถ็ไนๆฅๅๆฏๆ้ๆฐๅฎ่ฃ
ไนๆถ\n606004 0 ไธๆฌก่ฃ
ไฟฎไธๅฎ่ฆๅจๆดๆ้ด่ฃ
ไธช้ฒๆฐด็ๅฐไนฆๆถ\n1000 Processed\n classify content\n606500 1 ไธๆ่ถ
ๅผๅ้ฆๆๆดปๅจ้็ฅ:xๆxๆฅๅฐxxๆฅๅจๅนฟๅทๅ
ดๅๅนฟๅบไธพๅ!ๆฌข่ฟๅไฝ้กพๅฎขๅฐๅบ!xxxxๆ็พๆฏ...\n606501 0 ไธไบ้ฃๆบไธๅบๆด้จGZ้ๆฏซ๏ฝ\n606502 0 ไฝๆฏๅฐไบๅป้ข็ฎ็นไบ่ฟไบ็ๅต\n606503 0 ๅ่้กปๆ้ซ่ญฆๆ่ฟๆฅๆๆขไผชๆฟๆไบบๅๅผๅงๅคง่ๅฎฃๆฌๆๆขๅทฒโๆขๅคๆญฃๅธธโไผๅพ่ฏฑ้ชไธๆ็็ธ็็พคไผ\n606504 0 ไปฅๅ็ไนฐ็พๆฌๅ็งไธไธ็ฑป็ไนฆ่ชๅญฆ\n1000 Processed\n classify content\n607000 0 ALBION/ๅฅฅๅฐๆปจ่ถ
ๆ่ฝฏๅๅฆๆฃๆธ้ไนณไธ็จ120ๆ่ฟไธชๆฏๆธ้ไนณไธ็จ็\n607001 1 ็ฉฟ้็่ๆฏ้ฉฌใๅ
ถไป่ฐไธไธ๏ผไปๅ็ญๆกๆฏ้ฆๆธฏ่๏ผๆฏๆๅ
ญๅๅฝฉ็ฝๅฐๅง่๏ผ้ฃไธชๅฐๅงไธ็ฉฟ้ซๆ น้๏ผ็ฉฟๅ่ฃ๏ผ...\n607002 0 ไธบไธไธช็คผๆ็ๆ
ๆธธๅๅฅฝ้ฒๆๅๅคๅ\n607003 0 ๆฟๅฐไบง่ฐๆดๅทฒ็ป่ตฐๅบVๅฝขไฝ่ฐท\n607004 0 ๆ็็ต่ฏ่ขซ่ฎค่ฏไธบๆฌๅท้ณๅ
็ฉๆตๅฑฑไธไธ็บฟไบ\n1000 Processed\n classify content\n607500 0 ไธๅฎถ๏ผไธญๅฝๆบๅจไบบๆ็ฅๆนๅไธ้่ฏฏ\n607501 0 ๅฏไปฅไธ่ฝฝ็พๆฉๅฝฑ้ณAPP็็ฐๅงๅจ\n607502 0 ๅๅซๆฏ7000ไธ็พๅ
ๅ8470ไธ็พๅ
\n607503 0 ไฝ ๅคฉๅคฉ้ฃๆบๅทก่ชๅบๅฎซ้ผๆตทๅดๆฅๆฌๅฒ่ฝฌไธๅคงๅ\n607504 0 ็ฝ่ฒTๆคๆญ้
ๅ่บซ้ฟ่ฃ่ฎฉไฝ ็่บซๆๆดๆพไฟฎ้ฟ็ๅๆถๅฎฝๆพ็่ฎพ่ฎก่ฎฉไฝ ่บซๆๆดๆพไฟฎ้ฟ\n1000 Processed\n classify content\n608000 0 ๆฏๆไฝ ็ไบบไนๆ็ฅ้็็ธ็ๆๅๅ\n608001 0 ๆดๅฟๅ
ฌๅฎๅฑๆดๅทๆดพๅบๆไบๅฝๆฅไธๅๅฐๆถๅซๆๅๅฟ็ซฅ็็ฏ็ฝชๅซ็ไบบ้ๆใๅจๆๆ่ท\n608002 0 ๅคงๅฎไบคๆ็ๅบ้็ปๆ่ดนๆ็ซไปท\n608003 0 x่ฑๅฏธLCD็ตๅญ้ปๆฟx่ฒๅฏ้็พๅฝไบ้ฉฌ้$xx\n608004 0 ๅป็่ฏดๅฆๆ้ซ็งไธ้ไธๅจๅฐฑ่่ๅ
ถไป็
\n1000 Processed\n classify content\n608500 1 xxๅ
้กน็ฎ๏ผไธ.ๅ
ไธไธช็็จ้ไปทๅผxxxxๅ
้กน็ฎใๅ
xxxxๅ
็พๅฎนๅนดๅกไธ่ฎกๆฌกๆฐ๏ผ่บซไฝๅxxๆฌก่บซ...\n608501 0 ้ฃไนcon็พๅไนๅ
ซๅๅฐฑ็บฆ่ตทๆฅไบ\n608502 0 ไปๅคฉๆๅๅฎถ็ตๆขฏๅฐฑ่ขซไธ็ฅ้่ฐๅฎถ่ฃ
ไฟฎ็ๆฒๅญๆๅไบ\n608503 1 ๅๆฐ็พ่ดง็งๆๅฐ๏ผไปxๆxๅทๅฐxๆxๅทๆๆดปๅจ๏ผๆฐๅxx็ซๅxx๏ผ่ๅๆxๆ๏ผ่ฟๆx็นxๆ๏ผๆ่...\n608504 0 ไฝไบๆตๆฑ็้ณๆฑๅฃๅคxxๆตท้็ไธๆตท\n1000 Processed\n classify content\n609000 0 ๅ่
่ดฅๆฏๆจๅจ็ปๆตๅๅฑ็ๅผบๅคงๆญฃ่ฝ้\n609001 0 ้็จๅฎๅ
ด้ป้พๅฑฑ็ดซๆณฅๆๅทฅ็ฒพๅถ่ๆ\n609002 0 ็ทๅญๅคฉๅคฉๅillstandbeforeyouimm็ฆป\n609003 0 ๆฟZHIๅฟๅๅๆนๅจๅชไฝไธๆฅๅๆๆณ\n609004 0 ็ข้
ไผไฝฟ่ค่ฒๅ้ปใๆ็่ฏไธญๅผ่ตท่ค่ฒๅๅ็่ฏ\n1000 Processed\n classify content\n609500 1 ไธฐๆบ็พ่ดง ๆๅๅคงๆพ้๏ผๅชไธบ็พไธฝ็ไฝ ๅทด้ปๆฌง่ฑ้
ไธๆ:ไนฐxxxๅxxๅฆไธๆๆไนฐ่ต ^\n609501 0 ไฝ ๅฐฑๆฏๆ็ๆฐงๆฐ่ฏๅฌๅฐๅ>\n609502 0 MV็น้ไฟกๅฟต้ณไนๅๅงไบบๆฐๆๆ
ไปปๅฏผๆผ\n609503 0 ๆ็ฑไฝ ไธ็ฎกไฝ ๆฏๅฆๅจๆ่บซ่พนๆ่
่กฐ่็พ็
ๅฅๅฟๆ้ฝ้ชๅจไฝ ่บซ่พน็ญ็ไฝ ็ฑไฝ \n609504 0 ๅฏไธ่ฝๅ่ฟฝ่ฑๅ้ชจไธๆ ทๅปๅป็็ญๅ ไธชๆๅง\n1000 Processed\n classify content\n610000 0 ๅญฆ็ไปฌๆๆๆๅทฅๆถไธๅฎ่ฆๆฆไบฎ็ผ็\n610001 0 ไธๆตท็ญ็บฟๆฐ้ป้ข้โโ่ๅทๅๅโๆๆฟ้ๅฅๅญฆ้โใๅทฒ็ป่ฟ็ปญ้ขๅ30ๅฑ\n610002 0 ่ๅข้ขๅนๅฝข็ๅฟ็ฉบ่ฎพ่ฎกๆดๆฏๅซๅบๅฟ่ฃ\n610003 0 ๆไปๅ้่ฑไธ5000ๆฌงๅ
็ง็จ็ดๅๆบไธบๅฆปไนฐ้ขๅ
\n610004 0 ๆๅ
ณไบ็ญๅ็ตๅๆฐๅชไฝ่ฅ้ๆน้ข็ๅทฅไฝไฟกๆฏ็\n1000 Processed\n classify content\n610500 0 ๆๅฏ่ฝ่ฆๅๅไบฌ็ๅป่ฏ่กไธ็ป็ผไบโฆ\n610501 0 1ๅ
่ดนๅ็บงๆฟๆดปWin10ๆญฃๅผ็ๆนๆณ๏ผไฝ ไธๅฎ้่ฆ\n610502 0 ๅๅค็ๅฅถxxxๆฏซๅxๅฐๆจ็ๅ่ ่ๆพๅ
ฅๆ
ๆๆบไธญๅ ๅ
ฅ้ฒๅฅถ\n610503 0 ไธน้ณๅไบๅคๅๅฒณๅๅไบๅค้้จ้่พพๅฐ42\n610504 0 ๆฅๆฌ่ฎพ่ฎกๅ
ฌๅธnendoๆ่ฟๆจๅบไบไธๆฌพโaroboโ็ฉบๆฐๅๅๅจ\n1000 Processed\n classify content\n611000 0 ็ฑๅฟ
ๅฆฅ่ๅๅ็ไธ็บฟๆฒป็kras้็ๅ่ฝฌ็งปๆง็ป็ด่ ็ๆฃ่
่ฝๅคๅปถ้ฟๆป็ๅญๆ\n611001 0 ๆๅฃน้ต่ฅๅCoachDonovanๅไฝ ไธ่ตท่ฎญ็ป\n611002 0 ๅฐฑไฝฟ็จๆ็้่ฏท็ 79dmrwไธ่ฝฝๅนถ็ปๅฝๆตๆฑ็งปๅจๆๆบ่ฅไธๅ
\n611003 0 360ๆ่ฒ้ๅขๅพ้ปๆฐ่ฐ๏ผๆฐ่ฅฟๅ
ฐ็ๅญฆๅฐๅบๆฏDIY่ฟๆฏๆพไธญไป\n611004 0 ่ฟไนๆๆพ็่ฎพ่ฎก็บฐๆผๅฐฑๆ็ฎๅผ ่็ไธๅธไบ\n1000 Processed\n classify content\n611500 0 ้ฃ้ขๅ
ซๆนๅ
จๅฝๆๅ็ซ็ญ่ฟ่กไธญ\n611501 0 ๅซๆๅนฟๅบ้ฃไธช้บฆๅฝๅณๅบๅฐๅบๆฏไธๆฏ24ๅฐๆถ่ฅไธๅ\n611502 1 ไฝ ๅฅฝ๏ผๆๆฏๆฅๆๅนฟๅบ่ดไฝณไบบๅบ๏ผไธๅนดไธๆฌก็ไธๅ
ซ่ๅฐฑ่ฆๅฐไบ๏ผ้ข็ฅไฝ ่ๆฅๅฟซไน๏ผ่ถๆฅ่ถๅนด่ฝป๏ผไธป่ฆๆฏไธๅ
ซ...\n611503 0 9ๆไธญๅฝไบบๆฐๆๆฅๆไบๆจไธ็ๅๆณ่ฅฟๆฏๆไบ่ๅฉ70ๅจๅนดๅคงไผไนๅฐไธพ่ก\n611504 0 ่ๅทฅไธๅบๆฌ่ฟๆน้ ๅฐๅ
จ้ขๅ ๅฟซๆจ่ฟ\n1000 Processed\n classify content\n612000 0 ้ๆฉ็้ฒๆๅ็SPFๅPAๅผ่ฆๅคง\n612001 0 ่ๅๆๅคๆฑๅจๅคๅๅฐxxxxไบฟ็พๅ
\n612002 0 ไบๆไฝๆงๅฏนไบOpenStackๅๅฑๆ็้่ฆๆไน\n612003 0 xใ็ๆๆฐด็ตๅทฅไธป่ฆ็ๆๅฎถๅบญ็ตๅจๅฎ่ฃ
\n612004 0 ๅ่ขซ้ฃๆบ่ตท้ฃ็ๅฃฐ้ณๅฐไฝre\n1000 Processed\n classify content\n612500 0 ไปๆ็ปง็ปญ็ฑๅฐๅผๆณ้ขๅฐ็ผ้ชไผดๅคงๅฎถ\n612501 0 ๆฅ่ชๆทฎๅฎๅบ่ญ้ตไธญๅฟๅฐๅญฆๅๆทฎ้ดๅบๅ่ๅบไธญๅฟๅฐๅญฆ็43ๅ็ๅฎๅฟ็ซฅ\n612502 0 ๅชๅชๅชๅฐ่ง้ขๆฅๆฌ็ตๅฝฑgif็ทๅฅณๅชๅชๅชๅจๅพ็ทๅฅณๆปๅบๅๅชๅพ็คพๅชๅชๅชๅชๅชๅชๅชๅชๅชgifๅ็ฑๅพ\n612503 0 ๅคดไธๅคฉ้ฃๆบๆ็นๅๅค้็ฅๅๆถ\n612504 0 DANGQIDREAMๆฏไธชไปไน้ฌผ\n1000 Processed\n classify content\n613000 0 ไบฌ่้ซ้ๅ
ฌ่ทฏk1805+200็ฑณ\n613001 0 ้ๆบๆฝๅxxxMBๅฝๅ
ๆต้ๅ
ฑ่ฎกxxไปฝ\n613002 0 ่ขซไฟๅฎๅธๅ
ฌๅฎๅฑ็พๆฅผๆดพๅบๆๆฐ่ญฆๆ่ท\n613003 0 9โฆโฆๅฏๆ็ๅจโฆโฆๅไบ้็ง่ฏ่ฟๆฏๅๅๅคๅค็โฆโฆ\n613004 0 ๆฒกๆ็ๆญฃ็ๆณ้ขๅฐฑ้ธๆฒกๆไธชไบบ็่ช็ฑๅๅนณๅฎ\n1000 Processed\n classify content\n613500 0 ๅฐ้ฃๅฏ่ฝๅจๆตๆฑ็็ฆๅปบ็ไบค็ๅค็ป้\n613501 0 ๅฎๆๆ
ไพฃๅจๅฐ้ๅ
ๆ ่งๆไบบ่ฑ่กฃๆฟๆ
ๆ ่ฏๅฏ่ฏดไบ\n613502 0 ๅป็ๆปฅ็จๆฏๆถๆฏไธชๆๆๆ่พพ2ๅ\n613503 0 ๅฎถไฝๆฝ้ๅบ็ๅๆ็ญ6ไบบ็ผ้ ่ฐ่จ็งฐ่ฝๅ็ๅป็งๆฟ\n613504 0 Geroใใใซใๆฅๅบ้ ใใพใใใผ\n1000 Processed\n classify content\n614000 0 ๆฅๆธ
็็ธๅธฆๅคดๅคงๅฅๅๆฅๆฏๅฐๆๆนไธ\n614001 0 ไธญๅฝๆญป็ฅๆนไพฟKOๆฅๆฌๅๅฐๅฏนๆ่ทช่ถดๅจๅฐๆ่ฏไธๆธ
\n614002 0 RollingSpiderๆฏ้ฅๆง้ฃๆบ\n614003 0 ๆไธ็ปไปฟๅค็ฝ้ฉฌๅบๅขๅผ็ๅปบ็ญ\n614004 0 ๆฎ็งๆ่ต่ฎฏ็ฝ็ซComputerworldๆฅ้\n1000 Processed\n classify content\n614500 0 ๅ
ถไธญ156ไธชๅ็ง510ๆนๆฌกไธๅๆ ผ\n614501 0 ่พน่ฟฝ่ฑๅ้ชจ่พนๆๆ้ฟๆฒ็ฌๅฒๆฑๆนไปๅไธๅ็ง่ฎฟ่ฐ็ปผ่บๅ
จๅทไบไธ้\n614502 0 ๅๆญฃไธไธชๆๅ้ฟ้ๅทดๅทดๅ่่้ๆๅๆณจ่ต30ไบฟ\n614503 0 ๅจๆซ่ตฐ่ฟUnFashionCafeๅๆฏๅๅก\n614504 0 โๅป็่ฏด๏ผโๅ
ๆ็
ไพๆฟ็ปๆ็็\n1000 Processed\n classify content\n615000 0 ่ฐๆดๅๆ ๅไปไปๅนด6ๆ15ๆฅ่ตทๆง่ก\n615001 0 ไธๅฐ50ๅ้็ปไบๅฐ่พพ็ฑ่ญฆๅๅ
ฌๅฎคไบๅฆ\n615002 0 ไฝ่ณไปไธๅนดๅคๅคงๅฎถไพๆงๆฏๆฒกๆๅ
ป่ไฟ้ฉ\n615003 1 xๆxๅทๆญฃ็ๅฎๆ ๆฅ่่ฐทๆบ่ฝๅจๆฟ๏ผไธ็งๅบๅฐ๏ผไธ่ฏทไธปๆ๏ผไธๅๅก๏ผ็ไธๆฅ็ๅ
จ้จ็ปๆฏไฝ้กพๅฎข๏ผ่ฟๅบๅนถๆ...\n615004 1 ๅ ดๆฅ่ฃๆๅๆปฟxxx้xxๅ
็พ่ฒจๅธ๏ผ็ฉๅ็ถๆฅๆปฟxxxxๅ
ๅๅ ็พๅ็พไธญ็ๆดปๅ๏ผ่ช ้ไฝ ็ๅ
่จ๏ผ็ฅๆจ...\n1000 Processed\n classify content\n615500 1 ๆฅ็จ้ฑๆพๅนณๅฎ๏ผๅ
ๆตๆผ๏ผไฟก็จ่ดทๆฌพ๏ผๆๅฟซไธๅคฉๆพๆฌพ๏ผๅนณๅฎๆ่ดทๆ็ป็๏ผxxxxxxxxxxx\n615501 0 ไบๅฎๆฏ็ดข็ๆ
ๅๅบ้ไผๅฉ็จไบบๆงๆ่ขซ็
ฝๆ
็ๅผฑ็น็ญๅ็ไธ่ตทๅพฎๅไผ็ญนๆดปๅจ\n615502 0 ไธ็ถๆไปฌไผๅจ็จๆณๅพๆญฆๅจ็ปดๆ\n615503 0 โๆฌข่ฟๅๅ ไธๆตทๅคๆไนฆๅบไบ้ฉฌ้ๅบๅฅฝไนฆ็งๆ\n615504 0 ๅปบ็ญ็ปๆ่ฎพ่ฎกไฝฟ็จๅนด้ไธบ50\n1000 Processed\n classify content\n616000 0 ไพๅฆ่ฝ่ฃ
่ฝฝ8ๅๆญฅๅ
ต็่ฟๅ
ต่ฑใๆฐๆณกๅฝข้ฃๆก็ญ\n616001 1 ไบฒ็ฑ็ๅงๅฆนไธๅ
ซ่ๅฐ่ณ๏ผ็ฅๅงๅฆนไธๅ
ซ่ๅฟซไน๏ผๅกๆฏๆฌๆฐ่้กพๅฎขไธๅ
ซ่้ฃๅคฉ่ฟๅบๆ็คผ๏ผๆฐ้ๆ้๏ผๅ
ๅฐๅ
ๅพ...\n616002 0 ไธๅค้บป็ฆ็โฆๅธๆๅฐๅทๅฐฝๆฉ่ขซๆ\n616003 0 ๆณๅทๅธ้ฑผๅณฐๆณ้ขๆ็ซไบ้ข้ฒๆชๆๅนดไบบ็ฏ็ฝช่ญฆ็คบๆ่ฒๅบๅฐๆจ้ๅฐๅนดๅฟ็ๅฅๅบทไธญๅฟ\n616004 0 ้ณๅท่งๅ้ฆ้ไธ็พคๅญฆ็็บท็บทๅจๅฝฉ่นๆกฅ็
ง็ๅๆ็
ง\n1000 Processed\n classify content\n616500 0 ๅพฎๅๆ็ดขไบไธๅไธบmate7\n616501 0 ๅจ้ฃไธชๆฒกๆๆๆบๆฒกๆ็ฝ็ป็ๆถไปฃ\n616502 0 ๅ ไธบๆxไธชๆ ทๆฟ้ดๅxไธชไธดๆถ็ฉไธ็ฎก็็จๆฟๆไธ้ๅฎ\n616503 0 ไปๆพ่ทๅพNBAๆปๅ ๅๆๆ\n616504 0 ๆฑฝ่ฝฆๅๆไธบๅฟ
ไธๅฏๅฐ็ไบค้ๅทฅๅ
ท\n1000 Processed\n classify content\n617000 0 ๅ
ถๅฎๆไปๆฅๆฒกๆๆช่ฟ่ฐๅ ไธบๆธ
ๆฅ่ชๅทฑๆ นๆฌไธ้
ๆฅๆๆดๅคๆดๅฅฝ็\n617001 1 ่ฟๆ็ฒพ็พ็คผๅไธไปฝๅ๏ผ้ๅฎไธบๆญข๏ผ่ฟๆๆดๅฒ็ๆถๆฏ๏ผx.x x.x x.x ไธๅคฉ๏ผๅฑ
ๅฎถไบง...\n617002 0 ็ฅๆ
ไบบๅฃซไพๅพไบๅ่ช็ญๅคด็ญ่ฑไธๅบงๆค
้ ่ๅค\n617003 0 ๆณๆ็ๆๅไบๆไปฅไธไบ้ฃๆบๆไธ็ก่งๆถๅฐฑ้ป้ปๅไบ่ฟไนๅค\n617004 0 ่กๅจไธญๅ
ฑๆฅ่ท1่ตท้้
้ฉพ้ฉถใ2่ตท้
ๅ้ฉพ้ฉถ็่ฟๆณ่กไธบ\n1000 Processed\n classify content\n617500 0 ไฝ ๆ ๆณๆณ่ฑก้ฃๆถๅ็ๅไบฌ็ซ็ถๆฏ่ฟๆ ท็\n617501 0 ็ๅฐ่ฑๅ้ชจๅ็ดซ่ฐๅจๅๅ
้ผๅนป่ฑกไธญ\n617502 0 ๆๆๅฏ็งไฟกๆ็ต่xxxxxxxxxxx\n617503 0 ๆๅญๅๅจฉๅๆๅญๆฏๆนๅๅฅณๆงไฝ่ดจ็ๅฅฝๆถๆบ\n617504 0 ็ดๆฅๆพ็ฉไธๆๅ้กถไบๆ็ถๅฅนๅฎถ้ฎ้ข\n1000 Processed\n classify content\n618000 1 xxxไบฟๅนฟๅท่ฑ้ฝไธ่พพๅxๆxๆฅๆบ้ช่ๆฅ๏ผ็ฌฌไธๆxxโxxxๆน้้้บไฝxๆ้ฆๅผ๏ผ็ซ็็ป่ฎฐไธญ๏ผๆ...\n618001 1 ไธญๅฝๅๅคงๅ็็พๅฑ
ไฝณไธฝๅฐๆฟ๏ผๅฎๆฉ๏ผๅผบๅ๏ผๅคๅฑใไธญ๏ผ้ซใไฝๆกฃใๅๅๅค็งๆ ทๅไพๆจๅ่๏ผ็ฐๅ้ตๅๅบ๏ผ...\n618002 1 ไฟๅฉ็ฝๅ
ฐ้ฆ่ฐท่ตทไปทxxxx๏ผๅไปทxxxx๏ผ้ฆไปxไธ๏ผ่ฏฆ่ฏข๏ผ้็็ฝฎไธ้กพ้ฎ๏ผๅผ ๅญxxxxxxxxxxx\n618003 0 ๆฏ่ฟ็ฒพ็ฅ็่ดขๅฏๆฏๆ็ไผ็พ็่ๅงฟ\n618004 0 ไธๆถไปฃๆฅ่ฝจๆ่ฐขxx็ปๅคงๅนณๅฐ็ปๆไปฌๅฎๅฆๅธฆๆฅ็ๆฐ็ๆดป\n1000 Processed\n classify content\n618500 0 ๅจๆฅ่ฆๅปๅป้ข็็ๅฅฝๅฟงๆกโฆ\n618501 0 ๆฐ่ญฆๆฅๅคไธ่ตท่ฝฆ็ไธบๆC001**ๅท็ๅฐๅ่ฝฟ่ฝฆ้ฉพ้ฉถไบบๆชๆ่งๅฎไฝฟ็จๅฎๅ
จๅธฆ็่ฟๆณ่กไธบ\n618502 1 ๅ
่พๅฐๅบ่ต็ๅ ๅๆจไธๅๅบ๏ผๅกๆๆฌๅบไผๅๅกๅๅฏๅ
่ดน้ขๅ็ญ่ฐข็คผ๏ผๅๆถ็งฏๅๅ
ๆข่ฑช็คผใไธๅ
ซๆดปๅจๆปกxx...\n618503 0 ๅ weixin๏ผssokcc\n618504 0 ๅฝๆถ็็ ็ฉถ็ๅๅญฆ้ฝๅทฒ็ปๆฏไธ\n1000 Processed\n classify content\n619000 0 ๅฏปๆพๅฒไธๆ่ดต็ๅฎไน ็|ๅนฟๅ่กไธๆ็ด่ง็ๅฐ่ฑกๆฏไปไน\n619001 0 ๆไธๆ็ๆๅไธๅคฉ็ฅๆๅฅฝ่ฟๅงๅ่ๆๅฎๅฐไธธๅญ\n619002 0 ๅนฟๅฎๅ่ฅๅธไบบๆฐๆณ้ขๅฎก็ไธ่ตทๆๆฐๅฅฝๅฟไนๅกๅธฎๅฟ\n619003 0 ๅฐ้ๆคๆ ้็ปไธไบๆๆ
๏ผๅจๅฐ้็ญไบบ\n619004 0 ๅธๆฐๅฏ้่ฟ็ต่ๆ็งปๅจ่ฎพๅค้ๆถๅญฆไน ๅคๆ ทๅ่ฏพ็จ\n1000 Processed\n classify content\n619500 0 ๅจๆฅๅไธๅจไธ้ฝๆฏ่ฑๅ้ชจ็็ๅ\n619501 0 ไธๅบงๆฅๆ93ไธช่ฝฆไฝ็4ๅฑ็ซไฝ่ฝฆๅบๅจๅๅ
ณๅบๆๆดช่ทฏๅ
ซไธ้ณๅ
ๅฎถๅญ่ฝๅฐๅปบ่ฎพ\n619502 0 ๅ
่ดน้ฒ็็ญ็บฟxxxxxxxxxxx\n619503 0 ๅธธๅทๅธไธพ่กไบๅญฃๅบฆๅ
จๅธ็ๆๆๆๅจ่กๅจ็น่ฏไผ\n619504 0 ๆตๆฑๅบ็ฃๆไธคไผไนๅบๅ
ฌๅผๆ่ฎฎไฟก\n1000 Processed\n classify content\n620000 0 ๆๆบๆถ่ดน็้ชๅญๅฎน้ๅจๅ
จ็้ชๅญๅบ่ดง้ไธญๆ\n620001 0 ๅ
ๆฌๆณจๅ
ฅLEDๅ
็ด ็ๅๅคง็ฏใLEDๅฐพ็ฏไปฅๅๆฐ่ฎพ่ฎก็ๅๅไฟ้ฉๆ \n620002 0 ็พๅฝๅ
ฌๆฐๅฏนๆณๅพ็่ฎค็ฅ็ธๅฝ้ซ\n620003 0 ๅ
ณไบๅ
ผ่??ๆๅ็ๆฏไธไปฝๅ
ผ่่ต็ๆฏ็บฏๅฉๆถฆไธๆฏๅพฎๅๅฆๆไฝ ่ไธๅพไธคไธ็พ็ป่บซๅถ็ไผ่ดนๅฐฑๅซๆฅๅจ่ฏขๅคฉไธ...\n620004 0 ไธไธชๅฎถๅบญ้ๅบไบ9ไธช็ธๅฝขโ็ฎไบบโ\n1000 Processed\n classify content\n620500 0 ้็ปผๅคๆ็็็ธ~ไธๅฐๆๅ้ฝไธ็ฅ้ๅๆฅๆฏ่ฟๆ ท\n620501 0 7ๆไปฝๅฏๆถไธญๅฝA50ๆๆฐๆ่ดง็ๆชๅนณไปๅ็บฆๆปๆฐๆฅ่ท่ณ127\n620502 0 xใๅคฎ่กๆๅ
ณ่ด่ดฃไบบ๏ผ็ช่ไปทๆ ผๆณขๅจไธไผๅฝฑๅ่ดงๅธๆฟ็ญ\n620503 0 ่ฟๅซไธฐๅฏ็็ปด็็ด Cๅๅฏๆบถๆง็บค็ปด\n620504 0 ไธๅ่งๅฎ่ฏๅ148ๆนใๅๅ5ๆน\n1000 Processed\n classify content\n621000 0 ๅฅฝไธๅฎนๆๅไธไธไธชๆwifi็้ฃๆบๅฑ
็ถ่ฆๆถ่ดน\n621001 0 ๅ
ซๅฎ็ฝชไฟจ็ถๆฏๆญปๅๅฐฑ่ฟไนๅ\n621002 0 ้ฃๆไบๅป็ๆฅ็ๆฑๅญๅฏนๅนณไธๆ็ๆฏๆฌ็\n621003 1 ไฝฐ@ไฝณ@ๆณบ๏ผ้พ#ๅฌ#ๆ๏ผ็๏ฟฅ็๏ผไฝ*่ฒ็ญ็ญ๏ผ่ซๆฌพ่ตฝ๏ผ่ฏฆๆ
๏ผไธ่ฟฏๆดฎๆฆ xxxxxxx.โโM ...\n621004 0 ๅนถๅฐไธบ่
พ่ฎฏๅบ็จๅฎๅ
จๅบๆไพๅฏ้ ็wifi\n1000 Processed\n classify content\n621500 1 ๆญๅๆจๅๆทปๆฐๅฑ
๏ผๆๆฏๆไธฐ่ฃ
้ฅฐ็ๅฎถๅฑ
้กพ้ฎ๏ผๆๅงๅพใ ๆไปฌๅ
ฌๅธ็ฐๆจๅบไบไธไธชxxๅนณ็ฑณๆฟxไธๅคๅ
จ...\n621501 1 ๆฌๅบๆถ่ดน็พๅฎน้กน็ฎ๏ผๅฏไบซๅๅฏนๆไฝ้ช ๅฆ๏ผๆฅๅญฃๆฏๅ
ป่ๆไฝณๅญฃ่๏ผๆฌๅบๆฏไบบๆจๅบไธไธชๅ้ขๅช้xxx...\n621502 0 ้ผๆฅผๅป้ขๆ็ถๆฏๅคงๅป้ขๅ\n621503 0 ใ่็่ฏด๏ผใๆไปฌไฟฉๅช่ฆไธๅตๆถ\n621504 0 12ๅฒๆนๅๅฅณๅญฉๆๆ่ขซๅๆ74ๅฒ่ไบบๆงไพตๅนถไบงๅญ\n1000 Processed\n classify content\n622000 0 ๆณๅฟตๅฅฝๆๆไพตๅ ไบๆๆดไธช่บซไฝ\n622001 1 ๅญฃ้ช่้้ต๏ผๅญๆญคๅพฎไฟกไบซ็ดซๅณฐไธไธป็นๆ ๏ผ ่ฏฆๆ
ๅจ่ฏข๏ผ้ ๅ ้ข่ฎข็ญ็บฟ๏ผxxxxxxxxxxx ๆจ...\n622002 1 ๆจๅฅฝ๏ผๆฌข่ฟๅๅ ๅฟ้ฆๅจๆxxx้ขๅฎไผ๏ผๅ
จๅฝ็บฟไธ็บฟไธ่ๅจไผๆ ๆดปๅจ๏ผไปทๆ ผๅ
จๅนดๆไฝ๏ผๆดๆๆป้จ้ขๅฏผไบฒไธด...\n622003 0 ๅนถๅฏนQuanergyๅ
ฌๅธ่ฟ่กๆ็ฅๆ่ต\n622004 0 ๅไบฌ่ฟ่พน็้ต่พพไธๅฎๆฏไธๅคช่ชๆ\n1000 Processed\n classify content\n622500 1 ไฟฑไน้จKTVๆฌขๅฑ็ๆ KTVๆฟ้ด๏ผ็พๅจๅค้
๏ฟฅxxxxๅ
xๆ ่ฑชๅไธญๆฟๅฅ้ค ้ๅฒ็บฏ็๏ฟฅxxxๅ
...\n622501 0 ๆฑฝ่ฝฆ่ฝฎ่ๅ
ๆฐฎๆฐๆไธๅฎ็ๅฅฝๅค\n622502 1 ้ญ
ๅๅบไธๅ
ซ.ๅฟซไนๅฅณไบบ่.่ฟๅบๆๆๅ ้พๅๅฏ้ๅคฉ้ชไผ่ๅ
ฐ่็พๅฎนๅ
ป็้ฆๆๅ็ฅๆจไธๅ
ซ่ๅฟซไน๏ผ ...\n622503 0 ไธๆๆณxxxไบไปถๅ้
ตๅฐไบๅฆๆญคๅฐๆญฅ\n622504 0 ๅ
ๅฐๅฅณ็ๅจๆธฏ่กๅคด้ๆ่ขซๅคๆๅ12ไธชๆ\n1000 Processed\n classify content\n623000 0 ๆฅๆๅฝไบไบบๅๅ
ถไป่ฏ่ฎผๅไธไบบๅฐๅบญๆ
ๅต๏ผๅๅ๏ผ่ฎธๆใๆฑชๆ้ฉๅฐๅบญ\n623001 0 ็ปๆไบบ่ฟๆฒก่ตฐๅฐไปไปฌ้ฃๆบๆ่พน\n623002 0 ็ฆ
ๅๅบ่ฟ12ๅนดๆชๅบ็ฐๆฐๅๆฌๅฐ็
ไพ\n623003 0 ๅ
ฌๅธ่ฎกๅๅฌๅๆๅค1ไธ่พ2016XC90ๅ่ฟๅจๅๅค็จ้่ฝฆ\n623004 0 ๅ ๆฒนๅงๅฎไน ็็ๆฏไธ้จๆบๅฅฝ็็ต่งๅงๅ\n1000 Processed\n classify content\n623500 1 ๆ้ฝไผๅๆไธบๅ
ฌๅธ๏ผไธไธไปไบๅๅ็็ตๆขฏๅ้ขๅจใๆงๅถไธปๆฟใ้่ฎฏๆฟใ่ฝฟๅขๆฟใ้จๆบๆณใๅคๅผๆฟใๅบๆฅ็ต...\n623501 0 ๆ็ๅคงๅญฆๅจๆฑ่ๅจๅไบฌๅจๅไฝๅจๆฐ่กจ็ณป81242็ญ\n623502 0 ๅ
ถๅ
จ่ต้ๅฑๅ
ฌๅธxxไบฟๅ
ๅๅบ้ขๅบฆไธญ็้ฆๆxxไบฟๅ
xๅนดๆๅ
ฌๅธๅบๅทฒ็ปๅฎไปท\n623503 0 ่ๆฏ100ไธช่ฎพ็ฝฎไธชๆง็ญพๅ็ไบบไธญ\n623504 0 ๅคงๅฎถๅ ๆฒน็ป็ไฟๅฏ็ฌฌไธๅฅฝไธๅฅฝ\n1000 Processed\n classify content\n624000 0 ๅตๅตๅไธ็จๆณๅพๆ็ปดๆ้ฝๆ็น่ฟท็ณ\n624001 0 ๅธธๅทๅ้ๅฑ
ๆฐๅนณๅๆฏๆๆถๅ
ฅ๏ผ3580\n624002 0 ็ฆไธๆฅ่ฏๆ็ปๆๆ่ดจ็่ฟไฝ ็ไบบ็\n624003 0 ๅไนไธ็จๆพ็พๅบฆไบ็ฝ็้ซๆธ
่ตๆบ้พๆฅไบAPP\n624004 0 ไปฅ้ฒๆๅญๆซๆ็ฑไบๅญๅฎซ็ๅๅ่ไบง็็็ๆ\n1000 Processed\n classify content\n624500 1 ๅ
ๅฎตไฝณ่ๆฅไธดไน้
๏ผๅฏน้็ค้น
ไธบไบๅ้ฆๆฐ่้กพๅฎข๏ผ็นๆญคๆจๅบๆไนฐๆ่ต ๆดปๅจ๏ผ่ดญไนฐๅคง้น
ไธๅช๏ผ่ต ๆฑคๅไธ่ข๏ผ...\n624501 0 6ไธช85ๅผๅ2ไธชt90ๆๅ่ฟๆฏ้้นฟๅๆ็6ไธช็ฉบๅๅดๆฎดไปไธไธชๆๆๆญป้ฃk1a1\n624502 0 ไนฆไธญๅๅฎๆ็ไธ็พคๅๅชๆๅๆๅไปฃ็ๅฎๅบโๆญฃไนไบบๅฃซโ้ฝ้ปๅบ็ฟๆฅ\n624503 1 ๅไธๆญฅ่ก่ก๏ผๅนผๅฟๅญ๏ผๅพไนฆ้ฆ๏ผ็ตๅฝฑ้ข๏ผๅฟ็ซฅไนๅญ๏ผๅฅ่บซๅนฟๅบ๏ผ็ปฟๅๆฏ่งๅธฆ๏ผ็ไบงๅฑ
ไฝ้ๅฎไธๆฟๅค็จ็ฐๆจ...\n624504 0 ๆตๆฑๅโไบ่็ฝ+โๆ็ญๅ่ต็็ฎกโๆบๆ
งโๆฐๆ ผๅฑ๏ผใใๆฐไปฅ้ฃไธบๅคฉ\n1000 Processed\n classify content\n625000 0 AmazonAWSๆ็ถๅพๅฅฝ็ฉๅ\n625001 0 ไธ็งๅซๆๅฐฑๆฏ่ฆ็็ธๆๅฐฑๆฏ่ฆxๆธ
ๆฅ็ถๅๆญปๅฟ\n625002 0 ๅธธ็ๆทๆขฏๆ่ฒไธไธ่ฅฟ็ญ็่ฏญๅน่ฎญๅญฆๆ ก\n625003 0 ๆ่ฟๆฒกๆๆๆบไน้่ฟไบ่ฎธๅคๅบ่ฏฅ็บชๅฟต็ๅบๆฏ\n625004 0 โขไธ่ฌๅฐๅฐไบx็บง็ๅฐ้็งฐไธบ่ถ
ๅพฎ้\n1000 Processed\n classify content\n625500 0 ไธๆตทๆๅบๅไบบ่ดญ็ฉไธญๅฟไธพๅ็โ็ฅ็ง่ฑๅญโไบฒๅญๆธธๆดป\n625501 0 ไธไธชๅคๅฐๆฅ่ๅท็โๆฐ่ๅท\"ไบบ\n625502 0 ไปไปฌๅจไธๅฑไธๆ ็ๅ่
่ฟๅจไธญๆ ๆงๆ ๆ\n625503 0 ไบฎ็นๆฏ็บฏ็ฉ็้ฒๆๆธฉๅไธๅบๆฟ\n625504 0 ็ๅข็ฌ่ฎฐ่ฆVIPๆ่ฝ็ๅ
จ้่ๅธไฝ ็ไบๅโ\n1000 Processed\n classify content\n626000 1 ้ๅ ๅพทๅทไฟฑไน้จ็ฅๆจๅ
ๅฎต่ๅฟซไน๏ผๆฌๅบๆจๅบๅ็ฑปSNG้ฆๆ ่ต๏ผๆฌข่ฟๅคงๅฎถๅฐๅบๅๅ ๆฏ่ตใๅๆฌกๆ่ฐขๆจ็ๅฐ...\n626001 0 xxๅนด้
ๅฎๅฐ้๏ผ่ฏบๅบไบๆๆฌพxxxไธ\n626002 0 ่ๅทๅทๆณๆธฏๆฎๆผ็ไธบ้ซ็ซฏ็งๅญฆๆๅก็่ง่ฒ\n626003 0 ็ๅฐ็ๅนณๅบ่ๆ ผๅบ้ไธๅคงๅทฅไฝๅผ้ข่ฝฌๅๅๅฑๆๅๅผ่ต็ชๅบ่ฝฌๅๅๅฑ๏ผไปๅนดไธๅๅนด\n626004 0 โๅคไฝๅ่ฎฟๅๆไบบๅฃซ่ฎคไธบ็ช่ๆถจไปทๆจๅCPIไธๆถจๆ้\n1000 Processed\n classify content\n626500 0 ็ฌฌไธๆฌกๆๅ็ฎกๅฐฑๆไบๅๅๅๅๆๅฐฑ็ฅ้ๅคงๅฅๆ็ฑๆไนไนๅๅ่ธ่ๅๆๅบ่ๅฝป็ปไบๆททๅฐ็นๆฌงๆดฒ่ก็ปๅฆไธๆณๆ...\n626501 0 ๅธธๅทๅธ้ๅๅบๆฐ่ฑกๅฐ8ๆ9ๆฅ16ๆถๅๅธ็ๅฐ้ฃ่ญฆๆฅๅๅคฉๆฐ้ขๆฅ๏ผไปๅนด็ฌฌ13ๅท็ญๅธฆ้ฃๆด่่ฟช็ฝไปๅคฉ15...\n626502 0 ่ฎพ่ฎก้คๅญไพฟ็ญพ็่ฎพ่ฎก้ฃๆ ผๆฏ็ฎ็บฆ็ๆ็ฉ้ฃๆ ผ\n626503 0 ๆๅคฉๅๆณๅๅๆป็ปๅฎๅไธชไธ่ฏ็ถ้ขๅบๅฐฝ้ๅ่ไธญๅฝ็น่ฒ็คพไผไธปไน11ๅนดๅทไธ\n626504 0 ๅไธๆๆฐไธ4000็น็ๆถๅๆ ๅๅบ็ญไฟก้ฃ้ฉๆ็คบ\n1000 Processed\n classify content\n627000 0 ๅฝไฝ ๆฑๆจ่ชๅทฑๅทฒ็ปๅพ่พ่ฆ็ๆถๅ\n627001 0 5ใ่ฟ้ๅ็ปฟ่ฑๆฑคๆ่ด่ ่็พ็
\n627002 0 ๅ
่ดพๅพท05ๅนดไธๅฐไนๅๅผบ่ฟซๅฝไผๆททๆน็งๆๅ\n627003 0 ๅคงๅนดๅ3้ญๆๅฆๆฟ็้ธกๆฏๆธๅญ่ฟฝ็ๅฐๅคๆ\n627004 0 ๅ็ฎกๅฐฑๅฝขๆไบ่กๆฟ้จ้จ็ๆฐไธๅฝๆผๆญฃไนไบ\n1000 Processed\n classify content\n627500 0 ๅจๅป็็ๆๅฏผไธ็จ่ฏๆฏ่พๅฎๅ
จ\n627501 0 50ๅคๅฒ้ฃๅ ๅคงๅฆ็จๆๆบ็่ฑๅ้ชจ\n627502 0 ๅ
จๆฐ่ฎพ่ฎก็ๅๅ
ๅด่ฎฉ่ฝฆๅคด็่ตทๆฅๆดๅ ็ๆฆๅฎ\n627503 0 ็จไบ่ฟไนๅคๅนดๅฐ็ฑณๅซ็้ฝๆบๅฅฝ็ๆไนๅฐฑๆฐๆฎไผ ่พ่ฟไนๅทฎๅฒ\n627504 0 ็ฉ่ดจ่ดขๅฏ่ทไธไธ็ฒพ็ฅ้ฃ็ฒฎ้ๆฑโฆ็็่ฆๅปๅ่พไบ\n1000 Processed\n classify content\n628000 1 ไธ่ฝๅผ้ๅ
ฌๅธๆ่ฐขๆจๆฅ็ต๏ผๆฌๅบไธ้
ๆฑฝ่ฝฆ้ฅๅใ้ฅๆงๅจใๆบ่ฝๅกใๆๅ ๆน่ฃ
ใ่ฝฆๅบ้ฅๆงใไธ้จๅผ้ๆข้๏ผ...\n628001 0 ๅญฉๅญๅบ็ๅฐ็ฐๅจๆณ้ขๅผบๅถๆง่กๆๅ
ป่ดน็ฎๅๅฐฑ่ฎจๅไธๅ\n628002 0 2014ๅนด10ๆ้กถๆฐๅจๅฐๆนพๆญฃไน้ฅฒๆๆฒนไบไปถ็ๅ\n628003 0 ไฝฟๅคนๆฟๅทๅๆฃๅน้ฃๆบ็้ฝๅฐๅฟ็นๅงๅ้ฟ็ซฅๆจ่ๅบ็็ซ็ฎญไธๆ ทๅท็ซๆๅฐผ็ๅๆญปๆไบ\n628004 0 ๆพ็่ดข็ฝโโๆๆๅจ็็ฝ่ดทๅฏผ่ช็ฝ็ซ\n1000 Processed\n classify content\n628500 1 ๆจๅฅฝ๏ผๆๅธๆๅขฉๅฐ่ฅฟๅฎๅไธ็ฏๆตท่ฟ้จๅฐ้จไปทๆ ผxxxxๅ
/ๆ๏ผ้่ทฏxxxxๅ
/ๆ๏ผๆฌข่ฟ่ต่ฝฝ๏ผ่ฐข่ฐข๏ผ...\n628501 0 ๆๅไบซไบ็พๅบฆไบ้็ๆไปถ๏ผ?่บๅฅๅๅคง็ๅๅฝxxxxP\n628502 0 2ๅๅจ่ฏฅๅฒๅ็ฐไธญๆๅ้ฉฌๆฅ่ฅฟไบๆๅญ็้ฃๆบๆฎ้ชธ\n628503 0 2015้็ซๆบ่ฝๆๆบๆฏไธญๅฝๅดๆฃ็ฒ็บง่่ต่ฟ่ก็ฌฌ11่ฝฎ่พ้\n628504 0 ๅจ่ฏฅๅฒๆตทๅฒธ็บฟไธๅ็ฐ้ฃๆบๆฎ้ชธ\n1000 Processed\n classify content\n629000 0 ๆฏๆขจๅญๅคไธๅ้็ๅซ้ๆฏ่นๆๅคไธๅ\n629001 0 ๆไธ็ก่งๆขฆ่งๆๆๆๆบๆ็ขไบ\n629002 0 ๆจๅคฉๆนๆญฃ่ฏๅธๅไบไธไผๅฟ็้ขๅคด็พๅๆฅๆฏไธ็ปฉๅ7ๆไธ่ทๅฝฑๅๆๅฐ็ๅธๅ่กๅ\n629003 0 ่ฝ็ถๆๆฌๆฅๅฐฑ่ๆฎ~ๅฅฝไบๆๅคฉ่ฟ่ฆ่่ฏๅ\n629004 0 ๅทฆๅพๆฏ็็ๅจๅจ็ฉๅป้ขๅ
ๆฃๆฅๆถ็Xๅ
็\n1000 Processed\n classify content\n629500 0 ๆ้ฝๅฑ็ฎๅๆๅๅผ็ตๆขฏ่ฟไธคๅ้จ\n629501 0 ่ๅท่ดจ้ๆๆฏ็็ฃๅฑๅญๅบๅๅฑ่ฟ
้้จ็ฝฒ\n629502 0 ๅป็่ฏด๏ผ้ฃไฝ ๆไนไธๆไปๅธฆๆฅ\n629503 0 ๆฉ่ตท+ไธไธชๅๅฐๆถ็ญ่ฝฆ+ๆญฃไบๆฒกๅๅนฒ็ๅญๆฒกๅฐๆฏ+ๅไธชๅฐๆถ็ญ่ฝฆ+ไธไธชๅฐๆถๅฐ้+ๅไธชๅฐๆถๅ
ฌไบค=ๅ่พฐt...\n629504 0 AMD็็่ฝๅธฆไฝ ้ฃๅตๅ
ฅๅผๅค็ๅจๅทฒๆๅ
ฅๆณข้ณ้ฃๆบ้ฉพ้ฉถ่ฑ\n1000 Processed\n classify content\n630000 0 ่ฏฑ้ชไบไธป้่ฟATMๆบ่ฝฌ่ดฆๆฏไปๆ่ฐ็ๆน็ญพๆ็ปญ่ดน\n630001 0 ่ฝ็ถ่ฆ6็น่ตทๅบ่ตฐ40ๅ้ๅปไธ็ญ\n630002 0 ๅ
ฉ่
ๅทฎๅฅๅจๆผๆณ่ฆๅฝไปคๆฏไธ่ฌๆงๆฝ่ฑกๆง็่ฆๅฎ\n630003 0 ่ไธๆ่ต้ขๆฏ่ตทๆฏๅนดๅ
ไธชๆธฉๆณๆ
้ฆ็่ฑ้ๅคๅพๅค\n630004 0 ๆไธๅผๅไธๆญฅๆๆบ็ฐๅจ่ฟไธไธไบๅพฎไฟก\n1000 Processed\n classify content\n630500 0 ๆตท่พน่ฝๆฒกๆณ่ฑกไธญ็้ฃไนๆตชๆผซไฝไนไนๆญคไธ็ฒ่ฝๅธฆๅ็งไผค็ๅๅฎถไฝไนไธ่ๆญค่ก<\n630501 0 โโ้ฃๆบ็ๅๅๆฏๅฆไฝไบง็็ๅข\n630502 0 ๅจ่ฏขไฟ้ฉ่ฏท่็ณป18052233219\n630503 0 ๆฑ่่ฟไบๆธฏ่ดง่ฝฆไธขๅคฑ1ๅจๆๆฏๅฑๅๅ่ฟๆ6ๅ
ๅคฑ่ธช\n630504 0 ๅฝๅคงไผๅชไฝ่ดจ็ๆไธ็บฟๅ็ๆๅฏ่ฝๆฏโไธ็ฒพไธๆฐดโ็ๆถๅ\n1000 Processed\n classify content\n631000 1 ๅจๅ๏ผxๆxxๆฅ๏ผxxๆฅใ็นๆ ใๅไบฌๆ
ๅฎซๅ
ซ่พพๅฒญ้ฟๅๅ
ญๅคฉๅ้ฃๅขxๅคฉxxxxๅ
/ไบบ๏ผๆ้่ฆ่็ณปๆ...\n631001 0 ๆนๅ็ตๆขฏไบๆ
ๅ5ๅ้็ๆงๆๅ
๏ผไธคๅทฅไฝไบบๅ้ฉๅ ไธ\n631002 0 ่ฃๅคโ้ปๅๅโ็ๆจๅบๅด็ฅๆๆปๅ\n631003 0 ๅไบฌๅธไบบๆฐๆฃๅฏ้ขไพๆณๅฏนๅไบฌๅธๅ
ญๅๅบๅๆฐ็ฏ้ๆกฅ็ๆๆไธปไปปๆๅบญๆไปฅๆถๅซ่ดชๆฑก็ฝชๅณๅฎ้ฎๆ\n631004 0 ๆญ็่ๆฐ็ไธ่ฝฎ่ฝฆๅจxxxๅฝ้ไธ้ฃ้ฉฐ\n1000 Processed\n classify content\n631500 0 2XUๅฅณๅฃซ้ไบบไธ้กน่ฟๅจๅ็ผฉ็ญ่ฃค\n631501 0 CelticWaterไธบ่่คๆณจๅ
ฅๆ ้ๆฐดๅ\n631502 0 ไปๅฆ้ ็็ฉท็ญ็ๆฟๅบๅๆๅฉ่ฟๆฅๅญ\n631503 0 2015ๅนด้ข่ฎกGDPๆฏ65ไธไบฟ\n631504 0 ๅไปท่ฝฌไธคๅผ ๅจๆฐไผฆ่ๅทๆผๅฑไผ้จ็ฅจ\n1000 Processed\n classify content\n632000 0 ่ฟไธๅไธฅๅฏบ็ๅปบ็ญๅธๅฑๆ่ซๅคง็ๅ
ณ็ณป\n632001 0 ๆฎไธๅฎๅ
จ็ป่ฎกๆๅฝๆxxxไธๅฎๅ็ๅฆปๅญๅฟๅฅณๅ
จ้จ็งปๆฐๅฝๅค\n632002 0 ้ฟ้ๅทดๅทดๆไธๅ่็ฐไบบๅไฝๅ่ฟไบๅ
ฌๅธ่งๅฎ็ญ็ญๅฐๅๆญง่ง\n632003 0 ๆ้่ฆไบ่งฃ่ฏฆ็ปๆ
ๅต็ๅฏไปฅ้ๆถ่็ณป่็ณปไบบ๏ผ้่ๅธ่็ณป็ต่ฏ๏ผ13001057642\n632004 0 ๆฏๅคฉๅจๅฎถไธๆฏไธ็ปๆ็ฉๆๆบๅฐฑๆฏไธ็ปๆ็ฉ็ต่\n1000 Processed\n classify content\n632500 0 ๅบไบๅ
ณ็ณป็ๅไธๆจกๅผๆฏๆถ้ด็ๆๅ\n632501 0 ๆๅคงๅงจๅฆๅคงๅงจๅคซๆตชๅฐๆ ้กๅปๅฆ\n632502 0 ๅๅไธป่ฃๅคๆฏ่ฐๆไบไฝฉไฝฉๅๅๅๅๅๅๅๅๅ\n632503 1 ้่ธ่กฃไธไปถ๏ผไนฐไธไปถ้่ธ่กฃไธคไปถ๏ผไนฐๅพๅค้ๅพๅค๏ผ\n632504 0 Minoๅ
็ๅฐ่ง้ขไธไผ Youtube\n1000 Processed\n classify content\n633000 0 ่ฏดๆฑ่ไธๅญๅฆไบงไธๆ่ฝป้พๅค่\n633001 0 IMFๅจๆชๆฅๆฐๆ็่ณไธๅนดๆไธไผๅไธๅฏน่ฏฅๅฝ็็ฌฌไธ่ฝฎๆดๅฉ\n633002 0 โโPs๏ผๅคง็ญๅคฉ็้ฝๅไปไน็ญ้น\n633003 0 ไธญๅฝๅฅฝๅฃฐ้ณlive็ฉบ้ๅไบฌๆฐดๆธธๅ\n633004 0 ๆ็จๆ็่พๅ
ฅๆณๆไบ17492ๅญ\n1000 Processed\n classify content\n633500 0 ๅคฉๆดฅๅๅไบฌไน้ด็่ท็ฆปๆฏๆไธ่ฝๆฟๅ็็ดฏ\n633501 0 1981ๅนดๆๆฒกๆฅไธญๅฝไนๅๆฏไปไน็็ถ\n633502 0 ็
ไบบ็็ฝ็้ฃ็พ็
ไนไธๆฏๅพๆพ็\n633503 0 ็ฎ\n633504 0 ไผๅๆ็๏ผๆถ็ง็ซๅ20ๅ
ใๆฟไปท92ๆใๅปถๆถ้ๆฟ่ณ13\n1000 Processed\n classify content\n634000 0 ่ฆๆฑ็่ฏ็้ฟๅพฎๅๅงw็็
ง็่ฏๆๆ็งๆณๆๅ ไบQQๅๅฏไปฅๅ\n634001 0 ๆๅๅ ่ฟๅฅนๅฎถๅปๅนด็้40ๅ
ๆดปๅจ\n634002 0 ไป่ไฝฟVOCๅซ้่ฟไฝไบๅฝๅฎถ่งๅฎ\n634003 0 ่ญฆๅฏ่้ป่ฆๅๅ็ฎก็ปๆPKๅ\n634004 0 ๆตๆฑ็ๆธฉๅฒญๅธไบบๅฆๆไฝ ไปฌๆ็ๆณ็่ฏท็ธไบ่ฝฌๅ\n1000 Processed\n classify content\n634500 0 ่ฆไธไฝ ไปฌๅๅ็ฎก็ธ็ธ้ฃ้ๅปๅง\n634501 0 ไปๅคฉ่็ธ่ฏดๆๅคฉๅ็็ๅฃฐ้ณๅๅฝๅฝ\n634502 0 ๅฐฑๅๆๆฟ่ตทๆๆบๅทๅพฎๅๅฐฑไผๅฟ่ฎฐๅฝๅๆฏๆณ้ฎๅบฆๅจๆฅ่ตๆๆฅ็\n634503 0 ่ฏทๅคงๅฎถ่ตๅฉๆๆๅ็ๅบๅๆ้ฑๅบ้ฑ่ฟๆ่ฏทๅธฎๆๆพไธ็ไธๆๅฅฝ็ๅป็ๅฟซ้ๅฐๆๅฎถๆๅ่ฏทๅคงๅฎถ่ฎฐไฝๆๆฏๆฐธ่ฟ็ฑไฝ ไปฌ็\n634504 1 ๆฐๅนดๆฐๆฐ่ฑก๏ผๆฟๆจๆไธชๅฅฝๅฝข่ฑก๏ผๅฅฝๆถๆฏ๏ผๅฅฝๆถๆฏใๅๅฐๅๅๆฐๆฌพไธๆญไธๅธ๏ผๆฌๅบๆฃ่กฃ๏ผ่ฒ็ป็บฟ่กฃๆไบๆ๏ผ...\n1000 Processed\n classify content\n635000 0 ๅ ๆญคไบง็่
่ดฅใๅ้
ไธๅ
ฌใๅ
ฌๆ่ตไบงๆตๅคฑ็ญ้ฎ้ข\n635001 0 ่ไบไบๅ
ณไบ็ฒๅฃณ่ซ่ฝฆ่ดด่ฎพ่ฎกๅๆถ้ธฆ็้ฎ้ข๏ฝ\n635002 0 BetteๅฏนTina่ฏด็้ฃๅฅโโโๅฝๆๆชๅฟ่ช้ฎ\n635003 0 ๆ ้กๅๆๅ่ฏ๏ผไปไธปๆตๆฅไปทๆ็จณ\n635004 0 ๆฌฃๆฌ็นๅซ้ๅฏน่ฝฆ่ฝฝ็ต่่ฎพ่ฎกๅฎๅค็ๆบ่ฝ็ตๆบ็ฎก็ๅญ็ณป็ป่งฃๅณๆนๆก\n1000 Processed\n classify content\n635500 0 ่ถๅบ้จไนฐๆฐดๆฅๅฐไธไธชๆๆบๆ็ฝ็ๅฐๆน\n635501 0 ๅ
่ดนๅไบซๆตๆฑxไปฝๅๅถ่้ธ\n635502 0 ๅคไป็็ธไนๆญ็ง้ญๆ้ฃ่ไธๆฏไผ ่ฏดไธญ็้ฃไน่ฟทไบบ\n635503 0 ๆฎ้ๅฑฑไฝไบๆตๆฑ็ๆญๅทๆนพไปฅไธ็บฆ100ๆตท้\n635504 0 ๅๆญฃๆๅฐฑๆฏ็ไธๆฏ่ฟ็งๆฝ่งๅ\n1000 Processed\n classify content\n636000 0 ไปๅฐๅญฉๅฐๅญๅฆๅฐ100ๅฒ็่ไบบ้ฝๅฏไปฅๆ็จ\n636001 1 ๆธฉ้ฆจๆ็คบ: ๅไฝไผไธ่ๆปๆจไปฌๅฅฝ๏ผๅๅฃซๅใ้ฉฌไธฝ้ข็ฅไฝ ไปฌxxxxๅนด็ๆๅ
ด้ใ่ดขๆบๅนฟ่ฟใ ๅจๆฐ็ไธ...\n636002 1 ๆฑๅคดๆดๅPARTY้
ๅงๆฌข่ฟๆจ๏ผๅฅขๅๅ
ธ้
่ฃ
ไฟฎ๏ผ้กถ็บง้ณๅ่ฎพๅค๏ผ้ซไธญๆกฃๅคงๅฐๅ
ๅข๏ผๆฏๆๅฅ็ฎไธๅ็ฒพๅฝฉๅจฑ...\n636003 0 ๆฏๆ
ๆธธ่งๅ
ๅคงๅณก่ฐท็ๆไฝณๆถๆ\n636004 0 ่ฟๆ ๅฟ็่ฟๆฌไปฅ60ใ70ๅนดไปฃ็ฅ้ๅปบ่ฎพๅๅบไธบไธป่ฆ้ขๆ็ไธ่พ็ผ็บๅทฅไฝๅทฒ่ฟๅ
ฅๆถๅฐพ้ถๆฎต\n1000 Processed\n classify content\n636500 0 ๅไบฌ็้ฃไธๅนๅจ้ๅบ้ๆผไบๅ\n636501 0 ไธไผ่ฎพ่ฎก่กจๆ ผไธไผ็่กจๆ ผไธไผๅๆ็ป่ฎก็ ็ฉถๆฐๆฎ\n636502 0 45ไธชๆฐ็็ง่ต็น็ฑๅธไบคๆๆฐๆ็ซ็ๅ่ตไผไธๆๅผๅ
ฌๅธ่ด่ดฃๅปบ่ฎพ\n636503 0 ๅๆถ่??่ดชๅฎ็็ป่ชๅทฑๅ่ดฟ็ดข่ดฟ็โๆ้โ๏ผๅๆถ่่ดชๅฎ็็ป่ชๅทฑๅ่ดฟ็ดข่ดฟ็โๆ้โๆฏไบๅจๅฎๅบ่ก่ดฟ...\n636504 0 ๆตๆฑๆฏๅฎถ่ฏดๆตๆฑ็ๆฏ็ฝ็ซๆนๅ็ๆฏ้ป็ซ\n1000 Processed\n classify content\n637000 0 โๅฐๆถ็็Edๅฐๅบ้่ฏทไบๅคๅฐไบบๅปๆธฉๅธๅฉๅข\n637001 1 ๆจๅฅฝ๏ผๆฌข่ฟ่ด็ตๆตทๅฎ่น่ถ็ฉ่ตๆ้ๅ
ฌๅธ๏ผๆฌๅ
ฌๅธไธไธๆนๅๅ็ง่งๆ ผๆธ็จ้ขไธ็ปณ๏ผๅๆบ๏ผๆๆกฉ๏ผ็ตๆขฏ๏ผๅกๅ...\n637002 0 ๆฌ่ตๅญฃ็ปๆgoogleๅฎ่\n637003 0 ่็็ด ๅๆถฒ?10%็ๅๅฆๆฐดๆๆ\n637004 0 ๅจๅไบฌ็ๅๅคง่ก้ไธ่ตฐ็ๅ็่ฝฆ\n1000 Processed\n classify content\n637500 0 comๅฅถใๅฅถ้
ชใ้
ธๅฅถโฆโฆ่ฅฟๆน็ๅฅถๅถๅไธๅฆ\n637501 0 ็ฎๅ็จๆทๅฏไปฅ้่ฟWindowsStoreๅบ็จๅๅบไธ่ฝฝ่ฟ\n637502 0 ๅธ่พพๆๅฎซ็ไธปไฝๅปบ็ญไธบ็ฝๅฎซๅ็บขๅฎซไธค้จๅ\n637503 0 ็พๅบฆไธไธๅ็ฐ้ขๆ่ฟๆฏๆบๆฒ้็\n637504 0 ๅฏน้
้ฉพ่ฟๆณ่กไธบ็ๅธธๆๅไธ้กนๆดๆฒป\n1000 Processed\n classify content\n638000 0 ไปๅคฉๅพฎ่ฝฏๅไธบWindows10Build10240ๆจๅบไธ้กนๅฎๅ
จๆดๆฐ\n638001 0 ๆฌไบบ7ๆ9ๆฅๅจ่ๅท47่ทฏๅ
ฌไบค่ฝฆ้ๅฐไธไฝๅฅณๅญฉ\n638002 0 ๅไบฌๆบๆๅผ ๅฎถๅฃ็ณๅ2022ๅนดๅฌๅฅฅไผๆๅ\n638003 0 ็ณๆไนฆๅ้ๅธไฝๆฟไฟ้ๅๆฟไบง็ฎก็ๅฑ\n638004 0 ่ฎพ่ฎกๅธ็ปๆฏๅชๆฏๅญ้ฝๅ ไธไธไธช90ๅบฆ่ง\n1000 Processed\n classify content\n638500 1 ๅ
ณๆณจๅฐๆพ๏ผๅฉๆจๆๅ๏ผ่ดญไนฐๅฐๆพ๏ผๆพๅฟๆฝๅทฅ๏ผ่ดจไฟxๅนดxxxxxๅฐๆถ็่ถ
้ฟ่ดจไฟ่ฎฉไฝ ๆดๆพๅฟไฝฟ็จ๏ผๆดๅค...\n638501 0 ๆ่งๅพๆฒกๆๅฟ
่ฆๆๅฐ้ๆธฉๅบฆ่ฐ่ฟไนไฝ\n638502 0 ไปๆฅ่ตฐๅฟ๏ผๆฌง็พ่กๆๆดไฝ่ตฐ้ซ\n638503 0 ๆฏๅจๆท3ๆฌก่ฝๅค่ฎฉ็ฎ่คๅๅพ็ฝๆ็ป่
ป\n638504 0 ๅฆ็ๅฏน็็ต่ไธๅคฉไบๅ็ง็พๅบฆๅ็งๆๅญฆๅฐฑๆฏ็จmatlabๅๆไธๅบๆฅๅฝๆฐไนไธไผ็จ่ๅญ่ฆ็ฏไบ\n1000 Processed\n classify content\n639000 0 ็ฑๆฑ่ๅฎๅ
ดไธญๅฝ็ฏไฟ็งๆๅทฅไธๅญ็ฎกๅงไผ\n639001 0 ไบๅไปๅ่ญฆๆน่ฟๆณไธๅฎก่ขซๆณ้ข้ฉณๅ\n639002 0 1็็บขๅ
่ฎฉๆๆๅไธไธ่ๆฅ็ๆฐๆฐ\n639003 0 ไธ่ฟ่กๅฐฑๆ็คบ็ผบๅฐd3dx***\n639004 0 ้ฃไนๅฐฑไผๅบ็ฐMACD็ฐๅจ่ฏฅๅทจ้ๅคงๅไฝ็ฝฎๅค\n1000 Processed\n classify content\n639500 0 ่ฎค่ฏไฟกๆฏไธบโๆตๆฑๅคง้ซๅๅ็ปๆ้ๅ
ฌๅธIT้จ็ป็โ\n639501 0 ็้ๅฐๅนดๅๅฑๅบ้ไผ็งไนฆ้ฟใ็้ๅนดๅไธๅฐฑไธๆๅกไธญๅฟไธปไปปไปปๆใ็้ๅฐๅนดๅๅฑๅบ้ไผๅฏ็งไนฆ้ฟ็ฝๆๆไธ...\n639502 0 ้่ใๆถ่ดน้ขๅๅ
ทๆ่ฏๅฅฝไธ็ปฉๆฏๆ็ๆ่ตๆ ็\n639503 0 ๆฏๅฝๅฎถโ211ๅทฅ็จโๅโ985โฆ\n639504 0 ไผ็ง็ๅไธไธๆฏ่งๆจก่ๆฏไผ ๆฟๅนธ็ฆไธๅ่ดจ\n1000 Processed\n classify content\n640000 0 ๆฒชๆทฑไธคๅธๆต้ๅธๅผๆฅ405231ไบฟๅ
\n640001 0 ๆไปๅฆ่ฆ่ขซ่ฐทๆญ็บธ็้ป็งๆๅ่ทชไบๆๆ\n640002 0 ็พๅบฆ้้ฃไธช่ฏดๅฏไปฅ็จ้
ธๅฅถไปฃๆฟ้
ตๆฏ่ธ้ฆๅคด็\n640003 0 ็ๆฅไธ่ฝฆๆฏๅไบๅๆญ่กไธ็่็ณ\n640004 0 ๅฐ็ฎๅไธบๆญขๅพทๆธ
ๅฟๅทฒๆ36ไธชๆๅก็ซ็น\n1000 Processed\n classify content\n640500 0 ไฝ้ข้จๆฐด่ฟไนๆฏๅพๅค็พ็
็็็ถ\n640501 0 BigBang่ฟๆฌกๅๅฝๅจ้้่ๅฎๅคงๅ\n640502 0 ็ฐๅจๅคฉๅคฉๆ็็ต่ง็่ฑๅ้ชจ\n640503 0 ๅจ่่่ฟไธชๆไธๅชๆ
ๆธธ็ฉโฆ\n640504 0 ๅชๅ ไฝ ่ฏดๆๆบๆพ่ๆดๆๅ
็ตไผ็็ธๅพ่็\n1000 Processed\n classify content\n641000 0 ไธญๅฝ่ช็ฉบๅทฅไธ้จ้จๅจxDๆๅฐ็้จๅๅบ็จ้ขๅๅทฒ่ฟ\n641001 0 HCไธ่ฒๆฐดๆฏ็ฑปไบบ่ถๅ่็ฝ็ณปๅไบงๅ็ไบงๅๅฎถ่ฅฟๅฎๅทจๅญ็็ฉๅบๅ ๆๆฏ่กไปฝๆ้ๅ
ฌๅธ็โๅฝๅฎถ็็ฉๆๆๅทฅ็จ...\n641002 0 ๆฅๆฌ่ฟๆฌพMUJIๆ ๅฐ่ฏๅ้ณๅ
ๆจฑ่ฑไฟๆนฟๆปๆถฆๅๅฆๆฐด400ML็ๅปๅนด้้ๅๅฎ็ๆๅคไบ10็ถ\n641003 0 ๅนถ่ฎกๅๅๆจ่่ต้ขๅบฆๅ
ไธๅฐไบ100ไบฟๅ
่ฟ่กๅนถ่ดญๅไฝ\n641004 0 ไฝ ็ฅไธ็ฅ้KFCๆๅ ไธชไบบๆไฝไฝ ๅ\n1000 Processed\n classify content\n641500 0 ไบบ็็็ฌฌไธๆญฅๅ~่ฐ้ฝๆณ่ชๅทฑ็ๅญฉๅญๅฟตไธๆๅฅฝ็ๅญฆๆ ก\n641501 0 ๅฅฝๅ่ฟๆๅฅฝๅค็ๆ่ฟฝ้็ไบบๅฆไป็ป้ไธๅปๆๅ็ฐๆถ้ดๆนๅไบๅคชๅค้ฃไบ็พๅฅฝ็ๅๅฟๆบๅฏๆ็ไธ่ฟ่ฟๅฅฝๆ็นๅฟตๆณๅจ\n641502 0 ๅ
ๆฌ็ซ็ฎญใ่ชๅคฉ้ฃๆบ็ญ่ชๅคฉ่ฟ่พๅจๅๅ
ถ็ปไปถใๅ
ๅจไปถ\n641503 0 ๆฅ้ๅผ่ฟฐ็พๅฝๆฟๅบๅฎๅใๅๅคไบคๆถๆฏไบบๅฃซ\n641504 0 3ใ่ฅฟๅฎๅบ็ง่ฝฆ่ตทๆญฅ่ฐๅฐๅๅ
ไนๅฎข๏ผๆ้ซๆๅก่ดจ้\n1000 Processed\n classify content\n642000 0 ๆฒณๅใๆฑ่็ญxไธชไธปไบงๅบๅ็ฑป็ฒฎ้ฃไผไธๆถ่ดญๆฐไบงๅฐ้บฆxxxxไธๅจ\n642001 0 ๅไธๅฐwonbincๅคง็ฅ็ๆ่ฟฐๆจ่\n642002 0 ไฝฟๅพxxxxๅนดๅบฆๅๆๅบๆ่ฒ็ป่ดนๅ็ใ่ง่ใๆๆๅฐไฝฟ็จ\n642003 0 ๆบงๆฐดๆไธไธชๆดๆฐ็ๅ๏ผ้ๅฒๅฐผ็ยท็ญๅพท็ยทไผๆฏ\n642004 0 ๅธธๅทๅธ็ฏๅข็ๆตไธญๅฟใๅธธๅทๅธๆฐ่ฑกๅฐ2015ๅนด7ๆ31ๆฅ15ๆถ่ๅๅๅธ\n1000 Processed\n classify content\n642500 0 ่ฟๆๅ ไธช็ฎ็๏ผ่ฎฉๆบๅจไบบๅถ้ ่
ไธไป
ๅจไธ้ฟ้ขๅๅๅฑ\n642501 0 2015ๅนด7ๆ18ๆฅๆฏๅฟๅญ็ฌฌไธๆฌกๅ้ฃๆบ\n642502 0 ๅคง็ๅๆถ+ๆฟๅๆถจ่ท+ไบๅ้ๅผๅจ+่ช้่ก\n642503 0 ไธๅ้่ฃ
ไบ็ต่ไธ่ฝฝๅฎ็ฝ้ฉฑๅจๅคชๆ
ขๅฐฑๅ็จไบ้ฉฑๅจ็ฒพ็ต\n642504 1 ๆ๏ผๅฝฉๅฆxxx้xxๅ
็๏ผ็พๅณ้ข่xxๅ
xxไปถ่ฟๅบๅฐฑๆๅฐ็คผๅ็ธ้ๅฆ๏ผๆๆถ้ดๆจๅฏไปฅ่ฟๆฅ็็๏ผๆดป...\n1000 Processed\n classify content\n643000 0 xxๅฒ็ทๅญ็ซ่ฝฆไธๅผบๅฅธxxๅฒๅฐๅฅณ\n643001 0 ไนๅฐฑๆฏไนๅ็Win10Build10240\n643002 0 ็พๅฝ่ชๅคฉๅฑNASAๅฎฃๅธๅ็ฐๅฆไธไธชโๅฐ็โ\n643003 0 ๅซๆ๏ผdingxiaoqixxxxxx\n643004 0 ๆ ้กๆฉๅคฉๅ็ฆพๅฝฑๅๆฐไนๅๅบxๆxๆฅๆๆ\n1000 Processed\n classify content\n643500 0 com/168GrandIBMlivefeedๅ
ซๅคง่ฎฒๅธไธ็ฅ็งไธป่ฎฒไบบๅฐไผไธบ่ฟๅบๆดปๅจๅธฆๆฅๅฆไธไธช้ซๅณฐ\n643501 0 ็ถๅ้ฃไธชๅป็โฆโฆ็ซ็ถๅฐฑๆฏๆณฝๅก็ผๆฏๅขโฆโฆๅไธ่ไผ โฆโฆไธๅบๅบ็ฎ็ดๅฟ้ไธๅจโฆโฆ\n643502 0 ไธๆๅจไพฆ่ฎฏๅๅฅณๆน็ซ็ถๆช่ญฆๅฏ่้ปโ็ฎกๅคชๅคโ\n643503 0 ไป7ๆ29ๆฅไธญๅ12็นๅฐ8ๆ1ๆฅ0็น\n643504 1 ไบฒ็ฑ็ๆๅ๏ผๆๆ่ฟๅจๅ้ฟ่ฟช่ๅ
ไนไธน็ญๅ็้ไปฃ็๏ผๅฆๆๆ่ฟๆน้ข็้่ฆๅฏไปฅ่็ณปๆใ่ฐข่ฐขใไฝ ๅฏไปฅไป...\n1000 Processed\n classify content\n644000 0 ๆฑ่ๅซ่งๅจๆญ็ปงๆฟ่
ไปฌๅฝ่ฏญ็ๅๅๅๅๅๅๅๅๅๅๅๅๅ่ฏดๅฎ่ฏๅฝๆถๆๅฐฑๆฒกๆไน็\n644001 0 ไธๅจไธไธดๆถๅ
ด่ตท่ฎข็beachholidayๆๅคฉๅป่ฅฟ็ญ็็Majorca\n644002 0 ๆฑ่่ดจ็ๅฑๅผๅงไบๅฏนๆฃ้ชๆบๆ็ๅ
จ่ฟ็จ็็ฎก\n644003 1 ๅณกๅฑฑ็็ซ็ๅไธบๅ้ฆๅนฟๅคงๅฎขๆท๏ผ็ฐ้้ๆจๅบไผๅๅกไผๆ ๆๅก๏ผๅกๅฐๆฌๅบๆถ่ดนๆปกxxๅ
ๅณ้ไผๅๅก๏ผไผๅ็จ...\n644004 0 xxxxๆฐๆฌพๆฌง็พๆถๅฐๅ่ฉๆพ็ฆๆทฑV้ขไฟฎ่บซ้ฑผๅฐพ็ฎ็บฆ้็ ๆๅฐพๅฉ็บฑ็คผๆ\n1000 Processed\n classify content\n644500 0 ๆ้ฟ็ปๅฐๆฏ่ฆ้ขๅฏน่ดจ็๏ฝ่ดฃ้ช็\n644501 0 ไธ็ฅ้ๆฏ็ฅ้ไบ็็ธ็ไบบไผๆ้\n644502 0 ไธไธชๆ1800ไฝ ๅฏไปฅๆไธชๆ
ๆธธ็ป่ดนไธๅคฉ่ต100\n644503 0 ๆ้่ฆ็kittyๆงๅฆๅฆๅฏๅไธๅซ้่ฟ\n644504 0 ๅ16ๅนดๆฅๆฅๅ็
ง้กพ่ชๅทฑ็ๅๅ้ฟๅงจไปฌๆฅๅ\n1000 Processed\n classify content\n645000 0 OMEGAๆฌง็ฑณ่ๆๅบง็ณปๅๅ
จๆฐๅ่ฝด็ทๅฃซ่
่กจไธไธบ็ฐไปฃ็ทๅฃซ่ๆ้ \n645001 0 xใๅฐฝ้้ๆฉๅคฉ็ถไบงๅๆ้ๅพ้่ฆ\n645002 0 ๅณ่ตขๅพโ650ไบฟ็พๅ
CFOโ็็พ่ช\n645003 0 ๆฏไบฒ้่ฆ่็ฝ่ดจไพ็ปๅญๅฎซใ่็ๅไนณๆฟ็ๅ่ฒ\n645004 0 ไฝ ๅฝไบบๆๅๆฌขๅฒ็ฌๅฐๅบฆๅผบๅฅธๅคงๅฝ\n1000 Processed\n classify content\n645500 0 ๅปบๆไธปไน็ๆๅบๆยทยทยทยทยทยท\n645501 0 ๅพๅคไบบๆฏไธๅคฉ่ตดๅฎดNไธช้็บขๅ
Nไธช\n645502 0 ่ซไบๅป็ๆๆถ+19ๅบๆฃ่
ๅผ ๆ็็บขๅ
200ๅ
\n645503 0 ๆ
ๆธธ็็ฌฌไธๅคฉๅคง้จๅ้ฃๆบๅบฆ่ฟ\n645504 0 ๅฑๅน็ป้ขไธๅๅฃฐ้ณ็ปง็ปญ30็งๅๆขๅคๆญฃๅธธๆญๆพ\n1000 Processed\n classify content\n646000 0 ๅไบฌๆๆ่ฟ็งๆจกๅผ่ฝไธบ็ ด่งฃโๅๅพๅดๅโๆพๅฐๅบ่ทฏ\n646001 0 ๆฐ่ฅฟๅ
ฐ15ๅฒ็ๅญฆ็ๅจ้
่ฏป่ฝๅใๆฐๅญฆ่ฝๅใ็งๅญฆๅ่งฃๅณ้ฎ้ข็่ฝๅๅไธชๆน้ข่กจ็ฐๅ่\n646002 1 ๅ่ฟๅไบๅ
ๅฎต่๏ผๅฑ
ๆ่ฃ
้ฅฐๅฐๅด็ฅๆจ๏ผ็พๅนด่กๅคง่ฟ๏ผๅ
จๅฎถๅฅๅบทๅฟซไนใๅกๅจxๆๆฅๅ
ฌๅธๅจ่ฏข่ฃ
ไฟฎไบ้กนๅๆๅคง...\n646003 0 โๆพ่ฃ
ไฟฎๅ
ฌๅธๅฟ
้กปๅ
ๅทฅๅ
ๆโโฆโฆๅจ่ฃ
ไฟฎ่ฟ็จไธญ\n646004 0 ไปๅคฉๅจๆๆบๅบ็่งไธๅฏนไบๅ
ญๅๅฒ็ๅคซๅฆ\n1000 Processed\n classify content\n646500 0 ไปๅคฉๅฎถ้จๅฃๅฐๅ่ดฉ็ๆๅญ้ฝ่ขซๅ็ฎก็ ธไบ\n646501 0 ใๆนๅ็ตๆขฏๅไบบไบๆ
ๆญป่
ไธๅคซๅไฟก็ฅญๅฅ ไบกๅฆปใ\n646502 0 ๆ ้ก่ท่ฟ็งฐๅ
ถๆ ๆๅจๅไฝฟ็จโ่ฅฟไบ็นโๅๆ \n646503 0 ๅฆๆไธ็ด่ฎฐๅพไธ่ฆ็ฉๆๆบๅฐฑๅฅฝไบ\n646504 0 โไฝๅฝๆ้ ่ฟๅๅคๆๆถๅฎ่ขซๆๅ่ทไบ\n1000 Processed\n classify content\n647000 0 ๅ็ฐ็จๆๅๅคดๅๆๆบๅณไธ่งไผๅพ็ซ\n647001 0 7ๆไปฝๅผ็ฆๅบๅๅๅธ็ฎก็ไธๆงๆณ้จ้จๅ
ฑๆๆฃๆธฃๅ่ฝฆ57ๅฐ\n647002 0 ่ขซ้่ไธๅ็งๅญฆ็ ็ฉถๅๆๆฏๆๅกไธๅไปฃ\n647003 1 ไฝ ๅฅฝ๏ผไธๆฐ่ดงๅฆ๏ผ่ฟ่กฃ่ฃ๏ผ็ไธไธ่กฃ๏ผ่กฌ่กฃ๏ผไธ
ๆค้ฝ้ๅธธไธ้๏ผ้ข่ฒ้ฝๅพๆผไบฎๅฆใๆๆถ้ดๅฏไปฅ่ฟๆฅ็็ใ...\n647004 0 ไปฅๅๅฃฐๅฟๆตฉๅคง็ๅ่
่ฟๅจ็ไธๆญๆทฑๅ
ฅ\n1000 Processed\n classify content\n647500 0 ๆฝฎๆถๆตทๅคฉ้ๆฌๅธๆญฃๅฝๆถโโๅ
จ็้กน็ฎ่งๆฉๆดปๅจ้ๅ้กน็ฎๅปบ่ฎพ็ปผ่ฟฐ\n647501 0 ไปๅนด็ฌฌ9ๅทๅฐ้ฃ็ฟ้ธฟไธญๅฟไบ7ๆ11ๆฅ23ๆฅ็ฉฟ่ฟ็ปๅ
ดๅธๅบ็ๆฆ็ไธบ70%\n647502 0 ็ฐๅจๅ
จไธ็็้่ใ็ปๆต็่งๅฟต้ฝๅๅฏๆฉๆฏโๆถ่ดนๅบๆฟ็ไบงโ็่ฎบ็ๅฝฑๅโฆๅฏนไบ็ฉ่ดจ็ๆตช่ดนใ็ฏๅข็ๆฑกๆ...\n647503 0 ๆจๅคฉ่ขซๆณ้กพๅพ็จ่ฏดไปๅปๆณ้ขๆฒกไธชๅฝฑๅฟ\n647504 0 ๅฝๆๅ
จๅฝๆ่ดงๅธๅบๆไบค้ไธบ262\n1000 Processed\n classify content\n648000 0 ๅคงๆไธๅตๅฐๆ ๆณๅ
ฅ็กๆ็ปๅไบฌๆตทๆทๅ็ฎกไธพๆฅ็ซ็ถๅฌๅฎไบๆ
ๅๅๆ็ต่ฏ็ช็ถ่งๅพๅฟๅฏๆ ๆฏ็ฐๅจ็ปง็ปญๅฟๅ็ๅฏน...\n648001 0 ๅฐไบ9ๆ19ๆฅ่ณ21ๆฅๅจๅฑฑไธ็่ฑ่ๅธ้ช้ๆ
ๆธธๅบ่ช็ฉบ็งๆไฝ่ฒๅ
ฌๅญไธพ่ก\n648002 0 ๅไธไบๆ้กน็ฎๆฏ็ฎๅ่ๅทๅๆๅคง็ๅไธ็ปผๅไฝ้กน็ฎ\n648003 1 ๆญ็ฅๆจๅๅฎถไบบๆฐๅนดไธไบๅฆๆ๏ผไธญๅฐๅญฆๅ็งไผ็งๅจๆ กๆๅธ้ฝๅฏไปฅๅธฎๆจไป็ปใ้่ฆไธบๅญฉๅญ่ฏทไธ้จxๅฏนx่พ
ๅฏผๅฎถ...\n648004 0 ๆฉไธๅบ้จ็ๆถๅๅฟไบๆๆๆบๅธฆไธๅฐฑ้้จ\n1000 Processed\n classify content\n648500 0 ไธไธชๅไบฌ้ถ่ก้ฟๆๆ่ตไบบ็็ๆณ\n648501 0 ๏ผLOVERUMI้ฒๆ้็้่ฆๆง้ฒๆ็็ๅพ้่ฆ\n648502 0 ไธๅฑ็ญๆๅฎคๅ
ๆ็ฏ็ป360ๅบฆ็็ฉบไธญๅๅกๅ
\n648503 0 ๆๅพๆพไธๆๆบ็ก่งไบ??ไธ็ถ็ๅพ็\n648504 0 ๆฐ็ๅคดไธไธๆถๅไธๆถ็้ฃๆบ็ป่ฟ\n1000 Processed\n classify content\n649000 0 ๅๅถ้
ธใ่กฅ้ใ่ฐ็่บซไฝโฆโฆไฝ่ฆ็ๆญฃๅๅฐ็งๅญฆๅคๅญ็่ฏ\n649001 0 ๅงๅงๅจ้ฉๅฝ่ฏป็ ็ฉถ็็ฐๅจๅทฒ็ปๆฏไธไบ\n649002 0 ไฝๅๆฅไปฅๅWๅ
็ๅพ็ๆฅๅดๅนถๆฒกๆๆชๅฅน\n649003 0 ็ปไบ็พๅบฆ็ณฏ็ฑณๅ็พๅขๅณ่็ๆบไผ\n649004 0 ็ปผๅๆกไปถๆ
ๅตไพๆณๅคๅณ๏ผ่ขซๅไบบ้ซๆ็ฏ็็ช็ฝช\n1000 Processed\n classify content\n649500 0 ๆณ้ขไธๅฅ่ฏ๏ผไบบๅๆฅไบๆไปฌ็ฎก\n649501 0 ่ๅคฉๆ่ฐขๅป็่ฟ็ปไบฒ็ฑ็ไธไธชๅฎๆด็ๅฎถ\n649502 0 ๅจๅปบ็ญๆฝๅทฅไธญ็ปงๆฟๆญฃๆบๅฐไบง็็ฒพๅๆ็ฅไผ ็ป\n649503 1 ๅปบๆๅข่ดญไผ๏ผๅๅทฆๅณๆฒๅๅ๏ผๆนๅคช็ตๅจๅ๏ผไธ้น็ท็ โฆไฟ่ฏๅ
จๅนดๆไฝ๏ผไผๆ ๅคๅค๏ผ็คผ็ฉๅคๅค๏ผ็น้ๆจๅๆฅ...\n649504 0 ๅฎถ้็ต่็ๆฏๅก็่ฟไธๅฆๆๆบ\n1000 Processed\n classify content\n650000 1 ๆฐๅนดๅฅฝ๏ผๆๆฏ่ฟๅคงๅนฟๅบ็้ๅฎๅฐๆxxxxxxxxxxx๏ผไธ็ดๅๆจๆ่็ณป๏ผๆฐๅนดๆ็นๅซไผๆ ็็นไปทๆฟ...\n650001 0 ็ฆฝๅ
ฝๅป้ข็ฆฝๅ
ฝๅป็ๅฉ็จไบบไปฌๅฏนๅจ็ฉ็ๅๆ
ๅฟ\n650002 0 CHINADAILYๆๆฅ7\n650003 0 ๅไธไบบ่ฏๆWindows8ไนๅไปไปฌๅนถๆฒกๆไธ้ๅ้\n650004 0 ๅฐฑ่ฟ้ฃๆบ็้คไฝ ้ฝๅไบไธคๆฌกไบ\n1000 Processed\n classify content\n650500 0 ็พๅบฆๅพๅฎๅ่ทๅผๅฐ็ๆธ
ๆไธๆฒณๅญ็ฑปไผผ\n650501 0 ไบ่็ฝ+ๆฒน็ซ็ๅนณๅฐๅบ็ฐๆนๅไบไผ ็ปๅ ๆฒน็ซ็ๆถ่ดน็ๆดปๅบๆฏ\n650502 0 ๆ่ก้้่ฏทไบๅฝๅฎถๅฟ็ๅจ่ฏขๅธโโๅด็ซ็บขๅฅณๅฃซ็ปๆไปฌไธไธๅ โๅฟ็ๅฅๅบท่ฏพโ\n650503 0 ๆฒฟ็บฟๅๅธๅ้ใๆณฐๅทใๆฌๅทๅฐ่ฟๅ
ฅโๅจ่ฝฆๆถไปฃโ\n650504 0 ็ฑๆฟไบงๅๅฐๆ่ตๅใๅๅธ่ฟ่ฅๅ็่ง่ฒ่ฝฌๅ\n1000 Processed\n classify content\n651000 0 ๅ ไธบๆฌ้ไปๆฏไธบไบๅไบซ2594่ฟๅ ๅนด็้ซ้ๆ้ฟ\n651001 1 ไปๆฅx.xx-xx.xxๅๅปบ่ฃ
้ฅฐๅจๆญๆฅไบๆฅผไธพๅxxxxๆฐๆฅ็ฌฌไธๅฑ๏ผๆดปๅจๅทจไผๆ ๅๅ
[... cell output truncated: batch progress log of a labeled Chinese short-text classification preprocessing pass; each 1,000-row batch prints a "classify content" header, echoes sample rows as "index label text" (label 1 = promotional/spam SMS, label 0 = ordinary posts; row indices run to roughly 754,500), and closes with a "1000 Processed" line ...]
1000 Processed\n"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code"
]
]
|
d06207fa0f559b3ebc86802bd9160f20e280b273 | 88,828 | ipynb | Jupyter Notebook | Models/CNN_best.ipynb | DataMas/Deep-Learning-Image-Classification | 88916064a041ba1e8f3b0c397226710c1c9058fa | [
"MIT"
]
| 1 | 2021-04-14T18:50:50.000Z | 2021-04-14T18:50:50.000Z | Models/CNN_best.ipynb | DataMas/Deep-Learning-Image-Classification | 88916064a041ba1e8f3b0c397226710c1c9058fa | [
"MIT"
]
| null | null | null | Models/CNN_best.ipynb | DataMas/Deep-Learning-Image-Classification | 88916064a041ba1e8f3b0c397226710c1c9058fa | [
"MIT"
]
| null | null | null | 115.812256 | 24,292 | 0.82957 | [
[
[
"# Mount google drive to colab",
"_____no_output_____"
]
],
[
[
"from google.colab import drive\ndrive.mount(\"/content/drive\")",
"Mounted at /content/drive\n"
]
],
[
[
"# Import libraries",
"_____no_output_____"
]
],
[
[
"import os\nimport random \nimport numpy as np\nimport shutil\nimport time \nfrom PIL import Image, ImageOps\nimport cv2\nimport pandas as pd\nimport math\n\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nsns.set_style('darkgrid')\n\nimport tensorflow as tf\n\nfrom keras import models\nfrom keras import layers\nfrom keras import optimizers\nfrom keras.callbacks import EarlyStopping\nfrom keras.callbacks import ModelCheckpoint\nfrom keras.callbacks import LearningRateScheduler\nfrom keras.utils import np_utils\n\n\nfrom sklearn.metrics import confusion_matrix, classification_report\nfrom sklearn.preprocessing import LabelBinarizer\nfrom sklearn.preprocessing import MinMaxScaler\nfrom keras.preprocessing.image import ImageDataGenerator\n\nfrom keras import models, layers, optimizers\nfrom keras.callbacks import ModelCheckpoint\nfrom keras import losses",
"_____no_output_____"
]
],
[
[
"# Initialize basic working directories",
"_____no_output_____"
]
],
[
[
"directory = \"drive/MyDrive/Datasets/Sign digits/Dataset\"\ntrainDir = \"train\"\ntestDir = \"test\"\nos.chdir(directory)",
"_____no_output_____"
]
],
[
[
"# Augmented dataframes",
"_____no_output_____"
]
],
[
[
"augDir = \"augmented/\"\nclassNames_train = os.listdir(augDir+'train/')\nclassNames_test = os.listdir(augDir+'test/')\n\n\nclasses_train = []\ndata_train = []\npaths_train = []\n\nclasses_test = []\ndata_test = []\npaths_test = []\n\nclasses_val = []\ndata_val = []\npaths_val = []\n\nfor className in range(0,10):\n temp_train = os.listdir(augDir+'train/'+str(className))\n temp_test = os.listdir(augDir+'test/'+str(className))\n\n for dataFile in temp_train:\n path_train = augDir+'train/'+str(className)+'/'+dataFile\n\n paths_train.append(path_train)\n classes_train .append(str(className))\n \n testSize = [i for i in range(math.floor(len(temp_test)/2),len(temp_test))]\n valSize = [i for i in range(0,math.floor(len(temp_test)/2))]\n for dataFile in testSize:\n path_test = augDir+'test/'+str(className)+'/'+temp_test[dataFile]\n\n paths_test.append(path_test)\n classes_test .append(str(className))\n\n for dataFile in valSize:\n path_val = augDir+'test/'+str(className)+'/'+temp_test[dataFile]\n\n paths_val.append(path_val)\n classes_val .append(str(className))\n\n \naugTrain_df = pd.DataFrame({'fileNames': paths_train, 'labels': classes_train})\naugTest_df = pd.DataFrame({'fileNames': paths_test, 'labels': classes_test})\naugVal_df = pd.DataFrame({'fileNames': paths_val, 'labels': classes_val})",
"_____no_output_____"
],
[
"augTrain_df.head(10)",
"_____no_output_____"
],
[
"augTrain_df['labels'].hist(figsize=(10,5))\naugTest_df['labels'].hist(figsize=(10,5))",
"_____no_output_____"
],
[
"augTest_df['labels'].hist(figsize=(10,5))\naugVal_df['labels'].hist(figsize=(10,5))",
"_____no_output_____"
],
[
"augTrainX=[]\naugTrainY=[]\naugTestX=[]\naugTestY=[]\naugValX=[]\naugValY=[]\n\niter = -1\n\n#read images from train set\nfor path in augTrain_df['fileNames']:\n iter = iter + 1\n #image = np.array((Image.open(path)))\n image = cv2.imread(path)\n augTrainX.append(image)\n label = augTrain_df['labels'][iter]\n augTrainY.append(label)\n\niter = -1\n\nfor path in augTest_df['fileNames']:\n iter = iter + 1\n #image = np.array((Image.open(path)))\n image = cv2.imread(path)\n augTestX.append(image)\n augTestY.append(augTest_df['labels'][iter])\n\niter = -1\n\nfor path in augVal_df['fileNames']:\n iter = iter + 1\n #image = np.array((Image.open(path)))\n image = cv2.imread(path)\n augValX.append(image)\n augValY.append(augVal_df['labels'][iter])\n\naugTrainX = np.array(augTrainX)\naugTestX = np.array(augTestX)\naugValX = np.array(augValX)\n\n \naugTrainX = augTrainX / 255\naugTestX = augTestX / 255\naugValX = augValX / 255\n# OneHot Encode the Output\naugTrainY = np_utils.to_categorical(augTrainY, 10)\naugTestY = np_utils.to_categorical(augTestY, 10)\naugValY = np_utils.to_categorical(augValY, 10)",
"/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:37: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray\n/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:39: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray\n"
],
[
"train_datagen = ImageDataGenerator(rescale=1./255)\nvalidation_datagen = ImageDataGenerator(rescale=1./255)\ntest_datagen = ImageDataGenerator(rescale=1./255)\n\ntrain_generator = train_datagen.flow_from_dataframe(dataframe=augTrain_df,\n x_col=\"fileNames\",\n y_col=\"labels\",\n batch_size=16,\n class_mode=\"categorical\",\n color_mode=\"grayscale\",\n target_size=(100,100),\n shuffle=True)\n\nvalidation_generator = validation_datagen.flow_from_dataframe(dataframe=augVal_df,\n x_col=\"fileNames\",\n y_col=\"labels\",\n batch_size=16,\n class_mode=\"categorical\",\n color_mode=\"grayscale\",\n target_size=(100,100),\n shuffle=True)\n\ntest_generator = test_datagen.flow_from_dataframe(dataframe=augTest_df,\n x_col=\"fileNames\",\n y_col=\"labels\",\n batch_size=16,\n class_mode=\"categorical\",\n color_mode=\"grayscale\",\n target_size=(100,100),\n shuffle=True)",
"Found 3124 validated image filenames belonging to 10 classes.\nFound 252 validated image filenames belonging to 10 classes.\nFound 252 validated image filenames belonging to 10 classes.\n"
],
[
"model_best = models.Sequential()\n\nmodel_best.add(layers.Conv2D(64, (3,3), input_shape=(100, 100,1), padding='same', activation='relu'))\nmodel_best.add(layers.BatchNormalization(momentum=0.1))\nmodel_best.add(layers.MaxPooling2D(pool_size=(2,2)))\nmodel_best.add(layers.Conv2D(32, (3,3), padding='same', activation='relu'))\nmodel_best.add(layers.BatchNormalization(momentum=0.1))\nmodel_best.add(layers.MaxPooling2D(pool_size=(2,2)))\nmodel_best.add(layers.Conv2D(16, (3,3), padding='same', activation='relu'))\nmodel_best.add(layers.BatchNormalization(momentum=0.1))\nmodel_best.add(layers.MaxPooling2D(pool_size=(2,2)))\nmodel_best.add(layers.Flatten())\nmodel_best.add(layers.Dense(128, activation='relu'))\nmodel_best.add(layers.Dropout(0.2))\nmodel_best.add(layers.Dense(10, activation='softmax'))\n\nmodel_best.summary()",
"Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv2d (Conv2D) (None, 100, 100, 64) 640 \n_________________________________________________________________\nbatch_normalization (BatchNo (None, 100, 100, 64) 256 \n_________________________________________________________________\nmax_pooling2d (MaxPooling2D) (None, 50, 50, 64) 0 \n_________________________________________________________________\nconv2d_1 (Conv2D) (None, 50, 50, 32) 18464 \n_________________________________________________________________\nbatch_normalization_1 (Batch (None, 50, 50, 32) 128 \n_________________________________________________________________\nmax_pooling2d_1 (MaxPooling2 (None, 25, 25, 32) 0 \n_________________________________________________________________\nconv2d_2 (Conv2D) (None, 25, 25, 16) 4624 \n_________________________________________________________________\nbatch_normalization_2 (Batch (None, 25, 25, 16) 64 \n_________________________________________________________________\nmax_pooling2d_2 (MaxPooling2 (None, 12, 12, 16) 0 \n_________________________________________________________________\nflatten (Flatten) (None, 2304) 0 \n_________________________________________________________________\ndense (Dense) (None, 128) 295040 \n_________________________________________________________________\ndropout (Dropout) (None, 128) 0 \n_________________________________________________________________\ndense_1 (Dense) (None, 10) 1290 \n=================================================================\nTotal params: 320,506\nTrainable params: 320,282\nNon-trainable params: 224\n_________________________________________________________________\n"
],
[
"print(\"[INFO] Model is training...\")\ntime1 = time.time() # to measure time taken\n# Compile the model\nmodel_best.compile(loss='categorical_crossentropy',\n optimizer=optimizers.Adam(learning_rate=1e-3),\n metrics=['acc'])\n\nhistory_best = model_best.fit(\n train_generator,\n steps_per_epoch=train_generator.samples/train_generator.batch_size ,\n epochs=20,\n validation_data=validation_generator,\n validation_steps=validation_generator.samples/validation_generator.batch_size,\n)\nprint('Time taken: {:.1f} seconds'.format(time.time() - time1)) # to measure time taken\nprint(\"[INFO] Model is trained.\")",
"[INFO] Model is training...\nEpoch 1/20\n195/195 [==============================] - 87s 443ms/step - loss: 1.5492 - acc: 0.5419 - val_loss: 0.4457 - val_acc: 0.8373\nEpoch 2/20\n195/195 [==============================] - 86s 439ms/step - loss: 0.3078 - acc: 0.8933 - val_loss: 0.2915 - val_acc: 0.9087\nEpoch 3/20\n195/195 [==============================] - 85s 436ms/step - loss: 0.1132 - acc: 0.9614 - val_loss: 0.3068 - val_acc: 0.8968\nEpoch 4/20\n195/195 [==============================] - 86s 440ms/step - loss: 0.0780 - acc: 0.9808 - val_loss: 0.2856 - val_acc: 0.9246\nEpoch 5/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0408 - acc: 0.9869 - val_loss: 0.2254 - val_acc: 0.9444\nEpoch 6/20\n195/195 [==============================] - 86s 439ms/step - loss: 0.0308 - acc: 0.9909 - val_loss: 0.3072 - val_acc: 0.9286\nEpoch 7/20\n195/195 [==============================] - 85s 437ms/step - loss: 0.0409 - acc: 0.9857 - val_loss: 0.2902 - val_acc: 0.9246\nEpoch 8/20\n195/195 [==============================] - 85s 437ms/step - loss: 0.0526 - acc: 0.9827 - val_loss: 0.3072 - val_acc: 0.9206\nEpoch 9/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0241 - acc: 0.9896 - val_loss: 0.3179 - val_acc: 0.9127\nEpoch 10/20\n195/195 [==============================] - 85s 436ms/step - loss: 0.0223 - acc: 0.9945 - val_loss: 0.2930 - val_acc: 0.9405\nEpoch 11/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0151 - acc: 0.9952 - val_loss: 0.2063 - val_acc: 0.9444\nEpoch 12/20\n195/195 [==============================] - 86s 439ms/step - loss: 0.0185 - acc: 0.9940 - val_loss: 0.2123 - val_acc: 0.9563\nEpoch 13/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0434 - acc: 0.9863 - val_loss: 0.3235 - val_acc: 0.9484\nEpoch 14/20\n195/195 [==============================] - 85s 438ms/step - loss: 0.0478 - acc: 0.9856 - val_loss: 0.3105 - val_acc: 0.9365\nEpoch 15/20\n195/195 [==============================] - 86s 440ms/step - loss: 0.0110 - acc: 0.9966 - val_loss: 0.2986 - val_acc: 0.9405\nEpoch 16/20\n195/195 [==============================] - 86s 440ms/step - loss: 0.0169 - acc: 0.9932 - val_loss: 0.4730 - val_acc: 0.9286\nEpoch 17/20\n195/195 [==============================] - 85s 436ms/step - loss: 0.0693 - acc: 0.9743 - val_loss: 0.2832 - val_acc: 0.9405\nEpoch 18/20\n195/195 [==============================] - 85s 437ms/step - loss: 0.0265 - acc: 0.9925 - val_loss: 0.2911 - val_acc: 0.9365\nEpoch 19/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0233 - acc: 0.9920 - val_loss: 0.2732 - val_acc: 0.9524\nEpoch 20/20\n195/195 [==============================] - 85s 435ms/step - loss: 0.0167 - acc: 0.9940 - val_loss: 0.2515 - val_acc: 0.9603\nTime taken: 1713.1 seconds\n[INFO] Model is trained.\n"
],
[
"score = model_best.evaluate(test_generator)\n\nprint('===Testing loss and accuracy===')\nprint('Test loss: ', score[0])\nprint('Test accuracy: ', score[1])",
"16/16 [==============================] - 2s 111ms/step - loss: 0.4789 - acc: 0.9405\n===Testing loss and accuracy===\nTest loss: 0.47893190383911133\nTest accuracy: 0.9404761791229248\n"
],
[
"import matplotlib.pyplot as plot\nplot.plot(history_best.history['acc'])\nplot.plot(history_best.history['val_acc'])\nplot.title('Model accuracy')\nplot.ylabel('Accuracy')\nplot.xlabel('Epoch')\nplot.legend(['Train', 'Vall'], loc='upper left')\nplot.show()\n\nplot.plot(history_best.history['loss'])\nplot.plot(history_best.history['val_loss'])\nplot.title('Model loss')\nplot.ylabel('Loss')\nplot.xlabel('Epoch')\nplot.legend(['Train', 'Vall'], loc='upper left')\nplot.show()",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0621f6db1f8301e02bf44fe950e88798ea5aeb7 | 538,383 | ipynb | Jupyter Notebook | HW_exam/.ipynb_checkpoints/Exam_Prazdnichnykh-checkpoint.ipynb | AntonPrazdnichnykh/HSE.optimization | ca844bb041c614e0de95cab6de87db340323e59d | [
"Apache-2.0"
]
| null | null | null | HW_exam/.ipynb_checkpoints/Exam_Prazdnichnykh-checkpoint.ipynb | AntonPrazdnichnykh/HSE.optimization | ca844bb041c614e0de95cab6de87db340323e59d | [
"Apache-2.0"
]
| null | null | null | HW_exam/.ipynb_checkpoints/Exam_Prazdnichnykh-checkpoint.ipynb | AntonPrazdnichnykh/HSE.optimization | ca844bb041c614e0de95cab6de87db340323e59d | [
"Apache-2.0"
]
| null | null | null | 740.554333 | 25,460 | 0.954436 | [
[
[
"import numpy as np\nimport scipy.sparse as sp\nfrom sklearn.datasets import load_svmlight_file\nfrom oracle import Oracle, make_oracle\nimport scipy as sc\nfrom methods import OptimizeLassoProximal, OptimizeGD, NesterovLineSearch\nimport matplotlib.pyplot as plt\nfrom sklearn import linear_model",
"_____no_output_____"
]
],
[
[
"ะ ะตัะฐะตะผ ะทะฐะดะฐัั ะปะพะณะธััะธัะตัะบะพะน ัะตะณัะตััะธะธ ะธ l1-ัะตะณัะปััะธะทะฐัะธะตะน:\n$$F(w) = - \\frac{1}{N}\\sum\\limits_{i=1}^Ny_i\\ln(\\sigma_w(x_i)) + (1 - y_i)\\ln(1 - \\sigma_w(x_i)) + \\lambda\\|w\\|_1,$$\nะณะดะต $\\lambda$ -- ะฟะฐัะฐะผะตัั ัะตะณัะปััะธะทะฐัะธะธ.\n\nะะฐะดะฐัั ัะตัะฐะตะผ ะฟัะพะบัะธะผะฐะปัะฝัะผ ะณัะฐะดะธะตะฝัะฝัะผ ะผะตัะพะดะพะผ. ะฃะฑะตะดะธะผัั ัะฝะฐัะฐะปะฐ, ััะพ ะฟัะธ $\\lambda = 0$ ะฝะฐัะต ัะตัะตะฝะธะต ัะพะฒะฟะฐะดะฐะตั ั ัะตัะตะฝะธะตะผ ะผะตัะพะดะฐ ะณัะฐะดะธะตะฝัะฝะพะณะพ ัะฟััะบะฐ ั ะพัะตะฝะบะพะน ะดะปะธะฝั ัะฐะณะฐ ะผะตัะพะดะพะผ ะะตััะตัะพะฒะฐ.",
"_____no_output_____"
]
],
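[
[
"For reference, the proximal gradient step for $F = f + \\lambda\\|w\\|_1$ has the closed form\n$$w_{k+1} = \\mathrm{prox}_{\\alpha\\lambda\\|\\cdot\\|_1}\\big(w_k - \\alpha\\nabla f(w_k)\\big), \\qquad \\big[\\mathrm{prox}_{t\\|\\cdot\\|_1}(v)\\big]_i = \\mathrm{sign}(v_i)\\max(|v_i| - t, 0),$$\nwhere $\\alpha$ is the step size. The coordinate-wise soft-thresholding is what produces exact zeros in the solution.",
"_____no_output_____"
]
],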
[
[
"orac = make_oracle('a1a.txt', penalty='l1', reg=0)\norac1 = make_oracle('a1a.txt')\nx, y = load_svmlight_file('a1a.txt', zero_based=False)\nm = x[0].shape[1] + 1\nw0 = np.zeros((m, 1))\noptimizer = OptimizeLassoProximal()\noptimizer1 = OptimizeGD()\npoint = optimizer(orac, w0)\npoint1 = optimizer1(orac1, w0, NesterovLineSearch())\n\nnp.allclose(point, point1)",
"_____no_output_____"
]
],
[
[
"ะะทััะธะผ ัะบะพัะพััั ัั
ะพะดะธะผะพััะธ ะผะตัะพะดะฐ ะฝะฐ ะดะฐัะฐัะตัะต a1a.txt ($\\lambda = 0.001$)",
"_____no_output_____"
]
],
[
[
"def convergence_plot(xs, ys, xlabel, title=None):\n plt.figure(figsize = (12, 3))\n plt.xlabel(xlabel)\n plt.ylabel('F(w_{k+1} - F(w_k)')\n plt.plot(xs, ys)\n plt.yscale('log')\n if title:\n plt.title(title)\n plt.tight_layout()\n plt.show()\n ",
"_____no_output_____"
],
[
"orac = make_oracle('a1a.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)",
"_____no_output_____"
],
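[
"# Illustrative sketch added for clarity; not part of the original solution. OptimizeLassoProximal\n# is assumed to iterate w <- prox_{a*lambda*||.||_1}(w - a*grad_f(w)); the prox of the l1 norm\n# is the coordinate-wise soft-thresholding operator:\ndef soft_threshold(v, t):\n    # shrink every coordinate of v toward zero by t, zeroing out entries with |v_i| <= t\n    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)\n\n# hypothetical single step with step size a and regularization lam:\n# w_next = soft_threshold(w - a * grad_f(w), a * lam)",
"_____no_output_____"
],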
[
"errs = optimizer.errs\ntitle = 'lambda = 0.001'\nconvergence_plot(optimizer.times, errs, 'ะฒะตัะผั ัะฐะฑะพัั, ั', title)\nconvergence_plot(optimizer.orac_calls, errs, 'ะบะพะป-ะฒะพ ะฒัะทะพะฒะพะฒ ะพัะฐะบัะปะฐ', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), errs, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน', title)",
"_____no_output_____"
]
],
[
[
"ะะฐะผะตัะธะผ, ััะพ ะฑัะปะพ ะธัะฟะพะปัะทะพะฒะฐะฝะพ ััะปะพะฒะธะต ะพััะฐะฝะพะฒะบะธ $F(w_{k+1}) - F(w_k) \\leq tol = 10^{-16}$. ะะท ะผะฐัะตะผะฐัะธัะตัะบะธั
ัะพะพะฑัะฐะถะตะฝะธะน ะบะฐะถะตััั, ััะพ ััะพ ะพะบ, ัะฐะบ ะบะฐะบ ะฒ ะฒะตัะตััะฒะตะฝะฝัั
ัะธัะปะฐั
ัั
ะพะดะธะผะพััั ะฟะพัะปะตะดะพะฒะฐัะตะปัะฝะพััะธ ัะฐะฒะฝะพัะธะปัะฝะฐ ะตั ััะฝะดะฐะผะตะฝัะฐะปัะฝะพััะธ. ะฏ ัะฐะบะถะต ะฟััะฐะปัั ะธัะฟะพะปัะทะพะฒะฐัั ะฒ ะบะฐัะตััะฒะต ััะปะพะฒะธั ะพััะฐะฝะพะฒะบะธ $\\|\\nabla_w f(w_k)\\|_2^2 / \\|\\nabla_w f(w_0)\\|_2^2 <= tol$, ะณะดะต $f$ -- ะปะพัั ะปะพะณะธััะธัะตัะบะพะน ัะตะณัะตััะธะธ ะฑะตะท ัะตะณัะปััะธะทะฐัะธะธ ($F = f + reg$), ะฝะพ, ะฒะพะพะฑัะต ะณะพะฒะพัั, ะฝะต ะพัะตะฝั ะฟะพะฝััะฝะพ, ะผะพะถะฝะพ ะปะธ ัะฐะบ ะดะตะปะฐัั, ะฟะพัะพะผั ััะพ ะพะฝะพ ััะธััะฒะฐะตั ัะพะปัะบะพ ัะฐััั ััะฝะบัะธะธ.\n\nะะท ะณัะฐัะธะบะพะฒ ะฒะธะดะฝะพ, ััะพ ะผะตัะพะด ะพะฑะปะฐะดะฐะตั ะปะธะฝะตะนะฝะพะน ัะบะพัะพัััั ัั
ะพะดะธะผะพััะธ",
"_____no_output_____"
],
[
"ะะทััะธะผ ัะตะฟะตัั ะทะฐะฒะธัะธะผะพััั ัะบะพัะพััะธ ัั
ะพะดะธะผะพััะธ ะธ ะบะพะปะธัะตััะฒะฐ ะฝะตะฝัะปะตะฒัั
ะบะพะผะฟะพะฝะตะฝั ะฒ ัะตัะตะฝะธะธ ะพั ะฟะฐัะฐะผะตััะฐ ัะตะณัะปััะธะทะฐัะธะธ $\\lambda$",
"_____no_output_____"
]
],
[
[
"def plot(x, ys, ylabel, legend=False): \n plt.figure(figsize = (12, 3))\n plt.xlabel(\"lambda\")\n plt.ylabel(ylabel)\n plt.plot(x, ys, 'o')\n plt.xscale('log')\n if legend:\n plt.legend()\n plt.tight_layout()\n plt.show()",
"_____no_output_____"
],
[
"lambdas = [10**(-i) for i in range(8, 0, -1)]\nnon_zeros = []\nfor reg in lambdas:\n orac = make_oracle('a1a.txt', penalty='l1', reg=reg)\n point = optimizer(orac, w0)\n convergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน',\n f\"lambda = {reg}\")\n non_zeros.append(len(np.nonzero(point)[0]))\nplot(lambdas, non_zeros, '# nonzero components')",
"_____no_output_____"
]
],
[
[
"ะะธะดะฝะพ, ััะพ ะฟะฐัะฐะผะตัั ัะตะณัะปััะธะทะฐัะธะธ ะฟัะฐะบัะธัะตัะบะธ ะฝะต ะฒะปะธัะตั ะฝะฐ ัะบะพัะพััั ัั
ะพะดะธะผะพััะธ (ะพะฝะฐ ะฒัะตะณะดะฐ ะปะธะฝะตะนะฝะฐั), ะฝะพ ะบะพะปะธัะตััะฒะพ ะธัะตัะฐัะธะน ะผะตัะพะดะฐ ะฟะฐะดะฐะตั ั ัะฒะตะปะธัะตะฝะธะตะผ ะฟะฐัะฐะผะตััะฐ ัะตะณัะปััะธะทะฐัะธะธ. ะขะฐะบ ะถะต ะธะท ะฟะพัะปะตะดะฝะตะณะพ ะณัะฐัะธะบะฐ ะดะตะปะฐะตะผ ะพะถะธะดะฐะตะผัะน ะฒัะฒะพะด, ััะพ ัะธัะปะพ ะฝะตะฝัะปะตะฒัั
ะบะพะผะฟะพะฝะตะฝั ะฒ ัะตัะตะฝะธะธ ัะผะตะฝััะฐะตััั ั ัะพััะพะผ ะฟะฐัะฐะผะตััะฐ ัะตะณัะปััะธะทะฐัะธะธ",
"_____no_output_____"
],
[
"ะะพัััะพะธะผ ะตัะต ะณัะฐัะธะบะธ ะทะฐะฒะธัะธะผะพััะธ ะทะฝะฐัะตะฝะธั ะพะฟัะธะผะธะทะธััะตะผะพะน ััะฝะบัะธะธ ะธ ะบัะธัะตัะธั ะพััะฝะพะฒะบะธ (ะตัั ัะฐะทะพะบ) ะฒ ะทะฐะฒะธัะธะผะพััะธ ะพั ะธัะตัะฐัะธะธ ($\\lambda = 0.001$)",
"_____no_output_____"
]
],
[
[
"def value_plot(xs, ys, xlabel, title=None):\n plt.figure(figsize = (12, 3))\n plt.xlabel(xlabel)\n plt.ylabel('F(w_k)')\n plt.plot(xs, ys)\n# plt.yscale('log')\n if title:\n plt.title(title)\n plt.tight_layout()\n plt.show()",
"_____no_output_____"
],
[
"orac = make_oracle('a1a.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)\ntitle = 'lambda = 0.001'\nvalue_plot(list(range(1, optimizer.n_iter + 1)), optimizer.values, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน', title)",
"_____no_output_____"
]
],
[
[
"ะะปั ะฟะพะดัะฒะตัะถะดะตะฝะธั ัะดะตะปะฐะฝัั
ะฒัะฒะพะดะพะฒ ะฟัะพะฒะตัะธะผ ะธั
ะตัั ะฝะฐ breast-cancer_scale ะดะฐัะฐัะตัะต.",
"_____no_output_____"
],
[
"ะัะพะฒะตัะบะฐ ัะฐะฒะฝะพัะธะปัะฝะพััะธ GD + Nesterov ะธ Proximal + $\\lambda = 0$:",
"_____no_output_____"
]
],
[
[
"orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=0)\norac1 = make_oracle('breast-cancer_scale.txt')\nx, y = load_svmlight_file('breast-cancer_scale.txt', zero_based=False)\nm = x[0].shape[1] + 1\nw0 = np.zeros((m, 1))\noptimizer = OptimizeLassoProximal()\noptimizer1 = OptimizeGD()\npoint = optimizer(orac, w0)\npoint1 = optimizer1(orac1, w0, NesterovLineSearch())\n\nnp.allclose(point, point1)",
"_____no_output_____"
],
[
"print(abs(orac.value(point) - orac1.value(point1)))",
"0.0001461093710795336\n"
]
],
[
[
"ะกะฐะผะธ ะฒะตะบัะพัะฐ ะฒะตัะพะฒ ะฝะต ัะพะฒะฟะฐะปะธ, ะฝะพ ะทะฝะฐัะตะฝะธั ะพะฟัะธะผะธะทะธััะตะผะพะน ััะฝะบัะธะธ ะฑะปะธะทะบะธ, ัะฐะบ ััะพ ะฑัะดะตะผ ััะธัะฐัั, ััะพ ะฒัะต ะพะบ.",
"_____no_output_____"
],
[
"ะะทััะฐะตะผ ัะบะพัะพััั ัั
ะพะดะธะผะพััะธ ะดะปั $\\lambda = 0.001$:",
"_____no_output_____"
]
],
[
[
"orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)\nerrs = optimizer.errs\ntitle = 'lambda = 0.001'\nconvergence_plot(optimizer.times, errs, 'ะฒะตัะผั ัะฐะฑะพัั, ั', title)\nconvergence_plot(optimizer.orac_calls, errs, 'ะบะพะป-ะฒะพ ะฒัะทะพะฒะพะฒ ะพัะฐะบัะปะฐ', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), errs, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน', title)",
"_____no_output_____"
]
],
[
[
"ะะฐะถะตััั, ััะพ ัะบะพัะพััั ัั
ะพะดะธะผะพััะธ ะพะฟััั ะปะธะฝะตะนะฝะฐั",
"_____no_output_____"
],
[
"ะะทััะฐะตะผ ะทะฐะฒะธัะธะผะพััั ัะบะพัะพััะธ ัั
ะพะดะธะผะพััะธ ะธ ะบะพะปะธัะตััะฒะฐ ะฝะตะฝัะปะตะฒัั
ะบะพะผะฟะพะฝะตะฝั ะฒ ัะตัะตะฝะธะธ ะพั $\\lambda$",
"_____no_output_____"
]
],
[
[
"lambdas = [10**(-i) for i in range(8, 0, -1)]\nnon_zeros = []\nfor reg in lambdas:\n orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=reg)\n point = optimizer(orac, w0)\n convergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน',\n f\"lambda = {reg}\")\n non_zeros.append(len(np.nonzero(point)[0]))\nplot(lambdas, non_zeros, '# nonzero components')",
"_____no_output_____"
]
],
[
[
"ะะตะปะฐะตะผ ัะต ะถะต ะฒัะฒะพะดั",
"_____no_output_____"
],
[
"ะะพัััะพะธะผ ะฝะฐะฟะพัะปะตะดะพะบ ะณััะธะบะธ ะดะปั ะทะฝะฐัะตะฝะธะน ะพะฟัะธะผะธะทะธััะตะผะพะน ััะฝะบัะธะธ ะธ ะบัะธัะตัะธั ะพััะฐะฝะพะฒะบะธ (ะตัั ัะฐะทะพะบ) ะฒ ะทะฐะฒะธัะธะผะพััะธ ะพั ะธัะตัะฐัะธะธ ($\\lambda = 0.001$)",
"_____no_output_____"
]
],
[
[
"orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)\ntitle = 'lambda = 0.001'\nvalue_plot(list(range(1, optimizer.n_iter + 1)), optimizer.values, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'ะบะพะป-ะฒะพ ะธัะตัะฐัะธะน', title)",
"_____no_output_____"
]
],
[
[
"ะะพะฝะตั.",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code"
],
[
"markdown"
],
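[
"markdown"
],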
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d06224770637f53d41cec6b94186b3ae72820478 | 250,590 | ipynb | Jupyter Notebook | sngp_with_bert_aws.ipynb | tejashrigadre/Anomaly-detection-for-chat-bots | 7cac681983bc435953d472a3d2f91bcbe4bc756f | [
"MIT"
]
| null | null | null | sngp_with_bert_aws.ipynb | tejashrigadre/Anomaly-detection-for-chat-bots | 7cac681983bc435953d472a3d2f91bcbe4bc756f | [
"MIT"
]
| null | null | null | sngp_with_bert_aws.ipynb | tejashrigadre/Anomaly-detection-for-chat-bots | 7cac681983bc435953d472a3d2f91bcbe4bc756f | [
"MIT"
]
| null | null | null | 192.317728 | 51,284 | 0.88982 | [
[
[
"## Implementing BERT with SNGP",
"_____no_output_____"
]
],
[
[
"!pip install tensorflow_text==2.7.3",
"Collecting tensorflow_text==2.7.3\n Using cached tensorflow_text-2.7.3-cp38-cp38-manylinux2010_x86_64.whl (4.9 MB)\nCollecting tensorflow-hub>=0.8.0\n Using cached tensorflow_hub-0.12.0-py2.py3-none-any.whl (108 kB)\nCollecting tensorflow<2.8,>=2.7.0\n Using cached tensorflow-2.7.1-cp38-cp38-manylinux2010_x86_64.whl (495.1 MB)\nCollecting tensorflow-estimator<2.8,~=2.7.0rc0\n Using cached tensorflow_estimator-2.7.0-py2.py3-none-any.whl (463 kB)\nRequirement already satisfied: six>=1.12.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.16.0)\nCollecting libclang>=9.0.1\n Using cached libclang-13.0.0-py2.py3-none-manylinux1_x86_64.whl (14.5 MB)\nRequirement already satisfied: flatbuffers<3.0,>=1.12 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.12)\nCollecting keras<2.8,>=2.7.0rc0\n Using cached keras-2.7.0-py2.py3-none-any.whl (1.3 MB)\nRequirement already satisfied: wheel<1.0,>=0.32.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.37.0)\nRequirement already satisfied: gast<0.5.0,>=0.2.1 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.4.0)\nRequirement already satisfied: protobuf>=3.9.2 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.19.1)\nRequirement already satisfied: wrapt>=1.11.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.12.1)\nRequirement already satisfied: astunparse>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.6.3)\nRequirement already satisfied: google-pasta>=0.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.2.0)\nRequirement already satisfied: grpcio<2.0,>=1.24.3 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.42.0)\nRequirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.3.0)\nRequirement already satisfied: numpy>=1.14.5 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.19.5)\nRequirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.1.0)\nRequirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.7.4.3)\nRequirement already satisfied: tensorboard~=2.6 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.6.0)\nRequirement already satisfied: keras-preprocessing>=1.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.1.2)\nRequirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.1.0)\nRequirement already satisfied: tensorflow-io-gcs-filesystem>=0.21.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.21.0)\nRequirement already satisfied: absl-py>=0.4.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.10.0)\nRequirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.8/site-packages (from 
tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (59.5.0)\nRequirement already satisfied: google-auth<2,>=1.6.3 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.35.0)\nRequirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.6.1)\nRequirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.8.0)\nRequirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.0.2)\nRequirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.4.6)\nRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.3.6)\nRequirement already satisfied: requests<3,>=2.21.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.25.1)\nRequirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.8/site-packages (from google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.2.8)\nRequirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.8/site-packages (from google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.2.4)\nRequirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.8/site-packages (from google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.7.2)\nRequirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.8/site-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.3.0)\nRequirement already satisfied: importlib-metadata>=4.4 in /usr/local/lib/python3.8/site-packages (from markdown>=2.6.8->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.8.2)\nRequirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.10)\nRequirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.0.0)\nRequirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.26.7)\nRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2021.10.8)\nRequirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.8/site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.6.0)\nRequirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /usr/local/lib/python3.8/site-packages (from pyasn1-modules>=0.2.1->google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.4.8)\nRequirement already 
satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.8/site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.1.1)\nInstalling collected packages: tensorflow-estimator, libclang, keras, tensorflow-hub, tensorflow, tensorflow-text\n Attempting uninstall: tensorflow-estimator\n Found existing installation: tensorflow-estimator 2.6.0\n Uninstalling tensorflow-estimator-2.6.0:\n Successfully uninstalled tensorflow-estimator-2.6.0\n Attempting uninstall: keras\n Found existing installation: keras 2.6.0\n Uninstalling keras-2.6.0:\n Successfully uninstalled keras-2.6.0\n Attempting uninstall: tensorflow\n Found existing installation: tensorflow 2.6.2\n Uninstalling tensorflow-2.6.2:\n Successfully uninstalled tensorflow-2.6.2\n\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\ntensorflow-io 0.21.0 requires tensorflow<2.7.0,>=2.6.0, but you have tensorflow 2.7.1 which is incompatible.\u001b[0m\nSuccessfully installed keras-2.7.0 libclang-13.0.0 tensorflow-2.7.1 tensorflow-estimator-2.7.0 tensorflow-hub-0.12.0 tensorflow-text-2.7.3\n\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\n\u001b[33mWARNING: You are using pip version 21.3.1; however, version 22.0.3 is available.\nYou should consider upgrading via the '/usr/local/bin/python3.8 -m pip install --upgrade pip' command.\u001b[0m\n"
],
[
"!pip install -U tf-models-official==2.7.0",
"Collecting tf-models-official==2.7.0\n Using cached tf_models_official-2.7.0-py2.py3-none-any.whl (1.8 MB)\nRequirement already satisfied: tensorflow-text>=2.7.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (2.7.3)\nRequirement already satisfied: Pillow in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (8.3.2)\nRequirement already satisfied: matplotlib in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (3.5.0)\nCollecting pycocotools\n Using cached pycocotools-2.0.4-cp38-cp38-linux_x86_64.whl\nCollecting oauth2client\n Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)\nCollecting google-api-python-client>=1.6.7\n Using cached google_api_python_client-2.37.0-py2.py3-none-any.whl (8.1 MB)\nCollecting seqeval\n Using cached seqeval-1.2.2-py3-none-any.whl\nCollecting Cython\n Using cached Cython-0.29.28-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (1.9 MB)\nCollecting sentencepiece\n Using cached sentencepiece-0.1.96-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.2 MB)\nCollecting tensorflow-datasets\n Using cached tensorflow_datasets-4.5.2-py3-none-any.whl (4.2 MB)\nCollecting tensorflow-addons\n Using cached tensorflow_addons-0.16.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)\nRequirement already satisfied: psutil>=5.4.3 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (5.8.0)\nCollecting tensorflow-model-optimization>=0.4.1\n Using cached tensorflow_model_optimization-0.7.1-py2.py3-none-any.whl (234 kB)\nCollecting sacrebleu\n Using cached sacrebleu-2.0.0-py3-none-any.whl (90 kB)\nRequirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.19.5)\nRequirement already satisfied: tensorflow-hub>=0.6.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (0.12.0)\nCollecting py-cpuinfo>=3.3.0\n Using cached py_cpuinfo-8.0.0-py3-none-any.whl\nCollecting kaggle>=1.3.9\n Using cached kaggle-1.5.12-py3-none-any.whl\nRequirement already satisfied: scipy>=0.19.1 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.7.0)\nCollecting gin-config\n Using cached gin_config-0.5.0-py3-none-any.whl (61 kB)\nCollecting tf-slim>=1.1.0\n Using cached tf_slim-1.1.0-py2.py3-none-any.whl (352 kB)\nRequirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (5.4.1)\nRequirement already satisfied: tensorflow>=2.7.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (2.7.1)\nRequirement already satisfied: six in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.16.0)\nCollecting opencv-python-headless\n Using cached opencv_python_headless-4.5.5.62-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (47.7 MB)\nRequirement already satisfied: pandas>=0.22.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.2.5)\nCollecting httplib2<1dev,>=0.15.0\n Using cached httplib2-0.20.4-py3-none-any.whl (96 kB)\nCollecting uritemplate<5,>=3.0.1\n Using cached uritemplate-4.1.1-py2.py3-none-any.whl (10 kB)\nRequirement already satisfied: google-auth<3.0.0dev,>=1.16.0 in /usr/local/lib/python3.8/site-packages (from google-api-python-client>=1.6.7->tf-models-official==2.7.0) (1.35.0)\nCollecting google-auth-httplib2>=0.1.0\n Using cached google_auth_httplib2-0.1.0-py2.py3-none-any.whl (9.3 kB)\nCollecting 
google-api-core<3.0.0dev,>=1.21.0\n Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)\nRequirement already satisfied: python-dateutil in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (2.8.2)\nRequirement already satisfied: tqdm in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (4.62.3)\nRequirement already satisfied: certifi in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (2021.10.8)\nRequirement already satisfied: urllib3 in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (1.26.7)\nCollecting python-slugify\n Using cached python_slugify-6.0.1-py2.py3-none-any.whl (9.0 kB)\nRequirement already satisfied: requests in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (2.25.1)\nRequirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/site-packages (from pandas>=0.22.0->tf-models-official==2.7.0) (2021.3)\nRequirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.1.0)\nRequirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.1.0)\nRequirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.3.0)\nRequirement already satisfied: wheel<1.0,>=0.32.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.37.0)\nRequirement already satisfied: tensorflow-io-gcs-filesystem>=0.21.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.21.0)\nRequirement already satisfied: keras-preprocessing>=1.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.1.2)\nRequirement already satisfied: libclang>=9.0.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (13.0.0)\nRequirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.7.4.3)\nRequirement already satisfied: google-pasta>=0.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.2.0)\nRequirement already satisfied: flatbuffers<3.0,>=1.12 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.12)\nRequirement already satisfied: keras<2.8,>=2.7.0rc0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (2.7.0)\nRequirement already satisfied: tensorflow-estimator<2.8,~=2.7.0rc0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (2.7.0)\nRequirement already satisfied: astunparse>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.6.3)\nRequirement already satisfied: grpcio<2.0,>=1.24.3 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.42.0)\nRequirement already satisfied: tensorboard~=2.6 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (2.6.0)\nRequirement already satisfied: wrapt>=1.11.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.12.1)\nRequirement already 
satisfied: gast<0.5.0,>=0.2.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.4.0)\nRequirement already satisfied: absl-py>=0.4.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.10.0)\nRequirement already satisfied: protobuf>=3.9.2 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.19.1)\nCollecting dm-tree~=0.1.1\n Using cached dm_tree-0.1.6-cp38-cp38-manylinux_2_24_x86_64.whl (94 kB)\nRequirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (0.11.0)\nRequirement already satisfied: fonttools>=4.22.0 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (4.28.3)\nRequirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (21.3)\nRequirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (1.3.2)\nRequirement already satisfied: pyparsing>=2.2.1 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (3.0.6)\nRequirement already satisfied: setuptools-scm>=4 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (6.3.2)\nRequirement already satisfied: pyasn1>=0.1.7 in /usr/local/lib/python3.8/site-packages (from oauth2client->tf-models-official==2.7.0) (0.4.8)\nRequirement already satisfied: pyasn1-modules>=0.0.5 in /usr/local/lib/python3.8/site-packages (from oauth2client->tf-models-official==2.7.0) (0.2.8)\nRequirement already satisfied: rsa>=3.1.4 in /usr/local/lib/python3.8/site-packages (from oauth2client->tf-models-official==2.7.0) (4.7.2)\nRequirement already satisfied: tabulate>=0.8.9 in /usr/local/lib/python3.8/site-packages (from sacrebleu->tf-models-official==2.7.0) (0.8.9)\nRequirement already satisfied: colorama in /usr/local/lib/python3.8/site-packages (from sacrebleu->tf-models-official==2.7.0) (0.4.3)\nCollecting regex\n Using cached regex-2022.1.18-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (764 kB)\nCollecting portalocker\n Using cached portalocker-2.4.0-py2.py3-none-any.whl (16 kB)\nRequirement already satisfied: scikit-learn>=0.21.3 in /usr/local/lib/python3.8/site-packages (from seqeval->tf-models-official==2.7.0) (0.24.2)\nCollecting typeguard>=2.7\n Using cached typeguard-2.13.3-py3-none-any.whl (17 kB)\nCollecting promise\n Using cached promise-2.3-py3-none-any.whl\nRequirement already satisfied: dill in /usr/local/lib/python3.8/site-packages (from tensorflow-datasets->tf-models-official==2.7.0) (0.3.4)\nCollecting tensorflow-metadata\n Using cached tensorflow_metadata-1.6.0-py3-none-any.whl (48 kB)\nRequirement already satisfied: importlib-resources in /usr/local/lib/python3.8/site-packages (from tensorflow-datasets->tf-models-official==2.7.0) (5.4.0)\nCollecting googleapis-common-protos<2.0dev,>=1.52.0\n Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)\nRequirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.8/site-packages (from google-auth<3.0.0dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official==2.7.0) (4.2.4)\nRequirement already satisfied: setuptools>=40.3.0 in /usr/local/lib/python3.8/site-packages (from google-auth<3.0.0dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official==2.7.0) (59.5.0)\nRequirement already 
satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/site-packages (from requests->kaggle>=1.3.9->tf-models-official==2.7.0) (2.10)\nRequirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/site-packages (from requests->kaggle>=1.3.9->tf-models-official==2.7.0) (4.0.0)\nRequirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.8/site-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official==2.7.0) (3.0.0)\nRequirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.8/site-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official==2.7.0) (1.1.0)\nRequirement already satisfied: tomli>=1.0.0 in /usr/local/lib/python3.8/site-packages (from setuptools-scm>=4->matplotlib->tf-models-official==2.7.0) (1.2.2)\nRequirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (2.0.2)\nRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (3.3.6)\nRequirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (0.6.1)\nRequirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (0.4.6)\nRequirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (1.8.0)\nRequirement already satisfied: zipp>=3.1.0 in /usr/local/lib/python3.8/site-packages (from importlib-resources->tensorflow-datasets->tf-models-official==2.7.0) (3.6.0)\nCollecting text-unidecode>=1.3\n Using cached text_unidecode-1.3-py2.py3-none-any.whl (78 kB)\nRequirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.8/site-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (1.3.0)\nRequirement already satisfied: importlib-metadata>=4.4 in /usr/local/lib/python3.8/site-packages (from markdown>=2.6.8->tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (4.8.2)\nRequirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.8/site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (3.1.1)\nInstalling collected packages: text-unidecode, httplib2, googleapis-common-protos, uritemplate, typeguard, tensorflow-metadata, regex, python-slugify, promise, portalocker, google-auth-httplib2, google-api-core, dm-tree, tf-slim, tensorflow-model-optimization, tensorflow-datasets, tensorflow-addons, seqeval, sentencepiece, sacrebleu, pycocotools, py-cpuinfo, opencv-python-headless, oauth2client, kaggle, google-api-python-client, gin-config, Cython, tf-models-official\nSuccessfully installed Cython-0.29.28 dm-tree-0.1.6 gin-config-0.5.0 google-api-core-2.5.0 google-api-python-client-2.37.0 google-auth-httplib2-0.1.0 googleapis-common-protos-1.54.0 httplib2-0.20.4 kaggle-1.5.12 oauth2client-4.1.3 opencv-python-headless-4.5.5.62 portalocker-2.4.0 promise-2.3 py-cpuinfo-8.0.0 pycocotools-2.0.4 python-slugify-6.0.1 regex-2022.1.18 sacrebleu-2.0.0 sentencepiece-0.1.96 seqeval-1.2.2 tensorflow-addons-0.16.1 tensorflow-datasets-4.5.2 tensorflow-metadata-1.6.0 
tensorflow-model-optimization-0.7.1 text-unidecode-1.3 tf-models-official-2.7.0 tf-slim-1.1.0 typeguard-2.13.3 uritemplate-4.1.1\n\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\n\u001b[33mWARNING: You are using pip version 21.3.1; however, version 22.0.3 is available.\nYou should consider upgrading via the '/usr/local/bin/python3.8 -m pip install --upgrade pip' command.\u001b[0m\n"
],
[
"import matplotlib.pyplot as plt\nimport matplotlib.colors as colors\n\nimport sklearn.metrics\nimport sklearn.calibration\n\nimport tensorflow_hub as hub\nimport tensorflow_datasets as tfds\n\nimport numpy as np\nimport tensorflow as tf\nimport pandas as pd\nimport json\n\nimport official.nlp.modeling.layers as layers\nimport official.nlp.optimization as optimization",
"_____no_output_____"
]
],
[
[
"### Implement a standard BERT classifier following which classifies text",
"_____no_output_____"
]
],
[
[
"gpus = tf.config.list_physical_devices('GPU')\ngpus",
"_____no_output_____"
],
[
"# Standard BERT model\n\nPREPROCESS_HANDLE = 'https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3'\nMODEL_HANDLE = 'https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/3'\n\nclass BertClassifier(tf.keras.Model):\n def __init__(self, \n num_classes=150, inner_dim=768, dropout_rate=0.1,\n **classifier_kwargs):\n \n super().__init__()\n self.classifier_kwargs = classifier_kwargs\n\n # Initiate the BERT encoder components.\n self.bert_preprocessor = hub.KerasLayer(PREPROCESS_HANDLE, name='preprocessing')\n self.bert_hidden_layer = hub.KerasLayer(MODEL_HANDLE, trainable=True, name='bert_encoder')\n\n # Defines the encoder and classification layers.\n self.bert_encoder = self.make_bert_encoder()\n self.classifier = self.make_classification_head(num_classes, inner_dim, dropout_rate)\n\n def make_bert_encoder(self):\n text_inputs = tf.keras.layers.Input(shape=(), dtype=tf.string, name='text')\n encoder_inputs = self.bert_preprocessor(text_inputs)\n encoder_outputs = self.bert_hidden_layer(encoder_inputs)\n return tf.keras.Model(text_inputs, encoder_outputs)\n\n def make_classification_head(self, num_classes, inner_dim, dropout_rate):\n return layers.ClassificationHead(\n num_classes=num_classes, \n inner_dim=inner_dim,\n dropout_rate=dropout_rate,\n **self.classifier_kwargs)\n\n def call(self, inputs, **kwargs):\n encoder_outputs = self.bert_encoder(inputs)\n classifier_inputs = encoder_outputs['sequence_output']\n return self.classifier(classifier_inputs, **kwargs)\n",
"_____no_output_____"
]
],
[
[
"### Build SNGP model",
"_____no_output_____"
],
[
"To implement a BERT-SNGP model designed by Google researchers",
"_____no_output_____"
]
],
[
[
"class ResetCovarianceCallback(tf.keras.callbacks.Callback):\n\n def on_epoch_begin(self, epoch, logs=None):\n \"\"\"Resets covariance matrix at the begining of the epoch.\"\"\"\n if epoch > 0:\n self.model.classifier.reset_covariance_matrix()",
"_____no_output_____"
],
[
"class SNGPBertClassifier(BertClassifier):\n\n def make_classification_head(self, num_classes, inner_dim, dropout_rate):\n return layers.GaussianProcessClassificationHead(\n num_classes=num_classes, \n inner_dim=inner_dim,\n dropout_rate=dropout_rate,\n gp_cov_momentum=-1,\n temperature=30.,\n **self.classifier_kwargs)\n\n def fit(self, *args, **kwargs):\n \"\"\"Adds ResetCovarianceCallback to model callbacks.\"\"\"\n kwargs['callbacks'] = list(kwargs.get('callbacks', []))\n kwargs['callbacks'].append(ResetCovarianceCallback())\n\n return super().fit(*args, **kwargs)",
"_____no_output_____"
]
],
[
[
"### Load train and test datasets",
"_____no_output_____"
]
],
[
[
"is_train = pd.read_json('is_train.json')\nis_train.columns = ['question','intent']\n\nis_test = pd.read_json('is_test.json')\nis_test.columns = ['question','intent']\n\noos_test = pd.read_json('oos_test.json')\noos_test.columns = ['question','intent']\n\nis_test.shape",
"_____no_output_____"
]
],
[
[
"Make the train and test data.",
"_____no_output_____"
]
],
[
[
"#Generate codes\nis_data = is_train.append(is_test)\nis_data.intent = pd.Categorical(is_data.intent)\nis_data['code'] = is_data.intent.cat.codes\n\n#in-scope evaluation data\nis_test = is_data[15000:19500]\n\nis_test_queries = is_test.question\nis_test_labels = is_test.intent\nis_test_codes = is_test.code\n\nis_eval_data = (tf.convert_to_tensor(is_test_queries), tf.convert_to_tensor(is_test_codes))\n\nis_train = is_data[0:15000]\nis_train_queries = is_train.question\nis_train_labels = is_train.intent\nis_train_codes = is_train.code\n\ntraining_ds_queries = tf.convert_to_tensor(is_train_queries)\n\ntraining_ds_labels = tf.convert_to_tensor(is_train_codes)",
"_____no_output_____"
],
[
"is_test.shape",
"_____no_output_____"
]
],
[
[
"Create a OOD evaluation dataset. For this, combine the in-scope test data 'is_test' and out-of-scope 'oos_test' data. Assign label 0 for in-scope and label 1 for out-of-scope data",
"_____no_output_____"
]
],
[
[
"train_size = len(is_train)\ntest_size = len(is_test)\noos_size = len(oos_test)\n\n# Combines the in-domain and out-of-domain test examples.\noos_queries= tf.concat([is_test['question'], oos_test['question']], axis=0)\noos_labels = tf.constant([0] * test_size + [1] * oos_size)\n\n# Converts into a TF dataset.\noos_eval_dataset = tf.data.Dataset.from_tensor_slices(\n {\"text\": oos_queries, \"label\": oos_labels})",
"_____no_output_____"
]
],
[
[
"### Train and evaluate",
"_____no_output_____"
]
],
[
[
"TRAIN_EPOCHS = 4\nTRAIN_BATCH_SIZE = 16\nEVAL_BATCH_SIZE = 256",
"_____no_output_____"
],
[
"#@title\n\ndef bert_optimizer(learning_rate, \n batch_size=TRAIN_BATCH_SIZE, epochs=TRAIN_EPOCHS, \n warmup_rate=0.1):\n \"\"\"Creates an AdamWeightDecay optimizer with learning rate schedule.\"\"\"\n train_data_size = train_size\n \n steps_per_epoch = int(train_data_size / batch_size)\n num_train_steps = steps_per_epoch * epochs\n num_warmup_steps = int(warmup_rate * num_train_steps) \n\n # Creates learning schedule.\n lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(\n initial_learning_rate=learning_rate,\n decay_steps=num_train_steps,\n end_learning_rate=0.0) \n \n return optimization.AdamWeightDecay(\n learning_rate=lr_schedule,\n weight_decay_rate=0.01,\n epsilon=1e-6,\n exclude_from_weight_decay=['LayerNorm', 'layer_norm', 'bias'])",
"_____no_output_____"
],
[
"optimizer = bert_optimizer(learning_rate=1e-4)\nloss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)\nmetrics = tf.metrics.SparseCategoricalAccuracy()",
"_____no_output_____"
],
[
"fit_configs = dict(batch_size=TRAIN_BATCH_SIZE,\n epochs=TRAIN_EPOCHS,\n validation_batch_size=EVAL_BATCH_SIZE, \n validation_data=is_eval_data)",
"_____no_output_____"
]
],
[
[
"### Model 1 - Batch size of 32 & 3 epochs ",
"_____no_output_____"
]
],
[
[
"sngp_model = SNGPBertClassifier()\nsngp_model.compile(optimizer=optimizer, loss=loss, metrics=metrics)\nsngp_model.fit(training_ds_queries, training_ds_labels, **fit_configs)",
"Epoch 1/2\n938/938 [==============================] - 481s 494ms/step - loss: 0.8704 - sparse_categorical_accuracy: 0.8241 - val_loss: 0.2888 - val_sparse_categorical_accuracy: 0.9473\nEpoch 2/2\n938/938 [==============================] - 464s 495ms/step - loss: 0.0647 - sparse_categorical_accuracy: 0.9853 - val_loss: 0.1979 - val_sparse_categorical_accuracy: 0.9598\n"
]
],
[
[
"### Model 2 - Batch size of 16 & 2 epochs ",
"_____no_output_____"
]
],
[
[
"sngp_model2 = SNGPBertClassifier()\nsngp_model2.compile(optimizer=optimizer, loss=loss, metrics=metrics)\nsngp_model2.fit(training_ds_queries, training_ds_labels, **fit_configs)",
"Epoch 1/3\n938/938 [==============================] - 480s 495ms/step - loss: 0.9506 - sparse_categorical_accuracy: 0.8029 - val_loss: 0.3883 - val_sparse_categorical_accuracy: 0.9376\nEpoch 2/3\n938/938 [==============================] - 462s 493ms/step - loss: 0.0989 - sparse_categorical_accuracy: 0.9769 - val_loss: 0.2342 - val_sparse_categorical_accuracy: 0.9522\nEpoch 3/3\n938/938 [==============================] - 462s 493ms/step - loss: 0.0272 - sparse_categorical_accuracy: 0.9939 - val_loss: 0.2013 - val_sparse_categorical_accuracy: 0.9598\n"
]
],
[
[
"### Model 3 - Batch size of 16 & 4 epochs ",
"_____no_output_____"
]
],
[
[
"sngp_model3 = SNGPBertClassifier()\nsngp_model3.compile(optimizer=optimizer, loss=loss, metrics=metrics)\nsngp_model3.fit(training_ds_queries, training_ds_labels, **fit_configs)",
"Epoch 1/4\n938/938 [==============================] - 477s 493ms/step - loss: 0.9459 - sparse_categorical_accuracy: 0.8066 - val_loss: 0.3804 - val_sparse_categorical_accuracy: 0.9393\nEpoch 2/4\n938/938 [==============================] - 465s 496ms/step - loss: 0.1192 - sparse_categorical_accuracy: 0.9730 - val_loss: 0.2526 - val_sparse_categorical_accuracy: 0.9511\nEpoch 3/4\n938/938 [==============================] - 466s 497ms/step - loss: 0.0372 - sparse_categorical_accuracy: 0.9917 - val_loss: 0.2169 - val_sparse_categorical_accuracy: 0.9564\nEpoch 4/4\n938/938 [==============================] - 465s 496ms/step - loss: 0.0135 - sparse_categorical_accuracy: 0.9974 - val_loss: 0.1992 - val_sparse_categorical_accuracy: 0.9629\n"
]
],
[
[
"### Evaluate OOD performance",
"_____no_output_____"
],
[
"Evaluate how well the model can detect the unfamiliar out-of-domain queries.",
"_____no_output_____"
]
],
[
[
"\n\ndef oos_predict(model, ood_eval_dataset, **model_kwargs):\n oos_labels = []\n oos_probs = []\n\n ood_eval_dataset = ood_eval_dataset.batch(EVAL_BATCH_SIZE)\n for oos_batch in ood_eval_dataset:\n oos_text_batch = oos_batch[\"text\"]\n oos_label_batch = oos_batch[\"label\"] \n\n pred_logits = model(oos_text_batch, **model_kwargs)\n pred_probs_all = tf.nn.softmax(pred_logits, axis=-1)\n pred_probs = tf.reduce_max(pred_probs_all, axis=-1)\n\n oos_labels.append(oos_label_batch)\n oos_probs.append(pred_probs)\n\n oos_probs = tf.concat(oos_probs, axis=0)\n oos_labels = tf.concat(oos_labels, axis=0) \n\n return oos_probs, oos_labels",
"_____no_output_____"
]
],
[
[
"Computes the OOD probabilities as $1 - p(x)$, where $p(x)=softmax(logit(x))$ is the predictive probability.",
"_____no_output_____"
]
],
[
[
"sngp_probs, ood_labels = oos_predict(sngp_model, oos_eval_dataset)",
"_____no_output_____"
],
[
"sngp_probs2, ood_labels2 = oos_predict(sngp_model2, oos_eval_dataset)",
"_____no_output_____"
],
[
"sngp_probs3, ood_labels3 = oos_predict(sngp_model3, oos_eval_dataset)",
"_____no_output_____"
],
[
"ood_probs = 1 - sngp_probs\nood_probs2 = 1 - sngp_probs2\nood_probs3 = 1 - sngp_probs3",
"_____no_output_____"
],
[
"plt.rcParams['figure.dpi'] = 140\n\nDEFAULT_X_RANGE = (-3.5, 3.5)\nDEFAULT_Y_RANGE = (-2.5, 2.5)\nDEFAULT_CMAP = colors.ListedColormap([\"#377eb8\", \"#ff7f00\"])\nDEFAULT_NORM = colors.Normalize(vmin=0, vmax=1,)\nDEFAULT_N_GRID = 100",
"_____no_output_____"
],
[
"ood_uncertainty = ood_probs * (1 - ood_probs)\nood_uncertainty2 = ood_probs2 * (1 - ood_probs2)\nood_uncertainty3 = ood_probs3 * (1 - ood_probs3)",
"_____no_output_____"
],
[
"s1 = np.array(sngp_probs.numpy())\nprint(s1[3000])",
"0.98855245\n"
],
[
"s2 = np.array(sngp_probs2.numpy())\nprint(s2[2000])",
"0.99832803\n"
],
[
"s3 = np.array(sngp_probs3.numpy())\nprint(s3[1000])",
"0.9983203\n"
]
],
[
[
"### Compute the Area under precision-recall curve (AUPRC) for OOD probability v.s. OOD detection accuracy.",
"_____no_output_____"
]
],
[
[
"precision, recall, _ = sklearn.metrics.precision_recall_curve(ood_labels, ood_probs)\nprecision2, recall2, _ = sklearn.metrics.precision_recall_curve(ood_labels2, ood_probs2)\nprecision3, recall3, _ = sklearn.metrics.precision_recall_curve(ood_labels3, ood_probs3)",
"_____no_output_____"
],
[
"print((precision3)\nprint(recall3)",
"_____no_output_____"
]
],
[
[
"[0.23380874 0.23362956 0.23368421 ... 1. 1. 1. ]\n[1. 0.999 0.999 ... 0.002 0.001 0. ]",
"_____no_output_____"
]
],
[
[
"sklearn.metrics.recall_score(oos_labels, ood_labels3, average='weighted')",
"_____no_output_____"
],
[
"sklearn.metrics.precision_score(oos_labels, ood_labels3, average='weighted')",
"_____no_output_____"
],
[
"auprc = sklearn.metrics.auc(recall, precision)\nprint(f'SNGP AUPRC: {auprc:.4f}')",
"SNGP AUPRC: 0.9026\n"
],
[
"auprc2 = sklearn.metrics.auc(recall2, precision2)\nprint(f'SNGP AUPRC 2: {auprc2:.4f}')",
"SNGP AUPRC 2: 0.8926\n"
],
[
"auprc3 = sklearn.metrics.auc(recall3, precision3)\nprint(f'SNGP AUPRC 3: {auprc3:.4f}')",
"SNGP AUPRC 3: 0.8926\n"
],
[
"prob_true, prob_pred = sklearn.calibration.calibration_curve(\n ood_labels, ood_probs, n_bins=10, strategy='quantile')\n\nprob_true2, prob_pred2 = sklearn.calibration.calibration_curve(\n ood_labels2, ood_probs2, n_bins=10, strategy='quantile')\n\nprob_true3, prob_pred3 = sklearn.calibration.calibration_curve(\n ood_labels3, ood_probs3, n_bins=10, strategy='quantile')",
"_____no_output_____"
],
[
"plt.plot(prob_pred, prob_true)\n\nplt.plot([0., 1.], [0., 1.], c='k', linestyle=\"--\")\nplt.xlabel('Predictive Probability')\nplt.ylabel('Predictive Accuracy')\nplt.title('Calibration Plots, SNGP')\n\nplt.show()",
"_____no_output_____"
],
[
"plt.plot(prob_pred2, prob_true2)\n\nplt.plot([0., 1.], [0., 1.], c='k', linestyle=\"--\")\nplt.xlabel('Predictive Probability')\nplt.ylabel('Predictive Accuracy')\nplt.title('Calibration Plots, SNGP')\n\nplt.show()",
"_____no_output_____"
],
[
"plt.plot(prob_pred3, prob_true3)\n\nplt.plot([0., 1.], [0., 1.], c='k', linestyle=\"--\")\nplt.xlabel('Predictive Probability')\nplt.ylabel('Predictive Accuracy')\nplt.title('Calibration Plots, SNGP')\n\nplt.show()",
"_____no_output_____"
],
[
"# calculate scores\nauc1 = roc_auc_score(oos_labels, ood_probs)\nauc2 = roc_auc_score(oos_labels, ood_probs2)\nauc3 = roc_auc_score(oos_labels, ood_probs3)\n# summarize scores\nprint('SNGP Model 1: ROC AUC=%.3f' % (auc1))\nprint('SNGP Model 2: ROC AUC=%.3f' % (auc2))\nprint('SNGP Model 3: ROC AUC=%.3f' % (auc3))\n# calculate roc curves\nfpr1, tpr1, _ = roc_curve(oos_labels, ood_probs)\nfpr2, tpr2, _ = roc_curve(oos_labels, ood_probs2)\nfpr3, tpr3, _ = roc_curve(oos_labels, ood_probs3)\n# plot the roc curve for the model\npyplot.plot(fpr1, tpr1, marker='.', label='SNGP Model 1')\npyplot.plot(fpr2, tpr2, marker='*', label='SNGP Model 2')\npyplot.plot(fpr3, tpr3, marker='+', label='SNGP Model 3')\n# axis labels\npyplot.xlabel('False Positive Rate (Precision)')\npyplot.ylabel('True Positive Rate (Recall)')\n# show the legend\npyplot.legend()\n# show the plot\npyplot.show()",
"SNGP Model 1: ROC AUC=0.972\nSNGP Model 2: ROC AUC=0.973\nSNGP Model 3: ROC AUC=0.973\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d06235b84db16934ad49870332ad1f26e8547ddb | 31,064 | ipynb | Jupyter Notebook | t81_558_class_02_4_pandas_functional.ipynb | AritraJana1810/t81_558_deep_learning | 184d84d202b54990be8c927499ce0a01a3662e6f | [
"Apache-2.0"
]
| 1 | 2021-07-03T09:02:59.000Z | 2021-07-03T09:02:59.000Z | t81_558_class_02_4_pandas_functional.ipynb | joaquinmorenoa/t81_558_deep_learning | 569ed623cb225a5d410fda6f49e1a15073b247ea | [
"Apache-2.0"
]
| null | null | null | t81_558_class_02_4_pandas_functional.ipynb | joaquinmorenoa/t81_558_deep_learning | 569ed623cb225a5d410fda6f49e1a15073b247ea | [
"Apache-2.0"
]
| 1 | 2020-09-21T15:11:35.000Z | 2020-09-21T15:11:35.000Z | 31,064 | 31,064 | 0.489988 | [
[
[
"<a href=\"https://colab.research.google.com/github/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class_02_4_pandas_functional.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# T81-558: Applications of Deep Neural Networks\n**Module 2: Python for Machine Learning**\n* Instructor: [Jeff Heaton](https://sites.wustl.edu/jeffheaton/), McKelvey School of Engineering, [Washington University in St. Louis](https://engineering.wustl.edu/Programs/Pages/default.aspx)\n* For more information visit the [class website](https://sites.wustl.edu/jeffheaton/t81-558/).",
"_____no_output_____"
],
[
"# Module 2 Material\n\nMain video lecture:\n\n* Part 2.1: Introduction to Pandas [[Video]](https://www.youtube.com/watch?v=bN4UuCBdpZc&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_1_python_pandas.ipynb)\n* Part 2.2: Categorical Values [[Video]](https://www.youtube.com/watch?v=4a1odDpG0Ho&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_2_pandas_cat.ipynb)\n* Part 2.3: Grouping, Sorting, and Shuffling in Python Pandas [[Video]](https://www.youtube.com/watch?v=YS4wm5gD8DM&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_3_pandas_grouping.ipynb)\n* **Part 2.4: Using Apply and Map in Pandas for Keras** [[Video]](https://www.youtube.com/watch?v=XNCEZ4WaPBY&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_4_pandas_functional.ipynb)\n* Part 2.5: Feature Engineering in Pandas for Deep Learning in Keras [[Video]](https://www.youtube.com/watch?v=BWPTj4_Mi9E&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_5_pandas_features.ipynb)",
"_____no_output_____"
],
[
"# Google CoLab Instructions\n\nThe following code ensures that Google CoLab is running the correct version of TensorFlow.",
"_____no_output_____"
]
],
[
[
"try:\n %tensorflow_version 2.x\n COLAB = True\n print(\"Note: using Google CoLab\")\nexcept:\n print(\"Note: not using Google CoLab\")\n COLAB = False",
"Note: not using Google CoLab\n"
]
],
[
[
"# Part 2.4: Apply and Map",
"_____no_output_____"
],
[
"If you've ever worked with Big Data or functional programming languages before, you've likely heard of map/reduce. Map and reduce are two functions that apply a task that you create to a data frame. Pandas supports functional programming techniques that allow you to use functions across en entire data frame. In addition to functions that you write, Pandas also provides several standard functions for use with data frames.",
"_____no_output_____"
],
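[
"Both methods also accept plain Python functions, not just dictionaries. A quick sketch on a toy Series (separate from the dataset used below):\n\n```python\nimport pandas as pd\n\ns = pd.Series([1, 2, 3])\ns.map(lambda x: x * 10)  # elementwise transform -> 10, 20, 30\n```",
"_____no_output_____"
],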
[
"### Using Map with Dataframes\n\nThe map function allows you to transform a column by mapping certain values in that column to other values. Consider the Auto MPG data set that contains a field **origin_name** that holds a value between one and three that indicates the geographic origin of each car. We can see how to use the map function to transform this numeric origin into the textual name of each origin.\n\nWe will begin by loading the Auto MPG data set. ",
"_____no_output_____"
]
],
[
[
"import os\nimport pandas as pd\nimport numpy as np\n\ndf = pd.read_csv(\n \"https://data.heatonresearch.com/data/t81-558/auto-mpg.csv\", \n na_values=['NA', '?'])\n\npd.set_option('display.max_columns', 7)\npd.set_option('display.max_rows', 5)\n\ndisplay(df)",
"_____no_output_____"
]
],
[
[
"The **map** method in Pandas operates on a single column. You provide **map** with a dictionary of values to transform the target column. The map keys specify what values in the target column should be turned into values specified by those keys. The following code shows how the map function can transform the numeric values of 1, 2, and 3 into the string values of North America, Europe and Asia.",
"_____no_output_____"
]
],
[
[
"# Apply the map\ndf['origin_name'] = df['origin'].map(\n {1: 'North America', 2: 'Europe', 3: 'Asia'})\n\n# Shuffle the data, so that we hopefully see\n# more regions.\ndf = df.reindex(np.random.permutation(df.index)) \n\n# Display\npd.set_option('display.max_columns', 7)\npd.set_option('display.max_rows', 10)\ndisplay(df)",
"_____no_output_____"
]
],
[
[
"### Using Apply with Dataframes\n\nThe **apply** function of the data frame can run a function over the entire data frame. You can use either be a traditional named function or a lambda function. Python will execute the provided function against each of the rows or columns in the data frame. The **axis** parameter specifies of the function is run across rows or columns. For axis = 1, rows are used. The following code calculates a series called **efficiency** that is the **displacement** divided by **horsepower**. ",
"_____no_output_____"
]
],
[
[
"efficiency = df.apply(lambda x: x['displacement']/x['horsepower'], axis=1)\ndisplay(efficiency[0:10])",
"_____no_output_____"
]
],
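[
[
"Note that for simple arithmetic like this, a vectorized expression is usually faster than **apply** (a sketch producing the same series):\n\n```python\nefficiency = df['displacement'] / df['horsepower']\n```",
"_____no_output_____"
]
],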
[
[
"You can now insert this series into the data frame, either as a new column or to replace an existing column. The following code inserts this new series into the data frame.",
"_____no_output_____"
]
],
[
[
"df['efficiency'] = efficiency",
"_____no_output_____"
]
],
[
[
"### Feature Engineering with Apply and Map",
"_____no_output_____"
],
[
"In this section, we will see how to calculate a complex feature using map, apply, and grouping. The data set is the following CSV:\n\n* https://www.irs.gov/pub/irs-soi/16zpallagi.csv \n\nThis URL contains US Government public data for \"SOI Tax Stats - Individual Income Tax Statistics.\" The entry point to the website is here:\n\n* https://www.irs.gov/statistics/soi-tax-stats-individual-income-tax-statistics-2016-zip-code-data-soi \n\nDocumentation describing this data is at the above link.\n\nFor this feature, we will attempt to estimate the adjusted gross income (AGI) for each of the zip codes. The data file contains many columns; however, you will only use the following:\n\n* STATE - The state (e.g., MO)\n* zipcode - The zipcode (e.g. 63017)\n* agi_stub - Six different brackets of annual income (1 through 6) \n* N1 - The number of tax returns for each of the agi_stubs\n\nNote, the file will have six rows for each zip code, for each of the agi_stub brackets. You can skip zip codes with 0 or 99999.\n\nWe will create an output CSV with these columns; however, only one row per zip code. Calculate a weighted average of the income brackets. For example, the following six rows are present for 63017:\n\n\n|zipcode |agi_stub | N1 |\n|--|--|-- |\n|63017 |1 | 4710 |\n|63017 |2 | 2780 |\n|63017 |3 | 2130 |\n|63017 |4 | 2010 |\n|63017 |5 | 5240 |\n|63017 |6 | 3510 |\n\n\nWe must combine these six rows into one. For privacy reasons, AGI's are broken out into 6 buckets. We need to combine the buckets and estimate the actual AGI of a zipcode. To do this, consider the values for N1:\n\n* 1 = 1 to 25,000\n* 2 = 25,000 to 50,000\n* 3 = 50,000 to 75,000\n* 4 = 75,000 to 100,000\n* 5 = 100,000 to 200,000\n* 6 = 200,000 or more\n\nThe median of each of these ranges is approximately:\n\n* 1 = 12,500\n* 2 = 37,500\n* 3 = 62,500 \n* 4 = 87,500\n* 5 = 112,500\n* 6 = 212,500\n\nUsing this you can estimate 63017's average AGI as:\n\n```\n>>> totalCount = 4710 + 2780 + 2130 + 2010 + 5240 + 3510\n>>> totalAGI = 4710 * 12500 + 2780 * 37500 + 2130 * 62500 \n + 2010 * 87500 + 5240 * 112500 + 3510 * 212500\n>>> print(totalAGI / totalCount)\n\n88689.89205103042\n```\n\nWe begin by reading in the government data.",
"_____no_output_____"
]
],
[
[
"import pandas as pd\n\ndf=pd.read_csv('https://www.irs.gov/pub/irs-soi/16zpallagi.csv')",
"_____no_output_____"
]
],
[
[
"First, we trim all zip codes that are either 0 or 99999. We also select the three fields that we need.",
"_____no_output_____"
]
],
[
[
"df=df.loc[(df['zipcode']!=0) & (df['zipcode']!=99999),\n ['STATE','zipcode','agi_stub','N1']]\n\npd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\n\ndisplay(df)",
"_____no_output_____"
]
],
[
[
"We replace all of the **agi_stub** values with the correct median values with the **map** function.",
"_____no_output_____"
]
],
[
[
"medians = {1:12500,2:37500,3:62500,4:87500,5:112500,6:212500}\ndf['agi_stub']=df.agi_stub.map(medians)\n\npd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\ndisplay(df)",
"_____no_output_____"
]
],
[
[
"Next, we group the data frame by zip code.",
"_____no_output_____"
]
],
[
[
"groups = df.groupby(by='zipcode')",
"_____no_output_____"
]
],
[
[
"The program applies a lambda is applied across the groups, and then calculates the AGI estimate.",
"_____no_output_____"
]
],
[
[
"df = pd.DataFrame(groups.apply( \n lambda x:sum(x['N1']*x['agi_stub'])/sum(x['N1']))) \\\n .reset_index()",
"_____no_output_____"
],
[
"pd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\n\ndisplay(df)",
"_____no_output_____"
]
],
[
[
"We can now rename the new agi_estimate column.",
"_____no_output_____"
]
],
[
[
"df.columns = ['zipcode','agi_estimate']",
"_____no_output_____"
],
[
"pd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\n\ndisplay(df)",
"_____no_output_____"
]
],
[
[
"Finally, we check to see that our zip code of 63017 got the correct value.",
"_____no_output_____"
]
],
[
[
"df[ df['zipcode']==63017 ]",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d062446f226ef60582dc04dd098b4f3cbd00db61 | 52,841 | ipynb | Jupyter Notebook | matplotlibAndobjectorianted_linechart_with_errorbars.ipynb | GirijaJoshi/PyBer_Analysis | faaea2be8baae82ba8ca84314b51954b14784c8d | [
"MIT"
]
| 1 | 2020-10-20T15:15:37.000Z | 2020-10-20T15:15:37.000Z | matplotlibAndobjectorianted_linechart_with_errorbars.ipynb | GirijaJoshi/PyBer_Analysis | faaea2be8baae82ba8ca84314b51954b14784c8d | [
"MIT"
]
| null | null | null | matplotlibAndobjectorianted_linechart_with_errorbars.ipynb | GirijaJoshi/PyBer_Analysis | faaea2be8baae82ba8ca84314b51954b14784c8d | [
"MIT"
]
| null | null | null | 265.532663 | 16,468 | 0.931133 | [
[
[
"%matplotlib inline",
"_____no_output_____"
],
[
"# Import dependencies.\nimport matplotlib.pyplot as plt\nimport statistics",
"_____no_output_____"
],
[
"# Set the x-axis to a list of strings for each month.\nx_axis = [\"Jan\", \"Feb\", \"Mar\", \"April\", \"May\", \"June\", \"July\", \"Aug\", \"Sept\", \"Oct\", \"Nov\", \"Dec\"]\n\n# Set the y-axis to a list of floats as the total fare in US dollars accumulated for each month.\ny_axis = [10.02, 23.24, 39.20, 35.42, 32.34, 27.04, 43.82, 10.56, 11.85, 27.90, 20.71, 20.09]",
"_____no_output_____"
],
[
"average = sum(y_axis)/len(y_axis)\naverage",
"_____no_output_____"
],
[
"# Get the standard deviation of the values in the y-axis.\nstdev = statistics.stdev(y_axis)\nstdev",
"_____no_output_____"
],
[
"# added standart davitation to y-axis\nplt.errorbar(x_axis, y_axis, yerr=stdev)",
"_____no_output_____"
],
[
"# added standart davitation to y-axis, adding cap\nplt.errorbar(x_axis, y_axis, yerr=stdev, capsize=3)",
"_____no_output_____"
],
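[
"# Object-oriented alternative (sketch): the next cell builds the same chart\n# through explicit Figure/Axes objects, which scales better than the pyplot\n# state machine when you need multiple subplots.",
"_____no_output_____"
],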
[
"fig, ax = plt.subplots()\nax.errorbar(x_axis, y_axis, yerr=stdev, capsize=3)\nplt.show()",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d06244e92170a3cee84bba7d981221ffc4f00179 | 350,583 | ipynb | Jupyter Notebook | Charting a path into the data science field.ipynb | khiara/DSND_Kaggle_2020_Survey | 57ba312125edbe8278b6f292b4f1bda1fb4018f0 | [
"CNRI-Python"
]
| null | null | null | Charting a path into the data science field.ipynb | khiara/DSND_Kaggle_2020_Survey | 57ba312125edbe8278b6f292b4f1bda1fb4018f0 | [
"CNRI-Python"
]
| null | null | null | Charting a path into the data science field.ipynb | khiara/DSND_Kaggle_2020_Survey | 57ba312125edbe8278b6f292b4f1bda1fb4018f0 | [
"CNRI-Python"
]
| null | null | null | 211.576946 | 41,700 | 0.882747 | [
[
[
"# Charting a path into the data science field",
"_____no_output_____"
],
[
"This project attempts to shed light on the path or paths to becoming a data science professional in the United States.\n\nData science is a rapidly growing field, and the demand for data scientists is outpacing supply. In the past, most Data Scientist positions went to people with PhDs in Computer Science. I wanted to know if that is changing in light of both the increased job openings and the expanding definition of data science that has come with more companies realizing the wealth of raw data they have available for analysis, and how that can help to grow and refine their businesses.",
"_____no_output_____"
],
[
"## Business Questions\n\n\n1. Do you need a a formal degree?\n2. What programming language(s) do data science professionals need to know?\n3. What are the preferred online learning platforms to gain data science knowledge and skills?",
"_____no_output_____"
],
[
"## Data\n\nSince 2017, Kaggle ('The world's largest data science community') has annually surveyed its users on demographics, practices, and preferences. This notebook explores the data from Kaggle's 2020 Machine Learning and Data Science survey. A caveat: Kaggle is heavy on Machine Learning and competitions, and while it claims over 8 million users the group may not be representative of the overall data science community. Additionally,survey respondents are self-selected, so we can't extrapolate any findings to the data science community as a whole, but the trends and demographics amongst Kaggle survey takers may still offer insights about data science professionals.",
"_____no_output_____"
],
[
"The first step is importing the necessary libraries and data.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport textwrap\n%matplotlib inline\n\nfrom matplotlib.ticker import PercentFormatter\n\nimport warnings\nwarnings.filterwarnings('ignore')",
"_____no_output_____"
],
[
"df = pd.read_csv('./kaggle_survey_2020_responses.csv')\nlow_memory = False",
"_____no_output_____"
]
],
[
[
"### Initial data exploration and cleaning\nLet's take a look at the survey data.",
"_____no_output_____"
]
],
[
[
"# Let's look at the first 5 rows of the dataset\ndf.head()",
"_____no_output_____"
]
],
[
[
"One thing we can see from this: some questions are tied to a single column, with a number of answers possible; these questions only allowed survey respondents to choose one answer from among the options. Other questions take up multiple columns, with each column tied to a specific answer; these were questions that allowed users to choose more than one option as the answer ('select all that apply'). The two types of questions will require different approaches to data preparation.",
"_____no_output_____"
],
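[
"For example, the columns belonging to one multiple-answer question can be gathered by their shared prefix (a sketch; 'Q7' is one such question):\n\n```python\nq7_cols = [c for c in df.columns if c.startswith('Q7_')]\n```",
"_____no_output_____"
],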
[
"But first, we'll do some cleaning. The top row of data contains the question titles. We'll remove that, as well as the first column of survey completion time values.",
"_____no_output_____"
]
],
[
[
"# Removing the first column and the first row\ndf.drop(['Time from Start to Finish (seconds)'], axis=1, inplace=True)\ndf = df.loc[1:, :]\ndf.head()",
"_____no_output_____"
],
[
"df.shape",
"_____no_output_____"
]
],
[
[
"There are over 20,000 responses, with 354 answer fields.",
"_____no_output_____"
],
[
"#### Data preparation and filtering",
"_____no_output_____"
],
[
"To improve readability of visualizations, we'll aggregate some fields, shorten some labels, and re-order categories.",
"_____no_output_____"
]
],
[
[
"# Aggregating the nonbinary answers\ndf.loc[(df.Q2 == 'Prefer not to say'), 'Q2'] = 'Other Response'\ndf.loc[(df.Q2 == 'Prefer to self-describe'),'Q2'] = 'Other Response'\ndf.loc[(df.Q2 == 'Nonbinary'), 'Q2'] = 'Other Response'\n\n# Abbreviating country name\ndf.loc[(df.Q3 == 'United States of America'),'Q3']='USA'\n\n# Shortening education level descriptions\ndf.loc[(df.Q4 == 'Doctoral degree'),'Q4']='PhD'\ndf.loc[(df.Q4 == 'Masterโs degree'),'Q4']='Masterโs'\ndf.loc[(df.Q4 == 'Bachelorโs degree'),'Q4']='Bachelorโs'\ndf.loc[(df.Q4 == \"Some college/university study without earning a bachelorโs degree\"), 'Q4']='Some college/university'\ndf.loc[(df.Q4 == 'No formal education past high school'), 'Q4']='High school'\ndf.loc[(df.Q4 == 'I prefer not to answer'), 'Q4']='Prefer not to answer'\n\n# Ordering education levels by reverse typical chronological completion\nq4_order = [\n 'PhD',\n 'Masterโs', \n 'Professional degree', \n 'Bachelorโs', \n 'Some college/university', \n 'High school', \n 'Prefer not to answer']\n\n# Putting coding experience answers in order from shortest time to longest\nq6_order = [\n 'I have never written code', \n '< 1 years', \n '1-2 years', \n '3-5 years', \n '5-10 years', \n '10-20 years', \n '20+ years']\n\ndf.loc[(df.Q37_Part_9 == 'Cloud-certification programs (direct from AWS, Azure, GCP, or similar)'), 'Q37_Part_9']='Cloud-certification programs'\ndf.loc[(df.Q37_Part_10 == 'University Courses (resulting in a university degree)'), 'Q37_Part_10']='University Courses resulting in a degree'",
"_____no_output_____"
]
],
[
[
"We're going to focus on the US answers from currently employed Kagglers.",
"_____no_output_____"
]
],
[
[
"# Filtering for just US responses\nus_df = df[df['Q3'] == 'USA']\n\n# Filtering to only include currently employed Kagglers\nq5_order = [\n 'Data Scientist',\n 'Software Engineer',\n 'Data Analyst', \n 'Research Scientist',\n 'Product/Project Manager',\n 'Business Analyst',\n 'Machine Learning Engineer',\n 'Data Engineer',\n 'Statistician',\n 'DBA/Database Engineer',\n 'Other']\n\nus_df = us_df[us_df['Q5'].isin(q5_order)]",
"_____no_output_____"
]
],
[
[
"We're interested in the demographic questions at the beginning, plus coding experience, coding languages used, and online learning platforms used. ",
"_____no_output_____"
]
],
[
[
"# Filtering to only include specific question columns\nus_df = us_df.loc[:, ['Q1', 'Q2', 'Q3', 'Q4', 'Q5', 'Q6', 'Q7_Part_1', 'Q7_Part_2','Q7_Part_3','Q7_Part_4','Q7_Part_5',\n 'Q7_Part_6', 'Q7_Part_7','Q7_Part_8','Q7_Part_9','Q7_Part_10','Q7_Part_11', 'Q7_Part_12', 'Q7_OTHER',\n 'Q37_Part_1', 'Q37_Part_2', 'Q37_Part_3', 'Q37_Part_4', 'Q37_Part_5', 'Q37_Part_6', 'Q37_Part_7', \n 'Q37_Part_8', 'Q37_Part_9', 'Q37_Part_10','Q37_Part_11', 'Q37_OTHER']]",
"_____no_output_____"
],
[
"us_df.isna().sum()",
"_____no_output_____"
]
],
[
[
"Not much in the way of missing values in the first 6 questions; that changes for the multiple-column questions, as expected, since users only filled in the column when they were choosing that particular option. We'll address that by converting the missing values to zeros in the helper functions.",
"_____no_output_____"
]
],
[
[
"us_df.shape",
"_____no_output_____"
]
],
[
[
"This will be the data for our analysis -- covering 1680 currently employed Kagglers in the US.",
"_____no_output_____"
],
[
"## Helper functions",
"_____no_output_____"
],
[
"A few functions to help with data visualizations. The first two plot a barchart with a corresponding list of the counts and percentages for the values; one handles single-column questions and the other handles multiple-column questions. The third and fourth are heatmap functions -- one for single-column questions, and one for multiple-column questions.",
"_____no_output_____"
]
],
[
[
"def list_and_bar(qnum, q_order, title):\n \n '''\n INPUT:\n qnum - the y-axis variable, a single-column question\n q_order - the order to display responses on the barchart\n title - the title of the barchart\n \n OUTPUT:\n 1. A list of responses to the selected question, in descending order\n 2. A horizontal barchart showing the values, in sorted order \n '''\n\n # creating a dataframe of values to include both raw counts and percentages\n val_list = pd.DataFrame()\n val_list['Count'] = us_df[qnum].value_counts()\n pct = round(val_list * 100/us_df[qnum].count(),2)\n val_list['Pct'] = pct\n \n print(val_list)\n \n fig, ax = plt.subplots(1, 1, figsize=(12,6))\n ax = us_df[qnum].value_counts()[q_order].plot(kind='barh')\n \n # reversing the order of y axis -- \n # the horizontal barchart displays values in the reverse order of a regular barchart (i.e., where the barchart might show \n # a - b - c left to right, the corresponding horizontal barchart would show c at the top, and a at the bottom)\n ax.invert_yaxis()\n \n plt.title(title, fontsize = 14, fontweight = 'bold')\n plt.show()\n \n \n\ndef list_and_bar_mc(mc_df, title):\n \n '''\n INPUT:\n mc_df - a dataframe consisting of answers to a specific multiple-column question\n title - the title of the barchart\n \n OUTPUT:\n 1. A list of responses to the selected question, in descending order\n 2. A horizontal barchart showing the values, also in descending order\n '''\n print(mc_df)\n \n fig, ax = plt.subplots(1, 1, figsize=(12,6))\n mc_df['Count'].sort_values().plot(kind='barh')\n plt.title(title, fontsize = 14, fontweight = 'bold')\n plt.show()\n \n \n\ndef heatmap(qnum_a, qnum_b, title, order_rows, columns):\n \n '''\n INPUT:\n qnum_a - the x-axis variable, a single-column question\n qnum_b - the y-axis variable, a single-column question\n title - the title of the heatmap, describing the variables in the visualization\n order_rows - sorted order for the y-axis\n columns - sorted order for the x-axis\n \n OUTPUT:\n A heatmap showing the correlation between the two chosen variables\n '''\n vals = us_df[[qnum_a, qnum_b]].groupby(qnum_b)[qnum_a].value_counts().unstack()\n \n # getting the total number of responses for the columns in order to calculate the % of the total\n vals_rowsums = pd.DataFrame([vals.sum(axis=0).tolist()], columns=vals.columns, index=['All'])\n vals = pd.concat([vals_rowsums, vals], axis=0)\n\n # convert to % \n vals = ((vals.T / (vals.sum(axis=1) + 0.001)).T) * 100 \n\n order = order_rows\n columns = columns\n \n vals = vals.reindex(order).reindex(columns = columns)\n \n fig, ax = plt.subplots(1, 1, figsize=[12,6])\n ax = sns.heatmap(ax = ax, data = vals, cmap = 'GnBu', cbar_kws = {'format': '%.0f%%'})\n plt.title(title, fontsize = 14, fontweight = 'bold')\n ax.set_xlabel('')\n ax.set_ylabel('')\n plt.show()\n \n \n\ndef heatmap_mc(qnum, qnum_mc, title, columns, order_rows):\n \n '''\n INPUT:\n qnum - the y-axis variable, a single-column question\n qnum_mc - the x-axis variable, a question with multiple columns of answers\n title - the title of the heatmap, describing the variables in the visualization\n order_rows - sorted order for the y-axis\n columns - a list of column names, representing the multiple-column answer options, ordered\n \n OUTPUT:\n 1. A heatmap showing the correlation between the two specified variables\n 2. 
avg_num - the average number of answer options chosen for the multiple column question\n '''\n # creating a dataframe with the single-column question\n df_qnum = us_df[qnum]\n df_qnum = pd.DataFrame(df_qnum)\n \n # creating a dataframe containing all the columns for a given multiple-column question\n cols_mc = [col for col in us_df if col.startswith(qnum_mc)]\n df_mc = us_df[cols_mc]\n df_mc.columns = columns\n \n # converting column values to binary 0 or 1 values (1 if the user chose that answer, 0 if not)\n df_mc = df_mc.notnull().astype(int)\n \n # joining the dataframes together\n df_join = df_qnum.join(df_mc)\n \n # aggregating counts for each answer option and re-ordering dataframe\n df_agg = df_join.groupby([qnum]).agg('sum')\n df_agg = df_agg.reindex(order_rows)\n \n df_agg['users'] = df_join.groupby(qnum)[qnum].count()\n df_agg = df_agg.div(df_agg.loc[:, 'users'], axis=0)\n df_agg.drop(columns='users', inplace=True)\n \n \n fig, ax = plt.subplots(1, 1, figsize=(12, 6))\n ax = sns.heatmap(ax = ax, data = df_agg, cmap = 'GnBu')\n cbar = ax.collections[0].colorbar\n cbar.ax.yaxis.set_major_formatter(PercentFormatter(1, 0))\n plt.title(title, fontsize = 14, fontweight = 'bold')\n ax.set_xlabel('')\n ax.set_ylabel('')\n plt.show() \n \n # finding the average number of answers chosen for the multiple column options, minus tabulations for 'None'\n df_temp = df_join\n df_temp.drop('None', axis = 1, inplace = True)\n rowsums = df_temp.sum(axis = 1)\n avg_num = round(rowsums.mean(), 2)\n \n print('Average number of options chosen by survey respondents: ' + str(avg_num) + '.')\n",
"_____no_output_____"
]
],
[
[
"## Analysis and visualizations",
"_____no_output_____"
],
[
"We'll start by looking at the age and gender distribution, just to get an overview of the response community.",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=[12,6])\nus_ages = us_df['Q1'].value_counts().sort_index()\nsns.countplot(data = us_df, x = 'Q1', hue = 'Q2', order = us_ages.index)\nplt.title('Age and Gender Distribution')",
"_____no_output_____"
]
],
[
[
"The survey response pool skews heavily male, with most US Kagglers between the ages of 25 and 45. ",
"_____no_output_____"
]
],
[
[
"list_and_bar('Q6', q6_order, 'Years of Coding Experience')",
" Count Pct\n3-5 years 367 22.00\n20+ years 349 20.92\n5-10 years 334 20.02\n10-20 years 288 17.27\n1-2 years 171 10.25\n< 1 years 104 6.24\nI have never written code 55 3.30\n"
]
],
[
[
"Around 80 percent of those responding have 3 or more years experience coding.",
"_____no_output_____"
],
[
"### 1. Do you need a formal degree to become a data science professional?",
"_____no_output_____"
],
[
"Let's look at formal education, and how it correlates with job title.",
"_____no_output_____"
]
],
[
[
"list_and_bar('Q4', q4_order, 'Highest Level of Education Attained')",
" Count Pct\nMasterโs 819 48.75\nBachelorโs 409 24.35\nPhD 334 19.88\nSome college/university 71 4.23\nProfessional degree 34 2.02\nPrefer not to answer 8 0.48\nHigh school 5 0.30\n"
],
[
"list_and_bar('Q5', q5_order, 'Current Job Title')",
" Count Pct\nData Scientist 389 23.15\nOther 292 17.38\nSoftware Engineer 219 13.04\nData Analyst 192 11.43\nResearch Scientist 140 8.33\nProduct/Project Manager 117 6.96\nBusiness Analyst 107 6.37\nMachine Learning Engineer 97 5.77\nData Engineer 71 4.23\nStatistician 38 2.26\nDBA/Database Engineer 18 1.07\n"
],
[
"heatmap('Q4', 'Q5', 'Roles by Education Level', q5_order, q4_order)",
"_____no_output_____"
]
],
[
[
"### Question 1 analysis",
"_____no_output_____"
],
[
"With almost 49% of the responses, a Master's degree was by far the most common level of education listed, more than double the next most popular answer. Other notable observations:\n * Sixty-eight percent of US Kagglers hold a Master's Degree or higher. \n * Research scientists and statisticians are most likely to hold PhDs, followed by Data Scientists.\n * Relatively few survey respondents (around 5%) indicate they do not have at least a Bachelor's degree.\n * Only 23% of those responding hold the title of Data Scientist, but it is nonetheless the title with the highest count. \n Arguably anyone who is active on Kaggle and who would complete their survey considers themself to be either in, or \n interested in, the data science field, if not actively working as a Data Scientist. ",
"_____no_output_____"
],
[
"### Question 2. What programming language(s) do Data Scientists need to know?",
"_____no_output_____"
],
[
"Now we'll turn to programming languages used. As this is a \"Select all that apply\" question, with each language option appearing as a separate column, we need to do some processing to get the data into a format for easier graphing and analysis.",
"_____no_output_____"
]
],
[
[
"# creating a dataframe of the language options and the number of times each language was selected\nlanguages = pd.DataFrame()\n\nfor col in us_df.columns:\n if(col.startswith('Q7_')):\n language = us_df[col].value_counts()\n languages = languages.append({'Language':language.index[0], 'Count':language[0]}, ignore_index=True)\nlanguages = languages.set_index('Language')\nlanguages = languages.sort_values(by = 'Count', ascending = False)\nlanguages_tot = sum(languages.Count)\nlanguages['Pct'] = round((languages['Count'] * 100 / languages_tot), 2)",
"_____no_output_____"
],
[
"list_and_bar_mc(languages, 'Programming Languages Used')",
" Count Pct\nLanguage \nPython 1290.0 29.72\nSQL 899.0 20.71\nR 549.0 12.65\nBash 304.0 7.00\nOther 281.0 6.47\nJavascript 265.0 6.11\nJava 214.0 4.93\nC++ 177.0 4.08\nMATLAB 138.0 3.18\nC 125.0 2.88\nJulia 37.0 0.85\nNone 37.0 0.85\nSwift 24.0 0.55\n"
],
[
"heatmap_mc('Q5', 'Q7', 'Language Use by Role', languages.index, q5_order)",
"_____no_output_____"
],
[
"heatmap_mc('Q4', 'Q7','Language Use by Education Level', languages.index, q4_order)",
"_____no_output_____"
],
[
"heatmap_mc('Q6', 'Q7', 'Language Use by Years Coding', languages.index, q6_order)",
"_____no_output_____"
]
],
[
[
"### Question 2 analysis",
"_____no_output_____"
],
[
"Python was the most widely used language, followed by SQL and R. Python held the top spot across almost all job roles -- only Statisticians listed another language (SQL) higher -- and for all education levels and coding experience. R enjoys widespread popularity across education level and years coding as well; SQL shows a high number of users overall, but they are more concentrated in people holding Master's or PhD degrees, working as Statisticians, Data Scientists and Data Analysts.",
"_____no_output_____"
],
[
"Kagglers reported using 2-3 languages on a regular basis.",
"_____no_output_____"
],
[
"### 3. What are the preferred online learning platforms to gain data science knowledge and skills?",
"_____no_output_____"
],
[
"Regarding online learning, Kaggle's survey asked, \"On which platforms have you begun or completed data science courses? (Select all that apply).\" We'll handle the answers similarly to the language data. ",
"_____no_output_____"
]
],
[
[
"# creating a dataframe of online course providers and the number of times each was selected by users\nplatforms = pd.DataFrame()\n\nfor col in us_df.columns:\n if(col.startswith('Q37_')):\n platform = us_df[col].value_counts()\n platforms = platforms.append({'Platform':platform.index[0], 'Count':platform[0]}, ignore_index=True)\nplatforms = platforms.set_index('Platform')\nplatforms = platforms.sort_values(by = 'Count', ascending=False)\nplatforms_tot = sum(platforms.Count)\nplatforms['Pct'] = round((platforms['Count'] * 100 / platforms_tot), 2)",
"_____no_output_____"
],
[
"list_and_bar_mc(platforms, 'Learning Platforms Used')",
" Count Pct\nPlatform \nCoursera 774.0 20.78\nKaggle Learn Courses 433.0 11.63\nUniversity Courses resulting in a degree 414.0 11.12\nUdemy 393.0 10.55\nDataCamp 367.0 9.85\nedX 328.0 8.81\nUdacity 254.0 6.82\nLinkedIn Learning 209.0 5.61\nNone 154.0 4.14\nFast.ai 144.0 3.87\nOther 139.0 3.73\nCloud-certification programs 115.0 3.09\n"
],
[
"heatmap_mc('Q5', 'Q37', 'Learning Platform Use by Role', platforms.index, q5_order)",
"_____no_output_____"
],
[
"heatmap_mc('Q4', 'Q37', 'Learning Platform Use by Education Level', platforms.index, q4_order)",
"_____no_output_____"
]
],
[
[
"### Question 3 analysis",
"_____no_output_____"
],
[
"Coursera was the most popular response, by a good margin. Kaggle Learn, University Courses towards a degree and Udemy followed, with Datacamp and edX not far behind. Kaggle Learn is a relatively new entrant into this area, offering short, narrowly-focused, skill-based courses for free which offer certificates upon completion. These factors may all contribute to the platform's popularity, as it is easy to try out for the cost of a few hours and no money.",
"_____no_output_____"
],
[
"Kagglers reported trying data science courses on two platforms, on average.",
"_____no_output_____"
],
[
"Coursera's popularity was high across almost education levels and job titles. Kaggle Learn's usage was fairly uniform across categories. Fast.ai was popular with Research Scientists, Data Scientists, Machine Learnig Engineers, and Statisticians. Other platforms seem to enjoy popularity with some groups more than others, but not in ways that make it easy to extrapolate much.",
"_____no_output_____"
],
[
"## Conclusion",
"_____no_output_____"
],
[
"The most well-travelled path into the data science field, at least for those responding to the 2020 Kaggle survey:\n * Get at least a Bachelor's degree, though a Master's degree may be preferable\n * Learn at least 2 coding languages -- Python and R are the top data science languages; depending on the role you want,\n you might want to get comfortable with another language, such as SQL or C.\n * Take classes on online learning platforms to update your skills and learn new ones. Coursera is the standard, while\n Kaggle Learn is a good option for short,targeted learning.",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
]
|
d06247bb441bf750a87dbb311ea26d7a156ab0c3 | 9,961 | ipynb | Jupyter Notebook | 7.16.ipynb | zhayanqi/mysql | 9f03e75ca641cff27fb203ddd0d397d357019057 | [
"Apache-2.0"
]
| null | null | null | 7.16.ipynb | zhayanqi/mysql | 9f03e75ca641cff27fb203ddd0d397d357019057 | [
"Apache-2.0"
]
| null | null | null | 7.16.ipynb | zhayanqi/mysql | 9f03e75ca641cff27fb203ddd0d397d357019057 | [
"Apache-2.0"
]
| null | null | null | 19.119002 | 318 | 0.466419 | [
[
[
"# ๅบๆฌ็จๅบ่ฎพ่ฎก\n- ไธๅไปฃ็ ่พๅ
ฅ๏ผ่ฏทไฝฟ็จ่ฑๆ่พๅ
ฅๆณ",
"_____no_output_____"
]
],
[
[
"print('hello word')",
"_____no_output_____"
],
[
"print 'hello'",
"_____no_output_____"
]
],
[
[
"## ็ผๅไธไธช็ฎๅ็็จๅบ\n- ๅๅ
ฌๅผ้ข็งฏ๏ผ area = radius \\* radius \\* 3.1415",
"_____no_output_____"
]
],
[
[
"radius = 1.0\narea = radius * radius * 3.14 # ๅฐๅๅ้จๅ็็ปๆ่ตๅผ็ปๅ้area\n# ๅ้ไธๅฎ่ฆๆๅๅงๅผ๏ผ๏ผ๏ผ\n# radius: ๅ้.area: ๅ้๏ผ\n# int ็ฑปๅ\nprint(area)",
"3.14\n"
]
],
[
[
"### ๅจPython้้ขไธ้่ฆๅฎไนๆฐๆฎ็็ฑปๅ",
"_____no_output_____"
],
[
"## ๆงๅถๅฐ็่ฏปๅไธ่พๅ
ฅ\n- input ่พๅ
ฅ่ฟๅป็ๆฏๅญ็ฌฆไธฒ\n- eval",
"_____no_output_____"
]
],
[
[
"radius = input('่ฏท่พๅ
ฅๅๅพ') # inputๅพๅฐ็็ปๆๆฏๅญ็ฌฆไธฒ็ฑปๅ\nradius = float(radius)\narea = radius * radius * 3.14\nprint('้ข็งฏไธบ:',area)",
"่ฏท่พๅ
ฅๅๅพ10\n้ข็งฏไธบ: 314.0\n"
]
],
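[
[
"# eval converts the input string to a number directly (a sketch of the\n# alternative mentioned above; use with care -- eval runs arbitrary code)\n# radius = eval(input('Please enter the radius'))",
"_____no_output_____"
]
],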
[
[
"- ๅจjupyter็จshift + tab ้ฎๅฏไปฅ่ทณๅบ่งฃ้ๆๆกฃ",
"_____no_output_____"
],
[
"## ๅ้ๅฝๅ็่ง่\n- ็ฑๅญๆฏใๆฐๅญใไธๅ็บฟๆๆ\n- ไธ่ฝไปฅๆฐๅญๅผๅคด \\*\n- ๆ ่ฏ็ฌฆไธ่ฝๆฏๅ
ณ้ฎ่ฏ(ๅฎ้
ไธๆฏๅฏไปฅๅผบๅถๆนๅ็๏ผไฝๆฏๅฏนไบไปฃ็ ่ง่่่จๆฏๆๅ
ถไธ้ๅ)\n- ๅฏไปฅๆฏไปปๆ้ฟๅบฆ\n- ้ฉผๅณฐๅผๅฝๅ",
"_____no_output_____"
],
[
"## ๅ้ใ่ตๅผ่ฏญๅฅๅ่ตๅผ่กจ่พพๅผ\n- ๅ้: ้ไฟ็่งฃไธบๅฏไปฅๅๅ็้\n- x = 2 \\* x + 1 ๅจๆฐๅญฆไธญๆฏไธไธชๆน็จ๏ผ่ๅจ่ฏญ่จไธญๅฎๆฏไธไธช่กจ่พพๅผ\n- test = test + 1 \\* ๅ้ๅจ่ตๅผไนๅๅฟ
้กปๆๅผ",
"_____no_output_____"
],
[
"## ๅๆถ่ตๅผ\nvar1, var2,var3... = exp1,exp2,exp3...",
"_____no_output_____"
],
[
"## ๅฎไนๅธธ้\n- ๅธธ้๏ผ่กจ็คบไธ็งๅฎๅผๆ ่ฏ็ฌฆ๏ผ้ๅไบๅคๆฌกไฝฟ็จ็ๅบๆฏใๆฏๅฆPI\n- ๆณจๆ๏ผๅจๅ
ถไปไฝ็บง่ฏญ่จไธญๅฆๆๅฎไนไบๅธธ้๏ผ้ฃไน๏ผ่ฏฅๅธธ้ๆฏไธๅฏไปฅ่ขซๆนๅ็๏ผไฝๆฏๅจPythonไธญไธๅ็ๅฏน่ฑก๏ผๅธธ้ไนๆฏๅฏไปฅ่ขซๆนๅ็",
"_____no_output_____"
],
[
"## ๆฐๅผๆฐๆฎ็ฑปๅๅ่ฟ็ฎ็ฌฆ\n- ๅจPythonไธญๆไธค็งๆฐๅผ็ฑปๅ๏ผint ๅ float๏ผ้็จไบๅ ๅไน้คใๆจกใๅนๆฌก\n<img src = \"../Photo/01.jpg\"></img>",
"_____no_output_____"
],
[
"## ่ฟ็ฎ็ฌฆ /ใ//ใ**",
"_____no_output_____"
],
[
"## ่ฟ็ฎ็ฌฆ %",
"_____no_output_____"
],
[
"## EP๏ผ\n- 25/4 ๅคๅฐ๏ผๅฆๆ่ฆๅฐๅ
ถ่ฝฌๅไธบๆดๆฐ่ฏฅๆไนๆนๅ\n- ่พๅ
ฅไธไธชๆฐๅญๅคๆญๆฏๅฅๆฐ่ฟๆฏๅถๆฐ\n- ่ฟ้ถ: ่พๅ
ฅไธไธช็ง๏ผๆฐ๏ผๅไธไธช็จๅบๅฐๅ
ถ่ฝฌๆขๆๅๅ็ง๏ผไพๅฆ500็ง็ญไบ8ๅ20็ง\n- ่ฟ้ถ: ๅฆๆไปๅคฉๆฏๆๆๅ
ญ๏ผ้ฃไน10ๅคฉไปฅๅๆฏๆๆๅ ๏ผ ๆ็คบ๏ผๆฏไธชๆๆ็็ฌฌ0ๅคฉๆฏๆๆๅคฉ",
"_____no_output_____"
]
],
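[
[
"A sketch for the seconds-to-minutes exercise (assuming an integer number of seconds):\n\n```python\nseconds = eval(input('seconds'))\nminutes = seconds // 60\nremainder = seconds % 60\nprint(minutes, 'minutes', remainder, 'seconds')  # 500 -> 8 minutes 20 seconds\n```",
"_____no_output_____"
]
],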
[
[
"day = eval(input('week'))\nplus_day = eval(input('plus'))\n",
"_____no_output_____"
]
],
[
[
"## ่ฎก็ฎ่กจ่พพๅผๅ่ฟ็ฎไผๅ
็บง\n<img src = \"../Photo/02.png\"></img>\n<img src = \"../Photo/03.png\"></img>",
"_____no_output_____"
],
[
"## ๅขๅผบๅ่ตๅผ่ฟ็ฎ\n<img src = \"../Photo/04.png\"></img>",
"_____no_output_____"
],
[
"## ็ฑปๅ่ฝฌๆข\n- float -> int\n- ๅ่ไบๅ
ฅ round",
"_____no_output_____"
],
[
"## EP:\n- ๅฆๆไธไธชๅนด่ฅไธ็จไธบ0.06%๏ผ้ฃไนๅฏนไบ197.55e+2็ๅนดๆถๅ
ฅ๏ผ้่ฆไบค็จไธบๅคๅฐ๏ผ(็ปๆไฟ็2ไธบๅฐๆฐ)\n- ๅฟ
้กปไฝฟ็จ็งๅญฆ่ฎกๆฐๆณ",
"_____no_output_____"
],
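[
"A sketch for the tax exercise (assuming 0.06% means a rate of 0.0006):\n\n```python\nincome = 197.55e+2\ntax = income * 6e-4\nprint(round(tax, 2))  # 11.85\n```",
"_____no_output_____"
],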
[
"# Project\n- ็จPythonๅไธไธช่ดทๆฌพ่ฎก็ฎๅจ็จๅบ๏ผ่พๅ
ฅ็ๆฏๆไพ(monthlyPayment) ่พๅบ็ๆฏๆป่ฟๆฌพๆฐ(totalpayment)\n",
"_____no_output_____"
],
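[
"A minimal sketch of the loan calculator (assuming the number of payments is also read in, since the prompt does not fix it):\n\n```python\nmonthlyPayment = float(input('monthly payment'))\nnumberOfMonths = int(input('number of months'))\ntotalpayment = monthlyPayment * numberOfMonths\nprint('Total payment:', totalpayment)\n```",
"_____no_output_____"
],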
[
"# Homework\n- 1\n<img src=\"../Photo/06.png\"></img>",
"_____no_output_____"
]
],
[
[
"celsius = input('่ฏท่พๅ
ฅๆธฉๅบฆ')\ncelsius = float(celsius)\nfahrenheit = (9/5) * celsius + 32\nprint(celsius,'Celsius is',fahrenheit,'Fahrenheit')",
"่ฏท่พๅ
ฅๆธฉๅบฆ43\n43.0 Celsius is 109.4 Fahrenheit\n"
]
],
[
[
"- 2\n<img src=\"../Photo/07.png\"></img>",
"_____no_output_____"
]
],
[
[
"radius = input('่ฏท่พๅ
ฅๅๅพ')\nlength = input('่ฏท่พๅ
ฅ้ซ')\nradius = float(radius)\nlength = float(length)\narea = radius * radius * 3.14\nvolume = area * length\nprint('The area is',area)\nprint('The volume is',volume)",
"่ฏท่พๅ
ฅๅๅพ5.5\n่ฏท่พๅ
ฅ้ซ12\nThe area is 94.985\nThe volume is 1139.82\n"
]
],
[
[
"- 3\n<img src=\"../Photo/08.png\"></img>",
"_____no_output_____"
]
],
[
[
"feet = input('่ฏท่พๅ
ฅ่ฑๅฐบ')\nfeet = float(feet)\nmeter = feet * 0.305\nprint(feet,'feet is',meter,'meters')",
"่ฏท่พๅ
ฅ่ฑๅฐบ16.5\n16.5 feet is 5.0325 meters\n"
]
],
[
[
"- 4\n<img src=\"../Photo/10.png\"></img>",
"_____no_output_____"
]
],
[
[
"M = input('่ฏท่พๅ
ฅๆฐด้')\ninitial = input('่ฏท่พๅ
ฅๅๅงๆธฉๅบฆ')\nfinal = input('่ฏท่พๅ
ฅๆ็ปๆธฉๅบฆ')\nM = float(M)\ninitial = float(initial)\nfinal = float(final)\nQ = M * (final - initial) * 4184\nprint('The energy needed is ',Q)",
"่ฏท่พๅ
ฅๆฐด้55.5\n่ฏท่พๅ
ฅๅๅงๆธฉๅบฆ3.5\n่ฏท่พๅ
ฅๆ็ปๆธฉๅบฆ10.5\nThe energy needed is 1625484.0\n"
]
],
[
[
"- 5\n<img src=\"../Photo/11.png\"></img>",
"_____no_output_____"
]
],
[
[
"cha = input('่ฏท่พๅ
ฅๅทฎ้ข')\nrate = input('่ฏท่พๅ
ฅๅนดๅฉ็')\ncha = float(cha)\nrate = float(rate)\ninterest = cha * (rate/1200)\nprint(interest)",
"่ฏท่พๅ
ฅๅทฎ้ข1000\n่ฏท่พๅ
ฅๅนดๅฉ็3.5\n2.916666666666667\n"
]
],
[
[
"- 6\n<img src=\"../Photo/12.png\"></img>",
"_____no_output_____"
]
],
[
[
"start = input('่ฏท่พๅ
ฅๅๅง้ๅบฆ')\nend = input('่ฏท่พๅ
ฅๆซ้ๅบฆ')\ntime = input('่ฏท่พๅ
ฅๆถ้ด')\nstart = float(start)\nend =float(end)\ntime = float(time)\na = (end - start)/time\nprint(a)",
"่ฏท่พๅ
ฅๅๅง้ๅบฆ5.5\n่ฏท่พๅ
ฅๆซ้ๅบฆ50.9\n่ฏท่พๅ
ฅๆถ้ด4.5\n10.088888888888889\n"
]
],
[
[
"- 7 ่ฟ้ถ\n<img src=\"../Photo/13.png\"></img>",
"_____no_output_____"
],
[
"- 8 ่ฟ้ถ\n<img src=\"../Photo/14.png\"></img>",
"_____no_output_____"
]
],
[
[
"a,b = eval(input('>>'))\nprint(a,b)\nprint(type(a),type(b))",
">>1,1.0\n1 1.0\n<class 'int'> <class 'float'>\n"
],
[
"a = eval(input('>>'))\nprint(a)",
">>1,2,3,4,5,6\n(1, 2, 3, 4, 5, 6)\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
]
]
|
d062488abb878f5992408c178f41f69940586dd0 | 55,258 | ipynb | Jupyter Notebook | notebooks/sum_backbone_stack_hb_0.ipynb | yizaochen/enmspring | 84c9aabeb7f87eda43967d86c763b7d600986215 | [
"MIT"
]
| null | null | null | notebooks/sum_backbone_stack_hb_0.ipynb | yizaochen/enmspring | 84c9aabeb7f87eda43967d86c763b7d600986215 | [
"MIT"
]
| null | null | null | notebooks/sum_backbone_stack_hb_0.ipynb | yizaochen/enmspring | 84c9aabeb7f87eda43967d86c763b7d600986215 | [
"MIT"
]
| null | null | null | 40.931852 | 3,264 | 0.434127 | [
[
[
"from os import path\nfrom enmspring.sum_bb_st_hb_k import ThreeBar\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom matplotlib import rcParams\nbig_traj_folder = '/home/ytcdata/bigtraj_fluctmatch/500ns'\ndrawzone_folder = '/home/yizaochen/Desktop/drawzone_temp'\ndata_folder = '/home/yizaochen/Documents/dna_2021_drawzone/summation_bb_st_hb'\nrcParams['font.family'] = 'Arial'",
"_____no_output_____"
]
],
[
[
"### Part 1: Initailize Plot Agent",
"_____no_output_____"
]
],
[
[
"plot_agent = ThreeBar(big_traj_folder, data_folder)",
"_____no_output_____"
]
],
[
[
"### Part 2: Make/Read DataFrame",
"_____no_output_____"
]
],
[
[
"makedf = False\nif makedf:\n plot_agent.ini_b_agent()\n plot_agent.ini_s_agent()\n plot_agent.ini_h_agent()\n plot_agent.make_df_for_all_host()",
"_____no_output_____"
],
[
"plot_agent.read_df_for_all_host()",
"_____no_output_____"
]
],
[
[
"### Part 2: Bar Plot",
"_____no_output_____"
]
],
[
[
"figsize = (1.817, 1.487)\nhspace = 0\n\nplot_agent.plot_main(figsize, hspace)\nsvg_out = path.join(drawzone_folder, 'sum_bb_st_hb.svg')\nplt.savefig(svg_out, dpi=200)\nplt.show()",
"_____no_output_____"
],
[
"from enmspring.graphs_bigtraj import BackboneMeanModeAgent",
"_____no_output_____"
],
[
"host = 'a_tract_21mer'\ninterval_time = 500\nb_agent = BackboneMeanModeAgent(host, big_traj_folder, interval_time)",
"/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/mean_mode_npy exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/0_500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/250_750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/500_1000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/750_1250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1000_1500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1250_1750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1500_2000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1750_2250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2000_2500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2250_2750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2500_3000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2750_3250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3000_3500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3250_3750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3500_4000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3750_4250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/4000_4500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/4250_4750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/4500_5000/pd_dfs exists\n"
],
[
"b_agent.preprocess_all_small_agents()",
"Thare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... 
Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\n"
],
[
"b_agent.d_smallagents[(0,500)].laplacian_mat",
"_____no_output_____"
],
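[
"# Hypothetical sketch (not part of the original analysis): the logs above report initializing\n# adjacency, degree and Laplacian matrices, which for an undirected graph are related by L = D - A.\n# A tiny NumPy illustration on a 3-node path graph; the enmspring internals may of course differ.\nimport numpy as np\nA = np.array([[0, 1, 0],\n              [1, 0, 1],\n              [0, 1, 0]], dtype=float)  # adjacency matrix\nD = np.diag(A.sum(axis=1))              # degree matrix\nL = D - A                               # graph Laplacian\nprint(L)\nprint(np.linalg.eigvalsh(L))            # eigenvalues are >= 0; the smallest is 0",
"_____no_output_____"
],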
[
"b_agent.initialize_all_maps()",
"_____no_output_____"
],
[
"b_agent.n_window",
"_____no_output_____"
],
[
"from enmspring.hb_k import HBResidPlotV1\nbigtraj_folder = '/home/ytcdata/bigtraj_fluctmatch'\ndf_folder = '/home/yizaochen/Documents/dna_2021_drawzone/local_hb'",
"_____no_output_____"
],
[
"interval_time = 500\nplot_agent = HBResidPlotV1(bigtraj_folder, interval_time, df_folder)",
"_____no_output_____"
],
[
"plot_agent.read_mean_std_df()",
"Read df_mean from /home/yizaochen/Documents/dna_2021_drawzone/local_hb/hb.mean.csv\nRead df_std from /home/yizaochen/Documents/dna_2021_drawzone/local_hb/hb.std.csv\n"
],
[
"plot_agent.df_mean",
"_____no_output_____"
],
[
"plot_agent.df_std",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0624c3648d264d7db51f66d4e56be8724034121 | 5,430 | ipynb | Jupyter Notebook | notebooks/xmap.ipynb | yssource/xplot | 69233b204bd680eeb19cecbe7712c3e09fefb83a | [
"BSD-3-Clause"
]
| null | null | null | notebooks/xmap.ipynb | yssource/xplot | 69233b204bd680eeb19cecbe7712c3e09fefb83a | [
"BSD-3-Clause"
]
| null | null | null | notebooks/xmap.ipynb | yssource/xplot | 69233b204bd680eeb19cecbe7712c3e09fefb83a | [
"BSD-3-Clause"
]
| null | null | null | 19.462366 | 73 | 0.466851 | [
[
[
"empty"
]
]
]
| [
"empty"
]
| [
[
"empty"
]
]
|
d0625bd93fde81ac8be7517aa3101d8361d6bd43 | 47,126 | ipynb | Jupyter Notebook | chatbot.ipynb | Kevinz930/Alexiri-chatbot- | 43cd1daf633516a79a6d7ff23beb866f5f59d62d | [
"MIT"
]
| null | null | null | chatbot.ipynb | Kevinz930/Alexiri-chatbot- | 43cd1daf633516a79a6d7ff23beb866f5f59d62d | [
"MIT"
]
| null | null | null | chatbot.ipynb | Kevinz930/Alexiri-chatbot- | 43cd1daf633516a79a6d7ff23beb866f5f59d62d | [
"MIT"
]
| null | null | null | 38.407498 | 244 | 0.533463 | [
[
[
"from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport torch\nfrom torch.jit import script, trace\nimport torch.nn as nn\nfrom torch import optim\nimport torch.nn.functional as F\nimport csv\nimport random\nimport re\nimport os\nimport unicodedata\nimport codecs\nfrom io import open\nimport itertools\nimport math\nimport gensim",
"_____no_output_____"
],
[
"USE_CUDA = torch.cuda.is_available()\ndevice = torch.device(\"cuda\" if USE_CUDA else \"cpu\")",
"_____no_output_____"
]
],
[
[
"# Load & Preprocess Data",
"_____no_output_____"
],
[
"### Cornell Movie Dialogues Corpus",
"_____no_output_____"
]
],
[
[
"corpus_name = \"cornell movie-dialogs corpus\"\ncorpus = os.path.join(\"data\", corpus_name)\n\ndef printLines(file, n=10):\n with open(file, 'rb') as datafile:\n lines = datafile.readlines()\n for line in lines[:n]:\n print(line)\n\nprintLines(os.path.join(corpus, \"movie_lines.txt\"))",
"b'L1045 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ They do not!\\r\\n'\nb'L1044 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ They do to!\\r\\n'\nb'L985 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ I hope so.\\r\\n'\nb'L984 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ She okay?\\r\\n'\nb\"L925 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ Let's go.\\r\\n\"\nb'L924 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ Wow\\r\\n'\nb\"L872 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ Okay -- you're gonna need to learn how to lie.\\r\\n\"\nb'L871 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ No\\r\\n'\nb'L870 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ I\\'m kidding. You know how sometimes you just become this \"persona\"? And you don\\'t know how to quit?\\r\\n'\nb'L869 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ Like my fear of wearing pastels?\\r\\n'\n"
],
[
"# Splits each line of the file into a dictionary of fields\ndef loadLines(fileName, fields):\n lines = {}\n with open(fileName, 'r', encoding='iso-8859-1') as f:\n for line in f:\n values = line.split(\" +++$+++ \")\n # Extract fields\n lineObj = {}\n for i, field in enumerate(fields):\n lineObj[field] = values[i]\n lines[lineObj['lineID']] = lineObj\n return lines\n\n\n# Groups fields of lines from `loadLines` into conversations based on *movie_conversations.txt*\ndef loadConversations(fileName, lines, fields):\n conversations = []\n with open(fileName, 'r', encoding='iso-8859-1') as f:\n for line in f:\n values = line.split(\" +++$+++ \")\n # Extract fields\n convObj = {}\n for i, field in enumerate(fields):\n convObj[field] = values[i]\n # Convert string to list (convObj[\"utteranceIDs\"] == \"['L598485', 'L598486', ...]\")\n utterance_id_pattern = re.compile('L[0-9]+')\n lineIds = utterance_id_pattern.findall(convObj[\"utteranceIDs\"])\n # Reassemble lines\n convObj[\"lines\"] = []\n for lineId in lineIds:\n convObj[\"lines\"].append(lines[lineId])\n conversations.append(convObj)\n return conversations\n\n\n# Extracts pairs of sentences from conversations\ndef extractSentencePairs(conversations):\n qa_pairs = []\n for conversation in conversations:\n # Iterate over all the lines of the conversation\n for i in range(len(conversation[\"lines\"]) - 1): # We ignore the last line (no answer for it)\n inputLine = conversation[\"lines\"][i][\"text\"].strip()\n targetLine = conversation[\"lines\"][i+1][\"text\"].strip()\n # Filter wrong samples (if one of the lists is empty)\n if inputLine and targetLine:\n qa_pairs.append([inputLine, targetLine])\n return qa_pairs",
"_____no_output_____"
],
[
"# Define path to new file\ndatafile = os.path.join(corpus, \"formatted_movie_lines.txt\")\n\ndelimiter = '\\t'\n# Unescape the delimiter\ndelimiter = str(codecs.decode(delimiter, \"unicode_escape\"))\n\n# Initialize lines dict, conversations list, and field ids\nlines = {}\nconversations = []\nMOVIE_LINES_FIELDS = [\"lineID\", \"characterID\", \"movieID\", \"character\", \"text\"]\nMOVIE_CONVERSATIONS_FIELDS = [\"character1ID\", \"character2ID\", \"movieID\", \"utteranceIDs\"]\n\n# Load lines and process conversations\nprint(\"\\nProcessing corpus...\")\nlines = loadLines(os.path.join(corpus, \"movie_lines.txt\"), MOVIE_LINES_FIELDS)\nprint(\"\\nLoading conversations...\")\nconversations = loadConversations(os.path.join(corpus, \"movie_conversations.txt\"),\n lines, MOVIE_CONVERSATIONS_FIELDS)\n\n# Write new csv file\nprint(\"\\nWriting newly formatted file...\")\nwith open(datafile, 'w', encoding='utf-8') as outputfile:\n writer = csv.writer(outputfile, delimiter=delimiter, lineterminator='\\n')\n for pair in extractSentencePairs(conversations):\n writer.writerow(pair)\n\n# Print a sample of lines\nprint(\"\\nSample lines from file:\")\nprintLines(datafile)",
"\nProcessing corpus...\n\nLoading conversations...\n\nWriting newly formatted file...\n\nSample lines from file:\nb\"Can we make this quick? Roxanne Korrine and Andrew Barrett are having an incredibly horrendous public break- up on the quad. Again.\\tWell, I thought we'd start with pronunciation, if that's okay with you.\\r\\n\"\nb\"Well, I thought we'd start with pronunciation, if that's okay with you.\\tNot the hacking and gagging and spitting part. Please.\\r\\n\"\nb\"Not the hacking and gagging and spitting part. Please.\\tOkay... then how 'bout we try out some French cuisine. Saturday? Night?\\r\\n\"\nb\"You're asking me out. That's so cute. What's your name again?\\tForget it.\\r\\n\"\nb\"No, no, it's my fault -- we didn't have a proper introduction ---\\tCameron.\\r\\n\"\nb\"Cameron.\\tThe thing is, Cameron -- I'm at the mercy of a particularly hideous breed of loser. My sister. I can't date until she does.\\r\\n\"\nb\"The thing is, Cameron -- I'm at the mercy of a particularly hideous breed of loser. My sister. I can't date until she does.\\tSeems like she could get a date easy enough...\\r\\n\"\nb'Why?\\tUnsolved mystery. She used to be really popular when she started high school, then it was just like she got sick of it or something.\\r\\n'\nb\"Unsolved mystery. She used to be really popular when she started high school, then it was just like she got sick of it or something.\\tThat's a shame.\\r\\n\"\nb'Gosh, if only we could find Kat a boyfriend...\\tLet me see what I can do.\\r\\n'\n"
],
[
"# Default word tokens\nPAD_token = 0 # Used for padding short sentences\nSOS_token = 1 # Start-of-sentence token\nEOS_token = 2 # End-of-sentence token\n\nclass Voc:\n def __init__(self, name):\n self.name = name\n self.trimmed = False\n self.word2index = {}\n self.word2count = {}\n self.index2word = {PAD_token: \"PAD\", SOS_token: \"SOS\", EOS_token: \"EOS\"}\n self.num_words = 3 # Count SOS, EOS, PAD\n\n def addSentence(self, sentence):\n for word in sentence.split(' '):\n self.addWord(word)\n\n def addWord(self, word):\n if word not in self.word2index:\n self.word2index[word] = self.num_words\n self.word2count[word] = 1\n self.index2word[self.num_words] = word\n self.num_words += 1\n else:\n self.word2count[word] += 1\n\n # Remove words below a certain count threshold\n def trim(self, min_count):\n if self.trimmed:\n return\n self.trimmed = True\n\n keep_words = []\n\n for k, v in self.word2count.items():\n if v >= min_count:\n keep_words.append(k)\n\n print('keep_words {} / {} = {:.4f}'.format(\n len(keep_words), len(self.word2index), len(keep_words) / len(self.word2index)\n ))\n\n # Reinitialize dictionaries\n self.word2index = {}\n self.word2count = {}\n self.index2word = {PAD_token: \"PAD\", SOS_token: \"SOS\", EOS_token: \"EOS\"}\n self.num_words = 3 # Count default tokens\n\n for word in keep_words:\n self.addWord(word)",
"_____no_output_____"
],
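[
"# Hypothetical quick check (not in the original notebook): exercise the Voc class on a toy\n# sentence to confirm that word indices start after the three reserved tokens (PAD=0, SOS=1, EOS=2).\ntoy_voc = Voc('toy')\ntoy_voc.addSentence('hello world hello')\nprint(toy_voc.word2index)   # {'hello': 3, 'world': 4}\nprint(toy_voc.word2count)   # {'hello': 2, 'world': 1}\nprint(toy_voc.num_words)    # 5 (3 reserved tokens + 2 words)",
"_____no_output_____"
],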
[
"MAX_LENGTH = 10 # Maximum sentence length to consider\n\n# Turn a Unicode string to plain ASCII, thanks to\n# https://stackoverflow.com/a/518232/2809427\ndef unicodeToAscii(s):\n return ''.join(\n c for c in unicodedata.normalize('NFD', s)\n if unicodedata.category(c) != 'Mn'\n )\n\n# Lowercase, trim, and remove non-letter characters\ndef normalizeString(s):\n s = unicodeToAscii(s.lower().strip())\n s = re.sub(r\"([.!?])\", r\" \\1\", s)\n s = re.sub(r\"[^a-zA-Z.!?']+\", r\" \", s)\n s = re.sub(r\"\\s+\", r\" \", s).strip()\n return s\n\n# Read query/response pairs and return a voc object\ndef readVocs(datafile, corpus_name):\n print(\"Reading lines...\")\n # Read the file and split into lines\n lines = open(datafile, encoding='utf-8').\\\n read().strip().split('\\n')\n # Split every line into pairs and normalize\n pairs = [[normalizeString(s) for s in l.split('\\t')] for l in lines]\n voc = Voc(corpus_name)\n return voc, pairs\n\n# Returns True iff both sentences in a pair 'p' are under the MAX_LENGTH threshold\ndef filterPair(p):\n # Input sequences need to preserve the last word for EOS token\n return len(p[0].split(' ')) < MAX_LENGTH and len(p[1].split(' ')) < MAX_LENGTH\n\n# Filter pairs using filterPair condition\ndef filterPairs(pairs):\n return [pair for pair in pairs if filterPair(pair)]\n\n# Using the functions defined above, return a populated voc object and pairs list\ndef loadPrepareData(corpus, corpus_name, datafile, save_dir):\n print(\"Start preparing training data ...\")\n voc, pairs = readVocs(datafile, corpus_name)\n print(\"Read {!s} sentence pairs\".format(len(pairs)))\n pairs = filterPairs(pairs)\n print(\"Trimmed to {!s} sentence pairs\".format(len(pairs)))\n print(\"Counting words...\")\n for pair in pairs:\n voc.addSentence(pair[0])\n voc.addSentence(pair[1])\n print(\"Counted words:\", voc.num_words)\n return voc, pairs\n\n\n# Load/Assemble voc and pairs\nsave_dir = os.path.join(\"data\", \"save\")\nvoc, pairs = loadPrepareData(corpus, corpus_name, datafile, save_dir)\n# Print some pairs to validate\nprint(\"\\npairs:\")\nfor pair in pairs[:10]:\n print(pair)",
"Start preparing training data ...\nReading lines...\nRead 221282 sentence pairs\nTrimmed to 70086 sentence pairs\nCounting words...\nCounted words: 20282\n\npairs:\n[\"that's because it's such a nice one .\", 'forget french .']\n['there .', 'where ?']\n['you have my word . as a gentleman', \"you're sweet .\"]\n['hi .', 'looks like things worked out tonight huh ?']\n['you know chastity ?', 'i believe we share an art instructor']\n['have fun tonight ?', 'tons']\n['well no . . .', \"then that's all you had to say .\"]\n[\"then that's all you had to say .\", 'but']\n['but', 'you always been this selfish ?']\n['do you listen to this crap ?', 'what crap ?']\n"
],
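[
"# Hypothetical illustration (not in the original notebook) of what normalizeString does:\n# lowercase, pad sentence-ending punctuation with a space, keep only letters, apostrophes\n# and . ! ?, then collapse repeated whitespace.\nprint(normalizeString(\"Aren't  you coming?!\"))   # -> \"aren't you coming ? !\"",
"_____no_output_____"
],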
[
"MIN_COUNT = 3 # Minimum word count threshold for trimming\n\ndef trimRareWords(voc, pairs, MIN_COUNT):\n # Trim words used under the MIN_COUNT from the voc\n voc.trim(MIN_COUNT)\n # Filter out pairs with trimmed words\n keep_pairs = []\n for pair in pairs:\n input_sentence = pair[0]\n output_sentence = pair[1]\n keep_input = True\n keep_output = True\n # Check input sentence\n for word in input_sentence.split(' '):\n if word not in voc.word2index:\n keep_input = False\n break\n # Check output sentence\n for word in output_sentence.split(' '):\n if word not in voc.word2index:\n keep_output = False\n break\n\n # Only keep pairs that do not contain trimmed word(s) in their input or output sentence\n if keep_input and keep_output:\n keep_pairs.append(pair)\n\n print(\"Trimmed from {} pairs to {}, {:.4f} of total\".format(len(pairs), len(keep_pairs), len(keep_pairs) / len(pairs)))\n return keep_pairs\n\n\n# Trim voc and pairs\npairs = trimRareWords(voc, pairs, MIN_COUNT)",
"keep_words 8610 / 20279 = 0.4246\nTrimmed from 70086 pairs to 57379, 0.8187 of total\n"
]
],
[
[
"# Prepare Data for Models",
"_____no_output_____"
]
],
[
[
"def indexesFromSentence(voc, sentence):\n return [voc.word2index[word] for word in sentence.split(' ')] + [EOS_token]\n\n\ndef zeroPadding(l, fillvalue=PAD_token):\n return list(itertools.zip_longest(*l, fillvalue=fillvalue))\n\ndef binaryMatrix(l, value=PAD_token):\n m = []\n for i, seq in enumerate(l):\n m.append([])\n for token in seq:\n if token == PAD_token:\n m[i].append(0)\n else:\n m[i].append(1)\n return m\n\n# Returns padded input sequence tensor and lengths\ndef inputVar(l, voc):\n indexes_batch = [indexesFromSentence(voc, sentence) for sentence in l]\n lengths = torch.tensor([len(indexes) for indexes in indexes_batch])\n padList = zeroPadding(indexes_batch)\n padVar = torch.LongTensor(padList)\n return padVar, lengths\n\n# Returns padded target sequence tensor, padding mask, and max target length\ndef outputVar(l, voc):\n indexes_batch = [indexesFromSentence(voc, sentence) for sentence in l]\n max_target_len = max([len(indexes) for indexes in indexes_batch])\n padList = zeroPadding(indexes_batch)\n mask = binaryMatrix(padList)\n mask = torch.BoolTensor(mask)\n padVar = torch.LongTensor(padList)\n return padVar, mask, max_target_len\n\n# Returns all items for a given batch of pairs\ndef batch2TrainData(voc, pair_batch):\n pair_batch.sort(key=lambda x: len(x[0].split(\" \")), reverse=True)\n input_batch, output_batch = [], []\n for pair in pair_batch:\n input_batch.append(pair[0])\n output_batch.append(pair[1])\n inp, lengths = inputVar(input_batch, voc)\n output, mask, max_target_len = outputVar(output_batch, voc)\n return inp, lengths, output, mask, max_target_len\n\n\n# Example for validation\nsmall_batch_size = 5\nbatches = batch2TrainData(voc, [random.choice(pairs) for _ in range(small_batch_size)])\ninput_variable, lengths, target_variable, mask, max_target_len = batches\n\nprint(\"input_variable:\", input_variable)\nprint(\"lengths:\", lengths)\nprint(\"target_variable:\", target_variable)\nprint(\"mask:\", mask)\nprint(\"max_target_len:\", max_target_len)",
"input_variable: tensor([[ 33, 42, 83, 181, 279],\n [ 97, 67, 59, 341, 31],\n [ 32, 1089, 735, 33, 10],\n [ 10, 260, 112, 32, 2],\n [ 563, 33, 16, 15, 0],\n [ 46, 121, 15, 2, 0],\n [ 82, 1727, 2, 0, 0],\n [ 10, 10, 0, 0, 0],\n [ 2, 2, 0, 0, 0]])\nlengths: tensor([9, 9, 7, 6, 4])\ntarget_variable: tensor([[ 56, 125, 5, 616, 22],\n [ 53, 548, 68, 175, 73],\n [ 33, 10, 10, 59, 7],\n [ 47, 2, 33, 1905, 3516],\n [ 15, 0, 32, 10, 4119],\n [ 2, 0, 204, 2, 10],\n [ 0, 0, 10, 0, 2],\n [ 0, 0, 2, 0, 0]])\nmask: tensor([[ True, True, True, True, True],\n [ True, True, True, True, True],\n [ True, True, True, True, True],\n [ True, True, True, True, True],\n [ True, False, True, True, True],\n [ True, False, True, True, True],\n [False, False, True, False, True],\n [False, False, True, False, False]])\nmax_target_len: 8\n"
]
],
[
[
"# Encoder",
"_____no_output_____"
]
],
[
[
"class EncoderRNN(nn.Module):\n def __init__(self, hidden_size, embedding, n_layers=1, dropout=0):\n super(EncoderRNN, self).__init__()\n self.n_layers = n_layers\n self.hidden_size = hidden_size\n self.embedding = embedding\n\n # Initialize GRU; the input_size and hidden_size params are both set to 'hidden_size'\n # because our input size is a word embedding with number of features == hidden_size\n self.gru = nn.GRU(hidden_size, hidden_size, n_layers,\n dropout=(0 if n_layers == 1 else dropout), bidirectional=True)\n\n def forward(self, input_seq, input_lengths, hidden=None):\n # Convert word indexes to embeddings\n embedded = self.embedding(input_seq)\n # Pack padded batch of sequences for RNN module\n packed = nn.utils.rnn.pack_padded_sequence(embedded, input_lengths)\n # Forward pass through GRU\n outputs, hidden = self.gru(packed, hidden)\n # Unpack padding\n outputs, _ = nn.utils.rnn.pad_packed_sequence(outputs)\n # Sum bidirectional GRU outputs\n outputs = outputs[:, :, :self.hidden_size] + outputs[:, : ,self.hidden_size:]\n # Return output and final hidden state\n return outputs, hidden",
"_____no_output_____"
]
],
[
[
"# Decoder",
"_____no_output_____"
]
],
[
[
"# Luong attention layer\nclass Attn(nn.Module):\n def __init__(self, method, hidden_size):\n super(Attn, self).__init__()\n self.method = method\n if self.method not in ['dot', 'general', 'concat']:\n raise ValueError(self.method, \"is not an appropriate attention method.\")\n self.hidden_size = hidden_size\n if self.method == 'general':\n self.attn = nn.Linear(self.hidden_size, hidden_size)\n elif self.method == 'concat':\n self.attn = nn.Linear(self.hidden_size * 2, hidden_size)\n self.v = nn.Parameter(torch.FloatTensor(hidden_size))\n\n def dot_score(self, hidden, encoder_output):\n return torch.sum(hidden * encoder_output, dim=2)\n\n def general_score(self, hidden, encoder_output):\n energy = self.attn(encoder_output)\n return torch.sum(hidden * energy, dim=2)\n\n def concat_score(self, hidden, encoder_output):\n energy = self.attn(torch.cat((hidden.expand(encoder_output.size(0), -1, -1), encoder_output), 2)).tanh()\n return torch.sum(self.v * energy, dim=2)\n\n def forward(self, hidden, encoder_outputs):\n # Calculate the attention weights (energies) based on the given method\n if self.method == 'general':\n attn_energies = self.general_score(hidden, encoder_outputs)\n elif self.method == 'concat':\n attn_energies = self.concat_score(hidden, encoder_outputs)\n elif self.method == 'dot':\n attn_energies = self.dot_score(hidden, encoder_outputs)\n\n # Transpose max_length and batch_size dimensions\n attn_energies = attn_energies.t()\n\n # Return the softmax normalized probability scores (with added dimension)\n return F.softmax(attn_energies, dim=1).unsqueeze(1)",
"_____no_output_____"
],
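[
"# Hypothetical shape check (not in the original notebook): run the 'dot' attention on random\n# tensors to see the expected shapes. hidden is one decoder step (1, batch, hidden); the encoder\n# outputs span max_length steps; the returned weights sum to 1 over those steps.\ntoy_attn = Attn('dot', 8)\nhidden = torch.rand(1, 2, 8)            # (1, batch=2, hidden=8)\nencoder_out = torch.rand(5, 2, 8)       # (max_length=5, batch=2, hidden=8)\nweights = toy_attn(hidden, encoder_out)\nprint(weights.shape)                    # torch.Size([2, 1, 5])\nprint(weights.sum(dim=2))               # each row sums to 1",
"_____no_output_____"
],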
[
"class LuongAttnDecoderRNN(nn.Module):\n def __init__(self, attn_model, embedding, hidden_size, output_size, n_layers=1, dropout=0.1):\n super(LuongAttnDecoderRNN, self).__init__()\n\n # Keep for reference\n self.attn_model = attn_model\n self.hidden_size = hidden_size\n self.output_size = output_size\n self.n_layers = n_layers\n self.dropout = dropout\n\n # Define layers\n self.embedding = embedding\n self.embedding_dropout = nn.Dropout(dropout)\n self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=(0 if n_layers == 1 else dropout))\n self.concat = nn.Linear(hidden_size * 2, hidden_size)\n self.out = nn.Linear(hidden_size, output_size)\n\n self.attn = Attn(attn_model, hidden_size)\n\n def forward(self, input_step, last_hidden, encoder_outputs):\n # Note: we run this one step (word) at a time\n # Get embedding of current input word\n embedded = self.embedding(input_step)\n embedded = self.embedding_dropout(embedded)\n # Forward through unidirectional GRU\n rnn_output, hidden = self.gru(embedded, last_hidden)\n # Calculate attention weights from the current GRU output\n attn_weights = self.attn(rnn_output, encoder_outputs)\n # Multiply attention weights to encoder outputs to get new \"weighted sum\" context vector\n context = attn_weights.bmm(encoder_outputs.transpose(0, 1))\n # Concatenate weighted context vector and GRU output using Luong eq. 5\n rnn_output = rnn_output.squeeze(0)\n context = context.squeeze(1)\n concat_input = torch.cat((rnn_output, context), 1)\n concat_output = torch.tanh(self.concat(concat_input))\n # Predict next word using Luong eq. 6\n output = self.out(concat_output)\n output = F.softmax(output, dim=1)\n # Return output and final hidden state\n return output, hidden",
"_____no_output_____"
]
],
[
[
"# Training Procedure",
"_____no_output_____"
]
],
[
[
"def maskNLLLoss(inp, target, mask):\n nTotal = mask.sum()\n crossEntropy = -torch.log(torch.gather(inp, 1, target.view(-1, 1)).squeeze(1))\n loss = crossEntropy.masked_select(mask).mean()\n loss = loss.to(device)\n return loss, nTotal.item()",
"_____no_output_____"
],
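[
"# Hypothetical toy example (not in the original notebook): maskNLLLoss averages the negative\n# log-likelihood of the target tokens only over positions where mask is True, so padded\n# positions contribute nothing to the loss.\ninp = torch.tensor([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])  # (batch=2, vocab=3) of probabilities\ntarget = torch.tensor([0, 1])                            # correct classes\nmask = torch.tensor([True, False])                       # second position is padding\nloss, n = maskNLLLoss(inp, target, mask)\nprint(loss.item(), n)   # -log(0.7) ~= 0.3567, averaged over n=1 real token",
"_____no_output_____"
],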
[
"def train(input_variable, lengths, target_variable, mask, max_target_len, encoder, decoder, embedding,\n encoder_optimizer, decoder_optimizer, batch_size, clip, max_length=MAX_LENGTH):\n\n # Zero gradients\n encoder_optimizer.zero_grad()\n decoder_optimizer.zero_grad()\n\n # Set device options\n input_variable = input_variable.to(device)\n target_variable = target_variable.to(device)\n mask = mask.to(device)\n # Lengths for rnn packing should always be on the cpu\n lengths = lengths.to(\"cpu\")\n\n # Initialize variables\n loss = 0\n print_losses = []\n n_totals = 0\n\n # Forward pass through encoder\n encoder_outputs, encoder_hidden = encoder(input_variable, lengths)\n\n # Create initial decoder input (start with SOS tokens for each sentence)\n decoder_input = torch.LongTensor([[SOS_token for _ in range(batch_size)]])\n decoder_input = decoder_input.to(device)\n\n # Set initial decoder hidden state to the encoder's final hidden state\n decoder_hidden = encoder_hidden[:decoder.n_layers]\n\n # Determine if we are using teacher forcing this iteration\n use_teacher_forcing = True if random.random() < teacher_forcing_ratio else False\n\n # Forward batch of sequences through decoder one time step at a time\n if use_teacher_forcing:\n for t in range(max_target_len):\n decoder_output, decoder_hidden = decoder(\n decoder_input, decoder_hidden, encoder_outputs\n )\n # Teacher forcing: next input is current target\n decoder_input = target_variable[t].view(1, -1)\n # Calculate and accumulate loss\n mask_loss, nTotal = maskNLLLoss(decoder_output, target_variable[t], mask[t])\n loss += mask_loss\n print_losses.append(mask_loss.item() * nTotal)\n n_totals += nTotal\n else:\n for t in range(max_target_len):\n decoder_output, decoder_hidden = decoder(\n decoder_input, decoder_hidden, encoder_outputs\n )\n # No teacher forcing: next input is decoder's own current output\n _, topi = decoder_output.topk(1)\n decoder_input = torch.LongTensor([[topi[i][0] for i in range(batch_size)]])\n decoder_input = decoder_input.to(device)\n # Calculate and accumulate loss\n mask_loss, nTotal = maskNLLLoss(decoder_output, target_variable[t], mask[t])\n loss += mask_loss\n print_losses.append(mask_loss.item() * nTotal)\n n_totals += nTotal\n\n # Perform backpropatation\n loss.backward()\n\n # Clip gradients: gradients are modified in place\n _ = nn.utils.clip_grad_norm_(encoder.parameters(), clip)\n _ = nn.utils.clip_grad_norm_(decoder.parameters(), clip)\n\n # Adjust model weights\n encoder_optimizer.step()\n decoder_optimizer.step()\n\n return sum(print_losses) / n_totals",
"_____no_output_____"
],
[
"def trainIters(model_name, voc, pairs, encoder, decoder, encoder_optimizer, decoder_optimizer, embedding, encoder_n_layers, decoder_n_layers, save_dir, n_iteration, batch_size, print_every, save_every, clip, corpus_name, loadFilename):\n\n # Load batches for each iteration\n training_batches = [batch2TrainData(voc, [random.choice(pairs) for _ in range(batch_size)])\n for _ in range(n_iteration)]\n\n # Initializations\n print('Initializing ...')\n start_iteration = 1\n print_loss = 0\n if loadFilename:\n start_iteration = checkpoint['iteration'] + 1\n\n # Training loop\n print(\"Training...\")\n for iteration in range(start_iteration, n_iteration + 1):\n training_batch = training_batches[iteration - 1]\n # Extract fields from batch\n input_variable, lengths, target_variable, mask, max_target_len = training_batch\n\n # Run a training iteration with batch\n loss = train(input_variable, lengths, target_variable, mask, max_target_len, encoder,\n decoder, embedding, encoder_optimizer, decoder_optimizer, batch_size, clip)\n print_loss += loss\n\n # Print progress\n if iteration % print_every == 0:\n print_loss_avg = print_loss / print_every\n print(\"Iteration: {}; Percent complete: {:.1f}%; Average loss: {:.4f}\".format(iteration, iteration / n_iteration * 100, print_loss_avg))\n print_loss = 0\n\n # Save checkpoint\n if (iteration % save_every == 0):\n directory = os.path.join(save_dir, model_name, corpus_name, '{}-{}_{}'.format(encoder_n_layers, decoder_n_layers, hidden_size))\n if not os.path.exists(directory):\n os.makedirs(directory)\n torch.save({\n 'iteration': iteration,\n 'en': encoder.state_dict(),\n 'de': decoder.state_dict(),\n 'en_opt': encoder_optimizer.state_dict(),\n 'de_opt': decoder_optimizer.state_dict(),\n 'loss': loss,\n 'voc_dict': voc.__dict__,\n 'embedding': embedding.state_dict()\n }, os.path.join(directory, '{}_{}.tar'.format(iteration, 'checkpoint')))",
"_____no_output_____"
]
],
[
[
"# Evaluation",
"_____no_output_____"
]
],
[
[
"class GreedySearchDecoder(nn.Module):\n def __init__(self, encoder, decoder, voc):\n super(GreedySearchDecoder, self).__init__()\n self.encoder = encoder\n self.decoder = decoder\n self.voc = voc\n\n def forward(self, input_seq, input_length, max_length):\n # Forward input through encoder model\n encoder_outputs, encoder_hidden = self.encoder(input_seq, input_length)\n # Prepare encoder's final hidden layer to be first hidden input to the decoder\n decoder_hidden = encoder_hidden[:decoder.n_layers]\n # Initialize decoder input with SOS_token\n decoder_input = torch.ones(1, 1, device=device, dtype=torch.long) * SOS_token\n # Initialize tensors to append decoded words to\n all_tokens = torch.zeros([0], device=device, dtype=torch.long)\n all_scores = torch.zeros([0], device=device)\n # Iteratively decode one word token at a time\n for _ in range(max_length):\n # Forward pass through decoder\n decoder_output, decoder_hidden = self.decoder(decoder_input, decoder_hidden, encoder_outputs)\n # Obtain most likely word token and its softmax score\n decoder_scores, decoder_input = torch.max(decoder_output, dim=1)\n \n \n # Print words and scores\n# print('all tokens', all_tokens)\n print('all tokens words', [voc.index2word[token.item()] for token in all_tokens])\n \n \n if all_tokens.nelement() > 0 and int(decoder_input[0]) == self.voc.word2index['.']: # and int(all_tokens[-1]) == 2\n decoder_scores, decoder_input = torch.kthvalue(decoder_output, 2)\n \n # Record token and score\n all_tokens = torch.cat((all_tokens, decoder_input), dim=0)\n all_scores = torch.cat((all_scores, decoder_scores), dim=0)\n # Prepare current token to be next decoder input (add a dimension)\n decoder_input = torch.unsqueeze(decoder_input, 0)\n \n # Return collections of word tokens and scores\n return all_tokens, all_scores",
"_____no_output_____"
],
[
"def evaluate(encoder, decoder, searcher, voc, sentence, max_length=MAX_LENGTH):\n ### Format input sentence as a batch\n # words -> indexes\n indexes_batch = [indexesFromSentence(voc, sentence)]\n # Create lengths tensor\n lengths = torch.tensor([len(indexes) for indexes in indexes_batch])\n # Transpose dimensions of batch to match models' expectations\n input_batch = torch.LongTensor(indexes_batch).transpose(0, 1)\n # Use appropriate device\n input_batch = input_batch.to(device)\n lengths = lengths.to(\"cpu\")\n # Decode sentence with searcher\n tokens, scores = searcher(input_batch, lengths, max_length)\n # indexes -> words\n decoded_words = [voc.index2word[token.item()] for token in tokens]\n \n return decoded_words\n\n\ndef evaluateInput(encoder, decoder, searcher, voc):\n input_sentence = ''\n while True:\n try:\n # Get input sentence\n input_sentence = input('> ')\n # Check if it is quit case\n if input_sentence == 'q' or input_sentence == 'quit': break\n # Normalize sentence\n input_sentence = normalizeString(input_sentence)\n # Evaluate sentence\n output_words = evaluate(encoder, decoder, searcher, voc, input_sentence)\n \n # Format and print response sentence\n output_words[:] = [x for x in output_words if not (x == 'EOS' or x == 'PAD')] # or x == '.'\n \n print('human:', input_sentence)\n print('Bot:', ' '.join(output_words))\n\n except KeyError:\n print(\"Error: Encountered unknown word.\")",
"_____no_output_____"
]
],
[
[
"# Embeddings",
"_____no_output_____"
]
],
[
[
"# load pre-trained word2Vec model\nimport gensim.downloader as api\nmodel = api.load('word2vec-google-news-300')\nweights_w2v = torch.FloatTensor(model.vectors)",
"_____no_output_____"
],
[
"# load pre-trained Gloves 42B-300d model\n# model = gensim.models.KeyedVectors.load_word2vec_format('glove.42B.300d.w2vformat.txt')\n\ncorpus = os.path.join(\"glove\", \"glove.42B.300d.w2vformat.txt\")\nmodel = gensim.models.KeyedVectors.load_word2vec_format(corpus)\nweights_42b = torch.FloatTensor(model.vectors)",
"_____no_output_____"
],
[
"# load pre-trained Gloves 6B-300d model\ncorpus = os.path.join(\"glove\", \"glove.6B.300d.w2vformat.txt\")\nmodel = gensim.models.KeyedVectors.load_word2vec_format(corpus)\nweights_6b = torch.FloatTensor(model.vectors)",
"_____no_output_____"
],
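[
"# Hypothetical sketch (not in the original notebook): nn.Embedding.from_pretrained(weights_...)\n# reuses a pretrained matrix with the *pretrained* model's own row order, which does not match\n# the indices in our Voc. For the vectors to line up with the corpus vocabulary, one would build\n# a matrix aligned to voc.word2index, falling back to small random vectors for unknown words.\n# 'model' here is whichever gensim KeyedVectors was loaded last (gensim 4.x API assumed).\naligned = torch.randn(voc.num_words, 300) * 0.1\nfor word, idx in voc.word2index.items():\n    if word in model.key_to_index:                       # gensim 4.x vocabulary lookup\n        aligned[idx] = torch.from_numpy(model[word].copy())\n# embedding = nn.Embedding.from_pretrained(aligned, freeze=False)",
"_____no_output_____"
],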
[
"# Configure models\nmodel_name = 'cb_model'\n# attn_model = 'dot'\n#attn_model = 'general'\nattn_model = 'concat'\nhidden_size = 300 # 500 -> 300 to fit Gloves model\nencoder_n_layers = 3 # 2 -> 3\ndecoder_n_layers = 3 # 2 -> 3\ndropout = 0.1\nbatch_size = 64\n\n# Set checkpoint to load from; set to None if starting from scratch\nloadFilename = None\ncheckpoint_iter = 5000\n# loadFilename = os.path.join(save_dir, model_name, corpus_name,\n# '{}-{}_{}'.format(encoder_n_layers, decoder_n_layers, hidden_size),\n# '{}_checkpoint.tar'.format(checkpoint_iter))\n\n\n# Load model if a loadFilename is provided\nif loadFilename:\n # If loading on same machine the model was trained on\n checkpoint = torch.load(loadFilename)\n # If loading a model trained on GPU to CPU\n #checkpoint = torch.load(loadFilename, map_location=torch.device('cpu'))\n encoder_sd = checkpoint['en']\n decoder_sd = checkpoint['de']\n encoder_optimizer_sd = checkpoint['en_opt']\n decoder_optimizer_sd = checkpoint['de_opt']\n embedding_sd = checkpoint['embedding']\n voc.__dict__ = checkpoint['voc_dict']\n\n\nprint('Building encoder and decoder ...')\n# Initialize word embeddings\n# embedding = nn.Embedding(voc.num_words, hidden_size)\nembedding = nn.Embedding.from_pretrained(weights_w2v) # Choose embedding model\nif loadFilename:\n embedding.load_state_dict(embedding_sd)\n# Initialize encoder & decoder models\nencoder = EncoderRNN(hidden_size, embedding, encoder_n_layers, dropout)\ndecoder = LuongAttnDecoderRNN(attn_model, embedding, hidden_size, voc.num_words, decoder_n_layers, dropout)\nif loadFilename:\n encoder.load_state_dict(encoder_sd)\n decoder.load_state_dict(decoder_sd)\n# Use appropriate device\nencoder = encoder.to(device)\ndecoder = decoder.to(device)\nprint('Models built and ready to go!')",
"Building encoder and decoder ...\nModels built and ready to go!\n"
]
],
[
[
"# Run Model",
"_____no_output_____"
],
[
"### Training",
"_____no_output_____"
]
],
[
[
"# Configure training/optimization\nclip = 50.0\nteacher_forcing_ratio = 1.0\nlearning_rate = 0.0001\ndecoder_learning_ratio = 6.0 # 5.0 -> 4.0\nn_iteration = 5000 # 4000 -> 5000\nprint_every = 1\nsave_every = 500\n\n# Ensure dropout layers are in train mode\nencoder.train()\ndecoder.train()\n\n# Initialize optimizers\nprint('Building optimizers ...')\nencoder_optimizer = optim.Adam(encoder.parameters(), lr=learning_rate)\ndecoder_optimizer = optim.Adam(decoder.parameters(), lr=learning_rate * decoder_learning_ratio)\nif loadFilename:\n encoder_optimizer.load_state_dict(encoder_optimizer_sd)\n decoder_optimizer.load_state_dict(decoder_optimizer_sd)\n\n# If you have cuda, configure cuda to call\nfor state in encoder_optimizer.state.values():\n for k, v in state.items():\n if isinstance(v, torch.Tensor):\n state[k] = v.cuda()\n\nfor state in decoder_optimizer.state.values():\n for k, v in state.items():\n if isinstance(v, torch.Tensor):\n state[k] = v.cuda()\n\n# Run training iterations\nprint(\"Starting Training!\")\ntrainIters(model_name, voc, pairs, encoder, decoder, encoder_optimizer, decoder_optimizer,\n embedding, encoder_n_layers, decoder_n_layers, save_dir, n_iteration, batch_size,\n print_every, save_every, clip, corpus_name, loadFilename)",
"_____no_output_____"
]
],
[
[
"### Evaluation",
"_____no_output_____"
]
],
[
[
"# Set dropout layers to eval mode\nencoder.eval()\ndecoder.eval()\n\n# Initialize search module\nsearcher = GreedySearchDecoder(encoder, decoder, voc)\n\nevaluateInput(encoder, decoder, searcher, voc)",
"> hey\nall tokens words []\nall tokens words ['i']\nall tokens words ['i', \"don't\"]\nall tokens words ['i', \"don't\", 'bacon']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon', 'sandwich']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon', 'sandwich', 'bacon']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon', 'sandwich', 'bacon', 'sandwich']\nhuman: hey\nBot: i don't bacon sandwich sandwich bacon sandwich bacon sandwich\n"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d0625de76b5fbb52b499afab5a5debe2b3b06ab9 | 25,964 | ipynb | Jupyter Notebook | Seattle Busiest Time.ipynb | ShadyHanafy/Shady | f8e3f786840375845e2a1aede00212d8e5c95b25 | [
"CNRI-Python"
]
| null | null | null | Seattle Busiest Time.ipynb | ShadyHanafy/Shady | f8e3f786840375845e2a1aede00212d8e5c95b25 | [
"CNRI-Python"
]
| null | null | null | Seattle Busiest Time.ipynb | ShadyHanafy/Shady | f8e3f786840375845e2a1aede00212d8e5c95b25 | [
"CNRI-Python"
]
| null | null | null | 65.565657 | 8,320 | 0.757703 | [
[
[
"# Data Understanding\nIn order to get a better understanding of the busiest times in seattle, we will take a look at the dataset.\n\n## Access & Explore\nFirst, let's read and explore the data",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"#Import Calendar dataset\ndf_cal=pd.read_csv('calendar.csv', thousands=',')\npd.set_option(\"display.max_columns\", None)\ndf_cal.head()",
"_____no_output_____"
],
[
"#Check if any empty records for the price\ndf_cal['price'].isnull().value_counts()",
"_____no_output_____"
]
],
[
[
"# Data Preparation & Analysis\nNow we will prepare the data and make some convertions to prepare the data for visualization\n\n## Wrangle and Clean",
"_____no_output_____"
]
],
[
[
"#Convert price to numerical value\ndf_cal[\"price\"] = df_cal[\"price\"].str.replace('[$,,,]',\"\").astype(float)",
"<ipython-input-16-61781eef3286>:2: FutureWarning: The default value of regex will change from True to False in a future version.\n df_cal[\"price\"] = df_cal[\"price\"].str.replace('[$,,,]',\"\").astype(float)\n"
],
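[
"# Hypothetical aside (not in the original notebook): the FutureWarning above comes from the\n# implicit regex behaviour of str.replace. Passing regex=True explicitly gives the same result\n# and silences the warning; the pattern only needs '$' and ',' (the repeated commas are redundant).\nimport pandas as pd\ns = pd.Series(['$1,250.00', '$85.00'])\nprint(s.str.replace('[$,]', '', regex=True).astype(float))   # 1250.0, 85.0",
"_____no_output_____"
],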
[
"#Impute the missing data of price columns with mean\ndf_cal['price'].fillna((df_cal['price'].mean()), inplace=True)",
"_____no_output_____"
],
[
"#Create new feature represent the month of a year\ndf_cal['month'] = pd.DatetimeIndex(df_cal['date']).month\ndf_cal.head()",
"_____no_output_____"
]
],
[
[
"## Data Visualization\nNow we will visualize our dataset to get the required answer for the main question that which time is the busiest in seattle all over the year and its reflection on price",
"_____no_output_____"
]
],
[
[
"#Plot the busiest seattle time of the year\nbusytime=df_cal.groupby(['month']).price.mean()\nbusytime.plot(kind = 'bar', title=\"BusyTime\")",
"_____no_output_____"
],
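[
"# Hypothetical cross-check (not in the original notebook): mean price is only a proxy for\n# demand. Assuming this calendar file has the usual 'available' column of the public Seattle\n# Airbnb dataset ('t'/'f'), the share of booked nights per month measures busyness more directly.\nbooked_share = (df_cal['available'] == 'f').groupby(df_cal['month']).mean()\nbooked_share.plot(kind='bar', title='Share of booked nights per month')",
"_____no_output_____"
],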
[
"#Plot the price range accross the year\nbusytime_price=df_cal.groupby(['month']).mean()['price'].sort_values().dropna()\nbusytime_price.plot(kind=\"bar\");\nplt.title(\"Price Trend over year\");",
"_____no_output_____"
]
],
[
[
"# Conclusion\n\nJuly, August and June are the busiest time of the year and this reflects proportionally in booking prices",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
]
|
d06277d0426a410149263ce439782d12a0d06670 | 406,806 | ipynb | Jupyter Notebook | m03_v01_store_sales_prediction.ipynb | luana-afonso/DataScience-Em-Producao | 13ef7d1a0f72fb9e83f6856612644f08e0bf0ae7 | [
"MIT"
]
| null | null | null | m03_v01_store_sales_prediction.ipynb | luana-afonso/DataScience-Em-Producao | 13ef7d1a0f72fb9e83f6856612644f08e0bf0ae7 | [
"MIT"
]
| null | null | null | m03_v01_store_sales_prediction.ipynb | luana-afonso/DataScience-Em-Producao | 13ef7d1a0f72fb9e83f6856612644f08e0bf0ae7 | [
"MIT"
]
| null | null | null | 102.884674 | 204,940 | 0.752998 | [
[
[
"# 0.0. IMPORTS",
"_____no_output_____"
]
],
[
[
"import math\nimport pandas as pd\nimport inflection\nimport numpy as np\nimport seaborn as sns\nimport matplotlib as plt\nimport datetime\n\nfrom IPython.display import Image",
"_____no_output_____"
]
],
[
[
"## 0.1. Helper Functions",
"_____no_output_____"
],
[
"## 0.2. Loading Data",
"_____no_output_____"
]
],
[
[
"# read_csv รฉ um metodo da classe Pandas\n# Preciso \"unzipar\" o arquivo antes?\n# low_memory para dizer se ele lรช o arquivo todo (False) ou em pedaรงรตes (True), ele costuma avisar qual o melhor para a situaรงรฃo\ndf_sales_raw = pd.read_csv(\"data/train.csv.zip\", low_memory=False)\ndf_store_raw = pd.read_csv(\"data/store.csv\", low_memory=False)\n\n# Merge (arquivo de referencia, arquivo a ser anexado a essa referencia, como quero fazer o merge, coluna que รฉ igual nos 2 datasets para servir de chave )\n# Merge tambรฉm รฉ um mรฉtodo da classe Pandas\ndf_raw = pd.merge( df_sales_raw, df_store_raw, how=\"left\", on=\"Store\" )",
"_____no_output_____"
],
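[
"# Hypothetical sanity check (not in the original notebook): a left merge on 'Store' should keep\n# exactly one row per sales record, since store.csv has one row per store. If the row counts\n# differ, the store table has duplicate keys.\nprint(df_raw.shape[0] == df_sales_raw.shape[0])   # expected: True",
"_____no_output_____"
],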
[
"df_sales_raw.head()",
"_____no_output_____"
],
[
"df_store_raw.head()",
"_____no_output_____"
],
[
"# Plotar uma linha aleatรณria para ver se deu certo com o mรฉtodo sample\ndf_raw.sample()",
"_____no_output_____"
]
],
[
[
"# 1.0. STEP 01 - DATA DESCRIPTION",
"_____no_output_____"
]
],
[
[
"df1 = df_raw.copy()",
"_____no_output_____"
]
],
[
[
"## 1.1. Rename Columns",
"_____no_output_____"
],
[
"### Para ganhar velocidade no desenvolvimento!",
"_____no_output_____"
]
],
[
[
"df_raw.columns\n# Estรฃo atรฉ bem organizadas, formato candle (ou camble?) case, mas no mundo real pode ser bem diferente! rs",
"_____no_output_____"
],
[
"cols_old = ['Store', 'DayOfWeek', 'Date', 'Sales', 'Customers', 'Open', 'Promo',\n 'StateHoliday', 'SchoolHoliday', 'StoreType', 'Assortment',\n 'CompetitionDistance', 'CompetitionOpenSinceMonth',\n 'CompetitionOpenSinceYear', 'Promo2', 'Promo2SinceWeek',\n 'Promo2SinceYear', 'PromoInterval']\n\nsnakecase = lambda x: inflection.underscore( x )\n\ncols_new = list( map( snakecase, cols_old) )\n\n# Rename\ndf1.columns = cols_new",
"_____no_output_____"
],
[
"df1.columns",
"_____no_output_____"
]
],
[
[
"## 1.2. Data Dimensions",
"_____no_output_____"
],
[
"### Saber qual a quantidade de linhas e colunas do dataset",
"_____no_output_____"
]
],
[
[
"# O shape printa linhas e colunas do dataframe em que primeiro elemento sรฃo as rows\n# Pq ali sรฃo as chaves que ele usa? Isso tem a ver com placeholder?\nprint( \"Number of Rows: {}\".format( df1.shape[0] ) )\nprint( \"Number of Cols: {}\".format( df1.shape[1] ) )",
"Number of Rows: 1017209\nNumber of Cols: 18\n"
]
],
[
[
"## 1.3. Data Types",
"_____no_output_____"
]
],
[
[
"# Atente que nรฃo usamos os parรชnteses aqui. Isso pq estamos vendo uma propriedade e nรฃo usando um mรฉtodo?\n# O default do pandas รฉ assumir o que nรฃo for int como object. Object รฉ o \"caracter\" dentro do Pandas\n# Atente para o date, precisamos mudar de object para datetime!\ndf1.dtypes",
"_____no_output_____"
],
[
"df1[\"date\"] = pd.to_datetime( df1[\"date\"] )\ndf1.dtypes",
"_____no_output_____"
]
],
[
[
"## 1.4. Check NA",
"_____no_output_____"
]
],
[
[
"# O mรฉtodo isna vai mostrar todas as linhas que tem pelo menos uma coluna com um NA (vazia)\n# Mas como eu quero ver a soma disso por coluna, uso o mรฉtodo sum\ndf1.isna().sum()",
"_____no_output_____"
],
[
"# Precisamos tratar esses NAs.\n# Existem basicamente 3 maneiras:\n# 1. Descartar essas linhas (fรกcil e rรกpido; mas jogando dado fora)\n# 2. Usando algoritmos de machine learning. Tem alguns metodos de input NA que voce pode, por exemplo, substituir as colunas vazias pelo proprio comportamento da coluna (e.g. mediana, media...)\n# 3. Entendendo o negรณcio para colocar valores nos NAs e recuperar dados.",
"_____no_output_____"
]
],
[
[
"## 1.5. Fillout NA",
"_____no_output_____"
]
],
[
[
"df1[\"competition_distance\"].max()",
"_____no_output_____"
],
[
"#competition_distance: distance in meters to the nearest competitor store\n# Se pensarmos que nรฃo ter o dado nessa coluna significa um competidor estar muito longe geograficamente e, portanto, se assumirmos os valores como muito maiores que a distancia mรกxima encontrada resolveria o problema?\n# Quando uso funรงรฃo lambda, posso usar tudo conforme o nome da variรกvel que defino, no caso x\n# Funรงรฃo apply vai aplicar essa logica a todas as linhas do dataset\n# Aplica funรงรฃo apply sรณ na coluna competition_distance\n# O resultado eu quero sobrescrever na minha coluna original\n\ndf1[\"competition_distance\"] = df1[\"competition_distance\"].apply( lambda x: 200000.0 if math.isnan( x ) else x)\n\n#competition_open_since_month - gives the approximate year and month of the time the nearest competitor was opened \n# PREMISSA: Podemos assumir que se essa coluna for NA eu vou copiar a data de venda (extrair o mรชs)\n# Pq isso? jรก pensando na etapa a frente de feature engineering... tem algumas variaveis que derivamos do tempo que sรฃo muito importantes pra representar o comportamento, uma delas รฉ: quanto tempo faz desde que o evento aconteceu\n# A informaรงรฃo de competiรงรฃo proxima รฉ muito importante pois influencia nas vendas! (entao evitamos ao maximo excluir esses dados)\n# Primeiro tenho que ver se รฉ NA, uso a classe math. Se isso for verdade, vou pegar a coluna \"date\" e extrair o mรชs dela. Se nรฃo for verdade, mantem.\n# Vou usar funรงรฃo lambda, entรฃo posso colocar como x os df1.\n# Vou aplicar (funรงรฃo apply) isso ao longo das colunas (axis=1). Nรฃo precisamos fazer isso no \"competition_distance\" pois lรก estavamos avaliando apenas 1 coluna. Preciso explicitar para a funรงรฃo apply quando tenho mais de uma coluna\n# O resultado disso eu vou sobrescrever a coluna \"competition_open_since_month\"\n\ndf1[\"competition_open_since_month\"] = df1.apply( lambda x: x[\"date\"].month if math.isnan( x[\"competition_open_since_month\"] ) else x[\"competition_open_since_month\"] , axis=1)\n\n#competition_open_since_year - gives the approximate year and month of the time the nearest competitor was opened\n# Mesma lรณgica da coluna acima, sรณ que em anos\n\ndf1[\"competition_open_since_year\"] = df1.apply( lambda x: x[\"date\"].year if math.isnan( x[\"competition_open_since_year\"] ) else x[\"competition_open_since_year\"] , axis=1)\n\n#promo2 - Promo2 is a continuing and consecutive promotion for some stores: 0 = store is not participating, 1 = store is participating\n#promo2_since_week - describes the year and calendar week when the store started participating in Promo2 \n# Dados NA nessa coluna querem dizer que a loja nรฃo participa da promoรงรฃo\n# Similar ao de cima\n\ndf1[\"promo2_since_week\"] = df1.apply( lambda x: x[\"date\"].week if math.isnan( x[\"promo2_since_week\"] ) else x[\"promo2_since_week\"] , axis=1)\n\n#promo2_since_year \ndf1[\"promo2_since_year\"] = df1.apply( lambda x: x[\"date\"].year if math.isnan( x[\"promo2_since_year\"] ) else x[\"promo2_since_year\"] , axis=1)\n\n#promo_interval - describes the consecutive intervals Promo2 is started, naming the months the promotion is started anew. E.g. 
\"Feb,May,Aug,Nov\" means each round starts in February, May, August, November of any given year for that store (meses que a promoรงรฃo ficou ativa)\n# Vamos fazer um split dessa coluna e criar uma lista: se a minha data estiver dentro dessa lista (promoรงรฃo ativa) eu vou criar uma coluna falando que a promo2 foi ativa\n\n# Cria coluna auxiliar\nmonth_map = {1: \"Jan\",2: \"Feb\",3: \"Mar\",4: \"Apr\",5: \"May\",6: \"Jun\",7: \"Jul\",8: \"Aug\",9: \"Sep\",10: \"Oct\",11: \"Nov\",12: \"Dec\"}\n\n# Se o valor na coluna promo_interval for NA, substituo por 0 (nรฃo hรก promoรงรฃo ativa). inplace=True pois nรฃo quero que ele retorne nenhum valor (faรงa a modificaรงรฃo direto na coluna)\ndf1[\"promo_interval\"].fillna(0, inplace=True)\n\n# ??? Pq aqui usamos o map ao inves do apply?\ndf1[\"month_map\"] = df1[\"date\"].dt.month.map( month_map )\n\n# Se o mรชs da coluna month_map estiver na promoรงรฃo, vamos colocar 1, se nรฃo estiver, 0\n# Temos aluns zeros na coluna \"promo_interval\" que sรฃo lojas que nรฃo aderiram a promo2\n\n# 0 if df1[\"promo_interval\"] == 0 else 1 if df1[\"month_map\"] in df1[\"promo_interval\"].split( \",\" ) else 0\n\n# Como vou usar mais de uma coluna preciso especificar a direรงรฃo\n# apply(lambda x: 0 if x[\"promo_interval\"] == 0 else 1 if df1[\"month_map\"] in x[\"promo_interval\"].split( \",\" ) else 0, axis=1 )\n\n# Nรฃo vou aplicar no dataset todo, vou filtrar pra ficar mais fรกcil:\n# Vou criar uma nova coluna is_promo que vai ser 1 ou 0\n\ndf1[\"is_promo\"] = df1[[\"promo_interval\",\"month_map\"]].apply(lambda x: 0 if x[\"promo_interval\"] == 0 else 1 if x[\"month_map\"] in x[\"promo_interval\"].split( \",\" ) else 0, axis=1 )",
"_____no_output_____"
],
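[
"# Hypothetical alternative (not in the original notebook): the apply(..., axis=1) above runs a\n# Python function once per row, which is slow on ~1M rows. A plain loop over the two columns\n# via zip is typically much faster than DataFrame.apply; the result should match is_promo.\nis_promo_fast = [\n    0 if interval == 0 else int(month in interval.split(','))\n    for interval, month in zip(df1['promo_interval'], df1['month_map'])\n]\nprint((df1['is_promo'] == is_promo_fast).all())",
"_____no_output_____"
],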
[
"df1.isna().sum()",
"_____no_output_____"
],
[
"# Agora a coluna \"competition_distance\" nรฃo tem mais NA e o valor maximo รฉ 200000\ndf1[\"competition_distance\"].max()",
"_____no_output_____"
],
[
"# Pegando linhas aleatorias. T para mostrar a transposta\ndf1.sample(5).T",
"_____no_output_____"
]
],
[
[
"## 1.6. Change Types",
"_____no_output_____"
]
],
[
[
"# Importante checar se alguma operaรงรฃo feita na etapa anterior alterou algum dado anterior\n# Mรฉtodo dtypes\n# competition_open_since_month float64\n# competition_open_since_year float64\n# promo2_since_week float64\n# promo2_since_year float64\n# Na verdade essas variaveis acima deveriam ser int (mรชs e ano)\n\ndf1.dtypes",
"_____no_output_____"
],
[
"# Mรฉtodo astype nesse caso vai aplicar o int sob essa coluna e vai salvar de volta\ndf1[\"competition_open_since_month\"] = df1[\"competition_open_since_month\"].astype(int)\ndf1[\"competition_open_since_year\"] = df1[\"competition_open_since_year\"].astype(int)\ndf1[\"promo2_since_week\"] = df1[\"promo2_since_week\"].astype(int)\ndf1[\"promo2_since_year\"] = df1[\"promo2_since_year\"].astype(int)",
"_____no_output_____"
],
[
"df1.dtypes",
"_____no_output_____"
]
],
[
[
"## 1.7. Descriptive Statistics",
"_____no_output_____"
],
[
"### Ganhar conhecimento de negรณcio e detectar alguns erros",
"_____no_output_____"
]
],
[
[
"# Central Tendency = mean, median\n# Dispersion = std, min, max, range, skew, kurtosis\n\n# Precisamos separar nossas variรกveis entre numรฉricas e categรณricas.\n# A estatรญstica descritiva funciona para os dois tipos de variรกveis, mas a forma com que eu construo a estatistica \n# descritiva รฉ diferente.\n\n# Vou separar todas as colunas que sรฃo numรฉricas:\n# mรฉtodo select_dtypes e vou passar uma lista de todos os tipos de variaveis que quero selecionar\n# datetime64(ns) = dado de tempo (date)\n\n# ??? Qual a diferenรงa do int64 e int32?\n\nnum_attributes = df1.select_dtypes( include=[\"int64\",\"int32\",\"float64\"] )\ncat_attributes = df1.select_dtypes( exclude=[\"int64\", \"float64\",\"int32\",\"datetime64[ns]\"] )",
"_____no_output_____"
],
[
"num_attributes.sample(2)",
"_____no_output_____"
],
[
"cat_attributes.sample(2)",
"_____no_output_____"
]
],
[
[
"## 1.7.1 Numerical Attributes",
"_____no_output_____"
]
],
[
[
"# Apply para aplicar uma operaรงรฃo em todas as colunas e transformar num dataframe pra facilitar a visualizaรงรฃo\n# Transpostas para ter metricas nas colunas e features nas linhas\n\n# central tendency\nct1 = pd.DataFrame( num_attributes.apply ( np.mean) ).T\nct2 = pd.DataFrame( num_attributes.apply ( np.median ) ).T\n\n# dispersion\nd1 = pd.DataFrame( num_attributes.apply( np.std )).T\nd2 = pd.DataFrame( num_attributes.apply( min )).T\nd3 = pd.DataFrame( num_attributes.apply( max )).T\nd4 = pd.DataFrame( num_attributes.apply( lambda x: x.max() - x.min() )).T\nd5 = pd.DataFrame( num_attributes.apply( lambda x: x.skew() )).T\nd6 = pd.DataFrame( num_attributes.apply( lambda x: x.kurtosis() )).T\n\n# Para concatenar todas essas mรฉtricas na ordem que quero ver:\n# obs: Classe Pandas\n# Tem que transpor e resetar o index (Pq???)\n\nm = pd.concat([d2,d3,d4,ct1,ct2,d1,d5,d6]).T.reset_index()\n\n# Vamos nomear as colunas para nรฃo aparecer o index padrรฃo\nm.columns = [\"attributes\",\"min\",\"max\",\"range\",\"mean\",\"median\",\"std\",\"skew\",\"kurtosis\"]\nm",
"_____no_output_____"
],
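[
"# Hypothetical cross-check (not in the original notebook): pandas' built-in describe() computes\n# most of the same summary statistics in one call. Note that its 'std' uses ddof=1, while the\n# np.std used above applies the population formula (ddof=0), so the two std columns differ slightly.\nnum_attributes.describe().T",
"_____no_output_____"
],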
[
"# Avaliando por exemplo vendas: min 0, max 41k. Media e mediana parecidas, nao tenho um deslocamento da Normal muito grande.\n# Skew proxima de 0 - muito proxima de uma normal\n# Kurtosis proximo de 1 - nao tem um pico muuuito grande",
"_____no_output_____"
],
[
"# Plotando as sales passando as colunas que quero mostrar\n# Obs: Vocรช consegue mudar o tamanho do plot usando os parรขmetros height e aspect. Um exemplo ficaria assim:\n# sns.displot(df1['sales'], height=8, aspect=2)\n# Descobri isso procurando a funรงรฃo displot direto na documentaรงรฃo do seaborn: https://seaborn.pydata.org/generated/seaborn.displot.html#seaborn.displot\nsns.displot( df1[\"sales\"], height=8, aspect=2)",
"_____no_output_____"
],
[
"# Skew alta, alta concentraรงรฃo de valores no comeรงo\n# Meus competidores estรฃo muito proximos\n\nsns.displot( df1[\"competition_distance\"])",
"_____no_output_____"
]
],
[
[
"## 1.7.2 Categorical Attributes",
"_____no_output_____"
],
[
"### Vai de boxblot!",
"_____no_output_____"
]
],
[
[
"# ??? No do Meigarom sรณ apareceu os: state_holiday, store_type, assortment, promo_interval e month_map\n# Tirei os int32 tambem dos categoricos\ncat_attributes.apply( lambda x: x.unique().shape[0] )",
"_____no_output_____"
],
[
"# Meigarom prefere o seaborn do que o matplotlib\n# sns.boxplot( x= y=, data= )\n# x = linha que vai ficar como referencia\n# y = o que quero medir (no caso, as vendas)\n\nsns.boxplot( x=\"state_holiday\", y=\"sales\", data=df1 )",
"_____no_output_____"
],
[
"# Se plotamos da forma acima nรฃo da pra ver nada... (variaveis com ranges mt diferentes)\n# Vamos filtrar os dados para plotar:\n# ??? Pq esse 0 รฉ uma string e nao um numero? df1[\"state_holiday\"] != \"0\"\n\naux1 = df1[(df1[\"state_holiday\"] != \"0\") & (df1[\"sales\"] > 0)]\n\n# plt.subplot = para plotar um do lado do outro\n\nplt.pyplot.subplot( 1, 3, 1)\nsns.boxplot( x=\"state_holiday\", y=\"sales\", data=aux1)\n\nplt.pyplot.subplot( 1, 3, 2)\nsns.boxplot( x=\"store_type\", y=\"sales\", data=aux1)\n\nplt.pyplot.subplot( 1, 3, 3)\nsns.boxplot( x=\"assortment\", y=\"sales\", data=aux1)\n\n# Boxplot:\n# Linha do meio รฉ a mediana: chegou na metade dos valores (em termos de posiรงรฃo), aquele valor รฉ sua mediana\n# Limite inferior da barra: 25ยบ quartil (quartil 25) e o limite superior รฉ o quartil 75\n# Os ultimos tracinhos sรฃo em cima o maximo e embaixo o minimo. Todos os pontos acima do tracinho de maximo sรฃo considerados outliers (3x o desvio padrรฃo)\n# assortment = mix de produtos",
"_____no_output_____"
]
],
[
[
"# 2.0. STEP 02 - FEATURE ENGINEERING",
"_____no_output_____"
],
[
"Para quรช fazer a Feature Engineering? Para ter as variรกveis DISPONรVEIS para ESTUDO durante a Anรกlise Exploratรณria dos Dados. Pra nรฃo ter bagunรงa, crie as variรกveis ANTES na anรกlise exploratรณria!!!",
"_____no_output_____"
],
[
"Vou usar uma classe Image para colocar a imagem do mapa mental:",
"_____no_output_____"
]
],
[
[
"df2 = df1.copy()",
"_____no_output_____"
]
],
[
[
"## 2.1. Hypothesis Mind Map ",
"_____no_output_____"
]
],
[
[
"Image (\"img/mind-map-hypothesis.png\")",
"_____no_output_____"
]
],
[
[
"## 2.2. Hypothesis Creation",
"_____no_output_____"
],
[
"### 2.2.1 Store Hypothesis",
"_____no_output_____"
],
[
"1. Stores with greater number of employees should sell more.",
"_____no_output_____"
],
[
"2. Stores with greater stock size should sell more.",
"_____no_output_____"
],
[
"3. Stores with bigger size should sell more.",
"_____no_output_____"
],
[
"4. Stores with smaller size should sell less.",
"_____no_output_____"
],
[
"5. Stores with greater assortment should sell more.",
"_____no_output_____"
],
[
"6. Stores with more competitors nearby should sell less.",
"_____no_output_____"
],
[
"7. Stores with competitors for longer should sell more. ",
"_____no_output_____"
],
[
"### 2.2.2 Product Hypothesis",
"_____no_output_____"
],
[
"1. Stores with more marketing should sell more.",
"_____no_output_____"
],
[
"2. Stores that exhibit more products in the showcase sell more.",
"_____no_output_____"
],
[
"3. Stores that have lower prices on products should sell more.",
"_____no_output_____"
],
[
"4. Stores that have lower prices for longer on products should sell more.",
"_____no_output_____"
],
[
"5. Stores with more consecutive sales should sell more.",
"_____no_output_____"
],
[
"### 2.2.3Time-based Hypothesis",
"_____no_output_____"
],
[
"1. Stores with more days in holidays should sell less.",
"_____no_output_____"
],
[
"2. Stores that open in the first 6 months should sell more.",
"_____no_output_____"
],
[
"3. Stores that open on weekends should sell more.",
"_____no_output_____"
],
[
"## 2.3. Final Hypothesis List",
"_____no_output_____"
],
[
"### As hipรณteses das quais temos os dados, vรฃo para a lista final de hipรณteses.\n\n",
"_____no_output_____"
],
[
"1. Stores with greater assortment should sell more.\n\n2. Stores with more competitors nearby should sell less.\n\n3. Stores with competitors for longer should sell more. \n\n4. Stores with active sales for longer should sell more.\n\n5. Stores with more days on sale should sell more.\n\n7. Stores with more consecutive sales should sell more.\n\n8. Stores opened during the Christmas holiday should sell more.\n\n9. Stores should sell more over the years.\n\n10. Stores should sell more in the second half of the year.\n\n11. Stores should sell more after the 10th of each month.\n\n12. Stores should sell less on weekends.\n\n13. Stores should sell less during school holidays. ",
"_____no_output_____"
],
[
"## 2.4. Feature Engineering",
"_____no_output_____"
]
],
[
[
"# year\ndf2['year'] = df2['date'].dt.year\n\n# month\ndf2['month'] = df2['date'].dt.month\n\n# day\ndf2['day'] = df2['date'].dt.day\n\n# week of year\ndf2['week_of_year'] = df2['date'].dt.isocalendar().week\n\n# year week\n# aqui nรฃo usaremos nenhum metodo, e sim mudaremos a formataรงรฃo da data apenas\n# ele fala do strftime no bรดnus\ndf2['year_week'] = df2['date'].dt.strftime( '%Y-%W' )\n\n# week of year \n# ps: <ipython-input-35-d06c5b7375c4>:9: FutureWarning: Series.dt.weekofyear and Series.dt.week have been deprecated. Please use Series.dt.isocalendar().week instead.\n# df2[\"week_of_year\"] = df2[\"date\"].dt.weekofyear\n\ndf2[\"week_of_year\"] = df2[\"date\"].dt.isocalendar().week\n\n# ??? Nรฃo era pra week_of_year ser igual ร semana que aparece na coluna \"year_week\"? รฉ diferente!",
"_____no_output_____"
],
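[
"# A minimal sketch showing why year_week (%W) and week_of_year (ISO) can disagree; the date is illustrative.\n# %W starts counting from the first Monday (earlier days fall in week 0), while isocalendar()\n# follows ISO-8601, where week 1 is the week containing the year's first Thursday.\nimport pandas as pd\nd = pd.Timestamp('2015-01-01')  # a Thursday\nprint('strftime %Y-%W  ->', d.strftime('%Y-%W'))   # 2015-00\nprint('isocalendar week ->', d.isocalendar()[1])   # 1",
"_____no_output_____"
],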
[
"df2.sample(10).T",
"_____no_output_____"
],
[
"# competition since\n# ja temos a coluna \"date\" para comparar, mas a informaรงรฃo de competition since estรก quebrada, temos coluna com year \n# e outra com month\n# Precisamos juntar as duas em uma data e fazer a substraรงรฃo das duas\n# mรฉtodo datetime vem de uma classe tambรฉm chamada datetime\n# datetime.datetime( year=, month=, day= )\n\n# datetime.datetime( year= df2[\"competition_open_since_year\"], month= df2[\"competition_open_since_month\"], day= 1 )\n# Vamos usar a funรงรฃo acima para todas as linhas do dataframe vamos usar lambda com variavel x e depois usar o apply\n# day = 1 pois nao temos informaรงรฃo sobre o dia\n# o apply vai precisar do axis pois estou usando duas colunas diferentes\n\ndf2[\"competition_since\"] = df2.apply(lambda x: datetime.datetime( year= x[\"competition_open_since_year\"], month= x[\"competition_open_since_month\"], day= 1), axis=1 )\n# com esse comando acima geramos a coluna \"competition_since\" no formato 2008-09-01 00:00:00. \n# Agora precisamos ver a diferenรงa dessa data com a date para saber o tempo de \"competition since\"\n\n# df2['date'] - df2['competition_since'] )/30 \n# divido por 30 pq quero manter a glanularidade em dias \n# o .days vai extrair os dias desse datetime e salva como inteiro em uma nova coluna 'competition_time_month'\ndf2['competition_time_month'] = ( ( df2['date'] - df2['competition_since'] )/30 ).apply( lambda x: x.days ).astype( int )",
"_____no_output_____"
],
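[
"# A hedged alternative sketch: build competition_since without the row-wise apply.\n# pd.to_datetime can assemble dates from a DataFrame with year/month/day columns,\n# which is typically much faster than apply on large frames.\nimport pandas as pd\naux_dates = pd.DataFrame({'year': df2['competition_open_since_year'].astype(int),\n                          'month': df2['competition_open_since_month'].astype(int),\n                          'day': 1})\ncompetition_since_vec = pd.to_datetime(aux_dates)\nprint((competition_since_vec == df2['competition_since']).all())  # should be True",
"_____no_output_____"
],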
[
"df2.head().T",
"_____no_output_____"
],
[
"# promo since, mesma estratรฉgia acima\n# Mas para as promoรงoes temos uma dificuldade a mais pois temos a coluna promo2 e informaรงรฃo de ano e semana\n# nรฃo temos de mรชs\n# Vamos fazer um join dos caracteres e depois converter na data\n# Mas para juntar as variรกveis assim precisamos que as 2 sejam strings (astype converte)\n# colocamos o \"-\" pra ficar no formato ano - semana do ano\n\n# df2['promo_since'] = df2['promo2_since_year'].astype( str ) + '-' + df2['promo2_since_week'].astype( str )\n# \"promo_since\" agora e string, nao รฉ datetime\n\ndf2['promo_since'] = df2['promo2_since_year'].astype( str ) + '-' + df2['promo2_since_week'].astype( str )\n\n# Deu uma complicada nesse promo, mas bora lรก...\n# Truque para converter o que geramos aqui em cima que ficou como string para data: datetime.datetime.strptime( x + '-1', '%Y-%W-%w' ). strptime( o que vai \n# mostrar, \"formato\")\n# x pq vamos aplicar para todas as linhas do dataframe\n# /7 para ter em semanas\n\ndf2['promo_since'] = df2['promo_since'].apply( lambda x: datetime.datetime.strptime( x + '-1', '%Y-%W-%w' ) - datetime.timedelta( days=7 ) )\n# Agora que temos duas datas sรณ falta subtrair...\ndf2['promo_time_week'] = ( ( df2['date'] - df2['promo_since'] )/7 ).apply( lambda x: x.days ).astype( int )\n\n#Obs:\n# %W Week number of the year (Monday as the first day of the week). \n# All days in a new year preceding the first Monday are considered to be in week 0\n# %w Weekday as a decimal number.\n\n# assortment (describes an assortment level: a = basic, b = extra, c = extended)\n# Mudar as letras para o que isso representa pra ficar mais facil a leitura:\n# Pq else e nรฃo elif na estrutura dentro do lambda???\n# ??? object type รฉ tipo string?\n# Nao preciso usar o axis pq sรณ vou usar a coluna \"assortment\"\n\n# assortment\ndf2['assortment'] = df2['assortment'].apply( lambda x: 'basic' if x == 'a' else 'extra' if x == 'b' else 'extended' )\n\n# Mesma coisa do assortment no \"state holiday\"\n# state holiday\ndf2['state_holiday'] = df2['state_holiday'].apply( lambda x: 'public_holiday' if x == 'a' else 'easter_holiday' if x == 'b' else 'christmas' if x == 'c' else 'regular_day' )\n",
"_____no_output_____"
],
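[
"# A hedged sketch: a dictionary .map() is an idiomatic alternative to the chained ternary\n# lambdas above. Shown on a toy Series, since df2['assortment'] was already converted here.\nimport pandas as pd\nletters = pd.Series(['a', 'b', 'c'])\nprint(letters.map({'a': 'basic', 'b': 'extra', 'c': 'extended'}))  # unmapped values would become NaN",
"_____no_output_____"
],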
[
"df2.head().T",
"_____no_output_____"
]
],
[
[
"# 3.0. STEP 03 - VARIABLES FILTERING",
"_____no_output_____"
]
],
[
[
"# Antes de qualquer coisa, ao comeรงar um novo passo, copia o dataset do passo anterior e passa a trabalhar com um novo\ndf3 = df2.copy()",
"_____no_output_____"
],
[
"df3.head()",
"_____no_output_____"
]
],
[
[
"## 3.1. ROWS FILTERING",
"_____no_output_____"
]
],
[
[
"# \"open\" != 0 & \"sales\" > 0\n\ndf3 = df3[(df3[\"open\"] != 0) & (df3[\"sales\"] > 0)]",
"_____no_output_____"
]
],
[
[
"## 3.2. COLUMNS SELECTION",
"_____no_output_____"
]
],
[
[
"# Vamos \"dropar\" as colunas que nรฃo queremos\n# A \"open\" estรก aqui pois apรณs tirarmos as linhas cujos dados da coluna \"open\" eram 0, sรณ sobraram valores 1, entรฃo รฉ uma coluna 'inรบtil'\ncols_drop = ['customers', 'open', 'promo_interval', 'month_map']\n# Drop รฉ um metodo da classe Pandas (quais colunas e sentido); axis 0 = linhas, axis 1 = colunas\ndf3 = df3.drop( cols_drop, axis=1 )",
"_____no_output_____"
],
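[
"# A small sanity check (assumes df3 from above): list any columns that are still single-valued\n# after the row filter -- the same reasoning that justified dropping \"open\".\nsingle_valued = [c for c in df3.columns if df3[c].nunique() == 1]\nprint('constant columns still present:', single_valued)",
"_____no_output_____"
],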
[
"df3.columns",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d06279e2fc0fa31f7f3aa24441e977a8d467c22e | 540,612 | ipynb | Jupyter Notebook | Model backlog/DenseNet169/133 - DenseNet169 - Classification - Refactor.ipynb | ThinkBricks/APTOS2019BlindnessDetection | e524fd69f83a1252710076c78b6a5236849cd885 | [
"MIT"
]
| 23 | 2019-09-08T17:19:16.000Z | 2022-02-02T16:20:09.000Z | Model backlog/DenseNet169/133 - DenseNet169 - Classification - Refactor.ipynb | ThinkBricks/APTOS2019BlindnessDetection | e524fd69f83a1252710076c78b6a5236849cd885 | [
"MIT"
]
| 1 | 2020-03-10T18:42:12.000Z | 2020-09-18T22:02:38.000Z | Model backlog/DenseNet169/133 - DenseNet169 - Classification - Refactor.ipynb | ThinkBricks/APTOS2019BlindnessDetection | e524fd69f83a1252710076c78b6a5236849cd885 | [
"MIT"
]
| 16 | 2019-09-21T12:29:59.000Z | 2022-03-21T00:42:26.000Z | 150.924623 | 156,324 | 0.797855 | [
[
[
"## Dependencies",
"_____no_output_____"
]
],
[
[
"import os\nimport cv2\nimport shutil\nimport random\nimport warnings\nimport numpy as np\nimport pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom tensorflow import set_random_seed\nfrom sklearn.utils import class_weight\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import confusion_matrix, cohen_kappa_score\nfrom keras import backend as K\nfrom keras.models import Model\nfrom keras.utils import to_categorical\nfrom keras import optimizers, applications\nfrom keras.preprocessing.image import ImageDataGenerator\nfrom keras.callbacks import EarlyStopping, ReduceLROnPlateau, Callback, LearningRateScheduler\nfrom keras.layers import Dense, Dropout, GlobalAveragePooling2D, Input\n\n# Set seeds to make the experiment more reproducible.\ndef seed_everything(seed=0):\n random.seed(seed)\n os.environ['PYTHONHASHSEED'] = str(seed)\n np.random.seed(seed)\n set_random_seed(0)\nseed = 0\nseed_everything(seed)\n\n%matplotlib inline\nsns.set(style=\"whitegrid\")\nwarnings.filterwarnings(\"ignore\")",
"Using TensorFlow backend.\n"
]
],
[
[
"## Load data",
"_____no_output_____"
]
],
[
[
"hold_out_set = pd.read_csv('../input/aptos-data-split/hold-out.csv')\nX_train = hold_out_set[hold_out_set['set'] == 'train']\nX_val = hold_out_set[hold_out_set['set'] == 'validation']\ntest = pd.read_csv('../input/aptos2019-blindness-detection/test.csv')\nprint('Number of train samples: ', X_train.shape[0])\nprint('Number of validation samples: ', X_val.shape[0])\nprint('Number of test samples: ', test.shape[0])\n\n# Preprocecss data\nX_train[\"id_code\"] = X_train[\"id_code\"].apply(lambda x: x + \".png\")\nX_val[\"id_code\"] = X_val[\"id_code\"].apply(lambda x: x + \".png\")\ntest[\"id_code\"] = test[\"id_code\"].apply(lambda x: x + \".png\")\nX_train['diagnosis'] = X_train['diagnosis'].astype('str')\nX_val['diagnosis'] = X_val['diagnosis'].astype('str')\ndisplay(X_train.head())",
"Number of train samples: 2929\nNumber of validation samples: 733\nNumber of test samples: 1928\n"
]
],
[
[
"# Model parameters",
"_____no_output_____"
]
],
[
[
"# Model parameters\nN_CLASSES = X_train['diagnosis'].nunique()\nBATCH_SIZE = 16\nEPOCHS = 40\nWARMUP_EPOCHS = 5\nLEARNING_RATE = 1e-4\nWARMUP_LEARNING_RATE = 1e-3\nHEIGHT = 320\nWIDTH = 320\nCHANNELS = 3\nES_PATIENCE = 5\nRLROP_PATIENCE = 3\nDECAY_DROP = 0.5",
"_____no_output_____"
],
[
"def kappa(y_true, y_pred, n_classes=5):\n y_trues = K.cast(K.argmax(y_true), K.floatx())\n y_preds = K.cast(K.argmax(y_pred), K.floatx())\n n_samples = K.cast(K.shape(y_true)[0], K.floatx())\n distance = K.sum(K.abs(y_trues - y_preds))\n max_distance = n_classes - 1\n \n kappa_score = 1 - ((distance**2) / (n_samples * (max_distance**2)))\n\n return kappa_score\n\ndef step_decay(epoch):\n lrate = 30e-5\n if epoch > 3:\n lrate = 15e-5\n if epoch > 7:\n lrate = 7.5e-5\n if epoch > 11:\n lrate = 3e-5\n if epoch > 15:\n lrate = 1e-5\n\n return lrate\n\ndef focal_loss(y_true, y_pred):\n gamma = 2.0\n epsilon = K.epsilon()\n \n pt = y_pred * y_true + (1-y_pred) * (1-y_true)\n pt = K.clip(pt, epsilon, 1-epsilon)\n CE = -K.log(pt)\n FL = K.pow(1-pt, gamma) * CE\n loss = K.sum(FL, axis=1)\n \n return loss",
"_____no_output_____"
]
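,
[
"# A hedged numpy sanity check of the focal loss above (not part of the training graph):\n# with gamma = 0 the (1 - pt)**gamma factor is 1, so focal loss reduces to plain cross-entropy,\n# and with gamma = 2 (as in focal_loss) easy, confident predictions are down-weighted.\ny_true_np = np.array([[0., 1., 0., 0., 0.]])\ny_pred_np = np.array([[0.1, 0.6, 0.1, 0.1, 0.1]])\npt = y_pred_np * y_true_np + (1 - y_pred_np) * (1 - y_true_np)\nce = (-np.log(pt)).sum(axis=1)                   # gamma = 0 case\nfl = ((1 - pt) ** 2 * -np.log(pt)).sum(axis=1)   # gamma = 2 case\nprint('cross-entropy:', ce, '| focal (gamma=2):', fl)",
"_____no_output_____"
]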
],
[
[
"# Pre-procecess images",
"_____no_output_____"
]
],
[
[
"train_base_path = '../input/aptos2019-blindness-detection/train_images/'\ntest_base_path = '../input/aptos2019-blindness-detection/test_images/'\ntrain_dest_path = 'base_dir/train_images/'\nvalidation_dest_path = 'base_dir/validation_images/'\ntest_dest_path = 'base_dir/test_images/'\n\n# Making sure directories don't exist\nif os.path.exists(train_dest_path):\n shutil.rmtree(train_dest_path)\nif os.path.exists(validation_dest_path):\n shutil.rmtree(validation_dest_path)\nif os.path.exists(test_dest_path):\n shutil.rmtree(test_dest_path)\n \n# Creating train, validation and test directories\nos.makedirs(train_dest_path)\nos.makedirs(validation_dest_path)\nos.makedirs(test_dest_path)\n\ndef crop_image(img, tol=7):\n if img.ndim ==2:\n mask = img>tol\n return img[np.ix_(mask.any(1),mask.any(0))]\n elif img.ndim==3:\n gray_img = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)\n mask = gray_img>tol\n check_shape = img[:,:,0][np.ix_(mask.any(1),mask.any(0))].shape[0]\n if (check_shape == 0): # image is too dark so that we crop out everything,\n return img # return original image\n else:\n img1=img[:,:,0][np.ix_(mask.any(1),mask.any(0))]\n img2=img[:,:,1][np.ix_(mask.any(1),mask.any(0))]\n img3=img[:,:,2][np.ix_(mask.any(1),mask.any(0))]\n img = np.stack([img1,img2,img3],axis=-1)\n \n return img\n\ndef circle_crop(img):\n img = crop_image(img)\n\n height, width, depth = img.shape\n largest_side = np.max((height, width))\n img = cv2.resize(img, (largest_side, largest_side))\n\n height, width, depth = img.shape\n\n x = width//2\n y = height//2\n r = np.amin((x, y))\n\n circle_img = np.zeros((height, width), np.uint8)\n cv2.circle(circle_img, (x, y), int(r), 1, thickness=-1)\n img = cv2.bitwise_and(img, img, mask=circle_img)\n img = crop_image(img)\n\n return img\n \ndef preprocess_image(base_path, save_path, image_id, HEIGHT, WIDTH, sigmaX=10):\n image = cv2.imread(base_path + image_id)\n image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n image = circle_crop(image)\n image = cv2.resize(image, (HEIGHT, WIDTH))\n image = cv2.addWeighted(image, 4, cv2.GaussianBlur(image, (0,0), sigmaX), -4 , 128)\n cv2.imwrite(save_path + image_id, image)\n \n# Pre-procecss train set\nfor i, image_id in enumerate(X_train['id_code']):\n preprocess_image(train_base_path, train_dest_path, image_id, HEIGHT, WIDTH)\n \n# Pre-procecss validation set\nfor i, image_id in enumerate(X_val['id_code']):\n preprocess_image(train_base_path, validation_dest_path, image_id, HEIGHT, WIDTH)\n \n# Pre-procecss test set\nfor i, image_id in enumerate(test['id_code']):\n preprocess_image(test_base_path, test_dest_path, image_id, HEIGHT, WIDTH)",
"_____no_output_____"
]
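,
[
"# A hedged smoke test of the cropping pipeline on a synthetic image (uses the functions above).\n# A bright disc on a black background should be cropped down to roughly the disc's bounding box.\nsynthetic = np.zeros((200, 200, 3), dtype=np.uint8)\ncv2.circle(synthetic, (100, 100), 50, (200, 180, 160), thickness=-1)\ncropped = circle_crop(synthetic)\nprint('before:', synthetic.shape, '-> after:', cropped.shape)  # expect roughly (100, 100, 3)",
"_____no_output_____"
]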
],
[
[
"# Data generator",
"_____no_output_____"
]
],
[
[
"train_datagen=ImageDataGenerator(rescale=1./255, \n rotation_range=360,\n horizontal_flip=True,\n vertical_flip=True)\n\nvalid_datagen=ImageDataGenerator(rescale=1./255)\n\ntrain_generator=train_datagen.flow_from_dataframe(\n dataframe=X_train,\n directory=train_dest_path,\n x_col=\"id_code\",\n y_col=\"diagnosis\",\n class_mode=\"categorical\",\n batch_size=BATCH_SIZE,\n target_size=(HEIGHT, WIDTH),\n seed=seed)\n\nvalid_generator=valid_datagen.flow_from_dataframe(\n dataframe=X_val,\n directory=validation_dest_path,\n x_col=\"id_code\",\n y_col=\"diagnosis\",\n class_mode=\"categorical\",\n batch_size=BATCH_SIZE,\n target_size=(HEIGHT, WIDTH),\n seed=seed)\n\ntest_generator=valid_datagen.flow_from_dataframe( \n dataframe=test,\n directory=test_dest_path,\n x_col=\"id_code\",\n batch_size=1,\n class_mode=None,\n shuffle=False,\n target_size=(HEIGHT, WIDTH),\n seed=seed)",
"Found 2929 validated image filenames belonging to 5 classes.\nFound 733 validated image filenames belonging to 5 classes.\nFound 1928 validated image filenames.\n"
]
],
[
[
"# Model",
"_____no_output_____"
]
],
[
[
"def create_model(input_shape, n_out):\n input_tensor = Input(shape=input_shape)\n base_model = applications.DenseNet169(weights=None, \n include_top=False,\n input_tensor=input_tensor)\n base_model.load_weights('../input/keras-notop/densenet169_weights_tf_dim_ordering_tf_kernels_notop.h5')\n\n x = GlobalAveragePooling2D()(base_model.output)\n x = Dropout(0.5)(x)\n x = Dense(2048, activation='relu')(x)\n x = Dropout(0.5)(x)\n final_output = Dense(n_out, activation='softmax', name='final_output')(x)\n model = Model(input_tensor, final_output)\n \n return model",
"_____no_output_____"
]
],
[
[
"# Train top layers",
"_____no_output_____"
]
],
[
[
"model = create_model(input_shape=(HEIGHT, WIDTH, CHANNELS), n_out=N_CLASSES)\n\nfor layer in model.layers:\n layer.trainable = False\n\nfor i in range(-5, 0):\n model.layers[i].trainable = True\n \nclass_weights = class_weight.compute_class_weight('balanced', np.unique(X_train['diagnosis'].astype('int').values), X_train['diagnosis'].astype('int').values)\n\nmetric_list = [\"accuracy\", kappa]\noptimizer = optimizers.Adam(lr=WARMUP_LEARNING_RATE)\nmodel.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=metric_list)\nmodel.summary()",
"__________________________________________________________________________________________________\nLayer (type) Output Shape Param # Connected to \n==================================================================================================\ninput_1 (InputLayer) (None, 320, 320, 3) 0 \n__________________________________________________________________________________________________\nzero_padding2d_1 (ZeroPadding2D (None, 326, 326, 3) 0 input_1[0][0] \n__________________________________________________________________________________________________\nconv1/conv (Conv2D) (None, 160, 160, 64) 9408 zero_padding2d_1[0][0] \n__________________________________________________________________________________________________\nconv1/bn (BatchNormalization) (None, 160, 160, 64) 256 conv1/conv[0][0] \n__________________________________________________________________________________________________\nconv1/relu (Activation) (None, 160, 160, 64) 0 conv1/bn[0][0] \n__________________________________________________________________________________________________\nzero_padding2d_2 (ZeroPadding2D (None, 162, 162, 64) 0 conv1/relu[0][0] \n__________________________________________________________________________________________________\npool1 (MaxPooling2D) (None, 80, 80, 64) 0 zero_padding2d_2[0][0] \n__________________________________________________________________________________________________\nconv2_block1_0_bn (BatchNormali (None, 80, 80, 64) 256 pool1[0][0] \n__________________________________________________________________________________________________\nconv2_block1_0_relu (Activation (None, 80, 80, 64) 0 conv2_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_conv (Conv2D) (None, 80, 80, 128) 8192 conv2_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_relu (Activation (None, 80, 80, 128) 0 conv2_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_concat (Concatenat (None, 80, 80, 96) 0 pool1[0][0] \n conv2_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_bn (BatchNormali (None, 80, 80, 96) 384 conv2_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_relu (Activation (None, 80, 80, 96) 0 conv2_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_conv (Conv2D) (None, 80, 80, 128) 12288 conv2_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_relu (Activation (None, 80, 80, 128) 0 conv2_block2_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv2_block2_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_concat (Concatenat (None, 80, 80, 128) 0 conv2_block1_concat[0][0] \n conv2_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_relu (Activation (None, 80, 80, 128) 0 conv2_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_conv (Conv2D) (None, 80, 80, 128) 16384 conv2_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_relu (Activation (None, 80, 80, 128) 0 conv2_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_concat (Concatenat (None, 80, 80, 160) 0 conv2_block2_concat[0][0] \n conv2_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_0_bn (BatchNormali (None, 80, 80, 160) 640 conv2_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block4_0_relu (Activation (None, 80, 80, 160) 0 conv2_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_conv (Conv2D) (None, 80, 80, 128) 20480 conv2_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_relu (Activation (None, 80, 80, 128) 0 conv2_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_concat (Concatenat (None, 80, 80, 192) 0 conv2_block3_concat[0][0] \n conv2_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_bn (BatchNormali (None, 80, 80, 192) 768 conv2_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_relu (Activation (None, 80, 80, 192) 0 conv2_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_conv (Conv2D) (None, 80, 80, 128) 24576 conv2_block5_0_relu[0][0] 
\n__________________________________________________________________________________________________\nconv2_block5_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_relu (Activation (None, 80, 80, 128) 0 conv2_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block5_concat (Concatenat (None, 80, 80, 224) 0 conv2_block4_concat[0][0] \n conv2_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_bn (BatchNormali (None, 80, 80, 224) 896 conv2_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_relu (Activation (None, 80, 80, 224) 0 conv2_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_conv (Conv2D) (None, 80, 80, 128) 28672 conv2_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_relu (Activation (None, 80, 80, 128) 0 conv2_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_concat (Concatenat (None, 80, 80, 256) 0 conv2_block5_concat[0][0] \n conv2_block6_2_conv[0][0] \n__________________________________________________________________________________________________\npool2_bn (BatchNormalization) (None, 80, 80, 256) 1024 conv2_block6_concat[0][0] \n__________________________________________________________________________________________________\npool2_relu (Activation) (None, 80, 80, 256) 0 pool2_bn[0][0] \n__________________________________________________________________________________________________\npool2_conv (Conv2D) (None, 80, 80, 128) 32768 pool2_relu[0][0] \n__________________________________________________________________________________________________\npool2_pool (AveragePooling2D) (None, 40, 40, 128) 0 pool2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_bn (BatchNormali (None, 40, 40, 128) 512 pool2_pool[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_relu (Activation (None, 40, 40, 128) 0 conv3_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_conv (Conv2D) (None, 40, 40, 128) 16384 conv3_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block1_1_conv[0][0] 
\n__________________________________________________________________________________________________\nconv3_block1_1_relu (Activation (None, 40, 40, 128) 0 conv3_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_concat (Concatenat (None, 40, 40, 160) 0 pool2_pool[0][0] \n conv3_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_0_bn (BatchNormali (None, 40, 40, 160) 640 conv3_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block2_0_relu (Activation (None, 40, 40, 160) 0 conv3_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_conv (Conv2D) (None, 40, 40, 128) 20480 conv3_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_relu (Activation (None, 40, 40, 128) 0 conv3_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_concat (Concatenat (None, 40, 40, 192) 0 conv3_block1_concat[0][0] \n conv3_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_bn (BatchNormali (None, 40, 40, 192) 768 conv3_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_relu (Activation (None, 40, 40, 192) 0 conv3_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_conv (Conv2D) (None, 40, 40, 128) 24576 conv3_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_relu (Activation (None, 40, 40, 128) 0 conv3_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_concat (Concatenat (None, 40, 40, 224) 0 conv3_block2_concat[0][0] \n conv3_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_bn (BatchNormali (None, 40, 40, 224) 896 conv3_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_relu (Activation (None, 40, 40, 224) 0 conv3_block4_0_bn[0][0] 
\n__________________________________________________________________________________________________\nconv3_block4_1_conv (Conv2D) (None, 40, 40, 128) 28672 conv3_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_relu (Activation (None, 40, 40, 128) 0 conv3_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block4_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_concat (Concatenat (None, 40, 40, 256) 0 conv3_block3_concat[0][0] \n conv3_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_0_bn (BatchNormali (None, 40, 40, 256) 1024 conv3_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block5_0_relu (Activation (None, 40, 40, 256) 0 conv3_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_conv (Conv2D) (None, 40, 40, 128) 32768 conv3_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_relu (Activation (None, 40, 40, 128) 0 conv3_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_concat (Concatenat (None, 40, 40, 288) 0 conv3_block4_concat[0][0] \n conv3_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_bn (BatchNormali (None, 40, 40, 288) 1152 conv3_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_relu (Activation (None, 40, 40, 288) 0 conv3_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_conv (Conv2D) (None, 40, 40, 128) 36864 conv3_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_relu (Activation (None, 40, 40, 128) 0 conv3_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_concat (Concatenat (None, 40, 40, 320) 0 conv3_block5_concat[0][0] \n conv3_block6_2_conv[0][0] 
\n__________________________________________________________________________________________________\nconv3_block7_0_bn (BatchNormali (None, 40, 40, 320) 1280 conv3_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block7_0_relu (Activation (None, 40, 40, 320) 0 conv3_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_conv (Conv2D) (None, 40, 40, 128) 40960 conv3_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_relu (Activation (None, 40, 40, 128) 0 conv3_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_concat (Concatenat (None, 40, 40, 352) 0 conv3_block6_concat[0][0] \n conv3_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_0_bn (BatchNormali (None, 40, 40, 352) 1408 conv3_block7_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block8_0_relu (Activation (None, 40, 40, 352) 0 conv3_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_conv (Conv2D) (None, 40, 40, 128) 45056 conv3_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_relu (Activation (None, 40, 40, 128) 0 conv3_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block8_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block8_concat (Concatenat (None, 40, 40, 384) 0 conv3_block7_concat[0][0] \n conv3_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block9_0_bn (BatchNormali (None, 40, 40, 384) 1536 conv3_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block9_0_relu (Activation (None, 40, 40, 384) 0 conv3_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_conv (Conv2D) (None, 40, 40, 128) 49152 conv3_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_relu (Activation (None, 40, 40, 128) 0 conv3_block9_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv3_block9_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block9_concat (Concatenat (None, 40, 40, 416) 0 conv3_block8_concat[0][0] \n conv3_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block10_0_bn (BatchNormal (None, 40, 40, 416) 1664 conv3_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block10_0_relu (Activatio (None, 40, 40, 416) 0 conv3_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_conv (Conv2D) (None, 40, 40, 128) 53248 conv3_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block10_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block10_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block10_concat (Concatena (None, 40, 40, 448) 0 conv3_block9_concat[0][0] \n conv3_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block11_0_bn (BatchNormal (None, 40, 40, 448) 1792 conv3_block10_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block11_0_relu (Activatio (None, 40, 40, 448) 0 conv3_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_conv (Conv2D) (None, 40, 40, 128) 57344 conv3_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block11_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block11_concat (Concatena (None, 40, 40, 480) 0 conv3_block10_concat[0][0] \n conv3_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block12_0_bn (BatchNormal (None, 40, 40, 480) 1920 conv3_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block12_0_relu (Activatio (None, 40, 40, 480) 0 conv3_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_conv (Conv2D) (None, 40, 40, 128) 61440 
conv3_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block12_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block12_concat (Concatena (None, 40, 40, 512) 0 conv3_block11_concat[0][0] \n conv3_block12_2_conv[0][0] \n__________________________________________________________________________________________________\npool3_bn (BatchNormalization) (None, 40, 40, 512) 2048 conv3_block12_concat[0][0] \n__________________________________________________________________________________________________\npool3_relu (Activation) (None, 40, 40, 512) 0 pool3_bn[0][0] \n__________________________________________________________________________________________________\npool3_conv (Conv2D) (None, 40, 40, 256) 131072 pool3_relu[0][0] \n__________________________________________________________________________________________________\npool3_pool (AveragePooling2D) (None, 20, 20, 256) 0 pool3_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block1_0_bn (BatchNormali (None, 20, 20, 256) 1024 pool3_pool[0][0] \n__________________________________________________________________________________________________\nconv4_block1_0_relu (Activation (None, 20, 20, 256) 0 conv4_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_conv (Conv2D) (None, 20, 20, 128) 32768 conv4_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_relu (Activation (None, 20, 20, 128) 0 conv4_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block1_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block1_concat (Concatenat (None, 20, 20, 288) 0 pool3_pool[0][0] \n conv4_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block2_0_bn (BatchNormali (None, 20, 20, 288) 1152 conv4_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block2_0_relu (Activation (None, 20, 20, 288) 0 conv4_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_conv (Conv2D) (None, 20, 20, 128) 36864 conv4_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block2_1_conv[0][0] 
\n__________________________________________________________________________________________________\nconv4_block2_1_relu (Activation (None, 20, 20, 128) 0 conv4_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block2_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block2_concat (Concatenat (None, 20, 20, 320) 0 conv4_block1_concat[0][0] \n conv4_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block3_0_bn (BatchNormali (None, 20, 20, 320) 1280 conv4_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block3_0_relu (Activation (None, 20, 20, 320) 0 conv4_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_conv (Conv2D) (None, 20, 20, 128) 40960 conv4_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_relu (Activation (None, 20, 20, 128) 0 conv4_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block3_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block3_concat (Concatenat (None, 20, 20, 352) 0 conv4_block2_concat[0][0] \n conv4_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block4_0_bn (BatchNormali (None, 20, 20, 352) 1408 conv4_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block4_0_relu (Activation (None, 20, 20, 352) 0 conv4_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_conv (Conv2D) (None, 20, 20, 128) 45056 conv4_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_relu (Activation (None, 20, 20, 128) 0 conv4_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block4_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block4_concat (Concatenat (None, 20, 20, 384) 0 conv4_block3_concat[0][0] \n conv4_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block5_0_bn (BatchNormali (None, 20, 20, 384) 1536 conv4_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block5_0_relu (Activation (None, 20, 20, 384) 0 conv4_block5_0_bn[0][0] 
\n__________________________________________________________________________________________________\nconv4_block5_1_conv (Conv2D) (None, 20, 20, 128) 49152 conv4_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block5_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block5_1_relu (Activation (None, 20, 20, 128) 0 conv4_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block5_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block5_concat (Concatenat (None, 20, 20, 416) 0 conv4_block4_concat[0][0] \n conv4_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block6_0_bn (BatchNormali (None, 20, 20, 416) 1664 conv4_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block6_0_relu (Activation (None, 20, 20, 416) 0 conv4_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block6_1_conv (Conv2D) (None, 20, 20, 128) 53248 conv4_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block6_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block6_1_relu (Activation (None, 20, 20, 128) 0 conv4_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block6_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block6_concat (Concatenat (None, 20, 20, 448) 0 conv4_block5_concat[0][0] \n conv4_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block7_0_bn (BatchNormali (None, 20, 20, 448) 1792 conv4_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block7_0_relu (Activation (None, 20, 20, 448) 0 conv4_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block7_1_conv (Conv2D) (None, 20, 20, 128) 57344 conv4_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block7_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block7_1_relu (Activation (None, 20, 20, 128) 0 conv4_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block7_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block7_concat (Concatenat (None, 20, 20, 480) 0 conv4_block6_concat[0][0] \n conv4_block7_2_conv[0][0] 
[model.summary() rows elided for length. This portion of the log lists the dense-block layers conv4_block8 through conv4_block32 (20x20 feature maps, channels growing from 480 to 1280 in steps of 32), the pool4 transition (pool4_bn BatchNormalization -> pool4_relu Activation -> pool4_conv 1x1 Conv2D halving 1280 to 640 channels -> pool4_pool 2x2 AveragePooling2D down to 10x10), and conv5_block1 onward through conv5_block31 (10x10 feature maps, channels growing from 640 in steps of 32, reaching 1600 at conv5_block30_concat). Every dense layer repeats the same pattern: _0_bn BatchNormalization -> _0_relu Activation -> _1_conv 1x1 Conv2D (128 bottleneck filters, no bias) -> _1_bn -> _1_relu -> _2_conv 3x3 Conv2D (32 filters, the growth rate) -> _concat Concatenate onto the block's running feature map.]
conv5_block31_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block31_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_concat (Concatena (None, 10, 10, 1632) 0 conv5_block30_concat[0][0] \n conv5_block31_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_0_bn (BatchNormal (None, 10, 10, 1632) 6528 conv5_block31_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block32_0_relu (Activatio (None, 10, 10, 1632) 0 conv5_block32_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_conv (Conv2D) (None, 10, 10, 128) 208896 conv5_block32_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block32_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block32_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block32_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_concat (Concatena (None, 10, 10, 1664) 0 conv5_block31_concat[0][0] \n conv5_block32_2_conv[0][0] \n__________________________________________________________________________________________________\nbn (BatchNormalization) (None, 10, 10, 1664) 6656 conv5_block32_concat[0][0] \n__________________________________________________________________________________________________\nrelu (Activation) (None, 10, 10, 1664) 0 bn[0][0] \n__________________________________________________________________________________________________\nglobal_average_pooling2d_1 (Glo (None, 1664) 0 relu[0][0] \n__________________________________________________________________________________________________\ndropout_1 (Dropout) (None, 1664) 0 global_average_pooling2d_1[0][0] \n__________________________________________________________________________________________________\ndense_1 (Dense) (None, 2048) 3409920 dropout_1[0][0] \n__________________________________________________________________________________________________\ndropout_2 (Dropout) (None, 2048) 0 dense_1[0][0] \n__________________________________________________________________________________________________\nfinal_output (Dense) (None, 5) 10245 dropout_2[0][0] \n==================================================================================================\nTotal params: 16,063,045\nTrainable params: 3,420,165\nNon-trainable params: 12,642,880\n__________________________________________________________________________________________________\n"
],
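[
"# Hedged sketch, not the author's code: the warm-up cell below passes\n# class_weights, and weights like these are commonly derived with\n# scikit-learn's balanced heuristic. train_df['diagnosis'] is an assumed\n# dataframe/column name, not one defined in this excerpt.\nimport numpy as np\nfrom sklearn.utils.class_weight import compute_class_weight\n\nlabels = train_df['diagnosis'].astype(int).values\nclasses = np.unique(labels)\nweights = compute_class_weight(class_weight='balanced', classes=classes, y=labels)\nclass_weights = dict(zip(classes, weights))\nprint(class_weights)",
"_____no_output_____"
],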
[
"STEP_SIZE_TRAIN = train_generator.n//train_generator.batch_size\nSTEP_SIZE_VALID = valid_generator.n//valid_generator.batch_size\n\nhistory_warmup = model.fit_generator(generator=train_generator,\n steps_per_epoch=STEP_SIZE_TRAIN,\n validation_data=valid_generator,\n validation_steps=STEP_SIZE_VALID,\n epochs=WARMUP_EPOCHS,\n class_weight=class_weights,\n verbose=1).history",
"Epoch 1/5\n183/183 [==============================] - 81s 445ms/step - loss: 1.3357 - acc: 0.5731 - kappa: 0.3848 - val_loss: 1.0849 - val_acc: 0.5083 - val_kappa: -0.2198\nEpoch 2/5\n183/183 [==============================] - 68s 373ms/step - loss: 0.9705 - acc: 0.6499 - kappa: 0.6185 - val_loss: 1.0448 - val_acc: 0.5760 - val_kappa: 0.1622\nEpoch 3/5\n183/183 [==============================] - 69s 379ms/step - loss: 0.9260 - acc: 0.6571 - kappa: 0.6398 - val_loss: 1.2030 - val_acc: 0.4881 - val_kappa: -0.4510\nEpoch 4/5\n183/183 [==============================] - 69s 378ms/step - loss: 0.8650 - acc: 0.6837 - kappa: 0.6950 - val_loss: 1.0301 - val_acc: 0.5425 - val_kappa: 0.0034\nEpoch 5/5\n183/183 [==============================] - 69s 377ms/step - loss: 0.8863 - acc: 0.6640 - kappa: 0.6651 - val_loss: 0.9225 - val_acc: 0.6444 - val_kappa: 0.5296\n"
]
],
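[
[
"The `kappa` column in the warm-up log above comes from a quadratic weighted kappa metric defined earlier in the kernel. As a minimal, hedged sketch of an alternative epoch-end computation (assumed names: `X_val` and `y_val` as held-out image and integer-label arrays), a Keras callback can score predictions with scikit-learn:",
"_____no_output_____"
]
],
[
[
"# Hedged sketch, not the kernel's own metric: report quadratic weighted\n# kappa on a held-out set after every epoch. X_val / y_val are assumed\n# validation arrays, not names defined in this excerpt.\nimport numpy as np\nfrom sklearn.metrics import cohen_kappa_score\nfrom keras.callbacks import Callback\n\nclass QWKCallback(Callback):\n    def __init__(self, X_val, y_val):\n        super(QWKCallback, self).__init__()\n        self.X_val = X_val\n        self.y_val = y_val\n\n    def on_epoch_end(self, epoch, logs=None):\n        y_pred = np.argmax(self.model.predict(self.X_val), axis=1)\n        qwk = cohen_kappa_score(self.y_val, y_pred, weights='quadratic')\n        print(' - val_qwk: %.4f' % qwk)",
"_____no_output_____"
]
],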
[
[
"# Fine-tune the complete model",
"_____no_output_____"
]
],
[
[
"for layer in model.layers:\n layer.trainable = True\n\n# lrstep = LearningRateScheduler(step_decay)\nes = EarlyStopping(monitor='val_loss', mode='min', patience=ES_PATIENCE, restore_best_weights=True, verbose=1)\nrlrop = ReduceLROnPlateau(monitor='val_loss', mode='min', patience=RLROP_PATIENCE, factor=DECAY_DROP, min_lr=1e-6, verbose=1)\n\ncallback_list = [es, rlrop]\noptimizer = optimizers.Adam(lr=LEARNING_RATE)\nmodel.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=metric_list)\nmodel.summary()",
"__________________________________________________________________________________________________\nLayer (type) Output Shape Param # Connected to \n==================================================================================================\ninput_1 (InputLayer) (None, 320, 320, 3) 0 \n__________________________________________________________________________________________________\nzero_padding2d_1 (ZeroPadding2D (None, 326, 326, 3) 0 input_1[0][0] \n__________________________________________________________________________________________________\nconv1/conv (Conv2D) (None, 160, 160, 64) 9408 zero_padding2d_1[0][0] \n__________________________________________________________________________________________________\nconv1/bn (BatchNormalization) (None, 160, 160, 64) 256 conv1/conv[0][0] \n__________________________________________________________________________________________________\nconv1/relu (Activation) (None, 160, 160, 64) 0 conv1/bn[0][0] \n__________________________________________________________________________________________________\nzero_padding2d_2 (ZeroPadding2D (None, 162, 162, 64) 0 conv1/relu[0][0] \n__________________________________________________________________________________________________\npool1 (MaxPooling2D) (None, 80, 80, 64) 0 zero_padding2d_2[0][0] \n__________________________________________________________________________________________________\nconv2_block1_0_bn (BatchNormali (None, 80, 80, 64) 256 pool1[0][0] \n__________________________________________________________________________________________________\nconv2_block1_0_relu (Activation (None, 80, 80, 64) 0 conv2_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_conv (Conv2D) (None, 80, 80, 128) 8192 conv2_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_relu (Activation (None, 80, 80, 128) 0 conv2_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_concat (Concatenat (None, 80, 80, 96) 0 pool1[0][0] \n conv2_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_bn (BatchNormali (None, 80, 80, 96) 384 conv2_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_relu (Activation (None, 80, 80, 96) 0 conv2_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_conv (Conv2D) (None, 80, 80, 128) 12288 conv2_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_relu (Activation (None, 80, 80, 128) 0 conv2_block2_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv2_block2_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_concat (Concatenat (None, 80, 80, 128) 0 conv2_block1_concat[0][0] \n conv2_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_relu (Activation (None, 80, 80, 128) 0 conv2_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_conv (Conv2D) (None, 80, 80, 128) 16384 conv2_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_relu (Activation (None, 80, 80, 128) 0 conv2_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_concat (Concatenat (None, 80, 80, 160) 0 conv2_block2_concat[0][0] \n conv2_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_0_bn (BatchNormali (None, 80, 80, 160) 640 conv2_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block4_0_relu (Activation (None, 80, 80, 160) 0 conv2_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_conv (Conv2D) (None, 80, 80, 128) 20480 conv2_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_relu (Activation (None, 80, 80, 128) 0 conv2_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_concat (Concatenat (None, 80, 80, 192) 0 conv2_block3_concat[0][0] \n conv2_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_bn (BatchNormali (None, 80, 80, 192) 768 conv2_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_relu (Activation (None, 80, 80, 192) 0 conv2_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_conv (Conv2D) (None, 80, 80, 128) 24576 conv2_block5_0_relu[0][0] 
\n__________________________________________________________________________________________________\nconv2_block5_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_relu (Activation (None, 80, 80, 128) 0 conv2_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block5_concat (Concatenat (None, 80, 80, 224) 0 conv2_block4_concat[0][0] \n conv2_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_bn (BatchNormali (None, 80, 80, 224) 896 conv2_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_relu (Activation (None, 80, 80, 224) 0 conv2_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_conv (Conv2D) (None, 80, 80, 128) 28672 conv2_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_relu (Activation (None, 80, 80, 128) 0 conv2_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_concat (Concatenat (None, 80, 80, 256) 0 conv2_block5_concat[0][0] \n conv2_block6_2_conv[0][0] \n__________________________________________________________________________________________________\npool2_bn (BatchNormalization) (None, 80, 80, 256) 1024 conv2_block6_concat[0][0] \n__________________________________________________________________________________________________\npool2_relu (Activation) (None, 80, 80, 256) 0 pool2_bn[0][0] \n__________________________________________________________________________________________________\npool2_conv (Conv2D) (None, 80, 80, 128) 32768 pool2_relu[0][0] \n__________________________________________________________________________________________________\npool2_pool (AveragePooling2D) (None, 40, 40, 128) 0 pool2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_bn (BatchNormali (None, 40, 40, 128) 512 pool2_pool[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_relu (Activation (None, 40, 40, 128) 0 conv3_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_conv (Conv2D) (None, 40, 40, 128) 16384 conv3_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block1_1_conv[0][0] 
\n__________________________________________________________________________________________________\nconv3_block1_1_relu (Activation (None, 40, 40, 128) 0 conv3_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_concat (Concatenat (None, 40, 40, 160) 0 pool2_pool[0][0] \n conv3_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_0_bn (BatchNormali (None, 40, 40, 160) 640 conv3_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block2_0_relu (Activation (None, 40, 40, 160) 0 conv3_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_conv (Conv2D) (None, 40, 40, 128) 20480 conv3_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_relu (Activation (None, 40, 40, 128) 0 conv3_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_concat (Concatenat (None, 40, 40, 192) 0 conv3_block1_concat[0][0] \n conv3_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_bn (BatchNormali (None, 40, 40, 192) 768 conv3_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_relu (Activation (None, 40, 40, 192) 0 conv3_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_conv (Conv2D) (None, 40, 40, 128) 24576 conv3_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_relu (Activation (None, 40, 40, 128) 0 conv3_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_concat (Concatenat (None, 40, 40, 224) 0 conv3_block2_concat[0][0] \n conv3_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_bn (BatchNormali (None, 40, 40, 224) 896 conv3_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_relu (Activation (None, 40, 40, 224) 0 conv3_block4_0_bn[0][0] 
\n__________________________________________________________________________________________________\nconv3_block4_1_conv (Conv2D) (None, 40, 40, 128) 28672 conv3_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_relu (Activation (None, 40, 40, 128) 0 conv3_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block4_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_concat (Concatenat (None, 40, 40, 256) 0 conv3_block3_concat[0][0] \n conv3_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_0_bn (BatchNormali (None, 40, 40, 256) 1024 conv3_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block5_0_relu (Activation (None, 40, 40, 256) 0 conv3_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_conv (Conv2D) (None, 40, 40, 128) 32768 conv3_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_relu (Activation (None, 40, 40, 128) 0 conv3_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_concat (Concatenat (None, 40, 40, 288) 0 conv3_block4_concat[0][0] \n conv3_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_bn (BatchNormali (None, 40, 40, 288) 1152 conv3_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_relu (Activation (None, 40, 40, 288) 0 conv3_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_conv (Conv2D) (None, 40, 40, 128) 36864 conv3_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_relu (Activation (None, 40, 40, 128) 0 conv3_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_concat (Concatenat (None, 40, 40, 320) 0 conv3_block5_concat[0][0] \n conv3_block6_2_conv[0][0] 
\n__________________________________________________________________________________________________\nconv3_block7_0_bn (BatchNormali (None, 40, 40, 320) 1280 conv3_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block7_0_relu (Activation (None, 40, 40, 320) 0 conv3_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_conv (Conv2D) (None, 40, 40, 128) 40960 conv3_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_relu (Activation (None, 40, 40, 128) 0 conv3_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_concat (Concatenat (None, 40, 40, 352) 0 conv3_block6_concat[0][0] \n conv3_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_0_bn (BatchNormali (None, 40, 40, 352) 1408 conv3_block7_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block8_0_relu (Activation (None, 40, 40, 352) 0 conv3_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_conv (Conv2D) (None, 40, 40, 128) 45056 conv3_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_relu (Activation (None, 40, 40, 128) 0 conv3_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block8_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block8_concat (Concatenat (None, 40, 40, 384) 0 conv3_block7_concat[0][0] \n conv3_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block9_0_bn (BatchNormali (None, 40, 40, 384) 1536 conv3_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block9_0_relu (Activation (None, 40, 40, 384) 0 conv3_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_conv (Conv2D) (None, 40, 40, 128) 49152 conv3_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_relu (Activation (None, 40, 40, 128) 0 conv3_block9_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv3_block9_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block9_concat (Concatenat (None, 40, 40, 416) 0 conv3_block8_concat[0][0] \n conv3_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block10_0_bn (BatchNormal (None, 40, 40, 416) 1664 conv3_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block10_0_relu (Activatio (None, 40, 40, 416) 0 conv3_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_conv (Conv2D) (None, 40, 40, 128) 53248 conv3_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block10_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block10_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block10_concat (Concatena (None, 40, 40, 448) 0 conv3_block9_concat[0][0] \n conv3_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block11_0_bn (BatchNormal (None, 40, 40, 448) 1792 conv3_block10_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block11_0_relu (Activatio (None, 40, 40, 448) 0 conv3_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_conv (Conv2D) (None, 40, 40, 128) 57344 conv3_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block11_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block11_concat (Concatena (None, 40, 40, 480) 0 conv3_block10_concat[0][0] \n conv3_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block12_0_bn (BatchNormal (None, 40, 40, 480) 1920 conv3_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block12_0_relu (Activatio (None, 40, 40, 480) 0 conv3_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_conv (Conv2D) (None, 40, 40, 128) 61440 
conv3_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block12_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block12_concat (Concatena (None, 40, 40, 512) 0 conv3_block11_concat[0][0] \n conv3_block12_2_conv[0][0] \n__________________________________________________________________________________________________\npool3_bn (BatchNormalization) (None, 40, 40, 512) 2048 conv3_block12_concat[0][0] \n__________________________________________________________________________________________________\npool3_relu (Activation) (None, 40, 40, 512) 0 pool3_bn[0][0] \n__________________________________________________________________________________________________\npool3_conv (Conv2D) (None, 40, 40, 256) 131072 pool3_relu[0][0] \n__________________________________________________________________________________________________\npool3_pool (AveragePooling2D) (None, 20, 20, 256) 0 pool3_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block1_0_bn (BatchNormali (None, 20, 20, 256) 1024 pool3_pool[0][0] \n__________________________________________________________________________________________________\nconv4_block1_0_relu (Activation (None, 20, 20, 256) 0 conv4_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_conv (Conv2D) (None, 20, 20, 128) 32768 conv4_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_relu (Activation (None, 20, 20, 128) 0 conv4_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block1_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block1_concat (Concatenat (None, 20, 20, 288) 0 pool3_pool[0][0] \n conv4_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block2_0_bn (BatchNormali (None, 20, 20, 288) 1152 conv4_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block2_0_relu (Activation (None, 20, 20, 288) 0 conv4_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_conv (Conv2D) (None, 20, 20, 128) 36864 conv4_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block2_1_conv[0][0] 
\n__________________________________________________________________________________________________\nconv4_block2_1_relu (Activation (None, 20, 20, 128) 0 conv4_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block2_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block2_concat (Concatenat (None, 20, 20, 320) 0 conv4_block1_concat[0][0] \n conv4_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block3_0_bn (BatchNormali (None, 20, 20, 320) 1280 conv4_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block3_0_relu (Activation (None, 20, 20, 320) 0 conv4_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_conv (Conv2D) (None, 20, 20, 128) 40960 conv4_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_relu (Activation (None, 20, 20, 128) 0 conv4_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block3_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block3_concat (Concatenat (None, 20, 20, 352) 0 conv4_block2_concat[0][0] \n conv4_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block4_0_bn (BatchNormali (None, 20, 20, 352) 1408 conv4_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block4_0_relu (Activation (None, 20, 20, 352) 0 conv4_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_conv (Conv2D) (None, 20, 20, 128) 45056 conv4_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_relu (Activation (None, 20, 20, 128) 0 conv4_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block4_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block4_concat (Concatenat (None, 20, 20, 384) 0 conv4_block3_concat[0][0] \n conv4_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block5_0_bn (BatchNormali (None, 20, 20, 384) 1536 conv4_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block5_0_relu (Activation (None, 20, 20, 384) 0 conv4_block5_0_bn[0][0] 
\n__________________________________________________________________________________________________\nconv4_block5_1_conv (Conv2D) (None, 20, 20, 128) 49152 conv4_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block5_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block5_1_relu (Activation (None, 20, 20, 128) 0 conv4_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block5_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block5_concat (Concatenat (None, 20, 20, 416) 0 conv4_block4_concat[0][0] \n conv4_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block6_0_bn (BatchNormali (None, 20, 20, 416) 1664 conv4_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block6_0_relu (Activation (None, 20, 20, 416) 0 conv4_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block6_1_conv (Conv2D) (None, 20, 20, 128) 53248 conv4_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block6_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block6_1_relu (Activation (None, 20, 20, 128) 0 conv4_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block6_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block6_concat (Concatenat (None, 20, 20, 448) 0 conv4_block5_concat[0][0] \n conv4_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block7_0_bn (BatchNormali (None, 20, 20, 448) 1792 conv4_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block7_0_relu (Activation (None, 20, 20, 448) 0 conv4_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block7_1_conv (Conv2D) (None, 20, 20, 128) 57344 conv4_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block7_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block7_1_relu (Activation (None, 20, 20, 128) 0 conv4_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block7_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block7_concat (Concatenat (None, 20, 20, 480) 0 conv4_block6_concat[0][0] \n conv4_block7_2_conv[0][0] 
\n__________________________________________________________________________________________________\nconv4_block8_0_bn (BatchNormali (None, 20, 20, 480) 1920 conv4_block7_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block8_0_relu (Activation (None, 20, 20, 480) 0 conv4_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block8_1_conv (Conv2D) (None, 20, 20, 128) 61440 conv4_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block8_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block8_1_relu (Activation (None, 20, 20, 128) 0 conv4_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block8_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block8_concat (Concatenat (None, 20, 20, 512) 0 conv4_block7_concat[0][0] \n conv4_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block9_0_bn (BatchNormali (None, 20, 20, 512) 2048 conv4_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block9_0_relu (Activation (None, 20, 20, 512) 0 conv4_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block9_1_conv (Conv2D) (None, 20, 20, 128) 65536 conv4_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block9_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block9_1_relu (Activation (None, 20, 20, 128) 0 conv4_block9_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block9_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block9_concat (Concatenat (None, 20, 20, 544) 0 conv4_block8_concat[0][0] \n conv4_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block10_0_bn (BatchNormal (None, 20, 20, 544) 2176 conv4_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block10_0_relu (Activatio (None, 20, 20, 544) 0 conv4_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block10_1_conv (Conv2D) (None, 20, 20, 128) 69632 conv4_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block10_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block10_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block10_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv4_block10_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block10_concat (Concatena (None, 20, 20, 576) 0 conv4_block9_concat[0][0] \n conv4_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block11_0_bn (BatchNormal (None, 20, 20, 576) 2304 conv4_block10_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block11_0_relu (Activatio (None, 20, 20, 576) 0 conv4_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block11_1_conv (Conv2D) (None, 20, 20, 128) 73728 conv4_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block11_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block11_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block11_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block11_concat (Concatena (None, 20, 20, 608) 0 conv4_block10_concat[0][0] \n conv4_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block12_0_bn (BatchNormal (None, 20, 20, 608) 2432 conv4_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block12_0_relu (Activatio (None, 20, 20, 608) 0 conv4_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block12_1_conv (Conv2D) (None, 20, 20, 128) 77824 conv4_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block12_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block12_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block12_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block12_concat (Concatena (None, 20, 20, 640) 0 conv4_block11_concat[0][0] \n conv4_block12_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block13_0_bn (BatchNormal (None, 20, 20, 640) 2560 conv4_block12_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block13_0_relu (Activatio (None, 20, 20, 640) 0 conv4_block13_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block13_1_conv (Conv2D) (None, 20, 20, 128) 81920 
conv4_block13_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block13_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block13_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block13_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block13_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block13_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block13_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block13_concat (Concatena (None, 20, 20, 672) 0 conv4_block12_concat[0][0] \n conv4_block13_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block14_0_bn (BatchNormal (None, 20, 20, 672) 2688 conv4_block13_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block14_0_relu (Activatio (None, 20, 20, 672) 0 conv4_block14_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block14_1_conv (Conv2D) (None, 20, 20, 128) 86016 conv4_block14_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block14_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block14_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block14_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block14_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block14_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block14_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block14_concat (Concatena (None, 20, 20, 704) 0 conv4_block13_concat[0][0] \n conv4_block14_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block15_0_bn (BatchNormal (None, 20, 20, 704) 2816 conv4_block14_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block15_0_relu (Activatio (None, 20, 20, 704) 0 conv4_block15_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block15_1_conv (Conv2D) (None, 20, 20, 128) 90112 conv4_block15_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block15_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block15_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block15_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block15_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block15_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block15_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block15_concat (Concatena (None, 20, 20, 736) 0 conv4_block14_concat[0][0] \n conv4_block15_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block16_0_bn (BatchNormal 
(None, 20, 20, 736) 2944 conv4_block15_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block16_0_relu (Activatio (None, 20, 20, 736) 0 conv4_block16_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block16_1_conv (Conv2D) (None, 20, 20, 128) 94208 conv4_block16_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block16_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block16_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block16_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block16_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block16_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block16_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block16_concat (Concatena (None, 20, 20, 768) 0 conv4_block15_concat[0][0] \n conv4_block16_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block17_0_bn (BatchNormal (None, 20, 20, 768) 3072 conv4_block16_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block17_0_relu (Activatio (None, 20, 20, 768) 0 conv4_block17_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block17_1_conv (Conv2D) (None, 20, 20, 128) 98304 conv4_block17_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block17_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block17_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block17_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block17_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block17_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block17_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block17_concat (Concatena (None, 20, 20, 800) 0 conv4_block16_concat[0][0] \n conv4_block17_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block18_0_bn (BatchNormal (None, 20, 20, 800) 3200 conv4_block17_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block18_0_relu (Activatio (None, 20, 20, 800) 0 conv4_block18_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block18_1_conv (Conv2D) (None, 20, 20, 128) 102400 conv4_block18_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block18_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block18_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block18_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block18_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block18_2_conv (Conv2D) (None, 
20, 20, 32) 36864 conv4_block18_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block18_concat (Concatena (None, 20, 20, 832) 0 conv4_block17_concat[0][0] \n conv4_block18_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block19_0_bn (BatchNormal (None, 20, 20, 832) 3328 conv4_block18_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block19_0_relu (Activatio (None, 20, 20, 832) 0 conv4_block19_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block19_1_conv (Conv2D) (None, 20, 20, 128) 106496 conv4_block19_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block19_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block19_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block19_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block19_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block19_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block19_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block19_concat (Concatena (None, 20, 20, 864) 0 conv4_block18_concat[0][0] \n conv4_block19_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block20_0_bn (BatchNormal (None, 20, 20, 864) 3456 conv4_block19_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block20_0_relu (Activatio (None, 20, 20, 864) 0 conv4_block20_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block20_1_conv (Conv2D) (None, 20, 20, 128) 110592 conv4_block20_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block20_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block20_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block20_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block20_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block20_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block20_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block20_concat (Concatena (None, 20, 20, 896) 0 conv4_block19_concat[0][0] \n conv4_block20_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block21_0_bn (BatchNormal (None, 20, 20, 896) 3584 conv4_block20_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block21_0_relu (Activatio (None, 20, 20, 896) 0 conv4_block21_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block21_1_conv (Conv2D) (None, 20, 20, 128) 114688 conv4_block21_0_relu[0][0] 
\n__________________________________________________________________________________________________\nconv4_block21_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block21_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block21_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block21_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block21_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block21_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block21_concat (Concatena (None, 20, 20, 928) 0 conv4_block20_concat[0][0] \n conv4_block21_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block22_0_bn (BatchNormal (None, 20, 20, 928) 3712 conv4_block21_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block22_0_relu (Activatio (None, 20, 20, 928) 0 conv4_block22_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block22_1_conv (Conv2D) (None, 20, 20, 128) 118784 conv4_block22_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block22_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block22_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block22_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block22_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block22_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block22_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block22_concat (Concatena (None, 20, 20, 960) 0 conv4_block21_concat[0][0] \n conv4_block22_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block23_0_bn (BatchNormal (None, 20, 20, 960) 3840 conv4_block22_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block23_0_relu (Activatio (None, 20, 20, 960) 0 conv4_block23_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block23_1_conv (Conv2D) (None, 20, 20, 128) 122880 conv4_block23_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block23_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block23_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block23_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block23_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block23_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block23_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block23_concat (Concatena (None, 20, 20, 992) 0 conv4_block22_concat[0][0] \n conv4_block23_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block24_0_bn (BatchNormal (None, 20, 20, 992) 3968 
conv4_block23_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block24_0_relu (Activatio (None, 20, 20, 992) 0 conv4_block24_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block24_1_conv (Conv2D) (None, 20, 20, 128) 126976 conv4_block24_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block24_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block24_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block24_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block24_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block24_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block24_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block24_concat (Concatena (None, 20, 20, 1024) 0 conv4_block23_concat[0][0] \n conv4_block24_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block25_0_bn (BatchNormal (None, 20, 20, 1024) 4096 conv4_block24_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block25_0_relu (Activatio (None, 20, 20, 1024) 0 conv4_block25_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block25_1_conv (Conv2D) (None, 20, 20, 128) 131072 conv4_block25_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block25_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block25_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block25_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block25_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block25_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block25_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block25_concat (Concatena (None, 20, 20, 1056) 0 conv4_block24_concat[0][0] \n conv4_block25_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block26_0_bn (BatchNormal (None, 20, 20, 1056) 4224 conv4_block25_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block26_0_relu (Activatio (None, 20, 20, 1056) 0 conv4_block26_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block26_1_conv (Conv2D) (None, 20, 20, 128) 135168 conv4_block26_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block26_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block26_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block26_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block26_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block26_2_conv (Conv2D) (None, 20, 20, 32) 36864 
conv4_block26_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block26_concat (Concatena (None, 20, 20, 1088) 0 conv4_block25_concat[0][0] \n conv4_block26_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block27_0_bn (BatchNormal (None, 20, 20, 1088) 4352 conv4_block26_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block27_0_relu (Activatio (None, 20, 20, 1088) 0 conv4_block27_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block27_1_conv (Conv2D) (None, 20, 20, 128) 139264 conv4_block27_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block27_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block27_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block27_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block27_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block27_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block27_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block27_concat (Concatena (None, 20, 20, 1120) 0 conv4_block26_concat[0][0] \n conv4_block27_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block28_0_bn (BatchNormal (None, 20, 20, 1120) 4480 conv4_block27_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block28_0_relu (Activatio (None, 20, 20, 1120) 0 conv4_block28_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block28_1_conv (Conv2D) (None, 20, 20, 128) 143360 conv4_block28_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block28_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block28_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block28_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block28_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block28_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block28_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block28_concat (Concatena (None, 20, 20, 1152) 0 conv4_block27_concat[0][0] \n conv4_block28_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block29_0_bn (BatchNormal (None, 20, 20, 1152) 4608 conv4_block28_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block29_0_relu (Activatio (None, 20, 20, 1152) 0 conv4_block29_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block29_1_conv (Conv2D) (None, 20, 20, 128) 147456 conv4_block29_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block29_1_bn 
(BatchNormal (None, 20, 20, 128) 512 conv4_block29_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block29_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block29_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block29_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block29_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block29_concat (Concatena (None, 20, 20, 1184) 0 conv4_block28_concat[0][0] \n conv4_block29_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block30_0_bn (BatchNormal (None, 20, 20, 1184) 4736 conv4_block29_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block30_0_relu (Activatio (None, 20, 20, 1184) 0 conv4_block30_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block30_1_conv (Conv2D) (None, 20, 20, 128) 151552 conv4_block30_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block30_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block30_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block30_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block30_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block30_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block30_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block30_concat (Concatena (None, 20, 20, 1216) 0 conv4_block29_concat[0][0] \n conv4_block30_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block31_0_bn (BatchNormal (None, 20, 20, 1216) 4864 conv4_block30_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block31_0_relu (Activatio (None, 20, 20, 1216) 0 conv4_block31_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block31_1_conv (Conv2D) (None, 20, 20, 128) 155648 conv4_block31_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block31_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block31_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block31_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block31_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block31_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block31_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block31_concat (Concatena (None, 20, 20, 1248) 0 conv4_block30_concat[0][0] \n conv4_block31_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block32_0_bn (BatchNormal (None, 20, 20, 1248) 4992 conv4_block31_concat[0][0] 
\n__________________________________________________________________________________________________\nconv4_block32_0_relu (Activatio (None, 20, 20, 1248) 0 conv4_block32_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block32_1_conv (Conv2D) (None, 20, 20, 128) 159744 conv4_block32_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block32_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block32_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block32_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block32_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block32_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block32_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block32_concat (Concatena (None, 20, 20, 1280) 0 conv4_block31_concat[0][0] \n conv4_block32_2_conv[0][0] \n__________________________________________________________________________________________________\npool4_bn (BatchNormalization) (None, 20, 20, 1280) 5120 conv4_block32_concat[0][0] \n__________________________________________________________________________________________________\npool4_relu (Activation) (None, 20, 20, 1280) 0 pool4_bn[0][0] \n__________________________________________________________________________________________________\npool4_conv (Conv2D) (None, 20, 20, 640) 819200 pool4_relu[0][0] \n__________________________________________________________________________________________________\npool4_pool (AveragePooling2D) (None, 10, 10, 640) 0 pool4_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block1_0_bn (BatchNormali (None, 10, 10, 640) 2560 pool4_pool[0][0] \n__________________________________________________________________________________________________\nconv5_block1_0_relu (Activation (None, 10, 10, 640) 0 conv5_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block1_1_conv (Conv2D) (None, 10, 10, 128) 81920 conv5_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block1_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block1_1_relu (Activation (None, 10, 10, 128) 0 conv5_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block1_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block1_concat (Concatenat (None, 10, 10, 672) 0 pool4_pool[0][0] \n conv5_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block2_0_bn (BatchNormali (None, 10, 10, 672) 2688 conv5_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block2_0_relu (Activation (None, 10, 10, 672) 0 conv5_block2_0_bn[0][0] 
\n__________________________________________________________________________________________________\nconv5_block2_1_conv (Conv2D) (None, 10, 10, 128) 86016 conv5_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block2_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block2_1_relu (Activation (None, 10, 10, 128) 0 conv5_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block2_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block2_concat (Concatenat (None, 10, 10, 704) 0 conv5_block1_concat[0][0] \n conv5_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block3_0_bn (BatchNormali (None, 10, 10, 704) 2816 conv5_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block3_0_relu (Activation (None, 10, 10, 704) 0 conv5_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block3_1_conv (Conv2D) (None, 10, 10, 128) 90112 conv5_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block3_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block3_1_relu (Activation (None, 10, 10, 128) 0 conv5_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block3_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block3_concat (Concatenat (None, 10, 10, 736) 0 conv5_block2_concat[0][0] \n conv5_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block4_0_bn (BatchNormali (None, 10, 10, 736) 2944 conv5_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block4_0_relu (Activation (None, 10, 10, 736) 0 conv5_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_conv (Conv2D) (None, 10, 10, 128) 94208 conv5_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_relu (Activation (None, 10, 10, 128) 0 conv5_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block4_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block4_concat (Concatenat (None, 10, 10, 768) 0 conv5_block3_concat[0][0] \n conv5_block4_2_conv[0][0] 
\n__________________________________________________________________________________________________\nconv5_block5_0_bn (BatchNormali (None, 10, 10, 768) 3072 conv5_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block5_0_relu (Activation (None, 10, 10, 768) 0 conv5_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_conv (Conv2D) (None, 10, 10, 128) 98304 conv5_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_relu (Activation (None, 10, 10, 128) 0 conv5_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block5_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block5_concat (Concatenat (None, 10, 10, 800) 0 conv5_block4_concat[0][0] \n conv5_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block6_0_bn (BatchNormali (None, 10, 10, 800) 3200 conv5_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block6_0_relu (Activation (None, 10, 10, 800) 0 conv5_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_conv (Conv2D) (None, 10, 10, 128) 102400 conv5_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_relu (Activation (None, 10, 10, 128) 0 conv5_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block6_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block6_concat (Concatenat (None, 10, 10, 832) 0 conv5_block5_concat[0][0] \n conv5_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block7_0_bn (BatchNormali (None, 10, 10, 832) 3328 conv5_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block7_0_relu (Activation (None, 10, 10, 832) 0 conv5_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_conv (Conv2D) (None, 10, 10, 128) 106496 conv5_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_relu (Activation (None, 10, 10, 128) 0 conv5_block7_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv5_block7_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block7_concat (Concatenat (None, 10, 10, 864) 0 conv5_block6_concat[0][0] \n conv5_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block8_0_bn (BatchNormali (None, 10, 10, 864) 3456 conv5_block7_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block8_0_relu (Activation (None, 10, 10, 864) 0 conv5_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_conv (Conv2D) (None, 10, 10, 128) 110592 conv5_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_relu (Activation (None, 10, 10, 128) 0 conv5_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block8_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block8_concat (Concatenat (None, 10, 10, 896) 0 conv5_block7_concat[0][0] \n conv5_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block9_0_bn (BatchNormali (None, 10, 10, 896) 3584 conv5_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block9_0_relu (Activation (None, 10, 10, 896) 0 conv5_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_conv (Conv2D) (None, 10, 10, 128) 114688 conv5_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_relu (Activation (None, 10, 10, 128) 0 conv5_block9_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block9_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block9_concat (Concatenat (None, 10, 10, 928) 0 conv5_block8_concat[0][0] \n conv5_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block10_0_bn (BatchNormal (None, 10, 10, 928) 3712 conv5_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block10_0_relu (Activatio (None, 10, 10, 928) 0 conv5_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_conv (Conv2D) (None, 10, 10, 128) 118784 
conv5_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block10_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block10_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block10_concat (Concatena (None, 10, 10, 960) 0 conv5_block9_concat[0][0] \n conv5_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block11_0_bn (BatchNormal (None, 10, 10, 960) 3840 conv5_block10_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block11_0_relu (Activatio (None, 10, 10, 960) 0 conv5_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_conv (Conv2D) (None, 10, 10, 128) 122880 conv5_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block11_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block11_concat (Concatena (None, 10, 10, 992) 0 conv5_block10_concat[0][0] \n conv5_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block12_0_bn (BatchNormal (None, 10, 10, 992) 3968 conv5_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block12_0_relu (Activatio (None, 10, 10, 992) 0 conv5_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_conv (Conv2D) (None, 10, 10, 128) 126976 conv5_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block12_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block12_concat (Concatena (None, 10, 10, 1024) 0 conv5_block11_concat[0][0] \n conv5_block12_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block13_0_bn 
(BatchNormal (None, 10, 10, 1024) 4096 conv5_block12_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block13_0_relu (Activatio (None, 10, 10, 1024) 0 conv5_block13_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_conv (Conv2D) (None, 10, 10, 128) 131072 conv5_block13_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block13_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block13_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block13_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block13_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block13_concat (Concatena (None, 10, 10, 1056) 0 conv5_block12_concat[0][0] \n conv5_block13_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block14_0_bn (BatchNormal (None, 10, 10, 1056) 4224 conv5_block13_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block14_0_relu (Activatio (None, 10, 10, 1056) 0 conv5_block14_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_conv (Conv2D) (None, 10, 10, 128) 135168 conv5_block14_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block14_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block14_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block14_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block14_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block14_concat (Concatena (None, 10, 10, 1088) 0 conv5_block13_concat[0][0] \n conv5_block14_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block15_0_bn (BatchNormal (None, 10, 10, 1088) 4352 conv5_block14_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block15_0_relu (Activatio (None, 10, 10, 1088) 0 conv5_block15_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_conv (Conv2D) (None, 10, 10, 128) 139264 conv5_block15_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block15_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block15_1_bn[0][0] 
\n__________________________________________________________________________________________________\nconv5_block15_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block15_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block15_concat (Concatena (None, 10, 10, 1120) 0 conv5_block14_concat[0][0] \n conv5_block15_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block16_0_bn (BatchNormal (None, 10, 10, 1120) 4480 conv5_block15_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block16_0_relu (Activatio (None, 10, 10, 1120) 0 conv5_block16_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_conv (Conv2D) (None, 10, 10, 128) 143360 conv5_block16_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block16_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block16_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block16_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block16_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block16_concat (Concatena (None, 10, 10, 1152) 0 conv5_block15_concat[0][0] \n conv5_block16_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block17_0_bn (BatchNormal (None, 10, 10, 1152) 4608 conv5_block16_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block17_0_relu (Activatio (None, 10, 10, 1152) 0 conv5_block17_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_conv (Conv2D) (None, 10, 10, 128) 147456 conv5_block17_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block17_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block17_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block17_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block17_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block17_concat (Concatena (None, 10, 10, 1184) 0 conv5_block16_concat[0][0] \n conv5_block17_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block18_0_bn (BatchNormal (None, 10, 10, 1184) 4736 conv5_block17_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block18_0_relu (Activatio (None, 10, 10, 1184) 0 conv5_block18_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_conv (Conv2D) (None, 10, 10, 
128) 151552 conv5_block18_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block18_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block18_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block18_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block18_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block18_concat (Concatena (None, 10, 10, 1216) 0 conv5_block17_concat[0][0] \n conv5_block18_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block19_0_bn (BatchNormal (None, 10, 10, 1216) 4864 conv5_block18_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block19_0_relu (Activatio (None, 10, 10, 1216) 0 conv5_block19_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_conv (Conv2D) (None, 10, 10, 128) 155648 conv5_block19_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block19_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block19_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block19_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block19_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block19_concat (Concatena (None, 10, 10, 1248) 0 conv5_block18_concat[0][0] \n conv5_block19_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block20_0_bn (BatchNormal (None, 10, 10, 1248) 4992 conv5_block19_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block20_0_relu (Activatio (None, 10, 10, 1248) 0 conv5_block20_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_conv (Conv2D) (None, 10, 10, 128) 159744 conv5_block20_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block20_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block20_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block20_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block20_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block20_concat (Concatena (None, 10, 10, 1280) 0 conv5_block19_concat[0][0] \n conv5_block20_2_conv[0][0] 
\n__________________________________________________________________________________________________\nconv5_block21_0_bn (BatchNormal (None, 10, 10, 1280) 5120 conv5_block20_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block21_0_relu (Activatio (None, 10, 10, 1280) 0 conv5_block21_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_conv (Conv2D) (None, 10, 10, 128) 163840 conv5_block21_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block21_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block21_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block21_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block21_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block21_concat (Concatena (None, 10, 10, 1312) 0 conv5_block20_concat[0][0] \n conv5_block21_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block22_0_bn (BatchNormal (None, 10, 10, 1312) 5248 conv5_block21_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block22_0_relu (Activatio (None, 10, 10, 1312) 0 conv5_block22_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_conv (Conv2D) (None, 10, 10, 128) 167936 conv5_block22_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block22_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block22_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block22_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block22_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block22_concat (Concatena (None, 10, 10, 1344) 0 conv5_block21_concat[0][0] \n conv5_block22_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block23_0_bn (BatchNormal (None, 10, 10, 1344) 5376 conv5_block22_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block23_0_relu (Activatio (None, 10, 10, 1344) 0 conv5_block23_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_conv (Conv2D) (None, 10, 10, 128) 172032 conv5_block23_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block23_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_relu (Activatio (None, 10, 10, 128) 0 
conv5_block23_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block23_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block23_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block23_concat (Concatena (None, 10, 10, 1376) 0 conv5_block22_concat[0][0] \n conv5_block23_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block24_0_bn (BatchNormal (None, 10, 10, 1376) 5504 conv5_block23_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block24_0_relu (Activatio (None, 10, 10, 1376) 0 conv5_block24_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_conv (Conv2D) (None, 10, 10, 128) 176128 conv5_block24_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block24_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block24_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block24_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block24_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block24_concat (Concatena (None, 10, 10, 1408) 0 conv5_block23_concat[0][0] \n conv5_block24_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block25_0_bn (BatchNormal (None, 10, 10, 1408) 5632 conv5_block24_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block25_0_relu (Activatio (None, 10, 10, 1408) 0 conv5_block25_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_conv (Conv2D) (None, 10, 10, 128) 180224 conv5_block25_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block25_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block25_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block25_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block25_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block25_concat (Concatena (None, 10, 10, 1440) 0 conv5_block24_concat[0][0] \n conv5_block25_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block26_0_bn (BatchNormal (None, 10, 10, 1440) 5760 conv5_block25_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block26_0_relu (Activatio (None, 10, 10, 1440) 0 conv5_block26_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_conv 
(Conv2D) (None, 10, 10, 128) 184320 conv5_block26_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block26_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block26_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block26_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block26_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block26_concat (Concatena (None, 10, 10, 1472) 0 conv5_block25_concat[0][0] \n conv5_block26_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block27_0_bn (BatchNormal (None, 10, 10, 1472) 5888 conv5_block26_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block27_0_relu (Activatio (None, 10, 10, 1472) 0 conv5_block27_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_conv (Conv2D) (None, 10, 10, 128) 188416 conv5_block27_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block27_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block27_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block27_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block27_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block27_concat (Concatena (None, 10, 10, 1504) 0 conv5_block26_concat[0][0] \n conv5_block27_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block28_0_bn (BatchNormal (None, 10, 10, 1504) 6016 conv5_block27_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block28_0_relu (Activatio (None, 10, 10, 1504) 0 conv5_block28_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_conv (Conv2D) (None, 10, 10, 128) 192512 conv5_block28_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block28_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block28_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block28_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block28_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block28_concat (Concatena (None, 10, 10, 1536) 0 conv5_block27_concat[0][0] \n conv5_block28_2_conv[0][0] 
\n__________________________________________________________________________________________________\nconv5_block29_0_bn (BatchNormal (None, 10, 10, 1536) 6144 conv5_block28_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block29_0_relu (Activatio (None, 10, 10, 1536) 0 conv5_block29_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_conv (Conv2D) (None, 10, 10, 128) 196608 conv5_block29_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block29_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block29_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block29_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block29_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block29_concat (Concatena (None, 10, 10, 1568) 0 conv5_block28_concat[0][0] \n conv5_block29_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block30_0_bn (BatchNormal (None, 10, 10, 1568) 6272 conv5_block29_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block30_0_relu (Activatio (None, 10, 10, 1568) 0 conv5_block30_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_conv (Conv2D) (None, 10, 10, 128) 200704 conv5_block30_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block30_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block30_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block30_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block30_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block30_concat (Concatena (None, 10, 10, 1600) 0 conv5_block29_concat[0][0] \n conv5_block30_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block31_0_bn (BatchNormal (None, 10, 10, 1600) 6400 conv5_block30_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block31_0_relu (Activatio (None, 10, 10, 1600) 0 conv5_block31_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_conv (Conv2D) (None, 10, 10, 128) 204800 conv5_block31_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block31_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_relu (Activatio (None, 10, 10, 128) 0 
conv5_block31_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block31_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_concat (Concatena (None, 10, 10, 1632) 0 conv5_block30_concat[0][0] \n conv5_block31_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_0_bn (BatchNormal (None, 10, 10, 1632) 6528 conv5_block31_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block32_0_relu (Activatio (None, 10, 10, 1632) 0 conv5_block32_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_conv (Conv2D) (None, 10, 10, 128) 208896 conv5_block32_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block32_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block32_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block32_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_concat (Concatena (None, 10, 10, 1664) 0 conv5_block31_concat[0][0] \n conv5_block32_2_conv[0][0] \n__________________________________________________________________________________________________\nbn (BatchNormalization) (None, 10, 10, 1664) 6656 conv5_block32_concat[0][0] \n__________________________________________________________________________________________________\nrelu (Activation) (None, 10, 10, 1664) 0 bn[0][0] \n__________________________________________________________________________________________________\nglobal_average_pooling2d_1 (Glo (None, 1664) 0 relu[0][0] \n__________________________________________________________________________________________________\ndropout_1 (Dropout) (None, 1664) 0 global_average_pooling2d_1[0][0] \n__________________________________________________________________________________________________\ndense_1 (Dense) (None, 2048) 3409920 dropout_1[0][0] \n__________________________________________________________________________________________________\ndropout_2 (Dropout) (None, 2048) 0 dense_1[0][0] \n__________________________________________________________________________________________________\nfinal_output (Dense) (None, 5) 10245 dropout_2[0][0] \n==================================================================================================\nTotal params: 16,063,045\nTrainable params: 15,904,645\nNon-trainable params: 158,400\n__________________________________________________________________________________________________\n"
],
[
"history = model.fit_generator(generator=train_generator,\n steps_per_epoch=STEP_SIZE_TRAIN,\n validation_data=valid_generator,\n validation_steps=STEP_SIZE_VALID,\n epochs=EPOCHS,\n callbacks=callback_list,\n class_weight=class_weights,\n verbose=1).history",
"Epoch 1/40\n183/183 [==============================] - 139s 757ms/step - loss: 0.6850 - acc: 0.7466 - kappa: 0.8268 - val_loss: 0.5695 - val_acc: 0.7908 - val_kappa: 0.8843\nEpoch 2/40\n183/183 [==============================] - 87s 478ms/step - loss: 0.5764 - acc: 0.7828 - kappa: 0.8835 - val_loss: 0.5638 - val_acc: 0.7880 - val_kappa: 0.8543\nEpoch 3/40\n183/183 [==============================] - 89s 487ms/step - loss: 0.5302 - acc: 0.7968 - kappa: 0.8996 - val_loss: 0.4854 - val_acc: 0.8298 - val_kappa: 0.9267\nEpoch 4/40\n183/183 [==============================] - 90s 493ms/step - loss: 0.4941 - acc: 0.8060 - kappa: 0.9220 - val_loss: 0.5247 - val_acc: 0.8061 - val_kappa: 0.9171\nEpoch 5/40\n183/183 [==============================] - 89s 487ms/step - loss: 0.4654 - acc: 0.8279 - kappa: 0.9285 - val_loss: 0.4637 - val_acc: 0.8145 - val_kappa: 0.9086\nEpoch 6/40\n183/183 [==============================] - 90s 491ms/step - loss: 0.4864 - acc: 0.8170 - kappa: 0.9225 - val_loss: 0.4663 - val_acc: 0.8326 - val_kappa: 0.9399\nEpoch 7/40\n183/183 [==============================] - 90s 493ms/step - loss: 0.4761 - acc: 0.8265 - kappa: 0.9363 - val_loss: 0.6075 - val_acc: 0.8006 - val_kappa: 0.8896\nEpoch 8/40\n183/183 [==============================] - 90s 494ms/step - loss: 0.4110 - acc: 0.8473 - kappa: 0.9440 - val_loss: 0.5248 - val_acc: 0.8229 - val_kappa: 0.9262\n\nEpoch 00008: ReduceLROnPlateau reducing learning rate to 4.999999873689376e-05.\nEpoch 9/40\n183/183 [==============================] - 89s 486ms/step - loss: 0.4127 - acc: 0.8477 - kappa: 0.9442 - val_loss: 0.4522 - val_acc: 0.8187 - val_kappa: 0.9232\nEpoch 10/40\n183/183 [==============================] - 91s 498ms/step - loss: 0.4236 - acc: 0.8498 - kappa: 0.9455 - val_loss: 0.4969 - val_acc: 0.8173 - val_kappa: 0.9069\nEpoch 11/40\n183/183 [==============================] - 92s 503ms/step - loss: 0.3767 - acc: 0.8562 - kappa: 0.9504 - val_loss: 0.5195 - val_acc: 0.7950 - val_kappa: 0.8966\nEpoch 12/40\n183/183 [==============================] - 93s 509ms/step - loss: 0.3427 - acc: 0.8696 - kappa: 0.9628 - val_loss: 0.5767 - val_acc: 0.8131 - val_kappa: 0.9236\n\nEpoch 00012: ReduceLROnPlateau reducing learning rate to 2.499999936844688e-05.\nEpoch 13/40\n183/183 [==============================] - 92s 505ms/step - loss: 0.2877 - acc: 0.8839 - kappa: 0.9645 - val_loss: 0.4223 - val_acc: 0.8424 - val_kappa: 0.9401\nEpoch 14/40\n183/183 [==============================] - 93s 510ms/step - loss: 0.2880 - acc: 0.8910 - kappa: 0.9704 - val_loss: 0.4906 - val_acc: 0.8103 - val_kappa: 0.9350\nEpoch 15/40\n183/183 [==============================] - 92s 505ms/step - loss: 0.2696 - acc: 0.9003 - kappa: 0.9719 - val_loss: 0.4484 - val_acc: 0.8271 - val_kappa: 0.9320\nEpoch 16/40\n183/183 [==============================] - 93s 509ms/step - loss: 0.2698 - acc: 0.8996 - kappa: 0.9774 - val_loss: 0.4540 - val_acc: 0.8229 - val_kappa: 0.9406\n\nEpoch 00016: ReduceLROnPlateau reducing learning rate to 1.249999968422344e-05.\nEpoch 17/40\n183/183 [==============================] - 92s 504ms/step - loss: 0.2323 - acc: 0.9197 - kappa: 0.9798 - val_loss: 0.5455 - val_acc: 0.7894 - val_kappa: 0.8988\nEpoch 18/40\n183/183 [==============================] - 94s 515ms/step - loss: 0.2399 - acc: 0.9132 - kappa: 0.9767 - val_loss: 0.4185 - val_acc: 0.8508 - val_kappa: 0.9487\nEpoch 19/40\n183/183 [==============================] - 93s 507ms/step - loss: 0.2322 - acc: 0.9157 - kappa: 0.9791 - val_loss: 0.5034 - val_acc: 0.8061 - val_kappa: 0.9174\nEpoch 
20/40\n183/183 [==============================] - 93s 508ms/step - loss: 0.2174 - acc: 0.9167 - kappa: 0.9826 - val_loss: 0.4698 - val_acc: 0.8452 - val_kappa: 0.9419\nEpoch 21/40\n183/183 [==============================] - 93s 507ms/step - loss: 0.2468 - acc: 0.9157 - kappa: 0.9800 - val_loss: 0.5091 - val_acc: 0.8131 - val_kappa: 0.9259\n\nEpoch 00021: ReduceLROnPlateau reducing learning rate to 6.24999984211172e-06.\nEpoch 22/40\n183/183 [==============================] - 92s 501ms/step - loss: 0.1998 - acc: 0.9276 - kappa: 0.9841 - val_loss: 0.4864 - val_acc: 0.8285 - val_kappa: 0.9446\nEpoch 23/40\n183/183 [==============================] - 93s 507ms/step - loss: 0.2131 - acc: 0.9232 - kappa: 0.9844 - val_loss: 0.4938 - val_acc: 0.8173 - val_kappa: 0.9299\nRestoring model weights from the end of the best epoch\nEpoch 00023: early stopping\n"
]
],
[
[
"# Model loss graph ",
"_____no_output_____"
]
],
[
[
"sns.set_style(\"whitegrid\")\nfig, (ax1, ax2, ax3) = plt.subplots(3, 1, sharex='col', figsize=(20, 18))\n\nax1.plot(history['loss'], label='Train loss')\nax1.plot(history['val_loss'], label='Validation loss')\nax1.legend(loc='best')\nax1.set_title('Loss')\n\nax2.plot(history['acc'], label='Train accuracy')\nax2.plot(history['val_acc'], label='Validation accuracy')\nax2.legend(loc='best')\nax2.set_title('Accuracy')\n\nax3.plot(history['kappa'], label='Train kappa')\nax3.plot(history['val_kappa'], label='Validation kappa')\nax3.legend(loc='best')\nax3.set_title('Kappa')\n\nplt.xlabel('Epochs')\nsns.despine()\nplt.show()",
"_____no_output_____"
],
[
"# Create empty arays to keep the predictions and labels\nlastFullTrainPred = np.empty((0, N_CLASSES))\nlastFullTrainLabels = np.empty((0, N_CLASSES))\nlastFullValPred = np.empty((0, N_CLASSES))\nlastFullValLabels = np.empty((0, N_CLASSES))\n\n# Add train predictions and labels\nfor i in range(STEP_SIZE_TRAIN+1):\n im, lbl = next(train_generator)\n scores = model.predict(im, batch_size=train_generator.batch_size)\n lastFullTrainPred = np.append(lastFullTrainPred, scores, axis=0)\n lastFullTrainLabels = np.append(lastFullTrainLabels, lbl, axis=0)\n\n# Add validation predictions and labels\nfor i in range(STEP_SIZE_VALID+1):\n im, lbl = next(valid_generator)\n scores = model.predict(im, batch_size=valid_generator.batch_size)\n lastFullValPred = np.append(lastFullValPred, scores, axis=0)\n lastFullValLabels = np.append(lastFullValLabels, lbl, axis=0)\n\nlastFullComPred = np.concatenate((lastFullTrainPred, lastFullValPred))\nlastFullComLabels = np.concatenate((lastFullTrainLabels, lastFullValLabels))\n\ntrain_preds = [np.argmax(pred) for pred in lastFullTrainPred]\ntrain_labels = [np.argmax(label) for label in lastFullTrainLabels]\nvalidation_preds = [np.argmax(pred) for pred in lastFullValPred]\nvalidation_labels = [np.argmax(label) for label in lastFullValLabels]\ncomplete_labels = [np.argmax(label) for label in lastFullComLabels]",
"_____no_output_____"
]
],
[
[
"# Model Evaluation",
"_____no_output_____"
],
[
"## Confusion Matrix\n\n### Original thresholds",
"_____no_output_____"
]
],
[
[
"labels = ['0 - No DR', '1 - Mild', '2 - Moderate', '3 - Severe', '4 - Proliferative DR']\ndef plot_confusion_matrix(train, validation, labels=labels):\n train_labels, train_preds = train\n validation_labels, validation_preds = validation\n fig, (ax1, ax2) = plt.subplots(1, 2, sharex='col', figsize=(24, 7))\n train_cnf_matrix = confusion_matrix(train_labels, train_preds)\n validation_cnf_matrix = confusion_matrix(validation_labels, validation_preds)\n\n train_cnf_matrix_norm = train_cnf_matrix.astype('float') / train_cnf_matrix.sum(axis=1)[:, np.newaxis]\n validation_cnf_matrix_norm = validation_cnf_matrix.astype('float') / validation_cnf_matrix.sum(axis=1)[:, np.newaxis]\n\n train_df_cm = pd.DataFrame(train_cnf_matrix_norm, index=labels, columns=labels)\n validation_df_cm = pd.DataFrame(validation_cnf_matrix_norm, index=labels, columns=labels)\n\n sns.heatmap(train_df_cm, annot=True, fmt='.2f', cmap=\"Blues\",ax=ax1).set_title('Train')\n sns.heatmap(validation_df_cm, annot=True, fmt='.2f', cmap=sns.cubehelix_palette(8),ax=ax2).set_title('Validation')\n plt.show()\n\nplot_confusion_matrix((train_labels, train_preds), (validation_labels, validation_preds))",
"_____no_output_____"
]
],
[
[
"## Quadratic Weighted Kappa",
"_____no_output_____"
]
],
[
[
"def evaluate_model(train, validation):\n train_labels, train_preds = train\n validation_labels, validation_preds = validation\n print(\"Train Cohen Kappa score: %.3f\" % cohen_kappa_score(train_preds, train_labels, weights='quadratic'))\n print(\"Validation Cohen Kappa score: %.3f\" % cohen_kappa_score(validation_preds, validation_labels, weights='quadratic'))\n print(\"Complete set Cohen Kappa score: %.3f\" % cohen_kappa_score(train_preds+validation_preds, train_labels+validation_labels, weights='quadratic'))\n \nevaluate_model((train_preds, train_labels), (validation_preds, validation_labels))",
"Train Cohen Kappa score: 0.962\nValidation Cohen Kappa score: 0.900\nComplete set Cohen Kappa score: 0.950\n"
]
],
[
[
"## Apply model to test set and output predictions",
"_____no_output_____"
]
],
[
[
"step_size = test_generator.n//test_generator.batch_size\ntest_generator.reset()\npreds = model.predict_generator(test_generator, steps=step_size)\npredictions = np.argmax(preds, axis=1)\n\nresults = pd.DataFrame({'id_code':test['id_code'], 'diagnosis':predictions})\nresults['id_code'] = results['id_code'].map(lambda x: str(x)[:-4])",
"_____no_output_____"
],
[
"# Cleaning created directories\nif os.path.exists(train_dest_path):\n shutil.rmtree(train_dest_path)\nif os.path.exists(validation_dest_path):\n shutil.rmtree(validation_dest_path)\nif os.path.exists(test_dest_path):\n shutil.rmtree(test_dest_path)",
"_____no_output_____"
]
],
[
[
"# Predictions class distribution",
"_____no_output_____"
]
],
[
[
"fig = plt.subplots(sharex='col', figsize=(24, 8.7))\nsns.countplot(x=\"diagnosis\", data=results, palette=\"GnBu_d\").set_title('Test')\nsns.despine()\nplt.show()",
"_____no_output_____"
],
[
"results.to_csv('submission.csv', index=False)\ndisplay(results.head())",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d0627caae9fc6d352f304825bb063317fa6c3193 | 9,486 | ipynb | Jupyter Notebook | notebooks/computer_vision/raw/tut5.ipynb | guesswhohaha/learntools | c1bd607ade5227f8c8977ff05bf9d04d0a8b7732 | [
"Apache-2.0"
]
| null | null | null | notebooks/computer_vision/raw/tut5.ipynb | guesswhohaha/learntools | c1bd607ade5227f8c8977ff05bf9d04d0a8b7732 | [
"Apache-2.0"
]
| null | null | null | notebooks/computer_vision/raw/tut5.ipynb | guesswhohaha/learntools | c1bd607ade5227f8c8977ff05bf9d04d0a8b7732 | [
"Apache-2.0"
]
| null | null | null | 37.346457 | 517 | 0.598461 | [
[
[
"<!--TITLE:Custom Convnets-->\n# Introduction #\n\nNow that you've seen the layers a convnet uses to extract features, it's time to put them together and build a network of your own!\n\n# Simple to Refined #\n\nIn the last three lessons, we saw how convolutional networks perform **feature extraction** through three operations: **filter**, **detect**, and **condense**. A single round of feature extraction can only extract relatively simple features from an image, things like simple lines or contrasts. These are too simple to solve most classification problems. Instead, convnets will repeat this extraction over and over, so that the features become more complex and refined as they travel deeper into the network.\n\n<figure>\n<img src=\"https://i.imgur.com/VqmC1rm.png\" alt=\"Features extracted from an image of a car, from simple to refined.\" width=800>\n</figure>\n\n# Convolutional Blocks #\n\nIt does this by passing them through long chains of **convolutional blocks** which perform this extraction.\n\n<figure>\n<img src=\"https://i.imgur.com/pr8VwCZ.png\" width=\"400\" alt=\"Extraction as a sequence of blocks.\">\n</figure>\n\nThese convolutional blocks are stacks of `Conv2D` and `MaxPool2D` layers, whose role in feature extraction we learned about in the last few lessons.\n\n<figure>\n<!-- <img src=\"./images/2-block-crp.png\" width=\"400\" alt=\"A kind of extraction block: convolution, ReLU, pooling.\"> -->\n<img src=\"https://i.imgur.com/8D6IhEw.png\" width=\"400\" alt=\"A kind of extraction block: convolution, ReLU, pooling.\">\n</figure>\n\nEach block represents a round of extraction, and by composing these blocks the convnet can combine and recombine the features produced, growing them and shaping them to better fit the problem at hand. The deep structure of modern convnets is what allows this sophisticated feature engineering and has been largely responsible for their superior performance.\n\n# Example - Design a Convnet #\n\nLet's see how to define a deep convolutional network capable of engineering complex features. In this example, we'll create a Keras `Sequence` model and then train it on our Cars dataset.\n\n## Step 1 - Load Data ##\n\nThis hidden cell loads the data.",
"_____no_output_____"
]
],
[
[
"#$HIDE_INPUT$\n# Imports\nimport os, warnings\nimport matplotlib.pyplot as plt\nfrom matplotlib import gridspec\n\nimport numpy as np\nimport tensorflow as tf\nfrom tensorflow.keras.preprocessing import image_dataset_from_directory\n\n# Reproducability\ndef set_seed(seed=31415):\n np.random.seed(seed)\n tf.random.set_seed(seed)\n os.environ['PYTHONHASHSEED'] = str(seed)\n os.environ['TF_DETERMINISTIC_OPS'] = '1'\nset_seed()\n\n# Set Matplotlib defaults\nplt.rc('figure', autolayout=True)\nplt.rc('axes', labelweight='bold', labelsize='large',\n titleweight='bold', titlesize=18, titlepad=10)\nplt.rc('image', cmap='magma')\nwarnings.filterwarnings(\"ignore\") # to clean up output cells\n\n\n# Load training and validation sets\nds_train_ = image_dataset_from_directory(\n '../input/car-or-truck/train',\n labels='inferred',\n label_mode='binary',\n image_size=[128, 128],\n interpolation='nearest',\n batch_size=64,\n shuffle=True,\n)\nds_valid_ = image_dataset_from_directory(\n '../input/car-or-truck/valid',\n labels='inferred',\n label_mode='binary',\n image_size=[128, 128],\n interpolation='nearest',\n batch_size=64,\n shuffle=False,\n)\n\n# Data Pipeline\ndef convert_to_float(image, label):\n image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n return image, label\n\nAUTOTUNE = tf.data.experimental.AUTOTUNE\nds_train = (\n ds_train_\n .map(convert_to_float)\n .cache()\n .prefetch(buffer_size=AUTOTUNE)\n)\nds_valid = (\n ds_valid_\n .map(convert_to_float)\n .cache()\n .prefetch(buffer_size=AUTOTUNE)\n)\n",
"_____no_output_____"
]
],
[
[
"## Step 2 - Define Model ##\n\nHere is a diagram of the model we'll use:\n\n<figure>\n<!-- <img src=\"./images/2-convmodel-1.png\" width=\"200\" alt=\"Diagram of a convolutional model.\"> -->\n<img src=\"https://i.imgur.com/U1VdoDJ.png\" width=\"250\" alt=\"Diagram of a convolutional model.\">\n</figure>\n\nNow we'll define the model. See how our model consists of three blocks of `Conv2D` and `MaxPool2D` layers (the base) followed by a head of `Dense` layers. We can translate this diagram more or less directly into a Keras `Sequential` model just by filling in the appropriate parameters.",
"_____no_output_____"
]
],
[
[
"import tensorflow.keras as keras\nimport tensorflow.keras.layers as layers\n\nmodel = keras.Sequential([\n\n # First Convolutional Block\n layers.Conv2D(filters=32, kernel_size=5, activation=\"relu\", padding='same',\n # give the input dimensions in the first layer\n # [height, width, color channels(RGB)]\n input_shape=[128, 128, 3]),\n layers.MaxPool2D(),\n\n # Second Convolutional Block\n layers.Conv2D(filters=64, kernel_size=3, activation=\"relu\", padding='same'),\n layers.MaxPool2D(),\n\n # Third Convolutional Block\n layers.Conv2D(filters=128, kernel_size=3, activation=\"relu\", padding='same'),\n layers.MaxPool2D(),\n\n # Classifier Head\n layers.Flatten(),\n layers.Dense(units=6, activation=\"relu\"),\n layers.Dense(units=1, activation=\"sigmoid\"),\n])\nmodel.summary()",
"_____no_output_____"
]
],
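[
[
"The repeated `Conv2D` + `MaxPool2D` pattern in the base above could be factored into a small helper. A minimal sketch -- the helper name and the default `kernel_size` are our own choices for illustration, not part of the tutorial:",
"_____no_output_____"
],
[
"import tensorflow.keras.layers as layers\n\ndef conv_block(filters, kernel_size=3):\n    # One round of extraction: filter (Conv2D), detect (ReLU), condense (MaxPool2D)\n    return [\n        layers.Conv2D(filters, kernel_size, activation='relu', padding='same'),\n        layers.MaxPool2D(),\n    ]\n\n# e.g. keras.Sequential([..., *conv_block(32), *conv_block(64), ...])",
"_____no_output_____"
]
],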
[
[
"Notice in this definition is how the number of filters doubled block-by-block: 64, 128, 256. This is a common pattern. Since the `MaxPool2D` layer is reducing the *size* of the feature maps, we can afford to increase the *quantity* we create.\n\n## Step 3 - Train ##\n\nWe can train this model just like the model from Lesson 1: compile it with an optimizer along with a loss and metric appropriate for binary classification.",
"_____no_output_____"
]
],
[
[
"model.compile(\n optimizer=tf.keras.optimizers.Adam(epsilon=0.01),\n loss='binary_crossentropy',\n metrics=['binary_accuracy']\n)\n\nhistory = model.fit(\n ds_train,\n validation_data=ds_valid,\n epochs=40,\n)\n",
"_____no_output_____"
],
[
"import pandas as pd\n\nhistory_frame = pd.DataFrame(history.history)\nhistory_frame.loc[:, ['loss', 'val_loss']].plot()\nhistory_frame.loc[:, ['binary_accuracy', 'val_binary_accuracy']].plot();",
"_____no_output_____"
]
],
[
[
"This model is much smaller than the VGG16 model from Lesson 1 -- only 3 convolutional layers versus the 16 of VGG16. It was nevertheless able to fit this dataset fairly well. We might still be able to improve this simple model by adding more convolutional layers, hoping to create features better adapted to the dataset. This is what we'll try in the exercises.\n\n# Conclusion #\n\nIn this tutorial, you saw how to build a custom convnet composed of many **convolutional blocks** and capable of complex feature engineering. \n\n# Your Turn #\n\nIn the exercises, you'll create a convnet that performs as well on this problem as VGG16 does -- without pretraining! [**Try it now!**](#$NEXT_NOTEBOOK_URL$)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
]
|
d0627ce38b47f49d40a787be57156a5c935c8209 | 5,818 | ipynb | Jupyter Notebook | 101notebook/ipython-rel-2.1.0-examples/Notebook/Raw Input.ipynb | OpenBookProjects/ipynb | 72a28109e8e30aea0b9c6713e78821e4affa2e33 | [
"MIT"
]
| 6 | 2015-06-08T12:50:14.000Z | 2018-11-20T10:05:01.000Z | 101notebook/ipython-rel-2.1.0-examples/Notebook/Raw Input.ipynb | OpenBookProjects/ipynb | 72a28109e8e30aea0b9c6713e78821e4affa2e33 | [
"MIT"
]
| null | null | null | 101notebook/ipython-rel-2.1.0-examples/Notebook/Raw Input.ipynb | OpenBookProjects/ipynb | 72a28109e8e30aea0b9c6713e78821e4affa2e33 | [
"MIT"
]
| 8 | 2016-01-26T14:12:50.000Z | 2021-02-20T14:24:09.000Z | 29.683673 | 762 | 0.508594 | [
[
[
"empty"
]
]
]
| [
"empty"
]
| [
[
"empty"
]
]
|
d06280bd27aa1ca8f8e3c4b7aae0d4c197c9d83e | 2,356 | ipynb | Jupyter Notebook | 11_Face_Detection.ipynb | EliasPapachristos/Computer_Vision_with_OpenCV | 05af3c6161bd446f7df81ad190e732b1c5c6eb42 | [
"Apache-2.0"
]
| 9 | 2020-05-01T10:28:55.000Z | 2021-04-15T15:58:00.000Z | 11_Face_Detection.ipynb | EliasPapachristos/Computer_Vision_with_OpenCV | 05af3c6161bd446f7df81ad190e732b1c5c6eb42 | [
"Apache-2.0"
]
| null | null | null | 11_Face_Detection.ipynb | EliasPapachristos/Computer_Vision_with_OpenCV | 05af3c6161bd446f7df81ad190e732b1c5c6eb42 | [
"Apache-2.0"
]
| 7 | 2020-06-11T18:09:25.000Z | 2020-12-11T09:35:03.000Z | 20.666667 | 122 | 0.48854 | [
[
[
"import numpy as np\nimport cv2 \nimport matplotlib.pyplot as plt\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"### Cascade Files\nOpenCV comes with these pre-trained cascade files, we've relocated the .xml files for you in our own DATA folder.\n\n### Face Detectionยถ",
"_____no_output_____"
]
],
[
[
"face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')",
"_____no_output_____"
],
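[
"# Note: if the .xml file isn't in your working directory, pip-installed\n# opencv-python also bundles the cascades under cv2.data.haarcascades.\n# (Assumption: this attribute ships with the opencv-python wheel, but not\n# necessarily with every OpenCV build.)\ncascade_path = cv2.data.haarcascades + 'haarcascade_frontalface_default.xml'\nface_cascade = cv2.CascadeClassifier(cascade_path)\nassert not face_cascade.empty(), 'cascade failed to load'",
"_____no_output_____"
],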
[
"def detect_face(img):\n \n \n face_img = img.copy()\n \n face_rects = face_cascade.detectMultiScale(face_img) \n \n for (x, y, w, h) in face_rects: \n cv2.rectangle(face_img, (x, y), (x + w, y + h), (255, 255, 255), 10) \n \n return face_img",
"_____no_output_____"
]
],
[
[
"### Conjunction with Video\n",
"_____no_output_____"
]
],
[
[
"cap = cv2.VideoCapture(0) \n\nwhile True: \n \n ret, frame = cap.read(0) \n \n frame = detect_face(frame)\n \n cv2.imshow('Video Face Detection', frame) \n \n c = cv2.waitKey(1) \n if c == 27: \n break \n \ncap.release() \ncv2.destroyAllWindows()",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d062889d977111401f6d85b9721c2780a97ec009 | 58,900 | ipynb | Jupyter Notebook | 3-object-tracking-and-localization/activities/8-vehicle-motion-and-calculus/Looking up Trig Ratios.ipynb | S1lv10Fr4gn4n1/udacity-cv | ce7aafc41e2c123396d809042973840ea08b850e | [
"MIT"
]
| null | null | null | 3-object-tracking-and-localization/activities/8-vehicle-motion-and-calculus/Looking up Trig Ratios.ipynb | S1lv10Fr4gn4n1/udacity-cv | ce7aafc41e2c123396d809042973840ea08b850e | [
"MIT"
]
| 3 | 2020-03-24T21:18:48.000Z | 2021-06-08T21:11:14.000Z | 3-object-tracking-and-localization/activities/8-vehicle-motion-and-calculus/Looking up Trig Ratios.ipynb | S1lv10Fr4gn4n1/udacity-cv | ce7aafc41e2c123396d809042973840ea08b850e | [
"MIT"
]
| null | null | null | 198.986486 | 16,876 | 0.907878 | [
[
[
"# Looking up Trig Ratios\nThere are three ways you could find the value of a trig function at a particular angle.\n\n**1. Use a table** - This is how engineers used to find trig ratios before the days of computers. For example, from the table below I can see that $\\sin(60)=0.866$\n\n| angle | sin | cos | tan |\n| :---: | :---: | :---: | :---: |\n| 0 | 0.000 | 1.000 | 0.000 |\n| 10 | 0.174 | 0.985 | 0.176 |\n| 20 | 0.342 | 0.940 | 0.364 |\n| 30 | 0.500 | 0.866 | 0.577 |\n| 40 | 0.643 | 0.766 | 0.839 |\n| 50 | 0.766 | 0.643 | 1.192 |\n| 60 | 0.866 | 0.500 | 1.732 |\n| 70 | 0.940 | 0.342 | 2.747 |\n| 80 | 0.985 | 0.174 | 5.671 |\n\nThe problem with this technique is that there will always be gaps in a table. \n\n**2. Use a graph** - One way to try to fill these gaps is by consulting a graph of a trigonometric function. For example, the image below shows a plot of $\\sin(\\theta)$ for $0 \\leq \\theta \\leq 360$\n\n\n\nThese graphs are nice because they give a good visual sense for how these ratios behave, but they aren't great for getting accurate values. Which leads us to the **best** way to look up trig ratios...\n\n**3. Use a computer!** This probably isn't a surprise, but python has built in functions to calculate sine, cosine, and tangent... \n\nIn fact, you can even type \"sin(60 degrees)\" into **Google** and you'll get the correct answer!\n\n\n\nNote how I wrote in \"sin(60 degrees)\" instead of just \"sin(60)\". That's because these functions generally expect their input to be in **radians**. \n\nNow let's calculate these ratios with Python.",
"_____no_output_____"
]
],
[
[
"# Python's math module has functions called sin, cos, and tan\n# as well as the constant \"pi\" (which we will find useful shortly)\nfrom math import sin, cos, tan, pi\n\n# Run this cell. What do you expect the output to be?\nprint(sin(60))",
"-0.3048106211022167\n"
]
],
[
[
"Did the output match what you expected?\n\nIf not, it's probably because we didn't convert our angle to radians. \n\n### EXERCISE 1 - Write a function that converts degrees to radians\n\nImplement the following math in code:\n\n$$\\theta_{\\text{radians}} = \\theta_{\\text{degrees}} \\times \\frac{\\pi}{180}$$\n",
"_____no_output_____"
]
],
[
[
"from math import pi\ndef deg2rad(theta):\n \"\"\"Converts degrees to radians\"\"\"\n return theta * (pi/180)\n # TODO - implement this function (solution\n # code at end of notebook)\n\nassert(deg2rad(45.0) == pi / 4)\nassert(deg2rad(90.0) == pi / 2)\nprint(\"Nice work! Your degrees to radians function works!\")\n\nfor theta in [0, 30, 45, 60, 90]:\n theta_rad = deg2rad(theta)\n sin_theta = sin(theta_rad)\n print(\"sin(\", theta, \"degrees) =\", sin_theta)",
"Nice work! Your degrees to radians function works!\nsin( 0 degrees) = 0.0\nsin( 30 degrees) = 0.49999999999999994\nsin( 45 degrees) = 0.7071067811865475\nsin( 60 degrees) = 0.8660254037844386\nsin( 90 degrees) = 1.0\n"
]
],
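[
[
"For reference, Python's standard library already provides this conversion as `math.radians` (with `math.degrees` as the inverse), so in practice you rarely need to write the formula by hand:",
"_____no_output_____"
],
[
"from math import radians, degrees, pi, sin\n\n# math.radians performs the same theta * (pi / 180) conversion as deg2rad\nassert radians(45.0) == deg2rad(45.0)\n\nprint(degrees(pi))        # 180.0 (up to float rounding)\nprint(sin(radians(60)))   # ~0.866, matching the table above",
"_____no_output_____"
]
],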
[
[
"### EXERCISE 2 - Make plots of cosine and tangent",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom matplotlib import pyplot as plt\ndef plot_sine(min_theta, max_theta):\n \"\"\"\n Generates a plot of sin(theta) between min_theta\n and max_theta (both of which are specified in degrees).\n \"\"\"\n angles_degrees = np.linspace(min_theta, max_theta)\n angles_radians = deg2rad(angles_degrees)\n values = np.sin(angles_radians)\n X = angles_degrees\n Y = values\n plt.plot(X,Y)\n plt.show()\n \n# EXERCISE 2.1 Implement this! Try not to look at the\n# implementation of plot_sine TOO much...\ndef plot_cosine(min_theta, max_theta):\n \"\"\"\n Generates a plot of sin(theta) between min_theta\n and max_theta (both of which are specified in degrees).\n \"\"\"\n angles_degrees = np.linspace(min_theta, max_theta)\n angles_radians = deg2rad(angles_degrees)\n values = np.cos(angles_radians)\n X = angles_degrees\n Y = values\n plt.plot(X,Y)\n plt.show()",
"_____no_output_____"
],
[
"plot_sine(0, 360)",
"_____no_output_____"
],
[
"plot_cosine(0, 360)",
"_____no_output_____"
],
[
"#\n\n#\n\n#\n\n#\n\n# SOLUTION CODE\n\n#\n\n#\n\n#\n\n#\nfrom math import pi\ndef deg2rad_solution(theta):\n \"\"\"Converts degrees to radians\"\"\"\n return theta * pi / 180\n\nassert(deg2rad_solution(45.0) == pi / 4)\nassert(deg2rad_solution(90.0) == pi / 2)\n\nimport numpy as np\nfrom matplotlib import pyplot as plt\ndef plot_cosine_solution(min_theta, max_theta):\n \"\"\"\n Generates a plot of sin(theta) between min_theta\n and max_theta (both of which are specified in degrees).\n \"\"\"\n angles_degrees = np.linspace(min_theta, max_theta)\n angles_radians = deg2rad_solution(angles_degrees)\n values = np.cos(angles_radians)\n X = angles_degrees\n Y = values\n plt.plot(X,Y)\n plt.show()",
"_____no_output_____"
],
[
"plot_cosine_solution(0, 360)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d0629a451407335fb731206a6bb49b75cb94e08a | 32,217 | ipynb | Jupyter Notebook | gs_quant/content/events/00_virtual_event/0003_trades.ipynb | KabbalahOracle/gs-quant | e4daa30654d8e4757c84f8836b5c1e22f39e7174 | [
"Apache-2.0"
]
| 1 | 2020-11-04T21:21:45.000Z | 2020-11-04T21:21:45.000Z | gs_quant/content/events/00_virtual_event/0003_trades.ipynb | KabbalahOracle/gs-quant | e4daa30654d8e4757c84f8836b5c1e22f39e7174 | [
"Apache-2.0"
]
| null | null | null | gs_quant/content/events/00_virtual_event/0003_trades.ipynb | KabbalahOracle/gs-quant | e4daa30654d8e4757c84f8836b5c1e22f39e7174 | [
"Apache-2.0"
]
| null | null | null | 70.806593 | 1,918 | 0.701555 | [
[
[
"from gs_quant.data import Dataset\nfrom gs_quant.markets.securities import Asset, AssetIdentifier, SecurityMaster\nfrom gs_quant.timeseries import *\nfrom gs_quant.target.instrument import FXOption, IRSwaption\nfrom gs_quant.markets import PricingContext, HistoricalPricingContext, BackToTheFuturePricingContext\nfrom gs_quant.risk import CarryScenario, MarketDataPattern, MarketDataShock, MarketDataShockBasedScenario, MarketDataShockType, CurveScenario,CarryScenario\nfrom gs_quant.markets.portfolio import Portfolio\nfrom gs_quant.risk import IRAnnualImpliedVol\nfrom gs_quant.timeseries import percentiles\nfrom gs_quant.datetime import business_day_offset\nimport seaborn as sns\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom scipy import stats\nimport warnings\nfrom datetime import date\nwarnings.filterwarnings('ignore')\nsns.set(style=\"darkgrid\", color_codes=True)",
"_____no_output_____"
],
[
"from gs_quant.session import GsSession\n# external users should substitute their client id and secret; please skip this step if using internal jupyterhub\nGsSession.use(client_id=None, client_secret=None, scopes=('run_analytics',)) ",
"_____no_output_____"
]
],
[
[
"In this notebook, we'll look at entry points for G10 vol, look for crosses with the largest downside sensivity to SPX, indicatively price several structures and analyze their carry profile.\n\n* [1: FX entry point vs richness](#1:-FX-entry-point-vs-richness)\n* [2: Downside sensitivity to SPX](#2:-Downside-sensitivity-to-SPX)\n* [3: AUDJPY conditional relationship with SPX](#3:-AUDJPY-conditional-relationship-with-SPX)\n* [4: Price structures](#4:-Price-structures)\n* [5: Analyse rates package](#5:-Analyse-rates-package)",
"_____no_output_____"
],
[
"### 1: FX entry point vs richness\nLet's pull [GS FX Spot](https://marquee.gs.com/s/developer/datasets/FXSPOT_PREMIUM) and [GS FX Implied Volatility](https://marquee.gs.com/s/developer/datasets/FXIMPLIEDVOL_PREMIUM) and look at implied vs realized vol as well as current implied level as percentile relative to the last 2 years.",
"_____no_output_____"
]
],
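[
[
"For readers without gs_quant access, the 63-day realized vol used below can be approximated with plain pandas. A sketch under stated assumptions: `spot` is a pandas Series of daily spots, and the 252-day annualization and percent scaling are common market conventions rather than values taken from the dataset, so levels may differ slightly from the library's `volatility`:",
"_____no_output_____"
],
[
"import numpy as np\nimport pandas as pd\n\ndef realized_vol(spot, window=63):\n    # Annualized rolling realized volatility of daily log returns, in percent\n    log_rets = np.log(spot / spot.shift(1))\n    return log_rets.rolling(window).std() * np.sqrt(252) * 100",
"_____no_output_____"
]
],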
[
[
"def format_df(data_dict):\n df = pd.concat(data_dict, axis=1)\n df.columns = data_dict.keys()\n return df.fillna(method='ffill').dropna()",
"_____no_output_____"
],
[
"g10 = ['USDJPY', 'EURUSD', 'AUDUSD', 'GBPUSD', 'USDCAD', 'USDNOK', 'NZDUSD', 'USDSEK', 'USDCHF', 'AUDJPY']\nstart_date = date(2005, 8, 26)\nend_date = business_day_offset(date.today(), -1, roll='preceding')\nfxspot_dataset, fxvol_dataset = Dataset('FXSPOT_PREMIUM'), Dataset('FXIMPLIEDVOL_PREMIUM')\n\nspot_data, impvol_data, spot_fx = {}, {}, {}\nfor cross in g10:\n spot = fxspot_dataset.get_data(start_date, end_date, bbid=cross)[['spot']].drop_duplicates(keep='last')\n spot_fx[cross] = spot['spot']\n spot_data[cross] = volatility(spot['spot'], 63) # realized vol \n vol = fxvol_dataset.get_data(start_date, end_date, bbid=cross, tenor='3m', deltaStrike='DN', location='NYC')[['impliedVolatility']]\n impvol_data[cross] = vol.drop_duplicates(keep='last') * 100\n\nspdata, ivdata = format_df(spot_data), format_df(impvol_data)\ndiff = ivdata.subtract(spdata).dropna()",
"_____no_output_____"
],
[
"_slice = ivdata['2018-09-01': '2020-09-08']\npct_rank = {}\nfor x in _slice.columns:\n pct = percentiles(_slice[x])\n pct_rank[x] = pct.iloc[-1]\n\nfor fx in pct_rank:\n plt.scatter(pct_rank[fx], diff[fx]['2020-09-08'])\n plt.legend(pct_rank.keys(),loc='best', bbox_to_anchor=(0.9, -0.13), ncol=3)\n \nplt.xlabel('Percentile of Current Implied Vol')\nplt.ylabel('Implied vs Realized Vol')\nplt.title('Entry Point vs Richness')\nplt.show()",
"_____no_output_____"
]
],
[
[
"### 2: Downside sensitivity to SPX\n\nLet's now look at beta and correlation with SPX across G10.",
"_____no_output_____"
]
],
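[
[
"The rolling beta computed below is conceptually Cov(cross returns, SPX returns) / Var(SPX returns) over the window. A plain-pandas sketch for readers who want to reproduce it without gs_quant -- note it uses simple returns, so levels may differ slightly from the library's implementation:",
"_____no_output_____"
],
[
"import pandas as pd\n\ndef rolling_beta(y, x, window=84):\n    # Rolling OLS beta of y's returns on x's returns\n    ry, rx = y.pct_change(), x.pct_change()\n    return ry.rolling(window).cov(rx) / rx.rolling(window).var()",
"_____no_output_____"
]
],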
[
[
"spx_spot = Dataset('TREOD').get_data(start_date, end_date, bbid='SPX')[['closePrice']]\nspx_spot = spx_spot.fillna(method='ffill').dropna()\ndf = pd.DataFrame(spx_spot)\n\n#FX Spot data\nfx_spots = format_df(spot_fx)\ndata = pd.concat([spx_spot, fx_spots], axis=1).dropna()\ndata.columns = ['SPX'] + g10",
"_____no_output_____"
],
[
"beta_spx, corr_spx = {}, {}\n\n#calculate rolling 84d or 4m beta to S&P\nfor cross in g10:\n beta_spx[cross] = beta(data[cross],data['SPX'], 84)\n corr_spx[cross] = correlation(data['SPX'], data[cross], 84)\n\nfig, axs = plt.subplots(5, 2, figsize=(18, 20))\nfor j in range(2):\n for i in range(5):\n color='tab:blue'\n axs[i,j].plot(beta_spx[g10[i + j*5]], color=color)\n axs[i,j].set_title(g10[i + j*5])\n color='tab:blue'\n axs[i,j].set_ylabel('Beta', color=color)\n axs[i,j].plot(beta_spx[g10[i + j*5]], color=color)\n ax2 = axs[i,j].twinx()\n color = 'tab:orange' \n ax2.plot(corr_spx[g10[i + j*5]], color=color)\n ax2.set_ylabel('Correlation', color=color)\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Part 3: AUDJPY conditional relationship with SPX\n\nLet's focus on AUDJPY and look at its relationship with SPX when SPX is significantly up and down.",
"_____no_output_____"
]
],
[
[
"# resample data to weekly from daily & get weekly returns\nwk_data = data.resample('W-FRI').last()\nrets = returns(wk_data, 1)\nsns.set(style='white', color_codes=True)\nspx_returns = [-.1, -.05, .05, .1]\nr2 = lambda x,y: stats.pearsonr(x,y)[0]**2 \nbetas = pd.DataFrame(index=spx_returns, columns=g10)\nfor ret in spx_returns:\n dns = rets[rets.SPX <= ret].dropna() if ret < 0 else rets[rets.SPX >= ret].dropna() \n j = sns.jointplot(x='SPX', y='AUDJPY', data=dns, kind='reg')\n j.set_axis_labels('SPX with {}% Returns'.format(ret*100), 'AUDJPY')\n j.fig.subplots_adjust(wspace=.02)\n plt.show()",
"_____no_output_____"
]
],
[
[
"Let's use the beta for all S&P returns to price a structure",
"_____no_output_____"
]
],
[
[
"sns.jointplot(x='SPX', y='AUDJPY', data=rets, kind='reg', stat_func=r2)",
"_____no_output_____"
]
],
[
[
"### 4: Price structures \n\n##### Let's now look at a few AUDJPY structures as potential hedges\n\n* Buy 4m AUDJPY put using spx beta to size. Max loss limited to premium paid.\n* Buy 4m AUDJPY put spread (4.2%/10.6% OTMS). Max loss limited to premium paid.\n\nFor more info on this trade, check out our market strats piece [here](https://marquee.gs.com/content/#/article/2020/08/28/gs-marketstrats-audjpy-as-us-election-hedge)",
"_____no_output_____"
]
],
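[
[
"Before pricing, it helps to see why the put spread costs less than the outright put: the sold lower-strike put caps the payoff. An illustrative expiry payoff calculation -- the spot level is made up and premiums are ignored; only the 4.2%/10.6% OTMS strikes come from the structure above:",
"_____no_output_____"
],
[
"# Illustrative payoff at expiry for the 4.2%/10.6% OTMS put spread\nspot0 = 75.0  # hypothetical AUDJPY spot at trade date\nk_long, k_short = spot0 * (1 - 0.042), spot0 * (1 - 0.106)\n\ndef put_spread_payoff(spot_at_expiry):\n    # Long the higher-strike put, short the lower-strike put;\n    # payoff is capped at (k_long - k_short)\n    return max(k_long - spot_at_expiry, 0) - max(k_short - spot_at_expiry, 0)\n\nfor s in (80.0, 72.0, 65.0):\n    print('spot {}: payoff {:.2f}'.format(s, put_spread_payoff(s)))",
"_____no_output_____"
]
],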
[
[
"#buy 4m AUDJPY put\naudjpy_put = FXOption(option_type='Put', pair='AUDJPY', strike_price= 's-4.2%', expiration_date='4m', buy_sell='Buy') \nprint('cost in bps: {:,.2f}'.format(audjpy_put.premium / audjpy_put.notional_amount * 1e4))",
"_____no_output_____"
],
[
"#buy 4m AUDJPY put spread (5.3%/10.6% OTMS)\nfrom gs_quant.markets.portfolio import Portfolio\nput1 = FXOption(option_type='Put', pair='AUDJPY', strike_price= 's-4.2%', expiration_date='4m', buy_sell='Buy')\nput2 = FXOption(option_type='Put', pair='AUDJPY', strike_price= 's-10.6%', expiration_date='4m', buy_sell='Sell')\n\nfx_package = Portfolio((put1, put2))\ncost = put2.premium/put2.notional_amount - put1.premium/put1.notional_amount \nprint('cost in bps: {:,.2f}'.format(cost * 1e4))",
"_____no_output_____"
]
],
[
[
"##### ...And some rates ideas\n\n* Sell straddle. Max loss unlimited.\n* Sell 3m30y straddle, buy 2y30y straddle in a 0 pv package. Max loss unlimited.",
"_____no_output_____"
]
],
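[
[
"The '0 pv package' below works by sizing the bought leg so its premium offsets what the sold leg brings in; gs_quant expresses this with the `'{}/pv'` notional syntax in the next cell. Conceptually, with hypothetical numbers:",
"_____no_output_____"
],
[
"# Conceptual sizing for a premium-neutral (zero-PV) two-leg package:\n# choose the bought leg's notional so its PV offsets the premium received\npv_leg1 = 1_250_000.0  # hypothetical premium received from the sold straddle\nunit_pv_leg2 = 0.025   # hypothetical PV of the bought straddle per unit notional\nnotional_leg2 = pv_leg1 / unit_pv_leg2\nprint('leg 2 notional: {:,.0f}'.format(notional_leg2))",
"_____no_output_____"
]
],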
[
[
"leg = IRSwaption('Straddle', '30y', notional_currency='USD', expiration_date='3m', buy_sell='Sell')\nprint('PV in USD: {:,.2f}'.format(leg.dollar_price()))",
"_____no_output_____"
],
[
"leg1 = IRSwaption('Straddle', '30y', notional_currency='USD', expiration_date='3m', buy_sell='Sell',name='3m30y ATM Straddle')\nleg2 = IRSwaption('Straddle', '30y', notional_currency='USD', expiration_date='2y', notional_amount='{}/pv'.format(leg1.price()), buy_sell='Buy', name = '2y30y ATM Straddle')\n\nrates_package = Portfolio((leg1, leg2))\nrates_package.resolve()\n\nprint('Package cost in USD: {:,.2f}'.format(rates_package.price().aggregate()))\nprint('PV Flat notionals ($$m):', round(leg1.notional_amount/1e6, 1),' by ',round(leg2.notional_amount/1e6, 1))",
"_____no_output_____"
]
],
[
[
"### 5: Analyse rates package",
"_____no_output_____"
]
],
[
[
"dates = pd.bdate_range(date(2020, 6, 8), leg1.expiration_date, freq='5B').date.tolist()\n\nwith BackToTheFuturePricingContext(dates=dates, roll_to_fwds=True):\n future = rates_package.price()\nrates_future = future.result().aggregate()\n\nrates_future.plot(figsize=(10, 6), title='Historical PV and carry for rates package')\n\nprint('PV breakdown between legs:')\nresults = future.result().to_frame()\nresults /= 1e6\nresults.index=[leg1.name,leg2.name]\nresults.loc['Total'] = results.sum()\nresults.round(1)",
"_____no_output_____"
]
],
[
[
"Let's focus on the next 3m and how the calendar carries in different rates shocks.",
"_____no_output_____"
]
],
[
[
"dates = pd.bdate_range(dt.date.today(), leg1.expiration_date, freq='5B').date.tolist()\nshocked_pv = pd.DataFrame(columns=['Base', '5bp per week', '50bp instantaneous'], index=dates)\n\np1, p2, p3 = [], [], []\nwith PricingContext(is_batch=True):\n for t, d in enumerate(dates):\n with CarryScenario(date=d, roll_to_fwds=True):\n p1.append(rates_package.price())\n with MarketDataShockBasedScenario({MarketDataPattern('IR', 'USD'): MarketDataShock(MarketDataShockType.Absolute, t*0.0005)}):\n p2.append(rates_package.price())\n with MarketDataShockBasedScenario({MarketDataPattern('IR', 'USD'): MarketDataShock(MarketDataShockType.Absolute, 0.005)}):\n p3.append(rates_package.price())\n\nshocked_pv.Base = [p.result().aggregate() for p in p1]\nshocked_pv['5bp per week'] = [p.result().aggregate() for p in p2]\nshocked_pv['50bp instantaneous'] = [p.result().aggregate() for p in p3]\n\nshocked_pv/=1e6\nshocked_pv.round(1)\nshocked_pv.plot(figsize=(10, 6), title='Carry + scenario analysis')",
"_____no_output_____"
]
],
[
[
"### Disclaimers\n\nScenarios/predictions: Simulated results are for illustrative purposes only. GS provides no assurance or guarantee that the strategy will operate or would have operated in the past in a manner consistent with the above analysis. Past performance figures are not a reliable indicator of future results.\n\nIndicative Terms/Pricing Levels: This material may contain indicative terms only, including but not limited to pricing levels. There is no representation that any transaction can or could have been effected at such terms or prices. Proposed terms and conditions are for discussion purposes only. Finalized terms and conditions are subject to further discussion and negotiation.\nwww.goldmansachs.com/disclaimer/sales-and-trading-invest-rec-disclosures.html If you are not accessing this material via Marquee ContentStream, a list of the author's investment recommendations disseminated during the preceding 12 months and the proportion of the author's recommendations that are 'buy', 'hold', 'sell' or other over the previous 12 months is available by logging into Marquee ContentStream using the link below. Alternatively, if you do not have access to Marquee ContentStream, please contact your usual GS representative who will be able to provide this information to you.\n\nBacktesting, Simulated Results, Sensitivity/Scenario Analysis or Spreadsheet Calculator or Model: There may be data presented herein that is solely for illustrative purposes and which may include among other things back testing, simulated results and scenario analyses. The information is based upon certain factors, assumptions and historical information that Goldman Sachs may in its discretion have considered appropriate, however, Goldman Sachs provides no assurance or guarantee that this product will operate or would have operated in the past in a manner consistent with these assumptions. In the event any of the assumptions used do not prove to be true, results are likely to vary materially from the examples shown herein. Additionally, the results may not reflect material economic and market factors, such as liquidity, transaction costs and other expenses which could reduce potential return.\n\nOTC Derivatives Risk Disclosures: \nTerms of the Transaction: To understand clearly the terms and conditions of any OTC derivative transaction you may enter into, you should carefully review the Master Agreement, including any related schedules, credit support documents, addenda and exhibits. You should not enter into OTC derivative transactions unless you understand the terms of the transaction you are entering into as well as the nature and extent of your risk exposure. You should also be satisfied that the OTC derivative transaction is appropriate for you in light of your circumstances and financial condition. You may be requested to post margin or collateral to support written OTC derivatives at levels consistent with the internal policies of Goldman Sachs. \n \nLiquidity Risk: There is no public market for OTC derivative transactions and, therefore, it may be difficult or impossible to liquidate an existing position on favorable terms. Transfer Restrictions: OTC derivative transactions entered into with one or more affiliates of The Goldman Sachs Group, Inc. (Goldman Sachs) cannot be assigned or otherwise transferred without its prior written consent and, therefore, it may be impossible for you to transfer any OTC derivative transaction to a third party. 
\n \nConflict of Interests: Goldman Sachs may from time to time be an active participant on both sides of the market for the underlying securities, commodities, futures, options or any other derivative or instrument identical or related to those mentioned herein (together, \"the Product\"). Goldman Sachs at any time may have long or short positions in, or buy and sell Products (on a principal basis or otherwise) identical or related to those mentioned herein. Goldman Sachs hedging and trading activities may affect the value of the Products. \n \nCounterparty Credit Risk: Because Goldman Sachs, may be obligated to make substantial payments to you as a condition of an OTC derivative transaction, you must evaluate the credit risk of doing business with Goldman Sachs or its affiliates. \n \nPricing and Valuation: The price of each OTC derivative transaction is individually negotiated between Goldman Sachs and each counterparty and Goldman Sachs does not represent or warrant that the prices for which it offers OTC derivative transactions are the best prices available, possibly making it difficult for you to establish what is a fair price for a particular OTC derivative transaction; The value or quoted price of the Product at any time, however, will reflect many factors and cannot be predicted. If Goldman Sachs makes a market in the offered Product, the price quoted by Goldman Sachs would reflect any changes in market conditions and other relevant factors, and the quoted price (and the value of the Product that Goldman Sachs will use for account statements or otherwise) could be higher or lower than the original price, and may be higher or lower than the value of the Product as determined by reference to pricing models used by Goldman Sachs. If at any time a third party dealer quotes a price to purchase the Product or otherwise values the Product, that price may be significantly different (higher or lower) than any price quoted by Goldman Sachs. Furthermore, if you sell the Product, you will likely be charged a commission for secondary market transactions, or the price will likely reflect a dealer discount. Goldman Sachs may conduct market making activities in the Product. To the extent Goldman Sachs makes a market, any price quoted for the OTC derivative transactions, Goldman Sachs may differ significantly from (i) their value determined by reference to Goldman Sachs pricing models and (ii) any price quoted by a third party. The market price of the OTC derivative transaction may be influenced by many unpredictable factors, including economic conditions, the creditworthiness of Goldman Sachs, the value of any underlyers, and certain actions taken by Goldman Sachs. \n \nMarket Making, Investing and Lending: Goldman Sachs engages in market making, investing and lending businesses for its own account and the accounts of its affiliates in the same or similar instruments underlying OTC derivative transactions (including such trading as Goldman Sachs deems appropriate in its sole discretion to hedge its market risk in any OTC derivative transaction whether between Goldman Sachs and you or with third parties) and such trading may affect the value of an OTC derivative transaction. 
\n \nEarly Termination Payments: The provisions of an OTC Derivative Transaction may allow for early termination and, in such cases, either you or Goldman Sachs may be required to make a potentially significant termination payment depending upon whether the OTC Derivative Transaction is in-the-money to Goldman Sachs or you at the time of termination. Indexes: Goldman Sachs does not warrant, and takes no responsibility for, the structure, method of computation or publication of any currency exchange rates, interest rates, indexes of such rates, or credit, equity or other indexes, unless Goldman Sachs specifically advises you otherwise.\nRisk Disclosure Regarding futures, options, equity swaps, and other derivatives as well as non-investment-grade securities and ADRs: Please ensure that you have read and understood the current options, futures and security futures disclosure document before entering into any such transactions. Current United States listed options, futures and security futures disclosure documents are available from our sales representatives or at http://www.theocc.com/components/docs/riskstoc.pdf, http://www.goldmansachs.com/disclosures/risk-disclosure-for-futures.pdf and https://www.nfa.futures.org/investors/investor-resources/files/security-futures-disclosure.pdf, respectively. Certain transactions - including those involving futures, options, equity swaps, and other derivatives as well as non-investment-grade securities - give rise to substantial risk and are not available to nor suitable for all investors. If you have any questions about whether you are eligible to enter into these transactions with Goldman Sachs, please contact your sales representative. Foreign-currency-denominated securities are subject to fluctuations in exchange rates that could have an adverse effect on the value or price of, or income derived from, the investment. In addition, investors in securities such as ADRs, the values of which are influenced by foreign currencies, effectively assume currency risk.\nOptions Risk Disclosures: Options may trade at a value other than that which may be inferred from the current levels of interest rates, dividends (if applicable) and the underlier due to other factors including, but not limited to, expectations of future levels of interest rates, future levels of dividends and the volatility of the underlier at any time prior to maturity. Note: Options involve risk and are not suitable for all investors. Please ensure that you have read and understood the current options disclosure document before entering into any standardized options transactions. United States listed options disclosure documents are available from our sales representatives or at http://theocc.com/publications/risks/riskstoc.pdf. A secondary market may not be available for all options. Transaction costs may be a significant factor in option strategies calling for multiple purchases and sales of options, such as spreads. When purchasing long options an investor may lose their entire investment and when selling uncovered options the risk is potentially unlimited. Supporting documentation for any comparisons, recommendations, statistics, technical data, or other similar information will be supplied upon request.\nThis material is for the private information of the recipient only. This material is not sponsored, endorsed, sold or promoted by any sponsor or provider of an index referred herein (each, an \"Index Provider\"). 
GS does not have any affiliation with or control over the Index Providers or any control over the computation, composition or dissemination of the indices. While GS will obtain information from publicly available sources it believes reliable, it will not independently verify this information. Accordingly, GS shall have no liability, contingent or otherwise, to the user or to third parties, for the quality, accuracy, timeliness, continued availability or completeness of the data nor for any special, indirect, incidental or consequential damages which may be incurred or experienced because of the use of the data made available herein, even if GS has been advised of the possibility of such damages.\nStandard & Poor's ® and S&P ® are registered trademarks of The McGraw-Hill Companies, Inc. and S&P GSCI™ is a trademark of The McGraw-Hill Companies, Inc. and have been licensed for use by the Issuer. This Product (the \"Product\") is not sponsored, endorsed, sold or promoted by S&P and S&P makes no representation, warranty or condition regarding the advisability of investing in the Product.\nNotice to Brazilian Investors\nMarquee is not meant for the general public in Brazil. The services or products provided by or through Marquee, at any time, may not be offered or sold to the general public in Brazil. You have received a password granting access to Marquee exclusively due to your existing relationship with a GS business located in Brazil. The selection and engagement with any of the offered services or products through Marquee, at any time, will be carried out directly by you. Before acting to implement any chosen service or products, provided by or through Marquee you should consider, at your sole discretion, whether it is suitable for your particular circumstances and, if necessary, seek professional advice. Any steps necessary in order to implement the chosen service or product, including but not limited to remittance of funds, shall be carried out at your discretion. Accordingly, such services and products have not been and will not be publicly issued, placed, distributed, offered or negotiated in the Brazilian capital markets and, as a result, they have not been and will not be registered with the Brazilian Securities and Exchange Commission (Comissão de Valores Mobiliários), nor have they been submitted to the foregoing agency for approval. Documents relating to such services or products, as well as the information contained therein, may not be supplied to the general public in Brazil, as the offering of such services or products is not a public offering in Brazil, nor used in connection with any offer for subscription or sale of securities to the general public in Brazil.\nThe offer of any securities mentioned in this message may not be made to the general public in Brazil. Accordingly, any such securities have not been nor will they be registered with the Brazilian Securities and Exchange Commission (Comissão de Valores Mobiliários) nor has any offer been submitted to the foregoing agency for approval. Documents relating to the offer, as well as the information contained therein, may not be supplied to the public in Brazil, as the offer is not a public offering of securities in Brazil. 
These terms will apply on every access to Marquee.\nOuvidoria Goldman Sachs Brasil: 0800 727 5764 e/ou [email protected]\nHorário de funcionamento: segunda-feira à sexta-feira (exceto feriados), das 9hs às 18hs.\nOmbudsman Goldman Sachs Brazil: 0800 727 5764 and / or [email protected]\nAvailable Weekdays (except holidays), from 9 am to 6 pm.\n\n",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d062ad7c364d8195bd0661953d59fa2e49a6751f | 16,541 | ipynb | Jupyter Notebook | solutions/gqlalchemy-solutions.ipynb | pyladiesams/graphdatabases-gqlalchemy-beginner-mar2022 | 39b6d1eedfea63177b3e6a124411fdb2341116f5 | [
"MIT"
]
| 4 | 2021-11-28T09:28:06.000Z | 2022-02-23T20:30:47.000Z | solutions/gqlalchemy-solutions.ipynb | pyladiesams/graphdbs-gqlalchemy-beginner-mar2022 | 39b6d1eedfea63177b3e6a124411fdb2341116f5 | [
"MIT"
]
| null | null | null | solutions/gqlalchemy-solutions.ipynb | pyladiesams/graphdbs-gqlalchemy-beginner-mar2022 | 39b6d1eedfea63177b3e6a124411fdb2341116f5 | [
"MIT"
]
| null | null | null | 30.294872 | 445 | 0.517562 | [
[
[
"# ๐ก Solutions\n\nBefore trying out these solutions, please start the [gqlalchemy-workshop notebook](../workshop/gqlalchemy-workshop.ipynb) to import all data. Also, this solutions manual is here to help you out, and it is recommended you try solving the exercises first by yourself.\n\n## Exercise 1\n\n**Find out how many genres there are in the database.**\n\nThe correct Cypher query is:\n\n```\nMATCH (g:Genre)\nRETURN count(g) AS num_of_genres;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:",
"_____no_output_____"
]
],
[
[
"from gqlalchemy import match\n\ntotal_genres = (\n match()\n .node(labels=\"Genre\", variable=\"g\")\n .return_({\"count(g)\": \"num_of_genres\"})\n .execute()\n)\n\nresults = list(total_genres)\nfor result in results:\n print(result[\"num_of_genres\"])",
"22084\n"
]
],
[
[
"## Exercise 2\n\n**Find out to how many genres movie 'Matrix, The (1999)' belongs to.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (:Movie {title: 'Matrix, The (1999)'})-[:OF_GENRE]->(g:Genre)\nRETURN count(g) AS num_of_genres;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:\n",
"_____no_output_____"
]
],
[
[
"matrix = (\n match()\n .node(labels=\"Movie\", variable=\"m\")\n .to(\"OF_GENRE\")\n .node(labels=\"Genre\", variable=\"g\")\n .where(\"m.title\", \"=\", \"Matrix, The (1999)\")\n .return_({\"count(g)\": \"num_of_genres\"})\n .execute()\n)\n\nresults = list(matrix)\n\nfor result in results:\n print(result[\"num_of_genres\"])",
"3\n"
]
],
[
[
"## Exercise 3\n\n**Find out the title of the movies that the user with `id` 1 rated.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (:User {id: 1})-[:RATED]->(m:Movie)\nRETURN m.title;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:",
"_____no_output_____"
]
],
[
[
"movies = (\n match()\n .node(labels=\"User\", variable=\"u\")\n .to(\"RATED\")\n .node(labels=\"Movie\", variable=\"m\")\n .where(\"u.id\", \"=\", 1)\n .return_({\"m.title\": \"movie\"})\n .execute()\n)\n\nresults = list(movies)\n\nfor result in results:\n print(result[\"movie\"])",
"Toy Story (1995)\nGrumpier Old Men (1995)\nHeat (1995)\nSeven (a.k.a. Se7en) (1995)\nUsual Suspects, The (1995)\nFrom Dusk Till Dawn (1996)\nBottle Rocket (1996)\nBraveheart (1995)\nRob Roy (1995)\nCanadian Bacon (1995)\nDesperado (1995)\nBilly Madison (1995)\nClerks (1994)\nDumb & Dumber (Dumb and Dumber) (1994)\nEd Wood (1994)\nStar Wars: Episode IV - A New Hope (1977)\nPulp Fiction (1994)\nStargate (1994)\nTommy Boy (1995)\nClear and Present Danger (1994)\nForrest Gump (1994)\nJungle Book, The (1994)\nMask, The (1994)\nBlown Away (1994)\nDazed and Confused (1993)\nFugitive, The (1993)\nJurassic Park (1993)\nMrs. Doubtfire (1993)\nSchindler's List (1993)\nSo I Married an Axe Murderer (1993)\nThree Musketeers, The (1993)\nTombstone (1993)\nDances with Wolves (1990)\nBatman (1989)\nSilence of the Lambs, The (1991)\nPinocchio (1940)\nFargo (1996)\nMission: Impossible (1996)\nJames and the Giant Peach (1996)\nSpace Jam (1996)\nRock, The (1996)\nTwister (1996)\nIndependence Day (a.k.a. ID4) (1996)\nShe's the One (1996)\nWizard of Oz, The (1939)\nCitizen Kane (1941)\nAdventures of Robin Hood, The (1938)\nGhost and Mrs. Muir, The (1947)\nMr. Smith Goes to Washington (1939)\nEscape to Witch Mountain (1975)\nWinnie the Pooh and the Blustery Day (1968)\nThree Caballeros, The (1945)\nSword in the Stone, The (1963)\nDumbo (1941)\nPete's Dragon (1977)\nBedknobs and Broomsticks (1971)\nAlice in Wonderland (1951)\nThat Thing You Do! (1996)\nGhost and the Darkness, The (1996)\nSwingers (1996)\nWilly Wonka & the Chocolate Factory (1971)\nMonty Python's Life of Brian (1979)\nReservoir Dogs (1992)\nPlatoon (1986)\nBasic Instinct (1992)\nE.T. the Extra-Terrestrial (1982)\nAbyss, The (1989)\nMonty Python and the Holy Grail (1975)\nStar Wars: Episode V - The Empire Strikes Back (1980)\nPrincess Bride, The (1987)\nRaiders of the Lost Ark (Indiana Jones and the Raiders of the Lost Ark) (1981)\nClockwork Orange, A (1971)\nApocalypse Now (1979)\nStar Wars: Episode VI - Return of the Jedi (1983)\nGoodfellas (1990)\nAlien (1979)\nPsycho (1960)\nBlues Brothers, The (1980)\nFull Metal Jacket (1987)\nHenry V (1989)\nQuiet Man, The (1952)\nTerminator, The (1984)\nDuck Soup (1933)\nShining, The (1980)\nGroundhog Day (1993)\nBack to the Future (1985)\nHighlander (1986)\nYoung Frankenstein (1974)\nFantasia (1940)\nIndiana Jones and the Last Crusade (1989)\nPink Floyd: The Wall (1982)\nNosferatu (Nosferatu, eine Symphonie des Grauens) (1922)\nBatman Returns (1992)\nSneakers (1992)\nLast of the Mohicans, The (1992)\nMcHale's Navy (1997)\nBest Men (1997)\nGrosse Pointe Blank (1997)\nAustin Powers: International Man of Mystery (1997)\nCon Air (1997)\nFace/Off (1997)\nMen in Black (a.k.a. MIB) (1997)\nConan the Barbarian (1982)\nL.A. 
Confidential (1997)\nKiss the Girls (1997)\nGame, The (1997)\nI Know What You Did Last Summer (1997)\nStarship Troopers (1997)\nBig Lebowski, The (1998)\nWedding Singer, The (1998)\nWelcome to Woop-Woop (1997)\nNewton Boys, The (1998)\nWild Things (1998)\nSmall Soldiers (1998)\nAll Quiet on the Western Front (1930)\nRocky (1976)\nLabyrinth (1986)\nLethal Weapon (1987)\nGoonies, The (1985)\nBack to the Future Part III (1990)\nBambi (1942)\nSaving Private Ryan (1998)\nBlack Cauldron, The (1985)\nFlight of the Navigator (1986)\nGreat Mouse Detective, The (1986)\nHoney, I Shrunk the Kids (1989)\nNegotiator, The (1998)\nJungle Book, The (1967)\nRescuers, The (1977)\nReturn to Oz (1985)\nRocketeer, The (1991)\nSleeping Beauty (1959)\nSong of the South (1946)\nTron (1982)\nIndiana Jones and the Temple of Doom (1984)\nLord of the Rings, The (1978)\nCharlotte's Web (1973)\nSecret of NIMH, The (1982)\nAmerican Tail, An (1986)\nLegend (1985)\nNeverEnding Story, The (1984)\nBeetlejuice (1988)\nWillow (1988)\nToys (1992)\nFew Good Men, A (1992)\nRush Hour (1998)\nEdward Scissorhands (1990)\nAmerican History X (1998)\nI Still Know What You Did Last Summer (1998)\nEnemy of the State (1998)\nKing Kong (1933)\nVery Bad Things (1998)\nPsycho (1998)\nRushmore (1998)\nRomancing the Stone (1984)\nYoung Sherlock Holmes (1985)\nThin Red Line, The (1998)\nHoward the Duck (1986)\nTexas Chainsaw Massacre, The (1974)\nCrocodile Dundee (1986)\n¡Three Amigos! (1986)\n20 Dates (1998)\nOffice Space (1999)\nLogan's Run (1976)\nPlanet of the Apes (1968)\nLock, Stock & Two Smoking Barrels (1998)\nMatrix, The (1999)\nGo (1999)\nSLC Punk! (1998)\nDick Tracy (1990)\nMummy, The (1999)\nStar Wars: Episode I - The Phantom Menace (1999)\nSuperman (1978)\nSuperman II (1980)\nDracula (1931)\nFrankenstein (1931)\nWolf Man, The (1941)\nRocky Horror Picture Show, The (1975)\nRun Lola Run (Lola rennt) (1998)\nSouth Park: Bigger, Longer and Uncut (1999)\nGhostbusters (a.k.a. Ghost Busters) (1984)\nIron Giant, The (1999)\nBig (1988)\n13th Warrior, The (1999)\nAmerican Beauty (1999)\nExcalibur (1981)\nGulliver's Travels (1939)\nTotal Recall (1990)\nDirty Dozen, The (1967)\nGoldfinger (1964)\nFrom Russia with Love (1963)\nDr. No (1962)\nFight Club (1999)\nRoboCop (1987)\nWho Framed Roger Rabbit? (1988)\nLive and Let Die (1973)\nThunderball (1965)\nBeing John Malkovich (1999)\nSpaceballs (1987)\nRobin Hood (1973)\nDogma (1999)\nMessenger: The Story of Joan of Arc, The (1999)\nLongest Day, The (1962)\nGreen Mile, The (1999)\nEasy Rider (1969)\nTalented Mr. Ripley, The (1999)\nEncino Man (1992)\nSister Act (1992)\nWayne's World (1992)\nScream 3 (2000)\nJFK (1991)\nTeenage Mutant Ninja Turtles II: The Secret of the Ooze (1991)\nTeenage Mutant Ninja Turtles III (1993)\nRed Dawn (1984)\nGood Morning, Vietnam (1987)\nGrumpy Old Men (1993)\nLadyhawke (1985)\nHook (1991)\nPredator (1987)\nGladiator (2000)\nRoad Trip (2000)\nMan with the Golden Gun, The (1974)\nBlazing Saddles (1974)\nMad Max (1979)\nRoad Warrior, The (Mad Max 2) (1981)\nShaft (1971)\nBig Trouble in Little China (1986)\nShaft (2000)\nX-Men (2000)\nWhat About Bob? (1991)\nTransformers: The Movie (1986)\nM*A*S*H (a.k.a. MASH) (1970)\n"
]
],
[
[
"## Exercise 4\n\n**List 15 movies of 'Documentary' and 'Comedy' genres and sort them by title descending.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (m:Movie)-[:OF_GENRE]->(:Genre {name: \"Documentary\"})\nMATCH (m)-[:OF_GENRE]->(:Genre {name: \"Comedy\"})\nRETURN m.title\nORDER BY m.title DESC\nLIMIT 15;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:",
"_____no_output_____"
]
],
[
[
"movies = (\n match()\n .node(labels=\"Movie\", variable=\"m\")\n .to(\"OF_GENRE\")\n .node(labels=\"Genre\", variable=\"g1\")\n .where(\"g1.name\", \"=\", \"Documentary\")\n .match()\n .node(labels=\"Movie\", variable=\"m\")\n .to(\"OF_GENRE\")\n .node(labels=\"Genre\", variable=\"g2\")\n .where(\"g2.name\", \"=\", \"Comedy\")\n .return_({\"m.title\": \"movie\"})\n .order_by(\"m.title DESC\")\n .limit(15)\n .execute()\n)\n\nresults = list(movies)\n\nfor result in results:\n print(result[\"movie\"])",
"What the #$*! Do We Know!? (a.k.a. What the Bleep Do We Know!?) (2004)\nUnion: The Business Behind Getting High, The (2007)\nSuper Size Me (2004)\nSuper High Me (2007)\nSecret Policeman's Other Ball, The (1982)\nRichard Pryor Live on the Sunset Strip (1982)\nReligulous (2008)\nPaper Heart (2009)\nOriginal Kings of Comedy, The (2000)\nMerci Patron ! (2016)\nMartin Lawrence Live: Runteldat (2002)\nKevin Hart: Laugh at My Pain (2011)\nJeff Ross Roasts Criminals: Live at Brazos County Jail (2015)\nJackass: The Movie (2002)\nJackass Number Two (2006)\n"
]
],
[
[
"## Exercise 5\n\n**Find out the minimum rating of the 'Star Wars: Episode I - The Phantom Menace (1999)' movie.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (:User)-[r:RATED]->(:Movie {title: 'Star Wars: Episode I - The Phantom Menace (1999)'})\nRETURN min(r.rating);\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:",
"_____no_output_____"
]
],
[
[
"rating = (\n match()\n .node(labels=\"User\")\n .to(\"RATED\", variable=\"r\")\n .node(labels=\"Movie\", variable=\"m\")\n .where(\"m.title\", \"=\", \"Star Wars: Episode I - The Phantom Menace (1999)\")\n .return_({\"min(r.rating)\": \"min_rating\"})\n .execute()\n)\n\nresults = list(rating)\n\nfor result in results:\n print(result[\"min_rating\"])",
"0.5\n"
]
],
[
[
"And that's it! If you have any issues with this notebook, feel free to open an issue on the [GitHub repository](https://github.com/pyladiesams/graphdbs-gqlalchemy-beginner-mar2022), or [join the Discord server](https://discord.gg/memgraph) and get your answer instantly. If you are interested in the Cypher query language and want to learn more, sign up for the free [Cypher Email Course](https://memgraph.com/learn-cypher-query-language).",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d062af471a2cba624d9fcde9b9879fe8ddf2e6c9 | 325,489 | ipynb | Jupyter Notebook | _notebooks/2021_04_28_PGA_Wins.ipynb | brennanashley/lambdalost | cfa50a18039062712919d99a01a8e6dcd484dc6c | [
"Apache-2.0"
]
| 1 | 2021-02-27T02:10:15.000Z | 2021-02-27T02:10:15.000Z | _notebooks/2021_04_28_PGA_Wins.ipynb | brennanashley/lambdalost | cfa50a18039062712919d99a01a8e6dcd484dc6c | [
"Apache-2.0"
]
| null | null | null | _notebooks/2021_04_28_PGA_Wins.ipynb | brennanashley/lambdalost | cfa50a18039062712919d99a01a8e6dcd484dc6c | [
"Apache-2.0"
]
| null | null | null | 172.581654 | 195,586 | 0.851387 | [
[
[
"# \"PGA Tour Wins Classification\"\n\n\n",
"_____no_output_____"
]
],
[
[
"Can We Predict If a PGA Tour Player Won a Tournament in a Given Year?\n\nGolf is picking up popularity, so I thought it would be interesting to focus my project here. I set out to find what sets apart the best golfers from the rest. \nI decided to explore their statistics and to see if I could predict which golfers would win in a given year. My original dataset was found on Kaggle, and the data was scraped from the PGA Tour website. \n\nFrom this data, I performed an exploratory data analysis to explore the distribution of players on numerous aspects of the game, discover outliers, and further explore how the game has changed from 2010 to 2018. I also utilized numerous supervised machine learning models to predict a golfer's earnings and wins.\n\nTo predict the golfer's win, I used classification methods such as logisitic regression and Random Forest Classification. The best performance came from the Random Forest Classification method.",
"_____no_output_____"
],
[
"1. The Data\n\npgaTourData.csv contains 1674 rows and 18 columns. Each row indicates a golfer's performance for that year.\n",
"_____no_output_____"
]
],
[
[
"\n# Player Name: Name of the golfer\n\n# Rounds: The number of games that a player played\n\n# Fairway Percentage: The percentage of time a tee shot lands on the fairway\n\n# Year: The year in which the statistic was collected\n\n# Avg Distance: The average distance of the tee-shot\n\n# gir: (Green in Regulation) is met if any part of the ball is touching the putting surface while the number of strokes taken is at least two fewer than par\n\n# Average Putts: The average number of strokes taken on the green\n\n# Average Scrambling: Scrambling is when a player misses the green in regulation, but still makes par or better on a hole\n\n# Average Score: Average Score is the average of all the scores a player has played in that year\n\n# Points: The number of FedExCup points a player earned in that year\n\n# Wins: The number of competition a player has won in that year\n\n# Top 10: The number of competitions where a player has placed in the Top 10\n\n# Average SG Putts: Strokes gained: putting measures how many strokes a player gains (or loses) on the greens\n\n# Average SG Total: The Off-the-tee + approach-the-green + around-the-green + putting statistics combined\n\n# SG:OTT: Strokes gained: off-the-tee measures player performance off the tee on all par-4s and par-5s\n\n# SG:APR: Strokes gained: approach-the-green measures player performance on approach shots\n\n# SG:ARG: Strokes gained: around-the-green measures player performance on any shot within 30 yards of the edge of the green\n\n# Money: The amount of prize money a player has earned from tournaments\n",
"_____no_output_____"
],
[
"#collapse\n# importing packages\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns",
"_____no_output_____"
],
[
"# Importing the data \ndf = pd.read_csv('pgaTourData.csv')\n\n# Examining the first 5 data\ndf.head()",
"_____no_output_____"
],
[
"#collapse\ndf.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 2312 entries, 0 to 2311\nData columns (total 18 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 Player Name 2312 non-null object \n 1 Rounds 1678 non-null float64\n 2 Fairway Percentage 1678 non-null float64\n 3 Year 2312 non-null int64 \n 4 Avg Distance 1678 non-null float64\n 5 gir 1678 non-null float64\n 6 Average Putts 1678 non-null float64\n 7 Average Scrambling 1678 non-null float64\n 8 Average Score 1678 non-null float64\n 9 Points 2296 non-null object \n 10 Wins 293 non-null float64\n 11 Top 10 1458 non-null float64\n 12 Average SG Putts 1678 non-null float64\n 13 Average SG Total 1678 non-null float64\n 14 SG:OTT 1678 non-null float64\n 15 SG:APR 1678 non-null float64\n 16 SG:ARG 1678 non-null float64\n 17 Money 2300 non-null object \ndtypes: float64(14), int64(1), object(3)\nmemory usage: 325.2+ KB\n"
],
[
"#collapse\ndf.shape",
"_____no_output_____"
]
],
[
[
"2. Data Cleaning\n\n\nAfter looking at the dataframe, the data needs to be cleaned:\n\n-For the columns Top 10 and Wins, convert the NaNs to 0s\n\n-Change Top 10 and Wins into an int \n\n-Drop NaN values for players who do not have the full statistics\n\n-Change the columns Rounds into int\n\n-Change points to int\n\n-Remove the dollar sign ($) and commas in the column Money",
"_____no_output_____"
]
],
[
[
"# Replace NaN with 0 in Top 10 \ndf['Top 10'].fillna(0, inplace=True)\ndf['Top 10'] = df['Top 10'].astype(int)\n\n# Replace NaN with 0 in # of wins\ndf['Wins'].fillna(0, inplace=True)\ndf['Wins'] = df['Wins'].astype(int)\n\n# Drop NaN values \ndf.dropna(axis = 0, inplace=True)",
"_____no_output_____"
],
[
"# Change Rounds to int\ndf['Rounds'] = df['Rounds'].astype(int)\n\n# Change Points to int \ndf['Points'] = df['Points'].apply(lambda x: x.replace(',',''))\ndf['Points'] = df['Points'].astype(int)\n\n# Remove the $ and commas in money \ndf['Money'] = df['Money'].apply(lambda x: x.replace('$',''))\ndf['Money'] = df['Money'].apply(lambda x: x.replace(',',''))\ndf['Money'] = df['Money'].astype(float)",
"_____no_output_____"
],
[
"#collapse\ndf.info()",
"<class 'pandas.core.frame.DataFrame'>\nInt64Index: 1674 entries, 0 to 1677\nData columns (total 18 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 Player Name 1674 non-null object \n 1 Rounds 1674 non-null int64 \n 2 Fairway Percentage 1674 non-null float64\n 3 Year 1674 non-null int64 \n 4 Avg Distance 1674 non-null float64\n 5 gir 1674 non-null float64\n 6 Average Putts 1674 non-null float64\n 7 Average Scrambling 1674 non-null float64\n 8 Average Score 1674 non-null float64\n 9 Points 1674 non-null int64 \n 10 Wins 1674 non-null int64 \n 11 Top 10 1674 non-null int64 \n 12 Average SG Putts 1674 non-null float64\n 13 Average SG Total 1674 non-null float64\n 14 SG:OTT 1674 non-null float64\n 15 SG:APR 1674 non-null float64\n 16 SG:ARG 1674 non-null float64\n 17 Money 1674 non-null float64\ndtypes: float64(12), int64(5), object(1)\nmemory usage: 248.5+ KB\n"
],
[
"#collapse\ndf.describe()",
"_____no_output_____"
]
],
[
[
"3. Exploratory Data Analysis",
"_____no_output_____"
]
],
[
[
"#collapse_output\n# Looking at the distribution of data\nf, ax = plt.subplots(nrows = 6, ncols = 3, figsize=(20,20))\ndistribution = df.loc[:,df.columns!='Player Name'].columns\nrows = 0\ncols = 0\nfor i, column in enumerate(distribution):\n p = sns.distplot(df[column], ax=ax[rows][cols])\n cols += 1\n if cols == 3:\n cols = 0\n rows += 1",
"/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. 
Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
]
],
[
[
"From the distributions plotted, most of the graphs are normally distributed. However, we can observe that Money, Points, Wins, and Top 10s are all skewed to the right. This could be explained by the separation of the best players and the average PGA Tour player. The best players have multiple placings in the Top 10 with wins that allows them to earn more from tournaments, while the average player will have no wins and only a few Top 10 placings that prevent them from earning as much.",
"_____no_output_____"
]
],
[
[
"#collapse_output\n# Looking at the number of players with Wins for each year \nwin = df.groupby('Year')['Wins'].value_counts()\nwin = win.unstack()\nwin.fillna(0, inplace=True)\n\n# Converting win into ints\nwin = win.astype(int)\n\nprint(win)",
"Wins 0 1 2 3 4 5\nYear \n2010 166 21 5 0 0 0\n2011 156 25 5 0 0 0\n2012 159 26 4 1 0 0\n2013 152 24 3 0 0 1\n2014 142 29 3 2 0 0\n2015 150 29 2 1 1 0\n2016 152 28 4 1 0 0\n2017 156 30 0 3 1 0\n2018 158 26 5 3 0 0\n"
]
],
[
[
"From this table, we can see that most players end the year without a win. It's pretty rare to find a player that has won more than once!",
"_____no_output_____"
]
],
[
[
"# Looking at the percentage of players without a win in that year \nplayers = win.apply(lambda x: np.sum(x), axis=1)\npercent_no_win = win[0]/players\npercent_no_win = percent_no_win*100\nprint(percent_no_win)",
"Year\n2010 86.458333\n2011 83.870968\n2012 83.684211\n2013 84.444444\n2014 80.681818\n2015 81.967213\n2016 82.162162\n2017 82.105263\n2018 82.291667\ndtype: float64\n"
],
[
"#collapse_output\n# Plotting percentage of players without a win each year \nfig, ax = plt.subplots()\nbar_width = 0.8\nopacity = 0.7 \nindex = np.arange(2010, 2019)\n\nplt.bar(index, percent_no_win, bar_width, alpha = opacity)\nplt.xticks(index)\nplt.xlabel('Year')\nplt.ylabel('%')\nplt.title('Percentage of Players without a Win')",
"_____no_output_____"
]
],
[
[
"From the box plot above, we can observe that the percentages of players without a win are around 80%. There was very little variation in the percentage of players without a win in the past 8 years.",
"_____no_output_____"
]
],
[
[
"#collapse_output\n# Plotting the number of wins on a bar chart \nfig, ax = plt.subplots()\nindex = np.arange(2010, 2019)\nbar_width = 0.2\nopacity = 0.7 \n\ndef plot_bar(index, win, labels):\n plt.bar(index, win, bar_width, alpha=opacity, label=labels)\n\n# Plotting the bars\nrects = plot_bar(index, win[0], labels = '0 Wins')\nrects1 = plot_bar(index + bar_width, win[1], labels = '1 Wins')\nrects2 = plot_bar(index + bar_width*2, win[2], labels = '2 Wins')\nrects3 = plot_bar(index + bar_width*3, win[3], labels = '3 Wins')\nrects4 = plot_bar(index + bar_width*4, win[4], labels = '4 Wins')\nrects5 = plot_bar(index + bar_width*5, win[5], labels = '5 Wins')\n\nplt.xticks(index + bar_width, index)\nplt.xlabel('Year')\nplt.ylabel('Number of Players')\nplt.title('Distribution of Wins each Year')\nplt.legend()",
"_____no_output_____"
]
],
[
[
"By looking at the distribution of Wins each year, we can see that it is rare for most players to even win a tournament in the PGA Tour. Majority of players do not win, and a very few number of players win more than once a year.",
"_____no_output_____"
]
],
[
[
"# Percentage of people who did not place in the top 10 each year\ntop10 = df.groupby('Year')['Top 10'].value_counts()\ntop10 = top10.unstack()\ntop10.fillna(0, inplace=True)\nplayers = top10.apply(lambda x: np.sum(x), axis=1)\n\nno_top10 = top10[0]/players * 100\nprint(no_top10)",
"Year\n2010 17.187500\n2011 25.268817\n2012 23.157895\n2013 18.888889\n2014 16.477273\n2015 18.579235\n2016 20.000000\n2017 15.789474\n2018 17.187500\ndtype: float64\n"
]
],
[
[
"By looking at the percentage of players that did not place in the top 10 by year, We can observe that only approximately 20% of players did not place in the Top 10. In addition, the range for these player that did not place in the Top 10 is only 9.47%. This tells us that this statistic does not vary much on a yearly basis.",
"_____no_output_____"
]
],
[
[
"# Who are some of the longest hitters \ndistance = df[['Year','Player Name','Avg Distance']].copy()\ndistance.sort_values(by='Avg Distance', inplace=True, ascending=False)\nprint(distance.head())",
" Year Player Name Avg Distance\n162 2018 Rory McIlroy 319.7\n1481 2011 J.B. Holmes 318.4\n174 2018 Trey Mullinax 318.3\n732 2015 Dustin Johnson 317.7\n350 2017 Rory McIlroy 316.7\n"
]
],
[
[
"Rory McIlroy is one of the longest hitters in the game, setting the average driver distance to be 319.7 yards in 2018. He was also the longest hitter in 2017 with an average of 316.7 yards. ",
"_____no_output_____"
]
],
[
[
"# Who made the most money\nmoney_ranking = df[['Year','Player Name','Money']].copy()\nmoney_ranking.sort_values(by='Money', inplace=True, ascending=False)\nprint(money_ranking.head())",
" Year Player Name Money\n647 2015 Jordan Spieth 12030465.0\n361 2017 Justin Thomas 9921560.0\n303 2017 Jordan Spieth 9433033.0\n729 2015 Jason Day 9403330.0\n520 2016 Dustin Johnson 9365185.0\n"
]
],
[
[
"We can see that Jordan Spieth has made the most amount of money in a year, earning a total of 12 million dollars in 2015.",
"_____no_output_____"
]
],
[
[
"#collapse_output\n# Who made the most money each year\nmoney_rank = money_ranking.groupby('Year')['Money'].max()\nmoney_rank = pd.DataFrame(money_rank)\n\n\nindexs = np.arange(2010, 2019)\nnames = []\nfor i in range(money_rank.shape[0]):\n temp = df.loc[df['Money'] == money_rank.iloc[i,0],'Player Name']\n names.append(str(temp.values[0]))\n\nmoney_rank['Player Name'] = names\nprint(money_rank)",
" Money Player Name\nYear \n2010 4910477.0 Matt Kuchar\n2011 6683214.0 Luke Donald\n2012 8047952.0 Rory McIlroy\n2013 8553439.0 Tiger Woods\n2014 8280096.0 Rory McIlroy\n2015 12030465.0 Jordan Spieth\n2016 9365185.0 Dustin Johnson\n2017 9921560.0 Justin Thomas\n2018 8694821.0 Justin Thomas\n"
]
],
[
[
"With this table, we can examine the earnings of each player by year. Some of the most notable were Jordan Speith's earning of 12 million dollars and Justin Thomas earning the most money in both 2017 and 2018.",
"_____no_output_____"
]
],
[
[
"#collapse_output\n# Plot the correlation matrix between variables \ncorr = df.corr()\nsns.heatmap(corr, \n xticklabels=corr.columns.values,\n yticklabels=corr.columns.values,\n cmap='coolwarm')",
"_____no_output_____"
],
[
"df.corr()['Wins']",
"_____no_output_____"
]
],
[
[
"From the correlation matrix, we can observe that Money is highly correlated to wins along with the FedExCup Points. We can also observe that the fairway percentage, year, and rounds are not correlated to Wins.",
"_____no_output_____"
],
[
"4. Machine Learning Model (Classification)\n\n\nTo predict winners, I used multiple machine learning models to explore which models could accurately classify if a player is going to win in that year.\n\nTo measure the models, I used Receiver Operating Characterisitc Area Under the Curve. (ROC AUC) The ROC AUC tells us how capable the model is at distinguishing players with a win. In addition, as the data is skewed with 83% of players having no wins in that year, ROC AUC is a much better metric than the accuracy of the model.",
"_____no_output_____"
]
],
[
[
"#collapse\n# Importing the Machine Learning modules\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.metrics import roc_curve, roc_auc_score\nfrom sklearn.metrics import confusion_matrix\nfrom sklearn.feature_selection import RFE\nfrom sklearn.metrics import classification_report\nfrom sklearn.preprocessing import PolynomialFeatures\nfrom sklearn.svm import SVC \nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.preprocessing import MinMaxScaler\n",
"_____no_output_____"
]
],
[
[
"Preparing the Data for Classification\n\nWe know from the calculation above that the data for wins is skewed. Even without machine learning we know that approximately 83% of the players does not lead to a win. Therefore, we will be utilizing ROC AUC as the metric of these models",
"_____no_output_____"
]
],
[
[
"# Adding the Winner column to determine if the player won that year or not \ndf['Winner'] = df['Wins'].apply(lambda x: 1 if x>0 else 0)\n\n# New DataFrame \nml_df = df.copy()\n\n# Y value for machine learning is the Winner column\ntarget = df['Winner']\n\n# Removing the columns Player Name, Wins, and Winner from the dataframe to avoid leakage\nml_df.drop(['Player Name','Wins','Winner'], axis=1, inplace=True)\nprint(ml_df.head())",
" Rounds Fairway Percentage Year ... SG:APR SG:ARG Money\n0 60 75.19 2018 ... 0.960 -0.027 2680487.0\n1 109 73.58 2018 ... 0.213 0.194 2485203.0\n2 93 72.24 2018 ... 0.437 -0.137 2700018.0\n3 78 71.94 2018 ... 0.532 0.273 1986608.0\n4 103 71.44 2018 ... 0.099 0.026 1089763.0\n\n[5 rows x 16 columns]\n"
],
[
"## Logistic Regression Baseline\nper_no_win = target.value_counts()[0] / (target.value_counts()[0] + target.value_counts()[1])\nper_no_win = per_no_win.round(4)*100\nprint(str(per_no_win)+str('%'))",
"83.09%\n"
],
[
"#collapse_show\n# Function for the logisitic regression \ndef log_reg(X, y):\n X_train, X_test, y_train, y_test = train_test_split(X, y,\n random_state = 10)\n clf = LogisticRegression().fit(X_train, y_train)\n y_pred = clf.predict(X_test)\n print('Accuracy of Logistic regression classifier on training set: {:.2f}'\n .format(clf.score(X_train, y_train)))\n print('Accuracy of Logistic regression classifier on test set: {:.2f}'\n .format(clf.score(X_test, y_test)))\n cf_mat = confusion_matrix(y_test, y_pred)\n confusion = pd.DataFrame(data = cf_mat)\n print(confusion)\n \n print(classification_report(y_test, y_pred))\n\n # Returning the 5 important features \n #rfe = RFE(clf, 5)\n # rfe = rfe.fit(X, y)\n # print('Feature Importance')\n # print(X.columns[rfe.ranking_ == 1].values)\n \n print('ROC AUC Score: {:.2f}'.format(roc_auc_score(y_test, y_pred)))",
"_____no_output_____"
],
[
"#collapse_show\nlog_reg(ml_df, target)",
"Accuracy of Logistic regression classifier on training set: 0.90\nAccuracy of Logistic regression classifier on test set: 0.91\n 0 1\n0 345 8\n1 28 38\n precision recall f1-score support\n\n 0 0.92 0.98 0.95 353\n 1 0.83 0.58 0.68 66\n\n accuracy 0.91 419\n macro avg 0.88 0.78 0.81 419\nweighted avg 0.91 0.91 0.91 419\n\nROC AUC Score: 0.78\n"
]
],
[
[
"From the logisitic regression, we got an accuracy of 0.9 on the training set and an accuracy of 0.91 on the test set. This was surprisingly accurate for a first run. However, the ROC AUC Score of 0.78 could be improved. Therefore, I decided to add more features as a way of possibly improving the model.\n\n",
"_____no_output_____"
]
],
[
[
"## Feature Engineering\n\n# Adding Domain Features \nml_d = ml_df.copy()\n# Top 10 / Money might give us a better understanding on how well they placed in the top 10\nml_d['Top10perMoney'] = ml_d['Top 10'] / ml_d['Money']\n\n# Avg Distance / Fairway Percentage to give us a ratio that determines how accurate and far a player hits \nml_d['DistanceperFairway'] = ml_d['Avg Distance'] / ml_d['Fairway Percentage']\n\n# Money / Rounds to see on average how much money they would make playing a round of golf \nml_d['MoneyperRound'] = ml_d['Money'] / ml_d['Rounds']",
"_____no_output_____"
],
[
"#collapse_show\nlog_reg(ml_d, target)",
"Accuracy of Logistic regression classifier on training set: 0.91\nAccuracy of Logistic regression classifier on test set: 0.91\n 0 1\n0 342 11\n1 27 39\n precision recall f1-score support\n\n 0 0.93 0.97 0.95 353\n 1 0.78 0.59 0.67 66\n\n accuracy 0.91 419\n macro avg 0.85 0.78 0.81 419\nweighted avg 0.90 0.91 0.90 419\n\nROC AUC Score: 0.78\n"
],
[
"#collapse_show\n# Adding Polynomial Features to the ml_df \nmldf2 = ml_df.copy()\npoly = PolynomialFeatures(2)\npoly = poly.fit(mldf2)\npoly_feature = poly.transform(mldf2)\nprint(poly_feature.shape)\n\n# Creating a DataFrame with the polynomial features \npoly_feature = pd.DataFrame(poly_feature, columns = poly.get_feature_names(ml_df.columns))\nprint(poly_feature.head())",
"(1674, 153)\n 1 Rounds Fairway Percentage ... SG:ARG^2 SG:ARG Money Money^2\n0 1.0 60.0 75.19 ... 0.000729 -72373.149 7.185011e+12\n1 1.0 109.0 73.58 ... 0.037636 482129.382 6.176234e+12\n2 1.0 93.0 72.24 ... 0.018769 -369902.466 7.290097e+12\n3 1.0 78.0 71.94 ... 0.074529 542343.984 3.946611e+12\n4 1.0 103.0 71.44 ... 0.000676 28333.838 1.187583e+12\n\n[5 rows x 153 columns]\n"
],
[
"#collapse_show\nlog_reg(poly_feature, target)",
"Accuracy of Logistic regression classifier on training set: 0.90\nAccuracy of Logistic regression classifier on test set: 0.91\n 0 1\n0 346 7\n1 32 34\n precision recall f1-score support\n\n 0 0.92 0.98 0.95 353\n 1 0.83 0.52 0.64 66\n\n accuracy 0.91 419\n macro avg 0.87 0.75 0.79 419\nweighted avg 0.90 0.91 0.90 419\n\nROC AUC Score: 0.75\n"
]
],
[
[
"From feature engineering, there were no improvements in the ROC AUC Score. In fact as I added more features, the accuracy and the ROC AUC Score decreased. This could signal to us that another machine learning algorithm could better predict winners.",
"_____no_output_____"
]
],
[
[
"#collapse_show\n## Randon Forest Model\n\ndef random_forest(X, y):\n X_train, X_test, y_train, y_test = train_test_split(X, y,\n random_state = 10)\n clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)\n y_pred = clf.predict(X_test)\n print('Accuracy of Random Forest classifier on training set: {:.2f}'\n .format(clf.score(X_train, y_train)))\n print('Accuracy of Random Forest classifier on test set: {:.2f}'\n .format(clf.score(X_test, y_test)))\n \n cf_mat = confusion_matrix(y_test, y_pred)\n confusion = pd.DataFrame(data = cf_mat)\n print(confusion)\n \n print(classification_report(y_test, y_pred))\n \n # Returning the 5 important features \n rfe = RFE(clf, 5)\n rfe = rfe.fit(X, y)\n print('Feature Importance')\n print(X.columns[rfe.ranking_ == 1].values)\n \n print('ROC AUC Score: {:.2f}'.format(roc_auc_score(y_test, y_pred)))",
"_____no_output_____"
],
[
"#collapse_show\nrandom_forest(ml_df, target)",
"Accuracy of Random Forest classifier on training set: 1.00\nAccuracy of Random Forest classifier on test set: 0.94\n 0 1\n0 342 11\n1 16 50\n precision recall f1-score support\n\n 0 0.96 0.97 0.96 353\n 1 0.82 0.76 0.79 66\n\n accuracy 0.94 419\n macro avg 0.89 0.86 0.87 419\nweighted avg 0.93 0.94 0.93 419\n\nFeature Importance\n['Average Score' 'Points' 'Top 10' 'Average SG Total' 'Money']\nROC AUC Score: 0.86\n"
],
[
"#collapse_show\nrandom_forest(ml_d, target)",
"Accuracy of Random Forest classifier on training set: 1.00\nAccuracy of Random Forest classifier on test set: 0.94\n 0 1\n0 343 10\n1 16 50\n precision recall f1-score support\n\n 0 0.96 0.97 0.96 353\n 1 0.83 0.76 0.79 66\n\n accuracy 0.94 419\n macro avg 0.89 0.86 0.88 419\nweighted avg 0.94 0.94 0.94 419\n\nFeature Importance\n['Average Score' 'Points' 'Average SG Total' 'Money' 'MoneyperRound']\nROC AUC Score: 0.86\n"
],
[
"#collapse_show\nrandom_forest(poly_feature, target)",
"Accuracy of Random Forest classifier on training set: 1.00\nAccuracy of Random Forest classifier on test set: 0.94\n 0 1\n0 340 13\n1 14 52\n precision recall f1-score support\n\n 0 0.96 0.96 0.96 353\n 1 0.80 0.79 0.79 66\n\n accuracy 0.94 419\n macro avg 0.88 0.88 0.88 419\nweighted avg 0.94 0.94 0.94 419\n\nFeature Importance\n['Year Points' 'Average Putts Points' 'Average Scrambling Top 10'\n 'Average Score Points' 'Points^2']\nROC AUC Score: 0.88\n"
]
],
[
[
"The Random Forest Model scored highly on the ROC AUC Score, obtaining a value of 0.89. With this, we observed that the Random Forest Model could accurately classify players with and without a win.",
"_____no_output_____"
],
[
"6. Conclusion\n\nIt's been interesting to learn about numerous aspects of the game that differentiate the winner and the average PGA Tour player. For example, we can see that the fairway percentage and greens in regulations do not seem to contribute as much to a player's win. However, all the strokes gained statistics contribute pretty highly to wins for these players. It was interesting to see which aspects of the game that the professionals should put their time into. This also gave me the idea of track my personal golf statistics, so that I could compare it to the pros and find areas of my game that need the most improvement.\n\nMachine Learning Model\nI've been able to examine the data of PGA Tour players and classify if a player will win that year or not. With the random forest classification model, I was able to achieve an ROC AUC of 0.89 and an accuracy of 0.95 on the test set. This was a significant improvement from the ROC AUC of 0.78 and accuracy of 0.91. Because the data is skewed with approximately 80% of players not earning a win, the primary measure of the model was the ROC AUC. I was able to improve my model from ROC AUC score of 0.78 to a score of 0.89 by simply trying 3 different models, adding domain features, and polynomial features.\n\n",
"_____no_output_____"
],
[
"The End!!",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
]
]
|
d062cdd7eeb895340e644ac4092a20863b415b5b | 15,675 | ipynb | Jupyter Notebook | Week2-Lesson1-MNIST-Fashion.ipynb | monahatami1/Coursera_Introduction-to-TensorFlow-for-Artificial-Intelligence-Machine-Learning-and-Deep-Lear | 4ed47fac75d3ec2eea277ca64b1b99ba017f8a27 | [
"MIT"
]
| null | null | null | Week2-Lesson1-MNIST-Fashion.ipynb | monahatami1/Coursera_Introduction-to-TensorFlow-for-Artificial-Intelligence-Machine-Learning-and-Deep-Lear | 4ed47fac75d3ec2eea277ca64b1b99ba017f8a27 | [
"MIT"
]
| null | null | null | Week2-Lesson1-MNIST-Fashion.ipynb | monahatami1/Coursera_Introduction-to-TensorFlow-for-Artificial-Intelligence-Machine-Learning-and-Deep-Lear | 4ed47fac75d3ec2eea277ca64b1b99ba017f8a27 | [
"MIT"
]
| null | null | null | 60.755814 | 7,088 | 0.69327 | [
[
[
"import tensorflow as tf\nprint(tf.__version__)",
"2.3.1\n"
],
[
"mnist = tf.keras.datasets.fashion_mnist",
"_____no_output_____"
],
[
"(training_images, training_labels), (test_images, test_labels) = mnist.load_data()",
"_____no_output_____"
],
[
"import numpy as np\nnp.set_printoptions(linewidth=200)\nimport matplotlib.pyplot as plt\nplt.imshow(training_images[0])\nprint(training_labels[0])\nprint(training_images[0])",
"9\n[[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 13 73 0 0 1 4 0 0 0 0 1 1 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 3 0 36 136 127 62 54 0 0 0 1 3 4 0 0 3]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 6 0 102 204 176 134 144 123 23 0 0 0 0 12 10 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 155 236 207 178 107 156 161 109 64 23 77 130 72 15]\n [ 0 0 0 0 0 0 0 0 0 0 0 1 0 69 207 223 218 216 216 163 127 121 122 146 141 88 172 66]\n [ 0 0 0 0 0 0 0 0 0 1 1 1 0 200 232 232 233 229 223 223 215 213 164 127 123 196 229 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 183 225 216 223 228 235 227 224 222 224 221 223 245 173 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 193 228 218 213 198 180 212 210 211 213 223 220 243 202 0]\n [ 0 0 0 0 0 0 0 0 0 1 3 0 12 219 220 212 218 192 169 227 208 218 224 212 226 197 209 52]\n [ 0 0 0 0 0 0 0 0 0 0 6 0 99 244 222 220 218 203 198 221 215 213 222 220 245 119 167 56]\n [ 0 0 0 0 0 0 0 0 0 4 0 0 55 236 228 230 228 240 232 213 218 223 234 217 217 209 92 0]\n [ 0 0 1 4 6 7 2 0 0 0 0 0 237 226 217 223 222 219 222 221 216 223 229 215 218 255 77 0]\n [ 0 3 0 0 0 0 0 0 0 62 145 204 228 207 213 221 218 208 211 218 224 223 219 215 224 244 159 0]\n [ 0 0 0 0 18 44 82 107 189 228 220 222 217 226 200 205 211 230 224 234 176 188 250 248 233 238 215 0]\n [ 0 57 187 208 224 221 224 208 204 214 208 209 200 159 245 193 206 223 255 255 221 234 221 211 220 232 246 0]\n [ 3 202 228 224 221 211 211 214 205 205 205 220 240 80 150 255 229 221 188 154 191 210 204 209 222 228 225 0]\n [ 98 233 198 210 222 229 229 234 249 220 194 215 217 241 65 73 106 117 168 219 221 215 217 223 223 224 229 29]\n [ 75 204 212 204 193 205 211 225 216 185 197 206 198 213 240 195 227 245 239 223 218 212 209 222 220 221 230 67]\n [ 48 203 183 194 213 197 185 190 194 192 202 214 219 221 220 236 225 216 199 206 186 181 177 172 181 205 206 115]\n [ 0 122 219 193 179 171 183 196 204 210 213 207 211 210 200 196 194 191 195 191 198 192 176 156 167 177 210 92]\n [ 0 0 74 189 212 191 175 172 175 181 185 188 189 188 193 198 204 209 210 210 211 188 188 194 192 216 170 0]\n [ 2 0 0 0 66 200 222 237 239 242 246 243 244 221 220 193 191 179 182 182 181 176 166 168 99 58 0 0]\n [ 0 0 0 0 0 0 0 40 61 44 72 41 35 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]]\n"
],
[
"training_images = training_images / 255.0\ntest_images = test_images / 255.0\n",
"_____no_output_____"
],
[
"model = tf.keras.models.Sequential([tf.keras.layers.Flatten(), \n tf.keras.layers.Dense(128, activation=tf.nn.relu), \n tf.keras.layers.Dense(10, activation=tf.nn.softmax)])",
"_____no_output_____"
],
[
"model.compile(optimizer = tf.optimizers.Adam(),\n loss = 'sparse_categorical_crossentropy',\n metrics=['accuracy'])\n\nmodel.fit(training_images, training_labels, epochs=5)",
"Epoch 1/5\n1875/1875 [==============================] - 2s 870us/step - loss: 1.1072 - accuracy: 0.6507\nEpoch 2/5\n1875/1875 [==============================] - 2s 835us/step - loss: 0.6459 - accuracy: 0.7674\nEpoch 3/5\n1875/1875 [==============================] - 2s 807us/step - loss: 0.5682 - accuracy: 0.7962\nEpoch 4/5\n1875/1875 [==============================] - 1s 796us/step - loss: 0.5250 - accuracy: 0.8135\nEpoch 5/5\n1875/1875 [==============================] - 2s 805us/step - loss: 0.4971 - accuracy: 0.8244\n"
],
[
"model.evaluate(test_images, test_labels)",
"313/313 [==============================] - 0s 704us/step - loss: 95.0182 - accuracy: 0.6898\n"
],
[
"classifications = model.predict(test_images)\n\nprint(classifications[0])",
"[6.8263911e-13 1.7325267e-12 2.5193808e-18 1.0686662e-12 9.9983463e-18 1.1335950e-01 2.2505068e-18 1.0656738e-01 2.8287264e-12 7.8007311e-01]\n"
],
[
"len(set(training_labels))",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d062d93ee6753a9376227ea46c29417c53a92f09 | 456,549 | ipynb | Jupyter Notebook | monte-carlo/Monte_Carlo.ipynb | jbdekker/deep-reinforcement-learning | faea2012a9b96ec013c5fbf83ef40316b446fe2e | [
"MIT"
]
| null | null | null | monte-carlo/Monte_Carlo.ipynb | jbdekker/deep-reinforcement-learning | faea2012a9b96ec013c5fbf83ef40316b446fe2e | [
"MIT"
]
| null | null | null | monte-carlo/Monte_Carlo.ipynb | jbdekker/deep-reinforcement-learning | faea2012a9b96ec013c5fbf83ef40316b446fe2e | [
"MIT"
]
| null | null | null | 842.341328 | 216,412 | 0.947911 | [
[
[
"# Monte Carlo Methods\n\nIn this notebook, you will write your own implementations of many Monte Carlo (MC) algorithms. \n\nWhile we have provided some starter code, you are welcome to erase these hints and write your code from scratch.\n\n### Part 0: Explore BlackjackEnv\n\nWe begin by importing the necessary packages.",
"_____no_output_____"
]
],
[
[
"import sys\nimport gym\nimport numpy as np\nfrom collections import defaultdict\n\nfrom plot_utils import plot_blackjack_values, plot_policy",
"_____no_output_____"
]
],
[
[
"Use the code cell below to create an instance of the [Blackjack](https://github.com/openai/gym/blob/master/gym/envs/toy_text/blackjack.py) environment.",
"_____no_output_____"
]
],
[
[
"env = gym.make('Blackjack-v0')",
"_____no_output_____"
]
],
[
[
"Each state is a 3-tuple of:\n- the player's current sum $\\in \\{0, 1, \\ldots, 31\\}$,\n- the dealer's face up card $\\in \\{1, \\ldots, 10\\}$, and\n- whether or not the player has a usable ace (`no` $=0$, `yes` $=1$).\n\nThe agent has two potential actions:\n\n```\n STICK = 0\n HIT = 1\n```\nVerify this by running the code cell below.",
"_____no_output_____"
]
],
[
[
"print(f\"Observation space: \\t{env.observation_space}\")\nprint(f\"Action space: \\t\\t{env.action_space}\")",
"Observation space: \tTuple(Discrete(32), Discrete(11), Discrete(2))\nAction space: \t\tDiscrete(2)\n"
]
],
[
[
"Execute the code cell below to play Blackjack with a random policy. \n\n(_The code currently plays Blackjack three times - feel free to change this number, or to run the cell multiple times. The cell is designed for you to get some experience with the output that is returned as the agent interacts with the environment._)",
"_____no_output_____"
]
],
[
[
"for i_episode in range(3):\n state = env.reset()\n while True:\n print(state)\n action = env.action_space.sample()\n state, reward, done, info = env.step(action)\n if done:\n print('End game! Reward: ', reward)\n print('You won :)\\n') if reward > 0 else print('You lost :(\\n')\n break",
"(19, 10, False)\nEnd game! Reward: 1.0\nYou won :)\n\n(14, 6, False)\n(15, 6, False)\nEnd game! Reward: 1.0\nYou won :)\n\n(16, 3, False)\nEnd game! Reward: 1.0\nYou won :)\n\n"
]
],
[
[
"### Part 1: MC Prediction\n\nIn this section, you will write your own implementation of MC prediction (for estimating the action-value function). \n\nWe will begin by investigating a policy where the player _almost_ always sticks if the sum of her cards exceeds 18. In particular, she selects action `STICK` with 80% probability if the sum is greater than 18; and, if the sum is 18 or below, she selects action `HIT` with 80% probability. The function `generate_episode_from_limit_stochastic` samples an episode using this policy. \n\nThe function accepts as **input**:\n- `bj_env`: This is an instance of OpenAI Gym's Blackjack environment.\n\nIt returns as **output**:\n- `episode`: This is a list of (state, action, reward) tuples (of tuples) and corresponds to $(S_0, A_0, R_1, \\ldots, S_{T-1}, A_{T-1}, R_{T})$, where $T$ is the final time step. In particular, `episode[i]` returns $(S_i, A_i, R_{i+1})$, and `episode[i][0]`, `episode[i][1]`, and `episode[i][2]` return $S_i$, $A_i$, and $R_{i+1}$, respectively.",
"_____no_output_____"
]
],
[
[
"def generate_episode_from_limit_stochastic(bj_env):\n episode = []\n state = bj_env.reset()\n \n while True:\n probs = [0.8, 0.2] if state[0] > 18 else [0.2, 0.8]\n action = np.random.choice(np.arange(2), p=probs)\n \n next_state, reward, done, info = bj_env.step(action)\n \n episode.append((state, action, reward))\n \n state = next_state\n \n if done:\n break\n \n return episode",
"_____no_output_____"
]
],
[
[
"Execute the code cell below to play Blackjack with the policy. \n\n(*The code currently plays Blackjack three times - feel free to change this number, or to run the cell multiple times. The cell is designed for you to gain some familiarity with the output of the `generate_episode_from_limit_stochastic` function.*)",
"_____no_output_____"
]
],
[
[
"for i in range(5):\n print(generate_episode_from_limit_stochastic(env))",
"[((18, 2, True), 0, 1.0)]\n[((16, 5, False), 1, 0.0), ((18, 5, False), 1, -1.0)]\n[((13, 5, False), 1, 0.0), ((17, 5, False), 1, -1.0)]\n[((14, 4, False), 1, 0.0), ((17, 4, False), 1, -1.0)]\n[((20, 10, False), 0, -1.0)]\n"
]
],
[
[
"Now, you are ready to write your own implementation of MC prediction. Feel free to implement either first-visit or every-visit MC prediction; in the case of the Blackjack environment, the techniques are equivalent.\n\nYour algorithm has three arguments:\n- `env`: This is an instance of an OpenAI Gym environment.\n- `num_episodes`: This is the number of episodes that are generated through agent-environment interaction.\n- `generate_episode`: This is a function that returns an episode of interaction.\n- `gamma`: This is the discount rate. It must be a value between 0 and 1, inclusive (default value: `1`).\n\nThe algorithm returns as output:\n- `Q`: This is a dictionary (of one-dimensional arrays) where `Q[s][a]` is the estimated action value corresponding to state `s` and action `a`.",
"_____no_output_____"
]
],
[
[
"def mc_prediction_q(env, num_episodes, generate_episode, gamma=1.0):\n # initialize empty dictionaries of arrays\n returns_sum = defaultdict(lambda: np.zeros(env.action_space.n))\n N = defaultdict(lambda: np.zeros(env.action_space.n))\n Q = defaultdict(lambda: np.zeros(env.action_space.n))\n R = defaultdict(lambda: np.zeros(env.action_space.n))\n \n # loop over episodes\n for i_episode in range(1, num_episodes+1):\n # monitor progress\n if i_episode % 1000 == 0:\n print(\"\\rEpisode {}/{}.\".format(i_episode, num_episodes), end=\"\")\n sys.stdout.flush()\n \n episode = generate_episode(env)\n \n n = len(episode)\n states, actions, rewards = zip(*episode)\n discounts = np.array([gamma**i for i in range(n+1)])\n \n for i, state in enumerate(states):\n returns_sum[state][actions[i]] += sum(rewards[i:] * discounts[:-(i+1)])\n N[state][actions[i]] += 1\n \n # comnpute Q table\n for state in returns_sum.keys():\n for action in range(env.action_space.n):\n Q[state][action] = returns_sum[state][action] / N[state][action]\n \n return Q, returns_sum, N",
"_____no_output_____"
]
],
[
[
"Use the cell below to obtain the action-value function estimate $Q$. We have also plotted the corresponding state-value function.\n\nTo check the accuracy of your implementation, compare the plot below to the corresponding plot in the solutions notebook **Monte_Carlo_Solution.ipynb**.",
"_____no_output_____"
]
],
[
[
"# obtain the action-value function\nQ, R, N = mc_prediction_q(env, 500000, generate_episode_from_limit_stochastic)\n\n# obtain the corresponding state-value function\nV_to_plot = dict((k,(k[0]>18)*(np.dot([0.8, 0.2],v)) + (k[0]<=18)*(np.dot([0.2, 0.8],v))) \\\n for k, v in Q.items())\n\n# plot the state-value function\nplot_blackjack_values(V_to_plot)",
"Episode 500000/500000."
]
],
[
[
"### Part 2: MC Control\n\nIn this section, you will write your own implementation of constant-$\\alpha$ MC control. \n\nYour algorithm has four arguments:\n- `env`: This is an instance of an OpenAI Gym environment.\n- `num_episodes`: This is the number of episodes that are generated through agent-environment interaction.\n- `alpha`: This is the step-size parameter for the update step.\n- `gamma`: This is the discount rate. It must be a value between 0 and 1, inclusive (default value: `1`).\n\nThe algorithm returns as output:\n- `Q`: This is a dictionary (of one-dimensional arrays) where `Q[s][a]` is the estimated action value corresponding to state `s` and action `a`.\n- `policy`: This is a dictionary where `policy[s]` returns the action that the agent chooses after observing state `s`.\n\n(_Feel free to define additional functions to help you to organize your code._)",
"_____no_output_____"
]
],
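In equation form, the two pieces the cells below rely on are the constant-α update and the ε-greedy action probabilities (our notation, following the textbook conventions):

```latex
% Constant-alpha update after observing return G_t from the pair (S_t, A_t):
Q(S_t, A_t) \leftarrow Q(S_t, A_t) + \alpha \bigl( G_t - Q(S_t, A_t) \bigr)

% Epsilon-greedy probabilities over nA actions:
\pi(a \mid s) =
\begin{cases}
  1 - \epsilon + \epsilon / nA & \text{if } a = \arg\max_{a'} Q(s, a') \\
  \epsilon / nA                & \text{otherwise}
\end{cases}
```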
[
[
"def generate_episode_from_Q(env, Q, epsilon, n):\n \"\"\" generates an episode following the epsilon-greedy policy\"\"\"\n episode = []\n state = env.reset()\n \n while True:\n if state in Q:\n action = np.random.choice(np.arange(n), p=get_props(Q[state], epsilon, n))\n else:\n action = env.action_space.sample()\n \n next_state, reward, done, _ = env.step(action)\n episode.append((state, action, reward))\n \n state = next_state\n \n if done:\n break\n \n return episode",
"_____no_output_____"
],
[
"def get_props(Q_s, epsilon, n):\n policy_s = np.ones(n) * epsilon / n\n best_a = np.argmax(Q_s)\n policy_s[best_a] = 1 - epsilon + (epsilon / n)\n \n return policy_s",
"_____no_output_____"
],
[
"def update_Q(episode, Q, alpha, gamma):\n n = len(episode)\n \n states, actions, rewards = zip(*episode)\n discounts = np.array([gamma**i for i in range(n+1)])\n \n for i, state in enumerate(states):\n R = sum(rewards[i:] * discounts[:-(1+i)])\n Q[state][actions[i]] = Q[state][actions[i]] + alpha * (R - Q[state][actions[i]])\n \n return Q",
"_____no_output_____"
],
[
"def mc_control(env, num_episodes, alpha, gamma=1.0, eps_start=1.0, eps_decay=.99999, eps_min=0.05):\n nA = env.action_space.n\n # initialize empty dictionary of arrays\n Q = defaultdict(lambda: np.zeros(nA))\n \n epsilon = eps_start\n # loop over episodes\n for i_episode in range(1, num_episodes+1):\n # monitor progress\n if i_episode % 1000 == 0:\n print(\"\\rEpisode {}/{}.\".format(i_episode, num_episodes), end=\"\")\n sys.stdout.flush()\n \n epsilon = max(eps_min, epsilon * eps_decay)\n episode = generate_episode_from_Q(env, Q, epsilon, nA)\n \n Q = update_Q(episode, Q, alpha, gamma)\n \n policy = dict((s, np.argmax(v)) for s, v in Q.items())\n \n return policy, Q",
"_____no_output_____"
]
],
[
[
"Use the cell below to obtain the estimated optimal policy and action-value function. Note that you should fill in your own values for the `num_episodes` and `alpha` parameters.",
"_____no_output_____"
]
],
[
[
"# obtain the estimated optimal policy and action-value function\npolicy, Q = mc_control(env, 500000, 0.02)",
"Episode 500000/500000."
]
],
[
[
"Next, we plot the corresponding state-value function.",
"_____no_output_____"
]
],
[
[
"# obtain the corresponding state-value function\nV = dict((k,np.max(v)) for k, v in Q.items())\n\n# plot the state-value function\nplot_blackjack_values(V)",
"_____no_output_____"
]
],
[
[
"Finally, we visualize the policy that is estimated to be optimal.",
"_____no_output_____"
]
],
[
[
"# plot the policy\nplot_policy(policy)",
"_____no_output_____"
]
],
[
[
"The **true** optimal policy $\\pi_*$ can be found in Figure 5.2 of the [textbook](http://go.udacity.com/rl-textbook) (and appears below). Compare your final estimate to the optimal policy - how close are you able to get? If you are not happy with the performance of your algorithm, take the time to tweak the decay rate of $\\epsilon$, change the value of $\\alpha$, and/or run the algorithm for more episodes to attain better results.\n\n",
"_____no_output_____"
]
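When tweaking the decay rate of ε as suggested above, it helps to know how long the schedule takes to bottom out. A quick check, assuming the geometric schedule `epsilon * eps_decay` per episode, clipped at `eps_min`, exactly as in `mc_control`:

```python
import math

eps_start, eps_decay, eps_min = 1.0, 0.99999, 0.05
episodes_to_floor = math.log(eps_min / eps_start) / math.log(eps_decay)
print(round(episodes_to_floor))  # ~299572: epsilon only reaches eps_min
                                 # after roughly 3e5 of the 500000 episodes
```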
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d062e3d741909a888a9e790727794972375a0f60 | 6,951 | ipynb | Jupyter Notebook | highPerformanceComputing/Project2-KNNClassifier/knnCountries.ipynb | naokishami/Classwork | ac59d640f15e88294804fdb518b6c84b10e0d2bd | [
"MIT"
]
| null | null | null | highPerformanceComputing/Project2-KNNClassifier/knnCountries.ipynb | naokishami/Classwork | ac59d640f15e88294804fdb518b6c84b10e0d2bd | [
"MIT"
]
| null | null | null | highPerformanceComputing/Project2-KNNClassifier/knnCountries.ipynb | naokishami/Classwork | ac59d640f15e88294804fdb518b6c84b10e0d2bd | [
"MIT"
]
| null | null | null | 24.736655 | 68 | 0.339951 | [
[
[
"import pandas as pd\n\nmatrix = pd.read_csv(\"./data/matrixCountries.csv\")\nmatrix",
"_____no_output_____"
],
[
"us = matrix.iloc[:, 2]\nrus = matrix.iloc[:, 3]\nquery = matrix.iloc[:, 4]",
"_____no_output_____"
],
[
"def magnitude(vec):\n total = 0\n for item in vec:\n total += item**2\n total /= len(vec)\n return total",
"_____no_output_____"
],
[
"us_cos = us @ query / magnitude(us) / magnitude(query)\nus_cos",
"_____no_output_____"
],
[
"rus_cos = rus @ query / (magnitude(rus) * magnitude(query))\nrus_cos",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d062f3c869548b0ed612ff7dc9ec73685185bd92 | 64,018 | ipynb | Jupyter Notebook | notebooks/1_0_EDA_BASE_A.ipynb | teoria/PD_datascience | e679e942b70be67f0f33cad6db11de3bc4cd9f1c | [
"MIT"
]
| null | null | null | notebooks/1_0_EDA_BASE_A.ipynb | teoria/PD_datascience | e679e942b70be67f0f33cad6db11de3bc4cd9f1c | [
"MIT"
]
| null | null | null | notebooks/1_0_EDA_BASE_A.ipynb | teoria/PD_datascience | e679e942b70be67f0f33cad6db11de3bc4cd9f1c | [
"MIT"
]
| null | null | null | 101.134281 | 9,175 | 0.801728 | [
[
[
"import pandas as pd\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport seaborn as sns\n#%matplotlib inline\n\nfrom IPython.core.pylabtools import figsize\nfigsize(8, 6)\nsns.set()",
"_____no_output_____"
]
],
[
[
"## Carregando dados dos usuรกrios premium",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv(\"../data/processed/premium_students.csv\",parse_dates=[1,2],index_col=[0])\nprint(df.shape)\ndf.head()",
"(6260, 2)\n"
]
],
[
[
"---\n### Novas colunas auxiliares",
"_____no_output_____"
]
],
[
[
"df['diffDate'] = (df.SubscriptionDate - df.RegisteredDate)\ndf['diffDays'] = [ item.days for item in df['diffDate']]\ndf['register_time'] = df.RegisteredDate.map( lambda x : int(x.strftime(\"%H\")) )\ndf['register_time_AM_PM'] = df.register_time.map( lambda x : 1 if x>=12 else 0)\ndf['register_num_week'] = df.RegisteredDate.map( lambda x : int(x.strftime(\"%V\")) )\ndf['register_week_day'] = df.RegisteredDate.map( lambda x : int(x.weekday()) )\ndf['register_month'] = df.RegisteredDate.map( lambda x : int(x.strftime('%m')) )\ndf['subscription_time'] = df.SubscriptionDate.map( lambda x : int(x.strftime(\"%H\") ))\ndf['subscription_time_AM_PM'] = df.subscription_time.map( lambda x : 1 if x>=12 else 0)\ndf['subscription_num_week'] = df.SubscriptionDate.map( lambda x : int(x.strftime(\"%V\")) )\ndf['subscription_week_day'] = df.SubscriptionDate.map( lambda x : int(x.weekday()) )\ndf['subscription_month'] = df.SubscriptionDate.map( lambda x : int(x.strftime('%m')) )\ndf.tail()",
"_____no_output_____"
]
],
[
[
"---\n### Verificando distribuiรงรตes",
"_____no_output_____"
]
],
[
[
"df.register_time.hist()",
"_____no_output_____"
],
[
"df.subscription_time.hist()",
"_____no_output_____"
],
[
"df.register_time_AM_PM.value_counts()",
"_____no_output_____"
],
[
"df.subscription_time_AM_PM.value_counts()",
"_____no_output_____"
],
[
"df.subscription_week_day.value_counts()",
"_____no_output_____"
],
[
"df.diffDays.hist()",
"_____no_output_____"
],
[
"df.diffDays.quantile([.25,.5,.75,.95])",
"_____no_output_____"
]
],
[
[
"Separando os dados em 2 momentos.",
"_____no_output_____"
]
],
[
[
"lt_50 = df.loc[(df.diffDays <50) & (df.diffDays >3)]\nlt_50.diffDays.hist()\nlt_50.diffDays.value_counts()",
"_____no_output_____"
],
[
"lt_50.diffDays.quantile([.25,.5,.75,.95])\n",
"_____no_output_____"
],
[
"range_0_3 = df.loc[(df.diffDays < 3)]\nrange_3_18 = df.loc[(df.diffDays >= 3)&(df.diffDays < 18)]\nrange_6_11 = df.loc[(df.diffDays >= 6) & (df.diffDays < 11)]\nrange_11_18 = df.loc[(df.diffDays >= 11) & (df.diffDays < 18)]\nrange_18_32 = df.loc[(df.diffDays >= 18 )& (df.diffDays <= 32)]\nrange_32 = df.loc[(df.diffDays >=32)]",
"_____no_output_____"
],
[
"total_subs = df.shape[0]\n(\nround(range_0_3.shape[0] / total_subs,2),\nround(range_3_18.shape[0] / total_subs,2),\nround(range_18_32.shape[0] / total_subs,2),\nround(range_32.shape[0] / total_subs,2)\n )",
"_____no_output_____"
],
[
"gte_30 = df.loc[df.diffDays >=32]\ngte_30.diffDays.hist()\ngte_30.diffDays.value_counts()\ngte_30.shape",
"_____no_output_____"
],
[
"gte_30.diffDays.quantile([.25,.5,.75,.95])",
"_____no_output_____"
],
[
"range_32_140 = df.loc[(df.diffDays > 32)&(df.diffDays <=140)]\nrange_140_168 = df.loc[(df.diffDays > 140)&(df.diffDays <=168)]\nrange_168_188 = df.loc[(df.diffDays > 168)&(df.diffDays <=188)]\nrange_188 = df.loc[(df.diffDays > 188)]\n\ntotal_subs_gte_32 = gte_30.shape[0]\n(\nround(range_32_140.shape[0] / total_subs,2),\nround(range_140_168.shape[0] / total_subs,2),\nround(range_168_188.shape[0] / total_subs,2),\nround(range_188.shape[0] / total_subs,2)\n )",
"_____no_output_____"
],
[
"(\nround(range_32_140.shape[0] / total_subs_gte_32,2),\nround(range_140_168.shape[0] / total_subs_gte_32,2),\nround(range_168_188.shape[0] / total_subs_gte_32,2),\nround(range_188.shape[0] / total_subs_gte_32,2)\n )\n",
"_____no_output_____"
]
],
[
[
"----\n## Questรฃo 1:\nDentre os usuรกrios cadastrados em Nov/2017 que assinaram o Plano Premium,\nqual a probabilidade do usuรกrio virar Premium apรณs o cadastro em ranges de dias? A escolha\ndos ranges deve ser feita por vocรช, tendo em vista os insights que podemos tirar para o\nnegรณcio.",
"_____no_output_____"
],
[
"- De 0 a 3 dias -> 53%\n- De 3 a 18 dias -> 12%\n- De 18 a 32 -> 3%\n- Mais 32 dias -> 33%\n\n\nAnalisando as inscriรงรตes feitas depois do primeiro mรชs (33%)\n\n* De 32 a 140 -> 8%\n* De 140 a 168 -> 8%\n* De 168 a 188 -> 8%\n* De 188 a 216 -> 8%",
"_____no_output_____"
],
[
"Um pouco mais da metade das conversรตes acontecem nos primeiros 3 dias.\nA taxa conversรฃo chega a 65% atรฉ 18 dias apรณs o registro.\nApรณs 100 dias acontece outro momento relevante que representa 33%.\nPossivelmente essa janela coincide com o calendรกrio de provas das instituiรงรตes.\n\nInsights:\n* Maioria das conversรตes no perรญodo da tarde\n* Maioria das conversรตes no comeรงo da semana ( anรบncios aos domingos )\n* Direcionar anรบncios de instagram geolocalizados (instituiรงรตes) nos perรญodos que antecede o calendรกrio de provas.\n* Tentar converter usuรกrios ativos 100 dias apรณs o registro\n* Tentar converter usuรกrios com base no calendรกrio de provas da instituiรงรฃo",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
]
]
|
d062f5c370018e72626a3bbe53882379df9d5c52 | 4,946 | ipynb | Jupyter Notebook | Numpy/Numpy Operations.ipynb | aaavinash85/100-Days-of-ML- | d055d718f7972e3a4469279b9112867a42cf652f | [
"Apache-2.0"
]
| 3 | 2021-01-15T14:59:57.000Z | 2021-07-01T07:32:19.000Z | Numpy/Numpy Operations.ipynb | aaavinash85/100-Days-of-ML- | d055d718f7972e3a4469279b9112867a42cf652f | [
"Apache-2.0"
]
| null | null | null | Numpy/Numpy Operations.ipynb | aaavinash85/100-Days-of-ML- | d055d718f7972e3a4469279b9112867a42cf652f | [
"Apache-2.0"
]
| 1 | 2021-07-01T07:32:23.000Z | 2021-07-01T07:32:23.000Z | 19.170543 | 115 | 0.45552 | [
[
[
"# NumPy Operations",
"_____no_output_____"
],
[
"## Arithmetic\n\nYou can easily perform array with array arithmetic, or scalar with array arithmetic. Let's see some examples:",
"_____no_output_____"
]
],
[
[
"import numpy as np\narr = np.arange(0,10)",
"_____no_output_____"
],
[
"arr + arr",
"_____no_output_____"
],
[
"arr * arr",
"_____no_output_____"
],
[
"arr - arr",
"_____no_output_____"
],
[
"arr**3",
"_____no_output_____"
]
],
[
[
"## Universal Array Functions\n\n",
"_____no_output_____"
]
],
[
[
"#Taking Square Roots\nnp.sqrt(arr)",
"_____no_output_____"
],
[
"#Calcualting exponential (e^)\nnp.exp(arr)",
"_____no_output_____"
],
[
"np.max(arr) #same as arr.max()",
"_____no_output_____"
],
[
"np.sin(arr)",
"_____no_output_____"
],
[
"np.log(arr)",
"<ipython-input-3-a67b4ae04e95>:1: RuntimeWarning: divide by zero encountered in log\n np.log(arr)\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d062f667bf2ab6a5ad7d20d6bbfe1c67d779f452 | 24,819 | ipynb | Jupyter Notebook | docker_galaxy/data/brassica_data/brapa_nb/1-bashNB_bra.ipynb | wilkinsonlab/epigenomics_pipeline | ed4a7fd97b798110e88364b101d020b2baebc298 | [
"MIT"
]
| null | null | null | docker_galaxy/data/brassica_data/brapa_nb/1-bashNB_bra.ipynb | wilkinsonlab/epigenomics_pipeline | ed4a7fd97b798110e88364b101d020b2baebc298 | [
"MIT"
]
| null | null | null | docker_galaxy/data/brassica_data/brapa_nb/1-bashNB_bra.ipynb | wilkinsonlab/epigenomics_pipeline | ed4a7fd97b798110e88364b101d020b2baebc298 | [
"MIT"
]
| 1 | 2019-10-17T10:50:38.000Z | 2019-10-17T10:50:38.000Z | 38.124424 | 943 | 0.462065 | [
[
[
"empty"
]
]
]
| [
"empty"
]
| [
[
"empty"
]
]
|
d063024d84566109dd7dd18bc446b9f87d5c39bb | 3,619 | ipynb | Jupyter Notebook | data/Untitled.ipynb | DSqiansun/CloudComparer | f5dfc6bda3ccb1d80421c241931d19f069ff2475 | [
"MIT"
]
| null | null | null | data/Untitled.ipynb | DSqiansun/CloudComparer | f5dfc6bda3ccb1d80421c241931d19f069ff2475 | [
"MIT"
]
| null | null | null | data/Untitled.ipynb | DSqiansun/CloudComparer | f5dfc6bda3ccb1d80421c241931d19f069ff2475 | [
"MIT"
]
| null | null | null | 26.035971 | 113 | 0.494059 | [
[
[
"def flatten_json(nested_json, exclude=['']):\n \"\"\"Flatten json object with nested keys into a single level.\n Args:\n nested_json: A nested json object.\n exclude: Keys to exclude from output.\n Returns:\n The flattened json object if successful, None otherwise.\n \"\"\"\n out = {}\n\n def flatten(x, name='', exclude=exclude):\n if type(x) is dict:\n for a in x:\n if a not in exclude: flatten(x[a], name + a + '_')\n elif type(x) is list:\n i = 0\n for a in x:\n flatten(a, name + str(i) + '_')\n i += 1\n else:\n out[name[:-1]] = x\n\n flatten(nested_json)\n return out",
"_____no_output_____"
],
[
"\n",
"_____no_output_____"
],
[
"from flatten_json import flatten\nimport json\n\nwith open('Services.json') as f:\n data = json.load(f)\ndic_flattened = (flatten(d) for d in data)\ndf = pd.DataFrame(dic_flattened)\nclos = [col for col in list(df.columns) if 'Propertie' not in col]\ndf = df[clos]#.drop_duplicates().to_csv('cloud_service.csv', index=False)",
"_____no_output_____"
],
[
"def rchop(s, suffix):\n if suffix and s.endswith(suffix):\n s = s[:-len(suffix)]\n if suffix and s.endswith(suffix):\n return rchop(s, suffix)\n return s\n\ndef concate(df, cloud, type_):\n col_ = [col for col in list(df.columns) if cloud in col and type_ in col]\n return df[col_].fillna('').astype(str).agg('<br/>'.join, axis=1).apply(lambda x: rchop(x, '<br/>') )\n\n\nclouds = ['aws', 'azure', 'google', 'ibm', 'alibaba', 'oracle']\ntype_s = ['name', 'ref', 'icon']\n\nfor cloud in clouds:\n for type_ in type_s:\n df[cloud +'_'+ type_] = concate(df, cloud, type_)\n",
"_____no_output_____"
],
[
"df.drop_duplicates().to_csv('cloud_service.csv', index=False)",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d06303ee27ab4398b14cd8cdbe43733805679c26 | 11,587 | ipynb | Jupyter Notebook | Chapters/06.MinimumSpanningTrees/Chapter6.ipynb | MichielStock/SelectedTopicsOptimization | 20f6b37566d23cdde0ac6b765ffcc5ed72a11172 | [
"MIT"
]
| 22 | 2017-03-21T14:01:10.000Z | 2022-03-02T18:51:40.000Z | Chapters/06.MinimumSpanningTrees/Chapter6.ipynb | MichielStock/SelectedTopicsOptimization | 20f6b37566d23cdde0ac6b765ffcc5ed72a11172 | [
"MIT"
]
| 2 | 2018-03-22T09:54:01.000Z | 2018-05-30T16:16:53.000Z | Chapters/06.MinimumSpanningTrees/Chapter6.ipynb | MichielStock/SelectedTopicsOptimization | 20f6b37566d23cdde0ac6b765ffcc5ed72a11172 | [
"MIT"
]
| 18 | 2018-01-21T15:23:51.000Z | 2022-02-05T20:12:03.000Z | 26.759815 | 493 | 0.543454 | [
[
[
"# Minimum spanning trees\n\n*Selected Topics in Mathematical Optimization*\n\n**Michiel Stock** ([email]([email protected]))\n\n",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n%matplotlib inline\nfrom minimumspanningtrees import red, green, blue, orange, yellow",
"_____no_output_____"
]
],
[
[
"## Graphs in python\n\nConsider the following example graph:\n\n",
"_____no_output_____"
],
[
"This graph can be represented using an *adjacency list*. We do this using a `dict`. Every vertex is a key with the adjacent vertices given as a `set` containing tuples `(weight, neighbor)`. The weight is first because this makes it easy to compare the weights of two edges. Note that for every ingoing edges, there is also an outgoing edge, this is an undirected graph.",
"_____no_output_____"
]
],
[
[
"graph = {\n 'A' : set([(2, 'B'), (3, 'D')]),\n 'B' : set([(2, 'A'), (1, 'C'), (2, 'E')]),\n 'C' : set([(1, 'B'), (2, 'D'), (1, 'E')]),\n 'D' : set([(2, 'C'), (3, 'A'), (3, 'E')]),\n 'E' : set([(2, 'B'), (1, 'C'), (3, 'D')])\n}",
"_____no_output_____"
]
],
[
[
"Sometimes we will use an *edge list*, i.e. a list of (weighted) edges. This is often a more compact way of storing a graph. The edge list is given below. Note that again every edge is double: an in- and outgoing edge is included.",
"_____no_output_____"
]
],
[
[
"edges = [\n (2, 'B', 'A'),\n (3, 'D', 'A'),\n (2, 'C', 'D'),\n (3, 'A', 'D'),\n (3, 'E', 'D'),\n (2, 'B', 'E'),\n (3, 'D', 'E'),\n (1, 'C', 'E'),\n (2, 'E', 'B'),\n (2, 'A', 'B'),\n (1, 'C', 'B'),\n (1, 'E', 'C'),\n (1, 'B', 'C'),\n (2, 'D', 'C')]",
"_____no_output_____"
]
],
[
[
"We can easily turn one representation in the other (with a time complexity proportional to the number of edges) using the provided functions `edges_to_adj_list` and `adj_list_to_edges`.",
"_____no_output_____"
]
],
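The two helpers are imported from `minimumspanningtrees` and not shown in the notebook; a possible implementation, assuming the weight-first tuple conventions shown above (the real module may differ):

```python
def edges_to_adj_list(edges):
    """(weight, u, v) edge list -> {v: {(weight, u), ...}} adjacency dict."""
    adj_list = {}
    for weight, u, v in edges:
        adj_list.setdefault(v, set()).add((weight, u))
    return adj_list

def adj_list_to_edges(adj_list):
    """Adjacency dict -> list of (weight, u, v) edges."""
    return [(weight, u, v)
            for v, neighbors in adj_list.items()
            for weight, u in neighbors]
```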
[
[
"from minimumspanningtrees import edges_to_adj_list, adj_list_to_edges",
"_____no_output_____"
],
[
"adj_list_to_edges(graph)",
"_____no_output_____"
],
[
"edges_to_adj_list(edges)",
"_____no_output_____"
]
],
[
[
"## Disjoint-set data structure\n\nImplementing an algorithm for finding the minimum spanning tree is fairly straightforward. The only bottleneck is that the algorithm requires the a disjoint-set data structure to keep track of a set partitioned in a number of disjoined subsets.\n\nFor example, consider the following inital set of eight elements.\n\n\n\nWe decide to group elements A, B and C together in a subset and F and G in another subset.\n\n\n\nThe disjoint-set data structure support the following operations:\n\n- **Find**: check which subset an element is in. Is typically used to check whether two objects are in the same subset;\n- **Union** merges two subsets into a single subset.\n\nA python implementation of a disjoint-set is available using an union-set forest. A simple example will make everything clear!",
"_____no_output_____"
]
],
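The `USF` class itself lives in `union_set_forest.py`, which is not shown here; a minimal sketch of such a union-set forest, using path compression and union by rank (the provided class may differ in detail):

```python
class USF:
    def __init__(self, elements):
        self.parent = {el: el for el in elements}
        self.rank = {el: 0 for el in elements}

    def find(self, el):
        # walk to the root, halving the path as we go (path compression)
        while self.parent[el] != el:
            self.parent[el] = self.parent[self.parent[el]]
            el = self.parent[el]
        return el

    def union(self, a, b):
        root_a, root_b = self.find(a), self.find(b)
        if root_a == root_b:
            return                      # already in the same subset
        if self.rank[root_a] < self.rank[root_b]:
            root_a, root_b = root_b, root_a
        self.parent[root_b] = root_a    # attach the shallower tree
        if self.rank[root_a] == self.rank[root_b]:
            self.rank[root_a] += 1
```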
[
[
"from union_set_forest import USF\n\nanimals = ['mouse', 'bat', 'robin', 'trout', 'seagull', 'hummingbird',\n 'salmon', 'goldfish', 'hippopotamus', 'whale', 'sparrow']\nunion_set_forest = USF(animals)\n\n# group mammals together\nunion_set_forest.union('mouse', 'bat')\nunion_set_forest.union('mouse', 'hippopotamus')\nunion_set_forest.union('whale', 'bat')\n\n# group birds together\nunion_set_forest.union('robin', 'seagull')\nunion_set_forest.union('seagull', 'sparrow')\nunion_set_forest.union('seagull', 'hummingbird')\nunion_set_forest.union('robin', 'hummingbird')\n\n# group fishes together\nunion_set_forest.union('goldfish', 'salmon')\nunion_set_forest.union('trout', 'salmon')",
"_____no_output_____"
],
[
"# mouse and whale in same subset?\nprint(union_set_forest.find('mouse') == union_set_forest.find('whale'))",
"_____no_output_____"
],
[
"# robin and salmon in the same subset?\nprint(union_set_forest.find('robin') == union_set_forest.find('salmon'))",
"_____no_output_____"
]
],
[
[
"## Heap queue\n\nCan be used to find the minimum of a changing list without having to sort the list every update.",
"_____no_output_____"
]
],
[
[
"from heapq import heapify, heappop, heappush\n\nheap = [(5, 'A'), (3, 'B'), (2, 'C'), (7, 'D')]\n\nheapify(heap) # turn in a heap\n\nprint(heap)",
"_____no_output_____"
],
[
"# return item lowest value while retaining heap property\nprint(heappop(heap))",
"_____no_output_____"
],
[
"print(heap)",
"_____no_output_____"
],
[
"# add new item and retain heap prop\nheappush(heap, (4, 'E'))\nprint(heap)",
"_____no_output_____"
]
],
[
[
"## Prim's algorithm\n\nPrim's algorithm starts with a single vertex and add $|V|-1$ edges to it, always taking the next edge with minimal weight that connects a vertex on the MST to a vertex not yet in the MST.",
"_____no_output_____"
]
],
[
[
"from minimumspanningtrees import prim",
"_____no_output_____"
]
],
[
[
"def prim(vertices, edges, start):\n \"\"\"\n Prim's algorithm for finding a minimum spanning tree.\n\n Inputs :\n - vertices : a set of the vertices of the Graph\n - edges : a list of weighted edges (e.g. (0.7, 'A', 'B') for an\n edge from node A to node B with weigth 0.7)\n - start : a vertex to start with\n\n Output:\n - edges : a minumum spanning tree represented as a list of edges\n - total_cost : total cost of the tree\n \"\"\"\n adj_list = edges_to_adj_list(edges) # easier using an adjacency list\n \n ... # to complete\n return mst_edges, total_cost",
"_____no_output_____"
]
],
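The template above is deliberately left as an exercise; one possible way to fill in the `...`, keeping a heap of candidate edges that leave the growing tree (a sketch, not necessarily the module's reference solution):

```python
from heapq import heapify, heappop, heappush

def prim_sketch(vertices, edges, start):
    adj_list = edges_to_adj_list(edges)
    mst_edges, total_cost = [], 0
    visited = {start}
    heap = [(w, start, u) for w, u in adj_list[start]]
    heapify(heap)
    while heap and len(visited) < len(vertices):
        w, v, u = heappop(heap)          # cheapest edge leaving the tree
        if u in visited:
            continue                     # both endpoints already in the tree
        visited.add(u)
        mst_edges.append((w, v, u))
        total_cost += w
        for w2, n in adj_list[u]:
            if n not in visited:
                heappush(heap, (w2, u, n))
    return mst_edges, total_cost
```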
[
[
"## Kruskal's algorithm\n\n\nKruskal's algorithm is a very simple algorithm to find the minimum spanning tree. The main idea is to start with an intial 'forest' of the individual nodes of the graph. In each step of the algorithm we add an edge with the smallest possible value that connects two disjoint trees in the forest. This process is continued until we have a single tree, which is a minimum spanning tree, or until all edges are considered. In the latter case, the algoritm returns a minimum spanning forest.",
"_____no_output_____"
]
],
[
[
"from minimumspanningtrees import kruskal",
"_____no_output_____"
],
[
"def kruskal(vertices, edges):\n \"\"\"\n Kruskal's algorithm for finding a minimum spanning tree.\n\n Inputs :\n - vertices : a set of the vertices of the Graph\n - edges : a list of weighted edges (e.g. (0.7, 'A', 'B') for an\n edge from node A to node B with weigth 0.7)\n\n Output:\n - edges : a minumum spanning tree represented as a list of edges\n - total_cost : total cost of the tree\n \"\"\"\n ... # to complete\n return mst_edges, total_cost",
"_____no_output_____"
]
],
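Again left as an exercise; a possible completion that sorts the edges once and uses a union-set forest to detect cycles (a sketch only):

```python
def kruskal_sketch(vertices, edges):
    usf = USF(vertices)                  # the disjoint-set structure from above
    mst_edges, total_cost = [], 0
    for weight, u, v in sorted(edges):   # cheapest edges first
        if usf.find(u) != usf.find(v):   # edge connects two different trees
            usf.union(u, v)
            mst_edges.append((weight, u, v))
            total_cost += weight
    return mst_edges, total_cost
```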
[
[
"from tickettoride import vertices, edges",
"_____no_output_____"
]
],
[
[
"print(vertices)",
"_____no_output_____"
],
[
"print(edges[:5])",
"_____no_output_____"
],
[
"# compute the minimum spanning tree of the ticket to ride data set\n...",
"_____no_output_____"
]
],
[
[
"## Clustering\n\nMinimum spanning trees on a distance graph can be used to cluster a data set.",
"_____no_output_____"
]
],
[
[
"# import features and distance\nfrom clustering import X, D",
"_____no_output_____"
],
[
"fig, ax = plt.subplots()\nax.scatter(X[:,0], X[:,1], color=green)",
"_____no_output_____"
],
[
"# cluster the data based on the distance",
"_____no_output_____"
]
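One way to complete this cell: compute the MST of the distance matrix, delete the k−1 heaviest MST edges, and read the clusters off as connected components (single-linkage clustering). A sketch assuming `D` is a dense distance matrix, SciPy is available, and `n_clusters >= 2` (ties in edge weights may cut extra edges):

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def mst_clusters(D, n_clusters=2):
    mst = minimum_spanning_tree(D).toarray()
    weights = np.sort(mst[mst > 0])
    cutoff = weights[-(n_clusters - 1)]   # weight of the (k-1)-th heaviest edge
    mst[mst >= cutoff] = 0                # cut the heaviest k-1 edges
    _, labels = connected_components(mst, directed=False)
    return labels

clusters = mst_clusters(D, n_clusters=2)
plt.scatter(X[:, 0], X[:, 1], c=clusters)
```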
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"raw",
"code",
"markdown",
"code",
"raw",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"raw"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"raw"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
]
|
d0634585edff76f717a666c4474ee29bebee9bc6 | 4,661 | ipynb | Jupyter Notebook | section_2/03_simple_bert.ipynb | derwind/bert_nlp | ff1279e276c85a789edc863e1c27bbc2ef86e1f0 | [
"MIT"
]
| null | null | null | section_2/03_simple_bert.ipynb | derwind/bert_nlp | ff1279e276c85a789edc863e1c27bbc2ef86e1f0 | [
"MIT"
]
| null | null | null | section_2/03_simple_bert.ipynb | derwind/bert_nlp | ff1279e276c85a789edc863e1c27bbc2ef86e1f0 | [
"MIT"
]
| null | null | null | 4,661 | 4,661 | 0.680326 | [
[
[
"<a href=\"https://colab.research.google.com/github/yukinaga/bert_nlp/blob/main/section_2/03_simple_bert.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# ใทใณใใซใชBERTใฎๅฎ่ฃ
\n่จ็ทดๆธใฟใฎใขใใซใไฝฟ็จใใๆ็ซ ใฎไธ้จใฎไบๆธฌใๅใณ2ใคใฎๆ็ซ ใ้ฃ็ถใใฆใใใใฉใใใฎๅคๅฎใ่กใใพใใ",
"_____no_output_____"
],
[
"## ใฉใคใใฉใชใฎใคใณในใใผใซ\nPyTorch-Transformersใใใใณๅฟ
่ฆใชใฉใคใใฉใชใฎใคใณในใใผใซใ่กใใพใใ",
"_____no_output_____"
]
],
[
[
"!pip install folium==0.2.1\n!pip install urllib3==1.25.11\n!pip install transformers==4.13.0",
"_____no_output_____"
]
],
[
[
"## ๆ็ซ ใฎไธ้จใฎไบๆธฌ\nๆ็ซ ใซใใใไธ้จใฎๅ่ชใMASKใใใใใBERTใฎใขใใซใไฝฟใฃใฆไบๆธฌใใพใใ",
"_____no_output_____"
]
],
[
[
"import torch\nfrom transformers import BertForMaskedLM\nfrom transformers import BertTokenizer\n\n\ntext = \"[CLS] I played baseball with my friends at school yesterday [SEP]\"\ntokenizer = BertTokenizer.from_pretrained(\"bert-base-uncased\")\nwords = tokenizer.tokenize(text)\nprint(words)",
"_____no_output_____"
]
],
[
[
"ๆ็ซ ใฎไธ้จใMASKใใพใใ",
"_____no_output_____"
]
],
[
[
"msk_idx = 3\nwords[msk_idx] = \"[MASK]\" # ๅ่ชใ[MASK]ใซ็ฝฎใๆใใ\nprint(words)",
"_____no_output_____"
]
],
[
[
"ๅ่ชใๅฏพๅฟใใใคใณใใใฏในใซๅคๆใใพใใ",
"_____no_output_____"
]
],
[
[
"word_ids = tokenizer.convert_tokens_to_ids(words) # ๅ่ชใใคใณใใใฏในใซๅคๆ\nword_tensor = torch.tensor([word_ids]) # ใใณใฝใซใซๅคๆ\nprint(word_tensor)",
"_____no_output_____"
]
],
[
[
"BERTใฎใขใใซใไฝฟใฃใฆไบๆธฌใ่กใใพใใ",
"_____no_output_____"
]
],
[
[
"msk_model = BertForMaskedLM.from_pretrained(\"bert-base-uncased\")\nmsk_model.cuda() # GPUๅฏพๅฟ\nmsk_model.eval()\n\nx = word_tensor.cuda() # GPUๅฏพๅฟ\ny = msk_model(x) # ไบๆธฌ\nresult = y[0]\nprint(result.size()) # ็ตๆใฎๅฝข็ถ\n\n_, max_ids = torch.topk(result[0][msk_idx], k=5) # ๆใๅคงใใ5ใคใฎๅค\nresult_words = tokenizer.convert_ids_to_tokens(max_ids.tolist()) # ใคใณใใใฏในใๅ่ชใซๅคๆ\nprint(result_words)",
"_____no_output_____"
]
],
[
[
"## ๆ็ซ ใ้ฃ็ถใใฆใใใใฉใใใฎๅคๅฎ\nBERTใฎใขใใซใไฝฟใฃใฆใ2ใคใฎๆ็ซ ใ้ฃ็ถใใฆใใใใฉใใใฎๅคๅฎใ่กใใพใใ \nไปฅไธใฎ้ขๆฐ`show_continuity`ใงใฏใ2ใคใฎๆ็ซ ใฎ้ฃ็ถๆงใๅคๅฎใใ่กจ็คบใใพใใ",
"_____no_output_____"
]
],
[
[
"from transformers import BertForNextSentencePrediction\n\ndef show_continuity(text, seg_ids):\n words = tokenizer.tokenize(text)\n word_ids = tokenizer.convert_tokens_to_ids(words) # ๅ่ชใใคใณใใใฏในใซๅคๆ\n word_tensor = torch.tensor([word_ids]) # ใใณใฝใซใซๅคๆ\n\n seg_tensor = torch.tensor([seg_ids])\n\n nsp_model = BertForNextSentencePrediction.from_pretrained('bert-base-uncased')\n nsp_model.cuda() # GPUๅฏพๅฟ\n nsp_model.eval()\n\n x = word_tensor.cuda() # GPUๅฏพๅฟ\n s = seg_tensor.cuda() # GPUๅฏพๅฟ\n\n y = nsp_model(x, token_type_ids=s) # ไบๆธฌ\n result = torch.softmax(y[0], dim=1)\n print(result) # Softmaxใง็ขบ็ใซ\n print(str(result[0][0].item()*100) + \"%ใฎ็ขบ็ใง้ฃ็ถใใฆใใพใใ\")",
"_____no_output_____"
]
],
[
[
"`show_continuity`้ขๆฐใซใ่ช็ถใซใคใชใใ2ใคใฎๆ็ซ ใไธใใพใใ",
"_____no_output_____"
]
],
[
[
"text = \"[CLS] What is baseball ? [SEP] It is a game of hitting the ball with the bat [SEP]\"\nseg_ids = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ,1, 1] # 0:ๅใฎๆ็ซ ใฎๅ่ชใ1:ๅพใฎๆ็ซ ใฎๅ่ช\nshow_continuity(text, seg_ids)",
"_____no_output_____"
]
],
[
[
"`show_continuity`้ขๆฐใซใ่ช็ถใซใคใชใใใชใ2ใคใฎๆ็ซ ใไธใใพใใ",
"_____no_output_____"
]
],
[
[
"text = \"[CLS] What is baseball ? [SEP] This food is made with flour and milk [SEP]\"\nseg_ids = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1] # 0:ๅใฎๆ็ซ ใฎๅ่ชใ1:ๅพใฎๆ็ซ ใฎๅ่ช\nshow_continuity(text, seg_ids)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d06345a8a4ec272aabf44402ccdab8f1f0da4720 | 11,445 | ipynb | Jupyter Notebook | _notebooks/2022-05-03-binary-search-or-bust.ipynb | boolean-pandit/non-faangable-tokens | 14872620b21077be68ae64a2527643ed29e2260e | [
"Apache-2.0"
]
| null | null | null | _notebooks/2022-05-03-binary-search-or-bust.ipynb | boolean-pandit/non-faangable-tokens | 14872620b21077be68ae64a2527643ed29e2260e | [
"Apache-2.0"
]
| null | null | null | _notebooks/2022-05-03-binary-search-or-bust.ipynb | boolean-pandit/non-faangable-tokens | 14872620b21077be68ae64a2527643ed29e2260e | [
"Apache-2.0"
]
| null | null | null | 30.601604 | 506 | 0.562779 | [
[
[
"# Binary Search or Bust\n> Binary search is useful for searching, but its implementation often leaves us searching for edge cases\n\n- toc: true \n- badges: true\n- comments: true\n- categories: [data structures & algorithms, coding interviews, searching]\n- image: images/binary_search_gif.gif",
"_____no_output_____"
],
[
"# Why should you care?\nBinary search is useful for searching through a set of values (which typically are sorted) efficiently. At each step, it reduces the search space by half, thereby running in $O(log(n))$ complexity. While it sounds simple enough to understand, it is deceptively tricky to implement and use in problems. Over the next few sections, let's take a look at binary search and it can be applied to some commonly encountered interview problems.",
"_____no_output_____"
],
[
"# A Recipe for Binary Searching\nHow does binary search reduce the search space by half? It leverages the fact that the input is sorted (_most of the time_) and compares the middle value of the search space at any step with the target value that we're searching for. If the middle value is smaller than the target, then we know that the target can only lie to its right, thus eliminating all the values to the left of the middle value and vice versa. So what information do we need to implement binary search?\n1. The left and right ends of the search space \n2. The target value we're searching for\n3. What to store at each step if any\n\nHere's a nice video which walks through the binary search algorithm:\n > youtube: https://youtu.be/P3YID7liBug\n",
"_____no_output_____"
],
[
"Next, let's look at an implementation of vanilla binary search. ",
"_____no_output_____"
]
],
[
[
"#hide\nfrom typing import List, Dict, Tuple ",
"_____no_output_____"
],
[
"def binary_search(nums: List[int], target: int) -> int:\n \"\"\"Vanilla Binary Search.\n Given a sorted list of integers and a target value,\n find the index of the target value in the list.\n If not present, return -1.\n \"\"\"\n # Left and right boundaries of the search space\n left, right = 0, len(nums) - 1\n while left <= right:\n # Why not (left + right) // 2 ?\n # Hint: Doesn't matter for Python\n middle = left + (right - left) // 2\n\n # Found the target, return the index\n if nums[middle] == target:\n return middle \n # The middle value is less than the\n # target, so look to the right\n elif nums[middle] < target:\n left = middle + 1\n # The middle value is greater than the\n # target, so look to the left\n else:\n right = middle - 1\n return -1 # Target not found",
"_____no_output_____"
]
],
[
[
"Here're a few examples of running our binary search implementation on a list and target values",
"_____no_output_____"
]
],
[
[
"#hide_input\nnums = [1,4,9,54,100,123]\ntargets = [4, 100, 92]\n\nfor val in targets:\n print(f\"Result of searching for {val} in {nums} : \\\n {binary_search(nums, val)}\\n\")\n",
"Result of searching for 4 in [1, 4, 9, 54, 100, 123] : 1\n\nResult of searching for 100 in [1, 4, 9, 54, 100, 123] : 4\n\nResult of searching for 92 in [1, 4, 9, 54, 100, 123] : -1\n\n"
]
],
[
[
"> Tip: Using the approach middle = left + (right - left) // 2 helps avoid overflow. While this isn't a concern in Python, it becomes a tricky issue to debug in other programming languages such as C++. For more on overflow, check out this [article](https://ai.googleblog.com/2006/06/extra-extra-read-all-about-it-nearly.html).",
"_____no_output_____"
],
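Python's integers are arbitrary-precision, but the overflow is easy to simulate with NumPy's fixed-width integers (assumption: numpy installed; the wrapped value follows two's-complement int32 arithmetic):

```python
import numpy as np

left, right = np.int32(2_000_000_000), np.int32(2_100_000_000)
with np.errstate(over='ignore'):
    naive = (left + right) // np.int32(2)    # left + right wraps past 2**31 - 1
safe = left + (right - left) // np.int32(2)  # stays within int32 range
print(naive, safe)                           # -97483648 vs 2050000000
```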
[
"Before we look at some problems that can be solved using binary search, let's run a quick comparison of linear search and binary search on some large input. ",
"_____no_output_____"
]
],
[
[
"def linear_search(nums: List[int], target: int) -> int:\n \"\"\"Linear Search.\n Given a list of integers and a target value, return\n find the index of the target value in the list.\n If not present, return -1.\n \"\"\"\n for idx, elem in enumerate(nums):\n # Found the target value\n if elem == target:\n return idx \n return -1 # Target not found",
"_____no_output_____"
],
[
"#hide\nn = 1000000\nlarge_nums = range((1, n + 1))\ntarget = 99999",
"_____no_output_____"
]
],
[
[
"Let's see the time it takes linear search and binary search to find $99999$ in a sorted list of numbers from $[1, 1000000]$",
"_____no_output_____"
],
[
"- Linear Search",
"_____no_output_____"
]
],
[
[
"#hide_input\n%timeit linear_search(large_nums, target)",
"5.19 ms ยฑ 26.3 ยตs per loop (mean ยฑ std. dev. of 7 runs, 100 loops each)\n"
]
],
[
[
"- Binary Search",
"_____no_output_____"
]
],
[
[
"#hide_input\n%timeit binary_search(large_nums, target)",
"6.05 ยตs ยฑ 46.9 ns per loop (mean ยฑ std. dev. of 7 runs, 100000 loops each)\n"
]
],
[
[
"Hopefully, that drives the point home :wink:.",
"_____no_output_____"
],
[
"# Naรฏve Binary Search Problems\nHere's a list of problems that can be solved using vanilla binary search (or slightly modifying it). Anytime you see a problem statement which goes something like _\"Given a sorted list..\"_ or _\"Find the position of an element\"_, think of using binary search. You can also consider **sorting** the input in case it is an unordered collection of items to reduce it to a binary search problem. Note that this list is by no means exhaustive, but is a good starting point to practice binary search:\n- [Search Insert Position](https://leetcode.com/problems/search-insert-position/\n)\n- [Find the Square Root of x](https://leetcode.com/problems/sqrtx/)\n- [Find First and Last Position of Element in Sorted Array](https://leetcode.com/problems/find-first-and-last-position-of-element-in-sorted-array/)\n- [Search in a Rotated Sorted Array](https://leetcode.com/problems/search-in-rotated-sorted-array/)\n\nIn the problems above, we can either directly apply binary search or adapt it slightly to solve the problem. For example, take the square root problem. We know that the square root of a positive number $n$ has to lie between $[1, n / 2]$. This gives us the bounds for the search space. Applying binary search over this space allows us to find the a good approximation of the square root. See the implementation below for details:",
"_____no_output_____"
]
],
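As a taste of the list above, here is a hedged sketch for *Search Insert Position*: a lower-bound variant of binary search that returns the leftmost index where `target` can be inserted while keeping `nums` sorted:

```python
def search_insert(nums, target):
    left, right = 0, len(nums)           # note: right is exclusive here
    while left < right:
        middle = left + (right - left) // 2
        if nums[middle] < target:
            left = middle + 1
        else:
            right = middle
    return left

print(search_insert([1, 3, 5, 6], 5))    # 2 (found at index 2)
print(search_insert([1, 3, 5, 6], 2))    # 1 (would be inserted at index 1)
```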
[
[
"def find_square_root(n: int) -> int:\n \"\"\"Integer square root.\n Given a positive integer, return\n its square root.\n \"\"\"\n left, right = 1, n // 2 + 1\n\n while left <= right:\n middle = left + (right - left) // 2\n if middle * middle == n:\n return middle # Found an exact match\n elif middle * middle < n:\n left = middle + 1 # Go right\n else:\n right = middle - 1 # Go left\n \n return right # This is the closest value to the actual square root",
"_____no_output_____"
],
[
"#hide_input\nnums = [1,4,8,33,100]\n\nfor val in nums:\n print(f\"Square root of {val} is: {find_square_root(val)}\\n\")",
"Square root of 1 is: 1\n\nSquare root of 4 is: 2\n\nSquare root of 8 is: 2\n\nSquare root of 33 is: 5\n\nSquare root of 100 is: 10\n\n"
]
],
[
[
"# To Be Continued\n- Applying binary search to unordered data\n- Problems where using binary search isn't obvious",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
]
|
d0634fdb80f870ae48693947b54ac4b283f8f24f | 414,536 | ipynb | Jupyter Notebook | Lecture 30 - Assignment.ipynb | Elseidy83/Countery_data | 225c757687c3d799e1d6c719f57494b933d92e0c | [
"MIT"
]
| null | null | null | Lecture 30 - Assignment.ipynb | Elseidy83/Countery_data | 225c757687c3d799e1d6c719f57494b933d92e0c | [
"MIT"
]
| null | null | null | Lecture 30 - Assignment.ipynb | Elseidy83/Countery_data | 225c757687c3d799e1d6c719f57494b933d92e0c | [
"MIT"
]
| null | null | null | 371.781166 | 253,892 | 0.913636 | [
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom sklearn.cluster import AgglomerativeClustering\nimport seaborn as sns\n\nsns.set(rc={'figure.figsize': [7, 7]}, font_scale=1.2)",
"_____no_output_____"
],
[
"df = pd.read_csv('Country-data.csv')\ndf",
"_____no_output_____"
],
[
"ds = df.drop(['country'],axis=1)",
"_____no_output_____"
],
[
"sns.heatmap(ds.corr(), annot=True, fmt='.1f')",
"_____no_output_____"
],
[
"def get_sum(rw):\n return rw['child_mort']+ rw['exports']+rw['health']+rw['imports']+rw['income']+rw['inflation']+rw['life_expec']+rw['total_fer']+rw['gdpp']",
"_____no_output_____"
],
[
"dd = ds.corr().abs()",
"_____no_output_____"
],
[
"dd.apply(get_sum).sort_values(ascending=False)",
"_____no_output_____"
],
[
"ds = ds.drop(['inflation','imports','health'],axis=1)",
"_____no_output_____"
],
[
"ds",
"_____no_output_____"
],
[
"from sklearn.preprocessing import StandardScaler\nscaler = StandardScaler()\nx_scaled = scaler.fit_transform(ds)",
"_____no_output_____"
],
[
"x_scaled",
"_____no_output_____"
],
[
"sns.pairplot(ds)",
"_____no_output_____"
],
[
"plt.scatter(x_scaled[:, 0], x_scaled[:, 1])",
"_____no_output_____"
],
[
"import scipy.cluster.hierarchy as sch",
"_____no_output_____"
],
[
"dendrogram = sch.dendrogram(sch.linkage(ds, method='ward'))",
"_____no_output_____"
],
[
"model = AgglomerativeClustering(n_clusters=5)\nclusters = model.fit_predict(x_scaled)\nclusters",
"_____no_output_____"
],
[
"plt.scatter(x_scaled[:, 0], x_scaled[:,7], c=clusters, cmap='viridis')",
"_____no_output_____"
],
[
"df['Clusters'] = clusters\ndf",
"_____no_output_____"
],
[
"df.groupby('Clusters').describe().transpose()",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0635092cb676270c58b8bb3262e443ac5adbcfa | 52,667 | ipynb | Jupyter Notebook | 06. VGGNet Architecture/Mini VGGNet.ipynb | ThinamXx/ComputerVision | 732d251bc19bf9d9c1b497037c6acd3d640066b9 | [
"MIT"
]
| 13 | 2021-11-21T12:06:40.000Z | 2022-03-30T00:54:06.000Z | 06. VGGNet Architecture/Mini VGGNet.ipynb | ThinamXx/ComputerVision | 732d251bc19bf9d9c1b497037c6acd3d640066b9 | [
"MIT"
]
| null | null | null | 06. VGGNet Architecture/Mini VGGNet.ipynb | ThinamXx/ComputerVision | 732d251bc19bf9d9c1b497037c6acd3d640066b9 | [
"MIT"
]
| 10 | 2021-11-20T23:40:15.000Z | 2022-03-11T19:51:28.000Z | 132.997475 | 31,266 | 0.760856 | [
[
[
"**INITIALIZATION:**\n- I use these three lines of code on top of my each notebooks because it will help to prevent any problems while reloading the same project. And the third line of code helps to make visualization within the notebook.",
"_____no_output_____"
]
],
[
[
"#@ INITIALIZATION: \n%reload_ext autoreload\n%autoreload 2\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"**LIBRARIES AND DEPENDENCIES:**\n- I have downloaded all the libraries and dependencies required for the project in one particular cell.",
"_____no_output_____"
]
],
[
[
"#@ IMPORTING NECESSARY LIBRARIES AND DEPENDENCIES:\nfrom keras.models import Sequential\nfrom keras.layers import BatchNormalization\nfrom keras.layers.convolutional import Conv2D\nfrom keras.layers.convolutional import MaxPooling2D\nfrom keras.layers.core import Activation\nfrom keras.layers.core import Flatten\nfrom keras.layers.core import Dense, Dropout\nfrom keras import backend as K\nfrom tensorflow.keras.optimizers import SGD\nfrom tensorflow.keras.datasets import cifar10\nfrom keras.callbacks import LearningRateScheduler\n\nfrom sklearn.preprocessing import LabelBinarizer\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import classification_report\n\nimport matplotlib.pyplot as plt\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"**VGG ARCHITECTURE:**\n- I will define the build method of Mini VGGNet architecture below. It requires four parameters: width of input image, height of input image, depth of image, number of class labels in the classification task. The Sequential class, the building block of sequential networks sequentially stack one layer on top of the other layer initialized below. Batch Normalization operates over the channels, so in order to apply BN, we need to know which axis to normalize over. ",
"_____no_output_____"
]
],
[
[
"#@ DEFINING VGGNET ARCHITECTURE:\nclass MiniVGGNet: # Defining VGG Network. \n @staticmethod\n def build(width, height, depth, classes): # Defining Build Method. \n model = Sequential() # Initializing Sequential Model.\n inputShape = (width, height, depth) # Initializing Input Shape. \n chanDim = -1 # Index of Channel Dimension.\n if K.image_data_format() == \"channels_first\":\n inputShape = (depth, width, height) # Initializing Input Shape. \n chanDim = 1 # Index of Channel Dimension. \n model.add(Conv2D(32, (3, 3), padding='same', \n input_shape=inputShape)) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(Conv2D(32, (3, 3), padding='same')) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(MaxPooling2D(pool_size=(2, 2))) # Adding Max Pooling Layer. \n model.add(Dropout(0.25)) # Adding Dropout Layer.\n model.add(Conv2D(64, (3, 3), padding=\"same\")) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(Conv2D(64, (3, 3), padding='same')) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(MaxPooling2D(pool_size=(2, 2))) # Adding Max Pooling Layer. \n model.add(Dropout(0.25)) # Adding Dropout Layer. \n model.add(Flatten()) # Adding Flatten Layer. \n model.add(Dense(512)) # Adding FC Dense Layer. \n model.add(Activation(\"relu\")) # Adding Activation Layer. \n model.add(BatchNormalization()) # Adding Batch Normalization Layer. \n model.add(Dropout(0.5)) # Adding Dropout Layer. \n model.add(Dense(classes)) # Adding Dense Output Layer. \n model.add(Activation(\"softmax\")) # Adding Softmax Layer. \n return model",
"_____no_output_____"
],
[
"#@ CUSTOM LEARNING RATE SCHEDULER: \ndef step_decay(epoch): # Definig step decay function. \n initAlpha = 0.01 # Initializing initial LR.\n factor = 0.25 # Initializing drop factor. \n dropEvery = 5 # Initializing epochs to drop. \n alpha = initAlpha*(factor ** np.floor((1 + epoch) / dropEvery))\n return float(alpha)",
"_____no_output_____"
]
],
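Before training, it can help to sanity-check the schedule; a quick plot of the decayed learning rate over the 40 epochs used below (relies only on the `step_decay` function and the `plt` import above):

```python
lrs = [step_decay(epoch) for epoch in range(40)]
print(lrs[:6])          # [0.01, 0.01, 0.01, 0.01, 0.0025, 0.0025]
plt.plot(range(40), lrs)
plt.xlabel("Epoch"); plt.ylabel("Learning Rate")
plt.title("Step Decay Schedule")
plt.show()
```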
[
[
"**VGGNET ON CIFAR10**",
"_____no_output_____"
]
],
[
[
"#@ GETTING THE DATASET:\n((trainX, trainY), (testX, testY)) = cifar10.load_data() # Loading Dataset. \ntrainX = trainX.astype(\"float\") / 255.0 # Normalizing Dataset. \ntestX = testX.astype(\"float\") / 255.0 # Normalizing Dataset. \n\n#@ PREPARING THE DATASET:\nlb = LabelBinarizer() # Initializing LabelBinarizer. \ntrainY = lb.fit_transform(trainY) # Converting Labels to Vectors. \ntestY = lb.transform(testY) # Converting Labels to Vectors. \nlabelNames = [\"airplane\", \"automobile\", \"bird\", \"cat\", \"deer\", \n \"dog\", \"frog\", \"horse\", \"ship\", \"truck\"] # Initializing LabelNames.",
"_____no_output_____"
],
[
"#@ INITIALIZING OPTIMIZER AND MODEL: \ncallbacks = [LearningRateScheduler(step_decay)] # Initializing Callbacks. \nopt = SGD(0.01, nesterov=True, momentum=0.9) # Initializing SGD Optimizer. \nmodel = MiniVGGNet.build(width=32, height=32, depth=3, classes=10) # Initializing VGGNet Architecture. \nmodel.compile(loss=\"categorical_crossentropy\", optimizer=opt,\n metrics=[\"accuracy\"]) # Compiling VGGNet Model. \nH = model.fit(trainX, trainY, \n validation_data=(testX, testY), batch_size=64, \n epochs=40, verbose=1, callbacks=callbacks) # Training VGGNet Model.",
"Epoch 1/40\n782/782 [==============================] - 29s 21ms/step - loss: 1.6339 - accuracy: 0.4555 - val_loss: 1.1509 - val_accuracy: 0.5970 - lr: 0.0100\nEpoch 2/40\n782/782 [==============================] - 16s 21ms/step - loss: 1.1813 - accuracy: 0.5932 - val_loss: 0.9222 - val_accuracy: 0.6733 - lr: 0.0100\nEpoch 3/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.9908 - accuracy: 0.6567 - val_loss: 0.8341 - val_accuracy: 0.7159 - lr: 0.0100\nEpoch 4/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.8854 - accuracy: 0.6945 - val_loss: 0.8282 - val_accuracy: 0.7167 - lr: 0.0100\nEpoch 5/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.7380 - accuracy: 0.7421 - val_loss: 0.6881 - val_accuracy: 0.7598 - lr: 0.0025\nEpoch 6/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.6845 - accuracy: 0.7586 - val_loss: 0.6600 - val_accuracy: 0.7711 - lr: 0.0025\nEpoch 7/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.6628 - accuracy: 0.7683 - val_loss: 0.6435 - val_accuracy: 0.7744 - lr: 0.0025\nEpoch 8/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.6391 - accuracy: 0.7755 - val_loss: 0.6362 - val_accuracy: 0.7784 - lr: 0.0025\nEpoch 9/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.6204 - accuracy: 0.7830 - val_loss: 0.6499 - val_accuracy: 0.7744 - lr: 0.0025\nEpoch 10/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5912 - accuracy: 0.7909 - val_loss: 0.6161 - val_accuracy: 0.7856 - lr: 6.2500e-04\nEpoch 11/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5812 - accuracy: 0.7936 - val_loss: 0.6054 - val_accuracy: 0.7879 - lr: 6.2500e-04\nEpoch 12/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5730 - accuracy: 0.7978 - val_loss: 0.5994 - val_accuracy: 0.7907 - lr: 6.2500e-04\nEpoch 13/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5698 - accuracy: 0.7974 - val_loss: 0.6013 - val_accuracy: 0.7882 - lr: 6.2500e-04\nEpoch 14/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5623 - accuracy: 0.8009 - val_loss: 0.5973 - val_accuracy: 0.7910 - lr: 6.2500e-04\nEpoch 15/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5496 - accuracy: 0.8064 - val_loss: 0.5961 - val_accuracy: 0.7905 - lr: 1.5625e-04\nEpoch 16/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5484 - accuracy: 0.8048 - val_loss: 0.5937 - val_accuracy: 0.7914 - lr: 1.5625e-04\nEpoch 17/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5573 - accuracy: 0.8037 - val_loss: 0.5950 - val_accuracy: 0.7902 - lr: 1.5625e-04\nEpoch 18/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5477 - accuracy: 0.8062 - val_loss: 0.5927 - val_accuracy: 0.7907 - lr: 1.5625e-04\nEpoch 19/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5446 - accuracy: 0.8073 - val_loss: 0.5904 - val_accuracy: 0.7923 - lr: 1.5625e-04\nEpoch 20/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5391 - accuracy: 0.8104 - val_loss: 0.5926 - val_accuracy: 0.7920 - lr: 3.9062e-05\nEpoch 21/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5419 - accuracy: 0.8080 - val_loss: 0.5915 - val_accuracy: 0.7929 - lr: 3.9062e-05\nEpoch 22/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5438 - accuracy: 0.8099 - val_loss: 0.5909 - val_accuracy: 0.7925 - lr: 
3.9062e-05\nEpoch 23/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5467 - accuracy: 0.8075 - val_loss: 0.5914 - val_accuracy: 0.7919 - lr: 3.9062e-05\nEpoch 24/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5376 - accuracy: 0.8103 - val_loss: 0.5918 - val_accuracy: 0.7920 - lr: 3.9062e-05\nEpoch 25/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5410 - accuracy: 0.8085 - val_loss: 0.5923 - val_accuracy: 0.7917 - lr: 9.7656e-06\nEpoch 26/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5406 - accuracy: 0.8084 - val_loss: 0.5910 - val_accuracy: 0.7915 - lr: 9.7656e-06\nEpoch 27/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5384 - accuracy: 0.8097 - val_loss: 0.5901 - val_accuracy: 0.7919 - lr: 9.7656e-06\nEpoch 28/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5431 - accuracy: 0.8089 - val_loss: 0.5915 - val_accuracy: 0.7927 - lr: 9.7656e-06\nEpoch 29/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5417 - accuracy: 0.8095 - val_loss: 0.5921 - val_accuracy: 0.7925 - lr: 9.7656e-06\nEpoch 30/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5385 - accuracy: 0.8108 - val_loss: 0.5900 - val_accuracy: 0.7926 - lr: 2.4414e-06\nEpoch 31/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5451 - accuracy: 0.8073 - val_loss: 0.5910 - val_accuracy: 0.7923 - lr: 2.4414e-06\nEpoch 32/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5402 - accuracy: 0.8103 - val_loss: 0.5899 - val_accuracy: 0.7925 - lr: 2.4414e-06\nEpoch 33/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5405 - accuracy: 0.8091 - val_loss: 0.5909 - val_accuracy: 0.7928 - lr: 2.4414e-06\nEpoch 34/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5427 - accuracy: 0.8091 - val_loss: 0.5914 - val_accuracy: 0.7921 - lr: 2.4414e-06\nEpoch 35/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5416 - accuracy: 0.8105 - val_loss: 0.5906 - val_accuracy: 0.7928 - lr: 6.1035e-07\nEpoch 36/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5375 - accuracy: 0.8109 - val_loss: 0.5905 - val_accuracy: 0.7927 - lr: 6.1035e-07\nEpoch 37/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5372 - accuracy: 0.8092 - val_loss: 0.5900 - val_accuracy: 0.7923 - lr: 6.1035e-07\nEpoch 38/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5438 - accuracy: 0.8090 - val_loss: 0.5907 - val_accuracy: 0.7927 - lr: 6.1035e-07\nEpoch 39/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5424 - accuracy: 0.8097 - val_loss: 0.5906 - val_accuracy: 0.7922 - lr: 6.1035e-07\nEpoch 40/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5385 - accuracy: 0.8116 - val_loss: 0.5909 - val_accuracy: 0.7928 - lr: 1.5259e-07\n"
]
],
[
[
"**MODEL EVALUATION:**",
"_____no_output_____"
]
],
[
[
"#@ INITIALIZING MODEL EVALUATION:\npredictions = model.predict(testX, batch_size=64) # Getting Model Predictions. \nprint(classification_report(testY.argmax(axis=1),\n predictions.argmax(axis=1), \n target_names=labelNames)) # Inspecting Classification Report.",
" precision recall f1-score support\n\n airplane 0.85 0.79 0.82 1000\n automobile 0.90 0.88 0.89 1000\n bird 0.73 0.65 0.69 1000\n cat 0.62 0.60 0.61 1000\n deer 0.72 0.81 0.76 1000\n dog 0.71 0.71 0.71 1000\n frog 0.80 0.89 0.84 1000\n horse 0.87 0.82 0.85 1000\n ship 0.89 0.89 0.89 1000\n truck 0.85 0.88 0.86 1000\n\n accuracy 0.79 10000\n macro avg 0.79 0.79 0.79 10000\nweighted avg 0.79 0.79 0.79 10000\n\n"
],
[
"#@ INSPECTING TRAINING LOSS AND ACCURACY:\nplt.style.use(\"ggplot\")\nplt.figure()\nplt.plot(np.arange(0, 40), H.history[\"loss\"], label=\"train_loss\")\nplt.plot(np.arange(0, 40), H.history[\"val_loss\"], label=\"val_loss\")\nplt.plot(np.arange(0, 40), H.history[\"accuracy\"], label=\"train_acc\")\nplt.plot(np.arange(0, 40), H.history[\"val_accuracy\"], label=\"val_acc\")\nplt.title(\"Training Loss and Accuracy\")\nplt.xlabel(\"Epoch\")\nplt.ylabel(\"Loss/Accuracy\")\nplt.legend()\nplt.show();",
"_____no_output_____"
]
],
[
[
"**Note:**\n- Batch Normalization can lead to a faster, more stable convergence with higher accuracy. \n- Batch Normalization will require more wall time to train the network even though the network will obtain higher accuracy in less epochs. ",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
]
|
d063570e27d884ad1284ea042d5745f573a85718 | 15,261 | ipynb | Jupyter Notebook | src/data-cleaning-final.ipynb | emilynomura1/1030MidtermProject | abe25ffde5733d7110f30ce81faf37a2dfa95abc | [
"MIT"
]
| null | null | null | src/data-cleaning-final.ipynb | emilynomura1/1030MidtermProject | abe25ffde5733d7110f30ce81faf37a2dfa95abc | [
"MIT"
]
| null | null | null | src/data-cleaning-final.ipynb | emilynomura1/1030MidtermProject | abe25ffde5733d7110f30ce81faf37a2dfa95abc | [
"MIT"
]
| null | null | null | 36.951574 | 120 | 0.519756 | [
[
[
"# Import packages\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# Read in data. If data is zipped, unzip the file and change file path accordingly\nyelp = pd.read_csv(\"../yelp_academic_dataset_business.csv\",\n dtype={'attributes': str, 'postal_code': str}, low_memory=False)\n\n# Reorder columns\n# https://stackoverflow.com/questions/41968732/set-order-of-columns-in-pandas-dataframe\ncols_to_order = ['name', 'stars', 'review_count', 'categories', 'city', 'state', \n 'postal_code', 'latitude', 'longitude', 'address']\nnew_cols = cols_to_order + (yelp.columns.drop(cols_to_order).tolist())\nyelp = yelp[new_cols]\n\nprint(yelp.shape)\nprint(yelp.info())",
"_____no_output_____"
],
[
"# Remove entries with null in columns: name, categories, city, postal code\nyelp = yelp[(pd.isna(yelp['name'])==False) & \n (pd.isna(yelp['city'])==False) & \n (pd.isna(yelp['categories'])==False) & \n (pd.isna(yelp['postal_code'])==False)]\nprint(yelp.shape)",
"_____no_output_____"
],
[
"# Remove columns with <0.5% non-null values (<894) except BYOB=641 non-null\n# and non-relevant columns\nyelp = yelp.drop(yelp.columns[[6,9,17,26,31,33,34,37,38]], axis=1)\nprint(yelp.shape)",
"_____no_output_____"
],
[
"# Remove entries with < 1000 businesses in each state\nstate_counts = yelp['state'].value_counts()\nyelp = yelp[~yelp['state'].isin(state_counts[state_counts < 1000].index)]\nprint(yelp.shape)",
"_____no_output_____"
],
[
"# Create new column of grouped star rating\nconds = [\n ((yelp['stars'] == 1) | (yelp['stars'] == 1.5)),\n ((yelp['stars'] == 2) | (yelp['stars'] == 2.5)),\n ((yelp['stars'] == 3) | (yelp['stars'] == 3.5)),\n ((yelp['stars'] == 4) | (yelp['stars'] == 4.5)),\n (yelp['stars'] == 5) ]\nvalues = [1, 2, 3, 4, 5]\nyelp['star-rating'] = np.select(conds, values)\nprint(yelp.shape)",
"_____no_output_____"
],
[
"# Convert 'hours' columns to total hours open that day for each day column\nfrom datetime import timedelta, time\n# Monday ---------------------------------------------------------\nyelp[['hours.Monday.start', 'hours.Monday.end']] = yelp['hours.Monday'].str.split('-', 1, expand=True)\n# Monday start time\nhr_min = []\nfor row in yelp['hours.Monday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el]) #change elements in list to int\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Monday.start'] = time_obj\n# Monday end time\nhr_min = []\nfor row in yelp['hours.Monday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Monday.end'] = time_obj\n# Create column of time difference\nyelp['Monday.hrs.open'] = yelp['hours.Monday.end'] - yelp['hours.Monday.start']\n# Convert seconds to minutes\nhour_calc = []\nfor ob in yelp['Monday.hrs.open']:\n hour_calc.append(ob.seconds//3600) #convert seconds to hours for explainability\nyelp['Monday.hrs.open'] = hour_calc\n# Tuesday -------------------------------------------------------------\nyelp[['hours.Tuesday.start', 'hours.Tuesday.end']] = yelp['hours.Tuesday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Tuesday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Tuesday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Tuesday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Tuesday.end'] = time_obj\nyelp['Tuesday.hrs.open'] = yelp['hours.Tuesday.end'] - yelp['hours.Tuesday.start']\nhour_calc = []\nfor ob in yelp['Tuesday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Tuesday.hrs.open'] = hour_calc\n# Wednesday ---------------------------------------------------------\nyelp[['hours.Wednesday.start', 'hours.Wednesday.end']] = yelp['hours.Wednesday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Wednesday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Wednesday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Wednesday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Wednesday.end'] = time_obj\nyelp['Wednesday.hrs.open'] = yelp['hours.Wednesday.end'] - yelp['hours.Wednesday.start']\nhour_calc = []\nfor ob in yelp['Wednesday.hrs.open']:\n 
hour_calc.append(ob.seconds//3600)\nyelp['Wednesday.hrs.open'] = hour_calc\n# Thursday --------------------------------------------------------------------\nyelp[['hours.Thursday.start', 'hours.Thursday.end']] = yelp['hours.Thursday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Thursday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Thursday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Thursday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Thursday.end'] = time_obj\nyelp['Thursday.hrs.open'] = yelp['hours.Thursday.end'] - yelp['hours.Thursday.start']\nhour_calc = []\nfor ob in yelp['Thursday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Thursday.hrs.open'] = hour_calc\n# Friday -----------------------------------------------------------------------\nyelp[['hours.Friday.start', 'hours.Friday.end']] = yelp['hours.Friday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Friday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Friday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Friday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Friday.end'] = time_obj\nyelp['Friday.hrs.open'] = yelp['hours.Friday.end'] - yelp['hours.Friday.start']\nhour_calc = []\nfor ob in yelp['Friday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Friday.hrs.open'] = hour_calc\n# Saturday ------------------------------------------------------------------------\nyelp[['hours.Saturday.start', 'hours.Saturday.end']] = yelp['hours.Saturday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Saturday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Saturday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Saturday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Saturday.end'] = time_obj\nyelp['Saturday.hrs.open'] = yelp['hours.Saturday.end'] - yelp['hours.Saturday.start']\nhour_calc = []\nfor ob in yelp['Saturday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Saturday.hrs.open'] = hour_calc\n# Sunday ----------------------------------------------------------------------\nyelp[['hours.Sunday.start', 
'hours.Sunday.end']] = yelp['hours.Sunday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Sunday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Sunday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Sunday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Sunday.end'] = time_obj\nyelp['Sunday.hrs.open'] = yelp['hours.Sunday.end'] - yelp['hours.Sunday.start']\nhour_calc = []\nfor ob in yelp['Sunday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Sunday.hrs.open'] = hour_calc",
"_____no_output_____"
],
[
"# Remove old target variable (stars) and \n# unecessary time columns that were created. Only keep 'day.hrs.open' columns\nyelp = yelp.drop(yelp.columns[[1,10,11,12,16,18,41,48,52,53,55,56,\n 58,59,61,62,64,65,67,68,70,71]], axis=1)\nprint(yelp.shape)",
"_____no_output_____"
],
[
"# Delete columns with unworkable form (dict)\ndel yelp['attributes.BusinessParking']\ndel yelp['attributes.Music']\ndel yelp['attributes.Ambience']\ndel yelp['attributes.GoodForKids']\ndel yelp['attributes.RestaurantsDelivery']\ndel yelp['attributes.BestNights']\ndel yelp['attributes.HairSpecializesIn']\ndel yelp['attributes.GoodForMeal']",
"_____no_output_____"
],
[
"# Look at final DF before saving\nprint(yelp.info())",
"_____no_output_____"
],
[
"# Save as CSV for faster loading -------------------------------------------------\nyelp.to_csv('/Data/yelp-clean.csv')",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0636b41b3a1672c4be3cb9dea70e74ac379adcf | 822,625 | ipynb | Jupyter Notebook | celebrity.ipynb | peter1505/AIFFEL | def84c450cc479d2fc34d428438c542595606286 | [
"MIT"
]
| 2 | 2021-11-18T08:40:43.000Z | 2021-12-17T07:46:26.000Z | celebrity.ipynb | peter1505/AIFFEL | def84c450cc479d2fc34d428438c542595606286 | [
"MIT"
]
| null | null | null | celebrity.ipynb | peter1505/AIFFEL | def84c450cc479d2fc34d428438c542595606286 | [
"MIT"
]
| null | null | null | 1,600.437743 | 576,380 | 0.958157 | [
[
[
"# ๋ด๊ฐ ๋ฎ์ ์ฐ์์ธ์?\n\n\n์ฌ์ง ๋ชจ์ผ๊ธฐ\n์ผ๊ตด ์์ญ ์๋ฅด๊ธฐ\n์ผ๊ตด ์์ญ Embedding ์ถ์ถ\n์ฐ์์ธ๋ค์ ์ผ๊ตด๊ณผ ๊ฑฐ๋ฆฌ ๋น๊ตํ๊ธฐ\n์๊ฐํ\nํ๊ณ \n\n\n1. ์ฌ์ง ๋ชจ์ผ๊ธฐ\n\n\n2. ์ผ๊ตด ์์ญ ์๋ฅด๊ธฐ\n์ด๋ฏธ์ง์์ ์ผ๊ตด ์์ญ์ ์๋ฆ\nimage.fromarray๋ฅผ ์ด์ฉํ์ฌ PIL image๋ก ๋ณํํ ํ, ์ถํ์ ์๊ฐํ์ ์ฌ์ฉ",
"_____no_output_____"
]
],
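  [
    [
      "Before running the code below, here is a small illustrative sketch (an addition, not from the original notebook) of the `Image.fromarray` conversion mentioned above, turning a cropped NumPy face array into a PIL image for later visualization:\n\n```python\nfrom PIL import Image\nimport numpy as np\n\n# cropped_face is assumed to be an RGB uint8 NumPy array, e.g. the result of get_cropped_face()\ncropped_face = np.zeros((128, 128, 3), dtype=np.uint8)  # placeholder array for illustration\npil_image = Image.fromarray(cropped_face)\nprint(pil_image.size)  # (width, height)\n```",
      "_____no_output_____"
    ]
  ],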
[
[
"# ํ์ํ ๋ชจ๋ ๋ถ๋ฌ์ค๊ธฐ\n\nimport os\nimport re\nimport glob\n\nimport glob\nimport pickle\nimport pandas as pd\n\n\nimport matplotlib.pyplot as plt\nimport matplotlib.image as img\nimport face_recognition\n%matplotlib inline \nfrom PIL import Image\nimport numpy as np\n\nimport face_recognition\nimport os\nfrom PIL import Image\n\n\n\n\ndir_path = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data'\nfile_list = os.listdir(dir_path)\n\nprint(len(file_list))\n\n# ์ด๋ฏธ์ง ํ์ผ ๋ถ๋ฌ์ค๊ธฐ\n\nprint('์ฐ์์ธ ์ด๋ฏธ์ง ํ์ผ ๊ฐฏ์:', len(file_list) - 5) # ์ถ๊ฐํ ๋ด ์ฌ์ง ์๋ฅผ ๋บ ๋๋จธ์ง ์ฌ์ง ์ ์ธ๊ธฐ",
"247\n์ฐ์์ธ ์ด๋ฏธ์ง ํ์ผ ๊ฐฏ์: 242\n"
],
[
"# ์ด๋ฏธ์ง ํ์ผ ๋ฆฌ์คํธ ํ์ธ\n\nprint (\"ํ์ผ ๋ฆฌ์คํธ:\\n{}\".format(file_list))",
"ํ์ผ ๋ฆฌ์คํธ:\n['๊ฐ์ธ๋.jpg', '๊น๊ฒฝํ.jpg', '๊ณ ๋ณด๊ฒฐ.jpg', 'T.O.P.jpg', '๊น๋์.jpg', '๊ธธ์์ .jpg', '๊ณ ์์.jpg', '๊ฐ๋ฌธ๊ฒฝ.jpg', '๊ฐ์ธ๋ด.jpg', '๊ณฝ์ง์.jpg', '๊ฐ์ฑ๋ฏผ.jpg', '๊ณต์น์ฐ.jpg', '๊น๋ฏผ์.jpg', '๊น๊ท์ข
.jpg', '๊ธฐ๋ํ.jpg', 'BUMZU.jpg', '๊ณ ์ ์ง.jpg', '๊น๋ณ๊ธฐ.jpg', '๊ณฝ๋์ฐ.jpg', '๊น๋ํฌ.jpg', '๊ถ์ธํ.jpg', '๊น๋๋ฅ .jpg', '๊น์๊ฒฝ.jpg', 'G-DRAGON.jpg', '๊ถ์ค์ค.jpg', '๊ฐ์ฑ์ฐ.jpg', '๊น์๋ผ.jpg', '๊น๋ค๋ฏธ.jpg', '๊น๊ฐ์.jpg', '๊น๊ธฐ๋.jpg', '๊ฐ์ฑ์.jpg', '๊ณ ์ด๋ด.jpg', '๊น๊ฐํฌ.jpg', '๊ฐ๋ฆฌ๋.jpg', '๊น์ ํธ.jpg', '๊ณ ์๋ผ.jpg', '๊น๊ทธ๋ฆผ.jpg', '๊น๋ฏผํฌ.jpg', '๊ฐ๊ฒฝ์ค.jpg', '๊น๊ฑด๋ชจ.jpg', '๊ฐ๊ท ์ฑ.jpg', '๊ณฝ์ง๋ฏผ.jpg', '๊ณ ๊ฒฝํ.jpg', '๊ฑด์ง.jpg', '๊น๊ฒฝํธ.jpg', '๊ฐํ์ค.jpg', '๊น๋ค์.jpg', '๊ณ ์ค์ .jpg', '๊ฐ์์.jpg', '๊ธธํ๋ฏธ.jpg', '๊ฐ๋๋ฆฌ.jpg', '๊น๋ขฐํ.jpg', '๊ถํ์.jpg', '๊ฐ๋ฏผ๊ฒฝ.jpg', '๊ณฝ๋ฏผ์.jpg', '๊ฐ์ํ.jpg', '๊ณฝ๋์.jpg', '๊น๋ฒ๋.jpg', 'K.WILL.jpg', '๊น์ํ.jpg', '๊ตฌ์คํ.jpg', '๊ธ๋ณด๋ผ.jpg', '๊น๊ฐ์.jpg', '๊น๋ช
์ค.jpg', '๊ฐ๊ฒฝํ.jpg', '๊ธธ์ ์ฐ.jpg', '๊น์ ํ.jpg', '๊ถ์ฑํฌ.jpg', '๊น๊ณ ์.jpg', '๊ฐ์ฐ์ฑ.jpg', '๊ฐ์๋ผ.jpg', '๊ฐ์น์ค.jpg', '๊ฒฝ์์ง.jpg', '๊น๊ฐ๋.jpg', '๊ฐ์ด์.jpg', '๊ณต์ ํ.jpg', '๊น๋ฏผ๊ธฐ.jpg', '๊น๋ฏผ๊ต.jpg', '๊ฐ์ ์ผ.jpg', '๊ถํ์.jpg', '๊น๊ฝ๋น.jpg', '๊น๋จ์ฃผ.jpg', '๊ณฝํฌ์ฑ.jpg', '๊ฐ๋ฏธ์ฐ.jpg', '๊น๋ฏผ์.jpg', '๊ฐ๋ฏผ์.jpg', 'SE7EN.jpg', '๊ฐ์๋ฆฌ.jpg', '๊ณฝ์ ์ฑ.jpg', '๊ณต๋ช
.jpg', '๊น๋ณด๋ฏธ.jpg', '๊น์ํธ.jpg', '๊น๋ช
๋ฏผ.jpg', '๊น์ํฌ.jpg', '๊ฐ๋ด์ฑ.jpg', '๊ธฐ๋ฆฌ๋ณด์ด.jpg', '๊น๊ท๋ฆฌ.jpg', '๊น๋ถ์ .jpg', '๊ณ ์.jpg', '๊น๋ณด๋ผ.jpg', 'RM.jpg', '๊ธฐ์ฃผ๋ด.jpg', '๊ฐ๋ฆฌ.jpg', '๊น๊ตญํ.jpg', '๊น๊ธฐ๋ฒ.jpg', '๊ณ ์์ฑ.jpg', '๊น์๋ก .jpg', '๊ณ ์ํฌ.jpg', '๊น๊ฐํ.jpg', '๊ฒฌ์ฐ.jpg', 'KCM.jpg', '๊ณฝ์์.jpg', '๊ถ์ ๋ฆฌ.jpg', '๊น๋ฒ.jpg', '์ด์์ฌ_01.jpg', '๊ณ ์ฑํฌ.jpg', '๊ธธ๊ฑด.jpg', 'Zion.T.jpg', '๊น๋ณ์ฅ.jpg', '๊ณ ์คํฌ.jpg', '๊น๊ด๊ท.jpg', '๊ณ ์ฃผ์.jpg', '๊ฐ์์.jpg', '๊ณ ๋ฏผ์.jpg', '๊ณตํจ์ง.jpg', '๊ฐ์์ฐ.jpg', '๊น๋ฏผ.jpg', '๊น๋ช
์.jpg', '๊ถํด์ฑ.jpg', '๊น๋ฒ๋ฃก.jpg', '๊ฐ์์ฐ.jpg', '๊ถ์ํ.jpg', '๊ฐ์ง์ญ.jpg', '๊ฐ์น์.jpg', 'MC๋ชฝ.jpg', '๊น๋๋ช
.jpg', '๊น๋ฏผ์.jpg', '๊น์ฌ๋.jpg', '๊น๊ฐ์ฐ.jpg', '๊ฐ๋ณ.jpg', '๊ฐ์งํ.jpg', '๊ฐ์์ง.jpg', '๊น์๋ฐฐ.jpg', '๊ถํดํจ.jpg', 'euPhemia.jpg', '์ด์์ฌ_02.jpg', '๊ธ์ฌํฅ.jpg', '๊น๋ฏผ์ข
.jpg', '๊ถํ์.jpg', '๊ณ ํ์ .jpg', '๊ฐ์ฑํ.jpg', 'V.One.jpg', '๊น๊ฐ์ฐ.jpg', '๊น์ ์
.jpg', '๊น๋์ฐ.jpg', '๊ถ์์.jpg', '๊ธฐ์์ธ.jpg', '๊น๋ํ.jpg', '๊ฐ์ด์ฑ.jpg', '๊ณ ์ค.jpg', '๊ธธํด์ฐ.jpg', '๊ฒฌ๋ฏธ๋ฆฌ.jpg', '๊ตฌ์ฌ์ด.jpg', '๊ฐ๊ธฐ์.jpg', '๊ณ ๋์ฌ.jpg', '๊น๋ฏผ์ค.jpg', '๊ถํ์ด.jpg', '๊ถ๋คํ.jpg', '๊ฐํฌ.jpg', '๊ฐ๋คํ.jpg', '๊ณ ์ธ๋ฒ.jpg', '๊น๊ด์.jpg', '๊ฐ์ฐ์.jpg', 'JK๊น๋์ฑ.jpg', '๊น์๋ฒฝ.jpg', '๊ถ์์.jpg', '๊ฐ์๋น.jpg', '๊ฐ์์ฐ.jpg', '๊น๊ธฐ๋ฐฉ.jpg', '๊ตฌํ์ .jpg', '๊ธ์๋.jpg', '๊ฐํ๋.jpg', '๊ณ ์ฐฝ์.jpg', '๊ฐ๋ฏผ์ฃผ.jpg', '๊น๋จ๊ธธ.jpg', '๊น๋์ด.jpg', '๊ฑฐ๋ฏธ.jpg', '๊ณ ์์.jpg', '๊ธธ์ฉ์ฐ.jpg', '๊ถ์จ.jpg', '๊น๋๋ช
.jpg', '๊น๋คํ.jpg', '๊ฐ๋จ๊ธธ.jpg', '๊น๋ณ์ธ.jpg', '๊น๋น์ฐ.jpg', '๊ธ์๋ก.jpg', '๊ฐํ๋.jpg', '๊น๋จ์ฐ.jpg', '๊น๋์ค.jpg', '๊ถ์์ฐ.jpg', '๊น๋ณด๊ฒฝ.jpg', '๊ฒฝ์ธ์ .jpg', '๊ฐํ.jpg', '๊ธธ๋ฏธ.jpg', '๊ฐ์์.jpg', '๊ณ ๋์.jpg', '๊ถ๋ํธ.jpg', '๊น๋น๋๋ฆฌ.jpg', '๊ณต์ .jpg', '๊ตฌ์์ฐฌ.jpg', '๊ธฐํ์.jpg', '๊น๋ฌด์ด.jpg', '๊น๊ถ.jpg', '๊น๋์ฑ.jpg', '๊น์ ์.jpg', '๊ณ ์ธ์.jpg', '๊น๋ฒ์.jpg', '๊น๋ช
๊ณค.jpg', '๊น๊ฒฝ๋ก.jpg', '๊น๋ณด๋ฏผ.jpg', '๊ณ ๋ํฌ.jpg', '๊ฐ๋ถ์.jpg', '๊ฐ์ฑ์ง.jpg', '๊ณ ์๋ฏธ.jpg', '๊ฐ์์ง.jpg', '๊ฐ์๋น.jpg', '๊น๋์.jpg', '๊น์ํ.jpg', '๊น๋ณด์ฐ.jpg', '๊ฐ์ง.jpg', '๊น๋จ์จ.jpg', '๊ธ๋จ๋น.jpg', '๊ฐ์ ์ฐ.jpg', '.ipynb_checkpoints', '๊น๋์.jpg', '๊น๊ท์ .jpg', '๊น๋ฏผ์ฃผ.jpg', '๊น์์ค.jpg', '๊น๋ฏผ๊ฒฝ.jpg', '๊ฐํ์ฐ.jpg', '๊ฐ๋์.jpg', '๊ฐ๋ฌธ์.jpg', '๊น๋ฌด์.jpg', '๊ณฝ์ฐฝ์ .jpg', '๊ณตํ์ง.jpg', '๊น๋น.jpg', '๊น๊ตญํฌ.jpg']\n"
],
[
"# ์ด๋ฏธ์ง ํ์ผ ์ผ๋ถ ํ์ธ\n\n# Set figsize here\nfig, axes = plt.subplots(nrows=2, ncols=3, figsize=(24,10))\n\n# flatten axes for easy iterating\nfor i, ax in enumerate(axes.flatten()):\n image = img.imread(dir_path+'/'+file_list[i])\n ax.imshow(image)\nplt.show()\n\nfig.tight_layout()",
"_____no_output_____"
],
[
"\n# ์ด๋ฏธ์ง ํ์ผ ๊ฒฝ๋ก๋ฅผ ํ๋ผ๋ฏธํฐ๋ก ๋๊ธฐ๋ฉด ์ผ๊ตด ์์ญ๋ง ์๋ผ์ฃผ๋ ํจ์\n\ndef get_cropped_face(image_file):\n image = face_recognition.load_image_file(image_file)\n face_locations = face_recognition.face_locations(image)\n a, b, c, d = face_locations[0]\n cropped_face = image[a:c,d:b,:]\n \n return cropped_face",
"_____no_output_____"
],
[
"# ์ผ๊ตด ์์ญ์ด ์ ํํ ์๋ฆฌ๋ ์ง ํ์ธ\n\nimage_path = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/์ด์์ฌ_02.jpg'\n\ncropped_face = get_cropped_face(image_path)\nplt.imshow(cropped_face)",
"_____no_output_____"
]
],
[
[
"## Step3. ์ผ๊ตด ์์ญ์ ์๋ฒ ๋ฉ ์ถ์ถํ๊ธฐ",
"_____no_output_____"
]
],
[
[
"# ์ผ๊ตด ์์ญ์ ๊ฐ์ง๊ณ ์ผ๊ตด ์๋ฒ ๋ฉ ๋ฒกํฐ๋ฅผ ๊ตฌํ๋ ํจ์\n\ndef get_face_embedding(face):\n return face_recognition.face_encodings(face, model='cnn')",
"_____no_output_____"
],
[
"# ํ์ผ ๊ฒฝ๋ก๋ฅผ ๋ฃ์ผ๋ฉด embedding_dict๋ฅผ ๋ฆฌํดํ๋ ํจ์\n\ndef get_face_embedding_dict(dir_path):\n file_list = os.listdir(dir_path)\n embedding_dict = {}\n \n for file in file_list:\n try: \n img_path = os.path.join(dir_path, file)\n face = get_cropped_face(img_path)\n embedding = get_face_embedding(face)\n if len(embedding) > 0: \n # ์ผ๊ตด์์ญ face๊ฐ ์ ๋๋ก detect๋์ง ์์ผ๋ฉด len(embedding)==0์ธ ๊ฒฝ์ฐ๊ฐ ๋ฐ์ํ๋ฏ๋ก \n # os.path.splitext(file)[0]์๋ ์ด๋ฏธ์งํ์ผ๋ช
์์ ํ์ฅ์๋ฅผ ์ ๊ฑฐํ ์ด๋ฆ์ด ๋ด๊น๋๋ค. \n embedding_dict[os.path.splitext(file)[0]] = embedding[0]\n # embedding_dict[] ์ด๋ฏธ์ง ํ์ผ์ ์๋ฒ ๋ฉ์ ๊ตฌํด ๋ด์ ํค=์ฌ๋์ด๋ฆ, ๊ฐ=์๋ฒ ๋ฉ ๋ฒกํฐ\n # os.path.splitext(file)[0] ํ์ผ์ ํ์ฅ์๋ฅผ ์ ๊ฑฐํ ์ด๋ฆ๋ง ์ถ์ถ\n # embedding[0]์ ๋ฃ๊ณ ์ถ์ ์์๊ฐ\n\n except:\n continue\n \n return embedding_dict",
"_____no_output_____"
],
[
"embedding_dict = get_face_embedding_dict(dir_path)",
"_____no_output_____"
]
],
[
[
"## Step4. ๋ชจ์ ์ฐ์์ธ๋ค๊ณผ ๋น๊ตํ๊ธฐ",
"_____no_output_____"
]
],
[
[
"# ์ด๋ฏธ์ง ๊ฐ ๊ฑฐ๋ฆฌ๋ฅผ ๊ตฌํ๋ ํจ์\n\ndef get_distance(name1, name2):\n return np.linalg.norm(embedding_dict[name1]-embedding_dict[name2], ord=2)",
"_____no_output_____"
],
[
"# ๋ณธ์ธ ์ฌ์ง์ ๊ฑฐ๋ฆฌ๋ฅผ ํ์ธํด๋ณด์\n\nprint('๋ด ์ฌ์ง๋ผ๋ฆฌ์ ๊ฑฐ๋ฆฌ๋?:', get_distance('์ด์์ฌ_01', '์ด์์ฌ_02'))",
"๋ด ์ฌ์ง๋ผ๋ฆฌ์ ๊ฑฐ๋ฆฌ๋?: 0.27525162596989655\n"
],
[
"# name1๊ณผ name2์ ๊ฑฐ๋ฆฌ๋ฅผ ๋น๊ตํ๋ ํจ์๋ฅผ ์์ฑํ๋, name1์ ๋ฏธ๋ฆฌ ์ง์ ํ๊ณ , name2๋ ํธ์ถ์์ ์ธ์๋ก ๋ฐ๋๋ก ํฉ๋๋ค.\n\ndef get_sort_key_func(name1):\n def get_distance_from_name1(name2):\n return get_distance(name1, name2)\n return get_distance_from_name1",
"_____no_output_____"
],
[
"\n# ๋ฎ์๊ผด ์์, ์ด๋ฆ, ์๋ฒ ๋ฉ ๊ฑฐ๋ฆฌ๋ฅผ ํฌํจํ Top-5 ๋ฆฌ์คํธ ์ถ๋ ฅํ๋ ํจ์\n\ndef get_nearest_face(name, top=5):\n sort_key_func = get_sort_key_func(name)\n sorted_faces = sorted(embedding_dict.items(), key=lambda x:sort_key_func(x[0]))\n \n rank_cnt = 1 # ์์๋ฅผ ์ธ๋ ๋ณ์\n pass_cnt = 1 # ๊ฑด๋๋ด ์ซ์๋ฅผ ์ธ๋ ๋ณ์(๋ณธ์ธ ์ฌ์ง ์นด์ดํธ)\n end = 0 # ๋ฎ์ ๊ผด 5๋ฒ ์ถ๋ ฅ์ ์ข
๋ฃํ๊ธฐ ์ํด ์ธ๋ ๋ณ์\n for i in range(top+15):\n rank_cnt += 1\n if sorted_faces[i][0].find('์ด์์ฌ_02') == 0: # ๋ณธ์ธ ์ฌ์ง์ธ mypicture๋ผ๋ ํ์ผ๋ช
์ผ๋ก ์์ํ๋ ๊ฒฝ์ฐ ์ ์ธํฉ๋๋ค.\n pass_cnt += 1\n continue\n if sorted_faces[i]:\n print('์์ {} : ์ด๋ฆ({}), ๊ฑฐ๋ฆฌ({})'.format(rank_cnt - pass_cnt, sorted_faces[i][0], sort_key_func(sorted_faces[i][0])))\n end += 1\n if end == 5: # end๊ฐ 5๊ฐ ๋ ๊ฒฝ์ฐ ์ฐ์์ธ 5๋ช
์ถ๋ ฅ๋์๊ธฐ์ ์ข
๋ฃํฉ๋๋ค.\n break",
"_____no_output_____"
],
[
"# '์ด์์ฌ_01'๊ณผ ๊ฐ์ฅ ๋ฎ์ ์ฌ๋์ ๋๊ตด๊น์?\n\nget_nearest_face('์ด์์ฌ_01')",
"์์ 1 : ์ด๋ฆ(์ด์์ฌ_01), ๊ฑฐ๋ฆฌ(0.0)\n์์ 2 : ์ด๋ฆ(euPhemia), ๊ฑฐ๋ฆฌ(0.39785575251289035)\n์์ 3 : ์ด๋ฆ(๊ณต๋ช
), ๊ฑฐ๋ฆฌ(0.43181500298337777)\n์์ 4 : ์ด๋ฆ(๊ฐ๊ธฐ์), ๊ฑฐ๋ฆฌ(0.44559566211978)\n์์ 5 : ์ด๋ฆ(JK๊น๋์ฑ), ๊ฑฐ๋ฆฌ(0.4560282622605789)\n"
],
[
"# '์ด์์ฌ_02'์ ๊ฐ์ฅ ๋ฎ์ ์ฌ๋์ ๋๊ตด๊น์?\n\nget_nearest_face('์ด์์ฌ_02')",
"์์ 1 : ์ด๋ฆ(์ด์์ฌ_01), ๊ฑฐ๋ฆฌ(0.27525162596989655)\n์์ 2 : ์ด๋ฆ(euPhemia), ๊ฑฐ๋ฆฌ(0.38568278214648233)\n์์ 3 : ์ด๋ฆ(๊ณต๋ช
), ๊ฑฐ๋ฆฌ(0.445581489047543)\n์์ 4 : ์ด๋ฆ(๊น๋์), ๊ฑฐ๋ฆฌ(0.44765017085662295)\n์์ 5 : ์ด๋ฆ(๊ฐ์ฑํ), ๊ฑฐ๋ฆฌ(0.4536061116328271)\n"
]
],
[
[
"## Step5. ๋ค์ํ ์ฌ๋ฏธ์๋ ์๊ฐํ ์๋ํด ๋ณด๊ธฐ",
"_____no_output_____"
]
],
[
[
"\n# ์ฌ์ง ๊ฒฝ๋ก ์ค์ \n\nmypicture1 = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/์ด์์ฌ_01.jpg'\nmypicture2 = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/์ด์์ฌ_02.jpg'\n\nmc= os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/MC๋ชฝ.jpg'\ngahee = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐํฌ.jpg'\nseven = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/SE7EN.jpg'\ngam = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ์ฐ์ฑ.jpg'\n\ngang = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ๊ฒฝ์ค.jpg'\ngyung = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ๊ฒฝํ.jpg'\ngi = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ๊ธฐ์.jpg'",
"_____no_output_____"
],
[
"\n# ํฌ๋กญํ ์ผ๊ตด์ ์ ์ฅํด ๋ณด์\n\na1 = get_cropped_face(mypicture1)\na2 = get_cropped_face(mypicture2)\n\nb1 = get_cropped_face(mc)\nb2 = get_cropped_face(gahee)\nb3 = get_cropped_face(gam)",
"_____no_output_____"
],
[
"plt.figure(figsize=(10,8))\n\nplt.subplot(231)\nplt.imshow(a1)\nplt.axis('off')\nplt.title('1st')\nplt.subplot(232)\nplt.imshow(a2)\nplt.axis('off')\nplt.title('me')\nplt.subplot(233)\nplt.imshow(b1)\nplt.axis('off')\nplt.title('2nd')\nplt.subplot(234)\n\nprint('''mypicture์ ์์\n์์ 1 : ์ด๋ฆ(์ฌ์ฟ ๋ผ), ๊ฑฐ๋ฆฌ(0.36107689719729225)\n์์ 2 : ์ด๋ฆ(ํธ์์ด์ค๋์ฐ), ๊ฑฐ๋ฆฌ(0.36906292012955577) \n์์ 3 : ์ด๋ฆ(์์ด์ ), ๊ฑฐ๋ฆฌ(0.3703590842312735) \n์์ 4 : ์ด๋ฆ(์ ํธ๋ฃจ), ๊ฑฐ๋ฆฌ(0.3809516850126146) \n์์ 5 : ์ด๋ฆ(์งํธ), ๊ฑฐ๋ฆฌ(0.3886670633997685)''')",
"mypicture์ ์์\n์์ 1 : ์ด๋ฆ(์ฌ์ฟ ๋ผ), ๊ฑฐ๋ฆฌ(0.36107689719729225)\n์์ 2 : ์ด๋ฆ(ํธ์์ด์ค๋์ฐ), ๊ฑฐ๋ฆฌ(0.36906292012955577) \n์์ 3 : ์ด๋ฆ(์์ด์ ), ๊ฑฐ๋ฆฌ(0.3703590842312735) \n์์ 4 : ์ด๋ฆ(์ ํธ๋ฃจ), ๊ฑฐ๋ฆฌ(0.3809516850126146) \n์์ 5 : ์ด๋ฆ(์งํธ), ๊ฑฐ๋ฆฌ(0.3886670633997685)\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
]
|
d0636f5a18ee02ac2f75f5d634bcbcb10c053fe4 | 43,386 | ipynb | Jupyter Notebook | Feature - Handling missing Values/5. Arbitrary Value Imputation.ipynb | deepakkum21/Feature-Engineering | ea10b2685c842bcf0247887db755d05c38b23844 | [
"Apache-2.0"
]
| null | null | null | Feature - Handling missing Values/5. Arbitrary Value Imputation.ipynb | deepakkum21/Feature-Engineering | ea10b2685c842bcf0247887db755d05c38b23844 | [
"Apache-2.0"
]
| null | null | null | Feature - Handling missing Values/5. Arbitrary Value Imputation.ipynb | deepakkum21/Feature-Engineering | ea10b2685c842bcf0247887db755d05c38b23844 | [
"Apache-2.0"
]
| null | null | null | 114.777778 | 28,204 | 0.847877 | [
[
[
"## 5. Arbitrary Value Imputation\n#### this technique was derived from kaggle competition It consists of replacing NAN by an arbitrary value",
"_____no_output_____"
]
],
[
[
"import pandas as pd",
"_____no_output_____"
],
[
"df=pd.read_csv(\"titanic.csv\", usecols=[\"Age\",\"Fare\",\"Survived\"])\ndf.head()",
"_____no_output_____"
],
[
"def impute_nan(df,variable):\n df[variable+'_zero']=df[variable].fillna(0)\n df[variable+'_hundred']=df[variable].fillna(100)",
"_____no_output_____"
],
[
"df['Age'].hist(bins=50)",
"_____no_output_____"
]
],
[
[
"### Advantages\n Easy to implement\n Captures the importance of missingess if there is one\n### Disadvantages\n Distorts the original distribution of the variable\n If missingess is not important, it may mask the predictive power of the original variable by distorting its distribution\n Hard to decide which value to use",
"_____no_output_____"
]
],
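  [
    [
      "As an aside (an addition, not from the original notebook), the same constant-value imputation can be done with scikit-learn's `SimpleImputer`, which is convenient inside ML pipelines:\n\n```python\nfrom sklearn.impute import SimpleImputer\n\n# strategy='constant' replaces every NaN with fill_value (here the arbitrary value 100)\nimputer = SimpleImputer(strategy='constant', fill_value=100)\nage_imputed = imputer.fit_transform(df[['Age']])\n```",
      "_____no_output_____"
    ]
  ],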
[
[
"impute_nan(df,'Age')\ndf.head()",
"_____no_output_____"
],
[
"print(df['Age'].std())\nprint(df['Age_zero'].std())\nprint(df['Age_hundred'].std())",
"14.526497332334044\n17.596074065915886\n30.930372890173594\n"
],
[
"print(df['Age'].mean())\nprint(df['Age_zero'].mean())\nprint(df['Age_hundred'].mean())",
"29.69911764705882\n23.79929292929293\n43.66461279461279\n"
],
[
"import matplotlib.pyplot as plt\n%matplotlib inline",
"_____no_output_____"
],
[
"fig = plt.figure()\nax = fig.add_subplot(111)\ndf['Age'].plot(kind='kde', ax=ax)\ndf.Age_zero.plot(kind='kde', ax=ax, color='red')\ndf.Age_hundred.plot(kind='kde', ax=ax, color='green')\nlines, labels = ax.get_legend_handles_labels()\nax.legend(lines, labels, loc='best')",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d063869573f5b3dfc578fc45bb7d1c7875fd50ea | 25,044 | ipynb | Jupyter Notebook | learning/matplot/animation/basic_animation.ipynb | HypoChloremic/python_learning | 3778f6d7c35cdd54a85a3418aba99f2b91d32775 | [
"Apache-2.0"
]
| 2 | 2019-06-23T07:17:30.000Z | 2019-07-06T15:15:42.000Z | learning/matplot/animation/basic_animation.ipynb | HypoChloremic/python_learning | 3778f6d7c35cdd54a85a3418aba99f2b91d32775 | [
"Apache-2.0"
]
| null | null | null | learning/matplot/animation/basic_animation.ipynb | HypoChloremic/python_learning | 3778f6d7c35cdd54a85a3418aba99f2b91d32775 | [
"Apache-2.0"
]
| 1 | 2019-06-23T07:17:43.000Z | 2019-06-23T07:17:43.000Z | 61.53317 | 4,768 | 0.592797 | [
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nimport matplotlib.animation as animation\nimport matplotlib\nfrom IPython.display import HTML\n\n",
"_____no_output_____"
],
[
"def update_line(num, data, line):\n print(num)\n line.set_data(data[..., :num])\n return line,",
"_____no_output_____"
],
[
"plt.rcParams['animation.writer'] = 'ffmpeg'\nprint(matplotlib.animation.writers.list())",
"['pillow', 'ffmpeg', 'ffmpeg_file', 'html']\n"
],
[
"fig1 = plt.figure()\n\n# Fixing random state for reproducibility\nnp.random.seed(19680801)\n\ndata = np.random.rand(2, 25)\nl, = plt.plot(x=[], y=[])\nplt.xlim(0, 1)\nplt.ylim(0, 1)\nline_ani = animation.FuncAnimation(fig1, update_line, 25, fargs=(data, l),\n interval=50, blit=True)\nHTML(line_ani.to_html5_video())",
"_____no_output_____"
],
[
"help(plt.plot)",
"Help on function plot in module matplotlib.pyplot:\n\nplot(*args, scalex=True, scaley=True, data=None, **kwargs)\n Plot y versus x as lines and/or markers.\n \n Call signatures::\n \n plot([x], y, [fmt], *, data=None, **kwargs)\n plot([x], y, [fmt], [x2], y2, [fmt2], ..., **kwargs)\n \n The coordinates of the points or line nodes are given by *x*, *y*.\n \n The optional parameter *fmt* is a convenient way for defining basic\n formatting like color, marker and linestyle. It's a shortcut string\n notation described in the *Notes* section below.\n \n >>> plot(x, y) # plot x and y using default line style and color\n >>> plot(x, y, 'bo') # plot x and y using blue circle markers\n >>> plot(y) # plot y using x as index array 0..N-1\n >>> plot(y, 'r+') # ditto, but with red plusses\n \n You can use `.Line2D` properties as keyword arguments for more\n control on the appearance. Line properties and *fmt* can be mixed.\n The following two calls yield identical results:\n \n >>> plot(x, y, 'go--', linewidth=2, markersize=12)\n >>> plot(x, y, color='green', marker='o', linestyle='dashed',\n ... linewidth=2, markersize=12)\n \n When conflicting with *fmt*, keyword arguments take precedence.\n \n \n **Plotting labelled data**\n \n There's a convenient way for plotting objects with labelled data (i.e.\n data that can be accessed by index ``obj['y']``). Instead of giving\n the data in *x* and *y*, you can provide the object in the *data*\n parameter and just give the labels for *x* and *y*::\n \n >>> plot('xlabel', 'ylabel', data=obj)\n \n All indexable objects are supported. This could e.g. be a `dict`, a\n `pandas.DataFame` or a structured numpy array.\n \n \n **Plotting multiple sets of data**\n \n There are various ways to plot multiple sets of data.\n \n - The most straight forward way is just to call `plot` multiple times.\n Example:\n \n >>> plot(x1, y1, 'bo')\n >>> plot(x2, y2, 'go')\n \n - Alternatively, if your data is already a 2d array, you can pass it\n directly to *x*, *y*. A separate data set will be drawn for every\n column.\n \n Example: an array ``a`` where the first column represents the *x*\n values and the other columns are the *y* columns::\n \n >>> plot(a[0], a[1:])\n \n - The third way is to specify multiple sets of *[x]*, *y*, *[fmt]*\n groups::\n \n >>> plot(x1, y1, 'g^', x2, y2, 'g-')\n \n In this case, any additional keyword argument applies to all\n datasets. Also this syntax cannot be combined with the *data*\n parameter.\n \n By default, each line is assigned a different style specified by a\n 'style cycle'. The *fmt* and line property parameters are only\n necessary if you want explicit deviations from these defaults.\n Alternatively, you can also change the style cycle using\n :rc:`axes.prop_cycle`.\n \n \n Parameters\n ----------\n x, y : array-like or scalar\n The horizontal / vertical coordinates of the data points.\n *x* values are optional and default to `range(len(y))`.\n \n Commonly, these parameters are 1D arrays.\n \n They can also be scalars, or two-dimensional (in that case, the\n columns represent separate data sets).\n \n These arguments cannot be passed as keywords.\n \n fmt : str, optional\n A format string, e.g. 'ro' for red circles. See the *Notes*\n section for a full description of the format strings.\n \n Format strings are just an abbreviation for quickly setting\n basic line properties. 
All of these and more can also be\n controlled by keyword arguments.\n \n This argument cannot be passed as keyword.\n \n data : indexable object, optional\n An object with labelled data. If given, provide the label names to\n plot in *x* and *y*.\n \n .. note::\n Technically there's a slight ambiguity in calls where the\n second label is a valid *fmt*. `plot('n', 'o', data=obj)`\n could be `plt(x, y)` or `plt(y, fmt)`. In such cases,\n the former interpretation is chosen, but a warning is issued.\n You may suppress the warning by adding an empty format string\n `plot('n', 'o', '', data=obj)`.\n \n Other Parameters\n ----------------\n scalex, scaley : bool, optional, default: True\n These parameters determined if the view limits are adapted to\n the data limits. The values are passed on to `autoscale_view`.\n \n **kwargs : `.Line2D` properties, optional\n *kwargs* are used to specify properties like a line label (for\n auto legends), linewidth, antialiasing, marker face color.\n Example::\n \n >>> plot([1, 2, 3], [1, 2, 3], 'go-', label='line 1', linewidth=2)\n >>> plot([1, 2, 3], [1, 4, 9], 'rs', label='line 2')\n \n If you make multiple lines with one plot command, the kwargs\n apply to all those lines.\n \n Here is a list of available `.Line2D` properties:\n \n Properties:\n agg_filter: a filter function, which takes a (m, n, 3) float array and a dpi value, and returns a (m, n, 3) array\n alpha: float or None\n animated: bool\n antialiased or aa: bool\n clip_box: `.Bbox`\n clip_on: bool\n clip_path: Patch or (Path, Transform) or None\n color or c: color\n contains: callable\n dash_capstyle: {'butt', 'round', 'projecting'}\n dash_joinstyle: {'miter', 'round', 'bevel'}\n dashes: sequence of floats (on/off ink in points) or (None, None)\n data: (2, N) array or two 1D arrays\n drawstyle or ds: {'default', 'steps', 'steps-pre', 'steps-mid', 'steps-post'}, default: 'default'\n figure: `.Figure`\n fillstyle: {'full', 'left', 'right', 'bottom', 'top', 'none'}\n gid: str\n in_layout: bool\n label: object\n linestyle or ls: {'-', '--', '-.', ':', '', (offset, on-off-seq), ...}\n linewidth or lw: float\n marker: marker style\n markeredgecolor or mec: color\n markeredgewidth or mew: float\n markerfacecolor or mfc: color\n markerfacecoloralt or mfcalt: color\n markersize or ms: float\n markevery: None or int or (int, int) or slice or List[int] or float or (float, float)\n path_effects: `.AbstractPathEffect`\n picker: float or callable[[Artist, Event], Tuple[bool, dict]]\n pickradius: float\n rasterized: bool or None\n sketch_params: (scale: float, length: float, randomness: float)\n snap: bool or None\n solid_capstyle: {'butt', 'round', 'projecting'}\n solid_joinstyle: {'miter', 'round', 'bevel'}\n transform: `matplotlib.transforms.Transform`\n url: str\n visible: bool\n xdata: 1D array\n ydata: 1D array\n zorder: float\n \n Returns\n -------\n lines\n A list of `.Line2D` objects representing the plotted data.\n \n See Also\n --------\n scatter : XY scatter plot with markers of varying size and/or color (\n sometimes also called bubble chart).\n \n Notes\n -----\n **Format Strings**\n \n A format string consists of a part for color, marker and line::\n \n fmt = '[marker][line][color]'\n \n Each of them is optional. If not provided, the value from the style\n cycle is used. 
Exception: If ``line`` is given, but no ``marker``,\n the data will be a line without markers.\n \n Other combinations such as ``[color][marker][line]`` are also\n supported, but note that their parsing may be ambiguous.\n \n **Markers**\n \n ============= ===============================\n character description\n ============= ===============================\n ``'.'`` point marker\n ``','`` pixel marker\n ``'o'`` circle marker\n ``'v'`` triangle_down marker\n ``'^'`` triangle_up marker\n ``'<'`` triangle_left marker\n ``'>'`` triangle_right marker\n ``'1'`` tri_down marker\n ``'2'`` tri_up marker\n ``'3'`` tri_left marker\n ``'4'`` tri_right marker\n ``'s'`` square marker\n ``'p'`` pentagon marker\n ``'*'`` star marker\n ``'h'`` hexagon1 marker\n ``'H'`` hexagon2 marker\n ``'+'`` plus marker\n ``'x'`` x marker\n ``'D'`` diamond marker\n ``'d'`` thin_diamond marker\n ``'|'`` vline marker\n ``'_'`` hline marker\n ============= ===============================\n \n **Line Styles**\n \n ============= ===============================\n character description\n ============= ===============================\n ``'-'`` solid line style\n ``'--'`` dashed line style\n ``'-.'`` dash-dot line style\n ``':'`` dotted line style\n ============= ===============================\n \n Example format strings::\n \n 'b' # blue markers with default shape\n 'or' # red circles\n '-g' # green solid line\n '--' # dashed line with default color\n '^k:' # black triangle_up markers connected by a dotted line\n \n **Colors**\n \n The supported color abbreviations are the single letter codes\n \n ============= ===============================\n character color\n ============= ===============================\n ``'b'`` blue\n ``'g'`` green\n ``'r'`` red\n ``'c'`` cyan\n ``'m'`` magenta\n ``'y'`` yellow\n ``'k'`` black\n ``'w'`` white\n ============= ===============================\n \n and the ``'CN'`` colors that index into the default property cycle.\n \n If the color is the only part of the format string, you can\n additionally use any `matplotlib.colors` spec, e.g. full names\n (``'green'``) or hex strings (``'#008000'``).\n\n"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d063877bfa6c74b4e238643da9e2ef6c123e9eec | 5,330 | ipynb | Jupyter Notebook | notebooks/Toy_problem/Tp4_criterium_2_daylighting_potential.ipynb | Maxketelaar/thesis | d1bab7dffa414c335b452476733c8b9d8ec24579 | [
"MIT"
]
| null | null | null | notebooks/Toy_problem/Tp4_criterium_2_daylighting_potential.ipynb | Maxketelaar/thesis | d1bab7dffa414c335b452476733c8b9d8ec24579 | [
"MIT"
]
| null | null | null | notebooks/Toy_problem/Tp4_criterium_2_daylighting_potential.ipynb | Maxketelaar/thesis | d1bab7dffa414c335b452476733c8b9d8ec24579 | [
"MIT"
]
| 1 | 2021-12-21T15:24:57.000Z | 2021-12-21T15:24:57.000Z | 27.905759 | 392 | 0.558161 | [
[
[
"#### loading the libraries",
"_____no_output_____"
]
],
[
[
"import os\nimport sys\nimport pyvista as pv\nimport trimesh as tm\nimport numpy as np\nimport topogenesis as tg\nimport pickle as pk\nsys.path.append(os.path.realpath('..\\..')) # no idea how or why this is not working without adding this to the path TODO: learn about path etc.\nfrom notebooks.resources import RES as res",
"_____no_output_____"
]
],
[
[
"#### loading the configuration of the test",
"_____no_output_____"
]
],
[
[
"# load base lattice CSV file\nlattice_path = os.path.relpath('../../data/macrovoxels.csv')\nmacro_lattice = tg.lattice_from_csv(lattice_path)\n\n# load random configuration for testing\nconfig_path = os.path.relpath('../../data/random_lattice.csv')\nconfiguration = tg.lattice_from_csv(config_path)\n\n# load environment\nenvironment_path = os.path.relpath(\"../../data/movedcontext.obj\") \nenvironment_mesh = tm.load(environment_path)\n\n# load solar vectors\nvectors = pk.load(open(\"../../data/sunvectors.pk\", \"rb\"))\n\n# load vector intensities\nintensity = pk.load(open(\"../../data/dnival.pk\", \"rb\"))",
"_____no_output_____"
]
],
[
[
"#### during optimization, arrays like these will be passed to the function:",
"_____no_output_____"
]
],
[
[
"variable = [0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 0]",
"_____no_output_____"
]
],
[
[
"#### calling the objective function",
"_____no_output_____"
]
],
[
[
"# input is the decision variables, a referenca lattice, the visibility vectors, their magnitude (i.e. direct normal illuminance for daylight), and a mesh of the environment\n# output is the total objective score in 100s of lux on the facade, and 100s of lux per each surface (voxel roofs)\ncrit, voxcrit = res.crit_2_DL(variable, macro_lattice, vectors, intensity, environment_mesh)",
"_____no_output_____"
]
],
[
[
"#### generating mesh",
"_____no_output_____"
]
],
[
[
"meshes, _, _ = res.construct_vertical_mesh(configuration, configuration.unit)\nfacademesh = tm.util.concatenate(meshes)",
"_____no_output_____"
]
],
[
[
"#### visualisation",
"_____no_output_____"
]
],
[
[
"p = pv.Plotter(notebook=True)\n\nconfiguration.fast_vis(p,False,False,opacity=0.1)\n# p.add_arrows(ctr_per_ray, -ray_per_ctr, mag=5, show_scalar_bar=False)\n# p.add_arrows(ctr_per_ray, nrm_per_ray, mag=5, show_scalar_bar=False)\n# p.add_mesh(roof_mesh)\np.add_mesh(environment_mesh)\np.add_mesh(facademesh, cmap='fire', scalars=np.repeat(voxcrit,2))\np.add_points(vectors*-300)\n# p.add_points(horizontal_test_points)\n\np.show(use_ipyvtk=True)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d0638e2eaa1d7f56e9fec1065b6dd907f395d8fb | 5,772 | ipynb | Jupyter Notebook | Cap04/.ipynb_checkpoints/modulos_pacotes-checkpoint.ipynb | carlos-freitas-gitHub/python-analytics | 4b55cb2acb3383ded700596c5a856b7e2124f2da | [
"Apache-2.0"
]
| 1 | 2020-07-31T20:31:19.000Z | 2020-07-31T20:31:19.000Z | Cap04/.ipynb_checkpoints/modulos_pacotes-checkpoint.ipynb | carlos-freitas-gitHub/python-analytics | 4b55cb2acb3383ded700596c5a856b7e2124f2da | [
"Apache-2.0"
]
| null | null | null | Cap04/.ipynb_checkpoints/modulos_pacotes-checkpoint.ipynb | carlos-freitas-gitHub/python-analytics | 4b55cb2acb3383ded700596c5a856b7e2124f2da | [
"Apache-2.0"
]
| null | null | null | 20.688172 | 502 | 0.445773 | [
[
[
"## Mรณdulo e pacote",
"_____no_output_____"
]
],
[
[
"# importando mรณdulo, math para operaรงรตes matemรกticas\nimport math",
"_____no_output_____"
],
[
"# verificando todos os metodos do modulo\ndir(math)",
"_____no_output_____"
],
[
"# usando um dos metรณdos do mรณdulo, sqrt, raiz quadrada\nprint(math.sqrt(25))",
"5.0\n"
],
[
"# importando apenas uma funรงรฃo do mรณdulo math\nfrom math import sqrt ",
"_____no_output_____"
],
[
"# usando este mรฉtodo, como importou somente a funรงรฃo do mรณdulo pode usar somente\n# a funรงรฃo sem o nome do pacote\nprint(sqrt(25))",
"5.0\n"
],
[
"# imprimindo todos os metodos do mรณdulo math\nprint(dir(math))",
"['__doc__', '__loader__', '__name__', '__package__', '__spec__', 'acos', 'acosh', 'asin', 'asinh', 'atan', 'atan2', 'atanh', 'ceil', 'copysign', 'cos', 'cosh', 'degrees', 'e', 'erf', 'erfc', 'exp', 'expm1', 'fabs', 'factorial', 'floor', 'fmod', 'frexp', 'fsum', 'gamma', 'gcd', 'hypot', 'inf', 'isclose', 'isfinite', 'isinf', 'isnan', 'ldexp', 'lgamma', 'log', 'log10', 'log1p', 'log2', 'modf', 'nan', 'pi', 'pow', 'radians', 'remainder', 'sin', 'sinh', 'sqrt', 'tan', 'tanh', 'tau', 'trunc']\n"
],
[
"# help da funรงรฃo sqrt do mรณdulo math\nprint(help(sqrt))",
"Help on built-in function sqrt in module math:\n\nsqrt(x, /)\n Return the square root of x.\n\nNone\n"
],
[
"# random\nimport random",
"_____no_output_____"
],
[
"# random choice(), escolha, buscando os elementos de maneira aleatรณria\nprint(random.choice(['Maรงa', 'Banana', 'Laranja']))",
"Laranja\n"
],
[
"# renadom sample(), amostra apartir de uma amostra de valores\nprint(random.sample(range(100), 10))",
"[51, 33, 65, 7, 66, 95, 96, 17, 77, 22]\n"
],
[
"# mรณdulo para estatistรญca\nimport statistics",
"_____no_output_____"
],
[
"# criando uma lista de nรบmeros reais\ndados = [2.75, 1.75, 1.25, 0.25, 1.25, 3.5]",
"_____no_output_____"
]
]
]
| [
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d0639939fd0e4f172287ca2c118fc3142e12f140 | 182,702 | ipynb | Jupyter Notebook | BareBones 1D CNN LSTM MLP - Sequence Prediction.ipynb | codeWhim/Sequence-Prediction | 2f9a0c3f57c20c311840ae00009637f553081f5b | [
"MIT"
]
| 1 | 2019-03-06T15:08:47.000Z | 2019-03-06T15:08:47.000Z | BareBones 1D CNN LSTM MLP - Sequence Prediction.ipynb | codeWhim/Sequence-Prediction | 2f9a0c3f57c20c311840ae00009637f553081f5b | [
"MIT"
]
| null | null | null | BareBones 1D CNN LSTM MLP - Sequence Prediction.ipynb | codeWhim/Sequence-Prediction | 2f9a0c3f57c20c311840ae00009637f553081f5b | [
"MIT"
]
| null | null | null | 148.78013 | 24,024 | 0.845612 | [
[
[
"<h1>Notebook Content</h1>\n\n1. [Import Packages](#1)\n1. [Helper Functions](#2)\n1. [Input](#3)\n1. [Model](#4)\n1. [Prediction](#5)\n1. [Complete Figure](#6)",
"_____no_output_____"
],
[
"<h1 id=\"1\">1. Import Packages</h1>\nImporting all necessary and useful packages in single cell.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport keras\nimport tensorflow as tf\nfrom numpy import array\nfrom keras.models import Sequential\nfrom keras.layers import LSTM\nfrom keras.layers import Dense\nfrom keras.layers import Flatten\nfrom keras.layers import TimeDistributed\nfrom keras.layers.convolutional import Conv1D\nfrom keras.layers.convolutional import MaxPooling1D\nfrom keras_tqdm import TQDMNotebookCallback\nfrom sklearn.preprocessing import MinMaxScaler\nfrom tqdm import tqdm_notebook\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport random\nfrom random import randint",
"_____no_output_____"
]
],
[
[
"<h1 id=\"2\">2. Helper Functions</h1>\nDefining Some helper functions which we will need later in code",
"_____no_output_____"
]
],
[
[
"# split a univariate sequence into samples\ndef split_sequence(sequence, n_steps, look_ahead=0):\n X, y = list(), list()\n for i in range(len(sequence)-look_ahead):\n # find the end of this pattern\n end_ix = i + n_steps\n # check if we are beyond the sequence\n if end_ix > len(sequence)-1-look_ahead:\n break\n # gather input and output parts of the pattern\n seq_x, seq_y = sequence[i:end_ix], sequence[end_ix+look_ahead]\n X.append(seq_x)\n y.append(seq_y)\n return array(X), array(y)\n\ndef plot_multi_graph(xAxis,yAxes,title='',xAxisLabel='number',yAxisLabel='Y'):\n linestyles = ['-', '--', '-.', ':']\n plt.figure()\n plt.title(title)\n plt.xlabel(xAxisLabel)\n plt.ylabel(yAxisLabel)\n for key, value in yAxes.items():\n plt.plot(xAxis, np.array(value), label=key, linestyle=linestyles[randint(0,3)])\n plt.legend()\n \ndef normalize(values):\n values = array(values, dtype=\"float64\").reshape((len(values), 1))\n # train the normalization\n scaler = MinMaxScaler(feature_range=(0, 1))\n scaler = scaler.fit(values)\n #print('Min: %f, Max: %f' % (scaler.data_min_, scaler.data_max_))\n # normalize the dataset and print the first 5 rows\n normalized = scaler.transform(values)\n return normalized,scaler",
"_____no_output_____"
]
],
[
[
"<h1 id=\"3\">3. Input</h1>\n\n<h3 id=\"3-1\">3-1. Sequence PreProcessing</h3>\nSplitting and Reshaping",
"_____no_output_____"
]
],
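  [
    [
      "For intuition, a tiny worked example (an added illustration, not from the original notebook) of what the <code>split_sequence</code> helper produces with a window of 2 and no look-ahead:\n\n```python\nX, y = split_sequence([10, 20, 30, 40, 50], 2)\n# X -> [[10, 20], [20, 30], [30, 40]]\n# y -> [30, 40, 50]\n```\n\nIn the real pipeline the input is a normalized column vector rather than a plain list, but the windowing works the same way.",
      "_____no_output_____"
    ]
  ],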
[
[
"n_features = 1\nn_seq = 20\nn_steps = 1\n \ndef sequence_preprocessed(values, sliding_window, look_ahead=0):\n \n # Normalization\n normalized,scaler = normalize(values)\n \n # Try the following if randomizing the sequence:\n # random.seed('sam') # set the seed\n # raw_seq = random.sample(raw_seq, 100)\n\n # split into samples\n X, y = split_sequence(normalized, sliding_window, look_ahead)\n\n # reshape from [samples, timesteps] into [samples, subsequences, timesteps, features]\n X = X.reshape((X.shape[0], n_seq, n_steps, n_features))\n \n return X,y,scaler",
"_____no_output_____"
]
],
[
[
"<h3 id=\"3-2\">3-2. Providing Sequence</h3>\nDefining a raw sequence, sliding window of data to consider and look ahead future timesteps",
"_____no_output_____"
]
],
[
[
"# define input sequence\nsequence_val = [i for i in range(5000,7000)]\nsequence_train = [i for i in range(1000,2000)]\nsequence_test = [i for i in range(10000,14000)]\n\n# choose a number of time steps for sliding window\nsliding_window = 20\n\n# choose a number of further time steps after end of sliding_window till target start (gap between data and target)\nlook_ahead = 20\n\nX_train, y_train, scaler_train = sequence_preprocessed(sequence_train, sliding_window, look_ahead)\nX_val, y_val ,scaler_val = sequence_preprocessed(sequence_val, sliding_window, look_ahead)\nX_test,y_test,scaler_test = sequence_preprocessed(sequence_test, sliding_window, look_ahead)",
"_____no_output_____"
]
],
[
[
"<h1 id=\"4\">4. Model</h1>\n\n<h3 id=\"4-1\">4-1. Defining Layers</h3>\nAdding 1D Convolution, Max Pooling, LSTM and finally Dense (MLP) layer",
"_____no_output_____"
]
],
[
[
"# define model\nmodel = Sequential()\nmodel.add(TimeDistributed(Conv1D(filters=64, kernel_size=1, activation='relu'), \n input_shape=(None, n_steps, n_features)\n ))\nmodel.add(TimeDistributed(MaxPooling1D(pool_size=1)))\nmodel.add(TimeDistributed(Flatten()))\nmodel.add(LSTM(50, activation='relu', stateful=False))\nmodel.add(Dense(1))",
"_____no_output_____"
]
],
[
[
"<h3 id=\"4-2\">4-2. Training Model</h3>\nDefined early stop, can be used in callbacks param of model fit, not using for now since it's not recommended at first few iterations of experimentation with new data",
"_____no_output_____"
]
],
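  [
    [
      "If early stopping is enabled later, the callback is passed through the <i>callbacks</i> parameter of <i>model.fit</i>. A minimal sketch (an added illustration, assuming the <code>early_stop</code> object defined in the next cell):\n\n```python\nhistory = model.fit(X_train, y_train, epochs=100, verbose=3,\n                    validation_data=(X_val, y_val),\n                    callbacks=[early_stop])\n```",
      "_____no_output_____"
    ]
  ],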
[
[
"# Defining multiple metrics, leaving it to a choice, some may be useful and few may even surprise on some problems\nmetrics = ['mean_squared_error',\n 'mean_absolute_error',\n 'mean_absolute_percentage_error',\n 'mean_squared_logarithmic_error',\n 'logcosh']\n\n# Compiling Model\nmodel.compile(optimizer='adam', loss='mape', metrics=metrics)\n\n# Defining early stop, call it in model fit callback\nearly_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=10)\n\n# Fit model\nhistory = model.fit(X_train, y_train, epochs=100, verbose=3, validation_data=(X_val,y_val))",
"Train on 960 samples, validate on 1960 samples\nEpoch 1/100\nEpoch 2/100\nEpoch 3/100\nEpoch 4/100\nEpoch 5/100\nEpoch 6/100\nEpoch 7/100\nEpoch 8/100\nEpoch 9/100\nEpoch 10/100\nEpoch 11/100\nEpoch 12/100\nEpoch 13/100\nEpoch 14/100\nEpoch 15/100\nEpoch 16/100\nEpoch 17/100\nEpoch 18/100\nEpoch 19/100\nEpoch 20/100\nEpoch 21/100\nEpoch 22/100\nEpoch 23/100\nEpoch 24/100\nEpoch 25/100\nEpoch 26/100\nEpoch 27/100\nEpoch 28/100\nEpoch 29/100\nEpoch 30/100\nEpoch 31/100\nEpoch 32/100\nEpoch 33/100\nEpoch 34/100\nEpoch 35/100\nEpoch 36/100\nEpoch 37/100\nEpoch 38/100\nEpoch 39/100\nEpoch 40/100\nEpoch 41/100\nEpoch 42/100\nEpoch 43/100\nEpoch 44/100\nEpoch 45/100\nEpoch 46/100\nEpoch 47/100\nEpoch 48/100\nEpoch 49/100\nEpoch 50/100\nEpoch 51/100\nEpoch 52/100\nEpoch 53/100\nEpoch 54/100\nEpoch 55/100\nEpoch 56/100\nEpoch 57/100\nEpoch 58/100\nEpoch 59/100\nEpoch 60/100\nEpoch 61/100\nEpoch 62/100\nEpoch 63/100\nEpoch 64/100\nEpoch 65/100\nEpoch 66/100\nEpoch 67/100\nEpoch 68/100\nEpoch 69/100\nEpoch 70/100\nEpoch 71/100\nEpoch 72/100\nEpoch 73/100\nEpoch 74/100\nEpoch 75/100\nEpoch 76/100\nEpoch 77/100\nEpoch 78/100\nEpoch 79/100\nEpoch 80/100\nEpoch 81/100\nEpoch 82/100\nEpoch 83/100\nEpoch 84/100\nEpoch 85/100\nEpoch 86/100\nEpoch 87/100\nEpoch 88/100\nEpoch 89/100\nEpoch 90/100\nEpoch 91/100\nEpoch 92/100\nEpoch 93/100\nEpoch 94/100\nEpoch 95/100\nEpoch 96/100\nEpoch 97/100\nEpoch 98/100\nEpoch 99/100\nEpoch 100/100\n"
]
],
[
[
"<h3 id=\"4-3\">4-3. Evaluating Model</h3>\nPlotting Training and Validation mean square error",
"_____no_output_____"
]
],
[
[
"# Plot Errors\n\nfor metric in metrics:\n xAxis = history.epoch\n yAxes = {}\n yAxes[\"Training\"]=history.history[metric]\n yAxes[\"Validation\"]=history.history['val_'+metric]\n plot_multi_graph(xAxis,yAxes, title=metric,xAxisLabel='Epochs')",
"_____no_output_____"
]
],
[
[
"<h1 id=\"5\">5. Prediction</h1>\n\n<h3 id=\"5-1\">5-1. Single Value Prediction</h3>\nPredicting a single value slided 20 (our provided figure for look_ahead above) values ahead",
"_____no_output_____"
]
],
[
[
"# demonstrate prediction\nx_input = array([i for i in range(100,120)])\nprint(x_input)\nx_input = x_input.reshape((1, n_seq, n_steps, n_features))\nyhat = model.predict(x_input)\nprint(yhat)",
"[100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117\n 118 119]\n[[105.82992]]\n"
]
],
[
[
"<h3 id=\"5-2\">5-2. Sequence Prediction</h3>\nPredicting complete sequence (determining closeness to target) based on data <br />\n<i>change variable for any other sequence though</i>",
"_____no_output_____"
]
],
[
[
"# Prediction from Training Set\npredict_train = model.predict(X_train)\n\n# Prediction from Test Set\npredict_test = model.predict(X_test)\n\n\"\"\"\ndf = pd.DataFrame(({\"normalized y_train\":y_train.flatten(),\n \"normalized predict_train\":predict_train.flatten(),\n \"actual y_train\":scaler_train.inverse_transform(y_train).flatten(),\n \"actual predict_train\":scaler_train.inverse_transform(predict_train).flatten(),\n }))\n\n\"\"\"\n\ndf = pd.DataFrame(({ \n \"normalized y_test\":y_test.flatten(),\n \"normalized predict_test\":predict_test.flatten(),\n \"actual y_test\":scaler_test.inverse_transform(y_test).flatten(),\n \"actual predict_test\":scaler_test.inverse_transform(predict_test).flatten()\n }))\ndf",
"_____no_output_____"
]
],
[
[
"<h1 id=\"6\">6. Complete Figure</h1>\nData, Target, Prediction - all in one single graph",
"_____no_output_____"
]
],
[
[
"xAxis = [i for i in range(len(y_train))]\nyAxes = {}\nyAxes[\"Data\"]=sequence_train[sliding_window:len(sequence_train)-look_ahead]\nyAxes[\"Target\"]=scaler_train.inverse_transform(y_train)\nyAxes[\"Prediction\"]=scaler_train.inverse_transform(predict_train)\nplot_multi_graph(xAxis,yAxes,title='')\n\nxAxis = [i for i in range(len(y_test))]\nyAxes = {}\nyAxes[\"Data\"]=sequence_test[sliding_window:len(sequence_test)-look_ahead]\nyAxes[\"Target\"]=scaler_test.inverse_transform(y_test)\nyAxes[\"Prediction\"]=scaler_test.inverse_transform(predict_test)\nplot_multi_graph(xAxis,yAxes,title='')\n\nprint(metrics)\nprint(model.evaluate(X_test,y_test))",
"['mean_squared_error', 'mean_absolute_error', 'mean_absolute_percentage_error', 'mean_squared_logarithmic_error', 'logcosh']\n3960/3960 [==============================] - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - 1s 294us/step\n[7.694095613258053, 0.00023503987094495595, 0.015312134466990077, 7.694095613258053, 0.00011939386936549021, 0.0001175134772149084]\n"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d0639b5e7b411de78ef7448fa61f58a26ff2ed77 | 47,720 | ipynb | Jupyter Notebook | models/Character_Level_CNN.ipynb | TheBlueEngineer/Serene-1.0 | 4f8c2e688c1403fda3c43c46c5ee598da3e607ea | [
"MIT"
]
| 1 | 2020-09-23T21:21:55.000Z | 2020-09-23T21:21:55.000Z | models/Character_Level_CNN.ipynb | TheBlueEngineer/Serene-1.0 | 4f8c2e688c1403fda3c43c46c5ee598da3e607ea | [
"MIT"
]
| null | null | null | models/Character_Level_CNN.ipynb | TheBlueEngineer/Serene-1.0 | 4f8c2e688c1403fda3c43c46c5ee598da3e607ea | [
"MIT"
]
| null | null | null | 57.842424 | 1,659 | 0.51536 | [
[
[
"# **Libraries**",
"_____no_output_____"
]
],
[
[
"from google.colab import drive\ndrive.mount('/content/drive')",
"Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n"
],
[
"# ***********************\n# *****| LIBRARIES |*****\n# ***********************\n%tensorflow_version 2.x\nimport pandas as pd\nimport numpy as np\nimport os\nimport json\n\nfrom sklearn.model_selection import train_test_split\nimport tensorflow as tf\nfrom keras.preprocessing.text import Tokenizer\nfrom keras.preprocessing.sequence import pad_sequences\nfrom keras.layers import Input, Embedding, Activation, Flatten, Dense\nfrom keras.layers import Conv1D, MaxPooling1D, Dropout\nfrom keras.models import Model\nfrom keras.utils import to_categorical\nfrom keras.optimizers import SGD\nfrom keras.wrappers.scikit_learn import KerasClassifier\nfrom sklearn.model_selection import RandomizedSearchCV, GridSearchCV\n\ndevice_name = tf.test.gpu_device_name()\nif device_name != '/device:GPU:0':\n print(\"GPU not found\")\nelse:\n print('Found GPU at: {}'.format(device_name))",
"Using TensorFlow backend.\n"
],
[
"# ******************************\n# *****| GLOBAL VARIABLES |*****\n# ******************************\ntest_size = 0.2\n\nconvsize = 256\nconvsize2 = 1024\nembedding_size = 27\ninput_size = 1000\nconv_layers = [\n [convsize, 7, 3],\n [convsize, 7, 3],\n [convsize, 3, -1],\n [convsize, 3, -1],\n [convsize, 3, -1],\n [convsize, 3, 3]\n ]\n\nfully_connected_layers = [convsize2, convsize2]\nnum_of_classes= 2\ndropout_p = 0.5\noptimizer= 'adam'\nbatch = 128\nloss = 'categorical_crossentropy'",
"_____no_output_____"
]
],
[
[
"# **Utility functions**",
"_____no_output_____"
]
],
[
[
"# *****************\n# *** GET FILES ***\n# *****************\ndef getFiles( driverPath, directory, basename, extension): # Define a function that will return a list of files\n pathList = [] # Declare an empty array\n directory = os.path.join( driverPath, directory) # \n \n for root, dirs, files in os.walk( directory): # Iterate through roots, dirs and files recursively\n for file in files: # For every file in files\n if os.path.basename(root) == basename: # If the parent directory of the current file is equal with the parameter\n if file.endswith('.%s' % (extension)): # If the searched file ends in the parameter\n path = os.path.join(root, file) # Join together the root path and file name\n pathList.append(path) # Append the new path to the list\n return pathList ",
"_____no_output_____"
],
[
"# ****************************************\n# *** GET DATA INTO A PANDAS DATAFRAME ***\n# ****************************************\ndef getDataFrame( listFiles, maxFiles, minWords, limit):\n counter_real, counter_max, limitReached = 0, 0, 0\n text_list, label_list = [], []\n\n print(\"Word min set to: %i.\" % ( minWords))\n # Iterate through all the files\n for file in listFiles:\n # Open each file and look into it\n with open(file) as f:\n if(limitReached):\n break\n if maxFiles == 0:\n break\n else:\n maxFiles -= 1\n objects = json.loads( f.read())['data'] # Get the data from the JSON file\n # Look into each object from the file and test for limiters\n for object in objects:\n if limit > 0 and counter_real >= (limit * 1000):\n limitReached = 1\n break\n if len( object['text'].split()) >= minWords:\n text_list.append(object['text'])\n label_list.append(object['label'])\n counter_real += 1\n counter_max += 1\n\n if(counter_real > 0 and counter_max > 0):\n ratio = counter_real / counter_max * 100\n else:\n ratio = 0\n # Print the final result\n print(\"Lists created with %i/%i (%.2f%%) data objects.\" % ( counter_real, counter_max, ratio))\n print(\"Rest ignored due to minimum words limit of %i or the limit of %i data objects maximum.\" % ( minWords, limit * 1000))\n # Return the final Pandas DataFrame\n return text_list, label_list, counter_real",
"_____no_output_____"
]
],
[
[
"# **Gather the path to files**",
"_____no_output_____"
]
],
[
[
"# ***********************************\n# *** GET THE PATHS FOR THE FILES ***\n# ***********************************\n\n# Path to the content of the Google Drive \ndriverPath = \"/content/drive/My Drive\"\n\n# Sub-directories in the driver\npaths = [\"processed/depression/submission\",\n \"processed/depression/comment\", \n \"processed/AskReddit/submission\", \n \"processed/AskReddit/comment\"]\n\nfiles = [None] * len(paths)\nfor i in range(len(paths)):\n files[i] = getFiles( driverPath, paths[i], \"text\", \"json\")\n print(\"Gathered %i files from %s.\" % ( len(files[i]), paths[i]))",
"Gathered 750 files from processed/depression/submission.\nGathered 2892 files from processed/depression/comment.\nGathered 1311 files from processed/AskReddit/submission.\nGathered 5510 files from processed/AskReddit/comment.\n"
]
],
[
[
"# **Gather the data from files**",
"_____no_output_____"
]
],
[
[
"# ************************************\n# *** GATHER THE DATA AND SPLIT IT ***\n# ************************************\n# Local variables\nrand_state_splitter = 1000\ntest_size = 0.2\n\nmin_files = [ 750, 0, 1300, 0] \nmax_words = [ 50, 0, 50, 0]\nlimit_packets = [300, 0, 300, 0]\nmessage = [\"Depression submissions\", \"Depression comments\", \"AskReddit submissions\", \"AskReddit comments\"]\ntext, label = [], []\n\n# Get the pandas data frames for each category\nprint(\"Build the Pandas DataFrames for each category.\")\nfor i in range(4):\n dummy_text, dummy_label, counter = getDataFrame( files[i], min_files[i], max_words[i], limit_packets[i])\n if counter > 0:\n text += dummy_text\n label += dummy_label\n dummy_text, dummy_label = None, None\n print(\"Added %i samples to data list: %s.\\n\" % ( counter ,message[i]) )\n\n# Splitting the data\nx_train, x_test, y_train, y_test = train_test_split(text, \n label, \n test_size = test_size, \n shuffle = True, \n random_state = rand_state_splitter)\nprint(\"Training data: %i samples.\" % ( len(y_train)) )\nprint(\"Testing data: %i samples.\" % ( len(y_test)) )\n\n# Clear data no longer needed\ndel rand_state_splitter, min_files, max_words, message, dummy_label, dummy_text",
"Build the Pandas DataFrames for each category.\nWord min set to: 50.\nLists created with 300000/349305 (85.88%) data objects.\nRest ignored due to minimum words limit of 50 or the limit of 300000 data objects maximum.\nAdded 300000 samples to data list: Depression submissions.\n\nWord min set to: 0.\nLists created with 0/0 (0.00%) data objects.\nRest ignored due to minimum words limit of 0 or the limit of 0 data objects maximum.\nWord min set to: 50.\nLists created with 300000/554781 (54.08%) data objects.\nRest ignored due to minimum words limit of 50 or the limit of 300000 data objects maximum.\nAdded 300000 samples to data list: AskReddit submissions.\n\nWord min set to: 0.\nLists created with 0/0 (0.00%) data objects.\nRest ignored due to minimum words limit of 0 or the limit of 0 data objects maximum.\nTraining data: 480000 samples.\nTesting data: 120000 samples.\n"
]
],
[
[
"# **Process the data at a character-level**",
"_____no_output_____"
]
],
[
[
"# *******************************\n# *** CONVERT STRING TO INDEX ***\n# *******************************\nprint(\"Convert the strings to indexes.\")\ntk = Tokenizer(num_words = None, char_level = True, oov_token='UNK')\ntk.fit_on_texts(x_train)\nprint(\"Original:\", x_train[0])\n# *********************************\n# *** CONSTRUCT A NEW VOCABULARY***\n# *********************************\nprint(\"Construct a new vocabulary\")\nalphabet = \"abcdefghijklmnopqrstuvwxyz\"\nchar_dict = {}\nfor i, char in enumerate(alphabet):\n char_dict[char] = i + 1\nprint(\"dictionary\")\ntk.word_index = char_dict.copy() # Use char_dict to replace the tk.word_index\nprint(tk.word_index)\ntk.word_index[tk.oov_token] = max(char_dict.values()) + 1 # Add 'UNK' to the vocabulary\nprint(tk.word_index)\n# *************************\n# *** TEXT TO SEQUENCES ***\n# *************************\nprint(\"Text to sequence.\")\nx_train = tk.texts_to_sequences(x_train)\nx_test = tk.texts_to_sequences(x_test)\nprint(\"After sequences:\", x_train[0])\n# ***************\n# *** PADDING ***\n# ***************\nprint(\"Padding the sequences.\")\nx_train = pad_sequences( x_train, maxlen = input_size, padding = 'post')\nx_test = pad_sequences( x_test, maxlen= input_size , padding = 'post')\n\n# ************************\n# *** CONVERT TO NUMPY ***\n# ************************\nprint(\"Convert to Numpy arrays\")\nx_train = np.array( x_train, dtype = 'float32')\nx_test = np.array(x_test, dtype = 'float32')\n\n# **************************************\n# *** GET CLASSES FOR CLASSIFICATION ***\n# **************************************\ny_test_copy = y_test\ny_train_list = [x-1 for x in y_train]\ny_test_list = [x-1 for x in y_test]\n\ny_train = to_categorical( y_train_list, num_of_classes)\ny_test = to_categorical( y_test_list, num_of_classes)",
"Convert the strings to indexes.\nOriginal: i did not think i had have to post in this subreddit i just feel empty and completely alone i am hanging out with friends but nothing makes me feel happy as i used to be i know people generally have it worse i just want someone to talk to and just be silly with \nConstruct a new vocabulary\ndictionary\n{'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 6, 'g': 7, 'h': 8, 'i': 9, 'j': 10, 'k': 11, 'l': 12, 'm': 13, 'n': 14, 'o': 15, 'p': 16, 'q': 17, 'r': 18, 's': 19, 't': 20, 'u': 21, 'v': 22, 'w': 23, 'x': 24, 'y': 25, 'z': 26}\n{'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 6, 'g': 7, 'h': 8, 'i': 9, 'j': 10, 'k': 11, 'l': 12, 'm': 13, 'n': 14, 'o': 15, 'p': 16, 'q': 17, 'r': 18, 's': 19, 't': 20, 'u': 21, 'v': 22, 'w': 23, 'x': 24, 'y': 25, 'z': 26, 'UNK': 27}\nText to sequence.\nAfter sequences: [9, 27, 4, 9, 4, 27, 14, 15, 20, 27, 20, 8, 9, 14, 11, 27, 9, 27, 8, 1, 4, 27, 8, 1, 22, 5, 27, 20, 15, 27, 16, 15, 19, 20, 27, 9, 14, 27, 20, 8, 9, 19, 27, 19, 21, 2, 18, 5, 4, 4, 9, 20, 27, 9, 27, 10, 21, 19, 20, 27, 6, 5, 5, 12, 27, 5, 13, 16, 20, 25, 27, 1, 14, 4, 27, 3, 15, 13, 16, 12, 5, 20, 5, 12, 25, 27, 1, 12, 15, 14, 5, 27, 9, 27, 1, 13, 27, 8, 1, 14, 7, 9, 14, 7, 27, 15, 21, 20, 27, 23, 9, 20, 8, 27, 6, 18, 9, 5, 14, 4, 19, 27, 2, 21, 20, 27, 14, 15, 20, 8, 9, 14, 7, 27, 13, 1, 11, 5, 19, 27, 13, 5, 27, 6, 5, 5, 12, 27, 8, 1, 16, 16, 25, 27, 1, 19, 27, 9, 27, 21, 19, 5, 4, 27, 20, 15, 27, 2, 5, 27, 9, 27, 11, 14, 15, 23, 27, 16, 5, 15, 16, 12, 5, 27, 7, 5, 14, 5, 18, 1, 12, 12, 25, 27, 8, 1, 22, 5, 27, 9, 20, 27, 23, 15, 18, 19, 5, 27, 9, 27, 10, 21, 19, 20, 27, 23, 1, 14, 20, 27, 19, 15, 13, 5, 15, 14, 5, 27, 20, 15, 27, 20, 1, 12, 11, 27, 20, 15, 27, 1, 14, 4, 27, 10, 21, 19, 20, 27, 2, 5, 27, 19, 9, 12, 12, 25, 27, 23, 9, 20, 8, 27]\nPadding the sequences.\nConvert to Numpy arrays\n"
]
],
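[
[
"# Hedged illustration (added): the fitted tokenizer on one toy string, showing how\n# characters map to indexes, how an out-of-vocabulary character ('!') falls back to\n# the 'UNK' index, and how post-padding fills with zeros.\n_demo_seq = tk.texts_to_sequences(['abz!'])\nprint(_demo_seq) # expected: [[1, 2, 26, 27]]\nprint(pad_sequences(_demo_seq, maxlen = 10, padding = 'post'))",
"_____no_output_____"
]
],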
[
[
"# **Load embedding words**",
"_____no_output_____"
]
],
[
[
"# ***********************\n# *** LOAD EMBEDDINGS ***\n# ***********************\nembedding_weights = []\nvocab_size = len(tk.word_index)\nembedding_weights.append(np.zeros(vocab_size))\n\nfor char, i in tk.word_index.items():\n onehot = np.zeros(vocab_size)\n onehot[i-1] = 1\n embedding_weights.append(onehot)\nembedding_weights = np.array(embedding_weights)\n\nprint(\"Vocabulary size: \",vocab_size)\nprint(\"Embedding weights: \", embedding_weights)",
"Vocabulary size: 27\nEmbedding weights: [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 1. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 1. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 1.]]\n"
]
],
[
[
"# **Build the CNN model**",
"_____no_output_____"
]
],
[
[
"def KerasModel():\n # ***************************************\n # *****| BUILD THE NEURAL NETWORK |******\n # ***************************************\n embedding_layer = Embedding(vocab_size+1,\n embedding_size,\n input_length = input_size,\n weights = [embedding_weights])\n\n # Input layer\n inputs = Input(shape=(input_size,), name='input', dtype='int64')\n\n # Embedding layer\n x = embedding_layer(inputs)\n\n # Convolution\n for filter_num, filter_size, pooling_size in conv_layers:\n x = Conv1D(filter_num, filter_size)(x)\n x = Activation('relu')(x)\n if pooling_size != -1:\n x = MaxPooling1D( pool_size = pooling_size)(x)\n x = Flatten()(x)\n\n # Fully Connected layers\n for dense_size in fully_connected_layers:\n x = Dense( dense_size, activation='relu')(x)\n x = Dropout( dropout_p)(x)\n\n # Output Layer\n predictions = Dense(num_of_classes, activation = 'softmax')(x)\n\n # BUILD MODEL\n model = Model( inputs = inputs, outputs = predictions)\n model.compile(optimizer = optimizer, loss = loss, metrics = ['accuracy'])\n model.summary()\n\n return model",
"_____no_output_____"
]
],
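[
[
"# Added note: KerasModel reads its hyperparameters from globals defined earlier in\n# the notebook. The values below are reconstructed from the model summary printed by\n# the training cell and are a hedged sketch, not the author's exact cell; dropout_p,\n# optimizer and loss cannot be recovered from the summary, so they are assumptions.\n#input_size = 1000\n#embedding_size = 27\n#conv_layers = [[256, 7, 3], [256, 7, 3], [256, 3, -1],\n#               [256, 3, -1], [256, 3, -1], [256, 3, 3]]\n#fully_connected_layers = [1024, 1024]\n#num_of_classes = 2\n#dropout_p = 0.5\n#optimizer, loss = 'adam', 'categorical_crossentropy'",
"_____no_output_____"
]
],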
[
[
"# **Train the CNN**",
"_____no_output_____"
]
],
[
[
"#with tf.device(\"/gpu:0\"):\n# history = model.fit(x_train, y_train,\n# validation_data = ( x_test, y_test),\n# epochs = 10,\n# batch_size = batch,\n# verbose = True)\n \nwith tf.device(\"/gpu:0\"):\n grid = KerasClassifier(build_fn = KerasModel, epochs = 15, verbose= True)\n param_grid = dict(\n epochs = [15]\n )\n #grid = GridSearchCV(estimator = model, \n # param_grid = param_grid,\n # cv = 5, \n # verbose = 10, \n # return_train_score = True)\n \n grid_result = grid.fit(x_train, y_train)",
"Model: \"model_1\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\ninput (InputLayer) (None, 1000) 0 \n_________________________________________________________________\nembedding_1 (Embedding) (None, 1000, 27) 756 \n_________________________________________________________________\nconv1d_1 (Conv1D) (None, 994, 256) 48640 \n_________________________________________________________________\nactivation_1 (Activation) (None, 994, 256) 0 \n_________________________________________________________________\nmax_pooling1d_1 (MaxPooling1 (None, 331, 256) 0 \n_________________________________________________________________\nconv1d_2 (Conv1D) (None, 325, 256) 459008 \n_________________________________________________________________\nactivation_2 (Activation) (None, 325, 256) 0 \n_________________________________________________________________\nmax_pooling1d_2 (MaxPooling1 (None, 108, 256) 0 \n_________________________________________________________________\nconv1d_3 (Conv1D) (None, 106, 256) 196864 \n_________________________________________________________________\nactivation_3 (Activation) (None, 106, 256) 0 \n_________________________________________________________________\nconv1d_4 (Conv1D) (None, 104, 256) 196864 \n_________________________________________________________________\nactivation_4 (Activation) (None, 104, 256) 0 \n_________________________________________________________________\nconv1d_5 (Conv1D) (None, 102, 256) 196864 \n_________________________________________________________________\nactivation_5 (Activation) (None, 102, 256) 0 \n_________________________________________________________________\nconv1d_6 (Conv1D) (None, 100, 256) 196864 \n_________________________________________________________________\nactivation_6 (Activation) (None, 100, 256) 0 \n_________________________________________________________________\nmax_pooling1d_3 (MaxPooling1 (None, 33, 256) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 8448) 0 \n_________________________________________________________________\ndense_1 (Dense) (None, 1024) 8651776 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 1024) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 1024) 1049600 \n_________________________________________________________________\ndropout_2 (Dropout) (None, 1024) 0 \n_________________________________________________________________\ndense_3 (Dense) (None, 2) 2050 \n=================================================================\nTotal params: 10,999,286\nTrainable params: 10,999,286\nNon-trainable params: 0\n_________________________________________________________________\n"
]
],
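[
[
"# Hedged sketch (added): if the commented-out grid search above were re-enabled, the\n# estimator should be the KerasClassifier wrapper rather than a raw Keras model:\n#search = GridSearchCV(estimator = KerasClassifier(build_fn = KerasModel, verbose = True),\n#                      param_grid = dict(epochs = [10, 15]),\n#                      cv = 5,\n#                      verbose = 10,\n#                      return_train_score = True)\n#grid_result = search.fit(x_train, y_train)",
"_____no_output_____"
]
],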
[
[
"# **Test the CNN**",
"_____no_output_____"
]
],
[
[
"#loss, accuracy = model.evaluate( x_train, y_train, verbose = True)\n#print(\"Training Accuracy: {:.4f}\".format( accuracy))\n#loss, accuracy = model.evaluate( x_test, y_test, verbose = True)\n#print(\"Testing Accuracy: {:.4f}\".format( accuracy))\n\nfrom sklearn.metrics import classification_report, confusion_matrix\ny_predict = grid.predict( x_test)\n# Build the confusion matrix \ny_tested = y_test\nprint( type(y_test))\nprint(y_tested)\ny_tested = np.argmax( y_tested, axis = 1)\nprint(y_tested)\nconfMatrix = confusion_matrix(y_tested, y_predict) \ntn, fp, fn, tp = confMatrix.ravel() \n# Build a classification report \nclassification_reports = classification_report( y_tested, y_predict, target_names = ['Non-depressed', 'Depressed'], digits=3)\nprint(confMatrix)\nprint(classification_reports)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d063a4b442a49c71d22336e6b555d4a0dd1f82bf | 404,490 | ipynb | Jupyter Notebook | src/plotting/OpenChromatin_plotsold.ipynb | Switham1/PromoterArchitecture | 0a9021b869ac66cdd622be18cd029950314d111e | [
"MIT"
]
| null | null | null | src/plotting/OpenChromatin_plotsold.ipynb | Switham1/PromoterArchitecture | 0a9021b869ac66cdd622be18cd029950314d111e | [
"MIT"
]
| null | null | null | src/plotting/OpenChromatin_plotsold.ipynb | Switham1/PromoterArchitecture | 0a9021b869ac66cdd622be18cd029950314d111e | [
"MIT"
]
| null | null | null | 154.444444 | 43,436 | 0.855368 | [
[
[
"import pandas as pd\nimport numpy as np\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom scipy import stats\nfrom statsmodels.formula.api import ols\nimport researchpy as rp\nfrom pingouin import kruskal\nfrom pybedtools import BedTool",
"_____no_output_____"
],
[
"RootChomatin_bp_covered = '../../data/promoter_analysis/responsivepromotersRootOpenChrom.bp_covered.txt'\nShootChomatin_bp_covered = '../../data/promoter_analysis/responsivepromotersShootOpenChrom.bp_covered.txt'\nRootShootIntersect_bp_covered = '../../data/promoter_analysis/responsivepromotersShootRootIntersectOpenChrom.bp_covered.txt'",
"_____no_output_____"
],
[
"def add_chr_linestart(input_location,output_location):\n \"\"\"this function adds chr to the beginning of the line if it starts with a digit and saves a file\"\"\"\n output = open(output_location, 'w') #make output file with write capability\n #open input file\n with open(input_location, 'r') as infile: \n #iterate over lines in file\n for line in infile:\n line = line.strip() # removes hidden characters/spaces\n if line[0].isdigit():\n \n line = 'chr' + line #prepend chr to the beginning of line if starts with a digit\n output.write(line + '\\n') #output to new file\n output.close()",
"_____no_output_____"
],
[
"def percent_coverage(bp_covered):\n \"\"\"function to calculate the % coverage from the output file of bedtools coverage\"\"\"\n\n coverage_df = pd.read_table(bp_covered, sep='\\t', header=None)\n col = ['chr','start','stop','gene','dot','strand','source', 'type', 'dot2', 'details', 'no._of_overlaps', 'no._of_bases_covered','promoter_length','fraction_bases_covered']\n coverage_df.columns = col\n #add % bases covered column\n coverage_df['percentage_bases_covered'] = coverage_df.fraction_bases_covered * 100\n\n #remove unnecessary columns\n coverage_df_reduced_columns = coverage_df[['chr','start','stop','gene','strand', 'no._of_overlaps', 'no._of_bases_covered','promoter_length','fraction_bases_covered','percentage_bases_covered']]\n return coverage_df_reduced_columns",
"_____no_output_____"
],
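[
"# Added context (assumption): the *.bp_covered.txt files parsed above are the standard\n# output of `bedtools coverage`; the file names below are placeholders.\n# !bedtools coverage -a promoters.bed -b open_chromatin_peaks.bed > promoters.bp_covered.txt\n# That command appends no._of_overlaps, no._of_bases_covered, promoter_length and\n# fraction_bases_covered to each promoter interval, matching the columns parsed above.",
"_____no_output_____"
],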
[
"root_coverage = percent_coverage(RootChomatin_bp_covered)",
"_____no_output_____"
],
[
"shoot_coverage = percent_coverage(ShootChomatin_bp_covered)",
"_____no_output_____"
],
[
"rootshootintersect_coverage = percent_coverage(RootShootIntersect_bp_covered)",
"_____no_output_____"
],
[
"sns.set(color_codes=True)\nsns.set_style(\"whitegrid\")",
"_____no_output_____"
],
[
"#distribution plot",
"_____no_output_____"
],
[
"dist_plot = root_coverage['percentage_bases_covered']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()\n\n#save to file\n#dist_plot_fig.savefig('../../data/plots/TFBS_coverage/all_genes_bp_covered_dist.pdf', format='pdf')\n",
"_____no_output_____"
],
[
"dist_plot = shoot_coverage['percentage_bases_covered']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()\n\n#save to file\n#dist_plot_fig.savefig('../../data/plots/TFBS_coverage/all_genes_bp_covered_dist.pdf', format='pdf')\n",
"_____no_output_____"
],
[
"dist_plot = rootshootintersect_coverage['percentage_bases_covered']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()\n\n#save to file\n#dist_plot_fig.savefig('../../data/plots/TFBS_coverage/all_genes_bp_covered_dist.pdf', format='pdf')\n",
"_____no_output_____"
]
],
[
[
"## constitutive vs variable",
"_____no_output_____"
]
],
[
[
"def add_genetype(coverage):\n \"\"\"function to add gene type to the df, and remove random genes\"\"\"\n select_genes_file = '../../data/genomes/ara_housekeeping_list.out'\n select_genes = pd.read_table(select_genes_file, sep='\\t', header=None)\n cols = ['gene','gene_type']\n select_genes.columns = cols\n merged = pd.merge(coverage, select_genes, on='gene')\n \n merged_renamed = merged.copy()\n merged_renamed.gene_type.replace('housekeeping','constitutive', inplace=True)\n merged_renamed.gene_type.replace('highVar','variable', inplace=True)\n merged_renamed.gene_type.replace('randCont','random', inplace=True)\n \n # no_random = merged_renamed[merged_renamed.gene_type != 'random']\n # no_random.reset_index(drop=True, inplace=True)\n \n return merged_renamed",
"_____no_output_____"
],
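[
"# Hedged illustration (added): add_genetype expects ara_housekeeping_list.out to be a\n# headerless, tab-separated file of gene / gene_type pairs. A synthetic example of the\n# merge-and-rename it performs:\n_demo_coverage = pd.DataFrame({'gene': ['AT1G01010'], 'percentage_bases_covered': [12.5]})\n_demo_types = pd.DataFrame({'gene': ['AT1G01010'], 'gene_type': ['housekeeping']})\n_demo = pd.merge(_demo_coverage, _demo_types, on='gene')\n_demo.gene_type.replace('housekeeping', 'constitutive', inplace=True)\nprint(_demo)",
"_____no_output_____"
],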
[
"roots_merged = add_genetype(root_coverage)\nno_random_roots = roots_merged[roots_merged.gene_type != 'random']",
"_____no_output_____"
],
[
"shoots_merged = add_genetype(shoot_coverage)\nno_random_shoots = shoots_merged[shoots_merged.gene_type != 'random']",
"_____no_output_____"
],
[
"rootsshootsintersect_merged = add_genetype(rootshootintersect_coverage)\nno_random_rootsshoots = rootsshootsintersect_merged[rootsshootsintersect_merged.gene_type != 'random']",
"_____no_output_____"
],
[
"#how many have open chromatin??\nprint('root openchromatin present:')\nprint(len(no_random_roots)-len(no_random_roots[no_random_roots.percentage_bases_covered == 0]))\nprint('shoot openchromatin present:')\nprint(len(no_random_shoots)-len(no_random_shoots[no_random_shoots.percentage_bases_covered == 0]))\nprint('root-shoot intersect openchromatin present:')\nprint(len(no_random_rootsshoots)-len(no_random_rootsshoots[no_random_rootsshoots.percentage_bases_covered == 0]))",
"root openchromatin present:\n164\nshoot openchromatin present:\n153\nroot-shoot intersect openchromatin present:\n149\n"
],
[
"#how many have open chromatin??\nprint('root openchromatin present variable promoters:')\nprint(len(no_random_roots[no_random_roots.gene_type=='variable'])-len(no_random_roots[no_random_roots.gene_type=='variable'][no_random_roots[no_random_roots.gene_type=='variable'].percentage_bases_covered == 0]))\nprint('root openchromatin present constitutive promoters:')\nprint(len(no_random_roots[no_random_roots.gene_type=='constitutive'])-len(no_random_roots[no_random_roots.gene_type=='constitutive'][no_random_roots[no_random_roots.gene_type=='constitutive'].percentage_bases_covered == 0]))\n\n\nprint('shoot openchromatin present variable promoters:')\nprint(len(no_random_shoots[no_random_shoots.gene_type=='variable'])-len(no_random_shoots[no_random_shoots.gene_type=='variable'][no_random_shoots[no_random_shoots.gene_type=='variable'].percentage_bases_covered == 0]))\nprint('shoot openchromatin present constitutive promoters:')\nprint(len(no_random_shoots[no_random_shoots.gene_type=='constitutive'])-len(no_random_shoots[no_random_shoots.gene_type=='constitutive'][no_random_shoots[no_random_shoots.gene_type=='constitutive'].percentage_bases_covered == 0]))\n\nprint('root-shoot intersect openchromatin present variable promoters:')\nprint(len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='variable'])-len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='variable'][no_random_rootsshoots[no_random_rootsshoots.gene_type=='variable'].percentage_bases_covered == 0]))\nprint('root-shoot intersect openchromatin present constitutive promoters:')\nprint(len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='constitutive'])-len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='constitutive'][no_random_rootsshoots[no_random_rootsshoots.gene_type=='constitutive'].percentage_bases_covered == 0]))",
"root openchromatin present variable promoters:\n75\nroot openchromatin present constitutive promoters:\n89\nshoot openchromatin present variable promoters:\n66\nshoot openchromatin present constitutive promoters:\n87\nroot-shoot intersect openchromatin present variable promoters:\n63\nroot-shoot intersect openchromatin present constitutive promoters:\n86\n"
],
[
"sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=roots_merged) #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered.pdf', format='pdf')",
"_____no_output_____"
],
[
"sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=shoots_merged) #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered.pdf', format='pdf')",
"_____no_output_____"
],
[
"#roots\nplot = sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", kind='box', data=no_random_roots)\n#plot points\nax = sns.swarmplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=no_random_roots, color=\".25\")\nplt.ylabel('Percentage bases covered')\nplt.xlabel('Gene type');\n#ax.get_figure() #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered_boxplot.pdf', format='pdf')",
"_____no_output_____"
],
[
"#shoots\nplot = sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", kind='box', data=no_random_shoots)\n#plot points\nax = sns.swarmplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=no_random_shoots, color=\".25\")\nplt.ylabel('Percentage bases covered')\nplt.xlabel('Gene type');\n#ax.get_figure() #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered_boxplot.pdf', format='pdf')",
"_____no_output_____"
],
[
"#roots-shoots intersect\nplot = sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", kind='box', data=no_random_rootsshoots)\n#plot points\nax = sns.swarmplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=no_random_rootsshoots, color=\".25\")\nplt.ylabel('Percentage bases covered')\nplt.xlabel('Gene type');\n#ax.get_figure() #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered_boxplot.pdf', format='pdf')",
"_____no_output_____"
],
[
"#Get names of each promoter\ndef normality(input_proms):\n \"\"\"function to test normality of data - returns test statistic, p-value\"\"\"\n #Get names of each promoter\n pd.Categorical(input_proms.gene_type)\n names = input_proms.gene_type.unique()\n# for name in names:\n# print(name)\n \n for name in names:\n print('{}: {}'.format(name, stats.shapiro(input_proms.percentage_bases_covered[input_proms.gene_type == name])))\n ",
"_____no_output_____"
],
[
"def variance(input_proms):\n \"\"\"function to test variance of data\"\"\"\n#test variance\n constitutive = input_proms[input_proms.gene_type == 'constitutive']\n #reset indexes so residuals can be calculated later\n constitutive.reset_index(inplace=True)\n\n responsive = input_proms[input_proms.gene_type == 'variable']\n responsive.reset_index(inplace=True)\n\n control = input_proms[input_proms.gene_type == 'random']\n control.reset_index(inplace=True)\n\n print(stats.levene(constitutive.percentage_bases_covered, responsive.percentage_bases_covered))",
"_____no_output_____"
],
[
"normality(no_random_roots)",
"variable: (0.8330899477005005, 3.833479311765586e-09)\nconstitutive: (0.7916173934936523, 1.8358696507458916e-10)\n"
],
[
"normality(no_random_shoots)",
"variable: (0.8625870943069458, 4.528254393676434e-08)\nconstitutive: (0.8724747896194458, 1.1140339495341323e-07)\n"
],
[
"normality(no_random_rootsshoots)",
"variable: (0.8546600937843323, 2.263117515610702e-08)\nconstitutive: (0.8711197376251221, 9.823354929494599e-08)\n"
]
],
[
[
"## Not normal",
"_____no_output_____"
]
],
[
[
"variance(no_random_roots)",
"LeveneResult(statistic=3.3550855113629137, pvalue=0.0685312309497174)\n"
],
[
"variance(no_random_shoots)",
"LeveneResult(statistic=0.20460439034148425, pvalue=0.6515350841099911)\n"
],
[
"variance(no_random_rootsshoots)",
"LeveneResult(statistic=0.00041366731166758155, pvalue=0.9837939970964911)\n"
]
],
[
[
"## unequal variance for shoots",
"_____no_output_____"
]
],
[
[
"def kruskal_test(input_data):\n \"\"\"function to do kruskal-wallis test on data\"\"\" \n \n #print('\\033[1m' +promoter + '\\033[0m')\n print(kruskal(data=input_data, dv='percentage_bases_covered', between='gene_type'))\n #print('')",
"_____no_output_____"
],
[
"no_random_roots",
"_____no_output_____"
],
[
"kruskal_test(no_random_roots)",
" Source ddof1 H p-unc\nKruskal gene_type 1 7.281793 0.006966\n"
],
[
"kruskal_test(no_random_shoots)",
" Source ddof1 H p-unc\nKruskal gene_type 1 20.935596 0.000005\n"
],
[
"kruskal_test(no_random_rootsshoots)",
" Source ddof1 H p-unc\nKruskal gene_type 1 22.450983 0.000002\n"
]
],
[
[
"## try gat enrichment",
"_____no_output_____"
]
],
[
[
"#add Chr to linestart of chromatin bed files\n\nadd_chr_linestart('../../data/ATAC-seq/potter2018/Shoots_NaOH_peaks_all.bed','../../data/ATAC-seq/potter2018/Shoots_NaOH_peaks_all_renamed.bed')\nadd_chr_linestart('../../data/ATAC-seq/potter2018/Roots_NaOH_peaks_all.bed','../../data/ATAC-seq/potter2018/Roots_NaOH_peaks_all_renamed.bed')\nadd_chr_linestart('../../data/ATAC-seq/potter2018/intersectRootsShoots_PeaksInBoth.bed','../../data/ATAC-seq/potter2018/intersectRootsShoots_PeaksInBoth_renamed.bed')",
"_____no_output_____"
],
[
"#create a bed file containing all 100 constitutive/responsive promoters with the fourth column annotating whether it's constitutive or responsive\nproms_file = '../../data/genes/constitutive-variable-random_100_each.csv'\npromoters = pd.read_csv(proms_file)\npromoters\ncols2 = ['delete','promoter_AGI', 'gene_type']\npromoters_df = promoters[['promoter_AGI','gene_type']]\npromoters_no_random = promoters_df.copy()\n#drop randCont rows\npromoters_no_random = promoters_df[~(promoters_df.gene_type == 'randCont')]\npromoters_no_random",
"_____no_output_____"
],
[
"#merge promoters with genetype selected\npromoterbedfile = '../../data/FIMO/responsivepromoters.bed'\npromoters_bed = pd.read_table(promoterbedfile, sep='\\t', header=None)\ncols = ['chr', 'start', 'stop', 'promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\npromoters_bed.columns = cols\nmerged = pd.merge(promoters_bed,promoters_no_random, on='promoter_AGI')",
"_____no_output_____"
],
[
"#add gene_type to column3\nmerged = merged[['chr','start','stop','gene_type','promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']]",
"_____no_output_____"
],
[
"#write to bed file\npromoter_file = '../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace.bed'\nwith open(promoter_file,'w') as f:\n merged.to_csv(f,index=False,sep='\\t',header=None)",
"_____no_output_____"
],
[
"# new_merged = merged.astype({'start': 'int'})\n# new_merged = merged.astype({'stop': 'int'})\n# new_merged = merged.astype({'chr': 'int'})",
"_____no_output_____"
],
[
"#add Chr to linestart of promoter bed file\n\nadd_chr_linestart('../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace.bed','../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace_renamed.bed')",
"_____no_output_____"
],
[
"#create separate variable and constitutive and gat workspace\npromoter_file_renamed = '../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace_renamed.bed'\npromoters = pd.read_table(promoter_file_renamed, sep='\\t', header=None)\n#make a new gat workspace file with all promoters (first 3 columns)\nbed = BedTool.from_dataframe(promoters[[0,1,2]]).saveas('../../data/promoter_analysis/chromatin/variable_constitutive_promoters_1000bp_workspace.bed')\n#select only variable promoters\nvariable_promoters = promoters[promoters[3] == 'highVar']\nsorted_variable = variable_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_variable).saveas('../../data/promoter_analysis/chromatin/variable_promoters_1000bp.bed')\n#make a constitutive only file\nconstitutive_promoters = promoters[promoters[3] == 'housekeeping']\nsorted_constitutive = constitutive_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_constitutive).saveas('../../data/promoter_analysis/chromatin/constitutive_promoters_1000bp.bed')",
"_____no_output_____"
]
],
[
[
"## now I will do the plots with non-overlapping promoters including the 5'UTR",
"_____no_output_____"
]
],
[
[
"#merge promoters with genetype selected\npromoter_UTR = '../../data/FIMO/non-overlapping_includingbidirectional_all_genes/promoters_5UTR_renamedChr.bed'\npromoters_bed = pd.read_table(promoter_UTR, sep='\\t', header=None)\ncols = ['chr', 'start', 'stop', 'promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\npromoters_bed.columns = cols\nmerged = pd.merge(promoters_bed,promoters_no_random, on='promoter_AGI')",
"_____no_output_____"
],
[
"#how many constitutive genes left after removed/shortened overlapping\nlen(merged[merged.gene_type == 'housekeeping'])",
"_____no_output_____"
],
[
"#how many variable genes left after removed/shortened overlapping\nlen(merged[merged.gene_type == 'highVar'])",
"_____no_output_____"
],
[
"merged['length'] = (merged.start - merged.stop).abs()\nmerged.sort_values('length',ascending=True)",
"_____no_output_____"
],
[
"#plot of lengths\ndist_plot = merged['length']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()",
"_____no_output_____"
],
[
"#remove 2 genes from constitutive group so equal sample size to variable\n#random sample of 98, using seed 1\nmerged[merged.gene_type == 'housekeeping'] = merged[merged.gene_type == 'housekeeping'].sample(98, random_state=1)",
"_____no_output_____"
],
[
"#drop rows with at least 2 NaNs\nmerged = merged.dropna(thresh=2)",
"_____no_output_____"
],
[
"merged",
"_____no_output_____"
],
[
"#write to bed file so can run OpenChromatin_coverage.py\nnew_promoter_file = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive.bed'\ncols = ['chr', 'start', 'stop', 'promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\n#remove trailing decimal .0 from start and stop\nmerged = merged.astype({'start': 'int'})\nmerged = merged.astype({'stop': 'int'})\nmerged = merged.astype({'chr': 'int'})\n\nmerged_coverage = merged[cols]\n\nwith open(new_promoter_file,'w') as f:\n merged_coverage.to_csv(f,index=False,sep='\\t',header=None)",
"_____no_output_____"
],
[
"#write to bed file so can run gat\nnew_promoter_file_gat = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat.bed'\ncols_gat = ['chr', 'start', 'stop', 'gene_type','promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\nmerged_gat = merged[cols_gat]\nwith open(new_promoter_file_gat,'w') as f:\n merged_gat.to_csv(f,index=False,sep='\\t',header=None)\n",
"_____no_output_____"
],
[
"#Read in new files\nRootChomatin_bp_covered = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutiveRootOpenChrom.bp_covered.txt'\nShootChomatin_bp_covered = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutiveShootOpenChrom.bp_covered.txt'\nRootShootIntersect_bp_covered = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutiveShootRootIntersectOpenChrom.bp_covered.txt'",
"_____no_output_____"
],
[
"root_coverage = percent_coverage(RootChomatin_bp_covered)\nshoot_coverage = percent_coverage(ShootChomatin_bp_covered)\nrootshootintersect_coverage = percent_coverage(RootShootIntersect_bp_covered)",
"_____no_output_____"
],
[
"#add Chr to linestart of promoter bed file\n\nadd_chr_linestart('../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat.bed','../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat_renamed.bed')",
"_____no_output_____"
],
[
"#create separate variable and constitutive and gat workspace\npromoter_file_renamed = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat_renamed.bed'\npromoters = pd.read_table(promoter_file_renamed, sep='\\t', header=None)\n#make a new gat workspace file with all promoters (first 3 columns)\nbed = BedTool.from_dataframe(promoters[[0,1,2]]).saveas('../../data/promoter_analysis/chromatin/non-overlapping_includingbidirectional_variable_constitutive_workspace.bed')\n#select only variable promoters\nvariable_promoters = promoters[promoters[3] == 'highVar']\nsorted_variable = variable_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_variable).saveas('../../data/promoter_analysis/chromatin/non-overlapping_includingbidirectional_variable_promoters.bed')\n#make a constitutive only file\nconstitutive_promoters = promoters[promoters[3] == 'housekeeping']\nsorted_constitutive = constitutive_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_constitutive).saveas('../../data/promoter_analysis/chromatin/non-overlapping_includingbidirectional_constitutive_promoters.bed')",
"_____no_output_____"
],
[
"#show distribution of the distance from the closest end of the open chromatin peak to the ATG (if overlapping already then distance is 0)\nroot_peaks_bed = '../../data/ATAC-seq/potter2018/Roots_NaOH_peaks_all_renamed.bed'\nshoot_peaks_bed = '../../data/ATAC-seq/potter2018/Shoots_NaOH_peaks_all_renamed.bed'\nrootshootintersect_peaks_bed = '../../data/ATAC-seq/potter2018/intersectRootsShoots_PeaksInBoth_renamed.bed'\npromoters_bed = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_renamed.bed'\npromoter_openchrom_intersect = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_chromintersect.bed'",
"_____no_output_____"
],
[
"add_chr_linestart('../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive.bed','../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_renamed.bed')",
"_____no_output_____"
],
[
"def distr_distance_ATG(peaks_bed, promoter_bed, output_file):\n \"\"\"function to show the distribution of the distance rom the closest end\n of the open chromatin peak to the ATG (if overlapping already then distance is 0)\"\"\"\n# peaks = pd.read_table(peaks_bed, sep='\\t', header=None)\n# cols = ['chr','start', 'stop']\n# peaks.columns = cols\n# promoters = pd.read_table(promoter_bed, sep='\\t', header=None)\n# cols_proms = ['chr', 'start', 'stop', 'gene_type','promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\n# promoters.columns = cols_proms\n proms = BedTool(promoter_bed) #read in files using BedTools\n peaks = BedTool(peaks_bed)\n #report chromosome position of overlapping feature, along with the promoter which overlaps it (only reports the overlapping nucleotides, not the whole promoter length. Can use u=True to get whole promoter length)\n #f, the minimum overlap as fraction of A. F, nucleotide fraction of B (genes) that need to be overlapping with A (promoters)\n #wa, Write the original entry in A for each overlap.\n #wo, Write the original A and B entries plus the number of base pairs of overlap between the two features. Only A features with overlap are reported. \n #u, write original A entry only once even if more than one overlap\n intersect = proms.intersect(peaks, wo=True) #could add u=True which indicates we want to see the promoters that overlap features in the genome\n #Write to output_file\n with open(output_file, 'w') as output:\n #Each line in the file contains bed entry a and bed entry b that it overlaps plus the number of bp in the overlap so 19 columns\n output.write(str(intersect))\n #read in intersect bed file\n overlapping_proms = pd.read_table(output_file, sep='\\t', header=None)\n cols = ['chrA', 'startA', 'stopA', 'promoter_AGI','dot1','strand','source','type','dot2','attributes','chrB', 'startB','stopB','bp_overlap']\n overlapping_proms.columns = cols\n #add empty openchrom_distance_from_ATG column\n overlapping_proms['openchrom_distance_from_ATG'] = int()\n for i, v in overlapping_proms.iterrows():\n #if positive strand feature A\n if overlapping_proms.loc[i,'strand'] == '+':\n #if end of open chromatin is downstream or equal to ATG, distance is 0\n if overlapping_proms.loc[i,'stopA'] <= overlapping_proms.loc[i, 'stopB']:\n overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = 0\n #else if upstream and chromatin stop is after promoter start, add distance from chromatin stop to ATG\n elif overlapping_proms.loc[i,'startA'] <= overlapping_proms.loc[i, 'stopB']:\n overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = overlapping_proms.loc[i,'stopA'] - overlapping_proms.loc[i, 'stopB'] \n \n elif overlapping_proms.loc[i,'strand'] == '-': \n #if end of open chromatin is downstream or equal to ATG, distance is 0\n if overlapping_proms.loc[i,'startA'] >= overlapping_proms.loc[i, 'startB']:\n overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = 0\n #else if upstream and chromatin stop is after promoter start, add distance from chromatin stop to ATG \n elif overlapping_proms.loc[i,'stopA'] >= overlapping_proms.loc[i, 'startB']:\n overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = overlapping_proms.loc[i, 'startB'] - overlapping_proms.loc[i,'startB']\n \n\n \n return overlapping_proms",
"_____no_output_____"
],
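[
"# Added worked example of the branch logic above, on toy coordinates.\n# '+' strand promoter 100-1100 (ATG at 1100), peak 100-900:\n# stopA > stopB and startA <= stopB, so distance = stopA - stopB.\nstartA, stopA, startB, stopB = 100, 1100, 100, 900\nprint('+ strand distance:', stopA - stopB) # expected 200\n# '-' strand promoter 100-1100 (ATG at 100), peak 300-1100:\n# startA < startB and stopA >= startB, so distance = startB - startA.\nstartA, stopA, startB, stopB = 100, 1100, 300, 1100\nprint('- strand distance:', startB - startA) # expected 200",
"_____no_output_____"
],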
[
"#show length of open chromatin peaks\nrootshootintersect = distr_distance_ATG(rootshootintersect_peaks_bed)\nrootshootintersect['length'] = (rootshootintersect.start - rootshootintersect.stop).abs()\nrootshootintersect.sort_values('length',ascending=True)\n",
"_____no_output_____"
],
[
"rootshootintersect = distr_distance_ATG(rootshootintersect_peaks_bed,promoters_bed,promoter_openchrom_intersect)",
"_____no_output_____"
],
[
"rootshootintersect\nrootshootintersect.sort_values('openchrom_distance_from_ATG',ascending=True)",
"_____no_output_____"
],
[
"#plot of distances of chomatin to ATG\ndist_plot = rootshootintersect['openchrom_distance_from_ATG']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()",
"_____no_output_____"
],
[
"#now split constitutive and variable\nmerged_distances = pd.merge(merged, rootshootintersect, on='promoter_AGI')",
"_____no_output_____"
],
[
"merged_distances.gene_type",
"_____no_output_____"
],
[
"#VARIABLE\n#plot of distances of chomatin to ATG \ndist_plot = merged_distances[merged_distances.gene_type=='highVar']['openchrom_distance_from_ATG']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()",
"_____no_output_____"
],
[
"merged_distances[merged_distances.gene_type=='housekeeping']['openchrom_distance_from_ATG']",
"_____no_output_____"
],
[
"#CONSTITUTIVE\n#plot of distances of chomatin to ATG \ndist_plot = merged_distances[merged_distances.gene_type=='housekeeping']['openchrom_distance_from_ATG']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()",
"/home/witham/opt/anaconda3/envs/PromoterArchitecturePipeline/lib/python3.7/site-packages/seaborn/distributions.py:369: UserWarning: Default bandwidth for data is 0; skipping density estimation.\n warnings.warn(msg, UserWarning)\n"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d063a7d824a918b20fdae24c6f4a772f811a45cc | 218,539 | ipynb | Jupyter Notebook | main.ipynb | spectraldani/DeepMahalanobisGP | bf2d788ac8b56d25f544b6cb9c0325820f4b7e64 | [
"Apache-2.0"
]
| null | null | null | main.ipynb | spectraldani/DeepMahalanobisGP | bf2d788ac8b56d25f544b6cb9c0325820f4b7e64 | [
"Apache-2.0"
]
| null | null | null | main.ipynb | spectraldani/DeepMahalanobisGP | bf2d788ac8b56d25f544b6cb9c0325820f4b7e64 | [
"Apache-2.0"
]
| null | null | null | 402.46593 | 98,764 | 0.933339 | [
[
[
"dataset = 'load' # 'load' or 'generate'\nretrain_models = False # False or True or 'save'",
"_____no_output_____"
],
[
"import numpy as np\nimport pandas as pd\nimport tensorflow as tf\ntf.logging.set_verbosity(tf.logging.FATAL)\n\nimport gpflow\nimport library.models.deep_vmgp as deep_vmgp\nimport library.models.vmgp as vmgp\nfrom doubly_stochastic_dgp.dgp import DGP\n\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\nimport cplot\n\nimport sklearn.model_selection\nimport pickle\nfrom pathlib import Path\nfrom types import SimpleNamespace\nfrom library.helper import TrainTestSplit, initial_inducing_points\nfrom library import metrics\n\n%matplotlib inline",
"_____no_output_____"
],
[
"random_seed = 19960111\ndef reset_seed():\n np.random.seed(random_seed)\n tf.random.set_random_seed(random_seed)",
"_____no_output_____"
],
[
"if dataset == 'generate':\n s = 0.4\n n = 500//2\n reset_seed()\n rng = np.random.default_rng(random_seed)\n m1, m2 = np.array([[-1,1],[2,1]])\n X1 = rng.multivariate_normal(m1,s*np.eye(2), size=n)\n X2 = rng.multivariate_normal(m2,s*np.eye(2), size=n)\n y1 = X1[:,0]**2 + X1[:,0]\n y2 = X2[:,1]**2 + X2[:,1]\n\n X = np.concatenate([X1,X2],axis=0)\n y = np.concatenate([y1,y2],axis=0)[:,None]\n\n X_all, y_all = X,y\n n = X_all.shape[0]\n kfold = sklearn.model_selection.KFold(2,shuffle=True,random_state=random_seed)\n folds = [\n [TrainTestSplit(X_all[train],X_all[test]), TrainTestSplit(y_all[train],y_all[test])]\n for train, test in kfold.split(X_all, y_all)\n ]\n X,y = folds[0]\nelif dataset == 'load':\n with open('./dataset.pkl','rb') as f:\n X, y = pickle.load(f)\n X_all, y_all = np.concatenate(X,axis=0), np.concatenate(y,axis=0)",
"_____no_output_____"
],
[
"scalers = SimpleNamespace(x=sklearn.preprocessing.StandardScaler(),y=sklearn.preprocessing.StandardScaler())\nscalers.x.fit(X.train)\nX = X.apply(lambda x: scalers.x.transform(x))\nscalers.y.fit(y.train)\ny = y.apply(lambda y: scalers.y.transform(y))",
"_____no_output_____"
],
[
"models = pd.Series(index=pd.Index([],dtype='object'), dtype=object)\nparameters = pd.Series({p.stem:p for p in Path('./optimized_parameters/').glob('*.pkl')}, dtype=object).map(read_parameters)\n\ny_pred = pd.DataFrame(dtype=float, index=range(y.test.size), columns=pd.MultiIndex(levels=[[],['mean','var']],codes=[[],[]],names=['model','']))\nresults = pd.DataFrame(columns=['RMSE','NLPD','MRAE'],dtype=float)",
"_____no_output_____"
],
[
"def read_parameters(p):\n try:\n with p.open('rb') as f:\n return pickle.load(f)\n except:\n return None\n\ndef train_model(model_label):\n m = models[model_label]\n if retrain_models == True or retrain_models == 'save' or model_label not in parameters.index:\n print('Training',model_label)\n variance_parameter = m.likelihood.variance if not isinstance(m, DGP) else m.likelihood.likelihood.variance\n variance_parameter.assign(0.01)\n # First round\n variance_parameter.trainable = False\n opt = gpflow.train.AdamOptimizer(0.01)\n opt.minimize(m, maxiter=2000)\n\n # Second round\n variance_parameter.trainable = True\n opt = gpflow.train.AdamOptimizer(0.01)\n opt.minimize(m, maxiter=5000)\n if retrain_models == 'save' or model_label not in parameters.index:\n with open(f'./optimized_parameters/{model_label}.pkl','wb') as f:\n pickle.dump(m.read_trainables(), f)\n else:\n m.assign(parameters[model_label])",
"_____no_output_____"
]
],
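[
[
"# Hedged sketch (added): train_model caches one pickle per model label holding\n# m.read_trainables(); restoring is just m.assign(...). A manual equivalent:\n#with open('./optimized_parameters/sgpr.pkl', 'wb') as f:\n#    pickle.dump(models['sgpr'].read_trainables(), f)\n#with open('./optimized_parameters/sgpr.pkl', 'rb') as f:\n#    models['sgpr'].assign(pickle.load(f))",
"_____no_output_____"
]
],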
[
[
"# Create, train, and predict with models",
"_____no_output_____"
]
],
[
[
"n,D = X.train.shape\nm_v = 25\nm_u, Q, = 50, D\nZ_v = (m_v,D)\nZ_u = (m_u,Q)\nsample_size = 200",
"_____no_output_____"
]
],
[
[
"### SGPR",
"_____no_output_____"
]
],
[
[
"models['sgpr'] = gpflow.models.SGPR(X.train, y.train, gpflow.kernels.RBF(D, ARD=True), initial_inducing_points(X.train, m_u))\ntrain_model('sgpr')\ny_pred[('sgpr','mean')], y_pred[('sgpr','var')] = models['sgpr'].predict_y(X.test)",
"_____no_output_____"
]
],
[
[
"### Deep Mahalanobis GP",
"_____no_output_____"
]
],
[
[
"reset_seed()\nwith gpflow.defer_build():\n models['dvmgp'] = deep_vmgp.DeepVMGP(\n X.train, y.train, Z_u, Z_v,\n [gpflow.kernels.RBF(D,ARD=True) for i in range(Q)],\n full_qcov=False, diag_qmu=False\n )\nmodels['dvmgp'].compile()\ntrain_model('dvmgp')\ny_pred[('dvmgp','mean')], y_pred[('dvmgp','var')] = models['dvmgp'].predict_y(X.test)",
"_____no_output_____"
]
],
[
[
"### Show scores",
"_____no_output_____"
]
],
[
[
"for m in models.index:\n scaled_y_test = scalers.y.inverse_transform(y.test)\n scaled_y_pred = [\n scalers.y.inverse_transform(y_pred[m].values[:,[0]]),\n scalers.y.var_ * y_pred[m].values[:,[1]]\n ]\n results.at[m,'MRAE'] = metrics.mean_relative_absolute_error(scaled_y_test, scaled_y_pred[0]).squeeze()\n results.at[m,'RMSE'] = metrics.root_mean_squared_error(scaled_y_test, scaled_y_pred[0]).squeeze()\n results.at[m,'NLPD'] = metrics.negative_log_predictive_density(scaled_y_test, *scaled_y_pred).squeeze()\n\nresults",
"_____no_output_____"
]
],
[
[
"# Plot results",
"_____no_output_____"
]
],
[
[
"class MidpointNormalize(mpl.colors.Normalize):\n def __init__(self, vmin=None, vmax=None, midpoint=None, clip=False):\n self.midpoint = midpoint\n mpl.colors.Normalize.__init__(self, vmin, vmax, clip)\n\n def __call__(self, value, clip=None):\n x, y = [self.vmin, self.midpoint, self.vmax], [0, 0.5, 1]\n return np.ma.masked_array(np.interp(value, x, y), np.isnan(value))",
"_____no_output_____"
],
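[
"# Added usage sketch: MidpointNormalize pins a chosen value (e.g. 0) to the middle of\n# the colormap, which is what the commented-out norm further down would use:\n#norm = MidpointNormalize(vmin=-1.0, vmax=3.0, midpoint=0.0)\n#plt.imshow(np.linspace(-1, 3, 100).reshape(10, 10), norm=norm, cmap='RdBu_r')",
"_____no_output_____"
],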
[
"f = plt.figure()\nax = plt.gca()\nax.scatter(scalers.x.transform(X_all)[:,0],scalers.x.transform(X_all)[:,1],edgecolors='white',facecolors='none')\nlims = (ax.get_xlim(), ax.get_ylim())\nplt.close(f)",
"_____no_output_____"
],
[
"n = 50\ngrid_points = np.dstack(np.meshgrid(np.linspace(*lims[0],n), np.linspace(*lims[1],n))).reshape(-1,2)\ngrid_y = np.empty((len(models.index),grid_points.shape[0]))\nfor i,m in enumerate(models.index):\n reset_seed()\n grid_pred = models[m].predict_y(grid_points, sample_size)[0]\n if len(grid_pred.shape) == 3:\n grid_y[i] = grid_pred.mean(axis=0)[:,0]\n else:\n grid_y[i] = grid_pred[:,0]\n\ngrid_points = grid_points.reshape(n,n,2)\ngrid_y = grid_y.reshape(-1,n,n)",
"_____no_output_____"
],
[
"f = plt.figure(constrained_layout=True,figsize=(8,7))\ngs = f.add_gridspec(ncols=4, nrows=2)\naxs = np.empty(3,dtype=object)\naxs[0] = f.add_subplot(gs[0,0:2])\naxs[1] = f.add_subplot(gs[0,2:4],sharey=axs[0])\naxs[2] = f.add_subplot(gs[1,1:3])\n\naxs[1].yaxis.set_visible(False)\naxs[2].yaxis.set_visible(False)\n\naxs[0].set_title('SGPR')\naxs[1].set_title('DVMGP')\naxs[2].set_title('Full Dataset')\n\nims = np.empty((2,4),dtype=object)\n\nfor i,m in enumerate(['sgpr', 'dvmgp']):\n ax = axs[i]\n ims[0,i] = ax.contourf(grid_points[:,:,0],grid_points[:,:,1],grid_y[i],30)\n\n # Plot features\n Z = None\n if m == 'dgp':\n Z = models[m].layers[0].feature.Z.value\n elif m in ['sgpr','vmgp']:\n Z = models[m].feature.Z.value\n elif m == 'dvmgp':\n Z = models[m].Z_v.Z.value\n\n if Z is not None:\n ax.scatter(Z[:,0],Z[:,1],marker='^',edgecolors='white',facecolors='none')\n# ims[1,i] = ax.scatter(X.test[:,0],X.test[:,1],edgecolors='white',c=y.test)\n \nims[0,3] = axs[2].scatter(X.test[:,0],X.test[:,1],c=y.test)\nims[1,3] = axs[2].scatter(X.train[:,0],X.train[:,1],c=y.train)\n\nfor ax in axs:\n ax.set_xlim(lims[0]);\n ax.set_ylim(lims[1]);\n \nclim = np.array([i.get_clim() for i in ims.flat if i is not None])\nclim = (clim.min(), clim.max())\nnorm = mpl.colors.Normalize(vmin=clim[0], vmax=clim[1])\n# norm = MidpointNormalize(vmin=clim[0], vmax=clim[1], midpoint=0)\nfor im in ims.flat:\n if im is not None:\n im.set_norm(norm)\nf.colorbar(ims[0,0], ax=axs, orientation='vertical', fraction=1, aspect=50)\n\nfor im in ims[0,:3].flat:\n if im is not None:\n for c in im.collections:\n c.set_edgecolor(\"face\")\n\nf.savefig('./figs/outputs.pdf')",
"_____no_output_____"
],
[
"n = 50\ngrid_points = np.dstack(np.meshgrid(np.linspace(*lims[0],n), np.linspace(*lims[1],n))).reshape(-1,2)\ngrid_y = np.empty((grid_points.shape[0],2))\n\ngrid_y = models['dvmgp'].enquire_session().run(tf.matmul(\n tf.transpose(models['dvmgp'].compute_qW(grid_points)[0][...,0],[2,0,1]),grid_points[:,:,None]\n)[:,:,0])\n\ngrid_points = grid_points.reshape(n,n,2)\ngrid_y = grid_y.reshape(n,n,2)\n\nf = plt.figure(constrained_layout=True,figsize=(8,4))\ngs = f.add_gridspec(ncols=2, nrows=1)\naxs = np.empty(4,dtype=object)\naxs[0] = f.add_subplot(gs[0,0])\naxs[1] = f.add_subplot(gs[0,1])\n\nextent = (*lims[0], *lims[1])\ncolorspace = 'cielab'\nalpha = 0.7\n\naxs[0].imshow(\n cplot.get_srgb1(grid_points[:,:,0] + grid_points[:,:,1]*1j, colorspace=colorspace, alpha=alpha),\n origin='lower',\n extent=extent,\n aspect='auto',\n interpolation='gaussian'\n)\naxs[0].set_title('Identity map')\n\naxs[1].imshow(\n cplot.get_srgb1(grid_y[:,:,0] + grid_y[:,:,1]*1j, colorspace=colorspace, alpha=alpha),\n origin='lower',\n extent=extent,\n aspect='auto',\n interpolation='gaussian'\n)\naxs[1].set_title('DVMGP: $Wx^\\intercal$');\nf.savefig('./figs/layers.pdf')",
"_____no_output_____"
],
[
"dvmgp_var = np.array([k.variance.value for k in models['dvmgp'].w_kerns])\n\nf,ax = plt.subplots(1,1,figsize=(3,3))\nax.bar(np.arange(2), dvmgp_var/dvmgp_var.max(), color='C2')\nax.set_ylabel('1st layer variance\\nrelative to largest value')\n\nax.set_xlabel('Latent dimension')\nax.set_xticks([])\n\nax.set_title('DVMGP')\nf.tight_layout()\nf.savefig('./figs/dims.pdf')",
"_____no_output_____"
]
]
]
| [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d063b69f0b08b02a7b8969cb3540bb5645d3d954 | 7,384 | ipynb | Jupyter Notebook | docs/level1/sasum.ipynb | timleslie/pyblas | 9109f2cc24e674cf59a3b39f95c2d7b8116ae884 | [
"BSD-3-Clause"
]
| null | null | null | docs/level1/sasum.ipynb | timleslie/pyblas | 9109f2cc24e674cf59a3b39f95c2d7b8116ae884 | [
"BSD-3-Clause"
]
| 1 | 2020-10-10T23:23:06.000Z | 2020-10-10T23:23:06.000Z | docs/level1/sasum.ipynb | timleslie/pyblas | 9109f2cc24e674cf59a3b39f95c2d7b8116ae884 | [
"BSD-3-Clause"
]
| null | null | null | 31.555556 | 399 | 0.529117 | [
[
[
"# `sasum(N, SX, INCX)`\n\nComputes the sum of absolute values of elements of the vector $x$.\n\nOperates on single-precision real valued arrays.\n\nInput vector $\\mathbf{x}$ is represented as a [strided array](../strided_arrays.ipynb) `SX`, spaced by `INCX`.\nVector $\\mathbf{x}$ is of size `N`.",
"_____no_output_____"
],
[
"### Example usage",
"_____no_output_____"
]
],
[
[
"import os\nimport sys\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.abspath(''), \"..\", \"..\")))",
"_____no_output_____"
],
[
"import numpy as np\nfrom pyblas.level1 import sasum",
"_____no_output_____"
],
[
"x = np.array([1, 2, 3], dtype=np.single)\nN = len(x)\nincx = 1",
"_____no_output_____"
],
[
"sasum(N, x, incx)",
"_____no_output_____"
]
],
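[
[
"# Added example: with a stride, sasum reads every INCX-th element; here it sums\n# |-1| + |-3| + |-5| from a length-5 array, illustrating the absolute values.\nx2 = np.array([-1, 2, -3, 4, -5], dtype=np.single)\nsasum(3, x2, 2) # expected: 9.0",
"_____no_output_____"
]
],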
[
[
"### Docstring",
"_____no_output_____"
]
],
[
[
"help(sasum)",
"Help on function sasum in module pyblas.level1.sasum:\n\nsasum(N, SX, INCX)\n Computes the sum of absolute values of elements of the vector x\n \n Parameters\n ----------\n N : int\n Number of elements in input vector\n SX : numpy.ndarray\n A single precision real array, dimension (1 + (`N` - 1)*abs(`INCX`))\n INCX : int\n Storage spacing between elements of `SX`\n \n Returns\n -------\n numpy.single\n \n See Also\n --------\n dasum : Double-precision sum of absolute values\n \n Notes\n -----\n Online PyBLAS documentation: https://nbviewer.jupyter.org/github/timleslie/pyblas/blob/main/docs/sasum.ipynb\n Reference BLAS documentation: https://github.com/Reference-LAPACK/lapack/blob/v3.9.0/BLAS/SRC/sasum.f\n \n Examples\n --------\n >>> x = np.array([1, 2, 3], dtype=np.single)\n >>> N = len(x)\n >>> incx = 1\n >>> print(sasum(N, x, incx)\n 6.\n\n"
]
],
[
[
"### Source code",
"_____no_output_____"
]
],
[
[
"sasum??",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d063bda09143f23b1dec8e6d92850b0fa5e5ff8b | 431,235 | ipynb | Jupyter Notebook | 2019/PAN_AA_2018-POS-tag.ipynb | jeleandro/PANAA2018 | aa681fcb4e2f90841cf30f53265fecbb111123e1 | [
"Apache-2.0"
]
| null | null | null | 2019/PAN_AA_2018-POS-tag.ipynb | jeleandro/PANAA2018 | aa681fcb4e2f90841cf30f53265fecbb111123e1 | [
"Apache-2.0"
]
| null | null | null | 2019/PAN_AA_2018-POS-tag.ipynb | jeleandro/PANAA2018 | aa681fcb4e2f90841cf30f53265fecbb111123e1 | [
"Apache-2.0"
]
| null | null | null | 142.321782 | 120,504 | 0.77929 | [
[
[
"# Notebook para o PAN - Atribuiรงรฃo Autoral - 2018",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\n#python basic libs\nimport os;\nfrom os.path import join as pathjoin;\n\nimport warnings\nwarnings.simplefilter(action='ignore', category=FutureWarning)\nfrom sklearn.exceptions import UndefinedMetricWarning\nwarnings.simplefilter(action='ignore', category=UndefinedMetricWarning)\n\nimport re;\nimport json;\nimport codecs;\nfrom collections import defaultdict;\n\nfrom pprint import pprint\nfrom time import time\nimport logging\n\n\n#data analysis libs\nimport numpy as np;\nimport pandas as pd;\nfrom pandas.plotting import scatter_matrix;\nimport matplotlib.pyplot as plt;\nimport random;\n\n#machine learning libs\n#feature extraction\nfrom sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer, TfidfTransformer\n\n#preprocessing and transformation\nfrom sklearn import preprocessing\nfrom sklearn.preprocessing import normalize, MaxAbsScaler, RobustScaler;\nfrom sklearn.decomposition import PCA;\n\nfrom sklearn.base import BaseEstimator, ClassifierMixin\n\n#classifiers\nfrom sklearn import linear_model;\nfrom sklearn.linear_model import LogisticRegression\n\nfrom sklearn.svm import LinearSVC, SVC\nfrom sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier\nfrom sklearn.neural_network import MLPClassifier\n\n \n#\nfrom sklearn import feature_selection;\nfrom sklearn import ensemble;\n\nfrom sklearn.model_selection import train_test_split;\nfrom sklearn.model_selection import GridSearchCV\nfrom sklearn.pipeline import Pipeline\n\n#model valuation\nfrom sklearn.metrics import roc_auc_score, f1_score, precision_score, recall_score, accuracy_score;",
"_____no_output_____"
],
[
"import seaborn as sns;\nsns.set(color_codes=True);",
"_____no_output_____"
],
[
"import spacy\nget_ipython().config.get('IPKernelApp', {})['parent_appname'] = \"\" #spacy causes a bug on pandas and this code fix it",
"_____no_output_____"
],
[
"import platform;\nimport sklearn;\nimport scipy;\n\nprint(\"|%-15s|%-40s|\"%(\"PACK\",\"VERSION\"))\nprint(\"|%-15s|%-40s|\"%('-'*15,'-'*40))\nprint('\\n'.join(\n \"|%-15s|%-40s|\" % (pack, version)\n for pack, version in\n zip(['SO','NumPy','SciPy','Scikit-Learn','seaborn','spacy'],\n [platform.platform(), np.__version__, scipy.__version__, sklearn.__version__, sns.__version__, spacy.__version__])\n\n))",
"|PACK |VERSION |\n|---------------|----------------------------------------|\n|SO |Darwin-18.2.0-x86_64-i386-64bit |\n|NumPy |1.15.4 |\n|SciPy |1.1.0 |\n|Scikit-Learn |0.20.1 |\n|seaborn |0.9.0 |\n|spacy |2.0.16 |\n"
],
[
"np.set_printoptions(precision=4)\npd.options.display.float_format = '{:,.4f}'.format",
"_____no_output_____"
],
[
"#externalizing codes that is used in many notebooks and it is not experiment specific\nimport pan\n#convert a sparse matrix into a dense for being used on PCA\nfrom skleanExtensions import DenseTransformer;\n\n#convert an array of text into an array of tokenized texts each token must contain text, tag_, pos_, dep_\nfrom skleanExtensions import POSTagTransformer",
"_____no_output_____"
]
],
[
[
"### paths configuration",
"_____no_output_____"
]
],
[
[
"baseDir = '/Users/joseeleandrocustodio/Dropbox/mestrado/02 - Pesquisa/code';\n\ninputDir= pathjoin(baseDir,'pan18aa');\noutputDir= pathjoin(baseDir,'out',\"oficial\");\nif not os.path.exists(outputDir):\n os.mkdir(outputDir);",
"_____no_output_____"
]
],
[
[
"## loading the dataset",
"_____no_output_____"
]
],
[
[
"problems = pan.readCollectionsOfProblems(inputDir);",
"_____no_output_____"
],
[
"print(problems[0]['problem'])\nprint(problems[0].keys())",
"problem00001\ndict_keys(['problem', 'language', 'encoding', 'candidates_folder_count', 'candidates', 'unknown'])\n"
],
[
"pd.DataFrame(problems)",
"_____no_output_____"
],
[
"def cachingPOSTAG(problem, taggingVersion='TAG'):\n import json;\n print (\"Tagging: %s, language: %s, \" %(problem['problem'],problem['language']), end=' ');\n \n if not os.path.exists('POSTAG_cache'):\n os.makedirs('POSTAG_cache');\n \n _id = problem['problem']+problem['language'];\n filename = os.path.join('POSTAG_cache',taggingVersion+'_'+_id+'.json')\n if not os.path.exists(filename):\n lang = problem['language'];\n if lang == 'sp':\n lang = 'es';\n elif lang =='pl':\n print(lang, ' not supported');\n return ;\n\n train_docs, train_labels, _ = zip(*problem['candidates'])\n problem['training_docs_size'] = len(train_docs);\n test_docs, _, test_filename = zip(*problem['unknown'])\n\n t0 = time()\n tagger = POSTagTransformer(language=lang);\n train_docs = tagger.fit_transform(train_docs);\n test_docs = tagger.fit_transform(test_docs);\n \n print(\"Annotation time %0.3fs\" % (time() - t0))\n \n with open(filename,'w') as f:\n json.dump({\n 'train':train_docs,\n 'train_labels':train_labels,\n 'test':test_docs,\n 'test_filename':test_filename\n },f);\n else:\n with open(filename,'r') as f:\n data = json.load(f);\n\n train_docs = data['train'];\n train_labels = data['train_labels'];\n test_docs = data['test'];\n test_filename = data['test_filename'];\n print('tagged')\n return train_docs, train_labels, test_docs, test_filename;\n\nfor problem in problems:\n cachingPOSTAG(problem)",
"Tagging: problem00001, language: en, tagged\nTagging: problem00002, language: en, tagged\nTagging: problem00003, language: fr, tagged\nTagging: problem00004, language: fr, tagged\nTagging: problem00005, language: it, tagged\nTagging: problem00006, language: it, tagged\nTagging: problem00007, language: pl, pl not supported\nTagging: problem00008, language: pl, pl not supported\nTagging: problem00009, language: sp, tagged\nTagging: problem00010, language: sp, tagged\n"
],
[
"train_docs, train_labels, test_docs, test_filename = cachingPOSTAG(problem)",
"Tagging: problem00010, language: sp, tagged\n"
],
[
"class FilterTagTransformer(BaseEstimator):\n def __init__(self,token='POS', parts=None):\n self.token = token;\n self.parts = parts;\n\n def transform(self, X, y=None):\n \"\"\" Return An array of tokens \n Parameters\n ----------\n X : {array-like}, shape = [n_samples, n_tokens]\n Array documents, where each document consists of a list of node\n and each node consist of a token and its correspondent tag\n \n [\n [('a','TAG1'),('b','TAG2')],\n [('a','TAG1')]\n ]\n y : array-like, shape = [n_samples] (default: None)\n Returns\n ---------\n X_dense : dense version of the input X array.\n \"\"\"\n if self.token == 'TAG':\n X = [' '.join([d[1].split('__')[0] for d in doc]) for doc in X]\n elif self.token == 'POS':\n if self.parts is None:\n X = [' '.join([d[2] for d in doc]) for doc in X];\n else:\n X = [' '.join([d[0] for d in doc if d[2] in self.parts]) for doc in X]\n elif self.token == 'DEP':\n X = [' '.join([d[3] for d in doc]) for doc in X]\n elif self.token == 'word_POS':\n if self.parts is None:\n X = [' '.join([d[0]+'/'+d[2] for d in doc]) for doc in X]\n elif self.token == 'filter':\n if self.parts is None:\n X = [' '.join([d[2] for d in doc]) for doc in X];\n else:\n X = [' '.join([d[0] for d in doc if d[2] in self.parts]) for doc in X]\n else:\n X = [' '.join([d[0] for d in doc]) for doc in X]\n \n return np.array(X); \n\n def fit(self, X, y=None):\n self.is_fitted = True\n return self\n\n def fit_transform(self, X, y=None):\n return self.transform(X=X, y=y)",
"_____no_output_____"
]
],
[
[
"### analisando os demais parametros",
"_____no_output_____"
]
],
[
[
"def spaceTokenizer(x):\n return x.split(\" \");",
"_____no_output_____"
],
[
"def runML(problem):\n print (\"\\nProblem: %s, language: %s, \" %(problem['problem'],problem['language']), end=' ');\n \n lang = problem['language'];\n if lang == 'sp':\n lang = 'es';\n elif lang =='pl':\n print(lang, ' not supported');\n return None,None,None,None;\n \n \n train_docs, train_labels, test_docs, test_filename = cachingPOSTAG(problem)\n problem['training_docs_size'] = len(train_docs);\n\n t0 = time()\n \n pipeline = Pipeline([\n ('filter',FilterTagTransformer(token='TAG')),\n ('vect', CountVectorizer(\n tokenizer=spaceTokenizer,\n min_df=0.01,\n lowercase=False\n )),\n ('tfidf', TfidfTransformer()),\n ('scaler', MaxAbsScaler()),\n ('dense', DenseTransformer()),\n ('transf', PCA(0.999)),\n ('clf', LogisticRegression(random_state=0,multi_class='multinomial', solver='newton-cg')),\n ])\n \n \n # uncommenting more parameters will give better exploring power but will\n # increase processing time in a combinatorial way\n parameters = {\n 'vect__ngram_range' :((1,1),(1,2),(1,3),(1,5)),\n 'tfidf__use_idf' :(True, False),\n 'tfidf__sublinear_tf':(True, False),\n 'tfidf__norm':('l1','l2'),\n 'clf__C':(0.1,1,10),\n }\n \n grid_search = GridSearchCV(pipeline,\n parameters,\n cv=4,\n iid=False,\n n_jobs=-1,\n verbose=False,\n scoring='f1_macro')\n \n t0 = time()\n grid_search.fit(train_docs, train_labels)\n print(\"Gridsearh %0.3fs\" % (time() - t0), end=' ')\n\n print(\"Best score: %0.3f\" % grid_search.best_score_)\n print(\"Best parameters set:\")\n best_parameters = grid_search.best_estimator_.get_params()\n for param_name in sorted(parameters.keys()):\n print(\"\\t%s: %r\" % (param_name, best_parameters[param_name]))\n \n train_pred=grid_search.predict(train_docs);\n test_pred=grid_search.predict(test_docs);\n \n \n # Writing output file\n out_data=[]\n for i,v in enumerate(test_pred):\n out_data.append({'unknown-text': test_filename[i],'predicted-author': v})\n answerFile = pathjoin(outputDir,'answers-'+problem['problem']+'.json');\n with open(answerFile, 'w') as f:\n json.dump(out_data, f, indent=4)\n \n \n #calculating the performance using PAN evaluation code\n f1,precision,recall,accuracy=pan.evaluate(\n pathjoin(inputDir, problem['problem'], 'ground-truth.json'),\n answerFile)\n \n return {\n 'problem-name' : problem['problem'],\n \"language\" : problem['language'],\n 'AuthorCount' : len(set(train_labels)),\n 'macro-f1' : round(f1,3),\n 'macro-precision': round(precision,3),\n 'macro-recall' : round(recall,3),\n 'micro-accuracy' : round(accuracy,3),\n \n }, grid_search.cv_results_,best_parameters, grid_search.best_estimator_;",
"_____no_output_____"
],
[
"result = [];\ncv_result = [];\nbest_parameters = [];\nestimators = [];\nfor problem in problems:\n with warnings.catch_warnings():\n warnings.filterwarnings(\"ignore\");\n r, c, b, e = runML(problem);\n if r is None:\n continue;\n result.append(r);\n cv_result.append(c);\n estimators.append(e);\n b['problem'] = problem['problem'];\n best_parameters.append(b);",
"\nProblem: problem00001, language: en, Tagging: problem00001, language: en, tagged\nGridsearh 1107.958s Best score: 0.661\nBest parameters set:\n\tclf__C: 10\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 2)\n\nProblem: problem00002, language: en, Tagging: problem00002, language: en, tagged\nGridsearh 251.719s Best score: 0.840\nBest parameters set:\n\tclf__C: 0.1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: False\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 3)\n\nProblem: problem00003, language: fr, Tagging: problem00003, language: fr, tagged\nGridsearh 1038.886s Best score: 0.530\nBest parameters set:\n\tclf__C: 1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: False\n\tvect__ngram_range: (1, 3)\n\nProblem: problem00004, language: fr, Tagging: problem00004, language: fr, tagged\nGridsearh 256.516s Best score: 0.663\nBest parameters set:\n\tclf__C: 0.1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 5)\n\nProblem: problem00005, language: it, Tagging: problem00005, language: it, tagged\nGridsearh 1014.834s Best score: 0.622\nBest parameters set:\n\tclf__C: 10\n\ttfidf__norm: 'l1'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 3)\n\nProblem: problem00006, language: it, Tagging: problem00006, language: it, tagged\nGridsearh 264.087s Best score: 0.880\nBest parameters set:\n\tclf__C: 0.1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 2)\n\nProblem: problem00007, language: pl, pl not supported\n\nProblem: problem00008, language: pl, pl not supported\n\nProblem: problem00009, language: sp, Tagging: problem00009, language: sp, tagged\nGridsearh 1135.047s Best score: 0.610\nBest parameters set:\n\tclf__C: 1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 5)\n\nProblem: problem00010, language: sp, Tagging: problem00010, language: sp, tagged\nGridsearh 267.930s Best score: 0.678\nBest parameters set:\n\tclf__C: 1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: False\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 2)\n"
],
[
"df=pd.DataFrame(result)[['problem-name',\n \"language\",\n 'AuthorCount',\n 'macro-f1','macro-precision','macro-recall' ,'micro-accuracy']]",
"_____no_output_____"
],
[
"df",
"_____no_output_____"
],
[
"df[['macro-f1']].mean()",
"_____no_output_____"
],
[
"languages={\n 'en':'inglesa',\n 'sp':'espanhola',\n 'it':'italiana',\n 'pl':'polonesa',\n 'fr':'francesa'\n}",
"_____no_output_____"
],
[
"cv_result2 = [];\ndfCV = pd.DataFrame();\nfor i, c in enumerate(cv_result):\n temp = pd.DataFrame(c);\n temp['language'] = result[i]['AuthorCount']\n temp['problem'] = int(re.sub('\\D','',result[i]['problem-name']));\n temp['language'] = languages[result[i]['language']]\n dfCV = dfCV.append(temp);\n\nfor p in [\n 'mean_test_score','std_test_score','mean_train_score', \n 'split0_test_score',\n 'split1_test_score',\n 'split2_test_score']:\n dfCV[p]=dfCV[p].astype(np.float32);\n\n \ndfCV =dfCV[[\n 'problem',\n 'language',\n 'rank_test_score',\n 'param_vect__ngram_range',\n 'param_tfidf__sublinear_tf',\n 'param_tfidf__norm',\n 'param_clf__C',\n 'mean_test_score', \n 'std_test_score',\n\n 'split0_test_score',\n 'split1_test_score',\n 'split2_test_score',\n\n 'mean_score_time',\n 'mean_fit_time',\n 'std_fit_time',\n 'std_score_time',\n 'std_train_score',\n]];\n\ndfCV.rename(columns={\n 'param_vect__ngram_range':'ngram_range',\n 'param_tfidf__sublinear_tf':'sublinear_tf',\n 'param_tfidf__smooth_idf':'smooth_idf',\n 'param_tfidf__norm':'norm',\n 'param_clf__C':'regularization',\n},inplace=True);\n\n#print('\\',\\n\\''.join(dfCV.columns))\n",
"_____no_output_____"
],
[
"dfCV.head()",
"_____no_output_____"
]
],
[
[
"## Saving the model",
"_____no_output_____"
]
],
[
[
"dfCV.to_csv('PANAA2018_POSTAG.csv', index=False)",
"_____no_output_____"
],
[
"dfCV = pd.read_csv('PANAA2018_POSTAG.csv', na_values='')",
"_____no_output_____"
],
[
"import pickle;\nwith open(\"PANAA2018_POSTAG.pkl\",\"wb\") as f:\n pickle.dump(estimators,f)",
"_____no_output_____"
]
],
[
[
"## understanding the model with reports",
"_____no_output_____"
],
[
"Podemos ver que para um mesmo problema mais de uma configuraรงรฃo รฉ possรญvel",
"_____no_output_____"
]
],
[
[
"print(' | '.join(best_parameters[0]['vect'].get_feature_names()[0:20]))",
" | '' | -LRB- | CC | CD | DT | EX | IN | JJ | MD | NN | NNP | NNPS | NNS | PRP | PRP$ | RB | UH | VB | VBD\n"
],
[
"(dfCV[dfCV.rank_test_score == 1]).drop_duplicates()[\n ['problem',\n 'language',\n 'mean_test_score',\n 'std_test_score',\n 'ngram_range',\n 'sublinear_tf',\n 'norm']\n].sort_values(by=[\n 'problem',\n 'mean_test_score',\n 'std_test_score',\n 'ngram_range',\n 'sublinear_tf'\n], ascending=[True, False,True,False,False])",
"_____no_output_____"
],
[
"dfCV.pivot_table(\n index=['problem','language','norm','sublinear_tf'],\n columns=[ 'ngram_range','regularization'],\n values='mean_test_score'\n )",
"_____no_output_____"
]
],
[
[
"O score retornado vem do conjunto de teste da validaรงรฃo cruzada e nรฃo do conjunto de testes",
"_____no_output_____"
]
],
[
[
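"# A quick sketch contrasting the cross-validation scores with the held-out PAN test scores\n# (assumes `result` and `cv_result` were filled by the training loop above):\nfor r, c in zip(result, cv_result): print(r['problem-name'], 'best CV macro-f1: %.3f' % max(c['mean_test_score']), '| test macro-f1: %.3f' % r['macro-f1'])",
"_____no_output_____"
],
[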
"pd.options.display.precision = 3 \nprint(u\"\\\\begin{table}[h]\\n\\\\centering\\n\\\\caption{Medida F1 para os parรขmetros }\")\n\nprint(re.sub(r'[ ]{2,}',' ',dfCV.pivot_table(\n index=['problem','language','sublinear_tf','norm'],\n columns=['ngram_range'],\n values='mean_test_score'\n ).to_latex()))\nprint (\"\\label{tab:modelocaracter}\")\nprint(r\"\\end{table}\")",
"\\begin{table}[h]\n\\centering\n\\caption{Medida F1 para os parรขmetros }\n\\begin{tabular}{llllrrrr}\n\\toprule\n & & & ngram\\_range & (1, 1) & (1, 2) & (1, 3) & (1, 5) \\\\\nproblem & language & sublinear\\_tf & norm & & & & \\\\\n\\midrule\n1 & inglesa & False & l1 & 0.4150 & 0.5965 & 0.6053 & 0.5652 \\\\\n & & & l2 & 0.4107 & 0.5957 & 0.6074 & 0.5462 \\\\\n & & True & l1 & 0.3265 & 0.6172 & 0.6278 & 0.5325 \\\\\n & & & l2 & 0.3280 & 0.6317 & 0.6174 & 0.5734 \\\\\n2 & inglesa & False & l1 & 0.6302 & 0.7712 & 0.8017 & 0.7436 \\\\\n & & & l2 & 0.6209 & 0.7722 & 0.8200 & 0.7352 \\\\\n & & True & l1 & 0.7302 & 0.7919 & 0.7552 & 0.7519 \\\\\n & & & l2 & 0.7306 & 0.7895 & 0.7678 & 0.7519 \\\\\n3 & francesa & False & l1 & 0.2583 & 0.4162 & 0.4969 & 0.4386 \\\\\n & & & l2 & 0.2444 & 0.4164 & 0.5003 & 0.4524 \\\\\n & & True & l1 & 0.1230 & 0.4329 & 0.4955 & 0.4724 \\\\\n & & & l2 & 0.1288 & 0.4439 & 0.5196 & 0.4928 \\\\\n4 & francesa & False & l1 & 0.4039 & 0.4439 & 0.6035 & 0.5917 \\\\\n & & & l2 & 0.4108 & 0.4278 & 0.5944 & 0.5662 \\\\\n & & True & l1 & 0.2567 & 0.3345 & 0.6181 & 0.6328 \\\\\n & & & l2 & 0.2594 & 0.3315 & 0.6411 & 0.6633 \\\\\n5 & italiana & False & l1 & 0.2972 & 0.4731 & 0.5116 & 0.4924 \\\\\n & & & l2 & 0.2880 & 0.4545 & 0.4813 & 0.4743 \\\\\n & & True & l1 & 0.1986 & 0.5366 & 0.6021 & 0.5081 \\\\\n & & & l2 & 0.1973 & 0.5217 & 0.5617 & 0.5230 \\\\\n6 & italiana & False & l1 & 0.7239 & 0.8300 & 0.8367 & 0.8367 \\\\\n & & & l2 & 0.7528 & 0.8300 & 0.8367 & 0.8233 \\\\\n & & True & l1 & 0.4723 & 0.8533 & 0.8367 & 0.8339 \\\\\n & & & l2 & 0.4858 & 0.8683 & 0.8367 & 0.8100 \\\\\n9 & espanhola & False & l1 & 0.2194 & 0.5035 & 0.5213 & 0.5761 \\\\\n & & & l2 & 0.2126 & 0.5008 & 0.5177 & 0.5638 \\\\\n & & True & l1 & 0.1186 & 0.4609 & 0.5542 & 0.6021 \\\\\n & & & l2 & 0.1213 & 0.4623 & 0.5585 & 0.5997 \\\\\n10 & espanhola & False & l1 & 0.3879 & 0.6108 & 0.5474 & 0.6333 \\\\\n & & & l2 & 0.3901 & 0.6106 & 0.5526 & 0.5783 \\\\\n & & True & l1 & 0.2956 & 0.5603 & 0.5447 & 0.6572 \\\\\n & & & l2 & 0.2665 & 0.5697 & 0.5450 & 0.6289 \\\\\n\\bottomrule\n\\end{tabular}\n\n\\label{tab:modelocaracter}\n\\end{table}\n"
],
[
"d = dfCV.copy()\nd = d.rename(columns={'language':u'Lรญngua', 'sublinear_tf':'TF Sublinear'})\nd = d [ d.norm.isna() == False]\nd['autorNumber'] = d.problem.map(lambda x: 20 if x % 2==0 else 5)\nd.problem = d.apply(lambda x: x[u'Lรญngua'] +\" \"+ str(x[u'problem']), axis=1)\n#d.ngram_range = d.apply(lambda x: str(x[u'ngram_range'][0]) +\" \"+ str(x[u'ngram_range'][1]), axis=1)\n\nd.std_test_score =d.std_test_score / d.std_test_score.quantile(0.95) *500;\nd.std_test_score +=1;\nd.std_test_score = d.std_test_score.astype(np.int64)\ng = sns.FacetGrid(d, col='Lรญngua', hue='TF Sublinear', row=\"regularization\", height=3,palette=\"Set1\")\ng.map(plt.scatter, \"ngram_range\", \"mean_test_score\",s=d.std_test_score.values).add_legend();\n#sns.pairplot(d, hue=\"TF Sublinear\", vars=[\"autorNumber\", \"mean_test_score\"])\n",
"_____no_output_____"
],
[
"g = sns.FacetGrid(d, row='autorNumber', hue='TF Sublinear', col=u\"Lรญngua\", height=3,palette=\"Set1\")\ng.map(plt.scatter, \"ngram_range\", \"mean_test_score\", alpha=0.5, s=d.std_test_score.values).add_legend();",
"_____no_output_____"
],
[
"sns.distplot(dfCV.std_test_score, bins=25);",
"_____no_output_____"
],
[
"import statsmodels.api as sm",
"_____no_output_____"
],
[
"d = dfCV[['mean_test_score','problem', 'language','sublinear_tf','norm','ngram_range']].copy();\nd.sublinear_tf=d.sublinear_tf.apply(lambda x: 1 if x else 0)\nd.norm=d.norm.apply(lambda x: 1 if x=='l1' else 0)\n\nd['autorNumber'] = d.problem.map(lambda x: 20 if x % 2==0 else 5)\nd.norm.fillna(value='None', inplace=True);\n\n_, d['ngram_max'] = zip(*d.ngram_range.str.replace(r'[^\\d,]','').str.split(',').values.tolist())\n#d.ngram_min = d.ngram_min.astype(np.uint8);\nd.ngram_max = d.ngram_max.astype(np.uint8);\nd.drop(columns=['ngram_range','problem'], inplace=True)\n#d['intercept'] = 1;\n\nd=pd.get_dummies(d, columns=['language'])",
"_____no_output_____"
],
[
"d.describe()",
"_____no_output_____"
],
[
"mod = sm.OLS( d.iloc[:,0], d.iloc[:,1:])\nres = mod.fit()\nres.summary()",
"_____no_output_____"
],
[
"sns.distplot(res.predict()-d.iloc[:,0].values, bins=25)",
"_____no_output_____"
],
[
"sns.jointplot(x='F1',y='F1-estimated',data=pd.DataFrame({'F1':d.iloc[:,0].values, 'F1-estimated':res.predict()}));",
"_____no_output_____"
]
],
[
[
"# tests",
"_____no_output_____"
]
],
[
[
"problem = problems[0]\nprint (\"\\nProblem: %s, language: %s, \" %(problem['problem'],problem['language']), end=' ');\n",
"\nProblem: problem00001, language: en, "
],
[
"def d(estimator, n_features=5):\n from IPython.display import Markdown, display, HTML\n names = np.array(estimator.named_steps['vect'].get_feature_names());\n classes_ = estimator.named_steps['clf'].classes_;\n weights = estimator.named_steps['clf'].coef_;\n \n def tag(tag, content, attrib=''):\n if attrib != '':\n attrib = ' style=\"' + attrib+'\"'; \n return ''.join(['<',tag,attrib,' >',content,'</',tag,'>']);\n \n def color(baseColor, intensity):\n r,g,b = baseColor[0:2],baseColor[2:4],baseColor[4:6]\n r,g,b = int(r, 16), int(g, 16), int(b, 16)\n \n f= (1-np.abs(intensity))/2;\n r = r + int((255-r)*f)\n g = g + int((255-g)*f)\n b = b + int((255-b)*f)\n rgb = '#%02x%x%x' % (r, g, b);\n #print(baseColor,rgb,r,g,b,intensity,f)\n return rgb\n \n \n spanStyle ='border-radius: 5px;margin:4px;padding:3px; color:#FFF !important;';\n \n lines = '<table>'+tag('thead',tag('th','Classes')+tag('th','positive')+tag('th','negative'))\n lines += '<tbody>'\n for i,c in enumerate(weights):\n c = np.round(c / np.abs(c).max(),2);\n positive = names[np.argsort(-c)][:n_features];\n positiveV = c[np.argsort(-c)][:n_features]\n negative = names[np.argsort(c)][:n_features];\n negativeV = c[np.argsort(c)][:n_features]\n \n lines += tag('tr',\n tag('td', re.sub('\\D0*','',classes_[i]))\n + tag('td',''.join([tag('span',d.upper()+' '+str(v),spanStyle+'background:'+color('51A3DD',v)) for d,v in zip(positive,positiveV)]))\n + tag('td',''.join([tag('span',d.upper()+' '+str(v),spanStyle+'background:'+color('DD5555',v)) for d,v in zip(negative,negativeV)]))\n )\n lines+= '</tbody></table>'\n \n display(HTML(lines))\n #print(lines)\n \nd(estimators[0])",
"_____no_output_____"
],
[
"%%HTML\n<table><tbody><tr><th>POS</th><th>Description</th><th>Examples</th></tr><tr >\n<td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text u-text-small\">adjective</td><td class=\"c-table__cell u-text u-text-small\"><em>big, old, green, incomprehensible, first</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text u-text-small\">adposition</td><td class=\"c-table__cell u-text u-text-small\"><em>in, to, during</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text u-text-small\">adverb</td><td class=\"c-table__cell u-text u-text-small\"><em>very, tomorrow, down, where, there</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text u-text-small\">auxiliary</td><td class=\"c-table__cell u-text u-text-small\"><em>is, has (done), will (do), should (do)</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text u-text-small\">conjunction</td><td class=\"c-table__cell u-text u-text-small\"><em>and, or, but</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CCONJ</code></td><td class=\"c-table__cell u-text u-text-small\">coordinating conjunction</td><td class=\"c-table__cell u-text u-text-small\"><em>and, or, but</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text u-text-small\">determiner</td><td class=\"c-table__cell u-text u-text-small\"><em>a, an, the</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>INTJ</code></td><td class=\"c-table__cell u-text u-text-small\">interjection</td><td class=\"c-table__cell u-text u-text-small\"><em>psst, ouch, bravo, hello</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text u-text-small\">noun</td><td class=\"c-table__cell u-text u-text-small\"><em>girl, cat, tree, air, beauty</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NUM</code></td><td class=\"c-table__cell u-text u-text-small\">numeral</td><td class=\"c-table__cell u-text u-text-small\"><em>1, 2017, one, seventy-seven, IV, MMXIV</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text u-text-small\">particle</td><td class=\"c-table__cell u-text u-text-small\"><em>'s, not, </em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text u-text-small\">pronoun</td><td class=\"c-table__cell u-text u-text-small\"><em>I, you, he, she, myself, themselves, somebody</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text u-text-small\">proper noun</td><td class=\"c-table__cell u-text u-text-small\"><em>Mary, John, London, NATO, HBO</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation</td><td class=\"c-table__cell u-text u-text-small\"><em>., (, ), ?</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SCONJ</code></td><td class=\"c-table__cell u-text u-text-small\">subordinating 
conjunction</td><td class=\"c-table__cell u-text u-text-small\"><em>if, while, that</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text u-text-small\">symbol</td><td class=\"c-table__cell u-text u-text-small\"><em>$, %, ยง, ยฉ, +, โ, ร, รท, =, :), ๐</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text u-text-small\">verb</td><td class=\"c-table__cell u-text u-text-small\"><em>run, runs, running, eat, ate, eating</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text u-text-small\">other</td><td class=\"c-table__cell u-text u-text-small\"><em>sfpksdpsxmsa</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SPACE</code></td><td class=\"c-table__cell u-text u-text-small\">space</td></tr></tbody></table>",
"_____no_output_____"
],
[
"%%HTML\n<h1>English</h1>\n\n<table class=\"c-table o-block\"><tbody><tr class=\"c-table__row c-table__row--head\"><th class=\"c-table__head-cell u-text-label\">Tag</th><th class=\"c-table__head-cell u-text-label\">POS</th><th class=\"c-table__head-cell u-text-label\">Morphology</th><th class=\"c-table__head-cell u-text-label\">Description</th></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>-LRB-</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=brck</code> <code>PunctSide=ini</code></td><td class=\"c-table__cell u-text u-text-small\">left round bracket</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>-RRB-</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=brck</code> <code>PunctSide=fin</code></td><td class=\"c-table__cell u-text u-text-small\">right round bracket</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>,</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=comm</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, comma</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>:</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, colon or ellipsis</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>.</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=peri</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, sentence closer</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>''</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=quot</code> <code>PunctSide=fin</code></td><td class=\"c-table__cell u-text u-text-small\">closing quotation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>\"\"</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=quot</code> <code>PunctSide=fin</code></td><td class=\"c-table__cell u-text u-text-small\">closing quotation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>#</code></td><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"> <code>SymType=numbersign</code></td><td class=\"c-table__cell u-text u-text-small\">symbol, number sign</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>``</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=quot</code> <code>PunctSide=ini</code></td><td class=\"c-table__cell u-text u-text-small\">opening quotation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$</code></td><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"> <code>SymType=currency</code></td><td class=\"c-table__cell u-text u-text-small\">symbol, currency</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADD</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell 
u-text u-text-small\">email</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>AFX</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Hyph=yes</code></td><td class=\"c-table__cell u-text u-text-small\">affix</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>BES</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">auxiliary \"be\"</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CC</code></td><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text\"> <code>ConjType=coor</code></td><td class=\"c-table__cell u-text u-text-small\">conjunction, coordinating</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CD</code></td><td class=\"c-table__cell u-text\"><code>NUM</code></td><td class=\"c-table__cell u-text\"> <code>NumType=card</code></td><td class=\"c-table__cell u-text u-text-small\">cardinal number</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>DT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>determiner</code></td><td class=\"c-table__cell u-text u-text-small\"></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>EX</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>AdvType=ex</code></td><td class=\"c-table__cell u-text u-text-small\">existential there</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>FW</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"> <code>Foreign=yes</code></td><td class=\"c-table__cell u-text u-text-small\">foreign word</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>GW</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">additional word in multi-word expression</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>HVS</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">forms of \"have\"</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>HYPH</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=dash</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, hyphen</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>IN</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">conjunction, subordinating or preposition</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>JJ</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Degree=pos</code></td><td class=\"c-table__cell u-text u-text-small\">adjective</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>JJR</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Degree=comp</code></td><td class=\"c-table__cell u-text 
u-text-small\">adjective, comparative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>JJS</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Degree=sup</code></td><td class=\"c-table__cell u-text u-text-small\">adjective, superlative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>LS</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>NumType=ord</code></td><td class=\"c-table__cell u-text u-text-small\">list item marker</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>MD</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">verb, modal auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NFP</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">superfluous punctuation</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NIL</code></td><td class=\"c-table__cell u-text\"><code></code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">missing tag</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NN</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"> <code>Number=sing</code></td><td class=\"c-table__cell u-text u-text-small\">noun, singular or mass</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNP</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"> <code>NounType=prop</code> <code>Number=sign</code></td><td class=\"c-table__cell u-text u-text-small\">noun, proper singular</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNPS</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"> <code>NounType=prop</code> <code>Number=plur</code></td><td class=\"c-table__cell u-text u-text-small\">noun, proper plural</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNS</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"> <code>Number=plur</code></td><td class=\"c-table__cell u-text u-text-small\">noun, plural</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PDT</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>AdjType=pdt</code> <code>PronType=prn</code></td><td class=\"c-table__cell u-text u-text-small\">predeterminer</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>POS</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>Poss=yes</code></td><td class=\"c-table__cell u-text u-text-small\">possessive ending</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRP</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code></td><td class=\"c-table__cell u-text u-text-small\">pronoun, personal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRP$</code></td><td class=\"c-table__cell 
u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code> <code>Poss=yes</code></td><td class=\"c-table__cell u-text u-text-small\">pronoun, possessive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RB</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>Degree=pos</code></td><td class=\"c-table__cell u-text u-text-small\">adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RBR</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>Degree=comp</code></td><td class=\"c-table__cell u-text u-text-small\">adverb, comparative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RBS</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>Degree=sup</code></td><td class=\"c-table__cell u-text u-text-small\">adverb, superlative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RP</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">adverb, particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>_SP</code></td><td class=\"c-table__cell u-text\"><code>SPACE</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">space</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">symbol</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>TO</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=inf</code> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitival to</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>UH</code></td><td class=\"c-table__cell u-text\"><code>INTJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">interjection</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VB</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">verb, base form</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBD</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>Tense=past</code></td><td class=\"c-table__cell u-text u-text-small\">verb, past tense</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBG</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=part</code> <code>Tense=pres</code> <code>Aspect=prog</code></td><td class=\"c-table__cell u-text u-text-small\">verb, gerund or present participle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBN</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=part</code> <code>Tense=past</code> <code>Aspect=perf</code></td><td class=\"c-table__cell u-text u-text-small\">verb, 
past participle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>Tense=pres</code></td><td class=\"c-table__cell u-text u-text-small\">verb, non-3rd person singular present</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBZ</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>Tense=pres</code> <code>Number=sing</code> <code>Person=3</code></td><td class=\"c-table__cell u-text u-text-small\">verb, 3rd person singular present</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WDT</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WP</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-pronoun, personal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WP$</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Poss=yes PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-pronoun, possessive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WRB</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>XX</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">unknown</td></tr></tbody></table>",
"_____no_output_____"
],
[
"%%HTML\n<h1>German</h1>\n<p> The German part-of-speech tagger uses the <a href=\"http://www.ims.uni-stuttgart.de/forschung/ressourcen/korpora/TIGERCorpus/annotation/index.html\" target=\"_blank\" rel=\"noopener nofollow\">TIGER Treebank</a> annotation scheme. We also map the tags to the simpler Google\nUniversal POS tag set.</p>\n\n<table class=\"c-table o-block\"><tbody><tr class=\"c-table__row c-table__row--head\"><th class=\"c-table__head-cell u-text-label\">Tag</th><th class=\"c-table__head-cell u-text-label\">POS</th><th class=\"c-table__head-cell u-text-label\">Morphology</th><th class=\"c-table__head-cell u-text-label\">Description</th></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$(</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=brck</code></td><td class=\"c-table__cell u-text u-text-small\">other sentence-internal punctuation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$,</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=comm</code></td><td class=\"c-table__cell u-text u-text-small\">comma</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$.</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=peri</code></td><td class=\"c-table__cell u-text u-text-small\">sentence-final punctuation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADJA</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">adjective, attributive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADJD</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Variant=short</code></td><td class=\"c-table__cell u-text u-text-small\">adjective, adverbial or predicative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APPO</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=post</code></td><td class=\"c-table__cell u-text u-text-small\">postposition</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APPR</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=prep</code></td><td class=\"c-table__cell u-text u-text-small\">preposition; circumposition left</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APPRART</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=prep</code> <code>PronType=art</code></td><td class=\"c-table__cell u-text u-text-small\">preposition with article</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APZR</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=circ</code></td><td class=\"c-table__cell u-text u-text-small\">circumposition right</td></tr><tr class=\"c-table__row\"><td 
class=\"c-table__cell u-text\"><code>ART</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=art</code></td><td class=\"c-table__cell u-text u-text-small\">definite or indefinite article</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CARD</code></td><td class=\"c-table__cell u-text\"><code>NUM</code></td><td class=\"c-table__cell u-text\"> <code>NumType=card</code></td><td class=\"c-table__cell u-text u-text-small\">cardinal number</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>FM</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"> <code>Foreign=yes</code></td><td class=\"c-table__cell u-text u-text-small\">foreign language material</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ITJ</code></td><td class=\"c-table__cell u-text\"><code>INTJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">interjection</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KOKOM</code></td><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text\"> <code>ConjType=comp</code></td><td class=\"c-table__cell u-text u-text-small\">comparative conjunction</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KON</code></td><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">coordinate conjunction</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KOUI</code></td><td class=\"c-table__cell u-text\"><code>SCONJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">subordinate conjunction with \"zu\" and infinitive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KOUS</code></td><td class=\"c-table__cell u-text\"><code>SCONJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">subordinate conjunction with sentence</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NE</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">proper noun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNE</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">proper noun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NN</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">noun, singular or mass</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PAV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">pronominal adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PROAV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">pronominal adverb</td></tr><tr class=\"c-table__row\"><td 
class=\"c-table__cell u-text\"><code>PDAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">attributive demonstrative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PDS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">substituting demonstrative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PIAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=ind</code> <code>neg</code> <code>tot</code></td><td class=\"c-table__cell u-text u-text-small\">attributive indefinite pronoun without determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PIDAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>AdjType=pdt PronType=ind</code> <code>neg</code> <code>tot</code></td><td class=\"c-table__cell u-text u-text-small\">attributive indefinite pronoun with determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PIS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=ind</code> <code>neg</code> <code>tot</code></td><td class=\"c-table__cell u-text u-text-small\">substituting indefinite pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PPER</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code></td><td class=\"c-table__cell u-text u-text-small\">non-reflexive personal pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PPOSAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>Poss=yes</code> <code>PronType=prs</code></td><td class=\"c-table__cell u-text u-text-small\">attributive possessive pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PPOSS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=rel</code></td><td class=\"c-table__cell u-text u-text-small\">substituting possessive pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRELAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=rel</code></td><td class=\"c-table__cell u-text u-text-small\">attributive relative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRELS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=rel</code></td><td class=\"c-table__cell u-text u-text-small\">substituting relative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRF</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code> <code>Reflex=yes</code></td><td class=\"c-table__cell u-text u-text-small\">reflexive personal pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKA</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td 
class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">particle with adjective or adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKANT</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=res</code></td><td class=\"c-table__cell u-text u-text-small\">answer particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKNEG</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>Negative=yes</code></td><td class=\"c-table__cell u-text u-text-small\">negative particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKVZ</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=vbp</code></td><td class=\"c-table__cell u-text u-text-small\">separable verbal particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKZU</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=inf</code></td><td class=\"c-table__cell u-text u-text-small\">\"zu\" before infinitive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PWAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code></td><td class=\"c-table__cell u-text u-text-small\">attributive interrogative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PWAV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code></td><td class=\"c-table__cell u-text u-text-small\">adverbial interrogative or relative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PWS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code></td><td class=\"c-table__cell u-text u-text-small\">substituting interrogative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>TRUNC</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"> <code>Hyph=yes</code></td><td class=\"c-table__cell u-text u-text-small\">word remnant</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAFIN</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>Mood=ind</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">finite verb, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAIMP</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>Mood=imp</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">imperative, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAINF</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAPP</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>Aspect=perf</code> 
<code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">perfect participle, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VMFIN</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Mood=ind</code> <code>VerbForm=fin</code> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">finite verb, modal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VMINF</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive, modal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VMPP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Aspect=perf</code> <code>VerbForm=part</code> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">perfect participle, modal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVFIN</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Mood=ind</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">finite verb, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVIMP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Mood=imp</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">imperative, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVINF</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVIZU</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive with \"zu\", full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVPP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Aspect=perf</code> <code>VerbForm=part</code></td><td class=\"c-table__cell u-text u-text-small\">perfect participle, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>XY</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">non-word containing non-letter</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SP</code></td><td class=\"c-table__cell u-text\"><code>SPACE</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">space</td></tr></tbody></table>",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d063dd4da6edcb8569e16c912721a1ffc128f161 | 12,840 | ipynb | Jupyter Notebook | jupyter/spark_nlp_model.ipynb | akashmavle5/--akash | cfb21d5a943a3d3fcae3c08921e7323a52761acd | [
"Apache-2.0"
]
| null | null | null | jupyter/spark_nlp_model.ipynb | akashmavle5/--akash | cfb21d5a943a3d3fcae3c08921e7323a52761acd | [
"Apache-2.0"
]
| null | null | null | jupyter/spark_nlp_model.ipynb | akashmavle5/--akash | cfb21d5a943a3d3fcae3c08921e7323a52761acd | [
"Apache-2.0"
]
| null | null | null | 34.423592 | 224 | 0.51285 | [
[
[
"",
"_____no_output_____"
],
[
"# Spark NLP Quick Start\n### How to use Spark NLP pretrained pipelines",
"_____no_output_____"
],
[
"[](https://colab.research.google.com/github/JohnSnowLabs/spark-nlp-workshop/blob/master/jupyter/quick_start_google_colab.ipynb)",
"_____no_output_____"
],
[
"We will first set up the runtime environment and then load pretrained Entity Recognition model and Sentiment analysis model and give it a quick test. Feel free to test the models on your own sentences / datasets.",
"_____no_output_____"
]
],
[
[
"!wget http://setup.johnsnowlabs.com/colab.sh -O - | bash",
"--2021-06-03 06:56:33-- http://setup.johnsnowlabs.com/colab.sh\nResolving setup.johnsnowlabs.com (setup.johnsnowlabs.com)... 51.158.130.125\nConnecting to setup.johnsnowlabs.com (setup.johnsnowlabs.com)|51.158.130.125|:80... connected.\nHTTP request sent, awaiting response... 302 Moved Temporarily\nLocation: https://raw.githubusercontent.com/JohnSnowLabs/spark-nlp/master/scripts/colab_setup.sh [following]\n--2021-06-03 06:56:34-- https://raw.githubusercontent.com/JohnSnowLabs/spark-nlp/master/scripts/colab_setup.sh\nResolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.111.133, 185.199.110.133, ...\nConnecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.\nHTTP request sent, awaiting response... 200 OK\nLength: 1608 (1.6K) [text/plain]\nSaving to: โSTDOUTโ\n\n- 100%[===================>] 1.57K --.-KB/s in 0s \n\n2021-06-03 06:56:34 (34.0 MB/s) - written to stdout [1608/1608]\n\nsetup Colab for PySpark 3.0.2 and Spark NLP 3.0.3\nGet:1 https://cloud.r-project.org/bin/linux/ubuntu bionic-cran40/ InRelease [3,626 B]\nIgn:2 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 InRelease\nGet:3 http://security.ubuntu.com/ubuntu bionic-security InRelease [88.7 kB]\nGet:4 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic InRelease [15.9 kB]\nIgn:5 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 InRelease\nHit:6 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 Release\nHit:7 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 Release\nHit:9 http://archive.ubuntu.com/ubuntu bionic InRelease\nGet:11 http://archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]\nHit:12 http://ppa.launchpad.net/cran/libgit2/ubuntu bionic InRelease\nGet:13 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu bionic InRelease [15.9 kB]\nGet:14 http://security.ubuntu.com/ubuntu bionic-security/restricted amd64 Packages [424 kB]\nGet:15 http://archive.ubuntu.com/ubuntu bionic-backports InRelease [74.6 kB]\nGet:16 http://archive.ubuntu.com/ubuntu bionic-updates/restricted amd64 Packages [478 kB]\nHit:17 http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu bionic InRelease\nGet:18 http://security.ubuntu.com/ubuntu bionic-security/universe amd64 Packages [1,414 kB]\nGet:19 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic/main Sources [1,770 kB]\nGet:20 http://security.ubuntu.com/ubuntu bionic-security/main amd64 Packages [2,154 kB]\nGet:21 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 Packages [2,615 kB]\nGet:22 http://archive.ubuntu.com/ubuntu bionic-updates/universe amd64 Packages [2,184 kB]\nGet:23 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic/main amd64 Packages [906 kB]\nGet:24 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu bionic/main amd64 Packages [40.9 kB]\nFetched 12.3 MB in 7s (1,728 kB/s)\nReading package lists... Done\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 204.8MB 61kB/s \n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 51kB 6.0MB/s \n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 204kB 36.1MB/s \n\u001b[?25h Building wheel for pyspark (setup.py) ... \u001b[?25l\u001b[?25hdone\n"
],
[
"import sparknlp\nspark = sparknlp.start()\n\nprint(\"Spark NLP version: {}\".format(sparknlp.version()))\nprint(\"Apache Spark version: {}\".format(spark.version))",
"Spark NLP version: 3.0.3\nApache Spark version: 3.0.2\n"
],
[
"from sparknlp.pretrained import PretrainedPipeline ",
"_____no_output_____"
]
],
[
[
"Let's use Spark NLP pre-trained pipeline for `named entity recognition`",
"_____no_output_____"
]
],
[
[
"pipeline = PretrainedPipeline('recognize_entities_dl', 'en')",
"recognize_entities_dl download started this may take some time.\nApprox size to download 160.1 MB\n[OK!]\n"
],
[
"result = pipeline.annotate('President Biden represented Delaware for 36 years in the U.S. Senate before becoming the 47th Vice President of the United States.') ",
"_____no_output_____"
],
[
"print(result['ner'])\nprint(result['entities'])",
"['O', 'B-PER', 'O', 'B-LOC', 'O', 'O', 'O', 'O', 'O', 'B-LOC', 'O', 'B-ORG', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'O']\n['Biden', 'Delaware', 'U.S', 'Senate', 'United States']\n"
]
],
[
[
"Let's try another Spark NLP pre-trained pipeline for `named entity recognition`",
"_____no_output_____"
]
],
[
[
"pipeline = PretrainedPipeline('onto_recognize_entities_bert_tiny', 'en')\n\nresult = pipeline.annotate(\"Johnson first entered politics when elected in 2001 as a member of Parliament. He then served eight years as the mayor of London, from 2008 to 2016, before rejoining Parliament.\")\n\nprint(result['ner'])\nprint(result['entities'])",
"onto_recognize_entities_bert_tiny download started this may take some time.\nApprox size to download 30.2 MB\n[OK!]\n['B-PERSON', 'B-ORDINAL', 'O', 'O', 'O', 'O', 'O', 'B-DATE', 'O', 'O', 'O', 'O', 'B-ORG', 'O', 'O', 'O', 'B-DATE', 'I-DATE', 'O', 'O', 'O', 'O', 'B-GPE', 'O', 'B-DATE', 'O', 'B-DATE', 'O', 'O', 'O', 'B-ORG']\n['Johnson', 'first', '2001', 'Parliament.', 'eight years', 'London,', '2008', '2016', 'Parliament.']\n"
]
],
[
[
"Let's use Spark NLP pre-trained pipeline for `sentiment` analysis",
"_____no_output_____"
]
],
[
[
"pipeline = PretrainedPipeline('analyze_sentimentdl_glove_imdb', 'en')",
"analyze_sentimentdl_glove_imdb download started this may take some time.\nApprox size to download 155.3 MB\n[OK!]\n"
],
[
"result = pipeline.annotate(\"Harry Potter is a great movie.\")",
"_____no_output_____"
],
[
"print(result['sentiment'])",
"['pos']\n"
]
],
[
[
"### Please check our [Models Hub](https://nlp.johnsnowlabs.com/models) for more pretrained models and pipelines! ๐ ",
"_____no_output_____"
]
],
[
[
"",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d063e5bcce03d3c35d6f90ff41993985148b21df | 4,788 | ipynb | Jupyter Notebook | test/ipynb/groovy/TableMenuTest.ipynb | ssadedin/beakerx | 34479b07d2dfdf1404692692f483faf0251632c3 | [
"Apache-2.0"
]
| 1,491 | 2017-03-30T03:05:05.000Z | 2022-03-27T04:26:02.000Z | test/ipynb/groovy/TableMenuTest.ipynb | ssadedin/beakerx | 34479b07d2dfdf1404692692f483faf0251632c3 | [
"Apache-2.0"
]
| 3,268 | 2015-01-01T00:10:26.000Z | 2017-05-05T18:59:41.000Z | test/ipynb/groovy/TableMenuTest.ipynb | ssadedin/beakerx | 34479b07d2dfdf1404692692f483faf0251632c3 | [
"Apache-2.0"
]
| 287 | 2017-04-03T01:30:06.000Z | 2022-03-17T06:09:15.000Z | 23.130435 | 104 | 0.553258 | [
[
[
"empty"
]
]
]
| [
"empty"
]
| [
[
"empty"
]
]
|
d0643a20c5e91fa8a2a1902b0f62ac611d1e5814 | 45,339 | ipynb | Jupyter Notebook | chapter2/2.3.2-text_classification.ipynb | wangxingda/Tensorflow-Handbook | 97987e62da5a24dac6169fbacf1c3d4c041b3339 | [
"Apache-2.0"
]
| 22 | 2019-10-12T06:38:05.000Z | 2022-02-24T03:10:29.000Z | chapter2/2.3.2-text_classification.ipynb | wangxingda/tensorflow-handbook | 97987e62da5a24dac6169fbacf1c3d4c041b3339 | [
"Apache-2.0"
]
| null | null | null | chapter2/2.3.2-text_classification.ipynb | wangxingda/tensorflow-handbook | 97987e62da5a24dac6169fbacf1c3d4c041b3339 | [
"Apache-2.0"
]
| 6 | 2019-11-29T15:14:12.000Z | 2020-06-30T03:59:03.000Z | 50.488864 | 15,964 | 0.677717 | [
[
[
"# ็ตๅฝฑ่ฏ่ฎบๆๆฌๅ็ฑป",
"_____no_output_____"
],
[
"\nๆญค็ฌ่ฎฐๆฌ๏ผnotebook๏ผไฝฟ็จ่ฏ่ฎบๆๆฌๅฐๅฝฑ่ฏๅไธบ*็งฏๆ๏ผpositive๏ผ*ๆ*ๆถๆ๏ผnagetive๏ผ*ไธค็ฑปใ่ฟๆฏไธไธช*ไบๅ
๏ผbinary๏ผ*ๆ่
ไบๅ็ฑป้ฎ้ข๏ผไธ็ง้่ฆไธๅบ็จๅนฟๆณ็ๆบๅจๅญฆไน ้ฎ้ขใ\n\nๆไปฌๅฐไฝฟ็จๆฅๆบไบ[็ฝ็ป็ตๅฝฑๆฐๆฎๅบ๏ผInternet Movie Database๏ผ](https://www.imdb.com/)็ [IMDB ๆฐๆฎ้๏ผIMDB dataset๏ผ](https://tensorflow.google.cn/api_docs/python/tf/keras/datasets/imdb)๏ผๅ
ถๅ
ๅซ 50,000 ๆกๅฝฑ่ฏๆๆฌใไป่ฏฅๆฐๆฎ้ๅๅฒๅบ็25,000ๆก่ฏ่ฎบ็จไฝ่ฎญ็ป๏ผๅฆๅค 25,000 ๆก็จไฝๆต่ฏใ่ฎญ็ป้ไธๆต่ฏ้ๆฏ*ๅนณ่กก็๏ผbalanced๏ผ*๏ผๆๅณ็ๅฎไปฌๅ
ๅซ็ธ็ญๆฐ้็็งฏๆๅๆถๆ่ฏ่ฎบใ\n\nๆญค็ฌ่ฎฐๆฌ๏ผnotebook๏ผไฝฟ็จไบ [tf.keras](https://tensorflow.google.cn/guide/keras)๏ผๅฎๆฏไธไธช Tensorflow ไธญ็จไบๆๅปบๅ่ฎญ็ปๆจกๅ็้ซ็บงAPIใๆๅ
ณไฝฟ็จ `tf.keras` ่ฟ่กๆๆฌๅ็ฑป็ๆด้ซ็บงๆ็จ๏ผ่ฏทๅ้
[MLCCๆๆฌๅ็ฑปๆๅ๏ผMLCC Text Classification Guide๏ผ](https://developers.google.com/machine-learning/guides/text-classification/)ใ",
"_____no_output_____"
]
],
[
[
"from __future__ import absolute_import, division, print_function, unicode_literals\n\ntry:\n # Colab only\n %tensorflow_version 2.x\nexcept Exception:\n pass\nimport tensorflow as tf\nfrom tensorflow import keras\n\nimport numpy as np\n\nprint(tf.__version__)",
"2.0.0\n"
]
],
[
[
"## ไธ่ฝฝ IMDB ๆฐๆฎ้\n\nIMDB ๆฐๆฎ้ๅทฒ็ปๆๅ
ๅจ Tensorflow ไธญใ่ฏฅๆฐๆฎ้ๅทฒ็ป็ป่ฟ้ขๅค็๏ผ่ฏ่ฎบ๏ผๅ่ฏๅบๅ๏ผๅทฒ็ป่ขซ่ฝฌๆขไธบๆดๆฐๅบๅ๏ผๅ
ถไธญๆฏไธชๆดๆฐ่กจ็คบๅญๅ
ธไธญ็็นๅฎๅ่ฏใ\n\nไปฅไธไปฃ็ ๅฐไธ่ฝฝ IMDB ๆฐๆฎ้ๅฐๆจ็ๆบๅจไธ๏ผๅฆๆๆจๅทฒ็ปไธ่ฝฝ่ฟๅฐไป็ผๅญไธญๅคๅถ๏ผ๏ผ",
"_____no_output_____"
]
],
[
[
"imdb = keras.datasets.imdb\n\n(train_data, train_labels), (test_data, test_labels) = imdb.load_data(num_words=10000)",
"_____no_output_____"
]
],
[
[
"ๅๆฐ `num_words=10000` ไฟ็ไบ่ฎญ็ปๆฐๆฎไธญๆๅธธๅบ็ฐ็ 10,000 ไธชๅ่ฏใไธบไบไฟๆๆฐๆฎ่งๆจก็ๅฏ็ฎก็ๆง๏ผไฝ้ข่ฏๅฐ่ขซไธขๅผใ\n",
"_____no_output_____"
],
[
"## ๆข็ดขๆฐๆฎ\n\n่ฎฉๆไปฌ่ฑไธ็นๆถ้ดๆฅไบ่งฃๆฐๆฎๆ ผๅผใ่ฏฅๆฐๆฎ้ๆฏ็ป่ฟ้ขๅค็็๏ผๆฏไธชๆ ทๆฌ้ฝๆฏไธไธช่กจ็คบๅฝฑ่ฏไธญ่ฏๆฑ็ๆดๆฐๆฐ็ปใๆฏไธชๆ ็ญพ้ฝๆฏไธไธชๅผไธบ 0 ๆ 1 ็ๆดๆฐๅผ๏ผๅ
ถไธญ 0 ไปฃ่กจๆถๆ่ฏ่ฎบ๏ผ1 ไปฃ่กจ็งฏๆ่ฏ่ฎบใ",
"_____no_output_____"
]
],
[
[
"print(\"Training entries: {}, labels: {}\".format(len(train_data), len(train_labels)))",
"Training entries: 25000, labels: 25000\n"
]
],
[
[
"่ฏ่ฎบๆๆฌ่ขซ่ฝฌๆขไธบๆดๆฐๅผ๏ผๅ
ถไธญๆฏไธชๆดๆฐไปฃ่กจ่ฏๅ
ธไธญ็ไธไธชๅ่ฏใ้ฆๆก่ฏ่ฎบๆฏ่ฟๆ ท็๏ผ",
"_____no_output_____"
]
],
[
[
"print(train_data[0])",
"[1, 14, 22, 16, 43, 530, 973, 1622, 1385, 65, 458, 4468, 66, 3941, 4, 173, 36, 256, 5, 25, 100, 43, 838, 112, 50, 670, 2, 9, 35, 480, 284, 5, 150, 4, 172, 112, 167, 2, 336, 385, 39, 4, 172, 4536, 1111, 17, 546, 38, 13, 447, 4, 192, 50, 16, 6, 147, 2025, 19, 14, 22, 4, 1920, 4613, 469, 4, 22, 71, 87, 12, 16, 43, 530, 38, 76, 15, 13, 1247, 4, 22, 17, 515, 17, 12, 16, 626, 18, 2, 5, 62, 386, 12, 8, 316, 8, 106, 5, 4, 2223, 5244, 16, 480, 66, 3785, 33, 4, 130, 12, 16, 38, 619, 5, 25, 124, 51, 36, 135, 48, 25, 1415, 33, 6, 22, 12, 215, 28, 77, 52, 5, 14, 407, 16, 82, 2, 8, 4, 107, 117, 5952, 15, 256, 4, 2, 7, 3766, 5, 723, 36, 71, 43, 530, 476, 26, 400, 317, 46, 7, 4, 2, 1029, 13, 104, 88, 4, 381, 15, 297, 98, 32, 2071, 56, 26, 141, 6, 194, 7486, 18, 4, 226, 22, 21, 134, 476, 26, 480, 5, 144, 30, 5535, 18, 51, 36, 28, 224, 92, 25, 104, 4, 226, 65, 16, 38, 1334, 88, 12, 16, 283, 5, 16, 4472, 113, 103, 32, 15, 16, 5345, 19, 178, 32]\n"
]
],
[
[
"็ตๅฝฑ่ฏ่ฎบๅฏ่ฝๅ
ทๆไธๅ็้ฟๅบฆใไปฅไธไปฃ็ ๆพ็คบไบ็ฌฌไธๆกๅ็ฌฌไบๆก่ฏ่ฎบ็ไธญๅ่ฏๆฐ้ใ็ฑไบ็ฅ็ป็ฝ็ป็่พๅ
ฅๅฟ
้กปๆฏ็ปไธ็้ฟๅบฆ๏ผๆไปฌ็จๅ้่ฆ่งฃๅณ่ฟไธช้ฎ้ขใ",
"_____no_output_____"
]
],
[
[
"len(train_data[0]), len(train_data[1])",
"_____no_output_____"
]
],
[
[
"### ๅฐๆดๆฐ่ฝฌๆขๅๅ่ฏ\n\nไบ่งฃๅฆไฝๅฐๆดๆฐ่ฝฌๆขๅๆๆฌๅฏนๆจๅฏ่ฝๆฏๆๅธฎๅฉ็ใ่ฟ้ๆไปฌๅฐๅๅปบไธไธช่พ
ๅฉๅฝๆฐๆฅๆฅ่ฏขไธไธชๅ
ๅซไบๆดๆฐๅฐๅญ็ฌฆไธฒๆ ๅฐ็ๅญๅ
ธๅฏน่ฑก๏ผ",
"_____no_output_____"
]
],
[
[
"# ไธไธชๆ ๅฐๅ่ฏๅฐๆดๆฐ็ดขๅผ็่ฏๅ
ธ\nword_index = imdb.get_word_index()\n\n# ไฟ็็ฌฌไธไธช็ดขๅผ\nword_index = {k:(v+3) for k,v in word_index.items()}\nword_index[\"<PAD>\"] = 0\nword_index[\"<START>\"] = 1\nword_index[\"<UNK>\"] = 2 # unknown\nword_index[\"<UNUSED>\"] = 3\n\nreverse_word_index = dict([(value, key) for (key, value) in word_index.items()])\n\ndef decode_review(text):\n return ' '.join([reverse_word_index.get(i, '?') for i in text])",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/imdb_word_index.json\n1646592/1641221 [==============================] - 0s 0us/step\n"
]
],
[
[
"็ฐๅจๆไปฌๅฏไปฅไฝฟ็จ `decode_review` ๅฝๆฐๆฅๆพ็คบ้ฆๆก่ฏ่ฎบ็ๆๆฌ๏ผ",
"_____no_output_____"
]
],
[
[
"decode_review(train_data[0])",
"_____no_output_____"
]
],
[
[
"## ๅๅคๆฐๆฎ\n\nๅฝฑ่ฏโโๅณๆดๆฐๆฐ็ปๅฟ
้กปๅจ่พๅ
ฅ็ฅ็ป็ฝ็ปไนๅ่ฝฌๆขไธบๅผ ้ใ่ฟ็ง่ฝฌๆขๅฏไปฅ้่ฟไปฅไธไธค็งๆนๅผๆฅๅฎๆ๏ผ\n\n* ๅฐๆฐ็ป่ฝฌๆขไธบ่กจ็คบๅ่ฏๅบ็ฐไธๅฆ็็ฑ 0 ๅ 1 ็ปๆ็ๅ้๏ผ็ฑปไผผไบ one-hot ็ผ็ ใไพๅฆ๏ผๅบๅ[3, 5]ๅฐ่ฝฌๆขไธบไธไธช 10,000 ็ปด็ๅ้๏ผ่ฏฅๅ้้คไบ็ดขๅผไธบ 3 ๅ 5 ็ไฝ็ฝฎๆฏ 1 ไปฅๅค๏ผๅ
ถไป้ฝไธบ 0ใ็ถๅ๏ผๅฐๅ
ถไฝไธบ็ฝ็ป็้ฆๅฑโโไธไธชๅฏไปฅๅค็ๆตฎ็นๅๅ้ๆฐๆฎ็็จ ๅฏๅฑใไธ่ฟ๏ผ่ฟ็งๆนๆณ้่ฆๅคง้็ๅ
ๅญ๏ผ้่ฆไธไธชๅคงๅฐไธบ `num_words * num_reviews` ็็ฉ้ตใ\n\n* ๆ่
๏ผๆไปฌๅฏไปฅๅกซๅ
ๆฐ็ปๆฅไฟ่ฏ่พๅ
ฅๆฐๆฎๅ
ทๆ็ธๅ็้ฟๅบฆ๏ผ็ถๅๅๅปบไธไธชๅคงๅฐไธบ `max_length * num_reviews` ็ๆดๅๅผ ้ใๆไปฌๅฏไปฅไฝฟ็จ่ฝๅคๅค็ๆญคๅฝข็ถๆฐๆฎ็ๅตๅ
ฅๅฑไฝไธบ็ฝ็ปไธญ็็ฌฌไธๅฑใ\n\nๅจๆฌๆ็จไธญ๏ผๆไปฌๅฐไฝฟ็จ็ฌฌไบ็งๆนๆณใ\n\n็ฑไบ็ตๅฝฑ่ฏ่ฎบ้ฟๅบฆๅฟ
้กป็ธๅ๏ผๆไปฌๅฐไฝฟ็จ [pad_sequences](https://tensorflow.google.cn/api_docs/python/tf/keras/preprocessing/sequence/pad_sequences) ๅฝๆฐๆฅไฝฟ้ฟๅบฆๆ ๅๅ๏ผ",
"_____no_output_____"
]
],
[
[
"train_data = keras.preprocessing.sequence.pad_sequences(train_data,\n value=word_index[\"<PAD>\"],\n padding='post',\n maxlen=256)\n\ntest_data = keras.preprocessing.sequence.pad_sequences(test_data,\n value=word_index[\"<PAD>\"],\n padding='post',\n maxlen=256)",
"_____no_output_____"
]
],
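[
[
"As an aside, a minimal sketch (not from the original tutorial) of the first, multi-hot approach described above could look like the following; the 10,000 dimension is assumed from `num_words`:\n\n```python\nimport numpy as np\n\ndef multi_hot(sequences, dimension=10000):\n    # All zeros except for a 1 at every word index that occurs in the review\n    results = np.zeros((len(sequences), dimension))\n    for i, seq in enumerate(sequences):\n        results[i, seq] = 1.0\n    return results\n```",
"_____no_output_____"
]
],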
[
[
"็ฐๅจ่ฎฉๆไปฌ็ไธๆ ทๆฌ็้ฟๅบฆ๏ผ",
"_____no_output_____"
]
],
[
[
"len(train_data[0]), len(train_data[1])",
"_____no_output_____"
]
],
[
[
"ๅนถๆฃๆฅไธไธ้ฆๆก่ฏ่ฎบ๏ผๅฝๅๅทฒ็ปๅกซๅ
๏ผ๏ผ",
"_____no_output_____"
]
],
[
[
"print(train_data[0])",
"[ 1 14 22 16 43 530 973 1622 1385 65 458 4468 66 3941\n 4 173 36 256 5 25 100 43 838 112 50 670 2 9\n 35 480 284 5 150 4 172 112 167 2 336 385 39 4\n 172 4536 1111 17 546 38 13 447 4 192 50 16 6 147\n 2025 19 14 22 4 1920 4613 469 4 22 71 87 12 16\n 43 530 38 76 15 13 1247 4 22 17 515 17 12 16\n 626 18 2 5 62 386 12 8 316 8 106 5 4 2223\n 5244 16 480 66 3785 33 4 130 12 16 38 619 5 25\n 124 51 36 135 48 25 1415 33 6 22 12 215 28 77\n 52 5 14 407 16 82 2 8 4 107 117 5952 15 256\n 4 2 7 3766 5 723 36 71 43 530 476 26 400 317\n 46 7 4 2 1029 13 104 88 4 381 15 297 98 32\n 2071 56 26 141 6 194 7486 18 4 226 22 21 134 476\n 26 480 5 144 30 5535 18 51 36 28 224 92 25 104\n 4 226 65 16 38 1334 88 12 16 283 5 16 4472 113\n 103 32 15 16 5345 19 178 32 0 0 0 0 0 0\n 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n 0 0 0 0]\n"
]
],
[
[
"## ๆๅปบๆจกๅ\n\n็ฅ็ป็ฝ็ป็ฑๅ ๅ ็ๅฑๆฅๆๅปบ๏ผ่ฟ้่ฆไปไธคไธชไธป่ฆๆน้ขๆฅ่ฟ่กไฝ็ณป็ปๆๅณ็ญ๏ผ\n\n* ๆจกๅ้ๆๅคๅฐๅฑ๏ผ\n* ๆฏไธชๅฑ้ๆๅคๅฐ*้ๅฑๅๅ
๏ผhidden units๏ผ*๏ผ\n\nๅจๆญคๆ ทๆฌไธญ๏ผ่พๅ
ฅๆฐๆฎๅ
ๅซไธไธชๅ่ฏ็ดขๅผ็ๆฐ็ปใ่ฆ้ขๆต็ๆ ็ญพไธบ 0 ๆ 1ใ่ฎฉๆไปฌๆฅไธบ่ฏฅ้ฎ้ขๆๅปบไธไธชๆจกๅ๏ผ",
"_____no_output_____"
]
],
[
[
"# ่พๅ
ฅๅฝข็ถๆฏ็จไบ็ตๅฝฑ่ฏ่ฎบ็่ฏๆฑๆฐ็ฎ๏ผ10,000 ่ฏ๏ผ\nvocab_size = 10000\n\nmodel = keras.Sequential()\nmodel.add(keras.layers.Embedding(vocab_size, 16))\nmodel.add(keras.layers.GlobalAveragePooling1D())\nmodel.add(keras.layers.Dense(16, activation='relu'))\nmodel.add(keras.layers.Dense(1, activation='sigmoid'))\n\nmodel.summary()",
"Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nembedding (Embedding) (None, None, 16) 160000 \n_________________________________________________________________\nglobal_average_pooling1d (Gl (None, 16) 0 \n_________________________________________________________________\ndense (Dense) (None, 16) 272 \n_________________________________________________________________\ndense_1 (Dense) (None, 1) 17 \n=================================================================\nTotal params: 160,289\nTrainable params: 160,289\nNon-trainable params: 0\n_________________________________________________________________\n"
]
],
[
[
"ๅฑๆ้กบๅบๅ ๅ ไปฅๆๅปบๅ็ฑปๅจ๏ผ\n\n1. ็ฌฌไธๅฑๆฏ`ๅตๅ
ฅ๏ผEmbedding๏ผ`ๅฑใ่ฏฅๅฑ้็จๆดๆฐ็ผ็ ็่ฏๆฑ่กจ๏ผๅนถๆฅๆพๆฏไธช่ฏ็ดขๅผ็ๅตๅ
ฅๅ้๏ผembedding vector๏ผใ่ฟไบๅ้ๆฏ้่ฟๆจกๅ่ฎญ็ปๅญฆไน ๅฐ็ใๅ้ๅ่พๅบๆฐ็ปๅขๅ ไบไธไธช็ปดๅบฆใๅพๅฐ็็ปดๅบฆไธบ๏ผ`(batch, sequence, embedding)`ใ\n2. ๆฅไธๆฅ๏ผ`GlobalAveragePooling1D` ๅฐ้่ฟๅฏนๅบๅ็ปดๅบฆๆฑๅนณๅๅผๆฅไธบๆฏไธชๆ ทๆฌ่ฟๅไธไธชๅฎ้ฟ่พๅบๅ้ใ่ฟๅ
่ฎธๆจกๅไปฅๅฐฝๅฏ่ฝๆ็ฎๅ็ๆนๅผๅค็ๅ้ฟ่พๅ
ฅใ\n3. ่ฏฅๅฎ้ฟ่พๅบๅ้้่ฟไธไธชๆ 16 ไธช้ๅฑๅๅ
็ๅ
จ่ฟๆฅ๏ผ`Dense`๏ผๅฑไผ ่พใ\n4. ๆๅไธๅฑไธๅไธช่พๅบ็ป็นๅฏ้่ฟๆฅใไฝฟ็จ `Sigmoid` ๆฟๆดปๅฝๆฐ๏ผๅ
ถๅฝๆฐๅผไธบไปไบ 0 ไธ 1 ไน้ด็ๆตฎ็นๆฐ๏ผ่กจ็คบๆฆ็ๆ็ฝฎไฟกๅบฆใ",
"_____no_output_____"
],
[
"### ้ๅฑๅๅ
\n\nไธ่ฟฐๆจกๅๅจ่พๅ
ฅ่พๅบไน้ดๆไธคไธชไธญ้ดๅฑๆโ้่ๅฑโใ่พๅบ๏ผๅๅ
๏ผ็ป็นๆ็ฅ็ปๅ
๏ผ็ๆฐ้ๅณไธบๅฑ่กจ็คบ็ฉบ้ด็็ปดๅบฆใๆขๅฅ่ฏ่ฏด๏ผๆฏๅญฆไน ๅ
้จ่กจ็คบๆถ็ฝ็ปๆๅ
่ฎธ็่ช็ฑๅบฆใ\n\nๅฆๆๆจกๅๅ
ทๆๆดๅค็้ๅฑๅๅ
๏ผๆด้ซ็ปดๅบฆ็่กจ็คบ็ฉบ้ด๏ผๅ/ๆๆดๅคๅฑ๏ผๅๅฏไปฅๅญฆไน ๅฐๆดๅคๆ็่กจ็คบใไฝๆฏ๏ผ่ฟไผไฝฟ็ฝ็ป็่ฎก็ฎๆๆฌๆด้ซ๏ผๅนถไธๅฏ่ฝๅฏผ่ดๅญฆไน ๅฐไธ้่ฆ็ๆจกๅผโโไธไบ่ฝๅคๅจ่ฎญ็ปๆฐๆฎไธ่ไธๆฏๆต่ฏๆฐๆฎไธๆนๅๆง่ฝ็ๆจกๅผใ่ฟ่ขซ็งฐไธบ*่ฟๆๅ๏ผoverfitting๏ผ*๏ผๆไปฌ็จๅไผๅฏนๆญค่ฟ่กๆข็ฉถใ",
"_____no_output_____"
],
[
"### ๆๅคฑๅฝๆฐไธไผๅๅจ\n\nไธไธชๆจกๅ้่ฆๆๅคฑๅฝๆฐๅไผๅๅจๆฅ่ฟ่ก่ฎญ็ปใ็ฑไบ่ฟๆฏไธไธชไบๅ็ฑป้ฎ้ขไธๆจกๅ่พๅบๆฆ็ๅผ๏ผไธไธชไฝฟ็จ sigmoid ๆฟๆดปๅฝๆฐ็ๅไธๅๅ
ๅฑ๏ผ๏ผๆไปฌๅฐไฝฟ็จ `binary_crossentropy` ๆๅคฑๅฝๆฐใ\n\n่ฟไธๆฏๆๅคฑๅฝๆฐ็ๅฏไธ้ๆฉ๏ผไพๅฆ๏ผๆจๅฏไปฅ้ๆฉ `mean_squared_error` ใไฝๆฏ๏ผไธ่ฌๆฅ่ฏด `binary_crossentropy` ๆด้ๅๅค็ๆฆ็โโๅฎ่ฝๅคๅบฆ้ๆฆ็ๅๅธไน้ด็โ่ท็ฆปโ๏ผๆ่
ๅจๆไปฌ็็คบไพไธญ๏ผๆ็ๆฏๅบฆ้ ground-truth ๅๅธไธ้ขๆตๅผไน้ด็โ่ท็ฆปโใ\n\n็จๅ๏ผๅฝๆไปฌ็ ็ฉถๅๅฝ้ฎ้ข๏ผไพๅฆ๏ผ้ขๆตๆฟไปท๏ผๆถ๏ผๆไปฌๅฐไป็ปๅฆไฝไฝฟ็จๅฆไธ็งๅซๅๅๆน่ฏฏๅทฎ็ๆๅคฑๅฝๆฐใ\n\n็ฐๅจ๏ผ้
็ฝฎๆจกๅๆฅไฝฟ็จไผๅๅจๅๆๅคฑๅฝๆฐ๏ผ",
"_____no_output_____"
]
],
[
[
"model.compile(optimizer='adam',\n loss='binary_crossentropy',\n metrics=['accuracy'])",
"_____no_output_____"
]
],
[
[
"## ๅๅปบไธไธช้ช่ฏ้\n\nๅจ่ฎญ็ปๆถ๏ผๆไปฌๆณ่ฆๆฃๆฅๆจกๅๅจๆช่ง่ฟ็ๆฐๆฎไธ็ๅ็กฎ็๏ผaccuracy๏ผใ้่ฟไปๅๅง่ฎญ็ปๆฐๆฎไธญๅ็ฆป 10,000 ไธชๆ ทๆฌๆฅๅๅปบไธไธช*้ช่ฏ้*ใ๏ผไธบไปไน็ฐๅจไธไฝฟ็จๆต่ฏ้๏ผๆไปฌ็็ฎๆ ๆฏๅชไฝฟ็จ่ฎญ็ปๆฐๆฎๆฅๅผๅๅ่ฐๆดๆจกๅ๏ผ็ถๅๅชไฝฟ็จไธๆฌกๆต่ฏๆฐๆฎๆฅ่ฏไผฐๅ็กฎ็๏ผaccuracy๏ผ๏ผใ",
"_____no_output_____"
]
],
[
[
"x_val = train_data[:10000]\npartial_x_train = train_data[10000:]\n\ny_val = train_labels[:10000]\npartial_y_train = train_labels[10000:]",
"_____no_output_____"
]
],
[
[
"## ่ฎญ็ปๆจกๅ\n\nไปฅ 512 ไธชๆ ทๆฌ็ mini-batch ๅคงๅฐ่ฟญไปฃ 40 ไธช epoch ๆฅ่ฎญ็ปๆจกๅใ่ฟๆฏๆๅฏน `x_train` ๅ `y_train` ๅผ ้ไธญๆๆๆ ทๆฌ็็ 40 ๆฌก่ฟญไปฃใๅจ่ฎญ็ป่ฟ็จไธญ๏ผ็ๆตๆฅ่ช้ช่ฏ้็ 10,000 ไธชๆ ทๆฌไธ็ๆๅคฑๅผ๏ผloss๏ผๅๅ็กฎ็๏ผaccuracy๏ผ๏ผ",
"_____no_output_____"
]
],
[
[
"history = model.fit(partial_x_train,\n partial_y_train,\n epochs=40,\n batch_size=512,\n validation_data=(x_val, y_val),\n verbose=1)",
"Train on 15000 samples, validate on 10000 samples\nEpoch 1/40\n15000/15000 [==============================] - 1s 99us/sample - loss: 0.6921 - accuracy: 0.5437 - val_loss: 0.6903 - val_accuracy: 0.6241\nEpoch 2/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.6870 - accuracy: 0.7057 - val_loss: 0.6833 - val_accuracy: 0.7018\nEpoch 3/40\n15000/15000 [==============================] - 1s 54us/sample - loss: 0.6760 - accuracy: 0.7454 - val_loss: 0.6694 - val_accuracy: 0.7501\nEpoch 4/40\n15000/15000 [==============================] - 1s 53us/sample - loss: 0.6563 - accuracy: 0.7659 - val_loss: 0.6467 - val_accuracy: 0.7571\nEpoch 5/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.6270 - accuracy: 0.7837 - val_loss: 0.6155 - val_accuracy: 0.7793\nEpoch 6/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.5882 - accuracy: 0.7993 - val_loss: 0.5762 - val_accuracy: 0.7960\nEpoch 7/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.5420 - accuracy: 0.8219 - val_loss: 0.5336 - val_accuracy: 0.8106\nEpoch 8/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.4955 - accuracy: 0.8367 - val_loss: 0.4930 - val_accuracy: 0.8262\nEpoch 9/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.4507 - accuracy: 0.8522 - val_loss: 0.4542 - val_accuracy: 0.8393\nEpoch 10/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.4107 - accuracy: 0.8667 - val_loss: 0.4218 - val_accuracy: 0.8478\nEpoch 11/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.3766 - accuracy: 0.8779 - val_loss: 0.3957 - val_accuracy: 0.8551\nEpoch 12/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.3483 - accuracy: 0.8843 - val_loss: 0.3741 - val_accuracy: 0.8613\nEpoch 13/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.3238 - accuracy: 0.8925 - val_loss: 0.3573 - val_accuracy: 0.8667\nEpoch 14/40\n15000/15000 [==============================] - 1s 54us/sample - loss: 0.3027 - accuracy: 0.8977 - val_loss: 0.3439 - val_accuracy: 0.8678\nEpoch 15/40\n15000/15000 [==============================] - 1s 54us/sample - loss: 0.2850 - accuracy: 0.9032 - val_loss: 0.3318 - val_accuracy: 0.8737\nEpoch 16/40\n15000/15000 [==============================] - 1s 56us/sample - loss: 0.2695 - accuracy: 0.9071 - val_loss: 0.3231 - val_accuracy: 0.8744\nEpoch 17/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2549 - accuracy: 0.9124 - val_loss: 0.3151 - val_accuracy: 0.8790\nEpoch 18/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2421 - accuracy: 0.9166 - val_loss: 0.3086 - val_accuracy: 0.8807\nEpoch 19/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2307 - accuracy: 0.9201 - val_loss: 0.3035 - val_accuracy: 0.8794\nEpoch 20/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2201 - accuracy: 0.9243 - val_loss: 0.2994 - val_accuracy: 0.8802\nEpoch 21/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2103 - accuracy: 0.9271 - val_loss: 0.2953 - val_accuracy: 0.8825\nEpoch 22/40\n15000/15000 [==============================] - 1s 53us/sample - loss: 0.2014 - accuracy: 0.9306 - val_loss: 0.2926 - val_accuracy: 0.8834\nEpoch 23/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.1923 - accuracy: 0.9352 - val_loss: 0.2901 - val_accuracy: 0.8848\nEpoch 
24/40\n15000/15000 [==============================] - 1s 53us/sample - loss: 0.1845 - accuracy: 0.9395 - val_loss: 0.2907 - val_accuracy: 0.8852\nEpoch 25/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.1770 - accuracy: 0.9426 - val_loss: 0.2875 - val_accuracy: 0.8838\nEpoch 26/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.1696 - accuracy: 0.9459 - val_loss: 0.2870 - val_accuracy: 0.8849\nEpoch 27/40\n15000/15000 [==============================] - 1s 58us/sample - loss: 0.1628 - accuracy: 0.9492 - val_loss: 0.2868 - val_accuracy: 0.8849\nEpoch 28/40\n15000/15000 [==============================] - 1s 65us/sample - loss: 0.1563 - accuracy: 0.9513 - val_loss: 0.2876 - val_accuracy: 0.8842\nEpoch 29/40\n15000/15000 [==============================] - 1s 65us/sample - loss: 0.1505 - accuracy: 0.9534 - val_loss: 0.2881 - val_accuracy: 0.8849\nEpoch 30/40\n15000/15000 [==============================] - 1s 62us/sample - loss: 0.1450 - accuracy: 0.9553 - val_loss: 0.2878 - val_accuracy: 0.8857\nEpoch 31/40\n15000/15000 [==============================] - 1s 60us/sample - loss: 0.1389 - accuracy: 0.9584 - val_loss: 0.2879 - val_accuracy: 0.8862\nEpoch 32/40\n15000/15000 [==============================] - 1s 62us/sample - loss: 0.1347 - accuracy: 0.9595 - val_loss: 0.2907 - val_accuracy: 0.8849\nEpoch 33/40\n15000/15000 [==============================] - 1s 61us/sample - loss: 0.1286 - accuracy: 0.9626 - val_loss: 0.2908 - val_accuracy: 0.8859\nEpoch 34/40\n15000/15000 [==============================] - 1s 59us/sample - loss: 0.1244 - accuracy: 0.9645 - val_loss: 0.2926 - val_accuracy: 0.8864\nEpoch 35/40\n15000/15000 [==============================] - 1s 59us/sample - loss: 0.1192 - accuracy: 0.9664 - val_loss: 0.2945 - val_accuracy: 0.8850\nEpoch 36/40\n15000/15000 [==============================] - 1s 61us/sample - loss: 0.1149 - accuracy: 0.9688 - val_loss: 0.2959 - val_accuracy: 0.8847\nEpoch 37/40\n15000/15000 [==============================] - 1s 60us/sample - loss: 0.1107 - accuracy: 0.9699 - val_loss: 0.2998 - val_accuracy: 0.8833\nEpoch 38/40\n15000/15000 [==============================] - 1s 59us/sample - loss: 0.1065 - accuracy: 0.9703 - val_loss: 0.3007 - val_accuracy: 0.8844\nEpoch 39/40\n15000/15000 [==============================] - 1s 62us/sample - loss: 0.1029 - accuracy: 0.9722 - val_loss: 0.3042 - val_accuracy: 0.8827\nEpoch 40/40\n15000/15000 [==============================] - 1s 69us/sample - loss: 0.0995 - accuracy: 0.9736 - val_loss: 0.3074 - val_accuracy: 0.8817\n"
]
],
[
[
"## ่ฏไผฐๆจกๅ\n\nๆไปฌๆฅ็ไธไธๆจกๅ็ๆง่ฝๅฆไฝใๅฐ่ฟๅไธคไธชๅผใๆๅคฑๅผ๏ผloss๏ผ๏ผไธไธช่กจ็คบ่ฏฏๅทฎ็ๆฐๅญ๏ผๅผ่ถไฝ่ถๅฅฝ๏ผไธๅ็กฎ็๏ผaccuracy๏ผใ",
"_____no_output_____"
]
],
[
[
"results = model.evaluate(test_data, test_labels, verbose=2)\n\nprint(results)",
"25000/1 - 1s - loss: 0.3459 - accuracy: 0.8727\n[0.325805940823555, 0.87268]\n"
]
],
[
[
"่ฟ็งๅๅๆด็ด ็ๆนๆณๅพๅฐไบ็บฆ 87% ็ๅ็กฎ็๏ผaccuracy๏ผใ่ฅ้็จๆดๅฅฝ็ๆนๆณ๏ผๆจกๅ็ๅ็กฎ็ๅบๅฝๆฅ่ฟ 95%ใ",
"_____no_output_____"
],
[
"## ๅๅปบไธไธชๅ็กฎ็๏ผaccuracy๏ผๅๆๅคฑๅผ๏ผloss๏ผ้ๆถ้ดๅๅ็ๅพ่กจ\n\n`model.fit()` ่ฟๅไธไธช `History` ๅฏน่ฑก๏ผ่ฏฅๅฏน่ฑกๅ
ๅซไธไธชๅญๅ
ธ๏ผๅ
ถไธญๅ
ๅซ่ฎญ็ป้ถๆฎตๆๅ็็ไธๅไบไปถ๏ผ",
"_____no_output_____"
]
],
[
[
"history_dict = history.history\nhistory_dict.keys()",
"_____no_output_____"
]
],
[
[
"ๆๅไธชๆก็ฎ๏ผๅจ่ฎญ็ปๅ้ช่ฏๆ้ด๏ผๆฏไธชๆก็ฎๅฏนๅบไธไธช็ๆงๆๆ ใๆไปฌๅฏไปฅไฝฟ็จ่ฟไบๆก็ฎๆฅ็ปๅถ่ฎญ็ปไธ้ช่ฏ่ฟ็จ็ๆๅคฑๅผ๏ผloss๏ผๅๅ็กฎ็๏ผaccuracy๏ผ๏ผไปฅไพฟ่ฟ่กๆฏ่พใ",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n\nacc = history_dict['accuracy']\nval_acc = history_dict['val_accuracy']\nloss = history_dict['loss']\nval_loss = history_dict['val_loss']\n\nepochs = range(1, len(acc) + 1)\n\n# โboโไปฃ่กจ \"่็น\"\nplt.plot(epochs, loss, 'bo', label='Training loss')\n# bไปฃ่กจโ่่ฒๅฎ็บฟโ\nplt.plot(epochs, val_loss, 'b', label='Validation loss')\nplt.title('Training and validation loss')\nplt.xlabel('Epochs')\nplt.ylabel('Loss')\nplt.legend()\n\nplt.show()",
"_____no_output_____"
],
[
"plt.clf() # ๆธ
้คๆฐๅญ\n\nplt.plot(epochs, acc, 'bo', label='Training acc')\nplt.plot(epochs, val_acc, 'b', label='Validation acc')\nplt.title('Training and validation accuracy')\nplt.xlabel('Epochs')\nplt.ylabel('Accuracy')\nplt.legend()\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"\nๅจ่ฏฅๅพไธญ๏ผ็นไปฃ่กจ่ฎญ็ปๆๅคฑๅผ๏ผloss๏ผไธๅ็กฎ็๏ผaccuracy๏ผ๏ผๅฎ็บฟไปฃ่กจ้ช่ฏๆๅคฑๅผ๏ผloss๏ผไธๅ็กฎ็๏ผaccuracy๏ผใ\n\nๆณจๆ่ฎญ็ปๆๅคฑๅผ้ๆฏไธไธช epoch *ไธ้*่่ฎญ็ปๅ็กฎ็๏ผaccuracy๏ผ้ๆฏไธไธช epoch *ไธๅ*ใ่ฟๅจไฝฟ็จๆขฏๅบฆไธ้ไผๅๆถๆฏๅฏ้ขๆ็โโ็ๅบๅจๆฏๆฌก่ฟญไปฃไธญๆๅฐๅๆๆๅผใ\n\n้ช่ฏ่ฟ็จ็ๆๅคฑๅผ๏ผloss๏ผไธๅ็กฎ็๏ผaccuracy๏ผ็ๆ
ๅตๅดๅนถ้ๅฆๆญคโโๅฎไปฌไผผไนๅจ 20 ไธช epoch ๅ่พพๅฐๅณฐๅผใ่ฟๆฏ่ฟๆๅ็ไธไธชๅฎไพ๏ผๆจกๅๅจ่ฎญ็ปๆฐๆฎไธ็่กจ็ฐๆฏๅจไปฅๅไปๆช่ง่ฟ็ๆฐๆฎไธ็่กจ็ฐ่ฆๆดๅฅฝใๅจๆญคไนๅ๏ผๆจกๅ่ฟๅบฆไผๅๅนถๅญฆไน *็นๅฎ*ไบ่ฎญ็ปๆฐๆฎ็่กจ็คบ๏ผ่ไธ่ฝๅค*ๆณๅ*ๅฐๆต่ฏๆฐๆฎใ\n\nๅฏนไบ่ฟ็ง็นๆฎๆ
ๅต๏ผๆไปฌๅฏไปฅ้่ฟๅจ 20 ไธชๅทฆๅณ็ epoch ๅๅๆญข่ฎญ็ปๆฅ้ฟๅ
่ฟๆๅใ็จๅ๏ผๆจๅฐ็ๅฐๅฆไฝ้่ฟๅ่ฐ่ชๅจๆง่กๆญคๆไฝใ",
"_____no_output_____"
]
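,
[
"As an illustrative sketch (not part of the original tutorial), Keras provides an `EarlyStopping` callback that automates this; the `monitor` and `patience` values here are assumptions, not tuned settings:\n\n```python\nearly_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=2,\n                                           restore_best_weights=True)\n# model.fit(partial_x_train, partial_y_train, epochs=40, batch_size=512,\n#           validation_data=(x_val, y_val), callbacks=[early_stop])\n```",
"_____no_output_____"
]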
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
]
|
d06454dfa2d4595190cbf36062f5989e3148a977 | 16,185 | ipynb | Jupyter Notebook | notebooks/00_just_plot_it.ipynb | NFAcademy/2021_course_dev-tacaswell | 78df3044f46581f6c5276283ff825b26a00a3039 | [
"MIT"
]
| null | null | null | notebooks/00_just_plot_it.ipynb | NFAcademy/2021_course_dev-tacaswell | 78df3044f46581f6c5276283ff825b26a00a3039 | [
"MIT"
]
| null | null | null | notebooks/00_just_plot_it.ipynb | NFAcademy/2021_course_dev-tacaswell | 78df3044f46581f6c5276283ff825b26a00a3039 | [
"MIT"
]
| null | null | null | 31.125 | 514 | 0.617547 | [
[
[
"# Just Plot It!",
"_____no_output_____"
],
[
"## Introduction",
"_____no_output_____"
],
[
"### The System",
"_____no_output_____"
],
[
"In this course we will work with a set of \"experimental\" data to illustrate going from \"raw\" measurement (or simulation) data through exploratory visualization to an (almost) paper ready figure.\n\nIn this scenario, we have fabricated (or simulated) 25 cantilevers. There is some value (suggestively called \"control\") that varies between the cantilevers and we want to see how the properties of the cantilever are affect by \"control\".",
"_____no_output_____"
],
[
"To see what this will look like physically, take part a \"clicky\" pen. Hold one end of the spring in your fingers and flick the free end. \n\nOr just watch this cat:",
"_____no_output_____"
]
],
[
[
"from IPython.display import YouTubeVideo\nYouTubeVideo('4aTagDSnclk?start=19')",
"_____no_output_____"
]
],
[
[
"Springs, and our cantilevers, are part of a class of systems known as (Damped) Harmonic Oscillators. We are going to measure the natural frequency and damping rate we deflect each cantilever by the same amount and then observe the position as a function of time as the vibrations damp out.",
"_____no_output_____"
],
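[
"For reference, a minimal sketch of the damped-oscillator model such a ring-down follows; the amplitude, decay, frequency, and phase values are illustrative assumptions, not fitted results:\n\n```python\nimport numpy as np\n\ndef ring_down(t, amplitude=1.0, decay=0.5, frequency=2.0, phase=0.0):\n    # exponentially decaying sinusoid: A * exp(-decay*t) * sin(2*pi*f*t + phase)\n    return amplitude * np.exp(-decay * t) * np.sin(2 * np.pi * frequency * t + phase)\n```",
"_____no_output_____"
],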
[
"### The Tools",
"_____no_output_____"
],
[
"We are going make use of: \n\n- [jupyter](https://jupyter.org)\n- [numpy](https://numpy.org)\n- [matplotlib](https://matplotlib.org)\n- [scipy](https://www.scipy.org/scipylib/index.html)\n- [xarray](http://xarray.pydata.org/en/stable/index.html)\n- [pandas](https://pandas.pydata.org/docs/)\n\nWe are only going to scratch the surface of what any of these libraries can do! For the purposes of this course we assume you know numpy and Matplotlib at least to the level of LINKS TO OTHER COURSES. We will only be using one aspect (least square fitting) from scipy so no prior familiarity is needed. Similarly, we will only be superficially making use of pandas and xarray to provided access to structured data. No prior familiarity is required and if you want to learn more see LINK TO OTHER COURSES.",
"_____no_output_____"
]
],
[
[
"# interactive figures, requires ipypml!\n%matplotlib widget\n#%matplotlib inline\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport pandas as pd\nimport scipy\nimport xarray as xa",
"_____no_output_____"
]
],
[
[
"### Philsophy",
"_____no_output_____"
],
[
"While this coures uses Matplotlib for the visualization, the high-level lessons of this course are transferable to any plotting tools (in any language).\n\nAt its core, programing in the process of taking existing tools (libraries) and building new tools more fit to your purpose. This course will walk through a concrete example, starting with a pile of data and ending with a paper figure, of how to think about and design scientific visualizations tools tuned to exactly *your* data and questions.",
"_____no_output_____"
],
[
"## The Data",
"_____no_output_____"
],
[
"### Accessing data\n\nAs a rule-of-thumb I/O logic should be kept out of the inner loops of analysis or plotting. This will, in the medium term, lead to more re-usable and maintainable code. Remember your most frequent collaborator is yourself in 6 months. Be kind to your (future) self and write re-usable, maintainable, and understandable code now ;)\n\nIn this case, we have a data (simulation) function `get_data` that will simulate the experiment and returns to us a [`xarray.DataArray`](http://xarray.pydata.org/en/stable/quick-overview.html#create-a-dataarray). `xarray.DataArray` is (roughly) a N-dimensional numpy array that is enriched by the concept of coordinates and indies on the the axes and meta-data. \n\n`xarray` has much more functionality than we will use in this course!",
"_____no_output_____"
]
],
[
[
"# not sure how else to get the helpers on the path!\nimport sys\nsys.path.append('../scripts')",
"_____no_output_____"
],
[
"from data_gen import get_data, fit",
"_____no_output_____"
]
],
[
[
"### First look",
"_____no_output_____"
],
[
"Using the function `get_data` we can pull an `xarray.DataArray` into our namespace and the use the html repr from xarray to get a first look at the data",
"_____no_output_____"
]
],
[
[
"d = get_data(25)\nd",
"_____no_output_____"
]
],
[
[
"From this we can see that we have a, more-or-less, 2D array with 25 rows, each of which is a measurement that is a 4,112 point time series. Because this is an DataArray it also caries **coordinates** giving the value of **control** for each row and the time for each column.",
"_____no_output_____"
],
[
"If we pull out just one row we can see a single experimental measurement.",
"_____no_output_____"
]
],
[
[
"d[6]",
"_____no_output_____"
]
],
[
[
"We can see that the **control** coordinate now gives 1 value, but the **time** coordinate is still a vector. We can access these values via attribute access (which we will use later):",
"_____no_output_____"
]
],
[
[
"d[6].control",
"_____no_output_____"
],
[
"d[6].time",
"_____no_output_____"
]
],
[
[
"## The Plotting",
"_____no_output_____"
],
[
"### Plot it?\nLooking at (truncated) lists of numbers is not intuitive or informative for most people, to get a better sense of what this data looks like lets plot it! We know that `Axes.plot` can plot multiple lines at once so lets try naively throwing `d` at `ax.plot`!",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\nax.plot(d);",
"_____no_output_____"
]
],
[
[
"While this does look sort of cool, it is not *useful*. What has happened is that Matplotlib has looked at our `(25, 4_112)` array and said \"Clearly, you have a table that is 4k columns wide and 25 rows long. What you want is each column plotted!\". Thus, what we are seeing is \"The deflection at a fixed time as a function of cantilever ID number\". This plot does accurately reflect that data that we passed in, but this is a nearly meaningless plot!\n\nVisualization, just like writing, is a tool for communication and you need to think about the story you want to tell as you make the plots.",
"_____no_output_____"
],
[
"### Sidebar: Explicit vs Implicit Matplotlib API\n\nThere are two related but distinct APIs to use Matplotlib: the \"Explicit\" (nee \"Object Oriented\") and \"Implicit\" (nee \"pyplot/pylab\"). The Implicit API is implemented using the Explicit API; anything you can do with the Implicit API you can do with the Explicit API, but there is some functionality of the Explicit API that is not exposed through the Implicit API. It is also possible, but with one exception not suggested, to mix the two APIs.\n\nThe core conceptual difference is than in the Implicit API Matplotlib has a notion of the \"current figure\" and \"current axes\" that all of the calls re-directed to. For example, the implementation of `plt.plot` (once you scroll past the docstring) is only 1 line:",
"_____no_output_____"
]
],
[
[
"?? plt.plot",
"_____no_output_____"
]
],
[
[
"While the Implicit API reduces the boilerplate required to get some things done and is convenient when working in a terminal, it comes at the cost of Matplotlib maintaining global state of which Axes is currently active! When scripting this can quickly become a headache to manage.",
"_____no_output_____"
],
[
"When using Matplotlib with one of the GUI backends, we do need to, at the library level, keep track of some global state so that the plot windows remain responsive. If you are embedding Matplotlib in your own GUI application you are responsible for this, but when working at an IPython prompt,`pyplot` takes care of this for you.",
"_____no_output_____"
],
[
"This course is going to, with the exception of creating new figures, always use the Explict API.",
"_____no_output_____"
],
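[
"As a minimal sketch of the difference (the `t` and `y` arrays here are placeholders):\n\n```python\n# Implicit API: calls are routed to the hidden \"current\" figure/axes\nplt.plot(t, y)\nplt.xlabel('time (ms)')\n\n# Explicit API: every call names the object it acts on\nfig, ax = plt.subplots()\nax.plot(t, y)\nax.set_xlabel('time (ms)')\n```",
"_____no_output_____"
],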
[
"### Plot it!\n\nWhat we really want to see is the transpose of the above (A line per experiment as a function of time):",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\nax.plot(d.T);",
"_____no_output_____"
]
],
[
[
"Which is better! If we squint a bit (or zoom in if we are using `ipympl` or a GUI backend) can sort of see each of the individual oscillators ringing-down over time.",
"_____no_output_____"
],
[
"### Just one at a time",
"_____no_output_____"
],
[
"To make it easier to see lets plot just one of the curves:",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\nax.plot(d[6]);",
"_____no_output_____"
]
],
[
[
"### Pass freshman physics",
"_____no_output_____"
],
[
"While we do have just one line on the axes and can see what is going on, this plot would, right, be marked as little-to-no credit if turned in as part of a freshman Physics lab! We do not have a meaningful value on the x-axis, no legend, and no axis labels!",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\nm = d[6]\nax.plot(m.time, m, label=f'control = {float(m.control):.1f}')\nax.set_xlabel('time (ms)')\nax.set_ylabel('displacement (mm)')\nax.legend();",
"_____no_output_____"
]
],
[
[
"At this point we have a minimally acceptable plot! It shows us one curve with axis labels (with units!) and a legend. With ",
"_____no_output_____"
],
[
"### sidebar: xarray plotting",
"_____no_output_____"
],
[
"Because xarray knows more about the structure of your data than a couple of numpy arrays in your local namespace or dictionary, it can make smarter choices about the automatic visualization:",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\nm.plot(ax=ax)",
"_____no_output_____"
]
],
[
[
"While this is helpful exploritory plotting, `xarray` makes some choices that make it difficult to compose plotting multiple data sets.",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d06454dfb1715cc2a28cb347906303323e0c830d | 166,341 | ipynb | Jupyter Notebook | 08. Classification.ipynb | monocilindro/data_science | 36098bb6b3e731a11315e58e0b09628fae86069d | [
"MIT"
]
| 41 | 2020-01-25T21:23:59.000Z | 2022-02-22T19:48:15.000Z | 08. Classification.ipynb | ichit/data_science | 36098bb6b3e731a11315e58e0b09628fae86069d | [
"MIT"
]
| null | null | null | 08. Classification.ipynb | ichit/data_science | 36098bb6b3e731a11315e58e0b09628fae86069d | [
"MIT"
]
| 38 | 2020-01-27T18:57:46.000Z | 2022-03-05T00:33:45.000Z | 231.995816 | 136,448 | 0.902934 | [
[
[
"## 8. Classification\n\n[Data Science Playlist on YouTube](https://www.youtube.com/watch?v=VLKEj9EN2ew&list=PLLBUgWXdTBDg1Qgmwt4jKtVn9BWh5-zgy)\n[](https://www.youtube.com/watch?v=VLKEj9EN2ew&list=PLLBUgWXdTBDg1Qgmwt4jKtVn9BWh5-zgy \"Python Data Science\")\n\n**Classification** predicts *discrete labels (outcomes)* such as `yes`/`no`, `True`/`False`, or any number of discrete levels such as a letter from text recognition, or a word from speech recognition. There are two main methods for training classifiers: unsupervised and supervised learning. The difference between the two is that unsupervised learning does not use labels while supervised learning uses labels to build the classifier. The goal of unsupervised learning is to cluster input features but without labels to guide the grouping. ",
"_____no_output_____"
],
[
"\n\n### Supervised Learning to Classify Numbers\n\nA dataset that is included with sklearn is a set of 1797 images of numbers that are 64 pixels (8x8) each. There are labels with each to indicate the correct answer. A Support Vector Classifier is trained on the first half of the images.",
"_____no_output_____"
]
],
[
[
"from sklearn import datasets, svm\nfrom sklearn.model_selection import train_test_split\nimport matplotlib.pyplot as plt\n%matplotlib inline\nimport numpy as np\n\n# train classifier\ndigits = datasets.load_digits()\nn_samples = len(digits.images)\ndata = digits.images.reshape((n_samples, -1))\nsvc = svm.SVC(gamma=0.001)\nX_train, X_test, y_train, y_test = train_test_split(\n data, digits.target, test_size=0.5, shuffle=False)\nsvc.fit(X_train, y_train)\nprint('SVC Trained')",
"_____no_output_____"
]
],
[
[
"\n\n### Test Number Classifier\n\nThe image classification is trained on 10 randomly selected images from the other half of the data set to evaluate the training. Run the classifier test until you observe a misclassified number.",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(10,4))\nfor i in range(10):\n n = np.random.randint(int(n_samples/2),n_samples)\n predict = svc.predict(digits.data[n:n+1])[0]\n plt.subplot(2,5,i+1)\n plt.imshow(digits.images[n], cmap=plt.cm.gray_r, interpolation='nearest')\n plt.text(0,7,'Actual: ' + str(digits.target[n]),color='r')\n plt.text(0,1,'Predict: ' + str(predict),color='b')\n if predict==digits.target[n]:\n plt.text(0,4,'Correct',color='g')\n else:\n plt.text(0,4,'Incorrect',color='orange')\nplt.show()",
"_____no_output_____"
]
],
[
[
"\n\n### Classification with Supervised Learning",
"_____no_output_____"
],
[
"Select data set option with `moons`, `cirlces`, or `blobs`. Run the following cell to generate the data that will be used to test the classifiers.",
"_____no_output_____"
]
],
[
[
"option = 'moons' # moons, circles, or blobs\n\nn = 2000 # number of data points\nX = np.random.random((n,2))\nmixing = 0.0 # add random mixing element to data\nxplot = np.linspace(0,1,100)\nif option=='moons':\n X, y = datasets.make_moons(n_samples=n,noise=0.1)\n yplot = xplot*0.0\nelif option=='circles':\n X, y = datasets.make_circles(n_samples=n,noise=0.1,factor=0.5)\n yplot = xplot*0.0\nelif option=='blobs':\n X, y = datasets.make_blobs(n_samples=n,centers=[[-5,3],[5,-3]],cluster_std=2.0)\n yplot = xplot*0.0\n# Split into train and test subsets (50% each)\nXA, XB, yA, yB = train_test_split(X, y, test_size=0.5, shuffle=False)\n# Plot regression results\ndef assess(P):\n plt.figure()\n plt.scatter(XB[P==1,0],XB[P==1,1],marker='^',color='blue',label='True')\n plt.scatter(XB[P==0,0],XB[P==0,1],marker='x',color='red',label='False')\n plt.scatter(XB[P!=yB,0],XB[P!=yB,1],marker='s',color='orange',\\\n alpha=0.5,label='Incorrect')\n plt.legend()",
"_____no_output_____"
]
],
[
[
"\n\n### S.1 Logistic Regression\n\n**Definition:** Logistic regression is a machine learning algorithm for classification. In this algorithm, the probabilities describing the possible outcomes of a single trial are modelled using a logistic function.\n\n**Advantages:** Logistic regression is designed for this purpose (classification), and is most useful for understanding the influence of several independent variables on a single outcome variable.\n\n**Disadvantages:** Works only when the predicted variable is binary, assumes all predictors are independent of each other, and assumes data is free of missing values.",
"_____no_output_____"
]
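,
[
"For reference (a standard formula, not from the original text), the logistic function that gives the method its name maps any real-valued score to a probability:\n\n$$P(y=1 \\mid x) = \\sigma(w^T x + b), \\qquad \\sigma(z) = \\frac{1}{1+e^{-z}}$$",
"_____no_output_____"
]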
],
[
[
"from sklearn.linear_model import LogisticRegression\nlr = LogisticRegression(solver='lbfgs')\nlr.fit(XA,yA)\nyP = lr.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.2 Naรฏve Bayes\n\n**Definition:** Naive Bayes algorithm based on Bayesโ theorem with the assumption of independence between every pair of features. Naive Bayes classifiers work well in many real-world situations such as document classification and spam filtering.\n\n**Advantages:** This algorithm requires a small amount of training data to estimate the necessary parameters. Naive Bayes classifiers are extremely fast compared to more sophisticated methods.\n\n**Disadvantages:** Naive Bayes is known to be a bad estimator.",
"_____no_output_____"
]
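,
[
"For reference (a standard formula, not from the original text), the independence assumption lets the class posterior factorize over the features, and the prediction is the most probable class:\n\n$$\\hat{y} = \\arg\\max_{y} \\; P(y) \\prod_{i=1}^{n} P(x_i \\mid y)$$",
"_____no_output_____"
]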
],
[
[
"from sklearn.naive_bayes import GaussianNB\nnb = GaussianNB()\nnb.fit(XA,yA)\nyP = nb.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.3 Stochastic Gradient Descent\n\n**Definition:** Stochastic gradient descent is a simple and very efficient approach to fit linear models. It is particularly useful when the number of samples is very large. It supports different loss functions and penalties for classification.\n\n**Advantages:** Efficiency and ease of implementation.\n\n**Disadvantages:** Requires a number of hyper-parameters and it is sensitive to feature scaling.",
"_____no_output_____"
]
],
[
[
"from sklearn.linear_model import SGDClassifier\nsgd = SGDClassifier(loss='modified_huber', shuffle=True,random_state=101)\nsgd.fit(XA,yA)\nyP = sgd.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.4 K-Nearest Neighbours\n\n**Definition:** Neighbours based classification is a type of lazy learning as it does not attempt to construct a general internal model, but simply stores instances of the training data. Classification is computed from a simple majority vote of the k nearest neighbours of each point.\n\n**Advantages:** This algorithm is simple to implement, robust to noisy training data, and effective if training data is large.\n\n**Disadvantages:** Need to determine the value of `K` and the computation cost is high as it needs to computer the distance of each instance to all the training samples. One possible solution to determine `K` is to add a feedback loop to determine the number of neighbors.",
"_____no_output_____"
]
],
[
[
"from sklearn.neighbors import KNeighborsClassifier\nknn = KNeighborsClassifier(n_neighbors=5)\nknn.fit(XA,yA)\nyP = knn.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.5 Decision Tree\n\n**Definition:** Given a data of attributes together with its classes, a decision tree produces a sequence of rules that can be used to classify the data.\n\n**Advantages:** Decision Tree is simple to understand and visualise, requires little data preparation, and can handle both numerical and categorical data.\n\n**Disadvantages:** Decision tree can create complex trees that do not generalise well, and decision trees can be unstable because small variations in the data might result in a completely different tree being generated.",
"_____no_output_____"
]
],
[
[
"from sklearn.tree import DecisionTreeClassifier\ndtree = DecisionTreeClassifier(max_depth=10,random_state=101,\\\n max_features=None,min_samples_leaf=5)\ndtree.fit(XA,yA)\nyP = dtree.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.6 Random Forest\n\n**Definition:** Random forest classifier is a meta-estimator that fits a number of decision trees on various sub-samples of datasets and uses average to improve the predictive accuracy of the model and controls over-fitting. The sub-sample size is always the same as the original input sample size but the samples are drawn with replacement.\n\n**Advantages:** Reduction in over-fitting and random forest classifier is more accurate than decision trees in most cases.\n\n**Disadvantages:** Slow real time prediction, difficult to implement, and complex algorithm.",
"_____no_output_____"
]
],
[
[
"from sklearn.ensemble import RandomForestClassifier\nrfm = RandomForestClassifier(n_estimators=70,oob_score=True,\\\n n_jobs=1,random_state=101,max_features=None,\\\n min_samples_leaf=3) #change min_samples_leaf from 30 to 3\nrfm.fit(XA,yA)\nyP = rfm.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.7 Support Vector Classifier\n\n**Definition:** Support vector machine is a representation of the training data as points in space separated into categories by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall.\n\n**Advantages:** Effective in high dimensional spaces and uses a subset of training points in the decision function so it is also memory efficient.\n\n**Disadvantages:** The algorithm does not directly provide probability estimates, these are calculated using an expensive five-fold cross-validation.",
"_____no_output_____"
]
],
[
[
"from sklearn.svm import SVC\nsvm = SVC(gamma='scale', C=1.0, random_state=101)\nsvm.fit(XA,yA)\nyP = svm.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### S.8 Neural Network\n\nThe `MLPClassifier` implements a multi-layer perceptron (MLP) algorithm that trains using Backpropagation.\n\n**Definition:** A neural network is a set of neurons (activation functions) in layers that are processed sequentially to relate an input to an output.\n\n**Advantages:** Effective in nonlinear spaces where the structure of the relationship is not linear. No prior knowledge or specialized equation structure is defined although there are different network architectures that may lead to a better result.\n\n**Disadvantages:** Neural networks do not extrapolate well outside of the training domain. They may also require longer to train by adjusting the parameter weights to minimize a loss (objective) function. It is also more challenging to explain the outcome of the training and changes in initialization or number of epochs (iterations) may lead to different results. Too many epochs may lead to overfitting, especially if there are excess parameters beyond the minimum needed to capture the input to output relationship.",
"_____no_output_____"
],
[
"\n\nMLP trains on two arrays: array X of size (n_samples, n_features), which holds the training samples represented as floating point feature vectors; and array y of size (n_samples,), which holds the target values (class labels) for the training samples.\nMLP can fit a non-linear model to the training data. clf.coefs_ contains the weight matrices that constitute the model parameters. Currently, MLPClassifier supports only the Cross-Entropy loss function, which allows probability estimates by running the predict_proba method. MLP trains using Backpropagation. More precisely, it trains using some form of gradient descent and the gradients are calculated using Backpropagation. For classification, it minimizes the Cross-Entropy loss function, giving a vector of probability estimates. MLPClassifier supports multi-class classification by applying Softmax as the output function. Further, the model supports multi-label classification in which a sample can belong to more than one class. For each class, the raw output passes through the logistic function. Values larger or equal to 0.5 are rounded to 1, otherwise to 0. For a predicted output of a sample, the indices where the value is 1 represents the assigned classes of that sample.",
"_____no_output_____"
]
],
[
[
"from sklearn.neural_network import MLPClassifier\n\nclf = MLPClassifier(solver='lbfgs',alpha=1e-5,max_iter=200,activation='relu',\\\n hidden_layer_sizes=(10,30,10), random_state=1, shuffle=True)\nclf.fit(XA,yA)\nyP = clf.predict(XB)\nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### Unsupervised Classification\n\nAdditional examples show the potential for unsupervised learning to classify the groups. Unsupervised learning does not use the labels (`True`/`False`) so the results may need to be switched to align with the test set with `if len(XB[yP!=yB]) > n/4: yP = 1 - yP \n`",
"_____no_output_____"
],
[
"\n\n### U.1 K-Means Clustering\n\n**Definition:** Specify how many possible clusters (or K) there are in the dataset. The algorithm then iteratively moves the K-centers and selects the datapoints that are closest to that centroid in the cluster.\n\n**Advantages:** The most common and simplest clustering algorithm.\n\n**Disadvantages:** Must specify the number of clusters although this can typically be determined by increasing the number of clusters until the objective function does not change significantly.",
"_____no_output_____"
]
],
[
[
"from sklearn.cluster import KMeans\nkm = KMeans(n_clusters=2)\nkm.fit(XA)\nyP = km.predict(XB)\nif len(XB[yP!=yB]) > n/4: yP = 1 - yP \nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### U.2 Gaussian Mixture Model\n\n**Definition:** Data points that exist at the boundary of clusters may simply have similar probabilities of being on either clusters. A mixture model predicts a probability instead of a hard classification such as K-Means clustering.\n\n**Advantages:** Incorporates uncertainty into the solution.\n\n**Disadvantages:** Uncertainty may not be desirable for some applications. This method is not as common as the K-Means method for clustering.",
"_____no_output_____"
]
],
[
[
"from sklearn.mixture import GaussianMixture\ngmm = GaussianMixture(n_components=2)\ngmm.fit(XA)\nyP = gmm.predict_proba(XB) # produces probabilities\nif len(XB[np.round(yP[:,0])!=yB]) > n/4: yP = 1 - yP \nassess(np.round(yP[:,0]))",
"_____no_output_____"
]
],
[
[
"\n\n### U.3 Spectral Clustering\n\n**Definition:** Spectral clustering is known as segmentation-based object categorization. It is a technique with roots in graph theory, where identify communities of nodes in a graph are based on the edges connecting them. The method is flexible and allows clustering of non graph data as well.\nIt uses information from the eigenvalues of special matrices built from the graph or the data set. \n\n**Advantages:** Flexible approach for finding clusters when data doesnโt meet the requirements of other common algorithms.\n\n**Disadvantages:** For large-sized graphs, the second eigenvalue of the (normalized) graph Laplacian matrix is often ill-conditioned, leading to slow convergence of iterative eigenvalue solvers. Spectral clustering is computationally expensive unless the graph is sparse and the similarity matrix can be efficiently constructed.",
"_____no_output_____"
]
],
[
[
"from sklearn.cluster import SpectralClustering\nsc = SpectralClustering(n_clusters=2,eigen_solver='arpack',\\\n affinity='nearest_neighbors')\nyP = sc.fit_predict(XB) # No separation between fit and predict calls\n # need to fit and predict on same dataset\nif len(XB[yP!=yB]) > n/4: yP = 1 - yP \nassess(yP)",
"_____no_output_____"
]
],
[
[
"\n\n### TCLab Activity\n\nTrain a classifier to predict if the heater is on (100%) or off (0%). Generate data with 10 minutes of 1 second data. If you do not have a TCLab, use one of the sample data sets.\n\n- [Sample Data Set 1 (10 min)](http://apmonitor.com/do/uploads/Main/tclab_data5.txt): http://apmonitor.com/do/uploads/Main/tclab_data5.txt \n- [Sample Data Set 2 (60 min)](http://apmonitor.com/do/uploads/Main/tclab_data6.txt): http://apmonitor.com/do/uploads/Main/tclab_data6.txt",
"_____no_output_____"
]
],
[
[
"# 10 minute data collection\nimport tclab, time\nimport numpy as np\nimport pandas as pd\nwith tclab.TCLab() as lab:\n n = 600; on=100; t = np.linspace(0,n-1,n) \n Q1 = np.zeros(n); T1 = np.zeros(n)\n Q2 = np.zeros(n); T2 = np.zeros(n) \n Q1[20:41]=on; Q1[60:91]=on; Q1[150:181]=on\n Q1[190:206]=on; Q1[220:251]=on; Q1[260:291]=on\n Q1[300:316]=on; Q1[340:351]=on; Q1[400:431]=on\n Q1[500:521]=on; Q1[540:571]=on; Q1[20:41]=on\n Q1[60:91]=on; Q1[150:181]=on; Q1[190:206]=on\n Q1[220:251]=on; Q1[260:291]=on\n print('Time Q1 Q2 T1 T2')\n for i in range(n):\n T1[i] = lab.T1; T2[i] = lab.T2\n lab.Q1(Q1[i])\n if i%5==0:\n print(int(t[i]),Q1[i],Q2[i],T1[i],T2[i])\n time.sleep(1)\ndata = np.column_stack((t,Q1,Q2,T1,T2))\ndata8 = pd.DataFrame(data,columns=['Time','Q1','Q2','T1','T2'])\ndata8.to_csv('08-tclab.csv',index=False)",
"_____no_output_____"
]
],
[
[
"Use the data file `08-tclab.csv` to train and test the classifier. Select and scale (0-1) the features of the data including `T1`, `T2`, and the 1st and 2nd derivatives of `T1`. Use the measured temperatures, derivatives, and heater value label to create a classifier that predicts when the heater is on or off. Validate the classifier with new data that was not used for training. Starting code is provided below but does not include `T2` as a feature input. **Add `T2` as an input feature to the classifer. Does it improve the classifier performance?**",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib import gridspec\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.model_selection import train_test_split\n\ntry:\n data = pd.read_csv('08-tclab.csv')\nexcept:\n print('Warning: Unable to load 08-tclab.csv, using online data')\n url = 'http://apmonitor.com/do/uploads/Main/tclab_data5.txt'\n data = pd.read_csv(url)\n \n# Input Features: Temperature and 1st / 2nd Derivatives\n# Cubic polynomial fit of temperature using 10 data points\ndata['dT1'] = np.zeros(len(data))\ndata['d2T1'] = np.zeros(len(data))\nfor i in range(len(data)):\n if i<len(data)-10:\n x = data['Time'][i:i+10]-data['Time'][i]\n y = data['T1'][i:i+10]\n p = np.polyfit(x,y,3)\n # evaluate derivatives at mid-point (5 sec)\n t = 5.0\n data['dT1'][i] = 3.0*p[0]*t**2 + 2.0*p[1]*t+p[2]\n data['d2T1'][i] = 6.0*p[0]*t + 2.0*p[1]\n else:\n data['dT1'][i] = np.nan\n data['d2T1'][i] = np.nan\n\n# Remove last 10 values\nX = np.array(data[['T1','dT1','d2T1']][0:-10])\ny = np.array(data[['Q1']][0:-10])\n\n# Scale data\n# Input features (Temperature and 2nd derivative at 5 sec)\ns1 = MinMaxScaler(feature_range=(0,1))\nXs = s1.fit_transform(X)\n# Output labels (heater On / Off)\nys = [True if y[i]>50.0 else False for i in range(len(y))]\n\n# Split into train and test subsets (50% each)\nXA, XB, yA, yB = train_test_split(Xs, ys, \\\n test_size=0.5, shuffle=False)\n\n# Supervised Classification\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.naive_bayes import GaussianNB\nfrom sklearn.linear_model import SGDClassifier\nfrom sklearn.neighbors import KNeighborsClassifier\nfrom sklearn.tree import DecisionTreeClassifier\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.svm import SVC\nfrom sklearn.neural_network import MLPClassifier\n\n# Create supervised classification models\nlr = LogisticRegression(solver='lbfgs') # Logistic Regression\nnb = GaussianNB() # Naรฏve Bayes\nsgd = SGDClassifier(loss='modified_huber', shuffle=True,\\\n random_state=101) # Stochastic Gradient Descent\nknn = KNeighborsClassifier(n_neighbors=5) # K-Nearest Neighbors\ndtree = DecisionTreeClassifier(max_depth=10,random_state=101,\\\n max_features=None,min_samples_leaf=5) # Decision Tree\nrfm = RandomForestClassifier(n_estimators=70,oob_score=True,n_jobs=1,\\\n random_state=101,max_features=None,min_samples_leaf=3) # Random Forest\nsvm = SVC(gamma='scale', C=1.0, random_state=101) # Support Vector Classifier\nclf = MLPClassifier(solver='lbfgs',alpha=1e-5,max_iter=200,\\\n activation='relu',hidden_layer_sizes=(10,30,10),\\\n random_state=1, shuffle=True) # Neural Network\nmodels = [lr,nb,sgd,knn,dtree,rfm,svm,clf]\n\n# Supervised learning\nyP = [None]*(len(models)+3) # 3 for unsupervised learning\nfor i,m in enumerate(models):\n m.fit(XA,yA)\n yP[i] = m.predict(XB)\n\n# Unsupervised learning modules\nfrom sklearn.cluster import KMeans\nfrom sklearn.mixture import GaussianMixture\nfrom sklearn.cluster import SpectralClustering\nkm = KMeans(n_clusters=2)\ngmm = GaussianMixture(n_components=2)\nsc = SpectralClustering(n_clusters=2,eigen_solver='arpack',\\\n affinity='nearest_neighbors')\nkm.fit(XA)\nyP[8] = km.predict(XB)\ngmm.fit(XA)\nyP[9] = gmm.predict_proba(XB)[:,0]\nyP[10] = sc.fit_predict(XB)\n\nplt.figure(figsize=(10,7))\ngs = gridspec.GridSpec(3, 1, height_ratios=[1,1,5])\nplt.subplot(gs[0])\nplt.plot(data['Time']/60,data['T1'],'r-',\\\n label='Temperature (ยฐC)')\nplt.ylabel('T 
(ยฐC)')\nplt.legend()\nplt.subplot(gs[1])\nplt.plot(data['Time']/60,data['dT1'],'b:',\\\n label='dT/dt (ยฐC/sec)') \nplt.plot(data['Time']/60,data['d2T1'],'k--',\\\n label=r'$d^2T/dt^2$ ($ยฐC^2/sec^2$)')\nplt.ylabel('Derivatives')\nplt.legend()\n\nplt.subplot(gs[2])\nplt.plot(data['Time']/60,data['Q1']/100,'k-',\\\n label='Heater (On=1/Off=0)')\n\nt2 = data['Time'][len(yA):-10].values\ndesc = ['Logistic Regression','Naรฏve Bayes','Stochastic Gradient Descent',\\\n 'K-Nearest Neighbors','Decision Tree','Random Forest',\\\n 'Support Vector Classifier','Neural Network',\\\n 'K-Means Clustering','Gaussian Mixture Model','Spectral Clustering']\nfor i in range(11):\n plt.plot(t2/60,yP[i]-i-1,label=desc[i])\n\nplt.ylabel('Heater')\nplt.legend()\n\nplt.xlabel(r'Time (min)')\nplt.legend()\nplt.show()",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
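"code",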
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d0646654843a52d1c0a410b3f80fe5892535ad2c | 6,094 | ipynb | Jupyter Notebook | reg-linear/Bonus/Simulador Interativo.ipynb | DiegoVialle/Regressao-Linear-Testando-Relacoes-e-Prevendo-Resultados | 3c185aefc9ab4fbcee98efb294c7637eb4f594e5 | [
"MIT"
]
| null | null | null | reg-linear/Bonus/Simulador Interativo.ipynb | DiegoVialle/Regressao-Linear-Testando-Relacoes-e-Prevendo-Resultados | 3c185aefc9ab4fbcee98efb294c7637eb4f594e5 | [
"MIT"
]
| null | null | null | reg-linear/Bonus/Simulador Interativo.ipynb | DiegoVialle/Regressao-Linear-Testando-Relacoes-e-Prevendo-Resultados | 3c185aefc9ab4fbcee98efb294c7637eb4f594e5 | [
"MIT"
]
| null | null | null | 32.414894 | 1,066 | 0.564654 | [
[
[
"<h1 style='color: green; font-size: 36px; font-weight: bold;'>Data Science - Regressรฃo Linear</h1>",
"_____no_output_____"
],
[
"# <font color='red' style='font-size: 30px;'>Bรดnus</font>\n<hr style='border: 2px solid red;'>",
"_____no_output_____"
],
[
"## Importando nosso modelo",
"_____no_output_____"
]
],
[
[
"import pickle\n\nmodelo = open('../Exercicio/modelo_preรงo','rb')\nlm_new = pickle.load(modelo)\nmodelo.close()\n\narea = 38\ngaragem = 2\nbanheiros = 4\nlareira = 4\nmarmore = 0\nandares = 1\n\nentrada = [[area, garagem, banheiros, lareira, marmore, andares]]\n\nprint('$ {0:.2f}'.format(lm_new.predict(entrada)[0]))",
"_____no_output_____"
]
],
[
[
"## Exemplo de um simulador interativo para Jupyter\n\nhttps://ipywidgets.readthedocs.io/en/stable/index.html\n\nhttps://github.com/jupyter-widgets/ipywidgets",
"_____no_output_____"
]
],
[
[
"# Importando bibliotecas\nfrom ipywidgets import widgets, HBox, VBox\nfrom IPython.display import display\n\n# Criando os controles do formulรกrio\narea = widgets.Text(description=\"รrea\")\ngaragem = widgets.Text(description=\"Garagem\")\nbanheiros = widgets.Text(description=\"Banheiros\")\nlareira = widgets.Text(description=\"Lareira\")\nmarmore = widgets.Text(description=\"Mรกrmore?\")\nandares = widgets.Text(description=\"Andares?\")\n\nbotao = widgets.Button(description=\"Simular\")\n\n# Posicionando os controles\nleft = VBox([area, banheiros, marmore])\nright = VBox([garagem, lareira, andares])\ninputs = HBox([left, right])\n\n# Funรงรฃo de simulaรงรฃo\ndef simulador(sender):\n entrada=[[\n float(area.value if area.value else 0), \n float(garagem.value if garagem.value else 0), \n float(banheiros.value if banheiros.value else 0), \n float(lareira.value if lareira.value else 0), \n float(marmore.value if marmore.value else 0), \n float(andares.value if andares.value else 0)\n ]]\n print('$ {0:.2f}'.format(lm_new.predict(entrada)[0]))\n \n# Atribuindo a funรงรฃo \"simulador\" ao evento click do botรฃo\nbotao.on_click(simulador) ",
"_____no_output_____"
],
[
"display(inputs, botao)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d064705e51ef7eddc4d0656b9a41f9386500351e | 11,423 | ipynb | Jupyter Notebook | albert-base/albert-baseline.ipynb | shanayghag/AV-Janatahack-Independence-Day-2020-ML-Hackathon | 410c549488b0e2ceece067a9e1581e182a11e885 | [
"MIT"
]
| 6 | 2020-08-26T13:00:11.000Z | 2021-12-28T18:58:43.000Z | albert-base/albert-baseline.ipynb | shanayghag/AV-Janatahack-Independence-Day-2020-ML-Hackathon | 410c549488b0e2ceece067a9e1581e182a11e885 | [
"MIT"
]
| null | null | null | albert-base/albert-baseline.ipynb | shanayghag/AV-Janatahack-Independence-Day-2020-ML-Hackathon | 410c549488b0e2ceece067a9e1581e182a11e885 | [
"MIT"
]
| 1 | 2020-08-24T08:34:19.000Z | 2020-08-24T08:34:19.000Z | 11,423 | 11,423 | 0.670927 | [
[
[
"## Installing & importing necsessary libs",
"_____no_output_____"
]
],
[
[
"!pip install -q transformers",
"_____no_output_____"
],
[
"import numpy as np\nimport pandas as pd\nfrom sklearn import metrics\nimport transformers\nimport torch\nfrom torch.utils.data import Dataset, DataLoader, RandomSampler, SequentialSampler\nfrom transformers import AlbertTokenizer, AlbertModel, AlbertConfig\nfrom tqdm.notebook import tqdm\nfrom transformers import get_linear_schedule_with_warmup",
"_____no_output_____"
],
[
"device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\nn_gpu = torch.cuda.device_count()\ntorch.cuda.get_device_name(0)",
"_____no_output_____"
]
],
[
[
"## Data Preprocessing",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv(\"../input/avjantahack/data/train.csv\")\ndf['list'] = df[df.columns[3:]].values.tolist()\nnew_df = df[['ABSTRACT', 'list']].copy()\nnew_df.head()",
"_____no_output_____"
]
],
[
[
"## Model configurations",
"_____no_output_____"
]
],
[
[
"# Defining some key variables that will be used later on in the training\nMAX_LEN = 512\nTRAIN_BATCH_SIZE = 16\nVALID_BATCH_SIZE = 8\nEPOCHS = 5\nLEARNING_RATE = 3e-05\ntokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')",
"_____no_output_____"
]
],
[
[
"## Custom Dataset Class",
"_____no_output_____"
]
],
[
[
"class CustomDataset(Dataset):\n\n def __init__(self, dataframe, tokenizer, max_len):\n self.tokenizer = tokenizer\n self.data = dataframe\n self.abstract = dataframe.ABSTRACT\n self.targets = self.data.list\n self.max_len = max_len\n\n def __len__(self):\n return len(self.abstract)\n\n def __getitem__(self, index):\n abstract = str(self.abstract[index])\n abstract = \" \".join(abstract.split())\n\n inputs = self.tokenizer.encode_plus(\n abstract,\n None,\n add_special_tokens = True,\n max_length = self.max_len,\n pad_to_max_length = True,\n return_token_type_ids=True,\n truncation = True\n )\n\n ids = inputs['input_ids']\n mask = inputs['attention_mask']\n token_type_ids = inputs['token_type_ids']\n\n return{\n 'ids': torch.tensor(ids, dtype=torch.long),\n 'mask': torch.tensor(mask, dtype=torch.long),\n 'token_type_ids': torch.tensor(token_type_ids, dtype=torch.long),\n 'targets': torch.tensor(self.targets[index], dtype=torch.float)\n }",
"_____no_output_____"
],
[
"train_size = 0.8\ntrain_dataset=new_df.sample(frac=train_size,random_state=200)\ntest_dataset=new_df.drop(train_dataset.index).reset_index(drop=True)\ntrain_dataset = train_dataset.reset_index(drop=True)\n\n\nprint(\"FULL Dataset: {}\".format(new_df.shape))\nprint(\"TRAIN Dataset: {}\".format(train_dataset.shape))\nprint(\"TEST Dataset: {}\".format(test_dataset.shape))\n\ntraining_set = CustomDataset(train_dataset, tokenizer, MAX_LEN)\ntesting_set = CustomDataset(test_dataset, tokenizer, MAX_LEN)",
"_____no_output_____"
],
[
"train_params = {'batch_size': TRAIN_BATCH_SIZE,\n 'shuffle': True,\n 'num_workers': 0\n }\n\ntest_params = {'batch_size': VALID_BATCH_SIZE,\n 'shuffle': True,\n 'num_workers': 0\n }\n\ntraining_loader = DataLoader(training_set, **train_params)\ntesting_loader = DataLoader(testing_set, **test_params)",
"_____no_output_____"
]
],
[
[
"## Albert model",
"_____no_output_____"
]
],
[
[
"class AlbertClass(torch.nn.Module):\n def __init__(self):\n super(AlbertClass, self).__init__()\n self.albert = transformers.AlbertModel.from_pretrained('albert-base-v2')\n self.drop = torch.nn.Dropout(0.1)\n self.linear = torch.nn.Linear(768, 6)\n \n def forward(self, ids, mask, token_type_ids):\n _, output= self.albert(ids, attention_mask = mask)\n output = self.drop(output)\n output = self.linear(output)\n\n return output\n\nmodel = AlbertClass()\nmodel.to(device)",
"_____no_output_____"
]
],
[
[
"## Hyperparameters & Loss function",
"_____no_output_____"
]
],
[
[
"def loss_fn(outputs, targets):\n return torch.nn.BCEWithLogitsLoss()(outputs, targets)",
"_____no_output_____"
],
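[
"# A minimal sketch of what the loss above computes, assuming dummy logits and\n# labels (6 tags, matching the model output). BCEWithLogitsLoss applies the\n# sigmoid internally, so the model should return raw logits.\nexample_logits = torch.tensor([[2.0, -1.0, 0.5, -2.0, 1.0, 0.0]])\nexample_labels = torch.tensor([[1., 0., 1., 0., 1., 0.]])\nprint(loss_fn(example_logits, example_labels))",
"_____no_output_____"
],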
[
"param_optimizer = list(model.named_parameters())\nno_decay = [\"bias\", \"LayerNorm.bias\", \"LayerNorm.weight\"]\noptimizer_parameters = [\n {\n \"params\": [\n p for n, p in param_optimizer if not any(nd in n for nd in no_decay)\n ],\n \"weight_decay\": 0.001,\n },\n {\n \"params\": [\n p for n, p in param_optimizer if any(nd in n for nd in no_decay)\n ],\n \"weight_decay\": 0.0,\n },\n]\n\noptimizer = torch.optim.AdamW(optimizer_parameters, lr=1e-5)\nnum_training_steps = int(len(train_dataset) / TRAIN_BATCH_SIZE * EPOCHS)\n\nscheduler = get_linear_schedule_with_warmup(\n optimizer,\n num_warmup_steps = 0,\n num_training_steps = num_training_steps\n)",
"_____no_output_____"
]
],
[
[
"## Train & Eval Functions\n\n",
"_____no_output_____"
]
],
[
[
"def train(epoch):\n model.train()\n for _,data in tqdm(enumerate(training_loader, 0), total=len(training_loader)):\n ids = data['ids'].to(device, dtype = torch.long)\n mask = data['mask'].to(device, dtype = torch.long)\n token_type_ids = data['token_type_ids'].to(device, dtype = torch.long)\n targets = data['targets'].to(device, dtype = torch.float)\n\n outputs = model(ids, mask, token_type_ids)\n\n optimizer.zero_grad()\n loss = loss_fn(outputs, targets)\n if _%1000==0:\n print(f'Epoch: {epoch}, Loss: {loss.item()}')\n \n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n scheduler.step()\n\ndef validation(epoch):\n model.eval()\n fin_targets=[]\n fin_outputs=[]\n with torch.no_grad():\n for _, data in tqdm(enumerate(testing_loader, 0), total=len(testing_loader)):\n ids = data['ids'].to(device, dtype = torch.long)\n mask = data['mask'].to(device, dtype = torch.long)\n token_type_ids = data['token_type_ids'].to(device, dtype = torch.long)\n\n targets = data['targets'].to(device, dtype = torch.float)\n outputs = model(ids, mask, token_type_ids)\n fin_targets.extend(targets.cpu().detach().numpy().tolist())\n fin_outputs.extend(torch.sigmoid(outputs).cpu().detach().numpy().tolist())\n return fin_outputs, fin_targets",
"_____no_output_____"
]
],
[
[
"## Training Model",
"_____no_output_____"
]
],
[
[
"MODEL_PATH = \"/kaggle/working/albert-multilabel-model.bin\"\nbest_micro = 0\nfor epoch in range(EPOCHS):\n train(epoch)\n outputs, targets = validation(epoch)\n outputs = np.array(outputs) >= 0.5\n accuracy = metrics.accuracy_score(targets, outputs)\n f1_score_micro = metrics.f1_score(targets, outputs, average='micro')\n f1_score_macro = metrics.f1_score(targets, outputs, average='macro')\n print(f\"Accuracy Score = {accuracy}\")\n print(f\"F1 Score (Micro) = {f1_score_micro}\")\n print(f\"F1 Score (Macro) = {f1_score_macro}\")\n if f1_score_micro > best_micro:\n torch.save(model.state_dict(), MODEL_PATH)\n best_micro = f1_score_micro",
"_____no_output_____"
],
[
"def predict(id, abstract):\n MAX_LENGTH = 512\n inputs = tokenizer.encode_plus(\n abstract,\n None,\n add_special_tokens=True,\n max_length=512,\n pad_to_max_length=True,\n return_token_type_ids=True,\n truncation = True\n )\n \n ids = inputs['input_ids']\n mask = inputs['attention_mask']\n token_type_ids = inputs['token_type_ids']\n\n ids = torch.tensor(ids, dtype=torch.long).unsqueeze(0)\n mask = torch.tensor(mask, dtype=torch.long).unsqueeze(0)\n token_type_ids = torch.tensor(token_type_ids, dtype=torch.long).unsqueeze(0)\n\n ids = ids.to(device)\n mask = mask.to(device)\n token_type_ids = token_type_ids.to(device)\n\n with torch.no_grad():\n outputs = model(ids, mask, token_type_ids)\n\n outputs = torch.sigmoid(outputs).squeeze()\n outputs = np.round(outputs.cpu().numpy())\n \n out = np.insert(outputs, 0, id)\n return out",
"_____no_output_____"
],
[
"def submit():\n test_df = pd.read_csv('../input/avjantahack/data/test.csv')\n sample_submission = pd.read_csv('../input/avjantahack/data/sample_submission_UVKGLZE.csv')\n\n y = []\n for id, abstract in tqdm(zip(test_df['ID'], test_df['ABSTRACT']),\n total=len(test_df)):\n out = predict(id, abstract)\n y.append(out)\n y = np.array(y)\n submission = pd.DataFrame(y, columns=sample_submission.columns).astype(int)\n return submission",
"_____no_output_____"
],
[
"submission = submit()\nsubmission",
"_____no_output_____"
],
[
"submission.to_csv('/kaggle/working/alberta-tuned-lr-ws-dr.csv', index=False)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d06476f82b77f6e5bcbde9a83435ff7ac540991b | 1,585 | ipynb | Jupyter Notebook | chapter10/05_transfer_learning.ipynb | PacktPublishing/Mastering-Azure-Machine-Learning-Second-Edition | 1ca0cf19fd49f6781c589ae4d9bde56135791cf1 | [
"MIT"
]
| 1 | 2022-03-07T20:15:08.000Z | 2022-03-07T20:15:08.000Z | chapter10/05_transfer_learning.ipynb | PacktPublishing/Mastering-Azure-Machine-Learning-Second-Edition | 1ca0cf19fd49f6781c589ae4d9bde56135791cf1 | [
"MIT"
]
| null | null | null | chapter10/05_transfer_learning.ipynb | PacktPublishing/Mastering-Azure-Machine-Learning-Second-Edition | 1ca0cf19fd49f6781c589ae4d9bde56135791cf1 | [
"MIT"
]
| 1 | 2022-03-22T17:57:41.000Z | 2022-03-22T17:57:41.000Z | 20.855263 | 105 | 0.548265 | [
[
[
"import keras",
"_____no_output_____"
],
[
"from keras.applications.resnet50 import ResNet50\n\nnum_classes = 10\ninput_shape = (224, 224, 3)\n\n# create the base pre-trained model\nbase_model = ResNet50(input_shape=input_shape, weights='imagenet', include_top=False,pooling='avg')",
"_____no_output_____"
],
[
"for layer in base_model.layers:\n layer.trainable=False",
"_____no_output_____"
],
[
"from keras.models import Model\nfrom keras.layers import Flatten, Dense\n\nclf = base_model.output\nclf = Dense(256, activation='relu')(clf)\nclf = Dense(10, activation='softmax')(clf)\n\nmodel = Model(base_model.input, clf)",
"_____no_output_____"
]
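,
[
"# A minimal sketch of compiling the model for training; the optimizer and\n# loss choices are assumptions (categorical_crossentropy expects one-hot\n# labels with num_classes entries).\nmodel.compile(optimizer='adam',\n              loss='categorical_crossentropy',\n              metrics=['accuracy'])\nmodel.summary()",
"_____no_output_____"
]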
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code"
]
]
|
d06483d677627cbca55508862a20d7391a6436a3 | 20,102 | ipynb | Jupyter Notebook | AS2520 Propulsion Lab/Experiment 6 - Droplet Evaporation/re-work-notebook.ipynb | kirtan2605/Coursework-Codes | f979b45a608a3420a107de99fc6eb6acefcadf87 | [
"MIT"
]
| null | null | null | AS2520 Propulsion Lab/Experiment 6 - Droplet Evaporation/re-work-notebook.ipynb | kirtan2605/Coursework-Codes | f979b45a608a3420a107de99fc6eb6acefcadf87 | [
"MIT"
]
| null | null | null | AS2520 Propulsion Lab/Experiment 6 - Droplet Evaporation/re-work-notebook.ipynb | kirtan2605/Coursework-Codes | f979b45a608a3420a107de99fc6eb6acefcadf87 | [
"MIT"
]
| null | null | null | 20,102 | 20,102 | 0.717988 | [
[
[
"# Droplet Evaporation",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom scipy import optimize",
"_____no_output_____"
],
[
"# Ethyl Acetate\n#time_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\n#diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\n\n# Gasoline\n#time_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\n#diameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])",
"_____no_output_____"
]
],
[
[
"# Ethyl Acetate",
"_____no_output_____"
]
],
[
[
"time_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\ndiameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])",
"_____no_output_____"
],
[
"x = time_in_sec.tolist()\ny = diameter.tolist()\n\npolynomial_coeff_1=np.polyfit(x,y,1)\npolynomial_coeff_2=np.polyfit(x,y,2)\npolynomial_coeff_3=np.polyfit(x,y,3)\n\nxnew=np.linspace(0,110 ,100)\nynew_1=np.poly1d(polynomial_coeff_1)\nynew_2=np.poly1d(polynomial_coeff_2)\nynew_3=np.poly1d(polynomial_coeff_3)\n\nplt.plot(x,y,'o')\nplt.plot(xnew,ynew_1(xnew))\nplt.plot(xnew,ynew_2(xnew))\nplt.plot(xnew,ynew_3(xnew))\nprint(ynew_1)\nprint(ynew_2)\nprint(ynew_3)\nplt.title(\"Diameter vs Time(s)\")\nplt.xlabel(\"Time(s)\")\nplt.ylabel(\"Diameter\")\n\nplt.show()\n\n\n# Coeficients\n# LINEAR : -0.02386 x + 3.139\n# QUADRATIC : -0.0002702 x^2 + 0.005868 x + 2.619\n# CUBIC : -4.771e-07 x^3 - 0.0001915 x^2 + 0.002481 x + 2.646\n#\n# Using Desmos to find the roots of the best fit polynomials\n# Root of linear fit = 131.559\n# Root of quadratic fit = 109.908\n# Root of cubic fit = 109.414",
"_____no_output_____"
],
[
"def d_square_law(x, C, n):\n y = C/(x**n)\n return y",
"_____no_output_____"
]
],
[
[
"# Linear Fit",
"_____no_output_____"
]
],
[
[
"# Calculating time taken for vaporization for different diameters. (LINEAR FIT)\ndiameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\ntime_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\nt_vap = time_in_sec\nt_vap = t_vap*0\nt_vap = t_vap + 131.559\nt_vap = t_vap - time_in_sec\nprint(t_vap.tolist())",
"_____no_output_____"
],
[
"# Finding C and n for d-square law\n#initial_diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\n#vap_time = np.array([109.908, 104.908, 99.908, 94.908, 89.908, 84.908, 79.908, 74.908, 69.908, 64.908, 59.908, 54.908, 49.908, 44.908, 39.908, 34.908, 29.908, 24.908, 19.908, 14.908000000000001, 9.908000000000001, 4.908000000000001, -0.09199999999999875])\n\n# Linear \ninitial_diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\nvap_time_lin = np.array([131.559, 126.559, 121.559, 116.559, 111.559, 106.559, 101.559, 96.559, 91.559, 86.559, 81.559, 76.559, 71.559, 66.559, 61.559, 56.559, 51.559, 46.559, 41.559, 36.559, 31.558999999999997, 26.558999999999997, 21.558999999999997])",
"_____no_output_____"
],
[
"# Linear\nparameters_lin = optimize.curve_fit(d_square_law, xdata = initial_diameter, ydata = vap_time_lin)[0]\nprint(\"Linear : \",parameters_lin)\n#C = parameters_lin[0]\n#n = parameters_lin[1]",
"_____no_output_____"
]
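,
[
"# A minimal sketch using the fitted (C, n) to predict the vaporization time\n# of a new droplet; the 1.5 mm initial diameter is an assumed example value.\nC_fit, n_fit = parameters_lin\nprint('Predicted vaporization time for d0 = 1.5 mm: {:.1f} s'.format(d_square_law(1.5, C_fit, n_fit)))",
"_____no_output_____"
]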
],
[
[
"# Quadratic Fit",
"_____no_output_____"
]
],
[
[
"# Calculating time taken for vaporization for different diameters. (QUADRATIC FIT)\ndiameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\ntime_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\nt_vap = time_in_sec\nt_vap = t_vap*0\nt_vap = t_vap + 109.908\nt_vap = t_vap - time_in_sec\nprint(t_vap.tolist())",
"_____no_output_____"
],
[
"# Quadratic Fit\ninitial_diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372])\nvap_time_quad = np.array([109.908, 104.908, 99.908, 94.908, 89.908, 84.908, 79.908, 74.908, 69.908, 64.908, 59.908, 54.908, 49.908, 44.908, 39.908, 34.908, 29.908, 24.908, 19.908, 14.908000000000001, 9.908000000000001, 4.908000000000001])\n",
"_____no_output_____"
],
[
"# Quadratic\nparameters_quad = optimize.curve_fit(d_square_law, xdata = initial_diameter, ydata = vap_time_quad)[0]\nprint(\"Linear : \",parameters_quad)\n#C = parameters_lin[0]\n#n = parameters_lin[1]",
"_____no_output_____"
]
],
[
[
"# Ethyl Acetate - After finding d-square Law",
"_____no_output_____"
]
],
[
[
"# Linear\nC = 41.72856231\nn = -0.97941652\n\n# Quadratic\n# C = 11.6827828\n# n = -2.13925924\n\n\n\nx = vap_time.tolist()\ny = initial_diameter.tolist()\n\nynew=np.linspace(0,3 ,100)\nxnew=[]\nfor item in ynew:\n v1 = C/(item**n)\n xnew.append(v1)\n \nplt.plot(x,y,'o')\nplt.plot(xnew,ynew)\nplt.title(\"Initial Diameter vs Vaporization Time(s)\")\nplt.xlabel(\"Vaporization Time(s)\")\nplt.ylabel(\"Initial Diameter\")\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Gasoline",
"_____no_output_____"
]
],
[
[
"time_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\ndiameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])",
"_____no_output_____"
],
[
"x = time_in_min.tolist()\ny = diameter.tolist()\n\npolynomial_coeff_1=np.polyfit(x,y,1)\npolynomial_coeff_2=np.polyfit(x,y,2)\npolynomial_coeff_3=np.polyfit(x,y,3)\n\nxnew=np.linspace(0,300 ,100)\nynew_1=np.poly1d(polynomial_coeff_1)\nynew_2=np.poly1d(polynomial_coeff_2)\nynew_3=np.poly1d(polynomial_coeff_3)\n\nplt.plot(x,y,'o')\nplt.plot(xnew,ynew_1(xnew))\nplt.plot(xnew,ynew_2(xnew))\nplt.plot(xnew,ynew_3(xnew))\nprint(ynew_1)\nprint(ynew_2)\nprint(ynew_3)\nplt.title(\"Diameter vs Time(min)\")\nplt.xlabel(\"Time(min)\")\nplt.ylabel(\"Diameter\")\n\nplt.show()\n\n\n# Coeficients\n# LINEAR : -0.005637 x + 2.074\n# QUADRATIC : -6.67e-06 x^2 - 0.003865 x + 2\n# CUBIC : 1.481e-07 x^3 - 6.531e-05 x^2 + 0.00207 x + 1.891\n#\n# Using Desmos to find the roots of the best fit polynomials\n# Root of linear fit = 367.926\n# Root of quadratic fit = 329.781\n# Root of cubic fit = No Positive Root",
"_____no_output_____"
]
],
[
[
"# Linear Fit",
"_____no_output_____"
]
],
[
[
"# Calculating time taken for vaporization for different diameters. (LINEAR FIT)\ntime_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\ndiameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nt_vap = time_in_min\nt_vap = t_vap*0\nt_vap = t_vap + 367.926\nt_vap = t_vap - time_in_min\nprint(t_vap.tolist())",
"_____no_output_____"
],
[
"initial_diameter_g_lin = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nvap_time_g_lin = np.array([367.926, 352.926, 337.926, 322.926, 307.926, 292.926, 277.926, 262.926, 247.926, 232.926, 217.926, 202.926, 187.926, 157.926, 132.926, 117.92599999999999, 102.92599999999999])",
"_____no_output_____"
],
[
"parameters_g_lin = optimize.curve_fit(d_square_law, xdata = initial_diameter_g_lin, ydata = vap_time_g_lin)[0]\nprint(parameters_g_lin)\nC_g = parameters_g_lin[0]\nn_g = parameters_g_lin[1]",
"_____no_output_____"
]
],
[
[
"# Quadratic Fit",
"_____no_output_____"
]
],
[
[
"# Calculating time taken for vaporization for different diameters.\ntime_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\ndiameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nt_vap = time_in_min\nt_vap = t_vap*0\nt_vap = t_vap + 329.781\nt_vap = t_vap - time_in_min\nprint(t_vap.tolist())",
"_____no_output_____"
],
[
"initial_diameter_g_quad = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nvap_time_g_quad = np.array([329.781, 314.781, 299.781, 284.781, 269.781, 254.781, 239.781, 224.781, 209.781, 194.781, 179.781, 164.781, 149.781, 119.781, 94.781, 79.781, 64.781])",
"_____no_output_____"
],
[
"parameters_g_quad = optimize.curve_fit(d_square_law, xdata = initial_diameter_g_quad, ydata = vap_time_g_quad)[0]\nprint(parameters_g_quad)\nC_g = parameters_g_quad[0]\nn_g = parameters_g_quad[1]",
"_____no_output_____"
]
],
[
[
"# Gasoline - After finding Vaporization Time Data",
"_____no_output_____"
]
],
[
[
"#Linear \nC_g = 140.10666889\nn_g = -1.1686059 \n\n# Quadratic\nC_g = 140.10666889\nn_g = -1.1686059 \n\nx_g = vap_time_g.tolist()\ny_g = initial_diameter_g.tolist()\n\nynew_g=np.linspace(0,2.2 ,100)\nxnew_g=[]\nfor item in ynew_g:\n v1 = C_g/(item**n_g)\n xnew_g.append(v1)\nprint(ynew_g)\nprint(xnew_g)\n \nplt.plot(x_g,y_g,'o')\nplt.plot(xnew_g,ynew_g)\nplt.title(\"Initial Diameter vs Vaporization Time(min)\")\nplt.xlabel(\"Vaporization Time(min)\")\nplt.ylabel(\"Initial Diameter\")\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Optimization Methods (IGNORE)",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom scipy import optimize\nimport matplotlib.pyplot as plt\n\nplt.style.use('seaborn-poster')",
"_____no_output_____"
],
[
"time_in_sec = np.array([5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\ndiameter = np.array([2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])",
"_____no_output_____"
],
[
"def func(x, a, b):\n y = a/(x**b) \n return y\n\nparameters = optimize.curve_fit(func, xdata = time_in_sec, ydata = diameter)[0]\nprint(parameters)\nC = parameters[0]\nn = parameters[1]",
"_____no_output_____"
],
[
"plt.plot(time_in_sec,diameter,'o',label='data')\ny_new = []\nfor val in time_in_sec:\n v1 = C/(val**n)\n y_new.append(v1)\nplt.plot(time_in_sec,y_new,'-',label='fit')",
"_____no_output_____"
],
[
"log_time = np.log(time_in_min)\nlog_d = np.log(diameter)\nprint(log_d)\nprint(log_time)\nx = log_time.tolist()\ny = log_d.tolist()\npolynomial_coeff=np.polyfit(x,y,1)\nxnew=np.linspace(2.5,6,100)\nynew=np.poly1d(polynomial_coeff)\nplt.plot(xnew,ynew(xnew),x,y,'o')\nprint(ynew)\nplt.title(\"log(diameter) vs log(Time(s))\")\nplt.xlabel(\"log(Time(s))\")\nplt.ylabel(\"log(diameter)\")\n\nplt.show()",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
]
|
d064867b0d95195803807b55ac9a5c83deb05dbf | 26,902 | ipynb | Jupyter Notebook | Special. NLP_with_BERT.ipynb | Samrath49/AI_ML_DL | f5427adca5d914a7b69e11b578706cc7d21d3d56 | [
"Unlicense"
]
| null | null | null | Special. NLP_with_BERT.ipynb | Samrath49/AI_ML_DL | f5427adca5d914a7b69e11b578706cc7d21d3d56 | [
"Unlicense"
]
| null | null | null | Special. NLP_with_BERT.ipynb | Samrath49/AI_ML_DL | f5427adca5d914a7b69e11b578706cc7d21d3d56 | [
"Unlicense"
]
| null | null | null | 56.875264 | 451 | 0.597056 | [
[
[
"<a href=\"https://colab.research.google.com/github/Samrath49/AI_ML_DL/blob/main/Special.%20NLP_with_BERT.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# NLP with Bert for Sentiment Analysis",
"_____no_output_____"
],
[
"### Importing Libraries",
"_____no_output_____"
]
],
[
[
"!pip3 install ktrain ",
"Collecting ktrain\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/e1/3c/8469632f3fa51f244ce35ac184de4c55a260dccfcb7386529faf82ebf60f/ktrain-0.25.4.tar.gz (25.3MB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 25.3MB 133kB/s \n\u001b[?25hCollecting scikit-learn==0.23.2\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f4/cb/64623369f348e9bfb29ff898a57ac7c91ed4921f228e9726546614d63ccb/scikit_learn-0.23.2-cp37-cp37m-manylinux1_x86_64.whl (6.8MB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 6.8MB 48.5MB/s \n\u001b[?25hRequirement already satisfied: matplotlib>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from ktrain) (3.2.2)\nRequirement already satisfied: pandas>=1.0.1 in /usr/local/lib/python3.7/dist-packages (from ktrain) (1.1.5)\nRequirement already satisfied: fastprogress>=0.1.21 in /usr/local/lib/python3.7/dist-packages (from ktrain) (1.0.0)\nRequirement already satisfied: requests in /usr/local/lib/python3.7/dist-packages (from ktrain) (2.23.0)\nRequirement already satisfied: joblib in /usr/local/lib/python3.7/dist-packages (from ktrain) (1.0.1)\nRequirement already satisfied: packaging in /usr/local/lib/python3.7/dist-packages (from ktrain) (20.9)\nRequirement already satisfied: ipython in /usr/local/lib/python3.7/dist-packages (from ktrain) (5.5.0)\nCollecting langdetect\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/56/a3/8407c1e62d5980188b4acc45ef3d94b933d14a2ebc9ef3505f22cf772570/langdetect-1.0.8.tar.gz (981kB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 983kB 50.1MB/s \n\u001b[?25hRequirement already satisfied: jieba in /usr/local/lib/python3.7/dist-packages (from ktrain) (0.42.1)\nCollecting cchardet\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/80/72/a4fba7559978de00cf44081c548c5d294bf00ac7dcda2db405d2baa8c67a/cchardet-2.1.7-cp37-cp37m-manylinux2010_x86_64.whl (263kB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 266kB 51.8MB/s \n\u001b[?25hCollecting syntok\n Downloading https://files.pythonhosted.org/packages/8c/76/a49e73a04b3e3a14ce232e8e28a1587f8108baa665644fe8c40e307e792e/syntok-1.3.1.tar.gz\nCollecting seqeval==0.0.19\n Downloading https://files.pythonhosted.org/packages/93/e5/b7705156a77f742cfe4fc6f22d0c71591edb2d243328dff2f8fc0f933ab6/seqeval-0.0.19.tar.gz\nCollecting transformers<4.0,>=3.1.0\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/3a/83/e74092e7f24a08d751aa59b37a9fc572b2e4af3918cb66f7766c3affb1b4/transformers-3.5.1-py3-none-any.whl (1.3MB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 1.3MB 50.5MB/s \n\u001b[?25hCollecting sentencepiece\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f5/99/e0808cb947ba10f575839c43e8fafc9cc44e4a7a2c8f79c60db48220a577/sentencepiece-0.1.95-cp37-cp37m-manylinux2014_x86_64.whl (1.2MB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 1.2MB 52.6MB/s \n\u001b[?25hCollecting keras_bert>=0.86.0\n Downloading https://files.pythonhosted.org/packages/e2/7f/95fabd29f4502924fa3f09ff6538c5a7d290dfef2c2fe076d3d1a16e08f0/keras-bert-0.86.0.tar.gz\nRequirement already satisfied: networkx>=2.3 in /usr/local/lib/python3.7/dist-packages (from ktrain) (2.5)\nCollecting whoosh\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/ba/19/24d0f1f454a2c1eb689ca28d2f178db81e5024f42d82729a4ff6771155cf/Whoosh-2.7.4-py2.py3-none-any.whl (468kB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 471kB 49.2MB/s \n\u001b[?25hCollecting threadpoolctl>=2.0.0\n Downloading 
https://files.pythonhosted.org/packages/f7/12/ec3f2e203afa394a149911729357aa48affc59c20e2c1c8297a60f33f133/threadpoolctl-2.1.0-py3-none-any.whl\nRequirement already satisfied: scipy>=0.19.1 in /usr/local/lib/python3.7/dist-packages (from scikit-learn==0.23.2->ktrain) (1.4.1)\nRequirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.7/dist-packages (from scikit-learn==0.23.2->ktrain) (1.19.5)\nRequirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (2.4.7)\nRequirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (1.3.1)\nRequirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (0.10.0)\nRequirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (2.8.1)\nRequirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.7/dist-packages (from pandas>=1.0.1->ktrain) (2018.9)\nRequirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (2.10)\nRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (2020.12.5)\nRequirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (3.0.4)\nRequirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (1.24.3)\nRequirement already satisfied: prompt-toolkit<2.0.0,>=1.0.4 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (1.0.18)\nRequirement already satisfied: pexpect; sys_platform != \"win32\" in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (4.8.0)\nRequirement already satisfied: setuptools>=18.5 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (54.0.0)\nRequirement already satisfied: pygments in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (2.6.1)\nRequirement already satisfied: pickleshare in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (0.7.5)\nRequirement already satisfied: simplegeneric>0.8 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (0.8.1)\nRequirement already satisfied: traitlets>=4.2 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (5.0.5)\nRequirement already satisfied: decorator in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (4.4.2)\nRequirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from langdetect->ktrain) (1.15.0)\nRequirement already satisfied: regex in /usr/local/lib/python3.7/dist-packages (from syntok->ktrain) (2019.12.20)\nRequirement already satisfied: Keras>=2.2.4 in /usr/local/lib/python3.7/dist-packages (from seqeval==0.0.19->ktrain) (2.4.3)\nRequirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.7/dist-packages (from transformers<4.0,>=3.1.0->ktrain) (4.41.1)\nCollecting sacremoses\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 890kB 53.4MB/s \n\u001b[?25hRequirement already satisfied: filelock in /usr/local/lib/python3.7/dist-packages (from transformers<4.0,>=3.1.0->ktrain) (3.0.12)\nCollecting tokenizers==0.9.3\n\u001b[?25l Downloading 
https://files.pythonhosted.org/packages/7b/ac/f5ba028f0f097d855e1541301e946d4672eb0f30b6e25cb2369075f916d2/tokenizers-0.9.3-cp37-cp37m-manylinux1_x86_64.whl (2.9MB)\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 2.9MB 54.5MB/s \n\u001b[?25hRequirement already satisfied: protobuf in /usr/local/lib/python3.7/dist-packages (from transformers<4.0,>=3.1.0->ktrain) (3.12.4)\nCollecting keras-transformer>=0.38.0\n Downloading https://files.pythonhosted.org/packages/89/6c/d6f0c164f4cc16fbc0d0fea85f5526e87a7d2df7b077809e422a7e626150/keras-transformer-0.38.0.tar.gz\nRequirement already satisfied: wcwidth in /usr/local/lib/python3.7/dist-packages (from prompt-toolkit<2.0.0,>=1.0.4->ipython->ktrain) (0.2.5)\nRequirement already satisfied: ptyprocess>=0.5 in /usr/local/lib/python3.7/dist-packages (from pexpect; sys_platform != \"win32\"->ipython->ktrain) (0.7.0)\nRequirement already satisfied: ipython-genutils in /usr/local/lib/python3.7/dist-packages (from traitlets>=4.2->ipython->ktrain) (0.2.0)\nRequirement already satisfied: pyyaml in /usr/local/lib/python3.7/dist-packages (from Keras>=2.2.4->seqeval==0.0.19->ktrain) (3.13)\nRequirement already satisfied: h5py in /usr/local/lib/python3.7/dist-packages (from Keras>=2.2.4->seqeval==0.0.19->ktrain) (2.10.0)\nRequirement already satisfied: click in /usr/local/lib/python3.7/dist-packages (from sacremoses->transformers<4.0,>=3.1.0->ktrain) (7.1.2)\nCollecting keras-pos-embd>=0.11.0\n Downloading https://files.pythonhosted.org/packages/09/70/b63ed8fc660da2bb6ae29b9895401c628da5740c048c190b5d7107cadd02/keras-pos-embd-0.11.0.tar.gz\nCollecting keras-multi-head>=0.27.0\n Downloading https://files.pythonhosted.org/packages/e6/32/45adf2549450aca7867deccfa04af80a0ab1ca139af44b16bc669e0e09cd/keras-multi-head-0.27.0.tar.gz\nCollecting keras-layer-normalization>=0.14.0\n Downloading https://files.pythonhosted.org/packages/a4/0e/d1078df0494bac9ce1a67954e5380b6e7569668f0f3b50a9531c62c1fc4a/keras-layer-normalization-0.14.0.tar.gz\nCollecting keras-position-wise-feed-forward>=0.6.0\n Downloading https://files.pythonhosted.org/packages/e3/59/f0faa1037c033059e7e9e7758e6c23b4d1c0772cd48de14c4b6fd4033ad5/keras-position-wise-feed-forward-0.6.0.tar.gz\nCollecting keras-embed-sim>=0.8.0\n Downloading https://files.pythonhosted.org/packages/57/ef/61a1e39082c9e1834a2d09261d4a0b69f7c818b359216d4e1912b20b1c86/keras-embed-sim-0.8.0.tar.gz\nCollecting keras-self-attention==0.46.0\n Downloading https://files.pythonhosted.org/packages/15/6b/c804924a056955fa1f3ff767945187103cfc851ba9bd0fc5a6c6bc18e2eb/keras-self-attention-0.46.0.tar.gz\nBuilding wheels for collected packages: ktrain, langdetect, syntok, seqeval, keras-bert, sacremoses, keras-transformer, keras-pos-embd, keras-multi-head, keras-layer-normalization, keras-position-wise-feed-forward, keras-embed-sim, keras-self-attention\n Building wheel for ktrain (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for ktrain: filename=ktrain-0.25.4-cp37-none-any.whl size=25276443 sha256=a21bf62c621920a75422c4df8cae95d466380843fd1eda8e66302f5807ceda37\n Stored in directory: /root/.cache/pip/wheels/1b/77/8a/bdceaabc308e7178d575278bf6143b7d1a9b939a1e40c56b88\n Building wheel for langdetect (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n Created wheel for langdetect: filename=langdetect-1.0.8-cp37-none-any.whl size=993193 sha256=aec636b54ffe434c9028359c31bdfc76e9da9a1752fa7f10d87e69d57c34d46a\n Stored in directory: /root/.cache/pip/wheels/8d/b3/aa/6d99de9f3841d7d3d40a60ea06e6d669e8e5012e6c8b947a57\n Building wheel for syntok (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for syntok: filename=syntok-1.3.1-cp37-none-any.whl size=20919 sha256=4f6fa992ceefd03a0101faff02b00f882b85c93d7c32eac68c56155956a0bb9e\n Stored in directory: /root/.cache/pip/wheels/51/c6/a4/be1920586c49469846bcd2888200bdecfe109ec421dab9be2d\n Building wheel for seqeval (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for seqeval: filename=seqeval-0.0.19-cp37-none-any.whl size=9919 sha256=ac03ed5c47baebb742f37bf9b08ad4e45782dd3ee4bd727f850a4af61f5fbf77\n Stored in directory: /root/.cache/pip/wheels/8d/1f/bf/1198beceed805a2099060975f6281d1b01046dd279e19c97be\n Building wheel for keras-bert (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-bert: filename=keras_bert-0.86.0-cp37-none-any.whl size=34144 sha256=199f3eea09c452e52c98f833287b4c2e0161520432357af5ecfc932031eddb12\n Stored in directory: /root/.cache/pip/wheels/66/f0/b1/748128b58562fc9e31b907bb5e2ab6a35eb37695e83911236b\n Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for sacremoses: filename=sacremoses-0.0.43-cp37-none-any.whl size=893262 sha256=ba82c1360a233bd048daf43f948e0400661f80d3d21e0c1b72500c2fb34065b1\n Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\n Building wheel for keras-transformer (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-transformer: filename=keras_transformer-0.38.0-cp37-none-any.whl size=12942 sha256=598c25f31534d9bbf3e134135ec5ffabecc0de55f3bc3ccef1bc9362f20c8f2b\n Stored in directory: /root/.cache/pip/wheels/e5/fb/3a/37b2b9326c799aa010ae46a04ddb04f320d8c77c0b7e837f4e\n Building wheel for keras-pos-embd (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-pos-embd: filename=keras_pos_embd-0.11.0-cp37-none-any.whl size=7554 sha256=8dffa94551da41c503305037b9936c354793a06d95bcd09d6489f3bea15c49ca\n Stored in directory: /root/.cache/pip/wheels/5b/a1/a0/ce6b1d49ba1a9a76f592e70cf297b05c96bc9f418146761032\n Building wheel for keras-multi-head (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-multi-head: filename=keras_multi_head-0.27.0-cp37-none-any.whl size=15611 sha256=7a015af070bc4ce247816f6ae650140ba6ac85bdb0a845d633c9dea464c22c7a\n Stored in directory: /root/.cache/pip/wheels/b5/b4/49/0a0c27dcb93c13af02fea254ff51d1a43a924dd4e5b7a7164d\n Building wheel for keras-layer-normalization (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-layer-normalization: filename=keras_layer_normalization-0.14.0-cp37-none-any.whl size=5269 sha256=c4050c794d67cf2aa834ffad4960aed9a36145f0a16b4e54f6fab703efb570f6\n Stored in directory: /root/.cache/pip/wheels/54/80/22/a638a7d406fd155e507aa33d703e3fa2612b9eb7bb4f4fe667\n Building wheel for keras-position-wise-feed-forward (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-position-wise-feed-forward: filename=keras_position_wise_feed_forward-0.6.0-cp37-none-any.whl size=5623 sha256=39e5ca51c76b0a07dd6c5f5208f8d68e5e5ab8d88ad8638279f506220420eb6a\n Stored in directory: /root/.cache/pip/wheels/39/e2/e2/3514fef126a00574b13bc0b9e23891800158df3a3c19c96e3b\n Building wheel for keras-embed-sim (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n Created wheel for keras-embed-sim: filename=keras_embed_sim-0.8.0-cp37-none-any.whl size=4558 sha256=a01ad8cac95ba2cd3b0d2462b0dab4b91b5e57a13d65802f81b5ed8514cce406\n Stored in directory: /root/.cache/pip/wheels/49/45/8b/c111f6cc8bec253e984677de73a6f4f5d2f1649f42aac191c8\n Building wheel for keras-self-attention (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-self-attention: filename=keras_self_attention-0.46.0-cp37-none-any.whl size=17278 sha256=d0a6d2471a49500962a43539660a7cf5acaf4e829fb5d7c906fe434d7cbade2c\n Stored in directory: /root/.cache/pip/wheels/d2/2e/80/fec4c05eb23c8e13b790e26d207d6e0ffe8013fad8c6bdd4d2\nSuccessfully built ktrain langdetect syntok seqeval keras-bert sacremoses keras-transformer keras-pos-embd keras-multi-head keras-layer-normalization keras-position-wise-feed-forward keras-embed-sim keras-self-attention\n\u001b[31mERROR: transformers 3.5.1 has requirement sentencepiece==0.1.91, but you'll have sentencepiece 0.1.95 which is incompatible.\u001b[0m\nInstalling collected packages: threadpoolctl, scikit-learn, langdetect, cchardet, syntok, seqeval, sacremoses, tokenizers, sentencepiece, transformers, keras-pos-embd, keras-self-attention, keras-multi-head, keras-layer-normalization, keras-position-wise-feed-forward, keras-embed-sim, keras-transformer, keras-bert, whoosh, ktrain\n Found existing installation: scikit-learn 0.22.2.post1\n Uninstalling scikit-learn-0.22.2.post1:\n Successfully uninstalled scikit-learn-0.22.2.post1\nSuccessfully installed cchardet-2.1.7 keras-bert-0.86.0 keras-embed-sim-0.8.0 keras-layer-normalization-0.14.0 keras-multi-head-0.27.0 keras-pos-embd-0.11.0 keras-position-wise-feed-forward-0.6.0 keras-self-attention-0.46.0 keras-transformer-0.38.0 ktrain-0.25.4 langdetect-1.0.8 sacremoses-0.0.43 scikit-learn-0.23.2 sentencepiece-0.1.95 seqeval-0.0.19 syntok-1.3.1 threadpoolctl-2.1.0 tokenizers-0.9.3 transformers-3.5.1 whoosh-2.7.4\n"
],
[
"import os.path\r\nimport numpy as np\r\nimport pandas as pd\r\nimport tensorflow as tf\r\nimport ktrain\r\nfrom ktrain import text ",
"_____no_output_____"
]
],
[
[
"## Part 1: Data Preprocessing",
"_____no_output_____"
],
[
"### Loading the IMDB dataset",
"_____no_output_____"
]
],
[
[
"dataset = tf.keras.utils.get_file(fname = \"aclImdb_v1.tar\",\r\n origin = \"https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar\",\r\n extract = True)\r\nIMDB_DATADIR = os.path.join(os.path.dirname(dataset), 'aclImdb')",
"Downloading data from https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar\n84131840/84125825 [==============================] - 2s 0us/step\n"
],
[
"print(os.path.dirname(dataset))\r\nprint(IMDB_DATADIR)",
"/root/.keras/datasets\n/root/.keras/datasets/aclImdb\n"
]
],
[
[
"### Creating the training & test sets",
"_____no_output_____"
]
],
[
[
"(X_train, y_train), (X_test, y_test), preproc = text.texts_from_folder(datadir = IMDB_DATADIR, \r\n classes = ['pos','neg'],\r\n maxlen = 500, \r\n train_test_names = ['train', 'test'],\r\n preprocess_mode = 'bert')",
"detected encoding: utf-8\ndownloading pretrained BERT model (uncased_L-12_H-768_A-12.zip)...\n[โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ]\nextracting pretrained BERT model...\ndone.\n\ncleanup downloaded zip...\ndone.\n\npreprocessing train...\nlanguage: en\n"
]
],
[
[
"## Part 2: Building the BERT model",
"_____no_output_____"
]
],
[
[
"model = text.text_classifier(name = 'bert',\r\n train_data = (X_train, y_train),\r\n preproc = preproc)",
"Is Multi-Label? False\nmaxlen is 500\ndone.\n"
]
],
[
[
"## Part 3: Training the BERT model",
"_____no_output_____"
]
],
[
[
"learner = ktrain.get_learner(model = model, \r\n train_data = (X_train, y_train),\r\n val_data = (X_test, y_test),\r\n batch_size = 6)",
"_____no_output_____"
],
[
"learner.fit_onecycle(lr=2e-5,\r\n epochs = 1)",
"\n\nbegin training using onecycle policy with max lr of 2e-05...\n4167/4167 [==============================] - 3436s 820ms/step - loss: 0.3313 - accuracy: 0.8479 - val_loss: 0.1619 - val_accuracy: 0.9383\n"
]
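,
[
"# A minimal sketch of inference with the trained learner; ktrain.get_predictor\n# pairs the model with the BERT preprocessor. The sample review is an assumed\n# example input.\npredictor = ktrain.get_predictor(learner.model, preproc)\nprint(predictor.predict('The movie was a complete waste of time.'))",
"_____no_output_____"
]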
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
]
|
d0648ee1a52be7f2190c84ec7d539059836b6cb4 | 920,987 | ipynb | Jupyter Notebook | TD Actor Critic/TD_Actor_Critic_seperate_net.ipynb | gt-coar/BrianSURE2021 | ef3087763a1500b5dc01fe74474cb1bf4936773a | [
"MIT"
]
| null | null | null | TD Actor Critic/TD_Actor_Critic_seperate_net.ipynb | gt-coar/BrianSURE2021 | ef3087763a1500b5dc01fe74474cb1bf4936773a | [
"MIT"
]
| null | null | null | TD Actor Critic/TD_Actor_Critic_seperate_net.ipynb | gt-coar/BrianSURE2021 | ef3087763a1500b5dc01fe74474cb1bf4936773a | [
"MIT"
]
| null | null | null | 40.548893 | 407 | 0.49593 | [
[
[
"# Enable GPU",
"_____no_output_____"
]
],
[
[
"import torch\ndevice = torch.device('cuda:0' if torch.cuda.is_available else 'cpu')",
"_____no_output_____"
]
],
[
[
"# Actor and Critic Network\n\n",
"_____no_output_____"
]
],
[
[
"import torch.nn as nn\nimport torch.nn.functional as F\nfrom torch.distributions import Categorical\n\nclass Actor_Net(nn.Module):\n def __init__(self, input_dims, output_dims, num_neurons = 128):\n super(Actor_Net, self).__init__()\n self.fc1 = nn.Linear(input_dims, num_neurons)\n self.actor = nn.Linear(num_neurons, output_dims)\n self.log_probs = []\n self.entropies = []\n\n def forward(self, state):\n x = F.relu(self.fc1(state))\n x = F.softmax(self.actor(x), dim = 1)\n\n return x\n\n def get_action(self, state):\n with torch.no_grad():\n probs = self.forward(state)\n dist = Categorical(probs = probs)\n action = dist.sample()\n return action\n \n def eval_action(self, state):\n probs = self.forward(state)\n dist = Categorical(probs = probs)\n action = dist.sample().to(device)\n log_prob = dist.log_prob(action)\n entropy = dist.entropy()\n self.log_probs.append(log_prob)\n self.entropies.append(entropy)\n\n return action\n\nclass Critic_Net(nn.Module):\n def __init__ (self, input_dims, output_dims, num_neurons = 128):\n super(Critic_Net, self).__init__()\n\n self.values = []\n self.next_values = []\n \n self.fc1 = nn.Linear(input_dims, num_neurons)\n self.critic = nn.Linear(num_neurons, 1)\n\n def forward (self, state):\n x = F.relu(self.fc1(state))\n x = self.critic(x)\n\n return x",
"_____no_output_____"
],
[
"import torch.optim as optim\nimport numpy as np\nimport gym\n\nclass Actor_Critic_Agent(nn.Module):\n def __init__(self, input_dims, output_dims, optimizer = 'RMSprop', num_neurons = 128 , gamma = 0.99, actor_lr=0.001, critic_lr = 0.01):\n super(Actor_Critic_Agent, self).__init__()\n self.actor_net = Actor_Net(input_dims= input_dims, output_dims= output_dims, num_neurons= num_neurons).to(device)\n self.critic_net = Critic_Net(input_dims=input_dims, output_dims= output_dims, num_neurons= num_neurons).to(device)\n self.gamma = gamma\n if optimizer == 'RMSprop':\n self.actor_optimizer = optim.RMSprop(params = self.actor_net.parameters(), lr =actor_lr)\n self.critic_optimizer = optim.RMSprop(params = self.critic_net.parameters(), lr = critic_lr)\n else:\n self.actor_optimizer = optim.Adam(params = self.actor_net.parameters(), lr = actor_lr)\n self.critic_optimizer = optim.Adam(params = self.critic_net.parameters(), lr = critic_lr)\n\n def learn_mean(self, rewards, dones):\n value_criteration = nn.MSELoss()\n value_losses = []\n actor_losses = []\n self.critic_net.next_values = torch.cat(self.critic_net.next_values, dim = 0).squeeze(0)\n self.critic_net.values = torch.cat(self.critic_net.values, dim = 0).squeeze(0)\n self.actor_net.log_probs = torch.cat(self.actor_net.log_probs, dim = 0)\n self.actor_net.entropies = torch.cat(self.actor_net.entropies, dim = 0)\n\n for reward, entropy, log_prob, v, v_next, done in zip(rewards ,self.actor_net.entropies, self.actor_net.log_probs, self.critic_net.values, self.critic_net.next_values, dones):\n td_target = reward + self.gamma * v_next * done\n td_error = td_target - v\n value_loss = value_criteration(v, td_target.detach())- 0.001 * entropy.detach()\n actor_loss = - log_prob * td_error.detach() \n value_losses.append(value_loss)\n actor_losses.append(actor_loss)\n\n self.critic_optimizer.zero_grad()\n value_losses = torch.stack(value_losses).sum()\n value_losses.backward()\n self.critic_optimizer.step() \n\n self.actor_optimizer.zero_grad()\n actor_losses = torch.stack(actor_losses).sum()\n actor_losses.backward()\n self.actor_optimizer.step()\n\n \n # clear out memory \n self.actor_net.log_probs = []\n self.actor_net.entropies = []\n self.critic_net.values = []\n self.critic_net.next_values = []\n\n",
"_____no_output_____"
]
],
[
[
"# Without Wandb",
"_____no_output_____"
]
],
[
[
"import gym\nimport time\nimport pdb\n\nenv = gym.make('CartPole-v1')\nenv.seed(543)\ntorch.manual_seed(543)\nstate_dims = env.observation_space.shape[0]\naction_dims = env.action_space.n\nagent = Actor_Critic_Agent(input_dims= state_dims, output_dims = action_dims)\n\ndef train():\n\n num_ep = 2000\n print_every = 100\n running_score = 10\n start = time.time()\n\n rewards = []\n dones = []\n\n for ep in range(1, num_ep + 1):\n state = env.reset()\n score = 0\n done = False\n rewards = []\n dones = []\n\n while not done:\n state = torch.tensor([state]).float().to(device)\n action = agent.actor_net.eval_action(state)\n v = agent.critic_net(state)\n\n next_state, reward, done, _ = env.step(action.item())\n v_next = agent.critic_net(torch.tensor([next_state]).float().to(device))\n \n agent.critic_net.values.append(v.squeeze(0))\n agent.critic_net.next_values.append(v_next.squeeze(0))\n rewards.append(reward)\n dones.append(1 - done)\n \n # update episode\n score += reward\n state = next_state\n\n if done:\n break\n\n # update agent\n #pdb.set_trace()\n agent.learn_mean(rewards,dones)\n \n # calculating score and running score\n running_score = 0.05 * score + (1 - 0.05) * running_score\n\n if ep % print_every == 0:\n print('episode: {}, running score: {}, time elapsed: {}'.format(ep, running_score, time.time() - start))\n\n\n\n",
"_____no_output_____"
],
[
"train() #RMS",
"episode: 100, running score: 43.32507441570408, time elapsed: 4.842878341674805\nepisode: 200, running score: 129.30332722904944, time elapsed: 19.552313089370728\n"
]
],
[
[
"# Wtih wandb",
"_____no_output_____"
]
],
[
[
"!pip install wandb\n!wandb login\n",
"_____no_output_____"
],
[
"import wandb\nsweep_config = dict()\nsweep_config['method'] = 'grid'\nsweep_config['metric'] = {'name': 'running_score', 'goal': 'maximize'}\nsweep_config['parameters'] = {'learning': {'value': 'learn_mean'}, 'actor_learning_rate': {'values' : [0.01, 0.001, 0.0001,0.0003,0.00001]}, 'critic_learning_rate' : {'values': [0.01, 0.001, 0.0001, 0.0003, 0.00001]}\n , 'num_neurons': {'value': 128 }, 'optimizer': {'values' : ['RMSprop', 'Adam']}}\n\nsweep_id = wandb.sweep(sweep_config, project = 'Advantage_Actor_Critic')",
"Create sweep with ID: t9gia22t\nSweep URL: https://wandb.ai/ko120/Advantage_Actor_Critic/sweeps/t9gia22t\n"
],
[
"import gym \nimport torch\nimport time\nimport wandb\n\n\n\ndef train():\n wandb.init(config = {'env':'CartPole-v1','algorithm:': 'Actor_Critic','architecture': 'seperate','num_laeyrs':'2'}, project = 'Advantage_Actor_Critic',group = 'Cart_128_neurons_2_layer')\n config = wandb.config\n\n env = gym.make('CartPole-v1')\n env.seed(543)\n torch.manual_seed(543)\n\n state_dim = env.observation_space.shape[0]\n action_dim = env.action_space.n\n\n device = torch.device('cuda:0' if torch.cuda.is_available else 'cpu')\n agent = Actor_Critic_Agent(input_dims= state_dim, output_dims= action_dim, optimizer = config.optimizer, num_neurons= config.num_neurons, actor_lr = config.actor_learning_rate, critic_lr = config.critic_learning_rate)\n\n\n num_ep = 3000\n print_interval = 100\n save_interval = 1000\n running_score = 10\n start = time.time()\n\n \n wandb.watch(agent)\n for ep in range(1,num_ep+1):\n state = env.reset()\n score = 0\n done = False\n rewards = []\n dones = []\n\n while not done:\n state = torch.tensor([state]).float().to(device)\n action = agent.actor_net.eval_action(state)\n v = agent.critic_net(state)\n\n next_state, reward, done, _ = env.step(action.item())\n v_next = agent.critic_net(torch.tensor([next_state]).float().to(device))\n \n agent.critic_net.values.append(v.squeeze(0))\n agent.critic_net.next_values.append(v_next.squeeze(0))\n rewards.append(reward)\n dones.append(1 - done)\n \n # update episode\n score += reward\n state = next_state\n\n if done:\n break\n\n # update agent\n agent.learn_mean(rewards,dones)\n \n # calculating score and running score\n running_score = 0.05 * score + (1 - 0.05) * running_score\n\n wandb.log({'episode': ep, 'running_score': running_score}) \n\n if ep % print_interval == 0:\n print('episode {} average reward {}, ended at {:.01f}'.format(ep, running_score, time.time() - start)) \n \n if ep % save_interval == 0:\n save_name_actor = 'actor_' + str(ep) + '.pt'\n torch.save(agent.actor_net.state_dict(),save_name_actor)\n save_name_critic = 'critic_' + str(ep) + '.pt'\n torch.save(agent.critic_net.state_dict(),save_name_critic)\n wandb.save(save_name_actor)\n wandb.save(save_name_critic)\n\n if ep == num_ep:\n dummy_input = torch.rand(1,4).to(device)\n torch.onnx.export(agent.actor_net,dummy_input,'final_model_actor.onnx')\n wandb.save('final_model_actor.onnx')\n torch.onnx.export(agent.critic_net, dummy_input, 'final_model_critic.onnx')\n wandb.save('final_model_critic.onnx')\n ",
"_____no_output_____"
],
[
"wandb.agent(sweep_id, train)",
"\u001b[34m\u001b[1mwandb\u001b[0m: Agent Starting Run: wivnmds7 with config:\n\u001b[34m\u001b[1mwandb\u001b[0m: \tactor_learning_rate: 0.01\n\u001b[34m\u001b[1mwandb\u001b[0m: \tcritic_learning_rate: 0.01\n\u001b[34m\u001b[1mwandb\u001b[0m: \tlearning: learn_mean\n\u001b[34m\u001b[1mwandb\u001b[0m: \tnum_neurons: 128\n\u001b[34m\u001b[1mwandb\u001b[0m: \toptimizer: RMSprop\n\u001b[34m\u001b[1mwandb\u001b[0m: \u001b[33mWARNING\u001b[0m Ignored wandb.init() arg project when running a sweep\n"
]
],
[
[
"# You can see the result here!\n[Report Link](https://wandb.ai/ko120/Advantage_Actor_Critic/reports/TD-Actor-Critic-Learning-rate-tune---Vmlldzo4OTIwODg)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
]
]
|
d0648f64f9504f4ac0f20c8cc100a85b421e3dac | 173,648 | ipynb | Jupyter Notebook | PREDICTION-MODEL-1.ipynb | fuouo/TrafficBato | bd8ab5645116db4029b90bb5a28e0134d59e9ac0 | [
"MIT"
]
| null | null | null | PREDICTION-MODEL-1.ipynb | fuouo/TrafficBato | bd8ab5645116db4029b90bb5a28e0134d59e9ac0 | [
"MIT"
]
| null | null | null | PREDICTION-MODEL-1.ipynb | fuouo/TrafficBato | bd8ab5645116db4029b90bb5a28e0134d59e9ac0 | [
"MIT"
]
| null | null | null | 58.983696 | 38,484 | 0.633298 | [
[
[
"## Import Necessary Packages",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nimport datetime\nimport os\n\nnp.random.seed(1337) # for reproducibility\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics.classification import accuracy_score\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.metrics.regression import r2_score, mean_squared_error, mean_absolute_error\n\nfrom dbn.tensorflow import SupervisedDBNRegression",
"_____no_output_____"
]
],
[
[
"## Define Model Settings",
"_____no_output_____"
]
],
[
[
"RBM_EPOCHS = 5\nDBN_EPOCHS = 150\nRBM_LEARNING_RATE = 0.01\nDBN_LEARNING_RATE = 0.01\nHIDDEN_LAYER_STRUCT = [20, 50, 100]\nACTIVE_FUNC = 'relu'\nBATCH_SIZE = 28",
"_____no_output_____"
]
],
[
[
"## Define Directory, Road, and Year",
"_____no_output_____"
]
],
[
[
"# Read the dataset\nROAD = \"Vicente Cruz\"\nYEAR = \"2015\"\nEXT = \".csv\"\nDATASET_DIVISION = \"seasonWet\"\nDIR = \"../../../datasets/Thesis Datasets/\"\nOUTPUT_DIR = \"PM1/Rolling 3/\"\nMODEL_DIR = \"PM1/Rolling 3/\"\n\n'''''''Training dataset'''''''\nWP = False\nWEEKDAY = False\nCONNECTED_ROADS = False\nCONNECTED_1 = [\"Antipolo\"]\ntrafficDT = \"recon_traffic\" #orig_traffic recon_traffic\nfeatureEngineering = \"Rolling\" #Rolling Expanding Rolling and Expanding\ntimeFE = \"today\" #today yesterday\ntimeConnected = \"today\"\nROLLING_WINDOW = 3\nEXPANDING_WINDOW = 3\nRECON_SHIFT = 96\n# RECON_FE_WINDOW = 48",
"_____no_output_____"
],
[
"def addWorkingPeakFeatures(df):\n result_df = df.copy()\n\n # Converting the index as date\n result_df.index = pd.to_datetime(result_df.index)\n \n # Create column work_day\n result_df['work_day'] = ((result_df.index.dayofweek) < 5).astype(int)\n\n # Consider non-working holiday\n if DATASET_DIVISION is not \"seasonWet\":\n\n # Jan\n result_df.loc['2015-01-01', 'work_day'] = 0\n result_df.loc['2015-01-02', 'work_day'] = 0\n\n # Feb\n result_df.loc['2015-02-19', 'work_day'] = 0\n result_df.loc['2015-02-25', 'work_day'] = 0\n\n # Apr\n result_df.loc['2015-04-02', 'work_day'] = 0\n result_df.loc['2015-04-03', 'work_day'] = 0\n result_df.loc['2015-04-09', 'work_day'] = 0\n\n # May\n result_df.loc['2015-05-01', 'work_day'] = 0\n\n # Jun\n result_df.loc['2015-06-12', 'work_day'] = 0\n result_df.loc['2015-06-24', 'work_day'] = 0\n\n # Jul\n result_df.loc['2015-07-17', 'work_day'] = 0\n\n # Aug\n result_df.loc['2015-08-21', 'work_day'] = 0\n result_df.loc['2015-08-31', 'work_day'] = 0\n\n # Sep\n result_df.loc['2015-08-25', 'work_day'] = 0\n\n if DATASET_DIVISION is not \"seasonWet\":\n # Nov\n result_df.loc['2015-11-30', 'work_day'] = 0\n\n # Dec\n result_df.loc['2015-12-24', 'work_day'] = 0\n result_df.loc['2015-12-25', 'work_day'] = 0\n result_df.loc['2015-12-30', 'work_day'] = 0\n result_df.loc['2015-12-31', 'work_day'] = 0\n\n # Consider class suspension\n if DATASET_DIVISION is not \"seasonWet\":\n # Jan\n result_df.loc['2015-01-08', 'work_day'] = 0\n result_df.loc['2015-01-09', 'work_day'] = 0\n result_df.loc['2015-01-14', 'work_day'] = 0\n result_df.loc['2015-01-15', 'work_day'] = 0\n result_df.loc['2015-01-16', 'work_day'] = 0\n result_df.loc['2015-01-17', 'work_day'] = 0\n\n # Jul\n result_df.loc['2015-07-06', 'work_day'] = 0\n result_df.loc['2015-07-08', 'work_day'] = 0\n result_df.loc['2015-07-09', 'work_day'] = 0\n result_df.loc['2015-07-10', 'work_day'] = 0\n\n # Aug\n result_df.loc['2015-08-10', 'work_day'] = 0\n result_df.loc['2015-08-11', 'work_day'] = 0\n\n # Sep\n result_df.loc['2015-09-10', 'work_day'] = 0\n\n # Oct\n result_df.loc['2015-10-02', 'work_day'] = 0\n result_df.loc['2015-10-19', 'work_day'] = 0\n\n if DATASET_DIVISION is not \"seasonWet\":\n # Nov\n result_df.loc['2015-11-16', 'work_day'] = 0\n result_df.loc['2015-11-17', 'work_day'] = 0\n result_df.loc['2015-11-18', 'work_day'] = 0\n result_df.loc['2015-11-19', 'work_day'] = 0\n result_df.loc['2015-11-20', 'work_day'] = 0\n\n # Dec\n result_df.loc['2015-12-16', 'work_day'] = 0\n result_df.loc['2015-12-18', 'work_day'] = 0\n\n result_df['peak_hour'] = 0\n\n # Set morning peak hour\n\n start = datetime.time(7,0,0)\n end = datetime.time(10,0,0)\n\n result_df.loc[result_df.between_time(start, end).index, 'peak_hour'] = 1\n\n # Set afternoon peak hour\n\n start = datetime.time(16,0,0)\n end = datetime.time(19,0,0)\n\n result_df.loc[result_df.between_time(start, end).index, 'peak_hour'] = 1\n \n result_df\n \n return result_df",
"_____no_output_____"
],
[
"def reconstructDT(df, pastTraffic=False, trafficFeatureNeeded=[]):\n result_df = df.copy()\n\n # Converting the index as date\n result_df.index = pd.to_datetime(result_df.index, format='%d/%m/%Y %H:%M')\n result_df['month'] = result_df.index.month\n result_df['day'] = result_df.index.day\n result_df['hour'] = result_df.index.hour\n result_df['min'] = result_df.index.minute \n result_df['dayOfWeek'] = result_df.index.dayofweek\n \n if pastTraffic:\n for f in trafficFeatureNeeded:\n result_df[f + '-' + str(RECON_SHIFT*15) + \"mins\"] = result_df[f].shift(RECON_SHIFT)\n \n result_df = result_df.iloc[RECON_SHIFT:, :]\n \n for f in range(len(result_df.columns)):\n result_df[result_df.columns[f]] = normalize(result_df[result_df.columns[f]])\n\n return result_df",
"_____no_output_____"
],
[
"def getNeededFeatures(columns, arrFeaturesNeed, featureEngineering=\"Original\"):\n to_remove = []\n if len(arrFeaturesNeed) == 0: #all features aren't needed\n to_remove += range(0, len(columns))\n\n else:\n if featureEngineering == \"Original\":\n compareTo = \" \"\n elif featureEngineering == \"Rolling\" or featureEngineering == \"Expanding\":\n compareTo = \"_\"\n \n for f in arrFeaturesNeed:\n for c in range(0, len(columns)):\n if f not in columns[c].split(compareTo)[0] and columns[c].split(compareTo)[0] not in arrFeaturesNeed:\n to_remove.append(c)\n if len(columns[c].split(compareTo)) > 1:\n if \"Esum\" in columns[c].split(compareTo)[1]: #Removing all Expanding Sum \n to_remove.append(c)\n \n return to_remove",
"_____no_output_____"
],
[
"def normalize(data):\n y = pd.to_numeric(data)\n y = np.array(y.reshape(-1, 1))\n \n scaler = MinMaxScaler()\n y = scaler.fit_transform(y)\n y = y.reshape(1, -1)[0]\n return y",
"_____no_output_____"
]
],
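[
[
"# A quick demonstration (a minimal sketch, not part of the original pipeline):\n# normalize() rescales a numeric Series to the [0, 1] range with MinMaxScaler.\ndemo = pd.Series([2.0, 4.0, 6.0, 10.0])\nprint(normalize(demo))  # expected: [0.   0.25 0.5  1.  ]",
"_____no_output_____"
]
],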
[
[
"<br><br>\n### Preparing Traffic Dataset",
"_____no_output_____"
],
[
"#### Importing Original Traffic (wo new features)",
"_____no_output_____"
]
],
[
[
"TRAFFIC_DIR = DIR + \"mmda/\"\nTRAFFIC_FILENAME = \"mmda_\" + ROAD + \"_\" + YEAR + \"_\" + DATASET_DIVISION\norig_traffic = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\norig_traffic = orig_traffic.fillna(0)\n\n#Converting index to date and time, and removing 'dt' column\norig_traffic.index = pd.to_datetime(orig_traffic.dt, format='%d/%m/%Y %H:%M')\ncols_to_remove = [0]\ncols_to_remove = getNeededFeatures(orig_traffic.columns, [\"statusN\"])\norig_traffic.drop(orig_traffic.columns[[cols_to_remove]], axis=1, inplace=True)\norig_traffic.head()\n\nif WEEKDAY:\n orig_traffic = orig_traffic[((orig_traffic.index.dayofweek) < 5)]\norig_traffic.head()",
"_____no_output_____"
],
[
"TRAFFIC_DIR = DIR + \"mmda/Rolling/\" + DATASET_DIVISION + \"/\"\nTRAFFIC_FILENAME = \"eng_win\" + str(ROLLING_WINDOW) + \"_mmda_\" + ROAD + \"_\" + YEAR + \"_\" + DATASET_DIVISION\nrolling_traffic = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\n\ncols_to_remove = [0, 1, 2]\ncols_to_remove += getNeededFeatures(rolling_traffic.columns, [\"statusN\"], \"Rolling\")\n\nrolling_traffic.index = pd.to_datetime(rolling_traffic.dt, format='%Y-%m-%d %H:%M')\n\nrolling_traffic.drop(rolling_traffic.columns[[cols_to_remove]], axis=1, inplace=True)\n\nif WEEKDAY:\n rolling_traffic = rolling_traffic[((rolling_traffic.index.dayofweek) < 5)]\n \nrolling_traffic.head()",
"_____no_output_____"
],
[
"TRAFFIC_DIR = DIR + \"mmda/Expanding/\" + DATASET_DIVISION + \"/\"\nTRAFFIC_FILENAME = \"eng_win\" + str(EXPANDING_WINDOW) + \"_mmda_\" + ROAD + \"_\" + YEAR + \"_\" + DATASET_DIVISION\nexpanding_traffic = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\n\ncols_to_remove = [0, 1, 2, 5]\ncols_to_remove += getNeededFeatures(expanding_traffic.columns, [\"statusN\"], \"Rolling\")\n\nexpanding_traffic.index = pd.to_datetime(expanding_traffic.dt, format='%d/%m/%Y %H:%M')\n\nexpanding_traffic.drop(expanding_traffic.columns[[cols_to_remove]], axis=1, inplace=True)\n\nif WEEKDAY:\n expanding_traffic = expanding_traffic[((expanding_traffic.index.dayofweek) < 5)]\nexpanding_traffic.head()",
"_____no_output_____"
],
[
"recon_traffic = reconstructDT(orig_traffic, pastTraffic=True, trafficFeatureNeeded=['statusN'])\nrecon_traffic.head()",
"c:\\users\\ronnie nieva\\anaconda3\\envs\\tensorflow\\lib\\site-packages\\ipykernel_launcher.py:3: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead\n This is separate from the ipykernel package so we can avoid doing imports until\nc:\\users\\ronnie nieva\\anaconda3\\envs\\tensorflow\\lib\\site-packages\\sklearn\\utils\\validation.py:475: DataConversionWarning: Data with input dtype int64 was converted to float64 by MinMaxScaler.\n warnings.warn(msg, DataConversionWarning)\n"
],
[
"connected_roads = []\n\nfor c in CONNECTED_1:\n TRAFFIC_DIR = DIR + \"mmda/\"\n TRAFFIC_FILENAME = \"mmda_\" + c + \"_\" + YEAR + \"_\" + DATASET_DIVISION\n temp = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\n temp = temp.fillna(0)\n\n #Converting index to date and time, and removing 'dt' column\n temp.index = pd.to_datetime(temp.dt, format='%d/%m/%Y %H:%M')\n cols_to_remove = [0]\n cols_to_remove = getNeededFeatures(temp.columns, [\"statusN\"])\n temp.drop(temp.columns[[cols_to_remove]], axis=1, inplace=True)\n \n if WEEKDAY:\n temp = temp[((temp.index.dayofweek) < 5)]\n \n for f in range(len(temp.columns)):\n temp[temp.columns[f]] = normalize(temp[temp.columns[f]])\n temp = temp.rename(columns={temp.columns[f]: temp.columns[f] +\"(\" + c + \")\"})\n connected_roads.append(temp)\n \nconnected_roads[0].head()",
"c:\\users\\ronnie nieva\\anaconda3\\envs\\tensorflow\\lib\\site-packages\\ipykernel_launcher.py:3: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead\n This is separate from the ipykernel package so we can avoid doing imports until\n"
]
],
[
[
"### Merging datasets",
"_____no_output_____"
]
],
[
[
"if trafficDT == \"orig_traffic\":\n arrDT = [orig_traffic]\n \n if CONNECTED_ROADS:\n for c in connected_roads:\n arrDT.append(c)\n \nelif trafficDT == \"recon_traffic\":\n arrDT = [recon_traffic]\n \n if CONNECTED_ROADS:\n timeConnected = \"today\"\n print(\"TimeConnected = \" + timeConnected)\n for c in connected_roads:\n if timeConnected == \"today\":\n startIndex = np.absolute(len(arrDT[0])-len(c))\n endIndex = len(c)\n elif timeConnected == \"yesterday\":\n startIndex = 0\n endIndex = len(rolling_traffic) - RECON_SHIFT\n c = c.rename(columns={c.columns[0]: c.columns[0] + \"-\" + str(RECON_SHIFT*15) + \"mins\"})\n\n\n c = c.iloc[startIndex:endIndex, :]\n print(\"Connected Road Start time: \" + str(c.index[0]))\n c.index = arrDT[0].index\n arrDT.append(c)\n print(str(startIndex) + \" \" + str(endIndex))\n\n \nif featureEngineering != \"\":\n print(\"Adding Feature Engineering\")\n \n print(\"TimeConnected = \" + timeFE)\n\n \n if timeFE == \"today\":\n startIndex = np.absolute(len(arrDT[0])-len(rolling_traffic))\n endIndex = len(rolling_traffic)\n elif timeFE == \"yesterday\":\n startIndex = 0\n endIndex = len(rolling_traffic) - RECON_SHIFT\n \n if featureEngineering == \"Rolling\":\n temp = rolling_traffic.iloc[startIndex:endIndex, :]\n arrDT.append(temp)\n\n elif featureEngineering == \"Expanding\":\n temp = expanding_traffic.iloc[startIndex:endIndex, :]\n arrDT.append(temp)\n\n elif featureEngineering == \"Rolling and Expanding\":\n print(str(startIndex) + \" \" + str(endIndex))\n \n #Rolling\n temp = rolling_traffic.iloc[startIndex:endIndex, :]\n temp.index = arrDT[0].index\n arrDT.append(temp)\n \n #Expanding\n temp = expanding_traffic.iloc[startIndex:endIndex, :]\n temp.index = arrDT[0].index\n arrDT.append(temp)\n \nmerged_dataset = pd.concat(arrDT, axis=1)\nif \"Rolling\" in featureEngineering:\n merged_dataset = merged_dataset.iloc[ROLLING_WINDOW+1:, :]\n \nif WP:\n merged_dataset = addWorkingPeakFeatures(merged_dataset)\n print(\"Adding working / peak days\") \n\nmerged_dataset",
"Adding Feature Engineering\nTimeConnected = today\n"
]
],
[
[
"### Adding Working / Peak Features",
"_____no_output_____"
]
],
[
[
"if WP:\n merged_dataset = addWorkingPeakFeatures(merged_dataset)\n print(\"Adding working / peak days\")",
"_____no_output_____"
]
],
[
[
"## Preparing Training dataset",
"_____no_output_____"
],
[
"### Merge Original (and Rolling and Expanding)",
"_____no_output_____"
]
],
[
[
"# To-be Predicted variable \nY = merged_dataset.statusN\nY = Y.fillna(0)",
"_____no_output_____"
],
[
"# Training Data\nX = merged_dataset\nX = X.drop(X.columns[[0]], axis=1)\n\n# Splitting data\nX_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.67, shuffle=False)\nX_train = np.array(X_train)\nX_test = np.array(X_test)\nY_train = np.array(Y_train)\nY_test = np.array(Y_test)\n\n# Data scaling\n# min_max_scaler = MinMaxScaler()\n# X_train = min_max_scaler.fit_transform(X_train)\n\n#Print training and testing data\npd.concat([X, Y.to_frame()], axis=1).head()",
"_____no_output_____"
]
],
[
[
"<br><br>\n## Training Model",
"_____no_output_____"
]
],
[
[
"# Training\nregressor = SupervisedDBNRegression(hidden_layers_structure=HIDDEN_LAYER_STRUCT,\n learning_rate_rbm=RBM_LEARNING_RATE,\n learning_rate=DBN_LEARNING_RATE,\n n_epochs_rbm=RBM_EPOCHS,\n n_iter_backprop=DBN_EPOCHS,\n batch_size=BATCH_SIZE,\n activation_function=ACTIVE_FUNC)\nregressor.fit(X_train, Y_train)",
"[START] Pre-training step:\n>> Epoch 1 finished \tRBM Reconstruction error 0.600877\n>> Epoch 2 finished \tRBM Reconstruction error 0.567511\n>> Epoch 3 finished \tRBM Reconstruction error 0.521251\n>> Epoch 4 finished \tRBM Reconstruction error 0.471193\n>> Epoch 5 finished \tRBM Reconstruction error 0.412250\n>> Epoch 1 finished \tRBM Reconstruction error 0.150905\n>> Epoch 2 finished \tRBM Reconstruction error 0.141888\n>> Epoch 3 finished \tRBM Reconstruction error 0.133112\n>> Epoch 4 finished \tRBM Reconstruction error 0.125336\n>> Epoch 5 finished \tRBM Reconstruction error 0.117369\n>> Epoch 1 finished \tRBM Reconstruction error 0.045284\n>> Epoch 2 finished \tRBM Reconstruction error 0.038918\n>> Epoch 3 finished \tRBM Reconstruction error 0.038393\n>> Epoch 4 finished \tRBM Reconstruction error 0.037598\n>> Epoch 5 finished \tRBM Reconstruction error 0.036662\n[END] Pre-training step\n[START] Fine tuning step:\n>> Epoch 0 finished \tANN training loss 0.056863\n>> Epoch 1 finished \tANN training loss 0.048765\n>> Epoch 2 finished \tANN training loss 0.038931\n>> Epoch 3 finished \tANN training loss 0.028552\n>> Epoch 4 finished \tANN training loss 0.019801\n>> Epoch 5 finished \tANN training loss 0.014199\n>> Epoch 6 finished \tANN training loss 0.011577\n>> Epoch 7 finished \tANN training loss 0.010580\n>> Epoch 8 finished \tANN training loss 0.010219\n>> Epoch 9 finished \tANN training loss 0.010065\n>> Epoch 10 finished \tANN training loss 0.009976\n>> Epoch 11 finished \tANN training loss 0.009865\n>> Epoch 12 finished \tANN training loss 0.009775\n>> Epoch 13 finished \tANN training loss 0.009698\n>> Epoch 14 finished \tANN training loss 0.009636\n>> Epoch 15 finished \tANN training loss 0.009586\n>> Epoch 16 finished \tANN training loss 0.009556\n>> Epoch 17 finished \tANN training loss 0.009533\n>> Epoch 18 finished \tANN training loss 0.009486\n>> Epoch 19 finished \tANN training loss 0.009430\n>> Epoch 20 finished \tANN training loss 0.009416\n>> Epoch 21 finished \tANN training loss 0.009390\n>> Epoch 22 finished \tANN training loss 0.009394\n>> Epoch 23 finished \tANN training loss 0.009345\n>> Epoch 24 finished \tANN training loss 0.009330\n>> Epoch 25 finished \tANN training loss 0.009319\n>> Epoch 26 finished \tANN training loss 0.009298\n>> Epoch 27 finished \tANN training loss 0.009302\n>> Epoch 28 finished \tANN training loss 0.009276\n>> Epoch 29 finished \tANN training loss 0.009319\n>> Epoch 30 finished \tANN training loss 0.009279\n>> Epoch 31 finished \tANN training loss 0.009273\n>> Epoch 32 finished \tANN training loss 0.009264\n>> Epoch 33 finished \tANN training loss 0.009274\n>> Epoch 34 finished \tANN training loss 0.009242\n>> Epoch 35 finished \tANN training loss 0.009231\n>> Epoch 36 finished \tANN training loss 0.009227\n>> Epoch 37 finished \tANN training loss 0.009224\n>> Epoch 38 finished \tANN training loss 0.009249\n>> Epoch 39 finished \tANN training loss 0.009218\n>> Epoch 40 finished \tANN training loss 0.009307\n>> Epoch 41 finished \tANN training loss 0.009225\n>> Epoch 42 finished \tANN training loss 0.009235\n>> Epoch 43 finished \tANN training loss 0.009212\n>> Epoch 44 finished \tANN training loss 0.009213\n>> Epoch 45 finished \tANN training loss 0.009226\n>> Epoch 46 finished \tANN training loss 0.009228\n>> Epoch 47 finished \tANN training loss 0.009217\n>> Epoch 48 finished \tANN training loss 0.009202\n>> Epoch 49 finished \tANN training loss 0.009241\n>> Epoch 50 finished \tANN training loss 0.009205\n>> Epoch 51 finished \tANN 
training loss 0.009220\n>> Epoch 52 finished \tANN training loss 0.009202\n>> Epoch 53 finished \tANN training loss 0.009201\n>> Epoch 54 finished \tANN training loss 0.009201\n>> Epoch 55 finished \tANN training loss 0.009241\n>> Epoch 56 finished \tANN training loss 0.009195\n>> Epoch 57 finished \tANN training loss 0.009217\n>> Epoch 58 finished \tANN training loss 0.009208\n>> Epoch 59 finished \tANN training loss 0.009194\n>> Epoch 60 finished \tANN training loss 0.009195\n>> Epoch 61 finished \tANN training loss 0.009192\n>> Epoch 62 finished \tANN training loss 0.009193\n>> Epoch 63 finished \tANN training loss 0.009190\n>> Epoch 64 finished \tANN training loss 0.009193\n>> Epoch 65 finished \tANN training loss 0.009215\n>> Epoch 66 finished \tANN training loss 0.009211\n>> Epoch 67 finished \tANN training loss 0.009191\n>> Epoch 68 finished \tANN training loss 0.009190\n>> Epoch 69 finished \tANN training loss 0.009243\n>> Epoch 70 finished \tANN training loss 0.009219\n>> Epoch 71 finished \tANN training loss 0.009189\n>> Epoch 72 finished \tANN training loss 0.009185\n>> Epoch 73 finished \tANN training loss 0.009197\n>> Epoch 74 finished \tANN training loss 0.009182\n>> Epoch 75 finished \tANN training loss 0.009181\n>> Epoch 76 finished \tANN training loss 0.009182\n>> Epoch 77 finished \tANN training loss 0.009263\n>> Epoch 78 finished \tANN training loss 0.009181\n>> Epoch 79 finished \tANN training loss 0.009179\n>> Epoch 80 finished \tANN training loss 0.009179\n>> Epoch 81 finished \tANN training loss 0.009187\n>> Epoch 82 finished \tANN training loss 0.009196\n>> Epoch 83 finished \tANN training loss 0.009187\n>> Epoch 84 finished \tANN training loss 0.009178\n>> Epoch 85 finished \tANN training loss 0.009182\n>> Epoch 86 finished \tANN training loss 0.009179\n>> Epoch 87 finished \tANN training loss 0.009175\n>> Epoch 88 finished \tANN training loss 0.009176\n>> Epoch 89 finished \tANN training loss 0.009184\n>> Epoch 90 finished \tANN training loss 0.009173\n>> Epoch 91 finished \tANN training loss 0.009174\n>> Epoch 92 finished \tANN training loss 0.009226\n>> Epoch 93 finished \tANN training loss 0.009172\n>> Epoch 94 finished \tANN training loss 0.009193\n>> Epoch 95 finished \tANN training loss 0.009171\n>> Epoch 96 finished \tANN training loss 0.009180\n>> Epoch 97 finished \tANN training loss 0.009207\n>> Epoch 98 finished \tANN training loss 0.009206\n>> Epoch 99 finished \tANN training loss 0.009183\n>> Epoch 100 finished \tANN training loss 0.009167\n>> Epoch 101 finished \tANN training loss 0.009179\n>> Epoch 102 finished \tANN training loss 0.009191\n>> Epoch 103 finished \tANN training loss 0.009165\n>> Epoch 104 finished \tANN training loss 0.009184\n>> Epoch 105 finished \tANN training loss 0.009164\n>> Epoch 106 finished \tANN training loss 0.009169\n>> Epoch 107 finished \tANN training loss 0.009162\n>> Epoch 108 finished \tANN training loss 0.009175\n>> Epoch 109 finished \tANN training loss 0.009162\n>> Epoch 110 finished \tANN training loss 0.009170\n>> Epoch 111 finished \tANN training loss 0.009163\n>> Epoch 112 finished \tANN training loss 0.009163\n>> Epoch 113 finished \tANN training loss 0.009160\n>> Epoch 114 finished \tANN training loss 0.009168\n>> Epoch 115 finished \tANN training loss 0.009207\n>> Epoch 116 finished \tANN training loss 0.009159\n>> Epoch 117 finished \tANN training loss 0.009167\n>> Epoch 118 finished \tANN training loss 0.009176\n>> Epoch 119 finished \tANN training loss 0.009162\n>> Epoch 120 finished \tANN training loss 
0.009156\n>> Epoch 121 finished \tANN training loss 0.009161\n>> Epoch 122 finished \tANN training loss 0.009157\n>> Epoch 123 finished \tANN training loss 0.009155\n>> Epoch 124 finished \tANN training loss 0.009222\n>> Epoch 125 finished \tANN training loss 0.009232\n>> Epoch 126 finished \tANN training loss 0.009151\n>> Epoch 127 finished \tANN training loss 0.009166\n>> Epoch 128 finished \tANN training loss 0.009171\n>> Epoch 129 finished \tANN training loss 0.009152\n>> Epoch 130 finished \tANN training loss 0.009160\n>> Epoch 131 finished \tANN training loss 0.009149\n>> Epoch 132 finished \tANN training loss 0.009163\n>> Epoch 133 finished \tANN training loss 0.009197\n>> Epoch 134 finished \tANN training loss 0.009197\n>> Epoch 135 finished \tANN training loss 0.009160\n>> Epoch 136 finished \tANN training loss 0.009154\n>> Epoch 137 finished \tANN training loss 0.009159\n>> Epoch 138 finished \tANN training loss 0.009166\n>> Epoch 139 finished \tANN training loss 0.009144\n>> Epoch 140 finished \tANN training loss 0.009144\n>> Epoch 141 finished \tANN training loss 0.009143\n>> Epoch 142 finished \tANN training loss 0.009144\n>> Epoch 143 finished \tANN training loss 0.009144\n>> Epoch 144 finished \tANN training loss 0.009146\n>> Epoch 145 finished \tANN training loss 0.009140\n>> Epoch 146 finished \tANN training loss 0.009140\n>> Epoch 147 finished \tANN training loss 0.009139\n>> Epoch 148 finished \tANN training loss 0.009162\n"
],
[
"#To check RBM Loss Errors:\nrbm_error = regressor.unsupervised_dbn.rbm_layers[0].rbm_loss_error\n#To check DBN Loss Errors\ndbn_error = regressor.dbn_loss_error",
"_____no_output_____"
]
],
[
[
"<br><br>\n## Testing Model",
"_____no_output_____"
]
],
[
[
"# Test\nmin_max_scaler = MinMaxScaler()\nX_test = min_max_scaler.fit_transform(X_test)\nY_pred = regressor.predict(X_test)\n\nr2score = r2_score(Y_test, Y_pred)\nrmse = np.sqrt(mean_squared_error(Y_test, Y_pred))\nmae = mean_absolute_error(Y_test, Y_pred)\nprint('Done.\\nR-squared: %.3f\\nRMSE: %.3f \\nMAE: %.3f' % (r2score, rmse, mae))",
"Done.\nR-squared: 0.892\nRMSE: 0.105 \nMAE: 0.063\n"
],
[
"print(len(Y_pred))\ntemp = []\nfor i in range(len(Y_pred)):\n temp.append(Y_pred[i][0])\nd = {'Predicted': temp, 'Actual': Y_test}\n\ndf = pd.DataFrame(data=d)\ndf.head()",
"9774\n"
],
[
"# Save the model\nif MODEL_DIR != \"\":\n directory = \"models/\" + MODEL_DIR\n if not os.path.exists(directory):\n print(\"Making Directory\")\n os.makedirs(directory)\n\nregressor.save('models/' + MODEL_DIR + 'pm1_' + ROAD + '_' + YEAR + '.pkl')",
"Making Directory\n"
]
],
[
[
"### Results and Analysis below",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt",
"_____no_output_____"
]
],
[
[
"##### Printing Predicted and Actual Results",
"_____no_output_____"
]
],
[
[
"startIndex = merged_dataset.shape[0] - Y_pred.shape[0]\ndt = merged_dataset.index[startIndex:,]\ntemp = []\nfor i in range(len(Y_pred)):\n temp.append(Y_pred[i][0])\nd = {'Predicted': temp, 'Actual': Y_test, 'dt': dt}\ndf = pd.DataFrame(data=d)\ndf.head()",
"_____no_output_____"
],
[
"df.tail()",
"_____no_output_____"
]
],
[
[
"#### Visualize Actual and Predicted Traffic ",
"_____no_output_____"
]
],
[
[
"print(df.dt[0])\nstartIndex = 0\nendIndex = 96\nline1 = df.Actual.rdiv(1)\nline2 = df.Predicted.rdiv(1)\nx = range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT))\nplt.figure(figsize=(20, 4))\nplt.plot(line1[startIndex:endIndex], c='red', label=\"Actual-Congestion\")\nplt.plot(line2[startIndex:endIndex], c='blue', label=\"Predicted-Congestion\")\nplt.legend()\nplt.xlabel(\"Date\")\nplt.ylabel(\"Traffic Congestion\")\nplt.show()",
"2015-07-22 04:30:00\n"
],
[
"if OUTPUT_DIR != \"\":\n directory = \"output/\" + OUTPUT_DIR\n if not os.path.exists(directory):\n print(\"Making Directory\")\n os.makedirs(directory)\n\ndf.to_csv(\"output/\" + OUTPUT_DIR + \"pm1_\" + ROAD + '_' + YEAR + EXT, index=False, encoding='utf-8')",
"_____no_output_____"
]
],
[
[
"#### Visualize trend of loss of RBM and DBN Training",
"_____no_output_____"
]
],
[
[
"line1 = rbm_error\nline2 = dbn_error\nx = range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT))\nplt.plot(range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT)), line1, c='red')\nplt.xticks(x)\nplt.xlabel(\"Iteration\")\nplt.ylabel(\"Error\")\nplt.show()\n\n\nplt.plot(range(DBN_EPOCHS), line2, c='blue')\nplt.xticks(x)\nplt.xlabel(\"Iteration\")\nplt.ylabel(\"Error\")\nplt.show()\n\nplt.plot(range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT)), line1, c='red')\nplt.plot(range(DBN_EPOCHS), line2, c='blue')\nplt.xticks(x)\nplt.xlabel(\"Iteration\")\nplt.ylabel(\"Error\")\nplt.show()",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d06497270dad7bc9e9b7b8fb64621773d14f527c | 521,000 | ipynb | Jupyter Notebook | Course02/Voxel-Map.ipynb | thhuang/NOTES-FCND | c5b0ec7d99df3cb60a850308d16ccc6c096c7931 | [
"MIT"
]
| 1 | 2018-10-26T04:06:21.000Z | 2018-10-26T04:06:21.000Z | Course02/Voxel-Map.ipynb | thhuang/notes-fcnd | c5b0ec7d99df3cb60a850308d16ccc6c096c7931 | [
"MIT"
]
| null | null | null | Course02/Voxel-Map.ipynb | thhuang/notes-fcnd | c5b0ec7d99df3cb60a850308d16ccc6c096c7931 | [
"MIT"
]
| 1 | 2018-10-26T04:06:23.000Z | 2018-10-26T04:06:23.000Z | 2,592.039801 | 515,672 | 0.960152 | [
[
[
"# 3D Map\n\nWhile representing the configuration space in 3 dimensions isn't entirely practical it's fun (and useful) to visualize things in 3D.\n\nIn this exercise you'll finish the implementation of `create_grid` such that a 3D grid is returned where cells containing a voxel are set to `True`. We'll then plot the result!",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom mpl_toolkits.mplot3d import Axes3D\n\n%matplotlib inline ",
"_____no_output_____"
],
[
"plt.rcParams['figure.figsize'] = 16, 16",
"_____no_output_____"
],
[
"# This is the same obstacle data from the previous lesson.\nfilename = 'colliders.csv'\ndata = np.loadtxt(filename, delimiter=',', dtype='Float64', skiprows=2)\nprint(data)",
"[[-305. -435. 85.5 5. 5. 85.5]\n [-295. -435. 85.5 5. 5. 85.5]\n [-285. -435. 85.5 5. 5. 85.5]\n ...\n [ 435. 465. 8. 5. 5. 8. ]\n [ 445. 465. 8. 5. 5. 8. ]\n [ 455. 465. 8. 5. 5. 8. ]]\n"
],
[
"def create_voxmap(data, voxel_size=5):\n \"\"\"\n Returns a grid representation of a 3D configuration space\n based on given obstacle data.\n \n The `voxel_size` argument sets the resolution of the voxel map. \n \"\"\"\n\n # minimum and maximum north coordinates\n north_min = np.floor(np.amin(data[:, 0] - data[:, 3]))\n north_max = np.ceil(np.amax(data[:, 0] + data[:, 3]))\n\n # minimum and maximum east coordinates\n east_min = np.floor(np.amin(data[:, 1] - data[:, 4]))\n east_max = np.ceil(np.amax(data[:, 1] + data[:, 4]))\n\n alt_max = np.ceil(np.amax(data[:, 2] + data[:, 5]))\n \n # given the minimum and maximum coordinates we can\n # calculate the size of the grid.\n north_size = int(np.ceil((north_max - north_min))) // voxel_size\n east_size = int(np.ceil((east_max - east_min))) // voxel_size\n alt_size = int(alt_max) // voxel_size\n\n voxmap = np.zeros((north_size, east_size, alt_size), dtype=np.bool)\n\n for datum in data:\n x, y, z, dx, dy, dz = datum.astype(np.int32)\n obstacle = np.array(((x-dx, x+dx),\n (y-dy, y+dy),\n (z-dz, z+dz)))\n obstacle[0] = (obstacle[0] - north_min) // voxel_size\n obstacle[1] = (obstacle[1] - east_min) // voxel_size\n obstacle[2] = obstacle[2] // voxel_size \n voxmap[obstacle[0][0]:obstacle[0][1], obstacle[1][0]:obstacle[1][1], obstacle[2][0]:obstacle[2][1]] = True\n \n return voxmap",
"_____no_output_____"
]
],
[
[
"Create 3D grid.",
"_____no_output_____"
]
],
[
[
"voxel_size = 10\nvoxmap = create_voxmap(data, voxel_size)\nprint(voxmap.shape)",
"(81, 91, 21)\n"
]
],
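[
[
"# A quick occupancy check (a sketch added for clarity): cells containing a\n# voxel are True, so summing counts obstacle cells and the mean gives the\n# fraction of occupied space.\nprint('occupied voxels: {} of {} ({:.1%})'.format(voxmap.sum(), voxmap.size, voxmap.mean()))",
"_____no_output_____"
]
],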
[
[
"Plot the 3D grid. ",
"_____no_output_____"
]
],
[
[
"fig = plt.figure()\nax = fig.gca(projection='3d')\nax.voxels(voxmap, edgecolor='k')\nax.set_xlim(voxmap.shape[0], 0)\nax.set_ylim(0, voxmap.shape[1])\n# add 100 to the height so the buildings aren't so tall\nax.set_zlim(0, voxmap.shape[2]+100//voxel_size)\n\nplt.xlabel('North')\nplt.ylabel('East')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"Isn't the city pretty?",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d0649f7ba52737564c85b801018ece1b975776e8 | 51,572 | ipynb | Jupyter Notebook | Projecting_Covid_19_Case_Growth_in_Bangladesh_Using_Logistic_Regression.ipynb | tanzimtaher/Modeling-Covid-19-Cumulative-Case-Growth-in-Bangladesh-with-Logistic-Regression | 6304b1422a4414a3eda7b8fd1ee529b69291edd6 | [
"Xnet",
"X11"
]
| null | null | null | Projecting_Covid_19_Case_Growth_in_Bangladesh_Using_Logistic_Regression.ipynb | tanzimtaher/Modeling-Covid-19-Cumulative-Case-Growth-in-Bangladesh-with-Logistic-Regression | 6304b1422a4414a3eda7b8fd1ee529b69291edd6 | [
"Xnet",
"X11"
]
| null | null | null | Projecting_Covid_19_Case_Growth_in_Bangladesh_Using_Logistic_Regression.ipynb | tanzimtaher/Modeling-Covid-19-Cumulative-Case-Growth-in-Bangladesh-with-Logistic-Regression | 6304b1422a4414a3eda7b8fd1ee529b69291edd6 | [
"Xnet",
"X11"
]
| null | null | null | 70.646575 | 31,328 | 0.796847 | [
[
[
"# Prologue",
"_____no_output_____"
],
[
"For this project we will use the logistic regression function to model the growth of confirmed Covid-19 case population growth in Bangladesh. The logistic regression function is commonly used in classification problems, and in this project we will be examining how it fares as a regression tool. Both cumulative case counts over time and logistic regression curves have a sigmoid shape and we shall try to fit a theoretically predicted curve over the actual cumulative case counts over time to reach certain conclusions about the case count growth, such as the time of peak daily new cases and the total cases that may be reached during this outbreak.",
"_____no_output_____"
],
[
"# Import the necessary modules",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nfrom datetime import datetime,timedelta\nfrom sklearn.metrics import mean_squared_error\nfrom scipy.optimize import curve_fit\nfrom scipy.optimize import fsolve\nimport matplotlib.pyplot as plt\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"# Connect to Google Drive (where the data is kept)",
"_____no_output_____"
]
],
[
[
"from google.colab import drive\ndrive.mount('/content/drive')",
"Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n"
]
],
[
[
"# Import data and format as needed",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv('/content/drive/My Drive/Corona-Cases.n-1.csv')\ndf.tail()",
"_____no_output_____"
]
],
[
[
"As you can see, the format of the date is 'month-day-year'. Let's specify the date column is datetime type. Let's also specify the formatting as %m-%d-%Y. And then, let's find the day when the first confirmed cases of Covid-19 were reported in Bangladesh.",
"_____no_output_____"
]
],
[
[
"FMT = '%m-%d-%Y'\ndf['Date'] = pd.to_datetime(df['Date'], format=FMT)",
"_____no_output_____"
]
],
[
[
"We have to initialize the first date of confirmed Covid-19 cases as the datetime variable start_date because we would need it later to calculate the peak.",
"_____no_output_____"
]
],
[
[
"# Initialize the start date\nstart_date = datetime.date(df.loc[0, 'Date'])\nprint('Start date: ', start_date)",
"Start date: 2020-03-08\n"
]
],
[
[
"Now, for the logistic regression function, we would need a timestep column instead of a date column in the dataframe. So we create a new dataframe called data where we drop the date column and use the index as the timestep column.",
"_____no_output_____"
]
],
[
[
"# drop date column\ndata = df['Total cases']\n\n# reset index and create a timestep\ndata = data.reset_index(drop=False)\n\n# rename columns\ndata.columns = ['Timestep', 'Total Cases']\n\n# check\ndata.tail()",
"_____no_output_____"
]
],
[
[
"# Defining the logistic regression function",
"_____no_output_____"
]
],
[
[
"def logistic_model(x,a,b,c):\n return c/(1+np.exp(-(x-b)/a))",
"_____no_output_____"
]
],
[
[
"In this formula, we have the variable x that is the time and three parameters: a, b, c.\n* a is a metric for the speed of infections\n* b is the day with the estimated maximum growth rate of confirmed Covid-19 cases\n* c is the maximum number the cumulative confirmed cases will reach by the end of the first outbreak here in Bangladesh\n\nThe growth of cumulative cases follows a sigmoid shape like the logistic regression curve and hence, this may be a good way to model the growth of the confirmed Covid-19 case population over time. For the first outbreak at least. It makes sense because, for an outbreak, the rise in cumulative case counts is initially exponential. Then there is a point of inflection where the curve nearly becomes linear. We assume that this point of inflection is the time around which the daily new case numbers will peak. After that the curve eventually flattens out. \n\n",
"_____no_output_____"
],
[
"# Fit the logistic function and extrapolate",
"_____no_output_____"
]
],
[
[
"# Initialize all the timesteps as x\nx = list(data.iloc[:,0])\n\n# Initialize all the Total Cases values as y\ny = list(data.iloc[:,1])\n\n# Fit the curve using sklearn's curve_fit method we initialize the parameter p0 with arbitrary values\nfit = curve_fit(logistic_model,x,y,p0=[2,100,20000])\n(a, b, c), cov = fit",
"_____no_output_____"
],
[
"# Print outputs\nprint('Metric for speed of infections: ', a)\nprint('Days from start when cumulative case counts will peak: ', b)\nprint('Total cumulative cases that will be reached: ', c)",
"Metric for speed of infections: 17.41386234974941\nDays from start when cumulative case counts will peak: 110.7731800890406\nTotal cumulative cases that will be reached: 265257.7755190932\n"
],
[
"# Print errors for a, b, c\nerrors = [np.sqrt(fit[1][i][i]) for i in [0,1,2]]\nprint('Errors in a, b and c respectively:\\n', errors)",
"Errors in a, b and c respectively:\n [0.12923467446546272, 0.24474862210706608, 1384.097103078659]\n"
],
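[
"# A short illustration (a sketch; follows directly from the fitted logistic\n# function): the inflection point sits at x = b, where cumulative cases equal\n# c/2 and the implied daily growth rate peaks at c/(4*a).\nprint('Cases at the inflection day (c/2): ', c / 2)\nprint('Peak daily new cases implied by the fit (c/(4a)): ', c / (4 * a))",
"_____no_output_____"
],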
[
"# estimated time of peak\nprint('Estimated time of peak between', start_date + timedelta(days=(b-errors[1])), ' and ', start_date + timedelta(days=(b+errors[1])))\n\n# estimated total number of infections \nprint('Estimated total number of infections betweeen ', (c - errors[2]), ' and ', (c + errors[2]))",
"Estimated time of peak between 2020-06-26 and 2020-06-27\nEstimated total number of infections betweeen 263873.67841601453 and 266641.8726221719\n"
]
],
[
[
"To extrapolate the curve to the future, use the fsolve function from scipy.",
"_____no_output_____"
]
],
[
[
"# Extrapolate\nsol = int(fsolve(lambda x : logistic_model(x,a,b,c) - int(c),b))",
"_____no_output_____"
]
],
[
[
"# Plot the graph",
"_____no_output_____"
]
],
[
[
"pred_x = list(range(max(x),sol))\nplt.rcParams['figure.figsize'] = [7, 7]\nplt.rc('font', size=14)\n# Real data\nplt.scatter(x,y,label=\"Real data\",color=\"red\")\n# Predicted logistic curve\nplt.plot(x+pred_x, [logistic_model(i,fit[0][0],fit[0][1],fit[0][2]) for i in x+pred_x], label=\"Logistic model\" )\nplt.legend()\nplt.xlabel(\"Days since 8th March 2020\")\nplt.ylabel(\"Total number of infected people\")\nplt.ylim((min(y)*0.9,c*1.1))\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Evaluate the MSE error",
"_____no_output_____"
],
[
"Evaluating the mean squared error (MSE) is not very meaningful on its own until we can compare it with another predictive method. We can compare MSE of our regression with MSE from another method to check if our logistic regression model works better than the other predictive model. The model with the lower MSE performs better.\n\n\n",
"_____no_output_____"
]
],
[
[
"y_pred_logistic = [logistic_model(i,fit[0][0],fit[0][1],fit[0][2])\nfor i in x]\n\nprint('Mean squared error: ', mean_squared_error(y,y_pred_logistic))",
"Mean squared error: 3298197.2412489704\n"
]
],
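[
[
"# Baseline comparison (a minimal sketch, assuming numpy's polyfit as the\n# alternative predictive method mentioned above; the degree-3 choice is\n# arbitrary): a second MSE to hold against the logistic model's MSE.\ncoeffs = np.polyfit(x, y, deg=3)\ny_pred_poly = np.polyval(coeffs, x)\nprint('Polynomial (degree 3) MSE: ', mean_squared_error(y, y_pred_poly))",
"_____no_output_____"
]
],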
[
[
"# Epilogue",
"_____no_output_____"
],
[
"We should be mindful of some caveats:\n\n* These predictions will only be meaningful when the peak has actually been crossed definitively. \n\n* Also, the reliability of the reported cases would also influence the dependability of the model. Developing countries, especially the South Asian countries have famously failed to report accurate disaster statisticcs in the past. \n\n* Also, the testing numbers are low overall, especially in cities outside Dhaka where the daily new cases still have not peaked yet.\n\n* Since most of the cases reported were in Dhaka, the findings indicate that the peak in Dhaka may have been reached already.\n\n* If there is a second outbreak before the first outbreak subsides, the curve may not be sigmoid shaped and hence the results may not be as meaningful.\n\n* The total reported case numbers will possibly be greater than 260000, because the daily new cases is still rising in some cities other than Dhaka. It is not unsound to expect that the total reported case count for this first instance of Covid-19 outbreak could very well reach 300000 or more.\n\n* The government recently hiked the prices of tests which may have led to increased unwillingness in suspected candidates to actually test for the disease, and that may have influenced the recent confirmed case counts.",
"_____no_output_____"
],
[
"# References",
"_____no_output_____"
],
[
"Inspiration for theory and code from the following articles:\n\n* [Covid-19 infection in Italy. Mathematical models and predictions](https://towardsdatascience.com/covid-19-infection-in-italy-mathematical-models-and-predictions-7784b4d7dd8d)\n\n* [Logistic growth modelling of COVID-19 proliferation in China and its international implications](https://www.sciencedirect.com/science/article/pii/S1201971220303039)\n\n* [Logistic Growth Model for COVID-19](https://www.wolframcloud.com/obj/covid-19/Published/Logistic-Growth-Model-for-COVID-19.nb)\n",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
]
]
|
d064af379f3ecbb6e76efa98695906e87b3a7151 | 149,980 | ipynb | Jupyter Notebook | examples/Notebooks/flopy3_LoadSWRBinaryData.ipynb | gyanz/flopy | 282703716a01721e07905da65aa54e6017452a5a | [
"CC0-1.0",
"BSD-3-Clause"
]
| 1 | 2019-11-01T00:34:14.000Z | 2019-11-01T00:34:14.000Z | examples/Notebooks/flopy3_LoadSWRBinaryData.ipynb | gyanz/flopy | 282703716a01721e07905da65aa54e6017452a5a | [
"CC0-1.0",
"BSD-3-Clause"
]
| null | null | null | examples/Notebooks/flopy3_LoadSWRBinaryData.ipynb | gyanz/flopy | 282703716a01721e07905da65aa54e6017452a5a | [
"CC0-1.0",
"BSD-3-Clause"
]
| null | null | null | 323.930886 | 54,404 | 0.930931 | [
[
[
"# FloPy\n\n## Plotting SWR Process Results\n\nThis notebook demonstrates the use of the `SwrObs` and `SwrStage`, `SwrBudget`, `SwrFlow`, and `SwrExchange`, `SwrStructure`, classes to read binary SWR Process observation, stage, budget, reach to reach flows, reach-aquifer exchange, and structure files. It demonstrates these capabilities by loading these binary file types and showing examples of plotting SWR Process data. An example showing how the simulated water surface profile at a selected time along a selection of reaches can be plotted is also presented.",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\nfrom IPython.display import Image\nimport os\nimport sys\nimport numpy as np\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\n\n# run installed version of flopy or add local path\ntry:\n import flopy\nexcept:\n fpth = os.path.abspath(os.path.join('..', '..'))\n sys.path.append(fpth)\n import flopy\n\nprint(sys.version)\nprint('numpy version: {}'.format(np.__version__))\nprint('matplotlib version: {}'.format(mpl.__version__))\nprint('flopy version: {}'.format(flopy.__version__))",
"3.6.5 | packaged by conda-forge | (default, Apr 6 2018, 13:44:09) \n[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]\nnumpy version: 1.14.5\nmatplotlib version: 2.2.2\nflopy version: 3.2.10\n"
],
[
"#Set the paths\ndatapth = os.path.join('..', 'data', 'swr_test')\n\n# SWR Process binary files \nfiles = ('SWR004.obs', 'SWR004.vel', 'SWR004.str', 'SWR004.stg', 'SWR004.flow')",
"_____no_output_____"
]
],
[
[
"### Load SWR Process observations\n\nCreate an instance of the `SwrObs` class and load the observation data.",
"_____no_output_____"
]
],
[
[
"sobj = flopy.utils.SwrObs(os.path.join(datapth, files[0]))\n\nts = sobj.get_data()",
"_____no_output_____"
]
],
[
[
"#### Plot the data from the binary SWR Process observation file",
"_____no_output_____"
]
],
[
[
"fig = plt.figure(figsize=(6, 12))\nax1 = fig.add_subplot(3, 1, 1)\nax1.semilogx(ts['totim']/3600., -ts['OBS1'], label='OBS1')\nax1.semilogx(ts['totim']/3600., -ts['OBS2'], label='OBS2')\nax1.semilogx(ts['totim']/3600., -ts['OBS9'], label='OBS3')\nax1.set_ylabel('Flow, in cubic meters per second')\nax1.legend()\n\nax = fig.add_subplot(3, 1, 2, sharex=ax1)\nax.semilogx(ts['totim']/3600., -ts['OBS4'], label='OBS4')\nax.semilogx(ts['totim']/3600., -ts['OBS5'], label='OBS5')\nax.set_ylabel('Flow, in cubic meters per second')\nax.legend()\n\nax = fig.add_subplot(3, 1, 3, sharex=ax1)\nax.semilogx(ts['totim']/3600., ts['OBS6'], label='OBS6')\nax.semilogx(ts['totim']/3600., ts['OBS7'], label='OBS7')\nax.set_xlim(1, 100)\nax.set_ylabel('Stage, in meters')\nax.set_xlabel('Time, in hours')\nax.legend();",
"_____no_output_____"
]
],
[
[
"### Load the same data from the individual binary SWR Process files\n\nLoad discharge data from the flow file. The flow file contains the simulated flow between connected reaches for each connection in the model.",
"_____no_output_____"
]
],
[
[
"sobj = flopy.utils.SwrFlow(os.path.join(datapth, files[1]))\ntimes = np.array(sobj.get_times())/3600.\nobs1 = sobj.get_ts(irec=1, iconn=0)\nobs2 = sobj.get_ts(irec=14, iconn=13)\nobs4 = sobj.get_ts(irec=4, iconn=3)\nobs5 = sobj.get_ts(irec=5, iconn=4)",
"_____no_output_____"
]
],
[
[
"Load discharge data from the structure file. The structure file contains the simulated structure flow for each reach with a structure.",
"_____no_output_____"
]
],
[
[
"sobj = flopy.utils.SwrStructure(os.path.join(datapth, files[2]))\nobs3 = sobj.get_ts(irec=17, istr=0)",
"_____no_output_____"
]
],
[
[
"Load stage data from the stage file. The flow file contains the simulated stage for each reach in the model.",
"_____no_output_____"
]
],
[
[
"sobj = flopy.utils.SwrStage(os.path.join(datapth, files[3]))\nobs6 = sobj.get_ts(irec=13)",
"_____no_output_____"
]
],
[
[
"Load budget data from the budget file. The budget file contains the simulated budget for each reach group in the model. The budget file also contains the stage data for each reach group. In this case the number of reach groups equals the number of reaches in the model.",
"_____no_output_____"
]
],
[
[
"sobj = flopy.utils.SwrBudget(os.path.join(datapth, files[4]))\nobs7 = sobj.get_ts(irec=17)",
"_____no_output_____"
]
],
[
[
"#### Plot the data loaded from the individual binary SWR Process files.\n\nNote that the plots are identical to the plots generated from the binary SWR observation data.",
"_____no_output_____"
]
],
[
[
"fig = plt.figure(figsize=(6, 12))\nax1 = fig.add_subplot(3, 1, 1)\nax1.semilogx(times, obs1['flow'], label='OBS1')\nax1.semilogx(times, obs2['flow'], label='OBS2')\nax1.semilogx(times, -obs3['strflow'], label='OBS3')\nax1.set_ylabel('Flow, in cubic meters per second')\nax1.legend()\n\nax = fig.add_subplot(3, 1, 2, sharex=ax1)\nax.semilogx(times, obs4['flow'], label='OBS4')\nax.semilogx(times, obs5['flow'], label='OBS5')\nax.set_ylabel('Flow, in cubic meters per second')\nax.legend()\n\nax = fig.add_subplot(3, 1, 3, sharex=ax1)\nax.semilogx(times, obs6['stage'], label='OBS6')\nax.semilogx(times, obs7['stage'], label='OBS7')\nax.set_xlim(1, 100)\nax.set_ylabel('Stage, in meters')\nax.set_xlabel('Time, in hours')\nax.legend();",
"_____no_output_____"
]
],
[
[
"### Plot simulated water surface profiles\n\nSimulated water surface profiles can be created using the `ModelCrossSection` class. \n\nSeveral things that we need in addition to the stage data include reach lengths and bottom elevations. We load these data from an existing file.",
"_____no_output_____"
]
],
[
[
"sd = np.genfromtxt(os.path.join(datapth, 'SWR004.dis.ref'), names=True)",
"_____no_output_____"
]
],
[
[
"The contents of the file are shown in the cell below.",
"_____no_output_____"
]
],
[
[
"fc = open(os.path.join(datapth, 'SWR004.dis.ref')).readlines()\nfc",
"_____no_output_____"
]
],
[
[
"Create an instance of the `SwrStage` class for SWR Process stage data.",
"_____no_output_____"
]
],
[
[
"sobj = flopy.utils.SwrStage(os.path.join(datapth, files[3]))",
"_____no_output_____"
]
],
[
[
"Create a selection condition (`iprof`) that can be used to extract data for the reaches of interest (reaches 0, 1, and 8 through 17). Use this selection condition to extract reach lengths (from `sd['RLEN']`) and the bottom elevation (from `sd['BELEV']`) for the reaches of interest. The selection condition will also be used to extract the stage data for reaches of interest.",
"_____no_output_____"
]
],
[
[
"iprof = sd['IRCH'] > 0\niprof[2:8] = False\ndx = np.extract(iprof, sd['RLEN'])\nbelev = np.extract(iprof, sd['BELEV'])",
"_____no_output_____"
]
],
[
[
"Create a fake model instance so that the `ModelCrossSection` class can be used.",
"_____no_output_____"
]
],
[
[
"ml = flopy.modflow.Modflow()\ndis = flopy.modflow.ModflowDis(ml, nrow=1, ncol=dx.shape[0], delr=dx, top=4.5, botm=belev.reshape(1,1,12))",
"_____no_output_____"
]
],
[
[
"Create an array with the x position at the downstream end of each reach, which will be used to color the plots below each reach. ",
"_____no_output_____"
]
],
[
[
"x = np.cumsum(dx)",
"_____no_output_____"
]
],
[
[
"Plot simulated water surface profiles for 8 times.",
"_____no_output_____"
]
],
[
[
"fig = plt.figure(figsize=(12, 12))\nfor idx, v in enumerate([19, 29, 34, 39, 44, 49, 54, 59]):\n ax = fig.add_subplot(4, 2, idx+1)\n s = sobj.get_data(idx=v)\n stage = np.extract(iprof, s['stage'])\n xs = flopy.plot.ModelCrossSection(model=ml, line={'Row': 0})\n xs.plot_fill_between(stage.reshape(1,1,12), colors=['none', 'blue'], ax=ax, edgecolors='none')\n linecollection = xs.plot_grid(ax=ax, zorder=10)\n ax.fill_between(np.append(0., x), y1=np.append(belev[0], belev), y2=-0.5, \n facecolor='0.5', edgecolor='none', step='pre')\n ax.set_title('{} hours'.format(times[v]))\n ax.set_ylim(-0.5, 4.5)",
"_____no_output_____"
]
],
[
[
"## Summary\n\nThis notebook demonstrates flopy functionality for reading binary output generated by the SWR Process. Binary files that can be read include observations, stages, budgets, flow, reach-aquifer exchanges, and structure data. The binary stage data can also be used to create water-surface profiles. \n\nHope this gets you started!",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
]
|
d064b3c29ebb882359b6c29fc749f2293e5fe886 | 19,035 | ipynb | Jupyter Notebook | Lec4/Lab6_result_report.ipynb | Cho-D-YoungRae/Standalone-DeepLearning | ea581708eca95fb73bb34dc17fb0dadb5f1a93a3 | [
"MIT"
]
| 553 | 2019-01-20T07:54:00.000Z | 2022-03-31T16:35:17.000Z | Lec4/Lab6_result_report.ipynb | betteryy/Standalone-DeepLearning | dfc12f6dc98d13751eebf5a1503665e09647f499 | [
"MIT"
]
| 10 | 2019-01-22T12:23:33.000Z | 2021-05-22T08:41:00.000Z | Lec4/Lab6_result_report.ipynb | betteryy/Standalone-DeepLearning | dfc12f6dc98d13751eebf5a1503665e09647f499 | [
"MIT"
]
| 190 | 2019-01-17T20:32:13.000Z | 2022-03-31T02:56:34.000Z | 33.277972 | 199 | 0.492409 | [
[
[
"[์ ๊ฐ ๋ฏธ๋ฆฌ ๋ง๋ค์ด๋์ ์ด ๋งํฌ](https://colab.research.google.com/github/heartcored98/Standalone-DeepLearning/blob/master/Lec4/Lab6_result_report.ipynb)๋ฅผ ํตํด Colab์์ ๋ฐ๋ก ์์
ํ์ค ์ ์์ต๋๋ค! \n๋ฐํ์ ์ ํ์ python3, GPU ๊ฐ์ ํ์ธํ๊ธฐ!",
"_____no_output_____"
]
],
[
[
"!mkdir results",
"_____no_output_____"
],
[
"import torch\nimport torchvision\nimport torchvision.transforms as transforms\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nimport argparse\nimport numpy as np\nimport time\nfrom copy import deepcopy # Add Deepcopy for args",
"_____no_output_____"
]
],
[
[
"## Data Preparation",
"_____no_output_____"
]
],
[
[
"transform = transforms.Compose(\n [transforms.ToTensor(),\n transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))])\n\ntrainset = torchvision.datasets.CIFAR10(root='./data', train=True,\n download=True, transform=transform)\ntrainset, valset = torch.utils.data.random_split(trainset, [40000, 10000])\ntestset = torchvision.datasets.CIFAR10(root='./data', train=False,\n download=True, transform=transform)\npartition = {'train': trainset, 'val':valset, 'test':testset}",
"_____no_output_____"
]
],
[
[
"## Model Architecture",
"_____no_output_____"
]
],
[
[
"class MLP(nn.Module):\n def __init__(self, in_dim, out_dim, hid_dim, n_layer, act, dropout, use_bn, use_xavier):\n super(MLP, self).__init__()\n self.in_dim = in_dim\n self.out_dim = out_dim\n self.hid_dim = hid_dim\n self.n_layer = n_layer\n self.act = act\n self.dropout = dropout\n self.use_bn = use_bn\n self.use_xavier = use_xavier\n \n # ====== Create Linear Layers ====== #\n self.fc1 = nn.Linear(self.in_dim, self.hid_dim)\n \n self.linears = nn.ModuleList()\n self.bns = nn.ModuleList()\n for i in range(self.n_layer-1):\n self.linears.append(nn.Linear(self.hid_dim, self.hid_dim))\n if self.use_bn:\n self.bns.append(nn.BatchNorm1d(self.hid_dim))\n \n self.fc2 = nn.Linear(self.hid_dim, self.out_dim)\n \n # ====== Create Activation Function ====== #\n if self.act == 'relu':\n self.act = nn.ReLU()\n elif self.act == 'tanh':\n self.act == nn.Tanh()\n elif self.act == 'sigmoid':\n self.act = nn.Sigmoid()\n else:\n raise ValueError('no valid activation function selected!')\n \n # ====== Create Regularization Layer ======= #\n self.dropout = nn.Dropout(self.dropout)\n if self.use_xavier:\n self.xavier_init()\n \n def forward(self, x):\n x = self.act(self.fc1(x))\n for i in range(len(self.linears)):\n x = self.act(self.linears[i](x))\n x = self.bns[i](x)\n x = self.dropout(x)\n x = self.fc2(x)\n return x\n \n def xavier_init(self):\n for linear in self.linears:\n nn.init.xavier_normal_(linear.weight)\n linear.bias.data.fill_(0.01)\n \nnet = MLP(3072, 10, 100, 4, 'relu', 0.1, True, True) # Testing Model Construction",
"_____no_output_____"
]
],
[
[
"## Train, Validate, Test and Experiment",
"_____no_output_____"
]
],
[
[
"def train(net, partition, optimizer, criterion, args):\n trainloader = torch.utils.data.DataLoader(partition['train'], \n batch_size=args.train_batch_size, \n shuffle=True, num_workers=2)\n net.train()\n\n correct = 0\n total = 0\n train_loss = 0.0\n for i, data in enumerate(trainloader, 0):\n optimizer.zero_grad() # [21.01.05 ์ค๋ฅ ์์ ] ๋งค Epoch ๋ง๋ค .zero_grad()๊ฐ ์คํ๋๋ ๊ฒ์ ๋งค iteration ๋ง๋ค ์คํ๋๋๋ก ์์ ํ์ต๋๋ค. \n\n # get the inputs\n inputs, labels = data\n inputs = inputs.view(-1, 3072)\n inputs = inputs.cuda()\n labels = labels.cuda()\n outputs = net(inputs)\n\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n\n train_loss += loss.item()\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n train_loss = train_loss / len(trainloader)\n train_acc = 100 * correct / total\n return net, train_loss, train_acc",
"_____no_output_____"
],
[
"def validate(net, partition, criterion, args):\n valloader = torch.utils.data.DataLoader(partition['val'], \n batch_size=args.test_batch_size, \n shuffle=False, num_workers=2)\n net.eval()\n\n correct = 0\n total = 0\n val_loss = 0 \n with torch.no_grad():\n for data in valloader:\n images, labels = data\n images = images.view(-1, 3072)\n images = images.cuda()\n labels = labels.cuda()\n outputs = net(images)\n\n loss = criterion(outputs, labels)\n \n val_loss += loss.item()\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n val_loss = val_loss / len(valloader)\n val_acc = 100 * correct / total\n return val_loss, val_acc",
"_____no_output_____"
],
[
"def test(net, partition, args):\n testloader = torch.utils.data.DataLoader(partition['test'], \n batch_size=args.test_batch_size, \n shuffle=False, num_workers=2)\n net.eval()\n \n correct = 0\n total = 0\n with torch.no_grad():\n for data in testloader:\n images, labels = data\n images = images.view(-1, 3072)\n images = images.cuda()\n labels = labels.cuda()\n\n outputs = net(images)\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n test_acc = 100 * correct / total\n return test_acc",
"_____no_output_____"
],
[
"def experiment(partition, args):\n \n net = MLP(args.in_dim, args.out_dim, args.hid_dim, args.n_layer, args.act, args.dropout, args.use_bn, args.use_xavier)\n net.cuda()\n\n criterion = nn.CrossEntropyLoss()\n if args.optim == 'SGD':\n optimizer = optim.RMSprop(net.parameters(), lr=args.lr, weight_decay=args.l2)\n elif args.optim == 'RMSprop':\n optimizer = optim.RMSprop(net.parameters(), lr=args.lr, weight_decay=args.l2)\n elif args.optim == 'Adam':\n optimizer = optim.Adam(net.parameters(), lr=args.lr, weight_decay=args.l2)\n else:\n raise ValueError('In-valid optimizer choice')\n \n # ===== List for epoch-wise data ====== #\n train_losses = []\n val_losses = []\n train_accs = []\n val_accs = []\n # ===================================== #\n \n for epoch in range(args.epoch): # loop over the dataset multiple times\n ts = time.time()\n net, train_loss, train_acc = train(net, partition, optimizer, criterion, args)\n val_loss, val_acc = validate(net, partition, criterion, args)\n te = time.time()\n \n # ====== Add Epoch Data ====== #\n train_losses.append(train_loss)\n val_losses.append(val_loss)\n train_accs.append(train_acc)\n val_accs.append(val_acc)\n # ============================ #\n \n print('Epoch {}, Acc(train/val): {:2.2f}/{:2.2f}, Loss(train/val) {:2.2f}/{:2.2f}. Took {:2.2f} sec'.format(epoch, train_acc, val_acc, train_loss, val_loss, te-ts))\n \n test_acc = test(net, partition, args) \n \n # ======= Add Result to Dictionary ======= #\n result = {}\n result['train_losses'] = train_losses\n result['val_losses'] = val_losses\n result['train_accs'] = train_accs\n result['val_accs'] = val_accs\n result['train_acc'] = train_acc\n result['val_acc'] = val_acc\n result['test_acc'] = test_acc\n return vars(args), result\n # ===================================== #",
"_____no_output_____"
]
],
[
[
"# Manage Experiment Result",
"_____no_output_____"
]
],
[
[
"import hashlib\nimport json\nfrom os import listdir\nfrom os.path import isfile, join\nimport pandas as pd\n\ndef save_exp_result(setting, result):\n exp_name = setting['exp_name']\n del setting['epoch']\n del setting['test_batch_size']\n\n hash_key = hashlib.sha1(str(setting).encode()).hexdigest()[:6]\n filename = './results/{}-{}.json'.format(exp_name, hash_key)\n result.update(setting)\n with open(filename, 'w') as f:\n json.dump(result, f)\n\n \ndef load_exp_result(exp_name):\n dir_path = './results'\n filenames = [f for f in listdir(dir_path) if isfile(join(dir_path, f)) if '.json' in f]\n list_result = []\n for filename in filenames:\n if exp_name in filename:\n with open(join(dir_path, filename), 'r') as infile:\n results = json.load(infile)\n list_result.append(results)\n df = pd.DataFrame(list_result) # .drop(columns=[])\n return df\n ",
"_____no_output_____"
]
],
[
[
"## Experiment",
"_____no_output_____"
]
],
[
[
"# ====== Random Seed Initialization ====== #\nseed = 123\nnp.random.seed(seed)\ntorch.manual_seed(seed)\n\nparser = argparse.ArgumentParser()\nargs = parser.parse_args(\"\")\nargs.exp_name = \"exp1_n_layer_hid_dim\"\n\n# ====== Model Capacity ====== #\nargs.in_dim = 3072\nargs.out_dim = 10\nargs.hid_dim = 100\nargs.act = 'relu'\n\n# ====== Regularization ======= #\nargs.dropout = 0.2\nargs.use_bn = True\nargs.l2 = 0.00001\nargs.use_xavier = True\n\n# ====== Optimizer & Training ====== #\nargs.optim = 'RMSprop' #'RMSprop' #SGD, RMSprop, ADAM...\nargs.lr = 0.0015\nargs.epoch = 10\n\nargs.train_batch_size = 256\nargs.test_batch_size = 1024\n\n# ====== Experiment Variable ====== #\nname_var1 = 'n_layer'\nname_var2 = 'hid_dim'\nlist_var1 = [1, 2, 3]\nlist_var2 = [500, 300]\n\n\nfor var1 in list_var1:\n for var2 in list_var2:\n setattr(args, name_var1, var1)\n setattr(args, name_var2, var2)\n print(args)\n \n setting, result = experiment(partition, deepcopy(args))\n save_exp_result(setting, result)\n",
"_____no_output_____"
],
[
"import seaborn as sns \nimport matplotlib.pyplot as plt\n\ndf = load_exp_result('exp1')\n\nfig, ax = plt.subplots(1, 3)\nfig.set_size_inches(15, 6)\nsns.set_style(\"darkgrid\", {\"axes.facecolor\": \".9\"})\n\nsns.barplot(x='n_layer', y='train_acc', hue='hid_dim', data=df, ax=ax[0])\nsns.barplot(x='n_layer', y='val_acc', hue='hid_dim', data=df, ax=ax[1])\nsns.barplot(x='n_layer', y='test_acc', hue='hid_dim', data=df, ax=ax[2])\n",
"_____no_output_____"
],
[
"var1 = 'n_layer'\nvar2 = 'hid_dim'\n\ndf = load_exp_result('exp1')\nlist_v1 = df[var1].unique()\nlist_v2 = df[var2].unique()\nlist_data = []\n\nfor value1 in list_v1:\n for value2 in list_v2:\n row = df.loc[df[var1]==value1]\n row = row.loc[df[var2]==value2]\n \n train_losses = list(row.train_losses)[0]\n val_losses = list(row.val_losses)[0]\n \n for epoch, train_loss in enumerate(train_losses):\n list_data.append({'type':'train', 'loss':train_loss, 'epoch':epoch, var1:value1, var2:value2})\n for epoch, val_loss in enumerate(val_losses):\n list_data.append({'type':'val', 'loss':val_loss, 'epoch':epoch, var1:value1, var2:value2})\n \ndf = pd.DataFrame(list_data)\ng = sns.FacetGrid(df, row=var2, col=var1, hue='type', margin_titles=True, sharey=False)\ng = g.map(plt.plot, 'epoch', 'loss', marker='.')\ng.add_legend()\ng.fig.suptitle('Train loss vs Val loss')\nplt.subplots_adjust(top=0.89)",
"_____no_output_____"
],
[
"var1 = 'n_layer'\nvar2 = 'hid_dim'\n\ndf = load_exp_result('exp1')\nlist_v1 = df[var1].unique()\nlist_v2 = df[var2].unique()\nlist_data = []\n\nfor value1 in list_v1:\n for value2 in list_v2:\n row = df.loc[df[var1]==value1]\n row = row.loc[df[var2]==value2]\n \n train_accs = list(row.train_accs)[0]\n val_accs = list(row.val_accs)[0]\n test_acc = list(row.test_acc)[0]\n \n for epoch, train_acc in enumerate(train_accs):\n list_data.append({'type':'train', 'Acc':train_acc, 'test_acc':test_acc, 'epoch':epoch, var1:value1, var2:value2})\n for epoch, val_acc in enumerate(val_accs):\n list_data.append({'type':'val', 'Acc':val_acc, 'test_acc':test_acc, 'epoch':epoch, var1:value1, var2:value2})\n \ndf = pd.DataFrame(list_data)\ng = sns.FacetGrid(df, row=var2, col=var1, hue='type', margin_titles=True, sharey=False)\ng = g.map(plt.plot, 'epoch', 'Acc', marker='.')\n\ndef show_acc(x, y, metric, **kwargs):\n plt.scatter(x, y, alpha=0.3, s=1)\n metric = \"Test Acc: {:1.3f}\".format(list(metric.values)[0])\n plt.text(0.05, 0.95, metric, horizontalalignment='left', verticalalignment='center', transform=plt.gca().transAxes, bbox=dict(facecolor='yellow', alpha=0.5, boxstyle=\"round,pad=0.1\"))\ng = g.map(show_acc, 'epoch', 'Acc', 'test_acc')\n\ng.add_legend()\ng.fig.suptitle('Train Accuracy vs Val Accuracy')\n\n\n\nplt.subplots_adjust(top=0.89)",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
]
|
d064c10961958259b92a5c2eacdc9bd02cc89d9b | 375,897 | ipynb | Jupyter Notebook | phase4_analysis_by_visualization/final_visualizations.ipynb | eric-wisniewski/visualizeYourself_Project | dcb6f3620468206de31ee587d4e2da18d6e1575d | [
"Apache-2.0"
]
| null | null | null | phase4_analysis_by_visualization/final_visualizations.ipynb | eric-wisniewski/visualizeYourself_Project | dcb6f3620468206de31ee587d4e2da18d6e1575d | [
"Apache-2.0"
]
| null | null | null | phase4_analysis_by_visualization/final_visualizations.ipynb | eric-wisniewski/visualizeYourself_Project | dcb6f3620468206de31ee587d4e2da18d6e1575d | [
"Apache-2.0"
]
| null | null | null | 1,070.931624 | 56,644 | 0.954472 | [
[
[
"import pandas as pd\nimport matplotlib.pyplot as plt\nfrom matplotlib import pyplot\n%matplotlib inline\n\nfinal_data_e = pd.read_csv(\"vizSelf_eric.csv\", index_col=0, parse_dates=True)\nfinal_data_p = pd.read_csv(\"vizSelf_parent.csv\", index_col=0, parse_dates=True)\nfinal_data_e.head()\nfinal_data_p.head()",
"_____no_output_____"
],
[
"axe = final_data_e.plot.area(figsize=(12,4), subplots=True)\naxp = final_data_p.plot.area(figsize=(12,4), subplots=True)",
"_____no_output_____"
],
[
"axeb = final_data_e.plot.bar(figsize=(12,4), subplots=True)\naxpb = final_data_p.plot.bar(figsize=(12,4), subplots=True)",
"_____no_output_____"
],
[
"axeh = final_data_e.plot.hist(figsize=(12,4), subplots=True)\naxph = final_data_p.plot.hist(figsize=(12,4), subplots=True)",
"_____no_output_____"
],
[
"axed = final_data_e.plot.density(figsize=(12,4), subplots=True)\naxpd = final_data_p.plot.density(figsize=(12,4), subplots=True)",
"_____no_output_____"
],
[
"axebp = final_data_e.plot.box(figsize=(12,4), subplots=True)\naxpbp = final_data_p.plot.box(figsize=(12,4), subplots=True)",
"_____no_output_____"
],
[
"axekde = final_data_e.plot.kde(figsize=(12,4), subplots=True)\naxpkde = final_data_p.plot.kde(figsize=(12,4), subplots=True)",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d064ccbc966f61700046f9a84bfae97b6c7585ed | 305,168 | ipynb | Jupyter Notebook | DAY6_Eda_With_Pandas.ipynb | averryset/H8_Python_for_Data_Science | c1e87c2e272d64147f50ae967bd0153073b23b9b | [
"MIT"
]
| null | null | null | DAY6_Eda_With_Pandas.ipynb | averryset/H8_Python_for_Data_Science | c1e87c2e272d64147f50ae967bd0153073b23b9b | [
"MIT"
]
| null | null | null | DAY6_Eda_With_Pandas.ipynb | averryset/H8_Python_for_Data_Science | c1e87c2e272d64147f50ae967bd0153073b23b9b | [
"MIT"
]
| null | null | null | 77.043171 | 110,700 | 0.704907 | [
[
[
"import pandas as pd\nimport numpy as np",
"_____no_output_____"
],
[
"df_properti = pd.read_csv(\"https://raw.githubusercontent.com/ardhiraka/PFDS_sources/master/property_data.csv\")",
"_____no_output_____"
],
[
"df_properti",
"_____no_output_____"
],
[
"df_properti.shape",
"_____no_output_____"
],
[
"df_properti.columns",
"_____no_output_____"
],
[
"df_properti[\"ST_NAME\"]",
"_____no_output_____"
],
[
"df_properti[\"ST_NUM\"].isna()",
"_____no_output_____"
],
[
"list_missing_values = [\"n/a\", \"--\", \"na\"]\ndf_properti = pd.read_csv(\n \"https://raw.githubusercontent.com/ardhiraka/PFDS_sources/master/property_data.csv\",\n na_values = list_missing_values\n)",
"_____no_output_____"
],
[
"df_properti",
"_____no_output_____"
],
[
"df_properti[\"OWN_OCCUPIED\"].isna()",
"_____no_output_____"
],
[
"cnt=0\ndf_properti_own = df_properti[\"OWN_OCCUPIED\"]\nfor row in df_properti_own:\n try:\n int(row)\n df_properti[cnt, \"OWN_OCCUPIED\"]=np.nan\n except ValueError:\n pass\n cnt+=1",
"_____no_output_____"
],
[
"df_properti ",
"_____no_output_____"
],
[
"df_properti[\"NEW_OWN_OCCUPIEW\"] = df_properti[\"OWN_OCCUPIED\"].apply(\n lambda val: 1 if val == \"Y\" else 0\n)\ndf_properti",
"_____no_output_____"
],
[
"df_properti.isna().sum()",
"_____no_output_____"
],
[
"df_properti.isna().sum().sum()",
"_____no_output_____"
],
[
"df_properti",
"_____no_output_____"
],
[
"cnt=0\ndf_properti_num_bat = df_properti[\"NUM_BATH\"]\nfor row in df_properti_num_bat:\n try:\n float(row)\n df_properti.loc[cnt, \"NEW_NUM_BATH\"]=row\n except ValueError:\n df_properti.loc[cnt, \"NEW_NUM_BATH\"]=np.nan\n cnt+=1",
"_____no_output_____"
],
[
"df_properti",
"_____no_output_____"
],
[
"df_properti[\"ST_NUM\"].fillna(125)",
"_____no_output_____"
],
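[
"# [Editor's note, added cell] fillna above returns a new Series and leaves df_properti\n# unchanged; to actually persist the imputation, assign the result back:\ndf_properti[\"ST_NUM\"] = df_properti[\"ST_NUM\"].fillna(125)\ndf_properti[\"ST_NUM\"].isna().sum()",
"_____no_output_____"
],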
[
"obes = pd.ExcelFile(\"csv/obes.xls\")",
"_____no_output_____"
],
[
"obes.sheet_names",
"_____no_output_____"
],
[
"obes_age = obes.parse(\"7.2\", skiprows=4, skipfooter=14)",
"_____no_output_____"
],
[
"obes_age",
"_____no_output_____"
],
[
"obes_age.set_index('Year', inplace=True)",
"_____no_output_____"
],
[
"obes_age.plot()",
"_____no_output_____"
],
[
"obes_age.drop(\"Total\", axis=1).plot()",
"_____no_output_____"
],
[
"from datetime import datetime",
"_____no_output_____"
],
[
"datetime.now().date()",
"_____no_output_____"
],
[
"opsd_daily = pd.read_csv(\n 'https://raw.githubusercontent.com/ardhiraka/PFDS_sources/master/opsd_germany_daily.csv',\n index_col=0, parse_dates=True\n)",
"_____no_output_____"
],
[
"opsd_daily.head()",
"_____no_output_____"
],
[
"opsd_daily['Year'] = opsd_daily.index.year\nopsd_daily['Month'] = opsd_daily.index.month\nopsd_daily['Weekday'] = opsd_daily.index.weekday",
"_____no_output_____"
],
[
"opsd_daily",
"_____no_output_____"
],
[
"opsd_daily[\"Consumption\"].plot(\n linewidth=.3, \n figsize=(12, 5)\n)",
"_____no_output_____"
],
[
"df_canada = pd.read_excel(\n \"https://github.com/ardhiraka/PFDS_sources/blob/master/Canada.xlsx?raw=true\",\n sheet_name=\"Canada by Citizenship\",\n skiprows=range(20),\n skipfooter=2\n)",
"_____no_output_____"
],
[
"df_canada.head()",
"_____no_output_____"
],
[
"df_canada.columns",
"_____no_output_____"
],
[
"df_canada.drop(\n columns=[\n \"AREA\", \"REG\", \"DEV\",\n \"Type\", \"Coverage\"\n ],\n axis=1,\n inplace=True\n)",
"_____no_output_____"
],
[
"df_canada.head()",
"_____no_output_____"
],
[
"df_canada.rename(\n columns={\n \"OdName\": \"Country\",\n \"AreaName\": \"Continent\",\n \"RegName\": \"Region\"\n },\n inplace=True\n)",
"_____no_output_____"
],
[
"df_canada.head()",
"_____no_output_____"
],
[
"df_canada_total = df_canada.sum(axis=1)",
"_____no_output_____"
],
[
"df_canada[\"Total\"] = df_canada_total\ndf_canada.head()",
"_____no_output_____"
],
[
"df_canada.describe()",
"_____no_output_____"
],
[
"df_canada.Country",
"_____no_output_____"
],
[
"df_canada[\n [\n \"Country\",\n 2000,\n 2001,\n 2002,\n 2003,\n 2004,\n 2005,\n 2006,\n 2007,\n 2008,\n 2009,\n 2010,\n 2011,\n 2012,\n 2013,\n ]\n]",
"_____no_output_____"
],
[
"df_canada[\"Continent\"] == \"Africa\"",
"_____no_output_____"
],
[
"df_canada[(df_canada[\"Continent\"]==\"Asia\") & (df_canada[\"Region\"]==\"Southern Asia\")]",
"_____no_output_____"
]
]
]
| [
"code"
]
| [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
]
|
d064e55492113bcb0b663aa6bb3350fddb822574 | 14,748 | ipynb | Jupyter Notebook | Demo.ipynb | 4nthon/HomographyNet | 9749b80b2d68d9ecff6423e209782327440e8226 | [
"MIT"
]
| 8 | 2020-07-02T00:23:09.000Z | 2022-03-17T01:55:22.000Z | Demo.ipynb | 4nthon/HomographyNet | 9749b80b2d68d9ecff6423e209782327440e8226 | [
"MIT"
]
| 1 | 2021-09-18T02:03:17.000Z | 2021-09-18T02:16:41.000Z | Demo.ipynb | 4nthon/HomographyNet | 9749b80b2d68d9ecff6423e209782327440e8226 | [
"MIT"
]
| 6 | 2020-10-26T08:41:48.000Z | 2021-07-05T03:08:01.000Z | 42.872093 | 1,663 | 0.606184 | [
[
[
"## 1ใๅฏ่งๅDataGeneratorHomographyNetๆจกๅ้ฝๅนฒไบไปไน",
"_____no_output_____"
]
],
[
[
"import glob\nimport os\nimport cv2\nimport numpy as np\nfrom dataGenerator import DataGeneratorHomographyNet",
"/home/nvidia/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\nUsing TensorFlow backend.\n"
],
[
"img_dir = os.path.join(os.path.expanduser(\"~\"), \"/home/nvidia/test2017\")\nimg_ext = \".jpg\"\nimg_paths = glob.glob(os.path.join(img_dir, '*' + img_ext))\ndg = DataGeneratorHomographyNet(img_paths, input_dim=(240, 240))\ndata, label = dg.__getitem__(0)\nfor idx in range(dg.batch_size):\n cv2.imshow(\"orig\", data[idx, :, :, 0])\n cv2.imshow(\"transformed\", data[idx, :, :, 1])\n cv2.waitKey(0)",
"_____no_output_____"
]
],
[
[
"## 2ใๅผๅง่ฎญ็ป",
"_____no_output_____"
]
],
[
[
"import os\nimport glob\nimport datetime\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport keras\nfrom keras.callbacks import ModelCheckpoint\nfrom sklearn.model_selection import train_test_split\nimport tensorflow as tf\nfrom homographyNet import HomographyNet\nimport dataGenerator as dg\nkeras.__version__",
"_____no_output_____"
],
[
"batch_size = 2\n#ๅๅผ0,1,2 0-ๅฎ้ๆจกๅผ 1-่ฟๅบฆๆก 2-ๆฏไธ่ก้ฝๆ่พๅบ\nverbose = 1\n#Epoch\nnb_epo = 150\n#่ฎกๆถๅผๅง\nstart_ts = datetime.datetime.now().strftime(\"%Y%m%d-%H%M%S\")\n#็จไบ่ฎญ็ป็ๅพ็็ฎๅฝ\ndata_path = \"/home/nvidia/test2017\"\n#ๆจกๅไฟๅญ็็ฎๅฝ\nmodel_dir = \"/home/nvidia\"\nimg_dir = os.path.join(os.path.expanduser(\"~\"), data_path)\nmodel_dir = os.path.join(os.path.expanduser(\"~\"), model_dir, start_ts)\n#ไปฅๆถ้ดไธบๅๅๅปบ็ฎๅฝ\nif not os.path.exists(model_dir):\n os.makedirs(model_dir)",
"_____no_output_____"
],
[
"img_ext = \".jpg\"\n#่ทๅๆๆๅพๅ็ฎๅฝ\nimg_paths = glob.glob(os.path.join(img_dir, '*' + img_ext))\ninput_size = (360, 360, 2)\n#ๅๅ่ฎญ็ป้ๅ้ช่ฏ้๏ผ้ช่ฏ้ๆๅฐไธ็น๏ผไธ็ถๆฏไธชepoch่ทๅฎๅคชๆ
ขไบ\ntrain_idx, val_idx = train_test_split(img_paths, test_size=0.01)\n#ๆฟๅฐ่ฎญ็ปๆฐๆฎ\ntrain_dg = dg.DataGeneratorHomographyNet(train_idx, input_dim=input_size[0:2], batch_size=batch_size)\n#ๆฟๅฐๆขๅฎไบๅฎ็ๆ ็ญพ\nval_dg = dg.DataGeneratorHomographyNet(val_idx, input_dim=input_size[0:2], batch_size=batch_size)\n#ๅฏนไบ็ฅ็ป็ฝ็ปๆฅ่ฏด่ฟไธช้ฌผไธๆ ท็ๅพๅฐฑๆฏ่พๅ
ฅ๏ผๅฎ่ชๅทฑไป่ฟๅน
ๅพ็ๅทฆ่พนๅๅณ่พนๅญฆไน ๅบๅๅบๆง็ฉ้ต๏ผ็ฅๅฅๅง๏ผ\n#ไฟฎๆญฃ็ฝ็ป่พๅ
ฅๅคด\nhomo_net = HomographyNet(input_size)\n#ๅฎไพๅ็ฝ็ป็ปๆ\nmodel = homo_net.build_model()\n#่พๅบๆจกๅ\nmodel.summary()",
"WARNING:tensorflow:From /home/nvidia/.local/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:1264: calling reduce_prod (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.\nInstructions for updating:\nkeep_dims is deprecated, use keepdims instead\nWARNING:tensorflow:From /home/nvidia/.local/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:1349: calling reduce_mean (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.\nInstructions for updating:\nkeep_dims is deprecated, use keepdims instead\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\ninput_1 (InputLayer) (None, 360, 360, 2) 0 \n_________________________________________________________________\nconv2d_1 (Conv2D) (None, 360, 360, 64) 1216 \n_________________________________________________________________\nbatch_normalization_1 (Batch (None, 360, 360, 64) 256 \n_________________________________________________________________\nactivation_1 (Activation) (None, 360, 360, 64) 0 \n_________________________________________________________________\nconv2d_2 (Conv2D) (None, 360, 360, 64) 36928 \n_________________________________________________________________\nbatch_normalization_2 (Batch (None, 360, 360, 64) 256 \n_________________________________________________________________\nactivation_2 (Activation) (None, 360, 360, 64) 0 \n_________________________________________________________________\nmax_pooling2d_1 (MaxPooling2 (None, 180, 180, 64) 0 \n_________________________________________________________________\nconv2d_3 (Conv2D) (None, 180, 180, 64) 36928 \n_________________________________________________________________\nbatch_normalization_3 (Batch (None, 180, 180, 64) 256 \n_________________________________________________________________\nactivation_3 (Activation) (None, 180, 180, 64) 0 \n_________________________________________________________________\nconv2d_4 (Conv2D) (None, 180, 180, 64) 36928 \n_________________________________________________________________\nbatch_normalization_4 (Batch (None, 180, 180, 64) 256 \n_________________________________________________________________\nactivation_4 (Activation) (None, 180, 180, 64) 0 \n_________________________________________________________________\nmax_pooling2d_2 (MaxPooling2 (None, 90, 90, 64) 0 \n_________________________________________________________________\nconv2d_5 (Conv2D) (None, 90, 90, 128) 73856 \n_________________________________________________________________\nbatch_normalization_5 (Batch (None, 90, 90, 128) 512 \n_________________________________________________________________\nactivation_5 (Activation) (None, 90, 90, 128) 0 \n_________________________________________________________________\nconv2d_6 (Conv2D) (None, 90, 90, 128) 147584 \n_________________________________________________________________\nbatch_normalization_6 (Batch (None, 90, 90, 128) 512 \n_________________________________________________________________\nactivation_6 (Activation) (None, 90, 90, 128) 0 \n_________________________________________________________________\nmax_pooling2d_3 (MaxPooling2 (None, 45, 45, 128) 0 \n_________________________________________________________________\nconv2d_7 (Conv2D) (None, 45, 45, 128) 147584 \n_________________________________________________________________\nbatch_normalization_7 (Batch (None, 
45, 45, 128) 512 \n_________________________________________________________________\nactivation_7 (Activation) (None, 45, 45, 128) 0 \n_________________________________________________________________\nconv2d_8 (Conv2D) (None, 45, 45, 128) 147584 \n_________________________________________________________________\nbatch_normalization_8 (Batch (None, 45, 45, 128) 512 \n_________________________________________________________________\nactivation_8 (Activation) (None, 45, 45, 128) 0 \n_________________________________________________________________\nmax_pooling2d_4 (MaxPooling2 (None, 22, 22, 128) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 61952) 0 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 61952) 0 \n_________________________________________________________________\ndense_1 (Dense) (None, 1028) 63687684 \n_________________________________________________________________\nactivation_9 (Activation) (None, 1028) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 8) 8232 \n=================================================================\nTotal params: 64,327,596\nTrainable params: 64,326,060\nNon-trainable params: 1,536\n_________________________________________________________________\n"
],
[
"#ๆฃๆฅ็นๅ่ฐ๏ผๆฒกๅtensorboard็ๅ่ฐ๏ผ็ๆญฃ็ๅคงๅธ้ฝๆฏ็ดๆฅ็loss่พๅบ็\ncheckpoint = ModelCheckpoint(\n os.path.join(model_dir, 'model.h5'),\n monitor='val_loss',\n verbose=verbose,\n save_best_only=True,\n save_weights_only=False,\n mode='auto'\n)",
"_____no_output_____"
],
[
"#ๆๅซๅผๅจไธ้ขๆนๅคช้บป็ฆ๏ผ็ดๆฅๅจ่ฟ้ๅฎไนไบ\n#ๅผๅง่ฎญ็ป\n#ๅฆๆไธๅ steps_per_epoch= 32, ๅฐฑๆฏๆฏๆฌกๅ
จ่ท\nhistory = model.fit_generator(train_dg, \n validation_data = val_dg,\n #steps_per_epoch = 32, \n callbacks = [checkpoint], \n epochs = 15, \n verbose = 1)",
"Epoch 1/15\n 1373/20131 [=>............................] - ETA: 1:18:50 - loss: 1615938396204833.0000 - mean_squared_error: 1615938396204833.0000"
]
],
[
[
"\n",
"_____no_output_____"
]
],
[
[
"#ๆดไธชๅพ็็\nhistory_df = pd.DataFrame(history.history)\nhistory_df.to_csv(os.path.join(model_dir, 'history.csv'))\nhistory_df[['loss', 'val_loss']].plot()\nhistory_df[['mean_squared_error', 'val_mean_squared_error']].plot()\nplt.show()",
"_____no_output_____"
]
],
[
[
"## ้ขๆต&่ฏไผฐ",
"_____no_output_____"
]
],
[
[
"TODO",
"_____no_output_____"
],
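[
"# [Editor's sketch, added] The original notebook leaves this section as TODO. A minimal,\n# hypothetical evaluation pass, assuming the generators from section 2 and the best\n# checkpoint written by the ModelCheckpoint callback above:\nfrom keras.models import load_model\nimport numpy as np\n\nbest_model = load_model(os.path.join(model_dir, 'model.h5'))\n# one validation batch: data is (batch, 360, 360, 2); label holds the 8 corner offsets\ndata, label = val_dg.__getitem__(0)\npred = best_model.predict(data)\nprint('mean absolute corner-offset error:', np.mean(np.abs(pred - label)))",
"_____no_output_____"
]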
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
]
|
d064e82ca5571ab846425be59e36d57fa95702af | 34,696 | ipynb | Jupyter Notebook | Chapter08/.ipynb_checkpoints/ch8-diamond-prices-model-tuning-checkpoint.ipynb | arifmudi/Hands-On-Predictive-Analytics-with-Python | 27122c8c75711c1e3e29d265f13788b9c4b8f5ee | [
"MIT"
]
| 38 | 2019-01-03T14:54:56.000Z | 2022-02-02T04:13:35.000Z | Chapter08/.ipynb_checkpoints/ch8-diamond-prices-model-tuning-checkpoint.ipynb | arifmudi/Hands-On-Predictive-Analytics-with-Python | 27122c8c75711c1e3e29d265f13788b9c4b8f5ee | [
"MIT"
]
| 4 | 2019-07-03T11:25:24.000Z | 2020-11-21T07:15:27.000Z | Chapter08/.ipynb_checkpoints/ch8-diamond-prices-model-tuning-checkpoint.ipynb | arifmudi/Hands-On-Predictive-Analytics-with-Python | 27122c8c75711c1e3e29d265f13788b9c4b8f5ee | [
"MIT"
]
| 31 | 2018-12-27T05:00:08.000Z | 2022-03-22T23:24:57.000Z | 65.095685 | 20,952 | 0.806837 | [
[
[
"# Diamond Prices: Model Tuning and Improving Performance",
"_____no_output_____"
],
[
"#### Importing libraries",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport os\n\npd.options.mode.chained_assignment = None\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"#### Loading the dataset",
"_____no_output_____"
]
],
[
[
"DATA_DIR = '../data'\nFILE_NAME = 'diamonds.csv'\ndata_path = os.path.join(DATA_DIR, FILE_NAME)\ndiamonds = pd.read_csv(data_path)",
"_____no_output_____"
]
],
[
[
"#### Preparing the dataset",
"_____no_output_____"
]
],
[
[
"## Preparation done from Chapter 2\ndiamonds = diamonds.loc[(diamonds['x']>0) | (diamonds['y']>0)]\ndiamonds.loc[11182, 'x'] = diamonds['x'].median()\ndiamonds.loc[11182, 'z'] = diamonds['z'].median()\ndiamonds = diamonds.loc[~((diamonds['y'] > 30) | (diamonds['z'] > 30))]\ndiamonds = pd.concat([diamonds, pd.get_dummies(diamonds['cut'], prefix='cut', drop_first=True)], axis=1)\ndiamonds = pd.concat([diamonds, pd.get_dummies(diamonds['color'], prefix='color', drop_first=True)], axis=1)\ndiamonds = pd.concat([diamonds, pd.get_dummies(diamonds['clarity'], prefix='clarity', drop_first=True)], axis=1)\n\n## Dimensionality reduction\nfrom sklearn.decomposition import PCA\npca = PCA(n_components=1, random_state=123)\ndiamonds['dim_index'] = pca.fit_transform(diamonds[['x','y','z']])\ndiamonds.drop(['x','y','z'], axis=1, inplace=True)",
"_____no_output_____"
],
[
"diamonds.columns",
"_____no_output_____"
]
],
[
[
"#### Train-test split",
"_____no_output_____"
]
],
[
[
"X = diamonds.drop(['cut','color','clarity','price'], axis=1)\ny = diamonds['price']\n\nfrom sklearn.model_selection import train_test_split\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=7)",
"_____no_output_____"
]
],
[
[
"#### Standarization: centering and scaling ",
"_____no_output_____"
]
],
[
[
"numerical_features = ['carat', 'depth', 'table', 'dim_index']\nfrom sklearn.preprocessing import StandardScaler\nscaler = StandardScaler()\nscaler.fit(X_train[numerical_features])\nX_train.loc[:, numerical_features] = scaler.fit_transform(X_train[numerical_features])\nX_test.loc[:, numerical_features] = scaler.transform(X_test[numerical_features])",
"_____no_output_____"
]
],
[
[
"## Optimizing a single hyper-parameter",
"_____no_output_____"
]
],
[
[
"X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.1, random_state=13)",
"_____no_output_____"
],
[
"from sklearn.neighbors import KNeighborsRegressor\nfrom sklearn.metrics import mean_absolute_error\n\ncandidates = np.arange(4,16)\nmae_metrics = []\nfor k in candidates:\n model = KNeighborsRegressor(n_neighbors=k, weights='distance', metric='minkowski', leaf_size=50, n_jobs=4)\n model.fit(X_train, y_train)\n y_pred = model.predict(X_val)\n metric = mean_absolute_error(y_true=y_val, y_pred=y_pred)\n mae_metrics.append(metric)",
"_____no_output_____"
],
[
"fig, ax = plt.subplots(figsize=(8,5))\nax.plot(candidates, mae_metrics, \"o-\")\nax.set_xlabel('Hyper-parameter K', fontsize=14)\nax.set_ylabel('MAE', fontsize=14)\nax.set_xticks(candidates)\nax.grid();",
"_____no_output_____"
]
],
[
[
"#### Recalculating train-set split",
"_____no_output_____"
]
],
[
[
"X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=7)\nscaler = StandardScaler()\nscaler.fit(X_train[numerical_features])\nX_train.loc[:, numerical_features] = scaler.fit_transform(X_train[numerical_features])\nX_test.loc[:, numerical_features] = scaler.transform(X_test[numerical_features])",
"_____no_output_____"
]
],
[
[
"#### Optimizing with cross-validation",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import cross_val_score\ncandidates = np.arange(4,16)\nmean_mae = []\nstd_mae = []\nfor k in candidates:\n model = KNeighborsRegressor(n_neighbors=k, weights='distance', metric='minkowski', leaf_size=50, n_jobs=4)\n cv_results = cross_val_score(model, X_train, y_train, scoring='neg_mean_absolute_error', cv=10)\n mean_score, std_score = -1*cv_results.mean(), cv_results.std()\n mean_mae.append(mean_score)\n std_mae.append(std_score)",
"_____no_output_____"
],
[
"fig, ax = plt.subplots(figsize=(8,5))\nax.plot(candidates, mean_mae, \"o-\")\nax.set_xlabel('Hyper-parameter K', fontsize=14)\nax.set_ylabel('Mean MAE', fontsize=14)\nax.set_xticks(candidates)\nax.grid();",
"_____no_output_____"
],
[
"fig, ax = plt.subplots(figsize=(8,5))\nax.plot(candidates, std_mae, \"o-\")\nax.set_xlabel('Hyper-parameter K', fontsize=14)\nax.set_ylabel('Standard deviation of MAE', fontsize=14)\nax.set_xticks(candidates)\nax.grid();",
"_____no_output_____"
],
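[
"# [Editor's sketch, added] Not part of the original notebook: once the CV curves above\n# suggest a K (best_k=12 is an assumption read off the mean-MAE plot, not a stated result),\n# refit on the full training set and report test MAE for comparison with the models below.\nbest_k = 12\nfinal_knn = KNeighborsRegressor(n_neighbors=best_k, weights='distance', metric='minkowski', leaf_size=50, n_jobs=4)\nfinal_knn.fit(X_train, y_train)\nmae_knn = mean_absolute_error(y_test, final_knn.predict(X_test))\nprint(\"MAE KNN (K={}): {:0.2f}\".format(best_k, mae_knn))",
"_____no_output_____"
]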
],
[
[
"# Improving Performance",
"_____no_output_____"
],
[
"## Improving our diamond price predictions",
"_____no_output_____"
],
[
"### Fitting a neural network",
"_____no_output_____"
]
],
[
[
"from keras.models import Sequential\nfrom keras.layers import Dense\n\nn_input = X_train.shape[1]\nn_hidden1 = 32\nn_hidden2 = 16\nn_hidden3 = 8\n\nnn_reg = Sequential()\nnn_reg.add(Dense(units=n_hidden1, activation='relu', input_shape=(n_input,)))\nnn_reg.add(Dense(units=n_hidden2, activation='relu'))\nnn_reg.add(Dense(units=n_hidden3, activation='relu'))\n# output layer\nnn_reg.add(Dense(units=1, activation=None))",
"_____no_output_____"
],
[
"batch_size = 32\nn_epochs = 40\nnn_reg.compile(loss='mean_absolute_error', optimizer='adam')\nnn_reg.fit(X_train, y_train, epochs=n_epochs, batch_size=batch_size, validation_split=0.05)",
"_____no_output_____"
],
[
"y_pred = nn_reg.predict(X_test).flatten()\nmae_neural_net = mean_absolute_error(y_test, y_pred)\nprint(\"MAE Neural Network: {:0.2f}\".format(mae_neural_net))",
"_____no_output_____"
]
],
[
[
"### Transforming the target",
"_____no_output_____"
]
],
[
[
"diamonds['price'].hist(bins=25, ec='k', figsize=(8,5))\nplt.title(\"Distribution of diamond prices\", fontsize=16)\nplt.grid(False);",
"_____no_output_____"
],
[
"y_train = np.log(y_train)\npd.Series(y_train).hist(bins=25, ec='k', figsize=(8,5))\nplt.title(\"Distribution of log diamond prices\", fontsize=16)\nplt.grid(False);",
"_____no_output_____"
],
[
"nn_reg = Sequential()\nnn_reg.add(Dense(units=n_hidden1, activation='relu', input_shape=(n_input,)))\nnn_reg.add(Dense(units=n_hidden2, activation='relu'))\nnn_reg.add(Dense(units=n_hidden3, activation='relu'))\n# output layer\nnn_reg.add(Dense(units=1, activation=None))",
"_____no_output_____"
],
[
"batch_size = 32\nn_epochs = 40\nnn_reg.compile(loss='mean_absolute_error', optimizer='adam')\nnn_reg.fit(X_train, y_train, epochs=n_epochs, batch_size=batch_size, validation_split=0.05)",
"_____no_output_____"
],
[
"y_pred = nn_reg.predict(X_test).flatten()\ny_pred = np.exp(y_pred)\nmae_neural_net2 = mean_absolute_error(y_test, y_pred)\nprint(\"MAE Neural Network (modified target): {:0.2f}\".format(mae_neural_net2))",
"_____no_output_____"
],
[
"100*(mae_neural_net - mae_neural_net2)/mae_neural_net2",
"_____no_output_____"
]
],
[
[
"#### Analyzing the results",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots(figsize=(8,5))\nresiduals = y_test - y_pred\nax.scatter(y_test, residuals, s=3)\nax.set_title('Residuals vs. Observed Prices', fontsize=16)\nax.set_xlabel('Observed prices', fontsize=14)\nax.set_ylabel('Residuals', fontsize=14)\nax.grid();",
"_____no_output_____"
],
[
"mask_7500 = y_test <=7500\nmae_neural_less_7500 = mean_absolute_error(y_test[mask_7500], y_pred[mask_7500])\nprint(\"MAE considering price <= 7500: {:0.2f}\".format(mae_neural_less_7500))",
"_____no_output_____"
],
[
"fig, ax = plt.subplots(figsize=(8,5))\npercent_residuals = (y_test - y_pred)/y_test\nax.scatter(y_test, percent_residuals, s=3)\nax.set_title('Pecent residuals vs. Observed Prices', fontsize=16)\nax.set_xlabel('Observed prices', fontsize=14)\nax.set_ylabel('Pecent residuals', fontsize=14)\nax.axhline(y=0.15, color='r'); ax.axhline(y=-0.15, color='r'); \nax.grid();",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
]
| [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
]
|
d064e9236cbc6f092035bc0d5dba899559c20f39 | 254,266 | ipynb | Jupyter Notebook | materials/seaborn_data_viz_complete_with_outputs.ipynb | kthrog/dataviz_workshop | 971836516c72d3d07e7ef59ee8bb313cd788991c | [
"CC0-1.0"
]
| null | null | null | materials/seaborn_data_viz_complete_with_outputs.ipynb | kthrog/dataviz_workshop | 971836516c72d3d07e7ef59ee8bb313cd788991c | [
"CC0-1.0"
]
| null | null | null | materials/seaborn_data_viz_complete_with_outputs.ipynb | kthrog/dataviz_workshop | 971836516c72d3d07e7ef59ee8bb313cd788991c | [
"CC0-1.0"
]
| null | null | null | 167.280263 | 69,956 | 0.785123 | [
[
[
"# Visualizing COVID-19 Hospital Dataset with Seaborn\n\n**Pre-Work:**\n1. Ensure that Jupyter Notebook, Python 3, and seaborn (which will also install dependency libraries if not already installed) are installed. (See resources below for installation instructions.)\n\n### **Instructions:**\n1. Using Python, import main visualization library, `seaborn`, and its dependencies: `pandas`, `numpy`, and `matplotlib`.\n2. Define dataset and read in data using pandas function, `read_json()`. [Notes: a) we're reading in data as an API endpoint; for more about this, see associated workshop slides or resources at bottom of notebook. b) If, instead, you prefer to use your own data, see comment with alternative for `read_csv()`.]\n3. Check data has been read is as expected using `head()` function.\n4. Graph two variables with `seaborn`as a lineplot using the `lineplot()` function.\n5. Graph these same variables, plus a third, from the source dataset with `seaborn` as a scatterplot using the `relplot()` function.\n6. See additional methods, using filtered data and other graphs. Feel free to open a new notebook, and try out your own ideas, using different variables or charts. (Or try out your own data!)\n7. When ready, save figure using `matplotlib`'s `savefig`.\n\n**Note:**\n*If you're new to Jupyter Notebook, see resources below.*\n\n### **Data source:**\n\n[COVID-19 Reported Patient Impact and Hospital Capacity by State Timeseries](https://healthdata.gov/Hospital/COVID-19-Reported-Patient-Impact-and-Hospital-Capa/g62h-syeh),\" created by the U.S. Department of Health & Human Services, on [HealthData.gov](https://healthdata.gov/).",
"_____no_output_____"
]
],
[
[
"# import libraries\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport seaborn as sns",
"_____no_output_____"
],
[
"# read JSON data in via healthdata.gov's API endpoint - https://healthdata.gov/resource/g62h-syeh.json?$limit=50000\n# because the SODA API defaults to 1,000 rows, we're going to change that with the $limit parameter\n# define data as 'covid' and set equal to read function\n# if we want filtered data to compare to, define more datasets\n\ncovid = pd.read_json(\"https://healthdata.gov/resource/g62h-syeh.json?$limit=50000\")\n\ncovid_ct = pd.read_json(\"https://healthdata.gov/resource/g62h-syeh.json?state=CT\")\n\ncovid_maytopresent = pd.read_json(\"https://healthdata.gov/resource/g62h-syeh.json?$limit=50000&$where=date%20between%20%272021-05-01T12:00:00%27%20and%20%272021-08-01T12:00:00%27\")\n\n# if you want to read in your own data, see resources below, or if you have a CSV, try: mydata = pd.read_csv('')\n# and add data filepath inside ''\n# be sure to change covid to mydata in code below",
"_____no_output_____"
],
[
"# use head function and call our dataset (covid) to see the first few rows \n# the default argument for this function is 5 rows, but you can set this to anything, e.g. covid.head(20)\ncovid.head()",
"_____no_output_____"
],
[
"# example of head with more rows\ncovid_ct.head(20)",
"_____no_output_____"
],
[
"# use seaborn to plot inpatient beds used versus whether a critical staffing shortage is occuring\n# we also need to tell seaborn what dataset to use; in this case it's 'covid' as defined above\n# variables: inpatient_beds_used_covid; critical_staffing_shortage_today_yes\n\nsns.lineplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", data=covid)\n\n# save and name fig; uncomment below to run\n# plt.savefig('covid_lineplot.png')",
"_____no_output_____"
],
[
"# use seaborn to plot inpatient beds used versus whether a critical staffing shortage is occuring\n# this time, with a bar plot\n# variables: inpatient_beds_used_covid; critical_staffing_shortage_today_yes\n\nsns.barplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", data=covid)\n\n# save and name fig; uncomment below to run\n# plt.savefig('covid_barplot.png')",
"_____no_output_____"
],
[
"# now we're going to try another graph type, a relational graph that will be scatterplot, with the same variables\n# and add one more variable, deaths_covid, to color dots based on prevalance of COVID-19 deaths by setting hue\n# though feel free to try new variables by browsing them here (scroll down to Columns in this Dataset): https://healthdata.gov/Hospital/COVID-19-Reported-Patient-Impact-and-Hospital-Capa/g62h-syeh\n# variables: inpatient_beds_used_covid; critical_staffing_shortage_today_yes; deaths_covid\n\nsns.relplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", hue=\"deaths_covid\", data=covid)\n\n# save and name fig; uncomment below to run\n# plt.savefig('covid_scatterplot.png')",
"_____no_output_____"
],
[
"# now let's try some graphs with the more limited datasets above, for instance, just the CT data\n\nsns.relplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", hue=\"deaths_covid\", data=covid_ct)\n",
"_____no_output_____"
],
[
"# or just the May - August (present) 2021 date range\n\nsns.relplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", hue=\"deaths_covid\", data=covid_maytopresent)\n",
"_____no_output_____"
],
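[
"# [Editor's sketch, added] a possible next step toward the linear-regression idea in the\n# Final Note below: regplot overlays a simple linear fit on the scatter (exploratory only,\n# not a causal claim); variable names are the same ones used above\nsns.regplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", data=covid, scatter_kws={'s': 3}, line_kws={'color': 'red'})",
"_____no_output_____"
]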
],
[
[
"### Final Note:\nIt's important to remember that we can't necessarily infer any causation or directionality from these charts, but they can be a good place to start for further analysis and exploration, and can point us in the right direction of where to apply more advanced statistical methods, such as linear regression. Even with more advanced methods, though, we still want to stick the principles we're using here: keep charts as simple as possible, using only a few variables, and adding color only where needed. We want our charts to be readable and understandable -- see resources below for more advice and guidance on this. \n\nUltimately, these quick-start methods are helpful for idea generation and early investigation, and can get that process up and running quickly.",
"_____no_output_____"
],
[
"#### Code/Tools Resources:\n- Jupyter notebook - about: https://jupyter-notebook.readthedocs.io/en/stable/notebook.html#introduction\n- Jupyter notebook - how to use this tool: https://jupyter-notebook.readthedocs.io/en/stable/notebook.html\n- Python: https://www.python.org/\n- Seaborn: https://seaborn.pydata.org/index.html\n- Seaborn tutorial: https://seaborn.pydata.org/tutorial.html\n- Seaborn gallery: https://seaborn.pydata.org/examples/index.html\n- Seaborn `lineplot()` function: https://seaborn.pydata.org/generated/seaborn.lineplot.html#seaborn.lineplot + https://seaborn.pydata.org/examples/errorband_lineplots.html\n- Seaborn `relplot()` function: https://seaborn.pydata.org/generated/seaborn.relplot.html#seaborn.relplot + https://seaborn.pydata.org/examples/faceted_lineplot.html\n- Pandas: https://pandas.pydata.org/\n- Pandas - how to read / write tabular data: https://pandas.pydata.org/docs/getting_started/intro_tutorials/02_read_write.html\n- Pandas `read.json()` function: https://pandas.pydata.org/docs/reference/api/pandas.io.json.read_json.html?highlight=read_json#pandas.io.json.read_json\n- Pandas `head()` function: https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.head.html?highlight=head#pandas.DataFrame.head\n- Matplotlib: https://matplotlib.org/\n- Matplotlib `savefig` function: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.savefig.html\n- Socrata Open Data API (SODA) Docs: https://dev.socrata.com/\n- SODA Docs for [Dataset](https://healthdata.gov/Hospital/COVID-19-Reported-Patient-Impact-and-Hospital-Capa/g62h-syeh): https://dev.socrata.com/foundry/healthdata.gov/g62h-syeh\n- SODA Docs - what is an endpoint: https://dev.socrata.com/docs/endpoints.html\n\n#### Visualization Resources:\n- 10 Simple Rules for Better Figures | *PLOS Comp Bio*: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1003833\n\n- How to Choose the Right Data Visualization | *Chartio*: https://chartio.com/learn/charts/how-to-choose-data-visualization/\n\n#### Additional Note:\nThis notebook was created by Kaitlin Throgmorton for a data analysis workshop, as part of an interview for Yale University.",
"_____no_output_____"
]
]
]
| [
"markdown",
"code",
"markdown"
]
| [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
]
]
|