Dataset-row metadata for this notebook:
hexsha: d0604a8ef3c4efaa3345b4e152ffb03ab00a4160 | size: 62,735 bytes | ext: ipynb | lang: Jupyter Notebook
repo path: scratch/working/2.2.NoVPC-EFS-Train-Model.ipynb
repo name: gonsoomoon-ml/SageMaker-With-Secure-VPC | head hexsha: b3ccdede952e8a32a256cb1aab53d196e519f401 | licenses: [ "MIT" ]
max_stars_count: 2 (2021-02-01T00:48:25.000Z to 2021-08-02T09:43:27.000Z)
max_issues_count: 1 (2021-02-08T06:18:25.000Z)
max_forks_count: 2 (2021-02-04T08:23:14.000Z to 2021-02-25T07:13:11.000Z)
avg_line_length: 116.175926 | max_line_length: 19,813 | alphanum_fraction: 0.635259
[ [ [ "# [๋ชจ๋“ˆ 2.1] SageMaker ํด๋Ÿฌ์Šคํ„ฐ์—์„œ ํ›ˆ๋ จ (No VPC์—์„œ ์‹คํ–‰)\n\n์ด ๋…ธํŠธ๋ถ์€ ์•„๋ž˜์˜ ์ž‘์—…์„ ์‹คํ–‰ ํ•ฉ๋‹ˆ๋‹ค.\n- SageMaker Hosting Cluster ์—์„œ ํ›ˆ๋ จ์„ ์‹คํ–‰\n- ํ›ˆ๋ จํ•œ Job ์ด๋ฆ„์„ ์ €์žฅ \n - ๋‹ค์Œ ๋…ธํŠธ๋ถ์—์„œ ๋ชจ๋ธ ๋ฐฐํฌ ๋ฐ ์ถ”๋ก ์‹œ์— ์‚ฌ์šฉ ํ•ฉ๋‹ˆ๋‹ค.\n---", "_____no_output_____" ], [ "SageMaker์˜ ์„ธ์…˜์„ ์–ป๊ณ , role ์ •๋ณด๋ฅผ ๊ฐ€์ ธ์˜ต๋‹ˆ๋‹ค.\n- ์œ„์˜ ๋‘ ์ •๋ณด๋ฅผ ํ†ตํ•ด์„œ SageMaker Hosting Cluster์— ์—ฐ๊ฒฐํ•ฉ๋‹ˆ๋‹ค.", "_____no_output_____" ] ], [ [ "import os\nimport sagemaker\nfrom sagemaker import get_execution_role\n\nsagemaker_session = sagemaker.Session()\n\nrole = get_execution_role()", "_____no_output_____" ] ], [ [ "## ๋กœ์ปฌ์˜ ๋ฐ์ดํ„ฐ S3 ์—…๋กœ๋”ฉ\n๋กœ์ปฌ์˜ ๋ฐ์ดํ„ฐ๋ฅผ S3์— ์—…๋กœ๋”ฉํ•˜์—ฌ ํ›ˆ๋ จ์‹œ์— Input์œผ๋กœ ์‚ฌ์šฉ ํ•ฉ๋‹ˆ๋‹ค.", "_____no_output_____" ] ], [ [ "# dataset_location = sagemaker_session.upload_data(path='data', key_prefix='data/DEMO-cifar10')\n# display(dataset_location)\ndataset_location = 's3://sagemaker-ap-northeast-2-057716757052/data/DEMO-cifar10'\ndataset_location", "_____no_output_____" ], [ "# efs_dir = '/home/ec2-user/efs/data'\n\n# ! ls {efs_dir} -al\n# ! aws s3 cp {dataset_location} {efs_dir} --recursive", "_____no_output_____" ], [ "from sagemaker.inputs import FileSystemInput\n\n# Specify EFS ile system id.\nfile_system_id = 'fs-38dc1558' # 'fs-xxxxxxxx'\nprint(f\"EFS file-system-id: {file_system_id}\")\n\n# Specify directory path for input data on the file system. 
\n# You need to provide normalized and absolute path below.\ntrain_file_system_directory_path = '/data/train'\neval_file_system_directory_path = '/data/eval'\nvalidation_file_system_directory_path = '/data/validation'\nprint(f'EFS file-system data input path: {train_file_system_directory_path}')\nprint(f'EFS file-system data input path: {eval_file_system_directory_path}')\nprint(f'EFS file-system data input path: {validation_file_system_directory_path}')\n\n# Specify the access mode of the mount of the directory associated with the file system. \n# Directory must be mounted 'ro'(read-only).\nfile_system_access_mode = 'ro'\n\n# Specify your file system type\nfile_system_type = 'EFS'\n\ntrain = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=train_file_system_directory_path,\n file_system_access_mode=file_system_access_mode)\n\neval = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=eval_file_system_directory_path,\n file_system_access_mode=file_system_access_mode)\n\nvalidation = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=validation_file_system_directory_path,\n file_system_access_mode=file_system_access_mode)", "EFS file-system-id: fs-38dc1558\nEFS file-system data input path: /data/train\nEFS file-system data input path: /data/eval\nEFS file-system data input path: /data/validation\n" ], [ "aws_region = 'ap-northeast-2'# aws-region-code e.g. 
us-east-1\ns3_bucket = 'sagemaker-ap-northeast-2-057716757052'# your-s3-bucket-name", "_____no_output_____" ], [ "prefix = \"cifar10/efs\" #prefix in your bucket\ns3_output_location = f's3://{s3_bucket}/{prefix}/output'\nprint(f'S3 model output location: {s3_output_location}')", "S3 model output location: s3://sagemaker-ap-northeast-2-057716757052/cifar10/efs/output\n" ], [ "security_group_ids = ['sg-0192524ef63ec6138'] # ['sg-xxxxxxxx'] \n# subnets = ['subnet-0a84bcfa36d3981e6','subnet-0304abaaefc2b1c34','subnet-0a2204b79f378b178'] # [ 'subnet-xxxxxxx', 'subnet-xxxxxxx', 'subnet-xxxxxxx']\nsubnets = ['subnet-0a84bcfa36d3981e6'] # [ 'subnet-xxxxxxx', 'subnet-xxxxxxx', 'subnet-xxxxxxx']\n\n\n", "_____no_output_____" ], [ "from sagemaker.tensorflow import TensorFlow\nestimator = TensorFlow(base_job_name='cifar10',\n entry_point='cifar10_keras_sm_tf2.py',\n source_dir='training_script',\n role=role,\n framework_version='2.0.0',\n py_version='py3',\n script_mode=True,\n hyperparameters={'epochs' : 1},\n train_instance_count=1, \n train_instance_type='ml.p3.2xlarge',\n output_path=s3_output_location, \n subnets=subnets,\n security_group_ids=security_group_ids, \n session = sagemaker.Session()\n )\n\nestimator.fit({'train': train,\n 'validation': validation,\n 'eval': eval,\n })\n# estimator.fit({'train': 'file://data/train',\n# 'validation': 'file://data/validation',\n# 'eval': 'file://data/eval'})", "train_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_count has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\n" ] ], [ [ "# VPC_Mode๋ฅผ True, False ์„ ํƒ\n#### **[์ค‘์š”] VPC_Mode์—์„œ ์‹คํ–‰์‹œ์— True๋กœ ๋ณ€๊ฒฝํ•ด์ฃผ์„ธ์š”**", "_____no_output_____" ] ], [ [ "VPC_Mode = False", "_____no_output_____" ], [ 
"from sagemaker.tensorflow import TensorFlow\n\ndef retrieve_estimator(VPC_Mode):\n if VPC_Mode:\n # VPC ๋ชจ๋“œ ๊ฒฝ์šฐ์— subnets, security_group์„ ๊ธฐ์ˆ  ํ•ฉ๋‹ˆ๋‹ค.\n estimator = TensorFlow(base_job_name='cifar10',\n entry_point='cifar10_keras_sm_tf2.py',\n source_dir='training_script',\n role=role,\n framework_version='2.0.0',\n py_version='py3',\n script_mode=True, \n hyperparameters={'epochs': 2},\n train_instance_count=1, \n train_instance_type='ml.p3.8xlarge',\n subnets = ['subnet-090c1fad32165b0fa','subnet-0bd7cff3909c55018'],\n security_group_ids = ['sg-0f45d634d80aef27e'] \n ) \n else:\n estimator = TensorFlow(base_job_name='cifar10',\n entry_point='cifar10_keras_sm_tf2.py',\n source_dir='training_script',\n role=role,\n framework_version='2.0.0',\n py_version='py3',\n script_mode=True, \n hyperparameters={'epochs': 2},\n train_instance_count=1, \n train_instance_type='ml.p3.8xlarge')\n return estimator\n\nestimator = retrieve_estimator(VPC_Mode)", "train_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_count has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\ntrain_instance_type has been renamed in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\n" ] ], [ [ "ํ•™์Šต์„ ์ˆ˜ํ–‰ํ•ฉ๋‹ˆ๋‹ค. ์ด๋ฒˆ์—๋Š” ๊ฐ๊ฐ์˜ ์ฑ„๋„(`train, validation, eval`)์— S3์˜ ๋ฐ์ดํ„ฐ ์ €์žฅ ์œ„์น˜๋ฅผ ์ง€์ •ํ•ฉ๋‹ˆ๋‹ค.<br>\nํ•™์Šต ์™„๋ฃŒ ํ›„ Billable seconds๋„ ํ™•์ธํ•ด ๋ณด์„ธ์š”. 
Billable seconds๋Š” ์‹ค์ œ๋กœ ํ•™์Šต ์ˆ˜ํ–‰ ์‹œ ๊ณผ๊ธˆ๋˜๋Š” ์‹œ๊ฐ„์ž…๋‹ˆ๋‹ค.\n```\nBillable seconds: <time>\n```\n\n์ฐธ๊ณ ๋กœ, `ml.p2.xlarge` ์ธ์Šคํ„ด์Šค๋กœ 5 epoch ํ•™์Šต ์‹œ ์ „์ฒด 6๋ถ„-7๋ถ„์ด ์†Œ์š”๋˜๊ณ , ์‹ค์ œ ํ•™์Šต์— ์†Œ์š”๋˜๋Š” ์‹œ๊ฐ„์€ 3๋ถ„-4๋ถ„์ด ์†Œ์š”๋ฉ๋‹ˆ๋‹ค.", "_____no_output_____" ] ], [ [ "%%time\nestimator.fit({'train':'{}/train'.format(dataset_location),\n 'validation':'{}/validation'.format(dataset_location),\n 'eval':'{}/eval'.format(dataset_location)})", "2021-01-27 04:02:44 Starting - Starting the training job...\n2021-01-27 04:03:08 Starting - Launching requested ML instancesProfilerReport-1611720164: InProgress\n.........\n2021-01-27 04:04:29 Starting - Preparing the instances for training......\n2021-01-27 04:05:44 Downloading - Downloading input data\n2021-01-27 04:05:44 Training - Downloading the training image...\n2021-01-27 04:06:11 Training - Training image download completed. Training in progress..\u001b[34m2021-01-27 04:06:06,541 sagemaker-containers INFO Imported framework sagemaker_tensorflow_container.training\u001b[0m\n\u001b[34m2021-01-27 04:06:07,035 sagemaker-containers INFO Invoking user script\n\u001b[0m\n\u001b[34mTraining Env:\n\u001b[0m\n\u001b[34m{\n \"additional_framework_parameters\": {},\n \"channel_input_dirs\": {\n \"eval\": \"/opt/ml/input/data/eval\",\n \"validation\": \"/opt/ml/input/data/validation\",\n \"train\": \"/opt/ml/input/data/train\"\n },\n \"current_host\": \"algo-1\",\n \"framework_module\": \"sagemaker_tensorflow_container.training:main\",\n \"hosts\": [\n \"algo-1\"\n ],\n \"hyperparameters\": {\n \"model_dir\": \"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\",\n \"epochs\": 2\n },\n \"input_config_dir\": \"/opt/ml/input/config\",\n \"input_data_config\": {\n \"eval\": {\n \"TrainingInputMode\": \"File\",\n \"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n },\n \"validation\": {\n \"TrainingInputMode\": \"File\",\n 
\"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n },\n \"train\": {\n \"TrainingInputMode\": \"File\",\n \"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n }\n },\n \"input_dir\": \"/opt/ml/input\",\n \"is_master\": true,\n \"job_name\": \"cifar10-2021-01-27-04-02-44-183\",\n \"log_level\": 20,\n \"master_hostname\": \"algo-1\",\n \"model_dir\": \"/opt/ml/model\",\n \"module_dir\": \"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/source/sourcedir.tar.gz\",\n \"module_name\": \"cifar10_keras_sm_tf2\",\n \"network_interface_name\": \"eth0\",\n \"num_cpus\": 32,\n \"num_gpus\": 4,\n \"output_data_dir\": \"/opt/ml/output/data\",\n \"output_dir\": \"/opt/ml/output\",\n \"output_intermediate_dir\": \"/opt/ml/output/intermediate\",\n \"resource_config\": {\n \"current_host\": \"algo-1\",\n \"hosts\": [\n \"algo-1\"\n ],\n \"network_interface_name\": \"eth0\"\n },\n \"user_entry_point\": \"cifar10_keras_sm_tf2.py\"\u001b[0m\n\u001b[34m}\n\u001b[0m\n\u001b[34mEnvironment 
variables:\n\u001b[0m\n\u001b[34mSM_HOSTS=[\"algo-1\"]\u001b[0m\n\u001b[34mSM_NETWORK_INTERFACE_NAME=eth0\u001b[0m\n\u001b[34mSM_HPS={\"epochs\":2,\"model_dir\":\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\"}\u001b[0m\n\u001b[34mSM_USER_ENTRY_POINT=cifar10_keras_sm_tf2.py\u001b[0m\n\u001b[34mSM_FRAMEWORK_PARAMS={}\u001b[0m\n\u001b[34mSM_RESOURCE_CONFIG={\"current_host\":\"algo-1\",\"hosts\":[\"algo-1\"],\"network_interface_name\":\"eth0\"}\u001b[0m\n\u001b[34mSM_INPUT_DATA_CONFIG={\"eval\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"train\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"validation\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"}}\u001b[0m\n\u001b[34mSM_OUTPUT_DATA_DIR=/opt/ml/output/data\u001b[0m\n\u001b[34mSM_CHANNELS=[\"eval\",\"train\",\"validation\"]\u001b[0m\n\u001b[34mSM_CURRENT_HOST=algo-1\u001b[0m\n\u001b[34mSM_MODULE_NAME=cifar10_keras_sm_tf2\u001b[0m\n\u001b[34mSM_LOG_LEVEL=20\u001b[0m\n\u001b[34mSM_FRAMEWORK_MODULE=sagemaker_tensorflow_container.training:main\u001b[0m\n\u001b[34mSM_INPUT_DIR=/opt/ml/input\u001b[0m\n\u001b[34mSM_INPUT_CONFIG_DIR=/opt/ml/input/config\u001b[0m\n\u001b[34mSM_OUTPUT_DIR=/opt/ml/output\u001b[0m\n\u001b[34mSM_NUM_CPUS=32\u001b[0m\n\u001b[34mSM_NUM_GPUS=4\u001b[0m\n\u001b[34mSM_MODEL_DIR=/opt/ml/model\u001b[0m\n\u001b[34mSM_MODULE_DIR=s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/source/sourcedir.tar.gz\u001b[0m\n\u001b[34mSM_TRAINING_ENV={\"additional_framework_parameters\":{},\"channel_input_dirs\":{\"eval\":\"/opt/ml/input/data/eval\",\"train\":\"/opt/ml/input/data/train\",\"validation\":\"/opt/ml/input/data/validation\"},\"current_host\":\"algo-1\",\"framework_module\":\"sagemaker_tensorflow_container.training:main\",\"hosts\":[\"algo-1\"],\"hyperparameters\":
{\"epochs\":2,\"model_dir\":\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\"},\"input_config_dir\":\"/opt/ml/input/config\",\"input_data_config\":{\"eval\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"train\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"},\"validation\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"}},\"input_dir\":\"/opt/ml/input\",\"is_master\":true,\"job_name\":\"cifar10-2021-01-27-04-02-44-183\",\"log_level\":20,\"master_hostname\":\"algo-1\",\"model_dir\":\"/opt/ml/model\",\"module_dir\":\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/source/sourcedir.tar.gz\",\"module_name\":\"cifar10_keras_sm_tf2\",\"network_interface_name\":\"eth0\",\"num_cpus\":32,\"num_gpus\":4,\"output_data_dir\":\"/opt/ml/output/data\",\"output_dir\":\"/opt/ml/output\",\"output_intermediate_dir\":\"/opt/ml/output/intermediate\",\"resource_config\":{\"current_host\":\"algo-1\",\"hosts\":[\"algo-1\"],\"network_interface_name\":\"eth0\"},\"user_entry_point\":\"cifar10_keras_sm_tf2.py\"}\u001b[0m\n\u001b[34mSM_USER_ARGS=[\"--epochs\",\"2\",\"--model_dir\",\"s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\"]\u001b[0m\n\u001b[34mSM_OUTPUT_INTERMEDIATE_DIR=/opt/ml/output/intermediate\u001b[0m\n\u001b[34mSM_CHANNEL_EVAL=/opt/ml/input/data/eval\u001b[0m\n\u001b[34mSM_CHANNEL_VALIDATION=/opt/ml/input/data/validation\u001b[0m\n\u001b[34mSM_CHANNEL_TRAIN=/opt/ml/input/data/train\u001b[0m\n\u001b[34mSM_HP_MODEL_DIR=s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\u001b[0m\n\u001b[34mSM_HP_EPOCHS=2\u001b[0m\n\u001b[34mPYTHONPATH=/opt/ml/code:/usr/local/bin:/usr/lib/python36.zip:/usr/lib/python3.6:/usr/lib/python3.6/lib-dynload:/usr/local/lib/python3.6/dist-packages:/usr/lib/pyt
hon3/dist-packages\n\u001b[0m\n\u001b[34mInvoking script with the following command:\n\u001b[0m\n\u001b[34m/usr/bin/python3 cifar10_keras_sm_tf2.py --epochs 2 --model_dir s3://sagemaker-ap-northeast-2-057716757052/cifar10-2021-01-27-04-02-44-183/model\n\n\u001b[0m\n\u001b[34mTrain for 312 steps, validate for 78 steps\u001b[0m\n\u001b[34mEpoch 1/2\u001b[0m\n\u001b[34m#015 1/312 [..............................] - ETA: 34:31 - loss: 3.5045 - accuracy: 0.1094#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 7/312 [..............................] - ETA: 4:52 - loss: 3.1433 - accuracy: 0.1462 #010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 13/312 [>.............................] - ETA: 2:35 - loss: 2.9194 - accuracy: 0.1587#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 19/312 [>.............................] 
- ETA: 1:45 - loss: 2.7623 - accuracy: 0.1641#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 26/312 [=>............................] - ETA: 1:15 - loss: 2.6259 - accuracy: 0.1683#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 32/312 [==>...........................] - ETA: 1:00 - loss: 2.5445 - accuracy: 0.1753#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 39/312 [==>...........................] - ETA: 48s - loss: 2.4627 - accuracy: 0.1873 #010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 45/312 [===>..........................] 
- ETA: 41s - loss: 2.4148 - accuracy: 0.1951#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 51/312 [===>..........................] - ETA: 36s - loss: 2.3721 - accuracy: 0.2028#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 57/312 [====>.........................] - ETA: 31s - loss: 2.3383 - accuracy: 0.2057#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 64/312 [=====>........................] - ETA: 27s - loss: 2.2982 - accuracy: 0.2120#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 71/312 [=====>........................] 
- ETA: 24s - loss: 2.2635 - accuracy: 0.2171#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 78/312 [======>.......................] - ETA: 21s - loss: 2.2315 - accuracy: 0.2229#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 85/312 [=======>......................] - ETA: 19s - loss: 2.2051 - accuracy: 0.2268#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 92/312 [=======>......................] - ETA: 17s - loss: 2.1798 - accuracy: 0.2320#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015 99/312 [========>.....................] 
- ETA: 16s - loss: 2.1550 - accuracy: 0.2371#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015106/312 [=========>....................] - ETA: 14s - loss: 2.1355 - accuracy: 0.2412#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015113/312 [=========>....................] - ETA: 13s - loss: 2.1166 - accuracy: 0.2458#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015120/312 [==========>...................] - ETA: 12s - loss: 2.0997 - accuracy: 0.2493#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015127/312 [===========>..................] 
- ETA: 11s - loss: 2.0852 - accuracy: 0.2542#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015134/312 [===========>..................] - ETA: 10s - loss: 2.0716 - accuracy: 0.2577#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015140/312 [============>.................] - ETA: 9s - loss: 2.0586 - accuracy: 0.2616 #010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015147/312 [=============>................] - ETA: 8s - loss: 2.0466 - accuracy: 0.2645#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015154/312 [=============>................] 
- ETA: 8s - loss: 2.0331 - accuracy: 0.2677#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015161/312 [==============>...............] - ETA: 7s - loss: 2.0210 - accuracy: 0.2723#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015168/312 [===============>..............] - ETA: 6s - loss: 2.0082 - accuracy: 0.2766#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015175/312 [===============>..............] - ETA: 6s - loss: 1.9988 - accuracy: 0.2790#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015181/312 [================>.............] 
- ETA: 5s - loss: 1.9901 - accuracy: 0.2804#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015188/312 [=================>............] - ETA: 5s - loss: 1.9790 - accuracy: 0.2836#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015195/312 [=================>............] - ETA: 4s - loss: 1.9695 - accuracy: 0.2856#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015202/312 [==================>...........] - ETA: 4s - loss: 1.9605 - accuracy: 0.2881#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#010#015209/312 [===================>..........] 
- ETA: 4s - loss: 1.9531 - accuracy: 0.2906
312/312 [==============================] - 10s 32ms/step - loss: 1.8530 - accuracy: 0.3232 - val_loss: 2.0282 - val_accuracy: 0.3226
Epoch 2/2
312/312 [==============================] - 3s 10ms/step - loss: 1.4498 - accuracy: 0.4717 - val_loss: 1.6843 - val_accuracy: 0.4161

2021-01-27 04:12:46 Uploading - Uploading generated training model
2021-01-27 04:12:39.226548: W tensorflow/python/util/util.cc:299] Sets are not currently considered sequences, but this may change in the future, so consider avoiding using them.
WARNING:tensorflow:From /usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/resource_variable_ops.py:1781: calling BaseResourceVariable.__init__ (from tensorflow.python.ops.resource_variable_ops) with constraint is deprecated and will be removed in a future version.
Instructions for updating:
If using Keras pass *_constraint arguments to layers.
INFO:tensorflow:Assets written to: /opt/ml/model/1/assets
2021-01-27 04:12:42,835 sagemaker-containers INFO Reporting training SUCCESS

2021-01-27 04:13:16 Completed - Training job completed
ProfilerReport-1611720164: NoIssuesFound
Training 
seconds: 452\nBillable seconds: 452\nCPU times: user 1.59 s, sys: 1.44 ms, total: 1.59 s\nWall time: 10min 46s\n" ] ], [ [ "## training_job_name ์ €์žฅ\n\nํ˜„์žฌ์˜ training_job_name์„ ์ €์žฅ ํ•ฉ๋‹ˆ๋‹ค.\n- training_job_name์„ ์—๋Š” ํ›ˆ๋ จ์— ๊ด€๋ จ ๋‚ด์šฉ ๋ฐ ํ›ˆ๋ จ ๊ฒฐ๊ณผ์ธ **Model Artifact** ํŒŒ์ผ์˜ S3 ๊ฒฝ๋กœ๋ฅผ ์ œ๊ณต ํ•ฉ๋‹ˆ๋‹ค.", "_____no_output_____" ] ], [ [ "train_job_name = estimator._current_job_name", "_____no_output_____" ], [ "%store train_job_name", "Stored 'train_job_name' (str)\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d06051dceb9e7c3e4941d98bb5860c0cd4d1b728
66,639
ipynb
Jupyter Notebook
notebooks/test_0_lstm_shell_colab.ipynb
SPRACE/track-ml
3af95fd014e98a5b11261dc5d618f34f82fdf84d
[ "MIT" ]
null
null
null
notebooks/test_0_lstm_shell_colab.ipynb
SPRACE/track-ml
3af95fd014e98a5b11261dc5d618f34f82fdf84d
[ "MIT" ]
10
2019-04-15T21:44:31.000Z
2020-08-26T21:05:00.000Z
notebooks/test_0_lstm_shell_colab.ipynb
SPRACE/track-ml
3af95fd014e98a5b11261dc5d618f34f82fdf84d
[ "MIT" ]
4
2019-04-12T19:04:16.000Z
2020-01-14T13:30:44.000Z
213.586538
47,300
0.647204
[ [ [ "!pip install git+https://github.com/LAL/trackml-library.git\n!pip install plotly.express\n!pip install shortuuid", "_____no_output_____" ], [ "# Cloning the repository; you can get the latest code from the dev branch\n#!git clone https://github.com/SPRACE/track-ml.git cloned-repo\n!git clone https://github.com/stonescenter/track-ml.git\n!ls", "_____no_output_____" ], [ "%cd track-ml/", "/content/track-ml\n" ] ], [ [ "# Running scripts with python shell #", "_____no_output_____" ] ], [ [ "#!pip install tensorflow==1.14.0\n#!pip install tensorflow-base==1.14.0\n#!pip install tensorflow-gpu==1.14.0\n\n%tensorflow_version 1.x", "_____no_output_____" ], [ "\n! python main_train.py --config config_default.json", "_____no_output_____" ] ], [ [ "# Plot Predicted Data #\n", "_____no_output_____" ] ], [ [ "import os\nimport json\nimport numpy as np\nimport pandas as pd\n\nconfigs = json.load(open('config_default.json', 'r'))\n\ncylindrical = configs['data']['cylindrical'] # set to polar or cartesian coordinates\nnormalise = configs['data']['normalise'] \nname = configs['model']['name']\n\nif cylindrical:\n coord = 'cylin'\nelse:\n coord = 'xyz'\n\npath1 = 'results/x_true_%s_%s.csv' % (name, coord)\npath2 = 'results/y_true_%s_%s.csv' % (name, coord)\npath3 = 'results/y_pred_%s_%s.csv' % (name, coord)\n\nprint('loading from .. %s' % path1)\nprint('loading from .. %s' % path2)\nprint('loading from .. 
%s' % path3)\n\ndf_test = pd.read_csv(path1, header=None)\ndf_true = pd.read_csv(path2, header=None)\ndf_pred = pd.read_csv(path3, header=None)\n\nprint('shape df_test ', df_test.shape)\nprint('shape df_true ', df_true.shape)\nprint('shape df_pred ', df_pred.shape)\n# concat\n#y_true = pd.concat([df_test, df_true], axis = 1, ignore_index = True)\n#y_pred = pd.concat([df_test, df_pred], axis = 1, ignore_index = True)\n\ny_true = np.concatenate([df_test, df_true], axis = 1)\ny_pred = np.concatenate([df_test, df_pred], axis = 1)\ny_true = pd.DataFrame(y_true)\ny_pred = pd.DataFrame(y_pred)\n#y_true.name = 'real'\n#y_pred.name = 'pred'\ny_pred.columns.name = 'pred'\ny_true.columns.name = 'real'\n\nprint('size y_true ', y_true.shape)\nprint('size y_pred ', y_pred.shape)", "loading from .. results/x_true_lstm_xyz.csv\nloading from .. results/y_true_lstm_xyz.csv\nloading from .. results/y_pred_lstm_xyz.csv\nshape df_test (528, 12)\nshape df_true (528, 18)\nshape df_pred (528, 18)\nsize y_true (528, 30)\nsize y_pred (528, 30)\n" ], [ "from core.utils.utils import *\nimport warnings\n\nN_tracks = 30\npath_html = ''\nname = configs['model']['name']\n\nfig = track_plot_xyz([y_true, y_pred], n_hits = 10, cylindrical = cylindrical, n_tracks = N_tracks, \n title='Track Prediction #10 Hit - Model %s (Nearest hits)' % name.upper())\n\nfig.show()", "/usr/local/lib/python3.6/dist-packages/statsmodels/tools/_testing.py:19: FutureWarning:\n\npandas.util.testing is deprecated. Use the functions in the public API at pandas.testing instead.\n\n" ], [ "", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ] ]
d06051e608c2bfcf56c74a60dc9f350d0d1b7d14
137,607
ipynb
Jupyter Notebook
code/preprocessing_and_decomposition/Matrix_Profile.ipynb
iotanalytics/IoTTutorial
33666ca918cdece60df4684f0a2ec3465e9663b6
[ "MIT" ]
3
2021-07-20T18:02:51.000Z
2021-08-18T13:26:57.000Z
code/preprocessing_and_decomposition/Matrix_Profile.ipynb
iotanalytics/IoTTutorial
33666ca918cdece60df4684f0a2ec3465e9663b6
[ "MIT" ]
null
null
null
code/preprocessing_and_decomposition/Matrix_Profile.ipynb
iotanalytics/IoTTutorial
33666ca918cdece60df4684f0a2ec3465e9663b6
[ "MIT" ]
null
null
null
539.635294
128,352
0.946311
[ [ [ "<a href=\"https://colab.research.google.com/github/iotanalytics/IoTTutorial/blob/main/code/preprocessing_and_decomposition/Matrix_Profile.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "## Matrix Profile\n", "_____no_output_____" ], [ "## Introduction\n\nThe matrix profile (MP) is a data structure and associated algorithms that help solve the dual problem of anomaly detection and motif discovery. It is robust, scalable and largely parameter-free.\n\nMP can be combined with other algorithms to accomplish:\n\n* Motif discovery\n* Time series chains\n* Anomaly discovery\n* Joins\n* Semantic segmentation\n\nmatrixprofile-ts offers 3 different algorithms to compute Matrix Profile:\n* STAMP (Scalable Time Series Anytime Matrix Profile) - Each distance profile is independent of other distance profiles, so the order in which they are computed can be random. It is an anytime algorithm.\n* STOMP (Scalable Time Series Ordered Matrix Profile) - This algorithm is an exact ordered algorithm. 
It is significantly faster than STAMP.\n* SCRIMP++ (Scalable Column Independent Matrix Profile) - This algorithm combines the anytime component of STAMP with the speed of STOMP.\n\n\nSee: https://towardsdatascience.com/introduction-to-matrix-profiles-5568f3375d90", "_____no_output_____" ], [ "## Code Example\n", "_____no_output_____" ] ], [ [ "!pip install matrixprofile-ts", "Collecting matrixprofile-ts\n Downloading matrixprofile_ts-0.0.9-py2.py3-none-any.whl (24 kB)\nRequirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.7/dist-packages (from matrixprofile-ts) (1.19.5)\nInstalling collected packages: matrixprofile-ts\nSuccessfully installed matrixprofile-ts-0.0.9\n" ], [ "import pandas as pd\n## example data importing\ndata = pd.read_csv('https://raw.githubusercontent.com/iotanalytics/IoTTutorial/main/data/SCG_data.csv').drop('Unnamed: 0',1).to_numpy()[0:20,:1000]", "_____no_output_____" ], [ "import operator\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom matrixprofile import *\n\nimport numpy as np\nfrom datetime import datetime\n\nimport matplotlib.pyplot as plt\nfrom matplotlib.colors import ListedColormap\nfrom sklearn import neighbors, datasets\n\n# Pull a portion of the data\npattern = data[10,:] + max(abs(data[10,:]))\n\n# Compute Matrix Profile\nm = 10\nmp = matrixProfile.stomp(pattern,m)\n\n#Append np.nan to Matrix profile to enable plotting against raw data\nmp_adj = np.append(mp[0],np.zeros(m-1)+np.nan)\n\n#Plot the signal data\nfig, (ax1, ax2) = plt.subplots(2,1,sharex=True,figsize=(20,10))\nax1.plot(np.arange(len(pattern)),pattern)\nax1.set_ylabel('Signal', size=22)\n\n#Plot the Matrix Profile\nax2.plot(np.arange(len(mp_adj)),mp_adj, label=\"Matrix Profile\", color='red')\nax2.set_ylabel('Matrix Profile', size=22)\nax2.set_xlabel('Time', size=22);", "_____no_output_____" ] ], [ [ "## Discussion\n\n\nPros:\n* It is exact: For motif discovery, discord discovery, time series joins etc., the Matrix Profile based methods 
provide no false positives or false dismissals.\n* It is simple and parameter-free: In contrast, more general algorithms in this space typically require building and tuning spatial access methods and/or hash functions.\n* It is space efficient: Matrix Profile construction algorithms require an inconsequential\nspace overhead, just linear in the time series length with a small constant factor, allowing\nmassive datasets to be processed in main memory (for most data mining, disk is death).\n* It allows anytime algorithms: While exact MP algorithms are extremely scalable, for\nextremely large datasets we can compute the Matrix Profile in an anytime fashion, allowing\nultra-fast approximate solutions and real-time data interaction.\n* It is incrementally maintainable: Having computed the Matrix Profile for a dataset,\nwe can incrementally update it very efficiently. In many domains this means we can effectively\nmaintain exact joins, motifs, discords on streaming data forever.\n* It can leverage hardware: Matrix Profile construction is embarrassingly parallelizable,\nboth on multicore processors, GPUs, distributed systems etc.\n* It is free of the curse of dimensionality: That is to say, it has time complexity that is\nconstant in subsequence length: This is a very unusual and desirable property; virtually all\nexisting algorithms in the time series domain scale poorly as the subsequence length grows.\n* It can be constructed in deterministic time: Almost all algorithms for time series\ndata mining can take radically different times to finish on two (even slightly) different datasets.\nIn contrast, given only the length of the time series, we can precisely predict in advance how\nlong it will take to compute the Matrix Profile. 
(this allows resource planning)\n* It can handle missing data: Even in the presence of missing data, we can provide\nanswers which are guaranteed to have no false negatives.\n* Finally, and subjectively: Simplicity and Intuitiveness: Seeing the world through\nthe MP lens often invites/suggests simple and elegant solutions. \n\nCons:\n* Larger datasets can take a long time to compute. Scalability needs to be addressed.\n* Cannot be used with Dynamic Time Warping as of now.\n * DTW is used for one-to-all matching whereas MP is used for all-to-all matching.\n * DTW is used for smaller datasets rather than large.\n* Need to adjust window size manually for different datasets.\n\n*How to read the MP*:\n* Where you see relatively low values, you know that the subsequence in the original time\nseries must have (at least one) relatively similar subsequence elsewhere in the data (such\nregions are \"motifs\" or recurring patterns)\n* Where you see relatively high values, you know that the subsequence in the original time\nseries must be unique in its shape (such areas are \"discords\" or anomalies). In fact, the highest point is exactly the definition of a Time\nSeries Discord, perhaps the best anomaly detector for time series.\n", "_____no_output_____" ], [ "## References:\n\nhttps://www.cs.ucr.edu/~eamonn/MatrixProfile.html (powerpoints on this site - a lot of examples)\n\nhttps://towardsdatascience.com/introduction-to-matrix-profiles-5568f3375d90\n\nPython implementation: https://github.com/TDAmeritrade/stumpy", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ] ]
d06059853171785ee6baaefaab157274881917e4
41,007
ipynb
Jupyter Notebook
day37_ML_ANN_RNN.ipynb
DynamicEngine2001/Programming-Codes
6f19cbca47eef4b059b723703b261545ab08fc5d
[ "MIT" ]
null
null
null
day37_ML_ANN_RNN.ipynb
DynamicEngine2001/Programming-Codes
6f19cbca47eef4b059b723703b261545ab08fc5d
[ "MIT" ]
1
2020-10-15T14:33:30.000Z
2020-10-15T14:33:30.000Z
day37_ML_ANN_RNN.ipynb
DynamicEngine2001/Programming-Codes
6f19cbca47eef4b059b723703b261545ab08fc5d
[ "MIT" ]
7
2020-10-05T13:05:35.000Z
2021-10-18T17:06:50.000Z
35.596354
6,416
0.472797
[ [ [ "### Steps to build a Neural Network\n\n1. Empty Model (sequential/Model)\n2", "_____no_output_____" ] ], [ [ "import tensorflow.keras.datasets as kd", "_____no_output_____" ], [ "data = kd.fashion_mnist.load_data()", "Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-labels-idx1-ubyte.gz\n32768/29515 [=================================] - 0s 7us/step\nDownloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/train-images-idx3-ubyte.gz\n26427392/26421880 [==============================] - 13s 1us/step\nDownloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-labels-idx1-ubyte.gz\n8192/5148 [===============================================] - 0s 0us/step\nDownloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/t10k-images-idx3-ubyte.gz\n4423680/4422102 [==============================] - 2s 0us/step\n" ], [ "(xtrain,ytrain),(xtest,ytest) = data", "_____no_output_____" ], [ "xtrain.shape", "_____no_output_____" ], [ "import matplotlib.pyplot as plt", "_____no_output_____" ], [ "plt.imshow(xtrain[0,:,:],cmap='gray_r')", "_____no_output_____" ], [ "ytrain[0]", "_____no_output_____" ], [ "xtrain1 = xtrain.reshape(-1,28*28)\nxtest1 = xtest.reshape(-1,28*28)", "_____no_output_____" ], [ "xtrain1.shape", "_____no_output_____" ], [ "from tensorflow.keras.models import Sequential\nfrom tensorflow.keras.layers import Dense", "_____no_output_____" ], [ "model_ann = Sequential()\nmodel_ann.add(Dense(units=128, input_shape=(784,), activation='relu'))\nmodel_ann.add(Dense(units=128, activation='relu'))\nmodel_ann.add(Dense(units=10, activation='softmax'))\nmodel_ann.compile(optimizer='adam',loss='sparse_categorical_crossentropy',metrics=['accuracy'])", "_____no_output_____" ], [ "model_ann.summary()", "Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # 
\n=================================================================\ndense (Dense) (None, 128) 100480 \n_________________________________________________________________\ndense_1 (Dense) (None, 128) 16512 \n_________________________________________________________________\ndense_2 (Dense) (None, 10) 1290 \n=================================================================\nTotal params: 118,282\nTrainable params: 118,282\nNon-trainable params: 0\n_________________________________________________________________\n" ], [ "# 1st layer params: 784*128 + 128 = 100480\n# 2nd layer params: 128*128 + 128 = 16512\n# output layer params: 128*10 + 10 = 1290", "_____no_output_____" ], [ "history = model_ann.fit(xtrain1,ytrain,epochs=10)", "Epoch 1/10\n1875/1875 [==============================] - 8s 3ms/step - loss: 2.0255 - accuracy: 0.7341\nEpoch 2/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.6520 - accuracy: 0.7857\nEpoch 3/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.6020 - accuracy: 0.8008\nEpoch 4/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.5439 - accuracy: 0.8156\nEpoch 5/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.5079 - accuracy: 0.8259\nEpoch 6/10\n1875/1875 [==============================] - 7s 4ms/step - loss: 0.4657 - accuracy: 0.8365\nEpoch 7/10\n1875/1875 [==============================] - 7s 4ms/step - loss: 0.4380 - accuracy: 0.8442\nEpoch 8/10\n1875/1875 [==============================] - 5s 3ms/step - loss: 0.4248 - accuracy: 0.8483\nEpoch 9/10\n1875/1875 [==============================] - 5s 3ms/step - loss: 0.4050 - accuracy: 0.8524\nEpoch 10/10\n1875/1875 [==============================] - 6s 3ms/step - loss: 0.3999 - accuracy: 0.8567\n" ], [ "plt.plot(history.history['loss'])\nplt.plot(history.history['accuracy'])\nplt.grid()\nplt.xticks(range(1,11))\nplt.xlabel('Epochs-->')\nplt.show()", "_____no_output_____" ], [ "ypred = model_ann.predict(xtest1)", "_____no_output_____" ], [ "labels = {0: 'T-shirt/top', 1: 'Trouser', 2: 'Pullover', 3: 'Dress', 4: 'Coat', 5: 'Sandal', 6: 'Shirt', 7: 'Sneaker', 8: 'Bag', 9: 'Ankle boot'}\nlabels.get(ytest[0])", "_____no_output_____" ], [ "ypred[0].argmax()", 
"_____no_output_____" ], [ "model_ann.evaluate(xtest1,ytest)", "313/313 [==============================] - 1s 2ms/step - loss: 0.4793 - accuracy: 0.8335\n" ] ], [ [ "### Churn Modelling", "_____no_output_____" ] ], [ [ "import pandas as pd", "_____no_output_____" ], [ "df = pd.read_csv('Churn_Modelling.csv')\ndf", "_____no_output_____" ], [ "df.info()", "<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 10000 entries, 0 to 9999\nData columns (total 14 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 RowNumber 10000 non-null int64 \n 1 CustomerId 10000 non-null int64 \n 2 Surname 10000 non-null object \n 3 CreditScore 10000 non-null int64 \n 4 Geography 10000 non-null object \n 5 Gender 10000 non-null object \n 6 Age 10000 non-null int64 \n 7 Tenure 10000 non-null int64 \n 8 Balance 10000 non-null float64\n 9 NumOfProducts 10000 non-null int64 \n 10 HasCrCard 10000 non-null int64 \n 11 IsActiveMember 10000 non-null int64 \n 12 EstimatedSalary 10000 non-null float64\n 13 Exited 10000 non-null int64 \ndtypes: float64(2), int64(9), object(3)\nmemory usage: 1.1+ MB\n" ], [ "df1 = pd.get_dummies(df)", "_____no_output_____" ], [ "df1.head()", "_____no_output_____" ] ], [ [ "### Recurrent Neural Network", "_____no_output_____" ] ], [ [ "import numpy as np", "_____no_output_____" ], [ "stock_data = pd.read_csv('stock_data.csv')", "_____no_output_____" ], [ "fb = stock_data[['Open']] [stock_data['Stock']=='FB'].copy()", "_____no_output_____" ], [ "fb.head()", "_____no_output_____" ], [ "fb = fb.values", "_____no_output_____" ], [ "fb.shape", "_____no_output_____" ], [ "x = []\ny = []\nfor i in range(20, len(fb)):\n x.append(fb['Open'].valuesfb[i-20:1].tolist())\n y.append(fb[i].tolist())\n", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ] ]
d0605f7486d4b21270c84051e4665d2a8b65650a
16,303
ipynb
Jupyter Notebook
Python_Core/Python Modules and Imports.ipynb
ValRCS/RCS_Python_11
157c8e08aaf9849341cadb50077fe65dead536fa
[ "MIT" ]
1
2019-07-11T16:25:15.000Z
2019-07-11T16:25:15.000Z
Python_Core/Python Modules and Imports.ipynb
ValRCS/RCS_Python_11
157c8e08aaf9849341cadb50077fe65dead536fa
[ "MIT" ]
8
2020-01-28T22:54:14.000Z
2022-02-10T00:17:47.000Z
Python Modules and Imports.ipynb
ValRCS/RCS_Data_Analysis_Python_2019_July
19e2f8310f41b697f9c86d7a085a9ff19390eeac
[ "MIT" ]
2
2019-12-11T14:39:36.000Z
2019-12-13T14:29:09.000Z
22.674548
463
0.497577
[ [ [ "## Python Modules", "_____no_output_____" ] ], [ [ "%%writefile weather.py\ndef prognosis():\n print(\"It will rain today\")", "Writing weather.py\n" ], [ "import weather\n", "_____no_output_____" ], [ "weather.prognosis()", "It will rain today\n" ] ], [ [ "## How does Python know from where to import packages/modules from?", "_____no_output_____" ] ], [ [ "# Python imports work by searching the directories listed in sys.path.", "_____no_output_____" ], [ "import sys\nsys.path\n", "_____no_output_____" ], [ "## \"__main__\" usage\n# A module can discover whether or not it is running in the main scope by checking its own __name__, \n# which allows a common idiom for conditionally executing code in a module when it is run as a script or with python -m \n# but not when it is imported:", "_____no_output_____" ], [ "%%writefile hw.py\n#!/usr/bin/env python\ndef hw():\n print(\"Running Main\")\n\ndef hw2():\n print(\"Hello 2\")\n\nif __name__ == \"__main__\":\n # execute only if run as a script\n print(\"Running as script\")\n hw()\n hw2()", "Overwriting hw.py\n" ], [ "import main\nimport hw", "_____no_output_____" ], [ "main.main()\nhw.hw2()", "Running Main\nHello 2\n" ], [ "# Running on all 3 OSes from command line:\n\npython main.py", "_____no_output_____" ] ], [ [ "## Make main.py self running on Linux (also should work on MacOS):\n \nAdd \n#!/usr/bin/env python to first line of script\n\nmark it executable using\n\n### need to change permissions too!\n$ chmod +x main.py", "_____no_output_____" ], [ "## Making Standalone .EXEs for Python in Windows \n\n* http://www.py2exe.org/ used to be for Python 2 , now supposedly Python 3 as well\n* http://www.pyinstaller.org/\n Tutorial: https://medium.com/dreamcatcher-its-blog/making-an-stand-alone-executable-from-a-python-script-using-pyinstaller-d1df9170e263\n\n Need to create exe on a similar system as target system! 
", "_____no_output_____" ] ], [ [ "# Exercise Write a function which returns a list of fibonacci numbers up to starting with 1, 1, 2, 3, 5 up to the nth.\nSo Fib(4) would return [1,1,2,3]", "_____no_output_____" ] ], [ [ "![Fibo](https://upload.wikimedia.org/wikipedia/commons/thumb/d/db/34%2A21-FibonacciBlocks.png/450px-34%2A21-FibonacciBlocks.png)", "_____no_output_____" ], [ "![Fibonacci](https://upload.wikimedia.org/wikipedia/commons/thumb/8/8e/Leonardo_da_Pisa.jpg/330px-Leonardo_da_Pisa.jpg)", "_____no_output_____" ] ], [ [ "%%writefile fibo.py\n# Fibonacci numbers module\n\ndef fib(n): # write Fibonacci series up to n\n a, b = 1 1\n while b < n:\n print(b, end=' ')\n a, b = b, a+b\n print()\n\ndef fib2(n): # return Fibonacci series up to n\n result = []\n a, b = 1, 1\n while b < n:\n result.append(b)\n a, b = b, a+b\n return result", "Writing fibo.py\n" ], [ "import fibo", "_____no_output_____" ], [ "fibo.fib(100)", "1 1 2 3 5 8 13 21 34 55 89 \n" ], [ "fibo.fib2(100)", "_____no_output_____" ], [ "fib=fibo.fib", "_____no_output_____" ] ], [ [ "If you intend to use a function often you can assign it to a local name:", "_____no_output_____" ] ], [ [ "fib(300)", "1 1 2 3 5 8 13 21 34 55 89 144 233 \n" ] ], [ [ "#### There is a variant of the import statement that imports names from a module directly into the importing moduleโ€™s symbol table. 
", "_____no_output_____" ] ], [ [ "from fibo import fib, fib2 # we overwrote fib=fibo.fib", "_____no_output_____" ], [ "fib(100)", "1 1 2 3 5 8 13 21 34 55 89 \n" ], [ "fib2(200)", "_____no_output_____" ] ], [ [ "This does not introduce the module name from which the imports are taken in the local symbol table (so in the example, fibo is not defined).", "_____no_output_____" ], [ "There is even a variant to import all names that a module defines: **NOT RECOMMENDED**", "_____no_output_____" ] ], [ [ "## DO not do this Namespace collission possible!!", "_____no_output_____" ], [ "from fibo import *", "_____no_output_____" ], [ "fib(400)", "1 1 2 3 5 8 13 21 34 55 89 144 233 377 \n" ] ], [ [ "### If the module name is followed by as, then the name following as is bound directly to the imported module.", "_____no_output_____" ] ], [ [ "import fibo as fib", "_____no_output_____" ], [ "dir(fib)", "_____no_output_____" ], [ "fib.fib(50)", "1 1 2 3 5 8 13 21 34 \n" ], [ "### It can also be used when utilising from with similar effects:", "_____no_output_____" ], [ " from fibo import fib as fibonacci", "_____no_output_____" ], [ "fibonacci(200)", "1 1 2 3 5 8 13 21 34 55 89 144 \n" ] ], [ [ "### Executing modules as scriptsยถ", "_____no_output_____" ], [ "When you run a Python module with\n\npython fibo.py <arguments>\n \nthe code in the module will be executed, just as if you imported it, but with the \\_\\_name\\_\\_ set to \"\\_\\_main\\_\\_\". 
That means that by adding this code at the end of your module:", "_____no_output_____" ] ], [ [ "%%writefile fibbo.py \n \n# Fibonacci numbers module\n\ndef fib(n): # write Fibonacci series up to n\n a, b = 0, 1\n while b < n:\n print(b, end=' ')\n a, b = b, a+b\n print()\n\ndef fib2(n): # return Fibonacci series up to n\n result = []\n a, b = 0, 1\n while b < n:\n result.append(b)\n a, b = b, a+b\n return result\n\nif __name__ == \"__main__\":\n import sys\n fib(int(sys.argv[1], 10))", "Overwriting fibbo.py\n" ], [ "import fibbo as fi\nfi.fib(200)", "1 1 2 3 5 8 13 21 34 55 89 144 \n" ] ], [ [ "#### This is often used either to provide a convenient user interface to a module, or for testing purposes (running the module as a script executes a test suite).", "_____no_output_____" ], [ "### The Module Search Path\n\nWhen a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:\n\n* The directory containing the input script (or the current directory when no file is specified).\n* PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).\n* The installation-dependent default.", "_____no_output_____" ], [ "Packages are a way of structuring Python's module namespace by using \"dotted module names\". For example, the module name A.B designates a submodule named B in a package named A. 
Just like the use of modules saves the authors of different modules from having to worry about each other's global variable names, the use of dotted module names saves the authors of multi-module packages like NumPy or Pillow from having to worry about each other's module names.", "_____no_output_____" ] ], [ [ "sound/ Top-level package\n __init__.py Initialize the sound package\n formats/ Subpackage for file format conversions\n __init__.py\n wavread.py\n wavwrite.py\n aiffread.py\n aiffwrite.py\n auread.py\n auwrite.py\n ...\n effects/ Subpackage for sound effects\n __init__.py\n echo.py\n surround.py\n reverse.py\n ...\n filters/ Subpackage for filters\n __init__.py\n equalizer.py\n vocoder.py\n karaoke.py\n ...", "_____no_output_____" ] ], [ [ "The \\_\\_init\\_\\_.py files are required to make Python treat the directories as containing packages; this is done to prevent directories with a common name, such as string, from unintentionally hiding valid modules that occur later on the module search path. In the simplest case, \\_\\_init\\_\\_.py can just be an empty file.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ] ]
d060798bb49284e8cdc2d12caf2fd61bd056185e
237,327
ipynb
Jupyter Notebook
notebooks/testing_multitask.ipynb
GJBoth/MultiTaskPINN
8a9bb23b8bfc0d0f678090e015316dbd0cfbf024
[ "MIT" ]
null
null
null
notebooks/testing_multitask.ipynb
GJBoth/MultiTaskPINN
8a9bb23b8bfc0d0f678090e015316dbd0cfbf024
[ "MIT" ]
null
null
null
notebooks/testing_multitask.ipynb
GJBoth/MultiTaskPINN
8a9bb23b8bfc0d0f678090e015316dbd0cfbf024
[ "MIT" ]
1
2022-02-24T04:27:25.000Z
2022-02-24T04:27:25.000Z
40.72186
198
0.505539
[ [ [ "# General imports\nimport numpy as np\nimport torch\n\n# DeepMoD stuff\nfrom multitaskpinn import DeepMoD\nfrom multitaskpinn.model.func_approx import NN\nfrom multitaskpinn.model.library import Library1D\nfrom multitaskpinn.model.constraint import LeastSquares\nfrom multitaskpinn.model.sparse_estimators import Threshold\nfrom multitaskpinn.training import train, train_multitask\nfrom multitaskpinn.training.sparsity_scheduler import TrainTestPeriodic\n\nfrom phimal_utilities.data import Dataset\nfrom phimal_utilities.data.burgers import BurgersDelta\n\nif torch.cuda.is_available():\n device ='cuda'\nelse:\n device = 'cpu'\ndevice = 'cpu'\n\n# Settings for reproducibility\nnp.random.seed(0)\ntorch.manual_seed(0)\ntorch.backends.cudnn.deterministic = True\ntorch.backends.cudnn.benchmark = False\n\n%load_ext autoreload\n%autoreload 2", "_____no_output_____" ], [ "device", "_____no_output_____" ], [ "# Making dataset\nv = 0.1\nA = 1.0\n\nx = np.linspace(-3, 4, 100)\nt = np.linspace(0.5, 5.0, 50)\nx_grid, t_grid = np.meshgrid(x, t, indexing='ij')\ndataset = Dataset(BurgersDelta, v=v, A=A)\n\nX, y = dataset.create_dataset(x_grid.reshape(-1, 1), t_grid.reshape(-1, 1), n_samples=1000, noise=0.2, random=True, normalize=False)\nX, y = X.to(device), y.to(device)", "_____no_output_____" ], [ "network = NN(2, [30, 30, 30, 30, 30], 1)\nlibrary = Library1D(poly_order=2, diff_order=3) # Library function\nestimator = Threshold(0.1) # Sparse estimator \nconstraint = LeastSquares() # How to constrain\nmodel = DeepMoD(network, library, estimator, constraint).to(device) # Putting it all in the model", "_____no_output_____" ], [ "sparsity_scheduler = TrainTestPeriodic(patience=8, delta=1e-5, periodicity=50)\noptimizer = torch.optim.Adam(model.parameters(), betas=(0.99, 0.999), amsgrad=True, lr=2e-3) # Defining optimizer", "_____no_output_____" ], [ "train_multitask(model, X, y, optimizer, sparsity_scheduler, write_iterations=25, log_dir='runs/testing_multitask_unnormalized/', 
max_iterations=15000, delta=1e-3, patience=8) # Running", "| Iteration | Progress | Time remaining | Loss | MSE | Reg | L1 norm |\n 2150 14.33% 396s -1.56e+01 1.40e-03 1.12e-07 1.55e+00 Algorithm converged. Stopping training.\n" ], [ "network = NN(2, [30, 30, 30, 30, 30], 1)\nlibrary = Library1D(poly_order=2, diff_order=3) # Library function\nestimator = Threshold(0.1) # Sparse estimator \nconstraint = LeastSquares() # How to constrain\nmodel = DeepMoD(network, library, estimator, constraint).to(device) # Putting it all in the model", "_____no_output_____" ], [ "sparsity_scheduler = TrainTestPeriodic(patience=8, delta=1e-5, periodicity=50)\noptimizer = torch.optim.Adam(model.parameters(), betas=(0.99, 0.999), amsgrad=True, lr=2e-3) # Defining optimizer", "_____no_output_____" ], [ "train(model, X, y, optimizer, sparsity_scheduler, write_iterations=25, log_dir='runs/testing_normal_unnormalized/', max_iterations=15000, delta=1e-3, patience=8) # Running", "| Iteration | Progress | Time remaining | Loss | MSE | Reg | L1 norm |\n 11300 75.33% 108s 1.38e-03 1.36e-03 1.72e-05 1.63e+00 Algorithm converged. 
Stopping training.\n" ] ], [ [ "# Quick analysis", "_____no_output_____" ] ], [ [ "from phimal_utilities.analysis import Results\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nsns.set(context='notebook', style='white')\n\n%config InlineBackend.figure_format = 'svg'", "_____no_output_____" ], [ "data_mt = Results('runs/testing_multitask_unnormalized//')\ndata_bl = Results('runs/testing_normal_unnormalized//')\n\nkeys = data_mt.keys", "_____no_output_____" ], [ "fig, axes = plt.subplots(figsize=(10, 3), constrained_layout=True, ncols=2)\n\nax = axes[0]\nax.semilogy(data_bl.df.index, data_bl.df[keys['mse']], label='Baseline')\nax.semilogy(data_mt.df.index, data_mt.df[keys['mse']], label='Multitask')\nax.set_title('MSE')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\nax.legend()\n#ax.set_xlim([0, 8000])\n\n\nax = axes[1]\nax.semilogy(data_bl.df.index, data_bl.df[keys['reg']], label='Baseline')\nax.semilogy(data_mt.df.index, data_mt.df[keys['reg']], label='Multitask')\nax.set_title('Regression')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\nax.legend()\n#ax.set_xlim([0, 8000])\n\nfig.show()", "_____no_output_____" ], [ "fig, axes = plt.subplots(ncols=3, constrained_layout=True, figsize=(15, 4))\n\nax = axes[0]\nax.plot(data_bl.df.index, data_bl.df[keys['coeffs']])\nax.plot(data_bl.df.index, data_bl.df[keys['coeffs'][2]], lw=3)\nax.plot(data_bl.df.index, data_bl.df[keys['coeffs'][5]], lw=3)\nax.set_ylim([-2, 2])\nax.set_title('Coefficients baseline')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\n#ax.set_xlim([0, 8000])\n\nax = axes[1]\nax.plot(data_mt.df.index, data_mt.df[keys['coeffs']])\nax.plot(data_mt.df.index, data_mt.df[keys['coeffs'][2]], lw=3)\nax.plot(data_mt.df.index, data_mt.df[keys['coeffs'][5]], lw=3)\nax.set_ylim([-2, 2])\nax.set_title('Coefficients Multitask')\nax.set_xlabel('Epoch', weight='bold')\nax.set_ylabel('Cost', weight='bold')\n#ax.set_xlim([0, 
8000])\n\nax = axes[2]\ntrue_coeffs = np.zeros(len(keys['unscaled_coeffs']))\ntrue_coeffs[2] = 0.1\ntrue_coeffs[5] = -1\n\nax.semilogy(data_bl.df.index, np.mean(np.abs(data_bl.df[keys['unscaled_coeffs']] - true_coeffs), axis=1), label='Baseline')\nax.semilogy(data_mt.df.index, np.mean(np.abs(data_mt.df[keys['unscaled_coeffs']] - true_coeffs), axis=1), label='Multitask')\nax.set_ylim([-5, 2])\nax.legend()\n\nfig.show()", "_____no_output_____" ] ] ]
[ "code", "markdown", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ] ]
d06079be15478f4e2f794b8f7c80c271870f6724
47,640
ipynb
Jupyter Notebook
notebook/pytorch/nn_tutorial.ipynb
mengwangk/myinvestor-toolkit
3dca9e1accfccf1583dcdbec80d1a0fe9dae2e81
[ "MIT" ]
7
2019-10-13T18:58:33.000Z
2021-08-07T12:46:22.000Z
notebook/pytorch/nn_tutorial.ipynb
mengwangk/myinvestor-toolkit
3dca9e1accfccf1583dcdbec80d1a0fe9dae2e81
[ "MIT" ]
7
2019-12-16T21:25:34.000Z
2022-02-10T00:11:22.000Z
notebook/pytorch/nn_tutorial.ipynb
mengwangk/myinvestor-toolkit
3dca9e1accfccf1583dcdbec80d1a0fe9dae2e81
[ "MIT" ]
4
2020-02-01T11:23:51.000Z
2021-12-13T12:27:18.000Z
29.849624
124
0.558858
[ [ [ "%matplotlib inline", "_____no_output_____" ] ], [ [ "\nWhat is `torch.nn` *really*?\n============================\nby Jeremy Howard, `fast.ai <https://www.fast.ai>`_. Thanks to Rachel Thomas and Francisco Ingham.\n\n", "_____no_output_____" ], [ "We recommend running this tutorial as a notebook, not a script. To download the notebook (.ipynb) file,\nclick `here <https://pytorch.org/tutorials/beginner/nn_tutorial.html#sphx-glr-download-beginner-nn-tutorial-py>`_ .\n\nPyTorch provides the elegantly designed modules and classes `torch.nn <https://pytorch.org/docs/stable/nn.html>`_ ,\n`torch.optim <https://pytorch.org/docs/stable/optim.html>`_ ,\n`Dataset <https://pytorch.org/docs/stable/data.html?highlight=dataset#torch.utils.data.Dataset>`_ ,\nand `DataLoader <https://pytorch.org/docs/stable/data.html?highlight=dataloader#torch.utils.data.DataLoader>`_\nto help you create and train neural networks.\nIn order to fully utilize their power and customize\nthem for your problem, you need to really understand exactly what they're\ndoing. To develop this understanding, we will first train basic neural net\non the MNIST data set without using any features from these models; we will\ninitially only use the most basic PyTorch tensor functionality. 
Then, we will\nincrementally add one feature from ``torch.nn``, ``torch.optim``, ``Dataset``, or\n``DataLoader`` at a time, showing exactly what each piece does, and how it\nworks to make the code either more concise, or more flexible.\n\n**This tutorial assumes you already have PyTorch installed, and are familiar\nwith the basics of tensor operations.** (If you're familiar with Numpy array\noperations, you'll find the PyTorch tensor operations used here nearly identical).\n\nMNIST data setup\n----------------\n\nWe will use the classic `MNIST <http://deeplearning.net/data/mnist/>`_ dataset,\nwhich consists of black-and-white images of hand-drawn digits (between 0 and 9).\n\nWe will use `pathlib <https://docs.python.org/3/library/pathlib.html>`_\nfor dealing with paths (part of the Python 3 standard library), and will\ndownload the dataset using\n`requests <http://docs.python-requests.org/en/master/>`_. We will only\nimport modules when we use them, so you can see exactly what's being\nused at each point.\n\n", "_____no_output_____" ] ], [ [ "from pathlib import Path\nimport requests\n\nDATA_PATH = Path(\"data\")\nPATH = DATA_PATH / \"mnist\"\n\nPATH.mkdir(parents=True, exist_ok=True)\n\nURL = \"http://deeplearning.net/data/mnist/\"\nFILENAME = \"mnist.pkl.gz\"\n\nif not (PATH / FILENAME).exists():\n content = requests.get(URL + FILENAME).content\n (PATH / FILENAME).open(\"wb\").write(content)", "_____no_output_____" ] ], [ [ "This dataset is in numpy array format, and has been stored using pickle,\na python-specific format for serializing data.\n\n", "_____no_output_____" ] ], [ [ "import pickle\nimport gzip\n\nwith gzip.open((PATH / FILENAME).as_posix(), \"rb\") as f:\n ((x_train, y_train), (x_valid, y_valid), _) = pickle.load(f, encoding=\"latin-1\")", "_____no_output_____" ] ], [ [ "Each image is 28 x 28, and is being stored as a flattened row of length\n784 (=28x28). 
Let's take a look at one; we need to reshape it to 2d\nfirst.\n\n", "_____no_output_____" ] ], [ [ "from matplotlib import pyplot\nimport numpy as np\n\npyplot.imshow(x_train[0].reshape((28, 28)), cmap=\"gray\")\nprint(x_train.shape)", "_____no_output_____" ] ], [ [ "PyTorch uses ``torch.tensor``, rather than numpy arrays, so we need to\nconvert our data.\n\n", "_____no_output_____" ] ], [ [ "import torch\n\nx_train, y_train, x_valid, y_valid = map(\n torch.tensor, (x_train, y_train, x_valid, y_valid)\n)\nn, c = x_train.shape\nx_train, x_train.shape, y_train.min(), y_train.max()\nprint(x_train, y_train)\nprint(x_train.shape)\nprint(y_train.min(), y_train.max())", "_____no_output_____" ] ], [ [ "Neural net from scratch (no torch.nn)\n---------------------------------------------\n\nLet's first create a model using nothing but PyTorch tensor operations. We're assuming\nyou're already familiar with the basics of neural networks. (If you're not, you can\nlearn them at `course.fast.ai <https://course.fast.ai>`_).\n\nPyTorch provides methods to create random or zero-filled tensors, which we will\nuse to create our weights and bias for a simple linear model. These are just regular\ntensors, with one very special addition: we tell PyTorch that they require a\ngradient. This causes PyTorch to record all of the operations done on the tensor,\nso that it can calculate the gradient during back-propagation *automatically*!\n\nFor the weights, we set ``requires_grad`` **after** the initialization, since we\ndon't want that step included in the gradient. 
(Note that a trailing ``_`` in\nPyTorch signifies that the operation is performed in-place.)\n\n<div class=\"alert alert-info\"><h4>Note</h4><p>We are initializing the weights here with\n `Xavier initialisation <http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf>`_\n (by multiplying with 1/sqrt(n)).</p></div>\n\n", "_____no_output_____" ] ], [ [ "import math\n\nweights = torch.randn(784, 10) / math.sqrt(784)\nweights.requires_grad_()\nbias = torch.zeros(10, requires_grad=True)", "_____no_output_____" ] ], [ [ "Thanks to PyTorch's ability to calculate gradients automatically, we can\nuse any standard Python function (or callable object) as a model! So\nlet's just write a plain matrix multiplication and broadcasted addition\nto create a simple linear model. We also need an activation function, so\nwe'll write `log_softmax` and use it. Remember: although PyTorch\nprovides lots of pre-written loss functions, activation functions, and\nso forth, you can easily write your own using plain python. PyTorch will\neven create fast GPU or vectorized CPU code for your function\nautomatically.\n\n", "_____no_output_____" ] ], [ [ "def log_softmax(x):\n return x - x.exp().sum(-1).log().unsqueeze(-1)\n\ndef model(xb):\n return log_softmax(xb @ weights + bias)", "_____no_output_____" ] ], [ [ "In the above, the ``@`` stands for the matrix multiplication operation. We will call\nour function on one batch of data (in this case, 64 images). This is\none *forward pass*. Note that our predictions won't be any better than\nrandom at this stage, since we start with random weights.\n\n", "_____no_output_____" ] ], [ [ "bs = 64 # batch size\n\nxb = x_train[0:bs] # a mini-batch from x\npreds = model(xb) # predictions\npreds[0], preds.shape\nprint(preds[0], preds.shape)", "_____no_output_____" ] ], [ [ "As you see, the ``preds`` tensor contains not only the tensor values, but also a\ngradient function. 
We'll use this later to do backprop.\n\nLet's implement negative log-likelihood to use as the loss function\n(again, we can just use standard Python):\n\n", "_____no_output_____" ] ], [ [ "def nll(input, target):\n return -input[range(target.shape[0]), target].mean()\n\nloss_func = nll", "_____no_output_____" ] ], [ [ "Let's check our loss with our random model, so we can see if we improve\nafter a backprop pass later.\n\n", "_____no_output_____" ] ], [ [ "yb = y_train[0:bs]\nprint(loss_func(preds, yb))", "_____no_output_____" ] ], [ [ "Let's also implement a function to calculate the accuracy of our model.\nFor each prediction, if the index with the largest value matches the\ntarget value, then the prediction was correct.\n\n", "_____no_output_____" ] ], [ [ "def accuracy(out, yb):\n preds = torch.argmax(out, dim=1)\n return (preds == yb).float().mean()", "_____no_output_____" ] ], [ [ "Let's check the accuracy of our random model, so we can see if our\naccuracy improves as our loss improves.\n\n", "_____no_output_____" ] ], [ [ "print(accuracy(preds, yb))", "_____no_output_____" ] ], [ [ "We can now run a training loop. For each iteration, we will:\n\n- select a mini-batch of data (of size ``bs``)\n- use the model to make predictions\n- calculate the loss\n- ``loss.backward()`` updates the gradients of the model, in this case, ``weights``\n and ``bias``.\n\nWe now use these gradients to update the weights and bias. We do this\nwithin the ``torch.no_grad()`` context manager, because we do not want these\nactions to be recorded for our next calculation of the gradient. You can read\nmore about how PyTorch's Autograd records operations\n`here <https://pytorch.org/docs/stable/notes/autograd.html>`_.\n\nWe then set the\ngradients to zero, so that we are ready for the next loop.\nOtherwise, our gradients would record a running tally of all the operations\nthat had happened (i.e. 
``loss.backward()`` *adds* the gradients to whatever is\nalready stored, rather than replacing them).\n\n.. tip:: You can use the standard python debugger to step through PyTorch\n code, allowing you to check the various variable values at each step.\n Uncomment ``set_trace()`` below to try it out.\n\n\n", "_____no_output_____" ] ], [ [ "from IPython.core.debugger import set_trace\n\nlr = 0.5 # learning rate\nepochs = 2 # how many epochs to train for\n\nfor epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n # set_trace()\n start_i = i * bs\n end_i = start_i + bs\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n with torch.no_grad():\n weights -= weights.grad * lr\n bias -= bias.grad * lr\n weights.grad.zero_()\n bias.grad.zero_()", "_____no_output_____" ] ], [ [ "That's it: we've created and trained a minimal neural network (in this case, a\nlogistic regression, since we have no hidden layers) entirely from scratch!\n\nLet's check the loss and accuracy and compare those to what we got\nearlier. We expect that the loss will have decreased and accuracy to\nhave increased, and they have.\n\n", "_____no_output_____" ] ], [ [ "print(loss_func(model(xb), yb), accuracy(model(xb), yb))", "_____no_output_____" ] ], [ [ "Using torch.nn.functional\n------------------------------\n\nWe will now refactor our code, so that it does the same thing as before, only\nwe'll start taking advantage of PyTorch's ``nn`` classes to make it more concise\nand flexible. At each step from here, we should be making our code one or more\nof: shorter, more understandable, and/or more flexible.\n\nThe first and easiest step is to make our code shorter by replacing our\nhand-written activation and loss functions with those from ``torch.nn.functional``\n(which is generally imported into the namespace ``F`` by convention). 
This module\ncontains all the functions in the ``torch.nn`` library (whereas other parts of the\nlibrary contain classes). As well as a wide range of loss and activation\nfunctions, you'll also find here some convenient functions for creating neural\nnets, such as pooling functions. (There are also functions for doing convolutions,\nlinear layers, etc, but as we'll see, these are usually better handled using\nother parts of the library.)\n\nIf you're using negative log likelihood loss and log softmax activation,\nthen Pytorch provides a single function ``F.cross_entropy`` that combines\nthe two. So we can even remove the activation function from our model.\n\n", "_____no_output_____" ] ], [ [ "import torch.nn.functional as F\n\nloss_func = F.cross_entropy\n\ndef model(xb):\n return xb @ weights + bias", "_____no_output_____" ] ], [ [ "Note that we no longer call ``log_softmax`` in the ``model`` function. Let's\nconfirm that our loss and accuracy are the same as before:\n\n", "_____no_output_____" ] ], [ [ "print(loss_func(model(xb), yb), accuracy(model(xb), yb))", "_____no_output_____" ] ], [ [ "Refactor using nn.Module\n-----------------------------\nNext up, we'll use ``nn.Module`` and ``nn.Parameter``, for a clearer and more\nconcise training loop. We subclass ``nn.Module`` (which itself is a class and\nable to keep track of state). In this case, we want to create a class that\nholds our weights, bias, and method for the forward step. ``nn.Module`` has a\nnumber of attributes and methods (such as ``.parameters()`` and ``.zero_grad()``)\nwhich we will be using.\n\n<div class=\"alert alert-info\"><h4>Note</h4><p>``nn.Module`` (uppercase M) is a PyTorch specific concept, and is a\n class we'll be using a lot. 
``nn.Module`` is not to be confused with the Python\n concept of a (lowercase ``m``) `module <https://docs.python.org/3/tutorial/modules.html>`_,\n which is a file of Python code that can be imported.</p></div>\n\n", "_____no_output_____" ] ], [ [ "from torch import nn\n\nclass Mnist_Logistic(nn.Module):\n def __init__(self):\n super().__init__()\n self.weights = nn.Parameter(torch.randn(784, 10) / math.sqrt(784))\n self.bias = nn.Parameter(torch.zeros(10))\n\n def forward(self, xb):\n return xb @ self.weights + self.bias", "_____no_output_____" ] ], [ [ "Since we're now using an object instead of just using a function, we\nfirst have to instantiate our model:\n\n", "_____no_output_____" ] ], [ [ "model = Mnist_Logistic()", "_____no_output_____" ] ], [ [ "Now we can calculate the loss in the same way as before. Note that\n``nn.Module`` objects are used as if they are functions (i.e they are\n*callable*), but behind the scenes Pytorch will call our ``forward``\nmethod automatically.\n\n", "_____no_output_____" ] ], [ [ "print(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "Previously for our training loop we had to update the values for each parameter\nby name, and manually zero out the grads for each parameter separately, like this:\n::\n with torch.no_grad():\n weights -= weights.grad * lr\n bias -= bias.grad * lr\n weights.grad.zero_()\n bias.grad.zero_()\n\n\nNow we can take advantage of model.parameters() and model.zero_grad() (which\nare both defined by PyTorch for ``nn.Module``) to make those steps more concise\nand less prone to the error of forgetting some of our parameters, particularly\nif we had a more complicated model:\n::\n with torch.no_grad():\n for p in model.parameters(): p -= p.grad * lr\n model.zero_grad()\n\n\nWe'll wrap our little training loop in a ``fit`` function so we can run it\nagain later.\n\n", "_____no_output_____" ] ], [ [ "def fit():\n for epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n start_i = i * bs\n 
end_i = start_i + bs\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n with torch.no_grad():\n for p in model.parameters():\n p -= p.grad * lr\n model.zero_grad()\n\nfit()", "_____no_output_____" ] ], [ [ "Let's double-check that our loss has gone down:\n\n", "_____no_output_____" ] ], [ [ "print(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "Refactor using nn.Linear\n-------------------------\n\nWe continue to refactor our code. Instead of manually defining and\ninitializing ``self.weights`` and ``self.bias``, and calculating ``xb @\nself.weights + self.bias``, we will instead use the Pytorch class\n`nn.Linear <https://pytorch.org/docs/stable/nn.html#linear-layers>`_ for a\nlinear layer, which does all that for us. Pytorch has many types of\npredefined layers that can greatly simplify our code, and often makes it\nfaster too.\n\n", "_____no_output_____" ] ], [ [ "class Mnist_Logistic(nn.Module):\n def __init__(self):\n super().__init__()\n self.lin = nn.Linear(784, 10)\n\n def forward(self, xb):\n return self.lin(xb)", "_____no_output_____" ] ], [ [ "We instantiate our model and calculate the loss in the same way as before:\n\n", "_____no_output_____" ] ], [ [ "model = Mnist_Logistic()\nprint(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "We are still able to use our same ``fit`` method as before.\n\n", "_____no_output_____" ] ], [ [ "fit()\n\nprint(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "Refactor using optim\n------------------------------\n\nPytorch also has a package with various optimization algorithms, ``torch.optim``.\nWe can use the ``step`` method from our optimizer to take a forward step, instead\nof manually updating each parameter.\n\nThis will let us replace our previous manually coded optimization step:\n::\n with torch.no_grad():\n for p in model.parameters(): p -= p.grad * lr\n model.zero_grad()\n\nand instead use just:\n::\n 
opt.step()\n opt.zero_grad()\n\n(``optim.zero_grad()`` resets the gradient to 0 and we need to call it before\ncomputing the gradient for the next minibatch.)\n\n", "_____no_output_____" ] ], [ [ "from torch import optim", "_____no_output_____" ] ], [ [ "We'll define a little function to create our model and optimizer so we\ncan reuse it in the future.\n\n", "_____no_output_____" ] ], [ [ "def get_model():\n model = Mnist_Logistic()\n return model, optim.SGD(model.parameters(), lr=lr)\n\nmodel, opt = get_model()\nprint(loss_func(model(xb), yb))\n\nfor epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n start_i = i * bs\n end_i = start_i + bs\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\nprint(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "Refactor using Dataset\n------------------------------\n\nPyTorch has an abstract Dataset class. A Dataset can be anything that has\na ``__len__`` function (called by Python's standard ``len`` function) and\na ``__getitem__`` function as a way of indexing into it.\n`This tutorial <https://pytorch.org/tutorials/beginner/data_loading_tutorial.html>`_\nwalks through a nice example of creating a custom ``FacialLandmarkDataset`` class\nas a subclass of ``Dataset``.\n\nPyTorch's `TensorDataset <https://pytorch.org/docs/stable/_modules/torch/utils/data/dataset.html#TensorDataset>`_\nis a Dataset wrapping tensors. By defining a length and way of indexing,\nthis also gives us a way to iterate, index, and slice along the first\ndimension of a tensor. 
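That contract is small enough to sketch by hand. A hypothetical minimal ``Dataset`` (illustrative only, not part of this tutorial's code) needs just the two methods:

```python
import torch
from torch.utils.data import Dataset

class PairDataset(Dataset):
    """Wraps two equal-length tensors of inputs and targets."""
    def __init__(self, x, y):
        assert len(x) == len(y)
        self.x, self.y = x, y

    def __len__(self):
        return len(self.x)

    def __getitem__(self, i):
        # Integer and slice indexing both work, mirroring TensorDataset.
        return self.x[i], self.y[i]

ds = PairDataset(torch.arange(10).float(), torch.arange(10))
print(len(ds))    # 10
xb, yb = ds[2:5]  # slicing along the first dimension
print(xb)         # tensor([2., 3., 4.])
```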
This will make it easier to access both the\nindependent and dependent variables in the same line as we train.\n\n", "_____no_output_____" ] ], [ [ "from torch.utils.data import TensorDataset", "_____no_output_____" ] ], [ [ "Both ``x_train`` and ``y_train`` can be combined in a single ``TensorDataset``,\nwhich will be easier to iterate over and slice.\n\n", "_____no_output_____" ] ], [ [ "train_ds = TensorDataset(x_train, y_train)", "_____no_output_____" ] ], [ [ "Previously, we had to iterate through minibatches of x and y values separately:\n::\n xb = x_train[start_i:end_i]\n yb = y_train[start_i:end_i]\n\n\nNow, we can do these two steps together:\n::\n xb,yb = train_ds[i*bs : i*bs+bs]\n\n\n", "_____no_output_____" ] ], [ [ "model, opt = get_model()\n\nfor epoch in range(epochs):\n for i in range((n - 1) // bs + 1):\n xb, yb = train_ds[i * bs: i * bs + bs]\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\nprint(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "Refactor using DataLoader\n------------------------------\n\nPytorch's ``DataLoader`` is responsible for managing batches. You can\ncreate a ``DataLoader`` from any ``Dataset``. ``DataLoader`` makes it easier\nto iterate over batches. 
Rather than having to use ``train_ds[i*bs : i*bs+bs]``,\nthe DataLoader gives us each minibatch automatically.\n\n", "_____no_output_____" ] ], [ [ "from torch.utils.data import DataLoader\n\ntrain_ds = TensorDataset(x_train, y_train)\ntrain_dl = DataLoader(train_ds, batch_size=bs)", "_____no_output_____" ] ], [ [ "Previously, our loop iterated over batches (xb, yb) like this:\n::\n    for i in range((n-1)//bs + 1):\n        xb,yb = train_ds[i*bs : i*bs+bs]\n        pred = model(xb)\n\nNow, our loop is much cleaner, as (xb, yb) are loaded automatically from the data loader:\n::\n    for xb,yb in train_dl:\n        pred = model(xb)\n\n", "_____no_output_____" ] ], [ [ "model, opt = get_model()\n\nfor epoch in range(epochs):\n    for xb, yb in train_dl:\n        pred = model(xb)\n        loss = loss_func(pred, yb)\n\n        loss.backward()\n        opt.step()\n        opt.zero_grad()\n\nprint(loss_func(model(xb), yb))", "_____no_output_____" ] ], [ [ "Thanks to Pytorch's ``nn.Module``, ``nn.Parameter``, ``Dataset``, and ``DataLoader``,\nour training loop is now dramatically smaller and easier to understand. Let's\nnow try to add the basic features necessary to create effective models in practice.\n\nAdd validation\n-----------------------\n\nIn section 1, we were just trying to get a reasonable training loop set up for\nuse on our training data. In reality, you **always** should also have\na `validation set <https://www.fast.ai/2017/11/13/validation-sets/>`_, in order\nto identify if you are overfitting.\n\nShuffling the training data is\n`important <https://www.quora.com/Does-the-order-of-training-data-matter-when-training-neural-networks>`_\nto prevent correlation between batches and overfitting. On the other hand, the\nvalidation loss will be identical whether we shuffle the validation set or not.\nSince shuffling takes extra time, it makes no sense to shuffle the validation data.\n\nWe'll use a batch size for the validation set that is twice as large as\nthat for the training set. 
This is because the validation set does not\nneed backpropagation and thus takes less memory (it doesn't need to\nstore the gradients). We take advantage of this to use a larger batch\nsize and compute the loss more quickly.\n\n", "_____no_output_____" ] ], [ [ "train_ds = TensorDataset(x_train, y_train)\ntrain_dl = DataLoader(train_ds, batch_size=bs, shuffle=True)\n\nvalid_ds = TensorDataset(x_valid, y_valid)\nvalid_dl = DataLoader(valid_ds, batch_size=bs * 2)", "_____no_output_____" ] ], [ [ "We will calculate and print the validation loss at the end of each epoch.\n\n(Note that we always call ``model.train()`` before training, and ``model.eval()``\nbefore inference, because these are used by layers such as ``nn.BatchNorm2d``\nand ``nn.Dropout`` to ensure appropriate behaviour for these different phases.)\n\n", "_____no_output_____" ] ], [ [ "model, opt = get_model()\n\nfor epoch in range(epochs):\n model.train()\n for xb, yb in train_dl:\n pred = model(xb)\n loss = loss_func(pred, yb)\n\n loss.backward()\n opt.step()\n opt.zero_grad()\n\n model.eval()\n with torch.no_grad():\n valid_loss = sum(loss_func(model(xb), yb) for xb, yb in valid_dl)\n\n print(epoch, valid_loss / len(valid_dl))", "_____no_output_____" ] ], [ [ "Create fit() and get_data()\n----------------------------------\n\nWe'll now do a little refactoring of our own. Since we go through a similar\nprocess twice of calculating the loss for both the training set and the\nvalidation set, let's make that into its own function, ``loss_batch``, which\ncomputes the loss for one batch.\n\nWe pass an optimizer in for the training set, and use it to perform\nbackprop. 
For the validation set, we don't pass an optimizer, so the\nmethod doesn't perform backprop.\n\n", "_____no_output_____" ] ], [ [ "def loss_batch(model, loss_func, xb, yb, opt=None):\n loss = loss_func(model(xb), yb)\n\n if opt is not None:\n loss.backward()\n opt.step()\n opt.zero_grad()\n\n return loss.item(), len(xb)", "_____no_output_____" ] ], [ [ "``fit`` runs the necessary operations to train our model and compute the\ntraining and validation losses for each epoch.\n\n", "_____no_output_____" ] ], [ [ "import numpy as np\n\ndef fit(epochs, model, loss_func, opt, train_dl, valid_dl):\n for epoch in range(epochs):\n model.train()\n for xb, yb in train_dl:\n loss_batch(model, loss_func, xb, yb, opt)\n\n model.eval()\n with torch.no_grad():\n losses, nums = zip(\n *[loss_batch(model, loss_func, xb, yb) for xb, yb in valid_dl]\n )\n val_loss = np.sum(np.multiply(losses, nums)) / np.sum(nums)\n\n print(epoch, val_loss)", "_____no_output_____" ] ], [ [ "``get_data`` returns dataloaders for the training and validation sets.\n\n", "_____no_output_____" ] ], [ [ "def get_data(train_ds, valid_ds, bs):\n return (\n DataLoader(train_ds, batch_size=bs, shuffle=True),\n DataLoader(valid_ds, batch_size=bs * 2),\n )", "_____no_output_____" ] ], [ [ "Now, our whole process of obtaining the data loaders and fitting the\nmodel can be run in 3 lines of code:\n\n", "_____no_output_____" ] ], [ [ "train_dl, valid_dl = get_data(train_ds, valid_ds, bs)\nmodel, opt = get_model()\nfit(epochs, model, loss_func, opt, train_dl, valid_dl)", "_____no_output_____" ] ], [ [ "You can use these basic 3 lines of code to train a wide variety of models.\nLet's see if we can use them to train a convolutional neural network (CNN)!\n\nSwitch to CNN\n-------------\n\nWe are now going to build our neural network with three convolutional layers.\nBecause none of the functions in the previous section assume anything about\nthe model form, we'll be able to use them to train a CNN without any 
modification.\n\nWe will use Pytorch's predefined\n`Conv2d <https://pytorch.org/docs/stable/nn.html#torch.nn.Conv2d>`_ class\nas our convolutional layer. We define a CNN with 3 convolutional layers.\nEach convolution is followed by a ReLU. At the end, we perform an\naverage pooling. (Note that ``view`` is PyTorch's version of numpy's\n``reshape``)\n\n", "_____no_output_____" ] ], [ [ "class Mnist_CNN(nn.Module):\n    def __init__(self):\n        super().__init__()\n        self.conv1 = nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1)\n        self.conv2 = nn.Conv2d(16, 16, kernel_size=3, stride=2, padding=1)\n        self.conv3 = nn.Conv2d(16, 10, kernel_size=3, stride=2, padding=1)\n\n    def forward(self, xb):\n        xb = xb.view(-1, 1, 28, 28)\n        xb = F.relu(self.conv1(xb))\n        xb = F.relu(self.conv2(xb))\n        xb = F.relu(self.conv3(xb))\n        xb = F.avg_pool2d(xb, 4)\n        return xb.view(-1, xb.size(1))\n\nlr = 0.1", "_____no_output_____" ] ], [ [ "`Momentum <https://cs231n.github.io/neural-networks-3/#sgd>`_ is a variation on\nstochastic gradient descent that takes previous updates into account as well\nand generally leads to faster training.\n\n", "_____no_output_____" ] ], [ [ "model = Mnist_CNN()\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)\n\nfit(epochs, model, loss_func, opt, train_dl, valid_dl)", "_____no_output_____" ] ], [ [ "nn.Sequential\n------------------------\n\n``torch.nn`` has another handy class we can use to simplify our code:\n`Sequential <https://pytorch.org/docs/stable/nn.html#torch.nn.Sequential>`_ .\nA ``Sequential`` object runs each of the modules contained within it, in a\nsequential manner. This is a simpler way of writing our neural network.\n\nTo take advantage of this, we need to be able to easily define a\n**custom layer** from a given function. For instance, PyTorch doesn't\nhave a `view` layer, and we need to create one for our network. 
``Lambda``\nwill create a layer that we can then use when defining a network with\n``Sequential``.\n\n", "_____no_output_____" ] ], [ [ "class Lambda(nn.Module):\n def __init__(self, func):\n super().__init__()\n self.func = func\n\n def forward(self, x):\n return self.func(x)\n\n\ndef preprocess(x):\n return x.view(-1, 1, 28, 28)", "_____no_output_____" ] ], [ [ "The model created with ``Sequential`` is simply:\n\n", "_____no_output_____" ] ], [ [ "model = nn.Sequential(\n Lambda(preprocess),\n nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.Conv2d(16, 16, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.Conv2d(16, 10, kernel_size=3, stride=2, padding=1),\n nn.ReLU(),\n nn.AvgPool2d(4),\n Lambda(lambda x: x.view(x.size(0), -1)),\n)\n\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)\n\nfit(epochs, model, loss_func, opt, train_dl, valid_dl)", "_____no_output_____" ] ], [ [ "Wrapping DataLoader\n-----------------------------\n\nOur CNN is fairly concise, but it only works with MNIST, because:\n - It assumes the input is a 28\\*28 long vector\n - It assumes that the final CNN grid size is 4\\*4 (since that's the average\npooling kernel size we used)\n\nLet's get rid of these two assumptions, so our model works with any 2d\nsingle channel image. 
First, we can remove the initial Lambda layer by\nmoving the data preprocessing into a generator:\n\n", "_____no_output_____" ] ], [ [ "def preprocess(x, y):\n    return x.view(-1, 1, 28, 28), y\n\n\nclass WrappedDataLoader:\n    def __init__(self, dl, func):\n        self.dl = dl\n        self.func = func\n\n    def __len__(self):\n        return len(self.dl)\n\n    def __iter__(self):\n        batches = iter(self.dl)\n        for b in batches:\n            yield (self.func(*b))\n\ntrain_dl, valid_dl = get_data(train_ds, valid_ds, bs)\ntrain_dl = WrappedDataLoader(train_dl, preprocess)\nvalid_dl = WrappedDataLoader(valid_dl, preprocess)", "_____no_output_____" ] ], [ [ "Next, we can replace ``nn.AvgPool2d`` with ``nn.AdaptiveAvgPool2d``, which\nallows us to define the size of the *output* tensor we want, rather than\nthe *input* tensor we have. As a result, our model will work with any\nsize input.\n\n", "_____no_output_____" ] ], [ [ "model = nn.Sequential(\n    nn.Conv2d(1, 16, kernel_size=3, stride=2, padding=1),\n    nn.ReLU(),\n    nn.Conv2d(16, 16, kernel_size=3, stride=2, padding=1),\n    nn.ReLU(),\n    nn.Conv2d(16, 10, kernel_size=3, stride=2, padding=1),\n    nn.ReLU(),\n    nn.AdaptiveAvgPool2d(1),\n    Lambda(lambda x: x.view(x.size(0), -1)),\n)\n\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)", "_____no_output_____" ] ], [ [ "Let's try it out:\n\n", "_____no_output_____" ] ], [ [ "fit(epochs, model, loss_func, opt, train_dl, valid_dl)", "_____no_output_____" ] ], [ [ "Using your GPU\n---------------\n\nIf you're lucky enough to have access to a CUDA-capable GPU (you can\nrent one for about $0.50/hour from most cloud providers) you can\nuse it to speed up your code. 
First check that your GPU is working in\nPytorch:\n\n", "_____no_output_____" ] ], [ [ "print(torch.cuda.is_available())", "_____no_output_____" ] ], [ [ "And then create a device object for it:\n\n", "_____no_output_____" ] ], [ [ "dev = torch.device(\n \"cuda\") if torch.cuda.is_available() else torch.device(\"cpu\")", "_____no_output_____" ] ], [ [ "Let's update ``preprocess`` to move batches to the GPU:\n\n", "_____no_output_____" ] ], [ [ "def preprocess(x, y):\n return x.view(-1, 1, 28, 28).to(dev), y.to(dev)\n\n\ntrain_dl, valid_dl = get_data(train_ds, valid_ds, bs)\ntrain_dl = WrappedDataLoader(train_dl, preprocess)\nvalid_dl = WrappedDataLoader(valid_dl, preprocess)", "_____no_output_____" ] ], [ [ "Finally, we can move our model to the GPU.\n\n", "_____no_output_____" ] ], [ [ "model.to(dev)\nopt = optim.SGD(model.parameters(), lr=lr, momentum=0.9)", "_____no_output_____" ] ], [ [ "You should find it runs faster now:\n\n", "_____no_output_____" ] ], [ [ "fit(epochs, model, loss_func, opt, train_dl, valid_dl)", "_____no_output_____" ] ], [ [ "Closing thoughts\n-----------------\n\nWe now have a general data pipeline and training loop which you can use for\ntraining many types of models using Pytorch. To see how simple training a model\ncan now be, take a look at the `mnist_sample` sample notebook.\n\nOf course, there are many things you'll want to add, such as data augmentation,\nhyperparameter tuning, monitoring training, transfer learning, and so forth.\nThese features are available in the fastai library, which has been developed\nusing the same design approach shown in this tutorial, providing a natural\nnext step for practitioners looking to take their models further.\n\nWe promised at the start of this tutorial we'd explain through example each of\n``torch.nn``, ``torch.optim``, ``Dataset``, and ``DataLoader``. 
So let's summarize\nwhat we've seen:\n\n - **torch.nn**\n\n   + ``Module``: creates a callable which behaves like a function, but can also\n     contain state (such as neural net layer weights). It knows what ``Parameter`` (s) it\n     contains and can zero all their gradients, loop through them for weight updates, etc.\n   + ``Parameter``: a wrapper for a tensor that tells a ``Module`` that it has weights\n     that need updating during backprop. Only tensors with the `requires_grad` attribute set are updated.\n   + ``functional``: a module (usually imported into the ``F`` namespace by convention)\n     which contains activation functions, loss functions, etc, as well as non-stateful\n     versions of layers such as convolutional and linear layers.\n - ``torch.optim``: Contains optimizers such as ``SGD``, which update the weights\n   of ``Parameter`` during the backward step.\n - ``Dataset``: An abstract interface of objects with a ``__len__`` and a ``__getitem__``,\n   including classes provided with Pytorch such as ``TensorDataset``.\n - ``DataLoader``: Takes any ``Dataset`` and creates an iterator which returns batches of data.\n\n", "_____no_output_____" ] ] ]
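Pulled together, those four pieces make a complete training script in only a few lines. A self-contained sketch (random stand-in data instead of MNIST so no download is needed; the hyperparameters here are arbitrary):

```python
import torch
import torch.nn.functional as F
from torch import nn, optim
from torch.utils.data import TensorDataset, DataLoader

# Random stand-in data: 256 samples of 784 features, 10 classes.
x = torch.randn(256, 784)
y = torch.randint(0, 10, (256,))
train_dl = DataLoader(TensorDataset(x, y), batch_size=64, shuffle=True)

model = nn.Linear(784, 10)                   # an nn.Module holding Parameters
opt = optim.SGD(model.parameters(), lr=0.1)  # updates those Parameters

for epoch in range(2):
    model.train()
    for xb, yb in train_dl:
        loss = F.cross_entropy(model(xb), yb)  # functional loss
        loss.backward()
        opt.step()
        opt.zero_grad()

print(F.cross_entropy(model(x), y).item())
```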
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d0609a652c7b452c6379d7bee3d565c6749ab9c6
107,761
ipynb
Jupyter Notebook
2. CNN/5D_doubleip.ipynb
nikhil-mathews/MastersPr_Predicting-Human-Pathogen-PPIs-using-Natural-Language-Processing-methods
78bbaaf5e4e52939a522fe14aedbf5acfd29e10c
[ "MIT" ]
null
null
null
2. CNN/5D_doubleip.ipynb
nikhil-mathews/MastersPr_Predicting-Human-Pathogen-PPIs-using-Natural-Language-Processing-methods
78bbaaf5e4e52939a522fe14aedbf5acfd29e10c
[ "MIT" ]
null
null
null
2. CNN/5D_doubleip.ipynb
nikhil-mathews/MastersPr_Predicting-Human-Pathogen-PPIs-using-Natural-Language-Processing-methods
78bbaaf5e4e52939a522fe14aedbf5acfd29e10c
[ "MIT" ]
null
null
null
107,761
107,761
0.918152
[ [ [ "import pandas as pd\n#Google colab does not have pickle\ntry:\n import pickle5 as pickle\nexcept:\n !pip install pickle5\n import pickle5 as pickle\nimport os\nimport seaborn as sns\nimport sys\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom keras.preprocessing.text import Tokenizer\nfrom keras.preprocessing.sequence import pad_sequences\nfrom keras.layers import Dense, Input, GlobalMaxPooling1D,Flatten\nfrom keras.layers import Conv1D, MaxPooling1D, Embedding, Concatenate, Lambda\nfrom keras.models import Model\nfrom sklearn.metrics import roc_auc_score,confusion_matrix,roc_curve, auc\nfrom numpy import random\nfrom keras.layers import LSTM, Bidirectional, GlobalMaxPool1D, Dropout\nfrom keras.optimizers import Adam\nfrom keras.utils.vis_utils import plot_model\n\nimport sys\nsys.path.insert(0,'/content/drive/MyDrive/ML_Data/')\nimport functions as f", "Collecting pickle5\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f7/4c/5c4dd0462c8d3a6bc4af500a6af240763c2ebd1efdc736fc2c946d44b70a/pickle5-0.0.11.tar.gz (132kB)\n\r\u001b[K |โ–ˆโ–ˆโ–Œ | 10kB 17.2MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ | 20kB 12.3MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ– | 30kB 8.9MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ | 40kB 7.9MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ– | 51kB 3.5MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–‰ | 61kB 4.1MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ– | 71kB 4.7MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–‰ | 81kB 5.3MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–Ž | 92kB 5.0MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–‰ | 102kB 5.5MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–Ž | 
112kB 5.5MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–Š | 122kB 5.5MB/s eta 0:00:01\r\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 133kB 5.5MB/s \n\u001b[?25hBuilding wheels for collected packages: pickle5\n Building wheel for pickle5 (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for pickle5: filename=pickle5-0.0.11-cp37-cp37m-linux_x86_64.whl size=219265 sha256=4db361ac18314b7f73b02e1617744cb0e9d5f6acde8a0877a30a5f2c9fdfcbcb\n Stored in directory: /root/.cache/pip/wheels/a6/90/95/f889ca4aa8b0e0c7f21c8470b6f5d6032f0390a3a141a9a3bd\nSuccessfully built pickle5\nInstalling collected packages: pickle5\nSuccessfully installed pickle5-0.0.11\n" ], [ "def load_data(D=1,randomize=False):\n try:\n with open('/content/drive/MyDrive/ML_Data/df_train_'+str(D)+'D.pickle', 'rb') as handle:\n df_train = pickle.load(handle)\n except:\n df_train = pd.read_pickle(\"C:/Users/nik00/py/proj/hyppi-train.pkl\")\n try:\n with open('/content/drive/MyDrive/ML_Data/df_test_'+str(D)+'D.pickle', 'rb') as handle:\n df_test = pickle.load(handle)\n except:\n df_test = pd.read_pickle(\"C:/Users/nik00/py/proj/hyppi-independent.pkl\")\n if randomize:\n return shuff_together(df_train,df_test)\n else:\n return df_train,df_test\n\ndf_train,df_test = load_data(5)\nprint('The data used will be:')\ndf_train[['Human','Yersinia']]", "The data used will be:\n" ], [ "lengths = sorted(len(s) for s in df_train['Human'])\nprint(\"Median length of Human sequence is\",lengths[len(lengths)//2])\n_ = sns.displot(lengths)\n_=plt.title(\"Most Human sequences seem to be less than 2000 in length\")", "Median length of Human sequence is 477\n" ], [ "lengths = sorted(len(s) for s in df_train['Yersinia'])\nprint(\"Median length of Yersinia sequence is\",lengths[len(lengths)//2])\n_ = sns.displot(lengths)\n_=plt.title(\"Most Yersinia sequences seem to be less than 1000 in 
length\")", "Median length of Yersinia sequence is 334\n" ], [ "data1_5D_doubleip_pre,data2_5D_doubleip_pre,data1_test_5D_doubleip_pre,data2_test_5D_doubleip_pre,num_words_5D,MAX_SEQUENCE_LENGTH_5D,MAX_VOCAB_SIZE_5D = f.get_seq_data_doubleip(500000,1000,df_train,df_test,pad = 'pre',show = True)", "MAX_VOCAB_SIZE is 500000\nMAX_SEQUENCE_LENGTH is 1000\nmax sequences1_train length: 5301\nmin sequences1_train length: 12\nmedian sequences1_train length: 327\n" ], [ "EMBEDDING_DIM_5D = 15\nVALIDATION_SPLIT = 0.2\nBATCH_SIZE = 128\nEPOCHS = 5\nDROP=0.7\n\nx1 = f.conv_model(MAX_SEQUENCE_LENGTH_5D,EMBEDDING_DIM_5D,num_words_5D,DROP)\nx2 = f.conv_model(MAX_SEQUENCE_LENGTH_5D,EMBEDDING_DIM_5D,num_words_5D,DROP)\n\nconcatenator = Concatenate(axis=1)\nx = concatenator([x1.output, x2.output])\nx = Dense(128)(x)\nx = Dropout(DROP)(x)\noutput = Dense(1, activation=\"sigmoid\",name=\"Final\")(x)\nmodel5D_CNN_doubleip = Model(inputs=[x1.input, x2.input], outputs=output)\n\nmodel5D_CNN_doubleip.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])\n#plot_model(model5D_CNN_doubleip, to_file='model_plot.png', show_shapes=True, show_layer_names=False)\n\ntrains = [data1_5D_doubleip_pre,data2_5D_doubleip_pre]\ntests = [data1_test_5D_doubleip_pre,data2_test_5D_doubleip_pre]\n\n\nmodel5D_CNN_doubleip.fit(trains, df_train['label'].values, epochs=EPOCHS, batch_size=BATCH_SIZE,validation_data=(tests, df_test['label'].values))\nprint(roc_auc_score(df_test['label'].values, model5D_CNN_doubleip.predict(tests)))\n\n#asd\n", "Epoch 1/5\n49/49 [==============================] - 9s 165ms/step - loss: 0.6156 - accuracy: 0.6580 - val_loss: 0.5116 - val_accuracy: 0.7761\nEpoch 2/5\n49/49 [==============================] - 8s 160ms/step - loss: 0.4532 - accuracy: 0.7840 - val_loss: 0.4213 - val_accuracy: 0.8210\nEpoch 3/5\n49/49 [==============================] - 8s 159ms/step - loss: 0.2253 - accuracy: 0.9179 - val_loss: 0.4269 - val_accuracy: 0.8223\nEpoch 4/5\n49/49 
[==============================] - 8s 157ms/step - loss: 0.1099 - accuracy: 0.9620 - val_loss: 0.4274 - val_accuracy: 0.8296\nEpoch 5/5\n49/49 [==============================] - 8s 159ms/step - loss: 0.0653 - accuracy: 0.9793 - val_loss: 0.4897 - val_accuracy: 0.8236\n0.8995234264434628\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code" ] ]
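The record above evaluates its dual-input protein-interaction CNN with `roc_auc_score` (the final printed value, roughly 0.90). As a minimal illustration of what that metric measures, using hypothetical toy labels and scores rather than the notebook's data, ROC AUC is the fraction of (positive, negative) pairs that the scores rank correctly:

```python
def pairwise_auc(y_true, y_score):
    """ROC AUC via the Mann-Whitney statistic: the fraction of
    (positive, negative) pairs ranked correctly; ties count half."""
    pos = [s for t, s in zip(y_true, y_score) if t == 1]
    neg = [s for t, s in zip(y_true, y_score) if t == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# 3 of the 4 positive/negative pairs are ordered correctly -> 0.75
print(pairwise_auc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))
```

sklearn's `roc_auc_score`, as used in the notebook, computes the same quantity with an optimized implementation.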
d0609b9a0781386f0d189721804704e4449abfe0
255,014
ipynb
Jupyter Notebook
Decision_tree_C5.O_CART.ipynb
anagha0397/Decision-Tree
745b3ca72ac52a93ef947c130bf5cb60ddc20f65
[ "MIT" ]
null
null
null
Decision_tree_C5.O_CART.ipynb
anagha0397/Decision-Tree
745b3ca72ac52a93ef947c130bf5cb60ddc20f65
[ "MIT" ]
null
null
null
Decision_tree_C5.O_CART.ipynb
anagha0397/Decision-Tree
745b3ca72ac52a93ef947c130bf5cb60ddc20f65
[ "MIT" ]
null
null
null
158.788294
169,820
0.860051
[ [ [ "import pandas as pd \nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom sklearn.datasets import load_iris\nfrom sklearn.model_selection import train_test_split # for splitting the data into train and test \nfrom sklearn.tree import DecisionTreeClassifier # For creating a decision tree\nfrom sklearn import tree # for displaying the tree\nfrom sklearn.metrics import classification_report # for calculating accuracy\nfrom sklearn import preprocessing # We use this preprocessing library because we apply an encoding technique ", "_____no_output_____" ], [ "iris = pd.read_csv(\"iris.csv\", index_col = 0) # index_col = 0 sets the first column as the index", "_____no_output_____" ], [ "iris.head()", "_____no_output_____" ], [ "# Converting the species column to numbers using an encoding technique called label encoding\n\nlabel_encoder = preprocessing.LabelEncoder() # Instantiating the encoder\niris['Species'] = label_encoder.fit_transform(iris['Species'])", "_____no_output_____" ], [ "iris.head()", "_____no_output_____" ], [ "# Splitting the data into x and y; for any classification we first need to split the data into input and output\n\nx = iris.iloc[:,0:4]\ny = iris['Species']", "_____no_output_____" ], [ "x", "_____no_output_____" ], [ "y", "_____no_output_____" ], [ "iris['Species'].unique() # for determining unique values", "_____no_output_____" ], [ "iris.Species.value_counts()", "_____no_output_____" ], [ "# Splitting the data into training and test datasets\n\nx_train, x_test, y_train, y_test = train_test_split(x,y,\n                                                   test_size=0.2,\n                                                   random_state=40)", "_____no_output_____" ] ], [ [ "### Building a decision tree classifier using the entropy criterion (C5.0)", "_____no_output_____" ] ], [ [ "model = DecisionTreeClassifier(criterion = 'entropy',max_depth = 3)\n", "_____no_output_____" ], [ "model.fit(x_train,y_train)", "_____no_output_____" ] ], [ [ "### Plotting the decision
tree", "_____no_output_____" ] ], [ [ "tree.plot_tree(model);", "_____no_output_____" ], [ "model.get_n_leaves()", "_____no_output_____" ], [ "## As this tree is not clearly visible, we will display it with another technique", "_____no_output_____" ], [ "# We will extract the feature names and class names, and define the figure size so that our tree is displayed more clearly", "_____no_output_____" ], [ "fn = ['SepalLengthCm',\t'SepalWidthCm',\t'PetalLengthCm',\t'PetalWidthCm']\ncn = ['Iris-setosa', 'Iris-versicolor', 'Iris-virginica']\nfig,axes = plt.subplots(nrows = 1, ncols =1, figsize =(4,4), dpi = 300) # dpi sets the resolution (dots per inch)\ntree.plot_tree(model, feature_names = fn, class_names = cn, filled = True); # filled = True colors the nodes by class", "_____no_output_____" ], [ "# Making predictions with the fitted model on our x_test data", "_____no_output_____" ], [ "preds = model.predict(x_test)\npd.Series(preds).value_counts()", "_____no_output_____" ], [ "preds", "_____no_output_____" ], [ "# To check whether the predictions are correct or wrong, we will create a cross tab against the y_test data", "_____no_output_____" ], [ "crosstable = pd.crosstab(y_test,preds)\ncrosstable", "_____no_output_____" ], [ "# Final step: we will calculate the accuracy of our model", "_____no_output_____" ], [ "np.mean(preds==y_test) # We compare the predicted values with the actual values and take the mean of the matches", "_____no_output_____" ], [ "print(classification_report(preds,y_test))", " precision recall f1-score support\n\n 0 1.00 1.00 1.00 8\n 1 1.00 0.92 0.96 13\n 2 0.90 1.00 0.95 9\n\n accuracy 0.97 30\n macro avg 0.97 0.97 0.97 30\nweighted avg 0.97 0.97 0.97 30\n\n" ] ], [ [ "## Building a decision tree using the CART method (Classifier model)", "_____no_output_____" ] ], [ [ "model_1 = DecisionTreeClassifier(criterion = 'gini',max_depth = 3)", "_____no_output_____" ], [ "model_1.fit(x_train,y_train)", "_____no_output_____" ], [
"tree.plot_tree(model_1);", "_____no_output_____" ], [ "# Predicting the values on the x_test data\n\npreds = model_1.predict(x_test)", "_____no_output_____" ], [ "preds", "_____no_output_____" ], [ "pd.Series(preds).value_counts()", "_____no_output_____" ], [ "# Calculating the accuracy of the model using the actual values", "_____no_output_____" ], [ "np.mean(preds==y_test)", "_____no_output_____" ] ], [ [ "## Decision tree regressor using CART", "_____no_output_____" ] ], [ [ "from sklearn.tree import DecisionTreeRegressor", "_____no_output_____" ], [ "# Converting the iris data as follows because we want Y to be numeric\n\nX = iris.iloc[:,0:3]\nY = iris.iloc[:,3]", "_____no_output_____" ], [ "X_train,X_test,Y_train,Y_test = train_test_split(X,Y, test_size = 0.33, random_state = 1)", "_____no_output_____" ], [ "model_reg = DecisionTreeRegressor()\nmodel_reg.fit(X_train,Y_train)", "_____no_output_____" ], [ "preds1 = model_reg.predict(X_test)\npreds1", "_____no_output_____" ], [ "# We will see the correct and wrong matches", "_____no_output_____" ], [ "pd.crosstab(Y_test,preds1)", "_____no_output_____" ], [ "## We will calculate the accuracy by using the score method; this is another way to evaluate the model", "_____no_output_____" ], [ "model_reg.score(X_test,Y_test) # This score function first calculates the predicted values from the X_test data and then internally compares those values with Y_test, which is our actual data", "_____no_output_____" ] ], [ [ "model_reg.score calculates the R-squared (coefficient of determination) value in the background", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ] ]
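The decision-tree record above switches between `criterion = 'entropy'` (C5.0-style) and `criterion = 'gini'` (CART). As a quick sketch of what those two impurity measures compute for the class counts at a node (the counts below are hypothetical, not an actual iris split):

```python
from math import log2

def entropy(counts):
    """Shannon entropy of a node's class distribution, in bits."""
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c)

def gini(counts):
    """Gini impurity: chance of mislabeling a randomly drawn sample."""
    total = sum(counts)
    return 1 - sum((c / total) ** 2 for c in counts)

node = [8, 13, 9]  # hypothetical counts of the three classes at a node
print(round(entropy(node), 3), round(gini(node), 3))
```

A pure node scores 0 under both criteria; the tree greedily picks the split that reduces impurity the most.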
d0609e31ed6dc92d0a56a47774098cabc15e3d07
17,609
ipynb
Jupyter Notebook
Lectures/Lecture6/Intro to Machine Learning Homework.ipynb
alaymodi/Spring-2019-Career-Exploration-master
2ca9b4466090d57702e97e70fa772535b2dc00f3
[ "MIT" ]
null
null
null
Lectures/Lecture6/Intro to Machine Learning Homework.ipynb
alaymodi/Spring-2019-Career-Exploration-master
2ca9b4466090d57702e97e70fa772535b2dc00f3
[ "MIT" ]
null
null
null
Lectures/Lecture6/Intro to Machine Learning Homework.ipynb
alaymodi/Spring-2019-Career-Exploration-master
2ca9b4466090d57702e97e70fa772535b2dc00f3
[ "MIT" ]
null
null
null
43.803483
1,777
0.61991
[ [ [ "# Homework", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt\n%matplotlib inline\nimport random\nimport numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\nfrom plotting import overfittingDemo, plot_multiple_linear_regression, overlay_simple_linear_model,plot_simple_residuals\nfrom scipy.optimize import curve_fit", "_____no_output_____" ] ], [ [ "**Exercise 1:** What are the two \"specialities\" of machine learning? Pick one and in your own words, explain what it means.", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 2:** What is the difference between a regression task and a classification task?", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 3:** \n1. What is parametric fitting in your understanding?\n2. Given the data $x = 1,2,3,4,5, y_1 = 2,4,6,8,10, y_2 = 2,4,8,16,32,$ what function $f_1, f_2$ will you use to fit $y_1, y_2$? Why do you choose those?\n3. Why is parametric fitting somehow not machine learning?", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 4:** Take a look at the following residual plots. Residuals can be helpful in assessing if our model is overpredicting or underpredicting certain values. Assign the variable bestplot to the letter corresponding to which residual plot indicates a good fit for a linear model.\n\n<img src='residplots.png' width=\"600\" height=\"600\">", "_____no_output_____" ] ], [ [ "bestplot = 'Put your letter answer between these quotes'", "_____no_output_____" ] ], [ [ "**Exercise 5:** Observe the following graphs.
Assign each graph variable to one of the following strings: 'overfitting', 'underfitting', or 'bestfit'.\n<img src='overfit-underfit.png' width=\"800\" height=\"800\">", "_____no_output_____" ] ], [ [ "graph1 = \"Put answer here\"\ngraph2 = \"Put answer here\"\ngraph3 = \"Put answer here\"", "_____no_output_____" ] ], [ [ "**Exercise 6:** What are the 3 sets we split our initial data set into?", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 7:** Refer to the graphs below when answering the following questions (Exercise 6 and 7).\n<img src='training_vs_test_error.png' width=\"800\" height=\"800\">\nAs we increase the degree of our model, what happens to the training error and what happens to the test error? ", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 8:** What is the issue with just increasing the degree of our model to get the lowest training error possible?", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 9:** Find the gradient for ridge loss, most concretely, when $L(\\theta, \\textbf{y}, \\alpha)\n= (\\frac{1}{n} \\sum_{i = 1}^{n}(y_i - \\theta)^2) + \\frac{\\alpha }{2}\\sum_{i = 1}^{n}\\theta ^2$\nfind $\\frac{\\partial}{\\partial \\hat{\\theta}} L(\\theta, \\textbf{y},\\alpha)$, you can have a look at the class example, they are really similar.", "_____no_output_____" ], [ "Your Answer Here", "_____no_output_____" ], [ "**Exercise 10:** Following the last part of the exercise, you've already fitted your model, now let's test the performance. Make sure you check the code for the previous example we went through in class.\n\n1. copy what you had from the exercise here.", "_____no_output_____" ] ], [ [ "import pandas as pd\n\nmpg = pd.read_csv(\"./mpg_category.csv\", index_col=\"name\")\n\n#exercise part 1\nmpg['Old?'] = ... 
\n\n#exercise part 2\nmpg_train, mpg_test = ..., ...\n\n#exercise part 3\nfrom sklearn.linear_model import LogisticRegression\nsoftmax_reg = LogisticRegression(multi_class=\"multinomial\",solver=\"lbfgs\", C=10)\nX = ...\nY = ...\nsoftmax_reg.fit(X, Y)", "_____no_output_____" ] ], [ [ "2. create the test data set and make the prediction on test dataset", "_____no_output_____" ] ], [ [ "X_test = ...\nY_test = ...\npred = softmax_reg.predict(...)", "_____no_output_____" ] ], [ [ "3. Make the confusion matrix and tell me how you interpret each of the cell in the confusion matrix. What does different depth of blue means. You can just run the cell below, assumed what you did above is correct. You just have to answer your understanding.", "_____no_output_____" ] ], [ [ "from sklearn.metrics import confusion_matrix\nconfusion_matrix = confusion_matrix(Y_test, pred)\nX_label = ['old', 'new']\ndef plot_confusion_matrix(cm, title='Confusion matrix', cmap=plt.cm.Blues):\n plt.imshow(cm, interpolation='nearest', cmap=cmap)\n plt.title(title)\n plt.colorbar()\n tick_marks = np.arange(len(X_label))\n plt.xticks(tick_marks, X_label, rotation=45)\n plt.yticks(tick_marks, X_label,)\n plt.tight_layout()\n plt.ylabel('True label')\n plt.xlabel('Predicted label')\nplot_confusion_matrix(confusion_matrix)\n# confusion_matrix", "_____no_output_____" ] ], [ [ "Your Answer Here", "_____no_output_____" ] ], [ [ "# be sure to hit save (File > Save and Checkpoint) or Ctrl/Command-S before you run the cell!\nfrom submit import create_and_submit\n\ncreate_and_submit(['Intro to Machine Learning Homework.ipynb'], verbose=True)", "Parsed Intro to Machine Learning Homework.ipynb\nEnter your Berkeley email address: [email protected]\nPosting answers for Intro to Machine Learning Homework\nYour submission: {'exercise-1': 'Your Answer Here', 'exercise-1_output': None, 'exercise-2': 'Your Answer Here', 'exercise-2_output': None, 'exercise-3': 'Your Answer Here', 'exercise-3_output': None, 'exercise-4': 
\"bestplot = 'Put your letter answer between these quotes'\", 'exercise-4_output': None, 'exercise-5': 'graph1 = \"Put answer here\"\\ngraph2 = \"Put answer here\"\\ngraph3 = \"Put answer here\"', 'exercise-5_output': None, 'exercise-6': 'Your Answer Here', 'exercise-6_output': None, 'exercise-7': 'Your Answer Here', 'exercise-7_output': None, 'exercise-8': 'Your Answer Here', 'exercise-8_output': None, 'exercise-9': 'Your Answer Here', 'exercise-9_output': None, 'exercise-10-1': 'import pandas as pd\\n\\nmpg = pd.read_csv(\"./mpg_category.csv\", index_col=\"name\")\\n\\n#exercise part 1\\nmpg[\\'Old?\\'] = ... \\n\\n#exercise part 2\\nmpg_train, mpg_test = ..., ...\\n\\n#exercise part 3\\nfrom sklearn.linear_model import LogisticRegression\\nsoftmax_reg = LogisticRegression(multi_class=\"multinomial\",solver=\"lbfgs\", C=...)\\nX = ...\\nY = ...\\nsoftmax_reg.fit(X, Y)', 'exercise-10-1_output': None, 'exercise-10-2': '2. create the test data set and make the prediction on test dataset', 'exercise-10-2_output': None, 'exercise-10-3': '3. Make the confusion matrix and tell me how you interpret each of the cell in the confusion matrix. What does different depth of blue means. You can just run the cell below, assumed what you did above is correct. You just have to answer your understanding.', 'exercise-10-3_output': None, 'exercise-10-4': 'Your Answer Here', 'exercise-10-4_output': None, 'email': '[email protected]', 'sheet': 'Intro to Machine Learning Homework', 'timestamp': datetime.datetime(2019, 3, 18, 16, 46, 54, 7302)}\n\nSubmitted!\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
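The homework record above plots a confusion matrix and asks how to interpret each cell. A minimal sketch (with hypothetical labels, not the mpg data): cell (i, j) counts the samples whose true class is i and predicted class is j, so the diagonal holds correct predictions, and a darker cell in the plot simply means a larger count.

```python
def confusion(y_true, y_pred, labels=(0, 1)):
    """cm[i][j] = samples with true label labels[i] predicted as labels[j]."""
    index = {lab: k for k, lab in enumerate(labels)}
    cm = [[0] * len(labels) for _ in labels]
    for t, p in zip(y_true, y_pred):
        cm[index[t]][index[p]] += 1
    return cm

y_true = [0, 0, 1, 1, 1, 0]  # hypothetical 'old'/'new' labels
y_pred = [0, 1, 1, 1, 0, 0]
print(confusion(y_true, y_pred))  # [[2, 1], [1, 2]]
```

`sklearn.metrics.confusion_matrix`, used in the homework, builds the same table.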
d060a7328730ae52269f71692aeaffba9f64d9a6
15,673
ipynb
Jupyter Notebook
_notebooks/2022-01-03-cs231n.ipynb
star77sa/TIL-Blog
782a24bf0b2324a66024e984dd1c7f3536cd17b9
[ "Apache-2.0" ]
null
null
null
_notebooks/2022-01-03-cs231n.ipynb
star77sa/TIL-Blog
782a24bf0b2324a66024e984dd1c7f3536cd17b9
[ "Apache-2.0" ]
1
2021-07-24T16:33:20.000Z
2021-07-24T16:43:02.000Z
_notebooks/2022-01-03-cs231n.ipynb
star77sa/TIL-Blog
782a24bf0b2324a66024e984dd1c7f3536cd17b9
[ "Apache-2.0" ]
null
null
null
36.79108
722
0.617559
[ [ [ "# CS231n_CNN for Visual Recognition\n> Stanford University CS231n\n\n- toc: true \n- badges: true\n- comments: true\n- categories: [CNN]\n- image: images/", "_____no_output_____" ], [ "---", "_____no_output_____" ], [ "- http://cs231n.stanford.edu/", "_____no_output_____" ], [ "---\n# Image Classification\n\n", "_____no_output_____" ], [ "- **Image Classification:** We are given a **Training Set** of labeled images, asked to predict labels on **Test Set.** Common to report the **Accuracy** of predictions (fraction of correctly predicted images)\n\n- We introduced the **k-Nearest Neighbor Classifier**, which predicts the labels based on nearest images in the training set\n\n- We saw that the choice of distance and the value of k are **hyperparameters** that are tuned using a **validation set**, or through **cross-validation** if the size of the data is small.\n\n- Once the best set of hyperparameters is chosen, the classifier is evaluated once on the test set, and reported as the performance of kNN on that data.", "_____no_output_____" ], [ "- We saw that the Nearest Neighbor classifier reaches about 40% accuracy on the CIFAR-10 dataset. It is very simple to implement, but it requires us to store the entire training set in memory, and it is computationally expensive to classify and evaluate a new test image.\n\n- We saw that raw L1 or L2 distances on pixel values are not adequate for image classification, because they are influenced more by the background and the overall color distribution of an image than by its class.", "_____no_output_____" ], [ "---\n# Linear Classification", "_____no_output_____" ], [ "- We defined a **score function** from image pixels to class scores (in this section, a linear function that depends on weights **W** and biases **b**).\n\n- Unlike kNN classifier, the advantage of this **parametric approach** is that once we learn the parameters we can discard the training data. Additionally, the prediction for a new test image is fast since it requires a single matrix multiplication with **W**, not an exhaustive comparison to every single training example.\n\n- We introduced the **bias trick**, which allows us to fold the bias vector into the weight matrix for convenience of only having to keep track of one parameter matrix.\n\n- We defined a **loss function** (we introduced two commonly used losses for linear classifiers: the **SVM** and the **Softmax**) that measures how compatible a given set of parameters is with respect to the ground truth labels in the training dataset.
We also saw that the loss function was defined in such a way that making good predictions on the training data is equivalent to having a small loss.", "_____no_output_____" ], [ "---", "_____no_output_____" ], [ "# Optimization", "_____no_output_____" ], [ "- We developed the intuition of the loss function as a **high-dimensional optimization landscape** in which we are trying to reach the bottom. The working analogy we developed was that of a blindfolded hiker who wishes to reach the bottom. In particular, we saw that the SVM cost function is piece-wise linear and bowl-shaped.\n\n- We motivated the idea of optimizing the loss function with **iterative refinement**, where we start with a random set of weights and refine them step by step until the loss is minimized.\n\n- We saw that the **gradient** of a function gives the steepest ascent direction and we discussed a simple but inefficient way of computing it numerically using the finite difference approximation (the finite difference being the value of h used in computing the numerical gradient).\n\n- We saw that the parameter update requires a tricky setting of the **step size** (or the **learning rate**) that must be set just right: if it is too low the progress is steady but slow. If it is too high the progress can be faster, but more risky. We will explore this tradeoff in much more detail in future sections.\n\n- We discussed the tradeoffs between computing the **numerical** and **analytic** gradient. The numerical gradient is simple but it is approximate and expensive to compute. The analytic gradient is exact, fast to compute but more error-prone since it requires the derivation of the gradient with math.
Hence, in practice we always use the analytic gradient and then perform a **gradient check**, in which its implementation is compared to the numerical gradient.\n\n- We introduced the **Gradient Descent** algorithm which iteratively computes the gradient and performs a parameter update in loop.", "_____no_output_____" ], [ "---", "_____no_output_____" ], [ "# Backprop", "_____no_output_____" ], [ "- We developed intuition for what the gradients mean, how they flow backwards in the circuit, and how they communicate which part of the circuit should increase or decrease and with what force to make the final output higher.\n\n- We discussed the importance of **staged computation** for practical implementations of backpropagation. You always want to break up your function into modules for which you can easily derive local gradients, and then chain them with chain rule. Crucially, you almost never want to write out these expressions on paper and differentiate them symbolically in full, because you never need an explicit mathematical equation for the gradient of the input variables. Hence, decompose your expressions into stages such that you can differentiate every stage independently (the stages will be matrix vector multiplies, or max operations, or sum operations, etc.) and then backprop through the variables one step at a time.", "_____no_output_____" ], [ "---\n# Neural Network - 1", "_____no_output_____" ], [ "- We introduced a very coarse model of a biological **neuron**\n\n- We discussed several **activation functions** that are used in practice, with ReLU being the most common choice.\n - Why we use an activation function: to make the mapping nonlinear; with purely linear layers, the whole stack collapses to the equivalent of a single layer.\n\n\n- We introduced **Neural Networks** where neurons are connected with **Fully-Connected layers** where neurons in adjacent layers have full pair-wise connections, but neurons within a layer are not connected.\n\n- We saw that this layered architecture enables very efficient evaluation of neural networks, based on matrix multiplications interwoven with the application of the activation function.\n\n- We saw that Neural Networks are **universal function approximators** (a NN can approximate any function), but we also discussed the fact that this property has little to do with their ubiquitous use. They are used because they make certain “right” assumptions about the functional forms of functions that come up in practice.\n\n- We discussed the fact that larger networks will always work better than smaller networks, but that their higher model capacity must be appropriately addressed with stronger regularization (such as higher weight decay), or they might overfit. We will see more forms of regularization (especially dropout) in later sections.", "_____no_output_____" ], [ "---\n# Neural Network - 2", "_____no_output_____" ], [ "- The recommended preprocessing is to center the data to have zero mean (zero centered), and to normalize its scale to [-1, 1].\n - The correct way to preprocess: e.g. when using mean subtraction, first split the data into train/validation/test sets, compute the mean on the training data only, and then apply that mean subtraction to all of the splits (train, validation, test).\n\n- Use ReLU, and initialize the weights by drawing them from a normal distribution with a standard deviation of $\sqrt{2/n}$, where $n$ is the number of inputs to the neuron. E.g.
in numpy: `w = np.random.randn(n) * sqrt(2.0/n)`.\n\n- Use L2 regularization and dropout (the inverted version)\n\n- Use Batch normalization (when it is used, dropout is rarely used)\n\n- We discussed the different tasks you might want to perform in practice, and the most common loss function for each task.", "_____no_output_____" ], [ "---\n# Neural Network - 3", "_____no_output_____" ], [ "To train a neural network:\n\n- Gradient check your implementation with a small batch of data as you write the code, and be aware of the pitfalls.\n\n- As a sanity check that the code works, make sure your initial loss is reasonable, and that you can achieve 100% training accuracy on a very small portion of the data.\n\n- During training, monitor the loss and the train/validation accuracy, and (if you're feeling fancier) the magnitude of updates in relation to the current parameter values (it should be roughly ~1e-3). If you are dealing with a ConvNet, also look at the first-layer weights.\n\n- The two recommended updates to use are either SGD+Nesterov Momentum or Adam.\n\n- Decay the learning rate over the period of training. For example, halve the learning rate after a fixed number of epochs (or whenever the validation accuracy rises and then tops off).\n\n- Search for good hyperparameters with random search, not grid search. Stage your search from coarse (wide hyperparameter ranges, training only for 1-5 epochs) to fine (narrower ranges, training for many more epochs).\n- Form model ensembles for extra performance.", "_____no_output_____" ], [ "---\n# CNN", "_____no_output_____" ], [ "- A ConvNet architecture transforms an input image volume into an output volume (class scores) through a series of layers.\n\n- A ConvNet is made up of a few distinct types of layers. CONV/FC/RELU/POOL are by far the most popular.\n\n- Each layer transforms an input 3D volume into an output 3D volume through a differentiable function.\n\n- Some layers have parameters and others don't (FC/CONV have parameters, RELU/POOL etc. don't).\n\n- Some layers have hyperparameters and others don't (CONV/FC/POOL layers have hyperparameters, ReLU doesn't).\n\n- stride, zero-padding ...", "_____no_output_____" ], [ "---\n# Spatial Localization and Detection", "_____no_output_____" ], [ "<img src='img/cs231/detect.png' width=\"500\" height=\"500\">", "_____no_output_____" ], [ "- Classification : the output is a label for the image\n- Localization : the output is a box for the image (x, y, w, h)\n- Detection : the output is multiple boxes for the image, DOG(x, y, w, h), CAT(x, y, w, h), ...\n- Segmentation : instead of a box, the output traces the actual shape of the object.", "_____no_output_____" ], [ "- Localization method : localization as Regression, Sliding Window : Overfeat\n\n- Region Proposals : generate boxes based on similar colors and textures\n\n- Detection :\n - R-CNN : Region-based CNN. Region -> CNN\n - Problem : running the CNN for every region proposal takes a very long time.\n - Fast R-CNN : CNN -> Region\n - Problem : the region-proposal step now takes most of the time.\n - Faster R-CNN : do the Region Proposals with a CNN as well.\n \n - YOLO(You Only Look Once) : Detection as Regression\n - Less accurate than Faster R-CNN, but very fast.", "_____no_output_____" ], [ "---\n# CNNs in practice", "_____no_output_____" ], [ "- Data Augmentation\n - Change the pixels without changing the label\n - Train on transformed data\n - VERY widely used\n \n .....\n \n 1. Horizontal flips\n 2. Random crops/scales\n 3. Color jitter", "_____no_output_____" ], [ "- Transfer learning\n\n It makes sense that pretraining helps when the data is related to the ImageNet classes, but why does performance also improve for unrelated images (e.g. medical images such as MRI)?\n\n -> The early layers recognize low-level features such as edges and colors, and later layers recognize progressively higher-level concepts. Having the low-level features already learned helps when analyzing any kind of image!", "_____no_output_____" ], [ "- How to stack convolutions:\n\n - Replace large convolutions (5x5, 7x7) with stacks of 3x3 convolutions\n - 1x1 \"bottleneck\" convolutions are very efficient\n - Can factor NxN convolutions into 1xN and Nx1\n - All of the above give fewer parameters, less compute, more nonlinearity (since ReLUs go in between the stacked filters)", "_____no_output_____" ], [ "- Computing Convolutions:\n - im2col : Easy to implement, but big memory overhead.\n - FFT : Big speedups for small kernels\n - \"Fast Algorithms\" : seem promising, not widely used yet", "_____no_output_____" ], [ "---\n# Segmentation", "_____no_output_____" ], [ "- Semantic Segmentation\n - Classify all pixels\n - Fully convolutional models, downsample then upsample\n - Learnable upsampling: fractionally strided convolution\n - Skip connections can help\n\n...\n\n- Instance Segmentation\n - Detect instance, generate mask\n - Similar pipelines to object detection", "_____no_output_____" ] ] ]
[ "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ] ]
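The CS231n training checklist above recommends decaying the learning rate, for example halving it after a fixed number of epochs. A minimal step-decay sketch (the base rate and schedule below are hypothetical):

```python
def step_decay(base_lr, epoch, drop=0.5, epochs_per_drop=10):
    """Multiply the learning rate by `drop` once every `epochs_per_drop` epochs."""
    return base_lr * drop ** (epoch // epochs_per_drop)

for epoch in (0, 9, 10, 25):
    print(epoch, step_decay(0.1, epoch))
```

The alternative trigger mentioned in the notes, halving whenever the validation accuracy tops off, would need the validation history rather than a fixed schedule.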
d060b12257f61ab276eae7b6126d23a80f820034
196,953
ipynb
Jupyter Notebook
colab/resnet.ipynb
WindQAQ/iree
68fc75cbe6e4bdf175885c17d41f4d61a55c3537
[ "Apache-2.0" ]
null
null
null
colab/resnet.ipynb
WindQAQ/iree
68fc75cbe6e4bdf175885c17d41f4d61a55c3537
[ "Apache-2.0" ]
null
null
null
colab/resnet.ipynb
WindQAQ/iree
68fc75cbe6e4bdf175885c17d41f4d61a55c3537
[ "Apache-2.0" ]
null
null
null
501.152672
185,484
0.926038
[ [ [ "##### Copyright 2020 Google LLC.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");", "_____no_output_____" ] ], [ [ "#@title License header\n# Copyright 2020 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.", "_____no_output_____" ] ], [ [ "# ResNet\n\n[ResNet](https://arxiv.org/abs/1512.03385) is a deep neural network architecture for image recognition.\n\nThis notebook\n\n* Constructs a [ResNet50](https://www.tensorflow.org/api_docs/python/tf/keras/applications/ResNet50) model using `tf.keras`, with weights pretrained using the [ImageNet](http://www.image-net.org/) dataset\n* Compiles that model with IREE\n* Tests TensorFlow and IREE execution of the model on a sample image", "_____no_output_____" ] ], [ [ "#@title Imports and common setup\n\nfrom pyiree import rt as ireert\nfrom pyiree.tf import compiler as ireec\nfrom pyiree.tf.support import tf_utils\n\nimport tensorflow as tf\nfrom matplotlib import pyplot as plt", "_____no_output_____" ], [ "#@title Construct a pretrained ResNet model with ImageNet weights\n\ntf.keras.backend.set_learning_phase(False)\n\n# Static shape, including batch size (1).\n# Can be dynamic once dynamic shape support is ready.\nINPUT_SHAPE = [1, 224, 224, 3]\n\ntf_model = tf.keras.applications.resnet50.ResNet50(\n weights=\"imagenet\", include_top=True, input_shape=tuple(INPUT_SHAPE[1:]))\n\n# Wrap the model in a tf.Module to compile it with IREE.\nclass ResNetModule(tf.Module):\n\n def 
__init__(self):\n super(ResNetModule, self).__init__()\n self.m = tf_model\n self.predict = tf.function(\n input_signature=[tf.TensorSpec(INPUT_SHAPE, tf.float32)])(tf_model.call)", "_____no_output_____" ], [ "#@markdown ### Backend Configuration\n\nbackend_choice = \"iree_vmla (CPU)\" #@param [ \"iree_vmla (CPU)\", \"iree_llvmjit (CPU)\", \"iree_vulkan (GPU/SwiftShader)\" ]\nbackend_choice = backend_choice.split(\" \")[0]\nbackend = tf_utils.BackendInfo(backend_choice)", "_____no_output_____" ], [ "#@title Compile ResNet with IREE\n# This may take a few minutes.\niree_module = backend.compile(ResNetModule, [\"predict\"])", "Created IREE driver vmla: <iree.bindings.python.pyiree.rt.binding.HalDriver object at 0x7fef48c98298>\nSystemContext driver=<iree.bindings.python.pyiree.rt.binding.HalDriver object at 0x7fef48c98298>\n" ], [ "#@title Load a test image of a [labrador](https://commons.wikimedia.org/wiki/File:YellowLabradorLooking_new.jpg)\n\ndef load_image(path_to_image):\n image = tf.io.read_file(path_to_image)\n image = tf.image.decode_image(image, channels=3)\n image = tf.image.resize(image, (224, 224))\n image = image[tf.newaxis, :]\n return image\n\ncontent_path = tf.keras.utils.get_file(\n 'YellowLabradorLooking_new.jpg',\n 'https://storage.googleapis.com/download.tensorflow.org/example_images/YellowLabradorLooking_new.jpg')\ncontent_image = load_image(content_path)\n\nprint(\"Test image:\")\nplt.imshow(content_image.numpy().reshape(224, 224, 3) / 255.0)\nplt.axis(\"off\")\nplt.tight_layout()", "Test image:\n" ], [ "#@title Model pre- and post-processing\ninput_data = tf.keras.applications.resnet50.preprocess_input(content_image)\n\ndef decode_result(result):\n return tf.keras.applications.resnet50.decode_predictions(result, top=3)[0]", "_____no_output_____" ], [ "#@title Run TF model\n\nprint(\"TF prediction:\")\ntf_result = tf_model.predict(input_data)\nprint(decode_result(tf_result))", "TF prediction:\n[('n02091244', 'Ibizan_hound', 0.12879108), 
('n02099712', 'Labrador_retriever', 0.12632962), ('n02091831', 'Saluki', 0.09625229)]\n" ], [ "#@title Run the model compiled with IREE\n\nprint(\"IREE prediction:\")\niree_result = iree_module.predict(input_data)\nprint(decode_result(iree_result))", "IREE prediction:\n[('n02091244', 'Ibizan_hound', 0.12879075), ('n02099712', 'Labrador_retriever', 0.1263297), ('n02091831', 'Saluki', 0.09625255)]\n" ] ] ]
[ "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ] ]
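The ResNet notebook above fixes `INPUT_SHAPE = [1, 224, 224, 3]` and resizes the test image before adding a batch dimension. A NumPy-only sketch of that shape bookkeeping (nearest-neighbour indexing stands in for `tf.image.resize`, which interpolates, so only the shapes match the notebook, not the pixel values; the input image size is assumed):

```python
import numpy as np

img = np.random.rand(300, 451, 3)  # stand-in for a decoded RGB image (assumed size)

# pick 224 evenly spaced source rows/columns (nearest-neighbour resize)
rows = np.linspace(0, img.shape[0] - 1, 224).round().astype(int)
cols = np.linspace(0, img.shape[1] - 1, 224).round().astype(int)
resized = img[rows][:, cols]

# add the leading batch dimension the model's input signature requires
batched = resized[np.newaxis, ...]
print(batched.shape)  # (1, 224, 224, 3)
```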
d060b4c997abb26f63478a8f14b093cdba70911a
20,118
ipynb
Jupyter Notebook
notebooks/art-for-tensorflow-v2-keras.ipynb
changx03/adversarial-robustness-toolbox
e21e0ff8ec5a88441da164c90376d63e07193242
[ "MIT" ]
1,350
2020-07-14T08:06:55.000Z
2022-03-31T19:22:25.000Z
notebooks/art-for-tensorflow-v2-keras.ipynb
bochengC/adversarial-robustness-toolbox
031ffe4426678487de0cbcec5ad13f355e570bc8
[ "MIT" ]
936
2020-07-14T03:33:00.000Z
2022-03-31T23:05:29.000Z
notebooks/art-for-tensorflow-v2-keras.ipynb
bochengC/adversarial-robustness-toolbox
031ffe4426678487de0cbcec5ad13f355e570bc8
[ "MIT" ]
413
2020-07-16T16:00:16.000Z
2022-03-29T10:31:12.000Z
50.295
5,712
0.772641
[ [ [ "# ART for TensorFlow v2 - Keras API", "_____no_output_____" ], [ "This notebook demonstrates applying ART with the new TensorFlow v2 using the Keras API. The code follows and extends the examples on www.tensorflow.org.", "_____no_output_____" ] ], [ [ "import warnings\nwarnings.filterwarnings('ignore')\nimport tensorflow as tf\ntf.compat.v1.disable_eager_execution()\nimport numpy as np\nfrom matplotlib import pyplot as plt\n\nfrom art.estimators.classification import KerasClassifier\nfrom art.attacks.evasion import FastGradientMethod, CarliniLInfMethod", "_____no_output_____" ], [ "if tf.__version__[0] != '2':\n    raise ImportError('This notebook requires TensorFlow v2.')", "_____no_output_____" ] ], [ [ "# Load MNIST dataset", "_____no_output_____" ] ], [ [ "(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()\nx_train, x_test = x_train / 255.0, x_test / 255.0\n\nx_test = x_test[0:100]\ny_test = y_test[0:100]", "_____no_output_____" ] ], [ [ "# TensorFlow with Keras API", "_____no_output_____" ], [ "Create a model using the Keras API. Here we use the Keras Sequential model and add a sequence of layers. 
Afterwards the model is compiled with an optimizer, a loss function and metrics.", "_____no_output_____" ] ], [ [ "model = tf.keras.models.Sequential([\n    tf.keras.layers.InputLayer(input_shape=(28, 28)),\n    tf.keras.layers.Flatten(),\n    tf.keras.layers.Dense(128, activation='relu'),\n    tf.keras.layers.Dropout(0.2),\n    tf.keras.layers.Dense(10, activation='softmax')\n])\n\nmodel.compile(optimizer='adam',\n              loss='sparse_categorical_crossentropy',\n              metrics=['accuracy']);", "_____no_output_____" ] ], [ [ "Fit the model on training data.", "_____no_output_____" ] ], [ [ "model.fit(x_train, y_train, epochs=3);", "Train on 60000 samples\nEpoch 1/3\n60000/60000 [==============================] - 3s 46us/sample - loss: 0.2968 - accuracy: 0.9131\nEpoch 2/3\n60000/60000 [==============================] - 3s 46us/sample - loss: 0.1435 - accuracy: 0.9575\nEpoch 3/3\n60000/60000 [==============================] - 3s 46us/sample - loss: 0.1102 - accuracy: 0.9664\n" ] ], [ [ "Evaluate model accuracy on test data.", "_____no_output_____" ] ], [ [ "loss_test, accuracy_test = model.evaluate(x_test, y_test)\nprint('Accuracy on test data: {:4.2f}%'.format(accuracy_test * 100))", "Accuracy on test data: 100.00%\n" ] ], [ [ "Create an ART Keras classifier for the TensorFlow Keras model.", "_____no_output_____" ] ], [ [ "classifier = KerasClassifier(model=model, clip_values=(0, 1))", "_____no_output_____" ] ], [ [ "## Fast Gradient Sign Method attack", "_____no_output_____" ], [ "Create an ART Fast Gradient Sign Method attack.", "_____no_output_____" ] ], [ [ "attack_fgsm = FastGradientMethod(estimator=classifier, eps=0.3)", "_____no_output_____" ] ], [ [ "Generate adversarial test data.", "_____no_output_____" ] ], [ [ "x_test_adv = attack_fgsm.generate(x_test)", "_____no_output_____" ] ], [ [ "Evaluate accuracy on adversarial test data and calculate average perturbation.", "_____no_output_____" ] ], [ [ "loss_test, accuracy_test = model.evaluate(x_test_adv, y_test)\nperturbation = 
np.mean(np.abs((x_test_adv - x_test)))\nprint('Accuracy on adversarial test data: {:4.2f}%'.format(accuracy_test * 100))\nprint('Average perturbation: {:4.2f}'.format(perturbation))", "Accuracy on adversarial test data: 0.00%\nAverage perturbation: 0.18\n" ] ], [ [ "Visualise the first adversarial test sample.", "_____no_output_____" ] ], [ [ "plt.matshow(x_test_adv[0])\nplt.show()", "_____no_output_____" ] ], [ [ "## Carlini&Wagner Infinity-norm attack", "_____no_output_____" ], [ "Create an ART Carlini&Wagner Infinity-norm attack.", "_____no_output_____" ] ], [ [ "attack_cw = CarliniLInfMethod(classifier=classifier, eps=0.3, max_iter=100, learning_rate=0.01)", "_____no_output_____" ] ], [ [ "Generate adversarial test data.", "_____no_output_____" ] ], [ [ "x_test_adv = attack_cw.generate(x_test)", "C&W L_inf: 100%|██████████| 1/1 [00:04<00:00,  4.23s/it]\n" ] ], [ [ "Evaluate accuracy on adversarial test data and calculate average perturbation.", "_____no_output_____" ] ], [ [ "loss_test, accuracy_test = model.evaluate(x_test_adv, y_test)\nperturbation = np.mean(np.abs((x_test_adv - x_test)))\nprint('Accuracy on adversarial test data: {:4.2f}%'.format(accuracy_test * 100))\nprint('Average perturbation: {:4.2f}'.format(perturbation))", "Accuracy on adversarial test data: 10.00%\nAverage perturbation: 0.03\n" ] ], [ [ "Visualise the first adversarial test sample.", "_____no_output_____" ] ], [ [ "plt.matshow(x_test_adv[0, :, :])\nplt.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d060c2786199cbceb6556c2ec911dc0cd631e0d6
913,950
ipynb
Jupyter Notebook
English/9_time_series_prediction/Prophet.ipynb
JeyDi/DataScienceCourse
905a37bf3e9f77ea00b3678f7ddbdd78d5c3361a
[ "MIT" ]
6
2020-04-11T18:02:57.000Z
2021-11-26T09:40:12.000Z
English/9_time_series_prediction/Prophet.ipynb
JeyDi/DataScienceCourse
905a37bf3e9f77ea00b3678f7ddbdd78d5c3361a
[ "MIT" ]
1
2020-05-08T15:30:02.000Z
2020-05-10T09:23:15.000Z
English/9_time_series_prediction/Prophet.ipynb
JeyDi/DataScienceCourse
905a37bf3e9f77ea00b3678f7ddbdd78d5c3361a
[ "MIT" ]
3
2019-12-05T16:02:50.000Z
2020-05-03T07:43:26.000Z
891.658537
316,048
0.944994
[ [ [ "# Prophet", "_____no_output_____" ], [ "Time series forecasting using Prophet\n\nOfficial documentation: https://facebook.github.io/prophet/docs/quick_start.html", "_____no_output_____" ], [ "Procedure for forecasting time series data based on an additive model where non-linear trends are fit with yearly, weekly, and daily seasonality, plus holiday effects. It is released by Facebook's Core Data Science team.\n\nAn additive model has the form: \n$Data = seasonal\\space effect + trend + residual$\n\nand a multiplicative model: \n$Data = seasonal\\space effect * trend * residual$\n\nThe algorithm provides useful statistics that help visualize the tuning process, e.g. trend, week trend, year trend and their max and min errors.", "_____no_output_____" ], [ "### Data\n\nThe data on which the algorithms will be trained and tested comes from the Kaggle Hourly Energy Consumption database. It is collected by PJM Interconnection, a company coordinating the continuous buying, selling, and delivery of wholesale electricity through the Energy Market from suppliers to customers in the region of South Carolina, USA. All .csv files contain rows with a timestamp and a value. The name of the value column corresponds to the name of the contractor. The timestamp represents a single hour and the value represents the total energy consumed during that hour.\n\nThe data we will be using is hourly power consumption data from PJM. Energy consumption has some unique characteristics. 
It will be interesting to see how Prophet picks them up.\n\nhttps://www.kaggle.com/robikscube/hourly-energy-consumption\n\nWe pull the PJM East dataset, which has data from 2002-2018 for the entire east region.\n\n", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom fbprophet import Prophet\nfrom sklearn.metrics import mean_squared_error, mean_absolute_error\nplt.style.use('fivethirtyeight') # For plots", "_____no_output_____" ], [ "dataset_path = './data/hourly-energy-consumption/PJME_hourly.csv'\ndf = pd.read_csv(dataset_path, index_col=[0], parse_dates=[0])\nprint(\"Dataset path:\",df.shape)\ndf.head(10)", "Dataset path: (145366, 1)\n" ], [ "# VISUALIZE DATA\n# Color palette for plotting\ncolor_pal = [\"#F8766D\", \"#D39200\", \"#93AA00\",\n             \"#00BA38\", \"#00C19F\", \"#00B9E3\",\n             \"#619CFF\", \"#DB72FB\"]\ndf.plot(style='.', figsize=(20,10), color=color_pal[0], title='PJM East Dataset TS')\nplt.show()", "_____no_output_____" ], [ "# Create time-series features from the datetime index\n\ndef create_features(df, label=None):\n    \"\"\"\n    Creates time series features from datetime index.\n    \"\"\"\n    df = df.copy()\n    df['date'] = df.index\n    df['hour'] = df['date'].dt.hour\n    df['dayofweek'] = df['date'].dt.dayofweek\n    df['quarter'] = df['date'].dt.quarter\n    df['month'] = df['date'].dt.month\n    df['year'] = df['date'].dt.year\n    df['dayofyear'] = df['date'].dt.dayofyear\n    df['dayofmonth'] = df['date'].dt.day\n    df['weekofyear'] = df['date'].dt.weekofyear\n    \n    X = df[['hour','dayofweek','quarter','month','year',\n           'dayofyear','dayofmonth','weekofyear']]\n    if label:\n        y = df[label]\n        return X, y\n    return X", "_____no_output_____" ], [ "df.columns", "_____no_output_____" ], [ "X, y = create_features(df, label='PJME_MW')\n\nfeatures_and_target = pd.concat([X, y], axis=1)\n\nprint(\"Shape\",features_and_target.shape)\nfeatures_and_target.head(10)", "Shape (145366, 9)\n" ], [ "sns.pairplot(features_and_target.dropna(),\n             
             hue='hour',\n             x_vars=['hour','dayofweek',\n                     'year','weekofyear'],\n             y_vars='PJME_MW',\n             height=5,\n             plot_kws={'alpha':0.15, 'linewidth':0}\n            )\nplt.suptitle('Power Use MW by Hour, Day of Week, Year and Week of Year')\nplt.show()", "_____no_output_____" ] ], [ [ "## Train and Test Split", "_____no_output_____" ], [ "We use a temporal split, training on the older data and predicting only on the newer period", "_____no_output_____" ] ], [ [ "split_date = '01-Jan-2015'\npjme_train = df.loc[df.index <= split_date].copy()\npjme_test = df.loc[df.index > split_date].copy()", "_____no_output_____" ], [ "# Plot train and test so you can see where we have split\npjme_test \\\n    .rename(columns={'PJME_MW': 'TEST SET'}) \\\n    .join(pjme_train.rename(columns={'PJME_MW': 'TRAINING SET'}),\n          how='outer') \\\n    .plot(figsize=(15,5), title='PJM East', style='.')\nplt.show()", "_____no_output_____" ] ], [ [ "To use Prophet you need to rename the feature and label columns so that the input is passed to the engine correctly.", "_____no_output_____" ] ], [ [ "# Format data for prophet model using ds and y\npjme_train.reset_index() \\\n    .rename(columns={'Datetime':'ds',\n                     'PJME_MW':'y'})\n\nprint(pjme_train.columns)\npjme_train.head(5)", "Index(['PJME_MW'], dtype='object')\n" ] ], [ [ "### Create and train the model", "_____no_output_____" ] ], [ [ "# Setup and train model and fit\nmodel = Prophet()\nmodel.fit(pjme_train.reset_index() \\\n              .rename(columns={'Datetime':'ds',\n                               'PJME_MW':'y'}))\n", "_____no_output_____" ], [ "# Predict on the test set with the model\npjme_test_fcst = model.predict(df=pjme_test.reset_index() \\\n                                   .rename(columns={'Datetime':'ds'}))", "_____no_output_____" ], [ "pjme_test_fcst.head()", "_____no_output_____" ] ], [ [ "### Plot the results and forecast", "_____no_output_____" ] ], [ [ "# Plot the forecast\nf, ax = plt.subplots(1)\nf.set_figheight(5)\nf.set_figwidth(15)\nfig = model.plot(pjme_test_fcst,\n                 ax=ax)\nplt.show()", "_____no_output_____" ], [ "# Plot the components of the 
model\nfig = model.plot_components(pjme_test_fcst)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
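The additive model quoted in the Prophet notebook above, Data = seasonal effect + trend + residual, is easy to illustrate on a synthetic hourly series (all component shapes here are assumed for the toy example, not fitted by Prophet):

```python
import numpy as np

t = np.arange(48, dtype=float)                 # 48 hourly steps
trend = 0.5 * t                                # slow linear growth
seasonal = 10.0 * np.sin(2 * np.pi * t / 24)   # daily cycle, period 24 h
rng = np.random.default_rng(0)
residual = rng.normal(0.0, 0.1, size=t.shape)  # small noise term

y = trend + seasonal + residual                # the additive composition

# subtracting the known components recovers the residual exactly
recovered = y - trend - seasonal
print(np.allclose(recovered, residual))        # True
```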
d060d44ea8124e137ea522b9965ec22ce7ada076
244,737
ipynb
Jupyter Notebook
examples/nyquist_plots_examples.ipynb
EISy-as-Py/EISy-as-Py
3086ecd043fce4d8ba49ec55004340a5444c0eb0
[ "MIT" ]
5
2020-02-06T21:38:47.000Z
2020-02-13T20:29:44.000Z
examples/nyquist_plots_examples.ipynb
EISy-as-Py/EISy-as-Py
3086ecd043fce4d8ba49ec55004340a5444c0eb0
[ "MIT" ]
2
2020-03-11T22:06:21.000Z
2020-05-18T17:22:43.000Z
examples/nyquist_plots_examples.ipynb
EISy-as-Py/EISy-as-Py
3086ecd043fce4d8ba49ec55004340a5444c0eb0
[ "MIT" ]
4
2020-03-13T20:35:04.000Z
2020-03-18T21:56:28.000Z
715.605263
35,248
0.948516
[ [ [ "from PyEIS import *", "_____no_output_____" ] ], [ [ "## Frequency range\nThe first step needed to simulate an electrochemical impedance spectrum is to generate a frequency domain; to do so, use the built-in freq_gen() function, as follows", "_____no_output_____" ] ], [ [ "f_range = freq_gen(f_start=10**10, f_stop=0.1, pts_decade=7)\n# print(f_range[0]) #First 5 points in the freq. array\nprint()\n# print(f_range[1]) #First 5 points in the angular freq.array", "\n" ] ], [ [ "Note that all functions included are described; to access these descriptions, stay within () and press shift+tab. freq_gen() returns both the frequency, which is log-separated based on points/decade from f_start to f_stop, and the angular frequency. This function is quite useful and will be used throughout this tutorial", "_____no_output_____" ], [ "## The Equivalent Circuits\nThere exist a number of equivalent circuits that can be simulated and fitted; these functions are made as definitions and can be called at any time. To find these, write: \"cir_\" and hit tab. 
All functions are outlined in the next cell and can also be viewed in the equivalent circuit overview:", "_____no_output_____" ] ], [ [ "cir_RC\ncir_RQ\ncir_RsRQ\ncir_RsRQRQ\ncir_Randles\ncir_Randles_simplified\ncir_C_RC_C\ncir_Q_RQ_Q\ncir_RCRCZD\ncir_RsTLsQ\ncir_RsRQTLsQ\ncir_RsTLs\ncir_RsRQTLs\ncir_RsTLQ\ncir_RsRQTLQ\ncir_RsTL\ncir_RsRQTL\ncir_RsTL_1Dsolid\ncir_RsRQTL_1Dsolid", "_____no_output_____" ] ], [ [ "## Simulation of -(RC)-\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/RC_circuit.png' width=\"300\" />\n\n#### Input Parameters:\n- w = Angular frequency [1/s]\n- R = Resistance [Ohm]\n- C = Capacitance [F]\n- fs = summit frequency of RC circuit [Hz]", "_____no_output_____" ] ], [ [ "RC_example = EIS_sim(frange=f_range[0], circuit=cir_RC(w=f_range[1], R=70, C=10**-6), legend='on')", "_____no_output_____" ] ], [ [ "## Simulation of -Rs-(RQ)-\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/RsRQ_circuit.png' width=\"500\" />\n\n#### Input parameters:\n- w = Angular frequency [1/s]\n- Rs = Series resistance [Ohm]\n- R = Resistance [Ohm]\n- Q = Constant phase element [s^n/ohm]\n- n = Constant phase element exponent [-]\n- fs = summit frequency of RQ circuit [Hz]", "_____no_output_____" ] ], [ [ "RsRQ_example = EIS_sim(frange=f_range[0], circuit=cir_RsRQ(w=f_range[1], Rs=70, R=200, n=.8, Q=10**-5), legend='on')", "_____no_output_____" ], [ "RsRC_example = EIS_sim(frange=f_range[0], circuit=cir_RsRC(w=f_range[1], Rs=80, R=100, C=10**-5), legend='on')", "_____no_output_____" ] ], [ [ "## Simulation of -Rs-(RQ)-(RQ)-\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/RsRQRQ_circuit.png' width=\"500\" />\n\n#### Input parameters:\n- w = Angular frequency [1/s]\n- Rs = Series Resistance [Ohm]\n- R = Resistance [Ohm]\n- Q = Constant phase element [s^n/ohm]\n- n = Constant phase element exponent [-]\n- fs = summit frequency of RQ circuit [Hz]\n- R2 = Resistance 
[Ohm]\n- Q2 = Constant phase element [s^n/ohm]\n- n2 = Constant phase element exponent [-]\n- fs2 = summit frequency of RQ circuit [Hz]", "_____no_output_____" ] ], [ [ "RsRQRQ_example = EIS_sim(frange=f_range[0], circuit=cir_RsRQRQ(w=f_range[1], Rs=200, R=150, n=.872, Q=10**-4, R2=50, n2=.853, Q2=10**-6), legend='on')", "_____no_output_____" ] ], [ [ "## Simulation of -Rs-(Q(RW))- (Randles-circuit)\nThis circuit is often used for an experimental setup with a macrodisk working electrode with an outer-sphere heterogeneous charge transfer. This classical Warburg element is controlled by semi-infinite linear diffusion, which is given by the geometry of the working electrode. Two Randles functions are available for simulations: cir_Randles_simplified() and cir_Randles(). The former contains the Warburg constant (sigma), which sums up all mass transport constants (Dox/Dred, Cred/Cox, number of electrons (n_electron), Faraday's constant (F), T, and E0) into a single constant sigma, while the latter contains all of these constants. 
Only cir_Randles_simplified() is available for fitting, as either D$_{ox}$ or D$_{red}$ and C$_{red}$ or C$_{ox}$ are needed.\n\n<img src='https://raw.githubusercontent.com/kbknudsen/PyEIS/master/pyEIS_images/Randles_circuit.png' width=\"500\" />\n\n#### Input parameters:\n- Rs = Series resistance [ohm]\n- Rct = charge-transfer resistance [ohm]\n- Q = Constant phase element used to model the double-layer capacitance [F]\n- n = exponent of the CPE [-]\n- sigma = Warburg Constant [ohm/s^1/2]", "_____no_output_____" ] ], [ [ "Randles = cir_Randles_simplified(w=f_range[1], Rs=100, R=1000, n=1, sigma=300, Q=10**-5)\nRandles_example = EIS_sim(frange=f_range[0], circuit=Randles, legend='off')", "_____no_output_____" ], [ "Randles_example = EIS_sim(frange=f_range[0], circuit=cir_Randles_simplified(w=f_range[1], Rs=100, R=1000, n=1, sigma=300, Q='none', fs=10**3.3), legend='off')", "_____no_output_____" ] ], [ [ "In the following, the Randles circuit with the Warburg constant (sigma) defined is simulated where:\n- D$_{red}$/D$_{ox}$ = 10$^{-6}$ cm$^2$/s\n- C$_{red}$/C$_{ox}$ = 10 mM\n- n_electron = 1\n- T = 25 $^o$C\n\nThis function is a great tool to simulate expected impedance responses prior to starting experiments, as it allows for evaluation of concentrations, diffusion constants, number of electrons, and temperature to evaluate the feasibility of obtaining information on either kinetics, mass-transport, or both.", "_____no_output_____" ] ], [ [ "Randles_example = EIS_sim(frange=f_range[0], circuit=cir_Randles(w=f_range[1], Rs=100, Rct=1000, Q=10**-7, n=1, T=298.15, D_ox=10**-9, D_red=10**-9, C_ox=10**-5, C_red=10**-5, n_electron=1, E=0, A=1), legend='off')", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ] ]
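The -Rs-(RC)- circuits simulated above have the closed form Z(w) = Rs + R / (1 + j*w*R*C): at low frequency the impedance approaches Rs + R, and at high frequency it collapses to Rs, which bounds the Nyquist semicircle. A small NumPy check (Rs=70 and R=200 follow the -Rs-(RQ)- example values; C=1e-6 F is an assumed capacitance):

```python
import numpy as np

def z_rs_rc(w, Rs, R, C):
    # series resistance in front of a parallel RC element:
    # Z(w) = Rs + R / (1 + j*w*R*C)
    return Rs + R / (1 + 1j * w * R * C)

w = np.logspace(-2, 7, 10)  # angular frequency sweep [rad/s]
Z = z_rs_rc(w, Rs=70.0, R=200.0, C=1e-6)

print(Z[0].real)   # ~270 (low-frequency limit Rs + R)
print(Z[-1].real)  # ~70  (high-frequency limit Rs)
```

The imaginary part stays non-positive across the sweep, which is why Nyquist plots of such circuits are conventionally drawn as -Z'' versus Z'.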
d060eba8647218bcd5795351a84e4d41e7367041
155,060
ipynb
Jupyter Notebook
nbs/examples/complex_dummy_experiment_manager.ipynb
Jaume-JCI/hpsearch
168d81f49e1a4bd4dbab838baaa8ff183a422030
[ "MIT" ]
null
null
null
nbs/examples/complex_dummy_experiment_manager.ipynb
Jaume-JCI/hpsearch
168d81f49e1a4bd4dbab838baaa8ff183a422030
[ "MIT" ]
15
2021-12-02T15:00:37.000Z
2022-02-22T17:53:50.000Z
nbs/examples/complex_dummy_experiment_manager.ipynb
Jaume-JCI/hpsearch
168d81f49e1a4bd4dbab838baaa8ff183a422030
[ "MIT" ]
null
null
null
197.277354
65,596
0.894589
[ [ [ "#hide\n#default_exp examples.complex_dummy_experiment_manager\nfrom nbdev.showdoc import *\nfrom block_types.utils.nbdev_utils import nbdev_setup, TestRunner\n\nnbdev_setup ()\ntst = TestRunner (targets=['dummy'])", "_____no_output_____" ] ], [ [ "# Complex Dummy Experiment Manager\n\n> Dummy experiment manager with features that allow additional functionality", "_____no_output_____" ] ], [ [ "#export\nfrom hpsearch.examples.dummy_experiment_manager import DummyExperimentManager, FakeModel\nimport hpsearch\nimport os\nimport shutil\nimport os\n\nimport hpsearch.examples.dummy_experiment_manager as dummy_em\nfrom hpsearch.visualization import plot_utils ", "_____no_output_____" ], [ "#for tests\nimport pytest\nfrom block_types.utils.nbdev_utils import md", "_____no_output_____" ] ], [ [ "## ComplexDummyExperimentManager", "_____no_output_____" ] ], [ [ "#export\nclass ComplexDummyExperimentManager (DummyExperimentManager):\n \n def __init__ (self, model_file_name='model_weights.pk', **kwargs):\n super().__init__ (model_file_name=model_file_name, **kwargs)\n self.raise_error_if_run = False\n\n def run_experiment (self, parameters={}, path_results='./results'):\n \n # useful for testing: in some cases the experiment manager should not call run_experiment\n if self.raise_error_if_run:\n raise RuntimeError ('run_experiment should not be called')\n \n # extract hyper-parameters used by our model. 
All the parameters have default values if they are not passed.\n offset = parameters.get('offset', 0.5) # default value: 0.5\n rate = parameters.get('rate', 0.01) # default value: 0.01\n epochs = parameters.get('epochs', 10) # default value: 10\n noise = parameters.get('noise', 0.0)\n if parameters.get('actual_epochs') is not None:\n epochs = parameters.get('actual_epochs')\n \n # other parameters that do not form part of our experiment definition\n # changing the values of these other parameters, does not make the ID of the experiment change\n verbose = parameters.get('verbose', True)\n \n # build model with given hyper-parameters\n model = FakeModel (offset=offset, rate=rate, epochs=epochs, noise = noise, verbose=verbose)\n \n # load training, validation and test data (fake step)\n model.load_data()\n\n # start from previous experiment if indicated by parameters\n path_results_previous_experiment = parameters.get('prev_path_results')\n if path_results_previous_experiment is not None:\n model.load_model_and_history (path_results_previous_experiment)\n \n # fit model with training data \n model.fit ()\n \n # save model weights and evolution of accuracy metric across epochs\n model.save_model_and_history(path_results)\n \n # simulate ctrl-c\n if parameters.get ('halt', False):\n raise KeyboardInterrupt ('stopped')\n \n # evaluate model with validation and test data\n validation_accuracy, test_accuracy = model.score()\n \n # store model\n self.model = model\n \n # the function returns a dictionary with keys corresponding to the names of each metric. 
\n # We return result on validation and test set in this example\n dict_results = dict (validation_accuracy = validation_accuracy,\n test_accuracy = test_accuracy)\n \n return dict_results\n ", "_____no_output_____" ] ], [ [ "### Usage", "_____no_output_____" ] ], [ [ "#exports tests.examples.test_complex_dummy_experiment_manager\ndef test_complex_dummy_experiment_manager ():\n #em = generate_data ('complex_dummy_experiment_manager')\n \n md (\n'''\nExtend previous experiment by using a larger number of epochs\n\nWe see how to create a experiment that is the same as a previous experiment, \nonly increasing the number of epochs. \n\n1.a. For test purposes, we first run the full number of epochs, 30, take note of the accuracy, \nand remove the experiment\n'''\n )\n \n em = ComplexDummyExperimentManager (path_experiments='test_complex_dummy_experiment_manager', \n verbose=0)\n em.create_experiment_and_run (parameters = {'epochs': 30});\n reference_accuracy = em.model.accuracy\n reference_weight = em.model.weight\n\n from hpsearch.config.hpconfig import get_path_experiments\n import os\n import pandas as pd\n\n path_experiments = get_path_experiments ()\n print (f'experiments folders: {os.listdir(f\"{path_experiments}/experiments\")}\\n')\n\n experiments_data = pd.read_pickle (f'{path_experiments}/experiments_data.pk')\n print ('csv data')\n display (experiments_data)\n\n md ('we plot the history')\n from hpsearch.visualization.experiment_visualization import plot_multiple_histories\n\n plot_multiple_histories ([0], run_number=0, op='max', backend='matplotlib', metrics='validation_accuracy')\n\n md ('1.b. 
Now we run two experiments: ')\n\n md ('We run the first experiment with 20 epochs:')\n\n # a.- remove previous experiment\n em.remove_previous_experiments()\n\n # b.- create first experiment with epochs=20\n em.create_experiment_and_run (parameters = {'epochs': 20});\n\n print (f'experiments folders: {os.listdir(f\"{path_experiments}/experiments\")}\\n')\n\n experiments_data = pd.read_pickle (f'{path_experiments}/experiments_data.pk')\n print ('csv data')\n display(experiments_data)\n print (f'weight: {em.model.weight}, accuracy: {em.model.accuracy}')\n\n md ('We run the second experiment resumes from the previous one and increases the epochs to 30')\n # 4.- create second experiment with epochs=10\n em.create_experiment_and_run (parameters = {'epochs': 30}, \n other_parameters={'prev_epoch': True,\n 'name_epoch': 'epochs',\n 'previous_model_file_name': 'model_weights.pk'});\n\n experiments_data = pd.read_pickle (f'{path_experiments}/experiments_data.pk')\n print ('csv data')\n display(experiments_data)\n\n new_accuracy = em.model.accuracy\n new_weight = em.model.weight\n\n assert new_weight==reference_weight\n assert new_accuracy==reference_accuracy\n\n print (f'weight: {new_weight}, accuracy: {new_accuracy}')\n\n md ('We plot the history')\n plot_multiple_histories ([1], run_number=0, op='max', backend='matplotlib', metrics='validation_accuracy')\n \n em.remove_previous_experiments()", "_____no_output_____" ], [ "tst.run (test_complex_dummy_experiment_manager, tag='dummy')", "running test_complex_dummy_experiment_manager\n" ] ], [ [ "## Running experiments and removing experiments", "_____no_output_____" ] ], [ [ "# export\ndef run_multiple_experiments (**kwargs):\n dummy_em.run_multiple_experiments (EM=ComplexDummyExperimentManager, **kwargs)\n\ndef remove_previous_experiments ():\n dummy_em.remove_previous_experiments (EM=ComplexDummyExperimentManager)", "_____no_output_____" ], [ "#export\ndef generate_data (name_folder):\n em = ComplexDummyExperimentManager 
(path_experiments=f'test_{name_folder}', verbose=0)\n em.remove_previous_experiments ()\n run_multiple_experiments (em=em, nruns=5, noise=0.1, verbose=False)\n return em", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
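The test above checks that resuming an experiment (20 epochs, then extending to 30) ends in the same state as one uninterrupted 30-epoch run. The idea can be sketched with a toy model whose state is pickled and reloaded between runs (all names here are invented for the sketch, not the FakeModel API):

```python
import io
import pickle

def fit(state, epochs, rate=0.1):
    # toy training loop: nudge the weight and record one history entry per epoch
    for _ in range(epochs):
        state['weight'] += rate
        state['history'].append(state['weight'])
    return state

# single uninterrupted run: 30 epochs
reference = fit({'weight': 0.0, 'history': []}, epochs=30)

# interrupted run: 20 epochs, save, reload, resume for 10 more
buf = io.BytesIO()
pickle.dump(fit({'weight': 0.0, 'history': []}, epochs=20), buf)
buf.seek(0)
resumed = fit(pickle.load(buf), epochs=10)

print(resumed['weight'] == reference['weight'])  # True
```

Because both paths apply the same updates in the same order, the resumed run reproduces the reference weight exactly, which is what the assertions in the test above rely on.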
d061077a7df45ec4c5ebf5d8df4448f6c1df44f4
12,842
ipynb
Jupyter Notebook
lessons/Workshop_13_OOP.ipynb
andrewt0301/python-problems
57338611ac631f094e3fb78a6cccab8b6fd7b442
[ "Apache-2.0" ]
null
null
null
lessons/Workshop_13_OOP.ipynb
andrewt0301/python-problems
57338611ac631f094e3fb78a6cccab8b6fd7b442
[ "Apache-2.0" ]
null
null
null
lessons/Workshop_13_OOP.ipynb
andrewt0301/python-problems
57338611ac631f094e3fb78a6cccab8b6fd7b442
[ "Apache-2.0" ]
null
null
null
42.664452
1,084
0.531926
[ [ [ "# Workshop 13\n## _Object-oriented programming._\n\n#### Classes and Objects\n", "_____no_output_____" ] ], [ [ "class MyClass:\n pass\n\n\nobj1 = MyClass()\nobj2 = MyClass()\n\nprint(obj1)\nprint(type(obj1))\n\nprint(obj2)\nprint(type(obj2))\n", "_____no_output_____" ] ], [ [ "##### Constructor and destructor\n", "_____no_output_____" ] ], [ [ "class Employee: \n \n def __init__(self): \n print('Employee created.') \n \n def __del__(self): \n print('Destructor called, Employee deleted.') \n \nobj = Employee() \ndel obj \n", "_____no_output_____" ] ], [ [ "##### Attributes and methods\n", "_____no_output_____" ] ], [ [ "class Student:\n\n def __init__(self, name, grade):\n self.name = name\n self.grade = grade\n\n def __str__(self):\n return '{' + self.name + ': ' + str(self.grade) + '}'\n\n def learn(self):\n print('My name is %s. I am learning Python! My grade is %d.' % (self.name, self.grade))\n\n\nstudents = [Student('Steve', 9), Student('Oleg', 10)]\n\nfor student in students:\n print()\n print('student.name = ' + student.name)\n print('student.grade = ' + str(student.grade))\n print('student = ' + str(student))\n student.learn()\n", "_____no_output_____" ] ], [ [ "##### Class and instance attributes\n", "_____no_output_____" ] ], [ [ "class Person:\n\n # class variable shared by all instances\n status = 'student'\n\n def __init__(self, name):\n # instance variable unique to each instance\n self.name = name\n\n\na = Person('Steve')\nb = Person('Mark')\n\nprint('')\nprint(a.name + ' : ' + a.status)\nprint(b.name + ' : ' + b.status)\n\nPerson.status = 'graduate'\n\nprint('')\nprint(a.name + ' : ' + a.status)\nprint(b.name + ' : ' + b.status)\n\nPerson.status = 'student'\n\nprint('')\nprint(a.name + ' : ' + a.status)\nprint(b.name + ' : ' + b.status)\n", "_____no_output_____" ] ], [ [ "##### Class and static methods\n", "_____no_output_____" ] ], [ [ "class Env:\n os = 'Windows'\n\n @classmethod\n def print_os(self):\n print(self.os)\n \n @staticmethod\n 
def print_user():\n print('guest')\n\n\nEnv.print_os()\nEnv.print_user()\n", "_____no_output_____" ] ], [ [ "##### Encapsulation\n", "_____no_output_____" ] ], [ [ "class Person:\n\n def __init__(self, name):\n self.name = name\n\n def __str__(self):\n return 'My name is ' + self.name\n\n\nperson = Person('Steve')\nprint(person.name)\n\nperson.name = 'Said' \nprint(person.name)\n", "_____no_output_____" ], [ "class Identity:\n\n def __init__(self, name):\n self.__name = name\n\n def __str__(self):\n return 'My name is ' + self.__name\n\n\nperson = Identity('Steve')\nprint(person.__name)\n\nperson.__name = 'Said' \nprint(person)\n", "_____no_output_____" ] ], [ [ "##### Operator overloading\n", "_____no_output_____" ] ], [ [ "class Number:\n\n def __init__(self, value):\n self.__value = value\n\n def __del__(self):\n pass\n\n def __str__(self):\n return str(self.__value)\n\n def __int__(self):\n return self.__value\n\n def __eq__(self, other):\n return self.__value == other.__value\n\n def __ne__(self, other):\n return self.__value != other.__value\n\n def __lt__(self, other):\n return self.__value < other.__value\n\n def __gt__(self, other):\n return self.__value > other.__value\n\n def __add__(self, other):\n return Number(self.__value + other.__value)\n\n def __mul__(self, other):\n return Number(self.__value * other.__value)\n\n def __neg__(self):\n return Number(-self.__value)\n\n\na = Number(10)\nb = Number(20)\nc = Number(5)\n\n# Overloaded operators\nx = -a + b * c\nprint(x)\n\nprint(a < b)\nprint(b > c)\n\n# Unsupported operators\nprint(a <= b)\nprint(b >= c)\nprint(a // c)\n", "_____no_output_____" ] ], [ [ "#### Inheritance and polymorphism\n", "_____no_output_____" ] ], [ [ "class Creature:\n def say(self):\n pass\n\n\nclass Dog(Creature):\n def say(self):\n print('Woof!')\n\n\nclass Cat(Creature):\n def say(self):\n print(\"Meow!\")\n\n\nclass Lion(Creature):\n def say(self):\n print(\"Roar!\")\n \n\nanimals = [Creature(), Dog(), Cat(), Lion()]\n\nfor 
animal in animals:\n print(type(animal))\n animal.say()\n\n", "_____no_output_____" ] ], [ [ "##### Multiple inheritance\n", "_____no_output_____" ] ], [ [ "class Person:\n def __init__(self, name):\n self.name = name\n\n\nclass Student(Person):\n def __init__(self, name, grade):\n super().__init__(name)\n self.grade = grade\n\n\nclass Employee:\n def __init__(self, salary):\n self.salary = salary\n\n\nclass Teacher(Person, Employee):\n def __init__(self, name, salary):\n Person.__init__(self, name)\n Employee.__init__(self, salary)\n\n\nclass TA(Student, Employee):\n def __init__(self, name, grade, salary):\n Student.__init__(self, name, grade)\n Employee.__init__(self, salary)\n\n\nx = Student('Oleg', 9)\ny = TA('Sergei', 10, 1000)\nz = Teacher('Andrei', 2000)\n\nfor person in [x, y, z]:\n print(person.name)\n if isinstance(person, Employee):\n print(person.salary)\n if isinstance(person, Student):\n print(person.grade)\n", "_____no_output_____" ] ], [ [ "##### Function _isinstance_\n", "_____no_output_____" ] ], [ [ "x = 10\nprint('')\nprint(isinstance(x, int))\nprint(isinstance(x, float))\nprint(isinstance(x, str))\n\ny = 3.14\nprint('')\nprint(isinstance(y, int))\nprint(isinstance(y, float))\nprint(isinstance(y, str))\n\n\nz = 'Hello world'\nprint('')\nprint(isinstance(z, int))\nprint(isinstance(z, float))\nprint(isinstance(z, str))\n", "_____no_output_____" ], [ "class A:\n pass\n\n\nclass B:\n pass\n\n\nclass C(A):\n pass\n\n\nclass D(A, B):\n pass\n\n\na = A()\nb = B()\nc = C()\nd = D()\n\nprint('')\nprint(isinstance(a, object))\nprint(isinstance(a, A))\nprint(isinstance(b, B))\n\nprint('')\nprint(isinstance(b, object))\nprint(isinstance(b, A))\nprint(isinstance(b, B))\nprint(isinstance(b, C))\n\nprint('')\nprint(isinstance(c, object))\nprint(isinstance(c, A))\nprint(isinstance(c, B))\nprint(isinstance(c, D))\n\n\nprint('')\nprint(isinstance(d, object))\nprint(isinstance(d, A))\nprint(isinstance(d, B))\nprint(isinstance(d, C))\nprint(isinstance(d, D))\n", 
"_____no_output_____" ] ], [ [ "##### Composition\n", "_____no_output_____" ] ], [ [ "class Teacher:\n pass\n\nclass Student:\n pass\n\nclass ClassRoom:\n def __init__(self, teacher, students):\n self.teacher = teacher\n self.students = students\n\n\ncl = ClassRoom(Teacher(), [Student(), Student(), Student()])\n", "_____no_output_____" ], [ "class Set:\n\n def __init__(self, values=None):\n self.dict = {}\n\n if values is not None:\n for value in values:\n self.add(value)\n\n def __repr__(self):\n return \"Set: \" + str(self.dict.keys())\n\n def add(self, value):\n self.dict[value] = True\n\n def contains(self, value):\n return value in self.dict\n\n def remove(self, value):\n del self.dict[value]\n\n\ns = Set([1,2,3])\ns.add(4)\nprint(s.contains(4))\ns.remove(3)\nprint(s.contains(3))\n", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
d0610aae49ed9cd42f0ff340a636706689eeaf03
16,728
ipynb
Jupyter Notebook
examples/06_Scalable_GP_Classification_1D/KISSGP_Classification_1D.ipynb
phumm/gpytorch
4e8042bcecda049956f8f9e823d82ba6340766d5
[ "MIT" ]
1
2019-09-30T06:51:03.000Z
2019-09-30T06:51:03.000Z
examples/06_Scalable_GP_Classification_1D/KISSGP_Classification_1D.ipynb
phumm/gpytorch
4e8042bcecda049956f8f9e823d82ba6340766d5
[ "MIT" ]
null
null
null
examples/06_Scalable_GP_Classification_1D/KISSGP_Classification_1D.ipynb
phumm/gpytorch
4e8042bcecda049956f8f9e823d82ba6340766d5
[ "MIT" ]
1
2020-09-16T16:35:27.000Z
2020-09-16T16:35:27.000Z
55.026316
6,900
0.718675
[ [ [ "# Scalable GP Classification in 1D (w/ KISS-GP)\n\nThis example shows how to use grid interpolation based variational classification with an `ApproximateGP` using a `GridInterpolationVariationalStrategy` module. This classification module is designed for when the inputs of the function you're modeling are one-dimensional.\n\nThe use of inducing points allows for scaling up the training data by making computational complexity linear instead of cubic.\n\nIn this example, we're modeling a function that is periodically labeled cycling every 1/8 (think of a square wave with period 1/4)\n\nThis notebook doesn't use cuda, in general we recommend GPU use if possible and most of our notebooks utilize cuda as well.\n\nKernel interpolation for scalable structured Gaussian processes (KISS-GP) was introduced in this paper:\nhttp://proceedings.mlr.press/v37/wilson15.pdf\n\nKISS-GP with SVI for classification was introduced in this paper:\nhttps://papers.nips.cc/paper/6426-stochastic-variational-deep-kernel-learning.pdf", "_____no_output_____" ] ], [ [ "import math\nimport torch\nimport gpytorch\nfrom matplotlib import pyplot as plt\nfrom math import exp\n\n%matplotlib inline\n%load_ext autoreload\n%autoreload 2", "_____no_output_____" ], [ "train_x = torch.linspace(0, 1, 26)\ntrain_y = torch.sign(torch.cos(train_x * (2 * math.pi))).add(1).div(2)", "_____no_output_____" ], [ "from gpytorch.models import ApproximateGP\nfrom gpytorch.variational import CholeskyVariationalDistribution\nfrom gpytorch.variational import GridInterpolationVariationalStrategy\n\n\nclass GPClassificationModel(ApproximateGP):\n def __init__(self, grid_size=128, grid_bounds=[(0, 1)]):\n variational_distribution = CholeskyVariationalDistribution(grid_size)\n variational_strategy = GridInterpolationVariationalStrategy(self, grid_size, grid_bounds, variational_distribution)\n super(GPClassificationModel, self).__init__(variational_strategy)\n self.mean_module = gpytorch.means.ConstantMean()\n 
self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n \n def forward(self,x):\n mean_x = self.mean_module(x)\n covar_x = self.covar_module(x)\n latent_pred = gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n return latent_pred\n\n\nmodel = GPClassificationModel()\nlikelihood = gpytorch.likelihoods.BernoulliLikelihood()", "_____no_output_____" ], [ "from gpytorch.mlls.variational_elbo import VariationalELBO\n\n# Find optimal model hyperparameters\nmodel.train()\nlikelihood.train()\n\n# Use the adam optimizer\noptimizer = torch.optim.Adam(model.parameters(), lr=0.01)\n\n# \"Loss\" for GPs - the marginal log likelihood\n# n_data refers to the number of training datapoints\nmll = VariationalELBO(likelihood, model, num_data=train_y.numel())\n\ndef train():\n num_iter = 100\n for i in range(num_iter):\n optimizer.zero_grad()\n output = model(train_x)\n # Calc loss and backprop gradients\n loss = -mll(output, train_y)\n loss.backward()\n print('Iter %d/%d - Loss: %.3f' % (i + 1, num_iter, loss.item()))\n optimizer.step()\n \n# Get clock time\n%time train()", "Iter 1/100 - Loss: 0.070\nIter 2/100 - Loss: 14.834\nIter 3/100 - Loss: 0.977\nIter 4/100 - Loss: 3.547\nIter 5/100 - Loss: 8.699\nIter 6/100 - Loss: 6.352\nIter 7/100 - Loss: 1.795\nIter 8/100 - Loss: 0.188\nIter 9/100 - Loss: 2.075\nIter 10/100 - Loss: 4.160\nIter 11/100 - Loss: 3.899\nIter 12/100 - Loss: 1.941\nIter 13/100 - Loss: 0.344\nIter 14/100 - Loss: 0.360\nIter 15/100 - Loss: 1.501\nIter 16/100 - Loss: 2.298\nIter 17/100 - Loss: 1.944\nIter 18/100 - Loss: 0.904\nIter 19/100 - Loss: 0.177\nIter 20/100 - Loss: 0.297\nIter 21/100 - Loss: 0.916\nIter 22/100 - Loss: 1.281\nIter 23/100 - Loss: 1.024\nIter 24/100 - Loss: 0.451\nIter 25/100 - Loss: 0.111\nIter 26/100 - Loss: 0.246\nIter 27/100 - Loss: 0.593\nIter 28/100 - Loss: 0.733\nIter 29/100 - Loss: 0.526\nIter 30/100 - Loss: 0.206\nIter 31/100 - Loss: 0.087\nIter 32/100 - Loss: 0.225\nIter 33/100 - Loss: 0.408\nIter 
34/100 - Loss: 0.413\nIter 35/100 - Loss: 0.245\nIter 36/100 - Loss: 0.091\nIter 37/100 - Loss: 0.096\nIter 38/100 - Loss: 0.210\nIter 39/100 - Loss: 0.273\nIter 40/100 - Loss: 0.210\nIter 41/100 - Loss: 0.104\nIter 42/100 - Loss: 0.064\nIter 43/100 - Loss: 0.117\nIter 44/100 - Loss: 0.173\nIter 45/100 - Loss: 0.159\nIter 46/100 - Loss: 0.093\nIter 47/100 - Loss: 0.056\nIter 48/100 - Loss: 0.077\nIter 49/100 - Loss: 0.115\nIter 50/100 - Loss: 0.115\nIter 51/100 - Loss: 0.078\nIter 52/100 - Loss: 0.050\nIter 53/100 - Loss: 0.061\nIter 54/100 - Loss: 0.083\nIter 55/100 - Loss: 0.086\nIter 56/100 - Loss: 0.062\nIter 57/100 - Loss: 0.045\nIter 58/100 - Loss: 0.053\nIter 59/100 - Loss: 0.064\nIter 60/100 - Loss: 0.065\nIter 61/100 - Loss: 0.050\nIter 62/100 - Loss: 0.040\nIter 63/100 - Loss: 0.046\nIter 64/100 - Loss: 0.052\nIter 65/100 - Loss: 0.051\nIter 66/100 - Loss: 0.041\nIter 67/100 - Loss: 0.037\nIter 68/100 - Loss: 0.041\nIter 69/100 - Loss: 0.044\nIter 70/100 - Loss: 0.042\nIter 71/100 - Loss: 0.035\nIter 72/100 - Loss: 0.034\nIter 73/100 - Loss: 0.036\nIter 74/100 - Loss: 0.037\nIter 75/100 - Loss: 0.033\nIter 76/100 - Loss: 0.030\nIter 77/100 - Loss: 0.030\nIter 78/100 - Loss: 0.033\nIter 79/100 - Loss: 0.031\nIter 80/100 - Loss: 0.029\nIter 81/100 - Loss: 0.028\nIter 82/100 - Loss: 0.028\nIter 83/100 - Loss: 0.028\nIter 84/100 - Loss: 0.026\nIter 85/100 - Loss: 0.025\nIter 86/100 - Loss: 0.025\nIter 87/100 - Loss: 0.025\nIter 88/100 - Loss: 0.025\nIter 89/100 - Loss: 0.024\nIter 90/100 - Loss: 0.022\nIter 91/100 - Loss: 0.022\nIter 92/100 - Loss: 0.022\nIter 93/100 - Loss: 0.022\nIter 94/100 - Loss: 0.020\nIter 95/100 - Loss: 0.021\nIter 96/100 - Loss: 0.020\nIter 97/100 - Loss: 0.019\nIter 98/100 - Loss: 0.018\nIter 99/100 - Loss: 0.019\nIter 100/100 - Loss: 0.017\nCPU times: user 6.33 s, sys: 9.66 s, total: 16 s\nWall time: 2.31 s\n" ], [ "# Set model and likelihood into eval mode\nmodel.eval()\nlikelihood.eval()\n\n# Initialize axes\nf, ax = 
plt.subplots(1, 1, figsize=(4, 3))\n\nwith torch.no_grad():\n test_x = torch.linspace(0, 1, 101)\n predictions = likelihood(model(test_x))\n\nax.plot(train_x.numpy(), train_y.numpy(), 'k*')\npred_labels = predictions.mean.ge(0.5).float()\nax.plot(test_x.data.numpy(), pred_labels.numpy(), 'b')\nax.set_ylim([-1, 2])\nax.legend(['Observed Data', 'Mean', 'Confidence'])", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d0611888d467fde28d61f6cbbf1ef43118ee3058
221,378
ipynb
Jupyter Notebook
notebooks/AB_tests/Understand Splitting Fraction.ipynb
mclaughlin6464/pearce
746f2bf4bf45e904d66996e003043661a01423ba
[ "MIT" ]
null
null
null
notebooks/AB_tests/Understand Splitting Fraction.ipynb
mclaughlin6464/pearce
746f2bf4bf45e904d66996e003043661a01423ba
[ "MIT" ]
16
2016-11-04T22:24:32.000Z
2018-05-01T22:53:39.000Z
notebooks/AB_tests/Understand Splitting Fraction.ipynb
mclaughlin6464/pearce
746f2bf4bf45e904d66996e003043661a01423ba
[ "MIT" ]
3
2016-10-04T08:07:52.000Z
2019-05-03T23:50:01.000Z
201.987226
72,772
0.897519
[ [ [ "import numpy as np\nimport astropy\nfrom itertools import izip\nfrom pearce.mocks import compute_prim_haloprop_bins, cat_dict\nfrom pearce.mocks.customHODModels import *\nfrom halotools.utils.table_utils import compute_conditional_percentiles\nfrom halotools.mock_observables import hod_from_mock, wp, tpcf, tpcf_one_two_halo_decomp\nfrom math import ceil", "_____no_output_____" ], [ "from matplotlib import pyplot as plt\n%matplotlib inline\nimport seaborn as sns\nsns.set()", "_____no_output_____" ], [ "shuffle_type = ''#'sh_shuffled'\nmag_type = 'vpeak'", "_____no_output_____" ], [ "mag_cut = -21\nmin_ptcl = 200\nmag_key = 'halo_%s%s_mag'%(shuffle_type, mag_type)\nupid_key = 'halo_%supid'%(shuffle_type)", "_____no_output_____" ], [ "PMASS = 591421440.0000001 #chinchilla 400/ 2048\ncatalog = astropy.table.Table.read('abmatched_halos.hdf5', format = 'hdf5')", "_____no_output_____" ], [ "cosmo_params = {'simname':'chinchilla', 'Lbox':400.0, 'scale_factors':[0.658, 1.0]}\ncat = cat_dict[cosmo_params['simname']](**cosmo_params)#construct the specified catalog!\n\ncat.load_catalog(1.0)\n#cat.h = 1.0\nhalo_catalog = catalog[catalog['halo_mvir'] > min_ptcl*cat.pmass] #mass cut\ngalaxy_catalog = halo_catalog[ halo_catalog[mag_key] < mag_cut ] # mag cut", "_____no_output_____" ], [ "def compute_mass_bins(prim_haloprop, dlog10_prim_haloprop=0.05): \n lg10_min_prim_haloprop = np.log10(np.min(prim_haloprop))-0.001\n lg10_max_prim_haloprop = np.log10(np.max(prim_haloprop))+0.001\n num_prim_haloprop_bins = (lg10_max_prim_haloprop-lg10_min_prim_haloprop)/dlog10_prim_haloprop\n return np.logspace(\n lg10_min_prim_haloprop, lg10_max_prim_haloprop,\n num=int(ceil(num_prim_haloprop_bins)))", "_____no_output_____" ], [ "mass_bins = compute_mass_bins(halo_catalog['halo_mvir'], 0.2)\nmass_bin_centers = (mass_bins[1:]+mass_bins[:-1])/2.0", "_____no_output_____" ], [ "cen_mask = galaxy_catalog['halo_upid']==-1\ncen_hod_sham, _ = 
hod_from_mock(galaxy_catalog[cen_mask]['halo_mvir_host_halo'],\\\n halo_catalog['halo_mvir'],\\\n mass_bins)\n\nsat_hod_sham, _ = hod_from_mock(galaxy_catalog[~cen_mask]['halo_mvir_host_halo'],\\\n halo_catalog['halo_mvir'],\\\n mass_bins)", "_____no_output_____" ], [ "cat.load_model(1.0, HOD=(FSAssembiasTabulatedCens, FSAssembiasTabulatedSats), hod_kwargs = {'prim_haloprop_vals': mass_bin_centers,\n #'sec_haloprop_key': 'halo_%s'%(mag_type),\n 'cen_hod_vals':cen_hod_sham,\n 'sat_hod_vals':sat_hod_sham,\n 'split':0.5})", "_____no_output_____" ], [ "print cat.model.param_dict", "{'mean_occupation_satellites_assembias_split1': 0.5, 'mean_occupation_satellites_assembias_param1': 0.5, 'mean_occupation_centrals_assembias_split1': 0.5, 'mean_occupation_centrals_assembias_param1': 0.5}\n" ], [ "#rp_bins = np.logspace(-1,1.5,20)\n#rp_bins = np.logspace(-1.1,1.8, 25)\n#rp_bins = np.loadtxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/rp_bins.npy')\nrp_bins = np.array([7.943282000000000120e-02,\n1.122018500000000057e-01,\n1.584893199999999891e-01,\n2.238721100000000130e-01,\n3.162277700000000191e-01,\n4.466835900000000192e-01,\n6.309573400000000332e-01,\n8.912509400000000470e-01,\n1.258925410000000022e+00,\n1.778279409999999894e+00,\n2.511886430000000114e+00,\n3.548133889999999901e+00,\n5.011872340000000037e+00,\n7.079457839999999891e+00,\n1.000000000000000000e+01,\n1.412537544999999994e+01,\n1.995262315000000086e+01,\n2.818382931000000013e+01,\n3.981071706000000177e+01])\n\nbin_centers = (rp_bins[:1]+rp_bins[:-1])/2", "_____no_output_____" ], [ "min_logmass, max_logmass = 9.0, 17.0\nnames = ['mean_occupation_centrals_assembias_param1','mean_occupation_satellites_assembias_param1',\\\n 'mean_occupation_centrals_assembias_split1','mean_occupation_satellites_assembias_split1']", "_____no_output_____" ], [ "#mock_wp = cat.calc_wp(rp_bins, RSD= False)\nMAP = np.array([ 0.85, -0.3,0.85,0.5])\n\nparams = dict(zip(names, MAP))\n#print params.keys()\n\nmock_wps = []\nmock_wps_1h, 
mock_wps_2h = [],[]\n#mock_nds = []\nsplit = np.linspace(0.1, 0.9, 4)\n#split_abcissa = [10**9, 10**13, 10**16]\n\n#cat.model._input_model_dictionary['centrals_occupation']._split_abscissa = split_abcissa\n#cat.model._input_model_dictionary['satellites_occupation']._split_abscissa = split_abcissa\nfor p in split:\n #params['mean_occupation_centrals_assembias_split1'] = p\n params['mean_occupation_satellites_assembias_split1'] = p\n #print params.keys()\n #print cat.model.param_dict\n cat.populate(params)\n #print cat.model.param_dict\n #cut_idx = cat.model.mock.galaxy_table['gal_type'] == 'centrals'\n mass_cut = np.logical_and(np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) > min_logmass,\\\n np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) <= max_logmass)\n #mass_cut = np.logical_and(mass_cut, cut_idx)\n #mock_nds.append(len(cut_idx)/cat.Lbox**3)\n mock_pos = np.c_[cat.model.mock.galaxy_table['x'],\\\n cat.model.mock.galaxy_table['y'],\\\n cat.model.mock.galaxy_table['z']]\n mock_wps.append(wp(mock_pos*cat.h, rp_bins ,40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1))\n #oneh, twoh = tpcf_one_two_halo_decomp(mock_pos,cat.model.mock.galaxy_table[mass_cut]['halo_hostid'],\\\n # rp_bins , period=cat.Lbox, num_threads=1)\n #mock_wps_1h.append(oneh)\n #mock_wps_2h.append(twoh)\n \nmock_wps = np.array(mock_wps)\nwp_errs = np.std(mock_wps, axis = 0)\n\n#mock_wps_1h = np.array(mock_wps_1h)\n#mock_wp_no_ab_1h = np.mean(mock_wps_1h, axis = 0)\n\n#mock_wps_2h = np.array(mock_wps_2h)\n#mock_wp_no_ab_2h = np.mean(mock_wps_2h, axis = 0)\n\n#mock_nds = np.array(mock_nds)\n#mock_nd = np.mean(mock_nds)\n#nd_err = np.std(mock_nds)", "_____no_output_____" ], [ "params", "_____no_output_____" ], [ "params = dict(zip(names, [0,0,0.5,0.5])) \ncat.populate(params)\nmass_cut = np.logical_and(np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) > min_logmass,\\\n np.log10(cat.model.mock.galaxy_table['halo_mvir'] ) <= max_logmass)\n\nprint cat.model.param_dict\nmock_pos = 
np.c_[cat.model.mock.galaxy_table['x'],\\\n cat.model.mock.galaxy_table['y'],\\\n cat.model.mock.galaxy_table['z']]\nnoab_wp = wp(mock_pos*cat.h, rp_bins ,40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1)", "{'mean_occupation_centrals_assembias_split1': 0.5, 'mean_occupation_centrals_assembias_param1': 0, 'mean_occupation_satellites_assembias_split1': 0.5, 'mean_occupation_satellites_assembias_param1': 0}\n" ], [ "print np.log10(noab_wp)", "[ 2.86479644 2.71375719 2.5690466 2.38344144 2.20022251 2.02959663\n 1.87899003 1.72579396 1.60715197 1.52219163 1.41551296 1.27122876\n 1.12708178 0.97127198 0.80736591 0.56640291 0.25364583 -0.18412503]\n" ], [ "from halotools.mock_observables import return_xyz_formatted_array", "_____no_output_____" ], [ "sham_pos = np.c_[galaxy_catalog['halo_x'],\\\n galaxy_catalog['halo_y'],\\\n galaxy_catalog['halo_z']]\n\ndistortion_dim = 'z'\nv_distortion_dim = galaxy_catalog['halo_v%s' % distortion_dim]\n# apply redshift space distortions\n#sham_pos = return_xyz_formatted_array(sham_pos[:,0],sham_pos[:,1],sham_pos[:,2], velocity=v_distortion_dim, \\\n# velocity_distortion_dimension=distortion_dim, period=cat.Lbox)\n#sham_wp = wp(sham_pos*cat.h, rp_bins, 40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1)\nsham_wp = wp(sham_pos*cat.h, rp_bins, 40.0*cat.h, period=cat.Lbox*cat.h, num_threads=1)\n\n#sham_wp = tpcf(sham_pos, rp_bins , period=cat.Lbox, num_threads=1)", "_____no_output_____" ], [ "sham_wp", "_____no_output_____" ], [ "len(galaxy_catalog)/((cat.Lbox*cat.h)**3)", "_____no_output_____" ] ], [ [ "np.savetxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/sham_vpeak_wp.npy', sham_wp)\n#np.savetxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/sham_vpeak_nd.npy', np.array([len(galaxy_catalog)/((cat.Lbox*cat.h)**3)]))", "_____no_output_____" ], [ "np.savetxt('/nfs/slac/g/ki/ki18/des/swmclau2/AB_tests/rp_bins_split.npy',rp_bins )", "_____no_output_____" ] ], [ [ "plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps):\n 
plt.plot(bin_centers, mock_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\nplt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "np.log10(mock_wps[-1])", "_____no_output_____" ], [ "plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps):\n plt.plot(bin_centers, mock_wp/sham_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\n#plt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\n#plt.loglog()\nplt.xscale('log')\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 15e0]);\nplt.ylim([0.8,1.2])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "plt.figure(figsize=(10,8))\n#for p, mock_wp in zip(split, mock_wps):\n# plt.plot(bin_centers, mock_wp/sham_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\nplt.plot(bin_centers, noab_wp/sham_wp, label = 'No AB')\n\n\n#plt.loglog()\nplt.xscale('log')\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 15e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps_1h):\n plt.plot(bin_centers, mock_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\n#plt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "/u/ki/swmclau2/.conda/envs/hodemulator/lib/python2.7/site-packages/matplotlib/axes/_axes.py:531: UserWarning: No labelled objects found. Use label='...' 
kwarg on individual plots.\n warnings.warn(\"No labelled objects found. \"\n" ], [ "plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps_2h):\n plt.plot(bin_centers, mock_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\nplt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "plt.figure(figsize=(10,8))\nfor p, mock_wp in zip(split, mock_wps_2h):\n plt.plot(bin_centers, mock_wp/noab_wp, label = p)\n \n#plt.plot(bin_centers, sham_wp, ls='--', label = 'SHAM')\n#plt.plot(bin_centers, noab_wp, ls=':', label = 'No AB')\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "plt.plot(bin_centers, mock_wps[0, :])\nplt.plot(bin_centers, mock_wps_1h[0, :])\nplt.plot(bin_centers, mock_wps_2h[0, :])\n\n\nplt.loglog()\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 30e0]);\n#plt.ylim([1,15000])\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "plt.figure(figsize=(10,8))\n#avg = mock_wps.mean(axis = 0)\nfor p, mock_wp in zip(split, mock_wps):\n plt.plot(bin_centers, mock_wp/sham_wp, label = 'p = %.2f'%p)\n \nplt.plot(bin_centers, noab_wp/sham_wp, label = 'No AB', ls = ':')\n\n#plt.loglog()\nplt.xscale('log')\nplt.legend(loc='best',fontsize = 15)\nplt.xlim([1e-1, 5e0]);\nplt.ylim([0.75,1.25]);\nplt.xlabel(r'$r$',fontsize = 15)\nplt.ylabel(r'$\\xi(r)/\\xi_{SHAM}(r)$',fontsize = 15)\nplt.show()", "_____no_output_____" ], [ "sats_occ = cat.model._input_model_dictionary['satellites_occupation']\nsats_occ._split_ordinates = [0.99]", "_____no_output_____" ] ], [ [ "cens_occ = 
cat.model._input_model_dictionary['centrals_occupation']\ncens_occ._split_ordinates = [0.1]", "_____no_output_____" ] ], [ [ "print sats_occ", "_____no_output_____" ], [ "baseline_lower_bound, baseline_upper_bound = 0,np.inf\nprim_haloprop = cat.model.mock.halo_table['halo_mvir']\nsec_haloprop = cat.model.mock.halo_table['halo_nfw_conc']", "_____no_output_____" ], [ "from halotools.utils.table_utils import compute_conditional_percentile_values", "_____no_output_____" ], [ "split = sats_occ.percentile_splitting_function(prim_haloprop)\n\n# Compute the baseline, undecorated result\nresult = sats_occ.baseline_mean_occupation(prim_haloprop=prim_haloprop)\n\n# We will only decorate values that are not edge cases,\n# so first compute the mask for non-edge cases\nno_edge_mask = (\n (split > 0) & (split < 1) &\n (result > baseline_lower_bound) & (result < baseline_upper_bound)\n)\n# Now create convenient references to the non-edge-case sub-arrays\nno_edge_result = result[no_edge_mask]\nno_edge_split = split[no_edge_mask]", "_____no_output_____" ] ], [ [ "percentiles = compute_conditional_percentiles(\n prim_haloprop=prim_haloprop,\n sec_haloprop=sec_haloprop\n )\nno_edge_percentiles = percentiles[no_edge_mask]\ntype1_mask = no_edge_percentiles > no_edge_split\n\nperturbation = sats_occ._galprop_perturbation(prim_haloprop=prim_haloprop[no_edge_mask],baseline_result=no_edge_result, splitting_result=no_edge_split)\n\nfrac_type1 = 1 - no_edge_split\nfrac_type2 = 1 - frac_type1\nperturbation[~type1_mask] *= (-frac_type1[~type1_mask] /\n(frac_type2[~type1_mask]))", "_____no_output_____" ], [ "# Retrieve percentile values (medians) if they've been precomputed. 
Else, compute them.\n\nno_edge_percentile_values = compute_conditional_percentile_values(p=no_edge_split,\n prim_haloprop=prim_haloprop[no_edge_mask],\n sec_haloprop=sec_haloprop[no_edge_mask])\n\npv_sub_sec_haloprop = sec_haloprop[no_edge_mask] - no_edge_percentile_values\n\nperturbation = sats_occ._galprop_perturbation(\n prim_haloprop=prim_haloprop[no_edge_mask],\n sec_haloprop=pv_sub_sec_haloprop/np.max(np.abs(pv_sub_sec_haloprop)),\n baseline_result=no_edge_result)", "_____no_output_____" ] ], [ [ "from halotools.utils.table_utils import compute_conditional_averages", "_____no_output_____" ], [ "strength = sats_occ.assembias_strength(prim_haloprop[no_edge_mask])\nslope = sats_occ.assembias_slope(prim_haloprop[no_edge_mask])\n\n# the average displacement acts as a normalization we need.\nmax_displacement = sats_occ._disp_func(sec_haloprop=pv_sub_sec_haloprop/np.max(np.abs(pv_sub_sec_haloprop)), slope=slope)\ndisp_average = compute_conditional_averages(vals=max_displacement,prim_haloprop=prim_haloprop[no_edge_mask])\n#disp_average = np.ones((prim_haloprop.shape[0], ))*0.5\n\nperturbation2 = np.zeros(len(prim_haloprop[no_edge_mask]))\n\ngreater_than_half_avg_idx = disp_average > 0.5\nless_than_half_avg_idx = disp_average <= 0.5\n\nif len(max_displacement[greater_than_half_avg_idx]) > 0:\n base_pos = result[no_edge_mask][greater_than_half_avg_idx]\n strength_pos = strength[greater_than_half_avg_idx]\n avg_pos = disp_average[greater_than_half_avg_idx]\n\n upper_bound1 = (base_pos - baseline_lower_bound)/avg_pos\n upper_bound2 = (baseline_upper_bound - base_pos)/(1-avg_pos)\n upper_bound = np.minimum(upper_bound1, upper_bound2)\n print upper_bound1, upper_bound2\n perturbation2[greater_than_half_avg_idx] = strength_pos*upper_bound*(max_displacement[greater_than_half_avg_idx]-avg_pos)\n \n\nif len(max_displacement[less_than_half_avg_idx]) > 0:\n base_neg = result[no_edge_mask][less_than_half_avg_idx]\n strength_neg = strength[less_than_half_avg_idx]\n avg_neg = 
disp_average[less_than_half_avg_idx]\n\n lower_bound1 = (base_neg-baseline_lower_bound)/avg_neg#/(1- avg_neg)\n lower_bound2 = (baseline_upper_bound - base_neg)/(1-avg_neg)#(avg_neg)\n lower_bound = np.minimum(lower_bound1, lower_bound2)\n perturbation2[less_than_half_avg_idx] = strength_neg*lower_bound*(max_displacement[less_than_half_avg_idx]-avg_neg)\n\n", "_____no_output_____" ], [ "print np.unique(max_displacement[indices_of_mb])\nprint np.unique(disp_average[indices_of_mb])", "_____no_output_____" ], [ "perturbation", "_____no_output_____" ], [ "mass_bins = compute_mass_bins(prim_haloprop)\nmass_bin_idxs = compute_prim_haloprop_bins(prim_haloprop_bin_boundaries=mass_bins, prim_haloprop = prim_haloprop[no_edge_mask])\nmb = 87\nindices_of_mb = np.where(mass_bin_idxs == mb)[0]", "_____no_output_____" ], [ "plt.hist(perturbation[indices_of_mb], bins =100);\nplt.yscale('log');\n#plt.loglog();", "_____no_output_____" ], [ "print max(perturbation)\nprint min(perturbation)", "_____no_output_____" ], [ "print max(perturbation[indices_of_mb])\nprint min(perturbation[indices_of_mb])", "_____no_output_____" ], [ "idxs = np.argsort(perturbation)\nprint mass_bin_idxs[idxs[-10:]]", "_____no_output_____" ], [ "plt.hist(perturbation2[indices_of_mb], bins =100);\nplt.yscale('log');\n#plt.loglog();", "_____no_output_____" ], [ "print perturbation2", "_____no_output_____" ] ] ]
[ "code", "raw", "code", "raw", "code", "raw", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "raw", "raw" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "raw" ], [ "code", "code", "code", "code" ], [ "raw", "raw" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
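The clipping logic in the halotools cells above keeps the assembly-bias perturbation from pushing the baseline occupation outside its allowed range: the bound is the tighter of the distance to the floor (scaled by the average displacement) and the distance to the ceiling (scaled by one minus it). Below is a minimal pure-Python sketch of that bound; the function names are hypothetical, and a baseline range of [0, 1] is assumed since `baseline_lower_bound`/`baseline_upper_bound` are defined in notebook cells not shown here.

```python
def perturbation_bound(base, disp_avg, lower=0.0, upper=1.0):
    """Tightest scaling that keeps base +/- the perturbation inside [lower, upper].

    Mirrors the min() over upper_bound1/upper_bound2 (and the analogous
    lower_bound pair) in the notebook cell above.
    """
    return min((base - lower) / disp_avg, (upper - base) / (1.0 - disp_avg))


def perturbation(strength, base, disp, disp_avg):
    """Strength-scaled displacement from the conditional average, clipped.

    disp is the displacement value, disp_avg its conditional average
    (compute_conditional_averages in the notebook).
    """
    return strength * perturbation_bound(base, disp_avg) * (disp - disp_avg)
```

For example, with `base=0.3` and `disp_avg=0.5` the bound is `min(0.6, 1.4) = 0.6`, so even a full-strength displacement moves the occupation by at most 0.3 and can never push it below zero.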
d06118eed716b8b1c5ca1cd2af9f6d246ac13bb7
168,149
ipynb
Jupyter Notebook
training/cyclone_model_svm.ipynb
etassone1974/UWA-Project-3
af5608869ff1703a93855b084fb075d20df40a4a
[ "MIT" ]
null
null
null
training/cyclone_model_svm.ipynb
etassone1974/UWA-Project-3
af5608869ff1703a93855b084fb075d20df40a4a
[ "MIT" ]
null
null
null
training/cyclone_model_svm.ipynb
etassone1974/UWA-Project-3
af5608869ff1703a93855b084fb075d20df40a4a
[ "MIT" ]
null
null
null
391.044186
51,806
0.676091
[ [ [ "from sklearn.model_selection import train_test_split\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.preprocessing import StandardScaler\nfrom sklearn.svm import SVC\nfrom sklearn.model_selection import GridSearchCV\nfrom sklearn.metrics import accuracy_score\nimport joblib\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt", "_____no_output_____" ], [ " # Read data into DataFrame from CSV file\n # cyclone_df = pd.read_csv(\"Cyclone_ML.csv\")\n cyclone_df = pd.read_csv(\"../data/Cyclone_ML.csv\")\n\n # Select features for machine learning and assign to X\n selected_features = cyclone_df[[\"SURFACE_CODE\",\t\"CYC_TYPE\", \"LAT\", \"LON\", \"CENTRAL_PRES\", \"MAX_WIND_SPD\", \"CENTRAL_INDEX (CI)\", \"WAVE_HEIGHT\"]]\n X = selected_features\n\n # Set y to compass direction of cyclone based on wind direction degree\n y = cyclone_df[\"WIND_COMPASS\"]\n # y = cyclone_df[\"MAX_REP_WIND_DIR\"]\n \n\n print(X.shape, y.shape)", "(1691, 8) (1691,)\n" ], [ "cyclone_df", "_____no_output_____" ], [ "X", "_____no_output_____" ], [ "y", "_____no_output_____" ], [ " # train test split\n X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)", "_____no_output_____" ], [ " X_scaler = StandardScaler().fit(X_train)\n X_train_scaled = X_scaler.transform(X_train)\n X_test_scaled = X_scaler.transform(X_test)", "_____no_output_____" ], [ " # Support vector machine linear classifier\n model = SVC(kernel='linear')\n\n # Fit the model to the training data and calculate the scores for the training and testing data\n model.fit(X_train_scaled, y_train)", "_____no_output_____" ], [ " training_score = model.score(X_train_scaled, y_train)\n testing_score = model.score(X_test_scaled, y_test)\n \n print(f\"Training Data Score: {training_score}\")\n print(f\"Testing Data Score: {testing_score}\")", "Training Data Score: 0.23186119873817035\nTesting Data Score: 0.20094562647754138\n" ], [ " predictions = model.predict(X_test_scaled)\n acc 
= accuracy_score(y_test, predictions)\n    print(f'Model accuracy on test set: {acc:.2f}')", "Model accuracy on test set: 0.20\n" ], [ "from sklearn.metrics import plot_confusion_matrix\nplot_confusion_matrix(model, X_test_scaled, y_test, cmap=\"Blues\")\nplt.show()", "_____no_output_____" ], [ "plot_confusion_matrix(model, X_train_scaled, y_train, cmap=\"Blues\")\nplt.show()", "_____no_output_____" ], [ "plt.savefig('../static/images/clrep_train_svm.png')", "_____no_output_____" ], [ "plt.savefig('books_read.png')", "_____no_output_____" ], [ "from sklearn.metrics import classification_report\nprint(classification_report(y_test, predictions,\n                            target_names=[\"E\", \"N\", \"NE\", \"NW\", \"S\", \"SE\", \"SW\", \"W\"]))", "              precision    recall  f1-score   support\n\n           E       0.17      0.60      0.27        73\n           N       0.18      0.12      0.14        60\n          NE       0.00      0.00      0.00        33\n          NW       0.00      0.00      0.00        48\n           S       0.00      0.00      0.00        58\n          SE       0.22      0.34      0.27        65\n          SW       0.00      0.00      0.00        25\n           W       0.34      0.20      0.25        61\n\n    accuracy                           0.20       423\n   macro avg       0.12      0.16      0.12       423\nweighted avg       0.14      0.20      0.14       423\n\n" ], [ "joblib.dump(model, 'cyclone_SVM.smd')\nprint(\"Model is saved.\")", "Model is saved.\n" ], [ "joblib.dump(model, '../cyclone_SVM.smd')\nprint(\"Model is saved.\")", "Model is saved.\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
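A detail worth noting in the SVM notebook above: the scaler is fit on the training split only (`StandardScaler().fit(X_train)`) and then applied to both splits, which avoids leaking test-set statistics into training. The stdlib-only sketch below illustrates that fit/transform split; the helper names are hypothetical, and the notebook itself uses `sklearn.preprocessing.StandardScaler`.

```python
from statistics import mean, pstdev

def fit_scaler(rows):
    """Column-wise mean and population std computed from the *training* rows only."""
    cols = list(zip(*rows))
    mus = [mean(c) for c in cols]
    sds = [pstdev(c) or 1.0 for c in cols]  # guard against constant columns
    return mus, sds

def transform(rows, scaler):
    """Apply the training-set statistics to any split (train, val, or test)."""
    mus, sds = scaler
    return [[(v - m) / s for v, m, s in zip(row, mus, sds)] for row in rows]

# Fit on train, reuse on test -- mirroring X_scaler.fit(X_train) above.
scaler = fit_scaler([[0.0, 10.0], [2.0, 10.0]])
scaled_test = transform([[1.0, 10.0]], scaler)
```

Fitting a second scaler on the test split would silently change the feature distribution the model was trained against, which is why the notebook reuses `X_scaler` for both transforms.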
d0611dc271fbaf941d70af46922cae2350029a02
415,066
ipynb
Jupyter Notebook
_notebooks/2020-06-29-01-Showing-uncertainty.ipynb
AntonovMikhail/chans_jupyter
c2cd1675408238ad5be81ba98994611d8c4e48ae
[ "Apache-2.0" ]
8
2020-06-26T23:48:52.000Z
2021-02-27T22:26:31.000Z
_notebooks/2020-06-29-01-Showing-uncertainty.ipynb
AntonovMikhail/chans_jupyter
c2cd1675408238ad5be81ba98994611d8c4e48ae
[ "Apache-2.0" ]
46
2020-06-30T00:45:37.000Z
2021-03-07T14:47:10.000Z
_notebooks/2020-06-29-01-Showing-uncertainty.ipynb
AntonovMikhail/chans_jupyter
c2cd1675408238ad5be81ba98994611d8c4e48ae
[ "Apache-2.0" ]
26
2020-07-24T17:30:15.000Z
2021-02-19T10:19:25.000Z
275.608234
78,352
0.909212
[ [ [ "# Showing uncertainty\n> Uncertainty occurs everywhere in data science, but it's frequently left out of visualizations where it should be included. Here, we review what a confidence interval is and how to visualize them for both single estimates and continuous functions. Additionally, we discuss the bootstrap resampling technique for assessing uncertainty and how to visualize it properly. This is the Summary of lecture \"Improving Your Data Visualizations in Python\", via datacamp.\n\n- toc: true \n- badges: true\n- comments: true\n- author: Chanseok Kang\n- categories: [Python, Datacamp, Visualization]\n- image: images/so2_compare.png", "_____no_output_____" ] ], [ [ "import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nplt.rcParams['figure.figsize'] = (10, 5)", "_____no_output_____" ] ], [ [ "### Point estimate intervals\n- When is uncertainty important?\n - Estimates from sample\n - Average of a subset\n - Linear model coefficients\n- Why is uncertainty important?\n - Helps inform confidence in estimate\n - Neccessary for decision making\n - Acknowledges limitations of data", "_____no_output_____" ], [ "### Basic confidence intervals\nYou are a data scientist for a fireworks manufacturer in Des Moines, Iowa. You need to make a case to the city that your company's large fireworks show has not caused any harm to the city's air. To do this, you look at the average levels for pollutants in the week after the fourth of July and how they compare to readings taken after your last show. 
By showing confidence intervals around the averages, you can make a case that the recent readings were well within the normal range.", "_____no_output_____" ] ], [ [ "average_ests = pd.read_csv('./dataset/average_ests.csv', index_col=0)\naverage_ests", "_____no_output_____" ], [ "# Construct CI bounds for averages\naverage_ests['lower'] = average_ests['mean'] - 1.96 * average_ests['std_err']\naverage_ests['upper'] = average_ests['mean'] + 1.96 * average_ests['std_err']\n\n# Setup a grid of plots, with non-shared x axes limits\ng = sns.FacetGrid(average_ests, row='pollutant', sharex=False, aspect=2);\n\n# Plot CI for average estimate\ng.map(plt.hlines, 'y', 'lower', 'upper');\n\n# Plot observed values for comparison and remove axes labels\ng.map(plt.scatter, 'seen', 'y', color='orangered').set_ylabels('').set_xlabels('');", "_____no_output_____" ] ], [ [ "This simple visualization shows that all the observed values fall well within the confidence intervals for all the pollutants except for $O_3$.", "_____no_output_____" ], [ "### Annotating confidence intervals\nYour data science work with pollution data is legendary, and you are now weighing job offers in both Cincinnati, Ohio and Indianapolis, Indiana. You want to see if the SO2 levels are significantly different in the two cities, and more specifically, which city has lower levels. To test this, you decide to look at the differences in the cities' SO2 values (Indianapolis' - Cincinnati's) over multiple years.\n\nInstead of just displaying a p-value for a significant difference between the cities, you decide to look at the 95% confidence intervals (columns `lower` and `upper`) of the differences. 
This allows you to see the magnitude of the differences along with any trends over the years.", "_____no_output_____" ] ], [ [ "diffs_by_year = pd.read_csv('./dataset/diffs_by_year.csv', index_col=0)\ndiffs_by_year", "_____no_output_____" ], [ "# Set start and ends according to intervals\n# Make intervals thicker\nplt.hlines(y='year', xmin='lower', xmax='upper', \n           linewidth=5, color='steelblue', alpha=0.7,\n           data=diffs_by_year);\n\n# Point estimates\nplt.plot('mean', 'year', 'k|', data=diffs_by_year);\n\n# Add a 'null' reference line at 0 and color orangered\nplt.axvline(x=0, color='orangered', linestyle='--');\n\n# Set descriptive axis labels and title\nplt.xlabel('95% CI');\nplt.title('Avg SO2 differences between Cincinnati and Indianapolis');", "_____no_output_____" ] ], [ [ "By looking at the confidence intervals you can see that the difference flipped from generally positive (more pollution in Cincinnati) in 2013 to negative (more pollution in Indianapolis) in 2014 and 2015. Given that every year's confidence interval contains the null value of zero, no P-Value would be significant, and a plot that only showed significance would have entirely hidden this trend.", "_____no_output_____" ], [ "## Confidence bands", "_____no_output_____" ], [ "### Making a confidence band\nVandenberg Air Force Base is often used as a location to launch rockets into space. You have a theory that a recent increase in the pace of rocket launches could be harming the air quality in the surrounding region. To explore this, you plotted a 25-day rolling average line of the measurements of atmospheric $NO_2$. To help decide if any pattern observed is random-noise or not, you decide to add a 99% confidence band around your rolling mean. Adding a confidence band to a trend line can help shed light on the stability of the trend seen.
This can either increase or decrease the confidence in the discovered trend.\n\n", "_____no_output_____" ] ], [ [ "vandenberg_NO2 = pd.read_csv('./dataset/vandenberg_NO2.csv', index_col=0)\nvandenberg_NO2.head()", "_____no_output_____" ], [ "# Draw 99% interval bands for average NO2\nvandenberg_NO2['lower'] = vandenberg_NO2['mean'] - 2.58 * vandenberg_NO2['std_err']\nvandenberg_NO2['upper'] = vandenberg_NO2['mean'] + 2.58 * vandenberg_NO2['std_err']\n\n# Plot mean estimate as a white semi-transparent line\nplt.plot('day', 'mean', data=vandenberg_NO2, color='white', alpha=0.4);\n\n# Fill between the upper and lower confidence band values\nplt.fill_between(x='day', y1='lower', y2='upper', data=vandenberg_NO2);", "_____no_output_____" ] ], [ [ "This plot shows that the middle of the year's $NO_2$ values are not only lower than the beginning and end of the year but also are less noisy. If just the moving average line were plotted, then this potentially interesting observation would be completely missed. (Can you think of what may cause reduced variance at the lower values of the pollutant?)", "_____no_output_____" ], [ "### Separating a lot of bands\nIt is relatively simple to plot a bunch of trend lines on top of each other for rapid and precise comparisons. Unfortunately, if you need to add uncertainty bands around those lines, the plot becomes very difficult to read. Figuring out whether a line corresponds to the top of one class' band or the bottom of another's can be hard due to band overlap. Luckily in Seaborn, it's not difficult to break up the overlapping bands into separate faceted plots.\n\nTo see this, explore trends in SO2 levels for a few cities in the eastern half of the US. If you plot the trends and their confidence bands on a single plot - it's a mess. 
To fix, use Seaborn's `FacetGrid()` function to spread out the confidence intervals to multiple panes to ease your inspection.", "_____no_output_____" ] ], [ [ "eastern_SO2 = pd.read_csv('./dataset/eastern_SO2.csv', index_col=0)\neastern_SO2.head()", "_____no_output_____" ], [ "# setup a grid of plots with columns divided by location\ng = sns.FacetGrid(eastern_SO2, col='city', col_wrap=2);\n\n# Map interval plots to each cities data with coral colored ribbons\ng.map(plt.fill_between, 'day', 'lower', 'upper', color='coral');\n\n# Map overlaid mean plots with white line\ng.map(plt.plot, 'day', 'mean', color='white');", "_____no_output_____" ] ], [ [ "By separating each band into its own plot you can investigate each city with ease. Here, you see that Des Moines and Houston on average have lower SO2 values for the entire year than the two cities in the Midwest. Cincinnati has a high and variable peak near the beginning of the year but is generally more stable and lower than Indianapolis.", "_____no_output_____" ], [ "### Cleaning up bands for overlaps\nYou are working for the city of Denver, Colorado and want to run an ad campaign about how much cleaner Denver's air is than Long Beach, California's air. To investigate this claim, you will compare the SO2 levels of both cities for the year 2014. Since you are solely interested in how the cities compare, you want to keep the bands on the same plot. 
To make the bands easier to compare, decrease the opacity of the confidence bands and set a clear legend.", "_____no_output_____" ] ], [ [ "SO2_compare = pd.read_csv('./dataset/SO2_compare.csv', index_col=0)\nSO2_compare.head()", "_____no_output_____" ], [ "for city, color in [('Denver', '#66c2a5'), ('Long Beach', '#fc8d62')]:\n # Filter data to desired city\n city_data = SO2_compare[SO2_compare.city == city]\n \n # Set city interval color to desired and lower opacity\n plt.fill_between(x='day', y1='lower', y2='upper', data=city_data, color=color, alpha=0.4);\n \n # Draw a faint mean line for reference and give a label for legend\n plt.plot('day', 'mean', data=city_data, label=city, color=color, alpha=0.25);\n \nplt.legend();", "_____no_output_____" ] ], [ [ "From these two curves you can see that during the first half of the year Long Beach generally has a higher average SO2 value than Denver, in the middle of the year they are very close, and at the end of the year Denver seems to have higher averages. However, by showing the confidence intervals, you can see however that almost none of the year shows a statistically meaningful difference in average values between the two cities.", "_____no_output_____" ], [ "## Beyond 95%\n", "_____no_output_____" ], [ "### 90, 95, and 99% intervals\nYou are a data scientist for an outdoor adventure company in Fairbanks, Alaska. Recently, customers have been having issues with SO2 pollution, leading to costly cancellations. The company has sensors for CO, NO2, and O3 but not SO2 levels.\n\nYou've built a model that predicts SO2 values based on the values of pollutants with sensors (loaded as `pollution_model`, a `statsmodels` object). You want to investigate which pollutant's value has the largest effect on your model's SO2 prediction. This will help you know which pollutant's values to pay most attention to when planning outdoor tours. 
To maximize the amount of information in your report, show multiple levels of uncertainty for the model estimates.", "_____no_output_____" ] ], [ [ "from statsmodels.formula.api import ols", "_____no_output_____" ], [ "pollution = pd.read_csv('./dataset/pollution_wide.csv')\npollution = pollution.query(\"city == 'Fairbanks' & year == 2014 & month == 11\")", "_____no_output_____" ], [ "pollution_model = ols(formula='SO2 ~ CO + NO2 + O3 + day', data=pollution)\nres = pollution_model.fit()", "_____no_output_____" ], [ "# Add interval percent widths\nalphas = [ 0.01, 0.05, 0.1] \nwidths = [ '99% CI', '95%', '90%']\ncolors = ['#fee08b','#fc8d59','#d53e4f']\n\nfor alpha, color, width in zip(alphas, colors, widths):\n # Grab confidence interval\n conf_ints = res.conf_int(alpha)\n \n # Pass current interval color and legend label to plot\n plt.hlines(y = conf_ints.index, xmin = conf_ints[0], xmax = conf_ints[1],\n colors = color, label = width, linewidth = 10) \n\n# Draw point estimates\nplt.plot(res.params, res.params.index, 'wo', label = 'Point Estimate')\n\nplt.legend(loc = 'upper right')", "_____no_output_____" ] ], [ [ "### 90 and 95% bands\nYou are looking at a 40-day rolling average of the $NO_2$ pollution levels for the city of Cincinnati in 2013. To provide as detailed a picture of the uncertainty in the trend you want to look at both the 90 and 99% intervals around this rolling estimate.\n\nTo do this, set up your two interval sizes and an orange ordinal color palette. 
Additionally, to enable precise readings of the bands, make them semi-transparent, so the Seaborn background grids show through.", "_____no_output_____" ] ], [ [ "cinci_13_no2 = pd.read_csv('./dataset/cinci_13_no2.csv', index_col=0);\ncinci_13_no2.head()", "_____no_output_____" ], [ "int_widths = ['90%', '99%']\nz_scores = [1.67, 2.58]\ncolors = ['#fc8d59', '#fee08b']\n\nfor percent, Z, color in zip(int_widths, z_scores, colors):\n \n # Pass lower and upper confidence bounds and lower opacity\n plt.fill_between(\n x = cinci_13_no2.day, alpha = 0.4, color = color,\n y1 = cinci_13_no2['mean'] - Z * cinci_13_no2['std_err'],\n y2 = cinci_13_no2['mean'] + Z * cinci_13_no2['std_err'],\n label = percent);\n \nplt.legend();", "_____no_output_____" ] ], [ [ "This plot shows us that throughout 2013, the average NO2 values in Cincinnati followed a cyclical pattern with the seasons. However, the uncertainty bands show that for most of the year you can't be sure this pattern is not noise at both a 90 and 99% confidence level.", "_____no_output_____" ], [ "### Using band thickness instead of coloring\nYou are a researcher investigating the elevation a rocket reaches before visual is lost and pollutant levels at Vandenberg Air Force Base. You've built a model to predict this relationship, and since you are working independently, you don't have the money to pay for color figures in your journal article. You need to make your model results plot work in black and white. 
To do this, you will plot the 90, 95, and 99% intervals of the effect of each pollutant as successively smaller bars.", "_____no_output_____" ] ], [ [ "rocket_model = pd.read_csv('./dataset/rocket_model.csv', index_col=0)\nrocket_model", "_____no_output_____" ], [ "# Decrase interval thickness as interval widens\nsizes = [ 15, 10, 5]\nint_widths = ['90% CI', '95%', '99%']\nz_scores = [ 1.67, 1.96, 2.58]\n\nfor percent, Z, size in zip(int_widths, z_scores, sizes):\n plt.hlines(y = rocket_model.pollutant, \n xmin = rocket_model['est'] - Z * rocket_model['std_err'],\n xmax = rocket_model['est'] + Z * rocket_model['std_err'],\n label = percent, \n # Resize lines and color them gray\n linewidth = size, \n color = 'gray'); \n \n# Add point estimate\nplt.plot('est', 'pollutant', 'wo', data = rocket_model, label = 'Point Estimate');\nplt.legend(loc = 'center left', bbox_to_anchor = (1, 0.5));", "_____no_output_____" ] ], [ [ "While less elegant than using color to differentiate interval sizes, this plot still clearly allows the reader to access the effect each pollutant has on rocket visibility. You can see that of all the pollutants, O3 has the largest effect and also the tightest confidence bounds", "_____no_output_____" ], [ "## Visualizing the bootstrap\n", "_____no_output_____" ], [ "### The bootstrap histogram\nYou are considering a vacation to Cincinnati in May, but you have a severe sensitivity to NO2. You pull a few years of pollution data from Cincinnati in May and look at a bootstrap estimate of the average $NO_2$ levels. You only have one estimate to look at the best way to visualize the results of your bootstrap estimates is with a histogram.\n\nWhile you like the intuition of the bootstrap histogram by itself, your partner who will be going on the vacation with you, likes seeing percent intervals. 
To accommodate them, you decide to highlight the 95% interval by shading the region.", "_____no_output_____" ] ], [ [ "# Perform bootstrapped mean on a vector\ndef bootstrap(data, n_boots):\n return [np.mean(np.random.choice(data,len(data))) for _ in range(n_boots) ]", "_____no_output_____" ], [ "pollution = pd.read_csv('./dataset/pollution_wide.csv')\ncinci_may_NO2 = pollution.query(\"city == 'Cincinnati' & month == 5\").NO2\n\n# Generate bootstrap samples\nboot_means = bootstrap(cinci_may_NO2, 1000)\n\n# Get lower and upper 95% interval bounds\nlower, upper = np.percentile(boot_means, [2.5, 97.5])\n\n# Plot shaded area for interval\nplt.axvspan(lower, upper, color = 'gray', alpha = 0.2);\n\n# Draw histogram of bootstrap samples\nsns.distplot(boot_means, bins = 100, kde = False);", "_____no_output_____" ] ], [ [ "Your bootstrap histogram looks stable and uniform. You're now confident that the average NO2 levels in Cincinnati during your vacation should be in the range of 16 to 23.", "_____no_output_____" ], [ "### Bootstrapped regressions\nWhile working for the Long Beach parks and recreation department investigating the relationship between $NO_2$ and $SO_2$ you noticed a cluster of potential outliers that you suspect might be throwing off the correlations.\n\nInvestigate the uncertainty of your correlations through bootstrap resampling to see how stable your fits are. 
For convenience, the bootstrap sampling is complete and is provided as `no2_so2_boot` along with `no2_so2` for the non-resampled data.", "_____no_output_____" ] ], [ [ "no2_so2 = pd.read_csv('./dataset/no2_so2.csv', index_col=0)\nno2_so2_boot = pd.read_csv('./dataset/no2_so2_boot.csv', index_col=0)", "_____no_output_____" ], [ "sns.lmplot('NO2', 'SO2', data = no2_so2_boot,\n # Tell seaborn to a regression line for each sample\n hue = 'sample', \n # Make lines blue and transparent\n line_kws = {'color': 'steelblue', 'alpha': 0.2},\n # Disable built-in confidence intervals\n ci = None, legend = False, scatter = False);\n\n# Draw scatter of all points\nplt.scatter('NO2', 'SO2', data = no2_so2);", "_____no_output_____" ] ], [ [ "The outliers appear to drag down the regression lines as evidenced by the cluster of lines with more severe slopes than average. In a single plot, you have not only gotten a good idea of the variability of your correlation estimate but also the potential effects of outliers.", "_____no_output_____" ], [ "### Lots of bootstraps with beeswarms\nAs a current resident of Cincinnati, you're curious to see how the average NO2 values compare to Des Moines, Indianapolis, and Houston: a few other cities you've lived in.\n\nTo look at this, you decide to use bootstrap estimation to look at the mean NO2 values for each city. 
Because the comparisons are of primary interest, you will use a swarm plot to compare the estimates.", "_____no_output_____" ] ], [ [ "pollution_may = pollution.query(\"month == 5\")\npollution_may", "_____no_output_____" ], [ "# Initialize a holder DataFrame for bootstrap results\ncity_boots = pd.DataFrame()\n\nfor city in ['Cincinnati', 'Des Moines', 'Indianapolis', 'Houston']:\n # Filter to city\n city_NO2 = pollution_may[pollution_may.city == city].NO2\n # Bootstrap city data & put in DataFrame\n cur_boot = pd.DataFrame({'NO2_avg': bootstrap(city_NO2, 100), 'city': city})\n # Append to other city's bootstraps\n city_boots = pd.concat([city_boots,cur_boot])\n\n# Beeswarm plot of averages with citys on y axis\nsns.swarmplot(y = \"city\", x = \"NO2_avg\", data = city_boots, color = 'coral');", "_____no_output_____" ] ], [ [ "The beeswarm plots show that Indianapolis and Houston both have the highest average NO2 values, with Cincinnati falling roughly in the middle. Interestingly, you can rather confidently say that Des Moines has the lowest as nearly all its sample estimates fall below those of the other cities.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ] ]
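The `bootstrap` helper in the notebook above resamples the data with replacement and records the mean of each resample; the 95% interval is then read off the 2.5th and 97.5th percentiles. Below is a stdlib-only version of the same idea — the notebook uses `np.random.choice` and `np.percentile`, and the `seed` argument here is an addition for reproducibility.

```python
import random
import statistics

def bootstrap_means(data, n_boots, seed=0):
    """Means of n_boots resamples drawn with replacement from data."""
    rng = random.Random(seed)
    return [statistics.mean(rng.choices(data, k=len(data)))
            for _ in range(n_boots)]

def percentile_interval(samples, lo=2.5, hi=97.5):
    """Linearly interpolated percentile bounds, like np.percentile's default."""
    s = sorted(samples)
    def pct(p):
        idx = (len(s) - 1) * p / 100.0
        i, frac = int(idx), idx - int(idx)
        j = min(i + 1, len(s) - 1)
        return s[i] * (1.0 - frac) + s[j] * frac
    return pct(lo), pct(hi)

boot = bootstrap_means(list(range(100)), n_boots=500)
lower, upper = percentile_interval(boot)
```

Because each bootstrap distribution centers on the observed sample mean, the resulting interval brackets that mean — which is exactly what the shaded `axvspan` region around the histogram visualizes in the notebook.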
d06127bfe2404b70d73671ba48a678224d198027
13,667
ipynb
Jupyter Notebook
notebooks/find_a_shakemap.ipynb
iwbailey/shakemap_lookup
75ac2739cba2a293f95a24feb1d7f57ebc0f834b
[ "MIT" ]
1
2020-10-16T07:31:58.000Z
2020-10-16T07:31:58.000Z
notebooks/find_a_shakemap.ipynb
iwbailey/shakemap_lookup
75ac2739cba2a293f95a24feb1d7f57ebc0f834b
[ "MIT" ]
null
null
null
notebooks/find_a_shakemap.ipynb
iwbailey/shakemap_lookup
75ac2739cba2a293f95a24feb1d7f57ebc0f834b
[ "MIT" ]
null
null
null
33.829208
299
0.524914
[ [ [ "from shakemap_lookup import usgs_web", "_____no_output_____" ], [ "help(usgs_web.search_usgsevents)", "Help on function search_usgsevents in module shakemap_lookup.usgs_web:\n\nsearch_usgsevents(searchParams, urlEndpt='https://earthquake.usgs.gov/fdsnws/event/1/query', maxNprint=30, isQuiet=False)\n Search the USGS for events satisfying the criteria and return a list of\n events\n \n IN:\n searchParams is a dict containing the search parameters used for the query\n urlEndpt [string] is the web address used for the search.\n \n API doc here... https://earthquake.usgs.gov/fdsnws/event/1/\n \n OUT:\n A list of events satisfying the conditions in a json structure\n\n" ] ], [ [ "## Define our search parameters and send to the USGS \nUse a dict, with same names as used by the USGS web call.\n\nSend a query to the web server. The result is a list of events also in a dict format.", "_____no_output_____" ] ], [ [ "search_params = {\n 'starttime': \"2018-05-01\",\n 'endtime': \"2018-05-17\",\n 'minmagnitude': 6.8,\n 'maxmagnitude': 10.0,\n 'mindepth': 0.0,\n 'maxdepth': 50.0,\n 'minlongitude': -180.0,\n 'maxlongitude': -97.0,\n 'minlatitude': 0.0,\n 'maxlatitude': 45.0,\n 'limit': 50,\n 'producttype': 'shakemap'\n}\n\nevents = usgs_web.search_usgsevents(search_params)", "Sending query to get events...\nParsing...\n\t...1 events returned (limit of 50)\n\t\t 70116556 : M 6.9 - 19km SSW of Leilani Estates, Hawaii\n" ] ], [ [ "## Check the metadata \nDisplay metadata including number of earthquakes returned and what url was used for the query", "_____no_output_____" ] ], [ [ "for k, v in events['metadata'].items():\n print(k,\":\", v)", "generated : 1575582197000\nurl : https://earthquake.usgs.gov/fdsnws/event/1/query?starttime=2018-05-01&endtime=2018-05-17&minmagnitude=6.8&maxmagnitude=10.0&mindepth=0.0&maxdepth=50.0&minlongitude=-180.0&maxlongitude=-97.0&minlatitude=0.0&maxlatitude=45.0&limit=50&producttype=shakemap&format=geojson&jsonerror=true\ntitle : USGS 
Earthquakes\nstatus : 200\napi : 1.8.1\nlimit : 50\noffset : 1\ncount : 1\n" ] ], [ [ "## Selection of event from candidates", "_____no_output_____" ] ], [ [ "my_event = usgs_web.choose_event(events)\nmy_event", "\nUSER SELECTION OF EVENT:\n========================\n 0: M 6.9 - 19km SSW of Leilani Estates, Hawaii (70116556)\nNone: First on list\n -1: Exit\n\nChoice: \n\t... selected M 6.9 - 19km SSW of Leilani Estates, Hawaii (70116556)\n\n" ] ], [ [ "## Select which ShakeMap for the selected event", "_____no_output_____" ] ], [ [ "smDetail = usgs_web.query_shakemapdetail(my_event['properties'])", "Querying detailed event info for eventId=70116556...\n\t...2 shakemaps found\n\nUSER SELECTION OF SHAKEMAP:\n===========================\nOption 0:\n\t eventsourcecode: 70116556\n\t version: 1\n\t process-timestamp: 2018-09-08T02:52:24Z\nOption 1:\n\t eventsourcecode: 1000dyad\n\t version: 11\n\t process-timestamp: 2018-06-15T23:02:03Z\n\nChoice [default 0]: \n\t... selected 0\n\n" ] ], [ [ "## Display available content for the ShakeMap", "_____no_output_____" ] ], [ [ "print(\"Available Content\\n=================\")\nfor k, v in smDetail['contents'].items():\n print(\"{:32s}: {} [{}]\".format(k, v['contentType'], v['length']))", "Available Content\n=================\nabout_formats.html : text/html [28820]\ncontents.xml : application/xml [9187]\ndownload/70116556.kml : application/vnd.google-earth.kml+xml [1032]\ndownload/cont_mi.json : application/json [79388]\ndownload/cont_mi.kmz : application/vnd.google-earth.kmz [17896]\ndownload/cont_pga.json : application/json [17499]\ndownload/cont_pga.kmz : application/vnd.google-earth.kmz [4362]\ndownload/cont_pgv.json : application/json [12352]\ndownload/cont_pgv.kmz : application/vnd.google-earth.kmz [3309]\ndownload/cont_psa03.json : application/json [24669]\ndownload/cont_psa03.kmz : application/vnd.google-earth.kmz [5843]\ndownload/cont_psa10.json : application/json [15028]\ndownload/cont_psa10.kmz : 
application/vnd.google-earth.kmz [3843]\ndownload/cont_psa30.json : application/json [7537]\ndownload/cont_psa30.kmz : application/vnd.google-earth.kmz [2254]\ndownload/epicenter.kmz : application/vnd.google-earth.kmz [1299]\ndownload/event.txt : text/plain [125]\ndownload/grid.xml : application/xml [3423219]\ndownload/grid.xml.zip : application/zip [493382]\ndownload/grid.xyz.zip : application/zip [428668]\ndownload/hazus.zip : application/zip [329755]\ndownload/hv70116556.kml : application/vnd.google-earth.kml+xml [1032]\ndownload/hv70116556.kmz : application/vnd.google-earth.kmz [127511]\ndownload/ii_overlay.png : image/png [25259]\ndownload/ii_thumbnail.jpg : image/jpeg [3530]\ndownload/info.json : application/json [2237]\ndownload/intensity.jpg : image/jpeg [60761]\ndownload/intensity.ps.zip : application/zip [139098]\ndownload/metadata.txt : text/plain [33137]\ndownload/mi_regr.png : image/png [35160]\ndownload/overlay.kmz : application/vnd.google-earth.kmz [25245]\ndownload/pga.jpg : image/jpeg [49594]\ndownload/pga.ps.zip : application/zip [89668]\ndownload/pga_regr.png : image/png [33466]\ndownload/pgv.jpg : image/jpeg [49781]\ndownload/pgv.ps.zip : application/zip [89389]\ndownload/pgv_regr.png : image/png [17605]\ndownload/polygons_mi.kmz : application/vnd.google-earth.kmz [43271]\ndownload/psa03.jpg : image/jpeg [49354]\ndownload/psa03.ps.zip : application/zip [90027]\ndownload/psa03_regr.png : image/png [18371]\ndownload/psa10.jpg : image/jpeg [49003]\ndownload/psa10.ps.zip : application/zip [89513]\ndownload/psa10_regr.png : image/png [31310]\ndownload/psa30.jpg : image/jpeg [48956]\ndownload/psa30.ps.zip : application/zip [89113]\ndownload/psa30_regr.png : image/png [18055]\ndownload/raster.zip : application/zip [1940448]\ndownload/rock_grid.xml.zip : application/zip [403486]\ndownload/sd.jpg : image/jpeg [45869]\ndownload/shape.zip : application/zip [1029832]\ndownload/stationlist.json : application/json [55083]\ndownload/stationlist.txt : 
text/plain [6737]\ndownload/stationlist.xml : application/xml [32441]\ndownload/stations.kmz : application/vnd.google-earth.kmz [7343]\ndownload/tvguide.txt : text/plain [8765]\ndownload/tvmap.jpg : image/jpeg [44223]\ndownload/tvmap.ps.zip : application/zip [273000]\ndownload/tvmap_bare.jpg : image/jpeg [48640]\ndownload/tvmap_bare.ps.zip : application/zip [273146]\ndownload/uncertainty.xml.zip : application/zip [211743]\ndownload/urat_pga.jpg : image/jpeg [45869]\ndownload/urat_pga.ps.zip : application/zip [51741]\nintensity.html : text/html [19291]\npga.html : text/html [19083]\npgv.html : text/html [19083]\nproducts.html : text/html [18584]\npsa03.html : text/html [20250]\npsa10.html : text/html [20249]\npsa30.html : text/html [20249]\nstationlist.html : text/html [127947]\n" ] ], [ [ "## Get download links\nClick on the link to download", "_____no_output_____" ] ], [ [ "# Extract the shakemap grid urls and version from the detail\ngrid = smDetail['contents']['download/grid.xml.zip']\nprint(grid['url'])", "https://earthquake.usgs.gov/archive/product/shakemap/hv70116556/us/1536375199192/download/grid.xml.zip\n" ], [ "grid = smDetail['contents']['download/uncertainty.xml.zip']\nprint(grid['url'])", "https://earthquake.usgs.gov/archive/product/shakemap/hv70116556/us/1536375199192/download/uncertainty.xml.zip\n" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d0613442e8a77054618c47832a0a30ce54d0c49d
148,109
ipynb
Jupyter Notebook
module3-permutation-boosting/LS_DS_233.ipynb
mariokart345/DS-Unit-2-Applied-Modeling
ecb7506dc3b08bb06c282937bdbc152332fb9b1b
[ "MIT" ]
null
null
null
module3-permutation-boosting/LS_DS_233.ipynb
mariokart345/DS-Unit-2-Applied-Modeling
ecb7506dc3b08bb06c282937bdbc152332fb9b1b
[ "MIT" ]
null
null
null
module3-permutation-boosting/LS_DS_233.ipynb
mariokart345/DS-Unit-2-Applied-Modeling
ecb7506dc3b08bb06c282937bdbc152332fb9b1b
[ "MIT" ]
null
null
null
68.064798
25,922
0.626633
[ [ [ "<a href=\"https://colab.research.google.com/github/mariokart345/DS-Unit-2-Applied-Modeling/blob/master/module3-permutation-boosting/LS_DS_233.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "Lambda School Data Science\n\n*Unit 2, Sprint 3, Module 3*\n\n---", "_____no_output_____" ], [ "# Permutation & Boosting\n\n- Get **permutation importances** for model interpretation and feature selection\n- Use xgboost for **gradient boosting**", "_____no_output_____" ], [ "### Setup\n\nRun the code cell below. You can work locally (follow the [local setup instructions](https://lambdaschool.github.io/ds/unit2/local/)) or on Colab.\n\nLibraries:\n\n- category_encoders\n- [**eli5**](https://eli5.readthedocs.io/en/latest/)\n- matplotlib\n- numpy\n- pandas\n- scikit-learn\n- [**xgboost**](https://xgboost.readthedocs.io/en/latest/)", "_____no_output_____" ] ], [ [ "%%capture\nimport sys\n\n# If you're on Colab:\nif 'google.colab' in sys.modules:\n DATA_PATH = 'https://raw.githubusercontent.com/LambdaSchool/DS-Unit-2-Applied-Modeling/master/data/'\n !pip install category_encoders==2.*\n !pip install eli5\n\n# If you're working locally:\nelse:\n DATA_PATH = '../data/'", "_____no_output_____" ] ], [ [ "We'll go back to Tanzania Waterpumps for this lesson.", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nfrom sklearn.model_selection import train_test_split\n\n# Merge train_features.csv & train_labels.csv\ntrain = pd.merge(pd.read_csv(DATA_PATH+'waterpumps/train_features.csv'), \n pd.read_csv(DATA_PATH+'waterpumps/train_labels.csv'))\n\n# Read test_features.csv & sample_submission.csv\ntest = pd.read_csv(DATA_PATH+'waterpumps/test_features.csv')\nsample_submission = pd.read_csv(DATA_PATH+'waterpumps/sample_submission.csv')\n\n\n# Split train into train & val\ntrain, val = train_test_split(train, train_size=0.80, test_size=0.20, \n 
stratify=train['status_group'], random_state=42)\n\n\ndef wrangle(X):\n \"\"\"Wrangle train, validate, and test sets in the same way\"\"\"\n \n # Prevent SettingWithCopyWarning\n X = X.copy()\n \n # About 3% of the time, latitude has small values near zero,\n # outside Tanzania, so we'll treat these values like zero.\n X['latitude'] = X['latitude'].replace(-2e-08, 0)\n \n # When columns have zeros and shouldn't, they are like null values.\n # So we will replace the zeros with nulls, and impute missing values later.\n # Also create a \"missing indicator\" column, because the fact that\n # values are missing may be a predictive signal.\n cols_with_zeros = ['longitude', 'latitude', 'construction_year', \n 'gps_height', 'population']\n for col in cols_with_zeros:\n X[col] = X[col].replace(0, np.nan)\n X[col+'_MISSING'] = X[col].isnull()\n \n # Drop duplicate columns\n duplicates = ['quantity_group', 'payment_type']\n X = X.drop(columns=duplicates)\n \n # Drop recorded_by (never varies) and id (always varies, random)\n unusable_variance = ['recorded_by', 'id']\n X = X.drop(columns=unusable_variance)\n \n # Convert date_recorded to datetime\n X['date_recorded'] = pd.to_datetime(X['date_recorded'], infer_datetime_format=True)\n \n # Extract components from date_recorded, then drop the original column\n X['year_recorded'] = X['date_recorded'].dt.year\n X['month_recorded'] = X['date_recorded'].dt.month\n X['day_recorded'] = X['date_recorded'].dt.day\n X = X.drop(columns='date_recorded')\n \n # Engineer feature: how many years from construction_year to date_recorded\n X['years'] = X['year_recorded'] - X['construction_year']\n X['years_MISSING'] = X['years'].isnull()\n \n # return the wrangled dataframe\n return X\n\ntrain = wrangle(train)\nval = wrangle(val)\ntest = wrangle(test)", "_____no_output_____" ], [ "# Arrange data into X features matrix and y target vector\ntarget = 'status_group'\nX_train = train.drop(columns=target)\ny_train = train[target]\nX_val = 
val.drop(columns=target)\ny_val = val[target]\nX_test = test", "_____no_output_____" ], [ "import category_encoders as ce\nfrom sklearn.impute import SimpleImputer\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.pipeline import make_pipeline\n\npipeline = make_pipeline(\n ce.OrdinalEncoder(), \n SimpleImputer(strategy='median'), \n RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1)\n)\n\n# Fit on train, score on val\npipeline.fit(X_train, y_train)\nprint('Validation Accuracy', pipeline.score(X_val, y_val))", "/usr/local/lib/python3.6/dist-packages/statsmodels/tools/_testing.py:19: FutureWarning: pandas.util.testing is deprecated. Use the functions in the public API at pandas.testing instead.\n import pandas.util.testing as tm\n" ] ], [ [ "# Get permutation importances for model interpretation and feature selection", "_____no_output_____" ], [ "## Overview", "_____no_output_____" ], [ "Default Feature Importances are fast, but Permutation Importances may be more accurate.\n\nThese links go deeper with explanations and examples:\n\n- Permutation Importances\n - [Kaggle / Dan Becker: Machine Learning Explainability](https://www.kaggle.com/dansbecker/permutation-importance)\n - [Christoph Molnar: Interpretable Machine Learning](https://christophm.github.io/interpretable-ml-book/feature-importance.html)\n- (Default) Feature Importances\n - [Ando Saabas: Selecting good features, Part 3, Random Forests](https://blog.datadive.net/selecting-good-features-part-iii-random-forests/)\n - [Terence Parr, et al: Beware Default Random Forest Importances](https://explained.ai/rf-importance/index.html)", "_____no_output_____" ], [ "There are three types of feature importances:", "_____no_output_____" ], [ "### 1. 
(Default) Feature Importances\n\nFastest, good for first estimates, but be aware:\n\n\n\n>**When the dataset has two (or more) correlated features, then from the point of view of the model, any of these correlated features can be used as the predictor, with no concrete preference of one over the others.** But once one of them is used, the importance of others is significantly reduced since effectively the impurity they can remove is already removed by the first feature. As a consequence, they will have a lower reported importance. This is not an issue when we want to use feature selection to reduce overfitting, since it makes sense to remove features that are mostly duplicated by other features. But when interpreting the data, it can lead to the incorrect conclusion that one of the variables is a strong predictor while the others in the same group are unimportant, while actually they are very close in terms of their relationship with the response variable. — [Selecting good features – Part III: random forests](https://blog.datadive.net/selecting-good-features-part-iii-random-forests/) \n\n\n \n > **The scikit-learn Random Forest feature importance ... tends to inflate the importance of continuous or high-cardinality categorical variables.** ... 
Breiman and Cutler, the inventors of Random Forests, indicate that this method of “adding up the gini decreases for each individual variable over all trees in the forest gives a **fast** variable importance that is often very consistent with the permutation importance measure.” — [Beware Default Random Forest Importances](https://explained.ai/rf-importance/index.html)\n\n \n", "_____no_output_____" ] ], [ [ "# Get feature importances\nrf = pipeline.named_steps['randomforestclassifier']\nimportances = pd.Series(rf.feature_importances_, X_train.columns)\n\n# Plot feature importances\n%matplotlib inline\nimport matplotlib.pyplot as plt\n\nn = 20\nplt.figure(figsize=(10,n/2))\nplt.title(f'Top {n} features')\nimportances.sort_values()[-n:].plot.barh(color='grey');", "_____no_output_____" ] ], [ [ "### 2. Drop-Column Importance\n\nThe best in theory, but too slow in practice", "_____no_output_____" ] ], [ [ "column = 'wpt_name'\n\n# Fit without column\npipeline = make_pipeline(\n    ce.OrdinalEncoder(), \n    SimpleImputer(strategy='median'), \n    RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1)\n)\npipeline.fit(X_train.drop(columns=column), y_train)\nscore_without = pipeline.score(X_val.drop(columns=column), y_val)\nprint(f'Validation Accuracy without {column}: {score_without}')\n\n# Fit with column\npipeline = make_pipeline(\n    ce.OrdinalEncoder(), \n    SimpleImputer(strategy='median'), \n    RandomForestClassifier(n_estimators=100, random_state=42, n_jobs=-1)\n)\npipeline.fit(X_train, y_train)\nscore_with = pipeline.score(X_val, y_val)\nprint(f'Validation Accuracy with {column}: {score_with}')\n\n# Compare the error with & without column\nprint(f'Drop-Column Importance for {column}: {score_with - score_without}')", "Validation Accuracy without wpt_name: 0.8087542087542088\nValidation Accuracy with wpt_name: 0.8135521885521886\nDrop-Column Importance for wpt_name: 0.004797979797979801\n" ] ], [ [ "### 3. 
Permutation Importance\n\nPermutation Importance is a good compromise between Feature Importance based on impurity reduction (which is the fastest) and Drop Column Importance (which is the \"best.\")\n\n[The ELI5 library documentation explains,](https://eli5.readthedocs.io/en/latest/blackbox/permutation_importance.html)\n\n> Importance can be measured by looking at how much the score (accuracy, F1, R^2, etc. - any score we’re interested in) decreases when a feature is not available.\n>\n> To do that one can remove feature from the dataset, re-train the estimator and check the score. But it requires re-training an estimator for each feature, which can be computationally intensive. ...\n>\n>To avoid re-training the estimator we can remove a feature only from the test part of the dataset, and compute score without using this feature. It doesn’t work as-is, because estimators expect feature to be present. So instead of removing a feature we can replace it with random noise - feature column is still there, but it no longer contains useful information. This method works if noise is drawn from the same distribution as original feature values (as otherwise estimator may fail). The simplest way to get such noise is to shuffle values for a feature, i.e. 
use other examples’ feature values - this is how permutation importance is computed.\n>\n>The method is most suitable for computing feature importances when a number of columns (features) is not huge; it can be resource-intensive otherwise.", "_____no_output_____" ], [ "### Do-It-Yourself way, for intuition", "_____no_output_____" ] ], [ [ "#lets see how permutation works first \r\nnevi_array = [1,2,3,4,5]\r\nnevi_permuted = np.random.permutation(nevi_array)\r\nnevi_permuted", "_____no_output_____" ], [ "#BEFORE : sequence of the feature to be permuted \r\nfeature = 'quantity'\r\nX_val[feature].head()", "_____no_output_____" ], [ "#BEFORE: distribution \r\nX_val[feature].value_counts()", "_____no_output_____" ], [ "#PERMUTE\r\n\r\nX_val_permuted = X_val.copy()\r\nX_val_permuted[feature] =np.random.permutation(X_val[feature])", "_____no_output_____" ], [ "#AFTER : sequence of the feature to be permuted \r\nfeature = 'quantity'\r\nX_val_permuted[feature].head()", "_____no_output_____" ], [ "#AFTER: distribution \r\nX_val_permuted[feature].value_counts()", "_____no_output_____" ], [ "#get the permutation importance \r\nX_val_permuted[feature] =np.random.permutation(X_val[feature])\r\n\r\nscore_permuted = pipeline.score(X_val_permuted, y_val)\r\n\r\nprint(f'Validation Accuracy with {feature}: {score_with}')\r\nprint(f'Validation Accuracy with {feature} permuted: {score_permuted}')\r\nprint(f'Permutation Importance: {score_with - score_permuted}')", "Validation Accuracy with quantity: 0.8135521885521886\nValidation Accuracy with quantity permuted: 0.7148148148148148\nPermutation Importance: 0.09873737373737379\n" ], [ "feature = 'wpt_name'\r\nX_val_permuted=X_val.copy()\r\nX_val_permuted[feature] = np.random.permutation(X_val[feature])\r\nscore_permuted = pipeline.score(X_val_permuted, y_val)\r\n\r\nprint(f'Validation Accuracy with {feature}: {score_with}')\r\nprint(f'Validation Accuracy with {feature} permuted: {score_permuted}')\r\nprint(f'Permutation Importance: 
{score_with - score_permuted}')", "Validation Accuracy with wpt_name: 0.8135521885521886\nValidation Accuracy with wpt_name permuted: 0.811952861952862\nPermutation Importance: 0.0015993265993266004\n" ], [ "X_val[feature]", "_____no_output_____" ] ], [ [ "### With eli5 library\n\nFor more documentation on using this library, see:\n- [eli5.sklearn.PermutationImportance](https://eli5.readthedocs.io/en/latest/autodocs/sklearn.html#eli5.sklearn.permutation_importance.PermutationImportance)\n- [eli5.show_weights](https://eli5.readthedocs.io/en/latest/autodocs/eli5.html#eli5.show_weights)\n- [scikit-learn user guide, `scoring` parameter](https://scikit-learn.org/stable/modules/model_evaluation.html#the-scoring-parameter-defining-model-evaluation-rules)\n\neli5 doesn't work with pipelines.", "_____no_output_____" ] ], [ [ "# Ignore warnings\n \ntransformers = make_pipeline(\n ce.OrdinalEncoder(), \n SimpleImputer(strategy='median')\n )\n\nX_train_transformed = transformers.fit_transform(X_train)\nX_val_transformed = transformers.transform(X_val)\n\nmodel = RandomForestClassifier(n_estimators=50, random_state=42, n_jobs=-1)\nmodel.fit(X_train_transformed, y_train)", "_____no_output_____" ], [ "import eli5\r\nfrom eli5.sklearn import PermutationImportance\r\n\r\npermuter = PermutationImportance(\r\n model, \r\n scoring='accuracy',\r\n n_iter=5, \r\n random_state=42\r\n)\r\n\r\npermuter.fit(X_val_transformed,y_val)\r\n", "_____no_output_____" ], [ "feature_names = X_val.columns.to_list()\r\npd.Series(permuter.feature_importances_, feature_names).sort_values(ascending=False)", "_____no_output_____" ], [ "eli5.show_weights(\r\n permuter, \r\n top=None, \r\n feature_names=feature_names\r\n)", "_____no_output_____" ] ], [ [ "### We can use importances for feature selection\n\nFor example, we can remove features with zero importance. 
The model trains faster and the score stays about the same.", "_____no_output_____" ] ], [ [ "print('Shape before removing feature ', X_train.shape)", "Shape before removing feature  (47520, 45)\n" ], [ "# Remove features with feature importance <= 0\r\nminimum_importance = 0\r\nmask=permuter.feature_importances_ > minimum_importance \r\nfeatures = X_train.columns[mask]\r\nX_train=X_train[features]", "_____no_output_____" ], [ "print('Shape AFTER removing feature ', X_train.shape)", "Shape AFTER removing feature  (47520, 24)\n" ], [ "X_val=X_val[features]\r\n\r\npipeline = make_pipeline(\r\n    ce.OrdinalEncoder(), \r\n    SimpleImputer(strategy='mean'), \r\n    RandomForestClassifier(n_estimators=50, random_state=42, n_jobs=-1)\r\n)\r\n\r\n#fit on train, score on val \r\npipeline.fit(X_train, y_train)\r\nprint('Validation accuracy', pipeline.score(X_val, y_val))", "Validation accuracy 0.8066498316498316\n" ] ], [ [ "# Use xgboost for gradient boosting", "_____no_output_____" ], [ "## Overview", "_____no_output_____" ], [ "In the Random Forest lesson, you learned this advice:\n\n#### Try Tree Ensembles when you do machine learning with labeled, tabular data\n- \"Tree Ensembles\" means Random Forest or **Gradient Boosting** models. \n- [Tree Ensembles often have the best predictive accuracy](https://arxiv.org/abs/1708.05070) with labeled, tabular data.\n- Why? Because trees can fit non-linear, non-[monotonic](https://en.wikipedia.org/wiki/Monotonic_function) relationships, and [interactions](https://christophm.github.io/interpretable-ml-book/interaction.html) between features.\n- A single decision tree, grown to unlimited depth, will [overfit](http://www.r2d3.us/visual-intro-to-machine-learning-part-1/). We solve this problem by ensembling trees, with bagging (Random Forest) or **[boosting](https://www.youtube.com/watch?v=GM3CDQfQ4sw)** (Gradient Boosting).\n- Random Forest's advantage: may be less sensitive to hyperparameters. 
**Gradient Boosting's advantage:** may get better predictive accuracy.", "_____no_output_____" ], [ "Like Random Forest, Gradient Boosting uses ensembles of trees. But the details of the ensembling technique are different:\n\n### Understand the difference between boosting & bagging\n\nBoosting (used by Gradient Boosting) is different than Bagging (used by Random Forests). \n\nHere's an excerpt from [_An Introduction to Statistical Learning_](http://www-bcf.usc.edu/~gareth/ISL/ISLR%20Seventh%20Printing.pdf) Chapter 8.2.3, Boosting:\n\n>Recall that bagging involves creating multiple copies of the original training data set using the bootstrap, fitting a separate decision tree to each copy, and then combining all of the trees in order to create a single predictive model.\n>\n>**Boosting works in a similar way, except that the trees are grown _sequentially_: each tree is grown using information from previously grown trees.**\n>\n>Unlike fitting a single large decision tree to the data, which amounts to _fitting the data hard_ and potentially overfitting, the boosting approach instead _learns slowly._ Given the current model, we fit a decision tree to the residuals from the model.\n>\n>We then add this new decision tree into the fitted function in order to update the residuals. Each of these trees can be rather small, with just a few terminal nodes. **By fitting small trees to the residuals, we slowly improve f̂ in areas where it does not perform well.**\n>\n>Note that in boosting, unlike in bagging, the construction of each tree depends strongly on the trees that have already been grown.\n\nThis high-level overview is all you need to know for now. If you want to go deeper, we recommend you watch the StatQuest videos on gradient boosting!", "_____no_output_____" ], [ "Let's write some code. 
We have lots of options for which libraries to use:\n\n#### Python libraries for Gradient Boosting\n- [scikit-learn Gradient Tree Boosting](https://scikit-learn.org/stable/modules/ensemble.html#gradient-boosting) — slower than other libraries, but [the new version may be better](https://twitter.com/amuellerml/status/1129443826945396737)\n  - Anaconda: already installed\n  - Google Colab: already installed\n- [xgboost](https://xgboost.readthedocs.io/en/latest/) — can accept missing values and enforce [monotonic constraints](https://xiaoxiaowang87.github.io/monotonicity_constraint/)\n  - Anaconda, Mac/Linux: `conda install -c conda-forge xgboost`\n  - Windows: `conda install -c anaconda py-xgboost`\n  - Google Colab: already installed\n- [LightGBM](https://lightgbm.readthedocs.io/en/latest/) — can accept missing values and enforce [monotonic constraints](https://blog.datadive.net/monotonicity-constraints-in-machine-learning/)\n  - Anaconda: `conda install -c conda-forge lightgbm`\n  - Google Colab: already installed\n- [CatBoost](https://catboost.ai/) — can accept missing values and use [categorical features](https://catboost.ai/docs/concepts/algorithm-main-stages_cat-to-numberic.html) without preprocessing\n  - Anaconda: `conda install -c conda-forge catboost`\n  - Google Colab: `pip install catboost`", "_____no_output_____" ], [ "In this lesson, you'll use a new library, xgboost — But it has an API that's almost the same as scikit-learn, so it won't be a hard adjustment!\n\n#### [XGBoost Python API Reference: Scikit-Learn API](https://xgboost.readthedocs.io/en/latest/python/python_api.html#module-xgboost.sklearn)", "_____no_output_____" ] ], [ [ "from xgboost import XGBClassifier\r\npipeline = make_pipeline(\r\n    ce.OrdinalEncoder(), \r\n    XGBClassifier(n_estimators=100, random_state=42, n_jobs=-1)\r\n)\r\n\r\npipeline.fit(X_train, y_train)\r\n", "_____no_output_____" ], [ "from sklearn.metrics import 
accuracy_score\r\ny_pred=pipeline.predict(X_val)\r\nprint('Validation score', accuracy_score(y_val, y_pred))", "Validation score 0.7453703703703703\n" ] ], [ [ "#### [Avoid Overfitting By Early Stopping With XGBoost In Python](https://machinelearningmastery.com/avoid-overfitting-by-early-stopping-with-xgboost-in-python/)\n\nWhy is early stopping better than a for loop, or GridSearchCV, to optimize `n_estimators`?\n\nWith early stopping, if `n_iterations` is our number of iterations, then we fit `n_iterations` decision trees.\n\nWith a for loop, or GridSearchCV, we'd fit `sum(range(1, n_iterations+1))` trees.\n\nBut early stopping doesn't work well with pipelines. You may need to re-run multiple times with different values of other parameters such as `max_depth` and `learning_rate`.\n\n#### XGBoost parameters\n- [Notes on parameter tuning](https://xgboost.readthedocs.io/en/latest/tutorials/param_tuning.html)\n- [Parameters documentation](https://xgboost.readthedocs.io/en/latest/parameter.html)\n", "_____no_output_____" ] ], [ [ "encoder = ce.OrdinalEncoder()\r\nX_train_encoded = encoder.fit_transform(X_train)\r\nX_val_encoded = encoder.transform(X_val)\r\n\r\nmodel = XGBClassifier(\r\n    n_estimators=1000,  # <= 1000 trees, depending on early stopping\r\n    max_depth=7,        # try deeper trees because of high cardinality categoricals\r\n    learning_rate=0.5,  # try higher learning rate\r\n    n_jobs=-1\r\n)\r\n\r\neval_set = [(X_train_encoded, y_train), \r\n            (X_val_encoded, y_val)]\r\n\r\nmodel.fit(X_train_encoded, y_train, \r\n          eval_set=eval_set, \r\n          eval_metric='merror', \r\n          early_stopping_rounds=50) # Stop if the score hasn't improved in 50 rounds", "[0]\tvalidation_0-merror:0.254167\tvalidation_1-merror:0.264394\nMultiple eval metrics have been passed: 'validation_1-merror' will be used for early stopping.\n\nWill train until validation_1-merror hasn't improved in 50 
rounds.\n[1]\tvalidation_0-merror:0.241898\tvalidation_1-merror:0.252946\n[2]\tvalidation_0-merror:0.234891\tvalidation_1-merror:0.243687\n[3]\tvalidation_0-merror:0.229082\tvalidation_1-merror:0.237458\n[4]\tvalidation_0-merror:0.220013\tvalidation_1-merror:0.230892\n[5]\tvalidation_0-merror:0.213994\tvalidation_1-merror:0.227273\n[6]\tvalidation_0-merror:0.208481\tvalidation_1-merror:0.224579\n[7]\tvalidation_0-merror:0.204146\tvalidation_1-merror:0.221212\n[8]\tvalidation_0-merror:0.200989\tvalidation_1-merror:0.218687\n[9]\tvalidation_0-merror:0.198359\tvalidation_1-merror:0.218771\n[10]\tvalidation_0-merror:0.195602\tvalidation_1-merror:0.217088\n[11]\tvalidation_0-merror:0.192782\tvalidation_1-merror:0.215404\n[12]\tvalidation_0-merror:0.188952\tvalidation_1-merror:0.215572\n[13]\tvalidation_0-merror:0.185227\tvalidation_1-merror:0.213636\n[14]\tvalidation_0-merror:0.182218\tvalidation_1-merror:0.213721\n[15]\tvalidation_0-merror:0.177273\tvalidation_1-merror:0.210017\n[16]\tvalidation_0-merror:0.175947\tvalidation_1-merror:0.210185\n[17]\tvalidation_0-merror:0.173695\tvalidation_1-merror:0.209512\n[18]\tvalidation_0-merror:0.172264\tvalidation_1-merror:0.209764\n[19]\tvalidation_0-merror:0.169802\tvalidation_1-merror:0.207912\n[20]\tvalidation_0-merror:0.167487\tvalidation_1-merror:0.207744\n[21]\tvalidation_0-merror:0.165488\tvalidation_1-merror:0.206902\n[22]\tvalidation_0-merror:0.163721\tvalidation_1-merror:0.207492\n[23]\tvalidation_0-merror:0.162584\tvalidation_1-merror:0.208081\n[24]\tvalidation_0-merror:0.161322\tvalidation_1-merror:0.209091\n[25]\tvalidation_0-merror:0.159491\tvalidation_1-merror:0.207744\n[26]\tvalidation_0-merror:0.157218\tvalidation_1-merror:0.205892\n[27]\tvalidation_0-merror:0.155787\tvalidation_1-merror:0.205556\n[28]\tvalidation_0-merror:0.154714\tvalidation_1-merror:0.205808\n[29]\tvalidation_0-merror:0.153725\tvalidation_1-merror:0.205219\n[30]\tvalidation_0-merror:0.152399\tvalidation_1-merror:0.205556\n[31]\tvalidation_0-m
error:0.150421\tvalidation_1-merror:0.204461\n[32]\tvalidation_0-merror:0.147938\tvalidation_1-merror:0.204461\n[33]\tvalidation_0-merror:0.14596\tvalidation_1-merror:0.203872\n[34]\tvalidation_0-merror:0.14476\tvalidation_1-merror:0.202862\n[35]\tvalidation_0-merror:0.14314\tvalidation_1-merror:0.202694\n[36]\tvalidation_0-merror:0.142361\tvalidation_1-merror:0.202525\n[37]\tvalidation_0-merror:0.140657\tvalidation_1-merror:0.201768\n[38]\tvalidation_0-merror:0.140488\tvalidation_1-merror:0.20202\n[39]\tvalidation_0-merror:0.139268\tvalidation_1-merror:0.201515\n[40]\tvalidation_0-merror:0.137668\tvalidation_1-merror:0.200673\n[41]\tvalidation_0-merror:0.136532\tvalidation_1-merror:0.201178\n[42]\tvalidation_0-merror:0.135206\tvalidation_1-merror:0.201515\n[43]\tvalidation_0-merror:0.134133\tvalidation_1-merror:0.201852\n[44]\tvalidation_0-merror:0.132155\tvalidation_1-merror:0.200842\n[45]\tvalidation_0-merror:0.13104\tvalidation_1-merror:0.19899\n[46]\tvalidation_0-merror:0.13064\tvalidation_1-merror:0.198401\n[47]\tvalidation_0-merror:0.129819\tvalidation_1-merror:0.197643\n[48]\tvalidation_0-merror:0.12883\tvalidation_1-merror:0.197559\n[49]\tvalidation_0-merror:0.127546\tvalidation_1-merror:0.197811\n[50]\tvalidation_0-merror:0.125926\tvalidation_1-merror:0.197054\n[51]\tvalidation_0-merror:0.124769\tvalidation_1-merror:0.198906\n[52]\tvalidation_0-merror:0.123527\tvalidation_1-merror:0.198822\n[53]\tvalidation_0-merror:0.122748\tvalidation_1-merror:0.198737\n[54]\tvalidation_0-merror:0.121675\tvalidation_1-merror:0.197727\n[55]\tvalidation_0-merror:0.119823\tvalidation_1-merror:0.197222\n[56]\tvalidation_0-merror:0.119024\tvalidation_1-merror:0.19697\n[57]\tvalidation_0-merror:0.117887\tvalidation_1-merror:0.197054\n[58]\tvalidation_0-merror:0.117424\tvalidation_1-merror:0.19697\n[59]\tvalidation_0-merror:0.116814\tvalidation_1-merror:0.197727\n[60]\tvalidation_0-merror:0.115762\tvalidation_1-merror:0.197811\n[61]\tvalidation_0-merror:0.114836\tvalidation_1-m
error:0.198064\n[62]\tvalidation_0-merror:0.113973\tvalidation_1-merror:0.198737\n[63]\tvalidation_0-merror:0.113215\tvalidation_1-merror:0.199158\n[64]\tvalidation_0-merror:0.112121\tvalidation_1-merror:0.198232\n[65]\tvalidation_0-merror:0.111301\tvalidation_1-merror:0.198569\n[66]\tvalidation_0-merror:0.110438\tvalidation_1-merror:0.198401\n[67]\tvalidation_0-merror:0.108607\tvalidation_1-merror:0.197896\n[68]\tvalidation_0-merror:0.107976\tvalidation_1-merror:0.198653\n[69]\tvalidation_0-merror:0.107113\tvalidation_1-merror:0.198232\n[70]\tvalidation_0-merror:0.105661\tvalidation_1-merror:0.197811\n[71]\tvalidation_0-merror:0.104314\tvalidation_1-merror:0.197727\n[72]\tvalidation_0-merror:0.103367\tvalidation_1-merror:0.198232\n[73]\tvalidation_0-merror:0.102925\tvalidation_1-merror:0.198232\n[74]\tvalidation_0-merror:0.101684\tvalidation_1-merror:0.198401\n[75]\tvalidation_0-merror:0.10061\tvalidation_1-merror:0.198064\n[76]\tvalidation_0-merror:0.099453\tvalidation_1-merror:0.198653\n[77]\tvalidation_0-merror:0.098653\tvalidation_1-merror:0.198401\n[78]\tvalidation_0-merror:0.098169\tvalidation_1-merror:0.198148\n[79]\tvalidation_0-merror:0.097138\tvalidation_1-merror:0.199242\n[80]\tvalidation_0-merror:0.096086\tvalidation_1-merror:0.198569\n[81]\tvalidation_0-merror:0.095686\tvalidation_1-merror:0.198401\n[82]\tvalidation_0-merror:0.094592\tvalidation_1-merror:0.198401\n[83]\tvalidation_0-merror:0.09354\tvalidation_1-merror:0.196801\n[84]\tvalidation_0-merror:0.093013\tvalidation_1-merror:0.196212\n[85]\tvalidation_0-merror:0.092066\tvalidation_1-merror:0.196633\n[86]\tvalidation_0-merror:0.09154\tvalidation_1-merror:0.197138\n[87]\tvalidation_0-merror:0.090951\tvalidation_1-merror:0.197306\n[88]\tvalidation_0-merror:0.090678\tvalidation_1-merror:0.197475\n[89]\tvalidation_0-merror:0.089289\tvalidation_1-merror:0.197643\n[90]\tvalidation_0-merror:0.088405\tvalidation_1-merror:0.19638\n[91]\tvalidation_0-merror:0.087584\tvalidation_1-merror:0.196465\n[92]\tva
lidation_0-merror:0.086848\tvalidation_1-merror:0.195455\n[93]\tvalidation_0-merror:0.08609\tvalidation_1-merror:0.195791\n[94]\tvalidation_0-merror:0.085017\tvalidation_1-merror:0.19537\n[95]\tvalidation_0-merror:0.084722\tvalidation_1-merror:0.195455\n[96]\tvalidation_0-merror:0.08367\tvalidation_1-merror:0.196465\n[97]\tvalidation_0-merror:0.082828\tvalidation_1-merror:0.195707\n[98]\tvalidation_0-merror:0.082344\tvalidation_1-merror:0.197222\n[99]\tvalidation_0-merror:0.081692\tvalidation_1-merror:0.196886\n[100]\tvalidation_0-merror:0.08104\tvalidation_1-merror:0.197306\n[101]\tvalidation_0-merror:0.080661\tvalidation_1-merror:0.197306\n[102]\tvalidation_0-merror:0.080156\tvalidation_1-merror:0.197811\n[103]\tvalidation_0-merror:0.07944\tvalidation_1-merror:0.197727\n[104]\tvalidation_0-merror:0.078514\tvalidation_1-merror:0.197811\n[105]\tvalidation_0-merror:0.077378\tvalidation_1-merror:0.197559\n[106]\tvalidation_0-merror:0.076915\tvalidation_1-merror:0.197222\n[107]\tvalidation_0-merror:0.075947\tvalidation_1-merror:0.197643\n[108]\tvalidation_0-merror:0.075568\tvalidation_1-merror:0.197896\n[109]\tvalidation_0-merror:0.075084\tvalidation_1-merror:0.197138\n[110]\tvalidation_0-merror:0.074832\tvalidation_1-merror:0.197896\n[111]\tvalidation_0-merror:0.073948\tvalidation_1-merror:0.19798\n[112]\tvalidation_0-merror:0.073316\tvalidation_1-merror:0.198148\n[113]\tvalidation_0-merror:0.072201\tvalidation_1-merror:0.198906\n[114]\tvalidation_0-merror:0.071738\tvalidation_1-merror:0.199495\n[115]\tvalidation_0-merror:0.070686\tvalidation_1-merror:0.19899\n[116]\tvalidation_0-merror:0.070497\tvalidation_1-merror:0.199327\n[117]\tvalidation_0-merror:0.070097\tvalidation_1-merror:0.19899\n[118]\tvalidation_0-merror:0.068603\tvalidation_1-merror:0.198148\n[119]\tvalidation_0-merror:0.068413\tvalidation_1-merror:0.197727\n[120]\tvalidation_0-merror:0.067908\tvalidation_1-merror:0.197559\n[121]\tvalidation_0-merror:0.067529\tvalidation_1-merror:0.197643\n[122]\tvalidat
ion_0-merror:0.067066\tvalidation_1-merror:0.198064\n[123]\tvalidation_0-merror:0.066267\tvalidation_1-merror:0.198401\n[124]\tvalidation_0-merror:0.065951\tvalidation_1-merror:0.198232\n[125]\tvalidation_0-merror:0.065699\tvalidation_1-merror:0.198569\n[126]\tvalidation_0-merror:0.065509\tvalidation_1-merror:0.198232\n[127]\tvalidation_0-merror:0.065215\tvalidation_1-merror:0.197727\n[128]\tvalidation_0-merror:0.064857\tvalidation_1-merror:0.197559\n[129]\tvalidation_0-merror:0.064373\tvalidation_1-merror:0.197306\n[130]\tvalidation_0-merror:0.064036\tvalidation_1-merror:0.197306\n[131]\tvalidation_0-merror:0.063279\tvalidation_1-merror:0.197054\n[132]\tvalidation_0-merror:0.062816\tvalidation_1-merror:0.197306\n[133]\tvalidation_0-merror:0.062226\tvalidation_1-merror:0.197391\n[134]\tvalidation_0-merror:0.061785\tvalidation_1-merror:0.197054\n[135]\tvalidation_0-merror:0.061279\tvalidation_1-merror:0.196886\n[136]\tvalidation_0-merror:0.060795\tvalidation_1-merror:0.197054\n[137]\tvalidation_0-merror:0.059996\tvalidation_1-merror:0.197559\n[138]\tvalidation_0-merror:0.05947\tvalidation_1-merror:0.197391\n[139]\tvalidation_0-merror:0.059217\tvalidation_1-merror:0.197475\n[140]\tvalidation_0-merror:0.059112\tvalidation_1-merror:0.197475\n[141]\tvalidation_0-merror:0.058628\tvalidation_1-merror:0.197811\n[142]\tvalidation_0-merror:0.057912\tvalidation_1-merror:0.197306\n[143]\tvalidation_0-merror:0.057723\tvalidation_1-merror:0.197138\n[144]\tvalidation_0-merror:0.057534\tvalidation_1-merror:0.197391\nStopping. 
Best iteration:\n[94]\tvalidation_0-merror:0.085017\tvalidation_1-merror:0.19537\n\n" ], [ "results = model.evals_result()\r\ntrain_error = results['validation_0']['merror']\r\nval_error = results['validation_1']['merror']\r\nepoch = list(range(1, len(train_error)+1))\r\nplt.plot(epoch, train_error, label='Train')\r\nplt.plot(epoch, val_error, label='Validation')\r\nplt.ylabel('Classification Error')\r\nplt.xlabel('Model Complexity (n_estimators)')\r\nplt.title('Validation Curve for this XGBoost model')\r\nplt.ylim((0.10, 0.25)) # Zoom in\r\nplt.legend();", "_____no_output_____" ] ], [ [ "### Try adjusting these hyperparameters\n\n#### Random Forest\n- class_weight (for imbalanced classes)\n- max_depth (usually high, can try decreasing)\n- n_estimators (too low underfits, too high wastes time)\n- min_samples_leaf (increase if overfitting)\n- max_features (decrease for more diverse trees)\n\n#### Xgboost\n- scale_pos_weight (for imbalanced classes)\n- max_depth (usually low, can try increasing)\n- n_estimators (too low underfits, too high wastes time/overfits) โ€” Use Early Stopping!\n- learning_rate (too low underfits, too high overfits)\n\nFor more ideas, see [Notes on Parameter Tuning](https://xgboost.readthedocs.io/en/latest/tutorials/param_tuning.html) and [DART booster](https://xgboost.readthedocs.io/en/latest/tutorials/dart.html).", "_____no_output_____" ], [ "## Challenge\n\nYou will use your portfolio project dataset for all assignments this sprint. Complete these tasks for your project, and document your work.\n\n- Continue to clean and explore your data. Make exploratory visualizations.\n- Fit a model. 
Does it beat your baseline?\n- Try xgboost.\n- Get your model's permutation importances.\n\nYou should try to complete an initial model today, because the rest of the week, we're making model interpretation visualizations.\n\nBut, if you aren't ready to try xgboost and permutation importances with your dataset today, you can practice with another dataset instead. You may choose any dataset you've worked with previously.", "_____no_output_____" ] ] ]
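The early-stopping advice above ("too high wastes time/overfits" for `n_estimators`) can be sketched in code. The snippet below is an illustrative sketch only, not part of the original notebook: it uses scikit-learn's `GradientBoostingClassifier`, whose `n_iter_no_change` and `validation_fraction` options implement the same idea as xgboost's early stopping, on the bundled iris data as a stand-in dataset; all names and values here are assumptions.

```python
# Hedged sketch: early stopping for gradient boosting, the same idea
# as xgboost's early_stopping_rounds, shown with scikit-learn's API.
from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=1)

# A generous n_estimators is safe here because n_iter_no_change stops
# adding trees once the internal validation score stops improving.
model = GradientBoostingClassifier(
    n_estimators=500,
    learning_rate=0.1,
    n_iter_no_change=10,
    validation_fraction=0.2,
    random_state=1,
)
model.fit(X_train, y_train)

# n_estimators_ holds the number of trees actually fit,
# which may be far fewer than the 500 requested.
print("trees fit:", model.n_estimators_)
print("test accuracy:", model.score(X_test, y_test))
```

With early stopping in place, tuning shifts from guessing the right `n_estimators` to choosing a patience window, which is usually the easier decision.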
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ] ]
d06137aa460ed001913396642f28d2945f230d06
823,130
ipynb
Jupyter Notebook
Irises_ML_Intro/Irises Data Analysis Workflow_06_2019.ipynb
ValRCS/RCS_Data_Analysis_Python_2019_July
19e2f8310f41b697f9c86d7a085a9ff19390eeac
[ "MIT" ]
1
2019-07-11T16:25:15.000Z
2019-07-11T16:25:15.000Z
Irises_ML_Intro/Irises Data Analysis Workflow_06_2019.ipynb
ValRCS/RCS_Data_Analysis_Python_2019_July
19e2f8310f41b697f9c86d7a085a9ff19390eeac
[ "MIT" ]
8
2020-01-28T22:54:14.000Z
2022-02-10T00:17:47.000Z
Irises_ML_Intro/Irises Data Analysis Workflow_06_2019.ipynb
ValRCS/RCS_Data_Analysis_Python_2019_July
19e2f8310f41b697f9c86d7a085a9ff19390eeac
[ "MIT" ]
null
null
null
199.788835
142,296
0.894876
[ [ [ "<h1><center>Introductory Data Analysis Workflow</center></h1>\n", "_____no_output_____" ], [ "![Pipeline](https://imgs.xkcd.com/comics/data_pipeline.png)\nhttps://xkcd.com/2054", "_____no_output_____" ], [ "# An example machine learning notebook\n\n* Original Notebook by [Randal S. Olson](http://www.randalolson.com/)\n* Supported by [Jason H. Moore](http://www.epistasis.org/)\n* [University of Pennsylvania Institute for Bioinformatics](http://upibi.org/)\n* Adapted for LU Py-Sem 2018 by [Valdis Saulespurens]([email protected])", "_____no_output_____" ], [ "**You can also [execute the code in this notebook on Binder](https://mybinder.org/v2/gh/ValRCS/RigaComm_DataAnalysis/master) - no local installation required.**", "_____no_output_____" ] ], [ [ "# text 17.04.2019\nimport datetime\nprint(datetime.datetime.now())\nprint('hello')", "2019-06-13 16:12:23.662194\nhello\n" ] ], [ [ "## Table of contents\n\n1. [Introduction](#Introduction)\n\n2. [License](#License)\n\n3. [Required libraries](#Required-libraries)\n\n4. [The problem domain](#The-problem-domain)\n\n5. [Step 1: Answering the question](#Step-1:-Answering-the-question)\n\n6. [Step 2: Checking the data](#Step-2:-Checking-the-data)\n\n7. [Step 3: Tidying the data](#Step-3:-Tidying-the-data)\n\n - [Bonus: Testing our data](#Bonus:-Testing-our-data)\n\n8. [Step 4: Exploratory analysis](#Step-4:-Exploratory-analysis)\n\n9. [Step 5: Classification](#Step-5:-Classification)\n\n - [Cross-validation](#Cross-validation)\n\n - [Parameter tuning](#Parameter-tuning)\n\n10. [Step 6: Reproducibility](#Step-6:-Reproducibility)\n\n11. [Conclusions](#Conclusions)\n\n12. [Further reading](#Further-reading)\n\n13. 
[Acknowledgements](#Acknowledgements)", "_____no_output_____" ], [ "## Introduction\n\n[[ go back to the top ]](#Table-of-contents)\n\nIn the time it took you to read this sentence, terabytes of data have been collectively generated across the world โ€” more data than any of us could ever hope to process, much less make sense of, on the machines we're using to read this notebook.\n\nIn response to this massive influx of data, the field of Data Science has come to the forefront in the past decade. Cobbled together by people from a diverse array of fields โ€” statistics, physics, computer science, design, and many more โ€” the field of Data Science represents our collective desire to understand and harness the abundance of data around us to build a better world.\n\nIn this notebook, I'm going to go over a basic Python data analysis pipeline from start to finish to show you what a typical data science workflow looks like.\n\nIn addition to providing code examples, I also hope to imbue in you a sense of good practices so you can be a more effective โ€” and more collaborative โ€” data scientist.\n\nI will be following along with the data analysis checklist from [The Elements of Data Analytic Style](https://leanpub.com/datastyle), which I strongly recommend reading as a free and quick guidebook to performing outstanding data analysis.\n\n**This notebook is intended to be a public resource. As such, if you see any glaring inaccuracies or if a critical topic is missing, please feel free to point it out or (preferably) submit a pull request to improve the notebook.**", "_____no_output_____" ], [ "## License\n\n[[ go back to the top ]](#Table-of-contents)\n\nPlease see the [repository README file](https://github.com/rhiever/Data-Analysis-and-Machine-Learning-Projects#license) for the licenses and usage terms for the instructional material and code in this notebook. 
In general, I have licensed this material so that it is as widely usable and shareable as possible.", "_____no_output_____" ], [ "## Required libraries\n\n[[ go back to the top ]](#Table-of-contents)\n\nIf you don't have Python on your computer, you can use the [Anaconda Python distribution](http://continuum.io/downloads) to install most of the Python packages you need. Anaconda provides a simple double-click installer for your convenience.\n\nThis notebook uses several Python packages that come standard with the Anaconda Python distribution. The primary libraries that we'll be using are:\n\n* **NumPy**: Provides a fast numerical array structure and helper functions.\n* **pandas**: Provides a DataFrame structure to store data in memory and work with it easily and efficiently.\n* **scikit-learn**: The essential Machine Learning package in Python.\n* **matplotlib**: Basic plotting library in Python; most other Python plotting libraries are built on top of it.\n* **Seaborn**: Advanced statistical plotting library.\n* **watermark**: A Jupyter Notebook extension for printing timestamps, version numbers, and hardware information.\n\n**Note:** I will not be providing support for people trying to run this notebook outside of the Anaconda Python distribution.", "_____no_output_____" ], [ "## The problem domain\n\n[[ go back to the top ]](#Table-of-contents)\n\nFor the purposes of this exercise, let's pretend we're working for a startup that just got funded to create a smartphone app that automatically identifies species of flowers from pictures taken on the smartphone. 
We're working with a moderately-sized team of data scientists and will be building part of the data analysis pipeline for this app.\n\nWe've been tasked by our company's Head of Data Science to create a demo machine learning model that takes four measurements from the flowers (sepal length, sepal width, petal length, and petal width) and identifies the species based on those measurements alone.\n\n<img src=\"img/petal_sepal.jpg\" />\n\nWe've been given a [data set](https://github.com/ValRCS/RCS_Data_Analysis_Python/blob/master/data/iris-data.csv) from our field researchers to develop the demo, which only includes measurements for three types of *Iris* flowers:\n\n### *Iris setosa*\n\n<img src=\"img/iris_setosa.jpg\" />\n\n### *Iris versicolor*\n<img src=\"img/iris_versicolor.jpg\" />\n\n### *Iris virginica*\n<img src=\"img/iris_virginica.jpg\" />\n\nThe four measurements we're using currently come from hand-measurements by the field researchers, but they will be automatically measured by an image processing model in the future.\n\n**Note:** The data set we're working with is the famous [*Iris* data set](https://archive.ics.uci.edu/ml/datasets/Iris) โ€” included with this notebook โ€” which I have modified slightly for demonstration purposes.", "_____no_output_____" ], [ "## Step 1: Answering the question\n\n[[ go back to the top ]](#Table-of-contents)\n\nThe first step to any data analysis project is to define the question or problem we're looking to solve, and to define a measure (or set of measures) for our success at solving that task. The data analysis checklist has us answer a handful of questions to accomplish that, so let's work through those questions.\n\n>Did you specify the type of data analytic question (e.g. 
exploration, association, causality) before touching the data?\n\nWe're trying to classify the species (i.e., class) of the flower based on four measurements that we're provided: sepal length, sepal width, petal length, and petal width.\n\nPetal - ziedlapiņa, sepal - also ziedlapiņa (Latvian)\n\n![Petal vs Sepal](https://upload.wikimedia.org/wikipedia/commons/thumb/7/78/Petal-sepal.jpg/293px-Petal-sepal.jpg)\n\n>Did you define the metric for success before beginning?\n\nLet's do that now. Since we're performing classification, we can use [accuracy](https://en.wikipedia.org/wiki/Accuracy_and_precision) — the fraction of correctly classified flowers — to quantify how well our model is performing. Our company's Head of Data has told us that we should achieve at least 90% accuracy.\n\n>Did you understand the context for the question and the scientific or business application?\n\nWe're building part of a data analysis pipeline for a smartphone app that will be able to classify the species of flowers from pictures taken on the smartphone. In the future, this pipeline will be connected to another pipeline that automatically measures from pictures the traits we're using to perform this classification.\n\n>Did you record the experimental design?\n\nOur company's Head of Data has told us that the field researchers are hand-measuring 50 randomly-sampled flowers of each species using a standardized methodology. The field researchers take pictures of each flower they sample from pre-defined angles so the measurements and species can be confirmed by the other field researchers at a later point. At the end of each day, the data is compiled and stored on a private company GitHub repository.\n\n>Did you consider whether the question could be answered with the available data?\n\nThe data set we currently have is only for three types of *Iris* flowers. 
The model built off of this data set will only work for those *Iris* flowers, so we will need more data to create a general flower classifier.\n\n<hr />\n\nNotice that we've spent a fair amount of time working on the problem without writing a line of code or even looking at the data.\n\n**Thinking about and documenting the problem we're working on is an important step to performing effective data analysis that often goes overlooked.** Don't skip it.", "_____no_output_____" ], [ "## Step 2: Checking the data\n\n[[ go back to the top ]](#Table-of-contents)\n\nThe next step is to look at the data we're working with. Even curated data sets from the government can have errors in them, and it's vital that we spot these errors before investing too much time in our analysis.\n\nGenerally, we're looking to answer the following questions:\n\n* Is there anything wrong with the data?\n* Are there any quirks with the data?\n* Do I need to fix or remove any of the data?\n\nLet's start by reading the data into a pandas DataFrame.", "_____no_output_____" ] ], [ [ "import pandas as pd", "_____no_output_____" ], [ "\niris_data = pd.read_csv('../data/iris-data.csv')\n", "_____no_output_____" ], [ "# Resources for loading data from nonlocal sources\n# Pandas Can generally handle most common formats\n# https://pandas.pydata.org/pandas-docs/stable/io.html\n\n# SQL https://stackoverflow.com/questions/39149243/how-do-i-connect-to-a-sql-server-database-with-python\n# NoSQL MongoDB https://realpython.com/introduction-to-mongodb-and-python/\n# Apache Hadoop: https://dzone.com/articles/how-to-get-hadoop-data-into-a-python-model\n# Apache Spark: https://www.datacamp.com/community/tutorials/apache-spark-python\n# Data Scraping / Crawling libraries : https://elitedatascience.com/python-web-scraping-libraries Big Topic in itself\n\n# Most data resources have some form of Python API / Library ", "_____no_output_____" ], [ "iris_data.head()", "_____no_output_____" ] ], [ [ "We're in luck! 
The data seems to be in a usable format.\n\nThe first row in the data file defines the column headers, and the headers are descriptive enough for us to understand what each column represents. The headers even give us the units that the measurements were recorded in, just in case we needed to know at a later point in the project.\n\nEach row following the first row represents an entry for a flower: four measurements and one class, which tells us the species of the flower.\n\n**One of the first things we should look for is missing data.** Thankfully, the field researchers already told us that they put a 'NA' into the spreadsheet when they were missing a measurement.\n\nWe can tell pandas to automatically identify missing values if it knows our missing value marker.", "_____no_output_____" ] ], [ [ "iris_data.shape", "_____no_output_____" ], [ "iris_data.info()", "<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 150 entries, 0 to 149\nData columns (total 5 columns):\nsepal_length_cm 150 non-null float64\nsepal_width_cm 150 non-null float64\npetal_length_cm 150 non-null float64\npetal_width_cm 145 non-null float64\nclass 150 non-null object\ndtypes: float64(4), object(1)\nmemory usage: 5.9+ KB\n" ], [ "iris_data.describe()", "_____no_output_____" ], [ "iris_data = pd.read_csv('../data/iris-data.csv', na_values=['NA', 'N/A'])", "_____no_output_____" ] ], [ [ "Voilร ! Now pandas knows to treat rows with 'NA' as missing values.", "_____no_output_____" ], [ "Next, it's always a good idea to look at the distribution of our data โ€” especially the outliers.\n\nLet's start by printing out some summary statistics about the data set.", "_____no_output_____" ] ], [ [ "iris_data.describe()", "_____no_output_____" ] ], [ [ "We can see several useful values from this table. For example, we see that five `petal_width_cm` entries are missing.\n\nIf you ask me, though, tables like this are rarely useful unless we know that our data should fall in a particular range. 
It's usually better to visualize the data in some way. Visualization makes outliers and errors immediately stand out, whereas they might go unnoticed in a large table of numbers.\n\nSince we know we're going to be plotting in this section, let's set up the notebook so we can plot inside of it.", "_____no_output_____" ] ], [ [ "# This line tells the notebook to show plots inside of the notebook\n%matplotlib inline\n\nimport matplotlib.pyplot as plt\nimport seaborn as sb", "_____no_output_____" ] ], [ [ "Next, let's create a **scatterplot matrix**. Scatterplot matrices plot the distribution of each column along the diagonal, and then plot a scatterplot matrix for the combination of each variable. They make for an efficient tool to look for errors in our data.\n\nWe can even have the plotting package color each entry by its class to look for trends within the classes.", "_____no_output_____" ] ], [ [ "# We have to temporarily drop the rows with 'NA' values\n# because the Seaborn plotting function does not know\n# what to do with them\nsb.pairplot(iris_data.dropna(), hue='class')\n", "C:\\ProgramData\\Anaconda3\\lib\\site-packages\\numpy\\core\\_methods.py:140: RuntimeWarning: Degrees of freedom <= 0 for slice\n keepdims=keepdims)\nC:\\ProgramData\\Anaconda3\\lib\\site-packages\\numpy\\core\\_methods.py:132: RuntimeWarning: invalid value encountered in double_scalars\n ret = ret.dtype.type(ret / rcount)\n" ] ], [ [ "From the scatterplot matrix, we can already see some issues with the data set:\n\n1. There are five classes when there should only be three, meaning there were some coding errors.\n\n2. There are some clear outliers in the measurements that may be erroneous: one `sepal_width_cm` entry for `Iris-setosa` falls well outside its normal range, and several `sepal_length_cm` entries for `Iris-versicolor` are near-zero for some reason.\n\n3. 
We had to drop those rows with missing values.\n\nIn all of these cases, we need to figure out what to do with the erroneous data. Which takes us to the next step...", "_____no_output_____" ], [ "## Step 3: Tidying the data\n\n### GIGO principle\n\n[[ go back to the top ]](#Table-of-contents)\n\nNow that we've identified several errors in the data set, we need to fix them before we proceed with the analysis.\n\nLet's walk through the issues one-by-one.\n\n>There are five classes when there should only be three, meaning there were some coding errors.\n\nAfter talking with the field researchers, it sounds like one of them forgot to add `Iris-` before their `Iris-versicolor` entries. The other extraneous class, `Iris-setossa`, was simply a typo that they forgot to fix.\n\nLet's use the DataFrame to fix these errors.", "_____no_output_____" ] ], [ [ "iris_data['class'].unique()", "_____no_output_____" ], [ "# Copy and Replace\niris_data.loc[iris_data['class'] == 'versicolor', 'class'] = 'Iris-versicolor'\niris_data['class'].unique()\n", "_____no_output_____" ], [ "# So we take a row where a specific column('class' here) matches our bad values \n# and change them to good values\n\niris_data.loc[iris_data['class'] == 'Iris-setossa', 'class'] = 'Iris-setosa'\n\niris_data['class'].unique()", "_____no_output_____" ], [ "iris_data.tail()", "_____no_output_____" ], [ "iris_data[98:103]", "_____no_output_____" ] ], [ [ "Much better! Now we only have three class types. Imagine how embarrassing it would've been to create a model that used the wrong classes.\n\n>There are some clear outliers in the measurements that may be erroneous: one `sepal_width_cm` entry for `Iris-setosa` falls well outside its normal range, and several `sepal_length_cm` entries for `Iris-versicolor` are near-zero for some reason.\n\nFixing outliers can be tricky business. 
It's rarely clear whether the outlier was caused by measurement error, recording the data in improper units, or if the outlier is a real anomaly. For that reason, we should be judicious when working with outliers: if we decide to exclude any data, we need to make sure to document what data we excluded and provide solid reasoning for excluding that data. (i.e., \"This data didn't fit my hypothesis\" will not stand peer review.)\n\nIn the case of the one anomalous entry for `Iris-setosa`, let's say our field researchers know that it's impossible for `Iris-setosa` to have a sepal width below 2.5 cm. Clearly this entry was made in error, and we're better off just scrapping the entry than spending hours finding out what happened.", "_____no_output_____" ] ], [ [ "smallpetals = iris_data.loc[(iris_data['sepal_width_cm'] < 2.5) & (iris_data['class'] == 'Iris-setosa')]\nsmallpetals", "_____no_output_____" ], [ "iris_data.loc[iris_data['class'] == 'Iris-setosa', 'sepal_width_cm'].hist()", "_____no_output_____" ], [ "# This line drops any 'Iris-setosa' rows with a sepal width less than 2.5 cm\n# Let's go over this command in class\niris_data = iris_data.loc[(iris_data['class'] != 'Iris-setosa') | (iris_data['sepal_width_cm'] >= 2.5)]\niris_data.loc[iris_data['class'] == 'Iris-setosa', 'sepal_width_cm'].hist()\n", "_____no_output_____" ] ], [ [ "Excellent! Now all of our `Iris-setosa` rows have a sepal width of at least 2.5 cm.\n\nThe next data issue to address is the several near-zero sepal lengths for the `Iris-versicolor` rows. Let's take a look at those rows.", "_____no_output_____" ] ], [ [ "iris_data.loc[(iris_data['class'] == 'Iris-versicolor') &\n              (iris_data['sepal_length_cm'] < 1.0)]", "_____no_output_____" ] ], [ [ "How about that? 
All of these near-zero `sepal_length_cm` entries seem to be off by two orders of magnitude, as if they had been recorded in meters instead of centimeters.\n\nAfter some brief correspondence with the field researchers, we find that one of them forgot to convert those measurements to centimeters. Let's do that for them.", "_____no_output_____" ] ], [ [ "iris_data.loc[iris_data['class'] == 'Iris-versicolor', 'sepal_length_cm'].hist()", "_____no_output_____" ], [ "iris_data['sepal_length_cm'].hist()", "_____no_output_____" ], [ "# Here we fix the wrong units\n\niris_data.loc[(iris_data['class'] == 'Iris-versicolor') &\n (iris_data['sepal_length_cm'] < 1.0),\n 'sepal_length_cm'] *= 100.0\n\niris_data.loc[iris_data['class'] == 'Iris-versicolor', 'sepal_length_cm'].hist()\n;", "_____no_output_____" ], [ "iris_data['sepal_length_cm'].hist()", "_____no_output_____" ] ], [ [ "Phew! Good thing we fixed those outliers. They could've really thrown our analysis off.\n\n>We had to drop those rows with missing values.\n\nLet's take a look at the rows with missing values:", "_____no_output_____" ] ], [ [ "iris_data.loc[(iris_data['sepal_length_cm'].isnull()) |\n (iris_data['sepal_width_cm'].isnull()) |\n (iris_data['petal_length_cm'].isnull()) |\n (iris_data['petal_width_cm'].isnull())]", "_____no_output_____" ] ], [ [ "It's not ideal that we had to drop those rows, especially considering they're all `Iris-setosa` entries. 
Since it seems like the missing data is systematic โ€” all of the missing values are in the same column for the same *Iris* type โ€” this error could potentially bias our analysis.\n\nOne way to deal with missing data is **mean imputation**: If we know that the values for a measurement fall in a certain range, we can fill in empty values with the average of that measurement.\n\nLet's see if we can do that here.", "_____no_output_____" ] ], [ [ "iris_data.loc[iris_data['class'] == 'Iris-setosa', 'petal_width_cm'].hist()\n", "_____no_output_____" ] ], [ [ "Most of the petal widths for `Iris-setosa` fall within the 0.2-0.3 range, so let's fill in these entries with the average measured petal width.", "_____no_output_____" ] ], [ [ "iris_data.loc[iris_data['class'] == 'Iris-setosa', 'petal_width_cm'].mean()", "_____no_output_____" ], [ "average_petal_width = iris_data.loc[iris_data['class'] == 'Iris-setosa', 'petal_width_cm'].mean()\nprint(average_petal_width)", "0.24999999999999997\n" ], [ "\n\niris_data.loc[(iris_data['class'] == 'Iris-setosa') &\n (iris_data['petal_width_cm'].isnull()),\n 'petal_width_cm'] = average_petal_width\n\niris_data.loc[(iris_data['class'] == 'Iris-setosa') &\n (iris_data['petal_width_cm'] == average_petal_width)]", "_____no_output_____" ], [ "iris_data.loc[(iris_data['sepal_length_cm'].isnull()) |\n (iris_data['sepal_width_cm'].isnull()) |\n (iris_data['petal_length_cm'].isnull()) |\n (iris_data['petal_width_cm'].isnull())]", "_____no_output_____" ] ], [ [ "Great! Now we've recovered those rows and no longer have missing data in our data set.\n\n**Note:** If you don't feel comfortable imputing your data, you can drop all rows with missing data with the `dropna()` call:\n\n iris_data.dropna(inplace=True)\n\nAfter all this hard work, we don't want to repeat this process every time we work with the data set. 
Let's save the tidied data file *as a separate file* and work directly with that data file from now on.", "_____no_output_____" ] ], [ [ "iris_data.to_json('../data/iris-clean.json')", "_____no_output_____" ], [ "iris_data.to_csv('../data/iris-data-clean.csv', index=False)\n\n", "_____no_output_____" ], [ "cleanedframe = iris_data.dropna()", "_____no_output_____" ], [ "iris_data_clean = pd.read_csv('../data/iris-data-clean.csv')", "_____no_output_____" ] ], [ [ "Now, let's take a look at the scatterplot matrix now that we've tidied the data.", "_____no_output_____" ] ], [ [ "myplot = sb.pairplot(iris_data_clean, hue='class')\nmyplot.savefig('irises.png')", "_____no_output_____" ], [ "import scipy.stats as stats", "_____no_output_____" ], [ "iris_data = pd.read_csv('../data/iris-data.csv')", "_____no_output_____" ], [ "iris_data.columns.unique()", "_____no_output_____" ], [ "stats.entropy(iris_data_clean['sepal_length_cm'])", "_____no_output_____" ], [ "iris_data.columns[:-1]", "_____no_output_____" ], [ "# we go through list of column names except last one and get entropy \n# for data (without missing values) in each column\nfor col in iris_data.columns[:-1]:\n print(\"Entropy for: \", col, stats.entropy(iris_data[col].dropna()))", "Entropy for: sepal_length_cm 4.96909746125432\nEntropy for: sepal_width_cm 5.000701325982732\nEntropy for: petal_length_cm 4.888113822938816\nEntropy for: petal_width_cm 4.754264731532864\n" ] ], [ [ "Of course, I purposely inserted numerous errors into this data set to demonstrate some of the many possible scenarios you may face while tidying your data.\n\nThe general takeaways here should be:\n\n* Make sure your data is encoded properly\n\n* Make sure your data falls within the expected range, and use domain knowledge whenever possible to define that expected range\n\n* Deal with missing data in one way or another: replace it if you can or drop it\n\n* Never tidy your data manually because that is not easily reproducible\n\n* Use code 
as a record of how you tidied your data\n\n* Plot everything you can about the data at this stage of the analysis so you can *visually* confirm everything looks correct", "_____no_output_____" ], [ "## Bonus: Testing our data\n\n[[ go back to the top ]](#Table-of-contents)\n\nAt SciPy 2015, I was exposed to a great idea: We should test our data. Just how we use unit tests to verify our expectations from code, we can similarly set up unit tests to verify our expectations about a data set.\n\nWe can quickly test our data using `assert` statements: We assert that something must be true, and if it is, then nothing happens and the notebook continues running. However, if our assertion is wrong, then the notebook stops running and brings it to our attention. For example,\n\n```Python\nassert 1 == 2\n```\n\nwill raise an `AssertionError` and stop execution of the notebook because the assertion failed.\n\nLet's test a few things that we know about our data set now.", "_____no_output_____" ] ], [ [ "# We know that we should only have three classes\nassert len(iris_data_clean['class'].unique()) == 3", "_____no_output_____" ], [ "# We know that sepal lengths for 'Iris-versicolor' should never be below 2.5 cm\nassert iris_data_clean.loc[iris_data_clean['class'] == 'Iris-versicolor', 'sepal_length_cm'].min() >= 2.5", "_____no_output_____" ], [ "# We know that our data set should have no missing measurements\nassert len(iris_data_clean.loc[(iris_data_clean['sepal_length_cm'].isnull()) |\n (iris_data_clean['sepal_width_cm'].isnull()) |\n (iris_data_clean['petal_length_cm'].isnull()) |\n (iris_data_clean['petal_width_cm'].isnull())]) == 0", "_____no_output_____" ], [ "# We know that our data set should have no missing measurements\nassert len(iris_data.loc[(iris_data['sepal_length_cm'].isnull()) |\n (iris_data['sepal_width_cm'].isnull()) |\n (iris_data['petal_length_cm'].isnull()) |\n (iris_data['petal_width_cm'].isnull())]) == 0", "_____no_output_____" ] ], [ [ "And so on. 
If any of these expectations are violated, then our analysis immediately stops and we have to return to the tidying stage.", "_____no_output_____" ], [ "### Data Cleanup & Wrangling > 80% time spent in Data Science", "_____no_output_____" ], [ "## Step 4: Exploratory analysis\n\n[[ go back to the top ]](#Table-of-contents)\n\nNow after spending entirely too much time tidying our data, we can start analyzing it!\n\nExploratory analysis is the step where we start delving deeper into the data set beyond the outliers and errors. We'll be looking to answer questions such as:\n\n* How is my data distributed?\n\n* Are there any correlations in my data?\n\n* Are there any confounding factors that explain these correlations?\n\nThis is the stage where we plot all the data in as many ways as possible. Create many charts, but don't bother making them pretty — these charts are for internal use.\n\nLet's return to that scatterplot matrix that we used earlier.", "_____no_output_____" ] ], [ [ "sb.pairplot(iris_data_clean)\n;", "_____no_output_____" ] ], [ [ "Our data is normally distributed for the most part, which is great news if we plan on using any modeling methods that assume the data is normally distributed.\n\nThere's something strange going on with the petal measurements. Maybe it's something to do with the different `Iris` types. Let's color code the data by the class again to see if that clears things up.", "_____no_output_____" ] ], [ [ "sb.pairplot(iris_data_clean, hue='class')\n;", "_____no_output_____" ] ], [ [ "Sure enough, the strange distribution of the petal measurements exists because of the different species. 
This is actually great news for our classification task since it means that the petal measurements will make it easy to distinguish between `Iris-setosa` and the other `Iris` types.\n\nDistinguishing `Iris-versicolor` and `Iris-virginica` will prove more difficult given how much their measurements overlap.\n\nThere are also correlations between petal length and petal width, as well as sepal length and sepal width. The field biologists assure us that this is to be expected: Longer flower petals also tend to be wider, and the same applies for sepals.\n\nWe can also make [**violin plots**](https://en.wikipedia.org/wiki/Violin_plot) of the data to compare the measurement distributions of the classes. Violin plots contain the same information as [box plots](https://en.wikipedia.org/wiki/Box_plot), but also scales the box according to the density of the data.", "_____no_output_____" ] ], [ [ "plt.figure(figsize=(10, 10))\n\nfor column_index, column in enumerate(iris_data_clean.columns):\n if column == 'class':\n continue\n plt.subplot(2, 2, column_index + 1)\n sb.violinplot(x='class', y=column, data=iris_data_clean)", "_____no_output_____" ] ], [ [ "Enough flirting with the data. Let's get to modeling.", "_____no_output_____" ], [ "## Step 5: Classification\n\n[[ go back to the top ]](#Table-of-contents)\n\nWow, all this work and we *still* haven't modeled the data!\n\nAs tiresome as it can be, tidying and exploring our data is a vital component to any data analysis. 
If we had jumped straight to the modeling step, we would have created a faulty classification model.\n\nRemember: **Bad data leads to bad models.** Always check your data first.\n\n<hr />\n\nAssured that our data is now as clean as we can make it — and armed with some cursory knowledge of the distributions and relationships in our data set — it's time to make the next big step in our analysis: Splitting the data into training and testing sets.\n\nA **training set** is a random subset of the data that we use to train our models.\n\nA **testing set** is a random subset of the data (mutually exclusive from the training set) that we use to validate our models on unforeseen data.\n\nEspecially in sparse data sets like ours, it's easy for models to **overfit** the data: The model will learn the training set so well that it won't be able to handle most of the cases it's never seen before. This is why it's important for us to build the model with the training set, but score it with the testing set.\n\nNote that once we split the data into a training and testing set, we should treat the testing set like it no longer exists: We cannot use any information from the testing set to build our model or else we're cheating.\n\nLet's set up our data first.", "_____no_output_____" ] ], [ [ "iris_data_clean = pd.read_csv('../data/iris-data-clean.csv')\n\n# We're using all four measurements as inputs\n# Note that scikit-learn expects each entry to be a list of values, e.g.,\n# [ [val1, val2, val3],\n# [val1, val2, val3],\n# ... 
]\n# such that our input data set is represented as a list of lists\n\n# We can extract the data in this format from pandas like this:\nall_inputs = iris_data_clean[['sepal_length_cm', 'sepal_width_cm',\n 'petal_length_cm', 'petal_width_cm']].values\n\n# Similarly, we can extract the class labels\nall_labels = iris_data_clean['class'].values\n\n# Make sure that you don't mix up the order of the entries\n# all_inputs[5] inputs should correspond to the class in all_labels[5]\n\n# Here's what a subset of our inputs looks like:\nall_inputs[:5]", "_____no_output_____" ], [ "all_labels[:5]", "_____no_output_____" ], [ "type(all_inputs)", "_____no_output_____" ], [ "all_labels[:5]", "_____no_output_____" ], [ "type(all_labels)", "_____no_output_____" ] ], [ [ "Now our data is ready to be split.", "_____no_output_____" ] ], [ [ "from sklearn.model_selection import train_test_split", "_____no_output_____" ], [ "all_inputs[:3]", "_____no_output_____" ], [ "iris_data_clean.head(3)", "_____no_output_____" ], [ "all_labels[:3]", "_____no_output_____" ], [ "# Here we split our data into training and testing data\n\n(training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25, random_state=1)", "_____no_output_____" ], [ "training_inputs[:5]", "_____no_output_____" ], [ "testing_inputs[:5]", "_____no_output_____" ], [ "testing_classes[:5]", "_____no_output_____" ], [ "training_classes[:5]", "_____no_output_____" ] ], [ [ "With our data split, we can start fitting models to our data. Our company's Head of Data is all about decision tree classifiers, so let's start with one of those.\n\nDecision tree classifiers are incredibly simple in theory. In their simplest form, decision tree classifiers ask a series of Yes/No questions about the data โ€” each time getting closer to finding out the class of each entry โ€” until they either classify the data set perfectly or simply can't differentiate a set of entries. 
Think of it like a game of [Twenty Questions](https://en.wikipedia.org/wiki/Twenty_Questions), except the computer is *much*, *much* better at it.\n\nHere's an example decision tree classifier:\n\n<img src=\"img/iris_dtc.png\" />\n\nNotice how the classifier asks Yes/No questions about the data โ€” whether a certain feature is <= 1.75, for example โ€” so it can differentiate the records. This is the essence of every decision tree.\n\nThe nice part about decision tree classifiers is that they are **scale-invariant**, i.e., the scale of the features does not affect their performance, unlike many Machine Learning models. In other words, it doesn't matter if our features range from 0 to 1 or 0 to 1,000; decision tree classifiers will work with them just the same.\n\nThere are several [parameters](http://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html) that we can tune for decision tree classifiers, but for now let's use a basic decision tree classifier.", "_____no_output_____" ] ], [ [ "from sklearn.tree import DecisionTreeClassifier\n\n# Create the classifier\ndecision_tree_classifier = DecisionTreeClassifier()\n\n# Train the classifier on the training set\ndecision_tree_classifier.fit(training_inputs, training_classes)\n\n# Validate the classifier on the testing set using classification accuracy\ndecision_tree_classifier.score(testing_inputs, testing_classes)", "_____no_output_____" ], [ "150*0.25", "_____no_output_____" ], [ "len(testing_inputs)", "_____no_output_____" ], [ "37/38", "_____no_output_____" ], [ "from sklearn import svm\nsvm_classifier = svm.SVC(gamma = 'scale')", "_____no_output_____" ], [ "svm_classifier.fit(training_inputs, training_classes)", "_____no_output_____" ], [ "svm_classifier.score(testing_inputs, testing_classes)", "_____no_output_____" ], [ "svm_classifier = svm.SVC(gamma = 'scale')\nsvm_classifier.fit(training_inputs, training_classes)\nsvm_classifier.score(testing_inputs, testing_classes)", 
"_____no_output_____" ] ], [ [ "Heck yeah! Our model achieves 97% classification accuracy without much effort.\n\nHowever, there's a catch: Depending on how our training and testing set was sampled, our model can achieve anywhere from 80% to 100% accuracy:", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt", "_____no_output_____" ], [ "# here we randomly split the data 1000 times into different training and test sets\nmodel_accuracies = []\n\nfor repetition in range(1000):\n (training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25)\n \n decision_tree_classifier = DecisionTreeClassifier()\n decision_tree_classifier.fit(training_inputs, training_classes)\n classifier_accuracy = decision_tree_classifier.score(testing_inputs, testing_classes)\n model_accuracies.append(classifier_accuracy)\n \nplt.hist(model_accuracies)\n;", "_____no_output_____" ], [ "100/38", "_____no_output_____" ] ], [ [ "It's obviously a problem that our model performs quite differently depending on the subset of the data it's trained on. This phenomenon is known as **overfitting**: The model is learning to classify the training set so well that it doesn't generalize and perform well on data it hasn't seen before.\n\n### Cross-validation\n\n[[ go back to the top ]](#Table-of-contents)\n\nThis problem is the main reason that most data scientists perform ***k*-fold cross-validation** on their models: Split the original data set into *k* subsets, use one of the subsets as the testing set, and the rest of the subsets are used as the training set. This process is then repeated *k* times such that each subset is used as the testing set exactly once.\n\n10-fold cross-validation is the most common choice, so let's use that here. 
Performing 10-fold cross-validation on our data set looks something like this:\n\n(each square is an entry in our data set)", "_____no_output_____" ] ], [ [ "# new text", "_____no_output_____" ], [ "import numpy as np\nfrom sklearn.model_selection import StratifiedKFold\n\ndef plot_cv(cv, features, labels):\n masks = []\n for train, test in cv.split(features, labels):\n mask = np.zeros(len(labels), dtype=bool)\n mask[test] = 1\n masks.append(mask)\n \n plt.figure(figsize=(15, 15))\n plt.imshow(masks, interpolation='none', cmap='gray_r')\n plt.ylabel('Fold')\n plt.xlabel('Row #')\n\nplot_cv(StratifiedKFold(n_splits=10), all_inputs, all_labels)", "_____no_output_____" ] ], [ [ "You'll notice that we used **Stratified *k*-fold cross-validation** in the code above. Stratified *k*-fold keeps the class proportions the same across all of the folds, which is vital for maintaining a representative subset of our data set. (e.g., so we don't have 100% `Iris setosa` entries in one of the folds.)\n\nWe can perform 10-fold cross-validation on our model with the following code:", "_____no_output_____" ] ], [ [ "from sklearn.model_selection import cross_val_score", "_____no_output_____" ], [ "from sklearn.model_selection import cross_val_score\n\ndecision_tree_classifier = DecisionTreeClassifier()\n\n# cross_val_score returns a list of the scores, which we can visualize\n# to get a reasonable estimate of our classifier's performance\ncv_scores = cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10)\nplt.hist(cv_scores)\nplt.title('Average score: {}'.format(np.mean(cv_scores)))\n;", "_____no_output_____" ], [ "len(all_inputs.T[1])", "_____no_output_____" ], [ "print(\"Entropy for: \", stats.entropy(all_inputs.T[1]))", "Entropy for: 4.994187360273029\n" ], [ "# we go through list of column names except last one and get entropy \n# for data (without missing values) in each column\ndef printEntropy(npdata):\n for i, col in enumerate(npdata.T):\n print(\"Entropy for 
column:\", i, stats.entropy(col))", "_____no_output_____" ], [ "printEntropy(all_inputs)", "Entropy for column: 0 4.9947332367061925\nEntropy for column: 1 4.994187360273029\nEntropy for column: 2 4.88306851089088\nEntropy for column: 3 4.76945055275522\n" ] ], [ [ "Now we have a much more consistent rating of our classifier's general classification accuracy.\n\n### Parameter tuning\n\n[[ go back to the top ]](#Table-of-contents)\n\nEvery Machine Learning model comes with a variety of parameters to tune, and these parameters can be vitally important to the performance of our classifier. For example, if we severely limit the depth of our decision tree classifier:", "_____no_output_____" ] ], [ [ "decision_tree_classifier = DecisionTreeClassifier(max_depth=1)\n\ncv_scores = cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10)\nplt.hist(cv_scores)\nplt.title('Average score: {}'.format(np.mean(cv_scores)))\n;", "_____no_output_____" ] ], [ [ "the classification accuracy falls tremendously.\n\nTherefore, we need to find a systematic method to discover the best parameters for our model and data set.\n\nThe most common method for model parameter tuning is **Grid Search**. The idea behind Grid Search is simple: explore a range of parameters and find the best-performing parameter combination. Focus your search on the best range of parameters, then repeat this process several times until the best parameters are discovered.\n\nLet's tune our decision tree classifier. 
We'll stick to only two parameters for now, but it's possible to simultaneously explore dozens of parameters if we want.", "_____no_output_____" ] ], [ [ "from sklearn.model_selection import GridSearchCV\n\ndecision_tree_classifier = DecisionTreeClassifier()\n\nparameter_grid = {'max_depth': [1, 2, 3, 4, 5],\n 'max_features': [1, 2, 3, 4]}\n\ncross_validation = StratifiedKFold(n_splits=10)\n\ngrid_search = GridSearchCV(decision_tree_classifier,\n param_grid=parameter_grid,\n cv=cross_validation)\n\ngrid_search.fit(all_inputs, all_labels)\nprint('Best score: {}'.format(grid_search.best_score_))\nprint('Best parameters: {}'.format(grid_search.best_params_))", "Best score: 0.9664429530201343\nBest parameters: {'max_depth': 3, 'max_features': 2}\n" ] ], [ [ "Now let's visualize the grid search to see how the parameters interact.", "_____no_output_____" ] ], [ [ "grid_search.cv_results_['mean_test_score']", "_____no_output_____" ], [ "grid_visualization = grid_search.cv_results_['mean_test_score']\ngrid_visualization.shape = (5, 4)\nsb.heatmap(grid_visualization, cmap='Reds', annot=True)\nplt.xticks(np.arange(4) + 0.5, grid_search.param_grid['max_features'])\nplt.yticks(np.arange(5) + 0.5, grid_search.param_grid['max_depth'])\nplt.xlabel('max_features')\nplt.ylabel('max_depth')\n;", "_____no_output_____" ] ], [ [ "Now we have a better sense of the parameter space: We know that we need a `max_depth` of at least 2 to allow the decision tree to make more than a one-off decision.\n\n`max_features` doesn't really seem to make a big difference here as long as we have 2 of them, which makes sense since our data set has only 4 features and is relatively easy to classify. 
(Remember, one of our data set's classes was easily separable from the rest based on a single feature.)\n\nLet's go ahead and use a broad grid search to find the best settings for a handful of parameters.", "_____no_output_____" ] ], [ [ "decision_tree_classifier = DecisionTreeClassifier()\n\nparameter_grid = {'criterion': ['gini', 'entropy'],\n 'splitter': ['best', 'random'],\n 'max_depth': [1, 2, 3, 4, 5],\n 'max_features': [1, 2, 3, 4]}\n\ncross_validation = StratifiedKFold(n_splits=10)\n\ngrid_search = GridSearchCV(decision_tree_classifier,\n param_grid=parameter_grid,\n cv=cross_validation)\n\ngrid_search.fit(all_inputs, all_labels)\nprint('Best score: {}'.format(grid_search.best_score_))\nprint('Best parameters: {}'.format(grid_search.best_params_))", "Best score: 0.9664429530201343\nBest parameters: {'criterion': 'gini', 'max_depth': 3, 'max_features': 3, 'splitter': 'best'}\n" ] ], [ [ "Now we can take the best classifier from the Grid Search and use that:", "_____no_output_____" ] ], [ [ "decision_tree_classifier = grid_search.best_estimator_\ndecision_tree_classifier", "_____no_output_____" ] ], [ [ "We can even visualize the decision tree with [GraphViz](http://www.graphviz.org/) to see how it's making the classifications:", "_____no_output_____" ] ], [ [ "import sklearn.tree as tree\nfrom sklearn.externals.six import StringIO\n\nwith open('iris_dtc.dot', 'w') as out_file:\n out_file = tree.export_graphviz(decision_tree_classifier, out_file=out_file)", "_____no_output_____" ] ], [ [ "<img src=\"img/iris_dtc.png\" />", "_____no_output_____" ], [ "(This classifier may look familiar from earlier in the notebook.)\n\nAlright! We finally have our demo classifier. 
Let's create some visuals of its performance so we have something to show our company's Head of Data.", "_____no_output_____" ] ], [ [ "dt_scores = cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10)\n\nsb.boxplot(dt_scores)\nsb.stripplot(dt_scores, jitter=True, color='black')\n;", "_____no_output_____" ] ], [ [ "Hmmm... that's a little boring by itself though. How about we compare another classifier to see how they perform?\n\nWe already know from previous projects that Random Forest classifiers usually work better than individual decision trees. A common problem that decision trees face is that they're prone to overfitting: They complexify to the point that they classify the training set near-perfectly, but fail to generalize to data they have not seen before.\n\n**Random Forest classifiers** work around that limitation by creating a whole bunch of decision trees (hence \"forest\") โ€” each trained on random subsets of training samples (drawn with replacement) and features (drawn without replacement) โ€” and have the decision trees work together to make a more accurate classification.\n\nLet that be a lesson for us: **Even in Machine Learning, we get better results when we work together!**\n\nLet's see if a Random Forest classifier works better here.\n\nThe great part about scikit-learn is that the training, testing, parameter tuning, etc. 
process is the same for all models, so we only need to plug in the new classifier.", "_____no_output_____" ] ], [ [ "from sklearn.ensemble import RandomForestClassifier", "_____no_output_____" ], [ "from sklearn.ensemble import RandomForestClassifier\n\nrandom_forest_classifier = RandomForestClassifier()\n\nparameter_grid = {'n_estimators': [10, 25, 50, 100],\n 'criterion': ['gini', 'entropy'],\n 'max_features': [1, 2, 3, 4]}\n\ncross_validation = StratifiedKFold(n_splits=10)\n\ngrid_search = GridSearchCV(random_forest_classifier,\n param_grid=parameter_grid,\n cv=cross_validation)\n\ngrid_search.fit(all_inputs, all_labels)\nprint('Best score: {}'.format(grid_search.best_score_))\nprint('Best parameters: {}'.format(grid_search.best_params_))\n\ngrid_search.best_estimator_", "Best score: 0.9664429530201343\nBest parameters: {'criterion': 'gini', 'max_features': 3, 'n_estimators': 25}\n" ] ], [ [ "Now we can compare their performance:", "_____no_output_____" ] ], [ [ "random_forest_classifier = grid_search.best_estimator_\n\nrf_df = pd.DataFrame({'accuracy': cross_val_score(random_forest_classifier, all_inputs, all_labels, cv=10),\n 'classifier': ['Random Forest'] * 10})\ndt_df = pd.DataFrame({'accuracy': cross_val_score(decision_tree_classifier, all_inputs, all_labels, cv=10),\n 'classifier': ['Decision Tree'] * 10})\nboth_df = rf_df.append(dt_df)\n\nsb.boxplot(x='classifier', y='accuracy', data=both_df)\nsb.stripplot(x='classifier', y='accuracy', data=both_df, jitter=True, color='black')\n;", "_____no_output_____" ] ], [ [ "How about that? They both seem to perform about the same on this data set. This is probably because of the limitations of our data set: We have only 4 features to make the classification, and Random Forest classifiers excel when there's hundreds of possible features to look at. 
In other words, there wasn't much room for improvement with this data set.", "_____no_output_____" ], [ "## Step 6: Reproducibility\n\n[[ go back to the top ]](#Table-of-contents)\n\nEnsuring that our work is reproducible is the last and โ€” arguably โ€” most important step in any analysis. **As a rule, we shouldn't place much weight on a discovery that can't be reproduced**. As such, if our analysis isn't reproducible, we might as well not have done it.\n\nNotebooks like this one go a long way toward making our work reproducible. Since we documented every step as we moved along, we have a written record of what we did and why we did it โ€” both in text and code.\n\nBeyond recording what we did, we should also document what software and hardware we used to perform our analysis. This typically goes at the top of our notebooks so our readers know what tools to use.\n\n[Sebastian Raschka](http://sebastianraschka.com/) created a handy [notebook tool](https://github.com/rasbt/watermark) for this:", "_____no_output_____" ] ], [ [ "!pip install watermark", "Requirement already satisfied: watermark in c:\\programdata\\anaconda3\\lib\\site-packages (1.8.1)\nRequirement already satisfied: ipython in c:\\programdata\\anaconda3\\lib\\site-packages (from watermark) (7.4.0)\nRequirement already satisfied: jedi>=0.10 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.13.3)\nRequirement already satisfied: backcall in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.1.0)\nRequirement already satisfied: pickleshare in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.7.5)\nRequirement already satisfied: setuptools>=18.5 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (40.8.0)\nRequirement already satisfied: colorama; sys_platform == \"win32\" in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (0.4.1)\nRequirement already satisfied: decorator in 
c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (4.4.0)\nRequirement already satisfied: pygments in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (2.3.1)\nRequirement already satisfied: prompt-toolkit<2.1.0,>=2.0.0 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (2.0.9)\nRequirement already satisfied: traitlets>=4.2 in c:\\programdata\\anaconda3\\lib\\site-packages (from ipython->watermark) (4.3.2)\nRequirement already satisfied: parso>=0.3.0 in c:\\programdata\\anaconda3\\lib\\site-packages (from jedi>=0.10->ipython->watermark) (0.3.4)\nRequirement already satisfied: six>=1.9.0 in c:\\programdata\\anaconda3\\lib\\site-packages (from prompt-toolkit<2.1.0,>=2.0.0->ipython->watermark) (1.12.0)\nRequirement already satisfied: wcwidth in c:\\programdata\\anaconda3\\lib\\site-packages (from prompt-toolkit<2.1.0,>=2.0.0->ipython->watermark) (0.1.7)\nRequirement already satisfied: ipython-genutils in c:\\programdata\\anaconda3\\lib\\site-packages (from traitlets>=4.2->ipython->watermark) (0.2.0)\n" ], [ "%load_ext watermark", "The watermark extension is already loaded. 
To reload it, use:\n %reload_ext watermark\n" ], [ "pd.show_versions()", "\nINSTALLED VERSIONS\n------------------\ncommit: None\npython: 3.7.3.final.0\npython-bits: 64\nOS: Windows\nOS-release: 10\nmachine: AMD64\nprocessor: Intel64 Family 6 Model 158 Stepping 10, GenuineIntel\nbyteorder: little\nLC_ALL: None\nLANG: None\nLOCALE: None.None\n\npandas: 0.24.2\npytest: 4.3.1\npip: 19.0.3\nsetuptools: 40.8.0\nCython: 0.29.6\nnumpy: 1.16.2\nscipy: 1.2.1\npyarrow: None\nxarray: None\nIPython: 7.4.0\nsphinx: 1.8.5\npatsy: 0.5.1\ndateutil: 2.8.0\npytz: 2018.9\nblosc: None\nbottleneck: 1.2.1\ntables: 3.5.1\nnumexpr: 2.6.9\nfeather: None\nmatplotlib: 3.0.3\nopenpyxl: 2.6.1\nxlrd: 1.2.0\nxlwt: 1.3.0\nxlsxwriter: 1.1.5\nlxml.etree: 4.3.2\nbs4: 4.7.1\nhtml5lib: 1.0.1\nsqlalchemy: 1.3.1\npymysql: None\npsycopg2: None\njinja2: 2.10\ns3fs: None\nfastparquet: None\npandas_gbq: None\npandas_datareader: None\ngcsfs: None\n" ], [ "%watermark -a 'RCS_April_2019' -nmv --packages numpy,pandas,sklearn,matplotlib,seaborn", "RCS_April_2019 Wed Apr 17 2019 \n\nCPython 3.7.3\nIPython 7.4.0\n\nnumpy 1.16.2\npandas 0.24.2\nsklearn 0.20.3\nmatplotlib 3.0.3\nseaborn 0.9.0\n\ncompiler : MSC v.1915 64 bit (AMD64)\nsystem : Windows\nrelease : 10\nmachine : AMD64\nprocessor : Intel64 Family 6 Model 158 Stepping 10, GenuineIntel\nCPU cores : 12\ninterpreter: 64bit\n" ] ], [ [ "Finally, let's extract the core of our work from Steps 1-5 and turn it into a single pipeline.", "_____no_output_____" ] ], [ [ "%matplotlib inline\nimport pandas as pd\nimport seaborn as sb\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split, cross_val_score\n\n# We can jump directly to working with the clean data because we saved our cleaned data set\niris_data_clean = pd.read_csv('../data/iris-data-clean.csv')\n\n# Testing our data: Our analysis will stop here if any of these assertions are wrong\n\n# We know that we should only have three classes\nassert 
len(iris_data_clean['class'].unique()) == 3\n\n# We know that sepal lengths for 'Iris-versicolor' should never be below 2.5 cm\nassert iris_data_clean.loc[iris_data_clean['class'] == 'Iris-versicolor', 'sepal_length_cm'].min() >= 2.5\n\n# We know that our data set should have no missing measurements\nassert len(iris_data_clean.loc[(iris_data_clean['sepal_length_cm'].isnull()) |\n (iris_data_clean['sepal_width_cm'].isnull()) |\n (iris_data_clean['petal_length_cm'].isnull()) |\n (iris_data_clean['petal_width_cm'].isnull())]) == 0\n\nall_inputs = iris_data_clean[['sepal_length_cm', 'sepal_width_cm',\n 'petal_length_cm', 'petal_width_cm']].values\n\nall_labels = iris_data_clean['class'].values\n\n# This is the classifier that came out of Grid Search\nrandom_forest_classifier = RandomForestClassifier(criterion='gini', max_features=3, n_estimators=50)\n\n# All that's left to do now is plot the cross-validation scores\nrf_classifier_scores = cross_val_score(random_forest_classifier, all_inputs, all_labels, cv=10)\nsb.boxplot(rf_classifier_scores)\nsb.stripplot(rf_classifier_scores, jitter=True, color='black')\n\n# ...and show some of the predictions from the classifier\n(training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25)\n\nrandom_forest_classifier.fit(training_inputs, training_classes)\n\nfor input_features, prediction, actual in zip(testing_inputs[:10],\n random_forest_classifier.predict(testing_inputs[:10]),\n testing_classes[:10]):\n print('{}\\t-->\\t{}\\t(Actual: {})'.format(input_features, prediction, actual))", "[4.6 3.4 1.4 0.3]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.9 3. 4.2 1.5]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[7.2 3. 
5.8 1.6]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[6.7 2.5 5.8 1.8]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[6.7 3.3 5.7 2.5]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[4.9 3.1 1.5 0.25]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[6.3 3.4 5.6 2.4]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[5.1 3.3 1.7 0.5]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[4.9 2.4 3.3 1. ]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[6.3 3.3 4.7 1.6]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n" ], [ "%matplotlib inline\nimport pandas as pd\nimport seaborn as sb\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import train_test_split, cross_val_score\n\ndef processData(filename): \n # We can jump directly to working with the clean data because we saved our cleaned data set\n iris_data_clean = pd.read_csv(filename)\n\n # Testing our data: Our analysis will stop here if any of these assertions are wrong\n\n # We know that we should only have three classes\n assert len(iris_data_clean['class'].unique()) == 3\n\n # We know that sepal lengths for 'Iris-versicolor' should never be below 2.5 cm\n assert iris_data_clean.loc[iris_data_clean['class'] == 'Iris-versicolor', 'sepal_length_cm'].min() >= 2.5\n\n # We know that our data set should have no missing measurements\n assert len(iris_data_clean.loc[(iris_data_clean['sepal_length_cm'].isnull()) |\n (iris_data_clean['sepal_width_cm'].isnull()) |\n (iris_data_clean['petal_length_cm'].isnull()) |\n (iris_data_clean['petal_width_cm'].isnull())]) == 0\n\n all_inputs = iris_data_clean[['sepal_length_cm', 'sepal_width_cm',\n 'petal_length_cm', 'petal_width_cm']].values\n\n all_labels = iris_data_clean['class'].values\n\n # This is the classifier that came out of Grid Search\n random_forest_classifier = RandomForestClassifier(criterion='gini', max_features=3, n_estimators=50)\n\n # All that's left to do now is plot the cross-validation scores\n rf_classifier_scores = 
cross_val_score(random_forest_classifier, all_inputs, all_labels, cv=10)\n sb.boxplot(rf_classifier_scores)\n sb.stripplot(rf_classifier_scores, jitter=True, color='black')\n\n # ...and show some of the predictions from the classifier\n (training_inputs,\n testing_inputs,\n training_classes,\n testing_classes) = train_test_split(all_inputs, all_labels, test_size=0.25)\n\n random_forest_classifier.fit(training_inputs, training_classes)\n\n for input_features, prediction, actual in zip(testing_inputs[:10],\n random_forest_classifier.predict(testing_inputs[:10]),\n testing_classes[:10]):\n print('{}\\t-->\\t{}\\t(Actual: {})'.format(input_features, prediction, actual))\n return rf_classifier_scores", "_____no_output_____" ], [ "myscores = processData('../data/iris-data-clean.csv')", "[5.1 3.7 1.5 0.4]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.8 2.7 4.1 1. ]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[5.7 3. 1.1 0.1]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.9 3. 5.1 1.8]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n[5.4 3.4 1.7 0.2]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[4.7 3.2 1.6 0.2]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5.4 3. 4.5 1.5]\t-->\tIris-versicolor\t(Actual: Iris-versicolor)\n[5.7 4.4 1.5 0.4]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[5. 3.2 1.2 0.2]\t-->\tIris-setosa\t(Actual: Iris-setosa)\n[7.2 3. 5.8 1.6]\t-->\tIris-virginica\t(Actual: Iris-virginica)\n" ], [ "myscores", "_____no_output_____" ] ], [ [ "There we have it: We have a complete and reproducible Machine Learning pipeline to demo to our company's Head of Data. We've met the success criteria that we set from the beginning (>90% accuracy), and our pipeline is flexible enough to handle new inputs or flowers when that data set is ready. 
Not bad for our first week on the job!", "_____no_output_____" ], [ "## Conclusions\n\n[[ go back to the top ]](#Table-of-contents)\n\nI hope you found this example notebook useful for your own work and learned at least one new trick by reading through it.\n\n\n* [Submit an issue](https://github.com/ValRCS/LU-pysem/issues) on GitHub\n\n* Fork the [notebook repository](https://github.com/ValRCS/LU-pysem), make the fix/addition yourself, then send over a pull request", "_____no_output_____" ], [ "## Further reading\n\n[[ go back to the top ]](#Table-of-contents)\n\nThis notebook covers a broad variety of topics but skips over many of the specifics. If you're looking to dive deeper into a particular topic, here's some recommended reading.\n\n**Data Science**: William Chen compiled a [list of free books](http://www.wzchen.com/data-science-books/) for newcomers to Data Science, ranging from the basics of R & Python to Machine Learning to interviews and advice from prominent data scientists.\n\n**Machine Learning**: /r/MachineLearning has a useful [Wiki page](https://www.reddit.com/r/MachineLearning/wiki/index) containing links to online courses, books, data sets, etc. for Machine Learning. There's also a [curated list](https://github.com/josephmisiti/awesome-machine-learning) of Machine Learning frameworks, libraries, and software sorted by language.\n\n**Unit testing**: Dive Into Python 3 has a [great walkthrough](http://www.diveintopython3.net/unit-testing.html) of unit testing in Python, how it works, and how it should be used\n\n**pandas** has [several tutorials](http://pandas.pydata.org/pandas-docs/stable/tutorials.html) covering its myriad features.\n\n**scikit-learn** has a [bunch of tutorials](http://scikit-learn.org/stable/tutorial/index.html) for those looking to learn Machine Learning in Python. 
Andreas Mueller's [scikit-learn workshop materials](https://github.com/amueller/scipy_2015_sklearn_tutorial) are top-notch and freely available.\n\n**matplotlib** has many [books, videos, and tutorials](http://matplotlib.org/resources/index.html) to teach plotting in Python.\n\n**Seaborn** has a [basic tutorial](http://stanford.edu/~mwaskom/software/seaborn/tutorial.html) covering most of the statistical plotting features.", "_____no_output_____" ], [ "## Acknowledgements\n\n[[ go back to the top ]](#Table-of-contents)\n\nMany thanks to [Andreas Mueller](http://amueller.github.io/) for some of his [examples](https://github.com/amueller/scipy_2015_sklearn_tutorial) in the Machine Learning section. I drew inspiration from several of his excellent examples.\n\nThe photo of a flower with annotations of the petal and sepal was taken by [Eric Guinther](https://commons.wikimedia.org/wiki/File:Petal-sepal.jpg).\n\nThe photos of the various *Iris* flower types were taken by [Ken Walker](http://www.signa.org/index.pl?Display+Iris-setosa+2) and [Barry Glick](http://www.signa.org/index.pl?Display+Iris-virginica+3).", "_____no_output_____" ], [ "## Further questions? \n\nFeel free to contact [Valdis Saulespurens]\n(email:[email protected])", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ] ]
d061465d23ce5abccb3893326eb5add1159d5665
25,452
ipynb
Jupyter Notebook
Deep_Learning_Specialization/04_Convolutional_Neural_Networks/02_Deep_Convolutional_Models_Case_Studies/02_Residual_Networks/Residual Networks - v2.ipynb
cilsya/coursera
4a7896f3225cb84e2f15770409c1f18bfe529615
[ "MIT" ]
1
2021-03-15T13:57:04.000Z
2021-03-15T13:57:04.000Z
Deep_Learning_Specialization/04_Convolutional_Neural_Networks/02_Deep_Convolutional_Models_Case_Studies/02_Residual_Networks/Residual Networks - v2.ipynb
cilsya/coursera
4a7896f3225cb84e2f15770409c1f18bfe529615
[ "MIT" ]
5
2020-03-24T16:17:05.000Z
2021-06-01T22:49:40.000Z
Deep_Learning_Specialization/04_Convolutional_Neural_Networks/02_Deep_Convolutional_Models_Case_Studies/02_Residual_Networks/Residual Networks - v2.ipynb
cilsya/coursera
4a7896f3225cb84e2f15770409c1f18bfe529615
[ "MIT" ]
null
null
null
47.220779
457
0.485424
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d0614fd8c14825f845dd96eed5634b241da21e66
371,618
ipynb
Jupyter Notebook
docs/source/_docs/pyNetLogo demo - SALib sequential.ipynb
jasonrwang/pyNetLogo
01117c9d9c7d4d5681fe1e08f8862d6b64c9a4b7
[ "BSD-3-Clause" ]
null
null
null
docs/source/_docs/pyNetLogo demo - SALib sequential.ipynb
jasonrwang/pyNetLogo
01117c9d9c7d4d5681fe1e08f8862d6b64c9a4b7
[ "BSD-3-Clause" ]
null
null
null
docs/source/_docs/pyNetLogo demo - SALib sequential.ipynb
jasonrwang/pyNetLogo
01117c9d9c7d4d5681fe1e08f8862d6b64c9a4b7
[ "BSD-3-Clause" ]
null
null
null
455.414216
209,950
0.92675
[ [ [ "## Example 2: Sensitivity analysis on a NetLogo model with SALib\n\nThis notebook provides a more advanced example of interaction between NetLogo and a Python environment, using the SALib library (Herman & Usher, 2017; available through the pip package manager) to sample and analyze a suitable experimental design for a Sobol global sensitivity analysis. All files used in the example are available from the pyNetLogo repository at https://github.com/quaquel/pyNetLogo.", "_____no_output_____" ] ], [ [ "#Ensuring compliance of code with both python2 and python3\n\nfrom __future__ import division, print_function\ntry:\n from itertools import izip as zip\nexcept ImportError: # will be 3.x series\n pass", "_____no_output_____" ], [ "%matplotlib inline\n\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\n\nimport pyNetLogo\n\n#Import the sampling and analysis modules for a Sobol variance-based sensitivity analysis\nfrom SALib.sample import saltelli\nfrom SALib.analyze import sobol", "_____no_output_____" ] ], [ [ "SALib relies on a problem definition dictionary which contains the number of input parameters to sample, their names (which should here correspond to a NetLogo global variable), and the sampling bounds. 
Documentation for SALib can be found at https://salib.readthedocs.io/en/latest/.", "_____no_output_____" ] ], [ [ "problem = { \n 'num_vars': 6,\n 'names': ['random-seed',\n 'grass-regrowth-time',\n 'sheep-gain-from-food',\n 'wolf-gain-from-food',\n 'sheep-reproduce',\n 'wolf-reproduce'], \n 'bounds': [[1, 100000],\n [20., 40.], \n [2., 8.], \n [16., 32.],\n [2., 8.],\n [2., 8.]]\n}", "_____no_output_____" ] ], [ [ "We start by instantiating the wolf-sheep predation example model, specifying the _gui=False_ flag to run in headless mode.", "_____no_output_____" ] ], [ [ "netlogo = pyNetLogo.NetLogoLink(gui=False)\nnetlogo.load_model(r'Wolf Sheep Predation_v6.nlogo')", "_____no_output_____" ] ], [ [ "The SALib sampler will automatically generate an appropriate number of samples for Sobol analysis. To calculate first-order, second-order and total sensitivity indices, this gives a sample size of _n*(2p+2)_, where _p_ is the number of input parameters, and _n_ is a baseline sample size which should be large enough to stabilize the estimation of the indices. For this example, we use _n_ = 1000, for a total of 14000 experiments.\n\nFor more complex analyses, parallelizing the experiments can significantly improve performance. 
An additional notebook in the pyNetLogo repository demonstrates the use of the ipyparallel library; parallel processing for NetLogo models is also supported by the Exploratory Modeling Workbench (Kwakkel, 2017).", "_____no_output_____" ] ], [ [ "n = 1000\nparam_values = saltelli.sample(problem, n, calc_second_order=True)", "_____no_output_____" ] ], [ [ "The sampler generates an input array of shape (_n*(2p+2)_, _p_) with rows for each experiment and columns for each input parameter.", "_____no_output_____" ] ], [ [ "param_values.shape", "_____no_output_____" ] ], [ [ "Assuming we are interested in the mean number of sheep and wolf agents over a timeframe of 100 ticks, we first create an empty dataframe to store the results.", "_____no_output_____" ] ], [ [ "results = pd.DataFrame(columns=['Avg. sheep', 'Avg. wolves'])", "_____no_output_____" ] ], [ [ "We then simulate the model over the 14000 experiments, reading input parameters from the param_values array generated by SALib. The repeat_report command is used to track the outcomes of interest over time. \n\nTo later compare performance with the ipyparallel implementation of the analysis, we also keep track of the elapsed runtime.", "_____no_output_____" ] ], [ [ "import time\n\nt0=time.time()\n\nfor run in range(param_values.shape[0]):\n \n #Set the input parameters\n for i, name in enumerate(problem['names']):\n if name == 'random-seed':\n #The NetLogo random seed requires a different syntax\n netlogo.command('random-seed {}'.format(param_values[run,i]))\n else:\n #Otherwise, assume the input parameters are global variables\n netlogo.command('set {0} {1}'.format(name, param_values[run,i]))\n \n netlogo.command('setup')\n #Run for 100 ticks and return the number of sheep and wolf agents at each time step\n counts = netlogo.repeat_report(['count sheep','count wolves'], 100)\n \n #For each run, save the mean value of the agent counts over time\n results.loc[run, 'Avg. 
sheep'] = counts['count sheep'].values.mean()\n results.loc[run, 'Avg. wolves'] = counts['count wolves'].values.mean()\n \nelapsed=time.time()-t0 #Elapsed runtime in seconds", "_____no_output_____" ], [ "elapsed", "_____no_output_____" ] ], [ [ "The \"to_csv\" dataframe method provides a simple way of saving the results to disk.\n\nPandas supports several more advanced storage options, such as serialization with msgpack, or hierarchical HDF5 storage.", "_____no_output_____" ] ], [ [ "results.to_csv('Sobol_sequential.csv')", "_____no_output_____" ], [ "results = pd.read_csv('Sobol_sequential.csv', header=0, index_col=0)", "_____no_output_____" ], [ "results.head(5)", "_____no_output_____" ] ], [ [ "We can then proceed with the analysis, first using a histogram to visualize output distributions for each outcome:", "_____no_output_____" ] ], [ [ "sns.set_style('white')\nsns.set_context('talk')\nfig, ax = plt.subplots(1,len(results.columns), sharey=True)\n\nfor i, n in enumerate(results.columns):\n ax[i].hist(results[n], 20)\n ax[i].set_xlabel(n)\n\nax[0].set_ylabel('Counts')\n\nfig.set_size_inches(10,4)\nfig.subplots_adjust(wspace=0.1)\n#plt.savefig('JASSS figures/SA - Output distribution.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/SA - Output distribution.png', dpi=300, bbox_inches='tight')\nplt.show()", "_____no_output_____" ] ], [ [ "Bivariate scatter plots can be useful to visualize relationships between each input parameter and the outputs. Taking the outcome for the average sheep count as an example, we obtain the following, using the scipy library to calculate the Pearson correlation coefficient (r) for each parameter:", "_____no_output_____" ] ], [ [ "%matplotlib\nimport scipy\n\nnrow=2\nncol=3\nfig, ax = plt.subplots(nrow, ncol, sharey=True)\nsns.set_context('talk')\ny = results['Avg. 
sheep']\n\nfor i, a in enumerate(ax.flatten()):\n x = param_values[:,i]\n sns.regplot(x, y, ax=a, ci=None, color='k',scatter_kws={'alpha':0.2, 's':4, 'color':'gray'})\n pearson = scipy.stats.pearsonr(x, y)\n a.annotate(\"r: {:6.3f}\".format(pearson[0]), xy=(0.15, 0.85), xycoords='axes fraction',fontsize=13)\n if divmod(i,ncol)[1]>0:\n a.get_yaxis().set_visible(False)\n a.set_xlabel(problem['names'][i])\n a.set_ylim([0,1.1*np.max(y)])\n\nfig.set_size_inches(9,9,forward=True) \nfig.subplots_adjust(wspace=0.2, hspace=0.3)\n#plt.savefig('JASSS figures/SA - Scatter.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/SA - Scatter.png', dpi=300, bbox_inches='tight')\nplt.show()", "_____no_output_____" ] ], [ [ "This indicates a positive relationship between the \"sheep-gain-from-food\" parameter and the mean sheep count, and negative relationships for the \"wolf-gain-from-food\" and \"wolf-reproduce\" parameters.\n\nWe can then use SALib to calculate first-order (S1), second-order (S2) and total (ST) Sobol indices, to estimate each input's contribution to output variance. By default, 95% confidence intervals are estimated for each index.", "_____no_output_____" ] ], [ [ "Si = sobol.analyze(problem, results['Avg. 
sheep'].values, calc_second_order=True, print_to_console=False)", "_____no_output_____" ] ], [ [ "As a simple example, we first select and visualize the first-order and total indices for each input, converting the dictionary returned by SALib to a dataframe.", "_____no_output_____" ] ], [ [ "Si_filter = {k:Si[k] for k in ['ST','ST_conf','S1','S1_conf']}\nSi_df = pd.DataFrame(Si_filter, index=problem['names'])", "_____no_output_____" ], [ "Si_df", "_____no_output_____" ], [ "sns.set_style('white')\nfig, ax = plt.subplots(1)\n\nindices = Si_df[['S1','ST']]\nerr = Si_df[['S1_conf','ST_conf']]\n\nindices.plot.bar(yerr=err.values.T,ax=ax)\nfig.set_size_inches(8,4)\n\n#plt.savefig('JASSS figures/SA - Indices.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/SA - Indices.png', dpi=300, bbox_inches='tight')\n\nplt.show()", "_____no_output_____" ] ], [ [ "The \"sheep-gain-from-food\" parameter has the highest ST index, indicating that it contributes over 50% of output variance when accounting for interactions with other parameters. However, it can be noted that the confidence bounds are overly broad due to the small _n_ value used for sampling, so that a larger sample would be required for reliable results. 
For instance, the S1 index is estimated to be larger than ST for the \"random-seed\" parameter, which is an artifact of the small sample size.\n\nWe can use a more sophisticated visualization to include the second-order interactions between inputs.", "_____no_output_____" ] ], [ [ "import itertools\nfrom math import pi\n\n\ndef normalize(x, xmin, xmax):\n return (x-xmin)/(xmax-xmin)\n\n\ndef plot_circles(ax, locs, names, max_s, stats, smax, smin, fc, ec, lw, \n zorder):\n s = np.asarray([stats[name] for name in names])\n s = 0.01 + max_s * np.sqrt(normalize(s, smin, smax))\n \n fill = True\n for loc, name, si in zip(locs, names, s):\n if fc=='w':\n fill=False\n else:\n ec='none'\n \n x = np.cos(loc)\n y = np.sin(loc)\n \n circle = plt.Circle((x,y), radius=si, ec=ec, fc=fc, transform=ax.transData._b,\n zorder=zorder, lw=lw, fill=True)\n ax.add_artist(circle)\n \n\ndef filter(sobol_indices, names, locs, criterion, threshold):\n if criterion in ['ST', 'S1', 'S2']:\n data = sobol_indices[criterion]\n data = np.abs(data)\n data = data.flatten() # flatten in case of S2\n # TODO:: remove nans\n \n filtered = ([(name, locs[i]) for i, name in enumerate(names) if \n data[i]>threshold])\n filtered_names, filtered_locs = zip(*filtered)\n elif criterion in ['ST_conf', 'S1_conf', 'S2_conf']:\n raise NotImplementedError\n else:\n raise ValueError('unknown value for criterion')\n\n return filtered_names, filtered_locs\n\n\ndef plot_sobol_indices(sobol_indices, criterion='ST', threshold=0.01):\n '''plot sobol indices on a radial plot\n \n Parameters\n ----------\n sobol_indices : dict\n the return from SAlib\n criterion : {'ST', 'S1', 'S2', 'ST_conf', 'S1_conf', 'S2_conf'}, optional\n threshold : float\n only visualize variables with criterion larger than cutoff\n \n '''\n max_linewidth_s2 = 15#25*1.8\n max_s_radius = 0.3\n \n # prepare data\n # use the absolute values of all the indices\n #sobol_indices = {key:np.abs(stats) for key, stats in sobol_indices.items()}\n \n # dataframe 
with ST and S1\n sobol_stats = {key:sobol_indices[key] for key in ['ST', 'S1']}\n sobol_stats = pd.DataFrame(sobol_stats, index=problem['names'])\n\n smax = sobol_stats.max().max()\n smin = sobol_stats.min().min()\n\n # dataframe with s2\n s2 = pd.DataFrame(sobol_indices['S2'], index=problem['names'], \n columns=problem['names'])\n s2[s2<0.0]=0. #Set negative values to 0 (artifact from small sample sizes)\n s2max = s2.max().max()\n s2min = s2.min().min()\n\n names = problem['names']\n n = len(names)\n ticklocs = np.linspace(0, 2*pi, n+1)\n locs = ticklocs[0:-1]\n\n filtered_names, filtered_locs = filter(sobol_indices, names, locs,\n criterion, threshold)\n \n # setup figure\n fig = plt.figure()\n ax = fig.add_subplot(111, polar=True)\n ax.grid(False)\n ax.spines['polar'].set_visible(False)\n ax.set_xticks(ticklocs)\n\n ax.set_xticklabels(names)\n ax.set_yticklabels([])\n ax.set_ylim(ymax=1.4)\n legend(ax)\n\n # plot ST\n plot_circles(ax, filtered_locs, filtered_names, max_s_radius, \n sobol_stats['ST'], smax, smin, 'w', 'k', 1, 9)\n\n # plot S1\n plot_circles(ax, filtered_locs, filtered_names, max_s_radius, \n sobol_stats['S1'], smax, smin, 'k', 'k', 1, 10)\n\n # plot S2\n for name1, name2 in itertools.combinations(zip(filtered_names, filtered_locs), 2):\n name1, loc1 = name1\n name2, loc2 = name2\n\n weight = s2.ix[name1, name2]\n lw = 0.5+max_linewidth_s2*normalize(weight, s2min, s2max)\n ax.plot([loc1, loc2], [1,1], c='darkgray', lw=lw, zorder=1)\n\n return fig\n\n\nfrom matplotlib.legend_handler import HandlerPatch\nclass HandlerCircle(HandlerPatch):\n def create_artists(self, legend, orig_handle,\n xdescent, ydescent, width, height, fontsize, trans):\n center = 0.5 * width - 0.5 * xdescent, 0.5 * height - 0.5 * ydescent\n p = plt.Circle(xy=center, radius=orig_handle.radius)\n self.update_prop(p, orig_handle, legend)\n p.set_transform(trans)\n return [p]\n\ndef legend(ax):\n some_identifiers = [plt.Circle((0,0), radius=5, color='k', fill=False, lw=1),\n 
plt.Circle((0,0), radius=5, color='k', fill=True),\n plt.Line2D([0,0.5], [0,0.5], lw=8, color='darkgray')]\n ax.legend(some_identifiers, ['ST', 'S1', 'S2'],\n loc=(1,0.75), borderaxespad=0.1, mode='expand',\n handler_map={plt.Circle: HandlerCircle()})\n\n\nsns.set_style('whitegrid')\nfig = plot_sobol_indices(Si, criterion='ST', threshold=0.005)\nfig.set_size_inches(7,7)\n#plt.savefig('JASSS figures/Figure 8 - Interactions.pdf', bbox_inches='tight')\n#plt.savefig('JASSS figures/Figure 8 - Interactions.png', dpi=300, bbox_inches='tight')\nplt.show()", "_____no_output_____" ] ], [ [ "In this case, the sheep-gain-from-food variable has strong interactions with the wolf-gain-from-food and sheep-reproduce inputs in particular. The size of the ST and S1 circles correspond to the normalized variable importances.", "_____no_output_____" ], [ "Finally, the kill_workspace() function shuts down the NetLogo instance.", "_____no_output_____" ] ], [ [ "netlogo.kill_workspace()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ] ]
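The pyNetLogo record above states that the Saltelli sampler generates _n*(2p+2)_ experiments when second-order indices are requested, giving 14000 runs for _p_ = 6 inputs and _n_ = 1000. As a quick sanity check of that arithmetic, here is a minimal sketch independent of SALib itself; the helper name `saltelli_sample_size` is illustrative and not part of SALib's API:

```python
def saltelli_sample_size(n, num_vars, calc_second_order=True):
    """Number of model runs the Saltelli design requires.

    With second-order indices the design needs n*(2p+2) runs;
    without them it shrinks to n*(p+2).
    """
    if calc_second_order:
        return n * (2 * num_vars + 2)
    return n * (num_vars + 2)

# The wolf-sheep example: p = 6 inputs, baseline n = 1000 -> 14000 runs.
print(saltelli_sample_size(1000, 6))                          # 14000
print(saltelli_sample_size(1000, 6, calc_second_order=False)) # 8000
```

Dropping `calc_second_order` nearly halves the experiment count, which is why the notebook's choice to keep it doubles the simulation budget.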
d06165fcd0eb33f2d42dced43cf819c2024d6dbc
698,867
ipynb
Jupyter Notebook
Tennis_Time_Data_Visualization.ipynb
Tinzyl/Tennis_Time_Data_Visualization
761964f37a7f524edf708a1174d9ee8f73334889
[ "MIT" ]
null
null
null
Tennis_Time_Data_Visualization.ipynb
Tinzyl/Tennis_Time_Data_Visualization
761964f37a7f524edf708a1174d9ee8f73334889
[ "MIT" ]
null
null
null
Tennis_Time_Data_Visualization.ipynb
Tinzyl/Tennis_Time_Data_Visualization
761964f37a7f524edf708a1174d9ee8f73334889
[ "MIT" ]
null
null
null
107.783313
61,440
0.756047
[ [ [ "import pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt", "_____no_output_____" ], [ "players_time = pd.read_csv(\"players_time.csv\")", "_____no_output_____" ], [ "events_time = pd.read_csv(\"events_time.csv\")", "_____no_output_____" ], [ "serve_time = pd.read_csv(\"serve_times.csv\")", "_____no_output_____" ], [ "players_time", "_____no_output_____" ], [ "events_time", "_____no_output_____" ], [ "pd.options.display.max_rows = None", "_____no_output_____" ], [ "events_time", "_____no_output_____" ], [ "serve_time", "_____no_output_____" ] ], [ [ "## 1. Visualize The 10 Most Slow Players ", "_____no_output_____" ] ], [ [ "most_slow_Players = players_time[players_time[\"seconds_added_per_point\"] > 0].sort_values(by=\"seconds_added_per_point\", ascending=False).head(10)", "_____no_output_____" ], [ "most_slow_Players", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"player\", data=most_slow_Players)\nax.set_title(\"TOP 10 MOST SLOW PLAYERS\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Players\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "## 2. Visualize The 10 Most Fast Players", "_____no_output_____" ] ], [ [ "most_fast_Players = players_time[players_time[\"seconds_added_per_point\"] < 0].sort_values(by=\"seconds_added_per_point\").head(10)", "_____no_output_____" ], [ "most_fast_Players", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"player\", data=most_fast_Players)\nax.set_title(\"TOP 10 MOST FAST PLAYERS\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Players\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "## 3. 
Visualize The Time Of The Big 3", "_____no_output_____" ] ], [ [ "big_three_time = players_time[(players_time[\"player\"] == \"Novak Djokovic\") | (players_time[\"player\"] == \"Roger Federer\") | (players_time[\"player\"] == \"Rafael Nadal\")]", "_____no_output_____" ], [ "big_three_time", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"player\", data=big_three_time)\nax.set_title(\"TIME OF THE BIG THREE\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Players\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "## 4. Figure Out The Top 10 Surfaces That Take The Longest Time", "_____no_output_____" ] ], [ [ "longest_time_surfaces = events_time[events_time[\"seconds_added_per_point\"] > 0].sort_values(by=\"seconds_added_per_point\", ascending=False).head(10)", "_____no_output_____" ], [ "longest_time_surfaces", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.barplot(x=\"seconds_added_per_point\", y=\"tournament\", hue=\"surface\", data=longest_time_surfaces)\nax.set_title(\"TOP 10 SURFACES THAT TAKE THE LONGEST TIME\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Tournament\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "## 5. 
Figure Out The Top 10 Surfaces That Take The Shortest Time", "_____no_output_____" ] ], [ [ "shortest_time_surfaces = events_time[events_time[\"seconds_added_per_point\"] < 0].sort_values(by=\"seconds_added_per_point\").head(10)", "_____no_output_____" ], [ "shortest_time_surfaces", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax = sns.barplot(x=\"seconds_added_per_point\", y=\"tournament\", hue=\"surface\", data=shortest_time_surfaces)\nax.set_title(\"TOP 10 SURFACES THAT TAKE THE SHORTEST TIME\", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Tournament\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "## 6. Figure Out How The Time For The Clay Surface Has Progressed Throughout The Years", "_____no_output_____" ] ], [ [ "years = events_time[~events_time[\"years\"].str.contains(\"-\")]\nsorted_years_clay = years[years[\"surface\"] == \"Clay\"].sort_values(by=\"years\")", "_____no_output_____" ], [ "sorted_years_clay", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_clay)\nax.set_title(\"PROGRESSION OF TIME FOR THE CLAY SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ", "_____no_output_____" ] ], [ [ "## 7. 
Figure Out How The Time For The Hard Surface Has Progressed Throughout The Years", "_____no_output_____" ] ], [ [ "sorted_years_hard = years[years[\"surface\"] == \"Hard\"].sort_values(by=\"years\")", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_hard)\nax.set_title(\"PROGRESSION OF TIME FOR THE HARD SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ", "_____no_output_____" ] ], [ [ "## 8. Figure Out How The Time For The Carpet Surface Has Progressed Throughout The Years", "_____no_output_____" ] ], [ [ "sorted_years_carpet = years[years[\"surface\"] == \"Carpet\"].sort_values(by=\"years\")", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_carpet)\nax.set_title(\"PROGRESSION OF TIME FOR THE CARPET SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ", "_____no_output_____" ] ], [ [ "## 9. Figure Out How The Time For The Grass Surface Has Progressed Throughout The Years", "_____no_output_____" ] ], [ [ "sorted_years_grass = events_time[events_time[\"surface\"] == \"Grass\"].sort_values(by=\"years\").head(5)", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax= sns.lineplot(x=\"years\", y=\"seconds_added_per_point\", hue=\"surface\", data=sorted_years_grass)\nax.set_title(\"PROGRESSION OF TIME FOR THE GRASS SURFACE THROUGHOUT THE YEARS\", fontsize=17)\nplt.xlabel(\"Years\", fontsize=17)\nplt.ylabel(\"Seconds\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show() ", "_____no_output_____" ] ], [ [ "## 10. 
Figure Out The Person Who Took The Most Time Serving In 2015", "_____no_output_____" ] ], [ [ "serve_time", "_____no_output_____" ], [ "serve_time_visualization = serve_time.groupby(\"server\")[\"seconds_before_next_point\"].agg(\"sum\")", "_____no_output_____" ], [ "serve_time_visualization", "_____no_output_____" ], [ "serve_time_visual_data = serve_time_visualization.reset_index()", "_____no_output_____" ], [ "serve_time_visual_data", "_____no_output_____" ], [ "serve_time_visual_sorted = serve_time_visual_data.sort_values(by=\"seconds_before_next_point\", ascending = False)", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax = sns.barplot(x=\"seconds_before_next_point\", y=\"server\", data=serve_time_visual_sorted)\nax.set_title(\"PLAYERS TOTAL SERVING TIME(2015) \", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Player\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "### BIG THREE TOTAL SERVING TIME IN 2015", "_____no_output_____" ] ], [ [ "big_three_total_serving_time = serve_time_visual_sorted[(serve_time_visual_sorted[\"server\"] == \"Roger Federer\") | (serve_time_visual_sorted[\"server\"] == \"Rafael Nadal\") | (serve_time_visual_sorted[\"server\"] == \"Novak Djokovic\")]", "_____no_output_____" ], [ "big_three_total_serving_time", "_____no_output_____" ], [ "sns.set(style=\"darkgrid\")\nplt.figure(figsize = (10,5))\nax = sns.barplot(x=\"seconds_before_next_point\", y=\"server\", data=big_three_total_serving_time)\nax.set_title(\"BIG THREE TOTAL SERVING TIME(2015) \", fontsize=17)\nplt.xlabel(\"Seconds\", fontsize=17)\nplt.ylabel(\"Player\", fontsize=17)\nplt.yticks(size=17)\nplt.xticks(size=17)\nplt.show()", "_____no_output_____" ] ], [ [ "## Conclusion", "_____no_output_____" ], [ "### Matches are short when they are played on a Grass, Carpet or Hard Surface. Grass however has proved to let matches be way more short compared to the other 2. 
\n\n### Clay Surfaces have proved to make matches last considerably longer. \n\n### In 2015, among the Big Three, Novak Djokovic took the shortest time serving, followed by Rafael Nadal, while Roger Federer took the longest. Over the years as a whole, however, Roger Federer has had the shortest serving time, followed by Novak Djokovic, and Rafael Nadal has had the longest, making the matches he is involved in last longer.", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ] ]
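The serving-time cells in the tennis record above rely on the groupby → aggregate → `reset_index` → `sort_values` pattern to rank players by total serving time. A self-contained sketch of that same pattern on toy data — the player labels and second counts here are made up for illustration, not taken from `serve_times.csv`:

```python
import pandas as pd

# Toy stand-in for the serve_times.csv data used in the notebook.
serves = pd.DataFrame({
    "server": ["A", "B", "A", "B", "C"],
    "seconds_before_next_point": [18.0, 21.5, 20.0, 19.5, 25.0],
})

# Total serving time per player, back as a regular DataFrame ready for
# seaborn's barplot (which wants plain columns, not a grouped index).
totals = (serves.groupby("server")["seconds_before_next_point"]
                .sum()
                .reset_index()
                .sort_values("seconds_before_next_point", ascending=False))

print(totals.iloc[0]["server"])  # "B": 21.5 + 19.5 = 41.0, the largest total
```

`reset_index` is what turns the grouped Series back into the two-column frame the notebook passes to `sns.barplot` as `x` and `y`.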
d06168f1a481bfd64b39e2c63dac1b24eb7a07b8
125,628
ipynb
Jupyter Notebook
notebooks/old/Salt_9-resne34-with-highLR-derper.ipynb
GilesStrong/Kaggle_TGS-Salt
b47a468ee464581f1b843fdf3bc1230222982277
[ "Apache-2.0" ]
null
null
null
notebooks/old/Salt_9-resne34-with-highLR-derper.ipynb
GilesStrong/Kaggle_TGS-Salt
b47a468ee464581f1b843fdf3bc1230222982277
[ "Apache-2.0" ]
null
null
null
notebooks/old/Salt_9-resne34-with-highLR-derper.ipynb
GilesStrong/Kaggle_TGS-Salt
b47a468ee464581f1b843fdf3bc1230222982277
[ "Apache-2.0" ]
null
null
null
49.111806
26,436
0.587003
[ [ [ "%matplotlib inline\n%reload_ext autoreload\n%autoreload 2", "_____no_output_____" ], [ "from fastai.conv_learner import *\nfrom fastai.dataset import *\nfrom fastai.models.resnet import vgg_resnet50\n\nimport json", "_____no_output_____" ], [ "#torch.cuda.set_device(2)", "_____no_output_____" ], [ "torch.backends.cudnn.benchmark=True", "_____no_output_____" ] ], [ [ "## Data", "_____no_output_____" ] ], [ [ "PATH = Path('/home/giles/Downloads/fastai_data/salt/')\nMASKS_FN = 'train_masks.csv'\nMETA_FN = 'metadata.csv'\nmasks_csv = pd.read_csv(PATH/MASKS_FN)\nmeta_csv = pd.read_csv(PATH/META_FN)", "_____no_output_____" ], [ "def show_img(im, figsize=None, ax=None, alpha=None):\n if not ax: fig,ax = plt.subplots(figsize=figsize)\n ax.imshow(im, alpha=alpha)\n ax.set_axis_off()\n return ax", "_____no_output_____" ], [ "(PATH/'train_masks-128').mkdir(exist_ok=True)", "_____no_output_____" ], [ "def resize_img(fn):\n Image.open(fn).resize((128,128)).save((fn.parent.parent)/'train_masks-128'/fn.name)\n\nfiles = list((PATH/'train_masks').iterdir())\nwith ThreadPoolExecutor(8) as e: e.map(resize_img, files)", "_____no_output_____" ], [ "(PATH/'train-128').mkdir(exist_ok=True)", "_____no_output_____" ], [ "def resize_img(fn):\n Image.open(fn).resize((128,128)).save((fn.parent.parent)/'train-128'/fn.name)\n\nfiles = list((PATH/'train').iterdir())\nwith ThreadPoolExecutor(8) as e: e.map(resize_img, files)", "_____no_output_____" ], [ "TRAIN_DN = 'train-128'\nMASKS_DN = 'train_masks-128'\nsz = 32\nbs = 64\nnw = 16", "_____no_output_____" ] ], [ [ "TRAIN_DN = 'train'\nMASKS_DN = 'train_masks_png'\nsz = 128\nbs = 64\nnw = 16", "_____no_output_____" ] ], [ [ "class MatchedFilesDataset(FilesDataset):\n def __init__(self, fnames, y, transform, path):\n self.y=y\n assert(len(fnames)==len(y))\n super().__init__(fnames, transform, path)\n def get_y(self, i): return open_image(os.path.join(self.path, self.y[i]))\n def get_c(self): return 0", "_____no_output_____" ], [ "x_names = 
np.array(glob(f'{PATH}/{TRAIN_DN}/*'))\ny_names = np.array(glob(f'{PATH}/{MASKS_DN}/*'))", "_____no_output_____" ], [ "val_idxs = list(range(800))\n((val_x,trn_x),(val_y,trn_y)) = split_by_idx(val_idxs, x_names, y_names)", "_____no_output_____" ], [ "aug_tfms = [RandomFlip(tfm_y=TfmType.CLASS)]", "_____no_output_____" ], [ "tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm", "_____no_output_____" ], [ "x,y = next(iter(md.trn_dl))", "_____no_output_____" ], [ "x.shape,y.shape", "_____no_output_____" ], [ "denorm = md.val_ds.denorm", "_____no_output_____" ], [ "def show_aug_img(ims, idx, figsize=(5,5), normed=True, ax=None, nchannels=3):\n if ax is None: fig,ax = plt.subplots(figsize=figsize)\n if normed: ims = denorm(ims)\n else: ims = np.rollaxis(to_np(ims),1,nchannels+1)\n ax.imshow(np.clip(ims,0,1)[idx])\n ax.axis('off')", "_____no_output_____" ], [ "batches = [next(iter(md.aug_dl)) for i in range(9)]", "_____no_output_____" ], [ "fig, axes = plt.subplots(3, 6, figsize=(18, 9))\nfor i,(x,y) in enumerate(batches):\n show_aug_img(x,1, ax=axes.flat[i*2])\n show_aug_img(y,1, ax=axes.flat[i*2+1], nchannels=1, normed=False)", "_____no_output_____" ] ], [ [ "## Simple upsample", "_____no_output_____" ] ], [ [ "f = resnet34\ncut,lr_cut = model_meta[f]", "_____no_output_____" ], [ "def get_base():\n layers = cut_model(f(True), cut)\n return nn.Sequential(*layers)", "_____no_output_____" ], [ "def dice(pred, targs):\n pred = (pred>0.5).float()\n return 2. 
* (pred*targs).sum() / (pred+targs).sum()", "_____no_output_____" ] ], [ [ "## U-net (ish)", "_____no_output_____" ] ], [ [ "class SaveFeatures():\n features=None\n def __init__(self, m): self.hook = m.register_forward_hook(self.hook_fn)\n def hook_fn(self, module, input, output): self.features = output\n def remove(self): self.hook.remove()", "_____no_output_____" ], [ "class UnetBlock(nn.Module):\n def __init__(self, up_in, x_in, n_out):\n super().__init__()\n up_out = x_out = n_out//2\n self.x_conv = nn.Conv2d(x_in, x_out, 1)\n self.tr_conv = nn.ConvTranspose2d(up_in, up_out, 2, stride=2)\n self.bn = nn.BatchNorm2d(n_out)\n \n def forward(self, up_p, x_p):\n up_p = self.tr_conv(up_p)\n x_p = self.x_conv(x_p)\n cat_p = torch.cat([up_p,x_p], dim=1)\n return self.bn(F.relu(cat_p))", "_____no_output_____" ], [ "class Unet34(nn.Module):\n def __init__(self, rn):\n super().__init__()\n self.rn = rn\n self.sfs = [SaveFeatures(rn[i]) for i in [2,4,5,6]]\n self.up1 = UnetBlock(512,256,256)\n self.up2 = UnetBlock(256,128,256)\n self.up3 = UnetBlock(256,64,256)\n self.up4 = UnetBlock(256,64,256)\n self.up5 = UnetBlock(256,3,16)\n self.up6 = nn.ConvTranspose2d(16, 1, 1)\n \n def forward(self,x):\n inp = x\n x = F.relu(self.rn(x))\n x = self.up1(x, self.sfs[3].features)\n x = self.up2(x, self.sfs[2].features)\n x = self.up3(x, self.sfs[1].features)\n x = self.up4(x, self.sfs[0].features)\n x = self.up5(x, inp)\n x = self.up6(x)\n return x[:,0]\n \n def close(self):\n for sf in self.sfs: sf.remove()", "_____no_output_____" ], [ "class UnetModel():\n def __init__(self,model,name='unet'):\n self.model,self.name = model,name\n\n def get_layer_groups(self, precompute):\n lgs = list(split_by_idxs(children(self.model.rn), [lr_cut]))\n return lgs + [children(self.model)[1:]]", "_____no_output_____" ], [ "m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)", "_____no_output_____" ], [ "learn = ConvLearner(md, 
models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]", "_____no_output_____" ], [ "learn.summary()", "_____no_output_____" ], [ "[o.features.size() for o in m.sfs]", "_____no_output_____" ], [ "learn.freeze_to(1)", "_____no_output_____" ], [ "learn.lr_find()\nlearn.sched.plot()", "_____no_output_____" ], [ "lr=1e-2\nwd=1e-7\n\nlrs = np.array([lr/9,lr/3,lr])", "_____no_output_____" ], [ "learn.fit(lr,1,wds=wd,cycle_len=10,use_clr=(5,8))", "_____no_output_____" ], [ "learn.save('32urn-tmp')", "_____no_output_____" ], [ "learn.load('32urn-tmp')", "_____no_output_____" ], [ "learn.unfreeze()\nlearn.bn_freeze(True)", "_____no_output_____" ], [ "learn.fit(lrs/4, 1, wds=wd, cycle_len=20,use_clr=(20,10))", "_____no_output_____" ], [ "learn.sched.plot_lr()", "_____no_output_____" ], [ "learn.save('32urn-0')", "_____no_output_____" ], [ "learn.load('32urn-0')", "_____no_output_____" ], [ "x,y = next(iter(md.val_dl))\npy = to_np(learn.model(V(x)))", "_____no_output_____" ], [ "show_img(py[0]>0.5);", "_____no_output_____" ], [ "show_img(y[0]);", "_____no_output_____" ], [ "show_img(x[0][0]);", "_____no_output_____" ], [ "m.close()", "_____no_output_____" ] ], [ [ "## 64x64", "_____no_output_____" ] ], [ [ "sz=64\nbs=64", "_____no_output_____" ], [ "tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm", "_____no_output_____" ], [ "m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)", "_____no_output_____" ], [ "learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]", "_____no_output_____" ], [ "learn.freeze_to(1)", "_____no_output_____" ], [ "learn.load('32urn-0')", "_____no_output_____" ], 
[ "learn.fit(lr/2,1,wds=wd, cycle_len=10,use_clr=(10,10))", "_____no_output_____" ], [ "learn.sched.plot_lr()", "_____no_output_____" ], [ "learn.save('64urn-tmp')", "_____no_output_____" ], [ "learn.unfreeze()\nlearn.bn_freeze(True)", "_____no_output_____" ], [ "learn.load('64urn-tmp')", "_____no_output_____" ], [ "learn.fit(lrs/4,1,wds=wd, cycle_len=8,use_clr=(20,8))", "_____no_output_____" ], [ "learn.sched.plot_lr()", "_____no_output_____" ], [ "learn.save('64urn')", "_____no_output_____" ], [ "learn.load('64urn')", "_____no_output_____" ], [ "x,y = next(iter(md.val_dl))\npy = to_np(learn.model(V(x)))", "_____no_output_____" ], [ "show_img(py[0]>0.5);", "_____no_output_____" ], [ "show_img(y[0]);", "_____no_output_____" ], [ "show_img(x[0][0]);", "_____no_output_____" ], [ "m.close()", "_____no_output_____" ] ], [ [ "## 128x128", "_____no_output_____" ] ], [ [ "sz=128\nbs=64", "_____no_output_____" ], [ "tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm", "_____no_output_____" ], [ "m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)", "_____no_output_____" ], [ "learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]", "_____no_output_____" ], [ "learn.load('64urn')", "_____no_output_____" ], [ "learn.fit(lr/2,1, wds=wd, cycle_len=6,use_clr=(6,4))", "_____no_output_____" ], [ "learn.save('128urn-tmp')", "_____no_output_____" ], [ "learn.load('128urn-tmp')", "_____no_output_____" ], [ "learn.unfreeze()\nlearn.bn_freeze(True)", "_____no_output_____" ], [ "#lrs = np.array([lr/200,lr/30,lr])", "_____no_output_____" ], [ "learn.fit(lrs/5,1, wds=wd,cycle_len=8,use_clr=(20,8))", "_____no_output_____" ], [ "learn.sched.plot_lr()", 
"_____no_output_____" ], [ "learn.sched.plot_loss()", "_____no_output_____" ], [ "learn.save('128urn')", "_____no_output_____" ], [ "learn.load('128urn')", "_____no_output_____" ], [ "x,y = next(iter(md.val_dl))\npy = to_np(learn.model(V(x)))", "_____no_output_____" ], [ "show_img(py[0]>0.5);", "_____no_output_____" ], [ "show_img(y[0]);", "_____no_output_____" ], [ "show_img(x[0][0]);", "_____no_output_____" ], [ "y.shape", "_____no_output_____" ], [ "batches = [next(iter(md.aug_dl)) for i in range(9)]", "_____no_output_____" ], [ "fig, axes = plt.subplots(3, 6, figsize=(18, 9))\nfor i,(x,y) in enumerate(batches):\n show_aug_img(x,1, ax=axes.flat[i*2])\n show_aug_img(y,1, ax=axes.flat[i*2+1], nchannels=1, normed=False)", "_____no_output_____" ] ], [ [ "# Test on original validation", "_____no_output_____" ] ], [ [ "x_names_orig = np.array(glob(f'{PATH}/train/*'))\ny_names_orig = np.array(glob(f'{PATH}/train_masks/*'))", "_____no_output_____" ], [ "val_idxs_orig = list(range(800))\n((val_x_orig,trn_x_orig),(val_y_orig,trn_y_orig)) = split_by_idx(val_idxs_orig, x_names_orig, y_names_orig)", "_____no_output_____" ], [ "sz=128\nbs=64", "_____no_output_____" ], [ "tfms = tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(MatchedFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm", "_____no_output_____" ], [ "m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)", "_____no_output_____" ], [ "learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]", "_____no_output_____" ], [ "learn.load('128urn')", "_____no_output_____" ], [ "probs = learn.predict()", "_____no_output_____" ], [ "probs.shape", "_____no_output_____" ], [ "_, y = learn.TTA(n_aug=1)", "_____no_output_____" ], [ "y.shape", "_____no_output_____" 
], [ "idx=0", "_____no_output_____" ], [ "show_img(probs[idx]>0.5);", "_____no_output_____" ], [ "show_img(probs[idx]);", "_____no_output_____" ], [ "show_img(y[idx]);", "_____no_output_____" ], [ "show_img(x[idx][0]);", "_____no_output_____" ] ], [ [ "# Optimise threshold", "_____no_output_____" ] ], [ [ "# src: https://www.kaggle.com/aglotero/another-iou-metric\ndef iou_metric(y_true_in, y_pred_in, print_table=False):\n labels = y_true_in\n y_pred = y_pred_in\n \n true_objects = 2\n pred_objects = 2\n\n intersection = np.histogram2d(labels.flatten(), y_pred.flatten(), bins=(true_objects, pred_objects))[0]\n\n # Compute areas (needed for finding the union between all objects)\n area_true = np.histogram(labels, bins = true_objects)[0]\n area_pred = np.histogram(y_pred, bins = pred_objects)[0]\n area_true = np.expand_dims(area_true, -1)\n area_pred = np.expand_dims(area_pred, 0)\n\n # Compute union\n union = area_true + area_pred - intersection\n\n # Exclude background from the analysis\n intersection = intersection[1:,1:]\n union = union[1:,1:]\n union[union == 0] = 1e-9\n\n # Compute the intersection over union\n iou = intersection / union\n\n # Precision helper function\n def precision_at(threshold, iou):\n matches = iou > threshold\n true_positives = np.sum(matches, axis=1) == 1 # Correct objects\n false_positives = np.sum(matches, axis=0) == 0 # Missed objects\n false_negatives = np.sum(matches, axis=1) == 0 # Extra objects\n tp, fp, fn = np.sum(true_positives), np.sum(false_positives), np.sum(false_negatives)\n return tp, fp, fn\n\n # Loop over IoU thresholds\n prec = []\n if print_table:\n print(\"Thresh\\tTP\\tFP\\tFN\\tPrec.\")\n for t in np.arange(0.5, 1.0, 0.05):\n tp, fp, fn = precision_at(t, iou)\n if (tp + fp + fn) > 0:\n p = tp / (tp + fp + fn)\n else:\n p = 0\n if print_table:\n print(\"{:1.3f}\\t{}\\t{}\\t{}\\t{:1.3f}\".format(t, tp, fp, fn, p))\n prec.append(p)\n \n if print_table:\n print(\"AP\\t-\\t-\\t-\\t{:1.3f}\".format(np.mean(prec)))\n 
return np.mean(prec)\n\ndef iou_metric_batch(y_true_in, y_pred_in):\n batch_size = y_true_in.shape[0]\n metric = []\n for batch in range(batch_size):\n value = iou_metric(y_true_in[batch], y_pred_in[batch])\n metric.append(value)\n return np.mean(metric)", "_____no_output_____" ], [ "thres = np.linspace(-1, 1, 10)\nthres_ioc = [iou_metric_batch(y, np.int32(probs > t)) for t in tqdm_notebook(thres)]", "_____no_output_____" ], [ "plt.plot(thres, thres_ioc);", "_____no_output_____" ], [ "best_thres = thres[np.argmax(thres_ioc)]\nbest_thres, max(thres_ioc)", "_____no_output_____" ], [ "thres = np.linspace(-0.5, 0.5, 50)\nthres_ioc = [iou_metric_batch(y, np.int32(probs > t)) for t in tqdm_notebook(thres)]", "_____no_output_____" ], [ "plt.plot(thres, thres_ioc);", "_____no_output_____" ], [ "best_thres = thres[np.argmax(thres_ioc)]\nbest_thres, max(thres_ioc)", "_____no_output_____" ], [ "show_img(probs[0]>best_thres);", "_____no_output_____" ] ], [ [ "# Run on test", "_____no_output_____" ] ], [ [ "(PATH/'test-128').mkdir(exist_ok=True)", "_____no_output_____" ], [ "def resize_img(fn):\n Image.open(fn).resize((128,128)).save((fn.parent.parent)/'test-128'/fn.name)\n\nfiles = list((PATH/'test').iterdir())\nwith ThreadPoolExecutor(8) as e: e.map(resize_img, files)", "_____no_output_____" ], [ "testData = np.array(glob(f'{PATH}/test-128/*'))", "_____no_output_____" ], [ "class TestFilesDataset(FilesDataset):\n def __init__(self, fnames, y, transform, path):\n self.y=y\n assert(len(fnames)==len(y))\n super().__init__(fnames, transform, path)\n def get_y(self, i): return open_image(os.path.join(self.path, self.fnames[i]))\n def get_c(self): return 0", "_____no_output_____" ], [ "tfms_from_model(resnet34, sz, crop_type=CropType.NO, tfm_y=TfmType.CLASS, aug_tfms=aug_tfms)\ndatasets = ImageData.get_ds(TestFilesDataset, (trn_x,trn_y), (val_x,val_y), tfms, test=testData, path=PATH)\nmd = ImageData(PATH, datasets, bs, num_workers=16, classes=None)\ndenorm = md.trn_ds.denorm", 
"_____no_output_____" ], [ "m_base = get_base()\nm = to_gpu(Unet34(m_base))\nmodels = UnetModel(m)", "_____no_output_____" ], [ "learn = ConvLearner(md, models)\nlearn.opt_fn=optim.Adam\nlearn.crit=nn.BCEWithLogitsLoss()\nlearn.metrics=[accuracy_thresh(0.5),dice]", "_____no_output_____" ], [ "learn.load('128urn')", "_____no_output_____" ], [ "x,y = next(iter(md.test_dl))\npy = to_np(learn.model(V(x)))", "_____no_output_____" ], [ "show_img(py[6]>best_thres);", "_____no_output_____" ], [ "show_img(py[6]);", "_____no_output_____" ], [ "show_img(y[6]);", "_____no_output_____" ], [ "probs = learn.predict(is_test=True)", "_____no_output_____" ], [ "show_img(probs[12]>best_thres);", "_____no_output_____" ], [ "show_img(probs[12]);", "_____no_output_____" ], [ "show_img(y[12]);", "_____no_output_____" ], [ "show_img(x[12][0]);", "_____no_output_____" ], [ "with open(f'{PATH}/probs.pkl', 'wb') as fout: #Save results\n pickle.dump(probs, fout)", "_____no_output_____" ], [ "probs.shape", "_____no_output_____" ], [ "def resize_img(fn):\n return np.array(Image.fromarray(fn).resize((101,101)))\n\nresizePreds = np.array([resize_img(x) for x in probs])", "_____no_output_____" ], [ "resizePreds.shape", "_____no_output_____" ], [ "show_img(resizePreds[12]);", "_____no_output_____" ], [ "testData", "_____no_output_____" ], [ "f'{PATH}/test'", "_____no_output_____" ], [ "test_ids = next(os.walk(f'{PATH}/test'))[2]", "_____no_output_____" ], [ "def RLenc(img, order='F', format=True):\n \"\"\"\n img is binary mask image, shape (r,c)\n order is down-then-right, i.e. 
Fortran\n format determines if the order needs to be preformatted (according to submission rules) or not\n\n returns run length as an array or string (if format is True)\n \"\"\"\n bytes = img.reshape(img.shape[0] * img.shape[1], order=order)\n runs = [] ## list of run lengths\n r = 0 ## the current run length\n pos = 1 ## count starts from 1 per WK\n for c in bytes:\n if (c == 0):\n if r != 0:\n runs.append((pos, r))\n pos += r\n r = 0\n pos += 1\n else:\n r += 1\n\n # if last run is unsaved (i.e. data ends with 1)\n if r != 0:\n runs.append((pos, r))\n pos += r\n r = 0\n\n if format:\n z = ''\n\n for rr in runs:\n z += '{} {} '.format(rr[0], rr[1])\n return z[:-1]\n else:\n return runs", "_____no_output_____" ], [ "pred_dict = {id_[:-4]:RLenc(np.round(resizePreds[i] > best_thres)) for i,id_ in tqdm_notebook(enumerate(test_ids))}", "_____no_output_____" ], [ "sub = pd.DataFrame.from_dict(pred_dict,orient='index')\nsub.index.names = ['id']\nsub.columns = ['rle_mask']\nsub.to_csv('submission.csv')", "_____no_output_____" ], [ "sub", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
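The fastai segmentation record above scores predicted masks with a `dice` metric defined as `2. * (pred*targs).sum() / (pred+targs).sum()` on thresholded tensors. A dependency-light NumPy sketch of the same formula follows; the empty-mask guard and the `thresh` parameter are my additions for illustration, not from the source:

```python
import numpy as np

def dice(pred, targs, thresh=0.5):
    """Dice coefficient for binary masks: 2*|A intersect B| / (|A| + |B|)."""
    pred = (np.asarray(pred, dtype=float) > thresh).astype(float)
    targs = np.asarray(targs, dtype=float)
    denom = pred.sum() + targs.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * (pred * targs).sum() / denom
```

Identical masks score 1.0 and disjoint masks score 0.0, matching the behaviour of the notebook's PyTorch version on binary tensors.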
d06175055f3580333d9c8a5d91550bc2d22d341e
33,696
ipynb
Jupyter Notebook
bin/Jonghoo/altProj/proj/python/Music Recognition.ipynb
axcer2126/DINF
04bc17c5c7835da77debfef4ae7acd62a769585a
[ "MIT" ]
null
null
null
bin/Jonghoo/altProj/proj/python/Music Recognition.ipynb
axcer2126/DINF
04bc17c5c7835da77debfef4ae7acd62a769585a
[ "MIT" ]
null
null
null
bin/Jonghoo/altProj/proj/python/Music Recognition.ipynb
axcer2126/DINF
04bc17c5c7835da77debfef4ae7acd62a769585a
[ "MIT" ]
8
2020-09-18T05:46:42.000Z
2020-11-03T07:20:02.000Z
37.069307
272
0.485785
[ [ [ "# feature extraction and preprocessing data\n# analyze the audio data\nimport librosa\n\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# lets figures appear directly in the browser that runs the notebook\n%matplotlib inline\n\n# provides various features that help interact with the operating system\n# 1. check the current directory\n# 2. change the directory\n# 3. list the files in the current directory\n# 4. load csv files\nimport os\n\n# image processing in Python\nfrom PIL import Image\n\nimport pathlib\nimport csv\n\n# Preprocessing\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.preprocessing import LabelEncoder, StandardScaler\nfrom sklearn.metrics import mean_squared_error\n\n#Keras\nimport keras\n\n# ignore and hide warning messages -> warnings.filterwarnings(action='ignore')\n# matching warnings are not printed = ('ignore')\nimport warnings\nwarnings.filterwarnings('ignore')", "Using TensorFlow backend.\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n  _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n  _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n  _np_qint16 = np.dtype([(\"qint16\", np.int16, 
1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: 
Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n  _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n  _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\n/home/bitai/anaconda3/lib/python3.7/site-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\n  np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\n" ], [ "# passes only the desired kind of colors\ncmap = plt.get_cmap('inferno')\n\nplt.figure(figsize=(10,10))\ngenres = 'blues classical country disco hiphop jazz metal pop reggae rock'.split()\nfor g in genres:\n    pathlib.Path(f'img_data/{g}').mkdir(parents=True, exist_ok=True)     \n    for filename in os.listdir(f'./MIR/genres/{g}'):\n        songname = f'./MIR/genres/{g}/(unknown)'\n        y, sr = librosa.load(songname, mono=True, duration=5)\n        plt.specgram(y, NFFT=2048, Fs=2, Fc=0, noverlap=128, cmap=cmap, sides='default', mode='default', scale='dB');\n        plt.axis('off');\n        plt.savefig(f'img_data/{g}/{filename[:-3].replace(\".\", \"\")}.png')\n        plt.clf()", "_____no_output_____" ], [ "header = 'filename chroma_stft rmse spectral_centroid spectral_bandwidth rolloff zero_crossing_rate'\nfor i in range(1, 21):\n    header += f' mfcc{i}'\nheader += ' label'\nheader = header.split()", "_____no_output_____" ], [ "file = open('data.csv', 'w', newline='')\nwith file:\n    writer = csv.writer(file)\n    writer.writerow(header)\ngenres = 'blues classical country disco hiphop jazz metal pop reggae rock'.split()\nfor g in genres:\n    for filename in 
os.listdir(f'./MIR/genres/{g}'):\n        songname = f'./MIR/genres/{g}/(unknown)'\n        y, sr = librosa.load(songname, mono=True, duration=30)\n        chroma_stft = librosa.feature.chroma_stft(y=y, sr=sr)\n        spec_cent = librosa.feature.spectral_centroid(y=y, sr=sr)\n        spec_bw = librosa.feature.spectral_bandwidth(y=y, sr=sr)\n        rolloff = librosa.feature.spectral_rolloff(y=y, sr=sr)\n        zcr = librosa.feature.zero_crossing_rate(y)\n        mfcc = librosa.feature.mfcc(y=y, sr=sr)\n        #rmse = mean_squared_error(y, y_pred=sr)**0.5\n        rmse = librosa.feature.rms(y=y)\n        to_append = f'(unknown) {np.mean(chroma_stft)} {np.mean(rmse)} {np.mean(spec_cent)} {np.mean(spec_bw)} {np.mean(rolloff)} {np.mean(zcr)}'    \n        for e in mfcc:\n            to_append += f' {np.mean(e)}'\n        to_append += f' {g}'\n        file = open('data.csv', 'a', newline='')\n        with file:\n            writer = csv.writer(file)\n            writer.writerow(to_append.split())", "_____no_output_____" ], [ "# mfcc = a feature that can be extracted from an audio signal; a value representing the distinctive characteristics of a sound\n# = used as part of the basis for judging the similarity between a registered voice and the voice currently being input.\n# = MFCC (Mel-Frequency Cepstral Coefficient) is\n# a value extracted from the Mel Spectrum through Cepstral analysis\n# \n# To understand it, first \n# - Spectrum\n# - Cepstrum\n# - Mel Spectrum need to be understood.", "_____no_output_____" ], [ "data = pd.read_csv('data.csv')\ndata.head()\n\n# chroma_stft = channel_?, chroma standard
# spectral_centroid = spectrum centroid\n# spectral_bandwidth = spectrum bandwidth\n# rolloff = roll-off\n# zero_crossing_rate = zero crossing ratio\n# \n# mfcc[n] = ", "_____no_output_____" ], [ "data.shape", "_____no_output_____" ], [ "# Dropping unnecessary columns\ndata = data.drop(['filename'],axis=1)", "_____no_output_____" ], [ "genre_list = data.iloc[:, -1]\nencoder = LabelEncoder()\ny = encoder.fit_transform(genre_list)", "_____no_output_____" ], [ "scaler = StandardScaler()\nX = scaler.fit_transform(np.array(data.iloc[:, :-1], dtype = float))", "_____no_output_____" ], [ "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)", "_____no_output_____" ], [ "len(y_train)", "_____no_output_____" ], [ "len(y_test)", "_____no_output_____" ], [ "X_train[10]", "_____no_output_____" ], [ "from keras import models\nfrom keras import layers\n\nmodel = models.Sequential()\nmodel.add(layers.Dense(256, activation='relu', input_shape=(X_train.shape[1],)))\n\nmodel.add(layers.Dense(128, activation='relu'))\n\nmodel.add(layers.Dense(64, activation='relu'))\n\nmodel.add(layers.Dense(10, activation='softmax'))", "WARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:66: The name tf.get_default_graph is deprecated. Please use tf.compat.v1.get_default_graph instead.\n\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:541: The name tf.placeholder is deprecated. Please use tf.compat.v1.placeholder instead.\n\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:4432: The name tf.random_uniform is deprecated. 
Please use tf.random.uniform instead.\n\n" ], [ "model.compile(optimizer='adam',\n loss='sparse_categorical_crossentropy',\n metrics=['accuracy'])", "WARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/optimizers.py:793: The name tf.train.Optimizer is deprecated. Please use tf.compat.v1.train.Optimizer instead.\n\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:3622: The name tf.log is deprecated. Please use tf.math.log instead.\n\n" ], [ "history = model.fit(X_train,\n y_train,\n epochs=20,\n batch_size=128)", "WARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/tensorflow/python/ops/math_grad.py:1250: add_dispatch_support.<locals>.wrapper (from tensorflow.python.ops.array_ops) is deprecated and will be removed in a future version.\nInstructions for updating:\nUse tf.where in 2.0, which has the same broadcast rule as np.where\nWARNING:tensorflow:From /home/bitai/anaconda3/lib/python3.7/site-packages/keras/backend/tensorflow_backend.py:1033: The name tf.assign_add is deprecated. 
Please use tf.compat.v1.assign_add instead.\n\nEpoch 1/20\n800/800 [==============================] - 1s 1ms/step - loss: 2.1563 - acc: 0.2263\nEpoch 2/20\n800/800 [==============================] - 0s 45us/step - loss: 1.8502 - acc: 0.3887\nEpoch 3/20\n800/800 [==============================] - 0s 30us/step - loss: 1.6190 - acc: 0.4163\nEpoch 4/20\n800/800 [==============================] - 0s 28us/step - loss: 1.4466 - acc: 0.4863\nEpoch 5/20\n800/800 [==============================] - 0s 15us/step - loss: 1.3198 - acc: 0.5587\nEpoch 6/20\n800/800 [==============================] - 0s 11us/step - loss: 1.2189 - acc: 0.5663\nEpoch 7/20\n800/800 [==============================] - 0s 14us/step - loss: 1.1357 - acc: 0.5988\nEpoch 8/20\n800/800 [==============================] - 0s 14us/step - loss: 1.0649 - acc: 0.6450\nEpoch 9/20\n800/800 [==============================] - 0s 11us/step - loss: 1.0059 - acc: 0.6625\nEpoch 10/20\n800/800 [==============================] - 0s 13us/step - loss: 0.9525 - acc: 0.6925\nEpoch 11/20\n800/800 [==============================] - 0s 13us/step - loss: 0.9039 - acc: 0.7025\nEpoch 12/20\n800/800 [==============================] - 0s 12us/step - loss: 0.8633 - acc: 0.7150\nEpoch 13/20\n800/800 [==============================] - 0s 13us/step - loss: 0.8188 - acc: 0.7350\nEpoch 14/20\n800/800 [==============================] - 0s 14us/step - loss: 0.7868 - acc: 0.7425\nEpoch 15/20\n800/800 [==============================] - 0s 12us/step - loss: 0.7527 - acc: 0.7475\nEpoch 16/20\n800/800 [==============================] - 0s 12us/step - loss: 0.7272 - acc: 0.7575\nEpoch 17/20\n800/800 [==============================] - 0s 13us/step - loss: 0.7033 - acc: 0.7688\nEpoch 18/20\n800/800 [==============================] - 0s 12us/step - loss: 0.6679 - acc: 0.7737\nEpoch 19/20\n800/800 [==============================] - 0s 12us/step - loss: 0.6405 - acc: 0.7925\nEpoch 20/20\n800/800 [==============================] - 0s 11us/step - loss: 
0.6022 - acc: 0.8125\n" ], [ "test_loss, test_acc = model.evaluate(X_test,y_test)", "200/200 [==============================] - 0s 115us/step\n" ], [ "print('test_acc: ',test_acc)", "test_acc: 0.73\n" ], [ "x_val = X_train[:200]\npartial_x_train = X_train[200:]\n\ny_val = y_train[:200]\npartial_y_train = y_train[200:]", "_____no_output_____" ], [ "\nmodel = models.Sequential()\nmodel.add(layers.Dense(512, activation='relu', input_shape=(X_train.shape[1],)))\nmodel.add(layers.Dense(256, activation='relu'))\nmodel.add(layers.Dense(128, activation='relu'))\nmodel.add(layers.Dense(64, activation='relu'))\nmodel.add(layers.Dense(10, activation='softmax'))\n\nmodel.compile(optimizer='adam',\n loss='sparse_categorical_crossentropy',\n metrics=['accuracy'])\n\nmodel.fit(partial_x_train,\n partial_y_train,\n epochs=30,\n batch_size=512,\n validation_data=(x_val, y_val))\nresults = model.evaluate(X_test, y_test)", "Train on 600 samples, validate on 200 samples\nEpoch 1/30\n600/600 [==============================] - 0s 341us/step - loss: 2.2897 - acc: 0.1167 - val_loss: 2.1786 - val_acc: 0.2650\nEpoch 2/30\n600/600 [==============================] - 0s 14us/step - loss: 2.1257 - acc: 0.3317 - val_loss: 2.0688 - val_acc: 0.3150\nEpoch 3/30\n600/600 [==============================] - 0s 15us/step - loss: 1.9839 - acc: 0.4133 - val_loss: 1.9469 - val_acc: 0.3100\nEpoch 4/30\n600/600 [==============================] - 0s 17us/step - loss: 1.8296 - acc: 0.4067 - val_loss: 1.8256 - val_acc: 0.3150\nEpoch 5/30\n600/600 [==============================] - 0s 13us/step - loss: 1.6836 - acc: 0.4150 - val_loss: 1.7084 - val_acc: 0.3500\nEpoch 6/30\n600/600 [==============================] - 0s 17us/step - loss: 1.5413 - acc: 0.4633 - val_loss: 1.6188 - val_acc: 0.4150\nEpoch 7/30\n600/600 [==============================] - 0s 12us/step - loss: 1.4307 - acc: 0.5100 - val_loss: 1.5584 - val_acc: 0.4350\nEpoch 8/30\n600/600 [==============================] - 0s 14us/step - loss: 1.3362 - 
acc: 0.5333 - val_loss: 1.5068 - val_acc: 0.4550\nEpoch 9/30\n600/600 [==============================] - 0s 11us/step - loss: 1.2556 - acc: 0.5433 - val_loss: 1.4741 - val_acc: 0.4700\nEpoch 10/30\n600/600 [==============================] - 0s 16us/step - loss: 1.2024 - acc: 0.5833 - val_loss: 1.4555 - val_acc: 0.4750\nEpoch 11/30\n600/600 [==============================] - 0s 12us/step - loss: 1.1433 - acc: 0.5983 - val_loss: 1.4419 - val_acc: 0.5200\nEpoch 12/30\n600/600 [==============================] - 0s 16us/step - loss: 1.0761 - acc: 0.6267 - val_loss: 1.4268 - val_acc: 0.5050\nEpoch 13/30\n600/600 [==============================] - 0s 12us/step - loss: 1.0272 - acc: 0.6500 - val_loss: 1.3786 - val_acc: 0.5450\nEpoch 14/30\n600/600 [==============================] - 0s 12us/step - loss: 0.9723 - acc: 0.6783 - val_loss: 1.3623 - val_acc: 0.5200\nEpoch 15/30\n600/600 [==============================] - 0s 16us/step - loss: 0.9400 - acc: 0.6867 - val_loss: 1.3410 - val_acc: 0.5750\nEpoch 16/30\n600/600 [==============================] - 0s 13us/step - loss: 0.8821 - acc: 0.7000 - val_loss: 1.3599 - val_acc: 0.5800\nEpoch 17/30\n600/600 [==============================] - 0s 12us/step - loss: 0.8603 - acc: 0.6983 - val_loss: 1.3182 - val_acc: 0.5850\nEpoch 18/30\n600/600 [==============================] - 0s 16us/step - loss: 0.8224 - acc: 0.7233 - val_loss: 1.2646 - val_acc: 0.5850\nEpoch 19/30\n600/600 [==============================] - 0s 14us/step - loss: 0.7897 - acc: 0.7367 - val_loss: 1.2845 - val_acc: 0.5650\nEpoch 20/30\n600/600 [==============================] - 0s 13us/step - loss: 0.7486 - acc: 0.7517 - val_loss: 1.3470 - val_acc: 0.5650\nEpoch 21/30\n600/600 [==============================] - 0s 14us/step - loss: 0.7371 - acc: 0.7533 - val_loss: 1.3236 - val_acc: 0.5850\nEpoch 22/30\n600/600 [==============================] - 0s 11us/step - loss: 0.7123 - acc: 0.7517 - val_loss: 1.2596 - val_acc: 0.5950\nEpoch 23/30\n600/600 
[==============================] - 0s 13us/step - loss: 0.6772 - acc: 0.7767 - val_loss: 1.2605 - val_acc: 0.5850\nEpoch 24/30\n600/600 [==============================] - 0s 15us/step - loss: 0.6618 - acc: 0.7717 - val_loss: 1.2853 - val_acc: 0.5800\nEpoch 25/30\n600/600 [==============================] - 0s 14us/step - loss: 0.6487 - acc: 0.7800 - val_loss: 1.3147 - val_acc: 0.5900\nEpoch 26/30\n600/600 [==============================] - 0s 14us/step - loss: 0.6131 - acc: 0.8117 - val_loss: 1.3265 - val_acc: 0.6000\nEpoch 27/30\n600/600 [==============================] - 0s 14us/step - loss: 0.5971 - acc: 0.8033 - val_loss: 1.2807 - val_acc: 0.6000\nEpoch 28/30\n600/600 [==============================] - 0s 12us/step - loss: 0.5631 - acc: 0.8283 - val_loss: 1.2866 - val_acc: 0.5800\nEpoch 29/30\n600/600 [==============================] - 0s 14us/step - loss: 0.5477 - acc: 0.8350 - val_loss: 1.2839 - val_acc: 0.5750\nEpoch 30/30\n600/600 [==============================] - 0s 13us/step - loss: 0.5210 - acc: 0.8550 - val_loss: 1.2990 - val_acc: 0.5900\n200/200 [==============================] - 0s 19us/step\n" ], [ "results", "_____no_output_____" ], [ "predictions = model.predict(X_test)", "_____no_output_____" ], [ "predictions[0].shape", "_____no_output_____" ], [ "np.sum(predictions[0])", "_____no_output_____" ], [ "np.argmax(predictions[0])", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d06181e012184dce95535cfa5bf9cade17ef5c3f
1,702
ipynb
Jupyter Notebook
Chapter07/Exercise104/Exercise104.ipynb
adityashah95/Python
6b5ffc89f0abd77f07dd8049a4c1213ce56c4d84
[ "MIT" ]
4
2020-01-06T12:07:00.000Z
2022-03-22T04:03:49.000Z
Chapter07/Exercise104/Exercise104.ipynb
adityashah95/Python
6b5ffc89f0abd77f07dd8049a4c1213ce56c4d84
[ "MIT" ]
null
null
null
Chapter07/Exercise104/Exercise104.ipynb
adityashah95/Python
6b5ffc89f0abd77f07dd8049a4c1213ce56c4d84
[ "MIT" ]
4
2019-11-25T10:39:30.000Z
2020-02-22T07:26:40.000Z
21.544304
136
0.536428
[ [ [ "class Interrogator:\n def __init__(self, questions):\n self.questions = questions\n def __iter__(self):\n return self.questions.__iter__()", "_____no_output_____" ], [ "questions = [\"What is your name?\", \"What is your quest?\", \"What is the average airspeed velocity of an unladen swallow?\"]\nawkward_person = Interrogator(questions)\nfor question in awkward_person:\n print(question)\n\n", "What is your name?\nWhat is your quest?\nWhat is the average airspeed velocity of an unladen swallow?\n" ] ] ]
[ "code" ]
[ [ "code", "code" ] ]
d0618522ae84a467019ab9b7cc4372b10bffe018
125,489
ipynb
Jupyter Notebook
examples/models/aws_eks_deep_mnist/aws_eks_deep_mnist.ipynb
welcomemandeep/seldon-core
a257c5ef7baf042da4b2ca1b7aad959447d5bd7d
[ "Apache-2.0" ]
null
null
null
examples/models/aws_eks_deep_mnist/aws_eks_deep_mnist.ipynb
welcomemandeep/seldon-core
a257c5ef7baf042da4b2ca1b7aad959447d5bd7d
[ "Apache-2.0" ]
null
null
null
examples/models/aws_eks_deep_mnist/aws_eks_deep_mnist.ipynb
welcomemandeep/seldon-core
a257c5ef7baf042da4b2ca1b7aad959447d5bd7d
[ "Apache-2.0" ]
null
null
null
35.711155
4,684
0.436118
[ [ [ "# AWS Elastic Kubernetes Service (EKS) Deep MNIST\nIn this example we will deploy a tensorflow MNIST model in Amazon Web Services' Elastic Kubernetes Service (EKS).\n\nThis tutorial will break down in the following sections:\n\n1) Train a tensorflow model to predict mnist locally\n\n2) Containerise the tensorflow model with our docker utility\n\n3) Send some data to the docker model to test it\n\n4) Install and configure AWS tools to interact with AWS\n\n5) Use the AWS tools to create and setup EKS cluster with Seldon\n\n6) Push and run docker image through the AWS Container Registry\n\n7) Test our Elastic Kubernetes deployment by sending some data\n\nLet's get started! ๐Ÿš€๐Ÿ”ฅ\n\n## Dependencies:\n\n* Helm v3.0.0+\n* A Kubernetes cluster running v1.13 or above (minkube / docker-for-windows work well if enough RAM)\n* kubectl v1.14+\n* EKS CLI v0.1.32\n* AWS Cli v1.16.163\n* Python 3.6+\n* Python DEV requirements\n", "_____no_output_____" ], [ "## 1) Train a tensorflow model to predict mnist locally\nWe will load the mnist images, together with their labels, and then train a tensorflow model to predict the right labels", "_____no_output_____" ] ], [ [ "from tensorflow.examples.tutorials.mnist import input_data\nmnist = input_data.read_data_sets(\"MNIST_data/\", one_hot = True)\nimport tensorflow as tf\n\nif __name__ == '__main__':\n \n x = tf.placeholder(tf.float32, [None,784], name=\"x\")\n\n W = tf.Variable(tf.zeros([784,10]))\n b = tf.Variable(tf.zeros([10]))\n\n y = tf.nn.softmax(tf.matmul(x,W) + b, name=\"y\")\n\n y_ = tf.placeholder(tf.float32, [None, 10])\n\n cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))\n\n train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)\n\n init = tf.initialize_all_variables()\n\n sess = tf.Session()\n sess.run(init)\n\n for i in range(1000):\n batch_xs, batch_ys = mnist.train.next_batch(100)\n sess.run(train_step, feed_dict={x: batch_xs, y_: batch_ys})\n\n 
correct_prediction = tf.equal(tf.argmax(y,1), tf.argmax(y_,1))\n accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))\n print(sess.run(accuracy, feed_dict = {x: mnist.test.images, y_:mnist.test.labels}))\n\n saver = tf.train.Saver()\n\n saver.save(sess, \"model/deep_mnist_model\")", "Extracting MNIST_data/train-images-idx3-ubyte.gz\nExtracting MNIST_data/train-labels-idx1-ubyte.gz\nExtracting MNIST_data/t10k-images-idx3-ubyte.gz\nExtracting MNIST_data/t10k-labels-idx1-ubyte.gz\n0.9194\n" ] ], [ [ "## 2) Containerise the tensorflow model with our docker utility", "_____no_output_____" ], [ "First you need to make sure that you have added the .s2i/environment configuration file in this folder with the following content:", "_____no_output_____" ] ], [ [ "!cat .s2i/environment", "MODEL_NAME=DeepMnist\nAPI_TYPE=REST\nSERVICE_TYPE=MODEL\nPERSISTENCE=0\n" ] ], [ [ "Now we can build a docker image named \"deep-mnist\" with the tag 0.1", "_____no_output_____" ] ], [ [ "!s2i build . 
seldonio/seldon-core-s2i-python36:1.5.0-dev deep-mnist:0.1", "---> Installing application source...\n---> Installing dependencies ...\nLooking in links: /whl\nRequirement already satisfied: tensorflow>=1.12.0 in /usr/local/lib/python3.6/site-packages (from -r requirements.txt (line 1)) (1.13.1)\nRequirement already satisfied: keras-preprocessing>=1.0.5 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.9)\nRequirement already satisfied: gast>=0.2.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.2.2)\nRequirement already satisfied: absl-py>=0.1.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.1)\nRequirement already satisfied: astor>=0.6.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.7.1)\nRequirement already satisfied: keras-applications>=1.0.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.0.7)\nRequirement already satisfied: six>=1.10.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.12.0)\nRequirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.1.0)\nRequirement already satisfied: grpcio>=1.8.6 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.19.0)\nRequirement already satisfied: wheel>=0.26 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.33.1)\nRequirement already satisfied: tensorboard<1.14.0,>=1.13.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.13.1)\nRequirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) 
(1.16.2)\nRequirement already satisfied: protobuf>=3.6.1 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.7.0)\nRequirement already satisfied: tensorflow-estimator<1.14.0rc0,>=1.13.0 in /usr/local/lib/python3.6/site-packages (from tensorflow>=1.12.0->-r requirements.txt (line 1)) (1.13.0)\nRequirement already satisfied: h5py in /usr/local/lib/python3.6/site-packages (from keras-applications>=1.0.6->tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.9.0)\nRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.6/site-packages (from tensorboard<1.14.0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (3.0.1)\nRequirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.6/site-packages (from tensorboard<1.14.0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (0.15.0)\nRequirement already satisfied: setuptools in /usr/local/lib/python3.6/site-packages (from protobuf>=3.6.1->tensorflow>=1.12.0->-r requirements.txt (line 1)) (40.8.0)\nRequirement already satisfied: mock>=2.0.0 in /usr/local/lib/python3.6/site-packages (from tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (2.0.0)\nRequirement already satisfied: pbr>=0.11 in /usr/local/lib/python3.6/site-packages (from mock>=2.0.0->tensorflow-estimator<1.14.0rc0,>=1.13.0->tensorflow>=1.12.0->-r requirements.txt (line 1)) (5.1.3)\nUrl '/whl' is ignored. 
It is either a non-existing path or lacks a specific scheme.\nYou are using pip version 19.0.3, however version 19.1.1 is available.\nYou should consider upgrading via the 'pip install --upgrade pip' command.\nBuild completed successfully\n" ] ], [ [ "## 3) Send some data to the docker model to test it\nWe first run the docker image we just created as a container called \"mnist_predictor\"", "_____no_output_____" ] ], [ [ "!docker run --name \"mnist_predictor\" -d --rm -p 5000:5000 deep-mnist:0.1", "5157ab4f516bd0dea11b159780f31121e9fb41df6394e0d6d631e6e0d572463b\n" ] ], [ [ "Send some random features that conform to the contract", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt\n# This is the variable that was initialised at the beginning of the file\ni = [0]\nx = mnist.test.images[i]\ny = mnist.test.labels[i]\nplt.imshow(x.reshape((28, 28)), cmap='gray')\nplt.show()\nprint(\"Expected label: \", np.sum(range(0,10) * y), \". One hot encoding: \", y)", "_____no_output_____" ], [ "from seldon_core.seldon_client import SeldonClient\nimport math\nimport numpy as np\n\n# We now test the REST endpoint expecting the same result\nendpoint = \"0.0.0.0:5000\"\nbatch = x\npayload_type = \"ndarray\"\n\nsc = SeldonClient(microservice_endpoint=endpoint)\n\n# We use the microservice, instead of the \"predict\" function\nclient_prediction = sc.microservice(\n data=batch,\n method=\"predict\",\n payload_type=payload_type,\n names=[\"tfidf\"])\n\nfor proba, label in zip(client_prediction.response.data.ndarray.values[0].list_value.ListFields()[0][1], range(0,10)):\n print(f\"LABEL {label}:\\t {proba.number_value*100:6.4f} %\")", "LABEL 0:\t 0.0068 %\nLABEL 1:\t 0.0000 %\nLABEL 2:\t 0.0085 %\nLABEL 3:\t 0.3409 %\nLABEL 4:\t 0.0002 %\nLABEL 5:\t 0.0020 %\nLABEL 6:\t 0.0000 %\nLABEL 7:\t 99.5371 %\nLABEL 8:\t 0.0026 %\nLABEL 9:\t 0.1019 %\n" ], [ "!docker rm mnist_predictor --force", "mnist_predictor\n" ] ], [ [ "## 4) Install and configure AWS tools to interact with 
AWS", "_____no_output_____" ], [ "First we install the awscli", "_____no_output_____" ] ], [ [ "!pip install awscli --upgrade --user", "Collecting awscli\n Using cached https://files.pythonhosted.org/packages/f6/45/259a98719e7c7defc9be4cc00fbfb7ccf699fbd1f74455d8347d0ab0a1df/awscli-1.16.163-py2.py3-none-any.whl\nCollecting colorama<=0.3.9,>=0.2.5 (from awscli)\n Using cached https://files.pythonhosted.org/packages/db/c8/7dcf9dbcb22429512708fe3a547f8b6101c0d02137acbd892505aee57adf/colorama-0.3.9-py2.py3-none-any.whl\nCollecting PyYAML<=3.13,>=3.10 (from awscli)\nCollecting botocore==1.12.153 (from awscli)\n Using cached https://files.pythonhosted.org/packages/ec/3b/029218966ce62ae9824a18730de862ac8fc5a0e8083d07d1379815e7cca1/botocore-1.12.153-py2.py3-none-any.whl\nRequirement already satisfied, skipping upgrade: docutils>=0.10 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from awscli) (0.14)\nCollecting rsa<=3.5.0,>=3.1.2 (from awscli)\n Using cached https://files.pythonhosted.org/packages/e1/ae/baedc9cb175552e95f3395c43055a6a5e125ae4d48a1d7a924baca83e92e/rsa-3.4.2-py2.py3-none-any.whl\nRequirement already satisfied, skipping upgrade: s3transfer<0.3.0,>=0.2.0 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from awscli) (0.2.0)\nRequirement already satisfied, skipping upgrade: urllib3<1.25,>=1.20; python_version >= \"3.4\" in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from botocore==1.12.153->awscli) (1.24.2)\nRequirement already satisfied, skipping upgrade: python-dateutil<3.0.0,>=2.1; python_version >= \"2.7\" in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from botocore==1.12.153->awscli) (2.8.0)\nRequirement already satisfied, skipping upgrade: jmespath<1.0.0,>=0.7.1 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from botocore==1.12.153->awscli) (0.9.4)\nCollecting 
pyasn1>=0.1.3 (from rsa<=3.5.0,>=3.1.2->awscli)\n Using cached https://files.pythonhosted.org/packages/7b/7c/c9386b82a25115cccf1903441bba3cbadcfae7b678a20167347fa8ded34c/pyasn1-0.4.5-py2.py3-none-any.whl\nRequirement already satisfied, skipping upgrade: six>=1.5 in /home/alejandro/miniconda3/envs/reddit-classification/lib/python3.7/site-packages (from python-dateutil<3.0.0,>=2.1; python_version >= \"2.7\"->botocore==1.12.153->awscli) (1.12.0)\nInstalling collected packages: colorama, PyYAML, botocore, pyasn1, rsa, awscli\nSuccessfully installed PyYAML-3.13 awscli-1.16.163 botocore-1.12.153 colorama-0.3.9 pyasn1-0.4.5 rsa-3.4.2\n" ] ], [ [ "### Configure aws so it can talk to your server \n(if you are getting issues, make sure you have the permmissions to create clusters)", "_____no_output_____" ] ], [ [ "%%bash \n# You must make sure that the access key and secret are changed\naws configure << END_OF_INPUTS\nYOUR_ACCESS_KEY\nYOUR_ACCESS_SECRET\nus-west-2\njson\nEND_OF_INPUTS", "AWS Access Key ID [****************SF4A]: AWS Secret Access Key [****************WLHu]: Default region name [eu-west-1]: Default output format [json]: " ] ], [ [ "### Install EKCTL\n*IMPORTANT*: These instructions are for linux\nPlease follow the official installation of ekctl at: https://docs.aws.amazon.com/eks/latest/userguide/getting-started-eksctl.html", "_____no_output_____" ] ], [ [ "!curl --silent --location \"https://github.com/weaveworks/eksctl/releases/download/latest_release/eksctl_$(uname -s)_amd64.tar.gz\" | tar xz ", "_____no_output_____" ], [ "!chmod 755 ./eksctl", "_____no_output_____" ], [ "!./eksctl version", "\u001b[36m[โ„น] version.Info{BuiltAt:\"\", GitCommit:\"\", GitTag:\"0.1.32\"}\n\u001b[0m" ] ], [ [ "## 5) Use the AWS tools to create and setup EKS cluster with Seldon\nIn this example we will create a cluster with 2 nodes, with a minimum of 1 and a max of 3. 
You can tweak this accordingly.\n\nIf you want to check the status of the deployment you can go to AWS CloudFormation or to the EKS dashboard.\n\nIt will take 10-15 minutes (so feel free to go grab a โ˜•). \n\n*IMPORTANT*: If you get errors in this step it is most probably IAM role access requirements, which requires you to discuss with your administrator.", "_____no_output_____" ] ], [ [ "%%bash\n./eksctl create cluster \\\n--name demo-eks-cluster \\\n--region us-west-2 \\\n--nodes 2 ", "Process is interrupted.\n" ] ], [ [ "### Configure local kubectl \nWe want to now configure our local Kubectl so we can actually reach the cluster we've just created", "_____no_output_____" ] ], [ [ "!aws eks --region us-west-2 update-kubeconfig --name demo-eks-cluster", "Updated context arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist in /home/alejandro/.kube/config\n" ] ], [ [ "And we can check if the context has been added to kubectl config (contexts are basically the different k8s cluster connections)\nYou should be able to see the context as \"...aws:eks:eu-west-1:27...\". 
\nIf it's not activated you can activate that context with kubectlt config set-context <CONTEXT_NAME>", "_____no_output_____" ] ], [ [ "!kubectl config get-contexts", "CURRENT NAME CLUSTER AUTHINFO NAMESPACE\n* arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist arn:aws:eks:eu-west-1:271049282727:cluster/deepmnist \n docker-desktop docker-desktop docker-desktop \n docker-for-desktop docker-desktop docker-desktop \n gke_ml-engineer_us-central1-a_security-cluster-1 gke_ml-engineer_us-central1-a_security-cluster-1 gke_ml-engineer_us-central1-a_security-cluster-1 \n" ] ], [ [ "## Setup Seldon Core\n\nUse the setup notebook to [Setup Cluster](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html#Setup-Cluster) with [Ambassador Ingress](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html#Ambassador) and [Install Seldon Core](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html#Install-Seldon-Core). 
Instructions [also online](https://docs.seldon.io/projects/seldon-core/en/latest/examples/seldon_core_setup.html).", "_____no_output_____" ], [ "## Push docker image\nIn order for the EKS seldon deployment to access the image we just built, we need to push it to the Elastic Container Registry (ECR).\n\nIf you have any issues please follow the official AWS documentation: https://docs.aws.amazon.com/AmazonECR/latest/userguide/what-is-ecr.html", "_____no_output_____" ], [ "### First we create a registry\nYou can run the following command, and then see the result at https://us-west-2.console.aws.amazon.com/ecr/repositories?#", "_____no_output_____" ] ], [ [ "!aws ecr create-repository --repository-name seldon-repository --region us-west-2", "{\n \"repository\": {\n \"repositoryArn\": \"arn:aws:ecr:us-west-2:271049282727:repository/seldon-repository\",\n \"registryId\": \"271049282727\",\n \"repositoryName\": \"seldon-repository\",\n \"repositoryUri\": \"271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository\",\n \"createdAt\": 1558535798.0\n }\n}\n" ] ], [ [ "### Now prepare docker image\nWe need to first tag the docker image before we can push it", "_____no_output_____" ] ], [ [ "%%bash\nexport AWS_ACCOUNT_ID=\"\"\nexport AWS_REGION=\"us-west-2\"\nif [ -z \"$AWS_ACCOUNT_ID\" ]; then\n echo \"ERROR: Please provide a value for the AWS variables\"\n exit 1\nfi\n\ndocker tag deep-mnist:0.1 \"$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/seldon-repository\"", "_____no_output_____" ] ], [ [ "### We now login to aws through docker so we can access the repository", "_____no_output_____" ] ], [ [ "!`aws ecr get-login --no-include-email --region us-west-2`", "WARNING! Using --password via the CLI is insecure. Use --password-stdin.\nWARNING! Your password will be stored unencrypted in /home/alejandro/.docker/config.json.\nConfigure a credential helper to remove this warning. 
See\nhttps://docs.docker.com/engine/reference/commandline/login/#credentials-store\n\nLogin Succeeded\n" ] ], [ [ "### And push the image\nMake sure you add your AWS Account ID", "_____no_output_____" ] ], [ [ "%%bash\nexport AWS_ACCOUNT_ID=\"\"\nexport AWS_REGION=\"us-west-2\"\nif [ -z \"$AWS_ACCOUNT_ID\" ]; then\n echo \"ERROR: Please provide a value for the AWS variables\"\n exit 1\nfi\n\ndocker push \"$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/seldon-repository\"", "The push refers to repository [271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository]\nf7d0d000c138: Preparing\n987f3f1afb00: Preparing\n00d16a381c47: Preparing\nbb01f50d544a: Preparing\nfcb82c6941b5: Preparing\n67290e35c458: Preparing\nb813745f5bb3: Preparing\nffecb18e9f0b: Preparing\nf50f856f49fa: Preparing\n80b43ad4adf9: Preparing\n14c77983a1cf: Preparing\na22a5ac18042: Preparing\n6257fa9f9597: Preparing\n578414b395b9: Preparing\nabc3250a6c7f: Preparing\n13d5529fd232: Preparing\n67290e35c458: Waiting\nb813745f5bb3: Waiting\nffecb18e9f0b: Waiting\nf50f856f49fa: Waiting\n80b43ad4adf9: Waiting\n6257fa9f9597: Waiting\n14c77983a1cf: Waiting\na22a5ac18042: Waiting\n578414b395b9: Waiting\nabc3250a6c7f: Waiting\n13d5529fd232: Waiting\n987f3f1afb00: Pushed\nfcb82c6941b5: Pushed\nbb01f50d544a: Pushed\nf7d0d000c138: Pushed\nffecb18e9f0b: Pushed\nb813745f5bb3: Pushed\nf50f856f49fa: Pushed\n67290e35c458: Pushed\n14c77983a1cf: Pushed\n578414b395b9: Pushed\n80b43ad4adf9: Pushed\n13d5529fd232: Pushed\n6257fa9f9597: Pushed\nabc3250a6c7f: Pushed\n00d16a381c47: Pushed\na22a5ac18042: Pushed\nlatest: digest: sha256:19aefaa9d87c1287eb46ec08f5d4f9a689744d9d0d0b75668b7d15e447819d74 size: 3691\n" ] ], [ [ "## Running the Model\nWe will now run the model.\n\nLet's first have a look at the file we'll be using to trigger the model:", "_____no_output_____" ] ], [ [ "!cat deep_mnist.json", "{\n \"apiVersion\": \"machinelearning.seldon.io/v1alpha2\",\n \"kind\": \"SeldonDeployment\",\n \"metadata\": {\n 
\"labels\": {\n \"app\": \"seldon\"\n },\n \"name\": \"deep-mnist\"\n },\n \"spec\": {\n \"annotations\": {\n \"project_name\": \"Tensorflow MNIST\",\n \"deployment_version\": \"v1\"\n },\n \"name\": \"deep-mnist\",\n \"oauth_key\": \"oauth-key\",\n \"oauth_secret\": \"oauth-secret\",\n \"predictors\": [\n {\n \"componentSpecs\": [{\n \"spec\": {\n \"containers\": [\n {\n \"image\": \"271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository:latest\",\n \"imagePullPolicy\": \"IfNotPresent\",\n \"name\": \"classifier\",\n \"resources\": {\n \"requests\": {\n \"memory\": \"1Mi\"\n }\n }\n }\n ],\n \"terminationGracePeriodSeconds\": 20\n }\n }],\n \"graph\": {\n \"children\": [],\n \"name\": \"classifier\",\n \"endpoint\": {\n\t\t\t\"type\" : \"REST\"\n\t\t },\n \"type\": \"MODEL\"\n },\n \"name\": \"single-model\",\n \"replicas\": 1,\n\t\t\"annotations\": {\n\t\t \"predictor_version\" : \"v1\"\n\t\t}\n }\n ]\n }\n}\n" ] ], [ [ "Now let's trigger seldon to run the model.\n\nWe basically have a yaml file, where we want to replace the value \"REPLACE_FOR_IMAGE_AND_TAG\" for the image you pushed", "_____no_output_____" ] ], [ [ "%%bash\nexport AWS_ACCOUNT_ID=\"\"\nexport AWS_REGION=\"us-west-2\"\nif [ -z \"$AWS_ACCOUNT_ID\" ]; then\n echo \"ERROR: Please provide a value for the AWS variables\"\n exit 1\nfi\n\nsed 's|REPLACE_FOR_IMAGE_AND_TAG|'\"$AWS_ACCOUNT_ID\"'.dkr.ecr.'\"$AWS_REGION\"'.amazonaws.com/seldon-repository|g' deep_mnist.json | kubectl apply -f -", "error: unable to recognize \"STDIN\": Get https://461835FD3FF52848655C8F09FBF5EEAA.yl4.us-west-2.eks.amazonaws.com/api?timeout=32s: dial tcp: lookup 461835FD3FF52848655C8F09FBF5EEAA.yl4.us-west-2.eks.amazonaws.com on 1.1.1.1:53: no such host\n" ] ], [ [ "And let's check that it's been created.\n\nYou should see an image called \"deep-mnist-single-model...\".\n\nWe'll wait until STATUS changes from \"ContainerCreating\" to \"Running\"", "_____no_output_____" ] ], [ [ "!kubectl get pods", "NAME READY STATUS 
RESTARTS AGE\nambassador-5475779f98-7bhcw 1/1 Running 0 21m\nambassador-5475779f98-986g5 1/1 Running 0 21m\nambassador-5475779f98-zcd28 1/1 Running 0 21m\ndeep-mnist-single-model-42ed9d9-fdb557d6b-6xv2h 2/2 Running 0 18m\n" ] ], [ [ "## Test the model\nNow we can test the model, let's first find out what is the URL that we'll have to use:", "_____no_output_____" ] ], [ [ "!kubectl get svc ambassador -o jsonpath='{.status.loadBalancer.ingress[0].hostname}' ", "a68bbac487ca611e988060247f81f4c1-707754258.us-west-2.elb.amazonaws.com" ] ], [ [ "We'll use a random example from our dataset", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt\n# This is the variable that was initialised at the beginning of the file\ni = [0]\nx = mnist.test.images[i]\ny = mnist.test.labels[i]\nplt.imshow(x.reshape((28, 28)), cmap='gray')\nplt.show()\nprint(\"Expected label: \", np.sum(range(0,10) * y), \". One hot encoding: \", y)", "_____no_output_____" ] ], [ [ "We can now add the URL above to send our request:", "_____no_output_____" ] ], [ [ "from seldon_core.seldon_client import SeldonClient\nimport math\nimport numpy as np\n\nhost = \"a68bbac487ca611e988060247f81f4c1-707754258.us-west-2.elb.amazonaws.com\"\nport = \"80\" # Make sure you use the port above\nbatch = x\npayload_type = \"ndarray\"\n\nsc = SeldonClient(\n gateway=\"ambassador\", \n ambassador_endpoint=host + \":\" + port,\n namespace=\"default\",\n oauth_key=\"oauth-key\", \n oauth_secret=\"oauth-secret\")\n\nclient_prediction = sc.predict(\n data=batch, \n deployment_name=\"deep-mnist\",\n names=[\"text\"],\n payload_type=payload_type)\n\nprint(client_prediction)", "Success:True message:\nRequest:\ndata {\n names: \"text\"\n ndarray {\n values {\n list_value {\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values 
{\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n 
number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n 
number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n values {\n 
number_value: 0.0\n }\n values {\n number_value: 0.0\n }\n [... remaining pixel values of the 28x28 input image omitted for brevity ...]\n values {\n number_value: 0.0\n }\n }\n }\n }\n}\n\nResponse:\nmeta {\n puid: \"l6bv1r38mmb32l0hbinln2jjcl\"\n requestPath {\n key: \"classifier\"\n value: \"271049282727.dkr.ecr.us-west-2.amazonaws.com/seldon-repository:latest\"\n }\n}\ndata {\n names: \"class:0\"\n names: \"class:1\"\n names: \"class:2\"\n names: \"class:3\"\n names: \"class:4\"\n names: \"class:5\"\n names: \"class:6\"\n names: \"class:7\"\n names: \"class:8\"\n names: \"class:9\"\n ndarray {\n values {\n list_value {\n values {\n number_value: 6.839015986770391e-05\n }\n values {\n number_value: 9.376968534979824e-09\n }\n values {\n number_value: 8.48581112222746e-05\n }\n values {\n number_value: 0.0034086888190358877\n }\n values {\n number_value: 2.3978568606253248e-06\n }\n values {\n number_value: 2.0100669644307345e-05\n }\n values {\n number_value: 3.0251623428512175e-08\n }\n values {\n number_value: 0.9953710436820984\n }\n values {\n number_value: 2.6070511012221687e-05\n }\n values {\n number_value: 0.0010185304563492537\n }\n }\n
}\n }\n}\n\n" ] ], [ [ "### Let's visualise the probability for each label\nIt seems that it correctly predicted the number 7", "_____no_output_____" ] ], [ [ "for proba, label in zip(client_prediction.response.data.ndarray.values[0].list_value.ListFields()[0][1], range(0,10)):\n print(f\"LABEL {label}:\\t {proba.number_value*100:6.4f} %\")", "LABEL 0:\t 0.0068 %\nLABEL 1:\t 0.0000 %\nLABEL 2:\t 0.0085 %\nLABEL 3:\t 0.3409 %\nLABEL 4:\t 0.0002 %\nLABEL 5:\t 0.0020 %\nLABEL 6:\t 0.0000 %\nLABEL 7:\t 99.5371 %\nLABEL 8:\t 0.0026 %\nLABEL 9:\t 0.1019 %\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d061b18c3c6373ce1ad6bf555b22d3a7975a9251
25,479
ipynb
Jupyter Notebook
introduction-to-Tiny-AutoML.ipynb
thomktz/TinyAutoML
74d9d806ac31795dbf1c4fd60755b0bf9a7c4124
[ "MIT" ]
null
null
null
introduction-to-Tiny-AutoML.ipynb
thomktz/TinyAutoML
74d9d806ac31795dbf1c4fd60755b0bf9a7c4124
[ "MIT" ]
null
null
null
introduction-to-Tiny-AutoML.ipynb
thomktz/TinyAutoML
74d9d806ac31795dbf1c4fd60755b0bf9a7c4124
[ "MIT" ]
null
null
null
68.308311
16,274
0.820401
[ [ [ "# **Introduction to TinyAutoML**\n\n---\n\nTinyAutoML is a Machine Learning library for Python 3.9, thought of as an extension of Scikit-Learn. It builds an adaptable and auto-tuned pipeline to handle binary classification tasks.\n\nIn a few words, your data goes through 2 main preprocessing steps. The first one is scaling and non-stationarity correction, which is followed by Lasso feature selection. \n\nFinally, one of the three MetaModels is fitted on the transformed data.\n\nLet's import the library!", "_____no_output_____" ] ], [ [ "%pip install TinyAutoML==0.2.3.3", "_____no_output_____" ], [ "from TinyAutoML.Models import *\nfrom TinyAutoML import MetaPipeline", "_____no_output_____" ] ], [ [ "## MetaModels\n\nMetaModels inherit from the MetaModel abstract class. They all implement ensemble methods and are therefore based on EstimatorPools.\n\nWhen training EstimatorPools, you are faced with a choice: doing parameterTuning on entire pipelines with the estimators on top, or training the estimators using the same pipeline and only training the top. The first case refers to what we will be calling **comprehensiveSearch**.\n\nMoreover, as we will see in detail later, those EstimatorPools can be shared across MetaModels.\n\nThey are all initialised with these minimum arguments:\n\n```python\nMetaModel(comprehensiveSearch: bool = True, parameterTuning: bool = True, metrics: str = 'accuracy', nSplits: int=10)\n```\n- nSplits corresponds to the number of splits of the cross-validation\n- The other parameters are self-explanatory\n\n\n**They need to be put in the MetaPipeline wrapper to work**", "_____no_output_____" ], [ "**There are 3 MetaModels**\n\n1- BestModel: selects the best-performing model of the pool", "_____no_output_____" ] ], [ [ "best_model = MetaPipeline(BestModel(comprehensiveSearch = False, parameterTuning = False))", "_____no_output_____" ] ], [ [ "2- OneRulerForAll: implements Stacking using a RandomForestClassifier by default.
The user is free to use another classifier via the ruler argument", "_____no_output_____" ] ], [ [ "orfa_model = MetaPipeline(OneRulerForAll(comprehensiveSearch=False, parameterTuning=False))", "_____no_output_____" ] ], [ [ "3- DemocraticModel: implements soft and hard voting models through the voting argument", "_____no_output_____" ] ], [ [ "democratic_model = MetaPipeline(DemocraticModel(comprehensiveSearch=False, parameterTuning=False, voting='soft'))", "_____no_output_____" ] ], [ [ "As of release v0.2.3.2 (13/04/2022) there are 5 models on which these MetaModels rely in the EstimatorPool:\n- Random Forest Classifier\n- Logistic Regression\n- Gaussian Naive Bayes\n- Linear Discriminant Analysis\n- XGBoost\n\n\n***\n\n\nWe'll use the breast_cancer dataset from sklearn as an example:", "_____no_output_____" ] ], [ [ "import pandas as pd\nfrom sklearn.datasets import load_breast_cancer\n\ncancer = load_breast_cancer()\n\nX = pd.DataFrame(data=cancer.data, columns=cancer.feature_names)\ny = cancer.target\n\ncut = int(len(y) * 0.8)\n\nX_train, X_test = X[:cut], X[cut:]\ny_train, y_test = y[:cut], y[cut:]", "_____no_output_____" ] ], [ [ "Let's train a BestModel first and reuse its pool for the other MetaModels:", "_____no_output_____" ] ], [ [ "best_model.fit(X_train,y_train)", "INFO:root:Training models\nINFO:root:The best estimator is random forest classifier with a cross-validation accuracy (in Sample) of 1.0\n" ] ], [ [ "We can now extract the pool:", "_____no_output_____" ] ], [ [ "pool = best_model.get_pool()", "_____no_output_____" ] ], [ [ "And use it when fitting the other MetaModels to skip the fitting of the underlying models:", "_____no_output_____" ] ], [ [ "orfa_model.fit(X_train,y_train,pool=pool)\ndemocratic_model.fit(X_train,y_train,pool=pool)", "INFO:root:Training models...\nINFO:root:Training models...\n" ] ], [ [ "Great!
Let's look at the results with the scikit-learn classification report:", "_____no_output_____" ] ], [ [ "orfa_model.classification_report(X_test,y_test)", "              precision    recall  f1-score   support\n\n           0       0.96      1.00      0.98        26\n           1       1.00      0.99      0.99        88\n\n    accuracy                           0.99       114\n   macro avg       0.98      0.99      0.99       114\nweighted avg       0.99      0.99      0.99       114\n\n" ] ], [ [ "Looking good! What about the ROC curve?", "_____no_output_____" ] ], [ [ "democratic_model.roc_curve(X_test,y_test)", "_____no_output_____" ] ], [ [ "Let's see how the estimators of the pool are doing individually:", "_____no_output_____" ] ], [ [ "best_model.get_scores(X_test,y_test)", "_____no_output_____" ] ], [ [ "## What's next?\n\nYou can do the same steps with comprehensiveSearch set to True if you have the time and if you want to improve your results. You can also try new rulers and so on.", "_____no_output_____" ], [ "Maria, Thomas and Lucas.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ] ]
d061c8abae2ad27910746ac39a22938acf0dd2af
6,121
ipynb
Jupyter Notebook
notebooks/Python3-Language/06-OOP/01-Class Basics.ipynb
binakot/Python3-Course
c555fc7376c45f4b2dedb6d57363c0070831c1e1
[ "MIT" ]
null
null
null
notebooks/Python3-Language/06-OOP/01-Class Basics.ipynb
binakot/Python3-Course
c555fc7376c45f4b2dedb6d57363c0070831c1e1
[ "MIT" ]
null
null
null
notebooks/Python3-Language/06-OOP/01-Class Basics.ipynb
binakot/Python3-Course
c555fc7376c45f4b2dedb6d57363c0070831c1e1
[ "MIT" ]
null
null
null
22.925094
318
0.525404
[ [ [ "#OOP allows developers to create their own objects that have methods and attributes\n#you have already used objects (lists and such)\n#You can represent the whole world with classes, attributes and methods\n#classes can contain data about themselves\n\n#functions alone are not enough to program large programs", "_____no_output_____" ], [ "#also keep in mind that sometimes developers use \"object\" and \"class\" interchangeably", "_____no_output_____" ], [ "numbers = [1,2,3] #create/instantiate a built-in object\ntype(numbers)", "_____no_output_____" ], [ "class Character():\n    pass", "_____no_output_____" ], [ "unit = Character() #create an instance\ntype(unit)", "_____no_output_____" ], [ "#each class has a constructor\n#and by the way, a class can have more than one constructor\n#(explain that many objects have attributes that have to be initialized, \n#\"name\" for a person; \"unique id\" for a credit card, \n#so the constructor is something that makes sure that an object cannot be created in an invalid state)", "_____no_output_____" ], [ "class Character():\n    \n    def __init__(self, race): #self represents an instance\n        self.race = race", "_____no_output_____" ], [ "unit = Character() #the client creates an instance and uses unit to work with the object, while self is used by the class developer (note: this call raises a TypeError now that 'race' is required)", "_____no_output_____" ], [ "unit = Character('Elf')", "_____no_output_____" ], [ "type(unit)", "_____no_output_____" ], [ "unit.race", "_____no_output_____" ], [ "#get back to the class and demonstrate that the \"race\" param can be renamed to \"unit_type\"\n#and say that self.race is an attribute; it's like defining a variable within a class\n#and say that we can give any name for race and then call it by the new name from the client's side\n#then revert the changes and say that this style is a standard", "_____no_output_____" ], [ "class Character():\n    \n    def __init__(self, race, damage=10, armor=20):\n        self.race = race\n        self.damage = damage\n        self.armor = armor", "_____no_output_____" ], [ "unit = Character('Ork',
damage=20, armor=40) #though not obliged to write arg names and even pass anything\nprint(unit.damage)\nprint(unit.armor)", "20\n40\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d061eb51c53566b0ba0e39649cbe53175a51de39
36,626
ipynb
Jupyter Notebook
2_Training.ipynb
siddsrivastava/Image-captionin
683d06c674c737a71957a7f087b726220bce262b
[ "MIT" ]
10
2020-05-11T02:36:36.000Z
2022-03-22T22:51:22.000Z
2_Training.ipynb
srivastava41099/Image-captioning
683d06c674c737a71957a7f087b726220bce262b
[ "MIT" ]
null
null
null
2_Training.ipynb
srivastava41099/Image-captioning
683d06c674c737a71957a7f087b726220bce262b
[ "MIT" ]
3
2021-04-12T10:41:48.000Z
2021-06-18T02:17:53.000Z
61.247492
734
0.616475
[ [ [ "# Computer Vision Nanodegree\n\n## Project: Image Captioning\n\n---\n\nIn this notebook, you will train your CNN-RNN model. \n\nYou are welcome and encouraged to try out many different architectures and hyperparameters when searching for a good model.\n\nThis does have the potential to make the project quite messy! Before submitting your project, make sure that you clean up:\n- the code you write in this notebook. The notebook should describe how to train a single CNN-RNN architecture, corresponding to your final choice of hyperparameters. You should structure the notebook so that the reviewer can replicate your results by running the code in this notebook. \n- the output of the code cell in **Step 2**. The output should show the output obtained when training the model from scratch.\n\nThis notebook **will be graded**. \n\nFeel free to use the links below to navigate the notebook:\n- [Step 1](#step1): Training Setup\n- [Step 2](#step2): Train your Model\n- [Step 3](#step3): (Optional) Validate your Model", "_____no_output_____" ], [ "<a id='step1'></a>\n## Step 1: Training Setup\n\nIn this step of the notebook, you will customize the training of your CNN-RNN model by specifying hyperparameters and setting other options that are important to the training procedure. The values you set now will be used when training your model in **Step 2** below.\n\nYou should only amend blocks of code that are preceded by a `TODO` statement. **Any code blocks that are not preceded by a `TODO` statement should not be modified**.\n\n### Task #1\n\nBegin by setting the following variables:\n- `batch_size` - the batch size of each training batch. It is the number of image-caption pairs used to amend the model weights in each training step. \n- `vocab_threshold` - the minimum word count threshold. Note that a larger threshold will result in a smaller vocabulary, whereas a smaller threshold will include rarer words and result in a larger vocabulary. 
\n- `vocab_from_file` - a Boolean that decides whether to load the vocabulary from file. \n- `embed_size` - the dimensionality of the image and word embeddings. \n- `hidden_size` - the number of features in the hidden state of the RNN decoder. \n- `num_epochs` - the number of epochs to train the model. We recommend that you set `num_epochs=3`, but feel free to increase or decrease this number as you wish. [This paper](https://arxiv.org/pdf/1502.03044.pdf) trained a captioning model on a single state-of-the-art GPU for 3 days, but you'll soon see that you can get reasonable results in a matter of a few hours! (_But of course, if you want your model to compete with current research, you will have to train for much longer._)\n- `save_every` - determines how often to save the model weights. We recommend that you set `save_every=1`, to save the model weights after each epoch. This way, after the `i`th epoch, the encoder and decoder weights will be saved in the `models/` folder as `encoder-i.pkl` and `decoder-i.pkl`, respectively.\n- `print_every` - determines how often to print the batch loss to the Jupyter notebook while training. Note that you **will not** observe a monotonic decrease in the loss function while training - this is perfectly fine and completely expected! You are encouraged to keep this at its default value of `100` to avoid clogging the notebook, but feel free to change it.\n- `log_file` - the name of the text file containing - for every step - how the loss and perplexity evolved during training.\n\nIf you're not sure where to begin to set some of the values above, you can peruse [this paper](https://arxiv.org/pdf/1502.03044.pdf) and [this paper](https://arxiv.org/pdf/1411.4555.pdf) for useful guidance! **To avoid spending too long on this notebook**, you are encouraged to consult these suggested research papers to obtain a strong initial guess for which hyperparameters are likely to work best. 
Then, train a single model, and proceed to the next notebook (**3_Inference.ipynb**). If you are unhappy with your performance, you can return to this notebook to tweak the hyperparameters (and/or the architecture in **model.py**) and re-train your model.\n\n### Question 1\n\n**Question:** Describe your CNN-RNN architecture in detail. With this architecture in mind, how did you select the values of the variables in Task 1? If you consulted a research paper detailing a successful implementation of an image captioning model, please provide the reference.\n\n**Answer:** I used a pretrained Resnet152 network to extract features (deep CNN network). In the literature other architectures like VGG16 are also used, but Resnet152 is claimed to diminish the vanishing gradient problem. I'm using 2 layers of LSTM currently (as training is taking a lot of time); in the future I will explore more layers.\nvocab_threshold is 6; I tried with 9 (meaning fewer elements in the vocab) but the training seemed to converge faster in the case of 6. Many papers suggest taking a batch_size of 64 or 128; I went with 64. embed_size and hidden_size are both 512. I consulted several blogs and famous papers like \"Show, attend and tell - Xu et al\" although I did not use attention currently. \n\n\n### (Optional) Task #2\n\nNote that we have provided a recommended image transform `transform_train` for pre-processing the training images, but you are welcome (and encouraged!) to modify it as you wish. When modifying this transform, keep in mind that:\n- the images in the dataset have varying heights and widths, and \n- if using a pre-trained model, you must perform the corresponding appropriate normalization.\n\n### Question 2\n\n**Question:** How did you select the transform in `transform_train`? If you left the transform at its provided value, why do you think that it is a good choice for your CNN architecture?\n\n**Answer:** The transform value is the same. 
Empirically, these parameter values worked well in my past projects.\n\n### Task #3\n\nNext, you will specify a Python list containing the learnable parameters of the model. For instance, if you decide to make all weights in the decoder trainable, but only want to train the weights in the embedding layer of the encoder, then you should set `params` to something like:\n```\nparams = list(decoder.parameters()) + list(encoder.embed.parameters()) \n```\n\n### Question 3\n\n**Question:** How did you select the trainable parameters of your architecture? Why do you think this is a good choice?\n\n**Answer:** Since the ResNet was pretrained, I trained only the embedding layer and all layers of the decoder. The ResNet is already suited to feature extraction, as it is pretrained; hence only the other parts of the architecture should be trained.\n\n### Task #4\n\nFinally, you will select an [optimizer](http://pytorch.org/docs/master/optim.html#torch.optim.Optimizer).\n\n### Question 4\n\n**Question:** How did you select the optimizer used to train your model?\n\n**Answer:** I used the Adam optimizer, since in my past similar projects it gave me better performance than SGD. 
I have found Adam to perform better than vanilla SGD almost in all cases, aligning with intuition.", "_____no_output_____" ] ], [ [ "import nltk\nnltk.download('punkt')", "[nltk_data] Downloading package punkt to /root/nltk_data...\n[nltk_data] Unzipping tokenizers/punkt.zip.\n" ], [ "\nimport torch\nimport torch.nn as nn\nfrom torchvision import transforms\nimport sys\nsys.path.append('/opt/cocoapi/PythonAPI')\nfrom pycocotools.coco import COCO\nfrom data_loader import get_loader\nfrom model import EncoderCNN, DecoderRNN\nimport math\n\n\n## TODO #1: Select appropriate values for the Python variables below.\nbatch_size = 64 # batch size\nvocab_threshold = 6 # minimum word count threshold\nvocab_from_file = True # if True, load existing vocab file\nembed_size = 512 # dimensionality of image and word embeddings\nhidden_size = 512 # number of features in hidden state of the RNN decoder\nnum_epochs = 3 # number of training epochs\nsave_every = 1 # determines frequency of saving model weights\nprint_every = 100 # determines window for printing average loss\nlog_file = 'training_log.txt' # name of file with saved training loss and perplexity\n\n# (Optional) TODO #2: Amend the image transform below.\ntransform_train = transforms.Compose([ \n transforms.Resize(256), # smaller edge of image resized to 256\n transforms.RandomCrop(224), # get 224x224 crop from random location\n transforms.RandomHorizontalFlip(), # horizontally flip image with probability=0.5\n transforms.ToTensor(), # convert the PIL Image to a tensor\n transforms.Normalize((0.485, 0.456, 0.406), # normalize image for pre-trained model\n (0.229, 0.224, 0.225))])\n\n# Build data loader.\ndata_loader = get_loader(transform=transform_train,\n mode='train',\n batch_size=batch_size,\n vocab_threshold=vocab_threshold,\n vocab_from_file=vocab_from_file)\n\n# The size of the vocabulary.\nvocab_size = len(data_loader.dataset.vocab)\n\n# Initialize the encoder and decoder. 
\nencoder = EncoderCNN(embed_size)\ndecoder = DecoderRNN(embed_size, hidden_size, vocab_size)\n\n# Move models to GPU if CUDA is available. \ndevice = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\nencoder.to(device)\ndecoder.to(device)\n\n# Define the loss function. \ncriterion = nn.CrossEntropyLoss().cuda() if torch.cuda.is_available() else nn.CrossEntropyLoss()\n\n# TODO #3: Specify the learnable parameters of the model.\nparams = list(decoder.parameters()) + list(encoder.embed.parameters())\n\n# TODO #4: Define the optimizer.\noptimizer = torch.optim.Adam(params, lr=0.001, betas=(0.9,0.999), eps=1e-8)\n\n# Set the total number of training steps per epoch.\ntotal_step = math.ceil(len(data_loader.dataset.caption_lengths) / data_loader.batch_sampler.batch_size)", "Vocabulary successfully loaded from vocab.pkl file!\nloading annotations into memory...\nDone (t=1.07s)\ncreating index...\n" ] ], [ [ "<a id='step2'></a>\n## Step 2: Train your Model\n\nOnce you have executed the code cell in **Step 1**, the training procedure below should run without issue. \n\nIt is completely fine to leave the code cell below as-is without modifications to train your model. However, if you would like to modify the code used to train the model below, you must ensure that your changes are easily parsed by your reviewer. In other words, make sure to provide appropriate comments to describe how your code works! \n\nYou may find it useful to load saved weights to resume training. In that case, note the names of the files containing the encoder and decoder weights that you'd like to load (`encoder_file` and `decoder_file`). 
Then you can load the weights by using the lines below:\n\n```python\n# Load pre-trained weights before resuming training.\nencoder.load_state_dict(torch.load(os.path.join('./models', encoder_file)))\ndecoder.load_state_dict(torch.load(os.path.join('./models', decoder_file)))\n```\n\nWhile trying out parameters, make sure to take extensive notes and record the settings that you used in your various training runs. In particular, you don't want to encounter a situation where you've trained a model for several hours but can't remember what settings you used :).\n\n### A Note on Tuning Hyperparameters\n\nTo figure out how well your model is doing, you can look at how the training loss and perplexity evolve during training - and for the purposes of this project, you are encouraged to amend the hyperparameters based on this information. \n\nHowever, this will not tell you if your model is overfitting to the training data, and, unfortunately, overfitting is a problem that is commonly encountered when training image captioning models. \n\nFor this project, you need not worry about overfitting. **This project does not have strict requirements regarding the performance of your model**, and you just need to demonstrate that your model has learned **_something_** when you generate captions on the test data. For now, we strongly encourage you to train your model for the suggested 3 epochs without worrying about performance; then, you should immediately transition to the next notebook in the sequence (**3_Inference.ipynb**) to see how your model performs on the test data. If your model needs to be changed, you can come back to this notebook, amend hyperparameters (if necessary), and re-train the model.\n\nThat said, if you would like to go above and beyond in this project, you can read about some approaches to minimizing overfitting in section 4.3.1 of [this paper](http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=7505636). 
In the next (optional) step of this notebook, we provide some guidance for assessing the performance on the validation dataset.", "_____no_output_____" ] ], [ [ "import torch.utils.data as data\nimport numpy as np\nimport os\nimport requests\nimport time\n\n# Open the training log file.\nf = open(log_file, 'w')\n\nold_time = time.time()\nresponse = requests.request(\"GET\", \n \"http://metadata.google.internal/computeMetadata/v1/instance/attributes/keep_alive_token\", \n headers={\"Metadata-Flavor\":\"Google\"})\n\nfor epoch in range(1, num_epochs+1):\n \n for i_step in range(1, total_step+1):\n \n if time.time() - old_time > 60:\n old_time = time.time()\n requests.request(\"POST\", \n \"https://nebula.udacity.com/api/v1/remote/keep-alive\", \n headers={'Authorization': \"STAR \" + response.text})\n \n # Randomly sample a caption length, and sample indices with that length.\n indices = data_loader.dataset.get_train_indices()\n # Create and assign a batch sampler to retrieve a batch with the sampled indices.\n new_sampler = data.sampler.SubsetRandomSampler(indices=indices)\n data_loader.batch_sampler.sampler = new_sampler\n \n # Obtain the batch.\n images, captions = next(iter(data_loader))\n\n # Move batch of images and captions to GPU if CUDA is available.\n images = images.to(device)\n captions = captions.to(device)\n \n # Zero the gradients.\n decoder.zero_grad()\n encoder.zero_grad()\n \n # Pass the inputs through the CNN-RNN model.\n features = encoder(images)\n outputs = decoder(features, captions)\n \n # Calculate the batch loss.\n loss = criterion(outputs.view(-1, vocab_size), captions.view(-1))\n \n # Backward pass.\n loss.backward()\n \n # Update the parameters in the optimizer.\n optimizer.step()\n \n # Get training statistics.\n stats = 'Epoch [%d/%d], Step [%d/%d], Loss: %.4f, Perplexity: %5.4f' % (epoch, num_epochs, i_step, total_step, loss.item(), np.exp(loss.item()))\n \n # Print training statistics (on same line).\n print('\\r' + stats, end=\"\")\n 
sys.stdout.flush()\n \n # Print training statistics to file.\n f.write(stats + '\\n')\n f.flush()\n \n # Print training statistics (on different line).\n if i_step % print_every == 0:\n print('\\r' + stats)\n \n # Save the weights.\n if epoch % save_every == 0:\n torch.save(decoder.state_dict(), os.path.join('./models', 'decoder-%d.pkl' % epoch))\n torch.save(encoder.state_dict(), os.path.join('./models', 'encoder-%d.pkl' % epoch))\n\n# Close the training log file.\nf.close()", "Epoch [1/3], Step [100/6471], Loss: 4.2137, Perplexity: 67.6088\nEpoch [1/3], Step [200/6471], Loss: 3.9313, Perplexity: 50.97528\nEpoch [1/3], Step [300/6471], Loss: 3.5978, Perplexity: 36.5175\nEpoch [1/3], Step [400/6471], Loss: 3.6794, Perplexity: 39.6219\nEpoch [1/3], Step [500/6471], Loss: 3.0714, Perplexity: 21.5712\nEpoch [1/3], Step [600/6471], Loss: 3.2012, Perplexity: 24.5617\nEpoch [1/3], Step [700/6471], Loss: 3.2718, Perplexity: 26.35966\nEpoch [1/3], Step [800/6471], Loss: 3.3748, Perplexity: 29.2185\nEpoch [1/3], Step [900/6471], Loss: 3.1745, Perplexity: 23.9146\nEpoch [1/3], Step [1000/6471], Loss: 3.2627, Perplexity: 26.1206\nEpoch [1/3], Step [1100/6471], Loss: 2.8865, Perplexity: 17.9312\nEpoch [1/3], Step [1200/6471], Loss: 2.9421, Perplexity: 18.9562\nEpoch [1/3], Step [1300/6471], Loss: 2.7139, Perplexity: 15.0875\nEpoch [1/3], Step [1400/6471], Loss: 2.6474, Perplexity: 14.1176\nEpoch [1/3], Step [1500/6471], Loss: 2.6901, Perplexity: 14.7331\nEpoch [1/3], Step [1600/6471], Loss: 2.6551, Perplexity: 14.2267\nEpoch [1/3], Step [1700/6471], Loss: 2.9028, Perplexity: 18.2242\nEpoch [1/3], Step [1800/6471], Loss: 2.5633, Perplexity: 12.9791\nEpoch [1/3], Step [1900/6471], Loss: 2.7250, Perplexity: 15.2564\nEpoch [1/3], Step [2000/6471], Loss: 2.5907, Perplexity: 13.3396\nEpoch [1/3], Step [2100/6471], Loss: 2.7079, Perplexity: 14.9985\nEpoch [1/3], Step [2200/6471], Loss: 2.5242, Perplexity: 12.4809\nEpoch [1/3], Step [2300/6471], Loss: 2.5016, Perplexity: 
12.2019\nEpoch [1/3], Step [2400/6471], Loss: 2.6168, Perplexity: 13.6915\nEpoch [1/3], Step [2500/6471], Loss: 2.6548, Perplexity: 14.2225\nEpoch [1/3], Step [2600/6471], Loss: 2.4738, Perplexity: 11.8673\nEpoch [1/3], Step [2700/6471], Loss: 2.4797, Perplexity: 11.9380\nEpoch [1/3], Step [2800/6471], Loss: 2.6574, Perplexity: 14.2598\nEpoch [1/3], Step [2900/6471], Loss: 2.3054, Perplexity: 10.0281\nEpoch [1/3], Step [3000/6471], Loss: 2.5392, Perplexity: 12.6694\nEpoch [1/3], Step [3100/6471], Loss: 2.6166, Perplexity: 13.6890\nEpoch [1/3], Step [3200/6471], Loss: 2.2275, Perplexity: 9.27642\nEpoch [1/3], Step [3300/6471], Loss: 2.5271, Perplexity: 12.5177\nEpoch [1/3], Step [3400/6471], Loss: 2.3050, Perplexity: 10.0246\nEpoch [1/3], Step [3500/6471], Loss: 2.0236, Perplexity: 7.56542\nEpoch [1/3], Step [3600/6471], Loss: 2.1614, Perplexity: 8.68294\nEpoch [1/3], Step [3700/6471], Loss: 2.3635, Perplexity: 10.6284\nEpoch [1/3], Step [3800/6471], Loss: 2.3958, Perplexity: 10.9773\nEpoch [1/3], Step [3900/6471], Loss: 2.1591, Perplexity: 8.66344\nEpoch [1/3], Step [4000/6471], Loss: 2.3267, Perplexity: 10.2446\nEpoch [1/3], Step [4100/6471], Loss: 3.1127, Perplexity: 22.4825\nEpoch [1/3], Step [4200/6471], Loss: 2.3359, Perplexity: 10.3392\nEpoch [1/3], Step [4300/6471], Loss: 2.3215, Perplexity: 10.1912\nEpoch [1/3], Step [4400/6471], Loss: 2.2369, Perplexity: 9.36462\nEpoch [1/3], Step [4500/6471], Loss: 2.2770, Perplexity: 9.74746\nEpoch [1/3], Step [4600/6471], Loss: 2.2351, Perplexity: 9.34757\nEpoch [1/3], Step [4700/6471], Loss: 2.2890, Perplexity: 9.86499\nEpoch [1/3], Step [4800/6471], Loss: 2.2736, Perplexity: 9.713991\nEpoch [1/3], Step [4900/6471], Loss: 2.5273, Perplexity: 12.5202\nEpoch [1/3], Step [5000/6471], Loss: 2.1436, Perplexity: 8.52971\nEpoch [1/3], Step [5100/6471], Loss: 2.2414, Perplexity: 9.40672\nEpoch [1/3], Step [5200/6471], Loss: 2.3917, Perplexity: 10.9318\nEpoch [1/3], Step [5300/6471], Loss: 2.2926, Perplexity: 9.90097\nEpoch 
[1/3], Step [5400/6471], Loss: 2.0861, Perplexity: 8.05366\nEpoch [1/3], Step [5500/6471], Loss: 2.0797, Perplexity: 8.00241\nEpoch [1/3], Step [5600/6471], Loss: 2.5135, Perplexity: 12.3480\nEpoch [1/3], Step [5700/6471], Loss: 2.0843, Perplexity: 8.03936\nEpoch [1/3], Step [5800/6471], Loss: 2.4332, Perplexity: 11.3950\nEpoch [1/3], Step [5900/6471], Loss: 2.0920, Perplexity: 8.10140\nEpoch [1/3], Step [6000/6471], Loss: 2.3367, Perplexity: 10.3468\nEpoch [1/3], Step [6100/6471], Loss: 2.9598, Perplexity: 19.2937\nEpoch [1/3], Step [6200/6471], Loss: 2.0285, Perplexity: 7.60297\nEpoch [1/3], Step [6300/6471], Loss: 2.6213, Perplexity: 13.7538\nEpoch [1/3], Step [6400/6471], Loss: 2.0924, Perplexity: 8.10440\nEpoch [2/3], Step [100/6471], Loss: 2.1729, Perplexity: 8.783715\nEpoch [2/3], Step [200/6471], Loss: 2.1168, Perplexity: 8.30481\nEpoch [2/3], Step [300/6471], Loss: 2.2427, Perplexity: 9.41848\nEpoch [2/3], Step [400/6471], Loss: 2.5073, Perplexity: 12.2721\nEpoch [2/3], Step [500/6471], Loss: 2.1942, Perplexity: 8.97323\nEpoch [2/3], Step [600/6471], Loss: 2.2852, Perplexity: 9.82738\nEpoch [2/3], Step [700/6471], Loss: 2.0216, Perplexity: 7.55076\nEpoch [2/3], Step [800/6471], Loss: 2.0080, Perplexity: 7.44841\nEpoch [2/3], Step [900/6471], Loss: 2.6213, Perplexity: 13.7540\nEpoch [2/3], Step [1000/6471], Loss: 2.2098, Perplexity: 9.1141\nEpoch [2/3], Step [1100/6471], Loss: 2.3376, Perplexity: 10.3568\nEpoch [2/3], Step [1200/6471], Loss: 2.1687, Perplexity: 8.74662\nEpoch [2/3], Step [1300/6471], Loss: 2.4215, Perplexity: 11.2623\nEpoch [2/3], Step [1400/6471], Loss: 2.2622, Perplexity: 9.60387\nEpoch [2/3], Step [1500/6471], Loss: 2.0793, Perplexity: 7.99915\nEpoch [2/3], Step [1600/6471], Loss: 3.0006, Perplexity: 20.0976\nEpoch [2/3], Step [1700/6471], Loss: 2.1184, Perplexity: 8.31816\nEpoch [2/3], Step [1800/6471], Loss: 2.0555, Perplexity: 7.81114\nEpoch [2/3], Step [1900/6471], Loss: 2.4132, Perplexity: 11.1696\nEpoch [2/3], Step [2000/6471], 
Loss: 2.4320, Perplexity: 11.3817\nEpoch [2/3], Step [2100/6471], Loss: 2.6297, Perplexity: 13.8692\nEpoch [2/3], Step [2200/6471], Loss: 2.2170, Perplexity: 9.18001\nEpoch [2/3], Step [2300/6471], Loss: 2.1038, Perplexity: 8.19712\nEpoch [2/3], Step [2400/6471], Loss: 2.0491, Perplexity: 7.76052\nEpoch [2/3], Step [2500/6471], Loss: 1.9645, Perplexity: 7.13170\nEpoch [2/3], Step [2600/6471], Loss: 2.3801, Perplexity: 10.8063\nEpoch [2/3], Step [2700/6471], Loss: 2.3220, Perplexity: 10.1963\nEpoch [2/3], Step [2800/6471], Loss: 2.0542, Perplexity: 7.80050\nEpoch [2/3], Step [2900/6471], Loss: 1.9378, Perplexity: 6.94348\nEpoch [2/3], Step [3000/6471], Loss: 1.9138, Perplexity: 6.77860\nEpoch [2/3], Step [3100/6471], Loss: 2.2314, Perplexity: 9.31325\nEpoch [2/3], Step [3200/6471], Loss: 2.1790, Perplexity: 8.83758\nEpoch [2/3], Step [3300/6471], Loss: 2.7974, Perplexity: 16.4013\nEpoch [2/3], Step [3400/6471], Loss: 2.2902, Perplexity: 9.87657\nEpoch [2/3], Step [3500/6471], Loss: 2.0739, Perplexity: 7.95541\nEpoch [2/3], Step [3600/6471], Loss: 2.4700, Perplexity: 11.8226\nEpoch [2/3], Step [3700/6471], Loss: 2.0761, Perplexity: 7.97370\nEpoch [2/3], Step [3800/6471], Loss: 2.0085, Perplexity: 7.45224\nEpoch [2/3], Step [3900/6471], Loss: 2.0280, Perplexity: 7.59929\nEpoch [2/3], Step [4000/6471], Loss: 2.0487, Perplexity: 7.75750\nEpoch [2/3], Step [4100/6471], Loss: 2.0105, Perplexity: 7.46732\nEpoch [2/3], Step [4200/6471], Loss: 2.3099, Perplexity: 10.0733\nEpoch [2/3], Step [4300/6471], Loss: 1.8471, Perplexity: 6.34158\nEpoch [2/3], Step [4400/6471], Loss: 1.9144, Perplexity: 6.78305\nEpoch [2/3], Step [4500/6471], Loss: 2.3026, Perplexity: 10.0001\nEpoch [2/3], Step [4600/6471], Loss: 2.0366, Perplexity: 7.66411\nEpoch [2/3], Step [4700/6471], Loss: 2.4918, Perplexity: 12.0830\nEpoch [2/3], Step [4800/6471], Loss: 2.0035, Perplexity: 7.41520\nEpoch [2/3], Step [4900/6471], Loss: 2.0007, Perplexity: 7.39395\nEpoch [2/3], Step [5000/6471], Loss: 2.0057, 
Perplexity: 7.43157\nEpoch [2/3], Step [5100/6471], Loss: 2.0654, Perplexity: 7.88811\nEpoch [2/3], Step [5200/6471], Loss: 1.8834, Perplexity: 6.57597\nEpoch [2/3], Step [5300/6471], Loss: 1.9578, Perplexity: 7.08400\nEpoch [2/3], Step [5400/6471], Loss: 2.1135, Perplexity: 8.27759\nEpoch [2/3], Step [5500/6471], Loss: 1.9813, Perplexity: 7.25206\nEpoch [2/3], Step [5600/6471], Loss: 2.1926, Perplexity: 8.95865\nEpoch [2/3], Step [5700/6471], Loss: 2.2927, Perplexity: 9.90207\nEpoch [2/3], Step [5800/6471], Loss: 2.3188, Perplexity: 10.1636\nEpoch [2/3], Step [5900/6471], Loss: 1.9937, Perplexity: 7.34238\nEpoch [2/3], Step [6000/6471], Loss: 1.8804, Perplexity: 6.55632\nEpoch [2/3], Step [6100/6471], Loss: 1.8708, Perplexity: 6.49346\nEpoch [2/3], Step [6200/6471], Loss: 1.9785, Perplexity: 7.23204\nEpoch [2/3], Step [6300/6471], Loss: 2.1267, Perplexity: 8.38739\nEpoch [2/3], Step [6400/6471], Loss: 1.8215, Perplexity: 6.18116\nEpoch [3/3], Step [100/6471], Loss: 1.9881, Perplexity: 7.301406\nEpoch [3/3], Step [200/6471], Loss: 2.2102, Perplexity: 9.11727\nEpoch [3/3], Step [300/6471], Loss: 1.9104, Perplexity: 6.75575\nEpoch [3/3], Step [400/6471], Loss: 1.8180, Perplexity: 6.15938\nEpoch [3/3], Step [500/6471], Loss: 2.5038, Perplexity: 12.2288\nEpoch [3/3], Step [600/6471], Loss: 2.0724, Perplexity: 7.94375\nEpoch [3/3], Step [700/6471], Loss: 2.0264, Perplexity: 7.58681\nEpoch [3/3], Step [800/6471], Loss: 1.9343, Perplexity: 6.91936\nEpoch [3/3], Step [900/6471], Loss: 1.9347, Perplexity: 6.92228\nEpoch [3/3], Step [1000/6471], Loss: 2.6768, Perplexity: 14.5382\nEpoch [3/3], Step [1100/6471], Loss: 2.1302, Perplexity: 8.41696\nEpoch [3/3], Step [1200/6471], Loss: 1.9754, Perplexity: 7.20958\nEpoch [3/3], Step [1300/6471], Loss: 2.0288, Perplexity: 7.60478\nEpoch [3/3], Step [1400/6471], Loss: 2.1273, Perplexity: 8.39242\nEpoch [3/3], Step [1500/6471], Loss: 2.6294, Perplexity: 13.8661\nEpoch [3/3], Step [1600/6471], Loss: 2.6716, Perplexity: 14.4634\nEpoch 
[3/3], Step [1700/6471], Loss: 1.8720, Perplexity: 6.50130\nEpoch [3/3], Step [1800/6471], Loss: 2.3521, Perplexity: 10.5080\nEpoch [3/3], Step [1900/6471], Loss: 2.0034, Perplexity: 7.41405\nEpoch [3/3], Step [2000/6471], Loss: 2.0006, Perplexity: 7.39337\nEpoch [3/3], Step [2100/6471], Loss: 2.0902, Perplexity: 8.08620\nEpoch [3/3], Step [2200/6471], Loss: 3.3483, Perplexity: 28.4533\nEpoch [3/3], Step [2300/6471], Loss: 2.0799, Perplexity: 8.00390\nEpoch [3/3], Step [2400/6471], Loss: 2.1215, Perplexity: 8.34411\nEpoch [3/3], Step [2500/6471], Loss: 1.9870, Perplexity: 7.29389\nEpoch [3/3], Step [2600/6471], Loss: 2.1111, Perplexity: 8.25726\nEpoch [3/3], Step [2700/6471], Loss: 1.8926, Perplexity: 6.63631\nEpoch [3/3], Step [2800/6471], Loss: 2.0022, Perplexity: 7.40557\nEpoch [3/3], Step [2900/6471], Loss: 1.9249, Perplexity: 6.85467\nEpoch [3/3], Step [3000/6471], Loss: 1.8835, Perplexity: 6.57626\nEpoch [3/3], Step [3100/6471], Loss: 2.0569, Perplexity: 7.82189\nEpoch [3/3], Step [3200/6471], Loss: 1.8780, Perplexity: 6.54040\nEpoch [3/3], Step [3300/6471], Loss: 2.3703, Perplexity: 10.7010\nEpoch [3/3], Step [3400/6471], Loss: 1.9703, Perplexity: 7.17267\nEpoch [3/3], Step [3500/6471], Loss: 1.9115, Perplexity: 6.76300\nEpoch [3/3], Step [3600/6471], Loss: 2.2174, Perplexity: 9.18364\nEpoch [3/3], Step [3700/6471], Loss: 2.4291, Perplexity: 11.3490\nEpoch [3/3], Step [3800/6471], Loss: 2.3135, Perplexity: 10.1093\nEpoch [3/3], Step [3900/6471], Loss: 1.9082, Perplexity: 6.74124\nEpoch [3/3], Step [4000/6471], Loss: 1.9494, Perplexity: 7.02424\nEpoch [3/3], Step [4100/6471], Loss: 1.8795, Perplexity: 6.55057\nEpoch [3/3], Step [4200/6471], Loss: 2.0943, Perplexity: 8.12024\nEpoch [3/3], Step [4300/6471], Loss: 1.9174, Perplexity: 6.80361\nEpoch [3/3], Step [4400/6471], Loss: 1.8159, Perplexity: 6.14634\nEpoch [3/3], Step [4500/6471], Loss: 2.1579, Perplexity: 8.65335\nEpoch [3/3], Step [4600/6471], Loss: 2.0022, Perplexity: 7.40562\nEpoch [3/3], Step 
[4700/6471], Loss: 2.0300, Perplexity: 7.61381\nEpoch [3/3], Step [4800/6471], Loss: 1.9009, Perplexity: 6.69223\nEpoch [3/3], Step [4900/6471], Loss: 2.4837, Perplexity: 11.9857\nEpoch [3/3], Step [5000/6471], Loss: 2.0528, Perplexity: 7.79005\nEpoch [3/3], Step [5100/6471], Loss: 1.9514, Perplexity: 7.03869\nEpoch [3/3], Step [5200/6471], Loss: 1.8162, Perplexity: 6.14836\nEpoch [3/3], Step [5300/6471], Loss: 2.0564, Perplexity: 7.81761\nEpoch [3/3], Step [5400/6471], Loss: 1.8345, Perplexity: 6.26224\nEpoch [3/3], Step [5500/6471], Loss: 2.2075, Perplexity: 9.09278\nEpoch [3/3], Step [5600/6471], Loss: 1.8813, Perplexity: 6.56204\nEpoch [3/3], Step [5700/6471], Loss: 1.8286, Perplexity: 6.22503\nEpoch [3/3], Step [5800/6471], Loss: 1.8301, Perplexity: 6.23444\nEpoch [3/3], Step [5900/6471], Loss: 1.9318, Perplexity: 6.90176\nEpoch [3/3], Step [6000/6471], Loss: 1.9549, Perplexity: 7.06348\nEpoch [3/3], Step [6100/6471], Loss: 1.9326, Perplexity: 6.90775\nEpoch [3/3], Step [6200/6471], Loss: 2.0268, Perplexity: 7.58943\nEpoch [3/3], Step [6300/6471], Loss: 1.8465, Perplexity: 6.33754\nEpoch [3/3], Step [6400/6471], Loss: 1.9052, Perplexity: 6.72096\nEpoch [3/3], Step [6471/6471], Loss: 2.0248, Perplexity: 7.57506" ] ], [ [ "<a id='step3'></a>\n## Step 3: (Optional) Validate your Model\n\nTo assess potential overfitting, one approach is to assess performance on a validation set. If you decide to do this **optional** task, you are required to first complete all of the steps in the next notebook in the sequence (**3_Inference.ipynb**); as part of that notebook, you will write and test code (specifically, the `sample` method in the `DecoderRNN` class) that uses your RNN decoder to generate captions. That code will prove incredibly useful here. \n\nIf you decide to validate your model, please do not edit the data loader in **data_loader.py**. 
Instead, create a new file named **data_loader_val.py** containing the code for obtaining the data loader for the validation data. You can access:\n- the validation images at filepath `'/opt/cocoapi/images/train2014/'`, and\n- the validation image caption annotation file at filepath `'/opt/cocoapi/annotations/captions_val2014.json'`.\n\nThe suggested approach to validating your model involves creating a json file such as [this one](https://github.com/cocodataset/cocoapi/blob/master/results/captions_val2014_fakecap_results.json) containing your model's predicted captions for the validation images. Then, you can write your own script or use one that you [find online](https://github.com/tylin/coco-caption) to calculate the BLEU score of your model. You can read more about the BLEU score, along with other evaluation metrics (such as TEOR and Cider) in section 4.1 of [this paper](https://arxiv.org/pdf/1411.4555.pdf). For more information about how to use the annotation file, check out the [website](http://cocodataset.org/#download) for the COCO dataset.", "_____no_output_____" ] ], [ [ "# (Optional) TODO: Validate your model.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d062038e41a45a23deb4f42e109f95ea47b719ff
658,049
ipynb
Jupyter Notebook
mark/export2db.ipynb
shiniao/baozheng
c1ec89e5ce2395abad089f1e7b92c3b31842e9f8
[ "MIT" ]
1
2021-02-19T14:51:43.000Z
2021-02-19T14:51:43.000Z
mark/export2db.ipynb
shiniao/baozheng
c1ec89e5ce2395abad089f1e7b92c3b31842e9f8
[ "MIT" ]
null
null
null
mark/export2db.ipynb
shiniao/baozheng
c1ec89e5ce2395abad089f1e7b92c3b31842e9f8
[ "MIT" ]
null
null
null
61.534412
1,500
0.41543
[ [ [ "import pandas as pd\nimport mysql.connector\nfrom sqlalchemy import create_engine", "_____no_output_____" ], [ "df = pd.read_csv('/Users/zhezhezhu/.bz_datasets/datasets_shiniao_create_dataset_test.txt',\n sep='\\t')\ndf", "_____no_output_____" ], [ "df.memory_usage(index=False, deep=True).sum()\n\nMYSQL_USER \t\t= 'root'\nMYSQL_PASSWORD \t= '19(zhezhezhu)95'\nMYSQL_HOST_IP \t= '127.0.0.1'\nMYSQL_PORT\t\t= \"3306\"\nMYSQL_DATABASE\t= 'datasets'\nengine = create_engine('mysql+mysqlconnector://'+MYSQL_USER+':'+MYSQL_PASSWORD+'@'+MYSQL_HOST_IP+':'+MYSQL_PORT+'/'+MYSQL_DATABASE, echo=False)\n", "_____no_output_____" ], [ "idx = 1\nfor df in pd.read_csv('/Users/zhezhezhu/projects/data_source/spam_message/spam_message.txt',\n names=[\"classify\",\"content\"],\n chunksize=500,\n sep='\\t'):\n df.to_sql(name=\"spam_message\", con=engine, if_exists='replace', chunksize=500)\n print(str(500 * (idx+1))+\" Processed\")", " classify content\n0 0 ๅ•†ไธš็ง˜ๅฏ†็š„็ง˜ๅฏ†ๆ€ง้‚ฃๆ˜ฏ็ปด็ณปๅ…ถๅ•†ไธšไปทๅ€ผๅ’Œๅž„ๆ–ญๅœฐไฝ็š„ๅ‰ๆๆกไปถไน‹ไธ€\n1 1 ๅ—ๅฃ้˜ฟ็Ž›ๆ–ฝๆ–ฐๆ˜ฅ็ฌฌไธ€ๆ‰น้™้‡ๆ˜ฅ่ฃ…ๅˆฐๅบ—ๅ•ฆ๎Œ ๎Œ ๎Œ ๆ˜ฅๆš–่Šฑๅผ€ๆท‘ๅฅณ่ฃ™ใ€ๅ†ฐ่“่‰ฒๅ…ฌไธป่กซ๎€† ...\n2 0 ๅธฆ็ป™ๆˆ‘ไปฌๅคงๅธธๅทžไธ€ๅœบๅฃฎ่ง‚็š„่ง†่ง‰็››ๅฎด\n3 0 ๆœ‰ๅŽŸๅ› ไธๆ˜Ž็š„ๆณŒๅฐฟ็ณป็ปŸ็ป“็Ÿณ็ญ‰\n4 0 23ๅนดไปŽ็›ๅŸŽๆ‹‰ๅ›žๆฅ็š„้บป้บป็š„ๅซๅฆ†\n1000 Processed\n classify content\n500 0 ่ƒก่ๅœ็ด ๅขžๅŠ 3ๅ€ใ€็ปด็”Ÿ็ด Bl2ๅขžๅŠ 4ๅ€ใ€็ปด็”Ÿ็ด CๅขžๅŠ 4\n501 0 ้ซ˜็ฎก้ƒฝ้œ€่ฆKPIๅฐฑๆฒก่ต„ๆ ผๅš้ซ˜็ฎก\n502 0 ๆŠคๅฃซไธ€ๆฃ€ๆŸฅๆƒŠๅ‘ผๆ€Žไนˆ็‰™้ƒฝ่›€ๆˆ่ฟ™ๆ ทไบ†\n503 1 x.x-x.xๆฅๅผ ๅฎถ่พน่‹ๅฎ๏ผๆŠข็พŽ็š„็ฉบ่ฐƒ๏ผ ้ข„ๅญ˜xxๅ…ƒ๏ผšๆœ€ไฝŽ=xxxๅ…ƒ๏ผŒๆœ€้ซ˜=xxxxๅ…ƒ๏ผ้ข„็บฆ...\n504 0 ็ซ็ฎญไผ‘่ต›ๆœŸๆ€ป็ป“๏ผšๅฏๅ†ฒๅ‡ป่ฅฟ้ƒจๅ† ๅ†›ๆƒงๆ€•ไธ‰ๅŠฒๆ•Œ\n1000 Processed\n classify content\n1000 1 ๅŠฒๆพ็”ตๅ™จ้ŸฉไฟŠๅญฆ้‡ๅ›žไธŠ็ญ๏ผŒๆ„Ÿ่ฐขๆœ‹ๅ‹้ข†ๅฏผ็š„ๆ”ฏๆŒ๏ผŒ็ฉบ่ฐƒๆ‰นๅ‘ไปทไพ›ๅบ”xxxๅฅ—๏ผŒx.xxๅŒน็ฉบ่ฐƒxxxxๅ…ƒ...\n1001 0 ๅฆ‚ๆžœ่ฏด่ฎคไบ†ไบฒๆˆšๅฐฑไปฃ่กจ็€่…่ดฅ\n1002 0 /็”ทๅญๅœจๆดพๅ‡บๆ‰€ๆ—ๅผบๅฅธๅฅณๅญฉ่ขซๆ•ๆ—ถ่ฃคๅญ่ฟ˜ๆฒกๆ‹‰ไธŠ\n1003 0 
15ๅฒไปฅไธŠๆฏๆ—ฅ1ๆฌก2็ฒ’้šๆ—ถๆœ็”จ\n1004 0 ๆŠ•่ต„่€…ไพ็„ถๅœจๅๅ‘ไบŽๆๆƒงไธ€ไพง\n1000 Processed\n classify content\n1500 1 ๆŠค่‚คๅ“xๆŠ˜ ๆด—ๅคดๆฐดxๆŠ˜๏ผŒๆปกxxxๅ…ƒ้€xxๅ…ƒ่ถ…ๅธ‚ๆŠต็”จๅท.xๅทไธ€xๅท็Ž‹ๅบœไบ•่ถ…ๅธ‚ใ€‚\n1501 0 ๅƒๅฒ›ๆน–ไฝไบŽๆต™ๆฑŸ็œๆญๅทžๅธ‚ๆทณๅฎ‰ๅŽฟๅขƒๅ†…\n1502 0 ๅคๅญฃ่กขๅทž็š„ๅคฉไบฎ็š„ๆ—ฉ็ปˆไบŽๅคฉไบฎไบ†็ญ‰ไธ‹ๅ›žๅฎถ่ดตๅทž้ป”่ฅฟๅ—~ๆต™ๆฑŸไธฝๆฐดๅธŒๆœ›ๆˆ‘ไน‰ๆ— ๅ้กพ้ฉฌไธๅœ่น„ๅƒ้‡Œๅ›žๅฝ’่ƒฝ่ตถไธŠ...\n1503 0 ๆน–ๅ—้ซ˜้€Ÿ็ƒ‚ๅฐพๆถ‰ๅกŒๆ–นๅผ่…่ดฅๆกˆ\n1504 0 05ไบฟ็พŽๅ…ƒ็š„ไปทๆ ผๅ‘ๆ™บๅˆฉๅฎ‰ๆ‰˜ๆณ•ๅŠ ๆ–ฏๅก”้›†ๅ›ขๅ‡บๅ”ฎ่จๅฐ”่ฟช็“ฆ้“œ็Ÿฟ50%่‚กๆƒ\n1000 Processed\n classify content\n2000 0 ๆฏๅคฉๆ™šไธŠๅˆฐไบ†11็‚นๅทฆๅณๅฎถ้‡Œ็ฝ‘้€Ÿๆ…ขๅพ—ๆˆ‘่ฆ็ ธๆ‰‹ๆœบ\n2001 0 ไปŠๅนดไธปๆ‰“็š„้˜ฒๆ™’ๅ–ท้›พ้Ÿฉๅ›ฝ่ถ…็ซ็š„Re๏ผšcipeๆฐดๆ™ถ้˜ฒๆ™’็ฅžๅ™จๆฐดๆ™ถๅ–ท้›พ\n2002 1 ไธบ็ญ”่ฐขๆ–ฐ่€ๅฎขๆˆทๅฏนๆˆ‘ๅบ—็š„ๅคงๅŠ›ๆ”ฏๆŒ๏ผŒๆญฃๆœˆๅไบ”ๅˆฐๆœฌๅบ—ๅŠž็†ๅฎฝๅธฆไธšๅŠกๅ‚ๅŠ ๆดปๅŠจๆœบๅž‹่€…ๅฏๅ‡ญๆญค็Ÿญไฟกไผ˜ๆƒ xxๅ…ƒ...\n2003 0 ไธบ้‡‘ๅถๆฐด็ฎฑ็ญ‰4ๅฎถไผไธšไธŠๆŠฅๅพ…ๆ‰น่ฐƒๆ•ด็ฝฎๆขๅœŸๅœฐ46\n2004 0 12342ๆ็คบๅ‘12345ๆŠ•่ฏ‰\n1000 Processed\n classify content\n2500 1 Duang...?้”ฆๅ…ฐๆ‘ไธบๅบ†็ฅx.xๅฆ‡ๅฅณ่Š‚๏ผŒ็‰นๅˆซๆŽจๅ‡บไผ˜ๆƒ ๅ……ๅ€ผๆดปๅŠจ๏ผŒๅ……็š„่ถŠๅคš้€็š„่ถŠๅคš๏ผŒๅ้ขๆœ‰้™...\n2501 1 ไธ‰่ฟชๅŽ็พŽ่พพๅนฟๅœบ้…’ๅบ—ไบŒๆฅผ้‡‘ๆฑคSPAไผšๆ‰€ ๏ผŒๆบๆ‰‹ๅ…จไฝ“ๅ‘˜ๅทฅๅœจๅ…ƒๅฎตไฝณ่Š‚ๅˆฐๆฅไน‹้™…๏ผŒ็ฅๆ‚จๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒๅˆๅฎถ...\n2502 0 โ€”โ€”ใ€Œ็ฌฌไธ‰ๆ–นๆ”ฏไป˜ๆˆ–ๅคฑP2Pๆฒ‰ๆท€่ต„้‡‘้“ถ่กŒๆŠข้’ฑๆŠขๅœฐ็›˜\n2503 0 ไธบไฝ•โ€œไธญๅ›ฝๅฅฝๅฃฐ้Ÿณโ€็š„้€‰ๆ‰‹ๆ•ดไฝ“ไธๅฆ‚โ€œ่ถ…ๅฅณๅฟซ็”ทโ€็บข\n2504 1 ๆต ๆฐด*ๆ–ฐๅ†œ้”€ๅ”ฎ้ƒจๆญ็ฅๆ–ฐ่€ๅฎขๆˆทๅ…ƒๅฎต่Š‚ๅฟซไน๏ผๆ‰“้€ ๅ›ฝๅ†…้ฆ–ๅฎถ็”ตๅ•†ๅ†œ่ดธๅธ‚ๅœบ๏ผ็งŸ้‡‘ๆŠตๆœˆไพ›๏ผŒ่ฝปๆพๅšๆˆฟไธœ๏ผไธ...\n1000 Processed\n classify content\n3000 0 ใ€Œxxไธ‡็ƒญๆ’ญ่ง†้ข‘ใ€่ฟฝๆ˜Ÿ่ฟ™ไปถๅฐไบ‹\n3001 0 18ๅฒๅฐไผ™\"่‡ชๅญฆ\"ๆˆ้ป‘ๅฎข็›—ๅˆท160ไธ‡ไบบไฟก็”จๅก\n3002 0 ๅ’จ่ฏขๅฟƒๆˆฟ็š„ไบบ่ถŠๆฅ่ถŠๅคšไบ†โ€ฆๅคงๅฎถ่ฟ‡ๆฅ็›ธ็Ÿฅ็›ธ่ฏ†\n3003 0 ๅฐ็พŠ่‚–ๆฉ้ซ˜ๆธ…็™พๅบฆ็ฝ‘็›˜ไธ‹่ฝฝ้“พๆŽฅ\n3004 0 ๆˆ‘ๅฏไปฅๅ…่ดน่ฝฌ่ต ่ฟ™2ไธ‡ๅ…ƒ็š„ๅฅ–ๅญฆ้‡‘\n1000 Processed\n classify content\n3500 0 
็ฟ้ธฟ่™ฝ็„ถๆฒกๅฏผ่‡ด่ฟžไบ‘ๆธฏ็š„้˜ด้›จ\n3501 1 ๅงš่ดๅจœๅ› ไนณ่…บ็™Œ่ตฐไบ†๏ผŒ่ฟ™ๆฌพไบงๅ“ๅด็ซไบ†๏ผŒๆฏๅคฉๅช้œ€x.xxๅ…ƒ๏ผŒไธ€ๅนดxxxๅ…ƒ๏ผŒๅฐฑๅฏไปฅๆฏไธ€ไธชๅฅณๆ€ง่Žทๅพ—x...\n3502 1 ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตๅŽๅณฐ็”ณ้“ถ--ๆ‚จ่บซ่พน็š„่ดทๆฌพๆ‹…ไฟๆ•ดไฝ“ๆ–นๆกˆ่งฃๅ†ณไธ“ๅฎถ๏ผŒๆˆ‘ไปฌๅฐ†ๅง‹็ปˆไธบๆ‚จๆไพ›ไธ“ไธšใ€ไพฟๆท่ดดๅฟƒ็š„ๆœ...\n3503 0 ็šฎๅฝฑใ€ๆ‚่€ใ€ๅปบ็ญ‘ใ€ๅบ†ๅ…ธใ€ๅคง่ก—ใ€ไบบ็พค\n3504 0 ่˜่ฏทๆต™ๆฑŸ็œๆ•ดๅฝขๅค–ๅˆ›ๅง‹ไบบ้ฉฌๅฅ‡ๆ•™ๆŽˆๆ‹…ไปปๅ่ช‰้™ข้•ฟ\n1000 Processed\n classify content\n4000 0 ๆˆ‘ไปฌไผผไนŽๆ˜ฏ่ขซๅฆ–ๅง‘ๅจ˜ๆฝœ่ง„ๅˆ™ไธ‹ๆดป็€็š„็”Ÿๅ‘ฝ\n4001 0 ๅฐ†้€š่ฟ‡็”ตๅญ่ญฆๅฏŸๆŠ“ๆ‹่ฟ›่กŒ็ฎก็†\n4002 0 ๅฐฑๅœจๅŒ—ไบฌ้ คๅ คๆธฏ็š„2015NBANATION็ฑƒ็ƒๅœ‹ๅบฆๅŒ—ไบฌๅ ด\n4003 0 ็Žฐๅœจ่ง‰ๅพ—้ป‘ๅ’–ๅ•กไนŸไธ้‚ฃไนˆ่‹ฆไธ้‚ฃไนˆ้…ธไบ†\n4004 1 ๅˆฉ็އๆตฎๅˆฐ้กถ๏ผŒๅญ˜ๆฌพๆœ€ๅˆ’็ฎ—๏ผ่‡ชxxxxๅนดxๆœˆxๆ—ฅ่ตท๏ผŒๆˆ‘่กŒๅญ˜ๆฌพๅˆฉ็އ็ปŸไธ€ไธŠๆตฎxx%๏ผŒ่ฎฉๆ‚จๅฐŠไบซๆœ€้ซ˜ๅญ˜ๆฌพ...\n1000 Processed\n classify content\n4500 1 ๆœ้˜ณๅ…ฌๅ›ญ๏ผŒ้ซ˜ๅฑ‚่ง‚ๆ™ฏxๅฑ…ๅฎค๏ผŒ่ง†้‡Ž็‰นๅˆซๅฅฝ๏ผŒๆ— ไปปไฝ•้ฎๆŒก๏ผŒ้žๅธธๅฎ‰้™ๅ››้ขไธไธด่ก—๏ผŒๆ€ปไปทไฝŽ๏ผŒไฝŽไบŽๅธ‚ๅœบxxไธ‡...\n4501 0 It'samazinghow้ขœ่‰ฒ\n4502 1 ๅปบ่กŒ๏ผšxxxx xxxx xxxx xxxx xxxๆจไธฝๅจŸ\n4503 0 ๆŽฅไธ‹ๆฅ็š„ไปปๅŠกๅฐฑๆ˜ฏ่ฐƒๆŸฅๅพๅทž็š„ๅคฉๆฐ”็Žฏๅขƒ\n4504 0 โ‘ขๆ™šไธŠๅ–้…ธๅฅถๆœ‰ๅˆฉไบŽ่กฅ้’™๏ผšๆ™š้—ด12็‚น่‡ณๅ‡Œๆ™จๆ˜ฏไบบไฝ“่ก€้’™ๅซ้‡ๆœ€ไฝŽ็š„ๆ—ถๅ€™\n1000 Processed\n classify content\n5000 1 ใ€ๆ ผๅ…ฐ็Ž›ๅผ—ๅ…ฐ่†้—จไธœๆ–นไธ“ๆŸœใ€‘ไบฒ็ˆฑ็š„ๅฅณ็ฅžx.xๅฅณไบบ่Š‚๏ผŒๅ…จๅœบไธ€ไปถxๆŠ˜๏ผŒไธคไปถxๆŠ˜๏ผŒไธ‰ไปถxๆŠ˜ๅ†้€ไปทๅ€ผx...\n5001 0 ๆŠ•ๅ‡บ็ ”็ฉถ็”Ÿ้˜ถๆฎต็š„็ฌฌไธ€ไปฝ็ฎ€ๅކ\n5002 0 ๅ…ถไป–ๆ‰‹ๆœบๆ‹ๅ‡บๆฅ็š„็…ง็‰‡้ƒฝๆŒบๅฅฝ็œ‹็š„\n5003 0 ็”ฑๆฑŸ่ฅฟ็œๆ—…ๆธธ่ง„ๅˆ’็ ”็ฉถ้™ขไธปๅŠž\n5004 0 ๆˆ‘ๅฆˆ่ฏดๆˆ‘่ƒŒไธ้€‚ๅˆ~ๆ‰€ไปฅๅฆ‚ๆžœๆœ‰ๅ–œๆฌข็š„ๆœ‹ๅ‹ๅฏไปฅๆ‰พๆˆ‘ไนฐ\n1000 Processed\n classify content\n5500 0 ่€Œไธ”ๅดๆฑŸๅŒบ็ณป็ปŸ่ทŸๅ›ญๅŒบไธไธ€ๆ ท\n5501 0 ้ ๅฅนๆฏๆœˆๅชๆœ‰2000ๅคš็š„ๅทฅ่ณ‡็ถญๆŒ\n5502 0 ๆ— ้”ก็›‘็‹ฑ็‰น้‚€86ๅฒ็š„ๆŠ—ๆˆ˜่€ๅ…ตๅดๆˆ่€ๅ…ˆ็”Ÿๆฅๅˆฐ็›‘็‹ฑ\n5503 0 ็œ‹ๅฎŒๅ…‹ๆ‹‰ๆ‹ไบบๆˆ‘ๆ‰็Ÿฅ้“่ฎพ่ฎกๅธˆๅฏไปฅๅŠ่ทฏๅ‡บ้“ๅŽŸๆœฌไปŽๆฒกๅญฆ่ฟ‡็š„ไธ“ไธšๅฐฑๅฏไปฅๆ‹…ไปป\n5504 1 
ๅคง็–พ็—…ไฟ้™ฉไบงๅ“้•ฟไฟๅฎ‰ๅบทๅฐ†ไบŽxxxxๅนดxxๆœˆxxๆ—ฅ่ตทๆญฃๅผๅœๅ”ฎ๏ผๆœฌไบงๅ“็‰น่‰ฒๆ˜ฏไฟ่ดนไฝŽ๏ผŒไฟ้šœ้ซ˜๏ผŒไธ‰ๆฌก...\n1000 Processed\n classify content\n6000 0 ็ˆฑๅฟƒ1็ญไปŠๅคฉไธŠ็š„ๆ˜ฏ็ป™ๅ›พ็”ปไธŠ่‰ฒ่ฏพ\n6001 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณๆ˜จๆ™š26ๅฒๆน–ๅ—้‚ฃไธช็”ท็”Ÿ้ป„ๆบ\n6002 0 ็ฌฌไธ€้˜ถๆฎต๏ผš7ๆœˆ31ๆ—ฅ่‡ณ8ๆœˆ6ๆ—ฅ\n6003 0 ๅๅ…ฌไบคๅŽปๆ•ดๅฎนๅŒป้™ขไธ€ๅ—ๅŒป็”Ÿ่ฏดไฝ ่ฟ™ๅดๅฝฆ็ฅ–็š„่„ธ่ฟ˜ๆ•ด\n6004 0 ๆœๅŠกๅฎขๆˆท็š„็†ๅฟตไนŸ่ฎฉๅŽไธบ้€ๆธๆˆไธบไธญๅ›ฝไบบๅผ•ไปฅไธบ่ฑช็š„ๅ›ฝไบงๅ“็‰Œ\n1000 Processed\n classify content\n6500 1 ๆทฎๅ—็งปๅŠจxGๆ‰‹ๆœบโ€œxโ€ๅ…ƒ่ดญ๏ผŒ่ฃธๆœบ็›ด้™้ซ˜่พพxxxๅ…ƒใ€‚ๅŽไธบPxๅŽŸไปทxxxx๏ผŒ็Žฐๅœบxxxx๏ผx.x...\n6501 0 ๅŸบไบŽTwitterๅบžๅคง็š„็”จๆˆท็พคๆฅๅˆ›้€ ไปฅ่ฏ„่ฎบไธบไบฎ็‚น็š„้Ÿณไนๅ†…ๅฎนๆœ็ดขๅˆ†ไบซๆœๅŠก\n6502 0 xๅ็”ทๅญๅœจๅฏนๆ–นๆ‹’็ปไบคๅ‡บๆ‰‹ๆœบๅŽๆ‰“็ ธๅฐ่ฝฆ\n6503 0 ๅˆๆœ‰ๆ—ถ้—ดๅฏไปฅๅ‡บๅŽปๆ—…ๆธธไบ†~ๆˆ‘ๅˆฐๅบ•่ฆไธ่ฆๅŽปไบ‘ๅ—ๅ‘ข\n6504 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆƒณๅ’จ่ฏขๅ…ณไบŽ ไน่Œ‚ๅฐๅŒบ ๅ—ๅŒ—้€š้€ ๅพ—ๆˆฟ็އ้ซ˜ ๅœฐ้“ๅฃ ๆ€ฅๅ”ฎ ๅ”ฎไปทxxx.xไธ‡/ๆœˆxๅฎคxๅŽ…...\n1000 Processed\n classify content\n7000 0 ไปŠๅคฉๆ™šไธŠๅ’Œๆ˜Žๅคฉๆ™šไธŠๆ˜ฏFTISLAND้ฆ–ๅฐ”ๆผ”ๅ”ฑไผš\n7001 0 ๅฑฑไธœ่ฝ้ฉฌๅฅณๆณ•ๅฎ˜่ขซๆˆดๆขฐๅ…ท็งฐไธ่ƒฝ่กŒ่ตฐๅŒป้™ข็งฐ่ฏˆ็—…้‡ๅˆฐ่ฟ™ๆ ท็š„ไบบไนŸๆ˜ฏๆ— ่ฏญๅ‡ๅ™Žไบ†\n7002 1 ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตๆ˜†ๆ˜ŽๅšๆตทๆฑฝๆœๅŠกๆœ‰้™ๅ…ฌๅธ๏ผŒๆˆ‘ๅ…ฌๅธๆ˜ฏ็ฆ็”ฐๆˆดๅง†ๅ‹’ๆฑฝ่ฝฆๆŽˆๆƒๆœๅŠกๅ•†๏ผŒๅ“็ง้ฝๅ…จ๏ผŒๅทฅ่‰บ็ฒพๆน›๏ผŒไปทๆ ผ...\n7003 0 ๅชๆœ‰ๅœจไธ–ไฟ—ๅฑ‚้ข้ญๅˆฐ่ดจ็–‘็š„ๆ—ถๅ€™ๆ‰่ƒฝ็กฎๅฎš่‡ชๅทฑ็š„ๅ†…ๅฟƒ\n7004 1 ๅฏŒๅฐ”็Ž›ไบŒๆฅผโ€œ้น้นคโ€ๅฎžๆœจๅฎถๅ…ทๅฐ†ไบŽxๆœˆxๆ—ฅๆ—ฉx็‚น่‡ณๆ™šx็‚นไธพ่กŒๅคงๅž‹ไฟƒ้”€ๆดปๅŠจ๏ผŒ้น้นคๅฎถๅ…ทๆ˜ฏๅŽ‚ๅฎถ็›ด่ฅๅบ—๏ผŒ...\n1000 Processed\n classify content\n7500 0 ๅ››ๅทๅ…ฌๅฎ‰็š„ๅˆ‘ไพฆ่ญฆ็Šฌ้ƒฝๅฐ†ๆŽฅๅ—่ƒฝๅŠ›่ฎค่ฏ่€ƒๆ ธ\n7501 0 WVๅ†ๆฌกไธŠไบ†้’ฑๆฑŸๅซ่ง†ๆ–ฐ้—ปๅ‘ๅธƒไผš\n7502 0 ๆˆ‘็š„ๅพฎ่ฝฏ่ดฆๆˆท่ƒฝ็ป‘ๅฎšๅ‡ ๅฐ็”ต่„‘\n7503 0 ๅฏๆƒœๅˆๅฟƒ่ฟ˜ๆ˜ฏ่ดฟ่ต‚ไธ€ไธ‹โ€ฆ็ดซ่‹ฑๆœ‰ๆ€Žๆ ท็š„ๆ•…ไบ‹ไนŸไปŽๆฅๆฒกๆโ€ฆๆ€€ๆœ”ๅ’Œ็’‡็Ž‘ไนŸๆญปไบ†โ€ฆโ€ฆๅ“Žโ€ฆโ€ฆ\n7504 0 Nbaๆฌ ๅ‰่ฏบๆฏ”ๅˆฉไธ€ไธชmvp\n1000 Processed\n classify content\n8000 0 
ไน‹ๅ‰ไป–็”ฑไบŽไฝฟ็”จๆฏ’ๅ“ๅ’Œๆปฅ็”จๆšดๅŠ›\n8001 0 ๅฅ่บซๅŽ6ๅคงไธ้€‚โ€”โ€”3ใ€ๅคด็—›ๅคดๆ™•ไธ€่ˆฌๅšไธ€ไบ›ๅ‰ง็ƒˆๅŠจไฝœๆ—ถ\n8002 0 ๅ—ไบฌๆธฏไธญๅ† Aๅฎๅˆฉๆฅไธ‰่ฏบ็”Ÿ็‰ฉๅ—ๆ–นๆฑ‡้€š็ดซ้‘ซ่ฏไธš\n8003 0 ่ฏ็›‘ไผšๅฏๅŠจๅ†่ž่ต„ๆ˜ฏๆŠ•็Ÿณ้—ฎ่ทฏ\n8004 0 ็†็ง‘ๅˆ†ๆ•ฐ็บฟๅˆ†ๅˆซไธบ740ๅˆ†ใ€690ๅˆ†ใ€631ๅˆ†ใ€625ๅˆ†\n1000 Processed\n classify content\n8500 1 ๅ…่ดนๅ’จ่ฏขxxxxxxxxxxxๆ–ฐ็‰ˆๆœบ้บปๆŽงๅˆถๅ™จ๏ผŒ่ตทๆ‰‹ๅฅฝ็‰Œ๏ผŒไธๅฎ‰่ฃ…๏ผŒๅฏ็Žฐๅœบ็œ‹ๆ•ˆๆžœ๏ผŒๆปกๆ„ๅ†ไนฐใ€‚t\n8501 0 16ๅฒ็•™ๅฎˆๅฐ‘ๅฅณ็–‘้ญๅ ‚ไผฏๅผบๅฅธๅฎถๅฑžๆŠฅๆกˆ่ญฆๆ–นไป‹ๅ…ฅๅฎžไน ็”Ÿๅˆ˜ๆ—ญ่ฎฐ่€…ๅต‡็Ÿณ8ๆœˆ6ๆ—ฅ\n8502 0 ็พŽๅฆ†ใ€ๅฟƒๆœบใ€ๆฝœ่ง„ๅˆ™ใ€็ปฟ่ŒถๅฉŠ็ญ‰ๆˆไธบ็ฝ‘็บข็š„ไปฃๅ่ฏ\n8503 0 ๅ‰ๆฎตๆ—ถ้—ดไนฐไบ†ๅฐ็ฑณNoteๅ…จ็ฝ‘้€šๅ‘็Žฐๅช่ƒฝ็”จ็”ตไฟกไธŠ็ฝ‘\n8504 0 ๅˆฐไธ‹้ฃžๆœบๅ‰ๅฅนไปฌๅ†ไนŸๆฒกๅ†่Š่ฟ‡ๅคฉไบ†\n1000 Processed\n classify content\n9000 0 ็ป™ไป–ไปฌ็•™ไธ‹็š„่‚ก็ฅจๆ˜ฏๅคšไนˆ็š„ๅ€ผ้’ฑ\n9001 0 ๅผ ็ดซๅซฃ่ฏด็š„ๆ˜ฏ่ง็Ž‹ๅฅๆž—ไธ€้ข็ป™ๅฅนไธ€็™พไธ‡ๅพˆๅฎนๆ˜“็š„ๆฒก่ง่ฟ‡liyingxin้‚ฃๆ ท็š„ไบบ่ฏดๅ’Œๆˆ‘็ป“ๅฉšๆ˜ฏๅทฒๅฉš็š„...\n9002 0 ไธ€ๆฅผ็š„้˜ฒ็›—็ช—ๆ˜ฏๅฐๅทๅคฉ็„ถ็š„ๆขฏๅญ\n9003 0 ๅ›ฐๅœจ็”ตๆขฏ้‡Œ็š„่…นๆณป็”ทๅฐผ็Ž›ๆก‘ไธ่ตทๅ•Š\n9004 0 Sprintๅ…ฌๅธๅœจๅ…จ็ƒ็บฆๆœ‰7ไธ‡ๅ\n1000 Processed\n classify content\n9500 0 ๆต™ๆฑŸๆ— ่‰ฏ่€ๆฟๆฌ ๅทฅไบบๅทฅ่ต„ๅŠๅนดไธ็ป™\n9501 0 ๆฑŸๅฑฑไน‰ไนŒๆญๅทžๅ—ไบฌๅ—้€š็ฆๅปบๆตฆๅŸŽๅ‘ตๅ‘ต\n9502 0 ่ฝฌๆ’ญๅˆฐ่…พ่ฎฏๅพฎๅšๆ‰ฟๅพท็ฌฌ4ไธชๅŽ…ๅฎ˜่ฝ้ฉฌ\n9503 0 7ๆœˆ29ๆ—ฅ้˜ฟๅ‹’ๆณฐๆ˜Žๆ˜Ÿๅฆ†ๅ“่ดดๆŸœ้”€ๅ”ฎไปŠๆ—ฅ็›ฎๆ ‡800\n9504 0 ่’œๅคดๅฏๆŠตๆŠ—็—…ๆฏ’ใ€็œŸ่Œใ€ๅŽŸ็”Ÿ็”Ÿ็‰ฉๅ’Œๅฏ„็”Ÿ็‰ฉ\n1000 Processed\n classify content\n10000 0 ๅพฎ่ฝฏไธบไบ†้‡ๆ–ฐไบ‰ๅคบ็งปๅŠจๅธ‚ๅœบไปฝ้ข\n10001 0 ็ฌฌไธ€ๆ—ถ้—ดๆ‹ฟ่ตทๆ‰‹ๆœบ็‚นๅผ€ๅพฎไฟก้ข†ๅ–็บขๅŒ…ไธ€ๆฐ”ๅ‘ตๆˆ\n10002 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅˆšๆ‰“็”ต่ฏ่ฟ‡ๆฅ็š„ๅพทๅ›ฝๆฑ‰ๆ–ฏๆ ผ้›…ๆดๅ…ทๅฐๅด๏ผŒๆฑ‰ๆ–ฏๆ ผ้›…xxxๅ…จๅนดๆœ€ไฝŽไปทๅคงไฟƒ๏ผŒ่Šฑๆด’๏ผ‹้พ™ๅคดๅฅ—็ป„...\n10003 0 ๅˆซๅƒๅฐๅทๆ€ปๆ˜ฏ่งไธๅพ—ไบบๅฎถๆฏ”ไฝ ๅฅฝ\n10004 0 ๆ˜Žๆ˜Žๆฏๅคฉ้ƒฝๆ“ฆๅฅฝ้˜ฒๆ™’้œœๅ†ๅ‡บ้—จ\n1000 Processed\n classify content\n10500 0 ไธ€ๅฎš้ƒฝๅŽป่ฟ‡ไบ†ๅง~ไฝ†่ฏด่ตทๅนฟๅทž็š„ๆตทๆด‹็”Ÿ็‰ฉๆ 
‡ๆœฌ้ฆ†ใ€ๅฒญๅ—่ฏ็ง‘ๆ™ฎๅŸบๅœฐใ€่ถณ็ƒๆœบๅ™จไบบๅฎž้ชŒๅฎคใ€่ˆฐ่ˆน็ง‘ๆ™ฎๅŸบๅœฐโ€ฆ...\n10501 0 ๆฏไธ€ๆญฅ้ƒฝไธๆœƒ็™ฝ่ตฐ้€™ๆ˜ฏๆˆ‘ๆœ€่ฟ‘็š„ๅฟƒๆƒ…ๅคงๅฎถๅœจๆฏๅ€‹้šŽๆฎต็š„ๅŠชๅŠ›้ƒฝไธๆœƒ็™ฝ่ฒป็š„ๅŠ ๆฒน๏ฝž๏ฝž๏ฝž๏ฝžๆ„Ÿ่ฌไน‹ๅ‰่‡ชๅทฑ็š„ๅŠช...\n10502 0 ๅ‘็Žฐไบ†ไธ€ไธชๅ’Œiwanna็›ธไผผ็š„ๆ‰‹ๆœบๆธธๆˆ\n10503 0 ๆณจๅฎš่ฆๅˆ ๅŽปโ€ฆโ€ฆ็”ต่„‘ไธญๆฏ’ๆ ผๅผๅŒ–ไธญโ€ฆโ€ฆ้‚ฃไบ›่ฎฐๅฝ•\n10504 0 ้‚ฃไนˆๅŽไธบG8็š„้…็ฝฎๆ€Žไนˆๆ ทๅ‘ข\n1000 Processed\n classify content\n11000 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 46bhdxไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n11001 0 ๆ˜ฏๆ—…ๆธธๅบฆๅ‡ใ€ไผ‘้—ฒๅจฑไนใ€ๆœ‹ๅ‹่š้คไธ‹้…’็š„็†ๆƒณ้ฃŸๅ“\n11002 0 ๅฐฑ่ฟ™ๆ ท่ขซๆ”ฟๅบœ้ƒจ้—จ็š„ไธไฝœไธบ็ป™ๆฏๆމไบ†\n11003 0 ๆˆ‘ๆ—ฅไปŠๅคฉไธ‹ๅˆๆœฌๆฅๆƒณๅ้ฃžๆœบๅ›žๅŽป็š„\n11004 0 ๅทฅไฟก้ƒจๅ…ฌๅธƒ2015ไธญๅ›ฝ้ฆ–ๆ‰นๆ™บ่ƒฝๅˆถ้€ ไธ“้กน้กน็›ฎ\n1000 Processed\n classify content\n11500 1 ๆตทๅฐ”็ฉบ่ฐƒๅ†ฏๅˆฉ่Šณ็ป™ๆ‚จๆ‹œไธชๆ™šๅนด๏ผ็ฅๆ‚จๅฎถๅบญๅ‰็พŠๅฆ‚ๆ„๏ผŒๅฟƒๆƒ…็พŠๅ…‰ๆปก้ข๏ผŒๅฅๅบท็พŠ็พŠๅพ—ๆ„๏ผŒๅ‘Š่ฏ‰ๆ‚จไธชๅฅฝๆถˆๆฏ๏ผๆตท...\n11501 0 ๅญฉๅญไปฌๆฏๅคฉๆƒณ็€็”ต่„‘ๆ‰‹ๆœบๆธธๆˆ\n11502 0 ๅกซๅ†™้‚€่ฏท็ ๏ผš548903ๅฏ้ข†3ๅ…ƒ็บขๅŒ…\n11503 0 ไป€ไนˆๆƒ…ๅ†ตใ€Ž19ๅฒๅผฑๆ™บๅฅณๅŠๅนดๅ†…ๅฎžๆ–ฝ็›—็ชƒ20ไฝ™่ตท่ขซๆŠ“่Žทไธœๅ—็ฝ‘็ฆๅปบ็ฌฌไธ€้—จๆˆทใ€\n11504 0 ๅ…ถไธญๆœ€้‡่ฆ็š„ๆ˜ฏ่‚Œ่‚ค็ป†่ƒž่€ๅŒ–ๅผ•่ตท็š„่‚Œ่‚ค่€ๅŒ–ๆพๅผ›\n1000 Processed\n classify content\n12000 0 ๆฒก็”จๆˆทๆ€็ปด็ปๅฏน่ขซไบ’่”็ฝ‘ๆ—ถไปฃๆท˜ๆฑฐ\n12001 0 ๅ†ไบซๅ•็ฌ”ๆถˆ่ดนๆปก2000ๅ…ƒ่ต LASANTEๆ—ฅๅ…‰้คๅŽ…้ธกๅฐพ้…’ไนฐไธ€้€ไธ€ๅˆธไธ€ๅผ \n12002 0 ๅผ€ๅง‹ๅฎ่ดไธ€็›ด่ดจ็–‘ๆˆ‘ๅฎถ็š„็›ด้‚ฎ\n12003 0 ๅฅ‡ๆ€ช็š„ๆ˜ฏ~~ไป–ไป€ไนˆๆ—ถๅ€™ๅŽป้‚ฃไธŠ็ญๅ•ฆ\n12004 0 ็›ธๆฏ”ๅŽปๅนดๆˆ‘ไปฌๅœจๅฆๅ…‹ๆ–น้ขๅšไบ†ไป€ไนˆๆ”น่ฟ›\n1000 Processed\n classify content\n12500 0 ๅˆซๆๅคšๅคฑๆœ›โ€ฆmaybe่ฟ˜ๆ˜ฏๅœจๆณฐๅ›ฝ็š„ๆ‰ๆญฃๅฎ—ๅง\n12501 0 ไฝ ็”จไบ†13ไบฟไบบ็š„้’ฑๅฏŒ่ตทๆฅไบ†ไฝ ไปฌ8000ไธ‡\n12502 0 ๅŽฟ็บงไปฅไธŠไบบๆฐ‘ๆ”ฟๅบœๅบ”ๅฝ“ๅปบ\n12503 0 ็œ‹ไบ†ๅฅฝๅฃฐ้Ÿณ่ง‰ๅพ—ๅ‘จๆฐไผฆๅคชๅฏ็ˆฑไบ†\n12504 0 ่ถ…1ไธ‡่ตž็š„ๆ•™็ง‘ไนฆ่ˆฌ็š„ๅฎถ่ฃ…ๅพฎไฟก่ฅ้”€\n1000 Processed\n classify content\n13000 0 โ€œไธญๅ›ฝ้ฃŸ่ฏ็›‘็ฎกโ€Appๅ…จๆ–ฐๅ‡็บงไธŠ็บฟ\n13001 0 
1998ๅนด่ฟ˜ๆ›พ่ขซๆ”น็ผ–ๆˆๅŒๅ็”ต่ง†ๅ‰ง\n13002 0 ้€ไผดๅจ˜ๆœ??้ƒฝๆ˜ฏ่‹ๅทžๅฉš็บฑๅŸŽๅคงๅบ—่ดญๅ…ฅ\n13003 0 ็”ฑ็ฝ‘ๆ˜“ๆ—ถๅฐšใ€็ฝ‘ๆ˜“ๆˆฟไบงใ€ไฝฟ้ฆ†ๅฃนๅท้™ข่”ๅˆไธพๅŠž็š„โ€œไธญๅ›ฝ่‡ชไฟกๆ€ๅบฆๆ—ถ่ฃ…showโ€ๅœจไบฌไธพ่กŒ\n13004 0 ่ฟ™็งๅŠจ็‰ฉๆœฌๆ€ง็š„้šๆ€งๅŸบๅ› ไธๆ—ถ็š„้œฒๅ‡บ็‹ฐ็‹ž็š„้ข็›ฎ\n1000 Processed\n classify content\n13500 0 ๆœ€ๅŽ็ ”็ฉถ็”Ÿๅ…ฅๅญฆ่€ƒ่ฏ•ๆˆ็ปฉๆˆ‘ๆฏ”ไป–้ซ˜ไบ†่ฟ‘100ๅˆ†\n13501 0 ๅœฐ้“ๆ˜ฏไธๆ˜ฏๅˆ่ฏฅๆขๅคๅ›ž2RMBไบ†\n13502 1 ๆ–ฐๅนดๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅคง่‡ช็„ถๆœจ้—จ๏ผŒ็งป้—จ๏ผŒๅข™็บธ็š„๏ผŒๆˆ‘ไปฌๅŽ‚ๅฎถxๆœˆxๅท-xxๅทๅ…จ้ƒจx.xๆŠ˜๏ผŒ่ฏทๆ‚จไธŽๅฎถไบบไธ€ๅฎš่ฆ...\n13503 1 ๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒ ๆ–ฐๅœบ้…ฌๅฎพ:้›ท่ฟชๆฃฎๆฌๅˆฐๆ–ฐ่Šฑๆ ทx/xๅท่ตท๏ผšไปปไฝ•ไธ“ๅœบๆด‹้…’๏ผˆๅๅฃซ๏ผŒ่“ๅธฆ๏ผŒXO๏ผ‰ไนฐไธ€็“ถ้€...\n13504 0 ไธบไบ†่ฟ™ไธช็”ต่ง†ๅ‰งๅ……ไบ†่…พ่ฎฏไผšๅ‘˜\n1000 Processed\n classify content\n14000 0 RnD็‘ท็ผ‡ๆ™บ่ƒฝๅฝฉ็ป˜ๅˆ็Žฉๅ‡บไบ†ๆ–ฐ่ฎพ่ฎก\n14001 0 58ไนฐๅ‚จๅ›žๆฅๅค„3400่‚ก002066\n14002 0 ๆ˜ฏไผฆๆ•ฆไบคๅ“ไนๅ›ขๅ’ŒBBCไบคๅ“ไนๅ›ข็š„้ฉปๅ›ขๅœบๅœฐ\n14003 0 ๆœฌๆฅๅ…ญ็‚นไบŒๅ็š„้ฃžๆœบๆš‚ๆ—ถๆˆๅŠŸๅปถ่ฏฏๅˆฐๅไธ€็‚นไบ†โ€ฆ็œŸๆฃ’\n14004 0 ๅŒบๅปบไบคๅง”๏ผš1๏ผŽๅฎๅฑฑๅŒบๅ…ฌๅ…ฑ่‡ช่กŒ่ฝฆ็ฝ‘็‚นๆŒ‰็…งๆ–นไพฟใ€้€‚้‡ๅ’Œ็ปŸ็ญนๅ…ผ้กพๅŽŸๅˆ™ๅŠๆ–นไพฟๅ–็‚น็ญ‰ๅ› ็ด \n1000 Processed\n classify content\n14500 1 ็–ฏ็‹‚ๆŽฅๅ•๏ผๅ้ขๆœ‰้™๏ผๅ…จๅ›ฝๆŽฅๅ•๏ผ้ป‘็™ฝๆˆทๅŠž็†xxwไปฅไธŠไฟก็”จๅก๏ผŒxxๅคฉไธ‹ๅก๏ผŒ้œ€่ฆๅŠž็†็š„่”็ณปๆˆ‘๏ผŒไธญไป‹...\n14501 0 ๅฅฝๆถˆๆฏ๏ผš่Žฝๅ”ๅ’Œ็ปต็ซน้ฃžๆœบๅฟซ้€ๅˆไฝœๆˆๅŠŸ\n14502 0 ๅฏๆ‰‹ๆœบๅพฎๅš้‡Œๆฒกๆ˜พ็คบๅ‡บๆฅโ€ฆโ€ฆไบŽๆ˜ฏๆŠŠๅพฎๅšๅธ่ฝฝไบ†ๅˆ้‡ไธ‹ไบ†ไธ€ๆฌกโ€ฆโ€ฆ็ป“ๆžœ่ฟ™ๆฌก้™ˆๅญฆไธœ่กจๆƒ…ๆฒกๆœ‰\n14503 0 ๆ“…้•ฟUI่ฎพ่ฎกใ€ไบคไบ’่ฎพ่ฎกใ€็”ตๅ•†่ฎพ่ฎกใ€VI่ฎพ่ฎก\n14504 0 ็›ฎๅ‰ๅทฒ็ปๅฎŒๆˆ8500ไธ‡็พŽๅ…ƒC่ฝฎ่ž่ต„\n1000 Processed\n classify content\n15000 0 ๆˆ‘ไธ€ๅ‡กไบบ่ƒฝ่ทณๅฎŒๅ‰15ๅˆ†้’Ÿ็š„็ƒญ่บซๅฐฑ็ดฏๅˆฐไธ่ฆไธ่ฆ็š„\n15001 0 ็”ฑSEGAๅผ€ๅ‘ใ€BBๅทฅไฝœๅฎค็›‘ไฟฎ็ผ–ๅ‰ง\n15002 0 ไธ“่ฎฟCassia่ตต็ฆๅ‹‡๏ผšๅšไบ†20ๅนดWiFi\n15003 0 ๆบง้˜ณๅธ‚ๆฐ”่ฑกๅฐ16ๆ—ฅ8ๆ—ถ30ๅˆ†\n15004 0 ๅฝ“ๅนด่ฏดๆƒณ่€ƒๅ—ไบฌ่ˆชๅคฉ่ˆช็ฉบๅคงๅญฆ็š„\n1000 Processed\n classify content\n15500 0 
่‡ช็”ฑ็ƒๅ‘˜ๆตท่€ถๆ–ฏๅทฒ็ปไธŽ็ซ็ฎญ่พพๆˆ็ญพ็บฆๅ่ฎฎ\n15501 0 ๆ–ฐๅ“1933้•‡ๆฑŸไธ‡่พพ๏ผš็Œ›ๆ–™1886\n15502 0 ๅˆšๅˆšๅฅนๅ…ณไบ†็ฏๆˆ‘็Žฉๆ‰‹ๆœบ็Žฉๅ—จไบ†\n15503 0 xxไปถไฝ ไธ็Ÿฅ้“ๅ…ณไบŽ่ดพๆ–ฏๆฑ€ๆฏ”ไผฏ็š„ไบ‹ๆƒ…\n15504 0 ๅขจๅฐ”ๆœฌboxhillsouthๅ…จๆ–ฐtownhouseๅ•้—ดๅ‡บ็งŸ$xxx/monthxxmin่ตฐ...\n1000 Processed\n classify content\n16000 0 ้‚ฃๆ—ถๅ€™ๅœฐ็ƒๅทฒ็ป่ฟˆๅ…ฅๆœบๅ™จไบบๆ—ถไปฃ\n16001 0 ๆŸณๅทžๅธ‚้ฑผๅณฐๅŒบไบบๆฐ‘ๆณ•้™ขๅฎก็ป“ไธ€ไปถไนฐๅ–ๅˆๅŒ็บ ็บทๆกˆไปถ\n16002 1 ๅฎถ้•ฟๆ‚จๅฅฝไนๆฉ™็พŽๆœฏๆ˜ฅๅญฃ็ญๅทฒ็ปๅผ€ๅง‹ๆŠฅๅ๏ผŒๅฏ้šๆ—ถๆŠฅๅๅ…่ดน่ฏ•ๅฌ๏ผŒๆๅ‰ๆŠฅๅๅฏ้€‚ๅฝ“ไผ˜ๆƒ ใ€‚ๅœฐๅ€๏ผš้“ญๅŸŽxxๅท...\n16003 0 ไธๆ˜ฏไฝ ไปฌๆ”ฟๅบœ้‚€ๅŠŸ็ซ‹่กจ่ฏๆ˜Žไฝ ไปฌไธบไบบๆฐ‘ๆœๅŠก็š„ๅœฐๆ–น\n16004 0 ๅธŒๆœ›ๅคงๅฎถๅฏไปฅๅ•็‹ฌๅ‘็ป™ๅฅน็บขๅŒ…\n1000 Processed\n classify content\n16500 1 TOTO้ฆ–ๅฑŠๆ™บ่ƒฝๅซๆตด่Š‚ๅผ€ๅง‹ไบ†๏ผŒๆ™บ่ƒฝ้ฉฌๆกถ็›–ๅช้œ€xxxxๅ…ƒ๏ผŒๅ‡ญ็ŸญไฟกๅฏไบซๆŠ˜ๅŽxxๆŠ˜๏ผŒๆ— ้œ€ๅ‡บๅ›ฝ๏ผŒๅพฎไธช...\n16501 0 ๅŒ—ไบฌใ€ไธŠๆตทใ€ๆต™ๆฑŸๆŠ•่ต„ไบ‹ไปถๆ•ฐ้‡ๅขžๅน…่ถ…่ฟ‡xxx%\n16502 0 ไธบไป€ไนˆ็Žฐๅœจ็š„ๆ‰‹ๆœบๅฐบๅฏธ้ƒฝๅพ€x\n16503 0 xๆœˆxxๆ—ฅ่ตทๆ—ฅ็…งไธปๅŸŽๅŒบ็ฆ่กŒไธ‰่ฝฎ่ฝฆใ€ๅ››่ฝฎไปฃๆญฅ่ฝฆ\n16504 1 ๅงๆ‚จๅฅฝ๏ผšๆˆ‘ๆ˜ฏ่“็ˆต็พŽๅฎนไผšๆ‰€็š„ๆฅ ๆฅ ๏ผŒไธ‰ๅ…ซ่Š‚ๅณๅฐ†ๅˆฐๆฅ๏ผŒๆˆ‘ๅบ—ไธบไบ†็ญ”่ฐข่€้กพๅฎข้•ฟๆœŸไปฅๆฅๅฏน่“็ˆต็พŽๅฎนไผšๆ‰€็š„ๆ”ฏ...\n1000 Processed\n classify content\n17000 0 ่ž่ต„็งŸ่ตๅ…ฌๅธ่ฎพ็ซ‹ๅญๅ…ฌๅธไธ่ฎพๆœ€ไฝŽๆณจๅ†Œ่ต„ๆœฌ้™ๅˆถ\n17001 0 ็ฑ่ดฏๆฑŸ่‹ๅธธๅทžๅœจไนŸ่ฎธๅทฒ็ป่ฎฐไธ่ตทไป–็š„\n17002 0 ็™ฝๅคฉ้กถ็€ไธญๆš‘ๅคฉ่ฟ˜่ฆ่ท‘ๅŒป้™ขๅทฎ็‚น่™š่„ฑ็š„ๆ™•่ฟ‡ๅŽป\n17003 0 ๅŠ้ฆ™ๆธฏ็ต‚ๅฏฉๆณ•้™ข้ฆ–ๅธญๆณ•ๅฎ˜้ฆฌ้“็ซ‹ๅˆ†ๅˆฅ็އ้ ˜็š„ไปฃ่กจๅœ˜ๆœƒ้ข\n17004 0 ไป–30ๅคšๅนดๆฅๅ…ฑๅˆ›ไฝœ็š„ๅ‡ ไธ‡ๅน…ไฝœๅ“\n1000 Processed\n classify content\n17500 0 ๅฐคๅ…ถๆ˜ฏๅผ€ๅฏไบ†ไนกๆ‘ๆ—…ๆธธ็š„ๅ…จๆ–ฐๆ—ถไปฃ\n17501 0 1958ๅนด8ๆœˆ็”ฑๆต™ๆฑŸๅŒปๅญฆ้™ขไปŽๆญๅทžๅˆ†่ฟ่‡ณ\n17502 1 ๅพๅทžๅฐš็พŽๆœบๆขฐ่ฎพๅค‡ๆœ‰้™ๅ…ฌๅธใ€‚่‹ๅŒ—ๅœฐๅŒบไธ“ไธš้›•ๅˆปๆœบ๏ผŒๆฟ€ๅ…‰ๆœบ๏ผŒๆœจๅทฅๆœบ้”€ๅ”ฎๅ•†ใ€‚ๅœฐๅ€๏ผšๅพๅทžๅธ‚ไบ‘้พ™ๅŒบไธ‹ๆฒณๅคด...\n17503 0 ๅฎ้™ตๅŽฟๆŸณๆฒณๆดพๅ‡บๆ‰€ๆ•‘ๅ‡บ่ฝไบ•ๅ„ฟ็ซฅ็š„่ญฆๅฏŸๅฅฝๆ ท็š„\n17504 0 
็Ž‹ๅฎถ่ฑช็œŸๆ˜ฏ้ป‘็š„ๅฐฑๅทฎไธชๆœˆไบฎๅฐฑๅฏไปฅๆผ”ๅŒ…ๆ‹ฏไบ†\n1000 Processed\n classify content\n18000 0 ๆฏไธ€ไปถIGNISไบงๅ“็”ฑๅ†…่€Œๅค–ๅ‡ๆธ—้€็€ๆบไบŽ่‡ช็„ถ็š„ๆทณๆœด่€Œ่„ฑไฟ—็š„ๆฐ”่ดจ\n18001 1 ๆบ็”Ÿๅ ‚ไธบๆปก่ถณๆ–ฐ่€ๅฎขๆˆท้œ€ๆฑ‚็‰นๆทปๅŠ ๏ผš็ƒซๆ‹‰ๆŸ“้กน็›ฎ๏ผๆดปๅŠจๆœŸ้—ดๆค็‰ฉๅฅๅบทๆŸ“ๅ‘ๅช้œ€xxๅ…ƒ๏ผๅฅๅบท้™ถ็“ท็ƒซๅ‘ๅช้œ€...\n18002 0 ่ฟ™ๆ˜ฏbabyๅš็š„่ฃ…ไฟฎๅฎถ็š„ๅทฅ็จ‹่กจ\n18003 0 ๅนถๅฏนๅ‡บ็งŸ็‰ฉไธšไบบๅ‘˜่ฟ›่กŒๆถˆ้˜ฒๅฎ‰ๅ…จๅŸน่ฎญ\n18004 0 ่ƒฝ่ฎฉไธ€ไธช57ๅฒ็š„ไบบ็œ‹่ตทๆฅๅƒ40ๅฒไธ€ๆ ท\n1000 Processed\n classify content\n18500 0 ้ข„็บฆไบ†ๅฐ็ฑณNOTEๅ…จ็ฝ‘้€šๅŽไธบ่ฃ่€€7ๅ…จ็ฝ‘้€š่ฃ่€€็•…็Žฉ4X\n18501 0 ๆ— ้”กไนฐไบŒๆ‰‹ๆˆฟๅ…ฌ็งฏ้‡‘่ดทๆฌพ30ไธ‡่ฏ„ไผฐ่ดน่ฆ5ๅƒไธบไป€ไนˆ่ฟ™ไนˆ้ซ˜่ฟ™ๆ ทๆ”ถ่ดนๅˆ็†ๅ—\n18502 0 ไธ€ๆ˜ฏๅˆถๅฎšไบ†ไบบๆฐ‘ๆณ•ๅบญ่ฎพ็ฝฎๆ”น้ฉๅฎžๆ–ฝๆ–นๆกˆ\n18503 0 ไปŠๅนด่ฟ˜่ฆๆ‰“้€ xxxๅฅ—็พค็งŸโ€œๆ ทๆฟๆˆฟโ€\n18504 0 ๅพฎ่ฝฏOfficeforMac2016ไธŠ็บฟไบ†\n1000 Processed\n classify content\n19000 0 I'mnotallowed๏ฝžไธ€ๅนดๅคšๅ‰ๅฟ˜่ฎฐๅ› ไธบๅ•ฅๆˆ‘็ฆๆญขๅฅน็Žฉ\n19001 0 ไฝ†ๆต™ๆฑŸไบค้€š็”ตๅฐ93่ฟ˜ๅœจ้‚ฃๅ„ฟๆ•…ไฝœ็Ÿซๆƒ…\n19002 0 ็›ฎๅ‰่‹ๅทž็ปง็ปญๅ—ๅคง้™†้ซ˜ๅŽ‹็š„ๆŽงๅˆถ\n19003 0 ไธญๅ›ฝไธ“ไธš็š„ไบšๅฅๅบทๅ’จ่ฏข็ฝ‘็ซ™\n19004 0 ๆต™ๆฑŸๅคงๅญฆๅฎž้ชŒๅฎค็ป™ไฝ ๆœ€ๆƒๅจ็š„็ญ”ๆกˆ\n1000 Processed\n classify content\n19500 0 ไธ€ไธชๅ…ฌๅนณๅ…ฌๆญฃ็š„ๆˆฟๅœฐไบง็จŽๆ˜ฏ่ฐƒ่Š‚ๅฑ…ๆฐ‘ๆ”ถๅ…ฅๅˆ†้…ไธๅ…ฌๆœ€ไธบ้‡่ฆ็š„ๅทฅๅ…ท\n19501 0 ไปŽๅคๅˆฐไปŠๅ†คๆกˆ็”šๅคšๆœ‰ๅ‡ ไธช่ƒฝ็ฟป่ฟ‡ๆฅ็š„\n19502 0 ๆ–ฐ้š็€ๅคงๅฎถๅฏน่ฃ…ไฟฎ่ฎพ่ฎก้ฃŽๆ ผ็š„้€ๆธ็ป†ๅˆ†\n19503 0 ๅ‚ๅŠ ๅฅฝๅฃฐ้Ÿณdeๆฐธ่ฟœ้ƒฝๆ˜ฏ่ฟ™ๆ ท่ฏด\n19504 0 ๆœฌๅบ—้”€ๅ”ฎ็š„ไบงๅ“ๅ‡ไธบโ€œๅ›ฝ็ไธ“่ฅโ€ๅฎžไฝ“ๅบ—ๆญฃๅ“\n1000 Processed\n classify content\n20000 0 ๆŠ•่ต„่‡ชๅทฑๆ‰ๆ˜ฏๆœ€ๅฅฝ็š„ๆŠ•่ต„\n20001 0 infiniteBAD้’ข็ด็‰ˆ\n20002 0 ็”จๅœ†ๅฝข่ฎพ่ฎกๅฐ†ๅงๆฆปไฝœไบ†่ฝฌ่ง’ๅปถไผธ\n20003 0 ๅฅฝๅคšๆณ•ๅพ‹ๆ˜Žๆ–‡่ง„ๅฎš็š„ๅŠณๅŠจไฟ้šœ้ƒฝไฟ้šœไธไบ†\n20004 1 ็„ถไธบๆ‚จๆไพ›่พƒ้ซ˜ๆ”ถ็›Š็š„็†่ดข๏ผŒๆ˜ฏๆ‚จ่ต„้‡‘ไฟๅ€ผ็š„ๆœ€ๅฅฝ็ฎกๅฎถใ€‚ๅฆ‚ๆ‚จๆœ‰้œ€่ฆ๏ผŒ่ฏท็”ต่ฏๅ’จ่ฏขๆˆ–ๅˆฐๆŸœ้ขๅ’จ่ฏขใ€‚ๆˆ‘่กŒๆฏ...\n1000 Processed\n classify content\n20500 0 QQๆ’ญๆ”พๅ™จๅฏไปฅๅฌไฝ†ๆ˜ฏ้ผ ๆ 
‡็ขฐไธ็€ๆ’ญๆ”พๅ™จ\n20501 0 ๅค–ๅฐๅ†…้กตไฟฉไบบๅ‰ๅผ€็š„่ฎพ่ฎกๆฃ’ๆฃ’็š„\n20502 0 ๆœฌไบบๆฑ‚ๅˆ็งŸๆฑŸๅคๆกฅไธœๅคฉไธ€ไธƒๅก”ๅฏบ้™„่ฟ‘็š„็š„ๆˆฟๅญ\n20503 0 ็ƒญ็ƒˆ็ฅ่ดบๆˆ‘ๆ กๅฅณ่ถณ่Žทๅพ—xxxxๅนด่‹ๅทžๅธ‚โ€œๅฏๅฃๅฏไนๅง‘่‹ๆ™šๆŠฅโ€ๆฏ้’ๅฐ‘ๅนดๆš‘ๆœŸ่ถณ็ƒ่ต›้ซ˜ไธญๅฅณๅญ็ป„็ฌฌไธ€ๅ\n20504 0 ๆˆฟ้‡‘ๆ‰€ๆ‰€ๆœ‰ๆŠ•่ต„ๆ ‡็š„ๆŠ•่ต„้—จๆง›้™่‡ณ1000ๅ…ƒ\n1000 Processed\n classify content\n21000 0 ไธๆ˜ฏ้€›้€›็™พๅบฆๅฐฑๅฏไปฅๅฝ“่‡ชๅช’ไฝ“็š„\n21001 0 ้š่บซๅœจ็”ฐ้—ดๅฐ่ทฏไธญ้ ็€ๅซๆ˜Ÿๅฏผ่ˆชๅฏปๆ‰พ\n21002 0 ๅคงไผ™ไนŸๆ„ฟๆ„็œ‹ไธ€ๅœบๅคงๆˆ/ไฝ›ๆ•™ๅไผš้ฆ–ๆฌกๅฐฑโ€œ้‡Šๆฐธไฟก่ขซไธพๆŠฅโ€่กจๆ€\n21003 0 ่ฆๆ›ดๆ–ฐ็—…ๆฏ’ๅบ“ๆ€ๆฏ’โ€ฆโ€ฆไบŽๆ˜ฏไปŽๆ˜จๆ™šๅผ€็”ต่„‘ๅŽ่ฟ˜ๆฒกๆœ‰่ฟ›่กŒไปป\n21004 0 ็Žฐๅœจ็š„่…พ่ฎฏๅฎ‰ๅ…จ็ณปๆ•ฐ็œŸ็š„ๆ˜ฏๅคชไฝŽไบ†\n1000 Processed\n classify content\n21500 0 ็ญ‰ไฝ ๅŽๅคฉไธ‹้ฃžๆœบๆˆ‘ๅ’Œโ€œๅผŸๅผŸโ€ไฝ ๅ’Œๆˆ‘็š„ๅฎ่ด็‹—็‹—\n21501 0 ๅฎ‹็พŽ้พ„ๅ–œๆฌขๆณ•ๅ›ฝๆขงๆก้‚ฃๅนด่’‹ไป‹็Ÿณๅœจๆ•ดไธชๅ—ไบฌๅŸŽ็งๆปกไบ†ๆขงๆกไฝ ่‹ฅๆ˜ฏๅ–œๆฌขๅƒๅฑŽๆˆ‘ๆ„ฟๆฏๅคฉ้ƒฝไธบไฝ ๆ‹‰\n21502 0 labolaboๅŸŽ้‡ŽๅŒป็”Ÿๆฏ›ๅญ”ๆ”ถๆ•›ๆฐด\n21503 0 ไฝ†่ฐ้ƒฝ็Ÿฅ้“่ฟ™ๅชๆ˜ฏxๅนด็Ÿญๆš‚็š„็ผ˜ๅˆ†\n21504 0 ๆŠ•ไน‹ๅ‰ๅ…ˆ็กฎ่ฎค่‡ชๅทฑ็š„ipๅœฐๅ€ๆขไบ†ๆฒก\n1000 Processed\n classify content\n22000 0 xxๅนดxๆœˆๅˆ˜ๆŸ่ฎฒ่ˆนๅ–็ป™ไป–ไบบๅŽๆฝœ้€ƒ\n22001 0 ๅฐฑๆ˜ฏ่งฃๆ”พ่ทฏๅ†ฒ่ฟ›ไธŠๆตท่ฏๅˆธๅคงๆฅผ\n22002 0 ใ€Œ143ไธ‡ไบบๅœจไผ ็œ‹ใ€ๅฎžๆ‹ๆฏ’ๅ‡ค็ˆชๅŠ ๅทฅ็‚นๆญ็ง˜ๆณกๆค’ๅ‡ค็ˆชๆ— ๅๆทปๅŠ ๅ‰‚\n22003 0 ๅˆซ่ฎฉ่ฝฆๆˆไธบไผคๅฎณไป–ไบบ็š„ไฟๆŠคไผž\n22004 0 ๅธ‚ๆฐ‘ๆฅ็”ต่กจๆ‰ฌๅœจๅ…ญๅˆๅŒบๅคงๅŽ‚่ก—้“็š„ๆ‰ฌๅญๅ…ฌไบค็บฟๆ‰€ๆœ‰ๅ…ฌไบคๅธๆœบ้ƒฝไผšไธปๅŠจ้ฟ่ฎฉ่กŒไบบ\n1000 Processed\n classify content\n22500 0 ่ฟ™ไผšๆ‰“ๅผ€็”ต่„‘่ฏฅๅ›พไธ€็‚นไนŸไธๆ™š\n22501 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผŒx.xxๆ—ฅ๏ผŒไธญๅˆxx๏ผšxx๏ผŒๅปบๆๅฎถๅฑ…ๅ“็‰ŒๅŽ‚ๅฎถ็›ด้”€ๆดปๅŠจ๏ผŒไธŽๆ‚จ็›ธ็บฆ่ฟœๅทžๅ›ฝ้™…ๅคง้…’ๅบ—...\n22502 0 ๆ‚ฃ้ซ˜็ƒญใ€่…นๆณปใ€่‚็‚Žใ€่‚พ็‚Žใ€่ƒ†ๅ›Š็‚Žใ€่ƒ†็Ÿณ็—‡ไน‹ไบบๅฟŒ้ฃŸ\n22503 0 ๆˆ‘่ฐˆๅพๅทžไบบ่ฟ‡ๅˆ†ๅŠ้…’็š„ๅธ–ๅญๅ†ๅ‘่กจไธ€้\n22504 0 TPP่ฐˆๅˆคๆˆ–ไฟƒๆˆๅคšๅ›ฝๆ”พๅฎฝๅค–่ต„ๆŠ•่ต„้™ๅˆถ\n1000 Processed\n classify content\n23000 0 
่ƒฝๆไพ›ๅคš่พพ100ๅ€็š„ๆทฑๅบฆ่ฆ†็›–ๅขž็›Šๅ’Œๅƒๅ€ไปฅไธŠ็š„่”ๆŽฅ่ƒฝๅŠ›\n23001 0 ไธ€ไธ‹ๅญๅฐฑๆŠŠ่ฃ่€€6plus็š„ๅ…‰่Š’็ป™ๆŽฉ็›–ๆމไบ†\n23002 0 ๆŠคๅฃซ็ป™ๆˆ‘ๆ‰Ž้’ˆ็š„ๆ—ถๅ€™่ฏด๏ผšโ€œไธบไป€ไนˆๆฏๆฌก่ฟ™็ง่€ƒๆŠ€ๆœฏ็š„ๆดปๅฐฑ่ฝฎๅˆฐๆˆ‘ๅ‘ข\n23003 0 ่ฎฒ่ฟฐๆฏ”้žๆดฒ่ฟ˜็ƒญ็š„ๆ•…ไบ‹โ€ฆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n23004 0 ไน้พ™ไป“ๅ›ฝๅฎพ1ๅทๆ ทๆฟ้—ด่ต„ๆ–™ๅ›พ็‰‡\n1000 Processed\n classify content\n23500 0 ๅƒๆ˜ฏๅŒ—ไบฌๅธ‚ไธŠๆตทๆทฑๅœณ่‹ๅทž็š„ๆฅ็œ‹็—…\n23501 0 ่ฟ™ๅ…ถไธญๅŒ…ๆ‹ฌไบ†43ๅฎถๅŸŽๅ•†่กŒใ€37ๅฎถๅ†œๅ•†่กŒๅ’Œๅ†œไฟก็คพไปฅๅŠๅŽๅ•†้“ถ่กŒใ€ๅŽไธ€้“ถ่กŒใ€ๆฑ‡ไธฐ้“ถ่กŒ็ญ‰3ๅฎถๅค–่ต„่กŒ\n23502 0 ่ฏฅๆฌพๅนณๆฟ็”ต่„‘็š„ๅŽŸๅž‹ๆ˜ฏๅŽปๅนดCESๅฑ•ไผšไธŠๅŽ็ก•ๆ›พๅฑ•็คบ่ฟ‡็š„ๅนณๆฟ็”ต่„‘TransformerAio\n23503 0 ไธญๅ›ฝๅŽŸๅˆ›ๆ–‡ๅญฆIPๅœจ็ป่ฟ‡ๅๅ‡ ๅนด็š„็งฏ็ดฏๅŽๅœจไปŠๅนดๅ…จ้ข็ˆ†ๅ‘\n23504 0 ๅฅน่ฏดๆˆ‘็ˆธ็ˆธๆŠŠ้ฃžๆœบ้ƒฝๅผ€่ตฐไบ†ๆˆ‘ๆ€Žไนˆๅ›žๅปบๅฎๅ“ฆ\n1000 Processed\n classify content\n24000 0 ๅฅณๅฃซ็”ฑไบŽๅญๅฎซๅ’Œๅคง่‚ ไธ€ๅฑ‚่‚ ่†œ็›ธ้š”\n24001 0 ๅ…ณไบŽ้ญ”็™ฝๅ’จ่ฏข้—ฎ้ข˜ๆฑ‡ๆ€ป้ญ”็™ฝๆ˜ฏ็บฏๅคฉ็„ถๆˆๅˆ†\n24002 0 ๅŽไธบmeta7ไธขๆฐด้‡Œๆฒก้—ฎ้ข˜\n24003 0 ๅ…ฌๅธ\"ๅ›ฝ็\"ๅ“็‰Œ่ขซ่ฎค่ฏไธบๆœ€้ซ˜็ญ‰็บงโ€œไบ”ๆ˜Ÿๅ“็‰Œ\n24004 0 ๅฟƒ็†ๅ’จ่ฏขๅธˆโ€ฆๆฒ™็›˜ๅฎžๆˆ˜ๅผ€ๅง‹ๅ–ฝโ€ฆๆƒณๅญฆไน \n1000 Processed\n classify content\n24500 0 ๆ„Ÿๅ—ๅˆฐ็”จไบ†่ฟ™ไนˆไน…็”ต่„‘ๆ€ป่ฏฅๅญฆ็‚น็ปดไฟฎๅธธ่ฏ†ไบ†\n24501 0 ๆ›พ็ปไฝ่ฟ‡็š„ๅœฐๆ–น็š„็Œซๅ’ชไป–ไปฌ้ƒฝ่ฟ˜ๅœจ\n24502 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็ซŸ็„ถๆœ‰ไบบๅ”ฑๅ—ๅฑฑๅ—\n24503 1 ไบฒไปฌ.็พŽ่ตžๅทจๅฅถ็ฒ‰่ดญไธ€ๅฌ็ซ‹ๅ‡xๅ…ƒ'๏ผŒไธ‰ๅฌ็ซ‹ๅ‡xxๅ…ƒ.ๅ…ญๅฌ็ซ‹ๅ‡xxๅ…ƒ.๏ผŒๅ…จๅœบๆปกxxx้€่ดญ้‡‘xxๅ…ƒ...\n24504 0 Vx็ฆๅทžๆˆฟไบงๅ›žๅคๆ‘‡ๅฅ–16\n1000 Processed\n classify content\n25000 0 ้ซ˜็›–่œ‚่œœ็“ถๅŽ‚ๅฎถๆฑŸ่‹ๅพๅทžๅฎๅŽ็Žป็’ƒ็ง‘ๆŠ€ๆœ‰้™ๅ…ฌๅธ่œ‚่œœ็“ถๅŽ‚ๅฎถ\n25001 0 ็”ต่„‘ไธŠๆ’ญ10ๅˆ†้’Ÿๅฐฑ่ตฐ็ฅžๅŽปๅนฒๅˆซ็š„ไบ†\n25002 0 ๆฅผไธ‹ๅ”ฎๆฅผๅฐๅง่ทณ่ˆžๅ‘ขๅฑ…็„ถ้ƒฝๆ˜ฏไธ€ไธชๅŠจไฝœ\n25003 0 ๅฅน่ฏดๅŽปๅŒป้™ข้—ฎ้—ฎๅŒป็”Ÿๅˆฐๅบ•ไป€ไนˆ็—…\n25004 0 ไนฐๆ—ฉ้ฅญ็š„ไธœๅŒ—ๅคงๅฆˆ่ฎฉNไบบๆ’้˜Ÿ้€ผๆˆ‘ๅ‘้ฃ™\n1000 Processed\n classify content\n25500 0 ๆŽจไธพไธ€ไฝๅฅ็พŽๅ…ˆ็”Ÿ็„ถๅŽไธ€ๆžชๅ‡ปๆฏ™\n25501 1 
xxxxxxๅ…ƒ(RMB)ๅŠ่‹นๆžœ็ฌ”่ฎฐๆœฌ็”ต่„‘ไธ€ๅฐ!่ฏทๅŠๆ—ถ็™ป้™†ๆดปๅŠจ็ซ™:gszwa.cc ๆŸฅ็œ‹้ข†ๅ–...\n25502 0 ๅ›ฝๆฐ‘้ฉๅ‘ฝๅ†›74ๅ†›ไธŽๆ—ฅๆœฌ11ๅ†›ๆฟ€ๆˆ˜ๆ˜ฅๅŽๅฑฑ\n25503 0 ๆ— ้”กๆฐธไธญ็ง‘ๆŠ€ๆœ‰้™ๅ…ฌๅธๅœจๅŒ—ไบฌไธŽๅ›ฝๅฎถ็ปŸ่ฎกๅฑ€็ญพ็ฝฒไฟกๆฏๅŒ–ๅˆไฝœๆก†ๆžถๅ่ฎฎ\n25504 0 ็œŸๅบ”่ฏฅไปŽๆณฐๅทžๅ›žๆฅ็š„ๆ™šไธŠๅฐฑๅ‡บๆฅ็š„\n1000 Processed\n classify content\n26000 0 ๆœฌๅบ—ๅ†้€ไปทๅ€ผxxๅ…ƒ็š„ๆฒๆตด้œฒไธ€็“ถ\n26001 0 ็›ธๆฏ”ไน‹ไธ‹ๅŽไธบๅ’Œๅฐ็ฑณๅทฒ็ปๅพˆๆŽฅ่ฟ‘\n26002 0 ๅ› ๆญคๅพˆๅฎนๆ˜“ๅผ•่ตทๅ‘ผๅธ็ณป็ปŸ็–พ็—…\n26003 0 ๆ–นๅคง้›†ๅ›ข็ญพ็บฆๅˆๅปบ16ไบฟๅ…ƒๅ…‰ไผๅ‘็”ต้กน็›ฎ\n26004 0 ่พพๅฎžๆ™บ่ƒฝไธŽไธœๅฐๅธ‚ไบบๆฐ‘ๆ”ฟๅบœใ€ๆทฑๅœณๅธ‚ไผ˜่ง†ๆŠ€ๆœฏๆœ‰้™ๅ…ฌๅธๅฐฑไธœๅฐๅธ‚ๆ™บๆ…งๅŸŽๅธ‚PPPๆŠ•่ต„้กน็›ฎ่พพๆˆไบ†ๅˆไฝœๆ„ๅ‘\n1000 Processed\n classify content\n26500 0 ่‹ๅทž็ฑ่‘—ๅๆณ•ๅญฆๅฎถๅ€ชๅพๆ—ฅๅฅฅๅ‚ๅŠ ๅฏนไบŒๆˆ˜ๆ—ฅๆœฌๆˆ˜็Šฏๅฎกๅˆค็š„ไบฒ่บซ็ปๅކ\n26501 0 ๅŒ—ไบฌ้–ๆฑŸๅ•†ไผšไผไธšๅฎถๅ„ไฝ้ข†ๅฏผ่Ž…ไธดๆŒ‡ๅฏผ\n26502 0 ๅ‘็Žฐ่ฟๆณ•ๆŽฅ้€ๅญฆ็”Ÿ็š„่ฝฆ่พ†ๅŠๆ—ถๅ‘ๅ…ฌๅฎ‰ไบค็ฎก้ƒจ้—จไธพๆŠฅ\n26503 0 ๅธฆๆฅไบ†10ๆฌพ้ฆ–ๆฌก็™ปๅœบ็š„ไฟๆ—ถๆท่ฝฆๅž‹ๅ’Œ500็‚น้ขๅค–ๆˆๅฐฑๅˆ†\n26504 0 2ใ€ๆ–ฝๅทฅ่ฎฒๆญฅ้ชคๅคง็†็Ÿณ่ƒŒๆ™ฏๅข™ๆ–ฝๅทฅๆ–นๆณ•ไป‹็ป\n1000 Processed\n classify content\n27000 0 ไธ‹ไธ€็ญๅŒ—ไบฌ็š„้ฃžๆœบๅˆไป–ๅฆˆ็š„ไปŽๅ…ญ็‚นๅปถ่ฟŸๅˆฐๅ…ซ็‚นๆ•ด็š„ๆˆ‘ๅ‡Œๆ™จๅˆฐๅฎถๅคงๅทดไธŠ่ฟ˜็œผ้“ฎ้“ฎ็œ‹็€้’ฑไธขไบ†\n27001 0 ่‚ฏๅฎšๆœ‰ๅฎƒ็š„้“็†~่ดญไนฐ่ฏฆๆƒ…ๅŠๆ”ป็•ฅ\n27002 0 ๆˆ‘ไปฌไธ€่ตทๆŒคๅŒ—ไบฌๅœฐ้“ไธ€ๅท็บฟ็š„ๆ—ฅๅญ\n27003 1 ๅ‘จไธ€ๅฅฝ๏ฝž xxxxๅนดๆณฐๅบท่ดขๅฏŒ้š†้‡ๆŽจๅ‡บ ็จณๅฅxๅทๅ’Œ ๅ“่ถŠxๅทxๆœŸ ไธคๆฌพไผ˜่ดจไบงๅ“๏ผŒ็จณๅฅxๅท่ตท็‚นx...\n27004 0 ๆœ€้ซ˜ไบบๆฐ‘ๆณ•้™ขๅฐๅ‘้€š็Ÿฅ่ฆๆฑ‚่ฎค็œŸ่ดฏๅฝปๆ–ฐ\n1000 Processed\n classify content\n27500 0 ๆฒฃไธœๆ–ฐๅŸŽๆ•™่‚ฒๅ…šๅง”ๅฌๅผ€2015ๅนดๅ…š้ฃŽๅป‰ๆ”ฟๅปบ่ฎพๆšจๆฒป็†ๆ•™่‚ฒไนฑๆ”ถ่ดนๅทฅไฝœไผš่ฎฎ\n27501 0 ๅˆšๆ‰้‚ฃไฟฎ็œ‰ๅˆ€ๅœจ่ƒณ่†ŠไธŠๅˆ’ไบ†ไธคไธ‹\n27502 0 ไบš้ฉฌ้€Šๆ”ถ่ดญGoodreads\n27503 0 ๆ–ฐๆณ•่ง„็š„ๅฎžๆ–ฝๅฏนๆˆ‘่พ“ๅพ€้Ÿฉๅ›ฝไบงๅ“็š„ไผไธšๅฝฑๅ“้‡ๅคง\n27504 0 ๅฝ“ไธ€ๅˆ‡้ƒฝไบŽๆˆ‘ไบ‹ไธŽๆ„ฟ่ฟๆˆ‘่ฆ่ฎฐๅพ—้ฃžๆœบไนŸๆ˜ฏ้€†้ฃŽ่€Œ่ตท้ฃž\n1000 Processed\n classify content\n28000 0 
ๆˆ‘็Ÿฅ้“ๆœ‰ๅพˆๅคšไบบๅœจ่ดจ็–‘ๆˆ‘ไปฌ็š„ไบงๅ“\n28001 0 ๆฎๆœ€้ซ˜ๆฃ€ๅฏŸ้™ข็ฝ‘็ซ™7ๆœˆ30ๆ—ฅๆถˆๆฏ๏ผšๆ—ฅๅ‰\n28002 0 PE็ณปๅˆธๅ•†็Žฉๆณ•๏ผš็ชๅ‡บๆŠ•่กŒๆ‘’ๅผƒ่ฅไธš้ƒจ\n28003 0 ๆˆ‘ๅ‘่ช“ไปฅๅŽๅ†ไนŸไธ่ฆๆ—บๅญฃ็š„ๆ—ถๅ€™ๅŽปๆ—…ๆธธไบ†ไฝ ้€ ๅŽฆ้—จ้ผ“ๆตชๅฑฟๆœ‰ๅคšๅฐ‘ไบบๅ—ๅ•Š\n28004 0 ไป–่ฏด60ๅ‘จๅฒๅŽ้ข†ๅˆฐ็š„้€€ไผ‘้‡‘่ฟ˜ๆ˜ฏๅซไป€ไนˆ็š„\n1000 Processed\n classify content\n28500 0 ๅ…ฌไบคๅพ€ๅ‰ๅผ€ไบ†xๅ…ฌ้‡Œ็ป•็ซ‹ไบคๆกฅๆ•ดๅœˆ่ฝฌๅผฏ\n28501 0 ๅœจ้ฃžๆœบไธŠๅฏ่ƒฝไนŸไผš่ขซ่ฆๆฑ‚ๅ…ณ้—ญ\n28502 0 ๅ›žๅคดๆ•ด็†ไธ‹ๆˆ‘ๅ้ฃžๆœบ็š„ๆ—ถๅ€™้š่บซๆบๅธฆ็š„ไธœ่ฅฟ\n28503 0 ่ฟๆณ•ไบ†ๆ— ่ฎบไป€ไนˆ็†็”ฑ้ƒฝ่ฆๆŽฅๅ—ๅˆถ่ฃ\n28504 0 ๅฝ“ๆˆ‘ไปฌ็œ‹ๅˆฐ้‚ฃไบ›่ขซไบบ่ดจ็–‘็š„ๆขฆๆƒณไธ€ๅคฉๅคฉๅ˜ๆˆ็Žฐๅฎž็š„ๆ—ถๅ€™\n1000 Processed\n classify content\n29000 1 ใ€Šxxxxๆˆ‘ไธบ็˜ฆ็‹‚ใ€‹็พŽ้›…็พŽๅฎนไผšๆ‰€็šฎไธ‹ๆŠ‘่„‚ๆœฏ๏ผŒไธ€ไธชๆœˆๅšไธ€ๆฌก๏ผŒไบ”ๅˆ†้’Ÿๆ“ไฝœๅฏ่พพๅˆฐๅ››ไธชๆœˆ็‚น็ฉด็š„ๆ•ˆๆžœ๏ผŒ...\n29001 0 ไธๅŒๅˆ†ๆ•ฐใ€ไธๅŒ็Šฏ่ง„่ฃๅˆคๆ‰‹ๅŠฟๆ˜ฏไป€ไนˆโ€ฆโ€ฆๆ–ฐๅฅ‡ๆœ‰่ถฃ็š„่ฏพ่ฎฉๅญฉๅญไปฌๅ€ๆ„Ÿๆ–ฐ้ฒœ\n29002 0 ๅˆฐๆœ€ๅŽๆˆ‘ๅˆ†ไบ†24ๆœŸไฝ ๅˆไธๆ„ฟๆ„ๅ˜ๅฆไบ†\n29003 0 ๅพˆ้ฆ™~ๅœฐๅ€๏ผšๅฐๅบ—ๅŒบๅฏ‡ๅบ„่ฅฟ่ทฏ็Ž‹ๅบœไบ•่ดญ็‰ฉไธญๅฟƒ่ฅฟไพง้˜ณๅ…‰ๅœฐๅธฆๅบ•ๅ•†\n29004 1 ไธ‰ๆœˆๅฅณ็Ž‹ๆœˆ๏ผŒ่‡ด่Šฑๆ ทๅฅณไบบ๏ผŒๆœ€ๅฅฝ็š„็คผ็‰ฉ็ป™ไบฒ็ˆฑ็š„่‡ชๅทฑ๏ผๆœ€ๅคงๅŠ›ๅบฆๆดปๅŠจๆฅ่ขญ๏ผšx.x~x.x่ดญ็พฝ่ฅฟไบงๅ“ๆปก...\n1000 Processed\n classify content\n29500 0 ๅฎณ็š„ๆˆ‘ๆ‹ฟ็€ๆ‰‹ๆœบไธ€็›ด็ญ‰ไธ€็›ด็ญ‰\n29501 0 ่ญฆๅฏŸ่œ€้ปๆ็คบๆ‚จ๏ผš่‡ช่ง‰ๅšๅˆฐโ€œๅ–้…’ไธๅผ€่ฝฆ\n29502 0 ่€Œ็™พๅˆ†ไน‹80็š„ไบบ็š„ไบšๅฅๅบท้ƒฝๆ˜ฏๅ› ไธบๅไบŒ็ป็ปœไธ็•…่€Œๅฏผ่‡ด็š„\n29503 0 ๅฏๆ นๆฎๆฏไฝ้กพๅฎข็š„่ฆๆฑ‚่ฎพ่ฎกไธ“ๅฑžๅฎšๅˆถๆฌพ็พŽ็”ฒ\n29504 0 ๆด›ๆ‰็Ÿถ่ฎพ่ฎกๅธˆAlexanderPurcellๅ€Ÿ้‰ดๆˆ˜ไบ‰ไธญไฝฟ็”จ็š„ๆฐด้›ท\n1000 Processed\n classify content\n30000 0 ๅนฟๅทžๅœฐ้“้ข„ๆต‹ๆœŸ้—ดๆœ€้ซ˜ๆ—ฅๅฎขๆต้‡ๅฐ†่ถ…600ไธ‡ไบบๆฌก\n30001 0 ็„ถ่ฟž็ปญไธคๅคฉ้™ชๅฆˆๅฆˆๅŽปๅŒป้™ข็œ‹็—…\n30002 0 ้‡ๅบ†ๅธ‚็ฌฌไบ”ไธญ็บงไบบๆฐ‘ๆณ•้™ขๅฏน่ฏฅ็บ ็บทไฝœๅ‡บ็ปดๆŒๅŽŸๅˆค็š„ไบŒๅฎกๅˆคๅ†ณ\n30003 0 ๅฐๅทโ€œๅฟตๆ—งโ€ไธ€ไธชๅŠๆœˆๅฏนไธ€ๅฎถๅบ—โ€œไธ‹ๆ‰‹โ€3ๆฌก\n30004 1 
ๅฐšไฝณๅ“็‰Œๅฅณ่ฃ…ๆŠ˜ๆ‰ฃ็ฅๆ‚จๅ…ƒๅฎต่Š‚ๅฟซไน๏ผไธ‰ๅ…ซๅฅณ็ฅž่Š‚ๆ˜ฏๆˆ‘ไปฌ่‡ชๅทฑ็š„่Š‚ๆ—ฅ๏ผŒ็ˆฑ่‡ชๅทฑๅฐฑไธบ่‡ชๅทฑไนฐไปถๆ–ฐ่กฃๅง๏ผๅ…ญใ€ไธƒ...\n1000 Processed\n classify content\n30500 0 ๆฒกๆœ‰ๅ–ๆถˆ่ˆช็ญๆฒกๆœ‰ๅปถ่ฏฏๅทฒ็ปๅไธŠ้ฃžๆœบๅฎ‰ๅฟƒ็ญ‰ๅพ…ๅ›žๅฎถไฝ ไปฌ่ฆๆƒณๆˆ‘ๆˆ‘ไนŸไผšๆƒณไฝ \n30501 0 ๅบ”ๅฏน้€š่ƒ€โ€”โ€”่‚กๅธ‚ๆœบไผšๅฐ‘\n30502 0 ๅŒ—ไบฌ315ๆถˆ่ดน่€…ๆŠ•่ฏ‰็ฝ‘ไธŠๅ„็งๆŠ•่ฏ‰ๅฐ็ฑณๅ…ฌๅธ็š„\n30503 0 MT6735็ญ‰ๅนณๅฐ็š„Layoutๅทฅไฝœ\n30504 0 ็‰›ๅฅถ็š„็ฅžๅฅ‡ๅŠŸๆ•ˆ๏ผš1็‰›ๅฅถ+้ข็ฒ‰=ไผ˜่ดจ้ข่†œ\n1000 Processed\n classify content\n31000 1 ๅฎœๅฎพๆ’ๆ˜Œๅ…ฌๅธ่‚ๅฐ็Žฒๆๅ‰้ข„็ฅๆ‚จ๏ผšๅ…ƒๅฎต่Š‚ๅฟซไน๏ผๆœ‰้œ€่ฆ่ต„้‡‘็š„ๆœ‹ๅ‹ๅฏไปฅ็”ต่ฏ่”็ณปๆˆ‘ๅ‡†ๅค‡่ต„ๆ–™ไบ†๏ผŒๆœ€ๅฟซxๅคฉ...\n31001 1 ^้•ฟๆœŸ่ฏšไฟกๅœจๆœฌๅธ‚ไฝœๅ„็ฑป่ต„ๆ ผ่Œ็งฐ๏ผไปฅๅŠๅฐ /็ซ ใ€็‰Œใ€ โ€ฆโ€ฆ็ญ‰ใ€‚็ฅฅ๏ผšx x x x x x x ...\n31002 0 PS๏ผš่€ๆŽๆ็คบ๏ผšๅŒ่‚ฉ่ƒŒ่ƒŒๅœจ่บซๅŽ\n31003 0 ๆฅ่‡ชไธŠๆตทใ€ๅนฟไธœ็ญ‰ๅœฐ็š„8ๅฎถๆœบๅ™จไบบ้‡็‚น้กน็›ฎไธŽ้•ฟๆฒ™้›จ่Šฑ็ปๅผ€ๅŒบ็ญพ่ฎขๆŠ•่ต„ๅ่ฎฎ\n31004 0 ่ฎพ่ฎกไบ†่ฟ™ๆฌพไผธ็ผฉๅผ็”ตๆบๆŽฅ็บฟๅ™จ\n1000 Processed\n classify content\n31500 0 ่ฟ™ๆ˜ฏๅœจๅ‘ไบบ่ดฉๅญไธ็”จๅˆคๆญปๅˆ‘ๆŒ‘ๆˆ˜ๅ—\n31501 0 ๅœจๅฎณๆ€•็ญ‰็€่ฃๅˆค็š„ๅˆคๅ†ณๅง\n31502 0 ๆŠคๅฃซ้•ฟไผฐ่ฎกๅˆšไปŽๅฐๆนพๅ‚่ง‚ๅ›žๆฅ\n31503 1 ๅฎžๆ™ฏ็Žฐๆค–ๅช่ฆxไธ‡๏ผŸใ€ๆ—ฅ ๆ˜ฑ ๅŸŽใ€‘่ถ…ไฝŽๆธžไฟฏไป…้œ€xไธ‡่ตท๏ผŒไนฐๆค–ๆ›ดๅฏๅ‚ๅŠ xๅนดๅ†…็š„่ถ…ๅ€ผ็‰นๆต๏ผŒๆค–ๆบๆœ‰ ้˜...\n31504 0 ไฝ ไธๆƒ…ๆ„ฟ่Šฑ20ๅ…ƒๅผ€ๅผ ๅญ˜ๆฌพ่ฏๆ˜Žๆฒกไบบ้€ผไฝ \n1000 Processed\n classify content\n32000 0 92ๅ•†ๅŸŽๅ็งฐ๏ผšAmazonๅ•†ๅ“ๅ็งฐ๏ผšNewChapterZyflamendNighttime\n32001 0 ๆœ€ๅŽๅฐๅคงไปŽvcr้‡Œ่ทณๅ‡บๆฅ็š„ๆ—ถๅ€™ๅฟไธไฝๆณช็›ฎ\n32002 0 xxx่ฎคไธบ็™พๅบฆๆปฅ็”จRobotsๅ่ฎฎ\n32003 0 ๆˆ‘่Šฑxxxๅ…ƒ้‚ฎ่ดญไบ†ไธ€ๅฐๆธ…ๅŽๅŒๆ–นๅนณๆฟ็”ต่„‘\n32004 0 googใ€Žไธญๅ›ฝ็•™ๅญฆ็”Ÿๅ› ๅ‡Œ่™ๆกˆ่ขซๅˆคๆ— ๆœŸ\n1000 Processed\n classify content\n32500 0 ๆœฌๅ›ฝๆŠ•่ต„่€…ๅœจ่ดญไนฐ่‚ก็ฅจใ€ๅ€บๅˆธใ€้‡‘่ž่ก็”Ÿๅ“ๅ’Œ่ฎค่ดญ่‚กๆƒๆ–น้ขๅ—ๅˆฐไธ€ๅฎš้™ๅˆถ\n32501 0 ่€ŒไปŠๅนด4ๆœˆๆ‰ไธŠๅธ‚็š„AppleWatchๆŽ’ๅๆœซๅฐพ\n32502 0 ๅ‡บ่‡ชDonaireArquitectos\n32503 0 
ๅšๅ†ณๆŠŠ่…่ดฅๅˆ†ๅญๆธ…้™คๅ‡บๅ…šใ€ๅ†›้˜Ÿใ€ๅนฒ้ƒจ้˜Ÿไผ\n32504 1 ๆญฆ่ฟ›็บขๆ˜Ÿ็พŽๅ‡ฏ้พ™\"็ฎญ็‰Œๅซๆตด\"ๆ–ฐๅนด้€ๅฅฝ็คผ๏ผŒๅ‚ๅŠ ๆดปๅŠจ่€…๏ผŒๅ…่ดน้€่ง’้˜€๏ผŒๅปถ้•ฟ่ดจไฟไธ€ๅนด๏ผŒๆ›ดๆœ‰xxxๅ…ƒ้ฉฌๆกถ...\n1000 Processed\n classify content\n33000 0 ๆœ€ๅฅฝ่Šฑ1ๅˆ†้’Ÿๆ—ถ้—ดๆฃ€ๆŸฅไธ€ไธ‹่ฝฆๅบ•ไธ‹ใ€่ฝฆ่ฝฎไธŠๆˆ–ๅ‘ๅŠจๆœบไธŠๆœ‰ๆฒกๆœ‰ๅฐๅŠจ็‰ฉ\n33001 0 ้บฆ่ฟชNBA็”Ÿๆถฏๅ”ฏ็พŽ้œ‡ๆ’ผMV\n33002 0 127ๅ้ข„ๅพ้’ๅนดไบ•็„ถๆœ‰ๅบๅœฐ่ฟ›ๅ…ฅๅ„ไฝ“ๆฃ€็ง‘ๅฎคๆŽฅๅ—ไฝ“ๆฃ€\n33003 0 ๆฒณๅ—|่ฎธๆ˜Œๅธ‚ไธญ็บงไบบๆฐ‘ๆณ•้™ขๅฎกๅˆคไธšๅŠกๅบญๅฎคไบบๅ‘˜ๅๅ•ใ€่”็ณปๆ–นๅผ\n33004 0 ๅช่ฆไฝ ็กฎๅฎšๆ”ถ่ดง่ฟ”ๅˆฉ็š„็Žฐ้‡‘ๅฐฑๆ‰“ๅˆฐไฝ ๆ”ฏไป˜ๅฎ\n1000 Processed\n classify content\n33500 0 ๅฑ…็„ถๆ˜ฏ99ๅนด8ๆœˆ2ๆ—ฅ็š„็”Ÿๆ—ฅโ€ฆโ€ฆ\n33501 0 ๅฎ่ขซๅ‘Š็Ÿฅๆฎ‹ๅฟ็š„็œŸ็›ธไนŸไธๆ„ฟๅฌๆธฉๆŸ”็š„ๆฌบ็ž’\n33502 0 ็ฉฟ็€cosๆœ็š„ไบŒๆฌกๅ…ƒๅฐไผ™ไผด่ขซ่ฏดไฝœ็ฉฟ็€ๆšด้œฒ\n33503 0 ่ฏทๅ…ถไป–ๅ•ไฝ่”็ณปๅ‚ๅŠ 18ๆ—ฅๆ‹›่˜ไผš\n33504 0 ๅธธๅทžๅธ‚ๆƒ ๆฐ‘ๅทฅ็จ‹ไน‹ไธ€็š„ๅฐ‘ๅนดๅฎซไธ‰ๆœŸๅทฅ็จ‹ๆญฃๅœจ็ดงๅผ ๅปบ่ฎพไธญ\n1000 Processed\n classify content\n34000 0 ๆˆ‘่ง‰ๅพ—ๆœ‰ไบ›ไบบ็œŸ็š„ๆ˜ฏๅคงๆฆ‚ๅชๆœ‰ไบฒๅฆˆ็ˆ†็‚ธๆ‰ๆ˜ฏ็œŸ็š„ๅ…ถไป–ๅช่ฆไธๆ˜ฏๅฅฝไบ‹ๅฝ“ไบ‹ไบบๅ›žๅบ”ๅฐฑๅ…จๆ˜ฏๅ‡็š„\n34001 0 ่€Œๆœ้ฒœ็š„ๆ ธๆญฆๅ™จๆ˜ฏๅ‘จ่พนๆ‰€ๆœ‰ๅ›ฝๅฎถ้ƒฝไธ่ƒฝๆŽฅๅ—็š„\n34002 0 ๅ’ŒๆŸไธช้€ผๆ ทๅœจ่Šๅคฉๆˆ‘่ฏดๆˆ‘ๅœจ่ตค่†Šๆ•ท่ฏๆœ‰ๅ›พๆœ‰็œŸ็›ธ\n34003 0 ไธ“ๆ ไฝœๅฎถไธ‰ๅ…ฌๅญ้šๆ‰‹่ฎฐๅ‡บไบบ็”Ÿ็ฌฌไธ€ๆกถ้‡‘\n34004 0 ๆธ…ๆ”ฟๅบœๅ’Œๅฐ‘ๆž—ไน‹้—ด็š„ๅ…ณ็ณปไนŸๅœจๅ‘็”Ÿๅพฎๅฆ™็š„ๅ˜ๅŒ–\n1000 Processed\n classify content\n34500 0 ๅŽไธบๆถˆ่ดน่€…ไธšๅŠกไธŠๅŠๅนดๆ”ถๅ…ฅ90\n34501 0 ๆถๆž่Šฑๅƒ้ชจๅฏปๆ‰พxxxxChinaJoyๆœ€็พŽshowgirl\n34502 0 ๅŽไธบๆœ€ๅމๅฎณ/ๅŽไธบ่ฃ่€€7ๅ…จ็ฝ‘้€š็‰ˆๅผ€็ฎฑ๏ผšๅŒ…่ฃ…็›’็œ‹็‚นๅ่ถณ\n34503 0 0ๅ…ฌๅฏ“ๅฏนๅ…จ้ƒจๆ™บ่ƒฝๅฎถๅฑ…ๅฎž็Žฐโ€œไธ€้”ฎๅŒ–โ€ๆ“ไฝœ\n34504 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅคฉๅฎๅฉšๅบ†้ƒจ๏ผŒๆ„Ÿ่ฐขๆ‚จๅฐ†ไบ”ๆœˆไปฝ็š„ๅ–œๅฎด้€‰ๆ‹ฉไบ†ๅคฉๅฎ้…’ๅบ—๏ผๅฆ‚ๆžœๅฉšๅบ†ๅ…ฌๅธๆ‚จ่ฟ˜ๆฒกๆœ‰้€‰ๅฎš๏ผŒ่ฏทไธŽๆˆ‘ไปฌ...\n1000 Processed\n classify content\n35000 1 ๅปบ่ฎพ้“ถ่กŒxxxx xxxx xxxx xxxxๆˆทๅ:้™ˆไบฎ\n35001 1 
ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏไธŠๆตทไผ็š‡ๆŠ•่ต„็ฎก็†ๆœ‰้™ๅ…ฌๅธ็š„ไธšๅŠกๅ‘˜ๅฐ็Ž‹๏ผŒๆˆ‘ๅ…ฌๅธไธ“ไธšไปฃๅŠžๅฐ้ข่ดทๆฌพ๏ผŒไฟก็”จๅก๏ผŒๅชๅ‡ญไธชไบบ่บซไปฝ...\n35002 1 ๆญๅ–œๅ‘่ดข๏ผŒ็ฅๆ‚จ่บซไฝ“ๅฅๅบท๏ผŒไธ‡ไบ‹ๅฆ‚ๆ„ใ€‚ ๆˆ‘ไปฌๆ˜ฏๅšๆ•ดไฝ“ๆฉฑๆŸœ๏ผŒ่กฃๆŸœ็š„ๅŽ‚ๅฎถใ€‚ๅนดๅŽ็‰นไปท่กฃๆŸœxxxๅ…ƒ/ๆ–น๏ผŒ...\n35003 0 ่ฅฟ่ตต่ก—ใ€ไธญๅคง่ก—ๅ—ๆฎต9ๆœˆไปฝๅณๅฐ†ๅ…จ็บฟๅผ€้€š่ฟ่กŒ\n35004 0 ๆ˜Ÿๅฒ›็Žฏ็ƒ็ฝ‘โ€œๆธฏๅช’ไธญๅคฎไธฅๆŸฅๅ›ฝไผๅ˜ๅฎถไผ้‡ŠๅŠ ๅคงๆ•ดๆฒปไฟกๅทโ€\n1000 Processed\n classify content\n35500 0 ไธ€่ง‰้†’ๆฅๅ‘็Žฐๆ‰‹ๆœบ้‡Œๅคšไบ†ไธ€ๅผ ็…ง็‰‡\n35501 0 ๅŒป็”Ÿๅ’Œไป–่ฎฒ่งฃๆˆ‘็š„็—…ๆƒ…ๅˆ†ๆžๆˆ‘็š„็Šถๅ†ตไป–ๅœจๆ—่พนๆ‹ฟไธชๆœฌๅ“ผๅ“งๅ“ผๅ“ง็š„ๅ†™\n35502 0 ้€š่ฟ‡Carrobot่ฝฆ่ๅœ่ฝฆ่ฝฝๆ™บ่ƒฝ็ณป็ปŸ\n35503 1 ๆ‚จๅฅฝ๏ผ่ดขๅ•†่ดขๅฏŒๅœจxๆœˆxๆ—ฅ๏ฝžxๆœˆxxๆ—ฅๆŽจๅ‡บโ€œๅฅณไบบ่Š‚้ฆจ้ฆ™ๅผ€ๅ•็คผโ€ๅœจๆดปๅŠจๆœŸ้—ดๅ‡กๆ˜ฏ่ดญไนฐxไธ‡ๅŠไปฅไธŠ็†่ดข...\n35504 0 ๆต™ๆฑŸๅฎๆณขๅคฉ็ซฅ็ฆ…ๅฏบๅฐ†ไธพๅŠžโ€œๅคฉ็ซฅๅฑฑไผ ็ปŸ็ฆ…ไธƒโ€ๆณ•ไผš\n1000 Processed\n classify content\n36000 0 ๅ…ฌๅธ่ฆ่ƒฝๅƒๅพฎ่ฝฏไธ€ๆ ทๆœ‰ๆด—ๆพก็š„ๅœฐๆ–นๅฐฑๅฅฝไบ†ๆˆ‘่‚ฏๅฎš่ฟ‡ๅคœๅœจ้‡Œ้ข็”ปๅ›พๅ•Šๅ•Šๅ•Šๅ•Šๅ•Šๅ•Š\n36001 0 ๅ‡กๆ˜ฏไธŽๆ”ฟๅบœๅˆฉ็›Šไบง็”Ÿๅ†ฒ็ช็š„้ƒฝ่ฆๆ‰“ๅ‡ป\n36002 0 ๅŽŸๆฅๆˆ‘ไธๅ–œๆฌข่ƒ–ๅญ็š„็œŸ็›ธๅฐฑๆ˜ฏๆˆ‘ๆ˜ฏไธ€ไธช่ƒ–ๅญ\n36003 0 ๆ‰“ๅผ€็”ต่„‘่ขซ่นฆๅ‡บๆฅ็š„ไธ€็™พๅคšๆกๅทฅไฝœไฟกๆฏๅ“ๅˆฐไบ†โ€ฆโ€ฆ\n36004 0 ๆฑŸ่‹ๆตทๅฎ‰ๅ…ฌๅฎ‰ๅฑ€ไธ€ๅๆญฃๅผๆฐ‘่ญฆๅซๆŽ้พ™\n1000 Processed\n classify content\n36500 0 ้ฆ–ๆ‰น100่พ†ๆŠ•ๅ…ฅ่ฟ่ฅ็š„็บฏ็”ตๅŠจๅ‡บ็งŸ่ฝฆ\n36501 0 ๅฎฟ่ฟๆณ—ๆดชๅฝ“ๅœฐๅ†œๆ‘็‰ฉๆต็š„็•…้€šๆ˜ฏ็ฉ†ๅขฉๅฒ›ๆ‘ๅ‘็”Ÿๅ˜ๅŒ–็š„ๅŽŸๅ› ไน‹ไธ€\n36502 0 ็ผๆตทๅธ‚ๆ•™่‚ฒๅฑ€ๅœจๅธ‚ๆ”ฟๅบœ้—จๆˆท็ฝ‘็ซ™ๅ‘ๅธƒไบ†ๅ…ฌ็คบ\n36503 1 ๅฐŠๆ•ฌ็š„ๆ–ฐ่€้กพๅฎขๆ‚จไปฌๅฅฝ๏ผๅˆๆฑŸไธ€่ฝฌ็›˜ๆฑ‡้€š่ถ…ๅธ‚่ดๅ› ็พŽๅฅถ็ฒ‰ๆ‰“x.xๆŠ˜ไบ†๏ผŒๆ—ถ้—ดๆ˜ฏxๆœˆx.x.x.ๅท๏ผŒ่ฟ˜...\n36504 0 ๆฒกๆœ‰็ƒŸๆฒกๆœ‰้…’ๆฒกๆœ‰ๆ‰‹ๆœบ็š„ๆ™šไธŠ\n1000 Processed\n classify content\n37000 0 ็Žฐๅœจ็š„G2ไบฌๆฒช้ซ˜้€ŸๅŒๅ‘็š„ๅ ฐๆกฅๆœๅŠกๅŒบไปฅๅŠS38ๅธธๅˆ้ซ˜้€ŸๅŒๅ‘็š„ๆป†ๆน–ๆœๅŠกๅŒบใ€่ทๅถๅฑฑๆœๅŠกๅŒบ\n37001 0 ๆน–ไบบ้ข†่ก”NBA็ƒ้˜Ÿไปทๅ€ผๆŽ’่กŒๆฆœ\n37002 0 ๅฅฝๅคšๅฐไผ™ๆ™ƒๅŠจๆ‰‹ๆœบ่ฏ•ๅ›พโ€œ่„ฑๅ…‰โ€ๆ—ถ\n37003 0 
่€Œไธๆ˜ฏๅฏนไบŽ่ดขๅฏŒๅ’Œๅ‡บ่บซ็š„่†œๆ‹œ\n37004 0 ๅŸŽ็ฎก็ง‘้•ฟ็—ด่ฟท็ฝ‘ๆธธไธ‰ๅนด่Šฑ1500ไธ‡ๆœ‰ๅธๆฏ’ๅ‰็ง‘ไปๅ…ฅๅ…šๆ™‹ๅ‡็œŸ็š„ๅ‡็š„\n1000 Processed\n classify content\n37500 0 G25้•ฟๆทฑ้ซ˜้€ŸๅŒๅ‘ๅฎๆทฎๅ—ไบฌๆฎตไปŽ็ซน้•‡่‡ณ็ซน้•‡ไบ’้€š้™้€Ÿๅ–ๆถˆ\n37501 0 ๅทฒ่ขซๆทฑๅŸ‹ไบŽๅ—ๆžๅ†ฐไธ‹4000็ฑณๆทฑๅค„้•ฟ่พพ1400ไธ‡ๅนด\n37502 1 ไฝ ๅฅฝ๏ผ้ฅฎ้ฉฌ่ก—ๅ“ฅๅผŸ่ฟŽไธ‰ๅ…ซ่Š‚๏ผŒ็‰นๅˆซๆŽจๅ‡บ่ถ…ๅ€ผ็š„ๅฝฉ่‰ฒๆก็บนๆฏ›่กซๆŠ˜ๅŽxxxใ€้ฆ™ๅž‹ๅฝฉ่‰ฒๅค–ๆญๆŠ˜ๅŽxxxใ€ๆ—ถๅฐš...\n37503 0 ไฟ„็ฝ—ๆ–ฏๅ›ฝ้™…ๅ†›ไบ‹ๆฏ”่ต›ไธป่ฃๅˆคๅพท็ฑณ็‰น้‡Œยทๆˆˆๅฐ”ๅทดๅš็ง‘ๅฐ‘ๅฐ†่กจ็คบ\n37504 0 ็œŸ็š„ๆ˜ฏ่ฆ่ขซ่Šฑๅƒ้ชจ็”ต่ง†ๅ‰ง็ป™่™ๆญปไบ†\n1000 Processed\n classify content\n38000 0 ๆˆ‘ๆŠ•็ป™ไบ†โ€œๅผ ไธนๅณฐ้ฅฐไธœๆ–นๅฝงๅฟโ€่ฟ™ไธช้€‰้กน\n38001 0 ้ฟๅผ€่ฅฟ้ƒจๅคชๅคš้กถ็บง1ๅทๆˆ–2ๅทไฝ็š„ๅฏนๆŠ—\n38002 0 ๆ‰“ไปŽ2015ๅนดๅผ€ๅง‹ๅˆฐไธŠๅ‘จ้ผป็‚Žไธ€็›ดๆฒก็Šฏ\n38003 0 ๅœจๅ—ไบฌ๏ผšๅ’Œไพ„ๅฅณๆ‹‰ๅง†ๅœจไธ€่ตท\n38004 1 ไฝ ๅฅฝ๏ผŒ็พŽๅฅณใ€‚้ฉฌไธŠx.xๅฟซๅˆฐไบ†๏ผŒๆˆ‘ไปฌๅฎถๆ˜ฅๆฌพๅ…จ้ขไธŠๅธ‚๏ผŒๆœ‰ๅพˆๅคšๆฌพ๏ผŒ้€‚ๅˆไฝ ใ€‚ไฝ ๆฅ็œ‹็œ‹ใ€‚่€Œไธ” ็Žฐๅœจ๏ผŒๅ…ซ...\n1000 Processed\n classify content\n38500 0 ้•ฟๆœŸ็”ต่„‘ๅ’Œ็ฉบ่ฐƒๅทฅไฝœ็š„็šฎ่‚ค็ผบๆฐด\n38501 0 ANNAๆ–ฐไธ€ไปฃDD้œœ่ขซ็งฐไธบโ€œๅŠจๆ€ๅ…จๆ•ˆไบงๅ“โ€\n38502 0 โ€”โ€”ๆฒชๆŒ‡่ทŒ1%็›˜ไธญๅคฑ3900ๅทจ้œ‡็œŸ็›ธ\n38503 0 50ๅฒๅˆ˜้’ไบ‘ใ€46ๅฒ้ƒญ่”ผๆ˜Žไธๆƒณ็”Ÿ\n38504 0 ๆœๅŠก่Œƒๅ›ดโ€”โ€”่ฝฆ้™ฉ๏ผšไบคๅผบ้™ฉใ€ๅ•†ไธš้™ฉ\n1000 Processed\n classify content\n39000 0 xยทxx่†ๅทž่‡ชๅŠจๆ‰ถๆขฏไบ‹ๆ•…ๆถ‰ไบ‹ไผไธš\n39001 0 ๅฌดๆ—ญๅฑQAQโ€œๆŸทๆ•”ไน‹็ฅธ็ปตๅปถ่‡ณไปŠ\n39002 1 ็ฒพ้€‰่ดงๅ“xๆŠ˜ไผ˜ๆƒ ๏ผŒๅฟซๅฟซๆฅochirlyๅบ—้“บ้€‰่ดญๅง๏ผๆˆ‘ๆ˜ฏๅฏผ่ดญๅจŸๅจŸๆœŸๅพ…ๆ‚จ็š„ๅ…‰ไธด๏ผ\n39003 0 ่Šฑๅƒ้ชจๆ˜ฏๆˆ‘ๆดป่ฟ™ไนˆๅคงๅ”ฏไธ€ไธ€้ƒจ็”จๅฟƒๅŽป็ˆฑ็š„ไฝœๅ“\n39004 1 ๅบ†ๅ…ƒๅฎต้•ฟๅŸŽไธบไบ†ๅ›ž้ฆˆๆ–ฐ้กพๅฎข๏ผŒๅ‡กๅœจๅ…ƒๅฎตๅฝ“ๅคฉไธ€ๅพ‹่ฃ…ๆฝขๅ“ๆ‰“xๆŠ˜๏ผŒไธ€ๅพ‹่ฝฆๅž‹ๅฝ“ๆœˆๆœ‰่ดง๏ผŒๆ›ดๆœ‰ๅฅฝ็คผไธๆ–ญ๏ผŒๆƒŠๅ–œ...\n1000 Processed\n classify content\n39500 0 5ๅ”‡่‰ฒ็ช็„ถๅ˜็ดซๆˆ–ๅ˜้ป‘่ฏดๆ˜Žๆทค่ก€ๆ”ปๅฟƒ\n39501 0 ไฝ ่ฏด่…พ่ฎฏไธชๅ‚ปๅ‰ไธบๅ•ฅ่ฆๆœ‰ไธช่ฎฟ้—ฎ่ฎฐๅฝ•โ€ฆโ€ฆไฝ†ๆ˜ฏ่ฟ˜ๆ˜ฏๅฌไฝ ็š„\n39502 0 
ไธ่ฟ‡็œŸ็›ธๆ‰ๆ˜ฏ็œŸๆญฃ็š„\"ๆœ‰ๆƒ…ไบบ็ปˆๆˆ็œทๅฑž\"ๅ•Š\n39503 0 2015ๅนด7ๆœˆ18ๆ—ฅCA1541ๅŽŸๅฎšไธ‹ๅˆ14\n39504 1 ๏ผŒ่กฃๆŸœๆŸœไฝ“ๅฎžๆœจๅคšๅฑ‚ๆฟ้™้‡ๅทฅๅŽ‚ไปทๆŠข่ดญ๏ผŒๅ‰ไบŒๅๅ่ฟ˜ไบซๆœ‰้‡‘ๅกไผšๅ‘˜ๆŠ˜ไธŠๆŠ˜็š„็‰นๆƒ ๏ผŒ้žๅธธ้žๅธธ็š„ๅฎžๆƒ ๏ผŒๅฎถ้‡Œ...\n1000 Processed\n classify content\n40000 0 91โ‚ฌไธŠ้ฃžๆœบๅ‰่ฟ™ไธชไปทๆ ผๅชๅคŸ1ไธช้•ฟๆฃไธ‰ๆ˜Žๆฒป็‰ฉไปทไฝŽไบบไธๅคšๅฏๆƒœๆˆ‘ๆ˜ฏไธๅฟซไน็š„ๅนด่ฝปไบบ\n40001 0 xxxxxๅฐฑ่ฟ™ไนˆๆฌบ้ช—ๆถˆ่ดน่€…ๅ—\n40002 1 ๅ–œ่ฟŽ x x ๅฆ‡ๅฅณ่Š‚ ๆฐด่ƒญ่„‚็พŽๅฎน็พŽไฝ“ๅ…ณ็ˆฑๅฅณๆ€งๅŒ่ƒžไปฌ๏ผŒ็‰นๆŽจๅ‡บไธบๆœŸxๅคฉ็š„ๆดปๅŠจxใ€ๆ‰€ๆœ‰ๆŠค็†้กน็›ฎ...\n40003 0 ๅขจ่ฅฟๅ“ฅๅŸŽๅœจๅ…จๅธ‚21ไธชๅœฐ้“็ซ™ๅปบ็ซ‹ไบ†30ไธชๅฅๅบท็ซ™\n40004 0 ๆฏๅคฉๅ–4ๆฏ่Œถๆฏ”ๅ–8ๆฏ็™ฝๅผ€ๆฐดๆ›ดๆœ‰็›ŠไบŽ่บซไฝ“ๅฅๅบท\n1000 Processed\n classify content\n40500 0 ๅ…จๅ›ฝ้ฆ–ๅฎถ็œŸๆญฃๆ„ไน‰ไธŠ็š„360โ„ƒๅ…จๅผ€ๆ”พๅผๅŽจๆˆฟ่ฟ่ฅๆจกๅผ็š„ๅ€กๅฏผ่€…\n40501 0 Ialwayshavetๅฐ†้€ๆธๅ…ณ้—ญ่‡ชๅฎถ็š„้Ÿณไน\n40502 0 ๅฐๆ—ถไปฃ4๏ผš็ต้ญ‚ๅฐฝๅคดโ˜…โ˜…โ˜…่ฟ˜่กŒ\n40503 0 ๅ”ไปฃไธญๅคฎๆ”ฟๅบœ่ฎพๅŒ—ๅบญใ€ๅฎ‰่ฅฟ็ญ‰้ƒฝๆŠคๅบœ\n40504 0 com้˜…่ฏปๅ…จๆ–‡่ฏทๆˆณๅณ่พน\n1000 Processed\n classify content\n41000 1 ไบฒ็ˆฑ็š„ๅงๅฆนไปฌ๏ผšxๆœˆไปฝๆ–ฐๅนดๅผ€็ซฏ๏ผŒ็Žซ็ณๅ‡ฏไบงๅ“้€ๅคšๅคš๏ผ่ดญไนฐไบงๅ“ๆฏๆปกxxxxๅ…ƒ่Žท่ต ไปทๅ€ผxxxๅ…ƒไบงๅ“๏ผŒ...\n41001 0 comLogo็›ฎๅฝ•ๅ…ฌๅธ็ฎ€ไป‹็งŸ่ตๆœๅŠก็งŸ่ตๅˆ†็ฑปๆˆๅŠŸๆกˆไพ‹ๅˆไฝœ่ฏฆๆƒ…ๅ…ฌๅธ็ฎ€ไป‹็งŸ่ตๆœๅŠก็งŸ่ตๅˆ†็ฑปๆˆๅŠŸๆกˆไพ‹ๅˆ...\n41002 0 ไธ€ๅนดๅ†…็š„็–ค็—•่ƒฝๅŽปๆމ90%ไธคๅนดๅ†…็š„็–ค็—•่ƒฝๆœ‰ๆ•ˆๆทกๅŒ–\n41003 0 ไธ่ฎค็œŸๅธๆณ•่ฏ‰่ฎผ้‰ดๅฎšๅฎข่ง‚ๅค„็†\n41004 0 ไธ‡ๅŽๆ˜ฏEMCๅ’ŒSAP็š„ๅ…ฑๅŒๅฎขๆˆทๅพˆๆœ‰ไปฃ่กจๆ€ง\n1000 Processed\n classify content\n41500 0 ไฝ†ๆ˜ฏๅˆทๅฅฝไปฅๅŽๆ‰‹ๆœบๅ†…็ฝฎๅญ˜ๅ‚จไธ่ƒฝ่ฏ†ๅˆซ\n41501 0 ๆฎWindowsInsiders่ดŸ่ดฃไบบGabeAu\n41502 0 ๅ› ไธบๅฝ“ๅนดๅปบ็ญ‘็š„่ฟ‡็จ‹ไธญๅ‘็”Ÿไบ†ๅ€พๆ–œ\n41503 1 ไปŠๅคฉๆŽจๅ‡บ้™้‡็‰ˆ็†่ดขไบงๅ“๏ผŒๅˆฉ็އx.x%๏ผŒๆŠ•่ต„ๆœŸxxๅคฉ๏ผˆx.xx~x.xx๏ผ‰๏ผŒๅ…ˆๅˆฐๅ…ˆๅฎšใ€‚\n41504 0 ไธŠๆตท60ๅนดไธ้‡็š„ๅฐ้ฃŽ่ฎฉๆˆ‘็ป™่ตถไธŠไบ†\n1000 Processed\n classify content\n42000 0 ๆญคๆฌกๆŽจๅ‡บ็บฟ่ทฏไปทไฝๅ‡ๅœจ20ๅ…ƒไปฅๅ†…\n42001 0 
ไฝ†ๆ˜ฏ80๏ผ…ไปฅไธŠ็š„็›ธไผผๅบฆๆ˜ฏ็ต•ๅฐๆฒ’้—ฎ้ข˜็š„\n42002 0 ๅ‰ๅŽไธๅฏน็งฐ็š„่ฎพ่ฎกๅˆไธชๆ€งๅ่ถณ\n42003 0 0255โ€ฐไธ€ๅพ‹่ฐƒๆ•ดไธบๆŒ‰็…งๆˆไบค้‡‘้ข0\n42004 0 ๅŽไธบไฝ“้ชŒไธญๅฟƒ้‡Œๅ„็ง็•Œ้ข็š„่ฏญ่จ€ๆ˜ฏ่‹ฑๆ–‡็š„\n1000 Processed\n classify content\n42500 0 ใ€€ใ€€ๅœจๆพณๅคงๅˆฉไบšG20ๅณฐไผš็š„ๆ–ฐ้—ปๅ‘ๅธƒไผšไธŠ\n42501 0 ๆ”ถๅˆฐไธๅฐ‘็ƒญๅฟƒ่ฏป่€…ๆถ‚้ธฆ่ฎพ่ฎก็š„ๅฐ้ขไฝœๅ“\n42502 0 ๅ…ถๅฎžๆˆ‘ๅฐฑๆ˜ฏๆƒณๅŽปๅ—ไบฌๅ•ฆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n42503 0 ๆŠŠ่€ๅญ็š„ๆ‹›่˜็”ต่ฏๆ‹ฆๆˆชไบ†ๅงๆงฝ\n42504 1 ็บขๅฆ†ๅ‡่‚ฅ็˜ฆ่บซๆญฃๆœˆไบŒๅๆญฃๅผๅผ€้—จ๏ผŒไธบๅ›ž้ฆˆ้กพๅฎข็‰นไธพๅŠžๅคงๅž‹ไผ˜ๆƒ ๆดปๅŠจ๏ผŒ๏ผˆxxxๅ…ƒxxๆฌก๏ผŒๅธฆไธ€ๅ้กพๅฎขๅŠ x...\n1000 Processed\n classify content\n43000 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅˆšๆ‰“็”ต่ฏ็ป™ไฝ ็š„ๆท่ถŠไฟก็”จ่ดทๆฌพๅ…ฌๅธ็š„ๅฎขๆˆทๅฐ้กพ๏ผŒๆˆ‘ไปฌๅ…ฌๅธ่ดทๆฌพๆ— ๆŠตๆŠผ๏ผŒๆ— ๆ‹…ไฟ๏ผŒ่ฟ™ๆ˜ฏๆˆ‘็š„ๅท...\n43001 0 ไธๆณ•ๅˆ†ๅญๅธธไผš็›—็ชƒ็†Ÿ็ก็š„ๆ—…ๅฎข\n43002 0 ๆˆ‘็Žฐๅœจๆฏๅคฉๅฐฑๆ˜ฏ็Žฉไธช็”ต่„‘ๆ‰“ไธช็”ต่ฏ\n43003 0 ไปŠๆ—ฉ้€€็ƒง็›ฎๅ‰ๆฒกๆœ‰็”จไปปไฝ•่ฏ็‰ฉๅญฉๅญ็ฒพ็ฅž็Šถๆ€่‰ฏๅฅฝ่ฟ›้ฃŸๆญฃๅธธไปŠๅคฉๅŽปๅŒป้™ข้ชŒ่ก€\n43004 0 5ไธชไบบๆ‰“ไธ€ไธช้ฃžๆœบๆ‰“ไธๆญป==\n1000 Processed\n classify content\n43500 0 ่ฆๆฑ‚ๅ„ๅ…ฌๅธๆŒ‰่ˆช็ญ้‡็š„25%่ฟ›่กŒ่ฐƒๅ‡\n43501 0 ๆœŸ่ดง่ˆตๆ‰‹ๅˆ็›˜ๅˆ†ๆž๏ผš็Žป็’ƒๅ›ž่ฝๅœจ850ๅ‡†ๅค‡ๅšๅคšๆ“ไฝœ\n43502 0 ๆˆ‘่ฆ่€ƒ่™‘่‹นๆžœๅŽไธบ้ญ…ๆ—ๅŠชๆฏ”ไบš้…ทๆดพไน่ง†ๅคง็ฅžoppovivo\n43503 0 ๅŽŸๆœ‰็š„ๅฐๅปบๅˆถๅบฆ็”ฑไบŽ่…่ดฅๅ’Œไธๅพ—ไบบๅฟƒ่€Œๅดฉๆบƒ\n43504 0 ไธŠ่ฏๅฐ†ๅœจxxไธชๆœˆๅ†…ๅ‡่‡ณxxxx็‚น\n1000 Processed\n classify content\n44000 0 ้ข„่ฎกๆœชๆฅxๅฐๆ—ถๅ†…ๅธธๅทžใ€้‡‘ๅ›ใ€ๆบง้˜ณๅคง้ƒจๅˆ†ๅœฐๅŒบๅฏ่ƒฝไผšๅ‡บ็Žฐ้›ท็”ตใ€้›ท้›จๅคง้ฃŽใ€็Ÿญๆ—ถๅผบ้™ๆฐด็ญ‰ๅผบๅฏนๆตๅคฉๆฐ”\n44001 0 ๅฏ่ƒฝๅพˆๅคšไบบ้ƒฝไผšๆƒณๅˆฐ้‡‘่žๅ’Œ็”ตๅ•†\n44002 0 ๅพฎ่ฝฏ๏ผš้ซ˜้€šๆˆ‘ๅคช้˜ณไฝ ๆฏไธŠๅคงไบบๆ•ดๅคฉๅชๆƒณๅ‘ๆˆ‘้’ฑ\n44003 0 KTVๆ˜ฏไธ€ไธชๅ‡ ไนŽๆฒกๆœ‰ๆ ธๅฟƒ็ซžไบ‰ๅŠ›็š„ไบงไธš\n44004 0 BaitollahAbbaspour็š„ๆฒป็–—่ดน็”จ็ปˆไบŽๆœ‰็œ‰็›ฎไบ†\n1000 Processed\n classify content\n44500 0 7ๆœˆ10ๆ—ฅ่‡ณ15ๆ—ฅ้•ฟๆตทๅŽฟๆ—…ๆธธๅฑ€ๅœจๅคง่ฟž็ฒพ้€‰ๆ ธๅฟƒ็คพๅŒบไธพๅŠžไบ†ๅ››ๅœบๅคงๅž‹ๆŽจไป‹ๆดปๅŠจ\n44501 0 ไธฅๆ ผ็š„้‡‘่ž้ฃŽๆŽงๆ 
‡ๅ‡†ๆ‰ง่กŒๅฐฑOK\n44502 0 ๅ‚ๅŠ 2015ๅนดๆด›ๆ‰็Ÿถไธ–็•Œ็‰นๆฎŠๅฅฅๆž—ๅŒนๅ…‹่ฟๅŠจไผš็š„ๅ„ไปฃ่กจๅ›ข้™†็ปญๆŠต่พพๆด›ๆ‰็Ÿถ\n44503 0 ๆˆ‘ไธๆ›พ่ดจ็–‘่‡ชๅทฑๅš็š„ๆ˜ฏๅฆๆญฃ็กฎ\n44504 0 ๅฆ‚ไปŠIPOๆณจๅ†Œๅˆถ็š„ๆ”พๅผ€็ป™ๅŸŽๅ•†่กŒไธŠๅธ‚ๅธฆๆฅไบ†ๆ–ฐ็š„ๆœบไผš\n1000 Processed\n classify content\n45000 0 ๅฅณไธป็ฌฌไธ€ๆฌก่ขซ้’‰17้ข—้”€้ญ‚้’‰+่ขซๅˆบ101ๅ‰‘+่ขซ็ปๆƒ…ๆฑ ๆฐดๆฏๅฎนๆฏๅฃฐ+ๆตๆ”พ่›ฎ่’ๆ—ถ\n45001 0 TA่ฏด็š„่ฟ™ไธค็ฑปไบบ็ฉถ็ซŸ่ฏฅๅฆ‚ไฝ•ๅŒบๅˆ†\n45002 1 ๅนณๅฎ‰ๆ˜“่ดท๏ผŒ้’ˆๅฏน่ฝฆไธป๏ผŒๆˆฟไธป๏ผŒๆณ•ไบบ๏ผŒไธŠ็ญๆ—๏ผŒๅฏฟ้™ฉๆŠ•ไฟไบบๆไพ›ๆœ€้ซ˜xxไธ‡ไฟก็”จ่ดทๆฌพ๏ผŒๆœˆๆฏx.x%๏ผŒ่ฏฆๆƒ…...\n45003 0 S96ๅฎฟ่ฟๆ”ฏ็บฟๅŒๅ‘ๅœจๅฎฟ่ฟๅค„้™้€Ÿใ€้™่ฝฆๅ–ๆถˆ\n45004 0 ่ก—้“ๅŸŽ็ฎก็ง‘ใ€ๆ‰งๆณ•ไธญ้˜Ÿ็ป„็ป‡5ไบบ\n1000 Processed\n classify content\n45500 0 ็ปผๅˆๆ‰งๆณ•ๅคง้˜Ÿใ€ๅŸŽ็ฎกใ€ๅทฅๅ•†ใ€้ฃŸๅฎ‰ๅŠž็ญ‰ๆ‰งๆณ•ไบบๅ‘˜ไปŽๅญฆ้™ข่ก—ไธŽๅปบ่ฎพ่ทฏไบคๅ‰ๅฃๅ‘ไธœ่ฟ›่กŒๆธ…็†\n45501 0 ๅ‘จไธ€ๅฐๆนพๅฐ†ๆถ‰ๅซŒๅ•†ไธšๆฌบ่ฏˆๅ†…ๅœฐ่€ๆฟๅถๅฝฆ่ฃไธ€ๅฎถ้ฃ้€ๅ›žๅคง้™†\n45502 0 ไปŠๅนด10ๅทๅฐ้ฃŽโ€œ่Žฒ่Šฑโ€ไบŽไปŠๅคฉ12ๆ—ถ15ๅˆ†็™ป้™†ๅนฟไธœ้™†ไธฐ\n45503 0 ๆˆ‘ๆžœๆ–ญๅ†ณๅฎšๆŠŠ่Šฑๅƒ้ชจๅŽๅŠ้ƒจๅˆ†็š„ๅฐ่ฏด็œ‹ไบ†\n45504 0 ่ฟ™ๆ˜ฏๅœจ็ฝฎไธš่€…ๅœจ่ดญๆˆฟ่€…ไน‹ๅ‰ๅฟ…้กป่ฆไบ†่งฃ็š„ไบ‹ๆƒ…็š„ไธค้กน้‡่ฆ่€ƒ่™‘ๅ› ็ด \n1000 Processed\n classify content\n46000 0 ้ƒญๆฒซ่‹ฅๅ…ˆ็”Ÿๆ›พไธบๆ— ้”กๅคชๆน–้ผ‹ๅคดๆธš่ฏด่ฟ‡่‘—ๅ็š„ไธ€ๅฅโ€œๅคชๆน–ไฝณ็ปๅค„\n46001 0 ๆœ€็ˆฑ็š„ๅฐฑๆ˜ฏๅ•†่ดธ่ก—ๅ’Œๆƒ…ไบบๆน–ไบ†โ€ฆ\n46002 0 ้ฆ–ๆ‰นๅฎ˜ๆ–น่ฎคๅฎš็š„โ€œๆฑŸ่‹่€ๅญ—ๅทโ€ไผไธšๆ€ปๅ…ฑ่พพๅˆฐ176ๅฎถ\n46003 0 ไธๆ˜ฏๆˆ‘่ฏดโ€ฆโ€ฆไธ‰ๆฌกๅŸŽ็ฎกไบ†myไฟฑๅˆฉ้…ฑ้ƒฝไธญไผค็œŸๅ‰‘ไบ†็ป™่ทŸ้นคๆฏ›ๆ‘ธๆ‘ธ่กŒไธ\n46004 0 ๆฒกๆœ‰้ฃž่ฝฆๅ…šๆฒกๆœ‰ๅฐๅทๆฒกๆœ‰ๅๅŽไธ€่ทฏๅ‘ๅ—ๅด่ถŠ่ตฐ่ถŠ้šพ็ ด่ฎฐๆ€งๅฐฑๅทฎๆฒกๆŠŠ่‡ชๅทฑ็š„ๅคดๅผ„ไธขไบ†ๅฅฝไธๅฎนๆ˜“ไธŠไบ†่ฝฆๅๅๆผซ...\n1000 Processed\n classify content\n46500 0 ๅพฎ่ฝฏ็š„OneDriveๆ—ถ้—ดๆ˜พ็คบไธๆ˜ฏๆœฌๅœฐๆ—ถ้—ด\n46501 0 ๅธธๅทžไน้พ™ๅธŒๆœ›ๅฐๅญฆ็š„ๅงš่Šณๅˆ˜ไปๅˆš็Ž‹ไธฝ\n46502 0 ็„ถๅŽไธƒ็‚นๅทฆๅณๅˆฐ็š„ๅฎฟ่ฑซๅŒบๅœฐ็จŽๅฑ€ไธ‹็š„่ฝฆ\n46503 1 ๅฎถ้•ฟๆ‚จๅฅฝ๏ผŒๆ–ฐๅนดๅฟซไน๏ผไธญ่€ƒๅณๅฐ†ๅˆฐๆฅ๏ผŒๅœจๆญค็ดงๅผ 
ๆ—ถๆœŸ๏ผŒ่ฟœๆ™ฏๆ•™่‚ฒ็‰นๅผ€่ฎพๅ…จๆ—ฅๅˆถ่ฏพ็จ‹๏ผŒไธบๆ‚จๅญฉๅญไธญ่€ƒไฟ้ฉพๆŠค...\n46504 0 ๆœ‰ไธ€ไธชๅฐๅทๆฏๅคฉ้ƒฝๆญฃๅคงๅ…‰ๆ˜Ž็š„ๅทๅฌๆˆ‘ไปฌ่ฏด่ฏ\n1000 Processed\n classify content\n47000 0 ๅช่งไป–ๆŠ˜ไบ†ไธ€ไธช็บธ้ฃžๆœบ่‡ช่จ€่‡ช่ฏญ็š„่ฏด๏ผšโ€œๅฐ้ฃžๆœบๅ•Šๅฐฑ็œ‹ไฝ ็š„ๅ•ฆ\n47001 1 ๆ–ฐๅนดๅฅฝ๏ผๆˆ‘ๆ˜ฏไธ“ไธšๅŠž็†้Š€่กŒไปฃๆญ€็š„ๆฑค่๏ผŒๆ‚จไน‹ๅ‰ๅŠž็†ไธไบ†ๆˆ–่ขซๆ‹’็š„ไปฃๆญ€้ƒฝๅฏๅŠž็†๏ผŒ็ซ‹็†„ไฝŽ่‡ณxๅŽ˜๏ผŒ้ขๅบฆ้ซ˜...\n47002 0 ๅฎŒๆ•ด็‰ˆ๏ผšๅฅฝๅฃฐ้Ÿณๅผ ็Žฎๅ†็ง€HIGHๆญŒๅฏŒไบŒไปฃ็™พไธ‡่ฑช่ฝฆๆŠŠๅฆนไบ’ๅŠจๆŠ•็ฅจ๏ผšไฝ ่ฎคไธบ่ฐๆ›ด้€‚ๅˆๅš็”ทๅ‹\n47003 0 ไธ็œ ไธไผ‘็š„็œ‹ๅฎŒไบ†่Šฑๅƒ้ชจๅฐ่ฏด\n47004 0 **ๅ’Œๅœฐ้“็ฎก็†ไบบๅ‘˜ๅฏนๅ…จ่Žซๆ–ฏ็ง‘็š„ๅœฐ้“็บฟๅฑ•ๅผ€ไบ†ไธ€ๅœบๅœฐๆฏฏๅผ็š„ๆœ็ดข\n1000 Processed\n classify content\n47500 0 7ๆœˆ9ๆ—ฅๆถˆๆฏๅพฎ่ฝฏๆ‰‹ๆœบไธšๅŠกๅคฑ่ดฅๅŽๅ†ณๅฎš่ฃๅ‘˜7800ไบบ\n47501 0 ๅฏนๅค–้˜่ฟฐโ€œไบ’่”็ฝ‘+โ€ๅ’Œๅˆ›ไธšๆœบ้‡ใ€่…พ่ฎฏๆˆ˜็•ฅ\n47502 0 ??????????????????????\n47503 0 ๅƒ้ฅญๆ—ถ็œ‹็”ต่ง†ๅœจๆผ”่ญฆๅฏŸ2013\n47504 0 ้ซ˜ไฟๆนฟ็ดง่‡ดๆฏ›ๅญ”้•‡้™่ˆ’็ผ“ๆๆ‹‰่ฅๅ…ป100%็บฏ็ซน็บค็ปด้ข่†œๆ— ่Œๅค„็†ๅŽๅธฆๆธฉๅบฆ่ฎกๆ”พๅ…ฅๅ†ฐ็ฎฑ10ๅˆ†้’Ÿๅทฆๅณๆธฉๅบฆ...\n1000 Processed\n classify content\n48000 0 ๅœจๅ›ฝๅฎถ้ฃŸๅ“่ฏๅ“็›‘็ฃ็ฎก็†ๆ€ปๅฑ€ๅ…จ\n48001 0 ๆ˜จๅคฉๅ‡ ไธชไบบๅ’จ่ฏขๅ…ณไบŽ็บค่…ฐๅŽป้™ค่ƒณ่†Š\n48002 0 ไนŸ่ฎธๅ› ไธบๅ›ฝๅ†…ๆฒกๆœ‰googleๆŸฅไธๅˆฐไฟกๆฏๆบๅง\n48003 0 2ๅฃฎ้˜ณไธ่ƒฝๅ‚ฌๅ‘ๆ€งๆฌฒ๏ผšๆ€งๆฌฒ่ทŸ็ฒพ็ฅž็Šถๆ€ไธŽ้›„ๆฟ€็ด ๆฐดๅนณๆœ‰ๅ…ณ\n48004 0 ๅด็ปˆไบŽๅ˜ๆˆๅพˆไน…ไปฅๅ‰ไธŠ็”ต่„‘่ฏพ\n1000 Processed\n classify content\n48500 0 ไฝ•ๅ†ตๆˆ‘ๅŠžๅก็š„็ญพๅๆ˜ฏNๅนดๅ‰็š„็ญพๅ\n48501 0 ๅฅฝๅฅ‡ๅฟƒๆšดๅผบ็š„ๆˆ‘ๅˆๆƒณ็Ÿฅ้“็œŸ็›ธ\n48502 0 ๆˆ‘ๆ‰‹ๆœบ้‡Œๆœ‰ไปŠๅคฉๅˆšๆ›ดๆ–ฐ็š„้‚ฃ้›†\n48503 0 ไบบไบบ้œ€็Ÿฅ็š„ไบ’่”็ฝ‘้‡‘่žไฟกๆฏๅฎ‰ๅ…จๅŸบ็ก€\n48504 0 ๆฒป็–—็–พ็—…็š„ๆ•ˆๆžœๆฏ”่พƒ่ฟ…้€Ÿๅ’Œๆ˜พ่‘—\n1000 Processed\n classify content\n49000 0 ๆ‰ไธ‹้ฃžๆœบๅ›žไบ†ๅฎถๅฐฑๆ”ถๅˆฐ็š„็”Ÿๆ—ฅๅŒ…่ฃน่ฟ™ไบบๆ˜ฏ่ฐไนŸๅคชไป–ๅฆˆๆ‡‚ๆˆ‘ไบ†ๅง็œ‹ๅˆฐๆฝฎๆฐด็ฎด่จ€็ณปๅˆ—ๅ’Œไพง้ข็š„Itsdrea...\n49001 0 โ€”โ€”่ฟ™ๆœŸๅฅฝๅฃฐ้Ÿณ็กฎๅฎžๆ˜ฏๆŠŠๆ‰€ๆœ‰็ฆปไนกๅˆซไบ•ๅœจๅค–ๆ‰“ๆ‹ผ็š„ๆธธๅญ็š„ๅฟƒ้ƒฝ่™ไบ†ไธ€้\n49002 0 
ไธŠๆตทไปๆตŽๅŒป้™ข่ก€็ฎกๅค–็ง‘ไธ“ๅฎถๆ•™ไฝ ่ฟœ็ฆป้™่„‰ๆ›ฒๅผ \n49003 0 ไปฅๅŠๆœ‰้…’ๅŽ้ฉพ้ฉถใ€่ถ…ๅ‘˜20%ใ€่ถ…้€Ÿ50%ใ€้ซ˜้€Ÿๅ…ฌ่ทฏ่ถ…้€Ÿ20%ๆˆ–12ไธชๆœˆๅ†…ๆœ‰ไธ‰ๆฌกไปฅไธŠ่ถ…้€Ÿ่ฟๆณ•่ฎฐๅฝ•...\n49004 0 5ๆ”ฏ็”ฑๆœฌ็ง‘็”Ÿ็ป„ๆˆ็š„้˜Ÿไผๅฐ†ๅˆ†่ตดไธœ้˜ณใ€ๅƒๅฒ›ๆน–ใ€ไฝ™ๅงšใ€ไธฝๆฐดๅ’Œ่กขๅทžๆไพ›ๅŒป็–—ๆœๅŠก\n1000 Processed\n classify content\n49500 0 ็œ‹ๅˆฐๅ่พ†่ญฆ่ฝฆ็ป่ฟ‡ๆ›ผๅ“ˆ้กฟไธญๅŸŽ\n49501 0 ๆฏไบฒ่‚็กฌๅŒ–็ˆถไบฒๆœ‰ๅ“ฎๅ–˜ๅœฐๅ€\n49502 0 ๅผŸๅผŸ็Žฐๅœจๅๅœจๅ›ž้’ๅฒ›็š„้ฃžๆœบไธŠ\n49503 0 ๆœฌ้—จไปŽ่ฟžไบ‘็‹ฎ้˜ๅคบๅพ—็‹ฌๅญคไนๅ‰‘ๆฎ‹็ซ ๅ››\n49504 0 ่‹นๆžœๅ’ŒIBMไป็„ถๆ˜ฏ็ง‘ๆŠ€็•Œๆœ€ๅฅ‡็‰น็š„็ป„ๅˆ\n1000 Processed\n classify content\n50000 0 ๆ”พๅคงๆƒ ๆ™ฎ็ญ‰ITๅคง้ณ„็š„ๅธฆๅŠจๆ•ˆๅบ”\n50001 0 ไบค่ญฆๆ้†’๏ผš้…’้ฉพไผšๆž„ๆˆไธ€่ˆฌไบค้€šๅคฑไฟก\n50002 0 ไบŽ6ๆœˆ30ๆ—ฅ20ๆ—ถๅœจ่ฅฟๅŒ—ๅคชๅนณๆด‹ไธŠ็”Ÿๆˆ\n50003 0 ไฝ“้‡xxๆ–ค่บซ้ซ˜xxcm็ฎ—ๆญฃๅธธไนˆ\n50004 1 ๆ–ฐๅนณๆ€€ๅพทไป่ฏๆˆฟไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚็บฆๆƒ ๅ•ฆ๏ผ ๆดปๅŠจๅฐ†ไบŽxๆœˆxๆ—ฅๅˆฐxๆœˆxxๆ—ฅ่ฟ›่กŒ๎ท๏ผๆดปๅŠจไธ€:ไนฐ่ต ๅคงไผ˜...\n1000 Processed\n classify content\n50500 0 ้™คไบ†ๆถ‰ๅŠITๆŠ€ๆœฏ็š„้ซ˜้ข‘ไบคๆ˜“ไปฅๅค–\n50501 1 ่ฏšๆ„ๅฎขๆˆทๆฑ‚่ดญๅ—ๅŒ—ไธ‰ๅฑ…ๅฎค\n50502 0 ๆˆ‘ๆญฃๅœจ็œ‹15ๅฒๅฅณๅญฉ้ญๆŠฅๅคๆ€ง่ฝฎ*\n50503 0 ไธ€ไธชๅŒป้™ขไธŠ็ญ็š„ๆŒฃ้‚ฃไฟฉ่‡ญ้’ฑๅฐฑไธ็Ÿฅๅคฉ้ซ˜ๅœฐๅŽšไบ†\n50504 0 ๅœจๆน–ๅ—ๅซ่ง†ๅ’ŒๆฑŸ่‹ๅซ่ง†ๅทฒ็ป็œ‹ๅˆฐๅ–ฝ\n1000 Processed\n classify content\n51000 0 xx%็พŠ็ป’+xx%็พŠๆฏ›+xx%็œŸไธ\n51001 0 ๅ“ฆไฝ ไป–ๅฆˆ็š„่ขซๅผบๅฅธไบ†่ขซไบบๆ‰“ๆญปไบ†ไฝ ๆ‰ๆดป่ฏฅๅ‘ขๅ› ไธบไฝ ไธŠ่พˆๅญๅฐฑๆ˜ฏไธชๅ‚ป้€ผ\n51002 0 Malloryzๆ˜ฏไธชๅ’Œๆˆ‘ๅŒๅนด็š„ๆฏ•ไธš็”Ÿ\n51003 0 ๅœจๆœบๅœบ็œ‹ๅˆฐๅŽไธบ็š„ๅนฟๅ‘Šๅ›พ่ง‰ๅพ—่›ฎ็ฌฆๅˆ่ฟ™ๅฅ่ฏ\n51004 0 ่ขซ้ข‘้ข‘ๅ‘็”Ÿ็›—็ชƒๆกˆไปถๆ…็š„ไธๅพ—ๅฎ‰ๅฎ\n1000 Processed\n classify content\n51500 1 ๆŒฏๅ…ด็™พ่ดง็ฅ๏ผš็พŽไธฝ๏ผŒๅฅฝ่ฟ๏ผŒๅฟซไน๏ผŒๅ–„่‰ฏ๏ผŒ็ƒญๆƒ…๏ผŒๅฅๅบท็š„ๆ‚จไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅฟซไน๏ผŒๆŠŠๆœ€็พŽ็š„็ฅ็ฆ้€็ป™ๆ‚จ๏ผxๆœˆx...\n51501 1 ใ€่€ๅฎขๆˆท็ฆๅˆฉๆฅไบ†ใ€‘ไบฒ๏ผŒๅ…ƒๅฎต่Š‚ๅฟซไน๏ผใ€ๆท˜ๅฎๅบ—๏ผš็ˆฑ็พŽๅฎ่ดๅฉด็ซฅ้ฆ†ใ€‘้€ๆ‚จxxๅ…ƒๅ…ƒๅฎตๅˆธ๏ผˆๆ˜ฅๆฌพๆ— ้—จๆง›ไฝฟ็”จ...\n51502 0 ๅ‹ไบบ็›ธ่ต ็š„ไธ€ๅชๆฅ่‡ชๆ— 
้”ก็š„่ถ…ๅคงๆฐด่œœๆกƒ\n51503 0 ๆณ•ๅช’็งฐ้ฆ™ๆธฏๅ•†้“บ็งŸ้‡‘ๅคช่ดตLVๅ’ŒBurberry้ƒฝๅƒไธๆถˆโ€“ๅๅบ—็ฝ‘\n51504 0 Bravo~ไธ€ไธŠๅˆๅฌๅฎŒๅ›ฝ้™…็งๆณ•\n1000 Processed\n classify content\n52000 1 ๏ฝžxไธ‡่ฃ…ไฟฎ่ดน็”จxใ€็Žฐๅœบไธคๅƒๅนณ็ฑณๆๆ–™ๅฑ•ๅŽ…ๅ’Œไธ‰็งไธๅŒ้ฃŽๆ ผ็š„ๅฎžๆ™ฏๆ ทๆฟๆˆฟ็ญ‰ๆ‚จๆฅๅ‚่ง‚xใ€ๆฅ่ฎฟๅณๆœ‰็คผๅ“ไธ€...\n52001 0 ๆ˜Žๅคฉๆ™šไธŠๅ…ซ็‚นๅˆฐไน็‚นๆŠข็บขๅŒ…\n52002 1 ้‘ซไธฐๅ›ฝ้™…ๅฎถๅฑ…๏ผŒxๆœˆxxๅทxxๅทไธพๅŠž่ฏšไฟกx.xxๆ„Ÿๆฉๅ›žๆŠฅๆดปๅŠจ๏ผŒๅˆฐๅบ—ๆœ‰็คผๆดปๅŠจ๏ผŒๅ†…ๅฎน๏ผ›x:ๆœ‰้˜ถๆขฏ้€...\n52003 0 Saigon็š„้—นๅธ‚ๅŒบๅฐฑๆ˜ฏ็ฌฌไธ€้ƒก\n52004 0 ๆ•ฌ่ฏท่ฐ…่งฃ๏ผš่ฝฌๅ…จๆ–ฐBillinghamHadleyDigital่‹ฑๅ›ฝๅŽŸ่ฃ…\n1000 Processed\n classify content\n52500 1 ๆฑ‚่Œ็ƒง่œ๏ผŒๆœฌๅธฎ่œๅ†œๅฎถ่œ๏ผŒๅทฅ่ต„ๅ››ๅƒๅˆฐๅ››ๅƒไบ”๏ผŒๅ› ๆ‰‹็”Ÿ๏ผŒไธ่ฏ•่œ๏ผŒ่ฆๆœ‰ไผ‘ๆฏ๏ผŒๆœ‰้œ€่ฆ็š„่ฏท่”็ณปๆˆ‘xxxx...\n52501 0 ่ขซๅ‘Š็š„็ˆถๆฏๆฅ็พŽๅŽ่ดฟ่ต‚ๅ—ๅฎณไบบๅ’Œ่ฏไบบ\n52502 0 ๆ˜ฏๅฏนไบš้ฉฌ้€Šไบ‘่ฎก็ฎ—ๅนณๅฐๆ„Ÿๅ…ด่ถฃ็š„็›ธๅ…ณๅŒไป่Žทๅ–ไบš้ฉฌ้€Šไบ‘่ฎก็ฎ—ๅนณๅฐ่ต„่ฎฏ็š„้‡่ฆๆฅๆบ\n52503 0 ๅฅฝๅฃฐ้Ÿณ่ขซๆฑ‚ๅฉš็š„ๅฅณ็š„็ฌ‘่ตทๆฅๆ€Žไนˆๅƒๆ•ดๅฎนๅคฑ่ดฅไบ†\n52504 1 ไธ‡ๅˆฉ้ฉพๆ กxxๅฅณไบบ่Š‚ไผ˜ๆƒ ๆดปๅŠจ่ฟ›่กŒไธญใ€‚xๆœˆxๆ—ฅ๏ฝžxxๆ—ฅๅฅณๅฃซๆŠฅๅxxxxๅ…ƒ๏ผŒ็”ทๅฃซๆŠฅๅxxxxๅ…ƒ๏ผŒไธ€...\n1000 Processed\n classify content\n53000 0 ็‰นๅˆซๅœจ่ฟ›่กŒๆ–ฐๆˆฟ่ฃ…ไฟฎ็š„ๆ—ถๅ€™ๅฐฝ้‡ไผš้€‰็”จ็Žฏไฟๆๆ–™\n53001 0 FIBAๅฎ˜็ฝ‘ไธ“ๆ ไฝœๅฎถEnzoFlojo่ฏ„่ฎบไบ†2015ๅนด้•ฟๆฒ™ไบš้”ฆ่ต›ไธŠๆœ‰ๆฝœๅŠ›็ˆ†ๅ‘็š„ๆ–ฐๆ˜Ÿ\n53002 0 ไธŽไบš้ฉฌ้€Š็พŽๅ›ฝ็›ด้‚ฎไธญๅ›ฝ็š„ๅ“็ฑปๅฎž็ŽฐๅฎŒ็พŽๅฏนๆŽฅ\n53003 0 ๅ—ไบฌๆˆฟๅœฐไบงๅธ‚ๅœบ่กจ็Žฐ้ข†่ท‘ๅŒ็ฑปๅŸŽๅธ‚ไธๆ— ้“็†\n53004 0 sanE็ป™ๆˆ‘ๆ„Ÿ่ง‰ๆœ‰็‚นๅ•†ไธšๅŒ–ๅ•ฆ\n1000 Processed\n classify content\n53500 0 ่ถณไธๅ‡บๆˆท+Q844930494่€ƒ่™‘ไธ€ๅƒๆฌกไธๅฆ‚ๅŽปๅšไธ€ๆฌก\n53501 0 ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅœจไนฐ็”ต่„‘็š„ๅœฐๆ–น็œ‹่งๅ—ๅกๅฅถไบ†ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n53502 0 โ€œxยทxxๅฅณๅฐธๆกˆโ€็š„ไธ“ๆกˆ็ป„้‡่ฆๆˆๅ‘˜ๅ†ฏๅฟ—ๆ˜Ž\n53503 0 ๅ…จๅ›ฝxๅคฉๅ‘็”Ÿx่ตทโ€œ็”ตๆขฏๅ’ฌไบบโ€ไบ‹ๆ•…\n53504 0 ่€ŒไธๅŒ…ๆ‹ฌ้คๅŽ…่ฃ…ๆฝขใ€ๆœๅŠกใ€่ฎพๆ–ฝ็ญ‰\n1000 Processed\n classify content\n54000 0 ๅ’Œ้›†ๅ›ขๆ€ป่ฃ็›ธๅค„็š„ๆ—ฅๅญๆ˜ฏๆœ€ๆœ‰ๆธฉๆš–ๆ„Ÿ็š„\n54001 0 
ๆœ€็ปˆๆณ•้™ขๅˆคๅ†ณๅŽ‚ๅฎถ่ต”ๅฟ่ฏฏๅทฅ่ดน500ๅ…ƒ\n54002 1 ไนไธŠๅ้ƒฝ่ฃ…้ฅฐxxx้›ถๅˆฉๆถฆx้‡ๅคง็คผ๏ผˆๅฎถๅ…ทใ€ๅฎถ็”ต็ญ‰๏ผ‰็–ฏ็‹‚โ€œๆŠขโ€ๅฎšๆƒ ใ€‚ๆ—ถ้—ดxๆœˆxๆ—ฅ--xๆœˆxxๆ—ฅใ€‚...\n54003 1 ๅบ—้“บไปปๆ„ไบงๅ“ๆปกxxxๅ…ƒๅณไบซx.xๆŠ˜ไผ˜ๆƒ ๅ“ฆ๏ผŒๆฌข่ฟŽๆ‚จๅˆฐๅบ—้€‰่ดญ^_^๏ผโ€”็‹็‹ธ็š„ๅฎ่—๏ผ่”็ณปไบบๅŠ็”ต่ฏ๏ผš...\n54004 1 ๆไพ›ๆ— ๆŠตๆŠผๆ— ๆ‹…ไฟไฟก็”จ่ดท ่งฃๅ†ณ่ต„้‡‘้œ€ๆฑ‚ ่ฝฆ่ดท ๆˆฟ่ดท ๆ— ๆˆฟๆ— ่ฝฆไฟก็”จ่ดท ๆ‰‹็ปญ็ฎ€ไพฟๅฟซๆท xๅคฉไธ‹ๆฌพ ้œ€...\n1000 Processed\n classify content\n54500 0 ๅปบ่ฎฎๅšๆŒ่กฅๅ……้…ต็ด ๆฅๅธฎๅŠฉ่บซไฝ“ๅ‡่กก\n54501 0 ๅ…จ้ขๆŽจๅŠจๅŒ—ๆ–—ๅซๆ˜Ÿๅฏผ่ˆชๅบ”็”จไบงไธšๅ‘ๅฑ•\n54502 0 ไพ‹ๅฆ‚2ไบฟ็พŽๅ…ƒๆ”ถ่ดญ็ฝ‘็ปœๅฎ‰ๅ…จๅ…ฌๅธAorato็ญ‰\n54503 0 ๆœ‰็ฝ‘ๅ‹ๅ›žๅธ–่ดจ็–‘โ€œไฝ ่‚ฏๅฎšๆ˜ฏไป–็š„ๅŒไผ™\n54504 1 ๆˆ‘่ฟ™่พนๆ˜ฏๅŠž็†ๆ— ๆŠตๆŠผไฟก็”จ่ดทๆฌพ็š„ๅฐๆจใ€‚ๅฆ‚ๆžœๆœ‰้œ€่ฆ็š„่ฏๅฏไปฅ่ทŸๆˆ‘่”็ณป\n1000 Processed\n classify content\n55000 1 ๅผ€ๆˆท่กŒ๏ผŒไธญๅ›ฝๅปบ่ฎพ้“ถ่กŒ ๅกๅท:xxxx xxxx xxxx xxxx xxx.ๅผ ็‡•\n55001 0 Twitterๅธ‚ๅ€ผ็บฆ็ป†่ฏญ็š„ๅฐ็Œซๅ’ช\n55002 0 ๅซ็ปด็”Ÿ็ด Cใ€็บค็ปด่ƒฝไฟƒ่ฟ›่‚ ่ƒƒ่ •ๅŠจ\n55003 0 ่ฟ›็š„ไธ‡็ง‘ๅ’Œไธญ้›†้›†ๅ›ขไธคๅช่‚ก็ฅจ\n55004 0 ๅพˆๆ—ฉ็กๅˆฐๅ‡Œๆ™จๅฐฑ้†’ไธ‹ๆฅผๅผ„ไธคๅ—้ขๅŒ…ๅŠ ็‚น็‚ผๅฅถๅ–็“ถ็›ŠๅŠ›ๅคšไธŠๅบŠ็ฟปไธ€็ฟปๆ‰‹ๆœบ็›ธ็‰‡ๆ‰“ไธชๅ“ˆๆฌ \n1000 Processed\n classify content\n55500 0 ๆ‰‹ๆœบๆขๅคๅ‡บๅŽ‚่ฎพ็ฝฎไบ†่ฟ˜ๆœ‰ๅฏ่ƒฝๆขๅค้‡Œ้ข็š„็Ÿญไฟกๅ—\n55501 0 ๅฅฝ่ฒ้Ÿณ็š„่ˆžๅฐๆไพ›ๅพˆๅคšๆ‡ทๆฃ้Ÿณๆจ‚ๅคขๆƒณ็š„ไบบไพ†ๅˆฐ้€™่ฃก้€™ๅ€‹่ˆžๅฐๆไพ›ไป–ๅ€‘ๆฉŸๆœƒๅณไพฟๆฒ’ๆœ‰ๅฐŽๅธซไบฎ็‡ˆไฝ†้€™ไปฝๆ‡ทๆฃๅคข...\n55502 0 ๅœจ็œ‹ๆธ…็”Ÿๆดป็š„็œŸ็›ธไน‹ๅŽ่ฟ˜ไพๆ—ง็ƒญ็ˆฑ็”Ÿๆดป\n55503 0 ไนๆ˜Œๅธ‚ๆณ•้™ขไธ€ๅฎกๅˆคๅ†ณ้ฉณๅ›žไบ†้™†ๆŸ็š„่ฏ‰่ฎผ่ฏทๆฑ‚\n55504 0 โˆžโ•ฌโ•ฌไนๅฑฑ็”ตๅŠ›ๅคๆ—ฆๅคๅŽๆฑŸ่‹่ˆœๅคฉๅทฅๅคง้ซ˜ๆ–ฐ*STไปชๅŒ–็ฒพ่พพ่‚กไปฝ็ฅžๅฅ‡ๅˆถ่ฏไธญๅคฎๅ•†ๅœบ้ฉฌๅบ”้พ™\n1000 Processed\n classify content\n56000 0 ่ฟ™ไธ‰ไธชไบบๆ˜ฏ้€š่ฟ‡ไป€ไนˆๆ–นๅผ่ฏๆ˜Ž่‡ชๅทฑ\n56001 0 ็”ฑๆทฎๅฎ‰ๅธ‚่ˆช้“็ฎก็†ๅค„ๅ…ญ็‚นไบ”ๅๅ‘ๅธƒ็š„่ˆน้—ธ้€š่ˆชๅ’Œๅพ…้—ธไฟกๆฏ\n56002 0 ๅ‹‡่€…ๅคงๅ†ฒๅ…ณๆ˜ฏๆฑŸ่‹ๅซ่ง†็š„ไธ€ๆกฃๆฐดไธŠๅ†ฒๅ…ณ่Š‚็›ฎ\n56003 0 
ไธŽๅคง็›˜่ดŸ็›ธๅ…ณ็š„่ฏๅˆธๅธ‚ๅœบ้—ด้•ฟๆœŸๅ›ฝๅ€บ๏ผšๅนณๅผ€\n56004 0 ๅ’Œๆ‰ฟๆŽฅ็€ๆฑŸ่‹50%ไปฅไธŠๅ‡บๅ›ฝ้‡‘่žไธšๅŠก้‡็š„้พ™ๅคดๅˆไฝœๅฅฝ่ฃๅนธ\n1000 Processed\n classify content\n56500 0 ่ƒฝ่ฏดๆธ…ๆฅšๅ››ๅนดๅœจ้˜ฟ้‡Œๅทดๅทดๅšไป€ไนˆไบ†ๅ—\n56501 0 ไธญๅ›ฝ้ฆ–ๆฌพไธค่ฝฎ็”ตๅŠจๆฑฝ่ฝฆๅฎŒๆˆA่ฝฎ1000ไธ‡็พŽๅ…ƒ่ž่ต„\n56502 0 ๅฑฑ่ฅฟๆณ•้™ข้™ข้•ฟ่ฐƒ็ ”็ฃๅฏผๆถ‰่ฏ‰ไฟก่ฎฟไธ“้กนๆฒป็†ๅทฅไฝœ\n56503 0 ๅฌ่ฏดๅญฆๆ กๆœ‰600ๅ…ƒ/ๆœˆ/ๅบŠไฝ็š„็ฉบ่ฐƒๆˆฟ\n56504 1 ่‡ณไบฒ็ˆฑ็š„้กพๅฎข.ๅ…ƒๅฎตไฝณ่Š‚.ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚xxๅนด็ง‹ๅ†ฌ่ดงๅ“ๆœฌๅบ—ๅญฃๆœชๅคงๆธ…่ดงๅ…จๅœบไธ‰ๆŠ˜ๅคงๆŠข่ดญไธ้”™่ฟ‡ๆœบไผš๏ผŒๅœฐๅ€...\n1000 Processed\n classify content\n57000 0 ่ฟ™ๅฅ่ฏๆทฑๆทฑ็š„ๅฐๅœจๅŽไธบไบบ็š„ๅฟƒ้‡Œ\n57001 0 ๆƒ ๆ™ฎๅฝ“ๅ‰ๅทฒไธๅ†้ข„ๆœŸไธ‹ไธ€่ดขๅนดไผšๅฎž็Žฐๅขž้•ฟ\n57002 0 ๅŒ–ๅฆ†ๆฐด่ƒฝไธบ่‚Œ่‚ค่กฅๅ……่ฟ™ไบ›็šฎ่‚คๆž„ๆˆ็š„ไฟฎๅคๆˆๅˆ†\n57003 0 ๅŽๅคฉๅ› ็ด ่™ฝ็„ถๅชๅ ๅˆฐ20%๏ฝž30%\n57004 0 ้™ค9ๅฎถๅˆ›ไธšๆฟๅ…ฌๅธ่Žทๅพ—็คพไฟๅขžๆŒๅค–\n1000 Processed\n classify content\n57500 0 ๆฑŸ่‹ๅซ่ง†ๅ‹‡่€…ๅคงๅ†ฒๅ…ณ็ซŸ็„ถๆ”พไบ†SUJUDevil\n57501 0 ่ฟ‘ไบ”ๆ—ฅๆœ‰xxxๅชไธช่‚ก่ขซๆœบๆž„่ฏ„็บงไธบไนฐๅ…ฅ\n57502 0 ไธบๅ•ฅ่ฏด่ฏฅ็กไบ†ๅ‘ขๅ› ไธบๆ‰‹ๆœบๆฒก็”ตๅ•ฆ\n57503 0 ไป–ๅœจ็คพไบคๅช’ไฝ“ไธŠๅ‘่กจไบ†x็ฏ‡่ขซ่ฎคไธบๆ˜ฏ่ฏ‹ๆฏๆณฐๅ›ฝๅ›ฝ็Ž‹็š„ๆ–‡็ซ \n57504 0 ็ขงๆฐดไบ‘ๅคฉๅฐๅŒบxxๅทๆฅผไธ‰ๅ•ๅ…ƒไธ‰ๆฅผไธœๆˆทๅ–็ƒง็ƒค็š„่ฝฆไธŠๅธธๅนดๆ”พ็‡ƒๆฐ”็ฝๅœจไธŠ้ข\n1000 Processed\n classify content\n58000 1 (ๆœ‹ๅ‹-ๆœ€่ฟ‘ๅฅฝๅ—๏ผŸๆœ‰ ไธŠ ๅฅฝ ็š„ ่Œถ ๅถ๏ผŒ่ฟ˜ ๆ˜ฏ ๅŽŸ ๆฅ ็š„ ๅ‘ณ ้“ใ€‚ไฝ ๆ‡‚็š„ใ€‚xxxไธฝxxx...\n58001 1 ๅŽ้นค็ฑณๅ…ฐ็บณๆ—ถๅฐš้—จ๏ผŒx.xxโ€œไธบๅฅๅบท๏ผŒๆƒ ไปปๆ€งโ€ๆดปๅŠจ๏ผŒไบซๅŽ‚ๅฎถ็Žฏไฟ่กฅ่ดดxxxๅ…ƒ/ๆจ˜๏ผŒๅฎš้‡‘ๅขžๅ€ผไผ˜ๆƒ ๆปก...\n58002 0 ไธ€็”ท็š„้ข†็€่€ๅฉ†ๅญฉๅญๅœจxๅท็บฟไนž่ฎจ\n58003 0 ๅ‡กๅฐ”่ต›ๅฎซๅปบ็ญ‘ๅธƒๅฑ€ไธฅๅฏ†ใ€ๅ่ฐƒ\n58004 0 ้›†ไธญๅฏน่พ–ๅŒบ้žๆณ•่ฟๆณ•ๅปบ็ญ‘ๅทฅ็จ‹่ฟ›่กŒไบ†้‡ๆ‹ณๆ•ดๆฒป\n1000 Processed\n classify content\n58500 0 ็ซ ็š„้‡‘็ƒA็ฑปๆๅไปฅๅŠๅŒ—็พŽๅคšไผฆๅคš่‹ฑๅ›ฝๅคงๅคงๅฐๅฐๆๅไธ€ๅคงๅ †\n58501 0 ไธ€ไบ›้“ถ่กŒๅ’Œๅˆธๅ•†ๅฏ่ƒฝๅฐ†้ขไธด้ฃŽ้™ฉ\n58502 0 ๅพฎ่ฝฏCEO่จ่’‚ไบš็บณๅพทๆ‹‰ๅœจไบ‘่ฎก็ฎ—็š„้“่ทฏไธŠ่ถŠ่ตฐ่ถŠๅšๅฎš\n58503 0 
่ฟ™ไบ›ๅ…ฌไบค่ฝฆๅฐ†ๅˆ†ๅธƒๅœจ7่ทฏใ€33่ทฏใ€303่ทฏใ€405่ทฏใ€407่ทฏใ€414่ทฏใ€612่ทฏใ€708่ทฏ8...\n58504 0 ๅšๆŒ็ญ”่พฉๆ„่งๅฎ‹๏ผšๆ‰€ๆœ‰่ดน็”จไธๅบ”็”ฑๆˆ‘ๆ”ฏไป˜\n1000 Processed\n classify content\n59000 0 ๅŒๆ—ถๅซๆณชๅ‡บไธ€ไธชBoseQCxx้™ๅ™ช่€ณๆœบๅœจ็พŽๅ›ฝไบš้ฉฌ้€ŠไธŠไนฐ็š„\n59001 0 ้ช‘่กŒ3000ๅƒ็ฑณ็š„20ไฝไธญ้Ÿฉๅฐไผ™ไผดไปฌ\n59002 0 ๆ™šไธŠๆถ‚ไบ†ไธ€ๅฑ‚ๅธ•ๆฐ่œ‚่œœๆŠ—ๆฐงๅŒ–็ฒพๅŽ\n59003 0 ไธ“่ฝฆๅธๆœบ็š„ไธ€ๅคฉ๏ผšๆ”ถๅ…ฅๅ˜ๅฐ‘่€ƒ่™‘่ฝฌ่กŒ\n59004 0 ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๆƒณ่ตท็”ตๆขฏไบ‹ไปถ่ฟ˜ๆ˜ฏๆƒณไน็Žฐๅœจๅ‘†ๅœจไธ€่ตทไธ€็‚นไธ็€ๆ€ฅๅ›žๅฎถ้•ฟๅบฆ็งฏ็ดฏ็š„็‰นๅˆซ่ธๅฎž\n1000 Processed\n classify content\n59500 0 ๅญๅฎซ่‚Œ่…บ็—‡่ฟ™ไธ€ๅ่ฏๅœจๆˆ‘ไปฌ่บซ่พน้ข‘็นๅœฐๅ‡บ็Žฐ\n59501 0 ๆœ€ๅคงๅŽฟ้ข้›จ้‡ๅฎๆตท146ๆฏซ็ฑณใ€ไธ‰้—จ144ๆฏซ็ฑณใ€ๆค’ๆฑŸๅŒบ136ๆฏซ็ฑณ\n59502 0 ๆˆฟๅœฐไบงๆดป่ทƒ็ป™็ปๆตŽ่ฟ่กŒๅธฆๆฅ็š„ๆญฃ่ƒฝ้‡\n59503 0 ๅธฎๅฟ™่ฝฌไธ‹่‹ๆ‰“็ปฟๅ—ไบฌ็ซ™็š„็ฅจไธคๅผ ๅฅฝๅ—\n59504 0 ๅœฐ้“ไฟกๅท็›ฒๅŒบๆŒบๅคšโ€ฆโ€ฆๅ—้‚ต็ซ™ๅˆฐๆฒ™ๆฒณ้ซ˜ๆ•™ๅ›ญ\n1000 Processed\n classify content\n60000 0 2014ๅนด้ซ˜ๆทณๅŒบ่ง„ๆจกไปฅไธŠๅทฅไธšไผไธšไธ‡ๅ…ƒๅทฅไธšๅขžๅŠ ๅ€ผ่ƒฝ่€—ไธบ0\n60001 0 ๅˆšๅˆšๅ—ๅŒบ่€ๅคงๅ’Œๆˆ‘่ฏดๆต™ๆฑŸๆœ‰ไธชๅคงไธ€็”จๅพฎๅ•†ๅ›ข้˜Ÿๆจกๅผไธ€ไธชๆœˆ่ตšxxๅคšไธ‡\n60002 0 ไธŽๅพๅทžๅ›ฝๅŽ็‰ฉไธšๅคง่ก—ไฟๆดๅ‘˜้—ฒ่Šๅพ—็Ÿฅ\n60003 0 ๆƒณๆƒณๆœฌๆฅ่ฟ‘ๅœจๅ’ซๅฐบ็š„26ๆฅผ้‚ฃ้—ด้คๅŽ…\n60004 0 ๆˆ‘7ๆœˆ28ๆ—ฅๅˆš็ป“ๆŸๅคง่ฟžไธญๅฟƒๅŒป้™ข่ƒ–็˜ฆ็พค็ป„็ป‡็š„็ฌฌไบŒๆœŸ็™พๅคฉ่ฟๅŠจๅˆšๅฅฝ็ป“ๆŸ\n1000 Processed\n classify content\n60500 0 ๆข…ๅŸŽๆฑŸๅ—้‡‘ๆฒ™ๆนพKTVๅ‘็”ŸๆƒŠ้™ฉไธ€ๅน•\n60501 0 ๅœจๅ—ไบฌไธพๅŠž็š„ไปฅๆ„ๅคงๅˆฉ็ฑณๅ…ฐไธ–ๅšไผšๅ—ไบฌๅ‘จไธบไธป้ข˜็š„โ€œๅ—ไบฌ็ฑณๅ…ฐๅŒๅŸŽไบ’ๅŠจ็ง€โ€ไธŠ\n60502 1 .xx.xx.xx.xx~xxxๆœŸ:ๅฟ…ไธญใˆ ่‚–:้ฉฌ~ๅฟ…ไธญx็ :xx.xx.xx-xxxๆœŸ:ๅ†…้ƒจ...\n60503 0 ไธบไป€ไนˆๆท˜ๅฎappๅณไธ‹่ง’ไธ€ไธช่ƒ–ๆ‰‹\n60504 0 ็Ÿณ็‹ฎ็ต็ง€ๆดพๅ‡บๆ‰€ๆฐ‘่ญฆๆŽฅๅˆฐๆŠฅ่ญฆ\n1000 Processed\n classify content\n61000 0 ๆ˜ฏๅ€ผๅพ—ๆฏไธชไบบไธบไน‹้ช„ๅ‚ฒ็š„่ดขๅฏŒ\n61001 0 ๆœ‰ไบบ่ฆๅ—ไบฌ้‡‘้นฐๅข็ฑณๅŸƒ่ฅฟๆธธ่ฎฐไน‹ๅคงๅœฃๅฝ’ๆฅ็š„็ฅจไนˆ\n61002 0 longtimenoseehey๏ฝž\n61003 0 ไธ€ๅฎถ่€ๅฐไปŽๆญคไธŽๅŒป้™ขใ€่ฏ็‰ฉใ€็—…็—›ๆ— ็ผ˜\n61004 0 
่ขๅง—ๅง—ๆ‰ฎๆผ”ไบ†้ฅฑๅ—่ดจ็–‘ไฝ†ๅง‹็ปˆๅฏนๆผ”ๆˆๆœ‰็€็ƒญๅฟฑไน‹ๅฟƒ็š„ๅฅณๆผ”ๅ‘˜\n1000 Processed\n classify content\n61500 0 ๆฑฝ่ฝฆไน‹ๅฎถ่ƒฝๅฆๅฐ†ไธšๅŠกๆจกๅผ้กบๅˆฉ่ฟ‡ๆธกๅˆฐไบŒๆ‰‹่ฝฆ้ข†ๅŸŸ\n61501 0 ๅ…ฉๅฒธๅ››ๅœฐ้’ๅนด่š้ฆ–ๅ—ไบฌๆ…•็ทฌๆญดๅฒ\n61502 0 ๆˆ‘ๅœจๅ—ไบฌ่ทฏไบจๅพ—ๅˆฉไบงๅ“ๅทฒ้‡ๆ–ฐไธŠๆŸœโ€ฆ\n61503 1 ๅฎขๆˆทๆไพ›ไผ˜่ดจ็š„่ดท ๆฌพๅ’จ่ฏขๆœๅŠกใ€‚ ๆˆ‘ไปฌ็š„่ดท ๆฌพไบงๅ“ๅˆฉ็އๆœ‰ ไฟก*่ดท๏ผŒๆœˆ่ดน็އx.xx-x.xx%...\n61504 0 ๅ…ˆ็”Ÿๅ…ฌๅธ้›†ไฝ“ๆ—…ๆธธ5็‚นๅคšๅฐฑๅ‡บๅ‘\n1000 Processed\n classify content\n62000 0 ไปŠๆ—ฅๆท่ดขไธŽๅบ†ๆฑ‡็งŸ่ตๆœ‰้™ๅ…ฌๅธๆญฃๅผ่พพๆˆๆˆ˜็•ฅๅˆไฝœ\n62001 0 ๅ—ไบฌๅ„ๅคงๅŒป้™ข้‡็‚น็ง‘ๅฎคใ€ๅŒป็”Ÿ\n62002 0 GoogleDriveๅ†…ๅฎนไบบๅฎกๆŸฅ่ฟ˜ๆ˜ฏๅคช็‰›ไบ†\n62003 0 ไบ”ใ€ๆˆ‘ไปฌ่ฎคไธบ้žๆณ•ๆ‹†้™คไธŽๆ•ดๆ”นๆ•™ๅ ‚็š„ๅๅญ—ๆžถ\n62004 0 ไปŠๅนด้กบๅˆฉ่ขซๆต™ๆฑŸไผ ๅช’ๅคงๅญฆๅฝ•ๅ–็š„่ดบๆ™—็ดซๅ’Œ่ขซๅ—ๆ˜Œๅคงๅญฆๅฝ•ๅ–็š„ๆˆด้›…ๅฎœๆš‘ๆœŸๆŠฝๅ‡บไผ‘ๆฏๆ—ถ้—ดๆฅไธบๅญฆๅผŸๅญฆๅฆนไปฌๆŒ‡ๅฏผไธ“ไธš\n1000 Processed\n classify content\n62500 0 ไธ‰ๅนดๅ‰็ซ‹ๅฟ—่ฆ่€ƒๅˆฐไบบๅคงๆฅๆ‰พไฝ \n62501 0 ไธ่ฟ‡้š”ๅฃso?hoๆŒๅˆ€ไผคไบบๆกˆ่ฟ˜ๆ˜ฏๆ›ดๅฏๆ€•\n62502 0 ๆˆ‘ๅ€‘ๆ‰“็ฎ—ๆ–ผๅๆœˆๆˆ–ๅไธ€ๆœˆ่ˆ‰่พฆไธ€ๅ€‹ๆœ‰ๆฉŸ่พฒ็จฎๆค่พฒๅ ดไน‹ๆ—…\n62503 0 ่€Œไธ”ไธๅƒๆ™ฎ้€šๆ„Ÿๅ†’้‚ฃๆ ทxๅ‘จๅทฆๅณๅฐฑๅพˆๅฟซๅฅฝ่ฝฌ\n62504 0 ๅ…จๆ–ฐApivitanewBeeRadiantไบฎ่‚ŒๆŠ—็šบไฟฎ่ญท็ณปๅˆ—\n1000 Processed\n classify content\n63000 0 ็Žฐๅœจๅทฒ็ปๆœ‰ไบบ้ข„ๅฎšdcไบ†้‚ฃๅฐฑๅ…ˆๅผ€ๅง‹ๆŽ’ๅŽๅคฉ็š„ๅ•\n63001 0 ๅœฐ้“ไธŠ็š„ๆธฉๅบฆๅ’ŒๆทฑๅœณๅŒ—็š„ๆธฉๅบฆ็ฎ€็›ดๅคฉๅฃคไน‹ๅˆซ\n63002 0 ่ฟ™่ฆๅœจๆŸๅฎไธคๅคฉguyongไธๅˆฐ่‹ๅทž\n63003 1 โ€œไฝ ็š„ๅญ˜ๅœจใ€็”Ÿๅ‘ฝ็š„ๆ„ไน‰โ€ๅฅณไบบ่Š‚ๅˆฐๆฅไน‹ๅณ๏ผŒ่ฏ—็ฏ‡ไธ“ๆŸœ็‰น้‚€ๆ‚จๅˆฐๅบ—้€‰่ดญ๏ผŒ็‰นๆŽจๅ‡บๅ†ฌๆฌพxๆŠ˜ใ€ๆ˜ฅๆฌพxๆŠ˜๏ผ›ๆŠ˜...\n63004 0 ๅนด็ปˆๅฅ–10000ๅ…ƒๅทฆๅณ+ๅŒ…ไฝๅŒ…ๅƒ+ไบ”้™ฉไธ€้‡‘\n1000 Processed\n classify content\n63500 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅฑฑๆน–ๆนพ็š„็ฝฎไธš้กพ้—ฎๅผ ่‰ณ๏ผŒๅฑฑๆน–ๆนพ็Žฐๅœจๆ–ฐๆŽจๆˆฟๆบ็ซ็ˆ†็ƒญ้”€ไธญ๏ผŒ่ดญๆˆฟๅคš้‡ไผ˜ๆƒ ๏ผŒๆœŸๅพ…ไธŽๆ‚จ็š„ๅ†ๆฌก่ง...\n63501 0 DMๆ‰‹ๆœบu็›˜16gu็›˜ๅŒๆ’ๅคดotg็”ต่„‘ไธค็”จๅˆ›ๆ„้‡‘ๅฑž่ฟทไฝ ่ฝฆ่ฝฝ้˜ฒๆฐดๅŒ…้‚ฎ\n63502 0 ่ฝฆๅž‹ไป‹็ป๏ผšๅฅ”้ฉฐs็บงๅ 
ช็งฐๆœ€่ˆ’้€‚ใ€่ฑชๅŽ็š„ไธญๅคงๅž‹่ฝฟ่ฝฆ\n63503 0 ็šฎ่‚ค่กฐ่€็šฑ็บน็š„ไบง็”Ÿๆ˜ฏๅ› ไธบ่ƒถๅŽŸ่›‹็™ฝๆตๅคฑ\n63504 0 ๅ›ž้กพๅ’—ๅฅฝๅ‡ ๆฎตxxใ€xxๅนดไปฃๅ˜…NBA่ง†้ข‘\n1000 Processed\n classify content\n64000 0 ็ป“ๆžœ่ŠฑๅฎŒๅก้‡Œๆœ€ๅŽ็š„xxๅ—ไน‹ๅŽๆˆ‘ๅฐฑๅˆๆปšไบ†\n64001 0 ๅพ—ๅˆฐไบ†ไธ€ไธชๅŽไธบ็š„ๅฎ‰ๅ“ๆ‰‹ๆœบ็ป™ไบ†ไป–\n64002 1 ๆ–ฐๅนดๅฅฝ๏ผ้“ถ่กŒๆ–ฐๆ”ฟ็ญ–๏ผšๅช่ฆๆœ‰ไฟ้™ฉๅ•\\ๆŒ‰ๆญใ€่ญๆœฌๆˆฟ\\ๅฐ่ฝฆไน‹ไธ€๏ผŒๅ‡ญ่บซไปฝ\\่ฏๅณๅฏๅŠž็†๏ผŒๆœˆๆฏไป…xๅŽ˜-x...\n64003 0 1992ๅนดไปŽๅ—ไบฌ็ฉบๅ†›ๅธไปค้ƒจๅ†›่ฎญๅค„่ฝฌไธšๅŽ่ฟ›ๅ…ฅๆฑŸ่‹็œๆตท้—จๅธ‚ๅฎก่ฎกๅฑ€ๅทฅไฝœ\n64004 0 ็ฝ‘้กต็ซฏ่ฟ˜ๅฟ…้กปๆ‰‹ๆœบไธ‹่ฝฝappๆ‰ซๆ็™ป้™†\n1000 Processed\n classify content\n64500 0 ๆณ•ๅฎ˜ไธ่ฆๆˆไธบ็ซฏไบบ็ข—ๆœไบบ็ฎก็š„ๆณ•ๅฎ˜\n64501 0 โ€œsuper189็”ตไฟกๆฌขgo่Š‚โ€œ็ผบ็ฟผไธๅฏๅ†ๆฌกๆฅ่ขญ\n64502 0 ๆˆ‘ไธบไป€ไนˆไฝœๆญปๅŒๅผ€hhhhhhhhhhhhhhhๅฟซ็ฌ‘ๆญปไบ†hhhhhhhhhh\n64503 0 ไธญๅ›ฝๆฑŸ่‹็œๆทฎๅฎ‰ๅธ‚ๆทฎ้˜ดๅŒบ็Ž‹่ฅ้•‡\n64504 0 ๆฎๆต™ๆฑŸ็œๆ—…ๆธธๅฑ€ๆœ€ๆ–ฐ็ปŸ่ฎกๆ•ฐๆฎ๏ผšๆต™ๆฑŸxxxxๅนดไธŠๅŠๅนดๆŽฅๅพ…ๅ›ฝๅ†…ๅค–ๆธธๅฎข็บฆx\n1000 Processed\n classify content\n65000 0 ๅŽๅ†œ็™พ็ต7ๆœˆไธบๅนฟๅคงๅญฆๅ‘˜ๅผ€ๆ”พๅ…่ดน็š„่ทจๅขƒ็”ตๅ•†่ฏ•ๅฌ่ฏพ็จ‹\n65001 0 ไธญๅคฎ็ฉบ่ฐƒๅฃฐ้Ÿณๅคง็š„่ทŸ็ซ็ฎญๅ‘ๅฐ„ไผผ็š„\n65002 0 50ไธ‡็š„่ทฏ่™ŽVS15ไธ‡็š„้™†้ฃŽ็œŸ็›ธ่ฎฉไฝ ๅคงๅ่ก€\n65003 0 comๅ•†ๅŠก้ƒจ้ข†ๅฏผใ€ๅ›ฝๅฎถๅทฅๅ•†ๆ€ปๅฑ€้ข†ๅฏผใ€ไบบๆฐ‘ๆ—ฅๆŠฅ้ข†ๅฏผไธญๅ›ฝๅ•†ไธš่”ๅˆไผš้ข†ๅฏผ็ญ‰็ญ‰ๅ‰ๆฅๅˆฐๅœบ็ฅ่ดบๅŠๅ‘่จ€\n65004 0 ่ƒฝ่ฝฝ็€16ไบบไปฅ250ๅƒ็ฑณ/ๅฐๆ—ถ็š„้€Ÿๅบฆ้ฃž้ฉฐ\n1000 Processed\n classify content\n65500 0 wifi็”ต่„‘็ฉบ่ฐƒๅ†ฐ็ฎฑๅŠ ๅผ ๅบŠ\n65501 0 ๅทฒ็”จไบŽ้˜ต้ฃŽๆˆ˜ๆ–—ๆœบๅ’Œ็ซ็ฎญๅ‘ๅŠจๆœบๅ–ท็ฎกไธญ\n65502 0 ไนŸๆ˜ฏgoogleๅŸบไบŽlinuxๆ‰‹ๆœบๆ“ไฝœ็ณป็ปŸ็š„ๅ็งฐ\n65503 0 ๆˆฟๅœฐไบง็จŽ็ซ‹ๆณ•ๅˆ็จฟๅทฒๅŸบๆœฌๆˆๅž‹\n65504 0 ๅŽไธบๅฐๅ“ฅๅฐฑ่ตฐ่ฟ‡ๅŽป่ฏด๏ผšโ€œๅ…ˆ็”Ÿ\n1000 Processed\n classify content\n66000 0 ๅ–œๆฌขไธœๆ–น่ง‰ๅพ—ๅ’Œ่Šฑๅƒ้ชจๅพˆ่ˆฌ้…\n66001 0 ไธ€ไธชไบบๅŽปๅ—ไบฌๅƒไบ†็ด ้ขๆฒกๆœ‰็ขฐๅˆฐๆ’‘ไผž็š„ไฝ ็งฆๆทฎๆฒณ่ฟ˜ๆ˜ฏ้‚ฃๆ ท็ดข็„ถๆ— ๅ‘ณ\n66002 1 ๅฅณไบบ่Š‚ๆฅไธดไน‹้™…๏ผŒ็‰นๅ‘ๅนฟๅคงๅฅณๆ€งๆœ‹ๅ‹ๆŽจๅ‡บไผ˜ๆƒ 
ๆดปๅŠจ๏ผŒๅฟซไนๅฅณไบบ่Š‚๏ผVIP้กพๅฎขๆŒๅกไบซๅ—xๆŠ˜ไผ˜ๆƒ ใ€‚ๅฆ‚ๆžœๆฒก...\n66003 0 ๅˆ่ฆๅผ€ๅง‹myperfectday\n66004 0 ไธ€ๅๅœฐ้“ๅฐฑๆƒณ้ช‚่ก—ๅŒ—ไบฌๅœฐ้“ไธ‰ๅˆ†้’Ÿไธ€่ถŸไฝ ่ฟ™่ถŸๆฒก่ตถไธŠ่ตถไธ‹่ถŸๅฐฑ็œ‹ไธ่งไฝ ๅฆˆๆœ€ๅŽไธ€็œผไบ†่ฟ˜ๆ˜ฏๆ€Žไนˆ็€\n1000 Processed\n classify content\n66500 0 2013ๅนด11ๆœˆๅ€Ÿ็”จๅŽไธ€ๅนดๅคšไธ€็›ดไธๆ่ฟ˜่ฝฆ\n66501 0 ไธ€ๆ—ฆๆž่ตทๆฅๅฐฑๆ˜ฏๆฑฝ่ฝฆ้‡Œ็š„Windows\n66502 0 ๆจๅฝฉ้œžๅฐฑๅๅˆฐๅฐๆˆฟ้—ด็”ต่„‘ๅ‰็š„่ฝฌๆค…ไธŠ\n66503 0 ไธบไป€ไนˆๆฑŸ่‹็งปๅŠจ็ฝ‘ไธŠ่ฅไธšๅŽ…ๆฒกๆœ‰ๅˆ็บฆๆœบ\n66504 0 ้ขๅฏน่ดจ็–‘๏ผšไป–็Žฐๅœจๆ€Žไนˆๅฏ่ƒฝ็ก\n1000 Processed\n classify content\n67000 0 ้™่ง‚ๅ…ถๅ˜/็–‘ไผผMH370ๆฎ‹้ชธๅœฐๅ‘็Žฐไธญๅ›ฝ็Ÿฟๆณ‰ๆฐด็“ถ\n67001 0 ้ฃžๆœบไปŽๅ›พ็บธไธŠ่ฝๅœฐ้œ€่ฆไธคไปฃไบบ\n67002 0 ๆˆ‘็œŸไธ็›ธไฟกๅๅœจๆ‰‹ๆœบๅฑๅน•ๅ‰็š„ไฝ ไปฌไธ€ไธชไบŒไธช้ƒฝๆ˜ฏๅๅ…จๅ็พŽ็š„ไบบ\n67003 0 ๆœ‰ๅœจๆทฎๅฎ‰ๅ…ซไป™ๅฐ้™„่ฟ‘้™„่ฟ‘็š„ๆœ‰่ฝฆ็š„ๆœ‹ๅ‹ๅ—\n67004 0 ๅกๆ‹‰ๆ›ผ่‡ชไป–็š„BaupostๅŸบ้‡‘1982ๅนดๅˆ›็ซ‹ไปฅๆฅ\n1000 Processed\n classify content\n67500 0 Windows็”จๆˆทๅทฒ็ปๅฏไปฅๆ”ถๅˆฐๆญฃๅผ็‰ˆๆ›ดๆ–ฐ็š„ๆŽจ้€ไบ†\n67501 1 ๅฐŠๆ•ฌ็š„ๅญฆ็”Ÿๅฎถ้•ฟ๏ผŒ่‡ด่ฟœๆ•™่‚ฒ็ฅๆ‚จ้˜–ๅฎถๅนธ็ฆ๏ผŒ็พŠๅนดๅคงๅ‰๏ผ่‡ด่ฟœๆ•™่‚ฒๅˆไธญๅ„ๅญฆ็ง‘ๆŠฅไธ€็ง‘้€ไธ€็ง‘๏ผŒๅ…จ้ขๆๅ‡ๆˆ็ปฉ...\n67502 1 ๆ–ฐๆ€็ปดๅฅฅๆ•ฐไธ“ๅฎถๆœฌๅ‘จๅ››ๅณxๆœˆxๅทxx:xxๅผ€ๅง‹่ต›ๅ‰ๆจกๆ‹Ÿ้ข˜ๅž‹ๅ’Œ่€ƒ่ฏ•ๆŠ€ๅทงๅ…่ดน่ฎฒ่งฃๅŸน่ฎญใ€‚ๆดปๅŠจๆŒ็ปญไธคๅ‘จ...\n67503 0 ๅด่ขซHTC้€š่ฟ‡ไธ“ๅˆฉๆƒไบบๅœจไธ“ๅˆฉๅฎกๆŸฅ่ฟ‡็จ‹ไธญๆœ‰ไธๅ…ฌๅนณ่กŒไธบ็š„ไธพ่ฏ่€Œๅฏผ่‡ดๆปก็›˜็š†่พ“็š„ไบ‹ๆƒ…\n67504 0 ๅ—ไบฌๅธ‚็พŽๅไผšๅ‘˜ๅฑฑๆฐด็”ปๅฎถไฝœๅ“\n1000 Processed\n classify content\n68000 0 ๆœฌๆฅไปŠๅคฉๆƒณๅŽป็œ‹McQ็š„ๅฑ•่งˆโ€ฆ็ป“ๆžœๅš็‰ฉ้ฆ†ๅผ€้—จไธๅˆฐๅŠๅฐๆ—ถ็ฅจๅฐฑๅ–ๅ…‰ไบ†โ€ฆๅชๆœ‰ๅŽป็œ‹้ž‹ๅฑ•โ€ฆๅœจV&amp\n68001 0 ็„ถๅŽไธ€ไบ›ไธๆ˜Ž็œŸ็›ธ็š„ๅค–ๅ›ฝไบบไนŸไผš็•™่จ€ๅคง่‚†ๅๆงฝ\n68002 0 ็ป™ๅคงๅฎถๅˆ†ไบซ\"ๅŽ่ƒฅๅผ•ไน‹็ป็ฌ”ไน‹ๅŸŽ\"\n68003 0 Whataretheyๅผ„ๅ•ฅๅ˜ž\n68004 0 ๅชๆœ‰ๆ”ฟๅบœๆ‰ๆœ‰่ƒฝๅŠ›ๅŽปๅ…ณ็ˆฑ้‚ฃไบ›ๅคฑ็‹ฌๅฎถๅบญ\n1000 Processed\n classify content\n68500 0 ๅœŸๅœฐๆ•ดๆฒปๆ˜ฏๅ‘ๅฑ•็Žฐไปฃๅ†œไธšใ€ไฟƒ่ฟ›ๅ†œๆฐ‘ๅขžๆ”ถ็š„ๆœ‰ๆ•ˆ้€”ๅพ„\n68501 0 
6ใ€ๅ…šๆ”ฟๅนฒ้ƒจใ€ๅ…ฌ่Œไบบๅ‘˜้“ๅพท่ดฅๅใ€็”Ÿๆดป่…ๅŒ–ๅ •่ฝ\n68502 0 ้ƒฝๅพ—้ ็€ๅ’ฝๅ–‰็ณ–ๆ‰่ƒฝๆญฃๅธธๅผ€ๅฃ่ฏด่ฏ\n68503 0 ๅคฎไผๆ”น้ฉใ€ๅ†›ๅทฅ่ˆชๅคฉใ€ๆœบๅ™จไบบ็ญ‰ๅผบ่€…ๅฝ’ๆฅ\n68504 0 ๆ–ฐๅŒ—ๅ…ซไป™ไนๅ›ญ็ฒ‰ๅฐ˜็ˆ†็‚ธไบ‹ไปถไธญๆœ‰432ไฝ็ƒงไผคๆ‚ฃ่€…ไปๅœจไฝ้™ขๆฒป็–—\n1000 Processed\n classify content\n69000 1 ............ๅบ—ๆœ‰็คผใ€่ฟž่ดญๆœ‰็คผ๏ผŒๆปกไธ‰ๅƒ่ฟ”xxx๏ผŒๆœ‰็›ดไพ›ๅก๏ผˆๆ‰่ƒฝๅ‚ๅŠ ๆดปๅŠจ๏ผ‰ๅฏไฝŽxxx...\n69001 0 ไธ€่ˆฌ็”ทไบบ็š„ไฝ“ๅ†…ๅคง็บฆๆœ‰300ไบฟไธช่„‚่‚ช็ป†่ƒž\n69002 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ้‡Œ้ข้ƒฝๆ˜ฏ้ป‘้พ™ๆฑŸ็š„ๆญŒๆ‰‹ๅ•Š\n69003 0 ๅฟซๆฅๅญฆๅงโ€”โ€”็”ฑ่œ‚่œœๅ’ŒๆŸ ๆชฌๅ†ฒ่ฐƒ่€Œๆˆ็š„่Œถ้ฅฎ\n69004 0 ๆฑŸ่‹็œไธญๅ‡บ็š„่ฟ™xๆณจไธ€็ญ‰ๅฅ–ๅˆ†ๅˆซ่ขซ่ฟžไบ‘ๆธฏใ€ๆ‰ฌๅทžๅ’Œๅฎฟ่ฟๅฝฉๆฐ‘ไธญๅพ—\n1000 Processed\n classify content\n69500 0 ๅˆไธ€้›†ๅ›ขๅฎฃๅธƒไธŽๅ›ฝๅ†…ๆœ€ๅคง็š„ๅœจ็บฟๆ—…ๆธธไผไธšๆบ็จ‹ๆ—…่กŒ็ฝ‘่พพๆˆๆˆ˜็•ฅๅˆไฝœ\n69501 1 ๅทๅค–.ๅทๅค–.ไธ็”จๅŽปๆพณ้—จ.็ฝ‘ไธŠๅผ€ๆˆทไนŸ่ƒฝ็Žฉ๏ผˆOKๅจฑ.ไนๅ…ฌๅธ๏ผ‰่ฟŽๆŽฅๆ–ฐๅฎขๆˆท๏ผๅผ€ๆˆทๅฐฑ้€xxxๅ…ƒ๏ผŒ็Žฉๅˆฐx...\n69502 0 2015ๅนด่ฏๅˆธไปŽไธš่ต„ๆ ผ่€ƒ่ฏ•้ข˜ๅบ“๏ผšd\n69503 0 20ๅ—ไบฌๆˆ‘ๅทฆไธŠ็œ‹ใ€ๅณไธŠ็œ‹ใ€ๅทฆ็œ‹ใ€ๅณ็œ‹ใ€ๅทฆไธ‹็œ‹ใ€ๅณไธ‹็œ‹\n69504 0 ๅ…ฑ่ฎกๅ‘ๆ”พๆถˆๆš‘ๆธ…ๅ‡‰้ฅฎๆ–™8735็ฎฑ\n1000 Processed\n classify content\n70000 0 โ€”โ€”โ€”โ€”โ€”โ€”ๅ„ฟๅญ\n70001 0 ๅฅฝๅฃฐ้Ÿณๅญฆๅ‘˜้‡Œ้ขๅ“ชไธชๆ˜ฏ่Š‚็›ฎ็ป„ไบฒ่‡ช่ฏท็š„\n70002 0 ๆ›ดๆฒกๆœ‰ๅนผ็จš่€Œ่ฃ…้€ผๆ ผ็š„็”Ÿๆดปไฝœ้ฃŽ\n70003 0 ๅทด้™ต็ŸณๅŒ–ใ€้•ฟๅฒญ็‚ผๅŒ–ๅŠๅฒณ้˜ณๆถˆ้˜ฒๆ”ฏ้˜Ÿใ€ๆน–ๅ—็œๆถˆ้˜ฒๆ€ป้˜Ÿ้•ฟๆฒ™ๆˆ˜ๅŒบๆˆ˜ๆƒ…ไฟ้šœๅคง้˜Ÿ็ญ‰12ๆ”ฏๆถˆ้˜ฒ้˜Ÿ่”ๅˆๅ…ฌๅฎ‰ใ€...\n70004 0 ๆˆ‘ๅŽปๅฐผ็Ž›็š„็บขๅŒ…้”ๅฑๅˆฐๅบ•่ƒฝไธ่ƒฝๅ…‘ๆข\n1000 Processed\n classify content\n70500 0 ๅฒญๅ…œๆ‘ไฝไบŽๅฎ˜ๆกฅ้•‡ๆ”ฟๅบœ้ฉปๅœฐๅŒ—xๅ…ฌ้‡Œๅค„\n70501 0 ๅช่ƒฝไธ€ๅคงๆ—ฉ็œ‹่Šฑๅƒ้ชจ็ฌฌไบŒ้›†ไบ†\n70502 0 ็Žฐ่ตทๆˆ‘ๅฝ“ๅนดๅฝ“ๅˆๆฒกๆœ‰้’ฑๅฐฑๆŠŠๆ‰‹ๆœบๅฝ“ไบ†\n70503 0 ***********็ŽฐๅœจๆŽงๅˆถ้ฃŽ้™ฉๅพˆ้‡่ฆๅ•Š***่ดตๅทž่Œ…ๅฐๆตทๅฒ›ๅปบ่ฎพ็Ÿณๅฒ˜็บธไธš้ป‘็‰กไธนไธญ่ˆช็”ตๅญๅ‡ค็ซน็บบ...\n70504 0 xใ€็พคๅ‘้ข‘ๆฌก่ฟ›่กŒ่ฐƒๆ•ด๏ผšๅพฎๅš็ญพ็บฆ่‡ชๅช’ไฝ“ใ€ๅช’ไฝ“่ฎค่ฏ็”จๆˆทไธบxๅคฉxๆฌก\n1000 Processed\n classify content\n71000 0 
ๆˆ‘ไปฌๆ˜ฏ้ ๅ†…่กฃ็‹ฌ็‰น็š„่ฎพ่ฎกๅฐ†่ƒธ้ƒจๆ—่พน็š„่‚‰่‚‰่š้›†ๅœจไธ€่ตท\n71001 0 ledๆ™ถๆ ผๅฐ่Šฑๅฎ ็‰ฉๅ‘ๅ…‰้กนๅœˆๅคœ้—ด้›็‹—็‹—็Œซๅ’ช็ฅžๅ™จ\n71002 0 ็Ÿฅไบงๆณ•้™ข้™ข้•ฟๅดๅ•ๆž—ๆๅ‡บ่ฆๆฑ‚\n71003 0 ไพๆ—งๆฒ‰ๆตธๅœจๆŠŠ็”ต่„‘ๆ‘”ไบ†็š„็—›่‹ฆไน‹ไธญโ€ฆไปŠๅคฉๆ—ฉไธŠๆŠŠ็”ต่„‘ๆ”ถ่ฟ›็›’ๅญ้‡Œไพ›่ตทๆฅ\n71004 1 ใ€Š็Ž–้š†ไธป้ข˜ใ€‹๏ผŒๆ–ฐ่ŒถไธŠๅธ‚ใ€‚็™ฝๅคฉxx็‚นๅˆฐxx็‚น่ฌไธ€้คธไธ€๏ผŒๅ…จๅคฉไบ”ไบบๅŒ่กŒไธ€ไบบๅ†•ๅ•๏ผŒๅฆๆœ‰ๅ…จๅคฉๆœ‰็Žฐ้‡‘็ฎžใ€‚...\n1000 Processed\n classify content\n71500 0 ๆฏไธ€ๆญฅ้ƒฝไธๆ˜ฏๆ˜“ไบ‹ไฝ ๆ‰€ๆœŸๅพ…็š„้ƒฝไธๆ˜ฏ็œŸ็›ธ็š„ๅ…จ้ƒจๅ‘ฝ่ฟ็š„ๅฎ‰ๆŽ’ๆ€ปไผšๆœ‰็ผ˜็”ฑๆŠ—ๆ‹’่ฟ˜ไธๅฆ‚ๆœไปŽ้š้‡่€Œๅฎ‰\n71501 0 7ๆœˆ18ๆ—ฅ08ๆ—ถ่‡ณ19ๆ—ฅ08ๆ—ถ\n71502 0 ่ขซๅฑฑไธœ็œ่ŠๅŸŽๅธ‚ไธญ็บงๆณ•้™ขๅˆคๅค„ๆœ‰ๆœŸๅพ’ๅˆ‘ๅๅ››ๅนด\n71503 0 ๅฆๅˆ™้ฃžๆœบๆ— ๆณ•ๅœจๅ‰ฉไฝ™็š„่ท‘้“ไธŠๅœไธ‹ๆฅ\n71504 0 ่ขซ้ƒ‘ๅทž้“่ทฏๅ…ฌๅฎ‰ๅฑ€ๆด›้˜ณๅ…ฌๅฎ‰ๅค„ๆด›้˜ณไธœ็ซ™ๅ…ฌๅฎ‰ๆดพๅ‡บๆ‰€ๆ‰ฃๆŠผๅŽ\n1000 Processed\n classify content\n72000 0 ๅ—้€šๅธ‚ๅ‘ๅธƒ้›ท็”ต้ป„่‰ฒ้ข„่ญฆ|้›ท็ฅžๅˆๆฅไบ†\n72001 0 ไปฅ็™ฝๅคด็ฒ‰ๅˆบใ€้ป‘ๅคด็ฒ‰ๅˆบใ€็‚Žๆ€งไธ˜็–นใ€่„“็–ฑใ€็ป“่Š‚ใ€ๅ›Š่‚ฟ็ญ‰ไธบไธป่ฆ่กจ็Žฐ\n72002 0 ๅผ•ๅ‘ไป–ไธŽ่ฏฅๆ‰€่ญฆๅฏŸๅ‘็”Ÿ่‚ขไฝ“ๅ†ฒ็ชๅนถ่ขซๆ‰ฃ็•™xxๅฐๆ—ถ\n72003 0 ๅฅถๅ˜ด่ฎพ่ฎกๅพ—็›ธๅฝ“ๆœ‰็ˆฑ~้œธๆฐ”ๅค–้œฒๆœ‰ๆœจๆœ‰\n72004 0 ๆญค็•ชๅŒๆ–น้€‰ๆ‹ฉๅœจSUPER็ปๅ…ธLunettesII้•œๅž‹ไธŠๆŒฅๆด’ๅˆ›ๆ„็ตๆ„Ÿ\n1000 Processed\n classify content\n72500 0 ๅ…ˆๅŽ้ช—ๅ–53ๅๅค–ๅœฐๆˆท็ฑๅฎถ้•ฟ้’ฑๆฌพๅ…ฑ่ฎก46\n72501 0 ๅ…ป่€ไฟ้™ฉๆ˜ฏๅฏไปฅๅผ‚ๅœฐ่ฝฌ็งปๅนถ็ดฏ็งฏ็š„\n72502 0 MRSOH่ขซๆˆ‘้€ผ่ฟซ่ดฑๆ•ˆๆžœๅŽ็ฌ‘็š„็œŸๅคŸ่ดฑ\n72503 0 ่‰ฏ็”ฐๆดพๅ‡บๆ‰€ไธŽ้ซ˜ๆ–ฐๅŒบๅปบ่ฎพๅฑ€่”ๅˆๆ‰งๆณ•\n72504 0 ๆˆ‘ๅฐฑๆ•ข่ฎฉไฝ ไปฌๅ—่ˆช้ฃžๆœบ้ƒฝ้ฃžไธ่ตทๆฅ\n1000 Processed\n classify content\n73000 0 ๅฏ็คบ๏ผš่ฟ™ไนŸ่ฎธไธๅ…‰ๆ˜ฏไธ€็ง้”€ๅ”ฎ็š„ๆ‰‹ๆณ•\n73001 0 ๅŠ ๆ‹ฟๅคง่‘—ๅ็š„ๆญ่ฝฆๆ—…่กŒๆœบๅ™จไบบโ€œHitchBOTโ€ๅœจ็พŽๅ›ฝ่ดนๅŸŽไธๅนธ้‡ๅฎณ\n73002 0 3BeautyQuotient็พŽๅ•†\n73003 0 ้€‰ๆžธๆžๅฐฑๅบ”้€‰ๅฎๅคไธญๅฎๆžธๆž\n73004 0 ๅดๆฏๆฏๆ€ป่ขซ่ดจ็–‘โ€œไฝ ไธ‹็ญๆˆ–ๅ‘จๆœซ้ƒฝๅœจๅฎถๅนฒไป€ไนˆ\n1000 Processed\n classify content\n73500 0 
ๆ˜ฏไธบไบ†ๆทกๅฎš็š„ไปŽๅฎน็š„ๅฅฝๅฅฝ็š„็œ‹็œ‹่ฟ™ไธชไธ–็•Œ\n73501 0 ไฝฟ็”จ้˜ฒๆ™’้š”็ฆปไบงๅ“x็šฎ่‚คๅทฒ็ปๆ˜ฏไบšๅฅๅบทๆ•ๆ„Ÿ่‚Œโ€”่€ๅฟƒๅš่‚Œ่‚คไฟฎๅค่ฐƒ็†\n73502 0 ้ธกไธœๅŽฟไบบๆฐ‘ๆณ•้™ข้™ข้•ฟๆŽฅๅพ…ๅฎค้‡Œ\n73503 0 ไปŠๅคฉ05ๆ—ถไฝไบŽๆต™ๆฑŸ็œๆธฉๅฒญๅธ‚ไธœๅ—ๆ–นๅ‘550ๅ…ฌ้‡Œ็š„ไธœๆตทๆตท้ขไธŠ\n73504 0 1ๅฅณไบบไธบ็ˆฑๅ‡บๅข™ๅ‡บๅข™ไธ€ๆฌกๅฐฑๅคŸๅฅณไบบไบซ็”จไธ€็”Ÿ็”ทไบบไธบ็”Ÿ็†ๅ‡บ่ฝจ็”ทไบบๅทๆƒ…ๅฐฑๅƒๅธๆฏ’\n1000 Processed\n classify content\n74000 0 66%็š„็”ต่„‘็”ตๅœจไธŠๅˆๅไธ€ๆ—ถๅฝปๅบ•็ฆปๆˆ‘่€ŒๅŽป\n74001 0 ็Ž‹่Šณ็œ‹ๅˆฐๆˆ‘็ฝ‘ไธŠไผ ไบ†ๆˆ‘็š„่ƒณ่†Šๅฅฝไบ†ๅฅนๅฐฑๅ‘้‚ฃ็ง็…ง็‰‡ไธŠๅŽป่ฟ˜่ฎพ่ฎกๆ€ๅฎณๆˆ‘ๆญปๅˆ‘ๆ€ไบบๅฟๅ‘ฝ\n74002 0 ๅž‹ๅทiphone6ไปฃ4\n74003 0 ไธ‹้ข่ฏท็™ฝๅŸŽ็Ž›ไธฝไบšๅฆ‡ไบงๅŒป้™ข็š„ๅŒป็”Ÿไธบๆ‚จไป‹็ป\n74004 0 ่ฏ็›‘ไผšๅฎกๆ ธ้€š่ฟ‡ไบ†็”ณ้พ™็”ตๆขฏ็š„IPO็”ณ่ฏท\n1000 Processed\n classify content\n74500 0 ไธบไบ†ๅŠๅคœไธ็ ดๅๆฏไบฒๅคงไบบ็š„็พŽๆขฆ\n74501 0 ไธ่ฟ‡ๅฐๅทไปฌๅฏ่ƒฝไผšๅฟ˜่ฎฐ่‹นๆžœๆ‰‹ๆœบ่ฟ˜ๆœ‰ๅฎšไฝๆœๅŠก\n74502 0 ็”จ็™พๅบฆๅ’Œ็”จGoogleๅฏนๆฏ”็€ๆฅๆœ็œŸๆœ‰ๆ„ๆ€\n74503 0 ๅฎšไบŽไปŠๆ—ฅๅ‡Œๆ™จๆŠต่พพ้‡ๅบ†็Ÿญๆš‚ๅœ็•™ๅŽๅฐ†้€”็ปๅ—ไบฌ็ปง็ปญๅ‘ไธœ้ฃž่ถŠๅคชๅนณๆด‹\n74504 0 ๆ‰พ่‹ๅทž้€Ÿ่พพๅผ€้”ๅ…ฌๅธไธ“ไธš้…ๆฑฝ่ฝฆ่Šฏ็‰‡้’ฅๅŒ™้ฅๆŽงโ€”ๅทฅไธšๅ›ญๅŒบโ€”ๆน–่ฅฟโ€”ๅฟซ็‚น8ๅˆ†็ฑปไฟกๆฏ็ฝ‘\n1000 Processed\n classify content\n75000 0 ๆ˜Žๅคฉๆ—ฉไธŠ็š„้ฃžๆœบ่ฆๅŽปๆ‚‰ๅฐผ็š„ไบบ็Žฐๅœจ่ฟ˜ๆฒกๆ”ถไธœ่ฅฟโ€ฆ่ฟ˜ๅœจ็ฃจๆด‹ๅทฅ\n75001 0 ๆ„Ÿ่ง‰ไธญๅ›ฝ็Žฐๅœจๆ˜ฏไธ€ไธชๆณ•ๅพ‹ใ€้“ๅพทๅŒๅคฑๆ•ˆ็š„็คพไผš\n75002 0 ่ง†้ข‘่…พ่ฎฏ้Ÿณ้ข‘51singUPไธป๏ผšๅคฉ้ญ”ๆ•™ๆ•™ไธป\n75003 0 ๅทฅไธšๅ›ญ็š„ๅก่ฝฆๅธๆœบdalaoไปฌๅพˆๅ–œๆฌขๅœๅœจๅ…ฌไบค็ซ™็‰Œๅ‰้ข\n75004 0 COM็พŽๅฆ†็ฝ‘็ซ™ไบ”ๆœˆไปฝ่Žทๅพ—ๆธ…ๆด็ฑป็š„ๅ† ๅ†›\n1000 Processed\n classify content\n75500 0 ๆ™“็‡•ๅงๅงไฝœไธบ็ฅฅ่Šๆณ•้™ข็š„ไปฃ่กจๅ‚ไธŽไบ†ๆญคๆฌกไผš่ฎฎ\n75501 0 ๆฏไธชไบบ้ƒฝๆ‰˜็€ๆ‰‹ๆœบๅฏนๅ‡†ๅŒไธ€ไธช็”ตๆขฏ้—จๆฏๆฌก็”ตๆขฏ้—จๅผ€้ƒฝไผšๅฌๅˆฐไธ€ๅฃฐๅนๆฏ็„ถๅŽๅคงๅฎถไธ€่ตทๅ“ˆๅ“ˆๅคง็ฌ‘\n75502 0 ๆฑŸ้ƒฝๆฐดๅˆฉๆžข็บฝๆ˜ฏๅ—ๆฐดๅŒ—่ฐƒไธœ็บฟๅทฅ็จ‹็š„ๆบๅคด\n75503 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆท๏ผšๆ‚จๅฅฝ๏ผ่ฏทๅ…่ฎธๆˆ‘ๅ†’ๆ˜ง็š„ๆ‰“ๆ‰ฐใ€‚้ฉฌๅฏๆณข็ฝ—็ฃ็ 
–ใ€ๅธƒ่‰บๅทฅๅŠ็ช—ๅธ˜๏ผŒๅฎพๅทๆœ€ๅ…ทๅฎžๅŠ›็š„ไธคๅคงๅ“็‰Œๅผบๅผบ...\n75504 0 ๅ†ทๆธ…็š„่‚ก็ฅจ่ฎฒๅบงไปŠๅคฉๆ˜ฏๆœ่€ๅธˆ่ฎฒ่ฏพ\n1000 Processed\n classify content\n76000 1 ๅ„ไฝๅฎถ้•ฟๆ–ฐๅนดๅฅฝ:ๆ˜ฅๅญฃ็ญๅทฒๅผ€ๅง‹ๆŠฅๅไบ†ๅ“ฆ๏ผ็ŽฐๅœจๆŠฅๅ้ƒฝๆœ‰ไผ˜ๆƒ ๏ผŒๅนถ่ต ้€็”ปๅ…ท๏ผŒๅŠๅนดๆˆ–ไธ€ๅนดไธ€ๆŠฅไผ˜ๆƒ ๆ›ดๅคš๏ผŒ...\n76001 0 ๅœจ8ๆœˆ10ๆ—ฅไน‹ๅ‰้€€่ฟ˜้กพๅฎข็š„ๆ‹ๆ‘„่ดน็”จ\n76002 0 ๅซ็‹ฌ็‰นๆดไบฎๆˆๅˆ†Sylodent\n76003 0 ๆฑฝ่ฝฆ็š„่กจ็›˜่ฆๆ˜ฏ่ฎพ่ฎกๆˆ่ฟ™ๆ ทไผšไธไผšๆฏ”่พƒๆœ‰ๅจๆ…‘ๅŠ›\n76004 0 ๆ‰€ไปฅๆˆ‘่ง‰ๅพ—่ฟ™ไธช็‚นไฝไธŠๆ”ฟๅบœๅ‡บๆ‰‹ๆŠค็›˜ๆœ‰็†ๆ€ง็š„ๅŸบ็ก€\n1000 Processed\n classify content\n76500 0 ่ฐทๆญŒ็œผ้•œใ€iPhonexSใ€็™พไธ‡็บขๅŒ…ๅ…่ดนๆ‹ฟ\n76501 0 ๅ…ฌๅ‘Š๏ผšxxๆœˆxxๆ—ฅๆ็คบๅ–ๅ‡บxxxๅช\n76502 0 ไบ’่”็ฝ‘ไธŠๅ…ณไบŽSEO็š„ๆ•™็จ‹้šๅค„ๅฏ่ง\n76503 0 โ€ๅ›ข่ดท็ฝ‘CEOๅ”ๅ†›ๅฏน่ฎฐ่€…ๅฆ‚ๆญค่กจ็คบ\n76504 1 ๅ‘ๅคง่ดขxxxxๅนด-็ฌฌxxxๆœŸ๏ผšไธ€ไบŒไธ‰ๅคดๅฅฝ็Ž„ๆœบ๏ผŒๅ•ๆ•ฐ็‰นๆณข็œ‹็บข็ปฟใ€‚้€๏ผš้พ™่™Ž็‹—็Œชไธญxxxxๅนด-็ฌฌx...\n1000 Processed\n classify content\n77000 1 ๆ–ฐๅนดๅฅฝ่€ๆฟๅจ˜๏ผๆˆ‘ๆ˜ฏๅนฟไธœไฝ›ๅฑฑๅธ‚้™ถๅผบไผไธš๏ผˆๅฏŒๅผบ้™ถ็“ท๏ผ‰ไธšๅŠก็ป็†๏ผšๆธธ้‘ซๆ—บ๏ผŒๆˆ‘ไปฌๅ…ฌๅธๆ˜ฏๅŽŸๅŽ‚ๅŽŸๆŠ›๏ผŒๅฑ•ๅŽ…ๅœฐ...\n77001 0 UPไธป๏ผšไธ่ดŸ่ดฃไปป็”ตๅฝฑ็ ”็ฉถ้™ข\n77002 0 ไปŠๅนดๆš‘ๆœŸๆผ”ๅ‡บๅญฃ็ƒญ่ตทๆฅๆฒกๅผ€ๅ‘ๅธƒไผšๅทฒๅ–้—จ็ฅจ่ฟ‘ๅ…ซๆˆ\n77003 0 GCPD่ทŸ็€่€็ˆทๅฑ่‚กๅŽ้ข่ฟฝ้‚ฃๅ‡ ๅนด\n77004 1 ๆ‚จๅฅฝ๏ผŒๆ—ถๅฐšๅฅณๅ‹ๅ…ฐๅบ•ๅบ—x.xๅฆ‡ๅฅณ่Š‚ไธบๆ‚จๅ‡†ๅค‡ไบ†ไธ€ไปฝ็คผ็‰ฉ๏ฝž่ฏทๆ‚จxๆœˆxๅทๆŒๅก่ฟ›ๅบ—้ข†ๅ–๏ฝžxๆœˆxๅทๅˆฐxๆœˆ...\n1000 Processed\n classify content\n77500 0 ่ฟ™ๅบงไฝไบŽๅพทๅ›ฝๅŒ—้ƒจsuurhusen็š„ๆ•™ๅ ‚ๅฐ–ๅก”ๅทฒ่ขซๆญฃๅผๅˆ—ๅ…ฅๅ‰ๅฐผๆ–ฏ็บชๅฝ•ไธ–็•Œๅคงๅ…จ\n77501 1 ๆ‚จๅฅฝ๏ผŒๆฌฃๅฅ•้™ค็–ค่ƒœๅˆฉๅบ—ไธ‰ๆœˆๅ…ซๆ—ฅ้’ˆๅฏนๆ–ฐ้กพๅฎข็‰นๆƒ ๆดปๅŠจๅŽŸไปทxxxๅ…ƒไฝ“้ชŒๆดปๅŠจxๆœˆxๆ—ฅไธ‹ๅˆๆ‰ซๅพฎไฟกๅช้œ€xx...\n77502 0 ๆœ‰ไบ›่ขซๅˆซ็š„ๅŒป็”Ÿๅฎฃๅˆคไธ่ƒฝไฟฎๅค็š„็œผ็›\n77503 0 ๆต™ๆฑŸ้‚ๆ˜ŒๅŽฟๅผ€ๅฑ•ๆŠ—ๆˆ˜ๅฒๆ–™ๅพ้›†\n77504 0 ้ซ˜ๆทณไบค่ญฆๅคง้˜Ÿๆทณๆบชไธญ้˜ŸไธปๅŠจไผšๅŒๅŒบๅŸŽ็ฎก้ƒจ้—จๅผ€ๅฑ•ๆฐดๆžœๆ‘Š่ดฉๅ ้“็ป่ฅ้›†ไธญๆ•ดๆฒป\n1000 Processed\n classify content\n78000 1 
็บข็ผจไธ€ๅ“ๅนผๅ„ฟๅ›ญ็Žฐๆญฃ็ซ็ƒญๆŠฅๅไธญใ€‚็ƒญๅฟฑๆฌข่ฟŽๆ‚จๅ’Œๅฎๅฎ็š„ๅˆฐๆฅ๏ผŒๅ‚่ง‚ๆœฌๅ›ญ๏ผŒๅฐฑ่ฏปๆœฌๅ›ญใ€‚ๅญฆไฝๆœ‰้™๏ผŒๆฌฒๆŠฅไปŽ้€Ÿ...\n78001 0 ไธœๅฑฑๅŽฟๆ”ฟๅบœ2015ๅนด็ฌฌไบŒๆฌกๅธธๅŠกไผš่ฎฎๅŒๆ„ไบค่ญฆๅคง้˜Ÿๆๅ‡บ็š„ๅผ€ๅฑ•้“่ทฏไบค้€šไบ‹ๆ•…ๆ•‘ๆตŽๆ•‘ๅŠฉ่ดฃไปปไฟ้™ฉๆ–นๆกˆ\n78002 0 ไฟƒ่ฟ›่‚็ป†่ƒžๅ†็”ŸๅนถไฟๆŠค่‚็ป†่ƒž\n78003 0 ๅŠŸ่ƒฝxใ€่ƒฝ่งฃๆ–‘่ฅใ€่Šซ้’ๆฏ’ใ€ๅœฐ่ƒ†ใ€ไบญ้•ฟใ€้‡Ž่‘›ใ€็กซ็ฃบๆฏ’ใ€่ฏธ่‚ๆฏ’\n78004 0 2015ๅนด7ๆœˆ10ๆ—ฅๅ’Œ7ๆœˆ20ๅฎ‰ๅพฝ็œๆทฎๅ—ๅธ‚ๅ‡คๅฐๅŽฟๅ‡คๅ‡ฐ้•‡ๆ–ฐๆน–็คพๅŒบ่ฟๆณ•ๅผบๆ‹†็™พๅง“ๆˆฟๅฑ‹ไธ็ป™่กฅๅฟ\n1000 Processed\n classify content\n78500 0 ๅŠ ๆฒนๅŠ ๆฒนๅŠ ๆฒนๅŠ ๆฒนๅŠ ๆฒน\n78501 1 ้—นๆ–ฐๆ˜ฅ๏ผŒๆŠขๅฝฉๅคดใ€‚็‘žๅšๆ–‡่ฃ…้ฅฐๆ˜ฅๅญฃxxๅฅ—็ฒพๅ“ๆ ทๆฟ้—ดๅพ้›†ไธญ...ๅฐ†ไบŽxๆœˆxๆ—ฅไธ‹ๅˆx:xx-x็‚น๏ผŒ้˜ณ...\n78502 0 2ใ€่ดŸ่ดฃๅฎขๆˆท็š„ๅผ€ๅ‘ใ€ๅฎžไฝ“ๅบ—ๅ•†ๅฎถๆ‹›ๅ•†ๅŠๆ–ฐๅฎขๆˆท่ฐˆๅˆคๅทฅไฝœ\n78503 0 ๆƒณ็Ÿฅ้“ๅ’จ่ฏข่€…ๅฆ‚ไฝ•่ฟฝๅ•ๆˆๅŠŸๅ—\n78504 0 ๆŠŠ่ฏไธธ็š„ๅŒ…่ฃ…่ฎพ่ฎกๆˆไธ‰ๅชๅฐ็Œช\n1000 Processed\n classify content\n79000 0 ไป–ไปฌๅ็€ๅฆๅ…‹ใ€ๆ‰›็€่ฟซๅ‡ป็‚ฎใ€ๆ‹ฟ็€ๆญชๆŠŠๅญ\n79001 1 ๅ€ผๆญคไธ‰ๅ…ซๅฅณ็ฅž่Š‚๏ผŒๆ–ฐ็Ž›็‰น้บฆไธญๆž—ๆดปๅŠจๅคšๅคš๏ผŒๅ‡กๅœจๆœฌๅบ—่ดญ็‰ฉๅณ้€็ฒพ็พŽ็คผๅ“๏ผŒๆ•ฐ้‡ๆœ‰้™ๅ……ๅ€ผๆœ‰็คผ ๅ……xxxx...\n79002 0 ๅขžๅผ€125ใ€128ใ€133่ทฏๅคœ็ญ่ฝฆ\n79003 1 ็ดงๆ€ฅ้€š็Ÿฅ๏ผšๅ€ผๆญคไธ‰ๅ…ซไน‹้™…๏ผŒ็‰นๅ‘ๅ‡บๅ‹ๆƒ…ๆ็คบ๏ผŒๆณฐๅ’Œๅ›ฝ้™…Lilyๅฅณ่ฃ…ๅ…จๅœบๆปกxxxๅ‡xx๏ผŒๆ€ปๆœ‰ไธ€ๆฌพ้€‚ๅˆ...\n79004 0 ๆ˜ฏๅ› ไธบๆˆ‘ๅŒๆ ท้š่—ไบ†ไธ€ไธช็œŸ็›ธ\n1000 Processed\n classify content\n79500 0 ๆน–ๅ—80ๅŽๅฅณๅ‰ฏๅธ‚้•ฟ็Ž‹ๅฟใ€ๅนฟไธœ27ๅฒๅ‰ฏๅŽฟ้•ฟๆฑชไธญๅ’ๅˆ่ฟ›ๅ…ฅๅ…ฌไผ—่ง†้‡Ž\n79501 0 ไธบMacไธŽLinuxๅนณๅฐๆไพ›ๅŸบไบŽ\n79502 0 ไธๅฐๅฟƒๆŠŠๅ„ฟๅญไป–็ˆธ็š„็”ต่„‘ๅผ„็Žฏไบ†\n79503 0 ็”จๆ‰‹ๆœบๅฏ็งŸ่ฟ˜่‡ช่กŒ่ฝฆๅคชๆน–้ฉฟ็ซ™่ฅฟๅฑฑๆฎตๅฏๅŠจ่ฟ่ฅ\n79504 0 xxxxๅนดxๆœˆxๆ—ฅๅ‡Œๆ™จx็‚น้’Ÿๅœจ็”Ÿๅ‘ฝ็ง‘ๅญฆ้™ขๅœฐ้“ไปŽ้•ฟ้€”ๅคงๅทดไธ‹่ฝฆๅŽๆ‹›ๆฅไธ€่พ†ๅ‡บ็งŸ่ฝฆ\n1000 Processed\n classify content\n80000 1 (x/x)ๆ„Ÿ่ฐข่‡ด็”ตๅŒ—ไบฌๆ–ฐไพจ่ฏบๅฏŒ็‰น้ฅญๅบ—ใ€‚ๅ‡ญๆญค็Ÿญไฟกๆฏๅฏ็‰นไปทไบซๅ—xxๅ…ƒ้‡‡่‡ชๅœฐไธ‹xxxx็ฑณ็š„ๅคฉ็„ถๆธฉๆณ‰...\n80001 0 50ใ€60ๅฒ็š„ๅฅณไบบๆ˜ฏ้ซ˜ๅฐ”ๅคซ?\n80002 0 
ๅˆ˜ๆŸๅ้ซ˜้“ๅˆฐๅธธๅทžไธŽๅฐ่็บฆไผš\n80003 0 SMไฝ ไธบไป€ไนˆ่ฟ˜่ƒฝ่ฟ™ไนˆ็†็›ดๆฐ”ๅฃฎๅœฐ่ตท่ฏ‰\n80004 0 ไธญ็บฟ่‡ณๅฐ‘่ฆ่งฆๅŠ3100่‡ณ3072็‚นๆˆ–ๆ›ดไฝŽ2800็‚น\n1000 Processed\n classify content\n80500 0 ไธป่ฅๆฐดๅค„็†่ฎพๅค‡ใ€ROๅๆธ—้€ใ€ๆœบๆขฐ่ฟ‡ๆปคๅ™จ็ญ‰\n80501 0 ๅฝ“ๅ‡็ง‘ๆŠ€็ญนๅˆ’ๆŠ•่ต„ๅ›ฝๅค–้ซ˜็ซฏ้•้’ด้“ๆญฃๆžๆๆ–™\n80502 0 ๆˆ‘ๆœ‰ใ€Ž2015้ ‚็ดš็”Ÿๆดปๅฑ•ใ€้–€็ฅจ\n80503 0 ็ฌฌไธ‰่พ†Txxไธ็”จๅคš่ฏด่ฟ‡ๅผฏไธๅ‡้€Ÿ\n80504 0 ้™คๅค–่ฟ˜ๆœ‰2GRAMๅ’Œ16GROM\n1000 Processed\n classify content\n81000 0 ๆœๅŠก็ƒญ็บฟxxxxxxxxxxxๆ›นๅฉท\n81001 0 ๅ…ถไธญx่ตทไธฅ้‡็š„ไบค้€šไบ‹ๆ•…้€ ๆˆxไบบๆญปไบก\n81002 0 ๆ‰‹ๆœบๅ’Œ้ฉฑ้ญ”ๅธˆๅœจไธ€ๅˆ†้’Ÿๅ†…ๅŒๆ—ถๅ‘่ดงไบ†\n81003 0 HoloLens่ฟ˜ๅค„ๅœจๅผ€ๅ‘้˜ถๆฎต\n81004 0 ๅนถไธ”่กจ็คบ่ฆๅŽป่Šฑๅƒ้ชจ็”ต่ง†ๅ‰ง้‚ฃ้‡Œๆด—ไธ€ไธ‹่„‘โ€ฆโ€ฆโ€ฆโ€ฆโ€ฆโ€ฆโ€ฆโ€ฆ\n1000 Processed\n classify content\n81500 0 ่ฐทๆญŒ่ก—ๆ™ฏๅทฒๆ‹ๆ‘„่ฟ‡ไธ–็•ŒไธŠไธๅฐ‘ๅฅ‡ๅผ‚็š„ๆ™ฏ่‰ฒ\n81501 0 ๆ—ถ็ฉบ็ฉฟ่ถŠใ€ไฝ“็ป†่ƒž็‹ฌ็ซ‹ๅ…‹้š†ใ€ๅ็‰ฉ่ดจๆญฆๅ™จใ€็ƒ‚ๅคง่ก—็š„้ฃž่กŒๆฑฝ่ฝฆ้ƒฝๆฒกๆœ‰ๅ‡บ็Žฐ\n81502 0 ๆฌข่ฟŽๅŠ ๅ…ฅๅพฎ่ฝฏWindows\n81503 0 ๆฅไบ†่ฟ™ไนˆๅคšๆฌกๅฐฑไธญๆ„่ฟ™็š„ๅน้ฃžๆœบๅธฆ่“ๅ…‰ๆ—ถๅฐšๅˆๆด‹ๆฐ”ๅ…‰ๆƒณๅธฆๅ›žๅฎถ\n81504 0 ๅนถๅฏนๅ…ถ่ถ…ๅ‘˜่ฟๆณ•่กŒไธบๅค„200ๅ…ƒ็ฝšๆฌพ\n1000 Processed\n classify content\n82000 0 ่ขซๅ‘Š๏ผšๆˆ‘ๆฒกๆœ‰่ƒฝๅŠ›ๆ”ฏไป˜ๅขžๅŠ ่ดน\n82001 1 ๅ†œ่กŒ๏ผšxxxxxxxxxxxxxxxxxxxๅขๅคฉๅฟ \n82002 0 ๅฝ“ๅนดๆ”ฟๅบœๆ˜ฏ6ๆœˆ็ป™ๆˆ‘ๅผ€็š„็ฆป่Œ่ฏๆ˜Ž\n82003 0 ๅ—ไบฌๅคšๅฎถๅ…ฌๅ›ญๆ™ฏๅŒบ้ƒฝๅ‡†ๅค‡ไบ†่ฟŽๆ–ฐๅนด็š„่Š‚ๅบ†ๆดปๅŠจ\n82004 0 ่ฎค่ฏไฟกๆฏไธบโ€œๆทฎๅฎ‰ๆฐๆฃฎ็”ตๅญๅ•†ๅŠกๆœ‰้™ๅ…ฌๅธๆ€ป็ป็†ๅŠฉ็†โ€\n1000 Processed\n classify content\n82500 0 โ€”โ€”ๆญๅทžๆขฆๆน–ๅฑฑๅบ„ๆธ…ๅนฝๅฆ‚็”ป็š„ไบบ้—ดไป™ๅขƒ\n82501 0 ๆฒณๅŒ—ๅ…จ็œๆณ•้™ขๅฎก็ป“็Žฏๅขƒ่ต„ๆบ็ฑปไธ€ๅฎกๆกˆไปถๅ…ฑ่ฎก720ไฝ™ไปถ\n82502 1 xxxx xxxx xxxx xxxx xxxๅปบ่กŒ๏ผŒๆˆทๅ๏ผšไธๅˆš\n82503 0 ่ฟ™ไธชโ€œstaycoolโ€็œŸ็š„ๅบ”่ฏฅๆ”พๅœจๅœฐ้“็ซ™้‡Œไนˆ\n82504 0 ไธŽๅคงๅฎถๅช่ƒฝๅˆ†ไบซๅ‰ๅŠๆฎตๅŽๅŠๆฎตๅคช่ก€่…ฅ่ฟ˜ๆ˜ฏไธ็œ‹็š„ไธบๅฅฝ\n1000 Processed\n classify content\n83000 0 ่‹นๆžœใ€youtobe่ฟ˜็ปง็ปญๆดป็š„ๅฅฝๅฅฝ็š„\n83001 0 
ๆˆ‘็Ÿฅ้“ๆœ€่„็š„็œŸ็›ธไธ€่ˆฌ้ƒฝๆŽฉ่—ๅœจๆœ€ๆผ‚ไบฎ็š„่กจ้ข้‡Œ\n83002 0 ๅนถไธ”ๅฝปๅบ•ๆŠ›ๅผƒๆจกไปฟๆ—ง็‰ˆMacOSๅค–่ง‚็š„่ฎพ่ฎก\n83003 0 ๅฎถ้‡Œ็”ต่„‘ไธŠๅฎŒๅพฎๅšๅฟ˜่ฎฐ้€€ๅ‡บไบ†\n83004 0 ็”ตๆขฏไธๅคนๆญปๅ‡ ไธชไบบๅฐฑๆฒกไบบๆŸฅไบงๅ“่ดจ้‡้ฃŸๅ“ไธไธญๆฏ’ๅฐฑๆฒกไบบ่ฏด้ฃŸๅ“ๅฎ‰ๅ…จไบบไธบไป€ไนˆๆ€ป่ฆไป˜ๅ‡บไปฃไปทไน‹ๅŽๅ†ๅŽปๅผฅ่กฅ่ฟ™...\n1000 Processed\n classify content\n83500 1 ็ฆๅปบๅคงๅž‹็Žฉๅ…ทๅŽ‚ๅคง้‡ๆ‹›่˜xx๏ฝžxx๏ผˆxxxx.x.xๅ‰๏ฝžxxxx.x.xๅŽ๏ผ‰ๅ‘จๅฒ็”ทๅฅณไฝœไธšๅ‘˜xx...\n83501 0 ๅ˜ๆˆ2=1+1ไบ†โ€ฆโ€ฆ่ฏดไฟ—็‚นไฟกๅฟƒๅฐฑๆ˜ฏ่ตš้’ฑๆ•ˆๅบ”\n83502 1 ใ€‚xๆœˆxๆ—ฅไธ€xๆœˆxๆ—ฅๅ…จๅœบๅ››ไปถๅ…ซๆŠ˜๏ผŒ้€Ÿ้€ŸๆฅๆŠขๅงไบฒไปฌ๏ผŒไธ‡่พพไฝฐ่‰้›†ๅœจๆญค็ญ‰ๅ€™ๆ‚จ็š„ๅ…‰ไธดใ€‚\n83503 1 ๅฆน๏ผŒๅ˜‰ๅฎๅฅถ็ฒ‰xๆœˆxxๆ›ฐๅœจ็ˆฑๅฉดๅฎคไธŠๆžถ้”€ๅ”ฎไบ†๏ผŒไฟƒ้”€ๅŠ›ๅบฆๅพˆๅคง่ดญไนฐx็ฎฑxๅฌ็ซ‹ๅ‡xOOๅ…ƒ๏ผŒๅœจๅ…ถไป–ๅฎๅฎๅบ—...\n83504 1 ๅคงๅŠจไฝœ๏ผŒ็œ‹ๆ ผๅŠ›๏ผŒๆ ผๅŠ›็ฉบ่ฐƒๅ…จๅนดๆœ€ๅบ•ไปท๏ผŒๅ†้ข„ๅญ˜xxๅ…ƒๆŠตxxxๅ…ƒ๏ผŒๆดปๅŠจๆ—ถ้—ดxๆœˆxไธ€xๆ—ฅ๏ผŒๅฑ้”ฆๅนฟๅœบ่ฝฌ...\n1000 Processed\n classify content\n84000 0 ๆˆ˜ๅœฐๅฆๅ…‹ไฝœ่€…ๆ˜ฏyax\n84001 1 ๆƒ ๅทžๅ–„่žไฟกไปฃๆœ‰้™ๅ…ฌๅธxๆœˆxๅทๆญฃๅผไธŠ็ญ๏ผŒๆœ‰้œ€่ฆ่ต„้‡‘ๅ‘จ่ฝฌ็š„ๆœ‹ๅ‹ๆฌข่ฟŽๆฅ็”ตxxxxxxxxxxx็Ž‹็ป็†ใ€‚\n84002 0 ๅŽฆ้—จๅทฅๅ•†ๆ—…ๆธธๅญฆๆ ก14็บงๅฐๅ…ฌไธพ\n84003 0 2014ๅนดๅ—ๅ—ไบฌ**ๅญฆ้™ขๅง”ๆ‰˜ๆ‰“้€ ็บชๅฟตไฝฉๅ‰‘\n84004 0 ๆ›ดๆœ‰idea็š„ๅฐๅ›ข้˜Ÿๅฐ†ไธปๅฏผVR\n1000 Processed\n classify content\n84500 0 ๆฑŸ่‹160ๅฎถไธŠๅธ‚ๅ…ฌๅธ่”ๅˆๅฃฐๆ˜Ž\n84501 0 ๅˆ›้€ ๅ˜ๅŒ–ๅ’Œๆ”น่ฟ›่ฎพ่ฎกๆ˜ฏไธ€ไธช้‡่ฆ็š„ๅทฅๅ…ท\n84502 0 ่ฏดๅฅฝ่ฟ˜ไฝ ็š„้ซ˜้€ผๆ ผๆ‰‹ๆœบๅคง็‰‡ๆฅไบ†\n84503 0 ็ฌฌไธ€ๆกไพฟๆ˜ฏโ€”โ€”ๆœบๅ™จไบบไธๅพ—ไผคๅฎณไบบ็ฑป\n84504 0 ๆœ‰ไธชไบบ่ฃ…้€ผ่ฏด่‡ชๅทฑๆ˜ฏๆŸ้›†ๅ›ข็ปงๆ‰ฟไบบ\n1000 Processed\n classify content\n85000 0 ๆœ‰ๅ…ด่ถฃ็š„ๅŠ ๆ‰ฃ1056527483\n85001 0 ๆ…ˆไธ–ๅนณๅฐฑๆ˜ฏๅฒไธŠๆœ€โ€œtoughโ€็š„ๅฎถไผ™\n85002 0 ไน˜ๅ็ ด็ ด็š„ๅœฐ้“ๅŽปTourMontparnasse\n85003 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็ฌฌๅ››ๅญฃๆฒกๆœ‰ไป€ไนˆๅฅฝๆญŒๆ‰‹\n85004 1 ไธœ้ญ…ๅจฑไนไผšๆ‰€ๆ–ฐๅนดๆ„Ÿๆฉๅ›ž้ฆˆๅ…จๅœบๅ•ค้…’ไนฐx็ฎฑ้€xxๆ”ฏ๏ผŒ็บข้…’ไนฐxๆ”ฏ้€xๆ”ฏ๏ผŒๅฆๅค–ๆฏ้—ดๅŒ…ๅŽขไผš็ป™ๆ‚จๅธฆๆฅไธ€ไธช...\n1000 Processed\n classify content\n85500 0 
xxxๅคšๅๅฎถ้•ฟๅ’Œๅฐๆœ‹ๅ‹ๅˆฐๅœบๅ‚ๅŠ \n85501 0 ๅฅฝๅƒAmazonไธญๅ›ฝๅผ€้€šๅ…จ็ƒ่ดญไน‹ๅŽ\n85502 1 ๆฒชๅคช่ทฏ่ฟ‘็บฌๅœฐ่ทฏๆฒฟ่ก—่ฝฌ่ง’ไฝ็ฝฎ๏ผŒxxx.xxๅนณ็ฑณ๏ผŒ็งŸ้‡‘xxไธ‡๏ผๅนด๏ผŒ็งŸๅฎขไธบ:ไธญๅ›ฝ็งปๅŠจ๏ผŒ่‰ฏๅ‹่ถ…ๅธ‚๏ผŒ้›ท...\n85503 0 ๅ‘็”ตใ€ๆ—ฑ็จปใ€ไธญ่ฏ็ซ‹ไฝ“ๅ‘ๅฑ•ๆจกๅผ\n85504 0 ๆ—ฉไธŠ7๏ผš30ๅˆ†ๅˆฐๅธƒๅฐ”ๆดฅๆฑฝ่ฝฆ็ซ™ๅ่ฝฆๅŽป็ฆพๆœจ\n1000 Processed\n classify content\n86000 0 ๆœฌไนฆๆ˜ฏไธ€ๆœฌ่ฎฐๅฝ•ๆ–ฐไธญๅ›ฝ20ๅนด่ต„ๆœฌๅธ‚ๅœบๅ‘ๅฑ•ๅฒ็š„่‘—ไฝœ\n86001 0 ๅทฒไปŽ่ถ…ๅธ‚ๅ“ญๆ™•็”ตๆขฏๅœจๅ“ญๆ™•ๅˆฐๅŽ•ๆ‰€\n86002 0 ๅพฎไฟก่ฆๅšๆ™บ่ƒฝๆœบๅ™จไบบโ€œๅฐๅพฎโ€\n86003 0 ็Žฉๅ็š„ๆœบๅ™จไบบๅฟซๆฅๅ›ด่ง‚ๆˆ‘็š„็ฒพๅฝฉๅพฎ่ง†้ข‘\n86004 0 ๆต™ๆฑŸ็‘žๅฎ‰็š„ๅญ™ๅฅณๅฃซๅธฆ2ๅฒๅ„ฟๅญๅˆฐๅ…ฌๅ›ญ็Žฉ่€\n1000 Processed\n classify content\n86500 0 ่ฎพ่ฎกๅธˆๅฏน่ฟ‡ๅŽป็š„ๅˆทๅญ่ฟ›่กŒไฝฟ็”จไธŠใ€้€ ๅž‹ไธŠ็š„่€ƒ็ฉถๅŽ่ฟ›่กŒๆ”น่‰ฏ่ฎพ่ฎก\n86501 1 ่ฏๅ’Œ้“ถ่กŒๅกๆˆ–ๅญ˜ๆŠ˜ใ€‚ๅฆ‚้œ€้ข„็บฆ่ฏท่‡ณ็”ต่ฏ๏ผŒ้ข„็บฆๅŽๆ— ้กปๆŽ’้˜Ÿๅฎ‰่ฃ…๏ผŒๆŠฅ่ฃ…ๅœฐๅ€๏ผš้“ถๆณ‰ๅŒ—่ทฏๅนฟ็”ตๅคงๆฅผไพง๏ผˆๅณๅธ‚ๆ”ฟ...\n86502 0 ไบ”ๅŽๅŸŽ็ฎกๆ™ฎๅ‰ๆ‰งๆณ•ไธญ้˜Ÿ้›†ไธญๅŠ›้‡ๅ…จๅ‘˜ๅ‡บๅŠจ\n86503 0 ่ฐไธๆƒณไธ็”จๆŠ•่ต„ๅฐฑๅฏไปฅ่ตš้’ฑ็š„็”Ÿๆ„\n86504 0 ๆผ”ๅ”ฑไผšๅŽ้—็—‡ไธญ่ฟ˜ๆœ‰ๆต้ผปๆถ•่ฟ™ไธ€็—‡็Šถๅ˜›\n1000 Processed\n classify content\n87000 1 ใ€ๅˆฐๅทดๆฏ”ไบšๅŠๅฑฑๆฌขไน่ฟ‡ๅ…ƒๅฎตใ€‘xๆœˆxๆ—ฅ-xๆ—ฅ๏ผŒๅˆฐ็Žฐๅœบ็Œœ็ฏ่ฐœใ€ๅŒ…ๅ…ƒๅฎตใ€่ตข็คผๅ“ใ€‚xๅทๆฅผไธญๅบญๆ™ฏ่ง‚ๅทจ่‘—้ข...\n87001 0 ๅœจๆœบๅœบ้‡ๅˆฐไบ†TimeZ็ป“ๆžœไธ่ฎค่ฏ†\n87002 0 ๅฐผ็Ž›ๅ•ŠT^TT^Tๆˆ‘ๆ˜ฏๅฌๅŠ›ๆœ‰้—ฎ้ข˜ไธๆ˜ฏ้ผปๅญๆœ‰้—ฎ้ข˜ๅ•ŠT^Tไธบๅ•ฅ็ป™ไบ†ๆˆ‘ไธ€ไธช้ผป็‚Ž็š„่ฏT^Tไฝ ๅœจ้€—ๆˆ‘ๅ˜›\n87003 0 ไป€ไนˆๆœŸ่ดงๆœŸๆƒFOF็ฆปๅฒธๅŸบ้‡‘QDII\n87004 0 ็งไธ‹้‡Œๅดๆ‰ฟ่ฎค่ฟ™ไบ›่‚ก็ฅจๆ˜ฏๅžƒๅœพ\n1000 Processed\n classify content\n87500 0 ๅœจ่ฟ™ๆ ทไธ€ไธชๆœ‰ๅ›พๆœ‰็œŸ็›ธ็š„ไธ–็•Œไธญ\n87501 0 ๆœ€้ซ˜ไบบๆฐ‘ๆณ•้™ขๅทฒ็ป้ข‘้ข‘ๅšๅ‡บๆ˜Ž็กฎ่กจๆ€\n87502 0 ไป–่ƒฝไธ่ƒฝๅฎž็Žฐๅผ€้ฃžๆœบ็š„ๆขฆๆƒณๅ‘ข\n87503 1 xxxๆœŸ็ฒพ็‰ˆไธชๅไฝ๏ผŒไบบ่™ฝๅฐ๏ผŒๆŽ’ๅๅคง๏ผŒๆ็คบ๏ผŒๅŒ่ฟ›ๅŒๅ‡บ๏ผŒ้€ไฝ ไธ€ๅฅ่ฏ๏ผŒ่บซๆ€€็ปๆŠ€ๆ‰ฌๅ››ๆ–นใ€‚[่€ๅ…ฌ:xx...\n87504 0 ๆ— 
่Š็œ‹่‡ชๅทฑๅพฎๅš็š„็ฒ‰ไธ็„ถๅŽๅ‘็Žฐๆœ‰ไธช่ดฆๅทๆ˜พ็คบๆ‰‹ๆœบ่”็ณปไบบ๏ผšXXXไป–็š„ๅคงๅ\n1000 Processed\n classify content\n88000 0 8ๆ‰นๆฌกไบงๅ“้žๆณ•ๆทปๅŠ ็ฆ็”จ็‰ฉ่ดจใ€่ฟ่ง„ไฝฟ็”จ้™็”จ็‰ฉ่ดจโ†“ๆฐฏๅ€ไป–็ดขไธ™้…ธ้…ฏๅฑžไบŽ็ณ–็šฎ่ดจๆฟ€็ด ็ฑป็‰ฉ่ดจ\n88001 1 ไธ–็•Œๅ› ๅฅณไบบ่€Œๅˆ†ๅค–็พŽไธฝ๏ผใ€ๅฎน่พฐๅ“ฅๅผŸใ€‘ๆๅ‰็ฅๆ‚จๅฅณไบบ่Š‚ๅฟซไน๏ผๆˆ‘ไปฌ็‰นไธบๆ‚จๅ‡†ๅค‡ไบ†ๅคง้‡ๆ˜ฅๆฌพๅŠ่Š‚ๆ—ฅๅฅ—่ฃ…๏ผŒๅฟซ...\n88002 0 ็œ‹ๅฎŒไบ†ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณๅ†ๆฅ็œ‹็œ‹็พŽๅ›ฝ็š„ๅง~็พŽๅ›ฝ็š„ๅฏผๅธˆ่ฝฌ่บซๅคชๅ†ท้™ไบ†\n88003 0 ไธๆ‰พๅ‡บ็—…ๆฏ’ๅˆฐๅบ•ๆ˜ฏ่ฐไผš่ฟ›่กŒๆฐธๆ’็š„ๆธธๆˆ\n88004 0 ไธ€ๅไธญๅนดๅ•†ๅœบไฟๆดๅทฅ่ขซB2่‡ณB1ๅฑ‚็š„่‡ชๅŠจๆ‰ถๆขฏๅคนไฝ่…ฟ้ƒจ\n1000 Processed\n classify content\n88500 0 ไธๅŽปๆ‰“็”ต่ฏๅซ็‰ฉไธšๆˆ–่€…็ป™ๅฎถไบบๆฅ่งฃๅ†ณ\n88501 0 ็ˆฌ่ตทๆฅๅผ€็”ต่„‘ๅ……ๅ€ผๅ…ƒ็ตไน‹ๅŠ›โ€ฆโ€ฆ่ฆๆ˜ฏๆ‰‹ๆธธไธŠ็›ดๆŽฅ่ƒฝๅ……ๅ€ผๅฐฑๅฅฝไบ†โ€ฆโ€ฆ\n88502 0 ๆœ€้‡่ฆ็š„ๆ˜ฏๆฏไธชไบบ้ƒฝๅฏนๆ—…ๆธธๅฎ‰ๆŽ’้žๅธธๆปกๆ„\n88503 0 ็”ฑ่ฑชๅŽ็š„็ฉบๅฎขA330โ€”200ๅž‹้ฃžๆœบๆฏๅ‘จไธ€ใ€ไธ‰ใ€ไบ”ๆ‰ง้ฃž\n88504 0 ๅฏไปฅๅ’ŒSpringๆก†ๆžถๆ— ็ผ้›†ๆˆ\n1000 Processed\n classify content\n89000 0 ่ฟ™ๅ—ไธšๅŠกๆญฃ้€ๆธ่ขซ้ƒจๅˆ†ๅŸบ้‡‘ๅ…ฌๅธๅ€š้‡\n89001 1 ่”้€šๅ…ฌๅธ้ซ˜่‰ณ็บข็ฅๆ‚จๅคฉๅคฉๆœ‰ไธชๅฅฝๅฟƒๆƒ…๏ผ็ŽฐๆŽจๅ‡บใ€‚้ข„ๅญ˜xxxๅ…ƒใ€ๆœˆไบคxxๅ…ƒๅฏไบซxxMๅ…‰็บคๅ…่ดน็”จใ€ๅซx...\n89002 0 ไฝ ๅฆˆ่Šฑๅƒ้ชจ27้›†้‡Œ่ฟ™ๆ˜ฏไป€ไนˆ้ฌผ\n89003 0 ๅ•†ไธš้ข†่ข–ๅฅณๅผบไบบไบ’่”็ฝ‘+่ทŸ้šๅ†…ๅฟƒ็š„ๅฃฐ้Ÿณ\n89004 0 ๅฎžไน ็”Ÿ็š„ๆˆ‘่ถŠๆฅ่ถŠๆ€ฅๆฒกๆœ‰ๅฎขๆˆท\n1000 Processed\n classify content\n89500 0 ๆน–ๅŒ—่ญฆ่ฝฆๅคฑๆŽงๆ’žไผคไธคไบบๆญฃๆŸฅๆ˜ฏๅฆ้…’้ฉพ\n89501 0 ๅœๅœจไธป็•Œ้ขๅฌ็€BGMๆ•ดไธชไบบ้ƒฝ่ขซๆฒปๆ„ˆไบ†\n89502 0 ๆญๅทžๆ€งไปทๆฏ”ๆœ€้ซ˜่ฃ…ไฟฎๆœๅŠกๅนณๅฐ\n89503 0 ็Žฐไปฃๅปบ็ญ‘ๅ’Œไธ‘้™‹็š„ๆฌงๅผๅปบ็ญ‘ๅ †็งฏๅ‡บๆฒกๆœ‰็ต้ญ‚็š„ๆตฎๆฌข\n89504 0 Alice็š„ๆ‰‹ๆœบ้“ƒๅฃฐ?ๆˆ‘ๆญฃๅœจๆ”ถๅฌP\n1000 Processed\n classify content\n90000 1 ๏ผŠxxx่šไผš้ฃŽๆšดๆฅไบ†๏ผŒไฝ ๅšๅฅฝๅ‡†ๅค‡ไบ†ๅ—๏ผŸ็ญๅฐ”ๅฅ‡ๅ…จๅฑ‹ๅฎšๅˆถๅฎถๅ…ท่ฏš้‚€ๆ‚จๅ‰ๆฅๅ“้‰ดใ€Š่ฎพ่ฎก็”ฑไฝ ๏ผŽๅ…จๆƒๅšไธปใ€‹...\n90001 1 ่ฏšไฟกxๆœˆ๏ผŒๆ„Ÿๆฉไธ€ๅ…ƒ่ดญโ€้พ™้—จไฝณๅฑ…ไน๏ผˆๆฉฑๆŸœใ€่กฃๆŸœใ€้—จไธšใ€่ฝฏๅŒ…๏ผ‰้’œๆƒ 
่Š‚๏ผๆ–ฐๅนด็บขๅŒ…้€ไธๅœ๏ผŒไนฐไธ€้€ไธ€ใ€...\n90002 0 S20ๅค–ๅœˆๅพ€่™นๆกฅๆœบๅœบๆ–นๅ‘่ฟ‘็œŸๅ—่ทฏไธŠๅฃ่ฝฆๆต้‡ๅคง\n90003 0 ่ฟ™ไธชไธคไบบ็›—็ชƒๅ›ขไผ™่ขซๆฒณไธœๅ…ฌๅฎ‰ๅˆ†ๅฑ€ๆˆๅŠŸๆŠ“่Žท\n90004 0 xxๅฐๆ—ถๅ†…ๆˆ‘ๅŽฟ้˜ต้ฃŽๅฏ่พพx็บงไปฅไธŠ\n1000 Processed\n classify content\n90500 0 xx%็š„ๅ—ๅฎไบบๅฎถ้‡Œ็š„้”้ƒฝไธๅฎ‰ๅ…จ\n90501 0 xxใ€ๅŒป็–—ๅ™จๆขฐ็ง‘ๆ™ฎๆดปๅŠจ่ตฐ่ฟ›ๆฑŸๅคง\n90502 0 ๆ˜จๆ™šไธŠๅฎถ้‡Œ่ฟ›ๅฐๅทๆ‰‹ๆœบ็”ต่„‘ไป€ไนˆ้ƒฝ่ขซๅทไบ†\n90503 0 ไธบไบ†ๆต‹่ฏ•่ฃ…ไฟฎๅŽ็š„ๆ–ฐ่ฏ•ๅฌๅฎคๆ•ˆๆžœ\n90504 0 ๆˆ‘ๅฎถๅฐ็‹—็š„็ป†ๅฐ็—…ๆฏ’ๆฒป็–—็ป้ชŒ\n1000 Processed\n classify content\n91000 0 ๆ‰€ๆœ‰็ผด็บณๅŒป็–—ไฟ้™ฉ็š„ๅ‘˜ๅทฅๅฎšไบŽ2015ๅนด8ๆœˆ2ๆ—ฅไธŠๅˆๅœจไธญๅฟƒๅŒป้™ขไบŒ้—จ่ฏŠ่ฟ›่กŒ่‚ฒ้พ„ๅฆ‡ๅฅณๆฃ€ๆŸฅ\n91001 0 10ๆ”ฏๆ–‡่‰บ้˜Ÿ็บท็บทๆ‹ฟๅ‡บไบ†่‡ชๅทฑ็š„โ€œ็œ‹ๅฎถโ€่Š‚็›ฎ\n91002 0 ไธ€ไธ‡ๅคšๅๆน˜่ฅฟโ€œๅœŸๅŒชโ€ๅœจไบ”้›ถๅนดไปฃ่ตฐไธŠไบ†่‡ชๆˆ‘ๆ•‘่ตŽไน‹่ทฏxxxxๅนด\n91003 0 ๆžœ็œŸๅ›žๆฅๆ‰ๆœ‰ๅฎถไบบ็š„ๆ„Ÿ่ง‰็ฌฌไธ€ๅคฉไธญ่ฏ่ฟ˜ๅฅฝๅชๆ˜ฏ้บป็ƒฆ็‚นๆฒกๆƒณ่ฑกไธญ้‚ฃไนˆ้บป็ƒฆ\n91004 0 ไธ€ๆœŸ่Š‚็›ฎไธ่ฎบๅฎƒๅ†…ๅน•็ฉถ็ซŸๅœจๅ“ช้‡Œ\n1000 Processed\n classify content\n91500 0 ่‹ๅŒ—ๅŒป้™ขๅฐฑๅœจ้™„่ฟ‘ไฝ†ๆ˜ฏ120ๆฅ็š„ๅพˆๆ…ข\n91501 1 ๆ„Ÿ่ฐข่‡ด็”ตๆฝๅŠๅฆ‚ๅฎถไธœ้ฃŽไธœ่ก—ๅบ—๏ผŒ้…’ๅบ—ไฝไบŽๅฅŽๆ–‡ๅŒบไธœ้ฃŽไธœ่ก—xxxๅท-ไธ–็บชๆณฐๅŽๆญฃๅฏน้ข๏ผŒ็ŽฐๆŽจๅ‡บไผ˜ๆƒ ไฟƒ้”€ไปท...\n91502 1 ๆ‚จๅฅฝ๏ผ้พ™ๆนพ่ฟๅŠจไผšๆ‰€๏ผŒๅฐ†ไบŽxๆœˆxๆ—ฅxx็‚น๏ผŒไธพๅŠžไธ‰ๅ…ซๅฅณไบบ่Š‚ๆดปๅŠจ๏ผŒ็‰น่ฏทๅŒ—่พฐไธญๅŒป้™ขไธปไปปๅŒปๅธˆๅˆฐๅœบไธบๅ‰x...\n91503 0 ๆœ€้ซ˜ไบบๆฐ‘ๆณ•้™ขๅ…ณไบŽๅฎก็†ๆ‹’ไธๆ‰ง่กŒๅˆคๅ†ณใ€่ฃๅฎšๅˆ‘ไบ‹ๆกˆไปถ้€‚็”จๆณ•ๅพ‹่‹ฅๅนฒ้—ฎ้ข˜็š„่งฃ้‡Š\n91504 0 ๅฑ…็„ถๆœ‰ไบบ่ฏดๅถๅƒๆฅไบ†ๆ‹‰้ฃžๆœบๆŠ„่ขญๆž้™ๆŒ‘ๆˆ˜ๆˆ‘็š„ๅฆˆๅ‘€ๆž้™ๆŒ‘ๆˆ˜ๅฐฑๆ˜ฏไธชๆŠ„่ขญ็š„่Š‚็›ฎ้ƒฝๆ˜ฏๆŠ„้Ÿฉ็ปผ้—ฎ้ข˜ๆ˜ฏๅคฎ่ง†่ฟ˜ไนฐ...\n1000 Processed\n classify content\n92000 0 GoogleChrome้€Ÿๅบฆๆต‹่ฏ•ๆ…ข้€Ÿ้•œๅคดๅ‘Š่ฏ‰ไฝ ๆ‰“ๅผ€google็š„ๆ—ถๅ€™ๅ‘็”Ÿไบ†ไป€ไนˆ\n92001 0 ่ฟ™็งๅšๆณ•ๆ˜ฏๅฆ็ฌฆๅˆๅ…ฌๅฎ‰ๅ…ฅๆˆทไธ่ƒฝๅ’Œ่ฎกๅˆ’็”Ÿ่‚“ๆŒ‚้’ฉ\n92002 0 ็™ฝๅญ็”ป่Šฑๅƒ้ชจๅ†ๆฌก่ฟ”ๅ›žๅ‡ก้—ดๆœจๅฑ‹\n92003 0 ๅ…ถไธญๆต™ๆฑŸๅคงๅญฆๅธธๅŠกๅ‰ฏๆ ก้•ฟๅฎ‹ๆฐธๅŽๅ’Œๅ“ˆๅฐ”ๆปจๅทฅไธšๅคงๅญฆๅ‰ฏๆ 
ก้•ฟ้Ÿฉๆฐๆ‰ๅ‡ๆฅ่‡ชๆฉ้˜ณ\n92004 0 ๆต™ๆฑŸๅซ่ง†็š„่Š‚็›ฎๅ‰ช่พ‘่ถŠๆฅ่ถŠ็œ‹ไธๆ‡‚\n1000 Processed\n classify content\n92500 0 8ๆœˆ12ๆ—ฅๅ‘จไธ‰ไธ‹ๅˆๆ‹ฑๅŒ—ๅœฐไธ‹ๅ•†ๅœบ้ข่ฏ•\n92501 0 xxxx้พ™ๅฉ†ๅคไธ€ๆœŸไฝ›็‰ŒๆˆๅŠŸไฝ›็งฆๅจœๆ‹‰ไฝ›็ฅ–ไบ‹ไธš่ฟๅŠฟๆƒๅจ่ดขๅฏŒๅผ€ๆ‹“่ฟ›ๅ–ไน‹็‰Œ็บข้“œ็‰ˆๅธฆddpraๅกๅ›ฝๅ†…็Žฐ...\n92502 0 ๅŽไฟ้™ฉๆ ไนŸ่ฃ…ๅค‡ไบ†ไธ€็ป„้žๅŠŸ่ƒฝๆ€ง็š„ๆฐดๅนณๆ ผๆ …\n92503 0 xxxxๅนดxๆœˆxxๆ—ฅๅธ‚ๆฐ‘ๅ‘้‡‡่ฎฟ่Šฑ้ธŸๅญ—ไธ“ๅฎถๆธ ๆˆ็š„ๅพๅทž็”ต่ง†ๅฐๅพๅทžๆ–ฐ้—ป็Ž‹ไฝณไฝณใ€่’‹ๅŽ็››่ตžๆธ ๆˆๅคงๅธˆ๏ผš่Šฑ...\n92504 0 ๆˆ‘ๅฆˆๅฎถๆ—ง็”ต่„‘ๅฏ†็ ๅฟ˜ไบ†ๆ€Žไนˆ้ƒฝๆ‰“ไธๅผ€ๆ€ŽไนˆๅŠž\n1000 Processed\n classify content\n93000 0 ๅŽปๅนด็š„้•‡ๆฑŸๅ’ŒไปŠๅนด็š„ๅฅฅไฝ“ๅฎŒๅ…จๅƒไธคไธช็ƒ้˜Ÿ\n93001 0 ๆถๅƒง่ดชๆฑกๆฌพ้กนไบคๅธๆณ•ๆœบๅ…ณๅˆ‘ไบ‹็ซ‹ๆกˆ\n93002 0 ๆ‰‹ๆœบ้ฉฌไธŠๆฒก็”ตไนŸไธ่ƒฝ่ฐทๆญŒ่ตฐๅ›žๅŽป่ฟ˜ๅฅฝๆˆ‘ๅธฆไบ†500ไธ็„ถๆˆ‘ๅฐฑๅฎŒไบ†\n93003 0 ๅˆฉ็”จๆ™š้—ดๅ’Œๅ‡ๆœŸ้’ป็ ”่‚ก็ฅจไธšๅŠกๅ’Œๅธ‚ๅœบ่กŒๆƒ…\n93004 0 ๆ‰ฌๅทžๆ—ฉ้ค่กŒไธš็š„็‰น็‚น้šพ้“ๅฐฑๆ˜ฏๆœๅŠกๆ€ๅบฆๆžๅทฎ\n1000 Processed\n classify content\n93500 0 ๆต™ๆฑŸๆ…ˆๆบชไธ€ไบค่ญฆๅœจๆ‰งๅ‹คไธญ่ขซๅทฅ็จ‹่ฝฆ็ขพๅŽ‹่‡ดๆญป\n93501 0 ไธช่‚กๅˆฉๅฅฝ๏ผšๅฅ็››้›†ๅ›ขๅŠๅนดๆŠฅไธš็ปฉๅขžไบ”ๆˆๆ‹Ÿxx่ฝฌxx\n93502 0 ๅ–ๅคšไบ†็š„ไธ€ไธชๅŽ้—็—‡ๅฐฑๆ˜ฏ็œผ็›ๆฒกๅŠžๆณ•ๅฏน็„ฆ\n93503 0 ๅผ€ๅญฆ้ซ˜ไธ‰ๅŠ ๆฒนlightupthedark??\n93504 0 ๆณฐ็‡ฎไธŽๅบ†ไฟฎๅœจ้ฃžๆœบไธŠๅถ้‡โ€ฆ็”ท็”ทๅ‰ช่พ‘็‰ˆ่ง†้ข‘๏ผš\n1000 Processed\n classify content\n94000 1 ๆ’็ฟ”ๆ•™่‚ฒๆ–ฐๅญฆๆœŸๆŠฅๅๅผ€ๅง‹ไบ†๏ผŒๅ…ƒๅฎต่Š‚ๅ‰ๆŠฅๅไปทๆ ผไผ˜ๆƒ ๏ผŒๅนถๆœ‰็ฒพ็พŽ็คผ็‰ฉ็›ธ้€ใ€‚\n94001 0 ๅƒ็š„้ƒฝๅƒๅ‡†ๆ—ถๅ–ไธญ่ฏไธ€ๆ ทๆถๅฟƒไบ†\n94002 0 ็”จ่ดฟ่ต‚ใ€ๅ“„ๅŠ็š„ๆ–นๅผๆฏไบ‹ๅฎไบบ\n94003 0 ๆ—ฅๆœฌkissme็ซๆฏ›่†๏ฝž??ๅคงๅ้ผŽ้ผŽ\n94004 1 ๅฅ‰ๅŒ–xxๅนด่€ๅบ—(่‰บ่—คๅฑ…๏ผ‰๏ผˆ็ฟก็ฟ ่—คๅ™จ๏ผ‰ ็พŠๅนดๅ–œๆด‹ๆด‹.ๅ› ๅบ—้ข่ฃ…ไฟฎ.ๅ…จๅœบ็‰นไปทๅค„็†ใ€‚ ๆปกxxxx้€x...\n1000 Processed\n classify content\n94500 0 ็ป“ๆžœ่ฐƒไบ†ๅฟซ2g็š„psdไนŸไธๆ•ขๅ…ณ\n94501 0 VillasofPinecrestๅฐๅŒบๆฌกๅงๆ‹›็งŸ\n94502 0 ็„ถๅŽ่…พ่ฎฏๅฐฑ่Žซๅๅ…ถๅฆ™ๅœฐ่ฏดๆˆ‘ๅ‘ๅธƒ่ฏˆ้ช—ไฟกๆฏ\n94503 0 ??????????????????\n94504 1 ๅŒ—ไบฌๅฎœ็พŽๅฎถๅ›ญ่ฃ…้ฅฐๆ–ฐๆ˜ฅ้’œๆƒ 
๏ผŒๅ…จๅŒ…ไฝŽ่‡ณxxx/ๅนณ็ฑณ๏ผŒๆ›ดๆœ‰ๅคš้กนๅคง็คผ็›ธ้€ใ€‚่ฏฆๆƒ…่ฏทๅ’จ่ฏขxxxxxxxx...\n1000 Processed\n classify content\n95000 0 ๅ…ถ่‘—ๅ็š„่ฝปๅทฅไธšใ€ๆ—…ๆธธไธšใ€้…’ๅบ—ไธšๅ’Œๅจฑไนๅœบไฝฟๆพณ้—จ้•ฟ็››ไธ่กฐ\n95001 0 comๆ–ฐ้‡‘็“ถๆข…้พš็Žฅ่ฒ็‰ˆ้ซ˜ๆธ…็™พๅบฆไบ‘็ฝ‘็›˜ไธ‹่ฝฝ\n95002 0 ไธญๅ›ฝ18ๅฒไปฅไธŠๆˆไบบ้ซ˜่ก€ๅŽ‹ๆ‚ฃ็—…็އไธบ18\n95003 0 ๆˆ‘ๆ‰‹ๆœบๆ—ถไธๆ—ถๅฐฑๅกไธ€ๅ›žๆ€Žไนˆๅฐฑไฟๅญ˜ไธไธŠๅ‘ขๅช่ƒฝๅ‘ตๅ‘ตไบ†๏ฝžgood\n95004 0 ๆตฆๅฃๆฃ€ๅฏŸ้™ขๆ˜ฏ7ๆœˆ20ๆ—ฅๅ‘ๆตฆๅฃๆณ•้™ขๆ่ตทๅ…ฌ่ฏ‰็š„\n1000 Processed\n classify content\n95500 0 2015ๅนด7ๆœˆ25ๆ—ฅ่‡ณ8ๆœˆ1ๆ—ฅ้บŸๆธธๅŽฟๅœจๆฒณๆปจๅ…ฌๅ›ญไธพ่กŒ็ฌฌไบŒๅฑŠๅ†œ็‰นไบงๅ“ๅฑ•้”€ไผš\n95501 0 ๅœจๅˆซ็š„ๅฐ็”Ÿๅ‡บ็Žฐ็š„ๅ‡็ˆ†ๆ–™ไธ‹่ฏดโ€œๆฒกๆจๆด‹่ฟ˜่ƒฝ็œ‹ๅ—โ€4\n95502 0 ๆต™ๆฑŸไธญๅ—้ƒจๅœฐๅŒบๅทฒ็ปๅ‡บ็Žฐ้›ทๆšดๅคฉๆฐ”\n95503 0 ็ฝ‘ๆ›ๆฑŸ่‹ๅฆ‚็š‹ๅธ‚ๅฅณๆฐ‘่ญฆๅฎถๅฑžๆณ„้œฒๆŽๆ˜“ๅณฐใ€ๆจๆด‹ไธคๅ่‰บไบบ็š„่บซไปฝ่ฏๅทใ€ๆ›พ็”จๅใ€ๆˆท็ฑๅœฐ็ญ‰ไฟกๆฏ\n95504 0 ๆค’ๆฑŸไฝ“่‚ฒ้ฆ†ๅฐ†ไธพ่กŒ2015้บฆ่ฟชไธญๅ›ฝ่กŒโ€œ็ปˆๆžไธ€ๆˆ˜โ€ๅฐๅทž็ซ™็ฏฎ็ƒ่ต›\n1000 Processed\n classify content\n96000 0 ็‹ฎๅญๅบง็š„่€ๅฆˆๅคช้œธ้“่ฎฉๆˆ‘ๅœจๅฎถๆ‰“ๆ‰ซๅซ็”Ÿๅฐฑ็ฎ—ไบ†็Žฐๅœจ่ฟ˜่ฆๆˆ‘ๅš้ฅญ็ญ‰ๅฅนๅ›žๆฅไนŸไธ็œ‹็œ‹ๆˆ‘ๆ˜ฏ้‚ฃ็ง่ขซๅผบๆƒๅŽ‹ๅ€’็š„ไบบๅ˜›\n96001 0 ๅนณๆ—ถ็œ‹่ฟ‡็š„ๅฎถๅฑ…่ฃ…ไฟฎๅฏนไบŽ็Žฏๅขƒๅˆ›่ฎพไผšๅธฆๆฅๅพˆๅคš็š„Idea\n96002 0 ๅ€ŸๆฌพๅˆฐๆœŸๅŽ่ตตๆŸไป…ๅฝ’่ฟ˜3ไธ‡ๅ…ƒไฝ™ไธ‹็š„ๆœช่ฟ˜\n96003 0 PPLๅคงๆ‰‹็ฌ”ๆŠ•่ต„ๅฅˆไฝ•็›ผ่พพ็ฝ‘่ฟ™ไธชๅๅญ—่ตทๅพ—ๅคชๅœŸไบ†\n96004 0 ๅˆšๅˆฐๅฃ่…”ๅŒป้™ขๅฐฑ็ขฐๅˆฐๅฐๅง‘ๅจ˜ๅ› ไธบๅฎณๆ€•ๆ‹”็‰™ๆ™•ๅ€’ไบ†\n1000 Processed\n classify content\n96500 0 ๆต™ๆฑŸไน‰ไนŒๆœ‰ไธชๅฆˆๅฆˆไธๅฐๅฟƒๆŠŠ่‡ชๅทฑ็š„ๅ„ฟๅญ้”ๅœจๅฎ้ฉฌ่ฝฆ้‡Œไบ†\n96501 0 ๅˆ†ไบซnmgtxl็š„ๅšๆ–‡ๅ›พ็‰‡๏ผš็บข็Ÿณๅด–ๆ—…ๆธธ้ฃŽๆ™ฏๅŒบ\n96502 0 ็™พ่„‘ๆฑ‡็š„็‰ฉไธšๅ…จๆ˜ฏ็‹—ๅจ˜ๅ…ป็š„ๅฉŠๅญ็•œ็”Ÿ\n96503 0 ๅไบ†ไบ”ๅคฉJRๆˆ‘ๆ˜ฏๅ†ไนŸๆฒกๆณ•ๅœจ้ญ”้ƒฝๅๅœฐ้“โ€ฆโ€ฆๅฆๅค–ๅœจไธœไบฌ้…’ๅบ—ไฝๅพ—ๅคช่ˆ’ๆœ\n96504 0 ่ฏดๅœจๅฎถๅซๅฐ็›†ๅฎ‡ไธบ๏ผš่€niangmen\n1000 Processed\n classify content\n97000 0 ๅฅฝๅฃฐ้ŸณไปŠๅคฉๆ™šไธŠๅ‡ ็‚น้’Ÿๅผ€ๅง‹็›ดๆ’ญ\n97001 0 ๆœ€ๅฅฝ็š„crystalinjectorๆฐดๆ™ถๆฐดๅ…‰ไปช็‘žๅฃซๅŽŸ่ฃ…้ฉฌ่พพๆœบ่Šฏ\n97002 0 
่€Œๆ˜ฏๆŠŠๆ‰€ๆœ‰้ƒฝ็‚น็‚นๆปดๆปด่ฎฐๅœจๅฟƒ้‡Œ\n97003 0 ๅธŒๆœ›้€š่ฟ‡่ฟ™็งๅฝขๅผๆžถ่ตทๅŸŽ็ฎกไธŽๅธ‚ๆฐ‘ๆฒŸ้€š็š„ๆกฅๆข\n97004 0 MFAไธ“ไธšๅญฆไฝๅ’Œ้€šๅธธๆ‰€่ฏด็š„โ€œ็ก•ๅฃซใ€ๅšๅฃซๅญฆไฝโ€ๆœ‰ไป€ไนˆไธๅŒ\n1000 Processed\n classify content\n97500 1 ใ€Šไฟๅชณๅฆ‡็ซ้”…ใ€‹้…ฌๅฎพๆดปๅŠจๅผ€ๅง‹๏ผๅ…จๅœบ่œๅ“x.xๆŠ˜๏ผŒๅฑฑๅŸŽๅ›ฝๅฎดใ€้›ช่Šฑๅ•ค้…’ๅ–x้€x๏ผ็”ต่ฏ๏ผšxxxxxx...\n97501 0 ๅฎ้ฉฌ็ญ‰4Sๅบ—็ซ™ๅทฅไฝœ็š„ไธ“ไธšๆŠ€ๆœฏ้ชจๅนฒๅ’Œ่ต„ๆทฑ็ฎก็†ไบบๅ‘˜\n97502 0 2015ๅนดๆˆ‘ๆ กๆฑŸ่‹ๆœฌไธ€็†็ง‘ๆŠ•ๆกฃ็บฟ345\n97503 0 ๅฅฝๅฃฐ้Ÿณๆœ€็พŽๅ’ŒๅฃฐๅŽŸๅ”ฑๅผนๅฅ่‹นๆžœๅ›ญ็ป„ๅˆ้ฃŽ้ก็นๆ˜Ÿ็ฝ‘\n97504 0 S38ๅธธๅˆ้ซ˜้€Ÿ็”ฑๅˆ่‚ฅๅพ€ๅธธ็†Ÿๆ–นๅ‘ๅฎๅธธๆฎตๅœจ137Kๅค„ๆ–ฝๅทฅ็ป“ๆŸ\n1000 Processed\n classify content\n98000 1 โ€œๅฐๅ‡ๅˆๅฎšๅ‘ๅŸนๅ…ป่ฏพ็จ‹โ€ๆŠขๆŠฅไธญ๏ผๅๅธˆๅฎšๅˆถ้‡็‚นไธญๅญฆๅๆ กๅค‡่€ƒ่ฎกๅˆ’๏ผŒ็ฒพ่ฎฒๆ•ฐๅญฆใ€่‹ฑ่ฏญใ€ไฝœๆ–‡็Ÿฅ่ฏ†ๆจกๅ—๏ผๅฟซ...\n98001 0 ๆท…ๅทๅŽฟๆณ•้™ข็ป„็ป‡็š„ไธ€ๅœบๅˆ‘ไบ‹ๅฎกๅˆคๆญฃๅœจ็Žฐๅœบๅผ€ๅบญ\n98002 0 ไธœๆญฃ็คพๅŒบๅผ€ๅฑ•ไบ†ๆณ•ๅพ‹็Ÿฅ่ฏ†่ฎฒๅบง\n98003 0 ๆฑŸ่‹ๆˆๅŠŸๆณจๅ†Œๅœฐ็†ๆ ‡ๅฟ—188ไปถ\n98004 0 ๆธธไพ ๆฑฝ่ฝฆๅœจๅŒ—ไบฌๅ‘ๅธƒไบ†ๆธธไพ X\n1000 Processed\n classify content\n98500 0 ๅ› ไธบ่ฟ™ๆ ท็š„ไบบไผš่…่ดฅไฝ ็š„็†ๆƒณไบบ็”Ÿ\n98501 0 ไนŸๅฏๅ’จ่ฏขๆœบๅœบ้—ฎ่ฎฏ็”ต่ฏxxxxxxxx\n98502 0 ๆŠ•่ต„ๆœบๆž„PiperJaffrayๆ—ฅๅ‰ๅฏน่ถ…่ฟ‡800ๅ็พŽๅ›ฝๆถˆ่ดน่€…่ฟ›่กŒไบ†่ฐƒๆŸฅ\n98503 0 ๆœ‰ๅคšๅฐ‘ไบบๆ˜ฏๅ†ฒ็€็™ฝๅญ็”ปๅŽป็œ‹็š„โ€œ่Šฑๅƒ้ชจโ€\n98504 0 ๅคงๅคšๆ•ฐไบบไผšๅๅฏนๆˆฟๅœฐไบง็จŽๆ”น้ฉ\n1000 Processed\n classify content\n99000 0 HealthyCare่œ‚่ƒถ็‰™่†120g\n99001 0 ๆˆ‘ๅฆ‚ๆžœ็œŸ็š„ๅฏไปฅๆขๆ‰‹ๆœบ็š„่ฏ\n99002 0 ๆœจๅถไนŒ้ธฆcosไฝœๅ“SDๅจƒๅจƒ่Œ็ฟปไบ†\n99003 0 ๆˆ‘้ข„ๆต‹็ซ็ฎญๅพˆๅฟซ็ญพ็บฆ่ฝ้€‰ๆ–ฐ็ง€ๅจๅป‰ๅง†ๆ–ฏ\n99004 0 ้ฅญๆญๅญAไธบไบ†ๅฎ‰ๆ…ฐ้ฅญๆญๅญBๆŒ‡็€ๆˆ‘่ฏด๏ผšๆฏ•ไธšๅญฆๆ กๅทฎๆฒกไป€ไนˆไธๅฅฝๅ•Š\n1000 Processed\n classify content\n99500 0 ็Žฐๅœจ่ฟ™ๆฌพ้”ฎ็›˜ๅฏไปฅๅœจAmazonๅ’Œๅพฎ่ฝฏ็ฝ‘ไธŠๅ•†ๅบ—ไธญ่ดญไนฐ\n99501 0 ๅŽฆ่ˆชMFใ€ๅฑฑ่ˆชSCใ€ๆทฑ่ˆชZHใ€ๆตท่ˆชHUใ€้ฆ–่ˆชJDใ€ๅท่ˆช3Uใ€ไธŠ่ˆชFMใ€ๆˆ้ƒฝ่ˆช็ฉบEUใ€ๆฒณๅŒ—่ˆช็ฉบ...\n99502 0 ๅญฆ่ฝฆๆ‰ๅ‡ ๅคฉๅฟ˜่ฎฐๆถ‚้˜ฒๆ™’้œœๅ…จ้ป‘ๅฎŒ\n99503 0 
ๆœ‰ๆ•ฐไธช่‡ช่กŒ่ฝฆๆ‰‹ๆœบ็ญ‰็ญ‰็ญ‰็ญ‰็ญ‰็ญ‰็ญ‰็ญ‰็ญ‰็ญ‰โ€ฆ่‚พ้ƒฝไธขๅฎŒไบ†\n99504 0 ๆˆ‘ไปฌๅคงๆณ—้˜ณ็š„้Ÿณไน่Š‚ไนŸๅพˆ้ซ˜ๆ ผ้€ผ็š„\n1000 Processed\n classify content\n100000 0 ๆ˜จๅคฉๆ™šไธŠๆˆ‘ไปŽๆ‰ฌๅทžๅ็ซ่ฝฆ็ป่ฟ‡xxไธชๅฐๆ—ถ็š„้ข ็ฐธๅˆฐ่พพไบ†ๆฝขๅท\n100001 0 ่€Œไธ”ๆ˜ฏๅฝปๅบ•็š„ๅroot่ฎพ่ฎก\n100002 0 8ๆœˆ8ๅทๆฑ‚ไธชๅฆ†ๅจ˜ๆˆ–่€…ๆ‘„ๅฝฑๆˆ–่€…ๅŽๅ‹ค\n100003 0 ๅŽไธบๅทฒๆˆไธบไธ–็•Œ็ฌฌไธ‰ๅคงๆ‰‹ๆœบๅˆถ้€ ๅ•†\n100004 0 ๆต™ๆฑŸ็œๅฐๅทžๅธ‚ๆค’ๆฑŸๅŒบๆ–ฐไธ–็บชๅ•†ๅŸŽๅŽๆฎฟ้™ถๆ‘ๅนฒ้ƒจๆ‰“ไบบ๏ผš่ฏดไป–ๆ˜ฏๆ‘ๆ”ฏไนฆๆƒณๆžไฝ ๅฐฑๆžไฝ \n1000 Processed\n classify content\n100500 0 ไธญๅ›ฝ่ฏๅˆธ็ฝ‘๏ผšไธญ้“ๅปบๆ‹›ๆ ‡ๆ–ฐไธ€ๆ‰น้“่ทฏๅฎข่ฝฆ\n100501 0 ่‰พๆฃฎ่ฑชๅจๅฐ”1944ๅนด็ง‹ๅคฉ้”™่ฏฏๅœฐ้˜ปๆญขไป–ๅ…ณ้—ญโ€œๆณ•่Žฑๆ–ฏ็ผบๅฃโ€\n100502 0 ๆœบไผšๅทฒ็ป่ฟœ็ฆปไฝ ๆˆๅŠŸ็š„็ง˜่ฏ€ๅฐฑๆ˜ฏๅคšไป˜ๅ‡บ\n100503 0 ๅ…่ดนๅˆ†ไบซๆต™ๆฑŸ50ไปฝๅˆ†ไบซๅ†ฐ่Šฑ็งๅญ\n100504 0 ๅณไพฟๆ˜ฏๆœ‰BUGไนŸ่ƒฝ่ฎฉ็Žฉๅฎถไฝ“้ชŒ5D็š„ๅฟซๆ„Ÿ\n1000 Processed\n classify content\n101000 0 ๆ”ฟๅบœๆ‰˜ๅธ‚ๆ”ถ่ดญๅฏน็จป็ฑณไปทๆ ผๅฝฑๅ“ๆœ‰ๅคšๅคง\n101001 0 ็ป“ๆžœ่ฏ็›‘ไผš็š„ๅšๆณ•ๅดๆ˜ฏๆปฅ็”จ่กฅ่ฏ\n101002 0 43ๅฒ็š„้‚น่މๅœจๆฑ‰ๅฃไธ€็พŽๅฎนไผšๆ‰€ๅŠž็†ๅ‡่‚ฅๆœๅŠก\n101003 0 โ€Gartnerๅ‰ฏๆ€ป่ฃJohnMorency่ฏด้“\n101004 0 ้˜ฒๆ™’่กฃใ€้˜ฒๆ™’้œœใ€้ฎ้˜ณไผž็ญ‰้˜ฒๆ™’็”จๅ…ทๅˆ่ฏฅๅคงๆ˜พ่บซๆ‰‹ไบ†\n1000 Processed\n classify content\n101500 0 ไธ€ๆถ‚ๅณ็™ฝๆ˜ฏๅ› ไธบๆœ‰้˜ฒๆ™’้ฎ็‘•็š„ไฝœ็”จ\n101501 0 ๆŠŠ็œŸๆญฃ็š„็”ทไบบ็œ‹ๅฎŒไบ†ๅฅฝๅ–œๆฌขๆ–ฐๅ…ต่ฟž็š„็ญ้•ฟๅ•Š\n101502 1 ็‰นๅคงๅ–œ่ฎฏ๏ผš ้‡‘ๅฃ่พ‰็…Œ็ŽฐๅœจๆŽจๅฒ€๏ผšไธ€่ˆฌ็•…้ฅฎๅ’Œ่ฑชๅŽ็•…้ฅฎ๏ผ› ไธ€่ˆฌ็•…้ฅฎ๏ผšๅฐๅŒ…xxx๏ผ›ไธญๅŒ…xxx๏ผ›ๅคงๅŒ…x...\n101503 0 3ไบฟๅ…ƒๅฎžๆ–ฝ6ๆก15ๅ…ฌ้‡ŒไบŒ็บงๅ…ฌ่ทฏๆŽฅ็บฟ\n101504 0 A่‚ก็š„่‚กๆŒ‡ๆœŸ่ดงๅญ˜ๅœจๅทจๅคง็š„ๅš็ฉบๆผๆดž\n1000 Processed\n classify content\n102000 0 ็œ‹ๆฅ้‚ฃไธชๆณ•้™ขๅ‰ฏ้™ข้•ฟไน‹ๅŽ็กฎๅฎž้—ฎๅ‡บไบ†ไป–ไปฌ่ฆๅผ„ๆญปๅ‡ ไธชๅฐ†ๅ†›็š„่ฏๆฎ\n102001 0 ๅœจๅ…ถๆ‚ฌ็–‘ใ€็ง‘ๅนปใ€ๆœบๅ™จไบบไผฆ็†่กจ้ขไธ‹\n102002 0 xใ€ๆ‰ง่กŒๅŠ›ๆœŸ๏ผš้€ขไบบ่ฐˆๆ‰ง่กŒๅŠ›ไธบๅˆๅˆ›ๆœŸ\n102003 0 ไธบๆฏ›้ชŒ่ฏ็ ๅพ€ๆˆ‘ไธŠไธ€ไธชๆ‰‹ๆœบๅทๅ‘\n102004 1 
ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅฟซๅˆฐไบ†๏ผŒ่ฟ™ๅฏๆ˜ฏๆˆ‘ไปฌๅฅณไบบ็š„่Š‚ๆ—ฅๅ“ฆ๏ผๅœจ่ฟ™้‡Œๆๅ‰็ฅๅคงๅฎถ่Š‚ๆ—ฅๅฟซไน๏ผไธบไบ†ๆ„Ÿ่ฐขๆ–ฐ่€้กพๅฎขๅฏนๆœฌ็š„ๆ”ฏ...\n1000 Processed\n classify content\n102500 0 ็”ต่„‘้œ€่ฆ่ดด่†œ??ๆ‰‹ๆœบ้œ€่ฆ่ดด่†œ??่ฝฆๆ›ด้œ€่ฆ่ดด่†œ??้‚ฃไฝ ็š„่„ธๅ‘ข??ไธบไป€ไนˆไธ็ป™่„ธ่ดด่†œ??ไธ‰ๅคฉไธ่ดด่†œ\n102501 0 ๅœจ็ป†่ƒžๅ’Œ็ป„็ป‡ๆฐดๅนณไธŠๅฏๅŠจ็šฎ่‚คไธปๅŠจไฟฎๆŠค็จ‹ๅบ\n102502 0 ๅ•ชๅŽฎ็š„ไธ€ๅฃฐ่€ณๆœบๅ‹พๆ‰ฏ่€ณๆœตdaung~ๆމไบ†่‰ๆณฅ้ฉฌ\n102503 0 ๅ€Ÿๅ—ไบฌ้‡‘ๆตทๆธฏ่ˆน่ˆถ็ฎก็†ๆœ‰้™ๅ…ฌๅธๆตŽๅ—ๅˆ†ๅ…ฌๅธ้ช—ๅ–ไธญไป‹่ดนๅ’Œๅ…ถไป–่ดน็”จ็ญ‰ๆ•ฐไธ‡ๅ…ƒ\n102504 0 โ€œ่ฐข่ฐขโ€็ฌฌไบŒๆžœ็ฒ‰้—ฎ๏ผšไฝ ็š„่ƒฝๅคฉๅคฉ้…ท่ท‘ๅ—\n1000 Processed\n classify content\n103000 1 ่ถ…ๅ€ผ็ฆๅˆฉ๏ผๅ‡ก้ข„ไบคxxๅ…ƒ่ฎค็ญน๏ผˆๅฏๆŠต่ดญๆœบๆฌพ๏ผๅฟ…้กปๆ‰พๆˆ‘ๆŠฅๅๅ“ฆ๏ผไธ็„ถ่ฎค็ญนไธๅˆฐๅ•Š๏ผ๏ผ‰๏ผŒ็ซ‹้€xxxๅ…ƒๅ–œๅบ†...\n103001 0 ไธ‹ๅ›พๆ˜ฏๅ—ไบฌๅธ‚ไธญ่ฅฟๅŒป็ป“ๅˆๅŒป้™ข็šฎ่‚ค็ง‘ๅ‰ฏไธปไปปไธญๅŒปๅธˆๆœ้•ฟๆ˜Ž\n103002 0 ๆœ€่ฟ‘ไธ€็›ดๅ‡บ็Žฐๆ‰‹ๆœบ้œ‡ๅŠจๅ—กๅ—ก็š„ๅนปๅฌ\n103003 0 ๅฏนไธชๆ€ง่ฟ›่กŒ้‡็ป„ๅŽๅฏไปฅ็œ‹ๅˆฐๅฆ‚ไธ‹็ป„ๅˆ\n103004 0 6ใ€็šฎ่‚คๅฏน็ƒˆๆ—ฅๅ’Œ็”ต่„‘่พๅฐ„ๅๆ˜ ๅผบ็ƒˆ\n1000 Processed\n classify content\n103500 0 ๅนซๆœ‹ๅ‹ๅ–็ป้™ขMScMoneyBanking&amp\n103501 0 ๆฑŸ่ฅฟ็œๅ—ๆ˜ŒๅŽฟไบบๆฐ‘ๆฃ€ๅฏŸ้™ขไปฅๆ•ฒ่ฏˆๅ‹’็ดข็ฝชๅฏน็Šฏ็ฝชๅซŒ็–‘ไบบๅผ ๆŸๆ‰นๅ‡†้€ฎๆ•\n103502 1 ไบฒ็ˆฑ็š„ไผšๅ‘˜ๆœ‹ๅ‹๏ผŒๅ…ƒๅฎตๅฟซไนๅ™ข๏ผๅ› ๆ˜ฅ่Š‚ๆœŸ้—ดๆœฌๅบ—ไบบๆ‰‹็ดง็ผบ๏ผŒๅฏนๆ‚จๆœ‰ๆ‰€ๆ€ ๆ…ข๏ผŒ่ฟ˜่ฏท่ฐ…่งฃ๏ผŒไธบ่กจๆญ‰ๆ„๏ผŒๆ‚จๅฐ†ๆœ‰...\n103503 0 ๆผ”ๆˆ็š„ๅŒๆ—ถไนŸๅœจๆฏๆ กๅธฆ็ ”็ฉถ็”Ÿ\n103504 0 ่ขซๆ›ๅ…‰็š„่ฟๆณ•ๅนฟๅ‘Šๆถ‰ๅŠไฟๅฅ้ฃŸๅ“ใ€ๅค„ๆ–น่ฏใ€้žๅค„ๆ–น่ฏใ€ๅŒป็–—ๅ™จๆขฐๅ››ๅคง็ฑปๅˆซ\n1000 Processed\n classify content\n104000 0 ็œŸ็š„ไฝฉๆœๅช’ไฝ“็žŽbb็š„ๆฐดๅนณไปฅๅŠๅผ•ๅฏผไธๆ˜Ž็พคไผ—็š„่ˆ†่ฎบ่ƒฝๅŠ›\n104001 0 ๆ‰€ไปฅ่ฏดๅˆซ่ฎฉๅˆซไบบ็ขฐไฝ ็š„ๆ‰‹ๆœบ\n104002 0 ไธ็„ถไฝ ๆฃ€ๅฏŸ้™ขๆณ•้™ขๆœ‰ไธชๆฏ›ๅŠžๆณ•\n104003 0 Upไธป๏ผš็ด ๅนด้”ฆๆ—ถไธถๅˆ่งๆฅ่‡ชAcFunๆ–‡็ซ ้ข‘้“\n104004 0 ไธญๅˆๅ›žๅฎถๅƒๅฎŒ้ฅญ13็‚นๆ‰“ๅผ€็”ต่„‘\n1000 Processed\n classify content\n104500 0 ไธ€ๅคงๆ—ฉ่ท‘ๅˆฐๆทฎ้˜ดๅœจ่ฝฆ้‡Œ็ก่ง‰ๆˆ‘ไนŸๆ˜ฏ้†‰ไบ†\n104501 0 
่ฎฉๅƒไธๆƒฏ้ฃžๆœบ้ค็š„ๆ—…ๅฎขไปฌๆƒ…ไฝ•ไปฅๅ ช\n104502 0 ๅฏนๅ—ไบฌ็š„็ฌฌไธ€ๅฐ่ฑกไธ€ๅฎšๆ˜ฏๅ—ไบฌ็›ๆฐด้ธญใ€้‡‘้™ต้ธญ่ก€็ฒ‰ไธ็ญ‰็ญ‰\n104503 0 ๆ—‹้ฃŽๅฐ‘ๅฅณๆต™ๆฑŸๅซ่ง†ๅ…‹ๆ‹‰ๆ‹ไบบ\n104504 0 ๆฑŸ่‹้˜œๅฎๆŽจ15้กนๅฅฝไบบๅ…่ดนๆ”ฟ็ญ–ๅ‚ๆ”ฟๅฏ่Žทๆ”ฟๆฒปไผ˜ๅพ…\n1000 Processed\n classify content\n105000 0 ่ฟ™ๆ˜ฏไปฅๅ›ฝๅ†…ๆœบๆž„ไธบไธปๅ‹พ็ป“ๅค–่ต„ๅฑ ๆ€่กŒไธบ\n105001 0 ็ป่ฟ‡ๆ—ถ้—ด็š„ๆด—็คผ็ปˆ็ฉถ็œŸ็›ธไผš่ตค่ฃธ่ฃธ็š„็คบไบŽไบบๅ‰\n105002 0 ไธŠๅธ‚ๅ…ฌๅธไธœๆ–น่ดขๅฏŒ็ฝ‘ๆ‹›่˜UIๆฑ‚ๆ‰ฉๆ•ฃ\n105003 0 ๆˆ–่€…ๆ˜ฏๅ—ไบฌ่ฅฟๅฎ‰ๆˆ้ƒฝๅฏไปฅๅƒๅฅฝๅคšๅฅฝๅคšๅฐๅƒ็š„ๅœฐๆ–น\n105004 1 ไบฒ็ˆฑ็š„ๅฎขๆˆท๏ผŒๅ› ้ฒ่ฅฟ่‚ฅ็‰›ๆด‹ๆฒณๅบ—็งŸ็บฆๅˆฐๆœŸๅทฒๆญ‡ไธš้—ญๅบ—๏ผŒ่ฏšๆŒš้‚€่ฏทๆ‚จๆฅ้ป„ๆณฅ็ฃ…ๅบ—(็ดซ็ฆ่ทฏ้‡‘็މๆปกๅ ‚ๅฏน้ข)ๅฐฑ...\n1000 Processed\n classify content\n105500 0 GDPใ€ๅ…ฌๅ…ฑ่ดขๆ”ฟ้ข„็ฎ—ๆ”ถๅ…ฅใ€่ง„ๆจกๅทฅไธšๅขžๅŠ ๅ€ผใ€ๅ›บๅฎš่ต„ไบงๆŠ•่ต„ใ€็คพไผšๆถˆ่ดนๅ“้›ถๅ”ฎๆ€ป้ขๅขžๅน…ๅ‡้ซ˜ไบŽๅ…จๅ›ฝๅ…จ็œ...\n105501 0 ่ฐทๆญŒๅญฆๆœฏๆ•ฐๆฎๅบ“ๆ”ถๅ…ฅ่Œƒๅ›ด้žๅธธๅนฟๆณ›\n105502 0 ๅ็š„้ƒฝไธๆ˜ฏ้ฃžๆœบๆˆ‘ๆ€Žไนˆ่ง‰ๅพ—ๅƒๆ˜ฏๅ็ซ่ฝฆๅ‘ขๅ‘ข้‚ฃไนˆ็ดฏ้‚ฃไนˆ็ดฏ้‚ฃไนˆ็ดฏ\n105503 0 ๆญฃๅ“่Œƒ็‰น่ฏ—้ฒœๆžœ้…ถๆฐดๆžœ้…ต็ด ็˜ฆ่บซไธฐ่ƒธ็พŽ็™ฝๆฐดๆžœ้…ต็ด ๆ— ๅ‰ฏไฝœ็”จๆ— ๆ•ˆ้€€ๆฌพ\n105504 0 ่ฟ™ๆ ทๅฐฑไธไผšไธ€่ตทๅบŠๅฐฑๆŠŠๆ‰‹ๆœบ็ ธไบ†\n1000 Processed\n classify content\n106000 0 ้š†้‡ไธพ่กŒxxxxๅฑŠๅญฆ็”Ÿๆฏ•ไธšๅ…ธ็คผ\n106001 1 ๆ‚จๅฅฝๅๅช›ๅฐ่ฑกไธ–็บช่”ๅŽๅบ—๏ผŒๅœจx.x่Š‚ๆฅไธดไน‹้™…ๆ–‡่ƒธx.xๆŠ˜่ตท๏ผŒๅฎถๅฑ…ๆœxๆŠ˜๏ผŒๆปกxxxๅ…ƒ้€ๅ†…่ฃคไธ€ๆก๏ผŒ...\n106002 0 ๅœจๅœฐ้“ไธŠ็œ‹ๅˆฐไธ€ไธช่€ๅคชๅคชๅ› ็—…็—›็–ผๅพ—ไธ€็›ดๆต็œผๆณช\n106003 0 ไธ–็•ŒไธŠๆœ€ๅคง็š„ๆ‚ฒๅ‰งๆˆ‘็š„็”ต่„‘ๅกไฝไบ†\n106004 1 ๅฅณๅฃซๅณๅฏ้ข†ๅ–xxxxๅ…ƒ้กน็›ฎๅกไธ€ๅผ ๅŠ ็ฒพ็พŽ็คผๅ“ไธ€ไปฝ๏ผˆๅ‡ญ็Ÿญไฟก้ข†ๅ–๏ผ‰ๅŒๆ—ถๅบ—ๅ†…่ฟ˜ๆœ‰ๆ›ดๅคšๅญ˜้€ไผ˜ๆƒ [็‚ธๅผน]...\n1000 Processed\n classify content\n106500 0 ๆœ‹ๅ‹ๅผ€็š„้€้ฃŽ็ฎญๅ’–โ€”ๆ‰ฌๅทž้ฆ–ๅฎถไธ“ไธšๅฐ„็ฎญ้ฆ†\n106501 0 ๆฑฝ่ฝฆๅธธ่ฏ†โ€”โ€”่ฝฆๅฑ่‚กไธŠ็š„้‚ฃไบ›ไบ‹ๅฟซ้€Ÿไบ†่งฃ่ฝฆๅฐพๆ ‡็š„ๅซไน‰\n106502 0 ไธป้กตๅฆžๅฐ†ไธๆ–ญpoไธŠ่ฝฆ้˜Ÿ่ฟ‘ๆœŸๆ—ฅๅธธ\n106503 0 /ๅฏ็ˆฑ/ๅฏ็ˆฑ/ๅฏ็ˆฑ/ๅฏ็ˆฑ\n106504 0 
็ˆ†ไนณ่พฃๅฆนๆŒๅˆ€ๆŠขๅŠซๆฒก็ป้ชŒๅค„ๅค„็•™็—•่ฟนๆœ‰่ƒธๆžœ็„ถๆ˜ฏๆ— ่„‘็š„\n1000 Processed\n classify content\n107000 1 ๅญฆๆ”ฟๆ•™่‚ฒ็œ่€ƒๅ†ฒๅˆบๆŠผ้ข˜็ญๅฐ†ไบŽxๆœˆxxๅทๅผ€่ฏพ๏ผŒ็œ่€ƒไน‹ๅ‰็š„ๆœ€ๅŽไธ€ไธชๅŽ‹่ฝดๆ€งๅฐ้—ญ้›†่ฎญ่ฅไบ†๏ผŒๆๅˆ†ๆœ€ๅฟซ๏ผŒๅฐ...\n107001 0 ๆœ‰ๅช’ไฝ“7ๆ—ฅๅˆŠ็™ป็š„ๆ— ้”ก่ญฆๆ–นๆŠ“่Žท16ๅ่‹ๅ—็ก•ๆ”พๆœบๅœบ่ดง่ฟ็ซ™ๅ†…้ฌผไธ€ๆ–‡ๆ–ฐ้—ปไฟกๆฏๅคฑๅฎž\n107002 0 ่”ๅˆๅ„ๅœฐๆ”ฟๅบœๅ…จๅŠ›ๆŽจ่ฟ›1786ไธชๆถˆ็ซๆ “ๅปบ่ฎพๅทฅไฝœ\n107003 0 ๆˆ‘้ƒฝๅ‡†ๅค‡ๆ‰“110็œ‹ๆ˜ฏไธๆ˜ฏๆœ‰ไบบ็ป‘ๆžถไบ†\n107004 0 ็ฎ€็บฆ็š„ไธ€ไปถๆก็บนTๆคไธŽ็™ฝ่‰ฒ็Ÿญ่ฃค็›ธๆญๅ‡บๅคๆ—ฅๆธ…็ˆฝๆฐ”ๆฏ\n1000 Processed\n classify content\n107500 0 2ใ€ๅŽไธบโ€”็งฐ5Gๅ‘ๅฑ•ๅทฒๅˆฐๅ…ณ้”ฎ่Š‚็‚น\n107501 0 ๅ‰ๅคฉไธ€ๅฐๅ“ฅๅธฎๆˆ‘ไฟฎๅŠžๅ…ฌๅฎค็”ต่„‘\n107502 1 ๅนณๅฎ‰ๅ…ป่€ๅŠๆ•™่‚ฒไฟ้™ฉ๏ผŒๅชๅญ˜ไธ‰ๅนด๏ผŒๅญ˜ๆปกๅฐฑ้ข†๏ผŒๅนดๅนด้ข†๏ผŒ้ข†่‡ณ็ปˆ่บซ๏ผŒๆœฌ้‡‘ๅœจ๏ผŒๆœฌ้‡‘ๆถจ๏ผŒ่พ›่‹ฆไธ‰ๅนด๏ผŒๅนธ็ฆไธ€่บซ...\n107503 0 3้–ๆฑŸAnimeCityๅคๆ—ฅ็ฅญ\n107504 0 ๅ…ถไธญWinxxไธŽXbox็š„ๅŒๅ‘่žๅˆๆ›ดๆ˜ฏ่ฎฉๅนฟๅคง็Žฉๅฎถๅ…ดๅฅ‹ไธๅทฒ\n1000 Processed\n classify content\n108000 0 ๆœ€ไปคไป–่ˆˆๅฅฎๆ˜ฏ่ˆ‡NBA็ƒๆ˜Ÿๆž—ๆ›ธ่ฑชๅœจ็ƒๅ ดไธŠ่ผƒ้‡\n108001 1 ่ฏ็Ž‹ๅ ‚่ท่Šฑๆฑ ๅบ—ไธบๅบ†็ฅๅฅณๆ€งๅŒ่ƒž่Š‚ๆ—ฅ๏ผš๏ผˆไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚๏ผ‰xๆœˆxๆ—ฅ-xๆœˆxๆ—ฅ๏ผŒๅ…จๅœบๆปกxx้€xx๏ผŒไนฐๆปกx...\n108002 0 ็ญพๅไนŸOKไบ†โˆšๆ˜Žๅคฉๅฏไปฅๅ‘่ดงๅ…ˆๅ‘ไธ€้ƒจๅˆ†ไบ†\n108003 0 ็ƒงๆฏๆ—ฅๅ†›ๅฆๅ…‹ใ€่ฃ…็”ฒ่ฝฆ12่พ†โ€ฆโ€ฆไป–ๅฐฑๆ˜ฏๆŽๆตทๅฑฑ\n108004 0 ๅฐฑๆ˜ฏ็ฑณ่ฒ่กฃๆœ็š„้ขœ่‰ฒๅ“ฆ~ๅ‘็Žฐไบ†ไนˆ\n1000 Processed\n classify content\n108500 0 24ๅฐๆ—ถๆœๅŠก็š„็ฉบ่ฐƒๆฏๅˆฐๅŠๅคœ10็‚นไปฅๅŽๅฐฑๅ…ณ้—ญ\n108501 0 FX็œผ่ฏๆฐดๆ˜ฏ้’ˆๅฏนๆถˆ้™ค็œผ็›ๅ……่ก€ไปฅๅŠๅฏน้…ๆˆด้šๅฝข็œผ้•œๆˆ–็”จ็œผ่ฟ‡ๅบฆไบง็”Ÿ็š„็–ฒๅŠณ้žๅธธๆœ‰ๆ•ˆๆžœ็š„ไธ€ๆฌพ็œผ่ฏๆฐด\n108502 0 ไธŠๆž—ๅŽฟไบบๆฐ‘ๆฃ€ๅฏŸ้™ขๆŒ‡ๆดพไปฃๆฃ€ๅฏŸๅ‘˜่ฆƒๅ…‰้”กๅ‡บๅบญๆ”ฏๆŒๅ…ฌ่ฏ‰\n108503 0 win10็š„PINไนŸๆ˜ฏไธชbugๆ›ดๆ–ฐๅฎŒ็”จไบ†ไธๅˆฐไธ€ๅคฉๅฐฑ้‡ๅฏไบ†Nๆฌก่€Œไธ”่ฟž้‡ๅฏ้ƒฝๅ‡บ้—ฎ้ข˜่ฆๆ”ถ้›†ไฟกๆฏๅ‘ๅ›ž\n108504 0 ๆญคๆฌกๅคฑไบ‹็š„้ฃžๆœบ่ˆช็ญๅทไธบ4u9525\n1000 Processed\n classify content\n109000 0 ็ป™ไบˆไธ่ถ…่ฟ‡60ไธ‡ๅ…ƒ็š„ไธ€ๆฌกๆ€ง่ต„ๅŠฉ็ญ‰โ€ฆ\n109001 0 
ๅพๅทžๅทฅ็จ‹ๅญฆ้™ขๅŒ–ๅทฅ้™ข้€่€็”Ÿๆ™šไผšๆœฑๆท่€ๅธˆ่„ฑๅฃ็ง€็œŸๅฟƒๅ…จ็จ‹ๆ— ๅฐฟ็‚นๅ€ผๅพ—ไธ€็œ‹\n109002 0 ไธŠๆฌกๅœฐ้“ไธŠไธ€็”ท็”Ÿ่กŒๆŽ็ฎฑๆŒกไฝไบ†่ทฏ\n109003 0 ไป–ไปฌๅฏ่ƒฝ็œ‹ๅˆฐcoconutmilkๅฐฑ่ฎคไธบๆ˜ฏ่ƒฝๅ–็š„ๆคฐๅฅถ\n109004 0 ๆœ‰ๆ—ถๅ€™่ง‰ๅพ—่‚ก็ฅจๅฐฑๆ˜ฏ่‡ชๅทฑ็š„ๅ„ฟๅญ\n1000 Processed\n classify content\n109500 0 ไธ€ไธชๅฅณๅญฉ้•ฟๆœŸ็”จxๅ—xไธ€็›’็š„้ข่†œ\n109501 0 ่ฎฉๆˆ‘ๅฟซ็‚นไผ ๆปก100ไธชiphone6ๆ‰‹ๆœบๅฃณๅง\n109502 0 xxๆ—ฅๆœฌๆ’็”ปๅฎถ็ฉบๅฑฑๅŸบ่‡ช็š„ไปฃ่กจไฝœๅ“ๆ€งๆ„Ÿๆœบๅ™จไบบ\n109503 0 ๅ›žๅฟ†ๆ˜ฏไบบ็”Ÿไธญๅฎ่ดต็š„่ดขๅฏŒไฝ ่ง‰ๅพ—็—›่‹ฆ้‚ฃๅฐฑ้€‰ๆ‹ฉ้บปๆœจไฝ ่ง‰ๅพ—ๅผ€ๅฟƒๅฐฑ่—ๅœจๅฟƒ้‡Œๆ—ถ้—ดไผšๆ”นๅ˜ๅพˆๅคš\n109504 0 ไธ€่พ†ๅท็‰Œไธบ8889็š„ๆณ•ๆ‹‰ๅˆฉไธŽไธ€่พ†ๅท็‰Œไธบ9888็š„ๅฎพๅˆฉ็›ธๆ’ž\n1000 Processed\n classify content\n110000 0 ๅŽไธบไนŸๆŽจๅ‡บไบ†LITEๅ’ŒTalkBandB2\n110001 0 ๅ—ไบฌๅคงๅญฆๅœฐ็ƒ็ง‘ๅญฆไธŽๅทฅ็จ‹ๅญฆ้™ขๅปบ่ฎพ้™ขๅœฐ่ดจๅš็‰ฉ้ฆ†็ ”่ฎจไผš\n110002 0 ๅˆๅฏไปฅๅŽปๆœ€้กถ็บง็š„shoppingmallๅคชๅคๆฑ‡่ดญ็‰ฉๅจฑไน\n110003 0 ๆ‹ฟๅพ—็Šถๅ…ƒ็š„ๆ˜ฏ้™้‡็‰ˆ็š„NokiaNxxxๅˆ†\n110004 0 ๆˆ‘ไปฌ็œ‹ๆœ้ฒœ็š„็œผ็ฅžๅ’Œๆฌง็พŽๅ›ฝๅฎถ็š„ไบบ็œ‹ๆˆ‘ไปฌ็š„็œผ็ฅž็ฅžไผผ\n1000 Processed\n classify content\n110500 0 ๆณฐๅทž่ฎก้‡้’ˆๅฏนๆœฌๅœฐไบงไธš็š„็ƒญ็‚น\n110501 0 ไฝ†ๆ˜ฏๆฐด่กจ่ขซ็›—็ชƒ็š„ไบ‹ๆƒ…ไนŸไธๆ˜ฏ็ฌฌไธ€ๆฌกๅ‘็”Ÿไบ†\n110502 0 BvlgariMonete15ๆœ€ๆ–ฐๆฌพๅŒๅฑ‚ๅ†…้‡Œๅธฆๅ†…่ข‹่ถ…็บงๅฎž็”จ็š„้š”ๅฑ‚่ฎพ่ฎก้”ๆ˜ฏๆ‰ฃๅพˆ็‰นๅˆซ็š„ๅค็ฝ—้ฉฌ้ฃŽๆ ผ...\n110503 0 DreamByๆขฆๆ—ถๅ…‰ๅฉš็คผ้กพ้—ฎๅฎ˜ๆ–นๅพฎๅš|ๅ–œ็ป“็ฝ‘ๅฉš็คผ็ตๆ„Ÿใ€้ฃŽๅฐšใ€ๆ”ป็•ฅๅšๅฎข\n110504 0 ่ฏฅxๅ็Šฏ็ฝชๅซŒ็–‘ไบบๅทฒ้กบๅˆฉ็งปไบค่ฅฟๅฎ‰็งฆ้ƒฝ่ญฆๆ–น\n1000 Processed\n classify content\n111000 0 ๅˆšๅœจๆดพๅ‡บๆ‰€็•™่ต„ๆ–™็š„ๆ—ถๅ€™ๅ’‹ไธ่ฏดๆˆ‘ๅซ้›ท้”‹ๅ‘ข\n111001 0 ไธดๅนณ็ซ™่”ๅˆ้ฉป็ซ™ๆดพๅ‡บๆ‰€็ป„็ป‡ๆŠค็ซ™้˜Ÿๅผ€ๅฑ•ๅๆ้˜ฒ็ˆ†ๆผ”็ปƒ\n111002 0 OrbisๆŽจๅ‡บ่ฝป็†Ÿ้พ„ๆŠ—่€็ณปๅˆ—=U\n111003 0 ๅพฎไฟก็พค้‡Œๅœจ็พค่Š็”ตๆขฏๆญปไบบ็š„ไบ‹\n111004 0 ไธ€่พ†็‰Œ็…งไธบ้ป‘R03**็š„่ญฆ่ฝฆๅฐ†ไธ€ๅ50ๅคšๅฒ็š„็”ทๅญๆ’žๅ€’\n1000 Processed\n classify content\n111500 0 ็”จๆ–™้…’ใ€่œ‚่œœใ€ๅงœ่’œใ€็›่…Œๅˆถๅ›žๅฟ†\n111501 0 
ๆต™ๆฑŸ็œๅ…ฌๅฎ‰ๅŽ…็ฝ‘่ญฆๆ€ป้˜Ÿๆ€ปๅทฅ็จ‹ๅธˆ่”กๆž—ไป‹็ป\n111502 0 ่กŒๆ”ฟๆณ•ๆŒ‚็ง‘็އ้ซ˜ๅˆฐ่พ…ๅฏผๅ‘˜ไปฅไธบๆˆ็ปฉ็™ป้”™ไบ†\n111503 0 ๅ’Œ่ฐ็š„็คพไผšๅŸŽ็ฎกๅฐฑๅฟ…้กปๆ–‡ๆ˜Žๆ‰งๆณ•\n111504 0 nๆฌกไนฐไธœ่ฅฟ็ฌฌไธ€ๆฌก่ขซๆท˜ๅฎๅ–ๅฎถ้ชšๆ‰ฐ\n1000 Processed\n classify content\n112000 0 ็œ‹็€ๅ„็งๅค่‰ฒๅค้ฆ™็š„ๅปบ็ญ‘ไธŽๅฐๆกฅๆตๆฐด็š„ๆ™ฏ่‡ด\n112001 0 xใ€ๅˆ†้—จๅˆซ็ฑปใ€ๅ›พๆ–‡ๅนถ่Œ‚็š„ๅฑ•็คบ้กน็›ฎๆŽจไป‹\n112002 0 ่ฎพ่ฎกๅธˆHyunJuParkๅฐ†็บข็ปฟ็ฏ่ฟ็”จๅˆฐๅœฐ้“้—จไธŠ\n112003 0 ๆญ่ฝฝๅ‡บ็งŸ่ฝฆๅ‡บๅค–ๆˆ–่€…ๅˆฐๅ„ๅคง็ซ่ฝฆ็ซ™\n112004 0 xๅ‘ณๅ†ฒ็š„้ฃŸ็‰ฉ๏ผšๆด‹่‘ฑๅ’Œๅคง่’œๅ’Œ้˜ฒ็™Œ\n1000 Processed\n classify content\n112500 0 ferragamo่ฒๆ‹‰ๆ ผๆ…•้ซ˜่ทŸ้ž‹็ฒ‰็บข่‰ฒๆญฃ็‰ˆ็‰›็šฎ็š„ไบŒๆ‰‹ๆœ‰ๅ–œๆฌข็š„็งไฟกxx็ ๆฌง็พŽ็š„้ž‹ๅญๅๅคงไธ€็ \n112501 0 ไนŸๆ ‡ๅฟ—็€ๆˆ‘ๅ›ฝๅคง้ฃžๆœบ้กน็›ฎ่ฟ›ๅ…ฅๆ”ถ่ŽทๆœŸ\n112502 0 ๅทฒๅคงๅคง่ถ…ๅ‡บไบ†้ฃžๆœบ่ฎพ่ฎกๅˆถ้€ ่€…็š„้ข„ๆƒณ\n112503 0 ่ƒ–ๅผŸ็ฅžๅ›žๅคๅ“Žๅ‘€ๅฆˆๅ‘€ๅฐฑ็ฎ—ไฝ ไธ่ฏดๅฅนๅˆšไธŠ่ฏพไธๅฐฑไป‹็ป่‡ชๅทฑ่ฏดๅฅน่‡ชๅทฑๆ˜ฏไธ€้ซ˜ไธญ็ซ็ฎญ็ญๅพ—ไบ†ๅ˜›\n112504 0 ๅ‡บ็ŽฐไบŽ่ฏๅˆธไบคๆ˜“ๅœจๅผ€็›˜ๅŽๆ˜พ่‘—่ตฐ้ซ˜\n1000 Processed\n classify content\n113000 0 com้‡ๅบ†ๆ‹›่˜ๆฑ‚่Œๅƒไบบ็พค2๏ผš128245580\n113001 0 ่”็ณป็”ต่ฏxxxxxxxxxxx\n113002 0 ๅดๅ†ปๆญปๅœจๅคๅคฉ~ๅ„ไฝ็ซฅ้ž‹ไปฌๅœจๅฎถ่ฟ˜ๅฅฝๅ—\n113003 0 ๆˆ‘่ฆๅ›žๅ—ไบฌไฝ ๆŠฑ็€ๆˆ‘ๅ“ญ็š„ๆ ทๅญ\n113004 0 ้—ฒ้€›ๆ—ถๆ‹ฟ็€่ฃ…ๆ‰‹ๆœบ้’ฅๅŒ™็ญ‰ๅฐ็Žฉๆ„ๅ„ฟ\n1000 Processed\n classify content\n113500 0 ๅ›พ็‰‡ๅฐบๅฏธไธๅฐไบŽxxxxxxxpx่ฟ™ไธชๆ˜ฏๆ€Žไนˆๅ›žไบ‹ไนˆ\n113501 0 ไป™ๅŽๅบงๆ‰‹่กจ็”ฑSamFreeman่ฎพ่ฎก\n113502 0 ๅฎƒไปฃ่กจ็€ๅฝ“ไฝ ๆŠ•่ต„่‚ก็ฅจๅŽ็š„ๆ‹ฅๆœ‰ๆƒๅˆฉๆ˜ฏๅคšๅฐ‘\n113503 0 ใ€ŽIBM่ฎพ่ฎก่ฏญ่จ€๏ฝœๅŠจ็”ป้ƒจๅˆ†ใ€\n113504 0 ็ญ‰ไฝ ่Ÿ‘่ž‚่ฟ‡ๅŽปไบบๆ—ๅฃไธ€ๅ ตไธคๅฐๅฆๅ…‹ๆžถ่ตทๆฅๆ นๆœฌ่Žฝไธ่ฟ›ๅŽป็š„ๅ•Š\n1000 Processed\n classify content\n114000 1 ๆ˜ฅๅญฃๅคง้…ฌๅฎพ๏ผ›ๅŽ‚ๅฎถ่ฎฉๅˆฉ๏ผŒๅฎžๆœจๅคšๅฑ‚ๆฟ๏ผ›ๅŽŸไปทxxxๅ…ƒ/ๅนณๆ–น๏ผ›็Žฐไปทxxxๅ…ƒ/ๅนณๆ–น๏ผ›ๆฌข่ฟŽๆ–ฐ่€้กพๅฎขๆฅๅบ—ๅ’จ...\n114001 0 ่€Œ็›ธๅฏนๅบ”็š„ๆฑŸๅฎๆฟๅ—้ข„่ฎกๆœ‰11ๅฎถๆฅผ็›˜ๅฐ†ๅœจไธ‹ๅŠๅนดๆ”ถๅฎ˜\n114002 0 ่Šฑๅƒ้ชจๅนถไธๆ˜ฏ็™ฝๅญ็”ป็š„็”ŸๆญปๅŠซ\n114003 0 
ๆœฌๆœบๅฑๅน•ๅฏ่ƒฝไฝฟ็”จไบ†โ€œIDๆ— ่พนๆก†โ€่ฎพ่ฎก\n114004 1 ๅง้พ™ๆ–ฐ่ก—ๅงœๅฎ‡็ฅๆ‚จๅ…ƒๅฎต่Š‚ๅฟซไน๏ผ็‰นๆŽจๅ‡บVIPๅก๏ผŒไธ€ไธ‡ๆŠตไธคไธ‡ใ€‚ๅœฐๅ€:x่ทฏ่ฝฆ็ปˆ็‚น็ซ™๏ผŒ่ฏฆ่ฏข:xxxxx...\n1000 Processed\n classify content\n114500 0 ๆต™ๆฑŸๆ€Žไนˆๅนดๅนด่ฟ™ไธชๆ—ถๅ€™้ƒฝๆœ‰ๅฐ้ฃŽๅ•Š\n114501 0 ๅฐ้ฃŽ้™ไธดๅฏŒ้˜ณๅฐฑๅœจไธไน…ๅ‰ๅœจๆต™ๆฑŸ็œๅฏŒ้˜ณๅธ‚ๅœบๅฃ้•‡ๆƒŠ็Žฐ็บฏ็™ฝๅ‡คๅ‡ฐ้šพ้“้ข„็คบ็€ไป€ไนˆ\n114502 0 ๅ…ณ็ˆฑๆ‰‹ๆœบ็‚‰็Ÿณ็”จๆˆทไปŽไฝ ๆˆ‘ๅš่ตท\n114503 0 ่ƒฝๅ…่ดนๆ‰“ๆ‰‹ๆœบ้‡Œ็…ง็‰‡่‡ชๅทฑๆญฃๅœจ้ผ“ๆฃ\n114504 0 ไฝฟๅพ—Mustang็š„็œผ็ฅžๆ›ดๅŠ ็Š€ๅˆฉ\n1000 Processed\n classify content\n115000 0 ๆ‰ฟๆŽฅDIY่›‹็ณ•้ฅผๅนฒๅทงๅ…‹ๅŠ›่›‹ๆŒžๆŠซ่จๆžœๅ†ปๅธƒไธๅŽๅคซ้ฅผๆพ้ฅผๅฏฟๅธๆš–ๅœบๆดปๅŠจ\n115001 0 Googleๅœฐๅ›พๆ˜พ็คบไธ่ฟœๅฐฑๆœ‰ๅ…ฌไบค่ฝฆ\n115002 0 ไปŠๅคฉๆ”ถๆ‹พ็”ต่„‘็œ‹ๅˆฐๅฅฝๅคšไน‹ๅ‰็š„็…ง็‰‡ๆปกๆปก็š„้ƒฝๆ˜ฏๅ›žๅฟ†\n115003 0 xๆ ‹xxๅฑ‚ใ€xๆ ‹xxๅฑ‚ใ€xๆ ‹xxๅฑ‚\n115004 0 ๆˆๅŠŸๆŠ“่Žทๆถ‰ๅซŒ็›—็ชƒๅนถ้€ƒ่ท‘ไธคๅนด็Šฏ็ฝชๅซŒ็–‘ไบบ\n1000 Processed\n classify content\n115500 1 ๅ…ˆ็”Ÿ/ๅฅณๅฃซๆ‚จๅฅฝ๏ผŒ็™พๅฎ‰ๅฑ…x/xโ€”x/xxๅ•†ๅบ—ๅปบๆๅ•†ๅ“ไฟƒ้”€ๆดปๅŠจ๏ผŒไฝŽ่‡ณxยทxๆŠ˜ใ€‚็ญพ็บฆ็™พๅฎ‰ๅฑ…่ฃ…ไฟฎๅฎขๆˆท...\n115501 0 ๆดพๅ‡บๆ‰€ๅ‘ผๅๆฐ‘ๆ”ฟๅฑ€ๅคšไธบๆฐ‘ๅŠžๅฎžไบ‹โ€ๅผ•่ตท็ฝ‘็ปœ็ƒญ่ฎฎ\n115502 1 ๆ‚จๅฅฝ๏ผ้žๅธธๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆ’้”ฆๅๅ›ญ่ฅ้”€ไธญๅฟƒๅ…ƒๆœˆxๆ—ฅ็››ๅคงๅผ€ๆ”พ๏ผŒๆƒŠๅ–œๆดปๅŠจ้’œๆƒ ้˜ณๆ–ฐ๏ผŒ็ฒพ็พŽ็คผๅ“ไปปๆ‚จไบซ๏ผ...\n115503 0 xxxxๅนดxๆœˆxxๆ—ฅๅˆ่‚ฅๅธ‚ไธญ็บงไบบๆฐ‘ๆณ•้™ข็‰นๅคง่ดฉๅ–ๆฏ’ๅ“ๆกˆ\n115504 0 ๅฐฑๆ˜ฏๆ‡’็š„ๆ‰“็†ๅค–ๅŠ ้ผป็‚Žๅ—ไธไบ†\n1000 Processed\n classify content\n116000 0 ไธŠๅ‘จ360ๆ•™่‚ฒ้›†ๅ›ขๆ–ฐ่ฅฟๅ…ฐ้‡‘็‰Œ็•™ๅญฆไธ“ๅฎถๅˆ˜้ข–่€ๅธˆไธบๅคงๅฎถๆไพ›ไบ†ๆ–ฐ่ฅฟๅ…ฐๅ…ซๆ‰€ๅ…ฌ็ซ‹ๅคงๅญฆไธญๅ…ถไธญๅ››ๆ‰€ๅคงๅญฆ็š„่‹ฑ...\n116001 0 ๆ‰พไบ†ไธ€ๅ †็†็”ฑๅฐฑๆ˜ฏ้˜ปๆญข่‡ชๅทฑ่ฎŠๅฅฝ\n116002 0 u/b/e/rๅฎžๅœจๆ˜ฏไธ€ไธชๆ–‡ๅŒ–ๅพˆaggressive็š„ๅ…ฌๅธ\n116003 0 ๅพฎ่ฝฏๆŠŠXboxMusicๆ›ดๅไธบใ€ŒGrooveMusicใ€ไบ†\n116004 0 ใ€€ใ€€ๅœจ่Œ็ ”็ฉถ็”Ÿ็ก•ๅฃซๅญฆไฝ่ฏไนฆ\n1000 Processed\n classify content\n116500 0 ๅœจ่ฎคๆธ…็”Ÿๆดป็š„็œŸ็›ธไปฅๅŽไพ็„ถ็ƒญ็ˆฑ็”Ÿๆดป\n116501 0 ๆ˜Žๅคฉ่ฆ่‡ชๅทฑๅ€’ๅœฐ้“ๅŽปๆ‰พๅฐไผ™ไผดๅ„ฟไบ†\n116502 0 ๆ˜ฏๅ†…็ฝฎไบŽไฝ 
Androidๆ‰‹ๆœบไธญ็š„ไธ€ไธช่ฟทไบบ็š„ๅฅณๅญฉๅ„ฟ\n116503 0 ๆดชๆณฝๅŽฟไธ‰ๆฒณ้•‡ๅซ็”Ÿ้™ขๅ›ด็ป•โ€œๆŠ—ๅ‡ป่‚็‚Ž้ข„้˜ฒๅ…ˆ่กŒโ€่ฟ™ไธ€ไธป้ข˜ๅผ€ๅฑ•้ข„้˜ฒ่‚็‚Ž็š„ๅฎฃไผ \n116504 0 D้™ถ็“ท่ฟ›้ฉปA8้ซ˜็ซฏ่ฎพ่ฎก่ฃ…้ฅฐ้›†ๅ›ข\n1000 Processed\n classify content\n117000 0 ๆœ‰่›‡/ๆต™ๆฑŸไธ€ๅฑฑไธญ้›จๅคฉๅธธๅฌๅˆฐ็พŠๆƒจๅซ81ๅช็พŠ็ฆปๅฅ‡ๅคฑ่ธช\n117001 0 ่ฟ™2ๅฎ—ๅœŸๅœฐๆˆ–ไธบไธญๆ–ฐๆ™บๆ…งๅŸŽ้กน็›ฎ็”จๅœฐ\n117002 0 ไบŒใ€็ปๆŸฅ่ฏๆ— ้”กๆฐธไธญ็ง‘ๆŠ€ๆœ‰้™ๅ…ฌๅธๅค–ๆ–น่‚กไธœ็™พๆ…•ๅคงevermoresoftwareๅ…ฌๅธ็š„ๆณ•ๅฎšไปฃ่กจไบบ...\n117003 0 ๅฐ็ ๅญ่ฟ˜่ฎฉไธ่ฎฉๆ‰‹ๆœบๆดปไบ†โ€ฆโ€ฆโ€ฆโ€ฆ\n117004 0 ๆฒณๅ—็œๆฃ€ๅฏŸ้™ขๅ…š็ป„ๅ‰ฏไนฆ่ฎฐใ€ๅธธๅŠกๅ‰ฏๆฃ€ๅฏŸ้•ฟๅผ ๅ›ฝ่‡ฃ้€šๆŠฅ็œๆฃ€ๅฏŸ้™ข2015ๅนด็ˆฑๆฐ‘ๅฎž่ทตๆœๅŠกๆ‰ฟ่ฏบ็š„ๅ…ทไฝ“ๅ†…ๅฎนๅ’Œ...\n1000 Processed\n classify content\n117500 0 ่ขซไบบๅผบๅฅธๆ‹ๅธฆๆฎดๆ‰“่‡ดๆฎ‹่ฟ˜ๆญปไธไบ†ๆฏๅคฉ็ฒพ็ฅž่‚‰ไฝ“ๅŒ้‡ๆŠ˜็ฃจๆ‰ๆ˜ฏ็œŸ็š„่ฎฉไป–ไปฌ็†่งฃไธบไป€ไนˆไผšๅฆ‚ๆญค็ป“ๆžœ\n117501 0 ๆฑŸ่‹็œ13ไธชๅŸŽๅธ‚2015ไธŠๅŠๅนดGDPๆŽ’ๅๆ–ฐ้ฒœๅ‡บ็‚‰\n117502 0 YoYoๆ•ฐๅญ—ๆ‰ญ่›‹ไผš่ฎฉๆ‚จ็Žฉ็š„็ˆฑไธ้‡Šๆ‰‹\n117503 0 ไธบไบ†่งไฝ ๅฏไปฅๅ15ๅฐๆ—ถ้ฃžๆœบไธ็ฎกไธ้กพ\n117504 1 ใ€็พŽไนๆดปใ€‘็ฅๅ„ไผšๅ‘˜ๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒ็พŽไธฝๆดปๅŠจๅณๆ—ฅ่ตท่‡ณxๆœˆxๆ—ฅ๏ผŒๆปกxxxๅ‡xx๏ผŒๅ†้€xxๅ…ƒๆŠต็”จๅˆธ๏ผŒๆฌข...\n1000 Processed\n classify content\n118000 0 ้—ฎๅ€™ๆ‰ๅพ—็Ÿฅๆ˜ฏไธ€ไธช6ๅฒ็š„ๅฐๅฅณๅญฉๅคฑ่ถณไปŽๅๅ…ซๆฅผๆ‘”ไธ‹ๆฅ\n118001 0 ็Žฐๅทฒๅนฟๆณ›ๅบ”็”จไบŽ่ˆช็ฉบใ€ๆฑฝ่ฝฆใ€ๆจกๅ…ท็ญ‰ๆœบๆขฐๅŠ ๅทฅ็š„ๅ„ไธช้ข†ๅŸŸ\n118002 0 ๅๅœจๆ”น่ฃ…jeep้กถไธŠไธ€่ทฏไบ’็›ธ่ฟฝ้€ๆ—ถ\n118003 0 ่ฟžๆœ€ๅŽไธชไฝ“ๅฏนๆ”ฟๅบœ็š„ๆŠตๆŠ—่พ“็ป™ไบ†ไบบๆ€ง็š„ๅ†ทๆผ ่ฟ™ไธ€็‚น้ƒฝๆ˜พๅพ—้™ˆๆ—ง\n118004 1 ๆœ‰่ถ…ๅ€ผ็บขๅŒ…็ญ‰ๆ‚จๆฅๆ‹ฟ๏ผๆๅ‰้ข„ๅญ˜ ๅฐฝไบซๅฎžๆƒ ๏ผŒ้”™่ฟ‡ไธ€ๅคฉๅ†็ญ‰ไธ€ๅนด๏ผๅœฐๅ€๏ผšไธŠๆตทๆ™ฎ้™€ๅŒบๆพณ้—จ่ทฏxxxๅทๆœˆๆ˜Ÿ...\n1000 Processed\n classify content\n118500 0 ๅพฎ่ฝฏๅˆๅฏไปฅๅคง่ตšไธ€็ฌ”ไผไธš่ดน็”จ\n118501 0 ็›ฎๆต‹ๆ‰‹ๆœบ่ฟ‘่ง†xxx+ๅบฆ??\n118502 1 ๅŸŽๅŒ—็ฌฌไธ€ไธญๅฟƒ๏ผŒ้ƒฝๅธ‚้€ธๆกƒๆบ๏ผ่ฟœๆด‹ๅ…ฌ้ฆ†็ปฟๅ›ญโ€”โ€”็ปฟๅŸŽใ€่ฟœๆด‹ใ€ๆต™้“็ฒพ่ฏš้’œ็Œฎxx-xxxใŽก็ฒพ่ฃ…ๅ…ฌๅ›ญๅคงๅฎ…...\n118503 1 
็ˆฑไฝณ่ถ…ๅธ‚้“บ้ขๅˆฐๆœŸ๏ผŒๅ•†ๅœบไธๅนฒไบ†๏ผŒ็็ˆฑๅๅญ—็ปฃ@้’ป็Ÿณ็”ปๆœ€ๅŽๅ‡ ๅคฉๅ…จ้ƒจxๆŠ˜ไบๆœฌๆธ…ไป“ๅคงๅค„็†๏ผŒไธๅ›พ่ตš้’ฑ๏ผŒๅช...\n118504 0 ๅทฅ่ต„ๆ˜ฏๆ—ฅ็ป“็š„ๆƒณๅ‘ๅฑ•ๆƒณๆ”นๅ˜ๆœบไผšๆฐธ่ฟœ็•™็ป™ๆœ‰ๆ‰€ๅ‡†ๅค‡ๅ‹‡ไบŽๆ‹ผๆ็š„ไบบๅˆทๅ•ๆ˜ฏไธ้œ€่ฆๅžซไป˜ไธ€ๅˆ†้’ฑ็š„\n1000 Processed\n classify content\n119000 0 ๆˆ‘ๆ‰“็ฎ—ๅ…ˆๅŽปๅ…ถไป–ๅ›ฝๅฎถๆ—…ๆธธxxๆœˆไปฝๅœๅœจ็พŽๅ›ฝ\n119001 0 ๅ†…่ฃคๆ˜ฏ่ฃคๅญ่ƒธ็ฝฉๆ˜ฏ่กฃๆœ่ฃ™ๅญๅฐฑๆ˜ฏ่ฃ™ๅญไฝ†ไธๆ˜ฏ่กฃๆœ\n119002 0 ๅ‘ๅช’ไฝ“็ˆ†ๆ–™้Ÿฉๅ›ฝ่‰บไบบ้‡‘ๆŸ็š„ๅ‰ๅ‰ๅฅณๅ‹\n119003 0 ็™ฝๅธฆๅผ‚ๅธธๅฆ‡็ง‘็–พ็—…ใ€ไธๅญ•\n119004 0 ๆปดๆปดๆ‰“่ˆนๅฟซ่ˆนไธ“่ˆน้กบ้ฃŽ่ˆน็ปŸ็ปŸๆญ‡่œ\n1000 Processed\n classify content\n119500 0 ไธŽ็™พๅบฆไบคๅฅฝโ€ฆไปฅๅŽๅฐฑๆ˜ฏไบ’่”็ฝ‘็š„ๆ—ถไปฃไบ†\n119501 0 PS๏ผšๅฏ่งๅ—ไบฌๅธ‚ๅฏนไบŽๆฑŸ่‹็œ็š„ไฝ•ๅ…ถ้‡่ฆ\n119502 0 ่ฟ™ไธช่…พ่ฎฏๆตฎไบ‘่”ก่Šท็บญ่ฃ…็š„ๅˆฐ่›ฎๅƒ็š„\n119503 0 Google็”จๆœบๅ™จไบบๆฅๆต‹่ฏ•่งฆๆ‘ธๅฑๆ—ถๅปถ\n119504 0 ่™ฝ่ฏด็ฌฌไธ€ๆฌก็”จ็”ต่„‘็š„yyๅ‚่ต›\n1000 Processed\n classify content\n120000 0 ๅ…ญๅˆๅŒบ้™้ขไปฅไธŠๅ•ไฝๅฎž็Žฐ็คพไผšๆถˆ่ดนๅ“้›ถๅ”ฎๆ€ป้ขxxxxxxไธ‡ๅ…ƒ\n120001 0 ๆˆ‘ๆ‰็Ÿฅ้“ไธบไป€ไนˆๆŠŠ้ฃžๆœบๅซๅšhuiๆœบ\n120002 0 ๅฏๅฎ‰่ฃ…่…พ่ฎฏๆ‰‹ๆœบ็ฎกๅฎถๆ‹ฆๆˆช่ฏˆ้ช—็Ÿญไฟก\n120003 0 ไฝ†่ฏด็œŸ็š„ๅฆ‚ๆžœๆ•™ไธปbabyไธŠ็œŸไบบ็ง€่Š‚็›ฎๅŠžๅฉš็คผ่ฟ˜ๆ˜ฏ่›ฎๆœ‰็œ‹็‚น็š„\n120004 0 ๆฒๆบๅŽฟๅธๆณ•ๅฑ€ไผ ่พพ็œๅŽ…ๅธ‚ๅฑ€ๅŠๅนดๅทฅไฝœไผš่ฎฎ็ฒพ็ฅž\n1000 Processed\n classify content\n120500 0 ๅ–ทไบ†่ฐทๆญŒๆ˜ฏไธๆ˜ฏ้™คไบ†็ฟป่ฏ‘ไปฅๅค–็š„้ƒฝไธ่ƒฝ็”จไบ†็™พๅบฆๆŸฅๅ‡บๆฅ็š„ๅ›พ็‰‡ๅ…จๆ˜ฏshi\n120501 0 ไนŸๅฐฑๆ˜ฏๅฎƒๅฏ็”จไฝœ็šฎ่‚ค็ง‘ๅŒป็”Ÿๅค„ๆ–นๅ’Œ่ฏๅ‰‚ไฝฟ็”จ็š„ๆŠค่‚คๅ“\n120502 0 ็œ‹่Šฑๅƒ้ชจ็š„้ƒฝๅ–œๆฌขไธŠไบ†็™ฝๅญ—็”ป\n120503 0 ๅ“†ๅ•ฆAๆขฆstandbyme\n120504 0 ๆˆ‘ไปฌ็š„ๆœบๅ™จไบบโ€œๅธˆๅ‚…โ€ๆญฃๅฎ‰้™ๅœฐโ€œ็ซ™โ€ๅœจๅŽจๆˆฟ้‡Œ\n1000 Processed\n classify content\n121000 1 ไฝ ๅฅฝใ€่‹ๅทžๅฅฅ็‰น่Žฑๆ–ฏtheoryไปŽxๆœˆxๅทๅˆฐxๆœˆxๅทๅ…จๅœบxๆŠ˜ใ€่ฐข่ฐข\n121001 0 ๅ†ไนŸๆฒกๆณ•ๅๅœจ็”ต่„‘ๅ‰็œ‹ไฝ ไปฌ็š„็›ดๆ’ญ\n121002 0 ๅŒ…ๆ‹ฌไบฌไธœๅ’Œ้˜ฟ้‡Œๅทดๅทดๅคง็”ตๅ•†้‡‘่žๅŒ–\n121003 0 ice็Ž‹ๅฏนๆกˆไปถๅˆ†ๆžๅๅˆ†่ฏฆๅฐฝ\n121004 0 2็š„ๆˆ‘ๅœจๅฎถๅผ€็€็ฉบ่ฐƒๆง็€่ฅฟ็“œ\n1000 
Processed\n classify content\n121500 0 ๅฅฝๅฃฐ้Ÿณ่ˆžๅฐไธŠ้‚ฃไบ›ๆƒŠ่‰ณไบ†ๆˆ‘็š„ๅฅฝๅฃฐ้Ÿณ๏ผš็ฌฌไธ€ๅญฃ็š„ๅผ ็Žฎ\n121501 0 ้ฃŽ้ก้Ÿฉๅ›ฝ็š„่œ‚่œœ้ป„ๆฒนๆไปๅฐๅŒ…35G\n121502 0 ๅŽไธบโ€œ่ฟ›ๅ‡ปโ€ๅ…ฌๆœ‰ไบ‘ๆŒ‘ๆˆ˜ไบ’่”็ฝ‘ๅทจๅคด|ๅŽไธบโ€œ่ฟ›ๅ‡ปโ€ๅ…ฌๆœ‰ไบ‘ๆŒ‘ๆˆ˜ไบ’่”็ฝ‘ๅทจๅคด2015ๅนด07ๆœˆ31ๆ—ฅ03\n121503 0 ๅŒ—ไบฌๅŠ็ฏฎ้…้‡็”ตๅŠจๅŠ็ฏฎ้…้‡ๆฐดๆณฅ้…้‡็งŸ่ตๅ‡บ็งŸๅŠ็ฏฎ้…\n121504 1 ใ€TATAๆœจ้—จใ€‘ไบฒx.xxๅ…จๅ›ฝ่”ๅŠจๅทฅๅŽ‚็›ดไพ›ๅฎžๆœจๅคๅˆ*/ๆฌพxxxxๅ…ƒ๏ผŒๅ‰xxๅๅ‡ญๆญค็ŸญไฟกๅฏๆŠต/x...\n1000 Processed\n classify content\n122000 0 ๅˆฉ้€šๅ…ฌๅฎ‰ๅˆ†ๅฑ€ๆˆๅŠŸ็ ด่Žทไธ€่ตทไปฅโ€œๅŒ…ๅฐๅงโ€ไธบ็”ฑๅฎžๆ–ฝ่ฏˆ้ช—็š„็Šฏ็ฝชๅ›ขไผ™\n122001 0 ๆ˜Žๆ˜ŸไปฌไนŸ้ƒฝๅœจ็”จๆˆ‘ไปฌๅฆ†ๅŽ็š„้…ต็ด ๆด้ขœ็ฒ‰??\n122002 0 ไธฅๅމๆ‰“ๅ‡ป็ ดๅ็”ตไฟก่ฎพๆ–ฝ็Šฏ็ฝช่กŒไธบ\n122003 0 ่ฟ™ๆ ทไธ€ไธชๆœˆไธ‹ๆฅๅฐฑๅพ—5800ๅ…ƒไบบๆฐ‘ๅธ\n122004 0 ้ฉพ้ฉถๅ‘˜้ฉพ้ฉถ็š–C***ๅท้‡ๅž‹ๅŠๆŒ‚่กŒ้ฉถ่‡ณไบ”ไฟ้ซ˜้€Ÿxxxๅ…ฌ้‡Œๅค„ๆ—ถๅ› ๆŒ‚่ฝฆ็š„็ฏๅ…‰ไฟกๅทใ€ๅˆถๅŠจใ€่ฟžๆŽฅใ€ๅฎ‰ๅ…จ...\n1000 Processed\n classify content\n122500 0 ็›ฎๅ‰ๅฒ›ๅ†…ๆฏ”่พƒ็ƒญ้—จ็š„็งŸๆˆฟๅŒบๅŸŸๅฆ‚ๅ‰ๅŸ”ใ€็‘žๆ™ฏใ€็ซ่ฝฆ็ซ™็ญ‰\n122501 0 ๅœจๆญคๆณ•ๅฎ˜ๅฑ•็คบไธ€ไบ›็œŸๅฎž็š„ๅ€Ÿ่ดท็บ ็บทๆกˆไพ‹\n122502 0 ไธ€ๅ35ๅฒ็š„ๅ•†ๅœบไฟๆดๅทฅ่ขซB2่‡ณB1ๅฑ‚็š„่‡ชๅŠจๆ‰ถๆขฏๅคนไฝ่…ฟ้ƒจ\n122503 0 ๆณ•้ฉฌไปฃ่กจ่ฎจ่ฎบ้ฃžๆœบๆฎ‹้ชธ้‰ดๅฎšไบ‹ๅฎœๆฏ›้‡Œๆฑ‚ๆ–ฏๅผ€ๅง‹ๆœๅฏป๏ผญ๏ผจxxx\n122504 1 ้™ˆ่ฏš่ฝฎ่ƒŽๅ…ฌๅธ่ฐญๅฐๆ–‡็ฅๅ„ไฝ่€ๆฟ็พŠๅนดไธ‡ไบ‹ๅคงๅ‰ใ€่ดขๆบๅนฟ่ฟ›๏ผๅ…ฌๅธไธป่ฅไธ‰ๅŒ…ๅ…จ้’ข่ƒŽ:ๅ…จ็ƒ่กŒใ€ๅทฅ็Ÿฟๅž‹ใ€ๅ‡กไธ–...\n1000 Processed\n classify content\n123000 0 ๆ‰ฌๅญๆฑŸๅคง้“ๅฐ†ๆ–ฐๅขž29ๅค„่ฟ‡่ก—้€š้“\n123001 0 ไธญๅ›ฝ้‡ๅทฅไปฅๅŠไธญๅ›ฝๅซๆ˜Ÿ็ญ‰้ƒฝๆ˜ฏๅคฉๅพทๆŒ็ปญ็œ‹ๅฅฝ็š„ๅ“็ง\n123002 0 ๆ‰€ๆœ‰็”ตๆขฏ้ƒฝๆœ‰่ฟ™ไธช็บข่‰ฒ??ๆŒ‰้’ฎ\n123003 0 ๅ•ไฝ้€ไฟกไปถๆ‚ๅฟ—็š„็‰ฉไธšๅ‰ๅฐๅงๅง่ฎค่ฏ†ๆˆ‘ไบ†\n123004 0 ๅฎš่ฅฟๅธ‚้€šๆŠฅ10่ตทๅ‘็”Ÿๅœจ็พคไผ—่บซ่พน็š„โ€œๅ››้ฃŽโ€ๅ’Œ่…่ดฅ้—ฎ้ข˜\n1000 Processed\n classify content\n123500 0 Win10Mobile็‰ˆๅพฎ่ฝฏๅฐๅจœ็‰นๆŠ€๏ผš็Ÿฅๆ™“ไธ–็•Œๆ—ถ้—ด\n123501 0 ๅˆšๆ‰ๆˆ‘ๅœจๆˆฟ้—ด้‡Œ็Žฉๆ‰‹ๆœบๅ‘ขๅฐฑๅชๅฌๅค–้ขไธ€ๅฐๅง‘ๅจ˜ๅคงๅ–Š๏ผšๆˆ‘่ฆๅ‡บไธฝ่ŽŽๅธƒๅธƒ\n123502 0 
ๅŒป็”Ÿไธ€ๅผ€ๅง‹่ฎคไธบไป–ๅชๆ˜ฏ่ขซ่™ซๅญๅ’ฌไบ†\n123503 0 ๆŒๆœ‰ๆ‰‹ไธญ็š„่‚ก็ฅจไปŠๅคฉไธ‹ๅˆๅญ˜ๅœจๅ›ž่ฝ็š„ๅฏ่ƒฝ\n123504 0 ๅœจ7ๆœˆ7ๅทๆœฌไบบๅ†ๆฌกๆŠฅ้“ๅคง่”ไธ€่€ๅ…ต22ๅนดๅ› ไธบๆ”ฟๅบœๅฎ˜ๅ‘˜ไธขๅคฑๆกฃๆกˆ้€ ๆˆ่€ๅ…ตๅฆป็ฆปๅญๆ•ฃๅŽ\n1000 Processed\n classify content\n124000 0 ๅˆš็œ‹ๅˆฐ้‚ฃไธช็”ท็”Ÿๅœจๅฅฝๅฃฐ้Ÿณ่ˆžๅฐไธŠๆฑ‚ๅฉš\n124001 0 ่‹ฑๅ›ฝ้“่ทฏ้€š็ฅจ๏ผšๅ…่ดน่ต ้€ไน˜่ฝฆๅคฉๆ•ฐ\n124002 0 ๆœ€้€‚ๅฝ“็š„้˜ฒๆ™’็ณปๆ•ฐๆ˜ฏไป‹ไบŽSPF15ๅˆฐSPF30ไน‹้—ด\n124003 0 ๅนถ็›—่ตฐ็ฌฌ328็ชŸๅฝฉๅก‘ไพ›ๅ…ป่ฉ่จๅƒ็ญ‰\n124004 1 ๅฐŠๆ•ฌ็š„ๅฎถ้•ฟๆ‚จๅฅฝ๏ผ้˜ณๅ…‰่‰บๆœฏ้ฆ†ๅผ€ๅญฆๅ•ฆ๏ผŒๅ‘จไธ€่‡ณๅ‘จไบ”ไธญๅฐๅญฆ็”Ÿไฝœไธš่พ…ๅฏผๅทฒๆœ‰ไธ“ไธš็š„่€ๅธˆ่พ…ๅฏผ๏ผŒ็พŽๆœฏๅ’Œไนฆๆณ•็š„...\n1000 Processed\n classify content\n124500 0 ไธ€่ตทๅ›žๅฟ†็€้‚ฃไบ›ๅนด็š„็‚น็‚นๆปดๆปด\n124501 0 ๅ›žๆฅๆ‰‹้‡ŒๅฐฑๆŠ“ๆ„š่ ขๅ’Œ็Šฏ็ฝช่กŒไธบ\n124502 0 ้‚ฃไนˆไบ’่”็ฝ‘+่กŒๅŠจ่ฎกๅˆ’่ฆๅฆ‚ไฝ•ๅพ—ไปฅๅฎž็Žฐๅ‘ข\n124503 0 ่€Œๆ นๆฎๆ–ฐไฟฎๆ”น็š„ๅฐ†ไบค้€šๆณ•่ง„่ง„ๅฎš\n124504 0 ๅŒ—ไบฌ่‡ณๅฎๅปๆฎต็บฆ80ๅ…ฌ้‡ŒไธŽไบฌๅ”ๅŸŽ้™…ๅ…ฑ็บฟ\n1000 Processed\n classify content\n125000 0 ็„ถๅŽไป–ๅฐฑไปŽ็”ต่„‘้‡Œๅ‡บๆฅไบ†โŠ™โ–ฝโŠ™\n125001 0 ่ฐ่ฟ˜ๆ•ข่ฏด่ฝฌๅŸบๅ› ๅชๆ€่™ซๅญไธๆ€ไบบ\n125002 0 ๆณฐๅทžๅฐฑๆฒกๆœ‰ๅ“ชๅฎถๆ—ฅๆ–™ๆœ‰็บณ่ฑ†ๆœ‰ๅฏฟๅ–œ้”…ๆœๅŠกๅ‘˜ไธไผšไนฑ้€ผ้€ผ็š„ๅ—??\n125003 0 ่ฟ™ๆฌก็š„ๅ›พ็‰‡ๆฎ่ฏดๆฅ่‡ชiPhone็›ธๅ…ณไบงไธš้“พ\n125004 0 ๆˆ‘็”จ็™พๅบฆ่ง†้ข‘ๆ‰‹ๆœบ็‰ˆ็œ‹ไบ†โ€œ็พŽๅฅณไธบๅ‡‰ๅฟซ่บฒ่ฟ›ๅ†ฐๆŸœ้‡Œโ€\n1000 Processed\n classify content\n125500 0 nuru็š„็ฟป่ฏ‘่ฟ˜ๆœ‰้…้ŸณไนŸๅฅฝๅฏ็ˆฑ\n125501 0 ๆ ก้•ฟใ€ๅ‰ฏๆ ก้•ฟๅ› ็Šฏ็Žฉๅฟฝ่Œๅฎˆ็ฝช\n125502 0 8ๆœˆ7ๆ—ฅๆ„Ÿ่ฐขไธŠๅธๆ„Ÿ่ฐข่€ถ็จฃๆ„Ÿ่ฐข่ฉ่จ็ปง็ปญๅฝ“ไธชๅฅฝไบบๆŠฅ็ญ”ไฝ ไปฌไนŸๆ้†’ๅคงๅฎถๅ็”ตๆขฏๆณจๆ„ๅฎ‰ๅ…จๆณจๆ„ๅฎ‰ๅ…จ\n125503 0 ็Žซ็ณๅ‡ฏ่‹ๅทž่ต›ๅŒบ็™พๅ˜็พŽไบบๅญฃ\n125504 0 ไปฅๅŠ้’่—้ซ˜ๅŽŸไธญ้€”็ซ™โ€”ๅ”ๅคๆ‹‰ๅฑฑๆŽช้‚ฃๆน–้‚ฃๆ›ฒ\n1000 Processed\n classify content\n126000 0 ๅœจๆŽฅๅ—ไธญ่ง†ใ€Žๆ”นๅ˜็š„่ตท็‚นใ€ไธ“่ฎฟๆ—ถ\n126001 0 ๅ‘จ่พน็Žฏ็ป•็€ๅทด้ปŽๆ˜ฅๅคฉๅ•†ไธšไธญๅฟƒ\n126002 0 ๆ‰“้บปๅฐ†่พ“ไบ†่ฟ˜ๆœ‰ๅŽๅค‡ๅ†›ๅ‘็บขๅŒ…\n126003 0 ๆˆ‘ๅˆ็”จ็™พๅบฆ้’ฑๅŒ…xๅˆ†้’ฑๅ……xๅ…ƒ่ฏ่ดนไบ†\n126004 1 
ๅฐŠ่ดต็š„ๅฎขไบบ๏ผŒไบฌ้ƒฝ่–‡่–‡็ฅๆ‚จโ€œไธ‰ๅ…ซ็พŽๅฅณ่Š‚โ€ๅฟซไน๏ผไธบๆ„Ÿๆฉๅ›ž้ฆˆ๏ผŒ็‰นๅˆซไธบๆ‚จๆŽจๅ‡บโ€œ่ฐขๅคฉ่ฐขๅœฐ๏ผŒๆ‚จๆฅๅ•ฆ๏ผโ€็ญ”...\n1000 Processed\n classify content\n126500 0 ่ฎฉไธ€ไธช20ๅ‡ ๅฒ็š„ๅฐๅง‘ๅจ˜ๅœจๅ…ป่€้™ขไธ€ๅนฒๅฐฑๆ˜ฏ28ๅนด\n126501 0 ๅˆšไธ‹้ฃžๆœบๅฐฑๅฌๅˆฐๆœ‰่ˆช็ญๅ› ไธบๅคฉๆฐ”ๅŽŸๅ› ๅ–ๆถˆ\n126502 0 ้“ถๆณฐ0571โ€”86234738\n126503 1 ๏ผˆ่ฎฉไฝ ้šๆ—ถๆŽฅๅฌๅˆซไบบ็”ต่ฏๅ’Œ็œ‹ๅˆฐๅพฎไฟกQ-Q่Šๅคฉ่ฎฏๆฏโ†’ๅฏนๆ–นไธไผšๅ‘็Žฐ๏ผ‰๏ผˆ่ฏฆ๏ผšxxxxxxxxxxx๏ผ‰\n126504 0 innisfreeๅ’Œnaturerepublic้˜ฒๆ™’้œœxๅฎ้œฒ้œฒ้ฉฑ่šŠๅ–ท้›พๆˆไบบๅ„ฟ็ซฅๅฏ็”จx\n1000 Processed\n classify content\n127000 0 ๅค–้ข็š„ๅ„็ง่ฃ…ไฟฎๅฃฐๆŠŠๆˆ‘ๅต้†’ไบ†\n127001 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 9pd9u8ไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n127002 0 ้šๆ—ถ้šๅœฐๆŠŠ็…ง็‰‡ไธŠไผ ๅˆฐๆ‰‹ๆœบ็ญ‰็งปๅŠจ่ฎพๅค‡\n127003 0 ่€Œๅณๅฐ†ๅœจ9ๆœˆไปฝๅ‘ๅธƒ็š„2016ๆ˜ฅๅค็ณปๅˆ—ๅˆ™ๆ˜ฏไป–ๅœจ่ŒๆœŸ้—ด่ฎพ่ฎก็š„ๆœ€ๅŽไธ€ไธช็ณปๅˆ—\n127004 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผŒๆญฃๅ’Œ่ฃ…้ฅฐๅŸŽไธ‰ๆฅผ็พŽๅฟƒๅฎถ็พŽๆœจ้—จ็ฅๅ…ƒๅฎต่Š‚ๅฟซไน๏ผๅŒๆ—ถx.xx็š„โ€ไนไบซๅฅ—้คโ€ไผ˜ๆƒ ๆดปๅŠจไนŸ...\n1000 Processed\n classify content\n127500 0 ๅคง็บฆๆœ‰127็งๆถ‰ๅŠๅˆฐๆ“ไฝœ็ณป็ปŸๅ’Œๆ— ็บฟ้€šไฟก็š„ไธ“ๅˆฉ่ขซ็”จไบŽๅฎ‰ๅ“ๆ‰‹ๆœบ\n127501 1 ็Žฐๆœ‰ๅคš่‚ฝๅฐฟ็ด ๅˆฐ่ดง๏ผŒxxkg่ง„ๆ ผ๏ผŒ้ป„่‰ฒ้ข—็ฒ’๏ผŒๅŠ้€ๆ˜ŽๅŒ…่ฃ…๏ฝž๏ฝžไปทไฝxxxx๏ฝž่ฆๅพ—่ฏท่”็ณป๏ผŒๅฎš็‚น้”€ๅ”ฎ ...\n127502 0 ่ฟ˜ๆ˜ฏ็ปง็ปญๆšดๅŠ›ๆŠ—ๆณ•็กฎๅฎžๅฏไปฅๅฝ“ๅœบๅ‡ปๆฏ™\n127503 0 5ใ€้ฅฎ้ฃŸไธŠ่ฆๅšๅˆฐไฝŽ่„‚่‚ชๅ’Œ้ซ˜็บค็ปด็›ธ็ป“ๅˆ\n127504 0 ๅˆๅคฑ็œ ๆ˜Žๆ—ฉ่ฟ˜ๆœ‰labtest\n1000 Processed\n classify content\n128000 0 ่Œถ้’ฑ3ๅ…ƒๅŠๅคฉโ€ฆ็œŸ็š„ไปฅไธบ่€็™พๅง“็š„้’ฑๆ˜ฏๅ……่ฏ่ดน้€็š„ๅ—\n128001 0 ไธ€ๆ—ฆไธŽๅพฎ่ฝฏ็š„็ฆไธšๅ่ฎฎๅœจ2016ๅนดๅˆฐๆœŸ\n128002 0 ็œ‹็œ‹ๅบ“้‡ŒNBA็”Ÿๆถฏ่ฟ™ไธ€่ทฏ็ปๅ—็š„ไผค็—›\n128003 0 ่Žท็™พๅบฆipadๅฎขๆˆท็ซฏๅฃ็บธ\n128004 0 ๆฅ่‡ชๅฎ‰ๅพฝ็œ็š„xxๅ็Ÿฅๅไธ“ๅฎถๆฅๅˆฐๅฎฟๅทžๅธ‚ไธบๅฝ“ๅœฐ็š„ไผไธšใ€ๅ•ไฝๅผ€ๅฑ•ๆ™บๅŠ›ๆœๅŠก\n1000 Processed\n classify content\n128500 0 MicrosoftๅฐVRๆŠ€่ก“ไฟกๅฟƒๅ่ถณ\n128501 0 ้ซ˜ๅฐ”ๅคซ็ƒๅœบๆˆๅคงๅธๆฐดๅ™จๅŒ—ไบฌ็ƒๅœบ็”จๆฐด็›ธๅฝ“ไธคไธชๅŒบ\n128502 0 
ไปฅๅผบๅฅธๅฆ‡ๅฅณ็ฝช่ขซๅˆคไบ†3๏ฝž10ๅนดโ€ฆโ€ฆๅฅฝๅƒๆŒบ็Ÿญ็š„\n128503 0 ๅฑ…็„ถๅœจ่…พ่ฎฏ่ง†้ข‘็œ‹่ง่‡ชๅทฑ็š„็…ง็‰‡็œŸ็š„ๅ‡บ็Žฐๅœจ7ๆœˆ5ๆ—ฅๆฑ ๆ˜Œๆ—ญ้Ÿฉๅ›ฝไธ–่ดธๅคฉ้˜ถ็”Ÿๆ—ฅไผš็š„ๅคงๅฑๅน•ไธŠ\n128504 0 17็บงๅฐ้ฃŽโ€œ็ฟ้ธฟโ€่ฆ็™ป้™†ๆต™ๆฑŸไบ†\n1000 Processed\n classify content\n129000 0 ไนŸ่ฎธๅฅณ็”Ÿๆœ‰้šพๅค„ๅคงๅญฆไผšๅญ˜ๅœจๆฝœ่ง„ๅˆ™ไนˆ\n129001 1 xxไธ‡ๆฌงๅ…ƒ่ดญๆˆฟ้€่ฅฟ็ญ็‰™็ปฟๅก๏ผŒxxไธ‡ไบบๆฐ‘ๅธ่ตท็งปๆฐ‘ๅพทๅ›ฝ๏ผŒไธ€ไบบๅŠž็†ๅ…จๅฎถไบซๅ—ๅŒ็ญ‰ๅพ…้‡๏ผŒๅญๅฅณไบซๅ—ๆฌงๆดฒๆ•™...\n129002 0 ๆฑŸ่‹ไบบ่ฟ™ไนˆ่ƒฝๅ–โ€ฆโ€ฆ็ฌฌไธ€ๆฏๅ–ๅฎŒ็›ดๆŽฅๅฐฑๆ‹ฟๅˆ†้…’ๅ™จๅนฒ\n129003 0 ่ญฆๅฏŸไนŸๆฒกๅ‡ ไธชๆ˜ฏๅคฉๅคฉๅ‡ญ่‰ฏๅฟƒๅšไบ‹็š„\n129004 0 ไธ€่ทฏๆŠ•่ต„ไธ€่ทฏๆ”ถ็›ŠๅฑžไบŽๆ‰“้•ฟ็บฟ\n1000 Processed\n classify content\n129500 0 ๅ—ไบฌ้’ๅฅฅไผš่ทณๆฐดๆฏ”่ต›็”ทๅญ3็ฑณๆฟ็š„ๅ† ๅ†›ๆ˜จๅคฉๅ‡บ็‚‰\n129501 0 ไธพๆŠฅ่€…โ€œ้‡Šๆญฃไน‰โ€ๅœจๆ˜จๆ™š7ๆ—ถๆœ€ๆ–ฐๅ‘็ป™ๅช’ไฝ“่ฎฐ่€…็š„ๆๆ–™ไธญ็งฐ\n129502 0 ็”ฑ็›ๅŸŽ่ดจ็›‘ๅฑ€ไธ‹ๅฑžๅ•ไฝ็›ๅŸŽๅธ‚่ฎก้‡ๆ‰€ๆ‰ฟ\n129503 0 ่ฏท็ป่ฟ‡ๅธธ็†Ÿๆฎตๅ›ฝ็œ้“็š„่ฝฆ่พ†ๅ‡้€Ÿๆ…ข่กŒ\n129504 0 ไธ€่ˆฌ็œ‹่ฐทๆญŒ็›ดๆ’ญไธ€่พนๆ— ้™loop่ง้€ธๆ™ด็š„ๅจ้ฃŽๅ ‚ๅ ‚ๆฏซๆ— ่ฟๅ’Œๆ„Ÿhhh\n1000 Processed\n classify content\n130000 1 ไบฒ็ˆฑ็š„VIP๏ผšMASFER-SUๅฅณไบบ่Š‚ๆดปๅŠจ็ซ็ƒญๆฅ่ขญๅ•ฆ๏ผx.x๏ฝžx.xๆœŸ้—ดxxๅนดๆ˜ฅๅญฃๆ‰€ๆœ‰ๅ•†ๅ“ๅŠ...\n130001 1 ้‡ๅบ†ๆ˜Ÿ้กบๅฅ”้ฉฐไธ‡ๅทžxSๅบ—ๅฐ†ไบŽๆœฌๅ‘จๆœซ๏ผˆxๆœˆxใ€xๆ—ฅ๏ผ‰ๅœจๅฑ•ๅŽ…ไธพ่กŒโ€œไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚โ€็‰นๆƒ ๆดปๅŠจ๏ผŒๅฏ่ƒฝๆ‚จๆ„ๅ‘...\n130002 0 ไบš้ฉฌ้€Šๆ‹›่˜ๅ†…ๅน•๏ผšBarRaiser็‹‚ๅˆทๅบ”่˜่€…\n130003 0 ๅญฆไผšไบ†ๆŸ ๆชฌ็š„ๅทๅ‘ๆ–นๆณ•ๆ„Ÿ่ง‰่‡ชๅทฑ็‰›้€ผ็š„ไธๅพ—ไบ†ๆฒกๆœ‰็”ต่„‘็š„ๅฅฝๅค„ๅ—ๅคชๆ— ่Šไบ†ๆ€ปๆƒณๆ‰พ็‚นไบ‹ๅš\n130004 0 ไปฅๅพ€้ฃ˜ๆ•ฃๅœจๅŠ ๆฒน็ซ™ๅ†…ๆต“ๆต“็š„ๆฑฝๆฒนๅ‘ณๅฆ‚ไปŠๅทฒ็ปไธ€ๅŽปไธๅค่ฟ”ไบ†\n1000 Processed\n classify content\n130500 0 ไธฅๅމๆ‰“ๅ‡ปๆถ‰ๆžชๆถ‰็ˆ†่ฟๆณ•็Šฏ็ฝช่กŒไธบ\n130501 0 ไธ“ๅฎถ๏ผšๅฏนๅŽๅ‡บๅฃ้‡ๅž‹็ซ็ฎญๅ‘ๅŠจๆœบไธ่ฟๅๅ†›็”จ็ซ็ฎญๆŠ€ๆœฏไธๆ‰ฉๆ•ฃๅˆถๅบฆ/Sputnikไธญๅ›ฝโ€“ๆ–ฐ้—ป\n130502 0 ๆ‰€ไปฅ่ดดๅฟƒ็š„ๅฐYO่”ๅˆ็™พๅบฆๆ‰‹ๆœบๅŠฉๆ‰‹็ป™ๅคงๅฎถ่ฐ‹\n130503 0 ๅพฎ่ฝฏ็กฎ่ฎคๅ…ฌๅ…ฑ้ข„่งˆ็‰ˆๅฐ†ๅœจxๆœˆๅบ•ๅˆฐๆฅ\n130504 0 
ไธ€่ง‰้†’ๆฅๅ‘็Žฐ็”ต่„‘้‡Œ็่—ๅคšๅนด็š„ๆทปๅฏŒๅฎ็š„ๅ›พๅŒ…ๆฒกไบ†่ฟ˜็Žฉๆฏ›\n1000 Processed\n classify content\n131000 0 ||ๆˆ‘ๅœจๆฐงๆฐ”ๅฌไนฆๆ”ถๅฌโ€œ012้•ฟ็•™ไธŠไป™โ€\n131001 0 ๅฝ“ๅ‰A่‚ก่ž่ต„็›˜่ง„ๆจกไป็„ถๆ˜Žๆ˜พ้ซ˜ไบŽๅ›ฝ้™…ๆฐดๅนณ\n131002 0 ๅธฝๅญ้•ฟ่ข–้˜ฒๆ™’้œœandๅ…ญ็ฅžๅทฒๅ‡†ๅค‡ๅฅฝ\n131003 0 ้œ€่ฆๅŠ ๅซๆ˜Ÿๅท๏ผšxiaovvvjian\n131004 0 ไปปไฝ•่‡ช็”ฑ้ƒฝๅฟ…้ ˆๅ—ๆณ•ๅพ‹้™ๅˆถ็š„\n1000 Processed\n classify content\n131500 0 #NAME?\n131501 0 1ๆ˜ฏไบบๅŠ›่ต„ๆœฌๆตๅ‘่ฟ‡ๅบฆๆ”ฟๅบœๅนฒ้ข„\n131502 0 ๅœจไบ†่งฃๅ’Œ่ฎค่ฏ†ๅคงไผ—ๆฑฝ่ฝฆ็š„ๅŒๆ—ถ\n131503 0 ๆœ€่ฟ‘ๅœˆๅ†…็–ฏไผ ไธ€ๆฎต่ฏ๏ผšไธ–็•ŒไธŠๆœ€้ฅ่ฟœ็š„่ท็ฆปๆ˜ฏ๏ผšไฝ ๆ˜Ž็Ÿฅๆˆ‘ๅœจๅšWV\n131504 0 ๅพˆๅคšๆฌก็œ‹ๅˆฐๅฐๅญฉๅœจ็”ตๆขฏไนฑ่ท‘่นฆ่ทณ\n1000 Processed\n classify content\n132000 0 ็”Ÿ้ฒœ็”ตๅ•†็š„็ƒญๅบฆไปŽ2014ๅนดๅผ€ๅง‹ไพฟๆ˜ฏๆœ‰ๅขžๆ— ๅ‡\n132001 0 LXๆˆ็†Ÿ็š„็œŸ็›ธๅฐฑๆ˜ฏๆ‹จๅ‡บ่ฟท้›พ่ง้’ๅคฉ\n132002 0 ๅญฆไผšไบ†ๆ— ๆ‰€ไบ‹ไบ‹ๅ’Œๅผ„่™šไฝœๅ‡๏ฝž๏ฝž๏ฝž\n132003 0 ็ฌฌไธ€ๆฌกๅ้ฃžๆœบๆœ‰ไป€ไนˆ้œ€่ฆๆณจๆ„็š„ๅ‘€\n132004 1 ๆต™ๆฑŸๅŸบ่ฏไธญๆ ‡ไบงๅ“็‹ฌไธ€ๅ‘ณ้ข—็ฒ’๏ผˆ็œๅขž่กฅ๏ผŒไธดๅบŠ่ฎคๅฏๅบฆ้ซ˜๏ผŒxg*xx่ข‹/็›’๏ผŒไธญxx.xxๅ…ƒ๏ผ‰ๅ…จ็œ้š†้‡...\n1000 Processed\n classify content\n132500 1 ๆ‚จๅฅฝ๏ผšๆˆ‘ๆ˜ฏๅผ ๅฎถๆธฏๅŽ่ƒœ็š„ๅฐ้‡‘๏ผŒxๆœˆๅ…ฌๅธๅผ€ๅนดๅทจๆƒ ๆดปๅŠจ๏ผšx.ๆ–ฐๅฎขๆˆทๅฐไฟๅ…ป๏ผˆๆœบๆฒน๏ผŒๆœบๆปค๏ผŒๅทฅๆ—ถ๏ผ‰ๆ˜ฏxS...\n132501 0 GD็Ÿฅ้“ๆˆ‘ไปฌๅบ”ๆดๅทฎ่ฟ˜ๅธฆๅŠจ็Ÿฅ้“ๆˆ‘ไปฌไธ่ƒฝ็ซ™ๅฐฑๅซๆˆ‘ไปฌ็ซ™่ตทๆฅๆˆ‘็œŸ็š„ๅพˆๅฟƒ้…ธ\n132502 0 ๆฑ‚ๅŠฉไบŽ่…พ่ฎฏ่ง†้ข‘็š„ๅŒๅญฆไนŸๆ— ๆžœ\n132503 0 07ไบฟ็พŽๅ…ƒ็งๆœ‰ๅŒ–ๅ…จ็ƒ็ฌฌไธ‰ๅ’Œ็ฌฌๅ››ๅคงๆ‰‹ๆœบ่Šฏ็‰‡ๅ•†ๅฑ•่ฎฏไธŽ้”่ฟช็ง‘\n132504 0 ไป–ๆŽจๆธฌๆฅŠ่ฒดๅฆƒ็ด„165ๅ…ฌๅˆ†ใ€60ๅ…ฌๆ–ค\n1000 Processed\n classify content\n133000 1 ๆ„Ÿ่ฐข่‡ด็”ตๆฝฎๅทžๅ…จๅ…ด็‚‰ๅ…ท๏ผŒๆœฌๅ…ฌๅธไธป่ฆ็ป่ฅๅ›ฝๅ†…ๅค–ๅ„็งๅ็‰Œ็‚‰ๅ…ท๏ผŒ็ƒญๆฐดๅ™จ๏ผŒๆŠฝๆฒน็ƒŸๆœบ๏ผŒๅฎถ็”จ็”ตๅ™จๅŠๅ„็งๅ•ๅฑ...\n133001 0 ๆƒณๅœจๅ—ไบฌ็ปง็ปญไบฒไธด็›—ๅข“็ฌ”่ฎฐ2\n133002 0 โ€œ็”ต่„‘็ปผๅˆๅพโ€ๆ˜ฏๆœ€่ฟ‘ๅ‡ ๅนดๆๅ‡บ็š„ไธ€ไธช็–พ็—…็—‡ๅ€™็พค\n133003 0 ็ƒŸๅถๆฅๆบไธๅˆๆณ•ๅธฎๅฟ™่ฟ่พ“ๆˆๅ…ฑ็Šฏ\n133004 0 MKๅ…จ็‰›็šฎ่€ณๆœตๅŒ…ไธค่พน่ฑๆ ผ่ฎพ่ฎก\n1000 Processed\n classify 
content\n133500 0 ๅคฉๅ“ชๆˆ‘่ง‰ๅพ—่‡ชๅทฑๆœ‰็‚น้…ท็บขๅŒ…ๆ‰‹ๆฐ”ๆ˜ฏไธๆ˜ฏๅพˆๅމๅฎณ\n133501 0 ไฝ ไปฌ็š„ๅฐพๅทดๅœจ2015ๅนด7ๆœˆ8ๆ—ฅ้œฒไบ†ๅ‡บๆฅ\n133502 0 ๆˆ‘ไปฌ็š„ๅŸŽ็ฎกๅ”ๅ”ๅฐฑๆ‰‹ไธ‹็•™ๆƒ…ๅง\n133503 0 ้ข็งฏๆฎตxxๅนณ็ฑณโ€”โ€”xxxๅนณ็ฑณไธ็ญ‰\n133504 0 ็‰นๅˆซ่ฆๆ„Ÿ่ฐขไธ€ไธ‹่ถŠ็ง€ๆณ•้™ขไนฆ่ฎฐๅ‘˜็š„้ซ˜่ถ…ไธ“ไธš็ด ๅ…ป\n1000 Processed\n classify content\n134000 0 ๅ—ไบฌโ€ฆโ”€โ”€ๅˆ†ไบซ่‡ช้ƒฝๅธ‚ๅฟซๆŠฅiPhoneๅฎขๆˆท็ซฏ\n134001 0 ๆฑŸ่‹13ไธช็œ่พ–ๅธ‚18ไธชๅ…ฌ่ฏๅค„ๅทฒๅผ€่ฎพ6ๅคง็ฑป36็งๅ…ฌ่ฏไบ‹้กน็ฝ‘ไธŠๅŠž็†\n134002 0 ไธญๅ›ฝๆ”ฟๅบœไป€ไนˆๆ—ถๅ€™ๆ‰่ƒฝ่ฎฉๅฅน็š„ไบบๆฐ‘ๆดปๅพ—ๆœ‰ๅฐŠไธฅไธ€็‚น\n134003 0 ่€Œไธ”้ƒจๅˆ†ๆฑฝ่ฝฆๆธ…ๆ–ฐๅ‰‚ไบงๅ“ไธญๆทปๅŠ ็š„ๅŠฃ่ดจ้ฆ™็ฒพๆœฌ่บซไนŸๅญ˜ๅœจๅฑๅฎณ\n134004 0 ไธ€่ตทๆฅๅฌๅฌๅง๏ฝžOpening\n1000 Processed\n classify content\n134500 0 ๆ–ฐๅŽใ€ไบบๆฐ‘ๅ’Œๅ„ๅœฐๅ…šๆŠฅ้›†ๅ›ขๆ‰€ๅฑž็ฝ‘็ซ™ๆˆ–ๅˆ่ต„็ฝ‘็ซ™ๅ ๆฎๅคงๅคด\n134501 0 ไธ€ๆฌพP8+Mate7ไธป้ข˜โ€”โ€”ไผผๆฐดๆตๅนดไน‹ๅ›ๅญไบบ็”Ÿ\n134502 0 ๅคงๅ•†้›†ๅ›ข็š„้”€ๅ”ฎ็‚น้ๅŠๅ…จไธญๅ›ฝ11ไธช็œไปฝ\n134503 0 liyingxin่ฎพ่ฎกๆˆ‘ๅ’Œๆˆ‘ๆ‰“ๅฎ˜ๅธ็š„ไบ‹ๆƒ…ๆณ•้™ขๅทฒ็ปๅค„็†ไบ†ไป–่ฟ˜ๅทฎๆˆ‘ไธ€ไธช้ซ˜ๅฐ”ๅคซ็ƒๆ†ไธ€ไธช้ฆ™ๆฐดไธ€ไธชๆ‰‹ๆœบๆฒก็ป™ๆˆ‘\n134504 1 ๅฅฝๆถˆๆฏ๏ผ??????ไธŠๅคๅ…ฌๅ›ญๅคฉๅœฐๅˆซๅข…๏ผŒไธŠๆตทๅ”ฏไธ€ๅปบๅœจๅ…ฌๅ›ญไน‹ไธญ๏ผŒๅๆ‹ฅxxxxไบฉ้กพๆ‘ๅ…ฌๅ›ญ๏ผŒ่”ๆŽ’ๅ…จๆฐด...\n1000 Processed\n classify content\n135000 0 ่ขซ็งฐไธบ้˜ฟ้‡Œ\"ๆœ€ๅŠฑๅฟ—\"็š„ๅˆไผ™ไบบ\n135001 0 15ไธช้กน็›ฎๅ…จ้ƒจ่พพๅˆฐๅบๆ—ถๅปบ่ฎพ่ฟ›ๅบฆ\n135002 0 ๆฅๅฎพๅธ‚ๅ…ฌๅฎ‰ๅฑ€ไบค่ญฆๆ”ฏ้˜ŸไบŒๅคง้˜Ÿๆฐ‘่ญฆ้€š่ฟ‡็Žฐๅœบ่ฐƒๆŸฅ\n135003 0 ๅ›ฝไบงๆ‰‹ๆœบ็›ˆๅˆฉๆƒ…ๅ†ตๆœ€ๅฅฝ็š„ๅบ”่ฏฅๆ˜ฏvivoๅ’ŒOPPO\n135004 0 ๆฏ”ๅฆ‚ไป€ไนˆๆ‰‹ๆŒ‡ๅ…จๆ–ญๆฏ”ๅฆ‚ๅ…จ่บซไธŠไธ‹่ก€ๆท‹ๆท‹้ƒฝๆœ‰็ป™็‰นๅ†™\n1000 Processed\n classify content\n135500 0 ยทๅ› ไธบไธŠๆฌกไธ€่ตท้€›ไธŠๆตท่‡ช็„ถๅš็‰ฉ้ฆ†็š„ๆ—ถๅ€™ๆฒก่ƒฝไนฐไธ‹็Œน\n135501 0 ๅพˆๅคšๆ—ถๅ€™่งฃๅ†ณ้—ฎ้ข˜ๅฏ่ƒฝๆ˜ฏIBM\n135502 0 โ€nmbๅคฉๆƒนๅ™œๆœ‰ไบบๅœจ็œ‹ไฝ ็›ธๅ†Œ้‡Œ็š„ไธ‘็…ง\n135503 0 ไธ€ๅผ ไปทๅ€ผ329็š„่กฃๆœๅˆธๆœ€่ฟ‘่ฐๆƒณไนฐ่กฃๆœ\n135504 0 ไฝ™ๆญๅฅฝๅฃฐ้Ÿณ200ๅผบ42ๅทๅพ…ๅฎš้€‰ๆ‰‹\n1000 Processed\n classify content\n136000 1 
ๆˆ‘็š„ๆ–ฐๅ•ไฝ็ฎ€ไป‹๏ผŒ่ฏทๅ„ไฝไบฒๅ‹ๅคšๅคšๆ”ฏๆŒ๏ผๅฑฑๆตทๅคง้…’ๅบ—ไปฅๅ…ถๅพ—ๅคฉ็‹ฌๅŽš็š„ๅœฐ็†ไฝ็ฝฎ้›„่ธžๅคฉๅค–ๆ‘ไน‹ๅ—๏ผˆๆžœ็ง‘ๆ‰€ๅŒ—...\n136001 0 ่€Œไปฅ้˜ฟ้‡Œใ€ไบฌไธœไธบไปฃ่กจ็š„ไบ’่”็ฝ‘็”ตๅ•†ๅนณๅฐ\n136002 0 Google็š„ไธœ่ฅฟไนŸๆ›ด็ˆฑๅ‘ๅˆฐๅŽไธค่€…\n136003 0 ๆˆ‘่ง‰ๅพ—่Šฑๅƒ้ชจ่ฟ™้ƒจ็”ต่ง†ๅ‰ง่ทŸๆˆ‘ไปฌ่ฏดๆ˜Žไบ†ไธ€ไปถไบ‹้‚ฃๅฐฑๆ˜ฏ16ๅฒๅฐฑๅฏไปฅ่ฐˆๆ‹็ˆฑไบ†่€Œไธ”็ˆฑ็š„ๆ˜ฏๆœ‰ๆƒๆœ‰ๅŠฟๆœ‰ๅœฐไฝๆœ‰...\n136004 1 ไฝ ๅฅฝ๏ผŒๆœฌไบบๆ˜ฏๅš่ฃ…ไฟฎ่ฎพ่ฎก๏ผŒๆœ‰ๆ ทๆฟ้—ดๅฏๅ‚่ง‚๏ผŒไปทๆ ผๅบ•๏ผŒ่ดจ้‡ๆœ‰ไฟ่ฏๅ”ฎๅŽๆœ‰ไฟ่ฏ๏ผŒๅ…”่ดน่ฎพ่ฎก็ŽฐๅœบๆŠฅไปท๏ผŒๆœ‰ๆ„...\n1000 Processed\n classify content\n136500 0 ๅŸŽ็ฎกๆ–ฐๅฎ‰ๆฑŸไธญ้˜ŸๅœจๅŸŽๅŒบ่Œƒๅ›ดๅ†…ๆŽ’ๆŸฅๅ„็ฑปๅฏ่ƒฝๅญ˜ๅœจ็š„ๅฎ‰ๅ…จ้šๆ‚ฃ\n136501 0 ่ฆๅ‡บๅคงไบ‹ไบ†~ๅฆ‚ๆžœไธ่ตถๅฟซ่ฟ›่กŒๆฒป็–—็š„่ฏ็”ฑไบŽไฝ ็š„ๆ„คๆ€’ไฝ ๅ‘จๅ›ด็š„ไบบไผšๆ„Ÿๅˆฐๅพˆ็–ฒ็ดฏ\n136502 1 (x/x)ๆ‚จๅฅฝ๏ผŒๆ„Ÿ่ฐข่‡ด็”ตไธŠๆตท่“ๅคฉๆก†ไธš๏ผŒๆœฌๅ…ฌๅธไธ“ไธšๆไพ›ไธชๆ€ง้“ๅˆ้‡‘็›ธๆก†ๅฎšๅˆถไธšๅŠกใ€‚่ฏฆๆƒ…ๆˆ–็™ปๅ…ฅwww...\n136503 0 ไผŠ็Ёๅทžๆฐ”่ฑกๅฐxๆœˆxๆ—ฅxxๆ—ถๅ‘ๅธƒๅคง้ฃŽ่“่‰ฒ้ข„่ญฆไฟกๅท\n136504 0 ๅธธๅทžๅธ‚ไบ”ๆด‹็บบ็ป‡ๆœบๆขฐๆœ‰้™ๅ…ฌๅธ่‘ฃไบ‹้•ฟ็Ž‹ๆ•ๅ…ถๅœจ็ข็ฃจๅฆ‚ไฝ•็”จไธ€ๅฅ—็ณป็ปŸๆŠŠ็”Ÿไบง่ทŸ่ธชใ€ไป“ๅ‚จ็›˜็‚นใ€่ฎพๅค‡็Šถๅ†ตใ€่ดจ...\n1000 Processed\n classify content\n137000 0 ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๆฑŸ่‹ๅฐ็ปงๆ‰ฟ่€…ไปฌ็š„้…้Ÿณ็ฎ€็›ดไบ†ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n137001 0 ไธ€ไธชๅฅณไบบ็š„ๆ™บๆ…งๆ˜ฏ็Žฏๅขƒๆ‰“้€ ๅ‡บๆฅ็š„\n137002 0 ่€Œ้ป„ๆ–‘ๆถๅŒ–ไผšๅผ•่ตทๅคฑๆ˜Žไธ”ๆ— ๆณ•ๆฒปๆ„ˆ\n137003 0 ไฝ ๅฏ่ƒฝไผš่ฏด๏ผšไพ›ๆˆฟๅญ3000ๅ…ƒ/ๆœˆ\n137004 0 ๅŽŸๆœฌไปฅไธบโ€œๆฝœ่ง„ๅˆ™โ€่ฟ™ไธช่ฏ่ขซ่ฏดไบ†่ฟ™ไนˆๅคšๅนด\n1000 Processed\n classify content\n137500 0 ่ฟชๅฐ”ๅพทไธฝๅ’Œluxelabๆบๆ‰‹้ฆ™ๆธฏไธ“ๆŸœไธŠๅธ‚\n137501 0 ๅˆšๅˆšๅœจๅ…ฌไบค่ฝฆไธŠ่งๅˆฐไธคไธชๅฐๅท\n137502 0 ไธๅˆฐๅคฉ็Ž‹1ๅ…ฌ้‡Œๅค„ๅ‘็”Ÿไธค่ฝฆ่ฟฝๅฐพไบ‹ๆ•…\n137503 0 ่‹ๅทžๅฐ่ฑกไธ้”™๏ฝžไปŠๅคฉๅœจ้ซ˜้“็ฉฟไบ†้•ฟ่ฃค\n137504 0 ๆœ€ๅคง็š„่ดญ็‰ฉไธญๅฟƒ่ฟชๆ‹œ่ดญ็‰ฉไธญๅฟƒ\n1000 Processed\n classify content\n138000 0 ็ป™ๅŠ›็š„ๆ˜ฏๆœ€ๅŽไธ€ๅผ ๆŠตๅˆฐๅพๅทž็š„็กฌๅบง่ฎฉๆˆ‘็ป™ๆŠขๅˆฐไบ†\n138001 0 ไฝฟๆ–ฐ่ฝฆๆฏ”Zx็œ‹่ตทๆฅๆ›ดๅŠ ้œธๆฐ”\n138002 0 
ๆ•ˆๆžœๅŠ็–—ๆ•ˆ่ถ…่ฟ‡้ป„้ป‘ไธœ้ฉ้˜ฟ้‡Œ\n138003 0 ็œ‹็ฌฌ29ๅฑŠๅคๅญฃๅฅฅๆž—ๅŒนๅ…‹่ฟๅŠจไผšๅœจๅŒ—ไบฌๅผ€ๅน•\n138004 0 ้ƒฝ่กจๆ˜ŽNormanๆ˜ฏไธชๆœ‰็ฒพ็ฅž้—ฎ้ข˜็š„ๅญฉๅญ\n1000 Processed\n classify content\n138500 1 ไธบๅ›ž้ฆˆๅฎขๆˆทๅฏนๅ…ฌๅธ็š„ๆ”ฏๆŒไธŽๅŽš็ˆฑ๏ผŒๅ…ฌๅธ็‰นๆŽจๅ‡บไธ€ๆฌกๆ€งไธŠไป˜xx%ๆ”ฟ็ญ–๏ผŒๆˆชๆญขๅˆฐxๆœˆxxๆ—ฅ๏ผŒๅŒๆ—ถ่ฟ˜ๅฏๅ‚ๅŠ ...\n138501 0 ๅ’Œๅฐๅดๅ”ๅ”้˜ฟๅงจไธ€่ตทๅŽปๆต™ๆฑŸ่ฝฌ่ฝฌ\n138502 0 ่ฟ˜่ฆไธŠๆผ”ไธ€ๆฌกโ€œๆปจๆตทๅŸŽๅธ‚โ€ๅ—\n138503 0 ๆˆไธบๅ—ไบฌๆ—…ๆธธ็ง้ญ…ๅŠ›ๆฅ่‡ชๅฎƒๆ‰€ไธบไฝ ๆๅˆๆœŸๅปบ้€ ็š„ๅˆป็ป็Ÿณ็ชŸ\n138504 0 ่Šฑ่ดน560ๅ…ƒไนฐๅ›žๆฅไธ€ไธชไธๅฏไปฅไฝฟ็”จ็š„็‰ฉๅ“\n1000 Processed\n classify content\n139000 0 ไธ‰ๅ‰ๆˆŸxxxๅท้ฃžๆœบ็š„้’ข้“ไน‹่บฏ่ขซ็‚ธๅพ—็ฒ‰็ขŽ\n139001 0 ็œŸไปฅไธบ่‡ชๅทฑbbไธคๅฅ็™ฝๅญ็”ปๅฐฑไผšไธ็ˆฑ่Šฑๅƒ้ชจไธ€ๆ ท็š„\n139002 0 ไฝ ไปฌ้ƒฝ้‚ฃไนˆhighไผฐ่ฎก็”ต่„‘ๅŽ็š„ๅฅนๅœจๅทๅทๆš—็ˆฝ่€Œๅทฒๅง\n139003 0 ๆŠ—็™Œใ€ๆŠ—ๆฐงๅŒ–ใ€ๅฏนๅฅณๆ€งๅ†…ๅˆ†ๆณŒ่ฐƒๅ…ปๅคงๆœ‰็›Šๅค„\n139004 1 ็Œช้ธก็พŠ้ผ x่‚–ไธญ๏ผŒไธ‰ไธญไธ‰็Œช็พŠ้ผ ๏ผŒไบŒไธญไบŒ็Œช้ผ ๏ผŒไธ€่‚–ไธญ็Œช๏ผŒๅฟ…ไธญ๏ผŒไธŠๆฌก้€ไบบ็พŠ็‹—ๅ…จไธญ\n1000 Processed\n classify content\n139500 0 ๆ™บ่ƒฝๅ•†ไธšๅˆ†ๆžๅทฅๅ…ทPowerBIๅฐ†ๅœจ7ๆœˆ24ๆ—ฅ่„ฑ็ฆป้ข„่งˆ็‰ˆๆœฌ็š„็Šถๆ€\n139501 0 ไธ€ๅฐ็”ต่„‘ไธ€ๆฏ่Œถๅฐฑ่ƒฝ่ฟ™ไนˆๅไธ€ไธ‹ๅˆ\n139502 0 ็œ‹ๅˆฐ่ฟ™ไธ€ๅ †ๅ †็š„่…่ดฅ้ชจๅคด้šพไปฅๆƒณ่ฑก่ฟ™ๆ˜ฏ็”จๆฅๅš้’™็‰‡็š„\n139503 0 ้ขยท็ฑณๅ…‹ๆด›ไป€ๆŽงๅˆถไบ†ๆ–ฐ็š„โ€ฆโ€ฆ\\nSometimeswhenIๆปฉ\n139504 0 ๆฑŸๅ—ๅ…ฌๅฎ‰ๅˆ†ๅฑ€ๆฐ‘่ญฆ่ฟ‘ๆ—ฅๅฑ•ๅผ€่กŒๅŠจ\n1000 Processed\n classify content\n140000 1 xๆœˆxๆ—ฅ่‡ณxๆ—ฅๅ‘่กŒๅคšๆฌพๆญฃๆœˆๅไบ”ๅ…ƒๅฎตไฝณ่Š‚ไธ“ๅฑž็†่ดขไบงๅ“๏ผŒ่ดญไนฐ่ตท็‚นxไธ‡่ตท๏ผŒๆœŸxxๅคฉ๏ผŒxxๅคฉ๏ผŒxxx...\n140001 0 ๅ’Œๆž—ๅŽฟๆณ•้™ขไธ€ๅฎกไปฅ่ดชๆฑก็ฝชๅˆคๅค„ๅˆ˜ๆŸๆŸๆœ‰ๆœŸๅพ’ๅˆ‘1ๅนด\n140002 0 ไบš้ฉฌ้€Š่กจ็คบๅ…ถๅทฒ็ป็ญพ็บฆ่ฅฟ็ญ็‰™Iberdrolaๅ…ฌๅธๆฅๅปบ่ฎพๅ’Œ็ฎก็†ๅœจ็พŽๅ›ฝๅŒ—ๅก็ฝ—ๆฅ็บณๅทž็š„้ฃŽๅŠ›ๅ‘็”ตๅœบ\n140003 0 G25้•ฟๆทฑ้ซ˜้€Ÿ็”ฑ่ฟžไบ‘ๆธฏๅพ€ๆญๅทžๆ–นๅ‘ๅฎๆญๆฎตไปŽK2125+228่‡ณK2125+322ๅค„ๆ–ฝๅทฅ็ป“ๆŸ\n140004 0 ๆˆ‘ไฟฉ่ฟ˜ๆขฆๆƒณไธญๅฝฉๆŠ•่ต„่ฎฉไป–ไฟฉๅˆไฝœ\n1000 Processed\n classify content\n140500 0 
ๅ…ณไบŽelegance็š„่ฟ™ๆฌพๆ•ฃ็ฒ‰\n140501 0 โ€โ€œๅŒป็”Ÿๅ‘Š่ฏ‰ๅฅนๅ–ๆฐดไธ่ฆ็›ดๆŽฅๅ’ฝ\n140502 0 ็Žฐๅœจ็š„็‹ฌๅค„้™คไบ†็Žฉๆ‰‹ๆœบ่ฟ˜่ƒฝๅนฒไป€ไนˆๅ™ข่ฟ™ๅบŸ็‰ฉไบบ็ฑปๅฟซ่ฆๅ˜ๆˆๆœบๅ™จไบบไบ†\n140503 0 ๆ”ฏ็‚นๅ…ป็”Ÿ้…’ไธŽๅ†ฌ็—…ๅคๆฒปๅ†ฌ็—…ๅคๆฒป๏ผšๆŒ‰็…งไธญๅŒป็†่ฎบ\n140504 0 ไป€ไนˆๆ—…ๆธธไป€ไนˆ้ฌผๅ•ŠๅŠณ่ต„ๆ นๆœฌๆฒกๅ‡บ้—จๅฅฝๅ˜›\n1000 Processed\n classify content\n141000 1 ๅฎถ้•ฟ๏ผŒไฝ ๅฅฝ๏ผ็Žฐๅ…ญๅนด็บงๆ•ฐๅญฆๅ†ฒๅˆบ็ญๅฐ†ไบŽxๆœˆx-xๆ—ฅๅผ€ๅง‹ๆŠฅๅๆณจๅ†Œ๏ผŒxๆœˆxๆ—ฅๆญฃๅผไธŠ่ฏพใ€‚ๅฐ็ญๆ•™ๅญฆ๏ผŒๅ้ข...\n141001 0 ๅ—ไบฌๆฏๅฉด/ๅ„ฟ็ซฅ็”จๅ“ไฟกๆฏ\"ๆฏๅฉด/ๅ„ฟ็ซฅ็”จๅ“\n141002 0 ้—จๅฃ็š„่›‹็ณ•ๅบ—้‡ๆ–ฐ่ฃ…ไฟฎๅŽๅผ€ไธšไบ†\n141003 0 ็™พๅบฆๆ”พ็€่ฟ™ไนˆ็—…ๆ€็š„่ดดๅงไธ็ฎก\n141004 0 ๆ–ฐ็”ท็ฅžๆŽ็ง€่ตซ/ๆŽๆด™่ตซ12ๅนดไธŠ็ปผ่‰บ\n1000 Processed\n classify content\n141500 0 ๆฑŸๆน–็š„ๅ‘ณ้“่ฟ˜ๆฎ‹็•™็€ใ€ๅŒป้™ข้‡Œไธ€่ตท็Žฉๅ•ๆœบ็š„ๅฐไผ™ไผดไปฌใ€\n141501 0 ็Žฉๅซๆ˜Ÿ็š„ๆœ‹ๅ‹ๅŠ ่ตทๆฅcomecomelet'sgo\n141502 0 ๆต™ๆฑŸๆญๅทžๅธ‚ๆŸๅคงๅญฆๅบ”ๅฑŠๆฏ•ไธš็”Ÿๅฐๆ–ฝๆฅๅˆฐๆญๅทžๅธ‚ๅ…ฌๅฎ‰ๅฑ€็ปๆตŽๆŠ€ๆœฏๅผ€ๅ‘ๅŒบๅˆ†ๅฑ€้‡‘ๆฒ™ๆน–ๆดพๅ‡บๆ‰€ๆŠฅๆกˆ\n141503 0 ๅทฆๅณๅ€พๆ–œๆ‰‹ๆœบๅฏไปฅๆŽงๅˆถไธป่ง’ๅทฆๅณ็งปๅŠจ\n141504 0 ๅ…ฌๅฎ‰ๆˆ˜็บฟๅฐคๅ…ถๆ˜ฏ้ฆ–้ƒฝ็š„ๅ…ฌๅฎ‰ๅนฒ่ญฆ\n1000 Processed\n classify content\n142000 0 ่ดฟ่ต‚ๅ•†็ป™ไฝ ็š„็”Ÿๆ„้ƒฝๆ˜ฏไบๆœฌ็”Ÿๆ„\n142001 0 ๅฆ‚ๆžœไฝ ๆถ‚ๆŠนๅฎŒ้˜ฒๆ™’่ง‰ๅพ—ๅพˆๆฒน่…ป็š„่ฏ\n142002 0 13ๅฒไธ€ไธชไบบๅŽป่ถ…ๅธ‚ไนฐไธ‹ไธชๆ˜ŸๆœŸ็š„็‰›ๅฅถๅ’Œ้ขๅŒ…\n142003 0 ๅ…ˆๅŽ้€š่ฟ‡ISO9001ๅ›ฝ้™…่ดจ้‡็ฎก็†ไฝ“็ณป่ฎค่ฏๅ’ŒISO14001ๅ›ฝ้™…็Žฏๅขƒ็ฎก็†ไฝ“็ณป่ฎค่ฏ\n142004 0 **ๆ–ฐๆน–ๅˆ›ไธš้“ถๆถฆๆŠ•่ต„ไธญๅฑฑๅ…ฌ็”จๆดฅๆปจๅ‘ๅฑ•ๅŽไธš่ต„ๆœฌ้€š็ญ–ๅŒป็–—ๆก‘ๅพท็Žฏๅขƒๅ—ไบฌไธญๅŒ—ๅŒ—ๅทดไผ ๅช’ๅผบ็”ŸๆŽง่‚ก็”ณ้€šๅœฐ้“\n1000 Processed\n classify content\n142500 0 ๅคๅŸƒๅŠๅŒบ็š„ๅปบ็ญ‘็‰นๅˆซๆœ‰ๅผ‚ๅŸŸ้ฃŽๆƒ…\n142501 0 ๆฏๅคฉๅๅ…ฌไบคๅๅœฐ้“็š„ๆ—ถ้—ดๅˆๆœ‰ไนฆ็œ‹ไบ†ไบ†\n142502 0 Win10Mobile้ข„่งˆ็‰ˆ10166่Žทๅพ—ๆ›ดๆ–ฐๆŽจ้€\n142503 0 ๅฏนๆ”ถ่ดญไบบ็š„้™ๅˆถ๏ผš1่ดŸๆœ‰ๆ•ฐ้ข่พƒๅคงๅ€บๅŠก\n142504 0 ่บซ่พนไธ€ไธชๅŒไบ‹็œ‹ๅˆฐxxxxxๅ‘็Ÿญไฟก่ฏด็งฏๅˆ†ๅคš่ƒฝๅ…‘ๆขๅ‡ ็™พ็Žฐ้‡‘\n1000 Processed\n classify content\n143000 0 
ๅไธชๅœฐ้“่ขซๆญปๅ˜ๆ€็›ฏไธŠไบ†ไธ€็›ดๅฐพ้šๆˆ‘\n143001 0 ๅˆไธ€ไธชๅ†คๆกˆ/ไบ‘ๅ—ๅนผๅ„ฟๅ›ญๆŠ•ๆฏ’ๆกˆ13ๅนดๅŽๅ†ๅฎก่ญฆๆ–น่ฏๆฎ็–‘้€ ๅ‡\n143002 0 ledๅ‰ๅคง็ฏไธŽๅ‰่ฟ›ๆฐ”ๆ ผๆ …่žไธบไธ€ไฝ“\n143003 0 ่ฟžไบ‘ๆธฏๅธ‚ๆฐ”่ฑกๅฐxxๆ—ฅxxๆ—ถๅ‘ๅธƒ\n143004 0 ๆ˜ฏๆฐดไธŠไนๅ›ญไธŽไธ€ๅนดxxxๅคฉ้ƒฝๅฏไปฅไฝ“้ชŒๅˆฐ้›ชๅ›ฝ้ฃŽๆƒ…็š„โ€œๅ†ฐ้›ชไนๅ›ญโ€ไปฅๅŠ่ดญ็‰ฉไธญๅฟƒ็ญ‰็ป“ๅˆไธ€ไฝ“็š„ๅคงๅž‹็ปผๅˆๅจฑไน่ฎพๆ–ฝ\n1000 Processed\n classify content\n143500 0 ่ถ…ไธ‰ๆˆ็”ฑ้žๆœบๅŠจ่ฝฆ่ฟๆณ•ๆ‰€่‡ด\n143501 0 ้šพ้“่ฟ™ๆ˜ฏ่ฆๅœจๅœฐ้“ๅฃๅฟƒ็ตๆ„Ÿๅบ”ๅ—\n143502 0 ไปŠๅคฉ่€ƒๅฎŒๅ•ฆochemๆด—ๅฎŒๅ•ฆ็‰™่ฟ˜ๆœ‰biochemไธ€้—จ\n143503 0 ๆ—ฉไบ›ๅนดๆต™ๆฑŸๅซ่ง†็š„ๆˆ‘็ˆฑ่ฎฐๆญŒ่ฏๅฐฑๆ˜ฏๅฆ‚ๆญค\n143504 0 ๅ…ฑๅŒไธพๅŠž็š„xxxxๅนดโ€œ็”ฒ้ชจๆ–‡ๆฏโ€ๅ…จๅ›ฝJava็จ‹ๅบ่ฎพ่ฎกๅคง่ต›\n1000 Processed\n classify content\n144000 0 SherburnAeroClubๅผ€้ฃžๆœบๅ–ฝ\n144001 0 ๅ˜ๅฝข็Žฉๅ…ท่ถ…ๅ˜้‡‘ๅˆš4ๅˆ้‡‘้ป„่œ‚็ซ็‚ญ้’ข็ดข้”้œธ็Ž‹้พ™ๆ้พ™ๆœบๅ™จไบบๅ„ฟ็ซฅ็คผ็‰ฉ\n144002 0 ไปŽ็ซ่ฝฆ็ซ™็œ‹ๅ—ไบฌๆตท็š„ๆ™ฏ่‰ฒ็กฎๅฎžๅพˆ็พŽ\n144003 0 ไธ€่ง‰็ก้†’ไธŠไบ†ไธชๅŽ•ๆ‰€ๅˆทไธ‹ๆ‰‹ๆœบๅ‘็Žฐๆปกๅฑ้ƒฝๆ˜ฏ่ฏ•่กฃ้—ดๅ•Šโ€ฆ\n144004 0 ๅช่ƒฝ้ ้ข„ๅ‘Š็‰‡ๅ’Œๅ„็งmvๆฅๅบฆๆ—ฅๅ•Š\n1000 Processed\n classify content\n144500 0 ๅšๆ‹›่˜ๆœ€็ฅžๅฅ‡็š„ๆ˜ฏไฝ ไธ็Ÿฅ้“ไผš้‡ๅˆฐไป€ไนˆๆ ท็š„ๅ€™้€‰ไบบ\n144501 0 19ๅฒ็š„nino็œŸๆ˜ฏๅซฉ็š„ๆˆ‘ๆƒณ็Šฏ็ฝช\n144502 0 ไป…ๆœ‰ๅ›ฝ้˜ฒๅ†›ๅทฅใ€้ค้ฅฎๆ—…ๆธธๅ’Œๅปบๆ็ญ‰ๆฟๅ—ๅ‡บโ€ฆ\n144503 0 ่ฏšไฟก็ป่ฅไบงๅ“ไผ˜่ดจๆ‰ๆ˜ฏ้•ฟไน…็”Ÿๅญ˜ไน‹้“\n144504 0 ๆŠ•่ต„็†่ดขๆˆไบ†ๅพฎไฟกๆœ‹ๅ‹ๅœˆๆดฅๆดฅไน้“็š„่ฏ้ข˜\n1000 Processed\n classify content\n145000 0 whateveryouwriteโ€”whitepapers\n145001 0 ๆ€Žไนˆไธไธบไบ†ๆŸฅๆ˜Ž็œŸ็›ธ่€ŒๅŠ ็ญๅ‘ข\n145002 0 ่ฏ‘ไบ‘ๅ…ฌๅธƒไบ†ๅ“็‰Œไธ‹็š„ๅ•†ไธšใ€ๅทฅๅ…ทใ€ไบ’ๅŠจๅ’Œ่ต„่ฎฏ็ญ‰xๅคงๅนณๅฐ\n145003 0 ไปฅๅŽ่ฅฟๅฎ‰ๅ…ฌๅฎ‰่งไบ†ๆด›้˜ณไบค่ญฆ็›ดๆŽฅๅผ„ๆญป\n145004 0 ๅ…ซใ€ๆ™š้คไธŽ่„‚่‚ช่‚็š„ๅ…ณ็ณปๆˆ‘ไปฌๆ™š้ค่‹ฅๅƒๅคชๅฅฝ\n1000 Processed\n classify content\n145500 0 ๅฐฑๆ˜ฏ้˜ฟ้‡Œ้›†ๅ›ขๅ‡บ็š„ไธ€ๆฌพๆ†็ป‘ๆ”ฏไป˜ๅฎ็š„่ฝฏไปถ\n145501 0 ่€Œไธ”ๅฏไปฅ็”จๅœจๆ—ฉไธŠๅŒ–ๅฆ†ๆฐดๅŽ็›ดๆŽฅไฝฟ็”จ\n145502 0 
็‰นไปท248็พŽๅ›ฝๅŒปๅธˆๆŽจ่NO1ๆฐดๅฎๅฎcoppertone่ถ…ๅ€ผๅฅ—่ฃ…ๅ†…ๅซ1\n145503 0 ๆ›พ็ถ“ๅœจNBAๆ‰“็ƒ็š„ๅจๅง†ๆ–ฏๆŽฅๅ—ไบ†ๆŽก่จช\n145504 0 ๅคงๅคšๆ•ฐ็™Œ็—‡็—…ไบบ้ƒฝๆœ‰ไธ€็งไธ€่‡ด็š„ๆ€งๆ ผ็ฑปๅž‹\n1000 Processed\n classify content\n146000 0 SAP้‡Œ็š„่‹ฑๆ–‡ๆ˜ฏไธๆ˜ฏๅฐๅบฆ้˜ฟไธ‰ๅ†™็š„\n146001 1 ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตๅฎๆณขๅ…ƒ้€š่‹ฑ่ฒๅฐผ่ฟช๏ผŒๆˆ‘ๆ˜ฏ้”€ๅ”ฎ้กพ้—ฎ่ฃ˜้พ™ๅจ๏ผŒๆ‚จๅฏไปฅๅซๆˆ‘ๅฐ่ฃ˜ใ€‚ๆˆ‘็š„่”็ณปๆ–นๅผๆ˜ฏ๏ผšxxxxxx...\n146002 0 ไฝ•ๆ—ถๆ‰่ƒฝไน˜้™ค่…่ดฅ็ป™ๅญฉๅญ่ฎจๅ›žๅ…ฌ้“ใ€ใ€ใ€ใ€ใ€ใ€\n146003 0 ็›ฎๅ‰5ๅๆถ‰ๆกˆๆˆๅ‘˜ๅทฒ่ขซๅฎๅบ”่ญฆๆ–นๅˆ‘ไบ‹ๆ‹˜็•™\n146004 0 ๆ™ฏๆฐ”ๅ‘จๆœŸๅ†…ไธš็ปฉๆ— ๅฟง็š„่‚ก็ฅจๅฏไปฅๅคง่ƒ†ไนฐๅ…ฅๅนถไธญ็บฟๆŒๆœ‰\n1000 Processed\n classify content\n146500 0 ๆธฏ่‚กๆ™ฎ่ทŒ๏ผš้˜ฟ้‡Œ็ณป่ทŒๅน…่ถ…10%\n146501 0 ๅ—ไบฌใ€ๅพๅทžๅŠ่šŒๅŸ ๅพ€ๅ…ญๅฎ‰ๆ–นๅ‘็š„่ฝฆ่พ†็ป้™‡่ฅฟๆžข็บฝ็”ฑๅˆ่‚ฅ็ป•ๅŸŽ้ซ˜้€Ÿๅ—็Žฏๆฎต็ป•่กŒ\n146502 0 ๅ–œๆฌขๅฅนๅฐฑๅผบๅฅธๅฅนๅ•Šใ€ๅ‘Š็™ฝๆœ‰ไป€ไนˆ็”จใ€ๆ“ไธๅˆฐๅฐฑไธ‹่ฏๅ•Šใ€็ฟป่„ธไบ†ๅฐฑๅ‘่ฃธ็…งใ€ๅคงไธไบ†่นฒ็›‘็‹ฑใ€ไฝ ่ฟž็›‘็‹ฑ้ƒฝไธๆ•ข...\n146503 0 ๅฅ–ๅ“๏ผš1ใ€่ทฏ้€”ไนๆฑฝ่ฝฆๅ„ฟ็ซฅๅฎ‰ๅ…จๅบงๆค…\n146504 0 ๆƒ ๆ™ฎๅทฒ็ปๅผ€ๅง‹ๆๅ‰้ข„ๅ”ฎ้ƒจๅˆ†่ฃ…่ฝฝWindowsxxๆ“ไฝœ็ณป็ปŸ็š„PCๆ–ฐๅ“\n1000 Processed\n classify content\n147000 0 ็”ฑStephenNickel่ฎพ่ฎก็ผ”้€ \n147001 0 ๆ™บๅ…ธ่ดขๅฏŒๅœจๅฑฑ่ฅฟ็ต็ŸณๆˆๅŠŸไธพๅŠžไบ†็†่ดข่ฎฒๅบงๆดปๅŠจ\n147002 0 NBAๆ–ฐ่ต›ๅญฃ็š„่ต›็จ‹่‰ๆกˆๆญฃๅœจๅ„ไธช็ƒ้˜Ÿไธญไผ ๆ’ญ\n147003 0 ๅ˜‰ๅ–„ๆณ•้™ขๅ—็†็š„ๆต™ๆฑŸๆ™ฎ็ฟ”ไธ้”ˆ้’ขๆœ‰้™ๅ…ฌๅธ็ ดไบงๆธ…็ฎ—ไธ€ๆกˆๅฌๅผ€็ฌฌไธ€ๆฌกๅ€บๆƒไบบไผš่ฎฎ\n147004 0 ๆœฌๆœŸๅฐๆˆฟๅฐฑๆฅ็›˜็‚นๅœฐ้“5็บฟๆฒฟ็บฟๆœ€ๅ‡ๅ€ผๆฅผ็›˜\n1000 Processed\n classify content\n147500 1 ๆ›พๆ›พ็พŽๅฎน้™ข่ฅฟ่‹‘ๅบ—ๆธฉ้ฆจๆ็คบ๏ผšๅผ€้—จ็บข๏ผไธ‰ๅ…ซ่Š‚้€็Žฐ้‡‘๏ผšๆปกxxxxๅ…ƒ้€xxxxๅ…ƒ๏ผŒ็บน็ปฃxๆŠ˜ไผ˜๏ผŒ่ฟ›ๅบ—ๆœ‰...\n147501 0 ้ข„่ฎกไธญๅ›ฝ็š„ๆฑฝ่ฝฆไบง้‡ๅˆฐ2020ๅนดๅฐ†่พพๅˆฐ3000ไธ‡่พ†\n147502 1 ๅฎถ้•ฟๆ‚จๅฅฝ๏ผๅฐๆ ‘่‹ฑ่ฏญๅญฆๆ กๆ–ฐๅญฆๆœŸๆŠฅๅๅทฒๅผ€ๅง‹ใ€‚xๆœˆxๆ—ฅๆญฃๅผๅผ€่ฏพใ€‚ๅผ€่ฎพ็ง‘็›ฎ:่‹ฑ่ฏญ๏ผŒ่ฏญๆ–‡๏ผŒๆ•ฐๅญฆ๏ผŒ็‰ฉ็†๏ผŒ...\n147503 0 ็ŽฐๅœจๅŽปๅคง้™†ๆŸไบ›ๅŽฟๅธ‚ๅŒบๆณ•้™ขไธถๅŸบๅฑ‚ๆณ•้™ขไปฅๅŠๆดพๅ‡บๆ‰€ๅŠžไบ‹\n147504 0 
600ๅคšๅๅค–ๆฅๅŠกๅทฅไบบๅ‘˜ๅฎ‰็ฝฎๅœจ่ฟ™้‡Œ\n1000 Processed\n classify content\n148000 0 ๆปจๆตทๆ–ฐๅŒบๅทฒ้›†ไธญๆธ…็†ๆต’่‹”xxxxๅจ\n148001 0 ๅฎƒๅฐฑๅƒๆ˜ฏไธ–็•Œๅปบ็ญ‘ๆต่กŒ่ถ‹ๅŠฟ็š„้ฃŽๅ‘ๆ ‡\n148002 0 ่€ๅธˆ้—ฎไบ†ไธ€ๅฅ๏ผšโ€œnobody\n148003 0 ๅ—ไบฌ็Ž„ๆญฆๅŒบไฝๅปบๅฑ€ๅ‰ฏ่ฐƒ็ ”ๅ‘˜้™ˆ็ˆฑๅนณ่ฆไฝ้…’ๅบ—\n148004 0 ๆœ€่ฟ‘ไธๅ‘ๅพฎๅšๆ˜ฏๅ› ไธบๆˆ‘ๆ‰‹ๆœบๅไบ†\n1000 Processed\n classify content\n148500 0 ไบบ็”Ÿ็š„็œŸ็›ธๆ€ปๆ˜ฏๅœจๆ–‡ๅญ—็š„่ƒŒๅŽ\n148501 1 ไฝ ๅฅฝ๏ผŒไบบๅๆ•™่‚ฒ่ฟ‘ๆœŸๅผ€่ฎพไผš่ฎก(ๅŠณๅŠจ)ๆŠ€่ƒฝ็†่ฎบๅŸน่ฎญใ€ๅฎžๅŠกๆ“ไฝœๅŸน่ฎญ๏ผŒๅญฆ่ดนไผ˜ๆƒ ๏ผๅฆๆœ‰่‹ฑ่ฏญ็ญ‰ๅค–่ฏญ๏ผŒๅฆ‚ๆœ‰...\n148502 0 ๆฒณๅŒ—่žๆŠ•ๆจกๅผๅฏนไธญๅฐไผไธšๅˆๆœ‰ๅ“ชไบ›ๅฅฝๅค„ๅ‘ข\n148503 0 ๆญคๅ‰B่ฝฎ็š„ๆŠ•่ต„ๆ–น่”ๅˆ›็ญ–ๆบใ€้กบไธบ่ต„ๆœฌๅ…จ้ƒจ่ทŸๆŠ•\n148504 0 xxxxxxxx๏ผšไฝ•็‚…๏ผšโ€œๆ‰€ไปฅๅฐฑๆ˜ฏๆ‰€ๆœ‰็š„ๅฅณ็”Ÿ้ƒฝ็ˆฑ็€็™ฝๅญ็”ป\n1000 Processed\n classify content\n149000 0 ่ขซ่ฟ่พ“๏ผšcecolisabienvoyagรฉ่ฟ™ไปถๅŒ…่ฃน็ปๅพ—่ตท่ฟ่พ“3\n149001 0 ็„ถๅŽ้œฒๅ‡บๅธ‚ไบ•ๅฐ่ดฉ็š„้‚ฃ็งๆปก่ถณ\n149002 0 ้ซ˜่ทŸ็Ÿญ้ด+้ป‘่‰ฒ้“พๆกๅŒ…็œผๅฐฑ็ˆฑไธŠ็š„่ƒŒๅฟƒ่ฃ™ๅญ\n149003 1 ไฝ ๅฅฝ๏ผŒๆˆ‘่ฟ™่พนๅŠž็†ไฟก็”จ่ดทๆฌพ็š„๏ผŒๆœ‰้œ€่ฆๅฏไปฅ่”็ณปๆˆ‘ใ€‚ๆœฑ็ป็†ใ€‚็”ต่ฏ๏ผšxxxxxxxxxxx ๅœฐๅ€๏ผš...\n149004 0 ๅ…จๅ›ฝๅฐฑๅ‘็”Ÿไบ†4่ตท็”ตๆขฏๅฎ‰ๅ…จไบ‹ๆ•…\n1000 Processed\n classify content\n149500 0 ๆ—ฉไธŠ่ตทๆฅๅ‘็Žฐๆ‰‹ๆœบ้‡Œ้ขๆœ‰่ฟ™ๆ ทไธ€ไบ›็…ง็‰‡\n149501 0 ๅฅณๅ„ฟๅœจ่‹ๅทžไนๅ›ญ็Žฉไบ†ไธชๅพˆๅˆบๆฟ€็š„ๆธธๆˆ\n149502 0 ่ฏฆๆƒ…่ฏท่ฏข0450930207Yolanda\n149503 0 ๅฐๆนพๅฐบๅ…ซๅๅฎถ้ฆ–ๆฌกๅ—ไบฌโ€œๅฐบๅ…ซ็ฆ…ๅฟƒโ€ไน‹ๆ—…\n149504 0 ๅทจ้ข้‡†่ดญไบ†IBM็š„ไฟกๆฏใ€ไบบๅŠ›็ฎก็†ๅ’จ่ฏขๆœๅŠกๆ–น\n1000 Processed\n classify content\n150000 0 ไธไผš่ขซๅˆ‘ไบ‹ไผคๅฎณ/่ญฆๆ–น๏ผšไธœ่Žžไฟๅฎ‰ๅฅณๅ„ฟ้ญ่ฝฎๅฅธๅŽๅนถๆœชๅ‡บ่ตฐไฟๅฎ‰้š็ž’ไบ‹ๅฎž\n150001 0 ็ดซ่–ฏไธญๅฏŒๅซ็š„็ปด็”Ÿ็ด Aๅฏไปฅๆ”นๅ–„่ง†ๅŠ›ๅ’Œ็šฎ่‚ค็š„็ฒ˜่†œไธŠ็šฎ็ป†่ƒž\n150002 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ xskxxeไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n150003 0 ๅณไฝฟๆถๆ€ง่‚ฟ็˜คๅœจ่‚ฟๅ—้•ฟๅคงๅˆฐๅŽ‹่ฟซ่บซไฝ“ๅ™จๅฎ˜ใ€่ก€็ฎก็ญ‰ๅ‰ๆ— ็–ผ็—›ๆ„Ÿ\n150004 0 ๆŽจไธ€ไธชๅซwikipaintings็š„ๅบ”็”จ\n1000 Processed\n classify 
content\n150500 0 ้’ฑ็ฉ†่ฏž็”ŸไบŽๆ— ้”ก้ธฟๅฑฑไธƒๆˆฟๆกฅไธ€ไธชไนฆ้ฆ™้—จ็ฌฌ\n150501 0 tiger้ž‹ๅญ็”ฑไบŽไพ›่ดง่€ๆฟๅŽปๆณฐๅ›ฝๆ—…ๆธธไบ†\n150502 0 ็ป“ๆžœ็™พๅบฆๅœฐๅ›พๆ˜พ็คบ่ฟ˜ๆœ‰ไธ€ๅฐๆ—ถ\n150503 1 ๆฌงๆดพ้ซ˜็ซฏๅ…จๅฑ‹ๅฎšๅˆถxxxๆœŸ้—ด๏ผŒๆฌงๆดพๆ•ดไฝ“ๆฉฑๆŸœ๏ผŒ่กฃๆŸœ๏ผŒๆœจ้—จๅ…จๅœบx.x ๆŠ˜๏ผŒๆ›ดๆœ‰ๅ…ฌๅธ็‰นไพ›ๆฉฑๆŸœๅŠ ็”ตๅ™จๅช...\n150504 0 ไธๅธŒๆœ›็š„ๆ˜ฏๆˆ‘ๅธŒๆœ›้ฃžๆœบๆญฃๅธธ้ฃž\n1000 Processed\n classify content\n151000 0 ๆญฆๆฑ‰ๅธ‚ๅ…ฌๅฎ‰ๅฑ€ไธœ่ฅฟๆน–ๅŒบๅˆ†ๅฑ€ไบค้€šๅคง้˜Ÿ่ฝฆ็ฎกไธญ้˜Ÿๆฐ‘่ญฆๅญ™ๆ˜Žๆฅๅˆฐๆš‘ๆœŸๅฐ‘ๅ„ฟๅคไปค่ฅ่พนไธŽๅฐๆœ‹ๅ‹ไปฌๅšๆธธๆˆ\n151001 0 ้ฆ–ๅฐ”ๅธ‚้•ฟ่ฎฟๅŽโ€œๆฝๅฎขโ€้Ÿฉๆตๆ˜Žๆ˜ŸๅŠฉๅŠ›ๆ—…ๆธธๅฎฃไผ \n151002 0 ๅนถ้‚€่ฏทๅˆฐ็Žฐไปปๅพฎ่ฝฏๅ…จ็ƒๆ‰ง่กŒๅ‰ฏๆ€ป่ฃ็š„ๆฒˆๅ‘้˜ณๅšๅฃซ่‡ดๅผ€ๅน•่ฏ\n151003 0 ็œ‹ๅฎŒไนฑๆญฅๅฅ‡่ฐญ็ฌฌไบ”้›†ๆ‰็œŸๆญฃๅผ€ๅง‹ๅ–œๆฌข่ฟ™็•ช็š„็š„็กฎๅฎƒๆ นๆœฌไธๆ˜ฏๆŽจ็†็•ชๅฎƒๆŠŠไบบ็š„ๅฟƒ็†็œŸ็š„ๅˆป็”ป็š„้žๅธธ็ป†่‡ดๆˆ‘่ง‰...\n151004 0 ๆŠฅๅๆ—ถ้—ด๏ผšxxxxๅนดxๆœˆxxๆ—ฅใ€xxๆ—ฅใ€xxๆ—ฅๅ…ฑไธ‰ๅคฉ\n1000 Processed\n classify content\n151500 0 ๅŒป็”Ÿ็งฐไปๆœช่„ฑ็ฆป็”Ÿๅ‘ฝๅฑ้™ฉโ€ฆโ€ฆ่ฏฅไฝๆˆทไน‹ๅ‰ๆ›พไธคๆฌกๅ ่ฝ่Šฑ็›†\n151501 0 ๅฝ“ๅˆ้€‰็š„ๅŽไธบไฝœไธบๆˆ‘็š„็ฌฌไธ€ๆฌพๆ™บ่ƒฝๆ‰‹ๆœบ็œŸๆ˜ฏๆฒก้€‰้”™\n151502 1 ่”็ณปQQxxxxxxxxxxใ€‚ๅฐ็ซ ๏ผŒ็‰Œ็…ง๏ผŒๅปบ็ญ‘ๅ…ซๅคงๅ‘˜๏ผŒไธ€ไบŒ็บงๅปบ้€ ๅธˆ๏ผŒ่ฅไธšๆ‰ง็…ง๏ผŒๅปบ็ญ‘๏ผŒๅ•†ไธš๏ผŒๅนฟ...\n151503 0 ๅ…ซๆœˆไปฝ่ฅฟๅฎ‰ๅ‡บ็งŸ่ฝฆ่ตทๆญฅไปทไธŠๅ‡ๅˆฐๅๅ…ƒ\n151504 0 ๆ‰‹ๆœบๅˆๅไบ†ๅˆๆฒกwifiๅˆไธ่ƒฝ็”จๆต้‡ๅฐฑ่ฟžqq้ƒฝ็™ปไธไบ†\n1000 Processed\n classify content\n152000 0 ไธญๅ›ฝ็š„ๅŸŽ็ฎกๆœ‰ๆ—ถไปฟไฝ›ๅฐฑๆ˜ฏๆšดๅŠจ้˜Ÿ\n152001 0 ใ€Œๆฌพๅผใ€ๆฌง็พŽ่ฅฟๆตทๅฒธใ€Œ้ขœ่‰ฒใ€่“่‰ฒ้ป„่‰ฒใ€Œ็ ๆ•ฐใ€50524854ใ€Œไปทๆ ผใ€?38ๅŒ…้‚ฎ\n152002 0 ๆ— ๆ„ไธญ็œ‹ๅˆฐไบ†MazdaMaxxAuto่ง‰ๅพ—ไนŸๅพˆไธ้”™\n152003 0 4ไธช11ๅฒ็š„ๅญฉๅญๅˆฐ้‚ฏ้ƒธๅธ‚ๆ–ฐๅธ‚ไธ€ไธญๅˆ˜ๆ‘ๆธธๆณณ้ฆ†็Žฉ่€\n152004 0 ไธ–็•ŒไธŠๆœ‰่ฟ™ๆ ทไธ€็พคๅฅณไบบไธๅ‚ๅคงๆฌพ\n1000 Processed\n classify content\n152500 1 ไฝ ๅฅฝ๏ผŒๅฆ‚้œ€ๅŠž&ๆญฃ่ฏทๆ‹”xxxxxxxxxxxๆŽๆ–ฐ\n152501 1 xxxxๅ…ƒๅณ่ต xxxๅ…ƒไบงๅ“ไธ€ไปถ๏ผŒๆปกxxxxๅ…ƒๅณxxxๅ…ƒไบงๅ“ไธ€ไปถ๏ผŒ่ฏฆ่ฏขxxx-xxxxxxxx...\n152502 1 ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅˆฐๅ•ฆ?? 
่Š™็ฝ—ๅ…ฐๅ…ฌๅธไธบๅ›ž้ฆˆๆ–ฐ่€ไผšๅ‘˜๏ผŒๅšๅ‡บไปฅไธ‹ๆดปๅŠจใ€‚ๅ‡กๆ˜ฏไผšๅ‘˜่ดญไนฐ่Š™็ฝ—ๅ…ฐๅฅ—็›’ไธ€ๅพ‹xๆŠ˜...\n152503 0 ๆ‰€ไปฅ่ฝฌ่ฎฉๅง~~ๆœ‰ๅ…ด่ถฃ็š„่ตถ็ดง่”็ณปๆˆ‘\n152504 0 ChloeGoldie2015็ง‹ๅ†ฌๆœ€ๆ–ฐๆฌพๅคง็บข่‰ฒ็Žฐ่ดง\n1000 Processed\n classify content\n153000 0 โ˜…Dragon'Sโ”‹150807๏น้›†ไธญโ€–้ฆ–ๅฐ”&gt\n153001 0 02่ฟ™ๅช้™ชไผดไบ†ๆˆ‘6ๅนด็š„ๅฐๅฎถไผ™็”ฑไบŽๆˆ‘็š„็–ๅฟฝ\n153002 0 ้ฃžๆœบๅœจๅ››ๆ–นๅŸŽไธŠ็ฉบๅˆ’ไบ†ไธช็พŽไธฝ็š„ๅผง็บฟ\n153003 0 ๆ•™ไฝ ๅœจps้‡ŒๆŠŠๆ‰‹็ป˜้ป‘็™ฝ็บฟๆ็จฟ่ฐƒๆˆ่ƒฝ็›ดๆŽฅๅœจ็”ต่„‘ไธŠ่‰ฒ็š„ๅ›พ็จฟ\n153004 0 ๆ—ฅๅ‰ๅ—ไบฌ้ซ˜ๆทณๅŒบ็ –ๅข™้•‡ๅ†œๆฐ‘่ต„้‡‘ไบ’ๅŠฉ็คพๅœๆญข่ฅไธšๅนถ่ขซ็ซ‹ๆกˆ่ฐƒๆŸฅ\n1000 Processed\n classify content\n153500 0 ๅŒ—ไบฌๅธ‚ไบบๆฐ‘ๆฃ€ๅฏŸ้™ข็ฌฌไบŒๅˆ†้™ข่ตท่ฏ‰ไนฆๆŒ‡ๆŽง\n153501 0 ๆŠ—็™Œ้‡‘ไธ‰่ง’ๅŠŸ่ƒฝไธปๆฒปๅŠ้€‚็”จ่Œƒๅ›ด๏ผšๆŠ—็™Œ้‡‘ไธ‰่ง’ๅŠŸ่ƒฝไธปๆฒปๅŠ้€‚็”จ่Œƒๅ›ดๆ นๆฎไธดๅบŠ่ง‚ๅฏŸ๏ผšๆŠ—็™Œ้‡‘ไธ‰่ง’ๆฒป็–—ๆ–นๆกˆๅฏน...\n153502 0 ่ฟ™ไธคๅคฉ็”ป็”ปๆŠŠๅฅฝๅ‡ ๆ นHB้“…็ฌ”้ƒฝ็”จ็š„ๅทฎไธๅคšไบ†\n153503 0 ๅ‘ไบ†9756ไธชๅธ–ๅญโ€ฆโ€ฆๆˆ‘่ง‰ๅพ—ๆˆ‘ๆœ‰ๆ—ถๅ€™็œŸ็š„็‰น่ฏๅ” โ€ฆโ€ฆ็‰นๅˆซโ€ฆโ€ฆ่ฏๅ” โ€ฆโ€ฆ\n153504 0 ็ฅ›็—˜็—˜็—˜ๅฐ้ป‘ๅคด็ฒ‰ๅˆบ่ƒŒไธŠ็—˜็—˜\n1000 Processed\n classify content\n154000 0 150708ๅทด้ปŽๆˆด้ซ˜ไนๆœบๅœบๅ‡บๅขƒ&amp\n154001 0 ไบบ็”Ÿไธโ€œๆฒป็š„ไบ†ไฝ ่„พๆฐ”็š„ไบบๆ˜ฏไฝ ็ˆฑ็š„ไบบ\n154002 0 ๆ‰‹ๆœบๆฒก็”ต่”็ณปไธไธŠไนŸไผšๆŒ‰็บฆๅฎšไธ็ฆปไธๅผƒ\n154003 0 ๅธŒๆœ›ๅฝ“ๆˆ‘็œŸ็š„่บบๅœจไบ†ๅŒป้™ข็š„ๅ‘ขๅคฉ\n154004 0 /้™คไบ†ไปฃๅทฅNexusๆ‰‹ๆœบๅŽไธบๅฏ่ƒฝ่ฟ˜็ป™Googleโ€œ็”Ÿโ€ไบ†ๅ—่กจ\n1000 Processed\n classify content\n154500 0 ๆ—ฅๆœฌ็›ด้‚ฎไปฃ่ดญๆ—ฅๆœฌๅฟ…ไนฐ่ฏๅ“ๆฆœไธŠๆœ‰ๅๆ—ฅๆœฌNichibanๆธฉๆ„Ÿ้•‡็—›่ดด็ฉดไฝ่ดด่…ฐ็—›่‚ฉ็—›่‚Œ่‚‰็—›ๅ…ณ่Š‚็—›ๅˆซ็œ‹...\n154501 0 ๆœฌๆฅไธ€ไธช็พŽไธฝ็š„้–่ฅฟๅด่ขซไธ€ไธชๅทฅๅŽ‚ๆžๆˆ่ฟ™ๆ ท\n154502 0 ็œŸ็›ธๆ€ปๆ˜ฏ่ฆๆฏ”ๆˆ‘ไปฌๆƒณ่ฑกๆฎ‹ๅฟ่ฎธๅคš\n154503 0 ็ปˆไบŽ่ฆๆขๆˆ่‹Bๅ•ฆ๏ฝžๆ—ฉ็Ÿฅ้“ไฝ ่ฆๆˆๆˆ‘็š„\n154504 0 ๅ‘่ฟ‘30ไธชๅ‚จๆˆทๆฝๅญ˜็Žฐ้‡‘700ๅคšไธ‡ๅ…ƒ\n1000 Processed\n classify content\n155000 1 
็šฎๅก๏ผŒๆ—ฅ็ซ‹ๆŒ–ๆŽ˜ๆœบๅ“็‰Œ่ฅ้”€ๅฎ—ๆ—จ:ๅฎขๆˆท่‡ณไธŠ๏ผŒๅ“่ดจไฟ่ฏ๏ผŒๆˆๆœฌไฝŽ๏ผŒไฝŽๆฒน่€—๏ผŒ้ซ˜ๆ•ˆ็އ๏ผŒๅ›žๆŠฅๅฟซ๏ผŒๅณๆ—ฅ่ตท่ดญๆœบ...\n155001 0 ไฝ ไนŸๆฅไธบๅ–œๆฌข็š„ๆญŒๆ›ฒๅŠ ๆฒนๅง\n155002 0 ๆ‰‹ๆœบ็Žฉไน…ไบ†็š„ๅขƒ็•Œไธๆ˜ฏ็œผ็›็œ‹็—›ไบ†่€Œๆ˜ฏๅคงๆ‹‡ๆŒ‡่…นๅˆทๅฑๅˆท็—›ไบ†็ดฏ\n155003 0 ๆœ‰ไบ›ไบบไธ€็”Ÿ้ƒฝๅœจๆ‘ธ็ดข็ˆฑ็š„็œŸ็›ธ\n155004 0 ๆ‚ฆ่ฏ—้ฃŽๅŸๆฒน่œ่Šฑ่œ‚่œœๆถฆ่†ๅญ•ๅฆ‡ๅฏ็”จไธป่ฆไฝœ็”จๆ˜ฏไธบๅŒๅ”‡้”ไฝๆฐดๅˆ†ๆไพ›ๅฑ้šœ\n1000 Processed\n classify content\n155500 0 ่ฏฅๅ…ฌๅธ็š„IPO็”ณ่ฏทๅˆš่Žท่ฏ็›‘ไผš้€š่ฟ‡\n155501 0 ๅทไธœ่ฅฟ็š„้ƒฝ่ฏฅ็›ผๆญปๅˆ‘ๅ•Šโ€ฆๅŠณๅŠจไบบๆฐ‘่ก€ๆฑ—้’ฑๆˆ–่€…่ก€ๆฑ—้’ฑไนฐ็š„ไธœ่ฅฟ\n155502 0 ไฝ†้‚ฃ่ฆ็ญ‰ๅˆฐไฝ ๅŠ ๅฐ‘่ฎธ่œ‚่œœ่ฐƒๅŒ€\n155503 0 ่ฅฟ่—ๆฟๅ—ใ€ๆฑฝ่ฝฆๆ•ด่ฝฆใ€้“ถ่กŒๆฟๅ—ๆถจๅน…ๆœ€ๅฐ\n155504 0 ไฝ†่œ‚่œœ็š„ๅŠŸๆ•ˆ่ฟœ่ฟœไธๆญข่ฟ™็‚นๅ“Ÿ\n1000 Processed\n classify content\n156000 0 ้™•่ฅฟ็œๆด›ๅทๅŽฟไบบๆฐ‘ๆณ•้™ขๅฏน่ขซๅ‘Šไบบ็งฆ้›ท้žๆณ•ๆ€ๅฎณ็่ดตใ€ๆฟ’ๅฑ้‡Ž็”ŸๅŠจ็‰ฉไธ€ๆกˆไพๆณ•ๅผ€ๅบญๅฎก็†ๅนถๅฝ“ๅบญไฝœๅ‡บๅฎฃๅˆค\n156001 0 xxxxๅนดxๆœˆxxๅทๅŠ ๆŽจxๅทๆฅผ\n156002 0 ๅŽไธบๅทฒๆ˜Ž็กฎๅฐ†ๅœจ5G็ ”ๅ‘ไธŠๆŠ•ๅ…ฅ6ไบฟ็พŽๅ…ƒ\n156003 0 ็บข็‚นๅ‰ชๅˆ€ไนŸ่ฆ150ๅทฆๅณ็š„ไปทๆ ผๆ‰่ƒฝๅ…ฅๆ‰‹\n156004 0 ไธ่ฆๅฏนไธ€ไธชๅชๅ่…่€ŒไธๅฎŒๅ–„้˜ฒ่…ๅˆถๅบฆ็š„ๆ”ฟๆƒๆŠฑๆœ‰ๅธŒๆœ›\n1000 Processed\n classify content\n156500 0 ๆ•ดไฝ“่€Œ่จ€ไธ€ๅˆ‡้ƒฝๅพˆๅฅฝๅพˆๆœ‰ๆ–ฐ็”Ÿๆดป่ฑ็„ถๅผ€ๆœ—็š„ๆ„Ÿ่ง‰\n156501 0 ้œ€่ฆ็š„่ฏ่ฏทๆไพ›ไฝ ็š„็™พๅบฆไบ‘่ดฆๅท็ป™ๆˆ‘\n156502 0 ๆฏ•ๆบ่ฅฟ่ทฏๅฃณ็‰ŒๅŠ ๆฒน็ซ™ๅพ€ๆญฆ่ญฆๆ”ฏ้˜Ÿๆ–นๅ‘ไธ€่พ†่ฝฟ่ฝฆๆผๆฒน\n156503 0 ๅ…ฑๅ‘5973ไบบๆ”ฏไป˜ๅคง็—…ไฟ้™ฉ่กฅๅฟ้‡‘้ข3165\n156504 0 ็Žฉ้ฅๆŽงๆจกๅž‹้ฃžๆœบ่ฆ่€ƒ่ฏโ€œ้ป‘้ฃžโ€ๅฏ่ƒฝไผš่ขซ่กŒๆ‹˜\n1000 Processed\n classify content\n157000 0 ไธ‹ๅˆx็‚นๆ‰ๅ‡บ็ป“ๆžœๅธฆๅฅนๅœจๅค–้ขๅƒ่ฟ‡้ฅญๅŽๅฟƒๆƒณ่ฆๅ…ˆๅธฆๅฅนๅŽปๅ“ช้‡Œไผ‘ๆฏไธ€ไธ‹\n157001 0 ๆญคๅปบ็ญ‘็š„ไธ€ๅคง็‰น่‰ฒๆ˜ฏ็Žป็’ƒๅข™็š„่ฟ็”จ\n157002 0 ๆœ€ๆ„ŸๅŠจไบบๅฟƒ็š„ๆ˜ฏๆœบๅ™จไบบๅๅนดๅฆ‚ไธ€ๆ—ฅ็š„ๅฎˆๅ€™ๆˆ–่ฎธๆˆ‘ๅƒ้‡Œ่ฟข่ฟข่ตถ่ฟ‡ๆฅๅชไธบ็œ‹ไฝ ไธ€็œผๆˆ–่ฎธๆˆ‘ๆฏ็ญไธ€ๅˆ‡ๅชไธบไฝ ่ฟœ็ฆป...\n157003 0 ไฝœไธบๅคงbossๅ˜ๆˆๅฆ–ๆ‰“ไธชๆžถ่ฟ˜ๅœจๅ–่Œ\n157004 0 
xxxxๆฌพ่ทฏ่™Žๅ‘็Žฐ็ฅž่กŒ้‡‡็”จไบ†่ทฏ่™Žๅ…ฌๅธๆœ€ๆ–ฐ็š„ๅŒๅน…่œ‚็ชไธญ็ฝ‘่ฎพ่ฎก\n1000 Processed\n classify content\n157500 1 ใ€ๆญฆๆฑ‰[็Žซ็‘ฐ]้€šๅ‘Šใ€‘xๆœˆx-xๅท๏ผŒๅธ‚ไธญๅฟƒๅฝฉๅฆ†ๆดปๅŠจ๏ผŒ่ฆๅ‡€xxx+ๆผ‚ไบฎใ€็ƒญๆƒ…็š„็คผไปช๏ผๅŒ…้ฅญ๏ผŒxๅคฉๅ…ฑ...\n157501 0 ไปŠๅนด7ๆœˆ28ๆ—ฅๆ˜ฏ็ฌฌไบ”ไธชโ€œไธ–็•Œ่‚็‚Žๆ—ฅโ€\n157502 0 ๆœ‰ๅฐๅญฉๅœจๅœฐ้“ไธ€ๅท็บฟๅธธ็†Ÿ่ทฏ็ซ™ๆ’’ๅฐฟ\n157503 0 ้’ๅนดๅคง่ก—ๆ–ฐๅขž4ไธชใ€ๅŒ—้™ตๅคง่ก—ๆ–ฐๅขž6ไธช\n157504 1 ไธญๅ›ฝ้“ถ่กŒๅฎถๅฑ…่ฃ…ไฟฎๅˆ†ๆœŸๆ˜“ๆ— ๆŠตๆŠผๅŠๆ—ถไธบๆ‚จ่งฃๅ†ณ่ต„้‡‘ๅ‘จ่ฝฌ้šพ้ข˜๏ผŒ่ฏฆ่ฏข่‡ด็”ตxxxx-xxxxxxx ...\n1000 Processed\n classify content\n158000 0 ๅพฎ่ฝฏ็š„ๆกŒ้ขๅš็š„ไนŸๆ˜ฏ้†‰ไบ†\n158001 0 ๅนฒไบ†ๅไบ‹่ฆ่ขซไธพๆŠฅๆฒก้—ฎ้ข˜ๅฏไปฅ่ฏด่‡ชๅทฑๆธ…็™ฝไบ‘ไบ‘WWWไฝ†ๆ˜ฏ่ฆไปฅๆ‰ฟๅคช้ƒŽ็š„ๅ่ช‰่ตท่ช“็š„่ฏ\n158002 0 ไธŠ็บฟ่ถ3ไบบไธๅค‡็ˆฌๅ‡บ็ช—ๆˆทๅ‘ผๆ•‘ๅดไธๆ…Žๅคฑ่ถณ\n158003 0 7/30ๅŽปๅธธๅทž่Žซๅๅฅ‡ๅฆ™้€›ไบ†ไธ€ๅœˆ\n158004 0 ่ทŸ้Ÿฉๅ›ฝ็š„ไนๆ—ฅ่œ‚่œœ้ป„ๆฒนๅฏนๆฏ”ไบ†\n1000 Processed\n classify content\n158500 0 ไปฅๆœ€ๅ…ทๅทดๆธไผ ็ปŸๅปบ็ญ‘็‰น่‰ฒ็š„ๅŠ่„šๆฅผไธบไธป\n158501 0 ๆ„Ÿๆฉๅ„็•Œ็ฒพ่‹ฑๅ‰่พˆๅฏนEHERDERๅ“็‰Œ็š„ๆŒ‡ๅฏผ\n158502 0 2011ไธญๅ›ฝ่Š่ฏ่Š‚ๅœจไปชๅพๅผ€ๅน•\n158503 0 ๆต™ๆฑŸๅซ่ง†ๅ†…ไธชโ€œๆˆ‘้€‰ๅ‘จๆฐไผฆโ€่›ฎไธ้”™็š„ๅ“ฆ\n158504 0 ๆกˆไปถๅˆคๅ†ณๅŽ่ขซๆ‰ง่กŒไบบไธ€็›ดไธ‹่ฝไธๆ˜Ž\n1000 Processed\n classify content\n159000 0 ไธ€ๅฎš่ฆ็ปๅธธไธŠไฝ ไปฌicloud้‚ฎ็ฎฑ\n159001 0 ไฟ„ๆ”ฟๅบœๅ’Œๅคฎ่กŒๅœจ2014ๅนดๅบ•่‡ณ2015ๅนดๅˆ้‡‡ๅ–็š„ไธ€็ณปๅˆ—ๆŽชๆ–ฝ็จณๅฎšไบ†้‡‘่žๅธ‚ๅœบๅฝขๅŠฟ\n159002 0 xใ€ๆœฌไบบๆˆ–้€š่ฟ‡ๅฎถไบบๅทจ้ขๅ—่ดฟ\n159003 0 ็ผ–ๅท1019167็š„ๅฐๅธ…ๅ“ฅๅธๆœบ\n159004 0 ่ฎฉ็‰›ไป”ๅ‡ ๅๅŒป็”Ÿๆฅ็ป™ไป–ๅšไพ‹่กŒ3ใ€ๆไป็ฒ‰โ€”ไฝฟ่‚Œ่‚คๆถฆ\n1000 Processed\n classify content\n159500 0 ๅคฎ่ง†ๅˆๆ›ๅ…‰ๅ›ฝๅ†…โ€œ็บข็ณ–โ€ๅ†…ๅน•\n159501 0 ticwatch็š„้˜ณๅ…‰ๅฑ้˜ณๅ…‰ไธ‹็š„่กจ็Žฐ่ฟ˜ๆ˜ฏ้žๅธธไธ้”™็š„\n159502 0 ็Œๅ—ๅŽฟ้ฆ–ๆœŸ้ญ”ๆœฏๅŸน่ฎญ็ญๅญฆๅ‘˜ๆ‰่‰บ่กจๆผ”ๅœจๅŽฟๅ…šๆ กๆ•™ๅฎคไธพ่กŒ\n159503 0 ้Ÿญ่œใ€ๅคง่’œๅ’Œๅฐ่‘ฑx็ง่œ็œ‹ไผผไธ่ตท็œผ\n159504 0 โ€œไธๆ˜ฏ่ฏดๆณ•ๅพ‹้ขๅ‰ไบบไบบๅนณ็ญ‰ๅ—\n1000 Processed\n classify content\n160000 0 
[Output truncated: batch-classification log of a Chinese SMS/Weibo text dataset — rows of (index, label, text) with binary labels (0 appears to mark ordinary posts, 1 promotional/spam messages), printed in chunks of 1000 with repeated "1000 Processed" / "classify content" progress markers, covering roughly indices 160000–212004.]
ๅบ”่ฏฅๆˆ็ซ‹ไธ“้—จ็š„็ป„็ป‡ๆŸฅๅค„ๅฎ—ๆ•™ๅ›ขไฝ“ไธญ่ฟ่ง„่กŒไธบ\n1000 Processed\n classify content\n212500 1 ๅฐŠๆ•ฌ็š„ไธšไธป๏ผšๆ„Ÿ่ฐขๆ‚จ้€‰ๆ‹ฉไธญๅ›ฝๅฅฝๅปบๆ่”็›ŸไธพๅŠž็š„โ€˜็ปๅฏน้€‰ๆ‹ฉโ€™ๅคงๅž‹่ฎฉๅˆฉๆดปๅŠจ๏ผŒ่ฏทๆ‚จไฟ็ฎกๅฅฝ่ฃ…ไฟฎๆŠค็…ง๏ผŒๆดปๅŠจ...\n212501 0 ไปŽๆณ•้™ขๅคง้™ข้‡Œๅ†ฒๅ‡บไธ€็พค้ป‘่ขๆฑ‰ๅญ\n212502 0 ๅˆๅ’ŒๅŒไบ‹ๅผ€ๅฏ็›ธไพไธบๅ‘ฝๆ–ฐๆฒ‚ไน‹ๆ—…ไบ†\n212503 0 ้กถๆ–ฐๆ——ไธ‹ๅ“็‰Œ๏ผšๅบทๅธˆๅ‚…ๅพทๅ…‹ๅฃซๅ‘ณๅ…จfamilymartไพฟๅˆฉๅบ—\n212504 0 ไฝฟๅพ—ๅ…จๅนดๆฐดๅˆฉๆŠ•่ต„ไปปๅŠกๅœจๅŽŸๆฅๅŸบ็ก€ไธŠๅ†ๆ้ซ˜xx%\n1000 Processed\n classify content\n213000 0 ๆˆ‘๏ผšๆ‹ฟ็€ๆ‰‹ๆœบไฝ ่ฟ˜ๆ€ŽไนˆๆŠฑ็€ๅฆˆๅฆˆ็กๅ•Š\n213001 1 xxxxxxxxxxxxxxxxxxxx-ใ€ๆฅไผŠไปฝใ€‘ๅฐŠๆ•ฌไผšๅ‘˜๏ผšโ€œๆข…โ€ไธฝๅฅณไบบ่Š‚๏ผŒๆปก้‡‘้ข้€่ดดๅฟƒๅฅฝ...\n213002 1 ๅฅฅๆ–ฏๅ‡ฏ็บณๅฎขๆˆฟx.xๆŠ˜๏ผไฝŽ่‡ณxxxๅ…ƒ้€ๆ—ฉ้คๅคœๅฎต๏ผ›้’Ÿ็‚นๆˆฟxxxๅ…ƒ่ตท๏ผๅœฐๅ€๏ผšๅค–็ ‚ๅคงๆกฅๅคด๏ผ›่ฎขๆˆฟ็ƒญ็บฟx...\n213003 1 ๆ”ถๅˆฐๅท็ :xxxxxxxxxxx๏ผŒ็Ÿญไฟกๅ†…ๅฎน:ๆพณ้—จๅจๅฐผๆ–ฏไบบๅจฑไนๅœบ็Žฐๅทฒๅผ€้€š็ฝ‘ไธŠๅšๅฝฉwww.xxx...\n213004 0 ๅนณ่ฐทๅŒบ้ฃŸ่ฏ็›‘ๅฑ€้€š่ฟ‡ๆกˆไปถๅฎกๆ ธไผš่ฎฎ\n1000 Processed\n classify content\n213500 0 ไธƒใ€ๅ•†ไธš่ฎกๅˆ’ไนฆๅˆฐๅบ•ๆœ‰ๅคš้‡่ฆ\n213501 0 ๅ…จๅœบ็ดฏ่ฎกๆถˆ่ดนๆปกxxxx้€็พŽ็ฝ—ๆ„Ÿๆฉๆœˆ้ฅผไธ€็›’\n213502 0 ๅœฐ้“็š„ๅ›พไนฆ็œ‹็€่ฒŒไผผ่ดจ้‡ไธ้”™\n213503 0 ไธŠไบคๆ‰€ๆœ‰ไน‰ๅŠก็ฃไฟƒๅˆธๅ•†ๅšๅฅฝๆŠ•็ฅจ็ณป็ปŸ\n213504 0 ๆฏๅคฉๆ‹็บฏ้œฒๆฐด็š„ๆ—ถๅ€™ๆ‹10ๅˆ†้’Ÿ็š„่„ธ\n1000 Processed\n classify content\n214000 0 ่…พ่ฎฏๆˆฟไบงๅœจไธญๅ…ณๆ‘ๅˆ›ไธšๅคง่ก—็š„ICๅ’–ๅ•กไธพๅŠžไบ†ไธ€ๅœบๆœ‰ๅ…ณๆˆฟไบงไผ—็ญน็š„ๅคงๅ’–ๅคด่„‘้ฃŽๆšด\n214001 0 8ๆœˆ11ๆ—ฅ่‹ๅทž้ซ˜ๆ–ฐๅŒบไบบๆ‰ๅธ‚ๅœบ็Žฐๅœบๆ‹›่˜ไผš\n214002 0 ๆฑฝ่ฝฆ็พŽๅฎน่ฝฆๅ†…ๆด—ๅฐ˜ๆฑฝ่ฝฆ็พŽๅฎน\n214003 0 ๆˆ‘ๆ˜Žๆ˜Žๆœ‰ๅฎšๆœŸๆ•ด็†็”ต่„‘ๆ–‡ไปถ็š„ไน ๆƒฏ\n214004 0 ๅœจๆ‰ฌๅทž่ฟ™ไนˆๅคšๅคฉ็ฌฌไธ€ๆฌกๅƒ่ฟ™ไนˆๅฅฝ\n1000 Processed\n classify content\n214500 0 ๅˆถ่ฎข\"ไฟƒ่ฟ›ไบบๆ‰ๅˆ›ๆ–ฐๅˆ›ไธš14ๆก\"\n214501 0 ๅฏนๅ…ถไป–3ๅๆ‘ๅนฒ้ƒจไปฅ่ดชๆฑก็ฝชๅ„ๅˆคๅค„ๆœ‰ๆœŸๅพ’ๅˆ‘ไธ‰ๅนด\n214502 0 ้‚่ขซๆŠ“่Žทไบ‹ๅŽ่ญฆๅฏŸ้—ฎไธบไป€ไนˆไฝ ๆŠขๅฎŒๅทงๅ…‹ๅŠ›ๅŽ่ฟ˜่ฆๅ›žๆฅๆŠข้š่บซๅฌ\n214503 0 ไธญๅ›ฝ็งปๅŠจๅฐฑๆŠŠ4Gๅก็ป™ๆˆ‘ๅฏ„่ฟ‡ๆฅไบ†\n214504 0 ๅ› 
ไธบๅผนๅฑ›้‡Œ่ฏ„่ฎบ่Šฑๅƒ้ชจ่ƒธๅฐ่ƒธๅคงๅต่ตทๆฅไบ†\n1000 Processed\n classify content\n215000 0 ไฝ†ๆ˜ฏๆˆ‘ไปฌxxๅท้€š็Ÿฅๅœˆๅ†…ๆœ‹ๅ‹ๆธ…ไป“็š„่ฟžไบ‘ๆธฏxxxxxxๆˆๅŠŸ่Žทๅˆฉ่ถ…่ฟ‡็™พๅˆ†ๅไบ”\n215001 0 ้€ ๆˆๅฎซๅดŽ้ชๅฑ•่งˆไธๅ†ๅฏนไธญๅ›ฝไบบๅผ€ๆ”พ\n215002 0 ไธไป…้“ๅพท่ดฅๅ่ฟ˜่ฟๆณ•ใ€่ฎฐๅพ—ไปŠๅนดๅœจ้ฆ™ๆธฏ่ทฏ่พนไนŸๆ˜ฏไธ€ๅฏนๅคง้™†้’ๅนดๅ…ฌ็„ถๅœจ่ทฏ่พน้‡Žๆˆ˜\n215003 0 ไปŠๅคฉๆ—ฉไธŠ็‚น่ฏ„็š„300477ไธ‹ๅˆไนŸๅฅ‹ๅŠ›ๆ‹‰ๆฟ\n215004 0 ๅ…จๅคฉๆœ€้ซ˜ๆธฉๅบฆ๏ผšๆœฌ็œ่ฅฟๅŒ—้ƒจๅœฐๅŒบ30โ„ƒๅทฆๅณ\n1000 Processed\n classify content\n215500 0 ้ฉพ้ฉถๅ‘˜้ฉพ้ฉถๅ†€A***ๅท้‡ๅž‹ๅŠๆŒ‚่กŒ้ฉถ่‡ณไบ”ไฟ้ซ˜้€Ÿ136ๅ…ฌ้‡Œๅค„ๆ—ถๅ› ๆŒ‚่ฝฆ็š„็ฏๅ…‰ไฟกๅทใ€ๅˆถๅŠจใ€่ฟžๆŽฅใ€ๅฎ‰ๅ…จ...\n215501 0 ๆฏๅคฉๅœจInsไธŠๆ™’ๆœ€ๆ–ฐ็š„ๅคง็‰Œๅ•ๅ“\n215502 0 ็Žฐๅœจๆœ‰ไธ€ๆฌพๅซใ€ŒPieceใ€็š„ๅค–่ฎพ\n215503 0 ๆˆ‘็‰นไนˆๅœจๅฎถ้‡Œ้™คไบ†็Žฉๆ‰‹ๆœบๅฐฑๆ˜ฏ็ก่ง‰็œŸ็‰นไนˆๆ†‹ๆญปไบ†\n215504 0 ไธ€็›ดไปฅไธบๅ…ฌไบคๆฏ”ๅœฐ้“ๆ›ด้€‚ๅˆๅ‘ๅ‘†\n1000 Processed\n classify content\n216000 0 ๅœจๆŠ—ๆˆ˜่ƒœๅˆฉ70ๅ‘จๅนด็š„้‡่ฆๆ—ถๅˆป่ฟ™ๆ— ็–‘ๆ˜ฏไธชๆ„ไน‰้žๅ‡ก็š„ๆดปๅŠจ\n216001 0 ่พ“ๅ…ฅ้กพๅฎข็š„ๆ‰‹ๆœบๅทๅŽ็š„็กฎ่ƒฝๆŸฅๅˆฐๆ‰€ๅœจๅฐๅŒบ\n216002 0 ๅพˆ้šพๅœจxๆœˆไปฝ็š„ๅŒ—ไบฌๅŸŽๆ‰พๅˆฐๅƒไบบ่ง„ๆจกไปฅไธŠ็š„ๅˆ้€‚ๅ‘ๅธƒไผšๅœบๅœฐ\n216003 0 ๅฝ“ๅณๅฐฑ็™พๅบฆไบ†ไธ€ไธ‹่™่ ไผšไธไผšๅธ่ก€\n216004 0 ่ฟžไบ‘ๆธฏๅธ‚ๆฐ”่ฑกๅฐ24ๆ—ฅ10ๆ—ถ30ๅˆ†ๅ‘ๅธƒ\n1000 Processed\n classify content\n216500 1 ๅฅฝๆถˆๆฏ๏ผš่”้€šxxๅ…†ๅ…‰้€Ÿๅฎฝๅธฆๅ…่ดนไฝฟ็”จไบ†๏ผ็”จ่”้€šๅฎฝๅธฆ๏ผŒๅธฆๅฎฝ็‹ฌไบซ๏ผŒ็ฝ‘้€Ÿๅฟซ๏ผไธŠ็ฝ‘ไธๆމ็บฟ๏ผŒ็ฝ‘็ปœ็จณๅฎš๏ผๅฏ...\n216501 0 ๆ˜จๅคฉ็ปซๆ™จไธ‰็‚นๅŠ้ฃžๆœบๆ™š็‚น็”ฑ้ƒŠๅŒบๆญๅทž่ทฏไธŠ\n216502 0 ่ขซx๏ผšๅฏน้“่ทฏไบค้€šไบ‹ๆ•…่ฎคๅฎšไนฆๆ— ๅผ‚่ฎฎ\n216503 0 ๅ› ไธบไปŠๅคฉๅœจๆ— ้”กๆ‰€ไปฅไป–ไปฌไธๆ˜ฏJhonๅ’ŒMaryไบ†\n216504 0 โ€็‰›ๅ›ž็ญ”๏ผšโ€œ็”Ÿๅ‘ฝไธ่ƒฝๆ‰ฟๅ—ไน‹่ฝป\n1000 Processed\n classify content\n217000 0 ๅพฎ่ฝฏWinxxๅฐ†ๆ†็ป‘่‡ชๅฎถ้Ÿณไนๆตๅช’ไฝ“ๅˆ Xboxๅ“็‰Œๆ—ฅๅ‰\n217001 0 ๅŸบ้‡‘ๆถจไบ†ๅฟƒๆƒ…ๅคงๅฅฝๅŠ ไธŠๆ˜ŽๅคฉไธไธŠ็ญ\n217002 0 ๅŽไธบmate7ๆ‰‹ๆœบๅฅ—huaweimate7ๆ‰‹ๆœบๅฃณไฟๆŠคๅฃณ่ถ…่–„็ซ‹ไฝ“ๆ˜Ÿ็ฉบไธชๆ€ง้Ÿฉ็‰ˆ\n217003 0 Limbergๆ˜ฏไธ€ไฝๆฐๅ‡บ็š„ๆ•ดๅฝขๅค–็ง‘ๅŒป็”Ÿ\n217004 0 
่ตถ็€ๅฎ ็‰ฉๅŒป้™ขๅŽป็ป™ๅฅถๆฒนๆ‰“่‚ฒ่‹—\n1000 Processed\n classify content\n217500 0 ๅކๅฒไธŠ็ฌฌ5ไธชๅปบๆˆๅœฐ้“็š„ๅŸŽๅธ‚\n217501 0 ๆ— ้”กๅšๅŒ…่ŒŽๆ‰‹ๆœฏๅคšๅฐ‘้’ฑโ€”โ€”ๆ— ้”กๅšๅŒ…่ŒŽๆ‰‹ๆœฏ้€‰ๆ‹ฉๆ— ้”ก่™นๆกฅๅŒป้™ข\n217502 0 ๆฒณๅŒ—ๅฐฑ่ฟ™็‚น็ฒฎ้ฃŸใ€ๆˆฟๅœฐไบงๅ’Œ้•ฟๅŸŽๆฑฝ่ฝฆๅ–ๅพ—ๆ ๆ ็š„\n217503 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜:ไฝ ๅฅฝ๏ผ โ€œไธ‰ๆœˆๅฅณไบบๅคฉ๏ผŒไปŠๅคฉไฝ ๆœ€ๅคง! โ€ไบฒ๏ผŒๅธญๆกฅ้ƒฝๅธ‚ๆ‹ไบบๅŠๅ–œ่ฟŽไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚๏ผŒๅ…จๅœบๅ…ซๆŠ˜...\n217504 0 ๆณ•้™ขๅ…ฑๅฏนๅนฟ่ฏ็Ž‹่€ๅ‰ไธŽๅŠ ๅคšๅฎ็š„7่ตทๆกˆไปถ่ฟ›่กŒๅฎฃๅˆค\n1000 Processed\n classify content\n218000 1 ไผŠไบบไธฝๅฆ†้€็บขๅŒ…ๅ•ฆ๏ผๅ€ผๅ…ƒๅฎต๏ผŒๅฅณไบบ่Š‚ๅŒ่Š‚ๆฅไธดไน‹ๅญฃ๏ผŒไธบๅ›ž้ฆˆๅ„ไฝไบฒไปฌ๏ผŒ็‰นๆŽจๅ‡บxxโ€”xxxๅ…ƒๅคงๅฅ–็บขๅŒ…๏ผŒ...\n218001 0 ่ฏฅๅŒบ19ๆกๅ†œ่ทฏๆๆกฃๅ‡็บงๅทฅ็จ‹ๅทฒ่ฟ›ๅœบๆ–ฝๅทฅ11ๆก\n218002 0 ๆฅ็ข—ไธญ่ฏ็ฎ€็›ดๆฒไบบๅฟƒ่„พ้ฆ™้ฃ˜ไธ‡้‡Œๅ•Š\n218003 0 ๆˆ‘ไปŠๅคฉๅœจๅŽไธบ็ฝ‘็›˜็ญพๅˆฐ่Žทๅพ—ไบ†112Mๅ…่ดนๆฐธไน…ๅฎน้‡\n218004 1 ้œ‡xxxxๅ…ƒ๏ผŒๅ…ถๅฎƒๅ›ฝ้™…ๆ ‡็›˜้•™x.xx็š„xxxxๅ…ƒ ๅฎ‰้’ขๅ’Œ้ฉฌ้’ข้ƒฝๅฏไปฅๅŽ‚ๆ.ๅฆ‚ๆœ‰้œ€่ฆๅฏ็”ต่ฏ่ฎข่ดญ....\n1000 Processed\n classify content\n218500 0 ๆฑŸๅฎๆฑฝ่ฝฆๅฎข่ฟ็ซ™็š„็ŽฏๅขƒไนŸๅคชๅทฎไบ†\n218501 0 ๆˆ‘็”จ็š„ๅฐฑๆ˜ฏๅŽไธบ็š„ๆ‰‹ๆœบๅพ…ๆœบๆ—ถ้—ด้•ฟๅๅบ”ไนŸๆŒบๅฟซ็š„ๆˆ‘ๅพˆๅ–œๆฌข\n218502 0 ๅŽฟไบบๅคงไธปไปปๆŽ็ฆๆฐ‘่ตด้ฆ™ๅบ™ไพฟๆฐ‘ๆœๅŠกไธญๅฟƒๅกฌ่พนๆ‘ไบŒ็ป„ๅŒ…ๆ‰ถๆˆท็จ‹ๆ–ฐๆฐ‘ๅฎถ่ตฐ่ฎฟ\n218503 0 ็œๅ…ฌๅฎ‰ๅŽ…ไบค่ญฆๆ€ป้˜Ÿๅ‰ฏๆ€ป้˜Ÿ้•ฟ่ดพไธญๅนณไธ€่กŒๅœจๅŽฟๅ…ฌๅฎ‰ๅฑ€ๅ‰ฏๅฑ€้•ฟใ€ไบค่ญฆๅคง้˜Ÿ้•ฟ่ตตๆ˜Ÿ็š„้™ชๅŒไธ‹ๆทฑๅ…ฅๆˆ‘้•‡ๆฃ€ๆŸฅๆŒ‡ๅฏผๅ†œ...\n218504 0 ๆ˜ฏๆŠŠไธ€ไบ›ๅœจ่‘—ๅAPP่ฝฏไปถSnapchatไธŠไธ€ไบ›็‚ซๅฏŒ็š„ๅญฉๅญ็š„ๅ›พ็‰‡ๆˆชๅ–POไธŠๅŽป\n1000 Processed\n classify content\n219000 0 ๆˆ‘็”ต่„‘ๅไบ†่ฟ˜ๆ˜ฏไฝ ไปฌQQๅ‡บ้—ฎ้ข˜ไบ†\n219001 0 ๆณ•ๅ›ฝๅŽŸไบงๆ™ฎ็ฝ—ๆ—บๆ–ฏๅฟƒๅฝขๅคฉ็„ถๆค็‰ฉ็ฒพๆฒนๆ‰‹ๅทฅ้ฆ™็š‚็คผ็›’่ฃ…100g*4ๅ—็พŽๅ›ฝไบš้ฉฌ้€Š$15\n219002 0 ๅบ“้‡Œๅœจๆ‰“NBAไน‹ๅ‰่ขซ่ฏ„ไปท่บซๆ็˜ฆๅฐๆœ‰ๆœ›ๆˆไธบ้ฆ–ๅ‘\n219003 0 ไฝ ๆ˜ฏๅฆ่ฟ˜ๅฎˆๅœจ็”ต่„‘้ขๅ‰้€‰่ดญ็€ๆฏ”ๅ•†ๅœบไปทๆ ผๅฎžๆƒ ็š„็”ตๅ™จ\n219004 0 ๏ผšๆ˜พ็„ถWindowsPhone้‡ๅˆฐ้บป็ƒฆไบ†\n1000 Processed\n classify content\n219500 0 
ๆฌข่ฟŽไบฒไปฌ้€‰่ดญ่ฏšไฟก็ป่ฅ\n219501 0 NBA็ƒๆ˜Ÿๆตท่พนไผ‘ๅ‡็š„็›ธ็‰‡ๆ€ป่ƒฝๅธๅผ•็ƒ่ฟท็š„็›ฎๅ…‰\n219502 0 โ€œ120โ€ๆ€ฅๆ•‘ๅ‘˜่ตถๅˆฐๅŽๆต‹ไบ†ไฝ“ๆธฉ\n219503 0 ๅœฐ้“ๆœ‰ไธชๅฐๅฅณ็”Ÿ็›ดๆŽฅๆŒคๆ™•่ฟ‡ๅŽป\n219504 0 LinkedInๆžถๆž„ๆผ”ๅŒ–ๅކๅฒ่งฃๆž\n1000 Processed\n classify content\n220000 1 ไฝๅ…ญ้€ไธ€็ญ‰ไฟƒ้”€ๆดปๅŠจใ€‚ ๆตทๅ‹้…’ๅบ—ๅ…จไฝ“ๅ‘˜ๅทฅๆœŸๅพ…ๆ‚จ็š„ๅ…‰ไธด๏ผ่ฎขๆˆฟ็ƒญ็บฟ๏ผšxxx-xxxxxxxx\n220001 0 2015ๅนด็ฌฌๅๅทๅฐ้ฃŽ็ฟ้ธฟ10ๅทๅ‰ๅŽ็™ป้™†ๆต™ๆฑŸ\n220002 0 ๆปฅ็”จ่Œๆƒใ€ๆณ„้œฒไธชไบบ้š็งๆ›ดๆ˜ฏ่บซไธบๆฐ‘่ญฆไธๅฏๅŽŸ่ฐ…็š„่ฟ‡้”™\n220003 0 ไธŽๅฝ“ๆ—ถ็š„ไธญๅคฎๆ”ฟๅบœๅ†›ไฝœๆˆ˜ๆ˜ฏๅ›ไนฑ\n220004 1 ๅๆ กไน‹่ทฏไปŽ็ซžๆ‰ไฟฎไธš่ตทๆญฅ๏ผŒๅŽปๅนดๆˆ‘ๆ กๆœ‰ไธคๅƒๅคšๅญฆ็”ŸๆˆๅŠŸๅ‡ๅ…ฅๅๆ ก้ซ˜ไธญ๏ผŒๆฌข่ฟŽๅ ฑๅๅ‚ๅŠ ๅ‘จๆœซๅญฆไน ๏ผŒๆˆ‘ไปฌๅฐ†ๅ…จ...\n1000 Processed\n classify content\n220500 1 ไบฒ็ˆฑ็š„ๅฎถ้•ฟ:่ด่ดๆ‰˜็ฎกๆ™š้—ด่พ…ๅฏผ็ญๅผ€ๅง‹็ญๅๅ•ฆ๏ผๆฌข่ฟŽๅนฟๅคงๅฎถ้•ฟๆฅ็”ตๆฅๅ›ญๅ’จ่ฏข๏ผxxxxxxxxxxxไฝ•\n220501 0 24โ€”2713880376289\n220502 0 ๆˆ˜ๆ–—ไธญๅˆไผšๅ˜ๅ›žไธŽ็Žฐๆœ‰FPS็ฝ‘็ปœๆธธๆˆ็›ธๅŒ็š„็ฌฌไธ€ไบบ็งฐ่ง†่ง’\n220503 0 ๆ ๆ†ๅŸบ้‡‘ๅˆๆˆไบ†ๆŠ•่ต„่€…็š„ๅ™ฉๆขฆ\n220504 0 ๆ„ๅคงๅˆฉ่ฎพ่ฎกๅทฅไฝœๅฎคgumdesign่ฎพ่ฎก็š„โ€œmastroโ€ๆ˜ฏไธ€ไธชๅฐๅž‹ๅฎถๅ…ท\n1000 Processed\n classify content\n221000 0 ๅ…จ็œ้“ถ่กŒไธš้‡‘่žๆœบๆž„่ต„ไบงๆ€ป้‡้ฆ–ๆฌก็ช็ ด3ไธ‡ไบฟ\n221001 0 ไฝ ๅช่ดŸ่ดฃๅผ€ๅฟƒๅฐฑๅฅฝ'ๅฝ“ๆ—ถไธๆ˜Ž็œŸ็›ธ็š„ๆˆ‘\n221002 0 ๆˆ‘ไนŸไธ็Ÿฅ้“่ฟ™็ฌ”xF็š„ๅŠ›้‡่ƒฝไธ่ƒฝๅคŸ็ช็ ดไธŠ้ข็š„้‡้‡ๅŽ‹ๅŠ›\n221003 0 Daianaๆ˜ฏ็”ฑSoupStudio่ฎพ่ฎก็š„ไธ€็›่ฝๅœฐ็ฏ\n221004 0 ไป–ไปฌ่‡ชๅทฑๆณก็š„้ป‘ๆพ้œฒ่œ‚่œœ้…’ไนŸๆ˜ฏๅคชๅฅฝๅ–\n1000 Processed\n classify content\n221500 0 8็‚น1ๆฐช๏ผšFacbookไนŸๅš็งปๅŠจ่ง†้ข‘็›ดๆ’ญไบ†\n221501 0 ไฝไบŽๆต™ๆฑŸ็œๆ…ˆๆบชๅธ‚ๆŽŒ่ตท้•‡ๅทฅไธš่ทฏxxๅทไฝฐไฝณ็”ตๅ™จๅŽ‚็ชๅ‘ๅคง็ซ\n221502 0 ็ฆๅทžๅธ‚ไธญ็บงๆณ•้™ขๆž—ไธฝๅจŸๆณ•ๅฎ˜ๅœจ20l5ๆฆ•่กŒ็ปˆๅญ—็ฌฌ306ๅทๆกˆๅฝ“ๅฎกๅˆค้•ฟ็บฆไธŠ่ฏ‰ไบบๆž—ไป™้œ–่ฐˆ่ฏ\n221503 0 ไฝ ๆ‹ฅๆœ‰ไบ†่ฟ™็”Ÿๅฐฑๆ˜ฏๆŠ•่ต„ๅ’Œๆ”ถ็›Š็š„ไบบ้•ฟ่ฝด็‰ˆ7ๅบงๅธƒๅฑ€/ๆญไธ‰็ผธmpareyourselfwithothๅฅฝ่ฏด็š„\n221504 0 ๆœ‰ๆ—ถๅบ†ๅนธ่‡ชๅทฑ่ฟ˜ๆ˜ฏ่ดŸ่ดฃ้ฃžๆœบ็บฟ่ทฏ\n1000 Processed\n classify 
content\n222000 0 ๅฐ็งฆๅ› ๆถ‰ๅซŒ่ŒๅŠกไพตๅ ๆกˆ่ขซๆฒฑๆฒณๆดพๅ‡บๆ‰€ไพๆณ•ๅˆ‘ไบ‹ๆ‹˜็•™\n222001 0 ๆฉๆ–ฝๅทžๆฃ€ๅฏŸๆœบๅ…ณไพๆณ•ๅฏนๆŽๆณฝๆ–Œๅ†ณๅฎš้€ฎๆ•\n222002 0 ๆ˜ฏๅŒป็”Ÿๅœจๅฏนๆ‹”็‰™ๅค„่ฟ›่กŒ้บป้†‰ๅŽๅผ€ๅง‹่ฟ›่กŒ็š„\n222003 0 ้ƒ‘ๅทžafp้ขๆŽˆ็ญๅฐ†ๆญฃๅผๅผ€่ฏพใ€€ใ€€xxxxๅนดxๆœˆxxใ€xxๆ—ฅ\n222004 0 ๆŠŠๆ‰‹ๆœบๅ’Œ็”ต่„‘็‰ˆ็š„ๅพฎๅšๆจกๆฟ้ƒฝๆขไบ†ๆ–ฐ็š„\n1000 Processed\n classify content\n222500 0 xๆœˆๅธธๅทžๆœบๅœบ่ฟŽๆฅไบ†ๆš‘่ฟ้ซ˜ๅณฐ\n222501 0 ไปŠๅคฉไธญๅˆๅœจๅŒป้™ข็œ‹ๅˆฐๅฐไพ„ๅฅณไบ†\n222502 0 ไปŠๅนด็ฝ‘ไธŠ้“ถ่กŒ็›ดๆŽฅ่ขซไบบ็›—ๅˆทๆถˆ่ดน3w4\n222503 0 ่ฏฅ่ทฏๆฎตๆ›พไบŽxxxxๅนดxxๆœˆxxๆ—ฅxxๆ—ถxxๅˆ†ๅ› ๅฑฑ่ฅฟๆ–นๅ‘ๅ†€ๆ™‹ไธป็บฟ็ซ™Kxxx+xxxๅค„่ฝฆๆต้‡ๅคง\n222504 0 ๆˆ–่ฎธๆ˜ฏ็”ต่„‘ๅ†…ๅญ˜ๅœจ็€็—…ๆฏ’ๅฏผ่‡ด่ฟ™ไธช็ฝ‘้กตๆ€ปๆ˜ฏ้šพไปฅๆ‰“ๅผ€\n1000 Processed\n classify content\n223000 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๅˆš่ทŸไฝ ้€š่ฏ็š„ๅนณๅฎ‰ไฟก็”จ่ดทๆฌพ้ƒญ็ป็†๏ผŒๆๆ–™็ฎ€ๅ•๏ผŒx-xๅคฉๅˆฐ่ดฆ๏ผŒ้šๅ€Ÿ้š่ฟ˜๏ผŒๅ…ฌๅธๅœฐๅ€๏ผš้กบๅค–่ทฏ...\n223001 0 ไฝ ไธๅฆจๅ†ๅคšๅš่ฟ™3ๆ‹›่ฟ›้˜ถ่ฟๅŠจๆฅ้›•ๅก‘่บซๅฝขๅ“ฆ~\n223002 0 ๆˆ‘ๅœจโ€œ้’ข็ดๆ‰“ๅƒตๅฐธโ€ๆ‰“่ถดไบ†255ๅชๅƒตๅฐธ\n223003 0 MQTTๆญฃๅผๆˆไธบๆŽจ่็š„็‰ฉ่”็ฝ‘ไผ ่พ“ๅ่ฎฎๆ ‡ๅ‡†\n223004 0 ้ƒจๅˆ†็ฌฌไธ‰ๆ–น็จ‹ๅบไธญๅ…‰ๆ ‡ๅœจๅฎšไฝๅˆฐๆ–‡ๅญ—่พ“ๅ…ฅๆก†ๅŽไธ่‡ชๅŠจๅผนๅ‡บ่งฆๆŽง้”ฎ็›˜\n1000 Processed\n classify content\n223500 1 ๅฟ—้‚ฆๅŽจๆŸœxxxxโ€˜็™พๅŸŽๆ ทๆฟๆˆฟๅพ้›†ๆดปๅŠจโ€™ไปทๆ ผ้ข็บฑๅณๅฐ†ๆญๆ™“๏ผๆ‚จๅ‡†ๅค‡ๅฅฝไบ†ๅ—๏ผŸๅฟ—้‚ฆๅฐ่ฉนๅŠๅ…จๅ‘˜ไบŽxๆ—ฅๅณ...\n223501 0 ๅทฅไธšๆœบๅ™จไบบๅœจ้€‰ๅ–ๆœซ็ซฏๅทฅๅ…ทๆ—ถๅบ”่ฏฅๆ€่€ƒ้‚ฃไบ›้—ฎ้ข˜\n223502 0 ไป–ไปฌไธคไบบๅœจ้ฃžๆœบไธŠไธบไธ€ๅๅฟƒ่‚บ้ชคๅœ็š„2ๅฒ็”ท็ซฅ่ฟ›่กŒๆ€ฅๆ•‘\n223503 0 ็”š่‡ณ้ฒ่Žฝ+ๅฅฝ่ฟๆฐ”่ƒฝ้€ ๅฐฑๆƒŠไบบ็š„่พ‰็…Œ\n223504 0 ๅปถๆ‘็š„ๅคๅปบ็ญ‘ๅค„ๅค„ไฝ“็Žฐ็€ๅ•†ไบบ็š„่ฟ™็ง็ฅˆ็ฅท\n1000 Processed\n classify content\n224000 0 ๆต™ๆฑŸๅฎ‰ๅ‰่ฑชๅŽไธ‰ๆ—ฅๆธธๅฐ†ๅ˜ๆˆๅ››ๆ˜Ÿๅบฆๅ‡ๆ‘ไธ‰ๆ—ฅๆธธ\n224001 0 ๅฏŒๅฃซๅบท็ญพ็ฝฒๅ่ฎฎๅฐ†ๅœจๅฐๅบฆๆŠ•่ต„xxไบฟ็พŽๅ…ƒๅปบๅŽ‚\n224002 0 HR่ตซ่Žฒๆ‹‰ๆž่‡ดไน‹็พŽ่ๅŽ็œผ้œœ15ml\n224003 0 ๆƒ ๆ™ฎๆ‰ฟ่ฏบๅœจไธŽๅบทๆŸๅˆๅนถๅŽๆ้ซ˜ceo่ฒๅฅฅ่މ็บณ็š„่–ช่ต„\n224004 0 ้•‡ๆฑŸๆ— ็—•ๆŽฅๅ‘ไธน้˜ณๆ— ็—•ๆŽฅๅ‘ๅ—ไบฌๆ— 
็—•ๆŽฅๅ‘\n1000 Processed\n classify content\n224500 0 ๆ”น้ฉๆ–นๆกˆๅพ—ๅˆฐไบ†ๅธๆณ•้ƒจ็š„่ฎคๅฏ\n224501 1 ไบฒ็ˆฑ็š„ๅŽไผšๅ‘˜xๆœˆxๆ—ฅไธ€xๆœˆxๆ—ฅไธ€ๅนดไธ€ๅบฆ็š„ๆ˜ฅๅญฃๅŒ–ๅฆ†ๅ“่Š‚๏ผŒๆญฆๆž—้“ถๆณฐๅ•†ๅœบๆดปๅŠจไนฐxxxx้€xxx๏ผŒx...\n224502 0 ้˜ฟ้‡Œๅทดๅทดๆœ‰ๆœ›่ฟ…้€Ÿๅˆ‡ๅ…ฅๅฅขไพˆๅ“ใ€่ฝปๅฅข\n224503 0 ใ€Žๆฏๅฅณๆดพๅ‡บๆ‰€ๅ†…่ขซๆ€ๅฎถๅฑžๆ›พxๆฌกๆŠฅ่ญฆ่ญฆๆ–น็งฐๅฎถๅŠกไบ‹ใ€ๆดพๅ‡บๆ‰€\n224504 0 21ไธ–็บชๆ•™่‚ฒ็ ”็ฉถ้™ขไธญๅฐๅญฆๆ•™่‚ฒ็ ”็ฉถไธญๅฟƒใ€ๆ‰ฌๅทžๅธ‚ๆข…ๅฒญๅฐๅญฆๆ‰ฟๅŠž\n1000 Processed\n classify content\n225000 0 ๅ…ดๆญฃ็บ ้”™ๅฎ‡ๅฎ™ๅคงๅญฆๆ กๆ ก้•ฟไปป็ง€็บขไธ€ไธชๅฎ‡ๅฎ™ๅœฐ็ƒไบบๅšไปปไฝ•ไธ€ไปถไบ‹ๅ„ฟ\n225001 0 โ€œ้›ถ่ท็ฆปโ€ๆ„Ÿๅ—ๆณ•้™ข็š„ๆ–‡ๅŒ–ๆฐ›ๅ›ด\n225002 0 ้’ๅฒ›ใ€ๅนฟๅทžใ€ๅคง่ฟžใ€ๅ—ไบฌ็ญ‰ๅœฐไนŸๅฎžๆ–ฝไบ†็ฆปๅฉš้™ๅท\n225003 0 ๆ›ดไธป่ฆ็š„่ฟ˜ๆ˜ฏ็”ฑไบŽๆธ…ๆ”ฟๅบœ่…่ดฅๅ’Œไฟๅฎˆ\n225004 0 B่ถ…ๅธˆๅ‚…ๆƒŠ่ฎถ้“๏ผšไฝ ็ป“่Š‚้‚ฃไนˆๅคงไธบไป€ไนˆ่„–ๅญ้‚ฃไนˆ็ป†\n1000 Processed\n classify content\n225500 0 ็ฌฌ61่ฎฒ๏ผšScalaไธญ้šๅผๅ‚ๆ•ฐไธŽ้šๅผ่ฝฌๆข็š„่”ๅˆไฝฟ็”จๅฎžๆˆ˜่ฏฆ่งฃๅŠๅ…ถๅœจSparkไธญ็š„ๅบ”็”จๆบ็ ่งฃๆž็™พๅบฆไบ‘๏ผš\n225501 0 ไปŽๆฅๆฒก็”จ่ฟ‡iPhoneๅคฉๅคฉ้…ธiPhone\n225502 0 ็ฆ่ƒฝ่ž่ต„็งŸ่ตๅ…ฌๅธโ€œๆ–ฐไธ‰ๆฟโ€ๆŒ‚็‰ŒไปชๅผๅœจๅŒ—ไบฌไธพ่กŒ\n225503 0 ๅฏนไธๆ˜ฏๅผบๅฅธๆ˜ฏ่ฝฎๅฅธโ€ฆโ€ฆ็ผ–ๅ‰ง้•ฟ็‚นๅฟƒๆˆๅ—โ€ฆโ€ฆ\n225504 1 ไฝ ๅฅฝๆˆ‘ๆ˜ฏ้˜ณๅ…‰ไฟก่ดท็š„ๅฐๅ‰ง๏ผŒ้˜ณๅ…‰ๅฎกๆ‰นๅพไฟก๏ผŒๅ…‰ๅคง้“ถ่กŒๆ”พๆฌพx-xๅคฉๅˆฐ่ดฆ๏ผŒๆ— ๆŠตๆŠผๆ— ๆ‹…ไฟ๏ผŒ็”ต่ฏxxxxx...\n1000 Processed\n classify content\n226000 0 ๅ‘Š่ฏ‰ไฝ ไปฌๆˆ‘็™ฝ็€ๅ‘ขๆœ‰ๅ›พๆœ‰็œŸ็›ธ\n226001 0 ไธŠๆตท็š„ไธญๅ›ฝๅ›ฝ้™…ๆœบๅ™จไบบๅฑ•ไผšไธŠ\n226002 0 ไฝœไบŽ2015ๅนด8ๆœˆ2ๆ—ฅๅ››ๆœˆ้ฃŽ\n226003 0 ไปŠๅคฉๆ‰ๅ‘็Žฐๅฅฝไน…ไธ็œ‹่ฃ…ไฟฎๅ›พๅบ“ไบ†\n226004 0 ไปŠๅคฉไธญๅˆๅŽปๅƒ็š„ๆ˜ ๅƒๅพๅทž็ปๅฃๆ˜ฏๆˆ‘ๅƒ่ฟ‡ๆœ€ๅฅฝๅƒ็š„้ฅญๅบ—\n1000 Processed\n classify content\n226500 0 ๆด‹ๆด‹ๅ’ฉ่กฃๅ ก่ถ…ๅผนๅŠ›2015ๅค่ฃ…ๆ–ฐๅ“ไธ“ๆŸœๅŒๆฌพๆฌง็พŽๆŠข็œผๆ’ž่‰ฒๅฐๆ€ช\n226501 0 ๅฏๅฏนๅ—ๆž—ๅคงxxๅ้ซ˜ๅฐ”ๅคซๅœบๅœฐๅŠฉ็†ๅฟ—ๆ„ฟ่€…ๆฅ่ฏดๅฐฑไธๆ˜ฏ่ฟ™ๅ›žไบ‹ไบ†\n226502 0 ไธ‹ไธชๆœˆ็š„ๆœŸๆŒ‡ไบคๅ‰ฒๆ—ฅๅœจ8ๆœˆ21ๆ—ฅ\n226503 0 ๆตทๅฃไธญ้™ขโ€œๆณ•ๅฎ˜ๆ•™ๆณ•ๅฎ˜โ€ๅคง่ฎฒๅ 
‚ๅœจไธญ้™ขไธ‰ๆฅผๅคงๅฎกๅˆคๅบญๅผ€่ฏพ\n226504 0 ๆœ€่ฟ‘ๆ˜ฏ้˜ฒๆ™’?sofina?็ฒ‰้ฅผ\n1000 Processed\n classify content\n227000 0 ่ฏด่‡ชๅทฑๅˆฐ่พพไบ†TOPOFTHEWORLD\n227001 0 ่ฎคไธบๆœบๅ™จไบบๅจ่ƒ่ฎบๆ˜Žๆ˜พ่ขซๅคธๅคงไบ†\n227002 0 ็ป™ๅคงๅฎถๅˆ†ไบซโ€œYouAreBeautiful\n227003 0 ่ฎฐๅพ—ๅฐๆ—ถๅ€™ๅฐๅทๅท็‹—ไธ€่ˆฌ็”จ็ปณๅญ\n227004 0 ๆˆ‘ไนŸๆƒณ่ฆ็บขๅŒ…ๅƒๅ†ฐๆท‡ๆท‹ไบ”ๆฏ›็š„ๅฐๅธƒไธ\n1000 Processed\n classify content\n227500 0 ๅฏนไบŽๆˆ‘่ฟ™็ง็œ‹ๆƒฏไบ†SMTMๅ’ŒUnprettyRapStar่ฟ™็งๆปกๆ˜ฏๅ“”ๅฃฐ็š„Hippopๆฏ”่ต›็š„ไบบ\n227501 0 ๅ‘†ๅœจ็”ต่„‘้ขๅ‰ๅทฅไฝœๅฐฑๆƒณ็€ๅ‡บๅŽป่ท‘\n227502 0 8/4ๆธฏ็‰ˆiPadair\n227503 1 ๆณ‰ๅทžๅธ‚ๅคฉๅ’Œๅ‰ฒ็ƒนๆ–™็†ๅบ—\n227504 0 ๆฏ่พ†่ฝฆ้ƒฝไธฅๆ ผๆŒ‰็…งx๏ผšxx็š„ๆฏ”ไพ‹ๅˆถไฝœ\n1000 Processed\n classify content\n228000 0 ่€Œๆ”ฟๅบœๅœจ้‡ๅปบไฟกๅฟƒไธŠๅŠ›ๅบฆไธ่ถณ\n228001 0 ๅ—ไบฌๅทฅๅ•†้ƒจ้—จๅˆ™่กจ็คบไผš่€ƒ่™‘ๆŠฝๆฃ€\n228002 0 ๆŸๆŽจไธป่ฟ™ๆ ท่ชช็€twitter\n228003 0 ไธ่ฎฉๆ‰‹ๆœบๆ‘งๆฏไฝ ็š„่„ŠๆŸฑใ€็ก็œ ไปฅๅŠๆŠ—ๅŽ‹ๆฐดๅนณ\n228004 0 ็™พๅบฆ็š„ๆ”ถๅฝ•ๆ•ฐ้‡ใ€ๆ”ถๅฝ•ๆ•ˆ็އใ€ๅ…ณ้”ฎ่ฏๆŽ’ๅ่กจ็ŽฐไนŸไธๅคง็›ธๅŒ\n1000 Processed\n classify content\n228500 1 ๅณๆ—ฅ่ตท่‡ณxๆœˆxxๅท๏ผŒๆขๅญฃๆธ…ไป“๏ผŒๆ›ผๅคฉ้›จๆœ้ฅฐๅ…จๅœบๅฏนๆŠ˜โ€”โ€”ๆ’ๅˆฉๆ›ผๅคฉ้›จใ€‚\n228501 0 ๅ„ฟๆ—ถๆขฆๆƒณ็š„ไผ™ไผดๅ“†ๅ•ฆAๆขฆๅ†ๆฌก่ขซๆฌไธŠ่งๅน•\n228502 0 NoFilter๏ผšๅˆซไปฅไธบๆˆ‘็˜ฆไบ†\n228503 0 ไธ€ๆฌกไฟ‚ๅ’ๅ…ฉๆฌกไฟ‚ๅ’nๆฌกไฟ‚็”˜ๆ”พๅฟƒๅ†‡ไธ‹ๆฌกไบ†็”˜ๅ˜Žไบบ็œŸ็ณปๅพ้ฉๅˆ่ฆ้ปŽๅšๆœ‹ๅ‹\n228504 0 ไธ่ฟ‡่ฃ…ๆฝขๅ’Œๅนฟๅ‘Šไธ€ๆ ทๅพˆ็ฒ‰ๅซฉๅพˆๅฐ‘ๅฅณๅ‘ข\n1000 Processed\n classify content\n229000 0 2015ๅนด6ๆœˆ้’ๅฒ›ๅธ‚ๆ–ฐๅผ€ๅทฅ่ฟ‡ไบฟๅ…ƒไบงไธš็ฑป้กน็›ฎ48ไธช\n229001 0 ่ขซๅ‘Šไบบ่ก€ๆถฒ้…’็ฒพๅซ้‡ไธบ178ๆฏซๅ…‹/100ๆฏซๅ‡\n229002 0 ไธฐๅŽฟไพ›็”ตๅ…ฌๅธโ€œ้‡‘่œœ่œ‚โ€ๅ…šๅ‘˜ๆœๅŠก้˜Ÿ่ตฐ่ฟ›็คพๅŒบ\n229003 0 ๅ—ไบฌไธ€ๅŸŽ็ฎก้˜Ÿๅ‘˜ๅ†™ไฟก้—ฎๅธ‚้•ฟ๏ผšๆˆ‘ไปฌ็š„ๅŸŽๅธ‚่ƒฝ็ฎก็†ๅฅฝๅ—\n229004 0 ่ฏฅๆกˆๆถ‰ๆกˆ็š„้‡‘้ขๅ…ฑ่ฎก400ไฝ™ไธ‡ๅ…ƒ\n1000 Processed\n classify content\n229500 0 ่ฟ™็งๆœบๅ™จไบบๅœจ็Žฐๅฎž้‡Œๅทฒ็ป่ขซๅ›ฝๅค–็š„ๆžๅฎขDIYๅ‡บๆฅไบ†\n229501 0 ไปฃ่กจxxไธชๅทž็š„xxๆ นๅ–ทๆณ‰ๆฐดๆŸฑๆ˜ฏไธ€ๅคงไบฎ็‚น\n229502 0 ็”ฑไบŽ่ทฏ้ขๅๅกŒ้€ ๆˆDN200ไพ›ๆฐด็ฎก็บฟ็ˆ†็ฎก\n229503 
0 ๅฝ“J็ป™A่ฏ‰่ฏด่‡ชๅทฑๆ—…ๆธธ็ปๅކ็š„ๆ—ถไป–ๅด็ก็€็š„้‚ฃๆ—ถ่ตท\n229504 1 ๆœฌๅ…ฌๅธ้•ฟๆœŸๆœ‰ๆ•ˆ๏ผŒไปฃๅผ€ไบ‘ๅ—ๅ„ๅœฐๆญฃ่ง„ๆœบๆ‰“ๅ‘็ฅจ๏ผšๅ›ฝ็จŽ๏ผŒๅœฐ็จŽ๏ผŒๅขžๅ€ผ๏ผŒๆๆ–™ๆฌพ๏ผŒๅทฅ็จ‹ๆฌพ๏ผŒๆœๅŠกไธš็ญ‰๏ผŒ็ฝ‘ไธŠๅฏ...\n1000 Processed\n classify content\n230000 0 ็†ฌๅคœ็œ‹็”ต่„‘ๆˆ–่€…ๅทฅไฝœไน‹ๅŽไฝ ๅฆ‚ๆžœ็…ง็…ง้•œๅญ\n230001 0 1986ๅนด่”ๅˆๅ›ฝๆ•™็ง‘ๆ–‡็ป„็ป‡ๅฐ†ๅ…ถๅˆ—ไธบไธ–็•Œ่‡ช็„ถ้—ไบง\n230002 0 ไธ‹ๅˆๆˆ‘ไปฌๆŒ‰็…ง่€ๅธˆ่ฆๆฑ‚็ฉฟ็€ๅญๆœๅธฆ็€่ŠฑๅœˆๅŽปไบ†\n230003 0 ่ทŸๆˆ‘่ฏดๅˆšๅ้ฃžๆœบไธ‹ๆฅๆƒณๅŽป่ฝฌ้ซ˜้“\n230004 0 ๆ นๆฎRealGM่ฎฐ่€…ShamsCharaniaๆŠฅ้“\n1000 Processed\n classify content\n230500 0 ่‡ช้€‚ๅบ”ๅทก่ˆช็ณป็ปŸใ€ๅฎ้ฉฌConnectedDriveไบ’่”้ฉพ้ฉถ็ณป็ปŸไนŸๅฐ†ๅ‡บ็Žฐ\n230501 0 โ€ๅŒป็”Ÿ็›ดๆŽฅๅ‘Š่ฏ‰ไป–๏ผšโ€œๆœ‰้‚ฃ้’ฑไฝ ่ฟ˜ๆ˜ฏ็œ‹็œ‹็œผ็›ๅง\n230502 0 ๆœ€่ฟ‘ๅšๆŒๅ–ไธญ่ฏๅคด้ƒจ็šฎ็‚Žๅฅ‡่ฟน่ˆฌ็š„ๅฅฝไบ†\n230503 0 xxxxๅปถ่พนๅทžๅฅ็พŽ้”ฆๆ ‡่ต›xxๅ…ฌๆ–คไปฅไธŠ็บง่‡ช็”ฑๅฑ•็คบ\n230504 0 ๅŸŽ็ฎก้˜Ÿๅ‘˜็ซ‹ๅณ้‡‡ๅ–ๆŽชๆ–ฝๅฐ†้’ขไธ็ปณๆ‰“ๅผ€\n1000 Processed\n classify content\n231000 0 ๅ››ๅท่ˆช็ฉบxuxxxx้ข„่ฎกๆ™š็‚นxๅฐๆ—ถ\n231001 0 ยทๅนฟๅทžๅธ‚ๆฐ‘ๆ‰“10ๅ…ƒ้บปๅฐ†่ขซๆ‹˜็•™ๆณ•้™ข็งฐ่ตŒ่ต„่พƒๅคง\n231002 0 ่‹ๅทžๅ›ฝ้™…็ง‘ๆŠ€ๅ›ญไธพๅŠž็š„ไธ€ๅœบๆ™บ่ƒฝ็”Ÿๆดปๆ–ฐๅ“ๅ‘ๅธƒไผšไธŠ\n231003 0 ็œŸๆ˜ฏๅคชไธๅฅฝไบ†ๅฐคๅ…ถๆ˜ฏ้ผป็‚Ž็ซŸ้š้š่ฆ็Šฏ็š„ๆ„ๆ€\n231004 0 T315I็ชๅ˜้˜ณๆ€งๆ‚ฃ่€…ๅฏน็ฌฌไธ€ไปฃๅŠ็ฌฌไบŒไปฃTKI่€่ฏ\n1000 Processed\n classify content\n231500 1 ่Žฑๅ…‹xๆฅผๆฐด็–—ๆ–ฐๆ˜ฅ็Œฎ็คผ็”ต่ฏ่ฎขๆˆฟๆœ‰ไผ˜ๆƒ ๏ผŒไธ€ไบบไผ˜ๆƒ xx๏ผŒไธคไบบๅŒ่กŒไธ€ไบบไผ˜ๆƒ xxxใ€‚ใ€‚ๆฌข่ฟŽๅ‰ๆฅๅ“่Œถใ€‚ใ€‚\n231501 0 ็–พ็—…็œ‹ไธŠๅŽปๅฐฑๅƒไธ€ไธชๅพˆ้ฅ่ฟœ็š„ไธœ่ฅฟ\n231502 0 ไฝ†ๆ„ฟๅŸŽ็ฎกไธŽๅ•†ๅฎถไน‹้—ดๅปบ็ซ‹ๅ‹่ฐŠ\n231503 0 ๆŠ•่ต„ๅฎ‰้€ธๅฎๆ›ดๆœ‰็Žฐ้‡‘ๅคงๅฅ–็ญ‰ๆ‚จๆ‹ฟๅ“ฆ\n231504 1 ๆœ—่ƒฝๅŽจๅซๅŠ้กถ@้•ฟๅฏฟไธ“ๅ–ๅบ—๏ผ›ๆฌขๅบ†x.xx๏ผ›ๆœ—่ƒฝๅŠ้กถ ๏ผ›ๅ…จๅœบx.xๆŠ˜๏ผˆ้™ค็‰นไปทๅค–๏ผ‰๏ผŒๆƒŠๅ–œๅคšๅคš๏ผŒ็คผๅ“...\n1000 Processed\n classify content\n232000 0 ๆ˜จๆ™šไธŠๅšๆขฆๆขฆๅˆฐๆ‰‹ๆœบ็ƒซๅพ—ไธๅพ—ไบ†\n232001 0 ๅŠชๅŠ›็กฎไฟๆœฌๅŒบๅŸŽ็ฎกๆ‰งๆณ•ๆก็บฟโ€œ็ƒญ็บฟไธ็ƒญโ€\n232002 0 
ๆˆ‘ไปฌTF็š„็ฒ‰ไธๅŠ่ขซๆญค่Š‚็›ฎ็ป„ไพฎ่พฑ็š„ๆ˜Žๆ˜Ÿ็ฒ‰ไธไปฌ่ฆๆฑ‚ๅ…ฌๅผ€้“ๆญ‰\n232003 0 ่ฏฅๅŽปๆ•ดๆฒปไบค้€šๅดๆŠŠ็ฒพๅŠ›ๆตช่ดนๅœจ่ฟ™ไบ›ไธŠ\n232004 0 xxExofficioCafenistaJacquardๅฅณๅฃซ็พŠๆฏ›ๆทท็บบ้€Ÿๅนฒไฟๆš–้’ˆ็ป‡่กซ็พŽๅ›ฝไบš้ฉฌ้€Š...\n1000 Processed\n classify content\n232500 0 ๆˆ็†Ÿ็š„ๆ ‡ๅฟ—ไน‹ไธ€ๅฐฑๆ˜ฏไธๅ†ๆƒณๅŽปๅนฒๆถ‰ๆˆ–่€…ๆ‰นๅˆคไป–ไบบ็š„ไปทๅ€ผ่ง‚\\n0\\tไปŠๅนด็ฌฌ13ๅทๅฐ้ฃŽโ€œ่‹่ฟช็ฝ—โ€ๆฅๅŠฟๆฑน...\n232501 1 ไบ’ๅŠจ๏ผŒๅ“ฅๅ“ฅไธ็”จๅŠจ๏ผŒๅฐๅฆนๅ…จ่‡ชๅŠจ๏ผŒ็Žฉๆณ•็‹ฌ็‰น๏ผŒๅ‰ๆ‰€ๆœชๆœ‰๏ผŒๅธฆ็ป™ๆ‚จไธไธ€ๆ ท็š„ไฝ“้ชŒ๏ผŒๅœฐๅ€:ๆญๅทžๅ†œๅ‰ฏไบงๅ“็‰ฉๆต...\n232502 0 ๅ‘็Žฐ็™พๅบฆไผ˜ๅŒ–็กฎๅฎžไธŽ่ฐทๆญŒไผ˜ๅŒ–ๆœ‰ๆฏ”่พƒๅคง็š„ๅทฎๅˆซ\n232503 0 ๅคง้‡็š„WindowsPhone็ฒ‰ไธๆถŒๅ…ฅ่ฎบๅ›ๅนถ่กจ็คบๆŠ—่ฎฎ\n232504 0 ๅŒป้™ขๆญฃๅผ้‡‡็”จไฟๅฅๅ“ๆ›ฟไปฃๆŸไบ›่ฏ็‰ฉๆฒป็–—็–พ็—…\n1000 Processed\n classify content\n233000 0 ๆ‰‹ๆœบ็›ธๆœบไนŸ้ƒฝๆ‹ไธๅ‡บ้‡Œ้ข็š„็พŽๅ’Œๅฃฎ่ง‚\n233001 0 ๆˆ‘ไปฌๅ…จๅฎถๆœ€่ฟ‘้ƒฝๅœจ่Šฑๅƒ้ชจโ€ฆไนฐ่œๅ›žๆฅ่ฟ›้—จ็œ‹่งๅผ็ˆธ\n233002 0 ?ๆฑŸๅฎไน‹ๆ—…?็งฆๆทฎๆฒณ็•”ๅพกๅบญ่Šณ\n233003 0 ้ข„่ฎกๅคง็บฆๆœ‰6000ๅ่Œๅ‘˜ๅฐ†่ขซ่งฃ้›‡\n233004 0 ไผš่ฎฉๅพˆๅคšๅฐไผ™ไผดๆจไธๅพ—24ๅฐๆ—ถๆณกๅœจๆฐด้‡Œ้ฟๆš‘\n1000 Processed\n classify content\n233500 0 19ๅฒ็š„ๆŒชๅจๅฅณๅญฉNannไบบ็งฐโ€œSnapchatๅฅณ็Ž‹โ€\n233501 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?ๅคœ่กŒไนฆ็”Ÿxx\n233502 0 ่ฐข่ฐขๅ„ไฝ็†Ÿๆ‚‰ๆˆ–ๆ˜ฏ้™Œ็”Ÿ็š„่ถณ็ƒไบบๅฅฝๅฟƒไบบๅพฎไฟก่ฝฌๅธ่ฏท่ฝฌๅˆฐ็›ฎๅ‰้™ชไผดๅœจๅŒป้™ข็š„ๅด”ๅพฎ\n233503 0 ่ฃ่€€7ๅงๅฐฑๆˆ‘้‚ฃ็‚นๅพฎ่–„็š„็งฏ่“„ๆˆ‘ๆ‰ฟๅ—ไธ่ตท\n233504 0 ่ฟ™ไธ€ๅ˜ๅŒ–่ฎฉ้ƒจๅˆ†Windows็”จๆˆทๆ„Ÿๅˆฐๆ‹…ๅฟง\n1000 Processed\n classify content\n234000 0 ๆฒกๆœ‰ๅทฅๅ•†ๆ‰ง็…งๅฐฑๅผ€ไธไบ†้˜ฟ้‡Œๅทดๅทด่ฏšไฟก้€šๅšไธไบ†่ฏšไฟก้€šๅ—\n234001 0 ๆˆ‘ไธŠไผ ไบ†โ€œๆฐธไปๅŽฟไบบๆฐ‘ๆณ•้™ข2015ๅนดโ€œๅบญๅฎก็ซž่ต›โ€ๆดปๅŠจๅ‚่ต›\n234002 0 ๅ‰็ฑณ้ธกๆฏ›็ง€ๅ˜‰ๅฎพ่‡ชๆ›ๅ‰็ฑณๅคฑๆœ›ๅฅฝๅ‹้š็ž’็œŸ็›ธ่ถ…ๆธ…&gt\n234003 0 ๅฎนๆ˜“ไฝฟ3ๅฒไปฅไธ‹็š„ๅ„ฟ็ซฅไธๆ˜“ๅ…ฅ็กๅ’Œๅ“ญ้—นไธๅฎ‰\n234004 0 ๆ™จๆ‚ ็ป„ๅˆๅฏนๆฏ”๏ผš็ฟป็‰ˆTwinsๅ’Œๅฃฐ่Œ็ฟปๅฏผๅธˆ\n1000 Processed\n classify content\n234500 1 ไธญๅ›ฝๅ†œไธš้“ถ่กŒๅธๅท๏ผšxxxx.xxxx.xxxx.xxxx.xxx 
็Ž‹ๆณข\n234501 0 ใ€€ใ€€1978ๅนด10ๆœˆ่‡ณ1982ๅนด7ๆœˆ\n234502 0 ่€Œๅœฐไธ‹็ฉบ้—ดๅˆไธŽไธ‰ๆกๅœฐ้“็บฟ่ทฏไบ’็›ธ่ฟž้€š\n234503 0 ไธƒๅทๆฅผๅœจๆต‡ๆณจ่ฅฟ่พน็š„็”ตๆขฏๅŸบๅ‘\n234504 0 ๅŒๆ—ถไนŸๅˆถ้€ ้ฃžๆœบๅ’Œๅ„็งๅ‘ๅŠจๆœบ\n1000 Processed\n classify content\n235000 0 ใ€€ใ€€1ใ€ๅŠๆ—ถ้ฟๅ…็—…ๅ› ็–พ็—…็š„ไบง็”Ÿ้ƒฝๆœ‰ไธ€ๅฎš็š„ๅŽŸๅ› \n235001 0 ็œ‹็€ไธ€ๆžถ้ฃžๆœบไธ€้—ชไธ€้—ชๅœฐ้ฃž่ฟ‡\n235002 0 ๆณ•ๅฎ˜๏ผšๅฅฝๅ•Š้‚ฃไธชๆƒๅˆฉๆ˜ฏไป€ไนˆๆ„ๆ€\n235003 1 ใ€ไธ‰ๆœˆๅผ€ๆ˜ฅ๏ผŒไธœ้ฃŽๆœฌ็”ฐๅผ€ๅ…ƒๅบ—้€็คผไบ†ใ€‘ๅ‡ก่ฟ›ๅบ—ๅŠž็†ไฟ้™ฉไธšๅŠก๏ผŒๅณๆœ‰ๆœบไผš่Žท่ต ่Š‚ๆฐ”้—จๆธ…ๆด—ใ€ๅ››่ฝฎๅฎšไฝใ€ๅทฅๆ—ถ...\n235004 0 ๆธ…็†ไนฑๅ †็‰ฉๅ †ๆ–™xๅค„ใ€่ฟ็ซ ๅนฟๅ‘Šๆ‹›่ดดxxๅค„\n1000 Processed\n classify content\n235500 0 ๆ•ดไธชๅ†™ๅญ—ๆฅผ่ขซ่“ๅคฉ็™ฝไบ‘โ€œๅ›ด็ป•โ€\n235501 0 ๅ—ไบฌๅคง้›จไนŸ้šพไปฅๆต‡็ญๆ‘‡ๆปšๆญŒ่ฟท็ƒญๆƒ…\n235502 0 ไธๅฐ‘Indiegogo็š„ๆ˜Žๆ˜Ÿไผ—็ญนไบงๅ“็ญ‰ๅทฒ็ป็™ป้™†\n235503 0 ๆ—ฅๆœฌHELLOKITTY้™้‡่‚Œ็ ”ๆžๆถฆไฟๆนฟๅŒ–ๅฆ†ๆฐด่‚Œ่‚คๅนฒ็‡ฅ\n235504 0 ้šๅŽไธŽ่ญฆๆ–นๅฑ•ๅผ€ไบ†้•ฟ่พพ20่‹ฑ้‡Œ็š„่ฟฝ้€\n1000 Processed\n classify content\n236000 0 ็ป“ๆžœๆ„ๅค–ๅœฐๅฅฝ็Žฉๅนถไธ”get็บขๅŒ…็Ž‹+็‰Œ็Ž‹็š„็งฐๅท\n236001 0 ็„ถ่€Œmips็”Ÿๆ€็š„ๅœๆปžไธŽ่ขซไพต่š€็ป™ๅ›ๆญฃๆŒ–ไบ†ไธ€ๅคงๅ‘\n236002 0 ็”ฑ้ซ˜ๅ“่ดจ็ฒพๅ“ๆฅผ็›˜โ€”โ€”ๆฝๅŠๆ’ๅคงๅ้ƒฝใ€ๆฝๅŠ็ฟก็ฟ ๅŽๅบญๆบๆ‰‹้Ÿณไนๅนฟๆ’ญFMxx\n236003 0 ็šฎ่‚ค่ฟ‡ๆ•ๅ‘็บขๅ–ทไธ€ๅ–ท่ƒฝๆœ‰ๆ•ˆ็ผ“่งฃ\n236004 0 ๅˆซๅข…=villa=cottage\n1000 Processed\n classify content\n236500 0 ็›ฎๅ‰5ๅนดๅทฒ็ดฏ่ฎกๆŠ•ๅ…ฅไธ€็™พไบฟๆŽจ่ฟ›ไบค้€šๅปบ่ฎพ\n236501 0 ๅœจ่ฟ™้‡Œๆœ‰ๅ„็ง้ซ˜็บง็š„ๆœบๅ™จไบบไธŽๅฎไบบ่ดน่งฃ็š„ๆฏ็ญ็Žฐ่ฑก\n236502 0 ps๏ผšไฝๅบ—ๅฎขไบบๅ…่ดน่ต ้€ๆ‰ฌๅทžๆ—…ๆธธๅ…ฌไบคๅกๅ•ฆ\n236503 0 ่ฟ™ๆฎตๆ—ถ้—ด็š„็”ต่„‘ๆกŒ้ขๅ…จ้ƒฝๆ˜ฏๅฅน\n236504 0 ๆณ•้™ขไธ€ๅฎกๅˆคๅ†ณ่ขซๅ‘Š้‚ฑๆŸ็Šฏๆ•…ๆ„ไผคๅฎณ็ฝช\n1000 Processed\n classify content\n237000 0 โ€œ็ฟ้ธฟโ€ๅทฒ็™ป้™†ๆต™ๆฑŸ่ˆŸๅฑฑ?ไธŠๆตท่ฝฌ็งปๆ’ค็ฆปxx\n237001 0 ๆ‰‹ๆœบๅŽ้ขๅฐฑๅฐ‘ไบ†ไธ€ไธชๆ ‡ๅฟ—็š„่†œ่€Œๅทฒ\n237002 0 ๅŒป็–—็พŽๅฎนๅ–็š„ไธๅ•ๅ•ๆ˜ฏๆ•ดๅฎน\n237003 0 ไธ€ไธชไพ้ ๅŽๆฅ่€…ๆŠ•ๅ…ฅ็ปด็ณปๅ…ˆๅ…ฅ่€…็š„ๅฑ€ๅทฒ็ป่ฏดๅพ—ๅ˜ด็ƒ‚\n237004 0 
ๅ—้€š่€ๅ…ต่‡ชๆŽ50ไธ‡่ฎพๆ‹ฅๅ†›ๅŸบ้‡‘\n1000 Processed\n classify content\n237500 0 ๅพทๅ›ฝๆฐ‘ไผ—ๅœจ็บณ็ฒนๆ”ฟๆฒป่…่ดฅ้—ฎ้ข˜ไธŠ็š„่ฎค่ฏ†ไธไป…้ผ ็›ฎๅฏธๅ…‰ใ€่‡ชๆˆ‘ๆฌบ้ช—\n237501 0 ๆณ•ๅพ‹ใ€่กŒๆ”ฟๆณ•่ง„่ง„ๅฎšๅนฟๅ‘Šไธญๅบ”ๅฝ“ๆ˜Ž็คบ็š„ๅ†…ๅฎน\n237502 0 ๅ่…่‡ณๅฐ‘ไผšไฝฟGDPไธ‹้™ไธ‰ไธช็™พๅˆ†็‚น\n237503 0 ่ฟ™ๆ—ถๅŒป้™ขไธ€ไฝๅพท้ซ˜ๆœ›้‡็š„่€ไธปไปป่ฏด๏ผšๅฑ\n237504 0 ๆŸไบ›ๅฐๅธ‚ๆฐ‘ๆŠน้ป‘ไบ†ๅ—ไบฌไบบ็š„ๅฝข่ฑก\n1000 Processed\n classify content\n238000 0 ่ฟ™ๆ˜ฏ่‡ช1985ๅนดไปฅๆฅๅฝ“ๅœฐ้ญๅ—ๆœ€ๅคง็š„ไธ€ๆฌก่—็พ\n238001 0 ็œŸๆƒณๆ‰‹ไผธๅˆฐ็”ต่„‘้‚ฃๅคดๅคงๅ˜ดๅทด็”ฉไป–ไธซ็š„\n238002 0 ๅœจๆต™ๆฑŸ็‘žๅฎ‰ๆฑ€็”ฐๅ•†ไธš่ก—ๅ’Œ่”ไธญ่ทฏไบคๅ‰ๅฃไธŠๆผ”ไบ†โ€œ็ขฐ็ขฐ่ฝฆโ€ๅคงๆˆ˜\n238003 0 ๆœ‰ๅ‚ๅฑ•ๅŽ‚ๅ•†ๆ‰“้€ ไบ†ไธ€ไธช้•ฟ็›ธ้…ทไผผๆ—ฅๆœฌ้ฆ–็›ธๅฎ‰ๅ€ๆ™‹ไธ‰็š„ๆœบๅ™จไบบ\n238004 0 ้›ช่Šฑ็ง€ๆป‹้˜ดๆฐดไนณๅฅ—็›’ๅ› ไธบ็”Ÿ็†ๆˆ–็Žฏๅขƒ็š„ๅฝฑๅ“ไพฟๅ‡บ็Žฐ้˜ด่™š็š„ๆƒ…ๅ†ต\n1000 Processed\n classify content\n238500 0 ๅคง็‰Œๆณฐๅ›ฝMistine็พฝ็ฟผ็ฒ‰้ฅผ\n238501 0 G2501ๅ—ไบฌ็ป•ๅŸŽ้ซ˜้€Ÿ็”ฑๅ…ญๅˆๅพ€ๅ—ไบฌๆ–นๅ‘ๅ—ไบฌๅ››ๆกฅๆฎตไปŽ44K่‡ณ45K\n238502 0 ๅ…ถๅฎž่ฟ™ๅฎถๅบ—ๆ˜ฏlonelyplanetๆŽจ่็š„้ค้ฆ†\n238503 0 ๅŽŸๅˆ›ๅŠๅนด้”€้‡่ถ…็พŽๅ›ฝๆˆ‘ๅ›ฝๆˆ็ฌฌ1ๅคงๆ–ฐ่ƒฝๆบ่ฝฆๅธ‚ๅœบ\n238504 0 ๅฎ‹่Œœๅงๅงๅ‡ ็‚น็š„้ฃžๆœบๅ•Š\n1000 Processed\n classify content\n239000 0 9ๅ็Šฏ็ฝชๅซŒ็–‘ไบบ้ƒฝๅ› ๆถ‰ๅซŒ่ฏˆ้ช—็ฝช่ขซๅ—ไบฌ้›จ่Šฑ่ญฆๆ–นไพๆณ•ๅˆ‘ไบ‹ๆ‹˜็•™\n239001 0 ่ตšๅพ—ไธๅคšๆ–ฐๆ‰‹ไธ€ๅคฉ80ๅ…ƒๅˆฐ100ๅ…ƒ่ฟ˜ๆ˜ฏๆœ‰ไฟ่ฏ็š„\n239002 0 ๅฎ‹็ฃŠ็š„้ญ้‡ๅฑžไบŽ่ดต้‡‘ๅฑžๆŠ•่ต„ๅคฑ่ดฅ\n239003 0 ๅฏน่ฟๆณ•ไผไธšไธ่ƒฝๆƒฏๅ…ปๅฟ…้กปไธฅๆƒฉ\n239004 0 ๆณฐๅทžไธ‰็ฆ่ˆน่ˆถ่ˆนๅŽ‚ๅ› ่ถ…้‡ๅŠ่ฝฝๅฏผ่‡ดไธ€ๆญปไธ€้‡ไผค\n1000 Processed\n classify content\n239500 0 ไฝ ็š„ๅญฉๅญไนŸๆœ‰ๆœบไผšๅ“ฆ่ง†้ข‘้“พๆŽฅ๏ผš\n239501 0 ไปŽgoogle็š„ๅซๆ˜Ÿๅ›พ็‰‡ไธŠๆธ…ๆ™ฐๅฏ่ง่€ๅฎถ็š„ๅฐ้™ขๅ’Œ่œๅ›ญ\n239502 0 ๆ—ถๅฐšๅšไธป๏ผšTiphaine็š„ๆททๆญ้ฃŽๆ ผๆœ‰็€็‰นๅˆซ็š„ไธชๆ€ง\n239503 0 ๅฆจ็ขๅ…ฌๅŠก+ไธ็ณปๅฎ‰ๅ…จๅธฆ+ๆ‰ฐไนฑๅ…ฌๅ…ฑๅฎ‰ๅ…จ\n239504 0 ๆ‰‹ๆœบ็”จไบ†ไธ€ๅนดๆฒก่ขซๆŠขไนŸๆ˜ฏไธ็ฎ€ๅ•\n1000 Processed\n classify content\n240000 0 ๆ‘ธๅˆฐไบ†ๆ–ฐ็š„MicrosoftEdge\n240001 0 ๅนถๅฐ†ไบŽ8ๆœˆ29ๆ—ฅ่ตดๅ—ไบฌๅ‚ๅŠ 
ๅ…จ็œ่‹ๅ—ใ€่‹ๅŒ—็ฒพๅ“่Š‚็›ฎ็š„่ง’้€\n240002 0 V่„ธ่ฏ„ไปท็พŠๆผ”ๅ‡บไบ†ๆดป็”Ÿ็”Ÿ็š„ๅผ ่ตท็ตๆ˜ฏ่ฎคๅฏ\n240003 0 โ€œๅคฑ่”โ€xไธชๅคšๆœˆ็š„้’Ÿๅฏ็ซ ็ญ‰xไบบๅ› ๆถ‰ๅซŒ้žๆณ•ๅธๆ”ถๅ…ฌไผ—ๅญ˜ๆฌพ็ฝช่ขซๆŠ“่Žท\n240004 0 xใ€ๆœบไผšไธๅฏ่ƒฝๆฐธ่ฟœ้™ชไผดไฝ ไธ€่พˆๅญ\n1000 Processed\n classify content\n240500 0 ๆœฌ้—จไปŽ่ฟžไบ‘็™ฝ้นญๅžๅคบๅพ—็ฉบๆ˜Žๆ‹ณๆฎ‹็ซ ไธ‰\n240501 0 ๅ…ถไธญๅค–้ชจ้ชผๆœบๅ™จไบบๅทฒ็ปไธŽๅ…ซไธ€ๅบทๅคไธญๅฟƒ่ฟ›่กŒๅˆไฝœ\n240502 1 ๅฐŠๆ•ฌ็š„ๅฎถ้•ฟ๏ผŒๆ–ฐๅนดๅฅฝ๏ผ้˜ฟๅขจ้ฑผๆ•™่‚ฒๆ˜ฅๅญฃๆ™š่‡ชไน ่ฏพ็จ‹ๅฐ†ไบŽxๆœˆxๅท๏ผˆๆ˜Žๅคฉ๏ผ‰ๆญฃๅผๅผ€่ฏพ๏ผŒxๆœˆxxๆ—ฅๆŠฅๅๅ››ไธช...\n240503 1 ไน”ๆฒป็ฝ—ๅฐผไบš่ฎฉๅŠ็ƒxxใ€ๅ“ฅ็ฝ—้‚ฃxxใ€‚้”กๆฐธๅนณๆ‰‹xxใ€็‰ๆฃฎxxใ€‚้˜ฟๅฐ”ๅทดๅกž็‰น่ฎฉๅนณๅŠxxใ€้‚ฆๅผ—ๆ‹‰็”ธๆ‹ฟx...\n240504 0 ไธญๅ…ณๆ‘่ก—้“่‡ช2012ๅนดๅผ€ๅฑ•็ฝ‘ๆ ผๅŒ–็คพไผš็ฎก็†\n1000 Processed\n classify content\n241000 0 ๅฐๅบฆ้‡‘่žไธญๅฟƒ็š„้คๅŽ…ๅๅญ—ๆ˜ฏไปฅ็บณ็ฒนๅพทๅ›ฝ้ข†่ข–้˜ฟ้“ๅคซยทๅธŒ็‰นๅ‹’ๅ’Œ็บณ็ฒนๅ…šๅพฝๅ‘ฝๅ\n241001 0 ๆœ‰miniatureVersaillesไน‹็จฑgeHerrenchiemsee\n241002 0 ๅคฉๅบœ่ฝฏไปถๅ›ญC12็š„้ซ˜ๅฑ‚็”ตๆขฏไธ‹็ญๆ—ถ้—ดๅไบ†\n241003 0 ็œ‹็œ‹้ฃžๆœบๆ˜ฏๅฆ‚ไฝ•่ฟ›่กŒๅฎšๆฃ€ใ€็ปดไฟฎ็š„\n241004 0 ๅทฒ็ป่ขซ70ๅคšไธชๅ›ฝ้™…ๆœ่ฃ…ๅ“็‰ŒๆŠตๅˆถ\n1000 Processed\n classify content\n241500 0 ๅŒๆ–น่ขซๅนๅˆค58ๆฌก็Šฏ่ง„ๅˆ›ไธ‹ๆœฌ่ต›ๅญฃๅญฃๅŽ่ต›ไน‹ๆœ€\n241501 0 ๅ—้€šๅคง่ก—็”ฑๆ–‡ๅŒ–ๅ…ฌๅ›ญๅพ€ๅฎฃๅŒ–่ก—ๆ–นๅ‘\n241502 0 ๅ› ไธบไฝ ๅซ่ตตๆœ‹ๆถ›ๆ‰€ไปฅ็Ž‹ไธน้˜ณไธไผš็ฆปๅผ€ไฝ \n241503 0 ๆฅ่‡ชFM797198ๅคฉๆตทๆ— ่ด็š„ๅฌ้Ÿณ็ญ’\n241504 0 ็ ดwin10AๅกๆŒ‚ไบ†ไผ—ไบบ๏ผš่พฃ้ธกAๅก้ฉฑๅŠจ\n1000 Processed\n classify content\n242000 0 ไฝ ไป–ๅฆˆ็”จไธชๅŽไธบๅคงๅก‘ๆ–™ไธๅฑŒไธ\n242001 0 ๅฅณไธปๆ’ญๅฎžๅไธพๆŠฅ่ขซๅฎ˜ๅ‘˜ๅŒ…ๅ…ป4ๅนดๅŽ้ญๆŠ›ๅผƒ\n242002 0 ๅๆญฃ็ป‘ๅฎšไบ†QQๅนณๆ—ถ้ƒฝๆ˜ฏ็”จQQ็™ป\n242003 0 ไปŠๅคฉๅœจๅนฟไธœ็”ต็™ฝๆปจๆตทๆ–ฐๅŒบๅš็พŽๆ‘\n242004 0 ๅฐฑ่ฑกไธ€้ƒจ็ƒ‚็”ต่ง†ๅ‰งๅ่…่ฟ›ๅฑ•ๅ’Œๆƒ…่Š‚่ถŠๆฅ่ถŠ่’่ฏž็ฆปๅฅ‡\n1000 Processed\n classify content\n242500 0 ๅฎ˜ๅœบ่…่ดฅใ€ๆฐ‘้—ดๆถๆ€งไบ‹ไปถใ€ๅฎถๅบญ้—นๅ‰ง้ข‘ๅ‘\n242501 0 ็ขงๆฐดๆบ930ๆ‰‹ใ€ๆœบๅ™จไบบ563ๆ‰‹ใ€่“่‰ฒๅ…‰ๆ ‡1661ๆ‰‹\n242502 0 
ไบบไฝ“ๅนฒ็ป†่ƒžๆ˜ฏๅฝ“้‡‘ๅŒป็–—ๅ†็”Ÿไธš็•Œ็š„ๆœ€ๅท…ๅณฐ\n242503 1 ้œฒๅฐฑ่กŒ๏ผŒcx-xr็Žฐ่ฝฆๅ……่ถณ๏ผŒ่ต„ๆบๆœ‰้™๏ผŒๆฌข่ฟŽๆŠข่ดญ\n242504 0 ไป“ๅฑฑๅŒบๆณ•้™ขๅˆคไปคๅนผๅ„ฟๅ›ญๅบ”ไป˜็ป™ๆž—ๅ…ˆ็”Ÿไธ€ๅฎš็š„ๅŠ ็ญ่ดน\n1000 Processed\n classify content\n243000 0 ๅธฎไฝ ไนฐ็š„้“พๆŽฅๅคๅˆถๅˆฐ๏ผš้˜ฟ้‡Œๅฆˆๅฆˆๆท˜ๅฎๅฎข\n243001 0 ๅคฉไบฎไบ†ๆˆ‘่ฏฅ็ก่ง‰ไบ†ๅฎ‰ๅคง่‹ๅทž\n243002 0 7ๆœˆ30ๅทๆŽ้นค็”Ÿๆ—ฅๅฟซไนไบ”ๅนดไบ†\n243003 0 ไธ่ฟ‡ๅ’Œ้˜ฟ้‡Œๅฝฑไธš็š„ๅˆไฝœ่ฟ˜ๆ˜ฏๅ€ผๅพ—ๆœŸๅพ…\n243004 0 ๅœจๆฑŸ่‹ๅคชไป“ๅŽฟๆ›พๆœ‰็š‡ๅฎถ็š„ๅคง็ฒฎไป“\n1000 Processed\n classify content\n243500 1 ๆˆ‘ๅ…ฌๅธไปฃๅผ€ๅ„็งๆญฃ่ง„ๅ‘็ฅจ๏ผŒ่ฏทๅ›ไฟ็•™๏ผŒไปฅๅค‡ๆ€ฅ็”จ๏ผ\n243501 0 ่ฟ™ๆ˜ฏๆ˜จๆ—ฅ14ๆ—ถ่‡ณไปŠๆ—ฅ21ๆ—ถ็š„ๆ„Ÿๅ—\n243502 1 ๅฐŠๆ•ฌ็š„้กพๅฎข๏ผšๆ‚จๅฅฝ๏ผๅ—ๅคง้“œ้”ฃๆนพ่ถ…ๅธ‚ๅœจxๆœˆxๆ—ฅ่‡ณxๆœˆxxๆ—ฅๆžๆตชๅฅ‡ๆด—่กฃๆถฒๆดปๅŠจใ€‚๏ผˆ็‰นไปทไบงๅ“ๅฆ‚ไธ‹๏ผšxx...\n243503 0 ๆœ‰ๆœ‹ๅ‹็งฐๅ“ฅๆ˜ฏ่ฎพ่ฎก้™ข้‡Œๆœ€่ƒฝ่ฏด็š„\n243504 0 ็ช็„ถๅ‘็Žฐtaylor็š„ๅทฎไธๅคš้ƒฝๆ˜ฏๆ— ๆŸ้Ÿณ่ดจ\n1000 Processed\n classify content\n244000 0 ๅˆๅผ€ๅง‹ไนฐไธๅˆฐๅฐ็ฑณๆ‰‹ๆœบๆ€’ไนฐ่”ๆƒณkxxx\n244001 0 ็œ‹ๅˆฐๆ—ฅๆœฌๆ”ฟๅบœๅฏนไบŽ็พŽๅ›ฝ็š„็›‘ๅฌ่กŒไธบๅชๆ˜ฏ่กจ็คบ้—ๆ†พ่€Œไธๆ•ขๆŠ—่ฎฎ\n244002 0 xๅ…ƒใ€้ฒๆŸๆŸๅฎถๅฑž่Žท่ต”xxxxxxx\n244003 0 10ๅคงNBA็ƒๆ˜Ÿๆฐธๆ’่ขซ้ป‘็‚น\n244004 0 GHairRecipe่œ‚่œœๆฐดๆžœๆ— ็ก…ๆฒน้…ๆ–นๆด—ๅ‘ๆฐด/ๆŠคๅ‘็ด \n1000 Processed\n classify content\n244500 0 ๅคงๅฎถไน˜ๅ็”ตๆขฏ็š„ๆ—ถๅ€™่ฆๆณจๆ„ๅฎ‰ๅ…จ\n244501 0 ๆญคๆฌกxxxๆกๅœ่ฝฆ็ฎก็†็คบ่Œƒๅคง่ก—้‡็‚นๆ•ดๆฒปๅŒบๅŸŸไธป่ฆๅˆ†ไธบโ€œ็ฆๅœไธฅ็ฎก่ก—โ€ๅ’Œโ€œๅœ่ฝฆๅ…ฅไฝ่ง„่Œƒ่ก—โ€\n244502 0 ๅˆ˜็›Š่ฐฆๆ˜ฏ2008ๅนดๅๅคงๆ–ฐ้—ปไบบ็‰ฉ\n244503 0 ้ฆ™้š…ๆดพๅ‡บๆ‰€ๆฐ‘่ญฆๅฏน่พ–ๅŒบๅฑ…ๆฐ‘็–‘้šพๆˆทๅฃ่ฟ›่กŒ่ฐƒๆŸฅ\n244504 0 ๆ™š้ค๏ผšๅŠ็ข—็ฑณ้ฅญใ€่ ่่œœใ€ไธ€็ฒ’็ปด็”Ÿ็ด ใ€ไธ€็ฒ’DHA\n1000 Processed\n classify content\n245000 0 ๅŠ ไน‹ๅŒ็‘žBใ€ๅˆ›ไธšๆฟBใ€ๅทฅไธšxBใ€็…ค็‚ญBๅŸบใ€่ฏๅˆธB็บง\n245001 0 ๅฐ†่Žทๅพ—็”ฑๅพๅทžๆ…ˆ้“ญไฝ“ๆฃ€ไธญๅฟƒๆไพ›็š„ใ€ไปทๅ€ผ500ๅ…ƒ็š„่ดตๅฎพไฝ“ๆฃ€ๅกไธ€ๅผ \n245002 0 ๅŠ ๆฒนๅฎžไน ็”Ÿ็œ‹ๅˆฐxx้›†ๆˆ‘ๆ‰็Ÿฅ้“้ƒไธปไปปๅ’Œ้‚ฃไธช่ฐๆ˜ฏไธ€ๅฎถ\n245003 0 
่ฆ็ญ‰ๅŠไธชๅฐๆ—ถๅŽ้ฃžๆœบๅ†ๆฌก่ตท้ฃž\n245004 0 ๆ†‹่ฏด่ฏๆˆ‘ๆญฃๅœจๅผบ่ฃ…ไธ€ๅœบ่ฏด่ตฐๅฐฑ่ตฐ็š„ๆ—…่กŒ้ฃžๆœบๅปถ่ฏฏไปจ็‚นๆˆ‘่กจ็คบๅพˆๅผ€ๅฟƒ\n1000 Processed\n classify content\n245500 0 ๅ…ถๅฎž่ทฏไบบ่ดจ็–‘ๅœ่ฝฆๆ•‘ไบบ่€…ๆ˜ฏ่‚‡ไบ‹่€…\n245501 0 ๆˆ‘ๆญฃๅœจไธ‹็€้›จ็š„ๆ— ้”กไนž่ฎจ็€็”Ÿๆดป็š„ๆƒๅˆฉ\n245502 0 ็Žฐๅœจ็š„่ญฆๅฏŸ้ƒฝๆ˜ฏ่ฟ™ไนˆไธไฝœไธบ็š„ๅ—\n245503 0 ๅฐฑๆƒณ็Ÿฅ้“้ฃžๆœบๆ™š็‚นไธ่ตท้ฃžๅฐฑ่ฎฉๆ—…ๅฎขไธ€็›ดๅๅœจๆœบ่ˆฑ้‡Œไนˆ\n245504 0 ่พ—่ฝฌไปŽๅŒ—ไบฌ้ฃžๅˆฐๅ—ไบฌๅˆไปŽๅ—ไบฌ้ซ˜้“ๅˆฐไธŠๆตท\n1000 Processed\n classify content\n246000 0 ๆœ€ๅฏๆถ็š„ๆ˜ฏ่ฟ˜ๅŠซๆŒไบ†xxxไธŽ้˜ฟ้‡Œ็š„DNS\n246001 0 ๅ‰ฏๅฑ€้•ฟ็งŸๆˆฟๅญ˜ไธƒๅ…ซๅƒไธ‡ๅ—่ดฟ็Žฐ้‡‘ๅบ”ไบ†ๅ•ฅ่ฏ\n246002 0 ้‡œๅฑฑๅ—ๆตท่ญฆๅฏŸๅŽ…ๅˆ‘ๆณ•ๆœๆŸฅ้˜Ÿๅœจ่ฏฅ็”ทๅญๆ‰‹ๆœบไธญๅ‘็Žฐๅคง้‡ๅนด่ฝปๅฅณๆ€ง่บซ็ฉฟๆฏ”ๅŸบๅฐผ็š„็…ง็‰‡\n246003 1 ไฝ ๅฅฝ ๆˆ‘ๆ˜ฏๅˆšๆ‰ๅ’Œไฝ ่”็ณป็š„้ผŽ้ผŽๆ—ฅ็››่ฃ…้ฅฐๅ…ฌๅธ็š„ๆœๅ…‰ไผš๏ผŒๅ…ฌๅธ็ŽฐๆŽจๅ‡บ๏ผŒ่ฟ›ๅบ—ๅฏไบจๆ–ฐๅนดๅผ€้—จ็บข็คผๅŒ…xxxx...\n246004 0 ๆ—ข็„ถ้ƒฝๅทฒ็ปๆ”พๅ‡ไบ†ๅฐฑๆŠŠๆกๅฅฝๆ”พๅ‡ๅชๅฑžไบŽ่‡ชๅทฑ็š„ๆ—ถ้—ดไธ“ๆณจไบŽไธ€ไบ›ๆœ‰ๆ„ไน‰็š„ไบ‹ไธขๆމๆ‰‹ๆœบ\n1000 Processed\n classify content\n246500 0 ๆœ€ๆ—ฉๅ…จ็งฐไธบๆ˜ฏSpecialWeaponsAttackTeam\n246501 0 ๅฅฝๅคšๅฎขไบบ้—ฎ่ฟ™ไธชpolaๆžๅ…‰้™ๅฎšๅฅ—่ฃ…\n246502 0 ๆ ‡้ข˜ๅฎ‹ๅฎ‰ไธœNHL้€‰็ง€ๆˆๅŠŸ็ˆฑไฝ“่‚ฒ็š„ๅฎถๅบญๅŠ ๆฒน\n246503 0 ็ŽฐๅธธๅทžๅฏŒ่ฑชๆฒƒๅฐ”ๆฒƒ่ฟๅŠจ่ฝฟ่ท‘S60L่ฏ•้ฉพ่ฝฆ้€€ๅฝนๅ•ฆ\n246504 0 5ใ€้ฅญๅ‰ไธ€ๆฏ่œ‚่œœๆฐดๆŠ‘ๅˆถ่ƒƒ้…ธ\n1000 Processed\n classify content\n247000 0 ไฝ†็œŸ็›ธๆ€ปๆ˜ฏๆฏ”ไฝ ๆƒณ็š„่ฆๆฎ‹้…ทๅพˆๅคš\n247001 1 ๆ–ฐๅนดๅฅฝ๏ผŒ้ป„ๆตฉ้ฃŸๅ“ๅบ—็ป™ๆ‚จๆ‹œไธชๆ™šๅนด๏ผŒๆ„Ÿ่ฐขๆ‚จๅŽปๅนดๅฏนๆœฌๅบ—็š„ๆ”ฏๆŒ๏ผŒไปŽๅณๆ—ฅ่ตท๏ผŒๅœจๆœฌๅบ—่ดญ็‰ฉๆปกxxๅ…ƒ๏ผŒๅณๅฏ้š...\n247002 0 ๆพณๆดฒColoxyloraldrops็ผ“่งฃๅฉดๅนผๅ„ฟไพฟ็ง˜/ไธŠ็ซๅฃๆœๆปดๅ‰‚30ml\n247003 0 ไฝ ้ซ˜้พ„โ€ฆโ€ฆblablablaโ€ฆโ€ฆๅฅฝๅคšๅ› ็ด ็š„\n247004 0 ๆ™‹็…ค้›†ๅ›ขๅคงๅŠ›ๅฎžๆ–ฝโ€œๆ–‡ไฝ“ๆƒ ่€ใ€ๆ–‡ๅŒ–ๅ…ป่€โ€ไธบ่€ๆœๅŠกๆ–ฐๆˆ˜็•ฅ\n1000 Processed\n classify content\n247500 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ไปฌๅ˜ฟๅฎขๅบ—็š„่Šฑ็Ž‹็ŽฐๅœจๅšๆดปๅŠจ๏ผŒๅŠ›ๅบฆ่ฟ˜ๆ˜ฏ่›ฎๅคง็š„๏ผŒๅŽŸ่ฃ…ๆ—ฅๆœฌ่Šฑ็Ž‹็บธๅฐฟ่ฃค้กบไธฐ็›ด้‡‡๏ผŒไธ‰ๅ…ซ็‰นๆƒ Lxx...\n247501 0 
็”ฑ้”กๅฑฑๆ–‡ไฝ“ๅฑ€ๅ’Œ้”กๅฑฑ็Žฐไปฃๅ†œไธšๅš่งˆๅ›ญ่”ๅˆไธพๅŠž็š„้”กๅฑฑๅŒบ2015ๅนดๅฐ‘ๅ„ฟๆธธๆณณ้‚€่ฏท่ต›ๅœจๅ†œๅšๅ›ญๆธธๆณณ้ฆ†ๅœ†ๆปก่ฝๅน•\n247502 1 ๅนฟ็ปฟ็Žฏไฟๅœจ่‚ƒๅฎๅŽฟ่‚ƒๆฐด่ทฏไธœ๏ผŒ็Žฐๆ‹›ไธšๅŠกๅ‘˜(xxๅนด่–ช่พพxxxxxxๅ…ƒ+)๏ผŒ็ฝ‘็ปœ่ฅ้”€ๅ‘˜(xxๅนด่–ช่พพx...\n247503 0 ๅชๆœ‰่Šฑๅƒ้ชจๅฎŒ็ป“ๆˆ–ๅœๆ’ญ่‚กๅธ‚ไนŸๅฐฑๆขๅคๆญฃๅธธไบ†\n247504 0 ๅˆ†ไบซๅฝญๅ•†ๅผบไธŽ่ฏๅˆธ็‰ฉ็†ๅญฆ็š„ๅšๆ–‡ๅ›พ็‰‡๏ผš20150807ๅฝ“ๅ‰่‚กๅธ‚ๅ„ๅคงๅ‘จๆœŸK็บฟๆŠ€ๆœฏๅˆ†ๆž\n1000 Processed\n classify content\n248000 0 ๆ›ดไธ็›ธไฟก่ฟ™ๆ˜ฏๅœจไธบTesiroๆ‰“ๅนฟๅ‘Š\n248001 0 ไธบไป€ไนˆ6p็ณป็ปŸๅ‡็บงๆˆb243ๅŽ\n248002 0 PS๏ผšๆˆ‘ๅฏๆฒกๆœ‰ๅœจไบบๆฐ‘ๅนฟๅœบๅƒ็‚ธ้ธก\n248003 0 ไบบ้ฃžๅ‡บๅŽป็ ธไธญไบ†ไธ€่พ†่ญฆ่ฝฆๆ™•ไบ†\n248004 0 ่ฎพ่ฎกๅธˆๆฒกhowilldisappear\n1000 Processed\n classify content\n248500 0 ็œ‹ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณๆœ€่ฎฉไบบๆ„ŸๅŠจ็š„ไธๆ˜ฏๅ”ฑ็š„ๆœ‰ๅคšๅŠจไบบ\n248501 0 ๅ‰ๅ…‹้šฝ้€ธ่กŒๅคด่ฟ‘ไบ”ไธ‡ไนฐไธๅˆฐๆœบ็ฅจ้ญ่ดจ็–‘\n248502 0 ๆŠŠไธ€ๅๅ‘ๅ…ถๅ‹’็ดขxๅ—้’ฑ็š„่ญฆๅฏŸๅ’Œ่ญฆๅฏŸๅŽ…ๅ‰ฏๅŽ…้•ฟๆžไธ‹ไบ†ๅฐ\n248503 0 ๅฎž้™…ๆœ€็ปˆ่พพๅˆฐ21ๅบฆ็š„็ซ‹ๅผ็ฉบ่ฐƒๅฏนๅ†ฒ\n248504 0 ็ ”็ฉถๅ‘็Žฐๆฏๅคฉๅ–3ๆฏๆˆ–ๆ›ดๅคšๅ’–ๅ•กๅฏไฝฟ2ๅž‹็ณ–ๅฐฟ็—…้ฃŽ้™ฉ้™ไฝŽ37%ไปฅไธŠ\n1000 Processed\n classify content\n249000 0 ็ป“ๆžœไป–้ฉฌไธŠๆ‰“ๅผ€็ฎกๅฎถ็š„็”ต่„‘่ฏŠๆ‰€\n249001 0 ใ€ŽJongSukใ€150725ๅ›พ็‰‡โ†’LOCK&amp\n249002 0 ๅŽŸๅฎšไธญๅˆ12ๆ™‚่ตท้ฃ›ๅ‰ๅพ€็พŽๅœ‹่ŠๅŠ ๅ“ฅ\n249003 0 ๆ˜ฏไธๆ˜ฏไฝ ๅฏน\"ๆœ‹ๅ‹\"่ฟ™ไธช่ฏ็š„ๅฎšไฝๆœ‰้—ฎ้ข˜ๅ‘ข\n249004 0 xๆฑ‡็އ๏ฝžxxxๅˆฐๆ‰‹๏ฝžmoto็Ÿญ่ฃค๏ฝžๅชๆœ‰xxx\n1000 Processed\n classify content\n249500 1 ๆ–ฐๅนดๅฅฝ๏ผŒๆœฌๅบ—ไปŠๅคฉๆญฃๅฅฝๅผ้—จ๏ผŒๆœฌๅบ—ไปฅๅท้—ธ้—จใ€ๆŠ—้ฃŽ้—จใ€้“ๅˆ้‡‘ๅž‹ๆ้—จใ€็ฝ‘ๅž‹้—จใ€ไผธ็ผฉ้—จใ€่ฝฆๅบ“้—จ็ญ‰็ญ‰ๆ‰นๅ‘...\n249501 0 TFBOYSๆญๅ–œๅ…ฉ้€ฑๅนดๅฟซๆจ‚ๆŽฅไธ‹ไพ†้‚„ๆœ‰ๆฏๅ€‹ๅๅนดไฝ ๅ€‘่ฆๅŠ ๆฒน\n249502 0 ๆœ€่ฟ‘ไฝ ็œ‹่Šฑๅƒ้ชจๅ’ŒๅŠ ๆฒนๅฎžไน ็”Ÿไบ†ๅ—\n249503 0 ๅฟ…้กป้€š่ฟ‡ๅพฎ่ฝฏๆต่งˆๅ™จๆ‰่ƒฝๆ‰“ๅผ€็ฝ‘้กตๅ›พ็‰‡\n249504 0 ไธŠๆตทๅธ‚้—ต่กŒๅŒบไบบๆฐ‘ๆณ•้™ขๆฐ‘ไบ‹่ฃๅฎšไนฆ\n1000 Processed\n classify content\n250000 0 ๅฐๆˆทๅž‹่ฎพ่ฎกไนŸ่ƒฝ้€š่ฟ‡่ฝฏๆœจ่ƒŒๆ™ฏๅข™ใ€ๅœฐๆฏฏๆ‰“้€ ๅˆซๅข…็š„่ฑชๅŽๆ„Ÿ\n250001 0 
ๅ’Œ้—บ่œœ่Šไบ†ไธ€ๆ™šไธŠๆ™šไธŠ็š„้ฃžๆœบๅ›žๅฎถๆฌง่€ถ็œ‹ไนฆไนฆๅŽปๅ’ฏ๏ฝž\n250002 0 GoogleๆฏไธชๆœˆๆŸฅ่ฏข้‡1000ไบฟๆฌก\n250003 0 ็„ถๅŽๅŒป็”Ÿๅˆ่ฏด่ฟ˜่ฆๆ‹”ไธค้ข—ๅฐฝๅคด็‰™\n250004 1 ๅœฃไฝ›ๆตท้ญ‚ๆ—ฅๅŒ–็ฅๆ‚จ๏ผšไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅฟซไน๏ผ็ŽฐๆดปๅŠจๅฆ‚ไธ‹๏ผšไผŠ่ด่ฏ—็ฌฌไธ€ๆฌพๆญฃไปท๏ผŒ็ฌฌไบŒๆฌพx.xๆŠ˜๏ผ›่‡ช็„ถๅ ‚ๆœ‰็ง’ๆ€...\n1000 Processed\n classify content\n250500 0 ไปŠๆ™šๆœ‰่Šฑๅƒ้ชจ็œ‹ไปŠๆ™šๆœ‰่Šฑๅƒ้ชจ็œ‹ไปŠๆ™šๆœ‰่Šฑๅƒ้ชจ็œ‹\n250501 0 ๅŽŸ็ณป็Ž‹ๆŸ่ขซๅซŒ็Šฏๅฎถๅฑž็”จxxไธ‡ๆ”ถไนฐ\n250502 0 G15wๅธธๅฐ้ซ˜้€Ÿ็”ฑๆญๅทžๅพ€ๅธธ็†Ÿๆ–นๅ‘68K้™„่ฟ‘ไบ‹ๆ•…็Žฐๅœบๅค„็†็ป“ๆŸ\n250503 0 x็บงๅœฐ้œ‡ๅฝ“ๅœฐ็ฝ‘ๆฐ‘่ฐƒไพƒ็งฐโ€œๅฎถๅธธไพฟ้ฅญโ€\n250504 0 ๆžช6000ๅคšๆ”ฏโ€”โ€”่”ๆƒณๆœ‰ไบบ้€ ่ฐฃ่ฏด๏ผšไธญๅ›ฝ8ๅนดๆŠ—ๆˆ˜\n1000 Processed\n classify content\n251000 0 ้˜ฟ้‡ŒๅŽปๅ•Šๆ—…่กŒ่Š้บปไฟก็”จ็งฏๅˆ†ๆ–ฐๅŠ ๅก็ญพ่ฏๆ‹’็ญพ\n251001 0 ไฝฟ็”จๆ‰‹ๆœบ็ฝ‘็ปœ่ฟžๆŽฅๆฏ”WiFiๆ›ดๅฎ‰ๅ…จ\n251002 0 ๅ›žๅคMKLC83้ข†ๅ–80็พŽๅ…ƒๅˆทๅก้‡‘\n251003 0 ไธ€ๅฎกๆณ•้™ขไปฅ้žๆณ•ๅธๆ”ถๅ…ฌไผ—ๅญ˜ๆฌพ็ฝช\n251004 0 ๆˆ‘TไนŸ่ฆๅ‚ๅˆๅˆฐ่ฟ™ไธช็‰Œๅญๅ‡บ้ฃžๆœบๆฏๆƒน\n1000 Processed\n classify content\n251500 0 ็Žฐไปฃไธญๅผ่ฃ…ไฟฎๆกˆไพ‹โ€”โ€”่ฅ้€ ๆธฉ้ฆจ็š„ๅฎถ็Žฐไปฃไธญๅผ่ฃ…ไฟฎ้ฃŽๆ ผๆกˆไพ‹่žๅˆไบ†ๅบ„้‡ไธŽไผ˜้›…ๅŒ้‡ๆฐ”่ดจ\n251501 0 โ€œๅทดๅ“ˆ้ฉฌไธŽๆŠคๅฃซ้ฒจๆธธๆณณโ€ๆˆ‘็š„็ง’ๆ‹ไฝœๅ“\n251502 0 ไป”็ป†่†ๅฌๅฐ่ดฉไธ€้ๅˆไธ€้็š„ๅ†ๅ–\n251503 0 ๅ†ๅŽป็œ‹็™พๅบฆ็Ÿฅ้“ๅ•†ๅŸŽ็š„็‰ฉไปทๅทฒ็ป้•ฟๅˆฐไบ†ๅคฉไปท\n251504 1 ๅฐŠๆ•ฌ็š„็”จๆˆทๆ–ฐๅนดๅฅฝ๏ผไฟก้˜ณ็งปๅŠจๅ…ƒๅฎตไฝณ่Š‚ๆœŸ้—ดxๆœˆxๆ—ฅ-xxๆ—ฅ๏ผŒ้’ˆๅฏนๆ‚จ่ฟ™ๆ ท็š„xGๆ‰‹ๆœบๅฎขๆˆท๏ผŒๆŽจๅ‡บ็ผดxx...\n1000 Processed\n classify content\n252000 0 ๆ—ฅๆ–‡wikiๆŽจ่็š„้…็ฝฎๆ˜ฏใ€Œ้‡ๅทก+้ฃžๆœบๆฌงๆ น็”ตๆŽข\n252001 0 51ๅฒๅ‘จๆกƒไบšไผธๅ‡บๆดๆ‰‹ๆฅๅˆฐไบบๆฐ‘่ทฏ็š„็Œฎ่ก€ๅฑ‹็Œฎ่ก€\n252002 0 ๆ˜Ÿ็พŽๅทฒ็ปไธŽ็™พๅบฆๅˆไฝœๆŽจๅ‡บไบ†โ€œๆ˜Ÿ็พŽ็™พๅบฆๅกโ€\n252003 0 ๆˆ‘็œŸๆ€•ๆœ‰ไธๆ˜Ž็œŸ็›ธ็š„ไบบ้”™ๆŠŠ่ฟ™ไธชๅฝ“่“่Ž“ๅƒไบ†ๅ•Š\n252004 0 ๆœฌ่ฝฎๅฏนๆฑŸ่‹่ˆœๅคฉ็ƒๅ‘˜ๆถๆ„้“ฒๅฎŒๅดๆ›ฆๅŽ่ฟ˜ไธ€ๅน…ไธๅพ—ไบ†็š„ๆ ทๅญ\n1000 Processed\n classify content\n252500 1 
xๆœˆxๆ—ฅๆˆฟ่ถ…็ฝ‘ๅฎถ่ฃ…ๅ›ข่ดญxxxๅฐš้ซ˜xๅ‡บๆฐด่Šฑๆด’็ญ‰Nๅคš่ถ…ไฝŽไปทไบงๅ“xๆŠ˜่ตท๏ผŒไธ้™้‡ๆŠข ่ดญ๏ผŒๆฅๅฐฑ้€ๆดไธฝ้›…...\n252501 0 ็ป“ๆžœ่ฟ™ๆฌกๅ›ž็จ‹่ˆช็ญๅŠๅคฉๅ†…ๆŽจ่ฟŸไบ†ไธ‰ๆฌก\n252502 0 ่ขซๅˆ่‚ฅๅธ‚ๅบ้˜ณๅŒบๆณ•้™ขไธ€ๅฎกๅˆคๅค„ๆœ‰ๆœŸๅพ’ๅˆ‘ไธ€ๅนด\n252503 0 ๅฏๆƒœ็š„ๆ˜ฏ็”ฒ้ชจๆ–‡้‡‘ๆ–‡ๅฑ•ๅŽ…็”ฑไบŽ็ญนๅค‡ไธ่ถณๆฒกๆœ‰ๅผ€\n252504 0 150710ASBeetwitterๆ–ฐๆดปๅŠจๆตทๆŠฅ\n1000 Processed\n classify content\n253000 0 ๅฏนๅ›ฝๅ†…่ญฆๅฏŸๅŸŽ็ฎก่ดช่ตƒๆž‰ๆณ•ๆฌบๅŽ‹่€็™พๅง“ๅด่ง†่€Œไธ่ง\n253001 0 ๆˆ‘็”จbabyskinๅฐฑไธไผš่ฟ‡ๆ•\n253002 0 ๅพฎไฟก็”ต่„‘็‰ˆ็”จๆˆท็Žฐๅทฒๅฏไธ‹่ฝฝๆ›ดๆ–ฐ\n253003 0 ๅ–œๆฌขๅ‡บ็งŸ่ฝฆ้‡Œๆฒ™ๅ“‘็š„radio\n253004 0 ๅœจP2P็ฝ‘่ดทๅนณๅฐๆไพ›ๆ‹…ไฟไธšๅŠกๅนถๆ”ถๅ–ไธ€ๅฎš็š„ๆ‹…ไฟ่ดนใ€ๆœๅŠก่ดน็ญ‰\n1000 Processed\n classify content\n253500 0 ็Ÿฅ้“็™ฝๅญ็”ปไธบไป€ไนˆๅฎ่ดŸ่Šฑๅƒ้ชจ\n253501 0 4็”ทไบบๅฏนไบŽ่กฅ่‚พใ€่„‚่‚ช่‚5่ดซ่ก€ไบบ็พค้˜ฟ่ƒถๅ–„ๆฒปๅ„็ง่ก€็—‡6ไบšๅฅๅบทไบบ็พค้˜ฟ่ƒถ่ƒฝๆœ‰ๆ•ˆๆๅ‡ๆœบไฝ“็™ฝ็ป†่ƒž่ฟ™ไธชๅ…็–ซ...\n253502 0 ่กจ็šฎๅฑ‚็š„็ป†่ƒž้—ด่ดจๅ’Œ็œŸ็šฎๅฑ‚็š„ๅผนๅŠ›ๆฐดๅˆ่ƒถๅซ้‡่ถŠๅ……่ถณ\n253503 0 BBๅฎถ็ƒญๅ–ไบงๅ“ๅ†ฌๅคฉๆœ€ๅฅฝ็”จ็š„่บซไฝ“ไนณ\n253504 0 ๅฐ†ๆˆไธบไธญๅ›ฝ็ฌฌไธ€ๅฎถไธŠๅธ‚้‡‘่žไบ’่”็ฝ‘ๅž‚็›ดๆœ็ดขๅผ•ๆ“Ž\n1000 Processed\n classify content\n254000 0 ไฝ ็Žฐๅœจๅฑ…็„ถ่ฟžๅ››ๅคงๅๆ•่Šฑๅƒ้ชจ้ƒฝ็œ‹ไบ†\n254001 0 ๆˆ‘ๅทฒ็ป้ข„ๆ„ŸB็ซ™ๅฐๅญฆ็”Ÿ่ฆๆŠŠ่ฟ™้ƒจ้ช‚ๆˆ็‹—ไบ†\n254002 0 ๅš็‰ฉ้ฆ†ๅปบ็ญ‘้ข็งฏ2500ๅนณๆ–น็ฑณ\n254003 0 ๅ‰งๆƒ…้€†่ฝฌโ€œๅƒตๅฐธ่‚‰โ€ๆˆ–ไธบๅ‡ๆ–ฐ้—ปโ€”โ€”้‚ฃ้‡Œๆ˜ฏ็œŸ็›ธ\n254004 0 ๆฅ่‡ชFMxxxxxxx็œŸๆฐดๆ— ้ฆ™โ€”่•พ\n1000 Processed\n classify content\n254500 0 ๅ…ถไธญๅ…ฌๅ…ฑ่ดขๆ”ฟ้ข„็ฎ—ๆ”ถๅ…ฅxxxxxxไธ‡ๅ…ƒ\n254501 0 ๆ„ฟๆ‰€ๆœ‰็ฉ†ๆ–ฏๆž—ๆ–‹ๅŠŸ็พŽๆปกๅนธ็ฆๅ‰็ฅฅ\n254502 0 ๅˆฐ่ทๅ…ฐไบ†ๅไบ†9ไธชๅฐๆ—ถ้ฃžๆœบๅพ…ไผš่ฟ˜่ฆ่ฝฌๆœบ็Žฐๅœจๆ˜ฏ่ทๅ…ฐๆ—ถ้—ด15\n254503 0 ็บฆ20ๅˆ†้’ŸๅŽๅ–ไธ‹้ขๅทพๅŠŸๆ•ˆ๏ผšๅฏไปฅๆœ‰ๆ•ˆๆ”นๅ–„่„ธ้ƒจ่‰ฒๆ–‘\n254504 0 ไฝ ไปฌ็š„APP้ข„่ฎก้€่พพๆ—ถ้—ด่ƒฝ้ ่ฐฑ็‚นไนˆ\n1000 Processed\n classify content\n255000 0 ไธ‹้ฃžๆœบๆˆ‘ๅฐฑๅธฆไฝ ไปฌๅŽปๅƒ้‡ๅบ†็ฌฌไธ€ไธฒไธฒ\n255001 0 ่‡ณๆญคๆฒฟ็บฟ14ๅบงๆ–ฐๅปบ็ซ่ฝฆ็ซ™ไธปไฝ“ๅ…จ้ƒจๅปบๆˆ\n255002 0 
ๅฅฝๅฅฝไฟๆŠค่‡ชๅทฑ0็ˆฑไฝ ็š„An\n255003 0 ้‚ฏ้ƒธไธญ็บงไบบๆฐ‘ๆณ•้™ขๅฎฃๅˆคไธ€่ตท็ณปๅˆ—็›—็ชƒใ€ๆŠขๅŠซ็พŠๅชๆกˆ\n255004 0 DxxไปŠๅคฉๅŽปๆ‰Žไน™่‚็–ซ่‹—็ฌฌไบŒ้’ˆ\n1000 Processed\n classify content\n255500 0 ๆˆ‘็”ต่„‘้ƒฝๅ…ณไบ†่ฟ˜ๆƒณๆฅ็œ‹ๅถๅƒๆฅไบ†\n255501 0 1็”จ360ๆ›ดๆ–ฐwin10ไธๆ˜ฏๅŽŸ็”Ÿๅ†ๆขๅคๅ‡บๅŽ‚\n255502 0 ๅณๆฑŸๅŒบๆ”ฟๅบœๅทฒไผšๅŒ็›ธๅ…ณ้ƒจ้—จ็ ”็ฉถ็ –ๅŽ‚ๆ•ดๆฒป้—ฎ้ข˜\n255503 0 ๅŽไธบไผ ่พ“่ฎพๅค‡ๅธธ่งๅ‘Š่ญฆๅซไน‰ๅŠๅค„็†ๆ–นๆณ•\n255504 0 ๆญฃๅผๅ‘ฝๅไธบMOUNTPAVILIA\n1000 Processed\n classify content\n256000 0 ๆ‰“็ ดไบ†ไปŽๆ™บ่ƒฝๆ‰‹ๆœบ่ง’ๅบฆๅ‡บๅ‘็š„ไผ ็ปŸ่Š‚็”ตๆ€่ทฏ\n256001 0 \\nๅฐ็ขŽ่Šฑ็š„่ฃ™ๅญไธ‹ๅพ€ไธŠๆŒ‰ๆ‘ฉไธ‰ๅˆฐไบ”ๅˆ†้’Ÿ็ˆชไธญๆ‹ฏๆ•‘ๅ‡บๆฅ\n256002 1 ใ€ๆธฉ้ฆจๆ็คบใ€‘ๆ˜Žๅธˆๆ•™่‚ฒๅ†œๆž—ๆ กๅŒบๅฐ†ไบŽxๆœˆx๏ผŒxๆ—ฅไธพ่กŒไธญ่€ƒ็Šถๅ…ƒ็ญ็คบ่Œƒ่ฏ•ๅฌ่ฏพ๏ผŒๅŒๅญฆๅฏไปปๆ„้€‰ๆ‹ฉๆฒกๆœ‰ๆŠฅ่ฏป...\n256003 0 ๆœ‰ๆฒก็ป“ๅฉš้šพ้“ๅ—ไบฌไธŠ็ฝ‘ๆŸฅไธๅ‡บๆฅๅ—\n256004 0 ๅ—ไบฌๆฐดๆธธๅŸŽๅฏน้ข็š„ไธ€ๅฎถ้‡‘้™ต้ธญ่ก€็ฒ‰ไธๆฑคๅƒไธ‡ๅˆซๅŽป\n1000 Processed\n classify content\n256500 0 ไฟๅฎ‰ๆŠ“ๅฐๅท่ฟ™ๆ˜ฏๆญฃไน‰่กŒไธบ้ญๅˆฐๆŠฅๆœ\n256501 0 ไปŠๅคฉๅœจๆต™ๆฑŸๅคงๅญฆไธบไผไธšๅฎถๅˆ†ไบซ็งปๅŠจไบ’่”็ฝ‘ๆ—ถไปฃ็š„ๅ•†ไธšๅ˜้ฉๅŠๅˆ›ๆ„ไผ ๆ’ญ็ฎก็†\n256502 0 ๆธธๆˆไธญ็š„โ€œ็Žฏ็ƒๅฝฑไธšUniversalโ€ๆฏซไธไธบ่ฟ‡\n256503 0 ไปŠๅนดๅพ—ไบ‰ๅ–ๆŠŠไธ‰ๆฎตๅ’Œ่ฃๅˆค่ฏ็š„่€ƒ่ฏ•่ดน่ตšๅ‡บๆฅ\n256504 1 ๅนณๅฎ‰้“ถ่กŒๅˆ่‚ฅๅ…ฅ้ฉปโ€ฆๅนณๅฎ‰ๆ‹›่˜๏ผŒไฝ ๆƒณๅŠ ๅ…ฅๅ—๏ผŸไธ€๏ผŒ ๆœๅŠก้กน็›ฎ๏ผšx๏ผŒไปฃ่กจๅ…ฌๅธไธบๅนณๅฎ‰่€ๅฎขๆˆทๅŠž็†็†...\n1000 Processed\n classify content\n257000 0 5๏ผŽไธๅธธ่ง็š„่›‹็™ฝ่ดจๆฐจๅŸบ้…ธๆ˜ฏๅœจ่›‹็™ฝ่ดจๅˆๆˆๅŽไฟฎ้ฅฐๅฝขๆˆ็š„\n257001 0 ไธบไป€ไนˆๅŒป็–—ไฟ้™ฉ่ถŠๆฅ่ถŠ้ซ˜ๅŽปๅนดไธ€ไบบ60็Žฐๅœจไธ€ไบบ90ๆˆ‘ๅฎถๆœ‰18ๅฃไบบๅทฎไธๅคš่ฆ2000ๅ…ƒ้‚ฃ้‡Œๆฅ้’ฑไบค\n257002 0 intelligent่ฟ˜ๆ˜ฏsmart\n257003 0 ่ฟ˜ๅœจๅ…ฌไบคไธŠ่ขซๅฐๅทๆŠŠๆˆ‘ๆ‰‹ๆœบๅท่ตฐไบ†\n257004 0 ไฝ†ๆ˜ฏไบ‹ๅฎž็š„็œŸ็›ธๆ˜ฏๅ‘่ดข็š„ๆœบไผšๆ€ปๆ˜ฏ็ป™้‚ฃไบ›ไธ็Ÿฅ้“ๆ›ดๅคšๅฐฑ่ƒฝๆ˜Žๆ™บ็š„่กŒๅŠจ็š„ไบบ\n1000 Processed\n classify content\n257500 0 ๅบ†ๅฎ‰ๆณ•้™ข็งฏๆžๅผ€ๅฑ•ไบ†ๆฐ‘ๅ•†ไบ‹ๅฎกๅˆคๅบญๅฎก่ง‚ๆ‘ฉๆดปๅŠจ\n257501 0 ๆฅ็žง็žงไฝ ็š„ๆ˜Ÿๅบง้€‚ๅˆไนฐ่‚ก็ฅจๅ—\n257502 0 
ไบ”ๅŽๅŸŽ็ฎกๆŠคๅ›ฝๆ‰งๆณ•ไธญ้˜Ÿๅฏน็ฅฅไบ‘็พŽ้ฃŸๅŸŽๅ‘จ่พนๅบ—ๅค–็ป่ฅ\n257503 0 ไนŸไปฅๆœ€้ซ˜ๆธฉ23โ„ƒ็จณๅฑ…ๅ…จๅธ‚ๆœ€ๅ‡‰็ˆฝ็ฌฌไธ€ๅ\n257504 0 ่ฟ›็ณป็ปŸไน‹ๅ‰ๆ‰€ๆœ‰usb่ฎพๅค‡ๆ— ๆณ•ไฝฟ็”จ\n1000 Processed\n classify content\n258000 0 ๅ‡็บงwin10ๅŽ็š„ๅ‡ ไธช้—ฎ้ข˜๏ผšไบฎๅบฆ่ฐƒ่Š‚ๅคฑๆ•ˆ\n258001 0 ๅฝผๅพ—ยท่’‚ๅฐ”่ฎคไธบไธ่ฎบๆ˜ฏๆŠ•่ต„่ฟ˜ๆ˜ฏ็”Ÿๆดป้ƒฝ้ตๅพชๅ†ฅๆฌกๆณ•ๅˆ™\n258002 0 ๅ›žๅˆฐๅ—ไบฌๆ„Ÿ่ง‰ๆ•ดไธชไบบๅˆๆดปไบ†่ฟ‡ๆฅ\n258003 0 ๆœ‰ๆ—ถ่ฟ˜ไผšๅฐ‘ๆ‰“ๆˆ–ๅคšๆ‰“โ€ฆโ€ฆโ€ฆโ€ฆไธ€ๆฏ›้ƒฝไธๅ€ผ\n258004 0 ๅนธๅฅฝๆœ‰ๆœบๅ™จไบบ็ป™ๆˆ‘ๅฝ“็ฟป่ฏ‘ๅ†ๅฝ“ๆˆ‘็š„ไฟๅง†\n1000 Processed\n classify content\n258500 0 ไบคๆˆฟๆ—ถ็ฒพ่ฃ…ไฟฎๆœช่พพ้€ ไปท็š„ๆ ‡ๅ‡†\n258501 0 ไธญ้•ฟๆœŸๆŠ•่ต„่€…ๅทฒ็ปๅผ€ๅง‹็ฆปๅœบไบ†\n258502 0 ่ฎพ่ฎกไธ€ไธช็”จๆฅไบซๅ—็š„ๅซๆตด้—ดๆ˜ฏไธชๅพˆ่ตž็š„้€‰ๆ‹ฉ\n258503 0 ไปฅๅ—่ดฟ็ฝชใ€็บตๅฎน้ป‘็คพไผšๆ€ง่ดจ็ป„็ป‡็ฝช็ญ‰\n258504 0 ๅคช็ฉบๆฃ‰ๅฎๅ…ฐๆ’ž่‰ฒๅ…ฐ็ซ–ๆก่ฎพ่ฎก่ง†่ง‰ๆตๆ„Ÿ่ถ…่ตž\n1000 Processed\n classify content\n259000 1 โ€œๅ่ดตโ€ไปŠๆ˜Žไธคๅคฉๅ•ค้…’xxx/ไบบๆ— ้™็•…้ฅฎ๏ผๆœฌๅ…ฌๅธๆ–ฐๆ‹›ไธ€ๆ‰นxxๅŽ็š„็พŽๅฅณ๏ผŒไธชไธชๆ€งๆ„Ÿ้ฃŽ้ชš่ฟ˜ๆœ‰ๆ›ดๅคšๆƒŠๅ–œ...\n259001 0 ็งฏๆžๅฏนๆŽฅไบฌไธœใ€้˜ฟ้‡Œ็ญ‰็Ÿฅๅ็”ตๅ•†ไผไธš\n259002 0 2ใ€ๅบ”่ฏฅๅฏไปฅ่ฏดๆ˜ฏโ€œ่ฟๆณ•่ฟ่ง„ไบบๅ‘˜โ€\n259003 0 ้‚ฃๆˆ‘ๅ€’ไธๅฆ‚ๆ‹”ๆމๆ‰‹ๆœบๅกๆ’่ŠฑๅฌๆญŒ็œ‹็œ‹ไนฆๅ‘ๅ‘ๅ‘†\n259004 0 ๅ„ๅŠ ๆฒน็ซ™็ฆๆญขๅฏนๆ— ็‰Œๆ— ่ฏ็š„ๅ†œ็”จไธ‰่ฝฎ่ฝฆใ€ๆ‘ฉๆ‰˜่ฝฆๅŠ ๆฒน\n1000 Processed\n classify content\n259500 0 ไบค้€šๆŒ‡ๅผ•๏ผšๆญไน˜ๅœฐ้“ไบ”ๅท็บฟ่‡ณ็ŒŽๅพท็ซ™Dๅ‡บๅฃไธ‹\n259501 0 ๆœ‹ๅ‹ๅœจ่ก—้“ๅฃๅผ€็š„ๅบ—iphoenๆ‰‹ๆœบ\n259502 0 ็œŸ็š„ๅพˆ้ฌผๅ‘€ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็ฌฌไบŒ้›†ไธญๆจๅฎๅฟƒไปŽๅ”ฑๆญŒๅˆฐ็‚น่ฏ„้‚ฃ่‹ฑ็š„ๅคดๅ‘ไธ€ไผš็›ดไธ€ไผšๅท\n259503 0 ็ดขๅฐผPS4็š„้”€้‡ไธ€็›ดๆฏ”ๅพฎ่ฝฏXboxOneๅคง\n259504 0 ็ฆ็ปตๅŒบไบบๆฐ‘ๆณ•้™ขไปฅไฟก็”จๅก่ฏˆ้ช—็ฝชไธ€ๅฎกๅˆคๅค„ๆขๆŸๆŸๆœ‰ๆœŸๅพ’ๅˆ‘ไบŒๅนด\n1000 Processed\n classify content\n260000 0 ๆฏๅคฉ้ƒฝๅœจ่ดจ็–‘ไบบ็”Ÿไธญๅบฆ่ฟ‡??\n260001 1 ๅฐ้™ˆ็ฅๆ‚จๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒ่Š‚ๅŽๆœ‰้œ€่ฆๅŠžไฟก็”จๅก๏ผŒ่ดทๆฌพ็š„ๅฏไปฅ็›ดๆŽฅ่”็ณปๆˆ‘ใ€‚\n260002 1 
ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏ้ข้˜ณ่ทฏ๏ผˆๅŽŸๅกž็Ž›็‰น๏ผ‰ไธญๅ•†้›…ๅฃซๅˆฉๅฏผ่ดญ๏ผŒ็Žฐ้›…ๅฃซๅˆฉๅฅถ็ฒ‰ๅ…จๅœบxๆŠ˜ๆ—ถ้—ดไธบx๏ฝžxๆ—ฅ๏ผŒๆฌข่ฟŽๆ‚จๆฅ้€‰่ดญ...\n260003 0 ไธŠ็ญ็š„ๅœฐๆ–น็”ตๆขฏๅไบ†๏ฝž๏ฝž๏ฝžๅไบ†โ€ฆโ€ฆโ€ฆๆœฌๅฎๅฎไธ€ๅฃๆฐ”็ˆฌไบ†ๅๅ‡ ๆฅผ\n260004 0 ๅคธไบ†ๅ‡บ็งŸ่ฝฆๅธˆๅ‚…ไธ€ๅฅๆŠ€ๆœฏๅฅฝ็ป“ๆžœๅฐฑๆ’’ๆฌขไบ†้ฉฌ่ทฏๆ˜ฏไฝ ๅฎถๅ•Š\n1000 Processed\n classify content\n260500 0 ๅฆˆๅฆˆ่ฆ็ป™ไป–ๆ“ๅŠžๅฉš็คผ่ฃ…ไฟฎๆˆฟๅฑ‹\n260501 0 ๅŠชๅŠ›ๆ‰“้€ ๆต™ๆฑŸ็œ็ฌฌไธ€็ง‘ๆŠ€ๆ–ฐ้—ป้—จๆˆท\n260502 0 ้›†ๅ›ข็š„ๅ‘ๅฑ•้€Ÿๅบฆๅ–ๅ†ณไบŽไบบๆ‰ๆขฏ้˜Ÿ็š„ๅปบ่ฎพๆƒ…ๅ†ต๏น‰\n260503 0 ้‡‘ๅฎ‰ๅŒบๆ—…ๆธธๅฑ€็›ธๅ…ณ่ดŸ่ดฃไบบ้™ชๅŒ\n260504 0 ไฝ†21ๅฒ็š„็”Ÿๆ—ฅๆ„ฟๆœ›็Žฐๅœจ่ฎธไธ‹ๅบ”่ฏฅไธ็ฎ—ๅคชๆ™š\n1000 Processed\n classify content\n261000 0 ๆ‰็Ÿฅ้“้™†้ฃŽXxๅค–่ง‚ๅ›ฝๅ†…ไธ“ๅˆฉ็š„็”ณ่ฏท่ฆๆ—ฉไบŽ่ทฏ่™Žๆžๅ…‰\n261001 0 ่ฟ™่ฎฉ็™พๅบฆ็š„ๅ‘ๅฑ•ไธๅ†ๅ—ๅˆถไบŽๆŸไธช้ซ˜็ฎก็š„ๅŽป็•™\n261002 0 ่ฑกๅพTIFFANY็ปๅ…ธๅ…ƒ็ด ็š„ๆ–‡ๆ˜Ž\n261003 0 ่ฟ™ๆฌกๆœตๅ”ฏiSuperS3็š„่ถ…็ช„่พน่ฎพ่ฎกๆก†่พพๅˆฐไบ†็š„2\n261004 0 Burt'sBeesMamaBeeBellyButterๅฐ่œœ่œ‚ๅซ็ปดE้˜ฒๅฆŠๅจ ็บน้œœ\n1000 Processed\n classify content\n261500 0 ไน‹ๅ‰ๆปก่„ธ็—˜็—˜ๅ’Œ็ฒ‰ๅˆบ็š„ๆˆ‘ๅฐฑๆ˜ฏไธ€ไธชๆ•™่ฎญ่ฟ˜ๅฅฝ็Žฐๅœจๅผฅ่กฅๅ›žๆฅไบ†\n261501 0 ็„ถๅŽ็œ‹ๅˆฐๆˆ‘้‚ฃๆฅผ็›˜40ๅคšๆก่ฏ„ไปท37ใ€8ๆก้ƒฝๆ˜ฏๅทฎ่ฏ„\n261502 0 ่ฏๅˆธ็Šฏ็ฝชๅชๆ‰“ๅ‡ป๏ผšxๅ†…ๅน•ไบคๆ˜“\n261503 0 ้‚ฃไธ€ไผ™้ชšๆปดๆปดๆนฟๅ“’ๅ“’็š„็œŸ็š„ๆƒจไธๅฟ็น\n261504 0 ๆŠนไธชๆ‰‹ๆœบไธŠ็š„ๆธฃๆธฃๆ‰‹ๆป‘ๆŒ‰ๅˆฐๆ‹็…ง่‡ชๅŠจไบบๅทฅ็งป่ฝดโ€ฆ\n1000 Processed\n classify content\n262000 0 ่ฎค่ฏไฟกๆฏไธบโ€œๅ—้€š้“ƒๅ…ฐๅ•†่ดธๆœ‰้™ๅ…ฌๅธ็ฝ‘็ปœๆŠ€ๆœฏ่ดŸ่ดฃไบบโ€\n262001 0 ็Ž‹็’ๅœจ่…พ่ฎฏ่ง†้ข‘้™ชๆˆ‘ๅบฆ่ฟ‡ไบ†1ๅฐๆ—ถ\n262002 0 ไปŠๅนดๅ—ไบฌๅนณๅฎ‰ๅคœๅ’Œๅœฃ่ฏžๅคœ้ƒฝๆ˜ฏโ€œๅ†ฐๅ†ปๅคœโ€\n262003 0 ้ฝ้ฝๅ“ˆๅฐ”้“่ทฏ่ฟ่พ“ๆณ•้™ขๅ…จไฝ“ๅนฒ่ญฆ็ป่ฟ‡1ไธชๅคšๅฐๆ—ถ็š„่ฝฆ็จ‹ๆฅๅˆฐ็œ็บง็ˆฑๅ›ฝไธปไน‰ๆ•™่‚ฒๅŸบๅœฐโ€”โ€”ๆฑŸๆกฅๆŠ—ๆˆ˜็บชๅฟต้ฆ†\n262004 0 ไพ็„ถๆ˜ฏๅ…ธๅž‹็š„่‹ๅทžๅ›ญๆž—้ฃŽๆ ผ\n1000 Processed\n classify content\n262500 0 ๆฅๅๆ˜ ๅ‡บๆญปๅˆ‘็Šฏๅฏนๆญปไบกโ˜…โ˜…โ˜…ๆ›ดๅคš่ฏฆๆƒ…๏ผš\n262501 0 ๆญๅทžไธ€็”ตๆขฏๅ†โ€œๅƒไบบโ€๏ผšไธ€ๅไฝๅœจ16ๆฅผ็š„ๅฅณๅญ\n262502 1 ็Žฐๆœ‰ๅ‡ 
ๅฅ—้ป„ๆฒณๆบๅ›ฝ้™…ๅŸŽ็š„ๆˆฟๅญ๏ผŒ็Žฐๅทฒๅ›ข่ดญไปทๅฏนๅค–ๅ‡บๅ”ฎ๏ผŒ้ข็งฏxxx-xxx-xxx-xxx็ญ‰ไธๅŒ็š„ๆˆท...\n262503 0 ใ€Žๅˆธๅ•†๏ผš็”Ÿ็‰ฉๅŒป่ฏ้œ‡่กๅธ‚ไธš็ปฉๆˆ้•ฟ็กฎๅฎšx่‚กๆŽ€ๆณขๆพœใ€\n262504 0 MulancyๅŠๆฐธไน…ๅŒ–ๅฆ†็ฌฌๅ…ญๆœŸๅ…จ็ง‘็ญ็š„ๅญฆๅ‘˜ไปฌๅฟซไน่š้ค\n1000 Processed\n classify content\n263000 0 ็œๅ„ฟๅŸบไผšๅทฒๆ”ถๅˆฐ็ˆฑๅฟƒๆ706540ๅ…ƒ\n263001 0 ไนŸๅ’ŒไปŠๅคฉไธ€ๆ ท้ซ˜ๆธฉxxๅบฆๆˆ‘ไปฌ่€ๆฑ‚ๆ–ฝๅทฅ้˜Ÿไผไธ‹ๅˆx็‚นๅผ€ๅทฅๅˆฐๆ™šx็‚นxx\n263002 0 ๅˆ›ไธšๆฟ้‡ๅ›žxxxx็‚นๅคงๆถจ้€พx%\n263003 0 12ๆ—ฅ10086ๅ›žๅค่ฏดๅŽๅฐๆ˜พ็คบๆˆๅŠŸไบ†\n263004 0 ๅˆ†ไบซไธ€ไธชๅ–œๆ„Ÿ๏ผš็Žฉๅฆๅ…‹ไธ–็•ŒๆŠŠ้ผ ๆ ‡็Žฉๅไบ†\n1000 Processed\n classify content\n263500 0 ็ฌฌไบŒๅŠๅฒไปฅๅ‰็š„ๅฉดๅ„ฟไธๅฎœๅƒ้ธก่›‹ๆธ…\n263501 0 ๆœฌๆกˆ่ฃๅˆคๆ–‡ไนฆ้šๅŽๅฐ†ๅœจไธญๅ›ฝๆณ•้™ข่ฃๅˆคๆ–‡ไนฆๅฎ˜ๆ–น็ฝ‘็ซ™ๅ…ฌๅธƒ\n263502 0 ๅ—ไบฌๅœฐ้“S1ๅท็บฟไบŒๆœŸใ€S7ๅท็บฟ้ข„่ฎก2017ๅนดๅปบๆˆ\n263503 0 ๅœจๆ–ฐไธ‰ๆฟไธŠๅธ‚็š„ๅˆธๅ•†ๅ…ฑๆœ‰4ๅฎถ\n263504 0 ้‚ฃไบ›ๅ†…็ฝฎ้”‚็”ต็š„ๆ‰‹ๆœบใ€psp็ญ‰่ฎพๅค‡ไนŸ่ƒฝ้€š่ฟ‡ๅฎƒ็›ดๆŽฅไพ›็”ตไบ†\n1000 Processed\n classify content\n264000 0 ๅ—ไบฌ่™็ซฅๅ…ปๆฏๆŽๅพ็ด๏ผšไปŽๆฒกๆƒณ็œŸๆญฃไผคๅฎณ่ฟ‡ๅญฉๅญ\n264001 0 ๅฐ†ๆˆ็ซ‹ไธ€ๆ”ฏ็”ฑ120ๅไบค่ญฆใ€็‰น่ญฆ็ญ‰ไบบๅ‘˜็ป„ๆˆ็š„็‰นๅˆซๆ‰งๆณ•้˜Ÿไผ\n264002 0 ็ฝ‘ๅ€ๆ˜ฏ่ฟ™ไธช่ฏดๆ˜ฏๆต™ๆฑŸ้“ถ่กŒๅกๅทๆ˜ฏ่ฟ™ไธช\n264003 0 ไฝไบŽๆต™ๆฑŸๅคฉๅฐๅนณๆกฅ็›†ๅœฐ็š„ๅง‹ไธฐๆบช็•”\n264004 0 ็›‘ๆ‹่ฑๆ‚ฆๆ’žๅŠณๆ–ฏ่Žฑๆ–ฏๅคฉไปท่ต”ๅฟ่ขซๅ…็ฝ‘ๅ‹่ดจ็–‘็‚’ไฝœ\n1000 Processed\n classify content\n264500 0 ไธญๅคฎ็”ต่ง†12ๅฐ็คพไผšไธŽๆณ•่Š‚็›ฎไธญ็š„ๅฅณๅฉฟ็Šถๅ‘Šๅฒณๆฏๅ’Œๆˆฟไบงๅ…ฌๅธ็š„ๅˆค็ฝšๆ˜ฏไธๆญฃ็กฎ็š„\n264501 0 ๅ่…ๆˆ˜ไบ‰่ƒœๅˆฉๅ†ฒๆ˜ๅคด่„‘่€Œๅฟ˜ไบ†ๆ้ซ˜ๆฐ‘็”Ÿ\n264502 0 ๆƒณๅŽป่ฟžๅกไฝ›ๆŠŠdrsebagh็š„VC็ฒ‰ไนฐไบ†\n264503 0 ๆ‰€่ฐ“โ€œๆฒก่ฏๆฎโ€ไธ่ฟ‡ๆ˜ฏไธไฝœไธบ็š„ๆ‰˜่ฏ็ฝขไบ†\n264504 0 ็กฎๅฎž็œ‹่งๅŸŽ็ฎกๆ‹ฟ็งค็ ฃๅ‡ปไธญ้‚“ๆญฃๅŠ ่„ธ้ƒจ\n1000 Processed\n classify content\n265000 0 ็Œช้ผป่ดดไธป่ฆๆ˜ฏไปฅๅŽป้ป‘ๅคด็ฒ‰ๅˆบไธบไธป\n265001 0 ๅƒ็€็พŽๅ‘ณxๅฐๆ—ถ่ƒฝ่ท‘xxxxๅ…ฌ้‡Œไบบ่ฟ˜ๆ˜ฏ้‚ฃไธชไบบ\n265002 0 ๅฐฑๅƒ่ฅฟๅฎ‰่ญฆๅฏŸๆ‰“ๆฒณๅ—่ญฆๅฏŸไธ€ๆ ท\n265003 0 
5ๆœˆไปฝๅˆšไนฐ็š„ๅ…ซๅƒๅ—้’ฑ็š„้›…้ฉฌๅ“ˆๆ‘ฉๆ‰˜่ฝฆๆฒกไบ†\n265004 0 ๆถˆ่ดน่€…ๆƒ็›ŠไฟๆŠคๆณ•็ญ‰ๆณ•ๅพ‹็ปดๆŠค่‡ชๅทฑ็š„ๅˆๆณ•ๆƒ็›Š\n1000 Processed\n classify content\n265500 0 ไปฅๆญคไผ˜ๅŒ–ๅ…ฌๅธ็š„ๆŠ•่ต„็ป“ๆž„ใ€ๆๅ‡ๅ…ฌๅธ็ปผๅˆ็ซžไบ‰ๅŠ›\n265501 0 ๅŒป็”Ÿๅ†™ๅœจ็—…ๅކไธŠ็š„ไธœ่ฅฟๅซๅญ—ไนˆ\n265502 0 ็”ทๅฅณ้ƒฝ่ฆๆŠค่‚ไปฅ่ก€ไธบๆœฌไปฅ่‚ไธบๅคฉ่‚ๆœ‰ๅคšๆธ…่„‘ๆœ‰ๅคšๆธ…\n265503 0 ไธๆณ•ๅˆ†ๅญๅฅ—็”จๆน˜ๆฝญๅธ‚ๆฃ€ๅฏŸ้™ขๅ่ดชๅฑ€็”ต่ฏไผๅ›พ่ฏˆ้ช—\n265504 1 ๅนฟไธœๆธ…่ฟœ้พ™ๆน–ๅฅ‡็Ÿณๆ–‡ๅŒ–ไบงไธšๅ›ญๅฐ†ไบŽxxxxๅนดxๆœˆxๆ—ฅ--xๆœˆxxๆ—ฅไธพๅŠžๅฅ‡็Ÿณๅš่งˆไผš๏ผŒ็Žฐๆœ‰ๅฐ‘้‡ๆˆทๅค–ๅฑ•...\n1000 Processed\n classify content\n266000 0 ๆˆ‘่ทŸๅบ—ๅฎถ็บ ็บทๅฐฑๆ˜ฏๆˆ‘ๆๅ‡บ่ฏๆฎไน‹ๅŽๅบ—ๅฎถๅฑ้ƒฝไธๆ”พไธ€ไธช\n266001 0 ๅค–้ข้ฃŽๅˆฎๅพ—ๅƒๅผ€ไบ†ๅ‡ ๆžถ่ฝฐ็‚ธๆœบ\n266002 0 ๆœ‰็ฝ‘ๅ‹ๆ›ๅ…‰็™พๅบฆใ€360ใ€ไบฌไธœ็ญ‰ไบ’่”็ฝ‘ๅ…ฌๅธๅŠ ็ญไธ€ๆ—็บท็บทๅœจๅŠžๅ…ฌๆฅผไธ‹ไธพ็‰Œโ€œๆˆ‘่ฆๆ”พๅ‡โ€\n266003 1 ๆˆ‘ไปฌๆŽๅฎถ่ฏšๅ’Œ่ฎฐ้ป„ๅŸ”๏ผŒๅœจๅ‘จๆตฆไธ‡่พพๅนฟๅœบๆ—๏ผŒๆ‰“้€ ็š„ๅŸŽๅธ‚ๅž‹็ฒพๅ“ๆฐดๅฒธๅˆซๅข…๏ผŒxAๆœŸๅณๅฐ†็››ๅคงๅผ€็›˜๏ผŒ่ถ…ๅ‡บๅ‰ๆœŸ...\n266004 0 GooglePlay้‚ฃไธ€ๅฅ—ไธœ่ฅฟ้‚ฃๆ ท่ฎพ่ฎกไธขๅˆฐๆฟๅญไธŠ็”จๆ„Ÿ่ง‰ๅพˆไธ่ˆ’ๆœ\n1000 Processed\n classify content\n266500 0 ๅŽๅค‡็ฎฑ้ƒฝๆ”พไธไธ‹้ข~็ง’ๆ€็š„่ฑกๅฐไฟๆธฉๆฏๅพˆ็พŽไธฝๅ•Š\n266501 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ iq2397ไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n266502 0 ๅฟไธไฝไธŠๅ‡ ๅผ ๅˆšๅฏผๅ…ฅ็”ต่„‘็š„ๅŽŸๅ›พ\n266503 0 3ใ€็”ต่„‘ๅ‘จๅ›ดๅ †ๆ”พไนฆๆœฌใ€ๆ‚ๅฟ—็ญ‰ๅฏ็‡ƒ็‰ฉ\n266504 0 ๅœจ็މ่‹ๅฑฑ้กถ้ฃžๆฅไธ€ๆžถ้ฃžๆœบ็›ฎๆต‹่ฝๆธฉๅทž็š„้ฃž็š„้žๅธธไฝŽ\n1000 Processed\n classify content\n267000 0 15874109600ๅญฉๅญๅฆˆๅฆˆ\n267001 0 ่ดจ็–‘ไป€ไนˆ็š„่ฏไนŸไผšๅพˆ็›ด็™ฝๅœฐ่ฏด\n267002 0 ๅธธๅธธๅœจๅœฐ้“้‡Œ้‡ๅˆฐๆœ‰ๅฅณๆ€งไน˜ๅฎขๅƒไธœ่ฅฟใ€ๅ–ไธœ่ฅฟ\n267003 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ๆ–ฐๅนดๅฅฝ๏ผๆ–ฐๅนดๆ–ฐๆฐ”่ฑก๏ผŒไธบๆ„Ÿ่ฐขๆ–ฐ่€้กพๅฎขๅฏนใ€ๅ‡€้ฆ™ใ€‘็š„ๆ”ฏๆŒ๏ผŒๅ‡กxๆœˆไปฝๅˆฐๅบ—ๅšๆŠค็†xๆฌกไปฅไธŠ้€ไปท...\n267004 0 ๅƒ็š„ๆ—ถๅ€™ๅฏๆ”พไธ€ๅ‹บ่œ‚่œœๆˆ–่€…็บข็ณ–\n1000 Processed\n classify content\n267500 0 ๅœจๅ—ไบฌ่ฟ˜ๆœ‰่ฟ™ๆ ท็š„ๅฏๆ‹†ๅธ็‰Œ็…งๅ—\n267501 0 
ๅ››ๅท็œๆ•™่‚ฒๅŽ…ๅ…ณไบŽๆณธๅทžๅŒปๅญฆ้™ขๆ›ดๅไธบๅ››ๅทๅŒป็ง‘ๅคงๅญฆ็š„่ฎบ่ฏๆŠฅๅ‘Š\n267502 0 ็Žฐๅœจๅชๆœ‰ๅœฐ้“็ซ™ๆ—่พน็š„็ ดๅœ่ฝฆๅœบ\n267503 0 7ๆœˆ26ๅทๆ˜ฏไธช่ฎฉ้‡Šๆฐธไฟกโ€œๅคงๅธˆโ€ๅพˆๅคด็—›็š„ๆ—ฅๅญ๏ผšๆœ‰ไบบไปฅๅฎžๅไธพๆŠฅ็š„ๅไน‰\n267504 0 ๆˆ‘ๅœจๆ‰‹ๆœบ้…ท็‹—ๅ‘็Žฐโ€œๅฐไนไนbiuโ€็š„็ฒพๅฝฉ่กจๆผ”\n1000 Processed\n classify content\n268000 0 com็Žฐๆœ‰MadpaxFullๅฝฉ่‰ฒๅˆบ็Œฌๆฝฎ่ƒŒๅŒ…\n268001 0 ๅ‘ๆถˆๆฏ้—ฎๅˆๆฒกๆœ‰ๅš้˜ฟ้‡Œๅทดๅทด็š„\n268002 0 ๆฒชๆŒ‡ไธไป…ๅœจ3600็‚น้™„่ฟ‘ๆจช็›˜่พพ5ๅคฉ\n268003 0 ๅฎฃๅธƒๅฏๅŠจไธ€็ณปๅˆ—็™พๅบฆๅ†…้ƒจ้กน็›ฎๅฏนๅค–ๅผ€ๆ”พๅธๅผ•ๆŠ•่ต„่€…็š„โ€œ่ˆชๆฏ่ฎกๅˆ’โ€\n268004 0 ๆต™ๆฑŸๆœ‰ไธชไป€ไนˆ่ˆชๅฐๅˆšๅคŸๆƒณๆถˆๅŒ—ไบฌ้‡ๅ›ฝไผ้‚ฃๆ ท็บข\n1000 Processed\n classify content\n268500 0 ๅขๆ€€ๅบ†xxxxๅนดๆ—ถๅพˆๅนธ่ฟๅœฐๆ‰พๅˆฐไบ†ไป–ๅœจๅคชไป“็š„ๅ“ฅๅ“ฅๅงๅง\n268501 0 ๅ‰ฏ้™ข้•ฟ่Œƒๅฐไบ‘ๅœจไธญๅ›ฝ้‡‘่žๅš็‰ฉ้ฆ†็Ž‹ๅท็†ไบ‹้•ฟ็š„้™ชๅŒไธ‹\n268502 0 ๅœจ่ฏฅๅƒๅˆ้ฅญ็š„ๆ—ถๅ€™้ฃžๆœบๆ™š็‚น็œŸ็š„ๅคช็—›่‹ฆไบ†\n268503 0 ๆฏ”ๅฆ‚ๅญฆ่€…ๅž‹ๆณ•ๅฎ˜ๅ’Œ่ฟๆณ•ๆณ•ๅฎ˜็š„ๅŒบๅˆซ\n268504 0 ๆœ€่ฟ‘็œ‹็š„่ฟ™ไธชๅŠ ๆฒนๅฎžไน ็”ŸๆŒบๆ„ŸๅŠจ็š„\n1000 Processed\n classify content\n269000 0 ็‹ฌ็‰น็š„ๅ‡ไธคไปถ่ฎพ่ฎกยทๅœจ้˜ณๅ…‰ไธ‹่‹ฅ้š่‹ฅ็Žฐ\n269001 0 399006็š„ไธ€ๅคงไธป้ข˜ๅฐฑๆ˜ฏไบ’่”็ฝ‘้‡‘่ž\n269002 0 ไธญไฟก็‰ฉไธš้•ฟๆ˜ฅๅ…ฌๅธ2015ๅนดๅบฆไธญๆœŸๅทฅไฝœไผš่ฎฎๅฌๅผ€\n269003 0 ้™คๅŽป้ฆ–ๅฐพไธคไฝๆญŒๆ‰‹โ€œ้ป‘ๅคฉ้น…โ€ๅ’Œโ€œ็‹ผ็‰™โ€\n269004 0 ไปชๅพๅ›ฝไฟกๅฝฑๅŸŽ2015ๅนด7ๆœˆ20ๆ—ฅๆœ€ๆ–ฐๅฝฑ่ฎฏ\n1000 Processed\n classify content\n269500 0 ไฝ†ๅฆปๅญๅฑ…็„ถๅˆฉ็”จ้€š่ฟ‡่ฐทๆญŒ่ก—ๆ™ฏๅœฐๅ›พๆ‹ๆ‘„็š„3D็…ง็‰‡\n269501 0 Xไฟก่ฏๅˆธX้ฆ–ๅธญ็ญ–็•ฅๅˆ†ๆžๅธˆๆœ€ๆ–ฐๅคง็›˜็œ‹ๆณ•\n269502 0 ๆ‚Ÿ็ฉบๆ‰พๆˆฟไธ“ๆณจ็‹ฌๅฎถไบŒๆ‰‹ๆˆฟๆˆฟๆบ\n269503 0 ๅฏๆ‰‹ๆœบ็›ดๆŽฅๆ‰“ๅฐ็…ง็‰‡็š„ๆ‰“ๅฐๆœบ\n269504 0 ๅ›žๆฅๅŽ้ซ˜ๅคงไธŠ็š„ๅŒป็”Ÿๅฝข่ฑกๅ…จๆฏไบ†\n1000 Processed\n classify content\n270000 0 ๅœฐ้“ไธŠๆ—่พน็š„ๅ“ฅไปฌๆ‹ฟ็€ๅพฎๆ˜Ÿ็ฌ”่ฎฐๆœฌๆ‰“ๅผ€ๅœจ็ผ–็จ‹็•Œ้ข\n270001 0 ็›ธๅ…ณ้ข†ๅŸŸไธ“ๅฎถๅฐฑ4ไธชๆœ€ๅธธ่ง็š„ๆธธๆณณๅœฐ็‚น\n270002 0 ๅซๆ˜Ÿๅ’Œ่ˆชๆ‹็…งไธบๆŸไบ›ไบ‹ๅฏนไฝ ๆœ‰ไบ›ๅพฎ็š„ไธ€ไปฝ้ป˜ๅฅ‘ใ€ไธ€ไปฝๅนณๆทกใ€\n270003 0 
่ฟ™ไธชๅคฉๅ‡บ้—จๅˆไธ็ˆฑๆ‰“ไผž็›ดๆŽฅๆ™’ๆˆ็ขณ\n270004 0 ๅฐ‰ๅฅ่กŒไปปๅ†…ๆ›พๆŸฅๅค„xxๅชๅคง่€่™Ž\n1000 Processed\n classify content\n270500 0 ๅปบ่ฎฎๅฏน็ขฐ็“ท็š„ไบบ่ฟ›่กŒๅ…ฌๅฎ‰ๅฑ€ๅค‡ๆกˆ\n270501 0 ้šพ้“ๅŽไธบไธ็Ÿฅ้“่ฃ่€€7็Žฐๅœจๅพˆ้šพๅ–ๅ—\n270502 0 ไธ€่ทฏไธŠ็š„่ฎฝๅˆบๅ˜ฒ็ฌ‘่ดจ็–‘๏ฝžๆœ‰ๆ—ถไฝ ไผšไผคๅฟƒ้šพ่ฟ‡ไฝ†ๆ›ดๅคš็š„ไพฟๆ˜ฏไธ€็ฌ‘่€Œ่ฟ‡\n270503 0 ไธป่ง’ๆŽๅŒป็”Ÿ็š„ๆ‰ฎๆผ”่€…ๆ˜ฏ้ฆ™ๆธฏๆผ”ๅ‘˜Angie\n270504 0 ็‹ฌ้พ™ๆ—ไบบๅŸบๅ› ๆ˜ฏ็™พๅˆ†ไน‹็™พ็š„Ox็ณป\n1000 Processed\n classify content\n271000 0 ไธซไธซ540ๅคฉ๏ผšๅค–ๅฉ†็‰™็–ผ่ฆๅŽปๆ‹”็‰™\n271001 0 ไธญๅ›ฝๆณ•ๅพ‹่ง„ๅฎšไธ€ๅคซไธ€ๅฆปๅ•Šโ€ฆๆƒณๆƒณ่ฆๆ˜ฏๅ‘็”Ÿ่ฟ™ไบ›ไบ‹\n271002 0 ไบŒๆฅๆˆ‘ๆง็€ๆ‰‹ๆœบ่Šๅคฉ่Šๅˆฐๆฒก็”ต\n271003 0 ไฝ†ๆ˜ฏ็”ตๆขฏๅฃๅˆๆœ‰ๆ ‡่ฏญโ€œ่ฏทๅ‹ฟ่กŒ่ตฐโ€\n271004 0 ๆˆ‘ๆ•ขๆ‰“่ตŒไป–burpeeไธ€ๅฃๆฐ”ไธไผš่ถ…่ฟ‡1โ€ฆ\n1000 Processed\n classify content\n271500 0 ็”ทๅญฉไธบ่นญWiFiไธŠ็ฝ‘็ˆฌ้˜ฒๆŠค็ฝ‘ๅœจ้“่ฝจไธŠ็Žฉๆ‰‹ๆœบt\n271501 0 TCLๆ‰‹ๆœบ่ฏทไบ†ๅฝ“็บขๆ˜Žๆ˜Ÿ้‡‘ๅ–œๅ–„ๅšไปฃ่จ€\n271502 0 ๆˆช่‡ณ7ๆœˆ17ๆ—ฅไพ็„ถๅ…ฑๆ”ถๆๆฌพ183089\n271503 0 ็ญ”๏ผš็ปๆตŽ็ŽฐไปฃๅŒ–ใ€ๆ”ฟๆฒป็ŽฐไปฃๅŒ–ใ€็คพไผš็ป“ๆž„็ŽฐไปฃๅŒ–ใ€ๆ–‡ๅŒ–็ŽฐไปฃๅŒ–ใ€็Žฏๅขƒ็ŽฐไปฃๅŒ–ใ€ไบบ็š„็ŽฐไปฃๅŒ–\n271504 0 ๅฐŠ่Œ‚้…’ๅบ—้›†ๅ›ขๅœจ่ฅ่ฟๅ‰ฏๆ€ป็›‘ๅ…ผๅธ‚ๅœบ้”€ๅ”ฎๆ€ป็›‘ๆ›น้“ธๅ…ˆ็”Ÿๅธฆ้ข†ไธ‹\n1000 Processed\n classify content\n272000 0 ๅพˆๅคšๅฅณๆ€งๆ‚ฃไบ†ๅฆ‡็ง‘็–พ็—…ๅŽ้ฆ–้€‰็š„ๅฐฑๆ˜ฏๅŽป็œ‹่ฅฟๅŒป\n272001 0 ่ฐทๆญŒ่ฆๆ‹›SEOไธ“ๅฎถ็ป™่‡ชๅทฑๅšๆœ็ดขๅผ•ๆ“Žไผ˜ๅŒ–|ไปฅๆ’’็ง‘ๆŠ€\n272002 0 ๅซŒ็Šฏๆ’žๅไธค่ฝฟ่ฝฆๅŽ้€ƒ่„ฑ่ฝฆไธปๆŸๅคฑ่ฐๆฅ่ต”ไป˜\n272003 1 ไฝ ๅฅฝ๏ผๆˆ‘ๅธๆœ‰โ’˜%๏ผ็Ž_้ …โ‰ฎใ€‚ใ€‚ ๅฆ‚้œ€๏ผšxxx xxxx xxxxๆŽ็”Ÿ\n272004 0 Twitterๆฏๅคฉไธ€ๅฐๆŽจ้€ไปŽๆœชไธญๆ–ญ\n1000 Processed\n classify content\n272500 0 ๆฏๅคฉ่Šฑๅƒ้ชจๆ‰€ๆผ”ๅ‰งๆƒ…้‡Œ้ขๆœ‰ๅงๅง็š„ๅ…จ้ƒจ้ƒฝๆˆชๅฑไบ†\n272501 0 ๅฟƒ็†็ฝชๆ— ้”กๆ‹็š„่€ถไธ€็œผๅฐฑ็œ‹ๅ‡บๆฅๆƒน\n272502 0 ๆˆ‘่ขซไธญๅ›ฝๅผ็š„ๅฉšๅงปโ€œๅผบๅฅธโ€ไบ†\n272503 0 TotheFirstYearใ€TomyMr\n272504 0 ๆ˜ฏๆ—ฅๆœฌๅฆ‡ไบง็ง‘ๅŒป็”ŸๆŽจ่็š„ไบงๅ“~\n1000 Processed\n classify content\n273000 0 ่ฟ‘ๅนดๆฅไธญๅ›ฝๆœบๅ™จไบบๅธ‚ๅœบไฟๆœ‰้‡ๅฟซ้€Ÿๆๅ‡\n273001 0 
ๆˆ‘่™ฝ็„ถๅธŒๆœ›windows10ๆ˜ฏwinๆ‰‹ๆœบ็š„ๅคงๆœบ้‡\n273002 0 ไปŠๅคฉๅŽปๅDuffy้ฃžๆœบ็š„ไบบ่ฟ˜็œŸๅคšโ€ฆโ€ฆไธ€ๅผ€ๆŽจ็‰น่ขซๅˆทๅฑ\n273003 1 โ€œx.x่‡ณx.xๅทไผ˜ๆ‚ฆๅฑ…ๅฎถ่ฃ…็”Ÿๆดป้ฆ†๏ผŒ่ฏš้‚€ๆ‚จๅ‚ๅŠ โ€œๆƒ ๆฐ‘ๅฎถ่ฃ…ๅปบ.ๆๅคง่กฅ.่ดดๆดปๅŠจโ€ใ€‚ๆฅๅบ—ๅฐฑ้€็ฒพ็พŽ็คผ...\n273004 0 ๅ›ฝๅค–ๆฑฝ่ฝฆๅˆถ้€ ๅ•†ๅชๆ„ฟๆ„ๆŠŠไธ€ไบ›่ฟ‡ๆ—ถ็š„ๆŠ€ๆœฏๅ–็ป™ๅ›ฝๅ†…็š„ๅˆ่ต„ๅŽ‚ๅ•†\n1000 Processed\n classify content\n273500 0 ๆฐ‘่ฅๅŒป้™ขใ€็ง็ซ‹่ฏŠๆ‰€ๅทฒๅ ๆฎไบ†ๅŠๅฃๆฑŸๅฑฑ๏ผš็›ฎๅ‰ๅ…จๅ›ฝๅทฒๆœ‰่ฟ‘6\n273501 1 ไปๅฑฑๆ™บๆฐดๅทฒๆœ‰xxไฝไธšไธปๅœจๆˆ‘ๅธ็ญพ็บฆ๏ผŒ็Žฐๅผ€ๆ”พxๅฅ—ๅ’Œๆ‚จๅŒๆˆถๅž‹ๆจฃๆฟๆˆฟ๏ผŒไพ›ๆ‚จๅ‚่ง‚ๅ€Ÿ้‰ดใ€‚ๅ’จ่ฉข๏ผšxxxxx...\n273502 0 ๅ˜‰ๅ…ดๆ—ฅๆŠฅๆŠฅไธšไผ ๅช’้›†ๅ›ขๆ‰ฟๅŠž็š„2011ๅ…จๅ›ฝ่ฎฐ่€…็ซ™ๅทฅไฝœไผš่ฎฎไปฃ่กจไธ€่กŒ40ๅคšไบบ\n273503 0 ้™†ๅทๆณ•้™ข็บชๆฃ€็ป„็›‘ๅฏŸๅฎค็ป„ๆˆ็ฃๆŸฅ็ป„ๆทฑๅ…ฅ่ฏฅ้™ขๅ„ไธญๅฑ‚้ƒจ้—จ\n273504 0 ๆฒชๆ˜†้ซ˜้“้™คไบ†G82ๆฌก่ฝฆ็ฅจ็ดงไฟๅค–\n1000 Processed\n classify content\n274000 0 ๆ–ฐๆ—ถไปฃไบ’่”็ฝ‘้‡‘่ž็š„่™šๆ‹Ÿ่ดขๅฏŒๆ˜ฏๅฆๅฏ่ƒฝๆ›ฟไปฃ่‡ชๅคไปฅๆฅ็š„้ป„้‡‘่ดขๅฏŒไปทๅ€ผ่ง‚\n274001 0 ใ€Žๅฆ‚ไฝ•ๅšไธ€ๅ็œŸๆญฃ็š„ๅ‘ๅž‹่ฎพ่ฎกๅธˆใ€\n274002 1 ไบฒ็ˆฑ็š„ไผšๅ‘˜ไฝ ๅฅฝ๏ผๆˆ‘ๆ˜ฏSmๆฒƒๅฐ”็Ž›่‡ช็„ถๅ ‚ไธ“ๆŸœ็š„๏ผŒๆœ‰ไธชๅฅฝๆถˆๆฏๅ‘Š่ฏ‰ไฝ ๏ผŒๆˆ‘ไปฌx.x่Š‚่ฆๆžไธ€ๅ‘จๆดปๅŠจxๆŠ˜่ตท...\n274003 1 ๅ…่ดน่ฃ…ไฟฎ๏ผŒxไธ‡ๅทจๅฅ–๏ผ้ฆจๅฑ…ๅฐšxๆ—ฅโ€œๅ…่ดนๆ•ด่ฃ…ๆŠข็ญพไผšโ€็ปๆ€xไธ‡๏ผŒ่ฃ…ไฟฎไธ่Šฑ้’ฑ๏ผๅ†ๆŠฝiphonex๏ผๆฅ...\n274004 0 ๅฐฑๆ˜ฏไธŠๆฌก้ข„่จ€ไบšๆดฒ้‡‘่žๅฑๆœบๅ‡บๅ็š„็”ทไบบ\n1000 Processed\n classify content\n274500 0 ็œ‹ไธ‹้›†้ข„ๅ‘Šไธ‹็บงๆ˜ฏๅœจๅฑฑไธœๆ‹็š„้‚ฃไธช~้‚ฃๆต™ๆฑŸ้‚ฃๆœŸๅ‘ข\n274501 0 ไฝ ๅฐฑๅฏไปฅๅ€ŸๅŠฉ่ฎพ่ฎกๅธˆWAGAiiๅธฆๆฅ็š„ๅก‘ๆ–™่Šฑ\n274502 0 ไน‹ๅ‰้ฉฌไบ‘็ ธ10ไบฟๆฅๆŽจๅนฟๆปดๆปดๆ‰“่ฝฆๆ‚จ้”™่ฟ‡ไบ†็™พๅบฆ็ ธๅ‡ ไบฟๆฅๆŽจๅนฟ็™พๅบฆ้’ฑๅŒ…ๆ‚จๅˆ้”™่ฟ‡ไบ†่ฟ˜ๆƒณ้”™่ฟ‡่ฟ™ๆฌกๅ€Ÿ่ดทๅฎๅ—\n274503 0 ไธญๅ›ฝๅ—ๆตทๅ†›ๆผ”้œ‡ๅŠจๆฌง็พŽ่ฅฟๅช’้ข ๅ€’้ป‘็™ฝๆฑก่”‘่งฃๆ”พๅ†›ๆญฆๅŠ›ๆซๅ“โ€ฆโ€ฆๅฆ‚ๆžœ่ฟ™ไนŸๆ˜ฏโ€œๆญฆๅŠ›ๆซๅ“โ€\n274504 0 xxๅนดๅœจๅพทๅ›ฝๅผ€ๅง‹ๅšebayใ€xxๅนดๅšไบš้ฉฌ้€Š\n1000 Processed\n classify content\n275000 0 ๅผ€ไบ†ไธ‰ๅฐๆ—ถไธ“ๅฎถๅ’จ่ฏขไผšไธ€็พคไบบๆ‹ฟ็€ๅฐธไฝ“็…ง็‰‡็ 
”็ฉถๅŠๅคฉ่ฏ่ฏดๆณ•ๅŒป่ฟ™ไธช่Œไธš็œŸๆ˜ฏๅฑŒ็ˆ†ไบ†ๆ„Ÿ่ง‰ๆ•ดไธชไบบ้ƒฝๅœจๅ‘ๅ…‰ๅ“ˆๅ“ˆๅ“ˆ\n275001 0 1ไธชๅฐๅท20ๅนดๅ‰ๅท่ตฐ2ไปถๅค็‰ฉๅŽๅณๅŽ„่ฟ่ฟž่ฟž\n275002 0 WindowsPhoneๅฐฑๅƒๆ–ฐ่ƒฝๆบ่ฝฆ\n275003 0 ๅœจๆทฎๅฎ‰ไธป่ฆๅนฒ้“็ซ™ๅŠ ่ฝฆๆŸด็š„ๆœ‹ๅ‹ๆœ‰โ€œ็ฆโ€ๅ•ฆ\n275004 0 ไปŽๆˆ้ƒฝๅธ‚ๅŸŽ็ฎกๅง”ๆ‰งๆณ•ๆ€ป้˜Ÿ่Žทๆ‚‰\n1000 Processed\n classify content\n275500 0 ๆžธๆž+่Š่Šฑ๏ผš่ƒฝไฝฟ็œผ็›่ฝปๆพใ€ๆ˜Žไบฎ\n275501 1 ๅฎถ้•ฟไฝ ๅฅฝ๏ผ้ข†ๅ…ˆๆ•™่‚ฒๆ˜ฅๅญฃๆŽจๅ‡บ่ถ…ไฝŽไปทไฝœไธš่พ…ๅฏผ็ญxxxๅ…ƒ๏ผŒ้™ไธ€ไธช็ญ๏ผŒ้ขๆปกไธบๆญขใ€‚ๆฌข่ฟŽๅ‰ๆฅๅ’จ่ฏขๆŠฅๅใ€‚\n275502 0 ่ฟ™ๆ˜ฏ20ๅฒ็š„ๅฅณๅญฉๅญฃๆ˜ฅๅจŸ็š„ๅนธ็ฆ่ง‚\n275503 0 ็‹ฌ็ซ‹ๅฏปๆ‰พ้ฃŸ็‰ฉใ€ๆฐดๆบใ€่ฏๅ“ใ€่กฅ็ป™\n275504 0 ๅŠ ๅ…ฅๆžธๆžใ€ๅ†ฐ็ณ–ๅ†็…ฎxxๅˆ†้’Ÿ\n1000 Processed\n classify content\n276000 0 ๆ–ฝๅปทๆ‡‹ไธ็ฎกไปŽ่ตท่ทณ้ซ˜ๅบฆ็ฉบไธญๅงฟๆ€็š„ไฟๆŒ่ฟ˜ๆ˜ฏๅˆฐๅ…ฅๆฐดๆ—ถๅฏนๆฐด่Šฑ็š„ๆŽงๅˆถ้ƒฝๆฏ”ๆœ€ๅŽๅคบๅ† ็š„ๆ„ๅคงๅˆฉ้€‰ๆ‰‹่ฆๅš็š„ๆ›ดๅฅฝ\n276001 0 ไธ‡่พพ็™พ่ดงๅ…ณ้—ญๅ…จๅ›ฝ40ไฝ™ๅฎถ้—จๅบ—็š„ๆถˆๆฏ\n276002 0 ๆŠฅๅ็ƒญ็บฟ๏ผš18051024703\n276003 0 ๅ็‰‡ๅ็‰‡่ฎพ่ฎกๅ็‰‡็ด ๆๅ็‰‡ๆจกๆฟๅ…ฌๅธๅ็‰‡ๅ•†ๅŠกๅ็‰‡ๅนฟๅ‘Šๅ็‰‡ไผไธšๅ็‰‡็ง‘ๆŠ€ๅ็‰‡ๅˆ›ๆ„ๅ็‰‡ๅ•†ไธšๅ็‰‡้€š็”จๅ็‰‡...\n276004 0 ๅฏ่ƒฝๅ› ไธบxxไธ‡ๆŠฅไปทๆฏ”่พƒ้ซ˜ไธ€็›ดๅฐšๆœชๆˆไบค\n1000 Processed\n classify content\n276500 0 ๅฝ“ๆˆ‘ๅผ€ๅง‹่ดจ็–‘ไธ€ไปถไบ‹็š„ๆ—ถๅ€™ๅ‡†็กฎ็އๆ€ปๆ˜ฏ้‚ฃไนˆ้ซ˜??\n276501 1 ๅ…ˆ็”Ÿ๏ผŒๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๅˆš่ทŸๆ‚จ่”็ณป่ฟ‡็š„ๅนณๅฎ‰ๆ˜“่ดทๅฎขๆˆท็ป็†ๅป–ๅ…ด่๏ผŒๅŠž็†ๅ†œ่กŒใ€ๅ…‰ๅคงๆ— ๆŠตๆŠผ่ดทๆฌพxโ€”xxxไธ‡๏ผŒ...\n276502 0 ่ฟ™ไนŸ่ฎฉ็ซ็ฎญ็š„้˜ตๅฎนๅพ—ไปฅ่ฟ›ไธ€ๆญฅๅฎŒๅ–„\n276503 0 ไปฅๅŽๅๅœฐ้“ๆˆ–ๅŽปๅ•†ๅœบๅฟ…้กป่ตฐๆฅผๆขฏ\n276504 0 ็œ‹็”ต่ง†็Žฉ็”ต่„‘ๆ‰‹ๆœบๆธธๆˆๆœบipadๅˆๅคš\n1000 Processed\n classify content\n277000 0 ๆ”นๆ”นๅ…ฌ็คพ่ณฝ้“่ตฐ่กŒไผšๅ—ไบฌ็ซ™้กบๅˆฉๅฎŒๆˆ\n277001 0 ่ฎพ่ฎกๅธˆGonglueJiangๆƒณๅˆฐไบ†ไธ€ไธชๅฅฝ็š„่งฃๅ†ณๅŠžๆณ•\n277002 0 13ๅฒ็š„ๆน–ๅŒ—ๅฐๅฅณๅญฉๅฐๆŽ่ขซๅธฆๅˆฐ่ˆ’ๅŸŽๅ†œๆ‘ไธ€ๅคง้พ„็”ทๅญๅฎถ\n277003 0 ๅœจๅŽฆ้—จๅœฐๅŒบไธญ็ŸณๅŒ–ๆฃฎ็พŽๅŠ ๆฒน็ซ™\n277004 0 ๆˆ‘ๅ–œๆฌข่Šฑๅƒ้ชจๅ’Œๆœ”้ฃŽ่ฟ™ไธ€ๅฏนๅ„ฟๅ•Šๅ•Šๅ•Šๅ•Š\n1000 Processed\n classify content\n277500 0 
้ƒฝๅฏ่ƒฝ้€š่ฟ‡ไนฆๆœฌใ€็™พๅบฆๅ’Œๅ–„็Ÿฅ่ฏ†ๅค„ๅญฆๆฅ\n277501 0 ๆ‰‹ๆœบๅก็š„ๅŠๆญป็งไฟกๅ›žไธไบ†ๆ™š็‚นๅ›žๅŽปๅ†็”จ้‚ฃไธชๅ›žๅคไฝ ไปฌ้‚ฃไธชๆ‰‹ๆœบๆฒกๅธฆๅ‡บๆฅไฝ ไปฌ็š„ๆต้‡ๆˆ‘ๅทฒ็ปๅ†ฒไบ†\n277502 0 ไธชๅˆซไธๆณ•ๅ•†่ดฉ่ฟ่ง„้”€ๅ”ฎๆณจๅฐ„ๆ•ดๅฎน็”จ้€ๆ˜Ž่ดจ้…ธ้’ ไบงๅ“\n277503 1 ๆ˜Žๅคฉๆ–ฐๅˆฐOKยทไบ”ๆ˜Ÿ ไธ‰ๆ‰’ไธ€้œ– ็‰›ๅ‰ ๅ…จ็‰› ้‡ๅคงไปŽไผ˜๏ผŒ่ดงๆบ็ดง็ผบ๏ผŒๆŠ“็ดงๆ—ถ้—ด๏ผŒ้ข„่ดญ่ฏท้€Ÿ็”ต๏ผ\n277504 0 2๏ผšๅˆซ็”จ่‡ชไปฅไธบ็š„ๆ ‡ๅ‡†่ฆๆฑ‚้™ๅˆถๅฅณๆ€ง\n1000 Processed\n classify content\n278000 0 โ€ฆไฟก็”จๅกๆปž็บณ้‡‘็š„่ฎก็ฎ—ๆ–นๅผ้‡Ž่›ฎ็ฒ—ๆšด\n278001 0 ๅƒ่Šฑๅƒ้ชจๅ‰ง็ป„่ฟ™ๆ ทๅˆฉ็”จ่‡ช่บซๅฝฑๅ“ๅŠ›\n278002 0 ๆˆ‘ๅˆšๅˆšๆŠŠๆ‰‹ๆœบๅฃ็บธๆขๆˆไบ†่ฟ™ไธช๏ผšBretAdeeopensoneofhis72\n278003 0 ๅ’จ่ฏขๅฏๅŠ ๆˆ‘ๅพฎไฟกๅท๏ผš1239847679้‚“ๆฆ•\n278004 0 ่”ก่€ๅธˆ็š„็…ง็‰‡5็‚นๆ‹็š„6็‚น็ง‹็ง‹่ฟ˜็ฉฟ่ถฟๆ‹‰ๆฟ่ทŸๆœ‹ๅ‹็Žฉ\n1000 Processed\n classify content\n278500 0 ๆฒชๆŒ‡ๅทฒ็ป่ทŒๅ›ž7ๆœˆ8ๆ—ฅใ€7ๆœˆ9ๆ—ฅ้™„่ฟ‘ไฝ็ฝฎ\n278501 1 (x/x)ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตไธŠๆตทๅ…ƒ็ฅ–๏ผๅ…ƒ็ฅ–ๅ‡ฏ็‰น่Š’ๆžœ/็ˆฑๆ–‡่Š’ๆžœๆ–ฐ้ฒœ้ข„่ดญๅ’ฏ๏ผๅณๆ—ฅ่ตท-xๆœˆxxๆ—ฅๆœŸ้—ด้ข„่ดญๅณ...\n278502 0 xๅ€‹้ญ…ๅŠ›ๅ…จ้ข่งฃๆž|ETtodayๅฝฑๅЇๆ–ฐ่ž|ETtodayๆฑๆฃฎๆ–ฐ่ž้›ฒๆ‰‹ๆฉŸ็‰ˆ|ETtodayMo...\n278503 1 ใ€Šๆœ‰ใ€‹โŠฅใ€ŠๆŠตใ€‹ ใ€Šๅข—ใ€‹โŠฅใ€Šๆ‰ฃใ€‹ ใ€Šๅ€คใ€‹โŠฅใ€ŠIx%ใ€‹ xxx xxxx xxxxๅˆ˜โ€™\n278504 0 ๅ›ฝๆœ‰็‰ฉไธšๆœๅŠกไผไธšใ€็คพๅŒบ็ป„ๅปบ็‰ฉไธšๆœๅŠกๆœบๆž„ๆˆ–ไธ“ไธš็‰ฉไธšไผไธš่ฟ›้ฉป็ฎก็†\n1000 Processed\n classify content\n279000 0 ไธ€ไธชๅท็ ๅซ๏ผš15298601091ๅš็š„ๆ˜ฏไธ“ไธšๆ— ๆŠตๆŠผๅ…ๆ‹…ไฟไฟก็”จ่ดทๆฌพ\n279001 0 ๆƒณ็Ÿฅ้“ไปŠๅนดๅ—ไบฌ่ˆž่นˆๅœˆๆœ€ๅคงไบ‹ไปถๆ˜ฏไป€ไนˆๅ—\n279002 0 ๆ˜จๅคฉๆŠŠWIN10่ฟ˜ๅŽŸไบ†WIN7\n279003 0 I'matๅœฐ้“็ ๆฑŸ่ทฏ็ซ™ZHUJIANGLUStationinๅ—ไบฌ\n279004 0 ๅšไบ†ๅฟƒ็”ตๅ›พ็ป“ๆžœๆ˜ฏ็ชฆๆ€งๅฟƒๅพ‹ไธ้ฝ\n1000 Processed\n classify content\n279500 0 ๅฅฝๅฃฐ้Ÿณ่ฟ™ๆœŸๅ–œๆฌข่ตตๅคงๆ ผๅ’Œๅฅนๅ‰้ข้‚ฃไธช\n279501 0 P4bomb็œผ็ฅžๅพˆๆ”ป~~cr\n279502 1 ไบฒ็ˆฑ็š„ไผšๅ‘˜ๆœ‹ๅ‹๏ผŒๆ–ฐๅนดๅฅฝ๏ผxxxxๅนดxๆœˆxๆ—ฅโ€”xๆœˆxๆ—ฅ๏ผŒๅ‡ก่ฟ›็ปด็บณ่ดๆ‹‰ๅบ—ๆœ‰ๅ…จๅนดๆœ€ๅคง็š„ๆƒŠๅ–œๅ“ฆ๏ผๆœบไผš...\n279503 1 
ๆ‚จๅฅฝ๏ผŒๆ–นๆž—xxxxๅˆซๅข…่ฃ…้ฅฐxๆœˆxๆ—ฅ-xๆ—ฅ้‡็ฃ…ๆŽจๅ‡บโ€œๅผ€ๅนด้’œ็Œฎ็ฏๅณฐ้€ ๆžโ€ๅ…ƒๅฎต็ฏไผšไผ˜ๆƒ ๆดปๅŠจ๏ผŒ็Žฐๅœบ่ฑช...\n279504 0 ๅ‘ไป”่ง‰ๅพ—๏ผšๆŠ•่ต„็†่ดขๆ˜ฏไธ€้—จๅญฆ้—ฎ\n1000 Processed\n classify content\n280000 0 ๅ› ไธบๆขฆๆƒณ่ฟ‡็จ‹ไธญ็š„็‚น็‚นๆปดๆปดๆฑ—ๆฐดๆท‹ๆผ“\n280001 1 ่ฅฟๅ—ๅคงๅญฆ็ถฆๆฑŸๆŠฅๅ็‚นๆ˜ฅ ๅญฃๆ‹›็”Ÿๅฐ†ๅœจxๆœˆxxๆ—ฅ็ป“ๆŸ๏ผŒๅ‡ก่€ๅญฆๅ‘˜ๅธฆๆ–ฐๅญฆๅ‘˜๏ผŒๅ‡ไผš่Žทๅพ—xxx.xx/ไบบ็š„...\n280002 0 ๆž็ฌ‘/NBA็ƒๆ˜Ÿไธ‰ๅˆ†็ƒๅ‘ฝไธญ็އไธ‹้™ๅŽŸๅ› ๆ˜ฏiPhoneๅฑๅน•่ฟ‡ๅคง\n280003 0 ็พŽๅ›ฝ็ฆๆญข้ฃžๆœบๅ‘ๅŠจๆœบๆ‰€็”จ็‡ƒๆ–™ใ€้ฃžๆœบๆ‰€็”จๆถฆๆฒนใ€ๅบŸ้’ข้“็ญ‰็‰ฉ่ต„ๅ‡บๅฃ\n280004 0 ๅ†ไนŸไธไผšไนฐไบ†ๅ†ไนŸไธไผšไนฐไบ†ๅ†ไนฐไนŸไธไผšไนฐไบ†\n1000 Processed\n classify content\n280500 0 ่ตถๅฟซๆŠ“ไฝๆœบไผšๅŠ ๅ…ฅVIVAไผ—็ญนๅง\n280501 0 ไธ€5ไบบ็Šฏ็ฝชๅ›ขไผ™่ขซ่‚ฅไธœๆณ•้™ขๅˆคๅˆ‘\n280502 0 ไธ€้ฃžๆœบ็š„ไธœๅŒ—ไบบๅฅฝๅƒไธ€ๅฎถไบบไบ’็›ธๆŽจ่็€ๆณก้ขโ€œๆ•ด็‚นๅ„ฟๅง\n280503 0 ็Žฐๅฐ†ๅพๅทž่ฟœๅคง็Žป็บคๅˆถๅ“ๆœ‰้™ๅ…ฌๅธๅŽ‚ๅŒบ่ง„ๅˆ’ๆ€ปๅนณ้ขๅ›พๅŠไธป่ฆ็ปๆตŽๆŠ€ๆœฏๆŒ‡ๆ ‡ๅ‘ๅนฟๅคงๅธ‚ๆฐ‘ๅ…ฌ็คบ\n280504 1 ๅคง้‡ๅ‡บๅ”ฎ็บฏ่’™ๅ…”๏ผŒๆ•ฐ้‡ๆœ‰้™๏ผŒๅฏŒ้•‡่€ๆŽ๏ผŒ่”็ณป็”ต่ฏxxxxxxxxxxx.\n1000 Processed\n classify content\n281000 1 ๆบ๏ผŒ้€‰ๆ‹ฉๅฅฅ้€”ๅ…ฌๅ…ณ ็ณ–้…’ไผš็ปง็ปญไธบๆ‚จๆœๅŠกใ€‚ๆœฌๅ…ฌๅธไธ“ๆณจ็ณ–้…’ไผšๅทฒxๅนด๏ผŒไธ“ไธšๆไพ›็คผไปช๏ผŒๆจก็‰น๏ผŒไธพ็‰Œ๏ผŒๅ‘ๅ•...\n281001 0 ๆˆ‘ไปฌ็š„ไปทๆ ผduangduang็š„ๅพ€ไธ‹้™\n281002 0 ็ฝ‘็‚นๅ็งฐ๏ผšๆฑŸ่‹ๅ—้€šๆธฏๅŠกๅŒบ็ฝ‘็‚น็”ต่ฏ๏ผšxxxxxxxxxxxๆดพ้€ๅŒบๅŸŸ๏ผš็‹ผๅฑฑๆธฏๅทฅไธšๅ›ญ\n281003 0 ๆตดๅฎค่ฎพ่ฎก็š„ไบฎ็‚นๆ˜ฏๅข™่ง’็š„ไธ€้—ด็ŸฎๆŸœ\n281004 0 ๅซๆ˜Ÿ๏ผšaiscpxxxxๆฒกไบบๆ‰“ๅผ€็š„ๆณชๆปด\n1000 Processed\n classify content\n281500 0 ๅทฅไฝœๅ›ขๅ่ฏทๆ”ฟๅบœๅฎฃไผ ไฟๆŠคๆญค็‰นๆœ‰็‰ฉ็ง\n281501 0 ็™พๅบฆ็ฒ‰ไธๅŠ›้‡ๆˆ‘ๆ”ฏๆŒๆŽๆ˜“ๅณฐ\n281502 0 ๅฏนๆœ‰่ฏ‰่ฎผใ€ๅค่ฎฎไบคๅ‰ๆƒ…ๅ†ต็š„ๆกˆไปถๅŒๅŒบๆ”ฟๅบœๆณ•ๅˆถๅŠž่ฟ›่กŒๆฒŸ้€š\n281503 1 ๅฐŠๆ•ฌ็š„ๅ„ไฝๅฎถ้•ฟ๏ผš ๅผ€ๅญฆๆฅไธดไน‹ๅญฃ๏ผŒๆ˜Ÿ่พฐๅญฆไน ๅฎถๅ›ญไผ˜ๆƒ ๆดปๅŠจๅผ€ๅง‹ไบ†๏ผŒ้’ˆๅฏน่€็”Ÿไธ€ๆฌกๆ€งๆŠฅ...\n281504 0 ้‡็‚น้…็ฝฎๅซๆ˜Ÿๅฏผ่ˆชใ€ๆ ธ็”ตใ€็Œช่‚‰ใ€็Žฐไปฃๅ†œไธšใ€ๅ›ฝไผๆ”น้ฉ็›ธๅ…ณ็ƒญ็‚น้ข˜ๆ\n1000 Processed\n 
classify content\n282000 1 ๅฅฝๆถˆๆฏๅ“ฆ!๏ผๅ…ฌๅธไธบ่ฟŽๅ…ƒๅฎต่Š‚ๅ’Œ็พŽไธฝๅฅณไบบ่Š‚๏ผŒ็‰นๅ›ž้ฆˆๆ–ฐ่€้กพๅฎขใ€‚ๅบ—ๅ†…ๆ‰€ๆœ‰ๆฌพๅผไธ€ๅพ‹x่‡ณxๆŠ˜๏ผŒๆœ‰็ปๅ…ธ็š„้ป‘...\n282001 1 ๆทฑๅœณๅธ‚ๅ…ดๅŽๆฑฝ่ฝฆ่ฟ่พ“ๆœ‰้™ๅ…ฌๅธ ๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผๅ…ฌๅธ็ซญ่ฏšไธบๆ‚จๆไพ›้ซ˜ๆ•ˆใ€ไผ˜่ดจ็š„ๆœๅŠก๏ผ่”็ณป็”ต่ฏ๏ผšxxx...\n282002 0 ๅซๆ˜Ÿๅ’Œ่ˆชๆ‹็…งrgottenhadbeenhere\n282003 0 ๅธฎไฝ ไปฌๅ›ข็š„sๅ‡ฏๅ…”ๅ‰็”ทๅ‹้ข่†œไนฐๅฅฝๅ•ฆ็Žฐๅœจ็œ‹ๅˆฐ็š„้ƒฝๆ˜ฏๆœ‰็›’ๅญๆ™šไธŠ่ฟ‡ๅ…ณๅ‰้ƒฝไผšๆ‹†ๆމไธ็„ถๅคชๆ˜พ็œผไบ†\n282004 0 ็„ถๅŽๆˆ‘ๅœจ่‡ชๅทฑ็”ต่„‘ๅ’Œๆ‰‹ๆœบ่ฟ˜ๆœ‰ipadไธ‰ไธช่ฎพๅค‡็œ‹้‚ฃๅ‡ ไธชgif้€ŸๅบฆๅทฎไนŸๅฅฝๅคง\n1000 Processed\n classify content\n282500 0 ไธ‡ไธ€ไฝ ็ขฐๅˆฐไธ€ไธชๅ–ไธญ่ฏ็š„ไธ€ๆŠ“ไธ€ไธชๅ‡†ๅฏๆ€ŽไนˆๅŠžๅ‘๏ฝž็œŸๆ˜ฏไธบไฝ ๆ“็ขŽไบ†ๅฟƒๅ‘\n282501 0 ไธ€ๆ–‡็ง’ๆ‡‚ๆ˜†ๅฑฑโ€œๅ››ๅคงๅŒบๅŸŸโ€\n282502 0 ๅ„่ทฏ่ฎฐ่€…ๅˆๅผ€ๅง‹ๅธฆ็€ๅฎžไน ็”Ÿๆฅไธฒ้—จ\n282503 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ๆ˜ฏไธœๆน–ๅฐ้•‡้กน็›ฎ็ฝฎไธš้กพ้—ฎๅผ ๆด‹๏ผŒ ้กน็›ฎๆœฌๆœˆๆœˆๅบ•ไธพๅŠž็›ธไบฒๅ›ข่ดญๆดปๅŠจ๏ผŒๅˆฐๆ—ถไผšๆŽจๅ‡บๅ›ข่ดญๆˆฟๆบ๏ผŒๅนถไธ”...\n282504 0 ไธ‹่ฝฝๅนณๅฎ‰ไบบๅฏฟapp่‡ชๅŠฉไธ‹ๅ•\n1000 Processed\n classify content\n283000 0 ๅฐผ็Ž›ไธๅผ€้พ™่ขซ็”ต่„‘ไธ€ๆณขๆŽจๅˆฐๅฎถ้—จๅฃ็š„ๆˆ‘ๅˆไธๆ˜ฏๆฒก่ง่ฟ‡\n283001 0 ๆฒกๆœ‰ๅˆไนŽ็ซ‹ๆณ•ๆƒใ€ๅธๆณ•ๆƒใ€่กŒๆ”ฟๆƒๆƒๅŠ›่กŒไฝฟ็š„ๅŸบๆœฌๆกไปถ\n283002 0 ไปŽๆญคๅˆป่ตท\n283003 0 ๆœฌๆฅๆƒณ็€ๆœ‰้“ถ่€ณ่Žฒๅญๆžธๆž็™พๅˆ\n283004 0 10ไธ‡ๅ›พไนฆๆปก200ๅ‡100\n1000 Processed\n classify content\n283500 0 ๅ†ๅพ€ๅŽไธ‰ไฝxxx่กจ็คบๆ˜ฏ็ฌฌxxxๅคฉ\n283501 0 ไน‹ๅ‰ๆ€ป่ง‰ๅพ—็งŸ็š„ๆˆฟๅญ้‡Œwifiไฟกๅทๅฎžๅœจๅคชๅทฎ\n283502 1 ๆ€ปไปทxไธ‡่ตท็š„ไธ€็บฟๆน–ๆ™ฏ็”ตๆขฏๆด‹ๆˆฟ๏ผ้ฆ–ๆฌก็ซ็ˆ†ๅผ€็›˜ใ€‚ๆŠขๅˆฐๅณ่ตšๅˆฐ๏ผๅœฐๅ€๏ผš้‡‘ๅฑฑๅ—่ทฏไธŽ็މๅฑฑ่ทฏไบคๅ‰ๅฃ๏ผŒ่ฏš้‚€ๆ‚จ...\n283503 0 3ใ€ๆŠ“ๅ‡†็พŽๅฎนๆ—ถ้—ด๏ผšๅœจ11็‚นๅ‰ๅฐฑๅฏ\n283504 0 ็œๆ”ฟๅบœๅœจๅฎๅงๅบ„ๅฎพ้ฆ†ไธพๅŠžไธ็ปธไน‹่ทฏ็ปๆตŽๅธฆโ€œไบ’่”็ฝ‘+ไบค้€š+ๅˆถ้€ โ€ๅ…ฐๅทžๅœ†ๆกŒๅณฐไผš\n1000 Processed\n classify content\n284000 0 ๆˆ‘ๅชๆ˜ฏๆƒณๅœจๅœฐ้“ไธŠ้‡ๆธฉไธ€ไธ‹้ข„ๅ‘Š็‰‡ๅ’ŒMV\n284001 0 ๅ…จ็œๆธ…็†ๅ…ฌๅธƒๆƒๅŠ›ๆธ…ๅ•xxไธ‡ไฝ™้กน\n284002 1 
ๆ‚จๅฅฝ๏ผๅ“ฅใ€‚ๆˆ‘ๆ˜ฏ่ฑชไธ–ๅŽ้‚ฆๆ–‡้ผŽ่‹‘็š„ๆŽๅญ่ฎฐ๏ผŒๅˆšๆ‰ๅ’Œๆ‚จ่”็ณปไบ†๏ผŒๅๅฎ‰ไธŠ้ƒกๆ–ฐๅ‡บไธ€ๅฅ—xxx.xxๆ–นxxxไธ‡...\n284003 0 ไป–ๅฏไปฅๅฐ†ๆŸ็ง็–พ็—…่ฎฒๅพ—ๅคดๅคดๆ˜ฏ้“\n284004 0 ๆฑŸ่‹็œๆทฎๅฎ‰ๅธ‚้‡‘ๆน–ๅŽฟ้“ถ้›†้•‡ๆทฎๅปบๅฑ…ๅง”ไผšๆ‘ๅนฒ้ƒจ้œธๅ ๅ†œ็”ฐๅธฆๆ–ฝๅทฅไบบๅ‘˜ๆ‰“่€ไบบ\n1000 Processed\n classify content\n284500 0 ๆ‘้‡Œ90ๅฒ้ซ˜้พ„็š„่€ๅ…šๅ‘˜ๅ†’็€ๅคง้ฃŽๅคง้›จๅœจไฟฎ็ ๆ ‘ๆž\n284501 0 ๅฏๅœจๅŒป็”Ÿ็š„ๆŒ‡ๅฏผไธ‹ๆœ็”จๆญขๅ’ณ่ฏ\n284502 0 ๅฐๅทžๆฐ‘่ˆชๅฑ€้•ฟๅฐฑๅœฐๅ…่Œๅฝ“็ญๅฎ‰ๆฃ€ไบบๅ‘˜ๅ…จ้ƒจๅผ€้™ค\n284503 0 ไธ€ไธชๆฑŸ่ฅฟๆฅ็š„ๆ–ฐๅ…ตๅœจๆต™ๆฑŸๅ…ต่ฅๅƒๅฎŒ็ฌฌไธ€้กฟ้ฃŸๅ ‚้ฅญ่นฒๅœจ่ตฐๅปŠๅ“ญไบ†้ฆ–้•ฟ่ตฐ่ฟ‡ๅŽป้—ฎไธบๅ•ฅๅ“ญๆ–ฐๅ…ต่ฏด้•ฟ่ฟ™ไนˆๅคง่ฟ™ๆ˜ฏๅƒ...\n284504 1 ๆ„Ÿ่ฐข่‡ด็”ตๅทจ็Ÿณๅฑฑ็”Ÿๆ€ๆ–‡ๅŒ–ๆ—…ๆธธๅŒบ๏ผŒๆ‚จ็š„ๆฅ็”ตๆ˜ฏๅฏนๆˆ‘ไปฌๆœ€ๅคง็š„ไฟกไปป๏ผไธญๅ›ฝ็”ตไฟกๆŒ‚ๆœบๅ็‰‡ๆ˜ฏๆ‚จๆœ€ไพฟๆทใ€ๆœ€็ปๆตŽ...\n1000 Processed\n classify content\n285000 0 ็œ้˜ฒๆŒ‡ๅ†ณๅฎšไบŽ12ๆ—ฅ12ๆ—ถ่ตท็ป“ๆŸๅ…จ็œ้˜ฒๅฐ้ฃŽIII็บงๅบ”ๆ€ฅๅ“ๅบ”\n285001 0 ไธ‡็ง€ๅŒบๆณ•้™ข็š„่กŒๆ”ฟๆณ•ๅฎ˜ๆฒกๆœ‰่ฝปๆ˜“ๅฏนๆญคๆกˆไฝœๅ‡บๅˆคๅ†ณ\n285002 0 ๆฅๆŽจ่ไธ‹THANN็ดซ่‹้˜ฒๆ™’้œœ\n285003 0 ๅฅ‡่‘ฉ็š„ๅบ—ๆ‰ฌๅทž็‚’้ฅญ็ซŸ็„ถ็”จๆข…ๅนฒ่œ็‚’\n285004 0 ๆˆ‘ๅ“ฅ่ทŸๆˆ‘่ฏดไบ†ไฝ ๆฏๅคฉ็ผ ็€ไป–ๆ˜ฏไธๆ˜ฏ็š„\n1000 Processed\n classify content\n285500 0 ๆข็Ÿณๅท๏ผš่ฅฟๅฎๅธ‚ๆ”ฟๅบœๅฆ‚ๆญคโ€œๅซๅฅณโ€ๅ› ไธบๅ•ฅ\n285501 0 ๅ…ฑๅŒๆ‰“ๅ‡ป่ทจๅŒบๅŸŸๆฏ’ๅ“่ฟๆณ•็Šฏ็ฝช\n285502 0 ๆฅ่‡ชๆฑŸ่‹็œ็ปŸ่ฎกๅฑ€็š„่ฐƒๆŸฅๆ˜พ็คบ\n285503 0 ๆˆ‘ไปฌๆ‰พๅˆฐๅ››ๆ–นๅฐๅŒบๆ”ฟๅบœๅปบ่ฎพๅฑ€\n285504 0 ๆŠฅๅ20ๅท้Ÿฉๅ›ฝๅŸน่ฎญ็ญๅŠๆฐธไน…ๅŒ–ๅฆ†็š„ๅญฆๅ‘˜ไธ‹้ฃžๆœบ่ฏท่ทŸๆˆ‘่”็ณป\n1000 Processed\n classify content\n286000 0 ใ€Œ1646ไธ‡ไบบๅœจไผ ็œ‹ใ€ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็ฌฌ4ๅญฃๆทท่ก€ๅž‹็”ทๅˆ˜ไผŸ็”ท่ฟทๅ€’ๅ››ๅฏผๅธˆๅ‘จ่‘ฃๆŒบๅ›ฝ่ฏญๆญŒ๏ผšไธญๆ–‡ๆœ€diao\n286001 0 ๆฏ”ๅฆ‚ๆณข้Ÿณ787ๆ‰€็”จ็š„ๆ˜ฏๆถกๆ‰‡ๅ‘ๅŠจๆœบ\n286002 0 ่ฟ™ๅฏนไบŽไธคไธชๆ˜ŸๆœŸๅ‰ๅฏนC่ฏญ่จ€ไธ€็ชไธ้€š็š„ๆˆ‘ๅŽ‹ๅŠ›ๅฏๆƒณ่€Œ็Ÿฅ\n286003 0 ๆฏๆฌกๆŒ‰ๅŽ‹ๅ–ทๅคด1่‡ณๅฐ‘2ไธ‹ๅฐ†่ฏๆถฒๅ–ทไบŽๅ„้ผปๅญ”ๅ†…\n286004 0 ๅพฎ่ฝฏๅทฒไบŽxๆœˆๅบ•็š„buildxxxxๅผ€ๅ‘่€…ๅคงไผšๅ‘ๅธƒไบ†windowsx\n1000 Processed\n classify content\n286500 0 
ๆ—…ๆธธๅ›žๆฅๅฟƒๆƒ…ๅฅฝๅ‘ๅ‘ๆˆ‘็š„่‡ชๆ‹\n286501 0 ๅ•็‹ฌ็š„xxml็ฅžไป™ๆฐดๆŠ˜ๅŽxxx\n286502 0 ็œ‹็™พๅบฆ็™พ็ง‘ไป‹็ปๆจไผฏๅณปๅ…ˆ็”Ÿ\n286503 0 ๆœ‰ๆฒกๆœ‰ๆ‰‹ๆœบ่ฝฏไปถ็š„็›ดๆ’ญๅœฐๅ€ๅ•Š\n286504 0 ๅŒ—ไบฌไธ‹่ตทโ€œ็บขๅŒ…้›จโ€\n1000 Processed\n classify content\n287000 0 ่‚กไธœ็”ฑๆต™ๆฑŸ่‡ดไธญๅ’Œๅฎžไธšๆœ‰้™ๅ…ฌๅธๅ’Œ้™ˆๅฐ‘้žๆž„ๆˆ\n287001 0 pptๆ–‡ไปถๆฒกๆœ‰่‡ชๅŠจๅ…ณ่”o16ๅบ”็”จ\n287002 0 ๆฎ่ฏดๆ˜ฏๆ— ้”กๅฐๆœ‰ๅๆฐ”็š„่ตฐ็ฉดๆญŒๆ‰‹\n287003 0 xxxxxxxxๆธ…ๆ™จๅœจๅ›ข็ป“้™„่ฟ‘่ตฐไธข\n287004 0 ้—ฎๅฅนไธช้—ฎ้ข˜้‚ฃ็งๅพˆไธๆƒ…ๆ„ฟๆŠŠๆ‰‹ๆœบไปŽ็œผๅ‰ๆ”พไธ‹็„ถๅŽ็œ‹้ƒฝไธ็œ‹ไฝ ็„ถๅŽๅˆๅพˆ่ฝปๅฃฐๅพˆๅผ็š„ๆ…ขๅžๅž็š„่ฏดๆˆ‘ไธ็Ÿฅ้“ๆฒกๅฌ...\n1000 Processed\n classify content\n287500 0 ๆ™šไธŠๅ…ญ็‚น็š„้ฃžๆœบ้ƒฝๆฒก็ฃจๅฝ็š„ๆฒก่ตถไธŠ\n287501 0 ็œŸๅฟƒ่ง‰ๅพ—็Œฅไบต่ทŸๅผบๅฅธไธๅŒบๅˆซๅฏนๅพ…ๅฐฑๅฅฝไบ†\n287502 0 ๅ—ไบฌ๏ผšๅ…ญๅˆ็™พ็›่ทฏ็ฏๅคง็™ฝๅคฉไบฎ็€ๆตช่ดน\n287503 0 1987ๅนดไป–ๅšไบ†ไธช้ซ˜20cm็š„ๆœบๅ™จไบบ\n287504 0 ่ฏ•้ชŒใ€ๅ’จ่ฏขๅŸน่ฎญไบŽไธ€ไฝ“็š„ๆ–ฐๅ…ด่™ซๅฎณ็ฎก็†ไผไธš\n1000 Processed\n classify content\n288000 1 ็ˆฑๆ…•ๅ†…่กฃ่ฟŽโ€œไธ‰ๅ…ซ\n288001 0 ้šๆ„ๆญ้…ไธ€ไปถtๆคๅฐฑ่ถณๅคŸๅฝฐๆ˜พไฝ ็š„ๆ—ถๅฐš้ญ…ๅŠ›ๅ•ฆ\n288002 0 ้˜ฟ้‡ŒๅทดๅทดยทๆกๅŸŽไบงไธšๅธฆ็Žฐๆœ‰ไธŠ็บฟไผไธš372ๅฎถ\n288003 0 ๆฑฝ่ฝฆ้…ไปถๅฐฝๅœจๅŒๅฟƒๅ›ฝ้™…ๅ•†ๅŸŽ\n288004 0 ๅ—ไบฌๅธ‚ๆฑŸๅฎๅŒบๆฐ”่ฑกๅฐ2015ๅนด7ๆœˆ20ๆ—ฅ06ๆ—ถๅ‘ๅธƒ๏ผšไปŠๅคฉ้˜ดๅˆฐๅคšไบ‘\n1000 Processed\n classify content\n288500 0 ๅŒป็”Ÿ่ฏดๆˆ‘่ฟ™็งๆ‡’ๅทฒ็ปๆ— ่ฏๅฏๆ•‘ไบ†\n288501 0 ๅšไธบ่ฃๅˆค็š„ๆˆ‘ไปฅ่ฟ…้›ทไธๅŠๆŽฉ่€ณ็š„้€ŸๅบฆๆŽฅไฝไบ†ๆญฃๅ‘ๆˆ‘้ฃžๆฅ็š„็ฏฎ็ƒ็ƒ\n288502 1 ๆฐ”ไธธx่ข‹่ฃ…x.xๅ…ƒ/็›’ใ€‚ๆ›ดๅคšไผ˜ๆƒ ๆฌข่ฟŽ่ฟ›ๅบ—ๅ’จ่ฏขใ€‚โ€”โ€”ๆˆ‘ๆ˜ฏไธ€็ฌ‘ๅ ‚็ซ‹ๅพท่ฏๆˆฟ็š„ๅผ ๆก‚็ \n288503 0 Windows10ๆ›ดๆ–ฐไธ‰ไธชๅคšๅฐๆ—ถ่ฟ˜ๆฒกๅฅฝ\n288504 0 ไผ ๅทด่ฒ็‰นๆญฃๆดฝ่ดญ้ฃžๆœบ้›ถไปถๅ•†ไบคๆ˜“ๆˆ–่พพ300ไบฟ็พŽๅ…ƒ\n1000 Processed\n classify content\n289000 0 ไปŠๅคฉๆ‹ฟๅˆฐๅพๅทž้ผ“ๆฅผๅŒบไบบๆฐ‘ๆณ•้™ข็š„่ฃๅฎšไนฆ\n289001 0 eCareDiaryไนŸๆ˜ฏไธ€ไธช็คพๅŒบ\n289002 0 20ๅคšๅนดไธ€ไธชๅœฐๆ–นไบคไปฃไธ‹ๅŽปๆ‰ๅˆšๅˆšๅœ†ๆปก\n289003 0 ๆฑŸๅŒ—ๆดพๅ‡บๆ‰€็ป„็ป‡่ญฆๅŠ›้€š่ฟ‡็ป†่‡ดๆŽ’ๆŸฅ่ตฐ่ฎฟ่Žทๅ–ๆกˆไปถ็บฟ็ดข\n289004 0 
C21ไบฎ็™ฝ้ฎ็‘•่‰ฒ่ฎฉไฝ ่ฝปๆพๅฎž็Žฐ่ฃธๅฆ†ไธญ็š„็พŽๅฆ†\n1000 Processed\n classify content\n289500 0 ๆญคๅค–ไบš้ฉฌ้€Šไธญๅ›ฝ่ฟ˜ๅฎฃๅธƒไธŽๅฎๆด็พŽๅ‘่พพๆˆๆˆ˜็•ฅๅˆไฝœ\n289501 0 ๅฏๆถ็š„็พŽๅธๅŒป็–—ไฝ“็ณปๆœ‰ๅคšๅฐ‘ไบบๆฒก่ขซๅ‘ๅ•Š\n289502 0 2015/7/28่ฒๅ–ๅบ—็พŽๅ›ฝๅบ—ๅˆฐ่ดง้€š็Ÿฅ\n289503 1 ๅณๅญ˜ๅณ้€็™พๅˆ†็™พ xxx xxx.com ้‘ซ ๆ™ฎ ไบฌ\n289504 0 ็ฌฌไบŒๅคฉๅฐๅทไธŠ่ก—ไนฐไบ†ไปฝๆŠฅ็บธ\n1000 Processed\n classify content\n290000 0 ่ฟๆณ•ๅœฐ็‚น๏ผšGxxๆต‘ๆบๆ–นๅ‘xxxxKM+xxM\n290001 0 ๅˆšๅˆšๅœจQQ็ฉบ้—ดๅˆ†ไบซQQ้Ÿณไนไปฅๅค–็š„ๆญŒๆ›ฒ\n290002 0 ไป€ไนˆๆ—ถๅ€™ๆฅๅ—ไบฌๅ•Šๅ•Šๅ•Šๅ•Šๅ•Šๅ•Š\n290003 0 ็ ๅฎไธŽๅ›ฝ้™…ๆ•™่‚ฒๅญฆ้™ข้…’ๅบ—็ฎก็†ๆœฌไธ“ไธšๅŸนๅ…ป็š„ๅญฆ็”Ÿ่ƒฝๅ…ทๅค‡ๆŽŒๆก้…’ๅบ—็ฎก็†็›ธๅ…ณไธ“ไธš็Ÿฅ่ฏ†\n290004 0 ็คพไผšไธŠๅฅฝๅคšไบ‹ๆ˜ฏๅฝ“ๆณ•ๅฎ˜ๆ—ถ็œ‹ไธ่ง็š„\n1000 Processed\n classify content\n290500 0 ๅฎŒ็พŽ่žๅˆไบ†่ฐทๆญŒChromeๆต่งˆๅ™จ็š„ๆž้€Ÿ็‰น็‚นๅ’Œๅพฎ่ฝฏIEๆต่งˆๅ™จ็š„ๅ…ผๅฎน็‰นๆ€ง\n290501 1 ๆ„Ÿ่ฐข่‡ด็”ต้‘ซ้š†ๆฑฝ่ฝฆ็”จๅ“ๆ‰นๅ‘ไธญๅฟƒ๏ผŒไธป่ฅDVDๅฏผ่ˆชใ€ไปช่กŒ่ฝฆ่ฎฐๅฝ•ใ€็”ตๅญ็‹—ใ€ๆ”น่ฃ…ๆก†็ญ‰็ณปๅˆ—ๆฑฝ่ฝฆ็”จๅ“๏ผŒ่ดจ้‡...\n290502 0 ไปŽๆ–ฐๅˆๅŒๅฏไปฅ็œ‹ๅ‡บJLๅœจNBA็š„ๅฎšไฝๅฐฑๆ˜ฏๆฟๅ‡ณ้˜Ÿๅ‘˜\n290503 0 ๆˆ‘็œŸๅฟƒ่ง‰ๅพ—่…พ่ฎฏๆธธๆˆๅนณๅฐๅšๅˆฐๅพˆๅžƒๅœพๅพˆๅžƒๅœพๅพˆๅžƒๅœพๅพˆๅžƒๅœพ\n290504 0 ไบ‹ๆƒ…็š„็œŸ็›ธๆ นๆœฌๅฐฑๆฒกๆœ‰ๅ…ด่ถฃๅŽปไบ†่งฃ\n1000 Processed\n classify content\n291000 0 2015ๆน–ๅ—ๅธธๅพทๅธ‚็‰นๆฎŠๆ•™่‚ฒๅญฆๆ กๆ•™ๅธˆๆ‹›่˜ๆ‹Ÿ่˜ๅๅ•ๅ…ฌ็คบ\n291001 0 ๆˆ‘็”จ็™พๅบฆ่ง†้ข‘ๆ‰‹ๆœบ็‰ˆ็œ‹ไบ†โ€œๅคšๅๆœชๆˆๅนดๅฐ‘ๅฅณ้…’ๅง้™ช้…’โ€\n291002 0 ็ป่ฟ‡่ฎค่ฏ็š„ๆœ‰ๆœบ็†่กฃ่‰ๆœ‰ๅธฎๅŠฉไบŽ่ˆ’็ผ“ๅ’Œ้•‡้™็šฎ่‚ค\n291003 0 ๅ้ฃžๆœบๅ้ฃžๆœบๅ้ฃžๆœบๅ้ฃžๆœบๅ้ฃžๆœบ\n291004 0 ๆœฌๆฌก่ต„่ดน่ฐƒๆ•ดไธป่ฆ้’ˆๅฏนๆ‰‹ๆœบ็”จๆˆท็š„่ฏญ้ŸณไธšๅŠก\n1000 Processed\n classify content\n291500 0 ๅฟƒๆƒ…ๅคๆ‚ๅˆฐxxxxๅญ—้ƒฝ่ฏดไธๆธ…ๆฅš็š„ๆƒ…ๅ†ตไธ‹ไธŠไบ†้ฃžๆœบ\n291501 0 ๅˆซๅข…ๅŒบๅŠžๅ…ฌๅฎคๅ’Œ้ซ˜ๆฅผๅคงๅŽฆๅŠžๅ…ฌๅฎค\n291502 0 ็›ธๅŸŽๅŒ—ๆกฅๆดพๅ‡บๆ‰€่ฟ‘ๆœŸๆŽฅๆŠฅ็š„ไธ€ไธช่ญฆๆƒ…\n291503 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ : ๆ‚จๅฅฝ๏ผ \"x.x\"ๆƒ ่šๅฅณไบบ่Š‚ๆดปๅŠจๅทฒๅฏๅŠจ๏ผŒ่ฟ›ๅบ—ๅฐฑๆœ‰็คผๅ“่ต ้€๏ผŒๆฌข...\n291504 0 
ไธปๆ‰“่ทจๅนณๅฐๅผ€ๅ‘๏ผšๅพฎ่ฝฏๅ‘ๅธƒVisualStudio2015\n1000 Processed\n classify content\n292000 0 ๅฟ—้ซ˜ๅฐๅŒบๅฐฑๆœ‰18่พ†่ฝฆ็š„่ฝฆ่ƒŽ่ขซๆ‰Ž็ ด\n292001 0 7ๅคๅจๅคท็พคๅฒ›ไธœๅ—้ƒจ็‰น้ฒๅ…‹ๆณปๆน–ไธ‹ๅญ˜ๆœ‰50ๅคš่‰˜ใ€ŽไบŒๆˆ˜ใ€็š„ๆˆ˜่ˆฐๆฒ‰่ˆนๆฎ‹้ชธ\n292002 0 ้ฉฌๆ ผ่ฏบๅก”ๅŽปๅนดxxๆœˆ่ขซๆณ•้™ข่ฃๅฎšไธ€็บง่ฐ‹ๆ€็ญ‰x้กน็ฝชๅๆˆ็ซ‹\n292003 0 ไนŸๅฏไปฅๆŠ•่ต„็†่ดข่Žทๅพ—ๆœ€้ซ˜26%็š„ๅฎ‰ๅ…จๆ”ถ็›Š\n292004 0 ๅฐ†ๅ…ถ่ทŸ6ๆœˆ12ๆ—ฅๅณฐๅ€ผ็š„่ฝๅทฎๆ‹‰ๅผ€ๅˆฐ38%\n1000 Processed\n classify content\n292500 0 ๅฎ‰ไธ˜ๅŸŽ็ฎกๆ‰“ไบบๅฎ‰ไธ˜ๅŸŽ็ฎกๆ‰“ไบบไบ‹ไปถๆœ€ๆ–ฐ่ฟ›ๅฑ•\n292501 0 ไธ็†่งฃ้™ๅˆถ5000ๅ—ๅฏนไบ’่”้‡‘่žๆœ‰ไป€ไนˆๅฎž่ดจๅฝฑๅ“\n292502 0 ๆ‰‹ๆœบๅพฎๅšๅ›พ็‰‡ๅŠ ่ฝฝไธๅ‡บๆฅ่ฏฅๆ€ŽไนˆๅŠž\n292503 0 ๅœจ่ดตๆฑ ๅŒบๅ†œๆ‘ๅœฐๅŒบ็›—็ชƒ็Œชไป”37ๆฌก\n292504 0 ไฝ ๅ›ฝๆ”ฟ็ญ–ๆณ•ๅพ‹ๅˆๆŠŠไธ€ๆ‰น้ซ˜็ด ่ดจๅฅณๆ€ง็š„ๅฟƒ้€ผ่ตฐไบ†\n1000 Processed\n classify content\n293000 0 ๆœ‰xร—xxๅฐๆ—ถๅ…จๅคฉๅ€™็š„ๅฟซ้€Ÿๅฎ‰ๅ…จๅ“ๅบ”ๆœบๅˆถ\n293001 0 ๅฏนไป–ไปฌไปŠๅŽๅœจๅˆถ้€ ๆœบๆขฐ้ฃžๆœบไธญๆ‰€ๅ–ๅพ—็š„ๆˆๅŠŸๆ˜ฏ่‡ณๅ…ณ้‡่ฆ็š„\n293002 0 15็š„่ง้ขไผšไนŸๆ˜ฏ่ฆไผšๅ‘˜ๆ‰่ƒฝ็œ‹็š„\n293003 0 ๅ“ชๆ€•ๆ˜ฏๅ› ไธบๅŒป็–—ๆŠ€ๆœฏ็š„้—ฎ้ข˜่€Œๅ‘็”Ÿๆ„ๅค–\n293004 0 ๅ…ทๆœ‰ๆต“ๅŽš็ฆๅทžไผ ็ปŸๅปบ็ญ‘ใ€ๆ–‡ๅŒ–็‰น่‰ฒ็š„ๅ…ธๅž‹้‡ŒๅŠๅผๅކๅฒๆ–‡ๅŒ–่ก—ๅŒบ\n1000 Processed\n classify content\n293500 0 ๆœ‰ๅŠ›ๆ‰“ๅ‡ปไบ†่ฟๆณ•็Šฏ็ฝชไบบๅ‘˜็š„ๅšฃๅผ ๆฐ”็„ฐ\n293501 0 ๆˆ–่€…ๅœจ55ใ€54ใ€78ใ€62ใ€63ใ€64ๅŒบๅ‰ๆŽ’\n293502 0 ็ปˆไบŽ็Ÿฅ้“่Šฑๅƒ้ชจๆ˜ฏไฝ ็š„็”ŸๆญปๅŠซ\n293503 0 ๅดๆฑŸ่ทฏๅบ—็š„ไผ™ไผดๅฅฝๅƒไธๅคชๅผ€ๅฟƒๅ“ฆ็ป™็ฅจๆฎ็š„ๆ—ถๅ€™็›ดๆŽฅๆ‰ญๅคดๅ•ๆ‰‹็ป™ๆˆ‘\n293504 0 ๅœจ่‹ๅทžๅธ‚็›ธๅŸŽๅŒบๅ…ฑ้’ๅ›ขๅฝฑ้™ข็š„ๆ”พๆ˜ ๅŽ…ไธญ\n1000 Processed\n classify content\n294000 0 ไบ”ไธชๅ›žๅˆ้ฃžๆœบ้ฃžโ€ฆๅคฉๆฐ”ๅฎžๅœจๅคช็ƒญ\n294001 0 ไฝ ya็š„้‚„ๆ˜ฏๅˆฅๅ†่ธข็ƒไบ†ๅŽป็•ถๆตๆฐ“็ฎ—ไบ†ๅๆญฃ้ƒฝไธ€ๅ‰ฏๅพท่กŒ\n294002 0 ๆœฌไบบๅทฒ็ป่ขซๅฎณไบ†้‚ฃไบ›ไบบๅฆ‚ๆžœไธๅˆคๆญปๅˆ‘ๅฐ†ๆœ‰ๆ›ดๅคš็š„ไบบ่ขซ็—…ๆฏ’ๆ€ๆญป็š„ๆญปๅˆ‘ๆ€ไบบๅฟๅ‘ฝๆญปๆญปๅˆ‘\n294003 0 ๅŽๆ™จไธœไบšๆฑฝ่ฝฆ้‡‘่žๆœ‰้™ๅ…ฌๅธHR้ƒจ้—จๆ‹›intern่ฆๆฑ‚๏ผšไธ€ๅ‘จไธ‰ๅคฉไปฅไธŠ\n294004 0 ๅœจ20ไบฟ่ดขๅฏŒ้—จๆง›็š„่ƒกๆถฆ็™พๅฏŒๆฆœไธŠ\n1000 Processed\n classify 
content\n294500 1 ๅฐŠ่ดต็š„ไผšๅ‘˜ๆ‚จๅฅฝ๏ผšไธบๆ„Ÿ่ฐขๆ‚จๅฏนๅŠจ้™็•Œๆ”ฏๆŒ๏ผŒๅ…ฌๅธๅœจxๆœˆxๆ—ฅๅฆ‡ๅฅณ่Š‚ๆฅไธดไน‹้™…ๅ›ž้ฆˆๅนฟๅคงไผšๅ‘˜๏ผŒๅ‡กๅœจxๆœˆxๅท...\n294501 1 ๅฐŠๆ•ฌ็š„่ดตๅฎพๆ‚จๅฅฝ!ๅŸŽ่ฅฟ๏ผˆๆฑ‡้€šๅ›ฝ้™…ๆฑฝ่ดธๅŸŽ๏ผ‰้ฆ–ๆœŸๆˆฟๅญๅณๅฐ†ๆทธ็›˜๏ผŒไป…ๅ‰ฉxxๅฅ—๏ผŒ็Žฐๅฆ‚ๆˆๅŠŸ้€‰ๆˆฟ้€่ฃ…ๆฝข่กฅ่ดดใ€...\n294502 0 ไธŠๆตทๆตฆไธœ่ญฆๅฏŸๆŠ“ไบ†ๅ‡ ไธชๅทๆฒน่ดผ่ฟžๅคฎ่ง†้ƒฝๆ‹ฟๅ‡บๆฅ็‚ซ่€€\n294503 0 ๅ…ณ้—ญๅ€Ÿ้’ฑ็‚’่‚ก็š„่ž่ต„่žๅˆธๆญฆๆฑ‰็ง‘ๆŠ€ๅคงๅญฆ้‡‘่ž่ฏๅˆธ็ ”็ฉถๆ‰€ๆ‰€้•ฟ่‘ฃ็™ปๆ–ฐๆ•™ๆŽˆๆฅๆบ๏ผš่‘ฃ็™ปๆ–ฐ็š„ๆœ็‹ๅšๅฎขๅœจๆœฌ่ฝฎ็–ฏ...\n294504 0 6ใ€ๅŠๆ—ถๅŠ ๅ›บๆˆฟๅฑ‹้œ€่ฆๅŠ ๅ›บ็š„้ƒจไฝ\n1000 Processed\n classify content\n295000 0 ChristianLouboutinๆ‹ผ่‰ฒ้€ๆ˜Žpvcๅนณๅบ•้ž‹ๆ€งๆ„Ÿๅฐ–ๅคดๅŠ ไธŠ้€ๆ˜Žๆ‹ผๆŽฅ่ฎพ่ฎก้š้š็บฆ็บฆ็š„ๆต...\n295001 0 ๆŽจไธ€ไธชgoogle็š„้•œๅƒๅœฐๅ€\n295002 0 ๅ—ไบฌ่„‘็ง‘ๅŒป้™ขไธŽๅ—ไบฌๅธ‚็บขๅๅญ—ไผš่”ๅˆๅปบ็ซ‹ไบ†โ€œๅ—ไบฌๅธ‚็บขๅๅญ—โ€˜ๆ˜Ÿๆ˜Ÿโ€™ๅš็ˆฑ่ต„้‡‘โ€\n295003 0 ๅ›พ่…พ็บน่บซx่ฆ็ด ๏ผšxใ€ๅทฅๆ•ดxใ€้ฅฑๅ’Œxใ€ๅฐ–่ง’xใ€่พน็ผ˜็บฟ\n295004 0 TaeYangๅˆšๆ›ดๆ–ฐไบ†Twitter๏ผšTAEYANGWELCOMESYOUๅŽŸๆ–‡ๆˆณ๏ผš\n1000 Processed\n classify content\n295500 0 ๆฆ‚่ฟฐ๏ผšๆญค้กน็›ฎๆญฃๅœจ่ฃ…ไฟฎ่ฟ›่กŒไธญ\n295501 0 ๅพฎ่ฝฏๅธŒๆœ›่ƒฝๅคŸๆ‰พๅ‡บไธ€ไบ›่ฎฉไบบ็œผๅ‰ไธ€ไบฎใ€ๅนถไธ”ๆ›ดๅŠ ๅญฆๆœฏ่Œƒ็š„HoloLensไฝฟ็”จๆ–นๅผ\n295502 1 ๅนฟๅพทๅพทๅ–„ๅฐ้ข่ดทๆฌพๆœ‰้™ๅ…ฌๅธๆ˜ฏ้šถๅฑžไบŽๅฎ‰ๅพฝ็œไพ›้”€็คพ็š„ๅ›ฝๆœ‰ๅ…ฌๅธ๏ผŒไธ“้—จไธบๅ„็ฑปๅž‹ไธชไฝ“ๅทฅๅ•†ๆˆทใ€ๅพฎๅฐๅž‹ไผไธšๆ...\n295503 0 ไผ˜ๅˆฉๅœจxxxxๅพฎ่ฝฏๅ…จ็ƒๅˆไฝœไผ™ไผดๅคงไผšไธŠ่ฃ่ŽทๅนดๅบฆEPGAzureๅˆ›ๆ–ฐไผ™ไผดๅฅ–\n295504 0 ๅ‡ ไธ‡ๅ—้’ฑๆฒปไธๅฅฝ็š„ๅญๅฎซ่…บ่‚Œ็˜ค\n1000 Processed\n classify content\n296000 0 ่ฟ”่ฟ˜็Žฐ้‡‘xxxxxxๅ…ƒใ€ๆ‘ฉๆ‰˜่ฝฆxx้ƒจใ€็”ตๅŠจ่ฝฆx้ƒจใ€ๅŠฉๅŠ›่ฝฆx้ƒจใ€ๆ‰‹ๆœบxx้ƒจใ€็ฌ”่ฎฐๆœฌ็”ต่„‘xๅฐใ€ๆ‰‹...\n296001 0 ็ฅžๆญฆๅฅณๅ„ฟๅ›ฝTeemoยฐ็‰ต้ญ‚็ฅžๆญฆๅฅฝๅฃฐ้Ÿณ\n296002 1 ไฝ ๅฅฝ๏ผŒ็Žฐๆ–ฐไธŠๅธ‚xxๆธ…็บฏๆธ…้ฆ™่Œถ๏ผŒ่ฟ˜ๆœ‰ๆœ€ๆ–ฐ็š„ๆณก่ŒถๅŠŸๅคซ๏ผŒ้ข„`่ฎขๆœ‰ๅฅฝ่Œถxxxxxxxxxxxๅคšๅคš ็Žฐ...\n296003 1 ๏ผŒ่ฎฉไฝ ๆ„Ÿๅ—่บซๅฟƒๆ„‰ๆ‚ฆ็š„ๅŽจๆˆฟ็”Ÿๆดปใ€‚ๅ€ผๆญคx.xxๆฅไธดไน‹้™…๏ผŒๅŽ‚ๅฎถ็‰นไปทๅคšๅคš๏ผŒไผ˜ๆƒ ๅคšๅคš๏ผŒๆœŸๅพ…ไฝ 
็š„ๅ…‰ไธด๏ผๅœฐ...\n296004 0 ไธ€ๅˆฐๅคๅคฉๆˆ‘้ƒฝไผšๆ‰“็”ต่ฏ็ป™ๆˆ‘้‚ฃไบ›็Žฐๅœจๆญฃๅœจ้กถ็€็ƒˆๆ—ฅๅœจๅทฅๅœฐไธŠ็š„ๅ“ฅไปฌๅ„ฟ\n1000 Processed\n classify content\n296500 0 ็ป“ๆžœๅŠณ่ต„ๅˆฐ็Žฐๅœจ่ฟ˜ๅœจ้ฃžๆœบไธŠๅ็€\n296501 0 ไธญๅ›ฝๆŠ—็”Ÿ็ด ๆปฅ็”จ่งฆ็›ฎๆƒŠๅฟƒ่ถ…5ไธ‡ๅจ่ขซๆŽ’ๆ”พๅ…ฅๆฐดๅœŸ็Žฏๅขƒ\n296502 0 ๅŒ—้™†่ฏไธšใ€่ฟ็››ๅŒป็–—ใ€ไธ‡ๆ˜Œ็ง‘ๆŠ€ๅ‹‡ๅˆ›ๆ–ฐไฝŽ\n296503 0 ไฝ†ๅˆๆ€•ๅฅนholdไธไฝๅฆ–็ฅž่ฟ™ไธช่ง’่‰ฒ\n296504 0 ่ฏฅ้•‡ๅ› ๆฏ’ๅ“้—ฎ้ข˜ๅˆคๆญปๅˆ‘็š„52ไบบ\n1000 Processed\n classify content\n297000 0 ่ฟ™ๆฌพ็Ž„ๅ…ณ้•œ็š„้•œๆก†ๅˆ™ไธบ่ฎพ่ฎก็š„ไบฎ็‚น\n297001 0 ้€€ๅŒ–ๆ€ง่ง†็ฝ‘่†œ้ป„ๆ–‘้ƒจๅŒบ็—…ๅ˜่€…7\n297002 0 ๅฎถๅบญ่ดขๅŠกๅŸบๆœฌ่‡ช็”ฑ=็œไผšๅŸŽๅธ‚ไธ€ๅฅ—150ๅนณ็š„ๆˆฟ+500W็š„็Žฐ้‡‘\n297003 0 ๆ›พไธป่ฎฒไบŽๆต™ๆฑŸ่•บๅฑฑไนฆ้™ขใ€ๅนฟไธœ็ซฏๆบชไนฆ้™ข\n297004 0 ็›‘่€ƒ็š„ๆ—ถๅ€™่ฟ˜ไธ่ƒฝๆŠŠๆ‰‹ๆœบๆ”พๅœจ่ทๅŒ…้‡Œ\n1000 Processed\n classify content\n297500 0 AGxxxๆ˜ฏๅฝ“ไปŠไธ–็•ŒไธŠๆœ€ๅคง็š„ไธ€ๆฌพๅœจ็ ”็š„ๆฐด้™†ไธคๆ –้ฃžๆœบ\n297501 0 ไธ€ไธชๅฐๅท็š„ๆˆๅŠŸๆ˜ฏๆ—ถไปฃ็š„่€ป่พฑ\n297502 0 ไปŠๅคฉไธ€ๅ€‹ไบบๅพžBarcelonaๅˆฐPrague\n297503 0 Delaynomoreๅ…ฉ้ปž็š„้ฃ›ๆฉŸๅปถ่ชคๅˆฐๆ™šไธŠๅ้ปž้‚„่ฎ“ๆˆ‘ๅ€‘ไธ€็›ดๅพ…ๅœจ้ฃ›ๆฉŸไธŠ\n297504 0 โ€”โ€”ๅทฆๅฎ—ๆฃ ้ข˜ๆฑŸ่‹็œๆ— ้”กๆข…ๅ›ญ\n1000 Processed\n classify content\n298000 0 ๅŸบๆœฌๆ“ฆๅฎŒไธ€ๅฑ‚้˜ฒๆ™’ๅ†ๆ“ฆLMไผšๆ“ๆณฅ\n298001 0 ไฝ ่ฟ˜ไธบไฝ ็š„ไบŒไบบๅฐ็ช่ฃ…ไฟฎ่ฟ˜็Šฏๆ„ๅ—\n298002 0 ้ป„้™‚ๅŒบไบบๆฐ‘ๆณ•้™ขๅฏน่ฟ™ไธ€ๅˆฉ็”จ่ŒๅŠกไน‹ไพฟ\n298003 0 ๅ…ถไป–ๆทฑๅœณๆ‰‹ๆœบๅŽ‚ๅ•†ไนŸ็บท็บท่ตฐๅ‡บๅŽปๆ‹“ๅฑ•ๅ…จ็ƒๅธ‚ๅœบ\n298004 0 ้กถไธชๅคงๅคช้˜ณๅธฆ็€ๅ—ๅจœ้˜ฒๆ™’ๅ–ท้›พ๏ฝž\n1000 Processed\n classify content\n298500 0 ไธบไป€ไนˆไธๆ„ฟๆ„็ป™ๆˆ‘ๅŠž็†ไฟก็”จๅก\n298501 0 ๆ’ๅคง่ทŸ่…พ่ฎฏ่”ๆ‰‹ๅฐฑๆ˜ฏไธไธ€ๆ ทๅ•Š\n298502 0 ็‰Ÿ็คผ้•‡่ฎก็”ŸๅŠžไบŽxxxxๅนดxๆœˆxxๆ—ฅไธŠๅˆx็‚นๅœจ้•‡ๆ”ฟๅบœไธ€ๆฅผไผš่ฎฎๅฎคๅฌๅผ€ไบ†ๅ…จ้•‡ๅ„ๆ‘ๅฆ‡ๅฅณไธปไปปไผš่ฎฎ\n298503 0 16ๅฒๅฐ‘ๅฅณ่ขซ้€ผๅ–ๆทซ้ญ่€้ธจๆฎดๆ‰“ๅซ–ๅฎขๆŠฅ่ญฆ็›ธๆ•‘่ฏฅไธ่ฏฅ่ขซๆ‹˜็•™\n298504 0 ๆˆ‘้ƒฝไผšๅœจ็”ต่„‘ๅ…ณๆœบๅ‰็”จ็”ต่„‘็ฎกๅฎถๆธ…้™ค็”ต่„‘ไฝฟ็”จ็—•่ฟน?\n1000 Processed\n classify content\n299000 0 
ๅ‡†ๅค‡ๅœจๆ‰ฟๅŒ…็š„600ๅคšไบฉๅœŸๅœฐไธŠ็งไธŠ็ดซ็”˜่“ๅ’Œ่ฅฟๅ…ฐ่Šฑ\n299001 0 ไธญๅ›ฝไบบๅฏฟๆŽจๅ‡บไบ†ไฟๅ•ๅฐๅŠฉๆ‰‹โ€œeๅฎโ€\n299002 0 ไฝ†ไบ‹ๅฎžไธŠ่…่ดฅๅชๆ˜ฏ็‰ฉ่”็ฝ‘ไธ“้กน่ต„้‡‘้—ฎ้ข˜็š„่กจ่ฑกไน‹ไธ€\n299003 0 ๅคšๆ—ฅๅ‰ๅœจๅ—ไบฌ็ญ‰ๅœฐ้“็ป“ๆžœ่…ฟๅกๅˆฐๅๅ‡ณไน‹้—ดๆ”ถ่Žท็š„ๆทค้’\n299004 0 ไธบไป€ไนˆ่ฟ™ไนˆๅคš่‚กๆฐ‘ๆƒณๅšไธญ่ฏๆ\n1000 Processed\n classify content\n299500 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?TOPxxxไฝŽ\n299501 0 ๆžœๆ–ญไธ‹ไบ†ไธช่…พ่ฎฏ่ง†้ข‘็œ‹ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ\n299502 0 ็Žฐๅœจ่ฟ™ไบ›ไธญไป‹ๆ€ง่ดจ็š„่ดทๆฌพๅ…ฌๅธๆ‰€ไปŽไบ‹็š„ไธšๅŠกไธŽไป–ไปฌๅฎŒๅ…จไธ€ๆ ท\n299503 1 ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตๅŽป่ŒถๅŽป็“ฏๅŒ—ๅบ—๏ผ้ข„็Ÿฅๆ›ดๅคšไผ˜ๆƒ ๆดปๅŠจ๏ผŒ่ฏทๅ…ณๆณจๅŽป่ŒถๅŽป็“ฏๅŒ—ๅบ—ๅฎ˜ๆ–นๅพฎไฟก๏ผšๅŽป่ŒถๅŽป็“ฏๅŒ—ๅบ—๏ผˆgote...\n299504 0 ๅ›ฝ้™…่ถณ่”ๅ†ณๅฎšๅผ•่ฟ›้—จ็บฟ่ฃๅˆคๆŠ€ๆœฏ\n1000 Processed\n classify content\n300000 0 ๅฎๅพทๅธ‚ๆฐ”่ฑกๅฐ7ๆœˆ10ๆ—ฅ17ๆ—ถ35ๅˆ†็ปง็ปญๅ‘ๅธƒๅฐ้ฃŽๆฉ™่‰ฒ้ข„่ญฆไฟกๅท๏ผšไปŠๅนด็ฌฌ9ๅทๅฐ้ฃŽโ€็ฟ้ธฟโ€ไปŠๅคฉ17ๆ—ถ...\n300001 1 ๅคฎ่กŒ้™ๆฏไบ†๏ผ้™ๆฏไบ†๏ผๆฒƒๅฏŒ้‡‘่žๅ…ฌๅธไธ“ไธšๆไพ›x-xxๅ€ๅฎž็›˜็‚’่‚ก่ต„้‡‘๏ผŒ็ป“็ฎ—ๆจกๅผ็ตๆดปใ€‚้€Ÿๆฅ้…่ต„๏ผ\n300002 0 ๅŒ—ไบฌ่พพ้ฃžๅฎ‰่ฏ„็ฎก็†้กพ้—ฎๆœ‰้™ๅ…ฌๅธๆŠ€ๆœฏไบ”้ƒจไบบๅ‘˜ๅœจๅ‰ๆž—็œๅ‰ๆž—ๅธ‚ๆฃ€ๆŸฅ้•ฟๆ˜ฅ่พ“ๆฒนๆฐ”ๅˆ†ๅ…ฌๅธ้•ฟๅ‰็บฟ้™„ๅฑž็ซ™ๅœบๅ‰ๆž—่พ“ๆฒน็ซ™\n300003 0 ๆ‰‹ๆœบ2ๅท่‡ชๅŠจๆŠŠSDๅกๆ‰€ๆœ‰ๅบ”็”จๅธ่ฝฝ่€Œไธ”ๆ— ๆณ•ๅ†ๅฎ‰่ฃ…\n300004 0 ็ผด่Žท่ตƒ่ฝฆ400ไฝ™่พ†โ€ฆโ€ฆ2014ๅนด8ๆœˆๅˆฐ2015ๅนด6ๆœˆๅบ•\n1000 Processed\n classify content\n300500 0 ่ฏทๆฑ‚ๆณ•้™ขๆ”ฏๆŒๅŽŸๅ‘Š็š„่ฏ‰่ฎผ่ฏทๆฑ‚\n300501 0 ๅŒป้™ข่ฏŠๆ–ญๆˆ‘ๆ‚ฃไบ†ๆ€ฅๆ€งๆท‹ๅทดๆ€ง็™ฝ่ก€็—…\n300502 0 ไปฅๅŽๅŽป้ซ˜ๆทณๅฏๆฒฟๆ…ขๅŸŽโ†’ๆธธๅญๅฑฑโ†’ๅ›บๅŸŽๆน–โ†’่€\n300503 0 ๅŽๅŠ้ƒจๅˆ†ๅˆ™ๅœจ็”ต่„‘็‰นๆ•ˆไธญๆ‹ผๅ‘ฝ้€ ้ข„็ฎ—\n300504 0 ๆ–ฐๅ“็œผ็ฝฉEYEMASKๅˆฐ่ดง\n1000 Processed\n classify content\n301000 0 ๆ‰‹ๆœบๅฟซๆฒก็”ตไบ†้ฃžๆœบๆ˜ฏๅไธไบ†ไบ†ไนŸๅชๆœ‰ๅ็ซ่ฝฆไบ†\n301001 1 ๆ˜ฅๅญฃๆ˜ฏๅ…ณ่Š‚ใ€้ขˆ่…ฐๆคŽ็—…้ซ˜ๅ‘ๆœŸ.ไป™่‰ๆดป้ชจ่†โ€œๅ…ƒๅฎต-ไธ‰ๅ…ซโ€ๅŒ่Š‚็‰นๆƒ ๆดปๅŠจๅทฒๅผ€ๅง‹๏ผŒไนฐxx้€xใ€ไนฐxx้€...\n301002 0 ็”ตๅŠจ่ฝฆๆ‘ฉๆ‰˜่ฝฆGPSๅซๆ˜Ÿๅฎšไฝ่ฟฝ่ธช้˜ฒ็›—ๅ™จ\n301003 0 ๅƒไบ†2ไธช็ƒค่‚ 
14ๅ—้’ฑๅ‘จ้ป‘้ธญ่ฟ˜ๆœ‰2็ข—็“ฆ็ฝๆฑค\n301004 0 ๆœฌๆฅๅฐฑๅป‰ไปทไฟๆŠค่ฟ‡ๅบฆ็œŸ็›ธ้ƒฝ่ฟ›ไธๆฅ\n1000 Processed\n classify content\n301500 0 2005ๅนด่ขซไบŒไธญ้™ขไปฅๅ—่ดฟ็ฝชๅˆคๅค„ๆญปๅˆ‘\n301501 0 ็ฆปไบ†QQ็ฆปไบ†ๆ‰‹ๆœบๅˆไธๆ˜ฏไผšๆญป\n301502 0 ไบš้ฉฌ้€Šๅ…ฌๅธƒไบ†ๅ…ถ2015่ดขๅนด็ฌฌไบŒๅญฃๅบฆ่ดขๆŠฅ\n301503 0 ๆˆ‘ไปฌๅค่‰้ป‘ๆžธๆžๅ“่ดจๅฅฝๅˆฐๅˆซ็š„ๅ“็‰Œ้ƒฝๆƒณ่ฆ\n301504 0 ไธ‰ๅ…ƒๅŒบๆถˆ้˜ฒๅคง้˜ŸๆŒ‡ๆˆ˜ๅ‘˜xxไบบๅˆฐ้™ขๅ‚ๅŠ ๆดปๅŠจ\n1000 Processed\n classify content\n302000 0 ๅฆ‚ไปŠ่Žทไบฌไธœ43ไบฟๅ…ƒๅทจ้ขๆŠ•่ต„\n302001 0 ๅฏ„ไฟก็š„ไบบ้œ€ๆˆดไธŠๅ‘ผๅธๅ™จไธ‹ๅˆฐไธ‰็ฑณๆทฑๅค„ๆŠ•้€’้˜ฒๆฐดๆ˜Žไฟก็‰‡\n302002 0 ไฝ›ๅฑฑ็ปฟๅœฐไธญๅฟƒไบŒๆœŸ้”€ๅ”ฎไธญๅฟƒๅคœๆ™ฏ\n302003 0 ๆฑŸ่‹ๅ—้€šๅธ‚ๅฆ‚ไธœๅŽฟๆŽ˜ๆธฏ้•‡ๆ”ฟๅบœไปฅๅŠๆฑŸ่‹ๆณ•้™ขๆฃ€ๅฏŸ้™ข่…่ดฅ็š„้ฃŽๆฐ”่ถŠๆฅ่ถŠๆถๅŒ–\n302004 1 ๅฎถ้•ฟไปฌ๏ผšๆ–ฐๅนดๅฟซไน๏ผๅฟƒๆƒณไบ‹ๆˆ๏ผๆœฌ็คพๅฎšไบŽxๆœˆxๆ—ฅๅผ€่ฏพ๏ผŒๅ‡ญๆญค็Ÿญไฟก่ฟ‡ๆฅๆŠฅๅ้€็คผ็‰ฉไธ€ไปฝ๏ผŒ็บขๅŒ…ไธ€ๅฐ๏ผŒๅ้ข...\n1000 Processed\n classify content\n302500 0 ๆˆ–่ฎธๆˆ‘่ฟ˜ไผšๅๅœจ็”ต่„‘้ขๅ‰้™้™ๅพ—็œ‹็œ‹ๅพฎๅš\n302501 0 ๅœฐ็‚น๏ผšๆฐ‘ไธป่ทฏไธŽๆœๅฑฑ่ทฏ่ทฏๅฃ่ฟๆณ•่กŒไธบ๏ผš้ฉพ้ฉถๆœบๅŠจ่ฝฆ่ฟๅ้“่ทฏไบค้€šไฟกๅท็ฏ้€š่กŒๅค„็ฝš๏ผš็ฝšๆฌพ200ๅ…ƒ\n302502 0 ๆœ€่ฟ‘็ˆฑ็œ‹ไธ€ไบ›็Šฏ็ฝชๅ•ŠๆŽจ็†ๅ•Šไธ€็ฑป็š„ๅ‰ง\n302503 0 ็ฅ่ดบๆˆ‘ๅงๅฎถ็š„ๅฐๅญฉ่€ƒไธŠๅ—ไบฌ่ˆช็ฉบ่ˆชๅคฉๅคงๅญฆ\n302504 0 ๅœจ็”ต่„‘ไธŠ่ดด็…ง็‰‡โ€ฆโ€ฆไฝ ๅฟๅฟƒๆˆ‘ไธ่ขซ็ฟป็‰Œๅ—\n1000 Processed\n classify content\n303000 0 ่ก—้•‡ๅ›บๅฎš่ต„ไบงๆŠ•่ต„็ดฏ่ฎกๅฎŒๆˆ206\n303001 0 ้˜ฟQ็ฒพ็ฅž่ตค่ฃธ่ฃธ็š„ๅฑ•็คบ็ป™ไธ–็•Œๅ„ๅ›ฝไบบๆฐ‘็œ‹\n303002 0 ๆˆ‘ๆญฃๅœจไฝฟ็”จ็™พๅบฆ่พ“ๅ…ฅๆณ•โ€œ้‡‘่‰ฒๆ‰“ๅญ—ๆœบโ€็‰นๆŠ€็šฎ่‚ค\n303003 0 ไธญ่ˆช็š„ไธช่‚กๅฐ†ไผš้‡็ป„ๆ‰€ไปฅ็›ฎๅ‰ๆšดๆถจ้žๅธธๆ˜Žๆ˜พ\n303004 0 ๆญๅ…จๆ–ฐๅŠจๅŠ›็ณป็ปŸๅŒ—ไบฌxxLๆต‹่ฏ•่ฐ็…งๆ›ๅ…‰\n1000 Processed\n classify content\n303500 0 ่‡ด็œ‹็€่‡ชๅทฑๅญฉๅญๅœจๅœฐ้“ไธŠ็–ฏ้—น\n303501 1 ๅ†€ๆ–ฐ็‰ฉ่ต„:ๆˆ‘ๅ…ฌๅธไธ“่ฅๆ‰นๅ‘Hๅž‹้’ข\n303502 0 ่™šๆ‹Ÿ็š„ไนŸ่ƒฝไฝ“้ชŒๅˆฐCHERRY้’่ฝด็š„ๆฎต่ฝๆ„Ÿ\n303503 0 21D108dayๅ†ๅŽปๆฑŸ้ƒฝๅธ‚็ป™้˜ฟๅงจ่ฟ‡็”Ÿๆ—ฅ๏ฝž\n303504 0 ๅ…ถๅฎž้—ฎๆˆ‘่ฆ็บขๅŒ…็š„่ฟ™Nๅคšไบบไฝ ไปฌๆœ‰็ป™ๆˆ‘ๅ‘่ฟ‡ๅ—\n1000 Processed\n 
classify content\n304000 0 ๆฎไบš้ฉฌ้€Šไธญๅ›ฝๅ…ฌๅธƒ็š„ๆœ€ๆ–ฐๆ•ฐๆฎ\n304001 0 ๅ•†ไธš้ข†ๅŸŸๅญฆๅކๅญฆไฝ้ซ˜็š„ไบบๅ…ถไธš็ปฉๅธธไธๅฆ‚ไฝŽ็š„ไบบ\n304002 0 โ€”โ€”ใ€Œ14ๅฒๅฅณๅญฉ้ญ่„ฑ่กฃๆฌบๅ‡Œไบ‹ๅŽ็ช็„ถๆŠŠ่‡ชๅทฑ็œ‰ๆฏ›ๅ‰ƒๅ…‰ใ€\n304003 0 ๅฐฑๆ˜ฏๅฏน็ˆฑbulaๆ‰‹ๆœบ็š„ๆ‰‹ๆœบๆฒกๅฅฝๆ„Ÿ\n304004 0 7ๆœˆ20่‡ณ27ๆ—ฅๆˆ‘ๅธ‚็ฒฎๆฒน้›†ๅธ‚้›ถๅ”ฎไปทๆ ผ็ปง็ปญไฟๆŒ็จณๅฎš\n1000 Processed\n classify content\n304500 0 ๅŒๆ ทๅฏไปฅ่งๅˆฐ่ฎพ่ฎกๅธˆๅฏน้˜ณๅ…‰็š„้‡‡้›†ๆ–น้ข็š„็”จๅฟƒ\n304501 0 ไผŠไธฝ่““ๅก”ๅฐ้ป‘่ฃ™ๅ…ทๆœ‰SPF30/PA+++็š„้˜ฒๆ™’ไฟๆŠคๆŒ‡ๆ•ฐ\n304502 0 ๅฐฑๆ˜ฏๆˆ‘็š„ไธšๅŠกไปฃ็ xxxxxxxxxxๆญคๅ…จ้ƒจๆ“ไฝœไธบๅ…่ดน\n304503 1 ๅ…„ๅผŸ๏ผŒๆ–ฐๅนดๅฅฝ๏ผ็Žฐๆˆ‘ๅบฆๆ€ฅๆ‹›็‚’้”…ไธ€ๅ๏ผŒๅทฅ่ต„็”ต่ฏ่Š๏ผŒๆœˆไผ‘ไธ‰ๅคฉ๏ผŒๆฏไธชๆœˆไบŒๅๅทๅ‡†ๆ—ถๅ‡บ็ฒฎ๏ผŒๅœฐๅ€๏ผšไฝ›ๅฑฑๅธ‚ๅ—...\n304504 1 ไฝ ๅฅฝๅง๏ผŒๆˆ‘ๆ˜ฏๅ—ๆดช่ก—็Ž–่‰ฒๅฏ็พŽๅŒ–ๅฆ†ๅ“ๅบ—็š„๏ผŒๆˆ‘ไปฌๅฎถx.x.x.xๅ››ๅคฉไผšๆœ‰ๅคงๅž‹็š„ไนฐ่ต ๆดปๅŠจ๏ผŒๅบ—ๅ†…ไนŸๆœ‰ๆดป...\n1000 Processed\n classify content\n305000 0 ไฝ tmไธ€ไธชๆ‹ฆ่ทฏๆŠขๅŠซ็š„ไนŸ่ต”ไบ†\n305001 0 ๅŒป้™ข็œŸๆ˜ฏๆœ€่ฎฉไบบ็ณŸๅฟƒ็š„ๅœฐๆ–น\n305002 0 ๅ…‰็€ไธŠ่บซไปŽไธญๅฑฑๅŒป้™ขๅฟƒ่„ไธญๅฟƒ่ท‘ๅ‡บ\n305003 0 M่ถŠ่ฏด่ถŠๅฏ’้…ธๆˆ‘้ƒฝ่ฆๅ“ญไบ†โ€ฆไธ่ฟ‡่กฃๆœ่ฃคๅญ้ƒฝๆ˜ฏๅŽปๅนด็š„ๆฌพไบ†ๅŒ…ๆ˜ฏๅ‡ ไธชๆœˆๅ‰ไนฐ็š„\n305004 0 ็›ฎๅ‰่ฟ™ๆฌพAPPๅทฒ็ปๅธๅผ•ไบ†100ไฝ™ไธ‡็”จๆˆทๆณจๅ†Œ\n1000 Processed\n classify content\n305500 0 HamiltonBeach67601Aๆฑ‰็พŽ้ฉฐๅคงๅ˜ดๅทดๆฆจๆฑๆœบ$49\n305501 0 ไธŽๆฐ‘ไบ‹่ฏ‰่ฎผ็š„ๆ–นๅผ่ฆ่ตฐไป€ไนˆ็จ‹ๅบ\n305502 0 ไธ‹้ขๅ—้€šๅฉš็บฑๆ‘„ๅฝฑ็ฑณๅ…ฐๅฐ็ผ–ไธไธ็ป™ๅคงๅฎถๅˆ†ไบซไธ€ไธ‹ๆœ€ๆ–ฐ็š„ๅฉš็คผ่ดบ่ฏ\n305503 0 ๆŠŠ็ˆฑ็މไธŠไฝ ็š„ๅคง็บ ็บท่ดด็งไฟกๅ‡บๅŽป\n305504 0 ๅ›พไธบๆ‘ๆฐ‘็”จๆ‰‹ๆœบๆ‹ไธ‹ๅซๆ˜Ÿๅ‘ๅฐ„ๅŽๆމ่ฝๆฎ‹้ชธ็š„็…ง็‰‡\n1000 Processed\n classify content\n306000 0 ไธ‡ไธ€็œŸ็š„ๅƒ่…พ่ฎฏ่ฏด็š„ๅพฎ่ฝฏๅ€’้—ญไบ†ๆ€ŽไนˆๅŠž\n306001 0 ่ฎพๆœ‰ๅ—ไบฌๅ—ใ€ๆฑŸๅฎ่ฅฟใ€้ฉฌ้žๅฑฑไธœใ€ๅฝ“ๆถ‚ไธœใ€่Šœๆน–ใ€ๅผ‹ๆฑŸใ€็นๆ˜Œ่ฅฟใ€้“œ้™ตใ€ๆฑ ๅทžๅ’Œๅฎ‰ๅบ†10ไธช่ฝฆ็ซ™\n306002 0 ็›ๅŸŽๅฝข่ฑกๅฎฃไผ ็‰‡ไธŽๆ‚จๅ†็ปญๅ‰็ผ˜\n306003 1 
ๅบทๅฎ้‡ŒๅฐๅŒบ๏ผŒไธœๅคง่ก—ๅฐšๅ‹ค่ทฏ๏ผŒ็™พ็››ๅŽ๏ผŒxxใŽกใ€‚xxๅนด็‹ฌ็ซ‹ไบงๆƒ๏ผŒๅฏๆŒ‰ๆญ๏ผŒ็Žฐๆˆฟ๏ผŒ็ฒพ่ฃ…ไฟฎ๏ผŒ้€ๅฎถๅ…ท๏ผŒๅ‡ไปท...\n306004 0 4ใ€ๆŽ’ๅŸบๅฅฅๆ™ฎๆ–ฏ้™ตๅข“้กถ็ซฏๅ‡บๅ‘ๆ˜ฏไธ็†ไบบ\n1000 Processed\n classify content\n306500 0 ๅ–ทไธช้ฆ™ๆฐด่™ๆญป้ผป็‚Ž่ฆๆญป็š„่‡ชๅทฑ\n306501 0 ๅ”ฏไธ€็š„ไธ€ๆก่ทฏๅฐฑๆ˜ฏไปŽๆญค็”ตๆขฏไธŠ้€†่กŒไธŠๅŽป\n306502 1 ๅฆ‚ๆ„้˜ๅฅขไบซ็พŽๅฎนไผšๆ‰€ๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆˆ‘ไปฌไปฅๆ‰“้€ ๅ…ฐๆบช็ฌฌไธ€ๅ“็บง็š„ไผšๆ‰€๏ผŒๆณจ้‡ๅ“่ดจๆœๅŠก๏ผŒไธบๆ‚จๆไพ›ๅ…จ่บซๅฟƒๆ”พ...\n306503 0 ๅ˜‰ๅฎพ๏ผšไธญๅ›ฝๆฑฝ่ฝฆๅทฅไธšๅ’จ่ฏขๅ‘ๅฑ•ๅ…ฌๅธ้ฆ–ๅธญๅˆ†ๆžๅธˆ่ดพๆ–ฐๅ…‰\n306504 0 ็พŽๅ›ฝๅ†…ๅŽ่พพๅทž่ฟ‘ๆ—ฅๅ‘googleๅ‘ๅธƒ็ฌฌไธ€ๅผ ่‡ชๅŠจ้ฉพ้ฉถๆฑฝ่ฝฆ็‰Œ็…ง\n1000 Processed\n classify content\n307000 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๅนณๅฎ‰ๆ˜“่ดทๅฎขๆˆท็ป็†ๅด้นใ€‚ๅˆšๅˆš่ทŸๆ‚จ้€š่ฟ‡็”ต่ฏใ€‚xๆœˆไปฝๆˆ‘ไปฌไบ‹ไธš่ดทไธŠๅธ‚๏ผŒ้ขๅบฆxxไธ‡ๅทฆๅณ๏ผŒๅˆฉๆฏ...\n307001 0 ๆˆ‘ๆ˜ฏ็œŸ็š„่ขซpoไธปไธ€ๅฅๆจๆด‹ไนŸๆ˜ฏ่ฟ™ไนˆ่ฟ‡ๆฅ็š„ๅฟƒ้…ธๅˆฐไบ†\n307002 0 ๆˆ‘ไปฌไฝ็š„ๆ˜ฏ้ซ˜ๅฐ”ๅคซ็ƒๅœบ่พนไธŠ็š„ๅคๅผ่ทƒๅฑ‚ๅฅ—ๆˆฟ\n307003 0 ็”ตๆขฏๆ˜ฏๅผฏ็š„ๆ˜ฏๅผฏ็š„๏ฝž่ฟ™ไธชๅ•Š๏ฝžไธŠๆตทไนŸๆœ‰๏ฝžๅฆˆ่›‹ๅฆˆ่›‹ๅฆˆ่›‹๏ฝž\n307004 0 ่ฏฅ้™ขๆ˜ฅ่Š‚ๅ‰ๅŽๅ…ฑๅ‘29ๅ็‰นๅ›ฐ็พคไผ—ๅ‘ๆ”พๅธๆณ•ๆ•‘ๅŠฉ้‡‘25ไธ‡ไฝ™ๅ…ƒ\n1000 Processed\n classify content\n307500 0 ่ดŸ่ดฃๅ…ฌๅธ็š„ไบบๅ‘˜ๆ‹›่˜ๅŠๅ…ฌๅธ็ฝ‘็ปœ่ฟ่ฅ\n307501 1 ๆฝฎ่†ณ้˜ๆตท้ฒœ็ซ้”…ไบŽxๆœˆxxๆ—ฅๅผ€ๅทฅ๏ผŒๅนถ็‰นๆŽจใ€ๅ…จๆ–ฐใ€‘็พŽๅ‘ณๆญฃๅฎ—ๆฝฎๅทž่œใ€ๅ…ป็”Ÿ็‚–ๆฑคใ€ๆตท้ฒœ็ซ้”…๏ผŒๆฌข่ฟŽๆ–ฐ่€้กพ...\n307502 0 ็”ฑ่‡ชๆฒปๅŒบไบบๆฐ‘ๆ”ฟๅบœๅ‘ๅฑ•็ ”็ฉถไธญๅฟƒใ€ไธญๅ›ฝ้ซ˜็ง‘ๆŠ€ไบงไธšๅŒ–็ ”็ฉถไผšๆตทๆด‹ๅˆ†ไผšใ€ๆ–ฐ็–†่ดข็ปๅคงๅญฆๅ’Œไธœ่ฅฟ้ƒจ็ปๆตŽ็ ”็ฉถ้™ขไธปๅŠž\n307503 0 ๆ˜จๅคฉไธŠๅˆๅ็‚นๅ‡บๅธญๆฑŸ่‹ๅฎœๅ…ด็ดซ็ ‚ๅฃถๅœจๅนฟๅทžไผšๅฑ•ไธญๅฟƒๅผ€ๅน•ๅผๆดปๅŠจ\n307504 0 100็ฒ’่ฃ…538ๅ’Œ200็ฒ’่ฃ…958\n1000 Processed\n classify content\n308000 0 ๆณฐๅทžๅŽไธœ็ป็ผ˜ๆๆ–™ๅŽ‚ใ€ๆณฐๅทžๅธ‚ไบšๆ˜Ÿๅก‘ไธšๆœ‰้™ๅ…ฌๅธ็ญ‰ๆˆ‘ๅธ‚62ๅฎถไผไธšๆˆๅŠŸๅ…ฅๅ›ด\n308001 0 ๆœฌๆฅๆ นๆœฌๅ›žไธๆฅๆ‰€ๆœ‰้ฃžๆœบๅœ้ฃžๅ‡†ๅค‡ๆ‰พไบบ้กถๆดปๅฎขๆˆท่ฏดไผšไธๆƒœไปฃไปทๆŠŠๆˆ‘ๆžๅ›žๆฅ็„ถๅŽๆˆ‘ๅฐฑ็œŸ็š„่ขซๅผ„ๅ›žๆฅไบ†โ€ฆโ€ฆโ€ฆ\n308002 0 ๅผ 
ๅฎถๆธฏๅธ‚่‡ชๅผบ็คพไผšๆœๅŠก็คพๅŸŽๅ—ๅทฅไฝœ็ซ™ๅ’Œ็ฆๆฏ’ๅฟ—ๆ„ฟ่€…ๅœจๆƒ ไธฐ็คพๅŒบๅผ€ๅฑ•ไบ†้’ๅฐ‘\n308003 0 ๅˆธๅ•†่‚กๅ›žๆœฌๅŽๅ†ไธๅš่ฟ™ไธชๆฟๅ—ไบ†\n308004 0 ๅฐ‘ๆž—ๅฏบๅ•†ไธšๅŒ–ๆœฌไบ‹ๅฐฑ่ฟ่ƒŒไบ†ไฝ›ๆ•™็š„ๅฎ—ๆ—จ\n1000 Processed\n classify content\n308500 0 ้˜ฟๅ…‹่‹ๅธ‚ๆณ•้™ขๅ…š็ป„ไนฆ่ฎฐๅผ ๅบ†ๆฐ‘ๆญฃๅœจไธบ็บขๆกฅ่ก—้“็ƒญๆ–ฏ็‰น็คพๅŒบ่€ๅ…šๅ‘˜ใ€่€ๅนฒ้ƒจ้€ๅŽปๆ…ฐ้—ฎๅ“\n308501 0 ๆœ‰ๅฏ่ƒฝไผšๆˆไธบWin10SR1ๆ›ดๆ–ฐ็š„็‰ˆๆœฌๅท\n308502 0 ่ตดๆพฅๆตฆ้•‡ๆฑ‡ๆบๅฐๅŒบไธบ80ไฝ™ๅๅญฉๅญ\n308503 0 ้™„ไปถ๏ผšๆ‹›ๆ ‡ๆ–‡ไปถไธ“็”จๆกๆฌพใ€็”ตๅญ่ฏ„ๆ ‡ๆ‹›ๆ ‡ๆ–‡ไปถใ€็”ตๅญๅ›พ็บธ\n308504 0 ๆดชๅŸŽๅคงๅธ‚ๅœบๅทฅๅ•†ๅฑ€้’ไบ‘ๅˆ†ๅฑ€ๅผ€ๅฑ•ไบ†็”ตๆš–ๅ™จๅธ‚ๅœบไธ“้กนๆ•ดๆฒป่กŒๅŠจ\n1000 Processed\n classify content\n309000 0 ไบš้ฉฌ้€ŠWebๆœๅŠก็Žฐๅœจๆไพ›Auroraๆžๅ…‰ๆ•ฐๆฎๅบ“ๆœๅŠก\n309001 0 ๅฎๅฎๅฅถ็ฒ‰ๅ–‚ๅ…ปๅคงไพฟ่€ๆ˜ฏๅธฆ็ฒ˜่„“็จ ็ŠถๅŽปๅŒป้™ขๆฃ€ๆŸฅๅคงไพฟๆฒกๆœ‰้—ฎ้ข˜ๅŒป็”Ÿๅผ€ไบ†ๅคดๅญขๅ’Œๅฆˆๅ’ช็ˆฑ\n309002 0 ๅ•ๆ˜ฅ็ง‹ๆ˜ฏ่Šฑๅƒ้ชจ้‡Œๆœ€็ƒฆ็š„ไบบๅ•Š\n309003 0 ๆˆ‘ไปฌๅทฒไบŽ2015ๅนด2ๆœˆๅ‘้นฟๆ™—ๅœจไธญๅ›ฝๅผ€ๅฑ•็š„้žๆณ•ๆผ”่‰บๆดปๅŠจ\n309004 0 ๅˆ›้€ ไบ†ๆ–ฐ็š„NBAๅญฃๅŽ่ต›ๅ•ๅœบๅคฑ่ฏฏ็บชๅฝ•\n1000 Processed\n classify content\n309500 0 ThisSummer\\'sGonnaHurtLikeๆฌง็พŽๆต่กŒๆŒ‡ๅ—\n309501 0 ๆต™ๆฑŸๅซ่ง†ๆฒกๆ”พๅ–œ็พŠ็พŠไธŽ็ฐๅคช็‹ผ\n309502 0 ่€Œไธ”่ฟ˜ไป‹็ป็ป™ไบ†ๅšๆŠคๅฃซ็š„ๆœ‹ๅ‹\n309503 0 HUAHUA้ฆ†ๅฐๆนพ่ฎพ่ฎกๅธˆๆ‰‹ไฝœๅคฉ็„ถ็ƒŸ็ฐ็Ž›็‘™็็ 925็บฏ้“ถ้’ˆ่€ณ้’‰่€ณ็Žฏ\n309504 1 ไบฒ๏ผŒไธ–่ดธๅฑˆ่‡ฃๆฐ่‡ช็„ถๅ ‚็Žฐๅœจไนฐxxxๅ‡xxๅ†x.xๆŠ˜๏ผŒxๆœˆxๆ—ฅ่‡ช็„ถๅ ‚ไผšๅ‘˜่ฟ˜ๆœ‰ๅŒๅ€็งฏๅˆ†๏ผŒ่ถ…ๅˆ’็ฎ—ๅ“ฆ\n1000 Processed\n classify content\n310000 0 ๅฏไปฅไฝ ๅทฒ็ถ“ๆ˜ฏๆˆ‘็”Ÿๅ‘ฝไธญ็š„้Žๅฎขไบ†\n310001 0 ๅนณๅฟƒ่€Œ่ฎบๆฑŸๅฎๅ…ฌๅฎ‰ๅ’Œๅš็‰ฉๆ‚ๅฟ—ไธคไธช่ดฆๅท้ƒฝๆ˜ฏๆŒบๅฅฝ็š„\n310002 0 Sๅงๅฆน็š„ๅฆˆๅฆˆๅˆ™ๆŠŠๅฐSๅ’Œ่Œƒ็Žฎ็ชๆ‚ๅœจไธ€่ตท\n310003 0 ๅˆ›ๅปบๅ›ฝๅฎถ็”Ÿๆ€ๅŽฟๅปบ่ฎพ็พŽไธฝๆ–ฐๆฒ›ๅŽฟ\n310004 0 ๆฑŸ่‹็œ่ฟžไบ‘ๆธฏๅธ‚ๆฎ‹่”ๅ…š็ป„ไนฆ่ฎฐใ€็†ไบ‹้•ฟไพๅฏ้กบ\n1000 Processed\n classify content\n310500 0 ๅ—ไบฌๆญฆ่ญฆๅŒป้™ข่‚›่‚ ็ง‘ไธปไปป้ƒญๅ–œๆณ•่ฏด\n310501 1 ๅงๅงไฝ 
ๅฅฝ๏ผŒๆˆ‘ๅบ—ไธบไบ†ๅ›ž้ฆˆ่€้กพๅฎขไธ‰ๅ…ซ่Š‚ๅฝ“ๅคฉๆ‰€ๆœ‰ๆŠค่‚ค็พŽไฝ“ไบงๅ“ๅ…จ้ƒจไบ”ๆŠ˜๏ผŒๆœŸๅพ…ๆ‚จ็š„ๅ…‰ไธด๏ผŒๆŸณๆ–ฐไฟก็”จ็คพ่ฅฟ้—จ...\n310502 1 ๆญฃๅผๅผ€ไธšใ€‚ๅฅฅ็‰น่Žฑๆ–ฏๅๅ“ๅŸŽๅ•†้“บxx-xxใŽก๏ผŒ้ข็งฏๅฐๆ€ปไปทไฝŽใ€‚ๆœˆไพ›ไธคๅƒๅคšๅ…ƒ๏ผŒๆœˆๆ”ถ็งŸไธ‰ๅƒๅคš๏ผŒๆœˆ็งŸๅฏๆŠต...\n310503 0 ่€Œๅˆ˜ๆŸไธบ่ฏๆ˜Žโ€œๆธ…็™ฝโ€็ซŸ็„ถๅธฆ้ข†ๆฐ‘่ญฆๅŽป็งŸไฝ็‚นๆ‰พๅฎคๅ‹ไฝœ่ฏ\n310504 0 xxxxๅนดxๆœˆxxๆ—ฅๅ‘ผๅ’Œๆตฉ็‰นๅธ‚็ฌฌไธ€ๅŒป้™ขๆ•ดๅฝข็พŽๅฎน็ง‘ๆณจๅฐ„ไผš่ฏŠ\n1000 Processed\n classify content\n311000 0 ไปฅๅŠ่‘—ๅ็š„้ฃŽ้™ฉๆŠ•่ต„ไบบโ€”โ€”็บชๆบ่ต„ๆœฌๆ€ป่ฃ็ฌฆ็ปฉๅ‹‹ๅ’Œ็ป็บฌไธญๅ›ฝๅŸบ้‡‘็ฎก็†ๅˆไผ™ไบบๅพไผ ้™ž\n311001 0 2015ๅนด้•‡ๆฑŸๅธ‚่€ƒ่ฏ•ๅฝ•็”จๅ…ฌๅŠกๅ‘˜ๆ‹Ÿๅฝ•็”จไบบๅ‘˜้€’่กฅๅๅ•ๅ…ฌ็คบ\n311002 0 ่ญฆๅฏŸๆฒกๅฌๆ‡‚ๆŠŠxxxไปฅ้˜ฒ็ขๅ…ฌๅŠกๆ‹˜็•™ไบ†\n311003 0 ้‚ฃไนˆๅŽไธบMate8ไผšๆœ‰ๅ“ชไบ›ๆ”น่ฟ›\n311004 0 ๆ„ๆ€ๅฐฑๆ˜ฏไฝ ๅ‰ฉไธ‹็š„359ยฐๅ…จๆ˜ฏๆญป่ง’\n1000 Processed\n classify content\n311500 0 AdrianไบŽ7ๆ—ฅๆ™š้—ดๅœจ้ฆ™ๆธฏๅ‡บๅธญๆดปๅŠจ\n311501 0 ๅŒป็”Ÿๅงๅง่ฆ้ชŒๆˆ‘็š„ๅ—ฏๅ—ฏโ€ฆโ€ฆโ€ฆโ€ฆ\n311502 0 ๅพˆๅคšไบบไธ็†่งฃๆบง้˜ณไบบไธบไป€ไนˆไน ๆƒฏๅƒๆณก้ฅญ\n311503 0 ่€Œๅ†…้ƒจ่ฎพ่ฎกๅธˆEndramuktiHidayatๅˆ™ๅทงๅฆ™ๆณจๅ…ฅๅทดๅŽ˜ๅฒ›็š„่ถฃๅ‘ณ\n311504 0 ๆปก่ถณไบ†ไธๅŒๅฎถๅฑ…่ฃ…ไฟฎ็š„้œ€่ฆ\n1000 Processed\n classify content\n312000 0 ๆŒ‡ๅ‡บๅˆฐ2020ๅนดไธŠๆตทๅธ‚่ทจๅขƒ็”ตๅ•†ๅ‘ๅฑ•ๆฐดๅนณ่ฆๅฑ…ๅ…จๅ›ฝๅ‰ๅˆ—\n312001 0 ๅ˜‰ๅฎšๅŒบๆŸไพฟๅˆฉๅบ—่ขซ5ๅๅค–ๆฅไบบๅ‘˜ๆŒๅˆ€ๆŠขๅŠซ8ๅƒๅคšๅ…ƒ\n312002 0 ็œŸ็›ธๅ’Œ็ญ”ๆกˆๅชๆœ‰ไธ€ไธช๏ผš่†จ่†จๅ†ฐ\n312003 0 ไปŠๅคฉๆ˜ฏๅธธๅทžไธ€ไธชๆœˆไปฅๆฅ้šพๅพ—็š„ๅคฉๆฐ”ไบ†\n312004 0 ๅนถไธŽAC็ฑณๅ…ฐไฟฑไน้ƒจCEOๅŠ ๅˆฉไบšๅฐผๅฐฑๅŒๆ–นๅ“็‰Œๅˆไฝœ็ญ‰ๆ–น้ข็š„ไบ‹ๅฎœ่ฟ›่กŒไบ†ๆทฑๅ…ฅไบคๆต\n1000 Processed\n classify content\n312500 0 ๆ็Œฎ็ป™ๆต™ๆฑŸ็œๅš็‰ฉ้ฆ†ไบ”ๅƒๅคšๅน…\n312501 1 xๅ…ƒๅ„ๆฌพ่‰ฒ้’ˆ็ป‡ๅผ€่กซ๏ผŒxxxๅ…ƒใ€xxxๅ…ƒๅ“ฅๅผŸ่ฃค๏ผŒ่ฟ˜ๆœ‰ๆ›ดๅคšๅฐๆƒŠๅ–œ๏ผŒ้€ๆฏไบฒ้€้—บ่œœ??๏ผŒๆฌข่ฟŽๅ‰ๆฅไฝ“้ชŒ...\n312502 0 ๅดๅ‡บไนŽๆ„ๆ–™็š„้‡‡็”จไบ†Q็‰ˆ็š„ๅก้€š็”ป้ข่ฎพ่ฎก\n312503 0 ๅจœๅก”่މยทๆณข็‰นๆ›ผ็š„ๆผ”ๆŠ€็œŸ็š„ๅพˆๆฃ’๏ฝžV็š„ๅฃฐ้ŸณไนŸๅพˆๅฅฝๅฌ\n312504 0 ่ท็ฆปxxxxๅนด็ ”็ฉถ็”Ÿ่€ƒ่ฏ•ๅชๅ‰ฉxxxๅคฉ\n1000 Processed\n classify 
content\n313000 0 Don'twasteyourtimewithexplanations๏ผšpeopleonlyh...\n313001 0 ๅไธชไบบๅฐฑๅชๆœ‰ไธ‰ไธชไบบๆ‰‹ๆœบๆœ‰ไฟกๅทโ€ฆๆ‰“็”ต่ฏๆŠฅ่ญฆ\n313002 0 ่ดต็พŽไบบๆ•ดๅฝข็พŽๅฎนๅŒป้™ขๅง‹็ปˆไปฅๅฎ‰ๅ…จๅก‘็พŽไธบๅทฑไปป\n313003 1 ไธบๅบ†็ฅไธญๅ›ฝไบบๅฏฟ่ฃๅ‡ๅ‰ฏ้ƒจ็บงๅคฎไผ๏ผŒ็‰นๆŽจๅ‡บไฟๅ€ผๅขžๅ€ผใ€้ซ˜้ขไฟ้šœใ€ๅ…จ็จ‹ๅˆ†็บข็š„้ธฟ็›ˆไธคๅ…จไฟ้™ฉ๏ผŒไบค่ดนxๆฌก๏ผŒx...\n313004 0 ๅŽปxxx่บบไธ‹ไธ€ๅˆ†้’ŸๅŒป็”Ÿๅ‘Š่ฏ‰ๆˆ‘ๆฒกๆ‰พๅˆฐ\n1000 Processed\n classify content\n313500 0 ๅ™ๅˆฉไบš็ซ็ฎญๅผน่ขซๆ”น้€ ๆˆๆ—‹่ฝฌๆœจ้ฉฌ\n313501 0 ้ขๆ–™ๆฃ‰้บป้€ๆฐ”่ฝป่–„ไธŠ่บซๆ•ˆๆžœๅพˆๅคง็‰Œๆญ้…้ดๅญๆฌง็พŽ่ก—ๆ‹ๆ„Ÿ\n313502 0 ๅœบๅค–่ฏๅˆธไธšๅŠก้ป‘ๅๅ•ๅˆถๅบฆๅฐ†ๅปบ\n313503 0 ๅ…จๅนด่ฅๆ”ถๅฏ่ƒฝไปŽๅŽปๅนดxxxไบฟๅ…ƒไบบๆฐ‘ๅธๅขž้•ฟๅˆฐxxxไบฟๅ…ƒไบบๆฐ‘ๅธ\n313504 0 ๆฅ่ฏ•่ฏ•CoolFatBurner็‡ƒ่„‚่ƒŒๅฟƒ\n1000 Processed\n classify content\n314000 1 ไธบๅบ†็ฅไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚๏ผŒxๆœˆx่‡ณxๆœˆxๆฒ™ๆฒณๅคฉ่™น็މๅ…ฐๆฒนไธ“ๆŸœๆŽจๅ‡บๅ…จๅœบxๆŠ˜ไผ˜ๆƒ ๏ผŒๅฆๆŠ˜ๅŽxxxๅ†ๅ‡xxx๏ผŒ...\n314001 0 ้กพๅฎขๅฐ†่‡ชๆ‹็…ง็‰‡้€š่ฟ‡ๆ‰‹ๆœบไผ ้€่‡ณๆ‰“ๅฐๆœบ\n314002 0 ไปฅไธ‹ๆ‰€ๆœ‰่‚ก็ฅจๅ‡ไธๅšไธบๆ‚จไนฐๅ–ไพๆฎ\n314003 1 ไธ‰ๅ…ซ่Š‚ๅฐฑๆ˜ฏ่ฟ™ไนˆไปปๆ€งใ€ไธ‰ๆœˆไธƒๆ—ฅ่ฟ›ๅบ—็™ฝ้€xxๅ…ƒ็ซๆฏ›่†ไธ€ๆ”ฏใ€ไธ‰ๆœˆๅ…ซๆ—ฅ่ฟ˜็™ฝ้€xxxๅ…ƒbb้œœไธ€ๆ”ฏใ€ไฝ†ไฝ ...\n314004 0 BESCONNewGen38้€ๆ˜Ž้šๅฝข็œผ้•œBESCONNewGen38้€ๆ˜Ž็‰‡ๅฆ‚ๅŒ่‚Œ่‚ค็š„่ˆ’้€‚ๆ„Ÿ\n1000 Processed\n classify content\n314500 0 ็ฉบไน‹่ฝจ่ฟนFCEVO็›ดๆ’ญๅผ€ๅง‹ๅ’ฏๅœฐๅ€ๅœจ\n314501 0 ็Žฐๅœจๆฏๅคฉ็ก่ง‰ๅšๆขฆ้ƒฝๆ˜ฏ่Šฑๅƒ้ชจ็š„ๅ‰งๆƒ…\n314502 0 ๅ—้•ฟๆฃ€ๅฏŸ้™ขโ€œๅ‘ๆ—ฅ่‘ตไน‹ๅฎถโ€็ป„็ป‡ๅ…จๅธ‚60ไฝ™ๅไธญๅฐๅญฆ็”Ÿ\n314503 0 ็œ‹็€ๆ‰‹ๆœบ้ƒฝ่ƒฝๆ„Ÿ่ง‰็š„ๅฎถ้‚ฃ่พน็š„ๅฐๅ‡‰้ฃŽ\n314504 0 ๆƒณ่ตทๅฎถ้‡Œๆฅผๆˆฟ้‚ฃๆŠŠxxๆฅผๅซxxB\n1000 Processed\n classify content\n315000 0 ๅ…ฌๅฎ‰ๅฑ€ๆ‰“็”ต่ฏๅš็คพไผšๆฒปๅฎ‰ๆปกๆ„ๅบฆ่ฐƒๆŸฅ\n315001 0 ่ฅฟ่—ไธๆ˜ฏไฝ ๅˆท้€ผๆ ผ็š„ๅทฅๅ…ทๅ’Œๆปก่ถณๅพๆœๆฌฒ็š„ๅฏน่ฑก\n315002 0 ๆ”ฟๅบœๆŠ•่ต„้กน็›ฎไนŸๅฐฑๅ ‚่€Œ็š‡ไน‹่ถŠๆฅ่ถŠๅคš\n315003 0 ๅญฆ่€…ๆž—ๅ–†็Š€ๅˆฉ่จ€ๅ่…่ดจๆœดๅพ…ไธ–็•Œ\n315004 0 ๅผ ็ขงๆ™จๆ˜ฏไธๆ˜ฏๅฅฝๅฃฐ้Ÿณๆœ€็ซ็š„\n1000 Processed\n classify content\n315500 0 
ไธบไธญๅฐๆŠ•่ต„่€…ๆไพ›ๆœ€้ ่ฐฑ็š„่ดทๆฌพๆ–นๅผ\n315501 0 YouTubeไฝœ่€…ๆ˜ฏGoogle\n315502 0 ๅ…ฌๅ…ฑๅœบๅˆๆŽฅๅปๆ˜ฏไธ€็ง่ฟๆณ•่กŒไธบ\n315503 0 ่ตตๆฐธ้ฃžๅ…ˆๅŽ่ฃ็ซ‹ไบŒ็ญ‰ๅŠŸ1ๆฌกใ€ไธ‰็ญ‰ๅŠŸ4ๆฌก\n315504 0 ๅ‰ๆž—็œ้ฃŸๅ“่ฏๅ“็›‘็ฃ็ฎก็†ๅฑ€ๅœจๅ…ถๅฎ˜ๆ–น็ฝ‘็ซ™็š„้ป‘ๆฆœไธญๅ…ฌๅธƒไบ†ไธ‰ๆ‰นๆฌกไธๅˆๆ ผ้ฃŸๅ“\n1000 Processed\n classify content\n316000 0 ๆžฃๅบ„ไธญ้™ขๅฌๅผ€ๅ…จๅธ‚ๆณ•้™ขไบบๆฐ‘้™ชๅฎกๅทฅไฝœ็ป้ชŒไบคๆตไผš\n316001 0 ็”ฑๅซๆ˜Ÿๅœจๅคฉ็ฉบ็ผ–็ป‡็š„ๅฎšไฝ็ฝ‘็ปœ\n316002 0 ๅ–็‚น๏ผšMACDๆญปๅ‰ไธ”่‚กไปท่ทŒ็ ด30ๅ‘จๆœŸ็บฟ\n316003 0 ๆ‰‹ๆœบๆ‘”ไบ†ไธคๆฌกๅฑๅน•ๆœไธ‹โ€ฆโ€ฆ่ฟ™ๆ‰‹ๆœบๅฟ…้กปไธ‰้˜ฒๅฆๅˆ™่ทŸ็€ๆˆ‘่ฟ™ไธช็ ดๅ็Ž‹ไธ็Ÿฅ้“่ƒฝ็”จๅคšไน…โ€ฆโ€ฆ\n316004 1 jE \u0003mfeeox.com/?f=xxx\u0001\u0003xxxxxxxxxxxๆ‚จๅฅฝ๏ผ่‹นๆžœxSๆ‰‹ๆœบๅช่ฆx...\n1000 Processed\n classify content\n316500 0 ๅพฎ่ฝฏ๏ผšWindows10ๅผ€ๆ”พๆ›ดๆ–ฐ้ฆ–ๆ—ฅ่ฃ…ๆœบ้‡่ถ…1400ไธ‡\n316501 0 ๆฑŸ้˜ดๆฃ€้ชŒๆฃ€็–ซๅฑ€ๅœจไธ€ๆ‰น็พŽๅ›ฝ่ฟ›ๅขƒๅคง่ฑ†ไธญๆˆช่Žทๆฃ€็–ซๆ€ง็—…ๅฎณๅคง่ฑ†ๅŒ—ๆ–น่ŒŽๆบƒ็–ก็—…่Œ\n316502 0 ๅŽไธบeSpaceUxxxx\n316503 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆท๏ผŒๆˆ‘่กŒxๆœˆxๆ—ฅ--xๆœˆxxๆŽจๅ‡บๅคšๆฌพ็†่ดขไบงๅ“๏ผšๆœŸ้™๏ผšxxๅคฉ--xxxๅคฉ๏ผŒๅˆฉ็އ๏ผšx.x...\n316504 0 ๅ›žๅฎถ็š„ๆ—ถๅ€™ๅฅฝไธๅฎนๆ˜“ๅฆ‚่ตถ้›†่ˆฌไบบ่ดด็€ไบบๆŒคไธŠไบ†่ฝฆๆˆ‘่ƒŒๅŽไธ€ไฝๅคงๅ”็ดง่ดด็€ๆˆ‘้š็€ๆฑฝ่ฝฆ็š„ๆ™ƒๅŠจไธไธ€ไผšๆˆ‘ๆ„Ÿๅˆฐไบ†...\n1000 Processed\n classify content\n317000 1 ไบฒ็ˆฑ็š„๏ผŒ็ปด็บณ่ดๆ‹‰ไธ‰ๅ…ซ่Š‚๏ผŒไธ€ๅนดๅฐฑไธ€ๆฌก็š„ๅคงๅž‹็›ดๆŽฅๅ‡้’ฑๅผ€ๅง‹ไบ†๏ผŒๅˆฐxๅท็ป“ๆŸไบ†๏ผŒ็ฅฅ่ฏขๅˆฐ่—ๅซ่ทฏ็ปด็บณ่ดๆ‹‰ๅ†…...\n317001 0 ๆญฃๅฆ‚้‚ฃๆทฎๅฎ‰ๅธ‚ๅ…ฌๅฎ‰ๅฑ€ๅšไฟก่ฎฟๅทฅไฝœ็š„่ญฆๅฏŸ\n317002 0 ๆœ‰ไบบ่ฏด่‚ก็ฅจๆŠ•่ต„ๆ˜ฏไธ€ไธชๆ€งๆ ผ็š„ๆ”น้€ ่ฟ‡็จ‹\n317003 0 ๆˆ‘ๅฐฑไธ่ฏด2ไธช่ทฏ็—ด้—ฎ่ทฏ็ป“ๆžœไปŽๅœฐ้“Aๅ‡บๅฃ่ฟ›Bๅ‡บๅฃๅ‡บๆฅไปฅๅŽๅ‘็Žฐ2ไธชๅ‡บๅฃๅฐฑๅ‡ ๅ็ฑณ่ท็ฆป\n317004 0 ๅฟไบ†่พฃไนˆๅคšๅคฉๅˆ˜้†’ไป–ไปฌไปŠๆ™šไธ€ๅฎšๅคงๆ‰“้ฃžๆœบไปฅๅฃฎๅ›ๅจๅฆ™ๅ•Š\n1000 Processed\n classify content\n317500 0 ็ซŸ่ขซๅฐๅท็ป™ๅคงๆ‘‡ๅคงๆ‘†็š„้กบ่ตฐไบ†\n317501 0 ไธ‹ๅˆๅŽปๅŒป้™ขๆŒ‚ไธชๆ€ฅ่ฏŠๅšไธชๅ–‰้•œ\n317502 0 ๆœฌ้—จไปŽ็งฆๅฒญ็ƒˆ็ซ็Ž„ๆญฆๅธฎๅคบๅพ—ๅผนๆŒ‡็ฅž้€šๆฎ‹็ซ ไธ‰\n317503 0 
ๅฅน็š„่ˆŒๅคดไปŽ10ไธชๆœˆ่ตทๅฐฑๆ€ปๆ˜ฏ่ฟ™ๆ ท\n317504 0 ๆž„็ญ‘ไบ†ไบงๅ“ไธŽไผ—ไธๅŒ็š„ๆ ธๅฟƒ็Žฉๆณ•่ฎพ่ฎก\n1000 Processed\n classify content\n318000 0 xxxxไธŠๅŠๅนดๅทฒ็ผฉๅ‡ไธบ+xx%\n318001 0 ๆŠ•่ต„ๅปบ่ฎพๆ— ้”กๆตทๅฒธๅŸŽ็š„ๆทฑๅœณๆตทๅฒธ้›†ๅ›ข\n318002 1 ๅœฐๆ ‡๏ผxๆกๅœฐ้“็บฟ็Žฏ็ป•๏ผŒๅธ‚ไธญๅฟƒ็จ€็ผบ็”Ÿๆ€ๅ•†ๅŠกๅ…ฌๅ›ญ๏ผ็›ฎๅ‰ๅœจๅ”ฎ้ข็งฏไธบ็จ€็ผบ็ฒพ่‡ดไบงๅ“ๆญฃๅผๅฏนๅค–ๅ‘ๅ”ฎ๏ผŒๆฌข่ฟŽๆ‚จ...\n318003 0 ๆฏๆฌกๅœจๅ…ฌไบคใ€ๅœฐ้“ไธŠ็ป™ๅนด็บชๅคง็š„ไบบ่ฎฉๅบง\n318004 0 ็”š่‡ณๆœ‰็š„14ใ€15ๅฒ็š„ไธญๅญฆ็”ŸไนŸไผšๅคฑ็œ \n1000 Processed\n classify content\n318500 0 ๅœจไน˜ๅ็”ตๆขฏๆ—ถไธ€ๅฎš่ฆ้ตๅฎˆ็”ตๆขฏๅฎ‰ๅ…จๆณจๆ„ไบ‹้กน\n318501 0 ไธ่ฟ‡่นฒๆฌง็พŽๅœˆๅ’Œ้ฅญๆ—ฅๅœˆ็š„ไบบ็œŸๆ˜ฏ้ฃŽๆ ผๅคง็›ธๅพ„ๅบญ\n318502 0 Naturie่–ไปๆฐดๅŒ–ๅฆ†ๆฐดๆฒกไป€ไนˆๅ‘ณ้“\n318503 0 ๅผ ๅฌๅฟ ๅฐ†ๅ†›่ฏดโ€œ้›ถไธ‹700ๅบฆไปฅไธŠ็š„็‰ฉไฝ“้ƒฝ่ƒฝ่ขซ่ง‚ๆต‹ๅˆฐ\n318504 0 ็œŸ็š„ๆ˜ฏไธ€็‚นๅ„ฟ็†่ดขๆฆ‚ๅฟต้ƒฝๆฒกๆœ‰\n1000 Processed\n classify content\n319000 0 com็Ž‹ๆท‘ๆ•ๅปบ่กŒๆฑŸ่ฅฟๅˆ†่กŒๅธๅท6217002080001336407็Ž‹ๆท‘ๆ•\n319001 0 ๅ…‰ๆ˜ฏmeasureๅ’Œdefinition็›ธๅ…ณ็š„็ปๅ…ธๆ–‡็Œฎๅฐฑๆœไบ†19็ฏ‡\n319002 0 xๆœˆxxๆ—ฅๅ‘ๅธƒไปฅๆฅๆƒณๅฟ…ๅพˆๅคš็”จๆˆท้ƒฝๅŠ ๅ…ฅๅˆฐ่ฟ™ๅœบ็ณป็ปŸๅ‡็บง็š„็››ๅฎดไธญ\n319003 0 ๅ€Ÿ่ดทๅฎๆŠ•ๅ…ฅ20ไบฟๅทจ่ต„ๅผบๅŠฟๆ‰“้€ P2P็†่ดขๅนณๅฐ\n319004 0 ไธ€่ƒ–ๅฐฑ่ƒ–่„ธๅ’Œ่‚šๅญๅˆฐๅบ•่ƒฝไธ่ƒฝ่กŒๅ•Š\n1000 Processed\n classify content\n319500 0 ๅคชไป“ๆญฆๆธฏ็ ๅคดๆœ‰้™ๅ…ฌๅธๅฟ—ๆ„ฟ่€…ๆœๅŠก้˜Ÿๆฅๅˆฐไบ†ๅคชไป“ๅธ‚ๅ›พไนฆ้ฆ†\n319501 0 ้ขๅฏนไป–ไปฌ็š„่ดจ็–‘่ฏดๅฟซๆœฌๆ˜ฏไธ€็พค็–ฏๅญ\n319502 0 ไบบ็š„้‡ๅฟƒไธไผšๅƒ้ฃžๆœบไธ€ๆ ท็ช้™\n319503 1 ๅฐŠๆ•ฌ็š„้ฆจๆ„ๅฅณ่ดตๅฎพ:ไธบๅบ†็ฅxxๅฆ‡ๅฅณ่Š‚๏ผ็‰นๆŽจๅ‡บๅช้œ€่ฆxxๅ…ƒ็ซ‹ๅˆปๅฏไปฅไฝ“้ชŒไปทๅ€ผxxxๅ…ƒ็š„้ข้ƒจ็މ็ŸณๆŽ’ๆฏ’...\n319504 0 ๆœบๅฐพๅŽ้ƒจ็š„Vๅž‹็ป“ๆž„ๆœ‰ๅˆฉไบŽๆฐด้ข่ˆช่กŒ็จณๅฎšๆ€ง\n1000 Processed\n classify content\n320000 0 ๆˆ‘ๅทฒ็ปๅšๅฅฝไบ†่ฟŽๆŽฅๅ—ไบฌ็ƒญๆตช็š„ๅ‡†ๅค‡ไบ†\n320001 0 ่ฎฉๅผ ๅฎถ็•Œ\"ๅŽ่Šฑๅ›ญ\"ไน‹็งฐ็ฅžๅฅ‡็พŽๆ™ฏ็”ฑๅน•ๅŽ่ตฐ่‡ณๅ‰ๅฐ\n320002 0 ่ฏดๅฅฝ็š„้˜ฟ้‡Œ้ƒฝไธ็Ÿฅ้“ไฝ•ๆ—ถๆ‰่ƒฝๆˆ่กŒ\n320003 0 ๆทฎๅฎ‰ๅธ‚ไพ›็”ตๅ…ฌๅธๅ‘˜ๅทฅๅ†’็€้…ทๆš‘\n320004 0 ๅ› 
ๆญคๅฐ†ๅ“ˆๅผ—H8็š„ไธŠๅธ‚ๆ—ถ้—ดๆŽจ่ฟŸไธ‰ไธชๆœˆ\n1000 Processed\n classify content\n320500 0 ่ฐƒๆŸฅๅฏน่ฑกๅฐ†ๆถ‰ๅŠไธน้œžๅฑฑ5A็บงๆ—…ๆธธๆ™ฏๅŒบ็š„้ฃŸไฝ่กŒๆธธ่ดญๅจฑ็ญ‰\n320501 0 ็ป™ๅคงๅฎถๅˆ†ไบซโ€œzhiๅญhuaๅผ€โ€\n320502 0 ไบŒๅไบ”ๅฒๆœฌๅ‘ฝๅนดๅŽ้—็—‡้€š้€š่ฟ‡ๅŽป\n320503 0 ๅ‘้€ๆˆๅŠŸๅŽไธ€่ˆฌไผšๅœจ48ๅฐๆ—ถๅ†…็ป™ๆ‚จ็ญ”ๅค่ฟ™ๆ ท็š„้‚ฎไปถ\n320504 0 ๅŽ‚้•ฟS5็š„ๅฅ–ๆฏๅœจ็ญ‰ไฝ ๆŠŠๅฎƒๆงๅ›žไธญๅ›ฝ\n1000 Processed\n classify content\n321000 0 ็‡•้บฆ้ป‘้บฆๆฝœ่‰‡ๅŒ…ๆ˜จๆ™šx็‚นๅคšๆ‰ๆ“\n321001 0 ่‹นๆžœๆญฃๅœจๆ‹›ๅ‹Ÿๆฑฝ่ฝฆ่กŒไธšๆŠ€ๆœฏๅ’Œ่ฎพ่ฎกๅทฅไฝœ่€…\n321002 0 ๆ‰‹ๆœบไธŠ็ฝ‘40ๅˆ†้’Ÿ็ฝ‘่ดน่พพ1000ๅ…ƒ\n321003 0 ๅ…จๅœฐๅŒบไธค็บงๆณ•้™ขๅ…ฑไธŠๆŠฅๅคฑไฟกๅๅ•531ไปถ\n321004 0 ๅทฒๅฎŒๆˆ3200ๆ ชๅคๅ†ฌๆžฃๆ ‘ๆ ‘ๅ† ๆ ‘ๆ†ๆ›ดๆ–ฐๅŠๅซๆŽฅ\n1000 Processed\n classify content\n321500 0 ๆตŽๅท่ทฏ่ฝฌๆณฐๅทžๅคง้“ๆณฐๅทžๅคง้“้€šG2\n321501 1 ้Ÿฉๅ›ฝ็พŽไฝณ็ฅๆ‚จๅœจๆ–ฐ็š„ไธ€ๅนด้‡Œ่บซไฝ“ๅฅๅบท๏ผŒๆ›ดๅŠ ๅนด่ฝป๏ผŒๆผ‚ไบฎ๏ผx.x่Š‚ไผ˜ๆƒ ๆดปๅŠจๅทฒ็ปๅผ€ๅง‹ๅ•ฆ๏ผ่ฏฆ่ฏขxxxxx...\n321502 1 xๆœˆxxๆ—ฅๆ–นๅคชๆ——่ˆฐๅบ—็››่ฃ…ๅผ€ไธš๏ผŒๅ‡ญxxๅ…ƒๅขžๅ€ผๅกๆœ€้ซ˜ๆŠต็Žฐxxxxๅ…ƒใ€‚่ฎขๅ•ๆœ‰็คผ่ฟ˜่ฆ้ข†็Žฐ้‡‘็บขๅŒ…ใ€‚็ ธ้‡‘...\n321503 0 ๆธฏ่‹ฑๆฎ–ๆฐ‘ๆ”ฟๅบœ้ฆ–่ฎพๆฐ‘ๆ”ฟๅฑ€็š„ๅ‰่บซๆฐ‘ๆ”ฟๅธๆ—ถ\n321504 1 xๅ‡xxใ€‚ xใ€็กฌ้‡‘ใ€ๅƒ่ถณ้‡‘้•ถๅตŒ็ฑปๆฏๆปกxxxๅ‡xxๅ—ใ€‚ xใ€้’ป็Ÿณ็‰นไปทๅ†xๆŠ˜๏ผŒๆญฃไปทx.xๆŠ˜ใ€‚ ...\n1000 Processed\n classify content\n322000 0 ๅŒๆ ทไนŸๆ˜ฏไธ€ๅœบ็œŸๅฎž็š„ๆŠ—็™Œ็ปๅކ\n322001 1 ๆ‚จๅฅฝๆˆ‘ๆ˜ฏๆ•™็ปƒๅ‘˜ไป˜ๅ…ˆ่މ๏ผŒไผ˜ๆƒ xๅทๆˆชๆญข๏ผŒๆœ›ไบˆๅญฆ่€…ไปŽ้€Ÿ๏ผŒๅฐ่ฝฆxxxxๅ…ƒๅคง่ฝฆxxxxๅ…ƒ๏ผŒๆŠฅๅไธ€ๅฎšๆ‰พไฟบ...\n322002 0 ๅˆ›้€ ๅ‡บไธ€ไธช้€‚็”จไบŽๆ‰‹ๆœบ็”จๆˆท็š„ๆธธๆˆไบงๅ“\n322003 0 ็œ‹ๅˆฐ็ปˆ่บซ็–พ็—…ไธๅฏๆฒปๆ„ˆๆ—ถๅฟƒๆƒ…ๆžๅบฆไฝŽ่ฝ\n322004 0 ๆ‰‹ๆœบSIMๅกๆ— ๆ•ˆ็š„ๅŠžๆณ•โ†“โ†“โ†“\n1000 Processed\n classify content\n322500 0 ่ฟ˜ๆœ‰ๅคงๆฆ‚5ๅ‘จ็š„ๆ—ถ้—ดๅฐฑ่ฆ่ฟŽๆŽฅๆ–ฐ็”Ÿๅ‘ฝๅ•ฆ\n322501 0 ๆœ€็ปˆๅธฆ้ข†็ƒ้˜Ÿๅœจไธปๅœบไปฅ100๏ผš86ๆˆ˜่ƒœๆณข็‰นๅ…ฐๅผ€ๆ‹“่€…\n322502 0 ๅฅฝๅฃฐ้Ÿณไธๆ˜ฏๅ›ž้”…่‚‰ๅฐฑๆ˜ฏๅ…ณ็ณปๆˆท\n322503 0 ไนŸๆœŸๅพ…ๆˆ‘ไปฌๅŒไบ‹ๅ’Œๅ…ณๅฟƒๆƒ ๆ™ฎ็š„ๆœ‹ๅ‹่ƒฝๅ’Œๆƒ ๆ™ฎไธ€ๆ ท\n322504 0 ๆงฝ่ฟ™ไธ‰ๅŽŸๅˆ™ไธๅฐฑๆ˜ฏๆœบๅ™จไบบไธ‰ๅŽŸๅˆ™ๅ—\n1000 
Processed\n classify content\n323000 0 ๅœจๆกฅ้•ฟ็บฆx/xๅค„ไธบVๅญ—ๅฝขๆœ€ไฝŽ็ซฏ\n323001 0 1988ๅนดๅ•†ๅŸŽๅŽฟๆ”ฟๅบœๅฐ†ๆญคๅˆ—ไธบๅŽฟ็บงๆ–‡็‰ฉไฟๆŠคๅ•ไฝ\n323002 0 ๆธฉๅทž็š„ไน ไฟ—ๅ››ไธชๆœˆๅ››ๅคฉ่ฆ็ฉฟ้‡‘ๆˆด้“ถๅ“ˆๅ“ˆ่ดช่ดขๅฎๅฎ็ก่ง‰้ƒฝ็ฌ‘่ตทๆฅ\n323003 1 ๅฅฝๆถˆๆฏ!ๆ‚จๆˆไธบๅปบ่กŒ็š„ไผ˜่ดจๅฎขๆˆท\n323004 0 ็”ฑๆต™ๆฑŸๅทฅไธšๅคงๅญฆ่ฏๅญฆ้™ข็š„ๅŒๅญฆไปฌ็ป„ๆˆ็š„โ€œๆ—ง่ฏๅœจไธ€่ตทโ€ๆš‘ๆœŸ็คพไผšๅฎž่ทตๅฐๅˆ†้˜Ÿ\n1000 Processed\n classify content\n323500 0 ๅธธๅทžๅ…ฌๅฎ‰ๆถˆ้˜ฒๆŒ‡ๆŒฅไธญๅฟƒๆŽฅๅˆฐ5่ตทๅ…ณไบŽๆบบๆฐด็š„ๆŠฅ่ญฆๆฑ‚ๅŠฉ\n323501 0 ๅ†็ญ‰ไธ€ๆฎตๅผ€ๅ‘ๅ•†ๅŠžๆˆฟไบง่ฏๆ—ถ่ฏฅ้ข็ญพไบ†\n323502 1 ๆ‚จๅฅฝ๏ผ๏ผๅ…ฌๅ…ƒไน้‡Œไธ€ๅฑ‚ไธ€ๆˆท==ๆฅผ็Ž‹ๅ››้ข่ง‚ๆ™ฏ๏ผŒๆฐด็ณป็Žฏ็ป•๏ผŒx/x๏ผŒ้ข็งฏxxxๅนณไฝŽไบŽๅธ‚ๅœบไปทxxxไธ‡๏ผŒ...\n323503 0 โ€2015็ซๅŠ›ๅ…จๅผ€็š„้ญๆ™จ็ปˆไบŽ่ฆ็Žฐๅœบ็Žฉ่ฝฌๅธฝๅญๆˆๆณ•\n323504 0 ่ฟ™ไธชไบงๅ“ๅŒป็”จๅๅซๅทฆๆ—‹่šไนณ้…ธ่ฝฏ็ป„็ป‡ๅกซๅ……ๆๆ–™\n1000 Processed\n classify content\n324000 0 ็ป™ไฝ ๅšไบ†ๅŠ ๆฒน็š„ๅŠจไฝœไฝ ๅดไปฅไธบๆ˜ฏๅ†่งๅฏนๆˆ‘ๆŒฅๆŒฅๆ‰‹\n324001 0 ๆŽจ่้›ช็ƒ|ไฝ ้€‚ๅˆๅšใ€Žๅˆ†็บงๅŸบ้‡‘ใ€ๅฅ—ๅˆฉๅ—\n324002 0 ๆฏๅคฉๅพ€่ฟ”ๆˆฟๅฑฑๅŒบ็š„835ๅฟซ่ฝฆ\n324003 0 ไฝ†ๆ˜ฏ็Žฐๅœจๆˆ‘่ฆ้‡ๆ–ฐไธ‹่ฝฝ็™พๅบฆ้Ÿณไนไบ†\n324004 0 ๆฏๆฌก็š„ๅ‡บ่กŒๆœ‹ๅ‹ๅ‡บๅทฎ้ฃžๆœบ้…’ๅบ—้ƒฝๆ˜ฏๅœจWVๅŽๅฐๆฏ”ๅค–้ขๆ–นไพฟๅฎžๆƒ ๅคšไบ†\n1000 Processed\n classify content\n324500 0 ๅๅ‡บ็งŸ่ฝฆ็š„ๆ—ถๅ€™ๆˆ‘ไธ€ไธ‹ๆฒกๅๅบ”่ฟ‡ๆฅๅพ€ๅ†…ๅœฐๅ‰ฏ้ฉพ้ฉถ่ตฐๅŽปไบ†\n324501 0 2015ๅนด็ฌฌไธ‰ๅฑŠ่ดขๅฏŒๆ–ฐ็ปๆตŽๅ…จ็ƒ่ฎบๅ›โ€œไธ€ๅธฆไธ€่ทฏโ€ๆพณๆดฒๅณฐไผš\n324502 0 ไป€ไนˆๆ—ถๅ€™ๆ‰่ƒฝ็ญ‰ๅˆฐ่Šฑๅƒ้ชจๅ˜ๅฆ–็ฅžๅ•Š\n324503 0 ใ€Žๅ—ไบฌๅฎ˜ๅ‘˜่‡ช็งฐๆˆ‘ๆ˜ฏๅค„้•ฟๆˆ‘ๆœ‰้’ฑใ€\n324504 0 innisfree็ปฟ่Œถ/็ซๅฑฑๆณฅ/ๆฉ„ๆฆ„ๆด—้ขๅฅถ\n1000 Processed\n classify content\n325000 0 ๅผ•่ฟ›ไบ†ไธŽxSๅบ—ๅŒๆญฅๅ‡็บง็š„ๅŽŸๅŽ‚ๆฃ€ๆต‹่ฎพๅค‡\n325001 0 ๅ’จ่ฏข็”ต่ฏ๏ผš15090322533\n325002 0 ็œ‹Victor่ฐขๅฟ—ๆตฉ่€ๅธˆ็š„๏ฝž\n325003 0 erm?chteeinDeutschlehrerwerden\n325004 0 xxxxไธŠๅŠๅนดๆณฐๅทžๅ•†ๅ“ๆˆฟๆˆไบคxxxxๅฅ—\n1000 Processed\n classify content\n325500 0 ๆœบๆ™บ็š„็™พๅบฆๆ€Žๆ ทๆ’•ไธ็–ผ??็ป“ๆžœๅคงๅฎถ้ƒฝ่ฏดๆฒกๅŠžๆณ•\n325501 0 
ๅˆšๆ‰ๆœ‰ไบบๆ‹ฟๅ›พ้ป‘่ฃ่€€7ๅ’Œ้บ’้บŸ935\n325502 0 ๆˆ่ฟ‘50ๅนดๆฅๅŒๆœŸ็™ป้™†ๆต™ๆฑŸ็š„ๆœ€ๅผบๅฐ้ฃŽ\n325503 0 ๆ˜†ๅฑฑๆ–‡ๅŒ–่‰บๆœฏไธญๅฟƒๅฝฑ่ง†ไธญๅฟƒๅŠๅธ‚ๆฐ‘ๆ–‡ๅŒ–ๅนฟๅœบๅฝฑ้™ข็ป่ฅ็ฎก็†ๅ•ไฝๅพ้€‰ๅ…ฌๅ‘Š\n325504 0 ๅฟ˜ไบ†ไฝ ่ฐˆไฝ•ๅฎนๆ˜“ๆ‰‹ๆœบๅช่ฆๅ“ไบ†ๅฐฑไปฅไธบๆ˜ฏไฝ \n1000 Processed\n classify content\n326000 1 ็ฝ‘ๅ€xxxxxx\n326001 0 ็Ž‹่Šณ่ฏดๅฅนๅ’Œๆœบๅœบ็š„ๅˆ˜ๆ˜Š่ฟ˜ๆ˜ฏๅฅนๅ’Œ่ญฆๅฏŸ็š„้™ถๅฎฝๆ€ไบบๅฟๅ‘ฝๆญปๅˆ‘\n326002 0 ๆ›พ็ป่ฟ˜ๅœจๆ”ฟๅบœๅผบๅŽ‹ไธ‹่ขซ่ฟซ็š„่ขซ้‚ชๆ•™ไบบๅ‘˜็š„ๅœจๅคงๅบญๅนฟไผ—ไน‹ไธ‹้ช‚ไบ†\n326003 0 ไพๆ—งไปฅ่ดฉๅ–ไบบๅฃๅ’Œๅผบๅฅธๆœชๆˆๅนดๅฐ‘ๅฅณๅฎš็ฝชไบ†\n326004 0 ่™ฝ็„ถๅˆฉไฝ›ๆ‘ฉๅฐ”ๅœจxxxxๅนดๅš็ฉบ่‚กๅธ‚ไธ€ๆˆ˜ๆˆๅ\n1000 Processed\n classify content\n326500 0 ่ฝฆ็‰Œๅทไธบๆ™‹B****ๅฎžๆ–ฝ่ถ…้€Ÿ่ฟๆณ•่กŒไธบ\n326501 0 xxๅฒ็”ทๅญๆŽๆŸๆฐไธŽxxๅฒๅฅณๅญ็Ž‹ๆŸ็ตๆญคๅ‰ๅœจ็ฝ‘ไธŠ่Šๅคฉๅ‘็”Ÿไบ‰ๅต\n326502 0 ๅๅนดๅŽๅ‘่กจๅœจJCOไธŠ็š„่ฟ‘10ๅนด็ ”็ฉถๅ†ๆฌก่ฏๅฎžไบ†่ฟ™ไธ€็›Šๅค„\n326503 0 ๅŽŸๆฅไธญ้˜Ÿ30ๅฒ็š„ๆ‰งๆณ•้˜Ÿๅ‘˜ๅถๅพๆŒบไน‹ๅ‰ไธ€ไธช็Œ›ๅญๆ‰Ž่ฟ›ๆฐด้‡Œ\n326504 0 ่ฟ™ๆ˜ฏๅพๅทžๅทฅ็จ‹ๅญฆ้™ข็†ๅทฅ็ง‘ๆ•™ๅธˆๆœฑๆทไธบๅคงๅ››ๅญฆ็”ŸไธŠๆผ”็š„่„ฑๅฃ็ง€\n1000 Processed\n classify content\n327000 0 ๅžƒๅœพ่ฝฆ่ฝฆ็‰Œๅท็ ๆ˜ฏ่‹D17921\n327001 0 ๆœˆ้ฃŸๆŠ•ๅฅ”ๅฟซ่ˆนไน‹ๅŽ็ซ็ฎญ็š„้˜ตๅฎนๅ่€Œ็œ‹็š„ๆ›ดๆธ…ๆฅšไธ€ไบ›ไบ†\n327002 1 ๅพทไฟกๆณŠๆž—ๅ…ฌ้ฆ†ๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆตท็›้ฆ–ๅธญๅพทๅ›ฝๅ“่ดจไฝๅŒบๅณๅฐ†ๅˆฐๆฅ๏ผŒๆ•ฌ่ฏทๆœŸๅพ…๏ผๅœฐๅ€๏ผšๆžฃๅ›ญ่ทฏไธŽๅŸŽไธœ่ทฏไบคๅ‰ๅฃ...\n327003 0 ๅฝ“ๆˆ‘ๅผ€ๅง‹่ดจ็–‘ไธ€ไปถไบ‹ๆƒ…็š„ๆ—ถๅ€™\n327004 0 ๅœจ้“œ้™ต่ทฏๅŸŽ็ฎกไธญ้˜Ÿ็š„ๅ…ฑๅŒๅŠชๅŠ›ไธ‹\n1000 Processed\n classify content\n327500 0 ็”ฑ่ฎพ่ฎกๅธˆKazumiAbe่ฎพ่ฎก็š„Forestๅฐ็ฏ\n327501 0 ๅนถๅฏนๆตๅŠจๅ•†่ดฉๅŠไนฑๅœ่ฝฆ็Žฐ่ฑก่ฟ›่กŒๅŠๅฏผ็–้€š\n327502 0 ๆˆ‘ไบบๅฅนๅฆˆๅœจๆต™ๆฑŸไฝ ็ป™ๆˆ‘ๆŽจ้€ๅ…ฐๅทž็š„ๆœฌๅœฐๆถˆๆฏ\n327503 0 ๆˆ‘ๅˆšๅˆšๅœจๅคฉ็Œซ่ฏ•็”จไธญๅฟƒๅ…่ดน็”ณ่ฏทไบ†็ฆ็ปดๅ…‹VR200ๆ‰ซๅœฐๆœบๅ™จไบบ\n327504 0 5ใ€ๅŒ—ไบฌ้ฆ–้ƒฝๆ—…ๆธธ้›†ๅ›ขๆœ‰้™่ดฃไปปๅ…ฌๅธ\n1000 Processed\n classify content\n328000 1 ๆ—  ๆŠต ๆŠผ ไฟก ็”จ ่ดท ๆฌพ ๏ผŒๆ‰‹็ปญ็ฎ€ๅ•๏ผŒไธๆ”ถๆ‰‹็ปญ่ดน๏ผŒๆ€ฅ็”จ้’ฑ่ฏท่”็ณปๅขๅฐๅงxxxx...\n328001 0 
ไบ”็›ธๅฃฐๆจ้˜ณๅคงๅฒšๅญๅญฆๅ››็œ\n328002 0 Gregoryๆ ผ้‡Œ้ซ˜ๅˆฉTrailblazerๆ—ถๅฐšๅŒ่‚ฉ่ƒŒๅŒ…$xx\n328003 0 ่€Œๅ‘่ฎฐ่€…็ˆ†ๆ–™็š„ไธšๅ†…ไบบๅฃซๆŒ‡ๅ‡บ\n328004 0 ๅŒ—ไบฌ็š„ๅœฐ้“็ฎก็†ๆ–นๅญ˜ๅœจๆ˜Žๆ˜พ้—ฎ้ข˜\n1000 Processed\n classify content\n328500 0 ๅฅฝๆƒณๅŽป่‹ๅทžไนๅ›ญ็ƒญไนŸๆƒณๅŽป\n328501 0 ๅชๅฌๅˆฐไธ€ๅฅๆญŒ่ฏโ€œbeautifuloneformeโ€\n328502 0 ๅธๅผ•ๆŠ•่ต„ไธป่ฆ้ข†ๅŸŸไธบๆ—…ๆธธใ€ๅทฅไธšใ€ๆœๅŠกไธšๅ’Œๅ†œไธš\n328503 0 ๅŸŽ็ฎกๆ‰“ไบบ้—นไบ‹ๆจช่›ฎๆ— ็†ๅผ•่ตทๆฐ‘ๆ„ค\n328504 0 ๆœ‰ไธ€ๆฎตๆ—ถ้—ดๅฌๅˆฐ้ฃžๆœบๅฃฐๅฐฑๆ€•โ€ฆ\n1000 Processed\n classify content\n329000 0 ๅดๆฑŸๅŒบๅ…ฌๅฎ‰ๅฑ€ๆพ้™ตๆดพๅ‡บๆ‰€็ป„็ป‡่พ–ๅŒบๅ„ฟ็ซฅๅŠๅฎถ้•ฟ50ไฝ™ไบบ่ตฐ่ฟ›็‰นๅ‹คๅคง้˜Ÿใ€ๅ…ฌๆฐ‘่ญฆๆ กๅ’Œๆพ้™ตๆ‰€่ญฆๆฐ‘ๅ…ฑๅปบ้•ฟๅปŠ\n329001 0 ไปŠๅคฉๅฅฝๅฃฐ้Ÿณ้‚ฃไธชๅฅณ็š„ๅ“ๆญปๆˆ‘ไบ†\n329002 0 ๅ…่ดน็š„ๅทดๅฃซๆœๅŠกๅฐ†ไปŽๆ˜ŸๆœŸๆ—ฅ็ฌฌไธ€็ญ่ฝฆๅผ€ๅง‹\n329003 0 K่”่ต›็›ฎๅ‰ๆœ€ๅผบ็š„ๅ…จๅŒ—็Žฐไปฃไธ€ๅนด็š„่ฟ่ฅ่ดน็”จๆ‰300ไบฟ้Ÿฉๅ…ƒ\n329004 0 ๅŒบๆ”ฟๅบœ้—จๅ‰ไธ€็พค็ปฟ่‰ฒ็š„่บซๅฝฑไพ็„ถๅœจๅฟ™็ขŒ\n1000 Processed\n classify content\n329500 0 ๅฆ‚ไฝ•้€š่ฟ‡Skypeforbusinessใ€Cortana็ญ‰ๅบ”็”จๅ’ŒๅŠŸ่ƒฝ\n329501 0 ไฟฎๅปบๆ•ฐๆก็ดข้“ๆˆ–็”ตๆขฏ็›ด่พพๅฑฑ้กถ\n329502 0 ๆ‹ฅๆœ‰่ถ…่ฟ‡95%็š„ไธ€็บฟๅฅขไพˆๅ“็‰Œๅ—ไบฌๅพทๅŸบๅนฟๅœบ\n329503 0 ไธญๅฑฑๅธ‚็ฌฌไบŒไบบๆฐ‘ๆณ•้™ข็ปๅฎก็†่ฎคไธบ\n329504 0 ๆˆ‘ไธ€่ˆฌๅชๆถ‚30ๅ€็š„้š”็ฆปไธๅ†ๆถ‚้˜ฒๆ™’\n1000 Processed\n classify content\n330000 1 ๆ™š่พ…็ญใ€ๅค–่€ƒ็ญใ€็ซž่ต›็ญxๆœˆxๆ—ฅๅผ€ๅง‹ไธŠ่ฏพไบ†๏ผŒๆฌข่ฟŽไปฅ่€ๅธฆๆ–ฐๅ…ˆ่ฏ•ๅฌๆปกๆ„ๅ†ๆŠฅๅ๏ผ็ฅ่ดบxxไฝๅ‚่€ƒๅญฆๅ‘˜ๅ…จ...\n330001 0 ไฝ†้€™ๆฌกๆญ้ƒต่ผชๅพžVictoriaๅˆฐVancouver\n330002 0 ๅ˜‰ๅ…ดๅ…ญ้‡Œๅ’Œ่‹ๅทžๅนณๆฑŸ่ก—็š„้ฃŽๆ™ฏ\n330003 0 ๆ˜จๅคฉ่ขซ่ญฆๅฏŸๆŒ‰ไปŠๅคฉๆ‰พ่ญฆๅฏŸๅธฎๅฟ™ๅ‰ๅคฉ่ฟฝ่ญฆ่ฝฆ่ท‘่ฟ™ๅ‡ ๅคฉๅ’Œ่ญฆๅฏŸ้ƒฝๅฅฝๆœ‰็ผ˜ๅˆ†ๅ•Š่ฟ˜ๆœ‰ๅœจ็พŽๅ›ฝ้”ป็‚ผ็š„้‡ๅˆฐไบ‹ๅ„ฟ้ƒฝ็‰นๆทกๅฎš\n330004 1 ๆ™ฏๅฎ‰ๅนผๅ„ฟๅ›ญๆ˜ฅๅญฃๆ‹›็”ŸๆŠฅๅๅผ€ๅง‹ๅ•ฆ๏ผๆฌข่ฟŽๅฐๆœ‹ๅ‹ไปฌๆฅๅ›ญไฝ“้ชŒ๏ผๆญฃๆœˆๅไนๆญฃๅผๅผ€ๅญฆ๏ผๅ’จ่ฏข็”ต่ฏ๏ผšxxxxxx...\n1000 Processed\n classify content\n330500 0 ไธ€่ถŸ่ถŸ่ท‘ๅพ€ๅŒ—ไบฌ็š„ๅŒป้™ข่ท‘็ซ่ฝฆไธŠ็ซ™ไธ‰ไธชๅคšๅฐๆ—ถๅๅœฐ้“่ตฐ่ทฏๆŠ˜่…พไธ€ๅคฉ\n330501 0 
็Žฉๆณจๅ†Œ่กจๅทฎ็‚นๆŠŠ็”ต่„‘็Žฉๅฅ”ๆบƒไบ†่ฟ˜ๅฅฝไฟฎๅฅฝไบ†ไธ็„ถๆˆ‘ๅฐฑๅฏไปฅๆๅ‰ๆข็”ต่„‘ไบ†\n330502 0 ไนฐไธญๅ›ฝไบบๅฏฟๆ˜ฏ็œ‹ๅˆฐๆ˜จๅคฉๆœ‰ๅคง้‡่ต„้‡‘ๅœจ้ซ˜ไฝ่ขซๅฅ—ไบ†\n330503 0 ไธ€ๅฐไผ™็”จ2000ๅ…ƒๅ…ถไธญ็š„1500ๅ…ƒ็ป™MMไนฐไบ†้ƒจๆ‰‹ๆœบ\n330504 0 2011โ€œ่งๅฑฑๆ™ฎไน่ฟช่ก—่ˆžๅคง่ต›็‰น้‚€่กจๆผ”ๅ˜‰ๅฎพ\n1000 Processed\n classify content\n331000 0 ๅผ ๅฎถๅฃๅผ ๅฎถ็•Œๅผ ๅฎถๆธฏโ€ฆๅ‚ปๅ‚ปๅˆ†ไธๆธ…ๆฅš\n331001 0 ๆˆ‘ๆญฃๅœจๅฌๆณ ้ธขyousa็š„ๆญŒๆ›ฒๆœˆๅ…‰ๆถฆ่‰ฒๅฅณๅญฉ\n331002 0 ๆ่€ๆ—๏ผšๅฎณๆ€•่‚Œ่‚คๆพๅผ›ใ€็ป†็บนๆ‰พไธŠ้—จ\n331003 0 xใ€ๆŠฅๅๆˆชๆญขๆ—ถ้—ด๏ผšxxxxๅนดxๆœˆxxๆ—ฅ\n331004 0 ็พŽไธฝ็š„showgirlๆ— ็–‘ๆ˜ฏไธ€้“ๆœ€้“ไธฝ็š„้ฃŽๆ™ฏ็บฟ\n1000 Processed\n classify content\n331500 0 ็œ‹็Ž‹็ฅ–่“ๆผ”็ปŽๅฎŒ็พŽๅœฐๆฟ\n331501 0 ไธ€ๆฏไธๅคช่‹ฆ็š„ๅ’–ๅ•กไธ€้ƒจไธๅคชๆ–ฐ็š„ๆ‰‹ๆœบไธ€ๆ”ฏไธๅคชๅฅฝๅฌ็š„ๆญŒไธ€ไธชไธๅคชๅ‚ป็š„ไฝ \n331502 0 ้ฃžๆœบ็ž„็€ๅ…‰็บฟๅฐฑไปŽ่„ธไธŠๅ‘ผๅ•ธ้ฃž่ฟ‡\n331503 0 ็กๅœจ้ฃžๆœบไธŠๅฐฑๆ„Ÿ่ง‰็กๅœจไธ€ไธชๅฐ้—ญ็š„ๆŠฝ้ฃŽๆกถ\n331504 0 ๆพไธ‹็”ตๅ™จpanasoNic\n1000 Processed\n classify content\n332000 0 IBM่‹ๆ€ป็›‘ๅฏน่ฟ‘ๅนดๆฅ้‡่ฆ้กน็›ฎ็š„ๆขณ็†ๆ˜ฏๅœจ้ผ“ๅŠฑๅŽปโ€œๅšๅคงไบ‹โ€\n332001 0 ไปฅ็บฆxxไบฟ็พŽๅ…ƒ็š„ไปทๆ ผๆ”ถ่ดญ็™ฝๅฑฑไฟ้™ฉๆ——ไธ‹็š„ๅ†ไฟ้™ฉๅ…ฌๅธ\n332002 0 ไฝณ้“ๆณ•้™ขโ€œๆณ•ๅพ‹่ฟ›ไผไธšใ€่ฟ›ๅญฆๆ กโ€ๆดปๅŠจๅฐ†ๅˆฐ็ฌฌไบŒ็ซ™โ€”โ€”ไฝณๆœจๆ–ฏ้“่ทฏๅทฅๅŠกๆฎต\n332003 0 ๆถ‰ๅซŒ่ฟ็บช็š„ๅ…šๅ‘˜่ƒฝๅคŸ้…ๅˆ่ฐƒๆŸฅๅทฅไฝœ\n332004 0 ๆœ‰ๆฒกๆœ‰็Ÿฅ้“้•‡ๆฑŸๅ“ช้‡Œๆœ‰็ปดไฟฎๅ†ฐ็ฎฑ็š„\n1000 Processed\n classify content\n332500 0 ้Ÿฉๆ–‡็‰ˆ??????้€12่‰ฒๅฝฉ้“…\n332501 0 ่กฅๅŠžๆ—ถๅŽปๆดพๅ‡บๆ‰€ๅผ€ๅ…ทไธขๅคฑ่ฏๆ˜Ž\n332502 1 ๅทด้ปŽๆฌง่Žฑ้›…๏ผšไธœๅคงๆ—ฅๅŒ–โ€œไธœ้ฃŽๅบ—โ€๏ผˆไธœ้ฃŽๆกฅๅŒ—ไบฌๅŽ่”ๆ—่พน๏ผ‰๏ผŒxๆœˆxๆ—ฅ-xๆœˆxxๆ—ฅ่ดญๆฌง่Žฑ้›…ๆŠค่‚คไบงๅ“ไปป...\n332503 1 ็ˆฑ็‰น่ฃ…้ฅฐยท่ฏš้‚€ยทๆจๅ…‰ๆธ…ยทไธšไธปๅ‚ๅŠ ็ˆฑ็‰นๅนดๅˆๆ•ด่ฃ…่šๆƒ ใ€‚ ๆดปๅŠจ๏ผšxๆœˆxๆ—ฅ-xๆ—ฅ ๅœฐ็‚น๏ผšๆฑŸๅŒ—ๅŒบๅคงๆตชๆท˜...\n332504 0 ็ป†็ช„็š„ๅ‰ๅคง็ฏๅ’Œ้‡ๆ–ฐ่ฎพ่ฎก็š„ๅฐพ็ฏ็ป„\n1000 Processed\n classify content\n333000 0 ไปŽๅพๅฎฟๆทฎ็›้“่ทฏ็›ๅŸŽๆฎตๅปบ่ฎพๆŒ‡ๆŒฅ้ƒจๆˆๅ‘˜ไผš่ฎฎไธŠ่Žทๆ‚‰\n333001 0 
ๅพกๆณฅๅŠ่–ฐ่กฃ่‰็Ÿฟ็‰ฉ่š•ไธ้ข่†œ14็‰‡่กฅๆฐดไฟๆนฟ่ˆ’็ผ“ไฟฎๆŠคๆŠค่‚คๅ“้ข่†œ่ดด\n333002 0 ้‡ๅ…ณ็ณป่ฝปๆฐ‘ๆ„3ๅ„’ๅฎถๅ ้ข†ๆ„่ฏ†้ข†ๅŸŸ\n333003 0 ๅŒป้™ข็ฝ‘็ปœ่ฅ้”€ๅœจๅŒป็–—่กŒไธš็š„็ฝ‘็ปœ่ฅ้”€ไธญๅบ”่ฏฅๆ˜ฏ็›ฎๅ‰็š„ไธ€ไธช\n333004 0 ่‡ณๅฐ‘20็ง’๏ฝžไธ€ไธช่ฝฆ้“้ƒฝ่ขซๆ‹ฆไฝไบ†\n1000 Processed\n classify content\n333500 0 ๅทจ่ดชใ€ๅทจๅทจ่ดชๅฐฑๅƒๅˆ˜็–ฏๅญ้‚ฃๆ ท็š„้ƒฝๆฒกๆฏ™ๆމ\n333501 0 ่ฎฉๆ‰‹ๆœบ้‡Œ็š„่ต„ๆ–™ๅ’Œๅ›พ็‰‡ๅ…จ้ƒจๆ ผๅผๅŒ–\n333502 0 ๅ…ฌๅธๆœๅŠก้™คไบ†ไผ ็ปŸ็š„็‰ฉไธšไนฐๅ–ไธญไป‹ๆœๅŠก\n333503 0 ๆž„ๅปบๆ–ฐๅž‹โ€œไบ’่”็ฝ‘+็คพๅŒบๆœๅŠกโ€็”Ÿๆ€ๅœˆ\n333504 0 ๆน˜ๆกฅๆณ•้™ขๅฎก็ป“ไบ†ไธ€ๅฎ—ๆœบๅŠจ่ฝฆไบค้€šไบ‹ๆ•…่ดฃไปป็บ ็บทๆกˆไปถ\n1000 Processed\n classify content\n334000 0 youtubeไธŠ็š„ๅŽไธบๅนฟๅ‘Š4minๅคšๅฅฝ็‰›็š„ๆ ทๅญ\n334001 1 ๆŠข็บขๅŒ…ๅ•ฆ๏ผxๆœˆxๆ—ฅ๏ผŒๅ’Œๅคง่…•ไธ€่ตท่ฟ‡่Š‚๏ผ ้“ญๅ“่ฃ…้ฅฐๆบๆ‰‹ๆท˜ๅฎ็ฝ‘ใ€็ง‘ๅ‹’ไธญๅ›ฝ้œ‡ๆ’ผๆฅ่ขญ๏ผ xๆœˆxๆ—ฅ๏ผŒๆญๅทž...\n334002 0 2015/8/6็ฌฌ1ๅคฉ๏ผš1\n334003 0 ๅ’Œไธญๅ›ฝๅคงๅฆˆ็”Ÿไบง็š„ๅฐผ้พ™้˜ฒๆ™’ๅคงๅค–ๅฅ—่ฏดgoodbyeๆŠŠ~~ๆˆดไธŠไปฅๅŽ็ซ‹้ฉฌ้™ๆธฉ3/5ๅบฆ\n334004 0 ๅฏนโ€œๅผบๅฅธ่‡ดไบบ้‡ไผคใ€ๆญปไบกโ€็š„็†่งฃๅ’Œ้€‚็”จ\n1000 Processed\n classify content\n334500 0 ๆˆ‘ไธ€็›ดไปฅไธบ่Šฑๅƒ้ชจ็š„ๅคšไบบๆœฌๆ˜ฏๅ’Œๅˆซ็š„็ŽฉๅฎถๅŒๆญฅ่ฟ›่กŒ็š„\n334501 0 ้ป‘่‰ฒๅ›พ็”ต่„‘็ปฃ่ŠฑๆŠ€ๆœฏ็ปๅฏน่ˆ’ๆœ่šๆ‹ข้€ๆฐ”\n334502 0 ๆœ‰ๆฒกๆœ‰ไธŠๅคๅท่ฝดxๅคฉ้™…็š„ๆฑ‰ๅŒ–ไธ‹่ฝฝ้“พๆŽฅ\n334503 0 ๆ้†’ๆš‘ๆœŸๅค–ๅ‡บๆ—…ๆธธ็š„็ซฅ้ž‹ไปฌ้œ€ๆณจๆ„ๆ—…็จ‹ๅฎ‰ๅ…จ\n334504 0 ็”ต่„‘ๅๆމไบ†ๆ‰‹ๆœบๆญปๆœบ็”ต่ง†ๆฒกไฟกๅท่ฟ™ๆ˜ฏๅœจ่€ƒ้ชŒๆˆ‘ๅ—\n1000 Processed\n classify content\n335000 0 ๅ•†ไธšๆ€งไฝๆˆฟ่ดทๆฌพไธๅฏไปฅ่ฝฌ็ป„ๅˆ่ดทๆฌพ\n335001 0 ไธญๅ›ฝๅฅฝๅฃฐ้ŸณๆˆๅŠŸ็š„ๆŠŠๅ—ๅฑฑๅ—่ฟ™้ฆ–ๆญŒๆฏไบ†\n335002 0 ๅข™่ฃ‚่ฆๆฑ‚ๅœฐ้“ไธ่ฆๆŠ•ๆ”พๆ˜Žๆ˜Ÿๅนฟๅ‘Šไบ†\n335003 0 ่ƒฝไธ่ƒฝๆ”พๅคง่ƒฝไธ่ƒฝๆ”พๅคง~ๆˆ‘ๆไผผ็œ‹ไธๆธ…ๅ’‹ๅœฐๅ–ฝ~ไบบๅฎถๅนณๆฟๅฐฑๅพˆๅฅฝๅ˜›\n335004 0 ็งฆๅฎถๆกฅๅ†œๆ‘ไฟก็”จ็คพๅ‘่ฅฟไบ”ๅ็ฑณ\n1000 Processed\n classify content\n335500 0 ๅคงๅฎถๆœ‹ๅ‹ไปฌ็จ‹็”ตๆขฏ็š„ๆ—ถๅ€™ไธ€ๅฎšๅฐๅฟƒๅœจๅฐๅฟƒๅง\n335501 1 
ๆˆ‘็คพๅฎŒๅ–„ๅˆ›ๆ–ฐไบ†ๅญ˜ๆฌพ็ฑปใ€้“ถ่กŒๅกใ€่ดทๆฌพ็ฑปใ€็”ตๅญ้“ถ่กŒ็ญ‰ๅ„็ฑป้‡‘่žไบงๅ“ใ€‚็ฝ‘ไธŠใ€็Ÿญไฟกใ€ๆ‰‹ๆœบ้“ถ่กŒๅ…จ้ขๅผ€้€š๏ผŒ...\n335502 0 ่€็™พๅง“ๆฒกๆœ‰ไบ†ๅœฐไฝๅœจๆฅผไธŠ่ฟ˜ไธๅคŸไบคๆฐด็”ตๆš–่ดน็”จ็š„\n335503 0 ไฝ ไผšไธ€็›ดๅ—ๅˆฐๅ‘จ้ญ็š„่ดจ็–‘ๅ’Œๆ‰“ๅ‡ป\n335504 0 ๆฒกๆœ‰ๆ‘ฉๅฐ”ๅบ„ๅ›ญ3ๅคง็”ตๅฝฑ็™พๅบฆไบ‘็ฝ‘็›˜้ซ˜ๆธ…่ต„ๆบ้“พๆŽฅ็š„\n1000 Processed\n classify content\n336000 0 ็ป“ๆžœ็™พๅบฆไบ†ๅƒไธ‡้ไป€ไนˆ้ฌผ้ƒฝๆฒกๆœ‰\n336001 0 ไนๅคงๅˆธๅ•†๏ผšไธ‹ๅ‘จๅฏ่ƒฝๅ‘ไธŠๅ˜็›˜้‡่ฟ”xxxx็‚นไน‹ไธŠไฟก่พพ่ฏๅˆธ๏ผšๆฒชๆŒ‡ๅฏ่ƒฝๅ›ด็ป•xxxx็‚นๅๅค้œ‡่กใ€€ใ€€xๆœˆ...\n336002 0 FrontenacCountyCourtHouse้‡‘ๆ–ฏ้กฟ็š„ๅœฐๆ–นๆณ•้™ข\n336003 0 2ใ€2015ๅนด3ๆœˆ26ๆ—ฅๅธ‚ๆ”ฟๅบœไธ‹่พพ่ฏฅๅฑ€โ€œไธ‰ๅฎšๆ–นๆกˆโ€ๅŽ\n336004 0 ไปฅ่‡ณไบŽๆŠค็›˜ไนฐ่ฟ›่ถ…่ฟ‡5%็š„ไผŠๅˆฉA่‚ก\n1000 Processed\n classify content\n336500 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 366vffไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n336501 0 ๅ’Œๆœ€่ฟ‘HeroicHollywood็ˆ†ๆ–™็š„ๅ†…ๅฎนไธ€ๆฏ›ไธ€ๆ ท\n336502 0 JoJoๆ‹็š„ไธ็ฎกๆ˜ฏๅ“็‰Œๅ›พ่ฟ˜ๆ˜ฏๆ‚ๅฟ—ๅ›พไธป้กตๅ›้ƒฝไบ‰ๅ–่ƒฝๆ‰พๅˆฐ\n336503 0 ๆˆ‘็Žฐๅœจๆœ‰็‚น่ฎจๅŽŒๆ— ้”ก่ฟ™ไธชๅœฐๆ–น\n336504 0 ๅฅฝๆ„ŸๅŠจ/ๅคงๅ“ญ/ๅคงๅ“ญ/ๅคงๅ“ญ/ๅคงๅ“ญ/ๅคงๅ“ญๆˆ‘ๆดปไบ†ไบŒๅๅ‡ ๅนด\n1000 Processed\n classify content\n337000 0 ๅˆšๅˆš้‚ฃๆกๆ˜ฏ้‚ฃไธชnoface็š„yyy่‡ชๅทฑ่ท‘ๆฅๆŠขๆˆ‘ๆ‰‹ๆœบๅ‘็š„\n337001 0 ๆŠฝ่ก€็š„ๅŒป็”Ÿๅทฅไฝœๆ€ๅบฆๆœ‰็‚นไธ่ฎค็œŸ\n337002 0 ๅ‰ๅคซๆฏๆœˆๆ”ฏไป˜ๆŠšๅ…ป่ดน600ๅ…ƒ็›ด่‡ณๅฐๅญฉๅนดๆปก18ๅ‘จๅฒๆญข\n337003 0 ไธไธŽๆ—ถ้—ด่ต›่ท‘/้ฃžๆœบ/้ฃžๆœบ/้ฃžๆœบ\n337004 1 ไบฒ็ˆฑ็š„ไผšๅ‘˜้กพๅฎข๏ผŒ ๆ‚จ็š„่Š‚ๆ—ฅๅฟซๅˆฐไบ†็ฅๆ‚จ่Š‚ๆ—ฅๅฟซไน๏ผ่Š‚ๆ—ฅๆœŸ้—ดๆฌงๅฎถๅฐ†ๆœ‰ๅ…จๅนดๆœ€ๅคงๅŠ›ๅบฆๅฎ ็ˆฑๅ›ž้ฆˆ๎„’ ๏ผŒๅ…จๅœบ...\n1000 Processed\n classify content\n337500 0 ๆˆ‘ๅฏนไธ€ไธชไบบ็š„ๅ่งไปŽไป–็”Ÿๅˆฐๆญป้ƒฝไผšๆœ‰\n337501 0 ๅฟซ่ˆน2ๅนด640ไธ‡็ปญ็บฆไธปๅธ…ไน‹ๅญๅญฃๅŽ่ต›ๆ›พๆ‰“็ˆ†็ซ็ฎญ\n337502 0 ๅ‡่‚ฅ่Œถ+ๅ‡่‚ฅ่ดด+่ฟๅŠจ+ๅฐ‘ๅƒ\n337503 0 ่ฎก็ฎ—ๆœบ็ณป็ปŸๆ˜ฏๅฆ็ฌฆๅˆGSP่ฆๆฑ‚่ฆ†็›–ๅ…จ็จ‹่ดจ็ฎกๆŽงๅˆถ\n337504 0 ๆƒณๅ–ไป€ไนˆ่ดงๆฏๅคฉไธ‹ๅฎšๅ•ๆˆ‘ๅŽป่ท‘่…ฟ\n1000 Processed\n classify content\n338000 0 ๅฅฝๅฃฐ้Ÿณๆ˜ฏไธๆ˜ฏไฝ 
ไปฌ็š„ๆถˆๆš‘ๅœฃๅ“ๅ‘ข~~็›ฒ้€‰ๆŽฅ่ฟ‘ๅฐพๅฃฐ\n338001 0 ไบ‘ๅ—็œ็ฝ—ๅนณๅŽฟไน้พ™้•‡ๅพ—็ญ‰ๆ‘ๅง”ไผšๅคงไปฅ็™พๆ‘\n338002 0 ไธบๅ„็ฑปๅฆ‡็ง‘็‚Ž็—‡็ป†่ƒž็ญ‰่‡ด็—…ๅพฎ็”Ÿ็‰ฉๆไพ›็น่กใ€็”Ÿๅญ˜ใ€โ€œ้—นโ€็—’็š„ๆกไปถ\n338003 0 ๅœๅœจ7ๅทๆฅผๅœฐไธ‹่ฝฆๅบ“็š„ๅฅฅ่ฟช่ขซไบบไธบๆŸๅ\n338004 0 ไบš้ฉฌ้€Š็งฐๆญฃๅœจๆต‹่ฏ•ไธ€ๆฌพๅซโ€œdashbuttonโ€็š„็กฌไปถ\n1000 Processed\n classify content\n338500 0 ไธๆŽ’้™คๆ”ฟๅบœๅ†ๆฌกๅฎž่กŒ้™ไปท็š„ๅฏ่ƒฝ\n338501 0 ่‹ๅทžๅคšๅฎถๅ“็‰ŒๆˆฟไผๅŒ…ๆ‹ฌไธญๆตทใ€ไฟๅˆฉ็ญ‰ๅด้™ทๅ…ฅไบ†โ€œๅœฐ่’โ€\n338502 0 wnbaๅ’Œnba่ฟ˜ๆœ‰ๅŒบๅˆซๅ—\n338503 0 1907ๅนดๅ…ถไบŽๅผ—ๆด›ไผŠๅพทๅˆไฝœๅ‘ๅฑ•ๅŠๆŽจๅนฟ็ฒพ็ฅžๅˆ†ๆžๅญฆ่ฏด้•ฟ่พพ6ๅนดไน‹ไน…\n338504 0 ๅฆ‚ไฝ•็ฎกๆŽงไผไธšๅŸŸๅ็š„ๆณ•ๅพ‹้ฃŽ้™ฉ\n1000 Processed\n classify content\n339000 0 ๆˆ‘ๅฆˆdoge่„ธ็œ‹็€ๆˆ‘่ฏดไฝ ๅˆซ่ฃ…ไบ†่ฟ˜ๆ˜ฏๅœจไนฆๆˆฟ็œ‹ๅฎŒๅฐ่ฏดๅ†็กๅง\n339001 0 ๅฝ“็”ตๆขฏๅ‡ๅˆฐไธ‰ๆฅผๆˆ‘ๆ‰ๆ„่ฏ†ๅˆฐๅ‘จๅ›ด้ป‘ๆผ†ๆผ†ไธ€็‰‡\n339002 0 ๅ“ˆๅฐ”ๆปจๆœ€ๅฅฝ็š„็šฎ่‚ค็—…ๅŒป้™ขไธ“ๅฎถๆ้†’๏ผš่€ๅนดไบบๆœ‰\n339003 0 ๅธ‚ๆฃ€ๅฏŸ้™ขๆฃ€ๅฏŸ้•ฟ็Ž‹้‡‘ๅบ†ๅ‡บๅธญๅบง่ฐˆไผš\n339004 0 ๆฉ˜่‰ฒ่™่ ่ข–่•พไธๅฎฝๆพTๆค+็™ฝ่‰ฒ็ดง่บซ่ฃค+็™ฝ่‰ฒๅ•้ž‹\n1000 Processed\n classify content\n339500 0 ๅ‡บๅฃๅˆฐๆฌง็พŽ็š„ไบงๅ“ไธๅ†ไป…ไป…ๆ˜ฏๅป‰ไปทไบงๅ“\n339501 0 Hirudold็ฅ›็–ค่†ๆณฐๅ›ฝๅ„ๅคงๅŒป้™ขๆŒ‡ๅฎš็š„็ฅ›็–ค่ฏ่†\n339502 0 ็”จๆœ€ๅธธ่งใ€ๆœ€ๅŸบๆœฌ็š„ๅปบ็ญ‘ๆๆ–™ๅ’Œไผ ็ปŸ็š„ๆญๅปบๆ–นๅผ\n339503 0 ็ป†่ƒž็”Ÿๆ€ๆŠค็œผๆฐดๆ˜ฏๅ–่‡ชๅ››ๅท้˜ฟๅ9610ๅนด็š„่พพๅคๅ†ฐๅทๆฐด\n339504 0 ๆพณๆดฒGoatSoap็บฏๅคฉ็„ถ็พŠๅฅถ็š‚\n1000 Processed\n classify content\n340000 0 ๆŠ˜้จฐไบ†ไธ€ๆ—ฉไธŠไนŸๅผ„ไธๅˆฐๅฎ˜ๆ–น็š„ๆŽจ้€้‚„ๆ˜ฏ็”จ้จฐ่จŠๅ‡็ดšWindows10็ฎ—ไบ†\n340001 0 ่งฃๅ†ณไฟ้™ฉๆถˆ่ดน่€…โ€œ็†่ต”้šพโ€ใ€โ€œ็†่ต”ๆ…ขโ€็ญ‰้—ฎ้ข˜\n340002 0 xxxxๅนดxๆœˆxๆ—ฅ่ฎฏ๏ผšๅ›ฝๆฐ‘ๆ”ฟๅบœ็”ณไปค็ฆๆญขไปฅๆœบๅ…ณ้•ฟๅฎ˜็งไบบๅไน‰ๅฏนไธ‹็บงๆœบๅ…ณๆปฅ่ไบบๅ‘˜\n340003 0 ้ฉฌๅนฟๅนณ่ขซๆ— ้”กๅธ‚ๆ–‡ๆ˜ŽๅŠžใ€ๆ— ้”กๅธ‚ๅฟ—ๆ„ฟ่€…ๆ€ปไผšๆŽˆไบˆxxxxๅนดๅบฆโ€œๆ— ้”กๅธ‚็™พๅๆœ€็พŽๅฟ—ๆ„ฟ่€…โ€็งฐๅท\n340004 0 ้˜ฒ่…ๅ‰‚ๆ€ปๆ˜ฏ่ƒฝไธŽ\"ไธๅคฉ็„ถ\"/\"ๆœ‰ๅฎณ\"/\"้ป‘ๅฟƒๅ•†ๅฎถ\"ไน‹็ฑป็š„่ดŸ้ข่ฏๆฑ‡่”็ณปๅˆฐไธ€่ตท\n1000 Processed\n 
classify content\n340500 0 ๅฏไปฅ่ตท้ฃžไบ†ไธน้˜ณไบบๆฐ‘้ƒฝ่ฟ™ไนˆไปปๆ€ง\n340501 1 xxxxxxxxxxxxxxxxxxxๅ‘จ็บขไบ‘ๅ†œ่กŒ\n340502 0 ๆˆ‘ๅˆšๅˆšๆŠŠๆ‰‹ๆœบๅฃ็บธๆขๆˆไบ†่ฟ™ไธช๏ผšGoldensilkorbweaverspider\n340503 0 ๅค„ๆญปๅˆ‘ใ€ๆ— ๆœŸๅพ’ๅˆ‘ๆˆ–่€…ๅๅนดไปฅไธŠๆœ‰ๆœŸๅพ’ๅˆ‘\n340504 0 ้˜ฒ่ŒƒๆŠขๅŠซๆœ‰้ซ˜ๆ‹›่ฟ›้—จไน‹ๅ‰ๅ›žๅคดๆœ›\n1000 Processed\n classify content\n341000 0 ไธŽ็‹ฌ็‰น็š„ไธ่ง„ๅˆ™ไธ‹ๆ‘†่ฎพ่ฎกไธ€่ตท่ฏ ้‡ŠๅฑžไบŽไฝ ็š„ๆฝฎๆต่…”่ฐƒ\n341001 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ๆ‚จๅฅฝ๏ผŒๆ‚จๅœจ้•ฟๆฑŸUXx็”ท่ฃ…็”Ÿๆดป้ฆ†ๆถˆ่ดน็งฏๅˆ†่ดญ็‰ฉๆ—ถๅฏๆŠต็Žฐ้‡‘xxๅ…ƒ๏ผŒๅœจxๆœˆxๆ—ฅโ€”xๆœˆxxๆ—ฅ...\n341002 0 ๅคงๆฒน็”ฐๆฏๆ™šไธ€ๆฌกๅนฒ็‡ฅ่‚Œ่‚คไธ€ๅ‘จไธ€ๆฌก\n341003 0 ๆˆ‘่ฟ™ๅ—ไบฌ็š„่กจๅผŸๅด่ฆๅŽปไป–ไปฌๅคชๅŽŸไบ†\n341004 0 ๅ“Žๆ‰ๆณจๆ„ๅˆฐ่ฟ™ไธชๆ‰‹ๆœบๅ‘ๅพฎๅšๅช่ƒฝๅ‘ๅฐๅ›พ\n1000 Processed\n classify content\n341500 1 ็ˆฑๅฅณไบบ.็ˆฑ่‡ชๅทฑ๏ผŒ็”ทๅฃซ้€็ป™ๅฅณไบบ็š„็คผ็‰ฉ๏ผŒๅฅณไบบ้€็ป™่‡ชๅทฑ็š„ๅ…ณ็ˆฑ๏ผๅฐŠๆ•ฌ็š„็พŽๅฎน้กพๅฎข๏ผŒไธ‰ๅ…ซ่Š‚ๅฟซไน๏ผxๆœˆxๅท...\n341501 0 ไฝ†ๆ˜ฏๆ€Žไนˆๅฐฑๅœจๆˆ‘ๅˆšๅ‡บๅœฐ้“ไธ‹ๅ‘ข\n341502 0 ๆ‰“ๅŒ…35ๅฎถๅช’ไฝ“ๅฆๅค–้€15ๅฎถๅช’ไฝ“ๅˆ่ฎก50ไธช็ฝ‘็ซ™ไป…้œ€2000ๅ…ƒ\n341503 0 ้‚ฃๅ‰Ž้‚ฃ้—ด้ฃžๆœบ่ฝฎๅญ็€่ทฏๆ—ถๆ‰็Ÿฅ้“\n341504 0 ็”ทๅญ็–‘ๆ€งไพต13ๅฒๅนผๅฅณๆฃ€ๅฏŸ้™ข็งฐๅŒๆ–น่‡ชๆ„ฟ\n1000 Processed\n classify content\n342000 0 ๅฏไปฅๅ‘็Žฐ่ฟ™้‡Œๆœ‰ไธ€็งๅฎ‰้™ๆฌ็„ถ็š„ๆฐ”้Ÿต\n342001 0 ๆ ธๅฎž1955ๅนด12ๆœˆ31ๆ—ฅๅ‰ๅ‡บ็”Ÿ็š„60ๅฒไปฅไธŠ็‹ฌ็”Ÿๅญๅฅณๅฅ–ๆ‰ถ็š„ไบบๅ‘˜\n342002 0 3214507110่ถ…่–„ๆฃ‰้บป่กฌ่กฃๅฏๅฝ“้˜ฒๆ™’่กฃๅ‡็ ไธค่‰ฒๅ…ฅๆ–™ๅญไธ้”™\n342003 0 ๆฏๅคฉๅ›žๅˆฐๅฎถๅฏน็€ๆ‰‹ๆœบๅ’Œๅ››้ขๅข™\n342004 0 ๅ›ฝ็”ตๅ—่‡ชAGC/AVCๆŽงๅˆถ็ณป็ปŸๆˆๅŠŸๅบ”็”จๆ•ˆ็›Šๆ˜Žๆ˜พ๏ผš่ฟ‘ๆ—ฅ\n1000 Processed\n classify content\n342500 0 ้‚ฌ็ป็†ๅ’ŒๆŠ€ๆœฏ้ฃž่ฟœๅœจๆญๅทžๆŒ‡ๅฏผๅไฝœ\n342501 0 ไบฟๅˆฉ่พพ้€พxxxxไธ‡ๆ”ถ่ดญไธคๅ…ฌๅธๆŽง่‚กๆƒ\n342502 0 ็ฌฌๅไบ”ๅฑŠไธญๅ›ฝ้’ๅฐ‘ๅนดๆœบๅ™จไบบ็ซž่ต›ๅœจๅ†…่’™ๅค้„‚ๅฐ”ๅคšๆ–ฏๅธ‚ไธพๅŠž\n342503 0 ไธ‰ๆตๆ‚็ขŽๆ˜Žๆ˜Ÿๅคชๅคšๆฒกไบ‹ๅนฒๆ‰พๆจ็‹—็š„็™พๅบฆ็™พ็ง‘ๅ‘็Žฐ่ฟ˜ๆฒกๆœ‰โ€œๅฑŽโ€็š„็™พๅบฆ็™พ็ง‘้•ฟๆปšๅ‡บๅŒ—ไบฌๅงๅฅฝๅ—็‹—ๅ„ฟๅญ\n342504 0 ่ฟ™ไบ›ๅฎž้™…ไธŠๆ˜ฏๆ–ฐๅŠ ๅก่‰บๆœฏๅฎถKengLyeๅไธบโ€œๆดป็€ไฝ†ๆ— 
ๅ‘ผๅธโ€็ณปๅˆ—็š„่ถ…็Žฐๅฎžไธปไน‰3Dๆ ‘่„‚็”ป\n1000 Processed\n classify content\n343000 0 ๆฐดๆ˜ŸไธŽๅœฐ็ƒ็š„ๅซๆ˜Ÿโ€”โ€”ๆœˆ็ƒไน‹้—ด\n343001 0 ๆ˜จๆ™š6็‚นๆ‰“่ฝฆๅŽป็œ‹็”ตๅฝฑ็ซŸ็„ถๆ˜ฏๅฎ้ฉฌ\n343002 0 ๆ‰ฌๅทžไธœๆ–นๅจƒ็Žฉๅ…ทๆœ‰้™ๅ…ฌๅธ๏ผš็ƒญ้”€ไน‹ไธญ\n343003 0 6ๆœˆไปฝๅฝ“ๆœˆๅฎž็Žฐ็คพไผšๆถˆ่ดนๅ“้›ถๅ”ฎๆ€ป้ข148528ไธ‡ๅ…ƒ\n343004 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็œŸ็š„ไธๆ˜ฏไธ€่ˆฌๆถๅฟƒ\n1000 Processed\n classify content\n343500 0 2015ๆฃฒ้œžๅฏบไฝ›ๅญธๅคไปค็‡ŸDay2โ€”โ€”ๅฝญ้†ซ็”Ÿ้คŠ็”Ÿๅธธ่ญ˜ๅŠ้ˆ้€ฒๆณ•ๅธซๆขตๅ‘—ๆ•™ๅ”ฑ\n343501 0 ๅฎ‰ๅ…ปๅธ‚ๅฐ†ๅœจ็Žฐๆœ‰ๅธ‚ๆ”ฟๅบœๅŠžๅ…ฌๆฅผ็š„ๅœฐ็šฎไธŠๆ–ฐๅปบไธ€ๅบง100ๅฑ‚็š„\n343502 0 ๆœฌไบบ่„‘้ƒจๅณ่พน่‚ฟๅ—ๅฟ…ๆญปไบ†ๆŠฅ่ญฆไบ†ๆ€ไบบๅฟๅ‘ฝๆญปๆญปๅˆ‘ๅˆคๅˆ‘\n343503 0 ๅฐฑๆ—…ๆธธO2Oใ€ๅธ‚ๅœบ่ฅ้”€ใ€ๆ™ฏๅŒบ้—จ็ฅจ็ญ‰ๆ–น้ขๅฑ•ๅผ€ๅˆไฝœ\n343504 0 ๅฎๅ˜ฑ่‡ชๅทฑ็š„ๅ‡ ๅฅ่ฏ๏ผš1ใ€ไธ่ฆไพฅๅนธใ€ๆŠฅๅคใ€ๆŠฌๆ ใ€ๅฌๆถˆๆฏๆ€งไบคๆ˜“\n1000 Processed\n classify content\n344000 0 ๅฎƒๅŒ…ๆ‹ฌๅญๅฎซใ€ๅตๅทขๅ’Œๆ•ดไธชๅฅณๆ€ง็”Ÿๆฎ–็ณป็ปŸๅŠ็›ธๅ…ณๅŠŸ่ƒฝ\n344001 0 ็”จๅผ€่ฎฏ่ง†้ข‘็œ‹่Šฑๅƒ้ชจๆ€Žไนˆๆฏ้›†ๅชๆœ‰15ๅˆ†้’Ÿๅ‘ข\n344002 0 ๆฐขๆฐง้‡่ฆไฝœ็”จๆฐงๆฐ”ๆ˜ฏ้œ€ๆฐง็”Ÿ็‰ฉ็ปดๆŒ็”Ÿๅ‘ฝไธๅฏ็ผบๅฐ‘็š„็‰ฉ่ดจ\n344003 0 ๅฌๆญŒๅ‰ง~ๆžๅญฆๆœฏ็š„ๆ›ดๆ˜ฏๅฏ้šๆ—ถๅ‚ๅŠ ๅญฆๆ กๆฏๅคฉ็š„ๅ„็ง่ฎบๅ›่ฎฒๅบง\n344004 0 ไปŽๅŽŸๆฅ็š„360ๆ”ถ่—ๅคนๅฏผๅ…ฅedge่ฆๆ€Žไนˆ็ ด\n1000 Processed\n classify content\n344500 0 ๆˆ‘่ฆ่ตฐไบ†ๆˆ‘่ฆๅ‡บๅŽป้ฟ้ฟ้ฃŽๅคดๅ…ฌๅฎ‰ๅฑ€ๅœจ่ฐƒๆŸฅๅ…จๅ›ฝๅธ…ๅ“ฅๆœ‰็‚นๅธ…็š„ๅˆคไบ”ๅนด้žๅธธๅธ…็š„ๅˆคๅๅนด่ถ…็บงๅธ…็š„ๅˆคไบŒๅๅนดๅƒๆˆ‘...\n344501 0 ไฝ ๅชๆ˜ฏ่ทฏ่ฟ‡ๅ—/ๆˆ‘่‡ช็ผ–็š„่ฐŽ่จ€ๅพˆ็พŽ\n344502 0 โ€œ็ฟ้ธฟโ€ๅผบๅบฆๅ‡ๅˆฐ17็บงๆต™ๆฑŸๅ‘่ถ…ๅผบๅฐ้ฃŽ็ดงๆ€ฅ่ญฆๆŠฅ\n344503 0 OPPORxPlusๅฐฑๆ•ขๅธฎไฝ ๆŠŠๅฃฐ้Ÿณไผ ้ๆ•ดไธชไธ–็•Œ\n344504 0 ๆˆ–่€…ไพๆณ•ๅ‘ไบบๆฐ‘ๆณ•้™ขๆ่ตท่ฏ‰่ฎผ\n1000 Processed\n classify content\n345000 0 ๅคฉ่ŠฑๆฟไธŠ็š„็ŽฏๅฝขLEDๅฑๅฏไปฅ่ฎฉ่ง‚ไผ—ๅธญไธŠ็š„ไบบๆ›ดๅฅฝๅœฐ่ง‚็œ‹ๆฏ”่ต›\n345001 0 ไธ€ๅฃๆฐ”ๆŠŠQTFx็š„x่ฏ้ƒฝ็œ‹ไบ†wwwwwwwwๅฎžๅœจๅคชๅฅฝ็ฌ‘ไบ†\n345002 0 ็„ถ่€Œ็œŸ็›ธๅบ”่ฏฅๅชๆ˜ฏไป–ไธ่ˆ’ๆœไบ†ๆƒณๆ“ฆๆฑ—ไบ†่€Œๅทฒ2333\n345003 0 ๅ…่ดนๅˆ†ไบซๆต™ๆฑŸๅฐๅทž13ไปฝๅˆ†ไบซ7*7ๅฐ้ป‘ๆ–น\n345004 0 
ๆˆ‘่ฟ˜ไปฅไธบๆ˜ฏ่‹นๆžœๆ–ฐๅ‡บ็š„็”ต่„‘ๅ‘ข\n1000 Processed\n classify content\n345500 0 ๆ„Ÿๆƒ…่ฟ™ๆ˜ฏๅๅœฐ้“ๆฅๅ›ž็Žฉๅ„ฟๅ‘ขๆ˜ฏไนˆโ€ฆSB\n345501 0 2ใ€ๅฎƒๆœ‰ๆ‹›่ดข็š„็ฅž็ง˜ๅŠ›๏ผš้‡‘้ป„่‰ฒ็š„่œœ่œกๅฏไปฅๆ‹›ๆฅ่ดขๅฏŒ\n345502 0 ๅˆฐ15ๅนด7ๆœˆๆญฃ้˜ณๅทฒๆœ‰ๅคšๅฎถๅ•†้“บ่ฅไธš\n345503 0 ไธญๅ—่ดข็ปๆ”ฟๆณ•ๅคงๅญฆ้‡‘่žๅญฆ้™ขไฟ้™ฉไธ“ไธš2010็บงๅญฆ็”Ÿๅดๆ–ฐๅฎ‡ๆˆไบ†ๆ กๅ›ญ้‡Œ็š„ๅˆ›ไธšๅถๅƒ\n345504 0 ่ฎค่ฏไฟกๆฏไธบโ€œ่‹ๅทžไธ‰้“ญไผไธš็ฎก็†ๆœ‰้™ๅ…ฌๅธไบบไบ‹ไธ“ๅ‘˜โ€\n1000 Processed\n classify content\n346000 1 ็พŽไธŠ็พŽๅฐ†ไบŽxๆœˆxxๆ—ฅ-xๆœˆxxๆ—ฅไธพ่กŒใ€Š็ˆฑ็พŽๅˆๅฎžๆƒ ใ€‹๏ผŒ็‰ฉไปทๅ†ๅ›žxxๅนดๅนณไปทๅคง้ฉๅ‘ฝๆดปๅŠจ๏ผŒๅฐ†ไบงๅ“ๅนณไปท...\n346001 0 4ๅฒๅ‰ๆ˜ฏๅญฉๅญๅฝข่ฑก่ง†่ง‰ๅ‘ๅฑ•็š„ๅ…ณ้”ฎๆœŸ\n346002 0 ๆœ‰ๅ†™็€ๅฏไปฅ็›ด้‚ฎ็š„ๅฐฑๆ˜ฏๅฏไปฅ็›ด้‚ฎ็š„ๅง็›ด้‚ฎ้‚ฎ่ดน่ดตๅ—\n346003 0 ไฝ†ๆ˜ฏๅคงๅพฎ่ฝฏ่ดจ็–‘็š„่ฆๆŠŠ่ฟ™ไธช็•Œ้ขๅšๆˆ็ฑป็งปๅŠจๅนณๅฐ็š„ๆ ทๅญ\n346004 0 ็ƒญ็ƒˆ็ฅ่ดบ็ซ‹็Ÿฅๆ•™่‚ฒxxxxๆŠคๅฃซ่ต„ๆ ผ่€ƒ่ฏ•ๅŸบ็ก€ๅ่ฎฎไฟ่ฟ‡็ญ้€šๅ…ณ็އxxx%\n1000 Processed\n classify content\n346500 0 75ๅฒ็š„ๆต™ๆฑŸ่กขๅทžๅธ‚ๅ†œๆ‘่€ๆฑ‰็Ž‹ๆŸๆŸ็”จไธ€ๆ นๅˆน่ฝฆ็บฟๅ‹’ๆญปไบ†ไธŽไป–ๆœ‰ๅ‡ ๅๅนดโ€œๅœฐไธ‹ๆƒ…โ€็š„ๅˆ˜ๆŸๆŸ\n346501 0 ไฝ ็œ‹้‚ฃไธคไธช็”ตๆขฏๆ—่พนไธคไธชๆ‰“ๆ‰ฎๆผ‚ไบฎ็š„ๆœๅŠกๅ‘˜\n346502 0 ไปŠๅคฉ่ทŸ่‹ๅทžๅทฅๅŽ‚ๆญฃๅผ็ญพ็บฆไธŽๆˆ‘็š„ๆ‘„ๅฝฑไฝœๅ“ๅˆไฝœๆญฃๅผๆŠ•ไบงไธญๅ›ฝ้ฃŽ็š„ๅฑ้ฃŽ\n346503 0 ๅฟ…ๅฐ†ๆˆไธบไบ’่”็ฝ‘้‡‘่žๅนณๅฐๅฑ•ๅผ€ๅŽฎๆ€็š„ไธ€ๅคงๆˆ˜ๅœบ\n346504 0 ๆœ€ๅ–œๆฌขๅฌๆ‰ฌๅทžๆœ‹ๅ‹้ฅญๆกŒไธŠ่ฎฒๅฐๆ—ถๅ€™็š„ๆ•…ไบ‹\n1000 Processed\n classify content\n347000 1 ๅฏŒๅฏๆ€ๆ•™่‚ฒๆƒณๅญฆ็”Ÿไน‹ๆ‰€ๆƒณใ€ๆ€ฅๅฎถ้•ฟไน‹ๆ‰€ๆ€ฅ๏ผŒ็‰นๅœจๆœฌๅญฆๆœŸๅผ€ๅญฆ็š„ไธ€ๅ‘จๅ‘จๆœซxๆœˆxๆ—ฅๆ—ขๅ…จ้ขๅผ€่ฎพๅฐๅ‡ๅˆๅ„็ง‘่พ…...\n347001 0 ็–‘้—ฎ๏ผšไธคๅนดๆ—ถ้—ด็”ฑ70ๅคšๅฎถๅ‘ๅฑ•ๅˆฐ2็™พไบ”ๅ…ญๅๅคšๅฎถ\n347002 0 ๆฑพ็Ÿฟ้›†ๅ›ขๅญๅผŸใ€ๅพ…ไธš้’ๅนดใ€ๆŠ€ๆ กๆœชๅˆ†้…ๅญฆ็”Ÿๅ’Œ้€€ไผๅ†›ไบบไผ˜ๅ…ˆๅฝ•็”จ\n347003 0 ๅ›ฝ้™…ๆณŒไนณ้กพ้—ฎ็š„่€ƒ่ฏ•่ฟ‡ๅŽปๅฅฝๅ‡ ๅคฉ\n347004 0 ๆœ‰ๆ—ถไนŸ่กจ็Žฐๆ‰ง่กŒๆญปๅˆ‘ๆ—ถ็ฅž็ˆถไธบๆญปๅˆ‘็Šฏ็š„็ฅท่ฏ\n1000 Processed\n classify content\n347500 0 ๅฝ“ๅนดๆˆ‘ๆœ‰ไธช็‰นๅˆซๅ–œๆฌข่Šฑๅƒ้ชจๅ•Šไธ‰็”Ÿไธ‰ไธ–ๅ•Š่ฟ™ๅ‡ ๆœฌๅฐ่ฏด็š„ๆœ‹ๅ‹\n347501 
0 ไบš้ฉฌ้€Š่ฟ›ๅฃๅŽŸ็‰ˆkindle็”ตๅญไนฆ๏ผš\n347502 0 ไปŽๆ–‡ๅญ—่ฎฐ่ฝฝ็š„็”ฒ้ชจๆ–‡ๆ—ถไปฃๅฐฑๅทฒๆœ‰ไน‹\n347503 0 20150801้€ŸๆŠฅ๏ผšๆ ‡้ŸฉไธŠๅ†Œ็š„ๅ•่ฏๅ…จ่ƒŒๅฎŒๅ•ฆ\n347504 0 ๆบง้˜ณๅ—ๅฑฑ็ซนๆตทๅŽŸ็”Ÿๆ€็š„้Ÿตๅ‘ณ\n1000 Processed\n classify content\n348000 0 ็คพไฟๅŸบ้‡‘ๅœจไธŠ่ฟฐ65ๅฎถๅ…ฌๅธไธญ\n348001 0 ๅˆšๆ‰็œ‹ๅˆฐๆœ‰ไธชๆญŒ่ฏๆ˜ฏyoumakemehappywhenskiesaregray\n348002 1 ้ซ˜ไปทๅ›žๆ”ถๅ็ƒŸๅ้…’ใ€ๅ†ฌ่™ซๅค่‰ใ€ๅ„่ถ…ๅธ‚่ดญ็‰ฉๅกใ€็”ต่ฏ๏ผšxxxxxxxxxxx\n348003 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๆฐ‘็”Ÿ่งฃๆ”พ่ทฏๅบ—่“่ฑนไธ“ๆŸœใ€‚xๆœˆxๅˆฐxๆœˆxๅท๏ผŒๆœฌๅบ—ๆ–ฐๅ“ๅˆฐๅบ—ๅ‚ๅŠ ๅ•†ๅœบxxxxๅฝ“xxxxๆดปๅŠจ...\n348004 0 10ๅธธๅทžD149๏ผšๆฅ\n1000 Processed\n classify content\n348500 0 โ€”โ€”ใ€Œไธญๅ›ฝๆฏๅคฉ2ไบฟไบบไน˜็”ตๆขฏๅฎ‰ๅ…จไบ‹ๆ•…้ข‘ๅ‘ๆšด้œฒ็ปดไฟ็ผบๅคฑใ€\n348501 0 ๆ˜Ž็Ÿฅ้“ๅˆซไบบ่‚ก็ฅจๆทฑๅฅ—่ฟ˜่ฆๅœจไบบๅฎถไผคๅฃๆ’’็›\n348502 0 ่‚ก็ฅจๅ•Š๏ฝž่ตถ็ดงๅ›žๆœฌๅง๏ฝžไธŠๅธๅ•Š๏ฝžๅฏๆ€œๅฏๆ€œๆˆ‘ๅง๏ฝž\n348503 0 ๆ‰€ๆœ‰ๆƒณๅˆไผ™ๆžๆˆ‘้˜ฟ้‡Œ็š„้ƒฝๆ˜ฏไนŒๅˆไน‹ไผ—\n348504 0 ไธ€ใ€ๆœฌๆœŸไธš็ปฉ้ข„ๅ‘Šๆƒ…ๅ†ต่Œ‚ไธš็‰ฉๆต่‚กไปฝๆœ‰\n1000 Processed\n classify content\n349000 0 ไฝ†ๆ˜ฏๅนถๆœช่ขซๅฝ’ๅˆฐๆ™บ่ƒฝๆ‰‹ๆœบๆˆ–่€…ๆ˜ฏๅนณๆฟ็š„่Œƒๅ›ด\n349001 0 ้‚ฃไฝ ไธ€ๅฎšๆฒกๅŽป็œ‹่ฟ‡ๅธธๅทž็š„้™ๅ›ญ~\n349002 0 ่ฟ™ๆญฃๆ˜ฏazureๆœบๅ™จๅญฆไน ๆœ€ๆ“…้•ฟ็š„้ข†ๅŸŸ\n349003 0 ่ฟ™ๆ ท็š„่‚กๅธ‚ๅทฒ็ปๆฒกๆœ‰ๆŠ•่ต„ไปทๅ€ผไบ†\n349004 0 ๆžœ็„ถOS็ณป็ปŸ้œ€่ฆ้ ๅพฎ่ฝฏๆฅ่กฅๅฎŒ\n1000 Processed\n classify content\n349500 0 โ€ๆˆ‘็ฌ‘ๆ›ฐ๏ผšโ€œๆˆ‘ๅŽปๅšๅ—ไบฌไธ€ๆ—ฅๆธธ็š„ๅฐŽๆธธๅฐๅงๆ€Žๆจฃ\n349501 0 ่ฏ็ฎกๅฑ€ๅˆ™่ดŸ่ดฃ่ฝฌๅŸบๅ› ้ฃŸๅ“ๅ’Œ้ฅฒๆ–™็š„ๅฎ‰ๅ…จๆ€ง่ฏ„ไผฐ\n349502 0 13ๅนด622ไนฐ็š„็ฌ”่ฎฐๆœฌ็”ต่„‘\n349503 0 ไบบๆฐ‘ๆ—ฅๆŠฅๆ”ฟๆ–‡้ฉณโ€œ่ดชๅฎ˜้ƒฝๆœ‰ๆƒ…ไบบโ€๏ผšๅŠๆ•ฐๆถ‰้€šๅฅธ\n349504 0 ่ทŸ็ˆธๅฆˆๅ‡บๅŽปๅค–้ขๅƒๅƒๅƒ~~ๅƒ้ฅฑ่ฟ˜ๆœ‰ๆฒ™ๅ‘่บบ\n1000 Processed\n classify content\n350000 0 ไบŽๆ˜ฏAๅธ‚ๅˆ†ๅฑ€ๅ†ณๅฎšๅฐ†ๆŽๆŸ็š„็›‘่ง†ๅฑ…ไฝๅทฅไฝœ่ฝฌไบค็ป™ไบ†ๆŽๆŸๆˆท็ฑๆ‰€ๅœจๅœฐ็š„Bๅธ‚ๅ…ฌๅฎ‰ๅฑ€\n350001 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ้ป„ๆบๆฑ‚ๅฉšๆˆ‘้ƒฝๆ„ŸๅŠจๅ“ญไบ†\n350002 0 ๅฎๅคๆ—…ๆธธ+ๆฌข่ฟŽๆ‚จ่ฎฟ้—ฎ้’้“œๅณก\n350003 0 
ไป–ไปฌ้‚ฃไธ‰็”ฒๅŒป้™ขๅˆๅ‘็”Ÿไบ†ไธ€ไพ‹ๅ›ฐ้šพๆฐ”้“ๅฏผ่‡ด็š„ไธฅ้‡ไธ่‰ฏไบ‹ไปถ\n350004 1 ใ€้€š็Ÿฅใ€‘๏ผš้ƒซๅŽฟไธ€ๆตๅ“่ดจ็‰ฉไธšใ€ๅŽไพจๅ‡คๅ‡ฐๅ›ฝ้™…ๅŸŽใ€‘ๅฎžๅพ—xxxx/ๅนณ่ตท๏ผŒxxๅนณ่ˆ’ๅฑ…ไธ‰ๆˆฟ็ซ็ˆ†๏ผŒๅ‘จๆœซๆŠฝๅฅ–...\n1000 Processed\n classify content\n350500 0 ่ตถๅฟซไธ‹่ฝฝๅคงไผ—็‚น่ฏ„/็พŽๅ›ข/็™พๅบฆ็ณฏ็ฑณAPPๆœ็ดขๅฐ่พฃๆค’ๅง\n350501 0 ๅพˆๅคšไบบ้ƒฝไผš่ฐˆๅˆฐไธญๅŒปๅญฆ็š„ๅค็ฑไน‹ๅคš\n350502 0 ้œ€่ฆๅŸŽ็ฎก็š„ๆ—ถๅ€™ๅŸŽ็ฎก้ƒฝๅŽปๅ“ช้‡Œไบ†\n350503 0 ๆœบๆขฐๅŠ ๅทฅๆœจๅทฅ็”ตๅ™จๅฎ‰่ฃ…ๅฐฑๆˆไธบไธ€ๅไธ“ไธš็š„่ฐƒๆ•™ๅฎค่ฎพ่ฎกๅธˆ~~~\n350504 0 ๅธ‚ไธญ็บงไบบๆฐ‘ๆณ•้™ขๅœจๆ–‡ๆˆไบบๆฐ‘ๆณ•้™ขๅผ€ๅบญๅฎก็†่ฟ™่ตทๆกˆไปถ\n1000 Processed\n classify content\n351000 0 ๅ‡บๅทฎๅฝ“ไธชๅฐ่ทŸ็ญ็ป“ๆžœๆ„Ÿ่ง‰่‡ชๅทฑ่ฆ่ขซๆ™’ๆˆ้ป‘็šฎไบ†\n351001 0 ็™ฝ้“ถไนŸๅœจ่ตš้’ฑๅ“ˆๅ“ˆๆผ‚ไบฎ้œ€่ฆ็š„ๆ‰พๆˆ‘ๅ…่ดนๅผ€ๆˆทๆŒ‡ๅฏผ่‚ก็ฅจๅ’Œ็™ฝ้“ถ\n351002 0 ้“ถ่กŒ็š„่‚ก็ฅจ้…่ต„ไธšๅŠก้™ทๅ…ฅๅ†ทๆธ…\n351003 1 ใ€ๆ™จๅ…‰ๆ€กๅฑ…่‹‘ใ€‘ๅ•†ๆด›ๅ”ฏไธ€ไธ“ๆขฏ็บฏๆฟๅฐ้ซ˜ๅฑ‚๏ผŒๆˆทๆˆท่ต ้€็งๅฎถ่Šฑๅ›ญ๏ผŒไบคxxxxๆœ€้ซ˜ๆŠตxxxxxๅ…ƒ๏ผŒๅ…จๅŸŽไบ‰...\n351004 0 ไธ‰ๆ˜ŸๅŽไธบๅฐ็ฑณ้…ทๆดพ็ญ‰็ญ‰ๅฎ‰ๅ“้ƒฝ้€š็”จ๏ฝžๅŽŸไปท49\n1000 Processed\n classify content\n351500 0 ็™พๅบฆๆŠ„่ฐทๆญŒใ€็™พ็ง‘ๆŠ„็ปดๅŸบใ€่…พ่ฎฏๆŠ„็™พๅบฆใ€่…พ่ฎฏๆŠ„360ใ€ๆœ็‹—ๆŠ„็™พๅบฆ\n351501 0 ๅพˆๅคšๆœ‹ๅ‹้ƒฝ้€‰ๆ‹ฉๆฟ€ๅ…‰็ฅ›็—˜ๅซฉ่‚ค\n351502 0 ไป™ๅฑ…ๅคช็พŽไบ†๏ฝžๅ’Œๅฟ—ๅŒ้“ๅˆ็š„ๅฐไผ™ไผด่ฟ˜ๆœ‰ๅธฎๅŠฉๆˆ‘ไปฌ็š„่€ๅธˆไปฌๅœจไธ€่ตท็š„ๆ„Ÿ่ง‰็œŸๅฅฝ\n351503 1 ๆ‚จๅฅฝ๏ผšๆฑŸๅคไธญ็™พไป“ๅ‚จๅœฃๅ…ƒๅฅถ็ฒ‰xๆœˆxๅท่‡ณxๆœˆxxๅทๆญขx.xๆŠ˜ๅ”ฎ๏ผๆฌข่ฟŽๆƒ ้กพ๏ผ\n351504 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๆ…ง่ช็ฝ‘็š„ๅฎขๆˆท็ป็†ไป˜ๅฐ‘่ฝฉ๏ผŒๆ…ง่ช็ฝ‘็บฏไฟก็”จ๏ผŒไฝŽๅˆฉๆฏ๏ผŒๆ— ๆŠตๆŠผ่ดทๆฌพ๏ผŒๅˆฉๆฏๆ˜ฏxๅŽ˜xๅˆฐxๅŽ˜xไน‹้—ด...\n1000 Processed\n classify content\n352000 1 ๅ–œๅบ†ๅ…ƒๅฎต ็Œœ็ฏ่ฐœ ่ตขๅคง็คผ๏ผ›ๅŒๆ—ถไธบไบ†ๅ›ž้ฆˆๅนฟๅคงๆ–ฐ่€้กพๅฎข๏ผŒ็‰นๆŽจๅ‡บไปฅไธ‹ไผ˜ๆƒ ๆดปๅŠจ:x๏ผŒๅ…จๅœบๆ‰‹ๆœบไฝŽ่‡ณxx...\n352001 0 ๅฎŒ็ˆ†่Šฑๅƒ้ชจ๏ผš็ปˆๆœ‰ไธ€ๅคฉๆˆ‘ๆ‰‹ไธญ็š„็ผ–่ฏ‘ๅ™จๅฐ†ๆˆไธบๆˆ‘็ต้ญ‚็š„ไธ€้ƒจๅˆ†\n352002 0 ่€Œๆ˜ฏไธ€็งๆฌงๅผ้ฃŽๆ ผๅธฆๆฅ็š„ไธ€็งๆ–‡ๅŒ–ไผ ็ปŸๆ‰€่กจ่พพ็š„ๅผบ็ƒˆ็š„ๆ–‡ๅŒ–ๅ†…ๆถต\n352003 0 ๆŒฏๅŽ่ฐƒ่Š‚110่ฐƒ่Š‚ไพ็„ถ่…ฟ็–ผๅนถไธ”ๅๅคๆ— 
ๅธธๆžๆ— ไฟก่ช‰ๅฏ่จ€\n352004 0 ๅพฎ่ฝฏๅทฒ็ปๅฎฃๅธƒwindows8\n1000 Processed\n classify content\n352500 0 ็ป™ๅคงๅฎถๅˆ†ไบซโ€œfenlei็ญ‰โ€ๆ–‡ไปถ\n352501 0 ๆˆ‘ไธๆ‡‚ไธบไป€ไนˆๆ—…ๆธธๅ›žๆฅๅŽๆˆ‘ๅฆˆไธ€็›ดๅœจๅพฎไฟกๅˆ†ไบซๆˆ‘ๅ›ฝๅค–็‰นๅˆซๆ˜ฏๆฌงๆดฒ็š„็พŽ้ฃŸ็พŽๆ™ฏโ€ฆโ€ฆไฝ ๅ‘Š่ฏ‰ๆˆ‘ไป€ไนˆๆ„ๆ€่ฟ˜ๆƒณๅŽป...\n352502 0 ๆˆ‘็Žฐๅœจๅฏนๅทจๅคง็š„ๅปบ็ญ‘็‰ฉๆทฑๆตทๅคง้ฑผ้ซ˜็ฉบๆœ‰ไบ†ๆ›ดๆทฑไธ€ๅฑ‚็š„ๆๆƒงๆ„Ÿ่€Œไธ”ๆฐธ่ฟœ้ƒฝไธไผšๅฅฝไบ†\n352503 0 ็ป™ๅคงๅฎถๅˆ†ไบซโ€œ่ทณ่ˆž่ง†้ข‘xfulidx\n352504 0 ็ก็œ ไธๅฅฝ้œ€่ฆๅŽปๅŒป้™ขๆฃ€ๆŸฅๆˆ–่กฅๅ……VDๅ—\n1000 Processed\n classify content\n353000 0 ็œ‹ๆฅๅœจ้ญ”้ƒฝๅ้ฃžๆœบ่ฆๆๅ‰ๅ…ญไธชๅฐๆ—ถๅ‡บ้—จไบ†\n353001 1 ๆ„Ÿ่ฐข่‡ด็”ตๅธƒไธ้…’ๅบ—(ๆญๅทžๆฒณๅŠ่ก—ๅบ—)๏ผŒdown.podinns.comไธ‹่ฝฝๅธƒไธ้…’ๅบ—APP๏ผŒไธ“ไบซx...\n353002 0 ๆต™ๆฑŸ้ซ˜่€ƒ็ฌฌไบŒๆ‰นๆ–‡็†็ง‘ใ€ไฝ“่‚ฒๅŠ่‰บๆœฏไธ“็ง‘ๆŽงๅˆถๅˆ†ๆ•ฐ็บฟๆ˜จๆ™šๆญๆ™“\n353003 0 ๅ› ไธบ้ป„ๆŸ็”Ÿๅ‰ๆบๅธฆๆœ‰ไน™่‚็—…ๆฏ’\n353004 1 ๆ˜ฅๅญฃโ€œๅˆไธญไฝ“่‚ฒไธญ่€ƒๅผบๅŒ–โ€โ€œๅฐๅญฆไฝ“่‚ฒ็ปผๅˆ็ด ่ดจโ€่ฎญ็ปƒ็ญไบŽxๆœˆxxๆ—ฅๅผ€่ฏพ๏ผˆๆฏๅ‘จๅ…ญๆˆ–ๆ—ฅ่ฎญ็ปƒ๏ผ‰็Žฐๅทฒๅผ€ๅง‹...\n1000 Processed\n classify content\n353500 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ xrpxxxไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n353501 0 ไนŸๅฅฝๆƒณๅœจไธœๅฐๅธ‚็ฌฌไธ€ไธญๅญฆๆ‹ไธ€็ป„ๅ…ณไบŽ้’ๆ˜ฅ็š„ๅ†™็œŸๅ•Š\n353502 0 8ใ€ๅธ‚ๆ”ฟๅบœ็ณป็ปŸๅป‰ๆ”ฟๅปบ่ฎพ้‡็‚นไปปๅŠกๆŽจ่ฟ›ไผšไธพ่กŒ\n353503 0 ้žๆดฒๅކๅฒไธŠ้ฆ–ๅœบNBAๆฏ”่ต›ๅฐ†ๅœจๅ—้ž็š„็บฆ็ฟฐๅ†…ๆ–ฏๅ กไธพๅŠž\n353504 0 ๅž‹็”ทๅธ…ๅ“ฅๅฅ็พŽๅคง็‰‡21\n1000 Processed\n classify content\n354000 0 ๅ”ฑๆธ…ๅ‡‰็š„ๆญŒๅ†™ๅฆฉๅชš็š„่ฏ—ๅšไธปๅฏผ่ฏป\n354001 0 ๅ€Ÿ่ดทๅฎapp็Žฐๅ…จๅ›ฝๆ‹›่˜็บฟไธŠๆŽจๅนฟไบบๅ‘˜ๅพ…้‡ไธฐๅŽšๆŽจ่ไธ€ไบบ่‡ณๅฐ‘ๆœ‰20ๅ…ƒ็š„ไฝฃ้‡‘ๆœˆ่–ชไธŠไธ‡่ฝป่ฝปๆพๆพๆƒณๅš็š„่ฏท...\n354002 0 ไปŠๅนดๅ‚ๅŠ ็š„ๅฐๅฐ†ไปฅๆฑŸ่‹ๅ’Œๆต™ๆฑŸๅฑ…ๅคšๅ•Š\n354003 0 ๅฅฝๅฃฐ้Ÿณ้‚ฃไธชๅฅณๅญฉๅ”ฑallaboutthatbassไธฒๆˆ‘ๅœจไบบๆฐ‘ๅนฟๅœบๅƒ็‚ธ้ธก็š„ๆ—ถๅ€™ๆฒกๆœ‰่ทณ้‚ฃ็ง่ˆž\n354004 0 ่ฟ™ๆ•ดไธช่ฟ‡็จ‹ๅทฒ่ขซ่ญฆ่ฝฆไธŠ็š„่ฝฆ่ฝฝๆ‘„ๅƒๆœบๆ‹ไบ†ไธ‹ๆฅ\n1000 Processed\n classify content\n354500 0 ๆŠคๅฃซ่ดด็š„ๆญข่ก€่ดดๅฎŒๅ…จๆฒก่ดดๅœจไฝ็ฝฎไธŠ\n354501 0 
ๅฟซ้€Ÿไบ†่งฃC็›˜ไธญๅธธ่งๆ–‡ไปถๅคน็š„ไฝœ็”จ\n354502 0 ๅŽๆ‹ฑ่พฐไบซๆฐ”ๅžซ้˜ฒๆ™’ๆœ€ๅŽไธ€ไธช็™ฝ่œไปทๅ•ฆxxx\n354503 0 ้‚ฃไบ›ๅœฐๆ–นๆœ‰ๅฎถไบบๆœ‰้ข†ๅฏผ็œŸๆ˜ฏไธ่ƒฝๅฅฝๅฅฝ่ฏด่ฏๅ•ฆ\n354504 0 ๆฅ้’ๅบฆๅ‡็š„ๅฐๅฑๅญฉๅ„ฟๆœบๅ™จไบบๅ˜่บซไธญ๏ฝž\n1000 Processed\n classify content\n355000 0 ็Šฏ็ฝชๅซŒ็–‘ไบบๅผ ๆŸ่ขซๆญฆ่ฟ›ๅŒบๆฃ€ๅฏŸ้™ขไพๆณ•ๆ่ตทๅ…ฌ่ฏ‰\n355001 0 ๅพฎ่ฝฏๆญป็ฃ•่‹นๆžœSurfacePro4ๆ–ฐๆœบๆ›ๅ…‰ๆ€ง่ƒฝ้ฃ™ๅ‡\n355002 0 ใ€Œtwitterใ€scooterbraun\n355003 0 ็œŸๆƒณๅœจ่ƒŒๅŽ่ดดๅผ ็บธไธŠ้ขๅ†™ไธŠๆˆ‘tm็œŸ็š„ไธๆ˜ฏๆŠคๅฃซโ€ฆ\n355004 0 ๆฏๅคฉๅช่ƒฝๆœ‰6ๅๆธธๅฎขๆŽข่ฎฟไธ€ไธชๅคง็Œฉ็Œฉ่šๅฑ…็พค\n1000 Processed\n classify content\n355500 0 ๆˆ‘ๆƒณๅฏนๆฅผไธŠ่ฃ…ไฟฎ็š„ๅคงๅ“ฅ่ฏดๅฅ่ฏๆ‚จ่ฝป็‚น็ ธๅข™่–„ไธ็ป“ๅฎžๆ‚จ่ฏดๆ‚จ่ฆๆ˜ฏไธ€้”คๅญไธ‹ๅŽปๅŠฒไฝฟๅคงไบ†ๅœจๆމไธ‹ๆฅๅคšๅฐดๅฐฌๆˆ‘ๆ˜ฏ่ฏฅ...\n355501 0 ๆฏ”ๅฆ‚้˜…่ฏปใ€ๆ—…ๆธธใ€ๅ–่Œถโ€ฆโ€ฆ่ฎกๅˆ’ๅŽปๅทฅไฝœ่ฎกๅˆ’ๅŽป็”Ÿๆดปๆ‰ไธไผš่™šๅบฆไบบ็”Ÿ\n355502 0 ไธƒๅค•ๆ–ฐๅ“|่ฟ™ๅ‡ ๅคฉๅ’จ่ฏขๅฝฉ่™น็Žซ็‘ฐ็š„ไบฒๅฅฝๅคš\n355503 0 ้›†ๆˆCortana็š„่ฟ‡็จ‹ไธญ้‡ๅˆฐ็š„ๆŒ‘ๆˆ˜\n355504 1 ๅ…ˆ็”Ÿ๏ผŒๆ‚จๅฅฝ๏ผ ๆˆ‘ๆ˜ฏๅˆš่”็ณป่ฟ‡ๆ‚จ็š„(ๅšๆ—  ๆŠตๆŠผๆ— ๆ‹…ไฟ็š„ไฟก็”จๅ€Ÿๆฌพ๏ผ‰๏ผŒๆœ€ๅฟซxๅคฉๅˆฐๅธใ€‚ ไปฅๅŽๆ‚จๆˆ–ๆœ‹ๅ‹...\n1000 Processed\n classify content\n356000 1 x.ๆดปๅŠจๆœŸ้—ดไธ‹ๅฎšๅฏ่Žท่ต ไปทๅ€ผxxxๅ…ƒ็ดข่ฒไบš่ฝฏๅŒ…็™ปไธ€ไธช๏ผŒไบคๆฌพๆปกxxxxxๅฏ่Žท่ต xxxๅ…ƒ็ดข่ฒไบš่ฏ•่กฃ...\n356001 0 ๆฅ่‡ชๆปจๆตทๆ–ฐๅŒบ็ดซไบ‘ไธญๅญฆๆ–ฐ็–†็ญ็š„80ๅคšๅๅธˆ็”Ÿๆฅๅˆฐๅคฉๆดฅๆ–‡ๅŒ–ไธญๅฟƒๅ‚่ง‚ๅคฉๆดฅๅš็‰ฉ้ฆ†\n356002 0 ๅฟซ็‚น็œ‹ๅง่ฆไธๅฐฑ่ขซ่…พ่ฎฏๅ’Œ่ฐไบ†\n356003 0 ๅ‡บ่ฟœ้—จๅ็ซ่ฝฆๅŒ้ฃžๆœบ่Šฑ่ดน้ƒฝๆˆๆœฌไธ€ๆ ท\n356004 0 ๅนถๅŠๆ—ถ่”็ณปๆถˆ้˜ฒๅคง้˜Ÿๅ’Œ120ๆ€ฅๆ•‘ไธญๅฟƒ่ตถๅพ€็Žฐๅœบ\n1000 Processed\n classify content\n356500 0 4ใ€ไฟๆŠค็šฎ่‚ค็ป†่ƒžไธๅ—ๅˆฐๅค–็•Œ็š„็Žฏๅขƒๅฝฑๅ“\n356501 0 xxๆฑŸๅ—ๅคงๅญฆๆœบๆขฐๅทฅ็จ‹ๅญฆ้™ขๅŒ…่ฃ…xxxx็ญโ€œ่ตฐ่ฟ›็ปฟ่‰ฒ\n356502 0 ้ฒๆ™บๆทฑ๏ผšๆˆ‘็š„้…’ๅบ—ไธ€ๅนด่ต”ไบ†xxไธ‡ไธค้“ถๅญ™ไบŒๅจ˜็ซ™่ตทๆฅๅ“ญ้“๏ผšๅคงๅ“ฅ\n356503 0 ThinkPadๆœ€ๅผบ็ฌ”่ฎฐๆœฌ้…ไปถThinkPadStackๆฅ่ขญ\n356504 0 ๅŠซๅŽ้‡็”Ÿ็š„ๆ‰‹ๆœบไผš่ฎฉๆˆ‘ๆ›ดๆ‡‚ๅพ—็ๆƒœ\n1000 Processed\n classify content\n357000 0 
โ€”โ€”Whichwouldyoulikefordinner\n357001 0 ๆžๅ้—ฎ็š„ๅทๅทฒ็ปๅ› ไธบไฟก็”จๅˆ†ๅคชๅฐ‘่ขซ้—ๅผƒไบ†\n357002 0 5ๆ˜ŸๆŽจ่ๆฑฝ่ฝฆไบบๆ€ปๅŠจๅ‘˜็™พๅบฆไบ‘็ฝ‘็›˜\n357003 0 ไธ‹ๅˆๆœ‰ๅŽป็ฆๅ›ญๅ•†ไธš่ก—ไธ€ๅธฆ็š„ๅฐไผ™ไผดๅ˜›ๆœ‰็š„่ฏ่ฎฐๅพ—่”็ณปๆˆ‘\n357004 0 ไปŽไธ‹้ฃžๆœบ้‚ฃไธ€ๅˆป่ฟŽ้ข่€Œๆฅ็š„็ƒญๆตช\n1000 Processed\n classify content\n357500 0 ๆˆ‘่ง‰ๅพ—่ฟ™ๆฏ”ๅœจNBAๆŠ•่ฟ›่ทณๆŠ•ๅމๅฎณๅคšไบ†\n357501 0 ๆˆ‘ไปฌๅœจๅ…จ็ƒxxxไธชๅœฐๆ–นๆไพ›ๅธฆ่ˆนๅ‘˜ๅŠจๅŠ›ๆธธ่‰‡็งŸ่ตไธšๅŠก\n357502 0 ๅ› ไธบ่ฟ™ๆ ทๆˆ‘ๅœจ่‡ชๅฎถ็”ต่„‘ๅ‚ฒๅจ‡็š„ๆ—ถๅ€™\n357503 0 ๅฅณๅคงๅญฆ็”Ÿ่ขซ็ฝ‘ๅ‹่ฃ…้ฌผ่ฝฎๅฅธไปฅไธบ้ญ้‡้ฌผๅŽ‹ๅบŠ\n357504 0 ไน‹ๅ‰ไธ€็›ดไปฅไธบๅชๆœ‰่…พ่ฎฏ็š„่ฝฏไปถๆ‰ไผš่ถŠๆ›ดๆ–ฐ่ถŠ็ƒ‚้€ผ็š„ไฝ ไธๅพ—ไธๅŽปๆ‰พ่€็‰ˆๆœฌ\n1000 Processed\n classify content\n358000 1 ๆ‚จๅฅฝ๏ผŒๅปบ่พ‰่ฏŠๆ‰€xๆœˆxๅทโ€”xๆœˆxxๅทไธพ่กŒไผ˜ๆƒ ไฝ“ๆฃ€ๆดปๅŠจ๏ผŒ้œ€ๆณจๆ„:x็ฉบ่…นๆŠฝ่ก€ๅ’Œๆ†‹ๅฐฟ๏ผŒx.B่ถ…ๆ—ถ้—ดไธบ...\n358001 0 ๆ™บ่ƒฝๆœๅŠกๆœบๅ™จไบบโ€œAngelโ€้ฆ–ๆฌกไบฎ็›ธ\n358002 0 ๆดป็”Ÿ็”Ÿๅœจ้ฃžๆœบไธŠ็ญ‰ไบ†ๅŠไธชๅฐๆ—ถๆฒกๆณ•ไธ‹ๆœบ\n358003 1 ่”กๅˆฉๆณข ๅปบ่ฎพ้“ถ่กŒ xxxx xxxx xxxx xxxx ...\n358004 0 ไปŽxๅŠ ้€Ÿๅˆฐxxxkm/hๅช้œ€่ฆไธๅˆฐx็ง’้’Ÿ\n1000 Processed\n classify content\n358500 1 ๆžณๆฒŸๅˆฉ็พคๅบ†x.xๅฆ‡ๅฅณ่Š‚๏ผŒไผŠๅˆฉๅฅถ็ฒ‰.้‡‘้ข†ๅ† .้‡‘่ฃ….็ๆŠคๅ…จๅœบๅฅถ็ฒ‰x.xๆŠ˜๏ผŒๆฌข่ฟŽๆ–ฐ่€้กพๅฎข่ดญไนฐใ€‚ๆ—ถ้—ด...\n358501 0 ่ฃฝไฝœ็ณ–้†‹่’œๆฑ๏ผš็ง˜ๆ–น๏ผš1ๆ–คๅคง่’œใ€1ๆ–ค็™ฝ็ฑณ้†‹ใ€ๅ†ฐ็ณ–4ไธค\n358502 0 ่ตท็ ๅœจๅพˆๅคšๆกˆไปถไน‹ไธญ้ƒฝๆœ‰็€ๅƒไธไธ‡็ผ•็š„่”็ณป\n358503 0 Wilkinson็š„ๅˆฎ่ƒกๅˆ€ๅนฟๅ‘Š\n358504 0 ๆฏๅนด้ƒฝไผšๅ› ไธบ็งŸๆˆฟ็š„ไบ‹ๆž็–ฏๆމ\n1000 Processed\n classify content\n359000 0 ็™พๅบฆๅพฎๅšไบ†ๅๅˆ†้’Ÿ้ƒฝๆ‰พไธๅˆฐๅ‡†็กฎไฟกๆฏ\n359001 0 ็ฃไป”ๆœƒ่ญฐๅฑ•่ฆฝไธญๅฟƒๅฑ•ๅปณ3ๅฑ•ๆˆฟX12ๅŒๅคงๅฎถ่ฆ‹้ข\n359002 0 /ๆฒณๅŒ—ไธ€่พ…่ญฆๆ— ่ฏ้ฉพ้ฉถๆ’žๆญป2ๅญฆ็”Ÿ้€ƒ้€ธ็Žฐๅทฒ่ขซๆ‰นๆ•\n359003 0 ่‡ชไปฅไธบๅพˆๅމๅฎณๅฎžๅˆ™ไธ่ฟ‡ๆ˜ฏไธ€็พค่ทณๆขๅฐไธ‘่€Œๅทฒ\n359004 0 ๅŒ—ๅคง่ก—ไนˆๅคšไบ†็‚นๆ—…ๆธธๆ™ฏ็‚น็š„ๅ–ง้—น\n1000 Processed\n classify content\n359500 0 ๆ‹’่ฏŠๆ˜ฏๆญฃๅธธ็š„ใ€Žไฝœไธบไธ‰็”ฒๅŒป้™ขๅŒป็”Ÿ\n359501 0 ็œ‹ๅฎŒ็”ตๅฝฑๅ›žๅฎถ็”ตๆขฏๅฃ็œ‹ๅˆฐ่ฟ™ไธช\n359502 0 
ๆŠข็บขๅŒ…็พคๆŠขๅˆฐๆœ€ไฝŽ็š„่ฆๅ‘็บขๅŒ…ไธๆ˜ฏๆœ€ไฝŽ็š„ๅฏไปฅไธ€็›ดๆŠข\n359503 0 ๅ› ๆญคๅฐๆœ‰DermatologistTested็šฎ่‚ค็ง‘ๅŒป็”ŸไธดๅบŠ่ฏ•้ชŒ็ญ‰ๅญ—ๆ ท\n359504 0 ๆฒกๆƒณๅˆฐ่ฟ˜ๆ˜ฏๅฏน่ฟ™ๆ ท็š„ไธๅ…ฌๆœ‰ไบ†ๅๅบ”\n1000 Processed\n classify content\n360000 0 xxxxๅนดxxๆœˆ่‡ณxxxxๅนดxๆœˆ\n360001 0 ๆŸไธชๆœบๆ™บ็š„ๅŒป็”Ÿ่ฏดๅƒ็‚น็ขฑๅฅฝไบ†โ€ฆโ€ฆ\n360002 0 ้ฆ–ๅ…ˆๆ˜ฏ่ฎค็œŸ่ฝๅฎžๅฐ้ขๆ‹…ไฟ่ดทๆฌพๆ”ฟ็ญ–\n360003 0 ๅ›บๅฎš่ต„ไบงๆŠ•่ต„ๅฎŒๆˆ่ฟ‘115ไบฟๅ…ƒ\n360004 0 ๆœ‰ไบ›ๆ„Ÿๆƒ…ๆ— ้œ€่ดจ็–‘ไฟกไปปๆ˜ฏๅฝผๆญค็ป™ๅฝผๆญค็š„\n1000 Processed\n classify content\n360500 0 1ไธ‡ๅ…ƒ่‹ๅทžๅ˜‰ๅนดๅŽไธคๅŽข่ฎฉๅˆฉ1\n360501 0 29ๆ“ไฝœๅปบ่ฎฎ๏ผš1ใ€ไบš็›˜ๆ—ถๆฎต็™ฝ้“ถ2930้™„่ฟ‘ๅš็ฉบ\n360502 0 ๆ‰‹ๆœบไธŠๅ›ฝๅ†…็ฝ‘่ถ…ๅฟซ็”ต่„‘ๅฐฑๆ— ๆ•Œๆ…ข่ฟ˜ๆ‰“ไธๅผ€??็”ต่„‘็™ฝ็—ดๆŠ˜่…พไธ€ๆ™š่ฟ˜ๆ˜ฏๆ•ดไธๅ‡บๆ‰€ไปฅ็„ถGoogle้ƒฝๆ‰“ไธๅผ€...\n360503 0 ๆธ…็ง€็š„่ถŠๅ—MMๅ”ฏไธ€ไธ่ถณๅฐฑๆ˜ฏๅคชๆต“็ผฉ\n360504 0 ไป–ไธ็Ÿฅ้“ไฟ„็พ…ๆ–ฏๆ”ฟๅบœๆœƒๅœจไบŒใ€‡ไธ€ไธ‰ๅนดๅ…ฌ้–‹ไบค้‚„ๅ‰่˜‡่ฏๅธถ่ตฐ็š„ๆ–‡็จฟ\n1000 Processed\n classify content\n361000 1 ไบฒไปฌ๏ผŒ้ญ”ๆณ•ๅŒป็”Ÿๅˆๆœ‰ๆ–ฐๆดปๅŠจไบ†๏ผŒ้ญ”ๆณ•ๅŒป็”Ÿ่Œ‰่މ็ณปๅˆ—ๅ’Œ็Žซ็‘ฐ้ƒจๅˆ†ไบงๅ“xๆŠ˜ไผ˜ๆƒ ๏ผŒ่…Š่Šๆฐดxxxๅ…ƒๅ’Œไนณxxx...\n361001 0 ็ปˆไบŽ็ญ‰ๅˆฐไฝ \n361002 0 2015ไธ–็•Œๅ„ฟ็ซฅ่ท†ๆ‹ณ้“ๆฏ”่ต›้ฉฌไธŠๅผ€ๅง‹ๅ•ฆ\n361003 0 ๅ–10๏ฝž15g้ฉฌ้ฝฟ่‹‹่‰็…Žๆฑคๆˆ–ๆฆจๆฑๆœ็”จ\n361004 0 ๅŽไธบๆ‰‹่กจๅ’Œไบงๅ“ๅŒ…่ฃ…่กจ้ข็œ‹ไธŠๅŽปไธŽๆ™ฎ้€šๆ‰‹่กจๆฒกไป€ไนˆๅŒบๅˆซ\n1000 Processed\n classify content\n361500 0 ๆˆ‘ไปฌ้ƒฝๆ˜ฏ365ๅคฉ่ฆไธบไธŽไฝ ๆœ‰ไธ€ๆ ท้œ€ๆฑ‚็š„ไบบๅ‘˜่€Œ่ฅไธš็š„\n361501 0 ๅ„็ง็ฑปๅž‹็š„ๅŒป่ฏไผไธš่ถŠๆฅ่ถŠๅคš\n361502 0 ๅ“†ๅ•ฆAๆขฆๆฑฝ่ฝฆๆ‘†ไปถๆœบๅ™จ็Œซ่ฝฆๅ†…้ฅฐๅ“่“่ƒ–ๅญๅˆ›ๆ„ๅ…ฌไป”ๅฎๅฝ“็Œซ่ฝฆ่ฝฝๆ‘†ไปถ\n361503 0 ไฝ†ไธ‰ๅๆณ•ๅฎ˜็ป„ๆˆ็š„ๅฎกๅˆคๅฐ็ป„ไนŸ่กจ็คบ\n361504 0 ้’ๆตทๆน–็•”199ๅ…ฌ้‡Œ้ช‘่กŒ็ญ‰ไฝ ๆฅๆŒ‘ๆˆ˜\n1000 Processed\n classify content\n362000 0 ่ฎฉๆŠ•่ต„่€…ๅผ€ๅง‹ๅ›žๅฝ’็†ๆ€งไธŽๅ†ท้™\n362001 1 ็š„โ€œ่ฏšไฟกxxx๏ผŒ็คผๆƒ ๅฑฑๅŸŽโ€œๆ„Ÿๆฉๅ›ž้ฆˆๆดปๅŠจ๏ผŒๅ…จๅœบๆ‰€ๆœ‰ๅ•†ๅ“็›ด้™xx%๏ผŒๅœจๆญคๆŠ˜ๆ‰ฃๅŸบ็ก€ไธŠ๏ผŒๆฏไบคxไธ‡ๅ…ƒ็›ด...\n362002 0 
ๆ˜ฏๅฏนๆบง้˜ณ่ดจ็›‘ๅฑ€ๅฑ€้•ฟๆ„Ÿๅ…ด่ถฃ็š„็›ธๅ…ณไบบ็พค่Žทๅ–ๆบง้˜ณ่ดจ็›‘ๅฑ€ๅฑ€้•ฟ่ต„่ฎฏ็š„้‡่ฆๅนณๅฐ\n362003 0 ๅˆ›้€ ไบ†NBAๅކๅฒไธŠ่ฅฟ้ƒจ็ƒ้˜Ÿๅ•่ต›ๅญฃไธปๅœบ่ƒœๅœบ็บชๅฝ•\n362004 0 ๅ—ไบฌ้›ถ้›ถๅŽ้˜Ÿ่”กๆถฆไธœ่Žทๆฃ‹็Ž‹็ป„ๅ† ๅ†›\n1000 Processed\n classify content\n362500 0 ๅœจๅฅฝๅฃฐ้Ÿณๅญฆๅ‘˜ๆฑŸๆบไธœ็š„้‡ๆ–ฐๆผ”็ปŽไน‹ๅŽ\n362501 0 ๅžไฟฎๅ…จ๏ผšๆณ•ๅฎ˜ๆฃ€ๅฏŸๅฎ˜ไธ่ƒฝๅ†ๆœ‰้“้ฅญ็ข—\n362502 0 ๅˆฐๅ…ฌๅธ็œ‹ไบ†ๅŠๅคฉ็”ต่„‘ๆ— ่Šไธ‹่ฝฆ้—ด็Žฉไบ†ไธชๆŠŠๅฐๆ—ถๆ‰‹ๆœบ\n362503 0 ็œ‹ๆฅๅ็ซ็ฎญ้ƒฝ่ตถไธไธŠๅˆซไบบ่Š‚ๅฅไบ†\n362504 0 ๅŸŽ็ฎกๆŠ—ๅฐ้˜ฒๆฑ›ๅบ”ๆ€ฅ้˜Ÿไผ้šๆ—ถๅพ…ๅ‘ฝ\n1000 Processed\n classify content\n363000 1 ใ€ๅฏ็ˆฑๅฏไบฒใ€‘ๆ˜ฅๅญฃๅคง่ฟ”ๅˆฉ๏ผšๆƒ ๆฐๅฌไธ€xxx๏ผŒๅฌไบŒxxx๏ผŒๅฌไธ‰xxx๏ผŒ้›…ๅŸนใ€ๅคš็พŽๆป‹ไธ€้™ๅˆฐๅบ•๏ผ๏ผ๏ผๅ…จ...\n363001 0 ๅฐๆˆทๅž‹็š„ๆฉฑๆŸœ่ฎพ่ฎกไนŸไธ่ƒฝๅฟ˜ไบ†ๅงๅฐ่ฟ™ไธ€็Žฏ่Š‚\n363002 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏ่ฏธๅŸŽไธ‡ๅ‘ๅ†ท่ฟๆˆ‘ๅค„ไธ“ไธšไปŽไบ‹ๅ…ฌ่ทฏๅŠๆตท่ฟๅ†ท่—่ฟ่พ“ไธป่ฆ่ทฏ็บฟ๏ผšๅนฟไธœ.ๅนฟ่ฅฟ.ไบ‘ๅ—.ๅ››ๅท.้‡ๅบ†....\n363003 0 ๆƒณ็Ÿฅ้“็”ตๅฝฑไธญ็š„ๅจฑไนๅœˆ็œŸ็›ธๆ˜ฏๅฆๅฑžๅฎž\n363004 0 ็ฒ‰่‰ฒ่›‹็ณ•2็ฒ‰่‰ฒๅฝฉๆ——3่“่‰ฒ่›‹็ณ•4่“่‰ฒๅฝฉๆ——\n1000 Processed\n classify content\n363500 0 ่”ๆƒณ็”ต่„‘ๅ…ฌๅธไฝฟๅ‘ฝ๏ผšไธบๅฎขๆˆทๅˆฉ็›Š่€ŒๅŠชๅŠ›ๅˆ›ๆ–ฐ\n363501 0 ่ฎฉๆˆ‘ๆ‰พไบฒๆˆšๆœ‹ๅ‹ๅ€Ÿไธ€ไธชๆ‰‹ๆœบๅ…ˆๅฐ†ๅฐฑ็€\n363502 0 ๆœ‰ไบบ่ดจ็–‘ๆˆ‘ไธ€ไธชๅŸบ็ฃๅพ’ๆœ‰ไฝ•่ต„ๆ ผไธบไฝ›ๆ•™ๅพ’่ฏด่ฏ\n363503 0 ๆˆ‘ๅœจgoogleไธŠๆœJu็š„็…ง็‰‡\n363504 0 ่Žทๅพ—็ฌฌ21ๅฑŠไธœๆ–น้ฃŽไบ‘ๆฆœๅๅคง้‡‘ๆ›ฒ\n1000 Processed\n classify content\n364000 0 ็ˆฑ็”Ÿๆดป็š„ไบบ้ƒฝๆœ‰3ไธชๅนณๆ–น็š„ไธ–ๅค–ๆกƒๆบ๏ผš้˜ณๅฐๆ˜ฏๆˆ‘ไปฌๅœจ่ฟ›่กŒๅฎถๅบญ่ฃ…ไฟฎ่ฟ‡็จ‹ไธญๆœ€ๅฎนๆ˜“่ขซๅฟฝ็•ฅ็š„ไธ€ไธชไธ–ๅค–ๆกƒๆบ\n364001 0 ้ข„่ฎกๅˆฐๅนดๅบ•ๅ†ๅฎŒๆˆxๆˆทไผไธšๆฌ่ฟๅ…ณๅœ\n364002 0 Techcrunchๆ’ฐๆ–‡็งฐ\n364003 0 ๅไผš็งฏๆžๆ”ฏๆŒ่ฏๅˆธๆŠ•่ต„ๅ’จ่ฏขๆœบๆž„ๅˆ›ๆ–ฐๅ‘ๅฑ•\n364004 0 ้€š่ฟ‡400ไฝ™ไปถๅŠจๆผซๆ‰‹็จฟใ€้›•ๅก‘ๅ’Œ็™พไฝ™้ƒจไธญๅค–็ปๅ…ธๅŠจ็”ปๅฝฑ็‰‡\n1000 Processed\n classify content\n364500 0 ่Šฑๅƒ้ชจๅƒ็š„ๅŒ…ๅญไธบๅ˜›ๆ˜ฏ็”ต่„‘ๅˆๆˆ็š„\n364501 1 
ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๅˆš็ป™ๆ‚จๆ‰“็”ต่ฏ็š„ไธญๅ›ฝ็”ตไฟก็š„ๅฎขๆˆท็ป็†ๅฐ้‚ฑ๏ผŒ้ข„ๅญ˜่ฏ่ดน้€่ฏ่ดน้€ๅฎฝๅธฆ็š„ไผ˜ๆƒ ๆดปๅŠจๆญฃๅœจ่ฟ›่กŒไธญใ€‚้ข„...\n364502 0 ่ฏทๅนฟๅคงๅธ‚ๆฐ‘็™ปๅฝ•ๆตท้—จๆ–‡ๆ˜Ž็ฝ‘\n364503 0 /ๆญๅทž๏ผšๅฎ˜ๅ‘˜่ขซไธพๆŠฅๅผบๅฅธๅฅณๅ‘˜ๅทฅไธŽไป–ไบบ้€šๅฅธ\n364504 0 BamBamๅˆšๆ›ดๆ–ฐไบ†Twitter๏ผš????????????????Hahahahahaha...\n1000 Processed\n classify content\n365000 0 ๅพฎไฟก็งๅ‘+461303้ชŒ่ฏ๏ผšๅฎๆณฝๆถ›ๅธ…ๅธ…็š„\n365001 0 ๆœ‰ไธช็ˆฑ็ฎก้—ฒไบ‹ไธๆ˜Ž็œŸ็›ธ็š„ๆ›ดๅนดๆœŸ่€ๅคชๅฉ†้—ฎๆˆ‘ไธบไป€ไนˆไธไธ€่ตทๅƒ\n365002 0 ๆ›ดๆ˜ฏ็ป™ๅ—ไบฌๆฅผๅธ‚ๅธฆๆฅไบ†ๅŽปๅŒ–ๆƒŠไบบ็š„ๆˆ็ปฉ\n365003 0 7ใ€้“ถๆๅถๅˆถๅ“ๅฆ‚้“ถๆๅถใ€้“ถๆ\n365004 0 ๅผ ๆฐธ็†™ๅ…ˆ็”ŸไบŽxxxxๅนดxๆœˆxxๆ—ฅไธ‹ๅˆxx็‚น้€ไธ–ไบŽๅ—ไบฌ\n1000 Processed\n classify content\n365500 0 ไธŽไน‹ๅŒน้…็š„8้€ŸDCTๅ˜้€Ÿ็ฎฑไนŸ่ƒฝๅพˆๅฅฝ็š„ๅฑฅ่กŒๅฎƒ็š„่ดฃไปป\n365501 0 ๆœ‹ๅ‹ๆ€ฅๅ‡บไธ€ๅฐ99ๆ–ฐ็คผ็›’็‰ˆtr350้™้‡่‰ฒ่–„่ท็ปฟ\n365502 0 qq๏ผš1985177275่”็ณป\n365503 0 ๅœจTwitter็œ‹ๅˆฐ้€™ๅผต็…ง็‰‡\n365504 0 ๆœ‰ไบบ่ฏดๆ˜ฏ่Šฑๅƒ้ชจๆฏไบ†้‚ฃไธช้ซ˜้ซ˜ๅœจไธŠ\n1000 Processed\n classify content\n366000 0 ๅฏนๆฑฝ่ฝฆ้›ถ้ƒจไปถ็‰นๅˆซๆ˜ฏๅฝฑๅ“ๆฑฝ่ฝฆๅฎ‰ๅ…จ่กŒ้ฉถ็š„ๅ…ณ้”ฎ้ƒจไปถ่ฆๅฎšๆœŸๆฃ€ๆŸฅ็ปดไฟฎ\n366001 0 ๅ—ไบฌ้™ถ็ฌ›ไน‹็ฝ—ๅ‹็พค็พคไธป้™ถ็ฌ›่‘ซ่Šฆไธ็ˆฑๅฅฝ่€…\n366002 1 ๅฐŠๆ•ฌ็š„ๅฎถ้•ฟๆ‚จๅฅฝ!็ฅžๅขจๆ•™่‚ฒๆ˜ฅๅญฃ็ญ(็ ๅฟƒ็ฎ—๏ผŒๅฃๆ‰๏ผŒ่‹ฑ่ฏญ๏ผŒ็ปƒๅญ—๏ผŒxFๅ…จ่„‘่ฎญ็ปƒ๏ผŒ็ป˜็”ป)ๆญฃๅผๅผ€่ฏพๅ•ฆ!ๆœฌ...\n366003 0 ๆ›ดdisppoint็š„ๆ˜ฏ๏ผšไธ€ไธชsecurity็ซŸ็„ถ่ฏดไปŠๅคฉๆฒกๆœ‰ไปปไฝ•ไบบ้€ๆฅไธœ่ฅฟ\n366004 0 ็”จๅพ—ๅฏ็ˆฑ่ฅฟ็“œๅๅธๆญ้…่œ‚่œœๆธๆŸ ๆชฌๅคชๆœ‰ๅ–œๆ„Ÿไบ†\n1000 Processed\n classify content\n366500 0 ็™พๅบฆ่‡ชๅทฑ่บซไธŠ็š„ๅฐๆฏ›็—…็š„ๆ—ถๅ€™ๆ€ปไผšๅ“็€ๆ€ป่ง‰ๅพ—ๅพ—ไบ†ไป€ไนˆไธๆฒปไน‹็—‡ๅฏๆ˜ฏ่ฐ่บซไธŠๆฒกไธชๅฐๆฏ›็—…ๅ‘ข่‡ชๅทฑ่ƒฝๅฟๅ—ๅฏๅˆซ...\n366501 0 ๅŽŸๅ‘ŠๆŽ้ข–ๆฟ€ๅŠจๅœฐๅฏนๅŠžๆกˆๆณ•ๅฎ˜่ฟž่ฟž้“่ฐข\n366502 0 ๆœ€ๅคšๅฏไพ›6ไบบๅŒๆ—ถ็”Ÿๆดปโ€ฆโ€ฆ้ป„ไฟŠๆฐ้žๅธธโ€œ้ช„ๅ‚ฒโ€ๅœฐไป‹็ป่‡ชๅทฑ็š„ๆˆฟ่ฝฆ\n366503 0 ๆ—ถ้—ด๏ผš13๏ผš00ใ€14๏ผš50ใ€18๏ผš30\n366504 0 ่€Œๅ‚ๅŠ ๅฌ่ฏ็š„19ๅไปฃ่กจ้ƒฝๅŒๆ„ๆฐดไปทไธŠ่ฐƒ\n1000 Processed\n classify content\n367000 0 ไฝ 
ไธ่ƒฝๆŠŠไป–้€ๅพ€ๅฐๅŒป้™ขๅ˜›โ€ๅ‘ตๅ‘ตๆฐ‘่ญฆ่ฟ™่ฏ\n367001 0 ๆœ‰ไบบๆ‹…ๅฟƒๆ™บ่ƒฝๆœบๅ™จไบบไผšๅจ่ƒไบบ็ฑป\n367002 1 ้ข†ๅฏผๆ‚จๅฅฝ๏ผๅŽๅค้“ถ่กŒ็ŽฐๆŽจๅ‡บๆ— ๆŠตๆŠผใ€ๅ…ๆ‹…ไฟ็š„ไฟก็”จ่ดท๏ผŒๆŽˆไฟก้ขๅบฆxxไธ‡๏ผŒไธ€ๅผ ่บซไปฝ่ฏๅณๅฏๅŠž็†ใ€‚่ฏฆ่ฏขไป˜็ป...\n367003 0 ๆ˜Žๅคฉ่ฟ™ไธชๆ—ถๅ€™ๅทฒ็ปๅœจ้ฃžๆœบไธŠไบ†\n367004 0 ็”ฑDatourenๆœ€ๆ–ฐ่ฎพ่ฎก็š„็ฒพ็ต็ฝ\n1000 Processed\n classify content\n367500 0 ๅฆ‚ๆžœ็ญ็ป„่ตถไธŠ่ตทไธ‹้’ปๆˆ–ๅธ่ฏๅ“\n367501 0 SJๆˆๅ‘˜ๅด”ๅง‹ๆบๅทฒ็ป้€š่ฟ‡้Ÿฉๅ›ฝไน‰ๅŠก่ญฆๅฏŸ็‰น้•ฟๅ…ต็š„ๆœ€็ปˆๅฎกๆ ธ\n367502 0 ็ปๅฏน่‰ฏๅฟƒไปทๅ†’็€็”Ÿๅ‘ฝๅฑ้™ฉๆ‰ๆŠ“็š„/้…ท/้…ท\n367503 0 NBAๅ„ๆ”ฏ็ƒ้˜Ÿๅทฒ็ปๅฏไปฅๅผ€ๅง‹ๅ’Œ่พพๆˆๆ„ๅ‘็š„็ƒๅ‘˜ๆญฃๅผ็ญพ็บฆ\n367504 0 ไฟๅญ˜ๆœ€ๅฎŒๅฅฝ็š„ๆดž็ฉดๅฏบๅบ™ๅปบ็ญ‘็พค\n1000 Processed\n classify content\n368000 1 xxๅนด็ ”็ฉถ็”Ÿๆœช่ฟ‡ๅœจ็บฟๅŠ ๅˆ†๏ผŒ่ฏทๅŠ qq๏ผšxxxxxxxxxไธ€ๆฌก่ฟ‡๏ผŒไธ่ฟ”ๅผน ๆฐธไน…ๆœ‰ๆ•ˆ๏ผŒๅฎŒๆˆไฝ ็š„ๅฟƒๆ„ฟ.\n368001 0 7ๅท็บฟๅธๆœบไธๆ˜ฏๅ–้†‰ไบ†ๅฐฑๆ˜ฏๅ—‘่ฏไบ†\n368002 0 ๅ„็งcoserๅ‡บ็Žฐๅœจๅ•†่ดธๅŸŽ\n368003 0 Gxx้•ฟๆทฑ้ซ˜้€Ÿ็”ฑๆญๅทžๅพ€่ฟžไบ‘ๆธฏๆ–นๅ‘ๅฎๆญๆฎตKxxxx+xxx่‡ณKxxxx+xxx้™„่ฟ‘ๆ–ฝๅทฅ็ป“ๆŸ\n368004 0 ๅด่งฃๅ†ณไธไบ†xxไบฟไบบๅŒป็–—ๅ…่ดน\n1000 Processed\n classify content\n368500 0 ๅˆฉ็”จไผช้€ ็š„็ป“ๅฉš่ฏใ€่ดญๆˆฟๅˆๅŒใ€่ดทๆฌพๅˆๅŒๅ’Œ้“ถ่กŒ่ฟ˜ๆฌพๅ•็ญ‰้ช—ๅ–ไฝๆˆฟๅ…ฌ็งฏ้‡‘\n368501 1 ๅปถๅ‰้ฉฌๅฏๆณข็ฝ—็ฃ็ –๏ผŒๅ…จๅŸŽๅผ•็ˆ†๏ผŒ็พŠๅนดๅผ€้—จ้€่ฃธไปท ้€ๅฅฝ็คผ. 
้€ไฟ้šœใ€‚ ไบคๅฎš้‡‘๏ผŒไบคๅ…จๆฌพ้ƒฝๆœ‰ๅคง็คผ๏ผŒๅ››ๅคง...\n368502 0 ๅฝ“ๆˆ‘่ฟˆ่ฟ›ๅฎฟ่ฟๅนฟๆ’ญ็”ต่ง†ๆ€ปๅฐ็š„ๆ—ถๅ€™\n368503 0 ไปฅๅคฉๆดฅๆปจๆตทๅ›ฝ้™…ๆœบๅœบไธบไธป่ฟ่ฅๅŸบๅœฐ\n368504 0 ็ง‘ๆŠ€ๅ‘ๆ˜Žๆ˜ฏๆˆ‘ไปฌ็š„ๅŸบๅ› ๅˆ›้€ ็š„่‚‰ไฝ“็š„ไผŸๅคงๅค–ๅปถ\n1000 Processed\n classify content\n369000 0 ๅฎถๅบญ่šไผšไธŠ่ญฆๅฏŸไผฏไผฏๆ•™ๆˆ‘ไธ€ๅ †ๅฆ‚ไฝ•ๅœจ็คพไบคๅœบๅˆไฟๆŠค่‡ชๅทฑ\n369001 0 ๆต™ๆฑŸไธœ้ƒจๆฒฟๆตท้ƒจๅˆ†ๅœฐๅŒบ12็บงไปฅไธŠๅคง้ฃŽๅทฒๆŒ็ปญ12๏ฝž20ๅฐๆ—ถ\n369002 1 ๆ‚จๅฅฝ๏ผŒๅ…ˆ็”Ÿ๏ผŒๆˆ‘ๆ˜ฏ่Žžๅคงไฟก่ดท็š„ๅฐๅงš๏ผŒๅˆฉๆฏๆœ€ไฝŽๅฏไปฅๅšๅˆฐxๅŽ˜ๅคšใ€‚ๆˆ‘ไปฌๅ…ฌๅธๅœฐๅ€๏ผšไธœ่Žžๅธ‚ๅ—ๅŸŽๅŒบๅ…ƒ็พŽ่ทฏๅŽๅ‡ฏ...\n369003 1 ๆ€ฅ็”จ้’ฑๆ‰พๆˆ‘ไปฌ๏ผŒๆ— ๆŠตๆŠผๆ— ๆ‹…ไฟใ€‚ๆˆ‘ๆ˜ฏๅฎœไฟกๆ™ฎๆƒ ๆŽ็ง€ๅช›๏ผŒๆˆ‘ไปฌๅทฒ็ปๆญฃๅผไธŠ็ญ๏ผŒ้œ€็”จ้’ฑ่ฏท่”็ณปๆˆ‘๏ผŒๆฌข่ฟŽๅ’จ่ฏขใ€‚\n369004 0 ๅฎŒไบ†ๅˆšๆ‰็œ‹ๆ–ฐ้—ปๅฐ้ฃŽๅˆฐๆต™ๆฑŸไบ†ๆˆ‘ๅฅฝ็ดงๅผ ๆ•ดไธชไบบๅœจๅ‘ๆŠ–ๅฅฝๆƒณๅ“ญไฝ ่ฟ˜ๆ˜ฏๆฒกๆœ‰ๅ›žๆถˆๆฏ\n1000 Processed\n classify content\n369500 1 ๅทโ€”xๆœˆxๅทๅ…จๅœบๆ˜ฅ่ฃ…x.xๆŠ˜๏ผŒไผšๅ‘˜ๆŠ˜ไธŠx.xๆŠ˜๏ผŒๅฆๅ…จๅœบๆถˆ่ดนๆปกxxxๅ…ƒไปฅไธŠ่ต ๅคชๅนณ้ธŸๆฑฝ่ฝฆๆŠฑๆž•ไธ€ไธช...\n369501 0 ๆˆ‘ไธชไบบ่ง‰ๅพ—็Žปๅฐฟ้…ธๅœจ25ๅฒไน‹ๅ‰ๆœ€ๅฅฝๅˆซ็”จ\n369502 0 ๅ้ฃžๆœบไธ€ๅฎš่ฆๆ•ท้ข่†œๅ้ฃžๆœบไธ€ๅฎš่ฆๆ•ท้ข่†œๅ้ฃžๆœบไธ€ๅฎš่ฆๆ•ท้ข่†œ\n369503 0 ๅ› ๆถ‰ๅซŒ็ŠฏๆŒช็”จๅ…ฌๆฌพ็ฝชๅ’Œ่ดชๆฑก็ฝช\n369504 0 ๆญฆๆฑ‰ๅธ‚ๆ–ฐๅปบๅ•†ๅ“ๆˆฟ8ๆœˆ1ๆ—ฅ้”€ๅ”ฎๅฅ—ๆ•ฐ626ๅฅ—\n1000 Processed\n classify content\n370000 1 ไฝ ๅฅฝ๏ผๆˆ‘ๆ˜ฏ็ฆไธด่ฃ…้ฅฐๅผ ้ปŽ๏ผŒๅ…ฌๅธๅผ€ๅนด้’œๆƒ ๏ผŒๅ“่ดจๆ•ด่ฃ…๏ผŒๅฎš่ฃ…ไฟฎๅณ้€โ€œ็พŽ+ๅ‡€โ€ไธ“ไธš้™ค็”ฒ้†›ใ€ๆ•ดไฝ“ๅฎถๅบญ็Žฏไฟ...\n370001 0 ้‡ๅบ†ๅฅฅไฝ“ๆœ‰38120ไบบ็œ‹ๅˆฐ็™ปๅทดๅทดๅผ็ขพๅŽ‹\n370002 0 ๅฐฑๅฏไปฅ้€š่ฟ‡ๆŸฅๆ‰พๅˆฐๅฏนๆ–น็š„IPๅœฐๅ€\n370003 0 ็ญพ็บฆไปชๅผๅœจๅธธๅทžๅธ‚็›ๅŸŽๅ•†ไผšไผš่ฎฎๅฎคไธพ่กŒ\n370004 0 ็™พๅบฆๅœฐๅ›พ่ฅฟๅฎ็š„ๅœฐๅ›พๆ•ฐๆฎ่ฏฅๆ›ดๆ–ฐไบ†\n1000 Processed\n classify content\n370500 0 ๅœจไธ‰ๆ˜ŸไธŽGoogle็จๆ—ฉๅ‡ๅฎฃๅธƒๅฐ†้’ˆๅฏนๆ——ไธ‹Androidๅนณๅฐ่ฃ…็ฝฎๆไพ›ๆฏๆœˆๆผๆดžไฟฎๆญฃๆ›ดๆ–ฐๅŽ\n370501 0 ๅœจๆทฎๅฎ‰ๅธ‚ๅŒป็”Ÿ็š„็œผ้‡Œๅฐฑ่ฟ™ไนˆๆฒป็–—\n370502 0 ๅˆšๅœจๆ˜“ๅธฎ็”ต่„‘็ง‘ๆŠ€ๆ‘‡ๅˆฐไบ†ๅ…่ดนๅฅฝไธœไธœ\n370503 1 ไบฒไฝ 
ๅฅฝ๏ผๆˆ‘ๆ˜ฏไธ“้—จๅœจไนๅทๅ…ฌ้ฆ†ๅšๆš–ๆฐ”/ไธญๅคฎ็ฉบ่ฐƒ็š„็พŽๆ™ฏ้”€ๅ”ฎๅทฅ็จ‹ๅธˆ--ๅฐๅ”ใ€‚ๅผ€ไผšๅพ—็ŸฅไปŠๅนดxxxๆดปๅŠจๅŠ›...\n370504 0 ๆˆ‘ไปฌ็ญ–ๅˆ’็š„้ฆ–ๅœบๆดปๅŠจ8ๆœˆๆœซๅฐ†็ฒพๅฝฉๅ‘ˆ็Žฐ\n1000 Processed\n classify content\n371000 0 ไน้ผŽ้›†ๅ›ขๅ€Ÿ่ดทๅฎไน‹้ช—ๅฑ€ๅคงๆญ็ง˜\n371001 0 ้•ฟๆฒ™ๅธ‚ๅ…ฌๅฎ‰ๆถˆ้˜ฒๆ”ฏ้˜Ÿ็‰นๅ‹คๅคง้˜Ÿ่ฎญ็ปƒๅœบๅœฐไธŠ\n371002 0 ้˜œๅฎๅŽฟ่ฟ˜ๆŠ“ๅฅฝโ€œ็บข่‰ฒ็›้˜œโ€”โ€”้˜œๅฎๅˆ†็ซ™โ€ไธปไฝ“็ฝ‘็ซ™็š„ๅปบ่ฎพๅทฅไฝœ\n371003 0 ๆ‰€ๆœ‰็œŸ็›ธๅชๆœ‰ๆˆ‘ไธ€ไธชไบบ่’™ๅœจ้ผ“้‡Œ\n371004 0 win10ๆˆ‘ๆ„Ÿ่ง‰ๅฐฑๆ˜ฏๅŸบไบŽwin7ๅšไบ†ไธ€ไธ‹ๅขžๅผบ\n1000 Processed\n classify content\n371500 0 maybeๆˆ‘ไธปๅ‹•ไธ€ไธ‹ๅฐฑๅˆ่ƒฝ่Š่ตทไพ†ไบ†\n371501 0 ่ƒฝ้กบๅˆฉๅˆฐ่พพ็ปˆ็‚นๆ‹ฟๅˆฐoffer็š„็œŸ็š„ๅฐ‘ไน‹ๅˆๅฐ‘\n371502 0 ๅŒๅญธๅ€‘ไบ†่งฃไบ†ๅธธๅทžๆขณ็ฏฆ็š„่ฃฝไฝœ้Ž็จ‹ๅ’Œๅ•†ๆฅญๆจกๅผ\n371503 1 ่€็ช–้›†ๅ›ขๅ…ฌๅธ่ฆๆฑ‚:ๅ†ณๅฎšxๆœˆxxๅทๆญฃๅผ่ฟ›้ฉปๅฑ•ๅŽ…ๆญ่ฟŽๅ…ซๆ–น่ดตๅฎพ!ๆฌข่ฟŽๅ…จๅ›ฝๅ„ๅœฐๆ–ฐ่€ๅฎขๆˆทๅคง้ฉพๅ…‰ไธด๏ผŒๆดฝ่ฐˆ...\n371504 0 ่งฃๅ†ณ็”ต่„‘่พๅฐ„ๆŠค่‚ค็พŽๅฎนๆŠ˜่€Œๅฟงไผคๅพ˜ๅพŠ็š„ๆ—ถๅ€™\n1000 Processed\n classify content\n372000 0 6ไธ‡ๅคšๅบงๅ„็ฑป้›จๆฐดไบ•่ฟ›่กŒๆŽ’ๆŸฅๅ’Œ็–้€šๆธ…ๆŽ\n372001 0 ไฟฉๅญฉๅญๅพˆไน…ๅฐฑ่ฆๆฑ‚ๆœ‰่‡ชๅทฑ็š„ๅฐๆค็‰ฉใ€ๆ€•็›ดๆŽฅไนฐ็ป™ไป–ไปฌไผšไธ็ๆƒœ\n372002 0 ๆฒกๆœ‰ๅคฑ่ดฅๅ“ชๆฅ็š„ๆˆๅŠŸๅพ€ๅ‰่ตฐๅพ€ๅ‰ๆ‹ผ็›ธไฟก่‡ชๅทฑ\n372003 0 ่ฟ™ๆ˜ฏไธ€ไธชๆˆ‘ไปฌๆณจๅฎšๆ— ๆณ•้•ฟ็›ธๅฎˆ็š„ไธ–็•Œ\n372004 0 ไธŠๅคฉๅคชไธๅ…ฌ~่ฏดๅฅฝ็š„ๅฅฝๅงๅฆนไธ€่พˆๅญๅ‘ข\n1000 Processed\n classify content\n372500 0 siri็•Œ้ขๅ’Œwatchไธ€ๆ ทไบ†\n372501 0 ่‰ฒๅทsupercoral็”จไธ‰ๆฌก\n372502 0 ๅนถๅฏน่ฟ™็ง่…่ดฅ็Žฐ่ฑก่ฟ›่กŒไบ†่พ›่พฃ็š„่ฎฝๅˆบๅ’ŒๆŠจๅ‡ป\n372503 0 ๅˆฐไบ†็Žฐๅœบไฝ ไปฌๆ˜†ๅฑฑๅœ†้€šๆ€ปๅ…ฌๅธๅทฒ็ปๆŠŠๅŒ…่ฃนๅ…จ้ƒจ็ง่‡ชๆ‹†ๅผ€\n372504 0 ็„ถๅŽๅ‘Š่ฏ‰ไฝ ไฝ ็›ธๆœบๅ’Œๆ‰‹ๆœบๆœช่ฟžๆŽฅ\n1000 Processed\n classify content\n373000 0 ไบบๆดป็€่ต–็€ไธ€ๅฃๆฐงๆฐ”ๆฐงๆฐ”ๆ˜ฏไฝ ๅฆ‚ๆžœไฝ ็ˆฑๆˆ‘ไฝ ไผšๆฅๆ•‘ๆˆ‘\n373001 0 ็”จๅŒไธ€ไธชไบš้ฉฌ้€Š่ดฆๅทไนฐไบ†ไธคไธช\n373002 0 ๅ…ˆๅผ•่ฏฑไบ†้›†ๅ›ข็š„ๅ“ฅๅ“ฅไฝ†ๆ˜ฏๅ“ฅๅ“ฅๆ˜ฏไธ€ไธช่บซไปทๆธ…็™ฝ็š„ไบบ\n373003 0 99%็š„ๅฅณๆ€ง็ป็ฌฌไธ€ๆฌกไฝฟ็”จfemfreshๅŽ้ƒฝไผš้€‰ๆ‹ฉๆŽจ่็ป™ๅ…ถๅฅฝๅ‹\n373004 0 
ๆ˜ฏ็พŽๅ›ฝไบบๆ— ่€ป่€Œๅ‡ถๆฎ‹็š„ไพตๅ ไบ†ไธญๅŽ้ข†ๅœŸ\n1000 Processed\n classify content\n373500 0 ็ฉบๆ–นๅœจๅ›ฝไผๆœŸ่ดงไธŠๅฏ่ตš800ไบฟ\n373501 0 ๅฟไธไฝๅˆ†ไบซๅคฉๅบœๅนฟๅœบๅœฐ้“็ซ™ๅ‡บๆฅ็š„ไปŠ็ซ™่ดญ็‰ฉๅนฟๅœบ็Žฐๅœจๅผ€ไธšไบ†่ฎธๅคšๅƒไธœ่ฅฟ็š„ๅบ—\n373502 0 ่ฆ็Ÿฅ้“่…พ่ฎฏๅ‘Šๅ€’ๅฟซๆ’ญ็š„็†็”ฑๆ˜ฏๅ› ไธบ็›—็‰ˆ่€Œไธๆ˜ฏ่‰ฒๆƒ…\n373503 0 ??Deaๅˆ็ง‹้ซ˜็บงๅฎšๅˆถๅคๅค้’ˆ็ป‡ๅŠ่ฃ™ไธŠ่บซ้žๅธธๅฅฝ็œ‹็š„ไธ€ๆฌพ้’ˆ็ป‡่ฃ™ๆœ‰็€ๆฌง็พŽๅคง็‰Œๆ„Ÿ็š„ๅคๅค็บน่ทฏๆญ้…ไปปไฝ•ไธŠ...\n373504 0 ๅฎœๅ…ดๆœฌๅœŸไนฆ็”ปๅฎถ็š„ไฝœๅ“ไธๅœจๅฐ‘ๆ•ฐ\n1000 Processed\n classify content\n374000 0 ๅฐฑๆ˜ฏๅปบ็ญ‘็‰ฉๆˆ–่€…ๆž„็ญ‘็‰ฉ็š„้ชจๆžถ\n374001 0 ๅœจๅพๅทžไฝ ๅฏ่ƒฝๅฌ่ฟ‡ๅ–็ฑณ็บฟใ€ๅ–้ฆ„้ฅจ\n374002 0 ไธ่ฟ‡ๆ‹Œ็€่œ‚่œœ้…ธๅฅถ่ฟ˜ๅฏไปฅๅ‹‰ๅผบไธ‹ๅ’ฝ\n374003 0 xxxไธ‡ๅไผšๅ‘˜็š„ๆ‰‹ๆœบ่Šๅคฉ่ฝฏไปถโ€œKakaotalkโ€ๅผ€ๅ‘ๅ•†้Ÿฉๅ›ฝkaokaoๅ…ฌๅธๆŠ•ๅ…ฅxxxไบฟๅทฆๅณ่ต„้‡‘\n374004 0 โ€œ่ฟ™ไบ›ไบค้€š่ฟๆณ•่กŒไธบๅคง้ƒจๅˆ†ไธŽโ€˜่ทฏๆ€’็—‡โ€™ๆœ‰ๅ…ณ\n1000 Processed\n classify content\n374500 0 ๅ—ไบฌ่ทฏๆญฅ่กŒ่ก—่ฟ™้‡Œๅคฉๅˆ€็š„ๅนฟๅ‘Š็‰Œ\n374501 0 ๅธธๅทž่‹ฑ่ฏญๅฃ่ฏญๅŸน่ฎญๅฐ็ผ–ๅˆ—ไธพไบ†2015ๅนดไธŠๅŠๅนด็š„้‚ฃไบ›ๆต่กŒๆฝฎ่ฏๅ„ฟ\n374502 0 ็šฎ่‚ค้ƒฝ็–ผ๏ฝžๅฐฑๆ€•็”ต่„‘ๆ•ฃ็ƒญไธ่กŒ\n374503 0 hellokitty็‰ˆๆฐดไฟก็Ž„้ฅผ\n374504 0 ๅ‘ฝ่ฟๆœฌ็”Ÿไธๅ…ฌ่€Œๆˆ‘ไปฌๅˆ่ƒฝๅšไบ›ไป€ไนˆ้™คไบ†ๆ— ๆณ•ๆ”นๅ˜ๅ’Œๆ„คๆ€’ไธๆปกๅ…ถๅฎžไธ€ๆ— ๆ˜ฏๅค„้šๆณข้€ๆตๆดป็€\n1000 Processed\n classify content\n375000 0 ไธบไป€ไนˆๆˆ‘็™พๅบฆไบ‘app็‚น่ฟ›ๅŽป\n375001 0 ๅฒณ้˜ณๅŒป้™ข่‡ชๅˆถ็š„็™ฝๅœฐ็ฅ›่„‚ๅˆๅ‰‚\n375002 0 โ€œๅธธ่ง่ฟ‡ๆ•ๆ€ง็–พ็—…็š„้ข„้˜ฒไธŽ้˜ฒๆฒปโ€ไธป้ข˜่ฎฒๅบงไธบๅคงๅฎถ่ฎฒ่งฃไบบไฝ“่ฟ‡ๆ•็š„ๅŽŸๅ› ใ€ๅฆ‚ไฝ•ๆญฃ็กฎ่ฏ†ๅˆซใ€้ข„้˜ฒๅŠๆฒป็–—่ฟ‡ๆ•ๆ€ง็–พ็—…\n375003 0 ๆธ…้™คๆต่งˆๆ•ฐๆฎๅ’Œcookieๆ— ๆ•ˆ\n375004 0 ้€‚ๅฝ“pingไธŠไธ€ไบ›ๅ…ป็”Ÿๅคฉๅ“็š„ๆตท้ฒœ้‚ฃๅฐฑๅคช็พŽไบ†\n1000 Processed\n classify content\n375500 0 ๅ‡บๅฃๆฌง็พŽ็š„ๆก็บนๅซ่กฃ3ๅˆฐ8ๅฒๅฏ็ฉฟ\n375501 0 ไผš่ฎฎๅผบ่ฐƒไธ“้กนๆ•ดๆฒปๆดปๅŠจๅฟ…้กป๏ผš่ฎคๆธ…ๅฝขๅŠฟ\n375502 0 ไธ€่พ†156่ทฏๅ…ฌไบค่ฝฆๅณๅŽ่ฝฎๅŽ‹ๅˆฐไบ†ไธ€ไฝ่€ๅคชๅคช็š„ๅณ่„š\n375503 0 โ€œ็”ตๆขฏๅƒไบบโ€ไบ‹ๆ•…ๅ‰5ๅˆ†้’Ÿๅ•†ๅœบ็›‘ๆŽง่ง†้ข‘ๆ›ๅ…‰\n375504 0 
ๅธ‚ๆ”ฟๅบœๆ–ฐ้—ปๅŠžๅ…ฌๅฎคๅฌๅผ€ๅ…จๅธ‚ๅ…ฌๅ…ฑไบค้€šๅฎ‰ๅ…จ็›‘็ฎกไฝ“็ณปๅปบ่ฎพๆƒ…ๅ†ตๅ‘ๅธƒไผš\n1000 Processed\n classify content\n376000 0 ไปŠๅคฉ็œ‹ไบ†็Žฐไปฃๅ•†ไธš็‰ˆ็š„ๆขต้ซ˜็บชๅฟตๅฑ•\n376001 0 ๅฎƒๅพˆๅฏ่ƒฝๆ˜ฏๆญคๅ‰ไผ ่จ€็š„Lumia950็ณปๅˆ—ๆœบๅž‹\n376002 0 ๅฎพๅค•ๆณ•ๅฐผไบšๅคงๅญฆๆœบๅ™จไบบๆ›ฒๆฃ็ƒ่ต›\n376003 1 xxxx้พ™ๆธธ่‰พ่Žฑไพๅฅณ่ฃ…ๆ—ฉๆ˜ฅ็ณปๅˆ—่ŒๅŠจไธŠๅธ‚๏ผŒxๆœˆๅฅณ็Ž‹่Š‚๏ผŒ่ง่ฏ็พŽไธฝxxๅ˜ใ€‚ๆฌข่ฟŽๆ–ฐ่€้˜Ÿๅ‘˜้กพๅฎขๅ‰ๆฅ้€‰่ดญใ€‚\n376004 0 ๆปจๆตทๆ–ฐๅŒบไธพๅŠžๅ„็ฑปๅฑ•่งˆ่ฎบๅ›85ไธช\n1000 Processed\n classify content\n376500 0 ไธ‹้ข2ๆฌพๅฎนๅ™จๅ†…้ƒจ่ฎพ่ฎกไนƒไปฌๆƒณ่ฆไป€ไนˆๆ ท็š„ๆถ…\n376501 0 ็†่ดข็ป็†ไธๅœจ็š„ๆ—ถๅ€™ๆˆ‘ๅฐฑๅพ—็›ฏไฝŽๆŸœ\n376502 0 ๅ›žๅบ”่ดจ็–‘็š„ๆ–นๅผๅฐฑๆ˜ฏ้ป˜้ป˜ๆฒ‰ๆท€\n376503 0 ๆณ—ๆดชๅŽฟไธŽ้˜ฟ้‡Œๅทดๅทดๅ…ฌๅธ็ญพ่ฎขๅ†œๆ‘็”ตๅญๅ•†ๅŠกๅˆไฝœๅ่ฎฎ\n376504 0 ๆœ‰bigbangๅ—ไบฌๅœบ980็œ‹ๅฐๆญฃๅฏน่ˆžๅฐ็š„็ฅจๅ—\n1000 Processed\n classify content\n377000 0 ๅฏ’ๅ‡็ป“ๆŸไบ†ๆœฌๆฅ่ฎกๅˆ’ๅฅฝ็š„็Ÿญๅ‘ๆฒกๆœ‰ๅœ†ๅœ†็œผ้•œๆฒกๆœ‰ๅ‡่‚ฅๆฒกๆœ‰ๅฏๅฐฑ่ฟ™ๆ ท็ป“ๆŸไบ†ๆ˜Žๅคฉๅฐฑ่ฆๅผ€ๅญฆไบ†ๅ—ฏๅผ€ๅญฆๅฟซไน่Œไฝ ...\n377001 0 ่ฟ™ๅบงไบŽๅฝ“ๅนด2ๆœˆ28ๆ—ฅๅผ€ๅทฅๅปบ่ฎพ็š„ไธญๅ›ฝ็›ฎๅ‰ๆœ€ๅคง็š„ๅฑ•็ฟ…ๅž‹้ซ˜ๆžถๆกฅ\n377002 0 ๅŠจๅŠ›็”ตๆฑ ็š„ๆต‹่ฏ•ไธŽ่ฏ„ไผฐๅœจ็บฏ็”ตๅŠจๆฑฝ่ฝฆไบงไธš็š„ๅ‘ๅฑ•่ฟ‡็จ‹ไธญ่‡ณๅ…ณ้‡่ฆ\n377003 0 ๅฆ‚ๆžœๅฅฝๅฃฐ้Ÿณ้ป‘ๅน•็œŸ็š„ๅฆ‚ๆญคโ€ฆโ€ฆไธ็œ‹ไบ†\n377004 0 ๅฑฑ่ฅฟ้ฆ–ๅฏŒ้‚ขๅˆฉๆ–Œ8000ไธ‡ๆ”ถ่ดญ1\n1000 Processed\n classify content\n377500 1 ๅŽไธบ ่ฃ่€€ ็•…็ŽฉxX (Chex-TLxx) ไฝŽ้…็‰ˆ ็™ฝ่‰ฒ ็งปๅŠจxGๆ‰‹ๆœบ ๅŒๅกๅŒๅพ…ไบฌไธœไปท๏ผš๏ฟฅ...\n377501 0 /ๅนฟ่ฅฟๅฎ˜ๅ‘˜้ญไธพๆŠฅไธŽ2ๅฅณๅคงๅญฆ็”Ÿๅผ€ๆˆฟโ€œๅŒ้ฃžโ€่ขซๅœ่Œ\n377502 0 LeohNewTabๅ…่ฒปๆผ‚ไบฎ็š„GoogleChromeๅˆ†้ ๅค–ๆŽ›\n377503 0 ๅถ็„ถ็œ‹ๅˆฐๆฑŸ่‹ๅซ่ง†ๅœจๆ”พ็ฟป่ฏ‘ๆˆไธญๅญ—็ปงๆ‰ฟ่€…ไปฌ\n377504 0 ๆ–ฐๅŒบๆŠ•ไฟƒไธญๅฟƒ็ป„็ป‡ไบ†20ไฝ™ไบบ็š„ไธ“ไธšๆ‹›ๅ•†ๅ›ข้˜Ÿ\n1000 Processed\n classify content\n378000 0 ็ญ‰ๆˆ‘ไปŽๅ—ไบฌๅ›žๆฅไฝ ไฟฉไธ€ไธชๅ›žไธŠๆตทไธ€ไธชๅผ€ๅง‹ๅŸน่ฎญไบ†\n378001 0 ๆญๅทžๅœฐ้“4ๅท็บฟ่ฟ่ฅ็ฌฌ188ๅคฉ\n378002 0 ๆˆ‘็œ‹ไธ€้›†่Šฑๅƒ้ชจๆ–—้ญ่™ๅพ—ไธ่กŒไบ†\n378003 0 ๅˆšๆ‰ไธ€ไธชไบบ็œ‹ๅˆฐๆˆ‘็š„ๆ‰‹ๆœบ่ƒŒๆ™ฏ\n378004 1 
ไพ›็š„ๆๆ–™็ฎ€ๅ•๏ผŒๆ‰‹็ปญๅฟซๆท๏ผŒๆœ€ๅฟซไธ€ๅคฉๆ”พๆฌพใ€‚ ๆ‚จๆˆ–่€…ๆ‚จ็š„ไบฒๆˆšๆœ‹ๅ‹ๆœ‰่ต„้‡‘ๆ–น้ข็š„้œ€ๆฑ‚็š„่ฏ๏ผŒๅฏ...\n1000 Processed\n classify content\n378500 0 ๅ’Œไนฐไธช้ฅๆŽงๆฑฝ่ฝฆๅ›žๆฅ็Žฉๅ„ฟ็š„ๆ•ˆๆžœไผผ็š„\n378501 0 ๅœจ้ฃžๆœบไธŠๅˆ็œ‹ไบ†ไธ€้ๆ˜Ÿ้™…็ฉฟ่ถŠ\n378502 0 ๆ‰‹ๆœบ้‡Œๅคšไบ†ๅฅฝๅคšๅ›่œœ็š„ๅ›พโ€ฆๅฏน\n378503 0 ไธญๅ›ฝ็ŸณๆฒนA่‚ก่‚กไปทๆฏ”H่‚ก้ซ˜104%\n378504 0 ๅ—ไบฌๅธ‚ไธญ็บงๆณ•้™ขๅฌๅผ€โ€œ้žๆณ•้›†่ต„ไธŽๅธๆณ•้ข„้˜ฒโ€ๆ–ฐ้—ปๅ‘ๅธƒไผš\n1000 Processed\n classify content\n379000 0 ไพ‹ๅฅ2Awomengotheatstrokeonthestreetandrushedtoth...\n379001 0 ๆฌง็พŽๅพˆๅคšไบบ็”จiPhoneไผš็”จๅฅฝๅ‡ ๅนดไธๆขๆ–ฐๆ‰‹ๆœบ\n379002 1 ๆ‚จๅฅฝๆˆ‘ๆ˜ฏๆ˜Ÿ่พฐ่ทฏ็”ฑๅ™จๆ‰นๅ‘\n379003 0 ๆทฎ้˜ดๅฑ ๅฎฐๅœบ้‡Œๆœ‰ไพฎ่พฑ้Ÿฉไฟก็š„ๅนด่ฝปไบบ\n379004 0 ๅคงๅฎถไนŸ้ƒฝ็Ÿฅ้“ๆŠ—็”Ÿ็ด ๆปฅ็”จ็š„ๅŽๆžœ\n1000 Processed\n classify content\n379500 0 ่€Œ้’พไธŽ้’ ๅ…ฑๅŒ็ปดๆŠค็ป†่ƒžๅ†…ๅค–ๆญฃๅธธๆธ—้€ๅŽ‹ๅ’Œ้…ธ็ขฑๅนณ่กก\n379501 1 ็ฝ‘็ƒใ€ไน’ไน“็ƒใ€ๆธธๆณณใ€่ฝฎๆป‘ใ€ๆ•ฃๆ‰“็ญ‰ๅŸน่ฎญ๏ผๆฌข่ฟŽๆ‚จๆฅๆฌงๅฐผๅฅ่บซไฝ“้ชŒไธไธ€ๆ ท็š„ๆฟ€ๆƒ…๏ผๆฌงๅฐผๅฅ่บซไป˜่ฃ•็ฅๆ‚จ็”Ÿๆดปๆ„‰ๅฟซ๏ผ\n379502 0 ๅฏน้ข้ฉถๆฅ็š„ๅคงๆปดๆปดๅ‰็ฏ็…ง็š„ๆˆ‘ไป€ไนˆไนŸ็œ‹ไธๆธ…\n379503 0 ไธญๅˆๅฐฑไนฐไบ†็ข—ๆณก้ข่ฎฉๅฅน90ๅฒ็ˆถไบฒๅๅœจ้•ฟๆค…ไธŠๅƒ\n379504 0 ใ€€ใ€€่€ŒๅฏนไบŽๅ‰ไปป็š„่ดจ็–‘ใ€ๆฑก่”‘ใ€ๆ”ปๅ‡ป\n1000 Processed\n classify content\n380000 0 3ๅทๅ‡Œๆ™จๅพฎๅšๅ›žๅบ”๏ผšไธ‹ๅˆๅœบ็š„่ง‚ไผ—ไธๅ€ผๅพ—็ปดๆƒ\n380001 0 ไปŠๅคฉๆณกไบ†ไธญ่ฏๆพกๆœฌๆฅๆƒณๆ—ฉ็‚น็ก\n380002 0 ๆน–ๅ—6ๅ้’ๅนดๅœจๅŠžๅ…ฌๅฎค่ขซ19ไบบ็ ๆ€\n380003 0 ๅœจๆขฆๆณ‰็ซ็ฎญๅ›ขๆœ‰ไธ€ไธช็‰นๆฎŠ็š„ๅฐๅ›ข้˜Ÿ\n380004 0 ๆˆ‘ไนŸๆ˜ฏไฝฉๆœๆˆ‘ๅฆˆ็”จๅธธ็†Ÿ่ฏ่ฏปๅ‡บไบ†ๆˆ‘็š„ๆฏไธ€ๆก็Šถๆ€\n1000 Processed\n classify content\n380500 0 ๆˆ‘ๅ‘่ตทไบ†ไธ€ไธชๆŠ•็ฅจ็ˆฑๅฅณๅฟƒๅˆ‡ๅฏไปฅ็†่งฃ\n380501 0 ๅณไฝฟๅœจ่ฟ™ๆ ทๆŠ€ๆœฏๅ…ˆ่ฟ›็š„ๅŒป้™ขไนŸ้šพๅ…่ฝๅˆฐ้ญไบบ็—›้ช‚็š„ๅขƒๅœฐ\n380502 1 xๆœˆxxๆ—ฅๅฅฅๅŽ็”Ÿๆ€้›†ๆˆๅŠ้กถๅ‘จๅนดๅบ†๏ผŒๅŽ‚ๅฎถ่กฅ่ดดใ€้›ถๅˆฉๆถฆ้”€ๅ”ฎ๏ผŒ่ฎฉๅฅฅๅŽไธบๆ‚จๅฎถๅขžๆทปๆ—ถๅฐšๆธฉๆš–๏ผŒ่ฏšๆ„้‚€่ฏทๆ‚จ...\n380503 0 ่ด็‰นๆ–ฏ็š„LinkedIn่ต„ๆ–™ๆ˜พ็คบๅ…ถ7ๆœˆๅŠ ๅ…ฅ่‹นๆžœไปŽไบ‹่ฟ่ฅๅทฅไฝœ\n380504 0 
็ป่ฟ‡ไบ†ๆผซ้•ฟ่€Œๅˆ็Ÿญๆš‚็š„้ฃžๆœบๅ’Œ่ฝฆๅญ็š„่ทฏ็จ‹\n1000 Processed\n classify content\n381000 0 ็ฝ‘ๅ‹็›ดๅ‘ผ่ฎคไธๅ‡บ็œŸ็›ธ็ซŸๆ˜ฏ่ฟ™\n381001 0 ๆˆ‘ๆœ‰โ€˜็ดงๅนณๆข…โ€™็™พๅบฆไบ‘็›˜้œ€่ฆ่ฟžๆŽฅ็ง่Šๆˆ‘ไฟ่ดจไฟ้‡\n381002 0 ็บขๆตทๅบฆๅ‡ๅฐ้ฃŽๅธ†ๅขๅ…‹็ดขๅ†…้™†้ฃžๆœบๅฐ้ฃŽ้ƒฝ่ฟ‡ๅŽปไบ†\n381003 0 ้š”็ฆป+ไฟๆนฟ+้˜ฒๆ™’+็พŽ็™ฝ+่ˆ’็ผ“ไบ”ๅคงๅŠŸๆ•ˆ\n381004 0 ๅฐไผ™ไผดไปฌๅฐ†ๆญคๅฝ“ไฝœ่‡ชๅฎถ่ฃ…ไฟฎไธ€่ˆฌ\n1000 Processed\n classify content\n381500 0 ่ฟ™ๆฌกๅคงๆณขๅŠจๅœจๆ”ฟๅบœๅผบๅŠฟๅนฒ้ข„ไธ‹ๅนถๆœช้€ ๆˆ่ฏธๅฆ‚้‡‘่žๅฑๆœบ็ญ‰ๆถๅŠฃๅŽๆžœ\n381501 0 ้‡Œ้ขๅผบๅฅธ้€ป่พ‘็š„้ƒจๅˆ†ๅฎžๅœจๆ‡’ๅพ—ๅๆงฝ\n381502 0 ไน‹ๅ‰่Šฑไบ†ไธ€ๅคฉๅšๅฅฝ็š„6ๅคฉๆ—…ๆธธๆ”ป็•ฅ\n381503 0 ๆธฉ็ป็†๏ผš13798580082\n381504 1 ไฝ ๅฅฝใ€‚ ๆˆ‘ๆ˜ฏไนๆธธ็š„ๅ…ฌไธป๏ผŒๆ–ฐๅนดไปฅ่ฟ‡ไนๆธธktvๆญฃๅธธๅผ€ไธš๏ผŒๆฌข่ฟŽๅคงๅฎถๆฅๆงๅœบๅฅฝ็Žฉ็พŽๅฅณๅคš๏ผๆœ‰็ฉบๆฅๆงๅœบๅ“ฆ๏ผ...\n1000 Processed\n classify content\n382000 0 ็œ‹thenightshiftๅทฒ็ถ“็œ‹็˜‹ไบ†ๆฏๆฌก็œ‹้†ซ็™‚ๅЇๆˆ‘้ƒฝๅœจๆƒณๅ…ถๅฏฆๆˆ‘ๅฏไปฅ็•ถๅ€‹้†ซ็”Ÿ็š„้›ฃ้“ไธๆ˜ฏๅ—Ž\n382001 0 G15ๆฒˆๆตท้ซ˜้€Ÿๅพ€ๆต™ๆฑŸๆ–นๅ‘1291\n382002 0 ๅฏนไบŽๅฎš็‚นๅŒป็–—ๆœบๆž„็›‘็ฃ็ฎก็†ๅพˆ้‡่ฆ\n382003 0 ๅค„500ๅ…ƒไปฅไธŠ2000ๅ…ƒไปฅไธ‹็š„็ฝšๆฌพ\n382004 1 ๆ„Ÿ่ฐข่‡ด็”ตๆตทๅฎ‡ๅฝฉๅฐ๏ผŒๆœฌๅ…ฌๅธไธ“ไธš็”Ÿไบงๅ„็ง่‡ช็ซ‹ๆ‹‰้“พ่ข‹๏ผŒๆก‚ๅœ†ๅนฒ่ข‹ใ€้ข็ญ‹่ข‹ใ€็œŸ็ฉบ่ข‹ใ€่’ธ็…ฎ่ข‹ใ€็ณ–ๆžœ่ข‹ๅนถไธบ...\n1000 Processed\n classify content\n382500 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏ็บฝๆณฝ่ฃ…้ฅฐ็š„ๅฐๅˆ˜๏ผŒๆˆ‘ไปฌๅ…ฌๅธ้’ˆๅฏน่ฃ็››ๅœฐไบง็š„ๅฐๅŒบๅพ้›†ๆ ทๆฟ้—ด๏ผŒๅฎš่ฃ…ไฟฎๅ…ไธ€ๅนดๅฐๅŒบ็‰ฉไธš่ดน๏ผŒ่ฟ˜้€...\n382501 0 ็ป“ๆžœ็‚นๅผ€็ปดๅŸบ้“พๆŽฅ๏ผšThesocietyaimstofosterunderstandingan...\n382502 1 ไธญๅ›ฝๅนณๅฎ‰ๅนดๅŽๆŽจๅ‡บๅ‡ก่ดญไนฐ่ฝฆ้™ฉๅฐฑ้€ๅฐ‘ๅ„ฟๅŒป็–—ไฟ้™ฉ๏ผŒ่ดญไนฐๅฐ‘ๅ„ฟ้™ฉ้€ๆˆไบบไฟ้™ฉใ€‚ๆ•ฐ้‡ๆœ‰้™๏ผŒ้€ๅฎŒๅŠๆญข.ๅ’จ่ฏข็”ต...\n382503 0 ไบบๆฐ‘ๅธๆฑ‡็އ่ƒฝไธ่ƒฝๅฎˆไฝ็œŸไธๅฅฝ่ฏด\n382504 0 ๅ› ๆญคๆŠŠ่ฟ™็งๅปบ็ญ‘้ฃŽๆ ผๅธฆๅˆฐไธ–็•Œ\n1000 Processed\n classify content\n383000 0 ็„ถ่€Œๆฒกๆœ‰ไป€ไนˆ็”จ~ไพๆฎๅ—ไบฌๆฐ”่ฑกๅฐๅ‘ๅธƒ็š„ๅคฉๆฐ”้ข„ๆŠฅ\n383001 0 ๅœจTumblrไธŠ็š„็ฝ‘ๅไธบLazyBones\n383002 0 ๅฎ‰้พ™ๆ”ฟๅบœๆดพไธŠ็™พๅ็‰น่ญฆๅŽ‹ๅˆถๅ†œๆฐ‘ๅทฅ\n383003 
0 ้™„ๅŠ ๆˆๆœฌ=ๅฟƒๆƒ…ไฝŽ่ฝ้ƒ้—ท+ๅ่ช‰ๅฝข่ฑกๅ—ๆŸ+ๅฎถไบบๆœ‹ๅ‹ๆ‹…ๅฟง+ๅทฅไฝœๅญฆไน ็”Ÿๆดปโ€”โ€”ๅคๅญฃ็‚Ž็ƒญ\n383004 0 ๅŒ่Šฑ้กบๆ—ฉๅœจ2009ๅนดๅฐฑๅทฒ็™ป้™†ๅˆ›ไธšๆฟ\n1000 Processed\n classify content\n383500 0 ็„ถ่€Œๆˆ‘่ง‰ๅพ—ๆ˜ฏๆ—ถๅ€™ๅฐ†็œŸ็›ธๅ…ฌ่ฏธไบŽไธ–ไบ†\n383501 0 ไธ‹้ขๅฑฑไธœ่ฟœ้‚ฆ็ง‘ๆŠ€้›†ๅ›ขๅฐฑไธบไฝ ่ฏฆ็ป†\n383502 0 ๅฏน็”ฒๅบ”ไปฅไพตๅ ็ฝชไธŽ่ฏˆ้ช—็ฝชๅนถ็ฝš\n383503 0 ๅฏ่ฟ™ๆ ทๅšๅฎžๅœจๅคชKYไบ†ๆˆ‘ๅชๅฅฝๅฟ็€\n383504 0 ๅšไธชvi่ฟ˜่ฆๆˆ‘ๅ†™ๆ–‡ๆกˆๅš็ฝ‘้กต่ฎพ่ฎก\n1000 Processed\n classify content\n384000 0 ๆทฑๅœณๅœฐ้“ๆ–ฝๅทฅๅๅกŒ่ขซๅ›ฐ่€…ๅทฒๆ•‘ๅ‡บๅทฅๅœฐๅœๅทฅๆ•ดๆ”น\n384001 0 โ€โ€ฆB๏ผšโ€œ้‚ฃไฝ ๅ‘Š่ฏ‰ๆˆ‘่ฟ™้‡Œๆ˜ฏไธๆ˜ฏไธญๅŽไบบๆฐ‘ๅ…ฑๅ’Œๅ›ฝ\n384002 0 ไฝœๅ“่ฎพ่ฎก๏ผšๅผ ๅฏ’ๆ‘„ๅฝฑ๏ผšXYๅทฅไฝœ\n384003 0 ๅˆšๅˆšๅฏผๅ›พ็”ต่„‘ๆ็คบๆˆ‘็พ้šพๆ€งๆ•…้šœๆฅ็š„ๅคช็ช็„ถๆฒกๆœ‰ๆ‹ไธ‹ๆฅ็ฌ‘ๆญปๆˆ‘ไบ†็พ้šพๆ€ง23333\n384004 0 ๅœจๅฏน้˜ตๅพๅทž้…ทๆฃ’ไฟฑไน้ƒจ็š„ๆ—ถๅ€™\n1000 Processed\n classify content\n384500 0 ๆ™šไธŠๅšๆขฆๆขฆๅˆฐๅ้ฃžๆœบ้ฃžๆœบ่ขซ็ผ…็”ธๅๆดพๅ†›ๅŠซๆŒๆˆ‘่ทณๆœบ่ท‘ไบ†็„ถๅŽๅˆฐๅค„้—ฎ่ทฏไฝ†ๆ˜ฏ่ฏญ่จ€ไธ้€š่‹ฑ่ฏญไนŸๅคชๆธฃๆฒกไบบๅธฎๆˆ‘\n384501 0 ๆตชๅญๅ›žๅคด๏ผšNBA็ƒๆ˜Ÿ่ดๅ…‹ๆ˜ฏๅฆ‚ไฝ•่ดฅๆމไบฟไธ‡ๅฎถ่ดขๆฒฆ่ฝๅˆฐๅ–ๅ’–ๅ•ก็š„\n384502 0 ไปŠๅนดxxๅฒ็š„ๅˆ˜ๆŸๆ›พๅ› ่ดชๆฑกใ€็›—็ชƒใ€็Œฅไบตๅฅณ็ซฅ่ขซๅˆคๅˆ‘ๅ…ฅ็‹ฑ\n384503 0 ๅพฎ่ฝฏๅทฒ็ปๅฐ†iPhoneไธŠ็š„XboxMusicๅบ”็”จๆ”นๅไธบGroove\n384504 0 ๅˆšๅพ—็Ÿฅไธ€ไธชๅŒๅญฆๅœจ่…พ่ฎฏๆ€ป้ƒจไธŠ็ญ\n1000 Processed\n classify content\n385000 0 4ใ€ๆณจๆ„้ฅฎ้ฃŸๅซ็”Ÿๅ’ŒๆถˆๅŒ–้€š้“็–พ็—…ๅ‘็”Ÿ\n385001 0 ๆˆ‘ไปŽไฝ ่ฟ™ๆ•ฒๆˆ‘ๅฐฑ็œ‹ๅ‡บๆฅไฝ ่ดชๅฟƒ\n385002 0 ๆˆ‘ๅปบ่ฎฎๆ”ฟๅบœๅ…ฌๅฎ‰ๆœบๅ…ณๅœจๆˆ้ƒฝๅธ‚ๅฏไปฅไธฅๆ‰“ไธ€ๆฌก\n385003 0 ๆ˜ฏไธๆ˜ฏ็Šฏ็ฝชๆกˆไพ‹ๆ˜ฏๆœๆ’ฐๅ‡บๆฅ็š„\n385004 0 ่ถๆถ‚้˜ฒๆ™’้œœ็š„ๆ—ถ้—ดๅŽปๆฏ้ฅฌไบ†ไธ‹่ฟ™ไธช็Žฉๆ„\n1000 Processed\n classify content\n385500 0 ๅŒป้™ข็”ตๆขฏๅฎ‰ๅ…จ็ฎก็†ไบบๅ‘˜ๅ’Œ็”ตๆขฏ็ปดไฟไบบๅ‘˜ๅฏนๆฏ้ƒจ็”ตๆขฏ็š„\n385501 0 ๅๅ…ญๅนดๅŽ่ฎฉไป–ๅŽปๅ‚ๅŠ ๅฅฝๅฃฐ้Ÿณ็ฌฌไบŒๅๅญฃ\n385502 0 ๆฒกๆœ‰้˜ฒๆ™’้œœ็š„ๅคๅคฉ??????\n385503 0 ไฝ ไนŸๅฟซๆฅ่กจๆ€ๅง~้€‰ๅ‡บไฝ ๆœ€็ˆฑ็š„ๆœบๅ™จไบบๅค–่ง‚\n385504 0 ๆ„ๅค–็š„ๆ˜ฏCCTVx็ป™ไบ†ๅพˆๅคš้’ฑๆ‰€ไปฅไธญๅ›ฝๅ…ƒ็ด 
ๅพˆๅคš\n1000 Processed\n classify content\n386000 0 ็›—ๅข“็ฌ”่ฎฐ็™พๅบฆไบ‘\n386001 0 ๅคดๅ‘ๆ˜ฏ็ˆถๆฏๅŸบๅ› ๅ”ฏไธ€ไฟๅญ˜ๆ—ถ้—ดๆœ€้•ฟ็š„่ฝฝไฝ“\n386002 0 ๆฏ็ฎฑๅฅถ็ฒ‰ๆ”ฟๅบœไธฅๆ ผ็›‘็ฎกโ€ฆๆ‰€ไปฅๅช่ฆๆ˜ฏ็›ด้‚ฎ\n386003 1 ๆ‚จๅฅฝ๏ผŒๅนฟๅทž้‘ซไฟŠๅ‘ๆฑฝ่ฝฆ้…ไปถๆฌข่ฟŽๆ‚จๆฅ็”ตๅž‚่ฏข๏ผŒไธป่ฅ๏ผšๅฅ”้ฉฐ.ๅฎ้ฉฌ.ๅŽŸ่ฃ…ๆ‹†่ฝฆไปถใ€‚็œไผšๅŸŽๅธ‚.่ดงๅˆฐไป˜ๆฌพ.็”ต...\n386004 1 ่ฟŽไธ‰๏ผŒๅ…ซ่Š‚๏ผŒๆฐดไธญ่Šฑ็พŽๅฎน้™ข่”ๆ‰‹ไบงๅ“ๆ€ปๅ…ฌๅธ๏ผŒๆดปๅŠจๆœŸ้—ด๏ผŒ่ดญไนฐๆŠค่‚คๅ“ๆŽจๅ‡บไธ€็ณปๅˆ—ไผ˜ๆƒ ๆดปๅŠจ๏ผŒๅคšไนฐๅคš้€๏ผŒ้€...\n1000 Processed\n classify content\n386500 0 xxๆ—ฅPxPๆˆไบคๆ•ฐๆฎไธ€่งˆ่กจ๏ผšๅ็งฐๆ—ถ้—ดๅŠ ๆƒๆˆไบค้‡ๆˆไบค้‡ๅนณๅ‡ๅˆฉ็އๆŠ•่ต„ไบบๆ•ฐไบบๅ‡ๆŠ•่ต„้‡‘้ขๅนณๅ‡ๅ€ŸๆฌพๆœŸ้™...\n386501 0 ่ดขๅฏŒๆŽขๆบโ€ๅ…จๅ›ฝ็œไปฃไผš่ฎฎๅฌๅผ€\n386502 0 ็ป„็ป‡xxไฝ™็งๆ–ฐ่ฃ…ๅค‡่ฟ›่กŒๅฎžๆ‰“ๅฎž็ˆ†ๅฎžไฟฎ่ฎญ็ปƒ\n386503 0 ๆณฐๅ›ฝmistine็พฝ็ฟผ็ฒ‰้ฅผๅ’ŒChanel็ฒ‰้ฅผๆœ‰ไธ€ๆ‹ผ่€Œไปทๆ ผๅดๅนณๆฐ‘ๅพˆๅคšmistineๅ‡บๆฑ—้ƒฝไธไผš่„ฑ...\n386504 0 ๆˆฟๅฑ‹็š„่ฃ…้ฅฐ่ฃ…ไฟฎไธๅพ—ๅฝฑๅ“ๅ…ฑๆœ‰้ƒจๅˆ†็š„ไฝฟ็”จ\n1000 Processed\n classify content\n387000 0 ๅ้ฃžๆœบๆ— ็–‘ๆ˜ฏๅˆ็ปๅކไธ€ๆฌก็”ŸไธŽๆญป\n387001 0 ๆญป่€…ๅฎถๅฑžๅผ€ๅง‹ๆ‰ญๆ›ฒไบ‹ๆƒ…็š„็œŸ็›ธ\n387002 0 ่‹นๆžœ6plusๆ‰‹ๆœบๅฃณiPhone6plus้‡‘ๅฑž่พนๆก†5\n387003 0 ๅ‹ฟ็†็‰ฉ็†โ€่‰ฏๅธˆ็›Šๅ‹ๅธฎไฝ ๆ‹จๅผ€็†็ง‘ๆ•™ๅญฆ็š„่ฟท้›พ\n387004 0 ่พ›่‹ฆ็Žฉ็š„ๆธธๆˆ่ฆๅ‡31็บงไบ†่ฆๅˆฐๆˆ‘ๆƒณๅฎŒๆˆ็š„้‚ฃไธชไปปๅŠกไบ†\n1000 Processed\n classify content\n387500 0 twitter/morinorom๏ผšใจใ€ใจใจโ€ฆใจใชใ‚Š็ฉบใ„ใฆใ‚‹ใ‚ˆโ€ฆ\n387501 0 ๆ— ่ฎบไฝ ๆŠ•่ต„ไธ‹ๅŽปๅคšๅฐ‘่ดขๅŠ›ใ€็‰ฉ็†ๅ’ŒไบบๅŠ›\n387502 0 ๅŠ ๆˆ‘ๅซๆ˜Ÿ๏ผšT1523349912ๅ’Œๆˆ‘ไธ€่ตทๅš็ฝ‘่Œๅง\n387503 0 ๅœจๆญฃๅœจไธพ่กŒ็š„GDC2014ๅฑ•ไผšไธŠ\n387504 0 โ€่Šฑๅƒ้ชจไธ‹ๅทดๅทฎ็‚นๆฒกๆމไธ‹ๆฅ๏ผšโ€œ่ฟ™ไนˆๅคšไบบ้ƒฝๆ˜ฏๆฅๅ‘ๅผ‚ๆœฝๅ›้—ฎ้—ฎ้ข˜็š„ไนˆ\n1000 Processed\n classify content\n388000 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?ๆ„ฝ3\n388001 0 ่ฐ่ฏดๅฎšไฝ่ƒฝๆ‰พๅˆฐๆ‰‹ๆœบๅœจๅ“ช้‡Œ็š„\n388002 0 ่ฟ™ไธชๆ•ฐๅญ—ๅˆฐไบ†2014ๅนดๅˆ™็ฟปไบ†20ๅ‡ ๅ€\n388003 0 ๆฒกๆœ‰็ป่ฟ‡ๅˆปๆ„ไฟฎๅ‰ชๅ’Œ่ฎพ่ฎก็š„่‰ๆœจ่“ฌๅ‹ƒ็น่Œ‚\n388004 0 
โ€œ้‡Šๆญฃไน‰โ€ไธพๆŠฅไธ€ไบ‹โ€œๅฑžๆถๆ„่ฏ‹ๆฏโ€\n1000 Processed\n classify content\n388500 0 ่ฟ‘ๆ—ฅไฟ„โ€œๅซๆ˜Ÿโ€ๆ–ฐ้—ป้€š่ฎฏ็คพๆŠฅ้“\n388501 0 ไบบๅ‡60ๅนณๆ–น็ฑณๅ…ๅพไธบไธปๆตๆ„่ง\n388502 0 ๅนถไพ็…ง็›ธๅ…ณ่ง„ๅฎšไพๆณ•็ป™ไบˆๅค„็ฝš\n388503 0 ||ๆˆ‘ๅœจๆฐงๆฐ”ๅฌไนฆๆ”ถๅฌโ€œ็›—ๅข“็ฌ”่ฎฐxไบ‘้กถๅคฉๅฎซxxxโ€\n388504 0 ๅฝ“ไธญๅŒ…ๆ‹ฌ้ซ˜่พพ711็ฑณ็š„ๅ…จ็ƒๆœ€้ซ˜ไฝๅฎ…ๅคงๅŽฆโ€œ่ฟชๆ‹œไธ€ๅทโ€\n1000 Processed\n classify content\n389000 0 ๅพฎ่ฝฏๅ›žๅบ”่ฟ™ไธ€็Šถๅ†ต๏ผš่ฟ™ๅฏ่ƒฝๆ˜ฏXboxOneไธŠๆ–ฐDRM็ญ–็•ฅๅผ•ๅ‘ไธ€ไธชBug\n389001 0 ๅšๅฅฝไธ‡่พพๆ–‡ๅŒ–ๆ—…ๆธธๅŸŽ้กน็›ฎๅพๅœฐๆ‹†่ฟๆ‰ซๅฐพ้ƒจๅˆ†ๅ่ฎฎ็ญพ่ฎขๅทฅไฝœใ€ๅœŸๅœฐไธ‰ๆธ…ๅทฅไฝœ\n389002 0 ๅŽไธบๆ——ไธ‹่ฃ่€€ๅฐ†ๅœจๆฒ™็‰น้˜ฟๆ‹‰ไผฏ้ฆ–้ƒฝๅ‘ๅธƒๅŒๆ ทๆกฃๆฌก็š„ๆ–ฐๅ“ๆœฌๆ‹‰็™ป\n389003 0 ่ฏทไฟฎๆฐดๅŽฟๆ”ฟๅบœใ€ไฟฎๆฐดๅŽฟๅ…ฌๅฎ‰ๅฑ€ไปฅๅŠ็›ธๅ…ณ็š„ๅ•ไฝไธฅๆƒฉๆšดๅŠ›ไผคๅŒป็š„ไบบๅ‘˜\n389004 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผๆˆ‘ๅค„ๅฏๅฟซ้€ŸๅŠž็†ๅคง้ขใ€Šxx-xxxไธ‡ใ€‹ไฟก็”จๅก๏ผŒๅ‰ๆœŸๆ— ่ดน็”จ๏ผŒๆ— ๆˆท็ฑ่ฆๆฑ‚๏ผŒๅ‡ๅฏๅฅ—็Žฐ...\n1000 Processed\n classify content\n389500 0 ๆŸไบบไปฅ็ซ็ฎญ่ˆฌ็š„้€ŸๅบฆๆŠŠ้ฃŸ็‰ฉๅกžๆปกๅˆฐ่‡ชๅทฑ็š„่ƒƒ\n389501 0 ๆœ‰ๆ—ถๅ€™ๅพˆๆ€•ๆƒŠ็š„ๆ„Ÿ่ง‰ๅทฎไธๅคšไธคไธ‰ไธชๅฐๆ—ถๅ–‚ๅฅถไธ€ๆฌก\n389502 0 ๏ผปLittleLostProject๏ผฝๅœจๅคง่ก—ไธŠๆˆ‘ไปฌๅฏไปฅๅ‘็ŽฐๅพˆๅคšๅบŸๅผƒ็š„็‰ฉๅ“\n389503 0 ๅ„็ฑป็”ตๅŠจๆฑฝ่ฝฆๅœจๆ˜†ๅฑฑๅธ‚ๅ‘จ่พนๅ……็”ตโ€œ็ปญ่ˆชโ€้œ€ๆฑ‚\n389504 0 ไธ‹ไธชๆ‰‹ๆœบไธ็”จ่‹นๆžœไบ†็”จๅŽไธบๅŽไธบๅฐ‘ๅนดๆ‰ๆ˜ฏๆœ€ๅฅฝ็š„ๅฐ‘ๅนด\n1000 Processed\n classify content\n390000 0 ๅ…ถไธญไธญ็ขณ้“ฌ้“fecr55c200ๆ‹›ๆ ‡ไปทๆ ผไธบ11200ๅ…ƒ/60ๅŸบไปท\n390001 0 ๆŠฅ้“ไธ€็—…ไพ‹๏ผšๆ‚ฃ่€…ๆœ‰ๅญๅฎซ็ ด่ฃ‚ๅฒ\n390002 0 ็™พๅบฆ่ดดๅงโ€œ็พŽๅ›ฝโ€ไบŒๅญ—้ƒฝๆˆๆ•ๆ„Ÿ่ฏไบ†\n390003 0 ็”จๆฅๅฐ†Java็š„ๅฏน่ฑก่ฝฌๅŒ–ไธบJSON\n390004 0 ่ฟ™่ฎฉๆˆ‘่ดจ็–‘ๆˆ‘ไน‹ๅ‰็œ‹็š„้ƒฝๆ˜ฏไบ›ๅ•ฅ\n1000 Processed\n classify content\n390500 0 ๆˆ‘ไปฌ็‰ง้š†ๅฐฑๆ˜ฏๆ‚จ่บซ่พน็š„ไฟๆŠคไผž\n390501 0 ไธœๅŒบ่ญฆๆ–น็ ด่Žทไธ€่ตท็›—็ชƒๅผ€ไธš่Šฑ็ฏฎ็š„ๆกˆไปถ\n390502 0 186*3230ไป˜ๆฌพๆˆๅŠŸ่Žทๅพ—ไธ€ไธช5ๅ…ƒ็บขๅŒ…\n390503 0 ไธŠๆตทไธ€้ฉดๅ‹ๅœจๆต™ๆฑŸไป™ๅฑ…ๅŽฟๆœฑๅ‘้‡Œ็€‘ๅธƒโ€œ็€‘้™โ€ๆ—ถ\n390504 1 ๅพทๅ›ฝ่ฒๆž—ๆ ผๅฐ”ๅœฐๆฟx.xxไฟƒ้”€ 
ๅไผ˜ๅ“็‰Œใ€ๅฎžๆƒ ไปทๆ ผใ€่ฟ‡็กฌ่ดจ้‡ใ€่‰ฏๅฅฝ็Žฏไฟใ€ไผ˜่ดจๆœๅŠกใ€‚ ๅœฐๅ€๏ผš...\n1000 Processed\n classify content\n391000 1 $ๆ‚จๅฅฝ๏ผๆ–ฐๅนดๅฟซไนใ€๏ผšๅช่ฆโ€œ้“ณ ๅฐฑโ€้€โ€œ็™พๅˆ†ไธ‰ๅใ€‘ๅ‡บxๅˆ†้’Ÿๅˆฐๆˆท๏ผ๏ผ่ฏฆๆƒ…ไธŠใ€xxxxxw.CCใ€‘\n391001 0 ๆฒกๆžๆ˜Ž็™ฝ็œŸ็›ธๅฐฑไนฑ็”ฉไบบๅ˜ดๅทดๅญ็š„ๅฅณไธป่ง’\n391002 0 ้ป„ๆ™“ๆ˜Žๆ˜ฏ่ขซ่ดจ็–‘่บซ้ซ˜ๆœ€ๅคš็š„็”ทๆผ”ๅ‘˜ไน‹ไธ€\n391003 0 ่ฟ˜่ƒฝๅบ”็”จๆ™บ่ƒฝๅŒ–็†่ดขๆจกๅ—ๆŠตๆŠ—้ฃŽ้™ฉ\n391004 0 ๅ—ไบฌ็งฆๆทฎ่ญฆๆ–นๆŽฅๅˆฐไธ€ไธชๅฐไผ™ๅญๆŠฅ่ญฆ็งฐ\n1000 Processed\n classify content\n391500 0 ๅŽไธบไธ‰ๆฌพๆ–ฐๆœบๆ›ๅ…‰้…Kirinxxxๅค„็†ๅ™จ\n391501 0 ไธๅพ—ไธๅพ—่ฏดไธ€ๅฅ่Šฑๅƒ้ชจ่ฟ™ไนˆๅ‚ป็ผบ\n391502 0 ๆœ€ๅŽไธ€้ข—่ฟ˜่ฆ็ญ‰ๅˆฐ่Šฑๅƒ้ชจๆฅไบ†ๆ‰่ทŸๅฅนไธ€่ตทๅˆ†็€ๅƒๆ€้˜ก้™Œๆฒกๆœ‰็ณ–\n391503 0 ๅฆˆ็š„ไปฅๅŽๅฐฑๆŒ‡็€็™พๅบฆๅงๅ•Šไธ่ฟ›ๆญฅๅฐฑไผšๅƒๅฑŽๅ‚ป้€ผ\n391504 0 ๆ— ้”กๅธ‚ไธญ็บงไบบๆฐ‘ๆณ•้™ขๅฌๅผ€ๆ–ฐ้—ปๅ‘ๅธƒไผš\n1000 Processed\n classify content\n392000 0 ็›ธไฟก็†่ดขไบงๅ“ๆˆ‘ๅ€’ๅฎๆ„ฟ็›ธไฟก็‚’่‚กๆ›ด้ ่ฐฑ\n392001 0 ่‡ชๅœฐ้“xๅท็บฟไบŒๆœŸๅ’Œๅ‘จ่พนๅคšไธช็ซ‹ไบคๆกฅๆ–ฝๅทฅไปฅๆฅ\n392002 0 ๅคชๅคšๅคชๅคš็š„่ดชๅฎ˜ๆฑกๅ้žไฝ†ๆœ‰ๅพˆๅคšๅพˆๅคš็š„ๅŠžๆณ•ไผšๅŽป่งฃๅ†ณ\n392003 0 ๅค–็ป่ดธๅนฟๅœบ9ๆฅผ็š„็”ตๆขฏ้—จ็ˆ†็‚ธไบ†\n392004 0 ๆœ‰ๆ•ˆๅฏนๆŠ—UVAUVB้˜ฒๆญข้ป‘ๆ–‘ไบง็”Ÿ\n1000 Processed\n classify content\n392500 0 ๆ—…ๆธธๅœฐไบงๅ‘ๅฑ•ๆธ้œฒ้”‹่Š’ๅˆ†ๆƒๅบฆๅ‡ๆจกๅผๆˆๆœชๆฅ่ถ‹ๅŠฟ\n392501 1 ~ๆœ‹ๅ‹๏ผผๆœ€่ฟ‘ๅฅฝๅ—๏ผŸๆœ‰ ไธŠ ๅฅฝ ็š„ ่Œถ ๅถ๏ผŒ่ฟ˜ ๆ˜ฏ ๅŽŸ ๆฅ ็š„ ๅ‘ณ ้“ใ€‚ไฝ ๆ‡‚็š„ใ€‚xxxไธฝxxx...\n392502 0 thetruthcomestolight็œŸ็›ธ่ฟŸๆ—ฉไผšๅคง็™ฝ\n392503 0 ๅ…‰่ฟ™ไธชNIVEAๅฐฑๅทฒ็ปๆ˜ฏ็ฌฌไธ‰็“ถไบ†\n392504 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ๆ‚จๅฅฝ๏ผโ€œไธ‰ๅ…ซโ€ๆฅไธด๏ผŒๆ–‡่ƒธๅŠ้…ๅฅ—็Ÿญ่ฃค็ง’ๆ€ไปทx.xๆŠ˜๏ผŒๆญคๆดปๅŠจxๆœˆxๆ—ฅ็ป“ๆŸ๏ผๅฟซ่กŒๅŠจ่ตทๆฅ๏ผŒ...\n1000 Processed\n classify content\n393000 0 ๆญฃๅœจไธŠๆตทๅ‚ๅŠ ๅ•†ไธšๆดปๅŠจ็š„็ง‘ๆฏ”ๅฐฑๆ›พ่ตดๆญๅทžๅฏ†ไผš้ฉฌไบ‘\n393001 0 ้™คไบ†ๅฐๅจๅฐผๆ–ฏ50็ฑณๆฐด้“ๅ’Œๅผ€้—ธๆ”พๆฐด่ฟ‡ๆกฅๅค–\n393002 0 ๆป‘็ฟ”ไผž้™่ฝๅคฑ่ดฅ็›ดๆŽฅ่ขซๆฑฝ่ฝฆ็ป™ๆ’žไธŠ\n393003 0 2015ๅ‰ๆž—ๆฐ‘่ˆชๆœบๅœบ้›†ๅ›ขๅ…ฌๅธๆ‹›่˜11ไบบ\n393004 0 ๅญฆไบ†ไธ€ๅฅๆ‰ฌๅทž่ฏ~็Ž‹ๅ…ƒๆˆ‘็ˆฑไฝ 
~ๅ˜›ๅ˜›~ๆฏๅคฉ้ƒฝๅพˆๆƒณไฝ ~็ˆฑไฝ ~\n1000 Processed\n classify content\n393500 1 ่ฎพ่ฎกๅธˆไธ€ๅฏนไธ€ไธŽๆ‚จไบคๆต๏ผ$ๅฝ“ๅคฉๅฎšๆ ทๆฟ้—ดๅฏไบซไปฅไธ‹ไผ˜ๆƒ ๏ผš$x๏ผŒ่ต ้€ๅฎถๅ…ทๅฎถ็”ต๏ผˆๆฒ™ๅ‘๏ผŒ้คๆกŒ๏ผŒ็ƒŸๆœบ๏ผŒ็ถๅ…ท...\n393501 0 ็œŸ็š„ๆ€ฅ้œ€ไธ€ๅผ ๅ—ไบฌbigbangๆผ”ๅ”ฑไผš้—จ็ฅจ\n393502 0 ไฝ†ๆ กๅ›ญๅ†…็š„็‰น่‰ฒๅปบ็ญ‘็พคๅคง้ƒจๅˆ†ๅพ—ไปฅไฟๅญ˜\n393503 0 46ไธชๅธ‚็บง้ƒจ้—จๆƒๅŠ›ไบ‹้กน่ดฃไปปๆธ…ๅ•ๅทฒ้€š่ฟ‡็›ๅŸŽๆœบๆž„็ผ–ๅˆถ็ฝ‘ๅ‘็คพไผšๆญฃๅผๅ…ฌๅธƒ\n393504 0 ๆฑŸ่‹็š„็ด ่ดจ็”ทsๆœŸๅพ…ไธ€ไธชๅฌ่ฏๆ„ฟๆ„้•ฟๆœŸๆ…ขๆ…ขๅ‘ๅฑ•็š„ๆ„ฟๆ„ไธ€ๆญฅไธ€ๆญฅ่ขซ่ฐƒๆ•™็š„mๅ–œๆฌขๅค้ฃŽๆˆ–่€…ๅคๅ…ธๆ–‡ๅญฆ็š„ๅฆนๅญๆœ€ๅฅฝ\n1000 Processed\n classify content\n394000 0 ไนŸๆŠตไธ่ฟ‡2000ไธ‡ไบบ็š„่ฟๆณ•ไนฑ็บช\n394001 0 ไธบไป€ไนˆไฝ ไปฌไธ€่ฏด้ญ…ๆ—mx5ๆˆ‘็ฌฌไธ€ๅๆ˜ ้ƒฝๆ˜ฏ้ฉฌ่‡ช่พพmx5ๅ‘ข\n394002 0 ไธŠๅ‘จkingcountryๅฎฃๅธƒๅง”ๅ‘˜ไผšๆ่ฎฎ่งฃๅ†ณๅฏนไบŽๆŒ็ปญๅขž้•ฟ็š„ๅฐ‘ๅนดๅธๆณ•ๅˆถๅบฆไธญ็งๆ—ๅทฎ่ท้—ฎ้ข˜\n394003 0 ๆ‰ฌๅทžๅธ‚ๅŒบๅŠๆฑŸ้ƒฝๅŒบๅ…ฑๆˆไบคๅ•†ๅ“ๆˆฟ77ๅฅ—\n394004 0 ๅผบๅฅธ็Šฏไธ€ๅฎถไพต็Šฏ็š„ๅฆ‡็ง‘๏ผš1ๅŒป็”Ÿ้™ˆ่ฟฐๅฅๅบทๆญฃๅธธ\n1000 Processed\n classify content\n394500 0 ่€Œๅœจๅ’Œ่ฐ็‰ˆไธญๅ…ฌๅฎ‰่งฃๆ•‘ๅฅณๅคงๅญฆ็”ŸๆˆๅŠŸ\n394501 1 ๆˆ‘ๅค„ๅฏไปฅๅ…่ดนๅŠž็†ๆ‘ฉๆ‰˜่ฝฆ้ฉพ้ฉถ่ฏใ€‚้กปๅŠž่ฏ็š„ไบฒๆœ‹ๅฅฝๅ‹่ฏท่ทŸๆˆ‘่”็ณป๏ผ\n394502 0 ๅฎƒๆ›พ่ฟž็ปญ17ๅนด่‰่”็พŽๅ›ฝไบš้ฉฌ้€Š็•…้”€ไนฆๆŽ’่กŒๆฆœ\n394503 0 ๅคง็ƒญๅคฉ็š„ไธ€ไธชไบบๅŽปๅ—ไบฌๅนฟ็”ต่ท‘ไบ†ไธช่…ฟ\n394504 0 ๅฐบๅฏธ20ใ€24ใ€26ใ€29่Šญๆฏ”็ฒ‰ๆœ€ๆ–ฐ่‰ฒไฟ็œŸๅ‡บๆธธๅญฃๅธฆไธŠๅฟƒ็ˆฑ็š„ไป–ๅบฆ่ฟ‡ๅฎŒ็พŽๅ‡ๆœŸ\n1000 Processed\n classify content\n395000 0 ็ซŸ็„ถๆ•ขๅฏน่‡ชๅŠจ็›‘ๆŽงๆ•ฐๆฎๅผ„่™šไฝœๅ‡\n395001 0 ๅˆšไปŽๆดพๅ‡บๆ‰€ๅ‡บๆฅโ€ฆโ€ฆๅ†ณๅฎš่ฟ˜ๆ˜ฏๅ…ˆไธๆตๆตชไบ†\n395002 0 ๆฅ่‡ชๅ•่ฏzappyๅธŒๆœ›ๆฏไธชๆ‹ฅๆœ‰Z11็š„ๅฅณ็”Ÿ้ƒฝ่ƒฝ่ฎฉไบบๆ„Ÿๅ—ๅˆฐๅฅนๆฐธ่ฟœ็š„ๆดปๆณผใ€็พŽไธฝ\n395003 0 5ๆžถB๏ผ29่ฝฐ็‚ธๆœบ็ป„ๆˆ็š„็ชๅ‡ป้˜Ÿๅฐ†ๅŽŸๅญๅผนโ€œ่ƒ–ๅญโ€ๆŠ•ๅˆฐ้•ฟๅดŽๅธ‚ไธญๅฟƒ\n395004 0 ไธบไฝ•่ฟฝๆ˜Ÿๅชๆ˜ฏ่ฎถๅผ‚ไบŽ็ฒ‰ไธไธŽๅถๅƒไน‹้—ด็š„้‚ฃ็งๅ…ณ็ณปไธ€้ข็œŸๅฟƒไธ€้ขๅ‡ๆ„ๅƒๆžไบ†่ฟ™ไธ–ไธŠๆ‰€ๆœ‰้›พ้‡Œ็œ‹่Šฑๆœ‰่‡ชไปฅไธบๆ˜ฏ...\n1000 Processed\n classify content\n395500 0 
ๅพฎ่ฝฏๅฏนไบŽWindowsUpdateๆจกๅผ็š„่ฐƒๆ•ดๅดๅผ•ๅ‘ไบ†ๅค–็•Œ็š„ไธๆปก\n395501 0 ้ข„่ฎกๅนดๅ†…xxxxๅคšๅฅ—ไฟ้šœๆˆฟๅฏ้™†็ปญไบคไป˜ไฝฟ็”จ\n395502 0 ๅฝ“ไฝ ๆฅๅˆฐๅŒป้™ข็š„ๆ—ถๅ€™ไฝ ๆ‰่ƒฝๆ„Ÿๅ—ๅˆฐ\n395503 0 ็އ30ๅคšๅฎถไผไบ‹ไธšๅ•ไฝใ€130ๅคšไบบใ€ๆบ4000ๅคšไปถๅฑ•ๅ“ๅ‚ๅŠ ๅœจๆทฑๅœณไธพๅŠž็š„็ฌฌไนๅฑŠๆ–‡ๅšไผš\n395504 0 ็งŸๆˆฟไนฐๆˆฟๅฐ้‚“ๅธฎๅฟ™18170265496่ฅฟไธ€่ทฏๅˆ›ไฝณๆˆฟไบงๅœ†ๆ‚จๅฎถ็š„ๆขฆๆƒณ\n1000 Processed\n classify content\n396000 0 ๅ› ไธบ็™พๅบฆ่ฟ™ไธชๆœ็ดขๅผ•ๆ“Ž็ป™ไบ’่”็ฝ‘ๅ„ๅคง็”จๆˆทๅธฆๆฅไบ†ๅพˆๅคง็š„ๆ–นไพฟ\n396001 0 2015ๅนดไธŠๅŠๅนดๅค„็†็”ณ่ฏ‰ไธพๆŠฅ75่ตท\n396002 0 ๆœ€ๅฅฝๅฌ็š„ๅฃฐ้Ÿณๅฑ…็„ถๆฒกๆœ‰ๅฏผๅธˆ่ฝฌ่บซ\n396003 0 ๅฐ‘ๅนด็ฅžๆŽข็‹„ไปๆฐๅฐ็™ฝ่„ธไธ้€‚ๅˆๆผ”ๅ…ƒ่Šณ\n396004 0 ๅ—ไบฌๅคช็พŽไบ†ๆœ‰็ผ˜ๅˆ†็š„ไธคไธชๅ‚ป้€ผๆ€ปๆ˜ฏ่ƒฝๅ†่ง็š„\n1000 Processed\n classify content\n396500 0 ้ƒฝๆ˜ฏๅ› ไธบๅœจไธไบ†่งฃไบ‹ๆƒ…็œŸ็›ธๆ—ถๅฐฑๅฆ„ๅŠ ่ฏ„่ฎบ\n396501 0 ๅฐ็ผ–้š†้‡ๆŽจ่4็งๆ–ฐๆฌพๆ—ฉ้ค๏ผšๆžๅถ่›‹้ฅผ้…ๆตทๅ—้ป„็ฏ็ฌผ่พฃๆค’\n396502 0 ๅœจ็”ต่„‘ๅ‰ๅ‘†ๅ3ไธชๅฐๆ—ถโ€ฆโ€ฆ\n396503 0 ็œ‹้˜ฟๅฆนๅฆนๆฅๅฅฝๅฃฐ้Ÿณๆœ‰ไธ€็ง็Ž‹ๆ€่ช็‹ฌ่‡ชๅŽปๆฝ˜็Ÿณๅฑน็š„sohoๆ‰พๅทฅไฝœ็š„่ตถ่„š\n396504 0 ้‚ฃไธช่…่ดฅ็š„ๅ„ฟๆœๅปทๅทฒ็ปๅฎŒๅ…จๆฒกๆœ‰็”Ÿๅญ˜ไธ‹ๅŽป็š„่ƒฝๅŠ›ไบ†\n1000 Processed\n classify content\n397000 0 ๆ นๆฎ็Žฐๅœบๆƒ…ๅ†ต่ญฆๅฏŸๅ’Œ่ญฆ่ฝฆๆ˜ฏๆœ‰็›ดไปป\n397001 0 keithๅฐckๆ–ฐๆฌพๅฅณๅŒ…ๆฌง็พŽ้ฃŽ่ฝฆ็ผ็บฟ้•ฟๆฌพ้’ฑๅŒ…ๆ‰‹ๆ‹ฟๅŒ…็ฒ‰่‰ฒ้ป‘่‰ฒ็™ฝ่‰ฒๅŽŸไปท269ๆŠ˜ๆ‰ฃ165ๅŒ…้‚ฎ\n397002 0 IMAX่ฆ็š„ๅปบ็ญ‘ๆ˜ฏไธ€ๅบงๆžๅ…ถๅฎ\n397003 0 ไปŠๆ™šไธ‹ๅ…ณ็ช็„ถไธ‹ไบ†ไธ€ๅœบๅพˆๅคงๅพˆๅคง็š„้›จ\n397004 0 Pๅ›พไบ†่ฟ˜ๆ˜ฏๅพˆไธ‘โ€ฆโ€ฆ่ทŸไฝ ไปฌๅคงๅฉŠๅงๅญ™่€€็ฆไธ€ๆ ท\n1000 Processed\n classify content\n397500 0 PhytoTreeไบบๅฐ†้ป„็“œๅšๆˆไบ†ไธ€็งๆŠค่‚คๅ“\n397501 0 ไธ–็•Œ็Ÿฅๅๆ—…่กŒ็ฎฑๅ“็‰ŒRIMOWAๅฑ•็คบไบ†ๆญฃๅœจๅคๆดปไธญ็š„๏ผšไธ–็•ŒไธŠ้ฆ–ๆฌพๅ…จ้‡‘ๅฑžๅฎขๆœบๅฎนๅ…‹ๆ–ฏF\n397502 0 ไฝ ๅ›ฝๆ”ฟๅบœไธไป…ไป…ๆ˜ฏ่ ข็š„้—ฎ้ข˜ไบ†\n397503 0 ๅฎฟ่ฟๆทฎๆตทๆŠ€ๅธˆๅญฆ้™ข้™„่ฟ‘ไธ€ๅค„ๆถตๆดžๆ—ถ\n397504 0 ่ฎพ่ฎกๅธˆ้€š่ฟ‡ๅฏน้“ๆ็š„ๆŒคๅŽ‹ๅ˜ๅฝขๆฅ็‰ข็‰ข็š„ๅ›บๅฎšไฝๅ…ถๅฎƒไธค็งๆๆ–™ๅนถๅฝขๆˆไบ†ไธ€ไธชๆœ‰่ถฃ็š„ๅฐ็ฏ็ฏ็ฝฉ้€ ๅž‹\n1000 Processed\n classify 
content\n398000 0 ็Šฏ็ฝชๅฟƒ็†S4E19ไธญ็š„็ฝช็Šฏๆ˜ฏๆšฎๅ…‰้‡Œ็š„JasperๆˆๅŠŸๆผ”็ปŽไบ†ไธ€ไฝๅŒ้‡ไบบๆ ผ็ฝช็Šฏ่€Œไธ”่ฟ˜ๆ˜ฏๅฅณไบบ็š„็ฌฌไบŒ...\n398001 0 ๅฝ“ๅ‰่กŒๅŠ ๆ–ญ็‚นcom+\\็งปๅŠจ็ผ–่พ‘ๅŒบๆœ€ไธŠๆ–นcom+ไธŠ็งปๅŠจ็ผ–่พ‘ๅŒบๆœ€ไธ‹ๆ–นcom+ไธ‹็งปๅŠจๅ…‰ๆ ‡\n398002 0 ไธไป…ๅ› ไธบๅฎณๆ€•ๆŸฅๆ˜Ž็œŸ็›ธ็š„่‰ฐ้šพๅ›ฐ่‹ฆ\n398003 0 ๅ“ˆๅ“ˆๅฝ“ๆ—ถ่ฟ™็…ง็‰‡่ฟ˜ๆ˜ฏๆˆ‘ๆ‰‹ๆœบๆ‹็š„ๆˆ‘้ƒฝๆฒกไบ†\n398004 0 ๆณฐๅ›ฝ่ญฆๆ–นๆ‹˜ๆ•ไบ†4ๅๅฅณๅญๅ’Œ1ๅ็”ทๅญ\n1000 Processed\n classify content\n398500 0 ๅฏน้žP8็”จๆˆท่€Œ่จ€ๅˆๅฆ‚ไฝ•ๅฎž็Žฐๆตๅ…‰ๅฟซ้—จๅ‘ข\n398501 0 ๅ•†ไธšๅนฟๅ‘Š๏ผšไปทๆ ผ1500โ€”โ€”2500\n398502 0 ็މๅ…ฐ๏ผšๅ†ฐๆธ…็މๆดๅ–œๆฌข็މๅ…ฐ็บฏๆดๅฑžไบŽไฝ \n398503 0 ๅ—ไบฌ่„‘ๅบทไธญๅŒปๅŒป้™ขๅ…จๅ›ฝๅไฝณ้‡็‚น้ข็ฅž็ปไธ“็ง‘ๅŒป้™ข\n398504 0 1985ๅนดๆŠ—ๆ—ฅๆˆ˜ไบ‰่ƒœๅˆฉ40ๅ‘จๅนด\n1000 Processed\n classify content\n399000 0 ๅชๆ˜ฏๆˆ‘ไปฌ็š„ๆฑฝ่ฝฆ่ฎฉๅฐๅทๅท่ตฐไบ†\n399001 0 ๆ‰€ไปฅ่ฎพ่ฎกไบ†่ฟ™ไธชๅˆฉ็”จๆŸณๆก็ผ–ๅˆถๅ‡บๆฅ็š„ๆœ‰็‚นๅƒ่‰บๆœฏๅ“็š„็ง‹ๅƒ\n399002 0 ไธๆ™“ๅพ—ๅฅฝๅฎ‰้€ธไฝ ๆ˜ฏๆƒณ็œ‹้•ฟ็š„ไน–ๅธ…็š„้šไพฟๆ’’ๅญๆฌพๅผ็ป™ๆˆ‘่ฏดๆˆ‘ๅŽป็ป™ไฝ ๆŸฅ็™พๅบฆๆ–—ๆ˜ฏ\n399003 0 ๆœ‰็š„็พŽๅฎน้™ข็œ‹ไบ†ๆˆ‘ๅ’จ่ฏข็š„็…ง็‰‡ๅฐฑ่ฏดไธ‘\n399004 0 ๆฑ‚ไธ€่ตทๅŽป็œ‹exoไบŒๅทกๅ—ไบฌๅœบ็š„ๅฐไผ™ไผด\n1000 Processed\n classify content\n399500 0 ๆ‹ฟๅ›žๆ—งๆ‰‹ๆœบ็ฟปๅˆฐไบ†ไปฅๅ‰ๅญ˜็š„ไธ€ๅคงๅ †ๅŠ ่ฒ็š„ๅ›พ\n399501 0 8ๆœˆ8ๆ—ฅๅ‰ๆ˜†ๆ˜Ž่ฆๅฎŒๆˆ็”ตๆขฏๅฎ‰ๅ…จๅคงๆฃ€ๆŸฅ\n399502 0 ไธคๅฐๆ—ถๅ†…ๆˆๅŠŸๆŸฅๅค„ไธ‰่พ†่ฟๆณ•่ถ…้™่ฝฆ่พ†\n399503 0 ๆฒณๅŒ—ไปŠๅนดไผ˜ๅ…ˆๆ•ดๆฒป้›†้›จๅŒบ้‡็‚นๆ‘ๅบ„็Žฏๅขƒ\n399504 0 ๆญ้…ไธญ่ฏ่ฏๆใ€ไธŠ็ญ‰้…ฑๆฒนๅŠ็‰นๆฎŠ้…ๆ–™\n1000 Processed\n classify content\n400000 1 ๅฐŠๆ•ฌ็š„ๅฎถ้•ฟๆœ‹ๅ‹๏ผŒๆ–ฐๅนดๅฅฝ๏ผŒxxxxๅนดๆ˜ฅๅญฃ็พŽๆœฏ่ฏพ็จ‹xๆœˆxๆ—ฅๅณๅฐ†ๅผ€่ฏพ๏ผŒๆฌข่ฟŽๅธฆๅญฉๅญๅ‰ๆฅๅ’จ่ฏขใ€่ฏ•่ฏพๆˆ–ๆŠฅ...\n400001 0 ๆฏๆฌก็œ‹ๅˆฐ้‚ฃไบ›ๅŽปไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็š„ๅญฆๅ‘˜ๅผ€ๅง‹่ฏด่‡ชๅทฑ็š„ๆ‚ฒๆƒจๆ•…ไบ‹็š„ๆ—ถๅ€™ๆˆ‘็š„ๅฐดๅฐฌๆๆƒง็—‡ๅฐฑๅฟไธไฝ่ฆ็Šฏๅ“ฆ\n400002 0 ๆทฑๅœณๅธ‚ๅŸŽ็ฎกๅฑ€ๅทฒ้ƒจ็ฝฒไปŠๅนด็”Ÿๆดปๅžƒๅœพๅˆ†็ฑปๅ‡้‡ๆŽชๆ–ฝ\n400003 0 4ใ€ๅฐๆฆ‚็އ็š„โ€œ้ป‘ๅคฉ้น…โ€ไบ‹ไปถๅœจ่ต„ๆœฌๅธ‚ๅœบไธŠๅนถไธๅฐ‘่ง\n400004 0 ๆˆ‘่ฎค่ฏ†็š„ๅฎฟ่ฟไบบไธๆ˜ฏ่ฟ™ๆ ท็š„ๅ•Š\n1000 
Processed\n classify content\n400500 1 ๅงๆ‚จๅฅฝ๏ผŒไธ‰ๅ…ซๅนธ็ฆๅฅณไบบ่Š‚๏ผŒxๆœˆxๆ—ฅไธ€xๆœˆxๆ—ฅไนๆฑŸ่”็››xๆฅผ็ปดๆ ผๅจœไธVไธ€GRASS็‰นๆŽจๅ‡บๆ˜ฅ่ฃ…xxx...\n400501 0 ๅ…จๅธ‚ๅ…ฑๆ‹†้™คๅ„็ฑป่ฟๆณ•ๅปบ็ญ‘3377ๆ ‹\n400502 0 ๅˆๆ”ถๅˆฐไบบๅฎถไธ€ไฝๅฆนๅญ้ฃžๆœบ็›’่ฃ…ไนฆ่„Š็ฃจ็™ฝ็…ง็‰‡โ€ฆโ€ฆ่ฎฉๅฆนๅญๆ‰พๅฎขๆœๅŽปไบ†\n400503 0 ๅ…จ่บซๅฟƒๆ‹’็ปๅ—ไบฌๆœบๅœบ็š„wifi\n400504 0 ๅนถๅœจ2000ๅนดๅ‚ไธŽไบ†็™พๅบฆๆ—ฉๆœŸๅˆ›ไธš\n1000 Processed\n classify content\n401000 0 ๅ‘็Žฐ่ฏฅ่ฝฆๆฃ€้ชŒๅˆๆ ผ่‡ณxxxxๅนดxxๆœˆๆœ‰ๆ•ˆใ€ไธ”ๆœ‰xxๆก้ž็Žฐๅœบๆ‰งๆณ•่ฟๆณ•ๆœชๅค„็†่ฎฐๅฝ•\n401001 0 โ€œONENIGHTๅ…ณ็ˆฑ่‡ช้—ญ็—‡ๅ„ฟ็ซฅๆ…ˆๅ–„ๆ™šๅฎดโ€ๅœจๆญไธพ่กŒxๆœˆxxๆ—ฅๆ™š\n401002 0 ๆฃ€ๆŸฅ็ป„่ฎค็œŸๆŸฅ้˜…ไบ†ๅŽฟๅฑ€2015ๅนดไธŠๅŠๅนดๅŠž็†็š„้ƒจๅˆ†ๆถ‰ๆž—ๆกˆไปถๅทๅฎ—\n401003 0 ๅ—ไบฌๆ‰€ๆœ‰็š„ๅคงๅญฆ้ƒฝๅ˜ๆˆไบ†โ€œๆฒณๆตทๅคงๅญฆโ€\n401004 0 ็พŽๅ›ฝไบบไธๆญง่ง†่€ไบบๅœจไธญๅ›ฝ่ฆๆ˜ฏไธ€ไธชๅคง็ˆทๅŽปไธŠๅญฆ\n1000 Processed\n classify content\n401500 0 ๅœจๆต“้›พ่’™ไฝๅปบ็ญ‘็‰ฉ็š„ๆฏไธชๆ—ฉๆ™จ\n401501 0 ๅ…จ็œ50ๆˆทๅฎถๅบญ่Žท้€‰ๆฑŸ่‹โ€œๆœ€็พŽๅฎถๅบญโ€\n401502 0 Sorryใ€ๆˆ‘ไธๆ˜ฏ่ญฆๅฏŸ็š„ๅพฎๅ็‰‡ๅ…ฌๅธ\n401503 0 ๆˆ‘ๆ˜ฏไธๆ˜ฏๆœ‰ๅฟƒ็†็–พ็—…ไธ€่ขซๅ’ฌๅฐฑ็‰นๅˆซๅŽŒๆถๅฅน\n401504 0 ๆฅๅŽฆ้—จไธญๅฑฑๅŒป้™ขๆขๅ–ๅฅนๆ‰€้œ€็š„่ก€ๅž‹\n1000 Processed\n classify content\n402000 0 ็พŽๅ†›ๅ‡บๅŠจxxxๅคšๅๅฃซๅ…ตใ€xx่พ†ๆˆ˜่ฝฆๅŠๆ•ฐๆžถ็›ดๅ‡้ฃžๆœบ็ช็„ถๅฐ†ไผŠๅŒ—้ƒจๅŸŽๅธ‚ๆ‘ฉ่‹ๅฐ”็š„ไธ€ๅบงๅˆซๅข…ๅ›ขๅ›ขๅŒ…ๅ›ด\n402001 0 ๆˆ‘็”จ็™พๅบฆ่ง†้ข‘ๆ‰‹ๆœบ็‰ˆ็œ‹ไบ†โ€œ่’œ้ฆ™็ƒค่Œ„ๅญโ€\n402002 1 ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตๆญฅๆญฅ้ซ˜ๅทๆน˜่œ้ฆ†๏ผŒๆœฌๅบ—ไธบๆท˜็‚น็‚นๆŒ‡ๅฎšๅˆไฝœๅ•†ๅฎถ๏ผŒ็”จๆท˜็‚น็‚น็‚นๅค–ๅ–ๆœ€ๆ˜ฏๅˆ’็ฎ—ใ€‚ๅˆฐๅบ—ๅ ‚้ฃŸๅŒไบซไผ˜/...\n402003 0 ๅ…ฌๅธๅฎ—ๆ—จ๏ผšๅง‹็ปˆๅšๆŒโ€œไธ–็•Œ็บง็š„ๅŒป็–—ๆŠ€ๆœฏ็†ๅบ”ๅฑžไบŽๅ…จไธ–็•Œ็š„ๆ‚ฃ่€…โ€่ฟ™ไธ€็†ๅฟต\n402004 0 ไปปๆ‚จๅฐฝๆƒ…ไบซๅ—2015ๆฌพ็บณๆ™บๆทไผ˜6SUVๆญ่ฝฝ็š„ๅ…จ่ƒฝๅคง่„‘\n1000 Processed\n classify content\n402500 1 ไธบ็ฅ่ดบๅพๅทฅๆŒ–ๆŽ˜ๆœบ่ดตๅทž้”€้‡็ช็ ดxxxxๅฐ๏ผŒๅฎ้“ถๆบๆ‰‹ๅŽ‚ๅฎถไบŽxๆœˆxxๆ—ฅๅœจ่ดต้˜ณไธพๅŠžๅคงๅž‹่ฎข่ดงไผš๏ผŒxx้‡...\n402501 0 ไบบไปฌๅธธ่ฏดwexin๏ผšgjn19940907\n402502 0 ๅŠ 
ๅฟซๅปบ็ญ‘ไธšๆ–ฐๆŠ€ๆœฏๅœจๆ–ฝๅทฅๅทฅ็จ‹ไธญ็š„ๆŽจๅนฟๅบ”็”จ\n402503 0 onedrive็ญ‰็ญ‰ๆœๅŠก่ฟžๆŽฅๆœๅŠกๅ™จไน‹ๆ…ข่ฎฉไบบๅฎžๅœจๅ—ไธไบ†\n402504 0 ่ƒฝไฟๆŠคๅฉดๅ„ฟๅ…ๅ—็ป†่Œๅ’Œ็—…ๆฏ’็š„ไพตๅฎณ\n1000 Processed\n classify content\n403000 0 ๅฐ„้˜ณๅŽฟ็พๅฎณๆ€งๅคฉๆฐ”ๅ†ณ็ญ–ๆœๅŠกๅนณๅฐxxๅนดxๆœˆxxๆ—ฅxxๆ—ถxxๅˆ†ๅ‘ๅธƒ\n403001 0 ๆˆ‘ไปฌๅฐฑ่ƒฝ็Ÿฅ้“ๅฎ้ฉฌๅฏนi็ณปๅˆ—็š„้‡่ง†ไบ†\n403002 0 โ€œๅคง็พŽไธŠ้ฅถ\"ๆ™ฏๅพท้•‡้™ถ็“ท่‰บๆœฏๅฑ•ๅœจไธŠ้ฅถๅธ‚ๅฑ•ๅ‡บ\n403003 0 ๆˆ‘ๅ–œๆฌข็œ‹ๆ—ฅๆœฌไบบๅ†™็š„ๅ•†ไธšไนฆ็ฑ\n403004 1 ๅนฟไธœๆฑ‡็ฅฅ็››ๅฎžไธšๆŠ•่ต„ๆœ‰้™ๅ…ฌๅธๆญ็ฅๅ„ไฝ็พŠๅนดๅ‰็ฅฅ๏ผŒไธ‡ไบ‹ๅฆ‚ๆ„ใ€‚ๆœฌๅ…ฌๅธไธ“ไธš่ฝฆ่ดทๆˆฟ่ดท๏ผŒๆŠผ่ฏไธๆŠผ่ฝฆ๏ผŒไบŒๆ‰‹่ฝฆ...\n1000 Processed\n classify content\n403500 0 cn่ฐทๆญŒAndroidWearๅนณๅฐๆ–ฐไธ€่ฝฎๆ›ดๆ–ฐๅฐ†ไธบๆ™บ่ƒฝๆ‰‹่กจไธŠๆœ€ไผ˜็ง€็š„็”จๆˆท็•Œ้ขๅธฆๆฅ็‚นๅ‡ปๆ‰‹ๅŠฟๅŠŸ่ƒฝ\n403501 0 ๆˆ‘ๆœ€ๅ–œๆฌข็š„ไบ‹ๆƒ…ๅฐฑๆ˜ฏๅ”ฑๆญŒๆˆ‘ๆฏๅคฉ้ƒฝ่ฆๅš็š„ไบ‹ๆƒ…ๅฐฑๆ˜ฏๅ”ฑๆญŒๆˆ‘ๆœ€ๆƒณๅš็š„ไบ‹ๆƒ…ๅฐฑๆ˜ฏๅ”ฑๆญŒๅ”ฑไฝ ๅฆนๅ•Š\n403502 0 ๆœ€่ฟ‘ๅฅฝๅคšๅไผšๅœจๅทๅคง็š„็™พๅบฆ่ดดๅง้‡Œๅ‘ไบ†ๆ‹›ๆ–ฐ่ดด\n403503 0 29้˜ฟ่ฏบๅพทๆ–ฝ็“ฆ่พ›ๆ ผๆ‘ไธŠ้š†ๆ•™SMAP็”ป่ŠฑUPไธป๏ผšshiyo29\n403504 0 ๅนฒ่„†ๅฐฑๆŠŠ็”ต่„‘pad็š„็”ต็”จๅ…‰ๅฅฝไบ†\n1000 Processed\n classify content\n404000 0 ๅธŒๆœ›ไฝ ่€ƒไธŠ็ ”็ฉถ็”Ÿๆ‰พๅˆฐๅฅฝๅทฅไฝœ\n404001 0 ไธŽ้ฆ–ๆฑฝ็งŸ่ตๅ‡ ไนŽไธๅญ˜ๅœจ็ซžไบ‰ๅ…ณ็ณป\n404002 0 ๆ„Ÿ่ฐขๅŒป็”Ÿๆ„Ÿ่ฐขๅธฎๅŠฉๆˆ‘็š„ๆฏไธ€ไฝ\n404003 0 Vertuๆ˜Ÿๅบง็ณป็ปŸๅฎ‰ๅ“ๆ™บ่ƒฝๆ”ฏๆŒๆ‰€ๆœ‰App่ฝฏไปถ\n404004 0 ๆœชๅฐไธๆ˜ฏไธ€็ง้ข†ๆ‚Ÿๅ’Œ่งฃ่„ฑ่ฟ™ไธ€ๅˆปไฝ ๆฅไธด้ƒฝๆ˜ฏไบบ้—ดๆœ€็พŽ\n1000 Processed\n classify content\n404500 0 ไบ’่”็ฝ‘Nๅคšๆ•™็ง‘ไนฆ้ƒฝๅฏไปฅ็›ดๆŽฅๆ‰”ไบ†\n404501 0 ไผผไนŽ่ดจ็–‘ๅˆซไบบๅฏน่‡ชๅทฑ็š„ๅฟ ่ฏšๅทฒ็ปๆˆไธบไบ†ๆœฌๆ€ง\n404502 0 ๆˆ‘ไปฌ็š„ๅช’ไฝ“ๅฆ‚ๆžœๆŠฅๅฏผ็œŸ็›ธๅฐฑไผšไธข้ฅญ็ข—\n404503 0 ่ฟ™ๆฌก่ฎจ่ฎบๆ˜ฏๅœจxxๆœˆxxๆ—ฅๅผ€ๅง‹็š„\n404504 0 9ๆˆ‘็œ‹่ง้‚ฃๆขฆไธญไบบ่ตฐๅ‡บๅฝฉ่‰ฒ็š„ๆˆฟ้—ด\n1000 Processed\n classify content\n405000 0 ็ฟ”ไบ‘ๆดพๅ‡บๆ‰€็š„ๆฐ‘่ญฆๅ’Œ้˜Ÿๅ‘˜ไปฌๅทก้€ปๅˆฐๆญค่ฟ…้€Ÿๆธ…้€š้“่ทฏ\n405001 0 962269ๆˆฟๅœฐไบง็ƒญ็บฟ่ฏดไบ†\n405002 0 
็ป„็ป‡xxๅๆฐดๅˆฉ่Œๅทฅๅญๅฅณๅ…ˆๅŽๅˆฐๅˆ˜ๆนพๆฐดๅŽ‚ใ€่†้ฉฌๆฒณๆฑกๆฐดๅค„็†ๅŽ‚ใ€ๅ—ๆฐดๅŒ—่ฐƒ่งฃๅฐ็ซ™ใ€ๆฝ˜ๅฎ‰ๆน–ๆนฟๅœฐๅ…ฌๅ›ญใ€ๆ–ฐๅŸŽ...\n405003 0 ๅคงๅˆฐๆฑฝ่ฝฆใ€้ซ˜้“ใ€้ฃžๆœบ็ญ‰่กŒไธš\n405004 0 ๆ’ธๅ•ๆœบไธ€็‚น้ƒฝไธๆƒณๅผ€็”ต่„‘ๆˆ‘่ฟ™ๆ˜ฏๆ€Žไนˆไบ†\n1000 Processed\n classify content\n405500 0 ??eventไปฒๆœ‰่‡ช็„ถ็™‚ๆณ•้šๅฅณ้†ซ็”Ÿๅนซๆˆ‘่จบๆฒป\n405501 0 ๅ…ถ่ฎพ่ฎก็ปผๅˆไบ†่ฅฟๅผๆœ่ฃ…ไธŽไธญๅผๆœ่ฃ…็š„็‰น็‚น\n405502 0 ๅธ‚็›ธๅ…ณๆ—…ๆธธๅ…ฌๅธๅŠๅ„็•Œๅช’ไฝ“ๅ˜‰ๅฎพๅ…ฑ200ๅคšไบบๅ‚ๅŠ ไบ†ๅผ€ๅน•ๅผ\n405503 1 ไบฒใ€ๅ…ดไน‰โ€œ้›ช่”ปโ€ไธ“ๅ–ๅบ—ๅœจโ€œไธ‰ๅ…ซโ€่Š‚ไธพๅŠžไธ€ๆฌกๅฅณๆ€งๅ…ณ็ˆฑๆดปๅŠจ๏ผŒxๆœˆxๆ—ฅไธ€xๆœˆxๆ—ฅไธ‰ๅคฉๅ…จๅœบx.xๆŠ˜๏ผŒ...\n405504 0 ๆฏ•็ซŸไธญๅ…ดๅŽไธบไธญๅ…ดๅŽไธบไนˆ\n1000 Processed\n classify content\n406000 0 ๆœ—ๆ–็บข้…’ๅŽŸ็“ถ่ฟ›ๅฃ่ดง็œŸไปทๅฎž\n406001 0 ๆš‚ๅœ5ไธชๅŠ่กฐๆœŸๅŽๅฏๅ–‚ๅฅถโ€ฆโ€ฆ่ฎคไธบๅช่ฆๆœ่ฏๅฐฑไธ่ƒฝๅ“บไนณ\n406002 0 ๅญฆsapๅญฆ็š„ๆˆ‘่„‘ๅญ้ƒฝๅ‡บ่ก€ไบ†\n406003 0 ไธจ้นค้š็ŽบๅŠจไธจ0710่ฎจ่ฎบโ—†ๅƒ็บธ้นคๅฏไธๆ˜ฏ็บธ็ณŠ็š„\n406004 0 ๆœๅ‘ๅคฉ็ฉบๆ˜ฏ้ฃžๆœบๆ’žๅ‡ปๅ‰็š„่ˆช็บฟ\n1000 Processed\n classify content\n406500 0 โ€œไธ‡้ƒ้ค้ฅฎ7ๆœˆ่กจๅฝฐๆšจ8ๆœˆๅฏๅŠจๅคงไผšโ€ๅœจๆ–‡ๆ™ฏๅ•†ๅŠกๅคงๅŽฆ29ๅฑ‚ไผš่ฎฎๅŽ…้š†้‡ไธพ่กŒ\n406501 0 ่งๅŽ่กจ็คบNBA่‡ช็”ฑ็ƒๅ‘˜ๅธ‚ๅœบไน‹ๅˆ็š„็ผ“ๅ†ฒๆœŸไปฅๅŠโ€œ็ ้ฒจโ€็ญ‰่ง„ๅˆ™้ข„่ฎกไธไผšๅ‘็”Ÿๅ˜ๅŒ–\n406502 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๅ…ฌๅธๅฏไปฃๅผ€ไผ˜ๆƒ ๅ‘็ฅจ๏ผˆ้ชŒๅŽไป˜ๆฌพ๏ผ‰๏ผŒๅฏไปฃๅผ€๏ผšๆŠตๆ‰ฃๅขžๅ€ผ็จŽๅ‘็ฅจ๏ผŒๅœฐ็จŽๅ‘็ฅจ๏ผŒๅนฟๅ‘Š่ดน๏ผŒๅปบ็ญ‘ๅทฅ็จ‹๏ผŒ...\n406503 0 ๅณฐๅณฐๅŠ ๆฒนๅณฐๅณฐๅŠ ๆฒน๏ผ‹ๆญคๅ›พ็‰‡็”ฑๆœฌไบบๅˆถไฝœ\n406504 0 ไฝ ็œ‹ๆˆ‘่€ƒไธŠ็ ”็ฉถ็”Ÿไปฟไฝ›ๅพˆ่ฝปๆพ\n1000 Processed\n classify content\n407000 0 ๅœจ่ตฐ่ฎฟ้ƒจๅˆ†ๅ‰ๅˆฉๆฑฝ่ฝฆ4Sๅบ—ๅŽ่Žทๆ‚‰๏ผšๆ–ฐๆฌพGC7ๆœ‰ๆœ›ๅœจๆ˜ŽๅนดๆŽจๅ‡บ\n407001 0 ไปŠๅคฉไธ‹็ญๅ›žๅฎถๅ่ฝฆๅŽป่ดขๅฏŒ็ป™ๆปดๆปดไนฐ็ฏฎ็ƒ้ž‹\n407002 0 ๆ™‹ไธญๆ—ฅๆŠฅ๏ผšๅทฆๆƒๅŽฟๅผบๅŒ–็”ตๆขฏๅฎ‰ๅ…จ็›‘็ฎก\n407003 0 ๆŒ‰ๅŒป้™ข็š„่ง„ๅฎš้ข„็บฆๆŽ’ๅท็œ‹็—…ๆ˜ฏๆฅไธๅŠไบ†\n407004 0 2015ๅนด8ๆœˆ2ๆ—ฅๆœ‰ไธ€ไฝๅ…ญๅๅฒ่€ๅคชๅคชไปŽไธŠๆตทๅ‡บๅ‘ๅŽปๅŸบ่พ…\n1000 Processed\n classify content\n407500 0 ๅฏๆœ€ๅŽๆˆ‘่ฟ˜ๆ˜ฏ้‚ฃไธชๆ€•้บป็ƒฆ็š„ๅญฉๅญ\n407501 0 
ๅฎžๆ—ถๆ’ญๆŠฅ๏ผšไธœๆ–น่ฏๅˆธ่งฃ่ฏป็ฌฌไธ‰ๆ–นๆ”ฏไป˜่ง„ๅฎš\n407502 0 ๆŠข็บขๅŒ…ๆธธๆˆๅฐ่ฏดๆŠฝ็ƒŸsex\n407503 0 ๅˆ็กใ€้•ฟ้€”่ฝฆใ€็ซ่ฝฆใ€้ฃžๆœบใ€ๆ—…ๆธธใ€ๅ‡บๅทฎ็š„ๅฟ…ๅค‡ๅ•ๅ“\n407504 0 4ใ€ๆฅผๅธ‚ๆ‹็‚นๆ˜พ็Žฐ่ฟŽ\"ไธ‰้€Ÿ\"ๅ›žๆš–ไธ‹ๅŠๅนดๅธ‚ๅœบๆˆ–ๅ‘ไธŠ\n1000 Processed\n classify content\n408000 0 ไปฃๅ‘5ๅน…่ตทๅŒ…้‚ฎๅ†ฐ่ข–่ฟ›ๅ…ฅ็–ฏ็‹‚่ฎขๅ•ๆจกๅผไปฃๅ‘\n408001 0 ๆต™ๆฑŸไธ‡้‡Œๅญฆ้™ข็š„็ฉบ่ฐƒไฝฟ็”จ่ดนไธ€ๅนดxxxๅ…ƒ\n408002 0 ๅพๅทž1000mlๆด‹้…’็“ถๆฑŸ่‹ๅพๅทžๅฎๅŽ็Žป็’ƒ็ง‘ๆŠ€ๆœ‰้™ๅ…ฌๅธๆด‹้…’็“ถใ€ๅˆ†้…’ๅ™จ\n408003 0 ไฟๅ…ป็šฎ่‚ค็ป†่ƒž็ป„็ป‡ๆœ‰ๆžๅ…ถ็ฅžๅฅ‡็š„ๅŠŸๆ•ˆ๏ผš่กฅๅ……็šฎ่‚ค็ป†่ƒž็š„่ฅๅ…ป\n408004 0 2000ๅนด9ๆœˆ็ปๅธ‚ๆ”ฟๅบœไผš่ฎฎ็ ”็ฉถๅŒๆ„\n1000 Processed\n classify content\n408500 0 SuitSuit่ฎพ่ฎก็š„ๆ—…่กŒ็ฎฑๅฐ†่ฎฉไฝ ่„ฑ้ข–ไบŽไบบ็พคไน‹ไธญ\n408501 0 ๆˆ‘็š„ไธไฝœไธบๆฐๆฐๆˆๅ…จไบ†ๅ‘†ๆฟไธฅ่‹›็š„ไฝ“ๅˆถ\n408502 0 ๆตท้€š่ฏๅˆธH่‚กๅค็‰Œไธ€ๅบฆ่ทŒ17%้ญๅŸบ้‡‘ๆŠ›ๅ”ฎ\n408503 0 ๆปฅ็”จไฝ•้ฆ–ไนŒ็พŽๅ‘ๆˆ–ๅฏๅฏผ่‡ด่‚ๆŸๅฎณๅ’Œๆ€ฅๆ€งๅ‘็ƒญ\n408504 0 ๆœ‰ไบ›ไบบ้ ่ฟ™็ง็Œซ่…ป็š„ๅ…ณ็ณปๅพ—ไบ†ๅŠฟ\n1000 Processed\n classify content\n409000 0 โ€่ญฆๅฏŸ๏ผšโ€œ่ฟ™ๆ˜ฏๅฏนๅ†ฒๅŠจๆœ€ๅฅฝ็š„ๆƒฉ็ฝš\n409001 0 ๅก”่บซ็”ฑ5043ๅ—็ƒญๅๅฐ„็Žป็’ƒ้“บๆˆ\n409002 0 ๆปก่„‘ๅญ้ƒฝๆ˜ฏโ€œ้˜ฟ้‡Œ้‡Œ้˜ฟ้‡Œ้‡Œโ€้ญ”ๆ€งไบ†\n409003 0 ๆ‰‹ๆœบๆœ‰็”ตๆœ‰WiFiๆœ‰็ฉบ่ฐƒๆœ‰้›ถ้ฃŸๆ—ถ้—ดๅทฒ่ฟ‡ๅŽปไธ‰ๅˆ†ไน‹ไธ€\n409004 0 ้™•ๅŒ—ๅœฐๅŒบๆ‰ง่กŒๆ—ถ้—ดไธบ6ๆœˆ15ๆ—ฅ่‡ณ8ๆœˆ15ๆ—ฅ\n1000 Processed\n classify content\n409500 0 ่ฟฝๅŠ ไบ†ๅ‡ ไธชๆˆๅฐฑ~ๅฆๅค–่ฏ•่ฏ•ไน‹ๅ‰่งฃ้”ไธไบ†็š„้‚ฃไธชๆฏๆ—ฅๆŒ‘ๆˆ˜ๆˆๅฐฑไฟฎๅคไบ†ๆฒก\n409501 0 ๅถๅฐ”ๆฅๅ‡ ๆžถ้ฃžๆœบ็š„้ฃž่กŒๅ™ช้Ÿณไธ็Ÿฅ้“ๆ˜ฏ็ฉบๅ†›่ฟ˜ๆ˜ฏไน‰ไนŒๆœบๅœบ็š„้ฃžๆœบ\n409502 0 ้ฆ™็ƒŸๅƒ็š„ๅ–็š„็”ต่„‘็ญ‰ไธ€ๅบ”ไฟฑๅ…จ\n409503 0 3ใ€ๆˆ’ๅธธๅผ€ๅคœ่ฝฆโ€”โ€”ไน…ๅผ€ๅคœ่ฝฆไฝฟ็”Ÿ้•ฟๆฟ€็ด ๅ’Œ่‚พไธŠ่…บ็šฎ่ดจๆฟ€็ด ๅˆ†ๆณŒ็ดŠไนฑ\n409504 0 ่ง่ฏไบ†้•‡ๆฑŸ79ๅฒ็š„ๆจๆญฃ้พ™่€ไบบไธŽ่€ไผด55ๅนด็š„็œŸๆƒ…็›ธๅฎˆ\n1000 Processed\n classify content\n410000 0 ่ง‰ๅพ—ๆฏไธ€ๆžถ้ฃž่ฟ‡็š„้ฃžๆœบๆธ…ๆ™ฐๅˆฐ็”š่‡ณ่ƒฝ็œ‹ๅˆฐๆœบๅฐพ็š„่ˆช็ฉบๅ…ฌๅธ็š„ๅๅญ—\n410001 0 avastๆ˜จๆ™šๆŠŠๆˆ‘็”ต่„‘้‡ๅฏๅŽๅฑ…็„ถ่‡ชๅทฑๅคๆดปไบ†\n410002 0 
ๆ‰€ไปฅ่…พ่ฎฏไผšๅนๅพˆๅคš็š„้ฃ“้ฃŽๅธฎๅŠฉๆŒ‚ๅท็ฝ‘ไธŠๅŽป\n410003 0 ๅ› ไธบ่ƒš่ƒŽ็ป†่ƒž็š„็”Ÿ็‰ฉๅˆๆˆๅพˆๆดป่ทƒ\n410004 0 ๆœ‰ไบบ่ฏดๆ—…ๆธธๆ˜ฏไปŽ่‡ชๅทฑๅ‘†่…ปไบ†็š„ๅœฐๆ–นๅˆฐๅˆซไบบๅ‘†่…ปไบ†็š„ๅœฐๆ–น\n1000 Processed\n classify content\n410500 0 ็พŽๅ›ฝไธบ้ฟๅ…ๆฝœ่‰‡ๆŠ€ๆœฏ่ฝๅ…ฅ่‹่”\n410501 0 ๆœบๅ™จไบบๆฏ”่ต›่Œ็ฟปไบ†ๅธ‚ๆฐ‘ๆœŸๅพ…ๆœๅŠกๅŠๆ•‘ๆดๆœบๅ™จไบบๅœจ็”Ÿๆดปไธญๅบ”็”จ\n410502 0 ๅœจ็ซ็ฎญ้˜Ÿ่ก€ๆด—ๆน–ไบบ็š„ไธ€ๅœบๆฏ”่ต›ไธญ\n410503 1 ๅ›ๆตฉๆฑฝ่ฝฆๆ–ฐๆ˜ฅ็‰นๆƒ ๏ผŒๅˆฐๅบ—็ซ‹ๅ‡xxxๅ…ƒ็Žฐ้‡‘๏ผŒไปทๆ ผไผ˜ๆƒ ๅ“่ดจไฟ่ฏ๏ผŒ้ข„็บฆๅฝญ็šŽ่‹นๆˆ–ๅŠ ๅพฎไฟกxxxxxxxx...\n410504 1 xxxxxxxxxxxxxxxxxxx้‚ฎๆ”ฟๅดๅฐๅจฃใ€‚ๆ‰“ไบ†ๅ‘ไธชไฟกๆฏ่ฏดๅ“ˆ\n1000 Processed\n classify content\n411000 0 ็ฉบ่ฐƒ่ฅฟ็“œ็”ต่„‘็œŸๆ˜ฏๆƒณไธๅ‡บ่ฟ˜ๆœ‰ๆฏ”่ฟ™ๆ›ด็ˆฝ็š„ๅ‘จๆœซ้…็ฝฎ\n411001 1 ๅ‡ก่ดญไนฐๆฌง่Žฑ้›…xxxmlๆด—ๆŠคไบงๅ“้€ๆฌง่Žฑ้›…ๆ—…่กŒ่ฃ…xxmlๆˆ–xmlๆด—ๆŠคไบงๅ“้™้€‰ๅ…ถไธ€๏ผŒ้€ๅฎŒไธบๆญข ๅ‡ก...\n411002 0 ไป–่ฟ‘ๆ—ฅ่ขซๆ‹ๅˆฐๅ’ŒAngealababyๅœจ็‰‡ๅ ด่Šๅคฉ\n411003 0 ็ŽฐๅœจๆญฃๅœจไธŠๆตทๆ–ฐๅ›ฝ้™…ๅฑ•่งˆไธญๅฟƒไธพๅŠž\n411004 0 3ๅท็บฟใ€6ๅท็บฟใ€7ๅท็บฟใ€8ๅท็บฟใ€ๆœบๅœบ็บฟ็ญ‰9ๆกๅœฐ้“็บฟ่ทฏๅŒๆ—ถๅœจๅปบ\n1000 Processed\n classify content\n411500 0 ๅœŸ่ฑ†ๆ‹ๅฎข๏ผšๆƒ…ไบบ่Š‚ๅพๅทžๅธ‚ๆทฎๆตท้ฃŸๅ“ๅŸŽๅ‘็”Ÿๅคง้ข็งฏ็ซ็พ\n411501 1 ๅŽๅซไธญไนๅŒๆจกๆœบ๏ผŒๅคฎ่ง†xxxxๅŠ ๅ‡คๅ‡ฐๅซ่ง†ๅ…ฑxxๅฅ—่Š‚็›ฎ๏ผŒๆ‰นๅ‘xxxๅ…ƒใ€‚xxๅฐ้€xๅฐ๏ผŒไธๆ’ๅกไธๅฎšไฝ...\n411502 0 ่ขซ่ฟ™ไฝ่ญฆๅฏŸ่œ€้ป็ฝšไบ†xxxๅ—้’ฑ\n411503 0 ๆƒณ็€ไธ€ไธชไบบไธ€ๆ™šไธŠๅ€’3่ถŸ้ฃžๆœบ\n411504 0 ไปŠๅนด็š„็จป็”ฐ็”ปไปŽxๆœˆๅผ€ๅง‹่ฟ›่กŒ็งๆค\n1000 Processed\n classify content\n412000 0 ๆ‰ไธฐๅฏŒไบ†็‘Ÿ็ผฉ?ๆŠฝ่ฑกๅ‡ ไฝ•ๆŒ‚็”ป\n412001 1 ๅ‡กๅฝ“ๅคฉๅˆฐๅบ—่ฃ…ไฟฎ็š„ไธšไธป๏ผŒๅ‡ๅฏ้ข†ๅ–้ฃŸ็”จๆฒนไธ€ๆกถๅฆๅฏไบซๅ—ๅˆๅŒ้ข็š„x%็š„่ฃ…ไฟฎ่กฅ่ดด๏ผๅนถ่ต ้€xxxxๅ…ƒ่‹ๅฎ...\n412002 0 ๅฐ‘ๅฅณไปŽๅคฉ่€Œ้™่ฟ™้ƒจไฝœๅ“็š„้Ÿณไน็”ฑไน…็Ÿณ่ฎฉ่ดŸ่ดฃ\n412003 0 ่ฟ™ไบ›ไบบ้ƒฝๆ˜ฏๆต™ๆฑŸๅฎๆณขไธ€็พค้ช—ๅญๅ›ขไผ™\n412004 0 ๆ‰“ๅผ€็”ต่„‘ไธ€็œ‹็ซŸๆ˜ฏ่ฟ™่ˆฌๅ‡„ๆƒจ็š„ๆ™ฏ่ฑก\n1000 Processed\n classify content\n412500 0 ๆ‰ฌๅทž่ญฆๆ–นๆญฃๅผๅฏนๅค–ๅ‘ๅธƒๆถˆๆฏ็งฐ\n412501 0 
ๆˆ‘ๅœจๅณๆ—ถPK่ต›ไธญ็ปˆไบŽๆˆ˜่ƒœไบ†้‡‘ๅทž็ƒ้˜Ÿ\n412502 0 ๅ‘็Žฐ็”ต่„‘้‡Œๅฑ…็„ถๆœ‰่ฟ™ๅผ โ€ฆๆ„Ÿ่ง‰ไธ€ไธ‹ๅญๅ›žๅˆฐไบ†ๅˆไธญโ€ฆ็ช็„ถๆƒณๅ”ฏไธ€ไธ€ไธชๆ’•่ฟ‡็š„้—บ่œœไบ†โ€ฆ็Žฐๅœจๆƒณๆƒณๆ’•่ตทๆฅ็š„็†็”ฑ็œŸๅ‚ป\n412503 0 8ๆœˆ7ๆ—ฅ่‚กๆŒ‡ๆœŸ่ดงๆ“ไฝœ็ญ–็•ฅ\n412504 0 ๆˆ‘ๅŒบxๅฎถๅ…ฌๅธ็š„xไธชไบงๅ“ๆฆœไธŠๆœ‰ๅ\n1000 Processed\n classify content\n413000 0 ๅค–็ฑไบฌๅ‰งๆฏ”ๅŸบๅฐผๅŽฆ้—จๆผณๅทžๆณ‰ๅทž็Ÿณ็‹ฎๆ™‹ๆฑŸๆต™ๆฑŸๆธฉๅทžๆญๅทž้•ฟๆฒ™ๅ—ไบฌๅŒ—ไบฌๆญฆๆฑ‰ๆˆ้ƒฝๅค–็ฑๆจก็‰น่ˆž่นˆไน้˜Ÿ18965...\n413001 0 ๅŽไธบ็ปๅธธๆŠฑๆ€จ่‡ชๅทฑๅœจ็พŽๅ›ฝ้ญๅˆฐไธๅ…ฌๅนณ็š„ๅฆ–้ญ”ๅŒ–โ€”โ€”ๅŽไธบ่ฟ„ไปŠๆœช่ƒฝไปŽ็พŽๅ›ฝไธป่ฆ็ฝ‘็ปœ่ฟ่ฅๅ•†่ตขๅพ—ไธ€ๅ•็ฝ‘็ปœ่ฎพๅค‡ๅˆ็บฆๅ‘ข\n413002 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๅฐๅˆš๏ผŒๆฒงๅทžๅŒ—็Žฏ้›ชไฝ›ๅ…ฐxSๅบ—็š„๏ผๅ’ฑไปฌๅ…ซๅทๅŽ‚ๅฎถๆœ‰ๆ‰น็‰นไปท่ฝฆ๏ผŒไผ˜ๆƒ ๆฏ”่พƒๅคง๏ผŒ่€Œไธ”่ฟ›ๅบ—ๅฐฑๆœ‰็ฒพ็พŽ...\n413003 0 ๅฃ่…”ๅ†…ๅพฎ็”Ÿ็‰ฉ่…่ดฅๆถˆๅŒ–ๅฃ่…”ๆปž็•™็‰ฉ่ดจไบง็”ŸๆŒฅๅ‘ๆ€ง็กซๅŒ–็‰ฉ็ญ‰ๅผ‚ๅ‘ณ็‰ฉ่ดจๆ˜ฏๅฏผ่‡ดๅฃ่‡ญ็š„ไธป่ฆๆˆๅˆ†\n413004 0 ๆœ‰ไธชๆœ‹ๅ‹ๆ˜ฏๆธ…ๅŽ็š„ๆณ•ๅญฆ็ ”็ฉถ็”Ÿๅ“ฆ\n1000 Processed\n classify content\n413500 0 ๆˆ‘ๆŒบ้šพ่ฟ‡็š„ๆˆ‘็™พๅบฆไบ†ๆญฃ็กฎ็š„ๆด—่„ธๅ›พ็‰‡็ป™ๅคงๅฎถๅ‚่€ƒไธ‹\n413501 0 ๅ‡่ƒฝๆๅ‰24ๅฐๆ—ถๅœจๆ— ้”กๅŠž็†ๅ€ผๆœบๆ‰‹็ปญ\n413502 0 ้“พๆŽฅๆ”พๅœจ่ฏ„่ฎบ้‡Œ้ขๅ•ฆ~่ฏทๅซๆˆ‘ๆดป้›ท้”‹~ไปฅๅŽๅฐฑไธ่ฆๅ†้—ฎๆˆ‘ๅ•ฆ\n413503 0 ๅฆ‚ๆžœ่‚ก็ฅจ็š„่ตฐๅŠฟ่ทŸไฝ ็š„ๅคดๅฏธ็›ธๅ\n413504 0 ่ดๅก”ๅทฒ็ปๆ•ดไบ”ไธชๆœˆไบ†\n1000 Processed\n classify content\n414000 1 ไฟก็”จ็คพ๏ผŒxxxx๏ผŒxxxx๏ผŒxxxx๏ผŒxxxx๏ผŒxxxx๏ผŒxxๅ•†ๆŸๆ˜†\n414001 0 ็™พๅง“ๅคง่ˆžๅฐxxxxๆถˆๅคๆˆๆ›ฒๆ–‡่‰บๆ™šไผšๅœจๆ–ฐๆณฐๆปจๆน–ๅนฟๅœบๆ‹‰ๅผ€ๅบๅน•\n414002 0 ่ญฆๅฏŸๅ”ๅ”ๅฐฑไผšๆŠŠไฝ ๅฎถ้นฟๆ™—ๆ‰พๆฅ\n414003 0 ๅฐฑๅฅฝๅƒๅ‡บๅŽป็Žฉ่ขซๅฐๅทๅทไบ†ๅไบบๆ‰“ไบ†ๆŠฅ่ญฆไธ็ฎก็„ถๅŽ่ญฆๅฏŸ้—ฎๆˆ‘ไปฌๅ—จไธๅ—จไธ€ๆ ท\n414004 0 ๅˆฐไธ€้™ขๅŽ็ปct่ฏŠๆ–ญไธบ่››็ฝ‘่†œไธ‹่…”ๅ‡บ่ก€\n1000 Processed\n classify content\n414500 0 ๆ‰€ไปฅๅ–œๆฌข่กฃ่กฃ็š„?ๆˆ‘ๅซๆ˜Ÿchangliang8899\n414501 0 ๅœจ1860ๆ–‡ๅŒ–ๅˆ›ๆ„ๅ›ญไธบๅฎถไนกไบบๆฐ‘็ŒฎไธŠๅŒ้’ข็ด็พŽๅฆ™ไน้Ÿตโ€”โ€”โ€œ็ด็ณปๆ•…้‡Œโ€”ไธญๆณ•้’ข็ด้Ÿณไนไผšโ€\n414502 0 
่‹ๅทžๅคงๅญฆๆ–‡ๅญฆ้™ข้‡่ตฐๅดๆฑŸโ€œ่ˆŒๅฐ–โ€่ทฏ็š„ๅฎž่ทตๅ›ข้˜ŸๆฅๅˆฐๆฑŸๅ—ๅค้•‡้œ‡ๆณฝ\n414503 0 ๆƒณๆ‰พๆœ‹ๅ‹ไปฌๅ€Ÿไธช่…พ่ฎฏ่ง†้ข‘็š„ๅฅฝ่Žฑๅžไผšๅ‘˜่ดฆๅท็”จ\n414504 0 ๆฅผไธŠๆฅผไธ‹ๅŒๆ—ถ่ฃ…ไฟฎๆ˜ฏไธ€็งๆ€Žๆ ท็š„ไฝ“ไผš\n1000 Processed\n classify content\n415000 0 ไปŠๅŽx~xๅนดๆœ€็ปˆๆดปไธ‹ๆฅ็š„ๆ˜ฏๆžๅฐ‘ๆ•ฐ็š„ๅŽ‚ๅ•†\n415001 0 ๆฏ•็ซŸๆฒก่ฏšๆ„็š„ๆ˜ฏๆˆ‘ๆฏ•็ซŸๆˆ‘่ฟ˜่ฎฐๅพ—ๅฝ“ๅนดJS่ฏด่ฟ‡ๅฐฑ็ฎ—ๅฅนๅŽปไบ†ๅˆซ็š„ๅŸŽๅธ‚ๆˆ‘ไนŸๅฏไปฅๅ้ฃžๆœบๅธธ่ฟ‡ๅŽป็œ‹ๅฅนๆˆ‘ๅฐฑๅšไธๅˆฐ...\n415002 0 ๅœจ้ฉฌๅ‹’ๅˆซๅข…ไธพๅŠžๅฉš็คผๅฎšไผšๅƒ่ฟ™ๅบงๅปบ็ญ‘ๆœฌ่บซไธ€ๆ ท็ซฅ่ฏ\n415003 0 ๆˆณๅ›พไบ†่งฃๆ›ดๅคš็Ÿฅ่ฏ†โ†“โ†“โ†“viaไบบๆฐ‘ๆ—ฅๆŠฅ\n415004 0 ็Žฉๅ…ทๅƒ็š„ๅ–็š„ใ€้˜ฒๆ™’ใ€ๅบŠๅ•ใ€่‰ๅธญใ€ๅฎ่ด็š„่ขซๅญ\n1000 Processed\n classify content\n415500 0 ไธญ็ฒฎ้›†ๅ›ข็š„็›ธๅ…ณไธŠๅธ‚ๅ…ฌๅธๅฆ‚ไธญ็ฒฎๅฑฏๆฒณ\n415501 0 ๆˆ‘ไปฌไฟฉๆ—ฉไธŠไธ€่ตทๅƒ้ฅญ็„ถๅŽๅŽปไธŠ็ญไธ‹็ญๅ›žๆฅๅƒไธช็พŽ็พŽ็š„ๆ™š้ค็„ถๅŽ็ชๅœจไธ€่ตท็œ‹็”ต่ง†็Žฉ็”ต่„‘ๅ•ฆ\n415502 0 ๆญปๅˆ‘ๅ’Œไธ€่พˆๅญ็ป™ไป–ๅฎถๅš็‰›ๅš้ฉฌ\n415503 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ n4xen3ไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n415504 0 3ใ€ๆœบๅˆถ้™ˆๆ—งไธ่ƒฝๆฟ€ๅŠฑไบบๅฟƒ\n1000 Processed\n classify content\n416000 0 ๅ‘จ่พน่ขซไผŠไธœไธฐ้›„่ฎพ่ฎก็š„Lๅž‹Todๆฅผๆ‰€ๅŒ…ๅ›ด\n416001 0 ๆœฌๆกˆไพๆณ•้€‚็”จ็ฎ€ๆ˜“็จ‹ๅบ่ฟ›่กŒๅฎก็†\n416002 0 ๅธธๅทžๅธ‚้‡‘ๅ›ๅŒบๆฐ”่ฑกๅฐ7ๆœˆ27ๆ—ฅ16ๆ—ถๅ‘ๅธƒ็š„้ซ˜ๆธฉ่ญฆๆŠฅๅ’Œๅคฉๆฐ”้ข„ๆŠฅ๏ผšไปŠๅคฉๅคœ้‡Œๅˆฐๆ˜Žๅคฉๅคšไบ‘ๅˆฐๆ™ด\n416003 1 ๆ‚จๅฅฝ๏ผŒๆทฑๅœณๅธ‚ๆก‚่Šฑ่ƒถ็ฎกไบ”้‡‘ๅ•†่กŒๆฌข่ฟŽๆ‚จ็š„ๆฅ็”ต๏ผŒๆˆ‘ๅ•†่กŒ็ซญ่ฏšไธบๆ‚จๆไพ›๏ผš่”ๅก‘ใ€ๆทฑๅก‘่ƒถ็ฎกๅŠๅ„็ฑปๅ“็‰Œ่ƒถ็ฎก๏ผŒ...\n416004 0 ไธ่ฟ‡ๅพˆๅคšไบบ็œ‹่ตทๆฅๆƒณ้ ่ญฆๅฏŸๅ”ๅ”ๅฑ ๆ‘่งฃๅ†ณไธ€ๅˆ‡\n1000 Processed\n classify content\n416500 0 ๆ˜Žๆ—ฉๅฐฑๅฏไปฅ็œ‹่Šฑๅƒ้ชจๅ•ฆๆˆ‘่ฆๅŽป่กฅไปŠๆ™š็š„ๅฅ‡่‘ฉ่ฏดๅ•ฆ้‡‘ๆ˜Ÿๆˆ‘็ˆฑไฝ ไนˆไนˆๅ“’\n416501 0 ๅˆฐๆ•ดๅฝขๅŒป้™ข่ฆๆฑ‚ๅŒป็”Ÿไธบๅ…ถๅšๆฌงๅผๅŒ็œผ็šฎ\n416502 0 ๅฎŒ็พŽ่€Œ็ซ‹ไฝ“ๆ„Ÿ่ง‰็š„ๆ‹ผๆŽฅ่ฎพ่ฎก้ฃŽๆ ผ\n416503 0 xๆœˆxxๅทไธพ่กŒๆดปๅŠจ็‹‚้€ๅ››ๆ ธMX\n416504 0 ๆˆ‘ไปฌๆฌข่ฟŽๅคงๅฐๆœ‹ๅ‹ๅ‚ๅŠ ~ไธ่ฆๅ†็Šน่ฑซ\n1000 Processed\n classify content\n417000 1 
ๅซŒๅŽปๆพณ้–€ๅคช้บป็ƒฆ๏ผๅฐฑๆฅxxxxxx๏ผŒcโ—‹m ๆ„ๅค–็š„ๆƒŠๅ–œ็ญ‰็€ไฝ ใ€‚ๆ–ฐๅนด้€ๅฅฝ่ฟ๏ผŒๆฅไบ†ๅฐฑๅฏๅพ—xxใ€‚ๆปกไบ†...\n417001 0 ๅพฎ่ฝฏไปŠๅคฉๆญฃๅผๅฎฃๅธƒไผšๅœจๆœชๆฅๅ‡ ไธชๆœˆ้€ๆญฅ่ฃๅ‡x\n417002 0 ๆœบๅ™จไบบๅฏๅœจๆŠค็†่ฎพๆ–ฝๅ†…ๆ นๆฎๅ…ฅไฝ่€…็š„ๆ•ฐๆฎไฝœๅ‡บโ€œ่ฆๆ˜ฏ่บซไธŠๅ“ช้‡Œ็–ผ่ฏทๅ‘Š็Ÿฅโ€ใ€โ€œๅˆฐ้‡่ก€ๅŽ‹็š„ๆ—ถ้—ดไบ†โ€็ญ‰ๆ้†’\n417003 1 ๅ“ฅๅ“ฅไฝ ๅฅฝ๏ผŒๆˆ‘ๅซๅฐ็Žฒ๏ผŒๅ››ๅท็š„๏ผŒxxไบ†๏ผŒ็šฎ่‚ค็™ฝ๏ผŒๅ› ไธบๅฎถ้‡Œๅ›ฐ้šพ่ทŸ็€ๆœ‹ๅ‹ๅ‡บๆฅๆ‰“ๅทฅ๏ผŒ่ฟ˜ๆฒกๆœ‰ๅค„่ฟ‡ๆœ‹ๅ‹๏ผŒๅฌ...\n417004 0 ๅพฎ่ฝฏ้ญ‚ๆทกๅ•Šโ€ฆไธ€่พนๅ ็€ๅธฆๅฎฝไธ€่พนไธไธ‹ไธœ่ฅฟ\n1000 Processed\n classify content\n417500 0 ่‚Œ่‚ค็š„็ป†่ƒžไธไผšๅƒๅ…ถไป–ๅญฃ่Š‚้‚ฃ่ˆฌๆดป่ทƒ\n417501 1 ๆ‚จๅฅฝ๏ผšๆˆ‘ๆ˜ฏๅ”ๅฑฑๆœ—ๅจ ๅผ ๅปบ็‚Ž ๅ”ๅฑฑๆœ—ๅจ่ฟ›ๅ‡บๅฃ่ดธๆ˜“ๆœ‰้™ๅ…ฌๅธ ไธป่ฅ๏ผ›ๆดฅ่ฅฟHๅž‹้’ขไธ€็บงไปฃ็†\n417502 0 ไฝ†่ฝฌ่€Œๆ‰‹้‡Œไป…ๅ‰ฉ็š„3000ๅ…ƒๅฐฑ่ขซๆ‹ฟ่ตฐ\n417503 0 ๅ…จๅธ‚xxxๅๆฃ€ๅฏŸๅนฒ่ญฆๅ‚ๅŠ ๅ‘˜้ขๅˆถๆฃ€ๅฏŸๅฎ˜้€‰ไปป่€ƒ่ฏ•็ฌ”่ฏ•\n417504 0 ไฝ†็ผฉ็Ÿญ็งŸๆœŸๅˆฐ2020ๅนด10ๆœˆ26ๆ—ฅ\n1000 Processed\n classify content\n418000 0 10086็ญ”ๅคๅ‡ ไธชๆœˆๅ‰็š„็งฏๅˆ†ๅ…‘ๆข้‡‘้ขไธ้€€\n418001 0 ็›ฎๅ‰่ฏฅๅ›ฝๆ”ฟๅบœๆญฃๅฏปๆฑ‚ๅฐ†้“่ฟ›ๅฃๅ…ณ็จŽไธŠ่ฐƒไธ€ๅ€่‡ณ10%\n418002 0 ๆ˜ฏ่ฐๆ‹›ๆฅๅœจ่…พ่ฎฏๅฐฑๅŠฃ่ฟนๆ–‘ๆ–‘็š„ไบบ\n418003 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ๆ˜ฏ่ดตๅทžๆ’็พŽๅ…ฌๅธ๏ผŒๆˆ‘ๅ…ฌๅธ้•ฟๆœŸๆŽจๅ‡บ๏ผš็ป†่ƒžๆดป่ƒฝ๏ผŒๆน˜้ฃž็ฅ›ๆ–‘๏ผŒ็ˆฑไบบ่‘ก่„็ฑฝ๏ผŒไธฝ็ปฃๅช›๏ผŒไธ‰ไธ–็ผ˜ใ€‚ไธ“ไธš...\n418004 0 ๆฒก็œ‹ๅคฉๅคฉๅ่…ไบบๆฐ‘็”Ÿๆดปๅฐฑๅคšๅฅฝไบ†\n1000 Processed\n classify content\n418500 0 ๅ—จWhatsAppๅœจiPhoneใ€ๅฎ‰ๅ“ใ€่ฏบๅŸบไบšใ€้ป‘่Ž“ๅ’Œๅพฎ่ฝฏๆ‰‹ๆœบๅนณๅฐไธŠ้ƒฝๅฏไปฅไฝฟ็”จ\n418501 0 ็”ทๅญ่นฒ5ๅนดๅ†ค็‹ฑๅŽ็”จๆผซ็”ปๆ็ป˜ๅˆ‘่ฎฏ๏ผš่ขซๅคšไบบ็ ๆ‰“ๆŠ˜็ฃจ\n418502 0 ไปฅ็กฎไฟ็–ซๆƒ…ๆ—ฉๅ‘็Žฐใ€ๆ—ฉๆŽงๅˆถใ€ๆ—ฉๆ‰‘็ญ\n418503 0 ไปปไฝ•ๆŠ•่ต„้ƒฝ้œ€ๅ…ทๅค‡ๆ™บๆ…งๆ€ง็š„ๅฟ่€ๅŠ›\n418504 1 ๆˆ‘ๅธไปฃๅŠžๅ„ๆˆทยทๅฃๆœฌโ€˜ๆฏ•.ไธšๆœฌ๏ผŒ่บซ*ไปฝๆญฃ๏ผŒ้ฉพ๏ผ่กŒ#้ฉถๆญฃ็ญ‰ไธ€ๅˆ‡ๆœ‰ๆ•ˆๆญฃ&ไปถ๏ผŒ่”็ณป๏ผšโ’ˆโ’Šโ’Œโ’Œโ’Œโ’โ’โ’Ž...\n1000 Processed\n classify content\n419000 0 ๆ–ฝๅŽๆด›ไธ–ๅฅ‡๏ผš็‹ฌๆฝๆฐดๆ™ถ่ดขๅฏŒๆˆๅฐฑ็™พๅนดไผ ๅฅ‡ๅฎถๆ—\n419001 0 
FATEๅŽ้—็—‡๏ผš็œ‹APHๆ€ป่ง‰ๅพ—็œ‰ๆฏ›ๆก‘ๆฏๆฌก่ฆๆ”ปๆ‰“ๅˆซๅฎถ็š„ๆ—ถๅ€™ๅฐฑ่ฆๅฌๅ”คไธช้ช‘ๅฃซ็Ž‹ๅ•Šๆ”พไธชๆŠ•ๅฝฑๅ•ฅ็š„ไบ†โ€ฆโ€ฆ\n419002 0 ๆพ็ดง่…ฐ็š„่ฎพ่ฎกๆ›ด่ดดๅˆ่บซไฝ“ๆ›ฒ็บฟ\n419003 0 ๆˆ‘ไปฌๅœจ7ๆœˆ20ๅท่งฆๅˆฐ196็พŽๅ…ƒ็š„็›ฎๆ ‡ไปท\n419004 0 ่ƒฝไธ่ƒฝๅ…ˆ่ฏ„ไผฐไธ€ไธ‹่‡ชๅทฑ็š„่กŒไธบ\n1000 Processed\n classify content\n419500 0 ๅคชๅ‚ปๆฏ”ไบ†ไธ่ฐˆไบ†็ˆฑไธๅŠจๅ“ˆๅ“ˆๅ“ˆegๅŠ ๆฒนๅง\n419501 0 2ใ€็™ฝ่‰ฒๅŠ้€ๆ˜Ž็œผๅฑŽ๏ผšๅฏ่ƒฝๆ‚ฃๆ€ฅๆ€ง็—…ๆฏ’ๆ€ง็ป“่†œ็‚Ž\n419502 1 xxxx้ญ…ๅŠ›ไธ‰ๅ…ซๆ‚จ็บฆๅ—๏ผŸๆ–ฐ็ปๅ…ธ็บฆๆ‚จxๆœˆxๆ—ฅ่‡ณxๆ—ฅๆƒŠๅ–œไธๆ–ญ.ๅฎžๆƒ ๅคšๅคš๏ผๅ…จๅนดไป…ๆœ‰ไธ€ๆฌก๏ผๆˆ‘ไปฌ็ญ‰ไฝ ๏ผ...\n419503 0 ๆ™šไธŠๅ‡บ้—จไธ็”จ้˜ฒๆ™’ๅ‘€hhh๏ฝž\n419504 0 ้‚„ๅˆฐ้‡‘ๆฒ™็š„57ๅฑค็œ‹ๆ•ดๅ€‹ๆตทๅฒธๅ’ŒๅŸŽๆ™ฏ\n1000 Processed\n classify content\n420000 0 ็œŸไปฅไธบ่‡ชๅทฑๅคงๅŸŽๅธ‚ไบ†ๆ”ฟๅบœๆ— ่ƒฝๅฐฑๆƒณ็€ไธ€ๅˆ€ๅˆ‡ๅœฐ่งฃๅ†ณ้—ฎ้ข˜ๆœ‰ๆฒกๆœ‰ๆƒณ่ฟ‡ไบค้€š้—ฎ้ข˜ไธๅฅฝๆ˜ฏๅ› ไธบไฝ ไปฌ็š„่ง„ๅˆ’ไธๅฅฝ่€Œ...\n420001 0 ๆฌ ๅ€บ่ฆ้’ฑๅฑ…็„ถ้€ผๅพ—ๆˆ‘ไธŠ็™พๅบฆไบ†\n420002 1 ๅฅถ็ฒ‰๏ผ‰ไธ€ๅพ‹ไธƒๆŠ˜ไผ˜ๆƒ ๏ผŒๅ‡กไธ€ๆฌกๆ€ง่ดญ็‰ฉๆปก็™พๅ…ƒ็š„้กพๅฎข๏ผŒๅฐ†ไผš่ต ้€็ฒพ็พŽ็คผๅ“ไธ€ไปฝ๏ผŒๅƒไธ‡ไธ่ฆ้”™่ฟ‡ๅ•Š๏ผ๎•ๅœฐๅ€๏ผš...\n420003 0 ่ฟ™ๆ˜ฏไธคๅนดๆฅๆตฆๅฃไบŒๆ‰‹ๆˆฟๆœˆๆˆไบค้‡้ฆ–ๆฌกๆˆไธบๅ…จๅธ‚็ฌฌไธ€\n420004 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅˆš่ทŸๆ‚จ่”็ณป็š„ๆน–ๅŒ—ไธญ่ดตๅ…ด็š„ๅฐๆœฑ๏ผŒๅ…ฌๅธๅœฐๅ€ๅœจๅ…‰่ฐทๆญฅ่กŒ่ก—ไธ–็•ŒๅŸŽๅนฟๅœบxxๆฅผ๏ผŒๅฆ‚ๆžœๆ‚จ้œ€่ฆ่ต„...\n1000 Processed\n classify content\n420500 0 ๅๅญ—ๅซโ€œTheBeatlesPubโ€ไฝ†ๆ˜ฏไธ€ๆ™š้ƒฝๆฒกๅฌๅˆฐไธ€้ฆ–Beatles\n420501 0 ไฝ†ๆ˜ฏๅนถ้žๆ‰€ๆœ‰็š„Lumiaๆ‰‹ๆœบ้ƒฝๅฏไปฅๅœจ็ฌฌไธ€ๆ—ถ้—ดๅ†…ๅ‡็บง\n420502 0 ็”ฑไบŽๅœจFirefoxๆต่งˆๅ™จไธŠๅ‘็Žฐไธ€ๅค„ไธฅ้‡็บงๅˆซๅฎ‰ๅ…จๆผๆดž\n420503 0 ๆŸฅ่Žทๆ‰‹ๆœบ70ไฝ™้ƒจใ€้“ถ่กŒๅก70ไฝ™ๅผ ใ€่ฑชๅŽๆฑฝ่ฝฆ5่พ†\n420504 0 ๅคšๆ•ˆ็‚ซๅฝฉ้ญ”ๆณ•้š”็ฆป้œœ50mlไฟๆนฟๅฆ†ๅ‰ไนณๆไบฎ็พŽ็™ฝHERA่ตซๆ‹‰้ญ”ๆณ•ๅฆ†ๅ‰ไนณ\n1000 Processed\n classify content\n421000 0 ๅคงๅŠ ็ดขๅฐ”ๅœจๅ—้žๅ‚ๅŠ NBA้žๆดฒ่กจๆผ”่ต›\n421001 1 ็พŽไธฝๅฅณไบบ่Š‚ๅˆฐๅ•ฆ๏ผxๆœˆxๆ—ฅๅฝ“ๅคฉๅˆฐๆฏ›ๆˆˆๅนณ็”Ÿๆดป้ฆ†ๆถˆ่ดน็š„็พŽๅฅณ้ƒฝๅฏๅพ—็ฒพ็พŽ็คผๅ“ไธ€ไปฝๅ“ฆ๏ผ๏ผๆฏ›ๆˆˆๅนณ็”Ÿๆดป้ฆ†ๅ…จไฝ“...\n421002 0 
ไธŠๅ‘จๆŠ•่ต„่€…ไปŽๆ–ฐๅ…ดๅธ‚ๅœบๅŸบ้‡‘ๅ‡€่ตŽๅ›žxx\n421003 0 WindowsPhone่ฎพๅค‡xๅนดๆ€ป้”€้‡่พพๅˆฐไธ€ไบฟๅฐ\n421004 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?Bad้ข„ๅ‘Š็‰ˆ้Ÿณ้ข‘\n1000 Processed\n classify content\n421500 0 ่ฏฅๅธ‚โ€œๅพฎๅฏ†ยท้“ๅฎขไน‹ๅฎถโ€ๅ‡บ็งŸ่ฝฆ็ˆฑๅฟƒ่ฝฆ้˜Ÿ็š„ๅธๆœบไปฌๅพ—็Ÿฅๆญคไบ‹ๅŽ\n421501 0 xใ€็œ็บง้ƒจ้—จๅฝขๅŒ่™š่ฎพไธไฝœไธบ\n421502 0 ่ฏท่ฎคๅ‡†๏ผšsozu1991้ž่ฏšๅ‹ฟๆ‰ฐไธพๆŠฅๆญปๅ…จๅฎถ\n421503 0 ไธบๅฎ˜ๆธ…ๅป‰็…งๆฑ—้’ๆ•…ๅฑ…็บต็ซ่ขซๅผบๆ‹†\n421504 0 ๆˆ‘ๅฐฑ่ตทๅบŠๆŠŠ็”ต่„‘ๅ’Œๆ‰‹ๆœบ้‡Œ็š„่ฝฏไปถ้ƒฝๆ›ดๆ–ฐไธ€้\n1000 Processed\n classify content\n422000 1 ๅ•†ๅœบ่ฟ›ๅฃๅŒ–ๅฆ†ๅŒบ๏ผŒ่ดญไนฐๅŒ–ๅฆ†ๅ“๏ผๆœ€้ซ˜ๅฏ่ฟ”็Žฐxxxxๅ…ƒ็š„่ฟ›ๅฃๅŒ–ๅฆ†ๅ“็Žฐ้‡‘ๅˆธ๏ผŒ่ฟ˜ๆœ‰ไบ”็บงๅฅฝ็คผๅŠ ่ต ๏ผๆœŸๅพ…ๆ‚จ...\n422001 0 ๆฎ้ข„ๆต‹๏ผš2015ๅนดๅ…จ็ƒๆ‰‹ๆœบๆˆ็˜พ่€…ๅฐ†่พพๅˆฐ2\n422002 0 3๏ผŽๆค้…ธ้…ถๅ’Œๆœจ่š็ณ–้…ถ็š„ไฝœ็”จๆœบ็†\n422003 0 ไปŠๅนดๆˆ‘้™คไบ†ไฝ ๅฎถkindleไนฐไบ†ๅ‡ ๆœฌไนฆๅค–ๅฐฑๆฒกไนฐ่ฟ‡ไปปไฝ•ไธœ่ฅฟ\n422004 0 ใ€Ž็ฅžๅทžไธ“่ฝฆ่ขซ็บฆ่ฐˆ็งŸ่ฝฆ้…ๅธๆœบๅ…่ดนๆŽฅ้€ๆœบ่ขซๆŒ‡่ฟๆณ•ใ€\n1000 Processed\n classify content\n422500 0 ็–ซๆƒ…ๅนณ็จณ้ข„็คบๅ‘็—…้ซ˜ๅณฐๆœŸๅทฒ่ฟ‡\n422501 0 xๆœˆxๆ—ฅไฝ ไปฌ่ฟŽๆฅไบ†ไฝ ไปฌ็š„ๅˆไธ€ไธชๆˆๅ‘˜\n422502 0 ไธ่ฟ‡่ฟ™ๅญฃ็œ‹่ตทๆฅๅˆๆ˜ฏๅพˆhighๅ˜ฟๅ˜ฟๅ“ˆๅ˜ฟ\n422503 0 ๅŽŸๆ–‡โ€”้ฃŽ้ฃŽ้›จ้›จๆš–ๆš–ๅฏ’ๅฏ’ๅค„ๅค„ๅฏปๅฏป่ง…่ง…\n422504 0 ๆน–ๅŒ—่†ๅทžๅฎ‰่‰ฏ็™พ่ดงๆ‰‹ๆ‰ถ็”ตๆขฏๅ‘็”Ÿไบ‹ๆ•…\n1000 Processed\n classify content\n423000 0 ๅพฎ่ฝฏ่ฆ่ตท่ฏ‰ไธญๅ›ฝๅฆ‚ๆžœๆˆๅŠŸ่ฆ่ต”ๅ‡ ็™พไบฟ\n423001 0 chrome็”จๅคšไบ†ๆž—ๅญ่ช้ƒฝ่งๅพ—ๅฐ‘ไบ†ๅ‘ข\n423002 0 ไธญๅ›ฝ้“ถ่กŒๅฐ†ๅœจ8ๆœˆไธ‹ๆ—ฌ็ป„็ป‡่ฟ›่กŒ็ปŸไธ€็ฌ”่ฏ•\n423003 0 ๅๅ’ŒๅŒป้™ข็ฒพๅฟƒๅ“็‰Œๅ‡บ็š„ไธค็งๆ–ฐๅ“ๆด—้ขๅฅถ\n423004 0 Nxxxxx่ฟ™ๆฌพ้˜ณๅˆš็š„Sheltonไธญๅทๆ‰‹่ข‹ไปฅไผ˜้›…่€Œ่€็”จ็š„DamierEbรจneๅธ†ๅธƒๅˆถๆˆ\n1000 Processed\n classify content\n423500 0 ๆ—ฉๆœŸ็œ‹่ตทๆฅๆœ‰ๆœบไผšๅšO2O็š„ๅฃ็ข‘็ฝ‘ใ€็ˆฑๅธฎ็ฝ‘ใ€ๅคงไผ—็‚น่ฏ„็ฝ‘็Žฐๅœจๅชๆœ‰ๅคงไผ—ๆดป็š„่ฟ˜ไธ้”™\n423501 0 ๅŽไธบP8้ซ˜้…็‰ˆๆ‰‹ๆœบ่ฟ˜ๆไพ›ไธ€้ข—500ไธ‡ๅƒ็ด ๅ‰็ฝฎๆ‘„ๅƒๅคดๅ’Œไธ€้ข—1300ไธ‡ๅƒ็ด ๅ…‰ๅญฆ้˜ฒๆŠ–ๅŽ็ฝฎๆ‘„ๅƒๅคด\n423502 0 
ไธŠๆตทๅ—ไบฌ่ทฏๅค–ๆปฉ็™ฝๅคฉ่ทŸๅ…ถไป–ๅคงๅŸŽๅธ‚ไธ€ไธชๆ ท\n423503 0 ่ฟ™ไบ›้—ๆ†พๅทฅ็จ‹ๅคงๅคšๅ› ไธบ่ฃ…ไฟฎ่ฟ‡็จ‹็š„่ฏฏๅŒบๆ‰€ๅฏผ่‡ด็š„\n423504 0 ็Žฐๅœจๅคงๅฎถ้ƒฝ็Ÿฅ้“็œ‹5ๆ—ฅ็บฟใ€10ๆ—ฅ็บฟ\n1000 Processed\n classify content\n424000 0 ๆฏๆฌก่ธ่ฟ›ๅŒป้™ขๅคง้—จๅฟƒๆƒ…้ƒฝๆ˜ฏๆฒ‰้‡็š„\n424001 0 ๆฅ่‡ชFireEye็š„็ ”็ฉถๅ‘˜ไป‹็ปไบ†ไธ€็งโ€œๆŒ‡็บนไผ ๆ„Ÿๅ™จ็›‘่ง†ๆ”ปๅ‡ปโ€็š„ๆ–นๆณ•\n424002 0 ๅ…ถๅฎžๆœ‰ไธชcba็š„ไธปๅœบๆ”พ่‹ๅทžๆ›ดๅฅฝ็š„\n424003 0 ๅพฎ่ฝฏๅ…ฌๅธƒ2015ไบŒๅญฃๅบฆไธš็ปฉๆŠฅๅ‘Š\n424004 0 ๅ…จ้•‡้€‚้พ„้’ๅนดๆฅๅˆฐๅธธๅนณๅŒป้™ขๅ‚ๅŠ ไบ†ไฝ“ๆฃ€\n1000 Processed\n classify content\n424500 0 ็ป™ๅคงๅฎถๅˆ†ไบซโ€œOfficeMacxxxxspx\n424501 0 ๆ–ฐๅนฒๅŽฟไบบๆฐ‘ๆณ•้™ขๅฎก็†ไบ†ไธ€่ตทๆฐ‘้—ด\n424502 0 ๅพฎ่ฝฏWin10็ณป็ปŸU็›˜ๅค–่ง‚ๆ›ๅ…‰\n424503 1 (x/x)ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตไฟๅˆฉๅ›ฝ้™…ๅฝฑๅŸŽ.ๆœฌๆœˆxxxๅ…ƒๅณๅฏๅ…ฅไผš\n424504 0 ไป…ๆต™ๆฑŸ็ปๅ…ดไผไธš่ดขไบงไฟ้™ฉๆŠฅๆกˆ1300ๅคš่ตท\n1000 Processed\n classify content\n425000 0 QQxxxxxxxxxiphoneๅ…จ็ƒๅซๆ˜ŸGPSๅฎšไฝ้ช—็บธ\n425001 0 ่€ŒๆŒ‰ๅœจๅฒ—ๅนดๅนณๅ‡72818ๅ…ƒๆฅ็ฎ—็š„่ฏ\n425002 1 ไฟก้˜ณๅธ‚ๅฎๆบไพ›ๆฐด่ฎพๅค‡ๅˆถ้€ ๆœ‰้™ๅ…ฌๅธๆ€ป็ป็†ๆŽๆ ‘ๆˆๆบๅ…จไฝ“ๅ‘˜ๅทฅ็ฅๆ‚จๅœจๆ–ฐ็š„ไธ€ๅนด้‡Œ็”Ÿๆ„ๅ…ด้š†๏ผๅ…ฌๅธไธป่ฅๅ„็งๅž‹...\n425003 0 ๆ˜จๅคฉ่ฟ˜ๅœจ่ดจ็–‘ๆˆ‘็š„ไบบ็ฑปๆœ‹ๅ‹็งฐๅกไธ่ฝฆๆ˜ฏไธ€้กน่ฟๅŠจ\n425004 0 ๅœจ่ฟ™ไธช็œ‹่Šฑๅƒ้ชจ็š„ๅนด็บชๆˆ‘่ฟทๅคฑๅœจโ€ฆ\n1000 Processed\n classify content\n425500 0 ๅฏนๆˆ‘ๆฅ่ฏดๆˆ‘ๅฎŒ็พŽ็†ๆƒณไธญ็š„ๅคๅคฉๆ˜ฏ่ฟ™ๆ ท็š„\n425501 0 ็›ด้€š937็›ดๆ’ญ่ดด๏ผš19ๅฒ็š„ๅฐ็Ž‹14ๅฒๅทฆๅณๅฐฑ่พๅญฆ้š็ˆถไบฒๅˆฐ่‹ๅทžๆ‰“ๅทฅ\n425502 0 Dior่ฟชๅฅฅ็œŸๆˆ‘100ml้ฆ™็ฒพๅ7\n425503 0 ๅ‘จๆฐไผฆ่‹ๅทž็ซ™ๆผ”ๅ”ฑไผš้—จ็ฅจ้ƒฝไนฐไธๅˆฐ\n425504 0 ๅˆ†10็‚นๅ’Œ15็‚นไฟฉๆ—ถๆฎตๅˆ†ๅˆซ7ๆŠ˜็ง’ๆ€ๅŽไธบ่ฃ่€€4Xใ€ๅฐ็ฑณ4ใ€iPhone6\n1000 Processed\n classify content\n426000 0 ๅŸบๅœฐ็š„ไธ‰่ง’ๅฝข้ฃžๆœบๅœจๅคดไธŠ็ป•ๆฅ็ป•ๅŽป\n426001 0 ๅฝ“ๆ—ถไบ”ไธ€ๅคง้“่ฟ˜ๅœจไฟฎๅœฐ้“ๅช่ง‰่„ไนฑ\n426002 0 ๅฆ‚ๆžœ้™ชๅ„ฟๅญไนŸ่ƒฝๅƒ็œ‹่Šฑๅƒ้ชจไธ€ๆ ทๆœ‰ๅ…ด่ถฃๅฐฑๅฅฝไบ†\n426003 0 ่ฟ™ๆ—ถๅชๆœ‰่‚ก็ฅจ่บฒๅœจ่ง’่ฝ้—ท้—ทไธไน๏ผšไฝ ไปฌ็œŸๅนธ็ฆ\n426004 0 ๆ”ฏๆŒAndroidใ€Linuxใ€Windows10\n1000 Processed\n classify 
content\n426500 0 ๅœจๅŽป็ก…่ฐท็š„่ทฏไธŠๆŠŠๆ‰‹ๆœบไธขไบ†ๆ˜ฏ็งๆ€Žๆ ท็š„ไฝ“้ชŒ\n426501 0 ไพฆๆŸฅไบบๅ‘˜ๅœจๅฎกๅˆค้˜ถๆฎตไฝœไธบ่ฏไบบๅ‡บ็Žฐ\n426502 0 ๅˆšไธ‹้ฃžๆœบๅฐฑ่ขซ็œผๅ‰็š„ไธ€ๅน•ๆƒŠๅ‘†ไบ†๏ผšๆฑ‰ๅŸŽ้‡‘ๆตฆๆœบๅœบๅฐ†ไธ€ไธชไธ“้—จ้€š้“็•™็ป™่ฟ™ๅฏนๆ–ฐไบบ\n426503 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 95725eไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n426504 0 ่ฆๆ•™ไผšxๅฒไปฅไธŠ็š„ๅญฉๅญไธ€ไบ›ๅŸบๆœฌ้€ƒ็”ŸๆŠ€่ƒฝ\n1000 Processed\n classify content\n427000 0 ้˜ฟ้‡Œๅทดๅทดๅ†œๆ‘ๆท˜ๅฎไบ‹ไธš้ƒจไธญ่ฅฟ้ƒจๅคงๅŒบ็ป็†ๅด”ไผšๆ•ๅบ”้‚€ๅˆฐๆˆ‘ๅŽฟๅ•†ๆŠ•ไธ–็บชไธญๅฟƒใ€ไบฟ่”ไธ“ไธšๅธ‚ๅœบใ€็š‚่ง’็คพๅŒบ\n427001 0 ๅธฆ็€ไผš้—ช็บขๅ…‰็š„ๆฃ’ๆฃ’็š„police้ƒฝ่ฟ‡ๅŽปไบ†ๅ–‚\n427002 0 ไธ€่ตทๅˆถ้€ ๆญฃไน‰็š„ๅทจๅคงๆœบๅ™จไบบๅง~\n427003 0 ๅŠ ๅ…ฅ็ซ‹็œ100ใ€ๅช่ฆไฝ ่ˆๅพ—ไธ€ไธช228\n427004 0 ๅ…ˆๆฅ็œ‹็œ‹BB้œœ็š„ๆˆไปฝๅ‰–ๆžๅ’Œ็›ธๅ…ณๆŠฅๅ‘Š\n1000 Processed\n classify content\n427500 0 ้‡่ฆไบ‹่ฆ่ฏดไธ‰้ๅˆฐๅบ•่ฐ็ฎกๅˆฐๅบ•่ฐ็ฎกๅˆฐๅบ•่ฐ็ฎก\n427501 0 ไธ€ๅคฉๆ•ฐๆฌกไฝ“ไผšๅˆฐ็™พๅบฆๅ’Œ่ฐทๆญŒ็š„ๅทฎ่ทโ€ฆๅ…ˆๆœ็”ณๅฟ…่พพ\n427502 0 ไธญๅ›ฝ็บบ็ป‡ๅทฅไธš่”ๅˆไผšๅœจ่ฟžไบ‘ๆธฏ็ป„็ป‡ๅฌๅผ€ไบ†็”ฑไธญๅค็ฅž้นฐ็ขณ็บค็ปดๆœ‰้™่ดฃไปปๅ…ฌๅธใ€ไธœๅŽๅคงๅญฆใ€ๆฑŸ่‹้นฐๆธธ็บบๆœบๆœ‰้™...\n427503 0 ๅฐฑ็ฎ—ๅ†ๅŠ ๆฒนๅคงๆฆ‚ไธไผšๆผ้‚ฃไนˆๅคšไบ†\n427504 0 ่€ŒๅคšๅๆฑŸ่‹่ˆœๅคฉ้˜Ÿๅ‘˜ๆœ€ๅŽๆ—ถๅˆปๆƒ…็ปช้žๅธธๆฟ€ๅŠจ\n1000 Processed\n classify content\n428000 0 ็ปๅธธๆ—…ๆธธๅ‡บๅทฎ็š„ใ€ไน˜้ฃžๆœบใ€่ฝฎ่ˆนใ€ๅๆฑฝ่ฝฆ\n428001 0 ๅ‚ไฟๆ‚ฃ่€…ๅœจๅŒป่”ไฝ“ๅ†…ๅŒป็–—ๆœบๆž„้—ดๅˆ†็บง่ฏŠ็–—\n428002 0 xxxxๅŽไธบPxโ€ฆโ€ฆๅ“ชไธชๆ˜ฏไฝ ็š„่œ\n428003 0 ๆŒ‡ๅ‡บๆญฃๅธธ็š„DNAๆŸไผคๅบ”็ญ”่ƒฝๅŠ›ๅœจ้€ ่ก€ๅนฒ็ป†่ƒž็ปดๆŒๅ…ถ่‡ชๆˆ‘ๆ›ดๆ–ฐ่ƒฝๅŠ›ไธญๆ‰ฎๆผ”ไธๅฏๆˆ–็ผบ็š„ไฝœ็”จ\n428004 0 ไธบๅ•ฅ่ฟ™ๆ ท็š„ๅฎณ็พคไน‹้ฉฌไธๅˆคๆญปๅˆ‘\n1000 Processed\n classify content\n428500 0 ๆžๅฎขๆ—ฉ็‚น๏ผšIBM่ฟž็ปญxxๅญฃๆ”ถๅ…ฅไธ‹ๆป‘\n428501 0 ๆฑŸ่‹ๅŽ็ปฟ็”Ÿ็‰ฉ็ง‘ๆŠ€่‚กไปฝๆœ‰้™ๅ…ฌๅธๅœจๅŒ—ไบฌๅ…จๅ›ฝไธญๅฐไผไธš่‚กไปฝ่ฝฌ่ฎฉ็ณป็ปŸๅ…ฌๅธไธพๅŠžโ€œๆ–ฐไธ‰ๆฟโ€ๆŒ‚็‰Œไปชๅผ\n428502 0 ๆˆ‘ๅฑ…็„ถ่ฎฐ้”™ๆŒ‘ๆˆ˜่€…่”็›Ÿ็š„ๆ’ญๅ‡บๆ—ถ้—ด\n428503 0 
ไธ‰ไบšๅญฆ้™ข2015ๅนดๅœจๆฑŸ่‹็œ่‰บๆœฏใ€ไฝ“่‚ฒ็ฑปไธ“ไธšไบŒๅฟ—ๆ„ฟๆŠ•ๆกฃ22ไบบ\n428504 0 ็œ‹ๅˆฐ่ฟ™็งๆœ‰ๆŸๅ—ไบฌๅฝข่ฑก็š„่กŒไธบๅฐฑๅพˆ่ฎจๅŽŒ\n1000 Processed\n classify content\n429000 0 ่ฎพ่ฎก่€…็”จ่™šๅŒ–็š„่ƒŒๆ™ฏ็ชๅ‡บๆญŒๆ›ฒ็š„ๅ็งฐ\n429001 0 ไป…3็ง’ๅฐฑๅฐ†็บฆ13ๅจ็š„ๅ…ฌไบค่ฝฆๆŠฌไบ†่ตทๆฅ\n429002 0 ๅฝ“ๅœบๆŠ“่Žท็†ŠๆŸใ€ๆฒˆๆŸ็ญ‰9ๅๅซŒ็–‘ไบบ\n429003 0 ไบค10ๅนดไฟ20ๅนด่ฏด็ป™ไฝ ๅนดๅ›บๅฎšๆ”ถ็›Š12%\n429004 0 ๅˆ˜็ง‹ๆฑŸๆณ‰ๅทžๅธ‚ไธญๅŒป้™ขๅ‰ฏไธปไปปๅŒปๅธˆๆ‚ฃ่€…ๆปกๆ„ๅบฆ100%ๆ“…้•ฟไธญ่ฅฟๅŒป็ป“ๅˆๆฒป็–—็—”็–ฎใ€่‚›็˜˜ใ€่‚›ๅ‘จ่„“่‚ฟใ€่‚›่ฃ‚ใ€...\n1000 Processed\n classify content\n429500 1 ไบฒ็ˆฑ็š„ๅธ็‘ž็พŽ้กพๅฎขๅผ€ๅนดๆœ‰ๅคง็คผๅ“ฆ๏ผๆœบไธๅฏๅคฑ๏ผŒๆ—ถไธๅœจๆฅ๏ผๅ‡กๅœจxๆœˆxOๆ—ฅๅ‰ๅˆฐๅบ—้กพๅฎขๅŠๅฏไบซๅ—ๆŠฝๅฅ–ไธ€ๆฌก๏ผ...\n429501 0 ๅพฎ่ฝฏไธŽ่ฏบๅŸบไบšๅˆ†้“ๆ‰ฌ้•ณLumiaไฝ•ๅŽปไฝ•ไปŽ\n429502 0 ไปŠๅคฉๆˆ‘ไปฌๅฐฑไธ€่ตท่ตฐ่ฟ›HAMANN\n429503 0 ๅœจ19ๆฅผไธŠๆ‰พๅˆฐไบ†้‚ฃไธชๆ‰€่ฐ“็š„็›ธไบฒๅคงไผš็š„็œŸ็›ธ\n429504 0 ไนŸๆ˜ฏๅŒ—ไป‘ๅŸŽ็ฎกๅฑ€้•ฟๆœŸๅšๆŒ็š„้‡็‚นๅทฅไฝœ\n1000 Processed\n classify content\n430000 0 ๆ‰€ไปฅ่ฏดGoogle่ทŸBaidu็š„ๅทฎ่ทๆœ‰็‚นๅคง\n430001 0 ๆฑŸ่‹ๅซ่ง†้…้Ÿณ็‰ˆ็ปงๆ‰ฟ่€…ไปฌ็ฎ€็›ดๅ“ญ็žŽ\n430002 1 ใ€ๆณฐ็ฆพๅŽฆ้—จ้™ขๅญยท้ฆ–็Žบใ€‘ๆตทๆฒง็ฌฌไธ€ๆตทๆ™ฏ้ซ˜ๅฑ‚๏ผŒๅœฐ้“ๅฃ๏ผŒๅ…ฌๅ›ญๆ—๏ผŒๅๆ กๅญฆๅŒบๆˆฟ๏ผŒxxๅˆ†้’Ÿ่ฟ›ๅฒ›๏ผxx-xx...\n430003 0 ๆฒณๅŒ—ๅŒบๅŒบๆ”ฟๅบœๅคฉๅคฉ่ฟ™ไนˆ็ƒญ้—นๅ•Š\n430004 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅ›ฝๅ’Œๅ˜‰ๅญšไธœ้ฃŽ้›ช้“้พ™้”€ๅ”ฎ้กพ้—ฎ่ดพๆตท้‘ซ๏ผŒๅพˆๆ„Ÿ่ฐขๆ‚จๅฏนไธœ้ฃŽ้›ช้“้พ™็š„ๅ…ณๆณจ๏ผๆœฌๅ‘จๅ‘จๅ…ญๅ‘จๆ—ฅไธบไบ†ๅบ†็ฅ...\n1000 Processed\n classify content\n430500 0 ไธญ็ฒฎ้›†ๅ›ขๅ‰ไธ‰ไธชๆœˆไบง็”Ÿ็š„ๅˆฉๆฏๆ”ฏๅ‡บ็บฆไธบ19\n430501 0 ็Ÿญ่ฃค็š„ๅŠ่ฃ™ๆ‘†่ฎพ่ฎกไนŸ้žๅธธ็‰นๅˆซ\n430502 0 ๅฐๅฎถไผ™่ƒฝๅคŸๅฎž็Žฐ170ยฐๅนฟ่ง’ๆ‰ซๆ\n430503 0 ็กฎๅฎšๆˆ‘ๅธ‚20ๅฐ็”ณ้พ™็”ตๆขฏไธญๆฒกๆœ‰ๅ‘็Žฐโ€œๅƒไบบโ€็”ตๆขฏๅž‹ๅท\n430504 0 ๅดไบฆๅ‡ก่ฏ‰่ฎผ็ป†่Š‚ๆ›ๅ…‰๏ผšๅ…ฌๅธ่ง†ๆˆ‘ไธบๆœบๅ™จ้›ถไปถ\n1000 Processed\n classify content\n431000 0 ็ฌฌ10้›†๏ผšๅทฅๅŽ‚ไนŸไผšๆœ‰ๅœŸๅฃคๆฑกๆŸ“้—ฎ้ข˜ๅ—\n431001 0 ๅคงๅŠๅคœ็š„ๆˆ‘็”จๆฐดๆžœๆ‰‹ๆœบๆ‹็š„ๅคœ้—ด้ฃž่กŒ\n431002 0 ๅฆ‚ๆžœไฝ 
่ง‰ๅพ—ๅฏนๆ‰‹ๆœบๅ’Œ็”ต่„‘ๅผบ่ฟซ็—‡็š„ๆˆ‘ๆ˜ฏๅ–œๆฌขๆ–ฐ้ฒœไบ‹็‰ฉ\n431003 0 ้€€ไผ‘ๆˆไบ†้›จ่ŠฑๅฐๅŒบๆฟๆกฅๆ–ฐๅŸŽ็ฎกๅง”ไผšๆ–ฐๆž—็คพๅŒบ็š„็คพๅŒบๆฒปๅฎ‰ๅฟ—ๆ„ฟ่€…\n431004 0 ๅฆ‚ๆžœไฝ ็š„่ฝฆ้…ๅค‡ไบ†ECOๅ‘ๅŠจๆœบๅฏๅœๅŠŸ่ƒฝ\n1000 Processed\n classify content\n431500 0 ็”ฑๅŒ—ไบฌๅพ€ไธŠๆตทๆ–นๅ‘่‹ๅทžๆฎตxxxxK้™„่ฟ‘็Žฐๅœบ่ฝฆๅคš็ผ“่กŒ\n431501 0 ๅฟซtm่ขซ็”ต่„‘ๆฐ”ๅ“ญไบ†ไป€ไนˆ็Žฉๆ„ๅ„ฟๅ•Š็ญ‰ไบ†20ๅˆ†้’Ÿๅ‚จๅญ˜ๅคฑ่ดฅ\n431502 0 ๆ‰€ไปฅ่ฏด้‚ฃไธชๅทžๆ˜ฏไธๆ˜ฏๆฒกๆญปๅˆ‘ๅชๆœ‰็ปˆ่บซ็›‘็ฆ\n431503 0 ่ฟ™ๆ—ถไธ€ไธชๆฃ‰่Šฑ็ณ–ๅฐ่ดฉๅธๅผ•ไบ†ๅฐๆœ‹ๅ‹ไปฌ็š„ๆณจๆ„\n431504 0 ๆฌข่ฟŽ็ˆฑๆถๆžใ€็ˆฑ็–ฏใ€็ˆฑ็Žฉใ€็ˆฑๅ†’้™ฉใ€็ˆฑๅƒ\n1000 Processed\n classify content\n432000 0 bigbangๅ—ไบฌๅœบ1280\n432001 0 ไธ่ฆ็›ฒ็›ฎไธ่ฆๅŽปไนฑๆŽจๅนฟๅ่€Œ่ขซ้ช—่ง‰ๅพ—ๆˆ‘ไปฌๅ…ถไป–ๆŽจๅนฟๆ€Žไนˆๆ€Žไนˆๆ ท\n432002 0 ็Ÿญๅฐพ็š„่ฎพ่ฎกไฝฟๆ•ดไธช่ฝฆ่บซๆ›ดๅŠ ็ฒพ่‡ด\n432003 0 ๆฒณๅŒ—ๆณ•้™ขๅœจๅ…จ็œๅผ€ๅฑ•ๆ‰“ๅ‡ปๆ‹’ไธๆ‰ง่กŒๅˆคๅ†ณใ€่ฃๅฎš็ญ‰\n432004 0 ๆˆ‘่ฟ™่พนไธŠmicrosoftๆˆ‘้ƒฝ่ฆๆŒ‚vpnไฝ ไปฌๆ˜ฏ็ญ‰็€ๆˆ‘ไปฌ้ƒฝๅŽปๅทฅไฟก้ƒจๆŠ•่ฏ‰ไนˆ\n1000 Processed\n classify content\n432500 0 2018ๅนดๅบ•ๅ‰ๅฐ†่‡ดๅŠ›ไบŽ5Gๆ ‡ๅ‡†ๅŒ–ๅˆถๅฎš\n432501 0 x็‰›ๅฅถไฝ“่†œxxๅธƒไธ็ฒ‰x็›’bb้ข่†œx้˜ฒๆ™’็พŽ็™ฝๅ–ท้›พ\n432502 0 ็ˆฑ็ฌ‘ๆœ‰่ดฃไปปๅฟƒไธŠ่ฟ›ๅฟƒๅœจๅ—ไบฌๆœ‰็จณๅฎšๅทฅไฝœ\n432503 0 ๆœ‰็ฝ‘ๅ‹ๆ„คๆ€’้—ฎ้“๏ผšๅฅฝๅฃฐ้Ÿณๆ˜ฏๅค–ๅ›ฝ็š„\n432504 0 ่ฒ็‰นยทๆณฐๆ–ฏ็‰น็ฝ—่ŽŽยทๅ“ˆๆ‹‰ๆธฉ\n1000 Processed\n classify content\n433000 0 ไนๅ›พโ†“โ†“โ†“ไบ†่งฃ้‚ฃไบ›ๆฑฝ่ฝฆ็š„้š่—ๅŠŸ่ƒฝ๏ฝž\n433001 1 ไบฒ็ˆฑ็š„ๅงๅงๆ‚จๅฅฝ๏ผŒไธ‰ๅ…ซ่Š‚ๅฐ†ๅˆฐ๏ผŒใ€็‚่Žฑ่’‚ๅฐ”ใ€‘ๅฐๆ›ผ๏ผŒๆๅ‰็ฅๆ‚จ่Š‚ๆ—ฅๅฟซไนๆฐธ่ฟœๅนด่ฝปๆผ‚ไบฎใ€‚xยทxโ€”x.xๆœ‰...\n433002 0 ็†ๆƒณ17ๅฑ‚ๅฅณ็”Ÿๆด—ๆ‰‹้—ดๆกๅˆฐไธ€ไธชๅทฅๅก\n433003 0 ไฝฟ็”จ้˜ฟๆ‹‰ๆ–ฏๅŠ ็š„TourSaver้…ท่ƒ–\n433004 0 ไนฐไธ‹ไฝไบŽ็™ฝๅŽ…็š„ไผฆๆ•ฆ่ญฆๅฏŸๅฑ€ๆ—งๆ€ป้ƒจ\n1000 Processed\n classify content\n433500 0 ๅคชๅŽŸๅธ‚ไนไธ€ๅฐๅญฆ2015ๆ‹›่˜ๆ•™ๅธˆ็–‘ไผผๅ†…้ƒจๆ“ไฝœ\n433501 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš\n433502 0 ๆœ‰ๅพˆๅคšๅฐๅž‹ๅบ—้“บ่ฃ…ไฟฎๅพ—็ฒพ่‡ด็š„\n433503 0 
่งฃๆ”พๅ†›ๆ€ปๅŒป้™ขๅฟƒๅ†…็ง‘้™ˆ้Ÿตๅฒฑๆ•™ๆŽˆๅšไบ†้ข˜ไธบโ€œSTEMIๅ’ŒNSTEMIๆฒป็–—็ญ–็•ฅ่งฃๆžโ€็š„็ฒพๅฝฉๆŠฅๅ‘Š\n433504 0 ไธ€้ข—ๅŸบๅ› ็ชๅ˜็š„่‘ก่„๏ฝž??\n1000 Processed\n classify content\n434000 0 ็™พๅบฆๆ‰็Ÿฅ้“ๆ€ๅงๅง็š„ๆผ”ๅ‘˜็š„ๅๅญ—ๆ‰ๆ˜ฏ้ฉฌๅฏ\n434001 0 ๅœจๆ—…ๆธธ็š„ๅŒๆ—ถไนŸๆœ‰ๅฅฝๅคš็พŽๅ‘ณ็š„ๅฐๅƒๅ’Œๅฝ“ๅœฐๆตท้ฒœ\n434002 0 ๅฟฝ็•ฅwinphone็”จๆˆท็š„็ปๆตŽ่ƒฝๅŠ›\n434003 0 ไผ—ๅคšๆžœ็ฒ‰็บท็บทๅ‘ๅธ–ๆŠจๅ‡ปiOS7ไน‹ไธ‘\n434004 0 ๅฌ้™ตๅŒบๆณ•้™ขๅœจๅ…ซไธ€ๅ‰ๅค•ๅฌๅผ€ๆถ‰ๅ†›็ปดๆƒๆ–ฐ้—ปๅ‘ๅธƒไผš\n1000 Processed\n classify content\n434500 0 xxx็š„ไธคๅผ xxๅŒบxๆŽ’ไฝ็ฝฎ้žๅธธๅฅฝ้ž่ฏšๅ‹ฟๆ‰ฐ\n434501 0 ่ดตๅทžไบบๅฌไบ†ๆฐ”ๅˆฐ่ฏด\"่ดต้˜ณๆœ‰ๅบง้’Ÿ้ผ“ๆฅผ\n434502 0 ๆฑ‚ไธ€ๅผ bigbangๅ—ไบฌๅœบ็ฅจ\n434503 0 ANGLEๅ…ฌไธปๅฎŒ็พŽ็š„่žๅˆไบ†้บป้บป็š„็พŽ่ฒŒไธŽ็ฒ‘็ฒ‘็š„ๅธ…ๆฐ”\n434504 0 ไธ็Ÿฅไธ่ง‰ๅญฆC่ฏญ่จ€ๅทฒ็ปไธคไธชๆ˜ŸๆœŸไบ†\n1000 Processed\n classify content\n435000 0 ็™ฝ้ข†ไธฝไบบ่‹นๆžœๅนฒ็ป†่ƒž้ข่†œไธญๅ›ฝ็ฌฌไธ€ๆฌพไธ“ๆณจ่งๅน•่พๅฐ„่€Œ็ ”ๅ‘็š„้ข่†œๆŠค่‚ค่พพไบบๆœ€ๅ–œ็ˆฑ็š„ไธ€ๆฌพ้ข่†œ\n435001 0 ๅ…จ้ƒจๅปบ็ญ‘ไปฅๆฌงๅผๅ’Œไฟ„็ฝ—ๆ–ฏ้ฃŽๆ ผไธบไธปไฝ“\n435002 0 ้‡ๅˆฐไธ€ไธชไผšremix็š„ๆœ‰ๆ‰่€็ˆธ\n435003 0 ็Žฐๅœจ็งฏๆžๆฒŸ้€šๆดพๅ‡บๆ‰€้‚ฃ่พนๆพๅฃไบ†\n435004 0 ๆ‰“ๅผ€็”ต่„‘ๅ›žๅฟ†ไธ€ไธ‹ๆˆ‘็š„้ƒจ้˜Ÿ็”Ÿๆดปโ€ฆไธคๅนดๆ—ถ้—ดๅพˆๅ……ๅฎž\n1000 Processed\n classify content\n435500 0 ๅˆšๅˆšๅพ—็Ÿฅxxๆœˆๆœ‰ๅ—ไบฌๅœบๅฐๅ…ดๅฅ‹ๅคด่„‘้ข„ไผฐไบ†ไธ€ไธ‹็ž’็€็ˆธๅฆˆ็œ้’ฑๆด—็œผ็š„ๅฏ่กŒๅบฆ\n435501 0 ๅคงไธญๅˆ้กถ็€ๅคช้˜ณๅŽปไบ†่บบๆˆฟไบงไบคๆ˜“ไธญๅฟƒ\n435502 0 ไปฅๅŽๅฏน็€็”ต่„‘ๅพ—่Š‚ๅˆถไบ›โ€ฆ่‚ฉ่†€ๅฅฝ็—›โ€ฆ\n435503 0 ๆˆ‘ไปฌๅŒป้™ขไปŠๅนด่ฟ›ไบ†ๅๅ‡ ไธชๅšๅฃซ\n435504 0 ้ƒฝๆœ‰ไธ€้ข—ๅ‘ๅพ€ๅคงๆตท็š„ๅฟƒ~่ฟ™ไธชๅคๅคฉ\n1000 Processed\n classify content\n436000 0 WhooๅŽๅ…จๅ›ฝ็ฌฌไธ€ๅฎถไฝ“้ชŒๅผไธ“ๆŸœ\n436001 0 ๅŠ ๅผบไฟก็”จ็›‘็ฎกไธญๅ›ฝ่ง„ๅˆ’ๅปบ่ฎพโ€œๅ…จๅ›ฝไธ€ๅผ ็ฝ‘โ€\n436002 0 ๆˆฟๅœฐไบงๅผ€ๅ‘ๅ…ฌๅธ็š„ๆณ•ไบบ่ฟ˜ๆ˜ฏ้‡‘ๅฐๆˆฟ็ฎกๆ‰€ๆ‰€้•ฟ\n436003 0 ๅ•็ฌ”ๅ……ๅ€ผๆปก100ๅ…ƒๅณ้€10ๅ…ƒ่ฏ่ดน\n436004 0 ไธ็”จๅฐ็ฑณๅŽไธบ็š„ๅนด้’ไบบๅฟƒๆ€ๆ˜ฏ่€็š„\n1000 Processed\n classify content\n436500 0 ็ป“ๆžœๅŒป็”Ÿๅฌไบ†ไธ‹่ƒŽๅฟƒๅฐฑๅผ€ไธ€ๅ 
†ๅ•ๅญๅซๆˆ‘็ผด่ดน\n436501 0 ็พŽๆ–นๅฏนๅพทๆ”ฟๅบœ้ƒจ้—จ็š„็›‘ๅฌๆดปๅŠจๅฐฑๅทฒ็ปๅผ€ๅง‹\n436502 0 ไปŠๅนดxๆœˆๆ‹Ÿๅฎš็š„ไธ€ไปฝ่ฝฏไปถๅ‡บๅฃ็›‘็ฎกๆณ•่ง„\n436503 0 ็›ฎๅ‰ๅฏๅŠจๅŒบ่ง„ๅˆ’้ข็งฏไธบ10ๅนณๆ–นๅ…ฌ้‡Œ\n436504 0 ๆญๅทžๅ‡ๆฒƒๅŒป้™ข็ฎก็†ๆœ‰้™ๅ…ฌๅธๆ‹›่˜ๅธ‚ๅœบๆŽจๅนฟไธป็ฎก\n1000 Processed\n classify content\n437000 0 ๆˆ‘ๅฎๆ„ฟ่ฟ™ๆ ท็š„็œŸ็›ธไธ€่พˆๅญ่ขซๆˆไธบ่ขซๆ—ถๅ…‰ๆŽฉๅŸ‹็š„็ง˜ๅฏ†\n437001 1 ๅฅฅๅŽๅŠฉๆ‚จๅผ€้—จ็บข๏ผ็บข๏ผ็บข๏ผๅ‡กxxxxๅนดxๆœˆxๆ—ฅๅ‰่ฟ›ๅบ—ๅฎขๆˆทๅ‡ๆœ‰็บขๅŒ…่ต ้€๏ผŒๆ›ดๆœ‰ๅฎš้‡‘ๅŒๅ€ๆŠตๆปก้ข้€ๆ™บ่ƒฝ...\n437002 0 ๅคงๅ…ณๆดพๅ‡บๆ‰€ๅฐ†ๆฝœๅ›žๅฎถไธญๅ–้’ฑ็š„ๅ€ชๆŸๆŠ“่Žทๅฝ’ๆกˆใ€ไพๆณ•ๅˆ‘ๆ‹˜\n437003 1 ใ€ๆƒ ่พพๆฑฝ่ฝฆใ€‘xๆœˆไผ˜ๆƒ ๆดปๅŠจ๏ผšๆฅๅบ—ๆ—ขๆœ‰ๅ…่ดนๅฎ‰ๅ…จๆฃ€ๆต‹ๆœๅŠก๏ผ›ไฟๅ…ปๆ›ดๆขๆœบๆฒน่ต ้€ๆœบๆฒนๆ ผไธ€ไธช๏ผ›่ดญไนฐ็ฒพๅ“ๆปกx...\n437004 0 celine้žฆ้Ÿ†ๅ†ๆทปๆ–ฐ่‰ฒ็ดฐ็ฏ€ไป€้บผ็š„้ƒฝๆฃ’ๆฃ’็š„ๆญค่‰ฒๆ›ดๆ˜ฏ็ง‹ๅ†ฌ่ชฟ่ชฟๅคงๆฐฃๅปไธๅผตๆšไฝŽ่ชฟๅˆ่€็œ‹ๅฅฝๆญ้…็š„้ก่‰ฒ...\n1000 Processed\n classify content\n437500 0 ๆ„Ÿ่ง‰ๅƒๅผบๅฅธ็Šฏ่ฏดๅผบๅฅธไธๆ˜ฏไธบไบ†ๆ€งๆฌฒ\n437501 0 ไบš้ฉฌ้€Šๅ…ฌๅธ่€—่ต„10ไบฟ็พŽๅ…ƒๆ”ถ่ดญไบ†่ง†้ข‘็ฝ‘็ซ™TwitchไปฅๅŠ ๅผบๆธธๆˆ่ง†้ข‘็›ธๅ…ณไธšๅŠก\n437502 0 ไธบไป€ไนˆ่ฆ้€ๅฅนๅŽปๅŒป้™ขโ€่ฏฅไผ‘็Ÿฃ\n437503 0 ไฝ›ๅฑฑๅธ‚็ฆ…ๅŸŽๅŒบโ€”ๅผ€ๅฟƒๅคง่ฏๆˆฟๆพๅ‡คๅˆ†ๅบ—\n437504 0 ๅฝ“ๆœˆ็š„ๆŒ‡ๆ ‡ๅ‡บๆฅๅŽ่ฟ˜ๆ˜ฏๆๅ‰ๅฐฑๅกซๅฅฝๅ‘ข\n1000 Processed\n classify content\n438000 0 ไธŠๆตทๅŽ็พŽๅŒป็–—็พŽๅฎนๅŒป้™ข็š„ๅถไธฝ่ๅŒป็”Ÿๆ€Žไนˆๆ ท\n438001 0 ไธŠไบ†ไธ€ๅคฉ็š„่ฏพ็ปˆไบŽ็†ฌๅˆฐๆ”พๅญฆ็ซ‹้ฉฌ่ตถๅŽป็œ‹\n438002 0 ่ฃๅˆคๆˆ‘ๅ„ฟไนŸๅฅฝ่ฟ™ไบ›้ƒฝๆ˜ฏไธ€็งไผ ๆ‰ฟ\n438003 0 ๆƒ ๆ™ฎ้‡็‚นไธบไธญๅ›ฝๅŒป็–—ๆœบๆž„ๆไพ›ไธค็งๆ–นไพฟ\n438004 0 7ๆœˆๅนฟๅทžไธญๅฟƒๅ…ญๅŒบไบŒๆ‰‹ไฝๅฎ…ไบคๆ˜“ๆถจๅน…ๅพฎๅผฑไธ่ถณ1%\n1000 Processed\n classify content\n438500 0 ๅŽŸๆๆ–™ๆˆ˜็•ฅๆ€งๆŠ•่ต„ๅ‘จๆœŸ้™ไธด\n438501 0 ไนŸๆ˜ฏ่ขซLPๅ—ไบฌๆผ”ๅ”ฑไผš็š„่ฏ„่ฎบๅผ„้†‰ไบ†โ€ฆ1ไธชๅŠๅฐๆ—ถๅซŒ็Ÿญ\n438502 0 ่ฎค่ฏไฟกๆฏไธบโ€œๆฑŸ่‹็œๆตท้—จๅธ‚ๅ…ฌๅฎ‰ๅฑ€ๆฐ‘่ญฆโ€\n438503 0 ๆ—ฅๆœฌไธ€ๆžถๅฐๅž‹้ฃžๆœบ่ตท้ฃžๅŽไป…ไธๅˆฐxๅˆ†้’Ÿไพฟๅ ่ฝ่‡ณไธœไบฌ้ƒฝ่ฐƒๅธƒๅธ‚ๆฐ‘ๅฎ…\n438504 0 
ไบบๅงๅœจๆพๆ‡ˆ็š„ๆ—ถๅ€™ๆŠฌไธชๅคด้ƒฝไผšๅ‘็Žฐ่ฟžไธชๅœฐ้“้ƒฝๅœจ็ป™ไฝ ๆ•ฒ่ญฆ้’Ÿๆœ‰็‚นๆƒณๅ‘•\n1000 Processed\n classify content\n439000 0 ๅ†ๆŒ‡ไฝฟๆ‰‹ไธญๅฅณๆ€ง็”จBๅŽป่ดฟ่ต‚CๆขD\n439001 0 ไปŠๅนดไธ€ๅฎš่ฆๅŽปๅ—ไบฌ็œ‹ๆŽๅฟ—็š„่ทจๅนด\n439002 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ jz4r5pไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n439003 0 ๆ˜ฅๆ™–่ทฏ175ๅท็š„ๅบ—้“บ็›ฎๅ‰ๆญฃๅœจ่ฃ…ไฟฎไธญ\n439004 0 ๆœ€่ฟ‘ๆœ€็ˆฑๆˆ‘็š„ๅฐฑๆ˜ฏๅ„ไฟ้™ฉๅ…ฌๅธไบ†\n1000 Processed\n classify content\n439500 0 metrostationๅœฐ้“็ซ™\n439501 0 ็œๆ”ฟๅบœ้ƒจ้—จๅ’Œๅธ‚ๆ”ฟๅบœ้ƒจ้—จ้ƒฝๆŸฅๅ‡บไธๅฐ‘็š„่ดชๅฎ˜ๆฑกๅ\n439502 0 ไธญๅ›ฝๅปบ็ญ‘็ฌฌไบŒๅทฅ็จ‹ๅฑ€ๆœ‰้™ๅ…ฌๅธๅ…ฌๅธ็ฎ€ไป‹\n439503 0 ๆฐพๆฐด้•‡2015ๅนดๅค็ง‹ๅญฃๅพๅ…ตไฝ“ๆฃ€ๅทฅไฝœๆญฃๅผๅฏๅŠจ\n439504 0 ่ฏทไธ่ฆ่ขซไธญๅ›ฝ้‚ฃๅ‡ ไฝ่ฝฌๅŸบๅ› ๅคง่…•้™ขๅฃซไธ“ๅฎถๅŒ…ๆ‹ฌๅ†œไธš้ƒจๆญฃๅ‰ฏ้ƒจ้•ฟ่’™ไฝ็œผ\n1000 Processed\n classify content\n440000 0 ๆˆ‘ๅช่ƒฝ่ฏด้‡ไธŠxxxไธ€ๆ ท็š„่ฃๅˆค\n440001 0 ๆˆ‘ๅ‘่กจไบ†ๆ–‡็ซ โ€”โ€”ๆฑ‚่Œ็ขฐไธŠโ€œ้ป‘ไธญไป‹โ€ๆ€ŽไนˆๅŠž\n440002 0 ๆฅ่‡ชๆฑŸ่‹็œ็š„ๆ”ถๅ‰ฒๆœบ่ฝฆ้˜Ÿๅˆ†ๅธƒๅœจ้’ๅŽŸๅŒบ็š„ๅ„ๆก้“่ทฏ\n440003 0 ็€šๆฒƒ็‹ฌๅฎถ้‡‡็”จๅ…จ็ƒ500ๅผบๆ— ็”ฒ้†›+ๅˆ†่งฃ็”ฒ้†›็š„็Žฏไฟๆ–ฐ่พ…ๆ–™ๅ•ฆ\n440004 0 ไปฅๅ‰้ƒฝ่ฟ˜่ฆไธ€็›ดๅœจ็”ต่„‘ไธŠ็ฟป็ฟป็ฟป็ฟป็ฟป็ฟป\n1000 Processed\n classify content\n440500 0 ๆˆ‘ๅŽฟ้ฃŸ่ฏ็›‘ๅฑ€่ฟ›่กŒไบ†็”ตๆขฏๅฎ‰ๅ…จๅคงๆฃ€ๆŸฅ\n440501 0 xxๅนดๅ…ˆๆๅ‡บ็š„่งฃ็บฆไฝ ไปฌๆ–นๆ—ข็„ถๅŒๆ„้‚ฃ่ฟ˜่ทŸๆˆ‘ๅฎถๆœ‰ไป€ไนˆๅ…ณ็ณป\n440502 1 ไฝณ่ด่‰พ็‰น็พŠๅฅถ็ฒ‰xๆœˆxๅทๅŒๅ€็งฏๅˆ†ๅ“ฆ๏ผ่ดญไนฐxๅฌๅฐฑๅฏไปฅๅ…‘ๆข็›ธๅŒ็š„ไบงๅ“xๅฌไบ†๏ผๅฆๆœ‰ๅฅฝ็คผ็›ธ้€๏ผๅ—้—จไปฅๆ’ๆฏๅฉดๅบ—\n440503 0 ่ฎค่ฏไฟกๆฏไธบโ€œ่…พ่ฎฏๆ–‡ๅญฆไฝœๅฎถโ€\n440504 0 ๅƒไธญ่ฏ็š„ๆ—ฅๅญไธ่ƒฝๅƒๅ‡‰ไธ่ƒฝๅƒ่พฃ\n1000 Processed\n classify content\n441000 0 comๆ— ้”กๅ”ฏไธ€ไธ“ๆณจ่ฎพ่ฎกๅŸน่ฎญ็š„ๆƒๅจๆœบๆž„CGๆ’็”ป่ฎฒๅธˆ๏ผš8ๅนดCGๆ’็”ป็ป้ชŒ\n441001 0 10086ๅ‘Š่ฏ‰ๆˆ‘็ณป็ปŸๅ‡็บงๆฒกๅŠžๆณ•ๅธฎๆณ•ๆˆ‘ๅผ€้€š\n441002 0 2015ๅ…จ็ƒๆฏ”ๅŸบๅฐผๅฐๅงไธญๅ›ฝๅคง่ต›ๅผ€่ต›ๆœ€ๅฐ16ๅฒ\n441003 0 ๅฅฝๅƒๆœ‰ๅพˆๅคšgnๅ› ไธบๆœ‰ไบ‹ๆฒกๅŠžๆณ•ๆฅ\n441004 0 ๆต™ๆฑŸๆฏไธ€ไธ‡ไธชไบบไธญๅฐฑๆœ‰801ไธช่€ๆฟ\n1000 Processed\n classify 
content\n441500 1 ๆ˜ฅๆฅๆฐ”็ˆฝ๏ฟฅ่ฒธ็ฅžๅ‡บๅœบ๏ฟฅๅ…ƒๅฎตๅทฒ่ฟ‡๏ฟฅๅ—จ็ˆ†ๆทฑๅนฟ๏ผๆˆฟ$ๅฑ‹๏ฟฅ่ฒธ๏ฟฅๆญ€ไปŽๆ— ๅฏนๆ‰‹ใ€ๅนดๅŒ–ไฝŽ่‡ณx/ๅŽ˜ใ€้ซ˜่ฏ„้ซ˜$่ฒธ....\n441501 0 ๅฅนๅฐ†ๅ‰ๅพ€ๅ“ˆๅฐ”ๆปจๅ‚ๅŠ 7ๆœˆ23ๆ—ฅๅผ€ๅน•็š„2015ๅนดๅ…จๅ›ฝ้€Ÿๅบฆ่ฝฎๆป‘้”ฆๆ ‡่ต›\n441502 0 ๅŽŸไปทxxxใ€xxx็š„้’ฑๅŒ…็Žฐๅœจๅชๅ–xxๅ…ƒ\n441503 0 ไบฌๆดฅ็ฟผใ€ไฝ“่‚ฒใ€ๆฐดๅŠก็ญ‰ๆฟๅ—่ทŒๅน…ๅฑ…ๅ‰\n441504 0 ๆ˜ฏๅฐ†่ฑ†่ง’ใ€็Œช่‚‰ใ€ๅœŸ่ฑ†ใ€่ฅฟ็บขๆŸฟใ€่Œ„ๅญไพๆฌกๅ…ฅ้”…\n1000 Processed\n classify content\n442000 0 ไปŠๅคฉๅฌๅŒป็”Ÿ่ฏด้™คไบ†็”Ÿ็Œช่‚‰ใ€็”Ÿ็‰›่‚‰ใ€้ฒจ้ฑผใ€ไธ‰ๆ–‡้ฑผไธ่ƒฝๅƒๅ…ถไป–้ƒฝๆฒก้—ฎ้ข˜\n442001 0 ๅ•ๅœบๅพ—ๅˆ†่ถ…่ฟ‡100ๅˆ†็š„ๅนถไธๅœจๅฐ‘ๆ•ฐ\n442002 0 ไธญๅ›ฝๅคฉๆ–‡ๅญฆไผšxxxxๅนดๅบฆๅซๆ˜Ÿๆฟ€ๅ…‰ๆต‹่ทๆŠ€ๆœฏไธŽๅบ”็”จ็ ”่ฎจไผšๅœจไบ‘ๅ—ๆพ„ๆฑŸๅฌๅผ€\n442003 0 ๅทžๆณ•้™ขไปฅๆ•ดๆฒปๅ‘ๆ”นๆกˆไปถไธบไพง้‡็‚นๅ’Œ็ช็ ดๅฃ\n442004 0 6ๆœˆไปฝ็š„่ดน็”จๅทฒๅœจ7ๆœˆ24ๅท็š„ๆ—ถๅ€™็ผด็บณ\n1000 Processed\n classify content\n442500 0 ๆฑ‚้—ฎ๏ฝžๅŽปๅ“ชๅŠžๆ‰ฌๅทžๆ™ฏๅŒบ่€ๅนดๅก\n442501 0 ่™ฝ็„ถ้ƒฝๅœจๆฑŸ่‹่ฟ˜ๆ˜ฏ่ฆxxxๅคšๅ…ฌ้‡Œ็š„\n442502 0 ้™„่ฟ‘ๅฑ…ๆฐ‘่ฏฏไปฅไธบๆ˜ฏๅ‘็”ŸๅผบๅฅธๆกˆๆŠฅ่ญฆ\n442503 0 ่ฏดไธๅฎš่ฟ˜่ƒฝ้‡่ง็ป™ไธช50็ป™ๆˆ‘ๅ”ฑไธชๅฐๆ›ฒๅ‘ข\n442504 0 ไบŽไปŠๅคฉ็™ฝๅคฉๅœจๆต™ๆฑŸๆธฉๅฒญ่‡ณ่ˆŸๅฑฑไธ€ๅธฆๆฒฟๆตท็™ป้™†\n1000 Processed\n classify content\n443000 1 ไฝ ๅฅฝๅง ๆˆ‘ๆ˜ฏๆผฏๆฒณๆ–ฐ็Ž›็‰น็މๅ…ฐๆฒนๅฐๅผ  ๅ’ฑไปฌไธ‰ๅ…ซ่Š‚ๆดปๅŠจๆ˜Žๅคฉๅผ€ๅง‹ไบ† ๆปกxxxๅ‡xxๆปกxxxๅ‡xxx...\n443001 0 ไธบไฝ•่ฟ˜ไบง็”Ÿไบ†30ๅคšๅ…ƒ็š„ๆผซๆธธ้€š่ฏ่ดน็”จ\n443002 0 ็šฎ่‚ค็ฒ—็ณ™็š„ๅŽŸๅ› ไธ€๏ผšๅœจๅนฒ็‡ฅ็š„ๅ†ฌๅญฃ\n443003 0 ๆฑŸ่‹็œๅŽ…ไธ“ๅฎถ็ป„่ฐƒ็ ”ๅฎฟ่ฟๆ•ฐๅญ—ๅŒ–ๅŸŽ็ฎกๅทฅไฝœ\n443004 0 nkxxxๅ’Œmonxxxๅž‹็މ็ฑณ่ฝฌๅŸบๅ› ็މ็ฑณ็ญ‰\n1000 Processed\n classify content\n443500 0 ็”ต่ง†ๅฐ่ขซๆˆ‘ไปฌๆ‰ฟๅŒ…ไบ†ๆˆ‘่ฐ้ƒฝไธ็บฆ\n443501 0 ้€ƒๅŽปๅ—ไบฌๅš็‰ฉ้ฆ†ๆŠŠๅ‰ฉไธ‹็š„้ƒจๅˆ†็œ‹ๅฎŒ\n443502 0 MININSCE2015็”จๆฟ€ๆƒ…\n443503 0 ็ฆปๅผ€้ฉฌ็ดฏ็š„ๆ—ถๅ€™้ฃžๆœบไธŠๆ‹ๅˆฐไบ†ไบ‘ไธญๅฝฉ่™น\n443504 0 ้ข„่ฎกxๆœˆๅบ•่‡ณxxๆœˆๅˆ่ƒฝๅผ€ๅ›ญ่ฟŽๅฎข\n1000 Processed\n classify content\n444000 0 ๅฐ้ข่ดทๆฌพไฟ่ฏไฟ้™ฉๆ”ฏๆŒ็š„่ดทๆฌพๅฏน่ฑกไธบไธ‰็ฑปไบบ็พค\n444001 0 
่ฟ™ไบ›ๅปบ็ญ‘่ฎพ่ฎก่กŒไธš็š„ๆฝœ่ง„ๅˆ™ๅคชๅฏๆ€•\n444002 0 ๅ‘Š่ฏ‰ไฝ ไปฌไธ€ไธช็ง˜ๅฏ†๏ผšๆด—่„ธ็š„ๆ—ถๅ€™\n444003 0 7ๅท็บฟๅนฟๆธ ้—จๅˆฐไน้พ™ๅฑฑๆฒฟ็บฟไธŠ่”้€šๆ€Žไนˆๆฒกๆœ‰ๆ‰‹ๆœบไฟกๅท\n444004 0 ๅŠจๆผซPSPๆ—…ๆธธไพตๆฒกไบ‹็žŽๆƒณ็š„ๅŒ้ฑผ\n1000 Processed\n classify content\n444500 1 ๆดปๅŠจใ€‚้€ไฝ ไธ€ๆฌกๅฅๅบท็พŽไธฝไน‹ๆ—…ใ€‚ๆดปๅŠจๆœŸ้—ดๅ››้ขๅฑฑๆ™ฏๅŒบๅฐ†ๅฏนๆ‰€ๆœ‰ๅฅณๆ€งๆœ‹ๅ‹ๅฎž่กŒๅ…่ดนๅผ€ๆ”พ๏ผŒ้™ชๅŒ็”ทไผดๅช้œ€xx...\n444501 0 ๅทด่ฅฟๅ‰ๅ›ฝ่„šๅผบๅฅธๅฅณๆ˜Ÿ่ขซๅˆคๅ…ฅ็‹ฑ32ไธชๆœˆ\n444502 0 ๅฎฟๅŸŽๅŒบๅŠณๅŠจ่ฅฟๅททๅฎนๅจœ่‰ฒๅฝฉ้—จๅฃๆฒฟ่ก—ๆ™พๆ™’\n444503 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ้ƒฝๆ˜ฏๅœŸ่ฑช็Žฉ็š„ๆธธๆˆ\n444504 0 ่ฎธๅคšๆ–ฐๆฌพๆ‰‹ๆœบๅค–ๅฃณ้…ไปถไป€ไนˆ็š„ๆˆ‘้ƒฝๆ˜ฏๅœจๅพฎไฟกๆ›ดๆ–ฐ็š„\n1000 Processed\n classify content\n445000 0 ไฝ ๆ‰€ๆ‰พๅˆฐ็š„ๅฐ้ฅฐๅ“ๅบ—่ฃ…ไฟฎๅ›พ็‰‡ๅˆๆ˜ฏไป€ไนˆๆ ท\n445001 0 ่ฟ™ไธชๆœฌไธŠๅŸบๆœฌ้ƒฝๆ˜ฏ้€†่ฝฌ่ฃๅˆคๅ’ŒFFไนŸๆ˜ฏ่›ฎๆ‹ผโ€ฆโ€ฆ\n445002 0 ่ดผTheboytoldmethatIwasathiefwhostolehismemory\n445003 0 ๅ‰่€…่ฝฎๅฅธๅŠ ๆ‹ๅ–ๅฏ็›ผๆญปๅˆ‘ๅˆฐๆ— ๆœŸ็š„่ฏ่ฟฝ่ฏ‰ๆ—ถๆ•ˆๅบ”่ฏฅๆฒก่ฟ‡ไฝ†ๆŠ“ไบบไผฐ่ฎกๆœ‰ๅ›ฐ้šพ\n445004 0 ๅฐฑๆ˜ฏๆˆ‘ๅฎถ็š„็บฏๅคฉ็„ถๆžœๆฑFcup??\n1000 Processed\n classify content\n445500 0 ๅ› ไธบๆ€ปๆŠŠๆถฒไฝ“้˜ฒๆ™’้œœไธๅฐๅฟƒ่นญๅˆฐ่กฃๆœไธŠ\n445501 0 ็š–BH9785็™ฝ่‰ฒ้•ฟๅฎ‰ๅฐๅž‹ๆฑฝ่ฝฆ้€ๅพ€่Šœๆน–ๅธ‚ๆŠขๆ•‘\n445502 0 2015ๅนดๆตๅนดไธบไบŒ้ป‘็—…็ฌฆๆ˜Ÿๅ…ฅๅฎซ\n445503 0 8ๆœˆ2ๅท้ฃžๆœบ้Ÿฉๅ›ฝๆ–ฐ็ฝ—ๅ…็จŽๅบ—ไบบ่‚‰่ƒŒๅ›ž\n445504 0 makeupforeverHDๆ•ฃ็ฒ‰\n1000 Processed\n classify content\n446000 0 ้€‚ๅˆ12~60ๅฒ็š„ๆ‰€ๆœ‰ๅฅ่บซไบบ็พค\n446001 0 ๆœฌ้—จไปŽๆ˜†ๅฑฑๅฎš่ฟœๅฟซๅˆ€ๅธฎๅคบๅพ—ๅŒ–ๅŠŸๅคงๆณ•ๆฎ‹็ซ ไบŒ\n446002 0 ็›ฎๅ‰A่‚กไธญ็ŸณๅŒ–ๅ…ทๅค‡ไบ†ๅทจๅคง็š„ไปทๅ€ผๆŠ•่ต„ๆœบไผš\n446003 0 ๅˆšๅˆฐ็š„็Žฐ่ดง๏ผšDHCๅ”‡่†ใ€DHC็˜ฆ่…ฟไธธใ€ๆ–ฐ่ฐท้…ต็ด ๅŠ ๅผบ็‰ˆใ€่Šฑ็Ž‹็œผ็ฝฉ\n446004 0 ่ฏ„ไผฐๅˆฐไผฆๆ•ฆๅคงๅญฆๅ›ฝ็Ž‹ๅญฆ้™ข็š„ๅ…ฅๅญฆ็އ?\n1000 Processed\n classify content\n446500 0 ็™พๅบฆๆไพ›ๅ†…ๅฎนใ€ๆŠ€ๆœฏใ€ไบบๅŠ›ใ€่ฟ่ฅ่ต„ๆบ่ฟ›่กŒ็‹ฌๅฎถ่ฟ่ฅ\n446501 0 ไปŽxๆœˆๅˆๅฐฑๆŠขไฝ ไปฌ้‚ฃไธชๅŒ—ไบฌๅœฐๅŒบ็š„็บขๅŒ…\n446502 0 ไนŸๆ˜ฏไฟ„โ€œxxxxๅ›ฝ้™…ๅ†›ไบ‹ๆฏ”่ต›โ€็š„ๅผ€ๅน•ๅผ\n446503 1 ๆŒ็ 
ไธป๏ผŒxx.xx.xx.้˜ฒ.xx.xx.xx.xx.xx.xx.xx.xx.ๅฎถ่‚–ไธญ๏ผŒ\n446504 0 ๅ…ฌๅธ่‚ก็ฅจๅฐ†ไบŽ7ๆœˆ13ๆ—ฅ่ตทๅค็‰Œ\n1000 Processed\n classify content\n447000 1 ไบฒ็ˆฑ็š„ๅง๏ผŒๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏWHOOๅŽไธ“ๆŸœ็š„็พŽๅฎน้กพ้—ฎไปปๅ›ญใ€‚็Žฐๅœจๆˆ‘ไปฌๅ•†ๅœบๆžโ€œไธ‰ๅ…ซโ€ๅฆ‡ๅฅณ่Š‚็š„ๆดปๅŠจ๏ผŒ็Žฐๅœจๆˆ‘...\n447001 0 ๅฏ่ƒฝไบŽxxๆ—ฅๆ—ฉๆ™จไปŽๆˆ‘็œไธœๅ—้ƒจๅ…ฅๅขƒ\n447002 0 ๆ‰็Ÿฅ้“่Šฑๅƒ้ชจไนŸๆ˜ฏไธš็•Œ่‰ฏๅฟƒไบ†\n447003 0 ๆ‰€ไปฅๅญๅฎซ็š„ไฟๅ…ปๅฐฑๆ˜พๅพ—ๆฏ”่บซๆๅ’Œ็šฎ่‚ค้‡่ฆๅญๅฎซไฟๅ…ปๅ“็ญ‘็พŽ้›ช่Žฒ่ดดไฝ ๆœ€ๆญฃ็กฎ็š„้€‰ๆ‹ฉ\n447004 0 ไปŠๅคฉๆ˜ฏๆน–ๅŒ—่†ๅทž็”ตๆขฏไบ‹ไปถๆญป่€…็š„ๅคดไธƒ\n1000 Processed\n classify content\n447500 0 ๅŽๆœŸๅฐ†่ฝฌๅ˜ไธบไฝๅŒบ็š„ไผšๆ‰€ๅ•†ไธš\n447501 0 ๅฎ‰ๅ“ๆ‰‹ๆœบๅฏไปฅไธ‹ไธ€ไธช็ฉบ่ฐƒ็ฒพ็ต\n447502 0 3ใ€็”จๅ†ฐๅ†ป่ฟ‡็š„ๅŒ–ๅฆ†ๆฐดๆ‹่„ธ่›‹\n447503 1 ไบฒ๏ผŒๆ™šไธŠๅฅฝ๏ผŒๆทฑๅ—ๅคฉ่™นๅทง่ฟชๅฐšๆƒ ็ฅไฝ ๅ…ƒๅฎตๅฟซไน๏ผ xๆœˆxๆ—ฅ่‡ณxๆœˆxๆ—ฅๆดปๅŠจๅผ€ๅง‹ๅ•ฆ๏ผๅ…จๅœบxไปถๅณๅฏไบซๅ—x...\n447504 0 ๅฝ“ๆ™šๆœ‰ๆตท้—จๅธ‚ๆฎ‹็–พไบบ่”ๅˆไผš็ป„็ป‡\n1000 Processed\n classify content\n448000 0 ๅ’Œๆˆ‘ๅ›ฝๅคๅปบ็ญ‘ไธญ็š„ๅŠ›ๅฃซ้›•ๅƒๅฏนๆฏ”\n448001 0 ไน่ง†็š„1080pๆฐๆฐ่ฏดๆ˜Žๅฎƒๆ˜ฏ่ฝฌ็ ็š„\n448002 0 ๅ“ˆๅฐผ่ฏๆตดๅฏปๆ นไน‹ๆ—…่ตฐ่ฟ›้˜ฟ่€…็ง‘ๅคๆ‘่ฝ\n448003 1 <ๆ–ฐ - ่‘ก - ไบฌ>๏ผŒ้ฆ– ๆฌก ๅญ˜ ็ซ‹ ้€ . 
xoo%๏ผŒๅœฐๅ€๏ผšx x x x x x...\n448004 0 ้ซ˜ๅ“่ดจๅ‡€่œๅทฒๆˆไธบๆฌง็พŽ็ญ‰ๅ‘่พพๅ›ฝๅฎถ่”ฌ่œๆถˆ่ดน็š„ไธปๆต\n1000 Processed\n classify content\n448500 0 ่ฎพ่ฎก๏ผšramonaenache\n448501 0 ็†Ÿๆ‚‰็š„่ฃ…ๆฝขไผผๆ›พ็›ธ่ฏ†็š„BGMๅ’Œๅ‘ณ้“\n448502 0 ่ฟ™ๅฐฑไผš้€ ๆˆๅคงๅฎถๅœจๆ‰พSEOไผ˜ๅŒ–ๅ…ฌๅธ็š„ๆ—ถๅ€™ๆฏ”่พƒๅ›ฐๆƒ‘\n448503 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅทด็‰นๅˆฉๆฉฑๆŸœๅฏผ่ดญ๏ผŒxๆœˆxๆ—ฅ-xๆœˆxxๆ—ฅๆˆ‘ไปฌๅทด็‰นๅˆฉๅฎžๆœจๅฎถๅ…ทๅฎšๅˆถๆฉฑๆŸœ๏ผŒๅŽ‚ๅฎถๅคงๆ”พไปท๏ผŒไปทๆ ผไฟ...\n448504 0 ๅคฉๅ‘ๅˆšๅˆš็œ‹ๅˆฐๅฆ™่„†่ง’็š„ๅนฟๅ‘Šๅ“ๅพ—ๆˆ‘ๆ‰‹ๆœบๅทฎ็‚นๆމไบ†\n1000 Processed\n classify content\n449000 0 ไฝ ่บซ่พน่ฟ˜ๆœ‰5ไธช้šๆ—ถๆ”ถไฝ ๅ‘ฝ็š„ไธœ่ฅฟ\n449001 0 ไธ€ไธชไบบๅฐฑๆ˜ฏไธ€ๅชๆผ‚ๆณŠ็š„้ฃŽ็ญๆ€ปๆ˜ฏ่ก”็€ไนกๆ„ๅœจๆททๆฒŒไธŽๅฟ™็ขŒไธญ\n449002 0 ๅ…ถๅฎž็ป™ไธชๅžƒๅœพๅพฎ่ฝฏ็š„ๅปบ่ฎฎ๏ผšๅฆ‚ๆžœๆ‰€ๆœ‰็š„wpๆ‰‹ๆœบ้ƒฝ่ƒฝๅ‡็บงๅˆฐwin10\n449003 0 xxxxๆฌพๆ–ฐ่ƒœ่พพๆ˜ฏ็ฌฌไธ‰ไปฃ่ƒœ่พพ่ฝฆๅž‹\n449004 0 ้ข„่ฎกๅฐ†ไบŽxxxxๅนดไธŠๅŠๅนดๅผ€ไธš\n1000 Processed\n classify content\n449500 0 ไปŠๅนดๅคงๅŠๅนดๆฒกๆœ‰ๅ‡บ้—จๆ—…ๆธธ่ฟ‡ไธ€ๆฌก\n449501 0 ๅœจamazonไนฐไบ†ไธชu็›˜ใ€ไธŠ้ข้ƒฝๆ˜ฏไธญๆ–‡่ฏดๆ˜Ž\n449502 0 โ€œitlittleprofitsthatanidleking\n449503 0 UtenaPuresไฝ‘ๅคฉๅ…ฐ่œ‚็Ž‹ๆต†็Žปๅฐฟ้…ธไฟๆนฟ้ข่†œๅˆๆฅๅ•ฆ\n449504 0 ็ป™ๅคงๅฎถๅˆ†ไบซ\"killmehealme\"\n1000 Processed\n classify content\n450000 0 ๅ› ไธบๅฎ˜ไฝ่ฟ˜ๅœจไปปไธŠๅ‰ๆœ‰่…่ดฅๆฒกๅ‘็Žฐ\n450001 0 ไป–ๆฏไธ€ๆฌกไธ้€ๆˆ‘้ฃžๆœบ็š„็†็”ฑ้ƒฝๆ˜ฏๆˆ‘ไธๆƒณ็œ‹็€ไฝ ๅ“ญ\n450002 0 ไฝ็ฝฎๅœจMONTEREYPARK\n450003 0 ไธญๅ›ฝๆˆฟๅœฐไบงๅธ‚ๅœบ้ขไธดๅๅˆ†ไธฅๅณป็š„่€ƒ้ชŒ\n450004 0 ็ŽฐๅœจๆŠ•่ต„ไธ€ๅŒ่ฟๅŠจ้ž‹ๆฏ”่ดญไนฐไธ€ๅŒ้ซ˜่ทŸ้ž‹ๆ›ดๅˆ’็ฎ—\n1000 Processed\n classify content\n450500 0 ้ป‘่‰ฒ๏ฝžๆ–ฐๆฌพAsh่ฟๅŠจ้ž‹้‡‡็”จ็ปๅ…ธ่ฟๅŠจ้ฃŽๆฌพๅผ่ฎพ่ฎก\n450501 0 ๅฎžไน ็”Ÿๅคชๅฅฝ็œ‹ไบ†้ƒ‘ๆบ็š„่กŒไธบๅธ…ๅˆฐไธ่กŒ\n450502 1 ๆˆ‘่กŒๆœ€ๆ–ฐๆŽจๅ‡บๅนธ็ฆๆ˜“่ดทไฟก็”จ่ดทๆฌพ๏ผŒๆ— ้œ€ๆŠตๆŠผๆ‹…ไฟ๏ผŒๆ‰‹็ปญ็ฎ€ไพฟ๏ผŒๅฎกๆ‰นๅฟซๆท๏ผŒๅนด็ปผๅˆๆˆๆœฌx.xx%ๅทฆๅณ๏ผŒ่ฏฆ...\n450503 0 xๅˆ†xx็ง’ๅผ€ๅง‹ๆœ‰~ๅŽ้ขๆމไธ‹ๆฅ็š„ไธœ่ฅฟๆ˜ฏไป€ไนˆ\n450504 0 ๅœจๅคๆ—ฅ้‡็‡ƒ่ฏปไนฆ็š„็ƒญๆƒ…~ๆœ€ๆ–ฐๅฅฝไนฆ\n1000 Processed\n classify content\n451000 1 
ๆ–ฐๆ˜ฅๅฅฝ๏ผๆˆ‘ๆ˜ฏๅˆ่‚ฅๆถฆๅฎ‰ๅฐๅˆทๅŽ‚๏ผŒๆœฌๅŽ‚ๆ‰ฟๆŽฅ้ป‘็™ฝๅฝฉ่‰ฒๅฐๅˆท๏ผŒๅ‘่ดงๅŠๆ—ถไปทๆ ผๅฅฝ่ฏด๏ผŒๅฆ‚้œ€ๅฏ่”็ณปqqxxxxx...\n451001 0 ๆน˜ๆฝญไธญ้™ขๅฏนๅ‘็”Ÿๅœจ12ๅนดๅ‰็š„โ€œๆน˜ๆฝญๅคงๅญฆ็ ”็ฉถ็”Ÿๆ€ไบบๆกˆโ€ไธ€ๅฎกๅฎฃๅˆค๏ผšๅˆคๅ†ณ่ขซๅ‘Šไบบๆ›พ็ˆฑไบ‘ๆ— ็ฝช\n451002 0 ไป–ไปฌๆฅๅˆฐๆต™ๆฑŸ็พŽๅคงๆ€ป้ƒจๅ‚ๅŠ ่ถ…็บงๅ›ข่ดญไผš\n451003 0 ๆˆ‘่ง‰ๅพ—ๆˆ‘ๅฏ่ƒฝๆœ‰็ฒพ็ฅž็—…่ขซๅฎณๅฆ„ๆƒณ็—‡็„ฆ่™‘็—‡่ฟ˜ๆœ‰่ฝปๅพฎ็ฒพ็ฅžๅˆ†่ฃ‚ๅ’ŒๆšดๅŠ›ๅ€พๅ‘ๆฑ‚ๆŽจ่ๅฟƒ็†ๅŒป็”Ÿ\n451004 0 ็งฐ็ฆปWindowsXP้€€ไผ‘่ฟ˜ๆœ‰95ๅคฉ\n1000 Processed\n classify content\n451500 0 1000ไปฅไธŠ่€…ๅ‡้€ๅ”ค้†’ๅฅ่บซ่‡ณๅฐŠVIPๅกไธ€ๅผ ๅ–”\n451501 1 ๅŒ—้ผป๏ผŒๅ“่€Œ็พŽๅฅณไบบๆ˜ฅๅ…‰ไนๆณ„๏ผŒ่ฟ›้™ข็บขๅŒ…xx-xxxๅ…ƒๆ‹ผๆ‰‹ๆฐ”๏ผŒๅ…่ดน่„ฑๆฏ›ไธๆป‘่ฟŽๅคใ€‚ๆถˆ่ดนๅฐฑๆœ‰่ฑช็คผ๏ผŒ็ˆฑ็–ฏ...\n451502 0 ๆˆ‘ๆ˜ŽๅคฉไธŠๅˆ่ฟ˜่ฆๅ้ฃžๆœบๅ›žๅทซๆบช\n451503 0 ๆต™ๆฑŸ้‡‘ๅŽๅ‘็”Ÿไบ†ไธ€่ตทๅ€’่ฝฆๅผ•ๅ‘็š„่ฝฆ็ฅธ\n451504 0 ๅธธๅทžๅธ‚่ฅฟ็ป•ๅŸŽ้ซ˜้€ŸๆฑŸๅฎœๆ–นๅ‘่ฝฌๆญฆ่ฟ›็ปๅ‘ๅŒบๅŒ้“ๆ ‡็บฟๆ–ฝๅทฅ็ป“ๆŸ\n1000 Processed\n classify content\n452000 0 ๆ‘ฉ็พฏๅบง๏ผšโ˜…โ˜…ๆฑŸๆœ€็พŽไน‹ๆต™ๆฑŸๅ…ญๅคงๅฒ›ๅฑฟ\n452001 0 ็Ž‹ๆž—่ขซๆŒ‡ๆ‰ฟ่ฏบโ€œๅˆค้‚นๅ‹‡ๆญปๅˆ‘้…ฌ่ฐข500ไธ‡โ€\n452002 0 ๅœจไป–100ๅฒๆ—ถๆ›พๅ› ๅœจไธญๅŒปไธญ่ฏๆ–น้ข็š„ๆฐๅ‡บๆˆๅฐฑ่Žทๆ”ฟๅบœ็š„็‰นๅˆซๅฅ–ๅŠฑ\n452003 0 ๅพˆๆ˜Žๆ˜พ7ๆœˆ็š„ๆœ€ๅŽไธ€ๅ‘จๅชๅ‰ฉไธ‹ๆฌกๆ–ฐ่‚ก\n452004 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏไธœไบŒ็Žฏ็™พ็››็€่Žฑ้›…ไธ“ๆŸœ็š„็พŽๅฎน้กพ้—ฎ๏ผŒไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅˆฐๆฅ็€่Žฑ้›…ๆ–ฐๅ“ๆฐดx.xๆŠ˜๏ผŒ็‰นๆƒ ...\n1000 Processed\n classify content\n452500 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆท๏ผŒ้ฆ–ๅ…ˆๆˆ‘ไปฃ่กจๅพก้‚ฆๆ„Ÿ่ฐขๆ‚จไธ€็›ดไปฅๆฅ็š„ๆ”ฏๆŒ๏ผŒๆœฌๅบ—ๅฐ†ๅœจๆ˜ฅ่Š‚ไธพๅŠžๅ…จๅœบ้…ฌๅฎพๆดปๅŠจ๏ผŒๆถˆ่ดนๆปกxxxๅ…ƒ...\n452501 0 ๅ—ไบฌ็ต่ฐทๅฏบไธŠไธ‡่ค็ซ่™ซ้ฃž่ˆž็พŽๅฆ‚็นๆ˜Ÿโ€”โ€”ๅœจๅ—ไบฌ็ดซ้‡‘ๅฑฑไธŠ\n452502 1 ๆœฌๅบ—ๅบ†็พŽไธฝx.xๅฆ‡ๅฅณ่Š‚๏ผŒๆทปๅงฟๅŒ–ๅฆ†ๅ“ๅบ—ๅ…จๅœบไนฐๆปกxxxๅ…ƒ้€xxxๅ…ƒ่š•ไธ้ข่†œ๏ผŒๆดปๅŠจๆ—ถ้—ดx.xโ€•โ€•x...\n452503 0 ๅ›ด็ป•ๅธ‚ๅง”ใ€ๅธ‚ๆ”ฟๅบœ็š„ๅทฅไฝœ้‡็‚น\n452504 0 ๆˆ‘ๆƒ…ๆ„ฟๆไธชๆ— ๅ›ฝ็•ŒๅŒป็”Ÿๆˆ–ๅธŒๆœ›ๅฐๅญฆ\n1000 Processed\n classify content\n453000 1 ใ€่Žฑ็‰นๅฆฎไธใ€‘ๆญฆๅนฟ่Žฑ็‰นๅฆฎไธไธ‰.ๅ…ซ้’œๆƒ 
:ๅณๆ—ฅ่ตท่‡ณx.xxๅ…จๅœบxxๅ…ƒ่ตท๏ผŒไฝŽ่‡ณx.xๆŠ˜๏ผŒ้™้‡็ปๅ…ธๆฌพๅ…จ...\n453001 1 ๅฏปๆ‰พ้กน็›ฎๅˆไฝœ๏ผŒ่ดขๅŠ›ๅ›ฝๅ†…ไธญๅฐๅž‹ไผไธšๅ‘ๅฑ•๏ผŒๆ–ฐๅปบใ€ๆ‰ฉๅปบใ€ๆ–ฐ้กน็›ฎ็ ”ๅ‘๏ผŒๅ†œไธšๅˆไฝœ๏ผŒๆ‰‹็ปญ็ฎ€ๅ•๏ผŒๅˆฐไฝๅฟซ.้™ˆ...\n453002 0 ๅœจPowerSystemsไธŠ\n453003 0 ่‡ณๅฐ‘ๅ…ˆ็ ธxxxx่ฌๅ…ƒโ€ฆโ€ฆ่ณ‡ๆบ็จ€็ผบใ€ๆฌŠๅŠ›ๅคฑ็ฏ„\n453004 0 ไฝๅˆซๅข…็š„ไธไธ€ๅฎšๅผ€ๅฅฝ่ฝฆ่œๅธ‚ๅœบไนฐ่œ็š„่€ๅคชๅคชๆฒกๅ‡†ๆ˜ฏๆŸไธช้ข†ๅฏผ็š„ๅฎถๅฑžไฝ ๆŽฅ่งฆ็š„ไบบๅคšไบ†ๅฐฑไผšๅ‘็Žฐ่ƒฝๅ’‹ๅ‘ผ็š„ไธไธ€...\n1000 Processed\n classify content\n453500 0 ไธŠ่ฟฐไธคไธชๅŸบ้‡‘6ๆœˆๅบ•็š„่‚ก็ฅจไป“ไฝๅพˆ่ฝป\n453501 0 ้ฆ–ๅ…ˆ๏ผšๆˆ‘ไปฌ่ฆๅœจๆกŒ้ขไธŠๆ–ฐๅปบไธ€ไธชๆ–‡ไปถๅคน\n453502 0 ๆ—ฅ็…ง็š„ๅ†œๆ‘็กฎๅฎžไธ่ƒฝ่ทŸๅ—ไบฌๆฏ”ๅ•Š\n453503 0 ็ฌฌๅ››ๅญฃไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ็Žฐๅœบๆผ”ๅ”ฑๅ‘จๆฐไผฆ็š„่‘—ๅไฝœ\n453504 0 ๅฎžๆ‹ๅจ่ˆช็‰นๅˆซ็‰ˆ็‹‚ๅฅ”380km/h\n1000 Processed\n classify content\n454000 0 ๅฐๅท้ ็€ๆŠ„่ขญๅ‡บๅ็Žฐๅœจ่ฝฌๆˆ˜็”ตๅฝฑๅœˆ\n454001 0 ๅ‘จ่‘ฃ่ฐˆๅด”ๅฅๅฐดๅฐฌไบ’ๅŠจๆŠ•็ฅจ๏ผšไฝ ๆ”ฏๆŒ่‰พๅ…‹ๆ‹œๅฐ”้‡่ฟ”ๅฅฝๅฃฐ้Ÿณๅ—\n454002 1 ๆ–ฐๅนดๅฅฝ๏ผๆญๅ–œๅ‘่ดข๏ผๆˆ‘ๆ˜ฏๅ›ๆˆๅ…ดๅŒ…่ฃ…ๅ™จๆ็š„ๅฐ้™ˆ๏ผŒๆˆ‘ๅธๅทฒๆญฃๅธธไธŠ็ญ๏ผŒๅœจๆ–ฐ็š„ไธ€ๅนด้‡ŒๅธŒๆœ›ๆ›ดๅฅฝ็š„ไธบๆ‚จๆœๅŠก๏ผ...\n454003 0 viaๆญฆๆฑ‰ๆ™šๆŠฅ~ๅ–ƒไปฌๆƒณ่ฏด็‚นๅ•ฅ\n454004 0 ไธ€่ตทๅ‡บ้—จ่ขซ่ฏดๅƒ่ขซๆ‹ๅธฆ/ๅฎถ้•ฟๅธฆๅฐๅญฉ\n1000 Processed\n classify content\n454500 1 ๅ„ไฝๆœ‹ๅ‹๏ผŒๆœฌ็Šฌ่ˆๆไพ›ไผ˜่ดจๆ—ฅ็ณป็บฏ็ง็ง‹็”ฐ็Šฌ๏ผˆ่ต›็บง๏ผ‰๏ผŒๅฎถๅ…ป๏ผๆœ‰้œ€่ฆ็š„ๆœ‹ๅ‹ๅฏ่”็ณปๅพฎไฟกxxxxxxxxx\n454501 1 ๆ™จๆ›ฆ็ฎ€่ฎฏ:ๆจๆธ…ๅฟ ๆ•™ๆŽˆๅ’Œๅถไปฒๆณ‰ๆ•™ๆŽˆๅฐ†ๅœจ้‡ๅธˆไผ ๆŽˆ่€ƒ็ ”ๅ„็ง‘ๅคไน ็š„็‹ฌๅฎถ็ง˜็ฌˆ ๅœฐ็‚น:ๅผ˜ๅพทๆฅผxxxxx ...\n454502 0 ๅ’Œ้‚ฃไธชๅผบๅฅธๆ€ไบบ็Šฏ้ƒฝๆฒกๆœ‰ๆƒณๅˆฐ\n454503 0 ่ฟๆณ•ๅŽๆดพไบบๅฐฑๅพ—็Ÿฅ็š„ๆˆ‘็š„้š็งๆฅๆดพไบบ้—ฎใ€ๆฅๆ‰“ๅฌ็š„ๆ˜ฏไป–\n454504 0 Gxไบฌๆฒช้ซ˜้€Ÿ็”ฑไธŠๆตทๅพ€ๅŒ—ไบฌๆ–นๅ‘ๆ— ้”กๆฎตไปŽxxxxK่‡ณxxxxKๆ–ฝๅทฅ็ป“ๆŸ\n1000 Processed\n classify content\n455000 0 ๆฏไธ€ๅค„็ป†่ƒž้ƒฝๅ› ๆญค่€Œ่‚่‚ ๅฏธๆ–ญ\n455001 0 tagๅ’Œๅผนๅน•ไธ่ฆๅคช็œŸ็›ธwww\n455002 0 ๆ‰€ไปฅๅฎคๅ†…่ฎพ่ฎก่ถŠๆฅ่ถŠ่ถŠๆฅๅŽๆœŸ็š„่ฝฏ่ฃ…ๆญ้…ๆฅๅ‘ˆ็Žฐ็š„ๅฎคๅ†…่ฎพ่ฎกๆ•ˆๆžœ\n455003 0 
ๅฏๅ…่ดน่Žทๅพ—36ๅ‘จๅนดๅบ†ๅ…่ดนๅฎšๅˆถ่ฅฟๆœ็คผๆœไธ€ๅฅ—ๅ“ฆ\n455004 0 ๅœจ360ๅ’Œ่…พ่ฎฏไน‹้—ดๆˆ‘้€‰ๆ‹ฉไบ†่…พ่ฎฏ\n1000 Processed\n classify content\n455500 0 ่ฆๆฑ‚่ฟ‘xxxxๅๅ†œๆ‘ไธญๅญฆ็”Ÿ็ปŸไธ€ๅฐฑ่ฏปๅŽฟๅŸŽๆ–ฐๆ‰“้€ ็š„ๅด‡ๆ–‡ไธญๅญฆ็š„ๅธ–ๆ–‡\n455501 0 ้™‡ๅ—ๅธ‚ๆฃ€ๅฏŸ้™ขๆฃ€ๅฏŸ้•ฟ้ซ˜่ฟžๅŸŽๅœจๅธ‚ๅ›ฝๅœŸ่ต„ๆบๅฑ€ๅผ€ๅฑ•โ€œๅญฆๅฅฝๆณ•\n455502 1 ไผš่ฎฎๆœŸ้—ดxxๅ˜้ข‘ๆœบไป…้œ€xxxxๅ…ƒ๏ผˆไป…้™ไผš่ฎฎๅฝ“ๅคฉ๏ผ‰๏ผŒๆŠขๅˆฐๅณๆ˜ฏ่ตšๅˆฐ๏ผ็œŸ่ฏš็š„ๆœŸๅพ…ๆ‚จ็š„ๅคง้ฉพๅ…‰ไธด๏ผ็ฅไธ‡...\n455503 0 ๆณ•้™ขๅบ”ๅฝ“็ปง็ปญๅฎก็†ๆฐ‘้—ดๅ€Ÿ่ดท็บ ็บทๆกˆไปถ\n455504 0 ไบบๅฟƒ้™ฉๆถไธญๅ›ฝ็คพไผšๅคช่…่ดฅ\n1000 Processed\n classify content\n456000 0 ไธ‰ๅคฉไธ‰ๅคœโ€”โ€”ไธœๆžๅฒ›ใฎHigh็ฟป่™็ฟป่‡ชๅŠฉๆธธ\n456001 0 ็”ทๅญฉ๏ผšโ€œ้šพ้“ไฝ ็š„้ฃžๆœบๆ˜ฏ็œŸ็š„โ€\n456002 0 ๅฏนไบŽๅฆ‚ไฝ•ไพๆณ•่ฎข็ซ‹้—ๅ˜ฑๆฏซๆ— ไบ†่งฃ\n456003 1 ๆˆ‘่กŒๅˆฉ็އไปŽๆœฌๆ—ฅ่ตทไธŠๆตฎxx%๏ผˆๆœ‰้‡‘้ขๅ’ŒๆœŸ้™่ฆๆฑ‚๏ผ‰๏ผŒๅ„ไฝๅฐŠๆ•ฌ็š„ๅฎขๆˆทๅฆ‚ๆœ‰้œ€่ฆ๏ผŒๅฏ็”ต่ฏ่”็ณปใ€‚ๆ„Ÿ่ฐขๆ‚จๅฏน...\n456004 0 ้ฃžๆœบๅˆšไธŠๅคฉๅฐฑๅ‡บไบ‹่ฝฌไบ†ไธ€ไธชๅœˆไธ‹ๆฅไบ†\n1000 Processed\n classify content\n456500 0 ๅฎœๅ…ดๆ–ฐๅคฉๅœฐๅนฟๅœบๅฐ้ฉฌๅๅบ—ๅผ€ๅทฅๅคงๅ‰\n456501 0 ่ฟ™้‡Œๆ›พๆ˜ฏๆฑชๆ”ฟๆƒไผช้ƒฝๅ—ไบฌๅ‰็š„ไธดๆ—ถๅŠžๅ…ฌๅค„\n456502 0 ๅคงๅฎถ็œ‹็œ‹ๅผ ่€ๅœจๆœฌๆœˆ24ๅฅฝ็›ด็™ฝ็›˜ๅ‰ๆŽจ่ๅคงๅฎถ่ทŸ่ธช็š„ไธช่‚กไธญๅ›ฝๅซๆ˜Ÿ\n456503 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆท:ๆฌงๆดพๆ–ฐ่ฟŽx.xxๅทฅๅŽ‚ๅคงไฟƒ้”€!็‰นไปทxxxxๅ…ƒ:่ถ…้•ฟx็ฑณๆฉฑๆŸœ๏ผ‹็Ÿณ่‹ฑ็Ÿณๅฐ้ข๏ผŒ้…็ƒญ้”€ๅธ็ƒŸ...\n456504 0 ่ขซ้ช‚ไบ†ๅคšๅนดblxไนŸsaybyeไบ†\n1000 Processed\n classify content\n457000 0 ๅ–œๆฌข้‚ฃ้‡Œ็š„่€ๅปบ็ญ‘ๅฑ‹้กถไธŠ็š„็ปฟๆ ‘\n457001 0 ไธ€ไธชๆŠŠ่€็™พๅง“็š„ๅฑ…ไฝๆƒๅฅๅบทๆƒๅ’Œๅ—ๆ•™่‚ฒๆƒๆ‹ฟๆฅๆ‹‰ๅŠจ็ปๆตŽ็š„ๆ”ฟๅบœ\n457002 0 ๅ—ๅฒธๆณ•้™ข็Ž‹ๅญไผŸ้™ข้•ฟ็އ็ญๅญๆˆๅ‘˜ๅˆฐๆฑŸๅŒ—ๆณ•้™ขไบคๆตๅบง่ฐˆ\n457003 0 ไปฅๅŽ็”ต่„‘่ฆi7็š„ๅƒไธ‡ไธ่ฆไนฐ็ฌ”่ฎฐๆœฌ\n457004 0 2ๆŒ‚ๅฃๅ…ฌ่ทฏๆŒ‚ๅฃๅ…ฌ่ทฏๆ˜ฏๅœจๆ‚ฌๅด–ๅณญๅฃไธŠๅผ€ๅ‡ฟ่€Œๅ‡บ็š„\n1000 Processed\n classify content\n457500 0 ไนฐๆ‰‹ๆœบๅฅนไธˆๅคซๆ‰“็”ต่ฏๅˆ ้™ค่”็ณปไบบ\n457501 0 ๆ„Ÿๅ—็€ๆฎ่ฏดๅผ€ๆŒ‚็š„็‰นๆ•ˆ~่‰ฏๅฟƒๆฅ่ฏด\n457502 0 ไนๆœˆๅ…จๅ›ฝ100ๅœบๅทกๅ›žโ€œ็‹ฌๅˆ›ๅ•†ไธšๆจกๅž‹โ€ๅˆ†ไบซ\n457503 0 
xxๅ…ƒๅฐฑๅœจไบš้ฉฌ้€Šไนฐๅˆฐไบ†่ฟ™ๆœฌ็š„Kindle็‰ˆ\n457504 0 MKCๅœจ้€‰ๆ–™ใ€้€ ๅทฅใ€่ฃ…้…ๅ’Œ้…็ฝฎๆ–น้ข้ƒฝๅฐ†่ถ…่ถŠ็ฆ็‰น็ฟผ่™Ž\n1000 Processed\n classify content\n458000 0 ้‡Ž่˜‘่‡ๅˆซไนฑๅƒ่ฝปๅˆ™ไผค่บซ้‡ๅˆ™ไธงๅ‘ฝ\n458001 0 ๅ่…ๅฒ‚ไธๆˆไบ†ไธ€ๅฅ็ฉบ่ก่ก็š„ๅฃๅท\n458002 0 ๅ‘ผๅๅฏนๅšๅ‡้ฃŸๅ“่€…ๅˆคโ€œๆญปๅˆ‘โ€\n458003 1 ไบฒ็ˆฑ็š„ๅงๅงไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅณๅฐ†ๅˆฐๆฅ๏ผŒ็ฅไฝ ่Š‚ๆ—ฅๅฟซไนใ€่ถŠๆฅ่ถŠๅนด่ฝปๆผ‚ไบฎ๏ผๆœฌๅบ—้ข้ƒจๆŠค็†ๅ…จ้ƒจๆ‰“x.xๆŠ˜ ๅ…ป็”Ÿ...\n458004 0 โ€œๅผ˜ๆ‰ฌ็Ž‹ๆฐ็ฒพ็ฅžไผ ๆ‰ฟๅ‰ช็บธ่‰บๆœฏโ€้‚ณๅทžๅ‰ช็บธไผ ๆ‰ฟไบบๅŸน่ฎญ็ญๅผ€็ญ\n1000 Processed\n classify content\n458500 0 ๅ—ไบฌไนŸๆ›พๅ‡บ็Žฐ่ฟ‡ๆฐด่ดจๆ€ง็ผบๆฐด่€Œๆฌ่ฟๅ–ๆฐดๅฃ็š„ๆƒ…ๅ†ต\n458501 0 ๅ…ฑๅŒๆŽข่ฎจๆต™ๆฑŸ็œๅปบ็ญ‘ๆถ‚ๆ–™็”Ÿไบง็Žฐ็Šถ\n458502 0 ่€Œๆˆ‘ไปฌๅฐฑๅƒๆœบๅ™จไบบไธŠไบ†ๅ‘ๆกไธ€ๆ ท\n458503 0 ๆฒˆ้˜ณๅธ‚ๅœฐ้“่ง„ๅˆ’ๅ›พๆ–ฐ้ฒœๅ‡บ็‚‰ๅ•ฆ\n458504 0 ๆ„Ÿ่ฐขxไฝๅˆšไปŽ็พŽๅ›ฝWPSๅพฎ่ฝฏๅ…จ็ƒๅˆไฝœไผ™ไผดๅคงไผšๅ›žๆฅ\n1000 Processed\n classify content\n459000 0 ไธƒๆˆๅฎžไน ็”Ÿ่ต”้’ฑ่ตš็ป้ชŒ๏ผšๆœˆ่–ชๅœจ2000ๅ…ƒไปฅไธ‹\n459001 0 ๆ‰‹ๆœบๅทๅฐพๅทๅฏไปฅๆšด้œฒไฝ ็š„ๅนด้พ„\n459002 0 ๆ‹†ไธช่ฟ็‰น่ญฆๅŸŽ็ฎกไบค่ญฆ้›†ไฝ“ๅ‡บๅŠจ\n459003 0 ็Žฐๅœจๆˆ‘ๅœจๅด‡ๅฎ‰ๅฏบไบ‘่ ๅคงๅŽฆ็š„ๆ ผๆž—ๅธŒๅฐ”็š„่‹ฑ่ฏญๆœบๆž„ๅšไผ ้”€\n459004 0 0LEๆ ‡ๅ‡†็š„iPhoneๆˆ–Androidๆ™บ่ƒฝๆ‰‹ๆœบไธŠๅฎ‰่ฃ…้…ๅฅ—ๅบ”็”จ\n1000 Processed\n classify content\n459500 0 ๆœฌ้—จไปŽๆ™ฏๅพท็ฅž้พ™็Ž„ๆญฆๅžๅคบๅพ—ๅธๆ˜Ÿๅคงๆณ•ๆฎ‹็ซ ๅ››\n459501 0 ๅŽๆฅๆฅไบ†ไธ€ไธชๅซๅš่…พ่ฎฏ็š„ๅไบบๆŠ“่ตฐไบ†ๆฏไผ้น…\n459502 1 ๆตทๅฎๆฐ‘ๅŠ›ๆ‹…ไฟ๏ผšไผไธš่ฟ˜่ดทใ€ๅบ”ไป˜ๆฌพ็›ธๅ…ณไธšๅŠก๏ผ่ฏทๆๅ‰ไธคๅคฉ้ข„็บฆ๏ผๅœฐๅ€๏ผšๆตทๅฎๅธ‚ๆ”ฟๅบœๅ—๏ผˆๆ–‡็คผ่ทฏ๏ผ‰ๅฐšไธœๅ†™ๅญ—...\n459503 0 ่‹ฑ่ฏญๅ›žๅˆฐThefirstdayatschool\n459504 0 ็ฅไฝ ๆˆๅŠŸยทยทยทยทๅ‘ตๅ‘ตยทยทยท็›ธไฟกไฝ ่‡ชๅทฑๅฏไปฅ็š„\n1000 Processed\n classify content\n460000 0 ๆ‰‹ๆœบๅ„ๅคง้ƒจไปถ็›ธ็ปงๅผ€ๅง‹ๅคฑ็ตๅผ‚ๅธธ่ฎค็œŸ็š„ๅผ€ๅง‹ๆ€่€ƒๆขๅ“ชไธชๆ‰‹ๆœบๆฏ”่พƒๅฅฝ\n460001 0 ่ฏดๅคšไบ†้ƒฝๆ˜ฏๆณช่ฏ่ฏดๅœจๆฅๅพๅทž็š„่ทฏไธŠ\n460002 0 ๅœจmy่Šฑ็š„่ฏ้ข˜ๅ›ด่ง‚ๅพ—ๅ…ฅ่ฟทไนฐไบ†ไธชๅ†ฐๆท‡ๆท‹ๅƒๅฐฑๆŠŠ่ฆๅธฆๅŒป้™ขๅŽป็š„ไธœ่ฅฟ่ฝๅœจไบบๅฎถๅบ—้‡Œไบ†\n460003 0 
ๆŠŠๆˆ‘็š„็Šฏ็ฝชๅ› ๅญ้ƒฝๅฟซๆ™’ๅ‡บๆฅไบ†\n460004 0 ไธบไบ†ๅšไธ€ไธช\"็Œฎ็ป™ไธ็Ÿฅ้“่ƒฝๅฆๅšๆŒไธ‹ๅŽป็š„ไฝ \"ๅฅ่บซๆด—่„‘ๅฐ่ง†้ข‘\n1000 Processed\n classify content\n460500 0 ๆๅซๆณ•ๅฎ˜็š„ๅฐŠไธฅๅฐฑๆ˜ฏๆๅซๆณ•ๅพ‹ๅ’Œๅนฟๅคงไบบๆฐ‘ๅ…ฑๅŒๆ„ๅฟ—็š„ๅฐŠไธฅ\n460501 0 ๅฆ‚ๆžœไฝ ่ฟ˜ๅœจ็”จๆฏ›ๅทพ้šๆ„ๆ“ฆๆ“ฆ่„ธไฝ ๅฐฑ็ญ‰็€ๆฏ›ๅญ”่ถŠๆฅ่ถŠๅคงๅง\n460502 0 ๆฏๅคฉๆ‰“้’ˆๅƒไธญ่ฏๅชๆƒณๅฅฝ็‚น่ฎฉๆˆ‘็กไธชๅฎ‰็จณ่ง‰\n460503 1 xxๆœŸ:ใ€–ๅ…ญ่‚–ไธญ็‰นใ€—โ†’โ†’็พŠ้พ™็Œช็‹—้ธก็Œดโ†โ†้–‹:๏ผŸxxๅ‡†\n460504 0 ๆ‰€ๅ‘็”Ÿ็š„ๅฑžไบŽๅŸบๆœฌๅŒป็–—ไฟ้™ฉๆŠฅ้”€\n1000 Processed\n classify content\n461000 0 ๆœฌ้—จไปŽ้ป”ๆฑŸ็Ž„ๆญฆๅžๅคบๅพ—ๅฏ’ๅ†ฐ็ปตๆŽŒๆฎ‹็ซ ไธ€\n461001 0 ๅˆๅŽป้ขๅœฃไบ†~็„ถ่€Œ่ฟ™ๅœบ็œ‹ๅพ—ๆˆ‘ๅพˆไธๅผ€ๅฟƒ\n461002 0 ๆˆ‘็ฃŠๆ‰‹ๆœบ้“ƒๅฃฐ่ฟ˜็œŸ็š„ๆ˜ฏ้‚ฃ้ฆ–ๆญŒๅ“ฆwwwwwwww\n461003 0 ๆ นๆœฌๅŽŸๅ› ๆ˜ฏๆ”ฟๅบœๅฝ“็€ๅฉŠๅญๅŒๆ—ถๅด่ฆๅŒป็”Ÿๅธฎไป–็ซ‹็‰ŒๅŠ\n461004 0 ็™ผ็พ็œŸ็›ธ่ˆ‡ๆฏ่ฆช่ˆ‡2ๅญๅฃ่ฟฐไผผไนŽใ€ŒๅฎŒๅ…จไธๅŒใ€\n1000 Processed\n classify content\n461500 0 ๆƒณ็Ÿฅ้“ๆ‚จ็š„ๅบ”็”จๆœ‰ๆฒกๆœ‰้ป‘็™ฝ่พนๅ—\n461501 0 ่ฎค่ฏไฟกๆฏไธบโ€œFEELONEๆจก็‰น็ป็บชไบบโ€\n461502 0 ROSESHIREHEZ็›’่ฃ…็Žซ็‘ฐ่ŠฑๆŸ\n461503 0 ๅˆšๆ‰่ฒŒไผผๅฌๅˆฐ้ฃžๆœบไปŽๆฅผไธŠ้ฃž่ฟ‡\n461504 0 ่‡ณๅฐ‘่ฆๅœจ็™พๅบฆไธŠๅˆ›้€ ไธ€ไธชๅฑžไบŽ่‡ชๅทฑ่ฃ่€€\n1000 Processed\n classify content\n462000 0 ๅธŒๆœ›ไธๆ˜ฏไผค็—…ๅŽŸๅ› viatwitter\n462001 1 ๅฎ‰ๅŒ–่Œฏ็ –่Œถๅ†…็‹ฌๆœ‰็š„โ€œ้‡‘่Šฑโ€โ€”โ€”ๅญฆๅโ€œๅ† ็ชๆ•ฃๅ›Š่Œโ€่ƒฝๆœ‰ๆ•ˆ่ฐƒ่Š‚ไบบไฝ“ๆ–ฐ้™ˆไปฃ่ฐข๏ผŒๅนถๆœ‰ๅ‡่‚ฅๅŽป่„‚ใ€ๅ’Œ้กบ่‚ ...\n462002 1 ็Ÿณๆž—ๅฎœไฟกๆ™ฎๆƒ ไฟก็”จๅ€Ÿๆฌพๅ…ฌๅธ๏ผŒๅŠž็†ๆ— ๆŠตๆŠผไฟก็”จๅ€Ÿๆฌพ๏ผŒไธŠ้—จๆœๅŠก๏ผŒxๅƒๅˆฐxxไธ‡๏ผŒๆœ€ๅฟซๅฝ“ๅคฉ้€š่ฟ‡ๆฌข่ฟŽๆฅ็”ตๅ’จ...\n462003 0 ๆพณๆดฒๆ—…ๆธธๅฑ€ๆ—ฅๅ‰ๅœจๅฎ˜ๆ–น็คพไบค็ฝ‘็ปœไธŠๅผ ่ดดไบ†ไธ€ๅผ ่ข‹้ผ ็š„็…ง็‰‡\n462004 0 ็”ฑๆญค่€Œไบง็”Ÿ็š„็™พๅบฆๅ•†ไธšๆƒณ่ฑก็ฉบ้—ดๅทจๅคง\n1000 Processed\n classify content\n462500 0 ไธบๅฅฝๆˆฟๅญ้…ๅฅฝ่ฃ…ไฟฎโ€”โ€”ไป€ไนˆๆ˜ฏๅฅฝ่ฃ…ไฟฎ\n462501 1 ๅดๆฑŸ่กŒๆ”ฟๆ ธๅฟƒๅŒบๅŸŸใ€ไบจ้€š้•ฟๅฎ‰ๅบœxx--xxxๅนณ็ฑณใ€‘่ฝป่ฝจๅญฆๅŒบๅ‡†็Žฐๆˆฟ๏ผŒๆ‚จๆƒณไธๅˆฐ็š„ไผ˜ๆƒ ๅ’Œ่ถ…ๅ€ผ ๆฅๅ‰ๆ...\n462502 0 
ไฝ†้€š่ฟ‡ๆปฅ็”จๆ”ฟๅบœ่กฅ่ดดใ€ๆ”ฟๅบœๆ•‘ๆตŽๅ’Œๆ‰ถๆŒ็ญ‰ๆ‰‹ๆฎต\n462503 0 ๅนฟๅทžๅ—็ซ™ๅฑ…็„ถ่ฟ˜ๆœ‰่ฃ…่‹ๅ“‘่ฎฉไบบ็ญพๅ้ช—้’ฑ็š„\n462504 0 2015ๆ–ฐๆฌพๅค–ๅฅ—็”ท็ง‹่ฃ…้Ÿฉ็‰ˆๆฝฎไผ‘้—ฒๅคนๅ…‹็”ทๅฃซๆ˜ฅ็ง‹่–„ๆฌพไฟฎ่บซไธŠ่กฃ้’ๅนด็”ท่ฃ…\n1000 Processed\n classify content\n463000 0 ๅœจไธš็•Œๆœ€ๆƒๅจ็š„CPUๅŸบๅ‡†ๆต‹่ฏ•SPECCPU2006ไธญ\n463001 0 ็ดฏ่ฎกๆœ‰8000ไฝ™ไธชๅฎถๅบญๆŠฅๅๅ‚ๅŠ ๆดปๅŠจ\n463002 0 ไธ็”จๅผ€่ฏๆ–นใ€€ใ€€็™ฝ่œ่ๅœๆฑค\n463003 0 ๅทฎ็š„่ดงๅ€Ÿๆˆ‘xxไธช่ƒ†ๅญๆˆ‘ไนŸไธๆ•ขๅš\n463004 1 ไบฌๅฎ‡่ฝฉๅฅๅบทๅ…ป็”Ÿไผšๆ‰€ใ€‚ไธป่ฅ๏ผš่ถณๆตดๆŒ‰ๆ‘ฉใ€‚TEL:xxxxxxxใ€‚ๅœฐๅ€๏ผšๆ–ฐๅžตๆ–ฐ็››่ทฏxxๅทๆ‚ฆๅฎžๅนฟๅœบๅ››...\n1000 Processed\n classify content\n463500 0 ้‡่ฆ็š„ไบ‹่ฏดไธ‰้๏ผšๅŠ ๆฒนใ€ๅŠ ๆฒนใ€ๅŠ ๆฒน\n463501 0 ๅฎฟ่ฟๅ“ช้‡Œๆœ‰ๅฅฝๅƒ็š„ๅฅฝๅƒ็š„ๅฅฝๅƒ็š„ๅฅฝๅƒ็š„ๅฅฝๅƒ็š„\n463502 0 ๅŠ ๆฒนๅ“ฆ่ฟ˜ๆœ‰ๅŠไธชๅฐๆ—ถ็š„ๆ—ถ้—ดๅฏไปฅ็ปง็ปญ\n463503 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?02\n463504 1 ๆ’ญใ€็‚นๆ’ญๅ’Œๅ›žๆ”พ๏ผŒๅŠŸ่ƒฝ่ดนxxๅ…ƒ/ๆœˆ๏ผŒxๅนดไป…้œ€xxxๅ…ƒ๏ผŒ่ฟ˜่ต ้€็ฝ‘็ปœๆœบ้กถ็›’ใ€‚่ฏฆ่ฏขๅŒๆปฆ่”้€šxxxxx...\n1000 Processed\n classify content\n464000 0 ๅถ็„ถๆกๅˆฐ็š„ๅพ’ๅผŸๆญฃๅฅฝ้ƒฝๆ˜ฏ้ซ˜่€ƒ่ฟ˜ๆ˜ฏๅธธๅทž็š„ๅฅฝๅทงๅฅฝๅทงๅพ’ๅผŸ้ซ˜่€ƒๅฎŒไนฐไบ†ๅค–ๆ˜Ÿไบบๅผ€็”ตๅฝฑ็บง็Žฉๅ‰‘ไธ‰โ€ฆโ€ฆไธ€็›ดๅœจๆˆชๅ›พโ€ฆโ€ฆ\n464001 0 ๅฐ†ไฝ ็š„็”Ÿๅ‘ฝๆŠ•่ต„ไบŽๅคงไผ—่บซไธŠๆ—ถ\n464002 0 ไฝ†ๆ•ดไธชๆปจๆตทๆ–ฐๅŒบ็š„ๆŒ‡ๆ•ฐๅนถๆฒกๆœ‰ๅผ‚ๅŠจ\n464003 1 ็ป็†ไฝ ๅฅฝ๏ผŒ็”Ÿๆ„ๅ…ด้š†๏ผŒไธ‡ไบ‹ๅฆ‚ๆ„ใ€‚ๆˆ‘ๅ…ฌๅธๅฏไปฅๅ…่ดนๅŠž็†ๅ…‰ๅคง๏ผŒไบค้€š๏ผŒๅ†œ่กŒ๏ผŒๆตฆๅ‘ไฟก็”จๅกใ€‚็งปๅŠจๅธฆ็งฏๅˆ†ๆ้ข...\n464004 0 ไธ€ไธชไธๆƒณๅšไธดๅบŠๅŒป็”Ÿ็š„ไธดๅบŠไธ“ไธšๅญฆ็”Ÿ็š„ๅฎžไน ๆ—ฅ่ฎฐ๏ผš2015ๅนด8ๆœˆ6ๆ—ฅ\n1000 Processed\n classify content\n464500 0 ๅ› ไธบ่‡ช3ยท30ๆˆฟๅœฐไบงๆ–ฐๆ”ฟไปฅๆฅ\n464501 1 xๅ…ƒ็ง’ๆฑŸๆฑ‰ๅŒบไธ€ๅฅ—ๆˆฟ๏ผŒใ€็ฆๆ˜ŸๅŽๅบœใ€‘xxxxๅ…ƒ/ๅนณ่ตท็ง’xx-xxxๅนณ้™้‡ๆˆฟๆบ๏ผŒๅŒๅœฐ้“ๅญฆๅ…ปๆˆฟ๏ผŒๆดปๅŠจ...\n464502 0 ๅžƒๅœพๅ‹’ๆถฉๅœŸ็‹—ๆ—ฅ่„“ๅŒ…ๆˆณ้”…ๆผ\n464503 1 ๆฅๅบ—ๆถˆ่ดน็š„่ฏๅ…จๅœบxxๆŠ˜ไผ˜ๆƒ ๏ผŒๅ‰่ฟ›ๅบ—ๆฑค่‡ฃๅ€ๅฅ้™†ๅฐๅง\n464504 0 ๅฎ‰ๅ“ๆœบๅ™จไบบโ€œ้˜ณๆ‰ฌโ€ไบŽไธŠๆตท้ซ˜ๅฒ›ๅฑ‹็™ปๅœบๅމๅฎณ\n1000 Processed\n classify content\n465000 0 ๅ”ๅŠฉไฝ 
ๆ‰“้€ 13ๅ„„็พŽๅ…ƒๅธ‚ๅ ด็š„็ถฒ่ทฏไบ‹ๆฅญ\n465001 0 ไฝไบŽๅขจๅฐ”ๆœฌๅคงๅญฆๅ’ŒRMIT็š‡ๅฎถ็†ๅทฅๅคงๅญฆไน‹้—ด่ฑชๅŽๅ…ฌๅฏ“\n465002 0 ๅŸŽ็ฎก็š„้ฆ–่ฆไปปๅŠกไธๆ˜ฏๆŽ€ๆฐดๆžœๆ‘Š\n465003 0 ๅ…ถๅฎžๆˆ‘็Ÿฅ้“ๆˆ‘ๅทฒ็ป่ตขไบ†ๅฏๆ˜ฏๆˆ‘ๅฐฑๆ˜ฏๆƒณๆ‰“็ ด็ ‚้”…้—ฎๅˆฐๅบ•ๆˆ‘ๅฐฑๆ˜ฏๆƒณๆŠŠ็œŸ็›ธๆ‹ฟๅ‡บๆฅ่ฎฉไป–ๅ“‘ๅฃๆ— ่จ€ๅฏ่ฟ™ๆ ทๅฏนๆˆ‘่‡ชๅทฑ...\n465004 0 ๅซๆ˜Ÿๅ†ไธ€ๆฌกไฝฟๅ‡บไบ†ๆœบๆขฐๅŒ–TvZ\n1000 Processed\n classify content\n465500 0 ็ƒŸๅฐๅธ‚ๅ…ฌๅฎ‰ๅฑ€็ปŸไธ€้ƒจ็ฝฒๅ…จๅธ‚ๅ…ฌๅฎ‰ๆœบๅ…ณๅ…จ้ขๅฏๅŠจไธ€็บงๅทก้€ป้˜ฒๆŽงๅทฅไฝœ\n465501 0 ๅผ ๅธˆๅ‚…้ฉพ้ฉถ46่ทฏๅ…ฌไบค่ฝฆ่กŒ่‡ณ่ˆชๆตท่ทฏไธƒ้‡Œๆฒณ็ซ™\n465502 0 ๆฏๅน…็ซŸ็„ถไปฅ1000็พŽๅ…ƒ็š„ไปทๆ ผ่ขซๅ–ๆމไบ†\n465503 0 ่€Œไธ”ๅปบ็ญ‘ๅธˆ็š„ๆขฆๆƒณๆ€ป่ง‰ๅพ—ๅพˆ็ช็„ถ\n465504 0 ใ€ŒInๅฝฑ็™ฝใ€ไบบ็”Ÿ็š„้€™ๅ ด้ฆฌๆ‹‰ๆพ\n1000 Processed\n classify content\n466000 0 ๆฒกๆœ‰ๅฅณ็”Ÿๅฎถ้‡Œๆฒกๆœ‰ไธ€่„šไฟฎ่„š็š„็‰›ไป”้•ฟ่ฃค\n466001 0 ้‚ฃไนˆๆปจๆน–ๅŒบ็œๅฎž้ชŒๅๆ กๆœ‰ๅ“ชไบ›ๅ‘ข\n466002 0 ๆ‰‹ๆœบๅทฒๆŠฅๅบŸๅฅฝๆƒณๅŽปๅ–่‚พๅ•Š่ฐ่ฆ\n466003 1 ๅ†œ่กŒxxxxxx xxxxx xxxx xxxxๅผ ๅฐ็Žฒ\n466004 0 ||ๆˆ‘ๅœจๆฐงๆฐ”ๅฌไนฆๆ”ถๅฌโ€œ0001ๅฆพๆœฌๆƒŠๅŽโ€\n1000 Processed\n classify content\n466500 0 ๅฅฝๅ–œๆฌข็š„ไธ€้ฆ–ๆญŒๅฏๆƒœ็”ทๅฃฐๅ”ฑไธไบ†ๆ˜Žๅคฉๅธฆ็€ๅฐUไธŠ้ฃžๆœบๅ›žๅฎถๅ†ๅฅฝๅฅฝ็ปƒ&gt\n466501 0 mxๆ›ดๆ–ฐsensexไนŸ่ฆ็ญ‰ๅˆฐๅ…ซๆœˆไปฝ\n466502 0 5ๅคฉ4ๅคœๅ‡ ไนŽๅ…จ็จ‹้ƒฝๆ˜ฏๅˆšไธ‹่ฝฆๅฐฑๅผ€ๅง‹ๅ\n466503 1 ้˜ณๅ…‰็‘žๅŸŽไธšไธป๏ผšๆ–ฐๅนดๅฅฝ๏ผ[ๆฅๅฐฑ้€ๆ–ฐๅนด็คผ][x๏ผšxx~xx๏ผšxx]้’ˆๅฏนๆ‚จๅฐๅŒบๆˆทๅž‹่ฎพ่ฎก่งฃๆžไผš๏ผŒๆˆฟๅž‹ๆ•ˆๆžœๅ›พ\n466504 0 ๆทฑๆทฑ่ง‰ๅพ—ๆˆ‘ๅธ็”ต่„‘่ฏฅๆท˜ๆฑฐไธ€ๆ‰นไบ†\n1000 Processed\n classify content\n467000 0 ๅ†ฐไธ้˜ฒๆ™’่ข–ๅฅ—็”ทๅฅณ้•ฟๆฌพ้˜ฒ็ดซๅค–็บฟๅผ€่ฝฆๆ‰‹ๅฅ—้ช‘่ฝฆ่‡‚ๅฅ—่ข–\n467001 0 ไธœๅŒ—้ฃŽไปŠๅคฉๅคœ้‡Œ4ๅˆฐ5็บง้˜ต้ฃŽ6็บง\n467002 0 ๅฝ“่…นๆณป็”ท่ขซๅ›ฐ็”ตๆขฏ41ๅฐๆ—ถ\n467003 0 ๆˆ‘่ง‰ๅพ—ไนŸๆ˜ฏ็ฌฌไธ€ๆฌก่ฟ™ไนˆๆธ…ๆฅšๆ˜Ž็™ฝ็š„ๆ‡‚ๅพ—ๅ’ซๅฐบๅคฉๆถฏไป€ไนˆๆ„ๆ€\n467004 0 ๆฏๆฌก้ƒฝ่พ“ๅœจๆ”ฏไป˜ไธŠ็พŽๅฎขไผšVIPไผšๅ‘˜ๅกๅœจๆ‰‹็„ถ่€Œๅนถๆฒกไป€ไนˆๅต็”จ\n1000 Processed\n classify content\n467500 0 4GDDR3ๅ†…ๅญ˜ๅ’Œๅฎšๅˆถ็‰ˆ็š„GeForceGTX860M\n467501 0 USCๅ•†ๅญฆ้™ขMBAๆฏ•ไธš็”ŸJing\n467502 0 
ๅ‘็ŽฐAๆœ‰ๅฅฝๅคšๅผ€ๆˆฟ็บชๅฝ•่€Œไธ”ๆ—ถ้—ดๅฏ†้›†\n467503 0 ไปŠๅนดๅบ•AndroidMไผš้šๆœ€ๆ–ฐ็š„Nexus่ฎพๅค‡ไธ€่ตทๅ‘ๅธƒ\n467504 0 ไธ‰ๅฎ…ไธ€็”Ÿpleatspleasexxxx็ง‹ๅ†ฌไธ€่งˆ็ฝ‘ๅ€\n1000 Processed\n classify content\n468000 0 6159ๆ‰‹ใ€18600ๆ‰‹ๆฏ้š”ๅ‡ ๅˆ†้’Ÿ้ข‘็Žฐ\n468001 0 ไปŠๅคฉๅฅ”ๆณขไบ†ไธ€ๅคฉ็š„ๅŒป้™ขๅ„็งๆฃ€ๆŸฅ\n468002 0 ๅ—ไบฌๅธ‚ไบบๆฐ‘ๆฃ€ๅฏŸ้™ขไพๆณ•ๅฏนๆž—้ฃžไปฅๆถ‰ๅซŒ็Žฉๅฟฝ่Œๅฎˆ็ฝชใ€ๅ—่ดฟ็ฝชๅ†ณๅฎš้€ฎๆ•\n468003 0 ๆ–นๅคง้›†ๅ›ขๅˆ่ฎกไธญๆ ‡xไบฟๅ…ƒ่ฝจไบคๅฑ่”ฝ้—จๅˆๅŒ\n468004 0 ๆˆ‘ๅœจDosnapๅˆ†ไบซไบ†ไธ€ๅผ ๆถตๅฎ็š„ไฝœๅ“\n1000 Processed\n classify content\n468500 0 ๅคงๅฎถๅฅฝๆœฌไบบ็Žฐๅœจๅœจๅšๆžธๆž่ฟ™ไธช่กŒไธš\n468501 0 ่ทŸ็Šฏ็ฝชๅซŒ็–‘ไบบ็œผ็ฅžไบคๆฑ‡็š„ๆ—ถๅ€™ๅ†…ๅฟƒ่ฟ˜ๆ˜ฏๆœ‰ๆˆ˜ๆ —็š„\n468502 0 ๅฎๅฎxxๅคฉๆฏๅคฉๆ‹‰ๅคงไพฟ็š„ๆฌกๆ•ฐๅพˆๅคš\n468503 0 ๅฆ‚ๆŠŠ่€็™พๅง“็š„ไฝๆˆฟๅŒป็–—ๆ•™่‚ฒๅ…ป่€้—ฎ้ข˜่งฃๅ†ณไบ†\n468504 0 ไปฅๆ‰ฌๅทžไธ‡ๆพๅฑฑใ€้‡‘้’ฑๅขฉใ€่ฑก็‰™ๆž—ใ€่‘ต่Šฑๅฒ—ๅ››ๅคงๅๆ™ฏไธบไธป้ข˜ๅšๆˆไบ†ๆพ้ผ ๆก‚้ฑผใ€้‡‘้’ฑ่™พ้ฅผใ€่ฑก็‰™้ธกๆกๅ’Œ่‘ต่Šฑๆ–ฉ...\n1000 Processed\n classify content\n469000 0 ๅคชๅŽŸๅธ‚ไธญๅฟƒๅŒป้™ข็šฎ่‚ค็ง‘ๅœจ้—จ่ฏŠๅนฟๅœบไธพๅŠžไบ†โ€œ็พŽไธฝไปŽๅฅๅบท็šฎ่‚คๅผ€ๅง‹โ€ๅคงๅž‹ๅ…ฌ็›Šไน‰่ฏŠๆดปๅŠจ\n469001 0 ่€Œไธ”ไธๅพ—ไธ่ฏดๅฅฝๅฃฐ้Ÿณ็š„ๅ”ฑ็š„็œŸๅฟƒไธๅฆ‚ๅŽŸๅ”ฑ\n469002 0 ่€ŒCitadelๆ˜ฏๅ…จ็ƒๆœ€ๅคง็š„ๅฏนๅ†ฒๅŸบ้‡‘ไน‹ไธ€\n469003 0 ่€Œๆ˜ฏ็Ÿฅ้“็œŸ็›ธๅŽๆ‰€ๆœ‰ๆˆ‘ไธ่ƒฝๆŽฅๅ—็š„่ฐŽ่จ€\n469004 0 ๅฝ“ไฝ ๅฟƒ่บซๆ„Ÿๅˆฐ็–ฒๆƒซๆ˜ฏๅฎถๆ˜ฏๆœ€ๅฅฝ็š„ๆœ€ๆธฉ้ฆจ็š„ๅททๆนพ็„ถ่€Œๅฏนๆˆ‘ๆฅ่ฏดๆ˜ฏๅคšไนˆ็š„ๅฏ็ฌ‘ๅ•Šๆฏๅ›žไธ€ๆฌกๅฟƒๅ‡‰ไธ€ๆฌกๅฟƒ็—›ไธ€ๆฌกๅฅฝ...\n1000 Processed\n classify content\n469500 0 7ๆœˆ28ๆ—ฅๆต™ๆฑŸไผ—ๆˆ่ทŒ่‡ณ23ไนฐ่ฟ›\n469501 0 xๆœˆๆธฏไบคๆ‰€ๆˆไธบไธญๅ›ฝไผไธšIPOไธปๆˆ˜ๅœบ\n469502 0 ๆ— ้”ก้ƒฝๅธ‚็”Ÿๆดปๅนฟๆ’ญ่”ๅˆๆ— ้”กๆถˆ้˜ฒๆ”ฏ้˜Ÿใ€ๆ— ้”กๅŽๆถฆ็‡ƒๆฐ”ๅœจ็บณๆ–ฐๆกฅ็คพๅŒบๅ‰ๅฎ‹ๅททๅฐๅŒบๅผ€ๅฑ•ไบ†โ€œๅคๅญฃๅฎ‰ๅ…จๅ…ฌ็›Š่ฟ›็คพ...\n469503 0 ๅทฒ็ปๆ˜ฏๆ”ฟๅบœ้€š่ƒ€็›ฎๆ ‡็š„ไธคๅ€ๅคš\n469504 0 ๆฏๆฌกๅ’ŒๅŸŽ็ฎกๆ‰“ๅฎŒๆžถๅ‡บๆฅๆ–ฐๅˆ€ๆ็คบๆˆ‘้ƒฝๅœจๅšŽๅซโ€œ็ป™ๆˆ‘ๆตฆๅฒ›\n1000 Processed\n classify content\n470000 0 2็พŽๅ…ƒๆฒก่’ฒๅผ่€ณโ€โ€œ้‚ฃไธบไป€ไนˆไธ็Žฐๅœจไนฐ\n470001 0 
ๅฐ†่ฆๆ”นๅŠจ็š„styleไฝœไธบstateๆฅๆ”นๅ†™\n470002 0 ๅˆ˜ๆ˜Šmeitong0711้ซ˜ไพไพangelababyๅฟ…้กปๆญปๆญปๅˆ‘ๆญปๆ€ไบบๅฟๅ‘ฝๆญป\n470003 0 โˆžโ•ฌโ•ฌๅฎๆณฐ้š†ๅ–œไธด้—จๅคง่ฅฟๆด‹้นฟๆธฏ็ง‘ๆŠ€ๅšไฟก่‚กไปฝๅ†…่’™ๅ›ๆญฃๆต™ๆฑŸ้พ™็››\n470004 0 ๆˆ‘็Žฐๅœบ็š„็บธ้ฃžๆœบๅ†™ไบ†ไธ€ๅฆ‚ๆ—ขๅพ€็š„ๅŠชๅŠ›็”Ÿๆดป\n1000 Processed\n classify content\n470500 0 ๅฐ็ฑณๆ‰‹็Žฏ๏ผ‹1000ๅ…ƒ็Žฐ้‡‘โ€ฆโ€ฆๅ‚ๅŠ ๆš‘ๆœŸๅฎž่ทตๅคง่ต›\n470501 0 ่ฏดๆ˜ฏLONGLONGAGO\n470502 0 ๅฎƒ็š„่ฎพ่ฎกไปฅๅŠๅˆถ้€ ๅทฅ่‰บ้ƒฝๆœ‰ไบ†็ช้ฃž็Œ›่ฟ›็š„ๆๅ‡\n470503 0 ๅœจ้ฉฌ่ทฏไธญ้—ดSๅฝขๆป‘ไบ†ๅ‡ ไธชๆฅๅ›žไน‹ๅŽๆ’žๅ€’่ทฏไธญ้—ด็š„ๆŠคๆ \n470504 0 ่ฅฟๅฎ‰ๅธ‚ไบบๆฐ‘ๆ”ฟๅบœๅ†ณๅฎš๏ผšxxxxๅนดxๆœˆxxๆ—ฅไธŠๅˆxxๆ—ถxxๅˆ†่‡ณxxๆ—ถxxๅˆ†ๅœจๅ…จๅธ‚่Œƒ\n1000 Processed\n classify content\n471000 0 ๆ ผ็ฝ—ๆ–ฏ๏ผšๅ…จ็ƒ้‡‘่žๅธ‚ๅœบ็Žฐๅœจ้ƒฝๆ˜ฏ้ช—ๅฑ€\n471001 0 ไปŠๅคฉๅŽป่‹ๅทžๅ›ญๅšไผšๆ–ฝๅทฅ็Žฐๅœบ่ฝฌไบ†ไธ€ๅœˆ\n471002 0 โ€็›ฎๅ‰ๅผ ่Šฑๅทฒ่ขซๆฐ‘่ญฆ็”จไธ“่ฝฆ้€ๅ›žๆน–ๅŒ—่€ๅฎถ\n471003 0 B้—จ่ฎพ็ฝฎใ€็ฎก็†ๅŠไธŽๆญฆ่ญฆๅ“จไฝ็š„่”ๅŠจ\n471004 0 ็ซ ๆณฝๅคฉๅ’Œๅ…ถไป–ๆŠ•่ต„ไบบๅนถๆฒกๆœ‰ๅคชๅคšๅŒบๅˆซ\n1000 Processed\n classify content\n471500 0 โ€œๆœบๅ™จไบบๆ€ไบบโ€ๆ˜ฏๆ— ไธญ็”Ÿๆœ‰or็กฎๆœ‰ๆญคไบ‹\n471501 0 ๅฟ…้กปๅผ€ๅ…ฌๅฎ‰็จŽๅŠกๅทฅๅ•†้“ถ่กŒๆ•™่‚ฒๆœบๆž„ๅฑ…ๅง”ไผš็ญ‰5ๆœบๆž„่ฏๆ˜Žไธชไบบ่บซไปฝๆ‰่ƒฝๅ‘\n471502 0 ๅคงๅฎถไธ€่ตท็Žฉๆธธๆˆโ€ฆ2015็š„็”Ÿๆ—ฅๆˆ‘่ฟ‡ไบ†ๅพˆๅผ€ๅฟƒ\n471503 0 TheLightRunๆณฐๅทžไธ‡่พพ่งๅ…‰ๅคœ่ท‘\n471504 0 JeremyRennerไนŸๆ˜ฏๆˆ‘ๅ–œๆฌข็š„ๆผ”ๅ‘˜\n1000 Processed\n classify content\n472000 0 ๅŸŽ็ฎกๅฑ€ๆ‰งๆณ•ไบบๅ‘˜ไบŽ7ๆœˆ28ๆ—ฅๆ™šๅœจ่กก้˜ณ่ทฏๆ‰พๅˆฐไบ†่ฏฅๅบ—่ดŸ่ดฃไบบ\n472001 0 NEOๅฅณ็š‡ๅ››่‰ฒๆฃ•่‰ฒ่‡ช็„ถๅฐๆทท่ก€็‹ฌ็‰นๆฏ›่พน่ฎพ่ฎกๅ’Œๆต…ๆฃ•่‰ฒๅฎŒ็พŽๆญ้…ๆ˜พๅพ—็œผ็›ๆธ…้€ๆทท่ก€NEOไปปๆ„ไธคๅน…ๆดปๅŠจx...\n472002 0 ไปŠๅคฉๅŽปๅŒป้™ขๅฌ่ฏŠ่‚บ้ƒจๆฒกๆœ‰้—ฎ้ข˜ใ€ๆฐ”็ฎกๆœ‰็‚นไธๅฅฝ\n472003 0 ๅ…‹ๆ‹‰ๆ‹ไบบๆ˜Žๆ—ฅๆต™ๆฑŸๅซ่ง†\n472004 0 1258ๅบžๅคง้›†ๅ›ข้€†ๅŠฟ็š„ไธŠๆถจ\n1000 Processed\n classify content\n472500 0 ๅˆšๅœจไบš้ฉฌ้€Šๅ…ฅๆ‰‹ไบ†่ฟ™ๅ‡ ๆœฌไนฆ๏ผšๅญคๅ„ฟๅˆ—่ฝฆ\n472501 0 ๆ˜Ž็กฎ่ง„ๅฎšๆŠ•่ต„่€…ๅœจ่žๅˆธๅ–ๅ‡บๅŽ\n472502 1 
ไบฒ็ˆฑ็š„๏ผŒ็ปด็บณ่ดๆ‹‰ไธ‰ๅ…ซ่Š‚๏ผŒไธ€ๅนดๅฐฑไธ€ๆฌกๆœ€ๅคงๅž‹็š„็›ดๆŽฅ็œ้’ฑๆดปๅŠจ้ฉฌไธŠๅฐฑๅผ€ๅง‹ไบ†๏ผŒๅˆฐไนๅทๆˆชๆญข๏ผŒ็ฅฅๆƒ…ๆฅ็”ตๅ’จ่ฏข...\n472503 0 ่€Œ่Šฑๅƒ้ชจ่ฆ่ฏดๅ‰งๆƒ…ๅงๆœ‰ไบ›ๆ–น้ขๅ‘ๅฑ•ๅคชๅฟซ\n472504 0 ๅพๅทžๆ˜ฏไน…่ดŸ็››ๅ็š„ๆ—ขๅŒ…้‚ฎๅˆๆœ‰ๆš–ๆฐ”็š„ๅŸŽๅธ‚\n1000 Processed\n classify content\n473000 0 ไป–ไปฌ้˜ตๅฎนไธญๆœ‰3ๅๆŽงๅซ็š„ๅ‡บ็”Ÿๅนดๆœˆๆ—ฅ็ซŸ็„ถๆ˜ฏไธ€ๆจกไธ€ๆ ท็š„\n473001 1 ้ƒ‘ๅทžไธนๅฐผๆ–ฏ่Šฑๅ›ญๅบ—ๅŽๆญŒๅฐ”xๆœˆxๅท----xๆœˆxๅทๆปกxxxๅ‡xxx\n473002 0 ๅทฒๅปบๆˆ5ๆก๏ผš้•ฟๆฑŸๅคงๆกฅใ€้•ฟๆฑŸไบŒๆกฅใ€้•ฟๆฑŸไธ‰ๆกฅใ€้•ฟๆฑŸๅ››ๆกฅๅ’Œ็บฌไธƒ่ทฏ้•ฟๆฑŸ้šง้“\n473003 0 ็”ต่„‘้‚ฃ่พน็š„ๅฅนไพฟไน็š„ๅ“ˆๅ“ˆๅคง็ฌ‘\n473004 0 ่ฅฟ่—้˜ฟ้‡Œๆœ‰่™่ ไพ ่ขซๆ”น็ผ–ๆˆๆธธๆˆไธ€ๆฌกๆœˆ้ฃŸ็š„ๅ‘็”Ÿ\n1000 Processed\n classify content\n473500 0 ๅพฎ่ฝฏๅŒๆ—ถๅ…ฌๅธƒไบ†Win7/Win8\n473501 0 ๅฎ‰่€ๆ™’ๆœ‰ๆฌพๅนถไธ็บข็š„้˜ฒๆ™’ๅ‡ ๅนดๅ‰ๅพˆๅ–œๆฌข\n473502 0 ๆญฆๆฑ‰ๅธ‚ไธญ็บงไบบๆฐ‘ๆณ•้™ขไบŒๅฎกๅˆคๅ†ณ็ง่—ๆžชๆ”ฏ็š„็ˆถไบฒๅค„ไปฅ5ๅนดๆœ‰ๆœŸๅพ’ๅˆ‘\n473503 0 ไบง็ป้ข„่ญฆใ€ๆ”ฟ็ญ–ๅŠจๆ€ใ€ๅธ‚ๅœบ็Žฏๅขƒใ€็ปŸ่ฎกๆ•ฐๆฎใ€ๆƒๅจๅ‘ๅธƒใ€ๆ–‡ๅŒ–ใ€ๅŒป่ฏใ€้ฃŸๅ“ใ€ๆœบ็”ตใ€ๅ†œๆž—ใ€ๅปบ็ญ‘ใ€้€šไฟกI...\n473504 0 ็”ตๅฝฑ็‹‚ๆ€’็ฎ—ๆ˜ฏ่ฟ‘ๆœŸๆกฃๆฏ”่พƒ้‡ๅคด็š„ๅฝฑ็‰‡ไบ†\n1000 Processed\n classify content\n474000 0 ๅšๆˆ‘่ฟ™่กŒๆœ‰่ถฃ็š„ไบ‹ๆฏๅคฉ้ƒฝๅฏไปฅ็Žฉไธ้œ€่ฆๅ‘่ฐ่ฏทๅ‡็ดฏไบ†็กไธ€่ง‰ๅœจๅฎถไบซๅ—ๅฅฝๅƒ็š„็Žฉ็€ๆ‰‹ๆœบๆŒฃqian้€›้€›ๅพฎๅš...\n474001 0 ๅ“ˆๅ“ˆ่…พ่ฎฏๆ–ฐ้—ปไฝ ไปฌๅฐ็ผ–ไปŽQQ็ฉบ้—ด่ฏทๆฅ็š„\n474002 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ้ƒฝๆ˜ฏ็พŽๅ›ฝไบบๆ–ฐๅŠ ๅกไบบ้ฉฌๆฅ่ฅฟไบšไบบๅŠ ๆ‹ฟๅคงไบบ\n474003 0 orbisๆ— ๆฒนๅธๅฆ†้œฒๅธ้˜ฒๆ™’ๅพˆๅฅฝ\n474004 0 ๆˆ‘ๆญฃๅœจ็œ‹10ๆฌพ่ถ…็บขไธœๆด‹้›ถ้ฃŸ็ผ–่พ‘้ƒจๅฎžๆต‹ๆŠฅๅ‘Š\n1000 Processed\n classify content\n474500 0 ๆœ€็ปˆๆฌง็พŽๅœฐๅŒบๆœ€็ซ็ˆ†็š„FPSๆธธๆˆๆˆ˜่ƒœไบ†ๆ—ฅๆœฌ็Žฉๅฎถๆœ€็ˆฑ็š„่ฟๅŠจๆธธๆˆ\n474501 0 ้‡็‚นๆ˜ฏๅฎƒ่ทŸ็ฆๅ…‹ๆ–ฏSTๆœ‰็‚น็ฅž้Ÿต็›ธ้—ด\n474502 0 ๆต™ๆฑŸๅซ่ง†ๆœ‰ไธ่‰ฏไบบๆ‰‹ๆธธ็š„ๅนฟๅ‘Šไบ†\n474503 0 ๆˆฟไบง้”€ๅ”ฎไธš็ปฉไนŸๆ˜ฏ้žๅธธๅމๅฎณ็š„\n474504 1 ๆ‚จๅฅฝ๏ผŒๆ„Ÿ่ฐขๆ‚จไธ€็›ดไปฅๆฅๅฏน็บข่‹นๆžœๅฎถๅ…ท็š„ๆ”ฏๆŒไธŽๅŽšๆ„›๏ผŒๅ…ฌๅธไธบ็ปดๆŠคๆถˆ่ดน่€…ๆƒ็›Š๏ผŒx.xxๆดปๅŠจๅฏๅŠจไธญ(xๆœˆ...\n1000 
Processed\n classify content\n475000 0 ๆฏๅคฉๆ—ฉๆ™จไผด็€็ช—ๅค–ๅ„็ฑปๆฑฝ่ฝฆๅ‘ๅŠจๆœบ็š„่ฝฐ้ธฃๅ’Œๆฑฝ็ฌ›้†’ๆฅ\n475001 0 ๅ› ไธบๅ› ไธบๆฑฝๆฒนๆฑฝ่ฝฆๅฎนๆ˜“ๅœจ็ขฐๆ’žๅŽๅ‘็”Ÿ็ˆ†็‚ธ\n475002 0 ๆŸฅๅค„ๅ„็ฑปไบค้€š่ฟๆณ•xxxxx่ตท\n475003 0 ไธคๅผ ็…ง็‰‡็„ถๅŽ่ฟžๆˆไธ€ๅผ ๅŠจๅ›พ็™ฝๅคฉๆ็€ๆ‰‹ๆœบๅ“ญ็Žฐๅœจๅฏน็€็”ต่„‘ๅˆๅ“ญไบ†ๆˆ‘ไธ€ๅฎšๆ˜ฏๆœ‰็—…ๅพ—ๆฒป\n475004 0 ่ฟžๅ—ๅŽฟไบบๆฐ‘ๆณ•้™ขไธ€ๅฎกไปฅ่ฏˆ้ช—็ฝชๅˆคๅค„ๅผ ๆŸๆœ‰ๆœŸๅพ’ๅˆ‘ๅ…ซๅนดๅ…ญไธชๆœˆ\n1000 Processed\n classify content\n475500 1 ๅŽๆด‹ๅ ‚ๆขฆๅฆ†:ไบฒ็ˆฑ็š„ไผšๅ‘˜๏ผŒๆ‚จๅฅฝ๏ผŒไธบๅบ†็ฅไธ‰ๅ…ซ่Š‚ๅˆฐๆฅไน‹้™…๏ผŒๆœฌไธ“ๆŸœไปŽๅณๆ—ฅ่ตท่‡ณxๆœˆxๆ—ฅๅ…จๅœบxxxๅ‡xx...\n475501 0 ๅœจๅœฐ้“ไธŠไธ€ๅไธ‹ๆฅ็žฌ้—ดๅฐฑๅ›ฐไบ†\n475502 1 ๅผ€่ตฐๆˆ–ไธๅผ€่ตฐ่ดทๆฌพ๏ผŒๅ€Ÿๆฌพ้ขๅบฆx-xxxxไธ‡๏ผŒๅช่ฆๆกไปถ็ฌฆๅˆ๏ผŒๅฐไฝ•ไธ€ๅพ‹ๅฏไปฅ่งฃๅ†ณใ€‚ๅฐๅฟƒๆ„๏ผŒๅช่ฆไป‹็ป็š„...\n475503 0 Intelๆ‹›่˜ๅ›พๅƒๅฎžไน ็”Ÿ\n475504 0 ไบš้ฉฌ้€Šๅๆงฝ็พŽๅ›ฝๆ— ไบบๆœบ้™ๅˆถ๏ผš่ฟ™ๆ˜ฏๅบŸ็บธๆˆ‘ไปฌไธ่ฆ\n1000 Processed\n classify content\n476000 0 ไธญๅคฎๆฐ”่ฑกๅฐ7ๆœˆ10ๆ—ฅ06ๆ—ถๅ‘ๅธƒๅฐ้ฃŽ็บข่‰ฒ้ข„่ญฆ๏ผš้ข„่ฎก\n476001 0 ๅ‘็Žฐไบ†ไธ€ไธชๆ‰‹ๆœบ้‡Œๆฒกๆœ‰ๅธ่ฝฝ็š„ๆ—ฅ่ฎฐ\n476002 0 tm็š„ไธญๅˆๆขไฟ้™ฉไธไนŸไธ้€š็Ÿฅๆˆ‘\n476003 0 ๅˆ†ไบซๅ›พ็‰‡2015่ฅฟ่—ๆ—…ๆธธๅ›žๆฅ\n476004 0 NAVERONSTYLEๅพ่ณขๅ€‹ไบบ้ ๅ‘Š\n1000 Processed\n classify content\n476500 1 ใ€ๆ€่ทฏ้€šใ€‘ไธƒๅนด็บงๆ•ฐ ๅญฆๆธ… ๅŒ—ๅฅฅ-ๆ•ฐ็ญ๏ผšๅŒๆญฅ้‡้šพ็‚นๆ–นๆณ•ๆ€ป็ป“๏ผŒๆŽŒๆกๅฅฅ-ๆ•ฐๅ’Œๅ-ๆ กไธ“้ข˜๏ผ›ๅŸนไผ˜็ญ๏ผšๅทฉ...\n476501 0 ๆœจๅ‡ณๅญ่ฎพ่ฎก็”ฑๆ•™่‚ฒ้ƒจๆ”พๅ›žๅŽŸๅค„\n476502 1 //่ฝฌ่‡ชxxxxxxxx๏ผšไธญๅ›ฝไบบๅฏฟ้กถ้ขๅˆฉ็އx.xxx้‘ซๅฆ‚ๆ„ไบงๅ“้ฉฌไธŠๅœๅ”ฎ๏ผŒๆœ‰ๆ„่€…็ซ้€Ÿ่”็ณป ็ซ้€Ÿ...\n476503 0 ไธ‹ๅ›พๆน–ๅ—ๅšไบ‘ๆ–ฐๆๆ–™่‚กไปฝๆœ‰้™ๅ…ฌๅธ็”Ÿไบง่ฝฆ้—ด\n476504 0 ใ€Žๆ˜ฏๅŒป้™ขใ€Œๅฎณใ€ๆญปไบ†ๅพ—่‚บ็‚Ž็š„ๅฐ็”ทๅญฉๅ—\n1000 Processed\n classify content\n477000 0 ๅฏไธœๆฐ”่ฑกๅฐ7ๆœˆ27ๆ—ฅไธ‹ๅˆๅ‘ๅธƒ๏ผšไปŠๅคฉๅคœ้‡Œๅคšไบ‘\n477001 0 ๅฎƒไผšๅคง่จ€ไธๆƒญ๏ผšๅ‘ๅ†…ๅซ็”Ÿๆ‰ๅฝ’ๆˆ‘็ฎก\n477002 0 2ใ€ๅ†ๆ…ขๆ…ขๅœฐๅŠ ๅ…ฅ้ฒœๅฅถๅ’Œ่œ‚่œœ\n477003 0 ่‡ชๅทฑไนŸๆ˜ฏ้ป˜้ป˜ๅœจ็”ต่„‘ๅ‰็ƒญๆณช็›ˆ็œถ\n477004 0 ็พŽๅ…ฐๅŒบๆณ•้™ขๅฏนๆญคๆกˆไฝœๅ‡บไธ€ๅฎกๅˆคๅ†ณ\n1000 
Processed\n classify content\n477500 0 ๅ‚ๆ™š่ฝฌ็š„้‚ฃๆกstrongfemalecharacter็š„ๆ–‡็ซ ๅผ•ๅ‡บไบ†ๅฅฝๅคšๆœ‰่ถฃ็š„่ง‚็‚น\n477501 0 ไปปไฝ•ไบบไธๅพ—ไนฐๅ–ใ€ๅ•†ไธšไฝฟ็”จๅธฆๆœ‰่ฏฅLOGO็š„็”Ÿไบงๆๆ–™\n477502 0 ็›ๅŸŽไธ€่€ๅคง็ˆท็Œฅไบตๅคšๅๅฅณ็ซฅ\n477503 0 ใ€ใ€Œๅฎๅฎ้œ€่ฆ้ขๅค–่กฅๅ……็”ต่งฃๆฐดๅ—\n477504 0 ๅŒ…ๆ‹ฌ้พ™ๅ›พๆธธๆˆCEOๆจๅœฃ่พ‰ใ€่…พ่ฎฏQQๆต่งˆๅ™จ้ซ˜็บงไบงๅ“็ป็†ๆจไธ‰้‡‘ๅœจๅ†…็š„ๅ‚ไผšๅ˜‰ๅฎพๅ‘่กจไบ†ๅฏนHTML5ๆธธ...\n1000 Processed\n classify content\n478000 0 ไธบไป€ไนˆ็ˆฑๅฅ‡่‰บ็š„ๅฎžไน ๅŒป็”Ÿๆ ผ่•พๅ…จ้ƒฝๆฒกไบ†\n478001 1 ๅฐ‘้€ๅคšๅฐ‘ใ€‚ไผ˜ๆƒ ไธ‰๏ผšๆปกxxx้ข่†œ็ฒ‰ๅ…่ดน้€๏ผŒๆปกxxxๆกถ่ฃ…ๆตทๆพกๅ…่ดน้€ใ€‚ ๆƒณ็Ÿฅ...\n478002 0 ๅฝ“ๅˆๆˆ‘่ฏทๅธ‚ๆ”ฟๅบœๅ–ๆถˆๅฎถไบบๅฏน174ๅŒป้™ข็š„ๅๅบ”ไธ€ๆ ท\n478003 0 ไบบ็”Ÿไธไธ€ๅฎšๅช่ฟฝๆฑ‚็ป“ๆžœ่ฟ‡็จ‹ไนŸๅพˆ้‡่ฆ\n478004 0 ๅฟƒ็œŸ็š„ๅฅฝ็ดฏ็”ต่„‘ไธŠ็š„ๆกฃๆกˆ่ขซๆˆ‘ๅˆ ไบ†\n1000 Processed\n classify content\n478500 0 ๅคๆ—ฅ็‚Ž็‚Žไบบไปฌไธบไบ†้˜ฒๆ™’ไนŸๆ˜ฏ่›ฎๆ‹ผ็š„ๅ•Š\n478501 1 xๆœˆxๆ—ฅๆ‚จๆœ‰ๆดปๅŠจๅ—๏ผŸ/:?/:?ๅฆ‚ๆžœๆฒกๆœ‰๏ผŒๅฟซๅฟซๆฅๆ™ฏไผŠๅๅฆ†็œ‹็œ‹ๅง๏ผ/:jj/:jj/:jj้€‰่ดญ...\n478502 0 ๆˆ‘็š„ๆ‰‹ๆœบ่ขซๆฒกๆ”ถไบ†็”ต่„‘ๅไบ†่ฟ™ๅ‡ ๅคฉ้ƒฝๆฒกไธŠ็ฝ‘\n478503 0 ๆต่กŒ่ฟ‡็ˆธ็ˆธๅŽปๅ“ชๅ„ฟ้ฃžๆœบๅŽปๅ“ชๅ„ฟๆ—ถ้—ดๅŽปๅ“ชๅ„ฟๅฆ‚ๅ†ไธไฟๅ…ปไธๆ‰“ๆ‰ฎๅฐฑ่ฏฅๆต่กŒ่€ๅ…ฌๅŽปๅ“ชๅ„ฟไบ†\n478504 0 ็›‘็ฎกๆœบๆž„็š„ๆŸฅๅค„ๆœ‰้›ทๅฃฐๅคง้›จ็‚นๅฐไน‹ๅซŒ็–‘\n1000 Processed\n classify content\n479000 0 ไธ€ๅฎถๅไธบBelgocontrol็š„ๆฏ”ๅˆฉๆ—ถๅ…ฌๅธๅฎฃๅธƒ\n479001 0 ๅฌ่ฏด็ฝ‘ไธŠ้›ถๅ”ฎๅทจๅคดไบš้ฉฌ้€ŠไนŸๅฐ†ๅœจๆœชๆฅไบ”ๅนดๅฎž็Žฐๆ— ไบบๆœบ้€ๅฟซ้€’ๅ“Ÿ\n479002 0 ไนŸๆญฃๆ˜ฏไผ ่ฏดไธญ็š„Surfaceๆ‰‹ๆœบ\n479003 0 ็ฌฌไธ€ๆฌกๅƒ้ฅญๆŠŠๆ‰‹ๆœบไธขๅœจๆกŒๅญไธŠ\n479004 0 ้ข่†œไธๅฏๆ–ญ/ๅ็ฌ‘/ๅ็ฌ‘็กๅ‰ๆ•ท้ข่†œ\n1000 Processed\n classify content\n479500 1 ๅ–œ่ฟŽไธ‰ๅ…ซๅฅณไบบ่Š‚๏ผ้Ÿตๅงฟ็พŽๅฎน็‰นๆƒ x.xๆŠ˜๏ผŒๆ•ฐ้‡ๆœ‰้™๏ผŒ่ตถ็ดงๆŠข่ดญใ€‚่ฎฉๆˆ‘ไปฌไธ€่ตทๅšไธช็ˆฑ่‡ชๅทฑ็š„้ญ…ๅŠ›ๅฅณไบบใ€‚ไธ‰...\n479501 1 ็ป็†ๆ‚จๅฅฝ:ๅคฉๆดฅๅธ‚ๅ…จ้€š้’ข็ฎกๅ…ฌๅธๅ‘ๆ‚จ่‡ดๆ•ฌ:ๆˆ‘ๅ…ฌๅธไธ“ไธš็”Ÿไบง.่žบๆ—‹้’ข็ฎก.้˜ฒ่…ไฟๆธฉ้’ข็ฎก.xpe้’ข็ฎก.ๆฌข...\n479502 0 ่€Œไธ”ๆ˜ฏๆš—ๅœฐๅ‹พ็ป“็พŽๅ›ฝๆŽ ๅคบไธญๅ›ฝๅคง้™†็™พๅง“\n479503 0 
ๆฑ‚ไธชๆฑŸ่‹็œ่Œƒๅ›ดๅ†…็š„้ ่ฐฑๅฎ ็‰ฉๅŒป้™ข\n479504 0 ๅ‡ๅฆ‚ๆœ‰ไธ€ๅคฉๅ›ฝๅฎถๆ”ฟๅบœๅ‡บไธ€ๅฐๆ”ฟ็ญ–่ฏดๆ€ไบบไธ็Šฏๆณ•\n1000 Processed\n classify content\n480000 0 ไปŠๅคฉไปŽๅŒๆตŽๅŒป้™ข้‚ฃ่พน็š„่‰ณ้˜ณๅคฉๅผ€ๅˆฐไธ‡็ง‘ๅŸŽ\n480001 0 xใ€Steamๅนณๅฐxxxxๅนด็‹‚ๆžxxไบฟ็พŽๅ…ƒG่ƒ–็ฌ‘ๅผ€่Šฑ\n480002 1 ๅณๆ—ฅ่ตท๏ผšๆฑŸๆนพๅ‰ไนฐ็››็บข่ฑ†ๅฑ…ๅฎถไธ“ๅ–ๅบ—x?xๅฆ‡ๅฅณ่Š‚็‰นๅ–ๆดปๅŠจๅ…จๅœบxๆŠ˜่ตทx?xๆ—ฅๆญขใ€‚ๆœŸๅพ…ๆ‚จ็š„ๅ…‰ไธด๏ผŒๆ„ฟๆ‚จ...\n480003 0 ไปŠๆ™šๆฅๅˆฐไบซ่ช‰ๅ…จ็พŽ็š„SoulFood้คๅŽ…Sylvia's\n480004 1 ่ฃ•ๆณฐๆฑฝ้…ๅ•†่กŒๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆœฌ่กŒไธป่ฅๅ„็ง้ซ˜ไธญๆกฃ๏ผŒๆ–ฐ็บฏๆญฃๅŽ‚๏ผŒ้…ๅฅ—ไป˜ๅŽ‚๏ผŒ่ฟ›ๅฃๅŽŸ่ฃ…๏ผŒๆ—งๆ‹†่ฝฆไปถ๏ผŒๅนถๅŠžๅ…จ...\n1000 Processed\n classify content\n480500 0 ๆˆ‘ๅฑ…็„ถๅฎŒๅฎŒๆ•ดๆ•ด็œ‹ๅฎŒไบ†~ๆƒณ่ตท็ฌฌไธ€ๆฌก็œ‹ๆญŒๅ‰ง้ญ…ๅฝฑๆ˜ฏ้ซ˜ไธญ้Ÿณไน้‰ด่ต่ฏพไธŠ่ขซ้œ‡ๅœฐไธ€ๆ„ฃไธ€ๆ„ฃ็š„\n480501 0 ่€Œๆญคๆฌก็™พๅบฆๅฏนไบŽไธ€ๆ‰นๅธฆๆœ‰โ€œๅฎ˜็ฝ‘โ€ๅญ—ๆ ท็š„ไผไธš็ซ™่ฟ›่กŒไบ†้™ๆƒๅค„็†\n480502 0 1ๅ…ฌ้‡Œๅผ€ๆŒ–ๅŸ‹่ฎพ2ๆ น็›ดๅพ„1800ๅŽŸๆฐด็ฎก\n480503 0 ๆฏๆฏๅœจๅ…ฌไบคๆˆ–่€…ๅœฐ้“ๆˆ–่€…่ทฏไธŠ็ขฐๅˆฐ็š„่กŒไบบๅฐฑๆƒณ่ตทๆˆ‘็š„็ˆธๅฆˆ\n480504 0 ไฝ ่ถŠไธๆ•ข่ฟฝ็š„่‚ก็ฅจๅฐฑ่ถŠ่ฆ่ฟฝ\n1000 Processed\n classify content\n481000 0 ๅ‘็‰ฉไธšๅๆ˜ ๆŽจ่พž่ฏดๅธ‚ๆ”ฟไพ›ๆฐดไธๆญฃๅธธ\n481001 1 ๆ‚จๅฅฝ. 
่ˆ’้€‚ๅ กๅฅ่บซๆ™ถๅ“ๅบ—ไผš็ฑ้กพ้—ฎๅฐๆฝ˜๏ผŒ่ฏš้‚€ๆ‚จๆฅๅ‚ๅŠ ๆœ€ๅŽไธ€ๆœŸVIPไผšๅ‘˜ๅ›ข่ดญๆดปๅŠจใ€ๅ‰xxๅๅฏๅ‡ญ้ข„...\n481002 0 ๅญๅฎซๅ†…่†œ้‡Œ็˜€่ก€่ถŠ็งฏ่ถŠๅคš็ป“ๆˆ่ก€ๅ—\n481003 0 ๅƒ่œ‚่œœ่›‹็ณ•ๅฐฑไผšๅ™Žไฝๆ‰“ๅ—ไฝ“่ดจ\n481004 0 ไธ่ฆๅชไผšๅœจ็”ต่„‘ใ€็”ต่ง†ๅ‰็พกๆ…•้‚ฃไบ›่‡ช็”ฑ่ˆžๅŠจ็š„ไบบ\n1000 Processed\n classify content\n481500 0 ่€Œไธ”้žๅธธๅฎนๆ˜“ๅฐฑ่ขซ่ฏฌๅ‘Šๅผบๅฅธใ€ๅผ•ๅ‘ๆƒ…ๆ€็ญ‰ๅˆ‘ไบ‹ๆกˆไปถ\n481501 0 8ๆœˆ่€ƒ็”Ÿไธ€ๅฎšๆŠŠ7ๆœˆ็š„ๅ‡ ไธช็ปๅ…ธ้ข˜็›ฎ้ƒฝๅ‡†ๅค‡ไธ‹\n481502 0 21ๆ—ฅๅˆไปฅ7ๆฏ”0่ฎคไธบๅŒๆ€งๆ‹ไผดไพฃ่‡ณๅฐ‘ๅบ”ๅฝ“่Žทๅพ—ๆฐ‘ไบ‹ไผดไพฃๆƒ\n481503 0 ๆ— ่ต–ๆฑ‰็™พๅบฆไบ‘็ฝ‘็›˜่ต„ๆบ้ซ˜ๆธ…้“พๆŽฅ\n481504 0 ็ซ‹ๅณๅฏนๆ–ฝๅทฅๅŒบๆก‚่Šฑใ€้“ถๆใ€็บขๅถ็Ÿณๆฅ ่ฟ›่กŒไบ†็งปๆคไฝœไธš\n1000 Processed\n classify content\n482000 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜๏ผšโ€œ็พŽไธฝๅฅณไบบ่Š‚๏ผŒๅฎžๆƒ ็œ‹ๅพ—่งโ€ๅ›ฝๅคง่ฏๆˆฟไบŽxๆœˆxๆ—ฅ๏ฝžxๆœˆxๆ—ฅๆŽจๅ‡บไผ˜ๆƒ ๅคง่ฎฉๅˆฉๆดปๅŠจใ€‚ ...\n482001 1 ่‰บๆœ่‰บๅค•ๅผ€ๅญฆๅ•ฆ๏ผxๆœˆxxๆญฃๅผไธŠ่ฏพ๏ผŒไธ“ไธšๅŸน่ฎญๅฐ‘ๅ„ฟ็พŽๆœฏใ€้Ÿณไนใ€่ˆž่นˆใ€้™ถ่‰บ๏ผŒๅธฆๅญฉๅญๅœจๅฟซไนไธญไบซๅ—่‰บๆœฏ...\n482002 0 ๅ…จๅธ‚ๆ–ฐๅขžๅ„็ฑปๅธ‚ๅœบไธปไฝ“16478ๆˆท\n482003 0 ่™ฝ็„ถๆœ‰ๆ—ถๅ€™่บซไฝ“ไนŸๅ‘ๅ‡บ็–พ็—…ๅˆฐๆฅ็š„ไฟกๅท\n482004 0 ๆ€ป็ฎ—่Šฑไบ†15ไธชๅฐๆ—ถๅˆฐๅคง็†ไธ‹ๅ…ณไบ†\n1000 Processed\n classify content\n482500 0 ๆ›พ็ปๅƒ่ฟ‡ไธญ่ฏ็”จ่ฟ‡ๅŒป้™ข็š„่ฏ่†้ƒฝๆฒกๆœ‰ๅฅฝ\n482501 0 ๅฏๅœจไธŠไธ‹็ญไน˜ๅ…ฌๅ…ฑๆฑฝ่ฝฆๆˆ–ๅœฐ้“ๆ—ถ\n482502 0 ๅ‡บไบŽ็งๅฟƒๆˆ‘ๆƒณไฝ ไธ่ขซๆ‰ฌๅทžๅคงๅญฆๅฝ•ๅ–่€ŒๅŽป่ฏปๆˆ้ƒฝ็†ๅทฅๅ•Š\n482503 0 ่ฎฐๅพ—ๅšๆŒ็”จๅ“ฆ่คๆžœๆžœ็š‚่ฎฉไฝ ่ถŠๆฅ่ถŠ็พŽไธฝ\n482504 0 ๆ ธๆญฆๅ™จไธ€่ˆฌ่ขซ่ฆ–็ˆฒๅฐๅ’Œๅนณ็š„ๅจ่„…\n1000 Processed\n classify content\n483000 0 ้€‚็”จไบŽๆฒป็–—่‚บ็ƒญๆ‰€่‡ด็ฒ‰ๅˆบใ€็—˜็—˜ใ€็—ค็–ฎใ€้…’็ณŸ\n483001 1 ๅคฉ่ถŠๆฑฝ้…ๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผไธป่ฅ๏ผšไธœ้ฃŽ้›ช้“้พ™็ณปๅˆ—็ญ‰ๆฑฝ้…็š„ๆ‰นๅ‘ๅ…ผ้›ถๅ”ฎ๏ผŒๆฌข่ฟŽๆฅ็”ตๅ’จ่ฏข๏ผ็”ต่ฏ๏ผšxxxxx...\n483002 0 ไธๅ‘ๆœ‹ๅ‹ๅœˆๅ‘่ฟ™้‡Œๅงๅชๆ˜ฏ้—ฎไบ†ไธ€ๅฅๆˆ‘่ฆๅŽป่‹ๅทžๅ•ฆไฝ ๆƒณๅฟตๆˆ‘ไธ\n483003 0 ๆฒณๅŒ—็œๆ—…ๆธธๅฑ€ๅ…š็ป„ไธญๅฟƒ็ป„ๅฌๅผ€ไผš่ฎฎ็ป„็ป‡โ€œไธ‰ไธฅไธ‰ๅฎžโ€ไธ“้ข˜ๆ•™่‚ฒๅญฆไน ็ ”่ฎจ\n483004 0 
ไธŽๆญคๅŒๆ—ถ้ƒŽ็‰น่ฟ˜ๅฌๅผ€ไบ†xxxxๅนดๅคงๅญฆ็”Ÿๅ…ฅ่ŒๅŸน่ฎญไผš่ฎฎ\n1000 Processed\n classify content\n483500 0 Atestpublic7755้—ฏ็บข็ฏๆฑ‚ๅŠไปท\n483501 0 ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅฝ•้Ÿณๆฒกๆณ•ไปŽๆ‰‹ๆœบ้‡Œๆžๅ‡บๆฅ\n483502 0 ็™ฝ้ฉฌๆน–็‰‡ๅŒบๅฐ†ๆˆไธบๆทฎๅฎ‰้‡่ฆ็š„็”Ÿๆ€ไผ‘้—ฒใ€ๅ…ป็”Ÿๅ…ป่€ใ€ๆ–‡ๅŒ–ๅˆ›ๆ„ๅŸบๅœฐ\n483503 0 ๅ…จๅธ‚ๅœจ็”จ็š„xxxxxๅฐ็”ตๆขฏ\n483504 0 ๅทฅ่ต„้ƒฝๅพ—่ดก็Œฎๅ‡บๆฅๆƒน~ๅœŸ่ฑชๅฟซๆฅๅŒ…ๅ…ปๆˆ‘\n1000 Processed\n classify content\n484000 0 ๅฅณๆ€งๅฟƒ่ก€็ฎก็–พ็—…็š„ๅ‘็—…็އๅ’Œ็—…ๆญป็އๅ‘ˆๆŒ็ปญๅขžๅŠ ็š„่ถ‹ๅŠฟ\n484001 0 ๆตŽๅ—ไนๅˆ›่ฃ…้ฅฐๅ…ฌๅธ่ฃ…ไฟฎไธ€ๅฃไปท\n484002 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ๅธไธ“ไธš็ป่ฅๅฐๆนพๅŽŸ่ฃ…ๅบ†้ธฟๆ…ข่ตฐไธ๏ผŒ่ดจ้‡๏ผŒไปทๆ ผ๏ผŒๅ”ฎๅŽ้ฆ–้€‰๏ผŒๆฌข่ฟŽๆฅ็”ตๅ’จ่ฏขไบ†่งฃ๏ผ็ฅไฝ ็”Ÿๆดปๆ„‰ๅฟซ๏ผŒ...\n484003 0 FM91่ทฏๅ†ต๏ผšๆญคๅˆปๅฏฅๅป“ๅ—่ทฏๅฆ‡ๅนผๅŒป้™ขๅŒๅ‘้บ’้บŸ่Šฑๅ›ญ็Žฏๅฒ›่ฝฆๆต้‡ๅคง้€š่กŒ็ผ“ๆ…ข\n484004 0 ่ฝๅนฟ่ฅฟๆก‚ๆž—ๆตทๆด‹้“ถๆๆž—ๅถ็ผค็บท\n1000 Processed\n classify content\n484500 0 ๆฑฝ่ฝฆ้ฉพ้ฉถๆŠ€ๆœฏ็š„็š„ๆ้ซ˜่ฆ้ ่‡ชๅทฑๅคšๅผ€ๅคš็ปƒ\n484501 0 ไฝ†ๅณไพฟๆ˜ฏ่ฏธๅฆ‚ๅพฎ่ฝฏXBOX่ฟ™ๆ ท็š„ไธ“ไธšๆธธๆˆไธปๆœบ\n484502 0 ๅœจๅœฐ้“ไธŠ็ขฐๅˆฐไธ€ไปถๅพˆๆธฉ้ฆจ็š„ไบ‹\n484503 0 ๆฒกๆœ‰้’ฑ่ต”ๅฟๅ—ไผค็ฌฌ8ไธชๆœˆไปฅๅŽ็š„ๅŒป็–—่ดน\n484504 0 ๅฟซๆทไบ’้€šๅŒ—ไบฌๆ˜Œๅนณ็งปๅŠจT3+ๆœบๆˆฟๆŠขๅ”ฎไธญ\n1000 Processed\n classify content\n485000 0 ๅŽไธบ๏ผšNowisP8\n485001 0 ่ฎฉไบบ่ง‰ๅพ—็”œ็พŽ้žๅธธplewhojudgemewithoutๅŸŽ\n485002 1 ้‡‘ ๆฒ™ๆ ก ๅŒบ:ๆ˜ฅ ๅญฃๅ‘จ ๆœซ็ญxๆœˆx๏ผxๆ—ฅๅผ€ ่ฏพ\n485003 0 ๅœจๆƒณๆ˜Žๅคฉ่ฆไธ่ฆ็ฉฟ่Šฑๅƒ้ชจ็š„t\n485004 0 ไธบไป€ไนˆๆˆ‘็ˆธๅฆˆ่ฟ™ไนˆๅฅฝ็š„ๅŸบๅ› ๅฝ“ๅˆไธๅฅฝๅฅฝ่€ƒ่™‘ๅ†็”Ÿๅญฉๅญๅ‘ข\n1000 Processed\n classify content\n485500 0 ๅทฅๅทGQxxxxๅฎ‰ๆฃ€ๅพ€ๅฅณไน˜ๅฎข่ƒธๅฃๆŒก\n485501 0 Troy้šๅŽๅˆ ้™คไบ†่ฟ™ๆกๆŽจๆ–‡ๅนถๅ‘ๅธƒไบ†ไธคๆกๆ–ฐ็š„ๆŽจๆ–‡ๅฎฃๅธƒ้€€ๅ‡บTwitter\n485502 0 ๆฐ‘่ˆช็ฎก็†ๅฑ€ๅ†ณๅฎš่‡ชไปŠxๆ—ถ่ตทๆš‚ๅœๅ‘็”Ÿๆทฑ่ˆช็บต็ซไบ‹ไปถ็š„ๆต™ๆฑŸๅฐๅทžๆœบๅœบ่ฟ่กŒ\n485503 0 ไธ€ๆ†โ€œๅ–ๅ„็งๆ‰‹ๆœบโ€็š„ๆ——ไธ‹ๅ็€ไธ€ไธช็ฅž็ง˜็”ทๅญ\n485504 0 5ไธชๆœˆๅŽๅ’ŒๆŽๆŸไบบ็ซ™ๅœจไฝ้™ขไธ€้ƒจๆฅผไธ‹ๆ‰พ้—จ\n1000 Processed\n classify content\n486000 0 
xxxxๅนดๅˆฐxxxxๅนด็ฌฌไธ€ๅญฃๅบฆ\n486001 0 ๅ—ไบฌๅ›ฝ้™…ๅš่งˆไธญๅฟƒๅ’Œๅ—ไบฌๅ›ฝ้™…ๅฑ•่งˆไธญๅฟƒ็ซŸ็„ถๆ˜ฏไธคไธชๅœฐๆ–น\n486002 0 ๆ•ดไธชๅฑ•่งˆไปŽๆ›ผๅ“ˆ้กฟ23่ก—ๅพ€ๅ—ๅปถไผธ่‡ณๅ—่ก—ๆตทๆธฏ\n486003 0 ๅœจๅ—ไบฌๅœฐ้“้‡Œ่กŒไนž็š„่Œไธšไนž่ฎจ่€…\n486004 0 80%่‡ณ90%็š„่‚บ็™Œๆ˜ฏ็”ฑๅธ็ƒŸๅผ•่ตท\n1000 Processed\n classify content\n486500 1 ใ€็Žซ็ณๅ‡ฏxๆœˆไฟƒ้”€ใ€‘ๅฅฝๆถˆๆฏ๏ผŒxๆœˆVIP่ฎขๅ•็ฆๅˆฉๆฅๅ•ฆ๏ผๅฅฝ็คผ้€ไธๅฎŒ๏ผๆ”นๅ˜ไปŽ็พŽไธฝๅผ€ๅง‹๏ผๅคŸไนฐ็Žซ็ณๅ‡ฏไปปๆ„...\n486501 0 ไธด็กๅ‰ๅ’Œ่ˆๅ‹่ฏดๆˆ‘ไธญๅˆไธŠ็ญ็š„ๆ—ถๅ€™ๅœจๆƒณๅ†่ฟ‡ๅทฎไธๅคšไธ€ไธชๆœˆๆˆ‘ไปฌๅฐฑ่ฎค่ฏ†ไนๅนดไบ†่ˆๅ‹ๆƒณไบ†ไธ‹่ฏดๅบ”่ฏฅๆ˜ฏๅๅนดโ€ฆๆˆ‘...\n486502 0 ่ง‰ๅพ—ๅพˆ้œ‡ๆƒŠโ€ฆๅคœๆทฑไบบ้™็š„ๆ—ถๅ€™โ€ฆไผšๅพˆๆƒณๆญป\n486503 0 ๅ’ฑไปฌๅŽไธบๆ‰‹ๆœบๆœ‰Mate7้’ๆ˜ฅ็‰ˆ่ฟ™ๆฌพๆ‰‹ๆœบๅ—\n486504 0 WIN7/8ๆญฃ็‰ˆ็š„ๆœ€ๅฅฝ่ฟ˜ๆ˜ฏ็ญ‰ๅพฎ่ฝฏๆ›ดๆ–ฐๆฏ”่พƒๅฅฝ\n1000 Processed\n classify content\n487000 0 ๆœฌๅ‘จๅ…ญๆฃฎๅพทๅ…ฐๅ›ฝ้™…่ดธๆ˜“้›†ๅ›ข่”ๅˆๅŒ—้’ๆŠฅๅœจๅŽ่”้กบไน‰้‡‘่ก—่ดญ็‰ฉไธญๅฟƒไธพๅŠžโ€œ่ˆžไธŽไผฆๆฏ”่ก—่ˆž็ง€โ€\n487001 0 xxxxๅนด้•ฟ่‘›่ฝฉ่พ•ๆ‘้•‡้“ถ่กŒๆ‹›่˜ๅ…ฌๅ‘Š\n487002 0 ๅพžGoogleๅปฃๅ‘ŠๆŠ•ๆ”พๅฏ็žง่ฆ‹็ซฏๅ€ช\n487003 0 ่ญฌๅฆ‚ๅพฎ่ฝฏOfficexxxไธŽGoogleApps็ญ‰็ญ‰\n487004 0 ๆœ€่ตš้’ฑ็š„ๅ›ฝไบงๆœบไธๆ˜ฏๅฐ็ฑณๅŽไธบ\n1000 Processed\n classify content\n487500 0 ่ฆไนˆ่€ƒไบ†็ ”็ฉถ็”Ÿ่ฟ˜่ฆ่€ƒCPA็š„\n487501 0 ๆ”พๅœจ่บซๅŽๆค…ๅญไธŠ็š„่‹นๆžœxSๆ‰‹ๆœบ่ขซไบบๅทไบ†\n487502 0 xx้กนไธŽ้š”้Ÿณ้™ๅ™ช็›ธๅ…ณ็š„้›ถ้ƒจไปถๆ”น้€ \n487503 0 ๅฏนไบŽ็ฌฆๅˆๅฐ้ข่ฏ‰่ฎผ็จ‹ๅบ้€‚็”จๆกไปถ็š„ๆกˆไปถ\n487504 0 ไธ€ไธชๅฅฝ็š„็†่ดขไน ๆƒฏไธๅชๆ˜ฏไผšๆŒฃ้’ฑ\n1000 Processed\n classify content\n488000 0 ้ฆ–ๅฑ•โ€œๆ–‡ๆ˜Ž็š„็ปดๅบฆ\"ๅŒ…ๅซ19ไธ–็บชไธญๅ›ฝๆด‹้ฃŽ็”ปใ€ๅคไปฃไธญๅ›ฝๅœฐๅ›พใ€ๅฝ“ไปฃ่‰บๆœฏไธ‰้ƒจๅˆ†ๅ†…ๅฎน\n488001 0 ็™ฝๅญ็”ปๅชๅš่Šฑๅƒ้ชจไธ€ไบบ็š„ๆš–็”ท\n488002 0 ็„ถๅŽe+ๅˆฐwincciๅงๅˆๅ’ณๅˆๅ‘•\n488003 0 ๅพฎ่ฝฏๅบ”่ฏฅๅญฆไน ่‹นๆžœไปฅSurfaceๆ‰‹ๆœบๆ‹ฏๆ•‘WPๅนณๅฐ๏ผšๆœชๆฅๅพฎ่ฝฏๅบ”่ฏฅๆ”พๅผƒๆœบๆตทๆˆ˜ๆœฏๅ’ŒไธญไฝŽ็ซฏๅป‰ไปทๆ‰‹ๆœบ\n488004 0 ๆ‰พไธ€ไธชไบบไป–ๅธฆ็€ๆˆ‘ๆˆ‘ๅธฆ็€็›ธๆœบ็„ถๅŽๆˆ‘ไปฌไธ€่ตทๅŽปๆ—…ๆธธ\n1000 Processed\n classify content\n488500 0 
ๆฑŸ่‹็›ๅŸŽๅปบๆน–ๅŽฟไปฅๅคงๅญฆ็”Ÿๆ‘ๅฎ˜ไธบๆฐ‘ๆœๅŠก่€ƒๆ ธๆœบๅˆถไธบๆŠ“ๆ‰‹\n488501 0 ๅฅฝ่ฟ็›ธๅฎˆ๏ผšๆœ‹ๅ‹ๆ˜ฏๆฐธ่ฟœ็š„่ดขๅฏŒ\n488502 0 ็Ž‹ๅ…ˆ็”Ÿ่ทฏ่ฟ‡ไฝไบŽๆต™ๆฑŸๆฐธๅ˜‰ๅŽฟๆฑŸๅŒ—่ก—้“ๆฅ ๆฑŸไธญ่ทฏ7ๅท็š„72101ไฝ“ๅฝฉ้”€ๅ”ฎ็ฝ‘็‚นๆ—ถ\n488503 0 ๅ—ไบฌๅธ‚ๅ†…ๅ„ๆญฃ่ง„ๆถˆๆฏ’้คๅ…ท็”Ÿไบงไผไธšๅฐ†็ปŸไธ€ๆŽจๅนฟไฝฟ็”จ1122ๆ ‡ๅฟ—็š„ๆถˆๆฏ’้คๅ…ทๅŒ…่ฃ…\n488504 1 ๆ–ฐๅนดๅฅฝ๏ผไฟๅˆฉ.็ฝ—ๅ…ฐๅ›ฝ้™…็š„ๅฐๅงš็ป™ๆ‚จๆ‹œไธชๆ™šๅนด.้กน็›ฎ็Žฐๆ–ฐ่ดงๅŠ ๆŽจ\n1000 Processed\n classify content\n489000 1 xๅนณ็ฑณ็š„ๆˆทๅž‹ๅœจๅ”ฎ๏ผŒๆœ‰็ฒพๅ“ๆ ทๆฟ้—ดๅฑ•็คบ๏ผŒๅ‡ไปทxxxx ๆ„Ÿ่ฐขๆ‚จๅฏนๆˆ‘ๅทฅไฝœ็š„ๆ”ฏๆŒ๏ผ\n489001 0 NBAๅฒไธŠๆœ€ๅผบๅ›ฝ้™…็ƒๅ‘˜่ƒฝๅฆ็ˆ†ๆขฆไน‹้˜Ÿ\n489002 1 ๆณฐๅทž็Ÿญ ไฟก็พค ๅ‘ๅนณๅฐ๏ผŒ้™ไฝŽๅฎขๆˆทๆตๅคฑ็އ๏ผŒๅขžๅŠ ๅฎขๆˆทๆถˆ่ดนๆฌกๆ•ฐ๏ผŒๆๅ‡่ฅไธšๆฌพใ€‚ๅฆ‚๏ผšๅ•†ๅ“ๆŽจๅนฟ๏ผŒๅบ—้ขไฟƒ้”€๏ผŒ...\n489003 1 ๅ…จๅ››ๅทๆŠข็พŽ็š„๏ผŒxๆœˆx-xๆ—ฅไป…้™xๅคฉ๏ผŒๅฐๅฐๅทฅ็จ‹ๆœบ๏ผŒๆฌพๆฌพๅ‡บๅŽ‚ไปท๏ผŒๅ…จๅนดๆœ€ไฝŽไปท๏ผŒไปทไฟๅ…จๅนด๏ผŒไนฐ่ดตxxๅ€...\n489004 0 ๆœ€้ซ˜ไบบๆฐ‘ๆณ•้™ขๅ…ณไบŽไบบๆฐ‘ๆณ•้™ขไธบโ€œไธ€ๅธฆไธ€่ทฏโ€ๅปบ่ฎพๆไพ›ๅธๆณ•ๆœๅŠกๅ’Œไฟ้šœ็š„่‹ฅๅนฒๆ„่ง\n1000 Processed\n classify content\n489500 0 ๅŽ้—็—‡๏ผšๅฐ‘ๆ•ฐๆ‚ฃ่€…ๅฏ็•™ๆœ‰ๅŽ้—็—‡\n489501 0 ้˜ฟ้‡Œๅทดๅทดใ€่…พ่ฎฏใ€็™พๅบฆใ€ไบฌไธœใ€ๅฅ‡่™Ž360ใ€ๆœ็‹ใ€็ฝ‘ๆ˜“ใ€ๆ–ฐๆตชใ€ๆบ็จ‹ใ€ๆœๆˆฟ็ฝ‘ไฝๅˆ—2015ๅนด\n489502 0 ็ปดCๅฏไปฅๅธฎๅŠฉไบบไฝ“ๆ้ซ˜ๅ…็–ซๅŠ›่ฟœ็ฆป็–พ็—…\n489503 0 ไธ€ๅคงๆ—ฉ่ตถ้ฃžๆœบ็ดฏshiไบบไบ†้ƒฝ\n489504 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏไธŠๅˆๅˆš็ป™ๆ‚จๆ‰“็”ต่ฏๅพท่ฏบ็”Ÿๆดป้ฆ†็š„ๅฐ็Ž‹ใ€‚ๆˆ‘ไปฌxxx็š„ๆดปๅŠจๆ˜ฏๅ…จๅœบxๆŠ˜๏ผŒ่ฟ˜ๅฏไปฅๅ‚ๅŠ ่”ๅคŸ่ฟ”็Žฐๅ’Œ...\n1000 Processed\n classify content\n490000 0 โ‘กๅฏ็”จไธŽๅŠ ๆฒน็ซ™ใ€ๆž—ๅœบ็ญ‰็ฆ็ซใ€้˜ฒ็ซๅ•ไฝ\n490001 0 ใ€–ZUKZxใ€—็ผ”้€ Zxๅฏผ่ˆช่ดด\n490002 1 ๅ“ฅๅ“ฅไฝ ๅฅฝ๏ผŒๆˆ‘ๅซๅฐๅจŸ๏ผŒๅนฟ่ฅฟไบบ๏ผŒๅ› ไธบ็ˆถๆฏ็ฆปๅฉš๏ผŒๆˆ‘่ทŸๅฆˆๅฆˆ๏ผŒๅฆˆๅฆˆ่บซไฝ“ไธๅฅฝ๏ผŒๆˆ‘้€‰ๆ‹ฉ่พๅญฆ๏ผŒๅฌ่ฏดๅ“ฅๅ“ฅไบบๅพˆๅฅฝ...\n490003 0 ไธŽ็พŽๅ›ฝๅผๆœบๅ™จไบบๆ‰‹ๆœฏๅฎค็›ธๆŠ—่กก\n490004 1 ๅง๏ผŒๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏ็Ž‹ๅบœไบ•ๆฌง็€่Žฑไธ“ๆŸœ็š„ๅฐๅผ ๏ผŒx.xโ€”x.xๆฌง็€่ŽฑๆŸœๅฐๆปกxxxx่ฟ”xxx๏ผŒxxxx่ฟ”...\n1000 Processed\n classify content\n490500 0 
็›ฎๅ‰ๆฅ่ฏดๆˆ‘ๅฏน้œ“่™น็š„็ˆฑๅช้™ไบŽ่ฏๅฆ†ๅบ—ๅ…็จŽไนฐ่‹นๆžœๆšด่ทŒ็š„ๆฑ‡็އไฝ†\n490501 0 ๅ„ๅœฐๆ–นๆ”ฟๅบœๆ‰นๅˆฐๅ“ชๅ„ฟไธญ็ŸณๅŒ–ๅฐฑ่ƒฝๆŠŠๅ›ฝๅ››ใ€ๅ›ฝไบ”้€ๅˆฐๅ“ชๅ„ฟ\n490502 0 ๅŠ›ไบ‰8ๆœˆไปฝๆๅ‰่ถ…้ขๅฎŒๆˆๅ…จๅนด้กน็›ฎๆ‹›ๅผ•็›ฎๆ ‡ไปปๅŠก\n490503 0 ็ก้†’ไบ†ๅฐฑ็œ‹่งๆ‰‹ๆœบๆ•ดไธชๅฑๅน•ไธไบฎๆ€Žไนˆ้ƒฝๆฒกๅๅบ”ไฝ ๆ˜ฏๅ‡ๆญปๆˆ‘ๆ•ดไธชไบบๅ“ๅพ—้ƒฝๅฟซ็œŸๆญปไบ†ๅฅฝๅ—\n490504 0 0ๅ›ฝไบง็‰ˆๆ™บ่ƒฝๆ‰‹ๆœบ้ฅๆŽง็”ตๅŠจ็บธ้ฃžๆœบ่ˆช็ฉบๆจกๅž‹ๅ™จๆๆป‘็ฟ”ๆœบ\n1000 Processed\n classify content\n491000 0 ไปฅๆญคๆ‰ญ่ฝฌwin8็ป™ๅพฎ่ฝฏๅธฆๆฅ็š„้ข“ๅŠฟ\n491001 0 ็ญ‰ไบบๆ•ฃๅŽปๅŽ่ฟ˜ๅพ—ๆฅไธชๅฐ่ดฉๆ”ถๆ‘Š\n491002 0 ๆˆ‘ๆƒณ้™้™โ€ฆๅฏๆ€œ่Œ่Œๅช่ƒฝ็œ‹็€ไธ‰้’\n491003 1 ๅฐŠๆ•ฌ็š„้กพๅฎขๆ‚จๅฅฝ๏ผšๅฎๅฑ…ไนๆฌง็พŽๅฎถ็งxๆœˆx--xๅทๅ…จๅœบx.xๆŠ˜๏ผŒไป…ๆญคไธคๅคฉ๏ผŒๆฌข่ฟŽๆ–ฐ่€้กพๅฎขๅ‰ๆฅ้€‰่ดญใ€‚ๅœฐ...\n491004 0 07ๆถจๅœ159่ทŒๅœ3ไธŠๆถจๅฎถๆ•ฐ2294ๅนณ็›˜ๅฎถๆ•ฐ525ไธ‹่ทŒๅฎถๆ•ฐ61ไบ’่”็ฝ‘๏ผš38ไบ’่”็ฝ‘้‡‘่ž๏ผš300...\n1000 Processed\n classify content\n491500 1 ๅฅฝๆถˆๆฏ๏ผŒๆˆ‘ไปฌ่ถณ้ผŽ้˜๏ผŒxๆœˆxๅทๅ’Œxๅท๏ผŒๅฅณๅœŸไผšๅ‘˜xๆ‹†๏ผŒ้žไผšๅ‘˜x.xๆ‹†๏ผŒxๆœˆxๅทไผšๅ‘˜x.xๆ‹†๏ผŒ้žไผš...\n491501 0 ้€š่ฟ‡่že่ดญไผไธšๅ•†ๅŸŽ้ขๅผ€ๅฑ•ๅ…ฌๅผ€้”€ๅ”ฎ\n491502 0 ๆˆไธบๆต™ๆฑŸ็œ้ฆ–ๅฎถๅœจๆ–ฐไธ‰ๆฟๆŒ‚็‰Œ็š„ๅ›ญๅŒบ็ฑปๅ…ฌๅธๅ’Œ็ฌฌไธ€ๅฎถๅ›ฝไผ\n491503 0 ๆฑŸ่‹ๅทฅ้™ขไน‹ๅฃฐ่†œๅนฟๆ’ญๅฐ็ผ–่พ‘ๆœฌๆฅๆ— ไธ€็‰ฉ\n491504 0 ็”จๆ ธๆญฆๅ™จๆ‰่ƒฝ้˜ปๆญข็š„ๆฐ‘ๆ—ๆ˜ฏๅคš้บผ็š„ๅฏๆ€•\n1000 Processed\n classify content\n492000 0 ๆ–ฐ่ƒฝๆบๆฑฝ่ฝฆๅŽๅธ‚ๅœบๅญ˜ๅœจไฟ้™ฉ็›‘็ฎกๆกไพ‹็ผบๅคฑ\n492001 0 ไธญๅŒป่ฏ็ŽฐไปฃๅŒ–ไบบๅทฅๅˆๆˆ็†Š่ƒ†็ฒ‰ๅ–ๅพ—็ช็ ด\n492002 0 ๆฅๅˆฐๅธธ็†Ÿๅ›ฝ็จŽ็Žฐๅœบ่€ƒ่ฏ„่‹ๅทžๅธ‚ๆ–‡ๆ˜Žๅ•ไฝๆ ‡ๅ…ตๅˆ›ๅปบๅทฅไฝœ\n492003 0 ่Š้บปไฟก็”จๆฐดๅˆ†่ƒŒๅŽ็š„้ฃŽ้™ฉๅŠฟๅฟ…ไผšไผ ๅฏผ่‡ณP2P่กŒไธš\n492004 0 ๅฐ้ฃŽ??ๅค–้ขๅน็€ๅคง้›จๅˆฎ็€้ฃŽ่ทฏไธŠๅบ”่ฏฅๆฒกไป€ไนˆไบบไบ†ๆœ‰็š„ไบบไธŠ็ญ็š„่ฟ˜ๆ˜ฏไธŠ็ญ\n1000 Processed\n classify content\n492500 0 ๅพˆๆƒณ็Ÿฅ้“ๆฒกๆœ‰ไปปไฝ•้˜ฒๆ™’ๆˆ‘ๆœ€ๅŽ็š„ๆ™’ๆ–‘ไผšๅขžๅŠ ๅคšๅฐ‘\n492501 0 ๅˆšๅˆš็™พๅบฆไบ†ไธ€ไธ‹็ปˆไบŽไผšๆ‹’็ปไธๆ˜ฏๅฅฝๅ‹็š„่ฏ„่ฎบไบ†\n492502 0 ไธฅๅމๆ‰“ๅ‡ปไบ†ไธ€ๆ‰น็Žฏๅขƒ่ฟๆณ•่กŒไธบ\n492503 1 
ๅ‹ๆƒ…ๆ้†’ใ€ๆ—ฅๅ‡บๅบทๅŸŽใ€‘ๆ˜ฅ่Š‚็‰นๆƒ ไป…ๅ‰ฉxๅคฉ๏ผ่ฃธไปทxxxx/ๅนณ๏ผ้ฆ–ไป˜xไธ‡่ตท๏ผŒ่ต ้€้ข็งฏ้ซ˜่พพxxๅนณใ€‚ๆ—บ้“บ...\n492504 0 ๅœจ็™พๅบฆไธŠๆœ็ดขไบ†่‡ชๅทฑๆ›พ็ปๅนฒๆ–ฐ้—ปๅช’ไฝ“ๆ—ถๅ‘่กจ่ฟ‡็š„ๆ–ฐ้—ป็จฟไปถ\n1000 Processed\n classify content\n493000 0 ไฝ†ๆ˜ฏDCๆŽ’ๆฏ’้ข่†œๅฏไปฅ็ปŸ็ปŸๆžๅฎš่ฟ™ไบ›็šฎ่‚ค้—ฎ้ข˜็š„\n493001 0 ๅฐ†ไบบๅฎถ็š„ๆ‰‹ๆœบๆ”พ่ฟ›ไบ†่‡ชๅทฑๅฃ่ข‹\n493002 1 ๅ„ไฝๆŒ–ๆœบ่€ๆฟๅ…ƒๅฎต่Š‚ๅฟซไนใ€ๆˆ‘ๅธๆœฌๆœˆxๅทไธพ่กŒๅคงๅž‹่ฎข่ดงไผšใ€ๅฆๅค–ๆŽจ่้…ไปถ่ดญxxxxx้€xxxxๅ…ƒ็š„ๆดป...\n493003 0 HelmsBrosMercedesbenzๅŠๆ—ฅๆธธ\n493004 0 ๅŒๆ–นๆœชๆฅ3ๅนดๅ†…ๅˆฉ็”จ่‡ช่บซ่ต„ๆบๅœจ็”ตๅฝฑ็š„ๆŠ•่ต„ใ€ๅˆถไฝœใ€ๅ‘่กŒใ€ๆŽจๅนฟใ€ๆ”น็ผ–็ญ‰ๅคšไธชๆ–น้ขๅฑ•ๅผ€ๅˆไฝœ\n1000 Processed\n classify content\n493500 0 2015ๅนดๆžฃ้˜ณไบ‹ไธšๅ•ไฝ้ข่ฏ•่ฏพ็จ‹\n493501 0 ๅœจ้€Ÿ้€Ÿ็š„ๆŽๆ‰‹ๆœบ็™พๅบฆๆŸฅๆกฃๆ˜ฏไธชไป€ไนˆ็Žฉๆ„ๅ„ฟไธญๆƒŠ้†’ไน‹ๅŽ\n493502 0 ่ฟ™ไนŸๆ˜ฏ่ฟ‡ๅŽป8ไธชๆœˆๅ†…็ฉบ้—ด็ซ™่กฅ็ป™่ฎกๅˆ’้ญ้‡็š„็ฌฌไธ‰ๆฌกๅคฑ่ดฅ\n493503 0 ๆ— ไธบๅŽฟๆ˜†ๅฑฑไนกๅฑ…ๆฐ‘ๆŽๆก‚้ฆ™ๅฎถๅฎ‰่ฃ…ไบ†ๆ–ฐ็š„่‡ชๆฅๆฐด็ฎก\n493504 0 ๆต™ๆฑŸๅ—ๅฝฑๅ“ไธ่ฟ‡ๅนๅนๆปกๅ‡‰ๅฟซ\n1000 Processed\n classify content\n494000 0 ๅŒ–ๅฆ†ๅธˆjessciaๅ’Œๆˆ‘ๆฒŸ้€š้€ ๅž‹\n494001 0 ไฝ†่ƒฝๅค ็š„่ฉฑๆœ€ๅฅฝ้ƒฝไธ่ฆๆ”พๅˆฐๅ—ๅ€ๅŸŸ็ฎกๅˆถ็š„็ถฒ็ตก\n494002 0 ็œŸ็›ธๆ˜ฏๅฎถๅฑž็ป™ไป–ๅƒไบ†ๆœ‰ๆ ธ็š„่ฏๆข…\n494003 0 ็ป“ๆžœๅŒป็”Ÿๆ‹จๆ‹จๆˆ‘็š„ๅคดๅ‘ไธๅˆฐไธค็ง’ๅฐฑ่ฏด\n494004 0 ไป–้˜ปๆ‹ฆ่ญฆ่ฝฆ็ฆปๅผ€ๅนถ็ ธๅๆŒก้ฃŽ็Žป็’ƒ\n1000 Processed\n classify content\n494500 0 ๆฐธไธๆถ่จ€็›ธๅ‘/ๆฐธไธๆš—่‡ช่€ƒ้‡/ๆฐธไธๆ”พไปปไน–ๅผ /ๆฐธไธๅœๆญขๆˆ้•ฟ\n494501 0 ?ไฝœไธบไฝŽๅคดๆ—ใ€็”ต่„‘ๆ—ใ€ไน…ๅๆ—\n494502 0 ็”ฒ้ชจๆ–‡ๅƒๆœ‰้”‹ๅˆƒ็š„้•ฟๆŸ„ๅทฅๅ…ทๆˆ–ๅ…ตๅ™จ\n494503 0 ๆ‰พๆˆ‘่ฆ่ฟ™ไธชbb็š„ๆˆ‘ๅฟ˜ไบ†้ƒฝ่ฐไบ†่‡ชๅทฑๆ‰พๆˆ‘WhooๅŽๆ‹ฑ่พฐไบซ้›ช็މๅ‡็พŽ็™ฝBB้œœ้˜ฒๆ™’ๆธ…็ˆฝไธๆฒน่…ป40ml๏ฝž\n494504 0 ็ฝ‘ๅ‹็บท็บท่ฐƒไพƒโ€œๆฒกๅ›พๆฒก็œŸ็›ธโ€ใ€โ€œๅ†่ƒ–ไนŸๆฏ”ๆˆ‘็˜ฆโ€ใ€โ€œๅๆ‰‹ๆ‘ธไธๅˆฐ่‚š่„็œผไบ†ๆ˜ฏไนˆโ€\n1000 Processed\n classify content\n495000 0 โ‘ฅไธ่ฆๅ’จ่ฏขๅฎขๆœไธ็„ถๅˆๆ”นไบ†~\n495001 0 ๆ–ฐไธ€ไปฃ็š„Nexus็ณปๅˆ—ๅฐ†ไผš็”ฑๅŽไธบไปฃๅทฅ\n495002 0 
ๆฑŸ่‹ๅคงๆ„่ฅไธšๅ‘˜51ๅ…ƒๅ–ไบ†ipad2\n495003 0 P8ๅˆฐ็Žฐๅœจๅ‘ๅธƒๅฟซไฟฉๆœˆไบ†่ฟ˜ๆ˜ฏๆ˜พ็คบ็ผบ่ดง\n495004 0 ็ฌฌไธ‰ๅ๏ผšๅค„ๅฅณๅบงๅฝ“็”Ÿๆดปไธญ็š„็ป†่Š‚ๅธฆๆฅ็ฃจๅˆๅŽๆ‰ๅ‘็Žฐ\n1000 Processed\n classify content\n495500 0 ไฝ ไปฌ่ƒฝไธ่ƒฝไธ่ฆๅ†่ฏดๆˆ‘็”ต่„‘่Šฑไบ†\n495501 0 ๅœจ็ปๅކไบ†โ€œๅฐฑ้‚ฃๆ ทโ€็š„Windows8\n495502 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผ็งปๅŠจๅ…‰ๅฎฝๅธฆ็Žฐๅœจๆ˜ฏๅฆ่ฟ˜ๆœ‰ๆ„ๅ‘ๅŠž็†๏ผŸ็ŽฐๅŠž็†ๅ…ๅฎ‰่ฃ…ใ€‚่ฏฆ่ฏขxxxxxxxxxxx\n495503 0 ๆƒณ็Ÿฅ้“็™พๅบฆ็š„ๅนฟๅ‘Š็ณป็ปŸไป€ไนˆๆ—ถๅ€™ๅ˜็š„ๅฆ‚ๆญค็ฒพๅ‡†\n495504 1 ใ€ๆ™‹ๆ‹“ๆฑฝ้…ใ€‘ๆ‹›่˜็”ทๅฅณๆ™ฎๅทฅ๏ผŒxx-xxๅฒ๏ผŒๅŸบๆœฌๅทฅ่ต„xxxxๅ…ƒ/ๆœˆ+ๅŠ ็ญ+ๅ„้กน่กฅ่ดด\n1000 Processed\n classify content\n496000 0 ๅคฉๆฐ”่ฟ™ไนˆ็ƒญๆฐด็พไปฅๅŽๅฏ่ƒฝไผšๆœ‰็–ซๆƒ…\n496001 0 ไธป่ง’ๆ˜ฏ็†Š็†Š+ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ?1MOREๅคดๆˆดๅผ่€ณๆœบ\n496002 0 ่ฟ™ไธช่€ๅธˆ่ทŸxโ€”xไธชๅญฆ็”Ÿๅ‘็”Ÿ็€ไธๆญฃๅฝ“็š„ๅ…ณ็ณป\n496003 0 ไธ€ไธช็‰ฉไธšๅทฅไบบไธŠ24ๆญ‡48ๆฏๆœˆ2400ๅŠ ไฟ้™ฉ\n496004 0 ้‚ฃๆ™‚ไธ€่ทฏ่ท‘ๅ‘€่ท‘ไธ€่ทฏๆ‘”่ทคๅฐฑ็›ดๆŽฅๆปพๅ€’ๅœจ่‰ๅœฐไธŠๅ’ฏ\n1000 Processed\n classify content\n496500 0 ๆœ‰่ƒฝๅŠ›่€…ๅ—่ดฟ็š„ๅฆไธ€ๆ–น้ขไนŸๅˆ›้€ ็€ๆ›ดๅคง็š„่ดขๅฏŒ\n496501 0 ๆฏๅคฉ็ก่ง‰็š„ๆ—ถๅ€™ๆ‰‹ๆœบๅทฎไธๅคšๅชๅ‰ฉไธ‹3%็š„็”ต\n496502 1 ไผ˜ๅšๆ•™ ่‚ฒๅ…ฅ้ฉปๆ›ผๅ“ˆ้กฟๆญฅ่กŒ่ก—ๅ— ๅผ€่ฎพ่ฏพ็จ‹ๆœ‰๏ผšๅˆใ€้ซ˜ไธญๅ„็ง‘๏ผŒๆœ‰่พ… ๅฏผ็š„ๆ™š่‡ชไน ๏ผŒๅฐ‘ๅ„ฟๅ›ดๆฃ‹ใ€ไนฆๆณ•ใ€ๅฐ...\n496503 0 ไฝ ไป–ๅฆˆ็š„ไธ็”จๆŒ‡ๆœ›่Šฑ2ๅ—้’ฑๆŒ‚ๅท่ดนไบบๅฎถ็ป™ไฝ ่งฃ้‡Šไธ€ๅฐๆ—ถใ€ๅŽ้ข่ฟ˜ๆœ‰ไธ€ๅคงๅ †็—…ไบบๅ‘ข\n496504 0 ๅ–„ไบŽๆญ้œฒโ€œๆ”ฟๅบœๆŽฉ่—็œŸ็›ธโ€็š„ๅ…ฌ็Ÿฅ่ฝฎๅญๅธฆ่ทฏๅ…šๅœจๅ…จๅŠ›็ปดๆŠคไธญๅ›ฝๅธๆณ•ๅ…ฌๆญฃ\n1000 Processed\n classify content\n497000 1 ไธบ่ฝๅฎžๆ™บๆ…งไธƒๅธˆๆ–‡ไปถ๏ผŒxxxๅ›ข็”ตไฟกๅฑ€ๆ‰‹ๆœบ็‰นๆƒ ๏ผšๆฏๆœˆxๅ…ƒ้€xxxๅˆ†้’Ÿ้€š่ฏ๏ผŒ่ถ…ๅ‡บๆฏๅˆ†้’Ÿxๅˆ†ใ€‚ๆฏๆœˆx...\n497001 0 ๅ›พ3ๅฅณไบบๆˆ‘ๆœ€ๅคงๆŽจ่็š„่‡ช็™ฝ่‚ŒๅŒ–ๅฆ†ๆฐด\n497002 1 ใ€ๆจŠๆ–‡่Šฑ้ข่†œไฝ“้ชŒๅบ—ใ€‘ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๆดปๅŠจๅ…จๅœบๆปกxxxๅ…ƒ้€xxxๅ…ƒ็Žฐ้‡‘ๅˆธ๏ผŒๆ—ถ้—ด:x.x--x.xๅท๏ผŒ...\n497003 0 ๅนณๆ—ถไธ€่ˆฌ2XUไฝŽไบŽ5ๆŠ˜้ƒฝๆฏ”่พƒๅฐ‘\n497004 0 ไธ€้ฆ–่‹ฑๆ–‡ๆญŒAppletree\n1000 Processed\n classify content\n497500 0 
ๅˆšๅˆšๆ‰‹่ดฑๅŽป็œ‹ไบ†ไธ‹ๆฑŸ่‹ๅซ่ง†็š„ๅฃฎๅฟ—ๅ‡Œไบ‘\n497501 1 ๅ†œไธšxxxxxxxxxxxxxxxxxxx ๅๅญ— ้™ˆๆตท็ผ\n497502 0 ๅฐ†ๅœจxๆœˆxxๆ—ฅๆญฃๅผๅ‘ๅธƒๆ——ไธ‹ๆ–ฐ็š„ๅฐ็”ตX่ถ…็บงๆฟ\n497503 0 ๅบ—้“บๅฎขๆœไผšๅœจ48ๅฐๆ—ถๅ†…ๅฎŒๆˆ่ฟ”็Žฐ\n497504 0 ๅ…จ็ƒๆœ‰ๅ“ชไบ›ไผ˜็ง€็š„ๅปบ็ญ‘ๆ‘„ๅฝฑไฝœๅ“\n1000 Processed\n classify content\n498000 0 ไปฅๅŠๅฏนๆถ‰ๅซŒ่ฟ่ง„็š„้ซ˜้ข‘ไบคๆ˜“่ดฆๆˆทไบˆไปฅๆš‚ๅœไบคๆ˜“็š„ๅค„ๅˆ†\n498001 0 ่ญฆๅฏŸ็œผ็ฅž้‡Œ้ƒฝๆ˜ฏๆ˜Ÿๆ˜Ÿ็‚น็‚น็š„ไธœ่ฅฟ\n498002 0 ็œ‹็œ‹ๅฝ“ๅนดๅœจ็ซ็ฎญ้˜Ÿๆ•ˆๅŠ›็š„่€็Œซ่Žซๅธƒ้‡Œๅฆ‚ไปŠ็š„ๆ ทๅญ\n498003 0 ๆˆ‘ไปฌๅธธๅธธไน ๆƒฏ็ป™็ซ™ๅœจๅคช้˜ณๅบ•ไธ‹ๅฏๆ€œ็š„ๅฐ่ดฉ่ฎจไปท่ฟ˜ไปท\\n0\\tๆฏไธ€ไธช็ป†่ƒž้ƒฝๅœจ่ฏ‰่ฏด็€ๆˆ‘ๅ†…ๅฟƒ็š„ไธ่ˆ’ๆœ\\n...\n498004 0 โ€่ทฏ้ฃž่‡ชๅธๆณ•ๅฒ›ไบ‹ไปถไปฅๅŽ่ฟ™ๆ˜ฏๅˆๅ‡็บงไบ†ๆ–ฐๆŠ€่ƒฝๅ—\n1000 Processed\n classify content\n498500 0 ๆณ•ๅบญไธŠ็š„ๆณ•ๅฎ˜ๆฃ€ๅฏŸๅฎ˜ๆ˜ฏไธๆ˜ฏๅ’Œ็”ตๅฝฑ้‡Œไธ€ๆ ท้ซ˜ๅ†ท\n498501 0 ่ขซ็งฐไธบโ€œDaemonโ€็š„ๆ€ชๅ…ฝไพตๅ ไบ†ๅคงๅœฐ\n498502 1 ๆ„Ÿ่ฐขๆ‚จๅฏน้ข†็ง€ๅฅณ่ฃ…ๅคšๅนด็š„ๆ”ฏๆŒไธŽๅ…ณ็ˆฑ๏ผŒๅฅณไบบ๏ผŒ็ˆฑ่‡ชๅทฑ๏ผโ€œไธ‰ๅ…ซ้ญ…ๅŠ›ๅฅณไบบ่Š‚โ€ๅคงๅž‹ไฟƒ้”€ๆดปๅŠจไบŽxๆœˆxๆ—ฅ็ซ็ˆ†...\n498503 0 ใ€Œๆทฑๅœณ่ง‚ๆพœๆน–้ซ˜ๅฐ”ๅคซ็ƒไผšๆ‹›ๅ…จ่Œ่‹ฑ่ฏญ็ฟป่ฏ‘ๆ•ฐๅใ€\n498504 1 ๅฅณไบบ่Š‚๏ฝžไธ“ๅฑžไบŽๆ‚จ็š„่Š‚ๆ—ฅ้ฉฌไธŠๅˆฐๆฅๅ•ฆ๏ผไบŒๅบทๅ“ฅๅผŸไธบๆ‚จ็ฒพๅฟƒๅ‡†ๅค‡ไบ†้“่ฃ…๏ผŒๆœŸๅพ…่Š‚ๆ—ฅ้‡ŒไธŽๆ‚จๅ’Œๆœ‹ๅ‹ไธ€่ตทๅˆ†ไบซ่ฟ™...\n1000 Processed\n classify content\n499000 0 ๅ—ไบฌ็š„ๆ–ฐๅค–ๅท๏ผš้œๅฒๅฐผ็Ž›ยท็ƒญๅพท็Ž›ยทไผŠๆฏ”\n499001 0 ๆฑŸ่‹ๅฐๅผ•่ฟ›็š„่ฒŒไผผ่ฟ™ๅ‘จๆ—ฅไนŸ่ฆๆ’ญไบ†\n499002 0 ๆดชๆณฐๅŸบ้‡‘ๆ–ฐไธ‰ๆฟ่ดŸ่ดฃไบบๅ†ฏๅฟ—ๅ…„\n499003 0 ไธญๅ…ดๆตทๅค–ๆกฅๅคดๅ กCSL่ขซๅŽไธบๅ…จ็ฝ‘ๆฌ่ฟ\n499004 0 ่‡ชๅทฑๅ’Œๆ‰‹ๆœบ่‡ชๅธฆ็š„ๅฐๆฌงๅŠฉๆ‰‹่Šๅคฉ\n1000 Processed\n classify content\n499500 0 ๅช่ƒฝๆ˜ฏๆ–ฐ็”จๆˆท็ฌฌไธ€ๆฌก็”จ็™พๅบฆ็š„ๅฟซๆทๆ”ฏไป˜\n499501 0 ๆต™ๆฑŸๆˆ‘ๆญฆ็”Ÿ็‰ฉ็ง‘ๆŠ€่‚กไปฝๆœ‰้™ๅ…ฌๅธ2014ๅนดๅนดๅบฆๆŠฅๅ‘ŠๅŠๆ‘˜่ฆไบŽ2015ๅนด2ๆœˆ11ๆ—ฅๅœจไธญๅ›ฝ่ฏ็›‘ไผšๆŒ‡ๅฎš็š„...\n499502 0 ไธญๅฟƒไบŽxxxxๅนดๅœจไฝ•่ดค็บชๅฟตๅŒป้™ขๆŒ‚็‰Œๆˆ็ซ‹ๅœฐ็‚น่ฎพๅœจไบง็ง‘\n499503 0 
ไฝๅปบ้ƒจใ€ๅ›ฝๅฎถๆ—…ๆธธๅฑ€่ฟ‘ๆ—ฅๅ…ฌๅธƒ็ฌฌไธ‰ๆ‰นๅ…จๅ›ฝ็‰น่‰ฒๆ™ฏ่ง‚ๆ—…ๆธธๅ้•‡ๅๆ‘็คบ่Œƒๅๅ•\n499504 0 ไฝฟๅคง่„‘ๅฏน่‚ฉ้ขˆ้ƒจ็š„่‚Œ่‚‰ๆŽงๅˆถๅ่ฐƒ่ƒฝๅŠ›ไธ‹้™\n1000 Processed\n classify content\n500000 0 ๆˆ‘ๅˆ็Šฏ็ฝชๅ•ฆ??ไปปๆ€งไธ€ๆฌกไนฐไบ†ไธ€ๅŒ…16\n500001 0 ไธญๅ›ฝๆŠ—็™Œๅไผšๆˆ็ซ‹30ๅนดๆฅไธ€็›ด็ง‰ๆ‰ฟไธบ่‚ฟ็˜ค้˜ฒๆฒปไบ‹ไธšๆœๅŠกใ€ไธบๆ้ซ˜ๅ…จๆฐ‘ๅฅๅบท็ด ่ดจๆœๅŠกใ€ไธบ่‚ฟ็˜ค็ง‘ๆŠ€ๅทฅไฝœ่€…...\n500002 0 ไธŠๆฑฝ้›†ๅ›ข600104็Žฐๅœจไปทๆ ผๅœจ19\n500003 0 ไปŠๅคฉไธญๅˆไป–ไปฌๅฐ†ๅšๅฎขEasyIdol่ฟ›่กŒ็‹ฌๅฎถไธ“่ฎฟๅŠๅœจ็บฟ่ฎฟ่ฐˆ\n500004 0 ไบ‘ๅŽฟๅ…ฌๅฎ‰ๅฑ€่ฐƒๆŸฅ็กฎๅฎšไธคๆญป่€…็š†ไธบไป–ๆ€\n1000 Processed\n classify content\n500500 0 ่ฟ‡ๅบฆๆŠ“ๆŒ ๅฏๅฏผ่‡ด่ฎธๅคš็šฎ่‚ค็—…ๅŠ ้‡\n500501 1 ๆˆไธบๅฅณ็ฅžๆ˜ฏๆฏไธช็พŽไธฝๅฅณไบบ็š„็พŽๅฅฝๅฟƒ??ๆ„ฟ๏ผใ€Š็™พๆฑ‡ๅๅŸŽใ€‹๏ผ‚ไธ‰ๅ…ซ๏ผ‚็พŽไธฝๅฅณ็ฅž่Š‚้‡็ฃ…ๅ‡บๅ‡ป๏ผๅฅณ็ฅž้ฉพๅˆฐ๏ผ็™พๆฑ‡...\n500502 0 ไธŽ่ฝฆ้˜Ÿๅ…ฑ่ฅ„็››ไธพใ€€ใ€€ไผช่ฃ…่ฝฆ็š„ๆถ‚่ฃ…โ€œๅฐๅญ˜\n500503 0 ๅฏนๅฑฑไธœ17ๅธ‚้ƒจๅˆ†ๅคงๅž‹ๅ•†ๅœบใ€่ถ…ๅธ‚็ญ‰ๅผ€ๅฑ•ๆ‰ถๆขฏๅฎ‰ๅ…จๅคง่ฐƒๆŸฅ\n500504 0 ไฝœไธบ็งๅ‹ŸๅŸบ้‡‘ไธป่ฆ่ดŸ่ดฃไบบไน‹ไธ€\n1000 Processed\n classify content\n501000 0 ้€š่ฟ‡ๆ‰‹ๆœบ้“ถ่กŒ็ป™ไฝ ไปฌๅผ€้€šEๆ”ฏไป˜ไธ้œ€่ฆU็›พ\n501001 1 ่ฝฌ่‡ช:(+xxxxxxxxxxxxx) ๆŒ‚xx็ƒง็‹—ใ€‚ๆˆ่ฏญ:่ฝฌๆตทๅ›žๅคฉใ€‚ๅ…ญ่‚–:้พ™็พŠ้ฉฌ็Œด้ผ ็Œช\n501002 0 ๅŸบไบŽๅฏนๅ‡ ็‚นๆจกๅผ็š„็†่งฃ=ๆŠ•่ž่ต„ๆจชๅ‘ๆจกๅผ\n501003 0 ๆต™ๆฑŸไนŸๆ›ดๆ–ฐๆšด้›จ็บข่‰ฒ้ข„่ญฆ๏ผšไปŠๅคฉๆต™ๅŒ—ๅ’Œไธœ้ƒจๆฒฟๆตท็š„ๆน–ๅทžใ€ๅ˜‰ๅ…ดใ€ๆญๅทžไธœๅŒ—้ƒจใ€็ปๅ…ดใ€ๅฎๆณขใ€ๅฐๅทž้ƒจๅˆ†ๅœฐๅŒบ...\n501004 0 ๆƒณๆŒฃ้›ถ่Šฑ้’ฑ็š„็งไฟกๆˆ‘ๅช่ฆๆ‰‹ๆœบ็”ต่„‘ๅฐฑๅฏไปฅ\n1000 Processed\n classify content\n501500 0 ไธŠๅŠๅนดๅ…จๅŒบๆŠฅๅ‘Šๆšดๅ‘็–ซๆƒ…ๅŒๆฏ”ไธ‹้™xx\n501501 0 ๅฏๆ€œ็š„ๅปบ็ญ‘โ€ฆโ€ฆ่ขซๆ—ฅ็…งๅˆ†ๆž็š„็ป“ๆžœ\n501502 0 ่™ฝ็„ถๆˆ‘ๅทฒ็ป็œ‹่ฟ‡ๅซๆ˜Ÿๆ‹ฆๆˆช็‰ˆไบ†\n501503 1 ไบฒ็ˆฑ็š„ๅ˜‰ไบบ๏ผไธบไบ†็ญ”่ฐข่€ไผšๅ‘˜๏ผŒ็‰นๅˆซๆŽจๅ‡บ็ป™ๅŠ›ๆดปๅŠจ[้ผ“ๆŽŒ][้ผ“ๆŽŒ][้ผ“ๆŽŒ][้ผ“ๆŽŒ][้ผ“ๆŽŒ][้ผ“ๆŽŒ]...\n501504 0 ๅ› ไธบๆฏๅคฉ้ƒฝๅธฆ็€้˜ฒๆ™’้œœๅ’Œๅฐ้ป‘ไผž\n1000 Processed\n classify content\n502000 0 
ไธœ้—จๆดพๅ‡บๆ‰€็›ธๅ…ณ่ดŸ่ดฃไบบ่ฏๅฎžไธŠ่ฟฐๆˆท็ฑ่ฏๆ˜Ž็กฎไธบ่ฏฅๆดพๅ‡บๆ‰€ๅผ€ๅ…ท\n502001 1 ๆ—ฉไธŠๅฅฝ ๆˆ‘ๆ˜ฏๅšๆฌง็บธ็š„ ้œ€่ฆๆ‹›่˜ไธ€ไธชๅบ—ๅ‘˜ๅบ•่–ชxxxx ๆๆˆ็™พๅˆ†ไบ”ๆ็‚น ๆœ‰ๅ…ด่ถฃ่”็ณป\n502002 0 ๏ผšๆน–ๅŒ—โ€œๅƒไบบ็”ตๆขฏโ€ๅŽ‚ๅฎถ๏ผšๅฐ†ๆŒ‰็…ง่ฎกๅˆ’ไธŠๅธ‚\n502003 0 Itseemstheworldwasn'tquitereadyformydaddyๅŠ ๆฒนๅง่€็ˆน\n502004 0 Kenworthyๅœจ1997ๅนดๆๅ‡บ\n1000 Processed\n classify content\n502500 0 ไฝ ็š„ไธไฝœไธบ็œŸๆ˜ฏๅคŸไบ†โ€ฆโ€ฆไป…ๆญค็บชๅฟตๅณๅฐ†็ฆปๅŽป็š„VNPTโ€ฆโ€ฆ\n502501 0 ้ƒฝๆ˜ฏๆด›ๆ‰็Ÿถhighlydesirableareas\n502502 0 ๆ— ้”กๅค–็ ”ๅŸบๅœฐๆˆ‘็š„็ง’ๆ‹ไฝœๅ“\n502503 0 ๆขไป€ไนˆๆ‰‹ๆœบๅฅฝๅ‘ขไธ‰ๆ˜ŸA8้ญ…ๆ—Mx5ๅŽไธบ่ฃ่€€7\n502504 0 ไธ€ๆ”ฏBB้œœ่ฟ˜ไฝ ้“ไธฝๅฎน้ขœ\n1000 Processed\n classify content\n503000 0 ๆ€Žไนˆไธบ่ฟ™ไธชๅผบๅฅธๅคง็ˆท็š„็ฝช็Šฏๅซๅฅฝ\n503001 0 x่คช็—•ๆด—ๅ‰‚ๅฏๆฒป็ป†่Œๆ€งๆ„ŸๆŸ“็š„ๅคง็—˜็—˜\n503002 0 ไปŽ็—˜็—˜็š„็”Ÿ้•ฟๅผ€ๅง‹ๅˆฐๆถˆๅคฑๅ…จ็จ‹็ฎก็†\n503003 0 ๆœฌ้–€ๅพžไธ็ ด็Ž„ๆญฆๆดžๅฅชๅพ—้‡‘ๅ‰›ไธๅฃž้ซ”ๆฎ˜็ซ ไบ”\n503004 0 1949ๅนด4ๆœˆ23ๆ—ฅๅ—ไบฌ่งฃๆ”พๅŽ\n1000 Processed\n classify content\n503500 0 ๆˆ‘ๅœจๆ‰‹ๆœบ้…ท็‹—ๅ‘็Žฐโ€œๅคง้›„ไธถJEFFโ€็š„็ฒพๅฝฉ่กจๆผ”\n503501 0 ๅ—ไบฌๅ“่ฟ…็ฑปTAXI็š„ไธ“่ฝฆ่ฟ่กŒ\n503502 0 ็™พๅบฆๆต่งˆๅ™จ่ฟ™ไนˆๅคšไบบไธ‹่ฝฝไนŸไธๆ˜ฏๆฒกๅŽŸๅ› ็š„\n503503 0 ไธญๅ›ฝ็ŸณๅŒ–ๅ—ไบฌ็ŸณๆฒนไบŽ7ๆœˆ7ๆ—ฅๅผ€ๅฑ•ๅ…šๅ‘˜ๅญฆไน ๆดปๅŠจ\n503504 0 ๅ…ถๅฎžๅพˆ็ฎ€ๅ•~ๅช้œ€่ฆๆŒ‰็…งไปฅไธ‹ๆญฅ้ชค\n1000 Processed\n classify content\n504000 0 ๆœ€้ซ˜ไบบๆฐ‘ๆฃ€ๅฏŸ้™ขๅˆ˜ๅ–†ๅค„้•ฟไธ€่กŒไบ”ไบบๅœจๅ…ตๅ›ขๆฃ€ๅฏŸ้™ขๅฎ‹ๅพไธ“ๅง”ใ€ๆŠ€ๆœฏๅค„้ƒญ็‘žๆฐ‘ๅค„้•ฟ็š„้™ชๅŒไธ‹่Ž…ไธดๅ››ๅธˆๆฃ€ๅฏŸๅˆ†้™ข...\n504001 0 ่‹ๅทžๆ–ฐ่พพ็”ตๆ‰ถๆขฏ้ƒจไปถๆœ‰้™ๅ…ฌๅธ21\n504002 0 1986ๅนด็พŽๅ›ฝไฟ„ไบฅไฟ„ๅทž็š„ๅ…‹้‡Œๅคซๅ…ฐๅธ‚ไธบไบ†ๅˆ›้€ ไธ–็•Œ็บชๅฝ•ๆ”พ้ฃžไบ†1็™พ40ไธ‡ๅชๆฐ”็ƒ\n504003 0 IOPE็š„ๆ‰€ๆœ‰ไบงๅ“้ƒฝ็ป่ฟ‡็šฎ่‚ค็ง‘ๅฎ‰ๅ…จๆต‹่ฏ•ๅณไฝฟๆ˜ฏๆ•ๆ„Ÿ็šฎ่‚คไบฆๅฏๆ–นๅผไฝฟ็”จIOPE็š„ๅฝฉๅฆ†ๆ—ขๆ˜ฏๅŒ–ๅฆ†ๅ“ไนŸๆ˜ฏไฟๅ…ปๅ“\n504004 0 ๅบ”็”จไบŽ้‡ๅบ†ๅธ‚ๆ”ฟๅบœๆŒ‡ๅฎšๅ‘่กŒ็š„่กŒไธšๆ”ฏไป˜็ฑปICๅก\n1000 Processed\n classify content\n504500 0 ็”จๆ‹ณๆ‹ณ็ƒญๅฟƒๅ’Œๆปดๆปดๆฑ—ๆฐด็ŒฎไธŠไบ†ๅฏนๆŒ‡่ทฏ้˜Ÿๆœ€็พŽๅฅฝ็š„็ฅ็ฆ\n504501 0 
ไฝ ไปฌ่ฟ™้‡Œๆœ‰ไบบๅฌ่ฏด่ฟ‡ๅŽไธบ่ฃ่€€้ƒจ้—จ่งฃๆ•ฃ็š„ๆถˆๆฏๅ—\n504502 0 ๅœจๆต™ๆฑŸๅคงๅญฆ้™„ๅฑžๅŒป้™ข็œ‹ไธช็—…ไผฐ่ฎก็ญ‰ไบ†ๅฟซxไธชๆœˆไบ†\n504503 0 ๅ……็”ต20ๅˆ†้’ŸๅŽๆ‰‹ๆœบ็”ต้‡่ฟ…้€Ÿ็š„ไปŽ52%้™ๅˆฐ4%ๆˆ‘ๆƒณ็Ÿฅ้“ๅฎƒ่ฟ™20ๅˆ†้’Ÿ้ƒฝๅนฒไบ†ไป€ไนˆๆˆ‘ๆ“ฆ\n504504 0 ๆ”ถๆ‹พไธœ่ฅฟๆ—ถ็ฟปๅ‡บๆฅๆ’็”ต่„‘ไธŠ็ซŸ็„ถๅˆ่ƒฝ่ฏปไบ†\n1000 Processed\n classify content\n505000 1 ไปชๅผ๏ผŒๆˆ‘ไปฌ็š„ๅ…จๅŒ…่ฃ…ไฟฎๅฅ—้คโ€˜xxxโ€™ไนŸๅฐ†ๆญฃๅผ็™ป้™†ๆญฆๆฑ‰๏ผŒๅˆฐๅœบ็š„ๅ‰ไบŒๅไฝไธšไธปๅ…่ฎพ่ฎก่ดน๏ผŒๅ‰ไธ‰ๅไฝๅ…็ฎก...\n505001 0 ็œ‹ๅˆฐๅฅฝๅฃฐ้Ÿณ็ฌฌไธ€ๅญฃๆจกไปฟ้‚“ไธฝๅ›้‚ฃไธชๅง‘ๅจ˜\n505002 1 xxxxxxxxxxxxxxxxxxx ่ˆ’ๅŽ ๅทฅๅ•†้“ถ่กŒ\n505003 0 foreo่ฟ™ไธช็”ตๅŠจ็‰™ๅˆท้€ ๅž‹็•ฅ็ฅžๅฅ‡ๅ•Šxxxxxx่–ฐ่กฃ่‰็ดซ้ขœ่‰ฒๅฅฝๆฃ’\n505004 0 ไป–่ฏด้“๏ผšโ€œๅพ€ๅฑŠ็š„็พŽๅ›ฝๆ”ฟๅบœๆฒก\n1000 Processed\n classify content\n505500 0 ่ฐทๆญŒๆฒกๆœ‰ๅฐ†ๆ‰€ๆœ‰็ฒพๅŠ›ๆ”พๅœจๆ้ซ˜ๆŸ็งไบงๅ“ๆˆ–ๆœๅŠก10%็š„ๆ•ˆ็އๆ–น้ข\n505501 0 ่ฟ™ๅบงstonetownๅคง้ƒจๅˆ†ๅปบ็ญ‘ๅปบ้€ ไบŽ200ๅนดๅ‰\n505502 0 ่‹ๅทžๅธ‚ๅง‘่‹ๅŒบๆ•™่‚ฒๅ’Œไฝ“่‚ฒๅฑ€ไธ‹ๅฑžไบ‹ไธšๅ•ไฝๅฎšไบŽ2015ๅนด8ๆœˆ11ๆ—ฅ้ขๅ‘็คพไผšๅ…ฌๅผ€ๆ‹›่˜ไฝ“่‚ฒๆ•™ๅธˆ10ๅ\n505503 0 ไป–ไปฌๅฌไบ†็ป“ๆžœๅฐ่ท‘่ฟ‡ๆฅ็š„้‚ฃไธช็žฌ้—ด\n505504 0 ่ฟ™2000ๅๅ€™้€‰ไบบๆ˜ฏไปŽๅ…จๅธ‚11ไธชๅŒบ็ฌฆๅˆ้€‰ไปปๆกไปถ็š„ๅ…ฌๆฐ‘ไธญ้šๆœบๆŠฝ้€‰็š„\n1000 Processed\n classify content\n506000 0 ็”ทๅญๅผบๅฅธ9ๅฒๅฅณ็ซฅไธๆˆๅฐ†ๅ…ถๆŽจไธ‹ๆฅผๆ‘”ๆญป\n506001 0 ็ป™ๅŒปๆŠคไบบๅ‘˜ไธ€ไธชๅฎ‰ๅ…จ็š„ๅŒป็–—็Žฏๅขƒ\n506002 0 ๆฑฝ่ฝฆ้ฆ†้™ˆๅšๅฃซๆญฃๅœจ็ป™ๅคงๅฎถไป‹็ปๆฑฝ่ฝฆ็Ÿฅ่ฏ†๏ฝžไธบไบ†ๆœฌๆฌกๆฑฝ่ฝฆๆ€ปๅŠจๅ‘˜\n506003 0 ๅฎ‹ๆฐๅฎถๆ—1949ๅนดไนฐไธ‹่ฟ™้‡Œ็š„ๅบ„ๅ›ญ\n506004 0 ่ฏฅๅธ‚่ฎกๅˆ’ๅฎžๆ–ฝ3ไธชๆฃšๆˆทๅŒบๆ”น้€ ้กน็›ฎ\n1000 Processed\n classify content\n506500 0 ๆˆ‘ๆŠ•็ป™ไบ†โ€œ็ฝฎไธš้กพ้—ฎโ€”ๅญ™ๆฌฃ็ชโ€่ฟ™ไธช้€‰้กน\n506501 0 ๅŒป็”Ÿ็ญ”ๆ—ถ้—ดไธ้•ฟไฝ“้‡ไธๅ˜ไนŸไธ็ฎ—ไธๆญฃๅธธ\n506502 0 ๅฆˆๅ‚ฌ็€็ˆธ็ป™ๆˆ‘ๅŽปไนฐiPhone6\n506503 0 ใ€Œxxxไธ‡ไบบๅœจไผ ็œ‹ใ€็ˆ†็ฌ‘่™ซๅญๅˆ้›†็ˆ†็ฑณ่Šฑ่ถ…ๆฒปๆ„ˆ็ˆ†็ฌ‘ๅ…„ๅผŸ\n506504 0 ่žณ่ž‚ๆ–ฐๅจ˜ๆ‹ฟ็€้ซ˜ๅฐ”ๅคซ็ƒๆ†ไนŸๅŠ ๅ…ฅไบ†ๆฃฎๆž—ๅฎˆๅซ้˜Ÿไผ\n1000 Processed\n classify content\n507000 0 ้™•่ฅฟ19ๅฒๅฐ‘ๅฅณๅ 
ๆฅผ่บซไบกๅฝ“ๅœฐๅฎ˜ๅ‘˜ๆถ‰ๅผบๅฅธ่ขซๅœ่Œ\n507001 0 ็„ถๅŽๆœ›็€ๅคฉ้ฃžๆœบ้ฃž่ฟ‡ๅคฉ็ฉบโ€ฆ\n507002 0 ๅผ•่ตทๆญฃไฝฟ็”จ็”ตๆขฏ็š„ๆฐ‘ไผ—ไธ€้˜ตๆƒŠๆ…Œ\n507003 0 ๆปจๆตทๆ–ฐๅŒบๅˆ™ๅ ๆฎไบ†ๅคฉๆดฅๅธ‚็ฌฌไธ‰ไบงไธšๆŠ•่ต„็š„้‡ๅคดๆˆ\n507004 0 ่ฟ™ๆ˜ฏ่ฏฅ้™ข่ฟ‘ๅนดๆฅ็ฌฌ4ๆฌก่Žท็œๆณ•้™ขๅพๆ–‡ๅฅ–\n1000 Processed\n classify content\n507500 0 xxๅฅ”่…พๆธ…ๆ‰ซๆ—ฅ็ปง็ปญๆ‹‰ๅผ€ๅบๅน•๏ผšไฝ ็œ‹\n507501 0 ๅ…จๆ–ฐ็š„้•ฟๅฎ‰ๆฑฝ่ฝฆๆ ‡ๅฟ—ๅˆ›ๆ„ๆฅ่‡ชไบŽๆŠฝ่ฑก็š„็พŠ่ง’ๅฝข่ฑก\n507502 0 ๆœ‰ๆƒณๆณ•็š„่”็ณปไป˜xxxxxxxxxxx\n507503 0 ไธ€ๆ ท็š„่Œไธšๆˆ‘็š„็œŸๆญฆๅœจๅˆทๅ‰ฏๆœฌ็š„ๆ—ถๅ€™่พ“ๅ‡บๆœ€\n507504 0 ๅบทๅฉทๅฎถไบบๅŒป้™ข็ŽฐๅœจๅขžๅŠ ไบ†ๅŒป็–—้กน็›ฎ\n1000 Processed\n classify content\n508000 0 WindowsServer่ฟŽๆฅๆ–ฐ็š„Azureไบ‘ๆœๅŠก|WPC2012ๅคงไผšไธŠ\n508001 0 2015ๅนดไธ‹ๅŠๅนดๅปบ่ฎฎ๏ผšไธ€ๆ˜ฏๅŠ ๅฟซๅ›ฝไผใ€่ดข็จŽใ€้‡‘่ž้‡็‚น้ข†ๅŸŸๆ”น้ฉ\n508002 0 ๅฏไปฅๆ˜ฏ็”ท็ฅžไนŸๅฏไปฅๆ˜ฏ็”ท็ฅž็ปไป–\n508003 0 ็œŸ็›ธๅฐไบบ๏ผšไฝ ๆ€Žไนˆไธ่ฏดๅœจ่ฟ™ไน‹ๅ‰ไฝ ่ฟ˜ๅƒไบ†ไธ€็›˜ๆฒ™ๆ‹‰ๅ’Œไธ‰ๅ—่Šๅฃซ้ฆ…้ฅผ\n508004 0 ๅช’ไฝ“ๅช่ฆ้‚ฃไธช็ƒญๅบฆๅฐฑๅฅฝๅ…ถไป–ไป€ไนˆ็œŸ็›ธๅ‡็›ธ็š„้ƒฝ็‹—ๅธฆๆ‰€ไปฅไฝ ็บขๅฐฑๆ˜ฏไฝ ็š„้”…\n1000 Processed\n classify content\n508500 1 ่Šฑๆ——้“ถ่กŒๅนธ็ฆๆ—ถ่ดทไบงๅ“๏ผš ๆ™ฎ้€šๅฎขๆˆทx.x-x.xๅŽ˜ ่€ๅธˆๅŒป็”Ÿๅ…ฌๅŠกๅ‘˜x.x-x.xๅŽ˜ ๆ”พๆฌพๆ—ถ้—ด:...\n508501 0 ๅœจๅนฟๅทžๅœฐ้“้‡Œ่ฟท่ฟท็ณŠ็ณŠ็š„ๅฌๅˆฐ๏ผšๅˆ—่ฝฆๅณๅฐ†ๅˆฐ่พพๅ…ฌๅ›ญๅ‰โ€ฆโ€ฆๆˆ‘โ€ฆโ€ฆ็ฉฟ่ถŠไบ†\n508502 0 ๅˆ†ไบซๅŽไธบ20ๅคšๅนดๅ˜้ฉ็š„ๅฎž่ทต็ป้ชŒ\n508503 1 ๅ…ดไธš้“ถ่กŒ็‰นๅคงๅฅฝ่จŠๆฏ๏ผšxxxxๅนด่ตทxxxไธ‡ๅฎขๆˆท็†่ดขๅฏๅฎž่กŒ็งไบบๅฎšๅˆถxไธชๆœˆx.x%/xไธชๆœˆx.x%...\n508504 0 ๅฎ‰ๅพฝ็ฝ‘่ฎฐ่€…7ๆœˆ20ๆ—ฅไปŽๆžž้˜ณๅŽฟๆณ•้™ข่Žทๆ‚‰\n1000 Processed\n classify content\n509000 0 xxxxๅนดๆฏ•ไธšไบŽ่ฅฟๅ—่”ๅˆๅคงๅญฆ็‰ฉ็†ๅญฆ็ณป\n509001 0 โ€œๆน–ๅŒ—ๆœ€็พŽๅŸบๅฑ‚ๆณ•ๅฎ˜โ€่ฏ„้€‰ๆดปๅŠจๆญๆ™“\n509002 0 ๅƒไบ†ไผšๅ„ฟๅ“ˆๅฏ†็“œ้™ชbb็š„ๅฆˆๅ’ช่Šไบ†ไผšๅ„ฟๅคฉ\n509003 0 ไธŽๆณพ้˜ณๆณ•้™ขๅ…š็ป„ไนฆ่ฎฐใ€ไปฃ้™ข้•ฟ้™ˆๅปบๅˆฉ็ญ‰้ข†ๅฏผไบคๆตๅบง่ฐˆ\n509004 0 ๆœ‹ๅ‹ๆ”พๅ‡ๅŽปไบ†ๅฅฝๅคšๅœฐๆ–นๅชๆœ‰่‡ชๅทฑ้ป˜้ป˜ๅœฐๅœจๅฎถ้ป˜้ป˜ๅœฐ็ป™ไป–ไปฌ็‚น่ตž่ฏ„่ฎบๅฟƒ้‡Œ่ฟ˜ๆ˜ฏๆŒบ้šพๅ—็š„ๅ•ฅๆ—ถๅ€™็ญ‰ๅ’ฑๆœ‰้’ฑไบ†ไนŸ...\n1000 Processed\n 
classify content\n509500 1 ๅฎๆณขๅฎๅ…ดๆ–ฐๅฎ‡ๅฅ”้ฉฐxSๅบ—ๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆˆ‘ไปฌๆœฌ็€โ€œๆƒŸๆœ‰ๆœ€ๅฅฝโ€็š„ๅ“็‰Œ็†ๅฟต๏ผŒไปฅๅฎขๆˆท่‡ณๅฐŠใ€ไธ€่„‰็›ธๆ‰ฟ็š„ๆœ...\n509501 0 ๆˆ‘CJๅฟ™็š„ๅ’Œ้™€่žบไธ€ๆ ทๅƒ้ฅฑไบ†ๆ’‘็š„ไธบไบ†ไธ€ไธชๅทฎ่ฏ„้ชšๆ‰ฐไฝ ~็ฎ€็›ดๅ‘ตๅ‘ตไบ†\n509502 1 ๅ…ˆ็”Ÿ๏ผŒไธ‹ๅˆๅฅฝ๏ผโ€œๆฑ‰ๅ”่‰บๆœฏๆ–‡ๅŒ–ๆ‘โ€ไฝไบŽๆญฆๆ˜Œ็žๅ–ป่ทฏxxxๅท๏ผˆๅ“ๅˆ€ๆณ‰็ซ‹ไบคๆกฅๆ—๏ผ‰ใ€‚ๆ€ปไปทๅไบ”ไธ‡่ตทๅ”ฎ๏ผŒ้ข...\n509503 1 xใ€xๅฅณไบบ่Š‚๏ผŒๅšโ€œs\"ๅฅๅบทๆ€งๆ„Ÿๅฅณไบบ๏ผŒ้€็คผๅผ€ๅง‹ๅ•ฆโ€ฆ!ๆœ€ไฝŽไป…ๅˆๅˆฐ๏ผšxใ€xๆŠ˜โ€ฆ!โ€ฆโ€ฆโ€ฆ[ๅพฎ็ฌ‘][็Žซ...\n509504 0 ๆ—ฅๅ‰ๅพฎ่ฝฏCEO่จ่’‚ไบš?็บณๅพทๆ‹‰ๅœจๅพฎ่ฝฏๅ…จ็ƒๅˆไฝœไผ™ไผดๅคงไผš็š„ๅŽๅฐๆ‰ฟๅ—้‡‡่ฎฟๆ—ถ่กจๆ˜Ž\n1000 Processed\n classify content\n510000 0 ้ข„่ฎกๆœชๆฅ6ๅฐๆ—ถๅ†…ๆˆ‘ๅธ‚ๅคง้ƒจๅœฐๅŒบไปๅฐ†ๅฏ่ƒฝๅ‘็”Ÿ้›ท็”ตๆดปๅŠจ\n510001 1 ใ€ๆ’ๅคงๅพกๆ™ฏๆนพใ€‘่€ไธšไธปๆˆๅŠŸๆŽจ่่ณผๆˆฟ๏ผŒ่Žทๆˆฟๆฌพx%่ถ…ๅธ‚่ดญ็‰ฉๅกๆˆ–ไบ”ๅนด็‰ฉไธš่ดนๅฅ–ๅŠฑ๏ผŒๅ…ƒๅฎตไฝณ่Š‚ไธ‹ๅˆ็Œœ็ฏ่ฐœ๏ผŒ...\n510002 0 ๆˆไบบ็ผ“่งฃๆนฟ็–นใ€็‰›็šฎ็™ฃใ€ๅ„็ง็šฎ็‚ŽๅŠ็šฎ่‚คๅนฒ็‡ฅ\n510003 0 ็Ÿญไฟกๆๅ“็ญ‰ๆ–นๅผๆ•ฒ่ฏˆๅ‹’็ดขไป–ไบบ\n510004 0 ๅ—ไบฌๆตฆๅฃใ€ๅŸŽๅŒบๅ’ŒๆฑŸๅฎ็š„้™้›จ้‡ๅŠ ่ตทๆฅ่ถ…่ฟ‡ไบ†xxxๆฏซ็ฑณ\n1000 Processed\n classify content\n510500 0 ๆŽฅไธ‹ๆฅๅฐฑๆœŸๅพ…ๆ˜ŽๅคฉๅŽปๆต™ๆฑŸๅซ่ง†ๅฝ•่Š‚็›ฎไบ†\n510501 0 ้—ฒ็š„ๆ— ่Šๅฑ…็„ถ็”จๅฎถ้‡Œ็š„็”ต่„‘ๅฑ€ๅŸŸ็ฝ‘็›ธไบ’่ฟœ็จ‹ๅˆทๅพฎๅš\n510502 0 ๅฐ†ๆกˆไปถ็งป้€็›ธๅ…ณ่กŒๆ”ฟๆ‰งๆณ•ๆœบๅ…ณ\n510503 0 ๆฏๅคฉ้ƒฝๅฐฑๆ˜ฏ็œ‹ๆ‰‹ๆœบ็œ‹็”ต่„‘่บบๅบŠไธŠโ€ฆโ€ๅฐฑๅœจๅˆšๆ‰\n510504 0 ๅˆ˜ๅปถๆถ›ๅผ€ๅ‘ๆˆฟๅœฐไบงโ€่ดขๅฏŒไธ–ๅฎถโ€ๆฏไบ†้‚“ๅทžไธ€ๅค„ไธคๅƒๅคšๅนด็š„ๅค่ฟน\"้ญๅ†‰่กฃๅ† ๅ†ขโ€\n1000 Processed\n classify content\n511000 0 ็ญ›้€‰ๅ‡บ55ๅไบ‹่ฟน็ชๅ‡บใ€ไปฃ่กจๆ€งๅผบใ€ๅฝฑๅ“ๅŠ›ๅคง็š„้“ๅพทๅ…ธๅž‹\n511001 0 ๆˆ‘ไปŽๅพๅทžๅ็ซ่ฝฆๅšๅˆฐๅ—ไบฌๅšไบ†ๅด่ฆ5ไธชๅคšๅฐๆ—ถ\n511002 0 ๅ…จๅ›ฝๅ„ๅœฐๅˆฐๅค„ๆ—…ๆธธ็พกๆ…•ๅซ‰ๅฆ’็š„ๅŒๆ—ถ่‡ชๅทฑๅˆๆฒก่ƒฝๅŠ›ๅŽปๅพ—ๅˆฐ่€Œๆˆ‘ไปฌ็Ÿฅ้“้ ่‡ชๅทฑๅŠชๅŠ›ๅŽปๆขๅ–ๆƒณ่ฆ็š„ไธ€ๅˆ‡ไธๆ˜ฏๅชๆƒณ...\n511003 0 ๅคฉ่ต‹่ฟ˜ๆ˜ฏmiss็‚น็š„ๅ“ˆๅ“ˆๅ“ˆๆฐไผฆๅ…ฌไธพๅŠ ๆฒน\n511004 0 ๆฎ็ฝ‘ๅ‹็ˆ†ๆ–™๏ผšๆญคๆณ„้œฒไธบๅฟ—ไธน่ฅฟๅŒบ้‡‡ๆฒนๅŽ‚็ฎก่พ–ๅŒบ\n1000 
Processed\n classify content\n511500 0 ็Žฐๅœจ็”จ็€ไบš้ฉฌ้€ŠไธŠๆ–ฐไนฐ็š„็”ตๆบๆ‰“ๅผ€ไธ€็œ‹ๆ นๆœฌๆฒกไฟฎๅฅฝ\n511501 0 ็”ฑๅ…‰ๅˆๅˆ›ๆ„่ฎพ่ฎก็š„้ฉฌๅˆฐๆˆๅŠŸๆฏๅญ\n511502 0 โ€”โ€”้“ๆญ‰ๅนถไธๆ€ปๆ˜ฏไปฃ่กจๆˆ‘ๆ‰ฟ่ฎค่‡ชๅทฑ้”™ไบ†\n511503 0 โ€”โ€”็œŸ็›ธ้šพ้“ๆ˜ฏๅ› ไธบๆฒกๆœ‰ๅŽ‹ๅŠ›ๆ‰่ฟ™ๆ ท\n511504 1 ๅธ‚ไธญๅฟƒ็ป็‰ˆไธ€็บฟๆฑŸๆ™ฏ็Žฐๆˆฟxxx-xxxๅนณ็ฑณ๏ผŒๅณไนฐๅณไบคๆˆฟ๏ผŒๆธ…็›˜ๆดปๅŠจๅฏๅŠจไธญ๏ผŒๆœŸๅพ…ๆ‚จ็š„ๅ…‰ไธด๏ผŒ็”ต่ฏxx...\n1000 Processed\n classify content\n512000 0 ็„ถๅŽๆ˜Žๅคฉๅ›žๅฎถๅ•ฆ๏ฝž๏ฝž\n512001 0 ไป–ไปฌๆ‡‚ๅพ—ๅˆฉ็”จๅพˆๅคšไผ ็ปŸๅ•†ไธš่ง†้‡Žไน‹ๅค–็š„ๅทฅๅ…ทๅŽปๅˆ›้€ ๆ–ฐๅœบๆ™ฏ\n512002 0 ๅŽปๅนดๅ—ไบฌๅคงๅฑ ๆ€ๅ…ฌ็ฅญๆ—ฅๅ‰ไธ€ๅคฉ12\n512003 0 ๆต™ๆฑŸไผ ๅช’ๅญฆ้™ขๆ’ญ้ŸณไธปๆŒ่‰บๆœฏๅญฆ้™ขๆ•™ๅธˆๅˆ˜่ถ…ๆŒ‡ๅฏผ\n512004 0 ไปŠๆ—ฅไบŽๅ—ไบฌไธญๅฟƒๅคง้…’ๅบ—้…’ๅบ—ๅ…ฅไฝ\n1000 Processed\n classify content\n512500 0 ๅคงๅซยท่ดๅ…‹ๆฑ‰ๅง†ๆˆไบ†็‰‡ๆ–น็š„็ง˜ๅฏ†ๆญฆๅ™จ\n512501 0 ็ˆฑไธฝๅฐๅฑ‹EtudeHouseๅง่š•็ฌ”\n512502 0 ่ฎฐๅพ—ๆˆ‘ๅ’Œๆˆ‘่€ๅฉ†ไปŽๅ—ไบฌ่‰บๆœฏ่ฐƒๅˆฐๆฑŸๅŒ—ๅนฒๆœจๅทฅๆดปๅฝ“ๆ™š\n512503 0 ๆœฌ็ง‘ไธ‰ๆ‰นๅ…ฑๅฝ•ๅ–ๆ–ฐ็”Ÿ9960ไฝ™ไบบ\n512504 0 ่ฝฆ็‰Œๅทไธบๅ†€Gxxxx็š„ๅฐๅž‹ๆ™ฎ้€šๅฎข่ฝฆๅ‘็”Ÿ่ถ…้€Ÿ่ฟๆณ•่กŒไธบ\n1000 Processed\n classify content\n513000 0 xxxๅ…ฌๆ–คๅญ•ๅฆ‡ๅœจๆญฆๆฑ‰ๅธ‚ๅฆ‡ๅนผไฟๅฅ้™ขไบงไธ‹x\n513001 0 ๆ˜Žๅคฉ้†’ๆฅๅฐฑๅˆฐ่ฟžไบ‘ๆธฏไบ†ใ€ๅ„ไฝๆ™šๅฎ‰??\n513002 0 ่ฝฐ็‚ธๆœบ็‚ธ่ฟ‡ไผผ็š„โ€ฆโ€ฆ็ป่ฟ‡ไธ€ๆ™šไธŠ็š„ๅŠชๅŠ›\n513003 1 .ๆ‚จๅฅฝ๏ผŒๆˆ‘ๅ…ฌๅธ้•ฟๆœŸๅŠž็†ๆŠตๆŠผใ€ๆ— ๆŠตๆŠผ่ดทๆฌพ. 
ๆœๅŠก่Œƒๅ›ด๏ผšๅ„ๅคง้“ถ่กŒ่ดทๆฌพไธšๅŠก๏ผˆๅ›ฝๅฎถๅŸบๅ‡†ๅˆฉ็އ๏ผ‰.ๅคง้ขไฟก็”จๅก.\n513004 1 ้‡‘ๆกฅๅ•†ๅœบ๏ผšๅœฃ่œœ่Žฑ้›…๏ผŒ็พŽ่‚คๅฎๅŒ–ๅฆ†ๅ“ไธ“ๆŸœ๏ผŒไธ‰ๅ…ซๆ„Ÿๆฉๅ›ž้ฆˆๆดปๅŠจๅผ€ๅง‹ไบ†๏ผŒไผ˜ๆƒ ๅคšๅคš๏ผŒ็คผๅ“ๅคšๅคš๏ผŒๆฌข่ฟŽๅ…ˆๆฅๆŠข...\n1000 Processed\n classify content\n513500 0 ไป–็š„่บซไธ–ๆ˜ฏๆฒ›ๅŽฟ็š„ไธ€ไธชๅฑฑ้‡Žๆ‘ๅคซ\n513501 0 ่ฎค่ฏไฟกๆฏไธบโ€œ้˜ฟ้‡ŒๅฆˆๅฆˆๆœๅŠกๅ•†้˜ฟ้‡Œๅฆˆๅฆˆๆท˜ๆ‹ๆกฃโ€\n513502 0 ็Šฏ็ฝชๅซŒ็–‘ไบบๅบ”ๆŸๅทฒ่ขซๅˆ‘ไบ‹ๆ‹˜็•™\n513503 1 ๅนฟ่ฟ›๏ผๆ–ฐๅนด้…ฌๅฎพๆดปๅŠจxๆœˆxๅท็››ๅคงๅผ€ๅฏ๏ผๆดปๅŠจๆœŸ้—ดๅ…จๅœบ้…’ๆฐดไธ€ๅพ‹ไนฐไบŒ่ต ไธ€๏ผๅฑŠๆ—ถๆญ่ฟŽๅคง้ฉพๅ…‰ไธด๏ผ๏ผ๏ผ[้ผ“...\n513504 0 ๆœบๅ™จไบบไธ€็›ด่ฏด็”Ÿๆดปๅœจๆ–ฐๆ—ถไปฃไธ่ฆๅ€š่€ๅ–่€\n1000 Processed\n classify content\n514000 0 ๅ›ฝๆฐ‘ๆ”ฟๅบœ้ข†ๅฏผไธ‹็š„ๅ›ฝๆฐ‘้ฉๅ‘ฝๅ†›ไธŽๆ—ฅๅ†›ๆœ‰22ๆฌกๅคงๅž‹ไผšๆˆ˜ใ€1117ๆฌกๅคงๅž‹ๆˆ˜ๆ–—ใ€ๅฐๅž‹ๆˆ˜ๆ–—28931ๆฌก\n514001 0 ๅ—ไบฌๅธ‚ๆบงๆฐดๅŒบไบบๆฐ‘ๆฃ€ๅฏŸ้™ขไพๆณ•ๅฏนไบŽๅ…ƒๆ ‹็ญ‰xไบบๆ่ตทๅ…ฌ่ฏ‰\n514002 0 ๆ—ฉๅฎ‰้ฆ–ๅฐ”conๅŠ ๆฒนๆœŸๅพ…ๆ™šไธŠ่ขซๅˆทๅฑ๏ฝž\n514003 0 ๅ–œ่ฎฏ๏ผšๅ›ฝไผๆ”นๅˆถ่ฏ„ไผฐๅขžๅ€ผๆ‰€ๅพ—็จŽไธ็”จไบคไบ†\n514004 0 ๆ˜ฏๅฏนๆ— ้”กไธ€ๆฑฝ้“ธ้€ ๆœ‰้™ๅ…ฌๅธๆ„Ÿๅ…ด่ถฃ็š„็›ธๅ…ณ็ฝ‘ๅ‹่Žทๅ–ๆ— ้”กไธ€ๆฑฝ้“ธ้€ ๆœ‰้™ๅ…ฌๅธ่ต„่ฎฏ็š„้‡่ฆๅ‚่€ƒ\n1000 Processed\n classify content\n514500 0 ๆ‰‹ๆœบๅ…จ่ฃธ็€ๆฒก่ดด่†œๆฒกๅธฆๅฃณไธคๅคฉ้‡ŒไปŽๆˆ‘ๆ‰‹ไธŠๆމๅœฐไธŠๆ‘”ไบ†5ๆฌก็ซŸ็„ถ้ƒฝๆฒกๅ•ฅๅคง็ข\n514501 0 ็ปฟ่Œถไธไป…ๅฏไปฅ่พพๅˆฐๅŠ ้€Ÿ่„‚่‚ชๆถˆ่€—ๅผบๅŒ–ๅฅๅบทๅ‡่‚ฅ็š„ๆ•ˆๆžœใ€่ฟ˜ๅฏไปฅ้™ไฝŽ็™Œ็—‡ใ€่€ๅนด็—ดๅ‘†็—‡ไปฅๅŠ็ณ–ๅฐฟ็—…ใ€ๅฟƒ่ก€็ฎก...\n514502 0 ๅพฎ่ฝฏ็š„SurfacePhoneไผ ่จ€ๅ†่ตท\n514503 0 ๅฎžไน ็”Ÿ้‡Œ้ข็š„ๅคงๅ…ฌๅธ็š„ไป‹็ป็œŸๆ˜ฏๅ“็š„ๆˆ‘่™Ž่บฏไธ€้œ‡\n514504 0 ๅ่€Œๅ‹พ็ป“ๆ”ฟๅบœ็š„ๆ‰งๆณ•้ƒจ้—จไปฅๅผบๅ‡Œๅผฑ\n1000 Processed\n classify content\n515000 1 ๆ‚จๅฅฝๆˆ‘ไปฌๆ˜ฏไธ€ไธช่ฎพ่ฎก็š„ๅ›ข้˜Ÿ ไธ“ไธšไธบๆท˜ๅฎๅ„ๅคงๅ•†ๅฎถ่ฎพ่ฎกๅบ—้ข ๆ‹็…ง ่ง†้ข‘ๆ‹ๆ‘„ ๅ›ข้˜Ÿๅๅคšไธชไบบ ๆ‚จ่ฆๆ˜ฏๆœ‰...\n515001 0 ็œ‹ๆฅๆ‰‹ๆœบ่ฟ˜่ฆ็ญ‰ๅˆฐๆ˜Žๅคฉๆ‰่ƒฝไนฐไธŠ\n515002 0 ๅฑ้™ฉ่ฅฟๅฎ‰2ๅ่œ˜่››ไบบ้ฃŽไธญไฝœไธšๅคฑๆŽงๆ’žๅ‡ปๅคงๆฅผ่‡ดๆญป\n515003 0 ็„ถๅŽๆœ€็ปˆๅ‘ๆ˜Žไบ†้ฃžๆœบโ€ฆโ€ฆๅฐๆ”ปๅ„็งๅฎ 
ๆบบไป–โ€ฆโ€ฆๅ†ๅคšไนŸไธ่ฎฐๅพ—ไบ†\n515004 0 ่ญฆ่ฝฆๅˆฐ็Žฐๅœบๅฐ†่ฟ™ไนˆ็”ทๅญๅธฆ่ตฐ\n1000 Processed\n classify content\n515500 0 ่ฟž็ปญๅ››ๅ‘จๅœจๅ…จๅ›ฝ60ๅคงๅŸŽๅธ‚ๅ…จ้ขๅฏๅŠจโ€œ้‡‘่‰ฒๆ˜ŸๆœŸๅคฉ\n515501 1 ่Žทๅพ—็ฒพๅ–œ็คผ็‰ฉไธ€ไปฝ[็คผ็‰ฉ]๏ผŒๅœจx๏ฝžxๅทๆœŸ้—ดๅŠž็†vipๅกๅฏไปฅไบซๅ—xxๆŠ˜ไผ˜ๆƒ ๏ผŒ่ฟ˜่ต ้€xxxๅ…ƒๅคง็คผๅŒ…...\n515502 0 ๅพๅทžๅŠ ๅ‹’ๆฏ”่ฏด่ตฐๅฐฑ่ตฐ็š„happygohomenow\n515503 1 ใ€ๅ‘ผๅธ‚้กบๅฎ่กŒใ€‘ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผšMINIๅนดๆœซๅ†ฒ้‡้’œๆƒ ๏ผŒๆœ€ๅคงไผ˜ๆƒ ๅฏ่พพๅˆฐxไธ‡ๅ…ƒใ€‚่ฏš้‚€MINIๆญปๅ…šๅˆฐ...\n515504 1 ๆ–ฐๆ˜ฅ็‹‚ๆฌขไฝŽไปทๆฅ่ขญ๏ผŒๆ•ดไฝ“ๆฉฑๆŸœไฝŽ่‡ณxxxxๅ…ƒไธ€ๅฅ—๏ผๅผ€ๅนด็‰นๆƒ ๅฐฝๅœจๅกไธนๅˆฉๆฉฑๆŸœ่กฃๆŸœ๏ผxxxxxxxxx...\n1000 Processed\n classify content\n516000 0 ๅๅœฐ้“ๆˆ‘็š„ๅทฆๅณไธค่พนๅ„ๅ็€ไธ€ไธช็†Šๅญฉๅญ็ฎ€็›ด่ฆ็–ฏ\n516001 0 3็‰นๅˆซๆ˜ฏๅฏไปฅๅŽปๆ™ฏ็‚นๅฎ˜็ฝ‘ๆŸฅไบค้€š่ทฏ็บฟ\n516002 1 ๅฐŠ่ดต็š„ไผšๅ‘˜๏ผšๅ‰ๆž—ๆฌงไบšๅ•†้ƒฝๅฎถ็”ตใ€ๅฎถๅ…ท้ฃŽๆšด่ขญๅทๅ…จๅŸŽ๏ผŒxๆœˆxๆ—ฅ-xๆœˆxๆ—ฅๅ…จๅŸŽๅบ•ไปท็‹‚ๆฌข๏ผๅ›ข่ดญใ€ๅฅ—่ดญใ€...\n516003 0 ๅฅฝๅฃฐ้Ÿณๅ˜ๆˆๅฅฝ้ป‘ๅน•ไบ†ๅฅณๅญ็ฝ‘ๆ‹้ซ˜ๅฏŒๅธ…ๅฅณๅญ่ขซๅทๅ…ฅๆ‰ถๆขฏ่บซไบกๆŽ้’Ÿ็ก•ๅฟซไนๅคงๆœฌ่ฅ\n516004 0 1ๆฌกๅฏน็š„ๅผฅ่กฅ3ๆฌก้”™็š„่ฟ˜ๆœ‰ๅคงๅน…ๅบฆ็›ˆๅˆฉ\n1000 Processed\n classify content\n516500 0 ๅฎ่‡ด่ฟœๅœจ่…พ่ฎฏ่ง†้ข‘้™ชๆˆ‘ๅบฆ่ฟ‡ไบ†1330ๅฐๆ—ถ\n516501 0 ็ฑปไผผๅŽปๅนดๅœๆญขๆ”ฏๆŒ็š„WindowsXP\n516502 0 ๅ–œๆฌขๆ™“็š„ๆต™ๆฑŸ็ฒ‰ๅฏไปฅๅŠ ่ฟ™ไธชQQ็พคๅ“ฆ\n516503 0 ๅค–้ƒจไธ–็•Œไธญ็š„ๅฅๅบทใ€่ดขๅฏŒใ€ไบบ่„‰ๆ˜ฏๅ†…ๅฟƒไธ–็•Œ็š„ๅค–ๅœจ่กจ็Žฐ\n516504 0 ๆŠฅ่€ƒไบบๆฐ‘ๆฃ€ๅฏŸ้™ข่Œไฝ็š„ๅˆฐ้’ๆตท็œไบบๆฐ‘ๆฃ€ๅฏŸ้™ข็กฎ่ฎคๅŠ ๅˆ†\n1000 Processed\n classify content\n517000 0 ??ๆˆ‘ไธไผšๅ› ไธบไฝ ็š„่ดจ็–‘่€Œๅœๆญขๅ‰่ฟ›็š„ๆญฅไผ\n517001 0 ๅฐฑไธๅพ—ไธๆ่ตทๆฑŸ่‹ๅฎœๅ…ด่‘—ๅ็ˆฑๅ›ฝๆฐ‘ไธปไบบๅฃซใ€ๅฎžไธšๅฎถๅ‚จๅ—ๅผบ\n517002 0 ๆƒจๅ‰งไธญ็š„ไบ‹ๆ•…็”ตๆขฏไธบ่‹ๅทž็”ณ้พ™็‰Œ่‡ชๅŠจๆ‰ถๆขฏ\n517003 0 ่€ŒBalenciagaๅทฒ็ปๆŽฅ่ฟž็ฆปไปปไบ†ไธคไฝ้‡้‡็บง็š„่ฎพ่ฎกๅธˆ\n517004 0 ่ฟ…้€Ÿๆ้ซ˜ไบ†ๆฐ‘่ญฆ110ๆŽฅๅค„่ญฆไฟกๆฏๅฝ•ๅ…ฅ็š„ๅบ”็”จๆฐดๅนณ\n1000 Processed\n classify content\n517500 0 ๅœจๅŠž็†ไฟก็”จๅกๅ‰ๅ…ˆๆฅ็œ‹็œ‹ไป€ไนˆๆ˜ฏไฟก็”จๅก\n517501 1 
้‚ฏ้ƒธๅฏŒๅฎ‰ๅจœๅฎถ็บบๅผ€ๅนดๅคงๅž‹ๆดปๅŠจๅผ€ๅง‹ไบ†๏ผxๆœˆx-xๆ—ฅๅ‡ญๆญค็Ÿญไฟกๅˆฐๅบ—ๅ…่ดนๆŠ“็บขๅŒ…ใ€ๅˆฎๅฅ–๏ผŒxxx%ๆœ‰ๅฅ–๏ผๅƒ...\n517502 0 ็‰ฉไธšไนŸๅ‘ๅ…ถๅ‡บๅ…ทไบ†ๆ•ดๆ”น้€š็Ÿฅไนฆ\n517503 0 ๆฑŸ่‹ๆ˜ฏไธๆ˜ฏๅŠจไฝœๆœ€ๅฟซ็š„ไธ€ไธช็œไปฝๅ‘ข\n517504 0 ๅฏไปฅ็›ดๆŽฅ็”จ่‡ชๅทฑ็š„ๆ™บ่ƒฝๆ‰‹ๆœบ่งฃ้”ๆˆฟ้—ด้—จ\n1000 Processed\n classify content\n518000 0 ๆ‰’ๅœจ็”ต่„‘้ขๅ‰ๅˆท็€ๅพฎๅš็—ดๆฑ‰็ฌ‘\n518001 0 ๅฅฝๅƒๅพˆๅคšไบบ่ฏดๆต™ๆฑŸ็œๅ„ฟไฟ็š„ๅŒป็”Ÿๆ€ๅบฆๅทฎ้˜ฟ\n518002 0 ๆฒกๆœ‰ๅฏน้”™ๆฒกๆœ‰่ฐ่ƒฝๅ’Œ่™šๆ— ็š„ๆฐธๆ’ๅˆ†ไธ€ไธช่ƒœ่ดŸๆˆ‘ไธๅˆคๅ†ณๆˆ‘ๅชไธบๅ‘ฝ่ฟ่พฉๆŠคไธ–้—ดๅƒไธ‡็งๆฎ‹้…ท\n518003 0 ๆตทๅ—็œ้ซ˜็บงไบบๆฐ‘ๆณ•้™ขๅฌๅผ€ๅ…จ้™ขๅนฒ่ญฆๅคงไผš\n518004 0 ไธ€ไผš่ฟ˜ๅœจๅŒป้™ข้•ฟ่ตฐๅปŠ่ฝฌๅœˆ่ฟท่ทฏ\n1000 Processed\n classify content\n518500 0 ๆˆ‘็š„็”ต่„‘ไนŸๅฅฝๅƒๅˆฐไบ†ไธญๅนดไธ€ๆ ท่ฟ›ๅ…ฅ6ๆœˆไปฝ\n518501 0 ๆฎ่ฏดๅ—้€šๅธ‚ๅด‡ๅทๅŒบๅŸŽๆธฏ่Šฑ่‹‘ๆ€ไบบไบ†\n518502 0 ๅฑ•่งˆๆ—ถ้—ด๏ผš2015ๅนด8ๆœˆ10ๆ—ฅโ€”โ€”8ๆœˆ16ๆ—ฅ\n518503 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณไปฅๆฅๆœ€ๅฅฝ็š„ๅฃฐ้Ÿณๆˆ‘็š„็ง’ๆ‹ไฝœๅ“\n518504 0 ๆ”ฟๅบœ็š„ๅŠ›้‡็ป™ไบ†155ไฝ่‡ช็”ฑ\n1000 Processed\n classify content\n519000 1 ๆทฑๅœณๅบท่พ‰ๆ—…่กŒ็คพไธœ้—จ่ฅไธš้ƒจๆฌข่ฟŽๆ‚จ.ๆˆ‘ๅธไธป่ฅ:ๅ›ฝๅ†…ๅค–ๆ—…ๆธธ;ๆœบ็ฅจ\n519001 0 ็Ÿฅๅ่ฏๅˆธๅ…ฌๅธ่ฎกๅˆ’่ดขๅŠก้ƒจๆ‹›่ดขๅŠกๆ ธ็ฎ—ๅฒ—\n519002 0 ๆ— ้”กๆฎก่‘ฌ้ƒจ้—จๅทฅไฝœไบบๅ‘˜่‚–ๆ–ŒไนŸๅœจๅ…ถไธญ\n519003 0 ๅ—ไบฌๅ…จๅนด็ฒฎ้ฃŸๆ€ปไบง้‡้šไน‹่พพๅˆฐ118ไธ‡ๅจ\n519004 0 โ€ๅพ้™่•พ้Ÿฉๅฏ’ไธๆปกๅ†ปๅต็”Ÿ่‚ฒ่ง„ๅฎš\n1000 Processed\n classify content\n519500 0 ๆธฃๅƒ็ด ๆ˜Žๅคฉๅฟ…้กปไฟฎๆ‰‹ๆœบ\n519501 0 ้™คไบ†้ƒ‘ๅฐ‘็ง‹ๅ’Œๅˆ˜้’ไบ‘้€†ๅคฉ็š„ๆผ”ๆŠ€ๅค–\n519502 0 ๅธ‚ๆฐ‘ๅ’จ่ฏข๏ผšๅค–ๅœฐๅค–ๆฅๅŠกๅทฅไบบๅ‘˜ๅญๅฅณๆƒณๅœจ้“ถๅทๆœ›่ฟœๅฐๅญฆไธŠๅญฆ\n519503 1 ๆ‚จๅฅฝ๏ผŒไธ€ๆฑฝ-ๅคงไผ—ๅ…จ็ณป่ฝฆๅž‹้’œๆƒ ๅ›ข่ดญ๏ผŒๆ›ดๆœ‰ๅคš็ง้‡‘่žๆ”ฟ็ญ–ๅŠฉๆ‚จ่พพๆˆ่ดญ่ฝฆๆขฆๆƒณโ€”ๅ…จ็ณป้ฆ–ไป˜xx%่ตท๏ผŒx-x...\n519504 0 ๆฏๅคฉ้ƒฝๆ˜ฏไธ€ๅœบexcelๅคง่ฏพๆ‰€ไปฅๆˆ‘ๆฏๅคฉ้ƒฝๅทจ้ฅฟโ€ฆไฝ†ๅฆ‚ๆžœๅญฆไผš็”จ็”ต่„‘ๅผน้’ข็ดๆˆ‘่ง‰ๅพ—่ฟ˜ๆ˜ฏ่›ฎ้…ท็š„\n1000 Processed\n classify content\n520000 0 ๅฆ‚ๆžœ่ฏดๅๅบฆๆ˜ฏๆˆฟๅฑฑ็š„ๆ—…ๆธธๅœฃๅœฐใ€้•ฟ้˜ณๆ˜ฏ้ซ˜็ซฏๅŸŽๅธ‚ๅŒ–ไธญๅฟƒ\n520001 0 
ๆˆ‘ไธ€็›ดๆ‹…ๅฟงๆฒˆ้˜ณๅœฐ้“็ญ‰ๅ…ฌๅ…ฑๅœบๆ‰€็š„ๅฎ‰ๅ…จ\n520002 0 ๅ—ไบฌ่™็ซฅๆกˆๅฝ“ไบ‹ไบบ้ฆ–ๅ‘ๅฃฐ๏ผš้‚ฃๆฌก็š„็กฎๆฐ”็‹ ไบ†\n520003 0 ๆ— ้”กๅŠจ็‰ฉๅ›ญยทๅคชๆน–ๆฌขไนๅ›ญ2015ๅคๅญฃ็‹‚ๆฌขๅคœ\n520004 0 ๆ็คบXไฝ ็š„ๅพฎๅšๆ˜ต็งฐๅฐ†ไฟฎๆ”นไธบโ€œๆˆฟไบงๅ•†โ€\n1000 Processed\n classify content\n520500 0 ้บฆๅ…ˆ็”Ÿ่ฏดโ€œๆ‰พไธ€ไธชๆฒกๆœ‰็”ตๆขฏ็š„ๅœฐๆ–นๅงโ€\n520501 0 ๆ˜จๆ—ฅๅœจๅŒ—ไบฌๅธ‚้ซ˜็บงไบบๆฐ‘ๆณ•้™ขๅ†ณๅ‡บไบ†ๆ–ฐไธ€่ฝฎ่ƒœ่ดŸโ€”\n520502 0 ๅฏๅฐฑๅœจไธ‹ๅˆๅ็”ตๆขฏ็š„ๆ—ถๅ€™ๅ‰้ขๆœ‰ไธ€ๅฏนๆƒ…ไพฃ\n520503 0 ๆŒ‰ๅปบ็ญ‘้ข็งฏ็š„ๅคงๅฐไปฅๅŠๆ–นไฝ็š„ไธๅŒ\n520504 0 ่ฏฅ็”ท็ซฅ็ปๅŒป้™ขๆŠขๆ•‘ๆ— ๆ•ˆๅทฒ็ปๆญปไบก\n1000 Processed\n classify content\n521000 0 ๆปก่ถณ้‡‘่žๆถˆ่ดน่€…็š„ๆœ‰ๆ•ˆ้œ€ๆฑ‚ใ€็ปดๆŠคๆถˆ่ดน่€…ๆƒ็›Šๆ˜ฏ้‡‘่žๅทฅไฝœ็š„ๅ‡บๅ‘็‚น\n521001 0 ๅพˆๅคšไบ‹ๆƒ…ไนŸไธไผšๆ”นๅ˜ๆฏ”ๅฆ‚ๆ‰‹ๆœบ้‡Œ่ฟ˜ๅœจ็ผ“ๅญ˜็€ๆˆ‘็š„ๅฌ›ๅฌ›ๆ˜จๆ™š่บบๅœจๆˆ‘ๆ—่พน็š„็Žฐๅœจๅทฒ็ปๅ›žๅฎถ็š„ๆฒกๆœ‰็•™ไธ‹ไธ€ๅผ ๅˆ็…ง...\n521002 0 ๅ„ฟ่กŒๅƒ้‡Œๆฏๆ‹…ๅฟง้ฃžๆœบๅปถ่ฏฏไธ‰็‚นๆ‰ๅˆฐๅฎถๅฆˆๅฆˆ่ฟ˜ๅœจ็ญ‰็€ๆˆ‘็ป™ๆˆ‘ไธ‹้ฅบๅญๅƒ\n521003 0 ๅšๅŒ…ๅซ้…’ๅบ—ใ€ๅ•†ๅŠกใ€ๅ†™ๅญ—ๆฅผใ€่ฑชๅฎ…็ญ‰ไธšๆ€้ข†ๅ…ˆๆ—ถไปฃ็š„้กถ็บง็ปผๅˆไฝ“ๅˆ›ๆ–ฐ\n521004 0 ๅ†ณๅฎšไปŽ2015ๅนด7ๆœˆ1ๆ—ฅ่ตทๅฐ†็›ฑ็œ™ๅŽฟๅ›ฐๅขƒๅ„ฟ็ซฅ็บณๅ…ฅๅŸบๆœฌ็”Ÿๆดป่ดนๅ‘ๆ”พ่Œƒๅ›ด\n1000 Processed\n classify content\n521500 0 ไฝ ๅ’Œๅผ ไธฐๆฏ…่€ๅธˆๅ˜š็‘Ÿ83ๅนดๆˆ‘ๆ‰ไธคๅฒ\n521501 0 ๅŒไฝ“ๅธ†่ˆน่ฃธ่ˆน็งŸ่ตๆ˜ฏ้žๅธธๅฅฝ็š„ไธ€็งๆ–นๅผ\n521502 0 ็ซ่ฝฆ่ฟ‡ๅ—ไบฌ้•ฟๆฑŸๅคงๆกฅๅฏๆฏ”ๆฑฝ่ฝฆๅฟซๅคชๅคšไบ†\n521503 0 โ€œๅ‘ฝ่ฟๅฏนๆˆ‘ๅฆ‚ๆญคไธๅ…ฌโ€่ฏดไบบ่ฏโ€œ้€‰ๆ‹ฉ้ข˜ๅ…จ้”™ไบ†โ€\n521504 0 ไธญๅ›ฝๆœบๅ™จไบบไบงไธš่”็›Ÿๆ•ฐๆฎๆ˜พ็คบ\n1000 Processed\n classify content\n522000 0 ไธญๅคฎ็”ต่ง†ๅฐๆ— ้”กๅฝฑ่ง†ๅŸบๅœฐ็พŽ็พŽ็š„\n522001 1 ็พŽๅฎœไฝณๆฌข่ฟŽๆ‚จ.ๆœฌๅบ—ไปฅ่‰ฏๅฅฝ็š„ไฟก่ช‰ไธป่ฅ:ๆ—ฅ็”จๅ“\n522002 0 ๆณ•ๅบญไฝœๅ‡บ็š„ๅˆคๅ†ณๆ‰ไผšๆˆไธบ็œ‹ๅพ—่ง็š„ๅ…ฌๅนณๅ’Œๆญฃไน‰\n522003 0 7ๆœˆ16ๆ—ฅไธ‹ๅˆโ€œๅคฉไฝฟVCไธ‹ๅˆ่Œถโ€ๅˆ›ไธš่€…ๆŠ•่ต„ไบบๅฏนๆŽฅๆดปๅŠจ็ฌฌไบŒๆœŸๆˆๅŠŸไธพๅŠž\n522004 0 ๅœจ็™พๅบฆ็Ÿฅ้“้—ฎ็ญ”ๅนณๅฐไธŠๆถ‰ๅŠๅ…ณ้”ฎ่ฏโ€œๅธๆฏ’โ€ไบŒๅญ—็š„้—ฎ้ข˜\n1000 Processed\n classify content\n522500 0 
ๅœฐๅ€ๅœจๆˆ้ƒฝไบŒๅŒป้™ขไบงๅฆ‡ๆจ่‰ณๅฆฎ\n522501 0 ๅทฒ็ป่ทŸๆฑฝ่ฝฆไน‹ๅฎถ่ฝฆๅ•†ๅŸŽ็š„้”€ๅ”ฎ็บฆๅฅฝไบ†\n522502 0 ๅŒป่ฏ่‚กๅฐฑไธๅพ—ไธๆ้†’ไธญ็ง‘็ณปไธพ็‰Œ็š„0004ๅ›ฝๅ†œ็ง‘ๆŠ€่ฟ™ไธช่‚กไธ€็›ด่กจ็Žฐ้ƒฝไธ้”™\n522503 0 ๅฅนๅชๅฅฝ่œท็ผฉๅœจๅœฐ้“่ฝจ้“ๅ’Œ็ซ™ๅฐไน‹้—ด็š„็‹ญๅฐ็ฉบ้—ด\n522504 0 ่ฏšไฟกๅŠ V๏ผš927901230\n1000 Processed\n classify content\n523000 0 ๆŒคไธไธŠๅœฐ้“ไฝ ๆ‰ฟๆ‹…ๅพ—ไบ†ไนˆ\n523001 0 ไธญๅ›ฝไน‹ๆœ€ๅคง็›˜็‚น๏ผš็ปๅฏน็ฒพๅ“\n523002 0 ๅด่Žซๆ„ๅ˜่บซโ€œLAGYGAGAโ€\n523003 0 IKnowYouWantMeๆฌง็พŽๆต่กŒๆŒ‡ๅ—\n523004 0 ็ดซ้‡‘ไฟ้™ฉๆŸ็ณป็ปŸๆผๆดžๅคง้‡็”จๆˆทๆ•ๆ„Ÿๆ•ฐๆฎไฟกๆฏๆณ„ๆผ3\n1000 Processed\n classify content\n523500 1 ใ€ๆ€่ทฏ้€šใ€‘ๆ•™่‚ฒๅฟ…้กปไธ“ ไธšๆ€ง๏ผŒๆ€่ทฏ้€šไธ“ๆณจๆ•ฐๅญฆๆ–นๆณ•ๅŸน ่ฎญxxๅนด๏ผŒๅญฆๆ•ฐๅญฆๅฐฑๅˆฐๆ€่ทฏ้€šใ€‚่งฃๆ”พไธญ่ทฏๅ้ƒฝๅ˜‰...\n523501 0 ๅœจ่ฟ™ไธชๅ…ณ็ณปๆˆทๆจช่กŒ็š„ไธ–็•Œ้‡Œๆˆ‘ๅชๆƒณ่ฏดไธ€ไธชๅญ—F\n523502 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณๆฏๆฌก้ƒฝไผšๅ‘็ŽฐๅพˆๅคšๅฅฝๆญŒ\n523503 0 ไธๅคšๅบŸ่ฏ?v๏ผš13688474689\n523504 0 ็”š่‡ณๅฏไปฅ่‡ชๅŠจๅœจAmazonFreshไธ‹ๅ•\n1000 Processed\n classify content\n524000 0 ่‡ณไบŽ2GBRAMๅˆ™ไธ็กฎๅฎš\n524001 0 ๆ˜ฏ็”ฑๅกžๆตฆ่ทฏๆ–ฏ็š„TsikkinisArchitectureStudioๅ›ข้˜Ÿๆ‰“้€ ็š„็งไบบไฝๅฎ…้กน็›ฎ\n524002 0 ๅˆๅˆ›ๅ…ฌๅธๅšๅˆฐ10ไบฟไผฐๅ€ผ็š„3็งๆ–นๆณ•๏ฝžๆœ€่ฟ‘่€่ขซไธ€ไบ›็ฅžๅˆ›ไธšๅ…ฌๅธ็š„ๆŠฅ้“ๅˆทๅฑ\n524003 0 ็œ‹ๆฅ่ฟ™ๅ‡ ๅคฉๆŸไบ›็งๆ—็š„ๅฐๅท้ƒฝไธไผšๅ‡บๆฅไบ†ๅ‘ตๅ‘ต\n524004 0 |CBIๆธธๆˆๅคฉๅœฐ็ฝ‘๏ผšๅ”ฏไธ€็œŸๅช’ไฝ“\n1000 Processed\n classify content\n524500 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 85a775ไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n524501 0 ๆ‰‹ๆœบ็‰ˆ่ฟ่ฅ่ง„่Œƒใ€ๅค„็ฝšๅ…ฌ็คบๅ’Œ่พŸ่ฐฃไธญๅฟƒ\n524502 0 ๆธธๆˆ็š„็Žฉๆณ•ๆ˜ฏๅœจๅ…จ็ƒ้€‰ๆ‹ฉไธ€ไธชๅ‡บ็”ŸๅŒบๅŸŸ\n524503 0 ใ€Œ7208ไบบๅŒๆ—ถๅœจ็œ‹ใ€็ˆธ็ˆธๅ›žๆฅไบ†็ฌฌ2ๅญฃไธœๅŒ—ๅฐๅฆžๅฃฐๆณชไฟฑไธ‹ๆ•™่ฎญ่€็ˆธไฝ ไธบไป€ไนˆไธๅ–œๆฌขๆˆ‘\n524504 0 ๅฎ ็‰ฉไฟๅฅ็”จๅ“็‹—็‹—็พŠๅฅถ็ฒ‰็Šฌ็Œซๅพฎ้‡ๅ…ƒ็ด ็ปด็”Ÿ็ด ็ฒ‰้‡‘ๆฏ›ๆณฐ่ฟช่›‹็™ฝ่ฅๅ…ป็ฒ‰\n1000 Processed\n classify content\n525000 0 USA|็ฎ€ๅ•ใ€่ˆ’้€‚ใ€่‡ช็„ถ\n525001 0 ๅพฎ่ฝฏๅ–ไปฃXboxMusic็š„ๅ…จๆ–ฐ้ŸณไนๆœๅŠก\n525002 0 
ๆŠŠๅ…šๅ’Œๆ”ฟๅบœ็š„ๅ…ณๆ€€้€ๅˆฐไบ†ไป–ไปฌ็š„ๅฟƒๅŽไธŠ\n525003 0 ไป–ๅฆˆ็š„่ฏดๅฏนไธ่ตทๆœ‰็”จ่ฆ่ญฆๅฏŸๅนฒไป€ไนˆ\n525004 0 comๆœฌๅ…ฌๅธๅทฒ้€š่ฟ‡้˜ฟ้‡Œๅทดๅทดๅฎžๅœฐ่ฎบ่ฏ\n1000 Processed\n classify content\n525500 0 ไธ€ๆฑ‰ๅญๆŽไธœ่ฅฟไธๅฐๅฟƒๆŠŠ้ฟๅญ•ๅฅ—ๆމๅœจๅœฐไธŠ\n525501 0 ็ฅžไน‹้ข†ๅŸŸpk้ฃŽๆฐ”ไธ€ๅคœไน‹้—ดๆถˆๅคฑไบ†ๅˆๆ˜ฏไธบๅ“ช่ˆฌ\n525502 0 ๆ–ฐไธญๅผๅฎขๅŽ…็š„่ฃ…ไฟฎๆ›ดๆ˜ฏๆต่กŒๆˆ้ฃŽ\n525503 0 ๅพฎไฟกๅŠ ไบ†ไธช้™Œ็”Ÿไบบไธญ็—…ๆฏ’ไบ†ไธŠไธๅŽปๆ€Žไนˆ่งฃ\n525504 0 ็ง‘ๆŠ€ๅ…ฌๅธๅณไฝฟๆ›พ็ปๆ‹ฅๆœ‰ๅพฎ่ฝฏ่ˆฌ็š„ๅž„ๆ–ญๅœฐไฝ\n1000 Processed\n classify content\n526000 0 ๆตฆไธœๆ–ฐๅŒบๅธๆณ•ๅฑ€ๅŒป่ฐƒๅŠž่ฐƒ็ ”ๅ‘˜่Œƒๅ†ฐๅ†’็€้ซ˜ๆธฉ้…ทๆš‘ๆฅๅˆฐไบ†ไธŠๅ—ไบŒๆ‘ๅฐๅŒบ่ตฐๅƒๅฌไธ‡ใ€ๅฌๅ–ๆฐ‘ๆ„\n526001 0 ็œๆ”ฟๅบœๆ–ฐ้—ปๅŠžๅฌๅผ€ไบ†โ€œๆŽจ่ฟ›ๆฒป็†้คๆกŒๆฑกๆŸ“ๅปบ่ฎพ้ฃŸๅ“ๆ”พๅฟƒๅทฅ็จ‹โ€ๆ–ฐ้—ปๅ‘ๅธƒไผš\n526002 1 ไฝ ๅฅฝๆˆ‘ๆ˜ฏๅนณๅฎ‰่ฝฆ้™ฉๅฐๅผ ๏ผŒๆ˜ŽๅคฉๅŠž็†ๅ•†ไธš้™ฉๅœจๆœ€ไฝŽๆŠ˜ๆ‰ฃๅŸบ็ก€ไธŠ๏ผŒๆปกxxxxๅฏไปฅๅŠ ๆŠ•ไธ€่พ†ไปทๅ€ผxxxๆฐธไน…ๆŠ˜...\n526003 0 ๆˆ‘ๅง‘ๅคซ็”ต่ฏ๏ผš13937131918\n526004 0 ๅฆ‚ๆžœ่ฟ™ๆ˜ฏๆŸๅคงvๅฐฑๆ˜ฏ้‚ฃไธช่‡ช็งฐๆ˜ฏ้‡‘่ž่ก—ๅซ่Šฑๅญ็š„ๅ‘็š„\n1000 Processed\n classify content\n526500 0 ่ฟ™ๆ˜ฏ็ปงๆ™ฎไบฌxๆœˆไปฝ่ฎฟๅŸƒๅŒๆ–นๅœจๆŠ€ๆœฏ่ทฏ็บฟใ€ๅ…ณ้”ฎๆกๆฌพ็ญ‰ๆ–น้ข่พพๆˆไธ€่‡ดๅŽ็š„ๅˆไธ€่ฟ›ๅฑ•\n526501 0 ๆตท้—จๅธ‚ๆฐ‘ๆ”ฟๅฑ€ๅๅŒๆตท้—จๅธ‚่€้พ„ๅไผš\n526502 0 ๆ–ฐๅŒ—่ŽบๆญŒใ€ๅฐๅ—ๅฎ‰ๅนณใ€ๅฐไธญๆ–ฐ็คพใ€ๅฐไธœ็Ÿฅๆœฌใ€้‡‘้—จ้‡‘ๅŸŽใ€ๆกƒๅ›ญๅคงๆบชใ€้ซ˜้›„็”ฒไป™ใ€่‹—ๆ —ไธ‰ไน‰ใ€่Šฑ่Žฒ็‘ž็ฉ—ใ€ๅฐ...\n526503 0 ๅ…ซ่ทฏๅ†›ไป€ไนˆไนŸๆฒกๅนฒ๏ผšๆˆ‘่ดจ็–‘ไธ€ไธชๆ•ฐๆฎ๏ผšๅ…ซๅนดๆŠ—ๆˆ˜\n526504 1 ไฝ ๅฅฝ๏ผๆ–ฐๅนดไธšๅŠกๅทฒๅผ€๏ผŒๅฆ‚ๆœ‰ๅ…ณ้“ถ่กŒๆ‰ฟๅ…‘ๆฑ‡็ฅจ็š„ไธšๅŠก๏ผŒๆฌข่ฟŽๆฅ็”ตๅ’จ่ฏข๏ผ็”ต่ฏ๏ผšxxxxxxxxxxx ้พ™\n1000 Processed\n classify content\n527000 0 ๆฎInformation็ฝ‘็ซ™ๅ‘จไธ€ๆŠฅ้“\n527001 0 ๆˆไธบๆˆ‘้€็ป™ๅฅน็š„11ๅฒ็”Ÿๆ—ฅ็คผ็‰ฉโ€ฆโ€ฆ\n527002 0 03ไธ‰้นฐๅธ‚โ€œๆ˜Ÿๆ˜ŸไธŽๆฃฎๆž—็š„็ป˜ๆœฌไน‹ๅฎถโ€\n527003 0 โ€œๅฆๅ…‹ไธค้กนโ€็ซž่ต›ไธญๅ›ฝ96Aๅฆๅ…‹็™ปๅœบ\n527004 0 ๅ› ๆญคๅฏนๅฎซ้ขˆ็ณœ็ƒ‚่ฟ™็ง็–พ็—…็š„ๆฒป็–—\n1000 Processed\n classify content\n527500 0 
ๆญ่ดบ่ฏดๅฎข่‹ฑ่ฏญๆ˜†ๅฑฑ้œ‡ๅท่ทฏไฝ“้ชŒไธญๅฟƒไบŽ2015ๅนด7ๆœˆ16ๆ—ฅ้š†้‡ๅผ€ไธš\n527501 0 ไบฒ่ฟ˜ๅœจๅฏน็€็”ต่„‘ๆตช่ดนๅ…‰้˜ดๅ—\n527502 0 ไธฅๆ ผไพๆณ•่ฝๅฎžๅผบๅˆถๆ€งๅœไบงๆŽชๆ–ฝ\n527503 0 ๅช่ฆๅธฆโ€œ็™พๅบฆโ€็š„ไบงๅ“ๆˆ‘้ƒฝไธๅ–œๆฌข็”จ\n527504 0 ไธ€ไธชๅ›ฝๅฎถ็š„ๆณ•ๅพ‹ๅฆ‚ๆžœๆ˜ฏๆญฃไน‰ๅ…ฌๅนณ็š„\n1000 Processed\n classify content\n528000 0 ๆตŽๅ—7ๆœˆ23ๆ—ฅ่ฎฏๅœจๅธธๅทžไธŠ่ฟ‡ๅคงๅญฆ็š„ๅฐ่™ž\n528001 0 ๆˆ‘่™นๆ‚ฆๅŸŽ็š„ๅฟ…่ƒœๅฎขๆ˜ฏๅ…จๅ—ไบฌไนƒ่‡ณๅ…จๅฎ‡ๅฎ™ๆœ€ๅฅฝๅƒ็š„ไธ€ๅฎถ\n528002 0 ๅด”่ƒœ่ดคไธ€ๅคฉๅˆฐๆ™šๅœจinsๅ‘ไธ€ไบ›ๅฅ‡ๅฅ‡ๆ€ชๆ€ช็š„ๅฑ•่งˆๅ“ๅฅฝ้†‰\n528003 0 ไป–่ฎคไธบTwitterไธ่ƒฝๅ†่ฎฉๅผ€ๅ‘่€…ๅคฑๆœ›ไบ†\n528004 0 ่ฟ™้‡Œไธบๆ‚จๅฐฑๆ€ป็ป“ไบ†่ฃ…ไฟฎไธญ่ƒฝๅคŸ็œ้’ฑ็š„ๅๅคง็ง˜่ฏ€\n1000 Processed\n classify content\n528500 0 ๆŒ‡่ดฃๅฏผๅธˆๅญ˜ๅœจโ€œๅ—่ดฟโ€โ€œ่ฏˆ้ช—โ€็š„ๅฏ่ƒฝ\n528501 0 ่€ๅญๅฐฑๆ˜ฏๅผ€ๅ›ฝไปฅๅŽ่ฟๆณ•ๆˆไป™็š„็Œซๅคด้นฐ็ฒพ\n528502 0 ๆ— ไบบ้ฃžๆœบๅฐ†็™พ็ฑณ้ซ˜็ฉบไธ‹็š„ๆ™ฏ่ฑกๅ›žไผ ๅˆฐๆ“ๆŽงๅ™จไธญ\n528503 0 ๆœ€็ปˆ่Žท่ƒœ็š„ๆ˜ฏ็”ฑSagaDesign่ฎพ่ฎก็š„โ€œๆ— ๆ‰€ไธ่ƒฝ็š„siriโ€\n528504 0 ๅฐฑ้€่œœๆ€ๅฏๅฏไปทๅ€ผxxx็š„ๆ‘ฉๆด›ๅ“ฅๆตท่—ปๆณฅๆด—้ขๅฅถไธ€ๅช\n1000 Processed\n classify content\n529000 0 ไปŽๅทฅ็จ‹ๅ‘ๅŒ…ใ€่ฎพ่ฎกไผ˜ๅŒ–ใ€ๆŠ€ๆ”น้ฉๆ–ฐๅ’Œๅทฅ่‰บๅˆ›ๆ–ฐ็ญ‰ๅคšๆ–นๅ…ฅๆ‰‹\n529001 0 ๅฎ‰่ฃ…Keyไผšๅ˜ๆˆ3V66T็ป“ๅฐพ็š„Keyๆฅ่‡ชๅŠจๆฟ€ๆดป\n529002 0 ๅœฐ้“ไธŠๅๆˆ‘ๆ—่พน้‚ฃไธช็”ท็š„ๅฟฝ็„ถๅ”ฑ่ตทๆญŒๆฅ\n529003 0 ๅ‡บ5ๅผ ๆ— ้”กๅŠจ็‰ฉๅ›ญๅคๅญฃ็‹‚ๆฌขๅคœ้—จ็ฅจ\n529004 0 ไปŠๅนดไธŠๆผ”ไธŠๆตทLINEFRIENDSCAFE&amp\n1000 Processed\n classify content\n529500 1 ็‰นๆŽจๅ‡บๅนดๅŒ–ๆ”ถ็›Š็އๅœจxx%--xx.x%็จณๅฅๆ€ง็†่ดขไบงๅ“๏ผŒๆฌข่ฟŽๅ’จ่ฏขใ€‚ๅœฐๅ€:่ง‚้Ÿณๆกฅๆญฅ่กŒ่ก—่žๆ’ๆ—ถไปฃๅนฟ...\n529501 0 ็ฉ†ๆฃฑๅธ‚ๆณ•้™ข็งไฟกๆณ•ๅพ‹ๅ’จ่ฏขไฟกๆฏๅทฒๆ•ด็†ๅฎŒๆฏ•\n529502 0 ่ขซๅŒไธ€่พ†่ญฆ่ฝฆๆ‹ไบ†ไธ‰ๆฌกโ€ฆโ€ฆ\n529503 1 ้“ถ่กŒๆ— ๆ‹…ไฟๆ— ๆŠตๆŠผ๏ผŒไฟก็”จdai ๆฌพ๏ผŒๆ‰‹็ปญ็ฎ€ๅ•๏ผŒ้ขๅบฆ้ซ˜๏ผŒxๅคฉๆ”พๆฌพใ€‚่ดขๅฏŒ็ƒญ็บฟxxxxxxxxxxx๏ผŒๅฐๆจ\n529504 0 ๅฐๅฐ็š„ๆขฆๆƒณ้š็€็บธ้ฃžๆœบ็š„้ฃž็ฟ”ๅˆ’ๅ‡บไผ˜็พŽ็š„ๅผง็บฟ\n1000 Processed\n classify content\n530000 0 
ๆˆ‘็š„ๅคง่…ฟๆˆ‘็š„ๅซฉ่‚คๆˆ‘็š„ไนณๆˆฟๆˆ‘็š„่„ธ้ขŠๆˆ‘็š„ไธ€ๅˆ‡ๅช่ƒฝๆˆ‘็”ทๆœ‹ๅ‹่งฆๆ‘ธ\n530001 0 ๆตๆฐด่ดฆๆ˜จๅคฉ็œ‹่ฐทๆญŒๅš็š„ๅทซๅธˆ3่ง†้ข‘ๆ”ป็•ฅ่งฃ่ฏด็œ‹ๅˆฐๅไบŒ็‚นๅŠๆ‰็ก\n530002 0 ็Žฉๅฎถๅฐ†ไปฅๆŽงๅˆถไธ€ๆกๆฑฝ่ฝฆ็”Ÿไบง็บฟไธบๅผ€ๅง‹\n530003 0 ๅทๆ‘ธๅœจ่…‹ไธ‹็บฆ15ๅˆ†้’ŸๅŽ็”จๅ†ทๆฐดๆด—ๅ‡€\n530004 0 ไป€ไนˆๆ—ถๅ€™่ƒฝๆ‘†่„ฑๅฐๅท็š„ๅญฉๅญ่ฟ˜ๆ˜ฏๅฐๅท็š„ๆถๅ’’orๅฎฟๅ‘ฝ\n1000 Processed\n classify content\n530500 0 ๅœจๆฑŸ่‹็œๅธธ็†Ÿๅธ‚ๆ‰“ๅทฅ็š„ไธๅคงๆฌข้ซ˜ๅ…ดๅœฐ่ฏด\n530501 0 ๆญฆๅฎฃๅŽฟๆณ•้™ขไธ€ๅฎกๅˆคๅค„่ขซๅ‘Šไบบๅˆ˜ๆŸ็ๆœ‰ๆœŸๅพ’ๅˆ‘10ๅนด\n530502 0 ็›ๅŸŽๅธ‚ๅŒบ็š„ไบฒๅฏไปฅ้€่ดงไนŸๅฏไปฅ่‡ชๅทฑไธŠ้—จๆŒ‘้€‰\n530503 0 ๅๅคๅผบๅฅธๆ”ถไนฐๆฅ็š„่ขซๆ‹ๅ–ๅฆ‡ๅฅณ่‡ดๅ…ถๆ€€ๅญ•ใ€็”Ÿไบง\n530504 0 24โ„ƒโ€ฆโ€ฆ16โ„ƒๆˆ‘ๅŽป่ฟ™็ฉบ่ฐƒ็œŸๅฅฝๆฒกๆœ‰ๆธฉๅทฎๅ‘€\n1000 Processed\n classify content\n531000 0 ๅปบ่ฎฎ๏ผš่งฃๆ”พๅ†›ใ€ๅ…ฌๅฎ‰ใ€ๆดพๅ‡บๆ‰€ใ€ๅฎ‰ๆฃ€ใ€ๅ…ฌไบค่ฝฆใ€ๅˆ—่ฝฆใ€่ˆช็ฉบใ€่ฝฆ็ซ™ใ€็ ๅคดใ€ๆตทๅ…ณ็ญ‰่ฟ™ไบ›ๆ•ๆ„Ÿ้ƒจ้—จๅบ”่ฏฅ่ฆ้‚ฃ...\n531001 0 ไผ˜ๆญฅ็”จๆˆทๅฏไปฅ้€š่ฟ‡ๅ…ถๅบ”็”จ่ดญไนฐๅฐ็ฑณNote\n531002 0 ๅ…ญใ€ไธƒใ€ๅ…ซๆœˆๆ˜ฏๆ—…ๆธธ็š„ๆœ€ไฝณๅญฃ่Š‚\n531003 0 ๅนถๅŽ‹็ผฉ25ๅฎถ็ป่ฅไธๅ–„็š„็™พ่ดงๆฅผๅฑ‚\n531004 0 ไธ€ไธชๅคฑๅฟ†ไธ€ไธชๅ›žๅฟ†่ฟ™ๅฐฑๆ˜ฏ็ปๅކโ€ฆ้€็ป™ๆญฃๅœจ่€ๅŽป็š„90\n1000 Processed\n classify content\n531500 0 ๆ”ถๆ‹พ่ˆ’ๅฆไบ†็ชๅœจ็”ต่„‘ๆค…ไธŠ้—ญ็›ฎๅ…ป็ฅžๅฌ็ˆฑๅฐ”ๅ…ฐ้ฃŽ็ฌ›\n531501 0 ไนฐไบ†่ฟ™ไนˆๅคšAmazon็š„ไธœ่ฅฟ\n531502 0 ๆ—ขๆœ‰่ƒฝๅŠ›ๅธๅผ•ไธŠๆฌพๆ‰‹ๆœบ5000ๅ…ƒๆกฃไฝ็š„็”จๆˆท\n531503 0 ๅœจๅ•†็ง‘ไธ“ไธšไธญๅผ€่ฎพCFALevel1pathway\n531504 0 ็™พๅบฆ่ฟ™ไนˆๅคง็š„ๅ…ฌๅธไธๆ˜ฏ้ช—ไบบ็š„ๅง\n1000 Processed\n classify content\n532000 0 ๅ่€Œ่ขซๅฅธ่‡ฃไธฅๅตฉ่ฏฌ้™ทๅ‹พ็ป“่’™ๅคๆ„ๅ›พ่ฐ‹ๅ\n532001 0 ๅซŒ็–‘ไบบๅ› ็›—็ชƒๅ’Œๅธๆฏ’ๅทฒ่ขซ้€่‡ณๅ—ๅ……ๅธ‚ๅผบๅˆถ้š”็ฆปๆˆ’ๆฏ’ๆ‰€ๅผบๅˆถ้š”็ฆปๆˆ’ๆฏ’\n532002 1 ๅข™ๅธƒใ€็ช—ๅธ˜x.xๆŠ˜่ตทใ€‚ไธ‰ใ€ๆดปๅŠจๆœŸ้—ด๏ผŒๆŠ˜ไธŠๆŠ˜ๆปกxxxxๅ…ƒๅขž่‡ณxxxๅ…ƒ๏ผŒๆœ€้ซ˜ๅขžๅ€ผxxxๅ…ƒใ€‚ๅœฐๅ€๏ผš...\n532003 0 ไนŸ็ปไธ่ดŸ่Šฑๅƒ้ชจไธ€ไบบ็™ฝๅญ็”ปไธ่ดŸๅ…จๅคฉไธ‹ไบบๅด็ปˆ็ฉถ่ดŸไบ†่Šฑๅƒ้ชจไธ€ไบบ\n532004 0 ๆˆ‘ๅฐฑๆ‰“ๅผ€็ฎกๅฎถ็š„่…พ่ฎฏๆ–ฐ้—ปๆฅไบ†่งฃไธ€ไธ‹็‚’่‚ก็š„ๅ†…ๅฎน\n1000 Processed\n classify 
content\n532500 0 ๅ…จ็ƒ่ฟ่ฅๅ•†VoLTE้ƒจ็ฝฒ่Š‚ๅฅๆญฃๅœจๅŠ ๅฟซ\n532501 0 ็”ต่„‘ไธๅฅฝ็”จใ€ๆ‰‹ๆœบไธๅฅฝ็”จใ€่ทฏ็”ฑๅ™จไธๅฅฝ็”จ\n532502 0 ไบ‹ๅฏฆๅฐฑๆ˜ฏๅพˆๅฐ็š„ไบ‹ไฝ†ๆœ‰ไบบ็ฟป่ญฏ้Œฏ่ชคๆŠŠ็—…ๆƒ…่ช‡ๅคง็ตๆžœๆŸไบ›ไบบๆญฃ็ขบ็ฟป่ญฏไธไฟก้Œฏ็š„็ฟป่ญฏๅปๆทฑไฟกไธ็–‘้‚„่ณช็–‘ๆญฃ็ขบ็ฟป...\n532503 0 ๅŒๆ—ถๅ…ผๅ…ทๆฝฎๆต็š„่ฎพ่ฎกไธŽๅฅขไพˆๅ“็š„่ดจๆ„Ÿ\n532504 0 ๅฅน็ป™ไฝ ็š„ไฟก็”จ้ขๅบฆ้ƒฝๆ˜ฏๆœ‰้™็š„\n1000 Processed\n classify content\n533000 0 ๅœจ้’ฅๅŒ™ๅœˆ็š„่ฎพ่ฎกไธŠ็•™ๅ‡บไบ†้‚ฃไนˆไธ€็‚น็ฉบ้š™\n533001 1 ใ€่ดต้˜ณ่Šฑๆบช็ขงๆก‚ๅ›ญใ€‘ใ€ๅŒๆ‹ผๅˆซๅข…ใ€่Šฑๅ›ญๆด‹ๆˆฟใ€‘ๅœจใ€xๆœˆxxๆ—ฅๅ‰ใ€‘่ดญๆˆฟๅช้œ€ไป˜ใ€x%ๆˆฟๆฌพใ€‘ๅณๅฏ๏ผŒ่‹ฅๆœ‰ไบฒ...\n533002 0 ๆš‘ๅ‡็”Ÿๆดปๆ˜ฏ่ฟ™ๆ ท็š„๏ผš่…พ่ฎฏๅพฎๅšๅพฎไฟกๆบœ็‹—\n533003 0 ไธๆ‰ฟ่ฎค่ฎกๅˆ’็”Ÿ่‚ฒ็›ธๅ…ณๆณ•ๅพ‹ๅฏนไธชไฝ“่‡ช็”ฑ็š„ๅŽ‹่ฟซไธŽๅผบๅˆถ\n533004 0 ๅฎถไบบๅ‘็ŽฐๅŽ้€ๅŒป้™ขๆŠขๆ•‘ๅŽๆ— ๆ•ˆๆญปไบก\n1000 Processed\n classify content\n533500 0 ไธ€ไธช็ฅž็ˆถ่ƒฝๅฆ่ฏทๅŒป็”Ÿ็œ‹็—…็œŸๆ˜ฏไธช้šพ้ข˜ๅ•Šๆˆ‘ไนŸ็œŸๆ˜ฏไธชๆทท่›‹\n533501 0 ๅœ†ๅฝข็š„้คๆกŒ่ฎพ่ฎกๆ›ดๆ˜พ็”ŸๅŠจ้šๆ€ง\n533502 0 ๆ„Ÿ่ง‰่พ›ๅบ„ๆ˜ฏ่ขซๅธธ็†ŸๆŠ›ๅผƒ็š„ไธ€ไธช้•‡\n533503 1 ๆž—ๅ‘ๅŒ…่ฃ…ๅŽ‚ๆ‹›ๅˆ‡่ข‹ๅธˆๅ‚…ไธ€ๅ๏ผŒๆฌข่ฟŽๅŠ ๅ…ฅๆˆ‘ไปฌ็š„ๅ›ข้˜Ÿ๏ผ็”ต่ฏxxxxxxxxxxx๏ผŒๆœ›็‰›ๅขฉๆดฒๆนพๅซ็”Ÿ็ซ™ๅฏน้ข\n533504 0 ๅฆ‚ๆžœไฝ ๆญฆ่ฟ›ๆณ•้™ขๅ…ฌๅผ€็ซ™ๅ‡บๆฅๅธฎไป–ๆŒ‡ๆŽงๆˆ‘\n1000 Processed\n classify content\n534000 0 ไบš้ฉฌ้€ŠๆŽจๅ‡บไธ€้”ฎ่ดญ็‰ฉ็กฌไปถโ€œDashโ€ๆŒ‰้’ฎ\n534001 0 ๆญฃๅธธๆ–ฐ็”Ÿๅ„ฟๅ‡บ็”ŸๅŽ24ๅฐๆ—ถๅ†…ๆŽฅ็งๅกไป‹่‹—\n534002 0 A้…ธๆœฌๅฐฑๆ˜ฏๆŠ—ๅ…‰่€ๅŒ–้žๅธธๅฅฝ็š„ๆˆๅˆ†\n534003 0 ไฝ ไปฌ่ฟ™ไบ›ๆœบๅ™จไบบๅ’Œๆฐดๅ†›่ฟ˜็œŸๆ˜ฏ็ƒฆ\n534004 0 ๆฑŸไธญๅˆถ่ฏๅฐฑๆถˆ่ดน่€…่ฏ‰่ฎผๆฑŸไธญ็Œดๅง‘้ฅผๅนฒๅŠไปฃ่จ€ไบบๅพ้™่•พๅ‘่กจๅฃฐๆ˜Ž\n1000 Processed\n classify content\n534500 0 ๅ‘็Žฐ่พนๆ‰“ๆ‰‹ๆœบ่พน่ตฐ่ทฏ็š„ไบบๆ–นๅ‘ๆ„Ÿๅ˜ๅทฎ\n534501 0 ๅฐฑ่ฟžไป–ๆ›พ็ปcos็š„ๅผ ่ตท็ตไนŸๆˆไธบไบ†ๆˆ‘ๅถๅƒ\n534502 0 ๅ…ดๅŒ–ๅธ‚ๅŒป็–—ๅซ็”Ÿๅฟ—ๆ„ฟๆœๅŠก้˜Ÿ็ป„็ป‡ๅฟ—ๆ„ฟ่€…ๆฅๅˆฐๆŽไธญ้•‡่ˆœ็”Ÿๅซ็”Ÿ้™ขๅผ€ๅฑ•โ€œ็™ฝ่กฃๅคฉไฝฟ่ฟ›ๅ†œๅฎถโ€ๅฟ—ๆ„ฟๆœๅŠกๆดปๅŠจ\n534503 0 ่ตถ็ดงๆ‰“ๅผ€็”ต่„‘ๆŸฅๅดๅ‘็Žฐๅทฒ็ป่ขซไบบๆท่ถณๅ…ˆ็™ป\n534504 0 
11ๆœˆๅบ•ๅ‰ๅ…จ้ขๅฎŒๆˆๅ‰ฉไฝ™้—ฎ้ข˜็š„ๆ•ดๆ”นไปปๅŠก\n1000 Processed\n classify content\n535000 0 ไฝ ไปฌ็”จ็”ต่„‘็š„ๆ—ถๅ€™ไผšไป‹ๆ„ไฝ ๅฆˆๅๆ—่พน็œ‹็€ไนˆ\n535001 0 ไปŽไผš้•ฟใ€ไธญๅคฉ้›†ๅ›ข่‘ฃไบ‹้•ฟๆฅผๆฐธ่‰ฏๆ‰‹ไธŠๆŽฅ่ฟ‡่˜ไนฆ\n535002 0 SuperDaE่ฎคไธบ่‡ชๅทฑ็š„้ป‘ๅฎข่กŒไธบๆฒกไป€ไนˆไธๅฝ“\n535003 0 BabyBananaๅฉดๅนผๅ„ฟ่ฎญ็ปƒ็‰™ๅˆท$7\n535004 0 /ๅ›ฝๅฎถไฝ“่‚ฒๆ€ปๅฑ€ๆŽ’็ƒ่ฟๅŠจ็ฎก็†ไธญๅฟƒไธปไปปๆฝ˜ๅฟ—็›่ขซๆŸฅ\n1000 Processed\n classify content\n535500 1 ใ€ๆฏๆ—ฅ้ฒœๅฅถๅงใ€‘ไบฒ็ˆฑ็š„ๅฎขๆˆท:xๆœˆxๆ—ฅๅฅณ็Ž‹่Š‚๏ผŒๆœฌๅบ—ๆŽจๅ‡บ็‹ฌ้—จ้…ๆ–นโ€œ้ฒœๅฅถๅ†ฐๆท‡ๆท‹โ€ๅฝ“ๅคฉๆบๅฅณ็Ž‹่ฟ›ๅบ—ๅ…่ดนๅฐ...\n535501 0 ่ฟ™ๅœบๅฎกๅˆคไปŽ1945ๅนด11ๆœˆ20ๆ—ฅๆŒ็ปญๅˆฐ1949ๅนด4ๆœˆ13ๆ—ฅ\n535502 0 ๅŽ็ปๅฅฝๅฟƒไบบ้€ๅพ€ๅŒป้™ขๅดไธ€็›ดๆ˜่ฟทไธ้†’\n535503 0 ไธบ่ตท่‰่ฎฒ่ฏๅ‡บ่ฐ‹ๅˆ’็ญ–็š„16ไบบ็ป„ๆˆ็š„้ฆ–็›ธๅ’จ่ฏขๅฐ็ป„ๆŠฅๅ‘ŠไนŸ้€’ๅˆฐๅ†…้˜\n535504 0 ๆˆ‘็š„็”ต่„‘ไปŽๆฅๆฒก่ฏ•่ฟ‡ๅŒๆ—ถ่ฃ…่ฟ™ไนˆๅคš็š„ๆ’ญๆ”พๅ™จ\n1000 Processed\n classify content\n536000 0 TheVerge็š„่ฏ„ๆต‹ๆ›พ็งฐโ€œAndroidWearๅœจๆญฃ็กฎ็š„ๆ—ถ้—ดๅšๆญฃ็กฎ็š„ไบ‹\n536001 0 โ€œ้ฃžๆœบ็‰ˆโ€็”ตๅฝฑ็•ชๅค–็ฏ‡ๅ››ๆ”ฏ่ฟžๅ‘\n536002 0 ๅ…ถๅฎžiPhone็”จๆˆทๅฎŒๅ…จไธๅฟ…ๆ‹…ๅฟƒ\n536003 0 ๅ–œๆฌข่ฟ™็งๅฎ‰้™็š„ๅœฐๆ–นๅฐฝ็ฎก็ƒˆๆ—ฅไนŸไธๆ„ฟๅœจ้ฉฌ่ทฏไธŠไธŽๆˆ็พค็š„ๆธธๅฎขๆŒคๆฅๆŒคๅŽป่ฟ™ๅ‡ ๅคฉ่ตฐ่ฟ‡ไธ็ฆๅฏนๆฅๅŽฆๆธธๅฎข็š„็ง็งไธ...\n536004 0 2ใ€ๅนณๅฐๆญๅปบๆ— ๆณ•ไฝฟไผ—ไบบ่กŒ\n1000 Processed\n classify content\n536500 0 ไธญๅ›ฝ็ปฟๅ‘ไผšๅ…ฌ็›Š่ฏ‰่ฎผ็ณปๅˆ—ไน‹ไธ‰\n536501 0 ๆŠขๅŠซๆ—ถ็œ‹่งๆผ‚ไบฎ็š„ๅฅณๅญ่ฟ˜่ฟž็ปญไธคๆ™šๅฎž\n536502 0 ๅŒ—ไบฌๅพทๆฏ”็ซๆ‹ผ่ฃๅˆค่บบๆžช็Žฐๅœบ็ƒ่ฟทไบฒๅฆ‚ไธ€ๅฎถ|ๅ›พ้ธก่‚‹่ต›\n536503 0 ้ฆ™ๆธฏ้ซ˜็ญ‰ๆณ•้™ขๆ—ฅๅ‰ๅฆๅ†ณๅฑฑๆฐดๆŠ•่ต„่‚กไปฝๆ‰˜็ฎกไบบๆไบค็š„ๅŒๆ„็ฝขๅ…ๅฑฑๆฐดๆฐดๆณฅ่‘ฃไบ‹็š„ๆ่ฎฎ\n536504 0 ๅ’Œ่‹ๅทžๅ…ถไป–็ˆฑๅฟƒไผไธš็š„ๆๅŠฉไธ€่ตทๅ‘้€่ดซๅ›ฐๅœฐๅŒบ็š„ๅ„ฟ็ซฅ\n1000 Processed\n classify content\n537000 0 ไบš้ฉฌ้€Šไธญๅ›ฝๅ‘จๅนดๅบ†7ๅคงๅ“็ฑปๆœ€้ซ˜ๆปก199ๅ‡100\n537001 0 ็Žฐไปฃๆ—ฅๅŒ–ๅ“็š„ไบงๅ“ๅŒ…่ฃ…่ฎพ่ฎกๅฎนๅ™จ็š„็จณๅฎšๆ„Ÿๆ— ็–‘ๆ˜ฏไบบไปฌๅฏน้€ ๅž‹็š„ๆœ€ๅŸบๆœฌ่ฆๆฑ‚\n537002 0 ไธ€็ฎญๅŒๆ˜Ÿไธญๅ›ฝๆˆๅŠŸๅ‘ๅฐ„ๆ–ฐไธ€ไปฃๅŒ—ๆ–—ๅฏผ่ˆชๅซๆ˜Ÿ\n537003 1 
ไธ“ไธšไธบไธชไบบๅ’Œไผไธšๆไพ›่ž่ต„ๆœๅŠก๏ผŒxxไธ‡่‡ณxxxxไธ‡้ขๅบฆ๏ผŒๆœˆๆฏไฝŽ่‡ณx.xๅŽ˜ใ€‚ๅ…จๅ›ฝๆŒ‰ๆญๆˆฟ๏ผŒ่ฝฆ๏ผŒไฟ้™ฉ...\n537004 0 ็œ้“328้ป„ๅผ ๅ…ฌ่ทฏ่ƒถๅทžๅธ‚้‡Œๅฒ”้•‡่ทฏๆฎต\n1000 Processed\n classify content\n537500 0 ๅˆฐๅบ•ไธบๅ•ฅ่Šฑๅƒ้ชจๆฏๆฌกๅ‰้ข่ฆๆ’ญๅฅฝๅคš้‡ๅค็š„\n537501 0 ่ฎค่ฏไฟกๆฏไธบโ€œ่‹ๅทžๅฅ้›„่ŒไธšๆŠ€ๆœฏๅญฆ้™ขHoulc\n537502 0 ่ฆๆฑ‚่ฎพ่ฎก็š„ๅฎถๅ…ทไธๅช่ฆๅฅฝ็œ‹่ˆ’้€‚\n537503 0 egๆฑ‰ๆปจๆณ•้™ข่ง„่ŒƒๅŒ–ๅปบ่ฎพๅ†ๅ‘ๅŠ›\n537504 0 ็”ฑ้•‡็ˆฑๅซๅŠžไธŠๆŠฅๅˆฐๅŸŽ็ฎกๅŠžไปฅๆ‰ฃๅˆ†\n1000 Processed\n classify content\n538000 0 ๅซๆœ‰้Ÿฉๅ›ฝ็Žปๅฐฟ้…ธBxใ€่Šฆ่Ÿ่ƒๅ–็ฒพๅŽใ€ๆฐดๆบถ่ƒถๅŽŸ่›‹็™ฝ็ญ‰\n538001 0 2015ๆ˜ฅๅค็ง‹ๅ†ฌๅญฃๅคง็ ๅฅณ่ฃ…่Šฑๆœต่“ฌ่“ฌ่ฃ™ๅŒ…่ฃ™ไธญ่ฃ™ๆฌง็พŽไธญ้•ฟๆฌพๅŠ่บซ่ฃ™ๅฅณๅค\n538002 0 ็ป™ไบบไธ€็งhometheatre็š„ๆ„‰ๅฟซไฝ“้ชŒ\n538003 0 ่ฝฌๆ’ญๅˆฐ่…พ่ฎฏๅพฎๅšๅฐๅบฆๅช’ไฝ“็”จๆผซ็”ปๆฅๆŠฅ้“่ฏฅไบ‹ไปถ\n538004 0 ๅฅฝๅŸบๅ‹ๅฐ็Œซๅ’Œ้ฃžๆœบ่ฟ˜ๆœ‰็œŸ็ˆฑๅฐ้ฉฌๅ“ฅไปฅๅŠ็”ตๅฝฑ็š„ไธป็บฟไบบ็‰ฉๅฐๆ˜Ž็ญ‰็ญ‰\n1000 Processed\n classify content\n538500 1 ๅฅฝๆถˆๆฏ [็Žซ็‘ฐ]ๅฅฝๆถˆๆฏ [็Žซ็‘ฐ]ๅฅฝๆถˆๆฏๆฅไบ†๏ผ(ๅนธ็ฆไธ‰ๆœˆ๏ผŒ่ฏทๆŠŠๅฅๅบทๅธฆๅ›žๅฎถๅง๏ผ) ๎„’ ๎„’ ๎Œ ๎Œ...\n538501 0 ๅฐๆบชๅก”ๆดพๅ‡บๆ‰€ๆฐ‘่ญฆ่€ๅฟƒ่ฐƒ่งฃ1ไธชๅฐๆ—ถ\n538502 0 ๆˆ‘ไปฌไธบๆ‚จๅผ€้€šไบ†xG้ซ˜้€ŸไธŠ็ฝ‘ไฝ“้ชŒๅŠŸ่ƒฝ\n538503 0 11ๅŒบ็š„ๆœ‹ๅ‹ๅ‘Š่ฏ‰ๆˆ‘ไบš้ฉฌ้€ŠไธŠไนฐ็š„่ฟ™ไธช็ปˆไบŽๅˆฐไบ†\n538504 0 Innisfreeๆ‚ฆ่ฏ—้ฃŽๅŸ่ฟ™ๆฌพๆด—้ขๅฅถๆ˜ฏ็”ทๅฅณ้€š็”จ็š„\n1000 Processed\n classify content\n539000 0 NOGๅจ˜้ƒฝๆ„Ÿ่ง‰ไธๆ•ขๅ็”ตๆขฏไบ†\n539001 0 ๆˆ‘ๆŠขๅŠซไฝ ไป€ไนˆๆˆ‘ๅชๅ”ฑ่‹ฑๆ–‡ๆญŒโ€ฆ\n539002 0 Bไผš็†่ดขไฝ†่ฏด่ฏ็›ด่จ€ไธ่ฎณไธๅฅฝๆŽฅๅ—โ€ฆโ€ฆ\n539003 0 ไบš้ฉฌ้€Šๆ—ฅๅ‰ๅฐฑๆ นๆฎ2015ๅนดไธŠๅŠๅนด็š„้”€ๅ”ฎๆˆ็ปฉๅ’Œ่ฏป่€…่ฏ„ๅˆ†\n539004 0 ๆฅ่‡ชๅ…จๅ›ฝๅ„ๅœฐ็š„xxxxๅคšๅไผไธšๅฎถไธŽๆถˆ่ดน่€…ไปฃ่กจๅ’Œxxๅคšๅฎถๅช’ไฝ“่ฎฐ่€…ๅ‚ๅŠ ่ฎบๅ›\n1000 Processed\n classify content\n539500 1 ้กพ้—ฎ๏ผŒ้€ไปทๅ€ผxxxๅ…ƒ็š„ไธ‰ๅˆไธ€ๆด—้ขๅฅถๅ’Œไฟๆนฟๆฐดใ€‚ไฟƒ้”€xๅทๆ—ฉไธŠx็‚นๅผ€ๆŠข๏ผŒๆ•ฐ้‡ๆœ‰้™้€ๅฎŒๅณๆญข๏ผไบฒ๏ผš่ฟ˜ๅฏ...\n539501 0 ไฝ ๅšไปปไฝ•็š„ไฟๅ…ปๅฆ‚ๆžœไธๅš้˜ฒๆ™’\n539502 0 
ๆ˜Žๆ—ฉๆˆ‘ๆ€•ๆ˜ฏ่ตทไธๆฅไบ†~็Žฐๅœจๆ‰‹ๆœบๅฟซๆฒก็”ตไบ†\n539503 0 ็Žฐ่ดง็Žฐ่ดงไปŠๅคฉไธ‹ๅ•ๅŒ…้‚ฎๅ™ขๅˆซ็œ‹ไป–ๅฐๅฐไธ€็“ถ\n539504 0 ๅŽฆ้—จๅคงๅญฆๅ‡บ็‰ˆ็คพxxxxๅนดxๆœˆ็‰ˆ\n1000 Processed\n classify content\n540000 0 xxG่ฃ่€€้‡‘ๅŒๅกๅŒๅพ…ๅ…จ็ฝ‘้€š็ซ‹ๅณๅˆฐๆ‰‹\n540001 0 โ€œๅฎ‰ๅ€้“ๆญ‰โ€ๆœบๅ™จไบบไธŠๆตทๅฑ•ๅ‡บๆ—ฅๆœฌ็ฝ‘ๅ‹๏ผš่ฏฅ้‡ไบง\n540002 0 ็™ฝๅญ็”ป้™ช็€่Šฑๅƒ้ชจๅœจ้•ฟ็•™ๆตทๅบ•16ๅนด\n540003 0 fx็Žฐๅœจๆ˜ฏ้™คไธœ็ฅžๅค–ไบบๆ•ฐๆœ€ๅฐ‘็š„ไบ†\n540004 0 โ€œๅธŒๆœ›ๆฏโ€2015ๅนดๆต™ๆฑŸ็œๆ กๅ›ญ่ถณ็ƒ่”่ต›ๅˆไธญๅฅณๅญ็ป„ๅ†ณ่ต›ๅœจๆน–ๅทžๅธ‚่ฝไธ‹ๅธทๅน•\n1000 Processed\n classify content\n540500 0 ๅฝ“ๆ—ถ็‚น็‚นๆปดๆปด็š„ๅผ€ๅฟƒๅฏน็Žฐๅœจ็š„ๆˆ‘ๆฒกๆœ‰ๆ„ไน‰\n540501 0 ๆˆ‘ๅš็š„ๆ˜ฏ็พŽ่”่ˆช้ฃžๆœบไธญ้€”ๅœจ่ŠๅŠ ๅ“ฅ่ฝฌๆŽฅๆ—ถๅ€™\n540502 0 ๆ˜ฏpaparecipeๆ——ไธ‹ไธ€ๆฌพๅฎ‰ๅ…จ็š„่กฅๆฐด้ข่†œ\n540503 0 ้‡่ฆๆ็คบ๏ผšโ—ๅ…ฌๅธ่‚ก็ฅจ่ฟž็ปญไธ‰ไธชไบคๆ˜“ๆ—ฅๅ†…ๆ—ฅๆ”ถ็›˜ไปทๆ ผๆถจๅน…ๅ็ฆปๅ€ผ็ดฏ่ฎก่พพๅˆฐ20%ไปฅไธŠ\n540504 0 ๆˆ‘ๅธŒๆœ›่ฟ™็งLOWB็ฉทBไบ‹ๅ„ฟBๆฐธ่ฟœ้ƒฝไธ่ฆๅ‡บ็Žฐๅœจๆˆ‘็”Ÿๆดปไน‹ไธญ\n1000 Processed\n classify content\n541000 0 ๅด่ขซๆ‰ฌๅทžไธ€ไฝ69ๅฒ็š„ๅ†œๆฐ‘่Šฑไธ‰ๅคฉๆ—ถ้—ด่งฃไบ†ๅ‡บๆฅ\n541001 0 ๆ„Ÿ่ฐขๅคฉๆ„Ÿ่ฐขๅœฐๆ„Ÿ่ฐขAmazon\n541002 0 ๅŒป็”Ÿ้€šๅธธไผšๅผ€็ซ‹ๆŠ—็”Ÿ็ด ๅˆถๅ‰‚ใ€ๅฃๆœA้…ธๆˆ–ๆ˜ฏๆ‰“ๆถˆ็—˜้’ˆๅค„็†\n541003 0 ๆฏๅคฉ็ก่ง‰ๅ‰้ƒฝ่ฆ็œ‹็œ‹็พŽๅ›ข็™พๅบฆ็š„่ฏ„ไปท็”Ÿๆ€•ๅทฎ่ฏ„ๅšไบบไนŸๆ˜ฏๆŒบ็ดฏ\n541004 0 ่ฟ›ใ€ๅ‡บๆฐด้ƒจไฝ็š„ๆต้‡ใ€ๆบถ่งฃๆฐงๆต“ๅบฆ\n1000 Processed\n classify content\n541500 0 ๆ•ด็†็”ต่„‘getไธ€ๅผ ๅฃ็บธโˆš\n541501 0 ๅซๆตด่ฎพ่ฎกไธŠ้ขๆœ‰่ดดๅˆๆ€ปไฝ“็š„่ฎพ่ฎก้ฃŽๆ ผ\n541502 0 ่ฟ™็พค90ๅŽๆถˆ้˜ฒๅฎ˜ๅ…ตๆ— ๆ€จๆ— ๆ‚”ๅœฐ่ตฐ่ฟ›้ƒจ้˜Ÿ่ฟ™ไธชๅคง็†”็‚‰\n541503 1 ใ€ๆ”€ๆž่ŠฑไธœๅŒบๅๆœ›ๆ•™่‚ฒใ€‘ๆ˜ฅๅญฃๆๅ‡็ญ็ซ็ƒญๆŠฅๅไธญโ€ฆๆˆ้ƒฝใ€็ปต้˜ณๅพท้˜ณๅๅธˆๆ‰งๆ•™๏ผŒ็ฒพๅ“ๅฐ็ญ่ฎฉๆ‚จ็š„ๅญฉๅญๆŽŒๆกๅญฆ...\n541504 0 ็–‘ไผผ้ฉฌ่ˆชMH370้ฃžๆœบๆฎ‹้ชธๅœจ้žๆดฒไธœ้ƒจๅฒ›ๅฑฟ่ขซๅ‘็Žฐ\n1000 Processed\n classify content\n542000 0 ๅธธๅทžไพ›็”ตๅ…ฌๅธๅฏน110kV็คผๅ˜‰ๅ˜่ฟ›่กŒๆฃ€ไฟฎ\n542001 0 ่ƒธๅ‰่ฟทไบบ็š„่ตซๅ‹’ๅปบ็ญ‘้ฃŽๆ 
ผ็š„ๅท…ๅณฐไน‹ๆœˆ21ๆ—ฅ็š„้ป‘ๅคœ้™ไธดไปฅๅŽใ€ๆœ€ๅŽ็š„ๆ™š้คใ€่กฃ็€ๅ’Œไฝ“โ€œrocketne...\n542002 1 ๆฒ™ๆฒณๅธ‚ๅ†œๆœบๅคงๅธ‚ๅœบ ไธœๆ–น็บข ้‡‘้ฉฌ ้‚ขๅฐไธ€ๆ‹– ็ญ‰็ณปๅˆ—ๆ‹–ๆ‹‰ๆœบ ไธญๆ”ถ ็ฆ็”ฐ ๅš่ฟœ ๅฅ‡็‘ž็ญ‰็‰Œๅฐ้บฆ...\n542003 0 ไธ็„ถๅฐฑๆ˜ฏ่…่ดฅๆจช็”Ÿ่€ŒไธŠๅฑ‚้ข†ๅฏผๅฆ‚ๅŒ็žŽๅญไธ€ๆ ท่ขซ่’™ๅœจ้ผ“้‡Œ\n542004 0 ๅ–็ˆ†ไบ†้ŸฉๆŸxไปถๅฅ—๏ผšๆด้ข+ๆฐด+ไนณ+็ฒพๅŽ+bb้œœ\n1000 Processed\n classify content\n542500 0 ไฝŽๅค„ๅŒ–ๅฆ†ๆฐดใ€ไฟๆนฟๅ‡้œฒ็ญ‰้ƒฝๆ˜ฏๅฆ‚ๆžœๆœ‰ไธ€ๅคฉไฝ ็ช็ ดไบ†\n542501 0 ไผ ๅช’ๅฎถๅพฎๅœˆ|ๅฌ่ฏด็ฟๆ˜Ÿๅˆถไฝœๆ‹›่˜ไบ†\n542502 0 ๆ˜Žๅคฉ่ทŒๅœๆฟๅŸ‹ไผไธญ่ˆช้ฃžๆœบ\n542503 0 ็Žฐๅœจ็š„ๅฐๅทไนŸๅคชTMๆœ‰ๆ–‡ๅŒ–ๆœ‰ๅ“ไฝไบ†\n542504 0 ๅ‰ๅพ€โ€œ้ซ˜ๅคงไธŠโ€็š„ๆทฎๅฎ‰ๅธ‚ๆ”ฟๅŠกๆœๅŠกไธญๅฟƒๅŠžไบ‹็š„ๅ„ไฝๅธ‚ๆฐ‘ๆœ‹ๅ‹ไปฌ\n1000 Processed\n classify content\n543000 0 ๆ—ฅๆœฌ็คพไผšๅฏนๅ—ไบฌๅคงๅฑ ๆ€ๆ™ฎ้ๆ˜ฏๆฒกๆœ‰ๅฆ่ฎค\n543001 1 xๆœˆxๆ—ฅ่ๆ™บไนฐx้€xใ€ไบฒไฝ“ไนฐx้€xใ€ๅฐๅฎ‰็ด ๅ’ŒไบฒๆŠคx.xๆŠ˜๏ผŒไธ€ๆฎตไธๅ‚ๅŠ ๆดปๅŠจ๏ผ่ๆ™บ๏ผŒไบฒไฝ“้ƒฝๆ˜ฏๆœ€ไฝŽ...\n543002 0 ๅ˜้ข‘ๆ˜พ็คบE7่ฏดๅฏ่ƒฝๆ˜ฏๆจกๅ—ๆˆ–ไธปๆฟๅไบ†\n543003 0 ๅ…จๅฎถxๅฃๅˆ†ๅˆซๆŒ่ฝฎ่ƒŽใ€้…’็“ถ็ญ‰็‰ฉๆ”ปๅ‡ป่ญฆๅŠกไบบๅ‘˜\n543004 1 ้‡‘ๅฐš่ฃ…้ฅฐ่ฎพ่ฎกๅธˆๅˆ˜ๅฝฆๆ ผ็ฅๆ‚จ๏ผšๅ…ƒๅฎต่Š‚ๅฟซไน๏ผไธบๅ›ž้ฆˆๆ–ฐ่€ๅฎขๆˆท๏ผŒๆญฃๆœˆๅไบ”ๅ…ƒๅฎต่Š‚ๅฝ“ๅคฉ็ญพ็บฆ้‡‘ๅฐšๅœจๆœ€ไผ˜ๆƒ ๅŸบ็ก€...\n1000 Processed\n classify content\n543500 1 ๆฌข่ฟŽ่‡ด็”ตๅƒๅถ้‡‘ๅฑžๅˆถๅ“(ๆทฑๅœณ)ๆœ‰้™ๅ…ฌๅธ.ๆœฌๅ…ฌๅธไธ“ไธšไธบๆ‚จๆไพ›:ไบ”้‡‘.่ฝด็ฑป.้“†้’‰็ญ‰.ๆ‚จ็š„ๆปกๆ„ๆ˜ฏๆˆ‘ไปฌ...\n543501 0 ๅพฎ่ฝฏๅฎฃๅธƒWindows10ๅฐ†ๅ…จ้ขๅ…ผๅฎนAndroidๅ’ŒiOSๅบ”็”จ\n543502 0 ๆ€ป็ฎ—surviveไบ†่ฟ™ๆฌก่พ›่‹ฆ็š„ๆ—…่กŒ\n543503 0 ๆ‘‡ๆ‘‡็œ‰ๅคดๆ“ๅฐฝไธ€่บซๆฑกๅžขๅ‰ๅŠ็”Ÿ็ˆฑๅไบ†่ฟ™ๅ‡กๅฐ˜ไธ–ไฟ—ไผๅœจๆฒ™ๅถไธŠๅป่ˆ”็ƒŸๅคดๆไบ†ๅ’ฝๅ–‰้•ฟไน…่พ“็ป™้†‰้…’\n543504 0 GPSไปฅๅ…จๅคฉๅ€™ใ€้ซ˜็ฒพๅบฆใ€่‡ชๅŠจๅŒ–ใ€้ซ˜ๆ•ˆ็އ็ญ‰ๆ˜พ่‘—็‰น็‚นๅŠๅ…ถๆ‰€็‹ฌๅ…ท\n1000 Processed\n classify content\n544000 0 ไนŸๅบ”ๅœจๅŒป็”Ÿ็š„ๆŒ‡ๅฏผไธ‹ไธฅๆ ผๆŽงๅˆถๅŒบๅŸŸๅ’Œ็”จ่ฏๆ—ถ้—ด\n544001 0 bbไปŠๅคฉ่ขซๅ„ฟ็ง‘ๅŒป็”Ÿๆ‰Žไบ†ไธ€้’ˆๅŽๅ›žๆฅๆ”พๅˆฐๅบŠไธŠ็ก่ง‰ไธ€ไผšๅ„ฟๅฐฑไผš่ขซๅ“้†’ๆŽฅ็€ๅคงๅ“ญ\n544002 0 
ไบ’่”็ฝ‘้‡‘่žๅทฒ็ปๅ‡็บงไธบๆ‰‹ๆœบไน‹ๅŽๆœ€้‡่ฆ็š„ๆˆ˜็•ฅๆฟๅ—\n544003 0 ่ฟ™ๆ˜ฏๅŽไธบ็ป™็”จๆˆทๆไพ›ๆž่‡ดไฝ“้ชŒ\n544004 0 ไธ่ƒฝ็ฎ€ๅ•ๆŒ‡ๆœ›่œ‚่œœใ€้ฆ™่•‰่งฃๅ†ณ้—ฎ้ข˜\n1000 Processed\n classify content\n544500 0 ๅ—ไบฌ่‹ๅทž็ญ‰8ๅธ‚่”ๅˆ็”ณๅŠžไธญๅ›ฝ่Žท็”ท็ฏฎไธ–็•ŒๆฏไธพๅŠžๆƒ\n544501 0 ่ฟ˜่ƒฝ้€š่ฟ‡ๆˆ็ป„LED็ƒๆณก็š„ๅ…‰่‰ฒๅฑžๆ€ง\n544502 0 ๆ”นไบ†ๆœ€ๅŽไธ€็ญๅ›žไธŠๆตท็š„้ฃžๆœบๅ‡Œๆ™จไธ‰็‚นๆ‰ๅˆฐๅฎถ\n544503 0 ๆฒกๆœ‰็œŸๆญฃ็š„ๆณ•้™ขๅฐฑๆฒก็“œๆœ‰ไธชไบบ็š„่‡ช็”ฑๅ’Œๅนณๅฎ‰\n544504 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏ่”้‚ฆๅฎถๅฑ…็ป็†่ดบๅธ…๏ผŒๆœฌๅบ—็ป่ฅๅฎžๆœจๅฎถๅ…ท๏ผŒ้ฃŽๆ ผๆœ‰ไธญๅผ๏ผŒๆ–ฐไธญๅผ๏ผŒ่‹ฑๅผ๏ผŒ็พŽๅผ๏ผŒๆ„ๅผ๏ผŒๅœฐไธญๆตท็ญ‰...\n1000 Processed\n classify content\n545000 0 ๆฑŸ้—จๅœฐๅŒบๅ…ฑๆœ‰17ๅฐ่‹ๅทž็”ณ้พ™็”ตๆขฏๆœ‰้™ๅ…ฌๅธๅˆถ้€ ็š„่‡ชๅŠจๆ‰ถๆขฏๅ’Œ่‡ชๅŠจไบบ่กŒ้“\n545001 0 ๅƒๅ—่ฅฟ็“œๅŽ‹ๅŽ‹้ฅฟๅงๅ”‰ๅŠ ๆฒนๅŠ ๆฒนๆ˜Žๅคฉๆ—ฉๅธ‚็š„่ตฐ่ตท\n545002 0 ๅ’ฑๅฎถๅญฉๅญ500ๅคšไธบๅ•ฅๅฐฑไธŠไธไบ†ๅŒไธ€ๆ‰€ๅญฆๆ ก\n545003 0 ไธขๅคฑ็š„xxไธช่ฅฟ็“œ็‰ตๅ‡บ็ฆปๅฅ‡็›—็ชƒๆกˆโ€ฆโ€ฆ\n545004 0 ่ฟ™ๆฌกๆœ‰็‚น่ฟ‡ๅˆ†ไบ†ๅฆ‚ๆžœไฝ ไปฌๆฒกๆœ‰ๆ€Žไนˆๅšๅดไบฆๅ‡กๅฏ่ƒฝไผšๆŠŠ็œŸ็›ธ้š็ž’ไธ€่พˆๅญๅˆซๆฌบ่ดŸไป–ๅ–„่‰ฏไฝ ไปฌไผšๆœ‰ๆŠฅๅบ”็š„\n1000 Processed\n classify content\n545500 0 8ๆœˆไปฝๆ‰“็ฎ—ๅŽป่ถŸๆต™ๆฑŸๆ•ฃๆ•ฃๅฟƒ\n545501 1 ไบฒ็ˆฑ็š„ไผšๅ‘˜๏ผš่ฏš้‚€ๅ‚ๅŠ xๆœˆxๆ—ฅโ€”xๆœˆxxๆ—ฅโ€œ้˜ณๆ˜ฅไธ‰ๆœˆๆƒ ๅŠจๆดฅๅŸŽ่ฅฟ้—จๅญๅšไธ–ๅฎถ็”ตๅคงๅž‹ๅ›ข่ดญไผšโ€ ๆดปๅŠจ๏ผŒ...\n545502 0 ไธญ่ˆชๅŠจๅŠ›ไปฅๅŠไธญ่ˆช้ฃžๆœบ็ญ‰้ƒฝๅ‡บ็Žฐไบ†่พƒๅคง็š„ๅ›ž่ฝ\n545503 1 ็ƒง้น…็Ž‹ๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆœฌๅบ—ไธบๆ‚จๆไพ›๏ผš้ข„ๅฎš็ƒง้น…ใ€้‡‘็‰Œไนณ็Œชใ€่œœๆฑๅ‰็ƒง็ญ‰็ƒงๅ‘ณ็ณปๅˆ—๏ผŒๅ›ขไฝ“่ฎข้ค๏ผŒไผ˜ๆƒ ๅคšๅคš...\n545504 0 ๅนฟๅœบๅ†…่ฟ˜ๆœ‰ๅ…่ดนwifiๅ‘ฆ~~ๆœ‰ๆฒกๆœ‰ๅฟƒๅŠจ\n1000 Processed\n classify content\n546000 0 ๆ˜จๅคฉๆˆ‘็œ‹ไบ†ๆท˜ๅฎ็œ‹ไบ†้˜ฟ้‡Œๅทดๅทด็œ‹ไบ†ไบฌไธœ\n546001 1 ๆ–ฐๆ˜ฅไฝณ่Š‚๏ผŒ้‡‘ไป•ๅ กๅฅ่บซๅ›ž้ฆˆๆ–ฐ่€ๅฎขๆˆท๏ผŒๆ‰€ๆœ‰ๅก็ง้ƒฝๆ‰“ๆŠ˜ๆ‰ฃ๏ผŒๆœ‰ๆ„่€…ๅฏไปฅ่”็ณปๆˆ‘๏ผŒไน”ๆผซใ€‚่ฐข่ฐข\n546002 0 ๅ’Œ้˜ฟ้‡Œไบ‘็š„ๅผ€ๆ”พๆœ็ดขๅฏนๆŽฅไธŠไบ†\n546003 0 ็œ‹ๅฎŒๆŠ„่ขญ่ดจ็–‘่ดดๅ’Œๆด—ๅœฐ่ดดๆˆ‘ๅฐฑ็บณ้—ทไบ†\n546004 0 17ไปŠๅนด3ๅฒๅ–œๆฌขไฝ 
็š„ๅคงๅคง็š„็œผ็›ๅ’Œ็ฒ‰ๅซฉ็š„่„ธๅบž\n1000 Processed\n classify content\n546500 0 ๆฑŸ่‹ๅซ่ง†่’™้ขๆญŒ็Ž‹่ฟ™ๆ˜ฏๆถๅฟƒๅˆฐไธ่ƒฝๅ†ๆถๅฟƒๅฅฝๅ—\n546501 1 ไธญ่‰้›†ๅŒ–ๅฆ†ๅ“ๅบ†ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๆดปๅŠจๅผ€ๅง‹ๅ•ฆ๏ผๅฟซไนๅฅณไบบ่Š‚๏ผ็พŽไธฝๅญ˜ๆŠ˜้€ไธๅœ๏ผๆดปๅŠจๆ—ถ้—ดxๆœˆxๆ—ฅ่‡ณxๆ—ฅๆฌข่ฟŽๆ–ฐ...\n546502 1 ๅŠฒ็ˆ†ไฟก่ดทๅ’จ่ฏข๏ผšๅ…ฌๅŠกๅ‘˜ใ€ไบ‹ไธš็ผ–ใ€ๅ›ฝไผใ€ไธŠๅธ‚ๅ…ฌๅธไป…ๅ‡ญ่Œไธšไฟกๆฏ๏ผŒๅณๅฏไปŽ้“ถ่กŒ่ฝปๆพ่ดท่ตฐxx-xxxไธ‡๏ผŒ...\n546503 0 ๅฐฑๅƒ่ขซๅˆคไบ†ๆญปๅˆ‘ๅด่ฟ˜ๆฒกๆœ‰ๆ‰ง่กŒ\n546504 0 ๅคงๅทด่ฝฆใ€้ฃžๆœบใ€ๅŠจ่ฝฆใ€่ฝป่ฝจใ€้ƒฝไธๅฏไปฅ่ถ…่ฝฝไธบไป€ไนˆๅฐฑ็ซ่ฝฆๅฏไปฅ่ถ…่ฝฝไบ†\n1000 Processed\n classify content\n547000 0 ็™พๅบฆ็™พ็ง‘็š„ๆ่ฟฐๆ˜ฏไธ‹ๅ›พ่ฟ™ๆ ท็š„\n547001 0 ๆ€ๆ‰‹ๅ’ŒๆŽขๅ‘˜ไน‹้—ด็ˆฑๆƒ…ไธŽๅˆฉ็”จ็บ ็ผ \n547002 0 ็Žฐๅœจๅฎ‰่ฃ…Exposureๆ’ไปถๅŽๆ็คบๆ— ๆณ•ๅฏๅŠจ0xc000007b\n547003 0 ไฝ†ๆŠŠๆฝœ่ง„ๅˆ™็š„้“็†่ฟ™ๆ ทๆ˜Žๆ˜Ž็™ฝ็™ฝ็š„่ฎฒๅ‡บๆฅไนŸ็ฎ—ๆ˜ฏ็ช็ ดไบ†\n547004 0 ไธ€็›ดๅœจๅˆท1599ๅ•Šๅ•Šๅ•Šๅ•Šๅฟƒ้ƒฝ้…ฅไบ†\n1000 Processed\n classify content\n547500 0 ไธ€ๆฌก่ฎฐxๅˆ†๏ผš้ฉพ้ฉถ็ฆๆญข้ฉถๅ…ฅ้ซ˜้€Ÿๅ…ฌ่ทฏ็š„ๆœบๅŠจ่ฝฆ้ฉถๅ…ฅ้ซ˜้€Ÿๅ…ฌ่ทฏ\n547501 0 ๆต™ๆฑŸๅคงๅญฆๆ•™ๆŽˆ/ๅšๅฏผใ€ไธญๅ›ฝๅทฅไธš่ฎพ่ฎกๆ•™่‚ฒ็•Œๅ…ญๅคงๅๅ˜ดไน‹ไธ€โ€”โ€”ๅบ”ๆ”พๅคฉ่€ๅธˆ\n547502 0 ๅฎๆณขๆณฐๅพ—็ง‘ๆบๆ‹›่˜็‰ฉๆตๅ•่ฏไธ€ๅ\n547503 0 ไธ€ไธ‹ๅคง้›จๆ”ฟๅบœ็ญ‰็›ธๅ…ณ้ƒจ้—จๆ€•ๅ†ฐ้›นๆ‰“ๆމ็ƒŸๅถ\n547504 0 ่ฃ…ๆˆxxxx้—ไบงๅŸบ้‡‘ไผšๆฐๅคซๆ‘ฉๆ น่ฏด็ปด็ด ใ€้’พ่ดจๅ’Œๆžœ่ƒถ็ญ‰็‰ฉๆœ‰ไธ€็งๆทฑๆฒ‰็š„ๅ–œๆฌข\n1000 Processed\n classify content\n548000 0 ็”ต่„‘่พๅฐ„ใ€ๆฑฝ่ฝฆๅฐพ็ƒŸใ€็ฉบๆฐ”ๆฑกๆŸ“ใ€ไปปไฝ•ไธ€็‚น้ƒฝๅœจๆ—ถๆ—ถๆฑกๆŸ“ไฝ ็š„ๆฏ›ๅญ”\n548001 0 ๆˆ‘ไปฅไธบ่‡ชๅทฑไผšๆฐธไธๅŽŒๅผƒไฝ ๅšไธ€่พˆๅญๆœ‹ๅ‹ๅ‘ข\n548002 0 ๆ”ถๅˆฐๅฟซ้€’็š„ๆ—ถๅ€™่ฟ˜ๅœจๅค–่พนๆ—…ๆธธ\n548003 0 ๆฒ›ๅŽฟไธพๅŠžIDTAๆ‹‰ไธ่ˆžๅ›ฝ้™…็ญ‰็บง่€ƒ่ฏ•\n548004 0 ๅฅฝ้šพ่ฟ‡ๅฎˆๆŠคไธไบ†ไฝ โ€ฆ็œŸ็š„โ€ฆๅฅฝ้šพ่ฟ‡\n1000 Processed\n classify content\n548500 0 ่ฝฏๅซฉ้…ธ็”œ่ฏฑไบบ็š„้•‡ๆฑŸ็ณ–้†‹ๅฐๆŽ’\n548501 1 ๆฟ€ๆƒ…ไธ‰ๆœˆ๏ผŒๆ˜Šไนๅœ†ไธบๅ›ž้ฆˆๆ–ฐ่€ๅฎขๆˆท๏ผŒ็‰นๆŽจๅ‡บๆ–ฐ็š„้…’ๆฐดๆดปๅŠจ๏ผŒๆ€ป็ปŸๅŒ…(xxxx)๏ผŒ่ฑชๅŒ…(xxxx)ๅคงๅŒ…...\n548502 0 GoogleไธŽไธ‰ๆ˜Ÿไน‹้—ดไธๅญ˜ๅœจ็ดงๅผ 
ๅ…ณ็ณป\n548503 1 ๆœฌๅ…ฌๅธไธ“ไธš่ฎพ่ฎกๅˆถ้€ ๆถฒๅŽ‹ๅ‡้™ๆœบๆขฐ๏ผŒๅ…ฌๅธๅœฐๅ€ๅนฟๅทžๅธ‚็™ฝไบ‘ๅŒบๅ‡็ฆพ่ก—ๆ–ฐ็Ÿณ่ทฏๅ…ฌไบค็ซ™ๆ—่พน.็Žฐ้œ€ๆ‹›็”ต็„Šๅทฅๅธˆๅ‚…...\n548504 0 ๆ’ๆŒ‡ๆœŸ่ดงไธŽๅ›ฝๆŒ‡ๆœŸ่ดง็š„ๆ‹Ÿๅฎšๅผ€ๅธ‚ไปท็Žฐๅˆ†ๅˆซๆŠฅ24164ๅ’Œ11040\n1000 Processed\n classify content\n549000 0 ่ฏดๆˆ’ๅฐฑๆˆ’ๅคฉๅคฉๆ™ƒๆ™ƒๅƒๅƒๅ–ๅ–ๅ“ชไธ่ˆ’ๆœ\n549001 0 ไธŠ่ฏๆŒ‡ๆ•ฐไธไป…ไธ€ไธพ็ช็ ด3700็‚น็š„ๆ•ดๆ•ฐๅ…ณๅฃ\n549002 0 ๅŒป็”Ÿๅฏนๆฏไบฒ่ฏด๏ผšโ€œไฝ ๅฅณๅ„ฟ็š„ๅฆ„ๆƒณ็—‡ๅพˆไธฅ้‡\n549003 0 lumia526ไธคๆฌกๅ‡็บงwp10้ข„่งˆ็‰ˆ้ƒฝๆ˜ฏไปฅๆ— ้™้‡ๅฏๅ’Œๆปš่ฝฎๅคฑ่ดฅ\n549004 1 ็Žฏไบšๅธ‚ไธญๅบ—ๆ‹‰ๅค่ดๅฐ”ๅฅณไบบ่Š‚็‰นๅ–ๅœบๆดปๅŠจ๏ผŒ็‰นๅ–ๅ…จๅœบx.x-xๆŠ˜๏ผŒxxๅ…ƒ่ตทใ€‚ไธ“ๆŸœx.xๆŠ˜่ตท๏ผŒๆ˜ฅ่ฃ…ๆŒ‡ๅฎš...\n1000 Processed\n classify content\n549500 1 ็›’ ไธƒ๏ผŒ็พŽๅณ้ข่†œ็‰นไปทx.xๅ…ƒ/็‰‡ ๅ…ซ๏ผŒๅ‡ก่ฟ›ๅบ—่ดญๆปกxxๅ…ƒๅณๅฏ่ต ้€ไฟฎ็œ‰ๅก ่ดญๆปกxxxๅ…ƒๅณ...\n549501 0 ไธ€่ˆฌไธบ1000~3000ๅ…ฌ้‡Œไน‹้—ด\n549502 0 ๆˆ‘ๅค–ๅฉ†้‚ฃๆ ทๅนด็บช70ๅคšๅฒ่ฟ˜ๅŠชๅŠ›ๆ‰ซๅœฐ็š„่€ไบบ\n549503 0 ๆˆ‘็š„็”ต่ฏ่ขซ่ฎค่ฏไธบไฝ™ๅงš้˜ฟ้‡ŒๅทดๅทดๆœๅŠกไธญๅฟƒไบ†\n549504 0 ๅฅณN็š„ๅฐๆŠคๅฃซไนŸๅ–œๆฌขไป–ๅพˆไน…ไบ†\n1000 Processed\n classify content\n550000 0 ้ฆ–ๆ‰นๆŠ€ๆœฏๆˆ็†Ÿ็š„้€้คๆœบๅ™จไบบโ€œ็ฟ ่Šฑโ€\n550001 0 ไปŠๅŽๅฐ†ๅฎž็Žฐ็”ฑ็œ็บงๆ”ฟๅบœๅœจๅ…จ็œ่กŒๆ”ฟๅŒบๅŸŸ็ปŸ็ญนๅผ€ๅฑ•ๅคง่ง„ๆจก่ทจๅœฐๅŒบใ€่ทจ้ƒจ้—จใ€ๅžฎๅฑ‚็บง็š„ไฟกๆฏๅ…ฑไบซๅ’Œ่”ๅŠจๅŠžๅ…ฌ\n550002 0 ๆš–้‡‘่‰ฒ็š„่œ‚่œœ่‰ฒ่ฐƒ็š„ๆœจๅˆถๅฐๅฑ‹ใ€้›จๆท‹ๆฟ็š„ๅฑฑๅข™ๅ’Œๅ‰็›–้—จๅปŠ็ป“ๅˆ็ฒพ็พŽ็š„ๅฎคๅค–ๅ‰็ฉบ้—ด็ชๅ‡บไบ†็Ÿณ้“บๅœฐ้ขๅ’Œไธ€ไธช่Šฑๅ›ญ็ปฟๆดฒ\n550003 0 ้‚ฃไนˆๆˆ‘ๅฏนไธ‰ๅ”ไปฅๅŠๆŠ•่ต„ๅ•†็š„็œผๅ…‰ๅผ€ๅง‹ไบ†ไธฅ้‡็š„ๆ€€็–‘\n550004 0 ไธ่ฟ‡vxinๅ…ฌzhongๅท่ฟ˜ๆ˜ฏไผšๅ’Œzoeไธ€่ตทๆ›ด\n1000 Processed\n classify content\n550500 0 ไน่ดญ่€ๅ‘˜ๅทฅ้›†ไฝ“็ดขๅฟๅŽๆถฆ็ณปๆˆ–ๅ–้ƒจๅˆ†ไน่ดญ้—จๅบ—ๆœ‰ไธšๅ†…ๆถˆๆฏ็งฐ\n550501 0 ๆขฆ่งๆœบๅ™จไบบๅ’Œ็ปๅœฐๆญฆๅฃซๅœจๆœˆไบฎๅ’Œๅฏๆ˜Žๆ˜Ÿไน‹้—ดๅ†ณๆ–—\n550502 0 ๅฏปๆ‰พๆฑŸ่‹ๆ‰ฌๅทžๅœฐๅŒบๅฟซ้—ชๆ‹ๆ‘„ๅ›ข้˜Ÿไธ€็ป„\n550503 0 ไธบไบ†่ตถไธŠ7๏ผš00ๅˆฐduma็š„้ฃžๆœบ\n550504 0 ๅ…ดๅŒ–ๅธ‚่€ๅนฒ้ƒจไนฆ็”ปๅฑ•ไปŠๆ—ฅๅœจๅธ‚ๅš็‰ฉ้ฆ†ๆญ็‰Œ\n1000 Processed\n classify content\n551000 0 
็œ‹ไธๆ‡‚IFไธปๅŠ›ๅ’ŒIHไธปๅŠ›ๆ€Žไนˆ่ฟ˜ๆ•ขๅคงๅน…่ดดๆฐด\n551001 0 5ใ€ๆ‰พๅ‡บๅ‰ไพ‹ไธญๆ‰€ๅŒ…ๅซ็š„่ง„ๅˆ™ๆˆ–ๅŽŸๅˆ™\n551002 0 ่€ณ่พนไธ€้้ไผ ๆฅๆฐงๆฐ”็š„่ฝป่ฝป็š„ๆญŒๅฃฐ\n551003 1 ใ€ไธญๅฑฑไปๅญšๅฅ”้ฉฐใ€‘็ฅๆ‚จๅ’Œๅฎถไบบๅ…ƒๅฎต่Š‚ๅฟซไน\n551004 0 ๅ’Œไบ•ๆŸ็„ถๅ…จๅ›ฝๅŽๆดไผšๆต™ๆฑŸ็พŽๅฅณไธ€่ตท็Žฉ่€\n1000 Processed\n classify content\n551500 0 Sparkๅœจ6ๆœˆไปฝๅ–ๅพ—ไบ†ๆฟ€ๅŠจไบบๅฟƒ็š„ๆˆ็ปฉ\n551501 0 ๅœจP2Pๅ€Ÿ่ดทใ€็ฝ‘็ปœ็†่ดข็ญ‰ไบ’่”็ฝ‘ๆŠ•่ž่ต„ๆ–นๅผๆฟ€ๅ‘ๅคงไผ—็š„ๆŠ•่ต„็ƒญๆƒ…ๅŽ\n551502 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ rx67h9ไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n551503 1 xxx=xxxxๅ…ƒ๏ผˆๆŒ‡ๅฎš้ฉพ้ฉถๅ‘˜๏ผ‰๏ผŒxๆœˆx.xๅท็ปญไฟ่ต ้€ๆœบๆฒนไธ€ๆฌกๆˆ–ๅ…่ดนๅ–ทๆผ†ไธ€ๆฌกๆˆ–ๅ››่ฝฎๅฎšไฝไธ€ๆฌก๏ผŒ...\n551504 0 ็ซž่ต›ๅฝ“ไธญ็š„่…่ดฅไธๆ˜ฏไธชไบบ่กŒไธบ\n1000 Processed\n classify content\n552000 0 5ๅฎถ้…ท็‚ซๅŒป้™ขๅผ•้ข†็ŽฐไปฃๅŒป้™ข่ฎพ่ฎก็†ๅฟต\n552001 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅˆšๅ’Œๆ‚จ่”็ณปๅŠž็†posๆœบ็š„ใ€‚็”ต่ฏ๏ผšxxxxxxxxxxx\n552002 0 ้šๅณๅขจ่ฅฟๅ“ฅๆ”ฟๅบœๅฐฑๆŽ’ๅŽไบ‹ไปถๅ…ฌๅผ€้“ๆญ‰\n552003 1 ไบฒ๏ผๅ–œ่ฟŽๅ…ƒๅฎต่Š‚๏ผŒไธ‰ๅ…ซ่Š‚ไน‹้™…๏ผŒๆˆ‘ไผ—้‘ซ็พŽๅฆ†๏ผŒ็‰นๅˆซๆŽจๅ‡บ๏ผŒไผ˜ๅฆฎ้€ๅฅฝ็คผ๏ผๅŽŸไปทxxxๅ…ƒๅฅ—็›’็ซ‹ๅ‡xxๅ…ƒ็Žฐ้‡‘...\n552004 1 ๆ‚จๅฅฝ๏ผšๅŸŽๅธ‚ไบบๅฎถๆบๆ‰‹xxๅฎถๅปบๆๅ•†ๆญฃๆœˆๅไธƒ๏ผˆx.xๆ—ฅ๏ผ‰ไธพ่กŒๅฎถ่ฃ…ๅปบๆๅ›ข่ดญไผš๏ผŒๅฎš่ฃ…ไฟฎๅฏไบซๅ—ๅปบๆๅ…จๅธ‚ๆœ€...\n1000 Processed\n classify content\n552500 1 ๆฌข่ฟŽ่‡ด็”ตๆ™ฎๅ…ฐๆ–ฏๅนฟๅ‘Š๏ผŒๆœฌๅ…ฌๅธไธ“ไธšๅˆถไฝœไบšๅ…‹ๅŠ›ๅˆถๅ“๏ผŒๅนฟๅ‘Šๆ ‡่ฏ†ๆ ‡็‰Œ๏ผŒๅนฟๅ‘Šๆ‹›็‰Œ\n552501 0 ่ฏดๅงไธ‹ไบ†้ฃžๆœบๅŽปๅ“ชๅ„ฟ้‡Žไบ†ไธ€ๅคฉๆ‹–็€ๅˆซไบบๅ‚ฌ็จฟๆ€ฅ็š„่ฆๆญป\n552502 0 ๆˆ‘ไธ€็›ด่ง‰ๅพ—ๆตทๅค–็š„ๅŽไบบ้ ๆ”ฟๅบœ\n552503 1 KARL LAGERFELD xยทxๅฅณไบบ่Š‚ๅคง็คผๅ›ž้ฆˆ๏ผšxๆœˆx-xๆ—ฅ่ดญไนฐๆปกx\n552504 0 ๅพฎ่ฝฏ้ซ˜็ฎก๏ผšๅพฎ่ฝฏไผšๅ› Win10ๅ…่ดนๅ‡็บงๆ”พๅผƒไธ€ไบ›่ฅๆ”ถ\n1000 Processed\n classify content\n553000 0 ไป–็š„้’ฑ่ขซๆˆ‘ไปŽ7000ๆ‰ฃๅพ—ๅชๅ‰ฉ5000ๅ•ฆ\n553001 0 โ€ๆ›ดไฝ•ๅ†ต่ฟ˜ๆ˜ฏๅฏนไบŽTFboysไธ‰ไธชๆœชๆˆๅนด็š„ๅ…ฌๅ…ฑไบบ็‰ฉ\n553002 0 ่ฟ™ๆ˜ฏๆฅ่‡ชๆฑŸ่‹่ฎธ้ธฟ่กŒๅ…ˆ็”Ÿ็š„ๅ‚่ต›ไฝœๅ“\n553003 0 ็™พๅบฆๅ‘็Žฐ่ฟ™ๆ ท็š„้—ฎ้ข˜ไธๆ˜ฏไธชไพ‹\n553004 0 ๅ่ฎฎ่ต„้‡‘295ไบฟๅ…ƒใ€ๅฎŒๆˆๅนดๅบฆ็›ฎๆ 
‡ไปปๅŠก็š„57%\n1000 Processed\n classify content\n553500 0 ไฝœไธบๆ›พ็ป่ขซๅŒป็”ŸๅˆคๅŠไธชๆญปๅˆ‘็š„ไบบๅคช่ƒฝ็†่งฃไป–ไปฌๅคซๅฆปไธคไธบไบ†่ฆๅญฉๅญๅ—็š„่‹ฆ\n553501 0 ๅœฐ็ƒไธŠ็š„็ปด็”Ÿ็ด B3ๅฏ่ƒฝ่ตทๆบไบŽๅคช็ฉบ\n553502 0 ๆŠ•่ต„่ฟ™ไปถไบ‹ไธ่ƒฝๆ˜ฏ็”Ÿๆดป็š„ๅ…จ้ƒจ\n553503 0 ไธคไธชๅฐๆ—ถ็š„ไธๆ ‡ๅ‡†ๆฑ‰่ฏญ+ไธๆ ‡ๅ‡†่‹ฑ่ฏญ\n553504 0 Mxx็ƒ็Šถๆ˜Ÿๅ›ขๆ˜ฏๅŒ—ๅŠ็ƒไธญๆœ€ไบฎ็š„็ƒ็Šถๆ˜Ÿๅ›ข\n1000 Processed\n classify content\n554000 0 ๅˆ†ๅธƒๅœจๅ†…ๆฒณไธๅŒๆฐดๅŸŸ็š„95่‰˜้‡‡็ ‚่ˆน้›†ไธญๆ•ดๆฒปใ€็Žฐๅทฒๅ…จ้ƒจๅœๆœบ\n554001 0 ้›†ไธญไฟฎๅค14ๅคฉ็ฒพๅŽๆฏๅคฉ็”จ่ฟ‡ๅŒ–ๅฆ†ๆฐดๅŽๆŒ‰ๆ‘ฉๆถ‚ๅœจๅ…จ่„ธ\n554002 1 ๅฐŠๆ•ฌ็š„็”จๆˆท๏ผšๆ‚จ็š„ๆ‰‹ๆœบ่ฏ่ดน็งฏๅˆ†ๅทฒๆปก่ถณๅ…‘ๆขxxx.xxๅ…ƒ็š„็Žฐ้‡‘็คผๅŒ…ๆกไปถ๏ผ่ฏท็”จๆ‰‹ๆœบ็™ป้™† jggxx...\n554003 1 lๆญ่ดบๆ–ฐๆ˜ฅ๏ผš่ดตๅฎพๅŽ…็ฅๆ‚จๆ–ฐๆ˜ฅๅคงๅ‰๏ผwww.xxxxxx.comๆŽจๅ‡บๆฏๅคฉๆ‰“็ ่ฟ”x%ๆ— ไธŠ้™๏ผ›ๆ„ฟๆ‚จๅฟƒ...\n554004 0 ๆ˜Žไบฎ่‰ฒๅฝฉๅค„็†+้•œๅคดๅ™ช็‚นใ€ไฟฏๆ‹ๅฏน็งฐๅ–ๆ™ฏ\n1000 Processed\n classify content\n554500 0 โ€็—…ๅ‹ไธ่งฃๅœฐ่ฏด๏ผšโ€œ้‚ฃๆ›ดๅบ”่ฏฅ็Ÿฅ้“่ตกๅ…ป่€ไบบๅ•Š\n554501 0 ไธบไบ†้˜ฒๆญขๆœ‰ๅไบบๆปฅ็”จไธชไบบไฟกๆฏ\n554502 1 ไธบ็ญ”่ฐขๆ–ฐ่€้กพๅฎขๅฏนๆ–ฐไธœๆธฏKTV็š„ๅŽš็ˆฑ๏ผŒ็‰นๅœจๅ…ƒๅฎตๆฅไธดไน‹้™…ๆŽจๅ‡บ็บฏ็”Ÿๅ•ค้…’xxxๅ…ƒไธคๆ‰“๏ผŒๆœŸๅพ…ๆ‚จ็š„ๅ…‰ไธด๏ผŒ...\n554503 0 ๆˆ‘ๅœจๆฑŸ่‹ไธบไผ ๅฅ‡ๆ‰‹ๆœบ็‰ˆๅŠฉๅŠ›\n554504 0 ๆ— ๆ”ป็•ฅไธ‹100%้€šๅ…ณ่ฏ่ฏด่ฟ™ไธบไป€ไนˆไธๆ”นๅๅซ่ขซ้”็€็š„็›’ๅญ็š„็›’ๅญ็š„็›’ๅญ\n1000 Processed\n classify content\n555000 0 ่Žทๅพ—ไบ†ๆต™ๆฑŸ็œๅ†œไธšๅ‰ๅฐผๆ–ฏๅง”ๅ‘˜ไผš้ขๅ‘็š„ๆœ€้ซ˜ไป™ไบบๆŽŒๅฅ–็‰Œ\n555001 0 8ๆœˆ1ๆ—ฅๅพฎ่ฝฏ็”ต่ฏ่ฅ้”€ๅ›ข้˜Ÿๅ›ขๅปบ็Žฐๅœบๅ›พ\n555002 0 โ€ๅฐๅท็œ‹็€ๅ„ฟๅญ้‚ฃๅ‰ฏๅฐ–ๅ˜ด็Œด่…ฎ็š„ๆจกๆ ท\n555003 0 ไนฐ็š„่Šฑๅƒ้ชจๅฐ่ฏด็ปˆไบŽๅˆฐๅ•ฆๅฅฝๅผ€ๅฟƒๅ•ฆๅ•ฆๅ•ฆ\n555004 0 ๅ—ไบฌไธ‡็ง‘็ฝฎไธšๆ€ป็ป็†ๆœฑไฟๅ…จ\n1000 Processed\n classify content\n555500 0 ๆœ€ๅŽไธ€ๅคฉๆ‰Žๅ้’ˆ็–ผ็š„้บปไบ†้˜ฟๅงจ่ฏด้‚ฃๆœ€ๅฅฝๆ‰Žๅˆฐไฝไบ†ๅ“ˆๅ“ˆๅฏ็ˆฑ็š„้˜ฟๅงจ๏ฝžๅ›žๅŽปๅผ€ๅง‹ๆŽฅ็€ๆขๅคๆฌง่€ถๆŒ‰ๅŒป็”Ÿไปฌ็š„ๆŒ‡ๅฏผ...\n555501 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ๆ‚จๅฅฝ๏ผŒไธ‰ๆœˆไปฝๅฅณไบบ่Š‚๏ผŒไธบ็ญ”่ฐขๆ–ฐ่€ๅฎขๆˆทๆœฌๅบ—ๆ‰€ๆœ‰ๆŠค็†ๆดปๅŠจๅ‡ไนฐไธ€้€ไธ€๏ผŒไผ˜ๆƒ 
ๅคšๅคšใ€‚ๆฌข่ฟŽๅ„ไฝๅฐ...\n555502 0 ๅœฐ้“ๅฏน้ข็ซ™ไบ†ไธช็–‘ไผผๅœจๅทๆ‹็š„ไบบ็‰ฉ\n555503 0 ๆŽจ่่ฟ™ๅฎถๅพฎๅบ—๏ผš็‡•ๅญ็š„่œ‚่œœ\n555504 0 ่€Œไธ”่ฟ˜็œ‹ๅˆฐxxๅคš่พ†ๅทฒ็ป็ฆๆญข็š„่ฅ่ฟไธ‰่ฝฎ็ญ‰็€ๆ‹‰ไบบ\n1000 Processed\n classify content\n556000 1 ๆฌข่ฟŽ่‡ด็”ต๏ผšๆต้˜ณ่’ธ่œ้ฆ†ใ€ๆœฌๅบ—็Žฏๅขƒๅนฝ้›…ใ€่œๅ“้ฝๅ…จใ€ไธป่ฆ็ป่ฅๆต้˜ณ็‰น่‰ฒ่’ธ่œใ€ๆ‚จ็š„ๆปกๆ„ๅฐฑๆ˜ฏๆˆ‘ไปฌ็š„่ฟฝๆฑ‚ใ€...\n556001 1 ๅฏ็ˆฑ*^o^*็š„ๅฎถ้•ฟไปฌๆ‚จๅฅฝ๏ผ็ซฅๅฟƒ็ญ‘ๆขฆๅนผๅฐ‘ๅ„ฟไนฆ็”ปๅŸน่ฎญ็ญๆ˜ฅๅญฃๆŠฅๅๆญฃๅผๅผ€ๅง‹ไบ†โ€ฆ้œ€่ฆ้ข„ๅฎšๅญฆไฝ็š„ๅฎ่ดไปฌ...\n556002 0 ไธไฟกๅŽป้—ฎไธ‹Googleโ€”โ€”ๆ”ฏไป˜ๅฎ่ฏด๏ผšไฝ ่ฟ™ไนˆๆณจ้‡้š็ง\n556003 0 xxxๅคฉๆฏๅคฉๆ— ๆ•ฐๆฌก้šๆ—ถ้šๅœฐ่กฅๆฐด\n556004 0 ้’ๅฒ›้ป„ๆตทๅญฆ้™ข่ทŸ้’ๅฒ›ๆปจๆตทๅญฆ้™ขๅ“ชไธชๅฅฝ4\n1000 Processed\n classify content\n556500 0 ่€Œไธ”ๆ’ญๅ‡บ็š„่Šฑๅƒ้ชจไธญ้–‹้ ญ้‚„ๆœ‰10ๅคšๅˆ†้˜็š„้‡ๅพฉๆƒ…็ฏ€\n556501 0 8ๆœˆ7ๆ—ฅ็พŽๅ›ฝ่พพ็พŽ่ˆช็ฉบ1889ๆฌก่ˆช็ญ็”ฑๆณขๅฃซ้กฟ่ตท้ฃž็›ฎ็š„ๅœฐไธบ็Šนไป–ๅทž็›ๆน–ๅŸŽ\n556502 0 ๆŠ•่ต„่€…ๅบ”่ฏฅๅ…ทๅค‡็š„้‡่ฆไธ€็‚น๏ผš็™พๆŠ˜ไธๆŒ \n556503 0 nicereunion??pic\n556504 0 ๅธๅผ•ไบ†ๆฅ่‡ชๅ…จๅ›ฝ120ๅคšๆ‰€้ซ˜ๆ ก็š„1300ไฝ™ๅ่ฟๅŠจๅ‘˜ๅ‚ๅŠ \n1000 Processed\n classify content\n557000 0 ๆ—ฅๆœฌ่ฝฌ่ฟไธญๅ›ฝๆตทๅค–ๅ…ๆœๅŠก่ดนๅˆ็ฎฑEMS/่ˆช็ฉบ/ๅ…ฌๅธไบš้ฉฌ้€Šไนๅคฉไปฃ่ดญ็งไบบ\n557001 0 ๅฏ้ ็š„P2P็ฝ‘่ดท็ณป็ปŸๅปบ่ฎพๅผ€ๅ‘ๅ…ฌๅธๅฏไปฅๅธฎๅŠฉๅนณๅฐ่ตฐๅ‘ๆ›ด่ง„่ŒƒๅŒ–ๅ’Œ็ง‘ๅญฆๅŒ–็š„้“่ทฏ\n557002 0 ๅ”ฏ็‹ฌๆœ€่ฏฅๆ”ถ็š„ๆˆฟไบง็จŽ่ฟŸ่ฟŸๆฒกๆœ‰ๆ”ถ\n557003 0 ่ฟ™้ƒจ่‹นๆžœ5sๆ‰‹ๆœบๆ˜ฏไฟž็‡•็š„ๅฉถๅฉถไนฐ็ป™ๅฅน็š„\n557004 0 ๆทฑๅœณๅธ‚ๆ”ฟๅบœๅ‰ฏ็ง˜ๆ›ธ้•ท้ซ˜ๅœ‹่ผ่กจ็คบ\n1000 Processed\n classify content\n557500 0 ๆณ—ๆดชๅŽฟ้•ฟ้€”ๆฑฝ่ฝฆ็ซ™ๆณ—ๆดช่ฝฆ็ซ™ๅˆฐๅคฉๅฒ—ๆน–ๅ…ฌไบค่ฝฆๅคชๆถจๅคชๅธไนฑๆ”ถ่ดน\n557501 1 ๅณฐๅบฆๅฎถ่ฃ…ๆ–ฝๅทฅ้˜Ÿๆ‰ฟๆŽฅ่ฃ…ไฟฎไธšๅŠก๏ผŒ่œๅ•ๅผๆŠฅไปท๏ผŒไปทๆ ผๅฎžๅœจ๏ผŒ่ดจ้‡่ฟ‡็กฌใ€‚็”ต่ฏ๏ผšxxxxxxxxxxx ๅœฐ...\n557502 0 ไธๅ†ๆปฅ็”จๆŠ•่ต„ๅˆฐ้ฉฌๆ ผ้‡Œๅธƒ็š„่ต„้‡‘\n557503 0 ๆœฌๆฌกๅŒป็–—ๆœบๆž„ๅฎž่กŒไปทๆ ผๅ…ฌ็คบ้‡็‚น\n557504 0 intelๆ›ด่ฆ็กฎไฟๅˆถ็จ‹ๅคงๆˆ˜ไธญไธ่ขซไธ‰ๆ˜Ÿใ€ๅฐ็งฏ็”ต็”ฉๅผ€\n1000 Processed\n classify content\n558000 0 
้ข†ๅฏผไธ€ๅฅๆ”ฟๅบœๅŠžไธๅ‘ๆˆ‘ไน‹ๅ‰็š„่ฟž็ปญๅŠ ็ญๅฐฑ็™ฝ่ดนไบ†\n558001 1 ่€ๆฟ๏ผŒๆ–ฐๅนดๅฅฝ๏ผŒๆˆ‘ไปฌๆ˜ฏๅธธ็†ŸๅŽ้‘ซๆฐดๆด—ๅŽ‚๏ผŒไธ“ไธšๆฐดๆด—ๅ„็ง็‰›ไป”่กฃๆœ๏ผŒ่ฃคๅญ๏ผŒ่ฃ™ๅญๆฐดๆด—๏ผŒไปฅๅŠๅ„็ง็‰›ไป”ๅทฅ่‰บ๏ผŒ...\n558002 0 ๅฏนๆœ‹ๅ‹ๅพˆๅๆ„Ÿไฝ†ๅˆๆฒกๅŠžๆณ•่ฏดๅ‡บๅฃๆƒณๆ•ดไบบไฝ†ๅˆๆฒกๆœ‰ๅฅฝ็š„ๅŠžๆณ•\n558003 0 ๅœจไธ€ไธช็‹ฌ็ซ‹่ฎพ่ฎกๅธˆไบคๆตไผšไธŠ็œ‹ๅˆฐ็š„็ฒพ่‡ดๆ‰‹ๅทฅไฝœๅ“\n558004 0 ๅฝ“ๆˆ‘ไปฌ่ขซๆ‰“้ช‚่ขซๆๅ“ๆƒณ่ฆ็”จๆ‰€่ฐ“็š„ๆณ•ๅพ‹ๆญฆๅ™จไฟๆŠค่‡ชๅทฑๆ—ถๅฐฑๅช่ƒฝๆ‰พๅˆฐๅŠ้˜ปๆ•™่‚ฒ็ฝšๆฌพ่ฟ™ไบ›ไนˆ\n1000 Processed\n classify content\n558500 0 โ€œๅฏนๆˆ‘ๆฅ่ฏดๆ‰‹ๆœบๅชๆ˜ฏ็”ตๅญๆ‰‹่กจ่€Œๅทฒ\n558501 0 Gxx้•ฟๆทฑ้ซ˜้€Ÿ็”ฑๆญๅทžๅพ€่ฟžไบ‘ๆธฏๆ–นๅ‘ๅฎๆญๆฎตKxxxx+xxx่‡ณKxxxx+xxx้™„่ฟ‘ๆ–ฝๅทฅ็ป“ๆŸ\n558502 0 ๆฌข่ฟŽๅ…ถไป–ๅ—ไบฌๆœช่ƒฝ็›ธ่š็š„็‚นๅฟƒๅ’Œ็ƒญ่”ๆˆๅ‘˜ไปฌ่ธŠ่ทƒๅ‹พๆญ\n558503 0 ๅ›ฝๅฎถ้˜ŸๅŠจๆ‰‹็š„ๆ ‡ๅฟ—ๅฐฑๆ˜ฏๆœŸ่ดง็›˜ไธญๅƒๆ˜จๅคฉ่ฟ™ๆ ท่„‰ๅ†ฒๆ‹‰ๅ‡\n558504 1 ๅนฟไธœๅŽๆ้“ๆ้ญๅฏๅŠฉ: ๆ‰‹ๆœบxxxxxxxxxxx ๆˆ‘ๆ›พ็ปๅˆฐ่ดตๅบ—ๆ‹œ่ฎฟ่ฟ‡๏ผŒๆœฌๅธไธ“ไธš็”Ÿไบงๅ„็ง้—จ็ช—้“...\n1000 Processed\n classify content\n559000 0 ๅŽป็ฒ‰ๅˆบ่ฟ™ๆฌพๅฑๅฑ้œœๅŽŸๆฅๆ˜ฏ่ฆ็ป™ๅฐๅญฉๅญๆฒป็–—็บขๅฑๅฑ็š„\n559001 0 ๆˆ‘็ˆธ2015ๅนด6ๆœˆ24ๆ—ฅ่ขซๅทไผๆˆ‘ๅฎถๆ ‘็š„ไบบๆ‹ฝๆ™•\n559002 0 ๅคงๅฎถๆ˜ฏไธๆ˜ฏ้ƒฝไผš็œ‹ๆ‰‹ๆœบ่€Œไธขๆމๅฎฃไผ ้กตๅ‘ข?ๆˆ‘ไปฌ\n559003 0 ้‡‡ๅ–ๅคš็งๆŽชๆ–ฝๅšๅฅฝๆณ•ๅพ‹ๆดๅŠฉๅทฅไฝœ๏ผš\n559004 0 ้พ™ๅฃๅธ‚ๅผ€ๅฑ•็”ตๆขฏๅฎ‰ๅ…จไธ“้กนๆ•ดๆฒป\n1000 Processed\n classify content\n559500 0 ๆˆ‘ๅ…ถไป–ๅœฐๆ–นๅŒ…ๆ‹ฌ็”ต่„‘ไธŠ้ƒฝๆฒกๅญ˜ๅ•Š\n559501 0 2015ๅนดไธŠๅŠๅนด่ขซๅฑฑๅฏจๆœ€ๅคš็š„ๅ“็‰Œไพ็„ถๆ˜ฏไธ‰ๆ˜Ÿใ€ๅฐ็ฑณ\n559502 0 ๆˆ‘ๆƒณๆˆ‘็•™ๅ—ไบฌๆœ€ๅคง็š„็—›่‹ฆๅฐฑๆ˜ฏๅฎถไบบ็š„ไธๆ”ฏๆŒไปฅๅŠไธ€ไธชไบบ็š„ๅญคๅ•\n559503 0 ๆญคๆฌกๅ…ฑ่ฎกๆœ‰168ๅ่€ƒ็”ŸๆŠฅๅๅ‚ๅŠ ่€ƒ่ฏ•\n559504 0 ้ฃŽ้™ฉๆŠ•่ต„ๆฒกๆœ‰่ฎกๅˆ’ๅ’Œๆฆ‚ๅฟตๅฐฑๆ˜ฏๆœ€ๅคง็š„้ฃŽ้™ฉ\n1000 Processed\n classify content\n560000 0 ่ฎพ่ฎก่ฟ™ไบ›ๅปบ็ญ‘็š„ๅคงๅธˆไปฌ็œŸๆ˜ฏ็€ๅฎžไปคไบบไฝฉๆœ\n560001 0 ๅนณๅ‡ไธ‹ๆฅ็›ธๅฝ“ไบŽๆฏๅคฉ20ๅˆ†้’Ÿๅทฆๅณ\n560002 0 ่ฟ˜่ฎฐๅพ—ๆˆ‘ๅŽปๆต™ๆฑŸ่ฟž่ฟž็ง‘ๆŠ€็š„ๆ—ถๅ€™\n560003 1 ้ฃž้นคๅฅถ็ฒ‰ๆžๆ‰“ๆŠ˜ๆดปๅŠจไบ†๏ผŒๆ˜Ÿ?ๆตทๆ˜Ž้ƒฝไบŒๅบ—\n560004 0 
ๆˆ‘่ฟ˜ๆ˜ฏๅ…ฑไบงไธปไน‰ๆŽฅ็ญไบบๅ‘ข\n1000 Processed\n classify content\n560500 0 ๆต™ๆฑŸ็œๆกฃๆกˆๅนฒ้ƒจๆ•™่‚ฒๅŸน่ฎญไธญๅฟƒๅ…ณไบŽไธพๅŠžๅ…จ็œโ€œไบ”ๆฐดๅ…ฑๆฒปโ€ๆกฃๆกˆไธšๅŠกๅŸน่ฎญ็ญ็š„้€š็Ÿฅ\n560501 0 ่ฝฌๆตทๆท˜ๅฏŒ่ฃ•็š„ๆฏ›ๆฏ›่™ซxxx/xx\n560502 0 ไธŽBelgicaๅคšๆฌกๅœจ้ฃžๆœบไธŠ็›ธ้‡\n560503 0 ๅ†ฌๅญฃๆ‰่ƒฝ้ฟๅ…้ข„้˜ฒ็–พ็—…็š„ไบง็”Ÿ\n560504 0 ๆ˜จๆ™šๅ–ๅพ—ๆ™•ไนŽไนŽ็š„๏ฝž็„ถๅŽๆœ‰ๅฎขๆˆท่ฏด่ฆๆฅๆ‹ฟ่ดง\n1000 Processed\n classify content\n561000 0 ๅŒ—ไบฌๅ„ฟ็ซฅๅŒป้™ขๅ„ฟ็ซฅไฟๅฅไธญๅฟƒไธปไปปๅŒปๅธˆๅˆ˜ๆ˜ฅ้˜ณๆ้†’\n561001 0 ๅฐฝ็ฎกๆˆ‘ไปฌ่ขซๆœ‹ๅ‹ๅ˜ฒ็ฌ‘่ขซๅฎถไบบๅซŒๅผƒ่ขซไผ™ไผด่ดจ็–‘่ขซ็ป้ชŒๆŸ็ผš่ขซๅŒ่กŒไนฑไปท่ขซๅช’ไฝ“ๆŠน้ป‘ไฝ†่ฟ™ไนˆๅ›ฐ้šพ\n561002 0 ๅ‘ตๅ‘ตๆฏๆฌก้ƒฝ่ฟ™ๆ ท่ตฃๆฆ†ไบบๅฐฑๆ˜ฏ่ฟ™็งๆ€งๆ ผไนˆๆ—ข็„ถไป€ไนˆ้ƒฝไธๅˆ้‚ฃๆ นๆœฌไธๆ˜ฏไธ€่ทฏไบบๅ•Š็œŸๆ˜ฏๅ—ๅคŸไบ†\n561003 0 ๅœฐๆ–นๆฅผๅธ‚ๅŠๅนดๆŠฅ๏ผšๅนฟไธœๆต™ๆฑŸ็ญ‰ๆฒฟๆตทๅŸŽๅธ‚ๅขžๅน…้ข†ๅ…ˆ\n561004 0 ไฝ†ไป–ไปฌๅด็›ดๆŽฅๆŠŠ้’ฑๆ‰“็ป™ไฝ ่ฟ™ๆ˜ฏๅˆซไบบ็›ธไฟกไฝ \n1000 Processed\n classify content\n561500 0 ๆณ•ๅญฆxxxx็š„ๅผ ๆผ”้”‹ๅŒๅญฆ็š†่ฟ›ๅ…ฅไบ†ๆ€ปๅ†ณ่ต›\n561501 0 Piagetไผฏ็ˆตchaumetๆœ€ๆฐธๆ’็š„ๅนธ็ฆไธๆ˜ฏๆ‹ฅๆœ‰ไฝ \n561502 0 ไฝ ่ฆๅ‘็ป™ๅๅ…ญไธชๆœ‹ๅ‹ๅŒ…ๆ‹ฌๆˆ‘่‹ฅไธๅ‘\n561503 0 ๆœ‹ๅ‹ๅ‘จๆ—ฅๅŽปๆ— ้”ก้˜ณๅฑฑๆ‰นๅ‘ๆฐด่œœๆกƒ\n561504 0 ๅ› ไธบๆƒณ็‹ฌ็ซ‹ๅ› ไธบไธไฟกๅ‘ฝๅ› ไธบๅšไฟกๅฏไปฅๆˆๅฐฑๆ›ดๅฅฝ็š„่‡ชๅทฑๅœจๅพฎๅ•†็š„่ทฏไธŠๅ†ณๅฟƒ่ตฐๅˆฐๅบ•ไธ็•ๆƒงไธ้€ƒ้ฟๅ‹‡ไบŽ็›ด้ขๆŒ‘ๆˆ˜...\n1000 Processed\n classify content\n562000 0 goodbeysummer??\n562001 0 ๅ—ไบฌ็š„ๆตทๅบ•ไธ–็•Œ่ฟ˜ๆ˜ฏไธŠๆตท็š„ๆตทๆด‹้ฆ†ๆ›ดๅฅฝ็Žฉ\n562002 0 ๅผฅ่กฅไบ†ๆˆ‘ๅ›ฝๆ—…ๆธธๆ–‡ๅŒ–ๅญฆ็ง‘็ ”็ฉถ็š„ไธ€ไธช็ฉบ็™ฝ\n562003 0 ๅ› ๆญคๆˆ‘ไปฌๅ†ณๅฎš่ฏšๆ‹›ๆœ‰1ๅนด็พŽ็”ฒๅทฅไฝœ็ป้ชŒไปฅไธŠ็š„็พŽ็”ฒๅธˆไธ€ๅ\n562004 0 ๅŒๆ—ถไนŸๆ˜ฏ็พŽๅ›ฝ่ˆชๅคฉ้ฃžๆœบ็š„ๆœ€ๅŽไธ€ๆฌก้ฃž่กŒไปปๅŠก\n1000 Processed\n classify content\n562500 0 ๅฏ่ƒฝๆฒก่ฟ‡ๅคšไน…็š„ๆ—ถ้—ด้˜ด้“ๅฐฑไผšๆœ‰ๅฐ‘้‡ๅ‡บ่ก€็Žฐ่ฑก็š„ๅ‘็”Ÿ\n562501 0 ๅพๅทž็งปๅŠจไธšๅŠกๆŽจๅนฟๅฐ•ไธ“็”จๅท\n562502 0 ๅˆฐ9ไธชๆœˆๅคงๆ—ถ่ฟ™้กฟๅฅถๆ›ดๆ˜ฏ่‚ฏๅฎšไธๅฟ…่ฆไบ†\n562503 0 ่ฟ™ๆฌกๆƒณๅŠžๆณ•ๅœจ็”ต่„‘็ฝ‘้กตไธŠๆ‰“ๅผ€ๆ”ฏไป˜็•Œ้ข\n562504 0 ไฝ ไนŸๅฟซๆฅ่กจๆ€ๅง~ๅฆ‚ๆžœไฝ ๆ˜ฏ็ฑณๆœต\n1000 
Processed\n classify content\n563000 0 ๅผ ่‰บ่ฐ‹ๅทฒๅ†ณๅฎšๆŽฅๅ—ๆ— ้”กๆปจๆน–ๅŒบ่ฎก็”Ÿๅฑ€xxxไธ‡ไฝ™ๅ…ƒ็š„็ฝšๆฌพ\n563001 0 ่€Œ่ฟๆณ•ๅฏ†ๅบฆๆœ€ๅคง็š„ๅฝ“ๆ•ฐๆ–‡ๅŒ–่ฅฟ่ทฏxxxๅคš็ฑณ็š„ๆŠ“ๆ‹่ทฏๆฎต\n563002 0 ็”จๆปดๆปดๆ‰“่ฝฆ้กบ้ฃŽ่ฝฆๅฟซ็š„ไธ€ๅทไธ“่ฝฆๅ˜€ๅ—’ๆ‹ผ่ฝฆๅคฉๅคฉ็”จ่ฝฆๆ˜“ๅˆฐ็”จ่ฝฆxx็”จ่ฝฆ็ฅž่ˆŸไธ“่ฝฆ่ฟ˜ไธๅฆ‚็”จuberๆ‰“่ฝฆ\n563003 0 ๅฎŒๆˆ่€ๆตฆๅฃๅœฐๅŒบๆ•ดๆฒปๅ’Œๆตฆๅฃๅކๅฒ้ฃŽๅ…‰ๅธฆๆ•ดๆฒป\n563004 1 ????ๅฐŠๆ•ฌ็š„่ดตๅฎพไปฌ๏ผ›XG้›ชๆญŒ็ป™ๆ‚จ้€็คผๅ•ฆ๏ผŒๅฅฝ็คผๅ…่ดน้€ๅ“ฆ[ๆ„‰ๅฟซ][ๆ„‰ๅฟซ]ๆฌข่ฟŽๆๅ‰ๅบ—้“บๅ‚ๅŠ ้ข„ๅฎšๆŠข...\n1000 Processed\n classify content\n563500 0 ๅ‡ๅฆ‚้ฉฌไบ‘ๆฅๅˆฐไธญๅ›ฝๅฅฝๅฃฐ้Ÿณไฝ ไปฌ่‡ชๅทฑๆ„Ÿๅ—ไธ€ไธ‹\n563501 1 ๅ„ไฝไบฒไปฌๆ–ฐๅนดๅฅฝ'็ฅ็พŠๅนดๅคงๅ‰\n563502 0 ๆ‰‹ๆœบๅ…ณๆœบไธ‰ๅ›žไบ†่ฟ™ๆ˜ฏ็ฌฌไธ‰ๆฌกๅผ€ๆœบไบ†\n563503 0 ็‰นไนˆ็Ž‹ไฟŠๅ‡ฏๅไธช้ฃžๆœบๅชๆœ‰ไธ€ไธชๅฐ้ฉฌๅ“ฅ่ทŸ็€\n563504 0 ไบ”็ฒฎๆถฒ้›†ๅ›ข้ƒฝๅ‡บ็Ž›ๅ’–้…’ไบ†โ€ฆ็Ž›ๅ’–ๆค็‰ฉ่ƒๅ–\n1000 Processed\n classify content\n564000 0 ่ฆๅƒ้˜ฒ่ดผไธ€ๆ ท้˜ฒๅฎ˜ๅ‘˜่ดชๆฑก็›—็ชƒ\n564001 0 1ๅˆฐwin10ๅบ”็”จๅ•†ๅบ—ๅฐฑๆฒกๅฅฝ็”จ่ฟ‡\n564002 0 ้€™ๆ˜ฏ่ฆๆˆ‘ไปฅๅพŒ้ƒฝไธ่ฆๆ”พcos็…งไธŠไพ†็š„็ฏ€ๅฅ\n564003 0 ๆ‹ฟxxๅ—็ฒพๆฒน็š‚็š‚ๅŠ ๅ…ฅๅธธๆœ‰ไปฃ็†้—ฎๆˆ‘ๆ€ŽไนˆๅŠ ๅ…ฅๅ—ๅจœไปฃ็†ๅ›ข้˜Ÿ\n564004 0 ๅœฐๅ€ๆ˜ฏๅœจๅดไธญๅŒบไบบๅๅŒป้™ขๆ–œๅฏน้ข่ช‰่‚คๅ ‚\n1000 Processed\n classify content\n564500 0 2011ๅนดๅผ ใ€ๅ‡Œไธคไบบ่ตท่ฏ‰ๅˆฐ่ฅฟไนกๅก˜ๅŒบไบบๆฐ‘ๆณ•้™ขๅŽ\n564501 0 ็กฎไฟๅœจ2016ๅนด้ซ˜ๆ ‡ๅ‡†้€š่ฟ‡ๅ›ฝๅฎถๅซ็”ŸๅŽฟๅŸŽไธ‰ๅนดไธ€ๆฌก็š„ๅค่ฏ„\n564502 0 ไฝ†ๆ˜ฏๆฌกๅง็š„่ฃ…ไฟฎไนŸๆ˜ฏ้žๅธธๅ€ผๅพ—้‡่ง†็š„\n564503 0 ไน‹ๅŽๅŽปๅŒป้™ขๅˆ็ฒพๅŠ›ๅ……ๆฒ›ๅˆฐๅŽปๅƒ็ซ้”…\n564504 0 ๆˆๅŠŸๆ˜ฏไบบๆ ผๅ’Œ่ดขๅฏŒ็š„ๅŒ้‡ไธฐๆปก\n1000 Processed\n classify content\n565000 0 ๆต™ๆฑŸๅซ่ง†็š„ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณใ€ๆฑŸ่‹ๅซ่ง†็š„็œŸๅฟƒ่‹ฑ้›„ใ€ไธœๆ–นๅซ่ง†็š„ๆŠฅๅ‘Š\n565001 0 ๅ‰ไฟ้™ฉๆ ็ญ‰ๅค„็š„้€ ๅž‹ไนŸๆœ‰ๆ‰€่ฐƒๆ•ด\n565002 0 ไธปๆ—‹ๅพ‹ๅฐฑๆ˜ฏๅผบๅฅธไฝ ๅฏๆ˜ฏ่ฟ˜่ฆ้ช‚ไฝ ๆ€Žไนˆไธๅญฆ็€ไบซๅ—\n565003 0 /ไธญๆ—ฅ็กฌๅฎžๅŠ›ๅคงๆฏ”ๆ‹ผ็œ‹ไธญๅ›ฝๅ†›ๅทฅๅฆ‚ไฝ•ๅฎŒ่™ๆ—ฅๆœฌ\n565004 0 /ๅธ‚ๆฐ‘ๆ‹ฟxxๅนดๅ‰ๅญ˜ๆŠ˜ๅ–้’ฑ้ญๆ‹’้“ถ่กŒ๏ผšๅฝ“ๅนดๆ— ็”ต่„‘\n1000 Processed\n classify content\n565500 0 
xxxxๅนดๅ›ฝๅฎถๆๅ‡บ็š„ไธ€ๅธฆไธ€่ทฏๆˆ˜็•ฅ\n565501 0 ๆญคๆฌกๅฑ•่งˆๅ‘ฝๅไธบโ€œๅ‡ฟ้ป‘่ง…็™ฝโ€\n565502 0 ๆฏๆฌกๆฌขๆฌขๅ–œๅ–œไนฐไธœ่ฅฟ็„ถๅŽ็œ‹ๅˆฐๅ“ช้‡Œ้ƒฝๆ˜ฏMadeinChina็ช็„ถไธ็Ÿฅ้“่ฏฅไผคๅฟƒ่ฟ˜ๆ˜ฏ้ซ˜ๅ…ดไธ€ไธช็”ต่„‘ไป€ไนˆ...\n565503 0 ๅ—้€šๆ™ฎๆณ•ๅฟ—ๆ„ฟ่€…ๅผ€ๅฑ•โ€œๅธฆ็ปฟ่ฟ›็คพๅŒบ้€ๆณ•ๆƒ ๅฑ…ๆฐ‘โ€ๆดปๅŠจ\n565504 0 ๆœ‰ๅ›พๆœ‰็œŸ็›ธๅฅฝๅœจๅ“ชไบ†็œผ่งไธบๅฎž\n1000 Processed\n classify content\n566000 0 ๅค–่ง‚ๅŒ…่ฃ…่ฎพ่ฎกไฝ“็Žฐไบ†ๆ—ถไปฃๆ„Ÿๅ’Œ้ซ˜้›…ใ€ๅŽ่ดตใ€ๆ—ถๅฐšใ€ๆฐ”ๆดพ็š„้ฃŽๆ ผ\n566001 0 ็Žฐไปฃ็”ตๆขฏ็š„ๅฎ‰ไฟๆ˜ฏๅคฉ่กฃๆ— ็ผ็š„\n566002 0 ็ฅ่Š‚ๆ—ฅๅฟซไนโ€ฆโ€ฆๅ—ไบฌ้ซ˜ๆธฉ้…ทๆš‘ไธญ\n566003 0 3ไปฅไธŠ่ฏญ่จ€๏ผšไธญๆ–‡ๅˆ†็ฑป๏ผš้€š่ฎฏ็คพไบคไฝœ่€…๏ผš่…พ่ฎฏๅพฎไฟก็‰ˆ็‰ˆๆ‰พไบ†ๅŠๅคฉ่ฟ™ไธช่ƒฝ็”จ\n566004 0 ๆœ‰ๅˆฉไบŽ่งฃๅ†ณๅฎฟ่ฟๅธ‚ๅฐๅพฎไผไธš่ž่ต„้šพ็ญ‰้—ฎ้ข˜\n1000 Processed\n classify content\n566500 0 ไผผไนŽไธชไธช่ดชๅฎ˜้ƒฝ็ƒญ่กทไบŽๆ”พๅผ€่„‘็“œๅญใ€ๆ”พๆพ่ฃคๅธฆๅญ\n566501 0 ไธ็Ÿฅ้“ๅŸŽ็ฎก้ข†ๅฏผๅฎถ้—จๅ‰ๆ˜ฏไธชไป€ไนˆๆ ทๅ‘ข\n566502 1 ๅฅฝๆถˆๆฏ! ๆ–ฐๆฅผ็›˜ๆ€ฅๅ”ฎใ€‚ ไฝไบŽ่ง’็พŽ้พ™ๆฑ ...\n566503 1 ๆญฆๆฑ‰็Žฐไปฃๅ‹้‚ฆ็ง‘ๆŠ€ๆœ‰้™ๅ…ฌๅธไธป่ฆ้”€ๅ”ฎๅ„็ง‘ๅฎคๅŒป็–—่ฎพๅค‡๏ผŒไปทๆ ผ็ปๅฏนๆœ‰ไผ˜ๅŠฟ๏ผŒๆฌข่ฟŽๆฅ็”ตๅ’จ่ฏขxxxxxxxx...\n566504 0 ไฟ้™ฉๅ…ฌๅธไปŠๅนดไธŠๅŠๅนดๆƒ็›Š็ฑปๆŠ•่ต„ๆฏ”้‡ๅคงๅน…ไธŠๅ‡\n1000 Processed\n classify content\n567000 0 AQXPๅฎฃๅธƒๅ…ถๅœจๅ‡ๅฐ‘่†€่ƒฑ็–ผ็—›็ปผๅˆๅพ/้—ด่ดจๆ€ง่†€่ƒฑ็‚Žๆ‚ฃ่€…็š„็–ผ็—›ๆ–น้ข็–—ๆ•ˆๆ˜พ่‘—\n567001 0 ๆฑŸ่ฅฟ็œๆ–ฐๅปบๅŽฟไบบๆฐ‘ๆณ•้™ขๅฎก็†ไธ€่ตทๅผบๅฅธๆกˆ\n567002 0 ๆญฆๆฑ‰ๅœฐ้“็ปงๆ’•่กฃๅคงๆˆ˜ๅŽๅˆ็ŽฐๅผบๆŠฑไนž่ฎจๅฅณ\n567003 0 ๅฎƒๆ˜ฏไธ€ๅœบ็”ฑSAPHR้ƒจ้—จ็ฒพๅฟƒ็ป„็ป‡็š„ๅ‘˜ๅทฅๅนดๅบฆ็››ไผš\n567004 0 IBMไบšๅคชๅœฐๅŒบ่กŒ้”€ๆ€ป็ป็†็š„่Œไฝ\n1000 Processed\n classify content\n567500 0 ไธ€ไบ›ๆ€€็–‘ๅนฒๆด—ๅบ—ๆš—่—็Œซ่…ป็š„ๅพฎๅš\n567501 0 ็š‡ๅŽๅ…ฌๅ›ญ10ๅนดไธ“ไธšๅ›ข้˜Ÿ็ปŸไธ€่ฟ่ฅ็ฎก็†ใ€7%ๅนดๅ›žๆŠฅ็އใ€6็ฑณ่ถ…้˜”ๅฑ‚้ซ˜\n567502 0 ๆ–ฐ้ฒœๅคง็™ฝๅ‡คๆ— ้”ก้˜ณๅฑฑๆฐด่œœๆกƒ่œœๅ‘€ๆžœๅ›ญ็Žฐๆ‘˜็Žฐๅ‘\n567503 0 ๆƒณๅˆฐๅˆๆฌกๆŽฅ่งฆ็”ต่„‘่ฟ˜ๅญฆไป€ไนˆDOSๅ‘ฝไปค\n567504 0 ๆฉโ€ฆโ€ฆ่ฆไธ่ฆไธ‹ไธช่Šฑๅƒ้ชจ็Žฉ็Žฉโ€ฆโ€ฆ\n1000 Processed\n classify content\n568000 0 
ๆ˜†ๅฃซๅ…ฐๅทฅๅ…šๆ”ฟๅบœๆŽจ็ฟปไบ†ๅŽŸๅ‘ๅฑ•ไผๅˆ’ไธญๅปบ้€ ๆธธ่ฝฎ็ ๅคด็š„้ƒจๅˆ†\n568001 0 4ใ€ไธญๅ›ฝๆœ‰ๆ„้‡‡่ดญไฟ„ๅˆถ็ซ็ฎญๅ‘ๅŠจๆœบ\n568002 0 ๆฑŸ่‹็œๆฐ”่ฑกๅฐ15ๅนด7ๆœˆ26ๆ—ฅ14ๆ—ถ29ๅˆ†ๅ‘ๅธƒ้›ท็”ต้ป„่‰ฒ้ข„่ญฆไฟกๅท\n568003 0 ้‚ฃไนˆๅŒ—ๆ–—ๅซๆ˜Ÿๅฏผ่ˆช็ณป็ปŸ็š„ๅด›่ตท่ฟ˜ๆ˜ฏ้—ฎ้ข˜ๅ—\n568004 0 ๅพฎ่ฝฏๅฏน็กฌไปถ็š„ไธฅๆ ผ่ฆๆฑ‚ไนŸๆ˜ฏไธ€ๅคงๅŽŸๅ› \n1000 Processed\n classify content\n568500 0 ๆœฌๆฅ่ฆๅ’Œๅฆˆๅฆˆ่ฏดๅƒ็ฑณ้ฅญๅˆๆŠŠ่ƒƒๅƒๅไบ†่ƒƒ็–ผไบ†ไฝ†ๆ˜ฏๅฟ˜ไบ†\n568501 0 ๆˆ‘ๅˆš้—ฎๅฅนๆˆ‘ไธ็Ÿฅ้“ๅ‘ไป€ไนˆ็„ถๅŽๅฅนๆŽ€ๅผ€็œผ็ฝฉๅ†ฒๆˆ‘ๆƒŠๆ็š„ๅคงๅซไธ€ๅฃฐ็„ถๅŽ้ช‚ๆˆ‘็ฅž็ป็—…โ€ฆ่™ฝ็„ถ็ป™ไบ†ๆˆ‘็ตๆ„Ÿไฝ†ๆˆ‘ๆƒณ่ฏด...\n568502 1 ๅฐๆ–‡ๆกˆๅคง่ก—ใ€‚ๅŒ–ๅฆ†ๅ“ๆŠ˜ๆ‰ฃๅบ—ใ€‚ๅบ†ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚๏ผŒๅฏไบซๅ—ไธ€ๆฌพๅŠไปทไบงๅ“๏ผŒๅŽŸไปทๅ…ญๅไน็š„่กฅๆฐด้œœ็Žฐไปทไธ‰ๅๅ…ซใ€‚ไปป...\n568503 0 ๅนฟๅทž่ญฆๆ–นๆ‰“ๆމไธ€ๅไธบโ€œFreedomFightersโ€็š„ๅค–็ฑๆถ‰้ป‘็Šฏ็ฝชๅ›ขไผ™\n568504 0 ๆˆ‘ไปŠๅคฉๅœจๅŽไธบ็ฝ‘็›˜็ญพๅˆฐ่Žทๅพ—ไบ†163Mๅ…่ดนๆฐธไน…ๅฎน้‡\n1000 Processed\n classify content\n569000 0 5ๅคฉๅ‰่‚šๅญไธŠๅฟฝ็„ถ้•ฟๆˆ็‰‡็š„็บข็–นๆฌกๆ—ฅๆถˆ้€€\n569001 0 xxxxๅนดxๆœˆxxๆ—ฅๆ—ฉไธŠxx็‚นๅœจๅ†…ๆฑŸๅคง่‡ช็„ถ็ป„็ป‡ไธƒๅค•ๆƒ…ไบบ่Š‚ๅฟซไน็›ธไบฒ็ƒง็ƒค่‡ชๅŠฉ\n569002 0 ๆฅ่‡ชๆท„ๅš็š„ๆจก็‰นCharissa\n569003 0 ๆตท้—จๆปจๆตทๆ–ฐๅŒบ็”ตๅŠ›ๆŠขไฟฎ็Žฐๅœบ\n569004 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?ๆจๆด‹ๅ”ฑๆˆ‘ไปฌ็š„ๆญŒ\n1000 Processed\n classify content\n569500 0 ไนŸ่ฎธไฝ ไธ€ๅคฉๅชๆœ‰50ๅ—ๆˆ–่€…ไธ€ไธค็™พๅ—\n569501 0 ๅ˜Ž้ฒๅ›พ็ฌฌไบŒๆดพๅ‡บๆ‰€้›†ไธญๅผ€ๅฑ•ๆ‰“ๅ‡ป็”ตไฟก่ฏˆ้ช—ไธ“้ข˜ๅฎฃไผ ๆดปๅŠจ\n569502 0 ๆฉ„ๆฆ„๏ผšๆฉ„ๆฆ„ๅถ็ฒพๅŽๆœ‰ๅŠฉ็ป†่ƒžๅฏนๆŠ—ๅค–็•Œไพตๅฎณ5\n569503 0 ๆƒ ๆ™ฎๅฐ†ๅˆฉ็”จ่ฟ™ไธ€ๆœบไผšๆŠขๅคบๆˆดๅฐ”็š„ๅฎขๆˆท\n569504 1 ้›ชไพๆฐๅฉทๅฉท็ฅๅคงๅฎถๅ…ƒๅฎต่Š‚ๅฟซไนไธ‡ไบ‹ๅฆ‚ๆ„่ดขๆบๆปšๆปšใ€‚ๆๅ‰็ฅไบฒไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅฟซไนใ€‚ๆœฌๅบ—็‰นๆŽจๅ‡บๅ……xxxx้€้ข...\n1000 Processed\n classify content\n570000 0 ๆˆ–่ฎธไปŽ้‚ฃๆ—ถ่ตท่ฟ™้ฆ–โ€ฆๅˆ†ไบซๅ•ๆ›ฒ\n570001 0 ๅˆๅ‡บๆฅๅฅฝๅคšๅคงๅฑๆ‰‹ๆœบๅฅฝๅ–œๆฌขๅช่ƒฝ็œ‹็œ‹\n570002 0 ๅงš้บฆๆ—ถ็ซ็ฎญๅๅคงๅฏนๆ‰‹๏ผš็ˆตๅฃซๅฑ…้ฆ–ๅฐ็‰›้ฉฌๅˆบๆœ‰ไปฝ\n570003 0 
ๆœฌๅธ‚ๆœ‰5ๅฎถๅŒป้™ข็š„ๆ–ฐๅปบ้กน็›ฎ่ขซ็บณๅ…ฅ20้กนๆฐ‘ๅฟƒๅทฅ็จ‹\n570004 0 ่ฎฉๆŠ•่ต„่€…ๆœ‰ไบ†ๆ›ดๅคš้€‰ๆ‹ฉ็š„ๆœบไผš\n1000 Processed\n classify content\n570500 0 ไปŽไผฐๅ€ผใ€ๅฎžไฝ“็ปๆตŽๆตๅŠจๆ€ง่ถ‹ๅŠฟๅ’Œ็ปๆตŽๅ‰ๆ™ฏ็œ‹\n570501 0 ๅˆšๅˆšๅŽป็ฝ‘ไธŠๅˆ็œ‹ไบ†ไธ€่พนaๅคง็š„ๅŽ่ƒฅๅผ•ๅ’Œๅฐๅฐ็™ฝ็š„่Šฑๅƒ้ชจ\n570502 0 ๅบๅฑฑ็พŽไธฝ็š„ๆฌงๅผๅˆซๅข…ๆ˜ฏๅปบ็ญ‘ๅœจ่ฟ‘ไปฃไธญๅ›ฝๅฑˆ่พฑๅฒ็š„ๅŸบๅฒฉไธŠ\n570503 0 ๅˆ‘ไบ‹ๆกˆไปถๅ—็†ๆ•ฐๅ’Œ้“่ทฏไบค้€šไบ‹ๆ•…ๆ•ฐๅŒๆฏ”ๅคงๅน…ไธ‹้™\n570504 0 ๅ•†ไธšๆจกๅผๆ„ๅ‘ณ็€ไธ€ไธชๅ…ฌๅธๆ˜ฏๅฆ‚ไฝ•้€š่ฟ‡ๅœจไปทๅ€ผ้“พไธญๅฎšไฝ่‡ชๅทฑ\n1000 Processed\n classify content\n571000 0 ๆœ‹ๅ‹ไปฌๅฏไปฅๅŠ ๅฅนๅซๆ˜Ÿ๏ผšlemon940422\n571001 0 ๅฎถ็”ตไบงไธšไธŽๆˆฟๅœฐไบงๅธ‚ๅœบ็›ธๅ…ณๆ€ง่พƒ้ซ˜็š„ๅŽจ็”ต่กŒไธšๆœ‰ๆœ›ๅ†ๆฌก่ฟŽๆฅๅฟซ้€Ÿๅขž้•ฟๆœŸ\n571002 0 ๅœจไธญๅ›ฝ่ฟ™ๆ ท็ฌฌไบŒๅญฃๅบฆGDPๅŒๆฏ”ๅขž้•ฟ7%\n571003 0 ็™พๅบฆ้Ÿณไนไธบๆฏ›ไนŸไธ็ป™100%ๆƒ้™ๅ•Š\n571004 0 ไน่ง†ๆญฃๅœจไธŽๅพฎ่ฝฏใ€็ดขๅฐผ็ญ‰ๅŽ‚ๅ•†่ฟ›่กŒๆŽฅ่งฆ\n1000 Processed\n classify content\n571500 0 ๅ’จ่ฏขๅ’Œไธ‹ๅ•็š„็ป™ๆˆ‘็•™่จ€ๅฐฑ่กŒ้†’ไบ†้ƒฝไผšๅ›žๅค??\n571501 0 ๅ€Ÿ็ฌฌไธ€ๆฌก่ฃ…ไฟฎๆ–ฐๆˆฟๅ†™็‚นๆตๆฐด่ดฆ\n571502 1 ็พŽไธฝๅฅณไบบ่Š‚๏ผŒๅฟซไนๅคงๆดพ้€ใ€‚ๅŽŸไปทxxxๅ…ƒ็š„็€่Žฑ้›…่กฅๆฐด็พŽ็™ฝ็ณปๅˆ—xxxๅ…ƒๅคงๆŠข่ดญใ€‚ไป…xๆœˆxๆ—ฅxๆ—ฅ๏ผŒ้€พๆœŸ...\n571503 0 ๅˆ†ๅˆซ้ซ˜ไบŽๅ…จๅ›ฝใ€ๅ…จ็œ10ไธช็™พๅˆ†็‚นๅ’Œ4ไธช็™พๅˆ†็‚น\n571504 0 ๆ‰‹ๆœบ้‡Œๅ‡ ไนŽ้ƒฝๆˆๅฆนๅฆน็š„็พŽ็…งไบ†\n1000 Processed\n classify content\n572000 0 ๅฐ†ไธ€่พ†ๅˆถๅผ่ญฆ่ฝฆๅ›ดๅ ตๅœจๅŒป้™ข้—จๅ‰\n572001 0 ๆˆ‘่ฆๅ†ฒ่…พ่ฎฏ่ง†้ข‘ไธ€็™พๅนด็š„ไผšๅ‘˜\n572002 0 ๆ‰ฏ่›‹ๆฃ€ๅฏŸๅฎ˜/็”ทๅญ็–‘ๅผบๅฅธ13ๅฒๅนผๅฅณๆฃ€ๅฏŸ้™ข๏ผšๅŒๆ–น่‡ชๆ„ฟๆ•ฆไฟƒๆ’คๆกˆ\n572003 1 ไฝ ๅฅฝ๏ผ้ธฟ่พพ็“ท็ –ๅŸŽ็ฅไฝ ๆ–ฐๅนด่ƒœๆ—ง!็ŸณๆŽ’ๆœ€ๅคง้—ด็š„็“ท็ –ๅŸŽไฝไบŽ้ป„ๅฎถ่Œ”็บข็ปฟ็ฏๅŠ ๅ‘ๅŠ ๆฒน็ซ™ๆ—๏ผŒๅ“็ง้ฝๅ…จ๏ผŒๅ…จๅœบ...\n572004 0 xxๅนด้‡Œ็งฏๆ”’ๅไฝ™ไธ‡ๅ…ƒๅ…จ้ƒจ็”จไบŽๆ…ˆๅ–„\n1000 Processed\n classify content\n572500 0 ไธŠไบ†ๅฅฝๅ‡ ๆฌก่…พ่ฎฏๆ–ฐ้—ป็š„้ƒ‘ๅคงไธ€้™„้™ข\n572501 0 ็œ้ฃŸ่ฏ็›‘ใ€ๅ…ฌๅฎ‰ใ€ๅทฅๅ•†็ญ‰12ไธช็›ธๅ…ณ่Œ่ƒฝ้ƒจ้—จ\n572502 0 ๆจชๆฒฅไบบๆฐ‘ๅŒป้™ขๆปฅ็”จ้บป้†‰ๆณจๅฐ„ๅพ…ไบงๅฅๅบทๅญ•ๅฆ‡\n572503 0 
2011ๅนด่‡ณ2014ๅนด็ดฏ่ฎกๅฎŒๆˆๅ…จ็คพไผšๅ›บๅฎš่ต„ไบงๆŠ•่ต„872\n572504 0 ๆ˜จๅ„ฟๅœจๅœฐ้“็œ‹ๅˆฐไธ€ไธชๅฆ‡ๅฅณๅธฆ็€ไธ€ไธชๅฐๅญฉ\n1000 Processed\n classify content\n573000 0 ๆˆ‘ๅฐฑ็›ดๆŽฅ่”็ณปไบš้ฉฌ้€Š่ดญ็‰ฉๅฎ‰ๅฟƒไฟ้šœ\n573001 0 ๅŒ—ไบฌๆˆฟๅœฐไบงๅผ€ๅ‘ๆŠ•่ต„ๅŒๆฏ”ๅขž้•ฟ22\n573002 0 ๅฐๆตทๆนพ็กฎๅฎžๆŒบ้€‚ๅˆๆŠ•่ต„่ฎฉๅฎถ้‡Œไบบๆฅ่ฟ‡ๅ†ฌ็–—ๅ…ป็š„\n573003 0 ๅ‰่„ธ่ฎพ่ฎกๅ€Ÿ้‰ดไบ†ๅคๆ€็‰น็š„ไธ€ไบ›ๅ…ƒ็ด \n573004 0 ๆ—งๆ—ถ่พƒๅคšๅค–็ฑๅ•†่ดฉๅฑ…ๆญคไธ€ๅธฆ่ฐ‹็”Ÿ\n1000 Processed\n classify content\n573500 0 ไฝ†ๆ˜ฏไธบไป€ไนˆๅปบ็ญ‘ๅ•ฅ็š„้ƒฝ้‚ฃไนˆๅคš็บงไบ†ๅ‘ข\n573501 0 ไฝ ไธๅŽป่ฟๆณ•่ญฆๅฏŸไผš็”จๆžชๆŒ‡็€ไฝ \n573502 0 ๅฏนๆœ‹ๅ‹็š„ๅฆปๅญๅฎžๆ–ฝๅผบๅฅธxxไฝ™ๆฌก\n573503 0 ๅ›พไธบๆต™ๆฑŸๆธฉๅฒญๅธ‚็Ÿณๅก˜้•‡ๆฒฟๆตทๆŽ€่ตท็–พ้ฃŽๅคงๆตช\n573504 0 ๅŽปไฝ ๅฆˆ็š„็ ด็”ต่„‘ๅงๆงฝๆˆ‘่ฆ่ทณๆฅผ\n1000 Processed\n classify content\n574000 0 ็‰นๅˆซๆ˜ฏๅœจไธ€ไบ›็Ÿฅๅไผไธšๆ‹›่˜ๅคงๅญฆ็”Ÿ็ฒพ่‹ฑๅŒ…ๆ‹ฌๆ‹›่˜้”€ๅ”ฎไบบๅ‘˜็š„ๆ—ถๅ€™\n574001 0 WPๆ‰‹ๆœบ้”€้‡ๆœช่พพๅˆฐ้ข„ๆœŸๅพฎ่ฝฏๅฐ†ๅ†่ฃๅ‘˜7800ไบบ\n574002 0 ๅ่ฝๅœจ่ทๅ—ไบฌ50ๅ…ฌ้‡Œ็š„ๅކๅฒๆ–‡\n574003 0 2000ไฝ™ๆ—ฅๅ†›ๅœจ6ๆžถ่ฝฐ็‚ธๆœบ็š„ๆŽฉๆŠคไธ‹ไปŽไธŠๆตท็›ดๆ‰‘ๅ˜‰ๅ–„ๅŽฟๅŸŽ\n574004 0 ๆŽจๅŠจ็œไปฅไธ‹ๅœฐๆ–นๆณ•้™ขใ€ๆฃ€ๅฏŸ้™ขไบบ่ดข็‰ฉ็ปŸไธ€็ฎก็†โ€\n1000 Processed\n classify content\n574500 0 ๅฐ†ไปฅ15km/h้€Ÿๅบฆๅ‘่ฅฟๅŒ—ๅๅŒ—ๆ–นๅ‘็งปๅŠจ\n574501 0 ่ฟ™ๅŠžๅ…ฌๆฅผๅฐฑๅƒ้ฌผๆฅผไธ€ๆ ทโ€ฆโ€ฆ\n574502 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๆœดไฟก่ดทๆฌพๅ’จ่ฏข็š„ๅฐ้ป„๏ผ›ๆˆ‘ๅ…ฌๅธไธป่ฆไธบๅฎขๆˆทๅŠž็†: ๆˆฟไบง/ๆฑฝ่ฝฆๆŠตๆŠผ่ดทๆฌพใ€ไธญๅฐไผไธš่ดทๆฌพใ€ไธชไบบ...\n574503 0 ไปŠๅคฉๆ—ฉไธŠ้šๆ‰‹ๅฐ็ฑณ4ๆ‰‹ๆœบๅ‡็บงไบ†ไธ‹ๆœ€ๆ–ฐ็š„็ณป็ปŸ\n574504 0 ้™ข้•ฟๆฟ€ๅŠจๅœฐๆก็€ๆŠคๅฃซๅฆนๅฆนๆ‰‹่ฏด๏ผšโ€œๅคช่ฐข่ฐขไบ†\n1000 Processed\n classify content\n575000 0 ๆ— ๅๆŒ‡๏ผšๆ„Ÿๅ†’ใ€ๅ’ฝๅ–‰็–ผ็—›ใ€ๅคด็—›ใ€ๅฐฟ้ข‘ใ€ๆฑ—ๅคšใ€ๅฎซๅฏ’\n575001 0 ๅ•Šๅ•Šๅ•Šๅ•Šๅ•Šๅฅฝๆƒณๅ‡บๅŽปๆ—…ๆธธ้™คไบ†ๆˆ‘ๅฆˆๅฐฑๆฒกไธ€ไบบๆœ‰็ฉบ็š„\n575002 0 ็œŸ็š„ๅฅฝๅ—จๆฃฎ~ๆˆ‘ๅœจ่ดขๅฏŒ้Ÿณไน็‚น็‚นๆƒ…็พค้‡Œ\n575003 0 ไป–ไปฌๅบ”่ฏฅไธบๆปฅ็”จไธชไบบๆƒ…ๆ„Ÿ่€Œๅˆฐ่ญฆๅฏŸๅฑ€่‡ช้ฆ–\n575004 0 ๆ•ดไธชๆ’ญๆ”พๅ™จ่ฎพ่ฎกๆˆไธ€ไธช็ซ‹ๆ–นไฝ“็š„ๅฝข็Šถ\n1000 Processed\n classify content\n575500 0 
่ฟ™ๆฌพไธ็ฒ˜็މ็ฑณ็š„็ƒง็ƒคๅคน่ฎพ่ฎกๅˆ็†\n575501 1 ๅ—ๆน–่Šฑๅ›ญxๆฅผxxๅนณๅทฆๅณ\n575502 0 ็œไบค่ญฆๆ€ป้˜Ÿใ€้ป„ๅ—ๅทžๅ…ฌๅฎ‰ๅฑ€ไธป่ฆ\n575503 0 ็›ฎๅ‰็ฌฌ6่ฝฎๆœ€ๅŽไธ€ๅœบๆฒณๅ—vsๅนฟไธœๅณๅฐ†ๅผ€ๅง‹\n575504 0 ๆ‹›ๅ•†่ฝฎ่ˆนๆ”ถ่ดญๆทกๆฐดๆฒณ่ฐทๆ‰€ๆ‹ฅๆœ‰็š„ๅ››่‰˜ไบŒๆ‰‹่ถ…ๅคงๅž‹้“็Ÿฟ็Ÿณ่ฟ่พ“ไธ“็”จ่ˆน่ˆถ\n1000 Processed\n classify content\n576000 0 ่ƒฝ่€ๅฟƒ็ญ‰ๅพ…ไธ‰ๅˆฐไบ”ๅนดโ€”โ€”้€‚ๅˆๅšๆŠ•่ต„ๅฎถ\n576001 0 ๆˆฟๅž‹๏ผš่ทƒๅฑ‚้ฃŽๆ ผ๏ผš็พŽๅผ่ฎพ่ฎกๅธˆ๏ผš็ซฅ็‡•ๅŸบ่ฃ…้€ ไปท๏ผš&amp\n576002 0 ๅช้œ€ๅŠ ๆฒนๅ‘˜ๆ่ตทๅŠ ๆฒนๆžชๅฐฑ่กŒไบ†\n576003 0 ๅธธๅทž่ฅฟ็ป•ๅŸŽ้ซ˜้€ŸๅฎœๆฑŸๆ–นๅ‘K102โ€”K93ๅค„ไปŠๆ—ฅๆตๅŠจ่ทฏ้ขไฟฎ่กฅๆ–ฝๅทฅ็ป“ๆŸ\n576004 0 ๅฐฑๆ˜ฏ็ป™ๅˆคๆญปๅˆ‘ไบบๅฎถไนŸ็Ÿฅ้“่ฐด่ดฃ่ฐด่ดฃ\n1000 Processed\n classify content\n576500 1 xxxxๅ…ƒ/ๅนณ็ฑณ่ตทๆŠ„ๅบ•ๅผ€ๅ…ƒไธœ่ทฏๅœฐ้“ๅ›—็Žฐๆˆฟ๏ผŒๆŠ„ๅบ•็ƒญ็บฟ:xxxxxxxxxxxx/xxxxxxxx\n576501 0 ๅŒ—ไบฌๅœฐ้“2ๅฅณๅญๆŠขๅบง็ˆ†็ฒ—ๅฃไธŠๆผ”โ€œๆ’•่กฃๅคงๆˆ˜โ€ไธขไบบ\n576502 0 ไธŠๆœˆ่พžๅŽปๆˆฟๅœฐไบง้”€ๅ”ฎไธป็ฎก่ŒๅŠกโ€ฆ็„ถๅŽๅ‘ข\n576503 0 ไธŠๅˆไธปๆฒปๅŒป็”Ÿ่ฏทไธปไปปๅŒปๅธˆๅธฎๅฎๅฆˆ่ฏŠๆ–ญๅŽ\n576504 0 ๆ›พๆ‰งๆŽŒXboxๅœจๆ—ฅๆœฌ็š„็›ธๅ…ณไบ‹ๅŠก้•ฟ่พพ8ๅนดไน‹ไน…็š„ๆณ‰ๆฐดๆ•ฌๅทฒๅœจไธคๅคฉๅ‰ๆญฃๅผไปŽๅพฎ่ฝฏๅ…ฌๅธ็ฆป่Œ\n1000 Processed\n classify content\n577000 0 ็”จ็ป‘ๆžถ็š„ๆณ•ๅพ‹ๆฅๅŽ‹ๅˆถไบบๆฐ‘ๅ’Œๅญฉๅญ\n577001 0 ๆƒณ็กไธ€็›ดๆฒก็ก็š„ๅŽŸๅ› ๆ˜ฏ็พค้‡Œๆœ‰ไบบๅคงๅŠๅคœๅœจๅ‘็บขๅŒ…\n577002 0 applewatchๆœ็ดขๆต้‡ๅชๆœ‰iPod็š„1/2\n577003 0 x็พŽๅ…ƒ/็›Žๅธๅ› ๆŠ•่ต„่€…ๆ‹…ๅฟง็พŽๅ…ƒๆŒ็ปญๅ‡ๅ€ผๅŠ็พŽ่”ๅ‚จไผšๅœจๆœชๆฅๅ‡ ไธชๆœˆๅ†…ๅŠ ๆฏ็š„ๅ‰ๆ™ฏ\n577004 0 ไบ’่”็ฝ‘+ๅ†œไธš่žๅˆ่€Œๆˆ็š„็”ตๅ•†ๅนณๅฐ\n1000 Processed\n classify content\n577500 1 xxxxxxoxxxxxxxxxoxoๅ†œ่กŒๅผ ็މๅ…ฐ\n577501 0 ไนŸไธ่ƒฝไธบไบ†ๅƒ้กฟTwitter็š„ๅทฅไฝœ้คๆผ‚ๆด‹่ฟ‡ๆตท\n577502 0 ๅŒป็”Ÿ่‡ช่จ€่‡ช่ฏญๅœฐ่ฎฒๆ— ้”ก่ฏ๏ผšๅฐ็—ดไฝฌ\n577503 0 ่ฎฉๆ‚จๅœจๆณฐๅทžไธ€ๆ ทๆ„Ÿๅ—ไธœๅŒ—็š„้ฃŽๅ‘ณ็š„็ƒคไธฒ\n577504 0 ไปŠๆ™š22็‚นๅคฉๆดฅๆ–ฐ้—ปๅนฟๆ’ญFM97\n1000 Processed\n classify content\n578000 0 ๆŒ่ตซๅŸบใ€่‰พ็ปดใ€IAM27VIPๅกๅฏไบซๅ—ๆŠ˜ไธŠๅœจ9ๆŠ˜\n578001 0 ๅ‡ ไธชๅ‰ฏๆ€ปๅธไปค้ƒฝๆ˜ฏ่ดชๆฑก็Šฏ็š„ๅ†›้˜Ÿ\n578002 0 ๆ‰€่ฐ“่ขซๆŸฅๅค„็š„่…่ดฅๅฎ˜ๅ‘˜็…งๆ 
ทๅฏไปฅไธไธ‹\n578003 0 ๅƒๆฐดๆžœๆˆ‘ไธ€็›ด็›ธไฟกๅŠชๅŠ›็š„ๆฑ—ๆฐดไธ€ๅฎšไผšๅพ—ๅˆฐๆ”ถ่Žทๆˆ‘ๅฐฑๆ˜ฏๆˆ‘ไบบ็”Ÿ็š„ๆœ€ๅคง้กน็›ฎ็ฎก็†ๅ’ŒๆŠ•่ต„่‡ชๅทฑๆ˜ฏไธ€่พˆๅญ็š„ไบ‹ๅ„ฟ\n578004 0 ้™†ๅ†›็ฌฌxx้›†ๅ›ขๅ†›ๆŸๅ›ข้‡Ž่ฅๆ‘็ช้ญxx็บงๅผบ้ฃŽๅ’Œๆšด้›จ่ขญๅ‡ป\n1000 Processed\n classify content\n578500 0 ๅฐฑๆ˜ฏ็ฝ‘็ซ™ไธ€ๆ‰“ๅผ€ๅ…ณไบŽ่Šฑๅƒ้ชจ็š„้‚ฃไธชไบ†\n578501 1 ่‡ณๅ‘จๆ—ฅใ€‚็‰น้‚€ๆ‚จๅŠๅฎถไบบๅ‰ๆฅๅ‚่ง‚่ฅฟๅŒ—ๆœ€ๅคง็š„ๅฎถ่ฃ…ไฝ“้ชŒ้ฆ†๏ผŒ็ฒพๅ“่ฎพ่ฎกๆ–นๆกˆ๏ฝž๏ฝžx.ๅ‡กๅˆฐๅบ—ไธšไธปๅ‡ๅฏ็ซ‹ๅณๆŠฝๅ–...\n578502 0 ๅธธๅทžๅธ‚ๆฐ”่ฑกๅฐxๆœˆxxๆ—ฅxxๆ—ถ็ปง็ปญๅ‘ๅธƒ้ซ˜ๆธฉ้ป„่‰ฒ้ข„่ญฆไฟกๅท\n578503 0 ๅ—ไบฌ้ฆ–ๅฎถๅไบบ่œกๅƒ้ฆ†ๆธ…ๅ‡‰ๆฅ่ขญ\n578504 0 ๅŽไธบP8้’ๆ˜ฅ็‰ˆๆ™บ่ƒฝๅฎ‰ๅ“ๆ‰‹ๆœบ16Gไผ˜ๅ“ๅ›ข่ดญ๏ฟฅ1\n1000 Processed\n classify content\n579000 0 ็›ธๅฏนไบŽๅŒ—ไบฌๅœฐ้“็š„ๅ…จ็จ‹็งปๅŠจ4Gไธๆމ็บฟ\n579001 0 ๆ นๆฎ้›„ๅŽฟๆณ•้™ขๆฐ‘ไบ‹ๅˆคๅ†ณไนฆไปฅๅŠxxxxไฟๆฐ‘ไธ€็ปˆๅญ—็ฌฌxxๅทๅˆคๅ†ณไนฆๅฏไปฅ็กฎๅฎš้ฉฌๆตทๅณฐ็ณปๆฐดๆณฅๅก”็ฝๅŽ‚็š„็ฎก็†่€…...\n579002 0 UPไธป๏ผšไผๅผ€ๅฟƒ่ฆ่‚‰ๅŒ…ๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n579003 0 9ๆ—ฅๅ—ไบฌๆˆ‘ไปฌ่ฟ˜ๆœ‰ๅฐ‘้‡็ฅจ้œ€่ฆ็งไฟก่ดญ็ฅจ่ฐจๆ…Žๅ‡็ฅจ\n579004 0 ๆต้œฒๅคๅคๆฐ”ๆฏ่กฃ้ข†็ณป่ด่ถ็ป“ๅค„็†\n1000 Processed\n classify content\n579500 0 ไธพไธชไพ‹ๅญไธคไธชๅฐๅญฆ็”Ÿๅœจ็Žฉๆธธๆˆๆ‹”ๆฒณๆฏ”่ต›\n579501 0 ไธ€ๅนดๅ‰ๅŽไธบ่ฃ่€€ๆฒกๆœ‰ๅฏนๅฐ็ฑณM1ๆž„ๆˆๅจ่ƒ\n579502 0 ๅนถไธ”ๆ˜ฏipsๆ่ดจ็š„ๅคš็‚น่งฆๆŽงๅฑๅน•\n579503 1 ้›…็ฒ‰ไปฌๅˆๆ˜ฏไธ€ๅนดโ€œไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚โ€ๅˆฐๆฅไน‹้™…ไธฝๆฐด็ปงๅ…‰่ก—้›…่Žนไบ”ๅญฃ็‰นๆŽจๅ‡บ็ง‹ๅ†ฌๆฌพไฝŽ่‡ณxๆŠ˜่ตท๏ผŒๆ–ฐๅ“xไธ€xๆŠ˜ๅœจ...\n579504 0 ๅœจๅ…จๅ›ฝ27ไธชโ€œ็ˆฑๅฟƒ้ฉฟ็ซ™โ€็•™ไธ‹ๆ„Ÿไบบ็žฌ้—ด\n1000 Processed\n classify content\n580000 0 ๅŽ‹ๆŠ‘ไบ†ๅพˆไน…็Žฐๅœจๅทฒๅผ€ๅง‹่ดจ็–‘่‡ชๅทฑ\n580001 0 ๅž‹ๅทๅˆ†ไธบSใ€Mใ€Lใ€XL๏ฝžๅ…ทไฝ“ๅž‹ๅทๅฏไปฅๅ’จ่ฏขๆˆ‘\n580002 0 ๅˆ†ไบซBloodzBoi\n580003 1 ๅฐŠๆ•ฌ็š„้กพๅฎขๆ‚จๅฅฝ๏ผšๅ…จๅ‹ๅคงๅž‹ๅบ—ๅบ†ๆดปๅŠจๅฎšไบŽxๆœˆxxไธ€xxๆ—ฅ้š†้‡ไธพ่กŒ๏ผŒๅŽ‚ๅฎถ็›ด้”€ยทๅ…จๅนดๆœ€ไฝŽ๏ผๆˆ‘ไปฌ้ƒ‘้‡ๆ‰ฟ...\n580004 0 ๆ‹ฟๅ‡บ่‡ชๅทฑ็™ฝ่‰ฒiPhone้€’็ป™ๅ“ญๅฐๅญฉ\n1000 Processed\n classify content\n580500 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๆ–‡ๅŽxxๆœŸ้™ˆๅฅ‡ไธฐ 
้™ˆๆ€ป็š„ๅŠฉ็†๏ผŒๆˆ‘ๅธ่‡ดๅŠ›ไธบๆ–‡ๅŽๅฎถไบบๆไพ›ไธช่ดท๏ผŒ็ป่ฅ่ดท็ญ‰ๅ…จๆ–นไฝ็š„้‡‘่žๆœๅŠก...\n580501 0 ๆˆ‘ๅœจ็œ‹ๅฐๆ—ถไปฃ4็š„ๆ—ถๅ€™ไธ€็›ดๅœจๆƒณโ€œ4ไธชไบฟโ€ๆฒกไบ†ๆž„ๆˆไป€ไนˆ็ฝช\n580502 0 ๆฏๆฌกๅ้ฃžๆœบๅฐฑๆ€ปๆœ‰ไบ›ๅ‚ป้€ผ็š„ไบบ\n580503 0 ๅผ€็›˜ๅฝ“ๆ—ฅๅฐšๆœ‰320ๅฅ—ไฝๅฎ…ๆœชๆŽจๅ‡บ\n580504 0 ็จ‹ๅบ็Œฟๅฏนๅ•่บซ็‹—้€ ๆˆไบ†3ๅ€ไปฃ็ ๅฑžๆ€งไผคๅฎณ\n1000 Processed\n classify content\n581000 0 ๅœจ้ฃžๆœบไธŠ็š„ๆ“ไฝœ้œ€็ฌฆๅˆ้‚ฃไธชๅนด็š„ๅฎž้™…ๆƒ…ๅ†ต\n581001 0 ไผ™ไผดไปฌ็š„ๅŠ ๆฒนๅ‘ๅ–ŠไนŸๅ……ๆ–ฅ็€ๅ…จๅœบ\n581002 0 ไธญๅ›ฝๅปบ็ญ‘้‡‘ๅฑž็ป“ๆž„ๅไผšๆˆๅ‘˜ๅ•ไฝ\n581003 0 ๅพฎ่ฝฏไธบๆฏ›ไธๆŠŠShellIconOverlayIdentifiersๆžๅฅฝไธ€็‚น\n581004 0 ้ƒ‘ๅทžๆ ‡ๅฟ—ๅปบ็ญ‘โ€œ็މ็ฑณ็ฉ—โ€JWไธ‡่ฑช้…’ๅบ—ไธ‰ๆฅผ\n1000 Processed\n classify content\n581500 0 ๅ€’ๆ˜ฏ็œ‹ไบ†ๅŠๅคฉๅŒ—ไบฌๅซ่ง†็œŸๆ˜ฏๆฒกไฟก่ช‰ๅฎŒๅ…จไธบไบ†ๆ”ถ่ง†็އ\n581501 1 ๅฐŠๆ•ฌ็š„ไผšๅ‘˜ไฝ ๅฅฝ๏ผŒๆ— ้”กๅ…ซ็™พไผดไธ‰ๆฅผPOZO๏ผˆไผฏๆŸ”๏ผ‰ๅฅณ่ฃ…ไธ‰ๅ…ซ่Š‚ๅ…จๅœบๆ–ฐๅ“ๆ˜ฅ่ฃ…xxxๅ‡xx๏ผŒ่€ไผšๅ‘˜ๆถˆ่ดน...\n581502 1 ไนๅฎพ็™พ่ดง็šฎๅฐ”ๅกไธนๅ†…่กฃไธ“ๆŸœใ€ใ€็ŽฐๆŽจๅ‡บx.x่Š‚ๆดปๅŠจใ€็”ทใ€ๅฅณๆฌพๅฅ—่ฃ…ใ€ไฟๆš–่ฃคใ€ๆ–ฐๆฌพ็ง‹่ฃค่ถ…ไฝŽไปทx.xๆŠ˜...\n581503 0 ๅŸบไบŽๆ ‡ๅ‡†PCๆžถๆž„ๅ’ŒLinuxๆ“ไฝœ็ณป็ปŸ\n581504 0 ๅˆšๅœจๆ€็พŽๅพ—ๅ•†่ดธๆœ‰้™ๅ…ฌๅธๆ‘‡ๅˆฐไบ†ๅ…่ดนๅฅฝไธœไธœ\n1000 Processed\n classify content\n582000 0 ่ฟ™่พนๅปบ็ญ‘้ƒฝๆ˜ฏๆˆ‘็‰นๅ–œๆฌข็š„้‚ฃ็ง้ฃŽๆ ผ\n582001 0 ไฝŽ้…็‰ˆxGRAM+xxGROM\n582002 0 ไฝไบŽ้™•่ฅฟ็œไนพๅŽฟๅŽฟๅŸŽไปฅๅŒ—ๆขๅฑฑไธŠ็š„ไนพ้™ต\n582003 0 ๅ‡ไปทxxxxxๅ…ƒ/ใŽกๅ—ไบฌๆฒณ่ฅฟๆ–ฐๆˆฟไปทๅˆ›่ฎฐๅฝ•\n582004 0 ็”ทๅญๅ17ๅนดๅ†ค็‹ฑ่Žท่ต”160ไธ‡ๆˆๅœŸ่ฑชๅช’ๅฉ†ๆไบฒ่ธ็ ด้—จ24ๅฒๆ—ถ\n1000 Processed\n classify content\n582500 0 ๅ‡‰ๅฑฑๅทž็ฆๆฏ’ๅง”xxไธชๆˆๅ‘˜ๅ•ไฝๅฐ†ๆดพๅ‡บไธ“ไบบ\n582501 0 ไธๆƒณ่ทŸ็†Ÿไบบๆœ‰ๅคชๅคš็š„็ปๆตŽ็บ ็บท\n582502 0 ๆˆ‘ๆœ‹ๅ‹่ฏดๅฅนๆœ‰่ฝฐ็‚ธ่ฝฏไปถๆˆ‘่ฏดไธไฟก็”จๆˆ‘ๅท็ ่ฏ•่ฏ•็ป“ๆžœโ€ฆๆˆ‘ๆ‰‹ๆœบๅก็š„็Žฐๅœจๆ‰่ƒฝ็”จ\n582503 0 ๆ้ซ˜่ง‚ไผ—ๅฏนๆœ€ๅŽ็š„ๅฎกๅˆคๆœŸๅพ…ๅ€ผ\n582504 0 ๅ•†ๅฎถ่ตžๅŠฉ็ƒญ็บฟ๏ผš13608870413\n1000 Processed\n classify content\n583000 0 ่ฎค่ฏไฟกๆฏไธบโ€œๆต™ๆฑŸ้พๆนซๆœ้ฅฐๆœ‰้™ๅ…ฌๅธ็‰ฉๆต็ป็†โ€\n583001 0 
ใ€Žๅˆ†็บงๅŸบ้‡‘ไธบไป€ไนˆๆœ‰ไธ‹ๆŠ˜้ฃŽ้™ฉใ€\n583002 0 ๆœ็ปๅ‡่ดง4ใ€TSTไธ€็“ถWไนŸ่ƒฝๅ…ฌๅธไปฃๅ‘\n583003 0 ๅฅฝๆฃ’ๅฅฝๆฃ’ๅ‘€ๆœ‰ๆฒกๆœ‰ๆ‹›่˜ๅ•Šๆˆ‘ไปŠๅนดๅˆšๅคงๅญฆๆฏ•ไธš\n583004 0 ็”ตๆขฏๅฎ‰ๅ…จไธๅฎนๅฟฝ่ง†ๅ•Šๆˆ‘ไปฌๅฐๅŒบ็”ตๆขฏไธŠๆผ”ๆƒŠ้ญ‚ๆœชๅฎšๅ•Š\n1000 Processed\n classify content\n583500 0 ๅฆๅค–่ฟž็บข็ฑณ2็š„็ปญ่ˆช้ƒฝๆฏ”ๆ’ธๅฆน930ๅฅฝ\n583501 0 ไธ€ไธชๆˆฟๅœฐไบงๅผ€ๅ‘ๅ•†็š„ๅปบ่ฎฎ๏ผšๅนด่ฝปไบบไธ่ฆๅ†่ดทๆฌพไนฐๆˆฟไบ†\n583502 0 ไธ€้ฆ–ๆญŒๅ”ฑๅˆฐไธ€ๅŠไนŸ็ป™ๆ’30็ง’ๅนฟๅ‘Š\n583503 0 ๆˆ‘ๆŽจๆต‹ๅฏ่ƒฝๆ˜ฏๆŠ€ๆœฏๆ€ง่ฟ่ง„็š„ๅฏ่ƒฝๆ€งๅคšไบ›\n583504 0 ๅฐคๅ…ถๆ˜ฏไบบๆฐ‘ๅธๆ˜ฏๅฆ็บณๅ…ฅSDR่ดงๅธ็ฏฎๅญ\n1000 Processed\n classify content\n584000 0 InternetExploreไฟ—็งฐIE\n584001 0 ็”ตๆขฏๅˆถ้€ ๅ•†ไปฅๅŠ่ดŸ่ดฃ็”ตๆขฏ็ปดไฟฎ็š„ๅ…ฌๅธๅบ”่ฏฅๆ‰ฟๆ‹…่ฟžๅธฆ่ดฃไปป\n584002 0 ไปฅ538ๅˆ†็š„ๆˆ็ปฉ่ขซๆฑŸ่‹ๅคงๅญฆๅฝ•ๅ–\n584003 0 ่ฟ‘ๆ—ฅไฝ›ๅฑฑไธ€็”ทๅญๅ› ไธบ็”ตๆขฏๆ•…้šœ\n584004 0 ๆพณๅฆˆ้ฆ–้€‰็š„ไธ€ๆฌพๆญขๅ’ณ็ณ–ๆต†ๅฐ้’่›™ๆฌพๅ’Œๅฐ็ปฟๅถๆฌพ\n1000 Processed\n classify content\n584500 1 ๅฅฝๆถˆๆฏ๏ผšx.xๅ›ž้ฆˆๅนฟๅคง่€้กพๅฎข๏ผŒxๆœˆไปฝๅˆฐๅบ—ไธ€ๆฌกๅณๅฏ่Žทๅพ—ไปทๅ€ผxxxๅ…ƒbx้šๅฝข้ข่†œx็‰‡๏ผŒๆ•ฐ้‡ๆœ‰้™ๅ…ˆ...\n584501 0 ๅŒ—ไบฌ่ฅฟ็ซ™ๅœฐ้“็ซ™็œŸๆ˜ฏ้”™็ปผๅคๆ‚ๅพ—ๅซไบบ็–‘ๅฟƒ่‡ชๅทฑ่บซๅœจๅ…ซ็™พไธชๅ‡บๅฃ็š„ๆ–ฐๅฎฟโ€ฆ\n584502 0 ๆŠฅๅๆˆชๆญขๆ—ถ้—ดxxxxๅนดxๆœˆxๆ—ฅ\n584503 0 ๅ›žๅ›ฝไปฅๅŽ่ฟทไธŠ่Šฑๅƒ้ชจใ€ๅฏๆ˜ฏๆ›ดๆ–ฐๅฅฝๆ…ข\n584504 0 ๆˆ‘ๆญคๅŽปๅ—ไบฌๅบ”่ฏฅๆ˜ฏ่ƒฝ็œ‹ๅˆฐๆตทไบ†ๅง\n1000 Processed\n classify content\n585000 0 ๅฏปๆ‰พๆฑŸ่‹ๆ‘„ๅฝฑๅคงๅธˆไธ€ๅๆœฌไบบๆƒณๅšๅŠฉ็†ไปฅๅ‰ไนŸๆ˜ฏไธ€ๅๅฐๆ‘„ๅฝฑๅธˆ\n585001 0 ๆ—…ๆธธไธป็ฎก้ƒจ้—จ้ผ“ๅŠฑ็ตๆดปๅฎ‰ๆŽ’ๅทฅไฝœๆ—ถ้—ด\n585002 0 ๅผ€ๅบ—็š„ๆœ‰ไฟก็”จๅก็š„้œ€่ฆๅŠžposๆœบ็š„ๆœ‹ๅ‹ไปฌ็œ‹่ฟ‡ๆฅใ€ๆ™ฎ้€šๅˆทๅกไฟก็”จๅกๅฅ—็Žฐ\n585003 0 ่ฟ™ๅฐไฟๆ—ถๆท้‡‡็”จไธ€ๅฏน4่‹ฑๅฏธ็›ดๅพ„็š„ๆŽ’ๆฐ”็ฎก\n585004 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏๅˆš็ป™ๆ‚จๆ‰“็”ต่ฏ่ฏๅคง่ดขๅฏŒ็š„ๅฎขๆˆท็ป็†: ่ƒกๅ›ญๅ›ญxxxxxxxxxxx ๅŠž็†ๆ— ๆŠตๆŠผไฟก็”จ่ดทๆฌพ...\n1000 Processed\n classify content\n585500 0 ๆ„Ÿ่ฐขๅ—ไบฌๅธ‚ๅŸŽ็ฎกใ€ๅœฐ็จŽ้ƒจ้—จๅคงๅŠ›ๆ”ฏๆŒ80ๅฎถๅœ่ฝฆๆ”ถ่ดน็ณป็ปŸๅŽ‚ๅฎถๅนถ่‚ฉๆ”ปๅ…ณ2000ๅคšๅฎถๅœ่ฝฆๅœบ็ป้ชŒๅ•ไฝ็š„้ผŽๅŠ›้…ๅˆ\n585501 0 
7ๆœˆ14ๆ—ฅๅณ5ๅคฉๅ‰ๆฑŸ่‹ๅฎฟ่ฟๆณ—ๆดชไธ€ๅๅˆไบŒๅญฆ็”Ÿ่ขซๅŒๅญฆๅ›ดๆฎด่‡ณๆญป\n585502 0 ๅฐๅทๆ€ปๆ˜ฏๅ‘ๆ‰‹ๆ— ็ผš้ธกไน‹ๅŠ›็š„ๅฅณๆ€งไผธๅ‡บๅŒๆ‰‹\n585503 0 ๆฝœ่ง„ๅˆ™ๅฏๅމๅฎณไฝ ไธๆ€•ๅ•Š\n585504 0 ่ฝฆไธŠ่ฏทไธ€็พŽๅฅณๆ‰‹ๆœบๅธฎ็€ๅ……ๅ€ผxxxๅ…ƒๆˆ‘ๅฝ“ๅœบไป˜็Žฐ\n1000 Processed\n classify content\n586000 0 ๅ…ถๅœจๅทด้ปŽ่ฏๅˆธไบคๆ˜“ๆ‰€่ดญไนฐไบ†็ˆฑ้ฉฌไป•้›†ๅ›ข็š„ไธ€่‚ก่‚กไปฝ\n586001 0 ๅบ”่ฏฅๆ˜ฏๅฟ…้กปๅ‚ไธŽ่ฐƒๆŸฅ/้ฉฌ่ˆช็–‘ไผผ้ฃžๆœบๆฎ‹้ชธ่ฟๆŠตๆณ•ๅ›ฝไธญๆ–น่กจๆ€\n586002 0 ้ฃžๆœบๆœ€ๅŽไธ€ๆฌก่ฏ•็€้™†็”Ÿๆญปๅ…ณๅคดๅ‰\n586003 0 ้€ผๆ ผๅฐ”thanbigger\n586004 1 ็‰นๅคงๅ–œ่ฎฏ๏ผš่Œ‚ไธšๅคฉๅœฐๅค่‰ฒไธ“ๆŸœๅ…ƒ้œ„ไฝณ่Š‚ไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚ๅŒ่Š‚ๅŒๅบ†ๅทจๆตๅ›ž้ฆˆไฟƒ้”€ๆดปๅŠจxๆœˆx๏ฝžxๅทๅ››ๅคฉVIPๅˆฐ...\n1000 Processed\n classify content\n586500 0 ๅŽปๅนดไธ€ๅนดๆ‹ๆ‘„ไบ†่ฟ‘ไธ‡ๅผ ้•‡ๆฑŸๅ„ๅค„้ฃŽๆ™ฏ\n586501 0 ๅพฎ่ฝฏ่“็‰™ๆŠ˜ๅ ๅผ้”ฎ็›˜ๆญฃๅผๅผ€ๅ–\n586502 0 ่ฏดๆ˜ฏๆˆ‘ๆœ‰9000ๅคš็š„ๆˆฟไบง็จŽ่ฆ้€€็ป™ๆˆ‘\n586503 0 ไปŠๅคฉ4000๏ผ‹ๅทฒ็ปๅนฒไบ†ไธค็ฝๆฐงๆฐ”\n586504 1 ๅ•†ๅœˆ๏ผŒๅ•ไปทx-x.xไธ‡๏ผŒๆ€ปไปทxx-xxไธ‡๏ผŒๆฌข่ฟŽๅฎžๅœฐ่€ƒๅฏŸ๏ผ่ฟ™ๆ˜ฏๆˆ‘ๆ‰‹ๆœบๅท็ ๏ผŒๅฆ‚ๆžœๆ–นไพฟๅฏไปฅๅญ˜ไธ‹ใ€‚็ฅ...\n1000 Processed\n classify content\n587000 1 ไฟ่ตข็ฌฌไธƒๅœบ๏ผŒ xx:xx ่ท็”ฒๅŸƒๅ› ้œๆธฉvs้˜ฟ่ดพๅ…‹ๆ–ฏ๏ผŒ็”ฑๅŠ็ƒไธ€็ƒๅˆ†ๆž็œ‹ๅฅฝไธป้˜Ÿ่ƒœๅ‡บ๏ผŒๆŽจ่ๅŸƒๅ› ้œๆธฉ...\n587001 0 ่ดตๅทž็œๆฃ€ๅฏŸ้™ขๅฌๅผ€ๅ…จ็œๆฃ€ๅฏŸๆœบๅ…ณๅ…š้ฃŽๅป‰ๆ”ฟๅปบ่ฎพ็ชๅ‡บ้—ฎ้ข˜ไธ“้กนๆ•ดๆฒปๅŠจๅ‘˜ไผš\n587002 0 ไปŠๆ—ฉไธŠ็œ‹ไบ†ๅ‡ ๆกๅƒไบบ็”ตๆขฏ่ง†้ข‘\n587003 0 ๆœ‰ๅ…ณไปฃๅ†™minitabanalysis็š„ไบ‹ๅฎœ\n587004 1 --ๅˆซๅข…้“‚้‡‘ๅทฅ่‰บๅฎžๆ™ฏๅฑ•็คบ่ก—็››ๅคงๅผ€ๆ”พ๏ผŒ้š่”ฝๅทฅ็จ‹้žๅธธ่‰บๆœฏ็š„ๅฑ•็Žฐๅœจไฝ ็œผๅ‰โ€ฆ็ญ‰็ญ‰๏ผ่ฏฆ็ป†ไบ†่งฃใ€ๆฌข่ฟŽ่‡ด็”ต...\n1000 Processed\n classify content\n587500 0 ไปŠๅนด12ๆœˆ็ณป็ปŸ้€š่ฟ‡่ฎค่ฏๅŽไบ‰ๅ–ๅœจไธญๅ›ฝ็ŸณๅŒ–ๆ•ดไธชๅทฅ็จ‹ๅปบ่ฎพๆฟๅ—ๅฎž็Žฐ็”ตๅญๆ‹›ๆŠ•ๆ ‡\n587501 0 ๅคง่ฟžๅฐ่ฑกโ€ฆโ€ฆไบค้€š็ฏ‡ไบค้€šๅฎžๅœจไธ่ƒฝๆญ็ปด\n587502 0 ๆ— ๆฑฝ่ฝฆ+ๆฒน็ฎฑ30ๅ‡=ๆŒคๅŽ‹ไธŠๅขจๅ™จhhhhhhhhhๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n587503 0 ๅ—ๅŒ–ๅ…ฌๅธ่ฟžไบ‘ๆธฏ็ขฑๅŽ‚็›ๅ †ๅœบๅปบๅŽ‚ๆ—ถ่ฎพ่ฎกไธบ้œฒๅคฉๅญ˜ๆ”พ\n587504 0 
็ฝ‘ๅ‹่ดจ็–‘ไธบไฝ•ๆฐ‘้—ด่ดฆๅทๅ‘่ญฆๆ–น้€šๆŠฅ\n1000 Processed\n classify content\n588000 0 ๅพๅทžๆœ‰ไธชๅœฐๆ–นๅฏไปฅไบซๅ—็ขงๆตท่“ๅคฉ\n588001 0 ไฝ†ๆ˜ฏ็Žฐๅœจ็ฎ€็›ดๅฐฑๅƒๆฎ‹็–พๅ„ฟ็ซฅไธ€ๆ ท\n588002 0 ็”ŸๆดปไธญไนŸไผšๆœ‰ๅพˆๅคšๆฒป็–—็–พ็—…็š„ๆ–นๆณ•\n588003 0 600308ๅŽๆณฐ่‚กไปฝ๏ผšๅฐ†ๅ’Œๅ›ฝๅ†…ๅค–ไธ€ๆต็š„็ง‘็ ”้™ขๆ‰€ใ€้›†ๅ›ขๅ…ฌๅธๅˆไฝœ\n588004 1 xxxxxxxxxxxxxxxxxxx ๅทฅ่กŒ ๅขๅฅ่ดค\n1000 Processed\n classify content\n588500 0 ไธญ่ˆน้‡ๅทฅ๏ผš่ˆชๆฏๅ’Œๆ ธๆฝœ่‰‡็ญ‰้‡็‚นๅทฅ็จ‹ไปปๅŠก่ฟ›ๅฑ•้กบๅˆฉ\n588501 0 ๆๅˆฐ่ฟ™ไธช่ฏๆปก่„‘ๅญ้ƒฝๆ˜ฏNBAๅ’ŒSD็š„็”ป้ข\n588502 0 ๅŽป็”ต่„‘ไธŠๆ‰พๅฑ…็„ถไนŸๆฒกๆœ‰โ€็š„้ญ”ๅ’’\n588503 0 ๅ‘ตๅ‘ต่ฟ›็”ตๆขฏๆœ‰ไฟๆŒ่ท็ฆปๆ‹็…ง็ฆปๅพ—่ฟ‘ๆ˜ฏๅ› ไธบๆ‹‰ไบ†็„ฆ่ท\n588504 0 ๅฐ†ไผ—ๅคšๅฎ‰้˜ณ่‚ฟ็˜คๅŒป้™ข่ต„่ฎฏ่ฟ›่กŒๆต“็ผฉ\n1000 Processed\n classify content\n589000 0 ่€Œไธ”่ถŠๆฅ่ถŠๆ‹…ๅฟƒไธพๆŠฅไผšๆˆไธบๆ‰“ๅ‡ป็ซžไบ‰ๅฏนๆ‰‹็š„ๆœ‰ๅŠ›ๆ‰‹ๆฎตไบ†โ€ฆโ€ฆ\n589001 0 ๅพฎ่ฝฏๆ™บ่ƒฝๅŠฉ็†Cortanaๆๅ‰ๆณ„้œฒ\n589002 0 ๆทฑ่ˆช้ฃžๆœบ็บต็ซๆกˆๅคชไป–ๅฆˆๆƒŠ้™ฉไบ†\n589003 0 ๆ˜จๆ™š็ฌฌไธ€ๆฌก้†‰้…’ๅœจๅ…ฐๅทž็กๅคงๅŠๅคฉๅ‡บ้—จๅทฎ็‚น่ขซๅฐๅทๅท้’ฑๅŒ…ไธคๅ‚ป้€ผๅคœๆธธไธญๅฑฑๆกฅๅ’Œๆฏไบฒๆฒณ็ฎ€็›ดไธ่ƒฝๅœจไธฐๅฏŒ็š„ไธ€ๅคฉ\n589004 0 ไธŠๅ‘จๅŽปๅŽ็››้กฟๆŠŠiphone่€ณๆœบไธขไบ†\n1000 Processed\n classify content\n589500 0 ไป–ๅฏ่ƒฝๆ˜ฏNBAๅކๅฒไธŠๆœ€ๅ…ทไธชๆ€ง\n589501 0 ๆ—ถ้—ด่ฎฉไฝ ็†ฌๅˆฐไบ†็œŸ็›ธๅดๆฒกๆœ‰่กฅๅฟ\n589502 0 ้‚ฃไนˆไป–ไปฌ้ƒฝๆŠ•่ต„ไบ†ๅ“ชไบ›ๆŠ€ๆœฏๅ…ฌๅธ\n589503 0 ่ฏฅ่ทฏๆฎตๆ›พไบŽxxxxๅนดxxๆœˆxxๆ—ฅxxๆ—ถxxๅˆ†ๅ› ้ซ˜้€Ÿไบค่ญฆ็ฎกๅˆถ\n589504 0 ๆœ€่ฟ‘ๆ›ด่ฟ›ๅฑ•ๅˆฐ็”จ็”ต่„‘ๆ•ฃ็ƒญๅฃ\n1000 Processed\n classify content\n590000 0 ๆญฆๆฑ‰ไธญ้™ขไบŒๅฎกๅˆคๅ†ณ่ฟ™ๅ็”ทๅญไพต็Šฏไป–ไบบๅ่ช‰ๆƒ\n590001 0 ๆฎ่ฏดๅฎถ้‡Œ่ฃ…ไฟฎๆˆ่ฟ™ๆ ท่€ๅ…ฌ้ƒฝๆ„ฟๆ„ๅ›žๅฎถไบ†\n590002 0 ่‚ก็ฅจไบ้’ฑใ€ๅ–œๆฌข็š„ไบบๅˆไธ็†ๆˆ‘\n590003 0 โ€”โ€”ๆฑŸ่‹ๆ–ฐๆฒ‚ๅธ‚ๅง”ไนฆ่ฎฐ่ตต็ซ‹็พค\n590004 1 ๅฎฝๅธฆๅ‡็บง๏ผŒๅ…‰็บคๅ…ฅๆˆท๏ผŒๆ–ฐๆ˜ฅๆœ‰็คผ๏ผŒxxๅ…†xxxxๅ…ƒไธคๅนด๏ผŒๆœˆๅ‡xxๅ…ƒใ€‚ๆฒณ่ฅฟๅˆฉๆฐ‘้“็ซ™๏ผŒๅฎ‰่ฃ…็”ต่ฏxxx...\n1000 Processed\n classify content\n590500 0 
ๅœฃ็ฝ—ๅ…ฐ็”ทๅฅณ้€š็”จๆฌพๆ˜Ÿๆ˜Ÿ่ƒŒๅŒ…่ƒŒไธŠๅฎƒๅ‡บๅทฎๆ—…ๆธธๅ†ไนŸไธ็ƒฆๆผๅ•ฅ้ƒฝ่ƒฝ่ฃ…ๆœ€้‡่ฆ็š„ๆ˜ฏ่ถ…็บง้กถ็บง่ดจ้‡่ฟ›ไธ“ๆŸœๅฎŒๅ…จไธๆ˜ฏ...\n590501 0 ๅธ‚ไบบๆฐ‘ๆฃ€ๅฏŸ้™ขๅ…š็ป„ๅ‰ฏไนฆ่ฎฐใ€ๅ‰ฏๆฃ€ๅฏŸ้•ฟไบŽๅคฉๆ•ๅ‡บๅธญไปชๅผๅนถ่ฎฒ่ฏ\n590502 0 ๆœ€่ฟ‘้˜…ๅ…ตๆผ”ไน ็š„้ฃžๆœบไปŽๆˆ‘ๅฎถ้™„่ฟ‘่ฟ‡\n590503 0 ็ฉบไธญ็›ดๅ‡้ฃžๆœบ็Žฏๆธธ้ป„้‡‘ๆตทๅฒธ็ญ‰ๆœ‰ๆ„ๆ€็š„ๆดปๅŠจ\n590504 0 ๆ–ฐ็‰ˆ่ฟ˜ๅŠ ไบ†่œ‚่œœๅ‘ณ้“ๅฅฝ้—ป็š„ไธ€ๆฏ”ๅ•Š\n1000 Processed\n classify content\n591000 0 ่ฎธๅคšๆ‰‹ๆœบๅฎž็Žฐไบ†็”ตๆตๆฃ€ๆต‹็›‘ๆŽง\n591001 0 ๅ› ไธบ่ฃ…ไฟฎไนฐๅฎถๅ…ทๅทฒๅตไบ†ๅฅฝๅ‡ ๆžถไบ†\n591002 0 Ins็š„padgram็‰ˆๆœฌไธญๅซๆ˜ŸๅฎšไฝๆŠ€ๆœฏๅซ้‡ๅฅฝ้ซ˜ๅฝ“ไธ‹ไบ‘ๅฑ‚้ƒฝไธ€ๆธ…ไบŒๆฅš??\n591003 0 ไธ‹้ฃžๆœบๆ—ถๅ€™็š„ๅฟƒๆƒ…่ฟ˜ๆ˜ฏ่ทŸ็™ปๆœบๆ—ถๅ€™ไธ€ๆ ทๅญค็‹ฌ\n591004 0 ๅปบ็ฏ‰ๅธซGn?dingerArchitects\n1000 Processed\n classify content\n591500 0 ่ฟ™ไธ€ๅˆ‡้ƒฝ้™้™็š„็ญ‰ๅพ…ไบ†100ๅคšๅนด\n591501 1 ๅฎๆ™ฏๅˆถ่กฃๅŽ‚ไธ“ๆณจ็”Ÿไบง็‰›ไป”่ฃค๏ผŒไธปไพ›ๅนฟๅทžๆฒ™ๆฒณ\n591502 0 ๅคง็ˆฑไปŠๅคฉ่ฏท็š„่ฃๅˆค๏ฝž๏ฝžๅคฉๆดฅ้˜Ÿๆฃ’ๆฃ’ๅ“’๏ฝž๏ฝž\n591503 0 ็บฟไธ‹ๆ›ดๆ˜ฏไปฅๅŽไธบๅไน‰ๅŒ…ไธ‹ๆทฑๅœณๅคšๅฎถ็”ตๅฝฑ้™ข\n591504 0 ๅ—ไบฌๅฎœๅฎถๅฎถๅฑ…ๅค–ไป“็š„ๆ‰ฟๅŒ…ๅ•†ไธŠๆตทๆตทๅšๅ›ฝ้™…่ฟ่พ“ไปฃ็†ๆœ‰้™ๅ…ฌๅธๅฏนไบŽๅค–ไป“ๅ‘˜ๅทฅๅทฅ่ต„ๅฐ‘ๅ‘\n1000 Processed\n classify content\n592000 0 ๅ’Œ่ฟ™ๅ’–ๅ•ก็œ‹็€ๅฏน้ข้ฃžๆœบ็ฐๆฅ็ฐๅŽป\n592001 0 ๅพๅทžๅ…จ้ขไบŒๅญฉๆ”ฟ็ญ–ๆœ€ๅฟซๅฏ่ƒฝๅนดๅ†…ๅฎžๆ–ฝ\n592002 0 ๆˆ‘ไปฌ่ฟ˜ๆœ‰ไธ€ไธชO2O็š„็บฟไธŠๅนณๅฐๅซ็‰นๆฅๅŠฒ\n592003 0 ไธ็„ถ่ฟ™ไผ™้‚ชๆถ็š„็Šฏ็ฝชๅˆ†ๅญๆ—ฉๅฐฑๆŠŠ้—ฎ้ข˜็…ฝๅŠจ่ตทๆฅไบ†\n592004 0 ่กŒไธšๆ•ดไฝ“ๆ•ˆ็›Šๅคงๅน…ๆๅ‡ๅ†›ๅทฅๆฟๅ—ไธญ้•ฟๆœŸๆŠ•่ต„ไปทๅ€ผๅ‡ธๆ˜พ\n1000 Processed\n classify content\n592500 0 FCๅฎ้ธก่ฝฐ็‚ธๆœบ็š„ๅ…จๅฎถ็ฆ\n592501 0 ๅฐฑๆ˜ฏ13ๅฒ็„ถๅŽๆŠŠๆฏไธ€้›†ๆ‘ฉ็™ปๅฎถๅบญ็œ‹ไบ”้่ฟ˜ๆœ€ๅ–œๆฌขJay็š„้‚ฃไธชไผข\n592502 0 ๅ—ไบฌใ€ๆ— ้”กใ€่‹ๅทžไธ‰ๅธ‚ๅทฒๆœ‰ๅœฐ้“\n592503 0 ้ป‘้พ™ๆฑŸ็œไฝณๆœจๆ–ฏๅธ‚ไบบยท่งไธ‹ๆ–‡โ†“\n592504 1 ๅ†…้ƒจๆธ ้“ไธ“ไธšๆ“ไฝœ(ๅ…ฌๅ‹™ๅ“ก+ไบ‹ๆฅญ็ทจ+ๆŒ‰ ๆญๆˆฟ+ๅ˜ฟๅๅ•+่ฝฆ+ไฟๅ•)ๅ”ŽๆฏไฝŽ่‡ณxๅŽ˜๏ผŒ็ฎ€ๅ•๏ผไฝŽๆฏ๏ผ้ซ˜ๆ•ˆ...\n1000 Processed\n classify content\n593000 1 
ๅพทไฝ‘้ซ˜ๆฆ•ไธพxxxxxxxxxxx๏ผšๅค–ๆปฉ็ปฟๅœฐๅไบบๅŠ๏ผŒๅ—ๅŒ—้€šxๆˆฟ๏ผŒxๆขฏxๆˆทๅพ—ๆˆฟ็އxx%๏ผŒ้ข็งฏxx...\n593001 0 ๅฏๆ˜ฏไน่ง†ๆ‰‹ๆœบๅˆฐไบ†ๆˆ‘ๅฐฑๆŠŠๆžœ6็ป™ๆˆ‘ๅฆˆไบ†\n593002 0 ไฝ ่ฏดๆˆ‘ไธๆ˜ฏ่ฏดไธ‹ไธชๆœˆๆฅๆ‰ฌๅทžๅ—\n593003 0 ๅบ“้‡Œๆ— ่งฃไธ‰ๅˆ†้ข†่ก”NBAๆ€ปๅ†ณ่ต›ๅไฝณ็ƒ\n593004 0 ๅ€’่ฟ‡ๆฅๅฐฑๆ˜ฏไนฐไฟ้™ฉ็š„้กบๅบใ€\n1000 Processed\n classify content\n593500 0 ๅฎž็Žฐไบ†ๅฎฟ่ฟๅธ‚็›ธๅ…ณ้ข†ๅŸŸ้›ถ็š„็ช็ ด\n593501 0 ไธ€้‡ๅˆฐไปปๅŠก็”ต่„‘ๅฐฑๅผ€ๅง‹่ฃ…ๆญปๅ‘ตๅ‘ตๅ‘ตๅ‘ต\n593502 0 ้™†ๅฎถๆ‘„ๅ็‰นๅœฐ้‚€่ฏทไบ†ๆ˜†ๅฑฑๆ‘„ๅฝฑๅไผš็š„็Ž‹ไผŸๆ˜Ž่€ๅธˆๅ‰ๆฅ้™†ๅฎถไธบไผšๅ‘˜ไปฌ่ฟ›่กŒๆ‘„ๅฝฑไธ“้ข˜่ฎฒๅบง\n593503 0 ??ไธไน…ๅ‰ๅˆท็ˆ†ๆœ‹ๅ‹ๅœˆโ€œไผฐๅ€ผ6ไบฟโ€็š„ไบ‘่ง†้“พ\n593504 0 ๅ‰ไฟ้™ฉๆ ไธฐๅฏŒ็š„็บฟๆกๅ‡ธๆ˜พ่ฟๅŠจๆ„Ÿ\n1000 Processed\n classify content\n594000 1 ไฝ ๅฅฝ๏ผŒ ไธ‰็ฑปไบบๅ‘˜ๅฒ—ไฝ่ฏไนฆ่€ƒ่ฏ• ๅŒ…่ฟ‡ใ€‚ไปฅๅŽๅฏไปฅ็›ดๆŽฅ่ทŸๆˆ‘่”็ณปใ€‚ๅœฐๅ€ๆญๅทžๅธ‚ๆ•™ๅทฅ่ทฏxxxๅทๅŽ้—จaๅบงx...\n594001 0 ๆญฆๆฑ‰ๅœฐ้“ไธคๅฅณๅญๆŠขๅบงไธŠๆผ”โ€œๆ‰ฏๅ‘ๆ’•่กฃๅคงๆˆ˜โ€\n594002 0 ๅŒ่ƒŽๅฎๅฎๅ…ซไธชๅŠๆœˆๅฆไธ€ไธชๅฎๅฎๆฒกๆœ‰่ฟ™ๆ ท็š„ๆƒ…ๅ†ต\n594003 0 ๅพๅทžๆฒ›ๅŽฟไบค่ญฆๅคง้˜Ÿ้พ™ๅ›บๅ…ฌๅฎ‰ๆฃ€ๆŸฅ็ซ™ๆฐ‘่ญฆๅœจไพ‹่กŒๆฃ€ๆŸฅๆ—ถ\n594004 0 ๅ‰ๅคง็ฏ้‡‡็”จไบ†ไธปๅŠจๅผLED็ฏๆบ่ฎพ่ฎก\n1000 Processed\n classify content\n594500 0 ๅฎŒๅ–„็š„่ฟœ็จ‹ๅŒป็–—ๅ’จ่ฏขๆœๅŠกๅนณๅฐ\n594501 0 ๆฅ่‡ชๅ…จๅ›ฝ็š„109ๆ”ฏไปฃ่กจ้˜ŸๅŒๅฐ็ซžๆŠ€ๅฑ•็ปๆดป\n594502 0 ็„ถๅŽๅ‘ๅพฎๅšๆ™’ๅ‡บๆ‰‹ๆœบ้‡Œ็š„็ฌฌ7ๅผ ็…ง็‰‡\n594503 0 ็œผๆณชๆตไบ†ไธ‹ๆฅ~ๆ‰ๆ˜Ž็™ฝ่‡ชๅทฑ่ฟ˜ๆ˜ฏๅฐๅญฉ\n594504 0 e็š„ๆกถๅ‰ๆœŸไนŸๅฐฑๅช่ƒฝๆ‰”่‰ไธ›้˜ด้˜ดไบบ\n1000 Processed\n classify content\n595000 0 ๆ–ฐๆฌพๆฌง็พŽๅคง็‰ŒๅŒๆฌพๅฎžๆ‹ๅ›พไธŠๅ•ฆ่ถ…็บงๅฅฝ็œ‹๏ฝžsmๅพˆๆ˜พ่บซๆๆฐ”่ดจๅๅช›ๅŒ…้‚ฎ๏ฝž\n595001 0 ่ง†้ข‘ๆฅ่‡ชๅผ ๅ›ฝ่ฃxxๅนดๅคๆ—ฅไผฏ็ˆตๆผ”ๅ”ฑไผš\n595002 1 ๏ผšๅ…จๆฌพ่ฝฆxๅŽ˜x๏ผŒ่ดทๆฌพ่ฝฆxๅŽ˜โ€ฆโ€ฆๆฌข่ฟŽๆ‚จๆฅ็”ตๅ’จ่ฏขxxxxxxxxxxxไบŽใ€‚ไนŸๅฏๆทปๅŠ ๅพฎไฟก่ดฆๅทxxx...\n595003 0 ็ฌฌไธ€ๆฌก่งๅˆฐไฝ ็š„ๆ—ถๅ€™ๆฒกๆŠขๅคบไฝ ็š„ๅคง่„‘็œŸๆ˜ฏๅคชๅฅฝไบ†\n595004 0 ๆญป็Œฅ็็™พๅบฆ่ฟ›ๆ”ปๆ•ŒๅŽๆˆ˜ๅœบๆฌบ่ดŸๆˆ‘ๅฐๅœˆ\n1000 Processed\n classify content\n595500 0 ่ฟ™่‰˜xxxx่ฝฆไฝๆฑฝ่ฝฆๆปš่ฃ…่ˆน\n595501 0 
ไฝ•็‚…่€ๅธˆ็ฅจ่ƒฝไธ็ฎ—่ฟ‡ๆœŸไนˆใ€ๅฅฝ้šพ่ฟ‡\n595502 0 ๆ—…ๆธธๅœฐไบงๆœชๆฅๆˆฟๅœฐไบงๅธ‚ๅœบๅ‘ๅฑ•ๆ–นๅ‘\n595503 0 ไบบๅฝขๆข็ขผๆฎบไบบๆกˆ็š„ๅ…‡ๆ‰‹~ๆœ€ๅพŒ็ต‚ๆ–ผๆญปไบ†~ๅฃžไบบ็š„ไธ‹ๅ ด้ƒฝไธๆœƒๅคชๅฅฝ~ๅ—ๅฎฎ้–”ๅœจ้€™้ƒจๆˆฒๆผ”ๅพ—ๅพˆๅฅฝ~ๅฎณๆˆ‘้‚Š็œ‹้‚Š็ฝต\n595504 0 ๆ‰€ๆœ‰่‚ก็ฅจๅ…จๆŠ›ๆ— ่‚กไธ€่บซ่ฝป\n1000 Processed\n classify content\n596000 0 ่ฟ™ๅชๆœบๅ™จไบบ็Žฉๅถๅฐฑ็ฆปๅฎถๅ‡บ่ตฐไบ†\n596001 0 xใ€ไธบๅญฆ้™ขๅ„้กนๆดปๅŠจ็š„ๅผ€ๅฑ•ๆไพ›็›ธๅบ”็š„็ป่ดนใ€็‰ฉ่ดจไฟ้šœ\n596002 0 ๅšๆณ•๏ผš1ใ€ๆด—ๅ‡€ๆฉ™ๅญๅœจ็›ๆฐดไธญๆตธๆณกไธ€ไผš\n596003 0 ๆ—ฉๅนดๆฏ•ไธšไบŽๅ—ไบฌ่‰บๆœฏๅญฆ้™ข็พŽๆœฏ็ณป\n596004 0 ไป–ๆ˜ฏไธ€ไธช่‹ๅ“‘ไบบไป–ๆฒกๆœ‰ไปปไฝ•่ดชๅฉช\n1000 Processed\n classify content\n596500 1 ๅฅฝๆถˆๆฏ๏ผๅ‰ๆž—ๅ…ƒ้ผŽไฝณๅฎๆฃฎ้›…xSๅบ—่ฟŽๆ˜ฅๅทจ ็Œฎ๏ผŒไฝณๅฎVxxๅ…จ็ณปๆœ€้ซ˜็›ด้™xxxxๅ…ƒ๏ผŒไฝณๅฎVxxๆ–ฐ่ฝฆ็ซ...\n596501 0 ็œ‹ไบ†่ฟ˜ๆ˜ฏๆœ‰ไบ›่ฎธๆ„Ÿ่งฆCBAไนŸไธๆ˜“ๆ›ดๅŠ ้šพไปฅๆƒณ่ฑกNBAๆœ‰ๅคšๅŠๆˆ‘ๅชๆƒณ่ฏดๆˆ‘ๆ˜ฏไธ€ไธช็ฏฎ็ƒ็ˆฑๅฅฝ่€…\n596502 0 ๆ—ฉไธ็Ÿฅ่ฏดไบ†ๅคšๅฐ‘่ฟๆณ•็š„ๅ†…ๅฎนไบ†\n596503 0 ๆ— ้”กไบบๆœ‰ๅฅไฟ—่ฏ๏ผšๅฐๆš‘้‡Œ้ป„้ณ่ต›ไบบๅ‚\n596504 0 ๆœ‰่‘—ๅ็š„ๅ“ๆฐดๆน–ๆณ‰ใ€็็ ๆณ‰ใ€้พ™ๆฒณๆณ‰็ญ‰ๆฐด่ต„ๆบ\n1000 Processed\n classify content\n597000 0 ๅฝ“ๆ‰งๆณ•ๆœบๅ…ณ่ทŸ่ฟๆณ•ๅ•ไฝ็ฉฟไธ€ๆก่ฃคๅญ็š„ๆ—ถๅ€™\n597001 0 8็‚น็š„้ฃžๆœบไธ€็›ดๆŽจ่ฟŸๅˆฐ11็‚น\n597002 0 ๅฃฐๆŽงๆˆชๅ›พๆจกๅผโ€”โ€”ๅ€ŸๅŠฉๆ‰‹ๆœบ็š„้บฆๅ…‹้ฃŽ่ฟ›่กŒๅ–Šๅซ\n597003 0 ๅผ€ๅ‘ๅŒบๆณ•้™ข่ขซ็กฎๅฎšไธบๅ…จ็œๅธๆณ•ๅ…ฌๅผ€ๅทฅไฝœ็œ็บง็คบ่Œƒๅ•ไฝ\n597004 0 ๆˆ‘ๆŠŠๆ‰€ๆœ‰่ง†้ข‘ไผ ๅˆฐ็™พๅบฆไบ‘็„ถๅŽๆ‰‹ๆœบ้‡Œ็š„ๅ…จๅˆ ๆމไบ†็Žฐๅœจ้—ฎ้ข˜ๆฅไบ†\n1000 Processed\n classify content\n597500 0 ๆŠ“่Žทๆฝœ้€ƒx่‡ณxๅนด็ฝ‘ไธŠ้€ƒ็Šฏxxxๅ\n597501 0 ไธ‹ๅˆๅŒบๆ”ฟๅบœๆœŸไธฝ็ผๅ‰ฏๅŒบ้•ฟไธ€่กŒๅˆฐๆ˜Ž่‹‘ๆ กๅŒบๅฏนๆ‹›็”Ÿๆƒ…ๅ†ต่ฟ›่กŒไบ†็ŽฐๅœบๆŒ‡ๅฏผ\n597502 0 ไปฅๅŽๆˆ‘่ฆๆ˜ฏๅนฒๆŠคๅฃซ่ฟ™ไธช่กŒไธšๆˆ‘่ฆ็”จๅŠไธชๆœˆๅทฅ่ต„ๆฅไนฐไฟ้™ฉไธ‡ไธ€้ญ็ ไบ†ๆˆ‘่ฏปๅŒป็š„ๆˆๆœฌ่ฟ˜ๆฒกๆ”ถๅ›žๅฐฑๆญปไบ†ๆˆ‘ๅฎถ้‡Œไบบ...\n597503 0 ็‹ฌ่ฃ=้›†ๆƒ=ๅจๆƒ=ๅผบๆƒ=ๆšดๅŠ›=ๅไบบๆ€ง=ๆ”ฟๆฒป่…่ดฅ\n597504 0 ไฝ ๆ‹ฟๅˆ€ๅฏนๅฐๅทๆ–ฝๆšดๅ‰ฅๅคบๅฐๅท็š„็”Ÿๅ‘ฝ\n1000 Processed\n classify content\n598000 0 
่บซไฝ“ๆ˜ฏ้ฉๅ‘ฝๆœฌ้’ฑๅฐฑๆŽŒๆกๅœจไฝ ่‡ชๅทฑๆ‰‹ไธญ\n598001 1 ใ€ๆพณ้—จ้‡‘ๆฒ™่ตŒๅœบ็›ด่ฅใ€‘ๅซŒๅŽปๆพณ้—จ่ตŒๅœบๅคช้บป็ƒฆไบ†๏ผŒ็ฝ‘ไธŠๅผ€ๆˆทๅฐฑ่ƒฝ็Žฉ๏ผŒๆŒๆœ‰ๆพณ้—จใ€่ฒๅพ‹ๅฎพๆ”ฟๅบœ้ขๅ‘ๅšๅฝฉๆ‰ง็…งใ€‚...\n598002 0 ๅณๅœจๅ•†ไธš็ฉบ้—ดไธŽๅฑ…ไฝ็ฉบ้—ดไธญๆ‰€ๆœ‰ๅฏ็งปๅŠจ็š„ๅ…ƒ็ด ็ปŸ็งฐ่ฝฏ่ฃ…\n598003 0 ไบฒๆƒณ้’ป======็‚น้ฃ˜็ฅจ็š„่ง‰ๅพ—ๅˆ้€‚ๅฏไปฅๅŠ ๆˆ‘ๅฅฝๅ‹๏ฝžๅธฆไฝ ่ฟ›้ข‘้“ๅฐไบ†่งฃๆธ…ๆฅšๅ“ˆ\n598004 0 ไป–ไปฅไธบๆ˜ฏๅ› ไธบไป–็š„ๆญฃไน‰ๆณ•ๅพ‹ๆ„่ฏ†\n1000 Processed\n classify content\n598500 0 ๅ“ฆๆˆ‘ๅฐฑๅ‘ไธ€ๆก่ทŸ็”ตๆขฏๆœ‰ๅ…ณ็š„wbๅฐฑ่ขซๆฐดๅ†›DTไบ†\n598501 0 ็œ‹ไบ†ไฝ ้€™ๆฌก็š„้จฐ่จŠ่ฆ–้ ป็›ดๆ’ญ็š„ๆผ”ๅ”ฑๆœƒ\n598502 0 ๆทฑๅœณ้พ™ๅฒ—ๅˆ†ๅฑ€้พ™ๆ–ฐๆดพๅ‡บๆ‰€ๆฐ‘่ญฆๆŽๆŸๅคๆ€ฅไบŽ็ซ‹ๅŠŸ\n598503 0 ่™ฝ็„ถๅฌๅˆฐไบ†้›จๅฃฐ็„ถ่€Œๆˆ‘็”ต่„‘้ป‘ๅฑไบ†\n598504 0 ่’œ่“‰ๆ˜ฏๆˆ‘ๅœจ็ฆๅทžๅƒ่ฟ‡ไธŽๆฑŸ่‹ๅฃๅ‘ณๆœ€ไธบๆŽฅ่ฟ‘็š„\n1000 Processed\n classify content\n599000 0 ๆ‰€ไปฅๅข™้ข็š„่ฃ…้ฅฐๆ˜ฏ่ฃ…ไฟฎ่ฟ‡็จ‹ไธญ้‡่ฆ็š„ๆ–ฝๅทฅ็Žฏ่Š‚ไน‹ไธ€\n599001 0 ไธๆ‡‚่ฟ˜ไปฅไธบๆ˜ฏ้˜ฟ้‡Œใ€่…พ่ฎฏ็œŸ็š„ไบฒ่‡ชๆ“ๅˆ€ไป‹ๅ…ฅ็š„\n599002 0 ๅฏนไบŽๆ–ฐๆฒ‚ๅธ‚้ป‘ๅŸ ไธญๅญฆ8ๅนด็บงๅญฆ็”Ÿ็Ž‹ๅญ่Žฒๆฅ่ฏด\n599003 0 ๆžๅพ—ๅฅฝๅƒๅ‚่ง‚ไธชๆ—…ๆธธๆ™ฏ็‚นไธ€ๆ ท\n599004 0 ๅ‹ๆƒ…ๆ้†’๏ผš่‹ๅทžๅคงๅญฆ็‹ฌๅข…ๆน–ๆ กๅŒบ็‚ณ้บŸๅ›พไนฆ้ฆ†401้˜…่งˆๅฎคๅœจ7ๆœˆ13ๆ—ฅๆ™š22๏ผš00่ฟ›่กŒไบ†ๆธ…ๅœบ\n1000 Processed\n classify content\n599500 0 ไบบ็”Ÿ็š„่ดขๅฏŒไธๆ˜ฏไฝ ๆ‹ฅๆœ‰ๅคšๅฐ‘่ดงๅธๆ‹ฅๆœ‰ๅคšๅฐ‘ไธๅŠจไบง่€Œๆ˜ฏไฝ ๅธฎๅŠฉไบ†ๅคšๅฐ‘ไบบๆ”ฏๆŒไบ†ๅคšๅฐ‘ไบบๅฝฑๅ“ไบ†ๅคšๅฐ‘ไบบๅœจไฝ ็ฆปๅผ€...\n599501 0 ๅฝข่ฑกไผผๅๆจก/้ผ“ๆŽŒ/้ผ“ๆŽŒ/้ผ“ๆŽŒ\n599502 0 ็œŸ่ฆ่ฎฉๆˆ‘็ญพๅญ—ๆˆ‘็œŸๆ€•holdไธไฝ\n599503 0 ๅฎƒไปฌๅฐ†ๅœจๅก่ฝฆๅ’Œ้ฃžๆœบไธŠๅบฆ่ฟ‡30ไธชๅฐๆ—ถ\n599504 0 ๅฝปๅคœ็š„็ญ‰้ฃžๆœบๆˆ˜ๆ–—็ปˆไบŽ็”ปไธŠไบ†ๅฅๅท\n1000 Processed\n classify content\n600000 0 ไฝฟ่†ˆ่‚Œๅ‡บ็Žฐ้˜ตๅ‘ๆ€งๅ’Œ็—‰ๆŒ›ๆ€งๆ”ถ็ผฉ\n600001 1 ไบๆœฌๆธ…ไป“๏ผŒไธ€ไปถไธ็•™๏ผŒๅ…จๅœบๅ•†ๅ“ไฝŽ่‡ณxๆŠ˜๏ผŒ่ฏทไบฒไปฌ็›ธไบ’่ฝฌๅ‘Š๏ผŒๅนถ็ป™ไบˆๆˆ‘็ฒพ็ฅžไธŠ็š„ๆ”ฏๆŒ๏ผŒ่ฐข่ฐขๆƒ ้กพ๏ผ\n600002 0 ้ข้ƒจๅœจUVๅ…‰ไธ‹ๅฎŒๅ…จๆšด้œฒๅ‡บ็œŸ็šฎๅฑ‚็ŽฐๅœจๅŠๆฝœๅœจๅญ˜ๅœจ็š„้—ฎ้ข˜\n600003 0 ๅฏนไบŽ็Žฐๅœจ็ƒญ่ฎฎ็š„ๆŒๆœ‰ๅž‹ๆˆฟไบง็จŽ็š„ๅพ็ผด\n600004 0 
ๆœบๅ™จไบบไปฌ่Žซๅๆˆณๆˆ‘ๆณช็‚นๅ•Šๅ•Šๅ•Šๅ•Š\n1000 Processed\n classify content\n600500 0 ็คพๅŒบๆฏๆœˆๆณ•ๅพ‹ๅ’จ่ฏขๆœๅŠกๅ—ๆฌข่ฟŽxๆœˆxxๆ—ฅ\n600501 0 ๅฅฝๅฌใ€ๅƒ่‹ๅทžๅ่œใ€้ฆ™ใ€ๅฅฝๅƒใ€ๅ–่‹ๅทžๅ้…’ใ€้†‡ใ€ๅŽš้ฆ™็พŽ\n600502 0 ๅœจไธ€ๅฎถๆŒ‡ๆ•ฐๆŠ•่ต„ๅšๅพ—ๅพˆ็‰›็š„ๅŸบ้‡‘ๅ…ฌๅธ้‡Œ\n600503 0 ไป–ไปฌๅฐฑ็ˆ†ๆ–™ๆŸไบ›ๆ˜Žๆ˜Ÿ็œŸๆญฃ็š„้ข็›ฎ\n600504 0 โ€โ€œๆˆ‘ๅชๆ˜ฏๆ“ฆไบ†้˜ฒๆ™’่€Œๅทฒไฝ ็œ‹ๆฏ›ๅญ”้ป‘ๅคดๅ‡บๆฅไบ†ๅ› ไธบๆˆ‘ๆฒกๆ‰‘็ฒ‰\n1000 Processed\n classify content\n601000 0 ๅทฅไฟก้ƒจๅฐ†้‡็‚นๆŽจ่ฟ›ๅทฅไธšๆœบๅ™จไบบๅœจๆฐ‘็ˆ†็ญ‰ๅฑ้™ฉไฝœไธš่กŒไธš\n601001 0 ๆˆ‘็”ต่„‘็ง็”จ็š„ๆ–‡ไปถๅคน่ฟ˜ๅซๅšๅณๅ…ดๆ‘‡ๆ‘†\n601002 0 ่พ“ๅ…ฅๆˆ‘็š„ๆŽจ่ไบบไปฃ็ โ€œ44sxs5โ€ๅฐฑๅฏไปฅ่Žทๅพ—่ถ…ๅ€ผ้ญ”ๅนปๅก\n601003 0 ็™พ่คถ่Šฑ่พน่ฃ™่ฃค็™พๆญๆ˜พ็˜ฆ่ฃ™่ฃค่ฎพ่ฎก้˜ฒ่ตฐๅ…‰ๅ“ฆๅฐบ็ S้•ฟxx่…ฐๅ›ดxxM้•ฟxx่…ฐๅ›ดxxL้•ฟxx่…ฐๅ›ดxx\n601004 0 ๅ…จ็œๅ…ฑๆŸฅๅค„้‡็‚นไบค้€š่ฟๆณ•่กŒไธบ448019่ตท\n1000 Processed\n classify content\n601500 1 ไบฒไปฌไธ€ๅนดไธ€ๅบฆ็š„x.xๅฅณไบบ่Š‚ๅˆๅˆฐๅ’ฏ๏ผŒไธบๆ„Ÿ่ฐขไบฒไปฌไธ€็›ดๅฏนไบฌๆถฆ็็ ็š„ๆ”ฏๆŒใ€ๆˆ‘ๅธๅฐ†ไปŽxๆœˆxๅทๅˆฐxๆœˆxๅท...\n601501 0 1958ๅนดไนพ้š†็š„ๅœฐๅฎซๅ…ฅๅฃๅทฒ็ปๆ‰พๅˆฐไบ†\n601502 0 ๅฎŒๆ•ดๅฎžๆ‹ๅฑฑไธœๅŸŽ็ฎกๆ ก้—จๅฃๆ‰“6ๆ—ฌ่€ไบบ้ญๅˆไธญ็”Ÿๅ›ดๆฎด\n601503 0 ้ฃžๆœบไธŠ็š„ไน˜ๅฎขๅœจ็ปๅކไบ†ๅ‰ง็ƒˆ้ข ็ฐธใ€ๅž‚็›ดไธ‹้™่ฟ‘xๅˆ†้’Ÿไน‹ๅŽ\n601504 0 ๅŒป็”Ÿ่ฏดๆ˜ฏ็กๅพ—ๅคชๆ™šๅ…็–ซๅŠ›ๅคชๅทฎๆ‰ไผš็—…ๆฏ’ๆ„ŸๆŸ“็š„\n1000 Processed\n classify content\n602000 0 ็ฆปๅฎถไธ‡้‡Œ็š„xxๅฒๅฐ‘ๅนดๅœจ่ฟ™ไธ€ๅˆปๅพ—ๅˆฐไบ†ๆ‰€ๆœ‰ๅ›žๆŠฅ\n602001 0 ๆฑŸ่‹ๅ—ไบฌๅธ‚็งฆๆทฎๅŒบ้•ฟๅนฒๅฏบโ€”ไฝ›้กถ่ˆๅˆฉ\n602002 0 ไธŠๆฌก่ฟ˜ๆœ‰ไธชๅœจๅœฐ้“้‡Œ็”จ็ฌ”็”ป็š„\n602003 0 ไป–ไปฌไผš้€š่ฟ‡xxๅ้€‰ๆ‰‹็š„่ˆžๅฐ่กจ็Žฐ่ฟ›่กŒ็Žฐๅœบๆ‰“ๅˆ†\n602004 1 ๆ‚จๅฅฝ๏ผŒๆฌข่ฟŽ่‡ด็”ต่ฏšไฟกๅผ€้”ๆœๅŠกไธญๅฟƒ๏ผŒๆœๅŠก้กน็›ฎ๏ผšๅผ€้”๏ผŒไฟฎ้”๏ผŒๆข้”๏ผŒๆข้”่Šฏ๏ผŒ้”€ๅ”ฎๅ„ๆฌพ่ถ…B็บง้”่Šฏใ€‚ๅ…ฌๅฎ‰...\n1000 Processed\n classify content\n602500 0 ็ฌฌไบŒๅญฃๅบฆ่‹นๆžœMac็”ต่„‘็š„ๅ…จ็ƒๅ‡บ่ดง้‡ไธบ510ไธ‡ๅฐ\n602501 0 ไธญ่ˆชๅ…‰็”ตไธ€ๆœบๆž„ๅ‡€ไนฐๅ…ฅxxxxไธ‡ๅ…ƒ\n602502 0 7ใ€ไธ€ๆ”ฏ้ฃŽ้™ฉๅฏๆŽงๅˆฉๆถฆๅฎนๆ˜“็ฟปๅ€็š„่‚ก็ฅจ\n602503 0 
่”็ณป็”ต่ฏ๏ผš133681XXXX\n602504 0 ไปŠๅนด็š„ไปปๅŠก๏ผš1ๅฎŒๆˆๆ”ถ่ดญๅ…ฌๅธ็š„ไบ‹ๆƒ…\n1000 Processed\n classify content\n603000 0 6ใ€ๅธธๅทžๅธ‚้ซ˜ๆ–ฐๆŠ€ๆœฏไบงๅ“่ฎคๅฎš่ฏไนฆ5ไปฝ\n603001 0 ๅŒ…ๅญ่ขซๆŠ“ๅˆฐ็š„ๅœฐ็‚นๆ˜ฏUnterdenLinden41\n603002 0 ้ป‘่Ž“ๅฐ†ไธบ่ฐทๆญŒLollipopx\n603003 0 ไปŠๅนด้ฆ–ๅฑŠ5000ๅๅ†œๆ‘่ฎขๅ•ๅฎšๅ‘็”Ÿๅทฒๆฏ•ไธš็ฆปๆ ก\n603004 0 bluedๆ‰‹ๆœบ็ป‘ๅฎšๆ”ถไธๅˆฐ็Ÿญไฟก\n1000 Processed\n classify content\n603500 1 ่€ๅ‡ค็ฅฅ็›ธ็บฆไธ‰ๅ…ซ๏ผšๆœฌๅท้‡‘ๆŠ˜ๆ—ง่ดนๅ…จๅ…ใ€้ป„้‡‘ๆข้’ป็ŸณๆŠ˜ๆ—ง่ดนๅ…จๅ…ใ€้“ถ้ฅฐxๆŠ˜๏ผŒ่ดญ็ ็Ÿณๆปกxxxxๅ…ƒ่ฟ”xxx...\n603501 0 ๆฐดๅนณๅคง่‡ดไธŽChromeๆ’ไปถ็บขๆ็ฑปไผผ\n603502 0 4ๅฒไปฅไธŠไฝ†่บซ้ซ˜ๆœช่ถ…่ฟ‡ๅ›ด่ฃ™ๆฟ็š„ๅ„ฟ็ซฅ\n603503 0 ๅน็ฉบ่ฐƒๅŽปๅŒป้™ขๆ™’ๅคช้˜ณๅน็ฉบ่ฐƒๅŽปๅŒป้™ขๆ™’ๅคช้˜ณโ€ฆโ€ฆๅ‘จ่€Œๅคๅง‹\n603504 1 ้ซ˜โ€”ๆ•ฐๅญฆ่‹ฑ่ฏญ็‰ฉ็†ๅŒ–ๅญฆๅœฐ็†้“บๅฏผๆ‹›็”Ÿ๏ผŒๅŽๆ ‹ๅๅธˆไธป่ฎฒ.ๆๅ‰้ข„็บฆ.ๆฌข่ฟŽๆฅ็”ตๅ’จ่ฏข่ฏ•ๅฌ๏ผ›ๆœฌๅ‘จๆ˜ŸๆœŸๆ—ฅๆ—ฉไธŠๅ…ซ...\n1000 Processed\n classify content\n604000 0 ็พๅฐ‡ๆœฌไบบๅ—ไบฌไธปๅŸŽๅŒบ็š„ๆˆฟๅญๅ‡บๅ”ฎ\n604001 0 ่€Œ็ฌฌไธ‰ๆ–นๆ”ฏไป˜ๆ˜ฏไบ’่”็ฝ‘้‡‘่ž็š„ๆ ธๅฟƒ่ฆ็ด \n604002 0 ็Žฐ้‡‘ๆ”ถ็›ŠA/Bไธƒๆ—ฅๅนดๅŒ–ๆ”ถ็›Š็އ2\n604003 0 ๅ‘็Žฐๆฒน็„–ๅคง่™พๅฑ…็„ถๅฐฑๆ˜ฏๆฑŸ่‹็š„้บป่พฃๅฐ้พ™่™พ\n604004 0 ไปŠๅคฉๅˆš็œ‹ไบ†ไธชๆฏ•่Š‚่ญฆๅฏŸๅ‡ปๆฏ™่ขญ่ญฆไบบ็š„ๆ–ฐ้—ป\n1000 Processed\n classify content\n604500 0 2015ๅนดไธนๆฃฑๅŽฟไบ‹ไธšๅ•ไฝๅ…ฌๅผ€่€ƒ่ฏ•ๆ‹›่˜ๅทฅไฝœไบบๅ‘˜้€’่กฅไฝ“ๆฃ€ไบ‹้กน็š„ๅ…ฌๅ‘Š\n604501 0 ไธป่ฆๆ˜ฏ้ขœ่‰ฒๅพˆ็™พไธŠ็™ฝไธ‹้ป‘็š„ๆ‹ผๆŽฅ็šฎ่‰้ž็š„็พŽไธฝ้ฃŽๅ…‰\n604502 0 ๅœจไป™ๅ‰‘xๆฒกๆœ‰ๆ”พไน‹ๅ‰ไนŸๆœ‰ๅพˆๅคšไบบ่ดจ็–‘ๅ”ๅซฃ\n604503 0 MYไธ€็›ด้ป˜้ป˜็š„็‰›้€ผไธ้ ๅนฟๅ‘Šๅšไบบๆฐ”ๅš็”Ÿๆ„\n604504 0 ๆ‰ฌๅทžไบค่ญฆๅค„็†ไบ‹ๆ•…้€Ÿๅบฆ็œŸๅฏไปฅ\n1000 Processed\n classify content\n605000 0 ่Šฑๅƒ้ชจ่ฟ™้ƒจๅ‰งๅฐฑๅ‘Š่ฏ‰ไบ†ๆˆ‘ไปฌไธ€ไธช้“็†\n605001 0 ็พคๅท166480707้ชŒ่ฏ988ๅฟ…ๅกซ\n605002 0 ๅœฐ้“ไธŠไธ€ไธชๅฐไธ็‚น็›ธไธญๆˆ‘็š„้ขๅŒ…ไบ†\n605003 0 1็‚นไปŽๅŒป้™ขๅ‡บๆฅ็œผ็›ๅทฒ็ป็ไธๅผ€ไบ†\n605004 0 ไนๆ˜Œๅธ‚ๆณ•้™ขไธ€ๅฎกๅˆคๅ†ณ้ฉณๅ›žไบ†้™†ๆŸ็š„่ฏ‰่ฎผ่ฏทๆฑ‚\n1000 Processed\n classify content\n605500 0 ๅŒป็”Ÿ่ฎฉๆˆ‘32ๅ‘จๅผ€ๅง‹ๅš่‡€ไฝ็บ 
ๆญฃๆ“\n605501 0 ๆˆ‘ๅทฒ็ปๅœจๅ—ไบฌๆœบๅœบๆ‹ไบ†100ๅผ ่‡ชๆ‹\n605502 0 ็ผ…็”ธๆ”ฟๅบœๅนฒๅพ—ๆผ‚ไบฎ๏ผšxใ€ๅจๆ…‘ไบ†่ฟๆณ•ไผๆœจ่€…\n605503 0 ๆ„Ÿ่ฐขๅ‡ ไฝๅฐไผ™ไผดๅธฆๆˆ‘ๅŽป็œ‹็”ต่„‘\n605504 0 ๆ‰‹ๆœบๆœ‰็”ตๅˆๅธฆไบ†ๅ……็”ตๅฎๅ‡บ้—จ็š„ๆ—ถๅ€™\n1000 Processed\n classify content\n606000 0 ๆ‰‹ๆœบๅฌๅˆฐไธๆ–ญ็š„่ฏญ้Ÿณๅƒ็ดง็ฎๅ’’\n606001 0 ็”จ็›ธๆœบๆ‹ไธ‹ไบ†้ฅๆŽง้ฃžๆœบๅœจๅคด้กถ็›˜ๆ—‹็š„็”ป้ข\n606002 0 ๅ–ไธ€ๅ‰ฏUrbeats็ฐ่‰ฒๆฌพ่ดญไบŽไบš้ฉฌ้€Š่€ณๆœบๆฒกๆœ‰่ดจ้‡้—ฎ้ข˜ๆฒกๆœ‰ๅ้Ÿณ็บฟๆŽงๆญฃๅธธๅŒ…่ฃ…็›’้…ไปถ้ƒฝๅœจๅฏๆ˜ฏๆˆ‘ๅนณๆ—ถ...\n606003 0 ไฝ†ๆ„ฟ่€ƒไธŠ็ ”็ฉถ็”Ÿไน‹ๆ—ฅๅ†ๆ˜ฏๆˆ‘้‡ๆ–ฐๅฎ‰่ฃ…ไน‹ๆ—ถ\n606004 0 ไธ‹ๆฌก่ฃ…ไฟฎไธ€ๅฎš่ฆๅœจๆด—ๆ‰‹้—ด่ฃ…ไธช้˜ฒๆฐด็š„ๅฐไนฆๆžถ\n1000 Processed\n classify content\n606500 1 ไธ‰ๆœˆ่ถ…ๅ€ผๅ›ž้ฆˆๆœˆๆดปๅŠจ้€š็Ÿฅ:xๆœˆxๆ—ฅๅˆฐxxๆ—ฅๅœจๅนฟๅทžๅ…ดๅ‘ๅนฟๅœบไธพๅŠž!ๆฌข่ฟŽๅ„ไฝ้กพๅฎขๅˆฐๅœบ!xxxxๆ’็พŽๆฏ›...\n606501 0 ไธ‹ไบ†้ฃžๆœบไธ€ๅœบๆšด้›จGZ้‡Œๆฏซ๏ฝž\n606502 0 ไฝ†ๆ˜ฏๅˆฐไบ†ๅŒป้™ข็›ฎ็นไบ†่ฟ™ไบ›็››ๅ†ต\n606503 0 ๅŒ่ƒž้กปๆ้ซ˜่ญฆๆƒ•่ฟ‘ๆ—ฅๆžœๆ•ขไผชๆ”ฟๆƒไบบๅ‘˜ๅผ€ๅง‹ๅคง่‚†ๅฎฃๆ‰ฌๆžœๆ•ขๅทฒโ€œๆขๅคๆญฃๅธธโ€ไผๅ›พ่ฏฑ้ช—ไธๆ˜Ž็œŸ็›ธ็š„็พคไผ—\n606504 0 ไปฅๅ‰็‹‚ไนฐ็™พๆœฌๅ„็งไธ“ไธš็ฑป็š„ไนฆ่‡ชๅญฆ\n1000 Processed\n classify content\n607000 0 ALBION/ๅฅฅๅฐ”ๆปจ่ถ…ๆŸ”่ฝฏๅŒ–ๅฆ†ๆฃ‰ๆธ—้€ไนณไธ“็”จ120ๆžš่ฟ™ไธชๆ˜ฏๆธ—้€ไนณไธ“็”จ็š„\n607001 1 ็ฉฟ้ž‹็”Ÿ่‚–ๆ˜ฏ้ฉฌใ€‚ๅ…ถไป–่ฐˆไธไธŠ๏ผŒไป–ๅš็ญ”ๆกˆๆ˜ฏ้ฆ™ๆธฏ่„š๏ผŒๆ˜ฏๆŒ‡ๅ…ญๅˆๅฝฉ็™ฝๅฐๅง่„š๏ผŒ้‚ฃไธชๅฐๅงไธ็ฉฟ้ซ˜ๆ น้ž‹๏ผŒ็ฉฟๅŠ่ฃ™๏ผŒ...\n607002 0 ไธบไธ‹ไธช็คผๆ‹œ็š„ๆ—…ๆธธๅšๅฅฝ้˜ฒๆ™’ๅ‡†ๅค‡ๅ“ˆ\n607003 0 ๆˆฟๅœฐไบง่ฐƒๆ•ดๅทฒ็ป่ตฐๅ‡บVๅฝขไฝŽ่ฐท\n607004 0 ๆˆ‘็š„็”ต่ฏ่ขซ่ฎค่ฏไธบๆ‰ฌๅทž้˜ณๅ…‰็‰ฉๆตๅฑฑไธœไธ“็บฟไบ†\n1000 Processed\n classify content\n607500 0 ไธ“ๅฎถ๏ผšไธญๅ›ฝๆœบๅ™จไบบๆˆ˜็•ฅๆ–นๅ‘ไธŠ้”™่ฏฏ\n607501 0 ๅฏไปฅไธ‹่ฝฝ็พŽๆฉ™ๅฝฑ้ŸณAPP็œ‹็ฐๅง‘ๅจ˜\n607502 0 ๅˆ†ๅˆซๆ˜ฏ7000ไธ‡็พŽๅ…ƒๅ’Œ8470ไธ‡็พŽๅ…ƒ\n607503 0 ไฝ ๅคฉๅคฉ้ฃžๆœบๅทก่ˆชๅ‡บๅฎซ้ผ“ๆตทๅ›ดๆ—ฅๆœฌๅฒ›่ฝฌไธ€ๅคงๅœˆ\n607504 0 ็™ฝ่‰ฒTๆคๆญ้…ๅŠ่บซ้•ฟ่ฃ™่ฎฉไฝ ็š„่บซๆๆ›ดๆ˜พไฟฎ้•ฟ็š„ๅŒๆ—ถๅฎฝๆพ็š„่ฎพ่ฎก่ฎฉไฝ ่บซๆๆ›ดๆ˜พไฟฎ้•ฟ\n1000 Processed\n classify content\n608000 0 ๆ”ฏๆŒไฝ 
็š„ไบบไนŸๆœ‰็Ÿฅ้“็œŸ็›ธ็š„ๆƒๅŠ›ๅ‘€\n608001 0 ๆด‹ๅŽฟๅ…ฌๅฎ‰ๅฑ€ๆด‹ๅทžๆดพๅ‡บๆ‰€ไบŽๅฝ“ๆ—ฅไธ‹ๅˆๅฐ†ๆถ‰ๅซŒๆ‹ๅ–ๅ„ฟ็ซฅ็š„็Šฏ็ฝชๅซŒ็–‘ไบบ้›ๆŸใ€ๅ‘จๆŸๆŠ“่Žท\n608002 0 ๅคงๅฎ—ไบคๆ˜“็š„ๅŸบ้‡‘็ปๆ‰‹่ดนๆŒ‰็ซžไปท\n608003 0 x่‹ฑๅฏธLCD็”ตๅญ้ป‘ๆฟx่‰ฒๅฏ้€‰็พŽๅ›ฝไบš้ฉฌ้€Š$xx\n608004 0 ๅŒป็”Ÿ่ฏดๅฆ‚ๆžœ้ซ˜็ƒงไธ้€€ไธ€ๅ‘จๅฐฑ่€ƒ่™‘ๅ…ถไป–็—…\n1000 Processed\n classify content\n608500 1 xxๅ…ƒ้กน็›ฎ๏ผŒไธ‰.ๅŒ…ไธ‰ไธช็–—็จ‹้€ไปทๅ€ผxxxxๅ…ƒ้กน็›ฎใ€‚ๅŒ…xxxxๅ…ƒ็พŽๅฎนๅนดๅกไธ่ฎกๆฌกๆ•ฐ๏ผŒ่บซไฝ“ๅšxxๆฌก่บซ...\n608501 0 ้‚ฃไนˆcon็™พๅˆ†ไน‹ๅ…ซๅๅฐฑ็บฆ่ตทๆฅไบ†\n608502 0 ไปŠๅคฉๆˆ‘ๅ›žๅฎถ็”ตๆขฏๅฐฑ่ขซไธ็Ÿฅ้“่ฐๅฎถ่ฃ…ไฟฎ็š„ๆฒ™ๅญๆžๅไบ†\n608503 1 ๅ‡ๆ›ฐ็™พ่ดง็ง€ๆŸœๅฐ๏ผŒไปŽxๆœˆxๅทๅˆฐxๆœˆxๅทๆžๆดปๅŠจ๏ผŒๆ–ฐๅ“xx็ซ‹ๅ‡xx๏ผŒ่€ๅ“ๆ‰“xๆŠ˜๏ผŒ่ฟ˜ๆœ‰x็‚นxๆŠ˜๏ผŒๆœ›่€...\n608504 0 ไฝไบŽๆต™ๆฑŸ็œ้ณŒๆฑŸๅฃๅค–xxๆตท้‡Œ็š„ไธœๆตท\n1000 Processed\n classify content\n609000 0 ๅ่…่ดฅๆ˜ฏๆŽจๅŠจ็ปๆตŽๅ‘ๅฑ•็š„ๅผบๅคงๆญฃ่ƒฝ้‡\n609001 0 ้‡‡็”จๅฎœๅ…ด้ป„้พ™ๅฑฑ็ดซๆณฅๆ‰‹ๅทฅ็ฒพๅˆถ่€Œๆˆ\n609002 0 ็”ทๅญๅคฉๅคฉๅƒillstandbeforeyouimm็ฆป\n609003 0 ๆ”ฟZHIๅŠฟๅŠ›ๅŒๆ–นๅœจๅช’ไฝ“ไธŠๆฅๅ›žๆ–—ๆณ•\n609004 0 ็ข˜้…’ไผšไฝฟ่‚ค่‰ฒๅ˜้ป„ใ€ๆŠ—็™Œ่ฏไธญๅผ•่ตท่‚ค่‰ฒๅ˜ๅŒ–็š„่ฏ\n1000 Processed\n classify content\n609500 1 ไธฐๆบ็™พ่ดง ๆƒŠๅ–œๅคงๆ”พ้€๏ผŒๅชไธบ็พŽไธฝ็š„ไฝ ๅทด้ปŽๆฌง่Žฑ้›…ไธ“ๆŸœ:ไนฐxxxๅ‡xxๅฆไธ“ๆŸœๆœ‰ไนฐ่ต ^\n609501 0 ไฝ ๅฐฑๆ˜ฏๆˆ‘็š„ๆฐงๆฐ”่ฏ•ๅฌๅœฐๅ€&gt\n609502 0 MV็‰น้‚€ไฟกๅฟต้Ÿณไนๅˆ›ๅง‹ไบบๆฐๆ–Œๆ‹…ไปปๅฏผๆผ”\n609503 0 ๆˆ‘็ˆฑไฝ ไธ็ฎกไฝ ๆ˜ฏๅฆๅœจๆˆ‘่บซ่พนๆˆ–่€…่กฐ่€็–พ็—…ๅฅๅฟ˜ๆˆ‘้ƒฝ้™ชๅœจไฝ ่บซ่พน็ญ‰็€ไฝ ็ˆฑไฝ \n609504 0 ๅฏไธ่ƒฝๅƒ่ฟฝ่Šฑๅƒ้ชจไธ€ๆ ทๅ‚ปๅ‚ป็š„็ญ‰ๅ‡ ไธชๆœˆๅ’ง\n1000 Processed\n classify content\n610000 0 ๅญฆ็”Ÿไปฌๆš‘ๆœŸๆ‰“ๅทฅๆ—ถไธ€ๅฎš่ฆๆ“ฆไบฎ็œผ็›\n610001 0 ไธŠๆตท็ƒญ็บฟๆ–ฐ้—ป้ข‘้“โ€”โ€”่‹ๅทžๅœๅ‘โ€œๆŽๆ”ฟ้“ๅฅ–ๅญฆ้‡‘โ€ใ€€ๅทฒ็ป่ฟž็ปญ้ขๅ‘30ๅฑŠ\n610002 0 ่€Œๅข™้ขๅ‡นๅฝข็š„ๅ‡ฟ็ฉบ่ฎพ่ฎกๆ›ดๆ˜ฏๅˆซๅ‡บๅฟƒ่ฃ\n610003 0 ๆ‹œไปๅ‰้”‹่Žฑไธ‡5000ๆฌงๅ…ƒ็งŸ็”จ็›ดๅ‡ๆœบไธบๅฆปไนฐ้ขๅŒ…\n610004 0 ๆœ‰ๅ…ณไบŽ็ญ–ๅˆ’็”ตๅ•†ๆ–ฐๅช’ไฝ“่ฅ้”€ๆ–น้ข็š„ๅทฅไฝœไฟกๆฏ็š„\n1000 Processed\n classify content\n610500 0 
ๆˆ‘ๅฏ่ƒฝ่ฆๅ’Œๅ—ไบฌ็š„ๅŒป่ฏ่กŒไธš็ป็ผ˜ไบ†โ€ฆ\n610501 0 1ๅ…่ดนๅ‡็บงๆฟ€ๆดปWin10ๆญฃๅผ็‰ˆๆ–นๆณ•๏ผšไฝ ไธ€ๅฎš้œ€่ฆ\n610502 0 ๅ‡†ๅค‡็‰›ๅฅถxxxๆฏซๅ‡xๅฐ†ๆœจ็“œๅ’Œ่ ่ๆ”พๅ…ฅๆ…ๆ‹ŒๆœบไธญๅŠ ๅ…ฅ้ฒœๅฅถ\n610503 0 ไธน้˜ณๅŠžไบ‹ๅค„ๅ’ŒๅฒณๅŸŽๅŠžไบ‹ๅค„้™้›จ้‡่พพๅˆฐ42\n610504 0 ๆ—ฅๆœฌ่ฎพ่ฎกๅ…ฌๅธnendoๆœ€่ฟ‘ๆŽจๅ‡บไบ†ไธ€ๆฌพโ€œaroboโ€็ฉบๆฐ”ๅ‡€ๅŒ–ๅ™จ\n1000 Processed\n classify content\n611000 0 ็ˆฑๅฟ…ๅฆฅ่”ๅˆๅŒ–็–—ไธ€็บฟๆฒป็–—kras้‡Ž็”Ÿๅž‹่ฝฌ็งปๆ€ง็ป“็›ด่‚ ็™Œๆ‚ฃ่€…่ƒฝๅคŸๅปถ้•ฟๆ€ป็”Ÿๅญ˜ๆœŸ\n611001 0 ๆœ‰ๅฃน้˜ต่ฅๅ’ŒCoachDonovanๅ’Œไฝ ไธ€่ตท่ฎญ็ปƒ\n611002 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 79dmrwไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n611003 0 360ๆ•™่‚ฒ้›†ๅ›ขๅพ้—ปๆฐ่ฐˆ๏ผšๆ–ฐ่ฅฟๅ…ฐ็•™ๅญฆๅˆฐๅบ•ๆ˜ฏDIY่ฟ˜ๆ˜ฏๆ‰พไธญไป‹\n611004 0 ่ฟ™ไนˆๆ˜Žๆ˜พ็š„่ฎพ่ฎก็บฐๆผๅฐฑๆ˜Ž็›ฎๅผ ่ƒ†็š„ไธŠๅธ‚ไบ†\n1000 Processed\n classify content\n611500 0 ้ฃŸ้ขๅ…ซๆ–นๅ…จๅ›ฝๆ‹›ๅ•†็ซ็ƒญ่ฟ›่กŒไธญ\n611501 0 ๅซๆ˜Ÿๅนฟๅœบ้‚ฃไธช้บฆๅฝ“ๅŠณๅบ—ๅˆฐๅบ•ๆ˜ฏไธๆ˜ฏ24ๅฐๆ—ถ่ฅไธšๅ•Š\n611502 1 ไฝ ๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๆ—ฅๆœˆๅนฟๅœบ่ดไฝณไบบๅบ—๏ผŒไธ€ๅนดไธ€ๆฌก็š„ไธ‰ๅ…ซ่Š‚ๅฐฑ่ฆๅˆฐไบ†๏ผŒ้ข„็ฅไฝ ่Š‚ๆ—ฅๅฟซไน๏ผŒ่ถŠๆฅ่ถŠๅนด่ฝป๏ผŒไธป่ฆๆ˜ฏไธ‰ๅ…ซ...\n611503 0 9ๆœˆไธญๅ›ฝไบบๆฐ‘ๆŠ—ๆ—ฅๆˆ˜ไบ‰ๆšจไธ–็•Œๅๆณ•่ฅฟๆ–ฏๆˆ˜ไบ‰่ƒœๅˆฉ70ๅ‘จๅนดๅคงไผšไนŸๅฐ†ไธพ่กŒ\n611504 0 ่€ๅทฅไธšๅŒบๆฌ่ฟๆ”น้€ ๅฐ†ๅ…จ้ขๅŠ ๅฟซๆŽจ่ฟ›\n1000 Processed\n classify content\n612000 0 ้€‰ๆ‹ฉ็š„้˜ฒๆ™’ๅ“็š„SPFๅ’ŒPAๅ€ผ่ฆๅคง\n612001 0 ่€ŒๅŒๆœŸๅค–ๆฑ‡ๅ‚จๅค‡ๅ‡ๅฐ‘xxxxไบฟ็พŽๅ…ƒ\n612002 0 ไบ’ๆ“ไฝœๆ€งๅฏนไบŽOpenStackๅ‘ๅฑ•ๆœ‰็€้‡่ฆๆ„ไน‰\n612003 0 xใ€็†Ÿๆ‚‰ๆฐด็”ตๅทฅไธป่ฆ็†Ÿๆ‚‰ๅฎถๅบญ็”ตๅ™จๅฎ‰่ฃ…\n612004 0 ๅˆ่ขซ้ฃžๆœบ่ตท้ฃž็š„ๅฃฐ้Ÿณๅ›ฐไฝre\n1000 Processed\n classify content\n612500 0 ไปŠๆ™š็ปง็ปญ็”ฑๅฐๅผ€ๆณ•้™ขๅฐ็ผ–้™ชไผดๅคงๅฎถ\n612501 0 ๆฅ่‡ชๆทฎๅฎ‰ๅŒบ่Œญ้™ตไธญๅฟƒๅฐๅญฆๅ’Œๆทฎ้˜ดๅŒบๅˆ˜่€ๅบ„ไธญๅฟƒๅฐๅญฆ็š„43ๅ็•™ๅฎˆๅ„ฟ็ซฅ\n612502 0 ๅ•ชๅ•ชๅ•ชๅฐ่ง†้ข‘ๆ—ฅๆœฌ็”ตๅฝฑgif็”ทๅฅณๅ•ชๅ•ชๅ•ชๅŠจๅ›พ็”ทๅฅณๆปšๅบŠๅ•ๅ•ชๅ›พ็คพๅ•ชๅ•ชๅ•ชๅ•ชๅ•ชๅ•ชๅ•ชๅ•ชๅ•ชgifๅš็ˆฑๅ›พ\n612503 0 ๅคดไธ€ๅคฉ้ฃžๆœบๆ™š็‚นๅŠๅคœ้€š็Ÿฅๅ–ๆถˆ\n612504 0 DANGQIDREAMๆ˜ฏไธชไป€ไนˆ้ฌผ\n1000 Processed\n classify content\n613000 0 
ไบฌ่—้ซ˜้€Ÿๅ…ฌ่ทฏk1805+200็ฑณ\n613001 0 ้šๆœบๆŠฝๅ–xxxMBๅ›ฝๅ†…ๆต้‡ๅ…ฑ่ฎกxxไปฝ\n613002 0 ่ขซไฟๅฎšๅธ‚ๅ…ฌๅฎ‰ๅฑ€็™พๆฅผๆดพๅ‡บๆ‰€ๆฐ‘่ญฆๆŠ“่Žท\n613003 0 9โ€ฆโ€ฆๅฏๆ€œ็š„ๅจƒโ€ฆโ€ฆๅƒไบ†้€€็ƒง่ฏ่ฟ˜ๆ˜ฏๅๅๅคๅค็š„โ€ฆโ€ฆ\n613004 0 ๆฒกๆœ‰็œŸๆญฃ็š„ๆณ•้™ขๅฐฑ้œธๆฒกๆœ‰ไธชไบบ็š„่‡ช็”ฑๅ’Œๅนณๅฎ‰\n1000 Processed\n classify content\n613500 0 ๅฐ้ฃŽๅฏ่ƒฝๅœจๆต™ๆฑŸ็œ็ฆๅปบ็œไบค็•Œๅค„็™ป้™†\n613501 0 ๅฎžๆ‹ๆƒ…ไพฃๅœจๅœฐ้“ๅ†…ๆ— ่ง†ๆ—ไบบ่„ฑ่กฃๆฟ€ๆƒ…ๆ— ่ฏๅฏ่ฏดไบ†\n613502 0 ๅŒป็”Ÿๆปฅ็”จๆ”ฏๆžถๆฏไธชๆๆˆๆˆ–่พพ2ๅƒ\n613503 0 ๅฎถไฝๆฝ˜้›†ๅŒบ็š„ๅˆ˜ๆŸ็ญ‰6ไบบ็ผ–้€ ่ฐŽ่จ€็งฐ่ƒฝๅŠž็†ๅป‰็งŸๆˆฟ\n613504 0 Geroใ•ใ‚“ใซใ”ๆฅๅบ—้ ‚ใใพใ—ใŸใƒผ\n1000 Processed\n classify content\n614000 0 ๆŸฅๆธ…็œŸ็›ธๅธฆๅคดๅคงๅ“ฅๅŽŸๆฅๆ˜ฏๅฐ‘ๆž—ๆ–นไธˆ\n614001 0 ไธญๅ›ฝๆญป็ฅžๆ–นไพฟKOๆ—ฅๆœฌๅๅฐ†ๅฏนๆ‰‹่ทช่ถดๅœจๅœฐๆ„่ฏ†ไธๆธ…\n614002 0 RollingSpiderๆ˜ฏ้ฅๆŽง้ฃžๆœบ\n614003 0 ๆœ‰ไธ€็ป„ไปฟๅค็ฝ—้ฉฌๅบŸๅขŸๅผ็š„ๅปบ็ญ‘\n614004 0 ๆฎ็ง‘ๆŠ€่ต„่ฎฏ็ฝ‘็ซ™ComputerworldๆŠฅ้“\n1000 Processed\n classify content\n614500 0 ๅ…ถไธญ156ไธชๅ“็ง510ๆ‰นๆฌกไธๅˆๆ ผ\n614501 0 ่พน่ฟฝ่Šฑๅƒ้ชจ่พนๆŠŠๆˆ˜้•ฟๆฒ™็ฌ‘ๅ‚ฒๆฑŸๆน–ไป™ๅ‰‘ไธ‰ๅ„็ง่ฎฟ่ฐˆ็ปผ่‰บๅ…จๅˆทไบ†ไธ€้\n614502 0 ๅๆญฃไธ€ไธชๆœˆๅ‰้˜ฟ้‡Œๅทดๅทดๅ’Œ่š‚่š้‡‘ๆœๅ„ๆณจ่ต„30ไบฟ\n614503 0 ๅ‘จๆœซ่ตฐ่ฟ›UnFashionCafeๅ–ๆฏๅ’–ๅ•ก\n614504 0 โ€ๅŒป็”Ÿ่ฏด๏ผšโ€œๅ…ˆๆŠŠ็—…ไพ‹ๆ‹ฟ็ป™ๆˆ‘็œ‹็œ‹\n1000 Processed\n classify content\n615000 0 ่ฐƒๆ•ดๅŽๆ ‡ๅ‡†ไปŽไปŠๅนด6ๆœˆ15ๆ—ฅ่ตทๆ‰ง่กŒ\n615001 0 ไธๅˆฐ50ๅˆ†้’Ÿ็ปˆไบŽๅˆฐ่พพ็‹ฑ่ญฆๅŠžๅ…ฌๅฎคไบ†ๅ“ฆ\n615002 0 ไฝ†่‡ณไปŠไธ‰ๅนดๅคšๅคงๅฎถไพๆ—งๆ˜ฏๆฒกๆœ‰ๅ…ป่€ไฟ้™ฉ\n615003 1 xๆœˆxๅทๆญฃ็œŸๅฎžๆƒ ๆฅ่“่ฐทๆ™บ่ƒฝๅŽจๆˆฟ๏ผŒไธ็งŸๅœบๅœฐ๏ผไธ่ฏทไธปๆŒ๏ผไธๅ–ๅก๏ผ็œไธ‹ๆฅ็š„ๅ…จ้ƒจ็ป™ๆฏไฝ้กพๅฎข๏ผ่ฟ›ๅบ—ๅนถๆœ‰...\n615004 1 ๅ ดๆ˜ฅ่ฃๆŠ˜ๅŽๆปฟxxx้€xxๅ…ƒ็™พ่ฒจๅˆธ๏ผ็ฉๅˆ†็•ถๆ—ฅๆปฟxxxxๅ…ƒๅƒๅŠ ็™พๅˆ†็™พไธญ็Žๆดปๅ‹•๏ผ่ช ้‚€ไฝ ็š„ๅ…‰่‡จ๏ผ็ฅๆ‚จ...\n1000 Processed\n classify content\n615500 1 ๆ€ฅ็”จ้’ฑๆ‰พๅนณๅฎ‰๏ผŒๅ…ๆŠตๆŠผ๏ผŒไฟก็”จ่ดทๆฌพ๏ผŒๆœ€ๅฟซไธ€ๅคฉๆ”พๆฌพ๏ผŒๅนณๅฎ‰ๆ˜“่ดทๆž—็ป็†๏ผšxxxxxxxxxxx\n615501 0 
ไบ‹ๅฎžๆ˜ฏ็ดข็Ž›ๆ…ˆๅ–„ๅŸบ้‡‘ไผšๅˆฉ็”จไบบๆ€งๆ˜“่ขซ็…ฝๆƒ…็š„ๅผฑ็‚น็ญ–ๅˆ’็š„ไธ€่ตทๅพฎๅšไผ—็ญนๆดปๅŠจ\n615502 0 ไธ็„ถๆˆ‘ไปฌไผšๅŠจ็”จๆณ•ๅพ‹ๆญฆๅ™จ็ปดๆƒ\n615503 0 โ–Œๆฌข่ฟŽๅ‚ๅŠ ไธŠๆตทๅค–ๆ–‡ไนฆๅบ—ไบš้ฉฌ้€Šๅบ—ๅฅฝไนฆ็ง’ๆ€\n615504 0 ๅปบ็ญ‘็ป“ๆž„่ฎพ่ฎกไฝฟ็”จๅนด้™ไธบ50\n1000 Processed\n classify content\n616000 0 ไพ‹ๅฆ‚่ƒฝ่ฃ…่ฝฝ8ๅๆญฅๅ…ต็š„่ฟๅ…ต่ˆฑใ€ๆฐ”ๆณกๅฝข้ฃŽๆŒก็ญ‰\n616001 1 ไบฒ็ˆฑ็š„ๅงๅฆนไธ‰ๅ…ซ่Š‚ๅฐ†่‡ณ๏ผš็ฅๅงๅฆนไธ‰ๅ…ซ่Š‚ๅฟซไน๏ผๅ‡กๆ˜ฏๆœฌๆ–ฐ่€้กพๅฎขไธ‰ๅ…ซ่Š‚้‚ฃๅคฉ่ฟ›ๅบ—ๆœ‰็คผ๏ผŒๆ•ฐ้‡ๆœ‰้™๏ผŒๅ…ˆๅˆฐๅ…ˆๅพ—...\n616002 0 ไธๅคŸ้บป็ƒฆ็š„โ€ฆๅธŒๆœ›ๅฐๅทๅฐฝๆ—ฉ่ขซๆŠ“\n616003 0 ๆŸณๅทžๅธ‚้ฑผๅณฐๆณ•้™ขๆˆ็ซ‹ไบ†้ข„้˜ฒๆœชๆˆๅนดไบบ็Šฏ็ฝช่ญฆ็คบๆ•™่‚ฒๅŸบๅœฐๆšจ้’ๅฐ‘ๅนดๅฟƒ็†ๅฅๅบทไธญๅฟƒ\n616004 0 ้‚ณๅทž่ง„ๅˆ’้ฆ†้‡Œไธ€็พคๅญฆ็”Ÿ็บท็บทๅœจๅฝฉ่™นๆกฅ็…ง็‰‡ๅ‰ๆ‹็…ง\n1000 Processed\n classify content\n616500 0 ๅพฎๅšๆœ็ดขไบ†ไธ‹ๅŽไธบmate7\n616501 0 ๅœจ้‚ฃไธชๆฒกๆœ‰ๆ‰‹ๆœบๆฒกๆœ‰็ฝ‘็ปœ็š„ๆ—ถไปฃ\n616502 0 ๅ› ไธบๆœ‰xไธชๆ ทๆฟ้—ดๅŠxไธชไธดๆ—ถ็‰ฉไธš็ฎก็†็”จๆˆฟๆš‚ไธ้”€ๅ”ฎ\n616503 0 ไป–ๆ›พ่Žทๅพ—NBAๆ€ปๅ† ๅ†›ๆˆ’ๆŒ‡\n616504 0 ๆฑฝ่ฝฆๅˆ™ๆˆไธบๅฟ…ไธๅฏๅฐ‘็š„ไบค้€šๅทฅๅ…ท\n1000 Processed\n classify content\n617000 0 ๅ…ถๅฎžๆˆ‘ไปŽๆฅๆฒกๆœ‰ๆ€ช่ฟ‡่ฐๅ› ไธบๆธ…ๆฅš่‡ชๅทฑๆ นๆœฌไธ้…ๆ‹ฅๆœ‰ๆ›ดๅคšๆ›ดๅฅฝ็š„\n617001 1 ่ฟ˜ๆœ‰็ฒพ็พŽ็คผๅ“ไธ€ไปฝๅ“Ÿ๏ผŒ้€ๅฎŒไธบๆญข๏ผ่ฟ˜ๆœ‰ๆ›ดๅŠฒ็ˆ†ๆถˆๆฏ๏ผŒx.x x.x x.x ไธ‰ๅคฉ๏ผŒๅฑ…ๅฎถไบง...\n617002 0 ็Ÿฅๆƒ…ไบบๅฃซไพ›ๅ›พไบ‹ๅ‘่ˆช็ญๅคด็ญ‰่ˆฑไธ€ๅบงๆค…้ ่ƒŒๅค„\n617003 0 ๆƒณๆˆ‘็š„ๆŸๅ‹ไบ†ๆ‰€ไปฅไธ‹ไบ†้ฃžๆœบๆ™šไธŠ็ก่ง‰ๆ—ถๅฐฑ้ป˜้ป˜ๅ†™ไบ†่ฟ™ไนˆๅคš\n617004 0 ่กŒๅŠจไธญๅ…ฑๆŸฅ่Žท1่ตท้†‰้…’้ฉพ้ฉถใ€2่ตท้…’ๅŽ้ฉพ้ฉถ็š„่ฟๆณ•่กŒไธบ\n1000 Processed\n classify content\n617500 0 ไฝ ๆ— ๆณ•ๆƒณ่ฑก้‚ฃๆ—ถๅ€™็š„ๅ—ไบฌ็ซŸ็„ถๆ˜ฏ่ฟ™ๆ ท็š„\n617501 0 ็œ‹ๅˆฐ่Šฑๅƒ้ชจๅ’Œ็ดซ่–ฐๅœจๅœๅ…ƒ้ผŽๅนป่ฑกไธญ\n617502 0 ๆœ‰ๆ„ๅฏ็งไฟกๆˆ–็”ต่”xxxxxxxxxxx\n617503 0 ๆ€€ๅญ•ๅˆ†ๅจฉๅๆœˆๅญๆ˜ฏๆ”นๅ–„ๅฅณๆ€งไฝ“่ดจ็š„ๅฅฝๆ—ถๆœบ\n617504 0 ็›ดๆŽฅๆ‰พ็‰ฉไธšๆ‹†ๅŠ้กถไบ†ๆžœ็„ถๅฅนๅฎถ้—ฎ้ข˜\n1000 Processed\n classify content\n618000 1 
xxxไบฟๅนฟๅทž่Šฑ้ƒฝไธ‡่พพๅŸŽxๆœˆxๆ—ฅๆบ้›ช่€Œๆฅ๏ผŒ็ฌฌไธ€ๆœŸxxโ€”xxxๆ–น้“‚้‡‘้“บไฝxๆœˆ้ฆ–ๅผ€๏ผŒ็ซ็ˆ†็™ป่ฎฐไธญ๏ผๆœ‰...\n618001 1 ไธญๅ›ฝๅๅคงๅ“็‰Œ็™พๅฑ…ไฝณไธฝๅœฐๆฟ๏ผŒๅฎžๆœฉ๏ผŒๅผบๅŒ–๏ผŒๅคšๅฑ‚ใ€ไธญ๏ผŒ้ซ˜ใ€ไฝŽๆกฃใ€ๅ››ๅๅคš็งๆ ทๅ“ไพ›ๆ‚จๅ‚่€ƒ๏ผŒ็Žฐๅ—้™ตๅˆ†ๅบ—๏ผŒ...\n618002 1 ไฟๅˆฉ็ฝ—ๅ…ฐ้ฆ™่ฐท่ตทไปทxxxx๏ผŒๅ‡ไปทxxxx๏ผŒ้ฆ–ไป˜xไธ‡๏ผŒ่ฏฆ่ฏข๏ผŒ้‡‘็‰Œ็ฝฎไธš้กพ้—ฎ๏ผšๅผ ๅ›ญxxxxxxxxxxx\n618003 0 ๆ˜ฏ่ฟ™็ฒพ็ฅž็š„่ดขๅฏŒๆ”ฏๆ’‘็€ไผ˜็พŽ็š„่ˆžๅงฟ\n618004 0 ไธŽๆ—ถไปฃๆŽฅ่ฝจๆ„Ÿ่ฐขxx็ป„ๅคงๅนณๅฐ็ป™ๆˆ‘ไปฌๅฎๅฆˆๅธฆๆฅ็š„ๆ–ฐ็”Ÿๆดป\n1000 Processed\n classify content\n618500 0 ๅ‘จๆ—ฅ่ฆๅŽปๅŒป้™ข็œŸ็š„ๅฅฝๅฟงๆก‘โ€ฆ\n618501 0 ๆฐ‘่ญฆๆŸฅๅค„ไธ€่ตท่ฝฆ็‰Œไธบๆ™‹C001**ๅท็š„ๅฐๅž‹่ฝฟ่ฝฆ้ฉพ้ฉถไบบๆœชๆŒ‰่ง„ๅฎšไฝฟ็”จๅฎ‰ๅ…จๅธฆ็š„่ฟๆณ•่กŒไธบ\n618502 1 ๅ…‰่พ‰ๅฐๅŒบ่ต„็”Ÿๅ ‚ๅ˜‰ๆ™จไธ“ๅ–ๅบ—๏ผŒๅ‡กๆŒๆœฌๅบ—ไผšๅ‘˜ๅกๅ‡ๅฏๅ…่ดน้ข†ๅ–็ญ”่ฐข็คผ๏ผŒๅŒๆ—ถ็งฏๅˆ†ๅ…‘ๆข่ฑช็คผใ€‚ไธ‰ๅ…ซๆดปๅŠจๆปกxx...\n618503 0 ๅŠ weixin๏ผšssokcc\n618504 0 ๅฝ“ๆ—ถ็š„็ ”็ฉถ็”ŸๅŒๅญฆ้ƒฝๅทฒ็ปๆฏ•ไธš\n1000 Processed\n classify content\n619000 0 ๅฏปๆ‰พๅฒไธŠๆœ€่ดต็š„ๅฎžไน ็”Ÿ|ๅนฟๅ‘Š่กŒไธšๆœ€็›ด่ง‚็š„ๅฐ่ฑกๆ˜ฏไป€ไนˆ\n619001 0 ๆˆ‘ไธƒๆœˆ็š„ๆœ€ๅŽไธ€ๅคฉ็ฅๆˆ‘ๅฅฝ่ฟๅงๅ่›‹ๆ™šๅฎ‰ๅฐไธธๅญ\n619002 0 ๅนฟๅฎ‰ๅŽ่“ฅๅธ‚ไบบๆฐ‘ๆณ•้™ขๅฎก็†ไธ€่ตทๆ‘ๆฐ‘ๅฅฝๅฟƒไน‰ๅŠกๅธฎๅฟ™\n619003 0 ๅœฐ้“ๆŠคๆ ้š”็ปไธไบ†ๆ„Ÿๆƒ…๏ผšๅœจๅœฐ้“็ญ‰ไบบ\n619004 0 ๅธ‚ๆฐ‘ๅฏ้€š่ฟ‡็”ต่„‘ๆˆ–็งปๅŠจ่ฎพๅค‡้šๆ—ถๅญฆไน ๅคšๆ ทๅŒ–่ฏพ็จ‹\n1000 Processed\n classify content\n619500 0 ๅ‘จๆ—ฅๅ’Œไธ‹ๅ‘จไธ€้ƒฝๆ˜ฏ่Šฑๅƒ้ชจ็œŸ็š„ๅ—\n619501 0 ไธ€ๅบงๆ‹ฅๆœ‰93ไธช่ฝฆไฝ็š„4ๅฑ‚็ซ‹ไฝ“่ฝฆๅบ“ๅœจๅŸŽๅ…ณๅŒบๆŽ’ๆดช่ทฏๅ…ซไธ€้˜ณๅ…‰ๅฎถๅ›ญ่ฝๅœฐๅปบ่ฎพ\n619502 0 ๅ…่ดน้˜ฒ็™Œ็ƒญ็บฟxxxxxxxxxxx\n619503 0 ๅธธๅทžๅธ‚ไธพ่กŒไบŒๅญฃๅบฆๅ…จๅธ‚็”Ÿๆ€ๆ–‡ๆ˜Žๅœจ่กŒๅŠจ็‚น่ฏ„ไผš\n619504 0 ๆต™ๆฑŸๅŸบ็ฃๆ•™ไธคไผšไนŸๅ‡บๅ…ฌๅผ€ๆŠ—่ฎฎไฟก\n1000 Processed\n classify content\n620000 0 ๆ‰‹ๆœบๆถˆ่ดน็š„้—ชๅญ˜ๅฎน้‡ๅœจๅ…จ็ƒ้—ชๅญ˜ๅ‡บ่ดง้‡ไธญๆ‰€\n620001 0 ๅŒ…ๆ‹ฌๆณจๅ…ฅLEDๅ…ƒ็ด ็š„ๅ‰ๅคง็ฏใ€LEDๅฐพ็ฏไปฅๅŠๆ–ฐ่ฎพ่ฎก็š„ๅ‰ๅŽไฟ้™ฉๆ \n620002 0 ็พŽๅ›ฝๅ…ฌๆฐ‘ๅฏนๆณ•ๅพ‹็š„่ฎค็Ÿฅ็›ธๅฝ“้ซ˜\n620003 0 
ๅ…ณไบŽๅ…ผ่Œ??ๆˆ‘ๅš็š„ๆ˜ฏไธ€ไปฝๅ…ผ่Œ่ตš็š„ๆ˜ฏ็บฏๅˆฉๆถฆไธๆ˜ฏๅพฎๅ•†ๅฆ‚ๆžœไฝ ่ˆไธๅพ—ไธคไธ‰็™พ็ปˆ่บซๅˆถ็š„ไผš่ดนๅฐฑๅˆซๆฅๅ’จ่ฏขๅคฉไธ‹...\n620004 0 ไธ€ไธชๅฎถๅบญ้‡Œๅ‡บไบ†9ไธช็•ธๅฝขโ€œ็Ÿฎไบบโ€\n1000 Processed\n classify content\n620500 0 ้”™็ปผๅคๆ‚็š„็œŸ็›ธ~ไธๅˆฐๆœ€ๅŽ้ƒฝไธ็Ÿฅ้“ๅŽŸๆฅๆ˜ฏ่ฟ™ๆ ท\n620501 0 7ๆœˆไปฝๅฏŒๆ—ถไธญๅ›ฝA50ๆŒ‡ๆ•ฐๆœŸ่ดง็š„ๆœชๅนณไป“ๅˆ็บฆๆ€ปๆ•ฐๆ€ฅ่ทŒ่‡ณ127\n620502 0 xใ€ๅคฎ่กŒๆœ‰ๅ…ณ่ดŸ่ดฃไบบ๏ผš็Œช่‚‰ไปทๆ ผๆณขๅŠจไธไผšๅฝฑๅ“่ดงๅธๆ”ฟ็ญ–\n620503 0 ่ฟ˜ๅซไธฐๅฏŒ็š„็ปด็”Ÿ็ด Cๅ’Œๅฏๆบถๆ€ง็บค็ปด\n620504 0 ไธๅˆ่ง„ๅฎš่ฏๅ“148ๆ‰นใ€ๅ‡ๅ†’5ๆ‰น\n1000 Processed\n classify content\n621000 0 ๅฅฝไธๅฎนๆ˜“ๅไธŠไธ€ไธชๆœ‰wifi็š„้ฃžๆœบๅฑ…็„ถ่ฆๆ”ถ่ดน\n621001 0 ๅ…ซๅฎ—็ฝชไฟจ็„ถๆ˜ฏๆญปๅˆ‘ๅฐฑ่ฟ™ไนˆๅŠž\n621002 0 ้‚ฃๆ‰ไบ†ๅŒป็”Ÿๆฅ็š„ๆฑ‰ๅญๅฏนๅนณไธ€ๆŒ‡็”šๆ˜ฏๆ•ฌ็•\n621003 1 ไฝฐ@ไฝณ@ๆณบ๏ผŒ้พ#ๅ”ฌ#ๆ–—๏ผŒ็‰›๏ฟฅ็‰›๏ผŒไฝ“*่‚ฒ็ญ‰็ญ‰๏ผ่ซŸๆฌพ่ตฝ๏ผŒ่ฏฆๆƒ…๏ผšไธ–่ฟฏๆดฎๆฆž xxxxxxx.โ„ƒโ—‹M ...\n621004 0 ๅนถๅฐ†ไธบ่…พ่ฎฏๅบ”็”จๅฎๅ…จๅœบๆไพ›ๅฏ้ ็š„wifi\n1000 Processed\n classify content\n621500 1 ๆญๅ–œๆ‚จๅ–œๆทปๆ–ฐๅฑ…๏ผŒๆˆ‘ๆ˜ฏๆ’ไธฐ่ฃ…้ฅฐ็š„ๅฎถๅฑ…้กพ้—ฎ๏ผŒๆˆ‘ๅง“ๅพใ€‚ ๆˆ‘ไปฌๅ…ฌๅธ็ŽฐๆŽจๅ‡บไบ†ไธ€ไธชxxๅนณ็ฑณๆˆฟxไธ‡ๅคšๅ…จ...\n621501 1 ๆœฌๅบ—ๆถˆ่ดน็พŽๅฎน้กน็›ฎ๏ผŒๅฏไบซๅ—ๅฏนๆŠ˜ไฝ“้ชŒ ๅฆ๏ผšๆ˜ฅๅญฃๆ˜ฏๅ…ป่‚ๆœ€ไฝณๅญฃ่Š‚๏ผŒๆœฌๅบ—ๆฏไบบๆŽจๅ‡บไธ€ไธชๅ้ขๅช้œ€xxx...\n621502 0 ้ผ“ๆฅผๅŒป้™ขๆžœ็„ถๆ˜ฏๅคงๅŒป้™ขๅ•Š\n621503 0 ใ€่€็Ž‹่ฏด๏ผšใ€Œๆˆ‘ไปฌไฟฉๅช่ฆไธ€ๅตๆžถ\n621504 0 12ๅฒๆน–ๅ—ๅฅณๅญฉๆ€ๆ€่ขซๅŒๆ‘74ๅฒ่€ไบบๆ€งไพตๅนถไบงๅญ\n1000 Processed\n classify content\n622000 0 ๆƒณๅฟตๅฅฝๆๆ€–ไพตๅ ไบ†ๆˆ‘ๆ•ดไธช่บซไฝ“\n622001 1 ๅญฃ้—ช่€€้‡‘้™ต๏ผๅ‡ญๆญคๅพฎไฟกไบซ็ดซๅณฐไธšไธป็‰นๆƒ ๏ผ ่ฏฆๆƒ…ๅ’จ่ฏข๏ผš้™ˆ ๅˆš ้ข„่ฎข็ƒญ็บฟ๏ผšxxxxxxxxxxx ๆ‚จ...\n622002 1 ๆ‚จๅฅฝ๏ผŒๆฌข่ฟŽๅ‚ๅŠ ๅฟ—้‚ฆๅŽจๆŸœxxx้ข„ๅ”ฎไผš๏ผŒๅ…จๅ›ฝ็บฟไธŠ็บฟไธ‹่”ๅŠจไผ˜ๆƒ ๆดปๅŠจ๏ผŒไปทๆ ผๅ…จๅนดๆœ€ไฝŽ๏ผŒๆ›ดๆœ‰ๆ€ป้ƒจ้ข†ๅฏผไบฒไธด...\n622003 0 ๅนถๅฏนQuanergyๅ…ฌๅธ่ฟ›่กŒๆˆ˜็•ฅๆŠ•่ต„\n622004 0 ๅ—ไบฌ่ฟ™่พน็š„้Ÿต่พพไธ€ๅฎšๆ˜ฏไธๅคช่ชๆ˜Ž\n1000 Processed\n classify content\n622500 1 ไฟฑไน้ƒจKTVๆฌขๅ”ฑ็››ๆƒ  KTVๆˆฟ้—ด๏ผš็™พๅจๅ•ค้…’๏ฟฅxxxxๅ…ƒxๆ‰“ 
่ฑชๅŽไธญๆˆฟๅฅ—้ค ้’ๅฒ›็บฏ็”Ÿ๏ฟฅxxxๅ…ƒ...\n622501 0 ๆฑฝ่ฝฆ่ฝฎ่ƒŽๅ……ๆฐฎๆฐ”ๆœ‰ไธ€ๅฎš็š„ๅฅฝๅค„\n622502 1 ้ญ…ๅŠ›ๅบ†ไธ‰ๅ…ซ.ๅฟซไนๅฅณไบบ่Š‚.่ฟ›ๅบ—ๆœ‰ๆƒŠๅ–œ ้พ™ๅŽๅฏŒ้€šๅคฉ้ชไผŠ่ŽŽๅ…ฐ่’‚็พŽๅฎนๅ…ป็”Ÿ้ฆ†ๆๅ‰็ฅๆ‚จไธ‰ๅ…ซ่Š‚ๅฟซไน๏ผ ...\n622503 0 ไธๆ–™ๆƒณxxxไบ‹ไปถๅ‘้…ตๅˆฐไบ†ๅฆ‚ๆญคๅœฐๆญฅ\n622504 0 ๅ†…ๅœฐๅฅณ็”Ÿๅœจๆธฏ่ก—ๅคด้‡Žๆˆ˜่ขซๅˆคๆ„ŸๅŒ–12ไธชๆœˆ\n1000 Processed\n classify content\n623000 0 ๆŸฅๆ˜Žๅฝ“ไบ‹ไบบๅ’Œๅ…ถไป–่ฏ‰่ฎผๅ‚ไธŽไบบๅˆฐๅบญๆƒ…ๅ†ต๏ผšๅŽŸๅ‘Š๏ผš่ฎธๆ–Œใ€ๆฑชๆ–‡้ฉๅˆฐๅบญ\n623001 0 ็ป“ๆžœไบบ่ฟ˜ๆฒก่ตฐๅˆฐไป–ไปฌ้ฃžๆœบๆ—่พน\n623002 0 ็ฆ…ๅŸŽๅŒบ่ฟ‘12ๅนดๆœชๅ‡บ็Žฐๆ–ฐๅ‘ๆœฌๅœฐ็—…ไพ‹\n623003 0 ๅ…ฌๅธ่ฎกๅˆ’ๅฌๅ›žๆœ€ๅคš1ไธ‡่พ†2016XC90ๅž‹่ฟๅŠจๅž‹ๅคš็”จ้€”่ฝฆ\n623004 0 ๅŠ ๆฒนๅงๅฎžไน ็”Ÿ็œŸๆ˜ฏไธ€้ƒจๆŒบๅฅฝ็š„็”ต่ง†ๅ‰งๅ•Š\n1000 Processed\n classify content\n623500 1 ๆˆ้ƒฝไผ—ๅˆๆ€ไธบๅ…ฌๅธ๏ผŒไธ“ไธšไปŽไบ‹ๅ„ๅ“็‰Œ็”ตๆขฏๅ˜้ข‘ๅ™จใ€ๆŽงๅˆถไธปๆฟใ€้€š่ฎฏๆฟใ€่ฝฟๅŽขๆฟใ€้—จๆœบๆ‰ณใ€ๅค–ๅ‘ผๆฟใ€ๅบ”ๆ€ฅ็”ต...\n623501 0 ๆˆ‘็š„ๅคงๅญฆๅœจๆฑŸ่‹ๅœจๅ—ไบฌๅœจๅ—ไฝ“ๅœจๆฐ‘่กจ็ณป81242็ญ\n623502 0 ๅ…ถๅ…จ่ต„้™„ๅฑžๅ…ฌๅธxxไบฟๅ…ƒๅ‘ๅ€บ้ขๅบฆไธญ็š„้ฆ–ๆœŸxxไบฟๅ…ƒxๅนดๆœŸๅ…ฌๅธๅ€บๅทฒ็ปๅฎšไปท\n623503 0 ่€Œๆฏ100ไธช่ฎพ็ฝฎไธชๆ€ง็ญพๅ็š„ไบบไธญ\n623504 0 ๅคงๅฎถๅŠ ๆฒน็ป™็Ž‹ไฟŠๅ‡ฏ็ฌฌไธ€ๅฅฝไธๅฅฝ\n1000 Processed\n classify content\n624000 0 ๅ‘ตๅ‘ตๅ—’ไธ็”จๆณ•ๅพ‹ๆ€็ปดๆˆ‘้ƒฝๆœ‰็‚น่ฟท็ณŠ\n624001 0 ๅธธๅทžๅŸŽ้•‡ๅฑ…ๆฐ‘ๅนณๅ‡ๆฏๆœˆๆ”ถๅ…ฅ๏ผš3580\n624002 0 ็˜ฆไธ‹ๆฅ่ฏๆ˜Ž็ป™ๆ‰€ๆœ‰่ดจ็–‘่ฟ‡ไฝ ็š„ไบบ็œ‹\n624003 0 ๅ†ไนŸไธ็”จๆ‰พ็™พๅบฆไบ‘็ฝ‘็›˜้ซ˜ๆธ…่ต„ๆบ้“พๆŽฅไบ†APP\n624004 0 ไปฅ้˜ฒๆ€€ๅญ•ๆœซๆœŸ็”ฑไบŽๅญๅฎซ็š„ๅŽ‹ๅŠ›่€Œไบง็”Ÿ็š„็—‰ๆŒ›\n1000 Processed\n classify content\n624500 1 ๅ…ƒๅฎตไฝณ่Š‚ๆฅไธดไน‹้™…๏ผŒๅฏน้’็ƒค้น…ไธบไบ†ๅ›ž้ฆˆๆ–ฐ่€้กพๅฎข๏ผŒ็‰นๆญคๆŽจๅ‡บๆœ‰ไนฐๆœ‰่ต ๆดปๅŠจ๏ผŒ่ดญไนฐๅคง้น…ไธ€ๅช๏ผŒ่ต ๆฑคๅœ†ไธ€่ข‹๏ผŒ...\n624501 0 6ไธช85ๅผๅ’Œ2ไธชt90ๆœ€ๅŽ่ฟ˜ๆ˜ฏ้›Œ้นฟๅ’Œๆˆ‘็š„6ไธช็ฉบๅ†›ๅ›ดๆฎดไป–ไธ€ไธชๆ‰ๆ‰“ๆญป้‚ฃk1a1\n624502 0 ไนฆไธญๅ€Ÿๅฎ‹ๆœ็š„ไธ€็พคๅœŸๅŒชๆŠŠๅކๆœๅކไปฃ็š„ๅฎ˜ๅœบโ€œๆญฃไน‰ไบบๅฃซโ€้ƒฝ้ป‘ๅ‡บ็ฟ”ๆฅ\n624503 1 
ๅ•†ไธšๆญฅ่กŒ่ก—๏ผŒๅนผๅ„ฟๅ›ญ๏ผŒๅ›พไนฆ้ฆ†๏ผŒ็”ตๅฝฑ้™ข๏ผŒๅ„ฟ็ซฅไนๅ›ญ๏ผŒๅฅ่บซๅนฟๅœบ๏ผŒ็ปฟๅŒ–ๆ™ฏ่ง‚ๅธฆ๏ผŒ็”Ÿไบงๅฑ…ไฝ้”€ๅ”ฎไธ€ๆˆฟๅคš็”จ็ŽฐๆŽจ...\n624504 0 ๆต™ๆฑŸๅ€Ÿโ€œไบ’่”็ฝ‘+โ€ๆž„็ญ‘ๅ†œ่ต„็›‘็ฎกโ€œๆ™บๆ…งโ€ๆ–ฐๆ ผๅฑ€๏ผšใ€€ใ€€ๆฐ‘ไปฅ้ฃŸไธบๅคฉ\n1000 Processed\n classify content\n625000 0 AmazonAWSๆžœ็„ถๅพˆๅฅฝ็Žฉๅ•Š\n625001 0 ไธ€็งๅซๆˆ‘ๅฐฑๆ˜ฏ่ฆ็œŸ็›ธๆˆ‘ๅฐฑๆ˜ฏ่ฆxๆธ…ๆฅš็„ถๅŽๆญปๅฟƒ\n625002 0 ๅธธ็†Ÿๆทๆขฏๆ•™่‚ฒไธ“ไธš่ฅฟ็ญ็‰™่ฏญๅŸน่ฎญๅญฆๆ ก\n625003 0 ๆœ€่ฟ‘ๆฒกๆœ‰ๆ‰‹ๆœบไนŸ้”™่ฟ‡ไบ†่ฎธๅคšๅบ”่ฏฅ็บชๅฟต็š„ๅœบๆ™ฏ\n625004 0 โ—ขไธ€่ˆฌๅฐ†ๅฐไบŽx็บง็š„ๅœฐ้œ‡็งฐไธบ่ถ…ๅพฎ้œ‡\n1000 Processed\n classify content\n625500 0 ไธŠๆตทๆ’ๅŸบๅไบบ่ดญ็‰ฉไธญๅฟƒไธพๅŠž็š„โ€œ็ฅž็ง˜่Šฑๅ›ญโ€ไบฒๅญๆธธๆดป\n625501 0 ไธ€ไธชๅค–ๅœฐๆฅ่‹ๅทž็š„โ€œๆ–ฐ่‹ๅทž\"ไบบ\n625502 0 ไป–ไปฌๅœจไธๅฑˆไธๆŒ ็š„ๅ่…่ฟๅŠจไธญๆ— ๆƒงๆ ‘ๆ•Œ\n625503 0 ไบฎ็‚นๆ˜ฏ็บฏ็‰ฉ็†้˜ฒๆ™’ๆธฉๅ’Œไธๅˆบๆฟ€\n625504 0 ็›—ๅข“็ฌ”่ฎฐ่ฆVIPๆ‰่ƒฝ็œ‹ๅ…จ้›†่€ๅธˆไฝ ็œ‹ไบ†ๅ—โ€\n1000 Processed\n classify content\n626000 1 ้‡‘ๅ† ๅพทๅทžไฟฑไน้ƒจ็ฅๆ‚จๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒๆœฌๅบ—ๆŽจๅ‡บๅ„็ฑปSNG้”ฆๆ ‡่ต›๏ผŒๆฌข่ฟŽๅคงๅฎถๅˆฐๅบ—ๅ‚ๅŠ ๆฏ”่ต›ใ€‚ๅ†ๆฌกๆ„Ÿ่ฐขๆ‚จ็š„ๅˆฐ...\n626001 0 xxๅนด้›…ๅฎ‰ๅœฐ้œ‡๏ผš่ฏบๅŸบไบšๆๆฌพxxxไธ‡\n626002 0 ่‹ๅทžๅ†ทๆณ‰ๆธฏๆ‰ฎๆผ”็€ไธบ้ซ˜็ซฏ็ง‘ๅญฆๆœๅŠก็š„่ง’่‰ฒ\n626003 0 ็ƒŸๅฐ็‰ŸๅนณๅŒบ่Ž’ๆ ผๅบ„้•‡ไธ‰ๅคงๅทฅไฝœๅผ•้ข†่ฝฌๅž‹ๅ‘ๅฑ•ๆ‹›ๅ•†ๅผ•่ต„็ชๅ‡บ่ฝฌๅž‹ๅ‘ๅฑ•๏ผšไปŠๅนดไธŠๅŠๅนด\n626004 0 โ—‹ๅคšไฝๅ—่ฎฟๅˆ†ๆžไบบๅฃซ่ฎคไธบ็Œช่‚‰ๆถจไปทๆŽจๅ‡CPIไธŠๆถจๆœ‰้™\n1000 Processed\n classify content\n626500 0 ็ฌฌไธƒๆฌกๆ‰“ๅŸŽ็ฎกๅฐฑๆމไบ†ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๆˆ‘ๅฐฑ็Ÿฅ้“ๅคงๅ“ฅๆœ€็ˆฑๆˆ‘ไนˆไนˆๅ“’ๅŸ‹่ƒธ่‚Œๅƒๆˆ˜ๅ‡บ่™Žๅฝป็ปˆไบŽๆททๅˆฐ็‚นๆฌงๆดฒ่ก€็ปŸๅ•ฆไธๆƒณๆŠŠ...\n626501 0 ๅธธๅทžๅธ‚้‡‘ๅ›ๅŒบๆฐ”่ฑกๅฐ8ๆœˆ9ๆ—ฅ16ๆ—ถๅ‘ๅธƒ็š„ๅฐ้ฃŽ่ญฆๆŠฅๅ’Œๅคฉๆฐ”้ข„ๆŠฅ๏ผšไปŠๅนด็ฌฌ13ๅท็ƒญๅธฆ้ฃŽๆšด่‹่ฟช็ฝ—ไปŠๅคฉ15...\n626502 0 ่ฎพ่ฎก้”คๅญไพฟ็ญพ็š„่ฎพ่ฎก้ฃŽๆ ผๆ˜ฏ็ฎ€็บฆ็š„ๆ‹Ÿ็‰ฉ้ฃŽๆ ผ\n626503 0 ๆ˜Žๅคฉๅˆ‘ๆณ•ๅˆ†ๅˆ™ๆ€ป็ป“ๅฎŒๅ†™ไธชไธŠ่ฏ‰็Šถ้ข˜ๅบ“ๅฐฝ้‡ๅš่ƒŒไธญๅ›ฝ็‰น่‰ฒ็คพไผšไธปไน‰11ๅนดๅทไธ€\n626504 0 ๅˆ›ไธšๆŒ‡ๆ•ฐไธŠ4000็‚น็š„ๆ—ถๅ€™ๆ— ๅ‘ๅ‡บ็Ÿญไฟก้ฃŽ้™ฉๆ็คบ\n1000 Processed\n classify 
content\n627000 0 ๅฝ“ไฝ ๆŠฑๆ€จ่‡ชๅทฑๅทฒ็ปๅพˆ่พ›่‹ฆ็š„ๆ—ถๅ€™\n627001 0 5ใ€่ฟ‡้‡ๅ–็ปฟ่ฑ†ๆฑคๆˆ–่‡ด่‚ ่ƒƒ็–พ็—…\n627002 0 ๅ†…่ดพๅพท05ๅนดไธŠๅฐไน‹ๅŽๅผบ่ฟซๅ›ฝไผๆททๆ”น็งๆœ‰ๅŒ–\n627003 0 ๅคงๅนดๅˆ3้ญๆˆ‘ๅฆˆๆ‹ฟ็€้ธกๆฏ›ๆŽธๅญ่ฟฝ็€ๅˆฐๅค„ๆ‰“\n627004 0 ๅŸŽ็ฎกๅฐฑๅฝขๆˆไบ†่กŒๆ”ฟ้ƒจ้—จ็š„ๆ–ฐไธ‰ๅ›ฝๆผ”ๆญฃไน‰ไบ†\n1000 Processed\n classify content\n627500 0 ๅœจๅŒป็”Ÿ็š„ๆŒ‡ๅฏผไธ‹็”จ่ฏๆฏ”่พƒๅฎ‰ๅ…จ\n627501 0 50ๅคšๅฒ้ฃŸๅ ‚ๅคงๅฆˆ็”จๆ‰‹ๆœบ็œ‹่Šฑๅƒ้ชจ\n627502 0 ๅ…จๆ–ฐ่ฎพ่ฎก็š„ๅ‰ๅŒ…ๅ›ด่ฎฉ่ฝฆๅคด็œ‹่ตทๆฅๆ›ดๅŠ ็š„ๆ•ฆๅฎž\n627503 0 ็”จไบ†่ฟ™ไนˆๅคšๅนดๅฐ็ฑณๅˆซ็š„้ƒฝๆŒบๅฅฝ็š„ๆ€Žไนˆๅฐฑๆ•ฐๆฎไผ ่พ“่ฟ™ไนˆๅทฎๅŠฒ\n627504 0 ็‰ฉ่ดจ่ดขๅฏŒ่ทŸไธไธŠ็ฒพ็ฅž้ฃŸ็ฒฎ้œ€ๆฑ‚โ€ฆ็œŸ็š„่ฆๅŽปๅ–่‚พไบ†\n1000 Processed\n classify content\n628000 1 ไธ‡่ƒฝๅผ€้”ๅ…ฌๅธๆ„Ÿ่ฐขๆ‚จๆฅ็”ต๏ผŒๆœฌๅบ—ไธ“้…ๆฑฝ่ฝฆ้’ฅๅŒ™ใ€้ฅๆŽงๅ™จใ€ๆ™บ่ƒฝๅกใ€ๆŠ˜ๅ ๆ”น่ฃ…ใ€่ฝฆๅบ“้ฅๆŽงใ€ไธŠ้—จๅผ€้”ๆข้”๏ผˆ...\n628001 0 ๅญฉๅญๅ‡บ็”Ÿๅˆฐ็Žฐๅœจๆณ•้™ขๅผบๅˆถๆ‰ง่กŒๆŠšๅ…ป่ดน็›ฎๅ‰ๅฐฑ่ฎจๅ›žไธ€ๅƒ\n628002 0 2014ๅนด10ๆœˆ้กถๆ–ฐๅœจๅฐๆนพๆญฃไน‰้ฅฒๆ–™ๆฒนไบ‹ไปถ็ˆ†ๅ‘\n628003 0 ไฝฟๅคนๆฟๅทๅ‘ๆฃ’ๅน้ฃŽๆœบ็š„้ƒฝๅฐๅฟƒ็‚นๅงๅƒ้˜ฟ็ซฅๆœจ่„šๅบ•็š„็ซ็ฎญไธ€ๆ ทๅ–ท็ซๆˆ‘ๅฐผ็Ž›ๅ“ๆญปๆˆ‘ไบ†\n628004 0 ๆ‰พ็†่ดข็ฝ‘โ€”โ€”ๆœ€ๆƒๅจ็š„็ฝ‘่ดทๅฏผ่ˆช็ฝ‘็ซ™\n1000 Processed\n classify content\n628500 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๅธๆž—ๅขฉๅˆฐ่ฅฟๅฎ‰ๅŒ—ไธ‰็Žฏๆตท่ฟ้—จๅˆฐ้—จไปทๆ ผxxxxๅ…ƒ/ๆŸœ๏ผŒ้“่ทฏxxxxๅ…ƒ/ๆŸœ๏ผŒๆฌข่ฟŽ่ต่ฝฝ๏ผŒ่ฐข่ฐข๏ผ...\n628501 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?่Žบๅ“ฅๅ’Œๅคง็Ž‹ๅŒๅฝ’xxxxP\n628502 0 2ๅˆๅœจ่ฏฅๅฒ›ๅ‘็Žฐไธญๆ–‡ๅ’Œ้ฉฌๆฅ่ฅฟไบšๆ–‡ๅญ—็š„้ฃžๆœบๆฎ‹้ชธ\n628503 0 2015้‡‘็ซ‹ๆ™บ่ƒฝๆ‰‹ๆœบๆฏไธญๅ›ฝๅ›ดๆฃ‹็”ฒ็บง่”่ต›่ฟ›่กŒ็ฌฌ11่ฝฎ่พƒ้‡\n628504 0 ๅœจ่ฏฅๅฒ›ๆตทๅฒธ็บฟไธŠๅ‘็Žฐ้ฃžๆœบๆฎ‹้ชธ\n1000 Processed\n classify content\n629000 0 ๆฏ”ๆขจๅญๅคšไธƒๅ€้“็š„ๅซ้‡ๆฏ”่‹นๆžœๅคšไธ‰ๅ€\n629001 0 ๆ™šไธŠ็ก่ง‰ๆขฆ่งๆˆ‘ๆŠŠๆ‰‹ๆœบๆ‘”็ขŽไบ†\n629002 0 ๆ˜จๅคฉๆ–นๆญฃ่ฏๅˆธๅšไบ†ไธ€ไผšๅ„ฟ็š„้ข†ๅคด็พŠๅŽŸๆฅๆ˜ฏไธš็ปฉๅ—7ๆœˆไธ‹่ทŒๅฝฑๅ“ๆœ€ๅฐ็š„ๅˆธๅ•†่‚กๅ•Š\n629003 0 ่™ฝ็„ถๆˆ‘ๆœฌๆฅๅฐฑ่„‘ๆฎ‹~ๅฅฝไบ†ๆ˜Žๅคฉ่ฟ˜่ฆ่€ƒ่ฏ•ๅ‘\n629004 0 
ๅทฆๅ›พๆ˜ฏ็‹—็‹—ๅœจๅŠจ็‰ฉๅŒป้™ขๅ†…ๆฃ€ๆŸฅๆ—ถ็š„Xๅ…‰็‰‡\n1000 Processed\n classify content\n629500 0 ๆˆ้ƒฝๅฑ€็›ฎๅ‰ๆœ‰ๅ„ๅผ็”ตๆขฏ่ฟ‘ไธคๅƒ้ƒจ\n629501 0 ่‹ๅทž่ดจ้‡ๆŠ€ๆœฏ็›‘็ฃๅฑ€ๅ›ญๅŒบๅˆ†ๅฑ€่ฟ…้€Ÿ้ƒจ็ฝฒ\n629502 0 ๅŒป็”Ÿ่ฏด๏ผš้‚ฃไฝ ๆ€ŽไนˆไธๆŠŠไป–ๅธฆๆฅ\n629503 0 ๆ—ฉ่ตท+ไธ€ไธชๅŠๅฐๆ—ถ็ญ่ฝฆ+ๆญฃไบ‹ๆฒกๅ’‹ๅนฒ็ŠŠๅญๆฒกๅฐ‘ๆ‰ฏ+ๅŠไธชๅฐๆ—ถ็ญ่ฝฆ+ไธ€ไธชๅฐๆ—ถๅœฐ้“+ๅŠไธชๅฐๆ—ถๅ…ฌไบค=ๅŒ—่พฐt...\n629504 0 AMD็œŸ็š„่ƒฝๅธฆไฝ ้ฃžๅตŒๅ…ฅๅผๅค„็†ๅ™จๅทฒๆ‰“ๅ…ฅๆณข้Ÿณ้ฃžๆœบ้ฉพ้ฉถ่ˆฑ\n1000 Processed\n classify content\n630000 0 ่ฏฑ้ช—ไบ‹ไธป้€š่ฟ‡ATMๆœบ่ฝฌ่ดฆๆ”ฏไป˜ๆ‰€่ฐ“็š„ๆ”น็ญพๆ‰‹็ปญ่ดน\n630001 0 ่™ฝ็„ถ่ฆ6็‚น่ตทๅบŠ่ตฐ40ๅˆ†้’ŸๅŽปไธŠ็ญ\n630002 0 ๅ…ฉ่€…ๅทฎๅˆฅๅœจๆ–ผๆณ•่ฆๅ‘ฝไปคๆ˜ฏไธ€่ˆฌๆ€งๆŠฝ่ฑกๆ€ง็š„่ฆๅฎš\n630003 0 ่€Œไธ”ๆŠ•่ต„้ขๆฏ”่ตทๆฏๅนดๅŒ…ไธชๆธฉๆณ‰ๆ—…้ฆ†็š„่Šฑ้”€ๅคšๅพ—ๅคš\n630004 0 ๆ™šไธŠๅผ„ๅไธ€ๆญฅๆ‰‹ๆœบ็Žฐๅœจ่ฟ˜ไธŠไธไบ†ๅพฎไฟก\n1000 Processed\n classify content\n630500 0 ๆตท่พน่™ฝๆฒกๆƒณ่ฑกไธญ็š„้‚ฃไนˆๆตชๆผซไฝ†ไนŸไนๆญคไธ็–ฒ่™ฝๅธฆๅ„็งไผค็—•ๅ›žๅฎถไฝ†ไนŸไธ่™šๆญค่กŒ&lt\n630501 0 โ€โ€œ้ฃžๆœบ็š„ๅ‡ๅŠ›ๆ˜ฏๅฆ‚ไฝ•ไบง็”Ÿ็š„ๅ‘ข\n630502 0 ๅ’จ่ฏขไฟ้™ฉ่ฏท่”็ณป18052233219\n630503 0 ๆฑŸ่‹่ฟžไบ‘ๆธฏ่ดง่ฝฆไธขๅคฑ1ๅจๆœ‰ๆฏ’ๅฑๅŒ–ๅ“่ฟ˜ๆœ‰6ๅŒ…ๅคฑ่ธช\n630504 0 ๅฝ“ๅคงไผ—ๅช’ไฝ“่ดจ็–‘ๆŸไธ€็บฟๅ“็‰Œๆœ‰ๅฏ่ƒฝๆ˜ฏโ€œไธ‰็ฒพไธ€ๆฐดโ€็š„ๆ—ถๅ€™\n1000 Processed\n classify content\n631000 1 ๅœจๅ—๏ผŸxๆœˆxxๆ—ฅ๏ผŒxxๆ—ฅใ€็‰นๆƒ ใ€‘ๅŒ—ไบฌๆ•…ๅฎซๅ…ซ่พพๅฒญ้•ฟๅŸŽๅ…ญๅคฉๅŒ้ฃžๅ›ขxๅคฉxxxxๅ…ƒ/ไบบ๏ผŒๆœ‰้œ€่ฆ่”็ณปๆˆ‘...\n631001 0 ๆน–ๅŒ—็”ตๆขฏไบ‹ๆ•…ๅ‰5ๅˆ†้’Ÿ็›‘ๆŽงๆ›ๅ…‰๏ผšไธคๅทฅไฝœไบบๅ‘˜้™ฉๅ ไธ‹\n631002 0 ่ฃๅˆคโ€œ้ป‘ๅๅ•โ€็š„ๆŽจๅ‡บๅด็•ฅๆœ‰ๆปžๅŽ\n631003 0 ๅ—ไบฌๅธ‚ไบบๆฐ‘ๆฃ€ๅฏŸ้™ขไพๆณ•ๅฏนๅ—ไบฌๅธ‚ๅ…ญๅˆๅŒบๅŽŸๆ–ฐ็ฏ้•‡ๆกฅ็Ž‹ๆ‘ๆ‘ไธปไปปๆž—ๅบญๆœไปฅๆถ‰ๅซŒ่ดชๆฑก็ฝชๅ†ณๅฎš้€ฎๆ•\n631004 0 ๆญ็€่—ๆฐ‘็š„ไธ‰่ฝฎ่ฝฆๅœจxxxๅ›ฝ้“ไธŠ้ฃž้ฉฐ\n1000 Processed\n classify content\n631500 0 2XUๅฅณๅฃซ้“ไบบไธ‰้กน่ฟๅŠจๅŽ‹็ผฉ็Ÿญ่ฃค\n631501 0 CelticWaterไธบ่‚Œ่‚คๆณจๅ…ฅๆ— ้™ๆฐดๅˆ†\n631502 0 ไป–ๅฆˆ้ ็€็ฉท็ญ‰็€ๆ”ฟๅบœๅ‘ๆ•‘ๅŠฉ่ฟ‡ๆ—ฅๅญ\n631503 0 2015ๅนด้ข„่ฎกGDPๆ˜ฏ65ไธ‡ไบฟ\n631504 0 ๅŽŸไปท่ฝฌไธคๅผ 
ๅ‘จๆฐไผฆ่‹ๅทžๆผ”ๅ”ฑไผš้—จ็ฅจ\n1000 Processed\n classify content\n632000 0 ่ฟ™ไธŽๅŽไธฅๅฏบ็š„ๅปบ็ญ‘ๅธƒๅฑ€ๆœ‰่Žซๅคง็š„ๅ…ณ็ณป\n632001 0 ๆฎไธๅฎŒๅ…จ็ปŸ่ฎกๆˆ‘ๅ›ฝๆœ‰xxxไธ‡ๅฎ˜ๅ‘˜็š„ๅฆปๅญๅ„ฟๅฅณๅ…จ้ƒจ็งปๆฐ‘ๅ›ฝๅค–\n632002 0 ้˜ฟ้‡ŒๅทดๅทดๆŠŠไธๅ’Œ่ކ็”ฐไบบๅˆไฝœๅ†™่ฟ›ไบ†ๅ…ฌๅธ่ง„ๅฎš็ญ‰็ญ‰ๅœฐๅŸŸๆญง่ง†\n632003 0 ๆœ‰้œ€่ฆไบ†่งฃ่ฏฆ็ป†ๆƒ…ๅ†ต็š„ๅฏไปฅ้šๆ—ถ่”็ณป่”็ณปไบบ๏ผš้ƒ‘่€ๅธˆ่”็ณป็”ต่ฏ๏ผš13001057642\n632004 0 ๆฏๅคฉๅœจๅฎถไธๆ˜ฏไธ็ป™ๆˆ‘็Žฉๆ‰‹ๆœบๅฐฑๆ˜ฏไธ็ป™ๆˆ‘็Žฉ็”ต่„‘\n1000 Processed\n classify content\n632500 0 ๅŸบไบŽๅ…ณ็ณป็š„ๅ•†ไธšๆจกๅผๆ˜ฏๆ—ถ้—ด็š„ๆœ‹ๅ‹\n632501 0 ๆˆ‘ๅคงๅงจๅฆˆๅคงๅงจๅคซๆตชๅˆฐๆ— ้”กๅŽปๅ•ฆ\n632502 0 ๅˆšๅˆšไธป่ฃๅˆคๆ˜ฏ่ฐƒๆˆไบ†ไฝฉไฝฉๅ—ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n632503 1 ้€่ƒธ่กฃไธ€ไปถ๏ผŒไนฐไธ‰ไปถ้€่ƒธ่กฃไธคไปถ๏ผŒไนฐๅพ—ๅคš้€ๅพ—ๅคš๏ผŒ\n632504 0 Minoๅ…ˆ็”Ÿๅฐ†่ง†้ข‘ไธŠไผ Youtube\n1000 Processed\n classify content\n633000 0 ่ฏดๆฑŸ่‹ไธ€ๅญ•ๅฆ‡ไบงไธ‹ๆœ€่ฝป้พ™ๅ‡ค่ƒŽ\n633001 0 IMFๅœจๆœชๆฅๆ•ฐๆœˆ็”š่‡ณไธ€ๅนดๆˆ–ไธไผšๅ‚ไธŽๅฏน่ฏฅๅ›ฝ็š„็ฌฌไธ‰่ฝฎๆดๅŠฉ\n633002 0 โ†“โ†“Ps๏ผšๅคง็ƒญๅคฉ็š„้ƒฝๅ‡‘ไป€ไนˆ็ƒญ้—น\n633003 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณlive็ฉบ้™ๅ—ไบฌๆฐดๆธธๅŸŽ\n633004 0 ๆˆ‘็”จๆœ็‹—่พ“ๅ…ฅๆณ•ๆ‰“ไบ†17492ๅญ—\n1000 Processed\n classify content\n633500 0 ๅคฉๆดฅๅ’Œๅ—ไบฌไน‹้—ด็š„่ท็ฆปๆ˜ฏๆˆ‘ไธ่ƒฝๆ‰ฟๅ—็š„็ดฏ\n633501 0 1981ๅนดๆˆ‘ๆฒกๆฅไธญๅ›ฝไน‹ๅ‰ๆ˜ฏไป€ไนˆ็—‡็Šถ\n633502 0 ็—…ไบบ็š„็™ฝ็™œ้ฃŽ็–พ็—…ไนŸไธๆ˜ฏๅพˆๆ˜พ็€\n633503 0 ็ฎ€\n633504 0 ไผšๅ‘˜ๆƒ็›Š๏ผšๆ—ถ็งŸ็ซ‹ๅ‡20ๅ…ƒใ€ๆˆฟไปท92ๆŠ˜ใ€ๅปถๆ—ถ้€€ๆˆฟ่‡ณ13\n1000 Processed\n classify content\n634000 0 ่ฆๆฑ‚็š„่ฏ็œ‹้•ฟๅพฎๅšๅงw็ˆ†็…ง็š„่ฏๆœ‰ๆ„็งๆˆณๆˆ‘ๅŠ ไบ†QQๅŽๅฏไปฅๅ“Ÿ\n634001 0 ๆˆ‘ๅ‚ๅŠ ่ฟ‡ๅฅนๅฎถๅŽปๅนด็š„้€40ๅ…ƒๆดปๅŠจ\n634002 0 ไปŽ่€ŒไฝฟVOCๅซ้‡่ฟœไฝŽไบŽๅ›ฝๅฎถ่ง„ๅฎš\n634003 0 ่ญฆๅฏŸ่œ€้ป่ฆๅ’ŒๅŸŽ็ฎก็ปˆๆžPKๅ‘€\n634004 0 ๆต™ๆฑŸ็œๆธฉๅฒญๅธ‚ไบบๅฆ‚ๆžœไฝ ไปฌๆœ‰็†ๆƒณ็š„่ฏท็›ธไบ’่ฝฌๅ‘Š\n1000 Processed\n classify content\n634500 0 ่ฆไธไฝ ไปฌๅ›žๅŸŽ็ฎก็ˆธ็ˆธ้‚ฃ้‡ŒๅŽปๅง\n634501 0 ไปŠๅคฉ่€็ˆธ่ฏดๆ˜ŽๅคฉๅŒ–็–—็š„ๅฃฐ้Ÿณๅ“ๅฝ“ๅฝ“\n634502 0 ๅฐฑๅƒๆˆ‘ๆ‹ฟ่ตทๆ‰‹ๆœบๅˆทๅพฎๅšๅฐฑไผšๅฟ˜่ฎฐๅฝ“ๅˆๆ˜ฏๆƒณ้—ฎๅบฆๅจ˜ๆŸฅ่ต„ๆ–™ๆฅ็€\n634503 0 
่ฏทๅคงๅฎถ่ต„ๅŠฉๆˆ‘ๆœ‰ๅŠ›็š„ๅ‡บๅŠ›ๆœ‰้’ฑๅ‡บ้’ฑ่ฟ˜ๆœ‰่ฏทๅธฎๆˆ‘ๆ‰พไธ–็•ŒไธŠๆœ€ๅฅฝ็š„ๅŒป็”Ÿๅฟซ้€Ÿๅˆฐๆˆ‘ๅฎถๆœ€ๅŽ่ฏทๅคงๅฎถ่ฎฐไฝๆˆ‘ๆ˜ฏๆฐธ่ฟœ็ˆฑไฝ ไปฌ็š„\n634504 1 ๆ–ฐๅนดๆ–ฐๆฐ”่ฑก๏ผŒๆ„ฟๆ‚จๆœ‰ไธชๅฅฝๅฝข่ฑก๏ผŒๅฅฝๆถˆๆฏ๏ผŒๅฅฝๆถˆๆฏใ€‚ๅ“ๅฐšๅๅ“ๆ–ฐๆฌพไธๆ–ญไธŠๅธ‚๏ผŒๆœฌๅบ—ๆฃ‰่กฃ๏ผŒ่ฒ‚็ป’็บฟ่กฃๆ‰“ไบ”ๆŠ˜๏ผŒ...\n1000 Processed\n classify content\n635000 0 ๅ› ๆญคไบง็”Ÿ่…่ดฅใ€ๅˆ†้…ไธๅ…ฌใ€ๅ…ฌๆœ‰่ต„ไบงๆตๅคฑ็ญ‰้—ฎ้ข˜\n635001 0 ่Šไบ†ไบ›ๅ…ณไบŽ็”ฒๅฃณ่™ซ่ฝฆ่ดด่ฎพ่ฎกๅ’Œๆถ‚้ธฆ็š„้—ฎ้ข˜๏ฝž\n635002 0 BetteๅฏนTina่ฏด็š„้‚ฃๅฅโ€”โ€”โ€œๅฝ“ๆˆ‘ๆ‰ชๅฟƒ่‡ช้—ฎ\n635003 0 ๆ— ้”กๅž‹ๆๅˆ่ฏ„๏ผšไปŠไธปๆตๆŠฅไปทๆŒ็จณ\n635004 0 ๆฌฃๆ‰ฌ็‰นๅˆซ้’ˆๅฏน่ฝฆ่ฝฝ็”ต่„‘่ฎพ่ฎกๅฎŒๅค‡็š„ๆ™บ่ƒฝ็”ตๆบ็ฎก็†ๅญ็ณป็ปŸ่งฃๅ†ณๆ–นๆกˆ\n1000 Processed\n classify content\n635500 0 ่ถๅ‡บ้—จไนฐๆฐดๆฅๅˆฐไธ€ไธชๆ‰‹ๆœบๆœ‰็ฝ‘็š„ๅœฐๆ–น\n635501 0 ๅ…่ดนๅˆ†ไบซๆต™ๆฑŸxไปฝๅŽšๅถ่“้ธŸ\n635502 0 ๅคไปŠ็œŸ็›ธไน‹ๆญ็ง˜้ญๆ™‹้ฃŽ่Œƒไธๆ˜ฏไผ ่ฏดไธญ็š„้‚ฃไนˆ่ฟทไบบ\n635503 0 ๆ™ฎ้™€ๅฑฑไฝไบŽๆต™ๆฑŸ็œๆญๅทžๆนพไปฅไธœ็บฆ100ๆตท้‡Œ\n635504 0 ๅๆญฃๆˆ‘ๅฐฑๆ˜ฏ็œ‹ไธๆƒฏ่ฟ™็งๆฝœ่ง„ๅˆ™\n1000 Processed\n classify content\n636000 0 ไปŽๅฐๅญฉๅˆฐๅญ•ๅฆ‡ๅˆฐ100ๅฒ็š„่€ไบบ้ƒฝๅฏไปฅๆœ็”จ\n636001 1 ๆธฉ้ฆจๆ็คบ: ๅ„ไฝไผไธš่€ๆ€ปๆ‚จไปฌๅฅฝ๏ผŒๅ•ๅฃซๅ†›ใ€้ฉฌไธฝ้ข„็ฅไฝ ไปฌxxxxๅนด็”Ÿๆ„ๅ…ด้š†ใ€่ดขๆบๅนฟ่ฟ›ใ€‚ ๅœจๆ–ฐ็š„ไธ€...\n636002 1 ๆฑ•ๅคดๆด›ๅŸŽPARTY้…’ๅงๆฌข่ฟŽๆ‚จ๏ผๅฅขๅŽๅ…ธ้›…่ฃ…ไฟฎ๏ผŒ้กถ็บง้Ÿณๅ“่ฎพๅค‡๏ผŒ้ซ˜ไธญๆกฃๅคงๅฐๅŒ…ๅŽข๏ผŒๆฏๆ™šๅฅ‰็ŒฎไธๅŒ็ฒพๅฝฉๅจฑ...\n636003 0 ๆ˜ฏๆ—…ๆธธ่ง‚ๅ…‰ๅคงๅณก่ฐท็š„ๆœ€ไฝณๆ—ถๆœŸ\n636004 0 ่ฟ™ๆ ‡ๅฟ—็€่ฟ™ๆœฌไปฅ60ใ€70ๅนดไปฃ็Ÿฅ้’ๅปบ่ฎพๅ†œๅœบไธบไธป่ฆ้ข˜ๆ็š„ไธ“่พ‘็ผ–็บ‚ๅทฅไฝœๅทฒ่ฟ›ๅ…ฅๆ”ถๅฐพ้˜ถๆฎต\n1000 Processed\n classify content\n636500 0 ๅ—ไบฌ็š„้‚ฃไธ€ๅน•ๅœจ้‡ๅบ†้‡ๆผ”ไบ†ๅ—\n636501 0 ไธไผš่ฎพ่ฎก่กจๆ ผไธไผš็œ‹่กจๆ ผไธไผšๅˆ†ๆž็ปŸ่ฎก็ ”็ฉถๆ•ฐๆฎ\n636502 0 45ไธชๆ–ฐ็š„็งŸ่ต็‚น็”ฑๅธ‚ไบคๆŠ•ๆ–ฐๆˆ็ซ‹็š„ๅˆ่ต„ไผไธšๆ˜“ๅผ€ๅ…ฌๅธ่ดŸ่ดฃๅปบ่ฎพ\n636503 0 ๅ–œๆ”ถ่—??่ดชๅฎ˜็•™็ป™่‡ชๅทฑๅ—่ดฟ็ดข่ดฟ็š„โ€œๆš—้“โ€๏ผšๅ–œๆ”ถ่—่ดชๅฎ˜็•™็ป™่‡ชๅทฑๅ—่ดฟ็ดข่ดฟ็š„โ€œๆš—้“โ€ๆƒฏไบŽๅœจๅฎ˜ๅœบ่กŒ่ดฟ...\n636504 0 
ๆต™ๆฑŸๆฏ›ๅฎถ่ฏดๆต™ๆฑŸ็š„ๆ˜ฏ็™ฝ็Œซๆน–ๅ—็š„ๆ˜ฏ้ป‘็Œซ\n1000 Processed\n classify content\n637000 0 โ€ๅˆฐๆ—ถ็œ‹็œ‹Edๅˆฐๅบ•้‚€่ฏทไบ†ๅคšๅฐ‘ไบบๅŽปๆธฉๅธƒๅˆฉๅ‘ข\n637001 1 ๆ‚จๅฅฝ๏ผŒๆฌข่ฟŽ่‡ด็”ตๆตทๅฎ‰่ˆน่ˆถ็‰ฉ่ต„ๆœ‰้™ๅ…ฌๅธ๏ผŒๆœฌๅ…ฌๅธไธ“ไธšๆ‰นๅ‘ๅ„็ง่ง„ๆ ผๆธ”็”จ้’ขไธ็ปณ๏ผŒๅŠๆœบ๏ผŒๆ‰“ๆกฉ๏ผŒ็”ตๆขฏ๏ผŒๅก”ๅŠ...\n637002 0 ๆœฌ่ต›ๅญฃ็ป“ๆžœgoogleๅฎŒ่ƒœ\n637003 0 ่ƒŽ็›˜็ด ๅŽŸๆถฒ?10%็š„ๅŒ–ๅฆ†ๆฐดๆž„ๆˆ\n637004 0 ๅœจๅ—ไบฌ็š„ๅ„ๅคง่ก—้“ไธŠ่ตฐ็€ๅ็€่ฝฆ\n1000 Processed\n classify content\n637500 0 comๅฅถใ€ๅฅถ้…ชใ€้…ธๅฅถโ€ฆโ€ฆ่ฅฟๆ–น็š„ๅฅถๅˆถๅ“ไธšๅฆ‚\n637501 0 ็›ฎๅ‰็”จๆˆทๅฏไปฅ้€š่ฟ‡WindowsStoreๅบ”็”จๅ•†ๅบ—ไธ‹่ฝฝ่ฟ™\n637502 0 ๅธƒ่พพๆ‹‰ๅฎซ็š„ไธปไฝ“ๅปบ็ญ‘ไธบ็™ฝๅฎซๅ’Œ็บขๅฎซไธค้ƒจๅˆ†\n637503 0 ็™พๅบฆไธ€ไธ‹ๅ‘็Žฐ้ข˜ๆ่ฟ˜ๆ˜ฏๆŒบๆฒ‰้‡็š„\n637504 0 ๅฏน้…’้ฉพ่ฟๆณ•่กŒไธบ็š„ๅธธๆ€ๅŒ–ไธ“้กนๆ•ดๆฒป\n1000 Processed\n classify content\n638000 0 ไปŠๅคฉๅพฎ่ฝฏๅ†ไธบWindows10Build10240ๆŽจๅ‡บไธ€้กนๅฎ‰ๅ…จๆ›ดๆ–ฐ\n638001 0 ๆœฌไบบ7ๆœˆ9ๆ—ฅๅœจ่‹ๅทž47่ทฏๅ…ฌไบค่ฝฆ้‡ๅˆฐไธ€ไฝๅฅณๅญฉ\n638002 0 ๅŒ—ไบฌๆบๆ‰‹ๅผ ๅฎถๅฃ็”ณๅŠž2022ๅนดๅ†ฌๅฅฅไผšๆˆๅŠŸ\n638003 0 ็”ณๆ˜Žไนฆๅ—้€šๅธ‚ไฝๆˆฟไฟ้šœๅ’Œๆˆฟไบง็ฎก็†ๅฑ€\n638004 0 ่ฎพ่ฎกๅธˆ็ป™ๆฏๅชๆฏๅญ้ƒฝๅŠ ไธŠไธ€ไธช90ๅบฆ่ง’\n1000 Processed\n classify content\n638500 1 ๅ…ณๆณจๅฐๆพ๏ผŒๅŠฉๆ‚จๆˆๅŠŸ๏ผ่ดญไนฐๅฐๆพ๏ผŒๆ”พๅฟƒๆ–ฝๅทฅ๏ผ่ดจไฟxๅนดxxxxxๅฐๆ—ถ็š„่ถ…้•ฟ่ดจไฟ่ฎฉไฝ ๆ›ดๆ”พๅฟƒไฝฟ็”จ๏ผŒๆ›ดๅคš...\n638501 0 ๆˆ‘่ง‰ๅพ—ๆฒกๆœ‰ๅฟ…่ฆๆŠŠๅœฐ้“ๆธฉๅบฆ่ฐƒ่ฟ™ไนˆไฝŽ\n638502 0 ไปŠๆ—ฅ่ตฐๅŠฟ๏ผšๆฌง็พŽ่‚กๆŒ‡ๆ•ดไฝ“่ตฐ้ซ˜\n638503 0 ๆฏๅ‘จๆ•ท3ๆฌก่ƒฝๅคŸ่ฎฉ็šฎ่‚คๅ˜ๅพ—็™ฝๆš‚็ป†่…ป\n638504 0 ๅฆˆ็š„ๅฏน็€็”ต่„‘ไธ€ๅคฉไบ†ๅ„็ง็™พๅบฆๅ„็งๆ•™ๅญฆๅฐฑๆ˜ฏ็”จmatlabๅˆ†ๆžไธๅ‡บๆฅๅ‡ฝๆ•ฐไนŸไธไผš็”จ่€ๅญ่ฆ็–ฏไบ†\n1000 Processed\n classify content\n639000 0 ็”ฑๆฑŸ่‹ๅฎœๅ…ดไธญๅ›ฝ็Žฏไฟ็ง‘ๆŠ€ๅทฅไธšๅ›ญ็ฎกๅง”ไผš\n639001 0 ไบ‹ๅŽไป–ๅ‘Š่ญฆๆ–น่ฟๆณ•ไธ€ๅฎก่ขซๆณ•้™ข้ฉณๅ›ž\n639002 0 1็š„็บขๅŒ…่ฎฉๆˆ‘ๆ„Ÿๅ—ไธ€ไธ‹่Š‚ๆ—ฅ็š„ๆฐ”ๆฐ›\n639003 0 ไธ€่ฟ่กŒๅฐฑๆ็คบ็ผบๅฐ‘d3dx***\n639004 0 ้‚ฃไนˆๅฐฑไผšๅ‡บ็ŽฐMACD็ฐ‡ๅœจ่ฏฅๅทจ้‡ๅคงๅ•ไฝ็ฝฎๅค„\n1000 Processed\n classify content\n639500 0 
่ฎค่ฏไฟกๆฏไธบโ€œๆต™ๆฑŸๅคง้‘ซๅ•†ๅ“็ปๆœ‰้™ๅ…ฌๅธIT้ƒจ็ป็†โ€\n639501 0 ็œ้’ๅฐ‘ๅนดๅ‘ๅฑ•ๅŸบ้‡‘ไผš็ง˜ไนฆ้•ฟใ€็œ้’ๅนดๅˆ›ไธšๅฐฑไธšๆœๅŠกไธญๅฟƒไธปไปปไปปๆ–Œใ€็œ้’ๅฐ‘ๅนดๅ‘ๅฑ•ๅŸบ้‡‘ไผšๅ‰ฏ็ง˜ไนฆ้•ฟ็ฝ—ๆœๆ’ไธ€...\n639502 0 ้‡‘่žใ€ๆถˆ่ดน้ข†ๅŸŸๅ…ทๆœ‰่‰ฏๅฅฝไธš็ปฉๆ”ฏๆŒ็š„ๆŠ•่ต„ๆ ‡็š„\n639503 0 ๆ˜ฏๅ›ฝๅฎถโ€œ211ๅทฅ็จ‹โ€ๅ’Œโ€œ985โ€ฆ\n639504 0 ไผ˜็ง€็š„ๅ•†ไธšไธๆ˜ฏ่ง„ๆจก่€Œๆ˜ฏไผ ๆ‰ฟๅนธ็ฆไธŽๅ“่ดจ\n1000 Processed\n classify content\n640000 0 ๆฒชๆทฑไธคๅธ‚ๆต้€šๅธ‚ๅ€ผๆŠฅ405231ไบฟๅ…ƒ\n640001 0 ๆˆ‘ไป–ๅฆˆ่ฆ่ขซ่ฐทๆญŒ็บธ็›’้ป‘็ง‘ๆŠ€ๅ“่ทชไบ†ๆˆ‘ๆ“\n640002 0 ็™พๅบฆ้‡Œ้‚ฃไธช่ฏดๅฏไปฅ็”จ้…ธๅฅถไปฃๆ›ฟ้…ตๆฏ่’ธ้ฆ’ๅคด็š„\n640003 0 ็œ‹ๆฅไธ“่ฝฆๆ˜ฏๅƒไบ†ๅž„ๆ–ญ่กŒไธš็š„่›‹็ณ•\n640004 0 ๅˆฐ็›ฎๅ‰ไธบๆญขๅพทๆธ…ๅŽฟๅทฒๆœ‰36ไธชๆœๅŠก็ซ™็‚น\n1000 Processed\n classify content\n640500 0 ไฝ†้ข้ƒจๆฐด่‚ฟไนŸๆ˜ฏๅพˆๅคš็–พ็—…็š„็—‡็Šถ\n640501 0 BigBang่ฟ™ๆฌกๅ›žๅฝ’ๅŠจ้™้™่€ๅฎžๅคงๅ•Š\n640502 0 ็Žฐๅœจๅคฉๅคฉๆ‰’็€็”ต่ง†็œ‹่Šฑๅƒ้ชจ\n640503 0 ๅœจ่€ƒ่™‘่ฟ™ไธชๆœˆไธŠๅ“ชๆ—…ๆธธ็Žฉโ€ฆ\n640504 0 ๅชๅ› ไฝ ่ฏดๆ‰‹ๆœบๆ”พ่„‘ๆดžๆ—ๅ……็”ตไผš็ˆ†็‚ธๅพ—่„‘็™Œ\n1000 Processed\n classify content\n641000 0 ไธญๅ›ฝ่ˆช็ฉบๅทฅไธš้ƒจ้—จๅœจxDๆ‰“ๅฐ็š„้ƒจๅˆ†ๅบ”็”จ้ข†ๅŸŸๅทฒ่ฟ›\n641001 0 HCไธ‡่‰ฒๆฐดๆฏ็ฑปไบบ่ƒถๅŽŸ่›‹็™ฝ็ณปๅˆ—ไบงๅ“็”ŸไบงๅŽ‚ๅฎถ่ฅฟๅฎ‰ๅทจๅญ็”Ÿ็‰ฉๅŸบๅ› ๆŠ€ๆœฏ่‚กไปฝๆœ‰้™ๅ…ฌๅธ็š„โ€œๅ›ฝๅฎถ็”Ÿ็‰ฉๆๆ–™ๅทฅ็จ‹...\n641002 0 ๆ—ฅๆœฌ่ฟ™ๆฌพMUJIๆ— ๅฐ่‰ฏๅ“้˜ณๅ…‰ๆจฑ่Šฑไฟๆนฟๆป‹ๆถฆๅŒ–ๅฆ†ๆฐด400ML็š„ๅŽปๅนด้™้‡ๅ‘ๅ”ฎ็š„ๆˆ‘ๅ›คไบ†10็“ถ\n641003 0 ๅนถ่ฎกๅˆ’ๅˆ†ๆ‹จ่ž่ต„้ขๅบฆๅ†…ไธๅฐ‘ไบŽ100ไบฟๅ…ƒ่ฟ›่กŒๅนถ่ดญๅˆไฝœ\n641004 0 ไฝ ็Ÿฅไธ็Ÿฅ้“KFCๆœ‰ๅ‡ ไธชไบบๆœ›ไฝไฝ ๅ•Š\n1000 Processed\n classify content\n641500 0 ไบบ็”Ÿ็š„็ฌฌไธ€ๆญฅๅ•Š~่ฐ้ƒฝๆƒณ่‡ชๅทฑ็š„ๅญฉๅญๅฟตไธ€ๆ‰€ๅฅฝ็š„ๅญฆๆ ก\n641501 0 ๅฅฝๅƒ่ฟ˜ๆœ‰ๅฅฝๅคš็š„ๆˆ‘่ฟฝ้€็š„ไบบๅฆ‚ไปŠ็™ป้™†ไธŠๅŽปๆ‰ๅ‘็Žฐๆ—ถ้—ดๆ”นๅ˜ไบ†ๅคชๅคš้‚ฃไบ›็พŽๅฅฝ็š„ๅ›žๅฟ†ๆŒบๅฏๆƒœ็š„ไธ่ฟ‡่ฟ˜ๅฅฝๆœ‰็‚นๅฟตๆƒณๅœจ\n641502 0 ๅŒ…ๆ‹ฌ็ซ็ฎญใ€่ˆชๅคฉ้ฃžๆœบ็ญ‰่ˆชๅคฉ่ฟ่พ“ๅ™จๅŠๅ…ถ็ป„ไปถใ€ๅ…ƒๅ™จไปถ\n641503 0 ๆŠฅ้“ๅผ•่ฟฐ็พŽๅ›ฝๆ”ฟๅบœๅฎ˜ๅ‘˜ใ€ๅŠๅค–ไบคๆถˆๆฏไบบๅฃซ\n641504 0 
3ใ€่ฅฟๅฎ‰ๅ‡บ็งŸ่ฝฆ่ตทๆญฅ่ฐƒๅˆฐๅๅ…ƒไน˜ๅฎข๏ผšๆ้ซ˜ๆœๅŠก่ดจ้‡\n1000 Processed\n classify content\n642000 0 ๆฒณๅ—ใ€ๆฑŸ่‹็ญ‰xไธชไธปไบงๅŒบๅ„็ฑป็ฒฎ้ฃŸไผไธšๆ”ถ่ดญๆ–ฐไบงๅฐ้บฆxxxxไธ‡ๅจ\n642001 0 ๅšไธๅˆฐwonbincๅคง็ฅž็š„ๆ่ฟฐๆŽจ่\n642002 0 ไฝฟๅพ—xxxxๅนดๅบฆๅšๆœ›ๅŒบๆ•™่‚ฒ็ป่ดนๅˆ็†ใ€่ง„่Œƒใ€ๆœ‰ๆ•ˆๅœฐไฝฟ็”จ\n642003 0 ๆบงๆฐดๆœ‰ไธ€ไธชๆด‹ๆฐ”็š„ๅ๏ผš้œๅฒๅฐผ็Ž›ยท็ƒญๅพท็Ž›ยทไผŠๆฏ”\n642004 0 ๅธธๅทžๅธ‚็Žฏๅขƒ็›‘ๆต‹ไธญๅฟƒใ€ๅธธๅทžๅธ‚ๆฐ”่ฑกๅฐ2015ๅนด7ๆœˆ31ๆ—ฅ15ๆ—ถ่”ๅˆๅ‘ๅธƒ\n1000 Processed\n classify content\n642500 0 ่ฟ˜ๆœ‰ๅ‡ ไธช็›ฎ็š„๏ผš่ฎฉๆœบๅ™จไบบๅˆถ้€ ่€…ไธไป…ๅœจไธ“้•ฟ้ข†ๅŸŸๅ‘ๅฑ•\n642501 0 2015ๅนด7ๆœˆ18ๆ—ฅๆ˜ฏๅ„ฟๅญ็ฌฌไธ€ๆฌกๅ้ฃžๆœบ\n642502 0 ๅคง็›˜ๅˆ†ๆ—ถ+ๆฟๅ—ๆถจ่ทŒ+ไบ”ๅˆ†้’Ÿๅผ‚ๅŠจ+่‡ช้€‰่‚ก\n642503 0 ไธ‹ๅˆ้‡่ฃ…ไบ†็”ต่„‘ไธ‹่ฝฝๅฎ˜็ฝ‘้ฉฑๅŠจๅคชๆ…ขๅฐฑๅˆ็”จไบ†้ฉฑๅŠจ็ฒพ็ต\n642504 1 ๆŠ˜๏ผŒๅฝฉๅฆ†xxx้€xxๅ…ƒ็š„๏ผŒ็พŽๅณ้ข่†œxxๅ…ƒxxไปถ่ฟ›ๅบ—ๅฐฑๆœ‰ๅฐ็คผๅ“็›ธ้€ๅ“ฆ๏ผŒๆœ‰ๆ—ถ้—ดๆ‚จๅฏไปฅ่ฟ‡ๆฅ็œ‹็œ‹๏ผŒๆดป...\n1000 Processed\n classify content\n643000 0 xxๅฒ็”ทๅญ็ซ่ฝฆไธŠๅผบๅฅธxxๅฒๅฐ‘ๅฅณ\n643001 0 ไนŸๅฐฑๆ˜ฏไน‹ๅ‰็š„Win10Build10240\n643002 0 ็พŽๅ›ฝ่ˆชๅคฉๅฑ€NASAๅฎฃๅธƒๅ‘็Žฐๅฆไธ€ไธชโ€œๅœฐ็ƒโ€\n643003 0 ๅซๆ˜Ÿ๏ผšdingxiaoqixxxxxx\n643004 0 ๆ— ้”กๆฉ™ๅคฉๅ˜‰็ฆพๅฝฑๅŸŽๆ–ฐไน‹ๅŸŽๅบ—xๆœˆxๆ—ฅๆŽ’ๆœŸ\n1000 Processed\n classify content\n643500 0 com/168GrandIBMlivefeedๅ…ซๅคง่ฎฒๅธˆไธŽ็ฅž็ง˜ไธป่ฎฒไบบๅฐ†ไผšไธบ่ฟ™ๅœบๆดปๅŠจๅธฆๆฅๅฆไธ€ไธช้ซ˜ๅณฐ\n643501 0 ็„ถๅŽ้‚ฃไธชๅŒป็”Ÿโ€ฆโ€ฆ็ซŸ็„ถๅฐฑๆ˜ฏๆณฝๅก”็ผๆ–ฏๅ™ขโ€ฆโ€ฆๅไธ่™šไผ โ€ฆโ€ฆไธ€ๅ‡บๅœบ็ฎ€็›ดๅฟƒ้‡Œไธ€ๅŠจโ€ฆโ€ฆ\n643502 0 ไธๆ–™ๅœจไพฆ่ฎฏๅŽๅฅณๆ–น็ซŸ็„ถๆ€ช่ญฆๅฏŸ่œ€้ปโ€œ็ฎกๅคชๅคšโ€\n643503 0 ไปŽ7ๆœˆ29ๆ—ฅไธญๅˆ12็‚นๅˆฐ8ๆœˆ1ๆ—ฅ0็‚น\n643504 1 ไบฒ็ˆฑ็š„ๆœ‹ๅ‹๏ผŒๆˆ‘ๆœ€่ฟ‘ๅœจๅš้˜ฟ่ฟช่€ๅ…‹ไน”ไธน็ญ‰ๅ“็‰Œ้ž‹ไปฃ็†๏ผŒๅฆ‚ๆžœๆœ‰่ฟ™ๆ–น้ข็š„้œ€่ฆๅฏไปฅ่”็ณปๆˆ‘ใ€‚่ฐข่ฐขใ€‚ไฝ ๅฏไปฅไปŽ...\n1000 Processed\n classify content\n644000 0 ๆฑŸ่‹ๅซ่ง†ๅœจๆ’ญ็ปงๆ‰ฟ่€…ไปฌๅ›ฝ่ฏญ็š„ๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ่ฏดๅฎž่ฏๅฝ“ๆ—ถๆˆ‘ๅฐฑๆฒกๆ€Žไนˆ็œ‹\n644001 0 ไธŠๅ‘จไธ‰ไธดๆ—ถๅ…ด่ตท่ฎข็š„beachholidayๆ˜ŽๅคฉๅŽป่ฅฟ็ญ็‰™็š„Majorca\n644002 0 
ๆฑŸ่‹่ดจ็›‘ๅฑ€ๅผ€ๅง‹ไบ†ๅฏนๆฃ€้ชŒๆœบๆž„็š„ๅ…จ่ฟ‡็จ‹็›‘็ฎก\n644003 1 ๅณกๅฑฑ็”œ็Œซ็”œๅ“ไธบๅ›ž้ฆˆๅนฟๅคงๅฎขๆˆท๏ผŒ็Žฐ้š†้‡ๆŽจๅ‡บไผšๅ‘˜ๅกไผ˜ๆƒ ๆœๅŠก๏ผŒๅ‡กๅˆฐๆœฌๅบ—ๆถˆ่ดนๆปกxxๅ…ƒๅณ้€ไผšๅ‘˜ๅก๏ผŒไผšๅ‘˜็”จ...\n644004 0 xxxxๆ–ฐๆฌพๆฌง็พŽๆ—ถๅฐšๅŒ่‚ฉๆ˜พ็˜ฆๆทฑV้ข†ไฟฎ่บซ้ฑผๅฐพ็ฎ€็บฆ้’‰็ ๆ‹–ๅฐพๅฉš็บฑ็คผๆœ\n1000 Processed\n classify content\n644500 0 ๆˆ้•ฟ็ปˆๅฐ†ๆ˜ฏ่ฆ้ขๅฏน่ดจ็–‘๏ฝž่ดฃ้ช‚็š„\n644501 0 ไธ็Ÿฅ้“ๆ˜ฏ็Ÿฅ้“ไบ†็œŸ็›ธ็š„ไบบไผšๆŠ‘้ƒ\n644502 0 ไธ€ไธชๆœˆ1800ไฝ ๅฏไปฅๆ”’ไธชๆ—…ๆธธ็ป่ดนไธ€ๅคฉ่ตš100\n644503 0 ๆœ‰้œ€่ฆ็š„kittyๆŽงๅฆˆๅฆˆๅฏๅƒไธ‡ๅˆซ้”™่ฟ‡\n644504 0 ๅ‘16ๅนดๆฅๆŽฅๅŠ›็…ง้กพ่‡ชๅทฑ็š„ๅ”ๅ”้˜ฟๅงจไปฌๆŠฅๅ–œ\n1000 Processed\n classify content\n645000 0 OMEGAๆฌง็ฑณ่Œ„ๆ˜Ÿๅบง็ณปๅˆ—ๅ…จๆ–ฐๅŒ่ฝด็”ทๅฃซ่…•่กจไธ“ไธบ็Žฐไปฃ็”ทๅฃซ่€Œๆ‰“้€ \n645001 0 xใ€ๅฐฝ้‡้€‰ๆ‹ฉๅคฉ็„ถไบงๅ“ๆ™š้œœๅพˆ้‡่ฆ\n645002 0 ๅณ่ตขๅพ—โ€œ650ไบฟ็พŽๅ…ƒCFOโ€็š„็พŽ่ช‰\n645003 0 ๆฏไบฒ้œ€่ฆ่›‹็™ฝ่ดจไพ›็ป™ๅญๅฎซใ€่ƒŽ็›˜ๅŠไนณๆˆฟ็š„ๅ‘่‚ฒ\n645004 0 ไฝ ๅ›ฝไบบๆœ€ๅ–œๆฌขๅ˜ฒ็ฌ‘ๅฐๅบฆๅผบๅฅธๅคงๅ›ฝ\n1000 Processed\n classify content\n645500 0 ๅปบๆž„ไธปไน‰็š„ๆๅ‡บๆœ‰ยทยทยทยทยทยท\n645501 0 ๅพˆๅคšไบบๆ˜ฏไธ€ๅคฉ่ตดๅฎดNไธช้€็บขๅŒ…Nไธช\n645502 0 ่ŽซไบšๅŒป็”Ÿๆ‹’ๆ”ถ+19ๅบŠๆ‚ฃ่€…ๅผ ๆˆ็މ็บขๅŒ…200ๅ…ƒ\n645503 0 ๆ—…ๆธธ็š„็ฌฌไธ€ๅคฉๅคง้ƒจๅˆ†้ฃžๆœบๅบฆ่ฟ‡\n645504 0 ๅฑๅน•็”ป้ขไธๅ˜ๅฃฐ้Ÿณ็ปง็ปญ30็ง’ๅŽๆขๅคๆญฃๅธธๆ’ญๆ”พ\n1000 Processed\n classify content\n646000 0 ๅ—ไบฌๆœŸๆœ›่ฟ™็งๆจกๅผ่ƒฝไธบ็ ด่งฃโ€œๅžƒๅœพๅ›ดๅŸŽโ€ๆ‰พๅˆฐๅ‡บ่ทฏ\n646001 0 ๆ–ฐ่ฅฟๅ…ฐ15ๅฒ็š„ๅญฆ็”Ÿๅœจ้˜…่ฏป่ƒฝๅŠ›ใ€ๆ•ฐๅญฆ่ƒฝๅŠ›ใ€็ง‘ๅญฆๅ’Œ่งฃๅ†ณ้—ฎ้ข˜็š„่ƒฝๅŠ›ๅ››ไธชๆ–น้ข่กจ็Žฐๅ“่‘—\n646002 1 ๅ–œ่ฟŽๅไบ”ๅ…ƒๅฎต่Š‚๏ผŒๅฑ…ๆ’่ฃ…้ฅฐๅฐๅด็ฅๆ‚จ๏ผŒ็พŠๅนด่กŒๅคง่ฟ๏ผŒๅ…จๅฎถๅฅๅบทๅฟซไนใ€‚ๅ‡กๅœจxๆœˆๆฅๅ…ฌๅธๅ’จ่ฏข่ฃ…ไฟฎไบ‹้กนๅ‡ๆœ‰ๅคง...\n646003 0 โ€œๆ‰พ่ฃ…ไฟฎๅ…ฌๅธๅฟ…้กปๅŒ…ๅทฅๅŒ…ๆ–™โ€โ€ฆโ€ฆๅœจ่ฃ…ไฟฎ่ฟ‡็จ‹ไธญ\n646004 0 ไปŠๅคฉๅœจๆ‰‹ๆœบๅบ—็œ‹่งไธ€ๅฏนไบ”ๅ…ญๅๅฒ็š„ๅคซๅฆ‡\n1000 Processed\n classify content\n646500 0 ไปŠๅคฉๅฎถ้—จๅฃๅฐๅ•†่ดฉ็š„ๆ‘Šๅญ้ƒฝ่ขซๅŸŽ็ฎก็ ธไบ†\n646501 0 ใ€Žๆน–ๅŒ—็”ตๆขฏๅžไบบไบ‹ๆ•…ๆญป่€…ไธˆๅคซๅ†™ไฟก็ฅญๅฅ ไบกๅฆปใ€\n646502 0 ๆ— 
้”ก่ทƒ่ฟ›็งฐๅ…ถๆ— ๆƒๅœจๅŽไฝฟ็”จโ€œ่ฅฟไบš็‰นโ€ๅ•†ๆ ‡\n646503 0 ๅฆ‚ๆžœไธ€็›ด่ฎฐๅพ—ไธ่ฆ็Žฉๆ‰‹ๆœบๅฐฑๅฅฝไบ†\n646504 0 โ€ไฝ†ๅฝ“ๆˆ‘้ ่ฟ‘ๅ‡†ๅค‡ๆ‹ๆ—ถๅฎƒ่ขซๆˆ‘ๅ“่ท‘ไบ†\n1000 Processed\n classify content\n647000 0 ๅ‘็Žฐ็”จๆ‘„ๅƒๅคดๅŽๆ‰‹ๆœบๅณไธŠ่ง’ไผšๅพˆ็ƒซ\n647001 0 7ๆœˆไปฝๅผ€็ฆๅŒบๅŸŽๅŸŽๅธ‚็ฎก็†ไธŽๆ‰งๆณ•้ƒจ้—จๅ…ฑๆš‚ๆ‰ฃๆธฃๅœŸ่ฝฆ57ๅฐ\n647002 0 ่ขซ้‡‘่žไธšๅ’Œ็ง‘ๅญฆ็ ”็ฉถๅ’ŒๆŠ€ๆœฏๆœๅŠกไธšๅ–ไปฃ\n647003 1 ไฝ ๅฅฝ๏ผไธŠๆ–ฐ่ดงๅ•ฆ๏ผ่ฟž่กฃ่ฃ™๏ผŒ็œŸไธไธŠ่กฃ๏ผŒ่กฌ่กฃ๏ผŒไธ…ๆค้ƒฝ้žๅธธไธ้”™๏ผŒ้ขœ่‰ฒ้ƒฝๅพˆๆผ‚ไบฎๅ“ฆใ€‚ๆœ‰ๆ—ถ้—ดๅฏไปฅ่ฟ‡ๆฅ็œ‹็œ‹ใ€‚...\n647004 0 ไปฅๅŠๅฃฐๅŠฟๆตฉๅคง็š„ๅ่…่ฟๅŠจ็š„ไธๆ–ญๆทฑๅ…ฅ\n1000 Processed\n classify content\n647500 0 ๆฝฎๆถŒๆตทๅคฉ้˜”ๆ‰ฌๅธ†ๆญฃๅฝ“ๆ—ถโ€”โ€”ๅ…จ็œ้กน็›ฎ่ง‚ๆ‘ฉๆดปๅŠจ้™‡ๅ—้กน็›ฎๅปบ่ฎพ็ปผ่ฟฐ\n647501 0 ไปŠๅนด็ฌฌ9ๅทๅฐ้ฃŽ็ฟ้ธฟไธญๅฟƒไบŽ7ๆœˆ11ๆ—ฅ23ๆ—ฅ็ฉฟ่ฟ‡็ปๅ…ดๅธ‚ๅŒบ็š„ๆฆ‚็އไธบ70%\n647502 0 ็Žฐๅœจๅ…จไธ–็•Œ็š„้‡‘่žใ€็ปๆตŽ็š„่ง‚ๅฟต้ƒฝๅ—ๅ‡ฏๆฉๆ–ฏโ€œๆถˆ่ดนๅˆบๆฟ€็”Ÿไบงโ€็†่ฎบ็š„ๅฝฑๅ“โ€ฆๅฏนไบŽ็‰ฉ่ดจ็š„ๆตช่ดนใ€็Žฏๅขƒ็š„ๆฑกๆŸ“...\n647503 0 ๆ˜จๅคฉ่ขซๆณ•้กพๅพ็”จ่ฏดไปŠๅŽปๆณ•้™ขๆฒกไธชๅฝฑๅ„ฟ\n647504 0 ๅฝ“ๆœˆๅ…จๅ›ฝๆœŸ่ดงๅธ‚ๅœบๆˆไบค้‡ไธบ262\n1000 Processed\n classify content\n648000 0 ๅคงๆ™šไธŠๅตๅˆฐๆ— ๆณ•ๅ…ฅ็กๆ‰“็ป™ๅŒ—ไบฌๆตทๆท€ๅŸŽ็ฎกไธพๆŠฅ็ซŸ็„ถๅฌๅฎŒไบ‹ๆƒ…ๅŽๅŽ‹ๆˆ‘็”ต่ฏ็ช็„ถ่ง‰ๅพ—ๅฟƒๅฏ’ๆ— ๆฏ”็Žฐๅœจ็ปง็ปญๅฟๅ—็€ๅฏน...\n648001 0 ๅฐ†ไบŽ9ๆœˆ19ๆ—ฅ่‡ณ21ๆ—ฅๅœจๅฑฑไธœ็œ่Žฑ่Šœๅธ‚้›ช้‡Žๆ—…ๆธธๅŒบ่ˆช็ฉบ็ง‘ๆŠ€ไฝ“่‚ฒๅ…ฌๅ›ญไธพ่กŒ\n648002 0 ๅ•†ไธšไบŒๆœŸ้กน็›ฎๆ˜ฏ็›ฎๅ‰่‹ๅทžๅŸŽๆœ€ๅคง็š„ๅ•†ไธš็ปผๅˆไฝ“้กน็›ฎ\n648003 1 ๆญ็ฅๆ‚จๅŠๅฎถไบบๆ–ฐๅนดไธ‡ไบ‹ๅฆ‚ๆ„๏ผไธญๅฐๅญฆๅ„็ง‘ไผ˜็ง€ๅœจๆ กๆ•™ๅธˆ้ƒฝๅฏไปฅๅธฎๆ‚จไป‹็ปใ€‚้œ€่ฆไธบๅญฉๅญ่ฏทไธŠ้—จxๅฏนx่พ…ๅฏผๅฎถ...\n648004 0 ๆ—ฉไธŠๅ‡บ้—จ็š„ๆ—ถๅ€™ๅฟ˜ไบ†ๆŠŠๆ‰‹ๆœบๅธฆไธŠๅฐฑ้”้—จ\n1000 Processed\n classify content\n648500 0 ไธ€ไธชๅ—ไบฌ้“ถ่กŒ้•ฟๆœŸๆŠ•่ต„ไบบ็š„็œ‹ๆณ•\n648501 0 ๏ผšLOVERUMI้˜ฒๆ™’้œœ็š„้‡่ฆๆ€ง้˜ฒๆ™’็œŸ็š„ๅพˆ้‡่ฆ\n648502 0 ไธ‰ๅฑ‚็žญๆœ›ๅฎคๅ†…ๆœ‰็Žฏ็ป•360ๅบฆ็š„็ฉบไธญๅ’–ๅ•กๅŽ…\n648503 0 ๆˆ‘ๅพ—ๆ”พไธ‹ๆ‰‹ๆœบ็ก่ง‰ไบ†??ไธ็„ถ็œŸๅพ—็žŽ\n648504 0 ๆ•ฐ็€ๅคดไธŠไธ€ๆžถๅˆไธ€ๆžถ็š„้ฃžๆœบ็ป่ฟ‡\n1000 
Processed\n classify content\n649000 0 ๅƒๅถ้…ธใ€่กฅ้’™ใ€่ฐƒ็†่บซไฝ“โ€ฆโ€ฆไฝ†่ฆ็œŸๆญฃๅšๅˆฐ็ง‘ๅญฆๅค‡ๅญ•็š„่ฏ\n649001 0 ๅงๅงๅœจ้Ÿฉๅ›ฝ่ฏป็ ”็ฉถ็”Ÿ็Žฐๅœจๅทฒ็ปๆฏ•ไธšไบ†\n649002 0 ไฝ†ๅ›žๆฅไปฅๅŽWๅ…ˆ็”Ÿๅพˆ็€ๆ€ฅๅดๅนถๆฒกๆœ‰ๆ€ชๅฅน\n649003 0 ็ป™ไบ†็™พๅบฆ็ณฏ็ฑณๅ’Œ็พŽๅ›ขๅ†ณ่ƒœ็š„ๆœบไผš\n649004 0 ็ปผๅˆๆกˆไปถๆƒ…ๅ†ตไพๆณ•ๅˆคๅ†ณ๏ผš่ขซๅ‘Šไบบ้ซ˜ๆŸ็Šฏ็›—็ชƒ็ฝช\n1000 Processed\n classify content\n649500 0 ๆณ•้™ขไธ€ๅฅ่ฏ๏ผšไบบๅ›žๆฅไบ†ๆˆ‘ไปฌ็ฎก\n649501 0 ่€ๅคฉๆ„Ÿ่ฐขๅŒป็”Ÿ่ฟ˜็ป™ไบฒ็ˆฑ็š„ไธ€ไธชๅฎŒๆ•ด็š„ๅฎถ\n649502 0 ๅœจๅปบ็ญ‘ๆ–ฝๅทฅไธญ็ปงๆ‰ฟๆญฃๆบๅœฐไบง็š„็ฒพๅ“ๆˆ˜็•ฅไผ ็ปŸ\n649503 1 ๅปบๆๅ›ข่ดญไผš๏ผŒๅƒๅทฆๅณๆฒ™ๅ‘ๅ‘€๏ผŒๆ–นๅคช็”ตๅ™จๅ•Š๏ผŒไธœ้น็“ท็ –โ€ฆไฟ่ฏๅ…จๅนดๆœ€ไฝŽ๏ผŒไผ˜ๆƒ ๅคšๅคš๏ผŒ็คผ็‰ฉๅคšๅคš๏ผŒ็‰น้‚€ๆ‚จๅ‰ๆฅ...\n649504 0 ๅฎถ้‡Œ็”ต่„‘็œŸๆ˜ฏๅก็š„่ฟ˜ไธๅฆ‚ๆ‰‹ๆœบ\n1000 Processed\n classify content\n650000 1 ๆ–ฐๅนดๅฅฝ๏ผๆˆ‘ๆ˜ฏ่ฟœๅคงๅนฟๅœบ็š„้”€ๅ”ฎๅฐๆŽxxxxxxxxxxx๏ผŒไธ€็›ดๅ’Œๆ‚จๆœ‰่”็ณป๏ผŒๆ–ฐๅนดๆœ‰็‰นๅˆซไผ˜ๆƒ ็š„็‰นไปทๆˆฟ...\n650001 0 ็ฆฝๅ…ฝๅŒป้™ข็ฆฝๅ…ฝๅŒป็”Ÿๅˆฉ็”จไบบไปฌๅฏนๅŠจ็‰ฉ็š„ๅŒๆƒ…ๅฟƒ\n650002 0 CHINADAILYๆ™šๆŠฅ7\n650003 0 ๅ‘ไธ–ไบบ่ฏๆ˜ŽWindows8ไน‹ๅŽไป–ไปฌๅนถๆฒกๆœ‰ไธ€้”™ๅ†้”™\n650004 0 ๅฐฑ่ฟž้ฃžๆœบ็š„้คไฝ ้ƒฝๅ‘ไบ†ไธคๆฌกไบ†\n1000 Processed\n classify content\n650500 0 ็™พๅบฆๅพ—ๅฎ‹ๅŸŽ่ทŸๅผ€ๅฐ็š„ๆธ…ๆ˜ŽไธŠๆฒณๅ›ญ็ฑปไผผ\n650501 0 ไบ’่”็ฝ‘+ๆฒน็ซ™็š„ๅนณๅฐๅ‡บ็Žฐๆ”นๅ˜ไบ†ไผ ็ปŸๅŠ ๆฒน็ซ™็š„ๆถˆ่ดน็”Ÿๆดปๅœบๆ™ฏ\n650502 0 ๆˆ‘่ก—้“้‚€่ฏทไบ†ๅ›ฝๅฎถๅฟƒ็†ๅ’จ่ฏขๅธˆโ€”โ€”ๅด็ซ‹็บขๅฅณๅฃซ็ป™ๆˆ‘ไปฌไธŠไธ€ๅ ‚โ€ๅฟƒ็†ๅฅๅบท่ฏพโ€œ\n650503 0 ๆฒฟ็บฟๅŸŽๅธ‚ๅ—้€šใ€ๆณฐๅทžใ€ๆ‰ฌๅทžๅฐ†่ฟˆๅ…ฅโ€œๅŠจ่ฝฆๆ—ถไปฃโ€\n650504 0 ็”ฑๆˆฟไบงๅ•†ๅˆฐๆŠ•่ต„ๅ•†ใ€ๅŸŽๅธ‚่ฟ่ฅๅ•†็š„่ง’่‰ฒ่ฝฌๅž‹\n1000 Processed\n classify content\n651000 0 ๅ› ไธบๆœฌ้‡‘ไป“ๆ˜ฏไธบไบ†ๅˆ†ไบซ2594่ฟ™ๅ‡ ๅนด็š„้ซ˜้€Ÿๆˆ้•ฟ\n651001 1 ไปŠๆ—ฅx.xx-xx.xxๅŸŽๅปบ่ฃ…้ฅฐๅœจๆญๆŠฅไบŒๆฅผไธพๅŠžxxxxๆ–ฐๆ˜ฅ็ฌฌไธ€ๅฑ•๏ผŒๆดปๅŠจๅทจไผ˜ๆƒ ๅŠๅŒ…x.xๆŠ˜ๅœจ้€...\n651002 0 ๅ“ชๅ‡ ็ง็—…ๅ› ่ƒฝๅฏผ่‡ด็บขๆ–‘็‹ผ็–ฎๅ‘็”Ÿ\n651003 0 ๅŠณๅŠ›ๅฃซๅš็š„ไธๆ˜ฏๆ‰‹่กจๆ˜ฏๅฅขไพˆๅ’Œ้ซ˜่ดต็š„ๆ„Ÿ่ง‰\n651004 0 ๅนถไธๆ˜ฏๅ› ไธบD็ซ™ๅšๅพ—ๅฅฝๆˆ–็ฎก็†ๅพ—ๅฅฝ\n1000 Processed\n 
classify content\n651500 0 โ€็”ทๅญๆƒŠ่ฏงๅœฐ้—ฎ๏ผšโ€œ้œ€่ฆๆŠŠๅฎƒไปฌๆŽฐๅผ€ๆ‰่ƒฝๅƒๅ—\n651501 0 ่ฃ่Žทไบ†ๆœฌๅนดๅบฆISPO่ฟๅŠจ่ฎพ่ฎกๅคงๅฅ–ไบšๆดฒไบงๅ“ๅนดๅบฆๅคงๅฅ–\n651502 0 ๅ…ณ่ตทๆฅไธ€็›ดๅผบๅฅธ็›ดๅˆฐ็”Ÿๅ‡บๅ„ฟๅญ\n651503 0 ๅธธๅทž่ฟ™ไธชๅœฐๆ–นไนŸๆ˜ฏๆ–ญไธไบ†่”็ณป็š„\n651504 1 ใ€‚่ฏพ็จ‹ๆœ‰ๅ„ฟ็ซฅ็”ปใ€ๅ›ฝ็”ปใ€ๅก้€š็”ปใ€็ด ๆใ€่‰ฒๅฝฉใ€ไนฆๆณ•็ญ‰๏ผŒๅœฐ็‚น ๏ผš่‹ไป™ๅŒ—่ทฏๅŽŸๆ•™่‚ฒๅญฆ้™ขๆ กๅŒบ ๏ผˆ่‹ไป™...\n1000 Processed\n classify content\n652000 0 ็œ‹็œ‹้˜ฟ้‡Œไบ‘ๆ˜ฏๆ€Žๆ ทๅธฎไฝ ๅˆทๅฑ็š„\n652001 0 ๆžธๆžๅฒ›ๅคง็Ž‹ๆฒ™ๆปฉxๆœˆxxๆ—ฅๆบบๆญปไธคๅๆธธๅฎข\n652002 0 youtellyourselftogๅˆ่ฆๅ‘ๅจๅ•ฆ\n652003 0 ๆญ็ง˜ๆจๆŸณโ€œxๆฌกๅฉšๅงปโ€่ƒŒๅŽ็œŸ็›ธ\n652004 0 ๅฏผ่‡ดๆˆ‘็”ต่„‘่‡ชๅŠจๅ…ณๆœบ้‡ๅฏๅˆฐ็Žฐๅœจ\n1000 Processed\n classify content\n652500 0 ๅ…จๅ›ฝๅ…ฑๆœ‰87ไธชๅŽฟ่ขซๅˆ—ไธบๆทฑๅŒ–ๅŽฟๅŸŽๅŸบ็ก€่ฎพๆ–ฝๆŠ•่ž่ต„ไฝ“ๅˆถๆ”น้ฉ่ฏ•็‚นๅŽฟ\n652501 0 ๅฅฝๅ–œๆฌขๆ€้˜ก้™Œๅ’Œ่Šฑๅƒ้ชจ็š„ๆˆไปฝๅ•Š\n652502 0 ๅšๅฎŒๅฝขๅกซ็ฉบ่ฟ›่กŒPerformanceReview\n652503 0 ๅฏน่Œไธšๅ’Œไฟกไปฐไบง็”Ÿๆทฑๆทฑ็š„่ดจ็–‘\n652504 0 ไป–ๅ› ็Šฏ็›—็ชƒ็ฝช่ขซไธœๅ…ดๅธ‚ๆณ•้™ขๅˆคๅค„ๆœ‰ๆœŸๅพ’ๅˆ‘xไธชๆœˆ\n1000 Processed\n classify content\n653000 0 8ๆœˆ1ๆ—ฅๅŒ—ๆ–‡ๅฎ‰ๆฐด้Ÿตๅ้ƒฝๅ‡ๅฆ‚้…’ๅบ—็”ตๆขฏ็ช็„ถๆŠŠๆˆ‘ไปฌๅคšไบบ้•ฟๆ—ถ้—ดๅ›ฐๅœจ็”ตๆขฏ้‡Œ\n653001 0 ไธ‰็ผบไธ€ๆ‰พไบบ่ตŒๅšๅฅณไบบไฝ•ๅ…ถๅคš\n653002 1 ใ€Šๆ–ฐไธ€่ดทใ€‹ๆœ€ๆ–ฐๆ”ป็•ฅ ็‰น็‚น๏ผš็›ฎๅ‰ไธบๆญข๏ผŒ ๅธ‚้ขไธŠๆ— ๆŠตๆŠผ็ฑป่ดทๆฌพๅˆฉๆฏๆœ€ไฝŽ๏ผŒ ไธ็”จๆŠตๆŠผ๏ผŒๆ— ้œ€ๆ‹…ไฟ๏ผŒ ๆœ€...\n653003 0 ็„ถ่€Œๅฝ“ๆˆ‘ๅŽปไธŠ็™พๅบฆๆ—ถๅดๆฏซๆ— ๆ‰€่Žท\n653004 0 ๅผๆŽhiๆŽๆ—ฅๆ—ฅๅ””ไฝฟๅš้‡Ž้‚„ๅƒ้€™้บผๅฅฝ\n1000 Processed\n classify content\n653500 0 ้‚ฃไบ›ๆ›พ็ป็›‘่ง†ใ€ๆฏ’ๆ‰“ใ€ๅผบๅฅธๅฅน็š„ไบบๅ’Œๆœบๆž„\n653501 0 ๆ‰‹ๆœบ้‡Œๆ‰€ๆœ‰ๅฏไปฅๅ‡็บง็š„ๅบ”็”จๆ›ดๆ–ฐ\n653502 0 ็ป“ๆžœๅ‘็Žฐ18ๅฐ็ป†่Œ่ถ…ๆ ‡ใ€ๅคง่‚ ๆ†่Œ่ถ…ๆ ‡\n653503 0 ็‚น่ฏ„๏ผš่ถŠๅ—ๅ•†ๆœบไธ€ๅฆ‚xxๅนดๅ‰็š„ไธญๅ›ฝ\n653504 0 ็”ท็ซฅๆ‰‹่‡‚ๅทๅ…ฅๆ‰ถๆขฏ่ฎฐไฝ่ฟ™ไบ›็”ตๆขฏๅธธ่ฏ†ๅฏๆ•‘ๅ‘ฝ\n1000 Processed\n classify content\n654000 0 ็ป™ไบŒๅฎๅฐzipbagๅธฆ็š„ไธ€ๆฎต็ฑณ็ฒ‰\n654001 1 ๅ“ฅไฝ ๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅ’Œไฝ ้€š่ฏ็š„้‡‘ๆบ้ผŽ็››ๅฎขๆˆท็ป็†็ซ 
ๆ™“็‡•๏ผŒๆœ€ๆ–ฐๆ”ฟ็ญ–๏ผšๆŠตๆŠผ่ดทๆฌพๅˆฉๆฏxๅŽ˜๏ผŒๆœ€้•ฟไบ”ๅนดๆœŸ๏ผŒๆฌข่ฟŽ...\n654002 0 ไนŸไธๆ˜ฏ้šไพฟๅฏไปฅไนฑ็”จใ€ๆปฅ็”จ็š„\n654003 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผŒๆˆ‘่กŒไปŠๆ—ฅๆŽจๅ‡บๅ…ƒๅฎต่Š‚็‰นๅˆซๆฌพ็†่ดขไบงๅ“๏ผŒๆญฃๅœจ็ƒญ้”€๏ผŒxxๅคฉ๏ผŒๆ”ถ็›Š็އx.x๏ผŒๆˆชๆญขไปŠๆ—ฅไธ‹...\n654004 0 ็Žฐๅœจๆฏๆฌกๅผ€็”ต่„‘็œ‹ไธๅˆฐ้‡‘ๅผ€็š„่„ธๆˆ‘้ƒฝ่ถ…็บงๆ…Œๅผ \n1000 Processed\n classify content\n654500 0 ็œ‹่ฟ‡่Šฑๅƒ้ชจ้‚ฃไธชๅฆ–้ญ”ๅŒ–ๆ„Ÿ่ง‰ๅฅฝๅƒๅพˆๅฅฝ็œ‹็š„ๆ ทๅญ\n654501 0 ่ฟ™ๆ˜ฏ็ปงๅŒ—ไบฌยทๆฒงๅทžๆธคๆตทๆ–ฐๅŒบ็”Ÿ็‰ฉๅŒป่ฏไบงไธšๅ›ญไน‹ๅŽ\n654502 0 ็™พๅบฆ็คพไผšๅŒ–ๅˆ†ไบซ็ป„ไปถๅฐ่ฃ…ไบ†ๆ–ฐๆตชๅพฎๅšใ€ไบบไบบ็ฝ‘ใ€ๅผ€ๅฟƒ็ฝ‘ใ€่…พ่ฎฏๅพฎๅšใ€QQ็ฉบ้—ดๅ’Œ่ดดๅง็ญ‰ๅนณๅฐ็š„ๆŽˆๆƒๅŠๅˆ†ไบซๅŠŸ่ƒฝ\n654503 0 ๅฏนไบŽ้˜ฟ้‡ŒๅทดๅทดไธŠ่Žทๅ–ๅ…่ดนๆต้‡ๅธฆๆฅ็š„ๆ”ถ็›Š\n654504 0 ๅฏนไป˜ๆฝœ่‰‡็š„ๅˆฉๅ™จๅฝ“็„ถ่ฟ˜ๆ˜ฏๆฝœ่‰‡\n1000 Processed\n classify content\n655000 1 ๅˆ›็ปด่‹ฑ่ฏญๅŸน่ฎญๅญฆๆ กๆญ็ฅๆ‚จๅ…ƒๅฎต่Š‚้˜–ๅฎถๆฌขไน๏ผŒๅนธ็ฆๅฎ‰ๅบท๏ผๆœฌๆ กไธ“ๆณจๅฐๅญฆๅŸน่ฎญ๏ผŒๅŠ›ๆฑ‚ไธ“ไธš๏ผŒๅณๆ—ฅๅผ€ๅง‹ๆŠฅๅ๏ผŒๅณ...\n655001 0 ๅŒป้™ขๅๅ››ๆฅผ็š„ไธ€ๅผ ๆ™ฎ้€š็—…ๅบŠไธŠ\n655002 0 ๆŠนไบ†bb้œœไนŸ้ฎไธไฝๅ‡นไธ‹ๅŽป็š„็œผๅœˆๅ’Œ็บข่‚ฟ็š„็œผ็šฎๆ˜จๆ™š่ฟ˜ๅ‘็ƒงไบ†ๅ…ˆๅœจๅงๅงๅฎถๅƒไธ€้กฟๅฅฝๅƒ็š„ไธ‹ๅˆๅ†ๅŽป่พ“ๆถฒๆœ€็ƒฆ...\n655003 0 ไบค้€š๏ผšๅ…ฌไบคxใ€xxๆˆ–xxx่ทฏ่‡ณๆบช้š็คพๅŒบ็ซ™\n655004 1 ๅฐŠๆ•ฌ็š„ๅฎขๆˆทๆ‚จๅฅฝ๏ผŒๅ‡กๆŽฅๅˆฐๆญค็Ÿญไฟก็š„ๅฎขๆˆทไบคxxxๅ…ƒ๏ผŒๅฏไบซๅ—ๅ…่ดน้€xGๆ‰‹ๆœบไธ€้ƒจๅ’Œxxๅ…ƒ่ฏ่ดนๅ’Œxxxx...\n1000 Processed\n classify content\n655500 0 140ๅƒๅญ—ๅ†™ๅ‡บไบ†ๅˆไธญๆ กๅ›ญ็š„ๅ‹ๆƒ…ๆ•…ไบ‹\n655501 0 ๅ› ไธบ่ทŸไบ†ไฝ xๅนดๅฑ…็„ถ่ตฐๅˆฐๅฐฝๅคดๆฒกๆœ‰ๆˆไธบๆ›ดๅฅฝ็š„ไบบๆ›ด็—›่‹ฆ\n655502 0 ่ฝฌๅ…จๆ–ฐforeolunaforallskintypes\n655503 0 ไปฅๅŽๅ›žๅ›ฝๅŽปๆ—…ๆธธๅชๅœจๅคงๅŸŽๅธ‚็Žฉ\n655504 0 ๅพๅทž้“œๅฑฑๅŒบๆŸณๆ–ฐ้•‡่Œƒๅฑฑๆ‘ๆœ‰ไธ€ๅทฅๅŽ‚ใ€ๅคœ้—ดๅทฅไฝœไบง็”Ÿๅทจๅคง็š„ๅ™ช้Ÿณ\n1000 Processed\n classify content\n656000 0 ๆฒกไบ‹ๆˆ‘ๅฐฑๅฏไปฅๆŒ‰ไธช็”ตๆขฏไธŠๅŽปๅš้ฅญๅš้ขๅŒ…ๅš่›‹็ณ•ๆ‰พๅฅน่Š่Šๅ…ซๅฆ่Šๆ—ถๅฐš่Šไฟๅ…ป\n656001 0 ๅš็”ต่„‘ๅฃ็บธๆ”พๅนป็ฏ็‰‡็ฎ€็›ดไธ่ฆๅคช่ตž\n656002 0 ไฝ ไปฌไธ่ง‰ๅพ—ๅผ€ๅง‹่ฃ…ไฟฎไปฅๅŽ้‚ฃ้’ฑ่Šฑ็š„ๅฐฑ่ทŸๆตๆฐดไธ€ๆ ท\n656003 0 
็Šฏ็ฝชๅซŒ็–‘ไบบๆŽๆŸๆŸไบŽxๆ—ฅ่ขซๆŠผ่งฃๅ›ž้’ๅฒ›\n656004 0 ๆˆ‘ไธๆƒณ้€š่ฟ‡่ฐทๆญŒไบ†่งฃ่ฟ™ๅบงๅŸŽๅธ‚๏ผšๅŽฆ้—จ\n1000 Processed\n classify content\n656500 0 ๅ› ๆญคๆ‰‹ๆœบไธŠ็š„ๅบ”็”จ็จ‹ๅบๅณไฝฟๆœ‰ๅฏ†็ \n656501 0 ไธบไบ†้€ๅฟซ้€’ไบš้ฉฌ้€Š็ป™็พŽๅ›ฝๆ— ไบบๆœบๅšไบ†ๅฅ—ไบค่ง„ๆ–นๆกˆๅŒ—ไบฌๆ—ถ้—ด7ๆœˆ29ๆ—ฅไธ‹ๅˆๆถˆๆฏ\n656502 0 4ๅคฉ3ๅคœ็Ž›ๅŠช้›จๆž—่กŒ่ตฐ่‰ฐ้™ฉๅนถ็ฒพๅฝฉ็€\n656503 0 ไธ€ไบ›ๅ…ณไบŽโ€œ็”ทๅฅณๆ‹็ˆฑไธญไธ€็‚นๅฐไบ‹โ€็š„่ง†้ข‘\n656504 0 ไปŠๅคฉๅœจ่‹ๅทžๆฐดไธŠไนๅ›ญ็Žฉไบ†ไธ€ไธ‹ๅˆ\n1000 Processed\n classify content\n657000 0 ไฝๆฒปไบš็†ๅทฅๅญฆ้™ขๆœบๅ™จไบบๅฎž้ชŒๅฎคไธปไปปไบจ้‡Œๅ…‹ยทๅ…‹้‡Œๆ–ฏๆป•ๆฃฎ่ฏด๏ผšโ€œไธญๅ›ฝ็ปๅކไบ†ๆœบๅ™จไบบ็ˆ†็‚ธๅผๅขž้•ฟ\n657001 1 ่ฟ่กŒ็…คๆฐ”xx้—จๅธ‚้ƒจ๏ผŒๆœฌๅ…ฌๅธ็ป่ฅ็Ÿณๆฒนๆฐ”๏ผŒ็…คๆฐ”็‚‰ใ€็ƒญๆฐดๅ™จใ€ๅ„็ง็‡ƒๆฐ”ๅ…ทๆ‰นๅ‘้›ถๅ”ฎ๏ผŒๅธ‚ๅ†…ๅ…่ดนๅฎ‰่ฃ…๏ผŒ้€่ดง...\n657002 0 ๆ—ฉไธŠไธ€ๆฏ่œ‚่œœๆฐดๆŽ’ๆฏ’\n657003 0 ๅŒป้™ข้—จๅฃ่ฃ…ไฟฎๅต็š„ๆˆ‘ๅฟซๅดฉๆบƒไบ†\n657004 0 ่ฟ˜ๆœ‰ไธ€ๆฌกๆ˜ฏ้—ฎๆขๅŒป็”Ÿ๏ผšๅฆ‚ๆžœๆˆ‘ๆ˜ฏไฝ ๅฅณๆœ‹ๅ‹\n1000 Processed\n classify content\n657500 0 ็Žฐๅœจ็œ็ซ‹ๅŒป้™ข่ฅฟๅŒบ2ๅทๆฅผ4ๅฑ‚ๅ„ฟ็ซฅ่ก€ๆถฒ่‚ฟ็˜ค็ง‘A02ๅทๅบŠๅฐฑ่ฏŠ\n657501 0 ่ฟ™ๆฌก้€š่ฟ‡ไธŽๆ— ้”กๅธ‚้‚ฎๆ”ฟ็ฎก็†ๅฑ€ๅฏนๆŽฅ\n657502 0 ๆˆ‘ไปฌๆŠคๅฃซ็”Ÿๆดป่ง„ๅพ‹ๅฐฑๆ˜ฏไธ็ฎก้ฅฟไธ้ฅฟ\n657503 0 ๆ‘้‡Œ็ป™็š„็ญ”ๅคๆ˜ฏไบคxxxๅ—้’ฑโ€˜ไธŠ็Žฏโ€™ๆŠผ้‡‘ๆ‰็ป™ไฝ ๅŠž่ฏ\n657504 0 ๅนธๅฅฝๆˆ‘็š„่ˆช็ญไนŸๆ™š็‚นไบ†2ไธชๅคšๅฐๆ—ถ\n1000 Processed\n classify content\n658000 0 ๅ—้€š็ง‘ๆŠ€sh600862\n658001 0 ๅฏไปฅๅˆบๆฟ€T็ป†่ƒžๅ’Œ่‡ช็„ถๆ€ไผค็ป†่ƒžๆ”ปๅ‡ป็™Œ็ป†่ƒž\n658002 0 2็บขๅŒ…ๆ‘‡ๅˆฐไบŒ๏ผš็”ตๅฝฑ็ฝ‘ๅ€้€ไฝ ๆ‘‡ๅˆฐไธ‰๏ผš้›ถ้ฃŸไนฐ100ๅ‡5ๅ—ๆ‘‡ๅˆฐๅ››๏ผšๆ‰“ๅญ—่ฝฏไปถๆ‘‡ๅˆฐไบ”๏ผšๆŸฅๅ•ๅˆ ่ฝฏไปถๆ‘‡ๅˆฐๅ…ญ...\n658003 0 ๆธฏ่‚ก้€šๆตๅ‡บ19ไบฟโ—‹่ทŒๅŠฟ่”“ๅปถ่‡ณๅ•†ๅ“ๆœŸ่ดง\n658004 1 ็‡•้ƒŠๆ–ฐไธ–็บชๅ•†ๅŸŽๆฐๅ…‹็ผๆ–ฏx.x-x.x ้•ฟ่ข–ๅŠ่ข–่กฌ่กฃxxๅ…ƒ่ตทๅค–ๅฅ—่ฅฟๆœxxxๅ…ƒ่ตท\n1000 Processed\n classify content\n658500 0 ๅฐŠๆ•ฌ็š„ๅ„ไฝ๏ผšๅ’–ๅ–ฑไธ€ๅฎถๅคฉ้ฉฌๅบ—ๆ˜Žๅคฉๅผ€ๅง‹xๆœˆxๆ—ฅ่‡ณxๆœˆxxๆ—ฅๅบ—้ขๅ‡็บง\n658501 0 ๅœจ\"ๅš่‡ชๅทฑ\"ๅ’Œ\"ๅ–ๆ‚ฆไป–ไบบ\"ไน‹้—ดๅฏปๆฑ‚ๅนณ่กก\n658502 0 โ€œไฝ ไปฌ่ฏดๆ˜ŽไนฆไนŸๆฒก่ฏด็”ต่„‘ไธ่ƒฝๆ‰‹ๆด—ๅ•Š\n658503 
0 ้˜ฟ้‡Œ็š„ๅบ”็”จ็š„ๅˆšๅฅฝไปŽๅฆไธ€ไธชๆ–นๅ‘\n658504 0 ็คพไผšๅฎžๆ‹ๆญฆๆœฏๅญฆ้™ข้ฃŸๅ ‚็”ทๅฅณๆ‰“ๆžถ\n1000 Processed\n classify content\n659000 0 ไป–็ซŸ็„ถๅŽป้…’ๅงไธ€ๆŽทๅƒ้‡‘ยทยทยทยทยท\n659001 0 ๅฅฝๅฃฐ้Ÿณ็ฌฌๅ››ๅญฃ็ฌฌไธ€ๆœŸไธญๅ”ฏไธ€่ฎฉไบบ่ฎฐไฝ็š„ๆๆ€•ๅฐฑๆ˜ฏๆŽๅฎ‰ไบ†\n659002 0 ่ฟ™้‡Œๅ˜…่ฃ…ไฟฎๆœ€็‰นๅˆซๅฐฑๆ˜ฏ่ฟ™ไธชๆ‘†่ฎพ\n659003 0 0ไธป็›ธๆœบไปฅๅŠ500ไธ‡ๅƒ็ด ๅ‰็ฝฎ็›ธๆœบ\n659004 1 ้˜ฟ้‡Œๅทดๅทด่ฏšไฟก้€šๅ‘จๅนดๅบ†ๆœˆ๏ผŒxxxxๅนดxๆœˆxๅทๅ…ฅ้ฉป้˜ฟ้‡Œๅทดๅทดๅฐ†ๆœ‰ๅคง็คผ่ต ้€๏ผŒไผไธšAPPไธ€ๅนด+xไธชๆœˆ่ฏข...\n1000 Processed\n classify content\n659500 0 ๆฃ้ผ“ๆ‰‹ๆœบ็š„ๆ—ถๅ€™ๅฟ˜ไบ†ๅค‡ไปฝ็…ง็‰‡\n659501 0 ๅทๆˆ‘้’ฑ็š„ๅฐๅทไปฅๅŽไธŠๅŽ•ๆ‰€ๆฒกๅธฆ็บธ\n659502 0 23ๆ—ฅๅฌๅผ€็š„ๆฑŸ่‹็‰ฉไปทๅฑ€้•ฟไผš่ฎฎไผ ๅ‡บ็š„่ฟ™ไธ€ๆ•ฐๅญ—\n659503 0 f2ๆ ‹ไธŽf3ๆ ‹ไน‹้—ดๆญไบ†ๅพˆๅคš้’ขๆžถ\n659504 0 ๅ‰้ฒ่ฟ™ไธ“ไธš้ฃžๆœบ็†Ÿๆ‚‰็š„ๆป‹ๅ‘ณ\n1000 Processed\n classify content\n660000 0 ๅœจๅฅฝๅฃฐ้Ÿณ่ฟ™็งๅฆ‚ๆญคไธปๆต็š„้€‰็ง€ๅนณๅฐ\n660001 1 ๅนณๅฎ‰ๆ˜“่ดทๅฏๅธฎๆ‚จ่Žทๅ–ๆ‰€้œ€่ต„้‡‘๏ผŒๆ— ้กปๆŠตๆŠผ๏ผŒๆ‰‹็ปญ็ฎ€ไพฟๅฟซ้€Ÿ๏ผŒๅฟซๆฅ็”ณ่ฏทๅง๏ผ๏ผŒๅœจ็”ณ่ฏทๆ—ถ่พ“ๅ…ฅๆˆ‘็š„้‚€่ฏท็ xx...\n660002 0 wearein้”กๆž—ๆ ผๅ‹’ๅŠจๆค็‰ฉ็ง‘ๆŠ€้ฆ†\n660003 0 ไบŒ็ปด็ ไธญๅŒ…ๅซๆœจ้ฉฌ็—…ๆฏ’้‚ฃๅฐฑๆ˜ฏๅฏๆ€•็š„ไบ‹ๆƒ…ไบ†\n660004 0 ๅนถ็บฆ็€ๆ™šไธŠไธ€่ตทๅ—จ๏ฝžๅ•Šๅ“ˆๅ“ˆๅ“ˆๅ“ˆๅ“ˆ\n1000 Processed\n classify content\n660500 0 ๆขฆ่งไฝ ็š„QQๅ็‰‡ไธŠๆ›ดๆ–ฐไบ†ไธคๅผ ็…ง็‰‡\n660501 0 ็œ‹็€็ฝ‘ไธŠ็š„ๆ—…ๆธธๆ”ป็•ฅ็œ‹็š„ๅคด็–ผ\n660502 0 ๅ‰ๅพ€ๅคฉๆดฅๆปจๆตทไธบๅ…ถๅˆ†่กŒๆ–ฐๅ€่€ƒๅฏŸ็ญ–ๅˆ’\n660503 0 ๅช่ฆ่ฟๆณ•ไนฑ็บชๅฐฑๅฟ…ๅฐ†ๅ—ๅˆฐๆƒฉ็ฝš\n660504 0 ๆˆ‘ไปฌๅฌๅฌxxx้ฆ–ๅธญๅทฅ็จ‹ๅธˆ้ƒ‘ๆ–‡ๅฝฌๆ€Žไนˆ่ฏด\n1000 Processed\n classify content\n661000 0 ๅคงๅฎถๆฏๅคฉๅฎ…ๅœจ็”ต่„‘ๅ‰ไนŸๅฏไปฅ่ตš็‚น้›ถ่Šฑ้’ฑ\n661001 0 ๆ˜จๅคฉๅœจ้“ถๅบง็ญ‰็”ตๅฝฑๅผ€ๅœบ็„ถๅŽๅŽปๅ’–ๅ•กๅบ—ไนฐไบ†ๅ–็š„ไธœ่ฅฟ่ทŸ่€ๆฟไธ่‡ช่ง‰็š„่ฏดไบ†ๆ™ฎ้€š่ฏ็„ถๅŽๆœ‹ๅ‹ๅฐฑ่ฏด่ฎฉๆˆ‘่ƒฝไธ่ƒฝไธ...\n661002 0 ไธ‹้ฃžๆœบๅŽๅœจๆŽ’้˜Ÿ็ญ‰passportcontrol\n661003 0 39ไบฟ็พŽๅ…ƒๆ”ถ่ดญๅฅๅบทๅบ”็”จRuntastic\n661004 0 slmไธ‰ๆ›ดๅŠๅคœๅœ็”ตๆžไป€ไนˆ้ฃžๆœบๅ•Šๅนฒ\n1000 Processed\n classify content\n661500 0 
ๅŒป็”Ÿ่ขซๅฅนไฝ“ๅ†…็š„่ถ…ๅคง่‚ฟ็˜คๅ“ไบ†ไธ€่ทณ\n661501 0 ๅทฒๆปก14ๅ‘จๅฒไธๆปก18ๅ‘จๅฒ็š„ๆœชๆˆๅนดไบบ็Šฏ็ฝชๅบ”ๅฝ“ไปŽ่ฝปๆˆ–่€…ๅ‡่ฝปๅค„็ฝš\n661502 0 ๅ›žๅฎถๅŽไธ€ๅคฉ24ไธชๅฐๆ—ถ็กไบ†18ๅฐๆ—ถ\n661503 1 ๆœ‰ๆˆฟๅฐฑๆœ‰้’ฑ๏ผŒx%ๅˆฉ็އๅ€Ÿๆฌพใ€‚่ต„้‡‘ๅ‘จ่ฝฌๅ›ฐ้šพ๏ผŸ้“พๅฎถๅœฐไบงๆŽจๅ‡บ๏ผšxไธ‡โ€”xxxxไธ‡๏ผŒ็ŸญๆœŸๅ€Ÿๆฌพใ€‚ๅช่ฆๅไธ‹ๆœ‰...\n661504 0 ็‚น่งฃๆˆๆ—ฅ่ฆ็›—็ชƒๆˆ‘ไธช่„–ๅ–ๅนฟๅ‘Š\n1000 Processed\n classify content\n662000 0 ็„ถๅŽๅผ€ๅง‹่ดจ็–‘๏ผšๆˆ‘่ฏฅ็›ธไฟก็ˆฑๆƒ…ๅ—\n662001 0 ็Šฏ็ฝช็š„ไบบไธ่ฏฅๆ‰ฟๆ‹…ไป–่กŒไธบ็š„ๅŽๆžœๅ—\n662002 1 ๅฎฃๅŸŽๅ•†ไน‹้ƒฝ็€่Žฑ้›…ไธ“ๆŸœ๏ผŒไธ‰ๅ…ซๅฆ‡ๅฅณ่Š‚็‰นๆƒ ๏ผŒ้™ค็‰นไปทๅฅ—็›’ๅค–๏ผŒๅ…จๅœบx.xๆŠ˜๏ผŒๆดปๅŠจๆ—ถ้—ดx.x๏ฝžxๆ—ฅ๏ผŒ่ฏฆๆƒ…...\n662003 0 12ๅฒ็š„ๅฐๅฎถไนๅ…ˆๅคฉ่ฝฏ้ชจ็˜คใ€ไธ€ไธช่…ฟ้•ฟไธ€ไธช่…ฟ็Ÿญ\n662004 1 ไบฒ็ˆฑ็š„ไผšๅ‘˜๏ผŒๅŒไปๅ ‚ๅŠไธšๅœบไธ“ๆŸœxๆœˆxๆ—ฅ่‡ณxๆ—ฅๅผ€ๅฑ•่†้œœไนฐxxx้€xx๏ผŒไธญ่ฏใ€่Šฑ่‰้ข่†œไนฐx้€x๏ผˆ้€...\n1000 Processed\n classify content\n662500 0 ๅ›ฝไบงๅŠจ็”ป่ขซๆŒ‡ๆŠ„่ขญ็พŽๅ›ฝๅคง็‰‡ๅฏผๆผ”้ช‚่ดจ็–‘่€…ๆ˜ฏๆฑ‰ๅฅธ\n662501 0 ใ€€ใ€€44ใ€้‚ปๅฑ…ๅฎถๅฐๅญฉ็บฆๆˆ‘ๅŽป็Žฉๆฐดๆžช\n662502 0 ็œๅ…ฌๅฎ‰ๅŽ…ๅ‘่จ€ไบบ๏ผšๆญคๆ— ่Šๅชไธพ็ป™ๆˆ‘ไปฌ่ญฆๆ–น็คพไผšๅ…จๅ›ฝๅ„ๆ—ไบบๆฐ‘ๅธฆๆฅๆžๅคง็š„ไนฐไธบๆƒ…โ€ฆ่ฏทๅ„ไฝๆฎตๅ‹่ฎค็œŸไบ‹ๆ€ไธฅ้‡ๆ€ง\n662503 0 ็š„็กฎไธบไป€ไนˆๆˆ‘ไปฌไผšไธบไบ†ๅ‡ ๅ—้’ฑๅœจๅคช้˜ณๅบ•ไธ‹ๅ’Œๅฐ่ดฉ่ฎจไปท่ฟ˜ไปทๅ‡ ๅ—้’ฑ่€Œๅœจๅ•†ๅœบ้‡Œๅ’Œๆœ‰้’ฑ็š„ๅ–ๅฎถไปŽไธ่ฎจไปท่ฟ˜ไปท\n662504 0 โ€้ขๅฏนๅคฑ่€Œๅคๅพ—็š„็ฌ”่ฎฐๆœฌ็”ต่„‘\n1000 Processed\n classify content\n663000 0 ่”ๅˆๅก่ฝฆไบŽ7ๆœˆ31ๆ—ฅๅœจๅ…ฌๅธ้ฃŸๅ ‚ไธพๅŠžโ€œๆ— ๅฟ็Œฎ่ก€\n663001 0 ๅˆšๆ‰ๆ‰ซไบ†็œผๆฑŸ่‹ๅซ่ง†้‚ฃไธช็ญ”้ข˜็š„่Š‚็›ฎ\n663002 1 xxxxxxxxxxxๅฐŠๆ•ฌ็š„็”จๆˆทๆ‚จๅฅฝ! 
ๆ‚จ็š„็งฏๅˆ†ๅทฒๆปก่ถณๅ…‘ๆข็Žฐ้‡‘ๅคง็คผๅŒ…ๆกไปถ๏ผŒ่ฏท็™ปๅฝ•wap.xx...\n663003 0 ๅพฎ่ฝฏไฝ ๅ†ไธๆŽจ้€WIN10ๆˆ‘ๅฐฑๅŽปROOTๅฎ‰ๅ“ไบ†\n663004 0 ๆทฑๅคœๆƒŠๅฅ‡ยท็งŸๆˆฟ็งŸๅˆฐๅดฉๆบƒโ€ฆไป€ไนˆไบบไป€ไนˆ้ฌผไป€ไนˆ็ฅž้ƒฝๆœ‰ๅ•Š\n1000 Processed\n classify content\n663500 0 2015ๅนด7ๆœˆ30ๆ—ฅๅฅ‡ๅฆ™็œŸ็›ธๆœ€้‡่ฆ็š„้—ฎ้ข˜็ฌฌๅ…ซ่พ‘ๆˆ‘ไปฌๅฏไปฅๅ’Œ็ฆ็›ธๅค„ๅ—\n663501 0 otherstories้ฃŽๆฌง็พŽๅŽŸๅฎฟๅคๅคๆž็ฎ€็™ฝ่‰ฒๆพ็Ÿณๅคง็†็Ÿณ็บน่€ณ้’‰ไธ€ๅฏนไปท\n663502 0 ๅฏไปฅๆŠ›ๅผƒWindows7ไบ†๏ฝžไธ€ไผšๅฎ‰่ฃ…Office2016\n663503 0 ๅฐไผ™็Ž‹ๆŸไธบๅฅณไบบๆŠขๅŠซ้‡‘ๅบ—้€ƒ่ท‘ๆ—ถๆฏxxx็ฑณๆขไธ€่พ†่ฝฆ\n663504 0 ้™ๅฎšๅญ™ๅญxxๅ‘จๅฒๅ‰ๅฏๆฏๆœˆๆŽขๆœ›ไธ€ๆฌก\n1000 Processed\n classify content\n664000 0 ๅšไบ†่‚กๆŒ‡ๆœŸ่ดง้…่ต„ๅŽไนŸ่ƒฝ่ตš้’ฑไบ†\n664001 0 ๅœจๆ‰‹ๆœบ่กŒไธšๆœ‰็š„ๆ€ง่ƒฝๅพˆๅฅฝ็š„ๅ›ฝไบงๆ‰‹ๆœบ\n664002 0 ๅๅœจT3่ˆช็ซ™ๆฅผๅ€™ๆœบๅคงๅŽ…ๅฎฝๆ•ž็š„ๆค…ๅญไธŠ\n664003 0 ็ฝ‘ไธŠ่ฏด็š„ๆ˜ฏๅœจ่…พ่ฎฏ่ง†้ข‘ๆ˜ฏๆ”ถ่ดน็š„\n664004 0 ็ป™ๅคงๅฎถๅˆ†ไบซโ€œOfficeๆœ€ๅ…จ่ง†้ข‘็ญ‰โ€ๆ–‡ไปถ\n1000 Processed\n classify content\n664500 1 ้œ€่ฆ่ดทๆฌพ่ตถ็ดง็š„ ใ€ๆฑฝ่ฝฆๆŠต่ดทใ€‘๏ผš ไธๆŠผ่ฝฆใ€ ไธ่ฟ‡ๆˆทใ€ ไธ่ฃ…GPS๏ผŒ ใ€ๆœˆๆฏx.xๅŽ˜ใ€‘ๅนดๆฏxx%...\n664501 0 ๅœฐ้“ไธŠๅˆๅŸ‹ๅคด็œ‹ๆ‰‹ๆœบ็„ถๅŽๆ‰‹้บป่„š้บป\n664502 0 ๆฏๅ‘จไธ‰ๆ™š22๏ผš00ๆฑŸ่‹ๅซ่ง†ๅฃฎๅฟ—ๅ‡Œไบ‘\n664503 0 absๆ–ฐๆฌพๆ‹‰ๆ†็ฎฑๅคงๅ˜ด็Œด้“ๆก†pc่กŒๆŽ็ฎฑไธ‡ๅ‘่ฝฎ้ป‘่“็บขไธ‰่‰ฒๅก้€š็ฎฑๅŒ…ๅŒ…้‚ฎ\n664504 0 โ€œไธญๅ›ฝ็‰ˆAirbnbโ€้€”ๅฎถๅฎŒๆˆxไบฟ็พŽๅ…ƒ่ž่ต„\n1000 Processed\n classify content\n665000 0 ๆ”ฏๆŒ้“้“ๅฅฝๅฃฐ้Ÿณ่ฏ•ๅฌๅœฐๅ€&gt\n665001 0 ๅธฆ็€็ชๅคงๅ“ฅ็š„ๅŒๆฌพGโ€“SHOCK\n665002 0 ๆ˜Ÿๅทดๅ…‹ๅฎๆณขๅ’Œไน‰1844ๅบ—ๆ‹›ๅ‹Ÿไบ†\n665003 0 ๅง”ๆ‰˜ไบบๅฐฑๆกˆไปถไธŽๅพ‹ๅธˆ่ฟ›่กŒๅๅ•†\n665004 0 TFBOYS็งๅฏ†่Šฑ็ตฎๆ›ๅ…‰๏ผšๅˆๅฎฟไน‹็ช่ขญ\n1000 Processed\n classify content\n665500 0 ้‚ฃไธชๆ’žไธญๅ›ฝ้ฃžๆœบ็š„ๆ—ถไปฃไธ€ๅŽปไธๅค่ฟ”ไบ†\n665501 1 ไธ€่‚–๏ผš้ผ ๏ผ›ไธค่‚–๏ผš้ผ ็พŠใ€‚่ฟ™ไธคไธช้ƒฝๅพˆๅฅฝ๏ผŒไธ‹ๆณจๆœ‰้ฃŽ้™ฉ๏ผŒๆœ›่ฐจๆ…Žใ€‚\n665502 0 ้…็ฝฎYKKๆ‹‰้“พใ€NIFCOๆ‰ฃๅ…ท\n665503 0 โ€œไธ‰ๅ…ฌโ€็ป่ดนๆ”ฏๅ‡บๆฏ”2013ๅนดๅบฆ้™ๅน…ๆ˜Žๆ˜พ\n665504 0 ่ฎฐ่€…4ๆ—ฅไปŽๅ“ˆๅฐ”ๆปจ็ซ™ๆ”น้€ 
ๅทฅ็จ‹้กน็›ฎๆˆฟๅฑ‹ๅพๆ”ถๆŒ‡ๆŒฅ้ƒจ่Žทๆ‚‰\n1000 Processed\n classify content\n666000 0 ๆ—ฅไฝ ๅฆˆไธพๆŠฅไธ€ๆฌก้ป„ๅญ้Ÿฌ็‚ธไธ€ๆฌก\n666001 0 1ๅนฒ็—’ๅ› ็šฎ่‚ค็ผบๆฐด็ป†่ƒžๅพ—ไธๅˆฐๆฐดไปฝ2่ตท็šฎๅฑ‘ๅ› ็šฎ่‚คๅคชๅนฒ็‡ฅ่ง’่ดจๅฑ‚่„ฑ่ฝ3้•ฟ็—˜็—˜ๅ› ไธบ็šฎ่‚คๆฒนๆฐดไธๅนณ่กก4ๅฎนๆ˜“่ฟ‡ๆ•\n666002 1 ๅๆฐ็”ปๅฎคxๆœˆxๅทๅ‘จๅ…ญๅผ€่ฏพๅ•ฆ๏ผๅˆ็บง็ญ๏ผŒไธญ็บง็ญ๏ผŒ้ซ˜็บง็ญ๏ผŒๆฏ็ญๅชๆ”ถxxไบบ๏ผŒxxxxๅนด๏ผŒๅ…จๆ–ฐ็š„ๆ•™ๅญฆ็†...\n666003 0 ็”ตๆขฏๅžไบบไบ‹ไปถๅ‘Š่ฏ‰ๆˆ‘ไปฌไธ€ไธช้“็†๏ผš่ƒฝๅœจๆˆ‘่ฟ™ๅ„ฟไนฐ็š„\n666004 1 ๅ†็บฟๆ€งๆ„Ÿ้“ๅฆน๏ผ่–„้‡‡้Šๆˆ็ญ‰!้ …็›ฎๅคš\n1000 Processed\n classify content\n666500 0 ๅ› ไธบๆˆ‘ไปฌไปŽๅŸบๅ› ไธŠๅฐฑๆ˜ฏ่ขซ่ฎพ่ฎกๆˆ่ฟ™ๆ ท็š„\n666501 0 ไปŽๅƒ็ฑณ้ซ˜็ฉบ็š„้ฃžๆœบไธŠไฟฏ็žฐไผšๅ‘็Žฐ\n666502 1 ๅฎถไบบไปฌ๏ผšๆ˜ฅ่Š‚ๆ„‰ๅฟซ๏ผŒๆด‹ๆด‹ๅพ—ไบฟ๏ผ่ฎฐๅพ—ๆˆ‘ไปฌๆญฃๆœˆๅๅ…ซ็›ธ็บฆๅˆฉๆฐ‘้›ช่Žฒ๏ผŒ่ฎฉๆ‚จ่ถ…ไฝŽ็”š่‡ณๅ…่ดนไฝ“้ชŒๆˆ‘ไปฌ็š„็‰น่‰ฒ่บซไฝ“...\n666503 1 ไบบๆฏ”่Šฑๅจ‡xxx-xxxx-xxxx\n666504 0 ๆ‹”ๅ‡บๆ‹›่˜ไฟกๆฏ้‡Œ็š„โ€œๅˆบโ€โ€ฆ\n1000 Processed\n classify content\n667000 0 ไฟก็”จๅกๅ–็Žฐ็ง’ๅˆฐ่ดฆ่ดน็އไฝŽๅธฆ็งฏๅˆ†\n667001 0 ๅฐฑไฝฟ็”จๆˆ‘็š„้‚€่ฏท็ 6a6gsdไธ‹่ฝฝๅนถ็™ปๅฝ•ๆต™ๆฑŸ็งปๅŠจๆ‰‹ๆœบ่ฅไธšๅŽ…\n667002 0 ๆˆ‘้‚ฃๆต™ๆฑŸ็š„ๅงๅงๆฌฃ้—ปๆˆ‘่ฆๅŽปๅฅน่€ๅ…ฌ็š„ๆฏๆ กไน‹ๅŽ\n667003 0 13ๅนดๆ‰‹ๆœบไธšๅŠก้ƒจ้—จ่ขซๅพฎ่ฝฏๆ”ถ่ดญ\n667004 0 ๆต™ๆฑŸไธŠๆผ”ไบ†ไธคๅ‡บ่ฎฉไบบๅ•ผ็ฌ‘็š†้ž็š„ๅฎถๅบญ้—นๅ‰ง๏ผšๆญๅทžไธ€ๅ็”ทๅญ้šพๅฟๆฏไบฒไธŽๅฆปๅญไบ‰ๅต\n1000 Processed\n classify content\n667500 0 83%ใ€้‡‘้พ™้ฑผๅคง่ฑ†ๆฒน5ๅ‡/ๆกถ่ฃ…36ๅ…ƒไธŽไธŠๅ‘จๆŒๅนณ\n667501 0 ๆˆ‘็‰นไนˆไนŸๆ˜ฏ้†‰ไบ†ๆญฃๅธธๅ’จ่ฏข็š„ๅพฎๅš้ƒฝไธๅ‡บๅŽป\n667502 0 ๆ‰ๆœ‰ๆœชๆฅโ€”โ€”ๅฝฉ้’ขๆฟๅปบ็ญ‘ไน‹็งŸๅฎข็ฏ‡\n667503 0 IBMๅคงไธญๅŽๅŒบ้ฆ–ๅธญๆ‰ง่กŒๆ€ป่ฃ้’ฑๅคง็พคๅฐ†้€€ไผ‘\n667504 0 51ๅฎถไฟๅฅ้…’ๅฃฎ้˜ณๅŠŸๆ•ˆๅฏ่ƒฝๆ˜ฏ้ ่ฅฟ่ฏโ€œไผŸๅ“ฅโ€ๆฅๆ”ฏๆ’‘\n1000 Processed\n classify content\n668000 0 ๆฅ่‡ชๅ…จๅ›ฝ37ๅฎถๅš็‰ฉ้ฆ†ๅŠๅคฉๆดฅๅš็‰ฉ้ฆ†็š„ๆ€ป่ฎก267ไปถๆ–‡็‰ฉ\n668001 0 NBAๅผ€ๅฏๅคๅญฃ่ฝฌไผšๅ’Œ็ญพ็บฆ็š„็ช—ๅฃ\n668002 0 ๆƒณไธ้€š้‚ฃไนˆๅคšๆˆฟๅœฐไบง้ƒฝไธๆ™ฏๆฐ”\n668003 0 
ไธ€ๆœฌไนฆๅœจไบš้ฉฌ้€ŠไธŠ้•ฟๆœŸไธ€ๅ—้’ฑไธๅˆฐ็š„ไพฟๅฎœๅ–ไนŸไธไธ€ๅฎšๆ˜ฏ่ฟ™ไนฆ็ƒ‚\n668004 0 ๆœฌ้—จไปŽๆ˜†ๅฑฑ้‡‘ไนŒ็‹ฎ็›Ÿๅคบๅพ—ๆ˜Ž็މ็ฅžๅŠŸๆฎ‹็ซ ไบ”\n1000 Processed\n classify content\n668500 0 ๅธฎๆˆ‘ๆ‰พๅœฐ้“็ซ™ๅŽ่ฟ˜้€ๆˆ‘ๅˆฐๅœฐ้“ๅฃ\n668501 0 ๅบ”ๅŠๆ—ถๅˆฐ็™ฝ็™œ้ฃŽไธ“็ง‘ๅŒป้™ขๆฃ€ๆต‹ๅ‡บ้ป‘่‰ฒ็ด ็ผบๅคฑ่ฏฑๅ› \n668502 0 ไปŠๅนดๅˆ่ฎค่ฏ†็š„ๅ“ฅๅ“ฅๅœจๆน–ๅ—้ฃžๆตทๅฃ็š„้ฃžๆœบไธŠไธขไบ†ๅฐ็ฑณๆ‰‹ๆœบ\n668503 0 ๆžœ็„ถ็œ‹ๅˆฐๅฐๅทๆญฃ่ถดๅœจๅงๅฎค็š„้˜ณๅฐไธŠ\n668504 0 ็ฟป้˜…ไบ†ไธไธ‹200็ฏ‡ๅพฎๅš็ปˆไบŽๆ‰พๅˆฐไบ†\n1000 Processed\n classify content\n669000 0 ๅนฟๅทžๅผ€ๅ‘ๅŒบๅ่…ๅ€กๅป‰ๆ•™่‚ฒๅŸบๅœฐๅˆๅ่‚ฒๅป‰้ฆ†\n669001 0 ไธ”็†ฌไธ”็ๆƒœใ€็ตฆ็ถ“ๅธธ้œ€่ฆ็†ฌๅคœใ€ๅ–้…’็š„ไฝ \n669002 0 FOLLOWMEๅ’จ่ฏขๅซๆ˜Ÿ๏ผšNut1366\n669003 0 ่€Œ้‡Œ้ขๆœ‰ไพ›่บซๆ‚ฃ็ฒพ็ฅž็–พ็—…็š„ๅฅนๆœ็”จไธ€ๅนด็š„ๆฒป\n669004 0 ไธ€่ˆฌ่ฟ™ไบ›้ƒฝๅพ—YG้‚ฃ่พนๅ‘Š่ฏ‰ๅ’ฑไปฌ่กŒ็จ‹ไปฅๅŽ\n1000 Processed\n classify content\n669500 0 ่ฟ‘่ง†ไธค็™พๅบฆๆˆ‘ๅพ—็ฝชseiไบ†ๆˆ‘\n669501 0 ๆทฑๅคœๆŠ“ไฝไธ€ๅชๅปบ็ญ‘็‹—ๆœ‰ๅคฉๅถๅงโ€ฆ่ฎบๆ–‡ๅ†™ๅพ—ๆฏ”ๆˆ‘ๅฟซไธ่ƒฝๅฟ\n669502 0 ๅฎฟ่ฟ้’ๅŽๅญฆๆ ก็š„้ƒจๅˆ†ๆ•™่Œๅทฅๅ‘ๅช’ไฝ“ๅๆ˜ \n669503 0 ๆณ•ๅฎ˜็”จๅ…ณไนŽ็คพไผšๅ…ฌๅนณ็š„ๆƒๅŠ›่…่ดฅ\n669504 0 ไผ ๅฐๅบฆ็”ตๅ•†Snapdeal่ž่ต„xไบฟ็พŽๅ…ƒ้˜ฟ้‡Œๅ‚ๆŠ•ๆฎๅ›ฝๅค–ๅช’ไฝ“ๆŠฅ้“\n1000 Processed\n classify content\n670000 1 ๆ˜ฅ่Š‚ๅ‡ๆœŸๅทฒ่ฟ‡๏ผŒ็Žฐๅทฒๆญฃๅผๅผ€ๅฑ•ๅทฅไฝœ๏ผŒๆœฌไบบไพ็„ถไปŽไบ‹็€ไธชไบบๆ— ๆŠตๆŠผไฟก็”จ่ดทๆฌพ๏ผŒ่ฝฆ่พ†่ดทๆฌพไปฅๅŠๆˆฟๅฑ‹ๆŠตๆŠผ่ดทๆฌพ๏ผŒ...\n670001 0 ๅ›ฝ็ไธ“่ฅ/ๅ›ฝ็ๅฅๅบท็”Ÿๆดป้ฆ†้กน็›ฎไผ˜ๅŠฟ๏ผš1ใ€ไฟๅฅๅ“่กŒไธšๅฑžไบŽๆœ้˜ณ่กŒไธš\n670002 0 cctv6็š„ๅ‘จไธ€่‰บๆœฏๅฝฑ้™ข็‰นๅˆซ่ฎฒ็ฉถ\n670003 0 Fsjไปฅไธบ็”จไธ€ๅˆ‡้š็ž’่‡ชๅทฑๆ‰€ไฝœๆ‰€ไธบ\n670004 0 ็ฒ‘็ฒ‘ๅฐฑๆŠŠ็”ต่„‘ๅฐฑ็ป™ๆˆ‘ๆฌๅˆฐๅบŠไธŠๅ‚ฌๆˆ‘้€‰่ฏพ\n1000 Processed\n classify content\n670500 0 20150712ไปŠๅคฉๅฐฑๆ˜ฏๆ•ดๅคฉๅฐ่‘—้›ป่…ฆ\n670501 0 ๆณ—ๅŽฟๆณ•้™ขๅ…šๅ‘˜ๆณ•ๅฎ˜่ตฐ่ฟ›ๅ—ๅ…ณ็คพๅŒบ\n670502 0 ๅ†…้ƒจ่ฃ…็ฝฎ็š„LED็ณป็ปŸๅฏไฝฟๅ…ถๅœจๆŽฅ้€š็”ตๆบๅŽๅ‘ˆ็Žฐๅ‡บไธ€็›ด่ด่ถๅฝฑๅƒ\n670503 0 ็Ÿญ็Ÿญ20ๅ…ฌ้‡Œ็š„่ทฏ่พน่Šฑ่‰ๅธƒ็ฝฎๅฐฑๆ˜ฏ6000ไธ‡\n670504 0 
่€ŒไธŽไธ€่ˆฌAndroidๅนณๆฟ็”ต่„‘็š„้”ฎ็›˜ไฟๆŠคๅฅ—ๆœ‰ๆ‰€ไธๅŒ็š„ๆ˜ฏ\n1000 Processed\n classify content\n671000 0 ๆƒณๆƒณๅนฟๆ’ญไธŠ้ƒฝๆœ‰้‚ฃ็งxxxxไนฐๅๅ‡ ็“ถ่Œ…ๅฐ้€xxxx่ฏ่ดน็š„\n671001 0 ๅ—ๅˆฐๅ†่ž่ต„ๅณๅฐ†้‡ๅฏ็š„ๅˆฉ็ฉบไผ ้—ปๅฝฑๅ“ๅคง็›˜่ทณ็ฉบไฝŽๅผ€\n671002 1 ๅฅฝๆถˆๆฏ๏ผๅฅฝๆถˆๆฏ๏ผๆ–ฐ้ƒฝๅจฑไนๆฑ‡ๆ–ฐๆ˜ฅๆœ‰ไผ˜ๆƒ ้…’ๆฐดๅ…จ้ƒจxๆŠ˜๏ผŒๅ…ๅŒ…ๆˆฟ่ดน๏ผŒๆ˜ฏๅ„ไฝๅคงๅ“ฅๅจฑไนไผ‘้—ฒ็š„ๅฅฝๅŽปๅค„๏ผŒไฝ ่ฟ˜...\n671003 0 NASAๅœจTwitterไธŠ็š„ๅ›พ็‰‡\n671004 0 ๅผ•โ€œ็Ÿฅๆƒ…ไบบๅฃซโ€ๅ‡บๆดžโ‘กๆ•ฐๆฎ้€ ๅ‡ๆ„ˆๆผ”ๆ„ˆ็ƒˆ\n1000 Processed\n classify content\n671500 0 ๅฎถ้‡Œ็š„้…ฑๆฒน้†‹็ณ–็›้ƒฝๆฏ”ๆžซๅถๅ›ฝ็š„ๅฅฝๅƒๅฅฝๅคš\n671501 0 ไปฅๅŽๆˆ‘ๆ”น่ฏดโ€œ็Žฉ็”ต่„‘โ€==~~\n671502 0 ๅ‘็บขๅŒ…ๅ•ฆ~ๅ‘็บขๅŒ…ๅ•ฆ~ๅ‘็บขๅŒ…ๅ•ฆ\n671503 0 ๆ•ด้ƒจๅ‰ง้ƒฝๆ˜ฏๅ„็ง็Šฏ็ฝชๅ„็งๆ‰“ๆžถๅ„็งๅธ…\n671504 0 ไธๆƒณๆˆ˜ไบ‰ๆ”ฟๅบœๅฟ…้กปๆƒณๆณ•ๅฐ‘ๆ•ฐไบบๅˆซๅ ๆœ‰ๅคชๅคš\n1000 Processed\n classify content\n672000 0 ่ฟ™ไธชไธœ่ฅฟๅคชๆๆ€–ไบ†ๆ•ดไธชๅฐๅŒบ็š„ไบบ้ƒฝๅ †ไธ€ๅ †ๆˆ‘ๅฆˆๅฐฑไธ่ฎฉๆˆ‘็ˆธไนฐไป–่ฟ˜่บฒ็€ไนฐๆˆ‘ไธๆƒณไป–ไปฌ็ฆปๅฉš\n672001 0 ่ฎฉๅ่…ๅŠฒ้ฃŽๅน่ฟ›โ€œ่ก™้—จโ€็š„ๆฏไธ€ไธช่ง’่ฝ\n672002 0 ๆธ…ๆถงๅŽฟๅ…ฌๅฎ‰ๅฑ€ๅœจๅŽฟๅฑ€ไธ‰ๆฅผไผš่ฎฎๅฎคๅฌๅผ€ๅ…จๅŽฟ็ฝ‘ๅงไธšไธปๅบง่ฐˆไผš\n672003 0 ๅฏๅ‘ๆณ•ๅฎš่กŒๆ”ฟๅค่ฎฎๆœบๅ…ณ็”ณ่ฏทๅค่ฎฎ\n672004 0 ๆณ•ๅพ‹ๆ˜ฏๅฎž่ทต็ป้ชŒ็š„ๆ€ป็ป“ๅ’Œๆๅ‡\n1000 Processed\n classify content\n672500 0 ไธญๅ›ฝไฟก็”จๅกๆ–ฐๅขžๅ‘ๅก้‡6400ไธ‡ๅผ \n672501 0 ๆš‚ๆ—ถ็ฆปๅผ€่ฟ™ไธชๅฎ˜ๅ•†ๅช’ๅ‹พ็ป“็š„ๅธ‚ๅœบ\n672502 0 ๆ—ฉ็›˜ๅผ€็›˜ๆ˜ฏ็›ดๆŽฅ็ซ™ไธŠ3700็‚นๆˆ˜ๅœฐ\n672503 0 ่ฏฅๅŽฟ็ดฏ่ฎกๆŠ•่ต„1300ๅคšไธ‡ๅ…ƒ\n672504 0 ๅŽไธบNexusๆ‰‹ๆœบ่ฆ็›ดๆŽฅ็”จ้ช้พ™820ใ€\n1000 Processed\n classify content\n673000 1 ไธญๅ…ฌๆ•™่‚ฒ็ผ™ไบ‘ๅˆ†ๆ กๆ•™ๅธˆๆ‹›่˜่ฏพ็จ‹ไธŠ็บฟๅ•ฆ๏ผŒๅœจ็ผ™ไบ‘ๅผ€่ฏพๅ“ฆ๏ผŒ่€Œไธ”xๆœˆxxๆ—ฅๅ‰ๆŠฅๅๆœ‰ๆƒŠๅ–œไผ˜ๆƒ ไปท๏ผŒ่ฏฆ่ฏขxx...\n673001 0 ๅ› ๆ‘ๆฐ‘ๆœชไบคๆ–ฐๅž‹ๅˆไฝœๅŒป็–—่ดน็”จ่€Œๆސๆ–ญ็”ต็บฟ\n673002 0 ๅคๅคฉ้€‰็”ต่„‘ๆกŒ้ขไนŸๆ˜ฏๆŒบๆœ‰่ฎฒ็ฉถ็š„\n673003 0 ๅญฆๆ กๆœ‰ๅ›ฝๅฎถ้‡็‚นๅฎž้ชŒๅฎค3ไธช๏ผšๅ›ฝๅฎถๆ•™่‚ฒ้ƒจๆ™บ่ƒฝๅˆถ้€ ๆŠ€ๆœฏ้‡็‚นๅฎž้ชŒๅฎค\n673004 0 
ๆœ€ๆœ‰ๆ„ๆ€ๆ˜ฏDan่ฟ˜็Žฐๅœบๆผ”็คบไบ†AppiumไธŽๆœบๅ™จไบบ็ป“ๅˆ็š„็œŸๆœบๆต‹่ฏ•ๅœบๆ™ฏ\n1000 Processed\n classify content\n673500 0 ไธ€ไปฃ็މๅฅณ้’ๆ˜ฅๅถๅƒDebbieGibson็š„็ปๅ…ธๅๆ›ฒ\n673501 0 ๅ„ฟๆ—ถๆœ‰่ฟ‡ๆขฆๆƒณๆˆไธบๆกฅๆขๅปบ็ญ‘ๅธˆ\n673502 0 ้‚ฃไนˆๆœ€ๅŽไป–่ฟ˜ไผšไธบไบ†่Šฑๅƒ้ชจๆญปๅ—\n673503 0 ็Žฐๅœจ็š„็—…ไบบ่ทŸๅ•†่ดฉไธ€ๆ ทๅฅธ่ฏˆๅ™ข\n673504 0 ๆ‹†ๆމๅฎ—ๆ•™่ฟ็ซ ๅปบ็ญ‘ๆ˜ฏไฝ“็Žฐ็คพไผšๅ…ฌๅนณ็š„่กจ็Žฐ\n1000 Processed\n classify content\n674000 1 ๆˆ‘ๅ…ฌๅธๅบ“ไธญ็Žฐๆœ‰่ฟ›ๅฃๆฌงๆดฒๅบŸๆ—งๅก‘ๆ–™ใ€ไธป่ฆๆœ‰๏ผšABSใ€ABSๅˆ้‡‘ใ€PAใ€Pcใ€PEใ€PPใ€PMMA...\n674001 0 xxๅŽๆ›พๆœ‰่ฟ‡โ€œ้’ๆ˜ฅๆ˜ฏ้“ๆ˜Žๅชš็š„ๅฟงไผคโ€\n674002 0 ๅฝ“ๆˆ‘ๅœจๅ—ไบฌ่ท‘ๅฎŒ่ฟ™xxๅ…ฌ้‡Œ็š„ๆ—ถๅ€™ไฝ ้€”ไธญไผšๅ‘็Žฐๆœ‰ๅพˆๅคš้ซ˜ๆฅผๅคงๅŽฆ\n674003 0 ๅฐฑๆƒณ่ฏด่…พ่ฎฏๆ–ฐ้—ปไนŸ็”จ้”™ๅˆซๅญ—ๅ•Š\n674004 0 ๆƒณๆด—็ด‹่บซๅˆฐๅบ•้†ซ้™ข้ ่ญœ้‚„ๆ˜ฏ็พŽๅฎนๆฉŸๆง‹้‚„ๆ˜ฏ็ด‹่บซๅบ—\n1000 Processed\n classify content\n674500 0 ็‰กไธนๆฑŸๅธ‚่ฅฟๅฎ‰ๅŒบๆณ•้™ข็ซ‹ๆกˆๅบญๅ‰ฏๅบญ้•ฟ้ฉฌ่•ดๆ…งไปŽๅฎถๅ‡บๅ‘\n674501 0 ่ฟ‡ไบ†ไฟ่ดจๆœŸไธๅŒ…ไบ†ๅค–้ขๅ‡ ๅๅ—้’ฑๅฐฑไฟฎๅฅฝไบ†็š„ไป–่ฆ220\n674502 0 ๆปจๆตทๅŸŽๅธ‚้’ๅฒ›ๅฐฑๆ˜ฏ่ฆๅŽป็ญ‰ๅพ…ๅ’Œๆ‹–ๅปถ\n674503 0 ๆ˜ฏ้ฃŸ็‰ฉไธญๆฏ’ๅ’Œ้ฃŸๆบๆ€ง็–พ็—…ๅคšๅ‘ๅญฃ่Š‚\n674504 0 ๅ˜‰ไฟชๆณฝยทๅ›ฝๅฎถ้ฃŸๅ“่ฏๅ“็›‘็ฃ็ฎก็†ๆ€ปๅฑ€ๅ›ฝไบง้ž็‰นๆฎŠ็”จ้€”ๅŒ–ๅฆ†ๅ“ๅค‡ๆกˆ้€š่ฟ‡\n1000 Processed\n classify content\n675000 0 ไบŽๆ˜ฏๅœจ้ฃžๆœบไธŠ็œ‹ๅˆฐ็š„ๆ™ฏ่‰ฒไนŸๆ˜ฏไธ้”™็š„\n675001 0 ๅŸบๆœฌไธŠๆœ€็ปๅ…ธๅ’Œ็จณๅฆฅ็š„่ฎพ่ฎกไบ†\n675002 0 ๅŽไธบmate7็”จๆˆท็š„ๅฟƒๅฃฐ๏ผšๅŒ็ณป็ปŸไปŽๅŽปๅนดไนๆœˆๅ‘ๅธƒไผšๅผ€ๅง‹ๅนๆงไน‹ๅŽ่ฏดๅˆฐ11ๆœˆไปฝๅผ€ๅง‹ๆŽจ้€ๅˆฐไปŠๅคฉ201...\n675003 0 ๅธŒๆœ›ไธญๅ›ฝๆ”ฟๅบœๅˆ‡ๅฎžๅ…ณ็ˆฑไธญๅ›ฝๆฎ‹็–พไบบ็š„็”Ÿๅญ˜็Šถๅ†ต\n675004 0 ไฝ ไผšๅ‘็Žฐไปฅ4ๅ…ƒไนฐๅ…ฅๅนถไปฅ8ๅ…ƒๅ–ๅ‡บ็š„ๆœบไผšๅนถไธๅคš\n1000 Processed\n classify content\n675500 0 ๅฐŠไบซ240โ€”โ€”460ใŽก้บ“ๅฑฑๅˆซๅข…\n675501 0 ๅฎž็Žฐๅ…ฌๅธๅ•†ไธšๅœฐไบง็ปง็ปญๅšๅคงๅšๅผบ็š„ไธšๅŠกๅ‘ๅฑ•็›ฎๆ ‡\n675502 0 ๅ…ถๆ„Ÿๅ—็ป†่ƒžๅˆ†ๅธƒๅœจๅ…ณ่Š‚ใ€่‚Œ่‚‰ใ€่‚Œ่…ฑ็ญ‰็ป„็ป‡ไธญ\n675503 0 ๆˆ‘ไปฌ็š„ๅšๅทฅใ€่ดจ้‡ไปฅๅŠๆœๅŠก็ปๅฏนๆ˜ฏไธ€็ญ‰็š„\n675504 0 ๆต™ๆฑŸไธŠๆฆœ็š„ๅœฐไบงใ€่ƒฝๆบไผไธšไธๅฐ‘\n1000 Processed\n 
classify content\n676000 0 ๅˆšๅˆšๆ‹ฟๅˆฐไบฌไธœๆˆ˜็•ฅๆŠ•่ต„็š„ๅž‚็›ด็”Ÿ้ฒœ็”ตๅ•†ๅคฉๅคฉๆžœๅ›ญ\n676001 0 ไธชไบบไฟกๆฏ่ขซๅˆซไบบ็›—็”จ่ฟ›่กŒ็Šฏ็ฝช\n676002 1 ๆ‚จๅฅฝ๏ผŒ่ฏš้‚€ๅ…‰ไธดxxxxๅนฟๅทž๏ผˆๆ–ฐ๏ผ‰ๆตไฝ“ๅฑ•๏ผŒxxxๅฎถๆตไฝ“ไผไธšๅ’Œๆฅ่‡ชxxๅคšไธชๅ›ฝๅฎถ็š„ไธ“ไธš่ง‚ไผ—้ฝ่š๏ผŒไธญ...\n676003 0 ๅŒป็”Ÿ่ฎฉๆˆ‘ๆ‹”็‰™ไน‹ๅ‰ๅš็‰™ๅ‘จๆŠค็†\n676004 1 ไบฒ็ˆฑๅงๅงไปฌๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒไธฝๆ—ฅไธ‰ๆฅผ้€‚ๆ—ถ้›…้›†ไธ“ๆ‹’ๆžๆดปๅŠจ๏ผŒ็ง‹ๅ†ฌ่ฃ…x.xไธ€x.xๆŠ˜๏ผŒๆœ‰ๆ—ถ้—ด่ฟ‡ๆฅ็œ‹็œ‹๏ผŒๆดป...\n1000 Processed\n classify content\n676500 0 ไธ€่ง‰้†’ๆฅ็œ‹ๅˆฐๆ‰‹ๆœบไธŠ็š„ๆ—ถ้—ด้ƒฝๆ‡ตไบ†\n676501 0 ็œ‹ๅˆฐ่ขซ่ก€ๆŸ“็บข็š„ๆตทๆฐดๆ„Ÿ่ง‰ๅฅฝ้šพๅ—ๅฅฝ้šพๅ—็š„\n676502 0 โ€”โ€”ใ€Œ้•‡ๆ”ฟๅบœไธบ็กฎไฟๅทฅ็จ‹่ฟ›ๅบฆๅผบๆ‹†ๆฐ‘ๆˆฟ่ขซๅˆค่ถŠๆƒ่ฟๆณ•ใ€\n676503 0 ๆณฐๅทžๅŠจ็‰ฉๅ›ญๅฏน้ข็š„ๆœฑๅก˜ๆ‘ๆŸๆ‘ๆฐ‘ๅฎถไธญ็…คๆฐ”็ˆ†็‚ธๅ‘็”Ÿ็ซ็พ\n676504 0 ไธ€ๆ—ฉๆ‰“ๅ‡บ็งŸ่ฝฆๅธ…ๅ“ฅๅธˆๅ‚…่ฏดๆˆ‘ไน‹ๅ‰ๅฅฝๅƒๆ‹‰่ฟ‡ไฝ ๆˆ‘่ƒฝ่ฏดๆˆ‘ๅฎŒๅ…จไธ่ฎฐๅพ—ไบ†ๅ—็œ‹ๆฅไป–ไธๅƒๆ˜ฏ้ช—ๆˆ‘็š„ไธป่ฆๆ˜ฏๆˆ‘ๆฒก่ฏดๅŽป...\n1000 Processed\n classify content\n677000 0 ๅ—ไบฌๅฅฝ็ƒญQAQๅ‡บ้—จๅฐฑ่ง‰ๅพ—่‡ชๅทฑ่ฆๅŒ–ไบ†ๅ—ท\n677001 1 ่€ๆฟ๏ผŒๆ‚จ้‚ฃ่พนๆ˜ฏๆœ‰ไธชๅบ—้ขๅœจ่ฝฌ่ฎฉๅง๏ผŸๆˆ‘่ฟ™่พนๆ˜ฏไธ“ไธšไปŽไบ‹ๅบ—้ข่ฝฌ่ฎฉ็š„๏ผŒ้€Ÿๅบฆๅฟซใ€ๆ•ˆ็އ้ซ˜ใ€ไปทๆ ผๅˆ็†๏ผ่ฝฌๅบ—้ข...\n677002 0 ๅฝ“Fire็”จๆˆทๅ‘ไบš้ฉฌ้€Šไบ‘ไธŠไผ ็…ง็‰‡ๅ’Œๆ•ฐๆฎ\n677003 0 ๅ—ไบฌ้’ฑๅฎๅฎขๅœบ3ๆฏ”1ๅ‡ป่ดฅไฟๅฎšๅฎนๅคง\n677004 0 ๅผ ่€ๆ™šๅนดๅœจๆฑŸ่‹ๆฑŸ้˜ด็š„ๅฎถไธญไนฆๆˆฟ\n1000 Processed\n classify content\n677500 0 ๅ่…ๆกˆๅˆ—ๅ่…ๆกˆๅˆ—ๅพˆๆˆๅŠŸ\n677501 1 ๅ…่ดนๅ’จ่ฏขxxxxxxxxxxxๆœ€ๆ–ฐ็‰ˆๆœบ้บปๆŽงๅˆถๅ™จ๏ผŒไธๅฎ‰่ฃ…๏ผŒ่ตทๆ‰‹ๆ‹ฟๅฅฝ็‰Œ๏ผŒๅฏ็Žฐๅœบ่ฏ•็”จ๏ผŒๆปกๆ„ๅ†ไนฐใ€‚y\n677502 0 ็‹ฌ็ซ‹ๅซๆตด็”ตๆขฏ่ฟ‘็ซ่ฝฆ็ซ™Coles\n677503 0 CCๆ˜ฏ่ƒฝ็”จๅ—“้Ÿณ้œ‡ๆ’ผๅ…จไธ–็•Œ็š„็”ทไบบ\n677504 0 ๅธŒๆœ›ไปฅๅŽ่ƒฝๅŽป็พŽๅ›ฝ่ฏป้‡‘่žๅšๅฃซ\n1000 Processed\n classify content\n678000 0 xร—xx*xไธ็Ÿฅ้“่ƒฝๅฆ็ปง็ปญๆฏไนณๅ–‚ๅ…ป\n678001 1 ๅง๏ผŒๆ–ฐๅนดๅฅฝ๏ผ็Žฐxx่Š‚็‰นๆƒ ๆ˜ฏๆปกxxxxๅฏไบซ้€xxx\n678002 0 xxxxๅนดๅบฆ้ƒจ้—จโ€œไธ‰ๅ…ฌโ€็ป่ดนๆ”ฏๅ‡บๆฏ”xxxxๅนดๅบฆ้™ๅน…ๆ˜Žๆ˜พ\n678003 0 ๅ…ถๅฎžๆ‰‹ๆœบๆ‹ๆ นๆœฌ็œ‹ไธๅ‡บๆฅ่„ธไธŠ้ƒฝๆถ‚ไบ†ๅ•ฅ\n678004 0 
ไบŒๆ‰‹่ฝฆๅนณๅฐไบบไบบ่ฝฆ่Žท8500ไธ‡็พŽๅ…ƒC่ฝฎ่ž่ต„่…พ่ฎฏ้ข†ๆŠ•\n1000 Processed\n classify content\n678500 0 ไผ ็ปŸ็ƒงๆฒนๆฑฝ่ฝฆๅ› ไธบๆœบๆขฐๅทฅ่‰บๅคๆ‚\n678501 0 ๆœ‰็ง่ฏดๆณ•ๆ˜ฏไผไธšๅ•†ไธš่ฎกๅˆ’pptๆ˜ฏๅ†™็ป™่‡ชๅทฑ็š„\n678502 0 ่€Œไป–ไปฌไนŸๅฐ†ๅ†ๆฌกไธŽๅ‰TopGear่ฃฝไฝœไบบAndyWilmanๅ†ๆฌกๅˆไฝœ\n678503 0 ่ขซ้’ฑๆฑŸๆ™šๆŠฅไปŠๆ—ฅๆกไนกๅ’Œๆต™ๆฑŸๆ–ฐ้—ป็ฝ‘็ซž็›ธๆŠฅ้“\n678504 1 ๅœฃ่ฑช็™พ่ดง่ฟชๅฃซๅฐผ็ซฅ่ฃ…ๆžๆดปๅŠจๅ•ฆ๏ผไธ‰ๆœˆไธ‰ๆ—ฅ่‡ณไธ‰ๆœˆไนๆ—ฅ๏ผŒๅ†ฌ่ฃ…ไฝŽ่‡ณๅ››ๆŠ˜๏ผŒๆ˜ฅ่ฃ…ไฝŽ่‡ณไธƒๆŠ˜๏ผŒๆฌพๅผๅคšๅคš๏ผŒไผ˜ๆƒ ๅคš...\n1000 Processed\n classify content\n679000 0 ๅนฒไบ†่ฟ™ไบ›่ฟๆณ•ๅ‹พไบŽๅผบ็ƒˆๆ„Ÿๆƒ…็š„ๆŠ’ๆƒ…่ฏ—ๆญŒๅฏไปฅๆ‹ฅๆœ‰\n679001 0 Aๅ›๏ผšไป–ไปฌๆฝฎไธ€ไบ›ๆˆ‘ไปฌ้ƒฝไธ็Ÿฅ้“็š„็‰Œๅญ\n679002 0 BTOB็ˆ†ๆ–™๏ผš้™†ๆ˜Ÿๆๅนฟๅ‘Šๆ”ถ็›Šๅคงๅฎถไธ€่ตทๅˆ†\n679003 0 ไฝ ๅœจๅ…ถไป–ๅบ—้‡Œไนฐ็š„้˜ฒๆ™’้œœๆœ‰ไฟฎๅค้œฒ้…ๅฅ—ๅ—\n679004 0 ๆฒณๅŒ—ๅคงๅๅŽŸๅŽฟๅง”ไนฆ่ฎฐๆ•›่ดขไธŠไบฟ่‡ช็งฐๆœช่Žทๅ‡่ฟๅฟƒ็†ๅคฑ่กก\n1000 Processed\n classify content\n679500 1 xxๅฅณไบบ่Š‚่ถๆผซ็Žซ็‘ฐไผšๆ‰€้€ๆƒŠๅ–œ๏ผšxๆœˆxๆ—ฅๅฝ“ๅคฉ็ง’ๆ€ ไผ˜ๆƒ ไธ€๏ผŒๅคด็–—๏ผŒๆ‰‹็–—๏ผˆๅ…จๅนดไธ้™ๆฌกๆ•ฐ๏ผ‰็ง’ๆ€ไปทx...\n679501 0 ไนŸ็ฎ—ๅฏนๅพ—่ตทๆˆ‘100ๅ—ๅ’Œ้‡ๆ–ฐๆ‹พ่ตทๆ•™่‚ฒๅญฆ็š„ๅ‹‡ๆฐ”\n679502 0 ้’ˆๅฏนๆ”ถ็ผฉๆฏ›ๅญ”ใ€็ฒ‰ๅˆบ้ป‘ๅคดๆœ‰็‰นๅˆซๆ•ˆๆžœ\n679503 0 ๅ‰ๆตทไบบๅฏฟ็ฉถ็ซŸๆƒณๅไบซโ€œๆŠ„ๅบ•โ€ๆ”ถ็›Š็š„ๆฐธไน…ๆ€งๅนณ้™ๅ‘ข\n679504 0 ๆ— ้”ก่Œๆ•™ๅ›ญๅ…ป็”Ÿๅˆ†ไผšๅ’Œๅบทๅคไธญๅฟƒ้‚€่ฏท่ก—้“้ƒจๅˆ†็คพๅทฅๅฌๅผ€ๅบง่ฐˆไผš\n1000 Processed\n classify content\n680000 0 ็”จ10็ง’็š„่ฎพ่ฎกไธบๆฐ‘ๆ—ๅˆ›ๆ–ฐๅŠ›็‚น่ตž\n680001 0 ๅ› ไธบๅ–ท้›พๆ˜ฏ็›ดๆŽฅไฝœ็”จๅœจๅ’ฝๅ–‰ไธŠ็š„\n680002 1 ๆ‚จๅฅฝ๏ผ้ญ…ๅŠ›ๅจฑไนๅ…ฌๅธ็ฅๆ‚จๅ…ƒๅฎต่Š‚ๅฟซไน๏ผŒๅ‡กไปŠๆ™šๅœจ้ญ…ๅŠ›ๆœฌ่‰ฒ้…’ๅงๆˆ–้ญ…ๅŠ›ๅฎ‰ๅ—KTVๆถˆ่ดนๆปกไฝŽๆถˆ็š„๏ผŒๅฐ†่Žทๅพ—้ญ…...\n680003 0 ใ€€ใ€€ๅŠŸๆ•ˆ๏ผšๆตทๅธฆไธญ็š„็ข˜่ƒฝๅคŸๅพˆๅฅฝ็š„่ขซไบบไฝ“ๅธๆ”ถๅŽ\n680004 1 ๆˆฟไบงๆŠตๆŠผใ€xๅคฉๅ‡บๅฎƒ่ฏๅ†ๆ”พๆฌพ๏ผŒ็กฎไฟๆœฌ้‡‘ๅฎ‰ๅ…จใ€‚่ฏฆ็ป†ๆƒ…ๅ†ตๅปบ่ฎฎๆ‚จๆฅๆˆ‘ไปฌๅ…ฌๅธไบ†่งฃ๏ผŒๅœฐๅ€๏ผšไธŠๆตทๅธ‚้ป„ๆตฆๅŒบๆˆ...\n1000 Processed\n classify content\n680500 0 ๅฐ†่€ๅนดไบบใ€็คพๅŒบใ€ๅŒป็–—ๆœบๆž„ใ€ๅŒปๆŠคไบบๅ‘˜\n680501 1 
ๆ‚จๅฅฝ๏ผๆ‚จ่ดญไนฐ็š„ๅ•†ๅ“็”ฑไบŽ็ณป็ปŸๅ‡็บงๅฏผ่‡ดๆ‚จ็š„่ฎขๅ•ๅทฒๅ†ปๆ— ๆณ•ๆญฃๅธธๅ‘่ดง๏ผŒ่ฏทไธคๅฐๆ—ถๅ†…่‡ด็”ตๅฎขๆœ๏ผšxxx-xx...\n680502 0 ๅŠ ๅผบ็›‘็‹ฑ็ณป็ปŸ็บชๆฃ€็›‘ๅฏŸๅนฒ้ƒจ้˜Ÿไผๅปบ่ฎพ็š„ๅ‡ ็‚นๆ€่€ƒ\n680503 0 ๅคๅญฃๆ–ฐๆฌพๆฌง็พŽ่ก—ๅคด้ซ˜ๆกฃๅฅณ่ฃ…็‰›ไป”ๆ‰ŽๆŸ“ๆฐดๆด—ๆธๅ˜ๅป“ๅž‹่™่ ่ข–่–„ๆฌพ็Ÿญๅค–ๅฅ—\n680504 0 ไฝ ๅฏไปฅ่ดจ็–‘ๆˆ‘ไฝ ๅฏไปฅๅฆๅฎšๆˆ‘\n1000 Processed\n classify content\n681000 0 ๅŒ—ไบฌๆœ้˜ณๆณ•้™ขๅฐ†ไบŽ7ๆœˆ23ๆ—ฅไธŠๅˆ9็‚น\n681001 0 ๅคฉๅคฉ้ƒฝๆœ‰ๅŠžไฟก็”จๅก็š„ๆฅๅ…ฌๅธ่ฝฌๆ‚ \n681002 0 ไป–ๅฆˆ็š„ๅปบ้‚บๅŒบๆธ…่ทๅ›ญๅ—ๅ›ญ็‰ฉไธšไป–ๅฆˆ็š„ๅคชๅทฎๅŠฒ\n681003 0 ๅคฑๆœ›็š„ไบบไปฌๆˆ–่ฎธ็œ‹ๅˆฐไบ†็œŸ็›ธ็š„ๅˆฐๆฅ\n681004 0 ๆœชๆฅ็š„้ฃžๆœบๅœบๆœ‰ๅฏ่ƒฝๆ˜ฏ่ฟ™ๆ ทๅญๅ‘ข\n1000 Processed\n classify content\n681500 0 ๅ—ๅฎณไบบๅบ”่ฏฅ้€š่ฟ‡ๆณ•ๅพ‹้€”ๅพ„ๅพ—ๅˆฐไธ€ๅฎš่กฅๅฟ\n681501 0 ๆ—ฅๆœฌๆœฌๅœŸไปฃ่ดญๅฐ้ป‘ๅธฝ่Ÿ‘่ž‚่ฏ็Žฐ่ดงๆฌข่ฟŽๆ‰พๆˆ‘ๆ‹ฟ่ดงๅ–”่ฟ™ๆฌกๆ‹ฟไบ†ๅฅฝๅคš่Ÿ‘่ž‚่ฏ\n681502 0 ๆฑŸ่‹็›ๅŸŽ่ฐๆœ‰preไธ€็ฝไบญๆน–ๅŒบ้™„่ฟ‘็š„ๅฆ‚ๆžœๆœ‰่ฏทๅซๆˆ‘\n681503 0 ็ผบๆ†พไธŽๅŸบๅ› ไธ€่ตท้“ธๆˆไธ€ไธชๆˆๅ“\n681504 1 xxxx xxxx xxxx xxxx xxxๅ‘จๅจŸๅ†œ่กŒ\n1000 Processed\n classify content\n682000 0 ๆฏๅคฉๆƒณไผด้š็€้ฃžๆœบ็ฟฑ็ฟ”็š„ๅฃฐ้Ÿณๅ“ผๅ”ฑๅผ€ไธไบ†ๅฃ\n682001 0 2008ๅนด่‡ณ2015ๅนด8ๆœˆ5ๆ—ฅ\n682002 0 ้šๅค„ไธ€ไธชๅปบ็ญ‘ๅฐฑๆ˜ฏๅ›ฝ้™…็ป„็ป‡\n682003 0 ๆžœ็„ถๆˆ‘่ฟ˜ๆ˜ฏ้ป‘ๅน•ๅŽปๅง??ๅนถๆฒกๆœ‰ไบบๆƒณ่ฆ??\n682004 0 ็ƒŸๅฐๅ‰ฏๅธ‚้•ฟ่…่ดฅ้ƒฝๆ€ชโ€œๆ–ฐ้—ป่”ๆ’ญโ€\n1000 Processed\n classify content\n682500 0 ๅพˆๅคšไบบ็œ‹ๆ ‡้ข˜ไผšๆƒณๅฆˆ็š„็œŸ่…่ดฅ\n682501 0 ไบ‘ๅ—ๅŽฆ้—จๅนฟ่ฅฟๆฑŸ่‹ๅ››ๅทๆทฑๅœณไธ‹ไธ€็ซ™ๅŽปๅ“ชๅ›žๆข…ๅทž\n682502 0 ๆ—…ๆธธ่€…็ป่ฟ‡่ฟ™ไบ›ๅœฐๆ–นๆ—ถๅฏๅพ€็Ž›ๅฐผๅ †ไธŠๆทปๅŠ ไธ€ไบ›็Ÿณๅคด\n682503 0 ๆ˜ฏๅฏนๅนฟไธœ็œ่‚ฟ็˜คๅŒป้™ขๆ„Ÿๅ…ด่ถฃ็š„็›ธๅ…ณๅคงไผ—่Žทๅ–ๅนฟไธœ็œ่‚ฟ็˜คๅŒป้™ข่ต„่ฎฏ็š„้‡่ฆๆธ ้“\n682504 0 ๆ—…ๆธธๆ˜ฏไบบไปฌไธบๅฏปๆฑ‚็ฒพ็ฅžไธŠ็š„ๆ„‰ๅฟซๆ„Ÿๅ—ๅฐ่€Œ่ฟ›่กŒ็š„้žๅฎšๅฑ…ๆ€งๆ—…่กŒๅ’Œๅœจๆธธ่งˆ่ฟ‡็จ‹ไธญๆ‰€ๅ‘็”Ÿ็š„ไธ€ๅˆ‡ๅ…ณ็ณปๅ’Œ็Žฐ่ฑก็š„ๆ€ปๅ’Œ\n1000 Processed\n classify content\n683000 0 ๅ…ถไธ‰ๆ˜Žๆ˜Žxx๏ผšxx็š„้ฃžๆœบๅˆฐx็‚นๅŠๆ‰่ตท้ฃž\n683001 0 
ไปฅๅŠๅŸบ้‡‘ไธšๅไผšไนŸๅœจ่ฟ‘ๆ—ฅๅฏ†้›†ๅ‘ๅธƒๅˆ†็บงๅŸบ้‡‘ไธ‹ๆŠ˜้ฃŽ้™ฉๆ็คบ็š„ไฟกๆฏ\n683002 0 ไธญ้—ดๆ‹‰้“พ้›ถ้’ฑ้š”ๅฑ‚size20*10~\n683003 0 ๆ— ้”กๆท˜้ฝๅฎ่ดๆ–‡ๅŒ–ๅ‘ๅฑ•ๆœๅŠกไธญๅฟƒ็š„ๅญฆๆ•™่€ๅธˆไธบๅญฆ็”Ÿไปฌ่ฟ›่กŒๆณฅๅก‘ๅŸน่ฎญ\n683004 1 //่ฝฌ่‡ชxxxxx๏ผšๅผ€ๅญฆๅญฃ็‰นๆƒ ๆฅ่ขญ๏ผๅณๆ—ฅ่ตทๅˆฐxๆœˆxxๆ—ฅ๏ผŒ้ข„ๅญ˜xxxๅ…ƒ่ฏ่ดนๅณ้€xxxๅ…ƒ่ฏ่ดนใ€x...\n1000 Processed\n classify content\n683500 0 ไผฏไผฏ7ๆœˆ23ๆ—ฅๆŠŠๆ‰‹ๆœบๆމๅœจ้ซ˜้“ไธŠ\n683501 1 ๅง๏ผๅฅฝๆถˆๆฏโ€ฆๆˆ‘้“ถๅบงๆป•ๆฐ่จไพฌๅบ—ๆ–ฐๆฌพๅˆฐๅบ—ไบ†ๆŠ˜ๆ‰ฃx----xๆŠ˜ไธŽไธ€ๅฃไปท็š„๏ผŒๆ–ฐๆฌพ่กฃๆœๆป•ๆฐ.่จไพฌ.่‰พๆธฉ...\n683502 0 ๆฒ›ๅŽฟๅ…ป็”Ÿ่œ่‹—ๆ•‘ๅฟƒ่œ่‹—็งๅญๆ”น่‰ฏ่ดน่œๆ™ฏๅคฉไธ‰ไธƒ็›†ๆ ฝ้™่ก€ๅŽ‹้™่ก€่„‚\n683503 0 ๆœ€่ฟŸๅฐ†ไบŽ2019ๅนดไบคไป˜็‘žๅ…ธๆตทๅ†›\n683504 0 ไฝ†ๅ…ถโ€œๅ…ทๆœ‰ไพต็•ฅๆ€งโ€็š„ๅ•†ไธšๅŒ–ๅดๆœ‰ๆ‚–ไบŽไฝ›ๆ•™ๆ นๅŸบ\n1000 Processed\n classify content\n684000 0 ๅŒบๅŸŽ็ฎกๆ‰งๆณ•ๅฑ€็ป„็ป‡ๅ…จๅฑ€่ฟ‘100ๅๅœจ่Œๅœจ็ผ–ไบบๅ‘˜ๅผ€ๅฑ•ไบ†็ฌฌไบŒๆ‰นๆฌกๅ›ข้˜Ÿ็ด ่ดจๅŸน่ฎญๆดปๅŠจ\n684001 0 ๆญฃ็กฎ้˜ฒๆ™’่ฟ˜ๆœ‰ๅ“ชไบ›ไฝ ไธ็Ÿฅ้“็š„ๆŠ€ๅทงๅ‘ข\n684002 0 ่ฟ™็งๆฒก็ด ่ดจ็š„ๅˆซ่ทŸๅฎƒ่ฎฒ้“็†่‡ณไบŽๆ—่พนๆ‹็…ง็š„่ฟ™ๆณผๅฆ‡ๆ˜ฏไฝ ่€ๅฉ†\n684003 0 ๅฐไผ™็ฉฟๅŒ–็บค่กฃๆœ่ตท้™็”ตๅŠ ๆฒน็žฌ้—ดๅ˜โ€œ็ซไบบโ€\n684004 0 ๅคงไผ—่ฟ›ๅฃๆฑฝ่ฝฆๅนด่ฝปๅฎถๆ—๏ผš็”ฒๅฃณ่™ซTheBeetleใ€ๅฐš้…ทScirocco้‚€ไฝ ไธ€่ตท็บตๆƒ…่ตท่Œƒๅ„ฟ\n1000 Processed\n classify content\n684500 0 ๅˆ†ไบซๅ›พ็‰‡ๅธธ็†Ÿ้ฃž่ทƒ็ฝ‘ๅ’–ๆ•ˆๆžœๅ›พ\n684501 0 ไนŸๆ˜ฏๅ”ฏไบŒ็š„ๆˆ‘keepๅˆฐ็Žฐๅœจ็š„้’้ป„ๆ–‡ๆœฌ&gt\n684502 0 ไนๅคงๅˆธๅ•†ไป…ไธ€ๅฎถ็œ‹็ฉบไธ‹ๅ‘จไธŠๆตท้™†ๅฎถๅ˜ดๅนถ่ดญ่”\n684503 0 ่ฏทๅ„ไฝ้ฉพ้ฉถๅ‘˜ไธฅๆ ผ้ตๅฎˆๆณ•ๅพ‹ๆณ•่ง„\n684504 0 ๅฆˆๅฆˆไผš่ฎฐๅฝ•ไฝ ๆˆ้•ฟ็š„็‚น็‚นๆปดๆปด\n1000 Processed\n classify content\n685000 0 ็ฉทๅจƒๅ€พๅ‘็†ๅทฅ็ง‘ๅฏŒๅจƒๆ›ดๅ็ˆฑๆ–‡็ง‘\n685001 0 ๅผบ็ƒˆๆŽจ่ไป–ๅฎถๆ‰‹ๆœบๅฃณ็ฎ€็›ดไธš็•Œ่‰ฏๅฟƒ\n685002 0 ๆฑŸ่‹็งปๅŠจ็”จๆˆท็š„ๅฅฝๆถˆๆฏโ€œxๆœˆxๆ—ฅ\n685003 0 ๅœจไผไธš็ ดไบงๆกˆไปถ็š„ๅฎก็†ไธญๅ‘ๆŒฅไบ†้‡่ฆไฝœ็”จ\n685004 0 p2pๅนณๅฐ็š„็จณๆญฅๅ‘ๅฑ•ๆ˜ฏๆ”ฟๅบœๆ”ฟ็ญ–ไฟๆŠคไธŽๆๆบไธ‹็š„ๅฟ…็„ถ็ป“ๆžœ\n1000 Processed\n classify content\n685500 0 
็›ฎ็š„ๅœจไบŽๅนณ?ไฟ้™ฉไบบไธŽๆŠ•ไฟไบบไน‹้—ด็š„ๅˆฉ็›Š\n685501 0 ็œŸๆƒณไธ€ๅ’ฌ็‰™ไนฐไธ‹่…พ่ฎฏใ€ๆœ็‹ใ€ไน่ง†ใ€็ˆฑๅฅ‡่‰บใ€่™พ็ฑณใ€ไผ˜้…ทโ€ฆโ€ฆ็š„VIP\n685502 0 ๆ„ฟๆˆ‘ไปฌ็š„\"ไธญๅ›ฝๅฟƒ\"่ƒฝๆ—ฉๆ—ฅ้ฉฑๅŠจไธญๅ›ฝๅคง้ฃžๆœบ็ฟฑ็ฟ”่“ๅคฉ\n685503 0 ๅฅฝๆƒณ็†ฌๅคœๅฅฝๆƒณ็†ฌๅคœๅฅฝๆƒณ็†ฌๅคœ้‡่ฆ็š„ไบ‹ๆƒ…่ฏดไธ‰้ๅฏๆ˜ฏโ€ฆๆ™šๅฎ‰\n685504 0 ๆขไธŠ็š„7ๆฃ’ๆŽๆณฝๆบๅ‡ปๅ‡บไธญๅค–้‡Ž่ขซไธญๅค–้™†ๆฏ…ๆŽฅๆ€ๅ‡บๅฑ€\n1000 Processed\n classify content\n686000 0 ็ซ™ๅŠกไบบๅ‘˜ๅฐ†ๆญฃๅผ่ฟ›้ฉปๅœฐ้“่ฝฆ็ซ™\n686001 0 ๅฏๆƒœ้“ถ่กŒ่ฟ˜ๆฒกๆœ‰ๅŒป้™ข็š„้ข„็บฆๆœๅŠก\n686002 0 ๅคง่ตžๅฐๆนพ่ญฆๅฏŸ็š„ๆ•ˆ็އๆ›ด่ƒœๅคง้™†ๅ…ฌๅฎ‰ไธŽ็พŽๅ›ฝ่ญฆๅฏŸ\n686003 0 ๅˆšๆˆ‘็œ‹lol็š„ๅฎฃไผ ็‰‡่ขซไป–ๅฌๅˆฐไบ†ไนŸ่ฆๆŒคๅœจไธ€่ตท็œ‹\n686004 0 200ๅนณ็š„ๅˆซๅข…่ฟ˜่ฆ่ฎพ่ฎกๆˆๅ››ๅฑ‚\n1000 Processed\n classify content\n686500 0 ๆ‰ฌๅทžๅŸŽๆœ‰ๆฒกๆœ‰ๆˆ‘่ฟ™ๆ ท็š„ๅฅฝๆœ‹ๅ‹\n686501 0 ๅธ‚ๅœบ่ทŒๅน…็š„็ฌฌไธ€ๅƒๅช่‚ก็ฅจๆญฃๅฅฝ่ทŒ9%\n686502 0 ไปŠๅคฉๅฐฑ้‡ๅˆฐไบ†ๅฟ…้กป่ฆๅ็”ตๆขฏ็š„่ทฏ\n686503 1 ้‡‘่‰ฒๅนดๅŽไธป้ข˜็ซๅง่ฎฉๆ‚จๅฐฝไบซ็พŽ้ฃŸ๏ผŒๆฌขๅ”ฑๆ— ้™๏ผๆŠขๅŒ…็ƒญ็บฟ๏ผšxxxxxxx\n686504 0 ๅ…ซๆ—ฌ่€ไบบๅดๆƒณๆŠŠๆˆฟไบง็•™็ป™้™Œ็”Ÿไบบ\n1000 Processed\n classify content\n687000 0 ๆญฃ่ง„/ๅฎ‰ๅ…จ/้ซ˜ๆ•ˆ/ๆ— ๅ‰ๆœŸไปปไฝ•่ดน็”จๅทฅ่–ช๏ผšๅ‡กๅทฅไฝœๆปก6ๆœˆ/ๆœˆ่–ช&gt\n687001 0 ๅฏไปฅ็œ‹ๅšๆ˜ฏไธ“ไธบOPPOFindx่ฎพ่ฎก็š„\n687002 0 ๆˆฟๅœฐไบงๆŠ•่ต„ไปๅœจไธชไฝๆ•ฐๅขž้•ฟไฝŽไฝๅพ˜ๅพŠ\n687003 0 ๆ„Ÿ่ง‰2ๅนด็š„ๆฐ”ๅžซ้ƒฝๆœ‰็€่ฝไบ†ไธ€ไธชๆ›ฟๆข่ƒฝ็”จๅŠๅนดๅคš\n687004 1 ่ฝฌๅ‘๏ผšๆ‚จๅฅฝ๏ผ่ฟ™้‡Œๆ˜ฏ็™พๅง“ๅคง่ฏๆˆฟ๏ผŒๆˆ‘ไปฌๆญฃๅœจๆžๅ…ณ็ˆฑๅฅณไบบ่Š‚ไนฐxxๅ…ƒ้€xxๅ…ƒๅˆธๆดปๅŠจ๏ผŒ็ฆพ็ฉ—้€Ÿๆ•ˆxๅ…ƒx็›’๏ผŒ...\n1000 Processed\n classify content\n687500 0 ไธญๅ›ฝ่ฏๅˆธไธšๅไผš็ง˜ไนฆ้•ฟๅญŸๅฎฅๆ…ˆๅœจ่ฏๅˆธๆŠ•่ต„ๅ’จ่ฏขๆœบๆž„ๅˆ›ๆ–ฐๅ‘ๅฑ•่ฎบๅ›ๆšจไธšๅŠกๅŸน่ฎญไผš่ฎฎไธŠ่กจ็คบ\n687501 0 /่ฒๅพ‹ๅฎพๆ‘ธ้ป‘ๅทไฟฎๅ—ๆฒ™ไป็ˆฑ็คโ€œๅๆปฉโ€็ ด่ˆน\n687502 0 ๆˆ‘ๅฌๅˆฐๆˆ–้‡ๅˆฐ็š„ๅ…ณไบŽๅฟซ่ฝฆ็š„ๆ—ฅๅธธ๏ผšๆ‰“ๅฟซ่ฝฆๅธๆœบๆ˜ฏๅŒไบ‹\n687503 0 ๆŠ•่ต„ไบบๅ’Œไธชไบบ่ดขไบงๅฏนไผไธš็š„ๅ€บๅŠกๆ‰ฟๆ‹…ๆ— ้™่ดฃไปป็š„ๅฎžไฝ“็ป่ฅๆจกๅผ\n687504 0 ็‹—็‹—ๅธฆๅˆฐๅฎ ็‰ฉๅŒป้™ขๆฃ€ๆŸฅ้ƒฝๅพˆๅฅๅบท\n1000 Processed\n classify content\n688000 0 
ๆฑŸ่‹้ฆ–ๆ‰นไธคๅ่˜ไปปๅˆถๅ…ฌๅŠกๅ‘˜ๅทฒๆญฃๅผไธŠๅฒ—ๅนด่–ชๅœจ18ไธ‡ๅทฆๅณ\n688001 0 ๅ่ฟ™ไธชๆ—ถ้—ด็š„้ฃžๆœบๆ˜ฏๅ› ไธบๅ‡Œๆ™จ่ˆช็ญไพฟๅฎœๅ—\n688002 0 ไธบไบ†ๆ›ดๅฅฝ็š„็”Ÿๆดปๆ€ๅฏ†่พพ~??่€ๆฟ็ป™ๅŒ…็š„็บขๅŒ…\n688003 0 ๆฎ่ฏดๆฏไธชไธƒๅนดไบบ็š„ๅ…จ่บซ็ป†่ƒžไผšๆ›ดๆขไธ€้\n688004 0 ็”ฑxxxxๅคšๅ่‹—ๆ—ๅทฅๅŒ ๅކๆ—ถไธคๅนดๅคšไฟฎๅปบไบ†่‹—ๆ—ๆ–‡ๅŒ–ๅปบ็ญ‘็พคโ€”โ€”่šฉๅฐคไน้ปŽๅŸŽ\n1000 Processed\n classify content\n688500 0 ไนฐๆˆฟๅ’Œๅ–ๆ–น้ƒฝๅบ”่ฏฅ่ฟฝ็ฉถๅˆ‘ไบ‹่ดฃไปป\n688501 0 ๆ—ฅๆœฌ่‡ชไปฅไธบๆŒค่ฟ›่ฅฟๆ–นๅคงๅ›ฝ้›†ๅ›ข\n688502 0 ่ฟ˜ๆœ‰ๅนฝ็ต็Šถ็š„ๅฆฎๅฆฎ็Œชๅ„ฟไฟฉๅงๅคซxxxxx่ฟ™่‡ช็”ฑ็š„ๆ„Ÿ่ง‰\n688503 0 ใ€Œ4ไธ‡ไบบๅœจไผ ็œ‹ใ€ไฝ ็ปๅฏนๆฒก่ง่ฟ‡็š„ๆ–นไพฟ้ขๆ–ฐๅšๆณ•\n688504 0 ๅฝ“ๅนดๅฐ้พ™ๅฅณ่ขซๅฐนๅฟ—ๅนณXXOO่ฟ‡ไบ†x\n1000 Processed\n classify content\n689000 0 ๅฟซๆฅๆฑŸ่‹้ฃŸๅ“่ฏๅ“่ŒไธšๆŠ€ๆœฏๅญฆ้™ข่ทŸๅฆนๅญๆฅไธ€ๅœบ็พŽไธฝ็š„้‡่งๅง\n689001 0 ไธบๅ…จๅŽฟๅ…ฌๅฎ‰ไบ‹ไธš็š„ๅ‘ๅฑ•่ฟ›ๆญฅไฝœๅ‡บๆ›ดๅคง่ดก็Œฎ\n689002 0 ๅพฎ่ฝฏWINDOWS10ๆ›ดๆ–ฐไธ‹่ฝฝ\n689003 1 ๅˆ˜ๆด็พŽๅฎนxยทxๅฆ‡ๅฅณ่Š‚ๅฝ“ๅคฉ๏ผŒๅ‡กๆฅๅบ—ไผšๅ‘˜ไป…้œ€xxๅ…ƒๅณๅฏ่ฎข่ดญไปทๅ€ผxxxๅ…ƒไปฅไธŠๅบท้ขœๅ•ๅ“๏ผŒ๏ผˆๆŒ‡ๅฎš็š„ไธ‰ๆฌพ...\n689004 0 ๆˆ‘ไปŠๅคฉๅœจๅŽไธบ็ฝ‘็›˜็ญพๅˆฐ่Žทๅพ—ไบ†xxxMๅ…่ดนๆฐธไน…ๅฎน้‡\n1000 Processed\n classify content\n689500 0 ๅฅฝๆƒณๅƒ้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—้ธกๅ—\n689501 0 ๆฏๅนดxโ€”xๆœˆๆ—ถ้—ดไธ็ญ‰็œ‹ๆŠŠไฝ ่‡ญ็พŽ็š„\n689502 1 ๆญ็ฅๅ…ƒๅฎต่Š‚ๅฟซไน๏ผ ้’ๅฒ›็ฒพๅฝฉ้ชŒๅŽ‚ๅ’จ่ฏข๏ผŒ็ซ‹่ถณ่ƒถๅทž๏ผŒไธ“ๆณจ้ชŒๅŽ‚ๅไบŒๅนด๏ผŒไธ€็ซ™ๅผๆœๅŠก๏ผŒ้€š่ฟ‡็އxxx...\n689503 0 ๅด่ขซไธไธ“ไธš็š„ๅŠไธšๅ†…็ˆ†ๆ–™ไบบๅฃซๆด—่„‘\n689504 0 ๆ™บ่ƒฝๆ‰‹ๆœบ็š„ๆ™ฎๅŠ็ป™ๅคงๅฎถๅธฆๆฅไบ†ๆ›ดๆ–นไพฟ็š„้€š่ฎฏ็”Ÿๆดป\n1000 Processed\n classify content\n690000 0 ๆƒณๆ—…ๆธธ็š„ๅฟƒๅœไธไธ‹ๆฅๆ˜Žๅคฉไน–ไน–ไธŠ็ญๆ™šๅฎ‰\n690001 0 ๆ˜ฏ็—…ๆฏ’ๅคช่ฟ‡ๅމๅฎณ่ฟ˜ๆ˜ฏ็™พๅบฆๅนถๆฒกๆœ‰ๆˆ‘ไปฌๆƒณ่ฑก็š„ๅผบๅคง\n690002 0 ๆ–‡่‰ฏๅˆ†ๅฑ€็ฆๆฏ’ๅฎฃไผ ๏ผš็็ˆฑ็”Ÿๅ‘ฝ\n690003 0 ๅ…ถๅฎžๆ•ด็†ไธ€ไธ‹ไนŸๆ‰ไธคไธช้ฃžๆœบ็›’่ฟ™ไนˆ็‚น\n690004 0 โ€œ่ดชๆฑกๅ’Œๆตช่ดนๆ˜ฏๆžๅคง็š„็Šฏ็ฝชโ€\n1000 Processed\n classify content\n690500 0 
ไธค่žไฝ™้ข้ชค้™้€พ400ไบฟๅ…ƒๅคšๅˆธๅ•†ๆš‚ๅœ่žๅˆธ\n690501 0 ๅ…ถไธญxไธชๆ‰นๆฌกๆฅ่‡ชๆฑŸ่‹็”Ÿไบงไผไธš\n690502 0 ๅ…ทๅค‡ไธ€ๅฎš่ดขๅŠกๆˆ–ๆณ•ๅพ‹ๅŸบ็ก€ๆ›ดๅฅฝ\n690503 0 ๆต็•…ไฝ ไปฌๆ˜ฏๅฏน่‡ชๅทฑ็š„ไบงๅ“ไธ่‡ชไฟก\n690504 1 ๆฑฝๅคงไผ—้›†ๅ›ข่ตžๅŠฉๆไพ›ๆขฆๆƒณๅˆ›ไธšๅŸบ ้‡‘ xxxxxxๅ…ƒ็Žฐ้‡‘ๅŠ่‹นๆžœ็ฌ”่ฎฐๆœฌ็”ต่„‘ไธ€ๅฐใ€‚่ฏท็ซ‹ๅณ็™ป ้™† bp...\n1000 Processed\n classify content\n691000 0 ไพ็„ถไป็„ถๆœ‰ไบบ่ง‰ๅพ—ๆˆ‘ๆ˜ฏๅ…ณ็ณปๆˆท~ๅฅฝๅง”ๅฑˆ\n691001 0 ็œๆฃ€ๅฏŸ้™ข็›‘ๅฏŸๅค„ไปป็ซ‹ๆ–ฐๅ‰ฏๅค„้•ฟไธ€่กŒ4ไบบ่Ž…ไธด็“ฎๅฎ‰ๅŽฟ้™ข\n691002 0 ๆไพ›ๅŒ—ไบฌๅคงๅญฆ่‚ฟ็˜คๅŒป้™ขๆœ€ๆ–ฐ่ต„่ฎฏ\n691003 0 ็žŽๅ”ฑ็š„โ€ฆๅ“ๅˆฐ่ง่ฐ…ๅ“ˆโ€ฆ่ฏ•ๅฌๅœฐๅ€&gt\n691004 0 ~~~~ๅฏนไธ“ไธš็š„ๅฎ‰้˜ฒ่กŒไธšๆœ‰ไบ†่งฃ็š„ๆœ‹ๅ‹\n1000 Processed\n classify content\n691500 0 ้ฃžๆœบๆ™š็‚นๅˆฐ่พพๆ—ฅ่ˆช้…’ๅบ—ๅทฒ็ป4็‚น\n691501 0 23ๅ—ไบฌ็ฆ„ๅฃๆœบๅœบๆŽฅๆœบ2015\n691502 0 ๆฏ”ๅฆ‚ๅ’จ่ฏขๅธˆๅฏนๆˆ‘็š„่งฃ้‡Šๆ˜ฏโ€œ่ฆๅ…ˆๅญฆไผš็ˆฑ่‡ชๅทฑโ€\n691503 0 ๅค–ๅŠ ๆŒ็ปญไฝฟ็”จ3'5ไธชๅฐๆ—ถๆ‰็”ต้‡ไธ่ถณ\n691504 0 ๆฌง็พŽ่–„ๆฌพ่•พไธๆƒ…่ถฃๆ€งๆ„Ÿ่ฏฑๆƒ‘\n1000 Processed\n classify content\n692000 0 ๅšALPHAๅฅ—ๅˆฉๅ’ŒCTA้‡ๅŒ–ไบคๆ˜“\n692001 0 ็Ÿฅ้“ไธญๅ›ฝๅฅฝๅฃฐ้ŸณWhyๆฒกไบบๆ•ขๅ”ฑๅตฉๅ“ฅ็š„ๆญŒๆ›ฒ\n692002 0 ็ƒŸๅฐ7ๆœˆไปฝ้ฟๆš‘ๆ—…ๆธธๆŒ‡ๆ•ฐๆŽ’ๅๅ…จๅ›ฝ็ฌฌๅ…ญ\n692003 1 ไปŠๅคฉ็บข้ฆ†ๆญฃๅผๅผ€ไธš๏ผŒ้…ฌๅฎพไผ˜ๆƒ ๆดปๅŠจๅŠ›ๅบฆๅคง๏ผŒ็พŽๅฅณไนŸๅพˆๅคš๏ผŒๆ™šไธŠ่ฟ‡ๅŽป็Žฉไผšๅง๏ผŒๅœฐๅ€:ๅฎ้พ™ๅคงๆถฆๅ‘่ถ…ๅธ‚ๅฏน้ข๏ผŒ...\n692004 1 ๅ†œ่กŒxxxxxxxxxxxxxxxxxxx้ƒ‘ๅ–ปๆฐ\n1000 Processed\n classify content\n692500 0 ๆœ‰่ฎค่ฏ†่‹ๅทž้›•ๅˆปๅธˆๅ‚…็š„่ฏท่”็ณปๆˆ‘\n692501 0 ้’ˆๅฏนๆœชๆฅ่ฟ่ฝฝ็ซ็ฎญๅ‘ๅŠจๆœบๅ‘ๅฑ•็š„้œ€่ฆ\n692502 0 ๅ› ไธบ่ฟ›ๆฐดไผšๅฏนๆ‰‹ๆœบๅ†…้ƒจๅคšๅค„ๅ…ƒๅ™จไปถ้€ ๆˆๆŸไผค\n692503 0 ๅŽŸๅ› ๆ˜ฏ๏ผšไฝ ๅŽปๅ’จ่ฏขไบ†ไฝ ็š„ไธ€ไฝๆ‹ฟ็€3ๅƒๅ—ๆœˆ่–ชๅทฅ่ต„็š„ๆœ‹ๅ‹\n692504 0 ๆ‰“ๅผ€Google้—ฒๆฅๆ— ไบ‹~็Žฉไบ†ๅๅ…ณๅฅฅ็‰นๆ›ผๅฐๆธธๆˆ~่ง‰ๅพ—่‡ชๅทฑ่Œ่Œๅ“’~\n1000 Processed\n classify content\n693000 0 ๆต™ๆฑŸๅฐๅพ—้‚ฃไธชๅดไบฆๅ‡กๅพ—้‚ฃไธช่Š‚็›ฎไผšๆฏ”่Š’ๆžœๅฐๅพ—ๅถๅƒๆฅไบ†ๆ”ถ่ง†็އ้ซ˜\n693001 0 ่ฎค่ฏไฟกๆฏไธบโ€œๅฟตๅฟตไป™ๅฅณ้›†ๅ›ข้ฆ™ๆธฏๆœ‰้™ๅ…ฌๅธ่‘ฃไบ‹้•ฟโ€\n693002 0 
ไฝ†ๆ˜ฏๆœ‰ไธ€็‚นโ€ฆโ€ฆๅคชๅ›ๅˆฐๅบ•ๅคšๅคง่ƒฝ่€\n693003 0 ๆ—ฅๆœฌๆŽจๅ‡บๆŠ—ๆˆ˜ๅ‰งไธญๅ›ฝ็ˆ†็ฌ‘ๆŠ—ๆˆ˜้›ทๅ‰ง็›˜็‚น\n693004 1 ๅ„ไฝๅก‘ๆ–™ๆ‹‰็‰™ๅˆทไธใ€ๆ‰ซๆŠŠไธ็š„ๅ„ไฝ่€ๆ€ป๏ผšๆ‚จๅฅฝ๏ผๆœฌไบบ่”กๅบ†ๆ…งไปŽไบ‹่ฏฅ่กŒไธšๅคšๅนด๏ผŒๅ…ทๆœ‰ไธฐๅฏŒ็ฎก็†็ป้ชŒๅ’Œ็”ŸไบงๆŠ€...\n1000 Processed\n classify content\n693500 0 ๆถ้ญ”ๅ””ๅผ€PPTๅกๆฉๅ””็†ๅŽปๅ•ฆไฝ›็ฅ–ๅก้‡Œๆ™ฎๆ‹‰ๅคšๆˆ‘ๅฑ‹ไผๆˆ‘ๆ‰“ไบๅ•Šๅฏๆฒกๆ„ๆ€ๅŽปๅ•ฆๆๅ–ๆ‹‰้”ฏ็”˜ๅ’ฏ็š„่ทฏไธŠๅŽปๅ’ฏๆฒกๆœ‰...\n693501 1 ไปป่ทƒๆณข๏ผŒๅทฅ่กŒxxxxxxxxxxxxxxxxxxx๏ผŒๅปบ่กŒxxxxxxxxxxxxxxxxxxx ใ€‚\n693502 0 ๅคง่ƒ–่พนๅˆทๅพฎๅš่พน่ฏด๏ผš่…พ่ฎฏๅ‘˜ๅทฅ็ซŸ็„ถๆœ‰ๅธฆ็€ๅทฅ็‰Œ่ฟฝๅฆน็บธ็š„\n693503 0 ไบค่ญฆไนŸๅธฆ้˜ฒๆ™’่ข–็š„้Ÿฉๅ›ฝไปฃ่ดญๆณ•ๅ›ฝไปฃ่ดญๅ…็จŽๅบ—ไปฃ่ดญ\n693504 0 ๅฏนไบŽๆ— ๆ…ขๆ€งไน™่‚ๆ„ŸๆŸ“ๅฎถๅบญ่ƒŒๆ™ฏไธ‹็—…ๅ‹่€Œ่จ€\n1000 Processed\n classify content\n694000 0 ๆฑŸ่‹็œๆณฐๅ…ดๅธ‚ๆตŽๅท่ก—้“ๅŠž่ฟๆณ•ไนฑ็บช\n694001 0 ๆœ‰ๆฒกๆœ‰ๅฐไผ™ไผด8ๆœˆๅˆๅผ€่ฝฆๅ›žๅพๅทžๆˆ–่€…่ทฏ่ฟ‡ๅพๅทž็š„\n694002 0 โ€ๅ…ฌ่ฏ‰ไบบ็งฐๆกˆๅ‘ๆ—ถไป–ๆœ‰ๅฎŒๅ…จๅˆ‘ไบ‹่ดฃไปป่ƒฝๅŠ›\n694003 0 ๅ› ไธบๆˆ‘ๅฟƒๅฟƒๅฟตๅฟต็š„ๅ“†ๅ•ฆAๆขฆๆ—…่กŒ็ฎฑ็ปˆไบŽๅˆฐไบ†\n694004 0 ไธ€ไธช้กถไธ‰ไธชไธ€ๅชๆžๅฎšไบ”ๅคง่‚Œ่‚ค้—ฎ้ข˜ๅ“ฆ\n1000 Processed\n classify content\n694500 0 ๆ‰€ไปฅๆˆ‘ๅฎถๅฟซ่ฆๆ‰“ๆ’ๅคง็š„ๆฑŸ่‹่ˆœๅคฉ้™คไบ†ๅญ™ๅฏ\n694501 0 ็Žฐๅœจ็š„ๅ‚ป้ƒฝๆ˜ฏๆ‰€่ฐ“็š„ไปทๅ€ผๆŠ•่ต„่€…\n694502 0 ๅพฎ่ฝฏๅ‡่ฎฐ่ฏบๅŸบไบšๆ‰‹ๆœบไธšๅŠกๅนถ่ฃๅ‘˜xxxxไบบ\n694503 0 ๅฏผๆผ”ๅฏๅฐใ€ๆผ”ๅ‘˜ๅฏๅฐใ€ๆŠ•่ต„ไนŸๅฏๅฐ\n694504 0 ๆœฌๆฌกๅฏนๆŽฅไผšๅฐ†ไบŽ7ๆœˆ29ๆ—ฅๅœจ้•‡ๆฑŸไธพๅŠž\n1000 Processed\n classify content\n695000 0 ็–‘ไผผMH370้ฃžๆœบๆฎ‹้ชธๅœจๆณ•ๅฑž็•™ๅฐผๆ—บๅฒ›ๆฒฟๅฒธๅ‘็Žฐ\n695001 0 333ใ€ไบ”ๆด‹ๅปบ่ฎพ้›†ๅ›ข่‚กไปฝๆœ‰้™ๅ…ฌๅธๆˆฟๅฑ‹ๅปบ็ญ‘ไธšๆต™ๆฑŸ\n695002 0 ๅˆทๅพฎๅš็ญ‰่Šฑๅƒ้ชจๅ•Šๅ•Šๅ•Šๅ•Šๅ•Šๅ•Šๅ•Šๆ„‰ๅฟซ็š„ๆš‘ๅ‡ไธ่ฆ็ฆปๆˆ‘่€ŒๅŽป\n695003 0 ๅœจๅคง้˜Ÿ่ฟๆณ•ๅค„็†็ช—ๅฃใ€ๆœๅŠกๅŒบใ€ๆญฆไธœๅกๅฃ็ป„็ป‡ๆ’ญๆ”พไบค้€šๅฎ‰ๅ…จๅฎฃไผ ็‰‡\n695004 0 ่ฏท็œ‹ๆœ€ๅŽไธ€ๅผ ๅ›พๅˆ’็บฟ้ƒจๅˆ†๏ผšๅคๆดปๅฅณๆ€ง่กฐ้€€ๅตๅทข็ป†่ƒž\n1000 Processed\n classify content\n695500 0 ๅƒไธ‰ๅ››ๅนดๅ‰googleๅœฐๅœ–ๆˆ–googlelatitudeไธ€ๆจฃ\n695501 0 
ๆ˜จๆ™š้ฆ–้ƒฝๆœบๅœบ้ฃžๆœบไธŠ็ญ‰ๅพ…3ๅฐๆ—ถ่ตท้ฃž\n695502 0 ไบš้ฉฌ้€Š่ฟ˜ๅ€พๆƒ…ๆŽจ่ไบ†ไธๅฐ‘ๅฆ็ฑปไนฆ็ฑ\n695503 0 ไปŠๅนด็š„ๅ‡‰้ž‹ๆ€Žไนˆๅฐฑ่ฎพ่ฎก็š„่ฟ™ไนˆไธ‘\n695504 0 6ใ€ๆตทๆด‹ๅฑ€๏ผšๅ…จๅ›ฝๅทฒๅปบๆˆๆตทๆฐดๆทกๅŒ–ๅทฅ็จ‹112ไธช\n1000 Processed\n classify content\n696000 0 ๆ•ดไธชๆ— ้”กๅธ‚็š„ๅคงๆฐ”่‡ชๅŠจ็›‘ๆต‹็‚นๅฐ†่พพๅˆฐ18ไธช\n696001 0 xไธ‡่ค็ซ่™ซๆ”พ้ฃžใ€ๆตชๆผซๆฑ‚ๅฉšใ€ๅ•ค้…’็‹‚ๆฌขใ€้Ÿณไน็พŽ้ฃŸใ€ๅธ็ฏท้œฒ่ฅ\n696002 0 ๅ…ถไป–ๆ— ่ฎบ้’ๅฒ›็ƒŸๅฐๅจๆตท่ฟžไบ‘ๆธฏ้€š้€šๅชๆœ‰็กฌๅบง\n696003 0 ๅ…ถๅฎž่ฟ™ๅ‡ ้›†่Šฑๅƒ้ชจ็š„ๅ†…ๅฎนๅฎŒๅ…จๅฏไปฅๆฆ‚ๆ‹ฌไธบๅฐ้ชจ๏ผšๅธˆ็ˆถ~~~ๅธˆ็ˆถ~~~ๅ–ๆˆ‘็š„ๆฏ’ๅ–ๆˆ‘็š„ๆฏ’ๆฑ‚ไฝ ไธ่ฆๆŠŠๆˆ‘้€...\n696004 1 ้ญ…ๅŠ›ๅฅณไบบ่Š‚๏ผŒ็บฆๆตไธ‰ๆœˆๅคฉใ€‚ๅƒ็™พ่‰ฒๅŒ–ๅฆ†ๅ“ๅ…จไฝ“ๅ‘˜ๅทฅๆ•ฌ็ฅๅ„ไฝๆ—ถๅฐšๅฅณ็ฅž่Š‚ๆ—ฅๅฟซไน๏ผŒ้’ๆ˜ฅๆฐธ้ฉป๏ผไธบไธŽๆ‚จๅ…ฑๅบฆ็พŽ...\n1000 Processed\n classify content\n696500 0 ่ฟ‘ๆ—ฅๅธฆๆฅiPhone6ๅ’ŒiPhone6Plusๆ‰‹ๆœบๅฃณ\n696501 0 ็œ‹็”ต่ง†ๅ‰ง็œ‹ๅพ—ๅฟ˜่ฎฐๅฅฝๅฃฐ้Ÿณไบ†ๅ•Šๅ•Šๅ•Š\n696502 0 2็›ธๅฏนๅ…ถไป–่‚ก็ฅจๆ˜ฏๅœฐๅœฐ้“้“็š„ๅฐ็›˜่‚กๅ’Œ็ปฉไผ˜่‚ก\n696503 0 ้ซ˜้‚ฎไธญๅญฆๅด็ฟไปฅ389ๅˆ†้ซ˜่€ƒๆˆ็ปฉไฝๅฑ…ๅ…จ็œ็ฉบๅ†›ๆ‹›้ฃžๆ–‡ๅŒ–ๅˆ†็ฌฌไธ€\n696504 0 ๆˆ‘ไปฌๅญฆๆ ก็ป่ดธ็ณป17ไธชๅญฆ็”Ÿๅฏไปฅไบซๅ—ๅ…ฌ่ดนๅŽปๆฑŸ่‹ๆตท้—จๅ‚ๅŠ ๅฎž่ทตๆดปๅŠจ\n1000 Processed\n classify content\n697000 0 ๆˆ‘ๆ€Žไนˆ่ง‰ๅพ—่Šฑๅƒ้ชจไธๆ˜ฏ็™ฝๅญ็”ป็š„็”ŸๆญปๅŠซ\n697001 0 ็”ต่„‘ๅ‰ๆ†‹ๆ–ฐไธ€ๆœŸๅ‘˜ๅทฅ็Šถๆ€่šๅŠฟๅŸน่ฎญPPTไธญ\n697002 0 ไฝฟไธ€้ƒจๅˆ†ไบบๅคง่Žทๅ…ถๅˆฉ่€Œๅฆไธ€้ƒจๅˆ†ไบบๆทฑๅ—ๅ…ถๅฎณ\n697003 0 ่ฏดๆ˜ฏ่ฆ็ป™้’ไบ‘ๆธ”ไธšๅคง้˜Ÿๅผ„ๆ‹†่ฟๆˆฟโ€ฆ\n697004 0 ไน‹ๅ‰็”จ้œฒๅพ—ๆธ…็š„้˜ฒๆ™’็œŸ็œŸ็š„้ธก่‚‹ไฝ†ๆ˜ฏไป–ไปฌๅฎถ็š„fastabsorbingๆŠคๆ‰‹้œœ่ฟ˜ๆ˜ฏ่›ฎๅฅฝ็”จ็š„ๆป‹ๆถฆๅบฆๅพˆ...\n1000 Processed\n classify content\n697500 1 ไฝ ๅฅฝ๏ผxxxxxxxxxxx้™ˆ็บขๅœจๆœฌๅธ‚ๅˆถไฝœๅ„็งๆฏ•/ไธš่จผ\n697501 0 ๅ—ไบฌไปฒๅคๅคœ่ค็ซ่™ซ่ฝป่ˆž้ฃžๆ‰ฌใ€€ๅฆ‚็นๆ˜Ÿๅ ่ฝไบบ้—ด้šพ้“ๆ˜ฏๅ—ไบฌๅคงๅฑ ๆ€็š„ไบก็ตโ€ฆโ€ฆ\n697502 0 ๅ–„ๅ‘ณ้˜้–ๆฑŸ็Œช่‚‰่„ฏ่‚‰ๅนฒxxxgxx่ข‹\n697503 1 ็ผค็บท้‡‘ๆŸœๅฒๆœซๅคง้…ฌๅฎพ๏ผŒๅ•ค้…’??ไธ€ๅƒไธ‰็ฎฑ๏ผŒๆด‹้…’่ฝฉVไนฐไธ‰้€ไธ‰๏ผŒไธญๅœบ็ซ่พฃๅˆบๆฟ€ๅฐๆธธๆˆ๏ผŒ็ฉบๅง็š„ๅˆถๆœ่ฏฑๆƒ‘๏ผŒ...\n697504 0 
ไป–ๅˆ่ชๅŒๅกซๅนณAๅ€ๆฒณ้“่งฃๆฑบ้ป‘ๆฒ™็’ฐๆตท้‚Š้•ทๆœŸ่‡ญๅ‘ณๅ•้กŒ\n1000 Processed\n classify content\n698000 0 ๅผ **ๅผ€้—ญๅน•ๅผๅฟƒๆƒ…ๅคๆ‚\n698001 0 ็„ถๅŽๆˆ‘pia็š„ๆŠŠไธ€ไธช้“็›’็ขฐๅˆฐไบ†ๅœฐไธŠ\n698002 0 ๆˆ‘ๅคšๅนดๅ‰TVB็‚บๆˆ‘้–‹็š„้จฐ่จŠๅพฎๅš่ขซ้ป‘ๅฎขๅ…ฅไพต\n698003 1 ๆธฉ้ฆจๆ็คบ๏ผšไฝ“่‚ฒ้ฆ†ๅˆ›ๆ–ฐๆ–‡ๅŒ–่‰บๆœฏๅŸน่ฎญๅญฆๆ กไธบๅบ†็ฅๆœฌๆ กๅปบๆ กไธ‰ๅ‘จๅนดๅบ†๏ผŒไบŽxxxxๅนดxๆœˆxxๆ—ฅไน‹ๅ‰ๅˆฐๆˆ‘ๆ ก...\n698004 0 /ๆธฏๅคงๅญฆ็”Ÿๅ†ฒๅ‡ปๆ กๅง”ไผšๅ†…ๅน•๏ผšๅญฆ็”Ÿไผš้•ฟๅ……ๅฝ“โ€œๆธฏ็‹ฌโ€ๅ†…ๅบ”\n1000 Processed\n classify content\n698500 0 ๅพฎ่ฝฏ็š„็ณป็ปŸ้€šๅธธๆ˜ฏไธ€ไปฃๆˆๅŠŸไธ€ไปฃๅคฑ่ดฅ็š„ๆญปๅพช็Žฏๅฆ‚ไปŠwin7ๆˆๅŠŸไบ†Win8ๅคฑ่ดฅไบ†็„ถๅŽwin10ไผšๆˆๅŠŸๅ—\n698501 0 ไธœ่Žžๆ”ฟๅบœ่ฟ˜ๅฟ˜ไบ†ๅšไธ€ๆœ€้‡่ฆ็š„ไบ‹\n698502 0 ่‡ชไฝ“่„‚่‚ชๅนฒ็ป†่ƒžๅกซๅ……่‹นๆžœ่‚Œ่‚‰ๅฏไปฅ็žฌ้—ดๅ˜่ถ…็บง็”œ็พŽๅ“ฆไธ€่ˆฌๅš3ๆฌกๅทฆๅณๅฏไปฅๆฐธไน…\n698503 0 ๆ™ฎไบฌๅ›žๅ‡ป่ดจ็–‘๏ผšxxxxๅนดไธ–็•Œๆฏๅฐฑๅœจไฟ„็ฝ—ๆ–ฏไธพ่กŒ\n698504 0 ๅ›ฝๅ†…้ป„้‡‘็™ฝ้“ถtd่ตฐๅŠฟ็ปˆไบŽๅœจ่ฟž็ปญไธ‹่ทŒ็š„ๆ‰“ๅ‡ปไธ‹่ตฐไฝŽ\n1000 Processed\n classify content\n699000 0 ๅ—ไบฌๅณๆ—ฅๆญฃๅผๆ”นๅไธบ้œๅฒๅฐผ็Ž›\n699001 0 13ๅนด็š„ไปŠๅคฉ็ฌฌไธ€ๆฌกๆŠต่พพ่‹ๅทž\n699002 1 ๅคงๅ—้—จๅ“ฅๅผŸๆ—ฉๆ˜ฅๆ–ฐๆฌพๅˆฐๅบ—๏ผŒๆ—ถๅฐšๅฐ่Šฑ่กฌ่กซใ€baba่“้Ÿฉ็‰ˆ้ฃŽ่กฃ.ๅฝฉ่‰ฒ็‰›ไป”่ฃคใ€ๆ›ดๅคšๆƒŠๅ–œๆœŸๅพ…ๆ‚จ็š„ๅˆฐๅบ—ไฝ“...\n699003 0 ไธชไบบๅปบ่ฎฎ๏ผšๅœจๆณ•้™ขๅˆคๅฎšๅŽ่ดฅ่ฏ‰ไปไธไบค็‰ฉไธš่ดน็š„ๆ–นๅฏ็บณๅ…ฅๅคฑไฟกๅๅ•ๆœ›ๅ†ณ็ญ–่€…้‡่ง†\n699004 0 ๆฆ†้˜ณๅŒบๅŠณๅŠจ็›‘ๅฏŸๅคง้˜Ÿ็š„ๅทฅไฝœๅบ”่ฏฅ็”ฑ่ฐๆฅโ€œ็›‘ๅฏŸโ€\n1000 Processed\n classify content\n699500 0 ๅŒป็–—ๆœๅŠกไธไป…ไป…ๆ˜ฏๅŒป้™ข็š„้—ฎ้ข˜\n699501 0 1ใ€ๅฎถๅบญๆˆฟๅฑ‹่ฃ…ไฟฎ้ฃŽๆฐดโ€”็Ž„ๅ…ณ็Ž„ๅ…ณ็Žฏๅขƒๅ‡Œไนฑไผšๅฏผ่‡ด็ฉบๆฐ”ๆททๆตŠ\n699502 0 โ€œ้ฒๅ…ฌโ€้“้ขๆ— ็งๅดๅˆไพ ้ชจๆŸ”ๆƒ…\n699503 0 ๅœฐๅ€๏ผš็Ÿณๅคง็ ”็ฉถ็”ŸๅŸน่ฎญไธญๅฟƒไธ–ๅšๅฝฑ่ง†่‰บๆœฏๅญฆๆ ก\n699504 0 ๆฏไธชๆณ•้™ข้ƒฝๆœ‰้‚ฃไนˆไธ€ไบ›่ต–็šฎ็š„ไบบ\n1000 Processed\n classify content\n700000 0 ๆฑŸ่‹ๅ—ไบฌไธœๅ—ๅคงๅญฆ้™„ๅฑžไธญๅคงๅŒป้™ขๆถˆๅŒ–็ง‘็š„ๅ‰ฏไธปไปปๅŒปๅธˆๆฏ›็ฟ ๅŽๅœจๅฎถไธญๅšๅฎŒๆ—ฉ้ฅญๅŽไธŠๅŽ•ๆ‰€\n700001 0 
้ป‘็ณ–ไฝฟ็”จๅฅฝๆ–นๆณ•1้ป‘็ณ–ๆณกๆฐดๅ’Œ็บขๆžฃๆก‚ๅœ†็…ฎ10ๅˆ†้’Ÿๅทฆๅณ\n700002 0 ๆœ€่ฟ‘็ช็„ถๅฏนๆˆฟๅญ็š„่ฃ…ไฟฎๅคšไบ†็‚นๆƒณๆณ•\n700003 0 ๅฝฉๅก˜้•‡ๆ”ฟๅบœๆŽฅๅˆฐ็ˆ†ๆ–™ๅŽๅๅˆ†้‡่ง†\n700004 0 ๆไพ›ๅคงๅฎถ้ธๆ“‡ๅ“ชๅ€‹็‰ˆๆœฌ้ฉๅˆ่‡ชๅทฑๅ”ท\n1000 Processed\n classify content\n700500 0 ๅฎƒ็š„ๅŽๅฃณๅ†…ไพงๆœ‰ไธ€ๅฅForOurPrincess\n700501 0 ่ฐข่ฐขๅฐ็œŸๅ’Œไบ”ๅ“ฅไธคไฝๅŒป็”Ÿ็š„ไธ“ไธšๅธฎๅŠฉ\n700502 0 ไฝ†็›ฎๅ‰ๆ”ฟๅบœ็š„ๆ˜Ž็‰Œ3600ไพ็„ถ้žๅธธๅšๆŒบ\n700503 1 ้“ถ่กŒๆ— ๆŠตๆŠผ่ดทๆฌพ๏ผŒๅฝ“ๅคฉไธ‹ๆฌพ๏ผŒๅˆฉๆฏๅ…จๅธ‚ๆœ€ไฝŽ๏ผŒxxxxxxxxxxxๅค็ป็†\n700504 0 ไนŸๆ˜ฏๆ‘ฉๅฐ”90ๅนดไปฃๆผ”็š„ไธบๆ•ฐไธๅคš็š„ๅ‚ป็™ฝ็”œ่ง’่‰ฒ\n1000 Processed\n classify content\n701000 0 ไธ่ฆๅ› ไธบๆๅˆฐ้’ฑๅฐฑๆ˜ฏ็ปๆตŽ็บ ็บท\n701001 0 xใ€ไธ่ฆๆŠŠๆฑฝๆฒนใ€็ˆ†็ซน็ญ‰ๆ˜“็‡ƒๆ˜“็ˆ†็š„ๅฑ้™ฉๅ“ๅธฆๅ…ฅ่ฝฆๅ†…\n701002 0 ๅฝ“ๆ–นๅœ†ๅ†ณๅฎšๅ›žๅŒ—ไบฌไน˜้ฃžๆœบๆป‘่ฟ‡ๅคฉ็ฉบ็š„ๆ—ถๅ€™\n701003 0 90ๅฒ็š„้ป„่ดค้€šๅฝ“็€่ฎฐ่€…้ขๆšดๆ‰“่€ๅฉ†ๆ— ่ง†ๆณ•ๅพ‹ๅฎžๆ–ฝๅฎถๆšดๆ˜ฏไธฅ้‡็Šฏ็ฝช\n701004 1 ๆ™ฎๅฎxxxๅฎข่ฟๅ…ฌๅธๆ„Ÿ่ฐขๆ‚จ็š„ๆฅ็”ต๏ผŒๆœฌ็ซ™ๆไพ›ๅฟซ่ฝฆๆ™ฎๅฎ่‡ณๅนฟๅทžไธ“็บฟ๏ผŒๅฎ‰ๅ…จ๏ผŒๅฟซๆท๏ผ่ฎข็ฅจ็ƒญ็บฟ๏ผšxxxxx...\n1000 Processed\n classify content\n701500 0 ๅฝ“ไฝ ไธๆ˜ฏๅฎ‰ๅ“ๆ‰‹ๆœบๅ’Œ่‹นๆžœๆ‰‹ๆœบ็š„ๆ—ถๅ€™\n701501 1 ๆ‚จๅฅฝ๏ผ้ฆ–ๅ…ˆ็ฅๆ‚จๅฟƒๆƒ…ๆ„‰ๅฟซ๏ผๅทฅไฝœ้กบๅˆฉ๏ผxๆœˆxๅท--xๆœˆxxๅทไบšไธน่กฃๆŸœๅฐ†ไธพ่กŒ็››ๅคงๆดปๅŠจ๏ผŒไนฐ็งป้—จ้€ๆŸœไฝ“...\n701502 0 ๅ—ไบฌๆฑŸๅฎ้บ’้บŸ้—จ้™„่ฟ‘ๆœ‰ๅฐๆœ‹ๅ‹ๆƒณๅญฆ็”ป็”ปๅ—\n701503 0 ็ซ่ฝฆไธœ็ซ™ๅ…ฌ่ทฏๆฑฝ่ฝฆ็ซ™ๅผ€้€šๆ…ˆๆบช็ญ่ฝฆ\n701504 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ4๏ผšๅˆถไฝœๆ–นๆ•ดไธชๅŒ็›ฒๆจกๅผๅ‡บๆฅ\n1000 Processed\n classify content\n702000 0 ๅˆ†ไบซไผ˜้…ท่…พ่ฎฏ่ง†้ข‘็ˆฑๅฅ‡่‰บไผšๅ‘˜่ดฆๅท\n702001 0 ๆ„ฟๆ„็”จๅๆ–ค่‚‰ๆขๅ‘จไบ”้ฃžๆœบไธๅ–ๆถˆ่ˆช็ญ\n702002 0 30ๆ€ไบบๆกˆ็ปๆ‰ฌๅทžใ€ๆฑŸ้ƒฝไธคๅœฐ่ญฆๆ–นๅ…ฑๅŒๅŠชๅŠ›\n702003 0 ๆ˜จๆ™š่ขซๆ”พ้ฃžๆœบไปŠๆ™šๅˆ่ขซๆ”พ้ฃžๆœบ\n702004 0 ไบฒ็œผ็œ‹็€่ฝฆ็ฅธๅ‘็”ŸๅˆฐๅŒป็”Ÿ็กฎๅฎšๆญปไบก็›–ไธŠๅธƒ็š„่ฟ‡็จ‹ๆฏ”ๅƒ้ป„่ฟž่ฟ˜้šพๅ—โ€ฆโ€ฆ\n1000 Processed\n classify content\n702500 0 ๆ˜ฏไป–่…พ่ฎฏ่‡ชๆˆ‘้ข ่ฆ†ๆœ€ๆˆๅŠŸใ€ไนŸๆ˜ฏๅ ช็งฐ็งปๅŠจไบ’่”็ฝ‘ๆœ€ๆˆๅŠŸ็š„ไบงๅ“ๅพฎไฟก\n702501 0 
่ฏท่ทŸๆˆ‘ไธ€่ตท่ฏป็พŠๆฏ›่พฃๅญ/ๅ‘ฒ็‰™/ๅ‘ฒ็‰™/ๅ‘ฒ็‰™\n702502 0 ็”จๅ›พ่งฃ่ทŸไฝ ็ป†่ฏด็งป่‹—โ—†ๅŽŸๆ–‡๏ผš\n702503 0 ไปŠๅคฉๆ—ฉไธŠๅœฐ้“็œ‹ๅˆฐ็š„ไธ€ๅฅ่ฏ๏ฝž\n702504 0 ไธ€xๅฒ็”ทๅญฉๅณๅฐ†ไปŽx็ฑณ้ซ˜้˜ณๅฐๅ ไธ‹ๆ—ถ\n1000 Processed\n classify content\n703000 0 ๅธ‚ไบบๆฐ‘ๆฃ€ๅฏŸ้™ขๅ…š็ป„ไนฆ่ฎฐใ€ๆฃ€ๅฏŸ้•ฟ่€ฟๆ ‡ๅธฆ้ข†ๅ…š็ป„ไธญๅฟƒ\n703001 0 ไธบไป€ไนˆไธๆ˜ฏ้‚ฃไบ›ไปฅๆƒ่ฐ‹็ง็š„ใ€ๅ€ŸๆƒๅŠ›ๅฏป็งŸ็š„ๅไบบ่ตฐ\n703002 0 ไนŸ็ฎ—ๆ˜ฏไธชๆฏ›็—…็œ‹็”ตๅฝฑ็”ต่ง†ๅ‰งไป€ไนˆ็š„ๆˆ‘ๅ–œๆฌขๅ‰ง้€ๅ–œๆฌข็œ‹ไน‹ๅ‰ๅฐฑไบ†่งฃๅ‰งๆƒ…่ตฐๅ‘ๅ’Œ็ป“ๅฑ€ไปŠๅคฉ็œ‹gonegirl...\n703003 0 ้‚ฃๆ—ถๅ€™ๆ€ปๆ˜ฏๅœจๆƒณ้ฃžๆœบไธŠๅ็š„้ƒฝๆ˜ฏไป€ไนˆไบบๅ‘ข\n703004 0 ้ฟๅ…ๆ”ฟๅบœๅœจ็คพไผš็Ÿ›็›พไธญๅค„ไบŽ้ฆ–ๅฝ“ๅ…ถๅ†ฒ็š„ไฝ็ฝฎ\n1000 Processed\n classify content\n703500 0 ๅŒ—ๆˆดๆฒณโ€”โ€”A่‚กใ€ๅœบๅค–้…่ต„โ€”โ€”ไบŒ็บงๅธ‚ๅœบๅฝฑๅ“ไธ€็บงๅธ‚ๅœบโ€”โ€”่ž่ต„็Žฏๅขƒๅ˜ๅŒ–\n703501 0 ็”ท็ซฅไน˜็”ตๆขฏ่ขซๅคน่บซไบกๅŽ็ปญ๏ผšๅไผš็งฐๆขฏๅค–่ขซๅคนๆ— ่ง„ๅฎš\n703502 0 ๆฐ”ๆญปๆˆ‘ไบ†โ€ฆโ€ฆ่ฐทๆญŒๆ‰“ๅผ€ๅพฎๅš็ซŸ็„ถ้ป‘ๅฑไฝ ไฟฉ็›ธๅ†ฒไนˆ\n703503 0 ๆฑŸ่‹ๅ‡คๅ‡ฐๅ‡บ็‰ˆไผ ๅช’่‚กไปฝๆœ‰้™ๅ…ฌๅธ่‘ฃไบ‹ไผš\n703504 0 ๆˆ‘ไธ็œ‹ๅˆฐ่Šฑๅƒ้ชจๆ‰€ๆœ‰ไบบไปฅๅŽ้ƒฝ่ฟ‡็€ๅนธ็ฆๅฟซไน็š„็”Ÿๆดป็š„็ป“ๅฑ€ๆˆ‘ๅฐฑๆ˜ฏไธ็”˜ๅฟƒ\n1000 Processed\n classify content\n704000 0 โ€œๅฎถๅ’Œ้กบโ€ๅฐฑๆ˜ฏๅฐๅŒบไธšไธป่‡ชๆฒป็š„ไธ“ไธš็š„ใ€็ณป็ปŸ็š„ใ€ไบ’่”็ฝ‘ๅŒ–็š„ๅทฅๅ…ท\n704001 0 ๆฏๆฌก็œ‹ๅฎŒ่Šฑๅƒ้ชจ้ƒฝไผšๆƒณ่ฏดไธ€ๅฅๅงๆงฝไฝ ๅคง็ˆท\n704002 0 2015ๅนด่ฏฅๅŽฟๆŠ•่ต„160ไธ‡ๅ…ƒไธบๅ…จๅŽฟ533้—ดๅนณๆˆฟๆ ก่ˆๆ›ดๆขๆฐดๆณฅๆชฉๆก\n704003 1 ใ€ๅผ€ๅญฆๅ•ฆ๏ผxๅฏนx็ฒพๅ“ๅฐ็ญๆŠขๆŠฅไธญใ€‘ๆ•ฐๅญฆ/่‹ฑ่ฏญ/็ง‘ๅญฆ็Žฐๆ‹›xxไบบ\n704004 1 ๆตฆๅ‘้“ถ่กŒ่ฃ…ไฟฎ่ดทๆฌพ็”ต่ฏๅ›ž่ฎฟ้—ฎ้ข˜๏ผŒ่ดทๆฌพๅˆฉๆฏไธบๅนดๆฏx%๏ผŒ่ดทๆฌพๅ‘จๆœŸๆ˜ฏxxๆœˆ๏ผŒ้“ถ่กŒไธšๅŠกๅง“ๆœฑ๏ผŒๅฅณๆ€ง๏ผŒๆ˜ฏ้“ถ...\n1000 Processed\n classify content\n704500 1 ๅนณๅฎ‰ๆ— ๆŠตๆŠผ่ดทๆฌพ๏ผŒๆ‰“ๅกๅทฅ่ต„xไธ‡๏ผŒๅš็”Ÿๆ„xxไธ‡๏ผŒๅ…จๆฌพ่ฝฆxxไธ‡๏ผŒๆŒ‰ๆญๆˆฟxxไธ‡๏ผŒๆญฃ่ง„้“ถ่กŒๅฝ“ๅคฉๆ”พๆฌพ๏ผŒๅˆฉ...\n704501 1 ๅฅณ็ฅž่Š‚ๆฅไบ†๏ผŒๆ‹›่ฟœๆŒฏๅŽ็Ž–ๅงฟไธ“ๆŸœๅ›ž้ฆˆ่€้กพๅฎขๅŽ…ๅ†…้ƒจๅˆ†ๅ•†ๅ“ๆœ€ไฝŽxๆŠ˜่ตท๏ผŒ่ฟ˜ๆœ‰ๆ˜ฅ่ฃ…ๆ–ฐๆฌพๅˆฐๆŸœ๏ผŒ็พŽๅฅณๅงๅงไปฌๆฅ...\n704502 0 
ๅคดๅฑฏๆฒณๅŒบๆฃ€ๅฏŸ้™ขๆญฃๅผๅผ€้€šโ€œไปŠๆ—ฅๅคดๆกโ€ๆ‰‹ๆœบๅฎขๆˆท็ซฏ\n704503 0 ๆœ‰่€ƒ่™‘่ฟ‡ๆต™ๆฑŸๆญŒ่ฟท็š„ๆ„Ÿๅ—ๅ—\n704504 0 ๅธธๅทžๅธ‚้’ŸๆฅผๅŒบๆ™‹้™ตไธญ่ทฏxxxๅท้•ฟๅ…ดๅคงๅŽฆxๆฅผใ€€ใ€€่”็ณปไบบ๏ผšๅข็ป็†ใ€€ใ€€่”็ณป็”ต่ฏ๏ผšxxxxxxxxxx...\n1000 Processed\n classify content\n705000 0 ๅ—ไบฌ็ซ™x็ซ™ๅฐxx๏ผšxx็ฆปๅผ€\n705001 0 ไธๆ˜ฏๅ•็บฏ็š„่ฟฝๆฑ‚็œผๅ‰็š„ๅ•†ไธšๅˆฉ็›Š\n705002 0 ๆˆ‘็”จ็™พๅบฆ่ง†้ข‘ๆ‰‹ๆœบ็‰ˆ็œ‹ไบ†โ€œ็”ท็ซฅ้—จๅฃๆกๆฃ’ๆฃ’็ณ–ๅƒไธƒ็ชๆต่ก€ๆญปไบกโ€\n705003 0 ๆŸ้€‰ๆ‰‹ๅฝ•ไบ†5้ๆŒ‘็š„ๆœ€ๅฅฝ็š„ไธ€้่ฝฌ่บซ\n705004 0 ๆœ‰ไธ€ไธชๅฐๅทๆฏๅคฉ้ƒฝๆญฃๅคงๅ…‰ๆ˜Ž็š„ๅทๅฌๆˆ‘ไปฌ่ฏด่ฏ\n1000 Processed\n classify content\n705500 0 ๅŒ—ๆป˜ๆž—ๆธฏ่ทฏๆฎตๅพ€้•‡ๆ”ฟๅบœๆ–นๅ‘ๅ› ๆฑกๆฐด็ฎก็ฝ‘ๅปบ่ฎพ็š„้œ€่ฆ\n705501 0 ่…พ่ฎฏRTXๅ’ŒWorkECไธบ้ฆ–ๆ‰นๅˆไฝœๆ–น\n705502 0 ๆฑŸ่ฅฟ็œๅˆ†ๅฎœๅŽฟไบบๆฐ‘ๆณ•้™ขไพๆณ•ๅฎก็ป“ไบ†่ฟ™่ตท่ฟ‡ๅคฑไปฅๅฑ้™ฉๆ–นๆณ•ๅฑๅฎณๅ…ฌๅ…ฑๅฎ‰ๅ…จๆกˆไปถ\n705503 0 ๅŒๆ ทๅบ†็ฅ่€ๅฉ†ๆˆๅŠŸ่€ƒไธŠๆŠคๅฃซ่ฏ\n705504 0 ไนŸๆœ‰ไฝœ่€…ไปŽๅŒปๆ—ถไปŽไธๅŒๅŒป็–—ๆกˆไพ‹ๆทฑๆ€ไบบ็”Ÿๅ“ฒ็†็š„ๆ•ฃๆ–‡\n1000 Processed\n classify content\n706000 0 ๆ–ฝ็“ฆ่พ›ๆ ผๆ›พ่Žทๅพ—็Žฏ็ƒๅฅ็พŽๅŠๅฅฅๆž—ๅŒนๅ…‹ๅ…ˆ็”Ÿๅคด่ก”\n706001 1 ๅ…ˆ็”Ÿๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅˆšๅ’Œๆ‚จ่”็ณป็š„ไฝ›ไฟก้“ถ้€š็š„ๅ…ณๅฐๅง๏ผŒๅˆฉๆฏๆœ€ไฝŽๅฏไปฅๅšๅˆฐxๅŽ˜๏ผŒๆœ€ๅฟซๅฏไปฅๅฝ“ๅคฉ็”ณ่ฏทๅฝ“ๅคฉๆ”พๆฌพ...\n706002 0 Adallomๆ˜ฏไธ€ๅฎถSaaSไบ‘ๅฎ‰ๅ…จๅˆ›ไธšๅ…ฌๅธ\n706003 0 ๅฉšๅงปไธญไป‹้’ฑ่€ๅธˆๆ‰“็”ต่ฏ็ป™ๆˆ‘ไบ†\n706004 1 ่ๆณฝๅธ‚็ฉบๅŽ‹ๆœบๅ”ฎๅŽๆœๅŠกไธญๅฟƒ๏ผŒ็‰นไปท้”€ๅ”ฎ:็ฉบๅŽ‹ๆœบ้…ไปถ๏ผŒ็ฉบๆปค๏ผŒๆฒนๆปค๏ผŒๆฒนๆฐ”ๅˆ†็ฆปๅ™จ๏ผŒ่žบๆ†็ฉบๅŽ‹ๆœบไธ“็”จๆฒนใ€‚ไธŠ...\n1000 Processed\n classify content\n706500 0 ็Žฐๅœจ็ป™ๅŠ›ไปทๆ ผxxxxxxๅ‘ๅ”ฎโ€ฆไธ“ๆณจ็ฒพๅ“โ€ฆ\n706501 0 ็”ฒ็Šถ่…บ่‚ฟๅคงๅœจ็—…ๅ˜ๅˆๆœŸๅŠไธญๆœŸ\n706502 1 ไบฒ็ˆฑ็š„ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏLapagayo็š„็Ž‹ๅ€ฉ้›…๏ผŒๅคฉๆฐ”ๅนฒ็‡ฅ๏ผŒ่ฏทๅคšๅ–ๆฐดๅนถๆณจๆ„็šฎ่‚คไฟๅ…ปใ€‚xๆœˆxๅทๅ‰ๅˆฐๅบ—่ดญ...\n706503 0 ๅธๆณ•ๆ‰€ๆ‰€้•ฟๅ†ถๅปบ้›„่ดชๆฑกๅ—่ดฟไธŽๅŽŸๅ‘Š่พพๆˆๅ่ฎฎ\n706504 0 ไธ‰ๅ›ฝๅฟ—?็ณœ่Šณ็ณœ่Šณ\n1000 Processed\n classify content\n707000 0 ็œŸ็š„ๆ˜ฏ่บซไฝ“้‡Œ็š„ๆฏไธช็ป†่ƒž้ƒฝ่ทŸ่ฟ™ไธชไธ–็•Œๆ ผๆ ผไธๅ…ฅ\n707001 0 
ๆœบๅ™จไบบ้“ๆญ‰ๅๆ˜ ไบ†ไธญๅ›ฝไบบ็š„ๆƒ…็ปช\n707002 1 ๅฐŠๆ•ฌ็š„ๆ–ฐ่€้กพๅฎขๆœ‹ๅ‹ไฝ ไปฌๅฅฝ๏ผŒไธบไบ†ๆ„Ÿ่ฐขไฝ ไปฌไธ€ๅนดๆฅๅฏนๅšๅฃซๅ›ญ็š„ๆ”ฏๆŒไธŽๅŽš็ˆฑ๏ผŒๆœฌๅบ—็ŽฐๅทฒๆŽจๅ‡บๅ……ๅ€ผๅคงไผ˜ๆƒ ๏ผŒๅ……...\n707003 0 ๆต่กŒๆฆœ็ฌฌไบ”ไฝๆŽ’ๅœจ้‡‘ๅฟ—ๆ–‡้™ˆๆฅš็”Ÿๅบ„ๅฟƒๅฆๅ‰้ข\n707004 0 ๅˆๅŽๅ—ไบฌ็š„้ฃŽ้›จ่ฟ˜ๆ˜ฏไผš้€ๆธๆ˜Žๆ˜พ\n1000 Processed\n classify content\n707500 0 ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ้‡Œๆฅไบ†ไธ€ไธชๆˆ‘ๅคงๅ”ไบบ\n707501 0 ๅนถๅˆ†่ตดๅธ‚ๅŒบ20ไฝ™ไธชไธป่ฆ่ทฏๅฃๅผ€ๅฑ•ไบค้€šๆ–‡ๆ˜ŽๅŠๅฏผๆดปๅŠจ\n707502 0 ่€ๅ…‹ๅ…ฑ้œ€ๅ‘ๆถˆ่ดน่€…่ต”ๅฟxxxไธ‡็พŽๅ…ƒ\n707503 0 ๆˆ‘็š„่บซไฝ“ๆ˜ฏๅœจไธœไฟก้›†ๅ›ขๅ…ฌๅธไธ€ๅŽ‚ๅžฎๆމ็š„\n707504 0 #NAME?\n1000 Processed\n classify content\n708000 0 ็Žฐๆœ‰219ๅŒบ524ๅŒบๅ‰ๆŽ’็ฅจๅ†…ๅœบ็ฅจไนŸๆœ‰\n708001 0 ๅŒป็”ŸๆŠคๅฃซ่ฟ›่กŒ่ƒธๅค–ๆŒ‰ๅŽ‹็›ดๅˆฐxx๏ผšxxๅทฆๅณ\n708002 0 ๅ…ฌๅธƒไบ†2014ๅนดๅ…จ็œๆณ•้™ขๅ—็†่กŒๆ”ฟๆกˆไปถ็š„ๅŸบๆœฌๆƒ…ๅ†ตใ€็‰น็‚นๅ’Œๅๅคงๅ…ธๅž‹่กŒๆ”ฟๆกˆไพ‹\n708003 0 ๆ ผๅŠ›ๆ‰‹ๆœบ็š„ไบŒใ€ไธ‰ไปฃ็š„ไบงๅ“ๅฐ†่ฆ้ขไธ–\n708004 0 ๅ› ไธบ่€ๅฆˆ่บซไฝ“ๆŠฑๆ™ๅ’Œ่ฃ…ไฟฎ็š„ไบ‹ๆƒ…ๆž็š„ๆˆ‘็„ฆๅคด็ƒ‚้ข\n1000 Processed\n classify content\n708500 0 ็™พๅบฆๅถ่ˆž้ฃ˜้ฃ˜่ดดๅง้ฆ–ๅ‘๏ผš่Šญ่•พๅ…ซ็บงๅŒ—่ˆž็‰ˆ\n708501 0 ๅนณๅ‡ๆฏๅฅ—ๆˆไบค้ข็งฏ็บฆๅœจxxๅนณๆ–น็ฑณ\n708502 0 ๅ…ฌๅธ่‚ก็ฅจ่‡ชxxxxๅนดxๆœˆxxๆ—ฅ่ตทๅœ็‰Œ\n708503 0 ๅ“ˆ้ฆ™ๆธฏไนŸๆœ‰ๅŸŽ็ฎกไธ€ไธช่ทฏ่พนๆ‘Š่ขซๆ”ถไบ†\n708504 0 ๆณ—้˜ณๅ›ขๅŽฟๅง”ๆทฑๅ…ฅๅญฆไน โ€œไธ‰ไธฅไธ‰ๅฎžโ€่‡ช่ง‰่ทต่กŒโ€œไธ‰ไธฅไธ‰ๅฎžโ€\n1000 Processed\n classify content\n709000 0 ๆˆ‘ๅœจไฝฟ็”จโ€œRIO้ฆ™ๆฉ™ไผ็‰นๅŠ โ€็‰นๆŠ€็šฎ่‚ค\n709001 0 ๆ–ฐ่ฅฟๅ…ฐๆ”ฟๅบœๅ‡็บง็งปๆฐ‘ๅฑ€็ณป็ปŸๆ–นไพฟๅค–ๅ›ฝไบบ็ญพ่ฏ็”ณ่ฏท\n709002 0 ไธ–็•ŒไธŠไธบไป€ไนˆ่ฆๆœ‰ๅˆถๆœ่ฟ™็ง่ฏฑไบบ็Šฏ็ฝช็š„ไธœ่ฅฟ\n709003 0 ไปŠๅคฉๅพๅทžๆผซๅฑ•็œ‹ๅˆฐไบ†ๅ…ดๆฌฃๅพฎ่‰็น่Šฑ่ก€ๆ™ฏๅ†›่ฃ…paroๅ’Œๅ›ฝๅฎถ้˜Ÿ\n709004 1 ็พŽๅฅณๆ–ฐๅนดๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๆญŒไธฝ่Šฌ็š„้ป„็ป็†๏ผŒ็ฅๆ‚จๅŠๅฎถไบบๆ–ฐๅนดๅฟซไน๏ผŒๅฟƒๆƒณไบ‹ๆˆ๏ผๆœฌๆœˆๆฅๅบ—xๆฌกๆŠค่‚ค๏ผŒ้€xxxๅ…ƒ็š„...\n1000 Processed\n classify content\n709500 0 ่ฟ‡ไบ†xxๅฒใ€ๅนถไธ”็œผๅ‘จๅ‡บ็Žฐๅฐ็ป†็บน็š„ไบบx\n709501 0 ๅœฐ้ขไฟฑไน้ƒจๆ—…ๆธธๅˆ†ไบซไผšๅœ†ๆปกๆˆๅŠŸ\n709502 1 
ๅœบ๏ผŒๆฑ‡่šๅ…จๆญฆๆฑ‰ไปฅๅŠๅ‘จ่พนๅธ‚ๅœบ็š„ๅ•†ไธšๆ‰นๅ‘้“พใ€‚็‹ฌ็ซ‹ไบงๆƒๅ•†้“บใ€‚ๅฏๅŒ…็งŸๅฏ่‡ช่ฅใ€‚่ฏš้‚€ๆ‚จๅ’Œๅฎถไบบๆœ‹ๅ‹็Žฐๅœบๅ‚่ง‚...\n709503 0 ไฝ ็Ÿฅ้“ๅ€ŸๆกไธŽๆฌ ๆกๅœจๆณ•ๅพ‹ไธŠ็š„ๅŒบๅˆซๅ—\n709504 0 ๅŽๅคฉๅฐฑไธŠ้ฃžๆœบไบ†็„ถ่€Œๆ˜Žๅคฉ่ฟ˜่ฆๅฝ•ไธœ่ฅฟโ€ฆโ€ฆๆˆ‘ไนŸๆ˜ฏๆ‹ผ็š„ไธ่กŒ\n1000 Processed\n classify content\n710000 0 ๅœฐไบง้ซ˜็ฎกๅˆ›ไธšๆฝฎๅœฐไบง+ไบ’่”็ฝ‘โ€ฆโ€ฆๆ›ดๅคš็‚นๆˆ‘\n710001 0 ๅ™ช้ŸณๆฑกๆŸ“็œŸ็š„ๅพˆ่ฎจๅŽŒๅคฉๆ€็š„่ตถ็ดงๅœไบ†ๅง็”ต่„‘ๅฃฐ้Ÿณ้ƒฝๅผ€ๅˆฐๅคดไบ†่ฟ˜ๅฌไธๆธ…โ€ฆโ€ฆๆŒ–ไธชๆฒŸๆŒ–ไบ†ๅŠไธชๆœˆๆœ‰ๅ…ณ้ƒจ้—จไนŸ็œŸ็š„ๆ˜ฏไบ†\n710002 0 ๅˆšqqๅ“ไปฅไธบๆœ‰ไบบๆฅๅ…ณๅฟƒๅ…ถๅฎžๆ˜ฏ้ข†ๅฏผๆ‹‰ๆˆ‘ไธ€่ตทๅŠ ็ญ\n710003 0 ๆœชๆฅๅปบ็ญ‘่Š‚่ƒฝไฟๆธฉ้’‰ๅญ˜ๅœจๅนฟ้˜”็š„ๅธ‚ๅœบๅ‘ๅฑ•็ฉบ้—ด\n710004 0 ๅฎƒๅ› ้€ ๅž‹ๅฅ‡็‰น็š„ๅปบ็ญ‘่ขซ่ฏ„ไธบไธ–็•Œๆ–‡ๅŒ–้—ไบง่€Œๅค‡ๅ—ไธ–ไบบ็žฉ็›ฎ\n1000 Processed\n classify content\n710500 0 ๆ—่พน็š„่ฏ‰่ฎผๆ—ถๆ•ˆ่ฟ˜ๅœจ็Žฉ็€่‡ชๅทฑ็š„ๅฏๆ’ค้”€ๅฉšๅงป\n710501 0 ๆœ‰ๆ’ๅฟƒ+่‚ฏๅŠชๅŠ›=ๆˆๅŠŸๅšไฝ ็š„็ฒ‰ไธๆ˜ฏๆˆ‘ไปŽ7ๅนดๅ‰ๅผ€ๅง‹็›ด่‡ณ็”Ÿๅ‘ฝ็ปˆๆญข้ƒฝไธไผšๆ”นๅ˜็š„ไบ‹\n710502 0 ่ฟ›ๅ…ฅ็ซ็ฎญๅ›พๆ ‡ๅนถๆ็คบๆŒ‰HOMEๅ›žๅˆฐไธป่œๅ•ๅŽ\n710503 0 ๆŠคๅฃซๅฐๅงๅ’ŒๅŒปๅธˆๆฒŸ้€šๅŽๅธฎๆˆ‘ๅฎ‰ๆŽ’ๅœจ็‰™ๅŒปๆ™š้คๆ—ถ้—ดๅธฎๅฅน็œ‹่ฏŠ\n710504 0 ๆฌง็พŽ้ฃŽๆฐ”่ดจ็™พๆญ็บฏ่‰ฒไผ‘้—ฒ้˜”่…ฟ่ฃคๆ—ถๅฐš่คถ็šฑ่ฝป่–„็™พ่คถ่ฃ™่ฃคๅฅณๅฃซ้›ช็บบ่ฃค่ฃ™\n1000 Processed\n classify content\n711000 0 ๅ†…ๆœ‰iPhone5sๅŠ็Žฐ้‡‘1500ๅ…ƒ\n711001 0 ๅ—้€šๅฐไธบไป€ไนˆ้€‰่ฟ™ๆ ท็š„ๅฝ“ไธปๆŒไบบ\n711002 0 ๅฐ‘ๆž—ๅก”ๆฒŸๆ•™่‚ฒ้›†ๅ›ข็ฌฌxxๅฑŠ่ฟๅŠจไผšๅฐ†ไบŽxๆœˆxxๆ—ฅๅผ€ๅน•\n711003 0 /ๆฑŸ่‹่‹ๅทžไธ€ๅทฅๅŽ‚ๅ‘็”Ÿๅคง็ซ\n711004 0 ๅŽไธบpxๅ‡ญๅ•ฅๆ•ขๅ‘็ƒญ่ฟ™ไนˆไธฅ้‡\n1000 Processed\n classify content\n711500 1 ไบฒ็ˆฑ็š„ๅฐๅง๏ผšไฝ ๅฅฝ๏ผ็Žฐๅœจ่ฅฟๅŸŽไบŽxๆœˆxๆ—ฅxๆœˆxxๆ—ฅๅœจ่ฅฟๅŸŽไธ€ๆฅผๅŒ—้—จๆœ‰ไฟƒ้”€ๆดปๅŠจๅ…จๅœบx__xๆŠ˜๏ผŒๆƒŠๅ–œๅคš...\n711501 0 ๅฏŒๅฃซๅบท่…พ่ฎฏๅ’Œ่ฐๆฑฝ่ฝฆๅœจ้ƒ‘ๆˆ็ซ‹ๆŠ•่ต„ๅˆไฝœๅ…ฌๅธ\n711502 0 ็”ฑๅ…จๅ›ฝ็บข่‰ฒๆ—…ๆธธๅ่ฐƒๅฐ็ป„ๅŠžๅ…ฌๅฎคไธปๅŠž\n711503 0 ๆˆ‘ๆƒณ้—ฎไธบไป€ไนˆไธ่ƒฝไธพๆŠฅๅนฟๅ‘Šๆ„ๆ€ๆ˜ฏๅนฟๅ‘Šๅพฎๅšๆ˜ฏ้ป˜่ฎค็š„ๆ˜ฏๅ—ไธ€ๅˆทๅพฎๅšๆปกๅฑ็š„ๅนฟๅ‘Š็œŸๆ˜ฏๅฎŒๅ…จๅคŸไบ†\n711504 0 
thebodyshop็š„ๆฒๆตด้œฒๆœ‰่ฆ็š„ๅ—\n1000 Processed\n classify content\n712000 0 ๅฐๆฑฝ่ฝฆ็ขฐๅˆฐไบค้€šๆŒ‡็คบ็ฏ่ฟ˜ไผšๅœไธ‹ๆฅ\n712001 0 ้ป‘้พ™ๆฑŸ็œๆ›ฒ่‰บๅ›ข2015ๅนดโ€œ้€ๆฌข็ฌ‘ใ€ๅˆฐๅŸบๅฑ‚โ€ๆดปๅŠจไปŠๅคฉๅฏๅŠจ\n712002 0 mkๅฐๆ–œๆŒŽ้“พๆก่‚ฉๅธฆ็ฟป็›–็ฃๆ‰ฃ่ฎพ่ฎก็พŽ่ง‚ๆ—ถๅฐšๅˆๅฎ‰ๅ…จๅ›ฝๅ†…็Žฐ่ดง็Žซ็บข้‡‘่‰ฒ้‡‘ๅฑž้“พๆกๅ’Œ้“ถ่‰ฒ้‡‘ๅฑž้“พๆกๅฐบๅฏธ็บฆ๏ผš17\n712003 0 ๆฌง็พŽๆ—ถๅฐšๆ€งๆ„Ÿ็พŽ่ƒŒๆณขๆตชๆŒ‚่„–ไบฒ่‚คๆ–‡่ƒธ็™พๆญๅ†…่กฃ\n712004 0 ๆˆ‘ๆ€Žไนˆๅธฎไฝ ็š„ๅผบๅฅธๆกˆ่พฉๆŠคๅ•Šๅงๆงฝ\n1000 Processed\n classify content\n712500 0 ๅทฅไบบไธบไป€ไนˆๅ‡่–ชๅ› ไธบๅ่…่ดฅๅ›ฝไผๆ”ถๅ…ฅๅ‡ๅฐ‘\n712501 0 ๆžœ็œŸๆ˜ฏๆถจ็Ÿฅ่ฏ†ไบ†~RMไฝœไธบไบšๆดฒ็ฌฌไธ€็ปผ่‰บ\n712502 0 ๆน–ๅŒ—็œๅ†…้ƒฝๆ˜ฏ24ๅฐๆ—ถไปฅๅ†…้€่ดงๅˆฐไฝ ๅฎถ\n712503 1 ไบฒๆˆ‘ๆ˜ฏไธ‡ๅ’ŒๅŸŽๆฌง่Žฑ้›…ไธฅๆท‘ๅบ†ไธ‰ๅ…ซ่Š‚ๆดปๅŠจๅผ€ๅง‹ไบ†ๆœฌๆ—ฅ่ตท่ดญไปปๆ„ๅ››ไปถx.xๆŠ˜่ฟ˜ๅฏๅŒๅ€ๆœบไธๅฏๅคฑๅ“ฆ\n712504 0 ๅƒๅฎŒๆˆ้ป‘็‚ญโ€ฆโ€ฆsun420924868\n1000 Processed\n classify content\n713000 0 ๅงœๅ ฐๅŒบๅธ‚ๅœบ็›‘็ฃ็ฎก็†ๅฑ€่ฟ‘ๆ—ฅๅฏๅŠจไธบๆœŸไธ€ๅนด็š„็”ตๆขฏๅฎ‰ๅ…จ็›‘็ฎกๅคงไผšๆˆ˜\n713001 0 Amazon็Žฐๆœ‰Philips้ฃžๅˆฉๆตฆNorelcoCC5059/60ๅ„ฟ็ซฅ็”ตๅŠจ็†ๅ‘ๅ™จ\n713002 0 ไธฐ่ƒธ่†๏ผšๅŽŸไปทๅ…ฅ็จŽ๏ผš28080ๆ—ฅๅธ\n713003 0 ็Žฉๅคšไบ†้ญ”ๅ…ฝRPGๅ›พ็š„ๅŽ้—็—‡ๅฐฑๆ˜ฏ่ฟ™ไธชไบ†\n713004 0 27ยฐๆ‘ฉ็พฏๅบงไปŠๆ—ฅ่ฟๅŠฟโ˜…โ˜…โ˜…โ˜†โ˜†\n1000 Processed\n classify content\n713500 0 ๅฐ†2/3็š„็•ช่Œ„ไธๅ’Œ็‰›ๅฅถ็”จๆ…ๆ‹Œๆœบๆ‰“็ขŽ\n713501 0 ไธ“ๆณจๆท˜ๅฎๅคฉ็Œซไบฌไธœ้˜ฟ้‡Œ่˜‘่‡่ก—็พŽไธฝ่ฏด็ญ‰ๅ„ไธช็”ตๅ•†ๅนณๅฐ\n713502 0 ๅค–ไผ่ฟ›ๅ†›ไธญๆˆ่ฏๅ›ฝๅ†…ไธญ่ฏไผไธšๆ‹…ๅฟง\n713503 0 ๆฌง็พŽ้ฃŽ่ถ…ๆœ‰่ดจๆ„Ÿๅคๅคๅฅณ้ž‹~&gt\n713504 0 ๅคšๅนดๆฅ็š„ไธ€ไธชๅคธไบบ็œŸ็›ธ๏ผšไฝ ้•ฟๅพ—็œŸๆธ…็ง€ๅ•Šๅฐฑๆ˜ฏไฝ ็œผ็›ๅฐ้ผปๅญๅฐๅๆญฃๅฐฑๆ˜ฏๅฐ\n1000 Processed\n classify content\n714000 0 ๅพ—ๅˆฐไบ†ๆทฎๅฎ‰ๅนฟๅคง็‘œไผฝ็ˆฑๅฅฝ่€…็š„้ซ˜ๅบฆ่ฎคๅฏ\n714001 0 ๆ— ้”กๆŽฅ่ฟ‘่‹ๅทžไธŠ็ฉบๆƒŠ็Žฐๅฝฉ่‰ฒไบ‘ๆœต\n714002 0 33ๅฒ็š„ๅธๆœบๆจๆŸๅฐ†ไป–ไปฌไปŽไธŠๆตทๆŽฅ้€ๅˆฐๆกไนก\n714003 0 ๆœ‰็”ต่„‘็š„ๅฏไปฅ่ฎพ็ฝฎๆˆ้ป˜่ฎค็ฝ‘้กต\n714004 0 ๆฑ‡ๅบท2015็ง‹ๅญฃๆ–ฐๆฌพๅฅณ้ž‹ๆฌง็พŽ็ณปๅธฆ็ฒ—่ทŸ้ฉฌไธ้ดๅฅณ้ดๆผ†็šฎๅฐ–ๅคด้ซ˜่ทŸ็Ÿญ้ดๅฅณ\n1000 Processed\n 
classify content\n714500 0 1ๅท็บฟ้ป„ๅŸ”็ซ™ๅŒบๅ“ๅบ”้›†ๅ›ขๅ…šๅง”ๅทๅฌ\n714501 0 ไฝœ่€…๏ผšๅ—ๅฑฑๅคงไป™ๆ–‡้ฃŽ้’ๆจ่ฟ‘ๆ—ฅ\n714502 1 ๅง๏ผŒๅ‘Š่ฏ‰ๆ‚จไธ€ไธชๅฅฝๆถˆๆฏ๏ผๆˆ‘ไปฌๅบ—ๆœ€ๆ–ฐๅผ•่ฟ›ไบ†ไธ€ๅฐไปทๅ€ผ็™พไธ‡็š„ไปฅ่‰ฒๅˆ—SYNERONๅŽŸ่ฃ…่ฟ›ๅฃไพ้•ญ้—ช้ข‘่„ฑๆฏ›...\n714503 0 iOS8้‡Œๆไพ›ไบ†่‡ชๅŠจๅˆ ้™คๅކๅฒ็Ÿญไฟก็š„่ฎพ็ฝฎ\n714504 0 ๆทฑๅœณไธœๅ›ฝ้™…ๆ—…ๆธธๅบฆๅ‡ๅŒบไธ€็บฟ็œ‹ๆตท่ฑชๅฎ…\n1000 Processed\n classify content\n715000 0 ๅŠ ๅพฎไฟกๅ‘็บขๅŒ…hyuk797\n715001 0 ๆบง้˜ณๅธ‚ไบบๆฐ‘ๆฃ€ๅฏŸ้™ขๅฏน็Ž‹ๆŸไปฅๆถ‰ๅซŒๅผบๅฅธ็ฝชๆ่ตทๅ…ฌ่ฏ‰\n715002 0 ๅ‰ๅ‡ ๅคฉๅŽปไบ†ColoniadeSantPere\n715003 0 ๅ—ไบฌ็š„ๅคœๆ™šๆฏ”่ฅฟๅฎ‰่ฆ็นๅŽไธ€็‚น\n715004 0 ๆˆ‘ๆ˜ฏ\"xxxๅผ€ๅคด็š„ไผไธš่‚ก็ฅจๆ•ด่ฃ…ๅพ…ๅ‘\"\n1000 Processed\n classify content\n715500 0 ๅฝ“ๆ—ถ่ฟ™ๅ21ๅฒ็š„ๅทฅไบบๆญฃๅœจๅฎ‰่ฃ…ๅ’Œ่ฐƒๅˆถๆœบๅ™จไบบ\n715501 0 ๅฐ†ๆœบๅ™จไบบๆต‡ๆณจๅŠจไฝœไธŽ้“ธ้€ ๆœบๅ€พ่ฝฌ่ฟๅŠจ่ฟ›่กŒๅŒๆญฅ\n715502 0 ๅฐฑๆ˜ฏๅ› ไธบๅ›ฝๆฐ‘ๅ…šๆ”ฟๅบœ่ฆ็บ ๆญฃไปฅๅ‰ๆฐ‘่ฟ›ๅ…šๆ”ฟๅบœๅˆถๅฎš็š„้”™่ฏฏ็š„ๅކๅฒ่ง‚\n715503 0 ไธŠๆตทๅธ‚ๅฅณๅŠณๆจกๅ—่ดฟ95ไธ‡?ไบ‹ๅ‘ๅŽไธ€ๅคœ็™ฝๅ‘\n715504 0 ไธ€่ˆฌๆŒ‰็…งไปฅๅพ€็ป้ชŒๆฌง็พŽ็š„ๅ”ฑๅŠŸ้ƒฝไผšๆฏ”่พƒๅฅฝไธ€ไบ›ๆ‰€ไปฅๅฐฑๆฒกๆœ‰ๅพˆๆœŸๅพ…\n1000 Processed\n classify content\n716000 1 ่พพๅœฐๆฟ๏ผŒๅ…จๅ‹ๅฎถๅฑ…๏ผŒ็‰นๅˆฉ่พพๅŠ้กถ๏ผŒๆฏๅฎถๆฏๆˆทๅข™็บธๅธƒ่‰บ๏ผŒๆตทๅฐ“ๅฎถ็”ต็ญ‰xxๅคšไธชไธ€็บฟๅ“็‰Œ๏ผŒไผ˜ๆƒ ๅŠ›ๅบฆ่พƒๅคง๏ผŒไบซ...\n716001 0 ๆœ€่ฟ‘ๅคชๅคš้›ถๅ”ฎ้กพๅฎข้ƒฝๆ˜ฏๅ†ฒ็€้˜ฟ่ƒถ็ณ•ๆฅ\n716002 0 ไธญ่ˆช็ฌฌไธ€้ฃžๆœบ่ฎพ่ฎก็ ”็ฉถ้™ขๅŠจๆ€\n716003 0 ๅชๅ› ไธบไฝ ไปฌๆŠŠๆญช่„‘็ญ‹ๅŠจๅˆฐไบ†ๅจœๅจœ่บซไธŠไฝ ่ฏดไป–้—ฐๅœŸ\n716004 0 ๆ™šไธŠ็š„็งฆๆทฎไบบๅฑฑไบบๆตทๅ•ŠไปŠๅคฉ็ปง็ปญ็Žฉๅ’ฏ\n1000 Processed\n classify content\n716500 0 ๆฑŸ่‹ๅฆ‚็š‹ๅธ‚ๆฌ็ป้•‡ๅŠ ๅŠ›็คพๅŒบๅนฒ้ƒจ้™ˆๆตทๅฅๅœจๅคœๆ™šๅทกๆŸฅ็งธ็ง†็ฆ็ƒงๆ—ถไธๆ…Žๆ‘”ๅ€’\n716501 0 ๆฅ้˜ฟ้‡Œๅทดๅทด็š„ไบบๅฟ…้กป่ฎคๅŒๅ’Œๅšๅฎˆๆˆ‘ไปฌ็š„ไปทๅ€ผ่ง‚\n716502 0 ไธ”ๆœชๆฅไธ‰ๅนด่ฟ˜ๅฐ†ๆ–ฐๅขžxxไบฟๅนณๆ–น็ฑณ็š„็‰ฉ็ฎกๅธ‚ๅœบๅฎน้‡\n716503 0 ๅ็ซ่ฝฆๅๅ‡บไบ†ไธญๅ›ฝๅฅฝๅฃฐ้Ÿณ้€‰ๆ‰‹็š„ๆ„Ÿ่ง‰\n716504 0 ็‘žๅฃซFlNMAใ€APl่ฏทไฝ ไปฌๅฌๅฌๆˆๅฐฑAPl็ฅž่ฏ็š„3ไธ‡ไธญๅ›ฝๆŠ•่ต„่€…็š„ๆณฃ่ก€็š„ๅ‘ผๅฃฐ\n1000 Processed\n classify 
content\n717000 0 ๅคๆ—ฆICๅก/FM1108IC็™ฝๅก/M1็™ฝๅก/้—จ็ฆๅก/่€ƒๅ‹ค/ICๅก/ๆ„Ÿๅบ”ๅก/ๅฐ„้ข‘ๅก\n717001 0 GAx็š„ๅคง็ฏ่ฎพ่ฎก็Šนๅฆ‚ๅ—็‹ฎ่ˆฌ้†’็›ฎ\n717002 0 ๅ…ถๅ…จ่ต„ๅญๅ…ฌๅธๅ—้€šๆ™ฏ็‘žไบŽ24ๆ—ฅ็บฆไบบๆฐ‘ๅธ3\n717003 0 ไธบ่ฟ›ไธ€ๆญฅๆ้ซ˜ๅ…จ็œ่กŒๆ”ฟๅค่ฎฎๅบ”่ฏ‰ไบบๅ‘˜็š„ๅทฅไฝœๆฐดๅนณ\n717004 0 ้€š่ฟ‡็ฝ‘็ปœ้’“้ฑผ่ฟๆณ•่กŒไธบ็›—ๅ–ๆœฌไบบ้’ฑ่ดข\n1000 Processed\n classify content\n717500 0 ็Žฐๅœจ็Ÿฅ้“ๆˆ‘ไธบไป€ไนˆ้€‰ๆ‹ฉBBTไบ†ๅ—\n717501 0 ไฝ ๆ‰€ๆฒก่ง่ฟ‡็š„ๆœˆๅ…‰้‡Œ็š„ๅ—ไบฌๅŸŽ\n717502 0 ็Žฉๅฎถๆ‰ฎๆผ”ไธ€ๅๅซๅšHaru็š„ๅฅณๅญฉ\n717503 0 ๆŠฅๅๆ—ถ้—ด๏ผš2015ๅนด7ๆœˆ27ๆ—ฅ่‡ณ2015ๅนด8ๆœˆ2ๆ—ฅ18ๆ—ถ่”็ณปไบบ๏ผš้กพ่ฅฟๅŒ13912523638...\n717504 1 ๆˆ‘ๆ˜ฏๆฐดๆžœๆน–่€็™พๅง“ๅคง่ฏๆˆฟไธธ็พŽไธ“ๆŸœ็š„๏ผŒไธธ็พŽx.xๆœ‰ๅคงๅž‹ไผ˜ๆƒ ๆดปๅŠจ๏ผŒๅ…จๅœบxๆŠ˜ๅŽๆปกxxxๅ†ๅ‡xx๏ผŒ็‰นไปท...\n1000 Processed\n classify content\n718000 0 ๅฆ‚ๆžœ้™ˆๅœ†ๅœ†ๆ˜ฏไบ”ๅ…ญๆœˆ็š„่‹ๅทž่Œ‰่މ\n718001 0 ๅพˆๅคšๆ—ถๅ€™ไธ่ฟ‡้—ฎๆฏ”็Ÿฅ้“็œŸ็›ธ่ฆๅฅฝๅพ—ๅคš\n718002 0 ๅ‚ไธŽNBAๅ›ฝๅบฆๅ„้กนๆดปๅŠจๅŒๆ—ถ\n718003 0 โ€œxxxxxxโ€ฆโ€ๅ‰ๅฅไธ€ๅ‡บๆฅๅฐฑ่ฆๅ“ญไบ†\n718004 0 ๆƒณ็•™ๅœจๅธธๅทž๏ฝžๅ†ไนŸไธๆƒณๅ›ž้•‡ๆฑŸไบ†\n1000 Processed\n classify content\n718500 0 ้›†PC้กตๆธธใ€ๆ‰‹ๆœบ็ฝ‘ๆธธไธบไธ€ไฝ“็š„ๅ…จๅนณๅฐ่Œ็ณปๅก็‰ŒRPGๆธธๆˆ\n718501 0 ๆ—ฉๅœจ60ๅนดไปฃๅฐฑๅ› ๅๅฏนๅ›ฝๆฐ‘ๅ…š็š„ไธ“ๅˆถ็ปŸๆฒป่€Œ่ขซๆ•ไธ‹็‹ฑ\n718502 1 xxxxxxxxxxxxxxxxxxx ๅ†œ่กŒ๏ผŒๆŽๅฐๅผบใ€‚\n718503 0 ๆ•ฐไธๆธ…ๆ˜ฏ็ฌฌๅ‡ ๆฌก่Šฑๅƒ้ชจๅฐ่ฏดไบ†\n718504 0 ๆˆ‘ๆœ‰่ฟ™ไนˆ่€ๆˆไนˆๆฏไธชๅŒป็”ŸๆŸฅๆˆฟ้ƒฝ้—ฎ็ป“ๅฉšไบ†ๆฒก\n1000 Processed\n classify content\n719000 0 ไผฏๆœ—็‰น็ ”ๅ‘ๅ‡บๅ…ญ่ฝดๅทฅไธšๆœบๅ™จไบบ\n719001 0 ่ฐƒ็ขŽ่Šฑ้•ฟ่ฃ™+ไผ‘้—ฒๅธ†ๅธƒ้ž‹็š„ไธ็ปธไธ€ๆ ท\n719002 0 ้‚ฃไธชๅœจZIPPOๅบ—้‡Œ็‰นๆœ‰็š„ๅ‰ไป–ๅ’Œ็”ทๅฃฐ\n719003 0 ๅŽŸๆœฌๆ˜จๅคฉไธŠๅˆๅฐฑ่ฏฅๅˆฐBergen\n719004 0 ไธ่ฆๆŠŠ่‡ชๅทฑไธ€ๆ—ถ็š„ๅฟซไนๅปบ็ญ‘ๅœจๅˆซไบบไธ€็”Ÿ็š„็—›่‹ฆไธŠ้ข\n1000 Processed\n classify content\n719500 0 ่ขซ็ฌ‘ๆ˜ฏๅ› ไธบๆœ‰ไนŒ้พŸๆ‰€ไปฅ็”ต่„‘ๅ˜ๆ…ขไบ†\n719501 0 ๅฟƒ็ƒฆๆƒณ็ ไบบไธ€ไธช็”ต่„‘็™ฝ็—ด็Žฐๅœจๅพ˜ๅพŠๅœจ่ทŸๅ•ฅ้ƒฝไธๅ…ผๅฎน็š„winx่ทŸ่ฝฏไปถไน‹้—ด\n719502 0 
ๅ“ชๆœ‰ๆˆ‘ๅคง่‹ๅทž่ฟ™็งๆฑŸๅ—ๅฐ้•‡็š„็ป†่…ปๆœฆ่ƒงๅ•Š๏ฝžไธ‹็š„ๆˆ‘ๅฟƒ็ƒฆ\n719503 0 ๆต™ๆฑŸๅซ่ง†ๆœ€ๅŽ็š„ๅทฅไฝœไบบๅ‘˜ๅๅ•ไปฅๅŠ่ตžๅŠฉๅ•†ๅ•†ๆ ‡ๆฏ”่ตฐ้ฉฌ็ฏๅฟซไธ€็™พๅ€ๅ•Šไธ€็™พๅ€\n719504 0 ไปŠๅคฉๆฃ€ๅฏŸ้™ขๆŽง็”ณ็ง‘็š„่–›ๆ€€ไบฎไธ€ๅๅธธๆ€ไธ่ฎฒๆณ•็†\n1000 Processed\n classify content\n720000 0 2015ๅฎฟ่ฟๅธ‚้’Ÿๅพๅ›ฝ้™…ๅญฆๆ กๅ…ฌๅผ€ๆ‹›่˜8ๅๆ•™ๅธˆ็ฎ€็ซ \n720001 0 ๅฎƒ่žๅˆไบ†30ๅคšๅ‘ณ้กถ็บงไธญ่ฏ้ฃŸๆ\n720002 0 ๆ˜จๅคฉๆ™šไธŠ็š„ๅŠ ๆฒนๅงๅฎžไน ็”Ÿ็œŸ็š„ๅพˆ็ฒพๅฝฉ\n720003 0 1็ฑณ้ซ˜็š„็ƒงๆฏ›้ขไธญๅ›ฝ้ป‘็ŸณๆๆญไธŠไบ†ไปฅๅคไปฃๅ‰”ๆผ†ๅทฅ่‰บไธบ็ตๆ„Ÿ่ฎพ่ฎก่€Œๆˆ็š„็บขๆผ†่พนๆกŒไฝœไธบ้š”ๆ–ญ\n720004 0 ๅ‘ƒๆฅผไธ‹ๆตท้—จๅฅฝๅฃฐ้Ÿณ็œ‹็€ๅˆซไบบ็ƒญๆƒ…ไผผ็ซ่ตถ็€็œ‹็›ดๆ’ญๆˆ‘ๆƒณ่ฏดๅ“ฅๅ“ฅๅœจๅˆท้ข˜ๅ•Šๆ“่›‹ๅฅฝๅต\n1000 Processed\n classify content\n720500 0 ๆ‰€ไปฅ่ต–็€็šฎ่ฆๅ’Œ้‚ฃไธชๆ‰‹ๆœบ็š„ไธปไบบไธ€ๅผ ๅบŠ\n720501 0 ๅญ—ๆฏSๅ’Œๅญ—ๆฏDไธๆ˜ฏไธ€็›ดๅœจไธ€่ตทๅ—\n720502 0 ๅพฎ้›จ่ฝป้ฃŽๅนฟๅœบไธŠๅœจๆ”พๅŽปๅนดๅคๅคฉ็š„็”ตๅฝฑ่บซ่พน่ทฏ่ฟ‡ๆต…้‡‘่‰ฒๅคดๅ‘็š„ๅฐ้ฒœ่‚‰ไนŸไธ็Ÿฅ้“ๆต™ๆฑŸๆฅ็š„็ฌ‘ๅฎน็”œ็”œ็š„่€ๆฟๅจ˜ๅš...\n720503 0 ๆˆ‘ๆƒณ่ตทๆฅๆˆ‘ไฟก็”จๅก็Žฐๅœจ่ฟ˜ๆฒก่ฟ˜\n720504 0 ๆˆ‘ๅˆ†ไบซไบ†็™พๅบฆไบ‘้‡Œ็š„ๆ–‡ไปถ๏ผš?Standbyme็ฒค่ฏญ\n1000 Processed\n classify content\n721000 0 ๅœจไธœๅ…ด่ถŠๅ—้ฃŽๆƒ…่ก—็”ตๆขฏ้—จๅ‰ไนฐ็š„ๆฆด่Žฒ้‡Œ้ข้ƒฝ้•ฟ่™ซไบ†\n721001 0 ๅœจๆ— ้”กๅ“ชไธช็Š„่ง’ๆ—ฎๆ—ฏ้‡Œ้ƒฝ่ƒฝ้‡ๅˆฐ็†Ÿไบบ\n721002 0 ็‰นๅˆซ็‰ˆ|ๅšๅฎขๅ—ไบฌไฝ“่‚ฒๅนฟๆ’ญ\n721003 1 ๆ„Ÿ่ฐขๆ‚จ่‡ด็”ตๅฎ‰้€ธxxx่ฟž้”้…’ๅบ—่‡ช่ดกๅบ—!ๆœฌๅบ—ๆ ผ่ฐƒ่ฑช้›…ใ€้…ๅฅ—้ฝๅ…จ\n721004 0 ๅธ‚็ฌฌไบŒๆณ•้™ข้€šๆŠฅไบ†่ฟ™่ตท้žๆณ•ๅˆถ้€ ๆณจๅ†Œๅ•†ๆ ‡ๆ ‡่ฏ†็ฝช็š„ๅˆค\n1000 Processed\n classify content\n721500 0 ๅค„3ไธ‡ๅ…ƒไปฅไธŠ10ไธ‡ๅ…ƒไปฅไธ‹็š„็ฝšๆฌพ\n721501 0 ๆœฌๆฌกๅฑ•ไผš็”ฑๆต™ๆฑŸ็œ่ฟž้”็ป่ฅๅไผš\n721502 0 ๅˆ†ไบซไธ€ไธ‹ๆœฌๆฅๆƒณไนฐ่ฟ™ไธช็š„ไฝ†ๆ˜ฏๆ‰‹ๆœบ่€ณๆœบ่ฒŒไผผไธ้€‚็”จๅ…ถไป–่ฎพๅค‡\n721503 0 ่ญฆ็คบๆฏไธ€ไธชๅ‰ๆฅๅธๆณ•ๆ‰€ๅฝ“้ขๆŠฅๅ‘Š็š„็คพๅŒบๆœๅˆ‘ไบบๅ‘˜\n721504 0 ๆœฌๅ่ฎฎๆ˜ฏ้•ฟๅ…‰่พฐ่ŠฏไธŽๅซๆ˜Ÿๅ…ฌๅธๅ…ณไบŽ้ซ˜ๆ€ง่ƒฝCMOSๅ›พๅƒไผ ๆ„Ÿๅ™จ็ญ‰้กน็›ฎ็ ”ๅˆถ็š„ๆก†ๆžถๅ่ฎฎ\n1000 Processed\n classify content\n722000 0 
ๅตไบ†ๅŠๅคฉๆˆ‘ๅ†ณๅฎšๆปšๅ›ž็”ต่„‘ๅ‰็œ‹โ€ฆโ€ฆๆˆ‘BOๅ“ฅๆœ€ๅธ…ไบ†ๆœ‰ๆœจๆœ‰\n722001 0 โ€œ6pairๅŠโ€ๆ˜ฏๆŒ‡ๅฝ“ๆ—ถ้ฆ™ๆธฏๅ•†ไธš็”ตๅฐ็š„13ไธชDJ\n722002 0 ่ฏ่ฏดๅคชๆน–ๆ—่พนๅฑฑไธŠ็š„ๅˆซๅข…ๆ˜ฏ็œŸๅฟƒๆผ‚ไบฎ\n722003 1 ๆ–ฐๅนดๅฅฝ๏ผ็Žฐๅงฟ้‡‘ๆœๅŠกๅ…จ้ขๅผ€ๅง‹๏ผไธบๆ‚จๅฅ‰ไธŠๆ–ฐๅนด็ฌฌไธ€ๆกถ้‡‘๏ผไธบๆ‚จไบ‹ไธšไฟ้ฉพๆŠค่ˆช๏ผ้“ถ ่กŒ่ดท ๆฌพ๏ผŒ็ซ‹ๆฏไฝŽ่‡ณx...\n722004 0 ๆญฆๅชšๅจ˜้ซ˜็ง‘ๆŠ€ๆœบๅ™จไบบ็އๅ…ˆๅœจๆ™บ่ƒฝๆœๅŠกๆœบๅ™จไบบ่กŒไธš่ฟ็”จ้ซ˜็ง‘ๆŠ€ๆŠ€ๆœฏ่‡ชไธป็ ”ๅ‘็š„ไผš่ฏด\n1000 Processed\n classify content\n722500 0 ๅœฐ้“ๅๅท็บฟ้‡Œๅ‡บ็Žฐๅฅฝๅคš่งไบบๅฐฑ็ฃ•ๅคดไนž่ฎจ็š„ๅฐๅญฉๅ„ฟ\n722501 0 ๅฅฝๅฃฐ้Ÿณๆ€Žไนˆๅˆ่ฎฉ่…พ่ฎฏๅšๆ’ญๆ”พๅนณๅฐๅ•Š\n722502 0 cucumboๅฏไปฅ่‡ช่กŒ็ป„่ฃ…ๅ’Œๆ›ดๆข\n722503 0 ๆ— ้”กๅธ‚้”กๅฑฑๅŒบไธœไบญ่ก—้“ๆ–ฐๅฑฏ็คพๅŒบ่—คๅบ„ๅททๅฑ…ๆฐ‘ๆ‹†่ฟๆƒ…ๅ†ตไปŽๆœชๅฑฅ่กŒๅ…ฌ็คบ็จ‹ๅบ\n722504 0 ่€Œ็”จๆˆ‘็š„ๆ‰‹ๆœบๅทๅ’Œไธ€ๆ ท็š„ๅฏ†็ ่ฟ›ๅŽปๅŽๅฐฑ่ฟ˜ๆ˜ฏๆˆ‘\n1000 Processed\n classify content\n723000 0 Qqๅพฎไฟก็”จไธไบ†ๆœ‹ๅ‹ไปฌๅœจๅพฎๅšไธŠ่”็ณปๆˆ‘\n723001 0 ๆฏๅคฉๅ–ไบŒๅๅ‘ณไธญ่ฏ็š„ๆˆ‘็ช็„ถๅ‘็Žฐไบ†่‡ชๅทฑ็š„ๅ…ด่ถฃ\n723002 0 ๆˆ‘ๅˆšๆ‰ๅฑ…็„ถ็”จๅŠ›ๆ‘ๆˆ‘็š„็”ต่„‘ๅฑๅน•\n723003 0 ๆต™ๆฑŸๅซ่ง†ๆŒ‘ๆˆ˜่€…่”็›Ÿ่ฟ˜ๆœ‰xๅคฉ\n723004 0 ไนๅ˜‰่‹้ซ˜้€Ÿ่‹ๅทžๆ–นๅ‘ๅ‡คๆกฅๅ‡บๅฃๅŒ้“ๆœ‰่พ†ๅฑๅŒ–ๅ“่ฟ่พ“่ฝฆไพง็ฟป\n1000 Processed\n classify content\n723500 0 ไนŸไฝ“้ชŒๅˆฐไบ†ๅธธๅทžไบบๆฐ‘็š„ไฝŽๆถˆ่ดน้ซ˜่ดจ้‡\n723501 0 ้€‚็”จ้€Ÿ่ฃ็จ‹ๅบๅฎก็ป“ๆกˆไปถxxxxไปถ\n723502 0 ๆฒณๅŒ—ๅŽๅคๅนธ็ฆ0๏ผš0ๆฑŸ่‹่ˆœๅคฉ\n723503 1 ............ๆ–คๅฅ–xxxๅ…ƒใ€‚[้ผ“ๆŽŒ][้ผ“ๆŽŒ][้ผ“ๆŽŒ][้ผ“ๆŽŒ]ๅฏปๆ‰พๆด›้˜ณ็ฌซไธ€่ƒ–ๅ…่ดนๅ‡้‡...\n723504 0 ่ฟ˜ไธๅฆ‚ๆˆ‘ไปฌChinaJoy้šไพฟไธ€ไธชๅฑ•ๅฐ็š„ๅฆนๅญโ€”โ€”ๆ— ่ฎบๅœจๆ•ฐ้‡ๅ’Œ่ดจ้‡ไธŠ้ƒฝๆ˜ฏๅฆ‚ๆญค\n1000 Processed\n classify content\n724000 0 ๆˆด็œผ้•œ็™ฝ่กฃ็”ทๅญๆ˜ฏไบบๅคง็š„ไธ‡ไธปไปปๆปก่บซ่ก€ๆ˜ฏๅ—ๅฎณ่€…ๅผ ้“ญ\n724001 0 ๅšๅ…ฌ็›Š็š„ไบบๆ”ฟๅบœๅŠๅฎ˜ๆ–นๅช’ไฝ“ไธ่ฆๆ’ๆ‰‹\n724002 0 ๆ˜จๆ™šๅœจ้ฃžๆœบไธŠๆ‹็š„้—ช็”ต๏ฝž\n724003 0 ็Œชๅœจไผ ็ปŸๆ–‡ๅŒ–ไธญๆ˜ฏ่ดขๅฏŒ็š„่ฑกๅพ\n724004 1 ๏ผˆไบบๆฐ‘ๅ•†ๅœบ๏ผ‰ๆ˜ฅๅคๆ–ฐๆฌพๅ…จ้ฆ†็››ๆ”พ๏ผŒxxๅคงๅŒ–ๅฆ†ๅ“็‰Œ็‹ฌๅฎถ่ต 
็คผ๏ผŒไผšๅ‘˜็งฏๅˆ†ๆปก้ขๅฏๅ…‘็Žฐ้‡‘ๅˆธ๏ผŒๅ‡ญๆญค็Ÿญไฟก่ดญๆŒ‡ๅฎš...\n1000 Processed\n classify content\n724500 0 ๅค–ๅช’่ฏ„ไปท๏ผšไธ€ๅŠ ๆ‰‹ๆœบ2ไธŽไธ‰ๆ˜ŸS6ไธ็›ธไธŠไธ‹\n724501 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๅˆšๅˆš็ป™ๆ‚จๆ‰“็”ต่ฏ็š„ๅ•็‚œ็‚œใ€‚ๆˆ‘่กŒ็ŽฐๅœจๆŽจๅ‡บ็บฏไฟก็”จ่ดทๆฌพ๏ผŒ้ขๅบฆไธบxx๏ฝžxxxไธ‡๏ผŒๅˆฉๆฏไฝŽไบŽๅธ‚ๅœบ...\n724502 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏๆฝๅŠๆฒƒๅฏŒๆœบๆขฐๆœ‰้™ๅ…ฌๅธ ็Ž‹ๅฅใ€‚ๆˆ‘ไปฌไธป่ฆ็”Ÿไบงๅพฎ่€•ๆœบใ€ๅผ€ๆฒŸๅŸนๅœŸๆœบใ€็”ตๅŠจๆฑฝ่ฝฆใ€‚็Žฐๅœจๅœจๆฒณๅ—ๅธ‚...\n724503 0 ๅ—้€šใ€่‹ๅทžใ€ๆ— ้”กใ€ๅธธๅทžใ€็›ๅŸŽใ€ๆณฐๅทž็š„ๆœ‹ๅ‹ๆณจๆ„ไบ†\n724504 0 ็›ˆ็ง‘ๅพ‹ไบ‘๏ผๅ…จ็ƒๆŠ•้—น่ž่ต„/ๅ•†ๅŠก/็งปๆฐ‘/็จŽๅŠก/ๆณ•ๅพ‹ๆœๅŠก\n1000 Processed\n classify content\n725000 0 ๆณ•ๅ›ฝsandroๅ†ฐไธๅŽŸๅ•่ฟž่กฃ่ฃ™่ฟ›ๅฃๅž‚ๆ„Ÿๆ›ฒ็ ๅ†ฐไธ้ขๆ–™ๅŒ้ข้’ˆ็ป‡ๅŠ ๆทฑไปฝ้‡ๆ„Ÿ้ขๆ–™่ถ…ๅฅฝๆ‰“็†ๆ€Žไนˆๆด—ไนŸไธไผš...\n725001 0 ๅฆ่บฒๅŒฟๅœจ้‡‘ๆตท่ฅฟๅ›ญๆ— ่ฏๅฐ่ดฉ\n725002 0 ่ฟ™็›ธๅฝ“ไบŽๅœจWindowsPhone็š„ๆฃบๆไธŠๅˆๅฎšไธ€ไธช้’‰ๅญ\n725003 0 ็›ฎๆต‹ๅทฒ็ปๅˆฐไบ†ๅ—ไบฌๅคๅคฉๆœ€็ƒญ็š„ๆ—ถๅ€™\n725004 0 ๅพˆๆƒณ็Ÿฅ้“็œŸ็›ธ๏ผš็Ž‹ๆž—ๆ˜ฏๅฆ็œŸ่ƒฝ็”จๆฐ”ๅŠŸๆฒป็—…\n1000 Processed\n classify content\n725500 1 ็พŽๆดฅๆค็ง€ๅ…จๅœบๆปกxxxๅ‡xx๏ผŒxๆœˆxๆ—ฅไน‹ๅ‰่ฟ˜ๅฏไบซๅ—ๆปกๅ‡ๅŽ้ขๅค–x.xๆŠ˜๏ผŒๅฑŠๆ—ถๆ•ฌ่ฏทๅ„ไฝๆ–ฐ่€ไผšๅ‘˜ๅ…‰ไธด...\n725501 0 xxxxๅŒ—ไบฌๅŸŽไฟก็”จๅกๅ–็Žฐ่ฅฟ่—ๆ—…ๆธธๆšด่ตฐ่ฎกๅˆ’ๆ็พ+็บฆไผด่ดด\n725502 0 ๅชๆœ‰่ฟ™ๆ ทไฝ ๆ‰่ƒฝ่ถŠๆฅ่ถŠๅฅฝๅŠ ๆฒน\n725503 0 2ใ€้€š่ฟ‡้“พๆŽฅ่ฟ›ๅŽปๅŽๆณจๅ†Œๅฎžๅ่ฎค่ฏ\n725504 0 ๅŒ…้‚ฎ็‘žๅฃซๆญฃๅ“ๅ็‰ŒBUREIๆ‰‹่กจ็”ทๅฃซๆ–ฐๆฌพ้’จ้’ข้˜ฒๆฐด็Ÿณ่‹ฑ่กจๆƒ…ไพฃๆ‰‹่กจBUREIๅฎๆขญ\n1000 Processed\n classify content\n726000 0 miumiu็š„่ฎพ่ฎกๆ˜ฏๆ„ๅผ้ฃŽๆ ผ\n726001 0 ๅœจๅ‘ฝ่ฟ็š„ๆผฉๆถก้‡Œไธ€่พน็œŸ็›ธๅœจ็”Ÿๆดปไธญๆญค่ตทๅฝผไผไบ†ๆ—…่กŒ\n726002 1 ๅ˜‰ๅ…ดๅ—ๆบชๅคšๅˆๅฅฝๅบ—่ฟŽ่พฃๅฆˆ้—นๅ…ƒๅฎตๆดปๅŠจ๏ผšxๆœˆxๆ—ฅไธ€ไธ€xๆœˆxๆ—ฅ ๅ…จๅœบๅฅถ็ฒ‰.็บธๅฐฟ่ฃคๆปกxxxๅ…ƒ้€xxๅ…ƒ็คผ...\n726003 0 ๅˆซไบบไธ‹้ฃžๆœบๅŽปๅพ—็ฌฌไธ€ไธชๅœฐๆ–นๆ˜ฏๆ™ฏ็‚น\n726004 0 ็›ฎๅ‰ๅ› ๆถ‰ๅซŒๅ…ฅๅฎค็›—็ชƒ่ขซๅˆ‘ไบ‹ๆ‹˜็•™\n1000 Processed\n classify content\n726500 0 ็„ถๅŽๆ‰“ไธช้ฃžๆœบๅŽปไบ†ๅฆไธ€ไธชๅœฐๆ–น็ปง็ปญ่ฟ‡็€็™ฝๅคฉ๏ฝž\n726501 0 
่ฟ˜ๆœ‰ไป€ไนˆ็†่ดขappไฝ ไปฌ่ง‰ๅพ—ๆฏ”่พƒๅฅฝ็”จ็š„\n726502 0 ๅ…ญๅˆๅŒบๆญคๆฌก่‡ช่กŒ่ฝฆๆŠ•ๆ”พๅฐ†ๅˆ†ไธบ3ไธชๆ‰นๆฌก\n726503 0 ๅฌไป–่ฎฒ่ฟฐๆœบๅ™จไบบไบงไธš็š„ๅ‘ๅฑ•ๆ–นๅ‘\n726504 0 ็”จFWSim่ฝฏไปถ่ฎพ่ฎก็š„็ƒŸ่Šฑ\n1000 Processed\n classify content\n727000 0 2014ๅนดๆฒฟๆตทๅ„ๅœฐๅŠ็›ธๅ…ณ้ƒจ้—จ็งฏๆžๆŽจ่ฟ›ๆตทๆฐดๅˆฉ็”จๅทฅไฝœ\n727001 0 ไนๅทๆ™šไธŠๅผ€ๅง‹ๆˆ‘็š„ๅ—ไบฌxๅคฉxๅคœๆ—…่กŒ\n727002 0 ๆœฌ้—จไปŽ็š–ๅ—่ฟžไบ‘็ฅž้žญ้—จๅคบๅพ—ๅธๆ˜Ÿๅคงๆณ•ๆฎ‹็ซ ๅ››\n727003 0 ๆŽๆ˜“ๅณฐๆŽๅคšๆตทๅˆ†ๆ‰‹ๅ†…ๅน•ๅ†่ขซๆ‰’\n727004 0 Life็ ดisalwaysatest\n1000 Processed\n classify content\n727500 0 ๆˆ‘ไปฌAndroidๅผ€ๅ‘่€…็š„้€‰ๆ‹ฉ่ฟ˜็œŸไธๅฐ‘\n727501 0 ไธ–็•Œๅ„ๅœฐ็š„่ฎพ่ฎกๅธˆ็”จไป–ไปฌ็‹ฌ็‰น็š„ๆƒณ่ฑกๅŠ›่ฎพ่ฎกไบ†่ฎธๅคšไธŽไผ—ไธๅŒ็š„็ง‹ๅƒ\n727502 0 ๅฐŠ้‡็Ÿฅ่ฏ†ไบงๆƒๆ˜ฏๅ…จ่กŒไธš็š„ๅ…ฑๅŒ่ดฃไปป\n727503 0 8ๅทๆฒ™ๅฐ็ฑณๆฑฝ่ฝฆ็งŸ่ต่™นๅฃๅผ€ๅทฅไธปๅฉš่ฝฆๆ–ฐๆฌพ็Ž›่ŽŽๆ‹‰่’‚ๆ€ป่ฃ\n727504 0 Pepper็š„ไฝฟ็”จไฝ“้ชŒ็ฉถ็ซŸๅฆ‚ไฝ•\n1000 Processed\n classify content\n728000 0 ็œพ้‡Œๅฐ‹ไป–ๅƒ็™พๅบฆ้ฉ€็„ถๅ›žx้ฆ–้‚ฃไบบๅปๅœจ็‡ˆ็ซ้—Œ็Š่™•\n728001 0 Amazon็Žฐๆœ‰CreativeCatsๅŠจ็‰ฉ็ณปๅ–ตๆ˜Ÿไบบๆถ‚้ธฆๅกซ่‰ฒๆœฌไป…ๅ”ฎ$4\n728002 0 ๅฏนๆฃ€ๆ–นๆŒ‡ๆŽงไป–ๅ—่ดฟxxxxxxๅ…ƒ\n728003 1 j่ฝ‰่ฎ“xxx-xxxxxxxx* xxxxxxxx xxxxxxxxxxx xxxxxxxx...\n728004 0 ไธœๅŒ—ๅกๅธฆๅผ่‹ฑ่ฏญโ€ฆใ€Œ25ไธ‡็ƒญๆ’ญ่ง†้ข‘ใ€ๆฐ‘ๅ›ฝ็็จ€ๅฝฑๅƒ\n1000 Processed\n classify content\n728500 0 ๅ—ไบฌ่ฃ…ๆฝขๅ…ฌๅธ๏ผš4ๆ‹›ๆ”นๅ˜ๅŽจๆˆฟ้ฃŽๆฐดๆ‹›่ดข็บณๅฎๅฅฝ็ฆๆบ\n728501 0 ๆœ‹ๅ‹ๅœˆๅค„ๅค„้ƒฝๆ˜ฏๆ€้˜ก้™Œ/่‰ฒ/่‰ฒๅ„็ง็Šฏ่Šฑ็—ด/ไบฒไบฒ/ไบฒไบฒๅฅณไบบไปฌ\n728502 0 CDECGAMINGๅŠ ๆฒน\n728503 0 MichaelKorsSelmaไธญๅท้“†้’‰่€ณๆœตๅŒ…\n728504 0 ็„ถๅŽๅบง11็‚น45ๅˆ†้ฃžไฝไธฝๆฑŸ็š„็ญๆœบ\n1000 Processed\n classify content\n729000 0 ็ง‹ๅถๅŽŸ็š„ๅฅณไป†ๅ’–ๅ•กๅฑ‹?cafeๆŽจๅ‡บๅ…จๆ–ฐๅˆถๆœ\n729001 0 ไธŽไผšไธ“ๅฎถๅญฆ่€…ๅ›ด็ป•โ€œไธญ่ฏ่ฏ็†็ ”็ฉถ็š„ๅˆ›ๆ–ฐไธŽๅ‘ๅฑ•โ€่ฟ™ไธ€ไผš่ฎฎไธป้ข˜ๅฑ•ๅผ€็ ”่ฎจ\n729002 0 ๆฑŸ้ƒฝๅŒบๆฐ”่ฑกๅฐxๆœˆxxๆ—ฅxxๆ—ถๅ‘ๅธƒ๏ผšไปŠๅคฉๅคœ้‡Œๅˆฐๆ˜Žๅคฉ้˜ดๆœ‰้˜ต้›จๆˆ–้›ท้›จ\n729003 0 ๅ‘็Žฐ็œ‹็€ๅฅฝ็œ‹็š„ๅคง้ƒฝๆ˜ฏๆ—…ๆธธๅฉš็บฑ\n729004 0 ๆŠฅๅๆ—ถ้—ด๏ผš2015ๅนด8ๆœˆ9ๆ—ฅโ€”2015ๅนด8ๆœˆ11ๆ—ฅ\n1000 
Processed\n classify content\n729500 0 ๆ˜ฏๅฆ็พŽๅ›ฝๅฏไปฅ้€š่ฟ‡ไฟก็”จๅก็š„ไฟกๆฏๅœจๅฝ“ๅœฐ็š„ๅบ—็š„ๆถˆ่ดน่ฎฐๅฝ•้‡Œๆฃ€็ดขๅ‡บๆˆ‘่ดญไนฐ็š„ๅž‹ๅทๅ‘ข\n729501 0 ้ƒฝๆ˜ฏ่Œ่Œๅ“’~~้ป„้‡‘ๅทฆ่„š่ธข็ƒๅงฟๅŠฟไนŸไธ้”™\n729502 0 1946ๅนด้‡ๅทกๆด‹่ˆฐๅฆ™้ซ˜่ขซๅ‡ฟๆฒ‰ไบŽ้ฉฌๅ…ญ็”ฒๆตทๅณก\n729503 1 ๆ‚จๅฅฝ๏ผๆˆ‘ๆ˜ฏ่ฅฟๅŒบๆฌงๅ‡ฏ้พ™่ดŸไธ€ๆฅผ๏ผšๅŽจๅฃนๅ ‚็Žฏไฟ้›†ๆˆ็ถ็š„้”€ๅ”ฎๅ‘˜๏ฝžๅฐๆšดใ€‚ๆœฌๅ•†ๅœบxxxxๅนดๅปบๆๅฎถๅ…ทโ€œx.x...\n729504 0 Vic็ป™ๆˆ‘็•™่จ€่ฏด๏ผšๅ’Œๅผ€้ฃžๆœบ็š„ๆ‰“่ตทๆฅไบ†\n1000 Processed\n classify content\n730000 0 ๆฏ”่พƒ้‡่ฆๅ“ฆ~ๅฟซget่ตทๆฅๅง~\n730001 0 2016ๅ›ฝๅฎถๅ…ฌๅŠกๅ‘˜่€ƒ่ฏ•ๆš‘ๆœŸๅค‡่€ƒๆญฅๆญฅไธบ่ฅโ€œๆ‰่กŒๆต‹็”ณ่ฎบโ€\n730002 0 ๆ—ถ้š”5ๅคฉ็ปˆไบŽ่ฟ‡ไธŠๆœ‰ๆ‰‹ๆœบ็š„็”Ÿๆดปไบ†\n730003 0 ๅคฎ่กŒๆ˜จ้€†ๅ›ž่ดญ400ไบฟๅ…ƒๆœบๆž„้ข„่ฎกๆœชๆฅๆ•ฐๆœˆๅฎฝๆพ็ซ‹ๅœบไธๅ˜\n730004 0 ็พŽ้“ญไน้ซ˜ๆœบๅ™จไบบๅŸบๅœฐๅˆไธ€ๆฌกๅ—้‚€ๅ‚ๅŠ xxxxไธญๅ›ฝๅ›ฝ้™…ๆถˆ่ดน็”ตๅญๅš่งˆไผšๆšจๅฑฑไธœ็œๆœบๅ™จไบบๅฑ•ไผš\n1000 Processed\n classify content\n730500 0 ็œŸๅฟƒไธบ็›ๅŸŽ็š„ๅ…ฌๅ…ฑๅŸบ็ก€่ฎพๆ–ฝๅปบ่ฎพ็‚น่ตž\n730501 0 ไปŠๅคฉ่ฆๆŠฑ็€ไธ€ๅฐ็”ต่„‘ๅ’Œๆๆ–™ๅ›žๅŽปๅ•Š\n730502 0 ๆˆ‘็š„้ข่ฏ•ๆˆ็ปฉๆ•ดไธชๆฃ€ๅฏŸ้™ข็ณป็ปŸ็ฌฌไธ€\n730503 0 ๆ”ฟๅบœไผšๅผบๅˆถๆ€ง่ฆๆฑ‚ๅˆถ้€ ๅ•†ๅœจLEDไบงๅ“ไธŠๆ ‡ๆณจ่ƒฝๆ•ˆๆ ‡่ฏ†\n730504 1 ใ€Šๅนณๅฎ‰้“ถ่กŒๆ–ฐไธ€่ดทใ€‹็ฎ€ๅ• ๅฟซ ๆ— ้œ€ๆŠตๆŠผใ€‚ xxๅฐๆ—ถๅ†…ๆ”พๆฌพใ€‚ๅˆฉ็އไฝŽ๏ผŒ ่ดทๆฌพxxไธ‡ๅ…ƒไธ€ๅนดๅˆฉๆฏ็บฆx....\n1000 Processed\n classify content\n731000 0 ๆˆ‘ๅˆฐ็Žฐๅœจๆ‰็Ÿฅ้“ๆผ”่Šฑๅƒ้ชจ้‚ฃไธชๆ˜ฏๅผ ไธนๅณฐ\n731001 1 ๆ™ฎๆƒ ้‡‘่ž โ€œๅฎœไฟก.ๅฎœไบบ่ดทโ€ ไฟก็”จ่ดทๆฌพโ€ฆโ€ฆๅ…จๅŠ›ๅŠฉๆ‚จๅ‘ๅฑ•ใ€‚x.ไฟก็”จๅ€Ÿๆฌพๆ— ้œ€ๆŠตๆŠผๆ‹…ไฟ๏ผŒๆ— ไปปไฝ•ๅ‰...\n731002 0 ๅพ่ดคๅˆšๆ›ดๆ–ฐไบ†Twitter๏ผš??????????~?????????????ๅŽŸๆ–‡ๆˆณ๏ผš\n731003 1 ็Ÿณๆฒนๆด’ๆฅผๆญฃๆœˆๅไบŒๅผ€ๅง‹่ฟŽๆŽฅ้ซ˜ๆœ‹ๅฅฝๅ‹ไบ†๏ผŒ็พŠๅนดๅ–œๆด‹ๆด‹๏ผŒไปŠๅนด้…’ๆฅผ่œๅ“ไปฅ๏ผŒไฟ็‰น่‰ฒ๏ผŒๆŽจๆ–ฐๅ“ๅšๅฎถๅฎด๏ผŒๅทฅไฝœๆ”ด...\n731004 0 ๅšๆณ•๏ผšๅฐ†็‰›ๅฅถๅ’ŒQQ็ณ–ไธ€่ตทๆ”พๅœจ้”…ไธญ\n1000 Processed\n classify content\n731500 0 ๅผ ๅฎถๆธฏๆ˜“้“ๆ•™่‚ฒๅฎ่ดไปฌๅˆ›ไฝœ็š„็”ป\n731501 0 ๆ—ฅๆœฌ่ญฆๆ–น่ฟ‘ๆ—ฅ้€ฎๆ•ไบ†ๅ…ถCEO้ฉฌๅ…‹ยท็ง‘ๅฐ”ไฝฉๅ‹’ๆ–ฏ\n731502 0 
ๅฐฑๅธŒๆœ›ๆ‰€ๆœ‰่บซ่พน็š„่ดชๅฎ˜ๅ…จ่ฝ็ฝ‘\n731503 0 ๆ€Žไนˆไฟ่ฏ่ฟ™ไบ›้‡‘่ž้ซ˜็ฎกไธ่ขซๆ‹‰ไธ‹ๆฐด\n731504 0 ๆ™šไธŠๅœจ่ถ…ๅธ‚็œ‹ๅˆฐไธช็”ท็š„ไธŠ็”ตๆขฏ\n1000 Processed\n classify content\n732000 0 141ๆˆทๆ‘ๆฐ‘ไฝ่ฟ›ไบ†ๆ–ฐๅปบ็š„ๅˆซๅข…\n732001 0 ๆฑŸ่‹ๅซ่ง†้ž่ฏšๅ‹ฟๆ‰ฐๅฅณๅ˜‰ๅฎพๅญ™ๅช›ๅช›ไธชไบบ็ฎ€ๅކๅ’Œ็ฝ‘ๅ‹่ฏ„ไปท|้žไฝ ่Žซๅฑž|้ž่ฏšๅ‹ฟๆ‰ฐ|็”ต\n732002 0 ๅคงๅฎถๅฟซๆฅ็œ‹็œ‹~ๅพฎ่ฝฏๆญฃๅผๅผ€ๅฏWinx/Winx\n732003 0 ๅทฅ็ฎก1402็š„ๆŽๆธฏๆฅๅŒๅญฆ้€‰ๅ–ไบ†ๆ— ้”กๆœฌๅœฐๅไบบๆ‰ๅญไธญๅ›ฝ่ฟ‘ไปฃๆ–‡ๅŒ–ๅคงๅธˆ้’ฑ้’Ÿไนฆ็š„ๆ•…ๅฑ…ไธบๆœฌๆฌก็คพไผšๅฎž่ทต็š„่ฎฟ้—ฎๅœฐ\n732004 0 ไธญๅ›ฝ้ฆ–ๅฑŠๆ— ๆฒน็ƒŸ่Š‚่ฝๅน•ๆต™ๆฑŸๆตทๅฎ็”ฑ็ง‘ๅคง้›†ๆˆ็ถไธปๅŠž\n1000 Processed\n classify content\n732500 1 ๅ…ฌๅธ่ดขๅŠกๅทฅ่กŒๅธๆˆท๏ผšxxxxxxxxxxxxxxxxxxx็Ž‹ๅฎ‰็ปช\n732501 0 ไธ€ๅคงๆธ…ๆ—ฉๅŒไบ‹ไปฌๅฐฑๅผ€ๅง‹่Š็”ต่ง†ๅ‰ง่ฏด่Šฑๅƒ้ชจ่ฏด็š„ๆดฅๆดฅๆœ‰ๅ‘ณโ€ฆๆ นๆœฌๆฒกๆœ‰ๅ…ฑๅŒ่ฏ้ข˜ๅ•Š\n732502 0 ไปŠๅคฉๆ™šไธŠ็›ๅŸŽๅ‡บ็Žฐไบ†ไธ€็พคๆ‘ฉๆ‰˜้ฃ™่ฝฆๅ…š\n732503 0 ไปŠๆ—ฅๆ‹…ไปปๅฎกๅˆค้•ฟ็š„ๆ˜ฏ้˜ณ็‘œๆณ•ๅฎ˜\n732504 0 ้šๅŽๅพฎ่ฝฏXbox้ƒจ้—จไธป็ฎกPhilSpencer็™ปๅœบ\n1000 Processed\n classify content\n733000 0 ๅฐ†ๅฆ‚ไปŠๆณ•ๅฎ˜ๅบญๅฎกๆ‰€้ตๅพช็š„โ€œ่‡ช็”ฑๅฟƒ่ฏโ€ๅŠๆณ•ๅพ‹ไบ‹ๅฎžๅ’Œๆณ•ๅพ‹่ฏๆฎไน‹้—ด็Ÿ›็›พ่กจ็Žฐ็š„ๆท‹ๆผ“ๅฐฝ่‡ด\n733001 0 ๆ— ็บฟU็›พ็›ธๅฝ“ไบŽๆŽŒๆŽงๆ‰‹ๆœบไฟกๆฏๅฎ‰ๅ…จ็š„ๆ— ็บฟ้’ฅๅŒ™\n733002 0 ๆˆ‘ๅ็”ตๆขฏไน‹ๅ‰่ฟ˜ๅพ—ๅ…ˆๅผ„ๆ˜Ž็™ฝ็”ตๆขฏ็š„ๅฎ‰ๅ…จไธŽ็ป“ๆž„\n733003 0 ๅฆ‚ๆžœ็”Ÿๆดปๅผบๅฅธไบ†ไฝ ไฝ ๅˆไธ่ƒฝๅๆŠ—้‚ฃๅฐฑ่ฆ่ฏ•็€ไบซๅ—\n733004 0 ๆ‚จ็š„ๅŒป็”Ÿๆ€€็–‘้ซ‚้™่„‰ๅ—ๅŽ‹ๆ˜ฏๅฆๆœ‰็›ธๅ…ณๅฝฑๅƒๅญฆๆฃ€ๆŸฅ่ฏๆฎๆ”ฏๆŒ\n1000 Processed\n classify content\n733500 0 ๅธฆไฝ ไบ†่งฃSOHO่ฟ™ไบ›ๅฎž็”จๆ€ง่ถ…้ซ˜็š„้…’ๅ…ท\n733501 0 ๅฎฃๅธƒๅฐ†็ƒ่กฃ้”€ๅ”ฎ้ข็š„5%ๆ็Œฎ็ป™ไธ€ๅฎถๅไธบ\n733502 0 ไปฅ้˜ฒ็”ตๆขฏ็š„ๆ‰ถๆ‰‹ๅ’Œๆขฏ็บง่ฟ่กŒไธๅฎŒๅ…จๅŒๆญฅ\n733503 0 ๅฐๆ•ฃไปฌ็œŸ็š„ๅธŒๆœ›ๆ”ฟๅบœๆ‹‰ๅ‡ไธญ็Ÿณๆฒน\n733504 0 ้€‰ๆ‹ฉๆ‘„ๅฝฑๅคง่ต›ไธญ็š„็ฌฌ10342ๅทๆฏๅคฉๆŠ•ไธ€็ฅจ\n1000 Processed\n classify content\n734000 0 ๆ–‡็ง‘ๆฅผ็š„่‡ชไฟฎ`็ฏฎ็ƒๅœบ็š„ๆฑ—ๆฐด`ๅ›พไนฆ้ฆ†็š„ๅฐ่ฏด`้‡‘ๆฒ™็ฝ‘ๅง็š„ๅˆไผ‘ๆ—ถๅ…‰`้ฃŸๅ ‚่ง‚ๆˆ˜NBAๅ’Œไฝๅฎฟ็š„้—ๆ†พ?ๅ†ไผšๅง\n734001 0 
่‡ชๅทฑๆŠŠไธœ่ฅฟ็œ‹ๅฅฝๅ•Šๅ“ˆๅ“ˆๅ“ˆๅ“ˆๆˆ‘็ซŸ็„ถๆ— ่จ€ไปฅๅฏน\n734002 1 ๆ‚จๅฅฝ๏ผŒๆˆ‘ๆ˜ฏ้‡‘ๅธๆตท็€ๅฐๅผ ๏ผŒ็Žฐๅœจๆˆ‘ไปฌๆŽจๅ‡บxxๅทๆฅผ็š„ไธ€ๆขฏไธ€ๆˆทๆ–ฐๅผ€็›˜็š„ๆˆฟๆบ๏ผŒxxxๅนณๆˆฟๆ€ปไปทไธƒๅๅ…ซไธ‡่ตท...\n734003 1 ้€š็Ÿฅ๏ผš็บข่œป่œ“ๅนผๅ„ฟๅ›ญๆ‹›็”Ÿๅœจๅณใ€‚ไบŒๅๅนดๅŠžๅ›ญ็ป้ชŒ๏ผŒๅฃ็ข‘ๅฅฝ๏ผŒๅธˆ่ต„ๅŠ›้‡้›„ๅŽš๏ผŒๆ•™ๅญฆ็Žฏๅขƒๅฎฝๆ•žใ€‚ๅœฐๅ€๏ผš็บขๆ——ๅฐ...\n734004 0 ไธไป…็”จๆ•ฐๅญ—ๆญๅผ€ๆต™ๆฑŸๅฏŒ่ฑช็š„ๆ•ฐ้‡\n1000 Processed\n classify content\n734500 0 |ๅˆšๅˆš็œ‹ๅˆฐไธ€ไธช่ฏ„่ฎบ่ฏดๅ•ๆ˜ฅ็ง‹่ฆๆ‰“่ดฅ้•ฟ็•™ๆ นๆœฌไธ้šพ\n734501 0 ่ฏฅ็”ทๅญๅ› ๆถ‰ๅซŒ็›—็ชƒ่ขซๆณ•้™ขไธ€ๅฎกๅˆคๅค„ๆœ‰ๆœŸๅพ’ๅˆ‘ไธคๅนด\n734502 0 ่ขซ120ๆ•‘ๆŠค่ฝฆ้€ๅพ€ๅŒป้™ขๆฒป็–—\n734503 0 ๅฎๅฎๆฒ‰่ฟทๆ‰‹ๆœบไธ่ƒฝ่‡ชๆ‹”ๆ€ŽไนˆๅŠž\n734504 0 2ใ€้ฆ™่‡ๆด—ๅ‡€ๆณก่ฝฏใ€ๆžธๆžๆด—ๅ‡€ใ€ๆก‚ๅœ†ๅ‰ฅ็šฎๅค‡็”จ\n1000 Processed\n classify content\n735000 0 ไฝ ่ฏด่ฟ‡ไธ€ไธชไบบ็š„ๆ—…ๆธธๆ˜ฏๅญค็‹ฌ็š„\n735001 0 ๆฏ”่ตท้‚ฃไบ›ไธ€ไธชๅŠฒๅซไฝ ๅŠžไฟก็”จๅก็š„่ฟ˜ๆ˜ฏ่ˆ’ๆœๅคšไบ†\n735002 0 ๆฅไธญๅ›ฝๆ—…ๆธธๅŠ ้ช—้’ฑ็š„ๆ”พๆพไน‹ๆ—…\n735003 0 ๅŒป้™ข๏ผš่—ๅŒป้™ขใ€็œๅ››ๅŒป้™ขใ€่ฅฟๅฎๅธ‚็ฌฌไธ‰ไบบๆฐ‘ๅŒป้™ขใ€ๅบทไนๅŒป้™ข\n735004 0 ๆ—ฅๆœฌCurelๆ•ๆ„Ÿ่‚ŒๅŒ–ๅฆ†ๆฐด้…ๅˆCurelไนณๆถฒไธ€่ตทไฝฟ็”จ็š„ๅŒ–ๅฆ†ๆฐด\n1000 Processed\n classify content\n735500 0 insider็”จๆˆทๅˆฐๅบ•ๆ˜ฏๆ€Žไนˆไธชๆ”ฟ็ญ–\n735501 0 ๅŠ่ฃ™ๆŠ˜ๅŽ173RMBไธŠ่กฃๆŠ˜ๅŽ158RMB\n735502 1 ็ดงๆ€ฅ้€š็Ÿฅ:ใ€Šๅผ ็ˆฑ็Žฒใ€‹่ฟ็บฆๆˆทใ€ๅฅ”่ท‘ๅงๅ…„ๅผŸใ€‘ๆ ็›ฎ็ป„ๅ‘ๆœฌ้™ขๆไบคๆ–‡ไปถๆŽง่ฏ‰ๆ‚จๆŸๅๆ ็›ฎ็ป„็š„ๅ่ช‰ๅ’Œๆญฃ่ง„ๆดปๅŠจ...\n735503 0 SPF50็š„tromborg้˜ฒๆ™’ๅทฒ็ปๅ…จ้ƒจๅ”ฎๅฎŒๅ•ฆ\n735504 0 ็ฌฌไบŒไธชๅพฎๅšๆขไบ†็”ต่„‘ๅฟ˜่ฎฐ่ดฆๅท\n1000 Processed\n classify content\n736000 0 ่ฃๅˆคๅ†ๅ‚ป้€ผไนŸไธ่ƒฝๆŠŠไฝ ๅฐ„่ฟ›็ƒ้—จ็š„็ƒๅ†ๅนๅ‡บๆฅๅ•Š\n736001 1 โ€œๅ”ฑไบซๅŒ—ๅŸŽใ€ๆดพๅฏนๅ‹็บฆโ€ๅŒ—ๅŸŽๆดพๅฏนKTV้‚€ๆ‚จๅˆฐๅบ—ๆฌขๅ”ฑไฝ“้ชŒ๏ผŒๆฏๅ‘จไบŒไผšๅ‘˜ๅ…จๅคฉๅ…่ดนๅ”ฑxๅฐๆ—ถใ€‚ๅœฐๅ€๏ผšไบ”ๅ—...\n736002 0 ๅ›พๅ››็”ทๅฃซๅฅ—่ฃ…๏ฟฅxxxๅ†…ๅซๆด้ข\n736003 0 ๅพฎ่ฝฏ็š„MikeYbarra่ฏๅฎžWin10ๆต‹่ฏ•็‰ˆMinecraftๅฐ†ๆทปๅŠ ๆˆๅฐฑๅŠŸ่ƒฝ\n736004 0 ็ช„ๅนณ่€Œไธ”้ขๅคดไธญๅคฎๆœ‰ๅ‡นๅ‡ธ็Žฐ่ฑก\n1000 Processed\n classify content\n736500 0 
่މ่Žฒๆ–ฐๅจ˜ๅฉš็บฑ็คผๆœ่ฎฉไฝ ๆˆไธบๆœ€็พŽ็š„ๆ–ฐๅซๅจ˜\n736501 0 ๆน–ๅ—็œไบบๆฐ‘ๆฃ€ๅฏŸ้™ขไพๆณ•ไปฅๆถ‰ๅซŒๅ—่ดฟ็ฝชๅฏน็œๅ‘ๆ”นๅง”ๅŽŸๆ€ป็ปๆตŽๅธˆๆจไธ–่Šณใ€ไธญ็งปๅŠจๆน–ๅ—ๅŽŸๅ…š็ป„ไนฆ่ฎฐ็Ž‹ๅปบๆ นๅ†ณๅฎš้€ฎๆ•\n736502 0 ๆˆฟไบง็จŽๅŸบๆœฌๆก†ๅฎšๆŒ‰้ข็งฏๅพๆ”ถ็ฆป็ ดๅ†ฐ่ฟ˜ๆœ‰ๅคš่ฟœ\n736503 0 ๅฏๆ˜ฏๅŒป็”Ÿ่ฏดๆ นๆฎb่ถ…ๆˆ‘ๅฏ่ƒฝไผšๆฒกๅฅถ\n736504 0 ็ป“ๆžœๅดๆทปๅ ตไบ†โ€ฆ็Ÿฅ้“็œŸ็›ธ็š„ๆˆ‘ๆ— ่ฏญไธ”ๆ„Ÿไบบ\n1000 Processed\n classify content\n737000 1 ๅฎ‰ๆ–ฐๅŒ้š†็ดซๆท‘ไธ“ๅŽ…ๅบ—ๅ‘˜ๆŽ่ช้ข–็ฅๆ‰€ๆœ‰ไบฒไบบๅ…ƒๅฎต่Š‚ๅฟซไน๏ผๅ…ƒๅฎต่Š‚่ฟ‡ๅŽไธบ่ฟŽๆŽฅxๆœˆxๅทๅฆ‡ๅฅณ่Š‚็š„ๅˆฐๆฅๅ…จๅœบๆ˜ฅ่ฃ…...\n737001 0 ๅฅณๅ‹็บขๆๅ‡บๅข™็”ทๅ‹ๆŒๅˆ€็ ไบบโ€”โ€”ๅฅณๅ‹ๆญฃ็”จ็บฆ็‚ฎ็ฅžๅ™จไธŽ็ฝ‘ๅ‹่Šๅคฉ\n737002 0 ็‰ฉ็พŽๆŽง่‚กๅŠๅŒ—ไบฌ็‰ฉๆตไฟกๆฏๅˆ่ฎกๆŒๆœ‰ๅ…ฌๅธ69\n737003 0 ๅ…ถๅฎžๆฏไธชไบบๆ‰€้ญๅ—็š„ไธๅ…ฌไธŽ่‹ฆ้šพ\n737004 0 ไฝ ไธ้œ€่ฆๆŠ•่ต„ใ€ไธ้œ€่ฆๅ›บๅฎš็š„ๅทฅไฝœๆ—ถ้—ดใ€ๆฒกๆœ‰ๅฑฏ่ดงๅŽ‹ๅŠ›ใ€ๆฒกๆœ‰็‰ฉๆตๅ›ฐๆ‰ฐ\n1000 Processed\n classify content\n737500 0 ไธคไธชๆต™ๆฑŸ็š„ๅค้•‡ไบบๅœจๆŸไธ€ๆ—ถๅˆป่ขซๆณจๅฎš็š„ๅ› ็ผ˜\n737501 0 ๅฐๅฐผCommunalๅ’–ๅ•กๅงๅ…ผ้คๅŽ…โ€”โ€”Communalๅ’–ๅ•กๅงๅ…ผ้คๅŽ…\n737502 0 ็‚ซๅฑๆ‰‹ๆœบ๏ผšๅŒ4GๅŒ็™พๅ…†ๅ…จๅ…ผๅฎน3\n737503 0 ็‘žๅฃซๅคฎ่กŒๆŠ•่ต„่‚ก็ฅจ็š„ๅธ‚ๅ€ผ็›ธๅฝ“ไบŽ็‘žๅฃซGDP็š„xx%\n737504 0 ้พ™ๅฉ†ๅคไฝ›ๅކ2536ๅนด21ๅนด่€็‰Œ\n1000 Processed\n classify content\n738000 0 ็œ‹็”ตๅฝฑ๏ผšTheJudgeๆณ•ๅฎ˜ๅ’Œไป–็š„ไธ‰ไธชๅ„ฟๅญ็š„ๆ•…ไบ‹2ๅฐๆ—ถ20ๅˆ†้’Ÿ\n738001 0 ็Žฐๅœจ่ฎฉๆˆ‘ไธ€ไธชไบบ็Žฉไป–ไปฌๅŽป้š”ๅฃๆˆฟ้—ด\n738002 0 ่ฎค่ฏไฟกๆฏไธบโ€œๆฑ•ๅคดไธญๅ คๆŠ•่ต„้›†ๅ›ข้กน็›ฎ้ƒจโ€\n738003 0 ๅคฉๆดฅๆปจๆตทๅŒบLTE็ฝ‘็ปœๅ…ณ้”ฎๆŒ‡ๆ ‡ๅพ—ๅˆฐๅพˆๅคงๆ”นๅ–„\n738004 0 ๆœ‰ๆ—ถๅ€™็œŸ็š„ไผš่ง‰ๅพ—ๆˆ‘ๅฆˆ่ฆ้€ผๆˆ‘ๅ…ถๅฎžๆŒบๆ–นไพฟ็š„\n1000 Processed\n classify content\n738500 0 58ๅ…ƒ็บขๅŒ…ๆ‘‡ๅˆฐ3๏ผš็ป™ไฝ 0\n738501 0 ๅœจๆ˜จๅคฉๅธ‚ๆ”ฟๅบœๆ–ฐ้—ปๅŠžไธพ่กŒ็š„โ€œไธ‰่ฏๅˆไธ€โ€ๆ–ฐ้—ปๅ‘ๅธƒไผšไธŠ\n738502 1 ๅคงไธฐ้‘ซ้ฆจๆ•™่‚ฒๆ˜ฅๅญฃๆŠฅๅไบ†๏ผ่ˆž่นˆ็ด ๆ็ป˜็”ป่ท†ๆ‹ณ้“๏ผŒ้ญ”ๆณ•็Žฉๅญ—ๆ‰‹่„‘้€Ÿ็ฎ—ๅฐไธปๆŒ๏ผŒๅฐๅญฆ่ฏญๆ•ฐๅค–๏ผŒๆœฌๅ‘จๅ…ญไน็‚นๅ…จ...\n738503 0 ๆฒณๅŒ—ไธ€ๅ‰ฏๅŽ…้•ฟๆ—ฅๅ‡ๅ—่ดฟ7ไธ‡ใ€ไธๅญ˜้“ถ่กŒใ€ๅจฑไนใ€ๆฌฃ่ต\n738504 0 
ไปŠๅคฉ็ปˆไบŽๆŠŠ่Šฑๅƒ้ชจ็š„ๅฐ่ฏด็œ‹ๅฎŒไบ†\n1000 Processed\n classify content\n739000 0 ๅŠ ๅ…ฅ้ป„็“œๆฑใ€่›‹้ป„ใ€่œ‚่œœๅ’Œๆฉ„ๆฆ„ๆฒน\n739001 0 2015ๅนดๆฑŸ่‹็œๅฐ‘ๅ„ฟ้—จ็ƒๆฏ”่ต›ๅœจๅฎฟ่ฟๅค้ป„ๆฒณไฝ“่‚ฒๅ…ฌๅ›ญ้—จ็ƒๅœบไธพ่กŒ\n739002 0 ้‡‘่‰ฒๅฟซ่ฝฆ่ถ…ๅคง็Žป็’ƒ็ช—ไปฅๅŠ้€ๆ˜Žๅคฉ็ช—่ฎฉๆ‰€ๆœ‰็พŽๆ™ฏๅฐฝๆ”ถ็œผๅบ•\n739003 1 ไธญๅ›ฝไบบๆฐ‘้“ถ่กŒ็‰นๆ‰นๅ‘่กŒใ€Šไธ–็•Œ้—ไบง็บชๅฟตๅธ็่—ๅ†Œใ€‹ๆ€ปxxๆžš๏ผŒๆ€ป่€—้‡‘xๅ…‹๏ผŒ่€—xxxๅ…‹ๅทฆๅณ้“œ้”Œๅˆ้‡‘๏ผŒไธ€...\n739004 0 ไบบไฝ“ๅทฅๅญฆ็”ต่„‘ๆค…ๆ˜ฏๅœจๆกŒๅ‰็”ต่„‘ไธ€ๆ—ๅ‡บ็Žฐไน‹ๅŽ\n1000 Processed\n classify content\n739500 0 ๅธฆ็€ๆปกๆปก็š„ๆฐงๆฐ”ๅ†ๆฌกๅ›žๅฝ’ๅพช่ง„่นˆ็Ÿฉ็š„็”Ÿๆดปไธญ\n739501 0 ๆœ€ๅŽๅฏป่ฎฟไบ†ๆต™ๅคงๅ ค่™ฝ็„ถๆฑ‚ๆ˜ฏ็ฒพ็ฅžๆฐธๅœจ\n739502 0 ๅคงๅž‹ๅŒ–ๅฆ†ๅ“้›†ๅ›ขๆ‰พไธ€ไธชProductManager\n739503 0 ๅŽไธบ่ฃ่€€xplusๆ‰‹ๆœบๅฃณ่ฃ่€€xplusไฟๆŠคๅฅ—็ฃจ็ ‚่ถ…่–„ๅค–ๅฃณๆ–ฐๆฝฎ็กฌ็”ทๅฅณๆƒ…ไพฃ\n739504 0 x็ญๅ’Œx็ญๆˆ็ปฉ่พƒๅฅฝ็š„ไบบ้‡ๆ–ฐๅฎ‰ๆŽ’ๅˆฐx็ญ\n1000 Processed\n classify content\n740000 0 ๆˆๅŠŸๆŠŠๆ‰‹ๆœบไปŽๅ›ฝ่กŒ็ณป็ปŸๅˆทๆˆๅฐๆนพ็ณป็ปŸ\n740001 1 ๆ„Ÿ่ฐข่‡ด็”ตๆญๅทžๅˆ›ๅฑ•้—จไธš ๆœฌๅบ—ไธป่ฅ๏ผš็งปๅŠจ้—จ ๆŽจๆ‹‰้—จ็ญ‰ๅ„็ง้—จ ็”ต่ฏ๏ผšxxxxxxx ๅœฐๅ€๏ผš่ฅ„ๅทžๅŒบๅŽ...\n740002 0 ๆ‰่ƒฝๅšๅˆฐไฝŽๆŠ•่ต„้ซ˜ๆ•ˆ็›Šๅขžไบงๅขžๆ”ถ\n740003 0 ๆ–ฐๅบ—็”ป่ฎพ่ฎกๅ›พ็š„ๆ—ถๅ€™ๅฅฝๆญน้•ฟ็‚นๅฟƒๅง\n740004 0 \\๏ผ‚ไธŠๅธ\\๏ผ‚็œผไธญ่‡ชๆœ‰็ญ”ๆกˆ????ๅœจๅซๆ˜Ÿ็š„็œผไธญ\n1000 Processed\n classify content\n740500 0 ่‹ๅทž็ป•ๅŸŽS58ไธŠๆตทๅพ€่‹ๅทžๆ–นๅ‘28km่ฝฆๅŠๅ‡บๅฃไธๅˆฐๅ‘็”Ÿ็š„ๅฐ่ดง่ฝฆไพง็ฟปไบ‹ๆ•…่ฟ˜ๅœจๅค„็†ไธญ\n740501 0 ไป–็ช็„ถ็œ‹่งไธ€่พ†่ญฆ่ฝฆ่ทŸๅœจไป–ๅŽ้ข\n740502 0 ๅ‘ตๅ‘ตๆœชๆˆๅนดไบบxๅคฉxxๅนด่ฟ™ๆณ•ๅฎ˜ๆ”ถไบ†ๅคšๅฐ‘\n740503 0 900ๆฏซๅ‡็จ€ๆœ‰่ก€ๆŠต่พพๆฑŸ่‹็œไบบๆฐ‘ๅŒป้™ขicu้‡็—‡็›‘ๆŠคๅฎค\n740504 0 ๅฐฑ่ฟ™ๆ ทไฝ ้™ทๅ…ฅไบ†ไธ€ไธชๆญปๅพช็Žฏ๏ผšๅชไผš่ถŠๆฅ่ถŠ็ฉท่ถŠๆฅ่ถŠ่ƒ–ๆญคๆ—ถๆญคๅˆปไฝ ๅบ”่ฏฅๅšไธ€ไธชๆญฃ็กฎ็š„้€‰ๆ‹ฉ็ญ‰ไฝ ๆฅ้šๆ—ถๆฌข่ฟŽไฝ ็š„ๅŠ ๅ…ฅ\n1000 Processed\n classify content\n741000 0 ่–„่–„่ฝฏ่ฝฏ็š„็‰›ไป”้ขๆ–™ๆฐดๆด—็‰›ไป”ไธไผšๆމ่‰ฒๅ“ˆ\n741001 1 ๅนดๅบฆ้ฆ–ๅผ€็‰นๆƒ  ๅ‡ไปทxxxxxๅ…ƒๆฏๅนณๆ–น ็ŽฐไบŒๆœŸๅผ€็›˜ๅœจๅณ ็ซ็ˆ†ๅ…จๅŸŽ xไธ‡ๆŠตๆ‰ฃxไธ‡ๅ›ข่ดญไผ˜ๆƒ 
ๆฅๅฐฑไบซ๏ผ...\n741002 0 ๆˆ‘็š„็”ต่„‘ๅ…ˆๆ˜ฏๆฒกๅฃฐ้Ÿณใ€่ฟ™ไธคๅคฉ่‡ชๅŠจๆ›ดๆ–ฐใ€่ฟžๅฏๅŠจ้ƒฝๅฏๅŠจไธไบ†\n741003 0 2ใ€ไบŒ็ปด็ ่ฏˆ้ช—๏ผšโ€œๅ•†ๅ“ไบŒ็ปด็ โ€ไธบๆ‰‹ๆœบๆœจ้ฉฌ\n741004 0 ้‚ฃไนˆไฝ ไธ€ๅฎšๆƒณ็Ÿฅ้“ๅœจWin10ๆ‰‹ๆœบ็‰ˆ10149ไธญ\n1000 Processed\n classify content\n741500 0 ๅฟตๅœจๅŒ–่บซ+ๅ’Œๅฃฐ+ๅฎกๅˆคๅŠ›้‡่ถ…็บง็‰›้€ผๆ‰€ไปฅไธ€็›ดไธ่ˆๅพ—\n741501 0 ็–‘ไผผๆœฌยทๆ‹‰็™ปๅฎถๆ—็งไบบ้ฃžๆœบๅœจ่‹ฑๅ›ฝๅ ๆฏ\n741502 0 ่ฆ่งฃๅ†ณๅฐฑ่ตถ็ดง็ป™ๆˆ‘ๆฅ็”ต่ฏ่งฃๅ†ณไธ็„ถๅฐฑๅˆซ็ป™ๆˆ‘ๅ‘ไธ€ๆก่ฏ„่ฎบไธ€ๆกๆ‰“ๆ‰ฐๆˆ‘็‰ˆ้ช‚\n741503 0 ่€Œๆญคๆ—ถๆญฃๅ€ผไบš้ฉฌ้€Š20ๅ‘จๅนดๅบ—ๅบ†ไฟƒ้”€ๆดปๅŠจๅ‰ๅค•\n741504 0 ๆœ‰ๆ„่€…ๅพฎไฟก่”็ณปjanephuaxxxx\n1000 Processed\n classify content\n742000 0 xxๅทๅฎŒๆˆ็ฌฌไบŒ้’ˆ้˜ฒ็–ซ่ง‚ๅฏŸไธ€ๅ‘จไบคๆŽฅ\n742001 0 ๆ‹›ๅ•†ๅผ•่ต„ๅฏนไธญๅ›ฝๅˆถ้€ ็š„ไฝœ็”จๅพˆๆœ‰้™\n742002 0 ่ฎพ่ฎกๅธˆ่ฎพ่ฎกไบ†่ฟ™ๅฅ—ๅฏ็ˆฑ็š„ๅผ€ๅ…ณ\n742003 0 xxๆ˜Ÿๅบงๆœ€ๆœ‰ๆฏ…ๅŠ›็ฌฌไธ€ๅ๏ผšๅค„ๅฅณ\n742004 1 (่ฟชๅฐผ่ŽŽๅฅณ้ž‹)x.xๅฅณไบบ่Š‚ๅ›ž้ฆˆๆ—ฅ๏ผˆx.xๆ—ฅ-x.xๆ—ฅ๏ผ‰๏ผŒๆ–ฐๅ“ๆŠ˜ๅŽๅ†ๅ‡xxๅ…ƒ๏ผŒ่€ๆฌพๅฅณ้ดๆธ…ไป“ๅคงๅค„...\n1000 Processed\n classify content\n742500 0 ๆตทๅ—็œไธ‡ๅฎๅธ‚ๆฃ€ๅฏŸ้™ขๆœ‰ๅ…ณไบบๅฃซ้€้œฒ\n742501 0 ๅ› ไธบ็œŸ็›ธๅพ€ๅพ€ไธๆ˜ฏไป€ไนˆๅฅฝไธœ่ฅฟ\n742502 1 ๅ•†ๅ“่•พ็‘Ÿไธๅซ็”Ÿๅทพ้™ๆ—ถไผ˜ๆƒ ไฝ“้ชŒxๅฅ—ไป…้œ€xxxๅ…ƒ๏ผๅฎƒๆ˜ฏไธ€ๆฌพ้ซ˜็ซฏๅฅˆ็ฑณ้“ถ็š„ๅŒป็–—ๅ“็‰Œๅซ็”Ÿๅทพ๏ผŒ้‡Šๆ”พ่ดŸ็ฆปๅญ...\n742503 0 Nordstromๅ †ไบ†2000ๅคš\n742504 0 ๅŽšๅŽš็š„้˜ฒๆ™’้œœ่ฎฉ็šฎ่‚คไธ่ƒฝๅ‘ผๅธๅ ตๅกžๆฏ›ๅญ”ๅฏผ่‡ด้•ฟ็—˜\n1000 Processed\n classify content\n743000 0 ๆ— ้”กไธ€ๆฃ‰็š„้ซ˜ๆกฃ็ดงๅฏ†็บบ็ฒพๆขณๆฃ‰็บฑ็”Ÿไบง่ฝฆ้—ดใ€ๅธธๅทžๅคฉๅˆๅ…‰่ƒฝ็š„ๅ…‰ไผ็”ตๆฑ ๅŠ็ป„ไปถ็”Ÿไบง่ฝฆ้—ดใ€ๆฑŸ่‹ๅบท็ผ˜่ฏไธš็š„ไธญ...\n743001 0 ้ƒฝๅบ”่ฏฅ่ดŸ่ดฃ100๏ผ…ๆฐ‘ไบ‹่ต”ๅฟ\n743002 0 Themer้‡Œๆ‹ฅๆœ‰ไธฐๅฏŒ็š„ไธป้ข˜่ต„ๆบ\n743003 0 ๆฒกๆƒณๅˆฐๅ’Œ่ญฆๅฏŸๅฆนๅญ่ฟ˜ๆ˜ฏๆ กๅ‹๏ฝžๆŒบๆœ‰็ผ˜ๅˆ†็š„\n743004 0 ๆ–ฐ็–†ๅณๅฐ†่ฟŽๆฅ60ๅ‘จๅนดๅคงๅบ†ไธคไธป็บฟๆๅ‰ๅธƒๅฑ€\n1000 Processed\n classify content\n743500 0 ๆœฌ้—จไปŽๆ˜†ๅฑฑ่ก€ๅˆ€้˜ๅคบๅพ—้•ฟ็”ŸๅŠŸๆฎ‹็ซ ๅ››\n743501 0 ็ฌฌ70่ฎฒ๏ผšScala็•Œ้ขGUI็ผ–็จ‹ๅฎžๆˆ˜่ฏฆ่งฃ็™พๅบฆไบ‘๏ผš\n743502 0 
ๅƒๅฎŒ้ฃžๆœบ้คๅฐฑๅทฒ็ป่ถณๅคŸ็ฅžๅฅ‡ไบ†\n743503 0 ๅฐๆœ‹ๅ‹ไปฌไฝฟ็”จxxxๅ„ฟ็ซฅๅซๅฃซxไปฃๆ‰‹่กจ้กบๅˆฉๆ‰พๅˆฐๅฎ่—\n743504 0 ๅฏไปฅไธ“ๅฟƒๅคไน ไธ€ไธชๅฐๆ—ถไธคไธชๅฐๆ—ถไธ‰ไธชๅฐๆ—ถ\n1000 Processed\n classify content\n744000 0 ๆ‰€ไปฅไผšๅˆปๆ„่ฐŽๆŠฅๅญฉๅญ็š„ๅ‡บ็”Ÿๆ—ถ้—ดๅ’Œๅ‡บ็”Ÿๆ—ฅๆœŸ\n744001 0 ๅฏผๆผ”ๅผ ๆ–ฐๅปบใ€ๆผ”ๅ‘˜่จๆ—ฅๅจœใ€ๅ‚…ๆทผใ€ๅˆ˜ๅ‘ไบฌๅฐ†ไผšไธŽๅ—ไบฌๅช’ไฝ“้›ถ่ท็ฆป่ง้ข~ๆœ‰ๆœจๆœ‰ๅ–œๆฌข็š„ๅ‘€\n744002 0 ๆฑŸ่‹็œๆถˆ่ดน่€…ๅไผšๅ‘ๅธƒไบ†ๆฐดไธŠไนๅ›ญๆถˆ่ดน่ฐƒๆŸฅๆŠฅๅ‘Š\n744003 0 ๅคงๆธ…ๆ—ฉๅˆซไบบ่ฃ…ไฟฎๅค–้ขๅตๆžถ้‡Œ้ข่Šๅคฉๆ‰“้ผพ่ฎฒ็”ต่ฏ็–ฏ็‹‚ๆ•ฒๆˆฟ้—จ็š„ๅŠๆขฆๅŠ้†’ๆˆ‘่ฟ˜ๅพ—่ตทๅบŠๅŽปๅผ€้”้บป็—นๆˆ‘ๆ—ฅ\n744004 0 ๅคšๅฎถไธŠๅธ‚ๅ…ฌๅธ่ฏๅ“ไธบไธๅˆๆ ผๅ“\n1000 Processed\n classify content\n744500 0 ่ขซๆ›ๆ˜ฏๅ—ไบฌๅฒไธŠๆœ€ไพˆๅŽ็š„่ฟๅปบ\n744501 0 ไธคไน˜ๅฎขๅ› ๅบงไฝ้—ฎ้ข˜ๅœจ้ฃžๆœบไธŠๅคงๆ‰“ๅ‡บๆ‰‹\n744502 0 ไธŽๆฑฝ่ฝฆไบบๆ€ปๅŠจๅ‘˜ใ€้‡‘ๅบธๆ–ฐๆœ‰ไฝ•ๅผ‚\n744503 0 ไธŠ็ญๆ—ถๆŠ“ๅˆฐๅฐๅท็„ถๅŽ้ญๅˆฐๅจ่ƒๅ›žๆฅ็š„่ทฏไธŠๅพˆๅฎณๆ€•3\n744504 0 2ใ€ๆŠฑ็€่ฏ•่ฏ•็œ‹ๅฟƒๆ€็š„ๅพฎๅ•†\n1000 Processed\n classify content\n745000 0 ๆญฆๆฑ‰้”ฆ็ปฃ้•ฟๆฑŸๆ•™่‚ฒ้›†ๅ›ขๅ‘ๅฑ•ๆœ‰้™ๅ…ฌๅธๆ–ฐๅŠจๆ€ๅ›ฝ้™…่‹ฑ่ฏญ่ต„ๆทฑๅธ‚ๅœบๅˆ†ๆžๅธˆ\n745001 0 ๅพฎ่ฝฏๅฐ†ไธๅ‚ๅŠ TGS2015\n745002 0 ่‡ณไบŽ่Šฑๅƒ้ชจ็‹ฌๅฎถ็…ง็‰‡ๅ‡ ๆ—ถๆ”พๅ‡บ\n745003 0 ๆŸฅๅค„ๆŽXX้ฉพ้ฉถ่ฑซFxx***ๅทๆœบๅŠจ่ฝฆๅฎžๆ–ฝๆœชๆŒ‰่ง„ๅฎšไฝฟ็”จๅฎ‰ๅ…จๅธฆ็š„ๆœบๅŠจ่ฝฆ็š„่ฟๆณ•่กŒไธบ\n745004 0 xๆœˆxxๆ—ฅไบš้ฉฌ้€Š็š„ไผšๅ‘˜ๆ—ฅ็ป“ๆŸ\n1000 Processed\n classify content\n745500 0 ๆˆ‘ไปฌๅญฆๆ กๆœ‰่ฐๅœจ่‹ๅทžๅšๆš‘ๅ‡ๅทฅๆปด\n745501 0 ๅŽŸๅทฒไป–ๅทฒ็ปๆŠŠ่‡ชๅทฑๅ•†ไธšๅŒ–ไบ†็š„ๅƒงไบบ\n745502 1 ๅคง้‡ๅ‡บๅ”ฎ่ฎกๅˆ’ๅค–้ป„็“œๅซๆŽฅ่‹—๏ผŒxๆœˆxๅทๅ‡บๆฃš๏ผŒๆ ‡ๅ‡†ไธ€ๅถไธ€ๅฟƒ๏ผŒๅ“็งxx-xx๏ผŒไปทๆ ผx.xๅ…ƒ/ๆฃต๏ผŒๅœฐๅ€...\n745503 0 ่ฏด่ฏดไฝ ้ฅญไบ†่ฟ™ไนˆไน…็š„TFBOYS้‡ๅˆฐ่ฟ‡ไป€ไนˆๆ„Ÿไบบ็š„ไบ‹ๆƒ…ๆˆ–ๅ–œๆฌขไธŠไป–ไปฌๆ”นๅ˜ไบ†ไฝ ไป€ไนˆ\n745504 0 ๆฏไธชๅ…ฌไผ—ๅท1000ๅˆฐ2500ไธ็ญ‰\n1000 Processed\n classify content\n746000 1 ๅฐŠๆ•ฌ็š„้กพๅฎข๏ผšไธ–็บช้‡‘่Šฑxๆœˆxๆ—ฅ๏ผxๆœˆxๆ—ฅ๏ผŒๅ…จๅœบๆปกxxx่ฟ”xx็Žฐ้‡‘ๅˆธ๏ผŒๅนถไบซx.xๅ€็งฏๅˆ†ใ€‚ๅ››ๆฅผAC...\n746001 0 
ๅธฆไบ†่‡ชๅทฑ็š„็”ต่„‘ๆฅๅ•ไฝไปฅๅŽๆ‰ๅ‘็ŽฐๅŽŸๆฅๅ•ไฝ็š„็ฝ‘้€Ÿ่ฟ™ไนˆๅฟซโ€ฆโ€ฆๆœ‰ๅคšๅฟซๅ‘ขๆˆ‘ๆƒณๆŠŠๆˆ‘็š„็”ต่„‘ๆœฌ่บซๅฝขๅฎนๆˆไธ€ไธช่ทฏ...\n746002 0 ้€ ๆˆๅœๆ”พๅœจ่ฝฆๅบ“้‡Œ็š„็บฆ300่พ†็”ตๅŠจ่ฝฆๅ’Œๆ‘ฉๆ‰˜่ฝฆ่ขซ็ƒงๆฏ\n746003 0 17ๅฒ็š„ๅฅนๅฐฑ็™ปไธŠ็ฆๅธƒๆ–ฏๅไบบๆฆœ\n746004 0 ไปŠๅคฉๅซ็š„ๆปดๆปดๅฟซ่ฝฆๆ”พ็š„้™ˆๅ‡็š„ๆŠŠๆ‚ฒไผค็•™็ป™่‡ชๅทฑ\n1000 Processed\n classify content\n746500 0 ๅคฉ็พไบบ็ฅธ็–พ็—…้ขๅ‰ๆˆ‘ไปฌ้ƒฝๆ˜ฏๆธบๅฐ็š„ๆˆ‘ไปฌๆ— ่ƒฝไธบๅŠ›\n746501 0 ๅœจ่…พ่ฎฏๅผ€ๅผนๅน•็œ‹่ง†้ข‘ๅ‘็Žฐๅ‚ป้€ผๅฅฝๅคš\n746502 0 ไปฅไธ‹5็งๆฐดไธๅปบ่ฎฎ็”จๆฅ็ป™ๅฎๅฎๅ†ฒๅฅถ็ฒ‰\n746503 0 ๅฐฑ่ง‰ๅพ—ๅฅฝๅธ…็„ถๅŽๅˆฐๅค„่ทŸไบบ่ฏดๅ•Šๅ•Šๅ•Š่พฃไธช้˜Ÿ้•ฟๅฅฝๅธ…ๅŽๆฅ้€€ๅ‡บไบ†ๅ…ถๅฎžๆˆ‘ไนŸ่ง‰ๅพ—ๆŒบๅฅฝๆœ‰ๆ›ดๅคง็š„ๅ‘ๅฑ•็ฉบ้—ดๅ› ไธบไป–็œ‹...\n746504 0 ้Ÿฉๅ›ฝ้‚ฃ่พน็š„ranker่ฟ‡ๆฅๆ—…ๆธธ้กบไพฟๅ‚่ต›\n1000 Processed\n classify content\n747000 0 ็Žฐๅœจๆฒกไบ‹ๅ็”ต่„‘ๅ‰ไธค็œผไธ€็žช่ฟ˜็ฝ‘ไธŠๅ†ฒๆตช็š„ๅฐฑไฝ ไบ†ไฝ ไธช่‘ซ่Šฆๅจƒ\n747001 0 ๆต™ๆฑŸๆฑŸๅฑฑๅธ‚ๅ…ฌๅŠกๅ‘˜่€ƒ่ฏ•่€ƒๅฏŸๅๅ•ๅ…ฌ็คบๅ•ฆ~~่ตถๅฟซ็œ‹็œ‹ๆœ‰ๆฒกๆœ‰ไฝ ็š„ๅๅญ—ๅ“ˆ\n747002 0 ๅ‘ตๅ‘ตๅ“’~ๅ„็ง็›ด็”ท็™Œๅˆฐ็ˆ†็š„ๆณ•ๅพ‹ๅ’Œๆ”ฟ็ญ–\n747003 0 ๆตทๆŠฅๅผ ่ดด~ๅ›ฝๅ†…ๅค–ไธ“ๅฎถไปฌๆœ€ๆ–ฐๆˆๆžœ็š„ๅฑ•็คบ\n747004 0 ไปŽๆข…ๆ‘็š„ๆ— ้”กๅ›พไนฆไป“ๅบ“ๅˆฐ็ก•ๆ”พ\n1000 Processed\n classify content\n747500 0 5ไฝ้ขœๅ€ผ่ถ…้ซ˜็š„ๅฐๆœ‹ๅ‹่‡ช500ๅๅ‚่ต›้€‰ๆ‰‹ไธญ่„ฑ้ข–่€Œๅ‡บ\n747501 0 ๅ…ฑไธบxxx่พ†่ฝฆๅŠž็†ไบ†่ฝฆ่บซๅนฟๅ‘Š็™ป่ฎฐ\n747502 0 ๆœ็‹่ฏๅˆธ้ข‘้“ไธป่ง‚ไธŠๆœ‰็œ‹็ฉบใ€ๅš็ฉบๅธ‚ๅœบ็š„ๅ€พๅ‘\n747503 0 ไปฅ็„•ๆ–ฐ็š„่ฎพ่ฎกๅ’Œ้…็ฝฎ็ปง็ปญๅทฉๅ›บๅ…ถ้ข†่ข–ๅœฐไฝ\n747504 0 ไธ่ถ…่ฟ‡6000็š„ไบบๅฃไปฅๆธ”ไธšไธบ็”Ÿ\n1000 Processed\n classify content\n748000 0 ่ฟ™้‡Œ็š„้—ฎ้ข˜ๆ˜ฏ๏ผšAstuteไฝœไธบๆ”ปๅ‡ปๆ ธๆฝœ่‰‡\n748001 0 ็™พๅบฆๅญฆๆœฏๆฏ”่ฐทๆญŒๅญฆๆœฏ้ƒฝๅผบๅคงไบ†\n748002 0 ๆ†จๅŽš่€ๅฎž่ƒฝๅƒ่‹ฆ็š„ๅปบ็ญ‘ๅทฅไบบ\n748003 0 ้“ถ่กŒ99ๅนดๆ—ถๅทฒ็ปๆŠ€ๆœฏๆ€ง็ ดไบง่ฟ‡ไบ†\n748004 0 ๆญๅทž็ ด่Žท็‰นๅคง็›—็ชƒๅ›ขไผ™่ญฆๆ–นๆ้†’่ฐจ้˜ฒๅคœ็›—\n1000 Processed\n classify content\n748500 0 445ๅพ˜ๅพŠโ€ฆ่ฏฆๆ–‡่ฏทClickไธŠ\n748501 1 
ๆ˜ฏไผไธš็š„ไปชๅ™จ็พŽๅฎน่Š‚๏ผŒไธบ็ญ”่ฐขๆ–ฐ่€้กพๅฎข็‰นๅˆซๅฅ‰็Œฎไปชๅ™จไฝ“้ชŒxxๅ…ƒ็ง’ๆ€ๆดปๅŠจ๏ผŒๆฏไฝไผšๅ‘˜ๅฏไปฅ็ง’ๆ€ๆ‚จๆœ€ๆƒณไฝ“้ชŒ...\n748502 0 ๆฑŸ่‹็š„ไบ‹ๆƒ…ๅ‘็”Ÿๅœจ่พฝๅฎๅˆๆ€Žๆ ท\n748503 1 ้พ™ๆน–ๅšๅฅฝๆ ทๆฟๅทฅๅœฐไพ›ๆ‚จๅ‚่ง‚๏ผไปทๆ ผๆœ€ไฝŽไผ˜ๆƒ xx๏ผ…๏ผŒๅƒไธ‡ไธ่ฆ้”™่ฟ‡่ฟ™ไธชๅฅฝๆœบไผš๏ผ่ฃ…ไฟฎ่ฆ็Žฏไฟ๏ผŒๅœจๆญๅทž้ฆ–้€‰...\n748504 0 ๆฎ่ฏด่‹่ฟช็ฝ—ๆ˜Žๅคฉๅฐฑๆ‹œ่ฎฟ่‹ๅทžไบ†o&gt\n1000 Processed\n classify content\n749000 0 ็™พๅบฆๆœๅˆฐ็š„ๅ‡ ๅผ ๅ–œๆฌข็š„็‰›่ก—ๅผ€ๆ–‹่Š‚็…ง็‰‡\n749001 0 ้›†ๅ›ข้™คไบ†ๅœจๆœฌๆธฏๆ‹ฅๆœ‰็จณๅ›บ็š„ไธšๅŠกๅŸบ็ก€ๅค–\n749002 0 ็”ฑๆฑŸ่‹็œ่ดจ็›‘ๅฑ€ใ€ไบบ็คพๅŽ…ใ€ๆ€ปๅทฅไผšๅ’Œๅ›ข็œๅง”ไธปๅŠž็š„ๆฑŸ่‹็œ้ฆ–ๅฑŠ็”ตๆขฏๅฎ‰่ฃ…็ปดไฟฎๅทฅๆŠ€่ƒฝ็ซž่ต›\n749003 0 ๅ…ณ็”ต่„‘ๅ…ณ็”ต่„‘ไธ€ๅˆทๅพฎๅš่„‘ๅญไธ€็ƒญๅฐฑไธๆƒณ่ƒŒๅ•่ฏไบ†\n749004 0 ๅพฎ่ฝฏๅ†ๆฌก็กฎ่ฎคไบ†Win10ๅฐ†ๅœจ7ๆœˆ29ๆ—ฅๅ…่ดนๆไพ›็ป™ๆญฃ็‰ˆWin7ๅ’Œ8\n1000 Processed\n classify content\n749500 0 ็ตๅฑฑๅŽฟไบบๆฐ‘ๆณ•้™ข่กŒๆ”ฟๅบญ็š„ๆณ•ๅฎ˜ๅ’Œไนฆ่ฎฐๅ‘˜ใ€ๆณ•่ญฆ็ป„ๆˆ็š„ๅ‹˜้ชŒๅฐ็ป„ๅ‰ๅพ€็ตๅฑฑๅŽฟๆŸ้•‡ไธ€ๅค„ๅทฒ่ขซๅผบๅˆถๆ‹†้™ค็š„ๅปบ็ญ‘ๅทฅ...\n749501 0 ็”ฑ1ไธชๆ ‘ๆกฉๅ’Œ70ๅผ ๅœ†ๅœ†็š„ๅนด่ฝฎไพฟ็ญพ็บธ็ป„ๆˆ\n749502 0 ่ฐˆๅ‚ๅŠ ๅฆๅ…‹ไธค้กน่ต›็š„96A1ๅž‹ๅฆๅ…‹ใ€Žไฟž็ก•๏ผšไธญๅ›ฝๅฆๅ…‹ๆ˜ฏๅŽปไฟ„็ฝ—ๆ–ฏ\"็ ธๅœบๅญ\"็š„ๅ—\n749503 0 ๅคงๅฎถไธ่ฆๅœจๆŠฑ็€ๅฐ้‡‘ๅบ“ๅพ€้‡Œๆ€ไบ†็œŸๆƒณ็Žฉ็ญ‰ๅคง่ทŒไธคไธ‰ๅคฉๅŽๅ…ฅๅœบๅšไธ€ๆณขๆœ€็จณๅฅ\n749504 0 ่ฏฆๆƒ…ๅŠ ๆˆ‘QQxxxxxxxxxx่ฏฆ็ป†ไบ†่งฃๅ’จ่ฏข\n1000 Processed\n classify content\n750000 0 ไปŽๅ…‰้ฒœ็š„VC่Œไธš้‡ๆ–ฐๆŠ•ๅ…ฅๅˆฐๅˆ›ไธšไผไธšไธญ\n750001 0 zf็š„ไธไฝœไธบ่ฆ่ฎฉ่€็™พๅง“ๆฅๆ‰ฟๆ‹…็ป“ๆžœ\n750002 0 ไธ‹ๅˆๆŠŠๅฎถ้‡Œ็š„ๅฐๅผ็”ต่„‘xp็ณป็ปŸๅ‡็บงไบ†ไธ€ไธ‹\n750003 0 ๆทฑๅœณๅธ‚ๆฐ‘็Ž‹ๅ…ˆ็”ŸๆŠฅๆ–™็งฐๅ…ถๅ•ไฝ้™„่ฟ‘็š„ไธ€็ง‘ๆŠ€ๅ…ฌๅธๅฐ†ๅ่พ†ๅฅ”้ฉฐ่ฝฟ่ฝฆๅฝ“ไฝœๅนด็ปˆๅฅ–้€็ป™ไผ˜็ง€ๅ‘˜ๅทฅ\n750004 0 ๆˆ‘็œŸ็š„ไธๆƒณ่ฏดไป€ไนˆไบ†ๅ…‰ๆ˜ฏ็œ‹ๅฐฑ็œ‹ไบ†ๅฅฝๅ‡ ้็ปๅ…ธ็š„ไธ่ƒฝๅ†็ปๅ…ธ\n1000 Processed\n classify content\n750500 0 ไฝ†ๅดๆฒกๆœ‰็ป™Googleๅธฆๆฅ่ถณๅคŸๅคš็š„โ€œๅˆฉ็›Šโ€\n750501 0 ้Ÿฉๅ›ฝSINILPHARM็–ฒๅŠณ่ดด\n750502 0 ็œ‹็€ๅŒป้™ข้‡Œ็ฉฟ็€็—…ๅทๆœ็š„็—…ไบบ\n750503 0 
ๅœจๅ—ไบฌๅ…ญๅˆไธคๅฑฑ้—ด็š„ๆดผๅœฐไธญๅธƒไธ‹ไธ€ๅฅ—ไป–็ฒพๅฟƒ่ฎพ็š„้˜ตๆณ•\n750504 0 ไธบไป€ไนˆๅœจไผŠๅŒ็š„COSๅŠจ็”ป้‡Œ่ฟ˜ๆœ‰ๅˆทโ€œไป–ไปฌๆ˜ฏๅฎ‰ไธœๅฎถ็š„ไบบโ€่ฟ™ๆ ท็š„KYๅญ˜ๅœจโ€ฆโ€ฆ็œŸ็š„ๅฅฝไธ็†่งฃโ€ฆโ€ฆๅๆญฃ็ปŸ...\n1000 Processed\n classify content\n751000 0 ๅˆซๅข…ๅœจโ€˜้˜ด้˜ณโ€™็š„ๅŸบ็ก€ไธŠ็ ”็ฉถๅฏน็ซ‹\n751001 0 ๅ›ฝไบงๅคง้ฃžๆœบC919้ฆ–ๅฐๅ‘ๅŠจๆœบไบคไป˜้ฆ–ๆžถๆœบๅนดๅ†…ไธ‹็บฟ\n751002 1 ๅฎถ้•ฟๆ‚จๅฅฝ๏ผŒๆˆ‘ไปฌๆ˜ฏ้ธฟ่’™ๆ•™่‚ฒๆ€็ปด็ป˜็”ปๅ„ฟ็ซฅ่„‘ๆฝœ่ƒฝๅผ€ๅ‘๏ผŒๆœฌๅ‘จๅ…ญไธŠๅˆๅ็‚นๅ…่ดนๅ…ฌๅผ€่ฏพๆฌข่ฟŽๅธฆๅฎๅฎ่ฏ•ๅฌ๏ผๅœฐๅ€...\n751003 0 ไธคๅ‘้•ฟๅพไธ‰ๅทไน™่ฟ่ฝฝ็ซ็ฎญๆญไน˜ๅŒไธ€ไธ“ๅˆ—ไปŽๅŒ—ไบฌๅฏ็จ‹ใ€ๅฅ”่ตด่ฅฟๆ˜Œๅซๆ˜Ÿๅ‘ๅฐ„ไธญๅฟƒๆ‰ง่กŒๅซๆ˜Ÿๅ‘ๅฐ„ไปปๅŠก\n751004 0 ๅทฒๆœ‰ๆธ…ๅŽๅคงๅญฆๆ™บ่ƒฝๆœบๅ™จไบบ็ ”ๅ‘็”ŸไบงๅŸบๅœฐ็ญ‰xไธช็”ฑไบฌๆดฅ้ซ˜ๆ กใ€ไผไธšๆŠ•่ต„็š„ๆœบๅ™จไบบ้กน็›ฎ\n1000 Processed\n classify content\n751500 0 ๅญธๆ‡‚ไบ†FIRAC้€™ๅ€‹็ฌฆๅˆๆณ•ๅพ‹็†ๆ“š็š„็จ‹ๅบๅŽปๅ ฑๆกˆ\n751501 0 ไธบไป€ไนˆ็™พๅบฆ็š„ๆฑ‡็އๅ’Œๅ›ฝๅค–ๆ›ดๆ–ฐ็š„ๆฑ‡็އๆœ‰ๅทฎๅผ‚\n751502 0 ไธŠๆตทๅฎๅŽŸ็‰ฉไธš้กพ้—ฎๆœ‰้™ๅ…ฌๅธไธบไบ†ๆ‹›่˜ไบบๆ‰ไธ“้—จๅ‡†ๅค‡ไบ†่ฟ™ไธช้กต้ข\n751503 0 ๆˆ‘ๆญฃๅœจ็œ‹ๆˆ‘ๅˆ›ๅปบไบ†ไธ€ไธช่‹ๅฑฑ็‚ฎ้‡‘็Ÿฟ\n751504 0 ๆ‹ฑๅข…ๅŒบ่‡ชๅทฑ็š„โ€œๅ“†ๅ•ฆAๆขฆโ€ใ€ๅŒบๆฃ€ๅฏŸ้™ขๆœชๆฃ€โ€œ่“่ƒ–็บธโ€ไปŠๅนดๅ…ญไธ€ไนŸ่ฏž็”Ÿไบ†\n1000 Processed\n classify content\n752000 0 ไปŽๅฐ่™Žๅ›ข้˜ŸๅˆฐYtigerๅ†ๅˆฐๆˆ็ซ‹ๆœๆถฆๅ…ฌๅธ\n752001 0 ็ฌฌไบŒๅคง็†็”ฑ๏ผšๅคšๅฝฉ็”Ÿๆดป\n752002 0 ๆˆ‘็š„้Ÿณไนๆ–ฐ็œ‹ๆณ•ๅฐฑๅœจ่…พ่ฎฏ่ง†้ข‘LiveMusic\n752003 0 4ใ€ๅฐ่ถ…ๅธ‚ๅฑก้ญโ€œๅค–ๅ›ฝ้ป‘ๆ‰‹โ€็›—็ชƒ\n752004 0 ๆˆ‘้ข†ๅœŸๆ”นๅผ€ไปฅๆฅ่ขซไพตๅ ๅคšๅฐ‘้ข†ๅœŸ\n1000 Processed\n classify content\n752500 0 ๅนถ้€š่ฟ‡ๆ‰‹ๆœบAppๆฅๅฏๅŠจSOSๆจกๅผ\n752501 0 ๅฅฝๅ…ฌๅธๅฅฝ่‚ก็ฅจไผšๆ…ขๆ…ข่ตฐ็‹ฌ็ซ‹่กŒๆƒ…\n752502 0 ่ฏƒ้ขไผฆๆดพๅ‡บๆ‰€ๅ…ฑ็ป„็ป‡ไธƒๆ‰นๅนณๅฎ‰ๅฟ—ๆ„ฟ่€…1000ไฝ™ไบบ่ฟ›่กŒๆถ‰ๆๆถ‰ๆšด\n752503 0 ๅฅฝๅฃฐ้Ÿณโ€œไธ€่ตทๆ‘‡ๆ‘†โ€ๅ“ˆๆž—ๅ”ฑๅ‡บไบ†ไป–ๆ‰ๆ˜ฏๅŽŸๅ”ฑ็š„ๆ„Ÿ่ง‰\n752504 1 xxxxxxxxxxxxxxxxxxx่’‹ไธญๅซๅ†œ่กŒๅกใ€‚\n1000 Processed\n classify content\n753000 1 ไธ‰ๆœˆไฟƒ้”€โ€”โ€”ๅนปๆดปๆ–ฐ็”Ÿ ๆ—ถๅ…‰็คผ้‡ ไธ€ใ€ๅ•ๅผ 
่ฎขๅ•่ดญไนฐไปปๆ„ๅนปๆ—ถๆˆ–ๅนปๆ—ถไฝณไบงๅ“ๆฏๆปกxxxxๅ…ƒ๏ผŒๅณๅฏๅ…่ดน...\n753001 0 ๅ…ณ้—ญๅฐ็…ค็Ÿฟใ€ๆŸฅๅค„ๆฒนๆฐ”็ฎก้“ใ€้‡ๅคงไบ‹ๆ•…่ฐƒๆŸฅโ€”โ€”ๅฎ‰ๅ…จ็›‘็ฎกๆ€ปๅฑ€ๅ›žๅบ”็ƒญ็‚น้—ฎ้ข˜\n753002 0 ๅฏนไบŽ็Ž‹ไธนไธน็Žฐๅœจ็š„่ต„ไบง้—ฎ้ข˜ๆˆ‘่ง‰ๅพ—่ญฆๆ–นๆœ‰ๅฟ…่ฆๅŽป่ฐƒๆŸฅไธ€ไธ‹ไบ†่ฟ˜ๆœ‰ๅฅน็Ÿฅ้“้ซ˜ๅ˜‰ๆณฝ็ˆถไบฒ้•ฟ็›ธ็š„้—ฎ้ข˜ๆ€ไบบๅฟๅ‘ฝๆญปๅˆ‘\n753003 0 ๅฅณๆ˜Ÿๆฝœ่ง„ๅˆ™๏ผšๅƒไธ‡ไธ่ƒฝๅ’Œ่Œƒๅ†ฐๅ†ฐๅˆ็…ง\n753004 0 G15wๅธธๅฐ้ซ˜้€Ÿ่‹ๅทžๆฎต็”ฑไบŽๆ–ฝๅทฅ\n1000 Processed\n classify content\n753500 0 ็™ฝๅญ็”ปๅทฒ็ป้€‰ๆ‹ฉไบ†้ขๅฏน่Šฑๅƒ้ชจ\n753501 0 ๆˆ‘่ตฐ้่šŒๅŸ ๅธ‚ใ€ๅฎฟ่ฟๅธ‚็š„4ไธชๅŒบๅŸŸ\n753502 0 xxxxๅนดไธญๅ›ฝๅค–่ดธxxxๅผบๅŸŽๅธ‚ๆŽ’ๅๅฆ‚ๅ›พ\n753503 0 ไธญๅ…ฑ้š็ž’ไบ†ๅ—ไบฌๅคงๅฑ ๆ€้•ฟ่พพๅ‡ ๅๅนด\n753504 0 ๅŽ้ข็š„็‚ฎๅ…ตใ€ๅฆๅ…‹ๅ…ตใ€ไพฆๅฏŸๅ…ต็ฏ‡ไธๆƒณ็œ‹ไบ†ๆ€ŽไนˆๅŠž\n1000 Processed\n classify content\n754000 0 ไธ้กพไธ€ๅˆ‡็”จ่ฟ˜ๆฒกๆœ‰ๆ‰“็ฃจๆˆ็†Ÿ็š„ๆ–ฐๆŠ€ๆœฏ\n754001 0 ๆต™ๆฑŸๅซ่ง†ๅไบŒ้“้”‹ๅ‘ณๅœจ้ป„ๅฑฑๅš่‡ญ้ณœ้ฑผๅ‘ข\n754002 0 ๅนฟไธœไปฅ142ๅฎถๆŽ’ๅœจ็ฌฌไบŒโ€ฆๅŒ—ไบฌไปฅ83ไฝๅฑ…็ฌฌไบ”\n754003 1 ่ฟ›ใ€xxxxxใ€CxMใ€‘ไฝฐ/ๅŠ /็ƒญใ€‚ไฝ“-่‚ฒ๏ผ็ซžๅ™ป็ญ‰็ญ‰โ€ฆโ€ฆ้กถ็บงไฟก่ช‰๏ผๅคšๅนณๅฐ็Žฉๆณ•ใ€็ซ‹ๅณไธŠๅคฉๆˆ*ๅ›ฝ...\n754004 0 ๅทๅดŽ็Žซ็‘ฐไนŸๅ’Œๆฌง็พŽ็Žซ็‘ฐ็š„็›ธไผผๅบฆๅพˆ้ซ˜\n1000 Processed\n classify content\n754500 0 ไธ็”จ่ดจ็–‘็”จไบ†ๅฐฑไผšๆ„Ÿๆฟ€่‡ชๅทฑๅฝ“ๅˆ็š„้€‰ๆ‹ฉ\n754501 0 ๅ‹้˜ฟ่‚กไปฝๆŽง่‚ก่‚กไธœๆ‹Ÿๆ–ฅ่ต„5000ไธ‡ๅ…ƒๅขžๆŒ\n754502 0 ๆˆ‘ๅธฆ็€ไฝ /ไฝ ๅธฆ็€ๆ—บๆ—บ็”ต่„‘ไนŸๅฅฝๆ‰‹ๆœบไนŸ็ฝขๆจช็ฉฟๆŽฅๅพ…ๅคงๅŽ…ๆšด่ตฐๅšๅ•ๆˆฟ้—ด่ฎฉๆˆ‘ไปฌๅšไธ€ไธช่ฏดๅนฒๅฐฑๅนฒ็š„ๅˆทๆ‰‹\n754503 0 ๆƒณ่ตทever17ๅ“ชๅ„ฟ้ƒฝ่ƒฝ็œ‹ๅˆฐ็š„ไธ€ๅฅ่ฏ๏ผšไธ่ฆ่ฟ‡ๅบฆๆŽจ็†\n754504 1 ๆ˜ฅๅŸŽไบบไบบๅ“ๆตท้ฒœ๏ผŒๅ–œๆฐ”ๆด‹ๆด‹่ดบๆ–ฐๅนดใ€‚ๅณๆ—ฅ่ตทๆๅ‰้ข„ๅฎšๅนดๅคœ้ฅญๅฏ่Žท่ต ๅ…‘ๅธ๏ผŒๅƒ็ง็คผ็‰ฉไปปๆ‚จ้€‰๏ผ่ฏฆ่ฏขๆœฌๆœบใ€‚็พŽ...\n1000 Processed\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code" ] ]
d06207fa0f559b3ebc86802bd9160f20e280b273
88,828
ipynb
Jupyter Notebook
Models/CNN_best.ipynb
DataMas/Deep-Learning-Image-Classification
88916064a041ba1e8f3b0c397226710c1c9058fa
[ "MIT" ]
1
2021-04-14T18:50:50.000Z
2021-04-14T18:50:50.000Z
Models/CNN_best.ipynb
DataMas/Deep-Learning-Image-Classification
88916064a041ba1e8f3b0c397226710c1c9058fa
[ "MIT" ]
null
null
null
Models/CNN_best.ipynb
DataMas/Deep-Learning-Image-Classification
88916064a041ba1e8f3b0c397226710c1c9058fa
[ "MIT" ]
null
null
null
115.812256
24,292
0.82957
[ [ [ "# Mount google drive to colab", "_____no_output_____" ] ], [ [ "from google.colab import drive\ndrive.mount(\"/content/drive\")", "Mounted at /content/drive\n" ] ], [ [ "# Import libraries", "_____no_output_____" ] ], [ [ "import os\nimport random \nimport numpy as np\nimport shutil\nimport time \nfrom PIL import Image, ImageOps\nimport cv2\nimport pandas as pd\nimport math\n\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nsns.set_style('darkgrid')\n\nimport tensorflow as tf\n\nfrom keras import models\nfrom keras import layers\nfrom keras import optimizers\nfrom keras.callbacks import EarlyStopping\nfrom keras.callbacks import ModelCheckpoint\nfrom keras.callbacks import LearningRateScheduler\nfrom keras.utils import np_utils\n\n\nfrom sklearn.metrics import confusion_matrix, classification_report\nfrom sklearn.preprocessing import LabelBinarizer\nfrom sklearn.preprocessing import MinMaxScaler\nfrom keras.preprocessing.image import ImageDataGenerator\n\nfrom keras import models, layers, optimizers\nfrom keras.callbacks import ModelCheckpoint\nfrom keras import losses", "_____no_output_____" ] ], [ [ "# Initialize basic working directories", "_____no_output_____" ] ], [ [ "directory = \"drive/MyDrive/Datasets/Sign digits/Dataset\"\ntrainDir = \"train\"\ntestDir = \"test\"\nos.chdir(directory)", "_____no_output_____" ] ], [ [ "# Augmented dataframes", "_____no_output_____" ] ], [ [ "augDir = \"augmented/\"\nclassNames_train = os.listdir(augDir+'train/')\nclassNames_test = os.listdir(augDir+'test/')\n\n\nclasses_train = []\ndata_train = []\npaths_train = []\n\nclasses_test = []\ndata_test = []\npaths_test = []\n\nclasses_val = []\ndata_val = []\npaths_val = []\n\nfor className in range(0,10):\n temp_train = os.listdir(augDir+'train/'+str(className))\n temp_test = os.listdir(augDir+'test/'+str(className))\n\n for dataFile in temp_train:\n path_train = augDir+'train/'+str(className)+'/'+dataFile\n\n paths_train.append(path_train)\n classes_train 
.append(str(className))\n \n testSize = [i for i in range(math.floor(len(temp_test)/2),len(temp_test))]\n valSize = [i for i in range(0,math.floor(len(temp_test)/2))]\n for dataFile in testSize:\n path_test = augDir+'test/'+str(className)+'/'+temp_test[dataFile]\n\n paths_test.append(path_test)\n classes_test .append(str(className))\n\n for dataFile in valSize:\n path_val = augDir+'test/'+str(className)+'/'+temp_test[dataFile]\n\n paths_val.append(path_val)\n classes_val .append(str(className))\n\n \naugTrain_df = pd.DataFrame({'fileNames': paths_train, 'labels': classes_train})\naugTest_df = pd.DataFrame({'fileNames': paths_test, 'labels': classes_test})\naugVal_df = pd.DataFrame({'fileNames': paths_val, 'labels': classes_val})", "_____no_output_____" ], [ "augTrain_df.head(10)", "_____no_output_____" ], [ "augTrain_df['labels'].hist(figsize=(10,5))\naugTest_df['labels'].hist(figsize=(10,5))", "_____no_output_____" ], [ "augTest_df['labels'].hist(figsize=(10,5))\naugVal_df['labels'].hist(figsize=(10,5))", "_____no_output_____" ], [ "augTrainX=[]\naugTrainY=[]\naugTestX=[]\naugTestY=[]\naugValX=[]\naugValY=[]\n\niter = -1\n\n#read images from train set\nfor path in augTrain_df['fileNames']:\n iter = iter + 1\n #image = np.array((Image.open(path)))\n image = cv2.imread(path)\n augTrainX.append(image)\n label = augTrain_df['labels'][iter]\n augTrainY.append(label)\n\niter = -1\n\nfor path in augTest_df['fileNames']:\n iter = iter + 1\n #image = np.array((Image.open(path)))\n image = cv2.imread(path)\n augTestX.append(image)\n augTestY.append(augTest_df['labels'][iter])\n\niter = -1\n\nfor path in augVal_df['fileNames']:\n iter = iter + 1\n #image = np.array((Image.open(path)))\n image = cv2.imread(path)\n augValX.append(image)\n augValY.append(augVal_df['labels'][iter])\n\naugTrainX = np.array(augTrainX)\naugTestX = np.array(augTestX)\naugValX = np.array(augValX)\n\n \naugTrainX = augTrainX / 255\naugTestX = augTestX / 255\naugValX = augValX / 255\n# OneHot Encode 
the Output\naugTrainY = np_utils.to_categorical(augTrainY, 10)\naugTestY = np_utils.to_categorical(augTestY, 10)\naugValY = np_utils.to_categorical(augValY, 10)", "/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:37: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray\n/usr/local/lib/python3.7/dist-packages/ipykernel_launcher.py:39: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray\n" ], [ "train_datagen = ImageDataGenerator(rescale=1./255)\nvalidation_datagen = ImageDataGenerator(rescale=1./255)\ntest_datagen = ImageDataGenerator(rescale=1./255)\n\ntrain_generator = train_datagen.flow_from_dataframe(dataframe=augTrain_df,\n x_col=\"fileNames\",\n y_col=\"labels\",\n batch_size=16,\n class_mode=\"categorical\",\n color_mode=\"grayscale\",\n target_size=(100,100),\n shuffle=True)\n\nvalidation_generator = validation_datagen.flow_from_dataframe(dataframe=augVal_df,\n x_col=\"fileNames\",\n y_col=\"labels\",\n batch_size=16,\n class_mode=\"categorical\",\n color_mode=\"grayscale\",\n target_size=(100,100),\n shuffle=True)\n\ntest_generator = test_datagen.flow_from_dataframe(dataframe=augTest_df,\n x_col=\"fileNames\",\n y_col=\"labels\",\n batch_size=16,\n class_mode=\"categorical\",\n color_mode=\"grayscale\",\n target_size=(100,100),\n shuffle=True)", "Found 3124 validated image filenames belonging to 10 classes.\nFound 252 validated image filenames belonging to 10 classes.\nFound 252 validated image filenames belonging to 10 classes.\n" ], [ "model_best = models.Sequential()\n\nmodel_best.add(layers.Conv2D(64, (3,3), 
input_shape=(100, 100,1), padding='same', activation='relu'))\nmodel_best.add(layers.BatchNormalization(momentum=0.1))\nmodel_best.add(layers.MaxPooling2D(pool_size=(2,2)))\nmodel_best.add(layers.Conv2D(32, (3,3), padding='same', activation='relu'))\nmodel_best.add(layers.BatchNormalization(momentum=0.1))\nmodel_best.add(layers.MaxPooling2D(pool_size=(2,2)))\nmodel_best.add(layers.Conv2D(16, (3,3), padding='same', activation='relu'))\nmodel_best.add(layers.BatchNormalization(momentum=0.1))\nmodel_best.add(layers.MaxPooling2D(pool_size=(2,2)))\nmodel_best.add(layers.Flatten())\nmodel_best.add(layers.Dense(128, activation='relu'))\nmodel_best.add(layers.Dropout(0.2))\nmodel_best.add(layers.Dense(10, activation='softmax'))\n\nmodel_best.summary()", "Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv2d (Conv2D) (None, 100, 100, 64) 640 \n_________________________________________________________________\nbatch_normalization (BatchNo (None, 100, 100, 64) 256 \n_________________________________________________________________\nmax_pooling2d (MaxPooling2D) (None, 50, 50, 64) 0 \n_________________________________________________________________\nconv2d_1 (Conv2D) (None, 50, 50, 32) 18464 \n_________________________________________________________________\nbatch_normalization_1 (Batch (None, 50, 50, 32) 128 \n_________________________________________________________________\nmax_pooling2d_1 (MaxPooling2 (None, 25, 25, 32) 0 \n_________________________________________________________________\nconv2d_2 (Conv2D) (None, 25, 25, 16) 4624 \n_________________________________________________________________\nbatch_normalization_2 (Batch (None, 25, 25, 16) 64 \n_________________________________________________________________\nmax_pooling2d_2 (MaxPooling2 (None, 12, 12, 16) 0 
\n_________________________________________________________________\nflatten (Flatten) (None, 2304) 0 \n_________________________________________________________________\ndense (Dense) (None, 128) 295040 \n_________________________________________________________________\ndropout (Dropout) (None, 128) 0 \n_________________________________________________________________\ndense_1 (Dense) (None, 10) 1290 \n=================================================================\nTotal params: 320,506\nTrainable params: 320,282\nNon-trainable params: 224\n_________________________________________________________________\n" ], [ "print(\"[INFO] Model is training...\")\ntime1 = time.time() # to measure time taken\n# Compile the model\nmodel_best.compile(loss='categorical_crossentropy',\n optimizer=optimizers.Adam(learning_rate=1e-3),\n metrics=['acc'])\n\nhistory_best = model_best.fit(\n train_generator,\n steps_per_epoch=train_generator.samples/train_generator.batch_size ,\n epochs=20,\n validation_data=validation_generator,\n validation_steps=validation_generator.samples/validation_generator.batch_size,\n)\nprint('Time taken: {:.1f} seconds'.format(time.time() - time1)) # to measure time taken\nprint(\"[INFO] Model is trained.\")", "[INFO] Model is training...\nEpoch 1/20\n195/195 [==============================] - 87s 443ms/step - loss: 1.5492 - acc: 0.5419 - val_loss: 0.4457 - val_acc: 0.8373\nEpoch 2/20\n195/195 [==============================] - 86s 439ms/step - loss: 0.3078 - acc: 0.8933 - val_loss: 0.2915 - val_acc: 0.9087\nEpoch 3/20\n195/195 [==============================] - 85s 436ms/step - loss: 0.1132 - acc: 0.9614 - val_loss: 0.3068 - val_acc: 0.8968\nEpoch 4/20\n195/195 [==============================] - 86s 440ms/step - loss: 0.0780 - acc: 0.9808 - val_loss: 0.2856 - val_acc: 0.9246\nEpoch 5/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0408 - acc: 0.9869 - val_loss: 0.2254 - val_acc: 0.9444\nEpoch 6/20\n195/195 
[==============================] - 86s 439ms/step - loss: 0.0308 - acc: 0.9909 - val_loss: 0.3072 - val_acc: 0.9286\nEpoch 7/20\n195/195 [==============================] - 85s 437ms/step - loss: 0.0409 - acc: 0.9857 - val_loss: 0.2902 - val_acc: 0.9246\nEpoch 8/20\n195/195 [==============================] - 85s 437ms/step - loss: 0.0526 - acc: 0.9827 - val_loss: 0.3072 - val_acc: 0.9206\nEpoch 9/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0241 - acc: 0.9896 - val_loss: 0.3179 - val_acc: 0.9127\nEpoch 10/20\n195/195 [==============================] - 85s 436ms/step - loss: 0.0223 - acc: 0.9945 - val_loss: 0.2930 - val_acc: 0.9405\nEpoch 11/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0151 - acc: 0.9952 - val_loss: 0.2063 - val_acc: 0.9444\nEpoch 12/20\n195/195 [==============================] - 86s 439ms/step - loss: 0.0185 - acc: 0.9940 - val_loss: 0.2123 - val_acc: 0.9563\nEpoch 13/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0434 - acc: 0.9863 - val_loss: 0.3235 - val_acc: 0.9484\nEpoch 14/20\n195/195 [==============================] - 85s 438ms/step - loss: 0.0478 - acc: 0.9856 - val_loss: 0.3105 - val_acc: 0.9365\nEpoch 15/20\n195/195 [==============================] - 86s 440ms/step - loss: 0.0110 - acc: 0.9966 - val_loss: 0.2986 - val_acc: 0.9405\nEpoch 16/20\n195/195 [==============================] - 86s 440ms/step - loss: 0.0169 - acc: 0.9932 - val_loss: 0.4730 - val_acc: 0.9286\nEpoch 17/20\n195/195 [==============================] - 85s 436ms/step - loss: 0.0693 - acc: 0.9743 - val_loss: 0.2832 - val_acc: 0.9405\nEpoch 18/20\n195/195 [==============================] - 85s 437ms/step - loss: 0.0265 - acc: 0.9925 - val_loss: 0.2911 - val_acc: 0.9365\nEpoch 19/20\n195/195 [==============================] - 86s 438ms/step - loss: 0.0233 - acc: 0.9920 - val_loss: 0.2732 - val_acc: 0.9524\nEpoch 20/20\n195/195 [==============================] - 85s 435ms/step - loss: 0.0167 - 
acc: 0.9940 - val_loss: 0.2515 - val_acc: 0.9603\nTime taken: 1713.1 seconds\n[INFO] Model is trained.\n" ], [ "score = model_best.evaluate(test_generator)\n\nprint('===Testing loss and accuracy===')\nprint('Test loss: ', score[0])\nprint('Test accuracy: ', score[1])", "16/16 [==============================] - 2s 111ms/step - loss: 0.4789 - acc: 0.9405\n===Testing loss and accuracy===\nTest loss: 0.47893190383911133\nTest accuracy: 0.9404761791229248\n" ], [ "import matplotlib.pyplot as plot\nplot.plot(history_best.history['acc'])\nplot.plot(history_best.history['val_acc'])\nplot.title('Model accuracy')\nplot.ylabel('Accuracy')\nplot.xlabel('Epoch')\nplot.legend(['Train', 'Vall'], loc='upper left')\nplot.show()\n\nplot.plot(history_best.history['loss'])\nplot.plot(history_best.history['val_loss'])\nplot.title('Model loss')\nplot.ylabel('Loss')\nplot.xlabel('Epoch')\nplot.legend(['Train', 'Vall'], loc='upper left')\nplot.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0621f6db1f8301e02bf44fe950e88798ea5aeb7
538,383
ipynb
Jupyter Notebook
HW_exam/.ipynb_checkpoints/Exam_Prazdnichnykh-checkpoint.ipynb
AntonPrazdnichnykh/HSE.optimization
ca844bb041c614e0de95cab6de87db340323e59d
[ "Apache-2.0" ]
null
null
null
HW_exam/.ipynb_checkpoints/Exam_Prazdnichnykh-checkpoint.ipynb
AntonPrazdnichnykh/HSE.optimization
ca844bb041c614e0de95cab6de87db340323e59d
[ "Apache-2.0" ]
null
null
null
HW_exam/.ipynb_checkpoints/Exam_Prazdnichnykh-checkpoint.ipynb
AntonPrazdnichnykh/HSE.optimization
ca844bb041c614e0de95cab6de87db340323e59d
[ "Apache-2.0" ]
null
null
null
740.554333
25,460
0.954436
[ [ [ "import numpy as np\nimport scipy.sparse as sp\nfrom sklearn.datasets import load_svmlight_file\nfrom oracle import Oracle, make_oracle\nimport scipy as sc\nfrom methods import OptimizeLassoProximal, OptimizeGD, NesterovLineSearch\nimport matplotlib.pyplot as plt\nfrom sklearn import linear_model", "_____no_output_____" ] ], [ [ "We solve the logistic regression problem with l1 regularization:\n$$F(w) = - \\frac{1}{N}\\sum\\limits_{i=1}^N\\left[y_i\\ln(\\sigma_w(x_i)) + (1 - y_i)\\ln(1 - \\sigma_w(x_i))\\right] + \\lambda\\|w\\|_1,$$\nwhere $\\lambda$ is the regularization parameter.\n\nWe solve the problem with the proximal gradient method. First, let us check that for $\\lambda = 0$ our solution coincides with the solution of gradient descent with step-size estimation by Nesterov's method.", "_____no_output_____" ] ], [ [ "orac = make_oracle('a1a.txt', penalty='l1', reg=0)\norac1 = make_oracle('a1a.txt')\nx, y = load_svmlight_file('a1a.txt', zero_based=False)\nm = x[0].shape[1] + 1\nw0 = np.zeros((m, 1))\noptimizer = OptimizeLassoProximal()\noptimizer1 = OptimizeGD()\npoint = optimizer(orac, w0)\npoint1 = optimizer1(orac1, w0, NesterovLineSearch())\n\nnp.allclose(point, point1)", "_____no_output_____" ] ], [ [ "Let us study the convergence rate of the method on the a1a.txt dataset ($\\lambda = 0.001$)", "_____no_output_____" ] ], [ [ "def convergence_plot(xs, ys, xlabel, title=None):\n    plt.figure(figsize = (12, 3))\n    plt.xlabel(xlabel)\n    plt.ylabel('F(w_{k+1}) - F(w_k)')\n    plt.plot(xs, ys)\n    plt.yscale('log')\n    if title:\n        plt.title(title)\n    plt.tight_layout()\n    plt.show()\n    ", "_____no_output_____" ], [ "orac = make_oracle('a1a.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)", "_____no_output_____" ], [ "errs = optimizer.errs\ntitle = 'lambda = 
0.001'\nconvergence_plot(optimizer.times, errs, 'runtime, s', title)\nconvergence_plot(optimizer.orac_calls, errs, 'number of oracle calls', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), errs, 'number of iterations', title)", "_____no_output_____" ] ], [ [ "Note that the stopping criterion $F(w_{k+1}) - F(w_k) \\leq tol = 10^{-16}$ was used. Mathematically this seems fine, since in the real numbers convergence of a sequence is equivalent to it being a Cauchy sequence. I also tried using $\\|\\nabla_w f(w_k)\\|_2^2 / \\|\\nabla_w f(w_0)\\|_2^2 <= tol$ as the stopping criterion, where $f$ is the logistic regression loss without regularization ($F = f + reg$), but, generally speaking, it is not quite clear whether this is valid, because it accounts for only part of the function.\n\nThe plots show that the method has a linear convergence rate", "_____no_output_____", "Now let us study how the convergence rate and the number of nonzero components in the solution depend on the regularization parameter $\\lambda$", "_____no_output_____" ] ], [ [ "def plot(x, ys, ylabel, legend=False): \n    plt.figure(figsize = (12, 3))\n    plt.xlabel(\"lambda\")\n    plt.ylabel(ylabel)\n    plt.plot(x, ys, 'o')\n    plt.xscale('log')\n    if legend:\n        plt.legend()\n    plt.tight_layout()\n    plt.show()", "_____no_output_____" ], [ "lambdas = [10**(-i) for i in range(8, 0, -1)]\nnon_zeros = []\nfor reg in lambdas:\n    orac = make_oracle('a1a.txt', 
penalty='l1', reg=reg)\n    point = optimizer(orac, w0)\n    convergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'number of iterations',\n                     f\"lambda = {reg}\")\n    non_zeros.append(len(np.nonzero(point)[0]))\nplot(lambdas, non_zeros, '# nonzero components')", "_____no_output_____" ] ], [ [ "We can see that the regularization parameter has almost no effect on the convergence rate (it is always linear), but the number of iterations of the method decreases as the regularization parameter grows. Also, from the last plot we draw the expected conclusion that the number of nonzero components in the solution decreases as the regularization parameter grows", "_____no_output_____", "Let us also plot the value of the objective function and the stopping criterion (once more) against the iteration number ($\\lambda = 0.001$)", "_____no_output_____" ] ], [ [ "def value_plot(xs, ys, xlabel, title=None):\n    plt.figure(figsize = (12, 3))\n    plt.xlabel(xlabel)\n    plt.ylabel('F(w_k)')\n    plt.plot(xs, ys)\n#     plt.yscale('log')\n    if title:\n        plt.title(title)\n    plt.tight_layout()\n    plt.show()", "_____no_output_____" ], [ "orac = make_oracle('a1a.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)\ntitle = 'lambda = 0.001'\nvalue_plot(list(range(1, optimizer.n_iter + 1)), optimizer.values, 'number of iterations', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'number of iterations', title)", "_____no_output_____" ] ], [ [ "To confirm the conclusions drawn, let us verify them on the breast-cancer_scale dataset as well.", "_____no_output_____" ], [ 
"ะŸั€ะพะฒะตั€ะบะฐ ั€ะฐะฒะฝะพัะธะปัŒะฝะพัั‚ะธ GD + Nesterov ะธ Proximal + $\\lambda = 0$:", "_____no_output_____" ] ], [ [ "orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=0)\norac1 = make_oracle('breast-cancer_scale.txt')\nx, y = load_svmlight_file('breast-cancer_scale.txt', zero_based=False)\nm = x[0].shape[1] + 1\nw0 = np.zeros((m, 1))\noptimizer = OptimizeLassoProximal()\noptimizer1 = OptimizeGD()\npoint = optimizer(orac, w0)\npoint1 = optimizer1(orac1, w0, NesterovLineSearch())\n\nnp.allclose(point, point1)", "_____no_output_____" ], [ "print(abs(orac.value(point) - orac1.value(point1)))", "0.0001461093710795336\n" ] ], [ [ "ะกะฐะผะธ ะฒะตะบั‚ะพั€ะฐ ะฒะตัะพะฒ ะฝะต ัะพะฒะฟะฐะปะธ, ะฝะพ ะทะฝะฐั‡ะตะฝะธั ะพะฟั‚ะธะผะธะทะธั€ัƒะตะผะพะน ั„ัƒะฝะบั†ะธะธ ะฑะปะธะทะบะธ, ั‚ะฐะบ ั‡ั‚ะพ ะฑัƒะดะตะผ ัั‡ะธั‚ะฐั‚ัŒ, ั‡ั‚ะพ ะฒัะต ะพะบ.", "_____no_output_____" ], [ "ะ˜ะทัƒั‡ะฐะตะผ ัะบะพั€ะพัั‚ัŒ ัั…ะพะดะธะผะพัั‚ะธ ะดะปั $\\lambda = 0.001$:", "_____no_output_____" ] ], [ [ "orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)\nerrs = optimizer.errs\ntitle = 'lambda = 0.001'\nconvergence_plot(optimizer.times, errs, 'ะฒะตั€ะผั ั€ะฐะฑะพั‚ั‹, ั', title)\nconvergence_plot(optimizer.orac_calls, errs, 'ะบะพะป-ะฒะพ ะฒั‹ะทะพะฒะพะฒ ะพั€ะฐะบัƒะปะฐ', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), errs, 'ะบะพะป-ะฒะพ ะธั‚ะตั€ะฐั†ะธะน', title)", "_____no_output_____" ] ], [ [ "ะšะฐะถะตั‚ัั, ั‡ั‚ะพ ัะบะพั€ะพัั‚ัŒ ัั…ะพะดะธะผะพัั‚ะธ ะพะฟัั‚ัŒ ะปะธะฝะตะนะฝะฐั", "_____no_output_____" ], [ "ะ˜ะทัƒั‡ะฐะตะผ ะทะฐะฒะธัะธะผะพัั‚ัŒ ัะบะพั€ะพัั‚ะธ ัั…ะพะดะธะผะพัั‚ะธ ะธ ะบะพะปะธั‡ะตัั‚ะฒะฐ ะฝะตะฝัƒะปะตะฒั‹ั… ะบะพะผะฟะพะฝะตะฝั‚ ะฒ ั€ะตัˆะตะฝะธะธ ะพั‚ $\\lambda$", "_____no_output_____" ] ], [ [ "lambdas = [10**(-i) for i in range(8, 0, -1)]\nnon_zeros = []\nfor reg in lambdas:\n orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=reg)\n point = optimizer(orac, w0)\n convergence_plot(list(range(1, 
optimizer.n_iter + 1)), optimizer.errs, 'number of iterations',\n f\"lambda = {reg}\")\n non_zeros.append(len(np.nonzero(point)[0]))\nplot(lambdas, non_zeros, '# nonzero components')", "_____no_output_____" ] ], [ [ "We draw the same conclusions.", "_____no_output_____" ], [ "Finally, let's plot the values of the objective function and the stopping criterion (once more) as a function of the iteration number ($\\lambda = 0.001$):", "_____no_output_____" ] ], [ [ "orac = make_oracle('breast-cancer_scale.txt', penalty='l1', reg=0.001)\npoint = optimizer(orac, w0)\ntitle = 'lambda = 0.001'\nvalue_plot(list(range(1, optimizer.n_iter + 1)), optimizer.values, 'number of iterations', title)\nconvergence_plot(list(range(1, optimizer.n_iter + 1)), optimizer.errs, 'number of iterations', title)", "_____no_output_____" ] ], [ [ "The end.", "_____no_output_____" ] ] ]
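The `OptimizeLassoProximal` runs above rely on the proximal (soft-thresholding) operator of the L1 penalty, but its implementation is not shown in the notebook. Below is a minimal NumPy sketch of the standard operator; the names `soft_threshold` and `proximal_step` are illustrative, not from the notebook's code. With `reg=0` the shrinkage vanishes and the step reduces to a plain gradient step, which is consistent with GD + Nesterov and the proximal method reaching nearly the same objective value above.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each component toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_step(w, grad, lr, reg):
    # One ISTA step: gradient step on the smooth part, then soft-thresholding.
    return soft_threshold(w - lr * grad, lr * reg)

w = np.array([0.5, -0.2, 0.05])
grad = np.zeros_like(w)
print(proximal_step(w, grad, lr=1.0, reg=0.0))  # reg = 0: w is unchanged
print(proximal_step(w, grad, lr=1.0, reg=0.1))  # components below the threshold are zeroed
```

The zeroing of small components is also what drives the `non_zeros` plot: a larger `reg` thresholds away more coordinates of the solution.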
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ] ]
d06224770637f53d41cec6b94186b3ae72820478
250,590
ipynb
Jupyter Notebook
sngp_with_bert_aws.ipynb
tejashrigadre/Anomaly-detection-for-chat-bots
7cac681983bc435953d472a3d2f91bcbe4bc756f
[ "MIT" ]
null
null
null
sngp_with_bert_aws.ipynb
tejashrigadre/Anomaly-detection-for-chat-bots
7cac681983bc435953d472a3d2f91bcbe4bc756f
[ "MIT" ]
null
null
null
sngp_with_bert_aws.ipynb
tejashrigadre/Anomaly-detection-for-chat-bots
7cac681983bc435953d472a3d2f91bcbe4bc756f
[ "MIT" ]
null
null
null
192.317728
51,284
0.88982
[ [ [ "## Implementing BERT with SNGP", "_____no_output_____" ] ], [ [ "!pip install tensorflow_text==2.7.3", "Collecting tensorflow_text==2.7.3\n Using cached tensorflow_text-2.7.3-cp38-cp38-manylinux2010_x86_64.whl (4.9 MB)\nCollecting tensorflow-hub>=0.8.0\n Using cached tensorflow_hub-0.12.0-py2.py3-none-any.whl (108 kB)\nCollecting tensorflow<2.8,>=2.7.0\n Using cached tensorflow-2.7.1-cp38-cp38-manylinux2010_x86_64.whl (495.1 MB)\nCollecting tensorflow-estimator<2.8,~=2.7.0rc0\n Using cached tensorflow_estimator-2.7.0-py2.py3-none-any.whl (463 kB)\nRequirement already satisfied: six>=1.12.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.16.0)\nCollecting libclang>=9.0.1\n Using cached libclang-13.0.0-py2.py3-none-manylinux1_x86_64.whl (14.5 MB)\nRequirement already satisfied: flatbuffers<3.0,>=1.12 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.12)\nCollecting keras<2.8,>=2.7.0rc0\n Using cached keras-2.7.0-py2.py3-none-any.whl (1.3 MB)\nRequirement already satisfied: wheel<1.0,>=0.32.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.37.0)\nRequirement already satisfied: gast<0.5.0,>=0.2.1 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.4.0)\nRequirement already satisfied: protobuf>=3.9.2 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.19.1)\nRequirement already satisfied: wrapt>=1.11.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.12.1)\nRequirement already satisfied: astunparse>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.6.3)\nRequirement already satisfied: google-pasta>=0.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.2.0)\nRequirement 
already satisfied: grpcio<2.0,>=1.24.3 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.42.0)\nRequirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.3.0)\nRequirement already satisfied: numpy>=1.14.5 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.19.5)\nRequirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.1.0)\nRequirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.7.4.3)\nRequirement already satisfied: tensorboard~=2.6 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.6.0)\nRequirement already satisfied: keras-preprocessing>=1.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.1.2)\nRequirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.1.0)\nRequirement already satisfied: tensorflow-io-gcs-filesystem>=0.21.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.21.0)\nRequirement already satisfied: absl-py>=0.4.0 in /usr/local/lib/python3.8/site-packages (from tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.10.0)\nRequirement already satisfied: setuptools>=41.0.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (59.5.0)\nRequirement already satisfied: google-auth<2,>=1.6.3 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.35.0)\nRequirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in 
/usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.6.1)\nRequirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.8.0)\nRequirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.0.2)\nRequirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.4.6)\nRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.3.6)\nRequirement already satisfied: requests<3,>=2.21.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.25.1)\nRequirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.8/site-packages (from google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.2.8)\nRequirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.8/site-packages (from google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.2.4)\nRequirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.8/site-packages (from google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.7.2)\nRequirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.8/site-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.3.0)\nRequirement already satisfied: importlib-metadata>=4.4 in /usr/local/lib/python3.8/site-packages (from markdown>=2.6.8->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) 
(4.8.2)\nRequirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2.10)\nRequirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (4.0.0)\nRequirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (1.26.7)\nRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.8/site-packages (from requests<3,>=2.21.0->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (2021.10.8)\nRequirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.8/site-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.6.0)\nRequirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /usr/local/lib/python3.8/site-packages (from pyasn1-modules>=0.2.1->google-auth<2,>=1.6.3->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (0.4.8)\nRequirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.8/site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow<2.8,>=2.7.0->tensorflow_text==2.7.3) (3.1.1)\nInstalling collected packages: tensorflow-estimator, libclang, keras, tensorflow-hub, tensorflow, tensorflow-text\n Attempting uninstall: tensorflow-estimator\n Found existing installation: tensorflow-estimator 2.6.0\n Uninstalling tensorflow-estimator-2.6.0:\n Successfully uninstalled tensorflow-estimator-2.6.0\n Attempting uninstall: keras\n Found existing installation: keras 2.6.0\n Uninstalling keras-2.6.0:\n Successfully uninstalled keras-2.6.0\n Attempting uninstall: tensorflow\n Found existing installation: tensorflow 2.6.2\n Uninstalling 
tensorflow-2.6.2:\n Successfully uninstalled tensorflow-2.6.2\n\u001b[31mERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.\ntensorflow-io 0.21.0 requires tensorflow<2.7.0,>=2.6.0, but you have tensorflow 2.7.1 which is incompatible.\u001b[0m\nSuccessfully installed keras-2.7.0 libclang-13.0.0 tensorflow-2.7.1 tensorflow-estimator-2.7.0 tensorflow-hub-0.12.0 tensorflow-text-2.7.3\n\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\n\u001b[33mWARNING: You are using pip version 21.3.1; however, version 22.0.3 is available.\nYou should consider upgrading via the '/usr/local/bin/python3.8 -m pip install --upgrade pip' command.\u001b[0m\n" ], [ "!pip install -U tf-models-official==2.7.0", "Collecting tf-models-official==2.7.0\n Using cached tf_models_official-2.7.0-py2.py3-none-any.whl (1.8 MB)\nRequirement already satisfied: tensorflow-text>=2.7.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (2.7.3)\nRequirement already satisfied: Pillow in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (8.3.2)\nRequirement already satisfied: matplotlib in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (3.5.0)\nCollecting pycocotools\n Using cached pycocotools-2.0.4-cp38-cp38-linux_x86_64.whl\nCollecting oauth2client\n Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)\nCollecting google-api-python-client>=1.6.7\n Using cached google_api_python_client-2.37.0-py2.py3-none-any.whl (8.1 MB)\nCollecting seqeval\n Using cached seqeval-1.2.2-py3-none-any.whl\nCollecting Cython\n Using cached Cython-0.29.28-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (1.9 
MB)\nCollecting sentencepiece\n Using cached sentencepiece-0.1.96-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.2 MB)\nCollecting tensorflow-datasets\n Using cached tensorflow_datasets-4.5.2-py3-none-any.whl (4.2 MB)\nCollecting tensorflow-addons\n Using cached tensorflow_addons-0.16.1-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)\nRequirement already satisfied: psutil>=5.4.3 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (5.8.0)\nCollecting tensorflow-model-optimization>=0.4.1\n Using cached tensorflow_model_optimization-0.7.1-py2.py3-none-any.whl (234 kB)\nCollecting sacrebleu\n Using cached sacrebleu-2.0.0-py3-none-any.whl (90 kB)\nRequirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.19.5)\nRequirement already satisfied: tensorflow-hub>=0.6.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (0.12.0)\nCollecting py-cpuinfo>=3.3.0\n Using cached py_cpuinfo-8.0.0-py3-none-any.whl\nCollecting kaggle>=1.3.9\n Using cached kaggle-1.5.12-py3-none-any.whl\nRequirement already satisfied: scipy>=0.19.1 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.7.0)\nCollecting gin-config\n Using cached gin_config-0.5.0-py3-none-any.whl (61 kB)\nCollecting tf-slim>=1.1.0\n Using cached tf_slim-1.1.0-py2.py3-none-any.whl (352 kB)\nRequirement already satisfied: pyyaml>=5.1 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (5.4.1)\nRequirement already satisfied: tensorflow>=2.7.0 in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (2.7.1)\nRequirement already satisfied: six in /usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.16.0)\nCollecting opencv-python-headless\n Using cached opencv_python_headless-4.5.5.62-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (47.7 MB)\nRequirement already satisfied: pandas>=0.22.0 in 
/usr/local/lib/python3.8/site-packages (from tf-models-official==2.7.0) (1.2.5)\nCollecting httplib2<1dev,>=0.15.0\n Using cached httplib2-0.20.4-py3-none-any.whl (96 kB)\nCollecting uritemplate<5,>=3.0.1\n Using cached uritemplate-4.1.1-py2.py3-none-any.whl (10 kB)\nRequirement already satisfied: google-auth<3.0.0dev,>=1.16.0 in /usr/local/lib/python3.8/site-packages (from google-api-python-client>=1.6.7->tf-models-official==2.7.0) (1.35.0)\nCollecting google-auth-httplib2>=0.1.0\n Using cached google_auth_httplib2-0.1.0-py2.py3-none-any.whl (9.3 kB)\nCollecting google-api-core<3.0.0dev,>=1.21.0\n Using cached google_api_core-2.5.0-py2.py3-none-any.whl (111 kB)\nRequirement already satisfied: python-dateutil in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (2.8.2)\nRequirement already satisfied: tqdm in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (4.62.3)\nRequirement already satisfied: certifi in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (2021.10.8)\nRequirement already satisfied: urllib3 in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (1.26.7)\nCollecting python-slugify\n Using cached python_slugify-6.0.1-py2.py3-none-any.whl (9.0 kB)\nRequirement already satisfied: requests in /usr/local/lib/python3.8/site-packages (from kaggle>=1.3.9->tf-models-official==2.7.0) (2.25.1)\nRequirement already satisfied: pytz>=2017.3 in /usr/local/lib/python3.8/site-packages (from pandas>=0.22.0->tf-models-official==2.7.0) (2021.3)\nRequirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.1.0)\nRequirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.1.0)\nRequirement already satisfied: opt-einsum>=2.3.2 in 
/usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.3.0)\nRequirement already satisfied: wheel<1.0,>=0.32.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.37.0)\nRequirement already satisfied: tensorflow-io-gcs-filesystem>=0.21.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.21.0)\nRequirement already satisfied: keras-preprocessing>=1.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.1.2)\nRequirement already satisfied: libclang>=9.0.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (13.0.0)\nRequirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.7.4.3)\nRequirement already satisfied: google-pasta>=0.1.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.2.0)\nRequirement already satisfied: flatbuffers<3.0,>=1.12 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.12)\nRequirement already satisfied: keras<2.8,>=2.7.0rc0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (2.7.0)\nRequirement already satisfied: tensorflow-estimator<2.8,~=2.7.0rc0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (2.7.0)\nRequirement already satisfied: astunparse>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.6.3)\nRequirement already satisfied: grpcio<2.0,>=1.24.3 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.42.0)\nRequirement already satisfied: tensorboard~=2.6 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) 
(2.6.0)\nRequirement already satisfied: wrapt>=1.11.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (1.12.1)\nRequirement already satisfied: gast<0.5.0,>=0.2.1 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.4.0)\nRequirement already satisfied: absl-py>=0.4.0 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (0.10.0)\nRequirement already satisfied: protobuf>=3.9.2 in /usr/local/lib/python3.8/site-packages (from tensorflow>=2.7.0->tf-models-official==2.7.0) (3.19.1)\nCollecting dm-tree~=0.1.1\n Using cached dm_tree-0.1.6-cp38-cp38-manylinux_2_24_x86_64.whl (94 kB)\nRequirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (0.11.0)\nRequirement already satisfied: fonttools>=4.22.0 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (4.28.3)\nRequirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (21.3)\nRequirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (1.3.2)\nRequirement already satisfied: pyparsing>=2.2.1 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (3.0.6)\nRequirement already satisfied: setuptools-scm>=4 in /usr/local/lib/python3.8/site-packages (from matplotlib->tf-models-official==2.7.0) (6.3.2)\nRequirement already satisfied: pyasn1>=0.1.7 in /usr/local/lib/python3.8/site-packages (from oauth2client->tf-models-official==2.7.0) (0.4.8)\nRequirement already satisfied: pyasn1-modules>=0.0.5 in /usr/local/lib/python3.8/site-packages (from oauth2client->tf-models-official==2.7.0) (0.2.8)\nRequirement already satisfied: rsa>=3.1.4 in /usr/local/lib/python3.8/site-packages (from oauth2client->tf-models-official==2.7.0) 
(4.7.2)\nRequirement already satisfied: tabulate>=0.8.9 in /usr/local/lib/python3.8/site-packages (from sacrebleu->tf-models-official==2.7.0) (0.8.9)\nRequirement already satisfied: colorama in /usr/local/lib/python3.8/site-packages (from sacrebleu->tf-models-official==2.7.0) (0.4.3)\nCollecting regex\n Using cached regex-2022.1.18-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (764 kB)\nCollecting portalocker\n Using cached portalocker-2.4.0-py2.py3-none-any.whl (16 kB)\nRequirement already satisfied: scikit-learn>=0.21.3 in /usr/local/lib/python3.8/site-packages (from seqeval->tf-models-official==2.7.0) (0.24.2)\nCollecting typeguard>=2.7\n Using cached typeguard-2.13.3-py3-none-any.whl (17 kB)\nCollecting promise\n Using cached promise-2.3-py3-none-any.whl\nRequirement already satisfied: dill in /usr/local/lib/python3.8/site-packages (from tensorflow-datasets->tf-models-official==2.7.0) (0.3.4)\nCollecting tensorflow-metadata\n Using cached tensorflow_metadata-1.6.0-py3-none-any.whl (48 kB)\nRequirement already satisfied: importlib-resources in /usr/local/lib/python3.8/site-packages (from tensorflow-datasets->tf-models-official==2.7.0) (5.4.0)\nCollecting googleapis-common-protos<2.0dev,>=1.52.0\n Using cached googleapis_common_protos-1.54.0-py2.py3-none-any.whl (207 kB)\nRequirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.8/site-packages (from google-auth<3.0.0dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official==2.7.0) (4.2.4)\nRequirement already satisfied: setuptools>=40.3.0 in /usr/local/lib/python3.8/site-packages (from google-auth<3.0.0dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official==2.7.0) (59.5.0)\nRequirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.8/site-packages (from requests->kaggle>=1.3.9->tf-models-official==2.7.0) (2.10)\nRequirement already satisfied: chardet<5,>=3.0.2 in /usr/local/lib/python3.8/site-packages (from 
requests->kaggle>=1.3.9->tf-models-official==2.7.0) (4.0.0)\nRequirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.8/site-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official==2.7.0) (3.0.0)\nRequirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.8/site-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official==2.7.0) (1.1.0)\nRequirement already satisfied: tomli>=1.0.0 in /usr/local/lib/python3.8/site-packages (from setuptools-scm>=4->matplotlib->tf-models-official==2.7.0) (1.2.2)\nRequirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (2.0.2)\nRequirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (3.3.6)\nRequirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (0.6.1)\nRequirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (0.4.6)\nRequirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.8/site-packages (from tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (1.8.0)\nRequirement already satisfied: zipp>=3.1.0 in /usr/local/lib/python3.8/site-packages (from importlib-resources->tensorflow-datasets->tf-models-official==2.7.0) (3.6.0)\nCollecting text-unidecode>=1.3\n Using cached text_unidecode-1.3-py2.py3-none-any.whl (78 kB)\nRequirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.8/site-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (1.3.0)\nRequirement already satisfied: importlib-metadata>=4.4 in 
/usr/local/lib/python3.8/site-packages (from markdown>=2.6.8->tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (4.8.2)\nRequirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.8/site-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard~=2.6->tensorflow>=2.7.0->tf-models-official==2.7.0) (3.1.1)\nInstalling collected packages: text-unidecode, httplib2, googleapis-common-protos, uritemplate, typeguard, tensorflow-metadata, regex, python-slugify, promise, portalocker, google-auth-httplib2, google-api-core, dm-tree, tf-slim, tensorflow-model-optimization, tensorflow-datasets, tensorflow-addons, seqeval, sentencepiece, sacrebleu, pycocotools, py-cpuinfo, opencv-python-headless, oauth2client, kaggle, google-api-python-client, gin-config, Cython, tf-models-official\nSuccessfully installed Cython-0.29.28 dm-tree-0.1.6 gin-config-0.5.0 google-api-core-2.5.0 google-api-python-client-2.37.0 google-auth-httplib2-0.1.0 googleapis-common-protos-1.54.0 httplib2-0.20.4 kaggle-1.5.12 oauth2client-4.1.3 opencv-python-headless-4.5.5.62 portalocker-2.4.0 promise-2.3 py-cpuinfo-8.0.0 pycocotools-2.0.4 python-slugify-6.0.1 regex-2022.1.18 sacrebleu-2.0.0 sentencepiece-0.1.96 seqeval-1.2.2 tensorflow-addons-0.16.1 tensorflow-datasets-4.5.2 tensorflow-metadata-1.6.0 tensorflow-model-optimization-0.7.1 text-unidecode-1.3 tf-models-official-2.7.0 tf-slim-1.1.0 typeguard-2.13.3 uritemplate-4.1.1\n\u001b[33mWARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. 
It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv\u001b[0m\n\u001b[33mWARNING: You are using pip version 21.3.1; however, version 22.0.3 is available.\nYou should consider upgrading via the '/usr/local/bin/python3.8 -m pip install --upgrade pip' command.\u001b[0m\n" ], [ "import matplotlib.pyplot as plt\nimport matplotlib.colors as colors\n\nimport sklearn.metrics\nimport sklearn.calibration\n\nimport tensorflow_hub as hub\nimport tensorflow_datasets as tfds\n\nimport numpy as np\nimport tensorflow as tf\nimport pandas as pd\nimport json\n\nimport official.nlp.modeling.layers as layers\nimport official.nlp.optimization as optimization", "_____no_output_____" ] ], [ [ "### Implement a standard BERT classifier following which classifies text", "_____no_output_____" ] ], [ [ "gpus = tf.config.list_physical_devices('GPU')\ngpus", "_____no_output_____" ], [ "# Standard BERT model\n\nPREPROCESS_HANDLE = 'https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3'\nMODEL_HANDLE = 'https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/3'\n\nclass BertClassifier(tf.keras.Model):\n def __init__(self, \n num_classes=150, inner_dim=768, dropout_rate=0.1,\n **classifier_kwargs):\n \n super().__init__()\n self.classifier_kwargs = classifier_kwargs\n\n # Initiate the BERT encoder components.\n self.bert_preprocessor = hub.KerasLayer(PREPROCESS_HANDLE, name='preprocessing')\n self.bert_hidden_layer = hub.KerasLayer(MODEL_HANDLE, trainable=True, name='bert_encoder')\n\n # Defines the encoder and classification layers.\n self.bert_encoder = self.make_bert_encoder()\n self.classifier = self.make_classification_head(num_classes, inner_dim, dropout_rate)\n\n def make_bert_encoder(self):\n text_inputs = tf.keras.layers.Input(shape=(), dtype=tf.string, name='text')\n encoder_inputs = self.bert_preprocessor(text_inputs)\n encoder_outputs = self.bert_hidden_layer(encoder_inputs)\n return tf.keras.Model(text_inputs, encoder_outputs)\n\n def 
make_classification_head(self, num_classes, inner_dim, dropout_rate):\n return layers.ClassificationHead(\n num_classes=num_classes, \n inner_dim=inner_dim,\n dropout_rate=dropout_rate,\n **self.classifier_kwargs)\n\n def call(self, inputs, **kwargs):\n encoder_outputs = self.bert_encoder(inputs)\n classifier_inputs = encoder_outputs['sequence_output']\n return self.classifier(classifier_inputs, **kwargs)\n", "_____no_output_____" ] ], [ [ "### Build SNGP model", "_____no_output_____" ], [ "We implement a BERT-SNGP model, following the design proposed by Google researchers.", "_____no_output_____" ] ], [ [ "class ResetCovarianceCallback(tf.keras.callbacks.Callback):\n\n def on_epoch_begin(self, epoch, logs=None):\n \"\"\"Resets covariance matrix at the beginning of the epoch.\"\"\"\n if epoch > 0:\n self.model.classifier.reset_covariance_matrix()", "_____no_output_____" ], [ "class SNGPBertClassifier(BertClassifier):\n\n def make_classification_head(self, num_classes, inner_dim, dropout_rate):\n return layers.GaussianProcessClassificationHead(\n num_classes=num_classes, \n inner_dim=inner_dim,\n dropout_rate=dropout_rate,\n gp_cov_momentum=-1,\n temperature=30.,\n **self.classifier_kwargs)\n\n def fit(self, *args, **kwargs):\n \"\"\"Adds ResetCovarianceCallback to model callbacks.\"\"\"\n kwargs['callbacks'] = list(kwargs.get('callbacks', []))\n kwargs['callbacks'].append(ResetCovarianceCallback())\n\n return super().fit(*args, **kwargs)", "_____no_output_____" ] ], [ [ "### Load train and test datasets", "_____no_output_____" ] ], [ [ "is_train = pd.read_json('is_train.json')\nis_train.columns = ['question','intent']\n\nis_test = pd.read_json('is_test.json')\nis_test.columns = ['question','intent']\n\noos_test = pd.read_json('oos_test.json')\noos_test.columns = ['question','intent']\n\nis_test.shape", "_____no_output_____" ] ], [ [ "Make the train and test data.", "_____no_output_____" ] ], [ [ "#Generate codes\nis_data = is_train.append(is_test)\nis_data.intent = 
pd.Categorical(is_data.intent)\nis_data['code'] = is_data.intent.cat.codes\n\n#in-scope evaluation data\nis_test = is_data[15000:19500]\n\nis_test_queries = is_test.question\nis_test_labels = is_test.intent\nis_test_codes = is_test.code\n\nis_eval_data = (tf.convert_to_tensor(is_test_queries), tf.convert_to_tensor(is_test_codes))\n\nis_train = is_data[0:15000]\nis_train_queries = is_train.question\nis_train_labels = is_train.intent\nis_train_codes = is_train.code\n\ntraining_ds_queries = tf.convert_to_tensor(is_train_queries)\n\ntraining_ds_labels = tf.convert_to_tensor(is_train_codes)", "_____no_output_____" ], [ "is_test.shape", "_____no_output_____" ] ], [ [ "Create an OOD evaluation dataset. For this, combine the in-scope test data 'is_test' and out-of-scope 'oos_test' data. Assign label 0 to in-scope and label 1 to out-of-scope data.", "_____no_output_____" ] ], [ [ "train_size = len(is_train)\ntest_size = len(is_test)\noos_size = len(oos_test)\n\n# Combines the in-domain and out-of-domain test examples.\noos_queries = tf.concat([is_test['question'], oos_test['question']], axis=0)\noos_labels = tf.constant([0] * test_size + [1] * oos_size)\n\n# Converts into a TF dataset.\noos_eval_dataset = tf.data.Dataset.from_tensor_slices(\n    {\"text\": oos_queries, \"label\": oos_labels})", "_____no_output_____" ] ], [ [ "### Train and evaluate", "_____no_output_____" ] ], [ [ "TRAIN_EPOCHS = 4\nTRAIN_BATCH_SIZE = 16\nEVAL_BATCH_SIZE = 256", "_____no_output_____" ], [ "#@title\n\ndef bert_optimizer(learning_rate, \n batch_size=TRAIN_BATCH_SIZE, epochs=TRAIN_EPOCHS, \n warmup_rate=0.1):\n \"\"\"Creates an AdamWeightDecay optimizer with learning rate schedule.\"\"\"\n train_data_size = train_size\n \n steps_per_epoch = int(train_data_size / batch_size)\n num_train_steps = steps_per_epoch * epochs\n num_warmup_steps = int(warmup_rate * num_train_steps) \n\n # Creates learning schedule.\n lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(\n 
initial_learning_rate=learning_rate,\n decay_steps=num_train_steps,\n end_learning_rate=0.0) \n \n return optimization.AdamWeightDecay(\n learning_rate=lr_schedule,\n weight_decay_rate=0.01,\n epsilon=1e-6,\n exclude_from_weight_decay=['LayerNorm', 'layer_norm', 'bias'])", "_____no_output_____" ], [ "optimizer = bert_optimizer(learning_rate=1e-4)\nloss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)\nmetrics = tf.metrics.SparseCategoricalAccuracy()", "_____no_output_____" ], [ "fit_configs = dict(batch_size=TRAIN_BATCH_SIZE,\n epochs=TRAIN_EPOCHS,\n validation_batch_size=EVAL_BATCH_SIZE, \n validation_data=is_eval_data)", "_____no_output_____" ] ], [ [ "### Model 1 - Batch size of 16 & 2 epochs ", "_____no_output_____" ] ], [ [ "sngp_model = SNGPBertClassifier()\nsngp_model.compile(optimizer=optimizer, loss=loss, metrics=metrics)\nsngp_model.fit(training_ds_queries, training_ds_labels, **fit_configs)", "Epoch 1/2\n938/938 [==============================] - 481s 494ms/step - loss: 0.8704 - sparse_categorical_accuracy: 0.8241 - val_loss: 0.2888 - val_sparse_categorical_accuracy: 0.9473\nEpoch 2/2\n938/938 [==============================] - 464s 495ms/step - loss: 0.0647 - sparse_categorical_accuracy: 0.9853 - val_loss: 0.1979 - val_sparse_categorical_accuracy: 0.9598\n" ] ], [ [ "### Model 2 - Batch size of 16 & 3 epochs ", "_____no_output_____" ] ], [ [ "sngp_model2 = SNGPBertClassifier()\nsngp_model2.compile(optimizer=optimizer, loss=loss, metrics=metrics)\nsngp_model2.fit(training_ds_queries, training_ds_labels, **fit_configs)", "Epoch 1/3\n938/938 [==============================] - 480s 495ms/step - loss: 0.9506 - sparse_categorical_accuracy: 0.8029 - val_loss: 0.3883 - val_sparse_categorical_accuracy: 0.9376\nEpoch 2/3\n938/938 [==============================] - 462s 493ms/step - loss: 0.0989 - sparse_categorical_accuracy: 0.9769 - val_loss: 0.2342 - val_sparse_categorical_accuracy: 0.9522\nEpoch 3/3\n938/938 [==============================] - 
462s 493ms/step - loss: 0.0272 - sparse_categorical_accuracy: 0.9939 - val_loss: 0.2013 - val_sparse_categorical_accuracy: 0.9598\n" ] ], [ [ "### Model 3 - Batch size of 16 & 4 epochs ", "_____no_output_____" ] ], [ [ "sngp_model3 = SNGPBertClassifier()\nsngp_model3.compile(optimizer=optimizer, loss=loss, metrics=metrics)\nsngp_model3.fit(training_ds_queries, training_ds_labels, **fit_configs)", "Epoch 1/4\n938/938 [==============================] - 477s 493ms/step - loss: 0.9459 - sparse_categorical_accuracy: 0.8066 - val_loss: 0.3804 - val_sparse_categorical_accuracy: 0.9393\nEpoch 2/4\n938/938 [==============================] - 465s 496ms/step - loss: 0.1192 - sparse_categorical_accuracy: 0.9730 - val_loss: 0.2526 - val_sparse_categorical_accuracy: 0.9511\nEpoch 3/4\n938/938 [==============================] - 466s 497ms/step - loss: 0.0372 - sparse_categorical_accuracy: 0.9917 - val_loss: 0.2169 - val_sparse_categorical_accuracy: 0.9564\nEpoch 4/4\n938/938 [==============================] - 465s 496ms/step - loss: 0.0135 - sparse_categorical_accuracy: 0.9974 - val_loss: 0.1992 - val_sparse_categorical_accuracy: 0.9629\n" ] ], [ [ "### Evaluate OOD performance", "_____no_output_____" ], [ "Evaluate how well the model can detect the unfamiliar out-of-domain queries.", "_____no_output_____" ] ], [ [ "\n\ndef oos_predict(model, ood_eval_dataset, **model_kwargs):\n oos_labels = []\n oos_probs = []\n\n ood_eval_dataset = ood_eval_dataset.batch(EVAL_BATCH_SIZE)\n for oos_batch in ood_eval_dataset:\n oos_text_batch = oos_batch[\"text\"]\n oos_label_batch = oos_batch[\"label\"] \n\n pred_logits = model(oos_text_batch, **model_kwargs)\n pred_probs_all = tf.nn.softmax(pred_logits, axis=-1)\n pred_probs = tf.reduce_max(pred_probs_all, axis=-1)\n\n oos_labels.append(oos_label_batch)\n oos_probs.append(pred_probs)\n\n oos_probs = tf.concat(oos_probs, axis=0)\n oos_labels = tf.concat(oos_labels, axis=0) \n\n return oos_probs, oos_labels", "_____no_output_____" ] ], [ [ 
"Computes the OOD probabilities as $1 - p(x)$, where $p(x)=softmax(logit(x))$ is the predictive probability.", "_____no_output_____" ] ], [ [ "sngp_probs, ood_labels = oos_predict(sngp_model, oos_eval_dataset)", "_____no_output_____" ], [ "sngp_probs2, ood_labels2 = oos_predict(sngp_model2, oos_eval_dataset)", "_____no_output_____" ], [ "sngp_probs3, ood_labels3 = oos_predict(sngp_model3, oos_eval_dataset)", "_____no_output_____" ], [ "ood_probs = 1 - sngp_probs\nood_probs2 = 1 - sngp_probs2\nood_probs3 = 1 - sngp_probs3", "_____no_output_____" ], [ "plt.rcParams['figure.dpi'] = 140\n\nDEFAULT_X_RANGE = (-3.5, 3.5)\nDEFAULT_Y_RANGE = (-2.5, 2.5)\nDEFAULT_CMAP = colors.ListedColormap([\"#377eb8\", \"#ff7f00\"])\nDEFAULT_NORM = colors.Normalize(vmin=0, vmax=1,)\nDEFAULT_N_GRID = 100", "_____no_output_____" ], [ "ood_uncertainty = ood_probs * (1 - ood_probs)\nood_uncertainty2 = ood_probs2 * (1 - ood_probs2)\nood_uncertainty3 = ood_probs3 * (1 - ood_probs3)", "_____no_output_____" ], [ "s1 = np.array(sngp_probs.numpy())\nprint(s1[3000])", "0.98855245\n" ], [ "s2 = np.array(sngp_probs2.numpy())\nprint(s2[2000])", "0.99832803\n" ], [ "s3 = np.array(sngp_probs3.numpy())\nprint(s3[1000])", "0.9983203\n" ] ], [ [ "### Compute the Area under precision-recall curve (AUPRC) for OOD probability vs. OOD detection accuracy.", "_____no_output_____" ] ], [ [ "precision, recall, _ = sklearn.metrics.precision_recall_curve(ood_labels, ood_probs)\nprecision2, recall2, _ = sklearn.metrics.precision_recall_curve(ood_labels2, ood_probs2)\nprecision3, recall3, _ = sklearn.metrics.precision_recall_curve(ood_labels3, ood_probs3)", "_____no_output_____" ], [ "print(precision3)\nprint(recall3)", "_____no_output_____" ] ], [ [ "[0.23380874 0.23362956 0.23368421 ... 1. 1. 1. ]\n[1. 0.999 0.999 ... 0.002 0.001 0. 
]", "_____no_output_____" ] ], [ [ "sklearn.metrics.recall_score(oos_labels, ood_labels3, average='weighted')", "_____no_output_____" ], [ "sklearn.metrics.precision_score(oos_labels, ood_labels3, average='weighted')", "_____no_output_____" ], [ "auprc = sklearn.metrics.auc(recall, precision)\nprint(f'SNGP AUPRC: {auprc:.4f}')", "SNGP AUPRC: 0.9026\n" ], [ "auprc2 = sklearn.metrics.auc(recall2, precision2)\nprint(f'SNGP AUPRC 2: {auprc2:.4f}')", "SNGP AUPRC 2: 0.8926\n" ], [ "auprc3 = sklearn.metrics.auc(recall3, precision3)\nprint(f'SNGP AUPRC 3: {auprc3:.4f}')", "SNGP AUPRC 3: 0.8926\n" ], [ "prob_true, prob_pred = sklearn.calibration.calibration_curve(\n ood_labels, ood_probs, n_bins=10, strategy='quantile')\n\nprob_true2, prob_pred2 = sklearn.calibration.calibration_curve(\n ood_labels2, ood_probs2, n_bins=10, strategy='quantile')\n\nprob_true3, prob_pred3 = sklearn.calibration.calibration_curve(\n ood_labels3, ood_probs3, n_bins=10, strategy='quantile')", "_____no_output_____" ], [ "plt.plot(prob_pred, prob_true)\n\nplt.plot([0., 1.], [0., 1.], c='k', linestyle=\"--\")\nplt.xlabel('Predictive Probability')\nplt.ylabel('Predictive Accuracy')\nplt.title('Calibration Plots, SNGP')\n\nplt.show()", "_____no_output_____" ], [ "plt.plot(prob_pred2, prob_true2)\n\nplt.plot([0., 1.], [0., 1.], c='k', linestyle=\"--\")\nplt.xlabel('Predictive Probability')\nplt.ylabel('Predictive Accuracy')\nplt.title('Calibration Plots, SNGP')\n\nplt.show()", "_____no_output_____" ], [ "plt.plot(prob_pred3, prob_true3)\n\nplt.plot([0., 1.], [0., 1.], c='k', linestyle=\"--\")\nplt.xlabel('Predictive Probability')\nplt.ylabel('Predictive Accuracy')\nplt.title('Calibration Plots, SNGP')\n\nplt.show()", "_____no_output_____" ], [ "# calculate scores\nauc1 = roc_auc_score(oos_labels, ood_probs)\nauc2 = roc_auc_score(oos_labels, ood_probs2)\nauc3 = roc_auc_score(oos_labels, ood_probs3)\n# summarize scores\nprint('SNGP Model 1: ROC AUC=%.3f' % (auc1))\nprint('SNGP Model 2: ROC AUC=%.3f' % 
(auc2))\nprint('SNGP Model 3: ROC AUC=%.3f' % (auc3))\n# calculate roc curves\nfpr1, tpr1, _ = roc_curve(oos_labels, ood_probs)\nfpr2, tpr2, _ = roc_curve(oos_labels, ood_probs2)\nfpr3, tpr3, _ = roc_curve(oos_labels, ood_probs3)\n# plot the roc curve for the model\npyplot.plot(fpr1, tpr1, marker='.', label='SNGP Model 1')\npyplot.plot(fpr2, tpr2, marker='*', label='SNGP Model 2')\npyplot.plot(fpr3, tpr3, marker='+', label='SNGP Model 3')\n# axis labels\npyplot.xlabel('False Positive Rate (Precision)')\npyplot.ylabel('True Positive Rate (Recall)')\n# show the legend\npyplot.legend()\n# show the plot\npyplot.show()", "SNGP Model 1: ROC AUC=0.972\nSNGP Model 2: ROC AUC=0.973\nSNGP Model 3: ROC AUC=0.973\n" ] ] ]
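The AUPRC and ROC AUC numbers above come from the trained SNGP models; as a self-contained sketch of the same evaluation pipeline, both metrics can be computed on a handful of made-up OOD scores. The labels and probabilities below are invented for illustration only — they are not model outputs.

```python
import numpy as np
import sklearn.metrics

# Illustrative stand-ins for ood_labels / ood_probs (1 = out-of-domain).
ood_labels = np.array([0, 0, 0, 0, 1, 1, 1, 1])
ood_probs = np.array([0.1, 0.2, 0.3, 0.6, 0.4, 0.7, 0.8, 0.9])

# AUPRC, as in the cells above: precision-recall curve, then auc().
precision, recall, _ = sklearn.metrics.precision_recall_curve(ood_labels, ood_probs)
auprc = sklearn.metrics.auc(recall, precision)

# ROC AUC on the same scores, as in the final comparison cell.
roc_auc = sklearn.metrics.roc_auc_score(ood_labels, ood_probs)

print(f'AUPRC: {auprc:.4f}, ROC AUC: {roc_auc:.4f}')
```

Reporting both metrics is deliberate: when the out-of-domain class is a minority, the precision-recall view is often considered more informative than the ROC view.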
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d06235b84db16934ad49870332ad1f26e8547ddb
31,064
ipynb
Jupyter Notebook
t81_558_class_02_4_pandas_functional.ipynb
AritraJana1810/t81_558_deep_learning
184d84d202b54990be8c927499ce0a01a3662e6f
[ "Apache-2.0" ]
1
2021-07-03T09:02:59.000Z
2021-07-03T09:02:59.000Z
t81_558_class_02_4_pandas_functional.ipynb
joaquinmorenoa/t81_558_deep_learning
569ed623cb225a5d410fda6f49e1a15073b247ea
[ "Apache-2.0" ]
null
null
null
t81_558_class_02_4_pandas_functional.ipynb
joaquinmorenoa/t81_558_deep_learning
569ed623cb225a5d410fda6f49e1a15073b247ea
[ "Apache-2.0" ]
1
2020-09-21T15:11:35.000Z
2020-09-21T15:11:35.000Z
31,064
31,064
0.489988
[ [ [ "<a href=\"https://colab.research.google.com/github/jeffheaton/t81_558_deep_learning/blob/master/t81_558_class_02_4_pandas_functional.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "# T81-558: Applications of Deep Neural Networks\n**Module 2: Python for Machine Learning**\n* Instructor: [Jeff Heaton](https://sites.wustl.edu/jeffheaton/), McKelvey School of Engineering, [Washington University in St. Louis](https://engineering.wustl.edu/Programs/Pages/default.aspx)\n* For more information visit the [class website](https://sites.wustl.edu/jeffheaton/t81-558/).", "_____no_output_____" ], [ "# Module 2 Material\n\nMain video lecture:\n\n* Part 2.1: Introduction to Pandas [[Video]](https://www.youtube.com/watch?v=bN4UuCBdpZc&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_1_python_pandas.ipynb)\n* Part 2.2: Categorical Values [[Video]](https://www.youtube.com/watch?v=4a1odDpG0Ho&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_2_pandas_cat.ipynb)\n* Part 2.3: Grouping, Sorting, and Shuffling in Python Pandas [[Video]](https://www.youtube.com/watch?v=YS4wm5gD8DM&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_3_pandas_grouping.ipynb)\n* **Part 2.4: Using Apply and Map in Pandas for Keras** [[Video]](https://www.youtube.com/watch?v=XNCEZ4WaPBY&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_4_pandas_functional.ipynb)\n* Part 2.5: Feature Engineering in Pandas for Deep Learning in Keras [[Video]](https://www.youtube.com/watch?v=BWPTj4_Mi9E&list=PLjy4p-07OYzulelvJ5KVaT2pDlxivl_BN) [[Notebook]](t81_558_class_02_5_pandas_features.ipynb)", "_____no_output_____" ], [ "# Google CoLab Instructions\n\nThe following code ensures that Google CoLab is running the correct version of TensorFlow.", "_____no_output_____" ] ], [ [ "try:\n %tensorflow_version 2.x\n COLAB = True\n 
print(\"Note: using Google CoLab\")\nexcept:\n    print(\"Note: not using Google CoLab\")\n    COLAB = False", "Note: not using Google CoLab\n" ] ], [ [ "# Part 2.4: Apply and Map", "_____no_output_____" ], [ "If you've ever worked with Big Data or functional programming languages before, you've likely heard of map/reduce. Map and reduce are two functions that apply a task that you create to a data frame. Pandas supports functional programming techniques that allow you to use functions across an entire data frame. In addition to functions that you write, Pandas also provides several standard functions for use with data frames.", "_____no_output_____" ], [ "### Using Map with Dataframes\n\nThe map function allows you to transform a column by mapping certain values in that column to other values. Consider the Auto MPG data set that contains a field **origin** that holds a value between one and three that indicates the geographic origin of each car. We can see how to use the map function to transform this numeric origin into the textual name of each origin.\n\nWe will begin by loading the Auto MPG data set. ", "_____no_output_____" ] ], [ [ "import os\nimport pandas as pd\nimport numpy as np\n\ndf = pd.read_csv(\n    \"https://data.heatonresearch.com/data/t81-558/auto-mpg.csv\", \n    na_values=['NA', '?'])\n\npd.set_option('display.max_columns', 7)\npd.set_option('display.max_rows', 5)\n\ndisplay(df)", "_____no_output_____" ] ], [ [ "The **map** method in Pandas operates on a single column. You provide **map** with a dictionary of values to transform the target column. The map keys specify what values in the target column should be turned into values specified by those keys. 
The following code shows how the map function can transform the numeric values of 1, 2, and 3 into the string values of North America, Europe and Asia.", "_____no_output_____" ] ], [ [ "# Apply the map\ndf['origin_name'] = df['origin'].map(\n    {1: 'North America', 2: 'Europe', 3: 'Asia'})\n\n# Shuffle the data, so that we hopefully see\n# more regions.\ndf = df.reindex(np.random.permutation(df.index)) \n\n# Display\npd.set_option('display.max_columns', 7)\npd.set_option('display.max_rows', 10)\ndisplay(df)", "_____no_output_____" ] ], [ [ "### Using Apply with Dataframes\n\nThe **apply** function of the data frame can run a function over the entire data frame. You can use either a traditional named function or a lambda function. Python will execute the provided function against each of the rows or columns in the data frame. The **axis** parameter specifies whether the function is run across rows or columns. For axis = 1, rows are used. The following code calculates a series called **efficiency** that is the **displacement** divided by **horsepower**. ", "_____no_output_____" ] ], [ [ "efficiency = df.apply(lambda x: x['displacement']/x['horsepower'], axis=1)\ndisplay(efficiency[0:10])", "_____no_output_____" ] ], [ [ "You can now insert this series into the data frame, either as a new column or to replace an existing column. The following code inserts this new series into the data frame.", "_____no_output_____" ] ], [ [ "df['efficiency'] = efficiency", "_____no_output_____" ] ], [ [ "### Feature Engineering with Apply and Map", "_____no_output_____" ], [ "In this section, we will see how to calculate a complex feature using map, apply, and grouping. 
The data set is the following CSV:\n\n* https://www.irs.gov/pub/irs-soi/16zpallagi.csv \n\nThis URL contains US Government public data for \"SOI Tax Stats - Individual Income Tax Statistics.\" The entry point to the website is here:\n\n* https://www.irs.gov/statistics/soi-tax-stats-individual-income-tax-statistics-2016-zip-code-data-soi \n\nDocumentation describing this data is at the above link.\n\nFor this feature, we will attempt to estimate the adjusted gross income (AGI) for each of the zip codes. The data file contains many columns; however, you will only use the following:\n\n* STATE - The state (e.g., MO)\n* zipcode - The zipcode (e.g. 63017)\n* agi_stub - Six different brackets of annual income (1 through 6) \n* N1 - The number of tax returns for each of the agi_stubs\n\nNote, the file will have six rows for each zip code, for each of the agi_stub brackets. You can skip zip codes with 0 or 99999.\n\nWe will create an output CSV with these columns; however, only one row per zip code. Calculate a weighted average of the income brackets. For example, the following six rows are present for 63017:\n\n\n|zipcode |agi_stub | N1 |\n|--|--|-- |\n|63017 |1 | 4710 |\n|63017 |2 | 2780 |\n|63017 |3 | 2130 |\n|63017 |4 | 2010 |\n|63017 |5 | 5240 |\n|63017 |6 | 3510 |\n\n\nWe must combine these six rows into one. For privacy reasons, AGI's are broken out into 6 buckets. We need to combine the buckets and estimate the actual AGI of a zipcode. 
To do this, consider the income ranges that each **agi_stub** value represents:\n\n* 1 = 1 to 25,000\n* 2 = 25,000 to 50,000\n* 3 = 50,000 to 75,000\n* 4 = 75,000 to 100,000\n* 5 = 100,000 to 200,000\n* 6 = 200,000 or more\n\nThe median of each of these ranges is approximately:\n\n* 1 = 12,500\n* 2 = 37,500\n* 3 = 62,500 \n* 4 = 87,500\n* 5 = 112,500\n* 6 = 212,500\n\nUsing this you can estimate 63017's average AGI as:\n\n```\n>>> totalCount = 4710 + 2780 + 2130 + 2010 + 5240 + 3510\n>>> totalAGI = 4710 * 12500 + 2780 * 37500 + 2130 * 62500 \n  + 2010 * 87500 + 5240 * 112500 + 3510 * 212500\n>>> print(totalAGI / totalCount)\n\n88689.89205103042\n```\n\nWe begin by reading in the government data.", "_____no_output_____" ] ], [ [ "import pandas as pd\n\ndf=pd.read_csv('https://www.irs.gov/pub/irs-soi/16zpallagi.csv')", "_____no_output_____" ] ], [ [ "First, we trim all zip codes that are either 0 or 99999. We also select the three fields that we need.", "_____no_output_____" ] ], [ [ "df=df.loc[(df['zipcode']!=0) & (df['zipcode']!=99999),\n    ['STATE','zipcode','agi_stub','N1']]\n\npd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\n\ndisplay(df)", "_____no_output_____" ] ], [ [ "We replace all of the **agi_stub** values with the correct median values with the **map** function.", "_____no_output_____" ] ], [ [ "medians = {1:12500,2:37500,3:62500,4:87500,5:112500,6:212500}\ndf['agi_stub']=df.agi_stub.map(medians)\n\npd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\ndisplay(df)", "_____no_output_____" ] ], [ [ "Next, we group the data frame by zip code.", "_____no_output_____" ] ], [ [ "groups = df.groupby(by='zipcode')", "_____no_output_____" ] ], [ [ "The program applies a lambda across the groups, and then calculates the AGI estimate.", "_____no_output_____" ] ], [ [ "df = pd.DataFrame(groups.apply( \n    lambda x:sum(x['N1']*x['agi_stub'])/sum(x['N1']))) \\\n    .reset_index()", "_____no_output_____" ], [ "pd.set_option('display.max_columns', 
0)\npd.set_option('display.max_rows', 10)\n\ndisplay(df)", "_____no_output_____" ] ], [ [ "We can now rename the new agi_estimate column.", "_____no_output_____" ] ], [ [ "df.columns = ['zipcode','agi_estimate']", "_____no_output_____" ], [ "pd.set_option('display.max_columns', 0)\npd.set_option('display.max_rows', 10)\n\ndisplay(df)", "_____no_output_____" ] ], [ [ "Finally, we check to see that our zip code of 63017 got the correct value.", "_____no_output_____" ] ], [ [ "df[ df['zipcode']==63017 ]", "_____no_output_____" ] ] ]
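The groupby/apply estimate above can be spot-checked on the six 63017 rows alone. The small frame below is typed in from the worked example earlier in the text rather than read from the IRS file, so it is a sketch of the same calculation, not the full pipeline:

```python
import pandas as pd

# The six agi_stub rows for zip code 63017, copied from the worked example.
df = pd.DataFrame({
    'zipcode': [63017] * 6,
    'agi_stub': [1, 2, 3, 4, 5, 6],
    'N1': [4710, 2780, 2130, 2010, 5240, 3510],
})

# Same bracket-to-median mapping used in the notebook.
medians = {1: 12500, 2: 37500, 3: 62500, 4: 87500, 5: 112500, 6: 212500}
df['agi_stub'] = df['agi_stub'].map(medians)

# Weighted average: sum(N1 * median) / sum(N1), grouped by zip code.
agi = (df.groupby('zipcode')
         .apply(lambda x: sum(x['N1'] * x['agi_stub']) / sum(x['N1']))
         .reset_index(name='agi_estimate'))
print(agi)
```

The single output row should reproduce the 88689.89 figure computed by hand earlier, confirming the groupby/apply logic before running it against the full file.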
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ] ]
d062446f226ef60582dc04dd098b4f3cbd00db61
52,841
ipynb
Jupyter Notebook
matplotlibAndobjectorianted_linechart_with_errorbars.ipynb
GirijaJoshi/PyBer_Analysis
faaea2be8baae82ba8ca84314b51954b14784c8d
[ "MIT" ]
1
2020-10-20T15:15:37.000Z
2020-10-20T15:15:37.000Z
matplotlibAndobjectorianted_linechart_with_errorbars.ipynb
GirijaJoshi/PyBer_Analysis
faaea2be8baae82ba8ca84314b51954b14784c8d
[ "MIT" ]
null
null
null
matplotlibAndobjectorianted_linechart_with_errorbars.ipynb
GirijaJoshi/PyBer_Analysis
faaea2be8baae82ba8ca84314b51954b14784c8d
[ "MIT" ]
null
null
null
265.532663
16,468
0.931133
[ [ [ "%matplotlib inline", "_____no_output_____" ], [ "# Import dependencies.\nimport matplotlib.pyplot as plt\nimport statistics", "_____no_output_____" ], [ "# Set the x-axis to a list of strings for each month.\nx_axis = [\"Jan\", \"Feb\", \"Mar\", \"April\", \"May\", \"June\", \"July\", \"Aug\", \"Sept\", \"Oct\", \"Nov\", \"Dec\"]\n\n# Set the y-axis to a list of floats as the total fare in US dollars accumulated for each month.\ny_axis = [10.02, 23.24, 39.20, 35.42, 32.34, 27.04, 43.82, 10.56, 11.85, 27.90, 20.71, 20.09]", "_____no_output_____" ], [ "average = sum(y_axis)/len(y_axis)\naverage", "_____no_output_____" ], [ "# Get the standard deviation of the values in the y-axis.\nstdev = statistics.stdev(y_axis)\nstdev", "_____no_output_____" ], [ "# Add standard deviation error bars to the y-axis.\nplt.errorbar(x_axis, y_axis, yerr=stdev)", "_____no_output_____" ], [ "# Add standard deviation error bars to the y-axis, with caps.\nplt.errorbar(x_axis, y_axis, yerr=stdev, capsize=3)", "_____no_output_____" ], [ "fig, ax = plt.subplots()\nax.errorbar(x_axis, y_axis, yerr=stdev, capsize=3)\nplt.show()", "_____no_output_____" ] ] ]
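As a quick check on the numbers feeding the error bars above, the mean and sample standard deviation of the same fare list can be computed with nothing but the standard library — no plotting required. The values are the ones defined in the cells above:

```python
import statistics

# Monthly fare totals, as defined in the notebook cells above.
y_axis = [10.02, 23.24, 39.20, 35.42, 32.34, 27.04,
          43.82, 10.56, 11.85, 27.90, 20.71, 20.09]

average = sum(y_axis) / len(y_axis)   # arithmetic mean across the 12 months
stdev = statistics.stdev(y_axis)      # sample standard deviation (n - 1 in the denominator)

print(round(average, 2), round(stdev, 2))
```

Note that `statistics.stdev` is the *sample* standard deviation; `statistics.pstdev` would give the population version, which is slightly smaller.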
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d06244e92170a3cee84bba7d981221ffc4f00179
350,583
ipynb
Jupyter Notebook
Charting a path into the data science field.ipynb
khiara/DSND_Kaggle_2020_Survey
57ba312125edbe8278b6f292b4f1bda1fb4018f0
[ "CNRI-Python" ]
null
null
null
Charting a path into the data science field.ipynb
khiara/DSND_Kaggle_2020_Survey
57ba312125edbe8278b6f292b4f1bda1fb4018f0
[ "CNRI-Python" ]
null
null
null
Charting a path into the data science field.ipynb
khiara/DSND_Kaggle_2020_Survey
57ba312125edbe8278b6f292b4f1bda1fb4018f0
[ "CNRI-Python" ]
null
null
null
211.576946
41,700
0.882747
[ [ [ "# Charting a path into the data science field", "_____no_output_____" ], [ "This project attempts to shed light on the path or paths to becoming a data science professional in the United States.\n\nData science is a rapidly growing field, and the demand for data scientists is outpacing supply. In the past, most Data Scientist positions went to people with PhDs in Computer Science. I wanted to know if that is changing in light of both the increased job openings and the expanding definition of data science that has come with more companies realizing the wealth of raw data they have available for analysis, and how that can help to grow and refine their businesses.", "_____no_output_____" ], [ "## Business Questions\n\n\n1. Do you need a a formal degree?\n2. What programming language(s) do data science professionals need to know?\n3. What are the preferred online learning platforms to gain data science knowledge and skills?", "_____no_output_____" ], [ "## Data\n\nSince 2017, Kaggle ('The world's largest data science community') has annually surveyed its users on demographics, practices, and preferences. This notebook explores the data from Kaggle's 2020 Machine Learning and Data Science survey. A caveat: Kaggle is heavy on Machine Learning and competitions, and while it claims over 8 million users the group may not be representative of the overall data science community. 
Additionally, survey respondents are self-selected, so we can't extrapolate any findings to the data science community as a whole, but the trends and demographics amongst Kaggle survey takers may still offer insights about data science professionals.", "_____no_output_____" ], [ "The first step is importing the necessary libraries and data.", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport textwrap\n%matplotlib inline\n\nfrom matplotlib.ticker import PercentFormatter\n\nimport warnings\nwarnings.filterwarnings('ignore')", "_____no_output_____" ], [ "df = pd.read_csv('./kaggle_survey_2020_responses.csv', low_memory=False)", "_____no_output_____" ] ], [ [ "### Initial data exploration and cleaning\nLet's take a look at the survey data.", "_____no_output_____" ] ], [ [ "# Let's look at the first 5 rows of the dataset\ndf.head()", "_____no_output_____" ] ], [ [ "One thing we can see from this: some questions are tied to a single column, with a number of answers possible; these questions only allowed survey respondents to choose one answer from among the options. Other questions take up multiple columns, with each column tied to a specific answer; these were questions that allowed users to choose more than one option as the answer ('select all that apply'). The two types of questions will require different approaches to data preparation.", "_____no_output_____" ], [ "But first, we'll do some cleaning. The top row of data contains the question titles. 
We'll remove that, as well as the first column of survey completion time values.", "_____no_output_____" ] ], [ [ "# Removing the first column and the first row\ndf.drop(['Time from Start to Finish (seconds)'], axis=1, inplace=True)\ndf = df.loc[1:, :]\ndf.head()", "_____no_output_____" ], [ "df.shape", "_____no_output_____" ] ], [ [ "There are over 20,000 responses, with 354 answer fields.", "_____no_output_____" ], [ "#### Data preparation and filtering", "_____no_output_____" ], [ "To improve readability of visualizations, we'll aggregate some fields, shorten some labels, and re-order categories.", "_____no_output_____" ] ], [ [ "# Aggregating the nonbinary answers\ndf.loc[(df.Q2 == 'Prefer not to say'), 'Q2'] = 'Other Response'\ndf.loc[(df.Q2 == 'Prefer to self-describe'),'Q2'] = 'Other Response'\ndf.loc[(df.Q2 == 'Nonbinary'), 'Q2'] = 'Other Response'\n\n# Abbreviating country name\ndf.loc[(df.Q3 == 'United States of America'),'Q3']='USA'\n\n# Shortening education level descriptions\ndf.loc[(df.Q4 == 'Doctoral degree'),'Q4']='PhD'\ndf.loc[(df.Q4 == 'Masterโ€™s degree'),'Q4']='Masterโ€™s'\ndf.loc[(df.Q4 == 'Bachelorโ€™s degree'),'Q4']='Bachelorโ€™s'\ndf.loc[(df.Q4 == \"Some college/university study without earning a bachelorโ€™s degree\"), 'Q4']='Some college/university'\ndf.loc[(df.Q4 == 'No formal education past high school'), 'Q4']='High school'\ndf.loc[(df.Q4 == 'I prefer not to answer'), 'Q4']='Prefer not to answer'\n\n# Ordering education levels by reverse typical chronological completion\nq4_order = [\n 'PhD',\n 'Masterโ€™s', \n 'Professional degree', \n 'Bachelorโ€™s', \n 'Some college/university', \n 'High school', \n 'Prefer not to answer']\n\n# Putting coding experience answers in order from shortest time to longest\nq6_order = [\n 'I have never written code', \n '< 1 years', \n '1-2 years', \n '3-5 years', \n '5-10 years', \n '10-20 years', \n '20+ years']\n\ndf.loc[(df.Q37_Part_9 == 'Cloud-certification programs (direct from AWS, Azure, GCP, or 
similar)'), 'Q37_Part_9']='Cloud-certification programs'\ndf.loc[(df.Q37_Part_10 == 'University Courses (resulting in a university degree)'), 'Q37_Part_10']='University Courses resulting in a degree'", "_____no_output_____" ] ], [ [ "We're going to focus on the US answers from currently employed Kagglers.", "_____no_output_____" ] ], [ [ "# Filtering for just US responses\nus_df = df[df['Q3'] == 'USA']\n\n# Filtering to only include currently employed Kagglers\nq5_order = [\n 'Data Scientist',\n 'Software Engineer',\n 'Data Analyst', \n 'Research Scientist',\n 'Product/Project Manager',\n 'Business Analyst',\n 'Machine Learning Engineer',\n 'Data Engineer',\n 'Statistician',\n 'DBA/Database Engineer',\n 'Other']\n\nus_df = us_df[us_df['Q5'].isin(q5_order)]", "_____no_output_____" ] ], [ [ "We're interested in the demographic questions at the beginning, plus coding experience, coding languages used, and online learning platforms used. ", "_____no_output_____" ] ], [ [ "# Filtering to only include specific question columns\nus_df = us_df.loc[:, ['Q1', 'Q2', 'Q3', 'Q4', 'Q5', 'Q6', 'Q7_Part_1', 'Q7_Part_2','Q7_Part_3','Q7_Part_4','Q7_Part_5',\n 'Q7_Part_6', 'Q7_Part_7','Q7_Part_8','Q7_Part_9','Q7_Part_10','Q7_Part_11', 'Q7_Part_12', 'Q7_OTHER',\n 'Q37_Part_1', 'Q37_Part_2', 'Q37_Part_3', 'Q37_Part_4', 'Q37_Part_5', 'Q37_Part_6', 'Q37_Part_7', \n 'Q37_Part_8', 'Q37_Part_9', 'Q37_Part_10','Q37_Part_11', 'Q37_OTHER']]", "_____no_output_____" ], [ "us_df.isna().sum()", "_____no_output_____" ] ], [ [ "Not much in the way of missing values in the first 6 questions; that changes for the multiple-column questions, as expected, since users only filled in the column when they were choosing that particular option. 
We'll address that by converting the missing values to zeros in the helper functions.", "_____no_output_____" ] ], [ [ "us_df.shape", "_____no_output_____" ] ], [ [ "This will be the data for our analysis -- covering 1680 currently employed Kagglers in the US.", "_____no_output_____" ], [ "## Helper functions", "_____no_output_____" ], [ "A few functions to help with data visualizations. The first two plot a barchart with a corresponding list of the counts and percentages for the values; one handles single-column questions and the other handles multiple-column questions. The third and fourth are heatmap functions -- one for single-column questions, and one for multiple-column questions.", "_____no_output_____" ] ], [ [ "def list_and_bar(qnum, q_order, title):\n \n '''\n INPUT:\n qnum - the y-axis variable, a single-column question\n q_order - the order to display responses on the barchart\n title - the title of the barchart\n \n OUTPUT:\n 1. A list of responses to the selected question, in descending order\n 2. 
A horizontal barchart showing the values, in sorted order \n '''\n\n # creating a dataframe of values to include both raw counts and percentages\n val_list = pd.DataFrame()\n val_list['Count'] = us_df[qnum].value_counts()\n pct = round(val_list * 100/us_df[qnum].count(),2)\n val_list['Pct'] = pct\n \n print(val_list)\n \n fig, ax = plt.subplots(1, 1, figsize=(12,6))\n ax = us_df[qnum].value_counts()[q_order].plot(kind='barh')\n \n # reversing the order of y axis -- \n # the horizontal barchart displays values in the reverse order of a regular barchart (i.e., where the barchart might show \n # a - b - c left to right, the corresponding horizontal barchart would show c at the top, and a at the bottom)\n ax.invert_yaxis()\n \n plt.title(title, fontsize = 14, fontweight = 'bold')\n plt.show()\n \n \n\ndef list_and_bar_mc(mc_df, title):\n \n '''\n INPUT:\n mc_df - a dataframe consisting of answers to a specific multiple-column question\n title - the title of the barchart\n \n OUTPUT:\n 1. A list of responses to the selected question, in descending order\n 2. 
A horizontal barchart showing the values, also in descending order\n '''\n print(mc_df)\n \n fig, ax = plt.subplots(1, 1, figsize=(12,6))\n mc_df['Count'].sort_values().plot(kind='barh')\n plt.title(title, fontsize = 14, fontweight = 'bold')\n plt.show()\n \n \n\ndef heatmap(qnum_a, qnum_b, title, order_rows, columns):\n \n '''\n INPUT:\n qnum_a - the x-axis variable, a single-column question\n qnum_b - the y-axis variable, a single-column question\n title - the title of the heatmap, describing the variables in the visualization\n order_rows - sorted order for the y-axis\n columns - sorted order for the x-axis\n \n OUTPUT:\n A heatmap showing the correlation between the two chosen variables\n '''\n vals = us_df[[qnum_a, qnum_b]].groupby(qnum_b)[qnum_a].value_counts().unstack()\n \n # getting the total number of responses for the columns in order to calculate the % of the total\n vals_rowsums = pd.DataFrame([vals.sum(axis=0).tolist()], columns=vals.columns, index=['All'])\n vals = pd.concat([vals_rowsums, vals], axis=0)\n\n # convert to % \n vals = ((vals.T / (vals.sum(axis=1) + 0.001)).T) * 100 \n\n order = order_rows\n columns = columns\n \n vals = vals.reindex(order).reindex(columns = columns)\n \n fig, ax = plt.subplots(1, 1, figsize=[12,6])\n ax = sns.heatmap(ax = ax, data = vals, cmap = 'GnBu', cbar_kws = {'format': '%.0f%%'})\n plt.title(title, fontsize = 14, fontweight = 'bold')\n ax.set_xlabel('')\n ax.set_ylabel('')\n plt.show()\n \n \n\ndef heatmap_mc(qnum, qnum_mc, title, columns, order_rows):\n \n '''\n INPUT:\n qnum - the y-axis variable, a single-column question\n qnum_mc - the x-axis variable, a question with multiple columns of answers\n title - the title of the heatmap, describing the variables in the visualization\n order_rows - sorted order for the y-axis\n columns - a list of column names, representing the multiple-column answer options, ordered\n \n OUTPUT:\n 1. A heatmap showing the correlation between the two specified variables\n 2. 
avg_num - the average number of answer options chosen for the multiple column question\n '''\n # creating a dataframe with the single-column question\n df_qnum = us_df[qnum]\n df_qnum = pd.DataFrame(df_qnum)\n \n # creating a dataframe containing all the columns for a given multiple-column question\n cols_mc = [col for col in us_df if col.startswith(qnum_mc)]\n df_mc = us_df[cols_mc]\n df_mc.columns = columns\n \n # converting column values to binary 0 or 1 values (1 if the user chose that answer, 0 if not)\n df_mc = df_mc.notnull().astype(int)\n \n # joining the dataframes together\n df_join = df_qnum.join(df_mc)\n \n # aggregating counts for each answer option and re-ordering dataframe\n df_agg = df_join.groupby([qnum]).agg('sum')\n df_agg = df_agg.reindex(order_rows)\n \n df_agg['users'] = df_join.groupby(qnum)[qnum].count()\n df_agg = df_agg.div(df_agg.loc[:, 'users'], axis=0)\n df_agg.drop(columns='users', inplace=True)\n \n \n fig, ax = plt.subplots(1, 1, figsize=(12, 6))\n ax = sns.heatmap(ax = ax, data = df_agg, cmap = 'GnBu')\n cbar = ax.collections[0].colorbar\n cbar.ax.yaxis.set_major_formatter(PercentFormatter(1, 0))\n plt.title(title, fontsize = 14, fontweight = 'bold')\n ax.set_xlabel('')\n ax.set_ylabel('')\n plt.show() \n \n # finding the average number of answers chosen for the multiple column options, minus tabulations for 'None'\n df_temp = df_join\n df_temp.drop('None', axis = 1, inplace = True)\n rowsums = df_temp.sum(axis = 1)\n avg_num = round(rowsums.mean(), 2)\n \n print('Average number of options chosen by survey respondents: ' + str(avg_num) + '.')\n", "_____no_output_____" ] ], [ [ "## Analysis and visualizations", "_____no_output_____" ], [ "We'll start by looking at the age and gender distribution, just to get an overview of the response community.", "_____no_output_____" ] ], [ [ "plt.figure(figsize=[12,6])\nus_ages = us_df['Q1'].value_counts().sort_index()\nsns.countplot(data = us_df, x = 'Q1', hue = 'Q2', order = 
us_ages.index)\nplt.title('Age and Gender Distribution')", "_____no_output_____" ] ], [ [ "The survey response pool skews heavily male, with most US Kagglers between the ages of 25 and 45. ", "_____no_output_____" ] ], [ [ "list_and_bar('Q6', q6_order, 'Years of Coding Experience')", " Count Pct\n3-5 years 367 22.00\n20+ years 349 20.92\n5-10 years 334 20.02\n10-20 years 288 17.27\n1-2 years 171 10.25\n< 1 years 104 6.24\nI have never written code 55 3.30\n" ] ], [ [ "Around 80 percent of those responding have 3 or more years experience coding.", "_____no_output_____" ], [ "### 1. Do you need a formal degree to become a data science professional?", "_____no_output_____" ], [ "Let's look at formal education, and how it correlates with job title.", "_____no_output_____" ] ], [ [ "list_and_bar('Q4', q4_order, 'Highest Level of Education Attained')", " Count Pct\nMasterโ€™s 819 48.75\nBachelorโ€™s 409 24.35\nPhD 334 19.88\nSome college/university 71 4.23\nProfessional degree 34 2.02\nPrefer not to answer 8 0.48\nHigh school 5 0.30\n" ], [ "list_and_bar('Q5', q5_order, 'Current Job Title')", " Count Pct\nData Scientist 389 23.15\nOther 292 17.38\nSoftware Engineer 219 13.04\nData Analyst 192 11.43\nResearch Scientist 140 8.33\nProduct/Project Manager 117 6.96\nBusiness Analyst 107 6.37\nMachine Learning Engineer 97 5.77\nData Engineer 71 4.23\nStatistician 38 2.26\nDBA/Database Engineer 18 1.07\n" ], [ "heatmap('Q4', 'Q5', 'Roles by Education Level', q5_order, q4_order)", "_____no_output_____" ] ], [ [ "### Question 1 analysis", "_____no_output_____" ], [ "With almost 49% of the responses, a Master's degree was by far the most common level of education listed, more than double the next most popular answer. Other notable observations:\n * Sixty-eight percent of US Kagglers hold a Master's Degree or higher. 
\n * Research scientists and statisticians are most likely to hold PhDs, followed by Data Scientists.\n * Relatively few survey respondents (around 5%) indicate they do not have at least a Bachelor's degree.\n * Only 23% of those responding hold the title of Data Scientist, but it is nonetheless the title with the highest count. \n Arguably anyone who is active on Kaggle and who would complete their survey considers themself to be either in, or \n interested in, the data science field, if not actively working as a Data Scientist. ", "_____no_output_____" ], [ "### Question 2. What programming language(s) do Data Scientists need to know?", "_____no_output_____" ], [ "Now we'll turn to programming languages used. As this is a \"Select all that apply\" question, with each language option appearing as a separate column, we need to do some processing to get the data into a format for easier graphing and analysis.", "_____no_output_____" ] ], [ [ "# creating a dataframe of the language options and the number of times each language was selected\nlanguages = pd.DataFrame()\n\nfor col in us_df.columns:\n if(col.startswith('Q7_')):\n language = us_df[col].value_counts()\n languages = languages.append({'Language':language.index[0], 'Count':language[0]}, ignore_index=True)\nlanguages = languages.set_index('Language')\nlanguages = languages.sort_values(by = 'Count', ascending = False)\nlanguages_tot = sum(languages.Count)\nlanguages['Pct'] = round((languages['Count'] * 100 / languages_tot), 2)", "_____no_output_____" ], [ "list_and_bar_mc(languages, 'Programming Languages Used')", " Count Pct\nLanguage \nPython 1290.0 29.72\nSQL 899.0 20.71\nR 549.0 12.65\nBash 304.0 7.00\nOther 281.0 6.47\nJavascript 265.0 6.11\nJava 214.0 4.93\nC++ 177.0 4.08\nMATLAB 138.0 3.18\nC 125.0 2.88\nJulia 37.0 0.85\nNone 37.0 0.85\nSwift 24.0 0.55\n" ], [ "heatmap_mc('Q5', 'Q7', 'Language Use by Role', languages.index, q5_order)", "_____no_output_____" ], [ "heatmap_mc('Q4', 'Q7','Language Use by 
Education Level', languages.index, q4_order)", "_____no_output_____" ], [ "heatmap_mc('Q6', 'Q7', 'Language Use by Years Coding', languages.index, q6_order)", "_____no_output_____" ] ], [ [ "### Question 2 analysis", "_____no_output_____" ], [ "Python was the most widely used language, followed by SQL and R. Python held the top spot across almost all job roles -- only Statisticians listed another language (SQL) higher -- and for all education levels and coding experience. R enjoys widespread popularity across education level and years coding as well; SQL shows a high number of users overall, but they are more concentrated in people holding Master's or PhD degrees, working as Statisticians, Data Scientists and Data Analysts.", "_____no_output_____" ], [ "Kagglers reported using 2-3 languages on a regular basis.", "_____no_output_____" ], [ "### 3. What are the preferred online learning platforms to gain data science knowledge and skills?", "_____no_output_____" ], [ "Regarding online learning, Kaggle's survey asked, \"On which platforms have you begun or completed data science courses? (Select all that apply).\" We'll handle the answers similarly to the language data. 
", "_____no_output_____" ] ], [ [ "# creating a dataframe of online course providers and the number of times each was selected by users\nplatforms = pd.DataFrame()\n\nfor col in us_df.columns:\n if(col.startswith('Q37_')):\n platform = us_df[col].value_counts()\n platforms = platforms.append({'Platform':platform.index[0], 'Count':platform[0]}, ignore_index=True)\nplatforms = platforms.set_index('Platform')\nplatforms = platforms.sort_values(by = 'Count', ascending=False)\nplatforms_tot = sum(platforms.Count)\nplatforms['Pct'] = round((platforms['Count'] * 100 / platforms_tot), 2)", "_____no_output_____" ], [ "list_and_bar_mc(platforms, 'Learning Platforms Used')", " Count Pct\nPlatform \nCoursera 774.0 20.78\nKaggle Learn Courses 433.0 11.63\nUniversity Courses resulting in a degree 414.0 11.12\nUdemy 393.0 10.55\nDataCamp 367.0 9.85\nedX 328.0 8.81\nUdacity 254.0 6.82\nLinkedIn Learning 209.0 5.61\nNone 154.0 4.14\nFast.ai 144.0 3.87\nOther 139.0 3.73\nCloud-certification programs 115.0 3.09\n" ], [ "heatmap_mc('Q5', 'Q37', 'Learning Platform Use by Role', platforms.index, q5_order)", "_____no_output_____" ], [ "heatmap_mc('Q4', 'Q37', 'Learning Platform Use by Education Level', platforms.index, q4_order)", "_____no_output_____" ] ], [ [ "### Question 3 analysis", "_____no_output_____" ], [ "Coursera was the most popular response, by a good margin. Kaggle Learn, University Courses towards a degree and Udemy followed, with Datacamp and edX not far behind. Kaggle Learn is a relatively new entrant into this area, offering short, narrowly-focused, skill-based courses for free which offer certificates upon completion. These factors may all contribute to the platform's popularity, as it is easy to try out for the cost of a few hours and no money.", "_____no_output_____" ], [ "Kagglers reported trying data science courses on two platforms, on average.", "_____no_output_____" ], [ "Coursera's popularity was high across almost education levels and job titles. 
Kaggle Learn's usage was fairly uniform across categories. Fast.ai was popular with Research Scientists, Data Scientists, Machine Learning Engineers, and Statisticians. Other platforms seem to enjoy popularity with some groups more than others, but not in ways that make it easy to extrapolate much.", "_____no_output_____", "## Conclusion", "_____no_output_____", "The most well-travelled path into the data science field, at least for those responding to the 2020 Kaggle survey:\n * Get at least a Bachelor's degree, though a Master's degree may be preferable\n * Learn at least 2 coding languages -- Python and R are the top data science languages; depending on the role you want,\n you might want to get comfortable with another language, such as SQL or C.\n * Take classes on online learning platforms to update your skills and learn new ones. Coursera is the standard, while\n Kaggle Learn is a good option for short, targeted learning.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ] ]
d06247bb441bf750a87dbb311ea26d7a156ab0c3
9,961
ipynb
Jupyter Notebook
7.16.ipynb
zhayanqi/mysql
9f03e75ca641cff27fb203ddd0d397d357019057
[ "Apache-2.0" ]
null
null
null
7.16.ipynb
zhayanqi/mysql
9f03e75ca641cff27fb203ddd0d397d357019057
[ "Apache-2.0" ]
null
null
null
7.16.ipynb
zhayanqi/mysql
9f03e75ca641cff27fb203ddd0d397d357019057
[ "Apache-2.0" ]
null
null
null
19.119002
318
0.466419
[ [ [ "# ๅŸบๆœฌ็จ‹ๅบ่ฎพ่ฎก\n- ไธ€ๅˆ‡ไปฃ็ ่พ“ๅ…ฅ๏ผŒ่ฏทไฝฟ็”จ่‹ฑๆ–‡่พ“ๅ…ฅๆณ•", "_____no_output_____" ] ], [ [ "print('hello word')", "_____no_output_____" ], [ "print 'hello'", "_____no_output_____" ] ], [ [ "## ็ผ–ๅ†™ไธ€ไธช็ฎ€ๅ•็š„็จ‹ๅบ\n- ๅœ†ๅ…ฌๅผ้ข็งฏ๏ผš area = radius \\* radius \\* 3.1415", "_____no_output_____" ] ], [ [ "radius = 1.0\narea = radius * radius * 3.14 # ๅฐ†ๅŽๅŠ้ƒจๅˆ†็š„็ป“ๆžœ่ต‹ๅ€ผ็ป™ๅ˜้‡area\n# ๅ˜้‡ไธ€ๅฎš่ฆๆœ‰ๅˆๅง‹ๅ€ผ๏ผ๏ผ๏ผ\n# radius: ๅ˜้‡.area: ๅ˜้‡๏ผ\n# int ็ฑปๅž‹\nprint(area)", "3.14\n" ] ], [ [ "### ๅœจPython้‡Œ้ขไธ้œ€่ฆๅฎšไน‰ๆ•ฐๆฎ็š„็ฑปๅž‹", "_____no_output_____" ], [ "## ๆŽงๅˆถๅฐ็š„่ฏปๅ–ไธŽ่พ“ๅ…ฅ\n- input ่พ“ๅ…ฅ่ฟ›ๅŽป็š„ๆ˜ฏๅญ—็ฌฆไธฒ\n- eval", "_____no_output_____" ] ], [ [ "radius = input('่ฏท่พ“ๅ…ฅๅŠๅพ„') # inputๅพ—ๅˆฐ็š„็ป“ๆžœๆ˜ฏๅญ—็ฌฆไธฒ็ฑปๅž‹\nradius = float(radius)\narea = radius * radius * 3.14\nprint('้ข็งฏไธบ:',area)", "่ฏท่พ“ๅ…ฅๅŠๅพ„10\n้ข็งฏไธบ: 314.0\n" ] ], [ [ "- ๅœจjupyter็”จshift + tab ้”ฎๅฏไปฅ่ทณๅ‡บ่งฃ้‡Šๆ–‡ๆกฃ", "_____no_output_____" ], [ "## ๅ˜้‡ๅ‘ฝๅ็š„่ง„่Œƒ\n- ็”ฑๅญ—ๆฏใ€ๆ•ฐๅญ—ใ€ไธ‹ๅˆ’็บฟๆž„ๆˆ\n- ไธ่ƒฝไปฅๆ•ฐๅญ—ๅผ€ๅคด \\*\n- ๆ ‡่ฏ†็ฌฆไธ่ƒฝๆ˜ฏๅ…ณ้”ฎ่ฏ(ๅฎž้™…ไธŠๆ˜ฏๅฏไปฅๅผบๅˆถๆ”นๅ˜็š„๏ผŒไฝ†ๆ˜ฏๅฏนไบŽไปฃ็ ่ง„่Œƒ่€Œ่จ€ๆ˜ฏๆžๅ…ถไธ้€‚ๅˆ)\n- ๅฏไปฅๆ˜ฏไปปๆ„้•ฟๅบฆ\n- ้ฉผๅณฐๅผๅ‘ฝๅ", "_____no_output_____" ], [ "## ๅ˜้‡ใ€่ต‹ๅ€ผ่ฏญๅฅๅ’Œ่ต‹ๅ€ผ่กจ่พพๅผ\n- ๅ˜้‡: ้€šไฟ—็†่งฃไธบๅฏไปฅๅ˜ๅŒ–็š„้‡\n- x = 2 \\* x + 1 ๅœจๆ•ฐๅญฆไธญๆ˜ฏไธ€ไธชๆ–น็จ‹๏ผŒ่€Œๅœจ่ฏญ่จ€ไธญๅฎƒๆ˜ฏไธ€ไธช่กจ่พพๅผ\n- test = test + 1 \\* ๅ˜้‡ๅœจ่ต‹ๅ€ผไน‹ๅ‰ๅฟ…้กปๆœ‰ๅ€ผ", "_____no_output_____" ], [ "## ๅŒๆ—ถ่ต‹ๅ€ผ\nvar1, var2,var3... 
= exp1,exp2,exp3...", "_____no_output_____", "## Defining constants\n- Constant: an identifier for a fixed value, suited to scenarios where the value is used many times, e.g. PI\n- Note: in other, lower-level languages a defined constant cannot be changed, but in Python everything is an object, so even a constant can be changed", "_____no_output_____", "## Numeric data types and operators\n- Python has two numeric types (int and float) that support addition, subtraction, multiplication, division, modulo, and exponentiation\n<img src = \"../Photo/01.jpg\"></img>", "_____no_output_____", "## Operators /, //, **", "_____no_output_____", "## Operator %", "_____no_output_____", "## EP:\n- What is 25/4, and how would you rewrite it to get an integer result?\n- Read in a number and determine whether it is odd or even\n- Advanced: read in a number of seconds and write a program that converts it into minutes and seconds, e.g. 500 seconds equals 8 minutes 20 seconds\n- Advanced: if today is Saturday, what day of the week will it be 10 days from now? Hint: day 0 of every week is Sunday", "_____no_output_____" ] ], [ [ "day = eval(input('week'))\nplus_day = eval(input('plus'))\nfuture_day = (day + plus_day) % 7 # day 0 is Sunday\n", "_____no_output_____" ] ], [ [ "## Evaluating expressions and operator precedence\n<img src = \"../Photo/02.png\"></img>\n<img src = \"../Photo/03.png\"></img>", "_____no_output_____", "## Augmented assignment operators\n<img src = \"../Photo/04.png\"></img>", "_____no_output_____", "## Type conversion\n- float -> int\n- rounding with round", "_____no_output_____", "## EP:\n- If the annual business tax rate is 0.06%, how much tax is due on an annual income of 197.55e+2? (keep 2 decimal places in the result)\n- Scientific notation must be used", "_____no_output_____", "# Project\n- Write a loan calculator program in Python: the input is the monthly payment (monthlyPayment) and the output is the total repayment (totalpayment)\n![](../Photo/05.png)", "_____no_output_____", "# Homework\n- 1\n<img src=\"../Photo/06.png\"></img>", "_____no_output_____" ] ], [ [ "celsius = input('่ฏท่พ“ๅ…ฅๆธฉๅบฆ')\ncelsius = float(celsius)\nfahrenheit = (9/5) * celsius + 
32\nprint(celsius,'Celsius is',fahrenheit,'Fahrenheit')", "่ฏท่พ“ๅ…ฅๆธฉๅบฆ43\n43.0 Celsius is 109.4 Fahrenheit\n" ] ], [ [ "- 2\n<img src=\"../Photo/07.png\"></img>", "_____no_output_____" ] ], [ [ "radius = input('่ฏท่พ“ๅ…ฅๅŠๅพ„')\nlength = input('่ฏท่พ“ๅ…ฅ้ซ˜')\nradius = float(radius)\nlength = float(length)\narea = radius * radius * 3.14\nvolume = area * length\nprint('The area is',area)\nprint('The volume is',volume)", "่ฏท่พ“ๅ…ฅๅŠๅพ„5.5\n่ฏท่พ“ๅ…ฅ้ซ˜12\nThe area is 94.985\nThe volume is 1139.82\n" ] ], [ [ "- 3\n<img src=\"../Photo/08.png\"></img>", "_____no_output_____" ] ], [ [ "feet = input('่ฏท่พ“ๅ…ฅ่‹ฑๅฐบ')\nfeet = float(feet)\nmeter = feet * 0.305\nprint(feet,'feet is',meter,'meters')", "่ฏท่พ“ๅ…ฅ่‹ฑๅฐบ16.5\n16.5 feet is 5.0325 meters\n" ] ], [ [ "- 4\n<img src=\"../Photo/10.png\"></img>", "_____no_output_____" ] ], [ [ "M = input('่ฏท่พ“ๅ…ฅๆฐด้‡')\ninitial = input('่ฏท่พ“ๅ…ฅๅˆๅง‹ๆธฉๅบฆ')\nfinal = input('่ฏท่พ“ๅ…ฅๆœ€็ปˆๆธฉๅบฆ')\nM = float(M)\ninitial = float(initial)\nfinal = float(final)\nQ = M * (final - initial) * 4184\nprint('The energy needed is ',Q)", "่ฏท่พ“ๅ…ฅๆฐด้‡55.5\n่ฏท่พ“ๅ…ฅๅˆๅง‹ๆธฉๅบฆ3.5\n่ฏท่พ“ๅ…ฅๆœ€็ปˆๆธฉๅบฆ10.5\nThe energy needed is 1625484.0\n" ] ], [ [ "- 5\n<img src=\"../Photo/11.png\"></img>", "_____no_output_____" ] ], [ [ "cha = input('่ฏท่พ“ๅ…ฅๅทฎ้ข')\nrate = input('่ฏท่พ“ๅ…ฅๅนดๅˆฉ็އ')\ncha = float(cha)\nrate = float(rate)\ninterest = cha * (rate/1200)\nprint(interest)", "่ฏท่พ“ๅ…ฅๅทฎ้ข1000\n่ฏท่พ“ๅ…ฅๅนดๅˆฉ็އ3.5\n2.916666666666667\n" ] ], [ [ "- 6\n<img src=\"../Photo/12.png\"></img>", "_____no_output_____" ] ], [ [ "start = input('่ฏท่พ“ๅ…ฅๅˆๅง‹้€Ÿๅบฆ')\nend = input('่ฏท่พ“ๅ…ฅๆœซ้€Ÿๅบฆ')\ntime = input('่ฏท่พ“ๅ…ฅๆ—ถ้—ด')\nstart = float(start)\nend = float(end)\ntime = float(time)\na = (end - start)/time\nprint(a)", "่ฏท่พ“ๅ…ฅๅˆๅง‹้€Ÿๅบฆ5.5\n่ฏท่พ“ๅ…ฅๆœซ้€Ÿๅบฆ50.9\n่ฏท่พ“ๅ…ฅๆ—ถ้—ด4.5\n10.088888888888889\n" ] ], [ [ "- 7 Advanced\n<img src=\"../Photo/13.png\"></img>", "_____no_output_____", "- 8 Advanced\n<img 
src=\"../Photo/14.png\"></img>", "_____no_output_____" ] ], [ [ "a,b = eval(input('>>'))\nprint(a,b)\nprint(type(a),type(b))", ">>1,1.0\n1 1.0\n<class 'int'> <class 'float'>\n" ], [ "a = eval(input('>>'))\nprint(a)", ">>1,2,3,4,5,6\n(1, 2, 3, 4, 5, 6)\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ] ]
d062488abb878f5992408c178f41f69940586dd0
55,258
ipynb
Jupyter Notebook
notebooks/sum_backbone_stack_hb_0.ipynb
yizaochen/enmspring
84c9aabeb7f87eda43967d86c763b7d600986215
[ "MIT" ]
null
null
null
notebooks/sum_backbone_stack_hb_0.ipynb
yizaochen/enmspring
84c9aabeb7f87eda43967d86c763b7d600986215
[ "MIT" ]
null
null
null
notebooks/sum_backbone_stack_hb_0.ipynb
yizaochen/enmspring
84c9aabeb7f87eda43967d86c763b7d600986215
[ "MIT" ]
null
null
null
40.931852
3,264
0.434127
[ [ [ "from os import path\nfrom enmspring.sum_bb_st_hb_k import ThreeBar\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom matplotlib import rcParams\nbig_traj_folder = '/home/ytcdata/bigtraj_fluctmatch/500ns'\ndrawzone_folder = '/home/yizaochen/Desktop/drawzone_temp'\ndata_folder = '/home/yizaochen/Documents/dna_2021_drawzone/summation_bb_st_hb'\nrcParams['font.family'] = 'Arial'", "_____no_output_____" ] ], [ [ "### Part 1: Initailize Plot Agent", "_____no_output_____" ] ], [ [ "plot_agent = ThreeBar(big_traj_folder, data_folder)", "_____no_output_____" ] ], [ [ "### Part 2: Make/Read DataFrame", "_____no_output_____" ] ], [ [ "makedf = False\nif makedf:\n plot_agent.ini_b_agent()\n plot_agent.ini_s_agent()\n plot_agent.ini_h_agent()\n plot_agent.make_df_for_all_host()", "_____no_output_____" ], [ "plot_agent.read_df_for_all_host()", "_____no_output_____" ] ], [ [ "### Part 2: Bar Plot", "_____no_output_____" ] ], [ [ "figsize = (1.817, 1.487)\nhspace = 0\n\nplot_agent.plot_main(figsize, hspace)\nsvg_out = path.join(drawzone_folder, 'sum_bb_st_hb.svg')\nplt.savefig(svg_out, dpi=200)\nplt.show()", "_____no_output_____" ], [ "from enmspring.graphs_bigtraj import BackboneMeanModeAgent", "_____no_output_____" ], [ "host = 'a_tract_21mer'\ninterval_time = 500\nb_agent = BackboneMeanModeAgent(host, big_traj_folder, interval_time)", "/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/mean_mode_npy exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/0_500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/250_750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/500_1000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/750_1250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1000_1500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1250_1750/pd_dfs 
exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1500_2000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/1750_2250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2000_2500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2250_2750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2500_3000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/2750_3250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3000_3500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3250_3750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3500_4000/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/3750_4250/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/4000_4500/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/4250_4750/pd_dfs exists\n/home/ytcdata/bigtraj_fluctmatch/500ns/a_tract_21mer/bdna+bdna/4500_5000/pd_dfs exists\n" ], [ "b_agent.preprocess_all_small_agents()", "Thare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... 
Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... 
Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... 
Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\nThare are 855 nodes.\nInitialize adjacency, degree and Laplacian matrices... Done.\nFinish the setup for Laplaican matrix.\nTotal number of nodes: 855\nThere are 438 eigenvectors belonging to STRAND1.\nThere are 417 eigenvectors belonging to STRAND2.\nSum of two strands: 855\n" ], [ "b_agent.d_smallagents[(0,500)].laplacian_mat", "_____no_output_____" ], [ "b_agent.initialize_all_maps()", "_____no_output_____" ], [ "b_agent.n_window", "_____no_output_____" ], [ "from enmspring.hb_k import HBResidPlotV1\nbigtraj_folder = '/home/ytcdata/bigtraj_fluctmatch'\ndf_folder = '/home/yizaochen/Documents/dna_2021_drawzone/local_hb'", "_____no_output_____" ], [ "interval_time = 500\nplot_agent = HBResidPlotV1(bigtraj_folder, interval_time, df_folder)", "_____no_output_____" ], [ "plot_agent.read_mean_std_df()", "Read df_mean from /home/yizaochen/Documents/dna_2021_drawzone/local_hb/hb.mean.csv\nRead df_std from /home/yizaochen/Documents/dna_2021_drawzone/local_hb/hb.std.csv\n" ], [ "plot_agent.df_mean", "_____no_output_____" ], [ "plot_agent.df_std", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0624c3648d264d7db51f66d4e56be8724034121
5,430
ipynb
Jupyter Notebook
notebooks/xmap.ipynb
yssource/xplot
69233b204bd680eeb19cecbe7712c3e09fefb83a
[ "BSD-3-Clause" ]
null
null
null
notebooks/xmap.ipynb
yssource/xplot
69233b204bd680eeb19cecbe7712c3e09fefb83a
[ "BSD-3-Clause" ]
null
null
null
notebooks/xmap.ipynb
yssource/xplot
69233b204bd680eeb19cecbe7712c3e09fefb83a
[ "BSD-3-Clause" ]
null
null
null
19.462366
73
0.466851
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d0625bd93fde81ac8be7517aa3101d8361d6bd43
47,126
ipynb
Jupyter Notebook
chatbot.ipynb
Kevinz930/Alexiri-chatbot-
43cd1daf633516a79a6d7ff23beb866f5f59d62d
[ "MIT" ]
null
null
null
chatbot.ipynb
Kevinz930/Alexiri-chatbot-
43cd1daf633516a79a6d7ff23beb866f5f59d62d
[ "MIT" ]
null
null
null
chatbot.ipynb
Kevinz930/Alexiri-chatbot-
43cd1daf633516a79a6d7ff23beb866f5f59d62d
[ "MIT" ]
null
null
null
38.407498
244
0.533463
[ [ [ "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport torch\nfrom torch.jit import script, trace\nimport torch.nn as nn\nfrom torch import optim\nimport torch.nn.functional as F\nimport csv\nimport random\nimport re\nimport os\nimport unicodedata\nimport codecs\nfrom io import open\nimport itertools\nimport math\nimport gensim", "_____no_output_____" ], [ "USE_CUDA = torch.cuda.is_available()\ndevice = torch.device(\"cuda\" if USE_CUDA else \"cpu\")", "_____no_output_____" ] ], [ [ "# Load & Preprocess Data", "_____no_output_____" ], [ "### Cornell Movie Dialogues Corpus", "_____no_output_____" ] ], [ [ "corpus_name = \"cornell movie-dialogs corpus\"\ncorpus = os.path.join(\"data\", corpus_name)\n\ndef printLines(file, n=10):\n with open(file, 'rb') as datafile:\n lines = datafile.readlines()\n for line in lines[:n]:\n print(line)\n\nprintLines(os.path.join(corpus, \"movie_lines.txt\"))", "b'L1045 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ They do not!\\r\\n'\nb'L1044 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ They do to!\\r\\n'\nb'L985 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ I hope so.\\r\\n'\nb'L984 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ She okay?\\r\\n'\nb\"L925 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ Let's go.\\r\\n\"\nb'L924 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ Wow\\r\\n'\nb\"L872 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ Okay -- you're gonna need to learn how to lie.\\r\\n\"\nb'L871 +++$+++ u2 +++$+++ m0 +++$+++ CAMERON +++$+++ No\\r\\n'\nb'L870 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ I\\'m kidding. You know how sometimes you just become this \"persona\"? 
And you don\\'t know how to quit?\\r\\n'\nb'L869 +++$+++ u0 +++$+++ m0 +++$+++ BIANCA +++$+++ Like my fear of wearing pastels?\\r\\n'\n" ], [ "# Splits each line of the file into a dictionary of fields\ndef loadLines(fileName, fields):\n lines = {}\n with open(fileName, 'r', encoding='iso-8859-1') as f:\n for line in f:\n values = line.split(\" +++$+++ \")\n # Extract fields\n lineObj = {}\n for i, field in enumerate(fields):\n lineObj[field] = values[i]\n lines[lineObj['lineID']] = lineObj\n return lines\n\n\n# Groups fields of lines from `loadLines` into conversations based on *movie_conversations.txt*\ndef loadConversations(fileName, lines, fields):\n conversations = []\n with open(fileName, 'r', encoding='iso-8859-1') as f:\n for line in f:\n values = line.split(\" +++$+++ \")\n # Extract fields\n convObj = {}\n for i, field in enumerate(fields):\n convObj[field] = values[i]\n # Convert string to list (convObj[\"utteranceIDs\"] == \"['L598485', 'L598486', ...]\")\n utterance_id_pattern = re.compile('L[0-9]+')\n lineIds = utterance_id_pattern.findall(convObj[\"utteranceIDs\"])\n # Reassemble lines\n convObj[\"lines\"] = []\n for lineId in lineIds:\n convObj[\"lines\"].append(lines[lineId])\n conversations.append(convObj)\n return conversations\n\n\n# Extracts pairs of sentences from conversations\ndef extractSentencePairs(conversations):\n qa_pairs = []\n for conversation in conversations:\n # Iterate over all the lines of the conversation\n for i in range(len(conversation[\"lines\"]) - 1): # We ignore the last line (no answer for it)\n inputLine = conversation[\"lines\"][i][\"text\"].strip()\n targetLine = conversation[\"lines\"][i+1][\"text\"].strip()\n # Filter wrong samples (if one of the lists is empty)\n if inputLine and targetLine:\n qa_pairs.append([inputLine, targetLine])\n return qa_pairs", "_____no_output_____" ], [ "# Define path to new file\ndatafile = os.path.join(corpus, \"formatted_movie_lines.txt\")\n\ndelimiter = '\\t'\n# Unescape the 
delimiter\ndelimiter = str(codecs.decode(delimiter, \"unicode_escape\"))\n\n# Initialize lines dict, conversations list, and field ids\nlines = {}\nconversations = []\nMOVIE_LINES_FIELDS = [\"lineID\", \"characterID\", \"movieID\", \"character\", \"text\"]\nMOVIE_CONVERSATIONS_FIELDS = [\"character1ID\", \"character2ID\", \"movieID\", \"utteranceIDs\"]\n\n# Load lines and process conversations\nprint(\"\\nProcessing corpus...\")\nlines = loadLines(os.path.join(corpus, \"movie_lines.txt\"), MOVIE_LINES_FIELDS)\nprint(\"\\nLoading conversations...\")\nconversations = loadConversations(os.path.join(corpus, \"movie_conversations.txt\"),\n lines, MOVIE_CONVERSATIONS_FIELDS)\n\n# Write new csv file\nprint(\"\\nWriting newly formatted file...\")\nwith open(datafile, 'w', encoding='utf-8') as outputfile:\n writer = csv.writer(outputfile, delimiter=delimiter, lineterminator='\\n')\n for pair in extractSentencePairs(conversations):\n writer.writerow(pair)\n\n# Print a sample of lines\nprint(\"\\nSample lines from file:\")\nprintLines(datafile)", "\nProcessing corpus...\n\nLoading conversations...\n\nWriting newly formatted file...\n\nSample lines from file:\nb\"Can we make this quick? Roxanne Korrine and Andrew Barrett are having an incredibly horrendous public break- up on the quad. Again.\\tWell, I thought we'd start with pronunciation, if that's okay with you.\\r\\n\"\nb\"Well, I thought we'd start with pronunciation, if that's okay with you.\\tNot the hacking and gagging and spitting part. Please.\\r\\n\"\nb\"Not the hacking and gagging and spitting part. Please.\\tOkay... then how 'bout we try out some French cuisine. Saturday? Night?\\r\\n\"\nb\"You're asking me out. That's so cute. What's your name again?\\tForget it.\\r\\n\"\nb\"No, no, it's my fault -- we didn't have a proper introduction ---\\tCameron.\\r\\n\"\nb\"Cameron.\\tThe thing is, Cameron -- I'm at the mercy of a particularly hideous breed of loser. My sister. 
I can't date until she does.\\r\\n\"\nb\"The thing is, Cameron -- I'm at the mercy of a particularly hideous breed of loser. My sister. I can't date until she does.\\tSeems like she could get a date easy enough...\\r\\n\"\nb'Why?\\tUnsolved mystery. She used to be really popular when she started high school, then it was just like she got sick of it or something.\\r\\n'\nb\"Unsolved mystery. She used to be really popular when she started high school, then it was just like she got sick of it or something.\\tThat's a shame.\\r\\n\"\nb'Gosh, if only we could find Kat a boyfriend...\\tLet me see what I can do.\\r\\n'\n" ], [ "# Default word tokens\nPAD_token = 0 # Used for padding short sentences\nSOS_token = 1 # Start-of-sentence token\nEOS_token = 2 # End-of-sentence token\n\nclass Voc:\n def __init__(self, name):\n self.name = name\n self.trimmed = False\n self.word2index = {}\n self.word2count = {}\n self.index2word = {PAD_token: \"PAD\", SOS_token: \"SOS\", EOS_token: \"EOS\"}\n self.num_words = 3 # Count SOS, EOS, PAD\n\n def addSentence(self, sentence):\n for word in sentence.split(' '):\n self.addWord(word)\n\n def addWord(self, word):\n if word not in self.word2index:\n self.word2index[word] = self.num_words\n self.word2count[word] = 1\n self.index2word[self.num_words] = word\n self.num_words += 1\n else:\n self.word2count[word] += 1\n\n # Remove words below a certain count threshold\n def trim(self, min_count):\n if self.trimmed:\n return\n self.trimmed = True\n\n keep_words = []\n\n for k, v in self.word2count.items():\n if v >= min_count:\n keep_words.append(k)\n\n print('keep_words {} / {} = {:.4f}'.format(\n len(keep_words), len(self.word2index), len(keep_words) / len(self.word2index)\n ))\n\n # Reinitialize dictionaries\n self.word2index = {}\n self.word2count = {}\n self.index2word = {PAD_token: \"PAD\", SOS_token: \"SOS\", EOS_token: \"EOS\"}\n self.num_words = 3 # Count default tokens\n\n for word in keep_words:\n self.addWord(word)", 
"_____no_output_____" ], [ "MAX_LENGTH = 10 # Maximum sentence length to consider\n\n# Turn a Unicode string to plain ASCII, thanks to\n# https://stackoverflow.com/a/518232/2809427\ndef unicodeToAscii(s):\n return ''.join(\n c for c in unicodedata.normalize('NFD', s)\n if unicodedata.category(c) != 'Mn'\n )\n\n# Lowercase, trim, and remove non-letter characters\ndef normalizeString(s):\n s = unicodeToAscii(s.lower().strip())\n s = re.sub(r\"([.!?])\", r\" \\1\", s)\n s = re.sub(r\"[^a-zA-Z.!?']+\", r\" \", s)\n s = re.sub(r\"\\s+\", r\" \", s).strip()\n return s\n\n# Read query/response pairs and return a voc object\ndef readVocs(datafile, corpus_name):\n print(\"Reading lines...\")\n # Read the file and split into lines\n lines = open(datafile, encoding='utf-8').\\\n read().strip().split('\\n')\n # Split every line into pairs and normalize\n pairs = [[normalizeString(s) for s in l.split('\\t')] for l in lines]\n voc = Voc(corpus_name)\n return voc, pairs\n\n# Returns True iff both sentences in a pair 'p' are under the MAX_LENGTH threshold\ndef filterPair(p):\n # Input sequences need to preserve the last word for EOS token\n return len(p[0].split(' ')) < MAX_LENGTH and len(p[1].split(' ')) < MAX_LENGTH\n\n# Filter pairs using filterPair condition\ndef filterPairs(pairs):\n return [pair for pair in pairs if filterPair(pair)]\n\n# Using the functions defined above, return a populated voc object and pairs list\ndef loadPrepareData(corpus, corpus_name, datafile, save_dir):\n print(\"Start preparing training data ...\")\n voc, pairs = readVocs(datafile, corpus_name)\n print(\"Read {!s} sentence pairs\".format(len(pairs)))\n pairs = filterPairs(pairs)\n print(\"Trimmed to {!s} sentence pairs\".format(len(pairs)))\n print(\"Counting words...\")\n for pair in pairs:\n voc.addSentence(pair[0])\n voc.addSentence(pair[1])\n print(\"Counted words:\", voc.num_words)\n return voc, pairs\n\n\n# Load/Assemble voc and pairs\nsave_dir = os.path.join(\"data\", \"save\")\nvoc, pairs = 
loadPrepareData(corpus, corpus_name, datafile, save_dir)\n# Print some pairs to validate\nprint(\"\\npairs:\")\nfor pair in pairs[:10]:\n print(pair)", "Start preparing training data ...\nReading lines...\nRead 221282 sentence pairs\nTrimmed to 70086 sentence pairs\nCounting words...\nCounted words: 20282\n\npairs:\n[\"that's because it's such a nice one .\", 'forget french .']\n['there .', 'where ?']\n['you have my word . as a gentleman', \"you're sweet .\"]\n['hi .', 'looks like things worked out tonight huh ?']\n['you know chastity ?', 'i believe we share an art instructor']\n['have fun tonight ?', 'tons']\n['well no . . .', \"then that's all you had to say .\"]\n[\"then that's all you had to say .\", 'but']\n['but', 'you always been this selfish ?']\n['do you listen to this crap ?', 'what crap ?']\n" ], [ "MIN_COUNT = 3 # Minimum word count threshold for trimming\n\ndef trimRareWords(voc, pairs, MIN_COUNT):\n # Trim words used under the MIN_COUNT from the voc\n voc.trim(MIN_COUNT)\n # Filter out pairs with trimmed words\n keep_pairs = []\n for pair in pairs:\n input_sentence = pair[0]\n output_sentence = pair[1]\n keep_input = True\n keep_output = True\n # Check input sentence\n for word in input_sentence.split(' '):\n if word not in voc.word2index:\n keep_input = False\n break\n # Check output sentence\n for word in output_sentence.split(' '):\n if word not in voc.word2index:\n keep_output = False\n break\n\n # Only keep pairs that do not contain trimmed word(s) in their input or output sentence\n if keep_input and keep_output:\n keep_pairs.append(pair)\n\n print(\"Trimmed from {} pairs to {}, {:.4f} of total\".format(len(pairs), len(keep_pairs), len(keep_pairs) / len(pairs)))\n return keep_pairs\n\n\n# Trim voc and pairs\npairs = trimRareWords(voc, pairs, MIN_COUNT)", "keep_words 8610 / 20279 = 0.4246\nTrimmed from 70086 pairs to 57379, 0.8187 of total\n" ] ], [ [ "# Prepare Data for Models", "_____no_output_____" ] ], [ [ "def indexesFromSentence(voc, 
sentence):\n return [voc.word2index[word] for word in sentence.split(' ')] + [EOS_token]\n\n\ndef zeroPadding(l, fillvalue=PAD_token):\n return list(itertools.zip_longest(*l, fillvalue=fillvalue))\n\ndef binaryMatrix(l, value=PAD_token):\n m = []\n for i, seq in enumerate(l):\n m.append([])\n for token in seq:\n if token == PAD_token:\n m[i].append(0)\n else:\n m[i].append(1)\n return m\n\n# Returns padded input sequence tensor and lengths\ndef inputVar(l, voc):\n indexes_batch = [indexesFromSentence(voc, sentence) for sentence in l]\n lengths = torch.tensor([len(indexes) for indexes in indexes_batch])\n padList = zeroPadding(indexes_batch)\n padVar = torch.LongTensor(padList)\n return padVar, lengths\n\n# Returns padded target sequence tensor, padding mask, and max target length\ndef outputVar(l, voc):\n indexes_batch = [indexesFromSentence(voc, sentence) for sentence in l]\n max_target_len = max([len(indexes) for indexes in indexes_batch])\n padList = zeroPadding(indexes_batch)\n mask = binaryMatrix(padList)\n mask = torch.BoolTensor(mask)\n padVar = torch.LongTensor(padList)\n return padVar, mask, max_target_len\n\n# Returns all items for a given batch of pairs\ndef batch2TrainData(voc, pair_batch):\n pair_batch.sort(key=lambda x: len(x[0].split(\" \")), reverse=True)\n input_batch, output_batch = [], []\n for pair in pair_batch:\n input_batch.append(pair[0])\n output_batch.append(pair[1])\n inp, lengths = inputVar(input_batch, voc)\n output, mask, max_target_len = outputVar(output_batch, voc)\n return inp, lengths, output, mask, max_target_len\n\n\n# Example for validation\nsmall_batch_size = 5\nbatches = batch2TrainData(voc, [random.choice(pairs) for _ in range(small_batch_size)])\ninput_variable, lengths, target_variable, mask, max_target_len = batches\n\nprint(\"input_variable:\", input_variable)\nprint(\"lengths:\", lengths)\nprint(\"target_variable:\", target_variable)\nprint(\"mask:\", mask)\nprint(\"max_target_len:\", max_target_len)", "input_variable: 
tensor([[ 33, 42, 83, 181, 279],\n [ 97, 67, 59, 341, 31],\n [ 32, 1089, 735, 33, 10],\n [ 10, 260, 112, 32, 2],\n [ 563, 33, 16, 15, 0],\n [ 46, 121, 15, 2, 0],\n [ 82, 1727, 2, 0, 0],\n [ 10, 10, 0, 0, 0],\n [ 2, 2, 0, 0, 0]])\nlengths: tensor([9, 9, 7, 6, 4])\ntarget_variable: tensor([[ 56, 125, 5, 616, 22],\n [ 53, 548, 68, 175, 73],\n [ 33, 10, 10, 59, 7],\n [ 47, 2, 33, 1905, 3516],\n [ 15, 0, 32, 10, 4119],\n [ 2, 0, 204, 2, 10],\n [ 0, 0, 10, 0, 2],\n [ 0, 0, 2, 0, 0]])\nmask: tensor([[ True, True, True, True, True],\n [ True, True, True, True, True],\n [ True, True, True, True, True],\n [ True, True, True, True, True],\n [ True, False, True, True, True],\n [ True, False, True, True, True],\n [False, False, True, False, True],\n [False, False, True, False, False]])\nmax_target_len: 8\n" ] ], [ [ "# Encoder", "_____no_output_____" ] ], [ [ "class EncoderRNN(nn.Module):\n def __init__(self, hidden_size, embedding, n_layers=1, dropout=0):\n super(EncoderRNN, self).__init__()\n self.n_layers = n_layers\n self.hidden_size = hidden_size\n self.embedding = embedding\n\n # Initialize GRU; the input_size and hidden_size params are both set to 'hidden_size'\n # because our input size is a word embedding with number of features == hidden_size\n self.gru = nn.GRU(hidden_size, hidden_size, n_layers,\n dropout=(0 if n_layers == 1 else dropout), bidirectional=True)\n\n def forward(self, input_seq, input_lengths, hidden=None):\n # Convert word indexes to embeddings\n embedded = self.embedding(input_seq)\n # Pack padded batch of sequences for RNN module\n packed = nn.utils.rnn.pack_padded_sequence(embedded, input_lengths)\n # Forward pass through GRU\n outputs, hidden = self.gru(packed, hidden)\n # Unpack padding\n outputs, _ = nn.utils.rnn.pad_packed_sequence(outputs)\n # Sum bidirectional GRU outputs\n outputs = outputs[:, :, :self.hidden_size] + outputs[:, : ,self.hidden_size:]\n # Return output and final hidden state\n return outputs, hidden", "_____no_output_____" ] ], 
[ [ "# Decoder", "_____no_output_____" ] ], [ [ "# Luong attention layer\nclass Attn(nn.Module):\n def __init__(self, method, hidden_size):\n super(Attn, self).__init__()\n self.method = method\n if self.method not in ['dot', 'general', 'concat']:\n raise ValueError(self.method, \"is not an appropriate attention method.\")\n self.hidden_size = hidden_size\n if self.method == 'general':\n self.attn = nn.Linear(self.hidden_size, hidden_size)\n elif self.method == 'concat':\n self.attn = nn.Linear(self.hidden_size * 2, hidden_size)\n self.v = nn.Parameter(torch.FloatTensor(hidden_size))\n\n def dot_score(self, hidden, encoder_output):\n return torch.sum(hidden * encoder_output, dim=2)\n\n def general_score(self, hidden, encoder_output):\n energy = self.attn(encoder_output)\n return torch.sum(hidden * energy, dim=2)\n\n def concat_score(self, hidden, encoder_output):\n energy = self.attn(torch.cat((hidden.expand(encoder_output.size(0), -1, -1), encoder_output), 2)).tanh()\n return torch.sum(self.v * energy, dim=2)\n\n def forward(self, hidden, encoder_outputs):\n # Calculate the attention weights (energies) based on the given method\n if self.method == 'general':\n attn_energies = self.general_score(hidden, encoder_outputs)\n elif self.method == 'concat':\n attn_energies = self.concat_score(hidden, encoder_outputs)\n elif self.method == 'dot':\n attn_energies = self.dot_score(hidden, encoder_outputs)\n\n # Transpose max_length and batch_size dimensions\n attn_energies = attn_energies.t()\n\n # Return the softmax normalized probability scores (with added dimension)\n return F.softmax(attn_energies, dim=1).unsqueeze(1)", "_____no_output_____" ], [ "class LuongAttnDecoderRNN(nn.Module):\n def __init__(self, attn_model, embedding, hidden_size, output_size, n_layers=1, dropout=0.1):\n super(LuongAttnDecoderRNN, self).__init__()\n\n # Keep for reference\n self.attn_model = attn_model\n self.hidden_size = hidden_size\n self.output_size = output_size\n self.n_layers = 
n_layers\n self.dropout = dropout\n\n # Define layers\n self.embedding = embedding\n self.embedding_dropout = nn.Dropout(dropout)\n self.gru = nn.GRU(hidden_size, hidden_size, n_layers, dropout=(0 if n_layers == 1 else dropout))\n self.concat = nn.Linear(hidden_size * 2, hidden_size)\n self.out = nn.Linear(hidden_size, output_size)\n\n self.attn = Attn(attn_model, hidden_size)\n\n def forward(self, input_step, last_hidden, encoder_outputs):\n # Note: we run this one step (word) at a time\n # Get embedding of current input word\n embedded = self.embedding(input_step)\n embedded = self.embedding_dropout(embedded)\n # Forward through unidirectional GRU\n rnn_output, hidden = self.gru(embedded, last_hidden)\n # Calculate attention weights from the current GRU output\n attn_weights = self.attn(rnn_output, encoder_outputs)\n # Multiply attention weights to encoder outputs to get new \"weighted sum\" context vector\n context = attn_weights.bmm(encoder_outputs.transpose(0, 1))\n # Concatenate weighted context vector and GRU output using Luong eq. 5\n rnn_output = rnn_output.squeeze(0)\n context = context.squeeze(1)\n concat_input = torch.cat((rnn_output, context), 1)\n concat_output = torch.tanh(self.concat(concat_input))\n # Predict next word using Luong eq. 
6\n output = self.out(concat_output)\n output = F.softmax(output, dim=1)\n # Return output and final hidden state\n return output, hidden", "_____no_output_____" ] ], [ [ "# Training Procedure", "_____no_output_____" ] ], [ [ "def maskNLLLoss(inp, target, mask):\n nTotal = mask.sum()\n crossEntropy = -torch.log(torch.gather(inp, 1, target.view(-1, 1)).squeeze(1))\n loss = crossEntropy.masked_select(mask).mean()\n loss = loss.to(device)\n return loss, nTotal.item()", "_____no_output_____" ], [ "def train(input_variable, lengths, target_variable, mask, max_target_len, encoder, decoder, embedding,\n encoder_optimizer, decoder_optimizer, batch_size, clip, max_length=MAX_LENGTH):\n\n # Zero gradients\n encoder_optimizer.zero_grad()\n decoder_optimizer.zero_grad()\n\n # Set device options\n input_variable = input_variable.to(device)\n target_variable = target_variable.to(device)\n mask = mask.to(device)\n # Lengths for rnn packing should always be on the cpu\n lengths = lengths.to(\"cpu\")\n\n # Initialize variables\n loss = 0\n print_losses = []\n n_totals = 0\n\n # Forward pass through encoder\n encoder_outputs, encoder_hidden = encoder(input_variable, lengths)\n\n # Create initial decoder input (start with SOS tokens for each sentence)\n decoder_input = torch.LongTensor([[SOS_token for _ in range(batch_size)]])\n decoder_input = decoder_input.to(device)\n\n # Set initial decoder hidden state to the encoder's final hidden state\n decoder_hidden = encoder_hidden[:decoder.n_layers]\n\n # Determine if we are using teacher forcing this iteration\n use_teacher_forcing = True if random.random() < teacher_forcing_ratio else False\n\n # Forward batch of sequences through decoder one time step at a time\n if use_teacher_forcing:\n for t in range(max_target_len):\n decoder_output, decoder_hidden = decoder(\n decoder_input, decoder_hidden, encoder_outputs\n )\n # Teacher forcing: next input is current target\n decoder_input = target_variable[t].view(1, -1)\n # Calculate and 
accumulate loss\n            mask_loss, nTotal = maskNLLLoss(decoder_output, target_variable[t], mask[t])\n            loss += mask_loss\n            print_losses.append(mask_loss.item() * nTotal)\n            n_totals += nTotal\n    else:\n        for t in range(max_target_len):\n            decoder_output, decoder_hidden = decoder(\n                decoder_input, decoder_hidden, encoder_outputs\n            )\n            # No teacher forcing: next input is decoder's own current output\n            _, topi = decoder_output.topk(1)\n            decoder_input = torch.LongTensor([[topi[i][0] for i in range(batch_size)]])\n            decoder_input = decoder_input.to(device)\n            # Calculate and accumulate loss\n            mask_loss, nTotal = maskNLLLoss(decoder_output, target_variable[t], mask[t])\n            loss += mask_loss\n            print_losses.append(mask_loss.item() * nTotal)\n            n_totals += nTotal\n\n    # Perform backpropagation\n    loss.backward()\n\n    # Clip gradients: gradients are modified in place\n    _ = nn.utils.clip_grad_norm_(encoder.parameters(), clip)\n    _ = nn.utils.clip_grad_norm_(decoder.parameters(), clip)\n\n    # Adjust model weights\n    encoder_optimizer.step()\n    decoder_optimizer.step()\n\n    return sum(print_losses) / n_totals", "_____no_output_____" ], [ "def trainIters(model_name, voc, pairs, encoder, decoder, encoder_optimizer, decoder_optimizer, embedding, encoder_n_layers, decoder_n_layers, save_dir, n_iteration, batch_size, print_every, save_every, clip, corpus_name, loadFilename):\n\n    # Load batches for each iteration\n    training_batches = [batch2TrainData(voc, [random.choice(pairs) for _ in range(batch_size)])\n                      for _ in range(n_iteration)]\n\n    # Initializations\n    print('Initializing ...')\n    start_iteration = 1\n    print_loss = 0\n    if loadFilename:\n        start_iteration = checkpoint['iteration'] + 1\n\n    # Training loop\n    print(\"Training...\")\n    for iteration in range(start_iteration, n_iteration + 1):\n        training_batch = training_batches[iteration - 1]\n        # Extract fields from batch\n        input_variable, lengths, target_variable, mask, max_target_len = training_batch\n\n        # Run a training iteration with batch\n        loss = 
train(input_variable, lengths, target_variable, mask, max_target_len, encoder,\n decoder, embedding, encoder_optimizer, decoder_optimizer, batch_size, clip)\n print_loss += loss\n\n # Print progress\n if iteration % print_every == 0:\n print_loss_avg = print_loss / print_every\n print(\"Iteration: {}; Percent complete: {:.1f}%; Average loss: {:.4f}\".format(iteration, iteration / n_iteration * 100, print_loss_avg))\n print_loss = 0\n\n # Save checkpoint\n if (iteration % save_every == 0):\n directory = os.path.join(save_dir, model_name, corpus_name, '{}-{}_{}'.format(encoder_n_layers, decoder_n_layers, hidden_size))\n if not os.path.exists(directory):\n os.makedirs(directory)\n torch.save({\n 'iteration': iteration,\n 'en': encoder.state_dict(),\n 'de': decoder.state_dict(),\n 'en_opt': encoder_optimizer.state_dict(),\n 'de_opt': decoder_optimizer.state_dict(),\n 'loss': loss,\n 'voc_dict': voc.__dict__,\n 'embedding': embedding.state_dict()\n }, os.path.join(directory, '{}_{}.tar'.format(iteration, 'checkpoint')))", "_____no_output_____" ] ], [ [ "# Evaluation", "_____no_output_____" ] ], [ [ "class GreedySearchDecoder(nn.Module):\n def __init__(self, encoder, decoder, voc):\n super(GreedySearchDecoder, self).__init__()\n self.encoder = encoder\n self.decoder = decoder\n self.voc = voc\n\n def forward(self, input_seq, input_length, max_length):\n # Forward input through encoder model\n encoder_outputs, encoder_hidden = self.encoder(input_seq, input_length)\n # Prepare encoder's final hidden layer to be first hidden input to the decoder\n decoder_hidden = encoder_hidden[:decoder.n_layers]\n # Initialize decoder input with SOS_token\n decoder_input = torch.ones(1, 1, device=device, dtype=torch.long) * SOS_token\n # Initialize tensors to append decoded words to\n all_tokens = torch.zeros([0], device=device, dtype=torch.long)\n all_scores = torch.zeros([0], device=device)\n # Iteratively decode one word token at a time\n for _ in range(max_length):\n # Forward pass 
through decoder\n decoder_output, decoder_hidden = self.decoder(decoder_input, decoder_hidden, encoder_outputs)\n # Obtain most likely word token and its softmax score\n decoder_scores, decoder_input = torch.max(decoder_output, dim=1)\n \n \n # Print words and scores\n# print('all tokens', all_tokens)\n print('all tokens words', [voc.index2word[token.item()] for token in all_tokens])\n \n \n if all_tokens.nelement() > 0 and int(decoder_input[0]) == self.voc.word2index['.']: # and int(all_tokens[-1]) == 2\n decoder_scores, decoder_input = torch.kthvalue(decoder_output, 2)\n \n # Record token and score\n all_tokens = torch.cat((all_tokens, decoder_input), dim=0)\n all_scores = torch.cat((all_scores, decoder_scores), dim=0)\n # Prepare current token to be next decoder input (add a dimension)\n decoder_input = torch.unsqueeze(decoder_input, 0)\n \n # Return collections of word tokens and scores\n return all_tokens, all_scores", "_____no_output_____" ], [ "def evaluate(encoder, decoder, searcher, voc, sentence, max_length=MAX_LENGTH):\n ### Format input sentence as a batch\n # words -> indexes\n indexes_batch = [indexesFromSentence(voc, sentence)]\n # Create lengths tensor\n lengths = torch.tensor([len(indexes) for indexes in indexes_batch])\n # Transpose dimensions of batch to match models' expectations\n input_batch = torch.LongTensor(indexes_batch).transpose(0, 1)\n # Use appropriate device\n input_batch = input_batch.to(device)\n lengths = lengths.to(\"cpu\")\n # Decode sentence with searcher\n tokens, scores = searcher(input_batch, lengths, max_length)\n # indexes -> words\n decoded_words = [voc.index2word[token.item()] for token in tokens]\n \n return decoded_words\n\n\ndef evaluateInput(encoder, decoder, searcher, voc):\n input_sentence = ''\n while True:\n try:\n # Get input sentence\n input_sentence = input('> ')\n # Check if it is quit case\n if input_sentence == 'q' or input_sentence == 'quit': break\n # Normalize sentence\n input_sentence = 
normalizeString(input_sentence)\n            # Evaluate sentence\n            output_words = evaluate(encoder, decoder, searcher, voc, input_sentence)\n            \n            # Format and print response sentence\n            output_words[:] = [x for x in output_words if not (x == 'EOS' or x == 'PAD')] # or x == '.'\n            \n            print('human:', input_sentence)\n            print('Bot:', ' '.join(output_words))\n\n        except KeyError:\n            print(\"Error: Encountered unknown word.\")", "_____no_output_____" ] ], [ [ "# Embeddings", "_____no_output_____" ] ], [ [ "# load pre-trained word2Vec model\nimport gensim.downloader as api\nmodel = api.load('word2vec-google-news-300')\nweights_w2v = torch.FloatTensor(model.vectors)", "_____no_output_____" ], [ "# load pre-trained GloVe 42B-300d model\n# model = gensim.models.KeyedVectors.load_word2vec_format('glove.42B.300d.w2vformat.txt')\n\ncorpus = os.path.join(\"glove\", \"glove.42B.300d.w2vformat.txt\")\nmodel = gensim.models.KeyedVectors.load_word2vec_format(corpus)\nweights_42b = torch.FloatTensor(model.vectors)", "_____no_output_____" ], [ "# load pre-trained GloVe 6B-300d model\ncorpus = os.path.join(\"glove\", \"glove.6B.300d.w2vformat.txt\")\nmodel = gensim.models.KeyedVectors.load_word2vec_format(corpus)\nweights_6b = torch.FloatTensor(model.vectors)", "_____no_output_____" ], [ "# Configure models\nmodel_name = 'cb_model'\n# attn_model = 'dot'\n#attn_model = 'general'\nattn_model = 'concat'\nhidden_size = 300 # 500 -> 300 to fit GloVe model\nencoder_n_layers = 3 # 2 -> 3\ndecoder_n_layers = 3 # 2 -> 3\ndropout = 0.1\nbatch_size = 64\n\n# Set checkpoint to load from; set to None if starting from scratch\nloadFilename = None\ncheckpoint_iter = 5000\n# loadFilename = os.path.join(save_dir, model_name, corpus_name,\n#                            '{}-{}_{}'.format(encoder_n_layers, decoder_n_layers, hidden_size),\n#                            '{}_checkpoint.tar'.format(checkpoint_iter))\n\n\n# Load model if a loadFilename is provided\nif loadFilename:\n    # If loading on same machine the model was trained on\n    checkpoint = torch.load(loadFilename)\n 
# If loading a model trained on GPU to CPU\n #checkpoint = torch.load(loadFilename, map_location=torch.device('cpu'))\n encoder_sd = checkpoint['en']\n decoder_sd = checkpoint['de']\n encoder_optimizer_sd = checkpoint['en_opt']\n decoder_optimizer_sd = checkpoint['de_opt']\n embedding_sd = checkpoint['embedding']\n voc.__dict__ = checkpoint['voc_dict']\n\n\nprint('Building encoder and decoder ...')\n# Initialize word embeddings\n# embedding = nn.Embedding(voc.num_words, hidden_size)\nembedding = nn.Embedding.from_pretrained(weights_w2v) # Choose embedding model\nif loadFilename:\n embedding.load_state_dict(embedding_sd)\n# Initialize encoder & decoder models\nencoder = EncoderRNN(hidden_size, embedding, encoder_n_layers, dropout)\ndecoder = LuongAttnDecoderRNN(attn_model, embedding, hidden_size, voc.num_words, decoder_n_layers, dropout)\nif loadFilename:\n encoder.load_state_dict(encoder_sd)\n decoder.load_state_dict(decoder_sd)\n# Use appropriate device\nencoder = encoder.to(device)\ndecoder = decoder.to(device)\nprint('Models built and ready to go!')", "Building encoder and decoder ...\nModels built and ready to go!\n" ] ], [ [ "# Run Model", "_____no_output_____" ], [ "### Training", "_____no_output_____" ] ], [ [ "# Configure training/optimization\nclip = 50.0\nteacher_forcing_ratio = 1.0\nlearning_rate = 0.0001\ndecoder_learning_ratio = 6.0 # 5.0 -> 4.0\nn_iteration = 5000 # 4000 -> 5000\nprint_every = 1\nsave_every = 500\n\n# Ensure dropout layers are in train mode\nencoder.train()\ndecoder.train()\n\n# Initialize optimizers\nprint('Building optimizers ...')\nencoder_optimizer = optim.Adam(encoder.parameters(), lr=learning_rate)\ndecoder_optimizer = optim.Adam(decoder.parameters(), lr=learning_rate * decoder_learning_ratio)\nif loadFilename:\n encoder_optimizer.load_state_dict(encoder_optimizer_sd)\n decoder_optimizer.load_state_dict(decoder_optimizer_sd)\n\n# If you have cuda, configure cuda to call\nfor state in encoder_optimizer.state.values():\n for k, v 
in state.items():\n if isinstance(v, torch.Tensor):\n state[k] = v.cuda()\n\nfor state in decoder_optimizer.state.values():\n for k, v in state.items():\n if isinstance(v, torch.Tensor):\n state[k] = v.cuda()\n\n# Run training iterations\nprint(\"Starting Training!\")\ntrainIters(model_name, voc, pairs, encoder, decoder, encoder_optimizer, decoder_optimizer,\n embedding, encoder_n_layers, decoder_n_layers, save_dir, n_iteration, batch_size,\n print_every, save_every, clip, corpus_name, loadFilename)", "_____no_output_____" ] ], [ [ "### Evaluation", "_____no_output_____" ] ], [ [ "# Set dropout layers to eval mode\nencoder.eval()\ndecoder.eval()\n\n# Initialize search module\nsearcher = GreedySearchDecoder(encoder, decoder, voc)\n\nevaluateInput(encoder, decoder, searcher, voc)", "> hey\nall tokens words []\nall tokens words ['i']\nall tokens words ['i', \"don't\"]\nall tokens words ['i', \"don't\", 'bacon']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon', 'sandwich']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon', 'sandwich', 'bacon']\nall tokens words ['i', \"don't\", 'bacon', 'sandwich', 'sandwich', 'bacon', 'sandwich', 'bacon', 'sandwich']\nhuman: hey\nBot: i don't bacon sandwich sandwich bacon sandwich bacon sandwich\n" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0625de76b5fbb52b499afab5a5debe2b3b06ab9
25,964
ipynb
Jupyter Notebook
Seattle Busiest Time.ipynb
ShadyHanafy/Shady
f8e3f786840375845e2a1aede00212d8e5c95b25
[ "CNRI-Python" ]
null
null
null
Seattle Busiest Time.ipynb
ShadyHanafy/Shady
f8e3f786840375845e2a1aede00212d8e5c95b25
[ "CNRI-Python" ]
null
null
null
Seattle Busiest Time.ipynb
ShadyHanafy/Shady
f8e3f786840375845e2a1aede00212d8e5c95b25
[ "CNRI-Python" ]
null
null
null
65.565657
8,320
0.757703
[ [ [ "# Data Understanding\nIn order to get a better understanding of the busiest times in Seattle, we will take a look at the dataset.\n\n## Access & Explore\nFirst, let's read and explore the data", "_____no_output_____" ] ], [ [ "import pandas as pd\nimport matplotlib.pyplot as plt", "_____no_output_____" ], [ "#Import Calendar dataset\ndf_cal=pd.read_csv('calendar.csv', thousands=',')\npd.set_option(\"display.max_columns\", None)\ndf_cal.head()", "_____no_output_____" ], [ "#Check if there are any empty records for the price\ndf_cal['price'].isnull().value_counts()", "_____no_output_____" ] ], [ [ "# Data Preparation & Analysis\nNow we will prepare the data and make some conversions to get it ready for visualization\n\n## Wrangle and Clean", "_____no_output_____" ] ], [ [ "#Convert price to numerical value\ndf_cal[\"price\"] = df_cal[\"price\"].str.replace('[$,,,]',\"\").astype(float)", "<ipython-input-16-61781eef3286>:2: FutureWarning: The default value of regex will change from True to False in a future version.\n  df_cal[\"price\"] = df_cal[\"price\"].str.replace('[$,,,]',\"\").astype(float)\n" ], [ "#Impute the missing data of the price column with the mean\ndf_cal['price'].fillna((df_cal['price'].mean()), inplace=True)", "_____no_output_____" ], [ "#Create a new feature representing the month of the year\ndf_cal['month'] = pd.DatetimeIndex(df_cal['date']).month\ndf_cal.head()", "_____no_output_____" ] ], [ [ "## Data Visualization\nNow we will visualize our dataset to answer the main question: which time of the year is the busiest in Seattle, and how is this reflected in prices", "_____no_output_____" ] ], [ [ "#Plot the busiest Seattle time of the year\nbusytime=df_cal.groupby(['month']).price.mean()\nbusytime.plot(kind = 'bar', title=\"BusyTime\")", "_____no_output_____" ], [ "#Plot the price range across the year\nbusytime_price=df_cal.groupby(['month']).mean()['price'].sort_values().dropna()\nbusytime_price.plot(kind=\"bar\");\nplt.title(\"Price 
Trend over year\");", "_____no_output_____" ] ], [ [ "# Conclusion\n\nJuly, August and June are the busiest times of the year, and this is reflected proportionally in booking prices", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d06277d0426a410149263ce439782d12a0d06670
406,806
ipynb
Jupyter Notebook
m03_v01_store_sales_prediction.ipynb
luana-afonso/DataScience-Em-Producao
13ef7d1a0f72fb9e83f6856612644f08e0bf0ae7
[ "MIT" ]
null
null
null
m03_v01_store_sales_prediction.ipynb
luana-afonso/DataScience-Em-Producao
13ef7d1a0f72fb9e83f6856612644f08e0bf0ae7
[ "MIT" ]
null
null
null
m03_v01_store_sales_prediction.ipynb
luana-afonso/DataScience-Em-Producao
13ef7d1a0f72fb9e83f6856612644f08e0bf0ae7
[ "MIT" ]
null
null
null
102.884674
204,940
0.752998
[ [ [ "# 0.0. IMPORTS", "_____no_output_____" ] ], [ [ "import math\nimport pandas as pd\nimport inflection\nimport numpy as np\nimport seaborn as sns\nimport matplotlib as plt\nimport datetime\n\nfrom IPython.display import Image", "_____no_output_____" ] ], [ [ "## 0.1. Helper Functions", "_____no_output_____" ], [ "## 0.2. Loading Data", "_____no_output_____" ] ], [ [ "# read_csv is a method of the pandas library\n# Do I need to unzip the file first?\n# low_memory tells pandas whether to read the whole file at once (False) or in chunks (True); it usually warns which is best for the situation\ndf_sales_raw = pd.read_csv(\"data/train.csv.zip\", low_memory=False)\ndf_store_raw = pd.read_csv(\"data/store.csv\", low_memory=False)\n\n# Merge (reference file, file to append to that reference, how I want to merge, column that is the same in both datasets to serve as the key )\n# merge is also a pandas method\ndf_raw = pd.merge( df_sales_raw, df_store_raw, how=\"left\", on=\"Store\" )", "_____no_output_____" ], [ "df_sales_raw.head()", "_____no_output_____" ], [ "df_store_raw.head()", "_____no_output_____" ], [ "# Plot a random row with the sample method to check that the merge worked\ndf_raw.sample()", "_____no_output_____" ] ], [ [ "# 1.0. STEP 01 - DATA DESCRIPTION", "_____no_output_____" ] ], [ [ "df1 = df_raw.copy()", "_____no_output_____" ] ], [ [ "## 1.1. Rename Columns", "_____no_output_____" ], [ "### To gain speed during development!", "_____no_output_____" ] ], [ [ "df_raw.columns\n# They are actually fairly well organized, in camel case format, but in the real world it can be quite different! lol", "_____no_output_____" ], [ "cols_old = ['Store', 'DayOfWeek', 'Date', 'Sales', 'Customers', 'Open', 'Promo',\n 'StateHoliday', 'SchoolHoliday', 'StoreType', 'Assortment',\n 'CompetitionDistance', 'CompetitionOpenSinceMonth',\n 'CompetitionOpenSinceYear', 'Promo2', 'Promo2SinceWeek',\n 'Promo2SinceYear', 'PromoInterval']\n\nsnakecase = lambda x: inflection.underscore( x )\n\ncols_new = list( map( snakecase, cols_old) )\n\n# Rename\ndf1.columns = cols_new", "_____no_output_____" ], [ "df1.columns", "_____no_output_____" ] ], [ [ "## 1.2. Data Dimensions", "_____no_output_____" ], [ "### Find out the number of rows and columns in the dataset", "_____no_output_____" ] ], [ [ "# shape prints the rows and columns of the dataframe; the first element is the rows\n# Why the curly braces there? They are the placeholders used by format\nprint( \"Number of Rows: {}\".format( df1.shape[0] ) )\nprint( \"Number of Cols: {}\".format( df1.shape[1] ) )", "Number of Rows: 1017209\nNumber of Cols: 18\n" ] ], [ [ "## 1.3. Data Types", "_____no_output_____" ] ], [ [ "# Note that we don't use parentheses here. That is because we are accessing a property, not calling a method\n# By default pandas treats whatever is not numeric as object. Object is the \"character\" type within pandas\n# Note the date column: we need to change it from object to datetime!\ndf1.dtypes", "_____no_output_____" ], [ "df1[\"date\"] = pd.to_datetime( df1[\"date\"] )\ndf1.dtypes", "_____no_output_____" ] ], [ [ "## 1.4. Check NA", "_____no_output_____" ] ], [ [ "# The isna method shows every row that has at least one column with an NA (empty value)\n# But since I want the total per column, I use the sum method\ndf1.isna().sum()", "_____no_output_____" ], [ "# We need to handle these NAs.\n# There are basically 3 ways:\n# 1. Drop those rows (quick and easy, but it throws data away)\n# 2. Use machine learning algorithms. There are NA imputation methods where you can, for example, replace the empty columns with the column's own behavior (e.g. median, mean...)\n# 3. Use business understanding to fill in the NAs and recover data.", "_____no_output_____" ] ], [ [ "## 1.5. Fillout NA", "_____no_output_____" ] ], [ [ "df1[\"competition_distance\"].max()", "_____no_output_____" ], [ "#competition_distance: distance in meters to the nearest competitor store\n# If we assume that missing data in this column means a competitor is geographically very far away, then setting those values much larger than the maximum distance found would solve the problem\n# When I use a lambda function, I can refer to everything through the variable name I define, in this case x\n# The apply function applies this logic to every row of the dataset\n# Apply it only to the competition_distance column\n# I want to overwrite the original column with the result\n\ndf1[\"competition_distance\"] = df1[\"competition_distance\"].apply( lambda x: 200000.0 if math.isnan( x ) else x)\n\n#competition_open_since_month - gives the approximate year and month of the time the nearest competitor was opened \n# PREMISE: we can assume that if this column is NA, I will copy the sale date (and extract the month)\n# Why? Thinking ahead to the feature engineering step... some variables derived from time are very important to represent behavior, and one of them is: how long ago the event happened\n# Information about nearby competition is very important because it influences sales! (so we avoid dropping this data as much as possible)\n# First I check whether it is NA using the math module. If so, I take the \"date\" column and extract its month. If not, I keep the value.\n# I will use a lambda function, so I can refer to the df1 columns as x.\n# I will apply this (with the apply function) along the columns (axis=1). 
Nรฃo precisamos fazer isso no \"competition_distance\" pois lรก estavamos avaliando apenas 1 coluna. Preciso explicitar para a funรงรฃo apply quando tenho mais de uma coluna\n# O resultado disso eu vou sobrescrever a coluna \"competition_open_since_month\"\n\ndf1[\"competition_open_since_month\"] = df1.apply( lambda x: x[\"date\"].month if math.isnan( x[\"competition_open_since_month\"] ) else x[\"competition_open_since_month\"] , axis=1)\n\n#competition_open_since_year - gives the approximate year and month of the time the nearest competitor was opened\n# Mesma lรณgica da coluna acima, sรณ que em anos\n\ndf1[\"competition_open_since_year\"] = df1.apply( lambda x: x[\"date\"].year if math.isnan( x[\"competition_open_since_year\"] ) else x[\"competition_open_since_year\"] , axis=1)\n\n#promo2 - Promo2 is a continuing and consecutive promotion for some stores: 0 = store is not participating, 1 = store is participating\n#promo2_since_week - describes the year and calendar week when the store started participating in Promo2 \n# Dados NA nessa coluna querem dizer que a loja nรฃo participa da promoรงรฃo\n# Similar ao de cima\n\ndf1[\"promo2_since_week\"] = df1.apply( lambda x: x[\"date\"].week if math.isnan( x[\"promo2_since_week\"] ) else x[\"promo2_since_week\"] , axis=1)\n\n#promo2_since_year \ndf1[\"promo2_since_year\"] = df1.apply( lambda x: x[\"date\"].year if math.isnan( x[\"promo2_since_year\"] ) else x[\"promo2_since_year\"] , axis=1)\n\n#promo_interval - describes the consecutive intervals Promo2 is started, naming the months the promotion is started anew. E.g. 
\"Feb,May,Aug,Nov\" means each round starts in February, May, August, November of any given year for that store (meses que a promoรงรฃo ficou ativa)\n# Vamos fazer um split dessa coluna e criar uma lista: se a minha data estiver dentro dessa lista (promoรงรฃo ativa) eu vou criar uma coluna falando que a promo2 foi ativa\n\n# Cria coluna auxiliar\nmonth_map = {1: \"Jan\",2: \"Feb\",3: \"Mar\",4: \"Apr\",5: \"May\",6: \"Jun\",7: \"Jul\",8: \"Aug\",9: \"Sep\",10: \"Oct\",11: \"Nov\",12: \"Dec\"}\n\n# Se o valor na coluna promo_interval for NA, substituo por 0 (nรฃo hรก promoรงรฃo ativa). inplace=True pois nรฃo quero que ele retorne nenhum valor (faรงa a modificaรงรฃo direto na coluna)\ndf1[\"promo_interval\"].fillna(0, inplace=True)\n\n# ??? Pq aqui usamos o map ao inves do apply?\ndf1[\"month_map\"] = df1[\"date\"].dt.month.map( month_map )\n\n# Se o mรชs da coluna month_map estiver na promoรงรฃo, vamos colocar 1, se nรฃo estiver, 0\n# Temos aluns zeros na coluna \"promo_interval\" que sรฃo lojas que nรฃo aderiram a promo2\n\n# 0 if df1[\"promo_interval\"] == 0 else 1 if df1[\"month_map\"] in df1[\"promo_interval\"].split( \",\" ) else 0\n\n# Como vou usar mais de uma coluna preciso especificar a direรงรฃo\n# apply(lambda x: 0 if x[\"promo_interval\"] == 0 else 1 if df1[\"month_map\"] in x[\"promo_interval\"].split( \",\" ) else 0, axis=1 )\n\n# Nรฃo vou aplicar no dataset todo, vou filtrar pra ficar mais fรกcil:\n# Vou criar uma nova coluna is_promo que vai ser 1 ou 0\n\ndf1[\"is_promo\"] = df1[[\"promo_interval\",\"month_map\"]].apply(lambda x: 0 if x[\"promo_interval\"] == 0 else 1 if x[\"month_map\"] in x[\"promo_interval\"].split( \",\" ) else 0, axis=1 )", "_____no_output_____" ], [ "df1.isna().sum()", "_____no_output_____" ], [ "# Agora a coluna \"competition_distance\" nรฃo tem mais NA e o valor maximo รฉ 200000\ndf1[\"competition_distance\"].max()", "_____no_output_____" ], [ "# Pegando linhas aleatorias. 
T para mostrar a transposta\ndf1.sample(5).T", "_____no_output_____" ] ], [ [ "## 1.6. Change Types", "_____no_output_____" ] ], [ [ "# Importante checar se alguma operaรงรฃo feita na etapa anterior alterou algum dado anterior\n# Mรฉtodo dtypes\n# competition_open_since_month float64\n# competition_open_since_year float64\n# promo2_since_week float64\n# promo2_since_year float64\n# Na verdade essas variaveis acima deveriam ser int (mรชs e ano)\n\ndf1.dtypes", "_____no_output_____" ], [ "# Mรฉtodo astype nesse caso vai aplicar o int sob essa coluna e vai salvar de volta\ndf1[\"competition_open_since_month\"] = df1[\"competition_open_since_month\"].astype(int)\ndf1[\"competition_open_since_year\"] = df1[\"competition_open_since_year\"].astype(int)\ndf1[\"promo2_since_week\"] = df1[\"promo2_since_week\"].astype(int)\ndf1[\"promo2_since_year\"] = df1[\"promo2_since_year\"].astype(int)", "_____no_output_____" ], [ "df1.dtypes", "_____no_output_____" ] ], [ [ "## 1.7. Descriptive Statistics", "_____no_output_____" ], [ "### Ganhar conhecimento de negรณcio e detectar alguns erros", "_____no_output_____" ] ], [ [ "# Central Tendency = mean, median\n# Dispersion = std, min, max, range, skew, kurtosis\n\n# Precisamos separar nossas variรกveis entre numรฉricas e categรณricas.\n# A estatรญstica descritiva funciona para os dois tipos de variรกveis, mas a forma com que eu construo a estatistica \n# descritiva รฉ diferente.\n\n# Vou separar todas as colunas que sรฃo numรฉricas:\n# mรฉtodo select_dtypes e vou passar uma lista de todos os tipos de variaveis que quero selecionar\n# datetime64(ns) = dado de tempo (date)\n\n# ??? 
Qual a diferenรงa do int64 e int32?\n\nnum_attributes = df1.select_dtypes( include=[\"int64\",\"int32\",\"float64\"] )\ncat_attributes = df1.select_dtypes( exclude=[\"int64\", \"float64\",\"int32\",\"datetime64[ns]\"] )", "_____no_output_____" ], [ "num_attributes.sample(2)", "_____no_output_____" ], [ "cat_attributes.sample(2)", "_____no_output_____" ] ], [ [ "## 1.7.1 Numerical Attributes", "_____no_output_____" ] ], [ [ "# Apply para aplicar uma operaรงรฃo em todas as colunas e transformar num dataframe pra facilitar a visualizaรงรฃo\n# Transpostas para ter metricas nas colunas e features nas linhas\n\n# central tendency\nct1 = pd.DataFrame( num_attributes.apply ( np.mean) ).T\nct2 = pd.DataFrame( num_attributes.apply ( np.median ) ).T\n\n# dispersion\nd1 = pd.DataFrame( num_attributes.apply( np.std )).T\nd2 = pd.DataFrame( num_attributes.apply( min )).T\nd3 = pd.DataFrame( num_attributes.apply( max )).T\nd4 = pd.DataFrame( num_attributes.apply( lambda x: x.max() - x.min() )).T\nd5 = pd.DataFrame( num_attributes.apply( lambda x: x.skew() )).T\nd6 = pd.DataFrame( num_attributes.apply( lambda x: x.kurtosis() )).T\n\n# Para concatenar todas essas mรฉtricas na ordem que quero ver:\n# obs: Classe Pandas\n# Tem que transpor e resetar o index (Pq???)\n\nm = pd.concat([d2,d3,d4,ct1,ct2,d1,d5,d6]).T.reset_index()\n\n# Vamos nomear as colunas para nรฃo aparecer o index padrรฃo\nm.columns = [\"attributes\",\"min\",\"max\",\"range\",\"mean\",\"median\",\"std\",\"skew\",\"kurtosis\"]\nm", "_____no_output_____" ], [ "# Avaliando por exemplo vendas: min 0, max 41k. Media e mediana parecidas, nao tenho um deslocamento da Normal muito grande.\n# Skew proxima de 0 - muito proxima de uma normal\n# Kurtosis proximo de 1 - nao tem um pico muuuito grande", "_____no_output_____" ], [ "# Plotando as sales passando as colunas que quero mostrar\n# Obs: Vocรช consegue mudar o tamanho do plot usando os parรขmetros height e aspect. 
Um exemplo ficaria assim:\n# sns.displot(df1['sales'], height=8, aspect=2)\n# Descobri isso procurando a funรงรฃo displot direto na documentaรงรฃo do seaborn: https://seaborn.pydata.org/generated/seaborn.displot.html#seaborn.displot\nsns.displot( df1[\"sales\"], height=8, aspect=2)", "_____no_output_____" ], [ "# Skew alta, alta concentraรงรฃo de valores no comeรงo\n# Meus competidores estรฃo muito proximos\n\nsns.displot( df1[\"competition_distance\"])", "_____no_output_____" ] ], [ [ "## 1.7.2 Categorical Attributes", "_____no_output_____" ], [ "### Vai de boxblot!", "_____no_output_____" ] ], [ [ "# ??? No do Meigarom sรณ apareceu os: state_holiday, store_type, assortment, promo_interval e month_map\n# Tirei os int32 tambem dos categoricos\ncat_attributes.apply( lambda x: x.unique().shape[0] )", "_____no_output_____" ], [ "# Meigarom prefere o seaborn do que o matplotlib\n# sns.boxplot( x= y=, data= )\n# x = linha que vai ficar como referencia\n# y = o que quero medir (no caso, as vendas)\n\nsns.boxplot( x=\"state_holiday\", y=\"sales\", data=df1 )", "_____no_output_____" ], [ "# Se plotamos da forma acima nรฃo da pra ver nada... (variaveis com ranges mt diferentes)\n# Vamos filtrar os dados para plotar:\n# ??? Pq esse 0 รฉ uma string e nao um numero? df1[\"state_holiday\"] != \"0\"\n\naux1 = df1[(df1[\"state_holiday\"] != \"0\") & (df1[\"sales\"] > 0)]\n\n# plt.subplot = para plotar um do lado do outro\n\nplt.pyplot.subplot( 1, 3, 1)\nsns.boxplot( x=\"state_holiday\", y=\"sales\", data=aux1)\n\nplt.pyplot.subplot( 1, 3, 2)\nsns.boxplot( x=\"store_type\", y=\"sales\", data=aux1)\n\nplt.pyplot.subplot( 1, 3, 3)\nsns.boxplot( x=\"assortment\", y=\"sales\", data=aux1)\n\n# Boxplot:\n# Linha do meio รฉ a mediana: chegou na metade dos valores (em termos de posiรงรฃo), aquele valor รฉ sua mediana\n# Limite inferior da barra: 25ยบ quartil (quartil 25) e o limite superior รฉ o quartil 75\n# Os ultimos tracinhos sรฃo em cima o maximo e embaixo o minimo. 
Todos os pontos acima do tracinho de maximo sรฃo considerados outliers (3x o desvio padrรฃo)\n# assortment = mix de produtos", "_____no_output_____" ] ], [ [ "# 2.0. STEP 02 - FEATURE ENGINEERING", "_____no_output_____" ], [ "Para quรช fazer a Feature Engineering? Para ter as variรกveis DISPONรVEIS para ESTUDO durante a Anรกlise Exploratรณria dos Dados. Pra nรฃo ter bagunรงa, crie as variรกveis ANTES na anรกlise exploratรณria!!!", "_____no_output_____" ], [ "Vou usar uma classe Image para colocar a imagem do mapa mental:", "_____no_output_____" ] ], [ [ "df2 = df1.copy()", "_____no_output_____" ] ], [ [ "## 2.1. Hypothesis Mind Map ", "_____no_output_____" ] ], [ [ "Image (\"img/mind-map-hypothesis.png\")", "_____no_output_____" ] ], [ [ "## 2.2. Hypothesis Creation", "_____no_output_____" ], [ "### 2.2.1 Store Hypothesis", "_____no_output_____" ], [ "1. Stores with greater number of employees should sell more.", "_____no_output_____" ], [ "2. Stores with greater stock size should sell more.", "_____no_output_____" ], [ "3. Stores with bigger size should sell more.", "_____no_output_____" ], [ "4. Stores with smaller size should sell less.", "_____no_output_____" ], [ "5. Stores with greater assortment should sell more.", "_____no_output_____" ], [ "6. Stores with more competitors nearby should sell less.", "_____no_output_____" ], [ "7. Stores with competitors for longer should sell more. ", "_____no_output_____" ], [ "### 2.2.2 Product Hypothesis", "_____no_output_____" ], [ "1. Stores with more marketing should sell more.", "_____no_output_____" ], [ "2. Stores that exhibit more products in the showcase sell more.", "_____no_output_____" ], [ "3. Stores that have lower prices on products should sell more.", "_____no_output_____" ], [ "4. Stores that have lower prices for longer on products should sell more.", "_____no_output_____" ], [ "5. 
Stores with more consecutive sales should sell more.", "_____no_output_____" ], [ "### 2.2.3Time-based Hypothesis", "_____no_output_____" ], [ "1. Stores with more days in holidays should sell less.", "_____no_output_____" ], [ "2. Stores that open in the first 6 months should sell more.", "_____no_output_____" ], [ "3. Stores that open on weekends should sell more.", "_____no_output_____" ], [ "## 2.3. Final Hypothesis List", "_____no_output_____" ], [ "### As hipรณteses das quais temos os dados, vรฃo para a lista final de hipรณteses.\n\n", "_____no_output_____" ], [ "1. Stores with greater assortment should sell more.\n\n2. Stores with more competitors nearby should sell less.\n\n3. Stores with competitors for longer should sell more. \n\n4. Stores with active sales for longer should sell more.\n\n5. Stores with more days on sale should sell more.\n\n7. Stores with more consecutive sales should sell more.\n\n8. Stores opened during the Christmas holiday should sell more.\n\n9. Stores should sell more over the years.\n\n10. Stores should sell more in the second half of the year.\n\n11. Stores should sell more after the 10th of each month.\n\n12. Stores should sell less on weekends.\n\n13. Stores should sell less during school holidays. ", "_____no_output_____" ], [ "## 2.4. Feature Engineering", "_____no_output_____" ] ], [ [ "# year\ndf2['year'] = df2['date'].dt.year\n\n# month\ndf2['month'] = df2['date'].dt.month\n\n# day\ndf2['day'] = df2['date'].dt.day\n\n# week of year\ndf2['week_of_year'] = df2['date'].dt.isocalendar().week\n\n# year week\n# aqui nรฃo usaremos nenhum metodo, e sim mudaremos a formataรงรฃo da data apenas\n# ele fala do strftime no bรดnus\ndf2['year_week'] = df2['date'].dt.strftime( '%Y-%W' )\n\n# week of year \n# ps: <ipython-input-35-d06c5b7375c4>:9: FutureWarning: Series.dt.weekofyear and Series.dt.week have been deprecated. 
Please use Series.dt.isocalendar().week instead.\n# df2[\"week_of_year\"] = df2[\"date\"].dt.weekofyear\n\ndf2[\"week_of_year\"] = df2[\"date\"].dt.isocalendar().week\n\n# ??? Nรฃo era pra week_of_year ser igual ร  semana que aparece na coluna \"year_week\"? รฉ diferente!", "_____no_output_____" ], [ "df2.sample(10).T", "_____no_output_____" ], [ "# competition since\n# ja temos a coluna \"date\" para comparar, mas a informaรงรฃo de competition since estรก quebrada, temos coluna com year \n# e outra com month\n# Precisamos juntar as duas em uma data e fazer a substraรงรฃo das duas\n# mรฉtodo datetime vem de uma classe tambรฉm chamada datetime\n# datetime.datetime( year=, month=, day= )\n\n# datetime.datetime( year= df2[\"competition_open_since_year\"], month= df2[\"competition_open_since_month\"], day= 1 )\n# Vamos usar a funรงรฃo acima para todas as linhas do dataframe vamos usar lambda com variavel x e depois usar o apply\n# day = 1 pois nao temos informaรงรฃo sobre o dia\n# o apply vai precisar do axis pois estou usando duas colunas diferentes\n\ndf2[\"competition_since\"] = df2.apply(lambda x: datetime.datetime( year= x[\"competition_open_since_year\"], month= x[\"competition_open_since_month\"], day= 1), axis=1 )\n# com esse comando acima geramos a coluna \"competition_since\" no formato 2008-09-01 00:00:00. 
\n# Agora precisamos ver a diferenรงa dessa data com a date para saber o tempo de \"competition since\"\n\n# df2['date'] - df2['competition_since'] )/30 \n# divido por 30 pq quero manter a glanularidade em dias \n# o .days vai extrair os dias desse datetime e salva como inteiro em uma nova coluna 'competition_time_month'\ndf2['competition_time_month'] = ( ( df2['date'] - df2['competition_since'] )/30 ).apply( lambda x: x.days ).astype( int )", "_____no_output_____" ], [ "df2.head().T", "_____no_output_____" ], [ "# promo since, mesma estratรฉgia acima\n# Mas para as promoรงoes temos uma dificuldade a mais pois temos a coluna promo2 e informaรงรฃo de ano e semana\n# nรฃo temos de mรชs\n# Vamos fazer um join dos caracteres e depois converter na data\n# Mas para juntar as variรกveis assim precisamos que as 2 sejam strings (astype converte)\n# colocamos o \"-\" pra ficar no formato ano - semana do ano\n\n# df2['promo_since'] = df2['promo2_since_year'].astype( str ) + '-' + df2['promo2_since_week'].astype( str )\n# \"promo_since\" agora e string, nao รฉ datetime\n\ndf2['promo_since'] = df2['promo2_since_year'].astype( str ) + '-' + df2['promo2_since_week'].astype( str )\n\n# Deu uma complicada nesse promo, mas bora lรก...\n# Truque para converter o que geramos aqui em cima que ficou como string para data: datetime.datetime.strptime( x + '-1', '%Y-%W-%w' ). strptime( o que vai \n# mostrar, \"formato\")\n# x pq vamos aplicar para todas as linhas do dataframe\n# /7 para ter em semanas\n\ndf2['promo_since'] = df2['promo_since'].apply( lambda x: datetime.datetime.strptime( x + '-1', '%Y-%W-%w' ) - datetime.timedelta( days=7 ) )\n# Agora que temos duas datas sรณ falta subtrair...\ndf2['promo_time_week'] = ( ( df2['date'] - df2['promo_since'] )/7 ).apply( lambda x: x.days ).astype( int )\n\n#Obs:\n# %W Week number of the year (Monday as the first day of the week). 
\n# All days in a new year preceding the first Monday are considered to be in week 0\n# %w Weekday as a decimal number.\n\n# assortment (describes an assortment level: a = basic, b = extra, c = extended)\n# Mudar as letras para o que isso representa pra ficar mais facil a leitura:\n# Pq else e nรฃo elif na estrutura dentro do lambda???\n# ??? object type รฉ tipo string?\n# Nao preciso usar o axis pq sรณ vou usar a coluna \"assortment\"\n\n# assortment\ndf2['assortment'] = df2['assortment'].apply( lambda x: 'basic' if x == 'a' else 'extra' if x == 'b' else 'extended' )\n\n# Mesma coisa do assortment no \"state holiday\"\n# state holiday\ndf2['state_holiday'] = df2['state_holiday'].apply( lambda x: 'public_holiday' if x == 'a' else 'easter_holiday' if x == 'b' else 'christmas' if x == 'c' else 'regular_day' )\n", "_____no_output_____" ], [ "df2.head().T", "_____no_output_____" ] ], [ [ "# 3.0. STEP 03 - VARIABLES FILTERING", "_____no_output_____" ] ], [ [ "# Antes de qualquer coisa, ao comeรงar um novo passo, copia o dataset do passo anterior e passa a trabalhar com um novo\ndf3 = df2.copy()", "_____no_output_____" ], [ "df3.head()", "_____no_output_____" ] ], [ [ "## 3.1. ROWS FILTERING", "_____no_output_____" ] ], [ [ "# \"open\" != 0 & \"sales\" > 0\n\ndf3 = df3[(df3[\"open\"] != 0) & (df3[\"sales\"] > 0)]", "_____no_output_____" ] ], [ [ "## 3.2. COLUMNS SELECTION", "_____no_output_____" ] ], [ [ "# Vamos \"dropar\" as colunas que nรฃo queremos\n# A \"open\" estรก aqui pois apรณs tirarmos as linhas cujos dados da coluna \"open\" eram 0, sรณ sobraram valores 1, entรฃo รฉ uma coluna 'inรบtil'\ncols_drop = ['customers', 'open', 'promo_interval', 'month_map']\n# Drop รฉ um metodo da classe Pandas (quais colunas e sentido); axis 0 = linhas, axis 1 = colunas\ndf3 = df3.drop( cols_drop, axis=1 )", "_____no_output_____" ], [ "df3.columns", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d06279e2fc0fa31f7f3aa24441e977a8d467c22e
540,612
ipynb
Jupyter Notebook
Model backlog/DenseNet169/133 - DenseNet169 - Classification - Refactor.ipynb
ThinkBricks/APTOS2019BlindnessDetection
e524fd69f83a1252710076c78b6a5236849cd885
[ "MIT" ]
23
2019-09-08T17:19:16.000Z
2022-02-02T16:20:09.000Z
Model backlog/DenseNet169/133 - DenseNet169 - Classification - Refactor.ipynb
ThinkBricks/APTOS2019BlindnessDetection
e524fd69f83a1252710076c78b6a5236849cd885
[ "MIT" ]
1
2020-03-10T18:42:12.000Z
2020-09-18T22:02:38.000Z
Model backlog/DenseNet169/133 - DenseNet169 - Classification - Refactor.ipynb
ThinkBricks/APTOS2019BlindnessDetection
e524fd69f83a1252710076c78b6a5236849cd885
[ "MIT" ]
16
2019-09-21T12:29:59.000Z
2022-03-21T00:42:26.000Z
150.924623
156,324
0.797855
[ [ [ "## Dependencies", "_____no_output_____" ] ], [ [ "import os\nimport cv2\nimport shutil\nimport random\nimport warnings\nimport numpy as np\nimport pandas as pd\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom tensorflow import set_random_seed\nfrom sklearn.utils import class_weight\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import confusion_matrix, cohen_kappa_score\nfrom keras import backend as K\nfrom keras.models import Model\nfrom keras.utils import to_categorical\nfrom keras import optimizers, applications\nfrom keras.preprocessing.image import ImageDataGenerator\nfrom keras.callbacks import EarlyStopping, ReduceLROnPlateau, Callback, LearningRateScheduler\nfrom keras.layers import Dense, Dropout, GlobalAveragePooling2D, Input\n\n# Set seeds to make the experiment more reproducible.\ndef seed_everything(seed=0):\n random.seed(seed)\n os.environ['PYTHONHASHSEED'] = str(seed)\n np.random.seed(seed)\n set_random_seed(0)\nseed = 0\nseed_everything(seed)\n\n%matplotlib inline\nsns.set(style=\"whitegrid\")\nwarnings.filterwarnings(\"ignore\")", "Using TensorFlow backend.\n" ] ], [ [ "## Load data", "_____no_output_____" ] ], [ [ "hold_out_set = pd.read_csv('../input/aptos-data-split/hold-out.csv')\nX_train = hold_out_set[hold_out_set['set'] == 'train']\nX_val = hold_out_set[hold_out_set['set'] == 'validation']\ntest = pd.read_csv('../input/aptos2019-blindness-detection/test.csv')\nprint('Number of train samples: ', X_train.shape[0])\nprint('Number of validation samples: ', X_val.shape[0])\nprint('Number of test samples: ', test.shape[0])\n\n# Preprocecss data\nX_train[\"id_code\"] = X_train[\"id_code\"].apply(lambda x: x + \".png\")\nX_val[\"id_code\"] = X_val[\"id_code\"].apply(lambda x: x + \".png\")\ntest[\"id_code\"] = test[\"id_code\"].apply(lambda x: x + \".png\")\nX_train['diagnosis'] = X_train['diagnosis'].astype('str')\nX_val['diagnosis'] = X_val['diagnosis'].astype('str')\ndisplay(X_train.head())", 
"Number of train samples: 2929\nNumber of validation samples: 733\nNumber of test samples: 1928\n" ] ], [ [ "# Model parameters", "_____no_output_____" ] ], [ [ "# Model parameters\nN_CLASSES = X_train['diagnosis'].nunique()\nBATCH_SIZE = 16\nEPOCHS = 40\nWARMUP_EPOCHS = 5\nLEARNING_RATE = 1e-4\nWARMUP_LEARNING_RATE = 1e-3\nHEIGHT = 320\nWIDTH = 320\nCHANNELS = 3\nES_PATIENCE = 5\nRLROP_PATIENCE = 3\nDECAY_DROP = 0.5", "_____no_output_____" ], [ "def kappa(y_true, y_pred, n_classes=5):\n y_trues = K.cast(K.argmax(y_true), K.floatx())\n y_preds = K.cast(K.argmax(y_pred), K.floatx())\n n_samples = K.cast(K.shape(y_true)[0], K.floatx())\n distance = K.sum(K.abs(y_trues - y_preds))\n max_distance = n_classes - 1\n \n kappa_score = 1 - ((distance**2) / (n_samples * (max_distance**2)))\n\n return kappa_score\n\ndef step_decay(epoch):\n lrate = 30e-5\n if epoch > 3:\n lrate = 15e-5\n if epoch > 7:\n lrate = 7.5e-5\n if epoch > 11:\n lrate = 3e-5\n if epoch > 15:\n lrate = 1e-5\n\n return lrate\n\ndef focal_loss(y_true, y_pred):\n gamma = 2.0\n epsilon = K.epsilon()\n \n pt = y_pred * y_true + (1-y_pred) * (1-y_true)\n pt = K.clip(pt, epsilon, 1-epsilon)\n CE = -K.log(pt)\n FL = K.pow(1-pt, gamma) * CE\n loss = K.sum(FL, axis=1)\n \n return loss", "_____no_output_____" ] ], [ [ "# Pre-procecess images", "_____no_output_____" ] ], [ [ "train_base_path = '../input/aptos2019-blindness-detection/train_images/'\ntest_base_path = '../input/aptos2019-blindness-detection/test_images/'\ntrain_dest_path = 'base_dir/train_images/'\nvalidation_dest_path = 'base_dir/validation_images/'\ntest_dest_path = 'base_dir/test_images/'\n\n# Making sure directories don't exist\nif os.path.exists(train_dest_path):\n shutil.rmtree(train_dest_path)\nif os.path.exists(validation_dest_path):\n shutil.rmtree(validation_dest_path)\nif os.path.exists(test_dest_path):\n shutil.rmtree(test_dest_path)\n \n# Creating train, validation and test 
directories\nos.makedirs(train_dest_path)\nos.makedirs(validation_dest_path)\nos.makedirs(test_dest_path)\n\ndef crop_image(img, tol=7):\n if img.ndim ==2:\n mask = img>tol\n return img[np.ix_(mask.any(1),mask.any(0))]\n elif img.ndim==3:\n gray_img = cv2.cvtColor(img, cv2.COLOR_RGB2GRAY)\n mask = gray_img>tol\n check_shape = img[:,:,0][np.ix_(mask.any(1),mask.any(0))].shape[0]\n if (check_shape == 0): # image is too dark so that we crop out everything,\n return img # return original image\n else:\n img1=img[:,:,0][np.ix_(mask.any(1),mask.any(0))]\n img2=img[:,:,1][np.ix_(mask.any(1),mask.any(0))]\n img3=img[:,:,2][np.ix_(mask.any(1),mask.any(0))]\n img = np.stack([img1,img2,img3],axis=-1)\n \n return img\n\ndef circle_crop(img):\n img = crop_image(img)\n\n height, width, depth = img.shape\n largest_side = np.max((height, width))\n img = cv2.resize(img, (largest_side, largest_side))\n\n height, width, depth = img.shape\n\n x = width//2\n y = height//2\n r = np.amin((x, y))\n\n circle_img = np.zeros((height, width), np.uint8)\n cv2.circle(circle_img, (x, y), int(r), 1, thickness=-1)\n img = cv2.bitwise_and(img, img, mask=circle_img)\n img = crop_image(img)\n\n return img\n \ndef preprocess_image(base_path, save_path, image_id, HEIGHT, WIDTH, sigmaX=10):\n image = cv2.imread(base_path + image_id)\n image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)\n image = circle_crop(image)\n image = cv2.resize(image, (HEIGHT, WIDTH))\n image = cv2.addWeighted(image, 4, cv2.GaussianBlur(image, (0,0), sigmaX), -4 , 128)\n cv2.imwrite(save_path + image_id, image)\n \n# Pre-procecss train set\nfor i, image_id in enumerate(X_train['id_code']):\n preprocess_image(train_base_path, train_dest_path, image_id, HEIGHT, WIDTH)\n \n# Pre-procecss validation set\nfor i, image_id in enumerate(X_val['id_code']):\n preprocess_image(train_base_path, validation_dest_path, image_id, HEIGHT, WIDTH)\n \n# Pre-procecss test set\nfor i, image_id in enumerate(test['id_code']):\n 
preprocess_image(test_base_path, test_dest_path, image_id, HEIGHT, WIDTH)", "_____no_output_____" ] ], [ [ "# Data generator", "_____no_output_____" ] ], [ [ "train_datagen=ImageDataGenerator(rescale=1./255, \n rotation_range=360,\n horizontal_flip=True,\n vertical_flip=True)\n\nvalid_datagen=ImageDataGenerator(rescale=1./255)\n\ntrain_generator=train_datagen.flow_from_dataframe(\n dataframe=X_train,\n directory=train_dest_path,\n x_col=\"id_code\",\n y_col=\"diagnosis\",\n class_mode=\"categorical\",\n batch_size=BATCH_SIZE,\n target_size=(HEIGHT, WIDTH),\n seed=seed)\n\nvalid_generator=valid_datagen.flow_from_dataframe(\n dataframe=X_val,\n directory=validation_dest_path,\n x_col=\"id_code\",\n y_col=\"diagnosis\",\n class_mode=\"categorical\",\n batch_size=BATCH_SIZE,\n target_size=(HEIGHT, WIDTH),\n seed=seed)\n\ntest_generator=valid_datagen.flow_from_dataframe( \n dataframe=test,\n directory=test_dest_path,\n x_col=\"id_code\",\n batch_size=1,\n class_mode=None,\n shuffle=False,\n target_size=(HEIGHT, WIDTH),\n seed=seed)", "Found 2929 validated image filenames belonging to 5 classes.\nFound 733 validated image filenames belonging to 5 classes.\nFound 1928 validated image filenames.\n" ] ], [ [ "# Model", "_____no_output_____" ] ], [ [ "def create_model(input_shape, n_out):\n input_tensor = Input(shape=input_shape)\n base_model = applications.DenseNet169(weights=None, \n include_top=False,\n input_tensor=input_tensor)\n base_model.load_weights('../input/keras-notop/densenet169_weights_tf_dim_ordering_tf_kernels_notop.h5')\n\n x = GlobalAveragePooling2D()(base_model.output)\n x = Dropout(0.5)(x)\n x = Dense(2048, activation='relu')(x)\n x = Dropout(0.5)(x)\n final_output = Dense(n_out, activation='softmax', name='final_output')(x)\n model = Model(input_tensor, final_output)\n \n return model", "_____no_output_____" ] ], [ [ "# Train top layers", "_____no_output_____" ] ], [ [ "model = create_model(input_shape=(HEIGHT, WIDTH, CHANNELS), n_out=N_CLASSES)\n\nfor 
layer in model.layers:\n layer.trainable = False\n\nfor i in range(-5, 0):\n model.layers[i].trainable = True\n \nclass_weights = class_weight.compute_class_weight('balanced', np.unique(X_train['diagnosis'].astype('int').values), X_train['diagnosis'].astype('int').values)\n\nmetric_list = [\"accuracy\", kappa]\noptimizer = optimizers.Adam(lr=WARMUP_LEARNING_RATE)\nmodel.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=metric_list)\nmodel.summary()", "__________________________________________________________________________________________________\nLayer (type) Output Shape Param # Connected to \n==================================================================================================\ninput_1 (InputLayer) (None, 320, 320, 3) 0 \n__________________________________________________________________________________________________\nzero_padding2d_1 (ZeroPadding2D (None, 326, 326, 3) 0 input_1[0][0] \n__________________________________________________________________________________________________\nconv1/conv (Conv2D) (None, 160, 160, 64) 9408 zero_padding2d_1[0][0] \n__________________________________________________________________________________________________\nconv1/bn (BatchNormalization) (None, 160, 160, 64) 256 conv1/conv[0][0] \n__________________________________________________________________________________________________\nconv1/relu (Activation) (None, 160, 160, 64) 0 conv1/bn[0][0] \n__________________________________________________________________________________________________\nzero_padding2d_2 (ZeroPadding2D (None, 162, 162, 64) 0 conv1/relu[0][0] \n__________________________________________________________________________________________________\npool1 (MaxPooling2D) (None, 80, 80, 64) 0 zero_padding2d_2[0][0] \n__________________________________________________________________________________________________\nconv2_block1_0_bn (BatchNormali (None, 80, 80, 64) 256 pool1[0][0] 
\n__________________________________________________________________________________________________\nconv2_block1_0_relu (Activation (None, 80, 80, 64) 0 conv2_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_conv (Conv2D) (None, 80, 80, 128) 8192 conv2_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_relu (Activation (None, 80, 80, 128) 0 conv2_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_concat (Concatenat (None, 80, 80, 96) 0 pool1[0][0] \n conv2_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_bn (BatchNormali (None, 80, 80, 96) 384 conv2_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_relu (Activation (None, 80, 80, 96) 0 conv2_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_conv (Conv2D) (None, 80, 80, 128) 12288 conv2_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_relu (Activation (None, 80, 80, 128) 0 
conv2_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block2_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_concat (Concatenat (None, 80, 80, 128) 0 conv2_block1_concat[0][0] \n conv2_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_relu (Activation (None, 80, 80, 128) 0 conv2_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_conv (Conv2D) (None, 80, 80, 128) 16384 conv2_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_relu (Activation (None, 80, 80, 128) 0 conv2_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_concat (Concatenat (None, 80, 80, 160) 0 conv2_block2_concat[0][0] \n conv2_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_0_bn (BatchNormali (None, 80, 80, 160) 640 conv2_block3_concat[0][0] 
\n__________________________________________________________________________________________________\nconv2_block4_0_relu (Activation (None, 80, 80, 160) 0 conv2_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_conv (Conv2D) (None, 80, 80, 128) 20480 conv2_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_relu (Activation (None, 80, 80, 128) 0 conv2_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_concat (Concatenat (None, 80, 80, 192) 0 conv2_block3_concat[0][0] \n conv2_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_bn (BatchNormali (None, 80, 80, 192) 768 conv2_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_relu (Activation (None, 80, 80, 192) 0 conv2_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_conv (Conv2D) (None, 80, 80, 128) 24576 conv2_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_relu (Activation (None, 80, 80, 
128) 0 conv2_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block5_concat (Concatenat (None, 80, 80, 224) 0 conv2_block4_concat[0][0] \n conv2_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_bn (BatchNormali (None, 80, 80, 224) 896 conv2_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_relu (Activation (None, 80, 80, 224) 0 conv2_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_conv (Conv2D) (None, 80, 80, 128) 28672 conv2_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_relu (Activation (None, 80, 80, 128) 0 conv2_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_concat (Concatenat (None, 80, 80, 256) 0 conv2_block5_concat[0][0] \n conv2_block6_2_conv[0][0] \n__________________________________________________________________________________________________\npool2_bn (BatchNormalization) (None, 80, 80, 256) 1024 conv2_block6_concat[0][0] 
\n__________________________________________________________________________________________________\npool2_relu (Activation) (None, 80, 80, 256) 0 pool2_bn[0][0] \n__________________________________________________________________________________________________\npool2_conv (Conv2D) (None, 80, 80, 128) 32768 pool2_relu[0][0] \n__________________________________________________________________________________________________\npool2_pool (AveragePooling2D) (None, 40, 40, 128) 0 pool2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_bn (BatchNormali (None, 40, 40, 128) 512 pool2_pool[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_relu (Activation (None, 40, 40, 128) 0 conv3_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_conv (Conv2D) (None, 40, 40, 128) 16384 conv3_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_relu (Activation (None, 40, 40, 128) 0 conv3_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_concat (Concatenat (None, 40, 40, 160) 0 pool2_pool[0][0] \n conv3_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_0_bn (BatchNormali (None, 40, 40, 160) 640 conv3_block1_concat[0][0] 
\n__________________________________________________________________________________________________\nconv3_block2_0_relu (Activation (None, 40, 40, 160) 0 conv3_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_conv (Conv2D) (None, 40, 40, 128) 20480 conv3_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_relu (Activation (None, 40, 40, 128) 0 conv3_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_concat (Concatenat (None, 40, 40, 192) 0 conv3_block1_concat[0][0] \n conv3_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_bn (BatchNormali (None, 40, 40, 192) 768 conv3_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_relu (Activation (None, 40, 40, 192) 0 conv3_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_conv (Conv2D) (None, 40, 40, 128) 24576 conv3_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_relu (Activation (None, 40, 40, 
128) 0 conv3_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_concat (Concatenat (None, 40, 40, 224) 0 conv3_block2_concat[0][0] \n conv3_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_bn (BatchNormali (None, 40, 40, 224) 896 conv3_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_relu (Activation (None, 40, 40, 224) 0 conv3_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_conv (Conv2D) (None, 40, 40, 128) 28672 conv3_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_relu (Activation (None, 40, 40, 128) 0 conv3_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block4_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_concat (Concatenat (None, 40, 40, 256) 0 conv3_block3_concat[0][0] \n conv3_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_0_bn (BatchNormali (None, 40, 40, 256) 1024 conv3_block4_concat[0][0] 
\n__________________________________________________________________________________________________\nconv3_block5_0_relu (Activation (None, 40, 40, 256) 0 conv3_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_conv (Conv2D) (None, 40, 40, 128) 32768 conv3_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_relu (Activation (None, 40, 40, 128) 0 conv3_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_concat (Concatenat (None, 40, 40, 288) 0 conv3_block4_concat[0][0] \n conv3_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_bn (BatchNormali (None, 40, 40, 288) 1152 conv3_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_relu (Activation (None, 40, 40, 288) 0 conv3_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_conv (Conv2D) (None, 40, 40, 128) 36864 conv3_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_relu (Activation (None, 40, 
40, 128) 0 conv3_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_concat (Concatenat (None, 40, 40, 320) 0 conv3_block5_concat[0][0] \n conv3_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block7_0_bn (BatchNormali (None, 40, 40, 320) 1280 conv3_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block7_0_relu (Activation (None, 40, 40, 320) 0 conv3_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_conv (Conv2D) (None, 40, 40, 128) 40960 conv3_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_relu (Activation (None, 40, 40, 128) 0 conv3_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_concat (Concatenat (None, 40, 40, 352) 0 conv3_block6_concat[0][0] \n conv3_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_0_bn (BatchNormali (None, 40, 40, 352) 1408 conv3_block7_concat[0][0] 
\n__________________________________________________________________________________________________\nconv3_block8_0_relu (Activation (None, 40, 40, 352) 0 conv3_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_conv (Conv2D) (None, 40, 40, 128) 45056 conv3_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_1_relu (Activation (None, 40, 40, 128) 0 conv3_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block8_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block8_concat (Concatenat (None, 40, 40, 384) 0 conv3_block7_concat[0][0] \n conv3_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block9_0_bn (BatchNormali (None, 40, 40, 384) 1536 conv3_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block9_0_relu (Activation (None, 40, 40, 384) 0 conv3_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_conv (Conv2D) (None, 40, 40, 128) 49152 conv3_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block9_1_relu (Activation (None, 40, 
40, 128) 0 conv3_block9_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block9_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block9_concat (Concatenat (None, 40, 40, 416) 0 conv3_block8_concat[0][0] \n conv3_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block10_0_bn (BatchNormal (None, 40, 40, 416) 1664 conv3_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block10_0_relu (Activatio (None, 40, 40, 416) 0 conv3_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_conv (Conv2D) (None, 40, 40, 128) 53248 conv3_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block10_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block10_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block10_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block10_concat (Concatena (None, 40, 40, 448) 0 conv3_block9_concat[0][0] \n conv3_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block11_0_bn (BatchNormal (None, 40, 40, 448) 1792 conv3_block10_concat[0][0] 
\n__________________________________________________________________________________________________\nconv3_block11_0_relu (Activatio (None, 40, 40, 448) 0 conv3_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_conv (Conv2D) (None, 40, 40, 128) 57344 conv3_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block11_1_relu (Activatio (None, 40, 40, 128) 0 conv3_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block11_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block11_concat (Concatena (None, 40, 40, 480) 0 conv3_block10_concat[0][0] \n conv3_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block12_0_bn (BatchNormal (None, 40, 40, 480) 1920 conv3_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block12_0_relu (Activatio (None, 40, 40, 480) 0 conv3_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_conv (Conv2D) (None, 40, 40, 128) 61440 conv3_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_bn (BatchNormal (None, 40, 40, 128) 512 conv3_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block12_1_relu (Activatio 
(None, 40, 40, 128) 0 conv3_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block12_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block12_concat (Concatena (None, 40, 40, 512) 0 conv3_block11_concat[0][0] \n conv3_block12_2_conv[0][0] \n__________________________________________________________________________________________________\npool3_bn (BatchNormalization) (None, 40, 40, 512) 2048 conv3_block12_concat[0][0] \n__________________________________________________________________________________________________\npool3_relu (Activation) (None, 40, 40, 512) 0 pool3_bn[0][0] \n__________________________________________________________________________________________________\npool3_conv (Conv2D) (None, 40, 40, 256) 131072 pool3_relu[0][0] \n__________________________________________________________________________________________________\npool3_pool (AveragePooling2D) (None, 20, 20, 256) 0 pool3_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block1_0_bn (BatchNormali (None, 20, 20, 256) 1024 pool3_pool[0][0] \n__________________________________________________________________________________________________\nconv4_block1_0_relu (Activation (None, 20, 20, 256) 0 conv4_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_conv (Conv2D) (None, 20, 20, 128) 32768 conv4_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block1_1_relu (Activation (None, 20, 20, 
128) 0 conv4_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block1_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block1_concat (Concatenat (None, 20, 20, 288) 0 pool3_pool[0][0] \n conv4_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block2_0_bn (BatchNormali (None, 20, 20, 288) 1152 conv4_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block2_0_relu (Activation (None, 20, 20, 288) 0 conv4_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_conv (Conv2D) (None, 20, 20, 128) 36864 conv4_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block2_1_relu (Activation (None, 20, 20, 128) 0 conv4_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block2_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block2_concat (Concatenat (None, 20, 20, 320) 0 conv4_block1_concat[0][0] \n conv4_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block3_0_bn (BatchNormali (None, 20, 20, 320) 1280 conv4_block2_concat[0][0] 
\n__________________________________________________________________________________________________\nconv4_block3_0_relu (Activation (None, 20, 20, 320) 0 conv4_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_conv (Conv2D) (None, 20, 20, 128) 40960 conv4_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block3_1_relu (Activation (None, 20, 20, 128) 0 conv4_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block3_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block3_concat (Concatenat (None, 20, 20, 352) 0 conv4_block2_concat[0][0] \n conv4_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block4_0_bn (BatchNormali (None, 20, 20, 352) 1408 conv4_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block4_0_relu (Activation (None, 20, 20, 352) 0 conv4_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_conv (Conv2D) (None, 20, 20, 128) 45056 conv4_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_bn (BatchNormali (None, 20, 20, 128) 512 conv4_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block4_1_relu (Activation (None, 20, 
20, 128) 0 conv4_block4_1_bn[0][0] 
__________________________________________________________________________________________________
... (repetitive DenseNet rows elided: dense block 4, conv4_block4 through conv4_block32, repeats the
same layer pattern at 20x20 resolution -- BatchNormalization -> Activation(relu) -> 1x1 Conv2D(128)
-> BatchNormalization -> Activation(relu) -> 3x3 Conv2D(32) -> Concatenate -- growing the channel
count by 32 per block, from 384 up to 1280; the transition layers pool4_bn, pool4_relu,
pool4_conv (1x1 Conv2D, 1280 -> 640 channels) and pool4_pool (AveragePooling2D) then halve the
feature map to 10x10x640; dense block 5, conv5_block1 through conv5_block3, repeats the same
block pattern at 10x10 resolution) ...
__________________________________________________________________________________________________
conv5_block3_1_relu (Activation (None, 10, 
10, 128) 0 conv5_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block3_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block3_concat (Concatenat (None, 10, 10, 736) 0 conv5_block2_concat[0][0] \n conv5_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block4_0_bn (BatchNormali (None, 10, 10, 736) 2944 conv5_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block4_0_relu (Activation (None, 10, 10, 736) 0 conv5_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_conv (Conv2D) (None, 10, 10, 128) 94208 conv5_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_relu (Activation (None, 10, 10, 128) 0 conv5_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block4_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block4_concat (Concatenat (None, 10, 10, 768) 0 conv5_block3_concat[0][0] \n conv5_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block5_0_bn (BatchNormali (None, 10, 10, 768) 3072 conv5_block4_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block5_0_relu (Activation (None, 10, 10, 768) 0 conv5_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_conv (Conv2D) (None, 10, 10, 128) 98304 conv5_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_relu (Activation (None, 10, 10, 128) 0 conv5_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block5_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block5_concat (Concatenat (None, 10, 10, 800) 0 conv5_block4_concat[0][0] \n conv5_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block6_0_bn (BatchNormali (None, 10, 10, 800) 3200 conv5_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block6_0_relu (Activation (None, 10, 10, 800) 0 conv5_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_conv (Conv2D) (None, 10, 10, 128) 102400 conv5_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_relu (Activation (None, 10, 
10, 128) 0 conv5_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block6_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block6_concat (Concatenat (None, 10, 10, 832) 0 conv5_block5_concat[0][0] \n conv5_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block7_0_bn (BatchNormali (None, 10, 10, 832) 3328 conv5_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block7_0_relu (Activation (None, 10, 10, 832) 0 conv5_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_conv (Conv2D) (None, 10, 10, 128) 106496 conv5_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_relu (Activation (None, 10, 10, 128) 0 conv5_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block7_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block7_concat (Concatenat (None, 10, 10, 864) 0 conv5_block6_concat[0][0] \n conv5_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block8_0_bn (BatchNormali (None, 10, 10, 864) 3456 conv5_block7_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block8_0_relu (Activation (None, 10, 10, 864) 0 conv5_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_conv (Conv2D) (None, 10, 10, 128) 110592 conv5_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_relu (Activation (None, 10, 10, 128) 0 conv5_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block8_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block8_concat (Concatenat (None, 10, 10, 896) 0 conv5_block7_concat[0][0] \n conv5_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block9_0_bn (BatchNormali (None, 10, 10, 896) 3584 conv5_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block9_0_relu (Activation (None, 10, 10, 896) 0 conv5_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_conv (Conv2D) (None, 10, 10, 128) 114688 conv5_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_relu (Activation (None, 10, 
10, 128) 0 conv5_block9_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block9_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block9_concat (Concatenat (None, 10, 10, 928) 0 conv5_block8_concat[0][0] \n conv5_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block10_0_bn (BatchNormal (None, 10, 10, 928) 3712 conv5_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block10_0_relu (Activatio (None, 10, 10, 928) 0 conv5_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_conv (Conv2D) (None, 10, 10, 128) 118784 conv5_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block10_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block10_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block10_concat (Concatena (None, 10, 10, 960) 0 conv5_block9_concat[0][0] \n conv5_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block11_0_bn (BatchNormal (None, 10, 10, 960) 3840 conv5_block10_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block11_0_relu (Activatio (None, 10, 10, 960) 0 conv5_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_conv (Conv2D) (None, 10, 10, 128) 122880 conv5_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block11_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block11_concat (Concatena (None, 10, 10, 992) 0 conv5_block10_concat[0][0] \n conv5_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block12_0_bn (BatchNormal (None, 10, 10, 992) 3968 conv5_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block12_0_relu (Activatio (None, 10, 10, 992) 0 conv5_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_conv (Conv2D) (None, 10, 10, 128) 126976 conv5_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block12_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block12_concat (Concatena (None, 10, 10, 1024) 0 conv5_block11_concat[0][0] \n conv5_block12_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block13_0_bn (BatchNormal (None, 10, 10, 1024) 4096 conv5_block12_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block13_0_relu (Activatio (None, 10, 10, 1024) 0 conv5_block13_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_conv (Conv2D) (None, 10, 10, 128) 131072 conv5_block13_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block13_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block13_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block13_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block13_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block13_concat (Concatena (None, 10, 10, 1056) 0 conv5_block12_concat[0][0] \n conv5_block13_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block14_0_bn (BatchNormal (None, 10, 10, 1056) 4224 conv5_block13_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block14_0_relu (Activatio (None, 10, 10, 1056) 0 conv5_block14_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_conv (Conv2D) (None, 10, 10, 128) 135168 conv5_block14_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block14_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block14_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block14_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block14_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block14_concat (Concatena (None, 10, 10, 1088) 0 conv5_block13_concat[0][0] \n conv5_block14_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block15_0_bn (BatchNormal (None, 10, 10, 1088) 4352 conv5_block14_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block15_0_relu (Activatio (None, 10, 10, 1088) 0 conv5_block15_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_conv (Conv2D) (None, 10, 10, 128) 139264 conv5_block15_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block15_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block15_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block15_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block15_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block15_concat (Concatena (None, 10, 10, 1120) 0 conv5_block14_concat[0][0] \n conv5_block15_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block16_0_bn (BatchNormal (None, 10, 10, 1120) 4480 conv5_block15_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block16_0_relu (Activatio (None, 10, 10, 1120) 0 conv5_block16_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_conv (Conv2D) (None, 10, 10, 128) 143360 conv5_block16_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block16_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block16_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block16_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block16_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block16_concat (Concatena (None, 10, 10, 1152) 0 conv5_block15_concat[0][0] \n conv5_block16_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block17_0_bn (BatchNormal (None, 10, 10, 1152) 4608 conv5_block16_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block17_0_relu (Activatio (None, 10, 10, 1152) 0 conv5_block17_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_conv (Conv2D) (None, 10, 10, 128) 147456 conv5_block17_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block17_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block17_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block17_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block17_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block17_concat (Concatena (None, 10, 10, 1184) 0 conv5_block16_concat[0][0] \n conv5_block17_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block18_0_bn (BatchNormal (None, 10, 10, 1184) 4736 conv5_block17_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block18_0_relu (Activatio (None, 10, 10, 1184) 0 conv5_block18_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_conv (Conv2D) (None, 10, 10, 128) 151552 conv5_block18_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block18_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block18_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block18_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block18_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block18_concat (Concatena (None, 10, 10, 1216) 0 conv5_block17_concat[0][0] \n conv5_block18_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block19_0_bn (BatchNormal (None, 10, 10, 1216) 4864 conv5_block18_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block19_0_relu (Activatio (None, 10, 10, 1216) 0 conv5_block19_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_conv (Conv2D) (None, 10, 10, 128) 155648 conv5_block19_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block19_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block19_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block19_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block19_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block19_concat (Concatena (None, 10, 10, 1248) 0 conv5_block18_concat[0][0] \n conv5_block19_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block20_0_bn (BatchNormal (None, 10, 10, 1248) 4992 conv5_block19_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block20_0_relu (Activatio (None, 10, 10, 1248) 0 conv5_block20_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_conv (Conv2D) (None, 10, 10, 128) 159744 conv5_block20_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block20_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block20_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block20_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block20_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block20_concat (Concatena (None, 10, 10, 1280) 0 conv5_block19_concat[0][0] \n conv5_block20_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block21_0_bn (BatchNormal (None, 10, 10, 1280) 5120 conv5_block20_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block21_0_relu (Activatio (None, 10, 10, 1280) 0 conv5_block21_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_conv (Conv2D) (None, 10, 10, 128) 163840 conv5_block21_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block21_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block21_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block21_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block21_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block21_concat (Concatena (None, 10, 10, 1312) 0 conv5_block20_concat[0][0] \n conv5_block21_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block22_0_bn (BatchNormal (None, 10, 10, 1312) 5248 conv5_block21_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block22_0_relu (Activatio (None, 10, 10, 1312) 0 conv5_block22_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_conv (Conv2D) (None, 10, 10, 128) 167936 conv5_block22_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block22_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block22_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block22_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block22_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block22_concat (Concatena (None, 10, 10, 1344) 0 conv5_block21_concat[0][0] \n conv5_block22_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block23_0_bn (BatchNormal (None, 10, 10, 1344) 5376 conv5_block22_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block23_0_relu (Activatio (None, 10, 10, 1344) 0 conv5_block23_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_conv (Conv2D) (None, 10, 10, 128) 172032 conv5_block23_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block23_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block23_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block23_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block23_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block23_concat (Concatena (None, 10, 10, 1376) 0 conv5_block22_concat[0][0] \n conv5_block23_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block24_0_bn (BatchNormal (None, 10, 10, 1376) 5504 conv5_block23_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block24_0_relu (Activatio (None, 10, 10, 1376) 0 conv5_block24_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_conv (Conv2D) (None, 10, 10, 128) 176128 conv5_block24_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block24_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block24_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block24_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block24_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block24_concat (Concatena (None, 10, 10, 1408) 0 conv5_block23_concat[0][0] \n conv5_block24_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block25_0_bn (BatchNormal (None, 10, 10, 1408) 5632 conv5_block24_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block25_0_relu (Activatio (None, 10, 10, 1408) 0 conv5_block25_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_conv (Conv2D) (None, 10, 10, 128) 180224 conv5_block25_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block25_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block25_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block25_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block25_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block25_concat (Concatena (None, 10, 10, 1440) 0 conv5_block24_concat[0][0] \n conv5_block25_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block26_0_bn (BatchNormal (None, 10, 10, 1440) 5760 conv5_block25_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block26_0_relu (Activatio (None, 10, 10, 1440) 0 conv5_block26_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_conv (Conv2D) (None, 10, 10, 128) 184320 conv5_block26_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block26_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block26_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block26_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block26_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block26_concat (Concatena (None, 10, 10, 1472) 0 conv5_block25_concat[0][0] \n conv5_block26_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block27_0_bn (BatchNormal (None, 10, 10, 1472) 5888 conv5_block26_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block27_0_relu (Activatio (None, 10, 10, 1472) 0 conv5_block27_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_conv (Conv2D) (None, 10, 10, 128) 188416 conv5_block27_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block27_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block27_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block27_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block27_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block27_concat (Concatena (None, 10, 10, 1504) 0 conv5_block26_concat[0][0] \n conv5_block27_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block28_0_bn (BatchNormal (None, 10, 10, 1504) 6016 conv5_block27_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block28_0_relu (Activatio (None, 10, 10, 1504) 0 conv5_block28_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_conv (Conv2D) (None, 10, 10, 128) 192512 conv5_block28_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block28_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block28_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block28_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block28_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block28_concat (Concatena (None, 10, 10, 1536) 0 conv5_block27_concat[0][0] \n conv5_block28_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block29_0_bn (BatchNormal (None, 10, 10, 1536) 6144 conv5_block28_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block29_0_relu (Activatio (None, 10, 10, 1536) 0 conv5_block29_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_conv (Conv2D) (None, 10, 10, 128) 196608 conv5_block29_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block29_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block29_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block29_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block29_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block29_concat (Concatena (None, 10, 10, 1568) 0 conv5_block28_concat[0][0] \n conv5_block29_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block30_0_bn (BatchNormal (None, 10, 10, 1568) 6272 conv5_block29_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block30_0_relu (Activatio (None, 10, 10, 1568) 0 conv5_block30_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_conv (Conv2D) (None, 10, 10, 128) 200704 conv5_block30_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block30_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block30_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block30_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block30_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block30_concat (Concatena (None, 10, 10, 1600) 0 conv5_block29_concat[0][0] \n conv5_block30_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block31_0_bn (BatchNormal (None, 10, 10, 1600) 6400 conv5_block30_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block31_0_relu (Activatio (None, 10, 10, 1600) 0 conv5_block31_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_conv (Conv2D) (None, 10, 10, 128) 204800 conv5_block31_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block31_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block31_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block31_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_concat (Concatena (None, 10, 10, 1632) 0 conv5_block30_concat[0][0] \n conv5_block31_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_0_bn (BatchNormal (None, 10, 10, 1632) 6528 conv5_block31_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block32_0_relu (Activatio (None, 10, 10, 1632) 0 conv5_block32_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_conv (Conv2D) (None, 10, 10, 128) 208896 conv5_block32_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block32_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block32_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block32_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_concat (Concatena (None, 10, 10, 1664) 0 conv5_block31_concat[0][0] \n conv5_block32_2_conv[0][0] \n__________________________________________________________________________________________________\nbn (BatchNormalization) (None, 10, 10, 1664) 6656 conv5_block32_concat[0][0] \n__________________________________________________________________________________________________\nrelu (Activation) (None, 10, 10, 1664) 0 bn[0][0] \n__________________________________________________________________________________________________\nglobal_average_pooling2d_1 (Glo (None, 1664) 0 relu[0][0] \n__________________________________________________________________________________________________\ndropout_1 (Dropout) (None, 1664) 0 global_average_pooling2d_1[0][0] \n__________________________________________________________________________________________________\ndense_1 (Dense) (None, 2048) 3409920 dropout_1[0][0] 
\n__________________________________________________________________________________________________\ndropout_2 (Dropout) (None, 2048) 0 dense_1[0][0] \n__________________________________________________________________________________________________\nfinal_output (Dense) (None, 5) 10245 dropout_2[0][0] \n==================================================================================================\nTotal params: 16,063,045\nTrainable params: 3,420,165\nNon-trainable params: 12,642,880\n__________________________________________________________________________________________________\n" ], [ "STEP_SIZE_TRAIN = train_generator.n//train_generator.batch_size\nSTEP_SIZE_VALID = valid_generator.n//valid_generator.batch_size\n\nhistory_warmup = model.fit_generator(generator=train_generator,\n steps_per_epoch=STEP_SIZE_TRAIN,\n validation_data=valid_generator,\n validation_steps=STEP_SIZE_VALID,\n epochs=WARMUP_EPOCHS,\n class_weight=class_weights,\n verbose=1).history", "Epoch 1/5\n183/183 [==============================] - 81s 445ms/step - loss: 1.3357 - acc: 0.5731 - kappa: 0.3848 - val_loss: 1.0849 - val_acc: 0.5083 - val_kappa: -0.2198\nEpoch 2/5\n183/183 [==============================] - 68s 373ms/step - loss: 0.9705 - acc: 0.6499 - kappa: 0.6185 - val_loss: 1.0448 - val_acc: 0.5760 - val_kappa: 0.1622\nEpoch 3/5\n183/183 [==============================] - 69s 379ms/step - loss: 0.9260 - acc: 0.6571 - kappa: 0.6398 - val_loss: 1.2030 - val_acc: 0.4881 - val_kappa: -0.4510\nEpoch 4/5\n183/183 [==============================] - 69s 378ms/step - loss: 0.8650 - acc: 0.6837 - kappa: 0.6950 - val_loss: 1.0301 - val_acc: 0.5425 - val_kappa: 0.0034\nEpoch 5/5\n183/183 [==============================] - 69s 377ms/step - loss: 0.8863 - acc: 0.6640 - kappa: 0.6651 - val_loss: 0.9225 - val_acc: 0.6444 - val_kappa: 0.5296\n" ] ], [ [ "# Fine-tune the complete model", "_____no_output_____" ] ], [ [ "for layer in model.layers:\n layer.trainable = True\n\n# lrstep = 
LearningRateScheduler(step_decay)\nes = EarlyStopping(monitor='val_loss', mode='min', patience=ES_PATIENCE, restore_best_weights=True, verbose=1)\nrlrop = ReduceLROnPlateau(monitor='val_loss', mode='min', patience=RLROP_PATIENCE, factor=DECAY_DROP, min_lr=1e-6, verbose=1)\n\ncallback_list = [es, rlrop]\noptimizer = optimizers.Adam(lr=LEARNING_RATE)\nmodel.compile(optimizer=optimizer, loss='categorical_crossentropy', metrics=metric_list)\nmodel.summary()", "__________________________________________________________________________________________________\nLayer (type) Output Shape Param # Connected to \n==================================================================================================\ninput_1 (InputLayer) (None, 320, 320, 3) 0 \n__________________________________________________________________________________________________\nzero_padding2d_1 (ZeroPadding2D (None, 326, 326, 3) 0 input_1[0][0] \n__________________________________________________________________________________________________\nconv1/conv (Conv2D) (None, 160, 160, 64) 9408 zero_padding2d_1[0][0] \n__________________________________________________________________________________________________\nconv1/bn (BatchNormalization) (None, 160, 160, 64) 256 conv1/conv[0][0] \n__________________________________________________________________________________________________\nconv1/relu (Activation) (None, 160, 160, 64) 0 conv1/bn[0][0] \n__________________________________________________________________________________________________\nzero_padding2d_2 (ZeroPadding2D (None, 162, 162, 64) 0 conv1/relu[0][0] \n__________________________________________________________________________________________________\npool1 (MaxPooling2D) (None, 80, 80, 64) 0 zero_padding2d_2[0][0] \n__________________________________________________________________________________________________\nconv2_block1_0_bn (BatchNormali (None, 80, 80, 64) 256 pool1[0][0] 
\n__________________________________________________________________________________________________\nconv2_block1_0_relu (Activation (None, 80, 80, 64) 0 conv2_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_conv (Conv2D) (None, 80, 80, 128) 8192 conv2_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block1_1_relu (Activation (None, 80, 80, 128) 0 conv2_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block1_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block1_concat (Concatenat (None, 80, 80, 96) 0 pool1[0][0] \n conv2_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_bn (BatchNormali (None, 80, 80, 96) 384 conv2_block1_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block2_0_relu (Activation (None, 80, 80, 96) 0 conv2_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_conv (Conv2D) (None, 80, 80, 128) 12288 conv2_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block2_1_relu (Activation (None, 80, 80, 128) 0 
conv2_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block2_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block2_concat (Concatenat (None, 80, 80, 128) 0 conv2_block1_concat[0][0] \n conv2_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block3_0_relu (Activation (None, 80, 80, 128) 0 conv2_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_conv (Conv2D) (None, 80, 80, 128) 16384 conv2_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block3_1_relu (Activation (None, 80, 80, 128) 0 conv2_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block3_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block3_concat (Concatenat (None, 80, 80, 160) 0 conv2_block2_concat[0][0] \n conv2_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_0_bn (BatchNormali (None, 80, 80, 160) 640 conv2_block3_concat[0][0] 
\n__________________________________________________________________________________________________\nconv2_block4_0_relu (Activation (None, 80, 80, 160) 0 conv2_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_conv (Conv2D) (None, 80, 80, 128) 20480 conv2_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block4_1_relu (Activation (None, 80, 80, 128) 0 conv2_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block4_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block4_concat (Concatenat (None, 80, 80, 192) 0 conv2_block3_concat[0][0] \n conv2_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_bn (BatchNormali (None, 80, 80, 192) 768 conv2_block4_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block5_0_relu (Activation (None, 80, 80, 192) 0 conv2_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_conv (Conv2D) (None, 80, 80, 128) 24576 conv2_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block5_1_relu (Activation (None, 80, 80, 
128) 0 conv2_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block5_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block5_concat (Concatenat (None, 80, 80, 224) 0 conv2_block4_concat[0][0] \n conv2_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_bn (BatchNormali (None, 80, 80, 224) 896 conv2_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv2_block6_0_relu (Activation (None, 80, 80, 224) 0 conv2_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_conv (Conv2D) (None, 80, 80, 128) 28672 conv2_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_bn (BatchNormali (None, 80, 80, 128) 512 conv2_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv2_block6_1_relu (Activation (None, 80, 80, 128) 0 conv2_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv2_block6_2_conv (Conv2D) (None, 80, 80, 32) 36864 conv2_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv2_block6_concat (Concatenat (None, 80, 80, 256) 0 conv2_block5_concat[0][0] \n conv2_block6_2_conv[0][0] \n__________________________________________________________________________________________________\npool2_bn (BatchNormalization) (None, 80, 80, 256) 1024 conv2_block6_concat[0][0] 
\n__________________________________________________________________________________________________\npool2_relu (Activation) (None, 80, 80, 256) 0 pool2_bn[0][0] \n__________________________________________________________________________________________________\npool2_conv (Conv2D) (None, 80, 80, 128) 32768 pool2_relu[0][0] \n__________________________________________________________________________________________________\npool2_pool (AveragePooling2D) (None, 40, 40, 128) 0 pool2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_bn (BatchNormali (None, 40, 40, 128) 512 pool2_pool[0][0] \n__________________________________________________________________________________________________\nconv3_block1_0_relu (Activation (None, 40, 40, 128) 0 conv3_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_conv (Conv2D) (None, 40, 40, 128) 16384 conv3_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block1_1_relu (Activation (None, 40, 40, 128) 0 conv3_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block1_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block1_concat (Concatenat (None, 40, 40, 160) 0 pool2_pool[0][0] \n conv3_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_0_bn (BatchNormali (None, 40, 40, 160) 640 conv3_block1_concat[0][0] 
\n__________________________________________________________________________________________________\nconv3_block2_0_relu (Activation (None, 40, 40, 160) 0 conv3_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_conv (Conv2D) (None, 40, 40, 128) 20480 conv3_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block2_1_relu (Activation (None, 40, 40, 128) 0 conv3_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block2_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block2_concat (Concatenat (None, 40, 40, 192) 0 conv3_block1_concat[0][0] \n conv3_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_bn (BatchNormali (None, 40, 40, 192) 768 conv3_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block3_0_relu (Activation (None, 40, 40, 192) 0 conv3_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_conv (Conv2D) (None, 40, 40, 128) 24576 conv3_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block3_1_relu (Activation (None, 40, 40, 
128) 0 conv3_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block3_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block3_concat (Concatenat (None, 40, 40, 224) 0 conv3_block2_concat[0][0] \n conv3_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_bn (BatchNormali (None, 40, 40, 224) 896 conv3_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block4_0_relu (Activation (None, 40, 40, 224) 0 conv3_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_conv (Conv2D) (None, 40, 40, 128) 28672 conv3_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block4_1_relu (Activation (None, 40, 40, 128) 0 conv3_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block4_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block4_concat (Concatenat (None, 40, 40, 256) 0 conv3_block3_concat[0][0] \n conv3_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_0_bn (BatchNormali (None, 40, 40, 256) 1024 conv3_block4_concat[0][0] 
\n__________________________________________________________________________________________________\nconv3_block5_0_relu (Activation (None, 40, 40, 256) 0 conv3_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_conv (Conv2D) (None, 40, 40, 128) 32768 conv3_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block5_1_relu (Activation (None, 40, 40, 128) 0 conv3_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block5_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block5_concat (Concatenat (None, 40, 40, 288) 0 conv3_block4_concat[0][0] \n conv3_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_bn (BatchNormali (None, 40, 40, 288) 1152 conv3_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block6_0_relu (Activation (None, 40, 40, 288) 0 conv3_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_conv (Conv2D) (None, 40, 40, 128) 36864 conv3_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block6_1_relu (Activation (None, 40, 
40, 128) 0 conv3_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block6_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block6_concat (Concatenat (None, 40, 40, 320) 0 conv3_block5_concat[0][0] \n conv3_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block7_0_bn (BatchNormali (None, 40, 40, 320) 1280 conv3_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv3_block7_0_relu (Activation (None, 40, 40, 320) 0 conv3_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_conv (Conv2D) (None, 40, 40, 128) 40960 conv3_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_bn (BatchNormali (None, 40, 40, 128) 512 conv3_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block7_1_relu (Activation (None, 40, 40, 128) 0 conv3_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv3_block7_2_conv (Conv2D) (None, 40, 40, 32) 36864 conv3_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv3_block7_concat (Concatenat (None, 40, 40, 352) 0 conv3_block6_concat[0][0] \n conv3_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv3_block8_0_bn (BatchNormali (None, 40, 40, 352) 1408 conv3_block7_concat[0][0] 
\n__________________________________________________________________________________________________\n[... model.summary() output truncated: DenseNet dense blocks conv3_block8 through conv4_block24 omitted — each block repeats the same pattern (BatchNormalization → ReLU → 1×1 Conv2D bottleneck → 3×3 Conv2D → Concatenate), growing the feature maps by 32 channels per block ...]\n__________________________________________________________________________________________________
\n__________________________________________________________________________________________________\nconv4_block24_0_relu (Activatio (None, 20, 20, 992) 0 conv4_block24_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block24_1_conv (Conv2D) (None, 20, 20, 128) 126976 conv4_block24_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block24_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block24_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block24_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block24_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block24_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block24_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block24_concat (Concatena (None, 20, 20, 1024) 0 conv4_block23_concat[0][0] \n conv4_block24_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block25_0_bn (BatchNormal (None, 20, 20, 1024) 4096 conv4_block24_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block25_0_relu (Activatio (None, 20, 20, 1024) 0 conv4_block25_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block25_1_conv (Conv2D) (None, 20, 20, 128) 131072 conv4_block25_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block25_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block25_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block25_1_relu 
(Activatio (None, 20, 20, 128) 0 conv4_block25_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block25_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block25_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block25_concat (Concatena (None, 20, 20, 1056) 0 conv4_block24_concat[0][0] \n conv4_block25_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block26_0_bn (BatchNormal (None, 20, 20, 1056) 4224 conv4_block25_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block26_0_relu (Activatio (None, 20, 20, 1056) 0 conv4_block26_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block26_1_conv (Conv2D) (None, 20, 20, 128) 135168 conv4_block26_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block26_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block26_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block26_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block26_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block26_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block26_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block26_concat (Concatena (None, 20, 20, 1088) 0 conv4_block25_concat[0][0] \n conv4_block26_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block27_0_bn (BatchNormal (None, 20, 20, 1088) 4352 conv4_block26_concat[0][0] 
\n__________________________________________________________________________________________________\nconv4_block27_0_relu (Activatio (None, 20, 20, 1088) 0 conv4_block27_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block27_1_conv (Conv2D) (None, 20, 20, 128) 139264 conv4_block27_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block27_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block27_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block27_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block27_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block27_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block27_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block27_concat (Concatena (None, 20, 20, 1120) 0 conv4_block26_concat[0][0] \n conv4_block27_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block28_0_bn (BatchNormal (None, 20, 20, 1120) 4480 conv4_block27_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block28_0_relu (Activatio (None, 20, 20, 1120) 0 conv4_block28_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block28_1_conv (Conv2D) (None, 20, 20, 128) 143360 conv4_block28_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block28_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block28_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block28_1_relu 
(Activatio (None, 20, 20, 128) 0 conv4_block28_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block28_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block28_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block28_concat (Concatena (None, 20, 20, 1152) 0 conv4_block27_concat[0][0] \n conv4_block28_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block29_0_bn (BatchNormal (None, 20, 20, 1152) 4608 conv4_block28_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block29_0_relu (Activatio (None, 20, 20, 1152) 0 conv4_block29_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block29_1_conv (Conv2D) (None, 20, 20, 128) 147456 conv4_block29_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block29_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block29_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block29_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block29_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block29_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block29_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block29_concat (Concatena (None, 20, 20, 1184) 0 conv4_block28_concat[0][0] \n conv4_block29_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block30_0_bn (BatchNormal (None, 20, 20, 1184) 4736 conv4_block29_concat[0][0] 
\n__________________________________________________________________________________________________\nconv4_block30_0_relu (Activatio (None, 20, 20, 1184) 0 conv4_block30_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block30_1_conv (Conv2D) (None, 20, 20, 128) 151552 conv4_block30_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block30_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block30_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block30_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block30_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block30_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block30_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block30_concat (Concatena (None, 20, 20, 1216) 0 conv4_block29_concat[0][0] \n conv4_block30_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block31_0_bn (BatchNormal (None, 20, 20, 1216) 4864 conv4_block30_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block31_0_relu (Activatio (None, 20, 20, 1216) 0 conv4_block31_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block31_1_conv (Conv2D) (None, 20, 20, 128) 155648 conv4_block31_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block31_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block31_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block31_1_relu 
(Activatio (None, 20, 20, 128) 0 conv4_block31_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block31_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block31_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block31_concat (Concatena (None, 20, 20, 1248) 0 conv4_block30_concat[0][0] \n conv4_block31_2_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block32_0_bn (BatchNormal (None, 20, 20, 1248) 4992 conv4_block31_concat[0][0] \n__________________________________________________________________________________________________\nconv4_block32_0_relu (Activatio (None, 20, 20, 1248) 0 conv4_block32_0_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block32_1_conv (Conv2D) (None, 20, 20, 128) 159744 conv4_block32_0_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block32_1_bn (BatchNormal (None, 20, 20, 128) 512 conv4_block32_1_conv[0][0] \n__________________________________________________________________________________________________\nconv4_block32_1_relu (Activatio (None, 20, 20, 128) 0 conv4_block32_1_bn[0][0] \n__________________________________________________________________________________________________\nconv4_block32_2_conv (Conv2D) (None, 20, 20, 32) 36864 conv4_block32_1_relu[0][0] \n__________________________________________________________________________________________________\nconv4_block32_concat (Concatena (None, 20, 20, 1280) 0 conv4_block31_concat[0][0] \n conv4_block32_2_conv[0][0] \n__________________________________________________________________________________________________\npool4_bn (BatchNormalization) (None, 20, 20, 1280) 5120 conv4_block32_concat[0][0] 
\n__________________________________________________________________________________________________\npool4_relu (Activation) (None, 20, 20, 1280) 0 pool4_bn[0][0] \n__________________________________________________________________________________________________\npool4_conv (Conv2D) (None, 20, 20, 640) 819200 pool4_relu[0][0] \n__________________________________________________________________________________________________\npool4_pool (AveragePooling2D) (None, 10, 10, 640) 0 pool4_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block1_0_bn (BatchNormali (None, 10, 10, 640) 2560 pool4_pool[0][0] \n__________________________________________________________________________________________________\nconv5_block1_0_relu (Activation (None, 10, 10, 640) 0 conv5_block1_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block1_1_conv (Conv2D) (None, 10, 10, 128) 81920 conv5_block1_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block1_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block1_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block1_1_relu (Activation (None, 10, 10, 128) 0 conv5_block1_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block1_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block1_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block1_concat (Concatenat (None, 10, 10, 672) 0 pool4_pool[0][0] \n conv5_block1_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block2_0_bn (BatchNormali (None, 10, 10, 672) 2688 conv5_block1_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block2_0_relu (Activation (None, 10, 10, 672) 0 conv5_block2_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block2_1_conv (Conv2D) (None, 10, 10, 128) 86016 conv5_block2_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block2_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block2_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block2_1_relu (Activation (None, 10, 10, 128) 0 conv5_block2_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block2_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block2_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block2_concat (Concatenat (None, 10, 10, 704) 0 conv5_block1_concat[0][0] \n conv5_block2_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block3_0_bn (BatchNormali (None, 10, 10, 704) 2816 conv5_block2_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block3_0_relu (Activation (None, 10, 10, 704) 0 conv5_block3_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block3_1_conv (Conv2D) (None, 10, 10, 128) 90112 conv5_block3_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block3_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block3_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block3_1_relu (Activation (None, 10, 
10, 128) 0 conv5_block3_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block3_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block3_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block3_concat (Concatenat (None, 10, 10, 736) 0 conv5_block2_concat[0][0] \n conv5_block3_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block4_0_bn (BatchNormali (None, 10, 10, 736) 2944 conv5_block3_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block4_0_relu (Activation (None, 10, 10, 736) 0 conv5_block4_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_conv (Conv2D) (None, 10, 10, 128) 94208 conv5_block4_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block4_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block4_1_relu (Activation (None, 10, 10, 128) 0 conv5_block4_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block4_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block4_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block4_concat (Concatenat (None, 10, 10, 768) 0 conv5_block3_concat[0][0] \n conv5_block4_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block5_0_bn (BatchNormali (None, 10, 10, 768) 3072 conv5_block4_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block5_0_relu (Activation (None, 10, 10, 768) 0 conv5_block5_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_conv (Conv2D) (None, 10, 10, 128) 98304 conv5_block5_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block5_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block5_1_relu (Activation (None, 10, 10, 128) 0 conv5_block5_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block5_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block5_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block5_concat (Concatenat (None, 10, 10, 800) 0 conv5_block4_concat[0][0] \n conv5_block5_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block6_0_bn (BatchNormali (None, 10, 10, 800) 3200 conv5_block5_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block6_0_relu (Activation (None, 10, 10, 800) 0 conv5_block6_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_conv (Conv2D) (None, 10, 10, 128) 102400 conv5_block6_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block6_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block6_1_relu (Activation (None, 10, 
10, 128) 0 conv5_block6_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block6_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block6_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block6_concat (Concatenat (None, 10, 10, 832) 0 conv5_block5_concat[0][0] \n conv5_block6_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block7_0_bn (BatchNormali (None, 10, 10, 832) 3328 conv5_block6_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block7_0_relu (Activation (None, 10, 10, 832) 0 conv5_block7_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_conv (Conv2D) (None, 10, 10, 128) 106496 conv5_block7_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block7_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block7_1_relu (Activation (None, 10, 10, 128) 0 conv5_block7_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block7_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block7_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block7_concat (Concatenat (None, 10, 10, 864) 0 conv5_block6_concat[0][0] \n conv5_block7_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block8_0_bn (BatchNormali (None, 10, 10, 864) 3456 conv5_block7_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block8_0_relu (Activation (None, 10, 10, 864) 0 conv5_block8_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_conv (Conv2D) (None, 10, 10, 128) 110592 conv5_block8_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block8_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block8_1_relu (Activation (None, 10, 10, 128) 0 conv5_block8_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block8_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block8_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block8_concat (Concatenat (None, 10, 10, 896) 0 conv5_block7_concat[0][0] \n conv5_block8_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block9_0_bn (BatchNormali (None, 10, 10, 896) 3584 conv5_block8_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block9_0_relu (Activation (None, 10, 10, 896) 0 conv5_block9_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_conv (Conv2D) (None, 10, 10, 128) 114688 conv5_block9_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_bn (BatchNormali (None, 10, 10, 128) 512 conv5_block9_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block9_1_relu (Activation (None, 10, 
10, 128) 0 conv5_block9_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block9_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block9_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block9_concat (Concatenat (None, 10, 10, 928) 0 conv5_block8_concat[0][0] \n conv5_block9_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block10_0_bn (BatchNormal (None, 10, 10, 928) 3712 conv5_block9_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block10_0_relu (Activatio (None, 10, 10, 928) 0 conv5_block10_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_conv (Conv2D) (None, 10, 10, 128) 118784 conv5_block10_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block10_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block10_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block10_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block10_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block10_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block10_concat (Concatena (None, 10, 10, 960) 0 conv5_block9_concat[0][0] \n conv5_block10_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block11_0_bn (BatchNormal (None, 10, 10, 960) 3840 conv5_block10_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block11_0_relu (Activatio (None, 10, 10, 960) 0 conv5_block11_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_conv (Conv2D) (None, 10, 10, 128) 122880 conv5_block11_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block11_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block11_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block11_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block11_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block11_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block11_concat (Concatena (None, 10, 10, 992) 0 conv5_block10_concat[0][0] \n conv5_block11_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block12_0_bn (BatchNormal (None, 10, 10, 992) 3968 conv5_block11_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block12_0_relu (Activatio (None, 10, 10, 992) 0 conv5_block12_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_conv (Conv2D) (None, 10, 10, 128) 126976 conv5_block12_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block12_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block12_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block12_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block12_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block12_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block12_concat (Concatena (None, 10, 10, 1024) 0 conv5_block11_concat[0][0] \n conv5_block12_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block13_0_bn (BatchNormal (None, 10, 10, 1024) 4096 conv5_block12_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block13_0_relu (Activatio (None, 10, 10, 1024) 0 conv5_block13_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_conv (Conv2D) (None, 10, 10, 128) 131072 conv5_block13_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block13_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block13_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block13_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block13_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block13_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block13_concat (Concatena (None, 10, 10, 1056) 0 conv5_block12_concat[0][0] \n conv5_block13_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block14_0_bn (BatchNormal (None, 10, 10, 1056) 4224 conv5_block13_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block14_0_relu (Activatio (None, 10, 10, 1056) 0 conv5_block14_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_conv (Conv2D) (None, 10, 10, 128) 135168 conv5_block14_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block14_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block14_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block14_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block14_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block14_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block14_concat (Concatena (None, 10, 10, 1088) 0 conv5_block13_concat[0][0] \n conv5_block14_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block15_0_bn (BatchNormal (None, 10, 10, 1088) 4352 conv5_block14_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block15_0_relu (Activatio (None, 10, 10, 1088) 0 conv5_block15_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_conv (Conv2D) (None, 10, 10, 128) 139264 conv5_block15_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block15_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block15_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block15_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block15_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block15_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block15_concat (Concatena (None, 10, 10, 1120) 0 conv5_block14_concat[0][0] \n conv5_block15_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block16_0_bn (BatchNormal (None, 10, 10, 1120) 4480 conv5_block15_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block16_0_relu (Activatio (None, 10, 10, 1120) 0 conv5_block16_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_conv (Conv2D) (None, 10, 10, 128) 143360 conv5_block16_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block16_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block16_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block16_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block16_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block16_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block16_concat (Concatena (None, 10, 10, 1152) 0 conv5_block15_concat[0][0] \n conv5_block16_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block17_0_bn (BatchNormal (None, 10, 10, 1152) 4608 conv5_block16_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block17_0_relu (Activatio (None, 10, 10, 1152) 0 conv5_block17_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_conv (Conv2D) (None, 10, 10, 128) 147456 conv5_block17_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block17_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block17_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block17_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block17_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block17_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block17_concat (Concatena (None, 10, 10, 1184) 0 conv5_block16_concat[0][0] \n conv5_block17_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block18_0_bn (BatchNormal (None, 10, 10, 1184) 4736 conv5_block17_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block18_0_relu (Activatio (None, 10, 10, 1184) 0 conv5_block18_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_conv (Conv2D) (None, 10, 10, 128) 151552 conv5_block18_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block18_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block18_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block18_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block18_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block18_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block18_concat (Concatena (None, 10, 10, 1216) 0 conv5_block17_concat[0][0] \n conv5_block18_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block19_0_bn (BatchNormal (None, 10, 10, 1216) 4864 conv5_block18_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block19_0_relu (Activatio (None, 10, 10, 1216) 0 conv5_block19_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_conv (Conv2D) (None, 10, 10, 128) 155648 conv5_block19_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block19_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block19_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block19_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block19_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block19_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block19_concat (Concatena (None, 10, 10, 1248) 0 conv5_block18_concat[0][0] \n conv5_block19_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block20_0_bn (BatchNormal (None, 10, 10, 1248) 4992 conv5_block19_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block20_0_relu (Activatio (None, 10, 10, 1248) 0 conv5_block20_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_conv (Conv2D) (None, 10, 10, 128) 159744 conv5_block20_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block20_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block20_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block20_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block20_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block20_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block20_concat (Concatena (None, 10, 10, 1280) 0 conv5_block19_concat[0][0] \n conv5_block20_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block21_0_bn (BatchNormal (None, 10, 10, 1280) 5120 conv5_block20_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block21_0_relu (Activatio (None, 10, 10, 1280) 0 conv5_block21_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_conv (Conv2D) (None, 10, 10, 128) 163840 conv5_block21_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block21_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block21_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block21_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block21_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block21_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block21_concat (Concatena (None, 10, 10, 1312) 0 conv5_block20_concat[0][0] \n conv5_block21_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block22_0_bn (BatchNormal (None, 10, 10, 1312) 5248 conv5_block21_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block22_0_relu (Activatio (None, 10, 10, 1312) 0 conv5_block22_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_conv (Conv2D) (None, 10, 10, 128) 167936 conv5_block22_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block22_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block22_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block22_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block22_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block22_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block22_concat (Concatena (None, 10, 10, 1344) 0 conv5_block21_concat[0][0] \n conv5_block22_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block23_0_bn (BatchNormal (None, 10, 10, 1344) 5376 conv5_block22_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block23_0_relu (Activatio (None, 10, 10, 1344) 0 conv5_block23_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_conv (Conv2D) (None, 10, 10, 128) 172032 conv5_block23_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block23_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block23_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block23_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block23_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block23_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block23_concat (Concatena (None, 10, 10, 1376) 0 conv5_block22_concat[0][0] \n conv5_block23_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block24_0_bn (BatchNormal (None, 10, 10, 1376) 5504 conv5_block23_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block24_0_relu (Activatio (None, 10, 10, 1376) 0 conv5_block24_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_conv (Conv2D) (None, 10, 10, 128) 176128 conv5_block24_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block24_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block24_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block24_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block24_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block24_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block24_concat (Concatena (None, 10, 10, 1408) 0 conv5_block23_concat[0][0] \n conv5_block24_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block25_0_bn (BatchNormal (None, 10, 10, 1408) 5632 conv5_block24_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block25_0_relu (Activatio (None, 10, 10, 1408) 0 conv5_block25_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_conv (Conv2D) (None, 10, 10, 128) 180224 conv5_block25_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block25_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block25_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block25_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block25_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block25_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block25_concat (Concatena (None, 10, 10, 1440) 0 conv5_block24_concat[0][0] \n conv5_block25_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block26_0_bn (BatchNormal (None, 10, 10, 1440) 5760 conv5_block25_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block26_0_relu (Activatio (None, 10, 10, 1440) 0 conv5_block26_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_conv (Conv2D) (None, 10, 10, 128) 184320 conv5_block26_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block26_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block26_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block26_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block26_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block26_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block26_concat (Concatena (None, 10, 10, 1472) 0 conv5_block25_concat[0][0] \n conv5_block26_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block27_0_bn (BatchNormal (None, 10, 10, 1472) 5888 conv5_block26_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block27_0_relu (Activatio (None, 10, 10, 1472) 0 conv5_block27_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_conv (Conv2D) (None, 10, 10, 128) 188416 conv5_block27_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block27_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block27_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block27_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block27_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block27_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block27_concat (Concatena (None, 10, 10, 1504) 0 conv5_block26_concat[0][0] \n conv5_block27_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block28_0_bn (BatchNormal (None, 10, 10, 1504) 6016 conv5_block27_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block28_0_relu (Activatio (None, 10, 10, 1504) 0 conv5_block28_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_conv (Conv2D) (None, 10, 10, 128) 192512 conv5_block28_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block28_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block28_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block28_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block28_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block28_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block28_concat (Concatena (None, 10, 10, 1536) 0 conv5_block27_concat[0][0] \n conv5_block28_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block29_0_bn (BatchNormal (None, 10, 10, 1536) 6144 conv5_block28_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block29_0_relu (Activatio (None, 10, 10, 1536) 0 conv5_block29_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_conv (Conv2D) (None, 10, 10, 128) 196608 conv5_block29_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block29_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block29_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block29_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block29_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block29_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block29_concat (Concatena (None, 10, 10, 1568) 0 conv5_block28_concat[0][0] \n conv5_block29_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block30_0_bn (BatchNormal (None, 10, 10, 1568) 6272 conv5_block29_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block30_0_relu (Activatio (None, 10, 10, 1568) 0 conv5_block30_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_conv (Conv2D) (None, 10, 10, 128) 200704 conv5_block30_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block30_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block30_1_relu 
(Activatio (None, 10, 10, 128) 0 conv5_block30_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block30_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block30_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block30_concat (Concatena (None, 10, 10, 1600) 0 conv5_block29_concat[0][0] \n conv5_block30_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block31_0_bn (BatchNormal (None, 10, 10, 1600) 6400 conv5_block30_concat[0][0] \n__________________________________________________________________________________________________\nconv5_block31_0_relu (Activatio (None, 10, 10, 1600) 0 conv5_block31_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_conv (Conv2D) (None, 10, 10, 128) 204800 conv5_block31_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block31_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block31_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block31_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block31_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block31_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block31_concat (Concatena (None, 10, 10, 1632) 0 conv5_block30_concat[0][0] \n conv5_block31_2_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_0_bn (BatchNormal (None, 10, 10, 1632) 6528 conv5_block31_concat[0][0] 
\n__________________________________________________________________________________________________\nconv5_block32_0_relu (Activatio (None, 10, 10, 1632) 0 conv5_block32_0_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_conv (Conv2D) (None, 10, 10, 128) 208896 conv5_block32_0_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_bn (BatchNormal (None, 10, 10, 128) 512 conv5_block32_1_conv[0][0] \n__________________________________________________________________________________________________\nconv5_block32_1_relu (Activatio (None, 10, 10, 128) 0 conv5_block32_1_bn[0][0] \n__________________________________________________________________________________________________\nconv5_block32_2_conv (Conv2D) (None, 10, 10, 32) 36864 conv5_block32_1_relu[0][0] \n__________________________________________________________________________________________________\nconv5_block32_concat (Concatena (None, 10, 10, 1664) 0 conv5_block31_concat[0][0] \n conv5_block32_2_conv[0][0] \n__________________________________________________________________________________________________\nbn (BatchNormalization) (None, 10, 10, 1664) 6656 conv5_block32_concat[0][0] \n__________________________________________________________________________________________________\nrelu (Activation) (None, 10, 10, 1664) 0 bn[0][0] \n__________________________________________________________________________________________________\nglobal_average_pooling2d_1 (Glo (None, 1664) 0 relu[0][0] \n__________________________________________________________________________________________________\ndropout_1 (Dropout) (None, 1664) 0 global_average_pooling2d_1[0][0] \n__________________________________________________________________________________________________\ndense_1 (Dense) (None, 2048) 3409920 dropout_1[0][0] 
\n__________________________________________________________________________________________________\ndropout_2 (Dropout) (None, 2048) 0 dense_1[0][0] \n__________________________________________________________________________________________________\nfinal_output (Dense) (None, 5) 10245 dropout_2[0][0] \n==================================================================================================\nTotal params: 16,063,045\nTrainable params: 15,904,645\nNon-trainable params: 158,400\n__________________________________________________________________________________________________\n" ], [ "history = model.fit_generator(generator=train_generator,\n steps_per_epoch=STEP_SIZE_TRAIN,\n validation_data=valid_generator,\n validation_steps=STEP_SIZE_VALID,\n epochs=EPOCHS,\n callbacks=callback_list,\n class_weight=class_weights,\n verbose=1).history", "Epoch 1/40\n183/183 [==============================] - 139s 757ms/step - loss: 0.6850 - acc: 0.7466 - kappa: 0.8268 - val_loss: 0.5695 - val_acc: 0.7908 - val_kappa: 0.8843\nEpoch 2/40\n183/183 [==============================] - 87s 478ms/step - loss: 0.5764 - acc: 0.7828 - kappa: 0.8835 - val_loss: 0.5638 - val_acc: 0.7880 - val_kappa: 0.8543\nEpoch 3/40\n183/183 [==============================] - 89s 487ms/step - loss: 0.5302 - acc: 0.7968 - kappa: 0.8996 - val_loss: 0.4854 - val_acc: 0.8298 - val_kappa: 0.9267\nEpoch 4/40\n183/183 [==============================] - 90s 493ms/step - loss: 0.4941 - acc: 0.8060 - kappa: 0.9220 - val_loss: 0.5247 - val_acc: 0.8061 - val_kappa: 0.9171\nEpoch 5/40\n183/183 [==============================] - 89s 487ms/step - loss: 0.4654 - acc: 0.8279 - kappa: 0.9285 - val_loss: 0.4637 - val_acc: 0.8145 - val_kappa: 0.9086\nEpoch 6/40\n183/183 [==============================] - 90s 491ms/step - loss: 0.4864 - acc: 0.8170 - kappa: 0.9225 - val_loss: 0.4663 - val_acc: 0.8326 - val_kappa: 0.9399\nEpoch 7/40\n183/183 [==============================] - 90s 493ms/step - loss: 0.4761 - acc: 
0.8265 - kappa: 0.9363 - val_loss: 0.6075 - val_acc: 0.8006 - val_kappa: 0.8896\nEpoch 8/40\n183/183 [==============================] - 90s 494ms/step - loss: 0.4110 - acc: 0.8473 - kappa: 0.9440 - val_loss: 0.5248 - val_acc: 0.8229 - val_kappa: 0.9262\n\nEpoch 00008: ReduceLROnPlateau reducing learning rate to 4.999999873689376e-05.\nEpoch 9/40\n183/183 [==============================] - 89s 486ms/step - loss: 0.4127 - acc: 0.8477 - kappa: 0.9442 - val_loss: 0.4522 - val_acc: 0.8187 - val_kappa: 0.9232\nEpoch 10/40\n183/183 [==============================] - 91s 498ms/step - loss: 0.4236 - acc: 0.8498 - kappa: 0.9455 - val_loss: 0.4969 - val_acc: 0.8173 - val_kappa: 0.9069\nEpoch 11/40\n183/183 [==============================] - 92s 503ms/step - loss: 0.3767 - acc: 0.8562 - kappa: 0.9504 - val_loss: 0.5195 - val_acc: 0.7950 - val_kappa: 0.8966\nEpoch 12/40\n183/183 [==============================] - 93s 509ms/step - loss: 0.3427 - acc: 0.8696 - kappa: 0.9628 - val_loss: 0.5767 - val_acc: 0.8131 - val_kappa: 0.9236\n\nEpoch 00012: ReduceLROnPlateau reducing learning rate to 2.499999936844688e-05.\nEpoch 13/40\n183/183 [==============================] - 92s 505ms/step - loss: 0.2877 - acc: 0.8839 - kappa: 0.9645 - val_loss: 0.4223 - val_acc: 0.8424 - val_kappa: 0.9401\nEpoch 14/40\n183/183 [==============================] - 93s 510ms/step - loss: 0.2880 - acc: 0.8910 - kappa: 0.9704 - val_loss: 0.4906 - val_acc: 0.8103 - val_kappa: 0.9350\nEpoch 15/40\n183/183 [==============================] - 92s 505ms/step - loss: 0.2696 - acc: 0.9003 - kappa: 0.9719 - val_loss: 0.4484 - val_acc: 0.8271 - val_kappa: 0.9320\nEpoch 16/40\n183/183 [==============================] - 93s 509ms/step - loss: 0.2698 - acc: 0.8996 - kappa: 0.9774 - val_loss: 0.4540 - val_acc: 0.8229 - val_kappa: 0.9406\n\nEpoch 00016: ReduceLROnPlateau reducing learning rate to 1.249999968422344e-05.\nEpoch 17/40\n183/183 [==============================] - 92s 504ms/step - loss: 0.2323 - acc: 0.9197 - 
kappa: 0.9798 - val_loss: 0.5455 - val_acc: 0.7894 - val_kappa: 0.8988\nEpoch 18/40\n183/183 [==============================] - 94s 515ms/step - loss: 0.2399 - acc: 0.9132 - kappa: 0.9767 - val_loss: 0.4185 - val_acc: 0.8508 - val_kappa: 0.9487\nEpoch 19/40\n183/183 [==============================] - 93s 507ms/step - loss: 0.2322 - acc: 0.9157 - kappa: 0.9791 - val_loss: 0.5034 - val_acc: 0.8061 - val_kappa: 0.9174\nEpoch 20/40\n183/183 [==============================] - 93s 508ms/step - loss: 0.2174 - acc: 0.9167 - kappa: 0.9826 - val_loss: 0.4698 - val_acc: 0.8452 - val_kappa: 0.9419\nEpoch 21/40\n183/183 [==============================] - 93s 507ms/step - loss: 0.2468 - acc: 0.9157 - kappa: 0.9800 - val_loss: 0.5091 - val_acc: 0.8131 - val_kappa: 0.9259\n\nEpoch 00021: ReduceLROnPlateau reducing learning rate to 6.24999984211172e-06.\nEpoch 22/40\n183/183 [==============================] - 92s 501ms/step - loss: 0.1998 - acc: 0.9276 - kappa: 0.9841 - val_loss: 0.4864 - val_acc: 0.8285 - val_kappa: 0.9446\nEpoch 23/40\n183/183 [==============================] - 93s 507ms/step - loss: 0.2131 - acc: 0.9232 - kappa: 0.9844 - val_loss: 0.4938 - val_acc: 0.8173 - val_kappa: 0.9299\nRestoring model weights from the end of the best epoch\nEpoch 00023: early stopping\n" ] ], [ [ "# Model loss graph ", "_____no_output_____" ] ], [ [ "sns.set_style(\"whitegrid\")\nfig, (ax1, ax2, ax3) = plt.subplots(3, 1, sharex='col', figsize=(20, 18))\n\nax1.plot(history['loss'], label='Train loss')\nax1.plot(history['val_loss'], label='Validation loss')\nax1.legend(loc='best')\nax1.set_title('Loss')\n\nax2.plot(history['acc'], label='Train accuracy')\nax2.plot(history['val_acc'], label='Validation accuracy')\nax2.legend(loc='best')\nax2.set_title('Accuracy')\n\nax3.plot(history['kappa'], label='Train kappa')\nax3.plot(history['val_kappa'], label='Validation kappa')\nax3.legend(loc='best')\nax3.set_title('Kappa')\n\nplt.xlabel('Epochs')\nsns.despine()\nplt.show()", "_____no_output_____" 
], [ "# Create empty arays to keep the predictions and labels\nlastFullTrainPred = np.empty((0, N_CLASSES))\nlastFullTrainLabels = np.empty((0, N_CLASSES))\nlastFullValPred = np.empty((0, N_CLASSES))\nlastFullValLabels = np.empty((0, N_CLASSES))\n\n# Add train predictions and labels\nfor i in range(STEP_SIZE_TRAIN+1):\n im, lbl = next(train_generator)\n scores = model.predict(im, batch_size=train_generator.batch_size)\n lastFullTrainPred = np.append(lastFullTrainPred, scores, axis=0)\n lastFullTrainLabels = np.append(lastFullTrainLabels, lbl, axis=0)\n\n# Add validation predictions and labels\nfor i in range(STEP_SIZE_VALID+1):\n im, lbl = next(valid_generator)\n scores = model.predict(im, batch_size=valid_generator.batch_size)\n lastFullValPred = np.append(lastFullValPred, scores, axis=0)\n lastFullValLabels = np.append(lastFullValLabels, lbl, axis=0)\n\nlastFullComPred = np.concatenate((lastFullTrainPred, lastFullValPred))\nlastFullComLabels = np.concatenate((lastFullTrainLabels, lastFullValLabels))\n\ntrain_preds = [np.argmax(pred) for pred in lastFullTrainPred]\ntrain_labels = [np.argmax(label) for label in lastFullTrainLabels]\nvalidation_preds = [np.argmax(pred) for pred in lastFullValPred]\nvalidation_labels = [np.argmax(label) for label in lastFullValLabels]\ncomplete_labels = [np.argmax(label) for label in lastFullComLabels]", "_____no_output_____" ] ], [ [ "# Model Evaluation", "_____no_output_____" ], [ "## Confusion Matrix\n\n### Original thresholds", "_____no_output_____" ] ], [ [ "labels = ['0 - No DR', '1 - Mild', '2 - Moderate', '3 - Severe', '4 - Proliferative DR']\ndef plot_confusion_matrix(train, validation, labels=labels):\n train_labels, train_preds = train\n validation_labels, validation_preds = validation\n fig, (ax1, ax2) = plt.subplots(1, 2, sharex='col', figsize=(24, 7))\n train_cnf_matrix = confusion_matrix(train_labels, train_preds)\n validation_cnf_matrix = confusion_matrix(validation_labels, validation_preds)\n\n train_cnf_matrix_norm 
= train_cnf_matrix.astype('float') / train_cnf_matrix.sum(axis=1)[:, np.newaxis]\n validation_cnf_matrix_norm = validation_cnf_matrix.astype('float') / validation_cnf_matrix.sum(axis=1)[:, np.newaxis]\n\n train_df_cm = pd.DataFrame(train_cnf_matrix_norm, index=labels, columns=labels)\n validation_df_cm = pd.DataFrame(validation_cnf_matrix_norm, index=labels, columns=labels)\n\n sns.heatmap(train_df_cm, annot=True, fmt='.2f', cmap=\"Blues\",ax=ax1).set_title('Train')\n sns.heatmap(validation_df_cm, annot=True, fmt='.2f', cmap=sns.cubehelix_palette(8),ax=ax2).set_title('Validation')\n plt.show()\n\nplot_confusion_matrix((train_labels, train_preds), (validation_labels, validation_preds))", "_____no_output_____" ] ], [ [ "## Quadratic Weighted Kappa", "_____no_output_____" ] ], [ [ "def evaluate_model(train, validation):\n train_labels, train_preds = train\n validation_labels, validation_preds = validation\n print(\"Train Cohen Kappa score: %.3f\" % cohen_kappa_score(train_preds, train_labels, weights='quadratic'))\n print(\"Validation Cohen Kappa score: %.3f\" % cohen_kappa_score(validation_preds, validation_labels, weights='quadratic'))\n print(\"Complete set Cohen Kappa score: %.3f\" % cohen_kappa_score(train_preds+validation_preds, train_labels+validation_labels, weights='quadratic'))\n \nevaluate_model((train_preds, train_labels), (validation_preds, validation_labels))", "Train Cohen Kappa score: 0.962\nValidation Cohen Kappa score: 0.900\nComplete set Cohen Kappa score: 0.950\n" ] ], [ [ "## Apply model to test set and output predictions", "_____no_output_____" ] ], [ [ "step_size = test_generator.n//test_generator.batch_size\ntest_generator.reset()\npreds = model.predict_generator(test_generator, steps=step_size)\npredictions = np.argmax(preds, axis=1)\n\nresults = pd.DataFrame({'id_code':test['id_code'], 'diagnosis':predictions})\nresults['id_code'] = results['id_code'].map(lambda x: str(x)[:-4])", "_____no_output_____" ], [ "# Cleaning created directories\nif 
os.path.exists(train_dest_path):\n shutil.rmtree(train_dest_path)\nif os.path.exists(validation_dest_path):\n shutil.rmtree(validation_dest_path)\nif os.path.exists(test_dest_path):\n shutil.rmtree(test_dest_path)", "_____no_output_____" ] ], [ [ "# Predictions class distribution", "_____no_output_____" ] ], [ [ "fig = plt.subplots(sharex='col', figsize=(24, 8.7))\nsns.countplot(x=\"diagnosis\", data=results, palette=\"GnBu_d\").set_title('Test')\nsns.despine()\nplt.show()", "_____no_output_____" ], [ "results.to_csv('submission.csv', index=False)\ndisplay(results.head())", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ] ]
d0627caae9fc6d352f304825bb063317fa6c3193
9,486
ipynb
Jupyter Notebook
notebooks/computer_vision/raw/tut5.ipynb
guesswhohaha/learntools
c1bd607ade5227f8c8977ff05bf9d04d0a8b7732
[ "Apache-2.0" ]
null
null
null
notebooks/computer_vision/raw/tut5.ipynb
guesswhohaha/learntools
c1bd607ade5227f8c8977ff05bf9d04d0a8b7732
[ "Apache-2.0" ]
null
null
null
notebooks/computer_vision/raw/tut5.ipynb
guesswhohaha/learntools
c1bd607ade5227f8c8977ff05bf9d04d0a8b7732
[ "Apache-2.0" ]
null
null
null
37.346457
517
0.598461
[ [ [ "<!--TITLE:Custom Convnets-->\n# Introduction #\n\nNow that you've seen the layers a convnet uses to extract features, it's time to put them together and build a network of your own!\n\n# Simple to Refined #\n\nIn the last three lessons, we saw how convolutional networks perform **feature extraction** through three operations: **filter**, **detect**, and **condense**. A single round of feature extraction can only extract relatively simple features from an image, things like simple lines or contrasts. These are too simple to solve most classification problems. Instead, convnets will repeat this extraction over and over, so that the features become more complex and refined as they travel deeper into the network.\n\n<figure>\n<img src=\"https://i.imgur.com/VqmC1rm.png\" alt=\"Features extracted from an image of a car, from simple to refined.\" width=800>\n</figure>\n\n# Convolutional Blocks #\n\nThe network does this by passing the image through long chains of **convolutional blocks**, which perform this extraction.\n\n<figure>\n<img src=\"https://i.imgur.com/pr8VwCZ.png\" width=\"400\" alt=\"Extraction as a sequence of blocks.\">\n</figure>\n\nThese convolutional blocks are stacks of `Conv2D` and `MaxPool2D` layers, whose role in feature extraction we learned about in the last few lessons.\n\n<figure>\n<!-- <img src=\"./images/2-block-crp.png\" width=\"400\" alt=\"A kind of extraction block: convolution, ReLU, pooling.\"> -->\n<img src=\"https://i.imgur.com/8D6IhEw.png\" width=\"400\" alt=\"A kind of extraction block: convolution, ReLU, pooling.\">\n</figure>\n\nEach block represents a round of extraction, and by composing these blocks the convnet can combine and recombine the features produced, growing them and shaping them to better fit the problem at hand. 
The deep structure of modern convnets is what allows this sophisticated feature engineering and has been largely responsible for their superior performance.\n\n# Example - Design a Convnet #\n\nLet's see how to define a deep convolutional network capable of engineering complex features. In this example, we'll create a Keras `Sequential` model and then train it on our Cars dataset.\n\n## Step 1 - Load Data ##\n\nThis hidden cell loads the data.", "_____no_output_____" ] ], [ [ "#$HIDE_INPUT$\n# Imports\nimport os, warnings\nimport matplotlib.pyplot as plt\nfrom matplotlib import gridspec\n\nimport numpy as np\nimport tensorflow as tf\nfrom tensorflow.keras.preprocessing import image_dataset_from_directory\n\n# Reproducibility\ndef set_seed(seed=31415):\n np.random.seed(seed)\n tf.random.set_seed(seed)\n os.environ['PYTHONHASHSEED'] = str(seed)\n os.environ['TF_DETERMINISTIC_OPS'] = '1'\nset_seed()\n\n# Set Matplotlib defaults\nplt.rc('figure', autolayout=True)\nplt.rc('axes', labelweight='bold', labelsize='large',\n titleweight='bold', titlesize=18, titlepad=10)\nplt.rc('image', cmap='magma')\nwarnings.filterwarnings(\"ignore\") # to clean up output cells\n\n\n# Load training and validation sets\nds_train_ = image_dataset_from_directory(\n '../input/car-or-truck/train',\n labels='inferred',\n label_mode='binary',\n image_size=[128, 128],\n interpolation='nearest',\n batch_size=64,\n shuffle=True,\n)\nds_valid_ = image_dataset_from_directory(\n '../input/car-or-truck/valid',\n labels='inferred',\n label_mode='binary',\n image_size=[128, 128],\n interpolation='nearest',\n batch_size=64,\n shuffle=False,\n)\n\n# Data Pipeline\ndef convert_to_float(image, label):\n image = tf.image.convert_image_dtype(image, dtype=tf.float32)\n return image, label\n\nAUTOTUNE = tf.data.experimental.AUTOTUNE\nds_train = (\n ds_train_\n .map(convert_to_float)\n .cache()\n .prefetch(buffer_size=AUTOTUNE)\n)\nds_valid = (\n ds_valid_\n .map(convert_to_float)\n .cache()\n 
.prefetch(buffer_size=AUTOTUNE)\n)\n", "_____no_output_____" ] ], [ [ "## Step 2 - Define Model ##\n\nHere is a diagram of the model we'll use:\n\n<figure>\n<!-- <img src=\"./images/2-convmodel-1.png\" width=\"200\" alt=\"Diagram of a convolutional model.\"> -->\n<img src=\"https://i.imgur.com/U1VdoDJ.png\" width=\"250\" alt=\"Diagram of a convolutional model.\">\n</figure>\n\nNow we'll define the model. See how our model consists of three blocks of `Conv2D` and `MaxPool2D` layers (the base) followed by a head of `Dense` layers. We can translate this diagram more or less directly into a Keras `Sequential` model just by filling in the appropriate parameters.", "_____no_output_____" ] ], [ [ "import tensorflow.keras as keras\nimport tensorflow.keras.layers as layers\n\nmodel = keras.Sequential([\n\n # First Convolutional Block\n layers.Conv2D(filters=32, kernel_size=5, activation=\"relu\", padding='same',\n # give the input dimensions in the first layer\n # [height, width, color channels(RGB)]\n input_shape=[128, 128, 3]),\n layers.MaxPool2D(),\n\n # Second Convolutional Block\n layers.Conv2D(filters=64, kernel_size=3, activation=\"relu\", padding='same'),\n layers.MaxPool2D(),\n\n # Third Convolutional Block\n layers.Conv2D(filters=128, kernel_size=3, activation=\"relu\", padding='same'),\n layers.MaxPool2D(),\n\n # Classifier Head\n layers.Flatten(),\n layers.Dense(units=6, activation=\"relu\"),\n layers.Dense(units=1, activation=\"sigmoid\"),\n])\nmodel.summary()", "_____no_output_____" ] ], [ [ "Notice in this definition how the number of filters doubles block-by-block: 32, 64, 128. This is a common pattern. 
Since the `MaxPool2D` layer is reducing the *size* of the feature maps, we can afford to increase the *quantity* we create.\n\n## Step 3 - Train ##\n\nWe can train this model just like the model from Lesson 1: compile it with an optimizer along with a loss and metric appropriate for binary classification.", "_____no_output_____" ] ], [ [ "model.compile(\n optimizer=tf.keras.optimizers.Adam(epsilon=0.01),\n loss='binary_crossentropy',\n metrics=['binary_accuracy']\n)\n\nhistory = model.fit(\n ds_train,\n validation_data=ds_valid,\n epochs=40,\n)\n", "_____no_output_____" ], [ "import pandas as pd\n\nhistory_frame = pd.DataFrame(history.history)\nhistory_frame.loc[:, ['loss', 'val_loss']].plot()\nhistory_frame.loc[:, ['binary_accuracy', 'val_binary_accuracy']].plot();", "_____no_output_____" ] ], [ [ "This model is much smaller than the VGG16 model from Lesson 1 -- only 3 convolutional layers versus the 16 of VGG16. It was nevertheless able to fit this dataset fairly well. We might still be able to improve this simple model by adding more convolutional layers, hoping to create features better adapted to the dataset. This is what we'll try in the exercises.\n\n# Conclusion #\n\nIn this tutorial, you saw how to build a custom convnet composed of many **convolutional blocks** and capable of complex feature engineering. \n\n# Your Turn #\n\nIn the exercises, you'll create a convnet that performs as well on this problem as VGG16 does -- without pretraining! [**Try it now!**](#$NEXT_NOTEBOOK_URL$)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d0627ce38b47f49d40a787be57156a5c935c8209
5,818
ipynb
Jupyter Notebook
101notebook/ipython-rel-2.1.0-examples/Notebook/Raw Input.ipynb
OpenBookProjects/ipynb
72a28109e8e30aea0b9c6713e78821e4affa2e33
[ "MIT" ]
6
2015-06-08T12:50:14.000Z
2018-11-20T10:05:01.000Z
101notebook/ipython-rel-2.1.0-examples/Notebook/Raw Input.ipynb
OpenBookProjects/ipynb
72a28109e8e30aea0b9c6713e78821e4affa2e33
[ "MIT" ]
null
null
null
101notebook/ipython-rel-2.1.0-examples/Notebook/Raw Input.ipynb
OpenBookProjects/ipynb
72a28109e8e30aea0b9c6713e78821e4affa2e33
[ "MIT" ]
8
2016-01-26T14:12:50.000Z
2021-02-20T14:24:09.000Z
29.683673
762
0.508594
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d06280bd27aa1ca8f8e3c4b7aae0d4c197c9d83e
2,356
ipynb
Jupyter Notebook
11_Face_Detection.ipynb
EliasPapachristos/Computer_Vision_with_OpenCV
05af3c6161bd446f7df81ad190e732b1c5c6eb42
[ "Apache-2.0" ]
9
2020-05-01T10:28:55.000Z
2021-04-15T15:58:00.000Z
11_Face_Detection.ipynb
EliasPapachristos/Computer_Vision_with_OpenCV
05af3c6161bd446f7df81ad190e732b1c5c6eb42
[ "Apache-2.0" ]
null
null
null
11_Face_Detection.ipynb
EliasPapachristos/Computer_Vision_with_OpenCV
05af3c6161bd446f7df81ad190e732b1c5c6eb42
[ "Apache-2.0" ]
7
2020-06-11T18:09:25.000Z
2020-12-11T09:35:03.000Z
20.666667
122
0.48854
[ [ [ "import numpy as np\nimport cv2 \nimport matplotlib.pyplot as plt\n%matplotlib inline", "_____no_output_____" ] ], [ [ "### Cascade Files\nOpenCV comes with these pre-trained cascade files; we've relocated the .xml files for you in our own DATA folder.\n\n### Face Detection", "_____no_output_____" ] ], [ [ "face_cascade = cv2.CascadeClassifier('haarcascade_frontalface_default.xml')", "_____no_output_____" ], [ "def detect_face(img):\n \n \n face_img = img.copy()\n \n face_rects = face_cascade.detectMultiScale(face_img) \n \n for (x, y, w, h) in face_rects: \n cv2.rectangle(face_img, (x, y), (x + w, y + h), (255, 255, 255), 10) \n \n return face_img", "_____no_output_____" ] ], [ [ "### Conjunction with Video\n", "_____no_output_____" ] ], [ [ "cap = cv2.VideoCapture(0) \n\nwhile True: \n \n ret, frame = cap.read(0) \n \n frame = detect_face(frame)\n \n cv2.imshow('Video Face Detection', frame) \n \n c = cv2.waitKey(1) \n if c == 27: \n break \n \ncap.release() \ncv2.destroyAllWindows()", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code" ]
[ [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ] ]
d062889d977111401f6d85b9721c2780a97ec009
58,900
ipynb
Jupyter Notebook
3-object-tracking-and-localization/activities/8-vehicle-motion-and-calculus/Looking up Trig Ratios.ipynb
S1lv10Fr4gn4n1/udacity-cv
ce7aafc41e2c123396d809042973840ea08b850e
[ "MIT" ]
null
null
null
3-object-tracking-and-localization/activities/8-vehicle-motion-and-calculus/Looking up Trig Ratios.ipynb
S1lv10Fr4gn4n1/udacity-cv
ce7aafc41e2c123396d809042973840ea08b850e
[ "MIT" ]
3
2020-03-24T21:18:48.000Z
2021-06-08T21:11:14.000Z
3-object-tracking-and-localization/activities/8-vehicle-motion-and-calculus/Looking up Trig Ratios.ipynb
S1lv10Fr4gn4n1/udacity-cv
ce7aafc41e2c123396d809042973840ea08b850e
[ "MIT" ]
null
null
null
198.986486
16,876
0.907878
[ [ [ "# Looking up Trig Ratios\nThere are three ways you could find the value of a trig function at a particular angle.\n\n**1. Use a table** - This is how engineers used to find trig ratios before the days of computers. For example, from the table below I can see that $\\sin(60)=0.866$\n\n| angle | sin | cos | tan |\n| :---: | :---: | :---: | :---: |\n| 0 | 0.000 | 1.000 | 0.000 |\n| 10 | 0.174 | 0.985 | 0.176 |\n| 20 | 0.342 | 0.940 | 0.364 |\n| 30 | 0.500 | 0.866 | 0.577 |\n| 40 | 0.643 | 0.766 | 0.839 |\n| 50 | 0.766 | 0.643 | 1.192 |\n| 60 | 0.866 | 0.500 | 1.732 |\n| 70 | 0.940 | 0.342 | 2.747 |\n| 80 | 0.985 | 0.174 | 5.671 |\n\nThe problem with this technique is that there will always be gaps in a table. \n\n**2. Use a graph** - One way to try to fill these gaps is by consulting a graph of a trigonometric function. For example, the image below shows a plot of $\\sin(\\theta)$ for $0 \\leq \\theta \\leq 360$\n\n![](https://d17h27t6h515a5.cloudfront.net/topher/2017/December/5a2efe68_sine/sine.png)\n\nThese graphs are nice because they give a good visual sense for how these ratios behave, but they aren't great for getting accurate values. Which leads us to the **best** way to look up trig ratios...\n\n**3. Use a computer!** This probably isn't a surprise, but python has built in functions to calculate sine, cosine, and tangent... \n\nIn fact, you can even type \"sin(60 degrees)\" into **Google** and you'll get the correct answer!\n\n![](https://d17h27t6h515a5.cloudfront.net/topher/2017/December/5a2f0062_img-1742/img-1742.jpg)\n\nNote how I wrote in \"sin(60 degrees)\" instead of just \"sin(60)\". That's because these functions generally expect their input to be in **radians**. \n\nNow let's calculate these ratios with Python.", "_____no_output_____" ] ], [ [ "# Python's math module has functions called sin, cos, and tan\n# as well as the constant \"pi\" (which we will find useful shortly)\nfrom math import sin, cos, tan, pi\n\n# Run this cell. 
What do you expect the output to be?\nprint(sin(60))", "-0.3048106211022167\n" ] ], [ [ "Did the output match what you expected?\n\nIf not, it's probably because we didn't convert our angle to radians. \n\n### EXERCISE 1 - Write a function that converts degrees to radians\n\nImplement the following math in code:\n\n$$\\theta_{\\text{radians}} = \\theta_{\\text{degrees}} \\times \\frac{\\pi}{180}$$\n", "_____no_output_____" ] ], [ [ "from math import pi\ndef deg2rad(theta):\n \"\"\"Converts degrees to radians\"\"\"\n return theta * (pi/180)\n # TODO - implement this function (solution\n # code at end of notebook)\n\nassert(deg2rad(45.0) == pi / 4)\nassert(deg2rad(90.0) == pi / 2)\nprint(\"Nice work! Your degrees to radians function works!\")\n\nfor theta in [0, 30, 45, 60, 90]:\n theta_rad = deg2rad(theta)\n sin_theta = sin(theta_rad)\n print(\"sin(\", theta, \"degrees) =\", sin_theta)", "Nice work! Your degrees to radians function works!\nsin( 0 degrees) = 0.0\nsin( 30 degrees) = 0.49999999999999994\nsin( 45 degrees) = 0.7071067811865475\nsin( 60 degrees) = 0.8660254037844386\nsin( 90 degrees) = 1.0\n" ] ], [ [ "### EXERCISE 2 - Make plots of cosine and tangent", "_____no_output_____" ] ], [ [ "import numpy as np\nfrom matplotlib import pyplot as plt\ndef plot_sine(min_theta, max_theta):\n \"\"\"\n Generates a plot of sin(theta) between min_theta\n and max_theta (both of which are specified in degrees).\n \"\"\"\n angles_degrees = np.linspace(min_theta, max_theta)\n angles_radians = deg2rad(angles_degrees)\n values = np.sin(angles_radians)\n X = angles_degrees\n Y = values\n plt.plot(X,Y)\n plt.show()\n \n# EXERCISE 2.1 Implement this! 
Try not to look at the\n# implementation of plot_sine TOO much...\ndef plot_cosine(min_theta, max_theta):\n \"\"\"\n Generates a plot of cos(theta) between min_theta\n and max_theta (both of which are specified in degrees).\n \"\"\"\n angles_degrees = np.linspace(min_theta, max_theta)\n angles_radians = deg2rad(angles_degrees)\n values = np.cos(angles_radians)\n X = angles_degrees\n Y = values\n plt.plot(X,Y)\n plt.show()", "_____no_output_____" ], [ "plot_sine(0, 360)", "_____no_output_____" ], [ "plot_cosine(0, 360)", "_____no_output_____" ], [ "#\n\n#\n\n#\n\n#\n\n# SOLUTION CODE\n\n#\n\n#\n\n#\n\n#\nfrom math import pi\ndef deg2rad_solution(theta):\n \"\"\"Converts degrees to radians\"\"\"\n return theta * pi / 180\n\nassert(deg2rad_solution(45.0) == pi / 4)\nassert(deg2rad_solution(90.0) == pi / 2)\n\nimport numpy as np\nfrom matplotlib import pyplot as plt\ndef plot_cosine_solution(min_theta, max_theta):\n \"\"\"\n Generates a plot of cos(theta) between min_theta\n and max_theta (both of which are specified in degrees).\n \"\"\"\n angles_degrees = np.linspace(min_theta, max_theta)\n angles_radians = deg2rad_solution(angles_degrees)\n values = np.cos(angles_radians)\n X = angles_degrees\n Y = values\n plt.plot(X,Y)\n plt.show()", "_____no_output_____" ], [ "plot_cosine_solution(0, 360)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d0629a451407335fb731206a6bb49b75cb94e08a
32,217
ipynb
Jupyter Notebook
gs_quant/content/events/00_virtual_event/0003_trades.ipynb
KabbalahOracle/gs-quant
e4daa30654d8e4757c84f8836b5c1e22f39e7174
[ "Apache-2.0" ]
1
2020-11-04T21:21:45.000Z
2020-11-04T21:21:45.000Z
gs_quant/content/events/00_virtual_event/0003_trades.ipynb
KabbalahOracle/gs-quant
e4daa30654d8e4757c84f8836b5c1e22f39e7174
[ "Apache-2.0" ]
null
null
null
gs_quant/content/events/00_virtual_event/0003_trades.ipynb
KabbalahOracle/gs-quant
e4daa30654d8e4757c84f8836b5c1e22f39e7174
[ "Apache-2.0" ]
null
null
null
70.806593
1,918
0.701555
[ [ [ "from gs_quant.data import Dataset\nfrom gs_quant.markets.securities import Asset, AssetIdentifier, SecurityMaster\nfrom gs_quant.timeseries import *\nfrom gs_quant.target.instrument import FXOption, IRSwaption\nfrom gs_quant.markets import PricingContext, HistoricalPricingContext, BackToTheFuturePricingContext\nfrom gs_quant.risk import CarryScenario, MarketDataPattern, MarketDataShock, MarketDataShockBasedScenario, MarketDataShockType, CurveScenario,CarryScenario\nfrom gs_quant.markets.portfolio import Portfolio\nfrom gs_quant.risk import IRAnnualImpliedVol\nfrom gs_quant.timeseries import percentiles\nfrom gs_quant.datetime import business_day_offset\nimport seaborn as sns\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom scipy import stats\nimport warnings\nfrom datetime import date\nwarnings.filterwarnings('ignore')\nsns.set(style=\"darkgrid\", color_codes=True)", "_____no_output_____" ], [ "from gs_quant.session import GsSession\n# external users should substitute their client id and secret; please skip this step if using internal jupyterhub\nGsSession.use(client_id=None, client_secret=None, scopes=('run_analytics',)) ", "_____no_output_____" ] ], [ [ "In this notebook, we'll look at entry points for G10 vol, look for crosses with the largest downside sensivity to SPX, indicatively price several structures and analyze their carry profile.\n\n* [1: FX entry point vs richness](#1:-FX-entry-point-vs-richness)\n* [2: Downside sensitivity to SPX](#2:-Downside-sensitivity-to-SPX)\n* [3: AUDJPY conditional relationship with SPX](#3:-AUDJPY-conditional-relationship-with-SPX)\n* [4: Price structures](#4:-Price-structures)\n* [5: Analyse rates package](#5:-Analyse-rates-package)", "_____no_output_____" ], [ "### 1: FX entry point vs richness\nLet's pull [GS FX Spot](https://marquee.gs.com/s/developer/datasets/FXSPOT_PREMIUM) and [GS FX Implied Volatility](https://marquee.gs.com/s/developer/datasets/FXIMPLIEDVOL_PREMIUM) and look at implied vs realized 
vol as well as current implied level as percentile relative to the last 2 years.", "_____no_output_____" ] ], [ [ "def format_df(data_dict):\n df = pd.concat(data_dict, axis=1)\n df.columns = data_dict.keys()\n return df.fillna(method='ffill').dropna()", "_____no_output_____" ], [ "g10 = ['USDJPY', 'EURUSD', 'AUDUSD', 'GBPUSD', 'USDCAD', 'USDNOK', 'NZDUSD', 'USDSEK', 'USDCHF', 'AUDJPY']\nstart_date = date(2005, 8, 26)\nend_date = business_day_offset(date.today(), -1, roll='preceding')\nfxspot_dataset, fxvol_dataset = Dataset('FXSPOT_PREMIUM'), Dataset('FXIMPLIEDVOL_PREMIUM')\n\nspot_data, impvol_data, spot_fx = {}, {}, {}\nfor cross in g10:\n spot = fxspot_dataset.get_data(start_date, end_date, bbid=cross)[['spot']].drop_duplicates(keep='last')\n spot_fx[cross] = spot['spot']\n spot_data[cross] = volatility(spot['spot'], 63) # realized vol \n vol = fxvol_dataset.get_data(start_date, end_date, bbid=cross, tenor='3m', deltaStrike='DN', location='NYC')[['impliedVolatility']]\n impvol_data[cross] = vol.drop_duplicates(keep='last') * 100\n\nspdata, ivdata = format_df(spot_data), format_df(impvol_data)\ndiff = ivdata.subtract(spdata).dropna()", "_____no_output_____" ], [ "_slice = ivdata['2018-09-01': '2020-09-08']\npct_rank = {}\nfor x in _slice.columns:\n pct = percentiles(_slice[x])\n pct_rank[x] = pct.iloc[-1]\n\nfor fx in pct_rank:\n plt.scatter(pct_rank[fx], diff[fx]['2020-09-08'])\n plt.legend(pct_rank.keys(),loc='best', bbox_to_anchor=(0.9, -0.13), ncol=3)\n \nplt.xlabel('Percentile of Current Implied Vol')\nplt.ylabel('Implied vs Realized Vol')\nplt.title('Entry Point vs Richness')\nplt.show()", "_____no_output_____" ] ], [ [ "### 2: Downside sensitivity to SPX\n\nLet's now look at beta and correlation with SPX across G10.", "_____no_output_____" ] ], [ [ "spx_spot = Dataset('TREOD').get_data(start_date, end_date, bbid='SPX')[['closePrice']]\nspx_spot = spx_spot.fillna(method='ffill').dropna()\ndf = pd.DataFrame(spx_spot)\n\n#FX Spot data\nfx_spots = 
format_df(spot_fx)\ndata = pd.concat([spx_spot, fx_spots], axis=1).dropna()\ndata.columns = ['SPX'] + g10", "_____no_output_____" ], [ "beta_spx, corr_spx = {}, {}\n\n#calculate rolling 84d or 4m beta to S&P\nfor cross in g10:\n beta_spx[cross] = beta(data[cross],data['SPX'], 84)\n corr_spx[cross] = correlation(data['SPX'], data[cross], 84)\n\nfig, axs = plt.subplots(5, 2, figsize=(18, 20))\nfor j in range(2):\n for i in range(5):\n color='tab:blue'\n axs[i,j].plot(beta_spx[g10[i + j*5]], color=color)\n axs[i,j].set_title(g10[i + j*5])\n color='tab:blue'\n axs[i,j].set_ylabel('Beta', color=color)\n axs[i,j].plot(beta_spx[g10[i + j*5]], color=color)\n ax2 = axs[i,j].twinx()\n color = 'tab:orange' \n ax2.plot(corr_spx[g10[i + j*5]], color=color)\n ax2.set_ylabel('Correlation', color=color)\nplt.show()", "_____no_output_____" ] ], [ [ "### Part 3: AUDJPY conditional relationship with SPX\n\nLet's focus on AUDJPY and look at its relationship with SPX when SPX is significantly up and down.", "_____no_output_____" ] ], [ [ "# resample data to weekly from daily & get weekly returns\nwk_data = data.resample('W-FRI').last()\nrets = returns(wk_data, 1)\nsns.set(style='white', color_codes=True)\nspx_returns = [-.1, -.05, .05, .1]\nr2 = lambda x,y: stats.pearsonr(x,y)[0]**2 \nbetas = pd.DataFrame(index=spx_returns, columns=g10)\nfor ret in spx_returns:\n dns = rets[rets.SPX <= ret].dropna() if ret < 0 else rets[rets.SPX >= ret].dropna() \n j = sns.jointplot(x='SPX', y='AUDJPY', data=dns, kind='reg')\n j.set_axis_labels('SPX with {}% Returns'.format(ret*100), 'AUDJPY')\n j.fig.subplots_adjust(wspace=.02)\n plt.show()", "_____no_output_____" ] ], [ [ "Let's use the beta for all S&P returns to price a structure", "_____no_output_____" ] ], [ [ "sns.jointplot(x='SPX', y='AUDJPY', data=rets, kind='reg', stat_func=r2)", "_____no_output_____" ] ], [ [ "### 4: Price structures \n\n##### Let's now look at a few AUDJPY structures as potential hedges\n\n* Buy 4m AUDJPY put using spx beta 
to size. Max loss limited to premium paid.\n* Buy 4m AUDJPY put spread (4.2%/10.6% OTMS). Max loss limited to premium paid.\n\nFor more info on this trade, check out our market strats piece [here](https://marquee.gs.com/content/#/article/2020/08/28/gs-marketstrats-audjpy-as-us-election-hedge)", "_____no_output_____" ] ], [ [ "#buy 4m AUDJPY put\naudjpy_put = FXOption(option_type='Put', pair='AUDJPY', strike_price= 's-4.2%', expiration_date='4m', buy_sell='Buy') \nprint('cost in bps: {:,.2f}'.format(audjpy_put.premium / audjpy_put.notional_amount * 1e4))", "_____no_output_____" ], [ "#buy 4m AUDJPY put spread (4.2%/10.6% OTMS)\nfrom gs_quant.markets.portfolio import Portfolio\nput1 = FXOption(option_type='Put', pair='AUDJPY', strike_price= 's-4.2%', expiration_date='4m', buy_sell='Buy')\nput2 = FXOption(option_type='Put', pair='AUDJPY', strike_price= 's-10.6%', expiration_date='4m', buy_sell='Sell')\n\nfx_package = Portfolio((put1, put2))\ncost = put2.premium/put2.notional_amount - put1.premium/put1.notional_amount \nprint('cost in bps: {:,.2f}'.format(cost * 1e4))", "_____no_output_____" ] ], [ [ "##### ...And some rates ideas\n\n* Sell straddle. Max loss unlimited.\n* Sell 3m30y straddle, buy 2y30y straddle in a 0 pv package. 
Max loss unlimited.", "_____no_output_____" ] ], [ [ "leg = IRSwaption('Straddle', '30y', notional_currency='USD', expiration_date='3m', buy_sell='Sell')\nprint('PV in USD: {:,.2f}'.format(leg.dollar_price()))", "_____no_output_____" ], [ "leg1 = IRSwaption('Straddle', '30y', notional_currency='USD', expiration_date='3m', buy_sell='Sell',name='3m30y ATM Straddle')\nleg2 = IRSwaption('Straddle', '30y', notional_currency='USD', expiration_date='2y', notional_amount='{}/pv'.format(leg1.price()), buy_sell='Buy', name = '2y30y ATM Straddle')\n\nrates_package = Portfolio((leg1, leg2))\nrates_package.resolve()\n\nprint('Package cost in USD: {:,.2f}'.format(rates_package.price().aggregate()))\nprint('PV Flat notionals ($$m):', round(leg1.notional_amount/1e6, 1),' by ',round(leg2.notional_amount/1e6, 1))", "_____no_output_____" ] ], [ [ "### 5: Analyse rates package", "_____no_output_____" ] ], [ [ "dates = pd.bdate_range(date(2020, 6, 8), leg1.expiration_date, freq='5B').date.tolist()\n\nwith BackToTheFuturePricingContext(dates=dates, roll_to_fwds=True):\n future = rates_package.price()\nrates_future = future.result().aggregate()\n\nrates_future.plot(figsize=(10, 6), title='Historical PV and carry for rates package')\n\nprint('PV breakdown between legs:')\nresults = future.result().to_frame()\nresults /= 1e6\nresults.index=[leg1.name,leg2.name]\nresults.loc['Total'] = results.sum()\nresults.round(1)", "_____no_output_____" ] ], [ [ "Let's focus on the next 3m and how the calendar carries in different rates shocks.", "_____no_output_____" ] ], [ [ "dates = pd.bdate_range(dt.date.today(), leg1.expiration_date, freq='5B').date.tolist()\nshocked_pv = pd.DataFrame(columns=['Base', '5bp per week', '50bp instantaneous'], index=dates)\n\np1, p2, p3 = [], [], []\nwith PricingContext(is_batch=True):\n for t, d in enumerate(dates):\n with CarryScenario(date=d, roll_to_fwds=True):\n p1.append(rates_package.price())\n with MarketDataShockBasedScenario({MarketDataPattern('IR', 'USD'): 
MarketDataShock(MarketDataShockType.Absolute, t*0.0005)}):\n p2.append(rates_package.price())\n with MarketDataShockBasedScenario({MarketDataPattern('IR', 'USD'): MarketDataShock(MarketDataShockType.Absolute, 0.005)}):\n p3.append(rates_package.price())\n\nshocked_pv.Base = [p.result().aggregate() for p in p1]\nshocked_pv['5bp per week'] = [p.result().aggregate() for p in p2]\nshocked_pv['50bp instantaneous'] = [p.result().aggregate() for p in p3]\n\nshocked_pv/=1e6\nshocked_pv.round(1)\nshocked_pv.plot(figsize=(10, 6), title='Carry + scenario analysis')", "_____no_output_____" ] ], [ [ "### Disclaimers\n\nScenarios/predictions: Simulated results are for illustrative purposes only. GS provides no assurance or guarantee that the strategy will operate or would have operated in the past in a manner consistent with the above analysis. Past performance figures are not a reliable indicator of future results.\n\nIndicative Terms/Pricing Levels: This material may contain indicative terms only, including but not limited to pricing levels. There is no representation that any transaction can or could have been effected at such terms or prices. Proposed terms and conditions are for discussion purposes only. Finalized terms and conditions are subject to further discussion and negotiation.\nwww.goldmansachs.com/disclaimer/sales-and-trading-invest-rec-disclosures.html If you are not accessing this material via Marquee ContentStream, a list of the author's investment recommendations disseminated during the preceding 12 months and the proportion of the author's recommendations that are 'buy', 'hold', 'sell' or other over the previous 12 months is available by logging into Marquee ContentStream using the link below. 
Alternatively, if you do not have access to Marquee ContentStream, please contact your usual GS representative who will be able to provide this information to you.\n\nBacktesting, Simulated Results, Sensitivity/Scenario Analysis or Spreadsheet Calculator or Model: There may be data presented herein that is solely for illustrative purposes and which may include among other things back testing, simulated results and scenario analyses. The information is based upon certain factors, assumptions and historical information that Goldman Sachs may in its discretion have considered appropriate, however, Goldman Sachs provides no assurance or guarantee that this product will operate or would have operated in the past in a manner consistent with these assumptions. In the event any of the assumptions used do not prove to be true, results are likely to vary materially from the examples shown herein. Additionally, the results may not reflect material economic and market factors, such as liquidity, transaction costs and other expenses which could reduce potential return.\n\nOTC Derivatives Risk Disclosures: \nTerms of the Transaction: To understand clearly the terms and conditions of any OTC derivative transaction you may enter into, you should carefully review the Master Agreement, including any related schedules, credit support documents, addenda and exhibits. You should not enter into OTC derivative transactions unless you understand the terms of the transaction you are entering into as well as the nature and extent of your risk exposure. You should also be satisfied that the OTC derivative transaction is appropriate for you in light of your circumstances and financial condition. You may be requested to post margin or collateral to support written OTC derivatives at levels consistent with the internal policies of Goldman Sachs. 
\n \nLiquidity Risk: There is no public market for OTC derivative transactions and, therefore, it may be difficult or impossible to liquidate an existing position on favorable terms. Transfer Restrictions: OTC derivative transactions entered into with one or more affiliates of The Goldman Sachs Group, Inc. (Goldman Sachs) cannot be assigned or otherwise transferred without its prior written consent and, therefore, it may be impossible for you to transfer any OTC derivative transaction to a third party. \n \nConflict of Interests: Goldman Sachs may from time to time be an active participant on both sides of the market for the underlying securities, commodities, futures, options or any other derivative or instrument identical or related to those mentioned herein (together, \"the Product\"). Goldman Sachs at any time may have long or short positions in, or buy and sell Products (on a principal basis or otherwise) identical or related to those mentioned herein. Goldman Sachs hedging and trading activities may affect the value of the Products. \n \nCounterparty Credit Risk: Because Goldman Sachs, may be obligated to make substantial payments to you as a condition of an OTC derivative transaction, you must evaluate the credit risk of doing business with Goldman Sachs or its affiliates. \n \nPricing and Valuation: The price of each OTC derivative transaction is individually negotiated between Goldman Sachs and each counterparty and Goldman Sachs does not represent or warrant that the prices for which it offers OTC derivative transactions are the best prices available, possibly making it difficult for you to establish what is a fair price for a particular OTC derivative transaction; The value or quoted price of the Product at any time, however, will reflect many factors and cannot be predicted. 
If Goldman Sachs makes a market in the offered Product, the price quoted by Goldman Sachs would reflect any changes in market conditions and other relevant factors, and the quoted price (and the value of the Product that Goldman Sachs will use for account statements or otherwise) could be higher or lower than the original price, and may be higher or lower than the value of the Product as determined by reference to pricing models used by Goldman Sachs. If at any time a third party dealer quotes a price to purchase the Product or otherwise values the Product, that price may be significantly different (higher or lower) than any price quoted by Goldman Sachs. Furthermore, if you sell the Product, you will likely be charged a commission for secondary market transactions, or the price will likely reflect a dealer discount. Goldman Sachs may conduct market making activities in the Product. To the extent Goldman Sachs makes a market, any price quoted for the OTC derivative transactions, Goldman Sachs may differ significantly from (i) their value determined by reference to Goldman Sachs pricing models and (ii) any price quoted by a third party. The market price of the OTC derivative transaction may be influenced by many unpredictable factors, including economic conditions, the creditworthiness of Goldman Sachs, the value of any underlyers, and certain actions taken by Goldman Sachs. \n \nMarket Making, Investing and Lending: Goldman Sachs engages in market making, investing and lending businesses for its own account and the accounts of its affiliates in the same or similar instruments underlying OTC derivative transactions (including such trading as Goldman Sachs deems appropriate in its sole discretion to hedge its market risk in any OTC derivative transaction whether between Goldman Sachs and you or with third parties) and such trading may affect the value of an OTC derivative transaction. 
\n \nEarly Termination Payments: The provisions of an OTC Derivative Transaction may allow for early termination and, in such cases, either you or Goldman Sachs may be required to make a potentially significant termination payment depending upon whether the OTC Derivative Transaction is in-the-money to Goldman Sachs or you at the time of termination. Indexes: Goldman Sachs does not warrant, and takes no responsibility for, the structure, method of computation or publication of any currency exchange rates, interest rates, indexes of such rates, or credit, equity or other indexes, unless Goldman Sachs specifically advises you otherwise.\nRisk Disclosure Regarding futures, options, equity swaps, and other derivatives as well as non-investment-grade securities and ADRs: Please ensure that you have read and understood the current options, futures and security futures disclosure document before entering into any such transactions. Current United States listed options, futures and security futures disclosure documents are available from our sales representatives or at http://www.theocc.com/components/docs/riskstoc.pdf, http://www.goldmansachs.com/disclosures/risk-disclosure-for-futures.pdf and https://www.nfa.futures.org/investors/investor-resources/files/security-futures-disclosure.pdf, respectively. Certain transactions - including those involving futures, options, equity swaps, and other derivatives as well as non-investment-grade securities - give rise to substantial risk and are not available to nor suitable for all investors. If you have any questions about whether you are eligible to enter into these transactions with Goldman Sachs, please contact your sales representative. Foreign-currency-denominated securities are subject to fluctuations in exchange rates that could have an adverse effect on the value or price of, or income derived from, the investment. 
In addition, investors in securities such as ADRs, the values of which are influenced by foreign currencies, effectively assume currency risk.\nOptions Risk Disclosures: Options may trade at a value other than that which may be inferred from the current levels of interest rates, dividends (if applicable) and the underlier due to other factors including, but not limited to, expectations of future levels of interest rates, future levels of dividends and the volatility of the underlier at any time prior to maturity. Note: Options involve risk and are not suitable for all investors. Please ensure that you have read and understood the current options disclosure document before entering into any standardized options transactions. United States listed options disclosure documents are available from our sales representatives or at http://theocc.com/publications/risks/riskstoc.pdf. A secondary market may not be available for all options. Transaction costs may be a significant factor in option strategies calling for multiple purchases and sales of options, such as spreads. When purchasing long options an investor may lose their entire investment and when selling uncovered options the risk is potentially unlimited. Supporting documentation for any comparisons, recommendations, statistics, technical data, or other similar information will be supplied upon request.\nThis material is for the private information of the recipient only. This material is not sponsored, endorsed, sold or promoted by any sponsor or provider of an index referred herein (each, an \"Index Provider\"). GS does not have any affiliation with or control over the Index Providers or any control over the computation, composition or dissemination of the indices. While GS will obtain information from publicly available sources it believes reliable, it will not independently verify this information. 
Accordingly, GS shall have no liability, contingent or otherwise, to the user or to third parties, for the quality, accuracy, timeliness, continued availability or completeness of the data nor for any special, indirect, incidental or consequential damages which may be incurred or experienced because of the use of the data made available herein, even if GS has been advised of the possibility of such damages.\nStandard & Poor's ยฎ and S&P ยฎ are registered trademarks of The McGraw-Hill Companies, Inc. and S&P GSCIโ„ข is a trademark of The McGraw-Hill Companies, Inc. and have been licensed for use by the Issuer. This Product (the \"Product\") is not sponsored, endorsed, sold or promoted by S&P and S&P makes no representation, warranty or condition regarding the advisability of investing in the Product.\nNotice to Brazilian Investors\nMarquee is not meant for the general public in Brazil. The services or products provided by or through Marquee, at any time, may not be offered or sold to the general public in Brazil. You have received a password granting access to Marquee exclusively due to your existing relationship with a GS business located in Brazil. The selection and engagement with any of the offered services or products through Marquee, at any time, will be carried out directly by you. Before acting to implement any chosen service or products, provided by or through Marquee you should consider, at your sole discretion, whether it is suitable for your particular circumstances and, if necessary, seek professional advice. Any steps necessary in order to implement the chosen service or product, including but not limited to remittance of funds, shall be carried out at your discretion. 
Accordingly, such services and products have not been and will not be publicly issued, placed, distributed, offered or negotiated in the Brazilian capital markets and, as a result, they have not been and will not be registered with the Brazilian Securities and Exchange Commission (Comissรฃo de Valores Mobiliรกrios), nor have they been submitted to the foregoing agency for approval. Documents relating to such services or products, as well as the information contained therein, may not be supplied to the general public in Brazil, as the offering of such services or products is not a public offering in Brazil, nor used in connection with any offer for subscription or sale of securities to the general public in Brazil.\nThe offer of any securities mentioned in this message may not be made to the general public in Brazil. Accordingly, any such securities have not been nor will they be registered with the Brazilian Securities and Exchange Commission (Comissรฃo de Valores Mobiliรกrios) nor has any offer been submitted to the foregoing agency for approval. Documents relating to the offer, as well as the information contained therein, may not be supplied to the public in Brazil, as the offer is not a public offering of securities in Brazil. These terms will apply on every access to Marquee.\nOuvidoria Goldman Sachs Brasil: 0800 727 5764 e/ou [email protected]\nHorรกrio de funcionamento: segunda-feira ร  sexta-feira (exceto feriados), das 9hs ร s 18hs.\nOmbudsman Goldman Sachs Brazil: 0800 727 5764 and / or [email protected]\nAvailable Weekdays (except holidays), from 9 am to 6 pm.\n\n", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d062ad7c364d8195bd0661953d59fa2e49a6751f
16,541
ipynb
Jupyter Notebook
solutions/gqlalchemy-solutions.ipynb
pyladiesams/graphdatabases-gqlalchemy-beginner-mar2022
39b6d1eedfea63177b3e6a124411fdb2341116f5
[ "MIT" ]
4
2021-11-28T09:28:06.000Z
2022-02-23T20:30:47.000Z
solutions/gqlalchemy-solutions.ipynb
pyladiesams/graphdbs-gqlalchemy-beginner-mar2022
39b6d1eedfea63177b3e6a124411fdb2341116f5
[ "MIT" ]
null
null
null
solutions/gqlalchemy-solutions.ipynb
pyladiesams/graphdbs-gqlalchemy-beginner-mar2022
39b6d1eedfea63177b3e6a124411fdb2341116f5
[ "MIT" ]
null
null
null
30.294872
445
0.517562
[ [ [ "# ๐Ÿ’ก Solutions\n\nBefore trying out these solutions, please start the [gqlalchemy-workshop notebook](../workshop/gqlalchemy-workshop.ipynb) to import all data. Also, this solutions manual is here to help you out, and it is recommended you try solving the exercises first by yourself.\n\n## Exercise 1\n\n**Find out how many genres there are in the database.**\n\nThe correct Cypher query is:\n\n```\nMATCH (g:Genre)\nRETURN count(g) AS num_of_genres;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:", "_____no_output_____" ] ], [ [ "from gqlalchemy import match\n\ntotal_genres = (\n match()\n .node(labels=\"Genre\", variable=\"g\")\n .return_({\"count(g)\": \"num_of_genres\"})\n .execute()\n)\n\nresults = list(total_genres)\nfor result in results:\n print(result[\"num_of_genres\"])", "22084\n" ] ], [ [ "## Exercise 2\n\n**Find out to how many genres movie 'Matrix, The (1999)' belongs to.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (:Movie {title: 'Matrix, The (1999)'})-[:OF_GENRE]->(g:Genre)\nRETURN count(g) AS num_of_genres;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:\n", "_____no_output_____" ] ], [ [ "matrix = (\n match()\n .node(labels=\"Movie\", variable=\"m\")\n .to(\"OF_GENRE\")\n .node(labels=\"Genre\", variable=\"g\")\n .where(\"m.title\", \"=\", \"Matrix, The (1999)\")\n .return_({\"count(g)\": \"num_of_genres\"})\n .execute()\n)\n\nresults = list(matrix)\n\nfor result in results:\n print(result[\"num_of_genres\"])", "3\n" ] ], [ [ "## Exercise 3\n\n**Find out the title of the movies that the user with `id` 1 rated.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (:User {id: 1})-[:RATED]->(m:Movie)\nRETURN m.title;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:", "_____no_output_____" ] ], [ [ "movies = (\n match()\n 
.node(labels=\"User\", variable=\"u\")\n .to(\"RATED\")\n .node(labels=\"Movie\", variable=\"m\")\n .where(\"u.id\", \"=\", 1)\n .return_({\"m.title\": \"movie\"})\n .execute()\n)\n\nresults = list(movies)\n\nfor result in results:\n print(result[\"movie\"])", "Toy Story (1995)\nGrumpier Old Men (1995)\nHeat (1995)\nSeven (a.k.a. Se7en) (1995)\nUsual Suspects, The (1995)\nFrom Dusk Till Dawn (1996)\nBottle Rocket (1996)\nBraveheart (1995)\nRob Roy (1995)\nCanadian Bacon (1995)\nDesperado (1995)\nBilly Madison (1995)\nClerks (1994)\nDumb & Dumber (Dumb and Dumber) (1994)\nEd Wood (1994)\nStar Wars: Episode IV - A New Hope (1977)\nPulp Fiction (1994)\nStargate (1994)\nTommy Boy (1995)\nClear and Present Danger (1994)\nForrest Gump (1994)\nJungle Book, The (1994)\nMask, The (1994)\nBlown Away (1994)\nDazed and Confused (1993)\nFugitive, The (1993)\nJurassic Park (1993)\nMrs. Doubtfire (1993)\nSchindler's List (1993)\nSo I Married an Axe Murderer (1993)\nThree Musketeers, The (1993)\nTombstone (1993)\nDances with Wolves (1990)\nBatman (1989)\nSilence of the Lambs, The (1991)\nPinocchio (1940)\nFargo (1996)\nMission: Impossible (1996)\nJames and the Giant Peach (1996)\nSpace Jam (1996)\nRock, The (1996)\nTwister (1996)\nIndependence Day (a.k.a. ID4) (1996)\nShe's the One (1996)\nWizard of Oz, The (1939)\nCitizen Kane (1941)\nAdventures of Robin Hood, The (1938)\nGhost and Mrs. Muir, The (1947)\nMr. Smith Goes to Washington (1939)\nEscape to Witch Mountain (1975)\nWinnie the Pooh and the Blustery Day (1968)\nThree Caballeros, The (1945)\nSword in the Stone, The (1963)\nDumbo (1941)\nPete's Dragon (1977)\nBedknobs and Broomsticks (1971)\nAlice in Wonderland (1951)\nThat Thing You Do! (1996)\nGhost and the Darkness, The (1996)\nSwingers (1996)\nWilly Wonka & the Chocolate Factory (1971)\nMonty Python's Life of Brian (1979)\nReservoir Dogs (1992)\nPlatoon (1986)\nBasic Instinct (1992)\nE.T. 
the Extra-Terrestrial (1982)\nAbyss, The (1989)\nMonty Python and the Holy Grail (1975)\nStar Wars: Episode V - The Empire Strikes Back (1980)\nPrincess Bride, The (1987)\nRaiders of the Lost Ark (Indiana Jones and the Raiders of the Lost Ark) (1981)\nClockwork Orange, A (1971)\nApocalypse Now (1979)\nStar Wars: Episode VI - Return of the Jedi (1983)\nGoodfellas (1990)\nAlien (1979)\nPsycho (1960)\nBlues Brothers, The (1980)\nFull Metal Jacket (1987)\nHenry V (1989)\nQuiet Man, The (1952)\nTerminator, The (1984)\nDuck Soup (1933)\nShining, The (1980)\nGroundhog Day (1993)\nBack to the Future (1985)\nHighlander (1986)\nYoung Frankenstein (1974)\nFantasia (1940)\nIndiana Jones and the Last Crusade (1989)\nPink Floyd: The Wall (1982)\nNosferatu (Nosferatu, eine Symphonie des Grauens) (1922)\nBatman Returns (1992)\nSneakers (1992)\nLast of the Mohicans, The (1992)\nMcHale's Navy (1997)\nBest Men (1997)\nGrosse Pointe Blank (1997)\nAustin Powers: International Man of Mystery (1997)\nCon Air (1997)\nFace/Off (1997)\nMen in Black (a.k.a. MIB) (1997)\nConan the Barbarian (1982)\nL.A. 
Confidential (1997)\nKiss the Girls (1997)\nGame, The (1997)\nI Know What You Did Last Summer (1997)\nStarship Troopers (1997)\nBig Lebowski, The (1998)\nWedding Singer, The (1998)\nWelcome to Woop-Woop (1997)\nNewton Boys, The (1998)\nWild Things (1998)\nSmall Soldiers (1998)\nAll Quiet on the Western Front (1930)\nRocky (1976)\nLabyrinth (1986)\nLethal Weapon (1987)\nGoonies, The (1985)\nBack to the Future Part III (1990)\nBambi (1942)\nSaving Private Ryan (1998)\nBlack Cauldron, The (1985)\nFlight of the Navigator (1986)\nGreat Mouse Detective, The (1986)\nHoney, I Shrunk the Kids (1989)\nNegotiator, The (1998)\nJungle Book, The (1967)\nRescuers, The (1977)\nReturn to Oz (1985)\nRocketeer, The (1991)\nSleeping Beauty (1959)\nSong of the South (1946)\nTron (1982)\nIndiana Jones and the Temple of Doom (1984)\nLord of the Rings, The (1978)\nCharlotte's Web (1973)\nSecret of NIMH, The (1982)\nAmerican Tail, An (1986)\nLegend (1985)\nNeverEnding Story, The (1984)\nBeetlejuice (1988)\nWillow (1988)\nToys (1992)\nFew Good Men, A (1992)\nRush Hour (1998)\nEdward Scissorhands (1990)\nAmerican History X (1998)\nI Still Know What You Did Last Summer (1998)\nEnemy of the State (1998)\nKing Kong (1933)\nVery Bad Things (1998)\nPsycho (1998)\nRushmore (1998)\nRomancing the Stone (1984)\nYoung Sherlock Holmes (1985)\nThin Red Line, The (1998)\nHoward the Duck (1986)\nTexas Chainsaw Massacre, The (1974)\nCrocodile Dundee (1986)\nยกThree Amigos! (1986)\n20 Dates (1998)\nOffice Space (1999)\nLogan's Run (1976)\nPlanet of the Apes (1968)\nLock, Stock & Two Smoking Barrels (1998)\nMatrix, The (1999)\nGo (1999)\nSLC Punk! (1998)\nDick Tracy (1990)\nMummy, The (1999)\nStar Wars: Episode I - The Phantom Menace (1999)\nSuperman (1978)\nSuperman II (1980)\nDracula (1931)\nFrankenstein (1931)\nWolf Man, The (1941)\nRocky Horror Picture Show, The (1975)\nRun Lola Run (Lola rennt) (1998)\nSouth Park: Bigger, Longer and Uncut (1999)\nGhostbusters (a.k.a. 
Ghost Busters) (1984)\nIron Giant, The (1999)\nBig (1988)\n13th Warrior, The (1999)\nAmerican Beauty (1999)\nExcalibur (1981)\nGulliver's Travels (1939)\nTotal Recall (1990)\nDirty Dozen, The (1967)\nGoldfinger (1964)\nFrom Russia with Love (1963)\nDr. No (1962)\nFight Club (1999)\nRoboCop (1987)\nWho Framed Roger Rabbit? (1988)\nLive and Let Die (1973)\nThunderball (1965)\nBeing John Malkovich (1999)\nSpaceballs (1987)\nRobin Hood (1973)\nDogma (1999)\nMessenger: The Story of Joan of Arc, The (1999)\nLongest Day, The (1962)\nGreen Mile, The (1999)\nEasy Rider (1969)\nTalented Mr. Ripley, The (1999)\nEncino Man (1992)\nSister Act (1992)\nWayne's World (1992)\nScream 3 (2000)\nJFK (1991)\nTeenage Mutant Ninja Turtles II: The Secret of the Ooze (1991)\nTeenage Mutant Ninja Turtles III (1993)\nRed Dawn (1984)\nGood Morning, Vietnam (1987)\nGrumpy Old Men (1993)\nLadyhawke (1985)\nHook (1991)\nPredator (1987)\nGladiator (2000)\nRoad Trip (2000)\nMan with the Golden Gun, The (1974)\nBlazing Saddles (1974)\nMad Max (1979)\nRoad Warrior, The (Mad Max 2) (1981)\nShaft (1971)\nBig Trouble in Little China (1986)\nShaft (2000)\nX-Men (2000)\nWhat About Bob? (1991)\nTransformers: The Movie (1986)\nM*A*S*H (a.k.a. 
MASH) (1970)\n" ] ], [ [ "## Exercise 4\n\n**List 15 movies of 'Documentary' and 'Comedy' genres and sort them by title descending.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (m:Movie)-[:OF_GENRE]->(:Genre {name: \"Documentary\"})\nMATCH (m)-[:OF_GENRE]->(:Genre {name: \"Comedy\"})\nRETURN m.title\nORDER BY m.title DESC\nLIMIT 15;\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:", "_____no_output_____" ] ], [ [ "movies = (\n match()\n .node(labels=\"Movie\", variable=\"m\")\n .to(\"OF_GENRE\")\n .node(labels=\"Genre\", variable=\"g1\")\n .where(\"g1.name\", \"=\", \"Documentary\")\n .match()\n .node(labels=\"Movie\", variable=\"m\")\n .to(\"OF_GENRE\")\n .node(labels=\"Genre\", variable=\"g2\")\n .where(\"g2.name\", \"=\", \"Comedy\")\n .return_({\"m.title\": \"movie\"})\n .order_by(\"m.title DESC\")\n .limit(15)\n .execute()\n)\n\nresults = list(movies)\n\nfor result in results:\n print(result[\"movie\"])", "What the #$*! Do We Know!? (a.k.a. What the Bleep Do We Know!?) (2004)\nUnion: The Business Behind Getting High, The (2007)\nSuper Size Me (2004)\nSuper High Me (2007)\nSecret Policeman's Other Ball, The (1982)\nRichard Pryor Live on the Sunset Strip (1982)\nReligulous (2008)\nPaper Heart (2009)\nOriginal Kings of Comedy, The (2000)\nMerci Patron ! 
(2016)\nMartin Lawrence Live: Runteldat (2002)\nKevin Hart: Laugh at My Pain (2011)\nJeff Ross Roasts Criminals: Live at Brazos County Jail (2015)\nJackass: The Movie (2002)\nJackass Number Two (2006)\n" ] ], [ [ "## Exercise 5\n\n**Find out the minimum rating of the 'Star Wars: Episode I - The Phantom Menace (1999)' movie.**\n\n\nThe correct Cypher query is:\n\n```\nMATCH (:User)-[r:RATED]->(:Movie {title: 'Star Wars: Episode I - The Phantom Menace (1999)'})\nRETURN min(r.rating);\n```\n\nYou can try it out in Memgraph Lab at `localhost:3000`.\n\nWith GQLAlchemy's query builder, the solution is:", "_____no_output_____" ] ], [ [ "rating = (\n match()\n .node(labels=\"User\")\n .to(\"RATED\", variable=\"r\")\n .node(labels=\"Movie\", variable=\"m\")\n .where(\"m.title\", \"=\", \"Star Wars: Episode I - The Phantom Menace (1999)\")\n .return_({\"min(r.rating)\": \"min_rating\"})\n .execute()\n)\n\nresults = list(rating)\n\nfor result in results:\n print(result[\"min_rating\"])", "0.5\n" ] ], [ [ "And that's it! If you have any issues with this notebook, feel free to open an issue on the [GitHub repository](https://github.com/pyladiesams/graphdbs-gqlalchemy-beginner-mar2022), or [join the Discord server](https://discord.gg/memgraph) and get your answer instantly. If you are interested in the Cypher query language and want to learn more, sign up for the free [Cypher Email Course](https://memgraph.com/learn-cypher-query-language).", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d062af471a2cba624d9fcde9b9879fe8ddf2e6c9
325,489
ipynb
Jupyter Notebook
_notebooks/2021_04_28_PGA_Wins.ipynb
brennanashley/lambdalost
cfa50a18039062712919d99a01a8e6dcd484dc6c
[ "Apache-2.0" ]
1
2021-02-27T02:10:15.000Z
2021-02-27T02:10:15.000Z
_notebooks/2021_04_28_PGA_Wins.ipynb
brennanashley/lambdalost
cfa50a18039062712919d99a01a8e6dcd484dc6c
[ "Apache-2.0" ]
null
null
null
_notebooks/2021_04_28_PGA_Wins.ipynb
brennanashley/lambdalost
cfa50a18039062712919d99a01a8e6dcd484dc6c
[ "Apache-2.0" ]
null
null
null
172.581654
195,586
0.851387
[ [ [ "# \"PGA Tour Wins Classification\"\n\n\n", "_____no_output_____" ] ], [ [ "Can We Predict If a PGA Tour Player Won a Tournament in a Given Year?\n\nGolf is picking up popularity, so I thought it would be interesting to focus my project here. I set out to find what sets apart the best golfers from the rest. \nI decided to explore their statistics and to see if I could predict which golfers would win in a given year. My original dataset was found on Kaggle, and the data was scraped from the PGA Tour website. \n\nFrom this data, I performed an exploratory data analysis to explore the distribution of players on numerous aspects of the game, discover outliers, and further explore how the game has changed from 2010 to 2018. I also utilized numerous supervised machine learning models to predict a golfer's earnings and wins.\n\nTo predict the golfer's win, I used classification methods such as logisitic regression and Random Forest Classification. The best performance came from the Random Forest Classification method.", "_____no_output_____" ], [ "1. The Data\n\npgaTourData.csv contains 1674 rows and 18 columns. 
Each row indicates a golfer's performance for that year.\n", "_____no_output_____" ] ], [ [ "\n# Player Name: Name of the golfer\n\n# Rounds: The number of games that a player played\n\n# Fairway Percentage: The percentage of time a tee shot lands on the fairway\n\n# Year: The year in which the statistic was collected\n\n# Avg Distance: The average distance of the tee-shot\n\n# gir: (Green in Regulation) is met if any part of the ball is touching the putting surface while the number of strokes taken is at least two fewer than par\n\n# Average Putts: The average number of strokes taken on the green\n\n# Average Scrambling: Scrambling is when a player misses the green in regulation, but still makes par or better on a hole\n\n# Average Score: Average Score is the average of all the scores a player has played in that year\n\n# Points: The number of FedExCup points a player earned in that year\n\n# Wins: The number of competition a player has won in that year\n\n# Top 10: The number of competitions where a player has placed in the Top 10\n\n# Average SG Putts: Strokes gained: putting measures how many strokes a player gains (or loses) on the greens\n\n# Average SG Total: The Off-the-tee + approach-the-green + around-the-green + putting statistics combined\n\n# SG:OTT: Strokes gained: off-the-tee measures player performance off the tee on all par-4s and par-5s\n\n# SG:APR: Strokes gained: approach-the-green measures player performance on approach shots\n\n# SG:ARG: Strokes gained: around-the-green measures player performance on any shot within 30 yards of the edge of the green\n\n# Money: The amount of prize money a player has earned from tournaments\n", "_____no_output_____" ], [ "#collapse\n# importing packages\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns", "_____no_output_____" ], [ "# Importing the data \ndf = pd.read_csv('pgaTourData.csv')\n\n# Examining the first 5 data\ndf.head()", "_____no_output_____" ], [ 
"#collapse\ndf.info()", "<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 2312 entries, 0 to 2311\nData columns (total 18 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 Player Name 2312 non-null object \n 1 Rounds 1678 non-null float64\n 2 Fairway Percentage 1678 non-null float64\n 3 Year 2312 non-null int64 \n 4 Avg Distance 1678 non-null float64\n 5 gir 1678 non-null float64\n 6 Average Putts 1678 non-null float64\n 7 Average Scrambling 1678 non-null float64\n 8 Average Score 1678 non-null float64\n 9 Points 2296 non-null object \n 10 Wins 293 non-null float64\n 11 Top 10 1458 non-null float64\n 12 Average SG Putts 1678 non-null float64\n 13 Average SG Total 1678 non-null float64\n 14 SG:OTT 1678 non-null float64\n 15 SG:APR 1678 non-null float64\n 16 SG:ARG 1678 non-null float64\n 17 Money 2300 non-null object \ndtypes: float64(14), int64(1), object(3)\nmemory usage: 325.2+ KB\n" ], [ "#collapse\ndf.shape", "_____no_output_____" ] ], [ [ "2. Data Cleaning\n\n\nAfter looking at the dataframe, the data needs to be cleaned:\n\n-For the columns Top 10 and Wins, convert the NaNs to 0s\n\n-Change Top 10 and Wins into an int \n\n-Drop NaN values for players who do not have the full statistics\n\n-Change the columns Rounds into int\n\n-Change points to int\n\n-Remove the dollar sign ($) and commas in the column Money", "_____no_output_____" ] ], [ [ "# Replace NaN with 0 in Top 10 \ndf['Top 10'].fillna(0, inplace=True)\ndf['Top 10'] = df['Top 10'].astype(int)\n\n# Replace NaN with 0 in # of wins\ndf['Wins'].fillna(0, inplace=True)\ndf['Wins'] = df['Wins'].astype(int)\n\n# Drop NaN values \ndf.dropna(axis = 0, inplace=True)", "_____no_output_____" ], [ "# Change Rounds to int\ndf['Rounds'] = df['Rounds'].astype(int)\n\n# Change Points to int \ndf['Points'] = df['Points'].apply(lambda x: x.replace(',',''))\ndf['Points'] = df['Points'].astype(int)\n\n# Remove the $ and commas in money \ndf['Money'] = df['Money'].apply(lambda x: 
x.replace('$',''))\ndf['Money'] = df['Money'].apply(lambda x: x.replace(',',''))\ndf['Money'] = df['Money'].astype(float)", "_____no_output_____" ], [ "#collapse\ndf.info()", "<class 'pandas.core.frame.DataFrame'>\nInt64Index: 1674 entries, 0 to 1677\nData columns (total 18 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 Player Name 1674 non-null object \n 1 Rounds 1674 non-null int64 \n 2 Fairway Percentage 1674 non-null float64\n 3 Year 1674 non-null int64 \n 4 Avg Distance 1674 non-null float64\n 5 gir 1674 non-null float64\n 6 Average Putts 1674 non-null float64\n 7 Average Scrambling 1674 non-null float64\n 8 Average Score 1674 non-null float64\n 9 Points 1674 non-null int64 \n 10 Wins 1674 non-null int64 \n 11 Top 10 1674 non-null int64 \n 12 Average SG Putts 1674 non-null float64\n 13 Average SG Total 1674 non-null float64\n 14 SG:OTT 1674 non-null float64\n 15 SG:APR 1674 non-null float64\n 16 SG:ARG 1674 non-null float64\n 17 Money 1674 non-null float64\ndtypes: float64(12), int64(5), object(1)\nmemory usage: 248.5+ KB\n" ], [ "#collapse\ndf.describe()", "_____no_output_____" ] ], [ [ "3. Exploratory Data Analysis", "_____no_output_____" ] ], [ [ "#collapse_output\n# Looking at the distribution of data\nf, ax = plt.subplots(nrows = 6, ncols = 3, figsize=(20,20))\ndistribution = df.loc[:,df.columns!='Player Name'].columns\nrows = 0\ncols = 0\nfor i, column in enumerate(distribution):\n p = sns.distplot(df[column], ax=ax[rows][cols])\n cols += 1\n if cols == 3:\n cols = 0\n rows += 1", "/usr/local/lib/python3.7/dist-packages/seaborn/distributions.py:2557: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. 
Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n" ] ], [ [ "From the distributions plotted, most of the graphs are normally distributed. However, we can observe that Money, Points, Wins, and Top 10s are all skewed to the right. This could be explained by the separation of the best players and the average PGA Tour player. The best players have multiple placings in the Top 10 with wins that allow them to earn more from tournaments, while the average player will have no wins and only a few Top 10 placings that prevent them from earning as much.", "_____no_output_____" ] ], [ [ "#collapse_output\n# Looking at the number of players with Wins for each year \nwin = df.groupby('Year')['Wins'].value_counts()\nwin = win.unstack()\nwin.fillna(0, inplace=True)\n\n# Converting win into ints\nwin = win.astype(int)\n\nprint(win)", "Wins 0 1 2 3 4 5\nYear \n2010 166 21 5 0 0 0\n2011 156 25 5 0 0 0\n2012 159 26 4 1 0 0\n2013 152 24 3 0 0 1\n2014 142 29 3 2 0 0\n2015 150 29 2 1 1 0\n2016 152 28 4 1 0 0\n2017 156 30 0 3 1 0\n2018 158 26 5 3 0 0\n" ] ], [ [ "From this table, we can see that most players end the year without a win. 
It's pretty rare to find a player who has won more than once!", "_____no_output_____" ] ], [ [ "# Looking at the percentage of players without a win in that year \nplayers = win.apply(lambda x: np.sum(x), axis=1)\npercent_no_win = win[0]/players\npercent_no_win = percent_no_win*100\nprint(percent_no_win)", "Year\n2010 86.458333\n2011 83.870968\n2012 83.684211\n2013 84.444444\n2014 80.681818\n2015 81.967213\n2016 82.162162\n2017 82.105263\n2018 82.291667\ndtype: float64\n" ], [ "#collapse_output\n# Plotting percentage of players without a win each year \nfig, ax = plt.subplots()\nbar_width = 0.8\nopacity = 0.7 \nindex = np.arange(2010, 2019)\n\nplt.bar(index, percent_no_win, bar_width, alpha = opacity)\nplt.xticks(index)\nplt.xlabel('Year')\nplt.ylabel('%')\nplt.title('Percentage of Players without a Win')", "_____no_output_____" ] ], [ [ "From the bar chart above, we can observe that the percentages of players without a win are around 80%. There was very little variation in the percentage of players without a win in the past 8 years.", "_____no_output_____" ] ], [ [ "#collapse_output\n# Plotting the number of wins on a bar chart \nfig, ax = plt.subplots()\nindex = np.arange(2010, 2019)\nbar_width = 0.2\nopacity = 0.7 \n\ndef plot_bar(index, win, labels):\n plt.bar(index, win, bar_width, alpha=opacity, label=labels)\n\n# Plotting the bars\nrects = plot_bar(index, win[0], labels = '0 Wins')\nrects1 = plot_bar(index + bar_width, win[1], labels = '1 Wins')\nrects2 = plot_bar(index + bar_width*2, win[2], labels = '2 Wins')\nrects3 = plot_bar(index + bar_width*3, win[3], labels = '3 Wins')\nrects4 = plot_bar(index + bar_width*4, win[4], labels = '4 Wins')\nrects5 = plot_bar(index + bar_width*5, win[5], labels = '5 Wins')\n\nplt.xticks(index + bar_width, index)\nplt.xlabel('Year')\nplt.ylabel('Number of Players')\nplt.title('Distribution of Wins each Year')\nplt.legend()", "_____no_output_____" ] ], [ [ "By looking at the distribution of Wins each year, we can see that it 
is rare for most players to even win a tournament in the PGA Tour. The majority of players do not win, and very few players win more than once a year.", "_____no_output_____" ] ], [ [ "# Percentage of people who did not place in the top 10 each year\ntop10 = df.groupby('Year')['Top 10'].value_counts()\ntop10 = top10.unstack()\ntop10.fillna(0, inplace=True)\nplayers = top10.apply(lambda x: np.sum(x), axis=1)\n\nno_top10 = top10[0]/players * 100\nprint(no_top10)", "Year\n2010 17.187500\n2011 25.268817\n2012 23.157895\n2013 18.888889\n2014 16.477273\n2015 18.579235\n2016 20.000000\n2017 15.789474\n2018 17.187500\ndtype: float64\n" ] ], [ [ "By looking at the percentage of players that did not place in the top 10 by year, we can observe that only approximately 20% of players did not place in the Top 10. In addition, the range for these players that did not place in the Top 10 is only 9.47%. This tells us that this statistic does not vary much on a yearly basis.", "_____no_output_____" ] ], [ [ "# Who are some of the longest hitters \ndistance = df[['Year','Player Name','Avg Distance']].copy()\ndistance.sort_values(by='Avg Distance', inplace=True, ascending=False)\nprint(distance.head())", " Year Player Name Avg Distance\n162 2018 Rory McIlroy 319.7\n1481 2011 J.B. Holmes 318.4\n174 2018 Trey Mullinax 318.3\n732 2015 Dustin Johnson 317.7\n350 2017 Rory McIlroy 316.7\n" ] ], [ [ "Rory McIlroy is one of the longest hitters in the game, posting the highest average driving distance of 319.7 yards in 2018. He was also the longest hitter in 2017 with an average of 316.7 yards. ", "_____no_output_____" ] ], [ [ "# Who made the most money\nmoney_ranking = df[['Year','Player Name','Money']].copy()\nmoney_ranking.sort_values(by='Money', inplace=True, ascending=False)\nprint(money_ranking.head())", " Year Player Name Money\n647 2015 Jordan Spieth 12030465.0\n361 2017 Justin Thomas 9921560.0\n303 2017 Jordan Spieth 9433033.0\n729 2015 Jason Day 9403330.0\n520 2016 Dustin Johnson 9365185.0\n" ] ], [ [ "We can see that Jordan Spieth made the most money in a single year, earning a total of 12 million dollars in 2015.", "_____no_output_____" ] ], [ [ "#collapse_output\n# Who made the most money each year\nmoney_rank = money_ranking.groupby('Year')['Money'].max()\nmoney_rank = pd.DataFrame(money_rank)\n\nnames = []\nfor i in range(money_rank.shape[0]):\n temp = df.loc[df['Money'] == money_rank.iloc[i,0],'Player Name']\n names.append(str(temp.values[0]))\n\nmoney_rank['Player Name'] = names\nprint(money_rank)", " Money Player Name\nYear \n2010 4910477.0 Matt Kuchar\n2011 6683214.0 Luke Donald\n2012 8047952.0 Rory McIlroy\n2013 8553439.0 Tiger Woods\n2014 8280096.0 Rory McIlroy\n2015 12030465.0 Jordan Spieth\n2016 9365185.0 Dustin Johnson\n2017 9921560.0 Justin Thomas\n2018 8694821.0 Justin Thomas\n" ] ], [ [ "With this table, we can examine the earnings of each player by year. Some of the most notable were Jordan Spieth's earnings of 12 million dollars and Justin Thomas earning the most money in both 2017 and 2018.", "_____no_output_____" ] ], [ [ "#collapse_output\n# Plot the correlation matrix between variables \ncorr = df.corr()\nsns.heatmap(corr, \n xticklabels=corr.columns.values,\n yticklabels=corr.columns.values,\n cmap='coolwarm')", "_____no_output_____" ], [ "df.corr()['Wins']", "_____no_output_____" ] ], [ [ "From the correlation matrix, we can observe that Money is highly correlated to wins along with the FedExCup Points. 
We can also observe that the fairway percentage, year, and rounds are not correlated to Wins.", "_____no_output_____" ], [ "4. Machine Learning Model (Classification)\n\n\nTo predict winners, I used multiple machine learning models to explore which models could accurately classify if a player is going to win in that year.\n\nTo measure the models, I used the Receiver Operating Characteristic Area Under the Curve (ROC AUC). The ROC AUC tells us how capable the model is at distinguishing players with a win. In addition, as the data is skewed with 83% of players having no wins in that year, ROC AUC is a much better metric than the accuracy of the model.", "_____no_output_____" ] ], [ [ "#collapse\n# Importing the Machine Learning modules\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.metrics import roc_curve, roc_auc_score\nfrom sklearn.metrics import confusion_matrix\nfrom sklearn.feature_selection import RFE\nfrom sklearn.metrics import classification_report\nfrom sklearn.preprocessing import PolynomialFeatures\nfrom sklearn.svm import SVC \nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.preprocessing import MinMaxScaler\n", "_____no_output_____" ] ], [ [ "Preparing the Data for Classification\n\nWe know from the calculation above that the data for wins is skewed. Even without machine learning, we know that approximately 83% of the players do not record a win. 
Therefore, we will be utilizing ROC AUC as the metric for these models.", "_____no_output_____" ] ], [ [ "# Adding the Winner column to determine if the player won that year or not \ndf['Winner'] = df['Wins'].apply(lambda x: 1 if x>0 else 0)\n\n# New DataFrame \nml_df = df.copy()\n\n# Y value for machine learning is the Winner column\ntarget = df['Winner']\n\n# Removing the columns Player Name, Wins, and Winner from the dataframe to avoid leakage\nml_df.drop(['Player Name','Wins','Winner'], axis=1, inplace=True)\nprint(ml_df.head())", " Rounds Fairway Percentage Year ... SG:APR SG:ARG Money\n0 60 75.19 2018 ... 0.960 -0.027 2680487.0\n1 109 73.58 2018 ... 0.213 0.194 2485203.0\n2 93 72.24 2018 ... 0.437 -0.137 2700018.0\n3 78 71.94 2018 ... 0.532 0.273 1986608.0\n4 103 71.44 2018 ... 0.099 0.026 1089763.0\n\n[5 rows x 16 columns]\n" ], [ "## Logistic Regression Baseline\nper_no_win = target.value_counts()[0] / (target.value_counts()[0] + target.value_counts()[1])\nper_no_win = per_no_win.round(4)*100\nprint(str(per_no_win)+str('%'))", "83.09%\n" ], [ "#collapse_show\n# Function for the logistic regression \ndef log_reg(X, y):\n X_train, X_test, y_train, y_test = train_test_split(X, y,\n random_state = 10)\n clf = LogisticRegression().fit(X_train, y_train)\n y_pred = clf.predict(X_test)\n print('Accuracy of Logistic regression classifier on training set: {:.2f}'\n .format(clf.score(X_train, y_train)))\n print('Accuracy of Logistic regression classifier on test set: {:.2f}'\n .format(clf.score(X_test, y_test)))\n cf_mat = confusion_matrix(y_test, y_pred)\n confusion = pd.DataFrame(data = cf_mat)\n print(confusion)\n \n print(classification_report(y_test, y_pred))\n\n # Returning the 5 important features \n #rfe = RFE(clf, 5)\n # rfe = rfe.fit(X, y)\n # print('Feature Importance')\n # print(X.columns[rfe.ranking_ == 1].values)\n \n print('ROC AUC Score: {:.2f}'.format(roc_auc_score(y_test, y_pred)))", "_____no_output_____" ], [ "#collapse_show\nlog_reg(ml_df, target)", "Accuracy of Logistic regression classifier on training set: 0.90\nAccuracy of Logistic regression classifier on test set: 0.91\n 0 1\n0 345 8\n1 28 38\n precision recall f1-score support\n\n 0 0.92 0.98 0.95 353\n 1 0.83 0.58 0.68 66\n\n accuracy 0.91 419\n macro avg 0.88 0.78 0.81 419\nweighted avg 0.91 0.91 0.91 419\n\nROC AUC Score: 0.78\n" ] ], [ [ "From the logistic regression, we got an accuracy of 0.90 on the training set and an accuracy of 0.91 on the test set. This was surprisingly accurate for a first run. However, the ROC AUC Score of 0.78 could be improved. Therefore, I decided to add more features as a way of possibly improving the model.\n\n", "_____no_output_____" ] ], [ [ "## Feature Engineering\n\n# Adding Domain Features \nml_d = ml_df.copy()\n# Top 10 / Money might give us a better understanding of how well they placed in the top 10\nml_d['Top10perMoney'] = ml_d['Top 10'] / ml_d['Money']\n\n# Avg Distance / Fairway Percentage to give us a ratio that determines how accurate and far a player hits \nml_d['DistanceperFairway'] = ml_d['Avg Distance'] / ml_d['Fairway Percentage']\n\n# Money / Rounds to see on average how much money they would make playing a round of golf \nml_d['MoneyperRound'] = ml_d['Money'] / ml_d['Rounds']", "_____no_output_____" ], [ "#collapse_show\nlog_reg(ml_d, target)", "Accuracy of Logistic regression classifier on training set: 0.91\nAccuracy of Logistic regression classifier on test set: 0.91\n 0 1\n0 342 11\n1 27 39\n precision recall f1-score support\n\n 0 0.93 0.97 0.95 353\n 1 0.78 0.59 0.67 66\n\n accuracy 0.91 419\n macro avg 0.85 0.78 0.81 419\nweighted avg 0.90 0.91 0.90 419\n\nROC AUC Score: 0.78\n" ], [ "#collapse_show\n# Adding Polynomial Features to the ml_df \nmldf2 = ml_df.copy()\npoly = PolynomialFeatures(2)\npoly = poly.fit(mldf2)\npoly_feature = poly.transform(mldf2)\nprint(poly_feature.shape)\n\n# Creating a DataFrame with the polynomial features \npoly_feature = pd.DataFrame(poly_feature, columns = poly.get_feature_names(ml_df.columns))\nprint(poly_feature.head())", "(1674, 153)\n 1 Rounds Fairway Percentage ... SG:ARG^2 SG:ARG Money Money^2\n0 1.0 60.0 75.19 ... 0.000729 -72373.149 7.185011e+12\n1 1.0 109.0 73.58 ... 0.037636 482129.382 6.176234e+12\n2 1.0 93.0 72.24 ... 0.018769 -369902.466 7.290097e+12\n3 1.0 78.0 71.94 ... 0.074529 542343.984 3.946611e+12\n4 1.0 103.0 71.44 ... 0.000676 28333.838 1.187583e+12\n\n[5 rows x 153 columns]\n" ], [ "#collapse_show\nlog_reg(poly_feature, target)", "Accuracy of Logistic regression classifier on training set: 0.90\nAccuracy of Logistic regression classifier on test set: 0.91\n 0 1\n0 346 7\n1 32 34\n precision recall f1-score support\n\n 0 0.92 0.98 0.95 353\n 1 0.83 0.52 0.64 66\n\n accuracy 0.91 419\n macro avg 0.87 0.75 0.79 419\nweighted avg 0.90 0.91 0.90 419\n\nROC AUC Score: 0.75\n" ] ], [ [ "From feature engineering, there were no improvements in the ROC AUC Score. In fact, as I added more features, the accuracy and the ROC AUC Score decreased. 
This could signal to us that another machine learning algorithm could better predict winners.", "_____no_output_____" ] ], [ [ "#collapse_show\n## Random Forest Model\n\ndef random_forest(X, y):\n X_train, X_test, y_train, y_test = train_test_split(X, y,\n random_state = 10)\n clf = RandomForestClassifier(n_estimators=200).fit(X_train, y_train)\n y_pred = clf.predict(X_test)\n print('Accuracy of Random Forest classifier on training set: {:.2f}'\n .format(clf.score(X_train, y_train)))\n print('Accuracy of Random Forest classifier on test set: {:.2f}'\n .format(clf.score(X_test, y_test)))\n \n cf_mat = confusion_matrix(y_test, y_pred)\n confusion = pd.DataFrame(data = cf_mat)\n print(confusion)\n \n print(classification_report(y_test, y_pred))\n \n # Returning the 5 important features \n rfe = RFE(clf, 5)\n rfe = rfe.fit(X, y)\n print('Feature Importance')\n print(X.columns[rfe.ranking_ == 1].values)\n \n print('ROC AUC Score: {:.2f}'.format(roc_auc_score(y_test, y_pred)))", "_____no_output_____" ], [ "#collapse_show\nrandom_forest(ml_df, target)", "Accuracy of Random Forest classifier on training set: 1.00\nAccuracy of Random Forest classifier on test set: 0.94\n 0 1\n0 342 11\n1 16 50\n precision recall f1-score support\n\n 0 0.96 0.97 0.96 353\n 1 0.82 0.76 0.79 66\n\n accuracy 0.94 419\n macro avg 0.89 0.86 0.87 419\nweighted avg 0.93 0.94 0.93 419\n\nFeature Importance\n['Average Score' 'Points' 'Top 10' 'Average SG Total' 'Money']\nROC AUC Score: 0.86\n" ], [ "#collapse_show\nrandom_forest(ml_d, target)", "Accuracy of Random Forest classifier on training set: 1.00\nAccuracy of Random Forest classifier on test set: 0.94\n 0 1\n0 343 10\n1 16 50\n precision recall f1-score support\n\n 0 0.96 0.97 0.96 353\n 1 0.83 0.76 0.79 66\n\n accuracy 0.94 419\n macro avg 0.89 0.86 0.88 419\nweighted avg 0.94 0.94 0.94 419\n\nFeature Importance\n['Average Score' 'Points' 'Average SG Total' 'Money' 'MoneyperRound']\nROC AUC Score: 0.86\n" ], [ "#collapse_show\nrandom_forest(poly_feature, target)", "Accuracy of Random Forest classifier on training set: 1.00\nAccuracy of Random Forest classifier on test set: 0.94\n 0 1\n0 340 13\n1 14 52\n precision recall f1-score support\n\n 0 0.96 0.96 0.96 353\n 1 0.80 0.79 0.79 66\n\n accuracy 0.94 419\n macro avg 0.88 0.88 0.88 419\nweighted avg 0.94 0.94 0.94 419\n\nFeature Importance\n['Year Points' 'Average Putts Points' 'Average Scrambling Top 10'\n 'Average Score Points' 'Points^2']\nROC AUC Score: 0.88\n" ] ], [ [ "The Random Forest Model scored highly on the ROC AUC Score, obtaining a value of 0.88. With this, we observed that the Random Forest Model could accurately classify players with and without a win.", "_____no_output_____" ], [ "6. Conclusion\n\nIt's been interesting to learn about numerous aspects of the game that differentiate the winner and the average PGA Tour player. For example, we can see that the fairway percentage and greens in regulation do not seem to contribute as much to a player's win. However, all the strokes gained statistics contribute pretty highly to wins for these players. It was interesting to see which aspects of the game the professionals should put their time into. This also gave me the idea of tracking my personal golf statistics, so that I could compare them to the pros and find areas of my game that need the most improvement.\n\nMachine Learning Model\nI've been able to examine the data of PGA Tour players and classify if a player will win that year or not. With the random forest classification model, I was able to achieve an ROC AUC of 0.88 and an accuracy of 0.94 on the test set. This was a significant improvement from the ROC AUC of 0.78 and accuracy of 0.91. Because the data is skewed with approximately 80% of players not earning a win, the primary measure of the model was the ROC AUC. I was able to improve my model from a ROC AUC score of 0.78 to a score of 0.88 by simply trying 3 different models, adding domain features, and polynomial features.\n\n", "_____no_output_____" ], [ "The End!!", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown" ] ]
d062cdd7eeb895340e644ac4092a20863b415b5b
15,675
ipynb
Jupyter Notebook
Week2-Lesson1-MNIST-Fashion.ipynb
monahatami1/Coursera_Introduction-to-TensorFlow-for-Artificial-Intelligence-Machine-Learning-and-Deep-Lear
4ed47fac75d3ec2eea277ca64b1b99ba017f8a27
[ "MIT" ]
null
null
null
Week2-Lesson1-MNIST-Fashion.ipynb
monahatami1/Coursera_Introduction-to-TensorFlow-for-Artificial-Intelligence-Machine-Learning-and-Deep-Lear
4ed47fac75d3ec2eea277ca64b1b99ba017f8a27
[ "MIT" ]
null
null
null
Week2-Lesson1-MNIST-Fashion.ipynb
monahatami1/Coursera_Introduction-to-TensorFlow-for-Artificial-Intelligence-Machine-Learning-and-Deep-Lear
4ed47fac75d3ec2eea277ca64b1b99ba017f8a27
[ "MIT" ]
null
null
null
60.755814
7,088
0.69327
[ [ [ "import tensorflow as tf\nprint(tf.__version__)", "2.3.1\n" ], [ "mnist = tf.keras.datasets.fashion_mnist", "_____no_output_____" ], [ "(training_images, training_labels), (test_images, test_labels) = mnist.load_data()", "_____no_output_____" ], [ "import numpy as np\nnp.set_printoptions(linewidth=200)\nimport matplotlib.pyplot as plt\nplt.imshow(training_images[0])\nprint(training_labels[0])\nprint(training_images[0])", "9\n[[ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 13 73 0 0 1 4 0 0 0 0 1 1 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 3 0 36 136 127 62 54 0 0 0 1 3 4 0 0 3]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 6 0 102 204 176 134 144 123 23 0 0 0 0 12 10 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 155 236 207 178 107 156 161 109 64 23 77 130 72 15]\n [ 0 0 0 0 0 0 0 0 0 0 0 1 0 69 207 223 218 216 216 163 127 121 122 146 141 88 172 66]\n [ 0 0 0 0 0 0 0 0 0 1 1 1 0 200 232 232 233 229 223 223 215 213 164 127 123 196 229 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 183 225 216 223 228 235 227 224 222 224 221 223 245 173 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 193 228 218 213 198 180 212 210 211 213 223 220 243 202 0]\n [ 0 0 0 0 0 0 0 0 0 1 3 0 12 219 220 212 218 192 169 227 208 218 224 212 226 197 209 52]\n [ 0 0 0 0 0 0 0 0 0 0 6 0 99 244 222 220 218 203 198 221 215 213 222 220 245 119 167 56]\n [ 0 0 0 0 0 0 0 0 0 4 0 0 55 236 228 230 228 240 232 213 218 223 234 217 217 209 92 0]\n [ 0 0 1 4 6 7 2 0 0 0 0 0 237 226 217 223 222 219 222 221 216 223 229 215 218 255 77 0]\n [ 0 3 0 0 0 0 0 0 0 62 145 204 228 207 213 221 218 208 211 218 224 223 219 215 224 244 159 0]\n [ 0 0 0 0 18 44 82 107 189 228 220 222 217 226 200 205 211 230 224 234 176 188 250 248 233 238 215 0]\n [ 0 57 187 208 224 221 224 208 204 214 208 209 200 159 245 193 206 223 255 255 221 234 221 211 220 232 246 0]\n [ 3 202 228 224 221 211 211 214 205 205 205 220 240 80 150 
255 229 221 188 154 191 210 204 209 222 228 225 0]\n [ 98 233 198 210 222 229 229 234 249 220 194 215 217 241 65 73 106 117 168 219 221 215 217 223 223 224 229 29]\n [ 75 204 212 204 193 205 211 225 216 185 197 206 198 213 240 195 227 245 239 223 218 212 209 222 220 221 230 67]\n [ 48 203 183 194 213 197 185 190 194 192 202 214 219 221 220 236 225 216 199 206 186 181 177 172 181 205 206 115]\n [ 0 122 219 193 179 171 183 196 204 210 213 207 211 210 200 196 194 191 195 191 198 192 176 156 167 177 210 92]\n [ 0 0 74 189 212 191 175 172 175 181 185 188 189 188 193 198 204 209 210 210 211 188 188 194 192 216 170 0]\n [ 2 0 0 0 66 200 222 237 239 242 246 243 244 221 220 193 191 179 182 182 181 176 166 168 99 58 0 0]\n [ 0 0 0 0 0 0 0 40 61 44 72 41 35 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]]\n" ], [ "training_images = training_images / 255.0\ntest_images = test_images / 255.0\n", "_____no_output_____" ], [ "model = tf.keras.models.Sequential([tf.keras.layers.Flatten(), \n tf.keras.layers.Dense(128, activation=tf.nn.relu), \n tf.keras.layers.Dense(10, activation=tf.nn.softmax)])", "_____no_output_____" ], [ "model.compile(optimizer = tf.optimizers.Adam(),\n loss = 'sparse_categorical_crossentropy',\n metrics=['accuracy'])\n\nmodel.fit(training_images, training_labels, epochs=5)", "Epoch 1/5\n1875/1875 [==============================] - 2s 870us/step - loss: 1.1072 - accuracy: 0.6507\nEpoch 2/5\n1875/1875 [==============================] - 2s 835us/step - loss: 0.6459 - accuracy: 0.7674\nEpoch 3/5\n1875/1875 [==============================] - 2s 807us/step - loss: 0.5682 - accuracy: 0.7962\nEpoch 4/5\n1875/1875 [==============================] - 1s 796us/step - loss: 0.5250 - accuracy: 0.8135\nEpoch 5/5\n1875/1875 [==============================] - 2s 805us/step - loss: 0.4971 - accuracy: 0.8244\n" ], [ "model.evaluate(test_images, test_labels)", "313/313 
[==============================] - 0s 704us/step - loss: 95.0182 - accuracy: 0.6898\n" ], [ "classifications = model.predict(test_images)\n\nprint(classifications[0])", "[6.8263911e-13 1.7325267e-12 2.5193808e-18 1.0686662e-12 9.9983463e-18 1.1335950e-01 2.2505068e-18 1.0656738e-01 2.8287264e-12 7.8007311e-01]\n" ], [ "len(set(training_labels))", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d062d93ee6753a9376227ea46c29417c53a92f09
456,549
ipynb
Jupyter Notebook
monte-carlo/Monte_Carlo.ipynb
jbdekker/deep-reinforcement-learning
faea2012a9b96ec013c5fbf83ef40316b446fe2e
[ "MIT" ]
null
null
null
monte-carlo/Monte_Carlo.ipynb
jbdekker/deep-reinforcement-learning
faea2012a9b96ec013c5fbf83ef40316b446fe2e
[ "MIT" ]
null
null
null
monte-carlo/Monte_Carlo.ipynb
jbdekker/deep-reinforcement-learning
faea2012a9b96ec013c5fbf83ef40316b446fe2e
[ "MIT" ]
null
null
null
842.341328
216,412
0.947911
[ [ [ "# Monte Carlo Methods\n\nIn this notebook, you will write your own implementations of many Monte Carlo (MC) algorithms. \n\nWhile we have provided some starter code, you are welcome to erase these hints and write your code from scratch.\n\n### Part 0: Explore BlackjackEnv\n\nWe begin by importing the necessary packages.", "_____no_output_____" ] ], [ [ "import sys\nimport gym\nimport numpy as np\nfrom collections import defaultdict\n\nfrom plot_utils import plot_blackjack_values, plot_policy", "_____no_output_____" ] ], [ [ "Use the code cell below to create an instance of the [Blackjack](https://github.com/openai/gym/blob/master/gym/envs/toy_text/blackjack.py) environment.", "_____no_output_____" ] ], [ [ "env = gym.make('Blackjack-v0')", "_____no_output_____" ] ], [ [ "Each state is a 3-tuple of:\n- the player's current sum $\\in \\{0, 1, \\ldots, 31\\}$,\n- the dealer's face up card $\\in \\{1, \\ldots, 10\\}$, and\n- whether or not the player has a usable ace (`no` $=0$, `yes` $=1$).\n\nThe agent has two potential actions:\n\n```\n STICK = 0\n HIT = 1\n```\nVerify this by running the code cell below.", "_____no_output_____" ] ], [ [ "print(f\"Observation space: \\t{env.observation_space}\")\nprint(f\"Action space: \\t\\t{env.action_space}\")", "Observation space: \tTuple(Discrete(32), Discrete(11), Discrete(2))\nAction space: \t\tDiscrete(2)\n" ] ], [ [ "Execute the code cell below to play Blackjack with a random policy. \n\n(_The code currently plays Blackjack three times - feel free to change this number, or to run the cell multiple times. The cell is designed for you to get some experience with the output that is returned as the agent interacts with the environment._)", "_____no_output_____" ] ], [ [ "for i_episode in range(3):\n state = env.reset()\n while True:\n print(state)\n action = env.action_space.sample()\n state, reward, done, info = env.step(action)\n if done:\n print('End game! 
Reward: ', reward)\n print('You won :)\\n') if reward > 0 else print('You lost :(\\n')\n break", "(19, 10, False)\nEnd game! Reward: 1.0\nYou won :)\n\n(14, 6, False)\n(15, 6, False)\nEnd game! Reward: 1.0\nYou won :)\n\n(16, 3, False)\nEnd game! Reward: 1.0\nYou won :)\n\n" ] ], [ [ "### Part 1: MC Prediction\n\nIn this section, you will write your own implementation of MC prediction (for estimating the action-value function). \n\nWe will begin by investigating a policy where the player _almost_ always sticks if the sum of her cards exceeds 18. In particular, she selects action `STICK` with 80% probability if the sum is greater than 18; and, if the sum is 18 or below, she selects action `HIT` with 80% probability. The function `generate_episode_from_limit_stochastic` samples an episode using this policy. \n\nThe function accepts as **input**:\n- `bj_env`: This is an instance of OpenAI Gym's Blackjack environment.\n\nIt returns as **output**:\n- `episode`: This is a list of (state, action, reward) tuples (of tuples) and corresponds to $(S_0, A_0, R_1, \\ldots, S_{T-1}, A_{T-1}, R_{T})$, where $T$ is the final time step. In particular, `episode[i]` returns $(S_i, A_i, R_{i+1})$, and `episode[i][0]`, `episode[i][1]`, and `episode[i][2]` return $S_i$, $A_i$, and $R_{i+1}$, respectively.", "_____no_output_____" ] ], [ [ "def generate_episode_from_limit_stochastic(bj_env):\n episode = []\n state = bj_env.reset()\n \n while True:\n probs = [0.8, 0.2] if state[0] > 18 else [0.2, 0.8]\n action = np.random.choice(np.arange(2), p=probs)\n \n next_state, reward, done, info = bj_env.step(action)\n \n episode.append((state, action, reward))\n \n state = next_state\n \n if done:\n break\n \n return episode", "_____no_output_____" ] ], [ [ "Execute the code cell below to play Blackjack with the policy. \n\n(*The code currently plays Blackjack three times - feel free to change this number, or to run the cell multiple times. 
The cell is designed for you to gain some familiarity with the output of the `generate_episode_from_limit_stochastic` function.*)", "_____no_output_____" ] ], [ [ "for i in range(5):\n print(generate_episode_from_limit_stochastic(env))", "[((18, 2, True), 0, 1.0)]\n[((16, 5, False), 1, 0.0), ((18, 5, False), 1, -1.0)]\n[((13, 5, False), 1, 0.0), ((17, 5, False), 1, -1.0)]\n[((14, 4, False), 1, 0.0), ((17, 4, False), 1, -1.0)]\n[((20, 10, False), 0, -1.0)]\n" ] ], [ [ "Now, you are ready to write your own implementation of MC prediction. Feel free to implement either first-visit or every-visit MC prediction; in the case of the Blackjack environment, the techniques are equivalent.\n\nYour algorithm has three arguments:\n- `env`: This is an instance of an OpenAI Gym environment.\n- `num_episodes`: This is the number of episodes that are generated through agent-environment interaction.\n- `generate_episode`: This is a function that returns an episode of interaction.\n- `gamma`: This is the discount rate. 
It must be a value between 0 and 1, inclusive (default value: `1`).\n\nThe algorithm returns as output:\n- `Q`: This is a dictionary (of one-dimensional arrays) where `Q[s][a]` is the estimated action value corresponding to state `s` and action `a`.", "_____no_output_____" ] ], [ [ "def mc_prediction_q(env, num_episodes, generate_episode, gamma=1.0):\n # initialize empty dictionaries of arrays\n returns_sum = defaultdict(lambda: np.zeros(env.action_space.n))\n N = defaultdict(lambda: np.zeros(env.action_space.n))\n Q = defaultdict(lambda: np.zeros(env.action_space.n))\n \n # loop over episodes\n for i_episode in range(1, num_episodes+1):\n # monitor progress\n if i_episode % 1000 == 0:\n print(\"\\rEpisode {}/{}.\".format(i_episode, num_episodes), end=\"\")\n sys.stdout.flush()\n \n episode = generate_episode(env)\n \n n = len(episode)\n states, actions, rewards = zip(*episode)\n discounts = np.array([gamma**i for i in range(n+1)])\n \n for i, state in enumerate(states):\n returns_sum[state][actions[i]] += sum(rewards[i:] * discounts[:-(i+1)])\n N[state][actions[i]] += 1\n \n # compute Q table\n for state in returns_sum.keys():\n for action in range(env.action_space.n):\n Q[state][action] = returns_sum[state][action] / N[state][action]\n \n return Q, returns_sum, N", "_____no_output_____" ] ], [ [ "Use the cell below to obtain the action-value function estimate $Q$. 
We have also plotted the corresponding state-value function.\n\nTo check the accuracy of your implementation, compare the plot below to the corresponding plot in the solutions notebook **Monte_Carlo_Solution.ipynb**.", "_____no_output_____" ] ], [ [ "# obtain the action-value function\nQ, R, N = mc_prediction_q(env, 500000, generate_episode_from_limit_stochastic)\n\n# obtain the corresponding state-value function\nV_to_plot = dict((k,(k[0]>18)*(np.dot([0.8, 0.2],v)) + (k[0]<=18)*(np.dot([0.2, 0.8],v))) \\\n for k, v in Q.items())\n\n# plot the state-value function\nplot_blackjack_values(V_to_plot)", "Episode 500000/500000." ] ], [ [ "### Part 2: MC Control\n\nIn this section, you will write your own implementation of constant-$\\alpha$ MC control. \n\nYour algorithm has four arguments:\n- `env`: This is an instance of an OpenAI Gym environment.\n- `num_episodes`: This is the number of episodes that are generated through agent-environment interaction.\n- `alpha`: This is the step-size parameter for the update step.\n- `gamma`: This is the discount rate. 
It must be a value between 0 and 1, inclusive (default value: `1`).\n\nThe algorithm returns as output:\n- `Q`: This is a dictionary (of one-dimensional arrays) where `Q[s][a]` is the estimated action value corresponding to state `s` and action `a`.\n- `policy`: This is a dictionary where `policy[s]` returns the action that the agent chooses after observing state `s`.\n\n(_Feel free to define additional functions to help you to organize your code._)", "_____no_output_____" ] ], [ [ "def generate_episode_from_Q(env, Q, epsilon, n):\n \"\"\" generates an episode following the epsilon-greedy policy\"\"\"\n episode = []\n state = env.reset()\n \n while True:\n if state in Q:\n action = np.random.choice(np.arange(n), p=get_props(Q[state], epsilon, n))\n else:\n action = env.action_space.sample()\n \n next_state, reward, done, _ = env.step(action)\n episode.append((state, action, reward))\n \n state = next_state\n \n if done:\n break\n \n return episode", "_____no_output_____" ], [ "def get_props(Q_s, epsilon, n):\n policy_s = np.ones(n) * epsilon / n\n best_a = np.argmax(Q_s)\n policy_s[best_a] = 1 - epsilon + (epsilon / n)\n \n return policy_s", "_____no_output_____" ], [ "def update_Q(episode, Q, alpha, gamma):\n n = len(episode)\n \n states, actions, rewards = zip(*episode)\n discounts = np.array([gamma**i for i in range(n+1)])\n \n for i, state in enumerate(states):\n R = sum(rewards[i:] * discounts[:-(1+i)])\n Q[state][actions[i]] = Q[state][actions[i]] + alpha * (R - Q[state][actions[i]])\n \n return Q", "_____no_output_____" ], [ "def mc_control(env, num_episodes, alpha, gamma=1.0, eps_start=1.0, eps_decay=.99999, eps_min=0.05):\n nA = env.action_space.n\n # initialize empty dictionary of arrays\n Q = defaultdict(lambda: np.zeros(nA))\n \n epsilon = eps_start\n # loop over episodes\n for i_episode in range(1, num_episodes+1):\n # monitor progress\n if i_episode % 1000 == 0:\n print(\"\\rEpisode {}/{}.\".format(i_episode, num_episodes), end=\"\")\n 
sys.stdout.flush()\n \n epsilon = max(eps_min, epsilon * eps_decay)\n episode = generate_episode_from_Q(env, Q, epsilon, nA)\n \n Q = update_Q(episode, Q, alpha, gamma)\n \n policy = dict((s, np.argmax(v)) for s, v in Q.items())\n \n return policy, Q", "_____no_output_____" ] ], [ [ "Use the cell below to obtain the estimated optimal policy and action-value function. Note that you should fill in your own values for the `num_episodes` and `alpha` parameters.", "_____no_output_____" ] ], [ [ "# obtain the estimated optimal policy and action-value function\npolicy, Q = mc_control(env, 500000, 0.02)", "Episode 500000/500000." ] ], [ [ "Next, we plot the corresponding state-value function.", "_____no_output_____" ] ], [ [ "# obtain the corresponding state-value function\nV = dict((k,np.max(v)) for k, v in Q.items())\n\n# plot the state-value function\nplot_blackjack_values(V)", "_____no_output_____" ] ], [ [ "Finally, we visualize the policy that is estimated to be optimal.", "_____no_output_____" ] ], [ [ "# plot the policy\nplot_policy(policy)", "_____no_output_____" ] ], [ [ "The **true** optimal policy $\\pi_*$ can be found in Figure 5.2 of the [textbook](http://go.udacity.com/rl-textbook) (and appears below). Compare your final estimate to the optimal policy - how close are you able to get? If you are not happy with the performance of your algorithm, take the time to tweak the decay rate of $\\epsilon$, change the value of $\\alpha$, and/or run the algorithm for more episodes to attain better results.\n\n![True Optimal Policy](images/optimal.png)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d062e3d741909a888a9e790727794972375a0f60
6,951
ipynb
Jupyter Notebook
highPerformanceComputing/Project2-KNNClassifier/knnCountries.ipynb
naokishami/Classwork
ac59d640f15e88294804fdb518b6c84b10e0d2bd
[ "MIT" ]
null
null
null
highPerformanceComputing/Project2-KNNClassifier/knnCountries.ipynb
naokishami/Classwork
ac59d640f15e88294804fdb518b6c84b10e0d2bd
[ "MIT" ]
null
null
null
highPerformanceComputing/Project2-KNNClassifier/knnCountries.ipynb
naokishami/Classwork
ac59d640f15e88294804fdb518b6c84b10e0d2bd
[ "MIT" ]
null
null
null
24.736655
68
0.339951
[ [ [ "import pandas as pd\n\nmatrix = pd.read_csv(\"./data/matrixCountries.csv\")\nmatrix", "_____no_output_____" ], [ "us = matrix.iloc[:, 2]\nrus = matrix.iloc[:, 3]\nquery = matrix.iloc[:, 4]", "_____no_output_____" ], [ "def magnitude(vec):\n total = 0\n for item in vec:\n total += item**2\n total /= len(vec)\n return total", "_____no_output_____" ], [ "us_cos = us @ query / magnitude(us) / magnitude(query)\nus_cos", "_____no_output_____" ], [ "rus_cos = rus @ query / (magnitude(rus) * magnitude(query))\nrus_cos", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code" ] ]
d062f3c869548b0ed612ff7dc9ec73685185bd92
64,018
ipynb
Jupyter Notebook
notebooks/1_0_EDA_BASE_A.ipynb
teoria/PD_datascience
e679e942b70be67f0f33cad6db11de3bc4cd9f1c
[ "MIT" ]
null
null
null
notebooks/1_0_EDA_BASE_A.ipynb
teoria/PD_datascience
e679e942b70be67f0f33cad6db11de3bc4cd9f1c
[ "MIT" ]
null
null
null
notebooks/1_0_EDA_BASE_A.ipynb
teoria/PD_datascience
e679e942b70be67f0f33cad6db11de3bc4cd9f1c
[ "MIT" ]
null
null
null
101.134281
9,175
0.801728
[ [ [ "import pandas as pd\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport seaborn as sns\n#%matplotlib inline\n\nfrom IPython.core.pylabtools import figsize\nfigsize(8, 6)\nsns.set()", "_____no_output_____" ] ], [ [ "## Loading the premium users data", "_____no_output_____" ] ], [ [ "df = pd.read_csv(\"../data/processed/premium_students.csv\",parse_dates=[1,2],index_col=[0])\nprint(df.shape)\ndf.head()", "(6260, 2)\n" ] ], [ [ "---\n### New auxiliary columns", "_____no_output_____" ] ], [ [ "df['diffDate'] = (df.SubscriptionDate - df.RegisteredDate)\ndf['diffDays'] = [ item.days for item in df['diffDate']]\ndf['register_time'] = df.RegisteredDate.map( lambda x : int(x.strftime(\"%H\")) )\ndf['register_time_AM_PM'] = df.register_time.map( lambda x : 1 if x>=12 else 0)\ndf['register_num_week'] = df.RegisteredDate.map( lambda x : int(x.strftime(\"%V\")) )\ndf['register_week_day'] = df.RegisteredDate.map( lambda x : int(x.weekday()) )\ndf['register_month'] = df.RegisteredDate.map( lambda x : int(x.strftime('%m')) )\ndf['subscription_time'] = df.SubscriptionDate.map( lambda x : int(x.strftime(\"%H\") ))\ndf['subscription_time_AM_PM'] = df.subscription_time.map( lambda x : 1 if x>=12 else 0)\ndf['subscription_num_week'] = df.SubscriptionDate.map( lambda x : int(x.strftime(\"%V\")) )\ndf['subscription_week_day'] = df.SubscriptionDate.map( lambda x : int(x.weekday()) )\ndf['subscription_month'] = df.SubscriptionDate.map( lambda x : int(x.strftime('%m')) )\ndf.tail()", "_____no_output_____" ] ], [ [ "---\n### Checking the distributions", "_____no_output_____" ] ], [ [ "df.register_time.hist()", "_____no_output_____" ], [ "df.subscription_time.hist()", "_____no_output_____" ], [ "df.register_time_AM_PM.value_counts()", "_____no_output_____" ], [ "df.subscription_time_AM_PM.value_counts()", "_____no_output_____" ], [ "df.subscription_week_day.value_counts()", "_____no_output_____" ], [ "df.diffDays.hist()", "_____no_output_____" ], [ 
"df.diffDays.quantile([.25,.5,.75,.95])", "_____no_output_____" ] ], [ [ "Splitting the data into two periods.", "_____no_output_____" ] ], [ [ "lt_50 = df.loc[(df.diffDays <50) & (df.diffDays >3)]\nlt_50.diffDays.hist()\nlt_50.diffDays.value_counts()", "_____no_output_____" ], [ "lt_50.diffDays.quantile([.25,.5,.75,.95])\n", "_____no_output_____" ], [ "range_0_3 = df.loc[(df.diffDays < 3)]\nrange_3_18 = df.loc[(df.diffDays >= 3)&(df.diffDays < 18)]\nrange_6_11 = df.loc[(df.diffDays >= 6) & (df.diffDays < 11)]\nrange_11_18 = df.loc[(df.diffDays >= 11) & (df.diffDays < 18)]\nrange_18_32 = df.loc[(df.diffDays >= 18 )& (df.diffDays <= 32)]\nrange_32 = df.loc[(df.diffDays >=32)]", "_____no_output_____" ], [ "total_subs = df.shape[0]\n(\nround(range_0_3.shape[0] / total_subs,2),\nround(range_3_18.shape[0] / total_subs,2),\nround(range_18_32.shape[0] / total_subs,2),\nround(range_32.shape[0] / total_subs,2)\n )", "_____no_output_____" ], [ "gte_30 = df.loc[df.diffDays >=32]\ngte_30.diffDays.hist()\ngte_30.diffDays.value_counts()\ngte_30.shape", "_____no_output_____" ], [ "gte_30.diffDays.quantile([.25,.5,.75,.95])", "_____no_output_____" ], [ "range_32_140 = df.loc[(df.diffDays > 32)&(df.diffDays <=140)]\nrange_140_168 = df.loc[(df.diffDays > 140)&(df.diffDays <=168)]\nrange_168_188 = df.loc[(df.diffDays > 168)&(df.diffDays <=188)]\nrange_188 = df.loc[(df.diffDays > 188)]\n\ntotal_subs_gte_32 = gte_30.shape[0]\n(\nround(range_32_140.shape[0] / total_subs,2),\nround(range_140_168.shape[0] / total_subs,2),\nround(range_168_188.shape[0] / total_subs,2),\nround(range_188.shape[0] / total_subs,2)\n )", "_____no_output_____" ], [ "(\nround(range_32_140.shape[0] / total_subs_gte_32,2),\nround(range_140_168.shape[0] / total_subs_gte_32,2),\nround(range_168_188.shape[0] / total_subs_gte_32,2),\nround(range_188.shape[0] / total_subs_gte_32,2)\n )\n", "_____no_output_____" ] ], [ [ "----\n## Question 1:\nAmong the users registered in Nov/2017 who subscribed to the Premium Plan, what is\nthe probability of a user becoming Premium after registration, broken down into ranges of days? The choice\nof the ranges is up to you, keeping in mind the insights we can draw for the\nbusiness.", "_____no_output_____" ], [ "- From 0 to 3 days -> 53%\n- From 3 to 18 days -> 12%\n- From 18 to 32 days -> 3%\n- More than 32 days -> 33%\n\n\nLooking at the subscriptions made after the first month (33%)\n\n* From 32 to 140 -> 8%\n* From 140 to 168 -> 8%\n* From 168 to 188 -> 8%\n* From 188 to 216 -> 8%", "_____no_output_____" ], [ "A little over half of the conversions happen within the first 3 days.\nThe conversion rate reaches 65% up to 18 days after registration.\nAfter 100 days there is another relevant moment, which accounts for 33%.\nThis window possibly coincides with the institutions' exam calendars.\n\nInsights:\n* Most conversions happen in the afternoon\n* Most conversions happen at the beginning of the week (ads on Sundays)\n* Target geolocated Instagram ads (at institutions) in the periods leading up to the exam calendar.\n* Try to convert active users 100 days after registration\n* Try to convert users based on the institution's exam calendar", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown", "markdown" ] ]
d062f5c370018e72626a3bbe53882379df9d5c52
4,946
ipynb
Jupyter Notebook
Numpy/Numpy Operations.ipynb
aaavinash85/100-Days-of-ML-
d055d718f7972e3a4469279b9112867a42cf652f
[ "Apache-2.0" ]
3
2021-01-15T14:59:57.000Z
2021-07-01T07:32:19.000Z
Numpy/Numpy Operations.ipynb
aaavinash85/100-Days-of-ML-
d055d718f7972e3a4469279b9112867a42cf652f
[ "Apache-2.0" ]
null
null
null
Numpy/Numpy Operations.ipynb
aaavinash85/100-Days-of-ML-
d055d718f7972e3a4469279b9112867a42cf652f
[ "Apache-2.0" ]
1
2021-07-01T07:32:23.000Z
2021-07-01T07:32:23.000Z
19.170543
115
0.45552
[ [ [ "# NumPy Operations", "_____no_output_____" ], [ "## Arithmetic\n\nYou can easily perform array with array arithmetic, or scalar with array arithmetic. Let's see some examples:", "_____no_output_____" ] ], [ [ "import numpy as np\narr = np.arange(0,10)", "_____no_output_____" ], [ "arr + arr", "_____no_output_____" ], [ "arr * arr", "_____no_output_____" ], [ "arr - arr", "_____no_output_____" ], [ "arr**3", "_____no_output_____" ] ], [ [ "## Universal Array Functions\n\n", "_____no_output_____" ] ], [ [ "# Taking square roots\nnp.sqrt(arr)", "_____no_output_____" ], [ "# Calculating exponential (e^)\nnp.exp(arr)", "_____no_output_____" ], [ "np.max(arr) # same as arr.max()", "_____no_output_____" ], [ "np.sin(arr)", "_____no_output_____" ], [ "np.log(arr)", "<ipython-input-3-a67b4ae04e95>:1: RuntimeWarning: divide by zero encountered in log\n  np.log(arr)\n" ] ] ]
[ "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d062f667bf2ab6a5ad7d20d6bbfe1c67d779f452
24,819
ipynb
Jupyter Notebook
docker_galaxy/data/brassica_data/brapa_nb/1-bashNB_bra.ipynb
wilkinsonlab/epigenomics_pipeline
ed4a7fd97b798110e88364b101d020b2baebc298
[ "MIT" ]
null
null
null
docker_galaxy/data/brassica_data/brapa_nb/1-bashNB_bra.ipynb
wilkinsonlab/epigenomics_pipeline
ed4a7fd97b798110e88364b101d020b2baebc298
[ "MIT" ]
null
null
null
docker_galaxy/data/brassica_data/brapa_nb/1-bashNB_bra.ipynb
wilkinsonlab/epigenomics_pipeline
ed4a7fd97b798110e88364b101d020b2baebc298
[ "MIT" ]
1
2019-10-17T10:50:38.000Z
2019-10-17T10:50:38.000Z
38.124424
943
0.462065
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d063024d84566109dd7dd18bc446b9f87d5c39bb
3,619
ipynb
Jupyter Notebook
data/Untitled.ipynb
DSqiansun/CloudComparer
f5dfc6bda3ccb1d80421c241931d19f069ff2475
[ "MIT" ]
null
null
null
data/Untitled.ipynb
DSqiansun/CloudComparer
f5dfc6bda3ccb1d80421c241931d19f069ff2475
[ "MIT" ]
null
null
null
data/Untitled.ipynb
DSqiansun/CloudComparer
f5dfc6bda3ccb1d80421c241931d19f069ff2475
[ "MIT" ]
null
null
null
26.035971
113
0.494059
[ [ [ "def flatten_json(nested_json, exclude=['']):\n \"\"\"Flatten json object with nested keys into a single level.\n Args:\n nested_json: A nested json object.\n exclude: Keys to exclude from output.\n Returns:\n The flattened json object if successful, None otherwise.\n \"\"\"\n out = {}\n\n def flatten(x, name='', exclude=exclude):\n if type(x) is dict:\n for a in x:\n if a not in exclude: flatten(x[a], name + a + '_')\n elif type(x) is list:\n i = 0\n for a in x:\n flatten(a, name + str(i) + '_')\n i += 1\n else:\n out[name[:-1]] = x\n\n flatten(nested_json)\n return out", "_____no_output_____" ], [ "\n", "_____no_output_____" ], [ "from flatten_json import flatten\nimport json\n\nwith open('Services.json') as f:\n data = json.load(f)\ndic_flattened = (flatten(d) for d in data)\ndf = pd.DataFrame(dic_flattened)\nclos = [col for col in list(df.columns) if 'Propertie' not in col]\ndf = df[clos]#.drop_duplicates().to_csv('cloud_service.csv', index=False)", "_____no_output_____" ], [ "def rchop(s, suffix):\n if suffix and s.endswith(suffix):\n s = s[:-len(suffix)]\n if suffix and s.endswith(suffix):\n return rchop(s, suffix)\n return s\n\ndef concate(df, cloud, type_):\n col_ = [col for col in list(df.columns) if cloud in col and type_ in col]\n return df[col_].fillna('').astype(str).agg('<br/>'.join, axis=1).apply(lambda x: rchop(x, '<br/>') )\n\n\nclouds = ['aws', 'azure', 'google', 'ibm', 'alibaba', 'oracle']\ntype_s = ['name', 'ref', 'icon']\n\nfor cloud in clouds:\n for type_ in type_s:\n df[cloud +'_'+ type_] = concate(df, cloud, type_)\n", "_____no_output_____" ], [ "df.drop_duplicates().to_csv('cloud_service.csv', index=False)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code" ] ]
d06303ee27ab4398b14cd8cdbe43733805679c26
11,587
ipynb
Jupyter Notebook
Chapters/06.MinimumSpanningTrees/Chapter6.ipynb
MichielStock/SelectedTopicsOptimization
20f6b37566d23cdde0ac6b765ffcc5ed72a11172
[ "MIT" ]
22
2017-03-21T14:01:10.000Z
2022-03-02T18:51:40.000Z
Chapters/06.MinimumSpanningTrees/Chapter6.ipynb
MichielStock/SelectedTopicsOptimization
20f6b37566d23cdde0ac6b765ffcc5ed72a11172
[ "MIT" ]
2
2018-03-22T09:54:01.000Z
2018-05-30T16:16:53.000Z
Chapters/06.MinimumSpanningTrees/Chapter6.ipynb
MichielStock/SelectedTopicsOptimization
20f6b37566d23cdde0ac6b765ffcc5ed72a11172
[ "MIT" ]
18
2018-01-21T15:23:51.000Z
2022-02-05T20:12:03.000Z
26.759815
493
0.543454
[ [ [ "# Minimum spanning trees\n\n*Selected Topics in Mathematical Optimization*\n\n**Michiel Stock** ([email]([email protected]))\n\n![](Figures/logo.png)", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt\n%matplotlib inline\nfrom minimumspanningtrees import red, green, blue, orange, yellow", "_____no_output_____" ] ], [ [ "## Graphs in Python\n\nConsider the following example graph:\n\n![A small graph to show how to implement graphs in Python.](Figures/graph.png)", "_____no_output_____" ], [ "This graph can be represented using an *adjacency list*. We do this using a `dict`. Every vertex is a key with the adjacent vertices given as a `set` containing tuples `(weight, neighbor)`. The weight is first because this makes it easy to compare the weights of two edges. Note that for every ingoing edge there is also an outgoing edge: this is an undirected graph.", "_____no_output_____" ] ], [ [ "graph = {\n    'A' : set([(2, 'B'), (3, 'D')]),\n    'B' : set([(2, 'A'), (1, 'C'), (2, 'E')]),\n    'C' : set([(1, 'B'), (2, 'D'), (1, 'E')]),\n    'D' : set([(2, 'C'), (3, 'A'), (3, 'E')]),\n    'E' : set([(2, 'B'), (1, 'C'), (3, 'D')])\n}", "_____no_output_____" ] ], [ [ "Sometimes we will use an *edge list*, i.e. a list of (weighted) edges. This is often a more compact way of storing a graph. The edge list is given below. 
Note that again every edge is double: an in- and an outgoing edge are included.", "_____no_output_____" ] ], [ [ "edges = [\n    (2, 'B', 'A'),\n    (3, 'D', 'A'),\n    (2, 'C', 'D'),\n    (3, 'A', 'D'),\n    (3, 'E', 'D'),\n    (2, 'B', 'E'),\n    (3, 'D', 'E'),\n    (1, 'C', 'E'),\n    (2, 'E', 'B'),\n    (2, 'A', 'B'),\n    (1, 'C', 'B'),\n    (1, 'E', 'C'),\n    (1, 'B', 'C'),\n    (2, 'D', 'C')]", "_____no_output_____" ] ], [ [ "We can easily turn one representation into the other (with a time complexity proportional to the number of edges) using the provided functions `edges_to_adj_list` and `adj_list_to_edges`.", "_____no_output_____" ] ], [ [ "from minimumspanningtrees import edges_to_adj_list, adj_list_to_edges", "_____no_output_____" ], [ "adj_list_to_edges(graph)", "_____no_output_____" ], [ "edges_to_adj_list(edges)", "_____no_output_____" ] ], [ [ "## Disjoint-set data structure\n\nImplementing an algorithm for finding the minimum spanning tree is fairly straightforward. The only bottleneck is that the algorithm requires a disjoint-set data structure to keep track of a set partitioned into a number of disjoint subsets.\n\nFor example, consider the following initial set of eight elements.\n\n![](Figures/disjointset1.png)\n\nWe decide to group elements A, B and C together in a subset and F and G in another subset.\n\n![](Figures/disjointset2.png)\n\nThe disjoint-set data structure supports the following operations:\n\n- **Find**: check which subset an element is in. It is typically used to check whether two objects are in the same subset;\n- **Union**: merges two subsets into a single subset.\n\nA Python implementation of a disjoint-set is available using a union-set forest. 
A simple example will make everything clear!", "_____no_output_____" ] ], [ [ "from union_set_forest import USF\n\nanimals = ['mouse', 'bat', 'robin', 'trout', 'seagull', 'hummingbird',\n           'salmon', 'goldfish', 'hippopotamus', 'whale', 'sparrow']\nunion_set_forest = USF(animals)\n\n# group mammals together\nunion_set_forest.union('mouse', 'bat')\nunion_set_forest.union('mouse', 'hippopotamus')\nunion_set_forest.union('whale', 'bat')\n\n# group birds together\nunion_set_forest.union('robin', 'seagull')\nunion_set_forest.union('seagull', 'sparrow')\nunion_set_forest.union('seagull', 'hummingbird')\nunion_set_forest.union('robin', 'hummingbird')\n\n# group fishes together\nunion_set_forest.union('goldfish', 'salmon')\nunion_set_forest.union('trout', 'salmon')", "_____no_output_____" ], [ "# mouse and whale in same subset?\nprint(union_set_forest.find('mouse') == union_set_forest.find('whale'))", "_____no_output_____" ], [ "# robin and salmon in the same subset?\nprint(union_set_forest.find('robin') == union_set_forest.find('salmon'))", "_____no_output_____" ] ], [ [ "## Heap queue\n\nA heap queue can be used to find the minimum of a changing list without having to re-sort the list after every update.", "_____no_output_____" ] ], [ [ "from heapq import heapify, heappop, heappush\n\nheap = [(5, 'A'), (3, 'B'), (2, 'C'), (7, 'D')]\n\nheapify(heap) # turn into a heap\n\nprint(heap)", "_____no_output_____" ], [ "# return the item with the lowest value while retaining the heap property\nprint(heappop(heap))", "_____no_output_____" ], [ "print(heap)", "_____no_output_____" ], [ "# add a new item and retain the heap property\nheappush(heap, (4, 'E'))\nprint(heap)", "_____no_output_____" ] ], [ [ "## Prim's algorithm\n\nPrim's algorithm starts with a single vertex and adds $|V|-1$ edges to it, always taking the next edge with minimal weight that connects a vertex on the MST to a vertex not yet in the MST.", "_____no_output_____" ] ], [ [ "from minimumspanningtrees import prim", "_____no_output_____" ] ], [ [ "def prim(vertices, 
edges, start):\n    \"\"\"\n    Prim's algorithm for finding a minimum spanning tree.\n\n    Inputs :\n        - vertices : a set of the vertices of the Graph\n        - edges : a list of weighted edges (e.g. (0.7, 'A', 'B') for an\n                edge from node A to node B with weight 0.7)\n        - start : a vertex to start with\n\n    Output:\n        - edges : a minimum spanning tree represented as a list of edges\n        - total_cost : total cost of the tree\n    \"\"\"\n    adj_list = edges_to_adj_list(edges) # easier using an adjacency list\n    \n    ... # to complete\n    return mst_edges, total_cost", "_____no_output_____" ] ], [ [ "## Kruskal's algorithm\n\n\nKruskal's algorithm is a very simple algorithm to find the minimum spanning tree. The main idea is to start with an initial 'forest' of the individual nodes of the graph. In each step of the algorithm we add an edge with the smallest possible value that connects two disjoint trees in the forest. This process is continued until we have a single tree, which is a minimum spanning tree, or until all edges are considered. In the latter case, the algorithm returns a minimum spanning forest.", "_____no_output_____" ] ], [ [ "from minimumspanningtrees import kruskal", "_____no_output_____" ], [ "def kruskal(vertices, edges):\n    \"\"\"\n    Kruskal's algorithm for finding a minimum spanning tree.\n\n    Inputs :\n        - vertices : a set of the vertices of the Graph\n        - edges : a list of weighted edges (e.g. (0.7, 'A', 'B') for an\n                edge from node A to node B with weight 0.7)\n\n    Output:\n        - edges : a minimum spanning tree represented as a list of edges\n        - total_cost : total cost of the tree\n    \"\"\"\n    ... 
# to complete\n return mst_edges, total_cost", "_____no_output_____" ] ], [ [ "from tickettoride import vertices, edges", "_____no_output_____" ] ], [ [ "print(vertices)", "_____no_output_____" ], [ "print(edges[:5])", "_____no_output_____" ], [ "# compute the minimum spanning tree of the ticket to ride data set\n...", "_____no_output_____" ] ], [ [ "## Clustering\n\nMinimum spanning trees on a distance graph can be used to cluster a data set.", "_____no_output_____" ] ], [ [ "# import features and distance\nfrom clustering import X, D", "_____no_output_____" ], [ "fig, ax = plt.subplots()\nax.scatter(X[:,0], X[:,1], color=green)", "_____no_output_____" ], [ "# cluster the data based on the distance", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "raw", "code", "markdown", "code", "raw", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "raw" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "raw" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ] ]
d0634585edff76f717a666c4474ee29bebee9bc6
4,661
ipynb
Jupyter Notebook
section_2/03_simple_bert.ipynb
derwind/bert_nlp
ff1279e276c85a789edc863e1c27bbc2ef86e1f0
[ "MIT" ]
null
null
null
section_2/03_simple_bert.ipynb
derwind/bert_nlp
ff1279e276c85a789edc863e1c27bbc2ef86e1f0
[ "MIT" ]
null
null
null
section_2/03_simple_bert.ipynb
derwind/bert_nlp
ff1279e276c85a789edc863e1c27bbc2ef86e1f0
[ "MIT" ]
null
null
null
4,661
4,661
0.680326
[ [ [ "<a href=\"https://colab.research.google.com/github/yukinaga/bert_nlp/blob/main/section_2/03_simple_bert.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "# A Simple BERT Implementation\nUsing a pretrained model, we predict part of a sentence and determine whether two sentences are consecutive.", "_____no_output_____" ], [ "## Installing the libraries\nInstall PyTorch-Transformers and the other required libraries.", "_____no_output_____" ] ], [ [ "!pip install folium==0.2.1\n!pip install urllib3==1.25.11\n!pip install transformers==4.13.0", "_____no_output_____" ] ], [ [ "## Predicting part of a sentence\nWe mask a word in a sentence and predict it with a BERT model.", "_____no_output_____" ] ], [ [ "import torch\nfrom transformers import BertForMaskedLM\nfrom transformers import BertTokenizer\n\n\ntext = \"[CLS] I played baseball with my friends at school yesterday [SEP]\"\ntokenizer = BertTokenizer.from_pretrained(\"bert-base-uncased\")\nwords = tokenizer.tokenize(text)\nprint(words)", "_____no_output_____" ] ], [ [ "Mask part of the sentence.", "_____no_output_____" ] ], [ [ "msk_idx = 3\nwords[msk_idx] = \"[MASK]\" # replace the word with [MASK]\nprint(words)", "_____no_output_____" ] ], [ [ "Convert the words to their corresponding indices.", "_____no_output_____" ] ], [ [ "word_ids = tokenizer.convert_tokens_to_ids(words) # convert words to indices\nword_tensor = torch.tensor([word_ids]) # convert to a tensor\nprint(word_tensor)", "_____no_output_____" ] ], [ [ "Make a prediction with the BERT model.", "_____no_output_____" ] ], [ [ "msk_model = BertForMaskedLM.from_pretrained(\"bert-base-uncased\")\nmsk_model.cuda() # for GPU\nmsk_model.eval()\n\nx = word_tensor.cuda() # for GPU\ny = msk_model(x) # prediction\nresult = y[0]\nprint(result.size()) # shape of the result\n\n_, max_ids = torch.topk(result[0][msk_idx], k=5) # the 5 largest values\nresult_words = tokenizer.convert_ids_to_tokens(max_ids.tolist()) # convert indices to words\nprint(result_words)", "_____no_output_____" ] ], [ [ "## Determining whether two sentences are consecutive\nUsing a BERT model, we determine whether two sentences follow one another. \nThe function `show_continuity` below judges the continuity of two sentences and displays the result.", "_____no_output_____" ] ], [ [ "from transformers import BertForNextSentencePrediction\n\ndef show_continuity(text, seg_ids):\n    words = tokenizer.tokenize(text)\n    word_ids = tokenizer.convert_tokens_to_ids(words) # convert words to indices\n    word_tensor = torch.tensor([word_ids]) # convert to a tensor\n\n    seg_tensor = torch.tensor([seg_ids])\n\n    nsp_model = BertForNextSentencePrediction.from_pretrained('bert-base-uncased')\n    nsp_model.cuda() # for GPU\n    nsp_model.eval()\n\n    x = word_tensor.cuda() # for GPU\n    s = seg_tensor.cuda() # for GPU\n\n    y = nsp_model(x, token_type_ids=s) # prediction\n    result = torch.softmax(y[0], dim=1)\n    print(result) # probabilities via softmax\n    print(str(result[0][0].item()*100) + \"% probability that the sentences are consecutive.\")", "_____no_output_____" ] ], [ [ "Give the `show_continuity` function two sentences that connect naturally.", "_____no_output_____" ] ], [ [ "text = \"[CLS] What is baseball ? [SEP] It is a game of hitting the ball with the bat [SEP]\"\nseg_ids = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1 ,1, 1] # 0: words of the first sentence, 1: words of the second sentence\nshow_continuity(text, seg_ids)", "_____no_output_____" ] ], [ [ "Give two sentences that do not connect naturally.", "_____no_output_____" ] ], [ [ "text = \"[CLS] What is baseball ? [SEP] This food is made with flour and milk [SEP]\"\nseg_ids = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 1] # 0: words of the first sentence, 1: words of the second sentence\nshow_continuity(text, seg_ids)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d06345a8a4ec272aabf44402ccdab8f1f0da4720
11,445
ipynb
Jupyter Notebook
_notebooks/2022-05-03-binary-search-or-bust.ipynb
boolean-pandit/non-faangable-tokens
14872620b21077be68ae64a2527643ed29e2260e
[ "Apache-2.0" ]
null
null
null
_notebooks/2022-05-03-binary-search-or-bust.ipynb
boolean-pandit/non-faangable-tokens
14872620b21077be68ae64a2527643ed29e2260e
[ "Apache-2.0" ]
null
null
null
_notebooks/2022-05-03-binary-search-or-bust.ipynb
boolean-pandit/non-faangable-tokens
14872620b21077be68ae64a2527643ed29e2260e
[ "Apache-2.0" ]
null
null
null
30.601604
506
0.562779
[ [ [ "# Binary Search or Bust\n> Binary search is useful for searching, but its implementation often leaves us searching for edge cases\n\n- toc: true \n- badges: true\n- comments: true\n- categories: [data structures & algorithms, coding interviews, searching]\n- image: images/binary_search_gif.gif", "_____no_output_____" ], [ "# Why should you care?\nBinary search is useful for searching through a set of values (which typically are sorted) efficiently. At each step, it reduces the search space by half, thereby running in $O(log(n))$ complexity. While it sounds simple enough to understand, it is deceptively tricky to implement and use in problems. Over the next few sections, let's take a look at binary search and it can be applied to some commonly encountered interview problems.", "_____no_output_____" ], [ "# A Recipe for Binary Searching\nHow does binary search reduce the search space by half? It leverages the fact that the input is sorted (_most of the time_) and compares the middle value of the search space at any step with the target value that we're searching for. If the middle value is smaller than the target, then we know that the target can only lie to its right, thus eliminating all the values to the left of the middle value and vice versa. So what information do we need to implement binary search?\n1. The left and right ends of the search space \n2. The target value we're searching for\n3. What to store at each step if any\n\nHere's a nice video which walks through the binary search algorithm:\n > youtube: https://youtu.be/P3YID7liBug\n", "_____no_output_____" ], [ "Next, let's look at an implementation of vanilla binary search. 
", "_____no_output_____" ] ], [ [ "#hide\nfrom typing import List, Dict, Tuple ", "_____no_output_____" ], [ "def binary_search(nums: List[int], target: int) -> int:\n \"\"\"Vanilla Binary Search.\n Given a sorted list of integers and a target value,\n find the index of the target value in the list.\n If not present, return -1.\n \"\"\"\n # Left and right boundaries of the search space\n left, right = 0, len(nums) - 1\n while left <= right:\n # Why not (left + right) // 2 ?\n # Hint: Doesn't matter for Python\n middle = left + (right - left) // 2\n\n # Found the target, return the index\n if nums[middle] == target:\n return middle \n # The middle value is less than the\n # target, so look to the right\n elif nums[middle] < target:\n left = middle + 1\n # The middle value is greater than the\n # target, so look to the left\n else:\n right = middle - 1\n return -1 # Target not found", "_____no_output_____" ] ], [ [ "Here're a few examples of running our binary search implementation on a list and target values", "_____no_output_____" ] ], [ [ "#hide_input\nnums = [1,4,9,54,100,123]\ntargets = [4, 100, 92]\n\nfor val in targets:\n print(f\"Result of searching for {val} in {nums} : \\\n {binary_search(nums, val)}\\n\")\n", "Result of searching for 4 in [1, 4, 9, 54, 100, 123] : 1\n\nResult of searching for 100 in [1, 4, 9, 54, 100, 123] : 4\n\nResult of searching for 92 in [1, 4, 9, 54, 100, 123] : -1\n\n" ] ], [ [ "> Tip: Using the approach middle = left + (right - left) // 2 helps avoid overflow. While this isn&#39;t a concern in Python, it becomes a tricky issue to debug in other programming languages such as C++. For more on overflow, check out this [article](https://ai.googleblog.com/2006/06/extra-extra-read-all-about-it-nearly.html).", "_____no_output_____" ], [ "Before we look at some problems that can be solved using binary search, let's run a quick comparison of linear search and binary search on some large input. 
", "_____no_output_____" ] ], [ [ "def linear_search(nums: List[int], target: int) -> int:\n \"\"\"Linear Search.\n Given a list of integers and a target value, return\n find the index of the target value in the list.\n If not present, return -1.\n \"\"\"\n for idx, elem in enumerate(nums):\n # Found the target value\n if elem == target:\n return idx \n return -1 # Target not found", "_____no_output_____" ], [ "#hide\nn = 1000000\nlarge_nums = range((1, n + 1))\ntarget = 99999", "_____no_output_____" ] ], [ [ "Let's see the time it takes linear search and binary search to find $99999$ in a sorted list of numbers from $[1, 1000000]$", "_____no_output_____" ], [ "- Linear Search", "_____no_output_____" ] ], [ [ "#hide_input\n%timeit linear_search(large_nums, target)", "5.19 ms ยฑ 26.3 ยตs per loop (mean ยฑ std. dev. of 7 runs, 100 loops each)\n" ] ], [ [ "- Binary Search", "_____no_output_____" ] ], [ [ "#hide_input\n%timeit binary_search(large_nums, target)", "6.05 ยตs ยฑ 46.9 ns per loop (mean ยฑ std. dev. of 7 runs, 100000 loops each)\n" ] ], [ [ "Hopefully, that drives the point home :wink:.", "_____no_output_____" ], [ "# Naรฏve Binary Search Problems\nHere's a list of problems that can be solved using vanilla binary search (or slightly modifying it). Anytime you see a problem statement which goes something like _\"Given a sorted list..\"_ or _\"Find the position of an element\"_, think of using binary search. You can also consider **sorting** the input in case it is an unordered collection of items to reduce it to a binary search problem. 
Note that this list is by no means exhaustive, but is a good starting point to practice binary search:\n- [Search Insert Position](https://leetcode.com/problems/search-insert-position/\n)\n- [Find the Square Root of x](https://leetcode.com/problems/sqrtx/)\n- [Find First and Last Position of Element in Sorted Array](https://leetcode.com/problems/find-first-and-last-position-of-element-in-sorted-array/)\n- [Search in a Rotated Sorted Array](https://leetcode.com/problems/search-in-rotated-sorted-array/)\n\nIn the problems above, we can either directly apply binary search or adapt it slightly to solve the problem. For example, take the square root problem. We know that the square root of a positive number $n$ has to lie between $[1, n / 2]$. This gives us the bounds for the search space. Applying binary search over this space allows us to find the a good approximation of the square root. See the implementation below for details:", "_____no_output_____" ] ], [ [ "def find_square_root(n: int) -> int:\n \"\"\"Integer square root.\n Given a positive integer, return\n its square root.\n \"\"\"\n left, right = 1, n // 2 + 1\n\n while left <= right:\n middle = left + (right - left) // 2\n if middle * middle == n:\n return middle # Found an exact match\n elif middle * middle < n:\n left = middle + 1 # Go right\n else:\n right = middle - 1 # Go left\n \n return right # This is the closest value to the actual square root", "_____no_output_____" ], [ "#hide_input\nnums = [1,4,8,33,100]\n\nfor val in nums:\n print(f\"Square root of {val} is: {find_square_root(val)}\\n\")", "Square root of 1 is: 1\n\nSquare root of 4 is: 2\n\nSquare root of 8 is: 2\n\nSquare root of 33 is: 5\n\nSquare root of 100 is: 10\n\n" ] ], [ [ "# To Be Continued\n- Applying binary search to unordered data\n- Problems where using binary search isn't obvious", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d0634fdb80f870ae48693947b54ac4b283f8f24f
414,536
ipynb
Jupyter Notebook
Lecture 30 - Assignment.ipynb
Elseidy83/Countery_data
225c757687c3d799e1d6c719f57494b933d92e0c
[ "MIT" ]
null
null
null
Lecture 30 - Assignment.ipynb
Elseidy83/Countery_data
225c757687c3d799e1d6c719f57494b933d92e0c
[ "MIT" ]
null
null
null
Lecture 30 - Assignment.ipynb
Elseidy83/Countery_data
225c757687c3d799e1d6c719f57494b933d92e0c
[ "MIT" ]
null
null
null
371.781166
253,892
0.913636
[ [ [ "import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom sklearn.cluster import AgglomerativeClustering\nimport seaborn as sns\n\nsns.set(rc={'figure.figsize': [7, 7]}, font_scale=1.2)", "_____no_output_____" ], [ "df = pd.read_csv('Country-data.csv')\ndf", "_____no_output_____" ], [ "ds = df.drop(['country'],axis=1)", "_____no_output_____" ], [ "sns.heatmap(ds.corr(), annot=True, fmt='.1f')", "_____no_output_____" ], [ "def get_sum(rw):\n return rw['child_mort']+ rw['exports']+rw['health']+rw['imports']+rw['income']+rw['inflation']+rw['life_expec']+rw['total_fer']+rw['gdpp']", "_____no_output_____" ], [ "dd = ds.corr().abs()", "_____no_output_____" ], [ "dd.apply(get_sum).sort_values(ascending=False)", "_____no_output_____" ], [ "ds = ds.drop(['inflation','imports','health'],axis=1)", "_____no_output_____" ], [ "ds", "_____no_output_____" ], [ "from sklearn.preprocessing import StandardScaler\nscaler = StandardScaler()\nx_scaled = scaler.fit_transform(ds)", "_____no_output_____" ], [ "x_scaled", "_____no_output_____" ], [ "sns.pairplot(ds)", "_____no_output_____" ], [ "plt.scatter(x_scaled[:, 0], x_scaled[:, 1])", "_____no_output_____" ], [ "import scipy.cluster.hierarchy as sch", "_____no_output_____" ], [ "dendrogram = sch.dendrogram(sch.linkage(ds, method='ward'))", "_____no_output_____" ], [ "model = AgglomerativeClustering(n_clusters=5)\nclusters = model.fit_predict(x_scaled)\nclusters", "_____no_output_____" ], [ "plt.scatter(x_scaled[:, 0], x_scaled[:,7], c=clusters, cmap='viridis')", "_____no_output_____" ], [ "df['Clusters'] = clusters\ndf", "_____no_output_____" ], [ "df.groupby('Clusters').describe().transpose()", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0635092cb676270c58b8bb3262e443ac5adbcfa
52,667
ipynb
Jupyter Notebook
06. VGGNet Architecture/Mini VGGNet.ipynb
ThinamXx/ComputerVision
732d251bc19bf9d9c1b497037c6acd3d640066b9
[ "MIT" ]
13
2021-11-21T12:06:40.000Z
2022-03-30T00:54:06.000Z
06. VGGNet Architecture/Mini VGGNet.ipynb
ThinamXx/ComputerVision
732d251bc19bf9d9c1b497037c6acd3d640066b9
[ "MIT" ]
null
null
null
06. VGGNet Architecture/Mini VGGNet.ipynb
ThinamXx/ComputerVision
732d251bc19bf9d9c1b497037c6acd3d640066b9
[ "MIT" ]
10
2021-11-20T23:40:15.000Z
2022-03-11T19:51:28.000Z
132.997475
31,266
0.760856
[ [ [ "**INITIALIZATION:**\n- I use these three lines of code on top of my each notebooks because it will help to prevent any problems while reloading the same project. And the third line of code helps to make visualization within the notebook.", "_____no_output_____" ] ], [ [ "#@ INITIALIZATION: \n%reload_ext autoreload\n%autoreload 2\n%matplotlib inline", "_____no_output_____" ] ], [ [ "**LIBRARIES AND DEPENDENCIES:**\n- I have downloaded all the libraries and dependencies required for the project in one particular cell.", "_____no_output_____" ] ], [ [ "#@ IMPORTING NECESSARY LIBRARIES AND DEPENDENCIES:\nfrom keras.models import Sequential\nfrom keras.layers import BatchNormalization\nfrom keras.layers.convolutional import Conv2D\nfrom keras.layers.convolutional import MaxPooling2D\nfrom keras.layers.core import Activation\nfrom keras.layers.core import Flatten\nfrom keras.layers.core import Dense, Dropout\nfrom keras import backend as K\nfrom tensorflow.keras.optimizers import SGD\nfrom tensorflow.keras.datasets import cifar10\nfrom keras.callbacks import LearningRateScheduler\n\nfrom sklearn.preprocessing import LabelBinarizer\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import classification_report\n\nimport matplotlib.pyplot as plt\nimport numpy as np", "_____no_output_____" ] ], [ [ "**VGG ARCHITECTURE:**\n- I will define the build method of Mini VGGNet architecture below. It requires four parameters: width of input image, height of input image, depth of image, number of class labels in the classification task. The Sequential class, the building block of sequential networks sequentially stack one layer on top of the other layer initialized below. Batch Normalization operates over the channels, so in order to apply BN, we need to know which axis to normalize over. ", "_____no_output_____" ] ], [ [ "#@ DEFINING VGGNET ARCHITECTURE:\nclass MiniVGGNet: # Defining VGG Network. 
\n @staticmethod\n def build(width, height, depth, classes): # Defining Build Method. \n model = Sequential() # Initializing Sequential Model.\n inputShape = (width, height, depth) # Initializing Input Shape. \n chanDim = -1 # Index of Channel Dimension.\n if K.image_data_format() == \"channels_first\":\n inputShape = (depth, width, height) # Initializing Input Shape. \n chanDim = 1 # Index of Channel Dimension. \n model.add(Conv2D(32, (3, 3), padding='same', \n input_shape=inputShape)) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(Conv2D(32, (3, 3), padding='same')) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(MaxPooling2D(pool_size=(2, 2))) # Adding Max Pooling Layer. \n model.add(Dropout(0.25)) # Adding Dropout Layer.\n model.add(Conv2D(64, (3, 3), padding=\"same\")) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(Conv2D(64, (3, 3), padding='same')) # Adding Convolutional Layer. \n model.add(Activation(\"relu\")) # Adding RELU Activation Function. \n model.add(BatchNormalization(axis=chanDim)) # Adding Batch Normalization Layer. \n model.add(MaxPooling2D(pool_size=(2, 2))) # Adding Max Pooling Layer. \n model.add(Dropout(0.25)) # Adding Dropout Layer. \n model.add(Flatten()) # Adding Flatten Layer. \n model.add(Dense(512)) # Adding FC Dense Layer. \n model.add(Activation(\"relu\")) # Adding Activation Layer. \n model.add(BatchNormalization()) # Adding Batch Normalization Layer. \n model.add(Dropout(0.5)) # Adding Dropout Layer. \n model.add(Dense(classes)) # Adding Dense Output Layer. 
\n model.add(Activation(\"softmax\")) # Adding Softmax Layer. \n return model", "_____no_output_____" ], [ "#@ CUSTOM LEARNING RATE SCHEDULER: \ndef step_decay(epoch): # Definig step decay function. \n initAlpha = 0.01 # Initializing initial LR.\n factor = 0.25 # Initializing drop factor. \n dropEvery = 5 # Initializing epochs to drop. \n alpha = initAlpha*(factor ** np.floor((1 + epoch) / dropEvery))\n return float(alpha)", "_____no_output_____" ] ], [ [ "**VGGNET ON CIFAR10**", "_____no_output_____" ] ], [ [ "#@ GETTING THE DATASET:\n((trainX, trainY), (testX, testY)) = cifar10.load_data() # Loading Dataset. \ntrainX = trainX.astype(\"float\") / 255.0 # Normalizing Dataset. \ntestX = testX.astype(\"float\") / 255.0 # Normalizing Dataset. \n\n#@ PREPARING THE DATASET:\nlb = LabelBinarizer() # Initializing LabelBinarizer. \ntrainY = lb.fit_transform(trainY) # Converting Labels to Vectors. \ntestY = lb.transform(testY) # Converting Labels to Vectors. \nlabelNames = [\"airplane\", \"automobile\", \"bird\", \"cat\", \"deer\", \n \"dog\", \"frog\", \"horse\", \"ship\", \"truck\"] # Initializing LabelNames.", "_____no_output_____" ], [ "#@ INITIALIZING OPTIMIZER AND MODEL: \ncallbacks = [LearningRateScheduler(step_decay)] # Initializing Callbacks. \nopt = SGD(0.01, nesterov=True, momentum=0.9) # Initializing SGD Optimizer. \nmodel = MiniVGGNet.build(width=32, height=32, depth=3, classes=10) # Initializing VGGNet Architecture. \nmodel.compile(loss=\"categorical_crossentropy\", optimizer=opt,\n metrics=[\"accuracy\"]) # Compiling VGGNet Model. 
\nH = model.fit(trainX, trainY, \n validation_data=(testX, testY), batch_size=64, \n epochs=40, verbose=1, callbacks=callbacks) # Training VGGNet Model.", "Epoch 1/40\n782/782 [==============================] - 29s 21ms/step - loss: 1.6339 - accuracy: 0.4555 - val_loss: 1.1509 - val_accuracy: 0.5970 - lr: 0.0100\nEpoch 2/40\n782/782 [==============================] - 16s 21ms/step - loss: 1.1813 - accuracy: 0.5932 - val_loss: 0.9222 - val_accuracy: 0.6733 - lr: 0.0100\nEpoch 3/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.9908 - accuracy: 0.6567 - val_loss: 0.8341 - val_accuracy: 0.7159 - lr: 0.0100\nEpoch 4/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.8854 - accuracy: 0.6945 - val_loss: 0.8282 - val_accuracy: 0.7167 - lr: 0.0100\nEpoch 5/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.7380 - accuracy: 0.7421 - val_loss: 0.6881 - val_accuracy: 0.7598 - lr: 0.0025\nEpoch 6/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.6845 - accuracy: 0.7586 - val_loss: 0.6600 - val_accuracy: 0.7711 - lr: 0.0025\nEpoch 7/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.6628 - accuracy: 0.7683 - val_loss: 0.6435 - val_accuracy: 0.7744 - lr: 0.0025\nEpoch 8/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.6391 - accuracy: 0.7755 - val_loss: 0.6362 - val_accuracy: 0.7784 - lr: 0.0025\nEpoch 9/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.6204 - accuracy: 0.7830 - val_loss: 0.6499 - val_accuracy: 0.7744 - lr: 0.0025\nEpoch 10/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5912 - accuracy: 0.7909 - val_loss: 0.6161 - val_accuracy: 0.7856 - lr: 6.2500e-04\nEpoch 11/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5812 - accuracy: 0.7936 - val_loss: 0.6054 - val_accuracy: 0.7879 - lr: 6.2500e-04\nEpoch 12/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5730 
- accuracy: 0.7978 - val_loss: 0.5994 - val_accuracy: 0.7907 - lr: 6.2500e-04\nEpoch 13/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5698 - accuracy: 0.7974 - val_loss: 0.6013 - val_accuracy: 0.7882 - lr: 6.2500e-04\nEpoch 14/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5623 - accuracy: 0.8009 - val_loss: 0.5973 - val_accuracy: 0.7910 - lr: 6.2500e-04\nEpoch 15/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5496 - accuracy: 0.8064 - val_loss: 0.5961 - val_accuracy: 0.7905 - lr: 1.5625e-04\nEpoch 16/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5484 - accuracy: 0.8048 - val_loss: 0.5937 - val_accuracy: 0.7914 - lr: 1.5625e-04\nEpoch 17/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5573 - accuracy: 0.8037 - val_loss: 0.5950 - val_accuracy: 0.7902 - lr: 1.5625e-04\nEpoch 18/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5477 - accuracy: 0.8062 - val_loss: 0.5927 - val_accuracy: 0.7907 - lr: 1.5625e-04\nEpoch 19/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5446 - accuracy: 0.8073 - val_loss: 0.5904 - val_accuracy: 0.7923 - lr: 1.5625e-04\nEpoch 20/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5391 - accuracy: 0.8104 - val_loss: 0.5926 - val_accuracy: 0.7920 - lr: 3.9062e-05\nEpoch 21/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5419 - accuracy: 0.8080 - val_loss: 0.5915 - val_accuracy: 0.7929 - lr: 3.9062e-05\nEpoch 22/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5438 - accuracy: 0.8099 - val_loss: 0.5909 - val_accuracy: 0.7925 - lr: 3.9062e-05\nEpoch 23/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5467 - accuracy: 0.8075 - val_loss: 0.5914 - val_accuracy: 0.7919 - lr: 3.9062e-05\nEpoch 24/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5376 - accuracy: 0.8103 - val_loss: 
0.5918 - val_accuracy: 0.7920 - lr: 3.9062e-05\nEpoch 25/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5410 - accuracy: 0.8085 - val_loss: 0.5923 - val_accuracy: 0.7917 - lr: 9.7656e-06\nEpoch 26/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5406 - accuracy: 0.8084 - val_loss: 0.5910 - val_accuracy: 0.7915 - lr: 9.7656e-06\nEpoch 27/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5384 - accuracy: 0.8097 - val_loss: 0.5901 - val_accuracy: 0.7919 - lr: 9.7656e-06\nEpoch 28/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5431 - accuracy: 0.8089 - val_loss: 0.5915 - val_accuracy: 0.7927 - lr: 9.7656e-06\nEpoch 29/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5417 - accuracy: 0.8095 - val_loss: 0.5921 - val_accuracy: 0.7925 - lr: 9.7656e-06\nEpoch 30/40\n782/782 [==============================] - 17s 21ms/step - loss: 0.5385 - accuracy: 0.8108 - val_loss: 0.5900 - val_accuracy: 0.7926 - lr: 2.4414e-06\nEpoch 31/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5451 - accuracy: 0.8073 - val_loss: 0.5910 - val_accuracy: 0.7923 - lr: 2.4414e-06\nEpoch 32/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5402 - accuracy: 0.8103 - val_loss: 0.5899 - val_accuracy: 0.7925 - lr: 2.4414e-06\nEpoch 33/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5405 - accuracy: 0.8091 - val_loss: 0.5909 - val_accuracy: 0.7928 - lr: 2.4414e-06\nEpoch 34/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5427 - accuracy: 0.8091 - val_loss: 0.5914 - val_accuracy: 0.7921 - lr: 2.4414e-06\nEpoch 35/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5416 - accuracy: 0.8105 - val_loss: 0.5906 - val_accuracy: 0.7928 - lr: 6.1035e-07\nEpoch 36/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5375 - accuracy: 0.8109 - val_loss: 0.5905 - val_accuracy: 0.7927 - 
lr: 6.1035e-07\nEpoch 37/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5372 - accuracy: 0.8092 - val_loss: 0.5900 - val_accuracy: 0.7923 - lr: 6.1035e-07\nEpoch 38/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5438 - accuracy: 0.8090 - val_loss: 0.5907 - val_accuracy: 0.7927 - lr: 6.1035e-07\nEpoch 39/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5424 - accuracy: 0.8097 - val_loss: 0.5906 - val_accuracy: 0.7922 - lr: 6.1035e-07\nEpoch 40/40\n782/782 [==============================] - 16s 21ms/step - loss: 0.5385 - accuracy: 0.8116 - val_loss: 0.5909 - val_accuracy: 0.7928 - lr: 1.5259e-07\n" ] ], [ [ "**MODEL EVALUATION:**", "_____no_output_____" ] ], [ [ "#@ INITIALIZING MODEL EVALUATION:\npredictions = model.predict(testX, batch_size=64) # Getting Model Predictions. \nprint(classification_report(testY.argmax(axis=1),\n predictions.argmax(axis=1), \n target_names=labelNames)) # Inspecting Classification Report.", " precision recall f1-score support\n\n airplane 0.85 0.79 0.82 1000\n automobile 0.90 0.88 0.89 1000\n bird 0.73 0.65 0.69 1000\n cat 0.62 0.60 0.61 1000\n deer 0.72 0.81 0.76 1000\n dog 0.71 0.71 0.71 1000\n frog 0.80 0.89 0.84 1000\n horse 0.87 0.82 0.85 1000\n ship 0.89 0.89 0.89 1000\n truck 0.85 0.88 0.86 1000\n\n accuracy 0.79 10000\n macro avg 0.79 0.79 0.79 10000\nweighted avg 0.79 0.79 0.79 10000\n\n" ], [ "#@ INSPECTING TRAINING LOSS AND ACCURACY:\nplt.style.use(\"ggplot\")\nplt.figure()\nplt.plot(np.arange(0, 40), H.history[\"loss\"], label=\"train_loss\")\nplt.plot(np.arange(0, 40), H.history[\"val_loss\"], label=\"val_loss\")\nplt.plot(np.arange(0, 40), H.history[\"accuracy\"], label=\"train_acc\")\nplt.plot(np.arange(0, 40), H.history[\"val_accuracy\"], label=\"val_acc\")\nplt.title(\"Training Loss and Accuracy\")\nplt.xlabel(\"Epoch\")\nplt.ylabel(\"Loss/Accuracy\")\nplt.legend()\nplt.show();", "_____no_output_____" ] ], [ [ "**Note:**\n- Batch Normalization can 
lead to a faster, more stable convergence with higher accuracy. \n- Batch Normalization will require more wall time to train the network even though the network will obtain higher accuracy in less epochs. ", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d063570e27d884ad1284ea042d5745f573a85718
15,261
ipynb
Jupyter Notebook
src/data-cleaning-final.ipynb
emilynomura1/1030MidtermProject
abe25ffde5733d7110f30ce81faf37a2dfa95abc
[ "MIT" ]
null
null
null
src/data-cleaning-final.ipynb
emilynomura1/1030MidtermProject
abe25ffde5733d7110f30ce81faf37a2dfa95abc
[ "MIT" ]
null
null
null
src/data-cleaning-final.ipynb
emilynomura1/1030MidtermProject
abe25ffde5733d7110f30ce81faf37a2dfa95abc
[ "MIT" ]
null
null
null
36.951574
120
0.519756
[ [ [ "# Import packages\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n\n# Read in data. If data is zipped, unzip the file and change file path accordingly\nyelp = pd.read_csv(\"../yelp_academic_dataset_business.csv\",\n dtype={'attributes': str, 'postal_code': str}, low_memory=False)\n\n# Reorder columns\n# https://stackoverflow.com/questions/41968732/set-order-of-columns-in-pandas-dataframe\ncols_to_order = ['name', 'stars', 'review_count', 'categories', 'city', 'state', \n 'postal_code', 'latitude', 'longitude', 'address']\nnew_cols = cols_to_order + (yelp.columns.drop(cols_to_order).tolist())\nyelp = yelp[new_cols]\n\nprint(yelp.shape)\nprint(yelp.info())", "_____no_output_____" ], [ "# Remove entries with null in columns: name, categories, city, postal code\nyelp = yelp[(pd.isna(yelp['name'])==False) & \n (pd.isna(yelp['city'])==False) & \n (pd.isna(yelp['categories'])==False) & \n (pd.isna(yelp['postal_code'])==False)]\nprint(yelp.shape)", "_____no_output_____" ], [ "# Remove columns with <0.5% non-null values (<894) except BYOB=641 non-null\n# and non-relevant columns\nyelp = yelp.drop(yelp.columns[[6,9,17,26,31,33,34,37,38]], axis=1)\nprint(yelp.shape)", "_____no_output_____" ], [ "# Remove entries with < 1000 businesses in each state\nstate_counts = yelp['state'].value_counts()\nyelp = yelp[~yelp['state'].isin(state_counts[state_counts < 1000].index)]\nprint(yelp.shape)", "_____no_output_____" ], [ "# Create new column of grouped star rating\nconds = [\n ((yelp['stars'] == 1) | (yelp['stars'] == 1.5)),\n ((yelp['stars'] == 2) | (yelp['stars'] == 2.5)),\n ((yelp['stars'] == 3) | (yelp['stars'] == 3.5)),\n ((yelp['stars'] == 4) | (yelp['stars'] == 4.5)),\n (yelp['stars'] == 5) ]\nvalues = [1, 2, 3, 4, 5]\nyelp['star-rating'] = np.select(conds, values)\nprint(yelp.shape)", "_____no_output_____" ], [ "# Convert 'hours' columns to total hours open that day for each day column\nfrom datetime import timedelta, time\n# Monday 
---------------------------------------------------------\nyelp[['hours.Monday.start', 'hours.Monday.end']] = yelp['hours.Monday'].str.split('-', 1, expand=True)\n# Monday start time\nhr_min = []\nfor row in yelp['hours.Monday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el]) #change elements in list to int\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Monday.start'] = time_obj\n# Monday end time\nhr_min = []\nfor row in yelp['hours.Monday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Monday.end'] = time_obj\n# Create column of time difference\nyelp['Monday.hrs.open'] = yelp['hours.Monday.end'] - yelp['hours.Monday.start']\n# Convert seconds to minutes\nhour_calc = []\nfor ob in yelp['Monday.hrs.open']:\n hour_calc.append(ob.seconds//3600) #convert seconds to hours for explainability\nyelp['Monday.hrs.open'] = hour_calc\n# Tuesday -------------------------------------------------------------\nyelp[['hours.Tuesday.start', 'hours.Tuesday.end']] = yelp['hours.Tuesday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Tuesday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Tuesday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Tuesday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) 
for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Tuesday.end'] = time_obj\nyelp['Tuesday.hrs.open'] = yelp['hours.Tuesday.end'] - yelp['hours.Tuesday.start']\nhour_calc = []\nfor ob in yelp['Tuesday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Tuesday.hrs.open'] = hour_calc\n# Wednesday ---------------------------------------------------------\nyelp[['hours.Wednesday.start', 'hours.Wednesday.end']] = yelp['hours.Wednesday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Wednesday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Wednesday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Wednesday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Wednesday.end'] = time_obj\nyelp['Wednesday.hrs.open'] = yelp['hours.Wednesday.end'] - yelp['hours.Wednesday.start']\nhour_calc = []\nfor ob in yelp['Wednesday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Wednesday.hrs.open'] = hour_calc\n# Thursday --------------------------------------------------------------------\nyelp[['hours.Thursday.start', 'hours.Thursday.end']] = yelp['hours.Thursday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Thursday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], 
minutes=el_split[1]))\nyelp['hours.Thursday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Thursday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Thursday.end'] = time_obj\nyelp['Thursday.hrs.open'] = yelp['hours.Thursday.end'] - yelp['hours.Thursday.start']\nhour_calc = []\nfor ob in yelp['Thursday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Thursday.hrs.open'] = hour_calc\n# Friday -----------------------------------------------------------------------\nyelp[['hours.Friday.start', 'hours.Friday.end']] = yelp['hours.Friday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Friday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Friday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Friday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Friday.end'] = time_obj\nyelp['Friday.hrs.open'] = yelp['hours.Friday.end'] - yelp['hours.Friday.start']\nhour_calc = []\nfor ob in yelp['Friday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Friday.hrs.open'] = hour_calc\n# Saturday ------------------------------------------------------------------------\nyelp[['hours.Saturday.start', 'hours.Saturday.end']] = yelp['hours.Saturday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Saturday.start']:\n 
hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Saturday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Saturday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Saturday.end'] = time_obj\nyelp['Saturday.hrs.open'] = yelp['hours.Saturday.end'] - yelp['hours.Saturday.start']\nhour_calc = []\nfor ob in yelp['Saturday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Saturday.hrs.open'] = hour_calc\n# Sunday ----------------------------------------------------------------------\nyelp[['hours.Sunday.start', 'hours.Sunday.end']] = yelp['hours.Sunday'].str.split('-', 1, expand=True)\nhr_min = []\nfor row in yelp['hours.Sunday.start']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Sunday.start'] = time_obj\nhr_min = []\nfor row in yelp['hours.Sunday.end']:\n hr_min.append(str(row).split(':'))\nnew_el = []\nfor el in hr_min:\n if len(el) == 1:\n new_el.append([0,0])\n else:\n new_el.append([int(i) for i in el])\ntime_obj = []\nfor el_split in new_el:\n time_obj.append(timedelta(hours=el_split[0], minutes=el_split[1]))\nyelp['hours.Sunday.end'] = time_obj\nyelp['Sunday.hrs.open'] = yelp['hours.Sunday.end'] - yelp['hours.Sunday.start']\nhour_calc = []\nfor ob in yelp['Sunday.hrs.open']:\n hour_calc.append(ob.seconds//3600)\nyelp['Sunday.hrs.open'] = hour_calc", 
"_____no_output_____" ], [ "# Remove old target variable (stars) and \n# unnecessary time columns that were created. Only keep 'day.hrs.open' columns\nyelp = yelp.drop(yelp.columns[[1,10,11,12,16,18,41,48,52,53,55,56,\n 58,59,61,62,64,65,67,68,70,71]], axis=1)\nprint(yelp.shape)", "_____no_output_____" ], [ "# Delete columns with unworkable form (dict)\ndel yelp['attributes.BusinessParking']\ndel yelp['attributes.Music']\ndel yelp['attributes.Ambience']\ndel yelp['attributes.GoodForKids']\ndel yelp['attributes.RestaurantsDelivery']\ndel yelp['attributes.BestNights']\ndel yelp['attributes.HairSpecializesIn']\ndel yelp['attributes.GoodForMeal']", "_____no_output_____" ], [ "# Look at final DF before saving\nprint(yelp.info())", "_____no_output_____" ], [ "# Save as CSV for faster loading -------------------------------------------------\nyelp.to_csv('/Data/yelp-clean.csv')", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
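The Thursday–Sunday cells in the row above repeat the same parse-and-subtract logic once per weekday. A minimal sketch of how that logic could be factored into helpers (function names are hypothetical, not from the notebook; it assumes `"HH:MM-HH:MM"` strings like the notebook's `hours.*` columns):

```python
from datetime import timedelta

def parse_clock(text):
    # "17:30" -> timedelta(hours=17, minutes=30); NaN-like values that
    # contain no ':' fall back to 0:00, mirroring the [0, 0] default above.
    parts = str(text).split(':')
    if len(parts) == 1:
        return timedelta(0)
    hours, minutes = (int(p) for p in parts)
    return timedelta(hours=hours, minutes=minutes)

def hours_open(interval):
    # "11:00-23:30" -> whole hours open, like the per-day blocks compute
    # with (end - start).seconds // 3600.
    start, end = interval.split('-', 1)
    return (parse_clock(end) - parse_clock(start)).seconds // 3600

print(hours_open('11:00-23:30'))
```

Applied per day, `yelp[day + '.hrs.open'] = yelp['hours.' + day].map(hours_open)` would replace each of the repeated blocks with one line.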
d0636b41b3a1672c4be3cb9dea70e74ac379adcf
822,625
ipynb
Jupyter Notebook
celebrity.ipynb
peter1505/AIFFEL
def84c450cc479d2fc34d428438c542595606286
[ "MIT" ]
2
2021-11-18T08:40:43.000Z
2021-12-17T07:46:26.000Z
celebrity.ipynb
peter1505/AIFFEL
def84c450cc479d2fc34d428438c542595606286
[ "MIT" ]
null
null
null
celebrity.ipynb
peter1505/AIFFEL
def84c450cc479d2fc34d428438c542595606286
[ "MIT" ]
null
null
null
1,600.437743
576,380
0.958157
[ [ [ "# ๋‚ด๊ฐ€ ๋‹ฎ์€ ์—ฐ์˜ˆ์ธ์€?\n\n\n์‚ฌ์ง„ ๋ชจ์œผ๊ธฐ\n์–ผ๊ตด ์˜์—ญ ์ž๋ฅด๊ธฐ\n์–ผ๊ตด ์˜์—ญ Embedding ์ถ”์ถœ\n์—ฐ์˜ˆ์ธ๋“ค์˜ ์–ผ๊ตด๊ณผ ๊ฑฐ๋ฆฌ ๋น„๊ตํ•˜๊ธฐ\n์‹œ๊ฐํ™”\nํšŒ๊ณ \n\n\n1. ์‚ฌ์ง„ ๋ชจ์œผ๊ธฐ\n\n\n2. ์–ผ๊ตด ์˜์—ญ ์ž๋ฅด๊ธฐ\n์ด๋ฏธ์ง€์—์„œ ์–ผ๊ตด ์˜์—ญ์„ ์ž๋ฆ„\nimage.fromarray๋ฅผ ์ด์šฉํ•˜์—ฌ PIL image๋กœ ๋ณ€ํ™˜ํ•œ ํ›„, ์ถ”ํ›„์— ์‹œ๊ฐํ™”์— ์‚ฌ์šฉ", "_____no_output_____" ] ], [ [ "# ํ•„์š”ํ•œ ๋ชจ๋“ˆ ๋ถˆ๋Ÿฌ์˜ค๊ธฐ\n\nimport os\nimport re\nimport glob\n\nimport glob\nimport pickle\nimport pandas as pd\n\n\nimport matplotlib.pyplot as plt\nimport matplotlib.image as img\nimport face_recognition\n%matplotlib inline \nfrom PIL import Image\nimport numpy as np\n\nimport face_recognition\nimport os\nfrom PIL import Image\n\n\n\n\ndir_path = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data'\nfile_list = os.listdir(dir_path)\n\nprint(len(file_list))\n\n# ์ด๋ฏธ์ง€ ํŒŒ์ผ ๋ถˆ๋Ÿฌ์˜ค๊ธฐ\n\nprint('์—ฐ์˜ˆ์ธ ์ด๋ฏธ์ง€ ํŒŒ์ผ ๊ฐฏ์ˆ˜:', len(file_list) - 5) # ์ถ”๊ฐ€ํ•œ ๋‚ด ์‚ฌ์ง„ ์ˆ˜๋ฅผ ๋บ€ ๋‚˜๋จธ์ง€ ์‚ฌ์ง„ ์ˆ˜ ์„ธ๊ธฐ", "247\n์—ฐ์˜ˆ์ธ ์ด๋ฏธ์ง€ ํŒŒ์ผ ๊ฐฏ์ˆ˜: 242\n" ], [ "# ์ด๋ฏธ์ง€ ํŒŒ์ผ ๋ฆฌ์ŠคํŠธ ํ™•์ธ\n\nprint (\"ํŒŒ์ผ ๋ฆฌ์ŠคํŠธ:\\n{}\".format(file_list))", "ํŒŒ์ผ ๋ฆฌ์ŠคํŠธ:\n['๊ฐ•์ธ๋•.jpg', '๊น€๊ฒฝํ˜„.jpg', '๊ณ ๋ณด๊ฒฐ.jpg', 'T.O.P.jpg', '๊น€๋‚˜์˜.jpg', '๊ธธ์€์ •.jpg', '๊ณ ์†Œ์˜.jpg', '๊ฐ•๋ฌธ๊ฒฝ.jpg', '๊ฐ•์ธ๋ด‰.jpg', '๊ณฝ์ง„์˜.jpg', '๊ฐ•์„ฑ๋ฏผ.jpg', '๊ณต์Šน์—ฐ.jpg', '๊น€๋ฏผ์ƒ.jpg', '๊น€๊ทœ์ข….jpg', '๊ธฐ๋„ํ›ˆ.jpg', 'BUMZU.jpg', '๊ณ ์œ ์ง„.jpg', '๊น€๋ณ‘๊ธฐ.jpg', '๊ณฝ๋™์—ฐ.jpg', '๊น€๋™ํฌ.jpg', '๊ถŒ์ธํ•˜.jpg', '๊น€๋™๋ฅ .jpg', '๊น€์ƒ๊ฒฝ.jpg', 'G-DRAGON.jpg', '๊ถŒ์˜ค์ค‘.jpg', '๊ฐ•์„ฑ์—ฐ.jpg', '๊น€์„œ๋ผ.jpg', '๊น€๋‹ค๋ฏธ.jpg', '๊น€๊ฐ€์€.jpg', '๊น€๊ธฐ๋‘.jpg', '๊ฐ•์„ฑ์•„.jpg', '๊ณ ์šด๋ด‰.jpg', '๊น€๊ฐ€ํฌ.jpg', '๊ฐ•๋ฆฌ๋‚˜.jpg', '๊น€์„ ํ˜ธ.jpg', '๊ณ ์•„๋ผ.jpg', '๊น€๊ทธ๋ฆผ.jpg', '๊น€๋ฏผํฌ.jpg', '๊ฐ•๊ฒฝ์ค€.jpg', '๊น€๊ฑด๋ชจ.jpg', '๊ฐ•๊ท ์„ฑ.jpg', '๊ณฝ์ง€๋ฏผ.jpg', '๊ณ ๊ฒฝํ‘œ.jpg', '๊ฑด์ง€.jpg', '๊น€๊ฒฝํ˜ธ.jpg', '๊ฐ•ํƒœ์˜ค.jpg', '๊น€๋‹ค์†œ.jpg', 
'๊ณ ์œค์ •.jpg', '๊ฐ•์˜ˆ์†”.jpg', '๊ธธํ•™๋ฏธ.jpg', '๊ฐ•๋‘๋ฆฌ.jpg', '๊น€๋ขฐํ•˜.jpg', '๊ถŒํ˜„์ƒ.jpg', '๊ฐ•๋ฏผ๊ฒฝ.jpg', '๊ณฝ๋ฏผ์„.jpg', '๊ฐ•์€ํƒ.jpg', '๊ณฝ๋„์›.jpg', '๊น€๋ฒ•๋ž˜.jpg', 'K.WILL.jpg', '๊น€์„ํ›ˆ.jpg', '๊ตฌ์œคํšŒ.jpg', '๊ธˆ๋ณด๋ผ.jpg', '๊น€๊ฐ‘์ˆ˜.jpg', '๊น€๋ช…์ค€.jpg', '๊ฐ•๊ฒฝํ—Œ.jpg', '๊ธธ์ •์šฐ.jpg', '๊น€์„ ํ˜.jpg', '๊ถŒ์„ฑํฌ.jpg', '๊น€๊ณ ์€.jpg', '๊ฐ์šฐ์„ฑ.jpg', '๊ฐ•์†Œ๋ผ.jpg', '๊ฐ•์Šน์œค.jpg', '๊ฒฝ์ˆ˜์ง„.jpg', '๊น€๊ฐ€๋ž€.jpg', '๊ฐ•์ด์„.jpg', '๊ณต์ •ํ™˜.jpg', '๊น€๋ฏผ๊ธฐ.jpg', '๊น€๋ฏผ๊ต.jpg', '๊ฐ•์‹ ์ผ.jpg', '๊ถŒํ˜์ˆ˜.jpg', '๊น€๊ฝƒ๋น„.jpg', '๊น€๋‚จ์ฃผ.jpg', '๊ณฝํฌ์„ฑ.jpg', '๊ฐ„๋ฏธ์—ฐ.jpg', '๊น€๋ฏผ์„.jpg', '๊ฐ•๋ฏผ์•„.jpg', 'SE7EN.jpg', '๊ฐ•์†Œ๋ฆฌ.jpg', '๊ณฝ์ •์šฑ.jpg', '๊ณต๋ช….jpg', '๊น€๋ณด๋ฏธ.jpg', '๊น€์ƒํ˜ธ.jpg', '๊น€๋ช…๋ฏผ.jpg', '๊น€์ƒํฌ.jpg', '๊ฐ•๋ด‰์„ฑ.jpg', '๊ธฐ๋ฆฌ๋ณด์ด.jpg', '๊น€๊ทœ๋ฆฌ.jpg', '๊น€๋ถ€์„ .jpg', '๊ณ ์ˆ˜.jpg', '๊น€๋ณด๋ผ.jpg', 'RM.jpg', '๊ธฐ์ฃผ๋ด‰.jpg', '๊ฐœ๋ฆฌ.jpg', '๊น€๊ตญํ™˜.jpg', '๊น€๊ธฐ๋ฒ”.jpg', '๊ณ ์•„์„ฑ.jpg', '๊น€์ƒˆ๋ก .jpg', '๊ณ ์›ํฌ.jpg', '๊น€๊ฐ•ํ›ˆ.jpg', '๊ฒฌ์šฐ.jpg', 'KCM.jpg', '๊ณฝ์‹œ์–‘.jpg', '๊ถŒ์œ ๋ฆฌ.jpg', '๊น€๋ฒ”.jpg', '์ด์›์žฌ_01.jpg', '๊ณ ์„ฑํฌ.jpg', '๊ธธ๊ฑด.jpg', 'Zion.T.jpg', '๊น€๋ณ‘์˜ฅ.jpg', '๊ณ ์ค€ํฌ.jpg', '๊น€๊ด‘๊ทœ.jpg', '๊ณ ์ฃผ์›.jpg', '๊ฐ•์˜ˆ์›.jpg', '๊ณ ๋ฏผ์‹œ.jpg', '๊ณตํšจ์ง„.jpg', '๊ฐ•์†Œ์—ฐ.jpg', '๊น€๋ฏผ.jpg', '๊น€๋ช…์ˆ˜.jpg', '๊ถŒํ•ด์„ฑ.jpg', '๊น€๋ฒ”๋ฃก.jpg', '๊ฐ•์„์šฐ.jpg', '๊ถŒ์†Œํ˜„.jpg', '๊ฐ•์ง€์„ญ.jpg', '๊ฐ•์Šน์›.jpg', 'MC๋ชฝ.jpg', '๊น€๋™๋ช….jpg', '๊น€๋ฏผ์„œ.jpg', '๊น€์‚ฌ๋ž‘.jpg', '๊น€๊ฐ€์—ฐ.jpg', '๊ฐ•๋ณ„.jpg', '๊ฐ•์ง€ํ™˜.jpg', '๊ฐ•์ˆ˜์ง€.jpg', '๊น€์ƒ๋ฐฐ.jpg', '๊ถŒํ•ดํšจ.jpg', 'euPhemia.jpg', '์ด์›์žฌ_02.jpg', '๊ธˆ์‚ฌํ–ฅ.jpg', '๊น€๋ฏผ์ข….jpg', '๊ถŒํƒœ์›.jpg', '๊ณ ํ˜„์ •.jpg', '๊ฐ•์„ฑํ•„.jpg', 'V.One.jpg', '๊น€๊ฐ•์šฐ.jpg', '๊น€์„ ์›….jpg', '๊น€๋„์—ฐ.jpg', '๊ถŒ์€์•„.jpg', '๊ธฐ์€์„ธ.jpg', '๊น€๋™ํ•œ.jpg', '๊ฐ•์ด์ฑ„.jpg', '๊ณ ์œค.jpg', '๊ธธํ•ด์—ฐ.jpg', '๊ฒฌ๋ฏธ๋ฆฌ.jpg', '๊ตฌ์žฌ์ด.jpg', '๊ฐ•๊ธฐ์˜.jpg', '๊ณ ๋‘์‹ฌ.jpg', '๊น€๋ฏผ์ค€.jpg', '๊ถŒํ™”์šด.jpg', 
'๊ถŒ๋‹คํ˜„.jpg', '๊ฐ€ํฌ.jpg', '๊ฐ•๋‹คํ˜„.jpg', '๊ณ ์ธ๋ฒ”.jpg', '๊น€๊ด‘์„.jpg', '๊ฐ•์‚ฐ์—.jpg', 'JK๊น€๋™์šฑ.jpg', '๊น€์ƒˆ๋ฒฝ.jpg', '๊ถŒ์€์ˆ˜.jpg', '๊ฐ•์˜ˆ๋นˆ.jpg', '๊ฐ•์ˆ˜์—ฐ.jpg', '๊น€๊ธฐ๋ฐฉ.jpg', '๊ตฌํ˜œ์„ .jpg', '๊ธˆ์ž”๋””.jpg', '๊ฐ•ํ•˜๋Š˜.jpg', '๊ณ ์ฐฝ์„.jpg', '๊ฐ•๋ฏผ์ฃผ.jpg', '๊น€๋‚จ๊ธธ.jpg', '๊น€๋‚˜์šด.jpg', '๊ฑฐ๋ฏธ.jpg', '๊ณ ์€์•„.jpg', '๊ธธ์šฉ์šฐ.jpg', '๊ถŒ์œจ.jpg', '๊น€๋Œ€๋ช….jpg', '๊น€๋‹คํ˜„.jpg', '๊ฐ•๋‚จ๊ธธ.jpg', '๊น€๋ณ‘์„ธ.jpg', '๊น€๋นˆ์šฐ.jpg', '๊ธˆ์ƒˆ๋ก.jpg', '๊ฐ•ํ•œ๋‚˜.jpg', '๊น€๋‹จ์šฐ.jpg', '๊น€๋„์œค.jpg', '๊ถŒ์ƒ์šฐ.jpg', '๊น€๋ณด๊ฒฝ.jpg', '๊ฒฝ์ธ์„ .jpg', '๊ฐ•ํƒ€.jpg', '๊ธธ๋ฏธ.jpg', '๊ฐˆ์†Œ์›.jpg', '๊ณ ๋‚˜์€.jpg', '๊ถŒ๋™ํ˜ธ.jpg', '๊น€๋น›๋‚˜๋ฆฌ.jpg', '๊ณต์œ .jpg', '๊ตฌ์›์ฐฌ.jpg', '๊ธฐํƒœ์˜.jpg', '๊น€๋ฌด์—ด.jpg', '๊น€๊ถŒ.jpg', '๊น€๋™์šฑ.jpg', '๊น€์„ ์˜.jpg', '๊ณ ์„ธ์›.jpg', '๊น€๋ฒ”์ˆ˜.jpg', '๊น€๋ช…๊ณค.jpg', '๊น€๊ฒฝ๋ก.jpg', '๊น€๋ณด๋ฏผ.jpg', '๊ณ ๋‚˜ํฌ.jpg', '๊ฐ•๋ถ€์ž.jpg', '๊ฐ•์„ฑ์ง„.jpg', '๊ณ ์€๋ฏธ.jpg', '๊ฐ•์ˆ˜์ง„.jpg', '๊ฐ•์€๋น„.jpg', '๊น€๋ž˜์›.jpg', '๊น€์„œํ˜•.jpg', '๊น€๋ณด์—ฐ.jpg', '๊ฐ•์ง„.jpg', '๊น€๋‹จ์œจ.jpg', '๊ธˆ๋‹จ๋น„.jpg', '๊ฐ•์ •์šฐ.jpg', '.ipynb_checkpoints', '๊น€๋™์™„.jpg', '๊น€๊ทœ์„ .jpg', '๊น€๋ฏผ์ฃผ.jpg', '๊น€์ƒ์ค‘.jpg', '๊น€๋ฏผ๊ฒฝ.jpg', '๊ฐ•ํ˜œ์—ฐ.jpg', '๊ฐ•๋™์›.jpg', '๊ฐ•๋ฌธ์˜.jpg', '๊น€๋ฌด์ƒ.jpg', '๊ณฝ์ฐฝ์„ .jpg', '๊ณตํ˜•์ง„.jpg', '๊น€๋นˆ.jpg', '๊น€๊ตญํฌ.jpg']\n" ], [ "# ์ด๋ฏธ์ง€ ํŒŒ์ผ ์ผ๋ถ€ ํ™•์ธ\n\n# Set figsize here\nfig, axes = plt.subplots(nrows=2, ncols=3, figsize=(24,10))\n\n# flatten axes for easy iterating\nfor i, ax in enumerate(axes.flatten()):\n image = img.imread(dir_path+'/'+file_list[i])\n ax.imshow(image)\nplt.show()\n\nfig.tight_layout()", "_____no_output_____" ], [ "\n# ์ด๋ฏธ์ง€ ํŒŒ์ผ ๊ฒฝ๋กœ๋ฅผ ํŒŒ๋ผ๋ฏธํ„ฐ๋กœ ๋„˜๊ธฐ๋ฉด ์–ผ๊ตด ์˜์—ญ๋งŒ ์ž˜๋ผ์ฃผ๋Š” ํ•จ์ˆ˜\n\ndef get_cropped_face(image_file):\n image = face_recognition.load_image_file(image_file)\n face_locations = face_recognition.face_locations(image)\n a, b, c, d = face_locations[0]\n cropped_face = image[a:c,d:b,:]\n \n return 
cropped_face", "_____no_output_____" ], [ "# Check that the face region is cropped correctly\n\nimage_path = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/이원재_02.jpg'\n\ncropped_face = get_cropped_face(image_path)\nplt.imshow(cropped_face)", "_____no_output_____" ] ], [ [ "## Step3. Extract embeddings of the face regions", "_____no_output_____" ] ], [ [ "# Function that computes a face embedding vector from a face region\n\ndef get_face_embedding(face):\n return face_recognition.face_encodings(face, model='cnn')", "_____no_output_____" ], [ "# Function that returns embedding_dict given a directory path\n\ndef get_face_embedding_dict(dir_path):\n file_list = os.listdir(dir_path)\n embedding_dict = {}\n \n for file in file_list:\n try: \n img_path = os.path.join(dir_path, file)\n face = get_cropped_face(img_path)\n embedding = get_face_embedding(face)\n if len(embedding) > 0: \n # if the face region is not detected properly, len(embedding)==0 can occur, so guard against it \n # os.path.splitext(file)[0] holds the image file name with its extension removed \n embedding_dict[os.path.splitext(file)[0]] = embedding[0]\n # embedding_dict stores each image's embedding: key=person name, value=embedding vector\n # os.path.splitext(file)[0] extracts only the name without the extension\n # embedding[0] is the element value to store\n\n except:\n continue\n \n return embedding_dict", "_____no_output_____" ], [ "embedding_dict = get_face_embedding_dict(dir_path)", "_____no_output_____" ] ], [ [ "## Step4. 
๋ชจ์€ ์—ฐ์˜ˆ์ธ๋“ค๊ณผ ๋น„๊ตํ•˜๊ธฐ", "_____no_output_____" ] ], [ [ "# ์ด๋ฏธ์ง€ ๊ฐ„ ๊ฑฐ๋ฆฌ๋ฅผ ๊ตฌํ•˜๋Š” ํ•จ์ˆ˜\n\ndef get_distance(name1, name2):\n return np.linalg.norm(embedding_dict[name1]-embedding_dict[name2], ord=2)", "_____no_output_____" ], [ "# ๋ณธ์ธ ์‚ฌ์ง„์˜ ๊ฑฐ๋ฆฌ๋ฅผ ํ™•์ธํ•ด๋ณด์ž\n\nprint('๋‚ด ์‚ฌ์ง„๋ผ๋ฆฌ์˜ ๊ฑฐ๋ฆฌ๋Š”?:', get_distance('์ด์›์žฌ_01', '์ด์›์žฌ_02'))", "๋‚ด ์‚ฌ์ง„๋ผ๋ฆฌ์˜ ๊ฑฐ๋ฆฌ๋Š”?: 0.27525162596989655\n" ], [ "# name1๊ณผ name2์˜ ๊ฑฐ๋ฆฌ๋ฅผ ๋น„๊ตํ•˜๋Š” ํ•จ์ˆ˜๋ฅผ ์ƒ์„ฑํ•˜๋˜, name1์€ ๋ฏธ๋ฆฌ ์ง€์ •ํ•˜๊ณ , name2๋Š” ํ˜ธ์ถœ์‹œ์— ์ธ์ž๋กœ ๋ฐ›๋„๋ก ํ•ฉ๋‹ˆ๋‹ค.\n\ndef get_sort_key_func(name1):\n def get_distance_from_name1(name2):\n return get_distance(name1, name2)\n return get_distance_from_name1", "_____no_output_____" ], [ "\n# ๋‹ฎ์€๊ผด ์ˆœ์œ„, ์ด๋ฆ„, ์ž„๋ฒ ๋”ฉ ๊ฑฐ๋ฆฌ๋ฅผ ํฌํ•จํ•œ Top-5 ๋ฆฌ์ŠคํŠธ ์ถœ๋ ฅํ•˜๋Š” ํ•จ์ˆ˜\n\ndef get_nearest_face(name, top=5):\n sort_key_func = get_sort_key_func(name)\n sorted_faces = sorted(embedding_dict.items(), key=lambda x:sort_key_func(x[0]))\n \n rank_cnt = 1 # ์ˆœ์œ„๋ฅผ ์„ธ๋Š” ๋ณ€์ˆ˜\n pass_cnt = 1 # ๊ฑด๋„ˆ๋›ด ์ˆซ์ž๋ฅผ ์„ธ๋Š” ๋ณ€์ˆ˜(๋ณธ์ธ ์‚ฌ์ง„ ์นด์šดํŠธ)\n end = 0 # ๋‹ฎ์€ ๊ผด 5๋ฒˆ ์ถœ๋ ฅ์‹œ ์ข…๋ฃŒํ•˜๊ธฐ ์œ„ํ•ด ์„ธ๋Š” ๋ณ€์ˆ˜\n for i in range(top+15):\n rank_cnt += 1\n if sorted_faces[i][0].find('์ด์›์žฌ_02') == 0: # ๋ณธ์ธ ์‚ฌ์ง„์ธ mypicture๋ผ๋Š” ํŒŒ์ผ๋ช…์œผ๋กœ ์‹œ์ž‘ํ•˜๋Š” ๊ฒฝ์šฐ ์ œ์™ธํ•ฉ๋‹ˆ๋‹ค.\n pass_cnt += 1\n continue\n if sorted_faces[i]:\n print('์ˆœ์œ„ {} : ์ด๋ฆ„({}), ๊ฑฐ๋ฆฌ({})'.format(rank_cnt - pass_cnt, sorted_faces[i][0], sort_key_func(sorted_faces[i][0])))\n end += 1\n if end == 5: # end๊ฐ€ 5๊ฐ€ ๋œ ๊ฒฝ์šฐ ์—ฐ์˜ˆ์ธ 5๋ช… ์ถœ๋ ฅ๋˜์—ˆ๊ธฐ์— ์ข…๋ฃŒํ•ฉ๋‹ˆ๋‹ค.\n break", "_____no_output_____" ], [ "# '์ด์›์žฌ_01'๊ณผ ๊ฐ€์žฅ ๋‹ฎ์€ ์‚ฌ๋žŒ์€ ๋ˆ„๊ตด๊นŒ์š”?\n\nget_nearest_face('์ด์›์žฌ_01')", "์ˆœ์œ„ 1 : ์ด๋ฆ„(์ด์›์žฌ_01), ๊ฑฐ๋ฆฌ(0.0)\n์ˆœ์œ„ 2 : ์ด๋ฆ„(euPhemia), ๊ฑฐ๋ฆฌ(0.39785575251289035)\n์ˆœ์œ„ 3 : ์ด๋ฆ„(๊ณต๋ช…), ๊ฑฐ๋ฆฌ(0.43181500298337777)\n์ˆœ์œ„ 4 : 
์ด๋ฆ„(๊ฐ•๊ธฐ์˜), ๊ฑฐ๋ฆฌ(0.44559566211978)\n์ˆœ์œ„ 5 : ์ด๋ฆ„(JK๊น€๋™์šฑ), ๊ฑฐ๋ฆฌ(0.4560282622605789)\n" ], [ "# '์ด์›์žฌ_02'์™€ ๊ฐ€์žฅ ๋‹ฎ์€ ์‚ฌ๋žŒ์€ ๋ˆ„๊ตด๊นŒ์š”?\n\nget_nearest_face('์ด์›์žฌ_02')", "์ˆœ์œ„ 1 : ์ด๋ฆ„(์ด์›์žฌ_01), ๊ฑฐ๋ฆฌ(0.27525162596989655)\n์ˆœ์œ„ 2 : ์ด๋ฆ„(euPhemia), ๊ฑฐ๋ฆฌ(0.38568278214648233)\n์ˆœ์œ„ 3 : ์ด๋ฆ„(๊ณต๋ช…), ๊ฑฐ๋ฆฌ(0.445581489047543)\n์ˆœ์œ„ 4 : ์ด๋ฆ„(๊น€๋™์™„), ๊ฑฐ๋ฆฌ(0.44765017085662295)\n์ˆœ์œ„ 5 : ์ด๋ฆ„(๊ฐ•์„ฑํ•„), ๊ฑฐ๋ฆฌ(0.4536061116328271)\n" ] ], [ [ "## Step5. ๋‹ค์–‘ํ•œ ์žฌ๋ฏธ์žˆ๋Š” ์‹œ๊ฐํ™” ์‹œ๋„ํ•ด ๋ณด๊ธฐ", "_____no_output_____" ] ], [ [ "\n# ์‚ฌ์ง„ ๊ฒฝ๋กœ ์„ค์ •\n\nmypicture1 = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/์ด์›์žฌ_01.jpg'\nmypicture2 = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/์ด์›์žฌ_02.jpg'\n\nmc= os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/MC๋ชฝ.jpg'\ngahee = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ€ํฌ.jpg'\nseven = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/SE7EN.jpg'\ngam = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ์šฐ์„ฑ.jpg'\n\ngang = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ•๊ฒฝ์ค€.jpg'\ngyung = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ•๊ฒฝํ˜„.jpg'\ngi = os.getenv('HOME')+'/aiffel/EXP_07_face_embedding/data/๊ฐ•๊ธฐ์˜.jpg'", "_____no_output_____" ], [ "\n# ํฌ๋กญํ•œ ์–ผ๊ตด์„ ์ €์žฅํ•ด ๋ณด์ž\n\na1 = get_cropped_face(mypicture1)\na2 = get_cropped_face(mypicture2)\n\nb1 = get_cropped_face(mc)\nb2 = get_cropped_face(gahee)\nb3 = get_cropped_face(gam)", "_____no_output_____" ], [ "plt.figure(figsize=(10,8))\n\nplt.subplot(231)\nplt.imshow(a1)\nplt.axis('off')\nplt.title('1st')\nplt.subplot(232)\nplt.imshow(a2)\nplt.axis('off')\nplt.title('me')\nplt.subplot(233)\nplt.imshow(b1)\nplt.axis('off')\nplt.title('2nd')\nplt.subplot(234)\n\nprint('''mypicture์˜ ์ˆœ์œ„\n์ˆœ์œ„ 1 : ์ด๋ฆ„(์‚ฌ์ฟ ๋ผ), ๊ฑฐ๋ฆฌ(0.36107689719729225)\n์ˆœ์œ„ 2 : ์ด๋ฆ„(ํŠธ์™€์ด์Šค๋‚˜์—ฐ), 
๊ฑฐ๋ฆฌ(0.36906292012955577) \n์ˆœ์œ„ 3 : ์ด๋ฆ„(์•„์ด์œ ), ๊ฑฐ๋ฆฌ(0.3703590842312735) \n์ˆœ์œ„ 4 : ์ด๋ฆ„(์œ ํŠธ๋ฃจ), ๊ฑฐ๋ฆฌ(0.3809516850126146) \n์ˆœ์œ„ 5 : ์ด๋ฆ„(์ง€ํ˜ธ), ๊ฑฐ๋ฆฌ(0.3886670633997685)''')", "mypicture์˜ ์ˆœ์œ„\n์ˆœ์œ„ 1 : ์ด๋ฆ„(์‚ฌ์ฟ ๋ผ), ๊ฑฐ๋ฆฌ(0.36107689719729225)\n์ˆœ์œ„ 2 : ์ด๋ฆ„(ํŠธ์™€์ด์Šค๋‚˜์—ฐ), ๊ฑฐ๋ฆฌ(0.36906292012955577) \n์ˆœ์œ„ 3 : ์ด๋ฆ„(์•„์ด์œ ), ๊ฑฐ๋ฆฌ(0.3703590842312735) \n์ˆœ์œ„ 4 : ์ด๋ฆ„(์œ ํŠธ๋ฃจ), ๊ฑฐ๋ฆฌ(0.3809516850126146) \n์ˆœ์œ„ 5 : ์ด๋ฆ„(์ง€ํ˜ธ), ๊ฑฐ๋ฆฌ(0.3886670633997685)\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ] ]
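The notebook in the row above ranks faces with `np.linalg.norm` over 128-d `face_recognition` embeddings. A dependency-free sketch of the same distance-and-rank idea, using toy 2-d vectors (helper names are hypothetical, not the notebook's API):

```python
import math

def l2_distance(a, b):
    # Euclidean (L2) distance, equivalent to
    # np.linalg.norm(a - b, ord=2) in the notebook's get_distance.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def nearest_faces(query_name, embedding_dict, top=2):
    # Sort every stored embedding by distance to the query embedding,
    # skipping the query itself -- the core of get_nearest_face above.
    query = embedding_dict[query_name]
    ranked = sorted(
        (name for name in embedding_dict if name != query_name),
        key=lambda name: l2_distance(query, embedding_dict[name]),
    )
    return ranked[:top]

db = {'me': [0.0, 0.0], 'a': [3.0, 4.0], 'b': [1.0, 1.0]}
print(nearest_faces('me', db))  # 'b' (dist ~1.41) ranks before 'a' (dist 5.0)
```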
d0636f5a18ee02ac2f75f5d634bcbcb10c053fe4
43,386
ipynb
Jupyter Notebook
Feature - Handling missing Values/5. Arbitrary Value Imputation.ipynb
deepakkum21/Feature-Engineering
ea10b2685c842bcf0247887db755d05c38b23844
[ "Apache-2.0" ]
null
null
null
Feature - Handling missing Values/5. Arbitrary Value Imputation.ipynb
deepakkum21/Feature-Engineering
ea10b2685c842bcf0247887db755d05c38b23844
[ "Apache-2.0" ]
null
null
null
Feature - Handling missing Values/5. Arbitrary Value Imputation.ipynb
deepakkum21/Feature-Engineering
ea10b2685c842bcf0247887db755d05c38b23844
[ "Apache-2.0" ]
null
null
null
114.777778
28,204
0.847877
[ [ [ "## 5. Arbitrary Value Imputation\n#### This technique was popularized in Kaggle competitions. It consists of replacing NaN with an arbitrary value", "_____no_output_____" ] ], [ [ "import pandas as pd", "_____no_output_____" ], [ "df=pd.read_csv(\"titanic.csv\", usecols=[\"Age\",\"Fare\",\"Survived\"])\ndf.head()", "_____no_output_____" ], [ "def impute_nan(df,variable):\n df[variable+'_zero']=df[variable].fillna(0)\n df[variable+'_hundred']=df[variable].fillna(100)", "_____no_output_____" ], [ "df['Age'].hist(bins=50)", "_____no_output_____" ] ], [ [ "### Advantages\n Easy to implement\n Captures the importance of missingness if there is one\n### Disadvantages\n Distorts the original distribution of the variable\n If missingness is not important, it may mask the predictive power of the original variable by distorting its distribution\n Hard to decide which value to use", "_____no_output_____" ] ], [ [ "impute_nan(df,'Age')\ndf.head()", "_____no_output_____" ], [ "print(df['Age'].std())\nprint(df['Age_zero'].std())\nprint(df['Age_hundred'].std())", "14.526497332334044\n17.596074065915886\n30.930372890173594\n" ], [ "print(df['Age'].mean())\nprint(df['Age_zero'].mean())\nprint(df['Age_hundred'].mean())", "29.69911764705882\n23.79929292929293\n43.66461279461279\n" ], [ "import matplotlib.pyplot as plt\n%matplotlib inline", "_____no_output_____" ], [ "fig = plt.figure()\nax = fig.add_subplot(111)\ndf['Age'].plot(kind='kde', ax=ax)\ndf.Age_zero.plot(kind='kde', ax=ax, color='red')\ndf.Age_hundred.plot(kind='kde', ax=ax, color='green')\nlines, labels = ax.get_legend_handles_labels()\nax.legend(lines, labels, loc='best')", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
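The `impute_nan` helper in the row above fills missing ages with pandas `fillna`. A pandas-free sketch that makes the mechanics of arbitrary-value imputation explicit (toy list, not the Titanic column; the function name is hypothetical):

```python
def impute_arbitrary(values, fill_value):
    # Replace missing entries (None) with an arbitrary constant,
    # mirroring df[variable].fillna(fill_value) in the notebook.
    return [fill_value if v is None else v for v in values]

ages = [22.0, None, 38.0, None, 26.0]
print(impute_arbitrary(ages, 0))    # the "_zero" variant
print(impute_arbitrary(ages, 100))  # the "_hundred" variant
```

As the notebook's std/mean printouts show, the larger the arbitrary value relative to the data, the more the distribution is distorted.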
d063869573f5b3dfc578fc45bb7d1c7875fd50ea
25,044
ipynb
Jupyter Notebook
learning/matplot/animation/basic_animation.ipynb
HypoChloremic/python_learning
3778f6d7c35cdd54a85a3418aba99f2b91d32775
[ "Apache-2.0" ]
2
2019-06-23T07:17:30.000Z
2019-07-06T15:15:42.000Z
learning/matplot/animation/basic_animation.ipynb
HypoChloremic/python_learning
3778f6d7c35cdd54a85a3418aba99f2b91d32775
[ "Apache-2.0" ]
null
null
null
learning/matplot/animation/basic_animation.ipynb
HypoChloremic/python_learning
3778f6d7c35cdd54a85a3418aba99f2b91d32775
[ "Apache-2.0" ]
1
2019-06-23T07:17:43.000Z
2019-06-23T07:17:43.000Z
61.53317
4,768
0.592797
[ [ [ "import numpy as np\nimport matplotlib.pyplot as plt\nimport matplotlib.animation as animation\nimport matplotlib\nfrom IPython.display import HTML\n\n", "_____no_output_____" ], [ "def update_line(num, data, line):\n print(num)\n line.set_data(data[..., :num])\n return line,", "_____no_output_____" ], [ "plt.rcParams['animation.writer'] = 'ffmpeg'\nprint(matplotlib.animation.writers.list())", "['pillow', 'ffmpeg', 'ffmpeg_file', 'html']\n" ], [ "fig1 = plt.figure()\n\n# Fixing random state for reproducibility\nnp.random.seed(19680801)\n\ndata = np.random.rand(2, 25)\nl, = plt.plot([], [])\nplt.xlim(0, 1)\nplt.ylim(0, 1)\nline_ani = animation.FuncAnimation(fig1, update_line, 25, fargs=(data, l),\n interval=50, blit=True)\nHTML(line_ani.to_html5_video())", "_____no_output_____" ], [ "help(plt.plot)", "Help on function plot in module matplotlib.pyplot:\n\nplot(*args, scalex=True, scaley=True, data=None, **kwargs)\n Plot y versus x as lines and/or markers.\n \n Call signatures::\n \n plot([x], y, [fmt], *, data=None, **kwargs)\n plot([x], y, [fmt], [x2], y2, [fmt2], ..., **kwargs)\n \n The coordinates of the points or line nodes are given by *x*, *y*.\n \n The optional parameter *fmt* is a convenient way for defining basic\n formatting like color, marker and linestyle. It's a shortcut string\n notation described in the *Notes* section below.\n \n >>> plot(x, y) # plot x and y using default line style and color\n >>> plot(x, y, 'bo') # plot x and y using blue circle markers\n >>> plot(y) # plot y using x as index array 0..N-1\n >>> plot(y, 'r+') # ditto, but with red plusses\n \n You can use `.Line2D` properties as keyword arguments for more\n control on the appearance. Line properties and *fmt* can be mixed.\n The following two calls yield identical results:\n \n >>> plot(x, y, 'go--', linewidth=2, markersize=12)\n >>> plot(x, y, color='green', marker='o', linestyle='dashed',\n ...
linewidth=2, markersize=12)\n \n When conflicting with *fmt*, keyword arguments take precedence.\n \n \n **Plotting labelled data**\n \n There's a convenient way for plotting objects with labelled data (i.e.\n data that can be accessed by index ``obj['y']``). Instead of giving\n the data in *x* and *y*, you can provide the object in the *data*\n parameter and just give the labels for *x* and *y*::\n \n >>> plot('xlabel', 'ylabel', data=obj)\n \n All indexable objects are supported. This could e.g. be a `dict`, a\n `pandas.DataFame` or a structured numpy array.\n \n \n **Plotting multiple sets of data**\n \n There are various ways to plot multiple sets of data.\n \n - The most straight forward way is just to call `plot` multiple times.\n Example:\n \n >>> plot(x1, y1, 'bo')\n >>> plot(x2, y2, 'go')\n \n - Alternatively, if your data is already a 2d array, you can pass it\n directly to *x*, *y*. A separate data set will be drawn for every\n column.\n \n Example: an array ``a`` where the first column represents the *x*\n values and the other columns are the *y* columns::\n \n >>> plot(a[0], a[1:])\n \n - The third way is to specify multiple sets of *[x]*, *y*, *[fmt]*\n groups::\n \n >>> plot(x1, y1, 'g^', x2, y2, 'g-')\n \n In this case, any additional keyword argument applies to all\n datasets. Also this syntax cannot be combined with the *data*\n parameter.\n \n By default, each line is assigned a different style specified by a\n 'style cycle'. 
The *fmt* and line property parameters are only\n necessary if you want explicit deviations from these defaults.\n Alternatively, you can also change the style cycle using\n :rc:`axes.prop_cycle`.\n \n \n Parameters\n ----------\n x, y : array-like or scalar\n The horizontal / vertical coordinates of the data points.\n *x* values are optional and default to `range(len(y))`.\n \n Commonly, these parameters are 1D arrays.\n \n They can also be scalars, or two-dimensional (in that case, the\n columns represent separate data sets).\n \n These arguments cannot be passed as keywords.\n \n fmt : str, optional\n A format string, e.g. 'ro' for red circles. See the *Notes*\n section for a full description of the format strings.\n \n Format strings are just an abbreviation for quickly setting\n basic line properties. All of these and more can also be\n controlled by keyword arguments.\n \n This argument cannot be passed as keyword.\n \n data : indexable object, optional\n An object with labelled data. If given, provide the label names to\n plot in *x* and *y*.\n \n .. note::\n Technically there's a slight ambiguity in calls where the\n second label is a valid *fmt*. `plot('n', 'o', data=obj)`\n could be `plt(x, y)` or `plt(y, fmt)`. In such cases,\n the former interpretation is chosen, but a warning is issued.\n You may suppress the warning by adding an empty format string\n `plot('n', 'o', '', data=obj)`.\n \n Other Parameters\n ----------------\n scalex, scaley : bool, optional, default: True\n These parameters determined if the view limits are adapted to\n the data limits. 
The values are passed on to `autoscale_view`.\n \n **kwargs : `.Line2D` properties, optional\n *kwargs* are used to specify properties like a line label (for\n auto legends), linewidth, antialiasing, marker face color.\n Example::\n \n >>> plot([1, 2, 3], [1, 2, 3], 'go-', label='line 1', linewidth=2)\n >>> plot([1, 2, 3], [1, 4, 9], 'rs', label='line 2')\n \n If you make multiple lines with one plot command, the kwargs\n apply to all those lines.\n \n Here is a list of available `.Line2D` properties:\n \n Properties:\n agg_filter: a filter function, which takes a (m, n, 3) float array and a dpi value, and returns a (m, n, 3) array\n alpha: float or None\n animated: bool\n antialiased or aa: bool\n clip_box: `.Bbox`\n clip_on: bool\n clip_path: Patch or (Path, Transform) or None\n color or c: color\n contains: callable\n dash_capstyle: {'butt', 'round', 'projecting'}\n dash_joinstyle: {'miter', 'round', 'bevel'}\n dashes: sequence of floats (on/off ink in points) or (None, None)\n data: (2, N) array or two 1D arrays\n drawstyle or ds: {'default', 'steps', 'steps-pre', 'steps-mid', 'steps-post'}, default: 'default'\n figure: `.Figure`\n fillstyle: {'full', 'left', 'right', 'bottom', 'top', 'none'}\n gid: str\n in_layout: bool\n label: object\n linestyle or ls: {'-', '--', '-.', ':', '', (offset, on-off-seq), ...}\n linewidth or lw: float\n marker: marker style\n markeredgecolor or mec: color\n markeredgewidth or mew: float\n markerfacecolor or mfc: color\n markerfacecoloralt or mfcalt: color\n markersize or ms: float\n markevery: None or int or (int, int) or slice or List[int] or float or (float, float)\n path_effects: `.AbstractPathEffect`\n picker: float or callable[[Artist, Event], Tuple[bool, dict]]\n pickradius: float\n rasterized: bool or None\n sketch_params: (scale: float, length: float, randomness: float)\n snap: bool or None\n solid_capstyle: {'butt', 'round', 'projecting'}\n solid_joinstyle: {'miter', 'round', 'bevel'}\n transform: 
`matplotlib.transforms.Transform`\n url: str\n visible: bool\n xdata: 1D array\n ydata: 1D array\n zorder: float\n \n Returns\n -------\n lines\n A list of `.Line2D` objects representing the plotted data.\n \n See Also\n --------\n scatter : XY scatter plot with markers of varying size and/or color (\n sometimes also called bubble chart).\n \n Notes\n -----\n **Format Strings**\n \n A format string consists of a part for color, marker and line::\n \n fmt = '[marker][line][color]'\n \n Each of them is optional. If not provided, the value from the style\n cycle is used. Exception: If ``line`` is given, but no ``marker``,\n the data will be a line without markers.\n \n Other combinations such as ``[color][marker][line]`` are also\n supported, but note that their parsing may be ambiguous.\n \n **Markers**\n \n ============= ===============================\n character description\n ============= ===============================\n ``'.'`` point marker\n ``','`` pixel marker\n ``'o'`` circle marker\n ``'v'`` triangle_down marker\n ``'^'`` triangle_up marker\n ``'<'`` triangle_left marker\n ``'>'`` triangle_right marker\n ``'1'`` tri_down marker\n ``'2'`` tri_up marker\n ``'3'`` tri_left marker\n ``'4'`` tri_right marker\n ``'s'`` square marker\n ``'p'`` pentagon marker\n ``'*'`` star marker\n ``'h'`` hexagon1 marker\n ``'H'`` hexagon2 marker\n ``'+'`` plus marker\n ``'x'`` x marker\n ``'D'`` diamond marker\n ``'d'`` thin_diamond marker\n ``'|'`` vline marker\n ``'_'`` hline marker\n ============= ===============================\n \n **Line Styles**\n \n ============= ===============================\n character description\n ============= ===============================\n ``'-'`` solid line style\n ``'--'`` dashed line style\n ``'-.'`` dash-dot line style\n ``':'`` dotted line style\n ============= ===============================\n \n Example format strings::\n \n 'b' # blue markers with default shape\n 'or' # red circles\n '-g' # green solid line\n '--' # dashed line with 
default color\n '^k:' # black triangle_up markers connected by a dotted line\n \n **Colors**\n \n The supported color abbreviations are the single letter codes\n \n ============= ===============================\n character color\n ============= ===============================\n ``'b'`` blue\n ``'g'`` green\n ``'r'`` red\n ``'c'`` cyan\n ``'m'`` magenta\n ``'y'`` yellow\n ``'k'`` black\n ``'w'`` white\n ============= ===============================\n \n and the ``'CN'`` colors that index into the default property cycle.\n \n If the color is the only part of the format string, you can\n additionally use any `matplotlib.colors` spec, e.g. full names\n (``'green'``) or hex strings (``'#008000'``).\n\n" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code" ] ]
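In the animation notebook above, `FuncAnimation` redraws the line each frame by slicing the first `num` columns of a `(2, 25)` array; note also that `plt.plot` takes `x` and `y` positionally only, so the empty line must be created as `plt.plot([], [])`. A matplotlib-free sketch of the per-frame slicing (`update_line`'s `data[..., :num]`) using plain lists (the helper name is hypothetical):

```python
def frame_slice(data, num):
    # Equivalent of update_line's data[..., :num]: keep the first
    # `num` points of each coordinate row (row 0 = x, row 1 = y).
    return [row[:num] for row in data]

data = [[0.1, 0.4, 0.7], [0.2, 0.5, 0.8]]
for num in range(1, 4):
    print(frame_slice(data, num))
```

Each successive frame reveals one more point, which is what makes the line appear to grow in the rendered video.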
d063877bfa6c74b4e238643da9e2ef6c123e9eec
5,330
ipynb
Jupyter Notebook
notebooks/Toy_problem/Tp4_criterium_2_daylighting_potential.ipynb
Maxketelaar/thesis
d1bab7dffa414c335b452476733c8b9d8ec24579
[ "MIT" ]
null
null
null
notebooks/Toy_problem/Tp4_criterium_2_daylighting_potential.ipynb
Maxketelaar/thesis
d1bab7dffa414c335b452476733c8b9d8ec24579
[ "MIT" ]
null
null
null
notebooks/Toy_problem/Tp4_criterium_2_daylighting_potential.ipynb
Maxketelaar/thesis
d1bab7dffa414c335b452476733c8b9d8ec24579
[ "MIT" ]
1
2021-12-21T15:24:57.000Z
2021-12-21T15:24:57.000Z
27.905759
392
0.558161
[ [ [ "#### loading the libraries", "_____no_output_____" ] ], [ [ "import os\nimport sys\nimport pyvista as pv\nimport trimesh as tm\nimport numpy as np\nimport topogenesis as tg\nimport pickle as pk\nsys.path.append(os.path.realpath('..\\..')) # add the project root to the path so that notebooks.resources can be imported\nfrom notebooks.resources import RES as res", "_____no_output_____" ] ], [ [ "#### loading the configuration of the test", "_____no_output_____" ] ], [ [ "# load base lattice CSV file\nlattice_path = os.path.relpath('../../data/macrovoxels.csv')\nmacro_lattice = tg.lattice_from_csv(lattice_path)\n\n# load random configuration for testing\nconfig_path = os.path.relpath('../../data/random_lattice.csv')\nconfiguration = tg.lattice_from_csv(config_path)\n\n# load environment\nenvironment_path = os.path.relpath(\"../../data/movedcontext.obj\") \nenvironment_mesh = tm.load(environment_path)\n\n# load solar vectors\nvectors = pk.load(open(\"../../data/sunvectors.pk\", \"rb\"))\n\n# load vector intensities\nintensity = pk.load(open(\"../../data/dnival.pk\", \"rb\"))", "_____no_output_____" ] ], [ [ "#### during optimization, arrays like these will be passed to the function:", "_____no_output_____" ] ], [ [ "variable = [0, 0, 0, 1, 1, 1, 0, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 1, 0, 0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 1, 1, 1, 0, 1, 1, 1, 1, 0, 0, 1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1, 0, 0]", "_____no_output_____" ] ], [ [ "#### calling the objective function", "_____no_output_____" ] ], [ [ "# input is the decision variables, a reference lattice, the visibility vectors, their magnitude (i.e. 
direct normal illuminance for daylight), and a mesh of the environment\n# output is the total objective score in 100s of lux on the facade, and 100s of lux per each surface (voxel roofs)\ncrit, voxcrit = res.crit_2_DL(variable, macro_lattice, vectors, intensity, environment_mesh)", "_____no_output_____" ] ], [ [ "#### generating mesh", "_____no_output_____" ] ], [ [ "meshes, _, _ = res.construct_vertical_mesh(configuration, configuration.unit)\nfacademesh = tm.util.concatenate(meshes)", "_____no_output_____" ] ], [ [ "#### visualisation", "_____no_output_____" ] ], [ [ "p = pv.Plotter(notebook=True)\n\nconfiguration.fast_vis(p,False,False,opacity=0.1)\n# p.add_arrows(ctr_per_ray, -ray_per_ctr, mag=5, show_scalar_bar=False)\n# p.add_arrows(ctr_per_ray, nrm_per_ray, mag=5, show_scalar_bar=False)\n# p.add_mesh(roof_mesh)\np.add_mesh(environment_mesh)\np.add_mesh(facademesh, cmap='fire', scalars=np.repeat(voxcrit,2))\np.add_points(vectors*-300)\n# p.add_points(horizontal_test_points)\n\np.show(use_ipyvtk=True)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
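`crit_2_DL` itself lives in the repository's `notebooks.resources` module and is not shown in the row above; conceptually, a daylighting score of this kind sums the intensities of the sun vectors that reach a surface unobstructed by the context mesh. A toy sketch of that aggregation (entirely hypothetical names, not the `RES` implementation):

```python
def daylight_score(visible, intensities):
    # Sum direct-normal illuminance over the sun vectors whose rays are
    # not blocked by the context mesh (visible[i] is an occlusion-test result).
    return sum(i for v, i in zip(visible, intensities) if v)

visible = [True, False, True, True]          # e.g. from a ray/mesh hit test
intensities = [200.0, 500.0, 300.0, 100.0]   # lux per sun vector
print(daylight_score(visible, intensities))
```

Summing such per-surface scores over all voxel faces would give a total comparable to the notebook's `crit` value.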
d0638e2eaa1d7f56e9fec1065b6dd907f395d8fb
5,772
ipynb
Jupyter Notebook
Cap04/.ipynb_checkpoints/modulos_pacotes-checkpoint.ipynb
carlos-freitas-gitHub/python-analytics
4b55cb2acb3383ded700596c5a856b7e2124f2da
[ "Apache-2.0" ]
1
2020-07-31T20:31:19.000Z
2020-07-31T20:31:19.000Z
Cap04/.ipynb_checkpoints/modulos_pacotes-checkpoint.ipynb
carlos-freitas-gitHub/python-analytics
4b55cb2acb3383ded700596c5a856b7e2124f2da
[ "Apache-2.0" ]
null
null
null
Cap04/.ipynb_checkpoints/modulos_pacotes-checkpoint.ipynb
carlos-freitas-gitHub/python-analytics
4b55cb2acb3383ded700596c5a856b7e2124f2da
[ "Apache-2.0" ]
null
null
null
20.688172
502
0.445773
[ [ [ "## Mรณdulo e pacote", "_____no_output_____" ] ], [ [ "# importando mรณdulo, math para operaรงรตes matemรกticas\nimport math", "_____no_output_____" ], [ "# verificando todos os metodos do modulo\ndir(math)", "_____no_output_____" ], [ "# usando um dos metรณdos do mรณdulo, sqrt, raiz quadrada\nprint(math.sqrt(25))", "5.0\n" ], [ "# importando apenas uma funรงรฃo do mรณdulo math\nfrom math import sqrt ", "_____no_output_____" ], [ "# usando este mรฉtodo, como importou somente a funรงรฃo do mรณdulo pode usar somente\n# a funรงรฃo sem o nome do pacote\nprint(sqrt(25))", "5.0\n" ], [ "# imprimindo todos os metodos do mรณdulo math\nprint(dir(math))", "['__doc__', '__loader__', '__name__', '__package__', '__spec__', 'acos', 'acosh', 'asin', 'asinh', 'atan', 'atan2', 'atanh', 'ceil', 'copysign', 'cos', 'cosh', 'degrees', 'e', 'erf', 'erfc', 'exp', 'expm1', 'fabs', 'factorial', 'floor', 'fmod', 'frexp', 'fsum', 'gamma', 'gcd', 'hypot', 'inf', 'isclose', 'isfinite', 'isinf', 'isnan', 'ldexp', 'lgamma', 'log', 'log10', 'log1p', 'log2', 'modf', 'nan', 'pi', 'pow', 'radians', 'remainder', 'sin', 'sinh', 'sqrt', 'tan', 'tanh', 'tau', 'trunc']\n" ], [ "# help da funรงรฃo sqrt do mรณdulo math\nprint(help(sqrt))", "Help on built-in function sqrt in module math:\n\nsqrt(x, /)\n Return the square root of x.\n\nNone\n" ], [ "# random\nimport random", "_____no_output_____" ], [ "# random choice(), escolha, buscando os elementos de maneira aleatรณria\nprint(random.choice(['Maรงa', 'Banana', 'Laranja']))", "Laranja\n" ], [ "# renadom sample(), amostra apartir de uma amostra de valores\nprint(random.sample(range(100), 10))", "[51, 33, 65, 7, 66, 95, 96, 17, 77, 22]\n" ], [ "# mรณdulo para estatistรญca\nimport statistics", "_____no_output_____" ], [ "# criando uma lista de nรบmeros reais\ndados = [2.75, 1.75, 1.25, 0.25, 1.25, 3.5]", "_____no_output_____" ] ] ]
[ "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d0639939fd0e4f172287ca2c118fc3142e12f140
182,702
ipynb
Jupyter Notebook
BareBones 1D CNN LSTM MLP - Sequence Prediction.ipynb
codeWhim/Sequence-Prediction
2f9a0c3f57c20c311840ae00009637f553081f5b
[ "MIT" ]
1
2019-03-06T15:08:47.000Z
2019-03-06T15:08:47.000Z
BareBones 1D CNN LSTM MLP - Sequence Prediction.ipynb
codeWhim/Sequence-Prediction
2f9a0c3f57c20c311840ae00009637f553081f5b
[ "MIT" ]
null
null
null
BareBones 1D CNN LSTM MLP - Sequence Prediction.ipynb
codeWhim/Sequence-Prediction
2f9a0c3f57c20c311840ae00009637f553081f5b
[ "MIT" ]
null
null
null
148.78013
24,024
0.845612
[ [ [ "<h1>Notebook Content</h1>\n\n1. [Import Packages](#1)\n1. [Helper Functions](#2)\n1. [Input](#3)\n1. [Model](#4)\n1. [Prediction](#5)\n1. [Complete Figure](#6)", "_____no_output_____" ], [ "<h1 id=\"1\">1. Import Packages</h1>\nImporting all necessary and useful packages in single cell.", "_____no_output_____" ] ], [ [ "import numpy as np\nimport keras\nimport tensorflow as tf\nfrom numpy import array\nfrom keras.models import Sequential\nfrom keras.layers import LSTM\nfrom keras.layers import Dense\nfrom keras.layers import Flatten\nfrom keras.layers import TimeDistributed\nfrom keras.layers.convolutional import Conv1D\nfrom keras.layers.convolutional import MaxPooling1D\nfrom keras_tqdm import TQDMNotebookCallback\nfrom sklearn.preprocessing import MinMaxScaler\nfrom tqdm import tqdm_notebook\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport random\nfrom random import randint", "_____no_output_____" ] ], [ [ "<h1 id=\"2\">2. Helper Functions</h1>\nDefining Some helper functions which we will need later in code", "_____no_output_____" ] ], [ [ "# split a univariate sequence into samples\ndef split_sequence(sequence, n_steps, look_ahead=0):\n X, y = list(), list()\n for i in range(len(sequence)-look_ahead):\n # find the end of this pattern\n end_ix = i + n_steps\n # check if we are beyond the sequence\n if end_ix > len(sequence)-1-look_ahead:\n break\n # gather input and output parts of the pattern\n seq_x, seq_y = sequence[i:end_ix], sequence[end_ix+look_ahead]\n X.append(seq_x)\n y.append(seq_y)\n return array(X), array(y)\n\ndef plot_multi_graph(xAxis,yAxes,title='',xAxisLabel='number',yAxisLabel='Y'):\n linestyles = ['-', '--', '-.', ':']\n plt.figure()\n plt.title(title)\n plt.xlabel(xAxisLabel)\n plt.ylabel(yAxisLabel)\n for key, value in yAxes.items():\n plt.plot(xAxis, np.array(value), label=key, linestyle=linestyles[randint(0,3)])\n plt.legend()\n \ndef normalize(values):\n values = array(values, dtype=\"float64\").reshape((len(values), 
1))\n # train the normalization\n scaler = MinMaxScaler(feature_range=(0, 1))\n scaler = scaler.fit(values)\n #print('Min: %f, Max: %f' % (scaler.data_min_, scaler.data_max_))\n # normalize the dataset and print the first 5 rows\n normalized = scaler.transform(values)\n return normalized,scaler", "_____no_output_____" ] ], [ [ "<h1 id=\"3\">3. Input</h1>\n\n<h3 id=\"3-1\">3-1. Sequence PreProcessing</h3>\nSplitting and Reshaping", "_____no_output_____" ] ], [ [ "n_features = 1\nn_seq = 20\nn_steps = 1\n \ndef sequence_preprocessed(values, sliding_window, look_ahead=0):\n \n # Normalization\n normalized,scaler = normalize(values)\n \n # Try the following if randomizing the sequence:\n # random.seed('sam') # set the seed\n # raw_seq = random.sample(raw_seq, 100)\n\n # split into samples\n X, y = split_sequence(normalized, sliding_window, look_ahead)\n\n # reshape from [samples, timesteps] into [samples, subsequences, timesteps, features]\n X = X.reshape((X.shape[0], n_seq, n_steps, n_features))\n \n return X,y,scaler", "_____no_output_____" ] ], [ [ "<h3 id=\"3-2\">3-2. Providing Sequence</h3>\nDefining a raw sequence, sliding window of data to consider and look ahead future timesteps", "_____no_output_____" ] ], [ [ "# define input sequence\nsequence_val = [i for i in range(5000,7000)]\nsequence_train = [i for i in range(1000,2000)]\nsequence_test = [i for i in range(10000,14000)]\n\n# choose a number of time steps for sliding window\nsliding_window = 20\n\n# choose a number of further time steps after end of sliding_window till target start (gap between data and target)\nlook_ahead = 20\n\nX_train, y_train, scaler_train = sequence_preprocessed(sequence_train, sliding_window, look_ahead)\nX_val, y_val ,scaler_val = sequence_preprocessed(sequence_val, sliding_window, look_ahead)\nX_test,y_test,scaler_test = sequence_preprocessed(sequence_test, sliding_window, look_ahead)", "_____no_output_____" ] ], [ [ "<h1 id=\"4\">4. Model</h1>\n\n<h3 id=\"4-1\">4-1. 
Defining Layers</h3>\nAdding 1D Convolution, Max Pooling, LSTM and finally Dense (MLP) layer", "_____no_output_____" ] ], [ [ "# define model\nmodel = Sequential()\nmodel.add(TimeDistributed(Conv1D(filters=64, kernel_size=1, activation='relu'), \n input_shape=(None, n_steps, n_features)\n ))\nmodel.add(TimeDistributed(MaxPooling1D(pool_size=1)))\nmodel.add(TimeDistributed(Flatten()))\nmodel.add(LSTM(50, activation='relu', stateful=False))\nmodel.add(Dense(1))", "_____no_output_____" ] ], [ [ "<h3 id=\"4-2\">4-2. Training Model</h3>\nDefined early stop, can be used in callbacks param of model fit, not using for now since it's not recommended at first few iterations of experimentation with new data", "_____no_output_____" ] ], [ [ "# Defining multiple metrics, leaving it to a choice, some may be useful and few may even surprise on some problems\nmetrics = ['mean_squared_error',\n 'mean_absolute_error',\n 'mean_absolute_percentage_error',\n 'mean_squared_logarithmic_error',\n 'logcosh']\n\n# Compiling Model\nmodel.compile(optimizer='adam', loss='mape', metrics=metrics)\n\n# Defining early stop, call it in model fit callback\nearly_stop = keras.callbacks.EarlyStopping(monitor='val_loss', patience=10)\n\n# Fit model\nhistory = model.fit(X_train, y_train, epochs=100, verbose=3, validation_data=(X_val,y_val))", "Train on 960 samples, validate on 1960 samples\nEpoch 1/100\nEpoch 2/100\nEpoch 3/100\nEpoch 4/100\nEpoch 5/100\nEpoch 6/100\nEpoch 7/100\nEpoch 8/100\nEpoch 9/100\nEpoch 10/100\nEpoch 11/100\nEpoch 12/100\nEpoch 13/100\nEpoch 14/100\nEpoch 15/100\nEpoch 16/100\nEpoch 17/100\nEpoch 18/100\nEpoch 19/100\nEpoch 20/100\nEpoch 21/100\nEpoch 22/100\nEpoch 23/100\nEpoch 24/100\nEpoch 25/100\nEpoch 26/100\nEpoch 27/100\nEpoch 28/100\nEpoch 29/100\nEpoch 30/100\nEpoch 31/100\nEpoch 32/100\nEpoch 33/100\nEpoch 34/100\nEpoch 35/100\nEpoch 36/100\nEpoch 37/100\nEpoch 38/100\nEpoch 39/100\nEpoch 40/100\nEpoch 41/100\nEpoch 42/100\nEpoch 43/100\nEpoch 44/100\nEpoch 
45/100\nEpoch 46/100\nEpoch 47/100\nEpoch 48/100\nEpoch 49/100\nEpoch 50/100\nEpoch 51/100\nEpoch 52/100\nEpoch 53/100\nEpoch 54/100\nEpoch 55/100\nEpoch 56/100\nEpoch 57/100\nEpoch 58/100\nEpoch 59/100\nEpoch 60/100\nEpoch 61/100\nEpoch 62/100\nEpoch 63/100\nEpoch 64/100\nEpoch 65/100\nEpoch 66/100\nEpoch 67/100\nEpoch 68/100\nEpoch 69/100\nEpoch 70/100\nEpoch 71/100\nEpoch 72/100\nEpoch 73/100\nEpoch 74/100\nEpoch 75/100\nEpoch 76/100\nEpoch 77/100\nEpoch 78/100\nEpoch 79/100\nEpoch 80/100\nEpoch 81/100\nEpoch 82/100\nEpoch 83/100\nEpoch 84/100\nEpoch 85/100\nEpoch 86/100\nEpoch 87/100\nEpoch 88/100\nEpoch 89/100\nEpoch 90/100\nEpoch 91/100\nEpoch 92/100\nEpoch 93/100\nEpoch 94/100\nEpoch 95/100\nEpoch 96/100\nEpoch 97/100\nEpoch 98/100\nEpoch 99/100\nEpoch 100/100\n" ] ], [ [ "<h3 id=\"4-3\">4-3. Evaluating Model</h3>\nPlotting Training and Validation mean square error", "_____no_output_____" ] ], [ [ "# Plot Errors\n\nfor metric in metrics:\n xAxis = history.epoch\n yAxes = {}\n yAxes[\"Training\"]=history.history[metric]\n yAxes[\"Validation\"]=history.history['val_'+metric]\n plot_multi_graph(xAxis,yAxes, title=metric,xAxisLabel='Epochs')", "_____no_output_____" ] ], [ [ "<h1 id=\"5\">5. Prediction</h1>\n\n<h3 id=\"5-1\">5-1. Single Value Prediction</h3>\nPredicting a single value slided 20 (our provided figure for look_ahead above) values ahead", "_____no_output_____" ] ], [ [ "# demonstrate prediction\nx_input = array([i for i in range(100,120)])\nprint(x_input)\nx_input = x_input.reshape((1, n_seq, n_steps, n_features))\nyhat = model.predict(x_input)\nprint(yhat)", "[100 101 102 103 104 105 106 107 108 109 110 111 112 113 114 115 116 117\n 118 119]\n[[105.82992]]\n" ] ], [ [ "<h3 id=\"5-2\">5-2. 
Sequence Prediction</h3>\nPredicting complete sequence (determining closeness to target) based on data <br />\n<i>change variable for any other sequence though</i>", "_____no_output_____" ] ], [ [ "# Prediction from Training Set\npredict_train = model.predict(X_train)\n\n# Prediction from Test Set\npredict_test = model.predict(X_test)\n\n\"\"\"\ndf = pd.DataFrame(({\"normalized y_train\":y_train.flatten(),\n \"normalized predict_train\":predict_train.flatten(),\n \"actual y_train\":scaler_train.inverse_transform(y_train).flatten(),\n \"actual predict_train\":scaler_train.inverse_transform(predict_train).flatten(),\n }))\n\n\"\"\"\n\ndf = pd.DataFrame(({ \n \"normalized y_test\":y_test.flatten(),\n \"normalized predict_test\":predict_test.flatten(),\n \"actual y_test\":scaler_test.inverse_transform(y_test).flatten(),\n \"actual predict_test\":scaler_test.inverse_transform(predict_test).flatten()\n }))\ndf", "_____no_output_____" ] ], [ [ "<h1 id=\"6\">6. Complete Figure</h1>\nData, Target, Prediction - all in one single graph", "_____no_output_____" ] ], [ [ "xAxis = [i for i in range(len(y_train))]\nyAxes = {}\nyAxes[\"Data\"]=sequence_train[sliding_window:len(sequence_train)-look_ahead]\nyAxes[\"Target\"]=scaler_train.inverse_transform(y_train)\nyAxes[\"Prediction\"]=scaler_train.inverse_transform(predict_train)\nplot_multi_graph(xAxis,yAxes,title='')\n\nxAxis = [i for i in range(len(y_test))]\nyAxes = {}\nyAxes[\"Data\"]=sequence_test[sliding_window:len(sequence_test)-look_ahead]\nyAxes[\"Target\"]=scaler_test.inverse_transform(y_test)\nyAxes[\"Prediction\"]=scaler_test.inverse_transform(predict_test)\nplot_multi_graph(xAxis,yAxes,title='')\n\nprint(metrics)\nprint(model.evaluate(X_test,y_test))", "['mean_squared_error', 'mean_absolute_error', 'mean_absolute_percentage_error', 'mean_squared_logarithmic_error', 'logcosh']\n3960/3960 [==============================] - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - 
ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - ETA: - 1s 294us/step\n[7.694095613258053, 0.00023503987094495595, 0.015312134466990077, 7.694095613258053, 0.00011939386936549021, 0.0001175134772149084]\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0639b5e7b411de78ef7448fa61f58a26ff2ed77
47,720
ipynb
Jupyter Notebook
models/Character_Level_CNN.ipynb
TheBlueEngineer/Serene-1.0
4f8c2e688c1403fda3c43c46c5ee598da3e607ea
[ "MIT" ]
1
2020-09-23T21:21:55.000Z
2020-09-23T21:21:55.000Z
models/Character_Level_CNN.ipynb
TheBlueEngineer/Serene-1.0
4f8c2e688c1403fda3c43c46c5ee598da3e607ea
[ "MIT" ]
null
null
null
models/Character_Level_CNN.ipynb
TheBlueEngineer/Serene-1.0
4f8c2e688c1403fda3c43c46c5ee598da3e607ea
[ "MIT" ]
null
null
null
57.842424
1,659
0.51536
[ [ [ "# **Libraries**", "_____no_output_____" ] ], [ [ "from google.colab import drive\ndrive.mount('/content/drive')", "Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n" ], [ "# ***********************\n# *****| LIBRARIES |*****\n# ***********************\n%tensorflow_version 2.x\nimport pandas as pd\nimport numpy as np\nimport os\nimport json\n\nfrom sklearn.model_selection import train_test_split\nimport tensorflow as tf\nfrom keras.preprocessing.text import Tokenizer\nfrom keras.preprocessing.sequence import pad_sequences\nfrom keras.layers import Input, Embedding, Activation, Flatten, Dense\nfrom keras.layers import Conv1D, MaxPooling1D, Dropout\nfrom keras.models import Model\nfrom keras.utils import to_categorical\nfrom keras.optimizers import SGD\nfrom keras.wrappers.scikit_learn import KerasClassifier\nfrom sklearn.model_selection import RandomizedSearchCV, GridSearchCV\n\ndevice_name = tf.test.gpu_device_name()\nif device_name != '/device:GPU:0':\n print(\"GPU not found\")\nelse:\n print('Found GPU at: {}'.format(device_name))", "Using TensorFlow backend.\n" ], [ "# ******************************\n# *****| GLOBAL VARIABLES |*****\n# ******************************\ntest_size = 0.2\n\nconvsize = 256\nconvsize2 = 1024\nembedding_size = 27\ninput_size = 1000\nconv_layers = [\n [convsize, 7, 3],\n [convsize, 7, 3],\n [convsize, 3, -1],\n [convsize, 3, -1],\n [convsize, 3, -1],\n [convsize, 3, 3]\n ]\n\nfully_connected_layers = [convsize2, convsize2]\nnum_of_classes= 2\ndropout_p = 0.5\noptimizer= 'adam'\nbatch = 128\nloss = 'categorical_crossentropy'", "_____no_output_____" ] ], [ [ "# **Utility functions**", "_____no_output_____" ] ], [ [ "# *****************\n# *** GET FILES ***\n# *****************\ndef getFiles( driverPath, directory, basename, extension): # Define a function that will return a list of files\n pathList = [] # Declare an empty array\n directory = 
os.path.join( driverPath, directory) # \n \n for root, dirs, files in os.walk( directory): # Iterate through roots, dirs and files recursively\n for file in files: # For every file in files\n if os.path.basename(root) == basename: # If the parent directory of the current file is equal with the parameter\n if file.endswith('.%s' % (extension)): # If the searched file ends in the parameter\n path = os.path.join(root, file) # Join together the root path and file name\n pathList.append(path) # Append the new path to the list\n return pathList ", "_____no_output_____" ], [ "# ****************************************\n# *** GET DATA INTO A PANDAS DATAFRAME ***\n# ****************************************\ndef getDataFrame( listFiles, maxFiles, minWords, limit):\n counter_real, counter_max, limitReached = 0, 0, 0\n text_list, label_list = [], []\n\n print(\"Word min set to: %i.\" % ( minWords))\n # Iterate through all the files\n for file in listFiles:\n # Open each file and look into it\n with open(file) as f:\n if(limitReached):\n break\n if maxFiles == 0:\n break\n else:\n maxFiles -= 1\n objects = json.loads( f.read())['data'] # Get the data from the JSON file\n # Look into each object from the file and test for limiters\n for object in objects:\n if limit > 0 and counter_real >= (limit * 1000):\n limitReached = 1\n break\n if len( object['text'].split()) >= minWords:\n text_list.append(object['text'])\n label_list.append(object['label'])\n counter_real += 1\n counter_max += 1\n\n if(counter_real > 0 and counter_max > 0):\n ratio = counter_real / counter_max * 100\n else:\n ratio = 0\n # Print the final result\n print(\"Lists created with %i/%i (%.2f%%) data objects.\" % ( counter_real, counter_max, ratio))\n print(\"Rest ignored due to minimum words limit of %i or the limit of %i data objects maximum.\" % ( minWords, limit * 1000))\n # Return the final Pandas DataFrame\n return text_list, label_list, counter_real", "_____no_output_____" ] ], [ [ "# **Gather the path 
to files**", "_____no_output_____" ] ], [ [ "# ***********************************\n# *** GET THE PATHS FOR THE FILES ***\n# ***********************************\n\n# Path to the content of the Google Drive \ndriverPath = \"/content/drive/My Drive\"\n\n# Sub-directories in the driver\npaths = [\"processed/depression/submission\",\n \"processed/depression/comment\", \n \"processed/AskReddit/submission\", \n \"processed/AskReddit/comment\"]\n\nfiles = [None] * len(paths)\nfor i in range(len(paths)):\n files[i] = getFiles( driverPath, paths[i], \"text\", \"json\")\n print(\"Gathered %i files from %s.\" % ( len(files[i]), paths[i]))", "Gathered 750 files from processed/depression/submission.\nGathered 2892 files from processed/depression/comment.\nGathered 1311 files from processed/AskReddit/submission.\nGathered 5510 files from processed/AskReddit/comment.\n" ] ], [ [ "# **Gather the data from files**", "_____no_output_____" ] ], [ [ "# ************************************\n# *** GATHER THE DATA AND SPLIT IT ***\n# ************************************\n# Local variables\nrand_state_splitter = 1000\ntest_size = 0.2\n\nmin_files = [ 750, 0, 1300, 0] \nmax_words = [ 50, 0, 50, 0]\nlimit_packets = [300, 0, 300, 0]\nmessage = [\"Depression submissions\", \"Depression comments\", \"AskReddit submissions\", \"AskReddit comments\"]\ntext, label = [], []\n\n# Get the pandas data frames for each category\nprint(\"Build the Pandas DataFrames for each category.\")\nfor i in range(4):\n dummy_text, dummy_label, counter = getDataFrame( files[i], min_files[i], max_words[i], limit_packets[i])\n if counter > 0:\n text += dummy_text\n label += dummy_label\n dummy_text, dummy_label = None, None\n print(\"Added %i samples to data list: %s.\\n\" % ( counter ,message[i]) )\n\n# Splitting the data\nx_train, x_test, y_train, y_test = train_test_split(text, \n label, \n test_size = test_size, \n shuffle = True, \n random_state = rand_state_splitter)\nprint(\"Training data: %i samples.\" % ( 
len(y_train)) )\nprint(\"Testing data: %i samples.\" % ( len(y_test)) )\n\n# Clear data no longer needed\ndel rand_state_splitter, min_files, max_words, message, dummy_label, dummy_text", "Build the Pandas DataFrames for each category.\nWord min set to: 50.\nLists created with 300000/349305 (85.88%) data objects.\nRest ignored due to minimum words limit of 50 or the limit of 300000 data objects maximum.\nAdded 300000 samples to data list: Depression submissions.\n\nWord min set to: 0.\nLists created with 0/0 (0.00%) data objects.\nRest ignored due to minimum words limit of 0 or the limit of 0 data objects maximum.\nWord min set to: 50.\nLists created with 300000/554781 (54.08%) data objects.\nRest ignored due to minimum words limit of 50 or the limit of 300000 data objects maximum.\nAdded 300000 samples to data list: AskReddit submissions.\n\nWord min set to: 0.\nLists created with 0/0 (0.00%) data objects.\nRest ignored due to minimum words limit of 0 or the limit of 0 data objects maximum.\nTraining data: 480000 samples.\nTesting data: 120000 samples.\n" ] ], [ [ "# **Process the data at a character-level**", "_____no_output_____" ] ], [ [ "# *******************************\n# *** CONVERT STRING TO INDEX ***\n# *******************************\nprint(\"Convert the strings to indexes.\")\ntk = Tokenizer(num_words = None, char_level = True, oov_token='UNK')\ntk.fit_on_texts(x_train)\nprint(\"Original:\", x_train[0])\n# *********************************\n# *** CONSTRUCT A NEW VOCABULARY***\n# *********************************\nprint(\"Construct a new vocabulary\")\nalphabet = \"abcdefghijklmnopqrstuvwxyz\"\nchar_dict = {}\nfor i, char in enumerate(alphabet):\n char_dict[char] = i + 1\nprint(\"dictionary\")\ntk.word_index = char_dict.copy() # Use char_dict to replace the tk.word_index\nprint(tk.word_index)\ntk.word_index[tk.oov_token] = max(char_dict.values()) + 1 # Add 'UNK' to the vocabulary\nprint(tk.word_index)\n# *************************\n# *** TEXT TO SEQUENCES 
***\n# *************************\nprint(\"Text to sequence.\")\nx_train = tk.texts_to_sequences(x_train)\nx_test = tk.texts_to_sequences(x_test)\nprint(\"After sequences:\", x_train[0])\n# ***************\n# *** PADDING ***\n# ***************\nprint(\"Padding the sequences.\")\nx_train = pad_sequences( x_train, maxlen = input_size, padding = 'post')\nx_test = pad_sequences( x_test, maxlen= input_size , padding = 'post')\n\n# ************************\n# *** CONVERT TO NUMPY ***\n# ************************\nprint(\"Convert to Numpy arrays\")\nx_train = np.array( x_train, dtype = 'float32')\nx_test = np.array(x_test, dtype = 'float32')\n\n# **************************************\n# *** GET CLASSES FOR CLASSIFICATION ***\n# **************************************\ny_test_copy = y_test\ny_train_list = [x-1 for x in y_train]\ny_test_list = [x-1 for x in y_test]\n\ny_train = to_categorical( y_train_list, num_of_classes)\ny_test = to_categorical( y_test_list, num_of_classes)", "Convert the strings to indexes.\nOriginal: i did not think i had have to post in this subreddit i just feel empty and completely alone i am hanging out with friends but nothing makes me feel happy as i used to be i know people generally have it worse i just want someone to talk to and just be silly with \nConstruct a new vocabulary\ndictionary\n{'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 6, 'g': 7, 'h': 8, 'i': 9, 'j': 10, 'k': 11, 'l': 12, 'm': 13, 'n': 14, 'o': 15, 'p': 16, 'q': 17, 'r': 18, 's': 19, 't': 20, 'u': 21, 'v': 22, 'w': 23, 'x': 24, 'y': 25, 'z': 26}\n{'a': 1, 'b': 2, 'c': 3, 'd': 4, 'e': 5, 'f': 6, 'g': 7, 'h': 8, 'i': 9, 'j': 10, 'k': 11, 'l': 12, 'm': 13, 'n': 14, 'o': 15, 'p': 16, 'q': 17, 'r': 18, 's': 19, 't': 20, 'u': 21, 'v': 22, 'w': 23, 'x': 24, 'y': 25, 'z': 26, 'UNK': 27}\nText to sequence.\nAfter sequences: [9, 27, 4, 9, 4, 27, 14, 15, 20, 27, 20, 8, 9, 14, 11, 27, 9, 27, 8, 1, 4, 27, 8, 1, 22, 5, 27, 20, 15, 27, 16, 15, 19, 20, 27, 9, 14, 27, 20, 8, 9, 19, 27, 19, 21, 2, 
18, 5, 4, 4, 9, 20, 27, 9, 27, 10, 21, 19, 20, 27, 6, 5, 5, 12, 27, 5, 13, 16, 20, 25, 27, 1, 14, 4, 27, 3, 15, 13, 16, 12, 5, 20, 5, 12, 25, 27, 1, 12, 15, 14, 5, 27, 9, 27, 1, 13, 27, 8, 1, 14, 7, 9, 14, 7, 27, 15, 21, 20, 27, 23, 9, 20, 8, 27, 6, 18, 9, 5, 14, 4, 19, 27, 2, 21, 20, 27, 14, 15, 20, 8, 9, 14, 7, 27, 13, 1, 11, 5, 19, 27, 13, 5, 27, 6, 5, 5, 12, 27, 8, 1, 16, 16, 25, 27, 1, 19, 27, 9, 27, 21, 19, 5, 4, 27, 20, 15, 27, 2, 5, 27, 9, 27, 11, 14, 15, 23, 27, 16, 5, 15, 16, 12, 5, 27, 7, 5, 14, 5, 18, 1, 12, 12, 25, 27, 8, 1, 22, 5, 27, 9, 20, 27, 23, 15, 18, 19, 5, 27, 9, 27, 10, 21, 19, 20, 27, 23, 1, 14, 20, 27, 19, 15, 13, 5, 15, 14, 5, 27, 20, 15, 27, 20, 1, 12, 11, 27, 20, 15, 27, 1, 14, 4, 27, 10, 21, 19, 20, 27, 2, 5, 27, 19, 9, 12, 12, 25, 27, 23, 9, 20, 8, 27]\nPadding the sequences.\nConvert to Numpy arrays\n" ] ], [ [ "# **Load embedding words**", "_____no_output_____" ] ], [ [ "# ***********************\n# *** LOAD EMBEDDINGS ***\n# ***********************\nembedding_weights = []\nvocab_size = len(tk.word_index)\nembedding_weights.append(np.zeros(vocab_size))\n\nfor char, i in tk.word_index.items():\n onehot = np.zeros(vocab_size)\n onehot[i-1] = 1\n embedding_weights.append(onehot)\nembedding_weights = np.array(embedding_weights)\n\nprint(\"Vocabulary size: \",vocab_size)\nprint(\"Embedding weights: \", embedding_weights)", "Vocabulary size: 27\nEmbedding weights: [[0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 
0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1.\n 0. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 1. 0. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 1. 0.]\n [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 
1.]]\n" ] ], [ [ "# **Build the CNN model**", "_____no_output_____" ] ], [ [ "def KerasModel():\n # ***************************************\n # *****| BUILD THE NEURAL NETWORK |******\n # ***************************************\n embedding_layer = Embedding(vocab_size+1,\n embedding_size,\n input_length = input_size,\n weights = [embedding_weights])\n\n # Input layer\n inputs = Input(shape=(input_size,), name='input', dtype='int64')\n\n # Embedding layer\n x = embedding_layer(inputs)\n\n # Convolution\n for filter_num, filter_size, pooling_size in conv_layers:\n x = Conv1D(filter_num, filter_size)(x)\n x = Activation('relu')(x)\n if pooling_size != -1:\n x = MaxPooling1D( pool_size = pooling_size)(x)\n x = Flatten()(x)\n\n # Fully Connected layers\n for dense_size in fully_connected_layers:\n x = Dense( dense_size, activation='relu')(x)\n x = Dropout( dropout_p)(x)\n\n # Output Layer\n predictions = Dense(num_of_classes, activation = 'softmax')(x)\n\n # BUILD MODEL\n model = Model( inputs = inputs, outputs = predictions)\n model.compile(optimizer = optimizer, loss = loss, metrics = ['accuracy'])\n model.summary()\n\n return model", "_____no_output_____" ] ], [ [ "# **Train the CNN**", "_____no_output_____" ] ], [ [ "#with tf.device(\"/gpu:0\"):\n# history = model.fit(x_train, y_train,\n# validation_data = ( x_test, y_test),\n# epochs = 10,\n# batch_size = batch,\n# verbose = True)\n \nwith tf.device(\"/gpu:0\"):\n grid = KerasClassifier(build_fn = KerasModel, epochs = 15, verbose= True)\n param_grid = dict(\n epochs = [15]\n )\n #grid = GridSearchCV(estimator = model, \n # param_grid = param_grid,\n # cv = 5, \n # verbose = 10, \n # return_train_score = True)\n \n grid_result = grid.fit(x_train, y_train)", "Model: \"model_1\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\ninput (InputLayer) (None, 1000) 0 
\n_________________________________________________________________\nembedding_1 (Embedding) (None, 1000, 27) 756 \n_________________________________________________________________\nconv1d_1 (Conv1D) (None, 994, 256) 48640 \n_________________________________________________________________\nactivation_1 (Activation) (None, 994, 256) 0 \n_________________________________________________________________\nmax_pooling1d_1 (MaxPooling1 (None, 331, 256) 0 \n_________________________________________________________________\nconv1d_2 (Conv1D) (None, 325, 256) 459008 \n_________________________________________________________________\nactivation_2 (Activation) (None, 325, 256) 0 \n_________________________________________________________________\nmax_pooling1d_2 (MaxPooling1 (None, 108, 256) 0 \n_________________________________________________________________\nconv1d_3 (Conv1D) (None, 106, 256) 196864 \n_________________________________________________________________\nactivation_3 (Activation) (None, 106, 256) 0 \n_________________________________________________________________\nconv1d_4 (Conv1D) (None, 104, 256) 196864 \n_________________________________________________________________\nactivation_4 (Activation) (None, 104, 256) 0 \n_________________________________________________________________\nconv1d_5 (Conv1D) (None, 102, 256) 196864 \n_________________________________________________________________\nactivation_5 (Activation) (None, 102, 256) 0 \n_________________________________________________________________\nconv1d_6 (Conv1D) (None, 100, 256) 196864 \n_________________________________________________________________\nactivation_6 (Activation) (None, 100, 256) 0 \n_________________________________________________________________\nmax_pooling1d_3 (MaxPooling1 (None, 33, 256) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 8448) 0 \n_________________________________________________________________\ndense_1 
(Dense) (None, 1024) 8651776 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 1024) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 1024) 1049600 \n_________________________________________________________________\ndropout_2 (Dropout) (None, 1024) 0 \n_________________________________________________________________\ndense_3 (Dense) (None, 2) 2050 \n=================================================================\nTotal params: 10,999,286\nTrainable params: 10,999,286\nNon-trainable params: 0\n_________________________________________________________________\n" ] ], [ [ "# **Test the CNN**", "_____no_output_____" ] ], [ [ "#loss, accuracy = model.evaluate( x_train, y_train, verbose = True)\n#print(\"Training Accuracy: {:.4f}\".format( accuracy))\n#loss, accuracy = model.evaluate( x_test, y_test, verbose = True)\n#print(\"Testing Accuracy: {:.4f}\".format( accuracy))\n\nfrom sklearn.metrics import classification_report, confusion_matrix\ny_predict = grid.predict( x_test)\n# Build the confusion matrix \ny_tested = y_test\nprint( type(y_test))\nprint(y_tested)\ny_tested = np.argmax( y_tested, axis = 1)\nprint(y_tested)\nconfMatrix = confusion_matrix(y_tested, y_predict) \ntn, fp, fn, tp = confMatrix.ravel() \n# Build a classification report \nclassification_reports = classification_report( y_tested, y_predict, target_names = ['Non-depressed', 'Depressed'], digits=3)\nprint(confMatrix)\nprint(classification_reports)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d063a4b442a49c71d22336e6b555d4a0dd1f82bf
404,490
ipynb
Jupyter Notebook
src/plotting/OpenChromatin_plotsold.ipynb
Switham1/PromoterArchitecture
0a9021b869ac66cdd622be18cd029950314d111e
[ "MIT" ]
null
null
null
src/plotting/OpenChromatin_plotsold.ipynb
Switham1/PromoterArchitecture
0a9021b869ac66cdd622be18cd029950314d111e
[ "MIT" ]
null
null
null
src/plotting/OpenChromatin_plotsold.ipynb
Switham1/PromoterArchitecture
0a9021b869ac66cdd622be18cd029950314d111e
[ "MIT" ]
null
null
null
154.444444
43,436
0.855368
[ [ [ "import pandas as pd\nimport numpy as np\nimport seaborn as sns\nimport matplotlib.pyplot as plt\nfrom scipy import stats\nfrom statsmodels.formula.api import ols\nimport researchpy as rp\nfrom pingouin import kruskal\nfrom pybedtools import BedTool", "_____no_output_____" ], [ "RootChomatin_bp_covered = '../../data/promoter_analysis/responsivepromotersRootOpenChrom.bp_covered.txt'\nShootChomatin_bp_covered = '../../data/promoter_analysis/responsivepromotersShootOpenChrom.bp_covered.txt'\nRootShootIntersect_bp_covered = '../../data/promoter_analysis/responsivepromotersShootRootIntersectOpenChrom.bp_covered.txt'", "_____no_output_____" ], [ "def add_chr_linestart(input_location,output_location):\n \"\"\"this function adds chr to the beginning of the line if it starts with a digit and saves a file\"\"\"\n output = open(output_location, 'w') #make output file with write capability\n #open input file\n with open(input_location, 'r') as infile: \n #iterate over lines in file\n for line in infile:\n line = line.strip() # removes hidden characters/spaces\n if line[0].isdigit():\n \n line = 'chr' + line #prepend chr to the beginning of line if starts with a digit\n output.write(line + '\\n') #output to new file\n output.close()", "_____no_output_____" ], [ "def percent_coverage(bp_covered):\n \"\"\"function to calculate the % coverage from the output file of bedtools coverage\"\"\"\n\n coverage_df = pd.read_table(bp_covered, sep='\\t', header=None)\n col = ['chr','start','stop','gene','dot','strand','source', 'type', 'dot2', 'details', 'no._of_overlaps', 'no._of_bases_covered','promoter_length','fraction_bases_covered']\n coverage_df.columns = col\n #add % bases covered column\n coverage_df['percentage_bases_covered'] = coverage_df.fraction_bases_covered * 100\n\n #remove unnecessary columns\n coverage_df_reduced_columns = coverage_df[['chr','start','stop','gene','strand', 'no._of_overlaps', 
'no._of_bases_covered','promoter_length','fraction_bases_covered','percentage_bases_covered']]\n return coverage_df_reduced_columns", "_____no_output_____" ], [ "root_coverage = percent_coverage(RootChomatin_bp_covered)", "_____no_output_____" ], [ "shoot_coverage = percent_coverage(ShootChomatin_bp_covered)", "_____no_output_____" ], [ "rootshootintersect_coverage = percent_coverage(RootShootIntersect_bp_covered)", "_____no_output_____" ], [ "sns.set(color_codes=True)\nsns.set_style(\"whitegrid\")", "_____no_output_____" ], [ "#distribution plot", "_____no_output_____" ], [ "dist_plot = root_coverage['percentage_bases_covered']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()\n\n#save to file\n#dist_plot_fig.savefig('../../data/plots/TFBS_coverage/all_genes_bp_covered_dist.pdf', format='pdf')\n", "_____no_output_____" ], [ "dist_plot = shoot_coverage['percentage_bases_covered']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()\n\n#save to file\n#dist_plot_fig.savefig('../../data/plots/TFBS_coverage/all_genes_bp_covered_dist.pdf', format='pdf')\n", "_____no_output_____" ], [ "dist_plot = rootshootintersect_coverage['percentage_bases_covered']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()\n\n#save to file\n#dist_plot_fig.savefig('../../data/plots/TFBS_coverage/all_genes_bp_covered_dist.pdf', format='pdf')\n", "_____no_output_____" ] ], [ [ "## constitutive vs variable", "_____no_output_____" ] ], [ [ "def add_genetype(coverage):\n \"\"\"function to add gene type to the df, and remove random genes\"\"\"\n select_genes_file = '../../data/genomes/ara_housekeeping_list.out'\n select_genes = pd.read_table(select_genes_file, sep='\\t', header=None)\n cols = ['gene','gene_type']\n select_genes.columns = cols\n merged = pd.merge(coverage, select_genes, on='gene')\n \n merged_renamed = merged.copy()\n 
merged_renamed.gene_type.replace('housekeeping','constitutive', inplace=True)\n merged_renamed.gene_type.replace('highVar','variable', inplace=True)\n merged_renamed.gene_type.replace('randCont','random', inplace=True)\n \n # no_random = merged_renamed[merged_renamed.gene_type != 'random']\n # no_random.reset_index(drop=True, inplace=True)\n \n return merged_renamed", "_____no_output_____" ], [ "roots_merged = add_genetype(root_coverage)\nno_random_roots = roots_merged[roots_merged.gene_type != 'random']", "_____no_output_____" ], [ "shoots_merged = add_genetype(shoot_coverage)\nno_random_shoots = shoots_merged[shoots_merged.gene_type != 'random']", "_____no_output_____" ], [ "rootsshootsintersect_merged = add_genetype(rootshootintersect_coverage)\nno_random_rootsshoots = rootsshootsintersect_merged[rootsshootsintersect_merged.gene_type != 'random']", "_____no_output_____" ], [ "#how many have open chromatin??\nprint('root openchromatin present:')\nprint(len(no_random_roots)-len(no_random_roots[no_random_roots.percentage_bases_covered == 0]))\nprint('shoot openchromatin present:')\nprint(len(no_random_shoots)-len(no_random_shoots[no_random_shoots.percentage_bases_covered == 0]))\nprint('root-shoot intersect openchromatin present:')\nprint(len(no_random_rootsshoots)-len(no_random_rootsshoots[no_random_rootsshoots.percentage_bases_covered == 0]))", "root openchromatin present:\n164\nshoot openchromatin present:\n153\nroot-shoot intersect openchromatin present:\n149\n" ], [ "#how many have open chromatin??\nprint('root openchromatin present variable promoters:')\nprint(len(no_random_roots[no_random_roots.gene_type=='variable'])-len(no_random_roots[no_random_roots.gene_type=='variable'][no_random_roots[no_random_roots.gene_type=='variable'].percentage_bases_covered == 0]))\nprint('root openchromatin present constitutive 
promoters:')\nprint(len(no_random_roots[no_random_roots.gene_type=='constitutive'])-len(no_random_roots[no_random_roots.gene_type=='constitutive'][no_random_roots[no_random_roots.gene_type=='constitutive'].percentage_bases_covered == 0]))\n\n\nprint('shoot openchromatin present variable promoters:')\nprint(len(no_random_shoots[no_random_shoots.gene_type=='variable'])-len(no_random_shoots[no_random_shoots.gene_type=='variable'][no_random_shoots[no_random_shoots.gene_type=='variable'].percentage_bases_covered == 0]))\nprint('shoot openchromatin present constitutive promoters:')\nprint(len(no_random_shoots[no_random_shoots.gene_type=='constitutive'])-len(no_random_shoots[no_random_shoots.gene_type=='constitutive'][no_random_shoots[no_random_shoots.gene_type=='constitutive'].percentage_bases_covered == 0]))\n\nprint('root-shoot intersect openchromatin present variable promoters:')\nprint(len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='variable'])-len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='variable'][no_random_rootsshoots[no_random_rootsshoots.gene_type=='variable'].percentage_bases_covered == 0]))\nprint('root-shoot intersect openchromatin present constitutive promoters:')\nprint(len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='constitutive'])-len(no_random_rootsshoots[no_random_rootsshoots.gene_type=='constitutive'][no_random_rootsshoots[no_random_rootsshoots.gene_type=='constitutive'].percentage_bases_covered == 0]))", "root openchromatin present variable promoters:\n75\nroot openchromatin present constitutive promoters:\n89\nshoot openchromatin present variable promoters:\n66\nshoot openchromatin present constitutive promoters:\n87\nroot-shoot intersect openchromatin present variable promoters:\n63\nroot-shoot intersect openchromatin present constitutive promoters:\n86\n" ], [ "sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=roots_merged) #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered.pdf', 
format='pdf')", "_____no_output_____" ], [ "sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=shoots_merged) #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered.pdf', format='pdf')", "_____no_output_____" ], [ "#roots\nplot = sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", kind='box', data=no_random_roots)\n#plot points\nax = sns.swarmplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=no_random_roots, color=\".25\")\nplt.ylabel('Percentage bases covered')\nplt.xlabel('Gene type');\n#ax.get_figure() #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered_boxplot.pdf', format='pdf')", "_____no_output_____" ], [ "#shoots\nplot = sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", kind='box', data=no_random_shoots)\n#plot points\nax = sns.swarmplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=no_random_shoots, color=\".25\")\nplt.ylabel('Percentage bases covered')\nplt.xlabel('Gene type');\n#ax.get_figure() #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered_boxplot.pdf', format='pdf')", "_____no_output_____" ], [ "#roots-shoots intersect\nplot = sns.catplot(x=\"gene_type\", y=\"percentage_bases_covered\", kind='box', data=no_random_rootsshoots)\n#plot points\nax = sns.swarmplot(x=\"gene_type\", y=\"percentage_bases_covered\", data=no_random_rootsshoots, color=\".25\")\nplt.ylabel('Percentage bases covered')\nplt.xlabel('Gene type');\n#ax.get_figure() #.savefig('../../data/plots/TFBS_coverage/responsive_bp_covered_boxplot.pdf', format='pdf')", "_____no_output_____" ], [ "#Get names of each promoter\ndef normality(input_proms):\n \"\"\"function to test normality of data - returns test statistic, p-value\"\"\"\n #Get names of each promoter\n pd.Categorical(input_proms.gene_type)\n names = input_proms.gene_type.unique()\n# for name in names:\n# print(name)\n \n for name in names:\n print('{}: {}'.format(name, stats.shapiro(input_proms.percentage_bases_covered[input_proms.gene_type == 
name])))\n ", "_____no_output_____" ], [ "def variance(input_proms):\n \"\"\"function to test variance of data\"\"\"\n#test variance\n constitutive = input_proms[input_proms.gene_type == 'constitutive']\n #reset indexes so residuals can be calculated later\n constitutive.reset_index(inplace=True)\n\n responsive = input_proms[input_proms.gene_type == 'variable']\n responsive.reset_index(inplace=True)\n\n control = input_proms[input_proms.gene_type == 'random']\n control.reset_index(inplace=True)\n\n print(stats.levene(constitutive.percentage_bases_covered, responsive.percentage_bases_covered))", "_____no_output_____" ], [ "normality(no_random_roots)", "variable: (0.8330899477005005, 3.833479311765586e-09)\nconstitutive: (0.7916173934936523, 1.8358696507458916e-10)\n" ], [ "normality(no_random_shoots)", "variable: (0.8625870943069458, 4.528254393676434e-08)\nconstitutive: (0.8724747896194458, 1.1140339495341323e-07)\n" ], [ "normality(no_random_rootsshoots)", "variable: (0.8546600937843323, 2.263117515610702e-08)\nconstitutive: (0.8711197376251221, 9.823354929494599e-08)\n" ] ], [ [ "## Not normal", "_____no_output_____" ] ], [ [ "variance(no_random_roots)", "LeveneResult(statistic=3.3550855113629137, pvalue=0.0685312309497174)\n" ], [ "variance(no_random_shoots)", "LeveneResult(statistic=0.20460439034148425, pvalue=0.6515350841099911)\n" ], [ "variance(no_random_rootsshoots)", "LeveneResult(statistic=0.00041366731166758155, pvalue=0.9837939970964911)\n" ] ], [ [ "## unequal variance for shoots", "_____no_output_____" ] ], [ [ "def kruskal_test(input_data):\n \"\"\"function to do kruskal-wallis test on data\"\"\" \n \n #print('\\033[1m' +promoter + '\\033[0m')\n print(kruskal(data=input_data, dv='percentage_bases_covered', between='gene_type'))\n #print('')", "_____no_output_____" ], [ "no_random_roots", "_____no_output_____" ], [ "kruskal_test(no_random_roots)", " Source ddof1 H p-unc\nKruskal gene_type 1 7.281793 0.006966\n" ], [ "kruskal_test(no_random_shoots)", " 
Source ddof1 H p-unc\nKruskal gene_type 1 20.935596 0.000005\n" ], [ "kruskal_test(no_random_rootsshoots)", " Source ddof1 H p-unc\nKruskal gene_type 1 22.450983 0.000002\n" ] ], [ [ "## try gat enrichment", "_____no_output_____" ] ], [ [ "#add Chr to linestart of chromatin bed files\n\nadd_chr_linestart('../../data/ATAC-seq/potter2018/Shoots_NaOH_peaks_all.bed','../../data/ATAC-seq/potter2018/Shoots_NaOH_peaks_all_renamed.bed')\nadd_chr_linestart('../../data/ATAC-seq/potter2018/Roots_NaOH_peaks_all.bed','../../data/ATAC-seq/potter2018/Roots_NaOH_peaks_all_renamed.bed')\nadd_chr_linestart('../../data/ATAC-seq/potter2018/intersectRootsShoots_PeaksInBoth.bed','../../data/ATAC-seq/potter2018/intersectRootsShoots_PeaksInBoth_renamed.bed')", "_____no_output_____" ], [ "#create a bed file containing all 100 constitutive/responsive promoters with the fourth column annotating whether it's constitutive or responsive\nproms_file = '../../data/genes/constitutive-variable-random_100_each.csv'\npromoters = pd.read_csv(proms_file)\npromoters\ncols2 = ['delete','promoter_AGI', 'gene_type']\npromoters_df = promoters[['promoter_AGI','gene_type']]\npromoters_no_random = promoters_df.copy()\n#drop randCont rows\npromoters_no_random = promoters_df[~(promoters_df.gene_type == 'randCont')]\npromoters_no_random", "_____no_output_____" ], [ "#merge promoters with genetype selected\npromoterbedfile = '../../data/FIMO/responsivepromoters.bed'\npromoters_bed = pd.read_table(promoterbedfile, sep='\\t', header=None)\ncols = ['chr', 'start', 'stop', 'promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\npromoters_bed.columns = cols\nmerged = pd.merge(promoters_bed,promoters_no_random, on='promoter_AGI')", "_____no_output_____" ], [ "#add gene_type to column3\nmerged = merged[['chr','start','stop','gene_type','promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']]", "_____no_output_____" ], [ "#write to bed file\npromoter_file = 
'../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace.bed'\nwith open(promoter_file,'w') as f:\n merged.to_csv(f,index=False,sep='\\t',header=None)", "_____no_output_____" ], [ "# new_merged = merged.astype({'start': 'int'})\n# new_merged = merged.astype({'stop': 'int'})\n# new_merged = merged.astype({'chr': 'int'})", "_____no_output_____" ], [ "#add Chr to linestart of promoter bed file\n\nadd_chr_linestart('../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace.bed','../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace_renamed.bed')", "_____no_output_____" ], [ "#create separate variable and constitutive and gat workspace\npromoter_file_renamed = '../../data/promoter_analysis/old1000bpproms_variable_constitutive_workspace_renamed.bed'\npromoters = pd.read_table(promoter_file_renamed, sep='\\t', header=None)\n#make a new gat workspace file with all promoters (first 3 columns)\nbed = BedTool.from_dataframe(promoters[[0,1,2]]).saveas('../../data/promoter_analysis/chromatin/variable_constitutive_promoters_1000bp_workspace.bed')\n#select only variable promoters\nvariable_promoters = promoters[promoters[3] == 'highVar']\nsorted_variable = variable_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_variable).saveas('../../data/promoter_analysis/chromatin/variable_promoters_1000bp.bed')\n#make a constitutive only file\nconstitutive_promoters = promoters[promoters[3] == 'housekeeping']\nsorted_constitutive = constitutive_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_constitutive).saveas('../../data/promoter_analysis/chromatin/constitutive_promoters_1000bp.bed')", "_____no_output_____" ] ], [ [ "## now I will do the plots with non-overlapping promoters including the 5'UTR", "_____no_output_____" ] ], [ [ "#merge promoters with genetype selected\npromoter_UTR = '../../data/FIMO/non-overlapping_includingbidirectional_all_genes/promoters_5UTR_renamedChr.bed'\npromoters_bed = 
pd.read_table(promoter_UTR, sep='\\t', header=None)\ncols = ['chr', 'start', 'stop', 'promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\npromoters_bed.columns = cols\nmerged = pd.merge(promoters_bed,promoters_no_random, on='promoter_AGI')", "_____no_output_____" ], [ "#how many constitutive genes left after removed/shortened overlapping\nlen(merged[merged.gene_type == 'housekeeping'])", "_____no_output_____" ], [ "#how many variable genes left after removed/shortened overlapping\nlen(merged[merged.gene_type == 'highVar'])", "_____no_output_____" ], [ "merged['length'] = (merged.start - merged.stop).abs()\nmerged.sort_values('length',ascending=True)", "_____no_output_____" ], [ "#plot of lengths\ndist_plot = merged['length']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()", "_____no_output_____" ], [ "#remove 2 genes from constitutive group so equal sample size to variable\n#random sample of 98, using seed 1\nmerged[merged.gene_type == 'housekeeping'] = merged[merged.gene_type == 'housekeeping'].sample(98, random_state=1)", "_____no_output_____" ], [ "#drop rows with at least 2 NaNs\nmerged = merged.dropna(thresh=2)", "_____no_output_____" ], [ "merged", "_____no_output_____" ], [ "#write to bed file so can run OpenChromatin_coverage.py\nnew_promoter_file = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive.bed'\ncols = ['chr', 'start', 'stop', 'promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\n#remove trailing decimal .0 from start and stop\nmerged = merged.astype({'start': 'int'})\nmerged = merged.astype({'stop': 'int'})\nmerged = merged.astype({'chr': 'int'})\n\nmerged_coverage = merged[cols]\n\nwith open(new_promoter_file,'w') as f:\n merged_coverage.to_csv(f,index=False,sep='\\t',header=None)", "_____no_output_____" ], [ "#write to bed file so can run gat\nnew_promoter_file_gat = 
'../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat.bed'\ncols_gat = ['chr', 'start', 'stop', 'gene_type','promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\nmerged_gat = merged[cols_gat]\nwith open(new_promoter_file_gat,'w') as f:\n merged_gat.to_csv(f,index=False,sep='\\t',header=None)\n", "_____no_output_____" ], [ "#Read in new files\nRootChomatin_bp_covered = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutiveRootOpenChrom.bp_covered.txt'\nShootChomatin_bp_covered = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutiveShootOpenChrom.bp_covered.txt'\nRootShootIntersect_bp_covered = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutiveShootRootIntersectOpenChrom.bp_covered.txt'", "_____no_output_____" ], [ "root_coverage = percent_coverage(RootChomatin_bp_covered)\nshoot_coverage = percent_coverage(ShootChomatin_bp_covered)\nrootshootintersect_coverage = percent_coverage(RootShootIntersect_bp_covered)", "_____no_output_____" ], [ "#add Chr to linestart of promoter bed file\n\nadd_chr_linestart('../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat.bed','../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat_renamed.bed')", "_____no_output_____" ], [ "#create separate variable and constitutive and gat workspace\npromoter_file_renamed = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_gat_renamed.bed'\npromoters = pd.read_table(promoter_file_renamed, sep='\\t', header=None)\n#make a new gat workspace file with all promoters (first 3 columns)\nbed = BedTool.from_dataframe(promoters[[0,1,2]]).saveas('../../data/promoter_analysis/chromatin/non-overlapping_includingbidirectional_variable_constitutive_workspace.bed')\n#select only variable 
promoters\nvariable_promoters = promoters[promoters[3] == 'highVar']\nsorted_variable = variable_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_variable).saveas('../../data/promoter_analysis/chromatin/non-overlapping_includingbidirectional_variable_promoters.bed')\n#make a constitutive only file\nconstitutive_promoters = promoters[promoters[3] == 'housekeeping']\nsorted_constitutive = constitutive_promoters.sort_values([0,1])\nbed = BedTool.from_dataframe(sorted_constitutive).saveas('../../data/promoter_analysis/chromatin/non-overlapping_includingbidirectional_constitutive_promoters.bed')", "_____no_output_____" ], [ "#show distribution of the distance from the closest end of the open chromatin peak to the ATG (if overlapping already then distance is 0)\nroot_peaks_bed = '../../data/ATAC-seq/potter2018/Roots_NaOH_peaks_all_renamed.bed'\nshoot_peaks_bed = '../../data/ATAC-seq/potter2018/Shoots_NaOH_peaks_all_renamed.bed'\nrootshootintersect_peaks_bed = '../../data/ATAC-seq/potter2018/intersectRootsShoots_PeaksInBoth_renamed.bed'\npromoters_bed = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_renamed.bed'\npromoter_openchrom_intersect = '../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_chromintersect.bed'", "_____no_output_____" ], [ "add_chr_linestart('../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive.bed','../../data/promoter_analysis/non-overlapping_includingbidirectional_variable_constitutive_renamed.bed')", "_____no_output_____" ], [ "def distr_distance_ATG(peaks_bed, promoter_bed, output_file):\n    \"\"\"function to show the distribution of the distance from the closest end\n    of the open chromatin peak to the ATG (if overlapping already then distance is 0)\"\"\"\n#     peaks = pd.read_table(peaks_bed, sep='\\t', header=None)\n#     cols = ['chr','start', 'stop']\n#     peaks.columns = cols\n#     promoters = pd.read_table(promoter_bed, 
sep='\\t', header=None)\n# cols_proms = ['chr', 'start', 'stop', 'gene_type','promoter_AGI', 'score', 'strand', 'source', 'feature_name', 'dot2', 'attributes']\n# promoters.columns = cols_proms\n proms = BedTool(promoter_bed) #read in files using BedTools\n peaks = BedTool(peaks_bed)\n #report chromosome position of overlapping feature, along with the promoter which overlaps it (only reports the overlapping nucleotides, not the whole promoter length. Can use u=True to get whole promoter length)\n #f, the minimum overlap as fraction of A. F, nucleotide fraction of B (genes) that need to be overlapping with A (promoters)\n #wa, Write the original entry in A for each overlap.\n #wo, Write the original A and B entries plus the number of base pairs of overlap between the two features. Only A features with overlap are reported. \n #u, write original A entry only once even if more than one overlap\n intersect = proms.intersect(peaks, wo=True) #could add u=True which indicates we want to see the promoters that overlap features in the genome\n #Write to output_file\n with open(output_file, 'w') as output:\n #Each line in the file contains bed entry a and bed entry b that it overlaps plus the number of bp in the overlap so 19 columns\n output.write(str(intersect))\n #read in intersect bed file\n overlapping_proms = pd.read_table(output_file, sep='\\t', header=None)\n cols = ['chrA', 'startA', 'stopA', 'promoter_AGI','dot1','strand','source','type','dot2','attributes','chrB', 'startB','stopB','bp_overlap']\n overlapping_proms.columns = cols\n #add empty openchrom_distance_from_ATG column\n overlapping_proms['openchrom_distance_from_ATG'] = int()\n for i, v in overlapping_proms.iterrows():\n #if positive strand feature A\n if overlapping_proms.loc[i,'strand'] == '+':\n #if end of open chromatin is downstream or equal to ATG, distance is 0\n if overlapping_proms.loc[i,'stopA'] <= overlapping_proms.loc[i, 'stopB']:\n overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = 0\n 
#else if upstream and chromatin stop is after promoter start, add distance from chromatin stop to ATG\n            elif overlapping_proms.loc[i,'startA'] <= overlapping_proms.loc[i, 'stopB']:\n                overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = overlapping_proms.loc[i,'stopA'] - overlapping_proms.loc[i, 'stopB']           \n        \n        elif overlapping_proms.loc[i,'strand'] == '-': \n            #if end of open chromatin is downstream or equal to ATG, distance is 0\n            if overlapping_proms.loc[i,'startA'] >= overlapping_proms.loc[i, 'startB']:\n                overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = 0\n            #else if upstream and chromatin start is before promoter stop, add distance from chromatin start to ATG            \n            elif overlapping_proms.loc[i,'stopA'] >= overlapping_proms.loc[i, 'startB']:\n                overlapping_proms.loc[i,'openchrom_distance_from_ATG'] = overlapping_proms.loc[i, 'startB'] - overlapping_proms.loc[i,'startA']\n            \n\n        \n    return overlapping_proms", "_____no_output_____" ], [ "#show length of open chromatin peaks\nrootshootintersect = pd.read_table(rootshootintersect_peaks_bed, sep='\\t', header=None, names=['chr','start','stop'])\nrootshootintersect['length'] = (rootshootintersect.start - rootshootintersect.stop).abs()\nrootshootintersect.sort_values('length',ascending=True)\n", "_____no_output_____" ], [ "rootshootintersect = distr_distance_ATG(rootshootintersect_peaks_bed,promoters_bed,promoter_openchrom_intersect)", "_____no_output_____" ], [ "rootshootintersect\nrootshootintersect.sort_values('openchrom_distance_from_ATG',ascending=True)", "_____no_output_____" ], [ "#plot of distances of chromatin to ATG\ndist_plot = rootshootintersect['openchrom_distance_from_ATG']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()", "_____no_output_____" ], [ "#now split constitutive and variable\nmerged_distances = pd.merge(merged, rootshootintersect, on='promoter_AGI')", "_____no_output_____" ], [ "merged_distances.gene_type", "_____no_output_____" ], [ "#VARIABLE\n#plot of distances of chromatin to ATG \ndist_plot = 
merged_distances[merged_distances.gene_type=='highVar']['openchrom_distance_from_ATG']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()", "_____no_output_____" ], [ "merged_distances[merged_distances.gene_type=='housekeeping']['openchrom_distance_from_ATG']", "_____no_output_____" ], [ "#CONSTITUTIVE\n#plot of distances of chromatin to ATG \ndist_plot = merged_distances[merged_distances.gene_type=='housekeeping']['openchrom_distance_from_ATG']\n#create figure with no transparency\ndist_plot_fig = sns.distplot(dist_plot).get_figure()", "/home/witham/opt/anaconda3/envs/PromoterArchitecturePipeline/lib/python3.7/site-packages/seaborn/distributions.py:369: UserWarning: Default bandwidth for data is 0; skipping density estimation.\n  warnings.warn(msg, UserWarning)\n" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d063a7d824a918b20fdae24c6f4a772f811a45cc
218,539
ipynb
Jupyter Notebook
main.ipynb
spectraldani/DeepMahalanobisGP
bf2d788ac8b56d25f544b6cb9c0325820f4b7e64
[ "Apache-2.0" ]
null
null
null
main.ipynb
spectraldani/DeepMahalanobisGP
bf2d788ac8b56d25f544b6cb9c0325820f4b7e64
[ "Apache-2.0" ]
null
null
null
main.ipynb
spectraldani/DeepMahalanobisGP
bf2d788ac8b56d25f544b6cb9c0325820f4b7e64
[ "Apache-2.0" ]
null
null
null
402.46593
98,764
0.933339
[ [ [ "dataset = 'load' # 'load' or 'generate'\nretrain_models = False # False or True or 'save'", "_____no_output_____" ], [ "import numpy as np\nimport pandas as pd\nimport tensorflow as tf\ntf.logging.set_verbosity(tf.logging.FATAL)\n\nimport gpflow\nimport library.models.deep_vmgp as deep_vmgp\nimport library.models.vmgp as vmgp\nfrom doubly_stochastic_dgp.dgp import DGP\n\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\nimport cplot\n\nimport sklearn.model_selection\nimport pickle\nfrom pathlib import Path\nfrom types import SimpleNamespace\nfrom library.helper import TrainTestSplit, initial_inducing_points\nfrom library import metrics\n\n%matplotlib inline", "_____no_output_____" ], [ "random_seed = 19960111\ndef reset_seed():\n np.random.seed(random_seed)\n tf.random.set_random_seed(random_seed)", "_____no_output_____" ], [ "if dataset == 'generate':\n s = 0.4\n n = 500//2\n reset_seed()\n rng = np.random.default_rng(random_seed)\n m1, m2 = np.array([[-1,1],[2,1]])\n X1 = rng.multivariate_normal(m1,s*np.eye(2), size=n)\n X2 = rng.multivariate_normal(m2,s*np.eye(2), size=n)\n y1 = X1[:,0]**2 + X1[:,0]\n y2 = X2[:,1]**2 + X2[:,1]\n\n X = np.concatenate([X1,X2],axis=0)\n y = np.concatenate([y1,y2],axis=0)[:,None]\n\n X_all, y_all = X,y\n n = X_all.shape[0]\n kfold = sklearn.model_selection.KFold(2,shuffle=True,random_state=random_seed)\n folds = [\n [TrainTestSplit(X_all[train],X_all[test]), TrainTestSplit(y_all[train],y_all[test])]\n for train, test in kfold.split(X_all, y_all)\n ]\n X,y = folds[0]\nelif dataset == 'load':\n with open('./dataset.pkl','rb') as f:\n X, y = pickle.load(f)\n X_all, y_all = np.concatenate(X,axis=0), np.concatenate(y,axis=0)", "_____no_output_____" ], [ "scalers = SimpleNamespace(x=sklearn.preprocessing.StandardScaler(),y=sklearn.preprocessing.StandardScaler())\nscalers.x.fit(X.train)\nX = X.apply(lambda x: scalers.x.transform(x))\nscalers.y.fit(y.train)\ny = y.apply(lambda y: scalers.y.transform(y))", 
"_____no_output_____" ], [ "models = pd.Series(index=pd.Index([],dtype='object'), dtype=object)\nparameters = pd.Series({p.stem:p for p in Path('./optimized_parameters/').glob('*.pkl')}, dtype=object).map(read_parameters)\n\ny_pred = pd.DataFrame(dtype=float, index=range(y.test.size), columns=pd.MultiIndex(levels=[[],['mean','var']],codes=[[],[]],names=['model','']))\nresults = pd.DataFrame(columns=['RMSE','NLPD','MRAE'],dtype=float)", "_____no_output_____" ], [ "def read_parameters(p):\n try:\n with p.open('rb') as f:\n return pickle.load(f)\n except:\n return None\n\ndef train_model(model_label):\n m = models[model_label]\n if retrain_models == True or retrain_models == 'save' or model_label not in parameters.index:\n print('Training',model_label)\n variance_parameter = m.likelihood.variance if not isinstance(m, DGP) else m.likelihood.likelihood.variance\n variance_parameter.assign(0.01)\n # First round\n variance_parameter.trainable = False\n opt = gpflow.train.AdamOptimizer(0.01)\n opt.minimize(m, maxiter=2000)\n\n # Second round\n variance_parameter.trainable = True\n opt = gpflow.train.AdamOptimizer(0.01)\n opt.minimize(m, maxiter=5000)\n if retrain_models == 'save' or model_label not in parameters.index:\n with open(f'./optimized_parameters/{model_label}.pkl','wb') as f:\n pickle.dump(m.read_trainables(), f)\n else:\n m.assign(parameters[model_label])", "_____no_output_____" ] ], [ [ "# Create, train, and predict with models", "_____no_output_____" ] ], [ [ "n,D = X.train.shape\nm_v = 25\nm_u, Q, = 50, D\nZ_v = (m_v,D)\nZ_u = (m_u,Q)\nsample_size = 200", "_____no_output_____" ] ], [ [ "### SGPR", "_____no_output_____" ] ], [ [ "models['sgpr'] = gpflow.models.SGPR(X.train, y.train, gpflow.kernels.RBF(D, ARD=True), initial_inducing_points(X.train, m_u))\ntrain_model('sgpr')\ny_pred[('sgpr','mean')], y_pred[('sgpr','var')] = models['sgpr'].predict_y(X.test)", "_____no_output_____" ] ], [ [ "### Deep Mahalanobis GP", "_____no_output_____" ] ], [ [ 
"reset_seed()\nwith gpflow.defer_build():\n models['dvmgp'] = deep_vmgp.DeepVMGP(\n X.train, y.train, Z_u, Z_v,\n [gpflow.kernels.RBF(D,ARD=True) for i in range(Q)],\n full_qcov=False, diag_qmu=False\n )\nmodels['dvmgp'].compile()\ntrain_model('dvmgp')\ny_pred[('dvmgp','mean')], y_pred[('dvmgp','var')] = models['dvmgp'].predict_y(X.test)", "_____no_output_____" ] ], [ [ "### Show scores", "_____no_output_____" ] ], [ [ "for m in models.index:\n scaled_y_test = scalers.y.inverse_transform(y.test)\n scaled_y_pred = [\n scalers.y.inverse_transform(y_pred[m].values[:,[0]]),\n scalers.y.var_ * y_pred[m].values[:,[1]]\n ]\n results.at[m,'MRAE'] = metrics.mean_relative_absolute_error(scaled_y_test, scaled_y_pred[0]).squeeze()\n results.at[m,'RMSE'] = metrics.root_mean_squared_error(scaled_y_test, scaled_y_pred[0]).squeeze()\n results.at[m,'NLPD'] = metrics.negative_log_predictive_density(scaled_y_test, *scaled_y_pred).squeeze()\n\nresults", "_____no_output_____" ] ], [ [ "# Plot results", "_____no_output_____" ] ], [ [ "class MidpointNormalize(mpl.colors.Normalize):\n def __init__(self, vmin=None, vmax=None, midpoint=None, clip=False):\n self.midpoint = midpoint\n mpl.colors.Normalize.__init__(self, vmin, vmax, clip)\n\n def __call__(self, value, clip=None):\n x, y = [self.vmin, self.midpoint, self.vmax], [0, 0.5, 1]\n return np.ma.masked_array(np.interp(value, x, y), np.isnan(value))", "_____no_output_____" ], [ "f = plt.figure()\nax = plt.gca()\nax.scatter(scalers.x.transform(X_all)[:,0],scalers.x.transform(X_all)[:,1],edgecolors='white',facecolors='none')\nlims = (ax.get_xlim(), ax.get_ylim())\nplt.close(f)", "_____no_output_____" ], [ "n = 50\ngrid_points = np.dstack(np.meshgrid(np.linspace(*lims[0],n), np.linspace(*lims[1],n))).reshape(-1,2)\ngrid_y = np.empty((len(models.index),grid_points.shape[0]))\nfor i,m in enumerate(models.index):\n reset_seed()\n grid_pred = models[m].predict_y(grid_points, sample_size)[0]\n if len(grid_pred.shape) == 3:\n grid_y[i] = 
grid_pred.mean(axis=0)[:,0]\n else:\n grid_y[i] = grid_pred[:,0]\n\ngrid_points = grid_points.reshape(n,n,2)\ngrid_y = grid_y.reshape(-1,n,n)", "_____no_output_____" ], [ "f = plt.figure(constrained_layout=True,figsize=(8,7))\ngs = f.add_gridspec(ncols=4, nrows=2)\naxs = np.empty(3,dtype=object)\naxs[0] = f.add_subplot(gs[0,0:2])\naxs[1] = f.add_subplot(gs[0,2:4],sharey=axs[0])\naxs[2] = f.add_subplot(gs[1,1:3])\n\naxs[1].yaxis.set_visible(False)\naxs[2].yaxis.set_visible(False)\n\naxs[0].set_title('SGPR')\naxs[1].set_title('DVMGP')\naxs[2].set_title('Full Dataset')\n\nims = np.empty((2,4),dtype=object)\n\nfor i,m in enumerate(['sgpr', 'dvmgp']):\n ax = axs[i]\n ims[0,i] = ax.contourf(grid_points[:,:,0],grid_points[:,:,1],grid_y[i],30)\n\n # Plot features\n Z = None\n if m == 'dgp':\n Z = models[m].layers[0].feature.Z.value\n elif m in ['sgpr','vmgp']:\n Z = models[m].feature.Z.value\n elif m == 'dvmgp':\n Z = models[m].Z_v.Z.value\n\n if Z is not None:\n ax.scatter(Z[:,0],Z[:,1],marker='^',edgecolors='white',facecolors='none')\n# ims[1,i] = ax.scatter(X.test[:,0],X.test[:,1],edgecolors='white',c=y.test)\n \nims[0,3] = axs[2].scatter(X.test[:,0],X.test[:,1],c=y.test)\nims[1,3] = axs[2].scatter(X.train[:,0],X.train[:,1],c=y.train)\n\nfor ax in axs:\n ax.set_xlim(lims[0]);\n ax.set_ylim(lims[1]);\n \nclim = np.array([i.get_clim() for i in ims.flat if i is not None])\nclim = (clim.min(), clim.max())\nnorm = mpl.colors.Normalize(vmin=clim[0], vmax=clim[1])\n# norm = MidpointNormalize(vmin=clim[0], vmax=clim[1], midpoint=0)\nfor im in ims.flat:\n if im is not None:\n im.set_norm(norm)\nf.colorbar(ims[0,0], ax=axs, orientation='vertical', fraction=1, aspect=50)\n\nfor im in ims[0,:3].flat:\n if im is not None:\n for c in im.collections:\n c.set_edgecolor(\"face\")\n\nf.savefig('./figs/outputs.pdf')", "_____no_output_____" ], [ "n = 50\ngrid_points = np.dstack(np.meshgrid(np.linspace(*lims[0],n), np.linspace(*lims[1],n))).reshape(-1,2)\ngrid_y = 
np.empty((grid_points.shape[0],2))\n\ngrid_y = models['dvmgp'].enquire_session().run(tf.matmul(\n tf.transpose(models['dvmgp'].compute_qW(grid_points)[0][...,0],[2,0,1]),grid_points[:,:,None]\n)[:,:,0])\n\ngrid_points = grid_points.reshape(n,n,2)\ngrid_y = grid_y.reshape(n,n,2)\n\nf = plt.figure(constrained_layout=True,figsize=(8,4))\ngs = f.add_gridspec(ncols=2, nrows=1)\naxs = np.empty(4,dtype=object)\naxs[0] = f.add_subplot(gs[0,0])\naxs[1] = f.add_subplot(gs[0,1])\n\nextent = (*lims[0], *lims[1])\ncolorspace = 'cielab'\nalpha = 0.7\n\naxs[0].imshow(\n cplot.get_srgb1(grid_points[:,:,0] + grid_points[:,:,1]*1j, colorspace=colorspace, alpha=alpha),\n origin='lower',\n extent=extent,\n aspect='auto',\n interpolation='gaussian'\n)\naxs[0].set_title('Identity map')\n\naxs[1].imshow(\n cplot.get_srgb1(grid_y[:,:,0] + grid_y[:,:,1]*1j, colorspace=colorspace, alpha=alpha),\n origin='lower',\n extent=extent,\n aspect='auto',\n interpolation='gaussian'\n)\naxs[1].set_title('DVMGP: $Wx^\\intercal$');\nf.savefig('./figs/layers.pdf')", "_____no_output_____" ], [ "dvmgp_var = np.array([k.variance.value for k in models['dvmgp'].w_kerns])\n\nf,ax = plt.subplots(1,1,figsize=(3,3))\nax.bar(np.arange(2), dvmgp_var/dvmgp_var.max(), color='C2')\nax.set_ylabel('1st layer variance\\nrelative to largest value')\n\nax.set_xlabel('Latent dimension')\nax.set_xticks([])\n\nax.set_title('DVMGP')\nf.tight_layout()\nf.savefig('./figs/dims.pdf')", "_____no_output_____" ] ] ]
[ "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ] ]
d063b69f0b08b02a7b8969cb3540bb5645d3d954
7,384
ipynb
Jupyter Notebook
docs/level1/sasum.ipynb
timleslie/pyblas
9109f2cc24e674cf59a3b39f95c2d7b8116ae884
[ "BSD-3-Clause" ]
null
null
null
docs/level1/sasum.ipynb
timleslie/pyblas
9109f2cc24e674cf59a3b39f95c2d7b8116ae884
[ "BSD-3-Clause" ]
1
2020-10-10T23:23:06.000Z
2020-10-10T23:23:06.000Z
docs/level1/sasum.ipynb
timleslie/pyblas
9109f2cc24e674cf59a3b39f95c2d7b8116ae884
[ "BSD-3-Clause" ]
null
null
null
31.555556
399
0.529117
[ [ [ "# `sasum(N, SX, INCX)`\n\nComputes the sum of absolute values of elements of the vector $x$.\n\nOperates on single-precision real valued arrays.\n\nInput vector $\\mathbf{x}$ is represented as a [strided array](../strided_arrays.ipynb) `SX`, spaced by `INCX`.\nVector $\\mathbf{x}$ is of size `N`.", "_____no_output_____" ], [ "### Example usage", "_____no_output_____" ] ], [ [ "import os\nimport sys\nsys.path.insert(0, os.path.abspath(os.path.join(os.path.abspath(''), \"..\", \"..\")))", "_____no_output_____" ], [ "import numpy as np\nfrom pyblas.level1 import sasum", "_____no_output_____" ], [ "x = np.array([1, 2, 3], dtype=np.single)\nN = len(x)\nincx = 1", "_____no_output_____" ], [ "sasum(N, x, incx)", "_____no_output_____" ] ], [ [ "### Docstring", "_____no_output_____" ] ], [ [ "help(sasum)", "Help on function sasum in module pyblas.level1.sasum:\n\nsasum(N, SX, INCX)\n Computes the sum of absolute values of elements of the vector x\n \n Parameters\n ----------\n N : int\n Number of elements in input vector\n SX : numpy.ndarray\n A single precision real array, dimension (1 + (`N` - 1)*abs(`INCX`))\n INCX : int\n Storage spacing between elements of `SX`\n \n Returns\n -------\n numpy.single\n \n See Also\n --------\n dasum : Double-precision sum of absolute values\n \n Notes\n -----\n Online PyBLAS documentation: https://nbviewer.jupyter.org/github/timleslie/pyblas/blob/main/docs/sasum.ipynb\n Reference BLAS documentation: https://github.com/Reference-LAPACK/lapack/blob/v3.9.0/BLAS/SRC/sasum.f\n \n Examples\n --------\n >>> x = np.array([1, 2, 3], dtype=np.single)\n >>> N = len(x)\n >>> incx = 1\n >>> print(sasum(N, x, incx)\n 6.\n\n" ] ], [ [ "### Source code", "_____no_output_____" ] ], [ [ "sasum??", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d063bda09143f23b1dec8e6d92850b0fa5e5ff8b
431,235
ipynb
Jupyter Notebook
2019/PAN_AA_2018-POS-tag.ipynb
jeleandro/PANAA2018
aa681fcb4e2f90841cf30f53265fecbb111123e1
[ "Apache-2.0" ]
null
null
null
2019/PAN_AA_2018-POS-tag.ipynb
jeleandro/PANAA2018
aa681fcb4e2f90841cf30f53265fecbb111123e1
[ "Apache-2.0" ]
null
null
null
2019/PAN_AA_2018-POS-tag.ipynb
jeleandro/PANAA2018
aa681fcb4e2f90841cf30f53265fecbb111123e1
[ "Apache-2.0" ]
null
null
null
142.321782
120,504
0.77929
[ [ [ "# Notebook for PAN - Authorship Attribution - 2018", "_____no_output_____" ] ], [ [ "%matplotlib inline\n#python basic libs\nimport os;\nfrom os.path import join as pathjoin;\n\nimport warnings\nwarnings.simplefilter(action='ignore', category=FutureWarning)\nfrom sklearn.exceptions import UndefinedMetricWarning\nwarnings.simplefilter(action='ignore', category=UndefinedMetricWarning)\n\nimport re;\nimport json;\nimport codecs;\nfrom collections import defaultdict;\n\nfrom pprint import pprint\nfrom time import time\nimport logging\n\n\n#data analysis libs\nimport numpy as np;\nimport pandas as pd;\nfrom pandas.plotting import scatter_matrix;\nimport matplotlib.pyplot as plt;\nimport random;\n\n#machine learning libs\n#feature extraction\nfrom sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer, TfidfTransformer\n\n#preprocessing and transformation\nfrom sklearn import preprocessing\nfrom sklearn.preprocessing import normalize, MaxAbsScaler, RobustScaler;\nfrom sklearn.decomposition import PCA;\n\nfrom sklearn.base import BaseEstimator, ClassifierMixin\n\n#classifiers\nfrom sklearn import linear_model;\nfrom sklearn.linear_model import LogisticRegression\n\nfrom sklearn.svm import LinearSVC, SVC\nfrom sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier\nfrom sklearn.neural_network import MLPClassifier\n\n \n#\nfrom sklearn import feature_selection;\nfrom sklearn import ensemble;\n\nfrom sklearn.model_selection import train_test_split;\nfrom sklearn.model_selection import GridSearchCV\nfrom sklearn.pipeline import Pipeline\n\n#model evaluation\nfrom sklearn.metrics import roc_auc_score, f1_score, precision_score, recall_score, accuracy_score;", "_____no_output_____" ], [ "import seaborn as sns;\nsns.set(color_codes=True);", "_____no_output_____" ], [ "import spacy\nget_ipython().config.get('IPKernelApp', {})['parent_appname'] = \"\" #spacy causes a bug on pandas and this code fixes it", "_____no_output_____" ], [ "import 
platform;\nimport sklearn;\nimport scipy;\n\nprint(\"|%-15s|%-40s|\"%(\"PACK\",\"VERSION\"))\nprint(\"|%-15s|%-40s|\"%('-'*15,'-'*40))\nprint('\\n'.join(\n \"|%-15s|%-40s|\" % (pack, version)\n for pack, version in\n zip(['SO','NumPy','SciPy','Scikit-Learn','seaborn','spacy'],\n [platform.platform(), np.__version__, scipy.__version__, sklearn.__version__, sns.__version__, spacy.__version__])\n\n))", "|PACK |VERSION |\n|---------------|----------------------------------------|\n|SO |Darwin-18.2.0-x86_64-i386-64bit |\n|NumPy |1.15.4 |\n|SciPy |1.1.0 |\n|Scikit-Learn |0.20.1 |\n|seaborn |0.9.0 |\n|spacy |2.0.16 |\n" ], [ "np.set_printoptions(precision=4)\npd.options.display.float_format = '{:,.4f}'.format", "_____no_output_____" ], [ "#externalizing codes that is used in many notebooks and it is not experiment specific\nimport pan\n#convert a sparse matrix into a dense for being used on PCA\nfrom skleanExtensions import DenseTransformer;\n\n#convert an array of text into an array of tokenized texts each token must contain text, tag_, pos_, dep_\nfrom skleanExtensions import POSTagTransformer", "_____no_output_____" ] ], [ [ "### paths configuration", "_____no_output_____" ] ], [ [ "baseDir = '/Users/joseeleandrocustodio/Dropbox/mestrado/02 - Pesquisa/code';\n\ninputDir= pathjoin(baseDir,'pan18aa');\noutputDir= pathjoin(baseDir,'out',\"oficial\");\nif not os.path.exists(outputDir):\n os.mkdir(outputDir);", "_____no_output_____" ] ], [ [ "## loading the dataset", "_____no_output_____" ] ], [ [ "problems = pan.readCollectionsOfProblems(inputDir);", "_____no_output_____" ], [ "print(problems[0]['problem'])\nprint(problems[0].keys())", "problem00001\ndict_keys(['problem', 'language', 'encoding', 'candidates_folder_count', 'candidates', 'unknown'])\n" ], [ "pd.DataFrame(problems)", "_____no_output_____" ], [ "def cachingPOSTAG(problem, taggingVersion='TAG'):\n import json;\n print (\"Tagging: %s, language: %s, \" %(problem['problem'],problem['language']), end=' ');\n \n if 
not os.path.exists('POSTAG_cache'):\n os.makedirs('POSTAG_cache');\n \n _id = problem['problem']+problem['language'];\n filename = os.path.join('POSTAG_cache',taggingVersion+'_'+_id+'.json')\n if not os.path.exists(filename):\n lang = problem['language'];\n if lang == 'sp':\n lang = 'es';\n elif lang =='pl':\n print(lang, ' not supported');\n return ;\n\n train_docs, train_labels, _ = zip(*problem['candidates'])\n problem['training_docs_size'] = len(train_docs);\n test_docs, _, test_filename = zip(*problem['unknown'])\n\n t0 = time()\n tagger = POSTagTransformer(language=lang);\n train_docs = tagger.fit_transform(train_docs);\n test_docs = tagger.fit_transform(test_docs);\n \n print(\"Annotation time %0.3fs\" % (time() - t0))\n \n with open(filename,'w') as f:\n json.dump({\n 'train':train_docs,\n 'train_labels':train_labels,\n 'test':test_docs,\n 'test_filename':test_filename\n },f);\n else:\n with open(filename,'r') as f:\n data = json.load(f);\n\n train_docs = data['train'];\n train_labels = data['train_labels'];\n test_docs = data['test'];\n test_filename = data['test_filename'];\n print('tagged')\n return train_docs, train_labels, test_docs, test_filename;\n\nfor problem in problems:\n cachingPOSTAG(problem)", "Tagging: problem00001, language: en, tagged\nTagging: problem00002, language: en, tagged\nTagging: problem00003, language: fr, tagged\nTagging: problem00004, language: fr, tagged\nTagging: problem00005, language: it, tagged\nTagging: problem00006, language: it, tagged\nTagging: problem00007, language: pl, pl not supported\nTagging: problem00008, language: pl, pl not supported\nTagging: problem00009, language: sp, tagged\nTagging: problem00010, language: sp, tagged\n" ], [ "train_docs, train_labels, test_docs, test_filename = cachingPOSTAG(problem)", "Tagging: problem00010, language: sp, tagged\n" ], [ "class FilterTagTransformer(BaseEstimator):\n def __init__(self,token='POS', parts=None):\n self.token = token;\n self.parts = parts;\n\n def 
transform(self, X, y=None):\n        \"\"\" Returns an array of space-joined tokens\n        Parameters\n        ----------\n        X : {array-like}, shape = [n_samples, n_tokens]\n            Array documents, where each document consists of a list of nodes\n            and each node consists of a token and its corresponding tags\n            \n            [\n              [('a','TAG1'),('b','TAG2')],\n              [('a','TAG1')]\n            ]\n        y : array-like, shape = [n_samples] (default: None)\n        Returns\n        ---------\n        X_new : array of documents, each reduced to a space-joined string of the selected tokens.\n        \"\"\"\n        if self.token == 'TAG':\n            X = [' '.join([d[1].split('__')[0] for d in doc]) for doc in X]\n        elif self.token == 'POS':\n            if self.parts is None:\n                X = [' '.join([d[2] for d in doc]) for doc in X];\n            else:\n                X = [' '.join([d[0] for d in doc if d[2] in self.parts]) for doc in X]\n        elif self.token == 'DEP':\n            X = [' '.join([d[3] for d in doc]) for doc in X]\n        elif self.token == 'word_POS':\n            if self.parts is None:\n                X = [' '.join([d[0]+'/'+d[2] for d in doc]) for doc in X]\n        elif self.token == 'filter':\n            if self.parts is None:\n                X = [' '.join([d[2] for d in doc]) for doc in X];\n            else:\n                X = [' '.join([d[0] for d in doc if d[2] in self.parts]) for doc in X]\n        else:\n            X = [' '.join([d[0] for d in doc]) for doc in X]\n            \n        return np.array(X);  \n\n    def fit(self, X, y=None):\n        self.is_fitted = True\n        return self\n\n    def fit_transform(self, X, y=None):\n        return self.transform(X=X, y=y)", "_____no_output_____" ] ], [ [ "### analyzing the remaining parameters", "_____no_output_____" ] ], [ [ "def spaceTokenizer(x):\n    return x.split(\" \");", "_____no_output_____" ], [ "def runML(problem):\n    print (\"\\nProblem: %s, language: %s, \" %(problem['problem'],problem['language']), end=' ');\n    \n    lang = problem['language'];\n    if lang == 'sp':\n        lang = 'es';\n    elif lang =='pl':\n        print(lang, ' not supported');\n        return None,None,None,None;\n    \n    \n    train_docs, train_labels, test_docs, test_filename = cachingPOSTAG(problem)\n    problem['training_docs_size'] = len(train_docs);\n\n    t0 = time()\n    \n    pipeline = Pipeline([\n        
('filter',FilterTagTransformer(token='TAG')),\n ('vect', CountVectorizer(\n tokenizer=spaceTokenizer,\n min_df=0.01,\n lowercase=False\n )),\n ('tfidf', TfidfTransformer()),\n ('scaler', MaxAbsScaler()),\n ('dense', DenseTransformer()),\n ('transf', PCA(0.999)),\n ('clf', LogisticRegression(random_state=0,multi_class='multinomial', solver='newton-cg')),\n ])\n \n \n # uncommenting more parameters will give better exploring power but will\n # increase processing time in a combinatorial way\n parameters = {\n 'vect__ngram_range' :((1,1),(1,2),(1,3),(1,5)),\n 'tfidf__use_idf' :(True, False),\n 'tfidf__sublinear_tf':(True, False),\n 'tfidf__norm':('l1','l2'),\n 'clf__C':(0.1,1,10),\n }\n \n grid_search = GridSearchCV(pipeline,\n parameters,\n cv=4,\n iid=False,\n n_jobs=-1,\n verbose=False,\n scoring='f1_macro')\n \n t0 = time()\n grid_search.fit(train_docs, train_labels)\n print(\"Gridsearh %0.3fs\" % (time() - t0), end=' ')\n\n print(\"Best score: %0.3f\" % grid_search.best_score_)\n print(\"Best parameters set:\")\n best_parameters = grid_search.best_estimator_.get_params()\n for param_name in sorted(parameters.keys()):\n print(\"\\t%s: %r\" % (param_name, best_parameters[param_name]))\n \n train_pred=grid_search.predict(train_docs);\n test_pred=grid_search.predict(test_docs);\n \n \n # Writing output file\n out_data=[]\n for i,v in enumerate(test_pred):\n out_data.append({'unknown-text': test_filename[i],'predicted-author': v})\n answerFile = pathjoin(outputDir,'answers-'+problem['problem']+'.json');\n with open(answerFile, 'w') as f:\n json.dump(out_data, f, indent=4)\n \n \n #calculating the performance using PAN evaluation code\n f1,precision,recall,accuracy=pan.evaluate(\n pathjoin(inputDir, problem['problem'], 'ground-truth.json'),\n answerFile)\n \n return {\n 'problem-name' : problem['problem'],\n \"language\" : problem['language'],\n 'AuthorCount' : len(set(train_labels)),\n 'macro-f1' : round(f1,3),\n 'macro-precision': round(precision,3),\n 'macro-recall' 
: round(recall,3),\n 'micro-accuracy' : round(accuracy,3),\n \n }, grid_search.cv_results_,best_parameters, grid_search.best_estimator_;", "_____no_output_____" ], [ "result = [];\ncv_result = [];\nbest_parameters = [];\nestimators = [];\nfor problem in problems:\n with warnings.catch_warnings():\n warnings.filterwarnings(\"ignore\");\n r, c, b, e = runML(problem);\n if r is None:\n continue;\n result.append(r);\n cv_result.append(c);\n estimators.append(e);\n b['problem'] = problem['problem'];\n best_parameters.append(b);", "\nProblem: problem00001, language: en, Tagging: problem00001, language: en, tagged\nGridsearh 1107.958s Best score: 0.661\nBest parameters set:\n\tclf__C: 10\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 2)\n\nProblem: problem00002, language: en, Tagging: problem00002, language: en, tagged\nGridsearh 251.719s Best score: 0.840\nBest parameters set:\n\tclf__C: 0.1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: False\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 3)\n\nProblem: problem00003, language: fr, Tagging: problem00003, language: fr, tagged\nGridsearh 1038.886s Best score: 0.530\nBest parameters set:\n\tclf__C: 1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: False\n\tvect__ngram_range: (1, 3)\n\nProblem: problem00004, language: fr, Tagging: problem00004, language: fr, tagged\nGridsearh 256.516s Best score: 0.663\nBest parameters set:\n\tclf__C: 0.1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 5)\n\nProblem: problem00005, language: it, Tagging: problem00005, language: it, tagged\nGridsearh 1014.834s Best score: 0.622\nBest parameters set:\n\tclf__C: 10\n\ttfidf__norm: 'l1'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 3)\n\nProblem: problem00006, language: it, Tagging: problem00006, language: it, tagged\nGridsearh 264.087s Best score: 0.880\nBest parameters set:\n\tclf__C: 
0.1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 2)\n\nProblem: problem00007, language: pl, pl not supported\n\nProblem: problem00008, language: pl, pl not supported\n\nProblem: problem00009, language: sp, Tagging: problem00009, language: sp, tagged\nGridsearh 1135.047s Best score: 0.610\nBest parameters set:\n\tclf__C: 1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: True\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 5)\n\nProblem: problem00010, language: sp, Tagging: problem00010, language: sp, tagged\nGridsearh 267.930s Best score: 0.678\nBest parameters set:\n\tclf__C: 1\n\ttfidf__norm: 'l2'\n\ttfidf__sublinear_tf: False\n\ttfidf__use_idf: True\n\tvect__ngram_range: (1, 2)\n" ], [ "df=pd.DataFrame(result)[['problem-name',\n \"language\",\n 'AuthorCount',\n 'macro-f1','macro-precision','macro-recall' ,'micro-accuracy']]", "_____no_output_____" ], [ "df", "_____no_output_____" ], [ "df[['macro-f1']].mean()", "_____no_output_____" ], [ "languages={\n 'en':'inglesa',\n 'sp':'espanhola',\n 'it':'italiana',\n 'pl':'polonesa',\n 'fr':'francesa'\n}", "_____no_output_____" ], [ "cv_result2 = [];\ndfCV = pd.DataFrame();\nfor i, c in enumerate(cv_result):\n temp = pd.DataFrame(c);\n temp['language'] = result[i]['AuthorCount']\n temp['problem'] = int(re.sub('\\D','',result[i]['problem-name']));\n temp['language'] = languages[result[i]['language']]\n dfCV = dfCV.append(temp);\n\nfor p in [\n 'mean_test_score','std_test_score','mean_train_score', \n 'split0_test_score',\n 'split1_test_score',\n 'split2_test_score']:\n dfCV[p]=dfCV[p].astype(np.float32);\n\n \ndfCV =dfCV[[\n 'problem',\n 'language',\n 'rank_test_score',\n 'param_vect__ngram_range',\n 'param_tfidf__sublinear_tf',\n 'param_tfidf__norm',\n 'param_clf__C',\n 'mean_test_score', \n 'std_test_score',\n\n 'split0_test_score',\n 'split1_test_score',\n 'split2_test_score',\n\n 'mean_score_time',\n 'mean_fit_time',\n 'std_fit_time',\n 'std_score_time',\n 
'std_train_score',\n]];\n\ndfCV.rename(columns={\n    'param_vect__ngram_range':'ngram_range',\n    'param_tfidf__sublinear_tf':'sublinear_tf',\n    'param_tfidf__smooth_idf':'smooth_idf',\n    'param_tfidf__norm':'norm',\n    'param_clf__C':'regularization',\n},inplace=True);\n\n#print('\\',\\n\\''.join(dfCV.columns))\n", "_____no_output_____" ], [ "dfCV.head()", "_____no_output_____" ] ], [ [ "## Saving the model", "_____no_output_____" ] ], [ [ "dfCV.to_csv('PANAA2018_POSTAG.csv', index=False)", "_____no_output_____" ], [ "dfCV = pd.read_csv('PANAA2018_POSTAG.csv', na_values='')", "_____no_output_____" ], [ "import pickle;\nwith open(\"PANAA2018_POSTAG.pkl\",\"wb\") as f:\n    pickle.dump(estimators,f)", "_____no_output_____" ] ], [ [ "## understanding the model with reports", "_____no_output_____" ], [ "We can see that more than one configuration is possible for the same problem", "_____no_output_____" ] ], [ [ "print(' | '.join(best_parameters[0]['vect'].get_feature_names()[0:20]))", " | '' | -LRB- | CC | CD | DT | EX | IN | JJ | MD | NN | NNP | NNPS | NNS | PRP | PRP$ | RB | UH | VB | VBD\n" ], [ "(dfCV[dfCV.rank_test_score == 1]).drop_duplicates()[\n ['problem',\n 'language',\n 'mean_test_score',\n 'std_test_score',\n 'ngram_range',\n 'sublinear_tf',\n 'norm']\n].sort_values(by=[\n 'problem',\n 'mean_test_score',\n 'std_test_score',\n 'ngram_range',\n 'sublinear_tf'\n], ascending=[True, False,True,False,False])", "_____no_output_____" ], [ "dfCV.pivot_table(\n index=['problem','language','norm','sublinear_tf'],\n columns=[ 'ngram_range','regularization'],\n values='mean_test_score'\n )", "_____no_output_____" ] ], [ [ "The returned score comes from the cross-validation test folds, not from the test set", "_____no_output_____" ] ], [ [ "pd.options.display.precision = 3 \nprint(u"\\\\begin{table}[h]\\n\\\\centering\\n\\\\caption{Medida F1 para os parâmetros }")\n\nprint(re.sub(r'[ ]{2,}',' ',dfCV.pivot_table(\n 
index=['problem','language','sublinear_tf','norm'],\n columns=['ngram_range'],\n values='mean_test_score'\n ).to_latex()))\nprint (\"\\label{tab:modelocaracter}\")\nprint(r\"\\end{table}\")", "\\begin{table}[h]\n\\centering\n\\caption{Medida F1 para os parรขmetros }\n\\begin{tabular}{llllrrrr}\n\\toprule\n & & & ngram\\_range & (1, 1) & (1, 2) & (1, 3) & (1, 5) \\\\\nproblem & language & sublinear\\_tf & norm & & & & \\\\\n\\midrule\n1 & inglesa & False & l1 & 0.4150 & 0.5965 & 0.6053 & 0.5652 \\\\\n & & & l2 & 0.4107 & 0.5957 & 0.6074 & 0.5462 \\\\\n & & True & l1 & 0.3265 & 0.6172 & 0.6278 & 0.5325 \\\\\n & & & l2 & 0.3280 & 0.6317 & 0.6174 & 0.5734 \\\\\n2 & inglesa & False & l1 & 0.6302 & 0.7712 & 0.8017 & 0.7436 \\\\\n & & & l2 & 0.6209 & 0.7722 & 0.8200 & 0.7352 \\\\\n & & True & l1 & 0.7302 & 0.7919 & 0.7552 & 0.7519 \\\\\n & & & l2 & 0.7306 & 0.7895 & 0.7678 & 0.7519 \\\\\n3 & francesa & False & l1 & 0.2583 & 0.4162 & 0.4969 & 0.4386 \\\\\n & & & l2 & 0.2444 & 0.4164 & 0.5003 & 0.4524 \\\\\n & & True & l1 & 0.1230 & 0.4329 & 0.4955 & 0.4724 \\\\\n & & & l2 & 0.1288 & 0.4439 & 0.5196 & 0.4928 \\\\\n4 & francesa & False & l1 & 0.4039 & 0.4439 & 0.6035 & 0.5917 \\\\\n & & & l2 & 0.4108 & 0.4278 & 0.5944 & 0.5662 \\\\\n & & True & l1 & 0.2567 & 0.3345 & 0.6181 & 0.6328 \\\\\n & & & l2 & 0.2594 & 0.3315 & 0.6411 & 0.6633 \\\\\n5 & italiana & False & l1 & 0.2972 & 0.4731 & 0.5116 & 0.4924 \\\\\n & & & l2 & 0.2880 & 0.4545 & 0.4813 & 0.4743 \\\\\n & & True & l1 & 0.1986 & 0.5366 & 0.6021 & 0.5081 \\\\\n & & & l2 & 0.1973 & 0.5217 & 0.5617 & 0.5230 \\\\\n6 & italiana & False & l1 & 0.7239 & 0.8300 & 0.8367 & 0.8367 \\\\\n & & & l2 & 0.7528 & 0.8300 & 0.8367 & 0.8233 \\\\\n & & True & l1 & 0.4723 & 0.8533 & 0.8367 & 0.8339 \\\\\n & & & l2 & 0.4858 & 0.8683 & 0.8367 & 0.8100 \\\\\n9 & espanhola & False & l1 & 0.2194 & 0.5035 & 0.5213 & 0.5761 \\\\\n & & & l2 & 0.2126 & 0.5008 & 0.5177 & 0.5638 \\\\\n & & True & l1 & 0.1186 & 0.4609 & 0.5542 & 0.6021 \\\\\n & & & l2 & 
0.1213 & 0.4623 & 0.5585 & 0.5997 \\\\\n10 & espanhola & False & l1 & 0.3879 & 0.6108 & 0.5474 & 0.6333 \\\\\n & & & l2 & 0.3901 & 0.6106 & 0.5526 & 0.5783 \\\\\n & & True & l1 & 0.2956 & 0.5603 & 0.5447 & 0.6572 \\\\\n & & & l2 & 0.2665 & 0.5697 & 0.5450 & 0.6289 \\\\\n\\bottomrule\n\\end{tabular}\n\n\\label{tab:modelocaracter}\n\\end{table}\n" ], [ "d = dfCV.copy()\nd = d.rename(columns={'language':u'Língua', 'sublinear_tf':'TF Sublinear'})\nd = d [ d.norm.isna() == False]\nd['autorNumber'] = d.problem.map(lambda x: 20 if x % 2==0 else 5)\nd.problem = d.apply(lambda x: x[u'Língua'] +\" \"+ str(x[u'problem']), axis=1)\n#d.ngram_range = d.apply(lambda x: str(x[u'ngram_range'][0]) +\" \"+ str(x[u'ngram_range'][1]), axis=1)\n\nd.std_test_score =d.std_test_score / d.std_test_score.quantile(0.95) *500;\nd.std_test_score +=1;\nd.std_test_score = d.std_test_score.astype(np.int64)\ng = sns.FacetGrid(d, col='Língua', hue='TF Sublinear', row=\"regularization\", height=3,palette=\"Set1\")\ng.map(plt.scatter, \"ngram_range\", \"mean_test_score\",s=d.std_test_score.values).add_legend();\n#sns.pairplot(d, hue=\"TF Sublinear\", vars=[\"autorNumber\", \"mean_test_score\"])\n", "_____no_output_____" ], [ "g = sns.FacetGrid(d, row='autorNumber', hue='TF Sublinear', col=u\"Língua\", height=3,palette=\"Set1\")\ng.map(plt.scatter, \"ngram_range\", \"mean_test_score\", alpha=0.5, s=d.std_test_score.values).add_legend();", "_____no_output_____" ], [ "sns.distplot(dfCV.std_test_score, bins=25);", "_____no_output_____" ], [ "import statsmodels.api as sm", "_____no_output_____" ], [ "d = dfCV[['mean_test_score','problem', 'language','sublinear_tf','norm','ngram_range']].copy();\nd.sublinear_tf=d.sublinear_tf.apply(lambda x: 1 if x else 0)\nd.norm=d.norm.apply(lambda x: 1 if x=='l1' else 0)\n\nd['autorNumber'] = d.problem.map(lambda x: 20 if x % 2==0 else 5)\nd.norm.fillna(value='None', inplace=True);\n\n_, d['ngram_max'] = 
zip(*d.ngram_range.str.replace(r'[^\\d,]','').str.split(',').values.tolist())\n#d.ngram_min = d.ngram_min.astype(np.uint8);\nd.ngram_max = d.ngram_max.astype(np.uint8);\nd.drop(columns=['ngram_range','problem'], inplace=True)\n#d['intercept'] = 1;\n\nd=pd.get_dummies(d, columns=['language'])", "_____no_output_____" ], [ "d.describe()", "_____no_output_____" ], [ "mod = sm.OLS( d.iloc[:,0], d.iloc[:,1:])\nres = mod.fit()\nres.summary()", "_____no_output_____" ], [ "sns.distplot(res.predict()-d.iloc[:,0].values, bins=25)", "_____no_output_____" ], [ "sns.jointplot(x='F1',y='F1-estimated',data=pd.DataFrame({'F1':d.iloc[:,0].values, 'F1-estimated':res.predict()}));", "_____no_output_____" ] ], [ [ "# tests", "_____no_output_____" ] ], [ [ "problem = problems[0]\nprint (\"\\nProblem: %s, language: %s, \" %(problem['problem'],problem['language']), end=' ');\n", "\nProblem: problem00001, language: en, " ], [ "def d(estimator, n_features=5):\n from IPython.display import Markdown, display, HTML\n names = np.array(estimator.named_steps['vect'].get_feature_names());\n classes_ = estimator.named_steps['clf'].classes_;\n weights = estimator.named_steps['clf'].coef_;\n \n def tag(tag, content, attrib=''):\n if attrib != '':\n attrib = ' style=\"' + attrib+'\"'; \n return ''.join(['<',tag,attrib,' >',content,'</',tag,'>']);\n \n def color(baseColor, intensity):\n r,g,b = baseColor[0:2],baseColor[2:4],baseColor[4:6]\n r,g,b = int(r, 16), int(g, 16), int(b, 16)\n \n f= (1-np.abs(intensity))/2;\n r = r + int((255-r)*f)\n g = g + int((255-g)*f)\n b = b + int((255-b)*f)\n rgb = '#%02x%x%x' % (r, g, b);\n #print(baseColor,rgb,r,g,b,intensity,f)\n return rgb\n \n \n spanStyle ='border-radius: 5px;margin:4px;padding:3px; color:#FFF !important;';\n \n lines = '<table>'+tag('thead',tag('th','Classes')+tag('th','positive')+tag('th','negative'))\n lines += '<tbody>'\n for i,c in enumerate(weights):\n c = np.round(c / np.abs(c).max(),2);\n positive = names[np.argsort(-c)][:n_features];\n 
positiveV = c[np.argsort(-c)][:n_features]\n negative = names[np.argsort(c)][:n_features];\n negativeV = c[np.argsort(c)][:n_features]\n \n lines += tag('tr',\n tag('td', re.sub('\\D0*','',classes_[i]))\n + tag('td',''.join([tag('span',d.upper()+' '+str(v),spanStyle+'background:'+color('51A3DD',v)) for d,v in zip(positive,positiveV)]))\n + tag('td',''.join([tag('span',d.upper()+' '+str(v),spanStyle+'background:'+color('DD5555',v)) for d,v in zip(negative,negativeV)]))\n )\n lines+= '</tbody></table>'\n \n display(HTML(lines))\n #print(lines)\n \nd(estimators[0])", "_____no_output_____" ], [ "%%HTML\n<table><tbody><tr><th>POS</th><th>Description</th><th>Examples</th></tr><tr >\n<td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text u-text-small\">adjective</td><td class=\"c-table__cell u-text u-text-small\"><em>big, old, green, incomprehensible, first</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text u-text-small\">adposition</td><td class=\"c-table__cell u-text u-text-small\"><em>in, to, during</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text u-text-small\">adverb</td><td class=\"c-table__cell u-text u-text-small\"><em>very, tomorrow, down, where, there</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text u-text-small\">auxiliary</td><td class=\"c-table__cell u-text u-text-small\"><em>is, has (done), will (do), should (do)</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text u-text-small\">conjunction</td><td class=\"c-table__cell u-text u-text-small\"><em>and, or, but</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CCONJ</code></td><td class=\"c-table__cell u-text 
u-text-small\">coordinating conjunction</td><td class=\"c-table__cell u-text u-text-small\"><em>and, or, but</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text u-text-small\">determiner</td><td class=\"c-table__cell u-text u-text-small\"><em>a, an, the</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>INTJ</code></td><td class=\"c-table__cell u-text u-text-small\">interjection</td><td class=\"c-table__cell u-text u-text-small\"><em>psst, ouch, bravo, hello</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text u-text-small\">noun</td><td class=\"c-table__cell u-text u-text-small\"><em>girl, cat, tree, air, beauty</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NUM</code></td><td class=\"c-table__cell u-text u-text-small\">numeral</td><td class=\"c-table__cell u-text u-text-small\"><em>1, 2017, one, seventy-seven, IV, MMXIV</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text u-text-small\">particle</td><td class=\"c-table__cell u-text u-text-small\"><em>'s, not, </em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text u-text-small\">pronoun</td><td class=\"c-table__cell u-text u-text-small\"><em>I, you, he, she, myself, themselves, somebody</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text u-text-small\">proper noun</td><td class=\"c-table__cell u-text u-text-small\"><em>Mary, John, London, NATO, HBO</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation</td><td class=\"c-table__cell u-text u-text-small\"><em>., (, 
), ?</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SCONJ</code></td><td class=\"c-table__cell u-text u-text-small\">subordinating conjunction</td><td class=\"c-table__cell u-text u-text-small\"><em>if, while, that</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text u-text-small\">symbol</td><td class=\"c-table__cell u-text u-text-small\"><em>$, %, ยง, ยฉ, +, โˆ’, ร—, รท, =, :), ๐Ÿ˜</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text u-text-small\">verb</td><td class=\"c-table__cell u-text u-text-small\"><em>run, runs, running, eat, ate, eating</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text u-text-small\">other</td><td class=\"c-table__cell u-text u-text-small\"><em>sfpksdpsxmsa</em></td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SPACE</code></td><td class=\"c-table__cell u-text u-text-small\">space</td></tr></tbody></table>", "_____no_output_____" ], [ "%%HTML\n<h1>English</h1>\n\n<table class=\"c-table o-block\"><tbody><tr class=\"c-table__row c-table__row--head\"><th class=\"c-table__head-cell u-text-label\">Tag</th><th class=\"c-table__head-cell u-text-label\">POS</th><th class=\"c-table__head-cell u-text-label\">Morphology</th><th class=\"c-table__head-cell u-text-label\">Description</th></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>-LRB-</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=brck</code> <code>PunctSide=ini</code></td><td class=\"c-table__cell u-text u-text-small\">left round bracket</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>-RRB-</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td 
class=\"c-table__cell u-text\"> <code>PunctType=brck</code> <code>PunctSide=fin</code></td><td class=\"c-table__cell u-text u-text-small\">right round bracket</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>,</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=comm</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, comma</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>:</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, colon or ellipsis</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>.</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=peri</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, sentence closer</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>''</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=quot</code> <code>PunctSide=fin</code></td><td class=\"c-table__cell u-text u-text-small\">closing quotation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>\"\"</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=quot</code> <code>PunctSide=fin</code></td><td class=\"c-table__cell u-text u-text-small\">closing quotation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>#</code></td><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"> <code>SymType=numbersign</code></td><td class=\"c-table__cell u-text u-text-small\">symbol, number sign</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell 
u-text\"><code>``</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=quot</code> <code>PunctSide=ini</code></td><td class=\"c-table__cell u-text u-text-small\">opening quotation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$</code></td><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"> <code>SymType=currency</code></td><td class=\"c-table__cell u-text u-text-small\">symbol, currency</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADD</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">email</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>AFX</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Hyph=yes</code></td><td class=\"c-table__cell u-text u-text-small\">affix</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>BES</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">auxiliary \"be\"</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CC</code></td><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text\"> <code>ConjType=coor</code></td><td class=\"c-table__cell u-text u-text-small\">conjunction, coordinating</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CD</code></td><td class=\"c-table__cell u-text\"><code>NUM</code></td><td class=\"c-table__cell u-text\"> <code>NumType=card</code></td><td class=\"c-table__cell u-text u-text-small\">cardinal number</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>DT</code></td><td class=\"c-table__cell 
<code>DET</code></td>">
u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>EX</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>AdvType=ex</code></td><td class=\"c-table__cell u-text u-text-small\">existential there</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>FW</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"> <code>Foreign=yes</code></td><td class=\"c-table__cell u-text u-text-small\">foreign word</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>GW</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">additional word in multi-word expression</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>HVS</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">forms of \"have\"</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>HYPH</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=dash</code></td><td class=\"c-table__cell u-text u-text-small\">punctuation mark, hyphen</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>IN</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">conjunction, subordinating or preposition</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>JJ</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> 
<code>Degree=pos</code></td><td class=\"c-table__cell u-text u-text-small\">adjective</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>JJR</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Degree=comp</code></td><td class=\"c-table__cell u-text u-text-small\">adjective, comparative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>JJS</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Degree=sup</code></td><td class=\"c-table__cell u-text u-text-small\">adjective, superlative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>LS</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>NumType=ord</code></td><td class=\"c-table__cell u-text u-text-small\">list item marker</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>MD</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">verb, modal auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NFP</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">superfluous punctuation</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NIL</code></td><td class=\"c-table__cell u-text\"><code></code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">missing tag</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NN</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"> <code>Number=sing</code></td><td class=\"c-table__cell u-text u-text-small\">noun, singular 
or mass</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNP</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"> <code>NounType=prop</code> <code>Number=sing</code></td><td class=\"c-table__cell u-text u-text-small\">noun, proper singular</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNPS</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"> <code>NounType=prop</code> <code>Number=plur</code></td><td class=\"c-table__cell u-text u-text-small\">noun, proper plural</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNS</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"> <code>Number=plur</code></td><td class=\"c-table__cell u-text u-text-small\">noun, plural</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PDT</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>AdjType=pdt</code> <code>PronType=prn</code></td><td class=\"c-table__cell u-text u-text-small\">predeterminer</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>POS</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>Poss=yes</code></td><td class=\"c-table__cell u-text u-text-small\">possessive ending</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRP</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code></td><td class=\"c-table__cell u-text u-text-small\">pronoun, personal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRP$</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code> 
<code>Poss=yes</code></td><td class=\"c-table__cell u-text u-text-small\">pronoun, possessive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RB</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>Degree=pos</code></td><td class=\"c-table__cell u-text u-text-small\">adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RBR</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>Degree=comp</code></td><td class=\"c-table__cell u-text u-text-small\">adverb, comparative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RBS</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>Degree=sup</code></td><td class=\"c-table__cell u-text u-text-small\">adverb, superlative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>RP</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">adverb, particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>_SP</code></td><td class=\"c-table__cell u-text\"><code>SPACE</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">space</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"><code>SYM</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">symbol</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>TO</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=inf</code> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitival to</td></tr><tr 
class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>UH</code></td><td class=\"c-table__cell u-text\"><code>INTJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">interjection</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VB</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">verb, base form</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBD</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>Tense=past</code></td><td class=\"c-table__cell u-text u-text-small\">verb, past tense</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBG</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=part</code> <code>Tense=pres</code> <code>Aspect=prog</code></td><td class=\"c-table__cell u-text u-text-small\">verb, gerund or present participle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBN</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=part</code> <code>Tense=past</code> <code>Aspect=perf</code></td><td class=\"c-table__cell u-text u-text-small\">verb, past participle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>Tense=pres</code></td><td class=\"c-table__cell u-text u-text-small\">verb, non-3rd person singular present</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VBZ</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td 
class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>Tense=pres</code> <code>Number=sing</code> <code>Person=3</code></td><td class=\"c-table__cell u-text u-text-small\">verb, 3rd person singular present</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WDT</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WP</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-pronoun, personal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WP$</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Poss=yes PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-pronoun, possessive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>WRB</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code> <code>rel</code></td><td class=\"c-table__cell u-text u-text-small\">wh-adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>XX</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">unknown</td></tr></tbody></table>", "_____no_output_____" ], [ "%%HTML\n<h1>German</h1>\n<p> The German part-of-speech tagger uses the <a href=\"http://www.ims.uni-stuttgart.de/forschung/ressourcen/korpora/TIGERCorpus/annotation/index.html\" target=\"_blank\" rel=\"noopener nofollow\">TIGER Treebank</a> annotation scheme. 
We also map the tags to the simpler Google\nUniversal POS tag set.</p>\n\n<table class=\"c-table o-block\"><tbody><tr class=\"c-table__row c-table__row--head\"><th class=\"c-table__head-cell u-text-label\">Tag</th><th class=\"c-table__head-cell u-text-label\">POS</th><th class=\"c-table__head-cell u-text-label\">Morphology</th><th class=\"c-table__head-cell u-text-label\">Description</th></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$(</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=brck</code></td><td class=\"c-table__cell u-text u-text-small\">other sentence-internal punctuation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$,</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=comm</code></td><td class=\"c-table__cell u-text u-text-small\">comma</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>$.</code></td><td class=\"c-table__cell u-text\"><code>PUNCT</code></td><td class=\"c-table__cell u-text\"> <code>PunctType=peri</code></td><td class=\"c-table__cell u-text u-text-small\">sentence-final punctuation mark</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADJA</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">adjective, attributive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADJD</code></td><td class=\"c-table__cell u-text\"><code>ADJ</code></td><td class=\"c-table__cell u-text\"> <code>Variant=short</code></td><td class=\"c-table__cell u-text u-text-small\">adjective, adverbial or predicative</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell 
u-text\"></td><td class=\"c-table__cell u-text u-text-small\">adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APPO</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=post</code></td><td class=\"c-table__cell u-text u-text-small\">postposition</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APPR</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=prep</code></td><td class=\"c-table__cell u-text u-text-small\">preposition; circumposition left</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APPRART</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=prep</code> <code>PronType=art</code></td><td class=\"c-table__cell u-text u-text-small\">preposition with article</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>APZR</code></td><td class=\"c-table__cell u-text\"><code>ADP</code></td><td class=\"c-table__cell u-text\"> <code>AdpType=circ</code></td><td class=\"c-table__cell u-text u-text-small\">circumposition right</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ART</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=art</code></td><td class=\"c-table__cell u-text u-text-small\">definite or indefinite article</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>CARD</code></td><td class=\"c-table__cell u-text\"><code>NUM</code></td><td class=\"c-table__cell u-text\"> <code>NumType=card</code></td><td class=\"c-table__cell u-text u-text-small\">cardinal number</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>FM</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"> 
<code>Foreign=yes</code></td><td class=\"c-table__cell u-text u-text-small\">foreign language material</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>ITJ</code></td><td class=\"c-table__cell u-text\"><code>INTJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">interjection</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KOKOM</code></td><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text\"> <code>ConjType=comp</code></td><td class=\"c-table__cell u-text u-text-small\">comparative conjunction</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KON</code></td><td class=\"c-table__cell u-text\"><code>CONJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">coordinate conjunction</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KOUI</code></td><td class=\"c-table__cell u-text\"><code>SCONJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">subordinate conjunction with \"zu\" and infinitive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>KOUS</code></td><td class=\"c-table__cell u-text\"><code>SCONJ</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">subordinate conjunction with sentence</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NE</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">proper noun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NNE</code></td><td class=\"c-table__cell u-text\"><code>PROPN</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">proper noun</td></tr><tr 
class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>NN</code></td><td class=\"c-table__cell u-text\"><code>NOUN</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">noun, singular or mass</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PAV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">pronominal adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PROAV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">pronominal adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PDAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">attributive demonstrative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PDS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=dem</code></td><td class=\"c-table__cell u-text u-text-small\">substituting demonstrative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PIAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=ind</code> <code>neg</code> <code>tot</code></td><td class=\"c-table__cell u-text u-text-small\">attributive indefinite pronoun without determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PIDAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>AdjType=pdt PronType=ind</code> <code>neg</code> 
<code>tot</code></td><td class=\"c-table__cell u-text u-text-small\">attributive indefinite pronoun with determiner</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PIS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=ind</code> <code>neg</code> <code>tot</code></td><td class=\"c-table__cell u-text u-text-small\">substituting indefinite pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PPER</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code></td><td class=\"c-table__cell u-text u-text-small\">non-reflexive personal pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PPOSAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>Poss=yes</code> <code>PronType=prs</code></td><td class=\"c-table__cell u-text u-text-small\">attributive possessive pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PPOSS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=rel</code></td><td class=\"c-table__cell u-text u-text-small\">substituting possessive pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRELAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=rel</code></td><td class=\"c-table__cell u-text u-text-small\">attributive relative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PRELS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=rel</code></td><td class=\"c-table__cell u-text u-text-small\">substituting relative pronoun</td></tr><tr class=\"c-table__row\"><td 
class=\"c-table__cell u-text\"><code>PRF</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=prs</code> <code>Reflex=yes</code></td><td class=\"c-table__cell u-text u-text-small\">reflexive personal pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKA</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">particle with adjective or adverb</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKANT</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=res</code></td><td class=\"c-table__cell u-text u-text-small\">answer particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKNEG</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>Negative=yes</code></td><td class=\"c-table__cell u-text u-text-small\">negative particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKVZ</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=vbp</code></td><td class=\"c-table__cell u-text u-text-small\">separable verbal particle</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PTKZU</code></td><td class=\"c-table__cell u-text\"><code>PART</code></td><td class=\"c-table__cell u-text\"> <code>PartType=inf</code></td><td class=\"c-table__cell u-text u-text-small\">\"zu\" before infinitive</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PWAT</code></td><td class=\"c-table__cell u-text\"><code>DET</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code></td><td class=\"c-table__cell u-text u-text-small\">attributive interrogative 
pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PWAV</code></td><td class=\"c-table__cell u-text\"><code>ADV</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code></td><td class=\"c-table__cell u-text u-text-small\">adverbial interrogative or relative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>PWS</code></td><td class=\"c-table__cell u-text\"><code>PRON</code></td><td class=\"c-table__cell u-text\"> <code>PronType=int</code></td><td class=\"c-table__cell u-text u-text-small\">substituting interrogative pronoun</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>TRUNC</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"> <code>Hyph=yes</code></td><td class=\"c-table__cell u-text u-text-small\">word remnant</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAFIN</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>Mood=ind</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">finite verb, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAIMP</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>Mood=imp</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">imperative, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAINF</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VAPP</code></td><td class=\"c-table__cell u-text\"><code>AUX</code></td><td class=\"c-table__cell u-text\"> <code>Aspect=perf</code> 
<code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">perfect participle, auxiliary</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VMFIN</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Mood=ind</code> <code>VerbForm=fin</code> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">finite verb, modal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VMINF</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=fin</code> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive, modal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VMPP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Aspect=perf</code> <code>VerbForm=part</code> <code>VerbType=mod</code></td><td class=\"c-table__cell u-text u-text-small\">perfect participle, modal</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVFIN</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Mood=ind</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">finite verb, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVIMP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Mood=imp</code> <code>VerbForm=fin</code></td><td class=\"c-table__cell u-text u-text-small\">imperative, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVINF</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive, full</td></tr><tr 
class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVIZU</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>VerbForm=inf</code></td><td class=\"c-table__cell u-text u-text-small\">infinitive with \"zu\", full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>VVPP</code></td><td class=\"c-table__cell u-text\"><code>VERB</code></td><td class=\"c-table__cell u-text\"> <code>Aspect=perf</code> <code>VerbForm=part</code></td><td class=\"c-table__cell u-text u-text-small\">perfect participle, full</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>XY</code></td><td class=\"c-table__cell u-text\"><code>X</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">non-word containing non-letter</td></tr><tr class=\"c-table__row\"><td class=\"c-table__cell u-text\"><code>SP</code></td><td class=\"c-table__cell u-text\"><code>SPACE</code></td><td class=\"c-table__cell u-text\"></td><td class=\"c-table__cell u-text u-text-small\">space</td></tr></tbody></table>", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d063dd4da6edcb8569e16c912721a1ffc128f161
12,840
ipynb
Jupyter Notebook
jupyter/spark_nlp_model.ipynb
akashmavle5/--akash
cfb21d5a943a3d3fcae3c08921e7323a52761acd
[ "Apache-2.0" ]
null
null
null
jupyter/spark_nlp_model.ipynb
akashmavle5/--akash
cfb21d5a943a3d3fcae3c08921e7323a52761acd
[ "Apache-2.0" ]
null
null
null
jupyter/spark_nlp_model.ipynb
akashmavle5/--akash
cfb21d5a943a3d3fcae3c08921e7323a52761acd
[ "Apache-2.0" ]
null
null
null
34.423592
224
0.51285
[ [ [ "![JohnSnowLabs](https://nlp.johnsnowlabs.com/assets/images/logo.png)", "_____no_output_____" ], [ "# Spark NLP Quick Start\n### How to use Spark NLP pretrained pipelines", "_____no_output_____" ], [ "[![Open In Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/JohnSnowLabs/spark-nlp-workshop/blob/master/jupyter/quick_start_google_colab.ipynb)", "_____no_output_____" ], [ "We will first set up the runtime environment and then load pretrained Entity Recognition model and Sentiment analysis model and give it a quick test. Feel free to test the models on your own sentences / datasets.", "_____no_output_____" ] ], [ [ "!wget http://setup.johnsnowlabs.com/colab.sh -O - | bash", "--2021-06-03 06:56:33-- http://setup.johnsnowlabs.com/colab.sh\nResolving setup.johnsnowlabs.com (setup.johnsnowlabs.com)... 51.158.130.125\nConnecting to setup.johnsnowlabs.com (setup.johnsnowlabs.com)|51.158.130.125|:80... connected.\nHTTP request sent, awaiting response... 302 Moved Temporarily\nLocation: https://raw.githubusercontent.com/JohnSnowLabs/spark-nlp/master/scripts/colab_setup.sh [following]\n--2021-06-03 06:56:34-- https://raw.githubusercontent.com/JohnSnowLabs/spark-nlp/master/scripts/colab_setup.sh\nResolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.109.133, 185.199.111.133, 185.199.110.133, ...\nConnecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.109.133|:443... connected.\nHTTP request sent, awaiting response... 
200 OK\nLength: 1608 (1.6K) [text/plain]\nSaving to: โ€˜STDOUTโ€™\n\n- 100%[===================>] 1.57K --.-KB/s in 0s \n\n2021-06-03 06:56:34 (34.0 MB/s) - written to stdout [1608/1608]\n\nsetup Colab for PySpark 3.0.2 and Spark NLP 3.0.3\nGet:1 https://cloud.r-project.org/bin/linux/ubuntu bionic-cran40/ InRelease [3,626 B]\nIgn:2 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 InRelease\nGet:3 http://security.ubuntu.com/ubuntu bionic-security InRelease [88.7 kB]\nGet:4 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic InRelease [15.9 kB]\nIgn:5 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 InRelease\nHit:6 https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 Release\nHit:7 https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 Release\nHit:9 http://archive.ubuntu.com/ubuntu bionic InRelease\nGet:11 http://archive.ubuntu.com/ubuntu bionic-updates InRelease [88.7 kB]\nHit:12 http://ppa.launchpad.net/cran/libgit2/ubuntu bionic InRelease\nGet:13 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu bionic InRelease [15.9 kB]\nGet:14 http://security.ubuntu.com/ubuntu bionic-security/restricted amd64 Packages [424 kB]\nGet:15 http://archive.ubuntu.com/ubuntu bionic-backports InRelease [74.6 kB]\nGet:16 http://archive.ubuntu.com/ubuntu bionic-updates/restricted amd64 Packages [478 kB]\nHit:17 http://ppa.launchpad.net/graphics-drivers/ppa/ubuntu bionic InRelease\nGet:18 http://security.ubuntu.com/ubuntu bionic-security/universe amd64 Packages [1,414 kB]\nGet:19 http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic/main Sources [1,770 kB]\nGet:20 http://security.ubuntu.com/ubuntu bionic-security/main amd64 Packages [2,154 kB]\nGet:21 http://archive.ubuntu.com/ubuntu bionic-updates/main amd64 Packages [2,615 kB]\nGet:22 http://archive.ubuntu.com/ubuntu bionic-updates/universe amd64 Packages [2,184 kB]\nGet:23 
http://ppa.launchpad.net/c2d4u.team/c2d4u4.0+/ubuntu bionic/main amd64 Packages [906 kB]\nGet:24 http://ppa.launchpad.net/deadsnakes/ppa/ubuntu bionic/main amd64 Packages [40.9 kB]\nFetched 12.3 MB in 7s (1,728 kB/s)\nReading package lists... Done\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 204.8MB 61kB/s \n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 51kB 6.0MB/s \n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 204kB 36.1MB/s \n\u001b[?25h Building wheel for pyspark (setup.py) ... \u001b[?25l\u001b[?25hdone\n" ], [ "import sparknlp\nspark = sparknlp.start()\n\nprint(\"Spark NLP version: {}\".format(sparknlp.version()))\nprint(\"Apache Spark version: {}\".format(spark.version))", "Spark NLP version: 3.0.3\nApache Spark version: 3.0.2\n" ], [ "from sparknlp.pretrained import PretrainedPipeline ", "_____no_output_____" ] ], [ [ "Let's use Spark NLP pre-trained pipeline for `named entity recognition`", "_____no_output_____" ] ], [ [ "pipeline = PretrainedPipeline('recognize_entities_dl', 'en')", "recognize_entities_dl download started this may take some time.\nApprox size to download 160.1 MB\n[OK!]\n" ], [ "result = pipeline.annotate('President Biden represented Delaware for 36 years in the U.S. 
Senate before becoming the 47th Vice President of the United States.') ", "_____no_output_____" ], [ "print(result['ner'])\nprint(result['entities'])", "['O', 'B-PER', 'O', 'B-LOC', 'O', 'O', 'O', 'O', 'O', 'B-LOC', 'O', 'B-ORG', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-LOC', 'I-LOC', 'O']\n['Biden', 'Delaware', 'U.S', 'Senate', 'United States']\n" ] ], [ [ "Let's try another Spark NLP pre-trained pipeline for `named entity recognition`", "_____no_output_____" ] ], [ [ "pipeline = PretrainedPipeline('onto_recognize_entities_bert_tiny', 'en')\n\nresult = pipeline.annotate(\"Johnson first entered politics when elected in 2001 as a member of Parliament. He then served eight years as the mayor of London, from 2008 to 2016, before rejoining Parliament.\")\n\nprint(result['ner'])\nprint(result['entities'])", "onto_recognize_entities_bert_tiny download started this may take some time.\nApprox size to download 30.2 MB\n[OK!]\n['B-PERSON', 'B-ORDINAL', 'O', 'O', 'O', 'O', 'O', 'B-DATE', 'O', 'O', 'O', 'O', 'B-ORG', 'O', 'O', 'O', 'B-DATE', 'I-DATE', 'O', 'O', 'O', 'O', 'B-GPE', 'O', 'B-DATE', 'O', 'B-DATE', 'O', 'O', 'O', 'B-ORG']\n['Johnson', 'first', '2001', 'Parliament.', 'eight years', 'London,', '2008', '2016', 'Parliament.']\n" ] ], [ [ "Let's use Spark NLP pre-trained pipeline for `sentiment` analysis", "_____no_output_____" ] ], [ [ "pipeline = PretrainedPipeline('analyze_sentimentdl_glove_imdb', 'en')", "analyze_sentimentdl_glove_imdb download started this may take some time.\nApprox size to download 155.3 MB\n[OK!]\n" ], [ "result = pipeline.annotate(\"Harry Potter is a great movie.\")", "_____no_output_____" ], [ "print(result['sentiment'])", "['pos']\n" ] ], [ [ "### Please check our [Models Hub](https://nlp.johnsnowlabs.com/models) for more pretrained models and pipelines! ๐Ÿ˜Š ", "_____no_output_____" ] ], [ [ "", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ] ]
d063e5bcce03d3c35d6f90ff41993985148b21df
4,788
ipynb
Jupyter Notebook
test/ipynb/groovy/TableMenuTest.ipynb
ssadedin/beakerx
34479b07d2dfdf1404692692f483faf0251632c3
[ "Apache-2.0" ]
1,491
2017-03-30T03:05:05.000Z
2022-03-27T04:26:02.000Z
test/ipynb/groovy/TableMenuTest.ipynb
ssadedin/beakerx
34479b07d2dfdf1404692692f483faf0251632c3
[ "Apache-2.0" ]
3,268
2015-01-01T00:10:26.000Z
2017-05-05T18:59:41.000Z
test/ipynb/groovy/TableMenuTest.ipynb
ssadedin/beakerx
34479b07d2dfdf1404692692f483faf0251632c3
[ "Apache-2.0" ]
287
2017-04-03T01:30:06.000Z
2022-03-17T06:09:15.000Z
23.130435
104
0.553258
[ [ [ "empty" ] ] ]
[ "empty" ]
[ [ "empty" ] ]
d0643a20c5e91fa8a2a1902b0f62ac611d1e5814
45,339
ipynb
Jupyter Notebook
chapter2/2.3.2-text_classification.ipynb
wangxingda/Tensorflow-Handbook
97987e62da5a24dac6169fbacf1c3d4c041b3339
[ "Apache-2.0" ]
22
2019-10-12T06:38:05.000Z
2022-02-24T03:10:29.000Z
chapter2/2.3.2-text_classification.ipynb
wangxingda/tensorflow-handbook
97987e62da5a24dac6169fbacf1c3d4c041b3339
[ "Apache-2.0" ]
null
null
null
chapter2/2.3.2-text_classification.ipynb
wangxingda/tensorflow-handbook
97987e62da5a24dac6169fbacf1c3d4c041b3339
[ "Apache-2.0" ]
6
2019-11-29T15:14:12.000Z
2020-06-30T03:59:03.000Z
50.488864
15,964
0.677717
[ [ [ "# Text classification of movie reviews", "_____no_output_____" ], [ "\nThis notebook classifies movie reviews as *positive* or *negative* using the text of the review. This is a *binary* (two-class) classification problem, an important and widely applicable kind of machine learning problem.\n\nWe'll use the [IMDB dataset](https://tensorflow.google.cn/api_docs/python/tf/keras/datasets/imdb) from the [Internet Movie Database](https://www.imdb.com/), which contains the text of 50,000 movie reviews. 25,000 reviews from this dataset are used for training, and the other 25,000 for testing. The training and testing sets are *balanced*, meaning they contain an equal number of positive and negative reviews.\n\nThis notebook uses [tf.keras](https://tensorflow.google.cn/guide/keras), a high-level API to build and train models in TensorFlow. For a more advanced text classification tutorial using `tf.keras`, see the [MLCC Text Classification Guide](https://developers.google.com/machine-learning/guides/text-classification/).", "_____no_output_____" ] ], [ [ "from __future__ import absolute_import, division, print_function, unicode_literals\n\ntry:\n # Colab only\n %tensorflow_version 2.x\nexcept Exception:\n pass\nimport tensorflow as tf\nfrom tensorflow import keras\n\nimport numpy as np\n\nprint(tf.__version__)", "2.0.0\n" ] ], [ [ "## Download the IMDB dataset\n\nThe IMDB dataset comes packaged with TensorFlow. It has already been preprocessed so that the reviews (sequences of words) have been converted to sequences of integers, where each integer represents a specific word in a dictionary.\n\nThe following code downloads the IMDB dataset to your machine (or copies it from the cache if you've already downloaded it):", "_____no_output_____" ] ], [ [ "imdb = keras.datasets.imdb\n\n(train_data, train_labels), (test_data, test_labels) = imdb.load_data(num_words=10000)", "_____no_output_____" ] ], [ [ "The argument `num_words=10000` keeps the 10,000 most frequently occurring words in the training data. The rare words are discarded to keep the size of the data manageable.\n", "_____no_output_____" ], [ "## Explore the data\n\nLet's take a moment to understand the format of the data. The dataset comes preprocessed: each example is an array of integers representing the words of the movie review. Each label is an integer value of either 0 or 1, where 0 is a negative review and 1 is a positive review.", "_____no_output_____" ] ], [ [ "print(\"Training entries: {}, labels: {}\".format(len(train_data), len(train_labels)))", "Training entries: 25000, labels: 25000\n" ] ], [ [ "The text of reviews has been converted to integers, where each integer represents a specific word in a dictionary. Here's what the first review looks like:", "_____no_output_____" ] ], [ [ "print(train_data[0])", "[1, 14, 22, 16, 43, 530, 973, 1622, 1385, 65, 458, 4468, 66, 3941, 4, 173, 36, 256, 5, 25, 100, 43, 838, 112, 50, 670, 2, 9, 35, 480, 284, 5, 150, 4, 172, 112, 167, 2, 336, 385, 39, 4, 172, 4536, 1111, 17, 546, 38, 13, 447, 4, 192, 50, 16, 6, 147, 2025, 19, 14, 22, 4, 1920, 4613, 469, 4, 22, 71, 87, 12, 16, 43, 530, 38, 76, 15, 13, 1247, 4, 22, 17, 515, 17, 12, 16, 626, 18, 2, 5, 62, 386, 12, 8, 316, 8, 106, 5, 4, 2223, 5244, 16, 480, 66, 3785, 33, 4, 130, 12, 16, 38, 619, 5, 25, 124, 51, 36, 135, 48, 25, 1415, 33, 6, 22, 12, 215, 28, 77, 52, 5, 14, 407, 16, 82, 2, 8, 4, 107, 117, 5952, 15, 256, 4, 2, 7, 3766, 5, 723, 36, 71, 43, 530, 476, 26, 400, 317, 46, 7, 4, 2, 1029, 13, 104, 88, 4, 381, 15, 297, 98, 32, 2071, 56, 26, 141, 6, 194, 7486, 18, 4, 226, 22, 21, 134, 476, 26, 480, 5, 144, 30, 5535, 18, 51, 36, 28, 224, 92, 25, 104, 4, 226, 65, 16, 38, 1334, 88, 12, 16, 283, 5, 16, 4472, 113, 103, 32, 15, 16, 5345, 19, 178, 32]\n" ] ], [ [ "Movie reviews may be different lengths. The following code shows the number of words in the first and second reviews. Since inputs to a neural network must be the same length, we'll need to resolve this later.", "_____no_output_____" ] ], [ [ "len(train_data[0]), len(train_data[1])", "_____no_output_____" ] ], [ [ "### Convert the integers back to words\n\nIt may be useful to know how to convert integers back to text. Here we'll create a helper function to query a dictionary object that contains the integer-to-string mapping:", "_____no_output_____" ] ], [ [ "# A dictionary mapping words to an integer index\nword_index = imdb.get_word_index()\n\n# The first indices are reserved\nword_index = {k:(v+3) for k,v in word_index.items()}\nword_index[\"<PAD>\"] = 0\nword_index[\"<START>\"] = 1\nword_index[\"<UNK>\"] = 2 # unknown\nword_index[\"<UNUSED>\"] = 3\n\nreverse_word_index = dict([(value, key) for (key, value) in word_index.items()])\n\ndef decode_review(text):\n return ' '.join([reverse_word_index.get(i, '?') for i in text])", "Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/imdb_word_index.json\n1646592/1641221 [==============================] - 0s 0us/step\n" ] ], [ [ "Now we can use the `decode_review` function to display the text of the first review:", "_____no_output_____" ] ], [ [ "decode_review(train_data[0])", "_____no_output_____" ] ], [ [ "## Prepare the data\n\nThe reviews (the arrays of integers) must be converted to tensors before being fed into the neural network. This conversion can be done in a couple of ways:\n\n* Convert the arrays into vectors of 0s and 1s indicating word occurrence, similar to one-hot encoding. For example, the sequence [3, 5] would become a 10,000-dimensional vector that is all zeros except for indices 3 and 5, which are ones. Then, make this the first layer in the network: a Dense layer that can handle floating point vector data. This approach is memory intensive, though, requiring a matrix of size `num_words * num_reviews`.\n\n* Alternatively, we can pad the arrays so they all have the same length, then create an integer tensor of shape `max_length * num_reviews`. We can use an embedding layer capable of handling this shape as the first layer in our network.\n\nIn this tutorial, we'll use the second approach.\n\nSince the movie reviews must be the same length, we will use the [pad_sequences](https://tensorflow.google.cn/api_docs/python/tf/keras/preprocessing/sequence/pad_sequences) function to standardize the lengths:", "_____no_output_____" ] ], [ [ "train_data = keras.preprocessing.sequence.pad_sequences(train_data,\n value=word_index[\"<PAD>\"],\n padding='post',\n maxlen=256)\n\ntest_data = keras.preprocessing.sequence.pad_sequences(test_data,\n value=word_index[\"<PAD>\"],\n padding='post',\n maxlen=256)", "_____no_output_____" ] ], [ [ "Now let's look at the length of the examples:", "_____no_output_____" ] ], [ [ "len(train_data[0]), len(train_data[1])", "_____no_output_____" ] ], [ [ "And inspect the (now padded) first review:", "_____no_output_____" ] ], [ [ "print(train_data[0])", "[ 1 14 22 16 43 530 973 1622 1385 65 458 4468 66 3941\n 4 173 36 256 5 25 100 43 838 112 50 670 2 9\n 35 480 284 5 150 4 172 112 167 2 336 385 39 4\n 172 4536 1111 17 546 38 13 447 4 192 50 16 6 147\n 2025 19 14 22 4 1920 4613 469 4 22 71 87 12 16\n 43 530 38 76 15 13 1247 4 22 17 515 17 12 16\n 626 18 2 5 62 386 12 8 316 8 106 5 4 2223\n 5244 16 480 66 3785 33 4 130 12 16 38 619 5 25\n 124 51 36 135 48 25 1415 33 6 22 12 215 28 77\n 52 5 14 407 16 82 2 8 4 107 117 5952 15 256\n 4 2 7 3766 5 723 36 71 43 530 476 26 400 317\n 46 7 4 2 1029 13 104 88 4 381 15 297 98 32\n 2071 56 26 141 6 194 7486 18 4 226 22 21 134 476\n 26 480 5 144 30 5535 18 51 36 28 224 92 25 104\n 4 226 65 16 38 1334 88 12 16 283 5 16 4472 113\n 103 32 15 16 5345 19 178 32 0 0 0 0 0 0\n 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n 0 0 0 0 0 0 0 0 0 0 0 0 0 0\n 0 0 0 0]\n" ] ], [ [ "## Build the model\n\nThe neural network is created by stacking layers. This requires two main architectural decisions:\n\n* How many layers to use in the model?\n* How many *hidden units* to use for each layer?\n\nIn this example, the input data consists of an array of word indices. The labels to predict are either 0 or 1. Let's build a model for this problem:", "_____no_output_____" ] ], [ [ "# Input shape is the vocabulary count used for the movie reviews (10,000 words)\nvocab_size = 10000\n\nmodel = keras.Sequential()\nmodel.add(keras.layers.Embedding(vocab_size, 16))\nmodel.add(keras.layers.GlobalAveragePooling1D())\nmodel.add(keras.layers.Dense(16, activation='relu'))\nmodel.add(keras.layers.Dense(1, activation='sigmoid'))\n\nmodel.summary()", "Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nembedding (Embedding) (None, None, 16) 160000 \n_________________________________________________________________\nglobal_average_pooling1d (Gl (None, 16) 0 \n_________________________________________________________________\ndense (Dense) (None, 16) 272 \n_________________________________________________________________\ndense_1 (Dense) (None, 1) 17 \n=================================================================\nTotal params: 160,289\nTrainable params: 160,289\nNon-trainable params: 0\n_________________________________________________________________\n" ] ], [ [ "The layers are stacked sequentially to build the classifier:\n\n1. The first layer is an `Embedding` layer. This layer takes the integer-encoded vocabulary and looks up the embedding vector for each word index. These vectors are learned as the model trains. The vectors add a dimension to the output array. The resulting dimensions are: `(batch, sequence, embedding)`.\n2. Next, a `GlobalAveragePooling1D` layer returns a fixed-length output vector for each example by averaging over the sequence dimension. This allows the model to handle input of variable length in the simplest way possible.\n3. This fixed-length output vector is piped through a fully connected (`Dense`) layer with 16 hidden units.\n4. The last layer is densely connected with a single output node. Using the `sigmoid` activation function, this value is a float between 0 and 1, representing a probability, or confidence level.", "_____no_output_____" ], [ "### Hidden units\n\nThe above model has two intermediate or \"hidden\" layers between the input and output. The number of outputs (units, nodes, or neurons) is the dimension of the representational space for the layer. In other words, the amount of freedom the network is allowed when learning an internal representation.\n\nIf a model has more hidden units (a higher-dimensional representation space) and/or more layers, then the network can learn more complex representations. However, it makes the network more computationally expensive and may lead to learning unwanted patterns, patterns that improve performance on training data but not on the test data. This is called *overfitting*, and we'll explore it later.", "_____no_output_____" ], [ "### Loss function and optimizer\n\nA model needs a loss function and an optimizer for training. Since this is a binary classification problem and the model outputs a probability (a single-unit layer with a sigmoid activation), we'll use the `binary_crossentropy` loss function.\n\nThis isn't the only choice for a loss function; you could, for instance, choose `mean_squared_error`. But, generally, `binary_crossentropy` is better for dealing with probabilities: it measures the \"distance\" between probability distributions, or in our case, between the ground-truth distribution and the predictions.\n\nLater, when we are exploring regression problems (say, predicting the price of a house), we will see how to use another loss function called mean squared error.\n\nNow, configure the model to use an optimizer and a loss function:", "_____no_output_____" ] ], [ [ "model.compile(optimizer='adam',\n loss='binary_crossentropy',\n metrics=['accuracy'])", "_____no_output_____" ] ], [ [ "## Create a validation set\n\nWhen training, we want to check the accuracy of the model on data it hasn't seen before. Create a *validation set* by setting apart 10,000 examples from the original training data. (Why not use the testing set now? Our goal is to develop and tune our model using only the training data, then use the test data just once to evaluate our accuracy.)", "_____no_output_____" ] ], [ [ "x_val = train_data[:10000]\npartial_x_train = train_data[10000:]\n\ny_val = train_labels[:10000]\npartial_y_train = train_labels[10000:]", "_____no_output_____" ] ], [ [ "## Train the model\n\nTrain the model for 40 epochs in mini-batches of 512 samples. This is 40 iterations over all samples in the `x_train` and `y_train` tensors. While training, monitor the model's loss and accuracy on the 10,000 samples from the validation set:", "_____no_output_____" ] ], [ [ "history = model.fit(partial_x_train,\n partial_y_train,\n epochs=40,\n batch_size=512,\n validation_data=(x_val, y_val),\n verbose=1)", "Train on 15000 samples, validate on 10000 samples\nEpoch 1/40\n15000/15000 [==============================] - 1s 99us/sample - loss: 0.6921 - accuracy: 0.5437 - val_loss: 0.6903 - val_accuracy: 0.6241\nEpoch 2/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.6870 - accuracy: 0.7057 - val_loss: 0.6833 - val_accuracy: 0.7018\nEpoch 3/40\n15000/15000 [==============================] - 1s 54us/sample - loss: 0.6760 - accuracy: 0.7454 - val_loss: 0.6694 - val_accuracy: 0.7501\nEpoch 4/40\n15000/15000 [==============================] - 1s 53us/sample - loss: 0.6563 - accuracy: 0.7659 - val_loss: 0.6467 - val_accuracy: 0.7571\nEpoch 5/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.6270 - accuracy: 0.7837 - val_loss: 0.6155 - val_accuracy: 0.7793\nEpoch 6/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.5882 - accuracy: 0.7993 - val_loss: 0.5762 - val_accuracy: 0.7960\nEpoch 7/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.5420 - accuracy: 0.8219 - val_loss: 0.5336 - val_accuracy: 0.8106\nEpoch 
8/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.4955 - accuracy: 0.8367 - val_loss: 0.4930 - val_accuracy: 0.8262\nEpoch 9/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.4507 - accuracy: 0.8522 - val_loss: 0.4542 - val_accuracy: 0.8393\nEpoch 10/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.4107 - accuracy: 0.8667 - val_loss: 0.4218 - val_accuracy: 0.8478\nEpoch 11/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.3766 - accuracy: 0.8779 - val_loss: 0.3957 - val_accuracy: 0.8551\nEpoch 12/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.3483 - accuracy: 0.8843 - val_loss: 0.3741 - val_accuracy: 0.8613\nEpoch 13/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.3238 - accuracy: 0.8925 - val_loss: 0.3573 - val_accuracy: 0.8667\nEpoch 14/40\n15000/15000 [==============================] - 1s 54us/sample - loss: 0.3027 - accuracy: 0.8977 - val_loss: 0.3439 - val_accuracy: 0.8678\nEpoch 15/40\n15000/15000 [==============================] - 1s 54us/sample - loss: 0.2850 - accuracy: 0.9032 - val_loss: 0.3318 - val_accuracy: 0.8737\nEpoch 16/40\n15000/15000 [==============================] - 1s 56us/sample - loss: 0.2695 - accuracy: 0.9071 - val_loss: 0.3231 - val_accuracy: 0.8744\nEpoch 17/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2549 - accuracy: 0.9124 - val_loss: 0.3151 - val_accuracy: 0.8790\nEpoch 18/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2421 - accuracy: 0.9166 - val_loss: 0.3086 - val_accuracy: 0.8807\nEpoch 19/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2307 - accuracy: 0.9201 - val_loss: 0.3035 - val_accuracy: 0.8794\nEpoch 20/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.2201 - accuracy: 0.9243 - val_loss: 0.2994 - val_accuracy: 0.8802\nEpoch 21/40\n15000/15000 
[==============================] - 1s 52us/sample - loss: 0.2103 - accuracy: 0.9271 - val_loss: 0.2953 - val_accuracy: 0.8825\nEpoch 22/40\n15000/15000 [==============================] - 1s 53us/sample - loss: 0.2014 - accuracy: 0.9306 - val_loss: 0.2926 - val_accuracy: 0.8834\nEpoch 23/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.1923 - accuracy: 0.9352 - val_loss: 0.2901 - val_accuracy: 0.8848\nEpoch 24/40\n15000/15000 [==============================] - 1s 53us/sample - loss: 0.1845 - accuracy: 0.9395 - val_loss: 0.2907 - val_accuracy: 0.8852\nEpoch 25/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.1770 - accuracy: 0.9426 - val_loss: 0.2875 - val_accuracy: 0.8838\nEpoch 26/40\n15000/15000 [==============================] - 1s 52us/sample - loss: 0.1696 - accuracy: 0.9459 - val_loss: 0.2870 - val_accuracy: 0.8849\nEpoch 27/40\n15000/15000 [==============================] - 1s 58us/sample - loss: 0.1628 - accuracy: 0.9492 - val_loss: 0.2868 - val_accuracy: 0.8849\nEpoch 28/40\n15000/15000 [==============================] - 1s 65us/sample - loss: 0.1563 - accuracy: 0.9513 - val_loss: 0.2876 - val_accuracy: 0.8842\nEpoch 29/40\n15000/15000 [==============================] - 1s 65us/sample - loss: 0.1505 - accuracy: 0.9534 - val_loss: 0.2881 - val_accuracy: 0.8849\nEpoch 30/40\n15000/15000 [==============================] - 1s 62us/sample - loss: 0.1450 - accuracy: 0.9553 - val_loss: 0.2878 - val_accuracy: 0.8857\nEpoch 31/40\n15000/15000 [==============================] - 1s 60us/sample - loss: 0.1389 - accuracy: 0.9584 - val_loss: 0.2879 - val_accuracy: 0.8862\nEpoch 32/40\n15000/15000 [==============================] - 1s 62us/sample - loss: 0.1347 - accuracy: 0.9595 - val_loss: 0.2907 - val_accuracy: 0.8849\nEpoch 33/40\n15000/15000 [==============================] - 1s 61us/sample - loss: 0.1286 - accuracy: 0.9626 - val_loss: 0.2908 - val_accuracy: 0.8859\nEpoch 34/40\n15000/15000 
[==============================] - 1s 59us/sample - loss: 0.1244 - accuracy: 0.9645 - val_loss: 0.2926 - val_accuracy: 0.8864\nEpoch 35/40\n15000/15000 [==============================] - 1s 59us/sample - loss: 0.1192 - accuracy: 0.9664 - val_loss: 0.2945 - val_accuracy: 0.8850\nEpoch 36/40\n15000/15000 [==============================] - 1s 61us/sample - loss: 0.1149 - accuracy: 0.9688 - val_loss: 0.2959 - val_accuracy: 0.8847\nEpoch 37/40\n15000/15000 [==============================] - 1s 60us/sample - loss: 0.1107 - accuracy: 0.9699 - val_loss: 0.2998 - val_accuracy: 0.8833\nEpoch 38/40\n15000/15000 [==============================] - 1s 59us/sample - loss: 0.1065 - accuracy: 0.9703 - val_loss: 0.3007 - val_accuracy: 0.8844\nEpoch 39/40\n15000/15000 [==============================] - 1s 62us/sample - loss: 0.1029 - accuracy: 0.9722 - val_loss: 0.3042 - val_accuracy: 0.8827\nEpoch 40/40\n15000/15000 [==============================] - 1s 69us/sample - loss: 0.0995 - accuracy: 0.9736 - val_loss: 0.3074 - val_accuracy: 0.8817\n" ] ], [ [ "## Evaluate the model\n\nLet's see how the model performs. Two values will be returned: the loss (a number which represents our error; lower values are better) and the accuracy.", "_____no_output_____" ] ], [ [ "results = model.evaluate(test_data, test_labels, verbose=2)\n\nprint(results)", "25000/1 - 1s - loss: 0.3459 - accuracy: 0.8727\n[0.325805940823555, 0.87268]\n" ] ], [ [ "This fairly naive approach achieves an accuracy of about 87%. With more advanced approaches, the model's accuracy should get closer to 95%.", "_____no_output_____" ], [ "## Create a graph of accuracy and loss over time\n\n`model.fit()` returns a `History` object that contains a dictionary with everything that happened during training:", "_____no_output_____" ] ], [ [ "history_dict = history.history\nhistory_dict.keys()", "_____no_output_____" ] ], [ [ "There are four entries: one for each monitored metric during training and validation. We can use these to plot the training and validation loss and accuracy for comparison.", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt\n\nacc = history_dict['accuracy']\nval_acc = history_dict['val_accuracy']\nloss = history_dict['loss']\nval_loss = history_dict['val_loss']\n\nepochs = range(1, len(acc) + 1)\n\n# \"bo\" is for \"blue dot\"\nplt.plot(epochs, loss, 'bo', label='Training loss')\n# \"b\" is for \"solid blue line\"\nplt.plot(epochs, val_loss, 'b', label='Validation loss')\nplt.title('Training and validation loss')\nplt.xlabel('Epochs')\nplt.ylabel('Loss')\nplt.legend()\n\nplt.show()", "_____no_output_____" ], [ "plt.clf() # clear figure\n\nplt.plot(epochs, acc, 'bo', label='Training acc')\nplt.plot(epochs, val_acc, 'b', label='Validation acc')\nplt.title('Training and validation accuracy')\nplt.xlabel('Epochs')\nplt.ylabel('Accuracy')\nplt.legend()\n\nplt.show()", "_____no_output_____" ] ], [ [ "\nIn this plot, the dots represent the training loss and accuracy, and the solid lines are the validation loss and accuracy.\n\nNotice the training loss *decreases* with each epoch and the training accuracy *increases* with each epoch. This is expected when using gradient descent optimization: it should minimize the desired quantity on every iteration.\n\nThis isn't the case for the validation loss and accuracy: they seem to peak after about twenty epochs. This is an example of overfitting: the model performs better on the training data than it does on data it has never seen before. After this point, the model over-optimizes and learns representations *specific* to the training data that do not *generalize* to test data.\n\nFor this particular case, we could prevent overfitting by simply stopping the training after twenty or so epochs. Later, you'll see how to do this automatically with a callback.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ] ]
d06454dfa2d4595190cbf36062f5989e3148a977
16,185
ipynb
Jupyter Notebook
notebooks/00_just_plot_it.ipynb
NFAcademy/2021_course_dev-tacaswell
78df3044f46581f6c5276283ff825b26a00a3039
[ "MIT" ]
null
null
null
notebooks/00_just_plot_it.ipynb
NFAcademy/2021_course_dev-tacaswell
78df3044f46581f6c5276283ff825b26a00a3039
[ "MIT" ]
null
null
null
notebooks/00_just_plot_it.ipynb
NFAcademy/2021_course_dev-tacaswell
78df3044f46581f6c5276283ff825b26a00a3039
[ "MIT" ]
null
null
null
31.125
514
0.617547
[ [ [ "# Just Plot It!", "_____no_output_____" ], [ "## Introduction", "_____no_output_____" ], [ "### The System", "_____no_output_____" ], [ "In this course we will work with a set of \"experimental\" data to illustrate going from \"raw\" measurement (or simulation) data through exploratory visualization to an (almost) paper ready figure.\n\nIn this scenario, we have fabricated (or simulated) 25 cantilevers. There is some value (suggestively called \"control\") that varies between the cantilevers and we want to see how the properties of the cantilever are affect by \"control\".", "_____no_output_____" ], [ "To see what this will look like physically, take part a \"clicky\" pen. Hold one end of the spring in your fingers and flick the free end. \n\nOr just watch this cat:", "_____no_output_____" ] ], [ [ "from IPython.display import YouTubeVideo\nYouTubeVideo('4aTagDSnclk?start=19')", "_____no_output_____" ] ], [ [ "Springs, and our cantilevers, are part of a class of systems known as (Damped) Harmonic Oscillators. We are going to measure the natural frequency and damping rate we deflect each cantilever by the same amount and then observe the position as a function of time as the vibrations damp out.", "_____no_output_____" ], [ "### The Tools", "_____no_output_____" ], [ "We are going make use of: \n\n- [jupyter](https://jupyter.org)\n- [numpy](https://numpy.org)\n- [matplotlib](https://matplotlib.org)\n- [scipy](https://www.scipy.org/scipylib/index.html)\n- [xarray](http://xarray.pydata.org/en/stable/index.html)\n- [pandas](https://pandas.pydata.org/docs/)\n\nWe are only going to scratch the surface of what any of these libraries can do! For the purposes of this course we assume you know numpy and Matplotlib at least to the level of LINKS TO OTHER COURSES. We will only be using one aspect (least square fitting) from scipy so no prior familiarity is needed. 
Similarly, we will only be superficially making use of pandas and xarray to provide access to structured data. No prior familiarity is required and if you want to learn more see LINK TO OTHER COURSES.", "_____no_output_____" ] ], [ [ "# interactive figures, requires ipympl!\n%matplotlib widget\n#%matplotlib inline\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport pandas as pd\nimport scipy\nimport xarray as xa", "_____no_output_____" ] ], [ [ "### Philosophy", "_____no_output_____" ], [ "While this course uses Matplotlib for the visualization, the high-level lessons of this course are transferable to any plotting tools (in any language).\n\nAt its core, programming is the process of taking existing tools (libraries) and building new tools more fit to your purpose. This course will walk through a concrete example, starting with a pile of data and ending with a paper figure, of how to think about and design scientific visualization tools tuned to exactly *your* data and questions.", "_____no_output_____" ], [ "## The Data", "_____no_output_____" ], [ "### Accessing data\n\nAs a rule of thumb, I/O logic should be kept out of the inner loops of analysis or plotting. This will, in the medium term, lead to more re-usable and maintainable code. Remember your most frequent collaborator is yourself in 6 months. Be kind to your (future) self and write re-usable, maintainable, and understandable code now ;)\n\nIn this case, we have a data (simulation) function `get_data` that will simulate the experiment and returns to us an [`xarray.DataArray`](http://xarray.pydata.org/en/stable/quick-overview.html#create-a-dataarray). `xarray.DataArray` is (roughly) an N-dimensional numpy array that is enriched by the concept of coordinates and indices on the axes and meta-data. 
\n\n`xarray` has much more functionality than we will use in this course!", "_____no_output_____" ] ], [ [ "# not sure how else to get the helpers on the path!\nimport sys\nsys.path.append('../scripts')", "_____no_output_____" ], [ "from data_gen import get_data, fit", "_____no_output_____" ] ], [ [ "### First look", "_____no_output_____" ], [ "Using the function `get_data` we can pull an `xarray.DataArray` into our namespace and then use the html repr from xarray to get a first look at the data.", "_____no_output_____" ] ], [ [ "d = get_data(25)\nd", "_____no_output_____" ] ], [ [ "From this we can see that we have a, more-or-less, 2D array with 25 rows, each of which is a measurement that is a 4,112 point time series. Because this is a DataArray it also carries **coordinates** giving the value of **control** for each row and the time for each column.", "_____no_output_____" ], [ "If we pull out just one row we can see a single experimental measurement.", "_____no_output_____" ] ], [ [ "d[6]", "_____no_output_____" ] ], [ [ "We can see that the **control** coordinate now gives 1 value, but the **time** coordinate is still a vector. We can access these values via attribute access (which we will use later):", "_____no_output_____" ] ], [ [ "d[6].control", "_____no_output_____" ], [ "d[6].time", "_____no_output_____" ] ], [ [ "## The Plotting", "_____no_output_____" ], [ "### Plot it?\nLooking at (truncated) lists of numbers is not intuitive or informative for most people; to get a better sense of what this data looks like, let's plot it! We know that `Axes.plot` can plot multiple lines at once, so let's try naively throwing `d` at `ax.plot`!", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\nax.plot(d);", "_____no_output_____" ] ], [ [ "While this does look sort of cool, it is not *useful*. What has happened is that Matplotlib has looked at our `(25, 4_112)` array and said \"Clearly, you have a table that is 4k columns wide and 25 rows long. 
What you want is each column plotted!\". Thus, what we are seeing is \"The deflection at a fixed time as a function of cantilever ID number\". This plot does accurately reflect the data that we passed in, but this is a nearly meaningless plot!\n\nVisualization, just like writing, is a tool for communication and you need to think about the story you want to tell as you make the plots.", "_____no_output_____" ], [ "### Sidebar: Explicit vs Implicit Matplotlib API\n\nThere are two related but distinct APIs to use Matplotlib: the \"Explicit\" (nee \"Object Oriented\") and \"Implicit\" (nee \"pyplot/pylab\"). The Implicit API is implemented using the Explicit API; anything you can do with the Implicit API you can do with the Explicit API, but there is some functionality of the Explicit API that is not exposed through the Implicit API. It is also possible, but with one exception not suggested, to mix the two APIs.\n\nThe core conceptual difference is that in the Implicit API Matplotlib has a notion of the \"current figure\" and \"current axes\" that all of the calls are re-directed to. For example, the implementation of `plt.plot` (once you scroll past the docstring) is only 1 line:", "_____no_output_____" ] ], [ [ "?? plt.plot", "_____no_output_____" ] ], [ [ "While the Implicit API reduces the boilerplate required to get some things done and is convenient when working in a terminal, it comes at the cost of Matplotlib maintaining global state of which Axes is currently active! When scripting, this can quickly become a headache to manage.", "_____no_output_____" ], [ "When using Matplotlib with one of the GUI backends, we do need to, at the library level, keep track of some global state so that the plot windows remain responsive. 
If you are embedding Matplotlib in your own GUI application you are responsible for this, but when working at an IPython prompt, `pyplot` takes care of this for you.", "_____no_output_____" ], [ "This course is going to, with the exception of creating new figures, always use the Explicit API.", "_____no_output_____" ], [ "### Plot it!\n\nWhat we really want to see is the transpose of the above (a line per experiment as a function of time):", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\nax.plot(d.T);", "_____no_output_____" ] ], [ [ "Which is better! If we squint a bit (or zoom in if we are using `ipympl` or a GUI backend) we can sort of see each of the individual oscillators ringing down over time.", "_____no_output_____" ], [ "### Just one at a time", "_____no_output_____" ], [ "To make it easier to see, let's plot just one of the curves:", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\nax.plot(d[6]);", "_____no_output_____" ] ], [ [ "### Pass freshman physics", "_____no_output_____" ], [ "While we do have just one line on the axes and can see what is going on, this plot would, rightly, be given little-to-no credit if turned in as part of a freshman Physics lab! We do not have a meaningful value on the x-axis, no legend, and no axis labels!", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\nm = d[6]\nax.plot(m.time, m, label=f'control = {float(m.control):.1f}')\nax.set_xlabel('time (ms)')\nax.set_ylabel('displacement (mm)')\nax.legend();", "_____no_output_____" ] ], [ [ "At this point we have a minimally acceptable plot! It shows us one curve with axis labels (with units!) and a legend. 
With ", "_____no_output_____" ], [ "### sidebar: xarray plotting", "_____no_output_____" ], [ "Because xarray knows more about the structure of your data than a couple of numpy arrays in your local namespace or dictionary, it can make smarter choices about the automatic visualization:", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots()\nm.plot(ax=ax)", "_____no_output_____" ] ], [ [ "While this is helpful exploritory plotting, `xarray` makes some choices that make it difficult to compose plotting multiple data sets.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ] ]
d06454dfb1715cc2a28cb347906303323e0c830d
166,341
ipynb
Jupyter Notebook
08. Classification.ipynb
monocilindro/data_science
36098bb6b3e731a11315e58e0b09628fae86069d
[ "MIT" ]
41
2020-01-25T21:23:59.000Z
2022-02-22T19:48:15.000Z
08. Classification.ipynb
ichit/data_science
36098bb6b3e731a11315e58e0b09628fae86069d
[ "MIT" ]
null
null
null
08. Classification.ipynb
ichit/data_science
36098bb6b3e731a11315e58e0b09628fae86069d
[ "MIT" ]
38
2020-01-27T18:57:46.000Z
2022-03-05T00:33:45.000Z
231.995816
136,448
0.902934
[ [ [ "## 8. Classification\n\n[Data Science Playlist on YouTube](https://www.youtube.com/watch?v=VLKEj9EN2ew&list=PLLBUgWXdTBDg1Qgmwt4jKtVn9BWh5-zgy)\n[![Python Data Science](https://apmonitor.com/che263/uploads/Begin_Python/DataScience08.png)](https://www.youtube.com/watch?v=VLKEj9EN2ew&list=PLLBUgWXdTBDg1Qgmwt4jKtVn9BWh5-zgy \"Python Data Science\")\n\n**Classification** predicts *discrete labels (outcomes)* such as `yes`/`no`, `True`/`False`, or any number of discrete levels such as a letter from text recognition, or a word from speech recognition. There are two main methods for training classifiers: unsupervised and supervised learning. The difference between the two is that unsupervised learning does not use labels while supervised learning uses labels to build the classifier. The goal of unsupervised learning is to cluster input features but without labels to guide the grouping. ", "_____no_output_____" ], [ "![list](https://apmonitor.com/che263/uploads/Begin_Python/list.png)\n\n### Supervised Learning to Classify Numbers\n\nA dataset that is included with sklearn is a set of 1797 images of numbers that are 64 pixels (8x8) each. There are labels with each to indicate the correct answer. 
A Support Vector Classifier is trained on the first half of the images.", "_____no_output_____" ] ], [ [ "from sklearn import datasets, svm\nfrom sklearn.model_selection import train_test_split\nimport matplotlib.pyplot as plt\n%matplotlib inline\nimport numpy as np\n\n# train classifier\ndigits = datasets.load_digits()\nn_samples = len(digits.images)\ndata = digits.images.reshape((n_samples, -1))\nsvc = svm.SVC(gamma=0.001)\nX_train, X_test, y_train, y_test = train_test_split(\n    data, digits.target, test_size=0.5, shuffle=False)\nsvc.fit(X_train, y_train)\nprint('SVC Trained')", "_____no_output_____" ] ], [ [ "![expert](https://apmonitor.com/che263/uploads/Begin_Python/expert.png)\n\n### Test Number Classifier\n\nThe image classifier is tested on 10 randomly selected images from the other half of the data set to evaluate the training. Run the classifier test until you observe a misclassified number.", "_____no_output_____" ] ], [ [ "plt.figure(figsize=(10,4))\nfor i in range(10):\n    n = np.random.randint(int(n_samples/2),n_samples)\n    predict = svc.predict(digits.data[n:n+1])[0]\n    plt.subplot(2,5,i+1)\n    plt.imshow(digits.images[n], cmap=plt.cm.gray_r, interpolation='nearest')\n    plt.text(0,7,'Actual: ' + str(digits.target[n]),color='r')\n    plt.text(0,1,'Predict: ' + str(predict),color='b')\n    if predict==digits.target[n]:\n        plt.text(0,4,'Correct',color='g')\n    else:\n        plt.text(0,4,'Incorrect',color='orange')\nplt.show()", "_____no_output_____" ] ], [ [ "![buildings](https://apmonitor.com/che263/uploads/Begin_Python/buildings.png)\n\n### Classification with Supervised Learning", "_____no_output_____" ], [ "Select data set option with `moons`, `circles`, or `blobs`. 
Run the following cell to generate the data that will be used to test the classifiers.", "_____no_output_____" ] ], [ [ "option = 'moons' # moons, circles, or blobs\n\nn = 2000 # number of data points\nX = np.random.random((n,2))\nmixing = 0.0 # add random mixing element to data\nxplot = np.linspace(0,1,100)\nif option=='moons':\n    X, y = datasets.make_moons(n_samples=n,noise=0.1)\n    yplot = xplot*0.0\nelif option=='circles':\n    X, y = datasets.make_circles(n_samples=n,noise=0.1,factor=0.5)\n    yplot = xplot*0.0\nelif option=='blobs':\n    X, y = datasets.make_blobs(n_samples=n,centers=[[-5,3],[5,-3]],cluster_std=2.0)\n    yplot = xplot*0.0\n# Split into train and test subsets (50% each)\nXA, XB, yA, yB = train_test_split(X, y, test_size=0.5, shuffle=False)\n# Plot classification results\ndef assess(P):\n    plt.figure()\n    plt.scatter(XB[P==1,0],XB[P==1,1],marker='^',color='blue',label='True')\n    plt.scatter(XB[P==0,0],XB[P==0,1],marker='x',color='red',label='False')\n    plt.scatter(XB[P!=yB,0],XB[P!=yB,1],marker='s',color='orange',\\\n        alpha=0.5,label='Incorrect')\n    plt.legend()", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.1 Logistic Regression\n\n**Definition:** Logistic regression is a machine learning algorithm for classification. 
In this algorithm, the probabilities describing the possible outcomes of a single trial are modelled using a logistic function.\n\n**Advantages:** Logistic regression is designed for this purpose (classification), and is most useful for understanding the influence of several independent variables on a single outcome variable.\n\n**Disadvantages:** Works only when the predicted variable is binary, assumes all predictors are independent of each other, and assumes data is free of missing values.", "_____no_output_____" ] ], [ [ "from sklearn.linear_model import LogisticRegression\nlr = LogisticRegression(solver='lbfgs')\nlr.fit(XA,yA)\nyP = lr.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.2 Naïve Bayes\n\n**Definition:** The Naive Bayes algorithm is based on Bayes' theorem with the assumption of independence between every pair of features. Naive Bayes classifiers work well in many real-world situations such as document classification and spam filtering.\n\n**Advantages:** This algorithm requires a small amount of training data to estimate the necessary parameters. Naive Bayes classifiers are extremely fast compared to more sophisticated methods.\n\n**Disadvantages:** Naive Bayes is known to be a bad estimator.", "_____no_output_____" ] ], [ [ "from sklearn.naive_bayes import GaussianNB\nnb = GaussianNB()\nnb.fit(XA,yA)\nyP = nb.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.3 Stochastic Gradient Descent\n\n**Definition:** Stochastic gradient descent is a simple and very efficient approach to fit linear models. It is particularly useful when the number of samples is very large. 
It supports different loss functions and penalties for classification.\n\n**Advantages:** Efficiency and ease of implementation.\n\n**Disadvantages:** Requires a number of hyper-parameters and it is sensitive to feature scaling.", "_____no_output_____" ] ], [ [ "from sklearn.linear_model import SGDClassifier\nsgd = SGDClassifier(loss='modified_huber', shuffle=True,random_state=101)\nsgd.fit(XA,yA)\nyP = sgd.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.4 K-Nearest Neighbours\n\n**Definition:** Neighbours-based classification is a type of lazy learning as it does not attempt to construct a general internal model, but simply stores instances of the training data. Classification is computed from a simple majority vote of the k nearest neighbours of each point.\n\n**Advantages:** This algorithm is simple to implement, robust to noisy training data, and effective if training data is large.\n\n**Disadvantages:** Need to determine the value of `K`, and the computation cost is high as it needs to compute the distance of each instance to all the training samples. 
One possible solution to determine `K` is to add a feedback loop to determine the number of neighbors.", "_____no_output_____" ] ], [ [ "from sklearn.neighbors import KNeighborsClassifier\nknn = KNeighborsClassifier(n_neighbors=5)\nknn.fit(XA,yA)\nyP = knn.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.5 Decision Tree\n\n**Definition:** Given data with attributes together with their classes, a decision tree produces a sequence of rules that can be used to classify the data.\n\n**Advantages:** Decision Tree is simple to understand and visualise, requires little data preparation, and can handle both numerical and categorical data.\n\n**Disadvantages:** Decision tree can create complex trees that do not generalise well, and decision trees can be unstable because small variations in the data might result in a completely different tree being generated.", "_____no_output_____" ] ], [ [ "from sklearn.tree import DecisionTreeClassifier\ndtree = DecisionTreeClassifier(max_depth=10,random_state=101,\\\n    max_features=None,min_samples_leaf=5)\ndtree.fit(XA,yA)\nyP = dtree.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.6 Random Forest\n\n**Definition:** Random forest classifier is a meta-estimator that fits a number of decision trees on various sub-samples of datasets and uses averaging to improve the predictive accuracy of the model and to control over-fitting. 
The sub-sample size is always the same as the original input sample size but the samples are drawn with replacement.\n\n**Advantages:** Reduction in over-fitting, and the random forest classifier is more accurate than decision trees in most cases.\n\n**Disadvantages:** Slow real-time prediction, difficult to implement, and a complex algorithm.", "_____no_output_____" ] ], [ [ "from sklearn.ensemble import RandomForestClassifier\nrfm = RandomForestClassifier(n_estimators=70,oob_score=True,\\\n    n_jobs=1,random_state=101,max_features=None,\\\n    min_samples_leaf=3) #change min_samples_leaf from 30 to 3\nrfm.fit(XA,yA)\nyP = rfm.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.7 Support Vector Classifier\n\n**Definition:** Support vector machine is a representation of the training data as points in space separated into categories by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall.\n\n**Advantages:** Effective in high dimensional spaces and uses a subset of training points in the decision function so it is also memory efficient.\n\n**Disadvantages:** The algorithm does not directly provide probability estimates; these are calculated using an expensive five-fold cross-validation.", "_____no_output_____" ] ], [ [ "from sklearn.svm import SVC\nsvm = SVC(gamma='scale', C=1.0, random_state=101)\nsvm.fit(XA,yA)\nyP = svm.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### S.8 Neural Network\n\nThe `MLPClassifier` implements a multi-layer perceptron (MLP) algorithm that trains using Backpropagation.\n\n**Definition:** A neural network is a set of neurons (activation functions) in layers that are processed sequentially to relate an input to an output.\n\n**Advantages:** Effective in nonlinear spaces 
where the structure of the relationship is not linear. No prior knowledge or specialized equation structure is defined although there are different network architectures that may lead to a better result.\n\n**Disadvantages:** Neural networks do not extrapolate well outside of the training domain. They may also require longer to train by adjusting the parameter weights to minimize a loss (objective) function. It is also more challenging to explain the outcome of the training and changes in initialization or number of epochs (iterations) may lead to different results. Too many epochs may lead to overfitting, especially if there are excess parameters beyond the minimum needed to capture the input to output relationship.", "_____no_output_____" ], [ "![deep_neural_network.png](attachment:deep_neural_network.png)\n\nMLP trains on two arrays: array X of size (n_samples, n_features), which holds the training samples represented as floating point feature vectors; and array y of size (n_samples,), which holds the target values (class labels) for the training samples.\nMLP can fit a non-linear model to the training data. clf.coefs_ contains the weight matrices that constitute the model parameters. Currently, MLPClassifier supports only the Cross-Entropy loss function, which allows probability estimates by running the predict_proba method. MLP trains using Backpropagation. More precisely, it trains using some form of gradient descent and the gradients are calculated using Backpropagation. For classification, it minimizes the Cross-Entropy loss function, giving a vector of probability estimates. MLPClassifier supports multi-class classification by applying Softmax as the output function. Further, the model supports multi-label classification in which a sample can belong to more than one class. For each class, the raw output passes through the logistic function. Values larger or equal to 0.5 are rounded to 1, otherwise to 0. 
For a predicted output of a sample, the indices where the value is 1 represent the assigned classes of that sample.", "_____no_output_____" ] ], [ [ "from sklearn.neural_network import MLPClassifier\n\nclf = MLPClassifier(solver='lbfgs',alpha=1e-5,max_iter=200,activation='relu',\\\n    hidden_layer_sizes=(10,30,10), random_state=1, shuffle=True)\nclf.fit(XA,yA)\nyP = clf.predict(XB)\nassess(yP)", "_____no_output_____" ] ], [ [ "![animal_eggs](https://apmonitor.com/che263/uploads/Begin_Python/animal_eggs.png)\n\n### Unsupervised Classification\n\nAdditional examples show the potential for unsupervised learning to classify the groups. Unsupervised learning does not use the labels (`True`/`False`) so the results may need to be switched to align with the test set with `if len(XB[yP!=yB]) > n/4: yP = 1 - yP \n`", "_____no_output_____" ], [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### U.1 K-Means Clustering\n\n**Definition:** Specify how many possible clusters (or K) there are in the dataset. The algorithm then iteratively moves the K-centers and selects the datapoints that are closest to that centroid in the cluster.\n\n**Advantages:** The most common and simplest clustering algorithm.\n\n**Disadvantages:** Must specify the number of clusters although this can typically be determined by increasing the number of clusters until the objective function does not change significantly.", "_____no_output_____" ] ], [ [ "from sklearn.cluster import KMeans\nkm = KMeans(n_clusters=2)\nkm.fit(XA)\nyP = km.predict(XB)\nif len(XB[yP!=yB]) > n/4: yP = 1 - yP \nassess(yP)", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### U.2 Gaussian Mixture Model\n\n**Definition:** Data points that exist at the boundary of clusters may simply have similar probabilities of being in either cluster. 
A mixture model predicts a probability instead of a hard classification such as K-Means clustering.\n\n**Advantages:** Incorporates uncertainty into the solution.\n\n**Disadvantages:** Uncertainty may not be desirable for some applications. This method is not as common as the K-Means method for clustering.", "_____no_output_____" ] ], [ [ "from sklearn.mixture import GaussianMixture\ngmm = GaussianMixture(n_components=2)\ngmm.fit(XA)\nyP = gmm.predict_proba(XB) # produces probabilities\nif len(XB[np.round(yP[:,0])!=yB]) > n/4: yP = 1 - yP \nassess(np.round(yP[:,0]))", "_____no_output_____" ] ], [ [ "![idea](https://apmonitor.com/che263/uploads/Begin_Python/idea.png)\n\n### U.3 Spectral Clustering\n\n**Definition:** Spectral clustering is known as segmentation-based object categorization. It is a technique with roots in graph theory, where communities of nodes in a graph are identified based on the edges connecting them. The method is flexible and allows clustering of non-graph data as well.\nIt uses information from the eigenvalues of special matrices built from the graph or the data set. \n\n**Advantages:** Flexible approach for finding clusters when data doesn't meet the requirements of other common algorithms.\n\n**Disadvantages:** For large-sized graphs, the second eigenvalue of the (normalized) graph Laplacian matrix is often ill-conditioned, leading to slow convergence of iterative eigenvalue solvers. 
Spectral clustering is computationally expensive unless the graph is sparse and the similarity matrix can be efficiently constructed.", "_____no_output_____" ] ], [ [ "from sklearn.cluster import SpectralClustering\nsc = SpectralClustering(n_clusters=2,eigen_solver='arpack',\\\n affinity='nearest_neighbors')\nyP = sc.fit_predict(XB) # No separation between fit and predict calls\n # need to fit and predict on same dataset\nif len(XB[yP!=yB]) > n/4: yP = 1 - yP \nassess(yP)", "_____no_output_____" ] ], [ [ "![expert](https://apmonitor.com/che263/uploads/Begin_Python/expert.png)\n\n### TCLab Activity\n\nTrain a classifier to predict if the heater is on (100%) or off (0%). Generate data with 10 minutes of 1 second data. If you do not have a TCLab, use one of the sample data sets.\n\n- [Sample Data Set 1 (10 min)](http://apmonitor.com/do/uploads/Main/tclab_data5.txt): http://apmonitor.com/do/uploads/Main/tclab_data5.txt \n- [Sample Data Set 2 (60 min)](http://apmonitor.com/do/uploads/Main/tclab_data6.txt): http://apmonitor.com/do/uploads/Main/tclab_data6.txt", "_____no_output_____" ] ], [ [ "# 10 minute data collection\nimport tclab, time\nimport numpy as np\nimport pandas as pd\nwith tclab.TCLab() as lab:\n n = 600; on=100; t = np.linspace(0,n-1,n) \n Q1 = np.zeros(n); T1 = np.zeros(n)\n Q2 = np.zeros(n); T2 = np.zeros(n) \n Q1[20:41]=on; Q1[60:91]=on; Q1[150:181]=on\n Q1[190:206]=on; Q1[220:251]=on; Q1[260:291]=on\n Q1[300:316]=on; Q1[340:351]=on; Q1[400:431]=on\n Q1[500:521]=on; Q1[540:571]=on; Q1[20:41]=on\n Q1[60:91]=on; Q1[150:181]=on; Q1[190:206]=on\n Q1[220:251]=on; Q1[260:291]=on\n print('Time Q1 Q2 T1 T2')\n for i in range(n):\n T1[i] = lab.T1; T2[i] = lab.T2\n lab.Q1(Q1[i])\n if i%5==0:\n print(int(t[i]),Q1[i],Q2[i],T1[i],T2[i])\n time.sleep(1)\ndata = np.column_stack((t,Q1,Q2,T1,T2))\ndata8 = pd.DataFrame(data,columns=['Time','Q1','Q2','T1','T2'])\ndata8.to_csv('08-tclab.csv',index=False)", "_____no_output_____" ] ], [ [ "Use the data file `08-tclab.csv` to 
train and test the classifier. Select and scale (0-1) the features of the data including `T1`, `T2`, and the 1st and 2nd derivatives of `T1`. Use the measured temperatures, derivatives, and heater value label to create a classifier that predicts when the heater is on or off. Validate the classifier with new data that was not used for training. Starting code is provided below but does not include `T2` as a feature input. **Add `T2` as an input feature to the classifier. Does it improve the classifier performance?**", "_____no_output_____" ] ], [ [ "import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib import gridspec\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.model_selection import train_test_split\n\ntry:\n    data = pd.read_csv('08-tclab.csv')\nexcept:\n    print('Warning: Unable to load 08-tclab.csv, using online data')\n    url = 'http://apmonitor.com/do/uploads/Main/tclab_data5.txt'\n    data = pd.read_csv(url)\n    \n# Input Features: Temperature and 1st / 2nd Derivatives\n# Cubic polynomial fit of temperature using 10 data points\ndata['dT1'] = np.zeros(len(data))\ndata['d2T1'] = np.zeros(len(data))\nfor i in range(len(data)):\n    if i<len(data)-10:\n        x = data['Time'][i:i+10]-data['Time'][i]\n        y = data['T1'][i:i+10]\n        p = np.polyfit(x,y,3)\n        # evaluate derivatives at mid-point (5 sec)\n        t = 5.0\n        data['dT1'][i] = 3.0*p[0]*t**2 + 2.0*p[1]*t+p[2]\n        data['d2T1'][i] = 6.0*p[0]*t + 2.0*p[1]\n    else:\n        data['dT1'][i] = np.nan\n        data['d2T1'][i] = np.nan\n\n# Remove last 10 values\nX = np.array(data[['T1','dT1','d2T1']][0:-10])\ny = np.array(data[['Q1']][0:-10])\n\n# Scale data\n# Input features (Temperature and 2nd derivative at 5 sec)\ns1 = MinMaxScaler(feature_range=(0,1))\nXs = s1.fit_transform(X)\n# Output labels (heater On / Off)\nys = [True if y[i]>50.0 else False for i in range(len(y))]\n\n# Split into train and test subsets (50% each)\nXA, XB, yA, yB = train_test_split(Xs, ys, \\\n    test_size=0.5, shuffle=False)\n\n# 
Supervised Classification\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.naive_bayes import GaussianNB\nfrom sklearn.linear_model import SGDClassifier\nfrom sklearn.neighbors import KNeighborsClassifier\nfrom sklearn.tree import DecisionTreeClassifier\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.svm import SVC\nfrom sklearn.neural_network import MLPClassifier\n\n# Create supervised classification models\nlr = LogisticRegression(solver='lbfgs') # Logistic Regression\nnb = GaussianNB() # Naïve Bayes\nsgd = SGDClassifier(loss='modified_huber', shuffle=True,\\\n    random_state=101) # Stochastic Gradient Descent\nknn = KNeighborsClassifier(n_neighbors=5) # K-Nearest Neighbors\ndtree = DecisionTreeClassifier(max_depth=10,random_state=101,\\\n    max_features=None,min_samples_leaf=5) # Decision Tree\nrfm = RandomForestClassifier(n_estimators=70,oob_score=True,n_jobs=1,\\\n    random_state=101,max_features=None,min_samples_leaf=3) # Random Forest\nsvm = SVC(gamma='scale', C=1.0, random_state=101) # Support Vector Classifier\nclf = MLPClassifier(solver='lbfgs',alpha=1e-5,max_iter=200,\\\n    activation='relu',hidden_layer_sizes=(10,30,10),\\\n    random_state=1, shuffle=True) # Neural Network\nmodels = [lr,nb,sgd,knn,dtree,rfm,svm,clf]\n\n# Supervised learning\nyP = [None]*(len(models)+3) # 3 for unsupervised learning\nfor i,m in enumerate(models):\n    m.fit(XA,yA)\n    yP[i] = m.predict(XB)\n\n# Unsupervised learning modules\nfrom sklearn.cluster import KMeans\nfrom sklearn.mixture import GaussianMixture\nfrom sklearn.cluster import SpectralClustering\nkm = KMeans(n_clusters=2)\ngmm = GaussianMixture(n_components=2)\nsc = SpectralClustering(n_clusters=2,eigen_solver='arpack',\\\n    affinity='nearest_neighbors')\nkm.fit(XA)\nyP[8] = km.predict(XB)\ngmm.fit(XA)\nyP[9] = gmm.predict_proba(XB)[:,0]\nyP[10] = sc.fit_predict(XB)\n\nplt.figure(figsize=(10,7))\ngs = gridspec.GridSpec(3, 1, 
height_ratios=[1,1,5])\nplt.subplot(gs[0])\nplt.plot(data['Time']/60,data['T1'],'r-',\\\n label='Temperature (°C)')\nplt.ylabel('T (°C)')\nplt.legend()\nplt.subplot(gs[1])\nplt.plot(data['Time']/60,data['dT1'],'b:',\\\n label='dT/dt (°C/sec)') \nplt.plot(data['Time']/60,data['d2T1'],'k--',\\\n label=r'$d^2T/dt^2$ ($°C^2/sec^2$)')\nplt.ylabel('Derivatives')\nplt.legend()\n\nplt.subplot(gs[2])\nplt.plot(data['Time']/60,data['Q1']/100,'k-',\\\n label='Heater (On=1/Off=0)')\n\nt2 = data['Time'][len(yA):-10].values\ndesc = ['Logistic Regression','Naïve Bayes','Stochastic Gradient Descent',\\\n 'K-Nearest Neighbors','Decision Tree','Random Forest',\\\n 'Support Vector Classifier','Neural Network',\\\n 'K-Means Clustering','Gaussian Mixture Model','Spectral Clustering']\nfor i in range(11):\n plt.plot(t2/60,yP[i]-i-1,label=desc[i])\n\nplt.ylabel('Heater')\nplt.legend()\n\nplt.xlabel(r'Time (min)')\nplt.legend()\nplt.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d0646654843a52d1c0a410b3f80fe5892535ad2c
6,094
ipynb
Jupyter Notebook
reg-linear/Bonus/Simulador Interativo.ipynb
DiegoVialle/Regressao-Linear-Testando-Relacoes-e-Prevendo-Resultados
3c185aefc9ab4fbcee98efb294c7637eb4f594e5
[ "MIT" ]
null
null
null
reg-linear/Bonus/Simulador Interativo.ipynb
DiegoVialle/Regressao-Linear-Testando-Relacoes-e-Prevendo-Resultados
3c185aefc9ab4fbcee98efb294c7637eb4f594e5
[ "MIT" ]
null
null
null
reg-linear/Bonus/Simulador Interativo.ipynb
DiegoVialle/Regressao-Linear-Testando-Relacoes-e-Prevendo-Resultados
3c185aefc9ab4fbcee98efb294c7637eb4f594e5
[ "MIT" ]
null
null
null
32.414894
1,066
0.564654
[ [ [ "<h1 style='color: green; font-size: 36px; font-weight: bold;'>Data Science - Regressรฃo Linear</h1>", "_____no_output_____" ], [ "# <font color='red' style='font-size: 30px;'>Bรดnus</font>\n<hr style='border: 2px solid red;'>", "_____no_output_____" ], [ "## Importando nosso modelo", "_____no_output_____" ] ], [ [ "import pickle\n\nmodelo = open('../Exercicio/modelo_preรงo','rb')\nlm_new = pickle.load(modelo)\nmodelo.close()\n\narea = 38\ngaragem = 2\nbanheiros = 4\nlareira = 4\nmarmore = 0\nandares = 1\n\nentrada = [[area, garagem, banheiros, lareira, marmore, andares]]\n\nprint('$ {0:.2f}'.format(lm_new.predict(entrada)[0]))", "_____no_output_____" ] ], [ [ "## Exemplo de um simulador interativo para Jupyter\n\nhttps://ipywidgets.readthedocs.io/en/stable/index.html\n\nhttps://github.com/jupyter-widgets/ipywidgets", "_____no_output_____" ] ], [ [ "# Importando bibliotecas\nfrom ipywidgets import widgets, HBox, VBox\nfrom IPython.display import display\n\n# Criando os controles do formulรกrio\narea = widgets.Text(description=\"รrea\")\ngaragem = widgets.Text(description=\"Garagem\")\nbanheiros = widgets.Text(description=\"Banheiros\")\nlareira = widgets.Text(description=\"Lareira\")\nmarmore = widgets.Text(description=\"Mรกrmore?\")\nandares = widgets.Text(description=\"Andares?\")\n\nbotao = widgets.Button(description=\"Simular\")\n\n# Posicionando os controles\nleft = VBox([area, banheiros, marmore])\nright = VBox([garagem, lareira, andares])\ninputs = HBox([left, right])\n\n# Funรงรฃo de simulaรงรฃo\ndef simulador(sender):\n entrada=[[\n float(area.value if area.value else 0), \n float(garagem.value if garagem.value else 0), \n float(banheiros.value if banheiros.value else 0), \n float(lareira.value if lareira.value else 0), \n float(marmore.value if marmore.value else 0), \n float(andares.value if andares.value else 0)\n ]]\n print('$ {0:.2f}'.format(lm_new.predict(entrada)[0]))\n \n# Atribuindo a funรงรฃo \"simulador\" ao evento click do 
botão\nbotao.on_click(simulador) ", "_____no_output_____" ], [ "display(inputs, botao)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d064705e51ef7eddc4d0656b9a41f9386500351e
11,423
ipynb
Jupyter Notebook
albert-base/albert-baseline.ipynb
shanayghag/AV-Janatahack-Independence-Day-2020-ML-Hackathon
410c549488b0e2ceece067a9e1581e182a11e885
[ "MIT" ]
6
2020-08-26T13:00:11.000Z
2021-12-28T18:58:43.000Z
albert-base/albert-baseline.ipynb
shanayghag/AV-Janatahack-Independence-Day-2020-ML-Hackathon
410c549488b0e2ceece067a9e1581e182a11e885
[ "MIT" ]
null
null
null
albert-base/albert-baseline.ipynb
shanayghag/AV-Janatahack-Independence-Day-2020-ML-Hackathon
410c549488b0e2ceece067a9e1581e182a11e885
[ "MIT" ]
1
2020-08-24T08:34:19.000Z
2020-08-24T08:34:19.000Z
11,423
11,423
0.670927
[ [ [ "## Installing & importing necsessary libs", "_____no_output_____" ] ], [ [ "!pip install -q transformers", "_____no_output_____" ], [ "import numpy as np\nimport pandas as pd\nfrom sklearn import metrics\nimport transformers\nimport torch\nfrom torch.utils.data import Dataset, DataLoader, RandomSampler, SequentialSampler\nfrom transformers import AlbertTokenizer, AlbertModel, AlbertConfig\nfrom tqdm.notebook import tqdm\nfrom transformers import get_linear_schedule_with_warmup", "_____no_output_____" ], [ "device = torch.device(\"cuda\" if torch.cuda.is_available() else \"cpu\")\nn_gpu = torch.cuda.device_count()\ntorch.cuda.get_device_name(0)", "_____no_output_____" ] ], [ [ "## Data Preprocessing", "_____no_output_____" ] ], [ [ "df = pd.read_csv(\"../input/avjantahack/data/train.csv\")\ndf['list'] = df[df.columns[3:]].values.tolist()\nnew_df = df[['ABSTRACT', 'list']].copy()\nnew_df.head()", "_____no_output_____" ] ], [ [ "## Model configurations", "_____no_output_____" ] ], [ [ "# Defining some key variables that will be used later on in the training\nMAX_LEN = 512\nTRAIN_BATCH_SIZE = 16\nVALID_BATCH_SIZE = 8\nEPOCHS = 5\nLEARNING_RATE = 3e-05\ntokenizer = AlbertTokenizer.from_pretrained('albert-base-v2')", "_____no_output_____" ] ], [ [ "## Custom Dataset Class", "_____no_output_____" ] ], [ [ "class CustomDataset(Dataset):\n\n def __init__(self, dataframe, tokenizer, max_len):\n self.tokenizer = tokenizer\n self.data = dataframe\n self.abstract = dataframe.ABSTRACT\n self.targets = self.data.list\n self.max_len = max_len\n\n def __len__(self):\n return len(self.abstract)\n\n def __getitem__(self, index):\n abstract = str(self.abstract[index])\n abstract = \" \".join(abstract.split())\n\n inputs = self.tokenizer.encode_plus(\n abstract,\n None,\n add_special_tokens = True,\n max_length = self.max_len,\n pad_to_max_length = True,\n return_token_type_ids=True,\n truncation = True\n )\n\n ids = inputs['input_ids']\n mask = inputs['attention_mask']\n 
token_type_ids = inputs['token_type_ids']\n\n return{\n 'ids': torch.tensor(ids, dtype=torch.long),\n 'mask': torch.tensor(mask, dtype=torch.long),\n 'token_type_ids': torch.tensor(token_type_ids, dtype=torch.long),\n 'targets': torch.tensor(self.targets[index], dtype=torch.float)\n }", "_____no_output_____" ], [ "train_size = 0.8\ntrain_dataset=new_df.sample(frac=train_size,random_state=200)\ntest_dataset=new_df.drop(train_dataset.index).reset_index(drop=True)\ntrain_dataset = train_dataset.reset_index(drop=True)\n\n\nprint(\"FULL Dataset: {}\".format(new_df.shape))\nprint(\"TRAIN Dataset: {}\".format(train_dataset.shape))\nprint(\"TEST Dataset: {}\".format(test_dataset.shape))\n\ntraining_set = CustomDataset(train_dataset, tokenizer, MAX_LEN)\ntesting_set = CustomDataset(test_dataset, tokenizer, MAX_LEN)", "_____no_output_____" ], [ "train_params = {'batch_size': TRAIN_BATCH_SIZE,\n 'shuffle': True,\n 'num_workers': 0\n }\n\ntest_params = {'batch_size': VALID_BATCH_SIZE,\n 'shuffle': True,\n 'num_workers': 0\n }\n\ntraining_loader = DataLoader(training_set, **train_params)\ntesting_loader = DataLoader(testing_set, **test_params)", "_____no_output_____" ] ], [ [ "## Albert model", "_____no_output_____" ] ], [ [ "class AlbertClass(torch.nn.Module):\n def __init__(self):\n super(AlbertClass, self).__init__()\n self.albert = transformers.AlbertModel.from_pretrained('albert-base-v2')\n self.drop = torch.nn.Dropout(0.1)\n self.linear = torch.nn.Linear(768, 6)\n \n def forward(self, ids, mask, token_type_ids):\n _, output= self.albert(ids, attention_mask = mask)\n output = self.drop(output)\n output = self.linear(output)\n\n return output\n\nmodel = AlbertClass()\nmodel.to(device)", "_____no_output_____" ] ], [ [ "## Hyperparameters & Loss function", "_____no_output_____" ] ], [ [ "def loss_fn(outputs, targets):\n return torch.nn.BCEWithLogitsLoss()(outputs, targets)", "_____no_output_____" ], [ "param_optimizer = list(model.named_parameters())\nno_decay = [\"bias\", 
\"LayerNorm.bias\", \"LayerNorm.weight\"]\noptimizer_parameters = [\n {\n \"params\": [\n p for n, p in param_optimizer if not any(nd in n for nd in no_decay)\n ],\n \"weight_decay\": 0.001,\n },\n {\n \"params\": [\n p for n, p in param_optimizer if any(nd in n for nd in no_decay)\n ],\n \"weight_decay\": 0.0,\n },\n]\n\noptimizer = torch.optim.AdamW(optimizer_parameters, lr=1e-5)\nnum_training_steps = int(len(train_dataset) / TRAIN_BATCH_SIZE * EPOCHS)\n\nscheduler = get_linear_schedule_with_warmup(\n optimizer,\n num_warmup_steps = 0,\n num_training_steps = num_training_steps\n)", "_____no_output_____" ] ], [ [ "## Train & Eval Functions\n\n", "_____no_output_____" ] ], [ [ "def train(epoch):\n model.train()\n for _,data in tqdm(enumerate(training_loader, 0), total=len(training_loader)):\n ids = data['ids'].to(device, dtype = torch.long)\n mask = data['mask'].to(device, dtype = torch.long)\n token_type_ids = data['token_type_ids'].to(device, dtype = torch.long)\n targets = data['targets'].to(device, dtype = torch.float)\n\n outputs = model(ids, mask, token_type_ids)\n\n optimizer.zero_grad()\n loss = loss_fn(outputs, targets)\n if _%1000==0:\n print(f'Epoch: {epoch}, Loss: {loss.item()}')\n \n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n scheduler.step()\n\ndef validation(epoch):\n model.eval()\n fin_targets=[]\n fin_outputs=[]\n with torch.no_grad():\n for _, data in tqdm(enumerate(testing_loader, 0), total=len(testing_loader)):\n ids = data['ids'].to(device, dtype = torch.long)\n mask = data['mask'].to(device, dtype = torch.long)\n token_type_ids = data['token_type_ids'].to(device, dtype = torch.long)\n\n targets = data['targets'].to(device, dtype = torch.float)\n outputs = model(ids, mask, token_type_ids)\n fin_targets.extend(targets.cpu().detach().numpy().tolist())\n fin_outputs.extend(torch.sigmoid(outputs).cpu().detach().numpy().tolist())\n return fin_outputs, fin_targets", "_____no_output_____" ] ], [ [ "## Training Model", 
"_____no_output_____" ] ], [ [ "MODEL_PATH = \"/kaggle/working/albert-multilabel-model.bin\"\nbest_micro = 0\nfor epoch in range(EPOCHS):\n train(epoch)\n outputs, targets = validation(epoch)\n outputs = np.array(outputs) >= 0.5\n accuracy = metrics.accuracy_score(targets, outputs)\n f1_score_micro = metrics.f1_score(targets, outputs, average='micro')\n f1_score_macro = metrics.f1_score(targets, outputs, average='macro')\n print(f\"Accuracy Score = {accuracy}\")\n print(f\"F1 Score (Micro) = {f1_score_micro}\")\n print(f\"F1 Score (Macro) = {f1_score_macro}\")\n if f1_score_micro > best_micro:\n torch.save(model.state_dict(), MODEL_PATH)\n best_micro = f1_score_micro", "_____no_output_____" ], [ "def predict(id, abstract):\n MAX_LENGTH = 512\n inputs = tokenizer.encode_plus(\n abstract,\n None,\n add_special_tokens=True,\n max_length=512,\n pad_to_max_length=True,\n return_token_type_ids=True,\n truncation = True\n )\n \n ids = inputs['input_ids']\n mask = inputs['attention_mask']\n token_type_ids = inputs['token_type_ids']\n\n ids = torch.tensor(ids, dtype=torch.long).unsqueeze(0)\n mask = torch.tensor(mask, dtype=torch.long).unsqueeze(0)\n token_type_ids = torch.tensor(token_type_ids, dtype=torch.long).unsqueeze(0)\n\n ids = ids.to(device)\n mask = mask.to(device)\n token_type_ids = token_type_ids.to(device)\n\n with torch.no_grad():\n outputs = model(ids, mask, token_type_ids)\n\n outputs = torch.sigmoid(outputs).squeeze()\n outputs = np.round(outputs.cpu().numpy())\n \n out = np.insert(outputs, 0, id)\n return out", "_____no_output_____" ], [ "def submit():\n test_df = pd.read_csv('../input/avjantahack/data/test.csv')\n sample_submission = pd.read_csv('../input/avjantahack/data/sample_submission_UVKGLZE.csv')\n\n y = []\n for id, abstract in tqdm(zip(test_df['ID'], test_df['ABSTRACT']),\n total=len(test_df)):\n out = predict(id, abstract)\n y.append(out)\n y = np.array(y)\n submission = pd.DataFrame(y, columns=sample_submission.columns).astype(int)\n return 
submission", "_____no_output_____" ], [ "submission = submit()\nsubmission", "_____no_output_____" ], [ "submission.to_csv('/kaggle/working/alberta-tuned-lr-ws-dr.csv', index=False)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d06476f82b77f6e5bcbde9a83435ff7ac540991b
1,585
ipynb
Jupyter Notebook
chapter10/05_transfer_learning.ipynb
PacktPublishing/Mastering-Azure-Machine-Learning-Second-Edition
1ca0cf19fd49f6781c589ae4d9bde56135791cf1
[ "MIT" ]
1
2022-03-07T20:15:08.000Z
2022-03-07T20:15:08.000Z
chapter10/05_transfer_learning.ipynb
PacktPublishing/Mastering-Azure-Machine-Learning-Second-Edition
1ca0cf19fd49f6781c589ae4d9bde56135791cf1
[ "MIT" ]
null
null
null
chapter10/05_transfer_learning.ipynb
PacktPublishing/Mastering-Azure-Machine-Learning-Second-Edition
1ca0cf19fd49f6781c589ae4d9bde56135791cf1
[ "MIT" ]
1
2022-03-22T17:57:41.000Z
2022-03-22T17:57:41.000Z
20.855263
105
0.548265
[ [ [ "import keras", "_____no_output_____" ], [ "from keras.applications.resnet50 import ResNet50\n\nnum_classes = 10\ninput_shape = (224, 224, 3)\n\n# create the base pre-trained model\nbase_model = ResNet50(input_shape=input_shape, weights='imagenet', include_top=False,pooling='avg')", "_____no_output_____" ], [ "for layer in base_model.layers:\n layer.trainable=False", "_____no_output_____" ], [ "from keras.models import Model\nfrom keras.layers import Flatten, Dense\n\nclf = base_model.output\nclf = Dense(256, activation='relu')(clf)\nclf = Dense(10, activation='softmax')(clf)\n\nmodel = Model(base_model.input, clf)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code" ] ]
d06483d677627cbca55508862a20d7391a6436a3
20,102
ipynb
Jupyter Notebook
AS2520 Propulsion Lab/Experiment 6 - Droplet Evaporation/re-work-notebook.ipynb
kirtan2605/Coursework-Codes
f979b45a608a3420a107de99fc6eb6acefcadf87
[ "MIT" ]
null
null
null
AS2520 Propulsion Lab/Experiment 6 - Droplet Evaporation/re-work-notebook.ipynb
kirtan2605/Coursework-Codes
f979b45a608a3420a107de99fc6eb6acefcadf87
[ "MIT" ]
null
null
null
AS2520 Propulsion Lab/Experiment 6 - Droplet Evaporation/re-work-notebook.ipynb
kirtan2605/Coursework-Codes
f979b45a608a3420a107de99fc6eb6acefcadf87
[ "MIT" ]
null
null
null
20,102
20,102
0.717988
[ [ [ "# Droplet Evaporation", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\nfrom scipy import optimize", "_____no_output_____" ], [ "# Ethyl Acetate\n#time_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\n#diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\n\n# Gasoline\n#time_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\n#diameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])", "_____no_output_____" ] ], [ [ "# Ethyl Acetate", "_____no_output_____" ] ], [ [ "time_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\ndiameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])", "_____no_output_____" ], [ "x = time_in_sec.tolist()\ny = diameter.tolist()\n\npolynomial_coeff_1=np.polyfit(x,y,1)\npolynomial_coeff_2=np.polyfit(x,y,2)\npolynomial_coeff_3=np.polyfit(x,y,3)\n\nxnew=np.linspace(0,110 ,100)\nynew_1=np.poly1d(polynomial_coeff_1)\nynew_2=np.poly1d(polynomial_coeff_2)\nynew_3=np.poly1d(polynomial_coeff_3)\n\nplt.plot(x,y,'o')\nplt.plot(xnew,ynew_1(xnew))\nplt.plot(xnew,ynew_2(xnew))\nplt.plot(xnew,ynew_3(xnew))\nprint(ynew_1)\nprint(ynew_2)\nprint(ynew_3)\nplt.title(\"Diameter vs Time(s)\")\nplt.xlabel(\"Time(s)\")\nplt.ylabel(\"Diameter\")\n\nplt.show()\n\n\n# Coeficients\n# LINEAR : -0.02386 x + 3.139\n# QUADRATIC : -0.0002702 x^2 + 0.005868 x + 2.619\n# CUBIC : -4.771e-07 x^3 - 0.0001915 x^2 + 0.002481 x + 2.646\n#\n# Using Desmos to find the roots of the best fit polynomials\n# Root of linear fit = 131.559\n# Root of quadratic fit = 109.908\n# Root of cubic fit = 109.414", "_____no_output_____" ], [ "def d_square_law(x, C, n):\n y = C/(x**n)\n return y", 
"_____no_output_____" ] ], [ [ "# Linear Fit", "_____no_output_____" ] ], [ [ "# Calculating time taken for vaporization for different diameters. (LINEAR FIT)\ndiameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\ntime_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\nt_vap = time_in_sec\nt_vap = t_vap*0\nt_vap = t_vap + 131.559\nt_vap = t_vap - time_in_sec\nprint(t_vap.tolist())", "_____no_output_____" ], [ "# Finding C and n for d-square law\n#initial_diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\n#vap_time = np.array([109.908, 104.908, 99.908, 94.908, 89.908, 84.908, 79.908, 74.908, 69.908, 64.908, 59.908, 54.908, 49.908, 44.908, 39.908, 34.908, 29.908, 24.908, 19.908, 14.908000000000001, 9.908000000000001, 4.908000000000001, -0.09199999999999875])\n\n# Linear \ninitial_diameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\nvap_time_lin = np.array([131.559, 126.559, 121.559, 116.559, 111.559, 106.559, 101.559, 96.559, 91.559, 86.559, 81.559, 76.559, 71.559, 66.559, 61.559, 56.559, 51.559, 46.559, 41.559, 36.559, 31.558999999999997, 26.558999999999997, 21.558999999999997])", "_____no_output_____" ], [ "# Linear\nparameters_lin = optimize.curve_fit(d_square_law, xdata = initial_diameter, ydata = vap_time_lin)[0]\nprint(\"Linear : \",parameters_lin)\n#C = parameters_lin[0]\n#n = parameters_lin[1]", "_____no_output_____" ] ], [ [ "# Quadratic Fit", "_____no_output_____" ] ], [ [ "# Calculating time taken for vaporization for different diameters. 
(QUADRATIC FIT)\ndiameter = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])\ntime_in_sec = np.array([0,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\nt_vap = time_in_sec\nt_vap = t_vap*0\nt_vap = t_vap + 109.908\nt_vap = t_vap - time_in_sec\nprint(t_vap.tolist())", "_____no_output_____" ], [ "# Quadratic Fit\ninitial_diameter_quad = np.array([2.79,2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372])\nvap_time_quad = np.array([109.908, 104.908, 99.908, 94.908, 89.908, 84.908, 79.908, 74.908, 69.908, 64.908, 59.908, 54.908, 49.908, 44.908, 39.908, 34.908, 29.908, 24.908, 19.908, 14.908000000000001, 9.908000000000001, 4.908000000000001])\n", "_____no_output_____" ], [ "# Quadratic\nparameters_quad = optimize.curve_fit(d_square_law, xdata = initial_diameter_quad, ydata = vap_time_quad)[0]\nprint(\"Quadratic : \",parameters_quad)\n#C = parameters_quad[0]\n#n = parameters_quad[1]", "_____no_output_____" ] ], [ [ "# Ethyl Acetate - After finding d-square Law", "_____no_output_____" ] ], [ [ "# Linear\nC = 41.72856231\nn = -0.97941652\n\n# Quadratic\n# C = 11.6827828\n# n = -2.13925924\n\n\n\nx = vap_time_lin.tolist()\ny = initial_diameter.tolist()\n\nynew=np.linspace(0,3 ,100)\nxnew=[]\nfor item in ynew:\n v1 = C/(item**n)\n xnew.append(v1)\n \nplt.plot(x,y,'o')\nplt.plot(xnew,ynew)\nplt.title(\"Initial Diameter vs Vaporization Time(s)\")\nplt.xlabel(\"Vaporization Time(s)\")\nplt.ylabel(\"Initial Diameter\")\n\nplt.show()", "_____no_output_____" ] ], [ [ "# Gasoline", "_____no_output_____" ] ], [ [ "time_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\ndiameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])", "_____no_output_____" ], [ "x = time_in_min.tolist()\ny = 
diameter.tolist()\n\npolynomial_coeff_1=np.polyfit(x,y,1)\npolynomial_coeff_2=np.polyfit(x,y,2)\npolynomial_coeff_3=np.polyfit(x,y,3)\n\nxnew=np.linspace(0,300 ,100)\nynew_1=np.poly1d(polynomial_coeff_1)\nynew_2=np.poly1d(polynomial_coeff_2)\nynew_3=np.poly1d(polynomial_coeff_3)\n\nplt.plot(x,y,'o')\nplt.plot(xnew,ynew_1(xnew))\nplt.plot(xnew,ynew_2(xnew))\nplt.plot(xnew,ynew_3(xnew))\nprint(ynew_1)\nprint(ynew_2)\nprint(ynew_3)\nplt.title(\"Diameter vs Time(min)\")\nplt.xlabel(\"Time(min)\")\nplt.ylabel(\"Diameter\")\n\nplt.show()\n\n\n# Coefficients\n# LINEAR : -0.005637 x + 2.074\n# QUADRATIC : -6.67e-06 x^2 - 0.003865 x + 2\n# CUBIC : 1.481e-07 x^3 - 6.531e-05 x^2 + 0.00207 x + 1.891\n#\n# Using Desmos to find the roots of the best fit polynomials\n# Root of linear fit = 367.926\n# Root of quadratic fit = 329.781\n# Root of cubic fit = No Positive Root", "_____no_output_____" ] ], [ [ "# Linear Fit", "_____no_output_____" ] ], [ [ "# Calculating time taken for vaporization for different diameters. 
(LINEAR FIT)\ntime_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\ndiameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nt_vap = time_in_min\nt_vap = t_vap*0\nt_vap = t_vap + 367.926\nt_vap = t_vap - time_in_min\nprint(t_vap.tolist())", "_____no_output_____" ], [ "initial_diameter_g_lin = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nvap_time_g_lin = np.array([367.926, 352.926, 337.926, 322.926, 307.926, 292.926, 277.926, 262.926, 247.926, 232.926, 217.926, 202.926, 187.926, 157.926, 132.926, 117.92599999999999, 102.92599999999999])", "_____no_output_____" ], [ "parameters_g_lin = optimize.curve_fit(d_square_law, xdata = initial_diameter_g_lin, ydata = vap_time_g_lin)[0]\nprint(parameters_g_lin)\nC_g = parameters_g_lin[0]\nn_g = parameters_g_lin[1]", "_____no_output_____" ] ], [ [ "# Quadratic Fit", "_____no_output_____" ] ], [ [ "# Calculating time taken for vaporization for different diameters.\ntime_in_min = np.array([0,15,30,45,60,75,90,105,120,135,150,165,180,210,235,250,265])\ndiameter = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nt_vap = time_in_min\nt_vap = t_vap*0\nt_vap = t_vap + 329.781\nt_vap = t_vap - time_in_min\nprint(t_vap.tolist())", "_____no_output_____" ], [ "initial_diameter_g_quad = np.array([2,1.85,1.82,1.8,1.77,1.74,1.72,1.68,1.57,1.3,1.166,1.091,0.94,0.81,0.74,0.66,0.59])\nvap_time_g_quad = np.array([329.781, 314.781, 299.781, 284.781, 269.781, 254.781, 239.781, 224.781, 209.781, 194.781, 179.781, 164.781, 149.781, 119.781, 94.781, 79.781, 64.781])", "_____no_output_____" ], [ "parameters_g_quad = optimize.curve_fit(d_square_law, xdata = initial_diameter_g_quad, ydata = vap_time_g_quad)[0]\nprint(parameters_g_quad)\nC_g = parameters_g_quad[0]\nn_g = parameters_g_quad[1]", "_____no_output_____" ] ], [ [ "# Gasoline - After finding Vaporization Time Data", 
"_____no_output_____" ] ], [ [ "#Linear \nC_g = 140.10666889\nn_g = -1.1686059 \n\n# Quadratic\nC_g = 140.10666889\nn_g = -1.1686059 \n\nx_g = vap_time_g.tolist()\ny_g = initial_diameter_g.tolist()\n\nynew_g=np.linspace(0,2.2 ,100)\nxnew_g=[]\nfor item in ynew_g:\n v1 = C_g/(item**n_g)\n xnew_g.append(v1)\nprint(ynew_g)\nprint(xnew_g)\n \nplt.plot(x_g,y_g,'o')\nplt.plot(xnew_g,ynew_g)\nplt.title(\"Initial Diameter vs Vaporization Time(min)\")\nplt.xlabel(\"Vaporization Time(min)\")\nplt.ylabel(\"Initial Diameter\")\n\nplt.show()", "_____no_output_____" ] ], [ [ "# Optimization Methods (IGNORE)", "_____no_output_____" ] ], [ [ "import numpy as np\nfrom scipy import optimize\nimport matplotlib.pyplot as plt\n\nplt.style.use('seaborn-poster')", "_____no_output_____" ], [ "time_in_sec = np.array([5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110])\ndiameter = np.array([2.697,2.573,2.542,2.573,2.48,2.449,2.449,2.387,2.356,2.263,2.232,2.201,2.139,1.82,1.426,1.178,1.085,0.992,0.496,0.403,0.372,0.11])", "_____no_output_____" ], [ "def func(x, a, b):\n y = a/(x**b) \n return y\n\nparameters = optimize.curve_fit(func, xdata = time_in_sec, ydata = diameter)[0]\nprint(parameters)\nC = parameters[0]\nn = parameters[1]", "_____no_output_____" ], [ "plt.plot(time_in_sec,diameter,'o',label='data')\ny_new = []\nfor val in time_in_sec:\n v1 = C/(val**n)\n y_new.append(v1)\nplt.plot(time_in_sec,y_new,'-',label='fit')", "_____no_output_____" ], [ "log_time = np.log(time_in_min)\nlog_d = np.log(diameter)\nprint(log_d)\nprint(log_time)\nx = log_time.tolist()\ny = log_d.tolist()\npolynomial_coeff=np.polyfit(x,y,1)\nxnew=np.linspace(2.5,6,100)\nynew=np.poly1d(polynomial_coeff)\nplt.plot(xnew,ynew(xnew),x,y,'o')\nprint(ynew)\nplt.title(\"log(diameter) vs log(Time(s))\")\nplt.xlabel(\"log(Time(s))\")\nplt.ylabel(\"log(diameter)\")\n\nplt.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ] ]
d064867b0d95195803807b55ac9a5c83deb05dbf
26,902
ipynb
Jupyter Notebook
Special. NLP_with_BERT.ipynb
Samrath49/AI_ML_DL
f5427adca5d914a7b69e11b578706cc7d21d3d56
[ "Unlicense" ]
null
null
null
Special. NLP_with_BERT.ipynb
Samrath49/AI_ML_DL
f5427adca5d914a7b69e11b578706cc7d21d3d56
[ "Unlicense" ]
null
null
null
Special. NLP_with_BERT.ipynb
Samrath49/AI_ML_DL
f5427adca5d914a7b69e11b578706cc7d21d3d56
[ "Unlicense" ]
null
null
null
56.875264
451
0.597056
[ [ [ "<a href=\"https://colab.research.google.com/github/Samrath49/AI_ML_DL/blob/main/Special.%20NLP_with_BERT.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>", "_____no_output_____" ], [ "# NLP with Bert for Sentiment Analysis", "_____no_output_____" ], [ "### Importing Libraries", "_____no_output_____" ] ], [ [ "!pip3 install ktrain ", "Collecting ktrain\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/e1/3c/8469632f3fa51f244ce35ac184de4c55a260dccfcb7386529faf82ebf60f/ktrain-0.25.4.tar.gz (25.3MB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 25.3MB 133kB/s \n\u001b[?25hCollecting scikit-learn==0.23.2\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f4/cb/64623369f348e9bfb29ff898a57ac7c91ed4921f228e9726546614d63ccb/scikit_learn-0.23.2-cp37-cp37m-manylinux1_x86_64.whl (6.8MB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 6.8MB 48.5MB/s \n\u001b[?25hRequirement already satisfied: matplotlib>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from ktrain) (3.2.2)\nRequirement already satisfied: pandas>=1.0.1 in /usr/local/lib/python3.7/dist-packages (from ktrain) (1.1.5)\nRequirement already satisfied: fastprogress>=0.1.21 in /usr/local/lib/python3.7/dist-packages (from ktrain) (1.0.0)\nRequirement already satisfied: requests in /usr/local/lib/python3.7/dist-packages (from ktrain) (2.23.0)\nRequirement already satisfied: joblib in /usr/local/lib/python3.7/dist-packages (from ktrain) (1.0.1)\nRequirement already satisfied: packaging in /usr/local/lib/python3.7/dist-packages (from ktrain) (20.9)\nRequirement already satisfied: ipython in /usr/local/lib/python3.7/dist-packages (from ktrain) (5.5.0)\nCollecting langdetect\n\u001b[?25l Downloading 
https://files.pythonhosted.org/packages/56/a3/8407c1e62d5980188b4acc45ef3d94b933d14a2ebc9ef3505f22cf772570/langdetect-1.0.8.tar.gz (981kB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 983kB 50.1MB/s \n\u001b[?25hRequirement already satisfied: jieba in /usr/local/lib/python3.7/dist-packages (from ktrain) (0.42.1)\nCollecting cchardet\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/80/72/a4fba7559978de00cf44081c548c5d294bf00ac7dcda2db405d2baa8c67a/cchardet-2.1.7-cp37-cp37m-manylinux2010_x86_64.whl (263kB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 266kB 51.8MB/s \n\u001b[?25hCollecting syntok\n Downloading https://files.pythonhosted.org/packages/8c/76/a49e73a04b3e3a14ce232e8e28a1587f8108baa665644fe8c40e307e792e/syntok-1.3.1.tar.gz\nCollecting seqeval==0.0.19\n Downloading https://files.pythonhosted.org/packages/93/e5/b7705156a77f742cfe4fc6f22d0c71591edb2d243328dff2f8fc0f933ab6/seqeval-0.0.19.tar.gz\nCollecting transformers<4.0,>=3.1.0\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/3a/83/e74092e7f24a08d751aa59b37a9fc572b2e4af3918cb66f7766c3affb1b4/transformers-3.5.1-py3-none-any.whl (1.3MB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 1.3MB 50.5MB/s \n\u001b[?25hCollecting sentencepiece\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f5/99/e0808cb947ba10f575839c43e8fafc9cc44e4a7a2c8f79c60db48220a577/sentencepiece-0.1.95-cp37-cp37m-manylinux2014_x86_64.whl (1.2MB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 1.2MB 52.6MB/s \n\u001b[?25hCollecting keras_bert>=0.86.0\n Downloading https://files.pythonhosted.org/packages/e2/7f/95fabd29f4502924fa3f09ff6538c5a7d290dfef2c2fe076d3d1a16e08f0/keras-bert-0.86.0.tar.gz\nRequirement already satisfied: networkx>=2.3 in 
/usr/local/lib/python3.7/dist-packages (from ktrain) (2.5)\nCollecting whoosh\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/ba/19/24d0f1f454a2c1eb689ca28d2f178db81e5024f42d82729a4ff6771155cf/Whoosh-2.7.4-py2.py3-none-any.whl (468kB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 471kB 49.2MB/s \n\u001b[?25hCollecting threadpoolctl>=2.0.0\n Downloading https://files.pythonhosted.org/packages/f7/12/ec3f2e203afa394a149911729357aa48affc59c20e2c1c8297a60f33f133/threadpoolctl-2.1.0-py3-none-any.whl\nRequirement already satisfied: scipy>=0.19.1 in /usr/local/lib/python3.7/dist-packages (from scikit-learn==0.23.2->ktrain) (1.4.1)\nRequirement already satisfied: numpy>=1.13.3 in /usr/local/lib/python3.7/dist-packages (from scikit-learn==0.23.2->ktrain) (1.19.5)\nRequirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (2.4.7)\nRequirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (1.3.1)\nRequirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (0.10.0)\nRequirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.7/dist-packages (from matplotlib>=3.0.0->ktrain) (2.8.1)\nRequirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.7/dist-packages (from pandas>=1.0.1->ktrain) (2018.9)\nRequirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (2.10)\nRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (2020.12.5)\nRequirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests->ktrain) (3.0.4)\nRequirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in 
/usr/local/lib/python3.7/dist-packages (from requests->ktrain) (1.24.3)\nRequirement already satisfied: prompt-toolkit<2.0.0,>=1.0.4 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (1.0.18)\nRequirement already satisfied: pexpect; sys_platform != \"win32\" in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (4.8.0)\nRequirement already satisfied: setuptools>=18.5 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (54.0.0)\nRequirement already satisfied: pygments in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (2.6.1)\nRequirement already satisfied: pickleshare in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (0.7.5)\nRequirement already satisfied: simplegeneric>0.8 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (0.8.1)\nRequirement already satisfied: traitlets>=4.2 in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (5.0.5)\nRequirement already satisfied: decorator in /usr/local/lib/python3.7/dist-packages (from ipython->ktrain) (4.4.2)\nRequirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from langdetect->ktrain) (1.15.0)\nRequirement already satisfied: regex in /usr/local/lib/python3.7/dist-packages (from syntok->ktrain) (2019.12.20)\nRequirement already satisfied: Keras>=2.2.4 in /usr/local/lib/python3.7/dist-packages (from seqeval==0.0.19->ktrain) (2.4.3)\nRequirement already satisfied: tqdm>=4.27 in /usr/local/lib/python3.7/dist-packages (from transformers<4.0,>=3.1.0->ktrain) (4.41.1)\nCollecting sacremoses\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/7d/34/09d19aff26edcc8eb2a01bed8e98f13a1537005d31e95233fd48216eed10/sacremoses-0.0.43.tar.gz (883kB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 890kB 53.4MB/s \n\u001b[?25hRequirement already satisfied: filelock in /usr/local/lib/python3.7/dist-packages (from transformers<4.0,>=3.1.0->ktrain) 
(3.0.12)\nCollecting tokenizers==0.9.3\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/7b/ac/f5ba028f0f097d855e1541301e946d4672eb0f30b6e25cb2369075f916d2/tokenizers-0.9.3-cp37-cp37m-manylinux1_x86_64.whl (2.9MB)\n\u001b[K |โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ| 2.9MB 54.5MB/s \n\u001b[?25hRequirement already satisfied: protobuf in /usr/local/lib/python3.7/dist-packages (from transformers<4.0,>=3.1.0->ktrain) (3.12.4)\nCollecting keras-transformer>=0.38.0\n Downloading https://files.pythonhosted.org/packages/89/6c/d6f0c164f4cc16fbc0d0fea85f5526e87a7d2df7b077809e422a7e626150/keras-transformer-0.38.0.tar.gz\nRequirement already satisfied: wcwidth in /usr/local/lib/python3.7/dist-packages (from prompt-toolkit<2.0.0,>=1.0.4->ipython->ktrain) (0.2.5)\nRequirement already satisfied: ptyprocess>=0.5 in /usr/local/lib/python3.7/dist-packages (from pexpect; sys_platform != \"win32\"->ipython->ktrain) (0.7.0)\nRequirement already satisfied: ipython-genutils in /usr/local/lib/python3.7/dist-packages (from traitlets>=4.2->ipython->ktrain) (0.2.0)\nRequirement already satisfied: pyyaml in /usr/local/lib/python3.7/dist-packages (from Keras>=2.2.4->seqeval==0.0.19->ktrain) (3.13)\nRequirement already satisfied: h5py in /usr/local/lib/python3.7/dist-packages (from Keras>=2.2.4->seqeval==0.0.19->ktrain) (2.10.0)\nRequirement already satisfied: click in /usr/local/lib/python3.7/dist-packages (from sacremoses->transformers<4.0,>=3.1.0->ktrain) (7.1.2)\nCollecting keras-pos-embd>=0.11.0\n Downloading https://files.pythonhosted.org/packages/09/70/b63ed8fc660da2bb6ae29b9895401c628da5740c048c190b5d7107cadd02/keras-pos-embd-0.11.0.tar.gz\nCollecting keras-multi-head>=0.27.0\n Downloading https://files.pythonhosted.org/packages/e6/32/45adf2549450aca7867deccfa04af80a0ab1ca139af44b16bc669e0e09cd/keras-multi-head-0.27.0.tar.gz\nCollecting keras-layer-normalization>=0.14.0\n Downloading 
https://files.pythonhosted.org/packages/a4/0e/d1078df0494bac9ce1a67954e5380b6e7569668f0f3b50a9531c62c1fc4a/keras-layer-normalization-0.14.0.tar.gz\nCollecting keras-position-wise-feed-forward>=0.6.0\n Downloading https://files.pythonhosted.org/packages/e3/59/f0faa1037c033059e7e9e7758e6c23b4d1c0772cd48de14c4b6fd4033ad5/keras-position-wise-feed-forward-0.6.0.tar.gz\nCollecting keras-embed-sim>=0.8.0\n Downloading https://files.pythonhosted.org/packages/57/ef/61a1e39082c9e1834a2d09261d4a0b69f7c818b359216d4e1912b20b1c86/keras-embed-sim-0.8.0.tar.gz\nCollecting keras-self-attention==0.46.0\n Downloading https://files.pythonhosted.org/packages/15/6b/c804924a056955fa1f3ff767945187103cfc851ba9bd0fc5a6c6bc18e2eb/keras-self-attention-0.46.0.tar.gz\nBuilding wheels for collected packages: ktrain, langdetect, syntok, seqeval, keras-bert, sacremoses, keras-transformer, keras-pos-embd, keras-multi-head, keras-layer-normalization, keras-position-wise-feed-forward, keras-embed-sim, keras-self-attention\n Building wheel for ktrain (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for ktrain: filename=ktrain-0.25.4-cp37-none-any.whl size=25276443 sha256=a21bf62c621920a75422c4df8cae95d466380843fd1eda8e66302f5807ceda37\n Stored in directory: /root/.cache/pip/wheels/1b/77/8a/bdceaabc308e7178d575278bf6143b7d1a9b939a1e40c56b88\n Building wheel for langdetect (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for langdetect: filename=langdetect-1.0.8-cp37-none-any.whl size=993193 sha256=aec636b54ffe434c9028359c31bdfc76e9da9a1752fa7f10d87e69d57c34d46a\n Stored in directory: /root/.cache/pip/wheels/8d/b3/aa/6d99de9f3841d7d3d40a60ea06e6d669e8e5012e6c8b947a57\n Building wheel for syntok (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n Created wheel for syntok: filename=syntok-1.3.1-cp37-none-any.whl size=20919 sha256=4f6fa992ceefd03a0101faff02b00f882b85c93d7c32eac68c56155956a0bb9e\n Stored in directory: /root/.cache/pip/wheels/51/c6/a4/be1920586c49469846bcd2888200bdecfe109ec421dab9be2d\n Building wheel for seqeval (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for seqeval: filename=seqeval-0.0.19-cp37-none-any.whl size=9919 sha256=ac03ed5c47baebb742f37bf9b08ad4e45782dd3ee4bd727f850a4af61f5fbf77\n Stored in directory: /root/.cache/pip/wheels/8d/1f/bf/1198beceed805a2099060975f6281d1b01046dd279e19c97be\n Building wheel for keras-bert (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-bert: filename=keras_bert-0.86.0-cp37-none-any.whl size=34144 sha256=199f3eea09c452e52c98f833287b4c2e0161520432357af5ecfc932031eddb12\n Stored in directory: /root/.cache/pip/wheels/66/f0/b1/748128b58562fc9e31b907bb5e2ab6a35eb37695e83911236b\n Building wheel for sacremoses (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for sacremoses: filename=sacremoses-0.0.43-cp37-none-any.whl size=893262 sha256=ba82c1360a233bd048daf43f948e0400661f80d3d21e0c1b72500c2fb34065b1\n Stored in directory: /root/.cache/pip/wheels/29/3c/fd/7ce5c3f0666dab31a50123635e6fb5e19ceb42ce38d4e58f45\n Building wheel for keras-transformer (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-transformer: filename=keras_transformer-0.38.0-cp37-none-any.whl size=12942 sha256=598c25f31534d9bbf3e134135ec5ffabecc0de55f3bc3ccef1bc9362f20c8f2b\n Stored in directory: /root/.cache/pip/wheels/e5/fb/3a/37b2b9326c799aa010ae46a04ddb04f320d8c77c0b7e837f4e\n Building wheel for keras-pos-embd (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n Created wheel for keras-pos-embd: filename=keras_pos_embd-0.11.0-cp37-none-any.whl size=7554 sha256=8dffa94551da41c503305037b9936c354793a06d95bcd09d6489f3bea15c49ca\n Stored in directory: /root/.cache/pip/wheels/5b/a1/a0/ce6b1d49ba1a9a76f592e70cf297b05c96bc9f418146761032\n Building wheel for keras-multi-head (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-multi-head: filename=keras_multi_head-0.27.0-cp37-none-any.whl size=15611 sha256=7a015af070bc4ce247816f6ae650140ba6ac85bdb0a845d633c9dea464c22c7a\n Stored in directory: /root/.cache/pip/wheels/b5/b4/49/0a0c27dcb93c13af02fea254ff51d1a43a924dd4e5b7a7164d\n Building wheel for keras-layer-normalization (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-layer-normalization: filename=keras_layer_normalization-0.14.0-cp37-none-any.whl size=5269 sha256=c4050c794d67cf2aa834ffad4960aed9a36145f0a16b4e54f6fab703efb570f6\n Stored in directory: /root/.cache/pip/wheels/54/80/22/a638a7d406fd155e507aa33d703e3fa2612b9eb7bb4f4fe667\n Building wheel for keras-position-wise-feed-forward (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-position-wise-feed-forward: filename=keras_position_wise_feed_forward-0.6.0-cp37-none-any.whl size=5623 sha256=39e5ca51c76b0a07dd6c5f5208f8d68e5e5ab8d88ad8638279f506220420eb6a\n Stored in directory: /root/.cache/pip/wheels/39/e2/e2/3514fef126a00574b13bc0b9e23891800158df3a3c19c96e3b\n Building wheel for keras-embed-sim (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for keras-embed-sim: filename=keras_embed_sim-0.8.0-cp37-none-any.whl size=4558 sha256=a01ad8cac95ba2cd3b0d2462b0dab4b91b5e57a13d65802f81b5ed8514cce406\n Stored in directory: /root/.cache/pip/wheels/49/45/8b/c111f6cc8bec253e984677de73a6f4f5d2f1649f42aac191c8\n Building wheel for keras-self-attention (setup.py) ... 
\u001b[?25l\u001b[?25hdone\n Created wheel for keras-self-attention: filename=keras_self_attention-0.46.0-cp37-none-any.whl size=17278 sha256=d0a6d2471a49500962a43539660a7cf5acaf4e829fb5d7c906fe434d7cbade2c\n Stored in directory: /root/.cache/pip/wheels/d2/2e/80/fec4c05eb23c8e13b790e26d207d6e0ffe8013fad8c6bdd4d2\nSuccessfully built ktrain langdetect syntok seqeval keras-bert sacremoses keras-transformer keras-pos-embd keras-multi-head keras-layer-normalization keras-position-wise-feed-forward keras-embed-sim keras-self-attention\n\u001b[31mERROR: transformers 3.5.1 has requirement sentencepiece==0.1.91, but you'll have sentencepiece 0.1.95 which is incompatible.\u001b[0m\nInstalling collected packages: threadpoolctl, scikit-learn, langdetect, cchardet, syntok, seqeval, sacremoses, tokenizers, sentencepiece, transformers, keras-pos-embd, keras-self-attention, keras-multi-head, keras-layer-normalization, keras-position-wise-feed-forward, keras-embed-sim, keras-transformer, keras-bert, whoosh, ktrain\n Found existing installation: scikit-learn 0.22.2.post1\n Uninstalling scikit-learn-0.22.2.post1:\n Successfully uninstalled scikit-learn-0.22.2.post1\nSuccessfully installed cchardet-2.1.7 keras-bert-0.86.0 keras-embed-sim-0.8.0 keras-layer-normalization-0.14.0 keras-multi-head-0.27.0 keras-pos-embd-0.11.0 keras-position-wise-feed-forward-0.6.0 keras-self-attention-0.46.0 keras-transformer-0.38.0 ktrain-0.25.4 langdetect-1.0.8 sacremoses-0.0.43 scikit-learn-0.23.2 sentencepiece-0.1.95 seqeval-0.0.19 syntok-1.3.1 threadpoolctl-2.1.0 tokenizers-0.9.3 transformers-3.5.1 whoosh-2.7.4\n" ], [ "import os.path\r\nimport numpy as np\r\nimport pandas as pd\r\nimport tensorflow as tf\r\nimport ktrain\r\nfrom ktrain import text ", "_____no_output_____" ] ], [ [ "## Part 1: Data Preprocessing", "_____no_output_____" ], [ "### Loading the IMDB dataset", "_____no_output_____" ] ], [ [ "dataset = tf.keras.utils.get_file(fname = \"aclImdb_v1.tar\",\r\n origin = 
\"https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar\",\r\n extract = True)\r\nIMDB_DATADIR = os.path.join(os.path.dirname(dataset), 'aclImdb')", "Downloading data from https://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar\n84131840/84125825 [==============================] - 2s 0us/step\n" ], [ "print(os.path.dirname(dataset))\r\nprint(IMDB_DATADIR)", "/root/.keras/datasets\n/root/.keras/datasets/aclImdb\n" ] ], [ [ "### Creating the training & test sets", "_____no_output_____" ] ], [ [ "(X_train, y_train), (X_test, y_test), preproc = text.texts_from_folder(datadir = IMDB_DATADIR, \r\n classes = ['pos','neg'],\r\n maxlen = 500, \r\n train_test_names = ['train', 'test'],\r\n preprocess_mode = 'bert')", "detected encoding: utf-8\ndownloading pretrained BERT model (uncased_L-12_H-768_A-12.zip)...\n[โ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆโ–ˆ]\nextracting pretrained BERT model...\ndone.\n\ncleanup downloaded zip...\ndone.\n\npreprocessing train...\nlanguage: en\n" ] ], [ [ "## Part 2: Building the BERT model", "_____no_output_____" ] ], [ [ "model = text.text_classifier(name = 'bert',\r\n train_data = (X_train, y_train),\r\n preproc = preproc)", "Is Multi-Label? False\nmaxlen is 500\ndone.\n" ] ], [ [ "## Part 3: Training the BERT model", "_____no_output_____" ] ], [ [ "learner = ktrain.get_learner(model = model, \r\n train_data = (X_train, y_train),\r\n val_data = (X_test, y_test),\r\n batch_size = 6)", "_____no_output_____" ], [ "learner.fit_onecycle(lr=2e-5,\r\n epochs = 1)", "\n\nbegin training using onecycle policy with max lr of 2e-05...\n4167/4167 [==============================] - 3436s 820ms/step - loss: 0.3313 - accuracy: 0.8479 - val_loss: 0.1619 - val_accuracy: 0.9383\n" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown", "markdown" ], [ "code", "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ] ]
d0648ee1a52be7f2190c84ec7d539059836b6cb4
920,987
ipynb
Jupyter Notebook
TD Actor Critic/TD_Actor_Critic_seperate_net.ipynb
gt-coar/BrianSURE2021
ef3087763a1500b5dc01fe74474cb1bf4936773a
[ "MIT" ]
null
null
null
TD Actor Critic/TD_Actor_Critic_seperate_net.ipynb
gt-coar/BrianSURE2021
ef3087763a1500b5dc01fe74474cb1bf4936773a
[ "MIT" ]
null
null
null
TD Actor Critic/TD_Actor_Critic_seperate_net.ipynb
gt-coar/BrianSURE2021
ef3087763a1500b5dc01fe74474cb1bf4936773a
[ "MIT" ]
null
null
null
40.548893
407
0.49593
[ [ [ "# Enable GPU", "_____no_output_____" ] ], [ [ "import torch\ndevice = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')", "_____no_output_____" ] ], [ [ "# Actor and Critic Network\n\n", "_____no_output_____" ] ], [ [ "import torch.nn as nn\nimport torch.nn.functional as F\nfrom torch.distributions import Categorical\n\nclass Actor_Net(nn.Module):\n    def __init__(self, input_dims, output_dims, num_neurons = 128):\n        super(Actor_Net, self).__init__()\n        self.fc1 = nn.Linear(input_dims, num_neurons)\n        self.actor = nn.Linear(num_neurons, output_dims)\n        self.log_probs = []\n        self.entropies = []\n\n    def forward(self, state):\n        x = F.relu(self.fc1(state))\n        x = F.softmax(self.actor(x), dim = 1)\n\n        return x\n\n    def get_action(self, state):\n        with torch.no_grad():\n            probs = self.forward(state)\n            dist = Categorical(probs = probs)\n            action = dist.sample()\n        return action\n    \n    def eval_action(self, state):\n        probs = self.forward(state)\n        dist = Categorical(probs = probs)\n        action = dist.sample().to(device)\n        log_prob = dist.log_prob(action)\n        entropy = dist.entropy()\n        self.log_probs.append(log_prob)\n        self.entropies.append(entropy)\n\n        return action\n\nclass Critic_Net(nn.Module):\n    def __init__ (self, input_dims, output_dims, num_neurons = 128):\n        super(Critic_Net, self).__init__()\n\n        self.values = []\n        self.next_values = []\n        \n        self.fc1 = nn.Linear(input_dims, num_neurons)\n        self.critic = nn.Linear(num_neurons, 1)\n\n    def forward (self, state):\n        x = F.relu(self.fc1(state))\n        x = self.critic(x)\n\n        return x", "_____no_output_____" ], [ "import torch.optim as optim\nimport numpy as np\nimport gym\n\nclass Actor_Critic_Agent(nn.Module):\n    def __init__(self, input_dims, output_dims, optimizer = 'RMSprop', num_neurons = 128 , gamma = 0.99, actor_lr=0.001, critic_lr = 0.01):\n        super(Actor_Critic_Agent, self).__init__()\n        self.actor_net = Actor_Net(input_dims= input_dims, output_dims= output_dims, num_neurons= num_neurons).to(device)\n        self.critic_net = 
Critic_Net(input_dims=input_dims, output_dims= output_dims, num_neurons= num_neurons).to(device)\n self.gamma = gamma\n if optimizer == 'RMSprop':\n self.actor_optimizer = optim.RMSprop(params = self.actor_net.parameters(), lr =actor_lr)\n self.critic_optimizer = optim.RMSprop(params = self.critic_net.parameters(), lr = critic_lr)\n else:\n self.actor_optimizer = optim.Adam(params = self.actor_net.parameters(), lr = actor_lr)\n self.critic_optimizer = optim.Adam(params = self.critic_net.parameters(), lr = critic_lr)\n\n def learn_mean(self, rewards, dones):\n value_criteration = nn.MSELoss()\n value_losses = []\n actor_losses = []\n self.critic_net.next_values = torch.cat(self.critic_net.next_values, dim = 0).squeeze(0)\n self.critic_net.values = torch.cat(self.critic_net.values, dim = 0).squeeze(0)\n self.actor_net.log_probs = torch.cat(self.actor_net.log_probs, dim = 0)\n self.actor_net.entropies = torch.cat(self.actor_net.entropies, dim = 0)\n\n for reward, entropy, log_prob, v, v_next, done in zip(rewards ,self.actor_net.entropies, self.actor_net.log_probs, self.critic_net.values, self.critic_net.next_values, dones):\n td_target = reward + self.gamma * v_next * done\n td_error = td_target - v\n value_loss = value_criteration(v, td_target.detach())- 0.001 * entropy.detach()\n actor_loss = - log_prob * td_error.detach() \n value_losses.append(value_loss)\n actor_losses.append(actor_loss)\n\n self.critic_optimizer.zero_grad()\n value_losses = torch.stack(value_losses).sum()\n value_losses.backward()\n self.critic_optimizer.step() \n\n self.actor_optimizer.zero_grad()\n actor_losses = torch.stack(actor_losses).sum()\n actor_losses.backward()\n self.actor_optimizer.step()\n\n \n # clear out memory \n self.actor_net.log_probs = []\n self.actor_net.entropies = []\n self.critic_net.values = []\n self.critic_net.next_values = []\n\n", "_____no_output_____" ] ], [ [ "# Without Wandb", "_____no_output_____" ] ], [ [ "import gym\nimport time\nimport pdb\n\nenv = 
gym.make('CartPole-v1')\nenv.seed(543)\ntorch.manual_seed(543)\nstate_dims = env.observation_space.shape[0]\naction_dims = env.action_space.n\nagent = Actor_Critic_Agent(input_dims= state_dims, output_dims = action_dims)\n\ndef train():\n\n    num_ep = 2000\n    print_every = 100\n    running_score = 10\n    start = time.time()\n\n    rewards = []\n    dones = []\n\n    for ep in range(1, num_ep + 1):\n        state = env.reset()\n        score = 0\n        done = False\n        rewards = []\n        dones = []\n\n        while not done:\n            state = torch.tensor([state]).float().to(device)\n            action = agent.actor_net.eval_action(state)\n            v = agent.critic_net(state)\n\n            next_state, reward, done, _ = env.step(action.item())\n            v_next = agent.critic_net(torch.tensor([next_state]).float().to(device))\n            \n            agent.critic_net.values.append(v.squeeze(0))\n            agent.critic_net.next_values.append(v_next.squeeze(0))\n            rewards.append(reward)\n            dones.append(1 - done)\n            \n            # update episode\n            score += reward\n            state = next_state\n\n            if done:\n                break\n\n        # update agent\n        #pdb.set_trace()\n        agent.learn_mean(rewards,dones)\n        \n        # calculating score and running score\n        running_score = 0.05 * score + (1 - 0.05) * running_score\n\n        if ep % print_every == 0:\n            print('episode: {}, running score: {}, time elapsed: {}'.format(ep, running_score, time.time() - start))\n\n\n\n", "_____no_output_____" ], [ "train() #RMS", "episode: 100, running score: 43.32507441570408, time elapsed: 4.842878341674805\nepisode: 200, running score: 129.30332722904944, time elapsed: 19.552313089370728\n" ] ], [ [ "# With wandb", "_____no_output_____" ] ], [ [ "!pip install wandb\n!wandb login\n", "_____no_output_____" ], [ "import wandb\nsweep_config = dict()\nsweep_config['method'] = 'grid'\nsweep_config['metric'] = {'name': 'running_score', 'goal': 'maximize'}\nsweep_config['parameters'] = {'learning': {'value': 'learn_mean'}, 'actor_learning_rate': {'values' : [0.01, 0.001, 0.0001,0.0003,0.00001]}, 'critic_learning_rate' : {'values': [0.01, 0.001, 0.0001, 0.0003, 0.00001]}\n                              , 
'num_neurons': {'value': 128 }, 'optimizer': {'values' : ['RMSprop', 'Adam']}}\n\nsweep_id = wandb.sweep(sweep_config, project = 'Advantage_Actor_Critic')", "Create sweep with ID: t9gia22t\nSweep URL: https://wandb.ai/ko120/Advantage_Actor_Critic/sweeps/t9gia22t\n" ], [ "import gym \nimport torch\nimport time\nimport wandb\n\n\n\ndef train():\n    wandb.init(config = {'env':'CartPole-v1','algorithm': 'Actor_Critic','architecture': 'separate','num_layers':'2'}, project = 'Advantage_Actor_Critic',group = 'Cart_128_neurons_2_layer')\n    config = wandb.config\n\n    env = gym.make('CartPole-v1')\n    env.seed(543)\n    torch.manual_seed(543)\n\n    state_dim = env.observation_space.shape[0]\n    action_dim = env.action_space.n\n\n    device = torch.device('cuda:0' if torch.cuda.is_available() else 'cpu')\n    agent = Actor_Critic_Agent(input_dims= state_dim, output_dims= action_dim, optimizer = config.optimizer, num_neurons= config.num_neurons, actor_lr = config.actor_learning_rate, critic_lr = config.critic_learning_rate)\n\n\n    num_ep = 3000\n    print_interval = 100\n    save_interval = 1000\n    running_score = 10\n    start = time.time()\n\n    \n    wandb.watch(agent)\n    for ep in range(1,num_ep+1):\n        state = env.reset()\n        score = 0\n        done = False\n        rewards = []\n        dones = []\n\n        while not done:\n            state = torch.tensor([state]).float().to(device)\n            action = agent.actor_net.eval_action(state)\n            v = agent.critic_net(state)\n\n            next_state, reward, done, _ = env.step(action.item())\n            v_next = agent.critic_net(torch.tensor([next_state]).float().to(device))\n            \n            agent.critic_net.values.append(v.squeeze(0))\n            agent.critic_net.next_values.append(v_next.squeeze(0))\n            rewards.append(reward)\n            dones.append(1 - done)\n            \n            # update episode\n            score += reward\n            state = next_state\n\n            if done:\n                break\n\n        # update agent\n        agent.learn_mean(rewards,dones)\n        \n        # calculating score and running score\n        running_score = 0.05 * score + (1 - 0.05) * running_score\n\n        wandb.log({'episode': ep, 'running_score': running_score}) \n\n        if ep 
% print_interval == 0:\n print('episode {} average reward {}, ended at {:.01f}'.format(ep, running_score, time.time() - start)) \n \n if ep % save_interval == 0:\n save_name_actor = 'actor_' + str(ep) + '.pt'\n torch.save(agent.actor_net.state_dict(),save_name_actor)\n save_name_critic = 'critic_' + str(ep) + '.pt'\n torch.save(agent.critic_net.state_dict(),save_name_critic)\n wandb.save(save_name_actor)\n wandb.save(save_name_critic)\n\n if ep == num_ep:\n dummy_input = torch.rand(1,4).to(device)\n torch.onnx.export(agent.actor_net,dummy_input,'final_model_actor.onnx')\n wandb.save('final_model_actor.onnx')\n torch.onnx.export(agent.critic_net, dummy_input, 'final_model_critic.onnx')\n wandb.save('final_model_critic.onnx')\n ", "_____no_output_____" ], [ "wandb.agent(sweep_id, train)", "\u001b[34m\u001b[1mwandb\u001b[0m: Agent Starting Run: wivnmds7 with config:\n\u001b[34m\u001b[1mwandb\u001b[0m: \tactor_learning_rate: 0.01\n\u001b[34m\u001b[1mwandb\u001b[0m: \tcritic_learning_rate: 0.01\n\u001b[34m\u001b[1mwandb\u001b[0m: \tlearning: learn_mean\n\u001b[34m\u001b[1mwandb\u001b[0m: \tnum_neurons: 128\n\u001b[34m\u001b[1mwandb\u001b[0m: \toptimizer: RMSprop\n\u001b[34m\u001b[1mwandb\u001b[0m: \u001b[33mWARNING\u001b[0m Ignored wandb.init() arg project when running a sweep\n" ] ], [ [ "# You can see the result here!\n[Report Link](https://wandb.ai/ko120/Advantage_Actor_Critic/reports/TD-Actor-Critic-Learning-rate-tune---Vmlldzo4OTIwODg)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ] ]
d0648f64f9504f4ac0f20c8cc100a85b421e3dac
173,648
ipynb
Jupyter Notebook
PREDICTION-MODEL-1.ipynb
fuouo/TrafficBato
bd8ab5645116db4029b90bb5a28e0134d59e9ac0
[ "MIT" ]
null
null
null
PREDICTION-MODEL-1.ipynb
fuouo/TrafficBato
bd8ab5645116db4029b90bb5a28e0134d59e9ac0
[ "MIT" ]
null
null
null
PREDICTION-MODEL-1.ipynb
fuouo/TrafficBato
bd8ab5645116db4029b90bb5a28e0134d59e9ac0
[ "MIT" ]
null
null
null
58.983696
38,484
0.633298
[ [ [ "## Import Necessary Packages", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nimport datetime\nimport os\n\nnp.random.seed(1337) # for reproducibility\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics.classification import accuracy_score\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.metrics.regression import r2_score, mean_squared_error, mean_absolute_error\n\nfrom dbn.tensorflow import SupervisedDBNRegression", "_____no_output_____" ] ], [ [ "## Define Model Settings", "_____no_output_____" ] ], [ [ "RBM_EPOCHS = 5\nDBN_EPOCHS = 150\nRBM_LEARNING_RATE = 0.01\nDBN_LEARNING_RATE = 0.01\nHIDDEN_LAYER_STRUCT = [20, 50, 100]\nACTIVE_FUNC = 'relu'\nBATCH_SIZE = 28", "_____no_output_____" ] ], [ [ "## Define Directory, Road, and Year", "_____no_output_____" ] ], [ [ "# Read the dataset\nROAD = \"Vicente Cruz\"\nYEAR = \"2015\"\nEXT = \".csv\"\nDATASET_DIVISION = \"seasonWet\"\nDIR = \"../../../datasets/Thesis Datasets/\"\nOUTPUT_DIR = \"PM1/Rolling 3/\"\nMODEL_DIR = \"PM1/Rolling 3/\"\n\n'''''''Training dataset'''''''\nWP = False\nWEEKDAY = False\nCONNECTED_ROADS = False\nCONNECTED_1 = [\"Antipolo\"]\ntrafficDT = \"recon_traffic\" #orig_traffic recon_traffic\nfeatureEngineering = \"Rolling\" #Rolling Expanding Rolling and Expanding\ntimeFE = \"today\" #today yesterday\ntimeConnected = \"today\"\nROLLING_WINDOW = 3\nEXPANDING_WINDOW = 3\nRECON_SHIFT = 96\n# RECON_FE_WINDOW = 48", "_____no_output_____" ], [ "def addWorkingPeakFeatures(df):\n result_df = df.copy()\n\n # Converting the index as date\n result_df.index = pd.to_datetime(result_df.index)\n \n # Create column work_day\n result_df['work_day'] = ((result_df.index.dayofweek) < 5).astype(int)\n\n # Consider non-working holiday\n if DATASET_DIVISION is not \"seasonWet\":\n\n # Jan\n result_df.loc['2015-01-01', 'work_day'] = 0\n result_df.loc['2015-01-02', 'work_day'] = 0\n\n # Feb\n result_df.loc['2015-02-19', 'work_day'] = 0\n 
result_df.loc['2015-02-25', 'work_day'] = 0\n\n # Apr\n result_df.loc['2015-04-02', 'work_day'] = 0\n result_df.loc['2015-04-03', 'work_day'] = 0\n result_df.loc['2015-04-09', 'work_day'] = 0\n\n # May\n result_df.loc['2015-05-01', 'work_day'] = 0\n\n # Jun\n result_df.loc['2015-06-12', 'work_day'] = 0\n result_df.loc['2015-06-24', 'work_day'] = 0\n\n # Jul\n result_df.loc['2015-07-17', 'work_day'] = 0\n\n # Aug\n result_df.loc['2015-08-21', 'work_day'] = 0\n result_df.loc['2015-08-31', 'work_day'] = 0\n\n # Sep\n result_df.loc['2015-08-25', 'work_day'] = 0\n\n if DATASET_DIVISION is not \"seasonWet\":\n # Nov\n result_df.loc['2015-11-30', 'work_day'] = 0\n\n # Dec\n result_df.loc['2015-12-24', 'work_day'] = 0\n result_df.loc['2015-12-25', 'work_day'] = 0\n result_df.loc['2015-12-30', 'work_day'] = 0\n result_df.loc['2015-12-31', 'work_day'] = 0\n\n # Consider class suspension\n if DATASET_DIVISION is not \"seasonWet\":\n # Jan\n result_df.loc['2015-01-08', 'work_day'] = 0\n result_df.loc['2015-01-09', 'work_day'] = 0\n result_df.loc['2015-01-14', 'work_day'] = 0\n result_df.loc['2015-01-15', 'work_day'] = 0\n result_df.loc['2015-01-16', 'work_day'] = 0\n result_df.loc['2015-01-17', 'work_day'] = 0\n\n # Jul\n result_df.loc['2015-07-06', 'work_day'] = 0\n result_df.loc['2015-07-08', 'work_day'] = 0\n result_df.loc['2015-07-09', 'work_day'] = 0\n result_df.loc['2015-07-10', 'work_day'] = 0\n\n # Aug\n result_df.loc['2015-08-10', 'work_day'] = 0\n result_df.loc['2015-08-11', 'work_day'] = 0\n\n # Sep\n result_df.loc['2015-09-10', 'work_day'] = 0\n\n # Oct\n result_df.loc['2015-10-02', 'work_day'] = 0\n result_df.loc['2015-10-19', 'work_day'] = 0\n\n if DATASET_DIVISION is not \"seasonWet\":\n # Nov\n result_df.loc['2015-11-16', 'work_day'] = 0\n result_df.loc['2015-11-17', 'work_day'] = 0\n result_df.loc['2015-11-18', 'work_day'] = 0\n result_df.loc['2015-11-19', 'work_day'] = 0\n result_df.loc['2015-11-20', 'work_day'] = 0\n\n # Dec\n result_df.loc['2015-12-16', 
'work_day'] = 0\n result_df.loc['2015-12-18', 'work_day'] = 0\n\n result_df['peak_hour'] = 0\n\n # Set morning peak hour\n\n start = datetime.time(7,0,0)\n end = datetime.time(10,0,0)\n\n result_df.loc[result_df.between_time(start, end).index, 'peak_hour'] = 1\n\n # Set afternoon peak hour\n\n start = datetime.time(16,0,0)\n end = datetime.time(19,0,0)\n\n result_df.loc[result_df.between_time(start, end).index, 'peak_hour'] = 1\n \n result_df\n \n return result_df", "_____no_output_____" ], [ "def reconstructDT(df, pastTraffic=False, trafficFeatureNeeded=[]):\n result_df = df.copy()\n\n # Converting the index as date\n result_df.index = pd.to_datetime(result_df.index, format='%d/%m/%Y %H:%M')\n result_df['month'] = result_df.index.month\n result_df['day'] = result_df.index.day\n result_df['hour'] = result_df.index.hour\n result_df['min'] = result_df.index.minute \n result_df['dayOfWeek'] = result_df.index.dayofweek\n \n if pastTraffic:\n for f in trafficFeatureNeeded:\n result_df[f + '-' + str(RECON_SHIFT*15) + \"mins\"] = result_df[f].shift(RECON_SHIFT)\n \n result_df = result_df.iloc[RECON_SHIFT:, :]\n \n for f in range(len(result_df.columns)):\n result_df[result_df.columns[f]] = normalize(result_df[result_df.columns[f]])\n\n return result_df", "_____no_output_____" ], [ "def getNeededFeatures(columns, arrFeaturesNeed, featureEngineering=\"Original\"):\n to_remove = []\n if len(arrFeaturesNeed) == 0: #all features aren't needed\n to_remove += range(0, len(columns))\n\n else:\n if featureEngineering == \"Original\":\n compareTo = \" \"\n elif featureEngineering == \"Rolling\" or featureEngineering == \"Expanding\":\n compareTo = \"_\"\n \n for f in arrFeaturesNeed:\n for c in range(0, len(columns)):\n if f not in columns[c].split(compareTo)[0] and columns[c].split(compareTo)[0] not in arrFeaturesNeed:\n to_remove.append(c)\n if len(columns[c].split(compareTo)) > 1:\n if \"Esum\" in columns[c].split(compareTo)[1]: #Removing all Expanding Sum \n 
to_remove.append(c)\n \n return to_remove", "_____no_output_____" ], [ "def normalize(data):\n y = pd.to_numeric(data)\n y = np.array(y.reshape(-1, 1))\n \n scaler = MinMaxScaler()\n y = scaler.fit_transform(y)\n y = y.reshape(1, -1)[0]\n return y", "_____no_output_____" ] ], [ [ "<br><br>\n### Preparing Traffic Dataset", "_____no_output_____" ], [ "#### Importing Original Traffic (wo new features)", "_____no_output_____" ] ], [ [ "TRAFFIC_DIR = DIR + \"mmda/\"\nTRAFFIC_FILENAME = \"mmda_\" + ROAD + \"_\" + YEAR + \"_\" + DATASET_DIVISION\norig_traffic = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\norig_traffic = orig_traffic.fillna(0)\n\n#Converting index to date and time, and removing 'dt' column\norig_traffic.index = pd.to_datetime(orig_traffic.dt, format='%d/%m/%Y %H:%M')\ncols_to_remove = [0]\ncols_to_remove = getNeededFeatures(orig_traffic.columns, [\"statusN\"])\norig_traffic.drop(orig_traffic.columns[[cols_to_remove]], axis=1, inplace=True)\norig_traffic.head()\n\nif WEEKDAY:\n orig_traffic = orig_traffic[((orig_traffic.index.dayofweek) < 5)]\norig_traffic.head()", "_____no_output_____" ], [ "TRAFFIC_DIR = DIR + \"mmda/Rolling/\" + DATASET_DIVISION + \"/\"\nTRAFFIC_FILENAME = \"eng_win\" + str(ROLLING_WINDOW) + \"_mmda_\" + ROAD + \"_\" + YEAR + \"_\" + DATASET_DIVISION\nrolling_traffic = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\n\ncols_to_remove = [0, 1, 2]\ncols_to_remove += getNeededFeatures(rolling_traffic.columns, [\"statusN\"], \"Rolling\")\n\nrolling_traffic.index = pd.to_datetime(rolling_traffic.dt, format='%Y-%m-%d %H:%M')\n\nrolling_traffic.drop(rolling_traffic.columns[[cols_to_remove]], axis=1, inplace=True)\n\nif WEEKDAY:\n rolling_traffic = rolling_traffic[((rolling_traffic.index.dayofweek) < 5)]\n \nrolling_traffic.head()", "_____no_output_____" ], [ "TRAFFIC_DIR = DIR + \"mmda/Expanding/\" + DATASET_DIVISION + \"/\"\nTRAFFIC_FILENAME = \"eng_win\" + str(EXPANDING_WINDOW) + 
\"_mmda_\" + ROAD + \"_\" + YEAR + \"_\" + DATASET_DIVISION\nexpanding_traffic = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\n\ncols_to_remove = [0, 1, 2, 5]\ncols_to_remove += getNeededFeatures(expanding_traffic.columns, [\"statusN\"], \"Rolling\")\n\nexpanding_traffic.index = pd.to_datetime(expanding_traffic.dt, format='%d/%m/%Y %H:%M')\n\nexpanding_traffic.drop(expanding_traffic.columns[[cols_to_remove]], axis=1, inplace=True)\n\nif WEEKDAY:\n expanding_traffic = expanding_traffic[((expanding_traffic.index.dayofweek) < 5)]\nexpanding_traffic.head()", "_____no_output_____" ], [ "recon_traffic = reconstructDT(orig_traffic, pastTraffic=True, trafficFeatureNeeded=['statusN'])\nrecon_traffic.head()", "c:\\users\\ronnie nieva\\anaconda3\\envs\\tensorflow\\lib\\site-packages\\ipykernel_launcher.py:3: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead\n This is separate from the ipykernel package so we can avoid doing imports until\nc:\\users\\ronnie nieva\\anaconda3\\envs\\tensorflow\\lib\\site-packages\\sklearn\\utils\\validation.py:475: DataConversionWarning: Data with input dtype int64 was converted to float64 by MinMaxScaler.\n warnings.warn(msg, DataConversionWarning)\n" ], [ "connected_roads = []\n\nfor c in CONNECTED_1:\n TRAFFIC_DIR = DIR + \"mmda/\"\n TRAFFIC_FILENAME = \"mmda_\" + c + \"_\" + YEAR + \"_\" + DATASET_DIVISION\n temp = pd.read_csv(TRAFFIC_DIR + TRAFFIC_FILENAME + EXT, skipinitialspace=True)\n temp = temp.fillna(0)\n\n #Converting index to date and time, and removing 'dt' column\n temp.index = pd.to_datetime(temp.dt, format='%d/%m/%Y %H:%M')\n cols_to_remove = [0]\n cols_to_remove = getNeededFeatures(temp.columns, [\"statusN\"])\n temp.drop(temp.columns[[cols_to_remove]], axis=1, inplace=True)\n \n if WEEKDAY:\n temp = temp[((temp.index.dayofweek) < 5)]\n \n for f in range(len(temp.columns)):\n temp[temp.columns[f]] = normalize(temp[temp.columns[f]])\n 
temp = temp.rename(columns={temp.columns[f]: temp.columns[f] +\"(\" + c + \")\"})\n connected_roads.append(temp)\n \nconnected_roads[0].head()", "c:\\users\\ronnie nieva\\anaconda3\\envs\\tensorflow\\lib\\site-packages\\ipykernel_launcher.py:3: FutureWarning: reshape is deprecated and will raise in a subsequent release. Please use .values.reshape(...) instead\n This is separate from the ipykernel package so we can avoid doing imports until\n" ] ], [ [ "### Merging datasets", "_____no_output_____" ] ], [ [ "if trafficDT == \"orig_traffic\":\n arrDT = [orig_traffic]\n \n if CONNECTED_ROADS:\n for c in connected_roads:\n arrDT.append(c)\n \nelif trafficDT == \"recon_traffic\":\n arrDT = [recon_traffic]\n \n if CONNECTED_ROADS:\n timeConnected = \"today\"\n print(\"TimeConnected = \" + timeConnected)\n for c in connected_roads:\n if timeConnected == \"today\":\n startIndex = np.absolute(len(arrDT[0])-len(c))\n endIndex = len(c)\n elif timeConnected == \"yesterday\":\n startIndex = 0\n endIndex = len(rolling_traffic) - RECON_SHIFT\n c = c.rename(columns={c.columns[0]: c.columns[0] + \"-\" + str(RECON_SHIFT*15) + \"mins\"})\n\n\n c = c.iloc[startIndex:endIndex, :]\n print(\"Connected Road Start time: \" + str(c.index[0]))\n c.index = arrDT[0].index\n arrDT.append(c)\n print(str(startIndex) + \" \" + str(endIndex))\n\n \nif featureEngineering != \"\":\n print(\"Adding Feature Engineering\")\n \n print(\"TimeConnected = \" + timeFE)\n\n \n if timeFE == \"today\":\n startIndex = np.absolute(len(arrDT[0])-len(rolling_traffic))\n endIndex = len(rolling_traffic)\n elif timeFE == \"yesterday\":\n startIndex = 0\n endIndex = len(rolling_traffic) - RECON_SHIFT\n \n if featureEngineering == \"Rolling\":\n temp = rolling_traffic.iloc[startIndex:endIndex, :]\n arrDT.append(temp)\n\n elif featureEngineering == \"Expanding\":\n temp = expanding_traffic.iloc[startIndex:endIndex, :]\n arrDT.append(temp)\n\n elif featureEngineering == \"Rolling and Expanding\":\n print(str(startIndex) + 
\" \" + str(endIndex))\n \n #Rolling\n temp = rolling_traffic.iloc[startIndex:endIndex, :]\n temp.index = arrDT[0].index\n arrDT.append(temp)\n \n #Expanding\n temp = expanding_traffic.iloc[startIndex:endIndex, :]\n temp.index = arrDT[0].index\n arrDT.append(temp)\n \nmerged_dataset = pd.concat(arrDT, axis=1)\nif \"Rolling\" in featureEngineering:\n merged_dataset = merged_dataset.iloc[ROLLING_WINDOW+1:, :]\n \nif WP:\n merged_dataset = addWorkingPeakFeatures(merged_dataset)\n print(\"Adding working / peak days\") \n\nmerged_dataset", "Adding Feature Engineering\nTimeConnected = today\n" ] ], [ [ "### Adding Working / Peak Features", "_____no_output_____" ] ], [ [ "if WP:\n merged_dataset = addWorkingPeakFeatures(merged_dataset)\n print(\"Adding working / peak days\")", "_____no_output_____" ] ], [ [ "## Preparing Training dataset", "_____no_output_____" ], [ "### Merge Original (and Rolling and Expanding)", "_____no_output_____" ] ], [ [ "# To-be Predicted variable \nY = merged_dataset.statusN\nY = Y.fillna(0)", "_____no_output_____" ], [ "# Training Data\nX = merged_dataset\nX = X.drop(X.columns[[0]], axis=1)\n\n# Splitting data\nX_train, X_test, Y_train, Y_test = train_test_split(X, Y, test_size=0.67, shuffle=False)\nX_train = np.array(X_train)\nX_test = np.array(X_test)\nY_train = np.array(Y_train)\nY_test = np.array(Y_test)\n\n# Data scaling\n# min_max_scaler = MinMaxScaler()\n# X_train = min_max_scaler.fit_transform(X_train)\n\n#Print training and testing data\npd.concat([X, Y.to_frame()], axis=1).head()", "_____no_output_____" ] ], [ [ "<br><br>\n## Training Model", "_____no_output_____" ] ], [ [ "# Training\nregressor = SupervisedDBNRegression(hidden_layers_structure=HIDDEN_LAYER_STRUCT,\n learning_rate_rbm=RBM_LEARNING_RATE,\n learning_rate=DBN_LEARNING_RATE,\n n_epochs_rbm=RBM_EPOCHS,\n n_iter_backprop=DBN_EPOCHS,\n batch_size=BATCH_SIZE,\n activation_function=ACTIVE_FUNC)\nregressor.fit(X_train, Y_train)", "[START] Pre-training step:\n>> Epoch 1 finished 
\tRBM Reconstruction error 0.600877\n>> Epoch 2 finished \tRBM Reconstruction error 0.567511\n>> Epoch 3 finished \tRBM Reconstruction error 0.521251\n>> Epoch 4 finished \tRBM Reconstruction error 0.471193\n>> Epoch 5 finished \tRBM Reconstruction error 0.412250\n>> Epoch 1 finished \tRBM Reconstruction error 0.150905\n>> Epoch 2 finished \tRBM Reconstruction error 0.141888\n>> Epoch 3 finished \tRBM Reconstruction error 0.133112\n>> Epoch 4 finished \tRBM Reconstruction error 0.125336\n>> Epoch 5 finished \tRBM Reconstruction error 0.117369\n>> Epoch 1 finished \tRBM Reconstruction error 0.045284\n>> Epoch 2 finished \tRBM Reconstruction error 0.038918\n>> Epoch 3 finished \tRBM Reconstruction error 0.038393\n>> Epoch 4 finished \tRBM Reconstruction error 0.037598\n>> Epoch 5 finished \tRBM Reconstruction error 0.036662\n[END] Pre-training step\n[START] Fine tuning step:\n>> Epoch 0 finished \tANN training loss 0.056863\n>> Epoch 1 finished \tANN training loss 0.048765\n>> Epoch 2 finished \tANN training loss 0.038931\n>> Epoch 3 finished \tANN training loss 0.028552\n>> Epoch 4 finished \tANN training loss 0.019801\n>> Epoch 5 finished \tANN training loss 0.014199\n>> Epoch 6 finished \tANN training loss 0.011577\n>> Epoch 7 finished \tANN training loss 0.010580\n>> Epoch 8 finished \tANN training loss 0.010219\n>> Epoch 9 finished \tANN training loss 0.010065\n>> Epoch 10 finished \tANN training loss 0.009976\n>> Epoch 11 finished \tANN training loss 0.009865\n>> Epoch 12 finished \tANN training loss 0.009775\n>> Epoch 13 finished \tANN training loss 0.009698\n>> Epoch 14 finished \tANN training loss 0.009636\n>> Epoch 15 finished \tANN training loss 0.009586\n>> Epoch 16 finished \tANN training loss 0.009556\n>> Epoch 17 finished \tANN training loss 0.009533\n>> Epoch 18 finished \tANN training loss 0.009486\n>> Epoch 19 finished \tANN training loss 0.009430\n>> Epoch 20 finished \tANN training loss 0.009416\n>> Epoch 21 finished \tANN training loss 
0.009390\n>> Epoch 22 finished \tANN training loss 0.009394\n>> Epoch 23 finished \tANN training loss 0.009345\n>> Epoch 24 finished \tANN training loss 0.009330\n>> Epoch 25 finished \tANN training loss 0.009319\n>> Epoch 26 finished \tANN training loss 0.009298\n>> Epoch 27 finished \tANN training loss 0.009302\n>> Epoch 28 finished \tANN training loss 0.009276\n>> Epoch 29 finished \tANN training loss 0.009319\n>> Epoch 30 finished \tANN training loss 0.009279\n>> Epoch 31 finished \tANN training loss 0.009273\n>> Epoch 32 finished \tANN training loss 0.009264\n>> Epoch 33 finished \tANN training loss 0.009274\n>> Epoch 34 finished \tANN training loss 0.009242\n>> Epoch 35 finished \tANN training loss 0.009231\n>> Epoch 36 finished \tANN training loss 0.009227\n>> Epoch 37 finished \tANN training loss 0.009224\n>> Epoch 38 finished \tANN training loss 0.009249\n>> Epoch 39 finished \tANN training loss 0.009218\n>> Epoch 40 finished \tANN training loss 0.009307\n>> Epoch 41 finished \tANN training loss 0.009225\n>> Epoch 42 finished \tANN training loss 0.009235\n>> Epoch 43 finished \tANN training loss 0.009212\n>> Epoch 44 finished \tANN training loss 0.009213\n>> Epoch 45 finished \tANN training loss 0.009226\n>> Epoch 46 finished \tANN training loss 0.009228\n>> Epoch 47 finished \tANN training loss 0.009217\n>> Epoch 48 finished \tANN training loss 0.009202\n>> Epoch 49 finished \tANN training loss 0.009241\n>> Epoch 50 finished \tANN training loss 0.009205\n>> Epoch 51 finished \tANN training loss 0.009220\n>> Epoch 52 finished \tANN training loss 0.009202\n>> Epoch 53 finished \tANN training loss 0.009201\n>> Epoch 54 finished \tANN training loss 0.009201\n>> Epoch 55 finished \tANN training loss 0.009241\n>> Epoch 56 finished \tANN training loss 0.009195\n>> Epoch 57 finished \tANN training loss 0.009217\n>> Epoch 58 finished \tANN training loss 0.009208\n>> Epoch 59 finished \tANN training loss 0.009194\n>> Epoch 60 finished \tANN training loss 
0.009195\n>> Epoch 61 finished \tANN training loss 0.009192\n>> Epoch 62 finished \tANN training loss 0.009193\n>> Epoch 63 finished \tANN training loss 0.009190\n>> Epoch 64 finished \tANN training loss 0.009193\n>> Epoch 65 finished \tANN training loss 0.009215\n>> Epoch 66 finished \tANN training loss 0.009211\n>> Epoch 67 finished \tANN training loss 0.009191\n>> Epoch 68 finished \tANN training loss 0.009190\n>> Epoch 69 finished \tANN training loss 0.009243\n>> Epoch 70 finished \tANN training loss 0.009219\n>> Epoch 71 finished \tANN training loss 0.009189\n>> Epoch 72 finished \tANN training loss 0.009185\n>> Epoch 73 finished \tANN training loss 0.009197\n>> Epoch 74 finished \tANN training loss 0.009182\n>> Epoch 75 finished \tANN training loss 0.009181\n>> Epoch 76 finished \tANN training loss 0.009182\n>> Epoch 77 finished \tANN training loss 0.009263\n>> Epoch 78 finished \tANN training loss 0.009181\n>> Epoch 79 finished \tANN training loss 0.009179\n>> Epoch 80 finished \tANN training loss 0.009179\n>> Epoch 81 finished \tANN training loss 0.009187\n>> Epoch 82 finished \tANN training loss 0.009196\n>> Epoch 83 finished \tANN training loss 0.009187\n>> Epoch 84 finished \tANN training loss 0.009178\n>> Epoch 85 finished \tANN training loss 0.009182\n>> Epoch 86 finished \tANN training loss 0.009179\n>> Epoch 87 finished \tANN training loss 0.009175\n>> Epoch 88 finished \tANN training loss 0.009176\n>> Epoch 89 finished \tANN training loss 0.009184\n>> Epoch 90 finished \tANN training loss 0.009173\n>> Epoch 91 finished \tANN training loss 0.009174\n>> Epoch 92 finished \tANN training loss 0.009226\n>> Epoch 93 finished \tANN training loss 0.009172\n>> Epoch 94 finished \tANN training loss 0.009193\n>> Epoch 95 finished \tANN training loss 0.009171\n>> Epoch 96 finished \tANN training loss 0.009180\n>> Epoch 97 finished \tANN training loss 0.009207\n>> Epoch 98 finished \tANN training loss 0.009206\n>> Epoch 99 finished \tANN training loss 
0.009183\n>> Epoch 100 finished \tANN training loss 0.009167\n>> Epoch 101 finished \tANN training loss 0.009179\n>> Epoch 102 finished \tANN training loss 0.009191\n>> Epoch 103 finished \tANN training loss 0.009165\n>> Epoch 104 finished \tANN training loss 0.009184\n>> Epoch 105 finished \tANN training loss 0.009164\n>> Epoch 106 finished \tANN training loss 0.009169\n>> Epoch 107 finished \tANN training loss 0.009162\n>> Epoch 108 finished \tANN training loss 0.009175\n>> Epoch 109 finished \tANN training loss 0.009162\n>> Epoch 110 finished \tANN training loss 0.009170\n>> Epoch 111 finished \tANN training loss 0.009163\n>> Epoch 112 finished \tANN training loss 0.009163\n>> Epoch 113 finished \tANN training loss 0.009160\n>> Epoch 114 finished \tANN training loss 0.009168\n>> Epoch 115 finished \tANN training loss 0.009207\n>> Epoch 116 finished \tANN training loss 0.009159\n>> Epoch 117 finished \tANN training loss 0.009167\n>> Epoch 118 finished \tANN training loss 0.009176\n>> Epoch 119 finished \tANN training loss 0.009162\n>> Epoch 120 finished \tANN training loss 0.009156\n>> Epoch 121 finished \tANN training loss 0.009161\n>> Epoch 122 finished \tANN training loss 0.009157\n>> Epoch 123 finished \tANN training loss 0.009155\n>> Epoch 124 finished \tANN training loss 0.009222\n>> Epoch 125 finished \tANN training loss 0.009232\n>> Epoch 126 finished \tANN training loss 0.009151\n>> Epoch 127 finished \tANN training loss 0.009166\n>> Epoch 128 finished \tANN training loss 0.009171\n>> Epoch 129 finished \tANN training loss 0.009152\n>> Epoch 130 finished \tANN training loss 0.009160\n>> Epoch 131 finished \tANN training loss 0.009149\n>> Epoch 132 finished \tANN training loss 0.009163\n>> Epoch 133 finished \tANN training loss 0.009197\n>> Epoch 134 finished \tANN training loss 0.009197\n>> Epoch 135 finished \tANN training loss 0.009160\n>> Epoch 136 finished \tANN training loss 0.009154\n>> Epoch 137 finished \tANN training loss 0.009159\n>> Epoch 138 
finished \tANN training loss 0.009166\n>> Epoch 139 finished \tANN training loss 0.009144\n>> Epoch 140 finished \tANN training loss 0.009144\n>> Epoch 141 finished \tANN training loss 0.009143\n>> Epoch 142 finished \tANN training loss 0.009144\n>> Epoch 143 finished \tANN training loss 0.009144\n>> Epoch 144 finished \tANN training loss 0.009146\n>> Epoch 145 finished \tANN training loss 0.009140\n>> Epoch 146 finished \tANN training loss 0.009140\n>> Epoch 147 finished \tANN training loss 0.009139\n>> Epoch 148 finished \tANN training loss 0.009162\n" ], [ "#To check RBM Loss Errors:\nrbm_error = regressor.unsupervised_dbn.rbm_layers[0].rbm_loss_error\n#To check DBN Loss Errors\ndbn_error = regressor.dbn_loss_error", "_____no_output_____" ] ], [ [ "<br><br>\n## Testing Model", "_____no_output_____" ] ], [ [ "# Test\nmin_max_scaler = MinMaxScaler()\nX_test = min_max_scaler.fit_transform(X_test)\nY_pred = regressor.predict(X_test)\n\nr2score = r2_score(Y_test, Y_pred)\nrmse = np.sqrt(mean_squared_error(Y_test, Y_pred))\nmae = mean_absolute_error(Y_test, Y_pred)\nprint('Done.\\nR-squared: %.3f\\nRMSE: %.3f \\nMAE: %.3f' % (r2score, rmse, mae))", "Done.\nR-squared: 0.892\nRMSE: 0.105 \nMAE: 0.063\n" ], [ "print(len(Y_pred))\ntemp = []\nfor i in range(len(Y_pred)):\n temp.append(Y_pred[i][0])\nd = {'Predicted': temp, 'Actual': Y_test}\n\ndf = pd.DataFrame(data=d)\ndf.head()", "9774\n" ], [ "# Save the model\nif MODEL_DIR != \"\":\n directory = \"models/\" + MODEL_DIR\n if not os.path.exists(directory):\n print(\"Making Directory\")\n os.makedirs(directory)\n\nregressor.save('models/' + MODEL_DIR + 'pm1_' + ROAD + '_' + YEAR + '.pkl')", "Making Directory\n" ] ], [ [ "### Results and Analysis below", "_____no_output_____" ] ], [ [ "import matplotlib.pyplot as plt", "_____no_output_____" ] ], [ [ "##### Printing Predicted and Actual Results", "_____no_output_____" ] ], [ [ "startIndex = merged_dataset.shape[0] - Y_pred.shape[0]\ndt = 
merged_dataset.index[startIndex:,]\ntemp = []\nfor i in range(len(Y_pred)):\n temp.append(Y_pred[i][0])\nd = {'Predicted': temp, 'Actual': Y_test, 'dt': dt}\ndf = pd.DataFrame(data=d)\ndf.head()", "_____no_output_____" ], [ "df.tail()", "_____no_output_____" ] ], [ [ "#### Visualize Actual and Predicted Traffic ", "_____no_output_____" ] ], [ [ "print(df.dt[0])\nstartIndex = 0\nendIndex = 96\nline1 = df.Actual.rdiv(1)\nline2 = df.Predicted.rdiv(1)\nx = range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT))\nplt.figure(figsize=(20, 4))\nplt.plot(line1[startIndex:endIndex], c='red', label=\"Actual-Congestion\")\nplt.plot(line2[startIndex:endIndex], c='blue', label=\"Predicted-Congestion\")\nplt.legend()\nplt.xlabel(\"Date\")\nplt.ylabel(\"Traffic Congestion\")\nplt.show()", "2015-07-22 04:30:00\n" ], [ "if OUTPUT_DIR != \"\":\n directory = \"output/\" + OUTPUT_DIR\n if not os.path.exists(directory):\n print(\"Making Directory\")\n os.makedirs(directory)\n\ndf.to_csv(\"output/\" + OUTPUT_DIR + \"pm1_\" + ROAD + '_' + YEAR + EXT, index=False, encoding='utf-8')", "_____no_output_____" ] ], [ [ "#### Visualize trend of loss of RBM and DBN Training", "_____no_output_____" ] ], [ [ "line1 = rbm_error\nline2 = dbn_error\nx = range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT))\nplt.plot(range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT)), line1, c='red')\nplt.xticks(x)\nplt.xlabel(\"Iteration\")\nplt.ylabel(\"Error\")\nplt.show()\n\n\nplt.plot(range(DBN_EPOCHS), line2, c='blue')\nplt.xticks(x)\nplt.xlabel(\"Iteration\")\nplt.ylabel(\"Error\")\nplt.show()\n\nplt.plot(range(0, RBM_EPOCHS * len(HIDDEN_LAYER_STRUCT)), line1, c='red')\nplt.plot(range(DBN_EPOCHS), line2, c='blue')\nplt.xticks(x)\nplt.xlabel(\"Iteration\")\nplt.ylabel(\"Error\")\nplt.show()", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ] ]
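The `normalize` helper in the traffic notebook above wraps scikit-learn's `MinMaxScaler` to rescale each feature column into [0, 1]. The transform it applies can be sketched in pure Python — a minimal illustration of the same arithmetic, not the notebook's actual implementation:

```python
def min_max_normalize(values):
    """Rescale a sequence into [0, 1], mirroring the MinMaxScaler
    step used by the notebook's normalize() helper."""
    lo, hi = min(values), max(values)
    span = hi - lo
    if span == 0:
        # In this sketch a constant column maps to all zeros.
        return [0.0 for _ in values]
    return [(v - lo) / span for v in values]

print(min_max_normalize([10, 20, 30]))  # -> [0.0, 0.5, 1.0]
```

Because every feature is squeezed into the same range, no single column (e.g. raw status counts vs. hour-of-day) dominates the DBN's pre-training purely by magnitude.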
d06497270dad7bc9e9b7b8fb64621773d14f527c
521,000
ipynb
Jupyter Notebook
Course02/Voxel-Map.ipynb
thhuang/NOTES-FCND
c5b0ec7d99df3cb60a850308d16ccc6c096c7931
[ "MIT" ]
1
2018-10-26T04:06:21.000Z
2018-10-26T04:06:21.000Z
Course02/Voxel-Map.ipynb
thhuang/notes-fcnd
c5b0ec7d99df3cb60a850308d16ccc6c096c7931
[ "MIT" ]
null
null
null
Course02/Voxel-Map.ipynb
thhuang/notes-fcnd
c5b0ec7d99df3cb60a850308d16ccc6c096c7931
[ "MIT" ]
1
2018-10-26T04:06:23.000Z
2018-10-26T04:06:23.000Z
2,592.039801
515,672
0.960152
[ [ [ "# 3D Map\n\nWhile representing the configuration space in 3 dimensions isn't entirely practical it's fun (and useful) to visualize things in 3D.\n\nIn this exercise you'll finish the implementation of `create_grid` such that a 3D grid is returned where cells containing a voxel are set to `True`. We'll then plot the result!", "_____no_output_____" ] ], [ [ "import numpy as np\nimport matplotlib.pyplot as plt\nfrom mpl_toolkits.mplot3d import Axes3D\n\n%matplotlib inline ", "_____no_output_____" ], [ "plt.rcParams['figure.figsize'] = 16, 16", "_____no_output_____" ], [ "# This is the same obstacle data from the previous lesson.\nfilename = 'colliders.csv'\ndata = np.loadtxt(filename, delimiter=',', dtype='Float64', skiprows=2)\nprint(data)", "[[-305. -435. 85.5 5. 5. 85.5]\n [-295. -435. 85.5 5. 5. 85.5]\n [-285. -435. 85.5 5. 5. 85.5]\n ...\n [ 435. 465. 8. 5. 5. 8. ]\n [ 445. 465. 8. 5. 5. 8. ]\n [ 455. 465. 8. 5. 5. 8. ]]\n" ], [ "def create_voxmap(data, voxel_size=5):\n \"\"\"\n Returns a grid representation of a 3D configuration space\n based on given obstacle data.\n \n The `voxel_size` argument sets the resolution of the voxel map. 
\n \"\"\"\n\n # minimum and maximum north coordinates\n north_min = np.floor(np.amin(data[:, 0] - data[:, 3]))\n north_max = np.ceil(np.amax(data[:, 0] + data[:, 3]))\n\n # minimum and maximum east coordinates\n east_min = np.floor(np.amin(data[:, 1] - data[:, 4]))\n east_max = np.ceil(np.amax(data[:, 1] + data[:, 4]))\n\n alt_max = np.ceil(np.amax(data[:, 2] + data[:, 5]))\n \n # given the minimum and maximum coordinates we can\n # calculate the size of the grid.\n north_size = int(np.ceil((north_max - north_min))) // voxel_size\n east_size = int(np.ceil((east_max - east_min))) // voxel_size\n alt_size = int(alt_max) // voxel_size\n\n voxmap = np.zeros((north_size, east_size, alt_size), dtype=np.bool)\n\n for datum in data:\n x, y, z, dx, dy, dz = datum.astype(np.int32)\n obstacle = np.array(((x-dx, x+dx),\n (y-dy, y+dy),\n (z-dz, z+dz)))\n obstacle[0] = (obstacle[0] - north_min) // voxel_size\n obstacle[1] = (obstacle[1] - east_min) // voxel_size\n obstacle[2] = obstacle[2] // voxel_size \n voxmap[obstacle[0][0]:obstacle[0][1], obstacle[1][0]:obstacle[1][1], obstacle[2][0]:obstacle[2][1]] = True\n \n return voxmap", "_____no_output_____" ] ], [ [ "Create 3D grid.", "_____no_output_____" ] ], [ [ "voxel_size = 10\nvoxmap = create_voxmap(data, voxel_size)\nprint(voxmap.shape)", "(81, 91, 21)\n" ] ], [ [ "Plot the 3D grid. ", "_____no_output_____" ] ], [ [ "fig = plt.figure()\nax = fig.gca(projection='3d')\nax.voxels(voxmap, edgecolor='k')\nax.set_xlim(voxmap.shape[0], 0)\nax.set_ylim(0, voxmap.shape[1])\n# add 100 to the height so the buildings aren't so tall\nax.set_zlim(0, voxmap.shape[2]+100//voxel_size)\n\nplt.xlabel('North')\nplt.ylabel('East')\n\nplt.show()", "_____no_output_____" ] ], [ [ "Isn't the city pretty?", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
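The index arithmetic inside `create_voxmap` above — shifting an obstacle extent by the grid minimum and integer-dividing by the voxel size — can be isolated for a single axis. A minimal sketch (the function name is illustrative, not from the notebook):

```python
def obstacle_to_voxel_range(center, halfwidth, axis_min, voxel_size):
    # Mirrors the (value - axis_min) // voxel_size step in create_voxmap:
    # the obstacle occupies voxel indices [lo, hi) along this axis.
    lo = int((center - halfwidth - axis_min) // voxel_size)
    hi = int((center + halfwidth - axis_min) // voxel_size)
    return lo, hi

# A 10 m-wide obstacle centred at north = -305 on a grid whose minimum is -310
# (matching the first row of colliders.csv) lands in the first voxel column:
print(obstacle_to_voxel_range(-305, 5, -310, 10))  # -> (0, 1)
```

The half-open range is what lets the notebook fill the voxel map with a single slice assignment per obstacle.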
d0649f7ba52737564c85b801018ece1b975776e8
51,572
ipynb
Jupyter Notebook
Projecting_Covid_19_Case_Growth_in_Bangladesh_Using_Logistic_Regression.ipynb
tanzimtaher/Modeling-Covid-19-Cumulative-Case-Growth-in-Bangladesh-with-Logistic-Regression
6304b1422a4414a3eda7b8fd1ee529b69291edd6
[ "Xnet", "X11" ]
null
null
null
Projecting_Covid_19_Case_Growth_in_Bangladesh_Using_Logistic_Regression.ipynb
tanzimtaher/Modeling-Covid-19-Cumulative-Case-Growth-in-Bangladesh-with-Logistic-Regression
6304b1422a4414a3eda7b8fd1ee529b69291edd6
[ "Xnet", "X11" ]
null
null
null
Projecting_Covid_19_Case_Growth_in_Bangladesh_Using_Logistic_Regression.ipynb
tanzimtaher/Modeling-Covid-19-Cumulative-Case-Growth-in-Bangladesh-with-Logistic-Regression
6304b1422a4414a3eda7b8fd1ee529b69291edd6
[ "Xnet", "X11" ]
null
null
null
70.646575
31,328
0.796847
[ [ [ "# Prologue", "_____no_output_____" ], [ "For this project we will use the logistic regression function to model the growth of confirmed Covid-19 case population growth in Bangladesh. The logistic regression function is commonly used in classification problems, and in this project we will be examining how it fares as a regression tool. Both cumulative case counts over time and logistic regression curves have a sigmoid shape and we shall try to fit a theoretically predicted curve over the actual cumulative case counts over time to reach certain conclusions about the case count growth, such as the time of peak daily new cases and the total cases that may be reached during this outbreak.", "_____no_output_____" ], [ "# Import the necessary modules", "_____no_output_____" ] ], [ [ "import pandas as pd\nimport numpy as np\nfrom datetime import datetime,timedelta\nfrom sklearn.metrics import mean_squared_error\nfrom scipy.optimize import curve_fit\nfrom scipy.optimize import fsolve\nimport matplotlib.pyplot as plt\n%matplotlib inline", "_____no_output_____" ] ], [ [ "# Connect to Google Drive (where the data is kept)", "_____no_output_____" ] ], [ [ "from google.colab import drive\ndrive.mount('/content/drive')", "Drive already mounted at /content/drive; to attempt to forcibly remount, call drive.mount(\"/content/drive\", force_remount=True).\n" ] ], [ [ "# Import data and format as needed", "_____no_output_____" ] ], [ [ "df = pd.read_csv('/content/drive/My Drive/Corona-Cases.n-1.csv')\ndf.tail()", "_____no_output_____" ] ], [ [ "As you can see, the format of the date is 'month-day-year'. Let's specify the date column is datetime type. Let's also specify the formatting as %m-%d-%Y. 
And then, let's find the day when the first confirmed cases of Covid-19 were reported in Bangladesh.", "_____no_output_____" ] ], [ [ "FMT = '%m-%d-%Y'\ndf['Date'] = pd.to_datetime(df['Date'], format=FMT)", "_____no_output_____" ] ], [ [ "We have to initialize the first date of confirmed Covid-19 cases as the datetime variable start_date because we would need it later to calculate the peak.", "_____no_output_____" ] ], [ [ "# Initialize the start date\nstart_date = datetime.date(df.loc[0, 'Date'])\nprint('Start date: ', start_date)", "Start date: 2020-03-08\n" ] ], [ [ "Now, for the logistic regression function, we would need a timestep column instead of a date column in the dataframe. So we create a new dataframe called data where we drop the date column and use the index as the timestep column.", "_____no_output_____" ] ], [ [ "# drop date column\ndata = df['Total cases']\n\n# reset index and create a timestep\ndata = data.reset_index(drop=False)\n\n# rename columns\ndata.columns = ['Timestep', 'Total Cases']\n\n# check\ndata.tail()", "_____no_output_____" ] ], [ [ "# Defining the logistic regression function", "_____no_output_____" ] ], [ [ "def logistic_model(x,a,b,c):\n return c/(1+np.exp(-(x-b)/a))", "_____no_output_____" ] ], [ [ "In this formula, we have the variable x that is the time and three parameters: a, b, c.\n* a is a metric for the speed of infections\n* b is the day with the estimated maximum growth rate of confirmed Covid-19 cases\n* c is the maximum number the cumulative confirmed cases will reach by the end of the first outbreak here in Bangladesh\n\nThe growth of cumulative cases follows a sigmoid shape like the logistic regression curve and hence, this may be a good way to model the growth of the confirmed Covid-19 case population over time. For the first outbreak at least. It makes sense because, for an outbreak, the rise in cumulative case counts is initially exponential. 
Then there is a point of inflection where the curve nearly becomes linear. We assume that this point of inflection is the time around which the daily new case numbers will peak. After that the curve eventually flattens out. \n\n", "_____no_output_____" ], [ "# Fit the logistic function and extrapolate", "_____no_output_____" ] ], [ [ "# Initialize all the timesteps as x\nx = list(data.iloc[:,0])\n\n# Initialize all the Total Cases values as y\ny = list(data.iloc[:,1])\n\n# Fit the curve using scipy's curve_fit method; we initialize the parameter p0 with arbitrary values\nfit = curve_fit(logistic_model,x,y,p0=[2,100,20000])\n(a, b, c), cov = fit", "_____no_output_____" ], [ "# Print outputs\nprint('Metric for speed of infections: ', a)\nprint('Days from start when daily new cases will peak: ', b)\nprint('Total cumulative cases that will be reached: ', c)", "Metric for speed of infections: 17.41386234974941\nDays from start when daily new cases will peak: 110.7731800890406\nTotal cumulative cases that will be reached: 265257.7755190932\n" ], [ "# Print errors for a, b, c\nerrors = [np.sqrt(fit[1][i][i]) for i in [0,1,2]]\nprint('Errors in a, b and c respectively:\\n', errors)", "Errors in a, b and c respectively:\n [0.12923467446546272, 0.24474862210706608, 1384.097103078659]\n" ], [ "# estimated time of peak\nprint('Estimated time of peak between', start_date + timedelta(days=(b-errors[1])), ' and ', start_date + timedelta(days=(b+errors[1])))\n\n# estimated total number of infections \nprint('Estimated total number of infections between ', (c - errors[2]), ' and ', (c + errors[2]))", "Estimated time of peak between 2020-06-26 and 2020-06-27\nEstimated total number of infections between 263873.67841601453 and 266641.8726221719\n" ] ], [ [ "To extrapolate the curve to the future, use the fsolve function from scipy.", "_____no_output_____" ] ], [ [ "# Extrapolate\nsol = int(fsolve(lambda x : logistic_model(x,a,b,c) - int(c),b))", "_____no_output_____"
] ], [ [ "# Plot the graph", "_____no_output_____" ] ], [ [ "pred_x = list(range(max(x),sol))\nplt.rcParams['figure.figsize'] = [7, 7]\nplt.rc('font', size=14)\n# Real data\nplt.scatter(x,y,label=\"Real data\",color=\"red\")\n# Predicted logistic curve\nplt.plot(x+pred_x, [logistic_model(i,fit[0][0],fit[0][1],fit[0][2]) for i in x+pred_x], label=\"Logistic model\" )\nplt.legend()\nplt.xlabel(\"Days since 8th March 2020\")\nplt.ylabel(\"Total number of infected people\")\nplt.ylim((min(y)*0.9,c*1.1))\nplt.show()", "_____no_output_____" ] ], [ [ "# Evaluate the MSE error", "_____no_output_____" ], [ "Evaluating the mean squared error (MSE) is not very meaningful on its own until we can compare it with another predictive method. We can compare the MSE of our regression with the MSE from another method to check if our logistic regression model works better than the other predictive model. The model with the lower MSE performs better.\n\n\n", "_____no_output_____" ] ], [ [ "y_pred_logistic = [logistic_model(i,fit[0][0],fit[0][1],fit[0][2])\nfor i in x]\n\nprint('Mean squared error: ', mean_squared_error(y,y_pred_logistic))", "Mean squared error: 3298197.2412489704\n" ] ], [ [ "# Epilogue", "_____no_output_____" ], [ "We should be mindful of some caveats:\n\n* These predictions will only be meaningful when the peak has actually been crossed definitively. \n\n* Also, the reliability of the reported cases would also influence the dependability of the model. Developing countries, especially the South Asian countries have famously failed to report accurate disaster statistics in the past. 
\n\n* Also, the testing numbers are low overall, especially in cities outside Dhaka where the daily new cases still have not peaked yet.\n\n* Since most of the cases reported were in Dhaka, the findings indicate that the peak in Dhaka may have been reached already.\n\n* If there is a second outbreak before the first outbreak subsides, the curve may not be sigmoid shaped and hence the results may not be as meaningful.\n\n* The total reported case numbers will possibly be greater than 260000, because the daily new cases are still rising in some cities other than Dhaka. It is not unsound to expect that the total reported case count for this first instance of Covid-19 outbreak could very well reach 300000 or more.\n\n* The government recently hiked the prices of tests, which may have led to increased unwillingness in suspected candidates to actually test for the disease, and that may have influenced the recent confirmed case counts.", "_____no_output_____" ], [ "# References", "_____no_output_____" ], [ "Inspiration for theory and code from the following articles:\n\n* [Covid-19 infection in Italy. Mathematical models and predictions](https://towardsdatascience.com/covid-19-infection-in-italy-mathematical-models-and-predictions-7784b4d7dd8d)\n\n* [Logistic growth modelling of COVID-19 proliferation in China and its international implications](https://www.sciencedirect.com/science/article/pii/S1201971220303039)\n\n* [Logistic Growth Model for COVID-19](https://www.wolframcloud.com/obj/covid-19/Published/Logistic-Growth-Model-for-COVID-19.nb)\n", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown", "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown", "markdown" ], [ "code" ], [ "markdown", "markdown", "markdown", "markdown" ] ]
d064af379f3ecbb6e76efa98695906e87b3a7151
149,980
ipynb
Jupyter Notebook
examples/Notebooks/flopy3_LoadSWRBinaryData.ipynb
gyanz/flopy
282703716a01721e07905da65aa54e6017452a5a
[ "CC0-1.0", "BSD-3-Clause" ]
1
2019-11-01T00:34:14.000Z
2019-11-01T00:34:14.000Z
examples/Notebooks/flopy3_LoadSWRBinaryData.ipynb
gyanz/flopy
282703716a01721e07905da65aa54e6017452a5a
[ "CC0-1.0", "BSD-3-Clause" ]
null
null
null
examples/Notebooks/flopy3_LoadSWRBinaryData.ipynb
gyanz/flopy
282703716a01721e07905da65aa54e6017452a5a
[ "CC0-1.0", "BSD-3-Clause" ]
null
null
null
323.930886
54,404
0.930931
[ [ [ "# FloPy\n\n## Plotting SWR Process Results\n\nThis notebook demonstrates the use of the `SwrObs` and `SwrStage`, `SwrBudget`, `SwrFlow`, and `SwrExchange`, `SwrStructure`, classes to read binary SWR Process observation, stage, budget, reach to reach flows, reach-aquifer exchange, and structure files. It demonstrates these capabilities by loading these binary file types and showing examples of plotting SWR Process data. An example showing how the simulated water surface profile at a selected time along a selection of reaches can be plotted is also presented.", "_____no_output_____" ] ], [ [ "%matplotlib inline\nfrom IPython.display import Image\nimport os\nimport sys\nimport numpy as np\nimport matplotlib as mpl\nimport matplotlib.pyplot as plt\n\n# run installed version of flopy or add local path\ntry:\n import flopy\nexcept:\n fpth = os.path.abspath(os.path.join('..', '..'))\n sys.path.append(fpth)\n import flopy\n\nprint(sys.version)\nprint('numpy version: {}'.format(np.__version__))\nprint('matplotlib version: {}'.format(mpl.__version__))\nprint('flopy version: {}'.format(flopy.__version__))", "3.6.5 | packaged by conda-forge | (default, Apr 6 2018, 13:44:09) \n[GCC 4.2.1 Compatible Apple LLVM 6.1.0 (clang-602.0.53)]\nnumpy version: 1.14.5\nmatplotlib version: 2.2.2\nflopy version: 3.2.10\n" ], [ "#Set the paths\ndatapth = os.path.join('..', 'data', 'swr_test')\n\n# SWR Process binary files \nfiles = ('SWR004.obs', 'SWR004.vel', 'SWR004.str', 'SWR004.stg', 'SWR004.flow')", "_____no_output_____" ] ], [ [ "### Load SWR Process observations\n\nCreate an instance of the `SwrObs` class and load the observation data.", "_____no_output_____" ] ], [ [ "sobj = flopy.utils.SwrObs(os.path.join(datapth, files[0]))\n\nts = sobj.get_data()", "_____no_output_____" ] ], [ [ "#### Plot the data from the binary SWR Process observation file", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(6, 12))\nax1 = fig.add_subplot(3, 1, 1)\nax1.semilogx(ts['totim']/3600., 
-ts['OBS1'], label='OBS1')\nax1.semilogx(ts['totim']/3600., -ts['OBS2'], label='OBS2')\nax1.semilogx(ts['totim']/3600., -ts['OBS9'], label='OBS3')\nax1.set_ylabel('Flow, in cubic meters per second')\nax1.legend()\n\nax = fig.add_subplot(3, 1, 2, sharex=ax1)\nax.semilogx(ts['totim']/3600., -ts['OBS4'], label='OBS4')\nax.semilogx(ts['totim']/3600., -ts['OBS5'], label='OBS5')\nax.set_ylabel('Flow, in cubic meters per second')\nax.legend()\n\nax = fig.add_subplot(3, 1, 3, sharex=ax1)\nax.semilogx(ts['totim']/3600., ts['OBS6'], label='OBS6')\nax.semilogx(ts['totim']/3600., ts['OBS7'], label='OBS7')\nax.set_xlim(1, 100)\nax.set_ylabel('Stage, in meters')\nax.set_xlabel('Time, in hours')\nax.legend();", "_____no_output_____" ] ], [ [ "### Load the same data from the individual binary SWR Process files\n\nLoad discharge data from the flow file. The flow file contains the simulated flow between connected reaches for each connection in the model.", "_____no_output_____" ] ], [ [ "sobj = flopy.utils.SwrFlow(os.path.join(datapth, files[1]))\ntimes = np.array(sobj.get_times())/3600.\nobs1 = sobj.get_ts(irec=1, iconn=0)\nobs2 = sobj.get_ts(irec=14, iconn=13)\nobs4 = sobj.get_ts(irec=4, iconn=3)\nobs5 = sobj.get_ts(irec=5, iconn=4)", "_____no_output_____" ] ], [ [ "Load discharge data from the structure file. The structure file contains the simulated structure flow for each reach with a structure.", "_____no_output_____" ] ], [ [ "sobj = flopy.utils.SwrStructure(os.path.join(datapth, files[2]))\nobs3 = sobj.get_ts(irec=17, istr=0)", "_____no_output_____" ] ], [ [ "Load stage data from the stage file. The flow file contains the simulated stage for each reach in the model.", "_____no_output_____" ] ], [ [ "sobj = flopy.utils.SwrStage(os.path.join(datapth, files[3]))\nobs6 = sobj.get_ts(irec=13)", "_____no_output_____" ] ], [ [ "Load budget data from the budget file. The budget file contains the simulated budget for each reach group in the model. 
The budget file also contains the stage data for each reach group. In this case the number of reach groups equals the number of reaches in the model.", "_____no_output_____" ] ], [ [ "sobj = flopy.utils.SwrBudget(os.path.join(datapth, files[4]))\nobs7 = sobj.get_ts(irec=17)", "_____no_output_____" ] ], [ [ "#### Plot the data loaded from the individual binary SWR Process files.\n\nNote that the plots are identical to the plots generated from the binary SWR observation data.", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(6, 12))\nax1 = fig.add_subplot(3, 1, 1)\nax1.semilogx(times, obs1['flow'], label='OBS1')\nax1.semilogx(times, obs2['flow'], label='OBS2')\nax1.semilogx(times, -obs3['strflow'], label='OBS3')\nax1.set_ylabel('Flow, in cubic meters per second')\nax1.legend()\n\nax = fig.add_subplot(3, 1, 2, sharex=ax1)\nax.semilogx(times, obs4['flow'], label='OBS4')\nax.semilogx(times, obs5['flow'], label='OBS5')\nax.set_ylabel('Flow, in cubic meters per second')\nax.legend()\n\nax = fig.add_subplot(3, 1, 3, sharex=ax1)\nax.semilogx(times, obs6['stage'], label='OBS6')\nax.semilogx(times, obs7['stage'], label='OBS7')\nax.set_xlim(1, 100)\nax.set_ylabel('Stage, in meters')\nax.set_xlabel('Time, in hours')\nax.legend();", "_____no_output_____" ] ], [ [ "### Plot simulated water surface profiles\n\nSimulated water surface profiles can be created using the `ModelCrossSection` class. \n\nSeveral things that we need in addition to the stage data include reach lengths and bottom elevations. 
We load these data from an existing file.", "_____no_output_____" ] ], [ [ "sd = np.genfromtxt(os.path.join(datapth, 'SWR004.dis.ref'), names=True)", "_____no_output_____" ] ], [ [ "The contents of the file are shown in the cell below.", "_____no_output_____" ] ], [ [ "fc = open(os.path.join(datapth, 'SWR004.dis.ref')).readlines()\nfc", "_____no_output_____" ] ], [ [ "Create an instance of the `SwrStage` class for SWR Process stage data.", "_____no_output_____" ] ], [ [ "sobj = flopy.utils.SwrStage(os.path.join(datapth, files[3]))", "_____no_output_____" ] ], [ [ "Create a selection condition (`iprof`) that can be used to extract data for the reaches of interest (reaches 0, 1, and 8 through 17). Use this selection condition to extract reach lengths (from `sd['RLEN']`) and the bottom elevation (from `sd['BELEV']`) for the reaches of interest. The selection condition will also be used to extract the stage data for reaches of interest.", "_____no_output_____" ] ], [ [ "iprof = sd['IRCH'] > 0\niprof[2:8] = False\ndx = np.extract(iprof, sd['RLEN'])\nbelev = np.extract(iprof, sd['BELEV'])", "_____no_output_____" ] ], [ [ "Create a fake model instance so that the `ModelCrossSection` class can be used.", "_____no_output_____" ] ], [ [ "ml = flopy.modflow.Modflow()\ndis = flopy.modflow.ModflowDis(ml, nrow=1, ncol=dx.shape[0], delr=dx, top=4.5, botm=belev.reshape(1,1,12))", "_____no_output_____" ] ], [ [ "Create an array with the x position at the downstream end of each reach, which will be used to color the plots below each reach. 
", "_____no_output_____" ] ], [ [ "x = np.cumsum(dx)", "_____no_output_____" ] ], [ [ "Plot simulated water surface profiles for 8 times.", "_____no_output_____" ] ], [ [ "fig = plt.figure(figsize=(12, 12))\nfor idx, v in enumerate([19, 29, 34, 39, 44, 49, 54, 59]):\n ax = fig.add_subplot(4, 2, idx+1)\n s = sobj.get_data(idx=v)\n stage = np.extract(iprof, s['stage'])\n xs = flopy.plot.ModelCrossSection(model=ml, line={'Row': 0})\n xs.plot_fill_between(stage.reshape(1,1,12), colors=['none', 'blue'], ax=ax, edgecolors='none')\n linecollection = xs.plot_grid(ax=ax, zorder=10)\n ax.fill_between(np.append(0., x), y1=np.append(belev[0], belev), y2=-0.5, \n facecolor='0.5', edgecolor='none', step='pre')\n ax.set_title('{} hours'.format(times[v]))\n ax.set_ylim(-0.5, 4.5)", "_____no_output_____" ] ], [ [ "## Summary\n\nThis notebook demonstrates flopy functionality for reading binary output generated by the SWR Process. Binary files that can be read include observations, stages, budgets, flow, reach-aquifer exchanges, and structure data. The binary stage data can also be used to create water-surface profiles. \n\nHope this gets you started!", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ] ]
d064b3c29ebb882359b6c29fc749f2293e5fe886
19,035
ipynb
Jupyter Notebook
Lec4/Lab6_result_report.ipynb
Cho-D-YoungRae/Standalone-DeepLearning
ea581708eca95fb73bb34dc17fb0dadb5f1a93a3
[ "MIT" ]
553
2019-01-20T07:54:00.000Z
2022-03-31T16:35:17.000Z
Lec4/Lab6_result_report.ipynb
betteryy/Standalone-DeepLearning
dfc12f6dc98d13751eebf5a1503665e09647f499
[ "MIT" ]
10
2019-01-22T12:23:33.000Z
2021-05-22T08:41:00.000Z
Lec4/Lab6_result_report.ipynb
betteryy/Standalone-DeepLearning
dfc12f6dc98d13751eebf5a1503665e09647f499
[ "MIT" ]
190
2019-01-17T20:32:13.000Z
2022-03-31T02:56:34.000Z
33.277972
199
0.492409
[ [ [ "You can work on this right away in Colab via [this link I prepared in advance](https://colab.research.google.com/github/heartcored98/Standalone-DeepLearning/blob/master/Lec4/Lab6_result_report.ipynb)! \nMake sure the runtime type is python3 with GPU acceleration enabled!", "_____no_output_____" ] ], [ [ "!mkdir results", "_____no_output_____" ], [ "import torch\nimport torchvision\nimport torchvision.transforms as transforms\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\nimport argparse\nimport numpy as np\nimport time\nfrom copy import deepcopy # Add Deepcopy for args", "_____no_output_____" ] ], [ [ "## Data Preparation", "_____no_output_____" ] ], [ [ "transform = transforms.Compose(\n    [transforms.ToTensor(),\n     transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))])\n\ntrainset = torchvision.datasets.CIFAR10(root='./data', train=True,\n                                        download=True, transform=transform)\ntrainset, valset = torch.utils.data.random_split(trainset, [40000, 10000])\ntestset = torchvision.datasets.CIFAR10(root='./data', train=False,\n                                       download=True, transform=transform)\npartition = {'train': trainset, 'val':valset, 'test':testset}", "_____no_output_____" ] ], [ [ "## Model Architecture", "_____no_output_____" ] ], [ [ "class MLP(nn.Module):\n    def __init__(self, in_dim, out_dim, hid_dim, n_layer, act, dropout, use_bn, use_xavier):\n        super(MLP, self).__init__()\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.hid_dim = hid_dim\n        self.n_layer = n_layer\n        self.act = act\n        self.dropout = dropout\n        self.use_bn = use_bn\n        self.use_xavier = use_xavier\n        \n        # ====== Create Linear Layers ====== #\n        self.fc1 = nn.Linear(self.in_dim, self.hid_dim)\n        \n        self.linears = nn.ModuleList()\n        self.bns = nn.ModuleList()\n        for i in range(self.n_layer-1):\n            self.linears.append(nn.Linear(self.hid_dim, self.hid_dim))\n            if self.use_bn:\n                self.bns.append(nn.BatchNorm1d(self.hid_dim))\n                \n        self.fc2 = nn.Linear(self.hid_dim, self.out_dim)\n        \n        # 
====== Create Activation Function ====== #\n        if self.act == 'relu':\n            self.act = nn.ReLU()\n        elif self.act == 'tanh':\n            self.act = nn.Tanh()\n        elif self.act == 'sigmoid':\n            self.act = nn.Sigmoid()\n        else:\n            raise ValueError('no valid activation function selected!')\n        \n        # ====== Create Regularization Layer ======= #\n        self.dropout = nn.Dropout(self.dropout)\n        if self.use_xavier:\n            self.xavier_init()\n          \n    def forward(self, x):\n        x = self.act(self.fc1(x))\n        for i in range(len(self.linears)):\n            x = self.act(self.linears[i](x))\n            if self.use_bn:\n                x = self.bns[i](x)\n            x = self.dropout(x)\n        x = self.fc2(x)\n        return x\n    \n    def xavier_init(self):\n        for linear in self.linears:\n            nn.init.xavier_normal_(linear.weight)\n            linear.bias.data.fill_(0.01)\n            \nnet = MLP(3072, 10, 100, 4, 'relu', 0.1, True, True) # Testing Model Construction", "_____no_output_____" ] ], [ [ "## Train, Validate, Test and Experiment", "_____no_output_____" ] ], [ [ "def train(net, partition, optimizer, criterion, args):\n    trainloader = torch.utils.data.DataLoader(partition['train'], \n                                              batch_size=args.train_batch_size, \n                                              shuffle=True, num_workers=2)\n    net.train()\n\n    correct = 0\n    total = 0\n    train_loss = 0.0\n    for i, data in enumerate(trainloader, 0):\n        optimizer.zero_grad() # [21.01.05 bug fix] changed .zero_grad() to run every iteration instead of once per epoch 
\n\n # get the inputs\n inputs, labels = data\n inputs = inputs.view(-1, 3072)\n inputs = inputs.cuda()\n labels = labels.cuda()\n outputs = net(inputs)\n\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n\n train_loss += loss.item()\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n train_loss = train_loss / len(trainloader)\n train_acc = 100 * correct / total\n return net, train_loss, train_acc", "_____no_output_____" ], [ "def validate(net, partition, criterion, args):\n valloader = torch.utils.data.DataLoader(partition['val'], \n batch_size=args.test_batch_size, \n shuffle=False, num_workers=2)\n net.eval()\n\n correct = 0\n total = 0\n val_loss = 0 \n with torch.no_grad():\n for data in valloader:\n images, labels = data\n images = images.view(-1, 3072)\n images = images.cuda()\n labels = labels.cuda()\n outputs = net(images)\n\n loss = criterion(outputs, labels)\n \n val_loss += loss.item()\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n val_loss = val_loss / len(valloader)\n val_acc = 100 * correct / total\n return val_loss, val_acc", "_____no_output_____" ], [ "def test(net, partition, args):\n testloader = torch.utils.data.DataLoader(partition['test'], \n batch_size=args.test_batch_size, \n shuffle=False, num_workers=2)\n net.eval()\n \n correct = 0\n total = 0\n with torch.no_grad():\n for data in testloader:\n images, labels = data\n images = images.view(-1, 3072)\n images = images.cuda()\n labels = labels.cuda()\n\n outputs = net(images)\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\n test_acc = 100 * correct / total\n return test_acc", "_____no_output_____" ], [ "def experiment(partition, args):\n \n net = MLP(args.in_dim, args.out_dim, args.hid_dim, args.n_layer, args.act, args.dropout, args.use_bn, 
args.use_xavier)\n    net.cuda()\n\n    criterion = nn.CrossEntropyLoss()\n    if args.optim == 'SGD':\n        optimizer = optim.SGD(net.parameters(), lr=args.lr, weight_decay=args.l2)\n    elif args.optim == 'RMSprop':\n        optimizer = optim.RMSprop(net.parameters(), lr=args.lr, weight_decay=args.l2)\n    elif args.optim == 'Adam':\n        optimizer = optim.Adam(net.parameters(), lr=args.lr, weight_decay=args.l2)\n    else:\n        raise ValueError('Invalid optimizer choice')\n    \n    # ===== List for epoch-wise data ====== #\n    train_losses = []\n    val_losses = []\n    train_accs = []\n    val_accs = []\n    # ===================================== #\n        \n    for epoch in range(args.epoch):  # loop over the dataset multiple times\n        ts = time.time()\n        net, train_loss, train_acc = train(net, partition, optimizer, criterion, args)\n        val_loss, val_acc = validate(net, partition, criterion, args)\n        te = time.time()\n        \n        # ====== Add Epoch Data ====== #\n        train_losses.append(train_loss)\n        val_losses.append(val_loss)\n        train_accs.append(train_acc)\n        val_accs.append(val_acc)\n        # ============================ #\n        \n        print('Epoch {}, Acc(train/val): {:2.2f}/{:2.2f}, Loss(train/val) {:2.2f}/{:2.2f}. 
Took {:2.2f} sec'.format(epoch, train_acc, val_acc, train_loss, val_loss, te-ts))\n \n test_acc = test(net, partition, args) \n \n # ======= Add Result to Dictionary ======= #\n result = {}\n result['train_losses'] = train_losses\n result['val_losses'] = val_losses\n result['train_accs'] = train_accs\n result['val_accs'] = val_accs\n result['train_acc'] = train_acc\n result['val_acc'] = val_acc\n result['test_acc'] = test_acc\n return vars(args), result\n # ===================================== #", "_____no_output_____" ] ], [ [ "# Manage Experiment Result", "_____no_output_____" ] ], [ [ "import hashlib\nimport json\nfrom os import listdir\nfrom os.path import isfile, join\nimport pandas as pd\n\ndef save_exp_result(setting, result):\n exp_name = setting['exp_name']\n del setting['epoch']\n del setting['test_batch_size']\n\n hash_key = hashlib.sha1(str(setting).encode()).hexdigest()[:6]\n filename = './results/{}-{}.json'.format(exp_name, hash_key)\n result.update(setting)\n with open(filename, 'w') as f:\n json.dump(result, f)\n\n \ndef load_exp_result(exp_name):\n dir_path = './results'\n filenames = [f for f in listdir(dir_path) if isfile(join(dir_path, f)) if '.json' in f]\n list_result = []\n for filename in filenames:\n if exp_name in filename:\n with open(join(dir_path, filename), 'r') as infile:\n results = json.load(infile)\n list_result.append(results)\n df = pd.DataFrame(list_result) # .drop(columns=[])\n return df\n ", "_____no_output_____" ] ], [ [ "## Experiment", "_____no_output_____" ] ], [ [ "# ====== Random Seed Initialization ====== #\nseed = 123\nnp.random.seed(seed)\ntorch.manual_seed(seed)\n\nparser = argparse.ArgumentParser()\nargs = parser.parse_args(\"\")\nargs.exp_name = \"exp1_n_layer_hid_dim\"\n\n# ====== Model Capacity ====== #\nargs.in_dim = 3072\nargs.out_dim = 10\nargs.hid_dim = 100\nargs.act = 'relu'\n\n# ====== Regularization ======= #\nargs.dropout = 0.2\nargs.use_bn = True\nargs.l2 = 0.00001\nargs.use_xavier = True\n\n# ====== 
Optimizer & Training ====== #\nargs.optim = 'RMSprop' #'RMSprop' #SGD, RMSprop, ADAM...\nargs.lr = 0.0015\nargs.epoch = 10\n\nargs.train_batch_size = 256\nargs.test_batch_size = 1024\n\n# ====== Experiment Variable ====== #\nname_var1 = 'n_layer'\nname_var2 = 'hid_dim'\nlist_var1 = [1, 2, 3]\nlist_var2 = [500, 300]\n\n\nfor var1 in list_var1:\n for var2 in list_var2:\n setattr(args, name_var1, var1)\n setattr(args, name_var2, var2)\n print(args)\n \n setting, result = experiment(partition, deepcopy(args))\n save_exp_result(setting, result)\n", "_____no_output_____" ], [ "import seaborn as sns \nimport matplotlib.pyplot as plt\n\ndf = load_exp_result('exp1')\n\nfig, ax = plt.subplots(1, 3)\nfig.set_size_inches(15, 6)\nsns.set_style(\"darkgrid\", {\"axes.facecolor\": \".9\"})\n\nsns.barplot(x='n_layer', y='train_acc', hue='hid_dim', data=df, ax=ax[0])\nsns.barplot(x='n_layer', y='val_acc', hue='hid_dim', data=df, ax=ax[1])\nsns.barplot(x='n_layer', y='test_acc', hue='hid_dim', data=df, ax=ax[2])\n", "_____no_output_____" ], [ "var1 = 'n_layer'\nvar2 = 'hid_dim'\n\ndf = load_exp_result('exp1')\nlist_v1 = df[var1].unique()\nlist_v2 = df[var2].unique()\nlist_data = []\n\nfor value1 in list_v1:\n for value2 in list_v2:\n row = df.loc[df[var1]==value1]\n row = row.loc[df[var2]==value2]\n \n train_losses = list(row.train_losses)[0]\n val_losses = list(row.val_losses)[0]\n \n for epoch, train_loss in enumerate(train_losses):\n list_data.append({'type':'train', 'loss':train_loss, 'epoch':epoch, var1:value1, var2:value2})\n for epoch, val_loss in enumerate(val_losses):\n list_data.append({'type':'val', 'loss':val_loss, 'epoch':epoch, var1:value1, var2:value2})\n \ndf = pd.DataFrame(list_data)\ng = sns.FacetGrid(df, row=var2, col=var1, hue='type', margin_titles=True, sharey=False)\ng = g.map(plt.plot, 'epoch', 'loss', marker='.')\ng.add_legend()\ng.fig.suptitle('Train loss vs Val loss')\nplt.subplots_adjust(top=0.89)", "_____no_output_____" ], [ "var1 = 'n_layer'\nvar2 = 
'hid_dim'\n\ndf = load_exp_result('exp1')\nlist_v1 = df[var1].unique()\nlist_v2 = df[var2].unique()\nlist_data = []\n\nfor value1 in list_v1:\n for value2 in list_v2:\n row = df.loc[df[var1]==value1]\n row = row.loc[df[var2]==value2]\n \n train_accs = list(row.train_accs)[0]\n val_accs = list(row.val_accs)[0]\n test_acc = list(row.test_acc)[0]\n \n for epoch, train_acc in enumerate(train_accs):\n list_data.append({'type':'train', 'Acc':train_acc, 'test_acc':test_acc, 'epoch':epoch, var1:value1, var2:value2})\n for epoch, val_acc in enumerate(val_accs):\n list_data.append({'type':'val', 'Acc':val_acc, 'test_acc':test_acc, 'epoch':epoch, var1:value1, var2:value2})\n \ndf = pd.DataFrame(list_data)\ng = sns.FacetGrid(df, row=var2, col=var1, hue='type', margin_titles=True, sharey=False)\ng = g.map(plt.plot, 'epoch', 'Acc', marker='.')\n\ndef show_acc(x, y, metric, **kwargs):\n plt.scatter(x, y, alpha=0.3, s=1)\n metric = \"Test Acc: {:1.3f}\".format(list(metric.values)[0])\n plt.text(0.05, 0.95, metric, horizontalalignment='left', verticalalignment='center', transform=plt.gca().transAxes, bbox=dict(facecolor='yellow', alpha=0.5, boxstyle=\"round,pad=0.1\"))\ng = g.map(show_acc, 'epoch', 'Acc', 'test_acc')\n\ng.add_legend()\ng.fig.suptitle('Train Accuracy vs Val Accuracy')\n\n\n\nplt.subplots_adjust(top=0.89)", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code", "code" ] ]
d064c10961958259b92a5c2eacdc9bd02cc89d9b
375,897
ipynb
Jupyter Notebook
phase4_analysis_by_visualization/final_visualizations.ipynb
eric-wisniewski/visualizeYourself_Project
dcb6f3620468206de31ee587d4e2da18d6e1575d
[ "Apache-2.0" ]
null
null
null
phase4_analysis_by_visualization/final_visualizations.ipynb
eric-wisniewski/visualizeYourself_Project
dcb6f3620468206de31ee587d4e2da18d6e1575d
[ "Apache-2.0" ]
null
null
null
phase4_analysis_by_visualization/final_visualizations.ipynb
eric-wisniewski/visualizeYourself_Project
dcb6f3620468206de31ee587d4e2da18d6e1575d
[ "Apache-2.0" ]
null
null
null
1,070.931624
56,644
0.954472
[ [ [ "import pandas as pd\nimport matplotlib.pyplot as plt\nfrom matplotlib import pyplot\n%matplotlib inline\n\nfinal_data_e = pd.read_csv(\"vizSelf_eric.csv\", index_col=0, parse_dates=True)\nfinal_data_p = pd.read_csv(\"vizSelf_parent.csv\", index_col=0, parse_dates=True)\nfinal_data_e.head()\nfinal_data_p.head()", "_____no_output_____" ], [ "axe = final_data_e.plot.area(figsize=(12,4), subplots=True)\naxp = final_data_p.plot.area(figsize=(12,4), subplots=True)", "_____no_output_____" ], [ "axeb = final_data_e.plot.bar(figsize=(12,4), subplots=True)\naxpb = final_data_p.plot.bar(figsize=(12,4), subplots=True)", "_____no_output_____" ], [ "axeh = final_data_e.plot.hist(figsize=(12,4), subplots=True)\naxph = final_data_p.plot.hist(figsize=(12,4), subplots=True)", "_____no_output_____" ], [ "axed = final_data_e.plot.density(figsize=(12,4), subplots=True)\naxpd = final_data_p.plot.density(figsize=(12,4), subplots=True)", "_____no_output_____" ], [ "axebp = final_data_e.plot.box(figsize=(12,4), subplots=True)\naxpbp = final_data_p.plot.box(figsize=(12,4), subplots=True)", "_____no_output_____" ], [ "axekde = final_data_e.plot.kde(figsize=(12,4), subplots=True)\naxpkde = final_data_p.plot.kde(figsize=(12,4), subplots=True)", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code" ] ]
d064ccbc966f61700046f9a84bfae97b6c7585ed
305,168
ipynb
Jupyter Notebook
DAY6_Eda_With_Pandas.ipynb
averryset/H8_Python_for_Data_Science
c1e87c2e272d64147f50ae967bd0153073b23b9b
[ "MIT" ]
null
null
null
DAY6_Eda_With_Pandas.ipynb
averryset/H8_Python_for_Data_Science
c1e87c2e272d64147f50ae967bd0153073b23b9b
[ "MIT" ]
null
null
null
DAY6_Eda_With_Pandas.ipynb
averryset/H8_Python_for_Data_Science
c1e87c2e272d64147f50ae967bd0153073b23b9b
[ "MIT" ]
null
null
null
77.043171
110,700
0.704907
[ [ [ "import pandas as pd\nimport numpy as np", "_____no_output_____" ], [ "df_properti = pd.read_csv(\"https://raw.githubusercontent.com/ardhiraka/PFDS_sources/master/property_data.csv\")", "_____no_output_____" ], [ "df_properti", "_____no_output_____" ], [ "df_properti.shape", "_____no_output_____" ], [ "df_properti.columns", "_____no_output_____" ], [ "df_properti[\"ST_NAME\"]", "_____no_output_____" ], [ "df_properti[\"ST_NUM\"].isna()", "_____no_output_____" ], [ "list_missing_values = [\"n/a\", \"--\", \"na\"]\ndf_properti = pd.read_csv(\n    \"https://raw.githubusercontent.com/ardhiraka/PFDS_sources/master/property_data.csv\",\n    na_values = list_missing_values\n)", "_____no_output_____" ], [ "df_properti", "_____no_output_____" ], [ "df_properti[\"OWN_OCCUPIED\"].isna()", "_____no_output_____" ], [ "cnt=0\ndf_properti_own = df_properti[\"OWN_OCCUPIED\"]\nfor row in df_properti_own:\n    try:\n        int(row)\n        df_properti.loc[cnt, \"OWN_OCCUPIED\"]=np.nan\n    except ValueError:\n        pass\n    cnt+=1", "_____no_output_____" ], [ "df_properti", "_____no_output_____" ], [ "df_properti[\"NEW_OWN_OCCUPIED\"] = df_properti[\"OWN_OCCUPIED\"].apply(\n    lambda val: 1 if val == \"Y\" else 0\n)\ndf_properti", "_____no_output_____" ], [ "df_properti.isna().sum()", "_____no_output_____" ], [ "df_properti.isna().sum().sum()", "_____no_output_____" ], [ "df_properti", "_____no_output_____" ], [ "cnt=0\ndf_properti_num_bat = df_properti[\"NUM_BATH\"]\nfor row in df_properti_num_bat:\n    try:\n        float(row)\n        df_properti.loc[cnt, \"NEW_NUM_BATH\"]=row\n    except ValueError:\n        df_properti.loc[cnt, \"NEW_NUM_BATH\"]=np.nan\n    cnt+=1", "_____no_output_____" ], [ "df_properti", "_____no_output_____" ], [ "df_properti[\"ST_NUM\"].fillna(125)", "_____no_output_____" ], [ "obes = pd.ExcelFile(\"csv/obes.xls\")", "_____no_output_____" ], [ "obes.sheet_names", "_____no_output_____" ], [ "obes_age = obes.parse(\"7.2\", skiprows=4, skipfooter=14)", "_____no_output_____" ], [ "obes_age", "_____no_output_____" ], [ 
"obes_age.set_index('Year', inplace=True)", "_____no_output_____" ], [ "obes_age.plot()", "_____no_output_____" ], [ "obes_age.drop(\"Total\", axis=1).plot()", "_____no_output_____" ], [ "from datetime import datetime", "_____no_output_____" ], [ "datetime.now().date()", "_____no_output_____" ], [ "opsd_daily = pd.read_csv(\n 'https://raw.githubusercontent.com/ardhiraka/PFDS_sources/master/opsd_germany_daily.csv',\n index_col=0, parse_dates=True\n)", "_____no_output_____" ], [ "opsd_daily.head()", "_____no_output_____" ], [ "opsd_daily['Year'] = opsd_daily.index.year\nopsd_daily['Month'] = opsd_daily.index.month\nopsd_daily['Weekday'] = opsd_daily.index.weekday", "_____no_output_____" ], [ "opsd_daily", "_____no_output_____" ], [ "opsd_daily[\"Consumption\"].plot(\n linewidth=.3, \n figsize=(12, 5)\n)", "_____no_output_____" ], [ "df_canada = pd.read_excel(\n \"https://github.com/ardhiraka/PFDS_sources/blob/master/Canada.xlsx?raw=true\",\n sheet_name=\"Canada by Citizenship\",\n skiprows=range(20),\n skipfooter=2\n)", "_____no_output_____" ], [ "df_canada.head()", "_____no_output_____" ], [ "df_canada.columns", "_____no_output_____" ], [ "df_canada.drop(\n columns=[\n \"AREA\", \"REG\", \"DEV\",\n \"Type\", \"Coverage\"\n ],\n axis=1,\n inplace=True\n)", "_____no_output_____" ], [ "df_canada.head()", "_____no_output_____" ], [ "df_canada.rename(\n columns={\n \"OdName\": \"Country\",\n \"AreaName\": \"Continent\",\n \"RegName\": \"Region\"\n },\n inplace=True\n)", "_____no_output_____" ], [ "df_canada.head()", "_____no_output_____" ], [ "df_canada_total = df_canada.sum(axis=1)", "_____no_output_____" ], [ "df_canada[\"Total\"] = df_canada_total\ndf_canada.head()", "_____no_output_____" ], [ "df_canada.describe()", "_____no_output_____" ], [ "df_canada.Country", "_____no_output_____" ], [ "df_canada[\n [\n \"Country\",\n 2000,\n 2001,\n 2002,\n 2003,\n 2004,\n 2005,\n 2006,\n 2007,\n 2008,\n 2009,\n 2010,\n 2011,\n 2012,\n 2013,\n ]\n]", "_____no_output_____" ], [ 
"df_canada[\"Continent\"] == \"Africa\"", "_____no_output_____" ], [ "df_canada[(df_canada[\"Continent\"]==\"Asia\") & (df_canada[\"Region\"]==\"Southern Asia\")]", "_____no_output_____" ] ] ]
[ "code" ]
[ [ "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code", "code" ] ]
d064e55492113bcb0b663aa6bb3350fddb822574
14,748
ipynb
Jupyter Notebook
Demo.ipynb
4nthon/HomographyNet
9749b80b2d68d9ecff6423e209782327440e8226
[ "MIT" ]
8
2020-07-02T00:23:09.000Z
2022-03-17T01:55:22.000Z
Demo.ipynb
4nthon/HomographyNet
9749b80b2d68d9ecff6423e209782327440e8226
[ "MIT" ]
1
2021-09-18T02:03:17.000Z
2021-09-18T02:16:41.000Z
Demo.ipynb
4nthon/HomographyNet
9749b80b2d68d9ecff6423e209782327440e8226
[ "MIT" ]
6
2020-10-26T08:41:48.000Z
2021-07-05T03:08:01.000Z
42.872093
1,663
0.606184
[ [ [ "## 1ใ€ๅฏ่ง†ๅŒ–DataGeneratorHomographyNetๆจกๅ—้ƒฝๅนฒไบ†ไป€ไนˆ", "_____no_output_____" ] ], [ [ "import glob\nimport os\nimport cv2\nimport numpy as np\nfrom dataGenerator import DataGeneratorHomographyNet", "/home/nvidia/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\nUsing TensorFlow backend.\n" ], [ "img_dir = os.path.join(os.path.expanduser(\"~\"), \"/home/nvidia/test2017\")\nimg_ext = \".jpg\"\nimg_paths = glob.glob(os.path.join(img_dir, '*' + img_ext))\ndg = DataGeneratorHomographyNet(img_paths, input_dim=(240, 240))\ndata, label = dg.__getitem__(0)\nfor idx in range(dg.batch_size):\n cv2.imshow(\"orig\", data[idx, :, :, 0])\n cv2.imshow(\"transformed\", data[idx, :, :, 1])\n cv2.waitKey(0)", "_____no_output_____" ] ], [ [ "## 2ใ€ๅผ€ๅง‹่ฎญ็ปƒ", "_____no_output_____" ] ], [ [ "import os\nimport glob\nimport datetime\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport keras\nfrom keras.callbacks import ModelCheckpoint\nfrom sklearn.model_selection import train_test_split\nimport tensorflow as tf\nfrom homographyNet import HomographyNet\nimport dataGenerator as dg\nkeras.__version__", "_____no_output_____" ], [ "batch_size = 2\n#ๅ–ๅ€ผ0,1,2 0-ๅฎ‰้™ๆจกๅผ 1-่ฟ›ๅบฆๆก 2-ๆฏไธ€่กŒ้ƒฝๆœ‰่พ“ๅ‡บ\nverbose = 1\n#Epoch\nnb_epo = 150\n#่ฎกๆ—ถๅผ€ๅง‹\nstart_ts = datetime.datetime.now().strftime(\"%Y%m%d-%H%M%S\")\n#็”จไบŽ่ฎญ็ปƒ็š„ๅ›พ็‰‡็›ฎๅฝ•\ndata_path = \"/home/nvidia/test2017\"\n#ๆจกๅž‹ไฟๅญ˜็š„็›ฎๅฝ•\nmodel_dir = \"/home/nvidia\"\nimg_dir = os.path.join(os.path.expanduser(\"~\"), data_path)\nmodel_dir = os.path.join(os.path.expanduser(\"~\"), model_dir, start_ts)\n#ไปฅๆ—ถ้—ดไธบๅๅˆ›ๅปบ็›ฎๅฝ•\nif not os.path.exists(model_dir):\n os.makedirs(model_dir)", "_____no_output_____" ], [ "img_ext = 
\".jpg\"\n#่Žทๅ–ๆ‰€ๆœ‰ๅ›พๅƒ็›ฎๅฝ•\nimg_paths = glob.glob(os.path.join(img_dir, '*' + img_ext))\ninput_size = (360, 360, 2)\n#ๅˆ’ๅˆ†่ฎญ็ปƒ้›†ๅ’Œ้ชŒ่ฏ้›†๏ผŒ้ชŒ่ฏ้›†ๆžๅฐไธ€็‚น๏ผŒไธ็„ถๆฏไธชepoch่ท‘ๅฎŒๅคชๆ…ขไบ†\ntrain_idx, val_idx = train_test_split(img_paths, test_size=0.01)\n#ๆ‹ฟๅˆฐ่ฎญ็ปƒๆ•ฐๆฎ\ntrain_dg = dg.DataGeneratorHomographyNet(train_idx, input_dim=input_size[0:2], batch_size=batch_size)\n#ๆ‹ฟๅˆฐๆ—ขๅฎšไบ‹ๅฎž็š„ๆ ‡็ญพ\nval_dg = dg.DataGeneratorHomographyNet(val_idx, input_dim=input_size[0:2], batch_size=batch_size)\n#ๅฏนไบŽ็ฅž็ป็ฝ‘็ปœๆฅ่ฏด่ฟ™ไธช้ฌผไธ€ๆ ท็š„ๅ›พๅฐฑๆ˜ฏ่พ“ๅ…ฅ๏ผŒๅฎƒ่‡ชๅทฑไปŽ่ฟ™ๅน…ๅ›พ็š„ๅทฆ่พนๅ’Œๅณ่พนๅญฆไน ๅ‡บๅ•ๅบ”ๆ€ง็Ÿฉ้˜ต๏ผŒ็ฅžๅฅ‡ๅง๏ผŸ\n#ไฟฎๆญฃ็ฝ‘็ปœ่พ“ๅ…ฅๅคด\nhomo_net = HomographyNet(input_size)\n#ๅฎžไพ‹ๅŒ–็ฝ‘็ปœ็ป“ๆž„\nmodel = homo_net.build_model()\n#่พ“ๅ‡บๆจกๅž‹\nmodel.summary()", "WARNING:tensorflow:From /home/nvidia/.local/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:1264: calling reduce_prod (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.\nInstructions for updating:\nkeep_dims is deprecated, use keepdims instead\nWARNING:tensorflow:From /home/nvidia/.local/lib/python3.6/site-packages/keras/backend/tensorflow_backend.py:1349: calling reduce_mean (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.\nInstructions for updating:\nkeep_dims is deprecated, use keepdims instead\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\ninput_1 (InputLayer) (None, 360, 360, 2) 0 \n_________________________________________________________________\nconv2d_1 (Conv2D) (None, 360, 360, 64) 1216 \n_________________________________________________________________\nbatch_normalization_1 (Batch (None, 360, 360, 64) 256 \n_________________________________________________________________\nactivation_1 
(Activation) (None, 360, 360, 64) 0 \n_________________________________________________________________\nconv2d_2 (Conv2D) (None, 360, 360, 64) 36928 \n_________________________________________________________________\nbatch_normalization_2 (Batch (None, 360, 360, 64) 256 \n_________________________________________________________________\nactivation_2 (Activation) (None, 360, 360, 64) 0 \n_________________________________________________________________\nmax_pooling2d_1 (MaxPooling2 (None, 180, 180, 64) 0 \n_________________________________________________________________\nconv2d_3 (Conv2D) (None, 180, 180, 64) 36928 \n_________________________________________________________________\nbatch_normalization_3 (Batch (None, 180, 180, 64) 256 \n_________________________________________________________________\nactivation_3 (Activation) (None, 180, 180, 64) 0 \n_________________________________________________________________\nconv2d_4 (Conv2D) (None, 180, 180, 64) 36928 \n_________________________________________________________________\nbatch_normalization_4 (Batch (None, 180, 180, 64) 256 \n_________________________________________________________________\nactivation_4 (Activation) (None, 180, 180, 64) 0 \n_________________________________________________________________\nmax_pooling2d_2 (MaxPooling2 (None, 90, 90, 64) 0 \n_________________________________________________________________\nconv2d_5 (Conv2D) (None, 90, 90, 128) 73856 \n_________________________________________________________________\nbatch_normalization_5 (Batch (None, 90, 90, 128) 512 \n_________________________________________________________________\nactivation_5 (Activation) (None, 90, 90, 128) 0 \n_________________________________________________________________\nconv2d_6 (Conv2D) (None, 90, 90, 128) 147584 \n_________________________________________________________________\nbatch_normalization_6 (Batch (None, 90, 90, 128) 512 
\n_________________________________________________________________\nactivation_6 (Activation) (None, 90, 90, 128) 0 \n_________________________________________________________________\nmax_pooling2d_3 (MaxPooling2 (None, 45, 45, 128) 0 \n_________________________________________________________________\nconv2d_7 (Conv2D) (None, 45, 45, 128) 147584 \n_________________________________________________________________\nbatch_normalization_7 (Batch (None, 45, 45, 128) 512 \n_________________________________________________________________\nactivation_7 (Activation) (None, 45, 45, 128) 0 \n_________________________________________________________________\nconv2d_8 (Conv2D) (None, 45, 45, 128) 147584 \n_________________________________________________________________\nbatch_normalization_8 (Batch (None, 45, 45, 128) 512 \n_________________________________________________________________\nactivation_8 (Activation) (None, 45, 45, 128) 0 \n_________________________________________________________________\nmax_pooling2d_4 (MaxPooling2 (None, 22, 22, 128) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 61952) 0 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 61952) 0 \n_________________________________________________________________\ndense_1 (Dense) (None, 1028) 63687684 \n_________________________________________________________________\nactivation_9 (Activation) (None, 1028) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 8) 8232 \n=================================================================\nTotal params: 64,327,596\nTrainable params: 64,326,060\nNon-trainable params: 1,536\n_________________________________________________________________\n" ], [ "#ๆฃ€ๆŸฅ็‚นๅ›ž่ฐƒ๏ผŒๆฒกๅ†™tensorboard็š„ๅ›ž่ฐƒ๏ผŒ็œŸๆญฃ็š„ๅคงๅธˆ้ƒฝๆ˜ฏ็›ดๆŽฅ็œ‹loss่พ“ๅ‡บ็š„\ncheckpoint = ModelCheckpoint(\n os.path.join(model_dir, 'model.h5'),\n 
monitor='val_loss',\n verbose=verbose,\n save_best_only=True,\n save_weights_only=False,\n mode='auto'\n)", "_____no_output_____" ], [ "# too much hassle to change it above, so just redefining it here\n# start training\n# without steps_per_epoch=32, each epoch runs through the full dataset\nhistory = model.fit_generator(train_dg, \n validation_data = val_dg,\n #steps_per_epoch = 32, \n callbacks = [checkpoint], \n epochs = 15, \n verbose = 1)", "Epoch 1/15\n 1373/20131 [=>............................] - ETA: 1:18:50 - loss: 1615938396204833.0000 - mean_squared_error: 1615938396204833.0000" ] ], [ [ "\n", "_____no_output_____" ] ], [ [ "# plot the whole training history\nhistory_df = pd.DataFrame(history.history)\nhistory_df.to_csv(os.path.join(model_dir, 'history.csv'))\nhistory_df[['loss', 'val_loss']].plot()\nhistory_df[['mean_squared_error', 'val_mean_squared_error']].plot()\nplt.show()", "_____no_output_____" ] ], [ [ "## Prediction & evaluation", "_____no_output_____" ] ], [ [ "TODO", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ] ]
d064e82ca5571ab846425be59e36d57fa95702af
34,696
ipynb
Jupyter Notebook
Chapter08/.ipynb_checkpoints/ch8-diamond-prices-model-tuning-checkpoint.ipynb
arifmudi/Hands-On-Predictive-Analytics-with-Python
27122c8c75711c1e3e29d265f13788b9c4b8f5ee
[ "MIT" ]
38
2019-01-03T14:54:56.000Z
2022-02-02T04:13:35.000Z
Chapter08/.ipynb_checkpoints/ch8-diamond-prices-model-tuning-checkpoint.ipynb
arifmudi/Hands-On-Predictive-Analytics-with-Python
27122c8c75711c1e3e29d265f13788b9c4b8f5ee
[ "MIT" ]
4
2019-07-03T11:25:24.000Z
2020-11-21T07:15:27.000Z
Chapter08/.ipynb_checkpoints/ch8-diamond-prices-model-tuning-checkpoint.ipynb
arifmudi/Hands-On-Predictive-Analytics-with-Python
27122c8c75711c1e3e29d265f13788b9c4b8f5ee
[ "MIT" ]
31
2018-12-27T05:00:08.000Z
2022-03-22T23:24:57.000Z
65.095685
20,952
0.806837
[ [ [ "# Diamond Prices: Model Tuning and Improving Performance", "_____no_output_____" ], [ "#### Importing libraries", "_____no_output_____" ] ], [ [ "import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport os\n\npd.options.mode.chained_assignment = None\n%matplotlib inline", "_____no_output_____" ] ], [ [ "#### Loading the dataset", "_____no_output_____" ] ], [ [ "DATA_DIR = '../data'\nFILE_NAME = 'diamonds.csv'\ndata_path = os.path.join(DATA_DIR, FILE_NAME)\ndiamonds = pd.read_csv(data_path)", "_____no_output_____" ] ], [ [ "#### Preparing the dataset", "_____no_output_____" ] ], [ [ "## Preparation done from Chapter 2\ndiamonds = diamonds.loc[(diamonds['x']>0) | (diamonds['y']>0)]\ndiamonds.loc[11182, 'x'] = diamonds['x'].median()\ndiamonds.loc[11182, 'z'] = diamonds['z'].median()\ndiamonds = diamonds.loc[~((diamonds['y'] > 30) | (diamonds['z'] > 30))]\ndiamonds = pd.concat([diamonds, pd.get_dummies(diamonds['cut'], prefix='cut', drop_first=True)], axis=1)\ndiamonds = pd.concat([diamonds, pd.get_dummies(diamonds['color'], prefix='color', drop_first=True)], axis=1)\ndiamonds = pd.concat([diamonds, pd.get_dummies(diamonds['clarity'], prefix='clarity', drop_first=True)], axis=1)\n\n## Dimensionality reduction\nfrom sklearn.decomposition import PCA\npca = PCA(n_components=1, random_state=123)\ndiamonds['dim_index'] = pca.fit_transform(diamonds[['x','y','z']])\ndiamonds.drop(['x','y','z'], axis=1, inplace=True)", "_____no_output_____" ], [ "diamonds.columns", "_____no_output_____" ] ], [ [ "#### Train-test split", "_____no_output_____" ] ], [ [ "X = diamonds.drop(['cut','color','clarity','price'], axis=1)\ny = diamonds['price']\n\nfrom sklearn.model_selection import train_test_split\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=7)", "_____no_output_____" ] ], [ [ "#### Standarization: centering and scaling ", "_____no_output_____" ] ], [ [ "numerical_features = ['carat', 
'depth', 'table', 'dim_index']\nfrom sklearn.preprocessing import StandardScaler\nscaler = StandardScaler()\nscaler.fit(X_train[numerical_features])\nX_train.loc[:, numerical_features] = scaler.fit_transform(X_train[numerical_features])\nX_test.loc[:, numerical_features] = scaler.transform(X_test[numerical_features])", "_____no_output_____" ] ], [ [ "## Optimizing a single hyper-parameter", "_____no_output_____" ] ], [ [ "X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.1, random_state=13)", "_____no_output_____" ], [ "from sklearn.neighbors import KNeighborsRegressor\nfrom sklearn.metrics import mean_absolute_error\n\ncandidates = np.arange(4,16)\nmae_metrics = []\nfor k in candidates:\n model = KNeighborsRegressor(n_neighbors=k, weights='distance', metric='minkowski', leaf_size=50, n_jobs=4)\n model.fit(X_train, y_train)\n y_pred = model.predict(X_val)\n metric = mean_absolute_error(y_true=y_val, y_pred=y_pred)\n mae_metrics.append(metric)", "_____no_output_____" ], [ "fig, ax = plt.subplots(figsize=(8,5))\nax.plot(candidates, mae_metrics, \"o-\")\nax.set_xlabel('Hyper-parameter K', fontsize=14)\nax.set_ylabel('MAE', fontsize=14)\nax.set_xticks(candidates)\nax.grid();", "_____no_output_____" ] ], [ [ "#### Recalculating train-test split", "_____no_output_____" ] ], [ [ "X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.1, random_state=7)\nscaler = StandardScaler()\nscaler.fit(X_train[numerical_features])\nX_train.loc[:, numerical_features] = scaler.fit_transform(X_train[numerical_features])\nX_test.loc[:, numerical_features] = scaler.transform(X_test[numerical_features])", "_____no_output_____" ] ], [ [ "#### Optimizing with cross-validation", "_____no_output_____" ] ], [ [ "from sklearn.model_selection import cross_val_score\ncandidates = np.arange(4,16)\nmean_mae = []\nstd_mae = []\nfor k in candidates:\n model = KNeighborsRegressor(n_neighbors=k, weights='distance', metric='minkowski', leaf_size=50, n_jobs=4)\n 
cv_results = cross_val_score(model, X_train, y_train, scoring='neg_mean_absolute_error', cv=10)\n mean_score, std_score = -1*cv_results.mean(), cv_results.std()\n mean_mae.append(mean_score)\n std_mae.append(std_score)", "_____no_output_____" ], [ "fig, ax = plt.subplots(figsize=(8,5))\nax.plot(candidates, mean_mae, \"o-\")\nax.set_xlabel('Hyper-parameter K', fontsize=14)\nax.set_ylabel('Mean MAE', fontsize=14)\nax.set_xticks(candidates)\nax.grid();", "_____no_output_____" ], [ "fig, ax = plt.subplots(figsize=(8,5))\nax.plot(candidates, std_mae, \"o-\")\nax.set_xlabel('Hyper-parameter K', fontsize=14)\nax.set_ylabel('Standard deviation of MAE', fontsize=14)\nax.set_xticks(candidates)\nax.grid();", "_____no_output_____" ] ], [ [ "# Improving Performance", "_____no_output_____" ], [ "## Improving our diamond price predictions", "_____no_output_____" ], [ "### Fitting a neural network", "_____no_output_____" ] ], [ [ "from keras.models import Sequential\nfrom keras.layers import Dense\n\nn_input = X_train.shape[1]\nn_hidden1 = 32\nn_hidden2 = 16\nn_hidden3 = 8\n\nnn_reg = Sequential()\nnn_reg.add(Dense(units=n_hidden1, activation='relu', input_shape=(n_input,)))\nnn_reg.add(Dense(units=n_hidden2, activation='relu'))\nnn_reg.add(Dense(units=n_hidden3, activation='relu'))\n# output layer\nnn_reg.add(Dense(units=1, activation=None))", "_____no_output_____" ], [ "batch_size = 32\nn_epochs = 40\nnn_reg.compile(loss='mean_absolute_error', optimizer='adam')\nnn_reg.fit(X_train, y_train, epochs=n_epochs, batch_size=batch_size, validation_split=0.05)", "_____no_output_____" ], [ "y_pred = nn_reg.predict(X_test).flatten()\nmae_neural_net = mean_absolute_error(y_test, y_pred)\nprint(\"MAE Neural Network: {:0.2f}\".format(mae_neural_net))", "_____no_output_____" ] ], [ [ "### Transforming the target", "_____no_output_____" ] ], [ [ "diamonds['price'].hist(bins=25, ec='k', figsize=(8,5))\nplt.title(\"Distribution of diamond prices\", fontsize=16)\nplt.grid(False);", 
"_____no_output_____" ], [ "y_train = np.log(y_train)\npd.Series(y_train).hist(bins=25, ec='k', figsize=(8,5))\nplt.title(\"Distribution of log diamond prices\", fontsize=16)\nplt.grid(False);", "_____no_output_____" ], [ "nn_reg = Sequential()\nnn_reg.add(Dense(units=n_hidden1, activation='relu', input_shape=(n_input,)))\nnn_reg.add(Dense(units=n_hidden2, activation='relu'))\nnn_reg.add(Dense(units=n_hidden3, activation='relu'))\n# output layer\nnn_reg.add(Dense(units=1, activation=None))", "_____no_output_____" ], [ "batch_size = 32\nn_epochs = 40\nnn_reg.compile(loss='mean_absolute_error', optimizer='adam')\nnn_reg.fit(X_train, y_train, epochs=n_epochs, batch_size=batch_size, validation_split=0.05)", "_____no_output_____" ], [ "y_pred = nn_reg.predict(X_test).flatten()\ny_pred = np.exp(y_pred)\nmae_neural_net2 = mean_absolute_error(y_test, y_pred)\nprint(\"MAE Neural Network (modified target): {:0.2f}\".format(mae_neural_net2))", "_____no_output_____" ], [ "100*(mae_neural_net - mae_neural_net2)/mae_neural_net2", "_____no_output_____" ] ], [ [ "#### Analyzing the results", "_____no_output_____" ] ], [ [ "fig, ax = plt.subplots(figsize=(8,5))\nresiduals = y_test - y_pred\nax.scatter(y_test, residuals, s=3)\nax.set_title('Residuals vs. Observed Prices', fontsize=16)\nax.set_xlabel('Observed prices', fontsize=14)\nax.set_ylabel('Residuals', fontsize=14)\nax.grid();", "_____no_output_____" ], [ "mask_7500 = y_test <=7500\nmae_neural_less_7500 = mean_absolute_error(y_test[mask_7500], y_pred[mask_7500])\nprint(\"MAE considering price <= 7500: {:0.2f}\".format(mae_neural_less_7500))", "_____no_output_____" ], [ "fig, ax = plt.subplots(figsize=(8,5))\npercent_residuals = (y_test - y_pred)/y_test\nax.scatter(y_test, percent_residuals, s=3)\nax.set_title('Pecent residuals vs. 
Observed Prices', fontsize=16)\nax.set_xlabel('Observed prices', fontsize=14)\nax.set_ylabel('Pecent residuals', fontsize=14)\nax.axhline(y=0.15, color='r'); ax.axhline(y=-0.15, color='r'); \nax.grid();", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code", "markdown", "code" ]
[ [ "markdown", "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code" ], [ "markdown" ], [ "code", "code", "code" ], [ "markdown", "markdown", "markdown" ], [ "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code", "code", "code", "code" ], [ "markdown" ], [ "code", "code", "code" ] ]
d064e9236cbc6f092035bc0d5dba899559c20f39
254,266
ipynb
Jupyter Notebook
materials/seaborn_data_viz_complete_with_outputs.ipynb
kthrog/dataviz_workshop
971836516c72d3d07e7ef59ee8bb313cd788991c
[ "CC0-1.0" ]
null
null
null
materials/seaborn_data_viz_complete_with_outputs.ipynb
kthrog/dataviz_workshop
971836516c72d3d07e7ef59ee8bb313cd788991c
[ "CC0-1.0" ]
null
null
null
materials/seaborn_data_viz_complete_with_outputs.ipynb
kthrog/dataviz_workshop
971836516c72d3d07e7ef59ee8bb313cd788991c
[ "CC0-1.0" ]
null
null
null
167.280263
69,956
0.785123
[ [ [ "# Visualizing COVID-19 Hospital Dataset with Seaborn\n\n**Pre-Work:**\n1. Ensure that Jupyter Notebook, Python 3, and seaborn (which will also install dependency libraries if not already installed) are installed. (See resources below for installation instructions.)\n\n### **Instructions:**\n1. Using Python, import main visualization library, `seaborn`, and its dependencies: `pandas`, `numpy`, and `matplotlib`.\n2. Define dataset and read in data using pandas function, `read_json()`. [Notes: a) we're reading in data as an API endpoint; for more about this, see associated workshop slides or resources at bottom of notebook. b) If, instead, you prefer to use your own data, see comment with alternative for `read_csv()`.]\n3. Check data has been read is as expected using `head()` function.\n4. Graph two variables with `seaborn`as a lineplot using the `lineplot()` function.\n5. Graph these same variables, plus a third, from the source dataset with `seaborn` as a scatterplot using the `relplot()` function.\n6. See additional methods, using filtered data and other graphs. Feel free to open a new notebook, and try out your own ideas, using different variables or charts. (Or try out your own data!)\n7. When ready, save figure using `matplotlib`'s `savefig`.\n\n**Note:**\n*If you're new to Jupyter Notebook, see resources below.*\n\n### **Data source:**\n\n[COVID-19 Reported Patient Impact and Hospital Capacity by State Timeseries](https://healthdata.gov/Hospital/COVID-19-Reported-Patient-Impact-and-Hospital-Capa/g62h-syeh),\" created by the U.S. 
Department of Health & Human Services, on [HealthData.gov](https://healthdata.gov/).", "_____no_output_____" ] ], [ [ "# import libraries\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport seaborn as sns", "_____no_output_____" ], [ "# read JSON data in via healthdata.gov's API endpoint - https://healthdata.gov/resource/g62h-syeh.json?$limit=50000\n# because the SODA API defaults to 1,000 rows, we're going to change that with the $limit parameter\n# define data as 'covid' and set equal to read function\n# if we want filtered data to compare to, define more datasets\n\ncovid = pd.read_json(\"https://healthdata.gov/resource/g62h-syeh.json?$limit=50000\")\n\ncovid_ct = pd.read_json(\"https://healthdata.gov/resource/g62h-syeh.json?state=CT\")\n\ncovid_maytopresent = pd.read_json(\"https://healthdata.gov/resource/g62h-syeh.json?$limit=50000&$where=date%20between%20%272021-05-01T12:00:00%27%20and%20%272021-08-01T12:00:00%27\")\n\n# if you want to read in your own data, see resources below, or if you have a CSV, try: mydata = pd.read_csv('')\n# and add data filepath inside ''\n# be sure to change covid to mydata in code below", "_____no_output_____" ], [ "# use head function and call our dataset (covid) to see the first few rows \n# the default argument for this function is 5 rows, but you can set this to anything, e.g. 
covid.head(20)\ncovid.head()", "_____no_output_____" ], [ "# example of head with more rows\ncovid_ct.head(20)", "_____no_output_____" ], [ "# use seaborn to plot inpatient beds used versus whether a critical staffing shortage is occuring\n# we also need to tell seaborn what dataset to use; in this case it's 'covid' as defined above\n# variables: inpatient_beds_used_covid; critical_staffing_shortage_today_yes\n\nsns.lineplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", data=covid)\n\n# save and name fig; uncomment below to run\n# plt.savefig('covid_lineplot.png')", "_____no_output_____" ], [ "# use seaborn to plot inpatient beds used versus whether a critical staffing shortage is occuring\n# this time, with a bar plot\n# variables: inpatient_beds_used_covid; critical_staffing_shortage_today_yes\n\nsns.barplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", data=covid)\n\n# save and name fig; uncomment below to run\n# plt.savefig('covid_barplot.png')", "_____no_output_____" ], [ "# now we're going to try another graph type, a relational graph that will be scatterplot, with the same variables\n# and add one more variable, deaths_covid, to color dots based on prevalance of COVID-19 deaths by setting hue\n# though feel free to try new variables by browsing them here (scroll down to Columns in this Dataset): https://healthdata.gov/Hospital/COVID-19-Reported-Patient-Impact-and-Hospital-Capa/g62h-syeh\n# variables: inpatient_beds_used_covid; critical_staffing_shortage_today_yes; deaths_covid\n\nsns.relplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", hue=\"deaths_covid\", data=covid)\n\n# save and name fig; uncomment below to run\n# plt.savefig('covid_scatterplot.png')", "_____no_output_____" ], [ "# now let's try some graphs with the more limited datasets above, for instance, just the CT data\n\nsns.relplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", 
hue=\"deaths_covid\", data=covid_ct)\n", "_____no_output_____" ], [ "# or just the May - August (present) 2021 date range\n\nsns.relplot(x='inpatient_beds_used_covid', y=\"critical_staffing_shortage_today_yes\", hue=\"deaths_covid\", data=covid_maytopresent)\n", "_____no_output_____" ] ], [ [ "### Final Note:\nIt's important to remember that we can't necessarily infer any causation or directionality from these charts, but they can be a good place to start for further analysis and exploration, and can point us in the right direction of where to apply more advanced statistical methods, such as linear regression. Even with more advanced methods, though, we still want to stick the principles we're using here: keep charts as simple as possible, using only a few variables, and adding color only where needed. We want our charts to be readable and understandable -- see resources below for more advice and guidance on this. \n\nUltimately, these quick-start methods are helpful for idea generation and early investigation, and can get that process up and running quickly.", "_____no_output_____" ], [ "#### Code/Tools Resources:\n- Jupyter notebook - about: https://jupyter-notebook.readthedocs.io/en/stable/notebook.html#introduction\n- Jupyter notebook - how to use this tool: https://jupyter-notebook.readthedocs.io/en/stable/notebook.html\n- Python: https://www.python.org/\n- Seaborn: https://seaborn.pydata.org/index.html\n- Seaborn tutorial: https://seaborn.pydata.org/tutorial.html\n- Seaborn gallery: https://seaborn.pydata.org/examples/index.html\n- Seaborn `lineplot()` function: https://seaborn.pydata.org/generated/seaborn.lineplot.html#seaborn.lineplot + https://seaborn.pydata.org/examples/errorband_lineplots.html\n- Seaborn `relplot()` function: https://seaborn.pydata.org/generated/seaborn.relplot.html#seaborn.relplot + https://seaborn.pydata.org/examples/faceted_lineplot.html\n- Pandas: https://pandas.pydata.org/\n- Pandas - how to read / write tabular data: 
https://pandas.pydata.org/docs/getting_started/intro_tutorials/02_read_write.html\n- Pandas `read.json()` function: https://pandas.pydata.org/docs/reference/api/pandas.io.json.read_json.html?highlight=read_json#pandas.io.json.read_json\n- Pandas `head()` function: https://pandas.pydata.org/docs/reference/api/pandas.DataFrame.head.html?highlight=head#pandas.DataFrame.head\n- Matplotlib: https://matplotlib.org/\n- Matplotlib `savefig` function: https://matplotlib.org/stable/api/_as_gen/matplotlib.pyplot.savefig.html\n- Socrata Open Data API (SODA) Docs: https://dev.socrata.com/\n- SODA Docs for [Dataset](https://healthdata.gov/Hospital/COVID-19-Reported-Patient-Impact-and-Hospital-Capa/g62h-syeh): https://dev.socrata.com/foundry/healthdata.gov/g62h-syeh\n- SODA Docs - what is an endpoint: https://dev.socrata.com/docs/endpoints.html\n\n#### Visualization Resources:\n- 10 Simple Rules for Better Figures | *PLOS Comp Bio*: https://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1003833\n\n- How to Choose the Right Data Visualization | *Chartio*: https://chartio.com/learn/charts/how-to-choose-data-visualization/\n\n#### Additional Note:\nThis notebook was created by Kaitlin Throgmorton for a data analysis workshop, as part of an interview for Yale University.", "_____no_output_____" ] ] ]
[ "markdown", "code", "markdown" ]
[ [ "markdown" ], [ "code", "code", "code", "code", "code", "code", "code", "code", "code" ], [ "markdown", "markdown" ] ]