problem_id (string, 18-22 chars) | source (string, 1 class) | task_type (string, 1 class) | in_source_id (string, 13-58 chars) | prompt (string, 1.1k-10.2k chars) | golden_diff (string, 151-4.94k chars) | verification_info (string, 582-21k chars) | num_tokens (int64, 271-2.05k) | num_tokens_diff (int64, 47-1.02k) |
---|---|---|---|---|---|---|---|---|
gh_patches_debug_63334 | rasdani/github-patches | git_diff | sanic-org__sanic-1527 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Publish 19.3 release to PyPI
Thank you for the release 3 days ago!
https://github.com/huge-success/sanic/releases/tag/19.3
It's missing from PyPI at the moment:
https://pypi.org/project/sanic/#history
Please publish it at your convenience 🙇
Keep up the awesome work ❤️
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sanic/__init__.py`
Content:
```
1 from sanic.app import Sanic
2 from sanic.blueprints import Blueprint
3
4
5 __version__ = "18.12.0"
6
7 __all__ = ["Sanic", "Blueprint"]
8
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sanic/__init__.py b/sanic/__init__.py
--- a/sanic/__init__.py
+++ b/sanic/__init__.py
@@ -2,6 +2,6 @@
from sanic.blueprints import Blueprint
-__version__ = "18.12.0"
+__version__ = "19.03.0"
__all__ = ["Sanic", "Blueprint"]
| {"golden_diff": "diff --git a/sanic/__init__.py b/sanic/__init__.py\n--- a/sanic/__init__.py\n+++ b/sanic/__init__.py\n@@ -2,6 +2,6 @@\n from sanic.blueprints import Blueprint\n \n \n-__version__ = \"18.12.0\"\n+__version__ = \"19.03.0\"\n \n __all__ = [\"Sanic\", \"Blueprint\"]\n", "issue": "Publish 19.3 release to PyPI\nThank you for the release 3 days ago!\r\n\r\nhttps://github.com/huge-success/sanic/releases/tag/19.3\r\n\r\nIt's missing from PyPI at the moment:\r\n\r\nhttps://pypi.org/project/sanic/#history\r\n\r\nPlease publish it at your convenience \ud83d\ude47 \r\n\r\nKeep up the awesome work \u2764\ufe0f \n", "before_files": [{"content": "from sanic.app import Sanic\nfrom sanic.blueprints import Blueprint\n\n\n__version__ = \"18.12.0\"\n\n__all__ = [\"Sanic\", \"Blueprint\"]\n", "path": "sanic/__init__.py"}], "after_files": [{"content": "from sanic.app import Sanic\nfrom sanic.blueprints import Blueprint\n\n\n__version__ = \"19.03.0\"\n\n__all__ = [\"Sanic\", \"Blueprint\"]\n", "path": "sanic/__init__.py"}]} | 389 | 95 |
gh_patches_debug_2102 | rasdani/github-patches | git_diff | rucio__rucio-1372 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix activity in BB8
Motivation
----------
BB8 uses activity `Data Rebalancing` but the activity defined in ATLAS schema is `Data rebalancing`. We should use the same activity everywhere, and it should be consistent with the share defined in FTS
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/rucio/vcsversion.py`
Content:
```
1
2 '''
3 This file is automatically generated; Do not edit it. :)
4 '''
5 VERSION_INFO = {
6 'final': True,
7 'version': '1.17.4',
8 'branch_nick': 'patch-0-1_17_4_client_release_prep',
9 'revision_id': 'ba996ce9bf8366cd7d8d1fb60a7f1daf8d4f517e',
10 'revno': 6827
11 }
12
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/rucio/vcsversion.py b/lib/rucio/vcsversion.py
--- a/lib/rucio/vcsversion.py
+++ b/lib/rucio/vcsversion.py
@@ -4,8 +4,8 @@
'''
VERSION_INFO = {
'final': True,
- 'version': '1.17.4',
- 'branch_nick': 'patch-0-1_17_4_client_release_prep',
- 'revision_id': 'ba996ce9bf8366cd7d8d1fb60a7f1daf8d4f517e',
- 'revno': 6827
+ 'version': '1.17.5',
+ 'branch_nick': 'patch-0-1_17_5_preparation',
+ 'revision_id': '537e1e47eb627741394b6bb9bc21d0f046296275',
+ 'revno': 6837
}
| {"golden_diff": "diff --git a/lib/rucio/vcsversion.py b/lib/rucio/vcsversion.py\n--- a/lib/rucio/vcsversion.py\n+++ b/lib/rucio/vcsversion.py\n@@ -4,8 +4,8 @@\n '''\n VERSION_INFO = {\n 'final': True,\n- 'version': '1.17.4',\n- 'branch_nick': 'patch-0-1_17_4_client_release_prep',\n- 'revision_id': 'ba996ce9bf8366cd7d8d1fb60a7f1daf8d4f517e',\n- 'revno': 6827\n+ 'version': '1.17.5',\n+ 'branch_nick': 'patch-0-1_17_5_preparation',\n+ 'revision_id': '537e1e47eb627741394b6bb9bc21d0f046296275',\n+ 'revno': 6837\n }\n", "issue": "Fix activity in BB8\nMotivation\r\n----------\r\n\r\nBB8 uses activity `Data Rebalancing` but the activity defined in ATLAS schema is `Data rebalancing`. We should use the same activity everywhere, and it should be consistent with the share defined in FTS\n", "before_files": [{"content": "\n'''\nThis file is automatically generated; Do not edit it. :)\n'''\nVERSION_INFO = {\n 'final': True,\n 'version': '1.17.4',\n 'branch_nick': 'patch-0-1_17_4_client_release_prep',\n 'revision_id': 'ba996ce9bf8366cd7d8d1fb60a7f1daf8d4f517e',\n 'revno': 6827\n}\n", "path": "lib/rucio/vcsversion.py"}], "after_files": [{"content": "\n'''\nThis file is automatically generated; Do not edit it. :)\n'''\nVERSION_INFO = {\n 'final': True,\n 'version': '1.17.5',\n 'branch_nick': 'patch-0-1_17_5_preparation',\n 'revision_id': '537e1e47eb627741394b6bb9bc21d0f046296275',\n 'revno': 6837\n}\n", "path": "lib/rucio/vcsversion.py"}]} | 443 | 243 |
gh_patches_debug_17555 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-5324 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `extensions/base_extension.py`
Content:
```
1 import hashlib
2 import os
3 from abc import ABC, abstractmethod
4 from typing import Union
5
6 __all__ = ["_Extension"]
7
8
9 class _Extension(ABC):
10 def __init__(self, name: str, support_aot: bool, support_jit: bool, priority: int = 1):
11 self._name = name
12 self._support_aot = support_aot
13 self._support_jit = support_jit
14 self.priority = priority
15
16 @property
17 def name(self):
18 return self._name
19
20 @property
21 def support_aot(self):
22 return self._support_aot
23
24 @property
25 def support_jit(self):
26 return self._support_jit
27
28 @staticmethod
29 def get_jit_extension_folder_path():
30 """
31 Kernels which are compiled during runtime will be stored in the same cache folder for reuse.
32 The folder is in the path ~/.cache/colossalai/torch_extensions/<cache-folder>.
33 The name of the <cache-folder> follows a common format:
34 torch<torch_version_major>.<torch_version_minor>_<device_name><device_version>-<hash>
35
36 The <hash> suffix is the hash value of the path of the `colossalai` file.
37 """
38 import torch
39
40 import colossalai
41 from colossalai.accelerator import get_accelerator
42
43 # get torch version
44 torch_version_major = torch.__version__.split(".")[0]
45 torch_version_minor = torch.__version__.split(".")[1]
46
47 # get device version
48 device_name = get_accelerator().name
49 device_version = get_accelerator().get_version()
50
51 # use colossalai's file path as hash
52 hash_suffix = hashlib.sha256(colossalai.__file__.encode()).hexdigest()
53
54 # concat
55 home_directory = os.path.expanduser("~")
56 extension_directory = f".cache/colossalai/torch_extensions/torch{torch_version_major}.{torch_version_minor}_{device_name}-{device_version}-{hash_suffix}"
57 cache_directory = os.path.join(home_directory, extension_directory)
58 return cache_directory
59
60 @abstractmethod
61 def is_hardware_available(self) -> bool:
62 """
63 Check if the hardware required by the kernel is available.
64 """
65
66 @abstractmethod
67 def assert_hardware_compatible(self) -> bool:
68 """
69 Check if the hardware required by the kernel is compatible.
70 """
71
72 @abstractmethod
73 def build_aot(self) -> Union["CppExtension", "CUDAExtension"]:
74 pass
75
76 @abstractmethod
77 def build_jit(self) -> None:
78 pass
79
80 @abstractmethod
81 def load(self):
82 pass
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/extensions/base_extension.py b/extensions/base_extension.py
--- a/extensions/base_extension.py
+++ b/extensions/base_extension.py
@@ -1,7 +1,7 @@
import hashlib
import os
from abc import ABC, abstractmethod
-from typing import Union
+from typing import Callable, Union
__all__ = ["_Extension"]
@@ -64,7 +64,7 @@
"""
@abstractmethod
- def assert_hardware_compatible(self) -> bool:
+ def assert_hardware_compatible(self) -> None:
"""
Check if the hardware required by the kernel is compatible.
"""
@@ -74,9 +74,9 @@
pass
@abstractmethod
- def build_jit(self) -> None:
+ def build_jit(self) -> Callable:
pass
@abstractmethod
- def load(self):
+ def load(self) -> Callable:
pass
| {"golden_diff": "diff --git a/extensions/base_extension.py b/extensions/base_extension.py\n--- a/extensions/base_extension.py\n+++ b/extensions/base_extension.py\n@@ -1,7 +1,7 @@\n import hashlib\n import os\n from abc import ABC, abstractmethod\n-from typing import Union\n+from typing import Callable, Union\n \n __all__ = [\"_Extension\"]\n \n@@ -64,7 +64,7 @@\n \"\"\"\n \n @abstractmethod\n- def assert_hardware_compatible(self) -> bool:\n+ def assert_hardware_compatible(self) -> None:\n \"\"\"\n Check if the hardware required by the kernel is compatible.\n \"\"\"\n@@ -74,9 +74,9 @@\n pass\n \n @abstractmethod\n- def build_jit(self) -> None:\n+ def build_jit(self) -> Callable:\n pass\n \n @abstractmethod\n- def load(self):\n+ def load(self) -> Callable:\n pass\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "import hashlib\nimport os\nfrom abc import ABC, abstractmethod\nfrom typing import Union\n\n__all__ = [\"_Extension\"]\n\n\nclass _Extension(ABC):\n def __init__(self, name: str, support_aot: bool, support_jit: bool, priority: int = 1):\n self._name = name\n self._support_aot = support_aot\n self._support_jit = support_jit\n self.priority = priority\n\n @property\n def name(self):\n return self._name\n\n @property\n def support_aot(self):\n return self._support_aot\n\n @property\n def support_jit(self):\n return self._support_jit\n\n @staticmethod\n def get_jit_extension_folder_path():\n \"\"\"\n Kernels which are compiled during runtime will be stored in the same cache folder for reuse.\n The folder is in the path ~/.cache/colossalai/torch_extensions/<cache-folder>.\n The name of the <cache-folder> follows a common format:\n torch<torch_version_major>.<torch_version_minor>_<device_name><device_version>-<hash>\n\n The <hash> suffix is the hash value of the path of the `colossalai` file.\n \"\"\"\n import torch\n\n import colossalai\n from colossalai.accelerator import get_accelerator\n\n # get torch version\n torch_version_major = torch.__version__.split(\".\")[0]\n torch_version_minor = torch.__version__.split(\".\")[1]\n\n # get device version\n device_name = get_accelerator().name\n device_version = get_accelerator().get_version()\n\n # use colossalai's file path as hash\n hash_suffix = hashlib.sha256(colossalai.__file__.encode()).hexdigest()\n\n # concat\n home_directory = os.path.expanduser(\"~\")\n extension_directory = f\".cache/colossalai/torch_extensions/torch{torch_version_major}.{torch_version_minor}_{device_name}-{device_version}-{hash_suffix}\"\n cache_directory = os.path.join(home_directory, extension_directory)\n return cache_directory\n\n @abstractmethod\n def is_hardware_available(self) -> bool:\n \"\"\"\n Check if the hardware required by the kernel is available.\n \"\"\"\n\n @abstractmethod\n def assert_hardware_compatible(self) -> bool:\n \"\"\"\n Check if the hardware required by the kernel is compatible.\n \"\"\"\n\n @abstractmethod\n def build_aot(self) -> Union[\"CppExtension\", \"CUDAExtension\"]:\n pass\n\n @abstractmethod\n def build_jit(self) -> None:\n pass\n\n @abstractmethod\n def load(self):\n pass\n", "path": "extensions/base_extension.py"}], "after_files": [{"content": "import hashlib\nimport os\nfrom abc import ABC, abstractmethod\nfrom typing import Callable, Union\n\n__all__ = [\"_Extension\"]\n\n\nclass _Extension(ABC):\n def __init__(self, name: str, support_aot: bool, support_jit: bool, priority: int = 1):\n self._name = name\n self._support_aot = 
support_aot\n self._support_jit = support_jit\n self.priority = priority\n\n @property\n def name(self):\n return self._name\n\n @property\n def support_aot(self):\n return self._support_aot\n\n @property\n def support_jit(self):\n return self._support_jit\n\n @staticmethod\n def get_jit_extension_folder_path():\n \"\"\"\n Kernels which are compiled during runtime will be stored in the same cache folder for reuse.\n The folder is in the path ~/.cache/colossalai/torch_extensions/<cache-folder>.\n The name of the <cache-folder> follows a common format:\n torch<torch_version_major>.<torch_version_minor>_<device_name><device_version>-<hash>\n\n The <hash> suffix is the hash value of the path of the `colossalai` file.\n \"\"\"\n import torch\n\n import colossalai\n from colossalai.accelerator import get_accelerator\n\n # get torch version\n torch_version_major = torch.__version__.split(\".\")[0]\n torch_version_minor = torch.__version__.split(\".\")[1]\n\n # get device version\n device_name = get_accelerator().name\n device_version = get_accelerator().get_version()\n\n # use colossalai's file path as hash\n hash_suffix = hashlib.sha256(colossalai.__file__.encode()).hexdigest()\n\n # concat\n home_directory = os.path.expanduser(\"~\")\n extension_directory = f\".cache/colossalai/torch_extensions/torch{torch_version_major}.{torch_version_minor}_{device_name}-{device_version}-{hash_suffix}\"\n cache_directory = os.path.join(home_directory, extension_directory)\n return cache_directory\n\n @abstractmethod\n def is_hardware_available(self) -> bool:\n \"\"\"\n Check if the hardware required by the kernel is available.\n \"\"\"\n\n @abstractmethod\n def assert_hardware_compatible(self) -> None:\n \"\"\"\n Check if the hardware required by the kernel is compatible.\n \"\"\"\n\n @abstractmethod\n def build_aot(self) -> Union[\"CppExtension\", \"CUDAExtension\"]:\n pass\n\n @abstractmethod\n def build_jit(self) -> Callable:\n pass\n\n @abstractmethod\n def load(self) -> Callable:\n pass\n", "path": "extensions/base_extension.py"}]} | 1,015 | 201 |
gh_patches_debug_407 | rasdani/github-patches | git_diff | wemake-services__wemake-python-styleguide-200 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Feature: allow magic numbers in async functions constructors
We check that some magic numbers can be used in function constructors like so:
```python
def some_function(price, delta=0.1):
return price * delta
```
But, we only allow regular functions, not `async` ones: https://github.com/wemake-services/wemake-python-styleguide/blob/master/wemake_python_styleguide/visitors/ast/numbers.py#L19-L21
What we need to do is:
1. Add `ast.AsyncFunctionDef` to the allowed list
2. Write a unit test for it: https://github.com/wemake-services/wemake-python-styleguide/blob/master/tests/test_visitors/test_ast/test_general/test_magic_numbers.py
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wemake_python_styleguide/visitors/ast/numbers.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 import ast
4 from typing import Optional
5
6 from wemake_python_styleguide.constants import MAGIC_NUMBERS_WHITELIST
7 from wemake_python_styleguide.violations.best_practices import (
8 MagicNumberViolation,
9 )
10 from wemake_python_styleguide.visitors.base import BaseNodeVisitor
11
12
13 class MagicNumberVisitor(BaseNodeVisitor):
14 """Checks magic numbers used in the code."""
15
16 _ALLOWED_PARENTS = (
17 ast.Assign,
18
19 # Constructor usages:
20 ast.FunctionDef,
21 ast.arguments,
22
23 # Primitives:
24 ast.List,
25 ast.Dict,
26 ast.Set,
27 ast.Tuple,
28 )
29
30 # TODO: make consistent naming rules for class attributes:
31 _PROXY_PARENTS = (
32 ast.UnaryOp,
33 )
34
35 def _get_real_parent(self, node: Optional[ast.AST]) -> Optional[ast.AST]:
36 """
37 Returns real number's parent.
38
39 What can go wrong?
40
41 1. Number can be negative: ``x = -1``,
42 so ``1`` has ``UnaryOp`` as parent, but should return ``Assign``
43
44 """
45 parent = getattr(node, 'parent', None)
46 if isinstance(parent, self._PROXY_PARENTS):
47 return self._get_real_parent(parent)
48 return parent
49
50 def _check_is_magic(self, node: ast.Num) -> None:
51 parent = self._get_real_parent(node)
52 if isinstance(parent, self._ALLOWED_PARENTS):
53 return
54
55 if node.n in MAGIC_NUMBERS_WHITELIST:
56 return
57
58 if isinstance(node.n, int) and node.n <= 10:
59 return
60
61 self.add_violation(MagicNumberViolation(node, text=str(node.n)))
62
63 def visit_Num(self, node: ast.Num) -> None:
64 """
65 Checks numbers not to be magic constants inside the code.
66
67 Raises:
68 MagicNumberViolation
69
70 """
71 self._check_is_magic(node)
72 self.generic_visit(node)
73
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wemake_python_styleguide/visitors/ast/numbers.py b/wemake_python_styleguide/visitors/ast/numbers.py
--- a/wemake_python_styleguide/visitors/ast/numbers.py
+++ b/wemake_python_styleguide/visitors/ast/numbers.py
@@ -18,6 +18,7 @@
# Constructor usages:
ast.FunctionDef,
+ ast.AsyncFunctionDef,
ast.arguments,
# Primitives:
| {"golden_diff": "diff --git a/wemake_python_styleguide/visitors/ast/numbers.py b/wemake_python_styleguide/visitors/ast/numbers.py\n--- a/wemake_python_styleguide/visitors/ast/numbers.py\n+++ b/wemake_python_styleguide/visitors/ast/numbers.py\n@@ -18,6 +18,7 @@\n \n # Constructor usages:\n ast.FunctionDef,\n+ ast.AsyncFunctionDef,\n ast.arguments,\n \n # Primitives:\n", "issue": "Feature: allow magic numbers in async functions constructors\nWe check that some magic numbers can be used in function constructors like so:\r\n\r\n```python\r\ndef some_function(price, delta=0.1):\r\n return price * delta\r\n```\r\n\r\nBut, we only allow regular functions, not `async` ones: https://github.com/wemake-services/wemake-python-styleguide/blob/master/wemake_python_styleguide/visitors/ast/numbers.py#L19-L21\r\n\r\nWhat we need to do is:\r\n1. Add `ast.AsyncFunctionDef` to the allowed list\r\n2. Write a unit test for it: https://github.com/wemake-services/wemake-python-styleguide/blob/master/tests/test_visitors/test_ast/test_general/test_magic_numbers.py\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport ast\nfrom typing import Optional\n\nfrom wemake_python_styleguide.constants import MAGIC_NUMBERS_WHITELIST\nfrom wemake_python_styleguide.violations.best_practices import (\n MagicNumberViolation,\n)\nfrom wemake_python_styleguide.visitors.base import BaseNodeVisitor\n\n\nclass MagicNumberVisitor(BaseNodeVisitor):\n \"\"\"Checks magic numbers used in the code.\"\"\"\n\n _ALLOWED_PARENTS = (\n ast.Assign,\n\n # Constructor usages:\n ast.FunctionDef,\n ast.arguments,\n\n # Primitives:\n ast.List,\n ast.Dict,\n ast.Set,\n ast.Tuple,\n )\n\n # TODO: make consistent naming rules for class attributes:\n _PROXY_PARENTS = (\n ast.UnaryOp,\n )\n\n def _get_real_parent(self, node: Optional[ast.AST]) -> Optional[ast.AST]:\n \"\"\"\n Returns real number's parent.\n\n What can go wrong?\n\n 1. 
Number can be negative: ``x = -1``,\n so ``1`` has ``UnaryOp`` as parent, but should return ``Assign``\n\n \"\"\"\n parent = getattr(node, 'parent', None)\n if isinstance(parent, self._PROXY_PARENTS):\n return self._get_real_parent(parent)\n return parent\n\n def _check_is_magic(self, node: ast.Num) -> None:\n parent = self._get_real_parent(node)\n if isinstance(parent, self._ALLOWED_PARENTS):\n return\n\n if node.n in MAGIC_NUMBERS_WHITELIST:\n return\n\n if isinstance(node.n, int) and node.n <= 10:\n return\n\n self.add_violation(MagicNumberViolation(node, text=str(node.n)))\n\n def visit_Num(self, node: ast.Num) -> None:\n \"\"\"\n Checks numbers not to be magic constants inside the code.\n\n Raises:\n MagicNumberViolation\n\n \"\"\"\n self._check_is_magic(node)\n self.generic_visit(node)\n", "path": "wemake_python_styleguide/visitors/ast/numbers.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nimport ast\nfrom typing import Optional\n\nfrom wemake_python_styleguide.constants import MAGIC_NUMBERS_WHITELIST\nfrom wemake_python_styleguide.violations.best_practices import (\n MagicNumberViolation,\n)\nfrom wemake_python_styleguide.visitors.base import BaseNodeVisitor\n\n\nclass MagicNumberVisitor(BaseNodeVisitor):\n \"\"\"Checks magic numbers used in the code.\"\"\"\n\n _ALLOWED_PARENTS = (\n ast.Assign,\n\n # Constructor usages:\n ast.FunctionDef,\n ast.AsyncFunctionDef,\n ast.arguments,\n\n # Primitives:\n ast.List,\n ast.Dict,\n ast.Set,\n ast.Tuple,\n )\n\n _PROXY_PARENTS = (\n ast.UnaryOp,\n )\n\n def _get_real_parent(self, node: Optional[ast.AST]) -> Optional[ast.AST]:\n \"\"\"\n Returns real number's parent.\n\n What can go wrong?\n\n 1. Number can be negative: ``x = -1``,\n so ``1`` has ``UnaryOp`` as parent, but should return ``Assign``\n\n \"\"\"\n parent = getattr(node, 'parent', None)\n if isinstance(parent, self._PROXY_PARENTS):\n return self._get_real_parent(parent)\n return parent\n\n def _check_is_magic(self, node: ast.Num) -> None:\n parent = self._get_real_parent(node)\n if isinstance(parent, self._ALLOWED_PARENTS):\n return\n\n if node.n in MAGIC_NUMBERS_WHITELIST:\n return\n\n if isinstance(node.n, int) and node.n <= 10:\n return\n\n self.add_violation(MagicNumberViolation(node, text=str(node.n)))\n\n def visit_Num(self, node: ast.Num) -> None:\n \"\"\"\n Checks numbers not to be magic constants inside the code.\n\n Raises:\n MagicNumberViolation\n\n \"\"\"\n self._check_is_magic(node)\n self.generic_visit(node)\n", "path": "wemake_python_styleguide/visitors/ast/numbers.py"}]} | 1,004 | 109 |
gh_patches_debug_21157 | rasdani/github-patches | git_diff | ipython__ipython-11382 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
async-detection: nonlocal does not raise syntaxerror
See https://github.com/ipython/ipython/issues/11303#issuecomment-421297197
```
In [1]: x = 1
...: def f():
...: nonlocal x
...: x = 10000
```
Should raise but does not.
It's minor, but may be good to fix as behavior is likely undefined.
async-detection: nonlocal does not raise syntaxerror
See https://github.com/ipython/ipython/issues/11303#issuecomment-421297197
```
In [1]: x = 1
...: def f():
...: nonlocal x
...: x = 10000
```
Should raise but does not.
It's minor, but may be good to fix as behavior is likely undefined.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/core/async_helpers.py`
Content:
```
1 """
2 Async helper function that are invalid syntax on Python 3.5 and below.
3
4 This code is best effort, and may have edge cases not behaving as expected. In
5 particular it contain a number of heuristics to detect whether code is
6 effectively async and need to run in an event loop or not.
7
8 Some constructs (like top-level `return`, or `yield`) are taken care of
9 explicitly to actually raise a SyntaxError and stay as close as possible to
10 Python semantics.
11 """
12
13
14 import ast
15 import sys
16 from textwrap import dedent, indent
17
18
19 class _AsyncIORunner:
20
21 def __call__(self, coro):
22 """
23 Handler for asyncio autoawait
24 """
25 import asyncio
26
27 return asyncio.get_event_loop().run_until_complete(coro)
28
29 def __str__(self):
30 return 'asyncio'
31
32 _asyncio_runner = _AsyncIORunner()
33
34
35 def _curio_runner(coroutine):
36 """
37 handler for curio autoawait
38 """
39 import curio
40
41 return curio.run(coroutine)
42
43
44 def _trio_runner(async_fn):
45 import trio
46
47 async def loc(coro):
48 """
49 We need the dummy no-op async def to protect from
50 trio's internal. See https://github.com/python-trio/trio/issues/89
51 """
52 return await coro
53
54 return trio.run(loc, async_fn)
55
56
57 def _pseudo_sync_runner(coro):
58 """
59 A runner that does not really allow async execution, and just advance the coroutine.
60
61 See discussion in https://github.com/python-trio/trio/issues/608,
62
63 Credit to Nathaniel Smith
64
65 """
66 try:
67 coro.send(None)
68 except StopIteration as exc:
69 return exc.value
70 else:
71 # TODO: do not raise but return an execution result with the right info.
72 raise RuntimeError(
73 "{coro_name!r} needs a real async loop".format(coro_name=coro.__name__)
74 )
75
76
77 def _asyncify(code: str) -> str:
78 """wrap code in async def definition.
79
80 And setup a bit of context to run it later.
81 """
82 res = dedent(
83 """
84 async def __wrapper__():
85 try:
86 {usercode}
87 finally:
88 locals()
89 """
90 ).format(usercode=indent(code, " " * 8))
91 return res
92
93
94 class _AsyncSyntaxErrorVisitor(ast.NodeVisitor):
95 """
96 Find syntax errors that would be an error in an async repl, but because
97 the implementation involves wrapping the repl in an async function, it
98 is erroneously allowed (e.g. yield or return at the top level)
99 """
100
101 def generic_visit(self, node):
102 func_types = (ast.FunctionDef, ast.AsyncFunctionDef)
103 invalid_types = (ast.Return, ast.Yield, ast.YieldFrom)
104
105 if isinstance(node, func_types):
106 return # Don't recurse into functions
107 elif isinstance(node, invalid_types):
108 raise SyntaxError()
109 else:
110 super().generic_visit(node)
111
112
113 def _async_parse_cell(cell: str) -> ast.AST:
114 """
115 This is a compatibility shim for pre-3.7 when async outside of a function
116 is a syntax error at the parse stage.
117
118 It will return an abstract syntax tree parsed as if async and await outside
119 of a function were not a syntax error.
120 """
121 if sys.version_info < (3, 7):
122 # Prior to 3.7 you need to asyncify before parse
123 wrapped_parse_tree = ast.parse(_asyncify(cell))
124 return wrapped_parse_tree.body[0].body[0]
125 else:
126 return ast.parse(cell)
127
128
129 def _should_be_async(cell: str) -> bool:
130 """Detect if a block of code need to be wrapped in an `async def`
131
132 Attempt to parse the block of code, it it compile we're fine.
133 Otherwise we wrap if and try to compile.
134
135 If it works, assume it should be async. Otherwise Return False.
136
137 Not handled yet: If the block of code has a return statement as the top
138 level, it will be seen as async. This is a know limitation.
139 """
140
141 try:
142 # we can't limit ourself to ast.parse, as it __accepts__ to parse on
143 # 3.7+, but just does not _compile_
144 compile(cell, "<>", "exec")
145 return False
146 except SyntaxError:
147 try:
148 parse_tree = _async_parse_cell(cell)
149
150 # Raise a SyntaxError if there are top-level return or yields
151 v = _AsyncSyntaxErrorVisitor()
152 v.visit(parse_tree)
153
154 except SyntaxError:
155 return False
156 return True
157 return False
158
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/IPython/core/async_helpers.py b/IPython/core/async_helpers.py
--- a/IPython/core/async_helpers.py
+++ b/IPython/core/async_helpers.py
@@ -97,14 +97,22 @@
the implementation involves wrapping the repl in an async function, it
is erroneously allowed (e.g. yield or return at the top level)
"""
+ def __init__(self):
+ self.depth = 0
+ super().__init__()
def generic_visit(self, node):
func_types = (ast.FunctionDef, ast.AsyncFunctionDef)
- invalid_types = (ast.Return, ast.Yield, ast.YieldFrom)
-
- if isinstance(node, func_types):
- return # Don't recurse into functions
- elif isinstance(node, invalid_types):
+ invalid_types_by_depth = {
+ 0: (ast.Return, ast.Yield, ast.YieldFrom),
+ 1: (ast.Nonlocal,)
+ }
+
+ should_traverse = self.depth < max(invalid_types_by_depth.keys())
+ if isinstance(node, func_types) and should_traverse:
+ self.depth += 1
+ super().generic_visit(node)
+ elif isinstance(node, invalid_types_by_depth[self.depth]):
raise SyntaxError()
else:
super().generic_visit(node)
| {"golden_diff": "diff --git a/IPython/core/async_helpers.py b/IPython/core/async_helpers.py\n--- a/IPython/core/async_helpers.py\n+++ b/IPython/core/async_helpers.py\n@@ -97,14 +97,22 @@\n the implementation involves wrapping the repl in an async function, it\n is erroneously allowed (e.g. yield or return at the top level)\n \"\"\"\n+ def __init__(self):\n+ self.depth = 0\n+ super().__init__()\n \n def generic_visit(self, node):\n func_types = (ast.FunctionDef, ast.AsyncFunctionDef)\n- invalid_types = (ast.Return, ast.Yield, ast.YieldFrom)\n-\n- if isinstance(node, func_types):\n- return # Don't recurse into functions\n- elif isinstance(node, invalid_types):\n+ invalid_types_by_depth = {\n+ 0: (ast.Return, ast.Yield, ast.YieldFrom),\n+ 1: (ast.Nonlocal,)\n+ }\n+\n+ should_traverse = self.depth < max(invalid_types_by_depth.keys())\n+ if isinstance(node, func_types) and should_traverse:\n+ self.depth += 1\n+ super().generic_visit(node)\n+ elif isinstance(node, invalid_types_by_depth[self.depth]):\n raise SyntaxError()\n else:\n super().generic_visit(node)\n", "issue": "async-detection: nonlocal does not raise syntaxerror\nSee https://github.com/ipython/ipython/issues/11303#issuecomment-421297197\r\n\r\n```\r\nIn [1]: x = 1\r\n ...: def f():\r\n ...: nonlocal x\r\n ...: x = 10000\r\n```\r\nShould raise but does not.\r\n\r\nIt's minor, but may be good to fix as behavior is likely undefined.\nasync-detection: nonlocal does not raise syntaxerror\nSee https://github.com/ipython/ipython/issues/11303#issuecomment-421297197\r\n\r\n```\r\nIn [1]: x = 1\r\n ...: def f():\r\n ...: nonlocal x\r\n ...: x = 10000\r\n```\r\nShould raise but does not.\r\n\r\nIt's minor, but may be good to fix as behavior is likely undefined.\n", "before_files": [{"content": "\"\"\"\nAsync helper function that are invalid syntax on Python 3.5 and below.\n\nThis code is best effort, and may have edge cases not behaving as expected. In\nparticular it contain a number of heuristics to detect whether code is\neffectively async and need to run in an event loop or not.\n\nSome constructs (like top-level `return`, or `yield`) are taken care of\nexplicitly to actually raise a SyntaxError and stay as close as possible to\nPython semantics.\n\"\"\"\n\n\nimport ast\nimport sys\nfrom textwrap import dedent, indent\n\n\nclass _AsyncIORunner:\n\n def __call__(self, coro):\n \"\"\"\n Handler for asyncio autoawait\n \"\"\"\n import asyncio\n\n return asyncio.get_event_loop().run_until_complete(coro)\n\n def __str__(self):\n return 'asyncio'\n\n_asyncio_runner = _AsyncIORunner()\n\n\ndef _curio_runner(coroutine):\n \"\"\"\n handler for curio autoawait\n \"\"\"\n import curio\n\n return curio.run(coroutine)\n\n\ndef _trio_runner(async_fn):\n import trio\n\n async def loc(coro):\n \"\"\"\n We need the dummy no-op async def to protect from\n trio's internal. 
See https://github.com/python-trio/trio/issues/89\n \"\"\"\n return await coro\n\n return trio.run(loc, async_fn)\n\n\ndef _pseudo_sync_runner(coro):\n \"\"\"\n A runner that does not really allow async execution, and just advance the coroutine.\n\n See discussion in https://github.com/python-trio/trio/issues/608,\n\n Credit to Nathaniel Smith\n\n \"\"\"\n try:\n coro.send(None)\n except StopIteration as exc:\n return exc.value\n else:\n # TODO: do not raise but return an execution result with the right info.\n raise RuntimeError(\n \"{coro_name!r} needs a real async loop\".format(coro_name=coro.__name__)\n )\n\n\ndef _asyncify(code: str) -> str:\n \"\"\"wrap code in async def definition.\n\n And setup a bit of context to run it later.\n \"\"\"\n res = dedent(\n \"\"\"\n async def __wrapper__():\n try:\n {usercode}\n finally:\n locals()\n \"\"\"\n ).format(usercode=indent(code, \" \" * 8))\n return res\n\n\nclass _AsyncSyntaxErrorVisitor(ast.NodeVisitor):\n \"\"\"\n Find syntax errors that would be an error in an async repl, but because\n the implementation involves wrapping the repl in an async function, it\n is erroneously allowed (e.g. yield or return at the top level)\n \"\"\"\n\n def generic_visit(self, node):\n func_types = (ast.FunctionDef, ast.AsyncFunctionDef)\n invalid_types = (ast.Return, ast.Yield, ast.YieldFrom)\n\n if isinstance(node, func_types):\n return # Don't recurse into functions\n elif isinstance(node, invalid_types):\n raise SyntaxError()\n else:\n super().generic_visit(node)\n\n\ndef _async_parse_cell(cell: str) -> ast.AST:\n \"\"\"\n This is a compatibility shim for pre-3.7 when async outside of a function\n is a syntax error at the parse stage.\n\n It will return an abstract syntax tree parsed as if async and await outside\n of a function were not a syntax error.\n \"\"\"\n if sys.version_info < (3, 7):\n # Prior to 3.7 you need to asyncify before parse\n wrapped_parse_tree = ast.parse(_asyncify(cell))\n return wrapped_parse_tree.body[0].body[0]\n else:\n return ast.parse(cell)\n\n\ndef _should_be_async(cell: str) -> bool:\n \"\"\"Detect if a block of code need to be wrapped in an `async def`\n\n Attempt to parse the block of code, it it compile we're fine.\n Otherwise we wrap if and try to compile.\n\n If it works, assume it should be async. Otherwise Return False.\n\n Not handled yet: If the block of code has a return statement as the top\n level, it will be seen as async. This is a know limitation.\n \"\"\"\n\n try:\n # we can't limit ourself to ast.parse, as it __accepts__ to parse on\n # 3.7+, but just does not _compile_\n compile(cell, \"<>\", \"exec\")\n return False\n except SyntaxError:\n try:\n parse_tree = _async_parse_cell(cell)\n\n # Raise a SyntaxError if there are top-level return or yields\n v = _AsyncSyntaxErrorVisitor()\n v.visit(parse_tree)\n\n except SyntaxError:\n return False\n return True\n return False\n", "path": "IPython/core/async_helpers.py"}], "after_files": [{"content": "\"\"\"\nAsync helper function that are invalid syntax on Python 3.5 and below.\n\nThis code is best effort, and may have edge cases not behaving as expected. 
In\nparticular it contain a number of heuristics to detect whether code is\neffectively async and need to run in an event loop or not.\n\nSome constructs (like top-level `return`, or `yield`) are taken care of\nexplicitly to actually raise a SyntaxError and stay as close as possible to\nPython semantics.\n\"\"\"\n\n\nimport ast\nimport sys\nfrom textwrap import dedent, indent\n\n\nclass _AsyncIORunner:\n\n def __call__(self, coro):\n \"\"\"\n Handler for asyncio autoawait\n \"\"\"\n import asyncio\n\n return asyncio.get_event_loop().run_until_complete(coro)\n\n def __str__(self):\n return 'asyncio'\n\n_asyncio_runner = _AsyncIORunner()\n\n\ndef _curio_runner(coroutine):\n \"\"\"\n handler for curio autoawait\n \"\"\"\n import curio\n\n return curio.run(coroutine)\n\n\ndef _trio_runner(async_fn):\n import trio\n\n async def loc(coro):\n \"\"\"\n We need the dummy no-op async def to protect from\n trio's internal. See https://github.com/python-trio/trio/issues/89\n \"\"\"\n return await coro\n\n return trio.run(loc, async_fn)\n\n\ndef _pseudo_sync_runner(coro):\n \"\"\"\n A runner that does not really allow async execution, and just advance the coroutine.\n\n See discussion in https://github.com/python-trio/trio/issues/608,\n\n Credit to Nathaniel Smith\n\n \"\"\"\n try:\n coro.send(None)\n except StopIteration as exc:\n return exc.value\n else:\n # TODO: do not raise but return an execution result with the right info.\n raise RuntimeError(\n \"{coro_name!r} needs a real async loop\".format(coro_name=coro.__name__)\n )\n\n\ndef _asyncify(code: str) -> str:\n \"\"\"wrap code in async def definition.\n\n And setup a bit of context to run it later.\n \"\"\"\n res = dedent(\n \"\"\"\n async def __wrapper__():\n try:\n {usercode}\n finally:\n locals()\n \"\"\"\n ).format(usercode=indent(code, \" \" * 8))\n return res\n\n\nclass _AsyncSyntaxErrorVisitor(ast.NodeVisitor):\n \"\"\"\n Find syntax errors that would be an error in an async repl, but because\n the implementation involves wrapping the repl in an async function, it\n is erroneously allowed (e.g. yield or return at the top level)\n \"\"\"\n def __init__(self):\n self.depth = 0\n super().__init__()\n\n def generic_visit(self, node):\n func_types = (ast.FunctionDef, ast.AsyncFunctionDef)\n invalid_types_by_depth = {\n 0: (ast.Return, ast.Yield, ast.YieldFrom),\n 1: (ast.Nonlocal,)\n }\n\n should_traverse = self.depth < max(invalid_types_by_depth.keys())\n if isinstance(node, func_types) and should_traverse:\n self.depth += 1\n super().generic_visit(node)\n elif isinstance(node, invalid_types_by_depth[self.depth]):\n raise SyntaxError()\n else:\n super().generic_visit(node)\n\n\ndef _async_parse_cell(cell: str) -> ast.AST:\n \"\"\"\n This is a compatibility shim for pre-3.7 when async outside of a function\n is a syntax error at the parse stage.\n\n It will return an abstract syntax tree parsed as if async and await outside\n of a function were not a syntax error.\n \"\"\"\n if sys.version_info < (3, 7):\n # Prior to 3.7 you need to asyncify before parse\n wrapped_parse_tree = ast.parse(_asyncify(cell))\n return wrapped_parse_tree.body[0].body[0]\n else:\n return ast.parse(cell)\n\n\ndef _should_be_async(cell: str) -> bool:\n \"\"\"Detect if a block of code need to be wrapped in an `async def`\n\n Attempt to parse the block of code, it it compile we're fine.\n Otherwise we wrap if and try to compile.\n\n If it works, assume it should be async. 
Otherwise Return False.\n\n Not handled yet: If the block of code has a return statement as the top\n level, it will be seen as async. This is a know limitation.\n \"\"\"\n\n try:\n # we can't limit ourself to ast.parse, as it __accepts__ to parse on\n # 3.7+, but just does not _compile_\n compile(cell, \"<>\", \"exec\")\n return False\n except SyntaxError:\n try:\n parse_tree = _async_parse_cell(cell)\n\n # Raise a SyntaxError if there are top-level return or yields\n v = _AsyncSyntaxErrorVisitor()\n v.visit(parse_tree)\n\n except SyntaxError:\n return False\n return True\n return False\n", "path": "IPython/core/async_helpers.py"}]} | 1,878 | 295 |
gh_patches_debug_32876 | rasdani/github-patches | git_diff | getsentry__sentry-52083 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SDK Crash Detection: Set in app
Set all in app for all frames to `false` except for the SDK frames after stripping the event data.
https://github.com/getsentry/sentry/blob/95086b406dec79e6bcef45f299a3e92f727da2c0/src/sentry/utils/sdk_crashes/sdk_crash_detection.py#L58
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/sentry/utils/sdk_crashes/event_stripper.py`
Content:
```
1 from enum import Enum, auto
2 from typing import Any, Dict, Mapping, Optional, Sequence
3
4 from sentry.db.models import NodeData
5 from sentry.utils.safe import get_path
6 from sentry.utils.sdk_crashes.sdk_crash_detector import SDKCrashDetector
7
8
9 class Allow(Enum):
10 def __init__(self, explanation: str = "") -> None:
11 self.explanation = explanation
12
13 """Keeps the event data if it is of type str, int, float, bool."""
14 SIMPLE_TYPE = auto()
15
16 """
17 Doesn't keep the event data no matter the type. This can be used to explicitly
18 specify that data should be removed with an explanation.
19 """
20 NEVER = auto()
21
22 def with_explanation(self, explanation: str) -> "Allow":
23 self.explanation = explanation
24 return self
25
26
27 EVENT_DATA_ALLOWLIST = {
28 "type": Allow.SIMPLE_TYPE,
29 "datetime": Allow.SIMPLE_TYPE,
30 "timestamp": Allow.SIMPLE_TYPE,
31 "platform": Allow.SIMPLE_TYPE,
32 "sdk": {
33 "name": Allow.SIMPLE_TYPE,
34 "version": Allow.SIMPLE_TYPE,
35 "integrations": Allow.NEVER.with_explanation("Users can add their own integrations."),
36 },
37 "exception": {
38 "values": {
39 "stacktrace": {
40 "frames": {
41 "filename": Allow.SIMPLE_TYPE,
42 "function": Allow.SIMPLE_TYPE,
43 "raw_function": Allow.SIMPLE_TYPE,
44 "module": Allow.SIMPLE_TYPE,
45 "abs_path": Allow.SIMPLE_TYPE,
46 "in_app": Allow.SIMPLE_TYPE,
47 "instruction_addr": Allow.SIMPLE_TYPE,
48 "addr_mode": Allow.SIMPLE_TYPE,
49 "symbol": Allow.SIMPLE_TYPE,
50 "symbol_addr": Allow.SIMPLE_TYPE,
51 "image_addr": Allow.SIMPLE_TYPE,
52 "package": Allow.SIMPLE_TYPE,
53 "platform": Allow.SIMPLE_TYPE,
54 }
55 },
56 "value": Allow.NEVER.with_explanation("The exception value could contain PII."),
57 "type": Allow.SIMPLE_TYPE,
58 "mechanism": {
59 "handled": Allow.SIMPLE_TYPE,
60 "type": Allow.SIMPLE_TYPE,
61 "meta": {
62 "signal": {
63 "number": Allow.SIMPLE_TYPE,
64 "code": Allow.SIMPLE_TYPE,
65 "name": Allow.SIMPLE_TYPE,
66 "code_name": Allow.SIMPLE_TYPE,
67 },
68 "mach_exception": {
69 "exception": Allow.SIMPLE_TYPE,
70 "code": Allow.SIMPLE_TYPE,
71 "subcode": Allow.SIMPLE_TYPE,
72 "name": Allow.SIMPLE_TYPE,
73 },
74 },
75 },
76 }
77 },
78 "contexts": {
79 "device": {
80 "family": Allow.SIMPLE_TYPE,
81 "model": Allow.SIMPLE_TYPE,
82 "arch": Allow.SIMPLE_TYPE,
83 },
84 "os": {
85 "name": Allow.SIMPLE_TYPE,
86 "version": Allow.SIMPLE_TYPE,
87 "build": Allow.SIMPLE_TYPE,
88 },
89 },
90 }
91
92
93 def strip_event_data(
94 event_data: NodeData, sdk_crash_detector: SDKCrashDetector
95 ) -> Mapping[str, Any]:
96 new_event_data = _strip_event_data_with_allowlist(event_data, EVENT_DATA_ALLOWLIST)
97
98 if (new_event_data is None) or (new_event_data == {}):
99 return {}
100
101 stripped_frames: Sequence[Mapping[str, Any]] = []
102 frames = get_path(new_event_data, "exception", "values", -1, "stacktrace", "frames")
103
104 if frames is not None:
105 stripped_frames = _strip_frames(frames, sdk_crash_detector)
106 new_event_data["exception"]["values"][0]["stacktrace"]["frames"] = stripped_frames
107
108 return new_event_data
109
110
111 def _strip_event_data_with_allowlist(
112 data: Mapping[str, Any], allowlist: Optional[Mapping[str, Any]]
113 ) -> Optional[Mapping[str, Any]]:
114 """
115 Recursively traverses the data and only keeps values based on the allowlist.
116 """
117 if allowlist is None:
118 return None
119
120 stripped_data: Dict[str, Any] = {}
121 for data_key, data_value in data.items():
122 allowlist_for_data = allowlist.get(data_key)
123 if allowlist_for_data is None:
124 continue
125
126 if isinstance(allowlist_for_data, Allow):
127 allowed = allowlist_for_data
128
129 if allowed is Allow.SIMPLE_TYPE and isinstance(data_value, (str, int, float, bool)):
130 stripped_data[data_key] = data_value
131 else:
132 continue
133
134 elif isinstance(data_value, Mapping):
135 stripped_data[data_key] = _strip_event_data_with_allowlist(
136 data_value, allowlist_for_data
137 )
138 elif isinstance(data_value, Sequence):
139 stripped_data[data_key] = [
140 _strip_event_data_with_allowlist(item, allowlist_for_data) for item in data_value
141 ]
142
143 return stripped_data
144
145
146 def _strip_frames(
147 frames: Sequence[Mapping[str, Any]], sdk_crash_detector: SDKCrashDetector
148 ) -> Sequence[Mapping[str, Any]]:
149 """
150 Only keep SDK frames or Apple system libraries.
151 We need to adapt this logic once we support other platforms.
152 """
153
154 def is_system_library(frame: Mapping[str, Any]) -> bool:
155 fields_containing_paths = {"package", "module", "abs_path"}
156 system_library_paths = {"/System/Library/", "/usr/lib/system/"}
157
158 for field in fields_containing_paths:
159 for path in system_library_paths:
160 if frame.get(field, "").startswith(path):
161 return True
162
163 return False
164
165 return [
166 frame
167 for frame in frames
168 if sdk_crash_detector.is_sdk_frame(frame) or is_system_library(frame)
169 ]
170
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/sentry/utils/sdk_crashes/event_stripper.py b/src/sentry/utils/sdk_crashes/event_stripper.py
--- a/src/sentry/utils/sdk_crashes/event_stripper.py
+++ b/src/sentry/utils/sdk_crashes/event_stripper.py
@@ -1,5 +1,5 @@
from enum import Enum, auto
-from typing import Any, Dict, Mapping, Optional, Sequence
+from typing import Any, Dict, Mapping, MutableMapping, Optional, Sequence
from sentry.db.models import NodeData
from sentry.utils.safe import get_path
@@ -98,11 +98,11 @@
if (new_event_data is None) or (new_event_data == {}):
return {}
- stripped_frames: Sequence[Mapping[str, Any]] = []
frames = get_path(new_event_data, "exception", "values", -1, "stacktrace", "frames")
if frames is not None:
stripped_frames = _strip_frames(frames, sdk_crash_detector)
+
new_event_data["exception"]["values"][0]["stacktrace"]["frames"] = stripped_frames
return new_event_data
@@ -144,7 +144,7 @@
def _strip_frames(
- frames: Sequence[Mapping[str, Any]], sdk_crash_detector: SDKCrashDetector
+ frames: Sequence[MutableMapping[str, Any]], sdk_crash_detector: SDKCrashDetector
) -> Sequence[Mapping[str, Any]]:
"""
Only keep SDK frames or Apple system libraries.
@@ -162,8 +162,15 @@
return False
+ def strip_frame(frame: MutableMapping[str, Any]) -> MutableMapping[str, Any]:
+ if sdk_crash_detector.is_sdk_frame(frame):
+ frame["in_app"] = True
+ else:
+ frame["in_app"] = False
+ return frame
+
return [
- frame
+ strip_frame(frame)
for frame in frames
if sdk_crash_detector.is_sdk_frame(frame) or is_system_library(frame)
]
| {"golden_diff": "diff --git a/src/sentry/utils/sdk_crashes/event_stripper.py b/src/sentry/utils/sdk_crashes/event_stripper.py\n--- a/src/sentry/utils/sdk_crashes/event_stripper.py\n+++ b/src/sentry/utils/sdk_crashes/event_stripper.py\n@@ -1,5 +1,5 @@\n from enum import Enum, auto\n-from typing import Any, Dict, Mapping, Optional, Sequence\n+from typing import Any, Dict, Mapping, MutableMapping, Optional, Sequence\n \n from sentry.db.models import NodeData\n from sentry.utils.safe import get_path\n@@ -98,11 +98,11 @@\n if (new_event_data is None) or (new_event_data == {}):\n return {}\n \n- stripped_frames: Sequence[Mapping[str, Any]] = []\n frames = get_path(new_event_data, \"exception\", \"values\", -1, \"stacktrace\", \"frames\")\n \n if frames is not None:\n stripped_frames = _strip_frames(frames, sdk_crash_detector)\n+\n new_event_data[\"exception\"][\"values\"][0][\"stacktrace\"][\"frames\"] = stripped_frames\n \n return new_event_data\n@@ -144,7 +144,7 @@\n \n \n def _strip_frames(\n- frames: Sequence[Mapping[str, Any]], sdk_crash_detector: SDKCrashDetector\n+ frames: Sequence[MutableMapping[str, Any]], sdk_crash_detector: SDKCrashDetector\n ) -> Sequence[Mapping[str, Any]]:\n \"\"\"\n Only keep SDK frames or Apple system libraries.\n@@ -162,8 +162,15 @@\n \n return False\n \n+ def strip_frame(frame: MutableMapping[str, Any]) -> MutableMapping[str, Any]:\n+ if sdk_crash_detector.is_sdk_frame(frame):\n+ frame[\"in_app\"] = True\n+ else:\n+ frame[\"in_app\"] = False\n+ return frame\n+\n return [\n- frame\n+ strip_frame(frame)\n for frame in frames\n if sdk_crash_detector.is_sdk_frame(frame) or is_system_library(frame)\n ]\n", "issue": "SDK Crash Detection: Set in app\nSet all in app for all frames to `false` except for the SDK frames after stripping the event data. \r\n\r\nhttps://github.com/getsentry/sentry/blob/95086b406dec79e6bcef45f299a3e92f727da2c0/src/sentry/utils/sdk_crashes/sdk_crash_detection.py#L58\n", "before_files": [{"content": "from enum import Enum, auto\nfrom typing import Any, Dict, Mapping, Optional, Sequence\n\nfrom sentry.db.models import NodeData\nfrom sentry.utils.safe import get_path\nfrom sentry.utils.sdk_crashes.sdk_crash_detector import SDKCrashDetector\n\n\nclass Allow(Enum):\n def __init__(self, explanation: str = \"\") -> None:\n self.explanation = explanation\n\n \"\"\"Keeps the event data if it is of type str, int, float, bool.\"\"\"\n SIMPLE_TYPE = auto()\n\n \"\"\"\n Doesn't keep the event data no matter the type. 
This can be used to explicitly\n specify that data should be removed with an explanation.\n \"\"\"\n NEVER = auto()\n\n def with_explanation(self, explanation: str) -> \"Allow\":\n self.explanation = explanation\n return self\n\n\nEVENT_DATA_ALLOWLIST = {\n \"type\": Allow.SIMPLE_TYPE,\n \"datetime\": Allow.SIMPLE_TYPE,\n \"timestamp\": Allow.SIMPLE_TYPE,\n \"platform\": Allow.SIMPLE_TYPE,\n \"sdk\": {\n \"name\": Allow.SIMPLE_TYPE,\n \"version\": Allow.SIMPLE_TYPE,\n \"integrations\": Allow.NEVER.with_explanation(\"Users can add their own integrations.\"),\n },\n \"exception\": {\n \"values\": {\n \"stacktrace\": {\n \"frames\": {\n \"filename\": Allow.SIMPLE_TYPE,\n \"function\": Allow.SIMPLE_TYPE,\n \"raw_function\": Allow.SIMPLE_TYPE,\n \"module\": Allow.SIMPLE_TYPE,\n \"abs_path\": Allow.SIMPLE_TYPE,\n \"in_app\": Allow.SIMPLE_TYPE,\n \"instruction_addr\": Allow.SIMPLE_TYPE,\n \"addr_mode\": Allow.SIMPLE_TYPE,\n \"symbol\": Allow.SIMPLE_TYPE,\n \"symbol_addr\": Allow.SIMPLE_TYPE,\n \"image_addr\": Allow.SIMPLE_TYPE,\n \"package\": Allow.SIMPLE_TYPE,\n \"platform\": Allow.SIMPLE_TYPE,\n }\n },\n \"value\": Allow.NEVER.with_explanation(\"The exception value could contain PII.\"),\n \"type\": Allow.SIMPLE_TYPE,\n \"mechanism\": {\n \"handled\": Allow.SIMPLE_TYPE,\n \"type\": Allow.SIMPLE_TYPE,\n \"meta\": {\n \"signal\": {\n \"number\": Allow.SIMPLE_TYPE,\n \"code\": Allow.SIMPLE_TYPE,\n \"name\": Allow.SIMPLE_TYPE,\n \"code_name\": Allow.SIMPLE_TYPE,\n },\n \"mach_exception\": {\n \"exception\": Allow.SIMPLE_TYPE,\n \"code\": Allow.SIMPLE_TYPE,\n \"subcode\": Allow.SIMPLE_TYPE,\n \"name\": Allow.SIMPLE_TYPE,\n },\n },\n },\n }\n },\n \"contexts\": {\n \"device\": {\n \"family\": Allow.SIMPLE_TYPE,\n \"model\": Allow.SIMPLE_TYPE,\n \"arch\": Allow.SIMPLE_TYPE,\n },\n \"os\": {\n \"name\": Allow.SIMPLE_TYPE,\n \"version\": Allow.SIMPLE_TYPE,\n \"build\": Allow.SIMPLE_TYPE,\n },\n },\n}\n\n\ndef strip_event_data(\n event_data: NodeData, sdk_crash_detector: SDKCrashDetector\n) -> Mapping[str, Any]:\n new_event_data = _strip_event_data_with_allowlist(event_data, EVENT_DATA_ALLOWLIST)\n\n if (new_event_data is None) or (new_event_data == {}):\n return {}\n\n stripped_frames: Sequence[Mapping[str, Any]] = []\n frames = get_path(new_event_data, \"exception\", \"values\", -1, \"stacktrace\", \"frames\")\n\n if frames is not None:\n stripped_frames = _strip_frames(frames, sdk_crash_detector)\n new_event_data[\"exception\"][\"values\"][0][\"stacktrace\"][\"frames\"] = stripped_frames\n\n return new_event_data\n\n\ndef _strip_event_data_with_allowlist(\n data: Mapping[str, Any], allowlist: Optional[Mapping[str, Any]]\n) -> Optional[Mapping[str, Any]]:\n \"\"\"\n Recursively traverses the data and only keeps values based on the allowlist.\n \"\"\"\n if allowlist is None:\n return None\n\n stripped_data: Dict[str, Any] = {}\n for data_key, data_value in data.items():\n allowlist_for_data = allowlist.get(data_key)\n if allowlist_for_data is None:\n continue\n\n if isinstance(allowlist_for_data, Allow):\n allowed = allowlist_for_data\n\n if allowed is Allow.SIMPLE_TYPE and isinstance(data_value, (str, int, float, bool)):\n stripped_data[data_key] = data_value\n else:\n continue\n\n elif isinstance(data_value, Mapping):\n stripped_data[data_key] = _strip_event_data_with_allowlist(\n data_value, allowlist_for_data\n )\n elif isinstance(data_value, Sequence):\n stripped_data[data_key] = [\n _strip_event_data_with_allowlist(item, allowlist_for_data) for item in data_value\n ]\n\n return 
stripped_data\n\n\ndef _strip_frames(\n frames: Sequence[Mapping[str, Any]], sdk_crash_detector: SDKCrashDetector\n) -> Sequence[Mapping[str, Any]]:\n \"\"\"\n Only keep SDK frames or Apple system libraries.\n We need to adapt this logic once we support other platforms.\n \"\"\"\n\n def is_system_library(frame: Mapping[str, Any]) -> bool:\n fields_containing_paths = {\"package\", \"module\", \"abs_path\"}\n system_library_paths = {\"/System/Library/\", \"/usr/lib/system/\"}\n\n for field in fields_containing_paths:\n for path in system_library_paths:\n if frame.get(field, \"\").startswith(path):\n return True\n\n return False\n\n return [\n frame\n for frame in frames\n if sdk_crash_detector.is_sdk_frame(frame) or is_system_library(frame)\n ]\n", "path": "src/sentry/utils/sdk_crashes/event_stripper.py"}], "after_files": [{"content": "from enum import Enum, auto\nfrom typing import Any, Dict, Mapping, MutableMapping, Optional, Sequence\n\nfrom sentry.db.models import NodeData\nfrom sentry.utils.safe import get_path\nfrom sentry.utils.sdk_crashes.sdk_crash_detector import SDKCrashDetector\n\n\nclass Allow(Enum):\n def __init__(self, explanation: str = \"\") -> None:\n self.explanation = explanation\n\n \"\"\"Keeps the event data if it is of type str, int, float, bool.\"\"\"\n SIMPLE_TYPE = auto()\n\n \"\"\"\n Doesn't keep the event data no matter the type. This can be used to explicitly\n specify that data should be removed with an explanation.\n \"\"\"\n NEVER = auto()\n\n def with_explanation(self, explanation: str) -> \"Allow\":\n self.explanation = explanation\n return self\n\n\nEVENT_DATA_ALLOWLIST = {\n \"type\": Allow.SIMPLE_TYPE,\n \"datetime\": Allow.SIMPLE_TYPE,\n \"timestamp\": Allow.SIMPLE_TYPE,\n \"platform\": Allow.SIMPLE_TYPE,\n \"sdk\": {\n \"name\": Allow.SIMPLE_TYPE,\n \"version\": Allow.SIMPLE_TYPE,\n \"integrations\": Allow.NEVER.with_explanation(\"Users can add their own integrations.\"),\n },\n \"exception\": {\n \"values\": {\n \"stacktrace\": {\n \"frames\": {\n \"filename\": Allow.SIMPLE_TYPE,\n \"function\": Allow.SIMPLE_TYPE,\n \"raw_function\": Allow.SIMPLE_TYPE,\n \"module\": Allow.SIMPLE_TYPE,\n \"abs_path\": Allow.SIMPLE_TYPE,\n \"in_app\": Allow.SIMPLE_TYPE,\n \"instruction_addr\": Allow.SIMPLE_TYPE,\n \"addr_mode\": Allow.SIMPLE_TYPE,\n \"symbol\": Allow.SIMPLE_TYPE,\n \"symbol_addr\": Allow.SIMPLE_TYPE,\n \"image_addr\": Allow.SIMPLE_TYPE,\n \"package\": Allow.SIMPLE_TYPE,\n \"platform\": Allow.SIMPLE_TYPE,\n }\n },\n \"value\": Allow.NEVER.with_explanation(\"The exception value could contain PII.\"),\n \"type\": Allow.SIMPLE_TYPE,\n \"mechanism\": {\n \"handled\": Allow.SIMPLE_TYPE,\n \"type\": Allow.SIMPLE_TYPE,\n \"meta\": {\n \"signal\": {\n \"number\": Allow.SIMPLE_TYPE,\n \"code\": Allow.SIMPLE_TYPE,\n \"name\": Allow.SIMPLE_TYPE,\n \"code_name\": Allow.SIMPLE_TYPE,\n },\n \"mach_exception\": {\n \"exception\": Allow.SIMPLE_TYPE,\n \"code\": Allow.SIMPLE_TYPE,\n \"subcode\": Allow.SIMPLE_TYPE,\n \"name\": Allow.SIMPLE_TYPE,\n },\n },\n },\n }\n },\n \"contexts\": {\n \"device\": {\n \"family\": Allow.SIMPLE_TYPE,\n \"model\": Allow.SIMPLE_TYPE,\n \"arch\": Allow.SIMPLE_TYPE,\n },\n \"os\": {\n \"name\": Allow.SIMPLE_TYPE,\n \"version\": Allow.SIMPLE_TYPE,\n \"build\": Allow.SIMPLE_TYPE,\n },\n },\n}\n\n\ndef strip_event_data(\n event_data: NodeData, sdk_crash_detector: SDKCrashDetector\n) -> Mapping[str, Any]:\n new_event_data = _strip_event_data_with_allowlist(event_data, EVENT_DATA_ALLOWLIST)\n\n if (new_event_data is None) or (new_event_data == 
{}):\n return {}\n\n frames = get_path(new_event_data, \"exception\", \"values\", -1, \"stacktrace\", \"frames\")\n\n if frames is not None:\n stripped_frames = _strip_frames(frames, sdk_crash_detector)\n\n new_event_data[\"exception\"][\"values\"][0][\"stacktrace\"][\"frames\"] = stripped_frames\n\n return new_event_data\n\n\ndef _strip_event_data_with_allowlist(\n data: Mapping[str, Any], allowlist: Optional[Mapping[str, Any]]\n) -> Optional[Mapping[str, Any]]:\n \"\"\"\n Recursively traverses the data and only keeps values based on the allowlist.\n \"\"\"\n if allowlist is None:\n return None\n\n stripped_data: Dict[str, Any] = {}\n for data_key, data_value in data.items():\n allowlist_for_data = allowlist.get(data_key)\n if allowlist_for_data is None:\n continue\n\n if isinstance(allowlist_for_data, Allow):\n allowed = allowlist_for_data\n\n if allowed is Allow.SIMPLE_TYPE and isinstance(data_value, (str, int, float, bool)):\n stripped_data[data_key] = data_value\n else:\n continue\n\n elif isinstance(data_value, Mapping):\n stripped_data[data_key] = _strip_event_data_with_allowlist(\n data_value, allowlist_for_data\n )\n elif isinstance(data_value, Sequence):\n stripped_data[data_key] = [\n _strip_event_data_with_allowlist(item, allowlist_for_data) for item in data_value\n ]\n\n return stripped_data\n\n\ndef _strip_frames(\n frames: Sequence[MutableMapping[str, Any]], sdk_crash_detector: SDKCrashDetector\n) -> Sequence[Mapping[str, Any]]:\n \"\"\"\n Only keep SDK frames or Apple system libraries.\n We need to adapt this logic once we support other platforms.\n \"\"\"\n\n def is_system_library(frame: Mapping[str, Any]) -> bool:\n fields_containing_paths = {\"package\", \"module\", \"abs_path\"}\n system_library_paths = {\"/System/Library/\", \"/usr/lib/system/\"}\n\n for field in fields_containing_paths:\n for path in system_library_paths:\n if frame.get(field, \"\").startswith(path):\n return True\n\n return False\n\n def strip_frame(frame: MutableMapping[str, Any]) -> MutableMapping[str, Any]:\n if sdk_crash_detector.is_sdk_frame(frame):\n frame[\"in_app\"] = True\n else:\n frame[\"in_app\"] = False\n return frame\n\n return [\n strip_frame(frame)\n for frame in frames\n if sdk_crash_detector.is_sdk_frame(frame) or is_system_library(frame)\n ]\n", "path": "src/sentry/utils/sdk_crashes/event_stripper.py"}]} | 2,041 | 448 |
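A note on the stripping pattern in the record above: the fix keeps only allowlisted keys, recursing into nested mappings, and marks SDK frames `in_app`. The snippet below is a minimal, self-contained sketch of just the allowlist recursion; the event, allowlist, and `Allow` enum here are invented stand-ins, not Sentry's real schema or internals.

```python
from collections.abc import Mapping
from enum import Enum, auto
from typing import Optional


class Allow(Enum):
    SIMPLE_TYPE = auto()  # keep str/int/float/bool leaves as-is


def strip(data: Mapping, allowlist: Optional[Mapping]) -> dict:
    out: dict = {}
    if allowlist is None:
        return out
    for key, value in data.items():
        rule = allowlist.get(key)
        if rule is None:
            continue  # anything not explicitly allowlisted is dropped
        if isinstance(rule, Allow):
            if rule is Allow.SIMPLE_TYPE and isinstance(value, (str, int, float, bool)):
                out[key] = value
        elif isinstance(value, Mapping):
            out[key] = strip(value, rule)  # recurse with the nested allowlist
    return out


event = {"platform": "cocoa", "user": {"email": "a@b.c"}, "sdk": {"name": "sentry.cocoa"}}
allowlist = {"platform": Allow.SIMPLE_TYPE, "sdk": {"name": Allow.SIMPLE_TYPE}}
print(strip(event, allowlist))  # {'platform': 'cocoa', 'sdk': {'name': 'sentry.cocoa'}}
```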
gh_patches_debug_16733 | rasdani/github-patches | git_diff | mampfes__hacs_waste_collection_schedule-1729 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug]: adur_worthing_gov_uk has changed format of address
### I Have A Problem With:
A specific source
### What's Your Problem
The Adur and Worthing council used to return my address as “12 Roadname”, so that’s what I had in my source args. But the format has recently changed to “12 ROADNAME”, causing the lookup in adur_worthing_gov_uk.py to fail.
As the council is just as likely to change it back at some point, can I suggest that the lookup be made case-independent?
### Source (if relevant)
adur_worthing_gov_uk
### Logs
_No response_
### Relevant Configuration
_No response_
### Checklist Source Error
- [X] Use the example parameters for your source (often available in the documentation) (don't forget to restart Home Assistant after changing the configuration)
- [X] Checked that the website of your service provider is still working
- [X] Tested my attributes on the service provider website (if possible)
- [X] I have tested with the latest version of the integration (master) (for HACS in the 3 dot menu of the integration click on "Redownload" and choose master as version)
### Checklist Sensor Error
- [X] Checked in the Home Assistant Calendar tab if the event names match the types names (if types argument is used)
### Required
- [X] I have searched past (closed AND opened) issues to see if this bug has already been reported, and it hasn't been.
- [X] I understand that people give their precious time for free, and thus I've done my very best to make this problem as easy as possible to investigate.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py`
Content:
```
1 from datetime import datetime
2
3 import bs4
4 import requests
5 from waste_collection_schedule import Collection # type: ignore[attr-defined]
6
7 TITLE = "Adur & Worthing Councils"
8 DESCRIPTION = "Source for adur-worthing.gov.uk services for Adur & Worthing, UK."
9 URL = "https://adur-worthing.gov.uk"
10 TEST_CASES = {
11 "Test_001": {"postcode": "BN15 9UX", "address": "1 Western Road North"},
12 "Test_002": {"postcode": "BN43 5WE", "address": "6 Hebe Road"},
13 }
14 HEADERS = {
15 "user-agent": "Mozilla/5.0",
16 }
17 ICON_MAP = {
18 "Recycling": "mdi:recycle",
19 "Refuse": "mdi:trash-can",
20 "Garden": "mdi:leaf",
21 }
22
23
24 class Source:
25 def __init__(self, postcode, address):
26 self._postcode = postcode
27 self._address = address
28
29 def fetch(self):
30
31 if self._postcode is None or self._address is None:
32 raise ValueError("Either postcode or address is None")
33
34 s = requests.Session()
35
36 postcode_search_request = s.get(
37 f"https://www.adur-worthing.gov.uk/bin-day/?brlu-address-postcode={self._postcode}&return-url=/bin-day/&action=search",
38 headers=HEADERS,
39 )
40 html_addresses = postcode_search_request.content
41 addresses = bs4.BeautifulSoup(html_addresses, "html.parser")
42 addresses_select = addresses.find("select", {"id": "brlu-selected-address"})
43
44 found_address = None
45 for address in addresses_select.find_all("option"):
46 if self._address in address.get_text():
47 found_address = address
48
49 if found_address is None:
50 raise ValueError("Address not found")
51
52 collections_request = s.get(
53 f"https://www.adur-worthing.gov.uk/bin-day/?brlu-selected-address={address['value']}&return-url=/bin-day/",
54 headers=HEADERS,
55 )
56 html_collections = collections_request.content
57 bin_collections = bs4.BeautifulSoup(html_collections, "html.parser")
58
59 bin_days_table = bin_collections.find("table", class_="bin-days")
60 bin_days_table_body = bin_days_table.find("tbody")
61 bin_days_by_type = bin_days_table_body.find_all("tr")
62
63 entries = []
64
65 for bin_by_type in bin_days_by_type:
66 bin_type = bin_by_type.find("th").text
67 icon = ICON_MAP.get(bin_type)
68 bin_days = bin_by_type.find_all("td")[-1].get_text(separator="\n")
69 for bin_day in bin_days.split("\n"):
70 bin_datetime = datetime.strptime(bin_day, "%A %d %b %Y").date()
71 entries.append(Collection(t=bin_type, date=bin_datetime, icon=icon))
72
73 return entries
74
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py b/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py
--- a/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py
+++ b/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py
@@ -27,7 +27,6 @@
self._address = address
def fetch(self):
-
if self._postcode is None or self._address is None:
raise ValueError("Either postcode or address is None")
@@ -43,7 +42,7 @@
found_address = None
for address in addresses_select.find_all("option"):
- if self._address in address.get_text():
+ if self._address.upper() in address.get_text().upper():
found_address = address
if found_address is None:
| {"golden_diff": "diff --git a/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py b/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py\n--- a/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py\n+++ b/custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py\n@@ -27,7 +27,6 @@\n self._address = address\n \n def fetch(self):\n-\n if self._postcode is None or self._address is None:\n raise ValueError(\"Either postcode or address is None\")\n \n@@ -43,7 +42,7 @@\n \n found_address = None\n for address in addresses_select.find_all(\"option\"):\n- if self._address in address.get_text():\n+ if self._address.upper() in address.get_text().upper():\n found_address = address\n \n if found_address is None:\n", "issue": "[Bug]: adur_worthing_gov_uk has changed format of address\n### I Have A Problem With:\n\nA specific source\n\n### What's Your Problem\n\nThe Adur and Worthing council used to return my address as \u201c12 Roadname\u201d, so that\u2019s what I had in my source args. But the format has recently changed to \u201c12 ROADNAME\u201d, causing the lookup in adur_worthing_gov_uk.py to fail. \r\n\r\nAs the council is just as likely to change it back at some point can I suggest that the lookup is made case independent?\n\n### Source (if relevant)\n\nadur_worthing_gov_uk\n\n### Logs\n\n_No response_\n\n### Relevant Configuration\n\n_No response_\n\n### Checklist Source Error\n\n- [X] Use the example parameters for your source (often available in the documentation) (don't forget to restart Home Assistant after changing the configuration)\n- [X] Checked that the website of your service provider is still working\n- [X] Tested my attributes on the service provider website (if possible)\n- [X] I have tested with the latest version of the integration (master) (for HACS in the 3 dot menu of the integration click on \"Redownload\" and choose master as version)\n\n### Checklist Sensor Error\n\n- [X] Checked in the Home Assistant Calendar tab if the event names match the types names (if types argument is used)\n\n### Required\n\n- [X] I have searched past (closed AND opened) issues to see if this bug has already been reported, and it hasn't been.\n- [X] I understand that people give their precious time for free, and thus I've done my very best to make this problem as easy as possible to investigate.\n", "before_files": [{"content": "from datetime import datetime\n\nimport bs4\nimport requests\nfrom waste_collection_schedule import Collection # type: ignore[attr-defined]\n\nTITLE = \"Adur & Worthing Councils\"\nDESCRIPTION = \"Source for adur-worthing.gov.uk services for Adur & Worthing, UK.\"\nURL = \"https://adur-worthing.gov.uk\"\nTEST_CASES = {\n \"Test_001\": {\"postcode\": \"BN15 9UX\", \"address\": \"1 Western Road North\"},\n \"Test_002\": {\"postcode\": \"BN43 5WE\", \"address\": \"6 Hebe Road\"},\n}\nHEADERS = {\n \"user-agent\": \"Mozilla/5.0\",\n}\nICON_MAP = {\n \"Recycling\": \"mdi:recycle\",\n \"Refuse\": \"mdi:trash-can\",\n \"Garden\": \"mdi:leaf\",\n}\n\n\nclass Source:\n def __init__(self, postcode, address):\n self._postcode = postcode\n self._address = address\n\n def fetch(self):\n\n if self._postcode is None or self._address is None:\n raise ValueError(\"Either postcode or address is None\")\n\n s = requests.Session()\n\n postcode_search_request = s.get(\n 
f\"https://www.adur-worthing.gov.uk/bin-day/?brlu-address-postcode={self._postcode}&return-url=/bin-day/&action=search\",\n headers=HEADERS,\n )\n html_addresses = postcode_search_request.content\n addresses = bs4.BeautifulSoup(html_addresses, \"html.parser\")\n addresses_select = addresses.find(\"select\", {\"id\": \"brlu-selected-address\"})\n\n found_address = None\n for address in addresses_select.find_all(\"option\"):\n if self._address in address.get_text():\n found_address = address\n\n if found_address is None:\n raise ValueError(\"Address not found\")\n\n collections_request = s.get(\n f\"https://www.adur-worthing.gov.uk/bin-day/?brlu-selected-address={address['value']}&return-url=/bin-day/\",\n headers=HEADERS,\n )\n html_collections = collections_request.content\n bin_collections = bs4.BeautifulSoup(html_collections, \"html.parser\")\n\n bin_days_table = bin_collections.find(\"table\", class_=\"bin-days\")\n bin_days_table_body = bin_days_table.find(\"tbody\")\n bin_days_by_type = bin_days_table_body.find_all(\"tr\")\n\n entries = []\n\n for bin_by_type in bin_days_by_type:\n bin_type = bin_by_type.find(\"th\").text\n icon = ICON_MAP.get(bin_type)\n bin_days = bin_by_type.find_all(\"td\")[-1].get_text(separator=\"\\n\")\n for bin_day in bin_days.split(\"\\n\"):\n bin_datetime = datetime.strptime(bin_day, \"%A %d %b %Y\").date()\n entries.append(Collection(t=bin_type, date=bin_datetime, icon=icon))\n\n return entries\n", "path": "custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py"}], "after_files": [{"content": "from datetime import datetime\n\nimport bs4\nimport requests\nfrom waste_collection_schedule import Collection # type: ignore[attr-defined]\n\nTITLE = \"Adur & Worthing Councils\"\nDESCRIPTION = \"Source for adur-worthing.gov.uk services for Adur & Worthing, UK.\"\nURL = \"https://adur-worthing.gov.uk\"\nTEST_CASES = {\n \"Test_001\": {\"postcode\": \"BN15 9UX\", \"address\": \"1 Western Road North\"},\n \"Test_002\": {\"postcode\": \"BN43 5WE\", \"address\": \"6 Hebe Road\"},\n}\nHEADERS = {\n \"user-agent\": \"Mozilla/5.0\",\n}\nICON_MAP = {\n \"Recycling\": \"mdi:recycle\",\n \"Refuse\": \"mdi:trash-can\",\n \"Garden\": \"mdi:leaf\",\n}\n\n\nclass Source:\n def __init__(self, postcode, address):\n self._postcode = postcode\n self._address = address\n\n def fetch(self):\n if self._postcode is None or self._address is None:\n raise ValueError(\"Either postcode or address is None\")\n\n s = requests.Session()\n\n postcode_search_request = s.get(\n f\"https://www.adur-worthing.gov.uk/bin-day/?brlu-address-postcode={self._postcode}&return-url=/bin-day/&action=search\",\n headers=HEADERS,\n )\n html_addresses = postcode_search_request.content\n addresses = bs4.BeautifulSoup(html_addresses, \"html.parser\")\n addresses_select = addresses.find(\"select\", {\"id\": \"brlu-selected-address\"})\n\n found_address = None\n for address in addresses_select.find_all(\"option\"):\n if self._address.upper() in address.get_text().upper():\n found_address = address\n\n if found_address is None:\n raise ValueError(\"Address not found\")\n\n collections_request = s.get(\n f\"https://www.adur-worthing.gov.uk/bin-day/?brlu-selected-address={address['value']}&return-url=/bin-day/\",\n headers=HEADERS,\n )\n html_collections = collections_request.content\n bin_collections = bs4.BeautifulSoup(html_collections, \"html.parser\")\n\n bin_days_table = bin_collections.find(\"table\", class_=\"bin-days\")\n bin_days_table_body = 
bin_days_table.find(\"tbody\")\n bin_days_by_type = bin_days_table_body.find_all(\"tr\")\n\n entries = []\n\n for bin_by_type in bin_days_by_type:\n bin_type = bin_by_type.find(\"th\").text\n icon = ICON_MAP.get(bin_type)\n bin_days = bin_by_type.find_all(\"td\")[-1].get_text(separator=\"\\n\")\n for bin_day in bin_days.split(\"\\n\"):\n bin_datetime = datetime.strptime(bin_day, \"%A %d %b %Y\").date()\n entries.append(Collection(t=bin_type, date=bin_datetime, icon=icon))\n\n return entries\n", "path": "custom_components/waste_collection_schedule/waste_collection_schedule/source/adur_worthing_gov_uk.py"}]} | 1,405 | 216 |
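To see why the one-line change in the golden diff fixes the report: the patch upper-cases both sides of the substring test, so "12 Roadname" still matches "12 ROADNAME". A tiny self-contained illustration (the option strings are invented, not real council data):

```python
options = ["12 ROADNAME, WORTHING", "14 ROADNAME, WORTHING"]
wanted = "12 Roadname"

# Old behaviour: the exact-case substring test fails after the format change.
assert not any(wanted in opt for opt in options)

# Patched behaviour: compare upper-cased on both sides, as the fix does.
matches = [opt for opt in options if wanted.upper() in opt.upper()]
assert matches == ["12 ROADNAME, WORTHING"]
```

`str.casefold()` would be the slightly more thorough choice for caseless matching, but `upper()` is what the patch uses and is adequate for these ASCII addresses.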
gh_patches_debug_659 | rasdani/github-patches | git_diff | pex-tool__pex-2214 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.142
On the docket:
+ [x] KeyError when locking awscli on Python 3.11 #2211
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.141"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.141"
+__version__ = "2.1.142"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.141\"\n+__version__ = \"2.1.142\"\n", "issue": "Release 2.1.142\nOn the docket:\r\n+ [x] KeyError when locking awscli on Python 3.11 #2211\r\n\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.141\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.142\"\n", "path": "pex/version.py"}]} | 346 | 98 |
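The change in this record is only a version-string bump. A minimal post-release sanity check, assuming the new wheel has been installed (for example via `pip install pex==2.1.142`), might look like:

```python
# pex/version.py (shown above) is importable as pex.version.
import pex.version

assert pex.version.__version__ == "2.1.142", pex.version.__version__
print("pex", pex.version.__version__)
```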
gh_patches_debug_6937 | rasdani/github-patches | git_diff | ivy-llc__ivy-22098 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
reshape_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/paddle/tensor/manipulation.py`
Content:
```
1 # global
2 import ivy
3 from ivy.functional.frontends.paddle.func_wrapper import (
4 to_ivy_arrays_and_back,
5 )
6 from ivy.func_wrapper import (
7 with_unsupported_dtypes,
8 with_supported_dtypes,
9 with_supported_device_and_dtypes,
10 )
11
12
13 @to_ivy_arrays_and_back
14 def reshape(x, shape):
15 return ivy.reshape(x, shape)
16
17
18 @with_unsupported_dtypes({"2.5.1 and below": ("float16", "bfloat16")}, "paddle")
19 @to_ivy_arrays_and_back
20 def abs(x, name=None):
21 return ivy.abs(x)
22
23
24 absolute = abs
25
26
27 @to_ivy_arrays_and_back
28 def stack(x, axis=0, name=None):
29 return ivy.stack(x, axis=axis)
30
31
32 @with_unsupported_dtypes({"2.5.1 and below": ("int8", "int16")}, "paddle")
33 @to_ivy_arrays_and_back
34 def concat(x, axis, name=None):
35 return ivy.concat(x, axis=axis)
36
37
38 @with_unsupported_dtypes(
39 {"2.5.1 and below": ("int8", "uint8", "int16", "float16")},
40 "paddle",
41 )
42 @to_ivy_arrays_and_back
43 def tile(x, repeat_times, name=None):
44 return ivy.tile(x, repeats=repeat_times)
45
46
47 @with_unsupported_dtypes(
48 {"2.5.1 and below": ("int8", "uint8", "int16", "float16")},
49 "paddle",
50 )
51 @to_ivy_arrays_and_back
52 def flip(x, axis, name=None):
53 return ivy.flip(x, axis=axis)
54
55
56 @with_unsupported_dtypes(
57 {"2.5.1 and below": ("int16", "complex64", "complex128")},
58 "paddle",
59 )
60 @to_ivy_arrays_and_back
61 def split(x, num_or_sections, axis=0, name=None):
62 return ivy.split(x, num_or_size_splits=num_or_sections, axis=axis)
63
64
65 @with_unsupported_dtypes(
66 {"2.5.1 and below": ("float16", "bfloat16", "int8", "int16")},
67 "paddle",
68 )
69 @to_ivy_arrays_and_back
70 def squeeze(x, axis=None, name=None):
71 return ivy.squeeze(x, axis=axis)
72
73
74 @with_supported_dtypes(
75 {"2.5.1 and below": ("bool", "float32", "float64", "int32", "int64")},
76 "paddle",
77 )
78 @to_ivy_arrays_and_back
79 def expand(x, shape, name=None):
80 return ivy.expand(x, shape)
81
82
83 @with_supported_dtypes(
84 {
85 "2.5.1 and below": (
86 "bool",
87 "float16",
88 "float32",
89 "float64",
90 "int32",
91 "int64",
92 "uint8",
93 )
94 },
95 "paddle",
96 )
97 @to_ivy_arrays_and_back
98 def cast(x, dtype):
99 return ivy.astype(x, dtype)
100
101
102 @with_supported_dtypes(
103 {"2.5.1 and below": ("bool", "float32", "float64", "int32", "int64")},
104 "paddle",
105 )
106 @to_ivy_arrays_and_back
107 def broadcast_to(x, shape, name=None):
108 return ivy.broadcast_to(x, shape)
109
110
111 @with_supported_dtypes(
112 {"2.5.1 and below": ("bool", "float32", "float64", "int32", "int64")},
113 "paddle",
114 )
115 @to_ivy_arrays_and_back
116 def gather(params, indices, axis=-1, batch_dims=0, name=None):
117 return ivy.gather(params, indices, axis=axis, batch_dims=batch_dims)
118
119
120 @with_supported_dtypes(
121 {
122 "2.5.0 and below": (
123 "float32",
124 "float64",
125 "int32",
126 "int64",
127 "complex64",
128 "complex128",
129 )
130 },
131 "paddle",
132 )
133 @to_ivy_arrays_and_back
134 def roll(x, shifts, axis=None, name=None):
135 return ivy.roll(x, shifts, axis=axis)
136
137
138 @with_supported_dtypes(
139 {
140 "2.5.1 and below": (
141 "float32",
142 "float64",
143 "int32",
144 "int64",
145 )
146 },
147 "paddle",
148 )
149 @to_ivy_arrays_and_back
150 def take_along_axis(arr, indices, axis):
151 return ivy.take_along_axis(arr, indices, axis)
152
153
154 @with_supported_device_and_dtypes(
155 {
156 "2.5.1 and above": {
157 "cpu": (
158 "bool",
159 "int32",
160 "int64",
161 "float32",
162 "float64",
163 ),
164 "gpu": ("float16",),
165 },
166 },
167 "paddle",
168 )
169 @to_ivy_arrays_and_back
170 def rot90(x, k=1, axes=(0, 1), name=None):
171 return ivy.rot90(x, k=k, axes=axes)
172
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/paddle/tensor/manipulation.py b/ivy/functional/frontends/paddle/tensor/manipulation.py
--- a/ivy/functional/frontends/paddle/tensor/manipulation.py
+++ b/ivy/functional/frontends/paddle/tensor/manipulation.py
@@ -15,6 +15,17 @@
return ivy.reshape(x, shape)
+@with_unsupported_dtypes(
+ {"2.5.1 and below": ("int8", "uint8", "int16", "uint16", "float16", "bfloat16")},
+ "paddle",
+)
+@to_ivy_arrays_and_back
+def reshape_(x, shape):
+ ret = ivy.reshape(x, shape)
+ ivy.inplace_update(x, ret)
+ return x
+
+
@with_unsupported_dtypes({"2.5.1 and below": ("float16", "bfloat16")}, "paddle")
@to_ivy_arrays_and_back
def abs(x, name=None):
| {"golden_diff": "diff --git a/ivy/functional/frontends/paddle/tensor/manipulation.py b/ivy/functional/frontends/paddle/tensor/manipulation.py\n--- a/ivy/functional/frontends/paddle/tensor/manipulation.py\n+++ b/ivy/functional/frontends/paddle/tensor/manipulation.py\n@@ -15,6 +15,17 @@\n return ivy.reshape(x, shape)\n \n \n+@with_unsupported_dtypes(\n+ {\"2.5.1 and below\": (\"int8\", \"uint8\", \"int16\", \"uint16\", \"float16\", \"bfloat16\")},\n+ \"paddle\",\n+)\n+@to_ivy_arrays_and_back\n+def reshape_(x, shape):\n+ ret = ivy.reshape(x, shape)\n+ ivy.inplace_update(x, ret)\n+ return x\n+\n+\n @with_unsupported_dtypes({\"2.5.1 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n @to_ivy_arrays_and_back\n def abs(x, name=None):\n", "issue": "reshape_\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\nfrom ivy.func_wrapper import (\n with_unsupported_dtypes,\n with_supported_dtypes,\n with_supported_device_and_dtypes,\n)\n\n\n@to_ivy_arrays_and_back\ndef reshape(x, shape):\n return ivy.reshape(x, shape)\n\n\n@with_unsupported_dtypes({\"2.5.1 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef abs(x, name=None):\n return ivy.abs(x)\n\n\nabsolute = abs\n\n\n@to_ivy_arrays_and_back\ndef stack(x, axis=0, name=None):\n return ivy.stack(x, axis=axis)\n\n\n@with_unsupported_dtypes({\"2.5.1 and below\": (\"int8\", \"int16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef concat(x, axis, name=None):\n return ivy.concat(x, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef tile(x, repeat_times, name=None):\n return ivy.tile(x, repeats=repeat_times)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef flip(x, axis, name=None):\n return ivy.flip(x, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int16\", \"complex64\", \"complex128\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef split(x, num_or_sections, axis=0, name=None):\n return ivy.split(x, num_or_size_splits=num_or_sections, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"float16\", \"bfloat16\", \"int8\", \"int16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef squeeze(x, axis=None, name=None):\n return ivy.squeeze(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef expand(x, shape, name=None):\n return ivy.expand(x, shape)\n\n\n@with_supported_dtypes(\n {\n \"2.5.1 and below\": (\n \"bool\",\n \"float16\",\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"uint8\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef cast(x, dtype):\n return ivy.astype(x, dtype)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef broadcast_to(x, shape, name=None):\n return ivy.broadcast_to(x, shape)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef gather(params, indices, axis=-1, batch_dims=0, name=None):\n return ivy.gather(params, indices, axis=axis, batch_dims=batch_dims)\n\n\n@with_supported_dtypes(\n {\n \"2.5.0 and below\": 
(\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"complex64\",\n \"complex128\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef roll(x, shifts, axis=None, name=None):\n return ivy.roll(x, shifts, axis=axis)\n\n\n@with_supported_dtypes(\n {\n \"2.5.1 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef take_along_axis(arr, indices, axis):\n return ivy.take_along_axis(arr, indices, axis)\n\n\n@with_supported_device_and_dtypes(\n {\n \"2.5.1 and above\": {\n \"cpu\": (\n \"bool\",\n \"int32\",\n \"int64\",\n \"float32\",\n \"float64\",\n ),\n \"gpu\": (\"float16\",),\n },\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef rot90(x, k=1, axes=(0, 1), name=None):\n return ivy.rot90(x, k=k, axes=axes)\n", "path": "ivy/functional/frontends/paddle/tensor/manipulation.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\nfrom ivy.func_wrapper import (\n with_unsupported_dtypes,\n with_supported_dtypes,\n with_supported_device_and_dtypes,\n)\n\n\n@to_ivy_arrays_and_back\ndef reshape(x, shape):\n return ivy.reshape(x, shape)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int8\", \"uint8\", \"int16\", \"uint16\", \"float16\", \"bfloat16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef reshape_(x, shape):\n ret = ivy.reshape(x, shape)\n ivy.inplace_update(x, ret)\n return x\n\n\n@with_unsupported_dtypes({\"2.5.1 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef abs(x, name=None):\n return ivy.abs(x)\n\n\nabsolute = abs\n\n\n@to_ivy_arrays_and_back\ndef stack(x, axis=0, name=None):\n return ivy.stack(x, axis=axis)\n\n\n@with_unsupported_dtypes({\"2.5.1 and below\": (\"int8\", \"int16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef concat(x, axis, name=None):\n return ivy.concat(x, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef tile(x, repeat_times, name=None):\n return ivy.tile(x, repeats=repeat_times)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int8\", \"uint8\", \"int16\", \"float16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef flip(x, axis, name=None):\n return ivy.flip(x, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"int16\", \"complex64\", \"complex128\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef split(x, num_or_sections, axis=0, name=None):\n return ivy.split(x, num_or_size_splits=num_or_sections, axis=axis)\n\n\n@with_unsupported_dtypes(\n {\"2.5.1 and below\": (\"float16\", \"bfloat16\", \"int8\", \"int16\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef squeeze(x, axis=None, name=None):\n return ivy.squeeze(x, axis=axis)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef expand(x, shape, name=None):\n return ivy.expand(x, shape)\n\n\n@with_supported_dtypes(\n {\n \"2.5.1 and below\": (\n \"bool\",\n \"float16\",\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"uint8\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef cast(x, dtype):\n return ivy.astype(x, dtype)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef broadcast_to(x, shape, name=None):\n return 
ivy.broadcast_to(x, shape)\n\n\n@with_supported_dtypes(\n {\"2.5.1 and below\": (\"bool\", \"float32\", \"float64\", \"int32\", \"int64\")},\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef gather(params, indices, axis=-1, batch_dims=0, name=None):\n return ivy.gather(params, indices, axis=axis, batch_dims=batch_dims)\n\n\n@with_supported_dtypes(\n {\n \"2.5.0 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n \"complex64\",\n \"complex128\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef roll(x, shifts, axis=None, name=None):\n return ivy.roll(x, shifts, axis=axis)\n\n\n@with_supported_dtypes(\n {\n \"2.5.1 and below\": (\n \"float32\",\n \"float64\",\n \"int32\",\n \"int64\",\n )\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef take_along_axis(arr, indices, axis):\n return ivy.take_along_axis(arr, indices, axis)\n\n\n@with_supported_device_and_dtypes(\n {\n \"2.5.1 and above\": {\n \"cpu\": (\n \"bool\",\n \"int32\",\n \"int64\",\n \"float32\",\n \"float64\",\n ),\n \"gpu\": (\"float16\",),\n },\n },\n \"paddle\",\n)\n@to_ivy_arrays_and_back\ndef rot90(x, k=1, axes=(0, 1), name=None):\n return ivy.rot90(x, k=k, axes=axes)\n", "path": "ivy/functional/frontends/paddle/tensor/manipulation.py"}]} | 1,857 | 239 |
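For context on the diff above: the trailing underscore follows Paddle's convention that `reshape_` mutates its input in place (hence the `ivy.inplace_update` call) while `reshape` returns a new tensor. The sketch below mimics that contract with plain NumPy stand-ins; the function bodies are illustrative, not ivy internals.

```python
import numpy as np


def reshape(x, shape):
    return np.reshape(x, shape)  # out-of-place: x is left untouched


def reshape_(x, shape):
    x.shape = shape              # in-place: mutates x, then returns it
    return x


a = np.arange(6)
b = reshape(a, (2, 3))
assert a.shape == (6,) and b.shape == (2, 3)

reshape_(a, (3, 2))
assert a.shape == (3, 2)
```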
gh_patches_debug_36890 | rasdani/github-patches | git_diff | bokeh__bokeh-4021 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Allow for the injection of raw HTML code
Currently, the Widget library contains a `Paragraph` and a `PreText` widget allowing the user to put basic text on the rendered page. Neither of these widgets allows for the inclusion of formatted text using HTML markup. A widget should be added to support the inclusion of raw HTML; it can be a simple `Div` widget that renders its text inside a div tag.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bokeh/models/widgets/markups.py`
Content:
```
1 """ Various kinds of markup (static content) widgets.
2
3 """
4 from __future__ import absolute_import
5
6 from ...core.properties import abstract
7 from ...core.properties import Int, String
8 from .widget import Widget
9
10 @abstract
11 class Markup(Widget):
12 """ Base class for HTML markup widget models. """
13
14 class Paragraph(Markup):
15 """ A block (paragraph) of text.
16
17 """
18
19 text = String(default="", help="""
20 The contents of the widget.
21 """)
22
23 width = Int(500, help="""
24 The width of the block in pixels.
25 """)
26
27 height = Int(400, help="""
28 The height of the block in pixels.
29 """)
30
31
32 class PreText(Paragraph):
33 """ A block (paragraph) of pre-formatted text.
34
35 """
36
```
Path: `examples/plotting/file/slider_callback_policy.py`
Content:
```
1 from bokeh.io import vform, output_file, show
2 from bokeh.models import CustomJS, Slider, Paragraph, PreText
3
4 # NOTE: the JS functions to forvide the format code for strings is found the answer
5 # from the user fearphage at http://stackoverflow.com/questions/610406/javascript-equivalent-to-printf-string-format
6 callback = CustomJS(code="""
7 var s1 = slider1.get('value')
8 var s2 = slider2.get('value')
9 var s3 = slider3.get('value')
10
11 if (!String.prototype.format) {
12 String.prototype.format = function() {
13 var args = arguments;
14 return this.replace(/{(\d+)}/g, function(match, number) {
15 return typeof args[number] != 'undefined'
16 ? args[number]
17 : match
18 ;
19 });
20 };
21 }
22
23 para.set('text', "Slider Values\\n\\n Slider 1: {0}\\n Slider 2: {1}\\n Slider 3: {2}".format(s1, s2, s3))
24 """)
25
26 para = PreText(text = "Slider Values:\n\n Slider 1: 0\n Slider 2: 0\n Slider 3: 0", width = 200, height = 150)
27
28 s1 = Slider(title="Slider 1 (Continuous)", start=0, end=1000, value=0, step=1, callback=callback, callback_policy="continuous")
29 s2 = Slider(title="Slider 2 (Throttle)", start=0, end=1000, value=0, step=1, callback=callback, callback_policy="throttle", callback_throttle=2000)
30 s3 = Slider(title="Slider 3 (Mouse Up)", start=0, end=1000, value=0, step=1, callback=callback, callback_policy="mouseup")
31
32 callback.args['para'] = para
33 callback.args['slider1'] = s1
34 callback.args['slider2'] = s2
35 callback.args['slider3'] = s3
36
37 output_file('slider_callback_policy.html')
38
39 show(vform(s1, s2, s3, para))
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bokeh/models/widgets/markups.py b/bokeh/models/widgets/markups.py
--- a/bokeh/models/widgets/markups.py
+++ b/bokeh/models/widgets/markups.py
@@ -4,18 +4,13 @@
from __future__ import absolute_import
from ...core.properties import abstract
-from ...core.properties import Int, String
+from ...core.properties import Int, String, Bool
from .widget import Widget
@abstract
class Markup(Widget):
""" Base class for HTML markup widget models. """
-class Paragraph(Markup):
- """ A block (paragraph) of text.
-
- """
-
text = String(default="", help="""
The contents of the widget.
""")
@@ -29,6 +24,20 @@
""")
+class Paragraph(Markup):
+ """ A block (paragraph) of text.
+
+ """
+
+class Div(Markup):
+ """ A block (div) of text.
+
+ """
+
+ render_as_text = Bool(False, help="""
+ Should the text be rendered as raw text (False, default), or should the text be interprited as an HTML string (True)
+ """)
+
class PreText(Paragraph):
""" A block (paragraph) of pre-formatted text.
diff --git a/examples/plotting/file/slider_callback_policy.py b/examples/plotting/file/slider_callback_policy.py
--- a/examples/plotting/file/slider_callback_policy.py
+++ b/examples/plotting/file/slider_callback_policy.py
@@ -1,5 +1,5 @@
from bokeh.io import vform, output_file, show
-from bokeh.models import CustomJS, Slider, Paragraph, PreText
+from bokeh.models import CustomJS, Slider, Div
# NOTE: the JS functions to forvide the format code for strings is found the answer
# from the user fearphage at http://stackoverflow.com/questions/610406/javascript-equivalent-to-printf-string-format
@@ -20,10 +20,10 @@
};
}
- para.set('text', "Slider Values\\n\\n Slider 1: {0}\\n Slider 2: {1}\\n Slider 3: {2}".format(s1, s2, s3))
+ para.set('text', "<h1>Slider Values</h1><p>Slider 1: {0}<p>Slider 2: {1}<p>Slider 3: {2}".format(s1, s2, s3))
""")
-para = PreText(text = "Slider Values:\n\n Slider 1: 0\n Slider 2: 0\n Slider 3: 0", width = 200, height = 150)
+para = Div(text="<h1>Slider Values:</h1><p>Slider 1: 0<p>Slider 2: 0<p>Slider 3: 0", width=200, height=150, render_as_text=False)
s1 = Slider(title="Slider 1 (Continuous)", start=0, end=1000, value=0, step=1, callback=callback, callback_policy="continuous")
s2 = Slider(title="Slider 2 (Throttle)", start=0, end=1000, value=0, step=1, callback=callback, callback_policy="throttle", callback_throttle=2000)
| {"golden_diff": "diff --git a/bokeh/models/widgets/markups.py b/bokeh/models/widgets/markups.py\n--- a/bokeh/models/widgets/markups.py\n+++ b/bokeh/models/widgets/markups.py\n@@ -4,18 +4,13 @@\n from __future__ import absolute_import\n \n from ...core.properties import abstract\n-from ...core.properties import Int, String\n+from ...core.properties import Int, String, Bool\n from .widget import Widget\n \n @abstract\n class Markup(Widget):\n \"\"\" Base class for HTML markup widget models. \"\"\"\n \n-class Paragraph(Markup):\n- \"\"\" A block (paragraph) of text.\n-\n- \"\"\"\n-\n text = String(default=\"\", help=\"\"\"\n The contents of the widget.\n \"\"\")\n@@ -29,6 +24,20 @@\n \"\"\")\n \n \n+class Paragraph(Markup):\n+ \"\"\" A block (paragraph) of text.\n+\n+ \"\"\"\n+\n+class Div(Markup):\n+ \"\"\" A block (div) of text.\n+\n+ \"\"\"\n+\n+ render_as_text = Bool(False, help=\"\"\"\n+ Should the text be rendered as raw text (False, default), or should the text be interprited as an HTML string (True)\n+ \"\"\")\n+\n class PreText(Paragraph):\n \"\"\" A block (paragraph) of pre-formatted text.\n \ndiff --git a/examples/plotting/file/slider_callback_policy.py b/examples/plotting/file/slider_callback_policy.py\n--- a/examples/plotting/file/slider_callback_policy.py\n+++ b/examples/plotting/file/slider_callback_policy.py\n@@ -1,5 +1,5 @@\n from bokeh.io import vform, output_file, show\n-from bokeh.models import CustomJS, Slider, Paragraph, PreText\n+from bokeh.models import CustomJS, Slider, Div\n \n # NOTE: the JS functions to forvide the format code for strings is found the answer\n # from the user fearphage at http://stackoverflow.com/questions/610406/javascript-equivalent-to-printf-string-format\n@@ -20,10 +20,10 @@\n };\n }\n \n- para.set('text', \"Slider Values\\\\n\\\\n Slider 1: {0}\\\\n Slider 2: {1}\\\\n Slider 3: {2}\".format(s1, s2, s3))\n+ para.set('text', \"<h1>Slider Values</h1><p>Slider 1: {0}<p>Slider 2: {1}<p>Slider 3: {2}\".format(s1, s2, s3))\n \"\"\")\n \n-para = PreText(text = \"Slider Values:\\n\\n Slider 1: 0\\n Slider 2: 0\\n Slider 3: 0\", width = 200, height = 150)\n+para = Div(text=\"<h1>Slider Values:</h1><p>Slider 1: 0<p>Slider 2: 0<p>Slider 3: 0\", width=200, height=150, render_as_text=False)\n \n s1 = Slider(title=\"Slider 1 (Continuous)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"continuous\")\n s2 = Slider(title=\"Slider 2 (Throttle)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"throttle\", callback_throttle=2000)\n", "issue": "Allow for the injection of raw HTML code\nCurrently, the Widget library contains a `Paragraph` and a `PreText` widget allowing the user to put basic text on the rendered page. Neither of these widgets allows for the inclusion of formatted text using HTML markup. A widget should be added to support the inclusion of raw HTML. The widget can be a simple div named div tag.\n\n", "before_files": [{"content": "\"\"\" Various kinds of markup (static content) widgets.\n\n\"\"\"\nfrom __future__ import absolute_import\n\nfrom ...core.properties import abstract\nfrom ...core.properties import Int, String\nfrom .widget import Widget\n\n@abstract\nclass Markup(Widget):\n \"\"\" Base class for HTML markup widget models. 
\"\"\"\n\nclass Paragraph(Markup):\n \"\"\" A block (paragraph) of text.\n\n \"\"\"\n\n text = String(default=\"\", help=\"\"\"\n The contents of the widget.\n \"\"\")\n\n width = Int(500, help=\"\"\"\n The width of the block in pixels.\n \"\"\")\n\n height = Int(400, help=\"\"\"\n The height of the block in pixels.\n \"\"\")\n\n\nclass PreText(Paragraph):\n \"\"\" A block (paragraph) of pre-formatted text.\n\n \"\"\"\n", "path": "bokeh/models/widgets/markups.py"}, {"content": "from bokeh.io import vform, output_file, show\nfrom bokeh.models import CustomJS, Slider, Paragraph, PreText\n\n# NOTE: the JS functions to forvide the format code for strings is found the answer\n# from the user fearphage at http://stackoverflow.com/questions/610406/javascript-equivalent-to-printf-string-format\ncallback = CustomJS(code=\"\"\"\n var s1 = slider1.get('value')\n var s2 = slider2.get('value')\n var s3 = slider3.get('value')\n\n if (!String.prototype.format) {\n String.prototype.format = function() {\n var args = arguments;\n return this.replace(/{(\\d+)}/g, function(match, number) {\n return typeof args[number] != 'undefined'\n ? args[number]\n : match\n ;\n });\n };\n }\n\n para.set('text', \"Slider Values\\\\n\\\\n Slider 1: {0}\\\\n Slider 2: {1}\\\\n Slider 3: {2}\".format(s1, s2, s3))\n\"\"\")\n\npara = PreText(text = \"Slider Values:\\n\\n Slider 1: 0\\n Slider 2: 0\\n Slider 3: 0\", width = 200, height = 150)\n\ns1 = Slider(title=\"Slider 1 (Continuous)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"continuous\")\ns2 = Slider(title=\"Slider 2 (Throttle)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"throttle\", callback_throttle=2000)\ns3 = Slider(title=\"Slider 3 (Mouse Up)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"mouseup\")\n\ncallback.args['para'] = para\ncallback.args['slider1'] = s1\ncallback.args['slider2'] = s2\ncallback.args['slider3'] = s3\n\noutput_file('slider_callback_policy.html')\n\nshow(vform(s1, s2, s3, para))\n", "path": "examples/plotting/file/slider_callback_policy.py"}], "after_files": [{"content": "\"\"\" Various kinds of markup (static content) widgets.\n\n\"\"\"\nfrom __future__ import absolute_import\n\nfrom ...core.properties import abstract\nfrom ...core.properties import Int, String, Bool\nfrom .widget import Widget\n\n@abstract\nclass Markup(Widget):\n \"\"\" Base class for HTML markup widget models. 
\"\"\"\n\n text = String(default=\"\", help=\"\"\"\n The contents of the widget.\n \"\"\")\n\n width = Int(500, help=\"\"\"\n The width of the block in pixels.\n \"\"\")\n\n height = Int(400, help=\"\"\"\n The height of the block in pixels.\n \"\"\")\n\n\nclass Paragraph(Markup):\n \"\"\" A block (paragraph) of text.\n\n \"\"\"\n\nclass Div(Markup):\n \"\"\" A block (div) of text.\n\n \"\"\"\n\n render_as_text = Bool(False, help=\"\"\"\n Should the text be rendered as raw text (False, default), or should the text be interprited as an HTML string (True)\n \"\"\")\n\nclass PreText(Paragraph):\n \"\"\" A block (paragraph) of pre-formatted text.\n\n \"\"\"\n", "path": "bokeh/models/widgets/markups.py"}, {"content": "from bokeh.io import vform, output_file, show\nfrom bokeh.models import CustomJS, Slider, Div\n\n# NOTE: the JS functions to forvide the format code for strings is found the answer\n# from the user fearphage at http://stackoverflow.com/questions/610406/javascript-equivalent-to-printf-string-format\ncallback = CustomJS(code=\"\"\"\n var s1 = slider1.get('value')\n var s2 = slider2.get('value')\n var s3 = slider3.get('value')\n\n if (!String.prototype.format) {\n String.prototype.format = function() {\n var args = arguments;\n return this.replace(/{(\\d+)}/g, function(match, number) {\n return typeof args[number] != 'undefined'\n ? args[number]\n : match\n ;\n });\n };\n }\n\n para.set('text', \"<h1>Slider Values</h1><p>Slider 1: {0}<p>Slider 2: {1}<p>Slider 3: {2}\".format(s1, s2, s3))\n\"\"\")\n\npara = Div(text=\"<h1>Slider Values:</h1><p>Slider 1: 0<p>Slider 2: 0<p>Slider 3: 0\", width=200, height=150, render_as_text=False)\n\ns1 = Slider(title=\"Slider 1 (Continuous)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"continuous\")\ns2 = Slider(title=\"Slider 2 (Throttle)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"throttle\", callback_throttle=2000)\ns3 = Slider(title=\"Slider 3 (Mouse Up)\", start=0, end=1000, value=0, step=1, callback=callback, callback_policy=\"mouseup\")\n\ncallback.args['para'] = para\ncallback.args['slider1'] = s1\ncallback.args['slider2'] = s2\ncallback.args['slider3'] = s3\n\noutput_file('slider_callback_policy.html')\n\nshow(vform(s1, s2, s3, para))\n", "path": "examples/plotting/file/slider_callback_policy.py"}]} | 1,157 | 762 |
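A usage sketch for the widget this record introduces. The import path and the `render_as_text` flag mirror the patched code above; treat the exact signature as illustrative for this point in Bokeh's history.

```python
from bokeh.io import output_file, show
from bokeh.models import Div

# render_as_text=False asks the widget to interpret the string as HTML
# rather than escape it as plain text.
div = Div(text="<h1>Status</h1><p>All <b>good</b>.</p>",
          width=300, height=100, render_as_text=False)

output_file("div_example.html")
show(div)
```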
gh_patches_debug_14897 | rasdani/github-patches | git_diff | qtile__qtile-3099 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Widget for updatable image
I don't want to reinvent the wheel, so I'm checking here first.
I use GenPollText for my keyboard layout indicator, but I want to see the flag (an image) instead. As I change layout, the image should change. Does Qtile have such a widget, or is there a proper way to do that?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/widget/image.py`
Content:
```
1 # Copyright (c) 2013 dequis
2 # Copyright (c) 2014 Sean Vig
3 # Copyright (c) 2014 Adi Sieker
4 #
5 # Permission is hereby granted, free of charge, to any person obtaining a copy
6 # of this software and associated documentation files (the "Software"), to deal
7 # in the Software without restriction, including without limitation the rights
8 # to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9 # copies of the Software, and to permit persons to whom the Software is
10 # furnished to do so, subject to the following conditions:
11 #
12 # The above copyright notice and this permission notice shall be included in
13 # all copies or substantial portions of the Software.
14 #
15 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16 # IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17 # FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18 # AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19 # LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21 # SOFTWARE.
22 import os
23
24 from libqtile import bar
25 from libqtile.images import Img
26 from libqtile.log_utils import logger
27 from libqtile.widget import base
28
29
30 class Image(base._Widget, base.MarginMixin):
31 """Display a PNG image on the bar"""
32 orientations = base.ORIENTATION_BOTH
33 defaults = [
34 ("scale", True, "Enable/Disable image scaling"),
35 ("rotate", 0.0, "rotate the image in degrees counter-clockwise"),
36 ("filename", None, "Image filename. Can contain '~'"),
37 ]
38
39 def __init__(self, length=bar.CALCULATED, **config):
40 base._Widget.__init__(self, length, **config)
41 self.add_defaults(Image.defaults)
42 self.add_defaults(base.MarginMixin.defaults)
43
44 # make the default 0 instead
45 self._variable_defaults["margin"] = 0
46
47 def _configure(self, qtile, bar):
48 base._Widget._configure(self, qtile, bar)
49 self.img = None
50
51 if not self.filename:
52 logger.warning("Image filename not set!")
53 return
54
55 self.filename = os.path.expanduser(self.filename)
56
57 if not os.path.exists(self.filename):
58 logger.warning("Image does not exist: {}".format(self.filename))
59 return
60
61 img = Img.from_path(self.filename)
62 self.img = img
63 img.theta = self.rotate
64 if not self.scale:
65 return
66 if self.bar.horizontal:
67 new_height = self.bar.height - (self.margin_y * 2)
68 img.resize(height=new_height)
69 else:
70 new_width = self.bar.width - (self.margin_x * 2)
71 img.resize(width=new_width)
72
73 def draw(self):
74 if self.img is None:
75 return
76
77 self.drawer.clear(self.background or self.bar.background)
78 self.drawer.ctx.save()
79 self.drawer.ctx.translate(self.margin_x, self.margin_y)
80 self.drawer.ctx.set_source(self.img.pattern)
81 self.drawer.ctx.paint()
82 self.drawer.ctx.restore()
83
84 if self.bar.horizontal:
85 self.drawer.draw(offsetx=self.offset, offsety=self.offsety, width=self.width)
86 else:
87 self.drawer.draw(offsety=self.offset, offsetx=self.offsetx, height=self.width)
88
89 def calculate_length(self):
90 if self.img is None:
91 return 0
92
93 if self.bar.horizontal:
94 return self.img.width + (self.margin_x * 2)
95 else:
96 return self.img.height + (self.margin_y * 2)
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libqtile/widget/image.py b/libqtile/widget/image.py
--- a/libqtile/widget/image.py
+++ b/libqtile/widget/image.py
@@ -46,6 +46,9 @@
def _configure(self, qtile, bar):
base._Widget._configure(self, qtile, bar)
+ self._update_image()
+
+ def _update_image(self):
self.img = None
if not self.filename:
@@ -94,3 +97,13 @@
return self.img.width + (self.margin_x * 2)
else:
return self.img.height + (self.margin_y * 2)
+
+ def cmd_update(self, filename):
+ old_length = self.calculate_length()
+ self.filename = filename
+ self._update_image()
+
+ if self.calculate_length() == old_length:
+ self.draw()
+ else:
+ self.bar.draw()
| {"golden_diff": "diff --git a/libqtile/widget/image.py b/libqtile/widget/image.py\n--- a/libqtile/widget/image.py\n+++ b/libqtile/widget/image.py\n@@ -46,6 +46,9 @@\n \n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n+ self._update_image()\n+\n+ def _update_image(self):\n self.img = None\n \n if not self.filename:\n@@ -94,3 +97,13 @@\n return self.img.width + (self.margin_x * 2)\n else:\n return self.img.height + (self.margin_y * 2)\n+\n+ def cmd_update(self, filename):\n+ old_length = self.calculate_length()\n+ self.filename = filename\n+ self._update_image()\n+\n+ if self.calculate_length() == old_length:\n+ self.draw()\n+ else:\n+ self.bar.draw()\n", "issue": "Widget for updatable image\nI don't want to reinvent the wheel, so will check before.\r\nI use GenPollText for my keyboard layout indicator but instead I want to see the flag (image). As I change layout the image should be changed. Do Qtile has such widget or proper way to do that?\n", "before_files": [{"content": "# Copyright (c) 2013 dequis\n# Copyright (c) 2014 Sean Vig\n# Copyright (c) 2014 Adi Sieker\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\nimport os\n\nfrom libqtile import bar\nfrom libqtile.images import Img\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\n\nclass Image(base._Widget, base.MarginMixin):\n \"\"\"Display a PNG image on the bar\"\"\"\n orientations = base.ORIENTATION_BOTH\n defaults = [\n (\"scale\", True, \"Enable/Disable image scaling\"),\n (\"rotate\", 0.0, \"rotate the image in degrees counter-clockwise\"),\n (\"filename\", None, \"Image filename. 
Can contain '~'\"),\n ]\n\n def __init__(self, length=bar.CALCULATED, **config):\n base._Widget.__init__(self, length, **config)\n self.add_defaults(Image.defaults)\n self.add_defaults(base.MarginMixin.defaults)\n\n # make the default 0 instead\n self._variable_defaults[\"margin\"] = 0\n\n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n self.img = None\n\n if not self.filename:\n logger.warning(\"Image filename not set!\")\n return\n\n self.filename = os.path.expanduser(self.filename)\n\n if not os.path.exists(self.filename):\n logger.warning(\"Image does not exist: {}\".format(self.filename))\n return\n\n img = Img.from_path(self.filename)\n self.img = img\n img.theta = self.rotate\n if not self.scale:\n return\n if self.bar.horizontal:\n new_height = self.bar.height - (self.margin_y * 2)\n img.resize(height=new_height)\n else:\n new_width = self.bar.width - (self.margin_x * 2)\n img.resize(width=new_width)\n\n def draw(self):\n if self.img is None:\n return\n\n self.drawer.clear(self.background or self.bar.background)\n self.drawer.ctx.save()\n self.drawer.ctx.translate(self.margin_x, self.margin_y)\n self.drawer.ctx.set_source(self.img.pattern)\n self.drawer.ctx.paint()\n self.drawer.ctx.restore()\n\n if self.bar.horizontal:\n self.drawer.draw(offsetx=self.offset, offsety=self.offsety, width=self.width)\n else:\n self.drawer.draw(offsety=self.offset, offsetx=self.offsetx, height=self.width)\n\n def calculate_length(self):\n if self.img is None:\n return 0\n\n if self.bar.horizontal:\n return self.img.width + (self.margin_x * 2)\n else:\n return self.img.height + (self.margin_y * 2)\n", "path": "libqtile/widget/image.py"}], "after_files": [{"content": "# Copyright (c) 2013 dequis\n# Copyright (c) 2014 Sean Vig\n# Copyright (c) 2014 Adi Sieker\n#\n# Permission is hereby granted, free of charge, to any person obtaining a copy\n# of this software and associated documentation files (the \"Software\"), to deal\n# in the Software without restriction, including without limitation the rights\n# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\n# copies of the Software, and to permit persons to whom the Software is\n# furnished to do so, subject to the following conditions:\n#\n# The above copyright notice and this permission notice shall be included in\n# all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\n# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\n# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\n# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\n# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\n# SOFTWARE.\nimport os\n\nfrom libqtile import bar\nfrom libqtile.images import Img\nfrom libqtile.log_utils import logger\nfrom libqtile.widget import base\n\n\nclass Image(base._Widget, base.MarginMixin):\n \"\"\"Display a PNG image on the bar\"\"\"\n orientations = base.ORIENTATION_BOTH\n defaults = [\n (\"scale\", True, \"Enable/Disable image scaling\"),\n (\"rotate\", 0.0, \"rotate the image in degrees counter-clockwise\"),\n (\"filename\", None, \"Image filename. 
Can contain '~'\"),\n ]\n\n def __init__(self, length=bar.CALCULATED, **config):\n base._Widget.__init__(self, length, **config)\n self.add_defaults(Image.defaults)\n self.add_defaults(base.MarginMixin.defaults)\n\n # make the default 0 instead\n self._variable_defaults[\"margin\"] = 0\n\n def _configure(self, qtile, bar):\n base._Widget._configure(self, qtile, bar)\n self._update_image()\n\n def _update_image(self):\n self.img = None\n\n if not self.filename:\n logger.warning(\"Image filename not set!\")\n return\n\n self.filename = os.path.expanduser(self.filename)\n\n if not os.path.exists(self.filename):\n logger.warning(\"Image does not exist: {}\".format(self.filename))\n return\n\n img = Img.from_path(self.filename)\n self.img = img\n img.theta = self.rotate\n if not self.scale:\n return\n if self.bar.horizontal:\n new_height = self.bar.height - (self.margin_y * 2)\n img.resize(height=new_height)\n else:\n new_width = self.bar.width - (self.margin_x * 2)\n img.resize(width=new_width)\n\n def draw(self):\n if self.img is None:\n return\n\n self.drawer.clear(self.background or self.bar.background)\n self.drawer.ctx.save()\n self.drawer.ctx.translate(self.margin_x, self.margin_y)\n self.drawer.ctx.set_source(self.img.pattern)\n self.drawer.ctx.paint()\n self.drawer.ctx.restore()\n\n if self.bar.horizontal:\n self.drawer.draw(offsetx=self.offset, offsety=self.offsety, width=self.width)\n else:\n self.drawer.draw(offsety=self.offset, offsetx=self.offsetx, height=self.width)\n\n def calculate_length(self):\n if self.img is None:\n return 0\n\n if self.bar.horizontal:\n return self.img.width + (self.margin_x * 2)\n else:\n return self.img.height + (self.margin_y * 2)\n\n def cmd_update(self, filename):\n old_length = self.calculate_length()\n self.filename = filename\n self._update_image()\n\n if self.calculate_length() == old_length:\n self.draw()\n else:\n self.bar.draw()\n", "path": "libqtile/widget/image.py"}]} | 1,312 | 205 |
gh_patches_debug_11910 | rasdani/github-patches | git_diff | web2py__web2py-1682 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
heroku ADAPTERS error
I'm looking to use Heroku for deployment of my web2py project. However, when I add
`from gluon.contrib.heroku import get_db`
`db = get_db(name=None, pool_size=myconf.get('db.pool_size'))`
I get a ticket with the error:
> File "/Users/huangyu/dev/web2py/gluon/contrib/heroku.py", line 10, in <module>
from pydal.adapters import ADAPTERS, PostgreSQLAdapter
ImportError: cannot import name ADAPTERS
It looks like web2py has moved on from using ADAPTERS. Has that been replaced by
`@adapters.register_for('postgres')`?
But the heroku file has not been updated.
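For reference, a minimal sketch of the newer registration style, assuming pydal's decorator-based adapter registry (the names `adapters`, `PostgrePsyco`, and `DatabaseStoredFile` come from current pydal and replace the old `ADAPTERS`/`PostgreSQLAdapter`/`UseDatabaseStoredFile`):

```python
# Sketch only: register a Heroku-flavoured Postgres adapter with the
# decorator API that superseded the ADAPTERS dict.
from pydal.adapters import adapters, PostgrePsyco
from pydal.helpers.classes import DatabaseStoredFile

@adapters.register_for('postgres')
class HerokuPostgresAdapter(DatabaseStoredFile, PostgrePsyco):
    uploads_in_blob = True
```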
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gluon/contrib/heroku.py`
Content:
```
1 """
2 Usage: in web2py models/db.py
3
4 from gluon.contrib.heroku import get_db
5 db = get_db()
6
7 """
8 import os
9 from gluon import *
10 from pydal.adapters import ADAPTERS, PostgreSQLAdapter
11 from pydal.helpers.classes import UseDatabaseStoredFile
12
13 class HerokuPostgresAdapter(UseDatabaseStoredFile,PostgreSQLAdapter):
14 drivers = ('psycopg2',)
15 uploads_in_blob = True
16
17 ADAPTERS['postgres'] = HerokuPostgresAdapter
18
19 def get_db(name = None, pool_size=10):
20 if not name:
21 names = [n for n in os.environ.keys()
22 if n[:18]+n[-4:]=='HEROKU_POSTGRESQL__URL']
23 if names:
24 name = names[0]
25 if name:
26 db = DAL(os.environ[name], pool_size=pool_size)
27 current.session.connect(current.request, current.response, db=db)
28 else:
29 db = DAL('sqlite://heroku.test.sqlite')
30 return db
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gluon/contrib/heroku.py b/gluon/contrib/heroku.py
--- a/gluon/contrib/heroku.py
+++ b/gluon/contrib/heroku.py
@@ -7,15 +7,13 @@
"""
import os
from gluon import *
-from pydal.adapters import ADAPTERS, PostgreSQLAdapter
-from pydal.helpers.classes import UseDatabaseStoredFile
+from pydal.adapters import adapters, PostgrePsyco
+from pydal.helpers.classes import DatabaseStoredFile
-class HerokuPostgresAdapter(UseDatabaseStoredFile,PostgreSQLAdapter):
- drivers = ('psycopg2',)
[email protected]_for('postgres')
+class HerokuPostgresAdapter(DatabaseStoredFile, PostgrePsyco):
uploads_in_blob = True
-ADAPTERS['postgres'] = HerokuPostgresAdapter
-
def get_db(name = None, pool_size=10):
if not name:
names = [n for n in os.environ.keys()
| {"golden_diff": "diff --git a/gluon/contrib/heroku.py b/gluon/contrib/heroku.py\n--- a/gluon/contrib/heroku.py\n+++ b/gluon/contrib/heroku.py\n@@ -7,15 +7,13 @@\n \"\"\"\n import os\n from gluon import *\n-from pydal.adapters import ADAPTERS, PostgreSQLAdapter\n-from pydal.helpers.classes import UseDatabaseStoredFile\n+from pydal.adapters import adapters, PostgrePsyco\n+from pydal.helpers.classes import DatabaseStoredFile\n \n-class HerokuPostgresAdapter(UseDatabaseStoredFile,PostgreSQLAdapter):\n- drivers = ('psycopg2',)\[email protected]_for('postgres')\n+class HerokuPostgresAdapter(DatabaseStoredFile, PostgrePsyco):\n uploads_in_blob = True\n \n-ADAPTERS['postgres'] = HerokuPostgresAdapter\n-\n def get_db(name = None, pool_size=10):\n if not name:\n names = [n for n in os.environ.keys()\n", "issue": "heroku ADAPTERS error\nI'm looking to use Heroku for deployment of my web2py project. However, when I add \r\n`from gluon.contrib.heroku import get_db`\r\n`db = get_db(name=None, pool_size=myconf.get('db.pool_size'))`\r\n\r\nI get a ticket with the error:\r\n\r\n> File \"/Users/huangyu/dev/web2py/gluon/contrib/heroku.py\", line 10, in <module>\r\n from pydal.adapters import ADAPTERS, PostgreSQLAdapter\r\nImportError: cannot import name ADAPTERS\r\n\r\nIt looks like web2py has moved on from using ADAPTERS? Has that been replaced by \r\n`@adapters.register_for('postgres')`\r\n\r\nBut the heroku file has not been updated. \n", "before_files": [{"content": "\"\"\"\nUsage: in web2py models/db.py\n\nfrom gluon.contrib.heroku import get_db\ndb = get_db()\n\n\"\"\"\nimport os\nfrom gluon import *\nfrom pydal.adapters import ADAPTERS, PostgreSQLAdapter\nfrom pydal.helpers.classes import UseDatabaseStoredFile\n\nclass HerokuPostgresAdapter(UseDatabaseStoredFile,PostgreSQLAdapter):\n drivers = ('psycopg2',)\n uploads_in_blob = True\n\nADAPTERS['postgres'] = HerokuPostgresAdapter\n\ndef get_db(name = None, pool_size=10):\n if not name:\n names = [n for n in os.environ.keys()\n if n[:18]+n[-4:]=='HEROKU_POSTGRESQL__URL']\n if names:\n name = names[0]\n if name:\n db = DAL(os.environ[name], pool_size=pool_size)\n current.session.connect(current.request, current.response, db=db)\n else:\n db = DAL('sqlite://heroku.test.sqlite')\n return db\n", "path": "gluon/contrib/heroku.py"}], "after_files": [{"content": "\"\"\"\nUsage: in web2py models/db.py\n\nfrom gluon.contrib.heroku import get_db\ndb = get_db()\n\n\"\"\"\nimport os\nfrom gluon import *\nfrom pydal.adapters import adapters, PostgrePsyco\nfrom pydal.helpers.classes import DatabaseStoredFile\n\[email protected]_for('postgres')\nclass HerokuPostgresAdapter(DatabaseStoredFile, PostgrePsyco):\n uploads_in_blob = True\n\ndef get_db(name = None, pool_size=10):\n if not name:\n names = [n for n in os.environ.keys()\n if n[:18]+n[-4:]=='HEROKU_POSTGRESQL__URL']\n if names:\n name = names[0]\n if name:\n db = DAL(os.environ[name], pool_size=pool_size)\n current.session.connect(current.request, current.response, db=db)\n else:\n db = DAL('sqlite://heroku.test.sqlite')\n return db\n", "path": "gluon/contrib/heroku.py"}]} | 695 | 214 |
gh_patches_debug_34290 | rasdani/github-patches | git_diff | open-telemetry__opentelemetry-python-306 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Switch unit tests from `unittest.mock` to SDK & in-memory exporter
See https://github.com/open-telemetry/opentelemetry-python/pull/290#issuecomment-558091283.
Currently tests are cumbersome to write, and we probably don't want to test which API calls are made but rather what Spans would result in most cases. For this, an SDK with an in-memory exporter would be better than using `unittest.mock`.
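A rough sketch of that setup, assuming the SDK names in use at this point in the project's history (`TracerSource` and `set_preferred_tracer_source_implementation` were later renamed):

```python
# Sketch: send spans through the real SDK into an in-memory exporter so a
# test can assert on the resulting spans instead of mocking API calls.
from opentelemetry import trace as trace_api
from opentelemetry.sdk.trace import TracerSource, export
from opentelemetry.sdk.trace.export.in_memory_span_exporter import (
    InMemorySpanExporter,
)

trace_api.set_preferred_tracer_source_implementation(lambda T: TracerSource())
memory_exporter = InMemorySpanExporter()
trace_api.tracer_source().add_span_processor(
    export.SimpleExportSpanProcessor(memory_exporter)
)
# ...exercise the instrumented code, then assert on
# memory_exporter.get_finished_spans()
```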
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py`
Content:
```
1 import io
2 import unittest
3 import unittest.mock as mock
4 import wsgiref.util as wsgiref_util
5
6 from opentelemetry import trace as trace_api
7
8
9 class WsgiTestBase(unittest.TestCase):
10 def setUp(self):
11 self.span = mock.create_autospec(trace_api.Span, spec_set=True)
12 tracer = trace_api.Tracer()
13 self.get_tracer_patcher = mock.patch.object(
14 trace_api.TracerSource,
15 "get_tracer",
16 autospec=True,
17 spec_set=True,
18 return_value=tracer,
19 )
20 self.get_tracer_patcher.start()
21
22 self.start_span_patcher = mock.patch.object(
23 tracer,
24 "start_span",
25 autospec=True,
26 spec_set=True,
27 return_value=self.span,
28 )
29 self.start_span = self.start_span_patcher.start()
30 self.write_buffer = io.BytesIO()
31 self.write = self.write_buffer.write
32
33 self.environ = {}
34 wsgiref_util.setup_testing_defaults(self.environ)
35
36 self.status = None
37 self.response_headers = None
38 self.exc_info = None
39
40 def tearDown(self):
41 self.get_tracer_patcher.stop()
42 self.start_span_patcher.stop()
43
44 def start_response(self, status, response_headers, exc_info=None):
45 self.status = status
46 self.response_headers = response_headers
47 self.exc_info = exc_info
48 return self.write
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py b/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py
--- a/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py
+++ b/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py
@@ -1,32 +1,38 @@
import io
import unittest
-import unittest.mock as mock
import wsgiref.util as wsgiref_util
+from importlib import reload
from opentelemetry import trace as trace_api
+from opentelemetry.sdk.trace import TracerSource, export
+from opentelemetry.sdk.trace.export.in_memory_span_exporter import (
+ InMemorySpanExporter,
+)
+
+_MEMORY_EXPORTER = None
class WsgiTestBase(unittest.TestCase):
- def setUp(self):
- self.span = mock.create_autospec(trace_api.Span, spec_set=True)
- tracer = trace_api.Tracer()
- self.get_tracer_patcher = mock.patch.object(
- trace_api.TracerSource,
- "get_tracer",
- autospec=True,
- spec_set=True,
- return_value=tracer,
- )
- self.get_tracer_patcher.start()
-
- self.start_span_patcher = mock.patch.object(
- tracer,
- "start_span",
- autospec=True,
- spec_set=True,
- return_value=self.span,
+ @classmethod
+ def setUpClass(cls):
+ global _MEMORY_EXPORTER # pylint:disable=global-statement
+ trace_api.set_preferred_tracer_source_implementation(
+ lambda T: TracerSource()
)
- self.start_span = self.start_span_patcher.start()
+ tracer_source = trace_api.tracer_source()
+ _MEMORY_EXPORTER = InMemorySpanExporter()
+ span_processor = export.SimpleExportSpanProcessor(_MEMORY_EXPORTER)
+ tracer_source.add_span_processor(span_processor)
+
+ @classmethod
+ def tearDownClass(cls):
+ reload(trace_api)
+
+ def setUp(self):
+
+ self.memory_exporter = _MEMORY_EXPORTER
+ self.memory_exporter.clear()
+
self.write_buffer = io.BytesIO()
self.write = self.write_buffer.write
@@ -37,10 +43,6 @@
self.response_headers = None
self.exc_info = None
- def tearDown(self):
- self.get_tracer_patcher.stop()
- self.start_span_patcher.stop()
-
def start_response(self, status, response_headers, exc_info=None):
self.status = status
self.response_headers = response_headers
| {"golden_diff": "diff --git a/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py b/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py\n--- a/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py\n+++ b/ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py\n@@ -1,32 +1,38 @@\n import io\n import unittest\n-import unittest.mock as mock\n import wsgiref.util as wsgiref_util\n+from importlib import reload\n \n from opentelemetry import trace as trace_api\n+from opentelemetry.sdk.trace import TracerSource, export\n+from opentelemetry.sdk.trace.export.in_memory_span_exporter import (\n+ InMemorySpanExporter,\n+)\n+\n+_MEMORY_EXPORTER = None\n \n \n class WsgiTestBase(unittest.TestCase):\n- def setUp(self):\n- self.span = mock.create_autospec(trace_api.Span, spec_set=True)\n- tracer = trace_api.Tracer()\n- self.get_tracer_patcher = mock.patch.object(\n- trace_api.TracerSource,\n- \"get_tracer\",\n- autospec=True,\n- spec_set=True,\n- return_value=tracer,\n- )\n- self.get_tracer_patcher.start()\n-\n- self.start_span_patcher = mock.patch.object(\n- tracer,\n- \"start_span\",\n- autospec=True,\n- spec_set=True,\n- return_value=self.span,\n+ @classmethod\n+ def setUpClass(cls):\n+ global _MEMORY_EXPORTER # pylint:disable=global-statement\n+ trace_api.set_preferred_tracer_source_implementation(\n+ lambda T: TracerSource()\n )\n- self.start_span = self.start_span_patcher.start()\n+ tracer_source = trace_api.tracer_source()\n+ _MEMORY_EXPORTER = InMemorySpanExporter()\n+ span_processor = export.SimpleExportSpanProcessor(_MEMORY_EXPORTER)\n+ tracer_source.add_span_processor(span_processor)\n+\n+ @classmethod\n+ def tearDownClass(cls):\n+ reload(trace_api)\n+\n+ def setUp(self):\n+\n+ self.memory_exporter = _MEMORY_EXPORTER\n+ self.memory_exporter.clear()\n+\n self.write_buffer = io.BytesIO()\n self.write = self.write_buffer.write\n \n@@ -37,10 +43,6 @@\n self.response_headers = None\n self.exc_info = None\n \n- def tearDown(self):\n- self.get_tracer_patcher.stop()\n- self.start_span_patcher.stop()\n-\n def start_response(self, status, response_headers, exc_info=None):\n self.status = status\n self.response_headers = response_headers\n", "issue": "Switch unit tests from `unittest.mock` to SDK & in-memory exporter\nSee https://github.com/open-telemetry/opentelemetry-python/pull/290#issuecomment-558091283.\r\nCurrently tests are cumbersome to write and actually we probably don't want to test which API calls are made but what Spans would result in most cases. 
For this a SDK with in-memory exporter would be better than using `unittest.mock`.\n", "before_files": [{"content": "import io\nimport unittest\nimport unittest.mock as mock\nimport wsgiref.util as wsgiref_util\n\nfrom opentelemetry import trace as trace_api\n\n\nclass WsgiTestBase(unittest.TestCase):\n def setUp(self):\n self.span = mock.create_autospec(trace_api.Span, spec_set=True)\n tracer = trace_api.Tracer()\n self.get_tracer_patcher = mock.patch.object(\n trace_api.TracerSource,\n \"get_tracer\",\n autospec=True,\n spec_set=True,\n return_value=tracer,\n )\n self.get_tracer_patcher.start()\n\n self.start_span_patcher = mock.patch.object(\n tracer,\n \"start_span\",\n autospec=True,\n spec_set=True,\n return_value=self.span,\n )\n self.start_span = self.start_span_patcher.start()\n self.write_buffer = io.BytesIO()\n self.write = self.write_buffer.write\n\n self.environ = {}\n wsgiref_util.setup_testing_defaults(self.environ)\n\n self.status = None\n self.response_headers = None\n self.exc_info = None\n\n def tearDown(self):\n self.get_tracer_patcher.stop()\n self.start_span_patcher.stop()\n\n def start_response(self, status, response_headers, exc_info=None):\n self.status = status\n self.response_headers = response_headers\n self.exc_info = exc_info\n return self.write\n", "path": "ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py"}], "after_files": [{"content": "import io\nimport unittest\nimport wsgiref.util as wsgiref_util\nfrom importlib import reload\n\nfrom opentelemetry import trace as trace_api\nfrom opentelemetry.sdk.trace import TracerSource, export\nfrom opentelemetry.sdk.trace.export.in_memory_span_exporter import (\n InMemorySpanExporter,\n)\n\n_MEMORY_EXPORTER = None\n\n\nclass WsgiTestBase(unittest.TestCase):\n @classmethod\n def setUpClass(cls):\n global _MEMORY_EXPORTER # pylint:disable=global-statement\n trace_api.set_preferred_tracer_source_implementation(\n lambda T: TracerSource()\n )\n tracer_source = trace_api.tracer_source()\n _MEMORY_EXPORTER = InMemorySpanExporter()\n span_processor = export.SimpleExportSpanProcessor(_MEMORY_EXPORTER)\n tracer_source.add_span_processor(span_processor)\n\n @classmethod\n def tearDownClass(cls):\n reload(trace_api)\n\n def setUp(self):\n\n self.memory_exporter = _MEMORY_EXPORTER\n self.memory_exporter.clear()\n\n self.write_buffer = io.BytesIO()\n self.write = self.write_buffer.write\n\n self.environ = {}\n wsgiref_util.setup_testing_defaults(self.environ)\n\n self.status = None\n self.response_headers = None\n self.exc_info = None\n\n def start_response(self, status, response_headers, exc_info=None):\n self.status = status\n self.response_headers = response_headers\n self.exc_info = exc_info\n return self.write\n", "path": "ext/opentelemetry-ext-testutil/src/opentelemetry/ext/testutil/wsgitestutil.py"}]} | 757 | 593 |
gh_patches_debug_7939 | rasdani/github-patches | git_diff | mozilla__bugbug-3401 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Refactor logging statements to use lazy % formatting
Example of logging statements that we want to refactor:
https://github.com/mozilla/bugbug/blob/d53595391dbd75379bb49bff12dee4821e4b956c/bugbug/github.py#L61
https://github.com/mozilla/bugbug/blob/69972a1684f788319bf5c2944bbe8eeb79428c7d/scripts/regressor_finder.py#L396
More details can be found in the [pylint docs](https://pylint.readthedocs.io/en/latest/user_guide/messages/warning/logging-fstring-interpolation.html).
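For illustration, the before/after shape of the change (matching the fix below; `file_path` is just a stand-in variable):

```python
import logging

logger = logging.getLogger(__name__)
file_path = "/tmp/metrics.json"  # stand-in value for the example

# Eager: the f-string is rendered even if INFO records are filtered out.
logger.info(f"Metrics saved to {file_path!r}")
# Lazy: %-interpolation is deferred until the record is actually emitted.
logger.info("Metrics saved to %r", file_path)
```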
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scripts/retrieve_training_metrics.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # This Source Code Form is subject to the terms of the Mozilla Public
3 # License, v. 2.0. If a copy of the MPL was not distributed with this file,
4 # You can obtain one at http://mozilla.org/MPL/2.0/.
5
6 import argparse
7 import logging
8 import os
9 import sys
10 from os.path import abspath, join
11
12 import requests
13 import taskcluster
14
15 from bugbug.utils import get_taskcluster_options
16
17 ROOT_URI = "train_{}.per_date"
18 DATE_URI = "train_{}.per_date.{}"
19 BASE_URL = "https://community-tc.services.mozilla.com/api/index/v1/task/{}/artifacts/public/metrics.json"
20 NAMESPACE_URI = "project.bugbug.{}"
21
22 LOGGER = logging.getLogger(__name__)
23
24 logging.basicConfig(level=logging.INFO)
25
26
27 def get_task_metrics_from_uri(index_uri):
28 index_url = BASE_URL.format(index_uri)
29 LOGGER.info("Retrieving metrics from %s", index_url)
30 r = requests.get(index_url)
31
32 if r.status_code == 404:
33 LOGGER.error(f"File not found for URL {index_url}, check your arguments")
34 sys.exit(1)
35
36 r.raise_for_status()
37
38 return r
39
40
41 def get_namespaces(index, index_uri):
42 index_namespaces = index.listNamespaces(index_uri)
43
44 return index_namespaces["namespaces"]
45
46
47 def is_later_or_equal(partial_date, from_date):
48 for partial_date_part, from_date_part in zip(partial_date, from_date):
49 if int(partial_date_part) > int(from_date_part):
50 return True
51 elif int(partial_date_part) < int(from_date_part):
52 return False
53 else:
54 continue
55
56 return True
57
58
59 def get_task_metrics_from_date(model, date, output_directory):
60 options = get_taskcluster_options()
61
62 index = taskcluster.Index(options)
63
64 index.ping()
65
66 # Split the date
67 from_date = date.split(".")
68
69 namespaces = []
70
71 # Start at the root level
72 # We need an empty list in order to append namespaces part to it
73 namespaces.append([])
74
75 # Recursively list all namespaces greater or equals than the given date
76 while namespaces:
77 current_ns = namespaces.pop()
78
79 # Handle version level namespaces
80 if not current_ns:
81 ns_uri = ROOT_URI.format(model)
82 else:
83 current_ns_date = ".".join(current_ns)
84 ns_uri = DATE_URI.format(model, current_ns_date)
85
86 ns_full_uri = NAMESPACE_URI.format(ns_uri)
87
88 tasks = index.listTasks(ns_full_uri)
89 for task in tasks["tasks"]:
90 task_uri = task["namespace"]
91 r = get_task_metrics_from_uri(task_uri)
92
93 # Write the file on disk
94 file_name = f"metric_{'_'.join(task_uri.split('.'))}.json"
95 file_path = abspath(join(output_directory, file_name))
96 with open(file_path, "w") as metric_file:
97 metric_file.write(r.text)
98 LOGGER.info(f"Metrics saved to {file_path!r}")
99
100 for namespace in get_namespaces(index, ns_full_uri):
101 new_ns = current_ns.copy()
102 new_ns.append(namespace["name"])
103
104 if not is_later_or_equal(new_ns, from_date):
105 LOGGER.debug("NEW namespace %s is before %s", new_ns, from_date)
106 continue
107
108 # Might not be efficient but size of `namespaces` shouldn't be too
109 # big as we are doing a depth-first traversal
110 if new_ns not in namespaces:
111 namespaces.append(new_ns)
112
113
114 def main():
115 description = "Retrieve a model training metrics"
116 parser = argparse.ArgumentParser(description=description)
117
118 parser.add_argument(
119 "-d",
120 "--output-directory",
121 default=os.getcwd(),
122 help="In which directory the script should save the metrics file. The directory must exists",
123 )
124 parser.add_argument("model", help="Which model to retrieve training metrics from.")
125 parser.add_argument(
126 "date",
127 nargs="?",
128 help="Which date should we retrieve training metrics from. Default to latest",
129 )
130
131 args = parser.parse_args()
132
133 get_task_metrics_from_date(args.model, args.date, args.output_directory)
134
135
136 if __name__ == "__main__":
137 main()
138
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scripts/retrieve_training_metrics.py b/scripts/retrieve_training_metrics.py
--- a/scripts/retrieve_training_metrics.py
+++ b/scripts/retrieve_training_metrics.py
@@ -95,7 +95,7 @@
file_path = abspath(join(output_directory, file_name))
with open(file_path, "w") as metric_file:
metric_file.write(r.text)
- LOGGER.info(f"Metrics saved to {file_path!r}")
+ LOGGER.info("Metrics saved to %r", file_path)
for namespace in get_namespaces(index, ns_full_uri):
new_ns = current_ns.copy()
| {"golden_diff": "diff --git a/scripts/retrieve_training_metrics.py b/scripts/retrieve_training_metrics.py\n--- a/scripts/retrieve_training_metrics.py\n+++ b/scripts/retrieve_training_metrics.py\n@@ -95,7 +95,7 @@\n file_path = abspath(join(output_directory, file_name))\n with open(file_path, \"w\") as metric_file:\n metric_file.write(r.text)\n- LOGGER.info(f\"Metrics saved to {file_path!r}\")\n+ LOGGER.info(\"Metrics saved to %r\", file_path)\n \n for namespace in get_namespaces(index, ns_full_uri):\n new_ns = current_ns.copy()\n", "issue": "Refactor logging statements to use lazy % formatting\nExample of logging statements that we want to refactor:\r\n\r\nhttps://github.com/mozilla/bugbug/blob/d53595391dbd75379bb49bff12dee4821e4b956c/bugbug/github.py#L61\r\n\r\nhttps://github.com/mozilla/bugbug/blob/69972a1684f788319bf5c2944bbe8eeb79428c7d/scripts/regressor_finder.py#L396\r\n\r\nMore details can be found in the [pylint docs](https://pylint.readthedocs.io/en/latest/user_guide/messages/warning/logging-fstring-interpolation.html).\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport argparse\nimport logging\nimport os\nimport sys\nfrom os.path import abspath, join\n\nimport requests\nimport taskcluster\n\nfrom bugbug.utils import get_taskcluster_options\n\nROOT_URI = \"train_{}.per_date\"\nDATE_URI = \"train_{}.per_date.{}\"\nBASE_URL = \"https://community-tc.services.mozilla.com/api/index/v1/task/{}/artifacts/public/metrics.json\"\nNAMESPACE_URI = \"project.bugbug.{}\"\n\nLOGGER = logging.getLogger(__name__)\n\nlogging.basicConfig(level=logging.INFO)\n\n\ndef get_task_metrics_from_uri(index_uri):\n index_url = BASE_URL.format(index_uri)\n LOGGER.info(\"Retrieving metrics from %s\", index_url)\n r = requests.get(index_url)\n\n if r.status_code == 404:\n LOGGER.error(f\"File not found for URL {index_url}, check your arguments\")\n sys.exit(1)\n\n r.raise_for_status()\n\n return r\n\n\ndef get_namespaces(index, index_uri):\n index_namespaces = index.listNamespaces(index_uri)\n\n return index_namespaces[\"namespaces\"]\n\n\ndef is_later_or_equal(partial_date, from_date):\n for partial_date_part, from_date_part in zip(partial_date, from_date):\n if int(partial_date_part) > int(from_date_part):\n return True\n elif int(partial_date_part) < int(from_date_part):\n return False\n else:\n continue\n\n return True\n\n\ndef get_task_metrics_from_date(model, date, output_directory):\n options = get_taskcluster_options()\n\n index = taskcluster.Index(options)\n\n index.ping()\n\n # Split the date\n from_date = date.split(\".\")\n\n namespaces = []\n\n # Start at the root level\n # We need an empty list in order to append namespaces part to it\n namespaces.append([])\n\n # Recursively list all namespaces greater or equals than the given date\n while namespaces:\n current_ns = namespaces.pop()\n\n # Handle version level namespaces\n if not current_ns:\n ns_uri = ROOT_URI.format(model)\n else:\n current_ns_date = \".\".join(current_ns)\n ns_uri = DATE_URI.format(model, current_ns_date)\n\n ns_full_uri = NAMESPACE_URI.format(ns_uri)\n\n tasks = index.listTasks(ns_full_uri)\n for task in tasks[\"tasks\"]:\n task_uri = task[\"namespace\"]\n r = get_task_metrics_from_uri(task_uri)\n\n # Write the file on disk\n file_name = f\"metric_{'_'.join(task_uri.split('.'))}.json\"\n file_path = 
abspath(join(output_directory, file_name))\n with open(file_path, \"w\") as metric_file:\n metric_file.write(r.text)\n LOGGER.info(f\"Metrics saved to {file_path!r}\")\n\n for namespace in get_namespaces(index, ns_full_uri):\n new_ns = current_ns.copy()\n new_ns.append(namespace[\"name\"])\n\n if not is_later_or_equal(new_ns, from_date):\n LOGGER.debug(\"NEW namespace %s is before %s\", new_ns, from_date)\n continue\n\n # Might not be efficient but size of `namespaces` shouldn't be too\n # big as we are doing a depth-first traversal\n if new_ns not in namespaces:\n namespaces.append(new_ns)\n\n\ndef main():\n description = \"Retrieve a model training metrics\"\n parser = argparse.ArgumentParser(description=description)\n\n parser.add_argument(\n \"-d\",\n \"--output-directory\",\n default=os.getcwd(),\n help=\"In which directory the script should save the metrics file. The directory must exists\",\n )\n parser.add_argument(\"model\", help=\"Which model to retrieve training metrics from.\")\n parser.add_argument(\n \"date\",\n nargs=\"?\",\n help=\"Which date should we retrieve training metrics from. Default to latest\",\n )\n\n args = parser.parse_args()\n\n get_task_metrics_from_date(args.model, args.date, args.output_directory)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "scripts/retrieve_training_metrics.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport argparse\nimport logging\nimport os\nimport sys\nfrom os.path import abspath, join\n\nimport requests\nimport taskcluster\n\nfrom bugbug.utils import get_taskcluster_options\n\nROOT_URI = \"train_{}.per_date\"\nDATE_URI = \"train_{}.per_date.{}\"\nBASE_URL = \"https://community-tc.services.mozilla.com/api/index/v1/task/{}/artifacts/public/metrics.json\"\nNAMESPACE_URI = \"project.bugbug.{}\"\n\nLOGGER = logging.getLogger(__name__)\n\nlogging.basicConfig(level=logging.INFO)\n\n\ndef get_task_metrics_from_uri(index_uri):\n index_url = BASE_URL.format(index_uri)\n LOGGER.info(\"Retrieving metrics from %s\", index_url)\n r = requests.get(index_url)\n\n if r.status_code == 404:\n LOGGER.error(f\"File not found for URL {index_url}, check your arguments\")\n sys.exit(1)\n\n r.raise_for_status()\n\n return r\n\n\ndef get_namespaces(index, index_uri):\n index_namespaces = index.listNamespaces(index_uri)\n\n return index_namespaces[\"namespaces\"]\n\n\ndef is_later_or_equal(partial_date, from_date):\n for partial_date_part, from_date_part in zip(partial_date, from_date):\n if int(partial_date_part) > int(from_date_part):\n return True\n elif int(partial_date_part) < int(from_date_part):\n return False\n else:\n continue\n\n return True\n\n\ndef get_task_metrics_from_date(model, date, output_directory):\n options = get_taskcluster_options()\n\n index = taskcluster.Index(options)\n\n index.ping()\n\n # Split the date\n from_date = date.split(\".\")\n\n namespaces = []\n\n # Start at the root level\n # We need an empty list in order to append namespaces part to it\n namespaces.append([])\n\n # Recursively list all namespaces greater or equals than the given date\n while namespaces:\n current_ns = namespaces.pop()\n\n # Handle version level namespaces\n if not current_ns:\n ns_uri = ROOT_URI.format(model)\n else:\n current_ns_date = \".\".join(current_ns)\n ns_uri = DATE_URI.format(model, current_ns_date)\n\n ns_full_uri = 
NAMESPACE_URI.format(ns_uri)\n\n tasks = index.listTasks(ns_full_uri)\n for task in tasks[\"tasks\"]:\n task_uri = task[\"namespace\"]\n r = get_task_metrics_from_uri(task_uri)\n\n # Write the file on disk\n file_name = f\"metric_{'_'.join(task_uri.split('.'))}.json\"\n file_path = abspath(join(output_directory, file_name))\n with open(file_path, \"w\") as metric_file:\n metric_file.write(r.text)\n LOGGER.info(\"Metrics saved to %r\", file_path)\n\n for namespace in get_namespaces(index, ns_full_uri):\n new_ns = current_ns.copy()\n new_ns.append(namespace[\"name\"])\n\n if not is_later_or_equal(new_ns, from_date):\n LOGGER.debug(\"NEW namespace %s is before %s\", new_ns, from_date)\n continue\n\n # Might not be efficient but size of `namespaces` shouldn't be too\n # big as we are doing a depth-first traversal\n if new_ns not in namespaces:\n namespaces.append(new_ns)\n\n\ndef main():\n description = \"Retrieve a model training metrics\"\n parser = argparse.ArgumentParser(description=description)\n\n parser.add_argument(\n \"-d\",\n \"--output-directory\",\n default=os.getcwd(),\n help=\"In which directory the script should save the metrics file. The directory must exists\",\n )\n parser.add_argument(\"model\", help=\"Which model to retrieve training metrics from.\")\n parser.add_argument(\n \"date\",\n nargs=\"?\",\n help=\"Which date should we retrieve training metrics from. Default to latest\",\n )\n\n args = parser.parse_args()\n\n get_task_metrics_from_date(args.model, args.date, args.output_directory)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "scripts/retrieve_training_metrics.py"}]} | 1,668 | 132 |
gh_patches_debug_29495 | rasdani/github-patches | git_diff | bridgecrewio__checkov-1215 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
False positive for CKV_AWS_143 ("Ensure S3 bucket has lock configuration enabled by default")
**Describe the bug**
Checkov expects argument `object_lock_configuration` to be an object, i.e.
```hcl
object_lock_configuration = {
object_lock_enabled = "Enabled"
}
```
Terraform works with the above configuration, but when also declaring rules for the object lock configuration, it expects a block instead, e.g.
```hcl
object_lock_configuration {
object_lock_enabled = "Enabled"
rule {
default_retention {
mode = "GOVERNANCE"
days = 366
}
}
}
```
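The two syntaxes also parse differently once Checkov loads the HCL: judging from the fix further down, which accepts both `"Enabled"` and `["Enabled"]`, block attributes arrive wrapped in an extra list. A hypothetical illustration of the parsed shapes:

```python
# Hypothetical shapes of the parsed resource config (inferred from the fix,
# not taken from Checkov's parser docs):
conf_from_object = {"object_lock_configuration": [{"object_lock_enabled": "Enabled"}]}
conf_from_block = {"object_lock_configuration": [{"object_lock_enabled": ["Enabled"]}]}
```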
**Expected behavior**
Checkov should pass for an `object_lock_configuration` argument block.
**Desktop (please complete the following information):**
- OS: macOS Big Sur 11.3.1
- Checkov Version: 2.0.135
- Terraform version: v0.14.8
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/aws/S3BucketObjectLock.py`
Content:
```
1 from checkov.common.models.enums import CheckCategories, CheckResult
2 from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck
3
4
5 class S3BucketObjectLock(BaseResourceCheck):
6 def __init__(self):
7 name = "Ensure that S3 bucket has lock configuration enabled by default"
8 id = "CKV_AWS_143"
9 supported_resources = ['aws_s3_bucket']
10 categories = [CheckCategories.GENERAL_SECURITY]
11 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
12
13 def scan_resource_conf(self, conf):
14 if 'object_lock_configuration' in conf:
15 if 'object_lock_enabled' in conf['object_lock_configuration'][0]:
16 lock = conf['object_lock_configuration'][0]['object_lock_enabled']
17 if lock == "Enabled":
18 return CheckResult.PASSED
19 else:
20 return CheckResult.FAILED
21 else:
22 return CheckResult.PASSED
23
24
25 check = S3BucketObjectLock()
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py b/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py
--- a/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py
+++ b/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py
@@ -1,25 +1,26 @@
+from typing import Dict, List, Any
+
from checkov.common.models.enums import CheckCategories, CheckResult
from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck
class S3BucketObjectLock(BaseResourceCheck):
- def __init__(self):
+ def __init__(self) -> None:
name = "Ensure that S3 bucket has lock configuration enabled by default"
id = "CKV_AWS_143"
- supported_resources = ['aws_s3_bucket']
+ supported_resources = ["aws_s3_bucket"]
categories = [CheckCategories.GENERAL_SECURITY]
super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
- def scan_resource_conf(self, conf):
- if 'object_lock_configuration' in conf:
- if 'object_lock_enabled' in conf['object_lock_configuration'][0]:
- lock = conf['object_lock_configuration'][0]['object_lock_enabled']
- if lock == "Enabled":
- return CheckResult.PASSED
- else:
- return CheckResult.FAILED
- else:
- return CheckResult.PASSED
+ def scan_resource_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:
+ lock_conf = conf.get("object_lock_configuration")
+ if lock_conf and lock_conf[0]:
+ lock_enabled = lock_conf[0].get("object_lock_enabled")
+ if lock_enabled in ["Enabled", ["Enabled"]]:
+ return CheckResult.PASSED
+ return CheckResult.FAILED
+
+ return CheckResult.UNKNOWN
check = S3BucketObjectLock()
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py b/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py\n--- a/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py\n+++ b/checkov/terraform/checks/resource/aws/S3BucketObjectLock.py\n@@ -1,25 +1,26 @@\n+from typing import Dict, List, Any\n+\n from checkov.common.models.enums import CheckCategories, CheckResult\n from checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck\n \n \n class S3BucketObjectLock(BaseResourceCheck):\n- def __init__(self):\n+ def __init__(self) -> None:\n name = \"Ensure that S3 bucket has lock configuration enabled by default\"\n id = \"CKV_AWS_143\"\n- supported_resources = ['aws_s3_bucket']\n+ supported_resources = [\"aws_s3_bucket\"]\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n \n- def scan_resource_conf(self, conf):\n- if 'object_lock_configuration' in conf:\n- if 'object_lock_enabled' in conf['object_lock_configuration'][0]:\n- lock = conf['object_lock_configuration'][0]['object_lock_enabled']\n- if lock == \"Enabled\":\n- return CheckResult.PASSED\n- else:\n- return CheckResult.FAILED\n- else:\n- return CheckResult.PASSED\n+ def scan_resource_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:\n+ lock_conf = conf.get(\"object_lock_configuration\")\n+ if lock_conf and lock_conf[0]:\n+ lock_enabled = lock_conf[0].get(\"object_lock_enabled\")\n+ if lock_enabled in [\"Enabled\", [\"Enabled\"]]:\n+ return CheckResult.PASSED\n+ return CheckResult.FAILED\n+\n+ return CheckResult.UNKNOWN\n \n \n check = S3BucketObjectLock()\n", "issue": "False positive for CKV_AWS_143 (\"Ensure S3 bucket has lock configuration enabled by default\")\n**Describe the bug**\r\n\r\nCheckov expects argument `object_lock_configuration` to be an object, i.e.\r\n\r\n```hcl\r\nobject_lock_configuration = {\r\n object_lock_enabled = \"Enabled\"\r\n}\r\n```\r\n\r\nTerraform works with the above configuration, but when also declaring rules for the object lock configuration, it expects a block instead, e.g.\r\n```hcl\r\nobject_lock_configuration {\r\n object_lock_enabled = \"Enabled\"\r\n\r\n rule {\r\n default_retention {\r\n mode = \"GOVERNANCE\"\r\n days = 366\r\n }\r\n }\r\n}\r\n```\r\n\r\n**Expected behavior**\r\nCheckov should pass for a `object_lock_configuration` argument block.\r\n\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: macOS Big Sur 11.3.1\r\n - Checkov Version: 2.0.135\r\n - Terraform version: v0.14.8\r\n\r\n\n", "before_files": [{"content": "from checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck\n\n\nclass S3BucketObjectLock(BaseResourceCheck):\n def __init__(self):\n name = \"Ensure that S3 bucket has lock configuration enabled by default\"\n id = \"CKV_AWS_143\"\n supported_resources = ['aws_s3_bucket']\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf):\n if 'object_lock_configuration' in conf:\n if 'object_lock_enabled' in conf['object_lock_configuration'][0]:\n lock = conf['object_lock_configuration'][0]['object_lock_enabled']\n if lock == \"Enabled\":\n return CheckResult.PASSED\n else:\n return CheckResult.FAILED\n else:\n return CheckResult.PASSED\n\n\ncheck = 
S3BucketObjectLock()\n", "path": "checkov/terraform/checks/resource/aws/S3BucketObjectLock.py"}], "after_files": [{"content": "from typing import Dict, List, Any\n\nfrom checkov.common.models.enums import CheckCategories, CheckResult\nfrom checkov.terraform.checks.resource.base_resource_value_check import BaseResourceCheck\n\n\nclass S3BucketObjectLock(BaseResourceCheck):\n def __init__(self) -> None:\n name = \"Ensure that S3 bucket has lock configuration enabled by default\"\n id = \"CKV_AWS_143\"\n supported_resources = [\"aws_s3_bucket\"]\n categories = [CheckCategories.GENERAL_SECURITY]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:\n lock_conf = conf.get(\"object_lock_configuration\")\n if lock_conf and lock_conf[0]:\n lock_enabled = lock_conf[0].get(\"object_lock_enabled\")\n if lock_enabled in [\"Enabled\", [\"Enabled\"]]:\n return CheckResult.PASSED\n return CheckResult.FAILED\n\n return CheckResult.UNKNOWN\n\n\ncheck = S3BucketObjectLock()\n", "path": "checkov/terraform/checks/resource/aws/S3BucketObjectLock.py"}]} | 743 | 439 |
gh_patches_debug_15777 | rasdani/github-patches | git_diff | python-telegram-bot__python-telegram-bot-1112 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tests missing for `User.mention_markdown` and `User.mention_html`
And while we're at it, maybe `helpers.mention_markdown/html` too.
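A sketch of what such tests could look like; the expected strings follow directly from the `mention_html`/`mention_markdown` implementations shown below (pytest-style, function names invented):

```python
from telegram.utils import helpers

def test_mention_html():
    expected = '<a href="tg://user?id=1">the name</a>'
    assert helpers.mention_html(1, 'the name') == expected

def test_mention_markdown():
    expected = '[the name](tg://user?id=1)'
    assert helpers.mention_markdown(1, 'the name') == expected
```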
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `telegram/utils/helpers.py`
Content:
```
1 #!/usr/bin/env python
2 #
3 # A library that provides a Python interface to the Telegram Bot API
4 # Copyright (C) 2015-2018
5 # Leandro Toledo de Souza <[email protected]>
6 #
7 # This program is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU Lesser Public License as published by
9 # the Free Software Foundation, either version 3 of the License, or
10 # (at your option) any later version.
11 #
12 # This program is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU Lesser Public License for more details.
16 #
17 # You should have received a copy of the GNU Lesser Public License
18 # along with this program. If not, see [http://www.gnu.org/licenses/].
19 """This module contains helper functions."""
20 from html import escape
21
22 import re
23 import signal
24 from datetime import datetime
25
26 # From https://stackoverflow.com/questions/2549939/get-signal-names-from-numbers-in-python
27 _signames = {v: k
28 for k, v in reversed(sorted(vars(signal).items()))
29 if k.startswith('SIG') and not k.startswith('SIG_')}
30
31
32 def get_signal_name(signum):
33 """Returns the signal name of the given signal number."""
34 return _signames[signum]
35
36
37 # Not using future.backports.datetime here as datetime value might be an input from the user,
38 # making every isinstace() call more delicate. So we just use our own compat layer.
39 if hasattr(datetime, 'timestamp'):
40 # Python 3.3+
41 def _timestamp(dt_obj):
42 return dt_obj.timestamp()
43 else:
44 # Python < 3.3 (incl 2.7)
45 from time import mktime
46
47 def _timestamp(dt_obj):
48 return mktime(dt_obj.timetuple())
49
50
51 def escape_markdown(text):
52 """Helper function to escape telegram markup symbols."""
53 escape_chars = '\*_`\['
54 return re.sub(r'([%s])' % escape_chars, r'\\\1', text)
55
56
57 def to_timestamp(dt_obj):
58 """
59 Args:
60 dt_obj (:class:`datetime.datetime`):
61
62 Returns:
63 int:
64
65 """
66 if not dt_obj:
67 return None
68
69 return int(_timestamp(dt_obj))
70
71
72 def from_timestamp(unixtime):
73 """
74 Args:
75 unixtime (int):
76
77 Returns:
78 datetime.datetime:
79
80 """
81 if not unixtime:
82 return None
83
84 return datetime.fromtimestamp(unixtime)
85
86
87 def mention_html(user_id, name):
88 """
89 Args:
90 user_id (:obj:`int`) The user's id which you want to mention.
91 name (:obj:`str`) The name the mention is showing.
92
93 Returns:
94 :obj:`str`: The inline mention for the user as html.
95 """
96 if isinstance(user_id, int):
97 return '<a href="tg://user?id={}">{}</a>'.format(user_id, escape(name))
98
99
100 def mention_markdown(user_id, name):
101 """
102 Args:
103 user_id (:obj:`int`) The user's id which you want to mention.
104 name (:obj:`str`) The name the mention is showing.
105
106 Returns:
107 :obj:`str`: The inline mention for the user as markdown.
108 """
109 if isinstance(user_id, int):
110 return '[{}](tg://user?id={})'.format(escape_markdown(name), user_id)
111
112
113 def effective_message_type(entity):
114 """
115 Extracts the type of message as a string identifier from a :class:`telegram.Message` or a
116 :class:`telegram.Update`.
117
118 Args:
119 entity (:obj:`Update` | :obj:`Message`) The ``update`` or ``message`` to extract from
120
121 Returns:
122 str: One of ``Message.MESSAGE_TYPES``
123
124 """
125
126 # Importing on file-level yields cyclic Import Errors
127 from telegram import Message
128 from telegram import Update
129
130 if isinstance(entity, Message):
131 message = entity
132 elif isinstance(entity, Update):
133 message = entity.effective_message
134 else:
135 raise TypeError("entity is not Message or Update (got: {})".format(type(entity)))
136
137 for i in Message.MESSAGE_TYPES:
138 if getattr(message, i, None):
139 return i
140
141 return None
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/telegram/utils/helpers.py b/telegram/utils/helpers.py
--- a/telegram/utils/helpers.py
+++ b/telegram/utils/helpers.py
@@ -94,7 +94,7 @@
:obj:`str`: The inline mention for the user as html.
"""
if isinstance(user_id, int):
- return '<a href="tg://user?id={}">{}</a>'.format(user_id, escape(name))
+ return u'<a href="tg://user?id={}">{}</a>'.format(user_id, escape(name))
def mention_markdown(user_id, name):
@@ -107,7 +107,7 @@
:obj:`str`: The inline mention for the user as markdown.
"""
if isinstance(user_id, int):
- return '[{}](tg://user?id={})'.format(escape_markdown(name), user_id)
+ return u'[{}](tg://user?id={})'.format(escape_markdown(name), user_id)
def effective_message_type(entity):
| {"golden_diff": "diff --git a/telegram/utils/helpers.py b/telegram/utils/helpers.py\n--- a/telegram/utils/helpers.py\n+++ b/telegram/utils/helpers.py\n@@ -94,7 +94,7 @@\n :obj:`str`: The inline mention for the user as html.\n \"\"\"\n if isinstance(user_id, int):\n- return '<a href=\"tg://user?id={}\">{}</a>'.format(user_id, escape(name))\n+ return u'<a href=\"tg://user?id={}\">{}</a>'.format(user_id, escape(name))\n \n \n def mention_markdown(user_id, name):\n@@ -107,7 +107,7 @@\n :obj:`str`: The inline mention for the user as markdown.\n \"\"\"\n if isinstance(user_id, int):\n- return '[{}](tg://user?id={})'.format(escape_markdown(name), user_id)\n+ return u'[{}](tg://user?id={})'.format(escape_markdown(name), user_id)\n \n \n def effective_message_type(entity):\n", "issue": "Tests missing for `User.mention_markdown` and `User.mention_html`\nAnd while we're at it. Maybe `helpers.mention_markdown/html` too.\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2018\n# Leandro Toledo de Souza <[email protected]>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains helper functions.\"\"\"\nfrom html import escape\n\nimport re\nimport signal\nfrom datetime import datetime\n\n# From https://stackoverflow.com/questions/2549939/get-signal-names-from-numbers-in-python\n_signames = {v: k\n for k, v in reversed(sorted(vars(signal).items()))\n if k.startswith('SIG') and not k.startswith('SIG_')}\n\n\ndef get_signal_name(signum):\n \"\"\"Returns the signal name of the given signal number.\"\"\"\n return _signames[signum]\n\n\n# Not using future.backports.datetime here as datetime value might be an input from the user,\n# making every isinstace() call more delicate. 
So we just use our own compat layer.\nif hasattr(datetime, 'timestamp'):\n # Python 3.3+\n def _timestamp(dt_obj):\n return dt_obj.timestamp()\nelse:\n # Python < 3.3 (incl 2.7)\n from time import mktime\n\n def _timestamp(dt_obj):\n return mktime(dt_obj.timetuple())\n\n\ndef escape_markdown(text):\n \"\"\"Helper function to escape telegram markup symbols.\"\"\"\n escape_chars = '\\*_`\\['\n return re.sub(r'([%s])' % escape_chars, r'\\\\\\1', text)\n\n\ndef to_timestamp(dt_obj):\n \"\"\"\n Args:\n dt_obj (:class:`datetime.datetime`):\n\n Returns:\n int:\n\n \"\"\"\n if not dt_obj:\n return None\n\n return int(_timestamp(dt_obj))\n\n\ndef from_timestamp(unixtime):\n \"\"\"\n Args:\n unixtime (int):\n\n Returns:\n datetime.datetime:\n\n \"\"\"\n if not unixtime:\n return None\n\n return datetime.fromtimestamp(unixtime)\n\n\ndef mention_html(user_id, name):\n \"\"\"\n Args:\n user_id (:obj:`int`) The user's id which you want to mention.\n name (:obj:`str`) The name the mention is showing.\n\n Returns:\n :obj:`str`: The inline mention for the user as html.\n \"\"\"\n if isinstance(user_id, int):\n return '<a href=\"tg://user?id={}\">{}</a>'.format(user_id, escape(name))\n\n\ndef mention_markdown(user_id, name):\n \"\"\"\n Args:\n user_id (:obj:`int`) The user's id which you want to mention.\n name (:obj:`str`) The name the mention is showing.\n\n Returns:\n :obj:`str`: The inline mention for the user as markdown.\n \"\"\"\n if isinstance(user_id, int):\n return '[{}](tg://user?id={})'.format(escape_markdown(name), user_id)\n\n\ndef effective_message_type(entity):\n \"\"\"\n Extracts the type of message as a string identifier from a :class:`telegram.Message` or a\n :class:`telegram.Update`.\n\n Args:\n entity (:obj:`Update` | :obj:`Message`) The ``update`` or ``message`` to extract from\n\n Returns:\n str: One of ``Message.MESSAGE_TYPES``\n\n \"\"\"\n\n # Importing on file-level yields cyclic Import Errors\n from telegram import Message\n from telegram import Update\n\n if isinstance(entity, Message):\n message = entity\n elif isinstance(entity, Update):\n message = entity.effective_message\n else:\n raise TypeError(\"entity is not Message or Update (got: {})\".format(type(entity)))\n\n for i in Message.MESSAGE_TYPES:\n if getattr(message, i, None):\n return i\n\n return None\n", "path": "telegram/utils/helpers.py"}], "after_files": [{"content": "#!/usr/bin/env python\n#\n# A library that provides a Python interface to the Telegram Bot API\n# Copyright (C) 2015-2018\n# Leandro Toledo de Souza <[email protected]>\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Lesser Public License as published by\n# the Free Software Foundation, either version 3 of the License, or\n# (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Lesser Public License for more details.\n#\n# You should have received a copy of the GNU Lesser Public License\n# along with this program. 
If not, see [http://www.gnu.org/licenses/].\n\"\"\"This module contains helper functions.\"\"\"\nfrom html import escape\n\nimport re\nimport signal\nfrom datetime import datetime\n\n# From https://stackoverflow.com/questions/2549939/get-signal-names-from-numbers-in-python\n_signames = {v: k\n for k, v in reversed(sorted(vars(signal).items()))\n if k.startswith('SIG') and not k.startswith('SIG_')}\n\n\ndef get_signal_name(signum):\n \"\"\"Returns the signal name of the given signal number.\"\"\"\n return _signames[signum]\n\n\n# Not using future.backports.datetime here as datetime value might be an input from the user,\n# making every isinstace() call more delicate. So we just use our own compat layer.\nif hasattr(datetime, 'timestamp'):\n # Python 3.3+\n def _timestamp(dt_obj):\n return dt_obj.timestamp()\nelse:\n # Python < 3.3 (incl 2.7)\n from time import mktime\n\n def _timestamp(dt_obj):\n return mktime(dt_obj.timetuple())\n\n\ndef escape_markdown(text):\n \"\"\"Helper function to escape telegram markup symbols.\"\"\"\n escape_chars = '\\*_`\\['\n return re.sub(r'([%s])' % escape_chars, r'\\\\\\1', text)\n\n\ndef to_timestamp(dt_obj):\n \"\"\"\n Args:\n dt_obj (:class:`datetime.datetime`):\n\n Returns:\n int:\n\n \"\"\"\n if not dt_obj:\n return None\n\n return int(_timestamp(dt_obj))\n\n\ndef from_timestamp(unixtime):\n \"\"\"\n Args:\n unixtime (int):\n\n Returns:\n datetime.datetime:\n\n \"\"\"\n if not unixtime:\n return None\n\n return datetime.fromtimestamp(unixtime)\n\n\ndef mention_html(user_id, name):\n \"\"\"\n Args:\n user_id (:obj:`int`) The user's id which you want to mention.\n name (:obj:`str`) The name the mention is showing.\n\n Returns:\n :obj:`str`: The inline mention for the user as html.\n \"\"\"\n if isinstance(user_id, int):\n return u'<a href=\"tg://user?id={}\">{}</a>'.format(user_id, escape(name))\n\n\ndef mention_markdown(user_id, name):\n \"\"\"\n Args:\n user_id (:obj:`int`) The user's id which you want to mention.\n name (:obj:`str`) The name the mention is showing.\n\n Returns:\n :obj:`str`: The inline mention for the user as markdown.\n \"\"\"\n if isinstance(user_id, int):\n return u'[{}](tg://user?id={})'.format(escape_markdown(name), user_id)\n\n\ndef effective_message_type(entity):\n \"\"\"\n Extracts the type of message as a string identifier from a :class:`telegram.Message` or a\n :class:`telegram.Update`.\n\n Args:\n entity (:obj:`Update` | :obj:`Message`) The ``update`` or ``message`` to extract from\n\n Returns:\n str: One of ``Message.MESSAGE_TYPES``\n\n \"\"\"\n\n # Importing on file-level yields cyclic Import Errors\n from telegram import Message\n from telegram import Update\n\n if isinstance(entity, Message):\n message = entity\n elif isinstance(entity, Update):\n message = entity.effective_message\n else:\n raise TypeError(\"entity is not Message or Update (got: {})\".format(type(entity)))\n\n for i in Message.MESSAGE_TYPES:\n if getattr(message, i, None):\n return i\n\n return None\n", "path": "telegram/utils/helpers.py"}]} | 1,572 | 219 |
gh_patches_debug_2678 | rasdani/github-patches | git_diff | pretalx__pretalx-381 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
installation crashes when there are no config files
## Current Behavior
```
$ cd pretalx
$ pip-3.6 install . --user
(...)
File "<frozen importlib._bootstrap>", line 994, in _gcd_import
File "<frozen importlib._bootstrap>", line 971, in _find_and_load
File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/tmp/pip-xa87l9tk-build/pretalx/settings.py", line 460, in <module>
plugins=PLUGINS
File "/tmp/pip-xa87l9tk-build/pretalx/common/settings/utils.py", line 11, in log_initial
(f'Read from: {", ".join(config_files)}', False),
TypeError: can only join an iterable
```
If there are no config files at all, the installation crashes because `config_files` is `None`.
## Your Environment
* Version used: master
* Operating System and version (desktop or mobile): FreeBSD
--- END ISSUE ---
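For context before diving into the files: `configparser.RawConfigParser.read()` returns the list of files it successfully parsed, while `read_file()` returns `None`, so whichever branch leaves `config_files` without an iterable value will crash the later `join`. A minimal standalone sketch of the failure (no pretalx code involved):

```python
import configparser

parser = configparser.RawConfigParser()

# read() returns the list of files it actually parsed -- empty when none exist.
print(parser.read(["/nonexistent/pretalx.cfg"]))  # []

# read_file() returns None instead, so config_files can end up as None,
# and joining it reproduces the traceback above.
config_files = None
print("Read from: {}".format(", ".join(config_files)))  # TypeError: can only join an iterable
```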
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pretalx/common/settings/config.py`
Content:
```
1 import configparser
2 import os
3 import sys
4
5 from pretalx.common.settings.utils import reduce_dict
6
7 CONFIG = {
8 'filesystem': {
9 'base': {
10 'default': os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))),
11 },
12 'logs': {
13 'default': None,
14 'env': os.getenv('PRETALX_FILESYSTEM_LOGS'),
15 },
16 'media': {
17 'default': None,
18 'env': os.getenv('PRETALX_FILESYSTEM_MEDIA'),
19 },
20 'static': {
21 'default': None,
22 'env': os.getenv('PRETALX_FILESYSTEM_STATIC'),
23 },
24 },
25 'site': {
26 'debug': {
27 'default': 'runserver' in sys.argv,
28 'env': os.getenv('PRETALX_DEBUG'),
29 },
30 'url': {
31 'default': 'http://localhost',
32 'env': os.getenv('PRETALX_SITE_URL'),
33 },
34 'https': {
35 'env': os.getenv('PRETALX_HTTPS'),
36 },
37 'cookie_domain': {
38 'default': '',
39 'env': os.getenv('PRETALX_COOKIE_DOMAIN'),
40 },
41 },
42 'database': {
43 'backend': {
44 'default': 'sqlite3',
45 'env': os.getenv('PRETALX_DB_TYPE'),
46 },
47 'name': {
48 'env': os.getenv('PRETALX_DB_NAME'),
49 },
50 'user': {
51 'default': '',
52 'env': os.getenv('PRETALX_DB_USER'),
53 },
54 'password': {
55 'default': '',
56 'env': os.getenv('PRETALX_DB_PASS'),
57 },
58 'host': {
59 'default': '',
60 'env': os.getenv('PRETALX_DB_HOST'),
61 },
62 'port': {
63 'default': '',
64 'env': os.getenv('PRETALX_DB_PORT'),
65 },
66 },
67 'mail': {
68 'from': {
69 'default': 'admin@localhost',
70 'env': os.getenv('PRETALX_MAIL_FROM'),
71 },
72 'host': {
73 'default': 'localhost',
74 'env': os.getenv('PRETALX_MAIL_HOST'),
75 },
76 'port': {
77 'default': '25',
78 'env': os.getenv('PRETALX_MAIL_PORT'),
79 },
80 'user': {
81 'default': '',
82 'env': os.getenv('PRETALX_MAIL_USER'),
83 },
84 'password': {
85 'default': '',
86 'env': os.getenv('PRETALX_MAIL_PASSWORD'),
87 },
88 'tls': {
89 'default': 'False',
90 'env': os.getenv('PRETALX_MAIL_TLS'),
91 },
92 'ssl': {
93 'default': 'False',
94 'env': os.getenv('PRETALX_MAIL_SSL'),
95 },
96 },
97 'cache': {
98 },
99 'celery': {
100 'broker': {
101 'default': '',
102 'env': os.getenv('PRETALX_CELERY_BROKER'),
103 },
104 'backend': {
105 'default': '',
106 'env': os.getenv('PRETALX_CELERY_BACKEND'),
107 },
108 },
109 'logging': {
110 'email': {
111 'default': '',
112 'env': os.getenv('PRETALX_LOGGING_EMAIL'),
113 },
114 'email_level': {
115 'default': '',
116 'env': os.getenv('PRETALX_LOGGING_EMAIL_LEVEL'),
117 },
118 },
119 }
120
121
122 def read_config_files(config):
123 if 'PRETALX_CONFIG_FILE' in os.environ:
124 config_files = config.read_file(open(os.environ.get('PRETALX_CONFIG_FILE'), encoding='utf-8'))
125 else:
126 config_files = config.read([
127 '/etc/pretalx/pretalx.cfg',
128 os.path.expanduser('~/.pretalx.cfg'),
129 'pretalx.cfg',
130 ], encoding='utf-8')
131 return config, config_files
132
133
134 def read_layer(layer_name, config):
135 config_dict = reduce_dict({
136 section_name: {
137 key: value.get(layer_name)
138 for key, value in section_content.items()
139 }
140 for section_name, section_content in CONFIG.items()
141 })
142 config.read_dict(config_dict)
143 return config
144
145
146 def build_config():
147 config = configparser.RawConfigParser()
148 config = read_layer('default', config)
149 config, config_files = read_config_files(config)
150 config = read_layer('env', config)
151 return config, config_files
152
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pretalx/common/settings/config.py b/src/pretalx/common/settings/config.py
--- a/src/pretalx/common/settings/config.py
+++ b/src/pretalx/common/settings/config.py
@@ -128,7 +128,7 @@
os.path.expanduser('~/.pretalx.cfg'),
'pretalx.cfg',
], encoding='utf-8')
- return config, config_files
+ return config, config_files or [] # .read() returns None, if there are no config files
def read_layer(layer_name, config):
| {"golden_diff": "diff --git a/src/pretalx/common/settings/config.py b/src/pretalx/common/settings/config.py\n--- a/src/pretalx/common/settings/config.py\n+++ b/src/pretalx/common/settings/config.py\n@@ -128,7 +128,7 @@\n os.path.expanduser('~/.pretalx.cfg'),\n 'pretalx.cfg',\n ], encoding='utf-8')\n- return config, config_files\n+ return config, config_files or [] # .read() returns None, if there are no config files\n \n \n def read_layer(layer_name, config):\n", "issue": "installation crashes when there are no config files\n## Current Behavior\r\n```\r\n$ cd pretalx\r\n$ pip-3.6 install . --user\r\n(...)\r\n File \"<frozen importlib._bootstrap>\", line 994, in _gcd_import\r\n File \"<frozen importlib._bootstrap>\", line 971, in _find_and_load\r\n File \"<frozen importlib._bootstrap>\", line 955, in _find_and_load_unlocked\r\n File \"<frozen importlib._bootstrap>\", line 665, in _load_unlocked\r\n File \"<frozen importlib._bootstrap_external>\", line 678, in exec_module\r\n File \"<frozen importlib._bootstrap>\", line 219, in _call_with_frames_removed\r\n File \"/tmp/pip-xa87l9tk-build/pretalx/settings.py\", line 460, in <module>\r\n plugins=PLUGINS\r\n File \"/tmp/pip-xa87l9tk-build/pretalx/common/settings/utils.py\", line 11, in log_initial\r\n (f'Read from: {\", \".join(config_files)}', False),\r\n TypeError: can only join an iterable\r\n```\r\n\r\nif there are no config files at all, the installation crashes, because `config_files` is `None`.\r\n\r\n## Your Environment\r\n\r\n* Version used: master\r\n* Operating System and version (desktop or mobile): FreeBSD\r\n\n", "before_files": [{"content": "import configparser\nimport os\nimport sys\n\nfrom pretalx.common.settings.utils import reduce_dict\n\nCONFIG = {\n 'filesystem': {\n 'base': {\n 'default': os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))),\n },\n 'logs': {\n 'default': None,\n 'env': os.getenv('PRETALX_FILESYSTEM_LOGS'),\n },\n 'media': {\n 'default': None,\n 'env': os.getenv('PRETALX_FILESYSTEM_MEDIA'),\n },\n 'static': {\n 'default': None,\n 'env': os.getenv('PRETALX_FILESYSTEM_STATIC'),\n },\n },\n 'site': {\n 'debug': {\n 'default': 'runserver' in sys.argv,\n 'env': os.getenv('PRETALX_DEBUG'),\n },\n 'url': {\n 'default': 'http://localhost',\n 'env': os.getenv('PRETALX_SITE_URL'),\n },\n 'https': {\n 'env': os.getenv('PRETALX_HTTPS'),\n },\n 'cookie_domain': {\n 'default': '',\n 'env': os.getenv('PRETALX_COOKIE_DOMAIN'),\n },\n },\n 'database': {\n 'backend': {\n 'default': 'sqlite3',\n 'env': os.getenv('PRETALX_DB_TYPE'),\n },\n 'name': {\n 'env': os.getenv('PRETALX_DB_NAME'),\n },\n 'user': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_USER'),\n },\n 'password': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_PASS'),\n },\n 'host': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_HOST'),\n },\n 'port': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_PORT'),\n },\n },\n 'mail': {\n 'from': {\n 'default': 'admin@localhost',\n 'env': os.getenv('PRETALX_MAIL_FROM'),\n },\n 'host': {\n 'default': 'localhost',\n 'env': os.getenv('PRETALX_MAIL_HOST'),\n },\n 'port': {\n 'default': '25',\n 'env': os.getenv('PRETALX_MAIL_PORT'),\n },\n 'user': {\n 'default': '',\n 'env': os.getenv('PRETALX_MAIL_USER'),\n },\n 'password': {\n 'default': '',\n 'env': os.getenv('PRETALX_MAIL_PASSWORD'),\n },\n 'tls': {\n 'default': 'False',\n 'env': os.getenv('PRETALX_MAIL_TLS'),\n },\n 'ssl': {\n 'default': 'False',\n 'env': os.getenv('PRETALX_MAIL_SSL'),\n },\n },\n 'cache': {\n },\n 'celery': {\n 'broker': 
{\n 'default': '',\n 'env': os.getenv('PRETALX_CELERY_BROKER'),\n },\n 'backend': {\n 'default': '',\n 'env': os.getenv('PRETALX_CELERY_BACKEND'),\n },\n },\n 'logging': {\n 'email': {\n 'default': '',\n 'env': os.getenv('PRETALX_LOGGING_EMAIL'),\n },\n 'email_level': {\n 'default': '',\n 'env': os.getenv('PRETALX_LOGGING_EMAIL_LEVEL'),\n },\n },\n}\n\n\ndef read_config_files(config):\n if 'PRETALX_CONFIG_FILE' in os.environ:\n config_files = config.read_file(open(os.environ.get('PRETALX_CONFIG_FILE'), encoding='utf-8'))\n else:\n config_files = config.read([\n '/etc/pretalx/pretalx.cfg',\n os.path.expanduser('~/.pretalx.cfg'),\n 'pretalx.cfg',\n ], encoding='utf-8')\n return config, config_files\n\n\ndef read_layer(layer_name, config):\n config_dict = reduce_dict({\n section_name: {\n key: value.get(layer_name)\n for key, value in section_content.items()\n }\n for section_name, section_content in CONFIG.items()\n })\n config.read_dict(config_dict)\n return config\n\n\ndef build_config():\n config = configparser.RawConfigParser()\n config = read_layer('default', config)\n config, config_files = read_config_files(config)\n config = read_layer('env', config)\n return config, config_files\n", "path": "src/pretalx/common/settings/config.py"}], "after_files": [{"content": "import configparser\nimport os\nimport sys\n\nfrom pretalx.common.settings.utils import reduce_dict\n\nCONFIG = {\n 'filesystem': {\n 'base': {\n 'default': os.path.dirname(os.path.dirname(os.path.dirname(os.path.dirname(__file__)))),\n },\n 'logs': {\n 'default': None,\n 'env': os.getenv('PRETALX_FILESYSTEM_LOGS'),\n },\n 'media': {\n 'default': None,\n 'env': os.getenv('PRETALX_FILESYSTEM_MEDIA'),\n },\n 'static': {\n 'default': None,\n 'env': os.getenv('PRETALX_FILESYSTEM_STATIC'),\n },\n },\n 'site': {\n 'debug': {\n 'default': 'runserver' in sys.argv,\n 'env': os.getenv('PRETALX_DEBUG'),\n },\n 'url': {\n 'default': 'http://localhost',\n 'env': os.getenv('PRETALX_SITE_URL'),\n },\n 'https': {\n 'env': os.getenv('PRETALX_HTTPS'),\n },\n 'cookie_domain': {\n 'default': '',\n 'env': os.getenv('PRETALX_COOKIE_DOMAIN'),\n },\n },\n 'database': {\n 'backend': {\n 'default': 'sqlite3',\n 'env': os.getenv('PRETALX_DB_TYPE'),\n },\n 'name': {\n 'env': os.getenv('PRETALX_DB_NAME'),\n },\n 'user': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_USER'),\n },\n 'password': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_PASS'),\n },\n 'host': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_HOST'),\n },\n 'port': {\n 'default': '',\n 'env': os.getenv('PRETALX_DB_PORT'),\n },\n },\n 'mail': {\n 'from': {\n 'default': 'admin@localhost',\n 'env': os.getenv('PRETALX_MAIL_FROM'),\n },\n 'host': {\n 'default': 'localhost',\n 'env': os.getenv('PRETALX_MAIL_HOST'),\n },\n 'port': {\n 'default': '25',\n 'env': os.getenv('PRETALX_MAIL_PORT'),\n },\n 'user': {\n 'default': '',\n 'env': os.getenv('PRETALX_MAIL_USER'),\n },\n 'password': {\n 'default': '',\n 'env': os.getenv('PRETALX_MAIL_PASSWORD'),\n },\n 'tls': {\n 'default': 'False',\n 'env': os.getenv('PRETALX_MAIL_TLS'),\n },\n 'ssl': {\n 'default': 'False',\n 'env': os.getenv('PRETALX_MAIL_SSL'),\n },\n },\n 'cache': {\n },\n 'celery': {\n 'broker': {\n 'default': '',\n 'env': os.getenv('PRETALX_CELERY_BROKER'),\n },\n 'backend': {\n 'default': '',\n 'env': os.getenv('PRETALX_CELERY_BACKEND'),\n },\n },\n 'logging': {\n 'email': {\n 'default': '',\n 'env': os.getenv('PRETALX_LOGGING_EMAIL'),\n },\n 'email_level': {\n 'default': '',\n 'env': os.getenv('PRETALX_LOGGING_EMAIL_LEVEL'),\n 
},\n },\n}\n\n\ndef read_config_files(config):\n if 'PRETALX_CONFIG_FILE' in os.environ:\n config_files = config.read_file(open(os.environ.get('PRETALX_CONFIG_FILE'), encoding='utf-8'))\n else:\n config_files = config.read([\n '/etc/pretalx/pretalx.cfg',\n os.path.expanduser('~/.pretalx.cfg'),\n 'pretalx.cfg',\n ], encoding='utf-8')\n return config, config_files or [] # .read() returns None, if there are no config files\n\n\ndef read_layer(layer_name, config):\n config_dict = reduce_dict({\n section_name: {\n key: value.get(layer_name)\n for key, value in section_content.items()\n }\n for section_name, section_content in CONFIG.items()\n })\n config.read_dict(config_dict)\n return config\n\n\ndef build_config():\n config = configparser.RawConfigParser()\n config = read_layer('default', config)\n config, config_files = read_config_files(config)\n config = read_layer('env', config)\n return config, config_files\n", "path": "src/pretalx/common/settings/config.py"}]} | 1,904 | 130 |
gh_patches_debug_29983 | rasdani/github-patches | git_diff | liqd__a4-meinberlin-3146 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
testing interactive event: remove + on call to action
**URL:** https://meinberlin-dev.liqd.net/projekte/module/interaktive-veranstaltung-2/
**user:** unregistered user
**expected behaviour:** buttons should be the same all over the platform
**behaviour:** there is a + on the button
**important screensize:**
**device & browser:**
**Comment/Question:** please take out the + before "add question"
Screenshot?
<img width="692" alt="Bildschirmfoto 2020-09-22 um 17 51 38" src="https://user-images.githubusercontent.com/35491681/93906276-494d9200-fcfc-11ea-9614-3a9359b5ec97.png">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/projects/templatetags/meinberlin_project_tags.py`
Content:
```
1 from django import template
2
3 from adhocracy4.comments.models import Comment
4 from meinberlin.apps.budgeting.models import Proposal as budget_proposal
5 from meinberlin.apps.ideas.models import Idea
6 from meinberlin.apps.kiezkasse.models import Proposal as kiezkasse_proposal
7 from meinberlin.apps.mapideas.models import MapIdea
8 from meinberlin.apps.polls.models import Vote
9 from meinberlin.apps.projects import get_project_type
10
11 register = template.Library()
12
13
14 @register.filter
15 def project_url(project):
16 if (project.project_type == 'meinberlin_bplan.Bplan'
17 or project.project_type ==
18 'meinberlin_extprojects.ExternalProject'):
19 return project.externalproject.url
20 return project.get_absolute_url()
21
22
23 @register.filter
24 def project_type(project):
25 return get_project_type(project)
26
27
28 @register.filter
29 def is_external(project):
30 return (project.project_type == 'meinberlin_bplan.Bplan'
31 or project.project_type ==
32 'meinberlin_extprojects.ExternalProject')
33
34
35 @register.filter
36 def is_a4_project(project):
37 return (project.project_type == 'a4projects.Project')
38
39
40 @register.simple_tag
41 def get_num_entries(module):
42 """Count all user-generated items."""
43 item_count = \
44 Idea.objects.filter(module=module).count() \
45 + MapIdea.objects.filter(module=module).count() \
46 + budget_proposal.objects.filter(module=module).count() \
47 + kiezkasse_proposal.objects.filter(module=module).count() \
48 + Comment.objects.filter(idea__module=module).count() \
49 + Comment.objects.filter(mapidea__module=module).count() \
50 + Comment.objects.filter(budget_proposal__module=module).count() \
51 + Comment.objects.filter(kiezkasse_proposal__module=module).count() \
52 + Comment.objects.filter(topic__module=module).count() \
53 + Comment.objects.filter(maptopic__module=module).count() \
54 + Comment.objects.filter(paragraph__chapter__module=module).count() \
55 + Comment.objects.filter(chapter__module=module).count() \
56 + Comment.objects.filter(poll__module=module).count() \
57 + Vote.objects.filter(choice__question__poll__module=module).count()
58 return item_count
59
```
Path: `meinberlin/apps/livequestions/phases.py`
Content:
```
1 from django.utils.translation import ugettext_lazy as _
2
3 from adhocracy4 import phases
4
5 from . import apps
6 from . import models
7 from . import views
8
9
10 class IssuePhase(phases.PhaseContent):
11 app = apps.Config.label
12 phase = 'issue'
13 view = views.LiveQuestionModuleDetail
14
15 name = _('Issue phase')
16 description = _('Add question.')
17 module_name = _('Interactive Event')
18 icon = 'lightbulb-o'
19
20 features = {
21 'crud': (models.LiveQuestion,),
22 'like': (models.LiveQuestion,)
23 }
24
25
26 phases.content.register(IssuePhase())
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/meinberlin/apps/livequestions/phases.py b/meinberlin/apps/livequestions/phases.py
--- a/meinberlin/apps/livequestions/phases.py
+++ b/meinberlin/apps/livequestions/phases.py
@@ -13,7 +13,7 @@
view = views.LiveQuestionModuleDetail
name = _('Issue phase')
- description = _('Add question.')
+ description = _('Add questions and support.')
module_name = _('Interactive Event')
icon = 'lightbulb-o'
diff --git a/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py b/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py
--- a/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py
+++ b/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py
@@ -4,6 +4,8 @@
from meinberlin.apps.budgeting.models import Proposal as budget_proposal
from meinberlin.apps.ideas.models import Idea
from meinberlin.apps.kiezkasse.models import Proposal as kiezkasse_proposal
+from meinberlin.apps.likes.models import Like
+from meinberlin.apps.livequestions.models import LiveQuestion
from meinberlin.apps.mapideas.models import MapIdea
from meinberlin.apps.polls.models import Vote
from meinberlin.apps.projects import get_project_type
@@ -54,5 +56,7 @@
+ Comment.objects.filter(paragraph__chapter__module=module).count() \
+ Comment.objects.filter(chapter__module=module).count() \
+ Comment.objects.filter(poll__module=module).count() \
- + Vote.objects.filter(choice__question__poll__module=module).count()
+ + Vote.objects.filter(choice__question__poll__module=module).count() \
+ + LiveQuestion.objects.filter(module=module).count() \
+ + Like.objects.filter(question__module=module).count()
return item_count
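The counting additions in the patch reuse the aggregation style already present in `get_num_entries`: one `filter(...).count()` query per related model. A minimal sketch of that pattern, assuming only the model imports shown in the diff:

```python
from meinberlin.apps.likes.models import Like
from meinberlin.apps.livequestions.models import LiveQuestion


def live_question_entries(module):
    # One COUNT query per model, same style as the existing totals.
    return (LiveQuestion.objects.filter(module=module).count()
            + Like.objects.filter(question__module=module).count())
```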
| {"golden_diff": "diff --git a/meinberlin/apps/livequestions/phases.py b/meinberlin/apps/livequestions/phases.py\n--- a/meinberlin/apps/livequestions/phases.py\n+++ b/meinberlin/apps/livequestions/phases.py\n@@ -13,7 +13,7 @@\n view = views.LiveQuestionModuleDetail\n \n name = _('Issue phase')\n- description = _('Add question.')\n+ description = _('Add questions and support.')\n module_name = _('Interactive Event')\n icon = 'lightbulb-o'\n \ndiff --git a/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py b/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py\n--- a/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py\n+++ b/meinberlin/apps/projects/templatetags/meinberlin_project_tags.py\n@@ -4,6 +4,8 @@\n from meinberlin.apps.budgeting.models import Proposal as budget_proposal\n from meinberlin.apps.ideas.models import Idea\n from meinberlin.apps.kiezkasse.models import Proposal as kiezkasse_proposal\n+from meinberlin.apps.likes.models import Like\n+from meinberlin.apps.livequestions.models import LiveQuestion\n from meinberlin.apps.mapideas.models import MapIdea\n from meinberlin.apps.polls.models import Vote\n from meinberlin.apps.projects import get_project_type\n@@ -54,5 +56,7 @@\n + Comment.objects.filter(paragraph__chapter__module=module).count() \\\n + Comment.objects.filter(chapter__module=module).count() \\\n + Comment.objects.filter(poll__module=module).count() \\\n- + Vote.objects.filter(choice__question__poll__module=module).count()\n+ + Vote.objects.filter(choice__question__poll__module=module).count() \\\n+ + LiveQuestion.objects.filter(module=module).count() \\\n+ + Like.objects.filter(question__module=module).count()\n return item_count\n", "issue": "testing interactive event: remove + on call to action\n**URL:** https://meinberlin-dev.liqd.net/projekte/module/interaktive-veranstaltung-2/\r\n**user:** unregistered user\r\n**expected behaviour:** buttons should be same all over platform\r\n**behaviour:** there is a + on the button\r\n**important screensize:**\r\n**device & browser:** \r\n**Comment/Question:** please take out the + before add question\r\n\r\n\r\nScreenshot?\r\n<img width=\"692\" alt=\"Bildschirmfoto 2020-09-22 um 17 51 38\" src=\"https://user-images.githubusercontent.com/35491681/93906276-494d9200-fcfc-11ea-9614-3a9359b5ec97.png\">\r\n\n", "before_files": [{"content": "from django import template\n\nfrom adhocracy4.comments.models import Comment\nfrom meinberlin.apps.budgeting.models import Proposal as budget_proposal\nfrom meinberlin.apps.ideas.models import Idea\nfrom meinberlin.apps.kiezkasse.models import Proposal as kiezkasse_proposal\nfrom meinberlin.apps.mapideas.models import MapIdea\nfrom meinberlin.apps.polls.models import Vote\nfrom meinberlin.apps.projects import get_project_type\n\nregister = template.Library()\n\n\[email protected]\ndef project_url(project):\n if (project.project_type == 'meinberlin_bplan.Bplan'\n or project.project_type ==\n 'meinberlin_extprojects.ExternalProject'):\n return project.externalproject.url\n return project.get_absolute_url()\n\n\[email protected]\ndef project_type(project):\n return get_project_type(project)\n\n\[email protected]\ndef is_external(project):\n return (project.project_type == 'meinberlin_bplan.Bplan'\n or project.project_type ==\n 'meinberlin_extprojects.ExternalProject')\n\n\[email protected]\ndef is_a4_project(project):\n return (project.project_type == 'a4projects.Project')\n\n\[email protected]_tag\ndef get_num_entries(module):\n \"\"\"Count all 
user-generated items.\"\"\"\n item_count = \\\n Idea.objects.filter(module=module).count() \\\n + MapIdea.objects.filter(module=module).count() \\\n + budget_proposal.objects.filter(module=module).count() \\\n + kiezkasse_proposal.objects.filter(module=module).count() \\\n + Comment.objects.filter(idea__module=module).count() \\\n + Comment.objects.filter(mapidea__module=module).count() \\\n + Comment.objects.filter(budget_proposal__module=module).count() \\\n + Comment.objects.filter(kiezkasse_proposal__module=module).count() \\\n + Comment.objects.filter(topic__module=module).count() \\\n + Comment.objects.filter(maptopic__module=module).count() \\\n + Comment.objects.filter(paragraph__chapter__module=module).count() \\\n + Comment.objects.filter(chapter__module=module).count() \\\n + Comment.objects.filter(poll__module=module).count() \\\n + Vote.objects.filter(choice__question__poll__module=module).count()\n return item_count\n", "path": "meinberlin/apps/projects/templatetags/meinberlin_project_tags.py"}, {"content": "from django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4 import phases\n\nfrom . import apps\nfrom . import models\nfrom . import views\n\n\nclass IssuePhase(phases.PhaseContent):\n app = apps.Config.label\n phase = 'issue'\n view = views.LiveQuestionModuleDetail\n\n name = _('Issue phase')\n description = _('Add question.')\n module_name = _('Interactive Event')\n icon = 'lightbulb-o'\n\n features = {\n 'crud': (models.LiveQuestion,),\n 'like': (models.LiveQuestion,)\n }\n\n\nphases.content.register(IssuePhase())\n", "path": "meinberlin/apps/livequestions/phases.py"}], "after_files": [{"content": "from django import template\n\nfrom adhocracy4.comments.models import Comment\nfrom meinberlin.apps.budgeting.models import Proposal as budget_proposal\nfrom meinberlin.apps.ideas.models import Idea\nfrom meinberlin.apps.kiezkasse.models import Proposal as kiezkasse_proposal\nfrom meinberlin.apps.likes.models import Like\nfrom meinberlin.apps.livequestions.models import LiveQuestion\nfrom meinberlin.apps.mapideas.models import MapIdea\nfrom meinberlin.apps.polls.models import Vote\nfrom meinberlin.apps.projects import get_project_type\n\nregister = template.Library()\n\n\[email protected]\ndef project_url(project):\n if (project.project_type == 'meinberlin_bplan.Bplan'\n or project.project_type ==\n 'meinberlin_extprojects.ExternalProject'):\n return project.externalproject.url\n return project.get_absolute_url()\n\n\[email protected]\ndef project_type(project):\n return get_project_type(project)\n\n\[email protected]\ndef is_external(project):\n return (project.project_type == 'meinberlin_bplan.Bplan'\n or project.project_type ==\n 'meinberlin_extprojects.ExternalProject')\n\n\[email protected]\ndef is_a4_project(project):\n return (project.project_type == 'a4projects.Project')\n\n\[email protected]_tag\ndef get_num_entries(module):\n \"\"\"Count all user-generated items.\"\"\"\n item_count = \\\n Idea.objects.filter(module=module).count() \\\n + MapIdea.objects.filter(module=module).count() \\\n + budget_proposal.objects.filter(module=module).count() \\\n + kiezkasse_proposal.objects.filter(module=module).count() \\\n + Comment.objects.filter(idea__module=module).count() \\\n + Comment.objects.filter(mapidea__module=module).count() \\\n + Comment.objects.filter(budget_proposal__module=module).count() \\\n + Comment.objects.filter(kiezkasse_proposal__module=module).count() \\\n + Comment.objects.filter(topic__module=module).count() \\\n + 
Comment.objects.filter(maptopic__module=module).count() \\\n + Comment.objects.filter(paragraph__chapter__module=module).count() \\\n + Comment.objects.filter(chapter__module=module).count() \\\n + Comment.objects.filter(poll__module=module).count() \\\n + Vote.objects.filter(choice__question__poll__module=module).count() \\\n + LiveQuestion.objects.filter(module=module).count() \\\n + Like.objects.filter(question__module=module).count()\n return item_count\n", "path": "meinberlin/apps/projects/templatetags/meinberlin_project_tags.py"}, {"content": "from django.utils.translation import ugettext_lazy as _\n\nfrom adhocracy4 import phases\n\nfrom . import apps\nfrom . import models\nfrom . import views\n\n\nclass IssuePhase(phases.PhaseContent):\n app = apps.Config.label\n phase = 'issue'\n view = views.LiveQuestionModuleDetail\n\n name = _('Issue phase')\n description = _('Add questions and support.')\n module_name = _('Interactive Event')\n icon = 'lightbulb-o'\n\n features = {\n 'crud': (models.LiveQuestion,),\n 'like': (models.LiveQuestion,)\n }\n\n\nphases.content.register(IssuePhase())\n", "path": "meinberlin/apps/livequestions/phases.py"}]} | 1,278 | 445 |
gh_patches_debug_17824 | rasdani/github-patches | git_diff | hydroshare__hydroshare-5083 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Track user agent for metrics
**Describe the feature you'd like and what it will do**
In HS v2.5.4, we don't track user_agent in our metrics. This makes it difficult to tell when requests to HS are occurring via direct UI interactions, or via other tools like hsclient.
**Why is this feature important?**
We need more insight into how HS's ecosystem of tools is being used. This information should drive our continued development of existing tools and our consideration of additions for future use.
**Is your feature request related to a problem? Please describe.**
It is difficult to make decisions without information.
--- END ISSUE ---
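Django exposes the client's user agent through the WSGI environment on every request, but the header is optional, so any lookup needs a guard. A minimal sketch, assuming a standard `HttpRequest` (the helper name is illustrative):

```python
def user_agent_of(request):
    # HTTP_USER_AGENT is absent when the client sends no User-Agent header.
    try:
        return request.META["HTTP_USER_AGENT"]
    except KeyError:
        return None
```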
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hs_tracking/utils.py`
Content:
```
1 import robot_detection
2 from ipware.ip import get_ip
3 from hs_tools_resource.models import RequestUrlBase, RequestUrlBaseAggregation, RequestUrlBaseFile
4 from urllib.parse import urlparse
5
6
7 def get_client_ip(request):
8 return get_ip(request)
9
10
11 def get_user_type(session):
12 try:
13 user = session.visitor.user
14 usertype = user.userprofile.user_type
15 except AttributeError:
16 usertype = None
17 return usertype
18
19
20 def get_user_email_domain(session):
21 try:
22 user = session.visitor.user
23 emaildomain = user.email.split('@')[-1]
24 except AttributeError:
25 emaildomain = None
26 return emaildomain
27
28
29 def get_user_email_tld(session, emaildomain=None):
30 try:
31 if not emaildomain:
32 emaildomain = get_user_email_domain(session)
33 if emaildomain:
34 shortdomain = '.'.join(emaildomain.split('.')[1:])
35 return shortdomain
36 except AttributeError:
37 return None
38
39
40 def is_human(user_agent):
41 if robot_detection.is_robot(user_agent):
42 return False
43 return True
44
45
46 def get_std_log_fields(request, session=None):
47 """ returns a standard set of metadata that to each receiver function.
48 This ensures that all activities are reporting a consistent set of metrics
49 """
50 user_type = None
51 user_email_tld = None
52 full_domain = None
53 if session is not None:
54 user_type = get_user_type(session)
55 full_domain = get_user_email_domain(session)
56 user_email_tld = get_user_email_tld(session, full_domain)
57
58 return {
59 'user_ip': get_client_ip(request),
60 'user_type': user_type,
61 'user_email_domain': user_email_tld,
62 'user_email_domain_full': full_domain
63 }
64
65
66 def authentic_redirect_url(url):
67 """ Validates a url scheme and netloc is in an existing web app
68 :param url: String of a url
69 :return: Boolean, True if the url exists in a web app
70 """
71 if not url:
72 return False
73 u = urlparse(url)
74 url_base = "{}://{}".format(u.scheme, u.netloc)
75 return RequestUrlBase.objects.filter(value__startswith=url_base).exists() \
76 or RequestUrlBaseAggregation.objects.filter(value__startswith=url_base).exists() \
77 or RequestUrlBaseFile.objects.filter(value__startswith=url_base).exists()
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hs_tracking/utils.py b/hs_tracking/utils.py
--- a/hs_tracking/utils.py
+++ b/hs_tracking/utils.py
@@ -47,6 +47,12 @@
""" returns a standard set of metadata that to each receiver function.
This ensures that all activities are reporting a consistent set of metrics
"""
+ try:
+ user_agent = request.META['HTTP_USER_AGENT']
+ human = is_human(user_agent)
+ except KeyError:
+ user_agent = None
+ human = None
user_type = None
user_email_tld = None
full_domain = None
@@ -59,7 +65,9 @@
'user_ip': get_client_ip(request),
'user_type': user_type,
'user_email_domain': user_email_tld,
- 'user_email_domain_full': full_domain
+ 'user_email_domain_full': full_domain,
+ 'is_human': human,
+ 'user_agent': user_agent
}
| {"golden_diff": "diff --git a/hs_tracking/utils.py b/hs_tracking/utils.py\n--- a/hs_tracking/utils.py\n+++ b/hs_tracking/utils.py\n@@ -47,6 +47,12 @@\n \"\"\" returns a standard set of metadata that to each receiver function.\n This ensures that all activities are reporting a consistent set of metrics\n \"\"\"\n+ try:\n+ user_agent = request.META['HTTP_USER_AGENT']\n+ human = is_human(user_agent)\n+ except KeyError:\n+ user_agent = None\n+ human = None\n user_type = None\n user_email_tld = None\n full_domain = None\n@@ -59,7 +65,9 @@\n 'user_ip': get_client_ip(request),\n 'user_type': user_type,\n 'user_email_domain': user_email_tld,\n- 'user_email_domain_full': full_domain\n+ 'user_email_domain_full': full_domain,\n+ 'is_human': human,\n+ 'user_agent': user_agent\n }\n", "issue": "Track user agent for metrics\n**Describe the feature you'd like and what it will do**\r\nIn HS v2.5.4, we don't track user_agent in our metrics. This makes it difficult to tell when requests to HS are occurring via direct UI interactions, or via other tools like hsclient.\r\n\r\n\r\n**Why is this feature important?**\r\nWe need more insight into how HS' ecosystem of tools are being used. This information should drive our continued development on existing tools and our consideration of additions for future use.\r\n\r\n**Is your feature request related to a problem? Please describe.**\r\nIt is difficult to make decisions without information.\r\n\n", "before_files": [{"content": "import robot_detection\nfrom ipware.ip import get_ip\nfrom hs_tools_resource.models import RequestUrlBase, RequestUrlBaseAggregation, RequestUrlBaseFile\nfrom urllib.parse import urlparse\n\n\ndef get_client_ip(request):\n return get_ip(request)\n\n\ndef get_user_type(session):\n try:\n user = session.visitor.user\n usertype = user.userprofile.user_type\n except AttributeError:\n usertype = None\n return usertype\n\n\ndef get_user_email_domain(session):\n try:\n user = session.visitor.user\n emaildomain = user.email.split('@')[-1]\n except AttributeError:\n emaildomain = None\n return emaildomain\n\n\ndef get_user_email_tld(session, emaildomain=None):\n try:\n if not emaildomain:\n emaildomain = get_user_email_domain(session)\n if emaildomain:\n shortdomain = '.'.join(emaildomain.split('.')[1:])\n return shortdomain\n except AttributeError:\n return None\n\n\ndef is_human(user_agent):\n if robot_detection.is_robot(user_agent):\n return False\n return True\n\n\ndef get_std_log_fields(request, session=None):\n \"\"\" returns a standard set of metadata that to each receiver function.\n This ensures that all activities are reporting a consistent set of metrics\n \"\"\"\n user_type = None\n user_email_tld = None\n full_domain = None\n if session is not None:\n user_type = get_user_type(session)\n full_domain = get_user_email_domain(session)\n user_email_tld = get_user_email_tld(session, full_domain)\n\n return {\n 'user_ip': get_client_ip(request),\n 'user_type': user_type,\n 'user_email_domain': user_email_tld,\n 'user_email_domain_full': full_domain\n }\n\n\ndef authentic_redirect_url(url):\n \"\"\" Validates a url scheme and netloc is in an existing web app\n :param url: String of a url\n :return: Boolean, True if the url exists in a web app\n \"\"\"\n if not url:\n return False\n u = urlparse(url)\n url_base = \"{}://{}\".format(u.scheme, u.netloc)\n return RequestUrlBase.objects.filter(value__startswith=url_base).exists() \\\n or RequestUrlBaseAggregation.objects.filter(value__startswith=url_base).exists() \\\n or 
RequestUrlBaseFile.objects.filter(value__startswith=url_base).exists()\n", "path": "hs_tracking/utils.py"}], "after_files": [{"content": "import robot_detection\nfrom ipware.ip import get_ip\nfrom hs_tools_resource.models import RequestUrlBase, RequestUrlBaseAggregation, RequestUrlBaseFile\nfrom urllib.parse import urlparse\n\n\ndef get_client_ip(request):\n return get_ip(request)\n\n\ndef get_user_type(session):\n try:\n user = session.visitor.user\n usertype = user.userprofile.user_type\n except AttributeError:\n usertype = None\n return usertype\n\n\ndef get_user_email_domain(session):\n try:\n user = session.visitor.user\n emaildomain = user.email.split('@')[-1]\n except AttributeError:\n emaildomain = None\n return emaildomain\n\n\ndef get_user_email_tld(session, emaildomain=None):\n try:\n if not emaildomain:\n emaildomain = get_user_email_domain(session)\n if emaildomain:\n shortdomain = '.'.join(emaildomain.split('.')[1:])\n return shortdomain\n except AttributeError:\n return None\n\n\ndef is_human(user_agent):\n if robot_detection.is_robot(user_agent):\n return False\n return True\n\n\ndef get_std_log_fields(request, session=None):\n \"\"\" returns a standard set of metadata that to each receiver function.\n This ensures that all activities are reporting a consistent set of metrics\n \"\"\"\n try:\n user_agent = request.META['HTTP_USER_AGENT']\n human = is_human(user_agent)\n except KeyError:\n user_agent = None\n human = None\n user_type = None\n user_email_tld = None\n full_domain = None\n if session is not None:\n user_type = get_user_type(session)\n full_domain = get_user_email_domain(session)\n user_email_tld = get_user_email_tld(session, full_domain)\n\n return {\n 'user_ip': get_client_ip(request),\n 'user_type': user_type,\n 'user_email_domain': user_email_tld,\n 'user_email_domain_full': full_domain,\n 'is_human': human,\n 'user_agent': user_agent\n }\n\n\ndef authentic_redirect_url(url):\n \"\"\" Validates a url scheme and netloc is in an existing web app\n :param url: String of a url\n :return: Boolean, True if the url exists in a web app\n \"\"\"\n if not url:\n return False\n u = urlparse(url)\n url_base = \"{}://{}\".format(u.scheme, u.netloc)\n return RequestUrlBase.objects.filter(value__startswith=url_base).exists() \\\n or RequestUrlBaseAggregation.objects.filter(value__startswith=url_base).exists() \\\n or RequestUrlBaseFile.objects.filter(value__startswith=url_base).exists()\n", "path": "hs_tracking/utils.py"}]} | 1,050 | 221 |
gh_patches_debug_12997 | rasdani/github-patches | git_diff | conan-io__conan-center-index-18559 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[package] util-linux-libuuid uses wrong cmake target
### Description
In the following lines, the `util-linux-libuuid` recipe sets the cmake target to be `LibUUID::LibUUID` with a filename of `LibUUID-config.cmake`:
https://github.com/conan-io/conan-center-index/blob/61c4f7819e6cd3594a57f6c3847f94ab86de623f/recipes/util-linux-libuuid/all/conanfile.py#L112-L113
This was based on Kitware's internal practice for their libuuid CMake module; however, this is not public, and a number of packages (czmq, cppcommon) seem to assume a `libuuid::libuuid` target. This change should be reverted so that these packages can use util-linux-libuuid without needing to be patched.
### Package and Environment Details
N/A
### Conan profile
N/A
### Steps to reproduce
N/A
### Logs
<details><summary>Click to expand log</summary>
```
Put your log output here
```
</details>
--- END ISSUE ---
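For reference, Conan generators can emit extra `ALIAS` targets via the `cmake_target_aliases` property, which lets a recipe rename its primary target without breaking consumers that still link the old name. A minimal sketch of how that could look in a recipe's `package_info` (the property values here mirror the target names discussed above):

```python
def package_info(self):
    self.cpp_info.set_property("cmake_file_name", "libuuid")
    self.cpp_info.set_property("cmake_target_name", "libuuid::libuuid")
    # Keep the previous spelling available for consumers already linking LibUUID::LibUUID.
    self.cpp_info.set_property("cmake_target_aliases", ["LibUUID::LibUUID"])
```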
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/util-linux-libuuid/all/conanfile.py`
Content:
```
1 from conan import ConanFile
2 from conan.errors import ConanInvalidConfiguration
3 from conan.tools.apple import fix_apple_shared_install_name
4 from conan.tools.files import copy, get, rm, rmdir
5 from conan.tools.gnu import Autotools, AutotoolsToolchain, AutotoolsDeps
6 from conan.tools.layout import basic_layout
7 from conan.tools.scm import Version
8 import os
9
10 required_conan_version = ">=1.53.0"
11
12
13 class UtilLinuxLibuuidConan(ConanFile):
14 name = "util-linux-libuuid"
15 description = "Universally unique id library"
16 url = "https://github.com/conan-io/conan-center-index"
17 homepage = "https://github.com/util-linux/util-linux.git"
18 license = "BSD-3-Clause"
19 topics = "id", "identifier", "unique", "uuid"
20 package_type = "library"
21 provides = "libuuid"
22 settings = "os", "arch", "compiler", "build_type"
23 options = {
24 "shared": [True, False],
25 "fPIC": [True, False],
26 }
27 default_options = {
28 "shared": False,
29 "fPIC": True,
30 }
31
32 @property
33 def _has_sys_file_header(self):
34 return self.settings.os in ["FreeBSD", "Linux", "Macos"]
35
36 def config_options(self):
37 if self.settings.os == "Windows":
38 del self.options.fPIC
39
40 def configure(self):
41 if self.options.shared:
42 self.options.rm_safe("fPIC")
43 self.settings.rm_safe("compiler.cppstd")
44 self.settings.rm_safe("compiler.libcxx")
45
46 def layout(self):
47 basic_layout(self, src_folder="src")
48
49 def _minimum_compiler_version(self, compiler, build_type):
50 min_version = {
51 "gcc": {
52 "Release": "4",
53 "Debug": "8",
54 },
55 "clang": {
56 "Release": "3",
57 "Debug": "3",
58 },
59 "apple-clang": {
60 "Release": "5",
61 "Debug": "5",
62 },
63 }
64 return min_version.get(str(compiler), {}).get(str(build_type), "0")
65
66 def validate(self):
67 min_version = self._minimum_compiler_version(self.settings.compiler, self.settings.build_type)
68 if Version(self.settings.compiler.version) < min_version:
69 raise ConanInvalidConfiguration(f"{self.settings.compiler} {self.settings.compiler.version} does not meet the minimum version requirement of version {min_version}")
70 if self.settings.os == "Windows":
71 raise ConanInvalidConfiguration(f"{self.ref} is not supported on Windows")
72
73 def requirements(self):
74 if self.settings.os == "Macos":
75 # Required because libintl.{a,dylib} is not distributed via libc on Macos
76 self.requires("libgettext/0.21")
77
78 def source(self):
79 get(self, **self.conan_data["sources"][self.version], strip_root=True)
80
81 def generate(self):
82 tc = AutotoolsToolchain(self)
83 tc.configure_args.append("--disable-all-programs")
84 tc.configure_args.append("--enable-libuuid")
85 if self._has_sys_file_header:
86 tc.extra_defines.append("HAVE_SYS_FILE_H")
87 if "x86" in self.settings.arch:
88 tc.extra_cflags.append("-mstackrealign")
89 tc.generate()
90
91 deps = AutotoolsDeps(self)
92 deps.generate()
93
94 def build(self):
95 autotools = Autotools(self)
96 autotools.configure()
97 autotools.make()
98
99 def package(self):
100 copy(self, "COPYING.BSD-3-Clause", src=os.path.join(self.source_folder, "Documentation", "licenses"), dst=os.path.join(self.package_folder, "licenses"))
101 autotools = Autotools(self)
102 autotools.install()
103 rm(self, "*.la", os.path.join(self.package_folder, "lib"))
104 rmdir(self, os.path.join(self.package_folder, "lib", "pkgconfig"))
105 rmdir(self, os.path.join(self.package_folder, "bin"))
106 rmdir(self, os.path.join(self.package_folder, "sbin"))
107 rmdir(self, os.path.join(self.package_folder, "share"))
108 fix_apple_shared_install_name(self)
109
110 def package_info(self):
111 self.cpp_info.set_property("pkg_config_name", "uuid")
112 self.cpp_info.set_property("cmake_target_name", "LibUUID::LibUUID")
113 self.cpp_info.set_property("cmake_file_name", "LibUUID")
114 self.cpp_info.libs = ["uuid"]
115 self.cpp_info.includedirs.append(os.path.join("include", "uuid"))
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/recipes/util-linux-libuuid/all/conanfile.py b/recipes/util-linux-libuuid/all/conanfile.py
--- a/recipes/util-linux-libuuid/all/conanfile.py
+++ b/recipes/util-linux-libuuid/all/conanfile.py
@@ -109,7 +109,10 @@
def package_info(self):
self.cpp_info.set_property("pkg_config_name", "uuid")
- self.cpp_info.set_property("cmake_target_name", "LibUUID::LibUUID")
- self.cpp_info.set_property("cmake_file_name", "LibUUID")
+ self.cpp_info.set_property("cmake_target_name", "libuuid::libuuid")
+ self.cpp_info.set_property("cmake_file_name", "libuuid")
+ # Maintain alias to `LibUUID::LibUUID` for previous version of the recipe
+ self.cpp_info.set_property("cmake_target_aliases", ["LibUUID::LibUUID"])
+
self.cpp_info.libs = ["uuid"]
self.cpp_info.includedirs.append(os.path.join("include", "uuid"))
| {"golden_diff": "diff --git a/recipes/util-linux-libuuid/all/conanfile.py b/recipes/util-linux-libuuid/all/conanfile.py\n--- a/recipes/util-linux-libuuid/all/conanfile.py\n+++ b/recipes/util-linux-libuuid/all/conanfile.py\n@@ -109,7 +109,10 @@\n \n def package_info(self):\n self.cpp_info.set_property(\"pkg_config_name\", \"uuid\")\n- self.cpp_info.set_property(\"cmake_target_name\", \"LibUUID::LibUUID\")\n- self.cpp_info.set_property(\"cmake_file_name\", \"LibUUID\")\n+ self.cpp_info.set_property(\"cmake_target_name\", \"libuuid::libuuid\")\n+ self.cpp_info.set_property(\"cmake_file_name\", \"libuuid\")\n+ # Maintain alias to `LibUUID::LibUUID` for previous version of the recipe\n+ self.cpp_info.set_property(\"cmake_target_aliases\", [\"LibUUID::LibUUID\"])\n+\n self.cpp_info.libs = [\"uuid\"]\n self.cpp_info.includedirs.append(os.path.join(\"include\", \"uuid\"))\n", "issue": "[package] util-linux-libuuid uses wrong cmake target\n### Description\n\nIn the following lines, the `util-linux-libuuid` recipe sets the cmake target to be `LibUUID::LibUUID` with a filename of `LibUUID-config.cmake`:\r\n\r\nhttps://github.com/conan-io/conan-center-index/blob/61c4f7819e6cd3594a57f6c3847f94ab86de623f/recipes/util-linux-libuuid/all/conanfile.py#L112-L113\r\n\r\nThis was based on the internal practice that Kitware has for their internal libuuid cmake module, however this is not public and a number of packages (czmq, cppcommon) seem to assume a `libuuid::libuuid` target. These change should be reverted such that these packages can utilise util-linux-libuuid without a requirement to be patched.\n\n### Package and Environment Details\n\nN/A\n\n### Conan profile\n\nN/A\r\n\n\n### Steps to reproduce\n\nN/A\n\n### Logs\n\n<details><summary>Click to expand log</summary>\r\n\r\n```\r\nPut your log output here\r\n```\r\n\r\n</details>\r\n\n", "before_files": [{"content": "from conan import ConanFile\nfrom conan.errors import ConanInvalidConfiguration\nfrom conan.tools.apple import fix_apple_shared_install_name\nfrom conan.tools.files import copy, get, rm, rmdir\nfrom conan.tools.gnu import Autotools, AutotoolsToolchain, AutotoolsDeps\nfrom conan.tools.layout import basic_layout\nfrom conan.tools.scm import Version\nimport os\n\nrequired_conan_version = \">=1.53.0\"\n\n\nclass UtilLinuxLibuuidConan(ConanFile):\n name = \"util-linux-libuuid\"\n description = \"Universally unique id library\"\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/util-linux/util-linux.git\"\n license = \"BSD-3-Clause\"\n topics = \"id\", \"identifier\", \"unique\", \"uuid\"\n package_type = \"library\"\n provides = \"libuuid\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n }\n\n @property\n def _has_sys_file_header(self):\n return self.settings.os in [\"FreeBSD\", \"Linux\", \"Macos\"]\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def configure(self):\n if self.options.shared:\n self.options.rm_safe(\"fPIC\")\n self.settings.rm_safe(\"compiler.cppstd\")\n self.settings.rm_safe(\"compiler.libcxx\")\n\n def layout(self):\n basic_layout(self, src_folder=\"src\")\n\n def _minimum_compiler_version(self, compiler, build_type):\n min_version = {\n \"gcc\": {\n \"Release\": \"4\",\n \"Debug\": \"8\",\n },\n \"clang\": {\n \"Release\": \"3\",\n \"Debug\": \"3\",\n },\n \"apple-clang\": {\n \"Release\": 
\"5\",\n \"Debug\": \"5\",\n },\n }\n return min_version.get(str(compiler), {}).get(str(build_type), \"0\")\n\n def validate(self):\n min_version = self._minimum_compiler_version(self.settings.compiler, self.settings.build_type)\n if Version(self.settings.compiler.version) < min_version:\n raise ConanInvalidConfiguration(f\"{self.settings.compiler} {self.settings.compiler.version} does not meet the minimum version requirement of version {min_version}\")\n if self.settings.os == \"Windows\":\n raise ConanInvalidConfiguration(f\"{self.ref} is not supported on Windows\")\n\n def requirements(self):\n if self.settings.os == \"Macos\":\n # Required because libintl.{a,dylib} is not distributed via libc on Macos\n self.requires(\"libgettext/0.21\")\n\n def source(self):\n get(self, **self.conan_data[\"sources\"][self.version], strip_root=True)\n\n def generate(self):\n tc = AutotoolsToolchain(self)\n tc.configure_args.append(\"--disable-all-programs\")\n tc.configure_args.append(\"--enable-libuuid\")\n if self._has_sys_file_header:\n tc.extra_defines.append(\"HAVE_SYS_FILE_H\")\n if \"x86\" in self.settings.arch:\n tc.extra_cflags.append(\"-mstackrealign\")\n tc.generate()\n\n deps = AutotoolsDeps(self)\n deps.generate()\n\n def build(self):\n autotools = Autotools(self)\n autotools.configure()\n autotools.make()\n\n def package(self):\n copy(self, \"COPYING.BSD-3-Clause\", src=os.path.join(self.source_folder, \"Documentation\", \"licenses\"), dst=os.path.join(self.package_folder, \"licenses\"))\n autotools = Autotools(self)\n autotools.install()\n rm(self, \"*.la\", os.path.join(self.package_folder, \"lib\"))\n rmdir(self, os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n rmdir(self, os.path.join(self.package_folder, \"bin\"))\n rmdir(self, os.path.join(self.package_folder, \"sbin\"))\n rmdir(self, os.path.join(self.package_folder, \"share\"))\n fix_apple_shared_install_name(self)\n\n def package_info(self):\n self.cpp_info.set_property(\"pkg_config_name\", \"uuid\")\n self.cpp_info.set_property(\"cmake_target_name\", \"LibUUID::LibUUID\")\n self.cpp_info.set_property(\"cmake_file_name\", \"LibUUID\")\n self.cpp_info.libs = [\"uuid\"]\n self.cpp_info.includedirs.append(os.path.join(\"include\", \"uuid\"))\n", "path": "recipes/util-linux-libuuid/all/conanfile.py"}], "after_files": [{"content": "from conan import ConanFile\nfrom conan.errors import ConanInvalidConfiguration\nfrom conan.tools.apple import fix_apple_shared_install_name\nfrom conan.tools.files import copy, get, rm, rmdir\nfrom conan.tools.gnu import Autotools, AutotoolsToolchain, AutotoolsDeps\nfrom conan.tools.layout import basic_layout\nfrom conan.tools.scm import Version\nimport os\n\nrequired_conan_version = \">=1.53.0\"\n\n\nclass UtilLinuxLibuuidConan(ConanFile):\n name = \"util-linux-libuuid\"\n description = \"Universally unique id library\"\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/util-linux/util-linux.git\"\n license = \"BSD-3-Clause\"\n topics = \"id\", \"identifier\", \"unique\", \"uuid\"\n package_type = \"library\"\n provides = \"libuuid\"\n settings = \"os\", \"arch\", \"compiler\", \"build_type\"\n options = {\n \"shared\": [True, False],\n \"fPIC\": [True, False],\n }\n default_options = {\n \"shared\": False,\n \"fPIC\": True,\n }\n\n @property\n def _has_sys_file_header(self):\n return self.settings.os in [\"FreeBSD\", \"Linux\", \"Macos\"]\n\n def config_options(self):\n if self.settings.os == \"Windows\":\n del self.options.fPIC\n\n def 
configure(self):\n if self.options.shared:\n self.options.rm_safe(\"fPIC\")\n self.settings.rm_safe(\"compiler.cppstd\")\n self.settings.rm_safe(\"compiler.libcxx\")\n\n def layout(self):\n basic_layout(self, src_folder=\"src\")\n\n def _minimum_compiler_version(self, compiler, build_type):\n min_version = {\n \"gcc\": {\n \"Release\": \"4\",\n \"Debug\": \"8\",\n },\n \"clang\": {\n \"Release\": \"3\",\n \"Debug\": \"3\",\n },\n \"apple-clang\": {\n \"Release\": \"5\",\n \"Debug\": \"5\",\n },\n }\n return min_version.get(str(compiler), {}).get(str(build_type), \"0\")\n\n def validate(self):\n min_version = self._minimum_compiler_version(self.settings.compiler, self.settings.build_type)\n if Version(self.settings.compiler.version) < min_version:\n raise ConanInvalidConfiguration(f\"{self.settings.compiler} {self.settings.compiler.version} does not meet the minimum version requirement of version {min_version}\")\n if self.settings.os == \"Windows\":\n raise ConanInvalidConfiguration(f\"{self.ref} is not supported on Windows\")\n\n def requirements(self):\n if self.settings.os == \"Macos\":\n # Required because libintl.{a,dylib} is not distributed via libc on Macos\n self.requires(\"libgettext/0.21\")\n\n def source(self):\n get(self, **self.conan_data[\"sources\"][self.version], strip_root=True)\n\n def generate(self):\n tc = AutotoolsToolchain(self)\n tc.configure_args.append(\"--disable-all-programs\")\n tc.configure_args.append(\"--enable-libuuid\")\n if self._has_sys_file_header:\n tc.extra_defines.append(\"HAVE_SYS_FILE_H\")\n if \"x86\" in self.settings.arch:\n tc.extra_cflags.append(\"-mstackrealign\")\n tc.generate()\n\n deps = AutotoolsDeps(self)\n deps.generate()\n\n def build(self):\n autotools = Autotools(self)\n autotools.configure()\n autotools.make()\n\n def package(self):\n copy(self, \"COPYING.BSD-3-Clause\", src=os.path.join(self.source_folder, \"Documentation\", \"licenses\"), dst=os.path.join(self.package_folder, \"licenses\"))\n autotools = Autotools(self)\n autotools.install()\n rm(self, \"*.la\", os.path.join(self.package_folder, \"lib\"))\n rmdir(self, os.path.join(self.package_folder, \"lib\", \"pkgconfig\"))\n rmdir(self, os.path.join(self.package_folder, \"bin\"))\n rmdir(self, os.path.join(self.package_folder, \"sbin\"))\n rmdir(self, os.path.join(self.package_folder, \"share\"))\n fix_apple_shared_install_name(self)\n\n def package_info(self):\n self.cpp_info.set_property(\"pkg_config_name\", \"uuid\")\n self.cpp_info.set_property(\"cmake_target_name\", \"libuuid::libuuid\")\n self.cpp_info.set_property(\"cmake_file_name\", \"libuuid\")\n # Maintain alias to `LibUUID::LibUUID` for previous version of the recipe\n self.cpp_info.set_property(\"cmake_target_aliases\", [\"LibUUID::LibUUID\"])\n\n self.cpp_info.libs = [\"uuid\"]\n self.cpp_info.includedirs.append(os.path.join(\"include\", \"uuid\"))\n", "path": "recipes/util-linux-libuuid/all/conanfile.py"}]} | 1,759 | 233 |
gh_patches_debug_959 | rasdani/github-patches | git_diff | getsentry__sentry-52329 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
fix(django): Disable admin on prod
Reported here: https://forum.sentry.io/t/sentry-django-admin-portal/12787?u=byk
--- END ISSUE ---
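A common way to do this is to gate the admin URL patterns behind an explicit setting so they are only mounted where an operator opts in. A minimal sketch, assuming an `ADMIN_ENABLED` flag in settings:

```python
from django.conf import settings

urlpatterns = []

if "django.contrib.admin" in settings.INSTALLED_APPS and getattr(settings, "ADMIN_ENABLED", False):
    from sentry import django_admin

    urlpatterns += django_admin.urlpatterns
```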
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/sentry/conf/urls.py`
Content:
```
1 from __future__ import annotations
2
3 from django.conf import settings
4 from django.urls import URLPattern, URLResolver, re_path
5
6 from sentry.web.frontend import csrf_failure
7 from sentry.web.frontend.error_404 import Error404View
8 from sentry.web.frontend.error_500 import Error500View
9 from sentry.web.urls import urlpatterns as web_urlpatterns
10
11 handler404 = Error404View.as_view()
12 handler500 = Error500View.as_view()
13
14 urlpatterns: list[URLResolver | URLPattern] = [
15 re_path(
16 r"^500/",
17 handler500,
18 name="error-500",
19 ),
20 re_path(
21 r"^404/",
22 handler404,
23 name="error-404",
24 ),
25 re_path(
26 r"^403-csrf-failure/",
27 csrf_failure.view,
28 name="error-403-csrf-failure",
29 ),
30 ]
31
32 if "django.contrib.admin" in settings.INSTALLED_APPS:
33 from sentry import django_admin
34
35 urlpatterns += django_admin.urlpatterns
36
37 urlpatterns += web_urlpatterns
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/sentry/conf/urls.py b/src/sentry/conf/urls.py
--- a/src/sentry/conf/urls.py
+++ b/src/sentry/conf/urls.py
@@ -29,7 +29,7 @@
),
]
-if "django.contrib.admin" in settings.INSTALLED_APPS:
+if "django.contrib.admin" in settings.INSTALLED_APPS and settings.ADMIN_ENABLED:
from sentry import django_admin
urlpatterns += django_admin.urlpatterns
| {"golden_diff": "diff --git a/src/sentry/conf/urls.py b/src/sentry/conf/urls.py\n--- a/src/sentry/conf/urls.py\n+++ b/src/sentry/conf/urls.py\n@@ -29,7 +29,7 @@\n ),\n ]\n \n-if \"django.contrib.admin\" in settings.INSTALLED_APPS:\n+if \"django.contrib.admin\" in settings.INSTALLED_APPS and settings.ADMIN_ENABLED:\n from sentry import django_admin\n \n urlpatterns += django_admin.urlpatterns\n", "issue": "fix(django): Disable admin on prod\nReported here: https://forum.sentry.io/t/sentry-django-admin-portal/12787?u=byk\n\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom django.conf import settings\nfrom django.urls import URLPattern, URLResolver, re_path\n\nfrom sentry.web.frontend import csrf_failure\nfrom sentry.web.frontend.error_404 import Error404View\nfrom sentry.web.frontend.error_500 import Error500View\nfrom sentry.web.urls import urlpatterns as web_urlpatterns\n\nhandler404 = Error404View.as_view()\nhandler500 = Error500View.as_view()\n\nurlpatterns: list[URLResolver | URLPattern] = [\n re_path(\n r\"^500/\",\n handler500,\n name=\"error-500\",\n ),\n re_path(\n r\"^404/\",\n handler404,\n name=\"error-404\",\n ),\n re_path(\n r\"^403-csrf-failure/\",\n csrf_failure.view,\n name=\"error-403-csrf-failure\",\n ),\n]\n\nif \"django.contrib.admin\" in settings.INSTALLED_APPS:\n from sentry import django_admin\n\n urlpatterns += django_admin.urlpatterns\n\nurlpatterns += web_urlpatterns\n", "path": "src/sentry/conf/urls.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom django.conf import settings\nfrom django.urls import URLPattern, URLResolver, re_path\n\nfrom sentry.web.frontend import csrf_failure\nfrom sentry.web.frontend.error_404 import Error404View\nfrom sentry.web.frontend.error_500 import Error500View\nfrom sentry.web.urls import urlpatterns as web_urlpatterns\n\nhandler404 = Error404View.as_view()\nhandler500 = Error500View.as_view()\n\nurlpatterns: list[URLResolver | URLPattern] = [\n re_path(\n r\"^500/\",\n handler500,\n name=\"error-500\",\n ),\n re_path(\n r\"^404/\",\n handler404,\n name=\"error-404\",\n ),\n re_path(\n r\"^403-csrf-failure/\",\n csrf_failure.view,\n name=\"error-403-csrf-failure\",\n ),\n]\n\nif \"django.contrib.admin\" in settings.INSTALLED_APPS and settings.ADMIN_ENABLED:\n from sentry import django_admin\n\n urlpatterns += django_admin.urlpatterns\n\nurlpatterns += web_urlpatterns\n", "path": "src/sentry/conf/urls.py"}]} | 625 | 103 |
gh_patches_debug_1305 | rasdani/github-patches | git_diff | oppia__oppia-7459 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Upgrade @typescript-eslint/eslint-plugin
`eslint-utils` is currently out of date, https://github.com/oppia/oppia/pull/7451 provides a temporary fix, but we need to upgrade the main package that requires `eslint-utils` to ensure that we have a long term fix.
When fixing this, please make sure that the lint tests run successfully.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/domain/feedback_jobs_one_off.py`
Content:
```
1 # Copyright 2019 The Oppia Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS-IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """One-off jobs for feedback models."""
16
17 from core import jobs
18 from core.platform import models
19
20 (feedback_models,) = models.Registry.import_models([models.NAMES.feedback])
21
22
23 class GeneralFeedbackThreadUserOneOffJob(jobs.BaseMapReduceOneOffJobManager):
24 """One-off job for setting user_id and thread_id for all
25 GeneralFeedbackThreadUserModels.
26 """
27 @classmethod
28 def entity_classes_to_map_over(cls):
29 """Return a list of datastore class references to map over."""
30 return [feedback_models.GeneralFeedbackThreadUserModel]
31
32 @staticmethod
33 def map(model_instance):
34 """Implements the map function for this job."""
35 user_id, thread_id = model_instance.id.split('.', 1)
36 if model_instance.user_id is None:
37 model_instance.user_id = user_id
38 if model_instance.thread_id is None:
39 model_instance.thread_id = thread_id
40 model_instance.put(update_last_updated_time=False)
41 yield ('SUCCESS', model_instance.id)
42
43 @staticmethod
44 def reduce(key, values):
45 yield (key, len(values))
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/domain/feedback_jobs_one_off.py b/core/domain/feedback_jobs_one_off.py
--- a/core/domain/feedback_jobs_one_off.py
+++ b/core/domain/feedback_jobs_one_off.py
@@ -13,6 +13,7 @@
# limitations under the License.
"""One-off jobs for feedback models."""
+from __future__ import absolute_import # pylint: disable=import-only-modules
from core import jobs
from core.platform import models
| {"golden_diff": "diff --git a/core/domain/feedback_jobs_one_off.py b/core/domain/feedback_jobs_one_off.py\n--- a/core/domain/feedback_jobs_one_off.py\n+++ b/core/domain/feedback_jobs_one_off.py\n@@ -13,6 +13,7 @@\n # limitations under the License.\n \n \"\"\"One-off jobs for feedback models.\"\"\"\n+from __future__ import absolute_import # pylint: disable=import-only-modules\n \n from core import jobs\n from core.platform import models\n", "issue": "Upgrade @typescript-eslint/eslint-plugin\n`eslint-utils` is currently out of date, https://github.com/oppia/oppia/pull/7451 provides a temporary fix, but we need to upgrade the main package that requires `eslint-utils` to ensure that we have a long term fix. \r\n\r\nWhen fixing this, please make sure that the lint tests run successfully.\n", "before_files": [{"content": "# Copyright 2019 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"One-off jobs for feedback models.\"\"\"\n\nfrom core import jobs\nfrom core.platform import models\n\n(feedback_models,) = models.Registry.import_models([models.NAMES.feedback])\n\n\nclass GeneralFeedbackThreadUserOneOffJob(jobs.BaseMapReduceOneOffJobManager):\n \"\"\"One-off job for setting user_id and thread_id for all\n GeneralFeedbackThreadUserModels.\n \"\"\"\n @classmethod\n def entity_classes_to_map_over(cls):\n \"\"\"Return a list of datastore class references to map over.\"\"\"\n return [feedback_models.GeneralFeedbackThreadUserModel]\n\n @staticmethod\n def map(model_instance):\n \"\"\"Implements the map function for this job.\"\"\"\n user_id, thread_id = model_instance.id.split('.', 1)\n if model_instance.user_id is None:\n model_instance.user_id = user_id\n if model_instance.thread_id is None:\n model_instance.thread_id = thread_id\n model_instance.put(update_last_updated_time=False)\n yield ('SUCCESS', model_instance.id)\n\n @staticmethod\n def reduce(key, values):\n yield (key, len(values))\n", "path": "core/domain/feedback_jobs_one_off.py"}], "after_files": [{"content": "# Copyright 2019 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"One-off jobs for feedback models.\"\"\"\nfrom __future__ import absolute_import # pylint: disable=import-only-modules\n\nfrom core import jobs\nfrom core.platform import models\n\n(feedback_models,) = models.Registry.import_models([models.NAMES.feedback])\n\n\nclass GeneralFeedbackThreadUserOneOffJob(jobs.BaseMapReduceOneOffJobManager):\n \"\"\"One-off job for setting user_id and thread_id for all\n GeneralFeedbackThreadUserModels.\n \"\"\"\n @classmethod\n def entity_classes_to_map_over(cls):\n \"\"\"Return a list of datastore class references to map over.\"\"\"\n return [feedback_models.GeneralFeedbackThreadUserModel]\n\n @staticmethod\n def map(model_instance):\n \"\"\"Implements the map function for this job.\"\"\"\n user_id, thread_id = model_instance.id.split('.', 1)\n if model_instance.user_id is None:\n model_instance.user_id = user_id\n if model_instance.thread_id is None:\n model_instance.thread_id = thread_id\n model_instance.put(update_last_updated_time=False)\n yield ('SUCCESS', model_instance.id)\n\n @staticmethod\n def reduce(key, values):\n yield (key, len(values))\n", "path": "core/domain/feedback_jobs_one_off.py"}]} | 790 | 99
gh_patches_debug_21117 | rasdani/github-patches | git_diff | mlcommons__GaNDLF-614 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CCA failure when enabled
**Describe the bug**
The CCA (Largest Connected Component Analysis) function was implemented as a standalone function, which causes it to fail when called in the segmentation pipeline with post-processing enabled. Because of this mismatch, a failure is the likely outcome.
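A hypothetical sketch of the mismatch (the dispatcher name and calling convention below are assumptions for illustration, not GaNDLF's actual code), showing why a one-argument `cca` breaks once every post-processing hook is invoked as `hook(image, params)`:
```python
# Illustrative only: names and calling convention are assumed, not GaNDLF code.
def cca(input_image):  # standalone signature, no `params` argument
    return input_image

def apply_postprocessing(image, hooks, params):
    for hook in hooks:
        image = hook(image, params)  # every hook is called with two arguments
    return image

# apply_postprocessing(img, [cca], {}) would raise:
# TypeError: cca() takes 1 positional argument but 2 were given
```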
**To Reproduce**
Run a segmentation pipeline with CCA enabled for the post-processing.
**Expected behavior**
The CCA function should be corrected, integrated with the segmentation pipeline so that it works correctly, and tested.
**GaNDLF Version**
<!-- Put the output of the following command:
python -c 'import GANDLF as g;print(g.__version__)'
-->
Version information of the GaNDLF package in the virtual environment. 0.0.16-dev
**Desktop (please complete the following information):**
- OS: Linux, Ubuntu
- Version (including Build information, if any): 22.04
**Additional context**
None
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `GANDLF/data/post_process/morphology.py`
Content:
```
1 import torch
2 import torch.nn.functional as F
3 from skimage.measure import label
4 import numpy as np
5 from scipy.ndimage import binary_fill_holes, binary_closing
6 from GANDLF.utils.generic import get_array_from_image_or_tensor
7
8
9 def torch_morphological(input_image, kernel_size=1, mode="dilation"):
10 """
11 This function enables morphological operations using torch. Adapted from https://github.com/DIVA-DIA/Generating-Synthetic-Handwritten-Historical-Documents/blob/e6a798dc2b374f338804222747c56cb44869af5b/HTR_ctc/utils/auxilary_functions.py#L10.
12
13 Args:
14 input_image (torch.Tensor): The input image.
15 kernel_size (list): The size of the window to take a max over.
16 mode (str): The type of morphological operation to perform.
17
18 Returns:
19 torch.Tensor: The output image after morphological operations.
20 """
21
22 if len(input_image.shape) == 4:
23 max_pool = F.max_pool2d
24 elif len(input_image.shape) == 5:
25 max_pool = F.max_pool3d
26 else:
27 raise ValueError("Input image has invalid shape for morphological operations.")
28
29 if mode == "dilation":
30 output_image = max_pool(
31 input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2
32 )
33 elif mode == "erosion":
34 output_image = -max_pool(
35 -input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2
36 )
37 elif mode == "closing":
38 output_image = max_pool(
39 input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2
40 )
41 output_image = -max_pool(
42 -output_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2
43 )
44 elif mode == "opening":
45 output_image = -max_pool(
46 -input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2
47 )
48 output_image = max_pool(
49 output_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2
50 )
51
52 return output_image
53
54
55 def fill_holes(input_image, params=None):
56 """
57 This function fills holes in masks.
58
59 Args:
60 input_image (torch.Tensor): The input image.
61 params (dict): The parameters dict; unused.
62
63 Returns:
64 torch.Tensor: The output image after morphological operations.
65 """
66 input_image_array = get_array_from_image_or_tensor(input_image).astype(int)
67 input_image_array_closed = binary_closing(input_image_array)
68 # Fill the holes in binary objects
69 output_array = binary_fill_holes(input_image_array_closed).astype(int)
70
71 return torch.from_numpy(output_array)
72
73
74 def cca(input_image):
75 """
76 This function performs connected component analysis on the input image.
77
78 Args:
79 input_image (torch.Tensor): The input image.
80 params (dict): The parameters dict;
81
82 Returns:
83 torch.Tensor: The output image after morphological operations.
84 """
85 seg = get_array_from_image_or_tensor(input_image)
86 mask = seg != 0
87
88 connectivity = input_image.dim() - 1
89 labels_connected = label(mask, connectivity=connectivity)
90 labels_connected_sizes = [
91 np.sum(labels_connected == i) for i in np.unique(labels_connected)
92 ]
93 largest_region = np.argmax(labels_connected_sizes[1:]) + 1
94 seg[labels_connected != largest_region] = 0
95 return seg
96
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/GANDLF/data/post_process/morphology.py b/GANDLF/data/post_process/morphology.py
--- a/GANDLF/data/post_process/morphology.py
+++ b/GANDLF/data/post_process/morphology.py
@@ -71,7 +71,7 @@
return torch.from_numpy(output_array)
-def cca(input_image):
+def cca(input_image, params=None):
"""
This function performs connected component analysis on the input image.
@@ -85,11 +85,15 @@
seg = get_array_from_image_or_tensor(input_image)
mask = seg != 0
- connectivity = input_image.dim() - 1
+ connectivity = input_image.ndim - 1
labels_connected = label(mask, connectivity=connectivity)
labels_connected_sizes = [
np.sum(labels_connected == i) for i in np.unique(labels_connected)
]
- largest_region = np.argmax(labels_connected_sizes[1:]) + 1
+ largest_region = 0
+ if len(labels_connected_sizes) > 1:
+ largest_region = np.argmax(labels_connected_sizes[1:]) + 1
seg[labels_connected != largest_region] = 0
+
return seg
+
| {"golden_diff": "diff --git a/GANDLF/data/post_process/morphology.py b/GANDLF/data/post_process/morphology.py\n--- a/GANDLF/data/post_process/morphology.py\n+++ b/GANDLF/data/post_process/morphology.py\n@@ -71,7 +71,7 @@\n return torch.from_numpy(output_array)\n \n \n-def cca(input_image):\n+def cca(input_image, params=None):\n \"\"\"\n This function performs connected component analysis on the input image.\n \n@@ -85,11 +85,15 @@\n seg = get_array_from_image_or_tensor(input_image)\n mask = seg != 0\n \n- connectivity = input_image.dim() - 1\n+ connectivity = input_image.ndim - 1\n labels_connected = label(mask, connectivity=connectivity)\n labels_connected_sizes = [\n np.sum(labels_connected == i) for i in np.unique(labels_connected)\n ]\n- largest_region = np.argmax(labels_connected_sizes[1:]) + 1\n+ largest_region = 0\n+ if len(labels_connected_sizes) > 1:\n+ largest_region = np.argmax(labels_connected_sizes[1:]) + 1\n seg[labels_connected != largest_region] = 0\n+\n return seg\n+\n", "issue": "CCA failure when enabled\n**Describe the bug**\nThe CCA (Largest Connected Component Analysis) function was implemented as a standalone function, which causes it to fail when called in the segmentation pipeline with post-processing enabled. The expected behavior is a likely failure due to this issue.\r\n\r\n**To Reproduce**\r\nRun a segmentation pipeline with CCA enabled for the post-processing.\r\n\r\n**Expected behavior**\r\nThe CCA function should be corrected and integrated with the segmentation pipeline to work correctly and tested\r\n\r\n**GaNDLF Version**\r\n<!-- Put the output of the following command:\r\npython -c 'import GANDLF as g;print(g.__version__)'\r\n-->\r\nVersion information of the GaNDLF package in the virtual environment. 0.0.16-dev\r\n\r\n**Desktop (please complete the following information):**\r\n - OS: Linux, Ubuntu\r\n - Version (including Build information, if any): 22.04\r\n\r\n**Additional context**\r\nNone\r\n\n", "before_files": [{"content": "import torch\nimport torch.nn.functional as F\nfrom skimage.measure import label\nimport numpy as np\nfrom scipy.ndimage import binary_fill_holes, binary_closing\nfrom GANDLF.utils.generic import get_array_from_image_or_tensor\n\n\ndef torch_morphological(input_image, kernel_size=1, mode=\"dilation\"):\n \"\"\"\n This function enables morphological operations using torch. Adapted from https://github.com/DIVA-DIA/Generating-Synthetic-Handwritten-Historical-Documents/blob/e6a798dc2b374f338804222747c56cb44869af5b/HTR_ctc/utils/auxilary_functions.py#L10.\n\n Args:\n input_image (torch.Tensor): The input image.\n kernel_size (list): The size of the window to take a max over.\n mode (str): The type of morphological operation to perform.\n\n Returns:\n torch.Tensor: The output image after morphological operations.\n \"\"\"\n\n if len(input_image.shape) == 4:\n max_pool = F.max_pool2d\n elif len(input_image.shape) == 5:\n max_pool = F.max_pool3d\n else:\n raise ValueError(\"Input image has invalid shape for morphological operations.\")\n\n if mode == \"dilation\":\n output_image = max_pool(\n input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n elif mode == \"erosion\":\n output_image = -max_pool(\n -input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n elif mode == \"closing\":\n output_image = max_pool(\n input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n output_image = -max_pool(\n -output_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n elif mode == \"opening\":\n output_image = -max_pool(\n -input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n output_image = max_pool(\n output_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n\n return output_image\n\n\ndef fill_holes(input_image, params=None):\n \"\"\"\n This function fills holes in masks.\n\n Args:\n input_image (torch.Tensor): The input image.\n params (dict): The parameters dict; unused.\n\n Returns:\n torch.Tensor: The output image after morphological operations.\n \"\"\"\n input_image_array = get_array_from_image_or_tensor(input_image).astype(int)\n input_image_array_closed = binary_closing(input_image_array)\n # Fill the holes in binary objects\n output_array = binary_fill_holes(input_image_array_closed).astype(int)\n\n return torch.from_numpy(output_array)\n\n\ndef cca(input_image):\n \"\"\"\n This function performs connected component analysis on the input image.\n\n Args:\n input_image (torch.Tensor): The input image.\n params (dict): The parameters dict;\n\n Returns:\n torch.Tensor: The output image after morphological operations.\n \"\"\"\n seg = get_array_from_image_or_tensor(input_image)\n mask = seg != 0\n\n connectivity = input_image.dim() - 1\n labels_connected = label(mask, connectivity=connectivity)\n labels_connected_sizes = [\n np.sum(labels_connected == i) for i in np.unique(labels_connected)\n ]\n largest_region = np.argmax(labels_connected_sizes[1:]) + 1\n seg[labels_connected != largest_region] = 0\n return seg\n", "path": "GANDLF/data/post_process/morphology.py"}], "after_files": [{"content": "import torch\nimport torch.nn.functional as F\nfrom skimage.measure import label\nimport numpy as np\nfrom scipy.ndimage import binary_fill_holes, binary_closing\nfrom GANDLF.utils.generic import get_array_from_image_or_tensor\n\n\ndef torch_morphological(input_image, kernel_size=1, mode=\"dilation\"):\n \"\"\"\n This function enables morphological operations using torch. Adapted from https://github.com/DIVA-DIA/Generating-Synthetic-Handwritten-Historical-Documents/blob/e6a798dc2b374f338804222747c56cb44869af5b/HTR_ctc/utils/auxilary_functions.py#L10.\n\n Args:\n input_image (torch.Tensor): The input image.\n kernel_size (list): The size of the window to take a max over.\n mode (str): The type of morphological operation to perform.\n\n Returns:\n torch.Tensor: The output image after morphological operations.\n \"\"\"\n\n if len(input_image.shape) == 4:\n max_pool = F.max_pool2d\n elif len(input_image.shape) == 5:\n max_pool = F.max_pool3d\n else:\n raise ValueError(\"Input image has invalid shape for morphological operations.\")\n\n if mode == \"dilation\":\n output_image = max_pool(\n input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n elif mode == \"erosion\":\n output_image = -max_pool(\n -input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n elif mode == \"closing\":\n output_image = max_pool(\n input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n output_image = -max_pool(\n -output_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n elif mode == \"opening\":\n output_image = -max_pool(\n -input_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n output_image = max_pool(\n output_image, kernel_size=kernel_size, stride=1, padding=kernel_size // 2\n )\n\n return output_image\n\n\ndef fill_holes(input_image, params=None):\n \"\"\"\n This function fills holes in masks.\n\n Args:\n input_image (torch.Tensor): The input image.\n params (dict): The parameters dict; unused.\n\n Returns:\n torch.Tensor: The output image after morphological operations.\n \"\"\"\n input_image_array = get_array_from_image_or_tensor(input_image).astype(int)\n input_image_array_closed = binary_closing(input_image_array)\n # Fill the holes in binary objects\n output_array = binary_fill_holes(input_image_array_closed).astype(int)\n\n return torch.from_numpy(output_array)\n\n\ndef cca(input_image, params=None):\n \"\"\"\n This function performs connected component analysis on the input image.\n\n Args:\n input_image (torch.Tensor): The input image.\n params (dict): The parameters dict;\n\n Returns:\n torch.Tensor: The output image after morphological operations.\n \"\"\"\n seg = get_array_from_image_or_tensor(input_image)\n mask = seg != 0\n\n connectivity = input_image.ndim - 1\n labels_connected = label(mask, connectivity=connectivity)\n labels_connected_sizes = [\n np.sum(labels_connected == i) for i in np.unique(labels_connected)\n ]\n largest_region = 0\n if len(labels_connected_sizes) > 1:\n largest_region = np.argmax(labels_connected_sizes[1:]) + 1\n seg[labels_connected != largest_region] = 0\n\n return seg\n", "path": "GANDLF/data/post_process/morphology.py"}]} | 1,447 | 270
gh_patches_debug_42751 | rasdani/github-patches | git_diff | cloudtools__troposphere-1703 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AWS::ImageBuilder::* some wrong data types and attribut missing
In imagebuilder.py (2.6.1 release).
* In AWS::ImageBuilder::*
"Tags" are "json_checker" but should be dict
When encoded in the structure it becomes a string (validator.py, ligne 258,` json.dumps(prop)`) which is creating an issue with CloudFormation that expect a struct like : `"Tags" : {Key : Value, ...}`
* AWS::ImageBuilder::DistributionConfiguration::Distribution
"AmiDistributionConfiguration" is "json_checker" but should be dict.
For the same as above "Tags"
* In AWS::ImageBuilder::Component
"Data" is missing. And should be "json_checker" in that case.
* In AWS::ImageBuilder::ImageRecipe::InstanceBlockDeviceMapping
"NoDevice" is boolean but should be a string
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/imagebuilder.py`
Content:
```
1 # Copyright (c) 2020, Mark Peek <[email protected]>
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5
6 from . import AWSObject, AWSProperty
7 from .validators import (integer, boolean, json_checker,
8 component_platforms, imagepipeline_status,
9 schedule_pipelineexecutionstartcondition,
10 ebsinstanceblockdevicespecification_volume_type)
11
12
13 class S3Logs(AWSProperty):
14 props = {
15 "S3BucketName": (basestring, False),
16 "S3KeyPrefix": (basestring, False),
17 }
18
19
20 class Logging(AWSProperty):
21 props = {
22 'S3Logs': (S3Logs, False),
23 }
24
25
26 class InfrastructureConfiguration(AWSObject):
27 resource_type = "AWS::ImageBuilder::InfrastructureConfiguration"
28
29 props = {
30 'Description': (basestring, False),
31 'InstanceProfileName': (basestring, True),
32 'InstanceTypes': ([basestring], False),
33 'KeyPair': (basestring, False),
34 'Logging': (Logging, False),
35 'Name': (basestring, True),
36 'SecurityGroupIds': ([basestring], False),
37 'SnsTopicArn': (basestring, False),
38 'SubnetId': (basestring, False),
39 'Tags': (json_checker, False),
40 'TerminateInstanceOnFailure': (boolean, False)
41 }
42
43
44 class EbsInstanceBlockDeviceSpecification(AWSProperty):
45 props = {
46 'DeleteOnTermination': (boolean, False),
47 'Encrypted': (boolean, False),
48 'Iops': (integer, False),
49 'KmsKeyId': (basestring, False),
50 'SnapshotId': (basestring, False),
51 'VolumeSize': (integer, False),
52 'VolumeType': (ebsinstanceblockdevicespecification_volume_type, False),
53 }
54
55
56 class InstanceBlockDeviceMapping(AWSProperty):
57 props = {
58 'DeviceName': (basestring, False),
59 'Ebs': (EbsInstanceBlockDeviceSpecification, False),
60 'NoDevice': (boolean, False),
61 'VirtualName': (basestring, False),
62 }
63
64
65 class ComponentConfiguration(AWSProperty):
66 props = {
67 'ComponentArn': (basestring, False),
68 }
69
70
71 class ImageRecipe(AWSObject):
72 resource_type = "AWS::ImageBuilder::ImageRecipe"
73
74 props = {
75 'BlockDeviceMappings': ([InstanceBlockDeviceMapping], False),
76 'Components': ([ComponentConfiguration], True),
77 'Description': (basestring, False),
78 'Name': (basestring, True),
79 'ParentImage': (basestring, True),
80 'Tags': (json_checker, False),
81 'Version': (basestring, True)
82 }
83
84
85 class ImageTestsConfiguration(AWSProperty):
86 props = {
87 'ImageTestsEnabled': (boolean, False),
88 'TimeoutMinutes': (integer, False),
89 }
90
91
92 class Schedule(AWSProperty):
93 props = {
94 'PipelineExecutionStartCondition': (schedule_pipelineexecutionstartcondition, False), # NOQA
95 'ScheduleExpression': (basestring, False),
96 }
97
98
99 class ImagePipeline(AWSObject):
100 resource_type = "AWS::ImageBuilder::ImagePipeline"
101
102 props = {
103 'Description': (basestring, False),
104 'DistributionConfigurationArn': (basestring, False),
105 'ImageRecipeArn': (basestring, True),
106 'ImageTestsConfiguration': (ImageTestsConfiguration, False),
107 'InfrastructureConfigurationArn': (basestring, True),
108 'Name': (basestring, True),
109 'Schedule': (Schedule, False),
110 'Status': (imagepipeline_status, False),
111 'Tags': (json_checker, False),
112 }
113
114
115 class Distribution(AWSProperty):
116 props = {
117 'AmiDistributionConfiguration': (json_checker, False),
118 'LicenseConfigurationArns': ([basestring], False),
119 'Region': (basestring, False),
120 }
121
122
123 class DistributionConfiguration(AWSObject):
124 resource_type = "AWS::ImageBuilder::DistributionConfiguration"
125
126 props = {
127 'Description': (basestring, False),
128 'Distributions': ([Distribution], True),
129 'Name': (basestring, True),
130 'Tags': (json_checker, False),
131 }
132
133
134 class Component(AWSObject):
135 resource_type = "AWS::ImageBuilder::Component"
136
137 props = {
138 'ChangeDescription': (basestring, False),
139 'Description': (basestring, False),
140 'KmsKeyId': (basestring, False),
141 'Name': (basestring, True),
142 'Platform': (component_platforms, True),
143 'Tags': (json_checker, False),
144 'Uri': (basestring, False),
145 'Version': (basestring, True),
146 }
147
148
149 class Image(AWSObject):
150 resource_type = "AWS::ImageBuilder::Image"
151
152 props = {
153 'DistributionConfigurationArn': (basestring, False),
154 'ImageRecipeArn': (basestring, True),
155 'ImageTestsConfiguration': (ImageTestsConfiguration, True),
156 'InfrastructureConfigurationArn': (basestring, True),
157 'Tags': (json_checker, False),
158 }
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/imagebuilder.py b/troposphere/imagebuilder.py
--- a/troposphere/imagebuilder.py
+++ b/troposphere/imagebuilder.py
@@ -4,8 +4,8 @@
# See LICENSE file for full license.
from . import AWSObject, AWSProperty
-from .validators import (integer, boolean, json_checker,
- component_platforms, imagepipeline_status,
+from .validators import (integer, boolean, component_platforms,
+ imagepipeline_status,
schedule_pipelineexecutionstartcondition,
ebsinstanceblockdevicespecification_volume_type)
@@ -36,7 +36,7 @@
'SecurityGroupIds': ([basestring], False),
'SnsTopicArn': (basestring, False),
'SubnetId': (basestring, False),
- 'Tags': (json_checker, False),
+ 'Tags': (dict, False),
'TerminateInstanceOnFailure': (boolean, False)
}
@@ -57,7 +57,7 @@
props = {
'DeviceName': (basestring, False),
'Ebs': (EbsInstanceBlockDeviceSpecification, False),
- 'NoDevice': (boolean, False),
+ 'NoDevice': (basestring, False),
'VirtualName': (basestring, False),
}
@@ -77,7 +77,7 @@
'Description': (basestring, False),
'Name': (basestring, True),
'ParentImage': (basestring, True),
- 'Tags': (json_checker, False),
+ 'Tags': (dict, False),
'Version': (basestring, True)
}
@@ -108,13 +108,13 @@
'Name': (basestring, True),
'Schedule': (Schedule, False),
'Status': (imagepipeline_status, False),
- 'Tags': (json_checker, False),
+ 'Tags': (dict, False),
}
class Distribution(AWSProperty):
props = {
- 'AmiDistributionConfiguration': (json_checker, False),
+ 'AmiDistributionConfiguration': (dict, False),
'LicenseConfigurationArns': ([basestring], False),
'Region': (basestring, False),
}
@@ -127,7 +127,7 @@
'Description': (basestring, False),
'Distributions': ([Distribution], True),
'Name': (basestring, True),
- 'Tags': (json_checker, False),
+ 'Tags': (dict, False),
}
@@ -136,11 +136,12 @@
props = {
'ChangeDescription': (basestring, False),
+ 'Data': (basestring, False),
'Description': (basestring, False),
'KmsKeyId': (basestring, False),
'Name': (basestring, True),
'Platform': (component_platforms, True),
- 'Tags': (json_checker, False),
+ 'Tags': (dict, False),
'Uri': (basestring, False),
'Version': (basestring, True),
}
@@ -154,5 +155,5 @@
'ImageRecipeArn': (basestring, True),
'ImageTestsConfiguration': (ImageTestsConfiguration, True),
'InfrastructureConfigurationArn': (basestring, True),
- 'Tags': (json_checker, False),
+ 'Tags': (dict, False),
}
| {"golden_diff": "diff --git a/troposphere/imagebuilder.py b/troposphere/imagebuilder.py\n--- a/troposphere/imagebuilder.py\n+++ b/troposphere/imagebuilder.py\n@@ -4,8 +4,8 @@\n # See LICENSE file for full license.\n \n from . import AWSObject, AWSProperty\n-from .validators import (integer, boolean, json_checker,\n- component_platforms, imagepipeline_status,\n+from .validators import (integer, boolean, component_platforms,\n+ imagepipeline_status,\n schedule_pipelineexecutionstartcondition,\n ebsinstanceblockdevicespecification_volume_type)\n \n@@ -36,7 +36,7 @@\n 'SecurityGroupIds': ([basestring], False),\n 'SnsTopicArn': (basestring, False),\n 'SubnetId': (basestring, False),\n- 'Tags': (json_checker, False),\n+ 'Tags': (dict, False),\n 'TerminateInstanceOnFailure': (boolean, False)\n }\n \n@@ -57,7 +57,7 @@\n props = {\n 'DeviceName': (basestring, False),\n 'Ebs': (EbsInstanceBlockDeviceSpecification, False),\n- 'NoDevice': (boolean, False),\n+ 'NoDevice': (basestring, False),\n 'VirtualName': (basestring, False),\n }\n \n@@ -77,7 +77,7 @@\n 'Description': (basestring, False),\n 'Name': (basestring, True),\n 'ParentImage': (basestring, True),\n- 'Tags': (json_checker, False),\n+ 'Tags': (dict, False),\n 'Version': (basestring, True)\n }\n \n@@ -108,13 +108,13 @@\n 'Name': (basestring, True),\n 'Schedule': (Schedule, False),\n 'Status': (imagepipeline_status, False),\n- 'Tags': (json_checker, False),\n+ 'Tags': (dict, False),\n }\n \n \n class Distribution(AWSProperty):\n props = {\n- 'AmiDistributionConfiguration': (json_checker, False),\n+ 'AmiDistributionConfiguration': (dict, False),\n 'LicenseConfigurationArns': ([basestring], False),\n 'Region': (basestring, False),\n }\n@@ -127,7 +127,7 @@\n 'Description': (basestring, False),\n 'Distributions': ([Distribution], True),\n 'Name': (basestring, True),\n- 'Tags': (json_checker, False),\n+ 'Tags': (dict, False),\n }\n \n \n@@ -136,11 +136,12 @@\n \n props = {\n 'ChangeDescription': (basestring, False),\n+ 'Data': (basestring, False),\n 'Description': (basestring, False),\n 'KmsKeyId': (basestring, False),\n 'Name': (basestring, True),\n 'Platform': (component_platforms, True),\n- 'Tags': (json_checker, False),\n+ 'Tags': (dict, False),\n 'Uri': (basestring, False),\n 'Version': (basestring, True),\n }\n@@ -154,5 +155,5 @@\n 'ImageRecipeArn': (basestring, True),\n 'ImageTestsConfiguration': (ImageTestsConfiguration, True),\n 'InfrastructureConfigurationArn': (basestring, True),\n- 'Tags': (json_checker, False),\n+ 'Tags': (dict, False),\n }\n", "issue": "AWS::ImageBuilder::* some wrong data types and attribut missing\nIn imagebuilder.py (2.6.1 release).\r\n\r\n* In AWS::ImageBuilder::*\r\n\"Tags\" are \"json_checker\" but should be dict\r\nWhen encoded in the structure it becomes a string (validator.py, ligne 258,` json.dumps(prop)`) which is creating an issue with CloudFormation that expect a struct like : `\"Tags\" : {Key : Value, ...}`\r\n* AWS::ImageBuilder::DistributionConfiguration::Distribution\r\n\"AmiDistributionConfiguration\" is \"json_checker\" but should be dict.\r\nFor the same as above \"Tags\" \r\n* In AWS::ImageBuilder::Component\r\n\"Data\" is missing. And should be \"json_checker\" in that case.\r\n* In AWS::ImageBuilder::ImageRecipe::InstanceBlockDeviceMapping\r\n\"NoDevice\" is boolean but should be a string\r\n\n", "before_files": [{"content": "# Copyright (c) 2020, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import (integer, boolean, json_checker,\n component_platforms, imagepipeline_status,\n schedule_pipelineexecutionstartcondition,\n ebsinstanceblockdevicespecification_volume_type)\n\n\nclass S3Logs(AWSProperty):\n props = {\n \"S3BucketName\": (basestring, False),\n \"S3KeyPrefix\": (basestring, False),\n }\n\n\nclass Logging(AWSProperty):\n props = {\n 'S3Logs': (S3Logs, False),\n }\n\n\nclass InfrastructureConfiguration(AWSObject):\n resource_type = \"AWS::ImageBuilder::InfrastructureConfiguration\"\n\n props = {\n 'Description': (basestring, False),\n 'InstanceProfileName': (basestring, True),\n 'InstanceTypes': ([basestring], False),\n 'KeyPair': (basestring, False),\n 'Logging': (Logging, False),\n 'Name': (basestring, True),\n 'SecurityGroupIds': ([basestring], False),\n 'SnsTopicArn': (basestring, False),\n 'SubnetId': (basestring, False),\n 'Tags': (json_checker, False),\n 'TerminateInstanceOnFailure': (boolean, False)\n }\n\n\nclass EbsInstanceBlockDeviceSpecification(AWSProperty):\n props = {\n 'DeleteOnTermination': (boolean, False),\n 'Encrypted': (boolean, False),\n 'Iops': (integer, False),\n 'KmsKeyId': (basestring, False),\n 'SnapshotId': (basestring, False),\n 'VolumeSize': (integer, False),\n 'VolumeType': (ebsinstanceblockdevicespecification_volume_type, False),\n }\n\n\nclass InstanceBlockDeviceMapping(AWSProperty):\n props = {\n 'DeviceName': (basestring, False),\n 'Ebs': (EbsInstanceBlockDeviceSpecification, False),\n 'NoDevice': (boolean, False),\n 'VirtualName': (basestring, False),\n }\n\n\nclass ComponentConfiguration(AWSProperty):\n props = {\n 'ComponentArn': (basestring, False),\n }\n\n\nclass ImageRecipe(AWSObject):\n resource_type = \"AWS::ImageBuilder::ImageRecipe\"\n\n props = {\n 'BlockDeviceMappings': ([InstanceBlockDeviceMapping], False),\n 'Components': ([ComponentConfiguration], True),\n 'Description': (basestring, False),\n 'Name': (basestring, True),\n 'ParentImage': (basestring, True),\n 'Tags': (json_checker, False),\n 'Version': (basestring, True)\n }\n\n\nclass ImageTestsConfiguration(AWSProperty):\n props = {\n 'ImageTestsEnabled': (boolean, False),\n 'TimeoutMinutes': (integer, False),\n }\n\n\nclass Schedule(AWSProperty):\n props = {\n 'PipelineExecutionStartCondition': (schedule_pipelineexecutionstartcondition, False), # NOQA\n 'ScheduleExpression': (basestring, False),\n }\n\n\nclass ImagePipeline(AWSObject):\n resource_type = \"AWS::ImageBuilder::ImagePipeline\"\n\n props = {\n 'Description': (basestring, False),\n 'DistributionConfigurationArn': (basestring, False),\n 'ImageRecipeArn': (basestring, True),\n 'ImageTestsConfiguration': (ImageTestsConfiguration, False),\n 'InfrastructureConfigurationArn': (basestring, True),\n 'Name': (basestring, True),\n 'Schedule': (Schedule, False),\n 'Status': (imagepipeline_status, False),\n 'Tags': (json_checker, False),\n }\n\n\nclass Distribution(AWSProperty):\n props = {\n 'AmiDistributionConfiguration': (json_checker, False),\n 'LicenseConfigurationArns': ([basestring], False),\n 'Region': (basestring, False),\n }\n\n\nclass DistributionConfiguration(AWSObject):\n resource_type = \"AWS::ImageBuilder::DistributionConfiguration\"\n\n props = {\n 'Description': (basestring, False),\n 'Distributions': ([Distribution], True),\n 'Name': (basestring, True),\n 'Tags': (json_checker, False),\n }\n\n\nclass Component(AWSObject):\n resource_type = \"AWS::ImageBuilder::Component\"\n\n props = {\n 'ChangeDescription': (basestring, False),\n 'Description': (basestring, False),\n 'KmsKeyId': (basestring, False),\n 'Name': (basestring, True),\n 'Platform': (component_platforms, True),\n 'Tags': (json_checker, False),\n 'Uri': (basestring, False),\n 'Version': (basestring, True),\n }\n\n\nclass Image(AWSObject):\n resource_type = \"AWS::ImageBuilder::Image\"\n\n props = {\n 'DistributionConfigurationArn': (basestring, False),\n 'ImageRecipeArn': (basestring, True),\n 'ImageTestsConfiguration': (ImageTestsConfiguration, True),\n 'InfrastructureConfigurationArn': (basestring, True),\n 'Tags': (json_checker, False),\n }\n", "path": "troposphere/imagebuilder.py"}], "after_files": [{"content": "# Copyright (c) 2020, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import (integer, boolean, component_platforms,\n imagepipeline_status,\n schedule_pipelineexecutionstartcondition,\n ebsinstanceblockdevicespecification_volume_type)\n\n\nclass S3Logs(AWSProperty):\n props = {\n \"S3BucketName\": (basestring, False),\n \"S3KeyPrefix\": (basestring, False),\n }\n\n\nclass Logging(AWSProperty):\n props = {\n 'S3Logs': (S3Logs, False),\n }\n\n\nclass InfrastructureConfiguration(AWSObject):\n resource_type = \"AWS::ImageBuilder::InfrastructureConfiguration\"\n\n props = {\n 'Description': (basestring, False),\n 'InstanceProfileName': (basestring, True),\n 'InstanceTypes': ([basestring], False),\n 'KeyPair': (basestring, False),\n 'Logging': (Logging, False),\n 'Name': (basestring, True),\n 'SecurityGroupIds': ([basestring], False),\n 'SnsTopicArn': (basestring, False),\n 'SubnetId': (basestring, False),\n 'Tags': (dict, False),\n 'TerminateInstanceOnFailure': (boolean, False)\n }\n\n\nclass EbsInstanceBlockDeviceSpecification(AWSProperty):\n props = {\n 'DeleteOnTermination': (boolean, False),\n 'Encrypted': (boolean, False),\n 'Iops': (integer, False),\n 'KmsKeyId': (basestring, False),\n 'SnapshotId': (basestring, False),\n 'VolumeSize': (integer, False),\n 'VolumeType': (ebsinstanceblockdevicespecification_volume_type, False),\n }\n\n\nclass InstanceBlockDeviceMapping(AWSProperty):\n props = {\n 'DeviceName': (basestring, False),\n 'Ebs': (EbsInstanceBlockDeviceSpecification, False),\n 'NoDevice': (basestring, False),\n 'VirtualName': (basestring, False),\n }\n\n\nclass ComponentConfiguration(AWSProperty):\n props = {\n 'ComponentArn': (basestring, False),\n }\n\n\nclass ImageRecipe(AWSObject):\n resource_type = \"AWS::ImageBuilder::ImageRecipe\"\n\n props = {\n 'BlockDeviceMappings': ([InstanceBlockDeviceMapping], False),\n 'Components': ([ComponentConfiguration], True),\n 'Description': (basestring, False),\n 'Name': (basestring, True),\n 'ParentImage': (basestring, True),\n 'Tags': (dict, False),\n 'Version': (basestring, True)\n }\n\n\nclass ImageTestsConfiguration(AWSProperty):\n props = {\n 'ImageTestsEnabled': (boolean, False),\n 'TimeoutMinutes': (integer, False),\n }\n\n\nclass Schedule(AWSProperty):\n props = {\n 'PipelineExecutionStartCondition': (schedule_pipelineexecutionstartcondition, False), # NOQA\n 'ScheduleExpression': (basestring, False),\n }\n\n\nclass ImagePipeline(AWSObject):\n resource_type = \"AWS::ImageBuilder::ImagePipeline\"\n\n props = {\n 'Description': (basestring, False),\n 'DistributionConfigurationArn': (basestring, False),\n 'ImageRecipeArn': (basestring, True),\n 'ImageTestsConfiguration': (ImageTestsConfiguration, False),\n 'InfrastructureConfigurationArn': (basestring, True),\n 'Name': (basestring, True),\n 'Schedule': (Schedule, False),\n 'Status': (imagepipeline_status, False),\n 'Tags': (dict, False),\n }\n\n\nclass Distribution(AWSProperty):\n props = {\n 'AmiDistributionConfiguration': (dict, False),\n 'LicenseConfigurationArns': ([basestring], False),\n 'Region': (basestring, False),\n }\n\n\nclass DistributionConfiguration(AWSObject):\n resource_type = \"AWS::ImageBuilder::DistributionConfiguration\"\n\n props = {\n 'Description': (basestring, False),\n 'Distributions': ([Distribution], True),\n 'Name': (basestring, True),\n 'Tags': (dict, False),\n }\n\n\nclass Component(AWSObject):\n resource_type = \"AWS::ImageBuilder::Component\"\n\n props = {\n 'ChangeDescription': (basestring, False),\n 'Data': (basestring, False),\n 'Description': (basestring, False),\n 'KmsKeyId': (basestring, False),\n 'Name': (basestring, True),\n 'Platform': (component_platforms, True),\n 'Tags': (dict, False),\n 'Uri': (basestring, False),\n 'Version': (basestring, True),\n }\n\n\nclass Image(AWSObject):\n resource_type = \"AWS::ImageBuilder::Image\"\n\n props = {\n 'DistributionConfigurationArn': (basestring, False),\n 'ImageRecipeArn': (basestring, True),\n 'ImageTestsConfiguration': (ImageTestsConfiguration, True),\n 'InfrastructureConfigurationArn': (basestring, True),\n 'Tags': (dict, False),\n }\n", "path": "troposphere/imagebuilder.py"}]} | 1,942 | 769
gh_patches_debug_14052 | rasdani/github-patches | git_diff | openai__gym-1149 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Why is there a restriction on shape in multi discrete?
https://github.com/openai/gym/blob/422c9c7bb3c3c5a756c3b12dfe99733bfbfe3920/gym/spaces/multi_discrete.py#L10
Why is this imposed?
Say one may need a high-dimensional space, like a (3,3,3) grid with discretised values for each element in the grid.
The restriction can very easily be circumvented by using np.random.random_sample and passing the shape (a sketch follows below).
Is there some specific reason for doing this?
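A minimal sketch of that circumvention, assuming a made-up (3,3,3) grid with 4 discrete values per cell; this is only an illustration, not gym's implementation:
```python
# Illustrative only: shows that the sampling expression already generalises
# to an n-dimensional nvec once the 1-d assert is dropped.
import numpy as np

nvec = np.full((3, 3, 3), 4, dtype=np.int32)  # assumed: 4 choices per grid cell
sample = (np.random.random_sample(nvec.shape) * nvec).astype(np.int32)

assert sample.shape == (3, 3, 3)
assert ((0 <= sample) & (sample < nvec)).all()
```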
Also note that the example provided in dict_space doesn't work currently.
https://github.com/openai/gym/blob/422c9c7bb3c3c5a756c3b12dfe99733bfbfe3920/gym/spaces/dict_space.py#L22
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gym/spaces/multi_discrete.py`
Content:
```
1 import gym
2 import numpy as np
3
4 class MultiDiscrete(gym.Space):
5 def __init__(self, nvec):
6 """
7 nvec: vector of counts of each categorical variable
8 """
9 self.nvec = np.asarray(nvec, dtype=np.int32)
10 assert self.nvec.ndim == 1, 'nvec should be a 1d array (or list) of ints'
11 gym.Space.__init__(self, (self.nvec.size,), np.int8)
12 def sample(self):
13 return (gym.spaces.np_random.rand(self.nvec.size) * self.nvec).astype(self.dtype)
14 def contains(self, x):
15 return (0 <= x).all() and (x < self.nvec).all() and x.dtype.kind in 'ui'
16
17 __contains__ = contains
18
19 def to_jsonable(self, sample_n):
20 return [sample.tolist() for sample in sample_n]
21 def from_jsonable(self, sample_n):
22 return np.array(sample_n)
23
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gym/spaces/multi_discrete.py b/gym/spaces/multi_discrete.py
--- a/gym/spaces/multi_discrete.py
+++ b/gym/spaces/multi_discrete.py
@@ -7,10 +7,9 @@
nvec: vector of counts of each categorical variable
"""
self.nvec = np.asarray(nvec, dtype=np.int32)
- assert self.nvec.ndim == 1, 'nvec should be a 1d array (or list) of ints'
- gym.Space.__init__(self, (self.nvec.size,), np.int8)
+ gym.Space.__init__(self, (self.nvec.shape,), np.int8)
def sample(self):
- return (gym.spaces.np_random.rand(self.nvec.size) * self.nvec).astype(self.dtype)
+ return (gym.spaces.np_random.random_sample(self.nvec.shape) * self.nvec).astype(self.dtype)
def contains(self, x):
return (0 <= x).all() and (x < self.nvec).all() and x.dtype.kind in 'ui'
| {"golden_diff": "diff --git a/gym/spaces/multi_discrete.py b/gym/spaces/multi_discrete.py\n--- a/gym/spaces/multi_discrete.py\n+++ b/gym/spaces/multi_discrete.py\n@@ -7,10 +7,9 @@\n nvec: vector of counts of each categorical variable\n \"\"\"\n self.nvec = np.asarray(nvec, dtype=np.int32)\n- assert self.nvec.ndim == 1, 'nvec should be a 1d array (or list) of ints'\n- gym.Space.__init__(self, (self.nvec.size,), np.int8)\n+ gym.Space.__init__(self, (self.nvec.shape,), np.int8)\n def sample(self):\n- return (gym.spaces.np_random.rand(self.nvec.size) * self.nvec).astype(self.dtype)\n+ return (gym.spaces.np_random.random_sample(self.nvec.shape) * self.nvec).astype(self.dtype)\n def contains(self, x):\n return (0 <= x).all() and (x < self.nvec).all() and x.dtype.kind in 'ui'\n", "issue": "Why is there a restiction on shape in multi discrete?\nhttps://github.com/openai/gym/blob/422c9c7bb3c3c5a756c3b12dfe99733bfbfe3920/gym/spaces/multi_discrete.py#L10\r\n\r\nWhy is this imposed?\r\nSay one may need a high dimensional space like a (3,3,3) grid with discretised values for each element in the grid. \r\nIt can be very easily circumvented by using np.random.random_sample and passing the shape\r\nIs there some specific reason for doing this?\r\n\r\nAlso note this example provided in dict_space doesn't work currently.\r\nhttps://github.com/openai/gym/blob/422c9c7bb3c3c5a756c3b12dfe99733bfbfe3920/gym/spaces/dict_space.py#L22\r\n\r\n\n", "before_files": [{"content": "import gym\nimport numpy as np\n\nclass MultiDiscrete(gym.Space):\n def __init__(self, nvec):\n \"\"\"\n nvec: vector of counts of each categorical variable\n \"\"\"\n self.nvec = np.asarray(nvec, dtype=np.int32)\n assert self.nvec.ndim == 1, 'nvec should be a 1d array (or list) of ints'\n gym.Space.__init__(self, (self.nvec.size,), np.int8)\n def sample(self):\n return (gym.spaces.np_random.rand(self.nvec.size) * self.nvec).astype(self.dtype)\n def contains(self, x):\n return (0 <= x).all() and (x < self.nvec).all() and x.dtype.kind in 'ui'\n \n __contains__ = contains\n \n def to_jsonable(self, sample_n):\n return [sample.tolist() for sample in sample_n]\n def from_jsonable(self, sample_n):\n return np.array(sample_n)\n", "path": "gym/spaces/multi_discrete.py"}], "after_files": [{"content": "import gym\nimport numpy as np\n\nclass MultiDiscrete(gym.Space):\n def __init__(self, nvec):\n \"\"\"\n nvec: vector of counts of each categorical variable\n \"\"\"\n self.nvec = np.asarray(nvec, dtype=np.int32)\n gym.Space.__init__(self, (self.nvec.shape,), np.int8)\n def sample(self):\n return (gym.spaces.np_random.random_sample(self.nvec.shape) * self.nvec).astype(self.dtype)\n def contains(self, x):\n return (0 <= x).all() and (x < self.nvec).all() and x.dtype.kind in 'ui'\n \n __contains__ = contains\n \n def to_jsonable(self, sample_n):\n return [sample.tolist() for sample in sample_n]\n def from_jsonable(self, sample_n):\n return np.array(sample_n)\n", "path": "gym/spaces/multi_discrete.py"}]} | 725 | 247 |
gh_patches_debug_29594 | rasdani/github-patches | git_diff | fossasia__open-event-server-6739 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove version model
**Describe the bug**
The version model is not used currently and should be removed
https://github.com/fossasia/open-event-server/blob/development/app/models/version.py
**Additional context**
@iamareebjamal Taking this
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/models/version.py`
Content:
```
1 from sqlalchemy.orm import backref
2
3 from app.models import db
4
5
6 class Version(db.Model):
7 """Version model class"""
8 __tablename__ = 'versions'
9 id = db.Column(db.Integer, primary_key=True)
10 event_id = db.Column(db.Integer, db.ForeignKey('events.id', ondelete='CASCADE'))
11 events = db.relationship("Event", backref=backref('version', uselist=False))
12
13 event_ver = db.Column(db.Integer, nullable=False, default=0)
14 sessions_ver = db.Column(db.Integer, nullable=False, default=0)
15 speakers_ver = db.Column(db.Integer, nullable=False, default=0)
16 tracks_ver = db.Column(db.Integer, nullable=False, default=0)
17 sponsors_ver = db.Column(db.Integer, nullable=False, default=0)
18 microlocations_ver = db.Column(db.Integer, nullable=False, default=0)
19
20 def __init__(self,
21 event_id=None,
22 event_ver=None,
23 sessions_ver=None,
24 speakers_ver=None,
25 tracks_ver=None,
26 sponsors_ver=None,
27 microlocations_ver=None):
28 self.event_id = event_id
29 self.event_ver = event_ver
30 self.sessions_ver = sessions_ver
31 self.speakers_ver = speakers_ver
32 self.tracks_ver = tracks_ver
33 self.sponsors_ver = sponsors_ver
34 self.microlocations_ver = microlocations_ver
35
36 def __repr__(self):
37 return '<Version %r>' % self.id
38
39 def __str__(self):
40 return self.__repr__()
41
42 @property
43 def serialize(self):
44 """Return object data in easily serializable format"""
45 return {
46 'version': [
47 {'id': self.id,
48 'event_id': self.event_id,
49 'event_ver': self.event_ver,
50 'sessions_ver': self.sessions_ver,
51 'speakers_ver': self.speakers_ver,
52 'tracks_ver': self.tracks_ver,
53 'sponsors_ver': self.sponsors_ver,
54 'microlocations_ver': self.microlocations_ver}
55 ]
56 }
57
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/app/models/version.py b/app/models/version.py
deleted file mode 100644
--- a/app/models/version.py
+++ /dev/null
@@ -1,56 +0,0 @@
-from sqlalchemy.orm import backref
-
-from app.models import db
-
-
-class Version(db.Model):
- """Version model class"""
- __tablename__ = 'versions'
- id = db.Column(db.Integer, primary_key=True)
- event_id = db.Column(db.Integer, db.ForeignKey('events.id', ondelete='CASCADE'))
- events = db.relationship("Event", backref=backref('version', uselist=False))
-
- event_ver = db.Column(db.Integer, nullable=False, default=0)
- sessions_ver = db.Column(db.Integer, nullable=False, default=0)
- speakers_ver = db.Column(db.Integer, nullable=False, default=0)
- tracks_ver = db.Column(db.Integer, nullable=False, default=0)
- sponsors_ver = db.Column(db.Integer, nullable=False, default=0)
- microlocations_ver = db.Column(db.Integer, nullable=False, default=0)
-
- def __init__(self,
- event_id=None,
- event_ver=None,
- sessions_ver=None,
- speakers_ver=None,
- tracks_ver=None,
- sponsors_ver=None,
- microlocations_ver=None):
- self.event_id = event_id
- self.event_ver = event_ver
- self.sessions_ver = sessions_ver
- self.speakers_ver = speakers_ver
- self.tracks_ver = tracks_ver
- self.sponsors_ver = sponsors_ver
- self.microlocations_ver = microlocations_ver
-
- def __repr__(self):
- return '<Version %r>' % self.id
-
- def __str__(self):
- return self.__repr__()
-
- @property
- def serialize(self):
- """Return object data in easily serializable format"""
- return {
- 'version': [
- {'id': self.id,
- 'event_id': self.event_id,
- 'event_ver': self.event_ver,
- 'sessions_ver': self.sessions_ver,
- 'speakers_ver': self.speakers_ver,
- 'tracks_ver': self.tracks_ver,
- 'sponsors_ver': self.sponsors_ver,
- 'microlocations_ver': self.microlocations_ver}
- ]
- }
| {"golden_diff": "diff --git a/app/models/version.py b/app/models/version.py\ndeleted file mode 100644\n--- a/app/models/version.py\n+++ /dev/null\n@@ -1,56 +0,0 @@\n-from sqlalchemy.orm import backref\n-\n-from app.models import db\n-\n-\n-class Version(db.Model):\n- \"\"\"Version model class\"\"\"\n- __tablename__ = 'versions'\n- id = db.Column(db.Integer, primary_key=True)\n- event_id = db.Column(db.Integer, db.ForeignKey('events.id', ondelete='CASCADE'))\n- events = db.relationship(\"Event\", backref=backref('version', uselist=False))\n-\n- event_ver = db.Column(db.Integer, nullable=False, default=0)\n- sessions_ver = db.Column(db.Integer, nullable=False, default=0)\n- speakers_ver = db.Column(db.Integer, nullable=False, default=0)\n- tracks_ver = db.Column(db.Integer, nullable=False, default=0)\n- sponsors_ver = db.Column(db.Integer, nullable=False, default=0)\n- microlocations_ver = db.Column(db.Integer, nullable=False, default=0)\n-\n- def __init__(self,\n- event_id=None,\n- event_ver=None,\n- sessions_ver=None,\n- speakers_ver=None,\n- tracks_ver=None,\n- sponsors_ver=None,\n- microlocations_ver=None):\n- self.event_id = event_id\n- self.event_ver = event_ver\n- self.sessions_ver = sessions_ver\n- self.speakers_ver = speakers_ver\n- self.tracks_ver = tracks_ver\n- self.sponsors_ver = sponsors_ver\n- self.microlocations_ver = microlocations_ver\n-\n- def __repr__(self):\n- return '<Version %r>' % self.id\n-\n- def __str__(self):\n- return self.__repr__()\n-\n- @property\n- def serialize(self):\n- \"\"\"Return object data in easily serializable format\"\"\"\n- return {\n- 'version': [\n- {'id': self.id,\n- 'event_id': self.event_id,\n- 'event_ver': self.event_ver,\n- 'sessions_ver': self.sessions_ver,\n- 'speakers_ver': self.speakers_ver,\n- 'tracks_ver': self.tracks_ver,\n- 'sponsors_ver': self.sponsors_ver,\n- 'microlocations_ver': self.microlocations_ver}\n- ]\n- }\n", "issue": "Remove version model\n**Describe the bug**\r\nThe version model is not used currently and should be removed\r\n\r\nhttps://github.com/fossasia/open-event-server/blob/development/app/models/version.py\r\n\r\n\r\n**Additional context**\r\n@iamareebjamal Taking this\r\n\n", "before_files": [{"content": "from sqlalchemy.orm import backref\n\nfrom app.models import db\n\n\nclass Version(db.Model):\n \"\"\"Version model class\"\"\"\n __tablename__ = 'versions'\n id = db.Column(db.Integer, primary_key=True)\n event_id = db.Column(db.Integer, db.ForeignKey('events.id', ondelete='CASCADE'))\n events = db.relationship(\"Event\", backref=backref('version', uselist=False))\n\n event_ver = db.Column(db.Integer, nullable=False, default=0)\n sessions_ver = db.Column(db.Integer, nullable=False, default=0)\n speakers_ver = db.Column(db.Integer, nullable=False, default=0)\n tracks_ver = db.Column(db.Integer, nullable=False, default=0)\n sponsors_ver = db.Column(db.Integer, nullable=False, default=0)\n microlocations_ver = db.Column(db.Integer, nullable=False, default=0)\n\n def __init__(self,\n event_id=None,\n event_ver=None,\n sessions_ver=None,\n speakers_ver=None,\n tracks_ver=None,\n sponsors_ver=None,\n microlocations_ver=None):\n self.event_id = event_id\n self.event_ver = event_ver\n self.sessions_ver = sessions_ver\n self.speakers_ver = speakers_ver\n self.tracks_ver = tracks_ver\n self.sponsors_ver = sponsors_ver\n self.microlocations_ver = microlocations_ver\n\n def __repr__(self):\n return '<Version %r>' % self.id\n\n def __str__(self):\n return self.__repr__()\n\n @property\n def serialize(self):\n 
\"\"\"Return object data in easily serializable format\"\"\"\n return {\n 'version': [\n {'id': self.id,\n 'event_id': self.event_id,\n 'event_ver': self.event_ver,\n 'sessions_ver': self.sessions_ver,\n 'speakers_ver': self.speakers_ver,\n 'tracks_ver': self.tracks_ver,\n 'sponsors_ver': self.sponsors_ver,\n 'microlocations_ver': self.microlocations_ver}\n ]\n }\n", "path": "app/models/version.py"}], "after_files": [{"content": null, "path": "app/models/version.py"}]} | 848 | 525 |
gh_patches_debug_33990 | rasdani/github-patches | git_diff | netbox-community__netbox-16049 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error on cable edit after B port was deleted / wrong status
### Deployment Type
Self-hosted (Docker)
### NetBox Version
v3.7.5
### Python Version
3.11
### Steps to Reproduce
1. Have a device with front and rear ports (patch panel)
2. Have a device with interfaces (switch)
3. Connect a switch interface (here gi43) to a front port (here 22)
4. Delete the rear ports on the patch panel device
5. Go to Connections > Cables
6. Click on edit of the cable --> error message
### Expected Behavior
- Edit Button works to connect cable again
### Observed Behavior
- Error Message

Cables/UI: Unable to change from front to rear while editing cable
### NetBox version
v3.5.6
### Feature type
Change to existing functionality
### Proposed functionality
Currently, if someone accidentally connects a cable to the rear port instead of the front port, the entire cable must be deleted and created again. It would be nice to be able to change not only the port number but also the location (front/rear) when editing the cable. This might just be a UI change, as the API already seems to allow changing it.
### Use case
It is not necessary to delete the cable and enter all information (label, length etc) again. You can just reconnect it.
### Database changes
_No response_
### External dependencies
_No response_
--- END ISSUE ---
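The failure comes down to the cable edit form seeding its initial terminations even when they no longer match the termination type the form was built for (here, the rear ports were deleted). A minimal sketch of the guard idea follows; the helper name and argument shapes are illustrative, not NetBox code.

```python
from django.contrib.contenttypes.models import ContentType


def seed_initial_terminations(form, end, term_type):
    """end is 'a' or 'b'; term_type is the model class the form expects."""
    terminations = getattr(form.instance, '%s_terminations' % end, [])
    if not terminations:
        return  # e.g. the rear ports were deleted; nothing valid to seed
    expected = ContentType.objects.get_for_model(term_type)
    actual = ContentType.objects.get_for_model(terminations[0])
    if expected == actual:
        # only seed initials when the stored terminations still match
        form.initial['%s_terminations' % end] = terminations
```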
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `netbox/dcim/forms/connections.py`
Content:
```
1 from django import forms
2 from django.contrib.contenttypes.models import ContentType
3 from django.utils.translation import gettext_lazy as _
4
5 from circuits.models import Circuit, CircuitTermination
6 from dcim.models import *
7 from utilities.forms.fields import DynamicModelChoiceField, DynamicModelMultipleChoiceField
8 from .model_forms import CableForm
9
10
11 def get_cable_form(a_type, b_type):
12
13 class FormMetaclass(forms.models.ModelFormMetaclass):
14
15 def __new__(mcs, name, bases, attrs):
16
17 for cable_end, term_cls in (('a', a_type), ('b', b_type)):
18
19 # Device component
20 if hasattr(term_cls, 'device'):
21
22 attrs[f'termination_{cable_end}_device'] = DynamicModelChoiceField(
23 queryset=Device.objects.all(),
24 label=_('Device'),
25 required=False,
26 selector=True,
27 initial_params={
28 f'{term_cls._meta.model_name}s__in': f'${cable_end}_terminations'
29 }
30 )
31 attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(
32 queryset=term_cls.objects.all(),
33 label=term_cls._meta.verbose_name.title(),
34 context={
35 'disabled': '_occupied',
36 },
37 query_params={
38 'device_id': f'$termination_{cable_end}_device',
39 'kind': 'physical', # Exclude virtual interfaces
40 }
41 )
42
43 # PowerFeed
44 elif term_cls == PowerFeed:
45
46 attrs[f'termination_{cable_end}_powerpanel'] = DynamicModelChoiceField(
47 queryset=PowerPanel.objects.all(),
48 label=_('Power Panel'),
49 required=False,
50 selector=True,
51 initial_params={
52 'powerfeeds__in': f'${cable_end}_terminations'
53 }
54 )
55 attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(
56 queryset=term_cls.objects.all(),
57 label=_('Power Feed'),
58 context={
59 'disabled': '_occupied',
60 },
61 query_params={
62 'power_panel_id': f'$termination_{cable_end}_powerpanel',
63 }
64 )
65
66 # CircuitTermination
67 elif term_cls == CircuitTermination:
68
69 attrs[f'termination_{cable_end}_circuit'] = DynamicModelChoiceField(
70 queryset=Circuit.objects.all(),
71 label=_('Circuit'),
72 selector=True,
73 initial_params={
74 'terminations__in': f'${cable_end}_terminations'
75 }
76 )
77 attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(
78 queryset=term_cls.objects.all(),
79 label=_('Side'),
80 context={
81 'disabled': '_occupied',
82 },
83 query_params={
84 'circuit_id': f'$termination_{cable_end}_circuit',
85 }
86 )
87
88 return super().__new__(mcs, name, bases, attrs)
89
90 class _CableForm(CableForm, metaclass=FormMetaclass):
91
92 def __init__(self, *args, initial=None, **kwargs):
93
94 initial = initial or {}
95 if a_type:
96 ct = ContentType.objects.get_for_model(a_type)
97 initial['a_terminations_type'] = f'{ct.app_label}.{ct.model}'
98 if b_type:
99 ct = ContentType.objects.get_for_model(b_type)
100 initial['b_terminations_type'] = f'{ct.app_label}.{ct.model}'
101
102 # TODO: Temporary hack to work around list handling limitations with utils.normalize_querydict()
103 for field_name in ('a_terminations', 'b_terminations'):
104 if field_name in initial and type(initial[field_name]) is not list:
105 initial[field_name] = [initial[field_name]]
106
107 super().__init__(*args, initial=initial, **kwargs)
108
109 if self.instance and self.instance.pk:
110 # Initialize A/B terminations when modifying an existing Cable instance
111 self.initial['a_terminations'] = self.instance.a_terminations
112 self.initial['b_terminations'] = self.instance.b_terminations
113
114 def clean(self):
115 super().clean()
116
117 # Set the A/B terminations on the Cable instance
118 self.instance.a_terminations = self.cleaned_data.get('a_terminations', [])
119 self.instance.b_terminations = self.cleaned_data.get('b_terminations', [])
120
121 return _CableForm
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/netbox/dcim/forms/connections.py b/netbox/dcim/forms/connections.py
--- a/netbox/dcim/forms/connections.py
+++ b/netbox/dcim/forms/connections.py
@@ -90,14 +90,14 @@
class _CableForm(CableForm, metaclass=FormMetaclass):
def __init__(self, *args, initial=None, **kwargs):
-
initial = initial or {}
+
if a_type:
- ct = ContentType.objects.get_for_model(a_type)
- initial['a_terminations_type'] = f'{ct.app_label}.{ct.model}'
+ a_ct = ContentType.objects.get_for_model(a_type)
+ initial['a_terminations_type'] = f'{a_ct.app_label}.{a_ct.model}'
if b_type:
- ct = ContentType.objects.get_for_model(b_type)
- initial['b_terminations_type'] = f'{ct.app_label}.{ct.model}'
+ b_ct = ContentType.objects.get_for_model(b_type)
+ initial['b_terminations_type'] = f'{b_ct.app_label}.{b_ct.model}'
# TODO: Temporary hack to work around list handling limitations with utils.normalize_querydict()
for field_name in ('a_terminations', 'b_terminations'):
@@ -108,8 +108,17 @@
if self.instance and self.instance.pk:
# Initialize A/B terminations when modifying an existing Cable instance
- self.initial['a_terminations'] = self.instance.a_terminations
- self.initial['b_terminations'] = self.instance.b_terminations
+ if a_type and self.instance.a_terminations and a_ct == ContentType.objects.get_for_model(self.instance.a_terminations[0]):
+ self.initial['a_terminations'] = self.instance.a_terminations
+ if b_type and self.instance.b_terminations and b_ct == ContentType.objects.get_for_model(self.instance.b_terminations[0]):
+ self.initial['b_terminations'] = self.instance.b_terminations
+ else:
+ # Need to clear terminations if swapped type - but need to do it only
+ # if not from instance
+ if a_type:
+ initial.pop('a_terminations', None)
+ if b_type:
+ initial.pop('b_terminations', None)
def clean(self):
super().clean()
| {"golden_diff": "diff --git a/netbox/dcim/forms/connections.py b/netbox/dcim/forms/connections.py\n--- a/netbox/dcim/forms/connections.py\n+++ b/netbox/dcim/forms/connections.py\n@@ -90,14 +90,14 @@\n class _CableForm(CableForm, metaclass=FormMetaclass):\n \n def __init__(self, *args, initial=None, **kwargs):\n-\n initial = initial or {}\n+\n if a_type:\n- ct = ContentType.objects.get_for_model(a_type)\n- initial['a_terminations_type'] = f'{ct.app_label}.{ct.model}'\n+ a_ct = ContentType.objects.get_for_model(a_type)\n+ initial['a_terminations_type'] = f'{a_ct.app_label}.{a_ct.model}'\n if b_type:\n- ct = ContentType.objects.get_for_model(b_type)\n- initial['b_terminations_type'] = f'{ct.app_label}.{ct.model}'\n+ b_ct = ContentType.objects.get_for_model(b_type)\n+ initial['b_terminations_type'] = f'{b_ct.app_label}.{b_ct.model}'\n \n # TODO: Temporary hack to work around list handling limitations with utils.normalize_querydict()\n for field_name in ('a_terminations', 'b_terminations'):\n@@ -108,8 +108,17 @@\n \n if self.instance and self.instance.pk:\n # Initialize A/B terminations when modifying an existing Cable instance\n- self.initial['a_terminations'] = self.instance.a_terminations\n- self.initial['b_terminations'] = self.instance.b_terminations\n+ if a_type and self.instance.a_terminations and a_ct == ContentType.objects.get_for_model(self.instance.a_terminations[0]):\n+ self.initial['a_terminations'] = self.instance.a_terminations\n+ if b_type and self.instance.b_terminations and b_ct == ContentType.objects.get_for_model(self.instance.b_terminations[0]):\n+ self.initial['b_terminations'] = self.instance.b_terminations\n+ else:\n+ # Need to clear terminations if swapped type - but need to do it only\n+ # if not from instance\n+ if a_type:\n+ initial.pop('a_terminations', None)\n+ if b_type:\n+ initial.pop('b_terminations', None)\n \n def clean(self):\n super().clean()\n", "issue": "Error on cable edit after B port was deleted / wrong status\n### Deployment Type\r\n\r\nSelf-hosted (Docker)\r\n\r\n### NetBox Version\r\n\r\nv3.7.5\r\n\r\n### Python Version\r\n\r\n3.11\r\n\r\n### Steps to Reproduce\r\n\r\n1. Have a device with front- and rear-Ports (Patch Panel)\r\n2. Have a device with interfaces (switches)\r\n3. Connect switch interface (here gi43) with front-port (here 22)\r\n4. Delete rear ports on patch panel device\r\n5. Go To Connections > Cables\r\n6. Click on edit of cable --> error message\r\n\r\n### Expected Behavior\r\n\r\n- Edit Button works to connect cable again\r\n\r\n### Observed Behavior\r\n\r\n- Error Message \r\n\r\n\nCables/UI: Unable to change from front to rear while editing cable\n### NetBox version\n\nv3.5.6\n\n### Feature type\n\nChange to existing functionality\n\n### Proposed functionality\n\nCurrently, if someone accidentally connects a cable to the rear port instead of the front port, the entire cable must be deleted and created again. It would be nice to be able to change not only the port number but also the location (front/rear) when editing the cable. This might just be an ui change as the api seems to allow to change it.\n\n### Use case\n\nIt is not necessary to delete the cable and enter all information (label, length etc) again. 
You can just reconnect it.\n\n### Database changes\n\n_No response_\n\n### External dependencies\n\n_No response_\n", "before_files": [{"content": "from django import forms\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.utils.translation import gettext_lazy as _\n\nfrom circuits.models import Circuit, CircuitTermination\nfrom dcim.models import *\nfrom utilities.forms.fields import DynamicModelChoiceField, DynamicModelMultipleChoiceField\nfrom .model_forms import CableForm\n\n\ndef get_cable_form(a_type, b_type):\n\n class FormMetaclass(forms.models.ModelFormMetaclass):\n\n def __new__(mcs, name, bases, attrs):\n\n for cable_end, term_cls in (('a', a_type), ('b', b_type)):\n\n # Device component\n if hasattr(term_cls, 'device'):\n\n attrs[f'termination_{cable_end}_device'] = DynamicModelChoiceField(\n queryset=Device.objects.all(),\n label=_('Device'),\n required=False,\n selector=True,\n initial_params={\n f'{term_cls._meta.model_name}s__in': f'${cable_end}_terminations'\n }\n )\n attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(\n queryset=term_cls.objects.all(),\n label=term_cls._meta.verbose_name.title(),\n context={\n 'disabled': '_occupied',\n },\n query_params={\n 'device_id': f'$termination_{cable_end}_device',\n 'kind': 'physical', # Exclude virtual interfaces\n }\n )\n\n # PowerFeed\n elif term_cls == PowerFeed:\n\n attrs[f'termination_{cable_end}_powerpanel'] = DynamicModelChoiceField(\n queryset=PowerPanel.objects.all(),\n label=_('Power Panel'),\n required=False,\n selector=True,\n initial_params={\n 'powerfeeds__in': f'${cable_end}_terminations'\n }\n )\n attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(\n queryset=term_cls.objects.all(),\n label=_('Power Feed'),\n context={\n 'disabled': '_occupied',\n },\n query_params={\n 'power_panel_id': f'$termination_{cable_end}_powerpanel',\n }\n )\n\n # CircuitTermination\n elif term_cls == CircuitTermination:\n\n attrs[f'termination_{cable_end}_circuit'] = DynamicModelChoiceField(\n queryset=Circuit.objects.all(),\n label=_('Circuit'),\n selector=True,\n initial_params={\n 'terminations__in': f'${cable_end}_terminations'\n }\n )\n attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(\n queryset=term_cls.objects.all(),\n label=_('Side'),\n context={\n 'disabled': '_occupied',\n },\n query_params={\n 'circuit_id': f'$termination_{cable_end}_circuit',\n }\n )\n\n return super().__new__(mcs, name, bases, attrs)\n\n class _CableForm(CableForm, metaclass=FormMetaclass):\n\n def __init__(self, *args, initial=None, **kwargs):\n\n initial = initial or {}\n if a_type:\n ct = ContentType.objects.get_for_model(a_type)\n initial['a_terminations_type'] = f'{ct.app_label}.{ct.model}'\n if b_type:\n ct = ContentType.objects.get_for_model(b_type)\n initial['b_terminations_type'] = f'{ct.app_label}.{ct.model}'\n\n # TODO: Temporary hack to work around list handling limitations with utils.normalize_querydict()\n for field_name in ('a_terminations', 'b_terminations'):\n if field_name in initial and type(initial[field_name]) is not list:\n initial[field_name] = [initial[field_name]]\n\n super().__init__(*args, initial=initial, **kwargs)\n\n if self.instance and self.instance.pk:\n # Initialize A/B terminations when modifying an existing Cable instance\n self.initial['a_terminations'] = self.instance.a_terminations\n self.initial['b_terminations'] = self.instance.b_terminations\n\n def clean(self):\n super().clean()\n\n # Set the A/B terminations on the Cable instance\n 
self.instance.a_terminations = self.cleaned_data.get('a_terminations', [])\n self.instance.b_terminations = self.cleaned_data.get('b_terminations', [])\n\n return _CableForm\n", "path": "netbox/dcim/forms/connections.py"}], "after_files": [{"content": "from django import forms\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.utils.translation import gettext_lazy as _\n\nfrom circuits.models import Circuit, CircuitTermination\nfrom dcim.models import *\nfrom utilities.forms.fields import DynamicModelChoiceField, DynamicModelMultipleChoiceField\nfrom .model_forms import CableForm\n\n\ndef get_cable_form(a_type, b_type):\n\n class FormMetaclass(forms.models.ModelFormMetaclass):\n\n def __new__(mcs, name, bases, attrs):\n\n for cable_end, term_cls in (('a', a_type), ('b', b_type)):\n\n # Device component\n if hasattr(term_cls, 'device'):\n\n attrs[f'termination_{cable_end}_device'] = DynamicModelChoiceField(\n queryset=Device.objects.all(),\n label=_('Device'),\n required=False,\n selector=True,\n initial_params={\n f'{term_cls._meta.model_name}s__in': f'${cable_end}_terminations'\n }\n )\n attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(\n queryset=term_cls.objects.all(),\n label=term_cls._meta.verbose_name.title(),\n context={\n 'disabled': '_occupied',\n },\n query_params={\n 'device_id': f'$termination_{cable_end}_device',\n 'kind': 'physical', # Exclude virtual interfaces\n }\n )\n\n # PowerFeed\n elif term_cls == PowerFeed:\n\n attrs[f'termination_{cable_end}_powerpanel'] = DynamicModelChoiceField(\n queryset=PowerPanel.objects.all(),\n label=_('Power Panel'),\n required=False,\n selector=True,\n initial_params={\n 'powerfeeds__in': f'${cable_end}_terminations'\n }\n )\n attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(\n queryset=term_cls.objects.all(),\n label=_('Power Feed'),\n context={\n 'disabled': '_occupied',\n },\n query_params={\n 'power_panel_id': f'$termination_{cable_end}_powerpanel',\n }\n )\n\n # CircuitTermination\n elif term_cls == CircuitTermination:\n\n attrs[f'termination_{cable_end}_circuit'] = DynamicModelChoiceField(\n queryset=Circuit.objects.all(),\n label=_('Circuit'),\n selector=True,\n initial_params={\n 'terminations__in': f'${cable_end}_terminations'\n }\n )\n attrs[f'{cable_end}_terminations'] = DynamicModelMultipleChoiceField(\n queryset=term_cls.objects.all(),\n label=_('Side'),\n context={\n 'disabled': '_occupied',\n },\n query_params={\n 'circuit_id': f'$termination_{cable_end}_circuit',\n }\n )\n\n return super().__new__(mcs, name, bases, attrs)\n\n class _CableForm(CableForm, metaclass=FormMetaclass):\n\n def __init__(self, *args, initial=None, **kwargs):\n initial = initial or {}\n\n if a_type:\n a_ct = ContentType.objects.get_for_model(a_type)\n initial['a_terminations_type'] = f'{a_ct.app_label}.{a_ct.model}'\n if b_type:\n b_ct = ContentType.objects.get_for_model(b_type)\n initial['b_terminations_type'] = f'{b_ct.app_label}.{b_ct.model}'\n\n # TODO: Temporary hack to work around list handling limitations with utils.normalize_querydict()\n for field_name in ('a_terminations', 'b_terminations'):\n if field_name in initial and type(initial[field_name]) is not list:\n initial[field_name] = [initial[field_name]]\n\n super().__init__(*args, initial=initial, **kwargs)\n\n if self.instance and self.instance.pk:\n # Initialize A/B terminations when modifying an existing Cable instance\n if a_type and self.instance.a_terminations and a_ct == 
ContentType.objects.get_for_model(self.instance.a_terminations[0]):\n self.initial['a_terminations'] = self.instance.a_terminations\n if b_type and self.instance.b_terminations and b_ct == ContentType.objects.get_for_model(self.instance.b_terminations[0]):\n self.initial['b_terminations'] = self.instance.b_terminations\n else:\n # Need to clear terminations if swapped type - but need to do it only\n # if not from instance\n if a_type:\n initial.pop('a_terminations', None)\n if b_type:\n initial.pop('b_terminations', None)\n\n def clean(self):\n super().clean()\n\n # Set the A/B terminations on the Cable instance\n self.instance.a_terminations = self.cleaned_data.get('a_terminations', [])\n self.instance.b_terminations = self.cleaned_data.get('b_terminations', [])\n\n return _CableForm\n", "path": "netbox/dcim/forms/connections.py"}]} | 1,846 | 529 |
gh_patches_debug_35052 | rasdani/github-patches | git_diff | pypa__virtualenv-1805 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
activate_this.py failed for python2 virtualenvs
**Issue**
It seems that pipenv recently introduced a new type of activate_this.py. On Windows, the content of activate_this.py looks something like this:
prev_length = len(sys.path)
for lib in "'..\\Lib\\site-packages".split(os.pathsep):
path = os.path.realpath(os.path.join(bin_dir, lib))
site.addsitedir(path.decode("utf-8") if "'yes" else path)
sys.path[:] = sys.path[prev_length:] + sys.path[0:prev_length]
```
As you can see, the value `'..\\Lib\\site-packages` is obviously wrong (note the stray leading single quote).
**Environment**
Provide at least:
- OS: Windows 10
- ``pip list`` of the host python where ``virtualenv`` is installed:
```console
virtualenv 20.0.18
virtualenv-clone 0.5.4
```
**Output of the virtual environment creation**
As I'm using virtualenv through pipenv, I failed to grab the virtualenv output directly
```
[ ==] Creating virtual environment...created virtual environment CPython2.7.17.final.0-64 in 641ms
creator CPython2Windows(dest=C:\Users\win10\.virtualenvs\win10-obmjl69F, clear=False, global=False)
seeder FromAppData(download=False, pip=latest, setuptools=latest, wheel=latest, via=copy, app_data_dir=C:\Users\win10\AppData\Local\pypa\virtualenv\seed-app-data\v1.0.1)
activators BashActivator,BatchActivator,FishActivator,PowerShellActivator,PythonActivator
```
However, I've located the related code and printed out its runtime variable information.
The following is the output of the _repr_unicode function in ```src/virtualenv/activation/python/__init__.py```
```
'(win10)
'C:\\Users\\win10\\.virtualenvs\\win10-obmjl69F
'win10-obmjl69F
'Scripts
';
'..\\Lib\\site-packages
'yes
```
As you can see, there's an additional `'` before each item. I've done a small experiment on Python 3.6 and 3.7:
```
>>> value = "..\\123456"
>>> repr(value.encode("utf-8"))
"b'..\\\\123456'"
>>> repr(value.encode("utf-8"))[1:-1]
"'..\\\\123456"
>>>
```
I believe there's something wrong with this function. It was introduced in PR #1503.
--- END ISSUE ---
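The experiment in the issue is exactly the root cause: on Python 3, `repr()` of a bytes object carries a `b` prefix, so slicing off a single leading character leaves the opening quote in place. A minimal standalone reproduction of the mismatch and its fix (plain Python, no virtualenv imports needed):

```python
import sys

value = u"..\\Lib\\site-packages"
text = repr(value.encode("utf-8"))
# Python 3: "b'..\\\\Lib\\\\site-packages'" -> strip the b and both quotes
# Python 2: "'..\\\\Lib\\\\site-packages'"  -> strip only the quotes
start = 2 if sys.version_info[0] == 3 else 1
print(text[start:-1])   # ..\\Lib\\site-packages on both versions
```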
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/virtualenv/activation/python/__init__.py`
Content:
```
1 from __future__ import absolute_import, unicode_literals
2
3 import os
4 from collections import OrderedDict
5
6 from virtualenv.util.path import Path
7 from virtualenv.util.six import ensure_text
8
9 from ..via_template import ViaTemplateActivator
10
11
12 class PythonActivator(ViaTemplateActivator):
13 def templates(self):
14 yield Path("activate_this.py")
15
16 def replacements(self, creator, dest_folder):
17 replacements = super(PythonActivator, self).replacements(creator, dest_folder)
18 lib_folders = OrderedDict((os.path.relpath(str(i), str(dest_folder)), None) for i in creator.libs)
19 win_py2 = creator.interpreter.platform == "win32" and creator.interpreter.version_info.major == 2
20 replacements.update(
21 {
22 "__LIB_FOLDERS__": ensure_text(os.pathsep.join(lib_folders.keys())),
23 "__DECODE_PATH__": ("yes" if win_py2 else ""),
24 }
25 )
26 return replacements
27
28 @staticmethod
29 def _repr_unicode(creator, value):
30 py2 = creator.interpreter.version_info.major == 2
31 if py2: # on Python 2 we need to encode this into explicit utf-8, py3 supports unicode literals
32 value = ensure_text(repr(value.encode("utf-8"))[1:-1])
33 return value
34
```
Path: `src/virtualenv/activation/via_template.py`
Content:
```
1 from __future__ import absolute_import, unicode_literals
2
3 import os
4 import sys
5 from abc import ABCMeta, abstractmethod
6
7 from six import add_metaclass
8
9 from virtualenv.util.six import ensure_text
10
11 from .activator import Activator
12
13 if sys.version_info >= (3, 7):
14 from importlib.resources import read_text
15 else:
16 from importlib_resources import read_text
17
18
19 @add_metaclass(ABCMeta)
20 class ViaTemplateActivator(Activator):
21 @abstractmethod
22 def templates(self):
23 raise NotImplementedError
24
25 def generate(self, creator):
26 dest_folder = creator.bin_dir
27 replacements = self.replacements(creator, dest_folder)
28 self._generate(replacements, self.templates(), dest_folder, creator)
29 if self.flag_prompt is not None:
30 creator.pyenv_cfg["prompt"] = self.flag_prompt
31
32 def replacements(self, creator, dest_folder):
33 return {
34 "__VIRTUAL_PROMPT__": "" if self.flag_prompt is None else self.flag_prompt,
35 "__VIRTUAL_ENV__": ensure_text(str(creator.dest)),
36 "__VIRTUAL_NAME__": creator.env_name,
37 "__BIN_NAME__": ensure_text(str(creator.bin_dir.relative_to(creator.dest))),
38 "__PATH_SEP__": ensure_text(os.pathsep),
39 }
40
41 def _generate(self, replacements, templates, to_folder, creator):
42 for template in templates:
43 text = self.instantiate_template(replacements, template, creator)
44 dest = to_folder / self.as_name(template)
45 dest.write_text(text, encoding="utf-8")
46
47 def as_name(self, template):
48 return template.name
49
50 def instantiate_template(self, replacements, template, creator):
51 # read text and do replacements
52 text = read_text(self.__module__, str(template), encoding="utf-8", errors="strict")
53 for key, value in replacements.items():
54 value = self._repr_unicode(creator, value)
55 text = text.replace(key, value)
56 return text
57
58 @staticmethod
59 def _repr_unicode(creator, value):
60 # by default we just let it be unicode
61 return value
62
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/virtualenv/activation/python/__init__.py b/src/virtualenv/activation/python/__init__.py
--- a/src/virtualenv/activation/python/__init__.py
+++ b/src/virtualenv/activation/python/__init__.py
@@ -1,6 +1,7 @@
from __future__ import absolute_import, unicode_literals
import os
+import sys
from collections import OrderedDict
from virtualenv.util.path import Path
@@ -29,5 +30,6 @@
def _repr_unicode(creator, value):
py2 = creator.interpreter.version_info.major == 2
if py2: # on Python 2 we need to encode this into explicit utf-8, py3 supports unicode literals
- value = ensure_text(repr(value.encode("utf-8"))[1:-1])
+ start = 2 if sys.version_info[0] == 3 else 1
+ value = ensure_text(repr(value.encode("utf-8"))[start:-1])
return value
diff --git a/src/virtualenv/activation/via_template.py b/src/virtualenv/activation/via_template.py
--- a/src/virtualenv/activation/via_template.py
+++ b/src/virtualenv/activation/via_template.py
@@ -25,9 +25,10 @@
def generate(self, creator):
dest_folder = creator.bin_dir
replacements = self.replacements(creator, dest_folder)
- self._generate(replacements, self.templates(), dest_folder, creator)
+ generated = self._generate(replacements, self.templates(), dest_folder, creator)
if self.flag_prompt is not None:
creator.pyenv_cfg["prompt"] = self.flag_prompt
+ return generated
def replacements(self, creator, dest_folder):
return {
@@ -39,10 +40,13 @@
}
def _generate(self, replacements, templates, to_folder, creator):
+ generated = []
for template in templates:
text = self.instantiate_template(replacements, template, creator)
dest = to_folder / self.as_name(template)
dest.write_text(text, encoding="utf-8")
+ generated.append(dest)
+ return generated
def as_name(self, template):
return template.name
| {"golden_diff": "diff --git a/src/virtualenv/activation/python/__init__.py b/src/virtualenv/activation/python/__init__.py\n--- a/src/virtualenv/activation/python/__init__.py\n+++ b/src/virtualenv/activation/python/__init__.py\n@@ -1,6 +1,7 @@\n from __future__ import absolute_import, unicode_literals\n \n import os\n+import sys\n from collections import OrderedDict\n \n from virtualenv.util.path import Path\n@@ -29,5 +30,6 @@\n def _repr_unicode(creator, value):\n py2 = creator.interpreter.version_info.major == 2\n if py2: # on Python 2 we need to encode this into explicit utf-8, py3 supports unicode literals\n- value = ensure_text(repr(value.encode(\"utf-8\"))[1:-1])\n+ start = 2 if sys.version_info[0] == 3 else 1\n+ value = ensure_text(repr(value.encode(\"utf-8\"))[start:-1])\n return value\ndiff --git a/src/virtualenv/activation/via_template.py b/src/virtualenv/activation/via_template.py\n--- a/src/virtualenv/activation/via_template.py\n+++ b/src/virtualenv/activation/via_template.py\n@@ -25,9 +25,10 @@\n def generate(self, creator):\n dest_folder = creator.bin_dir\n replacements = self.replacements(creator, dest_folder)\n- self._generate(replacements, self.templates(), dest_folder, creator)\n+ generated = self._generate(replacements, self.templates(), dest_folder, creator)\n if self.flag_prompt is not None:\n creator.pyenv_cfg[\"prompt\"] = self.flag_prompt\n+ return generated\n \n def replacements(self, creator, dest_folder):\n return {\n@@ -39,10 +40,13 @@\n }\n \n def _generate(self, replacements, templates, to_folder, creator):\n+ generated = []\n for template in templates:\n text = self.instantiate_template(replacements, template, creator)\n dest = to_folder / self.as_name(template)\n dest.write_text(text, encoding=\"utf-8\")\n+ generated.append(dest)\n+ return generated\n \n def as_name(self, template):\n return template.name\n", "issue": "activate_this.py failed for python2 virtualenvs\n**Issue**\r\n\r\nIt seems recently pipenv introduced a new type of activate_this.py. 
On windows the content of activate_this.py has something like this:\r\n```\r\nprev_length = len(sys.path)\r\nfor lib in \"'..\\\\Lib\\\\site-packages\".split(os.pathsep):\r\n path = os.path.realpath(os.path.join(bin_dir, lib))\r\n site.addsitedir(path.decode(\"utf-8\") if \"'yes\" else path)\r\nsys.path[:] = sys.path[prev_length:] + sys.path[0:prev_length]\r\n```\r\nAs you can see the \"'..\\\\Lib\\\\site-packages\" is obviously wrong.\r\n\r\n**Environment**\r\n\r\nProvide at least:\r\n- OS: Windows 10\r\n- ``pip list`` of the host python where ``virtualenv`` is installed:\r\n\r\n ```console\r\n virtualenv 20.0.18\r\n virtualenv-clone 0.5.4\r\n ```\r\n\r\n**Output of the virtual environment creation**\r\n\r\nAs I'm using virtualenv through pipenv, so I failed to grab the virtualenv output\r\n\r\n```\r\n[ ==] Creating virtual environment...created virtual environment CPython2.7.17.final.0-64 in 641ms\r\n\r\n creator CPython2Windows(dest=C:\\Users\\win10\\.virtualenvs\\win10-obmjl69F, clear=False, global=False)\r\n\r\n seeder FromAppData(download=False, pip=latest, setuptools=latest, wheel=latest, via=copy, app_data_dir=C:\\Users\\win10\\AppData\\Local\\pypa\\virtualenv\\seed-app-data\\v1.0.1)\r\n\r\n activators BashActivator,BatchActivator,FishActivator,PowerShellActivator,PythonActivator\r\n```\r\n\r\nHowever I've located the related code and wrote out its runtime variable information:\r\nThe following is the output of _repr_unicode function in ```src/virtualenv/activation/python/__init__.py```\r\n\r\n```\r\n'(win10) \r\n'C:\\\\Users\\\\win10\\\\.virtualenvs\\\\win10-obmjl69F\r\n'win10-obmjl69F\r\n'Scripts\r\n';\r\n'..\\\\Lib\\\\site-packages\r\n'yes\r\n```\r\nAs you can see, there's an additional ' before each item. I've done a small experiment on python 3.6 and 3.7:\r\n```\r\n>>> value = \"..\\\\123456\"\r\n>>> repr(value.encode(\"utf-8\"))\r\n\"b'..\\\\\\\\123456'\"\r\n>>> repr(value.encode(\"utf-8\"))[1:-1]\r\n\"'..\\\\\\\\123456\"\r\n>>>\r\n```\r\nI believe there's something wrong with this function. 
This function is introduced in PR #1503 \n", "before_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nimport os\nfrom collections import OrderedDict\n\nfrom virtualenv.util.path import Path\nfrom virtualenv.util.six import ensure_text\n\nfrom ..via_template import ViaTemplateActivator\n\n\nclass PythonActivator(ViaTemplateActivator):\n def templates(self):\n yield Path(\"activate_this.py\")\n\n def replacements(self, creator, dest_folder):\n replacements = super(PythonActivator, self).replacements(creator, dest_folder)\n lib_folders = OrderedDict((os.path.relpath(str(i), str(dest_folder)), None) for i in creator.libs)\n win_py2 = creator.interpreter.platform == \"win32\" and creator.interpreter.version_info.major == 2\n replacements.update(\n {\n \"__LIB_FOLDERS__\": ensure_text(os.pathsep.join(lib_folders.keys())),\n \"__DECODE_PATH__\": (\"yes\" if win_py2 else \"\"),\n }\n )\n return replacements\n\n @staticmethod\n def _repr_unicode(creator, value):\n py2 = creator.interpreter.version_info.major == 2\n if py2: # on Python 2 we need to encode this into explicit utf-8, py3 supports unicode literals\n value = ensure_text(repr(value.encode(\"utf-8\"))[1:-1])\n return value\n", "path": "src/virtualenv/activation/python/__init__.py"}, {"content": "from __future__ import absolute_import, unicode_literals\n\nimport os\nimport sys\nfrom abc import ABCMeta, abstractmethod\n\nfrom six import add_metaclass\n\nfrom virtualenv.util.six import ensure_text\n\nfrom .activator import Activator\n\nif sys.version_info >= (3, 7):\n from importlib.resources import read_text\nelse:\n from importlib_resources import read_text\n\n\n@add_metaclass(ABCMeta)\nclass ViaTemplateActivator(Activator):\n @abstractmethod\n def templates(self):\n raise NotImplementedError\n\n def generate(self, creator):\n dest_folder = creator.bin_dir\n replacements = self.replacements(creator, dest_folder)\n self._generate(replacements, self.templates(), dest_folder, creator)\n if self.flag_prompt is not None:\n creator.pyenv_cfg[\"prompt\"] = self.flag_prompt\n\n def replacements(self, creator, dest_folder):\n return {\n \"__VIRTUAL_PROMPT__\": \"\" if self.flag_prompt is None else self.flag_prompt,\n \"__VIRTUAL_ENV__\": ensure_text(str(creator.dest)),\n \"__VIRTUAL_NAME__\": creator.env_name,\n \"__BIN_NAME__\": ensure_text(str(creator.bin_dir.relative_to(creator.dest))),\n \"__PATH_SEP__\": ensure_text(os.pathsep),\n }\n\n def _generate(self, replacements, templates, to_folder, creator):\n for template in templates:\n text = self.instantiate_template(replacements, template, creator)\n dest = to_folder / self.as_name(template)\n dest.write_text(text, encoding=\"utf-8\")\n\n def as_name(self, template):\n return template.name\n\n def instantiate_template(self, replacements, template, creator):\n # read text and do replacements\n text = read_text(self.__module__, str(template), encoding=\"utf-8\", errors=\"strict\")\n for key, value in replacements.items():\n value = self._repr_unicode(creator, value)\n text = text.replace(key, value)\n return text\n\n @staticmethod\n def _repr_unicode(creator, value):\n # by default we just let it be unicode\n return value\n", "path": "src/virtualenv/activation/via_template.py"}], "after_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nimport os\nimport sys\nfrom collections import OrderedDict\n\nfrom virtualenv.util.path import Path\nfrom virtualenv.util.six import ensure_text\n\nfrom ..via_template import 
ViaTemplateActivator\n\n\nclass PythonActivator(ViaTemplateActivator):\n def templates(self):\n yield Path(\"activate_this.py\")\n\n def replacements(self, creator, dest_folder):\n replacements = super(PythonActivator, self).replacements(creator, dest_folder)\n lib_folders = OrderedDict((os.path.relpath(str(i), str(dest_folder)), None) for i in creator.libs)\n win_py2 = creator.interpreter.platform == \"win32\" and creator.interpreter.version_info.major == 2\n replacements.update(\n {\n \"__LIB_FOLDERS__\": ensure_text(os.pathsep.join(lib_folders.keys())),\n \"__DECODE_PATH__\": (\"yes\" if win_py2 else \"\"),\n }\n )\n return replacements\n\n @staticmethod\n def _repr_unicode(creator, value):\n py2 = creator.interpreter.version_info.major == 2\n if py2: # on Python 2 we need to encode this into explicit utf-8, py3 supports unicode literals\n start = 2 if sys.version_info[0] == 3 else 1\n value = ensure_text(repr(value.encode(\"utf-8\"))[start:-1])\n return value\n", "path": "src/virtualenv/activation/python/__init__.py"}, {"content": "from __future__ import absolute_import, unicode_literals\n\nimport os\nimport sys\nfrom abc import ABCMeta, abstractmethod\n\nfrom six import add_metaclass\n\nfrom virtualenv.util.six import ensure_text\n\nfrom .activator import Activator\n\nif sys.version_info >= (3, 7):\n from importlib.resources import read_text\nelse:\n from importlib_resources import read_text\n\n\n@add_metaclass(ABCMeta)\nclass ViaTemplateActivator(Activator):\n @abstractmethod\n def templates(self):\n raise NotImplementedError\n\n def generate(self, creator):\n dest_folder = creator.bin_dir\n replacements = self.replacements(creator, dest_folder)\n generated = self._generate(replacements, self.templates(), dest_folder, creator)\n if self.flag_prompt is not None:\n creator.pyenv_cfg[\"prompt\"] = self.flag_prompt\n return generated\n\n def replacements(self, creator, dest_folder):\n return {\n \"__VIRTUAL_PROMPT__\": \"\" if self.flag_prompt is None else self.flag_prompt,\n \"__VIRTUAL_ENV__\": ensure_text(str(creator.dest)),\n \"__VIRTUAL_NAME__\": creator.env_name,\n \"__BIN_NAME__\": ensure_text(str(creator.bin_dir.relative_to(creator.dest))),\n \"__PATH_SEP__\": ensure_text(os.pathsep),\n }\n\n def _generate(self, replacements, templates, to_folder, creator):\n generated = []\n for template in templates:\n text = self.instantiate_template(replacements, template, creator)\n dest = to_folder / self.as_name(template)\n dest.write_text(text, encoding=\"utf-8\")\n generated.append(dest)\n return generated\n\n def as_name(self, template):\n return template.name\n\n def instantiate_template(self, replacements, template, creator):\n # read text and do replacements\n text = read_text(self.__module__, str(template), encoding=\"utf-8\", errors=\"strict\")\n for key, value in replacements.items():\n value = self._repr_unicode(creator, value)\n text = text.replace(key, value)\n return text\n\n @staticmethod\n def _repr_unicode(creator, value):\n # by default we just let it be unicode\n return value\n", "path": "src/virtualenv/activation/via_template.py"}]} | 1,799 | 495 |
gh_patches_debug_2721 | rasdani/github-patches | git_diff | benoitc__gunicorn-1708 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
gunicorn crashed on start with --reload flag
Setup: Vagrant, virtualenv, gunicorn 19.3.0:
The following command produces this stack:
`gunicorn -c /data/shared/api/gunicorn_config.py -b unix:/tmp/api-dev-gunicorn.sock --log-level INFO --reload wsgi:app`
```
Exception in thread Thread-1:
Traceback (most recent call last):
File "/home/vagrant/.pyenv/versions/2.7.6/lib/python2.7/threading.py", line 810, in __bootstrap_inner
self.run()
File "/data/virtualenv/default/lib/python2.7/site-packages/gunicorn/reloader.py", line 41, in run
for filename in self.get_files():
File "/data/virtualenv/default/lib/python2.7/site-packages/gunicorn/reloader.py", line 30, in get_files
if hasattr(module, '__file__')
File "/data/virtualenv/default/lib/python2.7/re.py", line 151, in sub
return _compile(pattern, flags).sub(repl, string, count)
TypeError: expected string or buffer
```
If I remove --reload it boots up fine.
--- END ISSUE ---
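The traceback points at a module whose `__file__` attribute exists but is `None` (some dynamically created modules and certain namespace packages behave this way): `hasattr` is true, yet `re.sub` then receives `None`. A minimal reproduction sketch:

```python
import re
import types

mod = types.ModuleType("demo")
mod.__file__ = None            # mimics a module with __file__ set to None

print(hasattr(mod, '__file__'))   # True -> the old filter lets it through
# re.sub('py[co]$', 'py', mod.__file__)
#   -> TypeError (expected string or buffer on Python 2)

# the patched condition is falsy for None (and for missing attributes alike)
fnames = [
    re.sub('py[co]$', 'py', m.__file__)
    for m in [mod]
    if getattr(m, '__file__', None)
]
print(fnames)                  # []
```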
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gunicorn/reloader.py`
Content:
```
1 # -*- coding: utf-8 -
2 #
3 # This file is part of gunicorn released under the MIT license.
4 # See the NOTICE for more information.
5
6 import os
7 import os.path
8 import re
9 import sys
10 import time
11 import threading
12
13
14 class Reloader(threading.Thread):
15 def __init__(self, extra_files=None, interval=1, callback=None):
16 super(Reloader, self).__init__()
17 self.setDaemon(True)
18 self._extra_files = set(extra_files or ())
19 self._extra_files_lock = threading.RLock()
20 self._interval = interval
21 self._callback = callback
22
23 def add_extra_file(self, filename):
24 with self._extra_files_lock:
25 self._extra_files.add(filename)
26
27 def get_files(self):
28 fnames = [
29 re.sub('py[co]$', 'py', module.__file__)
30 for module in list(sys.modules.values())
31 if hasattr(module, '__file__')
32 ]
33
34 with self._extra_files_lock:
35 fnames.extend(self._extra_files)
36
37 return fnames
38
39 def run(self):
40 mtimes = {}
41 while True:
42 for filename in self.get_files():
43 try:
44 mtime = os.stat(filename).st_mtime
45 except OSError:
46 continue
47 old_time = mtimes.get(filename)
48 if old_time is None:
49 mtimes[filename] = mtime
50 continue
51 elif mtime > old_time:
52 if self._callback:
53 self._callback(filename)
54 time.sleep(self._interval)
55
56 has_inotify = False
57 if sys.platform.startswith('linux'):
58 try:
59 from inotify.adapters import Inotify
60 import inotify.constants
61 has_inotify = True
62 except ImportError:
63 pass
64
65
66 if has_inotify:
67
68 class InotifyReloader(threading.Thread):
69 event_mask = (inotify.constants.IN_CREATE | inotify.constants.IN_DELETE
70 | inotify.constants.IN_DELETE_SELF | inotify.constants.IN_MODIFY
71 | inotify.constants.IN_MOVE_SELF | inotify.constants.IN_MOVED_FROM
72 | inotify.constants.IN_MOVED_TO)
73
74 def __init__(self, extra_files=None, callback=None):
75 super(InotifyReloader, self).__init__()
76 self.setDaemon(True)
77 self._callback = callback
78 self._dirs = set()
79 self._watcher = Inotify()
80
81 for extra_file in extra_files:
82 self.add_extra_file(extra_file)
83
84 def add_extra_file(self, filename):
85 dirname = os.path.dirname(filename)
86
87 if dirname in self._dirs:
88 return
89
90 self._watcher.add_watch(dirname, mask=self.event_mask)
91 self._dirs.add(dirname)
92
93 def get_dirs(self):
94 fnames = [
95 os.path.dirname(re.sub('py[co]$', 'py', module.__file__))
96 for module in list(sys.modules.values())
97 if hasattr(module, '__file__')
98 ]
99
100 return set(fnames)
101
102 def run(self):
103 self._dirs = self.get_dirs()
104
105 for dirname in self._dirs:
106 self._watcher.add_watch(dirname, mask=self.event_mask)
107
108 for event in self._watcher.event_gen():
109 if event is None:
110 continue
111
112 filename = event[3]
113
114 self._callback(filename)
115
116 else:
117
118 class InotifyReloader(object):
119 def __init__(self, callback=None):
120 raise ImportError('You must have the inotify module installed to '
121 'use the inotify reloader')
122
123
124 preferred_reloader = InotifyReloader if has_inotify else Reloader
125
126 reloader_engines = {
127 'auto': preferred_reloader,
128 'poll': Reloader,
129 'inotify': InotifyReloader,
130 }
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gunicorn/reloader.py b/gunicorn/reloader.py
--- a/gunicorn/reloader.py
+++ b/gunicorn/reloader.py
@@ -28,7 +28,7 @@
fnames = [
re.sub('py[co]$', 'py', module.__file__)
for module in list(sys.modules.values())
- if hasattr(module, '__file__')
+ if getattr(module, '__file__', None)
]
with self._extra_files_lock:
| {"golden_diff": "diff --git a/gunicorn/reloader.py b/gunicorn/reloader.py\n--- a/gunicorn/reloader.py\n+++ b/gunicorn/reloader.py\n@@ -28,7 +28,7 @@\n fnames = [\n re.sub('py[co]$', 'py', module.__file__)\n for module in list(sys.modules.values())\n- if hasattr(module, '__file__')\n+ if getattr(module, '__file__', None)\n ]\n \n with self._extra_files_lock:\n", "issue": "gunicorn crashed on start with --reload flag\nSetup: Vagrant, virtualenv, gunicorn 19.3.0:\n\nThe following command produces this stack:\n\n`gunicorn -c /data/shared/api/gunicorn_config.py -b unix:/tmp/api-dev-gunicorn.sock --log-level INFO --reload wsgi:app`\n\n```\nException in thread Thread-1:\nTraceback (most recent call last):\n File \"/home/vagrant/.pyenv/versions/2.7.6/lib/python2.7/threading.py\", line 810, in __bootstrap_inner\n self.run()\n File \"/data/virtualenv/default/lib/python2.7/site-packages/gunicorn/reloader.py\", line 41, in run\n for filename in self.get_files():\n File \"/data/virtualenv/default/lib/python2.7/site-packages/gunicorn/reloader.py\", line 30, in get_files\n if hasattr(module, '__file__')\n File \"/data/virtualenv/default/lib/python2.7/re.py\", line 151, in sub\n return _compile(pattern, flags).sub(repl, string, count)\nTypeError: expected string or buffer\n```\n\nIf I remove --reload it boots up fine.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -\n#\n# This file is part of gunicorn released under the MIT license.\n# See the NOTICE for more information.\n\nimport os\nimport os.path\nimport re\nimport sys\nimport time\nimport threading\n\n\nclass Reloader(threading.Thread):\n def __init__(self, extra_files=None, interval=1, callback=None):\n super(Reloader, self).__init__()\n self.setDaemon(True)\n self._extra_files = set(extra_files or ())\n self._extra_files_lock = threading.RLock()\n self._interval = interval\n self._callback = callback\n\n def add_extra_file(self, filename):\n with self._extra_files_lock:\n self._extra_files.add(filename)\n\n def get_files(self):\n fnames = [\n re.sub('py[co]$', 'py', module.__file__)\n for module in list(sys.modules.values())\n if hasattr(module, '__file__')\n ]\n\n with self._extra_files_lock:\n fnames.extend(self._extra_files)\n\n return fnames\n\n def run(self):\n mtimes = {}\n while True:\n for filename in self.get_files():\n try:\n mtime = os.stat(filename).st_mtime\n except OSError:\n continue\n old_time = mtimes.get(filename)\n if old_time is None:\n mtimes[filename] = mtime\n continue\n elif mtime > old_time:\n if self._callback:\n self._callback(filename)\n time.sleep(self._interval)\n\nhas_inotify = False\nif sys.platform.startswith('linux'):\n try:\n from inotify.adapters import Inotify\n import inotify.constants\n has_inotify = True\n except ImportError:\n pass\n\n\nif has_inotify:\n\n class InotifyReloader(threading.Thread):\n event_mask = (inotify.constants.IN_CREATE | inotify.constants.IN_DELETE\n | inotify.constants.IN_DELETE_SELF | inotify.constants.IN_MODIFY\n | inotify.constants.IN_MOVE_SELF | inotify.constants.IN_MOVED_FROM\n | inotify.constants.IN_MOVED_TO)\n\n def __init__(self, extra_files=None, callback=None):\n super(InotifyReloader, self).__init__()\n self.setDaemon(True)\n self._callback = callback\n self._dirs = set()\n self._watcher = Inotify()\n\n for extra_file in extra_files:\n self.add_extra_file(extra_file)\n\n def add_extra_file(self, filename):\n dirname = os.path.dirname(filename)\n\n if dirname in self._dirs:\n return\n\n self._watcher.add_watch(dirname, mask=self.event_mask)\n 
self._dirs.add(dirname)\n\n def get_dirs(self):\n fnames = [\n os.path.dirname(re.sub('py[co]$', 'py', module.__file__))\n for module in list(sys.modules.values())\n if hasattr(module, '__file__')\n ]\n\n return set(fnames)\n\n def run(self):\n self._dirs = self.get_dirs()\n\n for dirname in self._dirs:\n self._watcher.add_watch(dirname, mask=self.event_mask)\n\n for event in self._watcher.event_gen():\n if event is None:\n continue\n\n filename = event[3]\n\n self._callback(filename)\n\nelse:\n\n class InotifyReloader(object):\n def __init__(self, callback=None):\n raise ImportError('You must have the inotify module installed to '\n 'use the inotify reloader')\n\n\npreferred_reloader = InotifyReloader if has_inotify else Reloader\n\nreloader_engines = {\n 'auto': preferred_reloader,\n 'poll': Reloader,\n 'inotify': InotifyReloader,\n}\n", "path": "gunicorn/reloader.py"}], "after_files": [{"content": "# -*- coding: utf-8 -\n#\n# This file is part of gunicorn released under the MIT license.\n# See the NOTICE for more information.\n\nimport os\nimport os.path\nimport re\nimport sys\nimport time\nimport threading\n\n\nclass Reloader(threading.Thread):\n def __init__(self, extra_files=None, interval=1, callback=None):\n super(Reloader, self).__init__()\n self.setDaemon(True)\n self._extra_files = set(extra_files or ())\n self._extra_files_lock = threading.RLock()\n self._interval = interval\n self._callback = callback\n\n def add_extra_file(self, filename):\n with self._extra_files_lock:\n self._extra_files.add(filename)\n\n def get_files(self):\n fnames = [\n re.sub('py[co]$', 'py', module.__file__)\n for module in list(sys.modules.values())\n if getattr(module, '__file__', None)\n ]\n\n with self._extra_files_lock:\n fnames.extend(self._extra_files)\n\n return fnames\n\n def run(self):\n mtimes = {}\n while True:\n for filename in self.get_files():\n try:\n mtime = os.stat(filename).st_mtime\n except OSError:\n continue\n old_time = mtimes.get(filename)\n if old_time is None:\n mtimes[filename] = mtime\n continue\n elif mtime > old_time:\n if self._callback:\n self._callback(filename)\n time.sleep(self._interval)\n\nhas_inotify = False\nif sys.platform.startswith('linux'):\n try:\n from inotify.adapters import Inotify\n import inotify.constants\n has_inotify = True\n except ImportError:\n pass\n\n\nif has_inotify:\n\n class InotifyReloader(threading.Thread):\n event_mask = (inotify.constants.IN_CREATE | inotify.constants.IN_DELETE\n | inotify.constants.IN_DELETE_SELF | inotify.constants.IN_MODIFY\n | inotify.constants.IN_MOVE_SELF | inotify.constants.IN_MOVED_FROM\n | inotify.constants.IN_MOVED_TO)\n\n def __init__(self, extra_files=None, callback=None):\n super(InotifyReloader, self).__init__()\n self.setDaemon(True)\n self._callback = callback\n self._dirs = set()\n self._watcher = Inotify()\n\n for extra_file in extra_files:\n self.add_extra_file(extra_file)\n\n def add_extra_file(self, filename):\n dirname = os.path.dirname(filename)\n\n if dirname in self._dirs:\n return\n\n self._watcher.add_watch(dirname, mask=self.event_mask)\n self._dirs.add(dirname)\n\n def get_dirs(self):\n fnames = [\n os.path.dirname(re.sub('py[co]$', 'py', module.__file__))\n for module in list(sys.modules.values())\n if hasattr(module, '__file__')\n ]\n\n return set(fnames)\n\n def run(self):\n self._dirs = self.get_dirs()\n\n for dirname in self._dirs:\n self._watcher.add_watch(dirname, mask=self.event_mask)\n\n for event in self._watcher.event_gen():\n if event is None:\n continue\n\n filename = event[3]\n\n 
self._callback(filename)\n\nelse:\n\n class InotifyReloader(object):\n def __init__(self, callback=None):\n raise ImportError('You must have the inotify module installed to '\n 'use the inotify reloader')\n\n\npreferred_reloader = InotifyReloader if has_inotify else Reloader\n\nreloader_engines = {\n 'auto': preferred_reloader,\n 'poll': Reloader,\n 'inotify': InotifyReloader,\n}\n", "path": "gunicorn/reloader.py"}]} | 1,612 | 106 |
gh_patches_debug_21397 | rasdani/github-patches | git_diff | liqd__a4-meinberlin-4092 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
testing 5024: order of filter
**URL:** https://meinberlin-dev.liqd.net/projekte/burgerhaushalt-spandau/?mode=list
**user:** any
**expected behaviour:**
**behaviour:**
**important screensize:**
**device & browser:**
**Comment/Question:** The order of the filters was 1) category, 2) archive. It should stay this way. Since we will touch the filters again later, we could also fix it then, but we would need to remember. So if it's easy, maybe change it now?
Screenshot?
<img width="490" alt="Bildschirmfoto 2021-12-21 um 16 25 15" src="https://user-images.githubusercontent.com/35491681/146955180-11799600-c739-4d17-8f84-7581b57a861b.png">
--- END ISSUE ---
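Since the API simply serialises the `filters` dict built in the `list()` method shown below, and Python dicts preserve insertion order (guaranteed from Python 3.7), the frontend shows the filters in the order they are assigned. Restoring the old order is therefore presumably a matter of building `category` before `is_archived`; a minimal sketch of the idea, with placeholder values rather than the real filter definitions:

```python
filters = {}
filters['ordering'] = {'label': 'Ordering'}
filters['category'] = {'label': 'Category'}      # moved ahead of the archive filter
filters['is_archived'] = {'label': 'Archived'}

print(list(filters))   # ['ordering', 'category', 'is_archived']
```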
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/budgeting/api.py`
Content:
```
1 from django.utils.translation import get_language
2 from django.utils.translation import gettext_lazy as _
3 from django_filters.rest_framework import DjangoFilterBackend
4 from rest_framework import mixins
5 from rest_framework import viewsets
6 from rest_framework.filters import OrderingFilter
7 from rest_framework.pagination import PageNumberPagination
8
9 from adhocracy4.api.mixins import ModuleMixin
10 from adhocracy4.api.permissions import ViewSetRulesPermission
11 from adhocracy4.categories import get_category_icon_url
12 from adhocracy4.categories import has_icons
13 from adhocracy4.categories.models import Category
14 from meinberlin.apps.contrib.filters import IdeaCategoryFilterBackend
15 from meinberlin.apps.votes.api import VotingTokenInfoMixin
16
17 from .models import Proposal
18 from .serializers import ProposalSerializer
19
20
21 # To be changed to a more general IdeaPagination, when using
22 # pagination via rest api for more idea lists
23 class ProposalPagination(PageNumberPagination):
24 page_size = 15
25
26 def get_paginated_response(self, data):
27 response = super(ProposalPagination, self).get_paginated_response(data)
28 response.data['page_size'] = self.page_size
29 response.data['page_count'] = self.page.paginator.num_pages
30 return response
31
32
33 class LocaleInfoMixin:
34 def list(self, request, *args, **kwargs):
35 response = super().list(request, args, kwargs)
36 response.data['locale'] = get_language()
37 return response
38
39
40 class ProposalFilterInfoMixin(ModuleMixin):
41 def list(self, request, *args, **kwargs):
42 """Add the filter information to the data of the Proposal API.
43
44 Needs to be used with rest_framework.mixins.ListModelMixin
45 """
46 filters = {}
47
48 ordering_choices = [('-created', _('Most recent')), ]
49 if self.module.has_feature('rate', Proposal):
50 ordering_choices += ('-positive_rating_count', _('Most popular')),
51 ordering_choices += ('-comment_count', _('Most commented')),
52
53 filters['ordering'] = {
54 'label': _('Ordering'),
55 'choices': ordering_choices,
56 'default': '-created',
57 }
58
59 filters['is_archived'] = {
60 'label': _('Archived'),
61 'choices': [
62 ('', _('All')),
63 ('false', _('No')),
64 ('true', _('Yes')),
65 ],
66 'default': 'false',
67 }
68
69 categories = Category.objects.filter(
70 module=self.module
71 )
72 if categories:
73 category_choices = [('', _('All')), ]
74 if has_icons(self.module):
75 category_icons = []
76 for category in categories:
77 category_choices += (str(category.pk), category.name),
78 if has_icons(self.module):
79 icon_name = getattr(category, 'icon', None)
80 icon_url = get_category_icon_url(icon_name)
81 category_icons += (str(category.pk), icon_url),
82
83 filters['category'] = {
84 'label': _('Category'),
85 'choices': category_choices,
86 }
87 if has_icons(self.module):
88 filters['category']['icons'] = category_icons
89
90 response = super().list(request, args, kwargs)
91 response.data['filters'] = filters
92 return response
93
94
95 class ProposalViewSet(ProposalFilterInfoMixin,
96 LocaleInfoMixin,
97 VotingTokenInfoMixin,
98 mixins.ListModelMixin,
99 viewsets.GenericViewSet,
100 ):
101
102 pagination_class = ProposalPagination
103 serializer_class = ProposalSerializer
104 permission_classes = (ViewSetRulesPermission,)
105 filter_backends = (DjangoFilterBackend,
106 OrderingFilter,
107 IdeaCategoryFilterBackend,)
108 filter_fields = ('is_archived', 'category',)
109 ordering_fields = ('created',
110 'comment_count',
111 'positive_rating_count',)
112
113 def get_permission_object(self):
114 return self.module
115
116 def get_queryset(self):
117 proposals = Proposal.objects\
118 .filter(module=self.module) \
119 .annotate_comment_count() \
120 .annotate_positive_rating_count() \
121 .annotate_negative_rating_count() \
122 .order_by('-created')
123 return proposals
124
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/meinberlin/apps/budgeting/api.py b/meinberlin/apps/budgeting/api.py
--- a/meinberlin/apps/budgeting/api.py
+++ b/meinberlin/apps/budgeting/api.py
@@ -56,16 +56,6 @@
'default': '-created',
}
- filters['is_archived'] = {
- 'label': _('Archived'),
- 'choices': [
- ('', _('All')),
- ('false', _('No')),
- ('true', _('Yes')),
- ],
- 'default': 'false',
- }
-
categories = Category.objects.filter(
module=self.module
)
@@ -87,6 +77,16 @@
if has_icons(self.module):
filters['category']['icons'] = category_icons
+ filters['is_archived'] = {
+ 'label': _('Archived'),
+ 'choices': [
+ ('', _('All')),
+ ('false', _('No')),
+ ('true', _('Yes')),
+ ],
+ 'default': 'false',
+ }
+
response = super().list(request, args, kwargs)
response.data['filters'] = filters
return response
| {"golden_diff": "diff --git a/meinberlin/apps/budgeting/api.py b/meinberlin/apps/budgeting/api.py\n--- a/meinberlin/apps/budgeting/api.py\n+++ b/meinberlin/apps/budgeting/api.py\n@@ -56,16 +56,6 @@\n 'default': '-created',\n }\n \n- filters['is_archived'] = {\n- 'label': _('Archived'),\n- 'choices': [\n- ('', _('All')),\n- ('false', _('No')),\n- ('true', _('Yes')),\n- ],\n- 'default': 'false',\n- }\n-\n categories = Category.objects.filter(\n module=self.module\n )\n@@ -87,6 +77,16 @@\n if has_icons(self.module):\n filters['category']['icons'] = category_icons\n \n+ filters['is_archived'] = {\n+ 'label': _('Archived'),\n+ 'choices': [\n+ ('', _('All')),\n+ ('false', _('No')),\n+ ('true', _('Yes')),\n+ ],\n+ 'default': 'false',\n+ }\n+\n response = super().list(request, args, kwargs)\n response.data['filters'] = filters\n return response\n", "issue": "testing 5024: order of filter \n**URL:** https://meinberlin-dev.liqd.net/projekte/burgerhaushalt-spandau/?mode=list\r\n**user:** any\r\n**expected behaviour:** \r\n**behaviour:** \r\n**important screensize:**\r\n**device & browser:** \r\n**Comment/Question:** the order of the filter was 1) category 2) archive. It should stay this way but as we will touch the filter again later we can also do it then but need to remember. So if easy maybe change now?\r\n\r\nScreenshot?\r\n<img width=\"490\" alt=\"Bildschirmfoto 2021-12-21 um 16 25 15\" src=\"https://user-images.githubusercontent.com/35491681/146955180-11799600-c739-4d17-8f84-7581b57a861b.png\">\r\n\r\n\n", "before_files": [{"content": "from django.utils.translation import get_language\nfrom django.utils.translation import gettext_lazy as _\nfrom django_filters.rest_framework import DjangoFilterBackend\nfrom rest_framework import mixins\nfrom rest_framework import viewsets\nfrom rest_framework.filters import OrderingFilter\nfrom rest_framework.pagination import PageNumberPagination\n\nfrom adhocracy4.api.mixins import ModuleMixin\nfrom adhocracy4.api.permissions import ViewSetRulesPermission\nfrom adhocracy4.categories import get_category_icon_url\nfrom adhocracy4.categories import has_icons\nfrom adhocracy4.categories.models import Category\nfrom meinberlin.apps.contrib.filters import IdeaCategoryFilterBackend\nfrom meinberlin.apps.votes.api import VotingTokenInfoMixin\n\nfrom .models import Proposal\nfrom .serializers import ProposalSerializer\n\n\n# To be changed to a more general IdeaPagination, when using\n# pagination via rest api for more idea lists\nclass ProposalPagination(PageNumberPagination):\n page_size = 15\n\n def get_paginated_response(self, data):\n response = super(ProposalPagination, self).get_paginated_response(data)\n response.data['page_size'] = self.page_size\n response.data['page_count'] = self.page.paginator.num_pages\n return response\n\n\nclass LocaleInfoMixin:\n def list(self, request, *args, **kwargs):\n response = super().list(request, args, kwargs)\n response.data['locale'] = get_language()\n return response\n\n\nclass ProposalFilterInfoMixin(ModuleMixin):\n def list(self, request, *args, **kwargs):\n \"\"\"Add the filter information to the data of the Proposal API.\n\n Needs to be used with rest_framework.mixins.ListModelMixin\n \"\"\"\n filters = {}\n\n ordering_choices = [('-created', _('Most recent')), ]\n if self.module.has_feature('rate', Proposal):\n ordering_choices += ('-positive_rating_count', _('Most popular')),\n ordering_choices += ('-comment_count', _('Most commented')),\n\n filters['ordering'] = {\n 'label': _('Ordering'),\n 'choices': 
ordering_choices,\n 'default': '-created',\n }\n\n filters['is_archived'] = {\n 'label': _('Archived'),\n 'choices': [\n ('', _('All')),\n ('false', _('No')),\n ('true', _('Yes')),\n ],\n 'default': 'false',\n }\n\n categories = Category.objects.filter(\n module=self.module\n )\n if categories:\n category_choices = [('', _('All')), ]\n if has_icons(self.module):\n category_icons = []\n for category in categories:\n category_choices += (str(category.pk), category.name),\n if has_icons(self.module):\n icon_name = getattr(category, 'icon', None)\n icon_url = get_category_icon_url(icon_name)\n category_icons += (str(category.pk), icon_url),\n\n filters['category'] = {\n 'label': _('Category'),\n 'choices': category_choices,\n }\n if has_icons(self.module):\n filters['category']['icons'] = category_icons\n\n response = super().list(request, args, kwargs)\n response.data['filters'] = filters\n return response\n\n\nclass ProposalViewSet(ProposalFilterInfoMixin,\n LocaleInfoMixin,\n VotingTokenInfoMixin,\n mixins.ListModelMixin,\n viewsets.GenericViewSet,\n ):\n\n pagination_class = ProposalPagination\n serializer_class = ProposalSerializer\n permission_classes = (ViewSetRulesPermission,)\n filter_backends = (DjangoFilterBackend,\n OrderingFilter,\n IdeaCategoryFilterBackend,)\n filter_fields = ('is_archived', 'category',)\n ordering_fields = ('created',\n 'comment_count',\n 'positive_rating_count',)\n\n def get_permission_object(self):\n return self.module\n\n def get_queryset(self):\n proposals = Proposal.objects\\\n .filter(module=self.module) \\\n .annotate_comment_count() \\\n .annotate_positive_rating_count() \\\n .annotate_negative_rating_count() \\\n .order_by('-created')\n return proposals\n", "path": "meinberlin/apps/budgeting/api.py"}], "after_files": [{"content": "from django.utils.translation import get_language\nfrom django.utils.translation import gettext_lazy as _\nfrom django_filters.rest_framework import DjangoFilterBackend\nfrom rest_framework import mixins\nfrom rest_framework import viewsets\nfrom rest_framework.filters import OrderingFilter\nfrom rest_framework.pagination import PageNumberPagination\n\nfrom adhocracy4.api.mixins import ModuleMixin\nfrom adhocracy4.api.permissions import ViewSetRulesPermission\nfrom adhocracy4.categories import get_category_icon_url\nfrom adhocracy4.categories import has_icons\nfrom adhocracy4.categories.models import Category\nfrom meinberlin.apps.contrib.filters import IdeaCategoryFilterBackend\nfrom meinberlin.apps.votes.api import VotingTokenInfoMixin\n\nfrom .models import Proposal\nfrom .serializers import ProposalSerializer\n\n\n# To be changed to a more general IdeaPagination, when using\n# pagination via rest api for more idea lists\nclass ProposalPagination(PageNumberPagination):\n page_size = 15\n\n def get_paginated_response(self, data):\n response = super(ProposalPagination, self).get_paginated_response(data)\n response.data['page_size'] = self.page_size\n response.data['page_count'] = self.page.paginator.num_pages\n return response\n\n\nclass LocaleInfoMixin:\n def list(self, request, *args, **kwargs):\n response = super().list(request, args, kwargs)\n response.data['locale'] = get_language()\n return response\n\n\nclass ProposalFilterInfoMixin(ModuleMixin):\n def list(self, request, *args, **kwargs):\n \"\"\"Add the filter information to the data of the Proposal API.\n\n Needs to be used with rest_framework.mixins.ListModelMixin\n \"\"\"\n filters = {}\n\n ordering_choices = [('-created', _('Most recent')), ]\n if 
self.module.has_feature('rate', Proposal):\n ordering_choices += ('-positive_rating_count', _('Most popular')),\n ordering_choices += ('-comment_count', _('Most commented')),\n\n filters['ordering'] = {\n 'label': _('Ordering'),\n 'choices': ordering_choices,\n 'default': '-created',\n }\n\n categories = Category.objects.filter(\n module=self.module\n )\n if categories:\n category_choices = [('', _('All')), ]\n if has_icons(self.module):\n category_icons = []\n for category in categories:\n category_choices += (str(category.pk), category.name),\n if has_icons(self.module):\n icon_name = getattr(category, 'icon', None)\n icon_url = get_category_icon_url(icon_name)\n category_icons += (str(category.pk), icon_url),\n\n filters['category'] = {\n 'label': _('Category'),\n 'choices': category_choices,\n }\n if has_icons(self.module):\n filters['category']['icons'] = category_icons\n\n filters['is_archived'] = {\n 'label': _('Archived'),\n 'choices': [\n ('', _('All')),\n ('false', _('No')),\n ('true', _('Yes')),\n ],\n 'default': 'false',\n }\n\n response = super().list(request, args, kwargs)\n response.data['filters'] = filters\n return response\n\n\nclass ProposalViewSet(ProposalFilterInfoMixin,\n LocaleInfoMixin,\n VotingTokenInfoMixin,\n mixins.ListModelMixin,\n viewsets.GenericViewSet,\n ):\n\n pagination_class = ProposalPagination\n serializer_class = ProposalSerializer\n permission_classes = (ViewSetRulesPermission,)\n filter_backends = (DjangoFilterBackend,\n OrderingFilter,\n IdeaCategoryFilterBackend,)\n filter_fields = ('is_archived', 'category',)\n ordering_fields = ('created',\n 'comment_count',\n 'positive_rating_count',)\n\n def get_permission_object(self):\n return self.module\n\n def get_queryset(self):\n proposals = Proposal.objects\\\n .filter(module=self.module) \\\n .annotate_comment_count() \\\n .annotate_positive_rating_count() \\\n .annotate_negative_rating_count() \\\n .order_by('-created')\n return proposals\n", "path": "meinberlin/apps/budgeting/api.py"}]} | 1,597 | 271 |
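A note on why the meinberlin patch above works: `filters` is a plain `dict`, and dict insertion order is guaranteed since Python 3.7, so (assuming the frontend renders the filter controls in response order) moving the `is_archived` block below the `category` block is enough to put the category filter first. A minimal standalone sketch, not meinberlin code:
```python
# Minimal sketch: the rendered filter order is just dict insertion order.
filters = {}
filters["ordering"] = {"label": "Ordering"}
filters["category"] = {"label": "Category"}      # inserted first -> shown first
filters["is_archived"] = {"label": "Archived"}   # inserted second -> shown second

assert list(filters) == ["ordering", "category", "is_archived"]
```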
gh_patches_debug_6641 | rasdani/github-patches | git_diff | ivy-llc__ivy-14502 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
generalized_normal
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/jax/random.py`
Content:
```
1 # local
2 import ivy
3 from ivy.func_wrapper import with_unsupported_dtypes
4 from ivy.functional.frontends.jax.func_wrapper import (
5 to_ivy_arrays_and_back,
6 handle_jax_dtype,
7 )
8
9
10 @to_ivy_arrays_and_back
11 def PRNGKey(seed):
12 return ivy.array([0, seed % 4294967295 - (seed // 4294967295)], dtype=ivy.int64)
13
14
15 @handle_jax_dtype
16 @to_ivy_arrays_and_back
17 def uniform(key, shape=(), dtype=None, minval=0.0, maxval=1.0):
18 return ivy.random_uniform(
19 low=minval, high=maxval, shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1])
20 )
21
22
23 @handle_jax_dtype
24 @to_ivy_arrays_and_back
25 def normal(key, shape=(), dtype=None):
26 return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))
27
28
29 def _get_seed(key):
30 key1, key2 = int(key[0]), int(key[1])
31 return ivy.to_scalar(int("".join(map(str, [key1, key2]))))
32
33
34 @handle_jax_dtype
35 @to_ivy_arrays_and_back
36 @with_unsupported_dtypes(
37 {
38 "0.3.14 and below": (
39 "float16",
40 "bfloat16",
41 )
42 },
43 "jax",
44 )
45 def beta(key, a, b, shape=None, dtype=None):
46 seed = _get_seed(key)
47 return ivy.beta(a, b, shape=shape, dtype=dtype, seed=seed)
48
49
50 @handle_jax_dtype
51 @to_ivy_arrays_and_back
52 @with_unsupported_dtypes(
53 {
54 "0.3.14 and below": (
55 "float16",
56 "bfloat16",
57 )
58 },
59 "jax",
60 )
61 def dirichlet(key, alpha, shape=None, dtype="float32"):
62 seed = _get_seed(key)
63 alpha = ivy.astype(alpha, dtype)
64 return ivy.dirichlet(alpha, size=shape, dtype=dtype, seed=seed)
65
66
67 @handle_jax_dtype
68 @to_ivy_arrays_and_back
69 @with_unsupported_dtypes(
70 {"0.3.14 and below": ("unsigned", "int8", "int16")},
71 "jax",
72 )
73 def poisson(key, lam, shape=None, dtype=None):
74 seed = _get_seed(key)
75 return ivy.poisson(lam, shape=shape, dtype=dtype, seed=seed)
76
77
78 @handle_jax_dtype
79 @to_ivy_arrays_and_back
80 @with_unsupported_dtypes(
81 {
82 "0.3.14 and below": (
83 "float16",
84 "bfloat16",
85 )
86 },
87 "jax",
88 )
89 def gamma(key, a, shape=None, dtype="float64"):
90 seed = _get_seed(key)
91 return ivy.gamma(a, 1.0, shape=shape, dtype=dtype, seed=seed)
92
93
94 @handle_jax_dtype
95 @to_ivy_arrays_and_back
96 @with_unsupported_dtypes(
97 {
98 "0.3.14 and below": (
99 "float16",
100 "bfloat16",
101 )
102 },
103 "jax",
104 )
105 def gumbel(key, shape=(), dtype="float64"):
106 seed = _get_seed(key)
107 uniform_x = ivy.random_uniform(
108 low=0.0,
109 high=1.0,
110 shape=shape,
111 dtype=dtype,
112 seed=seed,
113 )
114 return -ivy.log(-ivy.log(uniform_x))
115
116
117 @handle_jax_dtype
118 @to_ivy_arrays_and_back
119 @with_unsupported_dtypes(
120 {
121 "0.3.14 and below": (
122 "float16",
123 "bfloat16",
124 )
125 },
126 "jax",
127 )
128 def t(key, df, shape=(), dtype="float64"):
129 seed = _get_seed(key)
130 n = ivy.random_normal(shape=shape, dtype=dtype, seed=seed)
131 half_df = df / 2.0
132 g = ivy.gamma(half_df, 1.0, shape=shape, dtype=dtype, seed=seed)
133 return n * ivy.sqrt(ivy.divide(half_df, g))
134
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/jax/random.py b/ivy/functional/frontends/jax/random.py
--- a/ivy/functional/frontends/jax/random.py
+++ b/ivy/functional/frontends/jax/random.py
@@ -125,6 +125,14 @@
},
"jax",
)
+def generalized_normal(key, p, shape=(), dtype="float64"):
+ seed = _get_seed(key)
+ g = ivy.gamma(1 / p, 1.0, shape=shape, dtype=dtype, seed=seed)
+ b = ivy.bernoulli(ivy.array([0.5]), shape=shape, dtype=dtype, seed=seed)
+ r = 2 * b - 1
+ return r * g ** (1 / p)
+
+
def t(key, df, shape=(), dtype="float64"):
seed = _get_seed(key)
n = ivy.random_normal(shape=shape, dtype=dtype, seed=seed)
| {"golden_diff": "diff --git a/ivy/functional/frontends/jax/random.py b/ivy/functional/frontends/jax/random.py\n--- a/ivy/functional/frontends/jax/random.py\n+++ b/ivy/functional/frontends/jax/random.py\n@@ -125,6 +125,14 @@\n },\n \"jax\",\n )\n+def generalized_normal(key, p, shape=(), dtype=\"float64\"):\n+ seed = _get_seed(key)\n+ g = ivy.gamma(1 / p, 1.0, shape=shape, dtype=dtype, seed=seed)\n+ b = ivy.bernoulli(ivy.array([0.5]), shape=shape, dtype=dtype, seed=seed)\n+ r = 2 * b - 1\n+ return r * g ** (1 / p)\n+\n+\n def t(key, df, shape=(), dtype=\"float64\"):\n seed = _get_seed(key)\n n = ivy.random_normal(shape=shape, dtype=dtype, seed=seed)\n", "issue": "generalized_normal\n\n", "before_files": [{"content": "# local\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n)\n\n\n@to_ivy_arrays_and_back\ndef PRNGKey(seed):\n return ivy.array([0, seed % 4294967295 - (seed // 4294967295)], dtype=ivy.int64)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef uniform(key, shape=(), dtype=None, minval=0.0, maxval=1.0):\n return ivy.random_uniform(\n low=minval, high=maxval, shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1])\n )\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef normal(key, shape=(), dtype=None):\n return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))\n\n\ndef _get_seed(key):\n key1, key2 = int(key[0]), int(key[1])\n return ivy.to_scalar(int(\"\".join(map(str, [key1, key2]))))\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef beta(key, a, b, shape=None, dtype=None):\n seed = _get_seed(key)\n return ivy.beta(a, b, shape=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef dirichlet(key, alpha, shape=None, dtype=\"float32\"):\n seed = _get_seed(key)\n alpha = ivy.astype(alpha, dtype)\n return ivy.dirichlet(alpha, size=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\"0.3.14 and below\": (\"unsigned\", \"int8\", \"int16\")},\n \"jax\",\n)\ndef poisson(key, lam, shape=None, dtype=None):\n seed = _get_seed(key)\n return ivy.poisson(lam, shape=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef gamma(key, a, shape=None, dtype=\"float64\"):\n seed = _get_seed(key)\n return ivy.gamma(a, 1.0, shape=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef gumbel(key, shape=(), dtype=\"float64\"):\n seed = _get_seed(key)\n uniform_x = ivy.random_uniform(\n low=0.0,\n high=1.0,\n shape=shape,\n dtype=dtype,\n seed=seed,\n )\n return -ivy.log(-ivy.log(uniform_x))\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef t(key, df, shape=(), dtype=\"float64\"):\n seed = _get_seed(key)\n n = ivy.random_normal(shape=shape, dtype=dtype, seed=seed)\n half_df = df / 2.0\n g = ivy.gamma(half_df, 1.0, shape=shape, dtype=dtype, seed=seed)\n return n * 
ivy.sqrt(ivy.divide(half_df, g))\n", "path": "ivy/functional/frontends/jax/random.py"}], "after_files": [{"content": "# local\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n)\n\n\n@to_ivy_arrays_and_back\ndef PRNGKey(seed):\n return ivy.array([0, seed % 4294967295 - (seed // 4294967295)], dtype=ivy.int64)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef uniform(key, shape=(), dtype=None, minval=0.0, maxval=1.0):\n return ivy.random_uniform(\n low=minval, high=maxval, shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1])\n )\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef normal(key, shape=(), dtype=None):\n return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))\n\n\ndef _get_seed(key):\n key1, key2 = int(key[0]), int(key[1])\n return ivy.to_scalar(int(\"\".join(map(str, [key1, key2]))))\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef beta(key, a, b, shape=None, dtype=None):\n seed = _get_seed(key)\n return ivy.beta(a, b, shape=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef dirichlet(key, alpha, shape=None, dtype=\"float32\"):\n seed = _get_seed(key)\n alpha = ivy.astype(alpha, dtype)\n return ivy.dirichlet(alpha, size=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\"0.3.14 and below\": (\"unsigned\", \"int8\", \"int16\")},\n \"jax\",\n)\ndef poisson(key, lam, shape=None, dtype=None):\n seed = _get_seed(key)\n return ivy.poisson(lam, shape=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef gamma(key, a, shape=None, dtype=\"float64\"):\n seed = _get_seed(key)\n return ivy.gamma(a, 1.0, shape=shape, dtype=dtype, seed=seed)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef gumbel(key, shape=(), dtype=\"float64\"):\n seed = _get_seed(key)\n uniform_x = ivy.random_uniform(\n low=0.0,\n high=1.0,\n shape=shape,\n dtype=dtype,\n seed=seed,\n )\n return -ivy.log(-ivy.log(uniform_x))\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef generalized_normal(key, p, shape=(), dtype=\"float64\"):\n seed = _get_seed(key)\n g = ivy.gamma(1 / p, 1.0, shape=shape, dtype=dtype, seed=seed)\n b = ivy.bernoulli(ivy.array([0.5]), shape=shape, dtype=dtype, seed=seed)\n r = 2 * b - 1\n return r * g ** (1 / p)\n\n\ndef t(key, df, shape=(), dtype=\"float64\"):\n seed = _get_seed(key)\n n = ivy.random_normal(shape=shape, dtype=dtype, seed=seed)\n half_df = df / 2.0\n g = ivy.gamma(half_df, 1.0, shape=shape, dtype=dtype, seed=seed)\n return n * ivy.sqrt(ivy.divide(half_df, g))\n", "path": "ivy/functional/frontends/jax/random.py"}]} | 1,543 | 227 |
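For context on the ivy patch above: it samples the generalized normal distribution, whose density is proportional to exp(-|x|^p), by drawing G ~ Gamma(1/p, 1) and an independent random sign R, then returning R * G**(1/p); if |X|**p ~ Gamma(1/p, 1), a change of variables gives exactly that density. A NumPy sketch of the same construction (standalone, not ivy code):
```python
# NumPy sketch of the gamma construction used in the patch.
import numpy as np

def generalized_normal(p, shape, rng=None):
    rng = rng or np.random.default_rng(0)
    g = rng.gamma(1.0 / p, 1.0, size=shape)      # G ~ Gamma(shape=1/p, scale=1)
    r = 2 * rng.integers(0, 2, size=shape) - 1   # R uniform on {-1, +1}
    return r * g ** (1.0 / p)

x = generalized_normal(p=2.0, shape=(100_000,))
print(x.mean(), x.var())  # ~0 and ~0.5: for p=2 the density is exp(-x**2), i.e. N(0, 1/2)
```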
gh_patches_debug_8804 | rasdani/github-patches | git_diff | kivy__kivy-5579 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CTRL-C in a textinput crashes the app
<!--
The issue tracker is a tool to address bugs.
Please use the #kivy IRC channel on freenode or Stack Overflow for
support questions, more information at https://git.io/vM1yQ.
Before opening a new issue, make sure you do the following:
* check that your issue isn't already filed: https://git.io/vM1iE
* prepare a short, runnable example that reproduces the issue
* reproduce the problem with the latest development version of Kivy
* double-check that the issue is indeed a bug and not a support request
-->
### Versions
* Python: 3.6.3
* OS: Windows 10
* Kivy: 1.10.0
* Kivy installation method: using pip, followed the steps shown in the guide
### Description
Trying to use CTRL-C to copy text from a TextInput crashes the program.
### Code and Logs
```python
import kivy
from kivy.uix.textinput import TextInput
from kivy.app import App
class program(App):
def build(self):
return TextInput()
prog = program()
prog.run()
Traceback (most recent call last):
File "E:/pycharm/MethasiaChatSurelyUnstable/vladegay.py", line 12, in <module>
prog.run()
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\app.py", line 828, in run
runTouchApp()
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\base.py", line 504, in runTouchApp
EventLoop.window.mainloop()
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\core\window\window_sdl2.py", line 663, in mainloop
self._mainloop()
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\core\window\window_sdl2.py", line 602, in _mainloop
self.modifiers):
File "kivy\_event.pyx", line 714, in kivy._event.EventDispatcher.dispatch (kivy\_event.c:8146)
File "kivy\_event.pyx", line 1225, in kivy._event.EventObservers.dispatch (kivy\_event.c:14035)
File "kivy\_event.pyx", line 1149, in kivy._event.EventObservers._dispatch (kivy\_event.c:13564)
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\core\window\__init__.py", line 159, in _on_window_key_down
return self.dispatch('on_key_down', keycode, text, modifiers)
File "kivy\_event.pyx", line 714, in kivy._event.EventDispatcher.dispatch (kivy\_event.c:8146)
File "kivy\_event.pyx", line 1225, in kivy._event.EventObservers.dispatch (kivy\_event.c:14035)
File "kivy\_event.pyx", line 1149, in kivy._event.EventObservers._dispatch (kivy\_event.c:13564)
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\uix\textinput.py", line 2404, in keyboard_on_key_down
self.copy()
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\uix\textinput.py", line 1712, in copy
return Clipboard.copy(self.selection_text)
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\core\clipboard\__init__.py", line 73, in copy
self._copy(data)
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\core\clipboard\__init__.py", line 87, in _copy
self.put(data, self._clip_mime_type)
File "C:\Users\hellNo\AppData\Local\Programs\Python\Python36-32\lib\site-packages\kivy\core\clipboard\clipboard_winctypes.py", line 55, in put
msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text))
ValueError: embedded null character
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kivy/core/clipboard/clipboard_winctypes.py`
Content:
```
1 '''
2 Clipboard windows: an implementation of the Clipboard using ctypes.
3 '''
4
5 __all__ = ('ClipboardWindows', )
6
7 from kivy.utils import platform
8 from kivy.core.clipboard import ClipboardBase
9
10 if platform != 'win':
11 raise SystemError('unsupported platform for Windows clipboard')
12
13 import ctypes
14 from ctypes import wintypes
15 user32 = ctypes.windll.user32
16 kernel32 = ctypes.windll.kernel32
17 msvcrt = ctypes.cdll.msvcrt
18 c_char_p = ctypes.c_char_p
19 c_wchar_p = ctypes.c_wchar_p
20
21
22 class ClipboardWindows(ClipboardBase):
23
24 def get(self, mimetype='text/plain'):
25 GetClipboardData = user32.GetClipboardData
26 GetClipboardData.argtypes = [wintypes.UINT]
27 GetClipboardData.restype = wintypes.HANDLE
28
29 user32.OpenClipboard(user32.GetActiveWindow())
30 # Standard Clipboard Format "1" is "CF_TEXT"
31 pcontents = GetClipboardData(13)
32
33 # if someone pastes a FILE, the content is None for SCF 13
34 # and the clipboard is locked if not closed properly
35 if not pcontents:
36 user32.CloseClipboard()
37 return ''
38 data = c_wchar_p(pcontents).value.encode(self._encoding)
39 user32.CloseClipboard()
40 return data
41
42 def put(self, text, mimetype='text/plain'):
43 text = text.decode(self._encoding) # auto converted later
44 text += u'\x00'
45
46 SetClipboardData = user32.SetClipboardData
47 SetClipboardData.argtypes = [wintypes.UINT, wintypes.HANDLE]
48 SetClipboardData.restype = wintypes.HANDLE
49
50 GlobalAlloc = kernel32.GlobalAlloc
51 GlobalAlloc.argtypes = [wintypes.UINT, ctypes.c_size_t]
52 GlobalAlloc.restype = wintypes.HGLOBAL
53
54 CF_UNICODETEXT = 13
55
56 user32.OpenClipboard(user32.GetActiveWindow())
57 user32.EmptyClipboard()
58 hCd = GlobalAlloc(0, len(text) * ctypes.sizeof(ctypes.c_wchar))
59 msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text))
60 SetClipboardData(CF_UNICODETEXT, hCd)
61 user32.CloseClipboard()
62
63 def get_types(self):
64 return ['text/plain']
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kivy/core/clipboard/clipboard_winctypes.py b/kivy/core/clipboard/clipboard_winctypes.py
--- a/kivy/core/clipboard/clipboard_winctypes.py
+++ b/kivy/core/clipboard/clipboard_winctypes.py
@@ -56,7 +56,9 @@
user32.OpenClipboard(user32.GetActiveWindow())
user32.EmptyClipboard()
hCd = GlobalAlloc(0, len(text) * ctypes.sizeof(ctypes.c_wchar))
- msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text))
+
+ # ignore null character for strSource pointer
+ msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text[:-1]))
SetClipboardData(CF_UNICODETEXT, hCd)
user32.CloseClipboard()
| {"golden_diff": "diff --git a/kivy/core/clipboard/clipboard_winctypes.py b/kivy/core/clipboard/clipboard_winctypes.py\n--- a/kivy/core/clipboard/clipboard_winctypes.py\n+++ b/kivy/core/clipboard/clipboard_winctypes.py\n@@ -56,7 +56,9 @@\n user32.OpenClipboard(user32.GetActiveWindow())\n user32.EmptyClipboard()\n hCd = GlobalAlloc(0, len(text) * ctypes.sizeof(ctypes.c_wchar))\n- msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text))\n+\n+ # ignore null character for strSource pointer\n+ msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text[:-1]))\n SetClipboardData(CF_UNICODETEXT, hCd)\n user32.CloseClipboard()\n", "issue": "CTRL-C in a textinput crashes the app\n<!--\r\nThe issue tracker is a tool to address bugs.\r\nPlease use the #kivy IRC channel on freenode or Stack Overflow for\r\nsupport questions, more information at https://git.io/vM1yQ.\r\n\r\nBefore opening a new issue, make sure you do the following:\r\n * check that your issue isn't already filed: https://git.io/vM1iE\r\n * prepare a short, runnable example that reproduces the issue\r\n * reproduce the problem with the latest development version of Kivy\r\n * double-check that the issue is indeed a bug and not a support request\r\n-->\r\n\r\n### Versions\r\n\r\n* Python: 3.6.3\r\n* OS: Windows 10\r\n* Kivy: 1.10.0\r\n* Kivy installation method: using pip, followed the steps shown in the guide\r\n\r\n### Description\r\n\r\nTrying to use CTRL-C to copy text from a textinput, and the program crashes.\r\n\r\n### Code and Logs\r\n\r\n```python\r\nimport kivy\r\nfrom kivy.uix.textinput import TextInput\r\nfrom kivy.app import App\r\n\r\nclass program(App):\r\n def build(self):\r\n return TextInput()\r\n\r\nprog = program()\r\nprog.run()\r\n\r\n\r\nTraceback (most recent call last):\r\n File \"E:/pycharm/MethasiaChatSurelyUnstable/vladegay.py\", line 12, in <module>\r\n prog.run()\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\app.py\", line 828, in run\r\n runTouchApp()\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\base.py\", line 504, in runTouchApp\r\n EventLoop.window.mainloop()\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\core\\window\\window_sdl2.py\", line 663, in mainloop\r\n self._mainloop()\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\core\\window\\window_sdl2.py\", line 602, in _mainloop\r\n self.modifiers):\r\n File \"kivy\\_event.pyx\", line 714, in kivy._event.EventDispatcher.dispatch (kivy\\_event.c:8146)\r\n File \"kivy\\_event.pyx\", line 1225, in kivy._event.EventObservers.dispatch (kivy\\_event.c:14035)\r\n File \"kivy\\_event.pyx\", line 1149, in kivy._event.EventObservers._dispatch (kivy\\_event.c:13564)\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\core\\window\\__init__.py\", line 159, in _on_window_key_down\r\n return self.dispatch('on_key_down', keycode, text, modifiers)\r\n File \"kivy\\_event.pyx\", line 714, in kivy._event.EventDispatcher.dispatch (kivy\\_event.c:8146)\r\n File \"kivy\\_event.pyx\", line 1225, in kivy._event.EventObservers.dispatch (kivy\\_event.c:14035)\r\n File \"kivy\\_event.pyx\", line 1149, in kivy._event.EventObservers._dispatch (kivy\\_event.c:13564)\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\uix\\textinput.py\", line 2404, in 
keyboard_on_key_down\r\n self.copy()\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\uix\\textinput.py\", line 1712, in copy\r\n return Clipboard.copy(self.selection_text)\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\core\\clipboard\\__init__.py\", line 73, in copy\r\n self._copy(data)\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\core\\clipboard\\__init__.py\", line 87, in _copy\r\n self.put(data, self._clip_mime_type)\r\n File \"C:\\Users\\hellNo\\AppData\\Local\\Programs\\Python\\Python36-32\\lib\\site-packages\\kivy\\core\\clipboard\\clipboard_winctypes.py\", line 55, in put\r\n msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text))\r\n ValueError: embedded null character\r\n\r\n```\r\n\n", "before_files": [{"content": "'''\nClipboard windows: an implementation of the Clipboard using ctypes.\n'''\n\n__all__ = ('ClipboardWindows', )\n\nfrom kivy.utils import platform\nfrom kivy.core.clipboard import ClipboardBase\n\nif platform != 'win':\n raise SystemError('unsupported platform for Windows clipboard')\n\nimport ctypes\nfrom ctypes import wintypes\nuser32 = ctypes.windll.user32\nkernel32 = ctypes.windll.kernel32\nmsvcrt = ctypes.cdll.msvcrt\nc_char_p = ctypes.c_char_p\nc_wchar_p = ctypes.c_wchar_p\n\n\nclass ClipboardWindows(ClipboardBase):\n\n def get(self, mimetype='text/plain'):\n GetClipboardData = user32.GetClipboardData\n GetClipboardData.argtypes = [wintypes.UINT]\n GetClipboardData.restype = wintypes.HANDLE\n\n user32.OpenClipboard(user32.GetActiveWindow())\n # Standard Clipboard Format \"1\" is \"CF_TEXT\"\n pcontents = GetClipboardData(13)\n\n # if someone pastes a FILE, the content is None for SCF 13\n # and the clipboard is locked if not closed properly\n if not pcontents:\n user32.CloseClipboard()\n return ''\n data = c_wchar_p(pcontents).value.encode(self._encoding)\n user32.CloseClipboard()\n return data\n\n def put(self, text, mimetype='text/plain'):\n text = text.decode(self._encoding) # auto converted later\n text += u'\\x00'\n\n SetClipboardData = user32.SetClipboardData\n SetClipboardData.argtypes = [wintypes.UINT, wintypes.HANDLE]\n SetClipboardData.restype = wintypes.HANDLE\n\n GlobalAlloc = kernel32.GlobalAlloc\n GlobalAlloc.argtypes = [wintypes.UINT, ctypes.c_size_t]\n GlobalAlloc.restype = wintypes.HGLOBAL\n\n CF_UNICODETEXT = 13\n\n user32.OpenClipboard(user32.GetActiveWindow())\n user32.EmptyClipboard()\n hCd = GlobalAlloc(0, len(text) * ctypes.sizeof(ctypes.c_wchar))\n msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text))\n SetClipboardData(CF_UNICODETEXT, hCd)\n user32.CloseClipboard()\n\n def get_types(self):\n return ['text/plain']\n", "path": "kivy/core/clipboard/clipboard_winctypes.py"}], "after_files": [{"content": "'''\nClipboard windows: an implementation of the Clipboard using ctypes.\n'''\n\n__all__ = ('ClipboardWindows', )\n\nfrom kivy.utils import platform\nfrom kivy.core.clipboard import ClipboardBase\n\nif platform != 'win':\n raise SystemError('unsupported platform for Windows clipboard')\n\nimport ctypes\nfrom ctypes import wintypes\nuser32 = ctypes.windll.user32\nkernel32 = ctypes.windll.kernel32\nmsvcrt = ctypes.cdll.msvcrt\nc_char_p = ctypes.c_char_p\nc_wchar_p = ctypes.c_wchar_p\n\n\nclass ClipboardWindows(ClipboardBase):\n\n def get(self, mimetype='text/plain'):\n GetClipboardData = user32.GetClipboardData\n GetClipboardData.argtypes = [wintypes.UINT]\n 
GetClipboardData.restype = wintypes.HANDLE\n\n user32.OpenClipboard(user32.GetActiveWindow())\n # Standard Clipboard Format \"1\" is \"CF_TEXT\"\n pcontents = GetClipboardData(13)\n\n # if someone pastes a FILE, the content is None for SCF 13\n # and the clipboard is locked if not closed properly\n if not pcontents:\n user32.CloseClipboard()\n return ''\n data = c_wchar_p(pcontents).value.encode(self._encoding)\n user32.CloseClipboard()\n return data\n\n def put(self, text, mimetype='text/plain'):\n text = text.decode(self._encoding) # auto converted later\n text += u'\\x00'\n\n SetClipboardData = user32.SetClipboardData\n SetClipboardData.argtypes = [wintypes.UINT, wintypes.HANDLE]\n SetClipboardData.restype = wintypes.HANDLE\n\n GlobalAlloc = kernel32.GlobalAlloc\n GlobalAlloc.argtypes = [wintypes.UINT, ctypes.c_size_t]\n GlobalAlloc.restype = wintypes.HGLOBAL\n\n CF_UNICODETEXT = 13\n\n user32.OpenClipboard(user32.GetActiveWindow())\n user32.EmptyClipboard()\n hCd = GlobalAlloc(0, len(text) * ctypes.sizeof(ctypes.c_wchar))\n\n # ignore null character for strSource pointer\n msvcrt.wcscpy_s(c_wchar_p(hCd), len(text), c_wchar_p(text[:-1]))\n SetClipboardData(CF_UNICODETEXT, hCd)\n user32.CloseClipboard()\n\n def get_types(self):\n return ['text/plain']\n", "path": "kivy/core/clipboard/clipboard_winctypes.py"}]} | 2,041 | 195 |
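A short note on the kivy crash above: `ctypes.c_wchar_p` rejects Python strings containing an embedded NUL, and the pre-patch code appended `u'\x00'` to `text` before passing it as the `wcscpy_s` source pointer. The patch keeps the NUL in the allocation size but slices it off (`text[:-1]`) for the source argument. The failure mode reproduces with ctypes alone; the sketch below is platform-independent and not kivy code.
```python
# Hedged sketch: c_wchar_p raises the exact ValueError from the traceback
# on an embedded NUL; without it the string is accepted, since ctypes
# NUL-terminates the buffer itself.
import ctypes

try:
    ctypes.c_wchar_p(u"copied text\x00")
except ValueError as exc:
    print(exc)  # embedded null character

p = ctypes.c_wchar_p(u"copied text")  # fine
print(p.value)
```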
gh_patches_debug_9245 | rasdani/github-patches | git_diff | translate__pootle-6371 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fulah locale exposes _XXX_LAST_SUBMISSION_
When in the Fulah (ff) locale, all pages expose the `_XXX_LAST_SUBMISSION_` text used to mangle timesince messages.

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pootle/i18n/dates.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright (C) Pootle contributors.
4 #
5 # This file is a part of the Pootle project. It is distributed under the GPL3
6 # or later license. See the LICENSE file for a copy of the license and the
7 # AUTHORS file for copyright and authorship information.
8
9 from datetime import datetime
10
11
12 from .formatter import get_locale_formats
13
14
15 def timesince(timestamp, locale=None):
16 timedelta = datetime.now() - datetime.fromtimestamp(timestamp)
17 return get_locale_formats(locale).timedelta(timedelta, format='long')
18
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pootle/i18n/dates.py b/pootle/i18n/dates.py
--- a/pootle/i18n/dates.py
+++ b/pootle/i18n/dates.py
@@ -8,10 +8,15 @@
from datetime import datetime
+from django.conf import settings
from .formatter import get_locale_formats
def timesince(timestamp, locale=None):
timedelta = datetime.now() - datetime.fromtimestamp(timestamp)
- return get_locale_formats(locale).timedelta(timedelta, format='long')
+ formatted = get_locale_formats(locale).timedelta(timedelta, format='long')
+ if formatted:
+ return formatted
+ return get_locale_formats(
+ settings.LANGUAGE_CODE).timedelta(timedelta, format='long')
| {"golden_diff": "diff --git a/pootle/i18n/dates.py b/pootle/i18n/dates.py\n--- a/pootle/i18n/dates.py\n+++ b/pootle/i18n/dates.py\n@@ -8,10 +8,15 @@\n \n from datetime import datetime\n \n+from django.conf import settings\n \n from .formatter import get_locale_formats\n \n \n def timesince(timestamp, locale=None):\n timedelta = datetime.now() - datetime.fromtimestamp(timestamp)\n- return get_locale_formats(locale).timedelta(timedelta, format='long')\n+ formatted = get_locale_formats(locale).timedelta(timedelta, format='long')\n+ if formatted:\n+ return formatted\n+ return get_locale_formats(\n+ settings.LANGUAGE_CODE).timedelta(timedelta, format='long')\n", "issue": "Fulah locale exposes _XXX_LAST_SUBMISSION_\nWhen in the Fulah (ff) locale, all pages expose the `_XXX_LAST_SUBMISSION_` text used to mangle timesince messages.\r\n\r\n\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom datetime import datetime\n\n\nfrom .formatter import get_locale_formats\n\n\ndef timesince(timestamp, locale=None):\n timedelta = datetime.now() - datetime.fromtimestamp(timestamp)\n return get_locale_formats(locale).timedelta(timedelta, format='long')\n", "path": "pootle/i18n/dates.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom datetime import datetime\n\nfrom django.conf import settings\n\nfrom .formatter import get_locale_formats\n\n\ndef timesince(timestamp, locale=None):\n timedelta = datetime.now() - datetime.fromtimestamp(timestamp)\n formatted = get_locale_formats(locale).timedelta(timedelta, format='long')\n if formatted:\n return formatted\n return get_locale_formats(\n settings.LANGUAGE_CODE).timedelta(timedelta, format='long')\n", "path": "pootle/i18n/dates.py"}]} | 540 | 177 |
gh_patches_debug_19940 | rasdani/github-patches | git_diff | mirumee__ariadne-153 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Raise ValueError when `field` or `source` decorator was called incorrectly
Currently there's no error when the developer forgets to follow the `field` or `source` decorator with `("name")`, tricking them into thinking that the decorated function has been registered when in fact it wasn't.
We could update the implementation of those functions to raise a `ValueError` when the `name` attribute is not a `str`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ariadne/objects.py`
Content:
```
1 from typing import Callable, Dict, Optional, cast
2
3 from graphql.type import GraphQLNamedType, GraphQLObjectType, GraphQLSchema
4
5 from .resolvers import resolve_to
6 from .types import Resolver, SchemaBindable
7
8
9 class ObjectType(SchemaBindable):
10 _resolvers: Dict[str, Resolver]
11
12 def __init__(self, name: str) -> None:
13 self.name = name
14 self._resolvers = {}
15
16 def field(self, name: str) -> Callable[[Resolver], Resolver]:
17 return self.create_register_resolver(name)
18
19 def create_register_resolver(self, name: str) -> Callable[[Resolver], Resolver]:
20 def register_resolver(f: Resolver) -> Resolver:
21 self._resolvers[name] = f
22 return f
23
24 return register_resolver
25
26 def set_field(self, name, resolver: Resolver) -> Resolver:
27 self._resolvers[name] = resolver
28 return resolver
29
30 def set_alias(self, name: str, to: str) -> None:
31 self._resolvers[name] = resolve_to(to)
32
33 def bind_to_schema(self, schema: GraphQLSchema) -> None:
34 graphql_type = schema.type_map.get(self.name)
35 self.validate_graphql_type(graphql_type)
36 graphql_type = cast(GraphQLObjectType, graphql_type)
37 self.bind_resolvers_to_graphql_type(graphql_type)
38
39 def validate_graphql_type(self, graphql_type: Optional[GraphQLNamedType]) -> None:
40 if not graphql_type:
41 raise ValueError("Type %s is not defined in the schema" % self.name)
42 if not isinstance(graphql_type, GraphQLObjectType):
43 raise ValueError(
44 "%s is defined in the schema, but it is instance of %s (expected %s)"
45 % (self.name, type(graphql_type).__name__, GraphQLObjectType.__name__)
46 )
47
48 def bind_resolvers_to_graphql_type(self, graphql_type, replace_existing=True):
49 for field, resolver in self._resolvers.items():
50 if field not in graphql_type.fields:
51 raise ValueError(
52 "Field %s is not defined on type %s" % (field, self.name)
53 )
54 if graphql_type.fields[field].resolve is None or replace_existing:
55 graphql_type.fields[field].resolve = resolver
56
57
58 class QueryType(ObjectType):
59 """Convenience class for defining Query type"""
60
61 def __init__(self):
62 super().__init__("Query")
63
64
65 class MutationType(ObjectType):
66 """Convenience class for defining Mutation type"""
67
68 def __init__(self):
69 super().__init__("Mutation")
70
```
Path: `ariadne/subscriptions.py`
Content:
```
1 from typing import Callable, Dict
2
3 from graphql.type import GraphQLSchema
4
5 from .objects import ObjectType
6 from .types import Subscriber
7
8
9 class SubscriptionType(ObjectType):
10 _subscribers: Dict[str, Subscriber]
11
12 def __init__(self) -> None:
13 super().__init__("Subscription")
14 self._subscribers = {}
15
16 def source(self, name: str) -> Callable[[Subscriber], Subscriber]:
17 return self.create_register_subscriber(name)
18
19 def create_register_subscriber(
20 self, name: str
21 ) -> Callable[[Subscriber], Subscriber]:
22 def register_subscriber(generator: Subscriber) -> Subscriber:
23 self._subscribers[name] = generator
24 return generator
25
26 return register_subscriber
27
28 def set_source(self, name, generator: Subscriber) -> Subscriber:
29 self._subscribers[name] = generator
30 return generator
31
32 def bind_to_schema(self, schema: GraphQLSchema) -> None:
33 graphql_type = schema.type_map.get(self.name)
34 self.validate_graphql_type(graphql_type)
35 self.bind_resolvers_to_graphql_type(graphql_type)
36 self.bind_subscribers_to_graphql_type(graphql_type)
37
38 def bind_subscribers_to_graphql_type(self, graphql_type):
39 for field, subscriber in self._subscribers.items():
40 if field not in graphql_type.fields:
41 raise ValueError(
42 "Field %s is not defined on type %s" % (field, self.name)
43 )
44
45 graphql_type.fields[field].subscribe = subscriber
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ariadne/objects.py b/ariadne/objects.py
--- a/ariadne/objects.py
+++ b/ariadne/objects.py
@@ -14,6 +14,10 @@
self._resolvers = {}
def field(self, name: str) -> Callable[[Resolver], Resolver]:
+ if not isinstance(name, str):
+ raise ValueError(
+ 'field decorator should be passed a field name: @foo.field("name")'
+ )
return self.create_register_resolver(name)
def create_register_resolver(self, name: str) -> Callable[[Resolver], Resolver]:
diff --git a/ariadne/subscriptions.py b/ariadne/subscriptions.py
--- a/ariadne/subscriptions.py
+++ b/ariadne/subscriptions.py
@@ -14,6 +14,10 @@
self._subscribers = {}
def source(self, name: str) -> Callable[[Subscriber], Subscriber]:
+ if not isinstance(name, str):
+ raise ValueError(
+ 'source decorator should be passed a field name: @foo.source("name")'
+ )
return self.create_register_subscriber(name)
def create_register_subscriber(
| {"golden_diff": "diff --git a/ariadne/objects.py b/ariadne/objects.py\n--- a/ariadne/objects.py\n+++ b/ariadne/objects.py\n@@ -14,6 +14,10 @@\n self._resolvers = {}\n \n def field(self, name: str) -> Callable[[Resolver], Resolver]:\n+ if not isinstance(name, str):\n+ raise ValueError(\n+ 'field decorator should be passed a field name: @foo.field(\"name\")'\n+ )\n return self.create_register_resolver(name)\n \n def create_register_resolver(self, name: str) -> Callable[[Resolver], Resolver]:\ndiff --git a/ariadne/subscriptions.py b/ariadne/subscriptions.py\n--- a/ariadne/subscriptions.py\n+++ b/ariadne/subscriptions.py\n@@ -14,6 +14,10 @@\n self._subscribers = {}\n \n def source(self, name: str) -> Callable[[Subscriber], Subscriber]:\n+ if not isinstance(name, str):\n+ raise ValueError(\n+ 'source decorator should be passed a field name: @foo.source(\"name\")'\n+ )\n return self.create_register_subscriber(name)\n \n def create_register_subscriber(\n", "issue": "Raise ValueError when `field` or `source` decorator was called incorrectly\nCurrently there's no error when the developer forgets to follow the `field` or `source` decorator with `(\"name\")`, tricking them into thinking that decorated function has been registered while in fact it wasn't.\r\n\r\nWe could update implementation for those functions to raise ValueError when `name` attr is not `str`.\n", "before_files": [{"content": "from typing import Callable, Dict, Optional, cast\n\nfrom graphql.type import GraphQLNamedType, GraphQLObjectType, GraphQLSchema\n\nfrom .resolvers import resolve_to\nfrom .types import Resolver, SchemaBindable\n\n\nclass ObjectType(SchemaBindable):\n _resolvers: Dict[str, Resolver]\n\n def __init__(self, name: str) -> None:\n self.name = name\n self._resolvers = {}\n\n def field(self, name: str) -> Callable[[Resolver], Resolver]:\n return self.create_register_resolver(name)\n\n def create_register_resolver(self, name: str) -> Callable[[Resolver], Resolver]:\n def register_resolver(f: Resolver) -> Resolver:\n self._resolvers[name] = f\n return f\n\n return register_resolver\n\n def set_field(self, name, resolver: Resolver) -> Resolver:\n self._resolvers[name] = resolver\n return resolver\n\n def set_alias(self, name: str, to: str) -> None:\n self._resolvers[name] = resolve_to(to)\n\n def bind_to_schema(self, schema: GraphQLSchema) -> None:\n graphql_type = schema.type_map.get(self.name)\n self.validate_graphql_type(graphql_type)\n graphql_type = cast(GraphQLObjectType, graphql_type)\n self.bind_resolvers_to_graphql_type(graphql_type)\n\n def validate_graphql_type(self, graphql_type: Optional[GraphQLNamedType]) -> None:\n if not graphql_type:\n raise ValueError(\"Type %s is not defined in the schema\" % self.name)\n if not isinstance(graphql_type, GraphQLObjectType):\n raise ValueError(\n \"%s is defined in the schema, but it is instance of %s (expected %s)\"\n % (self.name, type(graphql_type).__name__, GraphQLObjectType.__name__)\n )\n\n def bind_resolvers_to_graphql_type(self, graphql_type, replace_existing=True):\n for field, resolver in self._resolvers.items():\n if field not in graphql_type.fields:\n raise ValueError(\n \"Field %s is not defined on type %s\" % (field, self.name)\n )\n if graphql_type.fields[field].resolve is None or replace_existing:\n graphql_type.fields[field].resolve = resolver\n\n\nclass QueryType(ObjectType):\n \"\"\"Convenience class for defining Query type\"\"\"\n\n def __init__(self):\n super().__init__(\"Query\")\n\n\nclass MutationType(ObjectType):\n \"\"\"Convenience class for 
defining Mutation type\"\"\"\n\n def __init__(self):\n super().__init__(\"Mutation\")\n", "path": "ariadne/objects.py"}, {"content": "from typing import Callable, Dict\n\nfrom graphql.type import GraphQLSchema\n\nfrom .objects import ObjectType\nfrom .types import Subscriber\n\n\nclass SubscriptionType(ObjectType):\n _subscribers: Dict[str, Subscriber]\n\n def __init__(self) -> None:\n super().__init__(\"Subscription\")\n self._subscribers = {}\n\n def source(self, name: str) -> Callable[[Subscriber], Subscriber]:\n return self.create_register_subscriber(name)\n\n def create_register_subscriber(\n self, name: str\n ) -> Callable[[Subscriber], Subscriber]:\n def register_subscriber(generator: Subscriber) -> Subscriber:\n self._subscribers[name] = generator\n return generator\n\n return register_subscriber\n\n def set_source(self, name, generator: Subscriber) -> Subscriber:\n self._subscribers[name] = generator\n return generator\n\n def bind_to_schema(self, schema: GraphQLSchema) -> None:\n graphql_type = schema.type_map.get(self.name)\n self.validate_graphql_type(graphql_type)\n self.bind_resolvers_to_graphql_type(graphql_type)\n self.bind_subscribers_to_graphql_type(graphql_type)\n\n def bind_subscribers_to_graphql_type(self, graphql_type):\n for field, subscriber in self._subscribers.items():\n if field not in graphql_type.fields:\n raise ValueError(\n \"Field %s is not defined on type %s\" % (field, self.name)\n )\n\n graphql_type.fields[field].subscribe = subscriber\n", "path": "ariadne/subscriptions.py"}], "after_files": [{"content": "from typing import Callable, Dict, Optional, cast\n\nfrom graphql.type import GraphQLNamedType, GraphQLObjectType, GraphQLSchema\n\nfrom .resolvers import resolve_to\nfrom .types import Resolver, SchemaBindable\n\n\nclass ObjectType(SchemaBindable):\n _resolvers: Dict[str, Resolver]\n\n def __init__(self, name: str) -> None:\n self.name = name\n self._resolvers = {}\n\n def field(self, name: str) -> Callable[[Resolver], Resolver]:\n if not isinstance(name, str):\n raise ValueError(\n 'field decorator should be passed a field name: @foo.field(\"name\")'\n )\n return self.create_register_resolver(name)\n\n def create_register_resolver(self, name: str) -> Callable[[Resolver], Resolver]:\n def register_resolver(f: Resolver) -> Resolver:\n self._resolvers[name] = f\n return f\n\n return register_resolver\n\n def set_field(self, name, resolver: Resolver) -> Resolver:\n self._resolvers[name] = resolver\n return resolver\n\n def set_alias(self, name: str, to: str) -> None:\n self._resolvers[name] = resolve_to(to)\n\n def bind_to_schema(self, schema: GraphQLSchema) -> None:\n graphql_type = schema.type_map.get(self.name)\n self.validate_graphql_type(graphql_type)\n graphql_type = cast(GraphQLObjectType, graphql_type)\n self.bind_resolvers_to_graphql_type(graphql_type)\n\n def validate_graphql_type(self, graphql_type: Optional[GraphQLNamedType]) -> None:\n if not graphql_type:\n raise ValueError(\"Type %s is not defined in the schema\" % self.name)\n if not isinstance(graphql_type, GraphQLObjectType):\n raise ValueError(\n \"%s is defined in the schema, but it is instance of %s (expected %s)\"\n % (self.name, type(graphql_type).__name__, GraphQLObjectType.__name__)\n )\n\n def bind_resolvers_to_graphql_type(self, graphql_type, replace_existing=True):\n for field, resolver in self._resolvers.items():\n if field not in graphql_type.fields:\n raise ValueError(\n \"Field %s is not defined on type %s\" % (field, self.name)\n )\n if graphql_type.fields[field].resolve is 
None or replace_existing:\n graphql_type.fields[field].resolve = resolver\n\n\nclass QueryType(ObjectType):\n \"\"\"Convenience class for defining Query type\"\"\"\n\n def __init__(self):\n super().__init__(\"Query\")\n\n\nclass MutationType(ObjectType):\n \"\"\"Convenience class for defining Mutation type\"\"\"\n\n def __init__(self):\n super().__init__(\"Mutation\")\n", "path": "ariadne/objects.py"}, {"content": "from typing import Callable, Dict\n\nfrom graphql.type import GraphQLSchema\n\nfrom .objects import ObjectType\nfrom .types import Subscriber\n\n\nclass SubscriptionType(ObjectType):\n _subscribers: Dict[str, Subscriber]\n\n def __init__(self) -> None:\n super().__init__(\"Subscription\")\n self._subscribers = {}\n\n def source(self, name: str) -> Callable[[Subscriber], Subscriber]:\n if not isinstance(name, str):\n raise ValueError(\n 'source decorator should be passed a field name: @foo.source(\"name\")'\n )\n return self.create_register_subscriber(name)\n\n def create_register_subscriber(\n self, name: str\n ) -> Callable[[Subscriber], Subscriber]:\n def register_subscriber(generator: Subscriber) -> Subscriber:\n self._subscribers[name] = generator\n return generator\n\n return register_subscriber\n\n def set_source(self, name, generator: Subscriber) -> Subscriber:\n self._subscribers[name] = generator\n return generator\n\n def bind_to_schema(self, schema: GraphQLSchema) -> None:\n graphql_type = schema.type_map.get(self.name)\n self.validate_graphql_type(graphql_type)\n self.bind_resolvers_to_graphql_type(graphql_type)\n self.bind_subscribers_to_graphql_type(graphql_type)\n\n def bind_subscribers_to_graphql_type(self, graphql_type):\n for field, subscriber in self._subscribers.items():\n if field not in graphql_type.fields:\n raise ValueError(\n \"Field %s is not defined on type %s\" % (field, self.name)\n )\n\n graphql_type.fields[field].subscribe = subscriber\n", "path": "ariadne/subscriptions.py"}]} | 1,435 | 270 |
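For reference, a minimal self-contained sketch of the guard this entry's patch adds to `SubscriptionType.source` — ariadne itself is not imported and the class below is a pared-down stand-in, so treat every name as illustrative:

```python
from typing import Callable

class SubscriptionType:
    """Pared-down stand-in for ariadne's SubscriptionType (assumption)."""

    def __init__(self) -> None:
        self._subscribers = {}

    def source(self, name: str) -> Callable:
        # The guard the patch introduces: writing @subscription.source without
        # the call passes the decorated function here instead of a field name,
        # and now fails loudly instead of silently registering nothing useful.
        if not isinstance(name, str):
            raise ValueError(
                'source decorator should be passed a field name: @foo.source("name")'
            )

        def register(generator):
            self._subscribers[name] = generator
            return generator

        return register

subscription = SubscriptionType()

@subscription.source("counter")  # correct usage: decorator called with a name
async def counter_source(obj, info):
    yield 1

try:
    @subscription.source  # common mistake: decorator not called
    async def broken_source(obj, info):
        yield 1
except ValueError as exc:
    print(exc)
```

The same mistake against `ObjectType.field` already raised this error; the patch simply extends the check to subscription sources.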
gh_patches_debug_1362 | rasdani/github-patches | git_diff | UTNkar__moore-59 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Login is per-subdomain
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/website/settings/production.py`
Content:
```
1 """
2 Django settings for the production environment of Project Moore.
3
4 For more information regarding running in production see,
5 See https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/
6
7 For more information on this file, see
8 https://docs.djangoproject.com/en/1.10/topics/settings/
9
10 For the full list of settings and their values, see
11 https://docs.djangoproject.com/en/1.10/ref/settings/
12 """
13 from __future__ import absolute_import, unicode_literals
14
15 from .base import *
16
17 # SECURITY WARNING: don't run with debug turned on in production!
18 DEBUG = False
19
20 # SECURITY WARNING: keep the secret key used in production secret!
21 SECRET_KEY = os.environ.get(
22 'DJANGO_SECRET',
23 'za7^0@54n&p-dg4)_l12q_3^o5awz_uym0osqaz2!myki_8kw0'
24 )
25
26 # Database
27 # https://docs.djangoproject.com/en/1.10/ref/settings/#databases
28
29 DATABASES = {
30 'default': {
31 'ENGINE': 'django.db.backends.postgresql',
32 'NAME': os.environ.get('DJANGO_DB'),
33 'USER': os.environ.get('DJANGO_DB_USER'),
34 'PASSWORD': os.environ.get('DJANGO_DB_PASS'),
35 'HOST': os.environ.get('DJANGO_DB_HOST', '127.0.0.1'),
36 'PORT': os.environ.get('DJANGO_DB_PORT', '5432'),
37 }
38 }
39
40 # CONN_MAX_AGE = 0
41
42 # Base URL to use when referring to full URLs within the Wagtail admin
43 # backend - e.g. in notification emails. Don't include '/admin' or a
44 # trailing slash
45 BASE_URL = 'https://dev.utn.se'
46
47 ALLOWED_HOSTS = ['.utn.se']
48
49 # Email settings
50 DEFAULT_FROM_EMAIL = '[email protected]'
51
52 EMAIL_SUBJECT_PREFIX = '[UTN] '
53
54 # Admins - will be sent error messages
55 ADMINS = [('UTN System Administrator', '[email protected]')]
56
57 LOGGING_CONFIG = None
58
59 # TODO: HTTPS security
60 # CSRF_COOKIE_SECURE = True
61 #
62 # SESSION_COOKIE_SECURE = True
63
64 # Membership API
65 MEMBERSHIP_API_USER = 'moore'
66 MEMBERSHIP_API_PASSWORD = os.environ.get('MEMBERSHIP_API_PASSWORD')
67
68 try:
69 from .local import *
70 except ImportError:
71 pass
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/website/website/settings/production.py b/website/website/settings/production.py
--- a/website/website/settings/production.py
+++ b/website/website/settings/production.py
@@ -56,10 +56,11 @@
LOGGING_CONFIG = None
-# TODO: HTTPS security
-# CSRF_COOKIE_SECURE = True
-#
-# SESSION_COOKIE_SECURE = True
+CSRF_COOKIE_SECURE = True
+
+SESSION_COOKIE_DOMAIN = '.utn.se'
+
+SESSION_COOKIE_SECURE = True
# Membership API
MEMBERSHIP_API_USER = 'moore'
| {"golden_diff": "diff --git a/website/website/settings/production.py b/website/website/settings/production.py\n--- a/website/website/settings/production.py\n+++ b/website/website/settings/production.py\n@@ -56,10 +56,11 @@\n \n LOGGING_CONFIG = None\n \n-# TODO: HTTPS security\n-# CSRF_COOKIE_SECURE = True\n-#\n-# SESSION_COOKIE_SECURE = True\n+CSRF_COOKIE_SECURE = True\n+\n+SESSION_COOKIE_DOMAIN = '.utn.se'\n+\n+SESSION_COOKIE_SECURE = True\n \n # Membership API\n MEMBERSHIP_API_USER = 'moore'\n", "issue": "Login is per-subdomain\n\n", "before_files": [{"content": "\"\"\"\nDjango settings for the production environment of Project Moore.\n\nFor more information regarding running in production see,\nSee https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.10/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.10/ref/settings/\n\"\"\"\nfrom __future__ import absolute_import, unicode_literals\n\nfrom .base import *\n\n# SECURITY WARNING: don't run with debug turned on in production!\nDEBUG = False\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = os.environ.get(\n 'DJANGO_SECRET',\n 'za7^0@54n&p-dg4)_l12q_3^o5awz_uym0osqaz2!myki_8kw0'\n)\n\n# Database\n# https://docs.djangoproject.com/en/1.10/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.postgresql',\n 'NAME': os.environ.get('DJANGO_DB'),\n 'USER': os.environ.get('DJANGO_DB_USER'),\n 'PASSWORD': os.environ.get('DJANGO_DB_PASS'),\n 'HOST': os.environ.get('DJANGO_DB_HOST', '127.0.0.1'),\n 'PORT': os.environ.get('DJANGO_DB_PORT', '5432'),\n }\n}\n\n# CONN_MAX_AGE = 0\n\n# Base URL to use when referring to full URLs within the Wagtail admin\n# backend - e.g. in notification emails. 
Don't include '/admin' or a\n# trailing slash\nBASE_URL = 'https://dev.utn.se'\n\nALLOWED_HOSTS = ['.utn.se']\n\n# Email settings\nDEFAULT_FROM_EMAIL = '[email protected]'\n\nEMAIL_SUBJECT_PREFIX = '[UTN] '\n\n# Admins - will be sent error messages\nADMINS = [('UTN System Administrator', '[email protected]')]\n\nLOGGING_CONFIG = None\n\n# TODO: HTTPS security\n# CSRF_COOKIE_SECURE = True\n#\n# SESSION_COOKIE_SECURE = True\n\n# Membership API\nMEMBERSHIP_API_USER = 'moore'\nMEMBERSHIP_API_PASSWORD = os.environ.get('MEMBERSHIP_API_PASSWORD')\n\ntry:\n from .local import *\nexcept ImportError:\n pass\n", "path": "website/website/settings/production.py"}], "after_files": [{"content": "\"\"\"\nDjango settings for the production environment of Project Moore.\n\nFor more information regarding running in production see,\nSee https://docs.djangoproject.com/en/1.10/howto/deployment/checklist/\n\nFor more information on this file, see\nhttps://docs.djangoproject.com/en/1.10/topics/settings/\n\nFor the full list of settings and their values, see\nhttps://docs.djangoproject.com/en/1.10/ref/settings/\n\"\"\"\nfrom __future__ import absolute_import, unicode_literals\n\nfrom .base import *\n\n# SECURITY WARNING: don't run with debug turned on in production!\nDEBUG = False\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = os.environ.get(\n 'DJANGO_SECRET',\n 'za7^0@54n&p-dg4)_l12q_3^o5awz_uym0osqaz2!myki_8kw0'\n)\n\n# Database\n# https://docs.djangoproject.com/en/1.10/ref/settings/#databases\n\nDATABASES = {\n 'default': {\n 'ENGINE': 'django.db.backends.postgresql',\n 'NAME': os.environ.get('DJANGO_DB'),\n 'USER': os.environ.get('DJANGO_DB_USER'),\n 'PASSWORD': os.environ.get('DJANGO_DB_PASS'),\n 'HOST': os.environ.get('DJANGO_DB_HOST', '127.0.0.1'),\n 'PORT': os.environ.get('DJANGO_DB_PORT', '5432'),\n }\n}\n\n# CONN_MAX_AGE = 0\n\n# Base URL to use when referring to full URLs within the Wagtail admin\n# backend - e.g. in notification emails. Don't include '/admin' or a\n# trailing slash\nBASE_URL = 'https://dev.utn.se'\n\nALLOWED_HOSTS = ['.utn.se']\n\n# Email settings\nDEFAULT_FROM_EMAIL = '[email protected]'\n\nEMAIL_SUBJECT_PREFIX = '[UTN] '\n\n# Admins - will be sent error messages\nADMINS = [('UTN System Administrator', '[email protected]')]\n\nLOGGING_CONFIG = None\n\nCSRF_COOKIE_SECURE = True\n\nSESSION_COOKIE_DOMAIN = '.utn.se'\n\nSESSION_COOKIE_SECURE = True\n\n# Membership API\nMEMBERSHIP_API_USER = 'moore'\nMEMBERSHIP_API_PASSWORD = os.environ.get('MEMBERSHIP_API_PASSWORD')\n\ntry:\n from .local import *\nexcept ImportError:\n pass\n", "path": "website/website/settings/production.py"}]} | 928 | 131 |
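A standard-library sketch of why this entry's patch sets `SESSION_COOKIE_DOMAIN`: a `Set-Cookie` with no `Domain` attribute is host-only, so each `*.utn.se` subdomain keeps its own login — exactly the reported symptom — while `Domain=.utn.se` makes one session cookie shared. The domain value comes from the diff; everything else here is illustrative:

```python
from http.cookies import SimpleCookie

# Host-only cookie: no Domain attribute, valid only for the issuing host.
host_only = SimpleCookie()
host_only["sessionid"] = "abc123"

# Shared cookie, as configured by the patched production settings.
shared = SimpleCookie()
shared["sessionid"] = "abc123"
shared["sessionid"]["domain"] = ".utn.se"
shared["sessionid"]["secure"] = True

print(host_only.output())  # Set-Cookie: sessionid=abc123
print(shared.output())     # Set-Cookie: sessionid=abc123; Domain=.utn.se; Secure
```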
gh_patches_debug_3121 | rasdani/github-patches | git_diff | mirumee__ariadne-961 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Support Starlette 0.21.0
Starlette 0.21.0 fixes important issues on the BaseHttpMiddleware side.
https://github.com/encode/starlette/pull/1715
https://github.com/tiangolo/fastapi/issues/4544
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #! /usr/bin/env python
2 import os
3 from setuptools import setup
4
5 CLASSIFIERS = [
6 "Development Status :: 4 - Beta",
7 "Intended Audience :: Developers",
8 "License :: OSI Approved :: BSD License",
9 "Operating System :: OS Independent",
10 "Programming Language :: Python",
11 "Programming Language :: Python :: 3.7",
12 "Programming Language :: Python :: 3.8",
13 "Programming Language :: Python :: 3.9",
14 "Programming Language :: Python :: 3.10",
15 "Topic :: Software Development :: Libraries :: Python Modules",
16 ]
17
18 README_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), "README.md")
19 with open(README_PATH, "r", encoding="utf8") as f:
20 README = f.read()
21
22 setup(
23 name="ariadne",
24 author="Mirumee Software",
25 author_email="[email protected]",
26 description="Ariadne is a Python library for implementing GraphQL servers.",
27 long_description=README,
28 long_description_content_type="text/markdown",
29 license="BSD",
30 version="0.16.1",
31 url="https://github.com/mirumee/ariadne",
32 packages=["ariadne"],
33 include_package_data=True,
34 install_requires=[
35 "graphql-core>=3.2.0,<3.3",
36 "starlette>0.17,<0.21",
37 "typing_extensions>=3.6.0",
38 ],
39 extras_require={"asgi-file-uploads": ["python-multipart>=0.0.5"]},
40 classifiers=CLASSIFIERS,
41 platforms=["any"],
42 zip_safe=False,
43 )
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -33,7 +33,7 @@
include_package_data=True,
install_requires=[
"graphql-core>=3.2.0,<3.3",
- "starlette>0.17,<0.21",
+ "starlette>0.17,<1.0",
"typing_extensions>=3.6.0",
],
extras_require={"asgi-file-uploads": ["python-multipart>=0.0.5"]},
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -33,7 +33,7 @@\n include_package_data=True,\n install_requires=[\n \"graphql-core>=3.2.0,<3.3\",\n- \"starlette>0.17,<0.21\",\n+ \"starlette>0.17,<1.0\",\n \"typing_extensions>=3.6.0\",\n ],\n extras_require={\"asgi-file-uploads\": [\"python-multipart>=0.0.5\"]},\n", "issue": "Support Starlette 0.21.0\nStarlette 0.21.0 fix important issues on the BaseHttpMiddleware side. \r\n\r\nhttps://github.com/encode/starlette/pull/1715\r\nhttps://github.com/tiangolo/fastapi/issues/4544\n", "before_files": [{"content": "#! /usr/bin/env python\nimport os\nfrom setuptools import setup\n\nCLASSIFIERS = [\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n]\n\nREADME_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"README.md\")\nwith open(README_PATH, \"r\", encoding=\"utf8\") as f:\n README = f.read()\n\nsetup(\n name=\"ariadne\",\n author=\"Mirumee Software\",\n author_email=\"[email protected]\",\n description=\"Ariadne is a Python library for implementing GraphQL servers.\",\n long_description=README,\n long_description_content_type=\"text/markdown\",\n license=\"BSD\",\n version=\"0.16.1\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n include_package_data=True,\n install_requires=[\n \"graphql-core>=3.2.0,<3.3\",\n \"starlette>0.17,<0.21\",\n \"typing_extensions>=3.6.0\",\n ],\n extras_require={\"asgi-file-uploads\": [\"python-multipart>=0.0.5\"]},\n classifiers=CLASSIFIERS,\n platforms=[\"any\"],\n zip_safe=False,\n)\n", "path": "setup.py"}], "after_files": [{"content": "#! /usr/bin/env python\nimport os\nfrom setuptools import setup\n\nCLASSIFIERS = [\n \"Development Status :: 4 - Beta\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n]\n\nREADME_PATH = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"README.md\")\nwith open(README_PATH, \"r\", encoding=\"utf8\") as f:\n README = f.read()\n\nsetup(\n name=\"ariadne\",\n author=\"Mirumee Software\",\n author_email=\"[email protected]\",\n description=\"Ariadne is a Python library for implementing GraphQL servers.\",\n long_description=README,\n long_description_content_type=\"text/markdown\",\n license=\"BSD\",\n version=\"0.16.1\",\n url=\"https://github.com/mirumee/ariadne\",\n packages=[\"ariadne\"],\n include_package_data=True,\n install_requires=[\n \"graphql-core>=3.2.0,<3.3\",\n \"starlette>0.17,<1.0\",\n \"typing_extensions>=3.6.0\",\n ],\n extras_require={\"asgi-file-uploads\": [\"python-multipart>=0.0.5\"]},\n classifiers=CLASSIFIERS,\n platforms=[\"any\"],\n zip_safe=False,\n)\n", "path": "setup.py"}]} | 760 | 123 |
gh_patches_debug_5839 | rasdani/github-patches | git_diff | boto__boto-2116 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
In 2.25.0, GoogleStorage generate_url() returns a URL with AWSAccessKeyId instead of GoogleAccessId
When generating a Google Storage signed url:
``` python
conn = boto.connect_gs(gs_access_key, gs_secret_access)
signed_url = conn.generate_url(900, 'GET', bucket=bucket_name, key=file_name)
```
signed_url is of the form:
https://bucket_name.storage.googleapis.com/file_name?Signature=foo&Expires=1392808714&**AWSAccessKeyId**=gs_access_key
But should be of the form:
https://bucket_name.storage.googleapis.com/file_name?Signature=foo&Expires=1392808714&**GoogleAccessId**=gs_access_key
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `boto/gs/connection.py`
Content:
```
1 # Copyright 2010 Google Inc.
2 #
3 # Permission is hereby granted, free of charge, to any person obtaining a
4 # copy of this software and associated documentation files (the
5 # "Software"), to deal in the Software without restriction, including
6 # without limitation the rights to use, copy, modify, merge, publish, dis-
7 # tribute, sublicense, and/or sell copies of the Software, and to permit
8 # persons to whom the Software is furnished to do so, subject to the fol-
9 # lowing conditions:
10 #
11 # The above copyright notice and this permission notice shall be included
12 # in all copies or substantial portions of the Software.
13 #
14 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
15 # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-
16 # ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
17 # SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
18 # WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
19 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
20 # IN THE SOFTWARE.
21
22 from boto.gs.bucket import Bucket
23 from boto.s3.connection import S3Connection
24 from boto.s3.connection import SubdomainCallingFormat
25 from boto.s3.connection import check_lowercase_bucketname
26 from boto.utils import get_utf8_value
27
28 class Location(object):
29 DEFAULT = 'US'
30 EU = 'EU'
31
32 class GSConnection(S3Connection):
33
34 DefaultHost = 'storage.googleapis.com'
35 QueryString = 'Signature=%s&Expires=%d&AWSAccessKeyId=%s'
36
37 def __init__(self, gs_access_key_id=None, gs_secret_access_key=None,
38 is_secure=True, port=None, proxy=None, proxy_port=None,
39 proxy_user=None, proxy_pass=None,
40 host=DefaultHost, debug=0, https_connection_factory=None,
41 calling_format=SubdomainCallingFormat(), path='/',
42 suppress_consec_slashes=True):
43 super(GSConnection, self).__init__(gs_access_key_id, gs_secret_access_key,
44 is_secure, port, proxy, proxy_port, proxy_user, proxy_pass,
45 host, debug, https_connection_factory, calling_format, path,
46 "google", Bucket,
47 suppress_consec_slashes=suppress_consec_slashes)
48
49 def create_bucket(self, bucket_name, headers=None,
50 location=Location.DEFAULT, policy=None,
51 storage_class='STANDARD'):
52 """
53 Creates a new bucket. By default it's located in the USA. You can
54 pass Location.EU to create bucket in the EU. You can also pass
55 a LocationConstraint for where the bucket should be located, and
56 a StorageClass describing how the data should be stored.
57
58 :type bucket_name: string
59 :param bucket_name: The name of the new bucket.
60
61 :type headers: dict
62 :param headers: Additional headers to pass along with the request to GCS.
63
64 :type location: :class:`boto.gs.connection.Location`
65 :param location: The location of the new bucket.
66
67 :type policy: :class:`boto.gs.acl.CannedACLStrings`
68 :param policy: A canned ACL policy that will be applied to the new key
69 in GCS.
70
71 :type storage_class: string
72 :param storage_class: Either 'STANDARD' or 'DURABLE_REDUCED_AVAILABILITY'.
73
74 """
75 check_lowercase_bucketname(bucket_name)
76
77 if policy:
78 if headers:
79 headers[self.provider.acl_header] = policy
80 else:
81 headers = {self.provider.acl_header : policy}
82 if not location:
83 location = Location.DEFAULT
84 location_elem = ('<LocationConstraint>%s</LocationConstraint>'
85 % location)
86 if storage_class:
87 storage_class_elem = ('<StorageClass>%s</StorageClass>'
88 % storage_class)
89 else:
90 storage_class_elem = ''
91 data = ('<CreateBucketConfiguration>%s%s</CreateBucketConfiguration>'
92 % (location_elem, storage_class_elem))
93 response = self.make_request(
94 'PUT', get_utf8_value(bucket_name), headers=headers,
95 data=get_utf8_value(data))
96 body = response.read()
97 if response.status == 409:
98 raise self.provider.storage_create_error(
99 response.status, response.reason, body)
100 if response.status == 200:
101 return self.bucket_class(self, bucket_name)
102 else:
103 raise self.provider.storage_response_error(
104 response.status, response.reason, body)
105
106 def get_bucket(self, bucket_name, validate=True, headers=None):
107 """
108 Retrieves a bucket by name.
109
110 If the bucket does not exist, an ``S3ResponseError`` will be raised. If
111 you are unsure if the bucket exists or not, you can use the
112 ``S3Connection.lookup`` method, which will either return a valid bucket
113 or ``None``.
114
115 :type bucket_name: string
116 :param bucket_name: The name of the bucket
117
118 :type headers: dict
119 :param headers: Additional headers to pass along with the request to
120 AWS.
121
122 :type validate: boolean
123 :param validate: If ``True``, it will try to fetch all keys within the
124 given bucket. (Default: ``True``)
125 """
126 bucket = self.bucket_class(self, bucket_name)
127 if validate:
128 bucket.get_all_keys(headers, maxkeys=0)
129 return bucket
130
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/boto/gs/connection.py b/boto/gs/connection.py
--- a/boto/gs/connection.py
+++ b/boto/gs/connection.py
@@ -32,7 +32,7 @@
class GSConnection(S3Connection):
DefaultHost = 'storage.googleapis.com'
- QueryString = 'Signature=%s&Expires=%d&AWSAccessKeyId=%s'
+ QueryString = 'Signature=%s&Expires=%d&GoogleAccessId=%s'
def __init__(self, gs_access_key_id=None, gs_secret_access_key=None,
is_secure=True, port=None, proxy=None, proxy_port=None,
| {"golden_diff": "diff --git a/boto/gs/connection.py b/boto/gs/connection.py\n--- a/boto/gs/connection.py\n+++ b/boto/gs/connection.py\n@@ -32,7 +32,7 @@\n class GSConnection(S3Connection):\n \n DefaultHost = 'storage.googleapis.com'\n- QueryString = 'Signature=%s&Expires=%d&AWSAccessKeyId=%s'\n+ QueryString = 'Signature=%s&Expires=%d&GoogleAccessId=%s'\n \n def __init__(self, gs_access_key_id=None, gs_secret_access_key=None,\n is_secure=True, port=None, proxy=None, proxy_port=None,\n", "issue": "In 2.25.0 GoogleStorage generate_url() returns url with AWSAccessKeyId instead of GoogleAccessId\nWhen generating a Google Storage signed url:\n\n``` python\nconn = boto.connect_gs(gs_access_key, gs_secret_access)\nsigned_url = conn.generate_url(900, 'GET', bucket=bucket_name, key=file_name)\n```\n\nsigned_url is of the form:\n\nhttps://bucket_name.storage.googleapis.com/file_name?Signature=foo&Expires=1392808714&**AWSAccessKeyId**=gs_access_key\n\nBut should be of the form:\n\nhttps://bucket_name.storage.googleapis.com/file_name?Signature=foo&Expires=1392808714&**GoogleAccessId**=gs_access_key\n\n", "before_files": [{"content": "# Copyright 2010 Google Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, including\n# without limitation the rights to use, copy, modify, merge, publish, dis-\n# tribute, sublicense, and/or sell copies of the Software, and to permit\n# persons to whom the Software is furnished to do so, subject to the fol-\n# lowing conditions:\n#\n# The above copyright notice and this permission notice shall be included\n# in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\n# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-\n# ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT\n# SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\n# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\n# IN THE SOFTWARE.\n\nfrom boto.gs.bucket import Bucket\nfrom boto.s3.connection import S3Connection\nfrom boto.s3.connection import SubdomainCallingFormat\nfrom boto.s3.connection import check_lowercase_bucketname\nfrom boto.utils import get_utf8_value\n\nclass Location(object):\n DEFAULT = 'US'\n EU = 'EU'\n\nclass GSConnection(S3Connection):\n\n DefaultHost = 'storage.googleapis.com'\n QueryString = 'Signature=%s&Expires=%d&AWSAccessKeyId=%s'\n\n def __init__(self, gs_access_key_id=None, gs_secret_access_key=None,\n is_secure=True, port=None, proxy=None, proxy_port=None,\n proxy_user=None, proxy_pass=None,\n host=DefaultHost, debug=0, https_connection_factory=None,\n calling_format=SubdomainCallingFormat(), path='/',\n suppress_consec_slashes=True):\n super(GSConnection, self).__init__(gs_access_key_id, gs_secret_access_key,\n is_secure, port, proxy, proxy_port, proxy_user, proxy_pass,\n host, debug, https_connection_factory, calling_format, path,\n \"google\", Bucket,\n suppress_consec_slashes=suppress_consec_slashes)\n\n def create_bucket(self, bucket_name, headers=None,\n location=Location.DEFAULT, policy=None,\n storage_class='STANDARD'):\n \"\"\"\n Creates a new bucket. By default it's located in the USA. You can\n pass Location.EU to create bucket in the EU. 
You can also pass\n a LocationConstraint for where the bucket should be located, and\n a StorageClass describing how the data should be stored.\n\n :type bucket_name: string\n :param bucket_name: The name of the new bucket.\n\n :type headers: dict\n :param headers: Additional headers to pass along with the request to GCS.\n\n :type location: :class:`boto.gs.connection.Location`\n :param location: The location of the new bucket.\n\n :type policy: :class:`boto.gs.acl.CannedACLStrings`\n :param policy: A canned ACL policy that will be applied to the new key\n in GCS.\n\n :type storage_class: string\n :param storage_class: Either 'STANDARD' or 'DURABLE_REDUCED_AVAILABILITY'.\n\n \"\"\"\n check_lowercase_bucketname(bucket_name)\n\n if policy:\n if headers:\n headers[self.provider.acl_header] = policy\n else:\n headers = {self.provider.acl_header : policy}\n if not location:\n location = Location.DEFAULT\n location_elem = ('<LocationConstraint>%s</LocationConstraint>'\n % location)\n if storage_class:\n storage_class_elem = ('<StorageClass>%s</StorageClass>'\n % storage_class)\n else:\n storage_class_elem = ''\n data = ('<CreateBucketConfiguration>%s%s</CreateBucketConfiguration>'\n % (location_elem, storage_class_elem))\n response = self.make_request(\n 'PUT', get_utf8_value(bucket_name), headers=headers,\n data=get_utf8_value(data))\n body = response.read()\n if response.status == 409:\n raise self.provider.storage_create_error(\n response.status, response.reason, body)\n if response.status == 200:\n return self.bucket_class(self, bucket_name)\n else:\n raise self.provider.storage_response_error(\n response.status, response.reason, body)\n\n def get_bucket(self, bucket_name, validate=True, headers=None):\n \"\"\"\n Retrieves a bucket by name.\n\n If the bucket does not exist, an ``S3ResponseError`` will be raised. If\n you are unsure if the bucket exists or not, you can use the\n ``S3Connection.lookup`` method, which will either return a valid bucket\n or ``None``.\n\n :type bucket_name: string\n :param bucket_name: The name of the bucket\n\n :type headers: dict\n :param headers: Additional headers to pass along with the request to\n AWS.\n\n :type validate: boolean\n :param validate: If ``True``, it will try to fetch all keys within the\n given bucket. (Default: ``True``)\n \"\"\"\n bucket = self.bucket_class(self, bucket_name)\n if validate:\n bucket.get_all_keys(headers, maxkeys=0)\n return bucket\n", "path": "boto/gs/connection.py"}], "after_files": [{"content": "# Copyright 2010 Google Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, including\n# without limitation the rights to use, copy, modify, merge, publish, dis-\n# tribute, sublicense, and/or sell copies of the Software, and to permit\n# persons to whom the Software is furnished to do so, subject to the fol-\n# lowing conditions:\n#\n# The above copyright notice and this permission notice shall be included\n# in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\n# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-\n# ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT\n# SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\n# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\n# IN THE SOFTWARE.\n\nfrom boto.gs.bucket import Bucket\nfrom boto.s3.connection import S3Connection\nfrom boto.s3.connection import SubdomainCallingFormat\nfrom boto.s3.connection import check_lowercase_bucketname\nfrom boto.utils import get_utf8_value\n\nclass Location(object):\n DEFAULT = 'US'\n EU = 'EU'\n\nclass GSConnection(S3Connection):\n\n DefaultHost = 'storage.googleapis.com'\n QueryString = 'Signature=%s&Expires=%d&GoogleAccessId=%s'\n\n def __init__(self, gs_access_key_id=None, gs_secret_access_key=None,\n is_secure=True, port=None, proxy=None, proxy_port=None,\n proxy_user=None, proxy_pass=None,\n host=DefaultHost, debug=0, https_connection_factory=None,\n calling_format=SubdomainCallingFormat(), path='/',\n suppress_consec_slashes=True):\n super(GSConnection, self).__init__(gs_access_key_id, gs_secret_access_key,\n is_secure, port, proxy, proxy_port, proxy_user, proxy_pass,\n host, debug, https_connection_factory, calling_format, path,\n \"google\", Bucket,\n suppress_consec_slashes=suppress_consec_slashes)\n\n def create_bucket(self, bucket_name, headers=None,\n location=Location.DEFAULT, policy=None,\n storage_class='STANDARD'):\n \"\"\"\n Creates a new bucket. By default it's located in the USA. You can\n pass Location.EU to create bucket in the EU. You can also pass\n a LocationConstraint for where the bucket should be located, and\n a StorageClass describing how the data should be stored.\n\n :type bucket_name: string\n :param bucket_name: The name of the new bucket.\n\n :type headers: dict\n :param headers: Additional headers to pass along with the request to GCS.\n\n :type location: :class:`boto.gs.connection.Location`\n :param location: The location of the new bucket.\n\n :type policy: :class:`boto.gs.acl.CannedACLStrings`\n :param policy: A canned ACL policy that will be applied to the new key\n in GCS.\n\n :type storage_class: string\n :param storage_class: Either 'STANDARD' or 'DURABLE_REDUCED_AVAILABILITY'.\n\n \"\"\"\n check_lowercase_bucketname(bucket_name)\n\n if policy:\n if headers:\n headers[self.provider.acl_header] = policy\n else:\n headers = {self.provider.acl_header : policy}\n if not location:\n location = Location.DEFAULT\n location_elem = ('<LocationConstraint>%s</LocationConstraint>'\n % location)\n if storage_class:\n storage_class_elem = ('<StorageClass>%s</StorageClass>'\n % storage_class)\n else:\n storage_class_elem = ''\n data = ('<CreateBucketConfiguration>%s%s</CreateBucketConfiguration>'\n % (location_elem, storage_class_elem))\n response = self.make_request(\n 'PUT', get_utf8_value(bucket_name), headers=headers,\n data=get_utf8_value(data))\n body = response.read()\n if response.status == 409:\n raise self.provider.storage_create_error(\n response.status, response.reason, body)\n if response.status == 200:\n return self.bucket_class(self, bucket_name)\n else:\n raise self.provider.storage_response_error(\n response.status, response.reason, body)\n\n def get_bucket(self, bucket_name, validate=True, headers=None):\n \"\"\"\n Retrieves a bucket by name.\n\n If the bucket does not exist, an ``S3ResponseError`` will be raised. 
If\n you are unsure if the bucket exists or not, you can use the\n ``S3Connection.lookup`` method, which will either return a valid bucket\n or ``None``.\n\n :type bucket_name: string\n :param bucket_name: The name of the bucket\n\n :type headers: dict\n :param headers: Additional headers to pass along with the request to\n AWS.\n\n :type validate: boolean\n :param validate: If ``True``, it will try to fetch all keys within the\n given bucket. (Default: ``True``)\n \"\"\"\n bucket = self.bucket_class(self, bucket_name)\n if validate:\n bucket.get_all_keys(headers, maxkeys=0)\n return bucket\n", "path": "boto/gs/connection.py"}]} | 1,874 | 140 |
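A hedged sketch of verifying this fix end to end, built only from the calls quoted in the issue; the credentials, bucket, and object names below are placeholders, and actually running it needs boto 2 plus real Google Storage access:

```python
from urllib.parse import urlparse, parse_qs

import boto

conn = boto.connect_gs("GS_ACCESS_KEY", "GS_SECRET_KEY")  # placeholder credentials
url = conn.generate_url(900, "GET", bucket="my-bucket", key="file.txt")

params = parse_qs(urlparse(url).query)
# With the patched QueryString template the key-id parameter must be
# GoogleAccessId; before the fix it came out as AWSAccessKeyId.
assert "GoogleAccessId" in params
assert "AWSAccessKeyId" not in params
print(url)
```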
gh_patches_debug_20924 | rasdani/github-patches | git_diff | OCA__server-tools-211 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tree_view_record_id: causes a warning in the logs
In the runbot logs I get:
2015-07-17 13:09:05,793 27838 WARNING 3110977-7-0-458127-all openerp.modules.loading: The model module.tree.view.record.id.installed has no access rules, consider adding one. E.g. access_module_tree_view_record_id_installed,access_module_tree_view_record_id_installed,model_module_tree_view_record_id_installed,,1,1,1,1
I tracked down module.tree.view.record.id.installed to tree_view_record_id
I don't fully understand why the pseudo-dynamic model gets such a weird generated name, but an ACL is missing
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sentry_logger/__init__.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 ###############################################################################
3 #
4 # OpenERP, Open Source Management Solution
5 # This module copyright (C) 2010 - 2014 Savoir-faire Linux
6 # (<http://www.savoirfairelinux.com>).
7 #
8 # This program is free software: you can redistribute it and/or modify
9 # it under the terms of the GNU Affero General Public License as
10 # published by the Free Software Foundation, either version 3 of the
11 # License, or (at your option) any later version.
12 #
13 # This program is distributed in the hope that it will be useful,
14 # but WITHOUT ANY WARRANTY; without even the implied warranty of
15 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
16 # GNU Affero General Public License for more details.
17 #
18 # You should have received a copy of the GNU Affero General Public License
19 # along with this program. If not, see <http://www.gnu.org/licenses/>.
20 #
21 ###############################################################################
22
23 import logging
24 import cgitb
25
26 from openerp.tools import config
27 from openerp.addons.web.controllers.main import Session
28
29 _DEFAULT_LOGGING_LEVEL = logging.ERROR
30
31 try:
32 from .odoo_sentry_client import OdooClient
33 from .odoo_sentry_handler import OdooSentryHandler
34
35 root_logger = logging.root
36
37 processors = (
38 'raven.processors.SanitizePasswordsProcessor',
39 'raven_sanitize_openerp.OpenerpPasswordsProcessor'
40 )
41 if config.get(u'sentry_dsn'):
42 cgitb.enable()
43 # Get DSN info from config file or ~/.openerp_serverrc (recommended)
44 dsn = config.get('sentry_dsn')
45 try:
46 level = getattr(logging, config.get('sentry_logging_level'))
47 except (AttributeError, TypeError):
48 level = _DEFAULT_LOGGING_LEVEL
49 # Create Client
50 client = OdooClient(
51 dsn=dsn,
52 processors=processors,
53 )
54 handler = OdooSentryHandler(client, level=level)
55 root_logger.addHandler(handler)
56 else:
57 root_logger.warn(u"Sentry DSN not defined in config file")
58 client = None
59
60 # Inject sentry_activated to session to display error message or not
61 old_session_info = Session.session_info
62
63 def session_info(self, req):
64 res = old_session_info(self, req)
65 res['sentry_activated'] = bool(client)
66 return res
67
68 Session.session_info = session_info
69 except ImportError:
70 pass
71
```
Path: `tree_view_record_id/__openerp__.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 ###############################################################################
3 #
4 # Copyright (C) 2012-TODAY Akretion <http://www.akretion.com>.
5 # All Rights Reserved
6 # @author David BEAL <[email protected]>
7 # This program is free software: you can redistribute it and/or modify
8 # it under the terms of the GNU Affero General Public License as
9 # published by the Free Software Foundation, either version 3 of the
10 # License, or (at your option) any later version.
11 #
12 # This program is distributed in the hope that it will be useful,
13 # but WITHOUT ANY WARRANTY; without even the implied warranty of
14 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
15 # GNU Affero General Public License for more details.
16 #
17 # You should have received a copy of the GNU Affero General Public License
18 # along with this program. If not, see <http://www.gnu.org/licenses/>.
19 #
20 ###############################################################################
21
22 {
23 'name': 'Tree View Record Id',
24 'version': '0.1',
25 'category': 'Other modules',
26 'sequence': 10,
27 'author': "Akretion,Odoo Community Association (OCA)",
28 'summary': "Adds id field to tree views",
29 'description': """
30 Adds Id field in all tree views of any modules/models, except:
31
32 * Arborescent tree views like 'Products by Category', 'Chart of accounts', etc.
33 * Tree views (like in wizard 'Change password') built on transient models
34 which don't have this column in their table.
35
36 Id field is the primary key of standard sql tables
37 defined by the orm (Odoo model).
38 """,
39 'website': 'http://www.akretion.com',
40 'depends': [
41 'base',
42 ],
43 'data': [
44 ],
45 'demo': [
46 ],
47 'installable': True,
48 'auto_install': False,
49 'application': False,
50 'images': [
51 ],
52 'css': [
53 ],
54 'js': [
55 ],
56 'qweb': [
57 ],
58 }
59
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sentry_logger/__init__.py b/sentry_logger/__init__.py
--- a/sentry_logger/__init__.py
+++ b/sentry_logger/__init__.py
@@ -20,6 +20,7 @@
#
###############################################################################
+import os
import logging
import cgitb
@@ -54,7 +55,12 @@
handler = OdooSentryHandler(client, level=level)
root_logger.addHandler(handler)
else:
- root_logger.warn(u"Sentry DSN not defined in config file")
+ msg = u"Sentry DSN not defined in config file"
+ if os.environ.get('OCA_CI'):
+ # don't fail the build on runbot for this
+ root_logger.info(msg)
+ else:
+ root_logger.warn(msg)
client = None
# Inject sentry_activated to session to display error message or not
diff --git a/tree_view_record_id/__openerp__.py b/tree_view_record_id/__openerp__.py
--- a/tree_view_record_id/__openerp__.py
+++ b/tree_view_record_id/__openerp__.py
@@ -41,6 +41,7 @@
'base',
],
'data': [
+ 'security/ir.model.access.csv',
],
'demo': [
],
| {"golden_diff": "diff --git a/sentry_logger/__init__.py b/sentry_logger/__init__.py\n--- a/sentry_logger/__init__.py\n+++ b/sentry_logger/__init__.py\n@@ -20,6 +20,7 @@\n #\n ###############################################################################\n \n+import os\n import logging\n import cgitb\n \n@@ -54,7 +55,12 @@\n handler = OdooSentryHandler(client, level=level)\n root_logger.addHandler(handler)\n else:\n- root_logger.warn(u\"Sentry DSN not defined in config file\")\n+ msg = u\"Sentry DSN not defined in config file\"\n+ if os.environ.get('OCA_CI'):\n+ # don't fail the build on runbot for this\n+ root_logger.info(msg)\n+ else:\n+ root_logger.warn(msg)\n client = None\n \n # Inject sentry_activated to session to display error message or not\ndiff --git a/tree_view_record_id/__openerp__.py b/tree_view_record_id/__openerp__.py\n--- a/tree_view_record_id/__openerp__.py\n+++ b/tree_view_record_id/__openerp__.py\n@@ -41,6 +41,7 @@\n 'base',\n ],\n 'data': [\n+ 'security/ir.model.access.csv',\n ],\n 'demo': [\n ],\n", "issue": "tree_view_record_id: causes a warning in the logs\nin the runbot logs I get:\n\n2015-07-17 13:09:05,793 27838 WARNING 3110977-7-0-458127-all openerp.modules.loading: The model module.tree.view.record.id.installed has no access rules, consider adding one. E.g. access_module_tree_view_record_id_installed,access_module_tree_view_record_id_installed,model_module_tree_view_record_id_installed,,1,1,1,1\n\nI tracked down module.tree.view.record.id.installed to tree_view_record_id\n\nI totally don't understand why the pseudo dynamic a weird name generated that way, but an ACL is missing\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n###############################################################################\n#\n# OpenERP, Open Source Management Solution\n# This module copyright (C) 2010 - 2014 Savoir-faire Linux\n# (<http://www.savoirfairelinux.com>).\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. 
If not, see <http://www.gnu.org/licenses/>.\n#\n###############################################################################\n\nimport logging\nimport cgitb\n\nfrom openerp.tools import config\nfrom openerp.addons.web.controllers.main import Session\n\n_DEFAULT_LOGGING_LEVEL = logging.ERROR\n\ntry:\n from .odoo_sentry_client import OdooClient\n from .odoo_sentry_handler import OdooSentryHandler\n\n root_logger = logging.root\n\n processors = (\n 'raven.processors.SanitizePasswordsProcessor',\n 'raven_sanitize_openerp.OpenerpPasswordsProcessor'\n )\n if config.get(u'sentry_dsn'):\n cgitb.enable()\n # Get DSN info from config file or ~/.openerp_serverrc (recommended)\n dsn = config.get('sentry_dsn')\n try:\n level = getattr(logging, config.get('sentry_logging_level'))\n except (AttributeError, TypeError):\n level = _DEFAULT_LOGGING_LEVEL\n # Create Client\n client = OdooClient(\n dsn=dsn,\n processors=processors,\n )\n handler = OdooSentryHandler(client, level=level)\n root_logger.addHandler(handler)\n else:\n root_logger.warn(u\"Sentry DSN not defined in config file\")\n client = None\n\n # Inject sentry_activated to session to display error message or not\n old_session_info = Session.session_info\n\n def session_info(self, req):\n res = old_session_info(self, req)\n res['sentry_activated'] = bool(client)\n return res\n\n Session.session_info = session_info\nexcept ImportError:\n pass\n", "path": "sentry_logger/__init__.py"}, {"content": "# -*- coding: utf-8 -*-\n###############################################################################\n#\n# Copyright (C) 2012-TODAY Akretion <http://www.akretion.com>.\n# All Rights Reserved\n# @author David BEAL <[email protected]>\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. 
If not, see <http://www.gnu.org/licenses/>.\n#\n###############################################################################\n\n{\n 'name': 'Tree View Record Id',\n 'version': '0.1',\n 'category': 'Other modules',\n 'sequence': 10,\n 'author': \"Akretion,Odoo Community Association (OCA)\",\n 'summary': \"Adds id field to tree views\",\n 'description': \"\"\"\nAdds Id field in all tree views of any modules/models, except:\n\n* Arborescent tree views like 'Products by Category', 'Chart of accounts', etc.\n* Tree views (like in wizard 'Change password') built on transient models\n which don't have this column in their table.\n\nId field is the primary key of standard sql tables\ndefined by the orm (Odoo model).\n \"\"\",\n 'website': 'http://www.akretion.com',\n 'depends': [\n 'base',\n ],\n 'data': [\n ],\n 'demo': [\n ],\n 'installable': True,\n 'auto_install': False,\n 'application': False,\n 'images': [\n ],\n 'css': [\n ],\n 'js': [\n ],\n 'qweb': [\n ],\n}\n", "path": "tree_view_record_id/__openerp__.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n###############################################################################\n#\n# OpenERP, Open Source Management Solution\n# This module copyright (C) 2010 - 2014 Savoir-faire Linux\n# (<http://www.savoirfairelinux.com>).\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. 
If not, see <http://www.gnu.org/licenses/>.\n#\n###############################################################################\n\nimport os\nimport logging\nimport cgitb\n\nfrom openerp.tools import config\nfrom openerp.addons.web.controllers.main import Session\n\n_DEFAULT_LOGGING_LEVEL = logging.ERROR\n\ntry:\n from .odoo_sentry_client import OdooClient\n from .odoo_sentry_handler import OdooSentryHandler\n\n root_logger = logging.root\n\n processors = (\n 'raven.processors.SanitizePasswordsProcessor',\n 'raven_sanitize_openerp.OpenerpPasswordsProcessor'\n )\n if config.get(u'sentry_dsn'):\n cgitb.enable()\n # Get DSN info from config file or ~/.openerp_serverrc (recommended)\n dsn = config.get('sentry_dsn')\n try:\n level = getattr(logging, config.get('sentry_logging_level'))\n except (AttributeError, TypeError):\n level = _DEFAULT_LOGGING_LEVEL\n # Create Client\n client = OdooClient(\n dsn=dsn,\n processors=processors,\n )\n handler = OdooSentryHandler(client, level=level)\n root_logger.addHandler(handler)\n else:\n msg = u\"Sentry DSN not defined in config file\"\n if os.environ.get('OCA_CI'):\n # don't fail the build on runbot for this\n root_logger.info(msg)\n else:\n root_logger.warn(msg)\n client = None\n\n # Inject sentry_activated to session to display error message or not\n old_session_info = Session.session_info\n\n def session_info(self, req):\n res = old_session_info(self, req)\n res['sentry_activated'] = bool(client)\n return res\n\n Session.session_info = session_info\nexcept ImportError:\n pass\n", "path": "sentry_logger/__init__.py"}, {"content": "# -*- coding: utf-8 -*-\n###############################################################################\n#\n# Copyright (C) 2012-TODAY Akretion <http://www.akretion.com>.\n# All Rights Reserved\n# @author David BEAL <[email protected]>\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. If not, see <http://www.gnu.org/licenses/>.\n#\n###############################################################################\n\n{\n 'name': 'Tree View Record Id',\n 'version': '0.1',\n 'category': 'Other modules',\n 'sequence': 10,\n 'author': \"Akretion,Odoo Community Association (OCA)\",\n 'summary': \"Adds id field to tree views\",\n 'description': \"\"\"\nAdds Id field in all tree views of any modules/models, except:\n\n* Arborescent tree views like 'Products by Category', 'Chart of accounts', etc.\n* Tree views (like in wizard 'Change password') built on transient models\n which don't have this column in their table.\n\nId field is the primary key of standard sql tables\ndefined by the orm (Odoo model).\n \"\"\",\n 'website': 'http://www.akretion.com',\n 'depends': [\n 'base',\n ],\n 'data': [\n 'security/ir.model.access.csv',\n ],\n 'demo': [\n ],\n 'installable': True,\n 'auto_install': False,\n 'application': False,\n 'images': [\n ],\n 'css': [\n ],\n 'js': [\n ],\n 'qweb': [\n ],\n}\n", "path": "tree_view_record_id/__openerp__.py"}]} | 1,703 | 293 |
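The golden diff registers `security/ir.model.access.csv` without showing its contents; the sketch below writes the one rule the log warning itself suggests. The column header follows the conventional Odoo ACL layout and is an assumption here:

```python
import csv
import io

header = ["id", "name", "model_id:id", "group_id:id",
          "perm_read", "perm_write", "perm_create", "perm_unlink"]
# Rule text copied verbatim from the warning quoted in the issue.
rule = ["access_module_tree_view_record_id_installed",
        "access_module_tree_view_record_id_installed",
        "model_module_tree_view_record_id_installed",
        "", "1", "1", "1", "1"]

buf = io.StringIO()
csv.writer(buf, lineterminator="\n").writerows([header, rule])
print(buf.getvalue(), end="")  # paste into tree_view_record_id/security/ir.model.access.csv
```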
gh_patches_debug_5906 | rasdani/github-patches | git_diff | mesonbuild__meson-2743 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
run_target Permission Denied error should be clearer
Minimal example -
[folder.zip](https://github.com/mesonbuild/meson/files/1530489/folder.zip)
I ran - `mkdir build && cd build && meson .. && ninja` and everything works. Now I run - `ninja myscript` and it throws errors -
```
[0/1] Running external command myscript.
Traceback (most recent call last):
File "/usr/bin/meson", line 37, in <module>
sys.exit(main())
File "/usr/bin/meson", line 34, in main
return mesonmain.run(sys.argv[1:], launcher)
File "/usr/lib/python3.6/site-packages/mesonbuild/mesonmain.py", line 311, in run
sys.exit(run_script_command(args[1:]))
File "/usr/lib/python3.6/site-packages/mesonbuild/mesonmain.py", line 278, in run_script_command
return cmdfunc(cmdargs)
File "/usr/lib/python3.6/site-packages/mesonbuild/scripts/commandrunner.py", line 60, in run
pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)
File "/usr/lib/python3.6/site-packages/mesonbuild/scripts/commandrunner.py", line 39, in run_command
return subprocess.Popen(command_array, env=child_env, cwd=cwd)
File "/usr/lib/python3.6/subprocess.py", line 709, in __init__
restore_signals, start_new_session)
File "/usr/lib/python3.6/subprocess.py", line 1344, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
PermissionError: [Errno 13] Permission denied: '/home/agauniyal/temp/scripts/script.sh'
FAILED: meson-myscript
/usr/bin/python /usr/bin/meson --internal commandrunner /home/agauniyal/temp/ /home/agauniyal/temp/build '' /usr/bin/python /usr/bin/meson /home/agauniyal/temp/scripts/script.sh
ninja: build stopped: subcommand failed.
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mesonbuild/scripts/commandrunner.py`
Content:
```
1 # Copyright 2014 The Meson development team
2
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6
7 # http://www.apache.org/licenses/LICENSE-2.0
8
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """This program is a wrapper to run external commands. It determines
16 what to run, sets up the environment and executes the command."""
17
18 import sys, os, subprocess, shutil, shlex
19
20 def run_command(source_dir, build_dir, subdir, meson_command, command, arguments):
21 env = {'MESON_SOURCE_ROOT': source_dir,
22 'MESON_BUILD_ROOT': build_dir,
23 'MESON_SUBDIR': subdir,
24 'MESONINTROSPECT': ' '.join([shlex.quote(x) for x in meson_command + ['introspect']]),
25 }
26 cwd = os.path.join(source_dir, subdir)
27 child_env = os.environ.copy()
28 child_env.update(env)
29
30 # Is the command an executable in path?
31 exe = shutil.which(command)
32 if exe is not None:
33 command_array = [exe] + arguments
34 return subprocess.Popen(command_array, env=child_env, cwd=cwd)
35 # No? Maybe it is a script in the source tree.
36 fullpath = os.path.join(source_dir, subdir, command)
37 command_array = [fullpath] + arguments
38 try:
39 return subprocess.Popen(command_array, env=child_env, cwd=cwd)
40 except FileNotFoundError:
41 print('Could not execute command "%s".' % command)
42 sys.exit(1)
43
44 def run(args):
45 if len(args) < 4:
46 print('commandrunner.py <source dir> <build dir> <subdir> <command> [arguments]')
47 return 1
48 src_dir = args[0]
49 build_dir = args[1]
50 subdir = args[2]
51 meson_command = args[3]
52 if 'python' in meson_command: # Hack.
53 meson_command = [meson_command, args[4]]
54 command = args[5]
55 arguments = args[6:]
56 else:
57 meson_command = [meson_command]
58 command = args[4]
59 arguments = args[5:]
60 pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)
61 pc.wait()
62 return pc.returncode
63
64 if __name__ == '__main__':
65 sys.exit(run(sys.argv[1:]))
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mesonbuild/scripts/commandrunner.py b/mesonbuild/scripts/commandrunner.py
--- a/mesonbuild/scripts/commandrunner.py
+++ b/mesonbuild/scripts/commandrunner.py
@@ -38,7 +38,10 @@
try:
return subprocess.Popen(command_array, env=child_env, cwd=cwd)
except FileNotFoundError:
- print('Could not execute command "%s".' % command)
+ print('Could not execute command "%s". File not found.' % command)
+ sys.exit(1)
+ except PermissionError:
+ print('Could not execute command "%s". File not executable.' % command)
sys.exit(1)
def run(args):
| {"golden_diff": "diff --git a/mesonbuild/scripts/commandrunner.py b/mesonbuild/scripts/commandrunner.py\n--- a/mesonbuild/scripts/commandrunner.py\n+++ b/mesonbuild/scripts/commandrunner.py\n@@ -38,7 +38,10 @@\n try:\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n except FileNotFoundError:\n- print('Could not execute command \"%s\".' % command)\n+ print('Could not execute command \"%s\". File not found.' % command)\n+ sys.exit(1)\n+ except PermissionError:\n+ print('Could not execute command \"%s\". File not executable.' % command)\n sys.exit(1)\n \n def run(args):\n", "issue": "run_target Permission Denied error should be clearer\nMinimal example - \r\n[folder.zip](https://github.com/mesonbuild/meson/files/1530489/folder.zip)\r\n\r\nI ran - `mkdir build && cd build && meson .. && ninja` and everything works. Now I run - `ninja myscript` and it throws errors -\r\n\r\n```\r\n[0/1] Running external command myscript.\r\nTraceback (most recent call last):\r\n File \"/usr/bin/meson\", line 37, in <module>\r\n sys.exit(main())\r\n File \"/usr/bin/meson\", line 34, in main\r\n return mesonmain.run(sys.argv[1:], launcher)\r\n File \"/usr/lib/python3.6/site-packages/mesonbuild/mesonmain.py\", line 311, in run\r\n sys.exit(run_script_command(args[1:]))\r\n File \"/usr/lib/python3.6/site-packages/mesonbuild/mesonmain.py\", line 278, in run_script_command\r\n return cmdfunc(cmdargs)\r\n File \"/usr/lib/python3.6/site-packages/mesonbuild/scripts/commandrunner.py\", line 60, in run\r\n pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)\r\n File \"/usr/lib/python3.6/site-packages/mesonbuild/scripts/commandrunner.py\", line 39, in run_command\r\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\r\n File \"/usr/lib/python3.6/subprocess.py\", line 709, in __init__\r\n restore_signals, start_new_session)\r\n File \"/usr/lib/python3.6/subprocess.py\", line 1344, in _execute_child\r\n raise child_exception_type(errno_num, err_msg, err_filename)\r\nPermissionError: [Errno 13] Permission denied: '/home/agauniyal/temp/scripts/script.sh'\r\nFAILED: meson-myscript \r\n/usr/bin/python /usr/bin/meson --internal commandrunner /home/agauniyal/temp/ /home/agauniyal/temp/build '' /usr/bin/python /usr/bin/meson /home/agauniyal/temp/scripts/script.sh\r\nninja: build stopped: subcommand failed.\r\n```\n", "before_files": [{"content": "# Copyright 2014 The Meson development team\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This program is a wrapper to run external commands. 
It determines\nwhat to run, sets up the environment and executes the command.\"\"\"\n\nimport sys, os, subprocess, shutil, shlex\n\ndef run_command(source_dir, build_dir, subdir, meson_command, command, arguments):\n env = {'MESON_SOURCE_ROOT': source_dir,\n 'MESON_BUILD_ROOT': build_dir,\n 'MESON_SUBDIR': subdir,\n 'MESONINTROSPECT': ' '.join([shlex.quote(x) for x in meson_command + ['introspect']]),\n }\n cwd = os.path.join(source_dir, subdir)\n child_env = os.environ.copy()\n child_env.update(env)\n\n # Is the command an executable in path?\n exe = shutil.which(command)\n if exe is not None:\n command_array = [exe] + arguments\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n # No? Maybe it is a script in the source tree.\n fullpath = os.path.join(source_dir, subdir, command)\n command_array = [fullpath] + arguments\n try:\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n except FileNotFoundError:\n print('Could not execute command \"%s\".' % command)\n sys.exit(1)\n\ndef run(args):\n if len(args) < 4:\n print('commandrunner.py <source dir> <build dir> <subdir> <command> [arguments]')\n return 1\n src_dir = args[0]\n build_dir = args[1]\n subdir = args[2]\n meson_command = args[3]\n if 'python' in meson_command: # Hack.\n meson_command = [meson_command, args[4]]\n command = args[5]\n arguments = args[6:]\n else:\n meson_command = [meson_command]\n command = args[4]\n arguments = args[5:]\n pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)\n pc.wait()\n return pc.returncode\n\nif __name__ == '__main__':\n sys.exit(run(sys.argv[1:]))\n", "path": "mesonbuild/scripts/commandrunner.py"}], "after_files": [{"content": "# Copyright 2014 The Meson development team\n\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n\n# http://www.apache.org/licenses/LICENSE-2.0\n\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"This program is a wrapper to run external commands. It determines\nwhat to run, sets up the environment and executes the command.\"\"\"\n\nimport sys, os, subprocess, shutil, shlex\n\ndef run_command(source_dir, build_dir, subdir, meson_command, command, arguments):\n env = {'MESON_SOURCE_ROOT': source_dir,\n 'MESON_BUILD_ROOT': build_dir,\n 'MESON_SUBDIR': subdir,\n 'MESONINTROSPECT': ' '.join([shlex.quote(x) for x in meson_command + ['introspect']]),\n }\n cwd = os.path.join(source_dir, subdir)\n child_env = os.environ.copy()\n child_env.update(env)\n\n # Is the command an executable in path?\n exe = shutil.which(command)\n if exe is not None:\n command_array = [exe] + arguments\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n # No? Maybe it is a script in the source tree.\n fullpath = os.path.join(source_dir, subdir, command)\n command_array = [fullpath] + arguments\n try:\n return subprocess.Popen(command_array, env=child_env, cwd=cwd)\n except FileNotFoundError:\n print('Could not execute command \"%s\". File not found.' % command)\n sys.exit(1)\n except PermissionError:\n print('Could not execute command \"%s\". File not executable.' 
% command)\n sys.exit(1)\n\ndef run(args):\n if len(args) < 4:\n print('commandrunner.py <source dir> <build dir> <subdir> <command> [arguments]')\n return 1\n src_dir = args[0]\n build_dir = args[1]\n subdir = args[2]\n meson_command = args[3]\n if 'python' in meson_command: # Hack.\n meson_command = [meson_command, args[4]]\n command = args[5]\n arguments = args[6:]\n else:\n meson_command = [meson_command]\n command = args[4]\n arguments = args[5:]\n pc = run_command(src_dir, build_dir, subdir, meson_command, command, arguments)\n pc.wait()\n return pc.returncode\n\nif __name__ == '__main__':\n sys.exit(run(sys.argv[1:]))\n", "path": "mesonbuild/scripts/commandrunner.py"}]} | 1,469 | 151 |
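Editor's note: a quick pytest-style sketch that would exercise both error branches of the patched `run_command` (import path taken from the file tree above; `tmp_path` is pytest's built-in fixture, POSIX file modes assumed):

```python
import pytest

from mesonbuild.scripts import commandrunner


def test_missing_command_exits(tmp_path):
    with pytest.raises(SystemExit):  # FileNotFoundError branch -> sys.exit(1)
        commandrunner.run_command(
            str(tmp_path), str(tmp_path), "", ["meson"], "no-such-command", []
        )


def test_non_executable_script_exits(tmp_path):
    script = tmp_path / "script.sh"
    script.write_text("#!/bin/sh\n")  # created 0o644: readable, not executable
    with pytest.raises(SystemExit):  # PermissionError branch added by the patch
        commandrunner.run_command(
            str(tmp_path), str(tmp_path), "", ["meson"], "script.sh", []
        )
```

Before the patch, the second test dies with a raw `PermissionError` traceback instead of the clean exit.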
gh_patches_debug_23694 | rasdani/github-patches | git_diff | pre-commit__pre-commit-718 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Handle when `core.hooksPath` is set?
As we found in https://github.com/pre-commit/pre-commit-hooks/issues/250, pre-commit (despite being installed) will be silently skipped if `core.hooksPath` is set.
A few options:
- during `pre-commit install`, check this variable and warn
- "" but error
- install into the directory at `core.hooksPath` (but it may be outside the working dir? probably not the best idea to write to it)
--- END ISSUE ---
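Editor's note: the check behind the first option is a single `git config` lookup; a standalone sketch of the detection (function name is mine, not pre-commit's — the merged fix below goes through pre-commit's own `cmd_output` helper and errors out rather than warning):

```python
import subprocess


def hooks_path_override():
    """Return the value of core.hooksPath if set, else None."""
    result = subprocess.run(
        ["git", "config", "core.hooksPath"],
        capture_output=True, text=True,
    )
    # `git config <key>` exits 1 and prints nothing when the key is unset.
    return result.stdout.strip() or None


if hooks_path_override():
    print("warning: core.hooksPath is set; hooks installed to .git/hooks are ignored")
```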
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pre_commit/commands/install_uninstall.py`
Content:
```
1 from __future__ import print_function
2 from __future__ import unicode_literals
3
4 import io
5 import os.path
6 import sys
7
8 from pre_commit import output
9 from pre_commit.util import make_executable
10 from pre_commit.util import mkdirp
11 from pre_commit.util import resource_filename
12
13
14 # This is used to identify the hook file we install
15 PRIOR_HASHES = (
16 '4d9958c90bc262f47553e2c073f14cfe',
17 'd8ee923c46731b42cd95cc869add4062',
18 '49fd668cb42069aa1b6048464be5d395',
19 '79f09a650522a87b0da915d0d983b2de',
20 'e358c9dae00eac5d06b38dfdb1e33a8c',
21 )
22 CURRENT_HASH = '138fd403232d2ddd5efb44317e38bf03'
23 TEMPLATE_START = '# start templated\n'
24 TEMPLATE_END = '# end templated\n'
25
26
27 def is_our_script(filename):
28 if not os.path.exists(filename):
29 return False
30 contents = io.open(filename).read()
31 return any(h in contents for h in (CURRENT_HASH,) + PRIOR_HASHES)
32
33
34 def install(
35 runner, overwrite=False, hooks=False, hook_type='pre-commit',
36 skip_on_missing_conf=False,
37 ):
38 """Install the pre-commit hooks."""
39 hook_path = runner.get_hook_path(hook_type)
40 legacy_path = hook_path + '.legacy'
41
42 mkdirp(os.path.dirname(hook_path))
43
44 # If we have an existing hook, move it to pre-commit.legacy
45 if os.path.lexists(hook_path) and not is_our_script(hook_path):
46 os.rename(hook_path, legacy_path)
47
48 # If we specify overwrite, we simply delete the legacy file
49 if overwrite and os.path.exists(legacy_path):
50 os.remove(legacy_path)
51 elif os.path.exists(legacy_path):
52 output.write_line(
53 'Running in migration mode with existing hooks at {}\n'
54 'Use -f to use only pre-commit.'.format(legacy_path),
55 )
56
57 params = {
58 'CONFIG': runner.config_file,
59 'HOOK_TYPE': hook_type,
60 'INSTALL_PYTHON': sys.executable,
61 'SKIP_ON_MISSING_CONFIG': skip_on_missing_conf,
62 }
63
64 with io.open(hook_path, 'w') as hook_file:
65 with io.open(resource_filename('hook-tmpl')) as f:
66 contents = f.read()
67 before, rest = contents.split(TEMPLATE_START)
68 to_template, after = rest.split(TEMPLATE_END)
69
70 hook_file.write(before + TEMPLATE_START)
71 for line in to_template.splitlines():
72 var = line.split()[0]
73 hook_file.write('{} = {!r}\n'.format(var, params[var]))
74 hook_file.write(TEMPLATE_END + after)
75 make_executable(hook_path)
76
77 output.write_line('pre-commit installed at {}'.format(hook_path))
78
79 # If they requested we install all of the hooks, do so.
80 if hooks:
81 install_hooks(runner)
82
83 return 0
84
85
86 def install_hooks(runner):
87 for repository in runner.repositories:
88 repository.require_installed()
89
90
91 def uninstall(runner, hook_type='pre-commit'):
92 """Uninstall the pre-commit hooks."""
93 hook_path = runner.get_hook_path(hook_type)
94 legacy_path = hook_path + '.legacy'
95 # If our file doesn't exist or it isn't ours, gtfo.
96 if not os.path.exists(hook_path) or not is_our_script(hook_path):
97 return 0
98
99 os.remove(hook_path)
100 output.write_line('{} uninstalled'.format(hook_type))
101
102 if os.path.exists(legacy_path):
103 os.rename(legacy_path, hook_path)
104 output.write_line('Restored previous hooks to {}'.format(hook_path))
105
106 return 0
107
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pre_commit/commands/install_uninstall.py b/pre_commit/commands/install_uninstall.py
--- a/pre_commit/commands/install_uninstall.py
+++ b/pre_commit/commands/install_uninstall.py
@@ -2,15 +2,19 @@
from __future__ import unicode_literals
import io
+import logging
import os.path
import sys
from pre_commit import output
+from pre_commit.util import cmd_output
from pre_commit.util import make_executable
from pre_commit.util import mkdirp
from pre_commit.util import resource_filename
+logger = logging.getLogger(__name__)
+
# This is used to identify the hook file we install
PRIOR_HASHES = (
'4d9958c90bc262f47553e2c073f14cfe',
@@ -36,6 +40,13 @@
skip_on_missing_conf=False,
):
"""Install the pre-commit hooks."""
+ if cmd_output('git', 'config', 'core.hooksPath', retcode=None)[1].strip():
+ logger.error(
+ 'Cowardly refusing to install hooks with `core.hooksPath` set.\n'
+ 'hint: `git config --unset-all core.hooksPath`',
+ )
+ return 1
+
hook_path = runner.get_hook_path(hook_type)
legacy_path = hook_path + '.legacy'
| {"golden_diff": "diff --git a/pre_commit/commands/install_uninstall.py b/pre_commit/commands/install_uninstall.py\n--- a/pre_commit/commands/install_uninstall.py\n+++ b/pre_commit/commands/install_uninstall.py\n@@ -2,15 +2,19 @@\n from __future__ import unicode_literals\n \n import io\n+import logging\n import os.path\n import sys\n \n from pre_commit import output\n+from pre_commit.util import cmd_output\n from pre_commit.util import make_executable\n from pre_commit.util import mkdirp\n from pre_commit.util import resource_filename\n \n \n+logger = logging.getLogger(__name__)\n+\n # This is used to identify the hook file we install\n PRIOR_HASHES = (\n '4d9958c90bc262f47553e2c073f14cfe',\n@@ -36,6 +40,13 @@\n skip_on_missing_conf=False,\n ):\n \"\"\"Install the pre-commit hooks.\"\"\"\n+ if cmd_output('git', 'config', 'core.hooksPath', retcode=None)[1].strip():\n+ logger.error(\n+ 'Cowardly refusing to install hooks with `core.hooksPath` set.\\n'\n+ 'hint: `git config --unset-all core.hooksPath`',\n+ )\n+ return 1\n+\n hook_path = runner.get_hook_path(hook_type)\n legacy_path = hook_path + '.legacy'\n", "issue": "Handle when `core.hooksPath` is set?\nAs we found in https://github.com/pre-commit/pre-commit-hooks/issues/250, pre-commit (despite being installed) will be silently skipped if `core.hooksPath` is set.\r\n\r\nA few options:\r\n- during `pre-commit install`, check this variable and warn\r\n- \"\" but error\r\n- install into the directory at `core.hooksPath` (but it may be outside the working dir? probably not the best idea to write to it)\n", "before_files": [{"content": "from __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport io\nimport os.path\nimport sys\n\nfrom pre_commit import output\nfrom pre_commit.util import make_executable\nfrom pre_commit.util import mkdirp\nfrom pre_commit.util import resource_filename\n\n\n# This is used to identify the hook file we install\nPRIOR_HASHES = (\n '4d9958c90bc262f47553e2c073f14cfe',\n 'd8ee923c46731b42cd95cc869add4062',\n '49fd668cb42069aa1b6048464be5d395',\n '79f09a650522a87b0da915d0d983b2de',\n 'e358c9dae00eac5d06b38dfdb1e33a8c',\n)\nCURRENT_HASH = '138fd403232d2ddd5efb44317e38bf03'\nTEMPLATE_START = '# start templated\\n'\nTEMPLATE_END = '# end templated\\n'\n\n\ndef is_our_script(filename):\n if not os.path.exists(filename):\n return False\n contents = io.open(filename).read()\n return any(h in contents for h in (CURRENT_HASH,) + PRIOR_HASHES)\n\n\ndef install(\n runner, overwrite=False, hooks=False, hook_type='pre-commit',\n skip_on_missing_conf=False,\n):\n \"\"\"Install the pre-commit hooks.\"\"\"\n hook_path = runner.get_hook_path(hook_type)\n legacy_path = hook_path + '.legacy'\n\n mkdirp(os.path.dirname(hook_path))\n\n # If we have an existing hook, move it to pre-commit.legacy\n if os.path.lexists(hook_path) and not is_our_script(hook_path):\n os.rename(hook_path, legacy_path)\n\n # If we specify overwrite, we simply delete the legacy file\n if overwrite and os.path.exists(legacy_path):\n os.remove(legacy_path)\n elif os.path.exists(legacy_path):\n output.write_line(\n 'Running in migration mode with existing hooks at {}\\n'\n 'Use -f to use only pre-commit.'.format(legacy_path),\n )\n\n params = {\n 'CONFIG': runner.config_file,\n 'HOOK_TYPE': hook_type,\n 'INSTALL_PYTHON': sys.executable,\n 'SKIP_ON_MISSING_CONFIG': skip_on_missing_conf,\n }\n\n with io.open(hook_path, 'w') as hook_file:\n with io.open(resource_filename('hook-tmpl')) as f:\n contents = f.read()\n before, rest = 
contents.split(TEMPLATE_START)\n to_template, after = rest.split(TEMPLATE_END)\n\n hook_file.write(before + TEMPLATE_START)\n for line in to_template.splitlines():\n var = line.split()[0]\n hook_file.write('{} = {!r}\\n'.format(var, params[var]))\n hook_file.write(TEMPLATE_END + after)\n make_executable(hook_path)\n\n output.write_line('pre-commit installed at {}'.format(hook_path))\n\n # If they requested we install all of the hooks, do so.\n if hooks:\n install_hooks(runner)\n\n return 0\n\n\ndef install_hooks(runner):\n for repository in runner.repositories:\n repository.require_installed()\n\n\ndef uninstall(runner, hook_type='pre-commit'):\n \"\"\"Uninstall the pre-commit hooks.\"\"\"\n hook_path = runner.get_hook_path(hook_type)\n legacy_path = hook_path + '.legacy'\n # If our file doesn't exist or it isn't ours, gtfo.\n if not os.path.exists(hook_path) or not is_our_script(hook_path):\n return 0\n\n os.remove(hook_path)\n output.write_line('{} uninstalled'.format(hook_type))\n\n if os.path.exists(legacy_path):\n os.rename(legacy_path, hook_path)\n output.write_line('Restored previous hooks to {}'.format(hook_path))\n\n return 0\n", "path": "pre_commit/commands/install_uninstall.py"}], "after_files": [{"content": "from __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport io\nimport logging\nimport os.path\nimport sys\n\nfrom pre_commit import output\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import make_executable\nfrom pre_commit.util import mkdirp\nfrom pre_commit.util import resource_filename\n\n\nlogger = logging.getLogger(__name__)\n\n# This is used to identify the hook file we install\nPRIOR_HASHES = (\n '4d9958c90bc262f47553e2c073f14cfe',\n 'd8ee923c46731b42cd95cc869add4062',\n '49fd668cb42069aa1b6048464be5d395',\n '79f09a650522a87b0da915d0d983b2de',\n 'e358c9dae00eac5d06b38dfdb1e33a8c',\n)\nCURRENT_HASH = '138fd403232d2ddd5efb44317e38bf03'\nTEMPLATE_START = '# start templated\\n'\nTEMPLATE_END = '# end templated\\n'\n\n\ndef is_our_script(filename):\n if not os.path.exists(filename):\n return False\n contents = io.open(filename).read()\n return any(h in contents for h in (CURRENT_HASH,) + PRIOR_HASHES)\n\n\ndef install(\n runner, overwrite=False, hooks=False, hook_type='pre-commit',\n skip_on_missing_conf=False,\n):\n \"\"\"Install the pre-commit hooks.\"\"\"\n if cmd_output('git', 'config', 'core.hooksPath', retcode=None)[1].strip():\n logger.error(\n 'Cowardly refusing to install hooks with `core.hooksPath` set.\\n'\n 'hint: `git config --unset-all core.hooksPath`',\n )\n return 1\n\n hook_path = runner.get_hook_path(hook_type)\n legacy_path = hook_path + '.legacy'\n\n mkdirp(os.path.dirname(hook_path))\n\n # If we have an existing hook, move it to pre-commit.legacy\n if os.path.lexists(hook_path) and not is_our_script(hook_path):\n os.rename(hook_path, legacy_path)\n\n # If we specify overwrite, we simply delete the legacy file\n if overwrite and os.path.exists(legacy_path):\n os.remove(legacy_path)\n elif os.path.exists(legacy_path):\n output.write_line(\n 'Running in migration mode with existing hooks at {}\\n'\n 'Use -f to use only pre-commit.'.format(legacy_path),\n )\n\n params = {\n 'CONFIG': runner.config_file,\n 'HOOK_TYPE': hook_type,\n 'INSTALL_PYTHON': sys.executable,\n 'SKIP_ON_MISSING_CONFIG': skip_on_missing_conf,\n }\n\n with io.open(hook_path, 'w') as hook_file:\n with io.open(resource_filename('hook-tmpl')) as f:\n contents = f.read()\n before, rest = contents.split(TEMPLATE_START)\n to_template, after = 
rest.split(TEMPLATE_END)\n\n hook_file.write(before + TEMPLATE_START)\n for line in to_template.splitlines():\n var = line.split()[0]\n hook_file.write('{} = {!r}\\n'.format(var, params[var]))\n hook_file.write(TEMPLATE_END + after)\n make_executable(hook_path)\n\n output.write_line('pre-commit installed at {}'.format(hook_path))\n\n # If they requested we install all of the hooks, do so.\n if hooks:\n install_hooks(runner)\n\n return 0\n\n\ndef install_hooks(runner):\n for repository in runner.repositories:\n repository.require_installed()\n\n\ndef uninstall(runner, hook_type='pre-commit'):\n \"\"\"Uninstall the pre-commit hooks.\"\"\"\n hook_path = runner.get_hook_path(hook_type)\n legacy_path = hook_path + '.legacy'\n # If our file doesn't exist or it isn't ours, gtfo.\n if not os.path.exists(hook_path) or not is_our_script(hook_path):\n return 0\n\n os.remove(hook_path)\n output.write_line('{} uninstalled'.format(hook_type))\n\n if os.path.exists(legacy_path):\n os.rename(legacy_path, hook_path)\n output.write_line('Restored previous hooks to {}'.format(hook_path))\n\n return 0\n", "path": "pre_commit/commands/install_uninstall.py"}]} | 1,495 | 308 |
gh_patches_debug_8601 | rasdani/github-patches | git_diff | getmoto__moto-589 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
botocore.exceptions.ConnectionClosedError raised when calling change_resource_record_sets (Boto3)
I am not sure whether or not I should expect this to work, but I see there are currently similar tests in moto against boto so I thought I would inquire.
When using the Route53 client from boto3, a call to the change_resource_record_sets method raises a botocore.exceptions.ConnectionClosedError (Connection was closed before we received a valid response from endpoint URL: "https://route53.amazonaws.com/2013-04-01/hostedzone/cc11c883/rrset/").
A test case to reproduce is below.
``` python
import boto3
import uuid
from moto import mock_route53
def guid():
return str(uuid.uuid4())
@mock_route53
def test_route53_rrset_fail():
client = boto3.client('route53')
# Create a new zone
zone_name = '{0}.com'.format(guid())
zone = client.create_hosted_zone(
Name=zone_name,
CallerReference=guid(),
HostedZoneConfig={'Comment': guid()}
)
zone_id = zone['HostedZone']['Id']
# Verify the zone is retrievable
z = client.get_hosted_zone(Id=zone_id)
assert z['HostedZone']['Id'] == zone_id
# Try to create a record set
# Raises botocore.exceptions.ConnectionClosedError
client.change_resource_record_sets(
HostedZoneId=zone_id,
ChangeBatch={
'Comment': guid(),
'Changes': [{
'Action': 'CREATE',
'ResourceRecordSet': {
'Name': 'foo.{0}'.format(zone_name),
'Type': 'A',
'ResourceRecords': [{'Value': '1.2.3.4'}]
}
}]
}
)
```
--- END ISSUE ---
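Editor's note: the `ConnectionClosedError` is moto's symptom for an unmatched URL — botocore requests `.../rrset/` with a trailing slash, while moto's route (see `urls.py` below) anchors at `rrset$`. The mismatch, and the one-character fix applied below, are easy to show with the bare patterns (the `'{0}'` base-URL prefix trimmed for brevity):

```python
import re

old = r"hostedzone/[^/]+/rrset$"
new = r"hostedzone/[^/]+/rrset/?$"   # optional trailing slash

path = "2013-04-01/hostedzone/cc11c883/rrset/"  # what botocore actually sends

print(re.search(old, path))  # None -> no handler -> moto closes the connection
print(re.search(new, path))  # <re.Match ...> -> request routed to rrset_response
```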
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `moto/route53/urls.py`
Content:
```
1 from __future__ import unicode_literals
2 from . import responses
3
4 url_bases = [
5 "https://route53.amazonaws.com/201.-..-../",
6 ]
7
8 url_paths = {
9 '{0}hostedzone$': responses.list_or_create_hostzone_response,
10 '{0}hostedzone/[^/]+$': responses.get_or_delete_hostzone_response,
11 '{0}hostedzone/[^/]+/rrset$': responses.rrset_response,
12 '{0}healthcheck': responses.health_check_response,
13 '{0}tags|trafficpolicyinstances/*': responses.not_implemented_response,
14 }
15
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/moto/route53/urls.py b/moto/route53/urls.py
--- a/moto/route53/urls.py
+++ b/moto/route53/urls.py
@@ -8,7 +8,7 @@
url_paths = {
'{0}hostedzone$': responses.list_or_create_hostzone_response,
'{0}hostedzone/[^/]+$': responses.get_or_delete_hostzone_response,
- '{0}hostedzone/[^/]+/rrset$': responses.rrset_response,
+ '{0}hostedzone/[^/]+/rrset/?$': responses.rrset_response,
'{0}healthcheck': responses.health_check_response,
'{0}tags|trafficpolicyinstances/*': responses.not_implemented_response,
}
| {"golden_diff": "diff --git a/moto/route53/urls.py b/moto/route53/urls.py\n--- a/moto/route53/urls.py\n+++ b/moto/route53/urls.py\n@@ -8,7 +8,7 @@\n url_paths = {\n '{0}hostedzone$': responses.list_or_create_hostzone_response,\n '{0}hostedzone/[^/]+$': responses.get_or_delete_hostzone_response,\n- '{0}hostedzone/[^/]+/rrset$': responses.rrset_response,\n+ '{0}hostedzone/[^/]+/rrset/?$': responses.rrset_response,\n '{0}healthcheck': responses.health_check_response,\n '{0}tags|trafficpolicyinstances/*': responses.not_implemented_response,\n }\n", "issue": "botocore.exceptions.ConnectionClosedError raised when calling change_resource_record_sets (Boto3)\nI am not sure whether or not I should expect this to work, but I see there are currently similar tests in moto against boto so I thought I would inquire.\n\nWhen using the Route53 client from boto3, a call to the change_resource_record_set method raises a botocore.exceptions.ConnectionClosedError. (botocore.exceptions.ConnectionClosedError: Connection was closed before we received a valid response from endpoint URL: \"https://route53.amazonaws.com/2013-04-01/hostedzone/cc11c883/rrset/\")\n\nA test case to reproduce is below.\n\n``` python\nimport boto3\nimport uuid\nfrom moto import mock_route53\n\ndef guid():\n return str(uuid.uuid4())\n\n@mock_route53\ndef test_route53_rrset_fail():\n\n client = boto3.client('route53')\n\n # Create a new zone\n zone_name = '{0}.com'.format(guid())\n zone = client.create_hosted_zone(\n Name=zone_name,\n CallerReference=guid(),\n HostedZoneConfig={'Comment': guid()}\n )\n zone_id = zone['HostedZone']['Id']\n\n # Verify the zone is retrievable\n z = client.get_hosted_zone(Id=zone_id)\n assert z['HostedZone']['Id'] == zone_id\n\n # Try to create a record set\n # Raises botocore.exceptions.ConnectionClosedError\n client.change_resource_record_sets(\n HostedZoneId=zone_id,\n ChangeBatch={\n 'Comment': guid(),\n 'Changes': [{\n 'Action': 'CREATE',\n 'ResourceRecordSet': {\n 'Name': 'foo.{0}'.format(zone_name),\n 'Type': 'A',\n 'ResourceRecords': [{'Value': '1.2.3.4'}]\n }\n }]\n }\n )\n```\n\n", "before_files": [{"content": "from __future__ import unicode_literals\nfrom . import responses\n\nurl_bases = [\n \"https://route53.amazonaws.com/201.-..-../\",\n]\n\nurl_paths = {\n '{0}hostedzone$': responses.list_or_create_hostzone_response,\n '{0}hostedzone/[^/]+$': responses.get_or_delete_hostzone_response,\n '{0}hostedzone/[^/]+/rrset$': responses.rrset_response,\n '{0}healthcheck': responses.health_check_response,\n '{0}tags|trafficpolicyinstances/*': responses.not_implemented_response,\n}\n", "path": "moto/route53/urls.py"}], "after_files": [{"content": "from __future__ import unicode_literals\nfrom . import responses\n\nurl_bases = [\n \"https://route53.amazonaws.com/201.-..-../\",\n]\n\nurl_paths = {\n '{0}hostedzone$': responses.list_or_create_hostzone_response,\n '{0}hostedzone/[^/]+$': responses.get_or_delete_hostzone_response,\n '{0}hostedzone/[^/]+/rrset/?$': responses.rrset_response,\n '{0}healthcheck': responses.health_check_response,\n '{0}tags|trafficpolicyinstances/*': responses.not_implemented_response,\n}\n", "path": "moto/route53/urls.py"}]} | 828 | 176 |
gh_patches_debug_15401 | rasdani/github-patches | git_diff | pytorch__text-1912 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
todo-decorator-remove-solved
Removed the code as the issue is closed.
--- END ISSUE ---
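Editor's note: the terse issue only makes sense next to the `TODO` in the file below — the URLs were temporarily pointed at a GitHub mirror while `www.quest.dcs.shef.ac.uk` was down (pytorch/text#1756), and the task is to point them back now that the host has recovered. A throwaway reachability probe (timeout choice arbitrary):

```python
import urllib.request

URL = "http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/mmt16_task1_test.tar.gz"

req = urllib.request.Request(URL, method="HEAD")  # headers only, skip the payload
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.status, resp.headers.get("Content-Length"))
```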
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchtext/datasets/multi30k.py`
Content:
```
1 import os
2 from functools import partial
3 from typing import Union, Tuple
4
5 from torchtext._internal.module_utils import is_module_available
6 from torchtext.data.datasets_utils import (
7 _wrap_split_argument,
8 _create_dataset_directory,
9 )
10
11 if is_module_available("torchdata"):
12 from torchdata.datapipes.iter import FileOpener, IterableWrapper
13 from torchtext._download_hooks import HttpReader
14
15 # TODO: Update URL to original once the server is back up (see https://github.com/pytorch/text/issues/1756)
16 URL = {
17 "train": r"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/training.tar.gz",
18 "valid": r"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/validation.tar.gz",
19 "test": r"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/mmt16_task1_test.tar.gz",
20 }
21
22 MD5 = {
23 "train": "20140d013d05dd9a72dfde46478663ba05737ce983f478f960c1123c6671be5e",
24 "valid": "a7aa20e9ebd5ba5adce7909498b94410996040857154dab029851af3a866da8c",
25 "test": "6d1ca1dba99e2c5dd54cae1226ff11c2551e6ce63527ebb072a1f70f72a5cd36",
26 }
27
28 _PREFIX = {
29 "train": "train",
30 "valid": "val",
31 "test": "test",
32 }
33
34 NUM_LINES = {
35 "train": 29000,
36 "valid": 1014,
37 "test": 1000,
38 }
39
40 DATASET_NAME = "Multi30k"
41
42
43 def _filepath_fn(root, split, _=None):
44 return os.path.join(root, os.path.basename(URL[split]))
45
46
47 def _decompressed_filepath_fn(root, split, language_pair, i, _):
48 return os.path.join(root, f"{_PREFIX[split]}.{language_pair[i]}")
49
50
51 def _filter_fn(split, language_pair, i, x):
52 return f"{_PREFIX[split]}.{language_pair[i]}" in x[0]
53
54
55 @_create_dataset_directory(dataset_name=DATASET_NAME)
56 @_wrap_split_argument(("train", "valid", "test"))
57 def Multi30k(root: str, split: Union[Tuple[str], str], language_pair: Tuple[str] = ("de", "en")):
58 """Multi30k dataset
59
60 .. warning::
61
62 using datapipes is still currently subject to a few caveats. if you wish
63 to use this dataset with shuffling, multi-processing, or distributed
64 learning, please see :ref:`this note <datapipes_warnings>` for further
65 instructions.
66
67 For additional details refer to https://www.statmt.org/wmt16/multimodal-task.html#task1
68
69 Number of lines per split:
70 - train: 29000
71 - valid: 1014
72 - test: 1000
73
74 Args:
75 root: Directory where the datasets are saved. Default: os.path.expanduser('~/.torchtext/cache')
76 split: split or splits to be returned. Can be a string or tuple of strings. Default: ('train', 'valid', 'test')
77 language_pair: tuple or list containing src and tgt language. Available options are ('de','en') and ('en', 'de')
78
79 :return: DataPipe that yields tuple of source and target sentences
80 :rtype: (str, str)
81 """
82
83 assert len(language_pair) == 2, "language_pair must contain only 2 elements: src and tgt language respectively"
84 assert tuple(sorted(language_pair)) == (
85 "de",
86 "en",
87 ), "language_pair must be either ('de','en') or ('en', 'de')"
88
89 if not is_module_available("torchdata"):
90 raise ModuleNotFoundError(
91 "Package `torchdata` not found. Please install following instructions at https://github.com/pytorch/data"
92 )
93
94 url_dp = IterableWrapper([URL[split]])
95
96 cache_compressed_dp = url_dp.on_disk_cache(
97 filepath_fn=partial(_filepath_fn, root, split),
98 hash_dict={_filepath_fn(root, split): MD5[split]},
99 hash_type="sha256",
100 )
101 cache_compressed_dp = HttpReader(cache_compressed_dp).end_caching(mode="wb", same_filepath_fn=True)
102
103 cache_compressed_dp_1, cache_compressed_dp_2 = cache_compressed_dp.fork(num_instances=2)
104
105 src_cache_decompressed_dp = cache_compressed_dp_1.on_disk_cache(
106 filepath_fn=partial(_decompressed_filepath_fn, root, split, language_pair, 0)
107 )
108 src_cache_decompressed_dp = (
109 FileOpener(src_cache_decompressed_dp, mode="b")
110 .load_from_tar()
111 .filter(partial(_filter_fn, split, language_pair, 0))
112 )
113 src_cache_decompressed_dp = src_cache_decompressed_dp.end_caching(mode="wb", same_filepath_fn=True)
114
115 tgt_cache_decompressed_dp = cache_compressed_dp_2.on_disk_cache(
116 filepath_fn=partial(_decompressed_filepath_fn, root, split, language_pair, 1)
117 )
118 tgt_cache_decompressed_dp = (
119 FileOpener(tgt_cache_decompressed_dp, mode="b")
120 .load_from_tar()
121 .filter(partial(_filter_fn, split, language_pair, 1))
122 )
123 tgt_cache_decompressed_dp = tgt_cache_decompressed_dp.end_caching(mode="wb", same_filepath_fn=True)
124
125 src_data_dp = FileOpener(src_cache_decompressed_dp, encoding="utf-8").readlines(
126 return_path=False, strip_newline=True
127 )
128 tgt_data_dp = FileOpener(tgt_cache_decompressed_dp, encoding="utf-8").readlines(
129 return_path=False, strip_newline=True
130 )
131
132 return src_data_dp.zip(tgt_data_dp).shuffle().set_shuffle(False).sharding_filter()
133
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchtext/datasets/multi30k.py b/torchtext/datasets/multi30k.py
--- a/torchtext/datasets/multi30k.py
+++ b/torchtext/datasets/multi30k.py
@@ -12,11 +12,10 @@
from torchdata.datapipes.iter import FileOpener, IterableWrapper
from torchtext._download_hooks import HttpReader
-# TODO: Update URL to original once the server is back up (see https://github.com/pytorch/text/issues/1756)
URL = {
- "train": r"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/training.tar.gz",
- "valid": r"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/validation.tar.gz",
- "test": r"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/mmt16_task1_test.tar.gz",
+ "train": "http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/training.tar.gz",
+ "valid": "http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/validation.tar.gz",
+ "test": "http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/mmt16_task1_test.tar.gz",
}
MD5 = {
| {"golden_diff": "diff --git a/torchtext/datasets/multi30k.py b/torchtext/datasets/multi30k.py\n--- a/torchtext/datasets/multi30k.py\n+++ b/torchtext/datasets/multi30k.py\n@@ -12,11 +12,10 @@\n from torchdata.datapipes.iter import FileOpener, IterableWrapper\n from torchtext._download_hooks import HttpReader\n \n-# TODO: Update URL to original once the server is back up (see https://github.com/pytorch/text/issues/1756)\n URL = {\n- \"train\": r\"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/training.tar.gz\",\n- \"valid\": r\"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/validation.tar.gz\",\n- \"test\": r\"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/mmt16_task1_test.tar.gz\",\n+ \"train\": \"http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/training.tar.gz\",\n+ \"valid\": \"http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/validation.tar.gz\",\n+ \"test\": \"http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/mmt16_task1_test.tar.gz\",\n }\n \n MD5 = {\n", "issue": "todo-decorator-remove-solved\nRemoved the code as the issue is closed.\n", "before_files": [{"content": "import os\nfrom functools import partial\nfrom typing import Union, Tuple\n\nfrom torchtext._internal.module_utils import is_module_available\nfrom torchtext.data.datasets_utils import (\n _wrap_split_argument,\n _create_dataset_directory,\n)\n\nif is_module_available(\"torchdata\"):\n from torchdata.datapipes.iter import FileOpener, IterableWrapper\n from torchtext._download_hooks import HttpReader\n\n# TODO: Update URL to original once the server is back up (see https://github.com/pytorch/text/issues/1756)\nURL = {\n \"train\": r\"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/training.tar.gz\",\n \"valid\": r\"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/validation.tar.gz\",\n \"test\": r\"https://raw.githubusercontent.com/neychev/small_DL_repo/master/datasets/Multi30k/mmt16_task1_test.tar.gz\",\n}\n\nMD5 = {\n \"train\": \"20140d013d05dd9a72dfde46478663ba05737ce983f478f960c1123c6671be5e\",\n \"valid\": \"a7aa20e9ebd5ba5adce7909498b94410996040857154dab029851af3a866da8c\",\n \"test\": \"6d1ca1dba99e2c5dd54cae1226ff11c2551e6ce63527ebb072a1f70f72a5cd36\",\n}\n\n_PREFIX = {\n \"train\": \"train\",\n \"valid\": \"val\",\n \"test\": \"test\",\n}\n\nNUM_LINES = {\n \"train\": 29000,\n \"valid\": 1014,\n \"test\": 1000,\n}\n\nDATASET_NAME = \"Multi30k\"\n\n\ndef _filepath_fn(root, split, _=None):\n return os.path.join(root, os.path.basename(URL[split]))\n\n\ndef _decompressed_filepath_fn(root, split, language_pair, i, _):\n return os.path.join(root, f\"{_PREFIX[split]}.{language_pair[i]}\")\n\n\ndef _filter_fn(split, language_pair, i, x):\n return f\"{_PREFIX[split]}.{language_pair[i]}\" in x[0]\n\n\n@_create_dataset_directory(dataset_name=DATASET_NAME)\n@_wrap_split_argument((\"train\", \"valid\", \"test\"))\ndef Multi30k(root: str, split: Union[Tuple[str], str], language_pair: Tuple[str] = (\"de\", \"en\")):\n \"\"\"Multi30k dataset\n\n .. warning::\n\n using datapipes is still currently subject to a few caveats. 
if you wish\n to use this dataset with shuffling, multi-processing, or distributed\n learning, please see :ref:`this note <datapipes_warnings>` for further\n instructions.\n\n For additional details refer to https://www.statmt.org/wmt16/multimodal-task.html#task1\n\n Number of lines per split:\n - train: 29000\n - valid: 1014\n - test: 1000\n\n Args:\n root: Directory where the datasets are saved. Default: os.path.expanduser('~/.torchtext/cache')\n split: split or splits to be returned. Can be a string or tuple of strings. Default: ('train', 'valid', 'test')\n language_pair: tuple or list containing src and tgt language. Available options are ('de','en') and ('en', 'de')\n\n :return: DataPipe that yields tuple of source and target sentences\n :rtype: (str, str)\n \"\"\"\n\n assert len(language_pair) == 2, \"language_pair must contain only 2 elements: src and tgt language respectively\"\n assert tuple(sorted(language_pair)) == (\n \"de\",\n \"en\",\n ), \"language_pair must be either ('de','en') or ('en', 'de')\"\n\n if not is_module_available(\"torchdata\"):\n raise ModuleNotFoundError(\n \"Package `torchdata` not found. Please install following instructions at https://github.com/pytorch/data\"\n )\n\n url_dp = IterableWrapper([URL[split]])\n\n cache_compressed_dp = url_dp.on_disk_cache(\n filepath_fn=partial(_filepath_fn, root, split),\n hash_dict={_filepath_fn(root, split): MD5[split]},\n hash_type=\"sha256\",\n )\n cache_compressed_dp = HttpReader(cache_compressed_dp).end_caching(mode=\"wb\", same_filepath_fn=True)\n\n cache_compressed_dp_1, cache_compressed_dp_2 = cache_compressed_dp.fork(num_instances=2)\n\n src_cache_decompressed_dp = cache_compressed_dp_1.on_disk_cache(\n filepath_fn=partial(_decompressed_filepath_fn, root, split, language_pair, 0)\n )\n src_cache_decompressed_dp = (\n FileOpener(src_cache_decompressed_dp, mode=\"b\")\n .load_from_tar()\n .filter(partial(_filter_fn, split, language_pair, 0))\n )\n src_cache_decompressed_dp = src_cache_decompressed_dp.end_caching(mode=\"wb\", same_filepath_fn=True)\n\n tgt_cache_decompressed_dp = cache_compressed_dp_2.on_disk_cache(\n filepath_fn=partial(_decompressed_filepath_fn, root, split, language_pair, 1)\n )\n tgt_cache_decompressed_dp = (\n FileOpener(tgt_cache_decompressed_dp, mode=\"b\")\n .load_from_tar()\n .filter(partial(_filter_fn, split, language_pair, 1))\n )\n tgt_cache_decompressed_dp = tgt_cache_decompressed_dp.end_caching(mode=\"wb\", same_filepath_fn=True)\n\n src_data_dp = FileOpener(src_cache_decompressed_dp, encoding=\"utf-8\").readlines(\n return_path=False, strip_newline=True\n )\n tgt_data_dp = FileOpener(tgt_cache_decompressed_dp, encoding=\"utf-8\").readlines(\n return_path=False, strip_newline=True\n )\n\n return src_data_dp.zip(tgt_data_dp).shuffle().set_shuffle(False).sharding_filter()\n", "path": "torchtext/datasets/multi30k.py"}], "after_files": [{"content": "import os\nfrom functools import partial\nfrom typing import Union, Tuple\n\nfrom torchtext._internal.module_utils import is_module_available\nfrom torchtext.data.datasets_utils import (\n _wrap_split_argument,\n _create_dataset_directory,\n)\n\nif is_module_available(\"torchdata\"):\n from torchdata.datapipes.iter import FileOpener, IterableWrapper\n from torchtext._download_hooks import HttpReader\n\nURL = {\n \"train\": \"http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/training.tar.gz\",\n \"valid\": \"http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/validation.tar.gz\",\n \"test\": 
\"http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/mmt16_task1_test.tar.gz\",\n}\n\nMD5 = {\n \"train\": \"20140d013d05dd9a72dfde46478663ba05737ce983f478f960c1123c6671be5e\",\n \"valid\": \"a7aa20e9ebd5ba5adce7909498b94410996040857154dab029851af3a866da8c\",\n \"test\": \"6d1ca1dba99e2c5dd54cae1226ff11c2551e6ce63527ebb072a1f70f72a5cd36\",\n}\n\n_PREFIX = {\n \"train\": \"train\",\n \"valid\": \"val\",\n \"test\": \"test\",\n}\n\nNUM_LINES = {\n \"train\": 29000,\n \"valid\": 1014,\n \"test\": 1000,\n}\n\nDATASET_NAME = \"Multi30k\"\n\n\ndef _filepath_fn(root, split, _=None):\n return os.path.join(root, os.path.basename(URL[split]))\n\n\ndef _decompressed_filepath_fn(root, split, language_pair, i, _):\n return os.path.join(root, f\"{_PREFIX[split]}.{language_pair[i]}\")\n\n\ndef _filter_fn(split, language_pair, i, x):\n return f\"{_PREFIX[split]}.{language_pair[i]}\" in x[0]\n\n\n@_create_dataset_directory(dataset_name=DATASET_NAME)\n@_wrap_split_argument((\"train\", \"valid\", \"test\"))\ndef Multi30k(root: str, split: Union[Tuple[str], str], language_pair: Tuple[str] = (\"de\", \"en\")):\n \"\"\"Multi30k dataset\n\n .. warning::\n\n using datapipes is still currently subject to a few caveats. if you wish\n to use this dataset with shuffling, multi-processing, or distributed\n learning, please see :ref:`this note <datapipes_warnings>` for further\n instructions.\n\n For additional details refer to https://www.statmt.org/wmt16/multimodal-task.html#task1\n\n Number of lines per split:\n - train: 29000\n - valid: 1014\n - test: 1000\n\n Args:\n root: Directory where the datasets are saved. Default: os.path.expanduser('~/.torchtext/cache')\n split: split or splits to be returned. Can be a string or tuple of strings. Default: ('train', 'valid', 'test')\n language_pair: tuple or list containing src and tgt language. Available options are ('de','en') and ('en', 'de')\n\n :return: DataPipe that yields tuple of source and target sentences\n :rtype: (str, str)\n \"\"\"\n\n assert len(language_pair) == 2, \"language_pair must contain only 2 elements: src and tgt language respectively\"\n assert tuple(sorted(language_pair)) == (\n \"de\",\n \"en\",\n ), \"language_pair must be either ('de','en') or ('en', 'de')\"\n\n if not is_module_available(\"torchdata\"):\n raise ModuleNotFoundError(\n \"Package `torchdata` not found. 
Please install following instructions at https://github.com/pytorch/data\"\n )\n\n url_dp = IterableWrapper([URL[split]])\n\n cache_compressed_dp = url_dp.on_disk_cache(\n filepath_fn=partial(_filepath_fn, root, split),\n hash_dict={_filepath_fn(root, split): MD5[split]},\n hash_type=\"sha256\",\n )\n cache_compressed_dp = HttpReader(cache_compressed_dp).end_caching(mode=\"wb\", same_filepath_fn=True)\n\n cache_compressed_dp_1, cache_compressed_dp_2 = cache_compressed_dp.fork(num_instances=2)\n\n src_cache_decompressed_dp = cache_compressed_dp_1.on_disk_cache(\n filepath_fn=partial(_decompressed_filepath_fn, root, split, language_pair, 0)\n )\n src_cache_decompressed_dp = (\n FileOpener(src_cache_decompressed_dp, mode=\"b\")\n .load_from_tar()\n .filter(partial(_filter_fn, split, language_pair, 0))\n )\n src_cache_decompressed_dp = src_cache_decompressed_dp.end_caching(mode=\"wb\", same_filepath_fn=True)\n\n tgt_cache_decompressed_dp = cache_compressed_dp_2.on_disk_cache(\n filepath_fn=partial(_decompressed_filepath_fn, root, split, language_pair, 1)\n )\n tgt_cache_decompressed_dp = (\n FileOpener(tgt_cache_decompressed_dp, mode=\"b\")\n .load_from_tar()\n .filter(partial(_filter_fn, split, language_pair, 1))\n )\n tgt_cache_decompressed_dp = tgt_cache_decompressed_dp.end_caching(mode=\"wb\", same_filepath_fn=True)\n\n src_data_dp = FileOpener(src_cache_decompressed_dp, encoding=\"utf-8\").readlines(\n return_path=False, strip_newline=True\n )\n tgt_data_dp = FileOpener(tgt_cache_decompressed_dp, encoding=\"utf-8\").readlines(\n return_path=False, strip_newline=True\n )\n\n return src_data_dp.zip(tgt_data_dp).shuffle().set_shuffle(False).sharding_filter()\n", "path": "torchtext/datasets/multi30k.py"}]} | 1,996 | 324 |
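Editor's note on the record above: the dict is named `MD5`, but the digests it holds are SHA-256 values, matching `hash_type="sha256"` in the caching step. The same integrity check, stand-alone (local filename is illustrative):

```python
import hashlib

EXPECTED = "6d1ca1dba99e2c5dd54cae1226ff11c2551e6ce63527ebb072a1f70f72a5cd36"  # "test" split


def sha256_of(path, chunk=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            digest.update(block)
    return digest.hexdigest()


# assert sha256_of("mmt16_task1_test.tar.gz") == EXPECTED
```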
gh_patches_debug_9212 | rasdani/github-patches | git_diff | jazzband__pip-tools-956 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add python 3.8 support
#### What's the problem this feature will solve?
<!-- What are you trying to do, that you are unable to achieve with pip-tools as it currently stands? -->
Python 3.8 is released, so it's time to support it.
#### Describe the solution you'd like
<!-- A clear and concise description of what you want to happen. -->
1. add "py37" env to `tox.ini`
1. remove 3.8-dev from `.travis.yml`
1. add "Programming Language :: Python :: 3.8" classifier to `setup.py`
1. add "3.8" dimension to `.travis.yml` (supported, see https://travis-ci.community/t/add-python-3-8-support/5463)
1. add "py37" dimension to `.appveyor.yml` (not supported yet, but will be on the nex image update, tracking issue: https://github.com/appveyor/ci/issues/3142)
1. add "3.8" to python-version list in `.github/workflows/cron.yml` (not supported yet, tracking issue: https://github.com/actions/setup-python/issues/30)
<!-- Provide examples of real-world use cases that this would enable and how it solves the problem described above. -->
#### Alternative Solutions
<!-- Have you tried to workaround the problem using pip-tools or other tools? Or a different approach to solving this issue? Please elaborate here. -->
N/A
#### Additional context
<!-- Add any other context, links, etc. about the feature here. -->
https://discuss.python.org/t/python-3-8-0-is-now-available/2478
--- END ISSUE ---
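Editor's note: steps 1 and 3 of the checklist are the ones that drift apart most easily; a naive consistency check one could run in CI (regexes deliberately simple; file names per this repository):

```python
import re


def classifier_versions(path="setup.py"):
    with open(path) as f:
        text = f.read()
    return set(re.findall(r"Programming Language :: Python :: (\d+\.\d+)", text))


def tox_versions(path="tox.ini"):
    with open(path) as f:
        text = f.read()
    # py27 -> "2.7", py38 -> "3.8"; word boundaries keep "pypy" from matching
    return {f"{m}.{n}" for m, n in re.findall(r"\bpy(\d)(\d+)\b", text)}


missing = tox_versions() - classifier_versions()
assert not missing, "classifiers missing for: %s" % sorted(missing)
```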
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """
2 pip-tools keeps your pinned dependencies fresh.
3 """
4 from os.path import abspath, dirname, join
5
6 from setuptools import find_packages, setup
7
8
9 def read_file(filename):
10 """Read the contents of a file located relative to setup.py"""
11 with open(join(abspath(dirname(__file__)), filename)) as thefile:
12 return thefile.read()
13
14
15 setup(
16 name="pip-tools",
17 use_scm_version=True,
18 url="https://github.com/jazzband/pip-tools/",
19 license="BSD",
20 author="Vincent Driessen",
21 author_email="[email protected]",
22 description=__doc__.strip(),
23 long_description=read_file("README.rst"),
24 long_description_content_type="text/x-rst",
25 packages=find_packages(exclude=["tests"]),
26 package_data={},
27 python_requires=">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*",
28 setup_requires=["setuptools_scm"],
29 install_requires=["click>=6", "six"],
30 zip_safe=False,
31 entry_points={
32 "console_scripts": [
33 "pip-compile = piptools.scripts.compile:cli",
34 "pip-sync = piptools.scripts.sync:cli",
35 ]
36 },
37 platforms="any",
38 classifiers=[
39 "Development Status :: 5 - Production/Stable",
40 "Intended Audience :: Developers",
41 "Intended Audience :: System Administrators",
42 "License :: OSI Approved :: BSD License",
43 "Operating System :: OS Independent",
44 "Programming Language :: Python",
45 "Programming Language :: Python :: 2",
46 "Programming Language :: Python :: 2.7",
47 "Programming Language :: Python :: 3",
48 "Programming Language :: Python :: 3.5",
49 "Programming Language :: Python :: 3.6",
50 "Programming Language :: Python :: 3.7",
51 "Programming Language :: Python :: Implementation :: CPython",
52 "Programming Language :: Python :: Implementation :: PyPy",
53 "Topic :: System :: Systems Administration",
54 ],
55 )
56
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -48,6 +48,7 @@
"Programming Language :: Python :: 3.5",
"Programming Language :: Python :: 3.6",
"Programming Language :: Python :: 3.7",
+ "Programming Language :: Python :: 3.8",
"Programming Language :: Python :: Implementation :: CPython",
"Programming Language :: Python :: Implementation :: PyPy",
"Topic :: System :: Systems Administration",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -48,6 +48,7 @@\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n+ \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: System :: Systems Administration\",\n", "issue": "Add python 3.8 support\n#### What's the problem this feature will solve?\r\n<!-- What are you trying to do, that you are unable to achieve with pip-tools as it currently stands? -->\r\n\r\nPython 3.8 is released, so it's time to support it. \r\n\r\n#### Describe the solution you'd like\r\n<!-- A clear and concise description of what you want to happen. -->\r\n\r\n1. add \"py37\" env to `tox.ini`\r\n1. remove 3.8-dev from `.travis.yml`\r\n1. add \"Programming Language :: Python :: 3.8\" classifier to `setup.py`\r\n1. add \"3.8\" dimension to `.travis.yml` (supported, see https://travis-ci.community/t/add-python-3-8-support/5463)\r\n1. add \"py37\" dimension to `.appveyor.yml` (not supported yet, but will be on the nex image update, tracking issue: https://github.com/appveyor/ci/issues/3142)\r\n1. add \"3.8\" to python-version list in `.github/workflows/cron.yml` (not supported yet, tracking issue: https://github.com/actions/setup-python/issues/30)\r\n\r\n<!-- Provide examples of real-world use cases that this would enable and how it solves the problem described above. -->\r\n\r\n#### Alternative Solutions\r\n<!-- Have you tried to workaround the problem using pip-tools or other tools? Or a different approach to solving this issue? Please elaborate here. -->\r\n\r\nN/A\r\n\r\n#### Additional context\r\n<!-- Add any other context, links, etc. about the feature here. 
-->\r\n\r\nhttps://discuss.python.org/t/python-3-8-0-is-now-available/2478\n", "before_files": [{"content": "\"\"\"\npip-tools keeps your pinned dependencies fresh.\n\"\"\"\nfrom os.path import abspath, dirname, join\n\nfrom setuptools import find_packages, setup\n\n\ndef read_file(filename):\n \"\"\"Read the contents of a file located relative to setup.py\"\"\"\n with open(join(abspath(dirname(__file__)), filename)) as thefile:\n return thefile.read()\n\n\nsetup(\n name=\"pip-tools\",\n use_scm_version=True,\n url=\"https://github.com/jazzband/pip-tools/\",\n license=\"BSD\",\n author=\"Vincent Driessen\",\n author_email=\"[email protected]\",\n description=__doc__.strip(),\n long_description=read_file(\"README.rst\"),\n long_description_content_type=\"text/x-rst\",\n packages=find_packages(exclude=[\"tests\"]),\n package_data={},\n python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*\",\n setup_requires=[\"setuptools_scm\"],\n install_requires=[\"click>=6\", \"six\"],\n zip_safe=False,\n entry_points={\n \"console_scripts\": [\n \"pip-compile = piptools.scripts.compile:cli\",\n \"pip-sync = piptools.scripts.sync:cli\",\n ]\n },\n platforms=\"any\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: System :: Systems Administration\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"\npip-tools keeps your pinned dependencies fresh.\n\"\"\"\nfrom os.path import abspath, dirname, join\n\nfrom setuptools import find_packages, setup\n\n\ndef read_file(filename):\n \"\"\"Read the contents of a file located relative to setup.py\"\"\"\n with open(join(abspath(dirname(__file__)), filename)) as thefile:\n return thefile.read()\n\n\nsetup(\n name=\"pip-tools\",\n use_scm_version=True,\n url=\"https://github.com/jazzband/pip-tools/\",\n license=\"BSD\",\n author=\"Vincent Driessen\",\n author_email=\"[email protected]\",\n description=__doc__.strip(),\n long_description=read_file(\"README.rst\"),\n long_description_content_type=\"text/x-rst\",\n packages=find_packages(exclude=[\"tests\"]),\n package_data={},\n python_requires=\">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*\",\n setup_requires=[\"setuptools_scm\"],\n install_requires=[\"click>=6\", \"six\"],\n zip_safe=False,\n entry_points={\n \"console_scripts\": [\n \"pip-compile = piptools.scripts.compile:cli\",\n \"pip-sync = piptools.scripts.sync:cli\",\n ]\n },\n platforms=\"any\",\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: System Administrators\",\n \"License :: OSI Approved :: BSD License\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 
3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Topic :: System :: Systems Administration\",\n ],\n)\n", "path": "setup.py"}]} | 1,157 | 114 |
gh_patches_debug_23998 | rasdani/github-patches | git_diff | dotkom__onlineweb4-203 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Auth templates do not use crispy forms
https://github.com/dotKom/onlineweb4/commit/26ae7847c2907895e6842061a848a2c0f47090a0
Håvard did some weird shit. Undo this and test that it still works.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/authentication/urls.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 from django.conf.urls import patterns, url
4
5 urlpatterns = patterns('apps.authentication.views',
6 url(r'^login/$', 'login', name='auth_login'),
7 url(r'^logout/$', 'logout', name='auth_logout'),
8 url(r'^register/$', 'register', name='auth_register'),
9 url(r'^verify/(\w+)/$', 'verify', name='auth_verify'),
10 url(r'^recover/$', 'recover', name='auth_recover'),
11 url(r'^set_password/(\w+)/$', 'set_password', name='auth_set_password'),
12 )
13
```
Path: `apps/authentication/forms.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 import datetime
4 import re
5
6 from django import forms
7 from django.contrib import auth
8
9 from apps.authentication.models import OnlineUser as User
10
11 class LoginForm(forms.Form):
12 username = forms.CharField(widget=forms.TextInput(), label="Username", max_length=50)
13 password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Password")
14 user = None
15
16 def clean(self):
17 if self._errors:
18 return
19
20 user = auth.authenticate(username=self.cleaned_data['username'], password=self.cleaned_data['password'])
21
22 if user:
23 if user.is_active:
24 self.user = user
25 else:
26 self._errors['username'] = self.error_class(["Your account is inactive, try to recover it."])
27 else:
28 self._errors['username'] = self.error_class(["The account does not exist, or username/password combination is incorrect."])
29 return self.cleaned_data
30
31 def login(self, request):
32 try:
33 User.objects.get(username=request.POST['username'])
34 except:
35 return False
36 if self.is_valid():
37 auth.login(request, self.user)
38 request.session.set_expiry(0)
39 return True
40 return False
41
42 class RegisterForm(forms.Form):
43 username = forms.CharField(label="Username", max_length=20)
44 first_name = forms.CharField(label="First name", max_length=50)
45 last_name = forms.CharField(label="Last name", max_length=50)
46 email = forms.EmailField(label="Email", max_length=50)
47 password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Password")
48 repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Repeat password")
49 address = forms.CharField(label="Address", max_length=50)
50 zip_code = forms.CharField(label="ZIP code", max_length=4)
51 phone = forms.CharField(label="Phone number", max_length=20)
52
53 def clean(self):
54 super(RegisterForm, self).clean()
55 if self.is_valid():
56 cleaned_data = self.cleaned_data
57
58 # Check passwords
59 if cleaned_data['password'] != cleaned_data['repeat_password']:
60 self._errors['repeat_password'] = self.error_class(["Passwords did not match."])
61
62 # Check username
63 username = cleaned_data['username']
64 if User.objects.filter(username=username).count() > 0:
65 self._errors['username'] = self.error_class(["There is already a user with that username."])
66 if not re.match("^[a-zA-Z0-9_-]+$", username):
67 self._errors['username'] = self.error_class(["Your desired username contains illegal characters. Valid: a-Z 0-9 - _"])
68
69 # Check email
70 email = cleaned_data['email']
71 if User.objects.filter(email=email).count() > 0:
72 self._errors['email'] = self.error_class(["There is already a user with that email."])
73
74 # ZIP code digits only
75 zip_code = cleaned_data['zip_code']
76 if len(zip_code) != 4 or not zip_code.isdigit():
77 self._errors['zip_code'] = self.error_class(["The ZIP code must be 4 digit number."])
78
79 return cleaned_data
80
81 class RecoveryForm(forms.Form):
82 email = forms.EmailField(label="Email", max_length=50)
83
84 class ChangePasswordForm(forms.Form):
85 old_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Old password", required=False)
86 new_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="New password")
87 repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Repeat new password")
88
89 def clean(self):
90 super(ChangePasswordForm, self).clean()
91 if self.is_valid():
92 cleaned_data = self.cleaned_data
93
94 # Check passwords
95 if cleaned_data['new_password'] != cleaned_data['repeat_password']:
96 self._errors['repeat_password'] = self.error_class(["Passwords did not match."])
97
98 return cleaned_data
99
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/apps/authentication/forms.py b/apps/authentication/forms.py
--- a/apps/authentication/forms.py
+++ b/apps/authentication/forms.py
@@ -82,7 +82,6 @@
email = forms.EmailField(label="Email", max_length=50)
class ChangePasswordForm(forms.Form):
- old_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Old password", required=False)
new_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="New password")
repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label="Repeat new password")
diff --git a/apps/authentication/urls.py b/apps/authentication/urls.py
--- a/apps/authentication/urls.py
+++ b/apps/authentication/urls.py
@@ -6,7 +6,7 @@
url(r'^login/$', 'login', name='auth_login'),
url(r'^logout/$', 'logout', name='auth_logout'),
url(r'^register/$', 'register', name='auth_register'),
- url(r'^verify/(\w+)/$', 'verify', name='auth_verify'),
+ url(r'^verify/(?P<token>\w+)/$', 'verify', name='auth_verify'),
url(r'^recover/$', 'recover', name='auth_recover'),
- url(r'^set_password/(\w+)/$', 'set_password', name='auth_set_password'),
+ url(r'^set_password/(?P<token>\w+)/$', 'set_password', name='auth_set_password'),
)
| {"golden_diff": "diff --git a/apps/authentication/forms.py b/apps/authentication/forms.py\n--- a/apps/authentication/forms.py\n+++ b/apps/authentication/forms.py\n@@ -82,7 +82,6 @@\n email = forms.EmailField(label=\"Email\", max_length=50)\n \n class ChangePasswordForm(forms.Form):\n- old_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Old password\", required=False)\n new_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"New password\")\n repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Repeat new password\")\n \ndiff --git a/apps/authentication/urls.py b/apps/authentication/urls.py\n--- a/apps/authentication/urls.py\n+++ b/apps/authentication/urls.py\n@@ -6,7 +6,7 @@\n url(r'^login/$', 'login', name='auth_login'),\n url(r'^logout/$', 'logout', name='auth_logout'),\n url(r'^register/$', 'register', name='auth_register'),\n- url(r'^verify/(\\w+)/$', 'verify', name='auth_verify'),\n+ url(r'^verify/(?P<token>\\w+)/$', 'verify', name='auth_verify'),\n url(r'^recover/$', 'recover', name='auth_recover'),\n- url(r'^set_password/(\\w+)/$', 'set_password', name='auth_set_password'),\n+ url(r'^set_password/(?P<token>\\w+)/$', 'set_password', name='auth_set_password'),\n )\n", "issue": "Auth templates does not use crispy forms\nhttps://github.com/dotKom/onlineweb4/commit/26ae7847c2907895e6842061a848a2c0f47090a0\n\nH\u00e5vard did some weird shit. Undo this and test that it still works.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom django.conf.urls import patterns, url\n\nurlpatterns = patterns('apps.authentication.views',\n url(r'^login/$', 'login', name='auth_login'),\n url(r'^logout/$', 'logout', name='auth_logout'),\n url(r'^register/$', 'register', name='auth_register'),\n url(r'^verify/(\\w+)/$', 'verify', name='auth_verify'),\n url(r'^recover/$', 'recover', name='auth_recover'),\n url(r'^set_password/(\\w+)/$', 'set_password', name='auth_set_password'),\n)\n", "path": "apps/authentication/urls.py"}, {"content": "# -*- coding: utf-8 -*-\n\nimport datetime\nimport re\n\nfrom django import forms\nfrom django.contrib import auth\n\nfrom apps.authentication.models import OnlineUser as User\n\nclass LoginForm(forms.Form):\n username = forms.CharField(widget=forms.TextInput(), label=\"Username\", max_length=50)\n password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Password\")\n user = None\n\n def clean(self):\n if self._errors:\n return\n \n user = auth.authenticate(username=self.cleaned_data['username'], password=self.cleaned_data['password'])\n\n if user:\n if user.is_active:\n self.user = user\n else:\n self._errors['username'] = self.error_class([\"Your account is inactive, try to recover it.\"])\n else:\n self._errors['username'] = self.error_class([\"The account does not exist, or username/password combination is incorrect.\"])\n return self.cleaned_data\n\n def login(self, request):\n try:\n User.objects.get(username=request.POST['username'])\n except:\n return False\n if self.is_valid():\n auth.login(request, self.user)\n request.session.set_expiry(0)\n return True\n return False\n\nclass RegisterForm(forms.Form):\n username = forms.CharField(label=\"Username\", max_length=20)\n first_name = forms.CharField(label=\"First name\", max_length=50)\n last_name = forms.CharField(label=\"Last name\", max_length=50)\n email = forms.EmailField(label=\"Email\", max_length=50)\n password = 
forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Password\")\n repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Repeat password\")\n address = forms.CharField(label=\"Address\", max_length=50)\n zip_code = forms.CharField(label=\"ZIP code\", max_length=4)\n phone = forms.CharField(label=\"Phone number\", max_length=20)\n \n def clean(self):\n super(RegisterForm, self).clean()\n if self.is_valid():\n cleaned_data = self.cleaned_data\n\n # Check passwords\n if cleaned_data['password'] != cleaned_data['repeat_password']:\n self._errors['repeat_password'] = self.error_class([\"Passwords did not match.\"])\n\n # Check username\n username = cleaned_data['username']\n if User.objects.filter(username=username).count() > 0:\n self._errors['username'] = self.error_class([\"There is already a user with that username.\"])\n if not re.match(\"^[a-zA-Z0-9_-]+$\", username):\n self._errors['username'] = self.error_class([\"Your desired username contains illegal characters. Valid: a-Z 0-9 - _\"])\n\n # Check email\n email = cleaned_data['email']\n if User.objects.filter(email=email).count() > 0:\n self._errors['email'] = self.error_class([\"There is already a user with that email.\"])\n\n # ZIP code digits only\n zip_code = cleaned_data['zip_code']\n if len(zip_code) != 4 or not zip_code.isdigit():\n self._errors['zip_code'] = self.error_class([\"The ZIP code must be 4 digit number.\"])\n\n return cleaned_data \n\nclass RecoveryForm(forms.Form):\n email = forms.EmailField(label=\"Email\", max_length=50)\n\nclass ChangePasswordForm(forms.Form):\n old_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Old password\", required=False)\n new_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"New password\")\n repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Repeat new password\")\n\n def clean(self):\n super(ChangePasswordForm, self).clean()\n if self.is_valid():\n cleaned_data = self.cleaned_data\n\n # Check passwords\n if cleaned_data['new_password'] != cleaned_data['repeat_password']:\n self._errors['repeat_password'] = self.error_class([\"Passwords did not match.\"])\n\n return cleaned_data\n", "path": "apps/authentication/forms.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom django.conf.urls import patterns, url\n\nurlpatterns = patterns('apps.authentication.views',\n url(r'^login/$', 'login', name='auth_login'),\n url(r'^logout/$', 'logout', name='auth_logout'),\n url(r'^register/$', 'register', name='auth_register'),\n url(r'^verify/(?P<token>\\w+)/$', 'verify', name='auth_verify'),\n url(r'^recover/$', 'recover', name='auth_recover'),\n url(r'^set_password/(?P<token>\\w+)/$', 'set_password', name='auth_set_password'),\n)\n", "path": "apps/authentication/urls.py"}, {"content": "# -*- coding: utf-8 -*-\n\nimport datetime\nimport re\n\nfrom django import forms\nfrom django.contrib import auth\n\nfrom apps.authentication.models import OnlineUser as User\n\nclass LoginForm(forms.Form):\n username = forms.CharField(widget=forms.TextInput(), label=\"Username\", max_length=50)\n password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Password\")\n user = None\n\n def clean(self):\n if self._errors:\n return\n \n user = auth.authenticate(username=self.cleaned_data['username'], password=self.cleaned_data['password'])\n\n if user:\n if user.is_active:\n self.user = user\n else:\n 
self._errors['username'] = self.error_class([\"Your account is inactive, try to recover it.\"])\n else:\n self._errors['username'] = self.error_class([\"The account does not exist, or username/password combination is incorrect.\"])\n return self.cleaned_data\n\n def login(self, request):\n try:\n User.objects.get(username=request.POST['username'])\n except:\n return False\n if self.is_valid():\n auth.login(request, self.user)\n request.session.set_expiry(0)\n return True\n return False\n\nclass RegisterForm(forms.Form):\n username = forms.CharField(label=\"Username\", max_length=20)\n first_name = forms.CharField(label=\"First name\", max_length=50)\n last_name = forms.CharField(label=\"Last name\", max_length=50)\n email = forms.EmailField(label=\"Email\", max_length=50)\n password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Password\")\n repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Repeat password\")\n address = forms.CharField(label=\"Address\", max_length=50)\n zip_code = forms.CharField(label=\"ZIP code\", max_length=4)\n phone = forms.CharField(label=\"Phone number\", max_length=20)\n \n def clean(self):\n super(RegisterForm, self).clean()\n if self.is_valid():\n cleaned_data = self.cleaned_data\n\n # Check passwords\n if cleaned_data['password'] != cleaned_data['repeat_password']:\n self._errors['repeat_password'] = self.error_class([\"Passwords did not match.\"])\n\n # Check username\n username = cleaned_data['username']\n if User.objects.filter(username=username).count() > 0:\n self._errors['username'] = self.error_class([\"There is already a user with that username.\"])\n if not re.match(\"^[a-zA-Z0-9_-]+$\", username):\n self._errors['username'] = self.error_class([\"Your desired username contains illegal characters. Valid: a-Z 0-9 - _\"])\n\n # Check email\n email = cleaned_data['email']\n if User.objects.filter(email=email).count() > 0:\n self._errors['email'] = self.error_class([\"There is already a user with that email.\"])\n\n # ZIP code digits only\n zip_code = cleaned_data['zip_code']\n if len(zip_code) != 4 or not zip_code.isdigit():\n self._errors['zip_code'] = self.error_class([\"The ZIP code must be 4 digit number.\"])\n\n return cleaned_data \n\nclass RecoveryForm(forms.Form):\n email = forms.EmailField(label=\"Email\", max_length=50)\n\nclass ChangePasswordForm(forms.Form):\n new_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"New password\")\n repeat_password = forms.CharField(widget=forms.PasswordInput(render_value=False), label=\"Repeat new password\")\n\n def clean(self):\n super(ChangePasswordForm, self).clean()\n if self.is_valid():\n cleaned_data = self.cleaned_data\n\n # Check passwords\n if cleaned_data['new_password'] != cleaned_data['repeat_password']:\n self._errors['repeat_password'] = self.error_class([\"Passwords did not match.\"])\n\n return cleaned_data\n", "path": "apps/authentication/forms.py"}]} | 1,543 | 315 |
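An aside on the URL changes in this record's golden diff: moving from positional groups like `(\w+)` to named groups like `(?P<token>\w+)` makes Django pass the captured value as a keyword argument, so the view signature and any `reverse()` call become explicit about what is being captured. A minimal sketch (the `verify` view body is hypothetical, not the project's actual implementation):

```python
from django.conf.urls import patterns, url

urlpatterns = patterns('apps.authentication.views',
    # named group -> the view receives token=... as a keyword argument
    url(r'^verify/(?P<token>\w+)/$', 'verify', name='auth_verify'),
)

# views.py (separate file)
def verify(request, token):
    # look up the registration token and activate the matching user
    ...
```

With the named group in place, `reverse('auth_verify', kwargs={'token': some_token})` also works, which positional groups only support via `args`.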
gh_patches_debug_18058 | rasdani/github-patches | git_diff | cloudtools__troposphere-1287 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add 'Kind' property to AWS::AppSync::Resolver
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-appsync-resolver.html#cfn-appsync-resolver-kind
The Kind property is required when using the new PipelineConfig feature for AppSync.
There are only two allowable values: PIPELINE or UNIT. The property is not required when using the standard resolvers.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/appsync.py`
Content:
```
1 # Copyright (c) 2012-2017, Mark Peek <[email protected]>
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5
6 from . import AWSObject, AWSProperty
7 from .validators import boolean, integer
8
9
10 class ApiKey(AWSObject):
11 resource_type = "AWS::AppSync::ApiKey"
12
13 props = {
14 'ApiId': (basestring, True),
15 'Description': (basestring, False),
16 'Expires': (integer, False),
17 }
18
19
20 class DynamoDBConfig(AWSProperty):
21 props = {
22 'AwsRegion': (basestring, True),
23 'TableName': (basestring, True),
24 'UseCallerCredentials': (boolean, False),
25 }
26
27
28 class ElasticsearchConfig(AWSProperty):
29 props = {
30 'AwsRegion': (basestring, True),
31 'Endpoint': (basestring, True),
32 }
33
34
35 class AwsIamConfig(AWSProperty):
36 props = {
37 'SigningRegion': (basestring, False),
38 'SigningServiceName': (basestring, False),
39 }
40
41
42 class AuthorizationConfig(AWSProperty):
43 props = {
44 'AuthorizationType': (basestring, True),
45 'AwsIamConfig': (AwsIamConfig, False),
46 }
47
48
49 class HttpConfig(AWSProperty):
50 props = {
51 'AuthorizationConfig': (AuthorizationConfig, False),
52 'Endpoint': (basestring, True),
53 }
54
55
56 class LambdaConfig(AWSProperty):
57 props = {
58 'LambdaFunctionArn': (basestring, True),
59 }
60
61
62 class RdsHttpEndpointConfig(AWSProperty):
63 props = {
64 'AwsRegion': (basestring, False),
65 'DbClusterIdentifier': (basestring, False),
66 'DatabaseName': (basestring, False),
67 'Schema': (basestring, False),
68 'AwsSecretStoreArn': (basestring, False),
69 }
70
71
72 class RelationalDatabaseConfig(AWSProperty):
73 props = {
74 'RelationalDatasourceType': (basestring, False),
75 'RdsHttpEndpointConfig': (RdsHttpEndpointConfig, False),
76 }
77
78
79 class DataSource(AWSObject):
80 resource_type = "AWS::AppSync::DataSource"
81
82 props = {
83 'ApiId': (basestring, True),
84 'Description': (basestring, False),
85 'DynamoDBConfig': (DynamoDBConfig, False),
86 'ElasticsearchConfig': (ElasticsearchConfig, False),
87 'HttpConfig': (HttpConfig, False),
88 'LambdaConfig': (LambdaConfig, False),
89 'Name': (basestring, True),
90 'ServiceRoleArn': (basestring, False),
91 'Type': (basestring, True),
92 'RelationalDatabaseConfig': (RelationalDatabaseConfig, False),
93 }
94
95
96 class LogConfig(AWSProperty):
97 props = {
98 'CloudWatchLogsRoleArn': (basestring, False),
99 'FieldLogLevel': (basestring, False),
100 }
101
102
103 class OpenIDConnectConfig(AWSProperty):
104 props = {
105 'AuthTTL': (float, False),
106 'ClientId': (basestring, False),
107 'IatTTL': (float, False),
108 'Issuer': (basestring, True),
109 }
110
111
112 class UserPoolConfig(AWSProperty):
113 props = {
114 'AppIdClientRegex': (basestring, False),
115 'AwsRegion': (basestring, False),
116 'DefaultAction': (basestring, False),
117 'UserPoolId': (basestring, False),
118 }
119
120
121 class GraphQLApi(AWSObject):
122 resource_type = "AWS::AppSync::GraphQLApi"
123
124 props = {
125 'AuthenticationType': (basestring, True),
126 'LogConfig': (LogConfig, False),
127 'Name': (basestring, True),
128 'OpenIDConnectConfig': (OpenIDConnectConfig, False),
129 'UserPoolConfig': (UserPoolConfig, False),
130 }
131
132
133 class GraphQLSchema(AWSObject):
134 resource_type = "AWS::AppSync::GraphQLSchema"
135
136 props = {
137 'ApiId': (basestring, True),
138 'Definition': (basestring, False),
139 'DefinitionS3Location': (basestring, False),
140 }
141
142
143 class PipelineConfig(AWSProperty):
144 props = {
145 'Functions': ([basestring], False),
146 }
147
148
149 class Resolver(AWSObject):
150 resource_type = "AWS::AppSync::Resolver"
151
152 props = {
153 'ApiId': (basestring, True),
154 'DataSourceName': (basestring, True),
155 'FieldName': (basestring, True),
156 'PipelineConfig': (PipelineConfig, False),
157 'RequestMappingTemplate': (basestring, False),
158 'RequestMappingTemplateS3Location': (basestring, False),
159 'ResponseMappingTemplate': (basestring, False),
160 'ResponseMappingTemplateS3Location': (basestring, False),
161 'TypeName': (basestring, True),
162 }
163
164
165 class FunctionConfiguration(AWSObject):
166 resource_type = "AWS::AppSync::FunctionConfiguration"
167
168 props = {
169 'ApiId': (basestring, True),
170 'Name': (basestring, False),
171 'Description': (basestring, False),
172 'DataSourceName': (basestring, False),
173 'FunctionVersion': (basestring, False),
174 'RequestMappingTemplate': (basestring, False),
175 'RequestMappingTemplateS3Location': (basestring, False),
176 'ResponseMappingTemplate': (basestring, False),
177 'ResponseMappingTemplateS3Location': (basestring, False),
178 }
179
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/appsync.py b/troposphere/appsync.py
--- a/troposphere/appsync.py
+++ b/troposphere/appsync.py
@@ -7,6 +7,13 @@
from .validators import boolean, integer
+def resolver_kind_validator(x):
+ valid_types = ["UNIT", "PIPELINE"]
+ if x not in valid_types:
+ raise ValueError("Kind must be one of: %s" % ", ".join(valid_types))
+ return x
+
+
class ApiKey(AWSObject):
resource_type = "AWS::AppSync::ApiKey"
@@ -153,6 +160,7 @@
'ApiId': (basestring, True),
'DataSourceName': (basestring, True),
'FieldName': (basestring, True),
+ 'Kind': (resolver_kind_validator, False),
'PipelineConfig': (PipelineConfig, False),
'RequestMappingTemplate': (basestring, False),
'RequestMappingTemplateS3Location': (basestring, False),
| {"golden_diff": "diff --git a/troposphere/appsync.py b/troposphere/appsync.py\n--- a/troposphere/appsync.py\n+++ b/troposphere/appsync.py\n@@ -7,6 +7,13 @@\n from .validators import boolean, integer\n \n \n+def resolver_kind_validator(x):\n+ valid_types = [\"UNIT\", \"PIPELINE\"]\n+ if x not in valid_types:\n+ raise ValueError(\"Kind must be one of: %s\" % \", \".join(valid_types))\n+ return x\n+\n+\n class ApiKey(AWSObject):\n resource_type = \"AWS::AppSync::ApiKey\"\n \n@@ -153,6 +160,7 @@\n 'ApiId': (basestring, True),\n 'DataSourceName': (basestring, True),\n 'FieldName': (basestring, True),\n+ 'Kind': (resolver_kind_validator, False),\n 'PipelineConfig': (PipelineConfig, False),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n", "issue": "Add 'Kind' property to AWS::AppSync::Resolver\nhttps://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/aws-resource-appsync-resolver.html#cfn-appsync-resolver-kind\r\n\r\nThe Kind property is required when using the new PipelineConfig feature for AppSync.\r\n\r\nThere are only two allowable values, PIPELINE or UNIT. The property is not required if using the standard resolvers.\n", "before_files": [{"content": "# Copyright (c) 2012-2017, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import boolean, integer\n\n\nclass ApiKey(AWSObject):\n resource_type = \"AWS::AppSync::ApiKey\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'Expires': (integer, False),\n }\n\n\nclass DynamoDBConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'TableName': (basestring, True),\n 'UseCallerCredentials': (boolean, False),\n }\n\n\nclass ElasticsearchConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'Endpoint': (basestring, True),\n }\n\n\nclass AwsIamConfig(AWSProperty):\n props = {\n 'SigningRegion': (basestring, False),\n 'SigningServiceName': (basestring, False),\n }\n\n\nclass AuthorizationConfig(AWSProperty):\n props = {\n 'AuthorizationType': (basestring, True),\n 'AwsIamConfig': (AwsIamConfig, False),\n }\n\n\nclass HttpConfig(AWSProperty):\n props = {\n 'AuthorizationConfig': (AuthorizationConfig, False),\n 'Endpoint': (basestring, True),\n }\n\n\nclass LambdaConfig(AWSProperty):\n props = {\n 'LambdaFunctionArn': (basestring, True),\n }\n\n\nclass RdsHttpEndpointConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, False),\n 'DbClusterIdentifier': (basestring, False),\n 'DatabaseName': (basestring, False),\n 'Schema': (basestring, False),\n 'AwsSecretStoreArn': (basestring, False),\n }\n\n\nclass RelationalDatabaseConfig(AWSProperty):\n props = {\n 'RelationalDatasourceType': (basestring, False),\n 'RdsHttpEndpointConfig': (RdsHttpEndpointConfig, False),\n }\n\n\nclass DataSource(AWSObject):\n resource_type = \"AWS::AppSync::DataSource\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'DynamoDBConfig': (DynamoDBConfig, False),\n 'ElasticsearchConfig': (ElasticsearchConfig, False),\n 'HttpConfig': (HttpConfig, False),\n 'LambdaConfig': (LambdaConfig, False),\n 'Name': (basestring, True),\n 'ServiceRoleArn': (basestring, False),\n 'Type': (basestring, True),\n 'RelationalDatabaseConfig': (RelationalDatabaseConfig, False),\n }\n\n\nclass LogConfig(AWSProperty):\n props = {\n 'CloudWatchLogsRoleArn': (basestring, False),\n 'FieldLogLevel': (basestring, False),\n }\n\n\nclass 
OpenIDConnectConfig(AWSProperty):\n props = {\n 'AuthTTL': (float, False),\n 'ClientId': (basestring, False),\n 'IatTTL': (float, False),\n 'Issuer': (basestring, True),\n }\n\n\nclass UserPoolConfig(AWSProperty):\n props = {\n 'AppIdClientRegex': (basestring, False),\n 'AwsRegion': (basestring, False),\n 'DefaultAction': (basestring, False),\n 'UserPoolId': (basestring, False),\n }\n\n\nclass GraphQLApi(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLApi\"\n\n props = {\n 'AuthenticationType': (basestring, True),\n 'LogConfig': (LogConfig, False),\n 'Name': (basestring, True),\n 'OpenIDConnectConfig': (OpenIDConnectConfig, False),\n 'UserPoolConfig': (UserPoolConfig, False),\n }\n\n\nclass GraphQLSchema(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLSchema\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Definition': (basestring, False),\n 'DefinitionS3Location': (basestring, False),\n }\n\n\nclass PipelineConfig(AWSProperty):\n props = {\n 'Functions': ([basestring], False),\n }\n\n\nclass Resolver(AWSObject):\n resource_type = \"AWS::AppSync::Resolver\"\n\n props = {\n 'ApiId': (basestring, True),\n 'DataSourceName': (basestring, True),\n 'FieldName': (basestring, True),\n 'PipelineConfig': (PipelineConfig, False),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n 'ResponseMappingTemplate': (basestring, False),\n 'ResponseMappingTemplateS3Location': (basestring, False),\n 'TypeName': (basestring, True),\n }\n\n\nclass FunctionConfiguration(AWSObject):\n resource_type = \"AWS::AppSync::FunctionConfiguration\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Name': (basestring, False),\n 'Description': (basestring, False),\n 'DataSourceName': (basestring, False),\n 'FunctionVersion': (basestring, False),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n 'ResponseMappingTemplate': (basestring, False),\n 'ResponseMappingTemplateS3Location': (basestring, False),\n }\n", "path": "troposphere/appsync.py"}], "after_files": [{"content": "# Copyright (c) 2012-2017, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . 
import AWSObject, AWSProperty\nfrom .validators import boolean, integer\n\n\ndef resolver_kind_validator(x):\n valid_types = [\"UNIT\", \"PIPELINE\"]\n if x not in valid_types:\n raise ValueError(\"Kind must be one of: %s\" % \", \".join(valid_types))\n return x\n\n\nclass ApiKey(AWSObject):\n resource_type = \"AWS::AppSync::ApiKey\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'Expires': (integer, False),\n }\n\n\nclass DynamoDBConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'TableName': (basestring, True),\n 'UseCallerCredentials': (boolean, False),\n }\n\n\nclass ElasticsearchConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'Endpoint': (basestring, True),\n }\n\n\nclass AwsIamConfig(AWSProperty):\n props = {\n 'SigningRegion': (basestring, False),\n 'SigningServiceName': (basestring, False),\n }\n\n\nclass AuthorizationConfig(AWSProperty):\n props = {\n 'AuthorizationType': (basestring, True),\n 'AwsIamConfig': (AwsIamConfig, False),\n }\n\n\nclass HttpConfig(AWSProperty):\n props = {\n 'AuthorizationConfig': (AuthorizationConfig, False),\n 'Endpoint': (basestring, True),\n }\n\n\nclass LambdaConfig(AWSProperty):\n props = {\n 'LambdaFunctionArn': (basestring, True),\n }\n\n\nclass RdsHttpEndpointConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, False),\n 'DbClusterIdentifier': (basestring, False),\n 'DatabaseName': (basestring, False),\n 'Schema': (basestring, False),\n 'AwsSecretStoreArn': (basestring, False),\n }\n\n\nclass RelationalDatabaseConfig(AWSProperty):\n props = {\n 'RelationalDatasourceType': (basestring, False),\n 'RdsHttpEndpointConfig': (RdsHttpEndpointConfig, False),\n }\n\n\nclass DataSource(AWSObject):\n resource_type = \"AWS::AppSync::DataSource\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'DynamoDBConfig': (DynamoDBConfig, False),\n 'ElasticsearchConfig': (ElasticsearchConfig, False),\n 'HttpConfig': (HttpConfig, False),\n 'LambdaConfig': (LambdaConfig, False),\n 'Name': (basestring, True),\n 'ServiceRoleArn': (basestring, False),\n 'Type': (basestring, True),\n 'RelationalDatabaseConfig': (RelationalDatabaseConfig, False),\n }\n\n\nclass LogConfig(AWSProperty):\n props = {\n 'CloudWatchLogsRoleArn': (basestring, False),\n 'FieldLogLevel': (basestring, False),\n }\n\n\nclass OpenIDConnectConfig(AWSProperty):\n props = {\n 'AuthTTL': (float, False),\n 'ClientId': (basestring, False),\n 'IatTTL': (float, False),\n 'Issuer': (basestring, True),\n }\n\n\nclass UserPoolConfig(AWSProperty):\n props = {\n 'AppIdClientRegex': (basestring, False),\n 'AwsRegion': (basestring, False),\n 'DefaultAction': (basestring, False),\n 'UserPoolId': (basestring, False),\n }\n\n\nclass GraphQLApi(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLApi\"\n\n props = {\n 'AuthenticationType': (basestring, True),\n 'LogConfig': (LogConfig, False),\n 'Name': (basestring, True),\n 'OpenIDConnectConfig': (OpenIDConnectConfig, False),\n 'UserPoolConfig': (UserPoolConfig, False),\n }\n\n\nclass GraphQLSchema(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLSchema\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Definition': (basestring, False),\n 'DefinitionS3Location': (basestring, False),\n }\n\n\nclass PipelineConfig(AWSProperty):\n props = {\n 'Functions': ([basestring], False),\n }\n\n\nclass Resolver(AWSObject):\n resource_type = \"AWS::AppSync::Resolver\"\n\n props = {\n 'ApiId': (basestring, True),\n 'DataSourceName': (basestring, True),\n 
'FieldName': (basestring, True),\n 'Kind': (resolver_kind_validator, False),\n 'PipelineConfig': (PipelineConfig, False),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n 'ResponseMappingTemplate': (basestring, False),\n 'ResponseMappingTemplateS3Location': (basestring, False),\n 'TypeName': (basestring, True),\n }\n\n\nclass FunctionConfiguration(AWSObject):\n resource_type = \"AWS::AppSync::FunctionConfiguration\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Name': (basestring, False),\n 'Description': (basestring, False),\n 'DataSourceName': (basestring, False),\n 'FunctionVersion': (basestring, False),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n 'ResponseMappingTemplate': (basestring, False),\n 'ResponseMappingTemplateS3Location': (basestring, False),\n }\n", "path": "troposphere/appsync.py"}]} | 1,988 | 226 |
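For context, a minimal usage sketch of the patched `Resolver` with the new `Kind` property. Resource names and identifiers below are illustrative; `DataSourceName` is included only because the class shown above still marks it required:

```python
from troposphere import Template
from troposphere.appsync import PipelineConfig, Resolver

t = Template()
t.add_resource(Resolver(
    "PipelineResolver",
    ApiId="example-api-id",
    TypeName="Query",
    FieldName="getPost",
    DataSourceName="NONE",
    Kind="PIPELINE",  # resolver_kind_validator rejects anything but UNIT/PIPELINE
    PipelineConfig=PipelineConfig(Functions=["example-function-id"]),
))
print(t.to_json())
```

Passing, say, `Kind="BATCH"` would raise the `ValueError` from `resolver_kind_validator` while the template is being built, rather than surfacing later as a CloudFormation deployment error.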
gh_patches_debug_60844 | rasdani/github-patches | git_diff | uclapi__uclapi-128 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug] Search People should return HTTP status 400 when query is missing
Currently, the `/search/people` endpoint returns an HTTP 200 code even for an incorrect API request. For example, if you leave out the `query` param, it returns the following body:
```json
{ "error": "No query provided", "ok": false}
```
Yet, the HTTP status code is 200, while it should be 400.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `backend/uclapi/search/views.py`
Content:
```
1 from rest_framework.decorators import api_view
2 from django.http import JsonResponse
3
4 from roombookings.decorators import does_token_exist, log_api_call, throttle
5
6 import os
7 import requests
8
9
10 @api_view(['GET'])
11 @does_token_exist
12 @throttle
13 @log_api_call
14 def people(request):
15 if "query" not in request.GET:
16 return JsonResponse({
17 "ok": False,
18 "error": "No query provided"
19 })
20
21 query = request.GET["query"]
22
23 url = (
24 "{}?{}={}"
25 .format(
26 os.environ["SEARCH_API_URL"],
27 os.environ["SEARCH_API_QUERY_PARAMS"],
28 query,
29 )
30 )
31
32 r = requests.get(url)
33
34 results = r.json()["response"]["resultPacket"]["results"][:20]
35
36 def serialize_person(person):
37 return {
38 "name": person["title"],
39 "department": person["metaData"].get("7", ""),
40 "email": person["metaData"].get("E", ""),
41 "status": person["metaData"].get("g", ""),
42 }
43
44 people = [serialize_person(person) for person in results]
45
46 return JsonResponse({
47 "ok": True,
48 "people": people
49 })
50
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/backend/uclapi/search/views.py b/backend/uclapi/search/views.py
--- a/backend/uclapi/search/views.py
+++ b/backend/uclapi/search/views.py
@@ -13,10 +13,12 @@
@log_api_call
def people(request):
if "query" not in request.GET:
- return JsonResponse({
+ response = JsonResponse({
"ok": False,
- "error": "No query provided"
+ "error": "No query provided."
})
+ response.status_code = 400
+ return response
query = request.GET["query"]
| {"golden_diff": "diff --git a/backend/uclapi/search/views.py b/backend/uclapi/search/views.py\n--- a/backend/uclapi/search/views.py\n+++ b/backend/uclapi/search/views.py\n@@ -13,10 +13,12 @@\n @log_api_call\n def people(request):\n if \"query\" not in request.GET:\n- return JsonResponse({\n+ response = JsonResponse({\n \"ok\": False,\n- \"error\": \"No query provided\"\n+ \"error\": \"No query provided.\"\n })\n+ response.status_code = 400\n+ return response\n \n query = request.GET[\"query\"]\n", "issue": "[Bug] Search People should return HTTP status 400 when query is missing\nCurrently, the `/search/people` returns a HTTP 200 code when even for an incorrect API request. For example, if you leave out the `query` param it returns the following body:\r\n\r\n```json\r\n{ \"error\": \"No query provided\", \"ok\": false}\r\n```\r\n\r\nYet, the HTTP status code is 200, while it should be 400.\r\n\n", "before_files": [{"content": "from rest_framework.decorators import api_view\nfrom django.http import JsonResponse\n\nfrom roombookings.decorators import does_token_exist, log_api_call, throttle\n\nimport os\nimport requests\n\n\n@api_view(['GET'])\n@does_token_exist\n@throttle\n@log_api_call\ndef people(request):\n if \"query\" not in request.GET:\n return JsonResponse({\n \"ok\": False,\n \"error\": \"No query provided\"\n })\n\n query = request.GET[\"query\"]\n\n url = (\n \"{}?{}={}\"\n .format(\n os.environ[\"SEARCH_API_URL\"],\n os.environ[\"SEARCH_API_QUERY_PARAMS\"],\n query,\n )\n )\n\n r = requests.get(url)\n\n results = r.json()[\"response\"][\"resultPacket\"][\"results\"][:20]\n\n def serialize_person(person):\n return {\n \"name\": person[\"title\"],\n \"department\": person[\"metaData\"].get(\"7\", \"\"),\n \"email\": person[\"metaData\"].get(\"E\", \"\"),\n \"status\": person[\"metaData\"].get(\"g\", \"\"),\n }\n\n people = [serialize_person(person) for person in results]\n\n return JsonResponse({\n \"ok\": True,\n \"people\": people\n })\n", "path": "backend/uclapi/search/views.py"}], "after_files": [{"content": "from rest_framework.decorators import api_view\nfrom django.http import JsonResponse\n\nfrom roombookings.decorators import does_token_exist, log_api_call, throttle\n\nimport os\nimport requests\n\n\n@api_view(['GET'])\n@does_token_exist\n@throttle\n@log_api_call\ndef people(request):\n if \"query\" not in request.GET:\n response = JsonResponse({\n \"ok\": False,\n \"error\": \"No query provided.\"\n })\n response.status_code = 400\n return response\n\n query = request.GET[\"query\"]\n\n url = (\n \"{}?{}={}\"\n .format(\n os.environ[\"SEARCH_API_URL\"],\n os.environ[\"SEARCH_API_QUERY_PARAMS\"],\n query,\n )\n )\n\n r = requests.get(url)\n\n results = r.json()[\"response\"][\"resultPacket\"][\"results\"][:20]\n\n def serialize_person(person):\n return {\n \"name\": person[\"title\"],\n \"department\": person[\"metaData\"].get(\"7\", \"\"),\n \"email\": person[\"metaData\"].get(\"E\", \"\"),\n \"status\": person[\"metaData\"].get(\"g\", \"\"),\n }\n\n people = [serialize_person(person) for person in results]\n\n return JsonResponse({\n \"ok\": True,\n \"people\": people\n })\n", "path": "backend/uclapi/search/views.py"}]} | 717 | 136 |
gh_patches_debug_14186 | rasdani/github-patches | git_diff | bokeh__bokeh-4129 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Docs for styling selection overlays
There is currently no way to style the box or poly overlays that various selection tools use.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py`
Content:
```
1 import numpy as np
2
3 from bokeh.models import BoxSelectTool, BoxZoomTool, LassoSelectTool
4 from bokeh.plotting import figure, output_file, show
5
6 output_file("styling_tool_overlays.html")
7
8 x = np.random.random(size=200)
9 y = np.random.random(size=200)
10
11 # Basic plot setup
12 plot = figure(width=400, height=400, title='Select and Zoom',
13 tools="box_select,box_zoom,lasso_select,reset")
14
15 plot.circle(x, y, size=5)
16
17 plot.select_one(BoxSelectTool).overlay.fill_color = "firebrick"
18 plot.select_one(BoxSelectTool).overlay.line_color = None
19
20 plot.select_one(BoxZoomTool).overlay.line_color = "olive"
21 plot.select_one(BoxZoomTool).overlay.line_width = 8
22 plot.select_one(BoxZoomTool).overlay.line_dash = "solid"
23 plot.select_one(BoxZoomTool).overlay.fill_color = None
24
25 plot.select_one(LassoSelectTool).overlay.line_dash = [10, 10]
26
27 show(plot)
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py b/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py
--- a/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py
+++ b/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py
@@ -14,14 +14,18 @@
plot.circle(x, y, size=5)
-plot.select_one(BoxSelectTool).overlay.fill_color = "firebrick"
-plot.select_one(BoxSelectTool).overlay.line_color = None
+select_overlay = plot.select_one(BoxSelectTool).overlay
-plot.select_one(BoxZoomTool).overlay.line_color = "olive"
-plot.select_one(BoxZoomTool).overlay.line_width = 8
-plot.select_one(BoxZoomTool).overlay.line_dash = "solid"
-plot.select_one(BoxZoomTool).overlay.fill_color = None
+select_overlay.fill_color = "firebrick"
+select_overlay.line_color = None
+
+zoom_overlay = plot.select_one(BoxZoomTool).overlay
+
+zoom_overlay.line_color = "olive"
+zoom_overlay.line_width = 8
+zoom_overlay.line_dash = "solid"
+zoom_overlay.fill_color = None
plot.select_one(LassoSelectTool).overlay.line_dash = [10, 10]
-show(plot)
\ No newline at end of file
+show(plot)
| {"golden_diff": "diff --git a/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py b/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py\n--- a/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py\n+++ b/sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py\n@@ -14,14 +14,18 @@\n \n plot.circle(x, y, size=5)\n \n-plot.select_one(BoxSelectTool).overlay.fill_color = \"firebrick\"\n-plot.select_one(BoxSelectTool).overlay.line_color = None\n+select_overlay = plot.select_one(BoxSelectTool).overlay\n \n-plot.select_one(BoxZoomTool).overlay.line_color = \"olive\"\n-plot.select_one(BoxZoomTool).overlay.line_width = 8\n-plot.select_one(BoxZoomTool).overlay.line_dash = \"solid\"\n-plot.select_one(BoxZoomTool).overlay.fill_color = None\n+select_overlay.fill_color = \"firebrick\"\n+select_overlay.line_color = None\n+\n+zoom_overlay = plot.select_one(BoxZoomTool).overlay\n+\n+zoom_overlay.line_color = \"olive\"\n+zoom_overlay.line_width = 8\n+zoom_overlay.line_dash = \"solid\"\n+zoom_overlay.fill_color = None\n \n plot.select_one(LassoSelectTool).overlay.line_dash = [10, 10]\n \n-show(plot)\n\\ No newline at end of file\n+show(plot)\n", "issue": "Docs for styling selection overlays\nThere is currently no way to style the box or poly overlays that various selection tools use. \n\n", "before_files": [{"content": "import numpy as np\n\nfrom bokeh.models import BoxSelectTool, BoxZoomTool, LassoSelectTool\nfrom bokeh.plotting import figure, output_file, show\n\noutput_file(\"styling_tool_overlays.html\")\n\nx = np.random.random(size=200)\ny = np.random.random(size=200)\n\n# Basic plot setup\nplot = figure(width=400, height=400, title='Select and Zoom',\n tools=\"box_select,box_zoom,lasso_select,reset\")\n\nplot.circle(x, y, size=5)\n\nplot.select_one(BoxSelectTool).overlay.fill_color = \"firebrick\"\nplot.select_one(BoxSelectTool).overlay.line_color = None\n\nplot.select_one(BoxZoomTool).overlay.line_color = \"olive\"\nplot.select_one(BoxZoomTool).overlay.line_width = 8\nplot.select_one(BoxZoomTool).overlay.line_dash = \"solid\"\nplot.select_one(BoxZoomTool).overlay.fill_color = None\n\nplot.select_one(LassoSelectTool).overlay.line_dash = [10, 10]\n\nshow(plot)", "path": "sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py"}], "after_files": [{"content": "import numpy as np\n\nfrom bokeh.models import BoxSelectTool, BoxZoomTool, LassoSelectTool\nfrom bokeh.plotting import figure, output_file, show\n\noutput_file(\"styling_tool_overlays.html\")\n\nx = np.random.random(size=200)\ny = np.random.random(size=200)\n\n# Basic plot setup\nplot = figure(width=400, height=400, title='Select and Zoom',\n tools=\"box_select,box_zoom,lasso_select,reset\")\n\nplot.circle(x, y, size=5)\n\nselect_overlay = plot.select_one(BoxSelectTool).overlay\n\nselect_overlay.fill_color = \"firebrick\"\nselect_overlay.line_color = None\n\nzoom_overlay = plot.select_one(BoxZoomTool).overlay\n\nzoom_overlay.line_color = \"olive\"\nzoom_overlay.line_width = 8\nzoom_overlay.line_dash = \"solid\"\nzoom_overlay.fill_color = None\n\nplot.select_one(LassoSelectTool).overlay.line_dash = [10, 10]\n\nshow(plot)\n", "path": "sphinx/source/docs/user_guide/source_examples/styling_tool_overlays.py"}]} | 572 | 302 |
gh_patches_debug_64061 | rasdani/github-patches | git_diff | privacyidea__privacyidea-1978 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Missing requirement in setup.py
The `flask-versioned` package is missing from `setup.py`'s `install_requires` list. When installing privacyIDEA via `setup.py` or `pip`, this will break.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 from __future__ import print_function
3 from setuptools import setup, find_packages
4 import os
5 import stat
6 import sys
7
8 #VERSION="2.1dev4"
9 VERSION="3.2"
10
11 # Taken from kennethreitz/requests/setup.py
12 package_directory = os.path.realpath(os.path.dirname(__file__))
13
14
15 def get_file_contents(file_path):
16 """Get the context of the file using full path name."""
17 content = ""
18 try:
19 full_path = os.path.join(package_directory, file_path)
20 content = open(full_path, 'r').read()
21 except:
22 print("### could not open file {0!r}".format(file_path), file=sys.stderr)
23 return content
24
25
26 def get_file_list(file_path):
27 full_path = os.path.join(package_directory, file_path)
28 file_list = os.listdir(full_path)
29 # now we need to add the path to the files
30 return [file_path + f for f in file_list]
31
32
33 install_requires = ["Flask>=0.10.1",
34 "Flask-Migrate>=1.2.0",
35 "Flask-SQLAlchemy>=2.0",
36 "Flask-Script>=2.0.5",
37 "Jinja2>=2.10.1",
38 "Mako>=0.9.1",
39 "PyMySQL>=0.6.6",
40 "Pillow>=6.2.1",
41 "PyJWT>=1.3.0",
42 "PyYAML>=5.1",
43 "SQLAlchemy>=1.3.0",
44 "Werkzeug>=0.10.4",
45 "alembic>=0.6.7",
46 "bcrypt>=1.1.0",
47 "beautifulsoup4>=4.3.2",
48 "ldap3>=2.6",
49 "netaddr>=0.7.12",
50 "passlib>=1.6.2",
51 "pyOpenSSL>=17.5",
52 "pyrad>=2.0",
53 "qrcode>=6.1",
54 "requests>=2.7.0",
55 "sqlsoup>=0.9.0",
56 "ecdsa>=0.13.3",
57 "lxml>=4.2.5",
58 "python-gnupg>=0.4.4",
59 "defusedxml>=0.4.1",
60 "flask-babel>=0.9",
61 "croniter>=0.3.8",
62 "oauth2client>=2.0.1",
63 "configobj>=5.0.6"
64 ]
65
66 # For python 2.6 we need additional dependency importlib
67 try:
68 import importlib
69 except ImportError:
70 install_requires.append('importlib')
71
72
73 def get_man_pages(dir):
74 """
75 Get man pages in a directory.
76 :param dir:
77 :return: list of file names
78 """
79 files = os.listdir(dir)
80 r_files = []
81 for file in files:
82 if file.endswith(".1"):
83 r_files.append(dir + "/" + file)
84 return r_files
85
86
87 def get_scripts(dir):
88 """
89 Get files that are executable
90 :param dir:
91 :return: list of file names
92 """
93 files = os.listdir(dir)
94 r_files = []
95 for file in files:
96 if os.stat(dir + "/" + file)[stat.ST_MODE] & stat.S_IEXEC:
97 r_files.append(dir + "/" + file)
98 return r_files
99
100
101 setup(
102 name='privacyIDEA',
103 version=VERSION,
104 description='privacyIDEA: identity, multifactor authentication (OTP), '
105 'authorization, audit',
106 author='privacyidea.org',
107 license='AGPLv3',
108 author_email='[email protected]',
109 url='http://www.privacyidea.org',
110 keywords='OTP, two factor authentication, management, security',
111 python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',
112 packages=find_packages(),
113 scripts=["pi-manage"] + get_scripts("tools"),
114 extras_require={
115 'dev': ["Sphinx>=1.3.1",
116 "sphinxcontrib-httpdomain>=1.3.0"],
117 'test': ["coverage>=3.7.1",
118 "mock>=1.0.1",
119 "pyparsing>=2.0.3",
120 "nose>=1.3.4",
121 "responses>=0.4.0",
122 "six>=1.8.0"],
123 },
124 install_requires=install_requires,
125 include_package_data=True,
126 data_files=[('etc/privacyidea/',
127 ['deploy/apache/privacyideaapp.wsgi',
128 'deploy/privacyidea/dictionary']),
129 ('share/man/man1', get_man_pages("tools")),
130 ('lib/privacyidea/migrations',
131 ["migrations/alembic.ini",
132 "migrations/env.py",
133 "migrations/README",
134 "migrations/script.py.mako"]),
135 ('lib/privacyidea/migrations/versions',
136 get_file_list("migrations/versions/")),
137 ('lib/privacyidea/', ['requirements.txt'])
138 ],
139 classifiers=["Framework :: Flask",
140 "License :: OSI Approved :: "
141 "GNU Affero General Public License v3",
142 "Programming Language :: Python",
143 "Development Status :: 5 - Production/Stable",
144 "Topic :: Internet",
145 "Topic :: Security",
146 "Topic :: System ::"
147 " Systems Administration :: Authentication/Directory",
148 'Programming Language :: Python',
149 'Programming Language :: Python :: 2',
150 'Programming Language :: Python :: 2.7',
151 'Programming Language :: Python :: 3',
152 'Programming Language :: Python :: 3.5',
153 'Programming Language :: Python :: 3.6',
154 'Programming Language :: Python :: 3.7'
155 ],
156 #message_extractors={'privacyidea': [
157 # ('**.py', 'python', None),
158 # ('static/**.html', 'html', {'input_encoding': 'utf-8'})]},
159 zip_safe=False,
160 long_description=get_file_contents('README.rst')
161 )
162
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -60,7 +60,8 @@
"flask-babel>=0.9",
"croniter>=0.3.8",
"oauth2client>=2.0.1",
- "configobj>=5.0.6"
+ "configobj>=5.0.6",
+ "flask-versioned>=0.9.4"
]
# For python 2.6 we need additional dependency importlib
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -60,7 +60,8 @@\n \"flask-babel>=0.9\",\n \"croniter>=0.3.8\",\n \"oauth2client>=2.0.1\",\n- \"configobj>=5.0.6\"\n+ \"configobj>=5.0.6\",\n+ \"flask-versioned>=0.9.4\"\n ]\n \n # For python 2.6 we need additional dependency importlib\n", "issue": "Missing requirement in setup.py\nThe `flask-versioned` package is missing in `setup.py`s `install_requires` list. When installing privacyIDEA via `setup.py` or `pip` this will break.\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import print_function\nfrom setuptools import setup, find_packages\nimport os\nimport stat\nimport sys\n\n#VERSION=\"2.1dev4\"\nVERSION=\"3.2\"\n\n# Taken from kennethreitz/requests/setup.py\npackage_directory = os.path.realpath(os.path.dirname(__file__))\n\n\ndef get_file_contents(file_path):\n \"\"\"Get the context of the file using full path name.\"\"\"\n content = \"\"\n try:\n full_path = os.path.join(package_directory, file_path)\n content = open(full_path, 'r').read()\n except:\n print(\"### could not open file {0!r}\".format(file_path), file=sys.stderr)\n return content\n\n\ndef get_file_list(file_path):\n full_path = os.path.join(package_directory, file_path)\n file_list = os.listdir(full_path)\n # now we need to add the path to the files\n return [file_path + f for f in file_list]\n\n\ninstall_requires = [\"Flask>=0.10.1\",\n \"Flask-Migrate>=1.2.0\",\n \"Flask-SQLAlchemy>=2.0\",\n \"Flask-Script>=2.0.5\",\n \"Jinja2>=2.10.1\",\n \"Mako>=0.9.1\",\n \"PyMySQL>=0.6.6\",\n \"Pillow>=6.2.1\",\n \"PyJWT>=1.3.0\",\n \"PyYAML>=5.1\",\n \"SQLAlchemy>=1.3.0\",\n \"Werkzeug>=0.10.4\",\n \"alembic>=0.6.7\",\n \"bcrypt>=1.1.0\",\n \"beautifulsoup4>=4.3.2\",\n \"ldap3>=2.6\",\n \"netaddr>=0.7.12\",\n \"passlib>=1.6.2\",\n \"pyOpenSSL>=17.5\",\n \"pyrad>=2.0\",\n \"qrcode>=6.1\",\n \"requests>=2.7.0\",\n \"sqlsoup>=0.9.0\",\n \"ecdsa>=0.13.3\",\n \"lxml>=4.2.5\",\n \"python-gnupg>=0.4.4\",\n \"defusedxml>=0.4.1\",\n \"flask-babel>=0.9\",\n \"croniter>=0.3.8\",\n \"oauth2client>=2.0.1\",\n \"configobj>=5.0.6\"\n ]\n\n# For python 2.6 we need additional dependency importlib\ntry:\n import importlib\nexcept ImportError:\n install_requires.append('importlib')\n\n\ndef get_man_pages(dir):\n \"\"\"\n Get man pages in a directory.\n :param dir: \n :return: list of file names\n \"\"\"\n files = os.listdir(dir)\n r_files = []\n for file in files:\n if file.endswith(\".1\"):\n r_files.append(dir + \"/\" + file)\n return r_files\n\n\ndef get_scripts(dir):\n \"\"\"\n Get files that are executable\n :param dir: \n :return: list of file names\n \"\"\"\n files = os.listdir(dir)\n r_files = []\n for file in files:\n if os.stat(dir + \"/\" + file)[stat.ST_MODE] & stat.S_IEXEC:\n r_files.append(dir + \"/\" + file)\n return r_files\n\n\nsetup(\n name='privacyIDEA',\n version=VERSION,\n description='privacyIDEA: identity, multifactor authentication (OTP), '\n 'authorization, audit',\n author='privacyidea.org',\n license='AGPLv3',\n author_email='[email protected]',\n url='http://www.privacyidea.org',\n keywords='OTP, two factor authentication, management, security',\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',\n packages=find_packages(),\n scripts=[\"pi-manage\"] + get_scripts(\"tools\"),\n extras_require={\n 'dev': [\"Sphinx>=1.3.1\",\n \"sphinxcontrib-httpdomain>=1.3.0\"],\n 'test': [\"coverage>=3.7.1\",\n \"mock>=1.0.1\",\n \"pyparsing>=2.0.3\",\n \"nose>=1.3.4\",\n \"responses>=0.4.0\",\n 
\"six>=1.8.0\"],\n },\n install_requires=install_requires,\n include_package_data=True,\n data_files=[('etc/privacyidea/',\n ['deploy/apache/privacyideaapp.wsgi',\n 'deploy/privacyidea/dictionary']),\n ('share/man/man1', get_man_pages(\"tools\")),\n ('lib/privacyidea/migrations',\n [\"migrations/alembic.ini\",\n \"migrations/env.py\",\n \"migrations/README\",\n \"migrations/script.py.mako\"]),\n ('lib/privacyidea/migrations/versions',\n get_file_list(\"migrations/versions/\")),\n ('lib/privacyidea/', ['requirements.txt'])\n ],\n classifiers=[\"Framework :: Flask\",\n \"License :: OSI Approved :: \"\n \"GNU Affero General Public License v3\",\n \"Programming Language :: Python\",\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Internet\",\n \"Topic :: Security\",\n \"Topic :: System ::\"\n \" Systems Administration :: Authentication/Directory\",\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7'\n ],\n #message_extractors={'privacyidea': [\n # ('**.py', 'python', None),\n # ('static/**.html', 'html', {'input_encoding': 'utf-8'})]},\n zip_safe=False,\n long_description=get_file_contents('README.rst')\n)\n", "path": "setup.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nfrom __future__ import print_function\nfrom setuptools import setup, find_packages\nimport os\nimport stat\nimport sys\n\n#VERSION=\"2.1dev4\"\nVERSION=\"3.2\"\n\n# Taken from kennethreitz/requests/setup.py\npackage_directory = os.path.realpath(os.path.dirname(__file__))\n\n\ndef get_file_contents(file_path):\n \"\"\"Get the context of the file using full path name.\"\"\"\n content = \"\"\n try:\n full_path = os.path.join(package_directory, file_path)\n content = open(full_path, 'r').read()\n except:\n print(\"### could not open file {0!r}\".format(file_path), file=sys.stderr)\n return content\n\n\ndef get_file_list(file_path):\n full_path = os.path.join(package_directory, file_path)\n file_list = os.listdir(full_path)\n # now we need to add the path to the files\n return [file_path + f for f in file_list]\n\n\ninstall_requires = [\"Flask>=0.10.1\",\n \"Flask-Migrate>=1.2.0\",\n \"Flask-SQLAlchemy>=2.0\",\n \"Flask-Script>=2.0.5\",\n \"Jinja2>=2.10.1\",\n \"Mako>=0.9.1\",\n \"PyMySQL>=0.6.6\",\n \"Pillow>=6.2.1\",\n \"PyJWT>=1.3.0\",\n \"PyYAML>=5.1\",\n \"SQLAlchemy>=1.3.0\",\n \"Werkzeug>=0.10.4\",\n \"alembic>=0.6.7\",\n \"bcrypt>=1.1.0\",\n \"beautifulsoup4>=4.3.2\",\n \"ldap3>=2.6\",\n \"netaddr>=0.7.12\",\n \"passlib>=1.6.2\",\n \"pyOpenSSL>=17.5\",\n \"pyrad>=2.0\",\n \"qrcode>=6.1\",\n \"requests>=2.7.0\",\n \"sqlsoup>=0.9.0\",\n \"ecdsa>=0.13.3\",\n \"lxml>=4.2.5\",\n \"python-gnupg>=0.4.4\",\n \"defusedxml>=0.4.1\",\n \"flask-babel>=0.9\",\n \"croniter>=0.3.8\",\n \"oauth2client>=2.0.1\",\n \"configobj>=5.0.6\",\n \"flask-versioned>=0.9.4\"\n ]\n\n# For python 2.6 we need additional dependency importlib\ntry:\n import importlib\nexcept ImportError:\n install_requires.append('importlib')\n\n\ndef get_man_pages(dir):\n \"\"\"\n Get man pages in a directory.\n :param dir: \n :return: list of file names\n \"\"\"\n files = os.listdir(dir)\n r_files = []\n for file in files:\n if file.endswith(\".1\"):\n r_files.append(dir + \"/\" + file)\n return r_files\n\n\ndef get_scripts(dir):\n \"\"\"\n Get files that are executable\n :param dir: \n :return: list of file 
names\n \"\"\"\n files = os.listdir(dir)\n r_files = []\n for file in files:\n if os.stat(dir + \"/\" + file)[stat.ST_MODE] & stat.S_IEXEC:\n r_files.append(dir + \"/\" + file)\n return r_files\n\n\nsetup(\n name='privacyIDEA',\n version=VERSION,\n description='privacyIDEA: identity, multifactor authentication (OTP), '\n 'authorization, audit',\n author='privacyidea.org',\n license='AGPLv3',\n author_email='[email protected]',\n url='http://www.privacyidea.org',\n keywords='OTP, two factor authentication, management, security',\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',\n packages=find_packages(),\n scripts=[\"pi-manage\"] + get_scripts(\"tools\"),\n extras_require={\n 'dev': [\"Sphinx>=1.3.1\",\n \"sphinxcontrib-httpdomain>=1.3.0\"],\n 'test': [\"coverage>=3.7.1\",\n \"mock>=1.0.1\",\n \"pyparsing>=2.0.3\",\n \"nose>=1.3.4\",\n \"responses>=0.4.0\",\n \"six>=1.8.0\"],\n },\n install_requires=install_requires,\n include_package_data=True,\n data_files=[('etc/privacyidea/',\n ['deploy/apache/privacyideaapp.wsgi',\n 'deploy/privacyidea/dictionary']),\n ('share/man/man1', get_man_pages(\"tools\")),\n ('lib/privacyidea/migrations',\n [\"migrations/alembic.ini\",\n \"migrations/env.py\",\n \"migrations/README\",\n \"migrations/script.py.mako\"]),\n ('lib/privacyidea/migrations/versions',\n get_file_list(\"migrations/versions/\")),\n ('lib/privacyidea/', ['requirements.txt'])\n ],\n classifiers=[\"Framework :: Flask\",\n \"License :: OSI Approved :: \"\n \"GNU Affero General Public License v3\",\n \"Programming Language :: Python\",\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Internet\",\n \"Topic :: Security\",\n \"Topic :: System ::\"\n \" Systems Administration :: Authentication/Directory\",\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7'\n ],\n #message_extractors={'privacyidea': [\n # ('**.py', 'python', None),\n # ('static/**.html', 'html', {'input_encoding': 'utf-8'})]},\n zip_safe=False,\n long_description=get_file_contents('README.rst')\n)\n", "path": "setup.py"}]} | 2,031 | 121 |
gh_patches_debug_8429 | rasdani/github-patches | git_diff | scrapy__scrapy-1644 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
_monkeypatches.py: 'NoneType' object has no attribute 'startswith'
Did not try yet to come up with minimal example to demonstrate this issue but it is reproducible for me:
```
$> datalad --dbg crawl
Traceback (most recent call last):
File "/home/yoh/proj/datalad/datalad/venv-tests/bin/datalad", line 9, in <module>
load_entry_point('datalad==0.1.dev0', 'console_scripts', 'datalad')()
File "/home/yoh/proj/datalad/datalad/datalad/cmdline/main.py", line 199, in main
cmdlineargs.func(cmdlineargs)
File "/home/yoh/proj/datalad/datalad/datalad/interface/base.py", line 151, in call_from_parser
return self(**kwargs)
File "/home/yoh/proj/datalad/datalad/datalad/interface/crawl.py", line 44, in __call__
from datalad.crawler.pipeline import load_pipeline_from_config, get_pipeline_config_path
File "/home/yoh/proj/datalad/datalad/datalad/crawler/pipeline.py", line 21, in <module>
from .newmain import lgr
File "/home/yoh/proj/datalad/datalad/datalad/crawler/newmain.py", line 21, in <module>
from .nodes.matches import *
File "/home/yoh/proj/datalad/datalad/datalad/crawler/nodes/matches.py", line 18, in <module>
from scrapy.selector import Selector
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/scrapy/__init__.py", line 27, in <module>
from . import _monkeypatches
File "/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/scrapy/_monkeypatches.py", line 24, in <module>
and getattr(v, '__module__', '').startswith('twisted'):
AttributeError: 'NoneType' object has no attribute 'startswith'
()
> /home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/scrapy/_monkeypatches.py(24)<module>()
-> and getattr(v, '__module__', '').startswith('twisted'):
(Pdb) l
19 # to prevent bugs like Twisted#7989 while serializing requests
20 import twisted.persisted.styles # NOQA
21 # Remove only entries with twisted serializers for non-twisted types.
22 for k, v in frozenset(copyreg.dispatch_table.items()):
23 if not getattr(k, '__module__', '').startswith('twisted') \
24 -> and getattr(v, '__module__', '').startswith('twisted'):
25 copyreg.dispatch_table.pop(k)
[EOF]
(Pdb) p k
None
(Pdb) p v
None
(Pdb) p copyreg
None
```
Not sure how it comes to that, but the issue is (if I pdb before this madness happens):
```
(Pdb) p getattr(k, '__module__', '')
'__builtin__'
(Pdb) p getattr(v, '__module__', '')
None
(Pdb) p v
<function mpq_reducer at 0x7f474bb4ab90>
(Pdb) p v.__module__
None
(Pdb) p k, v
(<type 'mpq'>, <function mpq_reducer at 0x7f474bb4ab90>)
```
so the assigned `__module__` is None. As a quick resolution, I wrapped the value into a str() call to ensure a str there:
```
and str(getattr(v, '__module__', '')).startswith('twisted'):
```
--- END ISSUE ---
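A note on the gotcha underlying this report: `getattr(obj, name, default)` only falls back to the default when the attribute is *missing*; an attribute that exists but is set to `None` is returned as-is. A minimal sketch of the failure mode and of the `str()` workaround proposed above (`fake_reducer` is an invented stand-in for the `mpq_reducer` seen in the pdb session):

```python
def fake_reducer(x):
    return x

# C-extension callables can legitimately carry __module__ == None,
# which is what the pdb session above shows for mpq_reducer.
fake_reducer.__module__ = None

value = getattr(fake_reducer, '__module__', '')
print(value)  # -> None: the attribute exists, so the '' default is NOT used

# value.startswith('twisted')  # would raise AttributeError, as in the traceback
print(str(value).startswith('twisted'))  # -> False: coercing via str() is safe
```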
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/_monkeypatches.py`
Content:
```
1 import sys
2 from six.moves import copyreg
3
4 if sys.version_info[0] == 2:
5 from urlparse import urlparse
6
7 # workaround for http://bugs.python.org/issue7904 - Python < 2.7
8 if urlparse('s3://bucket/key').netloc != 'bucket':
9 from urlparse import uses_netloc
10 uses_netloc.append('s3')
11
12 # workaround for http://bugs.python.org/issue9374 - Python < 2.7.4
13 if urlparse('s3://bucket/key?key=value').query != 'key=value':
14 from urlparse import uses_query
15 uses_query.append('s3')
16
17
18 # Undo what Twisted's perspective broker adds to pickle register
19 # to prevent bugs like Twisted#7989 while serializing requests
20 import twisted.persisted.styles # NOQA
21 # Remove only entries with twisted serializers for non-twisted types.
22 for k, v in frozenset(copyreg.dispatch_table.items()):
23 if not getattr(k, '__module__', '').startswith('twisted') \
24 and getattr(v, '__module__', '').startswith('twisted'):
25 copyreg.dispatch_table.pop(k)
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scrapy/_monkeypatches.py b/scrapy/_monkeypatches.py
--- a/scrapy/_monkeypatches.py
+++ b/scrapy/_monkeypatches.py
@@ -20,6 +20,6 @@
import twisted.persisted.styles # NOQA
# Remove only entries with twisted serializers for non-twisted types.
for k, v in frozenset(copyreg.dispatch_table.items()):
- if not getattr(k, '__module__', '').startswith('twisted') \
- and getattr(v, '__module__', '').startswith('twisted'):
+ if not str(getattr(k, '__module__', '')).startswith('twisted') \
+ and str(getattr(v, '__module__', '')).startswith('twisted'):
copyreg.dispatch_table.pop(k)
| {"golden_diff": "diff --git a/scrapy/_monkeypatches.py b/scrapy/_monkeypatches.py\n--- a/scrapy/_monkeypatches.py\n+++ b/scrapy/_monkeypatches.py\n@@ -20,6 +20,6 @@\n import twisted.persisted.styles # NOQA\n # Remove only entries with twisted serializers for non-twisted types.\n for k, v in frozenset(copyreg.dispatch_table.items()):\n- if not getattr(k, '__module__', '').startswith('twisted') \\\n- and getattr(v, '__module__', '').startswith('twisted'):\n+ if not str(getattr(k, '__module__', '')).startswith('twisted') \\\n+ and str(getattr(v, '__module__', '')).startswith('twisted'):\n copyreg.dispatch_table.pop(k)\n", "issue": "_monkeypatches.py: 'NoneType' object has no attribute 'startswith'\nDid not try yet to come up with minimal example to demonstrate this issue but it is reproducible for me:\n\n```\n$> datalad --dbg crawl\nTraceback (most recent call last):\n File \"/home/yoh/proj/datalad/datalad/venv-tests/bin/datalad\", line 9, in <module>\n load_entry_point('datalad==0.1.dev0', 'console_scripts', 'datalad')()\n File \"/home/yoh/proj/datalad/datalad/datalad/cmdline/main.py\", line 199, in main\n cmdlineargs.func(cmdlineargs)\n File \"/home/yoh/proj/datalad/datalad/datalad/interface/base.py\", line 151, in call_from_parser\n return self(**kwargs)\n File \"/home/yoh/proj/datalad/datalad/datalad/interface/crawl.py\", line 44, in __call__\n from datalad.crawler.pipeline import load_pipeline_from_config, get_pipeline_config_path\n File \"/home/yoh/proj/datalad/datalad/datalad/crawler/pipeline.py\", line 21, in <module>\n from .newmain import lgr\n File \"/home/yoh/proj/datalad/datalad/datalad/crawler/newmain.py\", line 21, in <module>\n from .nodes.matches import *\n File \"/home/yoh/proj/datalad/datalad/datalad/crawler/nodes/matches.py\", line 18, in <module>\n from scrapy.selector import Selector\n File \"/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/scrapy/__init__.py\", line 27, in <module>\n from . import _monkeypatches\n File \"/home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/scrapy/_monkeypatches.py\", line 24, in <module>\n and getattr(v, '__module__', '').startswith('twisted'):\nAttributeError: 'NoneType' object has no attribute 'startswith'\n()\n> /home/yoh/proj/datalad/datalad/venv-tests/local/lib/python2.7/site-packages/scrapy/_monkeypatches.py(24)<module>()\n-> and getattr(v, '__module__', '').startswith('twisted'):\n(Pdb) l\n 19 # to prevent bugs like Twisted#7989 while serializing requests\n 20 import twisted.persisted.styles # NOQA\n 21 # Remove only entries with twisted serializers for non-twisted types.\n 22 for k, v in frozenset(copyreg.dispatch_table.items()):\n 23 if not getattr(k, '__module__', '').startswith('twisted') \\\n 24 -> and getattr(v, '__module__', '').startswith('twisted'):\n 25 copyreg.dispatch_table.pop(k)\n[EOF]\n(Pdb) p k\nNone\n(Pdb) p v\nNone\n(Pdb) p copyreg\nNone\n```\n\nnot sure it came to it but the issue is (if I pdb before this madness happens):\n\n```\n(Pdb) p getattr(k, '__module__', '')\n'__builtin__'\n(Pdb) p getattr(v, '__module__', '')\nNone\n(Pdb) p v\n<function mpq_reducer at 0x7f474bb4ab90>\n(Pdb) p v.__module__\nNone\n(Pdb) p k, v\n(<type 'mpq'>, <function mpq_reducer at 0x7f474bb4ab90>)\n```\n\nso assigned `__module__` is None. 
As a quick resolution wrapped into str() call to assure str there\n\n```\nand str(getattr(v, '__module__', '')).startswith('twisted'):\n```\n\n", "before_files": [{"content": "import sys\nfrom six.moves import copyreg\n\nif sys.version_info[0] == 2:\n from urlparse import urlparse\n\n # workaround for http://bugs.python.org/issue7904 - Python < 2.7\n if urlparse('s3://bucket/key').netloc != 'bucket':\n from urlparse import uses_netloc\n uses_netloc.append('s3')\n\n # workaround for http://bugs.python.org/issue9374 - Python < 2.7.4\n if urlparse('s3://bucket/key?key=value').query != 'key=value':\n from urlparse import uses_query\n uses_query.append('s3')\n\n\n# Undo what Twisted's perspective broker adds to pickle register\n# to prevent bugs like Twisted#7989 while serializing requests\nimport twisted.persisted.styles # NOQA\n# Remove only entries with twisted serializers for non-twisted types.\nfor k, v in frozenset(copyreg.dispatch_table.items()):\n if not getattr(k, '__module__', '').startswith('twisted') \\\n and getattr(v, '__module__', '').startswith('twisted'):\n copyreg.dispatch_table.pop(k)\n", "path": "scrapy/_monkeypatches.py"}], "after_files": [{"content": "import sys\nfrom six.moves import copyreg\n\nif sys.version_info[0] == 2:\n from urlparse import urlparse\n\n # workaround for http://bugs.python.org/issue7904 - Python < 2.7\n if urlparse('s3://bucket/key').netloc != 'bucket':\n from urlparse import uses_netloc\n uses_netloc.append('s3')\n\n # workaround for http://bugs.python.org/issue9374 - Python < 2.7.4\n if urlparse('s3://bucket/key?key=value').query != 'key=value':\n from urlparse import uses_query\n uses_query.append('s3')\n\n\n# Undo what Twisted's perspective broker adds to pickle register\n# to prevent bugs like Twisted#7989 while serializing requests\nimport twisted.persisted.styles # NOQA\n# Remove only entries with twisted serializers for non-twisted types.\nfor k, v in frozenset(copyreg.dispatch_table.items()):\n if not str(getattr(k, '__module__', '')).startswith('twisted') \\\n and str(getattr(v, '__module__', '')).startswith('twisted'):\n copyreg.dispatch_table.pop(k)\n", "path": "scrapy/_monkeypatches.py"}]} | 1,410 | 163 |
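A side note on the loop patched above: wrapping `copyreg.dispatch_table.items()` in `frozenset(...)` takes a snapshot of the table, which is what makes it safe to `pop` entries while iterating. A small illustration with an invented dict:

```python
table = {'keep': 'other-serializer', 'drop': 'twisted-serializer'}

# frozenset(...) materializes the (key, value) pairs up front,
# so mutating `table` inside the loop cannot break the iteration.
for k, v in frozenset(table.items()):
    if v.startswith('twisted'):
        table.pop(k)

print(table)  # -> {'keep': 'other-serializer'}
```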
gh_patches_debug_41775 | rasdani/github-patches | git_diff | kymatio__kymatio-185 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
sphinx-gallery: 2d/plot_filters
the wavelets do not display.
Please close this issue only when you're happy with the sphinx-gallery.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/2d/plot_filters.py`
Content:
```
1 """
2 Plot the 2D wavelet filters
3 ===========================
4 See :meth:`scattering.scattering1d.filter_bank` for more informations about the used wavelets.
5 """
6
7 import numpy as np
8 import matplotlib.pyplot as plt
9 from kymatio.scattering2d.filter_bank import filter_bank
10 from kymatio.scattering2d.utils import fft2
11
12
13 ###############################################################################
14 # Initial parameters of the filter bank
15 # -------------------------------------
16 M = 32
17 J = 3
18 L = 8
19 filters_set = filter_bank(M, M, J, L=L)
20
21
22 ###############################################################################
23 # Imshow complex images
24 # -------------------------------------
25 # Thanks to https://stackoverflow.com/questions/17044052/mathplotlib-imshow-complex-2d-array
26 from colorsys import hls_to_rgb
27 def colorize(z):
28 n, m = z.shape
29 c = np.zeros((n, m, 3))
30 c[np.isinf(z)] = (1.0, 1.0, 1.0)
31 c[np.isnan(z)] = (0.5, 0.5, 0.5)
32
33 idx = ~(np.isinf(z) + np.isnan(z))
34 A = (np.angle(z[idx]) + np.pi) / (2*np.pi)
35 A = (A + 0.5) % 1.0
36 B = 1.0/(1.0+abs(z[idx])**0.3)
37 c[idx] = [hls_to_rgb(a, b, 0.8) for a,b in zip(A,B)]
38 return c
39
40 fig, axs = plt.subplots(J+1, L, sharex=True, sharey=True)
41 plt.rc('text', usetex=True)
42 plt.rc('font', family='serif')
43
44 ###############################################################################
45 # Bandpass filters
46 # ----------------
47 # First, we display each wavelets according to each scale and orientation.
48 i=0
49 for filter in filters_set['psi']:
50 f_r = filter[0][...,0].numpy()
51 f_i = filter[0][..., 1].numpy()
52 f = f_r + 1j*f_i
53 filter_c = fft2(f)
54 filter_c = np.fft.fftshift(filter_c)
55 axs[i // L, i % L].imshow(colorize(filter_c))
56 axs[i // L, i % L].axis('off')
57 axs[i // L, i % L].set_title("$j = {}$ \n $\\theta={}".format(i // L, i % L))
58 i = i+1
59
60
61 # Add blanks for pretty display
62 for z in range(L):
63 axs[i // L, i % L].axis('off')
64 i = i+1
65
66 ###############################################################################
67 # Lowpass filter
68 # ----------------
69 # We finally display the Gaussian filter.
70 f_r = filters_set['phi'][0][...,0].numpy()
71 f_i = filters_set['phi'][0][..., 1].numpy()
72 f = f_r + 1j*f_i
73 filter_c = fft2(f)
74 filter_c = np.fft.fftshift(filter_c)
75 axs[J, L // 2].imshow(colorize(filter_c))
76
77 # Final caption.
78 fig.suptitle("Wavelets for each scales $j$ and angles $\\theta$ used, with the corresponding low-pass filter."
79 "\n The contrast corresponds to the amplitude and the color to the phase.", fontsize=13)
80
81
82 plt.show()
83
```
--- END FILES ---
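As an aside, the `colorize` helper in the file above is a standard domain-coloring trick: complex phase is mapped to hue and magnitude to lightness. A minimal sketch of how it could be exercised on a toy array (this assumes `colorize` from `examples/2d/plot_filters.py` is in scope; the grid below is illustrative only):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.linspace(-2.0, 2.0, 256)
z = x[None, :] + 1j * x[:, None]  # toy complex plane

plt.imshow(colorize(z))  # hue encodes phase, lightness encodes magnitude
plt.axis('off')
plt.show()
```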
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/2d/plot_filters.py b/examples/2d/plot_filters.py
--- a/examples/2d/plot_filters.py
+++ b/examples/2d/plot_filters.py
@@ -1,11 +1,11 @@
"""
Plot the 2D wavelet filters
===========================
-See :meth:`scattering.scattering1d.filter_bank` for more informations about the used wavelets.
+See :meth:`kymatio.scattering2d.filter_bank` for more informations about the used wavelets.
"""
-import numpy as np
import matplotlib.pyplot as plt
+import numpy as np
from kymatio.scattering2d.filter_bank import filter_bank
from kymatio.scattering2d.utils import fft2
@@ -18,10 +18,9 @@
L = 8
filters_set = filter_bank(M, M, J, L=L)
-
###############################################################################
# Imshow complex images
-# -------------------------------------
+# ---------------------
# Thanks to https://stackoverflow.com/questions/17044052/mathplotlib-imshow-complex-2d-array
from colorsys import hls_to_rgb
def colorize(z):
@@ -37,14 +36,15 @@
c[idx] = [hls_to_rgb(a, b, 0.8) for a,b in zip(A,B)]
return c
-fig, axs = plt.subplots(J+1, L, sharex=True, sharey=True)
-plt.rc('text', usetex=True)
-plt.rc('font', family='serif')
-
###############################################################################
# Bandpass filters
# ----------------
-# First, we display each wavelets according to each scale and orientation.
+# First, we display each wavelet according to its scale and orientation.
+fig, axs = plt.subplots(J, L, sharex=True, sharey=True)
+fig.set_figheight(6)
+fig.set_figwidth(6)
+plt.rc('text', usetex=True)
+plt.rc('font', family='serif')
i=0
for filter in filters_set['psi']:
f_r = filter[0][...,0].numpy()
@@ -54,29 +54,30 @@
filter_c = np.fft.fftshift(filter_c)
axs[i // L, i % L].imshow(colorize(filter_c))
axs[i // L, i % L].axis('off')
- axs[i // L, i % L].set_title("$j = {}$ \n $\\theta={}".format(i // L, i % L))
+ axs[i // L, i % L].set_title("$j = {}$ \n $\\theta={}$".format(i // L, i % L))
i = i+1
-
-# Add blanks for pretty display
-for z in range(L):
- axs[i // L, i % L].axis('off')
- i = i+1
+fig.suptitle("Wavelets for each scales $j$ and angles $\\theta$ used."
+"\n Color saturation and color hue respectively denote complex magnitude and complex phase.", fontsize=13)
+fig.show()
###############################################################################
# Lowpass filter
-# ----------------
-# We finally display the Gaussian filter.
-f_r = filters_set['phi'][0][...,0].numpy()
+# --------------
+# We finally display the low-pass filter.
+plt.figure()
+plt.rc('text', usetex=True)
+plt.rc('font', family='serif')
+plt.axis('off')
+plt.set_cmap('gray_r')
+
+f_r = filters_set['phi'][0][..., 0].numpy()
f_i = filters_set['phi'][0][..., 1].numpy()
f = f_r + 1j*f_i
+
filter_c = fft2(f)
filter_c = np.fft.fftshift(filter_c)
-axs[J, L // 2].imshow(colorize(filter_c))
-
-# Final caption.
-fig.suptitle("Wavelets for each scales $j$ and angles $\\theta$ used, with the corresponding low-pass filter."
- "\n The contrast corresponds to the amplitude and the color to the phase.", fontsize=13)
-
-
-plt.show()
+plt.suptitle("The corresponding low-pass filter, also known as scaling function."
+"Color saturation and color hue respectively denote complex magnitude and complex phase", fontsize=13)
+filter_c = np.abs(filter_c)
+plt.imshow(filter_c)
| {"golden_diff": "diff --git a/examples/2d/plot_filters.py b/examples/2d/plot_filters.py\n--- a/examples/2d/plot_filters.py\n+++ b/examples/2d/plot_filters.py\n@@ -1,11 +1,11 @@\n \"\"\"\n Plot the 2D wavelet filters\n ===========================\n-See :meth:`scattering.scattering1d.filter_bank` for more informations about the used wavelets.\n+See :meth:`kymatio.scattering2d.filter_bank` for more informations about the used wavelets.\n \"\"\"\n \n-import numpy as np\n import matplotlib.pyplot as plt\n+import numpy as np\n from kymatio.scattering2d.filter_bank import filter_bank\n from kymatio.scattering2d.utils import fft2\n \n@@ -18,10 +18,9 @@\n L = 8\n filters_set = filter_bank(M, M, J, L=L)\n \n-\n ###############################################################################\n # Imshow complex images\n-# -------------------------------------\n+# ---------------------\n # Thanks to https://stackoverflow.com/questions/17044052/mathplotlib-imshow-complex-2d-array\n from colorsys import hls_to_rgb\n def colorize(z):\n@@ -37,14 +36,15 @@\n c[idx] = [hls_to_rgb(a, b, 0.8) for a,b in zip(A,B)]\n return c\n \n-fig, axs = plt.subplots(J+1, L, sharex=True, sharey=True)\n-plt.rc('text', usetex=True)\n-plt.rc('font', family='serif')\n-\n ###############################################################################\n # Bandpass filters\n # ----------------\n-# First, we display each wavelets according to each scale and orientation.\n+# First, we display each wavelet according to its scale and orientation.\n+fig, axs = plt.subplots(J, L, sharex=True, sharey=True)\n+fig.set_figheight(6)\n+fig.set_figwidth(6)\n+plt.rc('text', usetex=True)\n+plt.rc('font', family='serif')\n i=0\n for filter in filters_set['psi']:\n f_r = filter[0][...,0].numpy()\n@@ -54,29 +54,30 @@\n filter_c = np.fft.fftshift(filter_c)\n axs[i // L, i % L].imshow(colorize(filter_c))\n axs[i // L, i % L].axis('off')\n- axs[i // L, i % L].set_title(\"$j = {}$ \\n $\\\\theta={}\".format(i // L, i % L))\n+ axs[i // L, i % L].set_title(\"$j = {}$ \\n $\\\\theta={}$\".format(i // L, i % L))\n i = i+1\n \n-\n-# Add blanks for pretty display\n-for z in range(L):\n- axs[i // L, i % L].axis('off')\n- i = i+1\n+fig.suptitle(\"Wavelets for each scales $j$ and angles $\\\\theta$ used.\"\n+\"\\n Color saturation and color hue respectively denote complex magnitude and complex phase.\", fontsize=13)\n+fig.show()\n \n ###############################################################################\n # Lowpass filter\n-# ----------------\n-# We finally display the Gaussian filter.\n-f_r = filters_set['phi'][0][...,0].numpy()\n+# --------------\n+# We finally display the low-pass filter.\n+plt.figure()\n+plt.rc('text', usetex=True)\n+plt.rc('font', family='serif')\n+plt.axis('off')\n+plt.set_cmap('gray_r')\n+\n+f_r = filters_set['phi'][0][..., 0].numpy()\n f_i = filters_set['phi'][0][..., 1].numpy()\n f = f_r + 1j*f_i\n+\n filter_c = fft2(f)\n filter_c = np.fft.fftshift(filter_c)\n-axs[J, L // 2].imshow(colorize(filter_c))\n-\n-# Final caption.\n-fig.suptitle(\"Wavelets for each scales $j$ and angles $\\\\theta$ used, with the corresponding low-pass filter.\"\n- \"\\n The contrast corresponds to the amplitude and the color to the phase.\", fontsize=13)\n-\n-\n-plt.show()\n+plt.suptitle(\"The corresponding low-pass filter, also known as scaling function.\"\n+\"Color saturation and color hue respectively denote complex magnitude and complex phase\", fontsize=13)\n+filter_c = np.abs(filter_c)\n+plt.imshow(filter_c)\n", "issue": "sphinx-gallery: 
2d/plot_filters\nthe wavelets does not display.\r\n\r\nPlease close this issue only when you're happy with the sphinx-gallery.\n", "before_files": [{"content": "\"\"\"\nPlot the 2D wavelet filters\n===========================\nSee :meth:`scattering.scattering1d.filter_bank` for more informations about the used wavelets.\n\"\"\"\n\nimport numpy as np\nimport matplotlib.pyplot as plt\nfrom kymatio.scattering2d.filter_bank import filter_bank\nfrom kymatio.scattering2d.utils import fft2\n\n\n###############################################################################\n# Initial parameters of the filter bank\n# -------------------------------------\nM = 32\nJ = 3\nL = 8\nfilters_set = filter_bank(M, M, J, L=L)\n\n\n###############################################################################\n# Imshow complex images\n# -------------------------------------\n# Thanks to https://stackoverflow.com/questions/17044052/mathplotlib-imshow-complex-2d-array\nfrom colorsys import hls_to_rgb\ndef colorize(z):\n n, m = z.shape\n c = np.zeros((n, m, 3))\n c[np.isinf(z)] = (1.0, 1.0, 1.0)\n c[np.isnan(z)] = (0.5, 0.5, 0.5)\n\n idx = ~(np.isinf(z) + np.isnan(z))\n A = (np.angle(z[idx]) + np.pi) / (2*np.pi)\n A = (A + 0.5) % 1.0\n B = 1.0/(1.0+abs(z[idx])**0.3)\n c[idx] = [hls_to_rgb(a, b, 0.8) for a,b in zip(A,B)]\n return c\n\nfig, axs = plt.subplots(J+1, L, sharex=True, sharey=True)\nplt.rc('text', usetex=True)\nplt.rc('font', family='serif')\n\n###############################################################################\n# Bandpass filters\n# ----------------\n# First, we display each wavelets according to each scale and orientation.\ni=0\nfor filter in filters_set['psi']:\n f_r = filter[0][...,0].numpy()\n f_i = filter[0][..., 1].numpy()\n f = f_r + 1j*f_i\n filter_c = fft2(f)\n filter_c = np.fft.fftshift(filter_c)\n axs[i // L, i % L].imshow(colorize(filter_c))\n axs[i // L, i % L].axis('off')\n axs[i // L, i % L].set_title(\"$j = {}$ \\n $\\\\theta={}\".format(i // L, i % L))\n i = i+1\n\n\n# Add blanks for pretty display\nfor z in range(L):\n axs[i // L, i % L].axis('off')\n i = i+1\n\n###############################################################################\n# Lowpass filter\n# ----------------\n# We finally display the Gaussian filter.\nf_r = filters_set['phi'][0][...,0].numpy()\nf_i = filters_set['phi'][0][..., 1].numpy()\nf = f_r + 1j*f_i\nfilter_c = fft2(f)\nfilter_c = np.fft.fftshift(filter_c)\naxs[J, L // 2].imshow(colorize(filter_c))\n\n# Final caption.\nfig.suptitle(\"Wavelets for each scales $j$ and angles $\\\\theta$ used, with the corresponding low-pass filter.\"\n \"\\n The contrast corresponds to the amplitude and the color to the phase.\", fontsize=13)\n\n\nplt.show()\n", "path": "examples/2d/plot_filters.py"}], "after_files": [{"content": "\"\"\"\nPlot the 2D wavelet filters\n===========================\nSee :meth:`kymatio.scattering2d.filter_bank` for more informations about the used wavelets.\n\"\"\"\n\nimport matplotlib.pyplot as plt\nimport numpy as np\nfrom kymatio.scattering2d.filter_bank import filter_bank\nfrom kymatio.scattering2d.utils import fft2\n\n\n###############################################################################\n# Initial parameters of the filter bank\n# -------------------------------------\nM = 32\nJ = 3\nL = 8\nfilters_set = filter_bank(M, M, J, L=L)\n\n###############################################################################\n# Imshow complex images\n# ---------------------\n# Thanks to 
https://stackoverflow.com/questions/17044052/mathplotlib-imshow-complex-2d-array\nfrom colorsys import hls_to_rgb\ndef colorize(z):\n n, m = z.shape\n c = np.zeros((n, m, 3))\n c[np.isinf(z)] = (1.0, 1.0, 1.0)\n c[np.isnan(z)] = (0.5, 0.5, 0.5)\n\n idx = ~(np.isinf(z) + np.isnan(z))\n A = (np.angle(z[idx]) + np.pi) / (2*np.pi)\n A = (A + 0.5) % 1.0\n B = 1.0/(1.0+abs(z[idx])**0.3)\n c[idx] = [hls_to_rgb(a, b, 0.8) for a,b in zip(A,B)]\n return c\n\n###############################################################################\n# Bandpass filters\n# ----------------\n# First, we display each wavelet according to its scale and orientation.\nfig, axs = plt.subplots(J, L, sharex=True, sharey=True)\nfig.set_figheight(6)\nfig.set_figwidth(6)\nplt.rc('text', usetex=True)\nplt.rc('font', family='serif')\ni=0\nfor filter in filters_set['psi']:\n f_r = filter[0][...,0].numpy()\n f_i = filter[0][..., 1].numpy()\n f = f_r + 1j*f_i\n filter_c = fft2(f)\n filter_c = np.fft.fftshift(filter_c)\n axs[i // L, i % L].imshow(colorize(filter_c))\n axs[i // L, i % L].axis('off')\n axs[i // L, i % L].set_title(\"$j = {}$ \\n $\\\\theta={}$\".format(i // L, i % L))\n i = i+1\n\nfig.suptitle(\"Wavelets for each scales $j$ and angles $\\\\theta$ used.\"\n\"\\n Color saturation and color hue respectively denote complex magnitude and complex phase.\", fontsize=13)\nfig.show()\n\n###############################################################################\n# Lowpass filter\n# --------------\n# We finally display the low-pass filter.\nplt.figure()\nplt.rc('text', usetex=True)\nplt.rc('font', family='serif')\nplt.axis('off')\nplt.set_cmap('gray_r')\n\nf_r = filters_set['phi'][0][..., 0].numpy()\nf_i = filters_set['phi'][0][..., 1].numpy()\nf = f_r + 1j*f_i\n\nfilter_c = fft2(f)\nfilter_c = np.fft.fftshift(filter_c)\nplt.suptitle(\"The corresponding low-pass filter, also known as scaling function.\"\n\"Color saturation and color hue respectively denote complex magnitude and complex phase\", fontsize=13)\nfilter_c = np.abs(filter_c)\nplt.imshow(filter_c)\n", "path": "examples/2d/plot_filters.py"}]} | 1,189 | 961 |
gh_patches_debug_31257 | rasdani/github-patches | git_diff | zestedesavoir__zds-site-5761 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix and refactor remove_url_scheme and its tests
We have a utility function called [remove_url_scheme](https://github.com/zestedesavoir/zds-site/blob/03c8f316c46e51d42afb8b8d2d9553cdd8fb0f08/zds/utils/templatetags/remove_url_scheme.py#L9), which strips the scheme (http or https) and the domain name from URLs, so that resources local to the site are always served with the right protocol (http or https).
**Problem description**
The problem right now is that it mishandles the domain name configured in the dev environment, namely `ZDS_APP['site']['dns'] = 127.0.0.1:8000`, which makes one of the tests fail (zds.utils.tests.test_misc.Misc.test_remove_url_scheme), but **locally only**, not on Travis. It is therefore a false positive on the dev environment, which is a nuisance.
**Expected behavior**
We should get correct behavior in the dev environment by properly handling the port number at the end of the URL. While we are at it, we should also:
* use `urllib.parse` instead of the `six` module;
* gather all the `remove_url_scheme` tests in the right file (they are currently split between `test_misc` and a `test_remove_url_scheme` file).
--- END ISSUE ---
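The port pitfall is easy to demonstrate with the standard library: without a scheme or a leading `//`, `urlsplit` does not recognize a `host:port` as a network location at all. A quick sketch (values chosen to mirror the dev setting above):

```python
import urllib.parse

url = '127.0.0.1:8000/media/gallery/1/1.png'

# Without a scheme or leading '//', the host and port never land in .netloc:
print(urllib.parse.urlsplit(url).netloc)         # -> '' (host+port mis-read)

# Prefixing '//' (as the accepted fix below does) recovers the host, port included:
print(urllib.parse.urlsplit('//' + url).netloc)  # -> '127.0.0.1:8000'
```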
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zds/utils/templatetags/remove_url_scheme.py`
Content:
```
1 from django import template
2 from django.conf import settings
3 from six.moves import urllib_parse as urlparse
4
5 register = template.Library()
6
7
8 @register.filter('remove_url_scheme')
9 def remove_url_scheme(input_url):
10 """
11 make every image url pointing to this website protocol independant so that if we use https, we are sure
12 that all our media are served with this protocol.
13
14 .. notice::
15
16 this also removes the ``settings.ZDS_APP['site']['dns']`` from the url.
17
18 :return: the url without its scheme, e.g. ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes
19 ``/media/gallery/1/1.png``
20
21 """
22
23 schemeless_url = input_url[len(urlparse.urlparse(input_url).scheme):]
24 schemeless_url = schemeless_url[len('://'):] if schemeless_url.startswith('://') else schemeless_url
25 if schemeless_url.startswith(settings.ZDS_APP['site']['dns']):
26 return schemeless_url[len(settings.ZDS_APP['site']['dns']):]
27 return input_url
28
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/zds/utils/templatetags/remove_url_scheme.py b/zds/utils/templatetags/remove_url_scheme.py
--- a/zds/utils/templatetags/remove_url_scheme.py
+++ b/zds/utils/templatetags/remove_url_scheme.py
@@ -1,27 +1,37 @@
+import urllib.parse
+
from django import template
from django.conf import settings
-from six.moves import urllib_parse as urlparse
register = template.Library()
@register.filter('remove_url_scheme')
-def remove_url_scheme(input_url):
+def remove_url_scheme(url):
"""
- make every image url pointing to this website protocol independant so that if we use https, we are sure
- that all our media are served with this protocol.
-
- .. notice::
+ Remove the scheme and hostname from a URL if it is internal, but leave it unchanged otherwise.
- this also removes the ``settings.ZDS_APP['site']['dns']`` from the url.
+ The internal hostname is determined using the value of ``ZDS_APP['site']['dns']``.
+ URLs with no scheme are accepted. URLs with no hostname are treated as internal.
- :return: the url without its scheme, e.g. ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes
- ``/media/gallery/1/1.png``
+ For example, ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes ``/media/gallery/1/1.png``,
+ whereas ``/media/gallery/1/1.png`` and ``example.com/media/gallery/1/1.png`` stay the same.
+ :return: the url without its scheme and hostname.
"""
- schemeless_url = input_url[len(urlparse.urlparse(input_url).scheme):]
- schemeless_url = schemeless_url[len('://'):] if schemeless_url.startswith('://') else schemeless_url
- if schemeless_url.startswith(settings.ZDS_APP['site']['dns']):
- return schemeless_url[len(settings.ZDS_APP['site']['dns']):]
- return input_url
+ # Parse URLs after adding a prefix if necessary (e.g 'zestedesavoir.com' becomes '//zestedesavoir.com')
+ url_normalized = url
+ if '//' not in url:
+ url_normalized = '//' + url
+ url_parsed = urllib.parse.urlsplit(url_normalized)
+
+ # Return external URLs unchanged
+ if url_parsed.netloc != settings.ZDS_APP['site']['dns']:
+ return url
+
+ # Clean internal URLs
+ url_noscheme = urllib.parse.urlunsplit(['', '', url_parsed.path, url_parsed.query, url_parsed.fragment])
+ url_cleaned = url_noscheme[0:] # remove first "/"
+
+ return url_cleaned
| {"golden_diff": "diff --git a/zds/utils/templatetags/remove_url_scheme.py b/zds/utils/templatetags/remove_url_scheme.py\n--- a/zds/utils/templatetags/remove_url_scheme.py\n+++ b/zds/utils/templatetags/remove_url_scheme.py\n@@ -1,27 +1,37 @@\n+import urllib.parse\n+\n from django import template\n from django.conf import settings\n-from six.moves import urllib_parse as urlparse\n \n register = template.Library()\n \n \n @register.filter('remove_url_scheme')\n-def remove_url_scheme(input_url):\n+def remove_url_scheme(url):\n \"\"\"\n- make every image url pointing to this website protocol independant so that if we use https, we are sure\n- that all our media are served with this protocol.\n-\n- .. notice::\n+ Remove the scheme and hostname from a URL if it is internal, but leave it unchanged otherwise.\n \n- this also removes the ``settings.ZDS_APP['site']['dns']`` from the url.\n+ The internal hostname is determined using the value of ``ZDS_APP['site']['dns']``.\n+ URLs with no scheme are accepted. URLs with no hostname are treated as internal.\n \n- :return: the url without its scheme, e.g. ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes\n- ``/media/gallery/1/1.png``\n+ For example, ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes ``/media/gallery/1/1.png``,\n+ whereas ``/media/gallery/1/1.png`` and ``example.com/media/gallery/1/1.png`` stay the same.\n \n+ :return: the url without its scheme and hostname.\n \"\"\"\n \n- schemeless_url = input_url[len(urlparse.urlparse(input_url).scheme):]\n- schemeless_url = schemeless_url[len('://'):] if schemeless_url.startswith('://') else schemeless_url\n- if schemeless_url.startswith(settings.ZDS_APP['site']['dns']):\n- return schemeless_url[len(settings.ZDS_APP['site']['dns']):]\n- return input_url\n+ # Parse URLs after adding a prefix if necessary (e.g 'zestedesavoir.com' becomes '//zestedesavoir.com')\n+ url_normalized = url\n+ if '//' not in url:\n+ url_normalized = '//' + url\n+ url_parsed = urllib.parse.urlsplit(url_normalized)\n+\n+ # Return external URLs unchanged\n+ if url_parsed.netloc != settings.ZDS_APP['site']['dns']:\n+ return url\n+\n+ # Clean internal URLs\n+ url_noscheme = urllib.parse.urlunsplit(['', '', url_parsed.path, url_parsed.query, url_parsed.fragment])\n+ url_cleaned = url_noscheme[0:] # remove first \"/\"\n+\n+ return url_cleaned\n", "issue": "Corriger et refactoriser remove_url_scheme et ses tests\nOn a une fonction utilitaire qui s'appelle [remove_url_scheme](https://github.com/zestedesavoir/zds-site/blob/03c8f316c46e51d42afb8b8d2d9553cdd8fb0f08/zds/utils/templatetags/remove_url_scheme.py#L9), qui permet d'enlever le sch\u00e9ma des urls (http ou https) et le nom du domaine, afin de toujours servir les ressources locales au site avec le bon protocole (http ou https).\r\n\r\n**Description du probl\u00e8me**\r\n\r\nLe probl\u00e8me actuellement, c'est qu'elle g\u00e8re mal le nom du domaine sp\u00e9cifi\u00e9 dans l'environnement de dev, \u00e0 savoir `ZDS_APP['site']['dns'] = 127.0.0.1:8000`, ce qui a pour effet de faire rater un des tests (zds.utils.tests.test_misc.Misc.test_remove_url_scheme), mais en **local seulement**, pas sur Travis. Il s'agit donc d'un faux positif sur l'environnement de dev, ce qui est p\u00e9nible.\r\n\r\n**Comportement attendu**\r\n\r\nOn devrait avoir le bon fonctionnement sur l'environnement de dev en g\u00e9rant correctement le num\u00e9ro de port \u00e0 la fin de l'url. 
Au passage, on devrait aussi :\r\n\r\n* utiliser `urllib.parse` au lieu du module de `six` ;\r\n* r\u00e9unir tous les tests de `remove_url_scheme` dans le bon fichier (actuellement c'est r\u00e9parti dans `test_misc` et un fichier `test_remove_url_scheme`).\r\n\n", "before_files": [{"content": "from django import template\nfrom django.conf import settings\nfrom six.moves import urllib_parse as urlparse\n\nregister = template.Library()\n\n\[email protected]('remove_url_scheme')\ndef remove_url_scheme(input_url):\n \"\"\"\n make every image url pointing to this website protocol independant so that if we use https, we are sure\n that all our media are served with this protocol.\n\n .. notice::\n\n this also removes the ``settings.ZDS_APP['site']['dns']`` from the url.\n\n :return: the url without its scheme, e.g. ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes\n ``/media/gallery/1/1.png``\n\n \"\"\"\n\n schemeless_url = input_url[len(urlparse.urlparse(input_url).scheme):]\n schemeless_url = schemeless_url[len('://'):] if schemeless_url.startswith('://') else schemeless_url\n if schemeless_url.startswith(settings.ZDS_APP['site']['dns']):\n return schemeless_url[len(settings.ZDS_APP['site']['dns']):]\n return input_url\n", "path": "zds/utils/templatetags/remove_url_scheme.py"}], "after_files": [{"content": "import urllib.parse\n\nfrom django import template\nfrom django.conf import settings\n\nregister = template.Library()\n\n\[email protected]('remove_url_scheme')\ndef remove_url_scheme(url):\n \"\"\"\n Remove the scheme and hostname from a URL if it is internal, but leave it unchanged otherwise.\n\n The internal hostname is determined using the value of ``ZDS_APP['site']['dns']``.\n URLs with no scheme are accepted. URLs with no hostname are treated as internal.\n\n For example, ``http://zestedesavoir.com/media/gallery/1/1.png`` becomes ``/media/gallery/1/1.png``,\n whereas ``/media/gallery/1/1.png`` and ``example.com/media/gallery/1/1.png`` stay the same.\n\n :return: the url without its scheme and hostname.\n \"\"\"\n\n # Parse URLs after adding a prefix if necessary (e.g 'zestedesavoir.com' becomes '//zestedesavoir.com')\n url_normalized = url\n if '//' not in url:\n url_normalized = '//' + url\n url_parsed = urllib.parse.urlsplit(url_normalized)\n\n # Return external URLs unchanged\n if url_parsed.netloc != settings.ZDS_APP['site']['dns']:\n return url\n\n # Clean internal URLs\n url_noscheme = urllib.parse.urlunsplit(['', '', url_parsed.path, url_parsed.query, url_parsed.fragment])\n url_cleaned = url_noscheme[0:] # remove first \"/\"\n\n return url_cleaned\n", "path": "zds/utils/templatetags/remove_url_scheme.py"}]} | 910 | 625 |
gh_patches_debug_3772 | rasdani/github-patches | git_diff | kivy__python-for-android-2123 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
TestGetSystemPythonExecutable.test_virtualenv test fails
The `TestGetSystemPythonExecutable.test_virtualenv` and `TestGetSystemPythonExecutable.test_venv` tests started failing all of a sudden.
Error was:
```
ModuleNotFoundError: No module named 'pytoml'
```
This can be reproduced locally via:
```sh
pytest tests/test_pythonpackage_basic.py::TestGetSystemPythonExecutable::test_virtualenv
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pythonforandroid/recipes/openssl/__init__.py`
Content:
```
1 from os.path import join
2
3 from pythonforandroid.recipe import Recipe
4 from pythonforandroid.util import current_directory
5 from pythonforandroid.logger import shprint
6 import sh
7
8
9 class OpenSSLRecipe(Recipe):
10 '''
11 The OpenSSL libraries for python-for-android. This recipe will generate the
12 following libraries as shared libraries (*.so):
13
14 - crypto
15 - ssl
16
17 The generated openssl libraries are versioned, where the version is the
18 recipe attribute :attr:`version` e.g.: ``libcrypto1.1.so``,
19 ``libssl1.1.so``...so...to link your recipe with the openssl libs,
20 remember to add the version at the end, e.g.:
21 ``-lcrypto1.1 -lssl1.1``. Or better, you could do it dynamically
22 using the methods: :meth:`include_flags`, :meth:`link_dirs_flags` and
23 :meth:`link_libs_flags`.
24
25 .. warning:: This recipe is very sensitive because is used for our core
26 recipes, the python recipes. The used API should match with the one
27 used in our python build, otherwise we will be unable to build the
28 _ssl.so python module.
29
30 .. versionchanged:: 0.6.0
31
32 - The gcc compiler has been deprecated in favour of clang and libraries
33 updated to version 1.1.1 (LTS - supported until 11th September 2023)
34 - Added two new methods to make easier to link with openssl:
35 :meth:`include_flags` and :meth:`link_flags`
36 - subclassed versioned_url
37 - Adapted method :meth:`select_build_arch` to API 21+
38 - Add ability to build a legacy version of the openssl libs when using
39 python2legacy or python3crystax.
40
41 .. versionchanged:: 2019.06.06.1.dev0
42
43 - Removed legacy version of openssl libraries
44
45 '''
46
47 version = '1.1'
48 '''the major minor version used to link our recipes'''
49
50 url_version = '1.1.1'
51 '''the version used to download our libraries'''
52
53 url = 'https://www.openssl.org/source/openssl-{url_version}.tar.gz'
54
55 built_libraries = {
56 'libcrypto{version}.so'.format(version=version): '.',
57 'libssl{version}.so'.format(version=version): '.',
58 }
59
60 @property
61 def versioned_url(self):
62 if self.url is None:
63 return None
64 return self.url.format(url_version=self.url_version)
65
66 def get_build_dir(self, arch):
67 return join(
68 self.get_build_container_dir(arch), self.name + self.version
69 )
70
71 def include_flags(self, arch):
72 '''Returns a string with the include folders'''
73 openssl_includes = join(self.get_build_dir(arch.arch), 'include')
74 return (' -I' + openssl_includes +
75 ' -I' + join(openssl_includes, 'internal') +
76 ' -I' + join(openssl_includes, 'openssl'))
77
78 def link_dirs_flags(self, arch):
79 '''Returns a string with the appropriate `-L<lib directory>` to link
80 with the openssl libs. This string is usually added to the environment
81 variable `LDFLAGS`'''
82 return ' -L' + self.get_build_dir(arch.arch)
83
84 def link_libs_flags(self):
85 '''Returns a string with the appropriate `-l<lib>` flags to link with
86 the openssl libs. This string is usually added to the environment
87 variable `LIBS`'''
88 return ' -lcrypto{version} -lssl{version}'.format(version=self.version)
89
90 def link_flags(self, arch):
91 '''Returns a string with the flags to link with the openssl libraries
92 in the format: `-L<lib directory> -l<lib>`'''
93 return self.link_dirs_flags(arch) + self.link_libs_flags()
94
95 def get_recipe_env(self, arch=None):
96 env = super().get_recipe_env(arch)
97 env['OPENSSL_VERSION'] = self.version
98 env['MAKE'] = 'make' # This removes the '-j5', which isn't safe
99 env['ANDROID_NDK'] = self.ctx.ndk_dir
100 return env
101
102 def select_build_arch(self, arch):
103 aname = arch.arch
104 if 'arm64' in aname:
105 return 'android-arm64'
106 if 'v7a' in aname:
107 return 'android-arm'
108 if 'arm' in aname:
109 return 'android'
110 if 'x86_64' in aname:
111 return 'android-x86_64'
112 if 'x86' in aname:
113 return 'android-x86'
114 return 'linux-armv4'
115
116 def build_arch(self, arch):
117 env = self.get_recipe_env(arch)
118 with current_directory(self.get_build_dir(arch.arch)):
119 # sh fails with code 255 trying to execute ./Configure
120 # so instead we manually run perl passing in Configure
121 perl = sh.Command('perl')
122 buildarch = self.select_build_arch(arch)
123 config_args = [
124 'shared',
125 'no-dso',
126 'no-asm',
127 buildarch,
128 '-D__ANDROID_API__={}'.format(self.ctx.ndk_api),
129 ]
130 shprint(perl, 'Configure', *config_args, _env=env)
131 self.apply_patch('disable-sover.patch', arch.arch)
132
133 shprint(sh.make, 'build_libs', _env=env)
134
135
136 recipe = OpenSSLRecipe()
137
```
--- END FILES ---
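For context on the linking helpers documented in the recipe's docstrings: they are meant to be composed by recipes that depend on openssl. A hedged sketch of typical usage, where the `MyRecipe` class is invented and `Recipe.get_recipe(name, ctx)` is assumed to follow python-for-android's usual lookup convention:

```python
from pythonforandroid.recipe import Recipe


class MyRecipe(Recipe):  # hypothetical recipe depending on openssl
    depends = ['openssl']

    def get_recipe_env(self, arch):
        env = super().get_recipe_env(arch)
        openssl = Recipe.get_recipe('openssl', self.ctx)
        env['CFLAGS'] = env.get('CFLAGS', '') + openssl.include_flags(arch)
        env['LDFLAGS'] = env.get('LDFLAGS', '') + openssl.link_dirs_flags(arch)
        env['LIBS'] = env.get('LIBS', '') + openssl.link_libs_flags()  # ' -lcrypto1.1 -lssl1.1'
        return env
```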
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pythonforandroid/recipes/openssl/__init__.py b/pythonforandroid/recipes/openssl/__init__.py
--- a/pythonforandroid/recipes/openssl/__init__.py
+++ b/pythonforandroid/recipes/openssl/__init__.py
@@ -47,7 +47,7 @@
version = '1.1'
'''the major minor version used to link our recipes'''
- url_version = '1.1.1'
+ url_version = '1.1.1f'
'''the version used to download our libraries'''
url = 'https://www.openssl.org/source/openssl-{url_version}.tar.gz'
| {"golden_diff": "diff --git a/pythonforandroid/recipes/openssl/__init__.py b/pythonforandroid/recipes/openssl/__init__.py\n--- a/pythonforandroid/recipes/openssl/__init__.py\n+++ b/pythonforandroid/recipes/openssl/__init__.py\n@@ -47,7 +47,7 @@\n version = '1.1'\n '''the major minor version used to link our recipes'''\n \n- url_version = '1.1.1'\n+ url_version = '1.1.1f'\n '''the version used to download our libraries'''\n \n url = 'https://www.openssl.org/source/openssl-{url_version}.tar.gz'\n", "issue": "TestGetSystemPythonExecutable.test_virtualenv test fail\nThe `TestGetSystemPythonExecutable.test_virtualenv` and `TestGetSystemPythonExecutable.test_venv` tests started failing all of a sudden.\r\nError was:\r\n```\r\nModuleNotFoundError: No module named \\'pytoml\\'\\n'\r\n```\r\nThis ca be reproduced in local via:\r\n```sh\r\npytest tests/test_pythonpackage_basic.py::TestGetSystemPythonExecutable::test_virtualenv\r\n```\r\n\r\n\n", "before_files": [{"content": "from os.path import join\n\nfrom pythonforandroid.recipe import Recipe\nfrom pythonforandroid.util import current_directory\nfrom pythonforandroid.logger import shprint\nimport sh\n\n\nclass OpenSSLRecipe(Recipe):\n '''\n The OpenSSL libraries for python-for-android. This recipe will generate the\n following libraries as shared libraries (*.so):\n\n - crypto\n - ssl\n\n The generated openssl libraries are versioned, where the version is the\n recipe attribute :attr:`version` e.g.: ``libcrypto1.1.so``,\n ``libssl1.1.so``...so...to link your recipe with the openssl libs,\n remember to add the version at the end, e.g.:\n ``-lcrypto1.1 -lssl1.1``. Or better, you could do it dynamically\n using the methods: :meth:`include_flags`, :meth:`link_dirs_flags` and\n :meth:`link_libs_flags`.\n\n .. warning:: This recipe is very sensitive because is used for our core\n recipes, the python recipes. The used API should match with the one\n used in our python build, otherwise we will be unable to build the\n _ssl.so python module.\n\n .. versionchanged:: 0.6.0\n\n - The gcc compiler has been deprecated in favour of clang and libraries\n updated to version 1.1.1 (LTS - supported until 11th September 2023)\n - Added two new methods to make easier to link with openssl:\n :meth:`include_flags` and :meth:`link_flags`\n - subclassed versioned_url\n - Adapted method :meth:`select_build_arch` to API 21+\n - Add ability to build a legacy version of the openssl libs when using\n python2legacy or python3crystax.\n\n .. 
versionchanged:: 2019.06.06.1.dev0\n\n - Removed legacy version of openssl libraries\n\n '''\n\n version = '1.1'\n '''the major minor version used to link our recipes'''\n\n url_version = '1.1.1'\n '''the version used to download our libraries'''\n\n url = 'https://www.openssl.org/source/openssl-{url_version}.tar.gz'\n\n built_libraries = {\n 'libcrypto{version}.so'.format(version=version): '.',\n 'libssl{version}.so'.format(version=version): '.',\n }\n\n @property\n def versioned_url(self):\n if self.url is None:\n return None\n return self.url.format(url_version=self.url_version)\n\n def get_build_dir(self, arch):\n return join(\n self.get_build_container_dir(arch), self.name + self.version\n )\n\n def include_flags(self, arch):\n '''Returns a string with the include folders'''\n openssl_includes = join(self.get_build_dir(arch.arch), 'include')\n return (' -I' + openssl_includes +\n ' -I' + join(openssl_includes, 'internal') +\n ' -I' + join(openssl_includes, 'openssl'))\n\n def link_dirs_flags(self, arch):\n '''Returns a string with the appropriate `-L<lib directory>` to link\n with the openssl libs. This string is usually added to the environment\n variable `LDFLAGS`'''\n return ' -L' + self.get_build_dir(arch.arch)\n\n def link_libs_flags(self):\n '''Returns a string with the appropriate `-l<lib>` flags to link with\n the openssl libs. This string is usually added to the environment\n variable `LIBS`'''\n return ' -lcrypto{version} -lssl{version}'.format(version=self.version)\n\n def link_flags(self, arch):\n '''Returns a string with the flags to link with the openssl libraries\n in the format: `-L<lib directory> -l<lib>`'''\n return self.link_dirs_flags(arch) + self.link_libs_flags()\n\n def get_recipe_env(self, arch=None):\n env = super().get_recipe_env(arch)\n env['OPENSSL_VERSION'] = self.version\n env['MAKE'] = 'make' # This removes the '-j5', which isn't safe\n env['ANDROID_NDK'] = self.ctx.ndk_dir\n return env\n\n def select_build_arch(self, arch):\n aname = arch.arch\n if 'arm64' in aname:\n return 'android-arm64'\n if 'v7a' in aname:\n return 'android-arm'\n if 'arm' in aname:\n return 'android'\n if 'x86_64' in aname:\n return 'android-x86_64'\n if 'x86' in aname:\n return 'android-x86'\n return 'linux-armv4'\n\n def build_arch(self, arch):\n env = self.get_recipe_env(arch)\n with current_directory(self.get_build_dir(arch.arch)):\n # sh fails with code 255 trying to execute ./Configure\n # so instead we manually run perl passing in Configure\n perl = sh.Command('perl')\n buildarch = self.select_build_arch(arch)\n config_args = [\n 'shared',\n 'no-dso',\n 'no-asm',\n buildarch,\n '-D__ANDROID_API__={}'.format(self.ctx.ndk_api),\n ]\n shprint(perl, 'Configure', *config_args, _env=env)\n self.apply_patch('disable-sover.patch', arch.arch)\n\n shprint(sh.make, 'build_libs', _env=env)\n\n\nrecipe = OpenSSLRecipe()\n", "path": "pythonforandroid/recipes/openssl/__init__.py"}], "after_files": [{"content": "from os.path import join\n\nfrom pythonforandroid.recipe import Recipe\nfrom pythonforandroid.util import current_directory\nfrom pythonforandroid.logger import shprint\nimport sh\n\n\nclass OpenSSLRecipe(Recipe):\n '''\n The OpenSSL libraries for python-for-android. 
This recipe will generate the\n following libraries as shared libraries (*.so):\n\n - crypto\n - ssl\n\n The generated openssl libraries are versioned, where the version is the\n recipe attribute :attr:`version` e.g.: ``libcrypto1.1.so``,\n ``libssl1.1.so``...so...to link your recipe with the openssl libs,\n remember to add the version at the end, e.g.:\n ``-lcrypto1.1 -lssl1.1``. Or better, you could do it dynamically\n using the methods: :meth:`include_flags`, :meth:`link_dirs_flags` and\n :meth:`link_libs_flags`.\n\n .. warning:: This recipe is very sensitive because is used for our core\n recipes, the python recipes. The used API should match with the one\n used in our python build, otherwise we will be unable to build the\n _ssl.so python module.\n\n .. versionchanged:: 0.6.0\n\n - The gcc compiler has been deprecated in favour of clang and libraries\n updated to version 1.1.1 (LTS - supported until 11th September 2023)\n - Added two new methods to make easier to link with openssl:\n :meth:`include_flags` and :meth:`link_flags`\n - subclassed versioned_url\n - Adapted method :meth:`select_build_arch` to API 21+\n - Add ability to build a legacy version of the openssl libs when using\n python2legacy or python3crystax.\n\n .. versionchanged:: 2019.06.06.1.dev0\n\n - Removed legacy version of openssl libraries\n\n '''\n\n version = '1.1'\n '''the major minor version used to link our recipes'''\n\n url_version = '1.1.1f'\n '''the version used to download our libraries'''\n\n url = 'https://www.openssl.org/source/openssl-{url_version}.tar.gz'\n\n built_libraries = {\n 'libcrypto{version}.so'.format(version=version): '.',\n 'libssl{version}.so'.format(version=version): '.',\n }\n\n @property\n def versioned_url(self):\n if self.url is None:\n return None\n return self.url.format(url_version=self.url_version)\n\n def get_build_dir(self, arch):\n return join(\n self.get_build_container_dir(arch), self.name + self.version\n )\n\n def include_flags(self, arch):\n '''Returns a string with the include folders'''\n openssl_includes = join(self.get_build_dir(arch.arch), 'include')\n return (' -I' + openssl_includes +\n ' -I' + join(openssl_includes, 'internal') +\n ' -I' + join(openssl_includes, 'openssl'))\n\n def link_dirs_flags(self, arch):\n '''Returns a string with the appropriate `-L<lib directory>` to link\n with the openssl libs. This string is usually added to the environment\n variable `LDFLAGS`'''\n return ' -L' + self.get_build_dir(arch.arch)\n\n def link_libs_flags(self):\n '''Returns a string with the appropriate `-l<lib>` flags to link with\n the openssl libs. 
This string is usually added to the environment\n variable `LIBS`'''\n return ' -lcrypto{version} -lssl{version}'.format(version=self.version)\n\n def link_flags(self, arch):\n '''Returns a string with the flags to link with the openssl libraries\n in the format: `-L<lib directory> -l<lib>`'''\n return self.link_dirs_flags(arch) + self.link_libs_flags()\n\n def get_recipe_env(self, arch=None):\n env = super().get_recipe_env(arch)\n env['OPENSSL_VERSION'] = self.version\n env['MAKE'] = 'make' # This removes the '-j5', which isn't safe\n env['ANDROID_NDK'] = self.ctx.ndk_dir\n return env\n\n def select_build_arch(self, arch):\n aname = arch.arch\n if 'arm64' in aname:\n return 'android-arm64'\n if 'v7a' in aname:\n return 'android-arm'\n if 'arm' in aname:\n return 'android'\n if 'x86_64' in aname:\n return 'android-x86_64'\n if 'x86' in aname:\n return 'android-x86'\n return 'linux-armv4'\n\n def build_arch(self, arch):\n env = self.get_recipe_env(arch)\n with current_directory(self.get_build_dir(arch.arch)):\n # sh fails with code 255 trying to execute ./Configure\n # so instead we manually run perl passing in Configure\n perl = sh.Command('perl')\n buildarch = self.select_build_arch(arch)\n config_args = [\n 'shared',\n 'no-dso',\n 'no-asm',\n buildarch,\n '-D__ANDROID_API__={}'.format(self.ctx.ndk_api),\n ]\n shprint(perl, 'Configure', *config_args, _env=env)\n self.apply_patch('disable-sover.patch', arch.arch)\n\n shprint(sh.make, 'build_libs', _env=env)\n\n\nrecipe = OpenSSLRecipe()\n", "path": "pythonforandroid/recipes/openssl/__init__.py"}]} | 1,906 | 145 |
gh_patches_debug_10073 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-2457 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider sheetz is broken
During the global build at 2021-06-23-14-42-18, spider **sheetz** failed with **526 features** and **1 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-06-23-14-42-18/logs/sheetz.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-06-23-14-42-18/output/sheetz.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-06-23-14-42-18/output/sheetz.geojson))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/sheetz.py`
Content:
```
1 import json
2 import re
3 import scrapy
4 from locations.items import GeojsonPointItem
5
6
7 class SheetzSpider(scrapy.Spider):
8 name = "sheetz"
9 item_attributes = {'brand': "Sheetz"}
10 allowed_domains = ["orderz.sheetz.com"]
11 start_urls = (
12 "https://orderz.sheetz.com/sas/store",
13 )
14
15 def parse(self, response):
16 stores = json.loads(response.body_as_unicode())
17
18 for store in stores:
19 properties = {
20 'addr_full': store['address'],
21 'city': store['city'],
22 'state': store['state'],
23 'postcode': store['zip'],
24 'ref': store['storeNumber'],
25 'phone': store['phone'],
26 'website': 'https://orderz.sheetz.com/#/main/location/store/'+store['storeNumber'],
27 'lat': float(store['latitude']),
28 'lon': float(store['longitude']),
29 'opening_hours': '24/7' if store['open24x7'] else None,
30 'extras': {
31 'amenity:chargingstation': store['evCharger'],
32 'amenity:fuel': True,
33 'atm': store['atm'],
34 'car_wash': store['carWash'],
35 'fax': store['fax'] if 'fax' in store else None,
36 'fuel:diesel': store['diesel'],
37 'fuel:e15': store['e15'],
38 'fuel:e85': store['e85'],
39 'fuel:kerosene': store['kerosene'],
40 'fuel:propane': store['propane'],
41 }
42 }
43
44 yield GeojsonPointItem(**properties)
45
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/sheetz.py b/locations/spiders/sheetz.py
--- a/locations/spiders/sheetz.py
+++ b/locations/spiders/sheetz.py
@@ -22,7 +22,7 @@
'state': store['state'],
'postcode': store['zip'],
'ref': store['storeNumber'],
- 'phone': store['phone'],
+ 'phone': store.get('phone'),
'website': 'https://orderz.sheetz.com/#/main/location/store/'+store['storeNumber'],
'lat': float(store['latitude']),
'lon': float(store['longitude']),
| {"golden_diff": "diff --git a/locations/spiders/sheetz.py b/locations/spiders/sheetz.py\n--- a/locations/spiders/sheetz.py\n+++ b/locations/spiders/sheetz.py\n@@ -22,7 +22,7 @@\n 'state': store['state'],\n 'postcode': store['zip'],\n 'ref': store['storeNumber'],\n- 'phone': store['phone'],\n+ 'phone': store.get('phone'),\n 'website': 'https://orderz.sheetz.com/#/main/location/store/'+store['storeNumber'],\n 'lat': float(store['latitude']),\n 'lon': float(store['longitude']),\n", "issue": "Spider sheetz is broken\nDuring the global build at 2021-06-23-14-42-18, spider **sheetz** failed with **526 features** and **1 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-06-23-14-42-18/logs/sheetz.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-06-23-14-42-18/output/sheetz.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-06-23-14-42-18/output/sheetz.geojson))\n", "before_files": [{"content": "import json\nimport re\nimport scrapy\nfrom locations.items import GeojsonPointItem\n\n\nclass SheetzSpider(scrapy.Spider):\n name = \"sheetz\"\n item_attributes = {'brand': \"Sheetz\"}\n allowed_domains = [\"orderz.sheetz.com\"]\n start_urls = (\n \"https://orderz.sheetz.com/sas/store\",\n )\n\n def parse(self, response):\n stores = json.loads(response.body_as_unicode())\n\n for store in stores:\n properties = {\n 'addr_full': store['address'],\n 'city': store['city'],\n 'state': store['state'],\n 'postcode': store['zip'],\n 'ref': store['storeNumber'],\n 'phone': store['phone'],\n 'website': 'https://orderz.sheetz.com/#/main/location/store/'+store['storeNumber'],\n 'lat': float(store['latitude']),\n 'lon': float(store['longitude']),\n 'opening_hours': '24/7' if store['open24x7'] else None,\n 'extras': {\n 'amenity:chargingstation': store['evCharger'],\n 'amenity:fuel': True,\n 'atm': store['atm'],\n 'car_wash': store['carWash'],\n 'fax': store['fax'] if 'fax' in store else None,\n 'fuel:diesel': store['diesel'],\n 'fuel:e15': store['e15'],\n 'fuel:e85': store['e85'],\n 'fuel:kerosene': store['kerosene'],\n 'fuel:propane': store['propane'],\n }\n }\n\n yield GeojsonPointItem(**properties)\n", "path": "locations/spiders/sheetz.py"}], "after_files": [{"content": "import json\nimport re\nimport scrapy\nfrom locations.items import GeojsonPointItem\n\n\nclass SheetzSpider(scrapy.Spider):\n name = \"sheetz\"\n item_attributes = {'brand': \"Sheetz\"}\n allowed_domains = [\"orderz.sheetz.com\"]\n start_urls = (\n \"https://orderz.sheetz.com/sas/store\",\n )\n\n def parse(self, response):\n stores = json.loads(response.body_as_unicode())\n\n for store in stores:\n properties = {\n 'addr_full': store['address'],\n 'city': store['city'],\n 'state': store['state'],\n 'postcode': store['zip'],\n 'ref': store['storeNumber'],\n 'phone': store.get('phone'),\n 'website': 'https://orderz.sheetz.com/#/main/location/store/'+store['storeNumber'],\n 'lat': float(store['latitude']),\n 'lon': float(store['longitude']),\n 'opening_hours': '24/7' if store['open24x7'] else None,\n 'extras': {\n 'amenity:chargingstation': store['evCharger'],\n 'amenity:fuel': True,\n 'atm': store['atm'],\n 'car_wash': store['carWash'],\n 'fax': store['fax'] if 'fax' in store else None,\n 'fuel:diesel': store['diesel'],\n 'fuel:e15': store['e15'],\n 'fuel:e85': store['e85'],\n 'fuel:kerosene': store['kerosene'],\n 'fuel:propane': store['propane'],\n }\n }\n\n yield GeojsonPointItem(**properties)\n", "path": "locations/spiders/sheetz.py"}]} | 891 | 141 |
gh_patches_debug_8051 | rasdani/github-patches | git_diff | mne-tools__mne-bids-67 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ADD: Configure CircleCI
So that we can check the artifacts tab for builds of the docs
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #! /usr/bin/env python
2 from setuptools import setup
3
4 descr = """Experimental code for BIDS using MNE."""
5
6 DISTNAME = 'mne-bids'
7 DESCRIPTION = descr
8 MAINTAINER = 'Alexandre Gramfort'
9 MAINTAINER_EMAIL = '[email protected]'
10 URL = 'http://martinos.org/mne'
11 LICENSE = 'BSD (3-clause)'
12 DOWNLOAD_URL = 'http://github.com/mne-tools/mne-bids'
13 VERSION = '0.1.dev0'
14
15 if __name__ == "__main__":
16 setup(name=DISTNAME,
17 maintainer=MAINTAINER,
18 maintainer_email=MAINTAINER_EMAIL,
19 description=DESCRIPTION,
20 license=LICENSE,
21 url=URL,
22 version=VERSION,
23 download_url=DOWNLOAD_URL,
24 long_description=open('README.md').read(),
25 classifiers=[
26 'Intended Audience :: Science/Research',
27 'Intended Audience :: Developers',
28 'License :: OSI Approved',
29 'Programming Language :: Python',
30 'Topic :: Software Development',
31 'Topic :: Scientific/Engineering',
32 'Operating System :: Microsoft :: Windows',
33 'Operating System :: POSIX',
34 'Operating System :: Unix',
35 'Operating System :: MacOS',
36 ],
37 platforms='any',
38 packages=[
39 'mne_bids'
40 ],
41 scripts=['bin/mne_bids']
42 )
43
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -1,5 +1,5 @@
#! /usr/bin/env python
-from setuptools import setup
+from setuptools import setup, find_packages
descr = """Experimental code for BIDS using MNE."""
@@ -35,8 +35,6 @@
'Operating System :: MacOS',
],
platforms='any',
- packages=[
- 'mne_bids'
- ],
+ packages=find_packages(),
scripts=['bin/mne_bids']
-)
+ )
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -1,5 +1,5 @@\n #! /usr/bin/env python\n-from setuptools import setup\n+from setuptools import setup, find_packages\n \n descr = \"\"\"Experimental code for BIDS using MNE.\"\"\"\n \n@@ -35,8 +35,6 @@\n 'Operating System :: MacOS',\n ],\n platforms='any',\n- packages=[\n- 'mne_bids'\n- ],\n+ packages=find_packages(),\n scripts=['bin/mne_bids']\n-)\n+ )\n", "issue": "ADD: Configure CircleCI\nSo that we can check the artifacts tab for builds of the docs\n", "before_files": [{"content": "#! /usr/bin/env python\nfrom setuptools import setup\n\ndescr = \"\"\"Experimental code for BIDS using MNE.\"\"\"\n\nDISTNAME = 'mne-bids'\nDESCRIPTION = descr\nMAINTAINER = 'Alexandre Gramfort'\nMAINTAINER_EMAIL = '[email protected]'\nURL = 'http://martinos.org/mne'\nLICENSE = 'BSD (3-clause)'\nDOWNLOAD_URL = 'http://github.com/mne-tools/mne-bids'\nVERSION = '0.1.dev0'\n\nif __name__ == \"__main__\":\n setup(name=DISTNAME,\n maintainer=MAINTAINER,\n maintainer_email=MAINTAINER_EMAIL,\n description=DESCRIPTION,\n license=LICENSE,\n url=URL,\n version=VERSION,\n download_url=DOWNLOAD_URL,\n long_description=open('README.md').read(),\n classifiers=[\n 'Intended Audience :: Science/Research',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved',\n 'Programming Language :: Python',\n 'Topic :: Software Development',\n 'Topic :: Scientific/Engineering',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX',\n 'Operating System :: Unix',\n 'Operating System :: MacOS',\n ],\n platforms='any',\n packages=[\n 'mne_bids'\n ],\n scripts=['bin/mne_bids']\n)\n", "path": "setup.py"}], "after_files": [{"content": "#! /usr/bin/env python\nfrom setuptools import setup, find_packages\n\ndescr = \"\"\"Experimental code for BIDS using MNE.\"\"\"\n\nDISTNAME = 'mne-bids'\nDESCRIPTION = descr\nMAINTAINER = 'Alexandre Gramfort'\nMAINTAINER_EMAIL = '[email protected]'\nURL = 'http://martinos.org/mne'\nLICENSE = 'BSD (3-clause)'\nDOWNLOAD_URL = 'http://github.com/mne-tools/mne-bids'\nVERSION = '0.1.dev0'\n\nif __name__ == \"__main__\":\n setup(name=DISTNAME,\n maintainer=MAINTAINER,\n maintainer_email=MAINTAINER_EMAIL,\n description=DESCRIPTION,\n license=LICENSE,\n url=URL,\n version=VERSION,\n download_url=DOWNLOAD_URL,\n long_description=open('README.md').read(),\n classifiers=[\n 'Intended Audience :: Science/Research',\n 'Intended Audience :: Developers',\n 'License :: OSI Approved',\n 'Programming Language :: Python',\n 'Topic :: Software Development',\n 'Topic :: Scientific/Engineering',\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: POSIX',\n 'Operating System :: Unix',\n 'Operating System :: MacOS',\n ],\n platforms='any',\n packages=find_packages(),\n scripts=['bin/mne_bids']\n )\n", "path": "setup.py"}]} | 647 | 126 |
gh_patches_debug_29351 | rasdani/github-patches | git_diff | open-mmlab__mmdetection-10056 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Why using repeated dataset in val_dataloader ?
### Prerequisite
- [X] I have searched [Issues](https://github.com/open-mmlab/mmdetection/issues) and [Discussions](https://github.com/open-mmlab/mmdetection/discussions) but cannot get the expected help.
- [X] I have read the [FAQ documentation](https://mmdetection.readthedocs.io/en/latest/faq.html) but cannot get the expected help.
- [X] The bug has not been fixed in the [latest version (master)](https://github.com/open-mmlab/mmdetection) or [latest version (3.x)](https://github.com/open-mmlab/mmdetection/tree/dev-3.x).
### Task
I'm using the official example scripts/configs for the officially supported tasks/models/datasets.
### Branch
3.x branch https://github.com/open-mmlab/mmdetection/tree/3.x
### Environment
https://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53
### Reproduces the problem - code sample
https://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53
### Reproduces the problem - command or script
https://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53
### Reproduces the problem - error message
https://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53
### Additional information
https://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `configs/common/ms_3x_coco-instance.py`
Content:
```
1 _base_ = '../_base_/default_runtime.py'
2
3 # dataset settings
4 dataset_type = 'CocoDataset'
5 data_root = 'data/coco/'
6
7 # Example to use different file client
8 # Method 1: simply set the data root and let the file I/O module
9 # automatically infer from prefix (not support LMDB and Memcache yet)
10
11 # data_root = 's3://openmmlab/datasets/detection/coco/'
12
13 # Method 2: Use `backend_args`, `file_client_args` in versions before 3.0.0rc6
14 # backend_args = dict(
15 # backend='petrel',
16 # path_mapping=dict({
17 # './data/': 's3://openmmlab/datasets/detection/',
18 # 'data/': 's3://openmmlab/datasets/detection/'
19 # }))
20 backend_args = None
21
22 train_pipeline = [
23 dict(type='LoadImageFromFile', backend_args=backend_args),
24 dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
25 dict(
26 type='RandomResize', scale=[(1333, 640), (1333, 800)],
27 keep_ratio=True),
28 dict(type='RandomFlip', prob=0.5),
29 dict(type='PackDetInputs')
30 ]
31 test_pipeline = [
32 dict(type='LoadImageFromFile', backend_args=backend_args),
33 dict(type='Resize', scale=(1333, 800), keep_ratio=True),
34 dict(type='LoadAnnotations', with_bbox=True, with_mask=True),
35 dict(
36 type='PackDetInputs',
37 meta_keys=('img_id', 'img_path', 'ori_shape', 'img_shape',
38 'scale_factor'))
39 ]
40 train_dataloader = dict(
41 batch_size=2,
42 num_workers=2,
43 persistent_workers=True,
44 sampler=dict(type='DefaultSampler', shuffle=True),
45 batch_sampler=dict(type='AspectRatioBatchSampler'),
46 dataset=dict(
47 type=dataset_type,
48 data_root=data_root,
49 ann_file='annotations/instances_train2017.json',
50 data_prefix=dict(img='train2017/'),
51 filter_cfg=dict(filter_empty_gt=True, min_size=32),
52 pipeline=train_pipeline,
53 backend_args=backend_args))
54 val_dataloader = dict(
55 batch_size=2,
56 num_workers=2,
57 persistent_workers=True,
58 drop_last=False,
59 sampler=dict(type='DefaultSampler', shuffle=False),
60 dataset=dict(
61 type='RepeatDataset',
62 times=3,
63 dataset=dict(
64 type=dataset_type,
65 data_root=data_root,
66 ann_file='annotations/instances_val2017.json',
67 data_prefix=dict(img='val2017/'),
68 test_mode=True,
69 pipeline=test_pipeline,
70 backend_args=backend_args)))
71 test_dataloader = val_dataloader
72
73 val_evaluator = dict(
74 type='CocoMetric',
75 ann_file=data_root + 'annotations/instances_val2017.json',
76 metric='bbox',
77 backend_args=backend_args)
78 test_evaluator = val_evaluator
79
80 # training schedule for 3x with `RepeatDataset`
81 train_cfg = dict(type='EpochBasedTrainLoop', max_epochs=12, val_interval=1)
82 val_cfg = dict(type='ValLoop')
83 test_cfg = dict(type='TestLoop')
84
85 # learning rate
86 # Experiments show that using milestones=[9, 11] has higher performance
87 param_scheduler = [
88 dict(
89 type='LinearLR', start_factor=0.001, by_epoch=False, begin=0, end=500),
90 dict(
91 type='MultiStepLR',
92 begin=0,
93 end=12,
94 by_epoch=True,
95 milestones=[9, 11],
96 gamma=0.1)
97 ]
98
99 # optimizer
100 optim_wrapper = dict(
101 type='OptimWrapper',
102 optimizer=dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=0.0001))
103
104 # Default setting for scaling LR automatically
105 # - `enable` means enable scaling LR automatically
106 # or not by default.
107 # - `base_batch_size` = (8 GPUs) x (2 samples per GPU).
108 auto_scale_lr = dict(enable=False, base_batch_size=16)
109
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/configs/common/ms_3x_coco-instance.py b/configs/common/ms_3x_coco-instance.py
--- a/configs/common/ms_3x_coco-instance.py
+++ b/configs/common/ms_3x_coco-instance.py
@@ -43,31 +43,31 @@
persistent_workers=True,
sampler=dict(type='DefaultSampler', shuffle=True),
batch_sampler=dict(type='AspectRatioBatchSampler'),
- dataset=dict(
- type=dataset_type,
- data_root=data_root,
- ann_file='annotations/instances_train2017.json',
- data_prefix=dict(img='train2017/'),
- filter_cfg=dict(filter_empty_gt=True, min_size=32),
- pipeline=train_pipeline,
- backend_args=backend_args))
-val_dataloader = dict(
- batch_size=2,
- num_workers=2,
- persistent_workers=True,
- drop_last=False,
- sampler=dict(type='DefaultSampler', shuffle=False),
dataset=dict(
type='RepeatDataset',
times=3,
dataset=dict(
type=dataset_type,
data_root=data_root,
- ann_file='annotations/instances_val2017.json',
- data_prefix=dict(img='val2017/'),
- test_mode=True,
- pipeline=test_pipeline,
+ ann_file='annotations/instances_train2017.json',
+ data_prefix=dict(img='train2017/'),
+ filter_cfg=dict(filter_empty_gt=True, min_size=32),
+ pipeline=train_pipeline,
backend_args=backend_args)))
+val_dataloader = dict(
+ batch_size=1,
+ num_workers=2,
+ persistent_workers=True,
+ drop_last=False,
+ sampler=dict(type='DefaultSampler', shuffle=False),
+ dataset=dict(
+ type=dataset_type,
+ data_root=data_root,
+ ann_file='annotations/instances_val2017.json',
+ data_prefix=dict(img='val2017/'),
+ test_mode=True,
+ pipeline=test_pipeline,
+ backend_args=backend_args))
test_dataloader = val_dataloader
val_evaluator = dict(
| {"golden_diff": "diff --git a/configs/common/ms_3x_coco-instance.py b/configs/common/ms_3x_coco-instance.py\n--- a/configs/common/ms_3x_coco-instance.py\n+++ b/configs/common/ms_3x_coco-instance.py\n@@ -43,31 +43,31 @@\n persistent_workers=True,\n sampler=dict(type='DefaultSampler', shuffle=True),\n batch_sampler=dict(type='AspectRatioBatchSampler'),\n- dataset=dict(\n- type=dataset_type,\n- data_root=data_root,\n- ann_file='annotations/instances_train2017.json',\n- data_prefix=dict(img='train2017/'),\n- filter_cfg=dict(filter_empty_gt=True, min_size=32),\n- pipeline=train_pipeline,\n- backend_args=backend_args))\n-val_dataloader = dict(\n- batch_size=2,\n- num_workers=2,\n- persistent_workers=True,\n- drop_last=False,\n- sampler=dict(type='DefaultSampler', shuffle=False),\n dataset=dict(\n type='RepeatDataset',\n times=3,\n dataset=dict(\n type=dataset_type,\n data_root=data_root,\n- ann_file='annotations/instances_val2017.json',\n- data_prefix=dict(img='val2017/'),\n- test_mode=True,\n- pipeline=test_pipeline,\n+ ann_file='annotations/instances_train2017.json',\n+ data_prefix=dict(img='train2017/'),\n+ filter_cfg=dict(filter_empty_gt=True, min_size=32),\n+ pipeline=train_pipeline,\n backend_args=backend_args)))\n+val_dataloader = dict(\n+ batch_size=1,\n+ num_workers=2,\n+ persistent_workers=True,\n+ drop_last=False,\n+ sampler=dict(type='DefaultSampler', shuffle=False),\n+ dataset=dict(\n+ type=dataset_type,\n+ data_root=data_root,\n+ ann_file='annotations/instances_val2017.json',\n+ data_prefix=dict(img='val2017/'),\n+ test_mode=True,\n+ pipeline=test_pipeline,\n+ backend_args=backend_args))\n test_dataloader = val_dataloader\n \n val_evaluator = dict(\n", "issue": "Why using repeated dataset in val_dataloader ?\n### Prerequisite\n\n- [X] I have searched [Issues](https://github.com/open-mmlab/mmdetection/issues) and [Discussions](https://github.com/open-mmlab/mmdetection/discussions) but cannot get the expected help.\n- [X] I have read the [FAQ documentation](https://mmdetection.readthedocs.io/en/latest/faq.html) but cannot get the expected help.\n- [X] The bug has not been fixed in the [latest version (master)](https://github.com/open-mmlab/mmdetection) or [latest version (3.x)](https://github.com/open-mmlab/mmdetection/tree/dev-3.x).\n\n### Task\n\nI'm using the official example scripts/configs for the officially supported tasks/models/datasets.\n\n### Branch\n\n3.x branch https://github.com/open-mmlab/mmdetection/tree/3.x\n\n### Environment\n\nhttps://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53\n\n### Reproduces the problem - code sample\n\nhttps://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53\n\n### Reproduces the problem - command or script\n\nhttps://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53\n\n### Reproduces the problem - error message\n\nhttps://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53\n\n### Additional information\n\nhttps://github.com/open-mmlab/mmdetection/blob/3.x/configs/common/ms_3x_coco-instance.py#L53\n", "before_files": [{"content": "_base_ = '../_base_/default_runtime.py'\n\n# dataset settings\ndataset_type = 'CocoDataset'\ndata_root = 'data/coco/'\n\n# Example to use different file client\n# Method 1: simply set the data root and let the file I/O module\n# automatically infer from prefix (not support LMDB and Memcache yet)\n\n# data_root = 's3://openmmlab/datasets/detection/coco/'\n\n# Method 2: Use 
`backend_args`, `file_client_args` in versions before 3.0.0rc6\n# backend_args = dict(\n# backend='petrel',\n# path_mapping=dict({\n# './data/': 's3://openmmlab/datasets/detection/',\n# 'data/': 's3://openmmlab/datasets/detection/'\n# }))\nbackend_args = None\n\ntrain_pipeline = [\n dict(type='LoadImageFromFile', backend_args=backend_args),\n dict(type='LoadAnnotations', with_bbox=True, with_mask=True),\n dict(\n type='RandomResize', scale=[(1333, 640), (1333, 800)],\n keep_ratio=True),\n dict(type='RandomFlip', prob=0.5),\n dict(type='PackDetInputs')\n]\ntest_pipeline = [\n dict(type='LoadImageFromFile', backend_args=backend_args),\n dict(type='Resize', scale=(1333, 800), keep_ratio=True),\n dict(type='LoadAnnotations', with_bbox=True, with_mask=True),\n dict(\n type='PackDetInputs',\n meta_keys=('img_id', 'img_path', 'ori_shape', 'img_shape',\n 'scale_factor'))\n]\ntrain_dataloader = dict(\n batch_size=2,\n num_workers=2,\n persistent_workers=True,\n sampler=dict(type='DefaultSampler', shuffle=True),\n batch_sampler=dict(type='AspectRatioBatchSampler'),\n dataset=dict(\n type=dataset_type,\n data_root=data_root,\n ann_file='annotations/instances_train2017.json',\n data_prefix=dict(img='train2017/'),\n filter_cfg=dict(filter_empty_gt=True, min_size=32),\n pipeline=train_pipeline,\n backend_args=backend_args))\nval_dataloader = dict(\n batch_size=2,\n num_workers=2,\n persistent_workers=True,\n drop_last=False,\n sampler=dict(type='DefaultSampler', shuffle=False),\n dataset=dict(\n type='RepeatDataset',\n times=3,\n dataset=dict(\n type=dataset_type,\n data_root=data_root,\n ann_file='annotations/instances_val2017.json',\n data_prefix=dict(img='val2017/'),\n test_mode=True,\n pipeline=test_pipeline,\n backend_args=backend_args)))\ntest_dataloader = val_dataloader\n\nval_evaluator = dict(\n type='CocoMetric',\n ann_file=data_root + 'annotations/instances_val2017.json',\n metric='bbox',\n backend_args=backend_args)\ntest_evaluator = val_evaluator\n\n# training schedule for 3x with `RepeatDataset`\ntrain_cfg = dict(type='EpochBasedTrainLoop', max_epochs=12, val_interval=1)\nval_cfg = dict(type='ValLoop')\ntest_cfg = dict(type='TestLoop')\n\n# learning rate\n# Experiments show that using milestones=[9, 11] has higher performance\nparam_scheduler = [\n dict(\n type='LinearLR', start_factor=0.001, by_epoch=False, begin=0, end=500),\n dict(\n type='MultiStepLR',\n begin=0,\n end=12,\n by_epoch=True,\n milestones=[9, 11],\n gamma=0.1)\n]\n\n# optimizer\noptim_wrapper = dict(\n type='OptimWrapper',\n optimizer=dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=0.0001))\n\n# Default setting for scaling LR automatically\n# - `enable` means enable scaling LR automatically\n# or not by default.\n# - `base_batch_size` = (8 GPUs) x (2 samples per GPU).\nauto_scale_lr = dict(enable=False, base_batch_size=16)\n", "path": "configs/common/ms_3x_coco-instance.py"}], "after_files": [{"content": "_base_ = '../_base_/default_runtime.py'\n\n# dataset settings\ndataset_type = 'CocoDataset'\ndata_root = 'data/coco/'\n\n# Example to use different file client\n# Method 1: simply set the data root and let the file I/O module\n# automatically infer from prefix (not support LMDB and Memcache yet)\n\n# data_root = 's3://openmmlab/datasets/detection/coco/'\n\n# Method 2: Use `backend_args`, `file_client_args` in versions before 3.0.0rc6\n# backend_args = dict(\n# backend='petrel',\n# path_mapping=dict({\n# './data/': 's3://openmmlab/datasets/detection/',\n# 'data/': 's3://openmmlab/datasets/detection/'\n# 
}))\nbackend_args = None\n\ntrain_pipeline = [\n dict(type='LoadImageFromFile', backend_args=backend_args),\n dict(type='LoadAnnotations', with_bbox=True, with_mask=True),\n dict(\n type='RandomResize', scale=[(1333, 640), (1333, 800)],\n keep_ratio=True),\n dict(type='RandomFlip', prob=0.5),\n dict(type='PackDetInputs')\n]\ntest_pipeline = [\n dict(type='LoadImageFromFile', backend_args=backend_args),\n dict(type='Resize', scale=(1333, 800), keep_ratio=True),\n dict(type='LoadAnnotations', with_bbox=True, with_mask=True),\n dict(\n type='PackDetInputs',\n meta_keys=('img_id', 'img_path', 'ori_shape', 'img_shape',\n 'scale_factor'))\n]\ntrain_dataloader = dict(\n batch_size=2,\n num_workers=2,\n persistent_workers=True,\n sampler=dict(type='DefaultSampler', shuffle=True),\n batch_sampler=dict(type='AspectRatioBatchSampler'),\n dataset=dict(\n type='RepeatDataset',\n times=3,\n dataset=dict(\n type=dataset_type,\n data_root=data_root,\n ann_file='annotations/instances_train2017.json',\n data_prefix=dict(img='train2017/'),\n filter_cfg=dict(filter_empty_gt=True, min_size=32),\n pipeline=train_pipeline,\n backend_args=backend_args)))\nval_dataloader = dict(\n batch_size=1,\n num_workers=2,\n persistent_workers=True,\n drop_last=False,\n sampler=dict(type='DefaultSampler', shuffle=False),\n dataset=dict(\n type=dataset_type,\n data_root=data_root,\n ann_file='annotations/instances_val2017.json',\n data_prefix=dict(img='val2017/'),\n test_mode=True,\n pipeline=test_pipeline,\n backend_args=backend_args))\ntest_dataloader = val_dataloader\n\nval_evaluator = dict(\n type='CocoMetric',\n ann_file=data_root + 'annotations/instances_val2017.json',\n metric='bbox',\n backend_args=backend_args)\ntest_evaluator = val_evaluator\n\n# training schedule for 3x with `RepeatDataset`\ntrain_cfg = dict(type='EpochBasedTrainLoop', max_epochs=12, val_interval=1)\nval_cfg = dict(type='ValLoop')\ntest_cfg = dict(type='TestLoop')\n\n# learning rate\n# Experiments show that using milestones=[9, 11] has higher performance\nparam_scheduler = [\n dict(\n type='LinearLR', start_factor=0.001, by_epoch=False, begin=0, end=500),\n dict(\n type='MultiStepLR',\n begin=0,\n end=12,\n by_epoch=True,\n milestones=[9, 11],\n gamma=0.1)\n]\n\n# optimizer\noptim_wrapper = dict(\n type='OptimWrapper',\n optimizer=dict(type='SGD', lr=0.02, momentum=0.9, weight_decay=0.0001))\n\n# Default setting for scaling LR automatically\n# - `enable` means enable scaling LR automatically\n# or not by default.\n# - `base_batch_size` = (8 GPUs) x (2 samples per GPU).\nauto_scale_lr = dict(enable=False, base_batch_size=16)\n", "path": "configs/common/ms_3x_coco-instance.py"}]} | 1,796 | 475 |
gh_patches_debug_12829 | rasdani/github-patches | git_diff | feast-dev__feast-456 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Deduplicate example notebooks
Currently we have two sets of example notebooks for Feast
1. [Examples](https://github.com/gojek/feast/tree/master/examples/basic)
2. [Docker compose](https://github.com/gojek/feast/tree/master/infra/docker-compose/jupyter/notebooks)
The docker compose notebooks can be deduplicated so that all examples are only contained in the root of the project. This would make management easier.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sdk/python/setup.py`
Content:
```
1 # Copyright 2019 The Feast Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import os
16
17 from setuptools import find_packages, setup
18
19 NAME = "feast"
20 DESCRIPTION = "Python SDK for Feast"
21 URL = "https://github.com/gojek/feast"
22 AUTHOR = "Feast"
23 REQUIRES_PYTHON = ">=3.6.0"
24
25 REQUIRED = [
26 "Click==7.*",
27 "google-api-core==1.14.*",
28 "google-auth==1.6.*",
29 "google-cloud-bigquery==1.18.*",
30 "google-cloud-storage==1.20.*",
31 "google-cloud-core==1.0.*",
32 "googleapis-common-protos==1.*",
33 "google-cloud-bigquery-storage==0.7.*",
34 "grpcio==1.*",
35 "pandas==0.*",
36 "pandavro==1.5.*",
37 "protobuf>=3.10",
38 "PyYAML==5.1.*",
39 "fastavro==0.*",
40 "kafka-python==1.*",
41 "tabulate==0.8.*",
42 "toml==0.10.*",
43 "tqdm==4.*",
44 "pyarrow>=0.15.1",
45 "numpy",
46 "google",
47 "confluent_kafka",
48 ]
49
50 # README file from Feast repo root directory
51 README_FILE = os.path.join(os.path.dirname(__file__), "..", "..", "README.md")
52 with open(os.path.join(README_FILE), "r") as f:
53 LONG_DESCRIPTION = f.read()
54
55 setup(
56 name=NAME,
57 author=AUTHOR,
58 description=DESCRIPTION,
59 long_description=LONG_DESCRIPTION,
60 long_description_content_type="text/markdown",
61 python_requires=REQUIRES_PYTHON,
62 url=URL,
63 packages=find_packages(exclude=("tests",)),
64 install_requires=REQUIRED,
65 # https://stackoverflow.com/questions/28509965/setuptools-development-requirements
66 # Install dev requirements with: pip install -e .[dev]
67 extras_require={"dev": ["mypy-protobuf==1.*", "grpcio-testing==1.*"]},
68 include_package_data=True,
69 license="Apache",
70 classifiers=[
71 # Trove classifiers
72 # Full list: https://pypi.python.org/pypi?%3Aaction=list_classifiers
73 "License :: OSI Approved :: Apache Software License",
74 "Programming Language :: Python",
75 "Programming Language :: Python :: 3",
76 "Programming Language :: Python :: 3.6",
77 ],
78 entry_points={"console_scripts": ["feast=feast.cli:cli"]},
79 use_scm_version={"root": "../..", "relative_to": __file__},
80 setup_requires=["setuptools_scm"],
81 )
82
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sdk/python/setup.py b/sdk/python/setup.py
--- a/sdk/python/setup.py
+++ b/sdk/python/setup.py
@@ -13,6 +13,7 @@
# limitations under the License.
import os
+import subprocess
from setuptools import find_packages, setup
@@ -48,7 +49,13 @@
]
# README file from Feast repo root directory
-README_FILE = os.path.join(os.path.dirname(__file__), "..", "..", "README.md")
+repo_root = (
+ subprocess.Popen(["git", "rev-parse", "--show-toplevel"], stdout=subprocess.PIPE)
+ .communicate()[0]
+ .rstrip()
+ .decode("utf-8")
+)
+README_FILE = os.path.join(repo_root, "README.md")
with open(os.path.join(README_FILE), "r") as f:
LONG_DESCRIPTION = f.read()
| {"golden_diff": "diff --git a/sdk/python/setup.py b/sdk/python/setup.py\n--- a/sdk/python/setup.py\n+++ b/sdk/python/setup.py\n@@ -13,6 +13,7 @@\n # limitations under the License.\n \n import os\n+import subprocess\n \n from setuptools import find_packages, setup\n \n@@ -48,7 +49,13 @@\n ]\n \n # README file from Feast repo root directory\n-README_FILE = os.path.join(os.path.dirname(__file__), \"..\", \"..\", \"README.md\")\n+repo_root = (\n+ subprocess.Popen([\"git\", \"rev-parse\", \"--show-toplevel\"], stdout=subprocess.PIPE)\n+ .communicate()[0]\n+ .rstrip()\n+ .decode(\"utf-8\")\n+)\n+README_FILE = os.path.join(repo_root, \"README.md\")\n with open(os.path.join(README_FILE), \"r\") as f:\n LONG_DESCRIPTION = f.read()\n", "issue": "Deduplicate example notebooks\nCurrently we have two sets of example notebooks for Feast\r\n1. [Examples](https://github.com/gojek/feast/tree/master/examples/basic)\r\n2. [Docker compose](https://github.com/gojek/feast/tree/master/infra/docker-compose/jupyter/notebooks)\r\n\r\nThe docker compose notebooks can be deduplicated so that all examples are only contained in the root of the project. This would make management easier.\n", "before_files": [{"content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nNAME = \"feast\"\nDESCRIPTION = \"Python SDK for Feast\"\nURL = \"https://github.com/gojek/feast\"\nAUTHOR = \"Feast\"\nREQUIRES_PYTHON = \">=3.6.0\"\n\nREQUIRED = [\n \"Click==7.*\",\n \"google-api-core==1.14.*\",\n \"google-auth==1.6.*\",\n \"google-cloud-bigquery==1.18.*\",\n \"google-cloud-storage==1.20.*\",\n \"google-cloud-core==1.0.*\",\n \"googleapis-common-protos==1.*\",\n \"google-cloud-bigquery-storage==0.7.*\",\n \"grpcio==1.*\",\n \"pandas==0.*\",\n \"pandavro==1.5.*\",\n \"protobuf>=3.10\",\n \"PyYAML==5.1.*\",\n \"fastavro==0.*\",\n \"kafka-python==1.*\",\n \"tabulate==0.8.*\",\n \"toml==0.10.*\",\n \"tqdm==4.*\",\n \"pyarrow>=0.15.1\",\n \"numpy\",\n \"google\",\n \"confluent_kafka\",\n]\n\n# README file from Feast repo root directory\nREADME_FILE = os.path.join(os.path.dirname(__file__), \"..\", \"..\", \"README.md\")\nwith open(os.path.join(README_FILE), \"r\") as f:\n LONG_DESCRIPTION = f.read()\n\nsetup(\n name=NAME,\n author=AUTHOR,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n long_description_content_type=\"text/markdown\",\n python_requires=REQUIRES_PYTHON,\n url=URL,\n packages=find_packages(exclude=(\"tests\",)),\n install_requires=REQUIRED,\n # https://stackoverflow.com/questions/28509965/setuptools-development-requirements\n # Install dev requirements with: pip install -e .[dev]\n extras_require={\"dev\": [\"mypy-protobuf==1.*\", \"grpcio-testing==1.*\"]},\n include_package_data=True,\n license=\"Apache\",\n classifiers=[\n # Trove classifiers\n # Full list: https://pypi.python.org/pypi?%3Aaction=list_classifiers\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python\",\n 
\"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n ],\n entry_points={\"console_scripts\": [\"feast=feast.cli:cli\"]},\n use_scm_version={\"root\": \"../..\", \"relative_to\": __file__},\n setup_requires=[\"setuptools_scm\"],\n)\n", "path": "sdk/python/setup.py"}], "after_files": [{"content": "# Copyright 2019 The Feast Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport os\nimport subprocess\n\nfrom setuptools import find_packages, setup\n\nNAME = \"feast\"\nDESCRIPTION = \"Python SDK for Feast\"\nURL = \"https://github.com/gojek/feast\"\nAUTHOR = \"Feast\"\nREQUIRES_PYTHON = \">=3.6.0\"\n\nREQUIRED = [\n \"Click==7.*\",\n \"google-api-core==1.14.*\",\n \"google-auth==1.6.*\",\n \"google-cloud-bigquery==1.18.*\",\n \"google-cloud-storage==1.20.*\",\n \"google-cloud-core==1.0.*\",\n \"googleapis-common-protos==1.*\",\n \"google-cloud-bigquery-storage==0.7.*\",\n \"grpcio==1.*\",\n \"pandas==0.*\",\n \"pandavro==1.5.*\",\n \"protobuf>=3.10\",\n \"PyYAML==5.1.*\",\n \"fastavro==0.*\",\n \"kafka-python==1.*\",\n \"tabulate==0.8.*\",\n \"toml==0.10.*\",\n \"tqdm==4.*\",\n \"pyarrow>=0.15.1\",\n \"numpy\",\n \"google\",\n \"confluent_kafka\",\n]\n\n# README file from Feast repo root directory\nrepo_root = (\n subprocess.Popen([\"git\", \"rev-parse\", \"--show-toplevel\"], stdout=subprocess.PIPE)\n .communicate()[0]\n .rstrip()\n .decode(\"utf-8\")\n)\nREADME_FILE = os.path.join(repo_root, \"README.md\")\nwith open(os.path.join(README_FILE), \"r\") as f:\n LONG_DESCRIPTION = f.read()\n\nsetup(\n name=NAME,\n author=AUTHOR,\n description=DESCRIPTION,\n long_description=LONG_DESCRIPTION,\n long_description_content_type=\"text/markdown\",\n python_requires=REQUIRES_PYTHON,\n url=URL,\n packages=find_packages(exclude=(\"tests\",)),\n install_requires=REQUIRED,\n # https://stackoverflow.com/questions/28509965/setuptools-development-requirements\n # Install dev requirements with: pip install -e .[dev]\n extras_require={\"dev\": [\"mypy-protobuf==1.*\", \"grpcio-testing==1.*\"]},\n include_package_data=True,\n license=\"Apache\",\n classifiers=[\n # Trove classifiers\n # Full list: https://pypi.python.org/pypi?%3Aaction=list_classifiers\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n ],\n entry_points={\"console_scripts\": [\"feast=feast.cli:cli\"]},\n use_scm_version={\"root\": \"../..\", \"relative_to\": __file__},\n setup_requires=[\"setuptools_scm\"],\n)\n", "path": "sdk/python/setup.py"}]} | 1,228 | 192 |
gh_patches_debug_9337 | rasdani/github-patches | git_diff | svthalia__concrexit-2962 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix escaped HTML in promorequest email remarks field
### Describe the bug

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/promotion/emails.py`
Content:
```
1 """The emails defined by the promotion request package."""
2 import logging
3
4 from django.conf import settings
5
6 from promotion.models import PromotionRequest
7 from utils.snippets import send_email
8
9 logger = logging.getLogger(__name__)
10
11
12 def send_weekly_overview():
13 new_requests = PromotionRequest.new_requests.all()
14 upcoming_requests = PromotionRequest.upcoming_requests.all()
15
16 send_email(
17 to=[settings.PROMO_REQUEST_NOTIFICATION_ADDRESS],
18 subject="[PROMO] Weekly request overview",
19 txt_template="requests/weekly_overview.txt",
20 html_template="requests/weekly_overview.html",
21 context={
22 "new_requests": new_requests,
23 "upcoming_requests": upcoming_requests,
24 },
25 )
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/website/promotion/emails.py b/website/promotion/emails.py
--- a/website/promotion/emails.py
+++ b/website/promotion/emails.py
@@ -16,8 +16,8 @@
send_email(
to=[settings.PROMO_REQUEST_NOTIFICATION_ADDRESS],
subject="[PROMO] Weekly request overview",
- txt_template="requests/weekly_overview.txt",
- html_template="requests/weekly_overview.html",
+ txt_template="promotion/email/weekly_overview.txt",
+ html_template="promotion/email/weekly_overview.html",
context={
"new_requests": new_requests,
"upcoming_requests": upcoming_requests,
| {"golden_diff": "diff --git a/website/promotion/emails.py b/website/promotion/emails.py\n--- a/website/promotion/emails.py\n+++ b/website/promotion/emails.py\n@@ -16,8 +16,8 @@\n send_email(\n to=[settings.PROMO_REQUEST_NOTIFICATION_ADDRESS],\n subject=\"[PROMO] Weekly request overview\",\n- txt_template=\"requests/weekly_overview.txt\",\n- html_template=\"requests/weekly_overview.html\",\n+ txt_template=\"promotion/email/weekly_overview.txt\",\n+ html_template=\"promotion/email/weekly_overview.html\",\n context={\n \"new_requests\": new_requests,\n \"upcoming_requests\": upcoming_requests,\n", "issue": "Fix escaped HTML in promorequest email remarks field\n### Describe the bug\n\n", "before_files": [{"content": "\"\"\"The emails defined by the promotion request package.\"\"\"\nimport logging\n\nfrom django.conf import settings\n\nfrom promotion.models import PromotionRequest\nfrom utils.snippets import send_email\n\nlogger = logging.getLogger(__name__)\n\n\ndef send_weekly_overview():\n new_requests = PromotionRequest.new_requests.all()\n upcoming_requests = PromotionRequest.upcoming_requests.all()\n\n send_email(\n to=[settings.PROMO_REQUEST_NOTIFICATION_ADDRESS],\n subject=\"[PROMO] Weekly request overview\",\n txt_template=\"requests/weekly_overview.txt\",\n html_template=\"requests/weekly_overview.html\",\n context={\n \"new_requests\": new_requests,\n \"upcoming_requests\": upcoming_requests,\n },\n )\n", "path": "website/promotion/emails.py"}], "after_files": [{"content": "\"\"\"The emails defined by the promotion request package.\"\"\"\nimport logging\n\nfrom django.conf import settings\n\nfrom promotion.models import PromotionRequest\nfrom utils.snippets import send_email\n\nlogger = logging.getLogger(__name__)\n\n\ndef send_weekly_overview():\n new_requests = PromotionRequest.new_requests.all()\n upcoming_requests = PromotionRequest.upcoming_requests.all()\n\n send_email(\n to=[settings.PROMO_REQUEST_NOTIFICATION_ADDRESS],\n subject=\"[PROMO] Weekly request overview\",\n txt_template=\"promotion/email/weekly_overview.txt\",\n html_template=\"promotion/email/weekly_overview.html\",\n context={\n \"new_requests\": new_requests,\n \"upcoming_requests\": upcoming_requests,\n },\n )\n", "path": "website/promotion/emails.py"}]} | 530 | 148 |
gh_patches_debug_22780 | rasdani/github-patches | git_diff | yt-project__yt-2754 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Matplotlib 3.3.0 Breaks _png
<!--To help us understand and resolve your issue, please fill out the form to
the best of your ability.-->
<!--You can feel free to delete the sections that do not apply.-->
### Bug report
**Bug summary**
Matplotlib 3.3.0 removed the internal `_png` module, which breaks
https://github.com/yt-project/yt/blob/yt-3.6.0/yt/utilities/png_writer.py#L13
See the last mention in https://matplotlib.org/3.3.0/api/api_changes.html#matplotlib-now-uses-pillow-to-save-and-read-pngs
**Code for reproduction**
Just saw this on our CI:
https://travis-ci.com/github/ECP-WarpX/WarpX/jobs/361956903
MPL 3.3.0 was release about 5hrs ago.
https://github.com/matplotlib/matplotlib/releases/tag/v3.3.0
**Actual outcome**
```
File "/home/travis/.local/lib/python3.6/site-packages/yt/utilities/png_writer.py", line 13, in <module>
import matplotlib._png as _png
ModuleNotFoundError: No module named 'matplotlib._png'
```
**Expected outcome**
:-)
**Version Information**
<!--Please specify your platform and versions of the relevant libraries you are
using:-->
* Operating System: Ubuntu 18.04
* Python Version: 3.6
* yt version: 3.6.0
Installed via `python -m pip install --upgrade cmake matplotlib mpi4py numpy scipy yt`.
**Work-Around**
Downgrade matplotlib via `python -m pip install --upgrade matplotlib==3.2.2`.
Exact details:
- https://travis-ci.com/github/ECP-WarpX/WarpX/jobs/361956903
- https://github.com/ECP-WarpX/WarpX/blob/384c6ab9a864d430868a39a065f4a1d4426231af/.travis.yml#L30-L31
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `yt/utilities/png_writer.py`
Content:
```
1 from io import BytesIO
2
3 import matplotlib._png as _png
4
5
6 def call_png_write_png(buffer, width, height, fileobj, dpi):
7 _png.write_png(buffer, fileobj, dpi)
8
9
10 def write_png(buffer, filename, dpi=100):
11 width = buffer.shape[1]
12 height = buffer.shape[0]
13 with open(filename, "wb") as fileobj:
14 call_png_write_png(buffer, width, height, fileobj, dpi)
15
16
17 def write_png_to_string(buffer, dpi=100, gray=0):
18 width = buffer.shape[1]
19 height = buffer.shape[0]
20 fileobj = BytesIO()
21 call_png_write_png(buffer, width, height, fileobj, dpi)
22 png_str = fileobj.getvalue()
23 fileobj.close()
24 return png_str
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/yt/utilities/png_writer.py b/yt/utilities/png_writer.py
--- a/yt/utilities/png_writer.py
+++ b/yt/utilities/png_writer.py
@@ -1,24 +1,29 @@
from io import BytesIO
-import matplotlib._png as _png
+try:
+ # matplotlib switched from an internal submodule _png to using pillow (PIL)
+ # between v3.1.0 and v3.3.0
+ # So PIL should be available on any system where matplotlib._png doesn't exist
+ import matplotlib._png as _png
+except ImportError:
+ from PIL import Image
-def call_png_write_png(buffer, width, height, fileobj, dpi):
- _png.write_png(buffer, fileobj, dpi)
+def call_png_write_png(buffer, fileobj, dpi):
+ try:
+ _png.write_png(buffer, fileobj, dpi)
+ except NameError:
+ Image.fromarray(buffer).save(fileobj, dpi=(dpi, dpi))
def write_png(buffer, filename, dpi=100):
- width = buffer.shape[1]
- height = buffer.shape[0]
with open(filename, "wb") as fileobj:
- call_png_write_png(buffer, width, height, fileobj, dpi)
+ call_png_write_png(buffer, fileobj, dpi)
-def write_png_to_string(buffer, dpi=100, gray=0):
- width = buffer.shape[1]
- height = buffer.shape[0]
+def write_png_to_string(buffer, dpi=100):
fileobj = BytesIO()
- call_png_write_png(buffer, width, height, fileobj, dpi)
+ call_png_write_png(buffer, fileobj, dpi)
png_str = fileobj.getvalue()
fileobj.close()
return png_str
| {"golden_diff": "diff --git a/yt/utilities/png_writer.py b/yt/utilities/png_writer.py\n--- a/yt/utilities/png_writer.py\n+++ b/yt/utilities/png_writer.py\n@@ -1,24 +1,29 @@\n from io import BytesIO\n \n-import matplotlib._png as _png\n+try:\n+ # matplotlib switched from an internal submodule _png to using pillow (PIL)\n+ # between v3.1.0 and v3.3.0\n+ # So PIL should be available on any system where matplotlib._png doesn't exist\n+ import matplotlib._png as _png\n+except ImportError:\n+ from PIL import Image\n \n \n-def call_png_write_png(buffer, width, height, fileobj, dpi):\n- _png.write_png(buffer, fileobj, dpi)\n+def call_png_write_png(buffer, fileobj, dpi):\n+ try:\n+ _png.write_png(buffer, fileobj, dpi)\n+ except NameError:\n+ Image.fromarray(buffer).save(fileobj, dpi=(dpi, dpi))\n \n \n def write_png(buffer, filename, dpi=100):\n- width = buffer.shape[1]\n- height = buffer.shape[0]\n with open(filename, \"wb\") as fileobj:\n- call_png_write_png(buffer, width, height, fileobj, dpi)\n+ call_png_write_png(buffer, fileobj, dpi)\n \n \n-def write_png_to_string(buffer, dpi=100, gray=0):\n- width = buffer.shape[1]\n- height = buffer.shape[0]\n+def write_png_to_string(buffer, dpi=100):\n fileobj = BytesIO()\n- call_png_write_png(buffer, width, height, fileobj, dpi)\n+ call_png_write_png(buffer, fileobj, dpi)\n png_str = fileobj.getvalue()\n fileobj.close()\n return png_str\n", "issue": "Matplotlib 3.3.0 Breaks _png\n<!--To help us understand and resolve your issue, please fill out the form to\r\nthe best of your ability.-->\r\n<!--You can feel free to delete the sections that do not apply.-->\r\n\r\n### Bug report\r\n\r\n**Bug summary**\r\n\r\nMatplotlib 3.3.0 removed the internal `_png` module, which breaks\r\nhttps://github.com/yt-project/yt/blob/yt-3.6.0/yt/utilities/png_writer.py#L13\r\n\r\nSee the last mention in https://matplotlib.org/3.3.0/api/api_changes.html#matplotlib-now-uses-pillow-to-save-and-read-pngs\r\n\r\n**Code for reproduction**\r\n\r\nJust saw this on our CI:\r\nhttps://travis-ci.com/github/ECP-WarpX/WarpX/jobs/361956903\r\n\r\nMPL 3.3.0 was release about 5hrs ago.\r\nhttps://github.com/matplotlib/matplotlib/releases/tag/v3.3.0\r\n\r\n**Actual outcome**\r\n\r\n```\r\nFile \"/home/travis/.local/lib/python3.6/site-packages/yt/utilities/png_writer.py\", line 13, in <module>\r\n import matplotlib._png as _png\r\nModuleNotFoundError: No module named 'matplotlib._png'\r\n```\r\n\r\n**Expected outcome**\r\n\r\n:-)\r\n\r\n**Version Information**\r\n<!--Please specify your platform and versions of the relevant libraries you are\r\nusing:-->\r\n * Operating System: Ubuntu 18.04\r\n * Python Version: 3.6\r\n * yt version: 3.6.0\r\n\r\nInstalled via `python -m pip install --upgrade cmake matplotlib mpi4py numpy scipy yt`.\r\n\r\n**Work-Around**\r\n\r\nDowngrade matplotlib via `python -m pip install --upgrade matplotlib==3.2.2`.\r\n\r\nExact details:\r\n- https://travis-ci.com/github/ECP-WarpX/WarpX/jobs/361956903\r\n- https://github.com/ECP-WarpX/WarpX/blob/384c6ab9a864d430868a39a065f4a1d4426231af/.travis.yml#L30-L31\r\n\n", "before_files": [{"content": "from io import BytesIO\n\nimport matplotlib._png as _png\n\n\ndef call_png_write_png(buffer, width, height, fileobj, dpi):\n _png.write_png(buffer, fileobj, dpi)\n\n\ndef write_png(buffer, filename, dpi=100):\n width = buffer.shape[1]\n height = buffer.shape[0]\n with open(filename, \"wb\") as fileobj:\n call_png_write_png(buffer, width, height, fileobj, dpi)\n\n\ndef write_png_to_string(buffer, dpi=100, gray=0):\n width = 
buffer.shape[1]\n height = buffer.shape[0]\n fileobj = BytesIO()\n call_png_write_png(buffer, width, height, fileobj, dpi)\n png_str = fileobj.getvalue()\n fileobj.close()\n return png_str\n", "path": "yt/utilities/png_writer.py"}], "after_files": [{"content": "from io import BytesIO\n\ntry:\n # matplotlib switched from an internal submodule _png to using pillow (PIL)\n # between v3.1.0 and v3.3.0\n # So PIL should be available on any system where matplotlib._png doesn't exist\n import matplotlib._png as _png\nexcept ImportError:\n from PIL import Image\n\n\ndef call_png_write_png(buffer, fileobj, dpi):\n try:\n _png.write_png(buffer, fileobj, dpi)\n except NameError:\n Image.fromarray(buffer).save(fileobj, dpi=(dpi, dpi))\n\n\ndef write_png(buffer, filename, dpi=100):\n with open(filename, \"wb\") as fileobj:\n call_png_write_png(buffer, fileobj, dpi)\n\n\ndef write_png_to_string(buffer, dpi=100):\n fileobj = BytesIO()\n call_png_write_png(buffer, fileobj, dpi)\n png_str = fileobj.getvalue()\n fileobj.close()\n return png_str\n", "path": "yt/utilities/png_writer.py"}]} | 967 | 407 |
gh_patches_debug_19260 | rasdani/github-patches | git_diff | pypi__warehouse-3236 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
License metadata seems to be ignored
When I made the latest release of `python-dateutil`, it came along with a license change from BSD to Apache / BSD dual licensed. I updated the `license=` metadata in `setup.py`, but I forgot to update the trove classifiers.
[The page on PyPI](https://pypi.python.org/pypi/python-dateutil/2.7.0) shows the license as "Apache 2.0" as I would expect. [The page on warehouse](https://pypi.org/project/python-dateutil/) shows the license as "BSD License". I'm assuming it's pulling that from the trove classifier? Shouldn't it pull it from the `license` field if that is populated?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `warehouse/packaging/views.py`
Content:
```
1 # Licensed under the Apache License, Version 2.0 (the "License");
2 # you may not use this file except in compliance with the License.
3 # You may obtain a copy of the License at
4 #
5 # http://www.apache.org/licenses/LICENSE-2.0
6 #
7 # Unless required by applicable law or agreed to in writing, software
8 # distributed under the License is distributed on an "AS IS" BASIS,
9 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
10 # See the License for the specific language governing permissions and
11 # limitations under the License.
12
13 from first import first
14 from pyramid.httpexceptions import HTTPMovedPermanently, HTTPNotFound
15 from pyramid.view import view_config
16 from sqlalchemy.orm.exc import NoResultFound
17
18 from warehouse.accounts.models import User
19 from warehouse.cache.origin import origin_cache
20 from warehouse.packaging.models import Release, Role
21
22
23 @view_config(
24 route_name="packaging.project",
25 renderer="packaging/detail.html",
26 decorator=[
27 origin_cache(
28 1 * 24 * 60 * 60, # 1 day
29 stale_while_revalidate=1 * 24 * 60 * 60, # 1 day
30 stale_if_error=5 * 24 * 60 * 60, # 5 days
31 ),
32 ],
33 )
34 def project_detail(project, request):
35 if project.name != request.matchdict.get("name", project.name):
36 return HTTPMovedPermanently(
37 request.current_route_path(name=project.name),
38 )
39
40 try:
41 release = (
42 request.db.query(Release)
43 .filter(Release.project == project)
44 .order_by(
45 Release.is_prerelease.nullslast(),
46 Release._pypi_ordering.desc())
47 .limit(1)
48 .one()
49 )
50 except NoResultFound:
51 return HTTPNotFound()
52
53 return release_detail(release, request)
54
55
56 @view_config(
57 route_name="packaging.release",
58 renderer="packaging/detail.html",
59 decorator=[
60 origin_cache(
61 1 * 24 * 60 * 60, # 1 day
62 stale_while_revalidate=1 * 24 * 60 * 60, # 1 day
63 stale_if_error=5 * 24 * 60 * 60, # 5 days
64 ),
65 ],
66 )
67 def release_detail(release, request):
68 project = release.project
69
70 # Check if the requested version is equivalent but not exactly the same as
71 # the release's version. Use `.get` because this view is used by
72 # `project_detail` and there may not be a version.
73 #
74 # This also handles the case where both the version and the project name
75 # need adjusted, and handles it in a single redirect.
76 if release.version != request.matchdict.get("version", release.version):
77 return HTTPMovedPermanently(
78 request.current_route_path(
79 name=project.name,
80 version=release.version,
81 ),
82 )
83
84 # It's possible that the requested version was correct (or not provided),
85 # but we still need to adjust the project name.
86 if project.name != request.matchdict.get("name", project.name):
87 return HTTPMovedPermanently(
88 request.current_route_path(name=project.name),
89 )
90
91 # Get all of the registered versions for this Project, in order of newest
92 # to oldest.
93 all_releases = (
94 request.db.query(Release)
95 .filter(Release.project == project)
96 .with_entities(
97 Release.version,
98 Release.is_prerelease,
99 Release.created)
100 .order_by(Release._pypi_ordering.desc())
101 .all()
102 )
103
104 # Get the latest non-prerelease of this Project, or the latest release if
105 # all releases are prereleases.
106 latest_release = first(
107 all_releases,
108 key=lambda r: not r.is_prerelease,
109 default=all_releases[0],
110 )
111
112 # Get all of the maintainers for this project.
113 maintainers = [
114 r.user
115 for r in (
116 request.db.query(Role)
117 .join(User)
118 .filter(Role.project == project)
119 .distinct(User.username)
120 .order_by(User.username)
121 .all()
122 )
123 ]
124
125 # Get the license from the classifiers or metadata, preferring classifiers.
126 license = None
127 if release.license:
128 # Make a best effort when the entire license text is given
129 # by using the first line only.
130 license = release.license.split('\n')[0]
131 license_classifiers = [c.split(" :: ")[-1] for c in release.classifiers
132 if c.startswith("License")]
133 if license_classifiers:
134 license = ', '.join(license_classifiers)
135
136 return {
137 "project": project,
138 "release": release,
139 "files": release.files.all(),
140 "latest_release": latest_release,
141 "all_releases": all_releases,
142 "maintainers": maintainers,
143 "license": license,
144 }
145
146
147 @view_config(
148 route_name="includes.edit-project-button",
149 renderer="includes/manage-project-button.html",
150 uses_session=True,
151 permission="manage",
152 )
153 def edit_project_button(project, request):
154 return {'project': project}
155
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/warehouse/packaging/views.py b/warehouse/packaging/views.py
--- a/warehouse/packaging/views.py
+++ b/warehouse/packaging/views.py
@@ -122,16 +122,21 @@
)
]
- # Get the license from the classifiers or metadata, preferring classifiers.
- license = None
- if release.license:
- # Make a best effort when the entire license text is given
- # by using the first line only.
- license = release.license.split('\n')[0]
- license_classifiers = [c.split(" :: ")[-1] for c in release.classifiers
- if c.startswith("License")]
- if license_classifiers:
- license = ', '.join(license_classifiers)
+ # Get the license from both the `Classifier` and `License` metadata fields
+ license_classifiers = ', '.join(
+ c.split(" :: ")[-1]
+ for c in release.classifiers
+ if c.startswith("License")
+ )
+
+ # Make a best effort when the entire license text is given by using the
+ # first line only.
+ short_license = release.license.split('\n')[0] if release.license else None
+
+ if license_classifiers and short_license:
+ license = f'{license_classifiers} ({short_license})'
+ else:
+ license = license_classifiers or short_license or None
return {
"project": project,
| {"golden_diff": "diff --git a/warehouse/packaging/views.py b/warehouse/packaging/views.py\n--- a/warehouse/packaging/views.py\n+++ b/warehouse/packaging/views.py\n@@ -122,16 +122,21 @@\n )\n ]\n \n- # Get the license from the classifiers or metadata, preferring classifiers.\n- license = None\n- if release.license:\n- # Make a best effort when the entire license text is given\n- # by using the first line only.\n- license = release.license.split('\\n')[0]\n- license_classifiers = [c.split(\" :: \")[-1] for c in release.classifiers\n- if c.startswith(\"License\")]\n- if license_classifiers:\n- license = ', '.join(license_classifiers)\n+ # Get the license from both the `Classifier` and `License` metadata fields\n+ license_classifiers = ', '.join(\n+ c.split(\" :: \")[-1]\n+ for c in release.classifiers\n+ if c.startswith(\"License\")\n+ )\n+\n+ # Make a best effort when the entire license text is given by using the\n+ # first line only.\n+ short_license = release.license.split('\\n')[0] if release.license else None\n+\n+ if license_classifiers and short_license:\n+ license = f'{license_classifiers} ({short_license})'\n+ else:\n+ license = license_classifiers or short_license or None\n \n return {\n \"project\": project,\n", "issue": "License metadata seems to be ignored\nWhen I made the latest release of `python-dateutil`, it came along with a license change from BSD to Apache / BSD dual licensed. I updated the `license=` metadata in `setup.py`, but I forgot to update the trove classifiers.\r\n\r\n[The page on PyPI](https://pypi.python.org/pypi/python-dateutil/2.7.0) shows the license as \"Apache 2.0\" as I would expect. [The page on warehouse](https://pypi.org/project/python-dateutil/) shows the license as \"BSD License\". I'm assuming it's pulling that from the trove classifier? 
Shouldn't it pull it from the `license` field if that is populated?\n", "before_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom first import first\nfrom pyramid.httpexceptions import HTTPMovedPermanently, HTTPNotFound\nfrom pyramid.view import view_config\nfrom sqlalchemy.orm.exc import NoResultFound\n\nfrom warehouse.accounts.models import User\nfrom warehouse.cache.origin import origin_cache\nfrom warehouse.packaging.models import Release, Role\n\n\n@view_config(\n route_name=\"packaging.project\",\n renderer=\"packaging/detail.html\",\n decorator=[\n origin_cache(\n 1 * 24 * 60 * 60, # 1 day\n stale_while_revalidate=1 * 24 * 60 * 60, # 1 day\n stale_if_error=5 * 24 * 60 * 60, # 5 days\n ),\n ],\n)\ndef project_detail(project, request):\n if project.name != request.matchdict.get(\"name\", project.name):\n return HTTPMovedPermanently(\n request.current_route_path(name=project.name),\n )\n\n try:\n release = (\n request.db.query(Release)\n .filter(Release.project == project)\n .order_by(\n Release.is_prerelease.nullslast(),\n Release._pypi_ordering.desc())\n .limit(1)\n .one()\n )\n except NoResultFound:\n return HTTPNotFound()\n\n return release_detail(release, request)\n\n\n@view_config(\n route_name=\"packaging.release\",\n renderer=\"packaging/detail.html\",\n decorator=[\n origin_cache(\n 1 * 24 * 60 * 60, # 1 day\n stale_while_revalidate=1 * 24 * 60 * 60, # 1 day\n stale_if_error=5 * 24 * 60 * 60, # 5 days\n ),\n ],\n)\ndef release_detail(release, request):\n project = release.project\n\n # Check if the requested version is equivalent but not exactly the same as\n # the release's version. 
Use `.get` because this view is used by\n # `project_detail` and there may not be a version.\n #\n # This also handles the case where both the version and the project name\n # need adjusted, and handles it in a single redirect.\n if release.version != request.matchdict.get(\"version\", release.version):\n return HTTPMovedPermanently(\n request.current_route_path(\n name=project.name,\n version=release.version,\n ),\n )\n\n # It's possible that the requested version was correct (or not provided),\n # but we still need to adjust the project name.\n if project.name != request.matchdict.get(\"name\", project.name):\n return HTTPMovedPermanently(\n request.current_route_path(name=project.name),\n )\n\n # Get all of the registered versions for this Project, in order of newest\n # to oldest.\n all_releases = (\n request.db.query(Release)\n .filter(Release.project == project)\n .with_entities(\n Release.version,\n Release.is_prerelease,\n Release.created)\n .order_by(Release._pypi_ordering.desc())\n .all()\n )\n\n # Get the latest non-prerelease of this Project, or the latest release if\n # all releases are prereleases.\n latest_release = first(\n all_releases,\n key=lambda r: not r.is_prerelease,\n default=all_releases[0],\n )\n\n # Get all of the maintainers for this project.\n maintainers = [\n r.user\n for r in (\n request.db.query(Role)\n .join(User)\n .filter(Role.project == project)\n .distinct(User.username)\n .order_by(User.username)\n .all()\n )\n ]\n\n # Get the license from the classifiers or metadata, preferring classifiers.\n license = None\n if release.license:\n # Make a best effort when the entire license text is given\n # by using the first line only.\n license = release.license.split('\\n')[0]\n license_classifiers = [c.split(\" :: \")[-1] for c in release.classifiers\n if c.startswith(\"License\")]\n if license_classifiers:\n license = ', '.join(license_classifiers)\n\n return {\n \"project\": project,\n \"release\": release,\n \"files\": release.files.all(),\n \"latest_release\": latest_release,\n \"all_releases\": all_releases,\n \"maintainers\": maintainers,\n \"license\": license,\n }\n\n\n@view_config(\n route_name=\"includes.edit-project-button\",\n renderer=\"includes/manage-project-button.html\",\n uses_session=True,\n permission=\"manage\",\n)\ndef edit_project_button(project, request):\n return {'project': project}\n", "path": "warehouse/packaging/views.py"}], "after_files": [{"content": "# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom first import first\nfrom pyramid.httpexceptions import HTTPMovedPermanently, HTTPNotFound\nfrom pyramid.view import view_config\nfrom sqlalchemy.orm.exc import NoResultFound\n\nfrom warehouse.accounts.models import User\nfrom warehouse.cache.origin import origin_cache\nfrom warehouse.packaging.models import Release, Role\n\n\n@view_config(\n route_name=\"packaging.project\",\n renderer=\"packaging/detail.html\",\n decorator=[\n origin_cache(\n 1 * 24 * 60 * 60, # 1 day\n stale_while_revalidate=1 * 24 * 60 * 60, # 1 day\n 
stale_if_error=5 * 24 * 60 * 60, # 5 days\n ),\n ],\n)\ndef project_detail(project, request):\n if project.name != request.matchdict.get(\"name\", project.name):\n return HTTPMovedPermanently(\n request.current_route_path(name=project.name),\n )\n\n try:\n release = (\n request.db.query(Release)\n .filter(Release.project == project)\n .order_by(\n Release.is_prerelease.nullslast(),\n Release._pypi_ordering.desc())\n .limit(1)\n .one()\n )\n except NoResultFound:\n return HTTPNotFound()\n\n return release_detail(release, request)\n\n\n@view_config(\n route_name=\"packaging.release\",\n renderer=\"packaging/detail.html\",\n decorator=[\n origin_cache(\n 1 * 24 * 60 * 60, # 1 day\n stale_while_revalidate=1 * 24 * 60 * 60, # 1 day\n stale_if_error=5 * 24 * 60 * 60, # 5 days\n ),\n ],\n)\ndef release_detail(release, request):\n project = release.project\n\n # Check if the requested version is equivalent but not exactly the same as\n # the release's version. Use `.get` because this view is used by\n # `project_detail` and there may not be a version.\n #\n # This also handles the case where both the version and the project name\n # need adjusted, and handles it in a single redirect.\n if release.version != request.matchdict.get(\"version\", release.version):\n return HTTPMovedPermanently(\n request.current_route_path(\n name=project.name,\n version=release.version,\n ),\n )\n\n # It's possible that the requested version was correct (or not provided),\n # but we still need to adjust the project name.\n if project.name != request.matchdict.get(\"name\", project.name):\n return HTTPMovedPermanently(\n request.current_route_path(name=project.name),\n )\n\n # Get all of the registered versions for this Project, in order of newest\n # to oldest.\n all_releases = (\n request.db.query(Release)\n .filter(Release.project == project)\n .with_entities(\n Release.version,\n Release.is_prerelease,\n Release.created)\n .order_by(Release._pypi_ordering.desc())\n .all()\n )\n\n # Get the latest non-prerelease of this Project, or the latest release if\n # all releases are prereleases.\n latest_release = first(\n all_releases,\n key=lambda r: not r.is_prerelease,\n default=all_releases[0],\n )\n\n # Get all of the maintainers for this project.\n maintainers = [\n r.user\n for r in (\n request.db.query(Role)\n .join(User)\n .filter(Role.project == project)\n .distinct(User.username)\n .order_by(User.username)\n .all()\n )\n ]\n\n # Get the license from both the `Classifier` and `License` metadata fields\n license_classifiers = ', '.join(\n c.split(\" :: \")[-1]\n for c in release.classifiers\n if c.startswith(\"License\")\n )\n\n # Make a best effort when the entire license text is given by using the\n # first line only.\n short_license = release.license.split('\\n')[0] if release.license else None\n\n if license_classifiers and short_license:\n license = f'{license_classifiers} ({short_license})'\n else:\n license = license_classifiers or short_license or None\n\n return {\n \"project\": project,\n \"release\": release,\n \"files\": release.files.all(),\n \"latest_release\": latest_release,\n \"all_releases\": all_releases,\n \"maintainers\": maintainers,\n \"license\": license,\n }\n\n\n@view_config(\n route_name=\"includes.edit-project-button\",\n renderer=\"includes/manage-project-button.html\",\n uses_session=True,\n permission=\"manage\",\n)\ndef edit_project_button(project, request):\n return {'project': project}\n", "path": "warehouse/packaging/views.py"}]} | 1,916 | 334 |
gh_patches_debug_25260 | rasdani/github-patches | git_diff | streamlit__streamlit-4525 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
st.json collapse parameter
_(Note, you don't have to fill out every section here. They're just here for guidance. That said, nicely detailed feature requests are more likely to get eng attention sooner)_
### Problem
Have a parameter for st.json(body, collapse) where the default is `False` but you can set it to `True`. This would allow developers to choose if the json file is expanded or collapsed when rendered on the Streamlit app.
Requested by a community member, link to forum post:
https://discuss.streamlit.io/t/json-collapse-option/17159
### Solution
**MVP:** a parameter to set the view of a json file on the first render in Streamlit
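
As a sketch of the requested behavior only (the final keyword name is a maintainer decision; `expanded` is shown here as one plausible spelling of the `collapse` idea):

```
import streamlit as st

data = {"foo": "bar", "stuff": ["stuff 1", "stuff 2"]}
st.json(data)                  # current behavior: rendered expanded
st.json(data, expanded=False)  # proposed: start collapsed
```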
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `e2e/scripts/st_json.py`
Content:
```
1 # Copyright 2018-2022 Streamlit Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import streamlit as st
16
17 data = {"foo": "bar"}
18 st.json(data)
19
```
Path: `lib/streamlit/elements/json.py`
Content:
```
1 # Copyright 2018-2022 Streamlit Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import json
16 from typing import cast
17
18 import streamlit
19 from streamlit.proto.Json_pb2 import Json as JsonProto
20 from streamlit.state import AutoSessionState
21
22
23 class JsonMixin:
24 def json(self, body):
25 """Display object or string as a pretty-printed JSON string.
26
27 Parameters
28 ----------
29 body : Object or str
30 The object to print as JSON. All referenced objects should be
31 serializable to JSON as well. If object is a string, we assume it
32 contains serialized JSON.
33
34 Example
35 -------
36 >>> st.json({
37 ... 'foo': 'bar',
38 ... 'baz': 'boz',
39 ... 'stuff': [
40 ... 'stuff 1',
41 ... 'stuff 2',
42 ... 'stuff 3',
43 ... 'stuff 5',
44 ... ],
45 ... })
46
47 .. output::
48 https://share.streamlit.io/streamlit/docs/main/python/api-examples-source/data.json.py
49 height: 385px
50
51 """
52 import streamlit as st
53
54 if isinstance(body, AutoSessionState):
55 body = body.to_dict()
56
57 if not isinstance(body, str):
58 try:
59 body = json.dumps(body, default=repr)
60 except TypeError as err:
61 st.warning(
62 "Warning: this data structure was not fully serializable as "
63 "JSON due to one or more unexpected keys. (Error was: %s)" % err
64 )
65 body = json.dumps(body, skipkeys=True, default=repr)
66
67 json_proto = JsonProto()
68 json_proto.body = body
69 return self.dg._enqueue("json", json_proto)
70
71 @property
72 def dg(self) -> "streamlit.delta_generator.DeltaGenerator":
73 """Get our DeltaGenerator."""
74 return cast("streamlit.delta_generator.DeltaGenerator", self)
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/e2e/scripts/st_json.py b/e2e/scripts/st_json.py
--- a/e2e/scripts/st_json.py
+++ b/e2e/scripts/st_json.py
@@ -16,3 +16,4 @@
data = {"foo": "bar"}
st.json(data)
+st.json(data, expanded=False)
diff --git a/lib/streamlit/elements/json.py b/lib/streamlit/elements/json.py
--- a/lib/streamlit/elements/json.py
+++ b/lib/streamlit/elements/json.py
@@ -21,7 +21,12 @@
class JsonMixin:
- def json(self, body):
+ def json(
+ self,
+ body,
+ *, # keyword-only arguments:
+ expanded=True,
+ ):
"""Display object or string as a pretty-printed JSON string.
Parameters
@@ -31,6 +36,11 @@
serializable to JSON as well. If object is a string, we assume it
contains serialized JSON.
+ expanded : bool
+ An optional boolean that allows the user to set whether the initial
+ state of this json element should be expanded. Defaults to True.
+ This argument can only be supplied by keyword.
+
Example
-------
>>> st.json({
@@ -66,6 +76,7 @@
json_proto = JsonProto()
json_proto.body = body
+ json_proto.expanded = expanded
return self.dg._enqueue("json", json_proto)
@property
| {"golden_diff": "diff --git a/e2e/scripts/st_json.py b/e2e/scripts/st_json.py\n--- a/e2e/scripts/st_json.py\n+++ b/e2e/scripts/st_json.py\n@@ -16,3 +16,4 @@\n \n data = {\"foo\": \"bar\"}\n st.json(data)\n+st.json(data, expanded=False)\ndiff --git a/lib/streamlit/elements/json.py b/lib/streamlit/elements/json.py\n--- a/lib/streamlit/elements/json.py\n+++ b/lib/streamlit/elements/json.py\n@@ -21,7 +21,12 @@\n \n \n class JsonMixin:\n- def json(self, body):\n+ def json(\n+ self,\n+ body,\n+ *, # keyword-only arguments:\n+ expanded=True,\n+ ):\n \"\"\"Display object or string as a pretty-printed JSON string.\n \n Parameters\n@@ -31,6 +36,11 @@\n serializable to JSON as well. If object is a string, we assume it\n contains serialized JSON.\n \n+ expanded : bool\n+ An optional boolean that allows the user to set whether the initial\n+ state of this json element should be expanded. Defaults to True.\n+ This argument can only be supplied by keyword.\n+\n Example\n -------\n >>> st.json({\n@@ -66,6 +76,7 @@\n \n json_proto = JsonProto()\n json_proto.body = body\n+ json_proto.expanded = expanded\n return self.dg._enqueue(\"json\", json_proto)\n \n @property\n", "issue": "st.json collapse parameter\n_(Note, you don't have to fill out every section here. They're just here for guidance. That said, nicely detailed feature requests are more likely to get eng attention sooner)_\r\n\r\n### Problem\r\n\r\nHave a parameter for st.json(body, collapse) where the default is `False` but you can set it to `True`. This would allow developers to choose if the json file is expanded or collapsed when rendered on the Streamlit app.\r\n\r\nRequested by a community member, link to forum post: \r\nhttps://discuss.streamlit.io/t/json-collapse-option/17159\r\n\r\n### Solution\r\n\r\n**MVP:** a parameter to set the view of a json file on the first render in Streamlit\r\n\r\n\n", "before_files": [{"content": "# Copyright 2018-2022 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport streamlit as st\n\ndata = {\"foo\": \"bar\"}\nst.json(data)\n", "path": "e2e/scripts/st_json.py"}, {"content": "# Copyright 2018-2022 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport json\nfrom typing import cast\n\nimport streamlit\nfrom streamlit.proto.Json_pb2 import Json as JsonProto\nfrom streamlit.state import AutoSessionState\n\n\nclass JsonMixin:\n def json(self, body):\n \"\"\"Display object or string as a pretty-printed JSON string.\n\n Parameters\n ----------\n body : Object or str\n The 
object to print as JSON. All referenced objects should be\n serializable to JSON as well. If object is a string, we assume it\n contains serialized JSON.\n\n Example\n -------\n >>> st.json({\n ... 'foo': 'bar',\n ... 'baz': 'boz',\n ... 'stuff': [\n ... 'stuff 1',\n ... 'stuff 2',\n ... 'stuff 3',\n ... 'stuff 5',\n ... ],\n ... })\n\n .. output::\n https://share.streamlit.io/streamlit/docs/main/python/api-examples-source/data.json.py\n height: 385px\n\n \"\"\"\n import streamlit as st\n\n if isinstance(body, AutoSessionState):\n body = body.to_dict()\n\n if not isinstance(body, str):\n try:\n body = json.dumps(body, default=repr)\n except TypeError as err:\n st.warning(\n \"Warning: this data structure was not fully serializable as \"\n \"JSON due to one or more unexpected keys. (Error was: %s)\" % err\n )\n body = json.dumps(body, skipkeys=True, default=repr)\n\n json_proto = JsonProto()\n json_proto.body = body\n return self.dg._enqueue(\"json\", json_proto)\n\n @property\n def dg(self) -> \"streamlit.delta_generator.DeltaGenerator\":\n \"\"\"Get our DeltaGenerator.\"\"\"\n return cast(\"streamlit.delta_generator.DeltaGenerator\", self)\n", "path": "lib/streamlit/elements/json.py"}], "after_files": [{"content": "# Copyright 2018-2022 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport streamlit as st\n\ndata = {\"foo\": \"bar\"}\nst.json(data)\nst.json(data, expanded=False)\n", "path": "e2e/scripts/st_json.py"}, {"content": "# Copyright 2018-2022 Streamlit Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport json\nfrom typing import cast\n\nimport streamlit\nfrom streamlit.proto.Json_pb2 import Json as JsonProto\nfrom streamlit.state import AutoSessionState\n\n\nclass JsonMixin:\n def json(\n self,\n body,\n *, # keyword-only arguments:\n expanded=True,\n ):\n \"\"\"Display object or string as a pretty-printed JSON string.\n\n Parameters\n ----------\n body : Object or str\n The object to print as JSON. All referenced objects should be\n serializable to JSON as well. If object is a string, we assume it\n contains serialized JSON.\n\n expanded : bool\n An optional boolean that allows the user to set whether the initial\n state of this json element should be expanded. Defaults to True.\n This argument can only be supplied by keyword.\n\n Example\n -------\n >>> st.json({\n ... 'foo': 'bar',\n ... 'baz': 'boz',\n ... 'stuff': [\n ... 'stuff 1',\n ... 'stuff 2',\n ... 'stuff 3',\n ... 'stuff 5',\n ... ],\n ... })\n\n .. 
output::\n https://share.streamlit.io/streamlit/docs/main/python/api-examples-source/data.json.py\n height: 385px\n\n \"\"\"\n import streamlit as st\n\n if isinstance(body, AutoSessionState):\n body = body.to_dict()\n\n if not isinstance(body, str):\n try:\n body = json.dumps(body, default=repr)\n except TypeError as err:\n st.warning(\n \"Warning: this data structure was not fully serializable as \"\n \"JSON due to one or more unexpected keys. (Error was: %s)\" % err\n )\n body = json.dumps(body, skipkeys=True, default=repr)\n\n json_proto = JsonProto()\n json_proto.body = body\n json_proto.expanded = expanded\n return self.dg._enqueue(\"json\", json_proto)\n\n @property\n def dg(self) -> \"streamlit.delta_generator.DeltaGenerator\":\n \"\"\"Get our DeltaGenerator.\"\"\"\n return cast(\"streamlit.delta_generator.DeltaGenerator\", self)\n", "path": "lib/streamlit/elements/json.py"}]} | 1,297 | 336 |
gh_patches_debug_21934 | rasdani/github-patches | git_diff | Project-MONAI__MONAI-2254 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ASPP type hints need to be updated
In ASPP (https://github.com/Project-MONAI/MONAI/blob/dev/monai/networks/blocks/aspp.py), the type hints of `acti_type` and `norm_type` are missing.
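
For illustration, the missing annotations might look like the hypothetical fragment below; the exact types are an assumption, since the layer factories accept both a factory name and a tuple:

```
from typing import Optional, Tuple, Union

# Hypothetical annotations for SimpleASPP.__init__ keyword arguments:
norm_type: Optional[Union[Tuple, str]] = "BATCH"
acti_type: Optional[Union[Tuple, str]] = "LEAKYRELU"
```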
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `monai/networks/blocks/aspp.py`
Content:
```
1 # Copyright 2020 - 2021 MONAI Consortium
2 # Licensed under the Apache License, Version 2.0 (the "License");
3 # you may not use this file except in compliance with the License.
4 # You may obtain a copy of the License at
5 # http://www.apache.org/licenses/LICENSE-2.0
6 # Unless required by applicable law or agreed to in writing, software
7 # distributed under the License is distributed on an "AS IS" BASIS,
8 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
9 # See the License for the specific language governing permissions and
10 # limitations under the License.
11
12 from typing import Sequence
13
14 import torch
15 import torch.nn as nn
16
17 from monai.networks.blocks.convolutions import Convolution
18 from monai.networks.layers import same_padding
19 from monai.networks.layers.factories import Act, Conv, Norm
20
21
22 class SimpleASPP(nn.Module):
23 """
24 A simplified version of the atrous spatial pyramid pooling (ASPP) module.
25
26 Chen et al., Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation.
27 https://arxiv.org/abs/1802.02611
28
29 Wang et al., A Noise-robust Framework for Automatic Segmentation of COVID-19 Pneumonia Lesions
30 from CT Images. https://ieeexplore.ieee.org/document/9109297
31 """
32
33 def __init__(
34 self,
35 spatial_dims: int,
36 in_channels: int,
37 conv_out_channels: int,
38 kernel_sizes: Sequence[int] = (1, 3, 3, 3),
39 dilations: Sequence[int] = (1, 2, 4, 6),
40 norm_type=Norm.BATCH,
41 acti_type=Act.LEAKYRELU,
42 ) -> None:
43 """
44 Args:
45 spatial_dims: number of spatial dimensions, could be 1, 2, or 3.
46 in_channels: number of input channels.
47 conv_out_channels: number of output channels of each atrous conv.
48 The final number of output channels is conv_out_channels * len(kernel_sizes).
49 kernel_sizes: a sequence of four convolutional kernel sizes.
50 Defaults to (1, 3, 3, 3) for four (dilated) convolutions.
51 dilations: a sequence of four convolutional dilation parameters.
52 Defaults to (1, 2, 4, 6) for four (dilated) convolutions.
53 norm_type: final kernel-size-one convolution normalization type.
54 Defaults to batch norm.
55 acti_type: final kernel-size-one convolution activation type.
56 Defaults to leaky ReLU.
57
58 Raises:
59 ValueError: When ``kernel_sizes`` length differs from ``dilations``.
60
61 See also:
62
63 :py:class:`monai.networks.layers.Act`
64 :py:class:`monai.networks.layers.Conv`
65 :py:class:`monai.networks.layers.Norm`
66
67 """
68 super().__init__()
69 if len(kernel_sizes) != len(dilations):
70 raise ValueError(
71 "kernel_sizes and dilations length must match, "
72 f"got kernel_sizes={len(kernel_sizes)} dilations={len(dilations)}."
73 )
74 pads = tuple(same_padding(k, d) for k, d in zip(kernel_sizes, dilations))
75
76 self.convs = nn.ModuleList()
77 for k, d, p in zip(kernel_sizes, dilations, pads):
78 _conv = Conv[Conv.CONV, spatial_dims](
79 in_channels=in_channels, out_channels=conv_out_channels, kernel_size=k, dilation=d, padding=p
80 )
81 self.convs.append(_conv)
82
83 out_channels = conv_out_channels * len(pads) # final conv. output channels
84 self.conv_k1 = Convolution(
85 dimensions=spatial_dims,
86 in_channels=out_channels,
87 out_channels=out_channels,
88 kernel_size=1,
89 act=acti_type,
90 norm=norm_type,
91 )
92
93 def forward(self, x: torch.Tensor) -> torch.Tensor:
94 """
95 Args:
96 x: in shape (batch, channel, spatial_1[, spatial_2, ...]).
97 """
98 x_out = torch.cat([conv(x) for conv in self.convs], dim=1)
99 x_out = self.conv_k1(x_out)
100 return x_out
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/monai/networks/blocks/aspp.py b/monai/networks/blocks/aspp.py
--- a/monai/networks/blocks/aspp.py
+++ b/monai/networks/blocks/aspp.py
@@ -9,14 +9,14 @@
# See the License for the specific language governing permissions and
# limitations under the License.
-from typing import Sequence
+from typing import Optional, Sequence, Tuple, Union
import torch
import torch.nn as nn
from monai.networks.blocks.convolutions import Convolution
from monai.networks.layers import same_padding
-from monai.networks.layers.factories import Act, Conv, Norm
+from monai.networks.layers.factories import Conv
class SimpleASPP(nn.Module):
@@ -37,8 +37,8 @@
conv_out_channels: int,
kernel_sizes: Sequence[int] = (1, 3, 3, 3),
dilations: Sequence[int] = (1, 2, 4, 6),
- norm_type=Norm.BATCH,
- acti_type=Act.LEAKYRELU,
+ norm_type: Optional[Union[Tuple, str]] = "BATCH",
+ acti_type: Optional[Union[Tuple, str]] = "LEAKYRELU",
) -> None:
"""
Args:
| {"golden_diff": "diff --git a/monai/networks/blocks/aspp.py b/monai/networks/blocks/aspp.py\n--- a/monai/networks/blocks/aspp.py\n+++ b/monai/networks/blocks/aspp.py\n@@ -9,14 +9,14 @@\n # See the License for the specific language governing permissions and\n # limitations under the License.\n \n-from typing import Sequence\n+from typing import Optional, Sequence, Tuple, Union\n \n import torch\n import torch.nn as nn\n \n from monai.networks.blocks.convolutions import Convolution\n from monai.networks.layers import same_padding\n-from monai.networks.layers.factories import Act, Conv, Norm\n+from monai.networks.layers.factories import Conv\n \n \n class SimpleASPP(nn.Module):\n@@ -37,8 +37,8 @@\n conv_out_channels: int,\n kernel_sizes: Sequence[int] = (1, 3, 3, 3),\n dilations: Sequence[int] = (1, 2, 4, 6),\n- norm_type=Norm.BATCH,\n- acti_type=Act.LEAKYRELU,\n+ norm_type: Optional[Union[Tuple, str]] = \"BATCH\",\n+ acti_type: Optional[Union[Tuple, str]] = \"LEAKYRELU\",\n ) -> None:\n \"\"\"\n Args:\n", "issue": "ASPP type hints need to be updated\nIn ASPP (https://github.com/Project-MONAI/MONAI/blob/dev/monai/networks/blocks/aspp.py), the type hints of `acti_type` and `norm_type` are missing.\n", "before_files": [{"content": "# Copyright 2020 - 2021 MONAI Consortium\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n# http://www.apache.org/licenses/LICENSE-2.0\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Sequence\n\nimport torch\nimport torch.nn as nn\n\nfrom monai.networks.blocks.convolutions import Convolution\nfrom monai.networks.layers import same_padding\nfrom monai.networks.layers.factories import Act, Conv, Norm\n\n\nclass SimpleASPP(nn.Module):\n \"\"\"\n A simplified version of the atrous spatial pyramid pooling (ASPP) module.\n\n Chen et al., Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation.\n https://arxiv.org/abs/1802.02611\n\n Wang et al., A Noise-robust Framework for Automatic Segmentation of COVID-19 Pneumonia Lesions\n from CT Images. 
https://ieeexplore.ieee.org/document/9109297\n \"\"\"\n\n def __init__(\n self,\n spatial_dims: int,\n in_channels: int,\n conv_out_channels: int,\n kernel_sizes: Sequence[int] = (1, 3, 3, 3),\n dilations: Sequence[int] = (1, 2, 4, 6),\n norm_type=Norm.BATCH,\n acti_type=Act.LEAKYRELU,\n ) -> None:\n \"\"\"\n Args:\n spatial_dims: number of spatial dimensions, could be 1, 2, or 3.\n in_channels: number of input channels.\n conv_out_channels: number of output channels of each atrous conv.\n The final number of output channels is conv_out_channels * len(kernel_sizes).\n kernel_sizes: a sequence of four convolutional kernel sizes.\n Defaults to (1, 3, 3, 3) for four (dilated) convolutions.\n dilations: a sequence of four convolutional dilation parameters.\n Defaults to (1, 2, 4, 6) for four (dilated) convolutions.\n norm_type: final kernel-size-one convolution normalization type.\n Defaults to batch norm.\n acti_type: final kernel-size-one convolution activation type.\n Defaults to leaky ReLU.\n\n Raises:\n ValueError: When ``kernel_sizes`` length differs from ``dilations``.\n\n See also:\n\n :py:class:`monai.networks.layers.Act`\n :py:class:`monai.networks.layers.Conv`\n :py:class:`monai.networks.layers.Norm`\n\n \"\"\"\n super().__init__()\n if len(kernel_sizes) != len(dilations):\n raise ValueError(\n \"kernel_sizes and dilations length must match, \"\n f\"got kernel_sizes={len(kernel_sizes)} dilations={len(dilations)}.\"\n )\n pads = tuple(same_padding(k, d) for k, d in zip(kernel_sizes, dilations))\n\n self.convs = nn.ModuleList()\n for k, d, p in zip(kernel_sizes, dilations, pads):\n _conv = Conv[Conv.CONV, spatial_dims](\n in_channels=in_channels, out_channels=conv_out_channels, kernel_size=k, dilation=d, padding=p\n )\n self.convs.append(_conv)\n\n out_channels = conv_out_channels * len(pads) # final conv. 
output channels\n self.conv_k1 = Convolution(\n dimensions=spatial_dims,\n in_channels=out_channels,\n out_channels=out_channels,\n kernel_size=1,\n act=acti_type,\n norm=norm_type,\n )\n\n def forward(self, x: torch.Tensor) -> torch.Tensor:\n \"\"\"\n Args:\n x: in shape (batch, channel, spatial_1[, spatial_2, ...]).\n \"\"\"\n x_out = torch.cat([conv(x) for conv in self.convs], dim=1)\n x_out = self.conv_k1(x_out)\n return x_out\n", "path": "monai/networks/blocks/aspp.py"}], "after_files": [{"content": "# Copyright 2020 - 2021 MONAI Consortium\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n# http://www.apache.org/licenses/LICENSE-2.0\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Optional, Sequence, Tuple, Union\n\nimport torch\nimport torch.nn as nn\n\nfrom monai.networks.blocks.convolutions import Convolution\nfrom monai.networks.layers import same_padding\nfrom monai.networks.layers.factories import Conv\n\n\nclass SimpleASPP(nn.Module):\n \"\"\"\n A simplified version of the atrous spatial pyramid pooling (ASPP) module.\n\n Chen et al., Encoder-Decoder with Atrous Separable Convolution for Semantic Image Segmentation.\n https://arxiv.org/abs/1802.02611\n\n Wang et al., A Noise-robust Framework for Automatic Segmentation of COVID-19 Pneumonia Lesions\n from CT Images. https://ieeexplore.ieee.org/document/9109297\n \"\"\"\n\n def __init__(\n self,\n spatial_dims: int,\n in_channels: int,\n conv_out_channels: int,\n kernel_sizes: Sequence[int] = (1, 3, 3, 3),\n dilations: Sequence[int] = (1, 2, 4, 6),\n norm_type: Optional[Union[Tuple, str]] = \"BATCH\",\n acti_type: Optional[Union[Tuple, str]] = \"LEAKYRELU\",\n ) -> None:\n \"\"\"\n Args:\n spatial_dims: number of spatial dimensions, could be 1, 2, or 3.\n in_channels: number of input channels.\n conv_out_channels: number of output channels of each atrous conv.\n The final number of output channels is conv_out_channels * len(kernel_sizes).\n kernel_sizes: a sequence of four convolutional kernel sizes.\n Defaults to (1, 3, 3, 3) for four (dilated) convolutions.\n dilations: a sequence of four convolutional dilation parameters.\n Defaults to (1, 2, 4, 6) for four (dilated) convolutions.\n norm_type: final kernel-size-one convolution normalization type.\n Defaults to batch norm.\n acti_type: final kernel-size-one convolution activation type.\n Defaults to leaky ReLU.\n\n Raises:\n ValueError: When ``kernel_sizes`` length differs from ``dilations``.\n\n See also:\n\n :py:class:`monai.networks.layers.Act`\n :py:class:`monai.networks.layers.Conv`\n :py:class:`monai.networks.layers.Norm`\n\n \"\"\"\n super().__init__()\n if len(kernel_sizes) != len(dilations):\n raise ValueError(\n \"kernel_sizes and dilations length must match, \"\n f\"got kernel_sizes={len(kernel_sizes)} dilations={len(dilations)}.\"\n )\n pads = tuple(same_padding(k, d) for k, d in zip(kernel_sizes, dilations))\n\n self.convs = nn.ModuleList()\n for k, d, p in zip(kernel_sizes, dilations, pads):\n _conv = Conv[Conv.CONV, spatial_dims](\n in_channels=in_channels, out_channels=conv_out_channels, kernel_size=k, dilation=d, padding=p\n )\n 
self.convs.append(_conv)\n\n out_channels = conv_out_channels * len(pads) # final conv. output channels\n self.conv_k1 = Convolution(\n dimensions=spatial_dims,\n in_channels=out_channels,\n out_channels=out_channels,\n kernel_size=1,\n act=acti_type,\n norm=norm_type,\n )\n\n def forward(self, x: torch.Tensor) -> torch.Tensor:\n \"\"\"\n Args:\n x: in shape (batch, channel, spatial_1[, spatial_2, ...]).\n \"\"\"\n x_out = torch.cat([conv(x) for conv in self.convs], dim=1)\n x_out = self.conv_k1(x_out)\n return x_out\n", "path": "monai/networks/blocks/aspp.py"}]} | 1,466 | 298 |
gh_patches_debug_25650 | rasdani/github-patches | git_diff | astronomer__astro-sdk-1374 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
All connections tested even though one's been specified
**Describe the bug**
I ran `astro flow validate --connection=<connection_id>` and all connections were tested even though I passed one conn id specifically.
**Version**
* Astro Runtime: 7.0.0
* Astro CLI: 1.8.3
**To Reproduce**
Steps to reproduce the behavior:
1. Update file `config/default/configuration.yml` as shown below:
```
connections:
- conn_id: sqlite_conn
conn_type: sqlite
host: /Users/magdagultekin/magda-dev/data/imdb.db
login: null
password: null
schema: null
- conn_id: sqlite_default
conn_type: sqlite
host: /tmp/sqlite.db
login: null
password: null
schema: null
```
4. Run `astro flow validate --connection=sqlite_default`
5. See message:
```
Validating connection(s) for environment 'default'
Validating connection sqlite_conn PASSED
Validating connection sqlite_default FAILED
```
**Expected behavior**
Only `sqlite_default` should be tested.
**Screenshots**

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sql-cli/sql_cli/connections.py`
Content:
```
1 from __future__ import annotations
2
3 import os
4 from pathlib import Path
5
6 from airflow.models import Connection
7
8 from sql_cli.utils.rich import rprint
9
10 CONNECTION_ID_OUTPUT_STRING_WIDTH = 25
11
12
13 def validate_connections(connections: list[Connection], connection_id: str | None = None) -> None:
14 """
15 Validates that the given connections are valid and registers them to Airflow with replace policy for existing
16 connections.
17 """
18 for connection in connections:
19 os.environ[f"AIRFLOW_CONN_{connection.conn_id.upper()}"] = connection.get_uri()
20 status = "[bold green]PASSED[/bold green]" if _is_valid(connection) else "[bold red]FAILED[/bold red]"
21 rprint(f"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}", status)
22
23 if connection_id and not any(connection.conn_id == connection_id for connection in connections):
24 rprint("[bold red]Error: Config file does not contain given connection[/bold red]", connection_id)
25
26
27 def _is_valid(connection: Connection) -> bool:
28 # Sqlite automatically creates the file if it does not exist,
29 # but our users might not expect that. They are referencing a database they expect to exist.
30 if connection.conn_type == "sqlite" and not Path(connection.host).is_file():
31 return False
32
33 success_status, _ = connection.test_connection()
34 return success_status
35
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sql-cli/sql_cli/connections.py b/sql-cli/sql_cli/connections.py
--- a/sql-cli/sql_cli/connections.py
+++ b/sql-cli/sql_cli/connections.py
@@ -15,14 +15,17 @@
Validates that the given connections are valid and registers them to Airflow with replace policy for existing
connections.
"""
- for connection in connections:
- os.environ[f"AIRFLOW_CONN_{connection.conn_id.upper()}"] = connection.get_uri()
- status = "[bold green]PASSED[/bold green]" if _is_valid(connection) else "[bold red]FAILED[/bold red]"
- rprint(f"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}", status)
-
if connection_id and not any(connection.conn_id == connection_id for connection in connections):
rprint("[bold red]Error: Config file does not contain given connection[/bold red]", connection_id)
+ for connection in connections:
+ if not connection_id or connection_id and connection.conn_id == connection_id:
+ os.environ[f"AIRFLOW_CONN_{connection.conn_id.upper()}"] = connection.get_uri()
+ status = (
+ "[bold green]PASSED[/bold green]" if _is_valid(connection) else "[bold red]FAILED[/bold red]"
+ )
+ rprint(f"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}", status)
+
def _is_valid(connection: Connection) -> bool:
# Sqlite automatically creates the file if it does not exist,
| {"golden_diff": "diff --git a/sql-cli/sql_cli/connections.py b/sql-cli/sql_cli/connections.py\n--- a/sql-cli/sql_cli/connections.py\n+++ b/sql-cli/sql_cli/connections.py\n@@ -15,14 +15,17 @@\n Validates that the given connections are valid and registers them to Airflow with replace policy for existing\n connections.\n \"\"\"\n- for connection in connections:\n- os.environ[f\"AIRFLOW_CONN_{connection.conn_id.upper()}\"] = connection.get_uri()\n- status = \"[bold green]PASSED[/bold green]\" if _is_valid(connection) else \"[bold red]FAILED[/bold red]\"\n- rprint(f\"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}\", status)\n-\n if connection_id and not any(connection.conn_id == connection_id for connection in connections):\n rprint(\"[bold red]Error: Config file does not contain given connection[/bold red]\", connection_id)\n \n+ for connection in connections:\n+ if not connection_id or connection_id and connection.conn_id == connection_id:\n+ os.environ[f\"AIRFLOW_CONN_{connection.conn_id.upper()}\"] = connection.get_uri()\n+ status = (\n+ \"[bold green]PASSED[/bold green]\" if _is_valid(connection) else \"[bold red]FAILED[/bold red]\"\n+ )\n+ rprint(f\"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}\", status)\n+\n \n def _is_valid(connection: Connection) -> bool:\n # Sqlite automatically creates the file if it does not exist,\n", "issue": "All connections tested even though one's been specified\n**Describe the bug**\r\nI ran `astro flow validate --connection=<connection_id>` and all connections were tested even though I passed one conn id specifically. \r\n\r\n**Version**\r\n* Astro Runtime: 7.0.0\r\n* Astro CLI: 1.8.3\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\n1. Update file `config/default/configuration.yml` as shown below:\r\n```\r\nconnections:\r\n - conn_id: sqlite_conn\r\n conn_type: sqlite\r\n host: /Users/magdagultekin/magda-dev/data/imdb.db\r\n login: null\r\n password: null\r\n schema: null\r\n - conn_id: sqlite_default\r\n conn_type: sqlite\r\n host: /tmp/sqlite.db\r\n login: null\r\n password: null\r\n schema: null\r\n```\r\n4. Run `astro flow validate --connection=sqlite_default`\r\n5. 
See message:\r\n```\r\nValidating connection(s) for environment 'default'\r\nValidating connection sqlite_conn PASSED\r\nValidating connection sqlite_default FAILED\r\n```\r\n\r\n**Expected behavior**\r\nOnly `sqlite_default` should be tested.\r\n\r\n**Screenshots**\r\n\r\n\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nimport os\nfrom pathlib import Path\n\nfrom airflow.models import Connection\n\nfrom sql_cli.utils.rich import rprint\n\nCONNECTION_ID_OUTPUT_STRING_WIDTH = 25\n\n\ndef validate_connections(connections: list[Connection], connection_id: str | None = None) -> None:\n \"\"\"\n Validates that the given connections are valid and registers them to Airflow with replace policy for existing\n connections.\n \"\"\"\n for connection in connections:\n os.environ[f\"AIRFLOW_CONN_{connection.conn_id.upper()}\"] = connection.get_uri()\n status = \"[bold green]PASSED[/bold green]\" if _is_valid(connection) else \"[bold red]FAILED[/bold red]\"\n rprint(f\"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}\", status)\n\n if connection_id and not any(connection.conn_id == connection_id for connection in connections):\n rprint(\"[bold red]Error: Config file does not contain given connection[/bold red]\", connection_id)\n\n\ndef _is_valid(connection: Connection) -> bool:\n # Sqlite automatically creates the file if it does not exist,\n # but our users might not expect that. They are referencing a database they expect to exist.\n if connection.conn_type == \"sqlite\" and not Path(connection.host).is_file():\n return False\n\n success_status, _ = connection.test_connection()\n return success_status\n", "path": "sql-cli/sql_cli/connections.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport os\nfrom pathlib import Path\n\nfrom airflow.models import Connection\n\nfrom sql_cli.utils.rich import rprint\n\nCONNECTION_ID_OUTPUT_STRING_WIDTH = 25\n\n\ndef validate_connections(connections: list[Connection], connection_id: str | None = None) -> None:\n \"\"\"\n Validates that the given connections are valid and registers them to Airflow with replace policy for existing\n connections.\n \"\"\"\n if connection_id and not any(connection.conn_id == connection_id for connection in connections):\n rprint(\"[bold red]Error: Config file does not contain given connection[/bold red]\", connection_id)\n\n for connection in connections:\n if not connection_id or connection_id and connection.conn_id == connection_id:\n os.environ[f\"AIRFLOW_CONN_{connection.conn_id.upper()}\"] = connection.get_uri()\n status = (\n \"[bold green]PASSED[/bold green]\" if _is_valid(connection) else \"[bold red]FAILED[/bold red]\"\n )\n rprint(f\"Validating connection {connection.conn_id:{CONNECTION_ID_OUTPUT_STRING_WIDTH}}\", status)\n\n\ndef _is_valid(connection: Connection) -> bool:\n # Sqlite automatically creates the file if it does not exist,\n # but our users might not expect that. They are referencing a database they expect to exist.\n if connection.conn_type == \"sqlite\" and not Path(connection.host).is_file():\n return False\n\n success_status, _ = connection.test_connection()\n return success_status\n", "path": "sql-cli/sql_cli/connections.py"}]} | 941 | 338 |
gh_patches_debug_23145 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-881 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Jewel-Osco
https://local.jewelosco.com/index.html
Looks like it can probably just be added as a start url in the albertsons.py spider.
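
Roughly what that would look like (note that `allowed_domains` must also cover the new host, or Scrapy's offsite middleware will drop the requests):

```
allowed_domains = ["local.albertsons.com", "local.jewelosco.com"]
start_urls = (
    "https://local.albertsons.com/index.html",
    "https://local.jewelosco.com/index.html",
)
```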
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/albertsons.py`
Content:
```
1 import scrapy
2 import re
3 import json
4 from locations.items import GeojsonPointItem
5
6 DAY_MAPPING = {
7 'M': 'Mo',
8 'T': 'Tu',
9 'W': 'We',
10 'F': 'Fr',
11 'Sat': 'Sa',
12 'Sun': 'Su'
13 }
14
15
16 class AlbertsonsSpider(scrapy.Spider):
17
18 name = "albertsons"
19 allowed_domains = ["local.albertsons.com"]
20 download_delay = 0.5
21 start_urls = (
22 'https://local.albertsons.com/index.html',
23 )
24
25 def parse_stores(self, response):
26 ref = re.findall(r"[^(\/)]+.html$" ,response.url)
27 map_data = response.xpath('normalize-space(//script[@id="js-map-config-dir-map-desktop"]/text())').extract_first()
28 map_json= json.loads(map_data)
29 if(len(ref)>0):
30 ref = ref[0].split('.')[0]
31 properties = {
32 'addr_full': response.xpath('normalize-space(//span[@itemprop="streetAddress"]/span/text())').extract_first(),
33 'phone': response.xpath('normalize-space(//span[@itemprop="telephone"]/text())').extract_first(),
34 'city': response.xpath('normalize-space(//span[@itemprop="addressLocality"]/text())').extract_first(),
35 'state': response.xpath('normalize-space(//abbr[@itemprop="addressRegion"]/text())').extract_first(),
36 'postcode': response.xpath('normalize-space(//span[@itemprop="postalCode"]/text())').extract_first(),
37 'ref': ref,
38 'website': response.url,
39 'lat': float(map_json['locs'][0]['latitude']),
40 'lon': float(map_json['locs'][0]['longitude']),
41 }
42 hours = response.xpath('//div[@class="LocationInfo-right"]/div[1]/div[@class="LocationInfo-hoursTable"]/div[@class="c-location-hours-details-wrapper js-location-hours"]/table/tbody/tr/@content').extract()
43 if hours:
44 properties['opening_hours'] = " ;".join(hours)
45 yield GeojsonPointItem(**properties)
46
47 def parse_city_stores(self ,response):
48 stores = response.xpath('//div[@class="Teaser-content"]/h2/a/@href').extract()
49 for store in stores:
50 yield scrapy.Request(response.urljoin(store), callback=self.parse_stores)
51
52 def parse_state(self, response):
53 urls = response.xpath('//div[@class="c-directory-list-content-wrapper"]/ul/li/a/@href').extract()
54 for path in urls:
55 pattern = re.compile("^[a-z]{2}\/[^()]+\/[^()]+.html$")
56 if (pattern.match(path.strip())):
57 yield scrapy.Request(response.urljoin(path), callback=self.parse_stores)
58 else:
59 yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores)
60
61 def parse(self, response):
62 urls = response.xpath('//div[@class="c-directory-list-content-wrapper"]/ul/li/a/@href').extract()
63 for path in urls:
64 pattern = re.compile("^[a-z]{2}.html$")
65 pattern1 = re.compile("^[a-z]{2}\/[^()]+\/[^()]+.html$")
66 if(pattern.match(path.strip())):
67 yield scrapy.Request(response.urljoin(path), callback=self.parse_state)
68 elif(pattern1.match(path.strip())):
69 yield scrapy.Request(response.urljoin(path), callback=self.parse_stores)
70 else:
71 yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores)
72
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/albertsons.py b/locations/spiders/albertsons.py
--- a/locations/spiders/albertsons.py
+++ b/locations/spiders/albertsons.py
@@ -16,10 +16,14 @@
class AlbertsonsSpider(scrapy.Spider):
name = "albertsons"
- allowed_domains = ["local.albertsons.com"]
download_delay = 0.5
+ allowed_domains = [
+ "local.albertsons.com",
+ "local.jewelosco.com",
+ ]
start_urls = (
'https://local.albertsons.com/index.html',
+ 'https://local.jewelosco.com/index.html',
)
def parse_stores(self, response):
@@ -41,7 +45,7 @@
}
hours = response.xpath('//div[@class="LocationInfo-right"]/div[1]/div[@class="LocationInfo-hoursTable"]/div[@class="c-location-hours-details-wrapper js-location-hours"]/table/tbody/tr/@content').extract()
if hours:
- properties['opening_hours'] = " ;".join(hours)
+ properties['opening_hours'] = "; ".join(hours)
yield GeojsonPointItem(**properties)
def parse_city_stores(self ,response):
| {"golden_diff": "diff --git a/locations/spiders/albertsons.py b/locations/spiders/albertsons.py\n--- a/locations/spiders/albertsons.py\n+++ b/locations/spiders/albertsons.py\n@@ -16,10 +16,14 @@\n class AlbertsonsSpider(scrapy.Spider):\n \n name = \"albertsons\"\n- allowed_domains = [\"local.albertsons.com\"]\n download_delay = 0.5\n+ allowed_domains = [\n+ \"local.albertsons.com\",\n+ \"local.jewelosco.com\",\n+ ]\n start_urls = (\n 'https://local.albertsons.com/index.html',\n+ 'https://local.jewelosco.com/index.html',\n )\n \n def parse_stores(self, response):\n@@ -41,7 +45,7 @@\n }\n hours = response.xpath('//div[@class=\"LocationInfo-right\"]/div[1]/div[@class=\"LocationInfo-hoursTable\"]/div[@class=\"c-location-hours-details-wrapper js-location-hours\"]/table/tbody/tr/@content').extract()\n if hours:\n- properties['opening_hours'] = \" ;\".join(hours)\n+ properties['opening_hours'] = \"; \".join(hours)\n yield GeojsonPointItem(**properties)\n \n def parse_city_stores(self ,response):\n", "issue": "Jewel-Osco\nhttps://local.jewelosco.com/index.html\r\n\r\nLooks like it can probably just be added as a start url in the albertsons.py spider.\n", "before_files": [{"content": "import scrapy\nimport re\nimport json\nfrom locations.items import GeojsonPointItem\n\nDAY_MAPPING = {\n 'M': 'Mo',\n 'T': 'Tu',\n 'W': 'We',\n 'F': 'Fr',\n 'Sat': 'Sa',\n 'Sun': 'Su'\n}\n\n\nclass AlbertsonsSpider(scrapy.Spider):\n\n name = \"albertsons\"\n allowed_domains = [\"local.albertsons.com\"]\n download_delay = 0.5\n start_urls = (\n 'https://local.albertsons.com/index.html',\n )\n\n def parse_stores(self, response):\n ref = re.findall(r\"[^(\\/)]+.html$\" ,response.url)\n map_data = response.xpath('normalize-space(//script[@id=\"js-map-config-dir-map-desktop\"]/text())').extract_first()\n map_json= json.loads(map_data)\n if(len(ref)>0):\n ref = ref[0].split('.')[0]\n properties = {\n 'addr_full': response.xpath('normalize-space(//span[@itemprop=\"streetAddress\"]/span/text())').extract_first(),\n 'phone': response.xpath('normalize-space(//span[@itemprop=\"telephone\"]/text())').extract_first(),\n 'city': response.xpath('normalize-space(//span[@itemprop=\"addressLocality\"]/text())').extract_first(),\n 'state': response.xpath('normalize-space(//abbr[@itemprop=\"addressRegion\"]/text())').extract_first(),\n 'postcode': response.xpath('normalize-space(//span[@itemprop=\"postalCode\"]/text())').extract_first(),\n 'ref': ref,\n 'website': response.url,\n 'lat': float(map_json['locs'][0]['latitude']),\n 'lon': float(map_json['locs'][0]['longitude']),\n }\n hours = response.xpath('//div[@class=\"LocationInfo-right\"]/div[1]/div[@class=\"LocationInfo-hoursTable\"]/div[@class=\"c-location-hours-details-wrapper js-location-hours\"]/table/tbody/tr/@content').extract()\n if hours:\n properties['opening_hours'] = \" ;\".join(hours)\n yield GeojsonPointItem(**properties)\n\n def parse_city_stores(self ,response):\n stores = response.xpath('//div[@class=\"Teaser-content\"]/h2/a/@href').extract()\n for store in stores:\n yield scrapy.Request(response.urljoin(store), callback=self.parse_stores)\n\n def parse_state(self, response):\n urls = response.xpath('//div[@class=\"c-directory-list-content-wrapper\"]/ul/li/a/@href').extract()\n for path in urls:\n pattern = re.compile(\"^[a-z]{2}\\/[^()]+\\/[^()]+.html$\")\n if (pattern.match(path.strip())):\n yield scrapy.Request(response.urljoin(path), callback=self.parse_stores)\n else:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores)\n\n def 
parse(self, response):\n urls = response.xpath('//div[@class=\"c-directory-list-content-wrapper\"]/ul/li/a/@href').extract()\n for path in urls:\n pattern = re.compile(\"^[a-z]{2}.html$\")\n pattern1 = re.compile(\"^[a-z]{2}\\/[^()]+\\/[^()]+.html$\")\n if(pattern.match(path.strip())):\n yield scrapy.Request(response.urljoin(path), callback=self.parse_state)\n elif(pattern1.match(path.strip())):\n yield scrapy.Request(response.urljoin(path), callback=self.parse_stores)\n else:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores)\n", "path": "locations/spiders/albertsons.py"}], "after_files": [{"content": "import scrapy\nimport re\nimport json\nfrom locations.items import GeojsonPointItem\n\nDAY_MAPPING = {\n 'M': 'Mo',\n 'T': 'Tu',\n 'W': 'We',\n 'F': 'Fr',\n 'Sat': 'Sa',\n 'Sun': 'Su'\n}\n\n\nclass AlbertsonsSpider(scrapy.Spider):\n\n name = \"albertsons\"\n download_delay = 0.5\n allowed_domains = [\n \"local.albertsons.com\",\n \"local.jewelosco.com\",\n ]\n start_urls = (\n 'https://local.albertsons.com/index.html',\n 'https://local.jewelosco.com/index.html',\n )\n\n def parse_stores(self, response):\n ref = re.findall(r\"[^(\\/)]+.html$\" ,response.url)\n map_data = response.xpath('normalize-space(//script[@id=\"js-map-config-dir-map-desktop\"]/text())').extract_first()\n map_json= json.loads(map_data)\n if(len(ref)>0):\n ref = ref[0].split('.')[0]\n properties = {\n 'addr_full': response.xpath('normalize-space(//span[@itemprop=\"streetAddress\"]/span/text())').extract_first(),\n 'phone': response.xpath('normalize-space(//span[@itemprop=\"telephone\"]/text())').extract_first(),\n 'city': response.xpath('normalize-space(//span[@itemprop=\"addressLocality\"]/text())').extract_first(),\n 'state': response.xpath('normalize-space(//abbr[@itemprop=\"addressRegion\"]/text())').extract_first(),\n 'postcode': response.xpath('normalize-space(//span[@itemprop=\"postalCode\"]/text())').extract_first(),\n 'ref': ref,\n 'website': response.url,\n 'lat': float(map_json['locs'][0]['latitude']),\n 'lon': float(map_json['locs'][0]['longitude']),\n }\n hours = response.xpath('//div[@class=\"LocationInfo-right\"]/div[1]/div[@class=\"LocationInfo-hoursTable\"]/div[@class=\"c-location-hours-details-wrapper js-location-hours\"]/table/tbody/tr/@content').extract()\n if hours:\n properties['opening_hours'] = \"; \".join(hours)\n yield GeojsonPointItem(**properties)\n\n def parse_city_stores(self ,response):\n stores = response.xpath('//div[@class=\"Teaser-content\"]/h2/a/@href').extract()\n for store in stores:\n yield scrapy.Request(response.urljoin(store), callback=self.parse_stores)\n\n def parse_state(self, response):\n urls = response.xpath('//div[@class=\"c-directory-list-content-wrapper\"]/ul/li/a/@href').extract()\n for path in urls:\n pattern = re.compile(\"^[a-z]{2}\\/[^()]+\\/[^()]+.html$\")\n if (pattern.match(path.strip())):\n yield scrapy.Request(response.urljoin(path), callback=self.parse_stores)\n else:\n yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores)\n\n def parse(self, response):\n urls = response.xpath('//div[@class=\"c-directory-list-content-wrapper\"]/ul/li/a/@href').extract()\n for path in urls:\n pattern = re.compile(\"^[a-z]{2}.html$\")\n pattern1 = re.compile(\"^[a-z]{2}\\/[^()]+\\/[^()]+.html$\")\n if(pattern.match(path.strip())):\n yield scrapy.Request(response.urljoin(path), callback=self.parse_state)\n elif(pattern1.match(path.strip())):\n yield scrapy.Request(response.urljoin(path), callback=self.parse_stores)\n else:\n 
yield scrapy.Request(response.urljoin(path), callback=self.parse_city_stores)\n", "path": "locations/spiders/albertsons.py"}]} | 1,193 | 284 |
gh_patches_debug_8006 | rasdani/github-patches | git_diff | tournesol-app__tournesol-1713 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[feat] Make `exclude_compared` configurable in user settings
--- END ISSUE ---
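The title is terse, so for orientation: user settings in this codebase are plain DRF serializer fields, which suggests the request is to expose the recommendations' `exclude_compared` filter as one more optional boolean. A minimal sketch of that shape (the field name simply mirrors the naming convention of the neighbouring settings):
```
from rest_framework import serializers

class VideosPollUserSettingsSerializer(serializers.Serializer):
    # ... existing recommendation settings ...
    # Optional flag; leaving it out of the payload keeps the default behaviour.
    recommendations__default_exclude_compared_entities = serializers.BooleanField(
        required=False
    )
```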
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `backend/core/serializers/user_settings.py`
Content:
```
1 from django.utils.translation import gettext_lazy as _
2 from rest_framework import serializers
3 from rest_framework.serializers import ValidationError
4
5 from tournesol.models.poll import Poll
6 from tournesol.utils.video_language import ACCEPTED_LANGUAGE_CODES
7
8
9 class GeneralUserSettingsSerializer(serializers.Serializer):
10 """
11 The general user settings that are not related to Tournesol polls.
12 """
13
14 # The first element of the tuple should be an ISO 639-1 code.
15 NOTIFICATIONS_LANG = [
16 ("en", "en"),
17 ("fr", "fr"),
18 ]
19
20 notifications__lang = serializers.ChoiceField(
21 choices=NOTIFICATIONS_LANG, required=False
22 )
23 notifications_email__research = serializers.BooleanField(required=False)
24 notifications_email__new_features = serializers.BooleanField(required=False)
25
26
27 class GenericPollUserSettingsSerializer(serializers.Serializer):
28 """
29 The settings common to each poll.
30 """
31
32 COMPONENT_DISPLAY_STATE = [
33 ("ALWAYS", "always"),
34 ("EMBEDDED_ONLY", "embedded_only"),
35 ("WEBSITE_ONLY", "website_only"),
36 ("NEVER", "never"),
37 ]
38
39 comparison__criteria_order = serializers.ListField(
40 child=serializers.CharField(), required=False
41 )
42
43 comparison__fill_entity_selector = serializers.BooleanField(required=False)
44
45 comparison_ui__weekly_collective_goal_display = serializers.ChoiceField(
46 choices=COMPONENT_DISPLAY_STATE, allow_blank=True, required=False
47 )
48
49 rate_later__auto_remove = serializers.IntegerField(required=False)
50
51 def validate_comparison__criteria_order(self, criteria):
52 poll_name = self.context.get("poll_name", self._context["poll_name"])
53 poll = Poll.objects.get(name=poll_name)
54
55 if poll.main_criteria in criteria:
56 raise ValidationError(_("The main criterion cannot be in the list."))
57
58 if len(criteria) != len(set(criteria)):
59 raise ValidationError(_("The list cannot contain duplicates."))
60
61 for criterion in criteria:
62 if criterion not in poll.criterias_list:
63 raise ValidationError(
64 _("Unknown criterion: %(criterion)s.") % {"criterion": criterion}
65 )
66
67 return criteria
68
69 def validate_rate_later__auto_remove(self, value):
70 if value < 1:
71 raise ValidationError(_("This parameter cannot be lower than 1."))
72 return value
73
74
75 class VideosPollUserSettingsSerializer(GenericPollUserSettingsSerializer):
76 """
77 The settings specific to the `videos` poll.
78
79 Also inherit the settings common to each poll.
80 """
81
82 DEFAULT_DATE_CHOICES = [
83 ("TODAY", "today"),
84 ("WEEK", "week"),
85 ("MONTH", "month"),
86 ("YEAR", "year"),
87 ("ALL_TIME", "all_time"),
88 ]
89
90 recommendations__default_date = serializers.ChoiceField(
91 choices=DEFAULT_DATE_CHOICES, allow_blank=True, required=False
92 )
93 recommendations__default_languages = serializers.ListField(
94 child=serializers.CharField(), allow_empty=True, required=False
95 )
96 recommendations__default_unsafe = serializers.BooleanField(required=False)
97
98 def validate_recommendations__default_languages(self, default_languages):
99 for lang in default_languages:
100 if lang not in ACCEPTED_LANGUAGE_CODES:
101 raise ValidationError(_("Unknown language code: %(lang)s.") % {"lang": lang})
102
103 return default_languages
104
105
106 class TournesolUserSettingsSerializer(serializers.Serializer):
107 """
108 A representation of all user settings of the Tournesol project.
109
110 This representation includes poll-agnostic settings in addition to the
111 specific settings of each poll.
112 """
113
114 general = GeneralUserSettingsSerializer(required=False)
115 videos = VideosPollUserSettingsSerializer(required=False, context={"poll_name": "videos"})
116
117 def create(self, validated_data):
118 return validated_data
119
120 def update(self, instance, validated_data):
121 for scope, settings in self.validated_data.items():
122 if scope not in instance:
123 instance[scope] = {}
124 instance[scope].update(settings)
125 return instance
126
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/backend/core/serializers/user_settings.py b/backend/core/serializers/user_settings.py
--- a/backend/core/serializers/user_settings.py
+++ b/backend/core/serializers/user_settings.py
@@ -94,6 +94,7 @@
child=serializers.CharField(), allow_empty=True, required=False
)
recommendations__default_unsafe = serializers.BooleanField(required=False)
+ recommendations__default_exclude_compared_entities = serializers.BooleanField(required=False)
def validate_recommendations__default_languages(self, default_languages):
for lang in default_languages:
| {"golden_diff": "diff --git a/backend/core/serializers/user_settings.py b/backend/core/serializers/user_settings.py\n--- a/backend/core/serializers/user_settings.py\n+++ b/backend/core/serializers/user_settings.py\n@@ -94,6 +94,7 @@\n child=serializers.CharField(), allow_empty=True, required=False\n )\n recommendations__default_unsafe = serializers.BooleanField(required=False)\n+ recommendations__default_exclude_compared_entities = serializers.BooleanField(required=False)\n \n def validate_recommendations__default_languages(self, default_languages):\n for lang in default_languages:\n", "issue": "[feat] Make `exclude_compared` configurable in user settings\n\n", "before_files": [{"content": "from django.utils.translation import gettext_lazy as _\nfrom rest_framework import serializers\nfrom rest_framework.serializers import ValidationError\n\nfrom tournesol.models.poll import Poll\nfrom tournesol.utils.video_language import ACCEPTED_LANGUAGE_CODES\n\n\nclass GeneralUserSettingsSerializer(serializers.Serializer):\n \"\"\"\n The general user settings that are not related to Tournesol polls.\n \"\"\"\n\n # The first element of the tuple should be an ISO 639-1 code.\n NOTIFICATIONS_LANG = [\n (\"en\", \"en\"),\n (\"fr\", \"fr\"),\n ]\n\n notifications__lang = serializers.ChoiceField(\n choices=NOTIFICATIONS_LANG, required=False\n )\n notifications_email__research = serializers.BooleanField(required=False)\n notifications_email__new_features = serializers.BooleanField(required=False)\n\n\nclass GenericPollUserSettingsSerializer(serializers.Serializer):\n \"\"\"\n The settings common to each poll.\n \"\"\"\n\n COMPONENT_DISPLAY_STATE = [\n (\"ALWAYS\", \"always\"),\n (\"EMBEDDED_ONLY\", \"embedded_only\"),\n (\"WEBSITE_ONLY\", \"website_only\"),\n (\"NEVER\", \"never\"),\n ]\n\n comparison__criteria_order = serializers.ListField(\n child=serializers.CharField(), required=False\n )\n\n comparison__fill_entity_selector = serializers.BooleanField(required=False)\n\n comparison_ui__weekly_collective_goal_display = serializers.ChoiceField(\n choices=COMPONENT_DISPLAY_STATE, allow_blank=True, required=False\n )\n\n rate_later__auto_remove = serializers.IntegerField(required=False)\n\n def validate_comparison__criteria_order(self, criteria):\n poll_name = self.context.get(\"poll_name\", self._context[\"poll_name\"])\n poll = Poll.objects.get(name=poll_name)\n\n if poll.main_criteria in criteria:\n raise ValidationError(_(\"The main criterion cannot be in the list.\"))\n\n if len(criteria) != len(set(criteria)):\n raise ValidationError(_(\"The list cannot contain duplicates.\"))\n\n for criterion in criteria:\n if criterion not in poll.criterias_list:\n raise ValidationError(\n _(\"Unknown criterion: %(criterion)s.\") % {\"criterion\": criterion}\n )\n\n return criteria\n\n def validate_rate_later__auto_remove(self, value):\n if value < 1:\n raise ValidationError(_(\"This parameter cannot be lower than 1.\"))\n return value\n\n\nclass VideosPollUserSettingsSerializer(GenericPollUserSettingsSerializer):\n \"\"\"\n The settings specific to the `videos` poll.\n\n Also inherit the settings common to each poll.\n \"\"\"\n\n DEFAULT_DATE_CHOICES = [\n (\"TODAY\", \"today\"),\n (\"WEEK\", \"week\"),\n (\"MONTH\", \"month\"),\n (\"YEAR\", \"year\"),\n (\"ALL_TIME\", \"all_time\"),\n ]\n\n recommendations__default_date = serializers.ChoiceField(\n choices=DEFAULT_DATE_CHOICES, allow_blank=True, required=False\n )\n recommendations__default_languages = serializers.ListField(\n child=serializers.CharField(), 
allow_empty=True, required=False\n )\n recommendations__default_unsafe = serializers.BooleanField(required=False)\n\n def validate_recommendations__default_languages(self, default_languages):\n for lang in default_languages:\n if lang not in ACCEPTED_LANGUAGE_CODES:\n raise ValidationError(_(\"Unknown language code: %(lang)s.\") % {\"lang\": lang})\n\n return default_languages\n\n\nclass TournesolUserSettingsSerializer(serializers.Serializer):\n \"\"\"\n A representation of all user settings of the Tournesol project.\n\n This representation includes poll-agnostic settings in addition to the\n specific settings of each poll.\n \"\"\"\n\n general = GeneralUserSettingsSerializer(required=False)\n videos = VideosPollUserSettingsSerializer(required=False, context={\"poll_name\": \"videos\"})\n\n def create(self, validated_data):\n return validated_data\n\n def update(self, instance, validated_data):\n for scope, settings in self.validated_data.items():\n if scope not in instance:\n instance[scope] = {}\n instance[scope].update(settings)\n return instance\n", "path": "backend/core/serializers/user_settings.py"}], "after_files": [{"content": "from django.utils.translation import gettext_lazy as _\nfrom rest_framework import serializers\nfrom rest_framework.serializers import ValidationError\n\nfrom tournesol.models.poll import Poll\nfrom tournesol.utils.video_language import ACCEPTED_LANGUAGE_CODES\n\n\nclass GeneralUserSettingsSerializer(serializers.Serializer):\n \"\"\"\n The general user settings that are not related to Tournesol polls.\n \"\"\"\n\n # The first element of the tuple should be an ISO 639-1 code.\n NOTIFICATIONS_LANG = [\n (\"en\", \"en\"),\n (\"fr\", \"fr\"),\n ]\n\n notifications__lang = serializers.ChoiceField(\n choices=NOTIFICATIONS_LANG, required=False\n )\n notifications_email__research = serializers.BooleanField(required=False)\n notifications_email__new_features = serializers.BooleanField(required=False)\n\n\nclass GenericPollUserSettingsSerializer(serializers.Serializer):\n \"\"\"\n The settings common to each poll.\n \"\"\"\n\n COMPONENT_DISPLAY_STATE = [\n (\"ALWAYS\", \"always\"),\n (\"EMBEDDED_ONLY\", \"embedded_only\"),\n (\"WEBSITE_ONLY\", \"website_only\"),\n (\"NEVER\", \"never\"),\n ]\n\n comparison__criteria_order = serializers.ListField(\n child=serializers.CharField(), required=False\n )\n\n comparison__fill_entity_selector = serializers.BooleanField(required=False)\n\n comparison_ui__weekly_collective_goal_display = serializers.ChoiceField(\n choices=COMPONENT_DISPLAY_STATE, allow_blank=True, required=False\n )\n\n rate_later__auto_remove = serializers.IntegerField(required=False)\n\n def validate_comparison__criteria_order(self, criteria):\n poll_name = self.context.get(\"poll_name\", self._context[\"poll_name\"])\n poll = Poll.objects.get(name=poll_name)\n\n if poll.main_criteria in criteria:\n raise ValidationError(_(\"The main criterion cannot be in the list.\"))\n\n if len(criteria) != len(set(criteria)):\n raise ValidationError(_(\"The list cannot contain duplicates.\"))\n\n for criterion in criteria:\n if criterion not in poll.criterias_list:\n raise ValidationError(\n _(\"Unknown criterion: %(criterion)s.\") % {\"criterion\": criterion}\n )\n\n return criteria\n\n def validate_rate_later__auto_remove(self, value):\n if value < 1:\n raise ValidationError(_(\"This parameter cannot be lower than 1.\"))\n return value\n\n\nclass VideosPollUserSettingsSerializer(GenericPollUserSettingsSerializer):\n \"\"\"\n The settings specific to the `videos` 
poll.\n\n Also inherit the settings common to each poll.\n \"\"\"\n\n DEFAULT_DATE_CHOICES = [\n (\"TODAY\", \"today\"),\n (\"WEEK\", \"week\"),\n (\"MONTH\", \"month\"),\n (\"YEAR\", \"year\"),\n (\"ALL_TIME\", \"all_time\"),\n ]\n\n recommendations__default_date = serializers.ChoiceField(\n choices=DEFAULT_DATE_CHOICES, allow_blank=True, required=False\n )\n recommendations__default_languages = serializers.ListField(\n child=serializers.CharField(), allow_empty=True, required=False\n )\n recommendations__default_unsafe = serializers.BooleanField(required=False)\n recommendations__default_exclude_compared_entities = serializers.BooleanField(required=False)\n\n def validate_recommendations__default_languages(self, default_languages):\n for lang in default_languages:\n if lang not in ACCEPTED_LANGUAGE_CODES:\n raise ValidationError(_(\"Unknown language code: %(lang)s.\") % {\"lang\": lang})\n\n return default_languages\n\n\nclass TournesolUserSettingsSerializer(serializers.Serializer):\n \"\"\"\n A representation of all user settings of the Tournesol project.\n\n This representation includes poll-agnostic settings in addition to the\n specific settings of each poll.\n \"\"\"\n\n general = GeneralUserSettingsSerializer(required=False)\n videos = VideosPollUserSettingsSerializer(required=False, context={\"poll_name\": \"videos\"})\n\n def create(self, validated_data):\n return validated_data\n\n def update(self, instance, validated_data):\n for scope, settings in self.validated_data.items():\n if scope not in instance:\n instance[scope] = {}\n instance[scope].update(settings)\n return instance\n", "path": "backend/core/serializers/user_settings.py"}]} | 1,393 | 120 |
gh_patches_debug_5138 | rasdani/github-patches | git_diff | dask__dask-2634 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
String Accessors in Converted DataFrame Columns
Whenever I try to use the `.str` accessor on dataframe columns that were previously converted to strings (i.e. `datall[['A']] = datall[['A']].applymap(str)` and then `datall['A'] = datall['A'].str[:5]`), I get a `TypeError: 'StringAccessor' object has no attribute '__getitem__'`.
This is reproducible as follows:
```
import pandas as pd
import dask.dataframe as dd
import numpy as np
def float_apply(x):
try:
return float(x)
except ValueError:
return float('nan')
def string_apply(x):
try:
return str(x)
except ValueError:
return str('nan')
df = pd.DataFrame(np.random.random_integers(0,6,size=(20, 6)), columns=list('ABCDEF'))
data = dd.from_pandas(df, npartitions = 2)
data=data.applymap(float_apply)
data[['A']] = data[['A']].applymap(string_apply)
data['A'] = data['A'].str[:1]
print data.compute()
```
This works with plain pandas dataframes. If `.compute()` is run before the slicing it also works, but materializing the whole frame first is probably suboptimal for large datasets (a per-partition workaround is sketched after this issue).
--- END ISSUE ---
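Until indexing is supported on the accessor itself, one workaround consistent with the dask API is to push the pandas `.str` slicing into each partition with `map_partitions` (a sketch; the `meta` assumes the column stays object dtype):
```
# Workaround sketch: apply the pandas .str slice per partition.
data['A'] = data['A'].map_partitions(
    lambda s: s.str[:1],   # plain pandas runs inside each partition
    meta=('A', 'object'),  # assumes the result is still strings
)
```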
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dask/dataframe/accessor.py`
Content:
```
1 from __future__ import absolute_import, division, print_function
2
3 import numpy as np
4 import pandas as pd
5 from toolz import partial
6
7 from ..utils import derived_from
8
9
10 def maybe_wrap_pandas(obj, x):
11 if isinstance(x, np.ndarray):
12 if isinstance(obj, pd.Series):
13 return pd.Series(x, index=obj.index, dtype=x.dtype)
14 return pd.Index(x)
15 return x
16
17
18 class Accessor(object):
19 """
20 Base class for pandas Accessor objects cat, dt, and str.
21
22 Notes
23 -----
24 Subclasses should define the following attributes:
25
26 * _accessor
27 * _accessor_name
28 """
29 _not_implemented = set()
30
31 def __init__(self, series):
32 from .core import Series
33 if not isinstance(series, Series):
34 raise ValueError('Accessor cannot be initialized')
35 self._validate(series)
36 self._series = series
37
38 def _validate(self, series):
39 pass
40
41 @staticmethod
42 def _delegate_property(obj, accessor, attr):
43 out = getattr(getattr(obj, accessor, obj), attr)
44 return maybe_wrap_pandas(obj, out)
45
46 @staticmethod
47 def _delegate_method(obj, accessor, attr, args, kwargs):
48 out = getattr(getattr(obj, accessor, obj), attr)(*args, **kwargs)
49 return maybe_wrap_pandas(obj, out)
50
51 def _property_map(self, attr):
52 meta = self._delegate_property(self._series._meta,
53 self._accessor_name, attr)
54 token = '%s-%s' % (self._accessor_name, attr)
55 return self._series.map_partitions(self._delegate_property,
56 self._accessor_name, attr,
57 token=token, meta=meta)
58
59 def _function_map(self, attr, *args, **kwargs):
60 meta = self._delegate_method(self._series._meta_nonempty,
61 self._accessor_name, attr, args, kwargs)
62 token = '%s-%s' % (self._accessor_name, attr)
63 return self._series.map_partitions(self._delegate_method,
64 self._accessor_name, attr, args,
65 kwargs, meta=meta, token=token)
66
67 @property
68 def _delegates(self):
69 return set(dir(self._accessor)).difference(self._not_implemented)
70
71 def __dir__(self):
72 o = self._delegates
73 o.update(self.__dict__)
74 o.update(dir(type(self)))
75 return list(o)
76
77 def __getattr__(self, key):
78 if key in self._delegates:
79 if isinstance(getattr(self._accessor, key), property):
80 return self._property_map(key)
81 else:
82 return partial(self._function_map, key)
83 else:
84 raise AttributeError(key)
85
86
87 class DatetimeAccessor(Accessor):
88 """ Accessor object for datetimelike properties of the Series values.
89
90 Examples
91 --------
92
93 >>> s.dt.microsecond # doctest: +SKIP
94 """
95 _accessor = pd.Series.dt
96 _accessor_name = 'dt'
97
98
99 class StringAccessor(Accessor):
100 """ Accessor object for string properties of the Series values.
101
102 Examples
103 --------
104
105 >>> s.str.lower() # doctest: +SKIP
106 """
107 _accessor = pd.Series.str
108 _accessor_name = 'str'
109 _not_implemented = {'get_dummies'}
110
111 def _validate(self, series):
112 if not series.dtype == 'object':
113 raise AttributeError("Can only use .str accessor with object dtype")
114
115 @derived_from(pd.core.strings.StringMethods)
116 def split(self, pat=None, n=-1):
117 return self._function_map('split', pat=pat, n=n)
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dask/dataframe/accessor.py b/dask/dataframe/accessor.py
--- a/dask/dataframe/accessor.py
+++ b/dask/dataframe/accessor.py
@@ -115,3 +115,12 @@
@derived_from(pd.core.strings.StringMethods)
def split(self, pat=None, n=-1):
return self._function_map('split', pat=pat, n=n)
+
+ def __getitem__(self, index):
+ return self._series.map_partitions(str_get, index,
+ meta=self._series._meta)
+
+
+def str_get(series, index):
+ """ Implements series.str[index] """
+ return series.str[index]
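With a delegation like the one above in place, the reporter's failing line becomes an ordinary lazy operation, e.g. (a small usage sketch):
```
import pandas as pd
import dask.dataframe as dd

ddf = dd.from_pandas(pd.DataFrame({'A': ['abc', 'defg']}), npartitions=1)
first_chars = ddf['A'].str[:1]          # no longer raises on StringAccessor
print(first_chars.compute().tolist())   # expected: ['a', 'd']
```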
| {"golden_diff": "diff --git a/dask/dataframe/accessor.py b/dask/dataframe/accessor.py\n--- a/dask/dataframe/accessor.py\n+++ b/dask/dataframe/accessor.py\n@@ -115,3 +115,12 @@\n @derived_from(pd.core.strings.StringMethods)\n def split(self, pat=None, n=-1):\n return self._function_map('split', pat=pat, n=n)\n+\n+ def __getitem__(self, index):\n+ return self._series.map_partitions(str_get, index,\n+ meta=self._series._meta)\n+\n+\n+def str_get(series, index):\n+ \"\"\" Implements series.str[index] \"\"\"\n+ return series.str[index]\n", "issue": "String Accessors in Converted DataFrame Columns\nWhenever I try to access strings in a dataframe which are previously converted to strings (ie. datall[['A']] = datall[['A']].applymap(str) and then datall['A']=datall['A'].str[:5]) I get a TypeError: 'StringAccessor' object has no attribute '__getitem__'.\r\n\r\nThis is reproducible as follows:\r\n\r\n```\r\nimport pandas as pd\r\nimport dask.dataframe as dd\r\nimport numpy as np\r\n\r\ndef float_apply(x):\r\n try:\r\n return float(x)\r\n except ValueError:\r\n return float('nan')\r\n\r\ndef string_apply(x):\r\n try:\r\n return str(x)\r\n except ValueError:\r\n return str('nan')\r\n\r\ndf = pd.DataFrame(np.random.random_integers(0,6,size=(20, 6)), columns=list('ABCDEF'))\r\n\r\ndata = dd.from_pandas(df, npartitions = 2)\r\n\r\ndata=data.applymap(float_apply)\r\n\r\ndata[['A']] = data[['A']].applymap(string_apply)\r\ndata['A'] = data['A'].str[:1]\r\n\r\nprint data.compute()\r\n\r\n```\r\n\r\nThis will work with pandas dataframes. If .compute() is run prior to this then it works fine but that is probably suboptimal for large datasets.\n", "before_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport numpy as np\nimport pandas as pd\nfrom toolz import partial\n\nfrom ..utils import derived_from\n\n\ndef maybe_wrap_pandas(obj, x):\n if isinstance(x, np.ndarray):\n if isinstance(obj, pd.Series):\n return pd.Series(x, index=obj.index, dtype=x.dtype)\n return pd.Index(x)\n return x\n\n\nclass Accessor(object):\n \"\"\"\n Base class for pandas Accessor objects cat, dt, and str.\n\n Notes\n -----\n Subclasses should define the following attributes:\n\n * _accessor\n * _accessor_name\n \"\"\"\n _not_implemented = set()\n\n def __init__(self, series):\n from .core import Series\n if not isinstance(series, Series):\n raise ValueError('Accessor cannot be initialized')\n self._validate(series)\n self._series = series\n\n def _validate(self, series):\n pass\n\n @staticmethod\n def _delegate_property(obj, accessor, attr):\n out = getattr(getattr(obj, accessor, obj), attr)\n return maybe_wrap_pandas(obj, out)\n\n @staticmethod\n def _delegate_method(obj, accessor, attr, args, kwargs):\n out = getattr(getattr(obj, accessor, obj), attr)(*args, **kwargs)\n return maybe_wrap_pandas(obj, out)\n\n def _property_map(self, attr):\n meta = self._delegate_property(self._series._meta,\n self._accessor_name, attr)\n token = '%s-%s' % (self._accessor_name, attr)\n return self._series.map_partitions(self._delegate_property,\n self._accessor_name, attr,\n token=token, meta=meta)\n\n def _function_map(self, attr, *args, **kwargs):\n meta = self._delegate_method(self._series._meta_nonempty,\n self._accessor_name, attr, args, kwargs)\n token = '%s-%s' % (self._accessor_name, attr)\n return self._series.map_partitions(self._delegate_method,\n self._accessor_name, attr, args,\n kwargs, meta=meta, token=token)\n\n @property\n def _delegates(self):\n return 
set(dir(self._accessor)).difference(self._not_implemented)\n\n def __dir__(self):\n o = self._delegates\n o.update(self.__dict__)\n o.update(dir(type(self)))\n return list(o)\n\n def __getattr__(self, key):\n if key in self._delegates:\n if isinstance(getattr(self._accessor, key), property):\n return self._property_map(key)\n else:\n return partial(self._function_map, key)\n else:\n raise AttributeError(key)\n\n\nclass DatetimeAccessor(Accessor):\n \"\"\" Accessor object for datetimelike properties of the Series values.\n\n Examples\n --------\n\n >>> s.dt.microsecond # doctest: +SKIP\n \"\"\"\n _accessor = pd.Series.dt\n _accessor_name = 'dt'\n\n\nclass StringAccessor(Accessor):\n \"\"\" Accessor object for string properties of the Series values.\n\n Examples\n --------\n\n >>> s.str.lower() # doctest: +SKIP\n \"\"\"\n _accessor = pd.Series.str\n _accessor_name = 'str'\n _not_implemented = {'get_dummies'}\n\n def _validate(self, series):\n if not series.dtype == 'object':\n raise AttributeError(\"Can only use .str accessor with object dtype\")\n\n @derived_from(pd.core.strings.StringMethods)\n def split(self, pat=None, n=-1):\n return self._function_map('split', pat=pat, n=n)\n", "path": "dask/dataframe/accessor.py"}], "after_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport numpy as np\nimport pandas as pd\nfrom toolz import partial\n\nfrom ..utils import derived_from\n\n\ndef maybe_wrap_pandas(obj, x):\n if isinstance(x, np.ndarray):\n if isinstance(obj, pd.Series):\n return pd.Series(x, index=obj.index, dtype=x.dtype)\n return pd.Index(x)\n return x\n\n\nclass Accessor(object):\n \"\"\"\n Base class for pandas Accessor objects cat, dt, and str.\n\n Notes\n -----\n Subclasses should define the following attributes:\n\n * _accessor\n * _accessor_name\n \"\"\"\n _not_implemented = set()\n\n def __init__(self, series):\n from .core import Series\n if not isinstance(series, Series):\n raise ValueError('Accessor cannot be initialized')\n self._validate(series)\n self._series = series\n\n def _validate(self, series):\n pass\n\n @staticmethod\n def _delegate_property(obj, accessor, attr):\n out = getattr(getattr(obj, accessor, obj), attr)\n return maybe_wrap_pandas(obj, out)\n\n @staticmethod\n def _delegate_method(obj, accessor, attr, args, kwargs):\n out = getattr(getattr(obj, accessor, obj), attr)(*args, **kwargs)\n return maybe_wrap_pandas(obj, out)\n\n def _property_map(self, attr):\n meta = self._delegate_property(self._series._meta,\n self._accessor_name, attr)\n token = '%s-%s' % (self._accessor_name, attr)\n return self._series.map_partitions(self._delegate_property,\n self._accessor_name, attr,\n token=token, meta=meta)\n\n def _function_map(self, attr, *args, **kwargs):\n meta = self._delegate_method(self._series._meta_nonempty,\n self._accessor_name, attr, args, kwargs)\n token = '%s-%s' % (self._accessor_name, attr)\n return self._series.map_partitions(self._delegate_method,\n self._accessor_name, attr, args,\n kwargs, meta=meta, token=token)\n\n @property\n def _delegates(self):\n return set(dir(self._accessor)).difference(self._not_implemented)\n\n def __dir__(self):\n o = self._delegates\n o.update(self.__dict__)\n o.update(dir(type(self)))\n return list(o)\n\n def __getattr__(self, key):\n if key in self._delegates:\n if isinstance(getattr(self._accessor, key), property):\n return self._property_map(key)\n else:\n return partial(self._function_map, key)\n else:\n raise AttributeError(key)\n\n\nclass DatetimeAccessor(Accessor):\n 
\"\"\" Accessor object for datetimelike properties of the Series values.\n\n Examples\n --------\n\n >>> s.dt.microsecond # doctest: +SKIP\n \"\"\"\n _accessor = pd.Series.dt\n _accessor_name = 'dt'\n\n\nclass StringAccessor(Accessor):\n \"\"\" Accessor object for string properties of the Series values.\n\n Examples\n --------\n\n >>> s.str.lower() # doctest: +SKIP\n \"\"\"\n _accessor = pd.Series.str\n _accessor_name = 'str'\n _not_implemented = {'get_dummies'}\n\n def _validate(self, series):\n if not series.dtype == 'object':\n raise AttributeError(\"Can only use .str accessor with object dtype\")\n\n @derived_from(pd.core.strings.StringMethods)\n def split(self, pat=None, n=-1):\n return self._function_map('split', pat=pat, n=n)\n\n def __getitem__(self, index):\n return self._series.map_partitions(str_get, index,\n meta=self._series._meta)\n\n\ndef str_get(series, index):\n \"\"\" Implements series.str[index] \"\"\"\n return series.str[index]\n", "path": "dask/dataframe/accessor.py"}]} | 1,585 | 152 |
gh_patches_debug_4784 | rasdani/github-patches | git_diff | PaddlePaddle__PaddleSeg-134 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
During evaluation, if the model path does not exist, a raw load op error is raised directly instead of a clear message.
--- END ISSUE ---
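Put differently: when `TEST.TEST_MODEL` points at a missing directory, the failure should surface as a clear Python error before any Paddle load op runs. A minimal sketch of such a guard, meant to be applied before `fluid.io.load_params` is called:
```
import os

def check_ckpt_dir(ckpt_dir):
    # Fail fast with a readable message instead of a low-level load op error.
    if not os.path.exists(ckpt_dir):
        raise ValueError('The TEST.TEST_MODEL {} is not found'.format(ckpt_dir))
    return ckpt_dir
```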
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pdseg/eval.py`
Content:
```
1 # coding: utf8
2 # copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.
3 #
4 # Licensed under the Apache License, Version 2.0 (the "License");
5 # you may not use this file except in compliance with the License.
6 # You may obtain a copy of the License at
7 #
8 # http://www.apache.org/licenses/LICENSE-2.0
9 #
10 # Unless required by applicable law or agreed to in writing, software
11 # distributed under the License is distributed on an "AS IS" BASIS,
12 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
13 # See the License for the specific language governing permissions and
14 # limitations under the License.
15
16 from __future__ import absolute_import
17 from __future__ import division
18 from __future__ import print_function
19
20 import os
21 # GPU memory garbage collection optimization flags
22 os.environ['FLAGS_eager_delete_tensor_gb'] = "0.0"
23
24 import sys
25 import time
26 import argparse
27 import functools
28 import pprint
29 import cv2
30 import numpy as np
31 import paddle
32 import paddle.fluid as fluid
33
34 from utils.config import cfg
35 from utils.timer import Timer, calculate_eta
36 from models.model_builder import build_model
37 from models.model_builder import ModelPhase
38 from reader import SegDataset
39 from metrics import ConfusionMatrix
40
41
42 def parse_args():
43 parser = argparse.ArgumentParser(description='PaddleSeg model evalution')
44 parser.add_argument(
45 '--cfg',
46 dest='cfg_file',
47 help='Config file for training (and optionally testing)',
48 default=None,
49 type=str)
50 parser.add_argument(
51 '--use_gpu',
52 dest='use_gpu',
53 help='Use gpu or cpu',
54 action='store_true',
55 default=False)
56 parser.add_argument(
57 '--use_mpio',
58 dest='use_mpio',
59 help='Use multiprocess IO or not',
60 action='store_true',
61 default=False)
62 parser.add_argument(
63 'opts',
64 help='See utils/config.py for all options',
65 default=None,
66 nargs=argparse.REMAINDER)
67 if len(sys.argv) == 1:
68 parser.print_help()
69 sys.exit(1)
70 return parser.parse_args()
71
72
73 def evaluate(cfg, ckpt_dir=None, use_gpu=False, use_mpio=False, **kwargs):
74 np.set_printoptions(precision=5, suppress=True)
75
76 startup_prog = fluid.Program()
77 test_prog = fluid.Program()
78 dataset = SegDataset(
79 file_list=cfg.DATASET.VAL_FILE_LIST,
80 mode=ModelPhase.EVAL,
81 data_dir=cfg.DATASET.DATA_DIR)
82
83 def data_generator():
84 #TODO: check is batch reader compatitable with Windows
85 if use_mpio:
86 data_gen = dataset.multiprocess_generator(
87 num_processes=cfg.DATALOADER.NUM_WORKERS,
88 max_queue_size=cfg.DATALOADER.BUF_SIZE)
89 else:
90 data_gen = dataset.generator()
91
92 for b in data_gen:
93 yield b[0], b[1], b[2]
94
95 py_reader, avg_loss, pred, grts, masks = build_model(
96 test_prog, startup_prog, phase=ModelPhase.EVAL)
97
98 py_reader.decorate_sample_generator(
99 data_generator, drop_last=False, batch_size=cfg.BATCH_SIZE)
100
101 # Get device environment
102 places = fluid.cuda_places() if use_gpu else fluid.cpu_places()
103 place = places[0]
104 dev_count = len(places)
105 print("#Device count: {}".format(dev_count))
106
107 exe = fluid.Executor(place)
108 exe.run(startup_prog)
109
110 test_prog = test_prog.clone(for_test=True)
111
112 ckpt_dir = cfg.TEST.TEST_MODEL if not ckpt_dir else ckpt_dir
113
114 if ckpt_dir is not None:
115 print('load test model:', ckpt_dir)
116 fluid.io.load_params(exe, ckpt_dir, main_program=test_prog)
117
118 # Use streaming confusion matrix to calculate mean_iou
119 np.set_printoptions(
120 precision=4, suppress=True, linewidth=160, floatmode="fixed")
121 conf_mat = ConfusionMatrix(cfg.DATASET.NUM_CLASSES, streaming=True)
122 fetch_list = [avg_loss.name, pred.name, grts.name, masks.name]
123 num_images = 0
124 step = 0
125 all_step = cfg.DATASET.TEST_TOTAL_IMAGES // cfg.BATCH_SIZE + 1
126 timer = Timer()
127 timer.start()
128 py_reader.start()
129 while True:
130 try:
131 step += 1
132 loss, pred, grts, masks = exe.run(
133 test_prog, fetch_list=fetch_list, return_numpy=True)
134
135 loss = np.mean(np.array(loss))
136
137 num_images += pred.shape[0]
138 conf_mat.calculate(pred, grts, masks)
139 _, iou = conf_mat.mean_iou()
140 _, acc = conf_mat.accuracy()
141
142 speed = 1.0 / timer.elapsed_time()
143
144 print(
145 "[EVAL]step={} loss={:.5f} acc={:.4f} IoU={:.4f} step/sec={:.2f} | ETA {}"
146 .format(step, loss, acc, iou, speed,
147 calculate_eta(all_step - step, speed)))
148 timer.restart()
149 sys.stdout.flush()
150 except fluid.core.EOFException:
151 break
152
153 category_iou, avg_iou = conf_mat.mean_iou()
154 category_acc, avg_acc = conf_mat.accuracy()
155 print("[EVAL]#image={} acc={:.4f} IoU={:.4f}".format(
156 num_images, avg_acc, avg_iou))
157 print("[EVAL]Category IoU:", category_iou)
158 print("[EVAL]Category Acc:", category_acc)
159 print("[EVAL]Kappa:{:.4f}".format(conf_mat.kappa()))
160
161 return category_iou, avg_iou, category_acc, avg_acc
162
163
164 def main():
165 args = parse_args()
166 if args.cfg_file is not None:
167 cfg.update_from_file(args.cfg_file)
168 if args.opts:
169 cfg.update_from_list(args.opts)
170 cfg.check_and_infer()
171 print(pprint.pformat(cfg))
172 evaluate(cfg, **args.__dict__)
173
174
175 if __name__ == '__main__':
176 main()
177
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pdseg/eval.py b/pdseg/eval.py
--- a/pdseg/eval.py
+++ b/pdseg/eval.py
@@ -111,6 +111,9 @@
ckpt_dir = cfg.TEST.TEST_MODEL if not ckpt_dir else ckpt_dir
+ if not os.path.exists(ckpt_dir):
+ raise ValueError('The TEST.TEST_MODEL {} is not found'.format(ckpt_dir))
+
if ckpt_dir is not None:
print('load test model:', ckpt_dir)
fluid.io.load_params(exe, ckpt_dir, main_program=test_prog)
| {"golden_diff": "diff --git a/pdseg/eval.py b/pdseg/eval.py\n--- a/pdseg/eval.py\n+++ b/pdseg/eval.py\n@@ -111,6 +111,9 @@\n \n ckpt_dir = cfg.TEST.TEST_MODEL if not ckpt_dir else ckpt_dir\n \n+ if not os.path.exists(ckpt_dir):\n+ raise ValueError('The TEST.TEST_MODEL {} is not found'.format(ckpt_dir))\n+\n if ckpt_dir is not None:\n print('load test model:', ckpt_dir)\n fluid.io.load_params(exe, ckpt_dir, main_program=test_prog)\n", "issue": "\u8bc4\u4f30\u65f6\u6a21\u578b\u8def\u5f84\u4e0d\u5b58\u5728\u76f4\u63a5\u62a5load op\u7684\u9519\n\n", "before_files": [{"content": "# coding: utf8\n# copyright (c) 2019 PaddlePaddle Authors. All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\n# GPU memory garbage collection optimization flags\nos.environ['FLAGS_eager_delete_tensor_gb'] = \"0.0\"\n\nimport sys\nimport time\nimport argparse\nimport functools\nimport pprint\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.fluid as fluid\n\nfrom utils.config import cfg\nfrom utils.timer import Timer, calculate_eta\nfrom models.model_builder import build_model\nfrom models.model_builder import ModelPhase\nfrom reader import SegDataset\nfrom metrics import ConfusionMatrix\n\n\ndef parse_args():\n parser = argparse.ArgumentParser(description='PaddleSeg model evalution')\n parser.add_argument(\n '--cfg',\n dest='cfg_file',\n help='Config file for training (and optionally testing)',\n default=None,\n type=str)\n parser.add_argument(\n '--use_gpu',\n dest='use_gpu',\n help='Use gpu or cpu',\n action='store_true',\n default=False)\n parser.add_argument(\n '--use_mpio',\n dest='use_mpio',\n help='Use multiprocess IO or not',\n action='store_true',\n default=False)\n parser.add_argument(\n 'opts',\n help='See utils/config.py for all options',\n default=None,\n nargs=argparse.REMAINDER)\n if len(sys.argv) == 1:\n parser.print_help()\n sys.exit(1)\n return parser.parse_args()\n\n\ndef evaluate(cfg, ckpt_dir=None, use_gpu=False, use_mpio=False, **kwargs):\n np.set_printoptions(precision=5, suppress=True)\n\n startup_prog = fluid.Program()\n test_prog = fluid.Program()\n dataset = SegDataset(\n file_list=cfg.DATASET.VAL_FILE_LIST,\n mode=ModelPhase.EVAL,\n data_dir=cfg.DATASET.DATA_DIR)\n\n def data_generator():\n #TODO: check is batch reader compatitable with Windows\n if use_mpio:\n data_gen = dataset.multiprocess_generator(\n num_processes=cfg.DATALOADER.NUM_WORKERS,\n max_queue_size=cfg.DATALOADER.BUF_SIZE)\n else:\n data_gen = dataset.generator()\n\n for b in data_gen:\n yield b[0], b[1], b[2]\n\n py_reader, avg_loss, pred, grts, masks = build_model(\n test_prog, startup_prog, phase=ModelPhase.EVAL)\n\n py_reader.decorate_sample_generator(\n data_generator, drop_last=False, batch_size=cfg.BATCH_SIZE)\n\n # Get device environment\n places = fluid.cuda_places() if use_gpu else fluid.cpu_places()\n place = places[0]\n dev_count = len(places)\n 
print(\"#Device count: {}\".format(dev_count))\n\n exe = fluid.Executor(place)\n exe.run(startup_prog)\n\n test_prog = test_prog.clone(for_test=True)\n\n ckpt_dir = cfg.TEST.TEST_MODEL if not ckpt_dir else ckpt_dir\n\n if ckpt_dir is not None:\n print('load test model:', ckpt_dir)\n fluid.io.load_params(exe, ckpt_dir, main_program=test_prog)\n\n # Use streaming confusion matrix to calculate mean_iou\n np.set_printoptions(\n precision=4, suppress=True, linewidth=160, floatmode=\"fixed\")\n conf_mat = ConfusionMatrix(cfg.DATASET.NUM_CLASSES, streaming=True)\n fetch_list = [avg_loss.name, pred.name, grts.name, masks.name]\n num_images = 0\n step = 0\n all_step = cfg.DATASET.TEST_TOTAL_IMAGES // cfg.BATCH_SIZE + 1\n timer = Timer()\n timer.start()\n py_reader.start()\n while True:\n try:\n step += 1\n loss, pred, grts, masks = exe.run(\n test_prog, fetch_list=fetch_list, return_numpy=True)\n\n loss = np.mean(np.array(loss))\n\n num_images += pred.shape[0]\n conf_mat.calculate(pred, grts, masks)\n _, iou = conf_mat.mean_iou()\n _, acc = conf_mat.accuracy()\n\n speed = 1.0 / timer.elapsed_time()\n\n print(\n \"[EVAL]step={} loss={:.5f} acc={:.4f} IoU={:.4f} step/sec={:.2f} | ETA {}\"\n .format(step, loss, acc, iou, speed,\n calculate_eta(all_step - step, speed)))\n timer.restart()\n sys.stdout.flush()\n except fluid.core.EOFException:\n break\n\n category_iou, avg_iou = conf_mat.mean_iou()\n category_acc, avg_acc = conf_mat.accuracy()\n print(\"[EVAL]#image={} acc={:.4f} IoU={:.4f}\".format(\n num_images, avg_acc, avg_iou))\n print(\"[EVAL]Category IoU:\", category_iou)\n print(\"[EVAL]Category Acc:\", category_acc)\n print(\"[EVAL]Kappa:{:.4f}\".format(conf_mat.kappa()))\n\n return category_iou, avg_iou, category_acc, avg_acc\n\n\ndef main():\n args = parse_args()\n if args.cfg_file is not None:\n cfg.update_from_file(args.cfg_file)\n if args.opts:\n cfg.update_from_list(args.opts)\n cfg.check_and_infer()\n print(pprint.pformat(cfg))\n evaluate(cfg, **args.__dict__)\n\n\nif __name__ == '__main__':\n main()\n", "path": "pdseg/eval.py"}], "after_files": [{"content": "# coding: utf8\n# copyright (c) 2019 PaddlePaddle Authors. 
All Rights Reserve.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport os\n# GPU memory garbage collection optimization flags\nos.environ['FLAGS_eager_delete_tensor_gb'] = \"0.0\"\n\nimport sys\nimport time\nimport argparse\nimport functools\nimport pprint\nimport cv2\nimport numpy as np\nimport paddle\nimport paddle.fluid as fluid\n\nfrom utils.config import cfg\nfrom utils.timer import Timer, calculate_eta\nfrom models.model_builder import build_model\nfrom models.model_builder import ModelPhase\nfrom reader import SegDataset\nfrom metrics import ConfusionMatrix\n\n\ndef parse_args():\n parser = argparse.ArgumentParser(description='PaddleSeg model evalution')\n parser.add_argument(\n '--cfg',\n dest='cfg_file',\n help='Config file for training (and optionally testing)',\n default=None,\n type=str)\n parser.add_argument(\n '--use_gpu',\n dest='use_gpu',\n help='Use gpu or cpu',\n action='store_true',\n default=False)\n parser.add_argument(\n '--use_mpio',\n dest='use_mpio',\n help='Use multiprocess IO or not',\n action='store_true',\n default=False)\n parser.add_argument(\n 'opts',\n help='See utils/config.py for all options',\n default=None,\n nargs=argparse.REMAINDER)\n if len(sys.argv) == 1:\n parser.print_help()\n sys.exit(1)\n return parser.parse_args()\n\n\ndef evaluate(cfg, ckpt_dir=None, use_gpu=False, use_mpio=False, **kwargs):\n np.set_printoptions(precision=5, suppress=True)\n\n startup_prog = fluid.Program()\n test_prog = fluid.Program()\n dataset = SegDataset(\n file_list=cfg.DATASET.VAL_FILE_LIST,\n mode=ModelPhase.EVAL,\n data_dir=cfg.DATASET.DATA_DIR)\n\n def data_generator():\n #TODO: check is batch reader compatitable with Windows\n if use_mpio:\n data_gen = dataset.multiprocess_generator(\n num_processes=cfg.DATALOADER.NUM_WORKERS,\n max_queue_size=cfg.DATALOADER.BUF_SIZE)\n else:\n data_gen = dataset.generator()\n\n for b in data_gen:\n yield b[0], b[1], b[2]\n\n py_reader, avg_loss, pred, grts, masks = build_model(\n test_prog, startup_prog, phase=ModelPhase.EVAL)\n\n py_reader.decorate_sample_generator(\n data_generator, drop_last=False, batch_size=cfg.BATCH_SIZE)\n\n # Get device environment\n places = fluid.cuda_places() if use_gpu else fluid.cpu_places()\n place = places[0]\n dev_count = len(places)\n print(\"#Device count: {}\".format(dev_count))\n\n exe = fluid.Executor(place)\n exe.run(startup_prog)\n\n test_prog = test_prog.clone(for_test=True)\n\n ckpt_dir = cfg.TEST.TEST_MODEL if not ckpt_dir else ckpt_dir\n\n if not os.path.exists(ckpt_dir):\n raise ValueError('The TEST.TEST_MODEL {} is not found'.format(ckpt_dir))\n\n if ckpt_dir is not None:\n print('load test model:', ckpt_dir)\n fluid.io.load_params(exe, ckpt_dir, main_program=test_prog)\n\n # Use streaming confusion matrix to calculate mean_iou\n np.set_printoptions(\n precision=4, suppress=True, linewidth=160, floatmode=\"fixed\")\n conf_mat = 
ConfusionMatrix(cfg.DATASET.NUM_CLASSES, streaming=True)\n fetch_list = [avg_loss.name, pred.name, grts.name, masks.name]\n num_images = 0\n step = 0\n all_step = cfg.DATASET.TEST_TOTAL_IMAGES // cfg.BATCH_SIZE + 1\n timer = Timer()\n timer.start()\n py_reader.start()\n while True:\n try:\n step += 1\n loss, pred, grts, masks = exe.run(\n test_prog, fetch_list=fetch_list, return_numpy=True)\n\n loss = np.mean(np.array(loss))\n\n num_images += pred.shape[0]\n conf_mat.calculate(pred, grts, masks)\n _, iou = conf_mat.mean_iou()\n _, acc = conf_mat.accuracy()\n\n speed = 1.0 / timer.elapsed_time()\n\n print(\n \"[EVAL]step={} loss={:.5f} acc={:.4f} IoU={:.4f} step/sec={:.2f} | ETA {}\"\n .format(step, loss, acc, iou, speed,\n calculate_eta(all_step - step, speed)))\n timer.restart()\n sys.stdout.flush()\n except fluid.core.EOFException:\n break\n\n category_iou, avg_iou = conf_mat.mean_iou()\n category_acc, avg_acc = conf_mat.accuracy()\n print(\"[EVAL]#image={} acc={:.4f} IoU={:.4f}\".format(\n num_images, avg_acc, avg_iou))\n print(\"[EVAL]Category IoU:\", category_iou)\n print(\"[EVAL]Category Acc:\", category_acc)\n print(\"[EVAL]Kappa:{:.4f}\".format(conf_mat.kappa()))\n\n return category_iou, avg_iou, category_acc, avg_acc\n\n\ndef main():\n args = parse_args()\n if args.cfg_file is not None:\n cfg.update_from_file(args.cfg_file)\n if args.opts:\n cfg.update_from_list(args.opts)\n cfg.check_and_infer()\n print(pprint.pformat(cfg))\n evaluate(cfg, **args.__dict__)\n\n\nif __name__ == '__main__':\n main()\n", "path": "pdseg/eval.py"}]} | 2,041 | 138 |
gh_patches_debug_8046 | rasdani/github-patches | git_diff | conda__conda-build-526 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AppVeyor: Commit message with braces -> failed build
https://ci.appveyor.com/project/mpi4py/mpi4py/build/2.0.0a0-13/job/0q0w2g5o32qk3m94#L522
PS: I got a warning about conda-build being outdated. Isn't `conda update --all` supposed to update it? Maybe there are conflicting version constraints with its dependencies?
--- END ISSUE ---
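Background on why arbitrary commit-message text can break the build: CI systems export the commit message as an environment variable (e.g. AppVeyor's `APPVEYOR_REPO_COMMIT_MESSAGE`), and the `bld.bat` generated by the code below writes every variable with a bare `set NAME=VALUE` line, so characters that `cmd.exe` parses specially (parentheses, ampersands, carets, ...) leak into the batch syntax. Quoting the whole assignment is a common mitigation — this is a sketch only, not necessarily the fix conda-build adopted, and note that `%`-expansion still happens inside the quotes:
```
def format_env_line(key, value):
    # set "NAME=VALUE" keeps cmd.exe from parsing ) & | ^ inside VALUE;
    # the surrounding quotes do not become part of the stored value.
    return 'set "{}={}"\n'.format(key, value)

# e.g. a commit message pulled from the CI environment:
print(format_env_line('APPVEYOR_REPO_COMMIT_MESSAGE', 'fix (windows) build {braces}'))
```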
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conda_build/windows.py`
Content:
```
1 from __future__ import absolute_import, division, print_function
2
3 import os
4 import sys
5 import shutil
6 from os.path import dirname, isdir, isfile, join, exists
7
8 import conda.config as cc
9 from conda.compat import iteritems
10
11 from conda_build.config import config
12 from conda_build import environ
13 from conda_build import source
14 from conda_build.utils import _check_call
15
16 try:
17 import psutil
18 except ImportError:
19 psutil = None
20
21 assert sys.platform == 'win32'
22
23
24 def fix_staged_scripts():
25 """
26 Fixes scripts which have been installed unix-style to have a .bat
27 helper
28 """
29 scripts_dir = join(config.build_prefix, 'Scripts')
30 if not isdir(scripts_dir):
31 return
32 for fn in os.listdir(scripts_dir):
33 # process all the extensionless files
34 if not isfile(join(scripts_dir, fn)) or '.' in fn:
35 continue
36
37 with open(join(scripts_dir, fn)) as f:
38 line = f.readline().lower()
39 # If it's a #!python script
40 if not (line.startswith('#!') and 'python' in line.lower()):
41 continue
42 print('Adjusting unix-style #! script %s, '
43 'and adding a .bat file for it' % fn)
44 # copy it with a .py extension (skipping that first #! line)
45 with open(join(scripts_dir, fn + '-script.py'), 'w') as fo:
46 fo.write(f.read())
47 # now create the .exe file
48 shutil.copyfile(join(dirname(__file__),
49 'cli-%d.exe' % (8 * tuple.__itemsize__)),
50 join(scripts_dir, fn + '.exe'))
51
52 # remove the original script
53 os.remove(join(scripts_dir, fn))
54
55
56 def msvc_env_cmd():
57 if 'ProgramFiles(x86)' in os.environ:
58 program_files = os.environ['ProgramFiles(x86)']
59 else:
60 program_files = os.environ['ProgramFiles']
61
62 localappdata = os.environ.get("localappdata")
63
64 if config.PY3K:
65 vcvarsall = os.path.join(program_files,
66 r'Microsoft Visual Studio 10.0'
67 r'\VC\vcvarsall.bat')
68 else:
69 vcvarsall = os.path.join(program_files,
70 r'Microsoft Visual Studio 9.0'
71 r'\VC\vcvarsall.bat')
72
73 # Try the Microsoft Visual C++ Compiler for Python 2.7
74 if not isfile(vcvarsall) and localappdata and not config.PY3K:
75 vcvarsall = os.path.join(localappdata, "Programs", "Common",
76 "Microsoft", "Visual C++ for Python", "9.0", "vcvarsall.bat")
77 if not isfile(vcvarsall) and program_files and not config.PY3K:
78 vcvarsall = os.path.join(program_files, 'Common Files',
79 'Microsoft', 'Visual C++ for Python', "9.0", "vcvarsall.bat")
80 if not isfile(vcvarsall):
81 print("Warning: Couldn't find Visual Studio: %r" % vcvarsall)
82 return ''
83
84 return '''\
85 call "%s" %s
86 ''' % (vcvarsall, {32: 'x86', 64: 'amd64'}[cc.bits])
87
88
89 def kill_processes():
90 if psutil is None:
91 return
92 for n in psutil.get_pid_list():
93 try:
94 p = psutil.Process(n)
95 if p.name.lower() == 'msbuild.exe':
96 print('Terminating:', p.name)
97 p.terminate()
98 except:
99 continue
100
101
102 def build(m):
103 env = dict(os.environ)
104 env.update(environ.get_dict(m))
105
106 for name in 'BIN', 'INC', 'LIB':
107 path = env['LIBRARY_' + name]
108 if not isdir(path):
109 os.makedirs(path)
110
111 src_dir = source.get_dir()
112 bld_bat = join(m.path, 'bld.bat')
113 if exists(bld_bat):
114 with open(bld_bat) as fi:
115 data = fi.read()
116 with open(join(src_dir, 'bld.bat'), 'w') as fo:
117 fo.write(msvc_env_cmd())
118 for kv in iteritems(env):
119 fo.write('set %s=%s\n' % kv)
120 # more debuggable with echo on
121 fo.write('@echo on\n')
122 fo.write("REM ===== end generated header =====\n")
123 fo.write(data)
124
125 cmd = [os.environ['COMSPEC'], '/c', 'call', 'bld.bat']
126 _check_call(cmd, cwd=src_dir)
127 kill_processes()
128 fix_staged_scripts()
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conda_build/windows.py b/conda_build/windows.py
--- a/conda_build/windows.py
+++ b/conda_build/windows.py
@@ -116,7 +116,7 @@
with open(join(src_dir, 'bld.bat'), 'w') as fo:
fo.write(msvc_env_cmd())
for kv in iteritems(env):
- fo.write('set %s=%s\n' % kv)
+ fo.write('set "%s=%s"\n' % kv)
# more debuggable with echo on
fo.write('@echo on\n')
fo.write("REM ===== end generated header =====\n")
| {"golden_diff": "diff --git a/conda_build/windows.py b/conda_build/windows.py\n--- a/conda_build/windows.py\n+++ b/conda_build/windows.py\n@@ -116,7 +116,7 @@\n with open(join(src_dir, 'bld.bat'), 'w') as fo:\n fo.write(msvc_env_cmd())\n for kv in iteritems(env):\n- fo.write('set %s=%s\\n' % kv)\n+ fo.write('set \"%s=%s\"\\n' % kv)\n # more debuggable with echo on\n fo.write('@echo on\\n')\n fo.write(\"REM ===== end generated header =====\\n\")\n", "issue": "AppVeyor: Commit message with braces -> failed build\nhttps://ci.appveyor.com/project/mpi4py/mpi4py/build/2.0.0a0-13/job/0q0w2g5o32qk3m94#L522\n\nPS: I got a warning about conda-build being outdated. Isn't `conda update --all` supposed to update it? Maybe conflicting versions with dependencies?\n\n", "before_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport os\nimport sys\nimport shutil\nfrom os.path import dirname, isdir, isfile, join, exists\n\nimport conda.config as cc\nfrom conda.compat import iteritems\n\nfrom conda_build.config import config\nfrom conda_build import environ\nfrom conda_build import source\nfrom conda_build.utils import _check_call\n\ntry:\n import psutil\nexcept ImportError:\n psutil = None\n\nassert sys.platform == 'win32'\n\n\ndef fix_staged_scripts():\n \"\"\"\n Fixes scripts which have been installed unix-style to have a .bat\n helper\n \"\"\"\n scripts_dir = join(config.build_prefix, 'Scripts')\n if not isdir(scripts_dir):\n return\n for fn in os.listdir(scripts_dir):\n # process all the extensionless files\n if not isfile(join(scripts_dir, fn)) or '.' in fn:\n continue\n\n with open(join(scripts_dir, fn)) as f:\n line = f.readline().lower()\n # If it's a #!python script\n if not (line.startswith('#!') and 'python' in line.lower()):\n continue\n print('Adjusting unix-style #! script %s, '\n 'and adding a .bat file for it' % fn)\n # copy it with a .py extension (skipping that first #! 
line)\n with open(join(scripts_dir, fn + '-script.py'), 'w') as fo:\n fo.write(f.read())\n # now create the .exe file\n shutil.copyfile(join(dirname(__file__),\n 'cli-%d.exe' % (8 * tuple.__itemsize__)),\n join(scripts_dir, fn + '.exe'))\n\n # remove the original script\n os.remove(join(scripts_dir, fn))\n\n\ndef msvc_env_cmd():\n if 'ProgramFiles(x86)' in os.environ:\n program_files = os.environ['ProgramFiles(x86)']\n else:\n program_files = os.environ['ProgramFiles']\n\n localappdata = os.environ.get(\"localappdata\")\n\n if config.PY3K:\n vcvarsall = os.path.join(program_files,\n r'Microsoft Visual Studio 10.0'\n r'\\VC\\vcvarsall.bat')\n else:\n vcvarsall = os.path.join(program_files,\n r'Microsoft Visual Studio 9.0'\n r'\\VC\\vcvarsall.bat')\n\n # Try the Microsoft Visual C++ Compiler for Python 2.7\n if not isfile(vcvarsall) and localappdata and not config.PY3K:\n vcvarsall = os.path.join(localappdata, \"Programs\", \"Common\",\n \"Microsoft\", \"Visual C++ for Python\", \"9.0\", \"vcvarsall.bat\")\n if not isfile(vcvarsall) and program_files and not config.PY3K:\n vcvarsall = os.path.join(program_files, 'Common Files',\n 'Microsoft', 'Visual C++ for Python', \"9.0\", \"vcvarsall.bat\")\n if not isfile(vcvarsall):\n print(\"Warning: Couldn't find Visual Studio: %r\" % vcvarsall)\n return ''\n\n return '''\\\ncall \"%s\" %s\n''' % (vcvarsall, {32: 'x86', 64: 'amd64'}[cc.bits])\n\n\ndef kill_processes():\n if psutil is None:\n return\n for n in psutil.get_pid_list():\n try:\n p = psutil.Process(n)\n if p.name.lower() == 'msbuild.exe':\n print('Terminating:', p.name)\n p.terminate()\n except:\n continue\n\n\ndef build(m):\n env = dict(os.environ)\n env.update(environ.get_dict(m))\n\n for name in 'BIN', 'INC', 'LIB':\n path = env['LIBRARY_' + name]\n if not isdir(path):\n os.makedirs(path)\n\n src_dir = source.get_dir()\n bld_bat = join(m.path, 'bld.bat')\n if exists(bld_bat):\n with open(bld_bat) as fi:\n data = fi.read()\n with open(join(src_dir, 'bld.bat'), 'w') as fo:\n fo.write(msvc_env_cmd())\n for kv in iteritems(env):\n fo.write('set %s=%s\\n' % kv)\n # more debuggable with echo on\n fo.write('@echo on\\n')\n fo.write(\"REM ===== end generated header =====\\n\")\n fo.write(data)\n\n cmd = [os.environ['COMSPEC'], '/c', 'call', 'bld.bat']\n _check_call(cmd, cwd=src_dir)\n kill_processes()\n fix_staged_scripts()\n", "path": "conda_build/windows.py"}], "after_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport os\nimport sys\nimport shutil\nfrom os.path import dirname, isdir, isfile, join, exists\n\nimport conda.config as cc\nfrom conda.compat import iteritems\n\nfrom conda_build.config import config\nfrom conda_build import environ\nfrom conda_build import source\nfrom conda_build.utils import _check_call\n\ntry:\n import psutil\nexcept ImportError:\n psutil = None\n\nassert sys.platform == 'win32'\n\n\ndef fix_staged_scripts():\n \"\"\"\n Fixes scripts which have been installed unix-style to have a .bat\n helper\n \"\"\"\n scripts_dir = join(config.build_prefix, 'Scripts')\n if not isdir(scripts_dir):\n return\n for fn in os.listdir(scripts_dir):\n # process all the extensionless files\n if not isfile(join(scripts_dir, fn)) or '.' in fn:\n continue\n\n with open(join(scripts_dir, fn)) as f:\n line = f.readline().lower()\n # If it's a #!python script\n if not (line.startswith('#!') and 'python' in line.lower()):\n continue\n print('Adjusting unix-style #! 
script %s, '\n 'and adding a .bat file for it' % fn)\n # copy it with a .py extension (skipping that first #! line)\n with open(join(scripts_dir, fn + '-script.py'), 'w') as fo:\n fo.write(f.read())\n # now create the .exe file\n shutil.copyfile(join(dirname(__file__),\n 'cli-%d.exe' % (8 * tuple.__itemsize__)),\n join(scripts_dir, fn + '.exe'))\n\n # remove the original script\n os.remove(join(scripts_dir, fn))\n\n\ndef msvc_env_cmd():\n if 'ProgramFiles(x86)' in os.environ:\n program_files = os.environ['ProgramFiles(x86)']\n else:\n program_files = os.environ['ProgramFiles']\n\n localappdata = os.environ.get(\"localappdata\")\n\n if config.PY3K:\n vcvarsall = os.path.join(program_files,\n r'Microsoft Visual Studio 10.0'\n r'\\VC\\vcvarsall.bat')\n else:\n vcvarsall = os.path.join(program_files,\n r'Microsoft Visual Studio 9.0'\n r'\\VC\\vcvarsall.bat')\n\n # Try the Microsoft Visual C++ Compiler for Python 2.7\n if not isfile(vcvarsall) and localappdata and not config.PY3K:\n vcvarsall = os.path.join(localappdata, \"Programs\", \"Common\",\n \"Microsoft\", \"Visual C++ for Python\", \"9.0\", \"vcvarsall.bat\")\n if not isfile(vcvarsall) and program_files and not config.PY3K:\n vcvarsall = os.path.join(program_files, 'Common Files',\n 'Microsoft', 'Visual C++ for Python', \"9.0\", \"vcvarsall.bat\")\n if not isfile(vcvarsall):\n print(\"Warning: Couldn't find Visual Studio: %r\" % vcvarsall)\n return ''\n\n return '''\\\ncall \"%s\" %s\n''' % (vcvarsall, {32: 'x86', 64: 'amd64'}[cc.bits])\n\n\ndef kill_processes():\n if psutil is None:\n return\n for n in psutil.get_pid_list():\n try:\n p = psutil.Process(n)\n if p.name.lower() == 'msbuild.exe':\n print('Terminating:', p.name)\n p.terminate()\n except:\n continue\n\n\ndef build(m):\n env = dict(os.environ)\n env.update(environ.get_dict(m))\n\n for name in 'BIN', 'INC', 'LIB':\n path = env['LIBRARY_' + name]\n if not isdir(path):\n os.makedirs(path)\n\n src_dir = source.get_dir()\n bld_bat = join(m.path, 'bld.bat')\n if exists(bld_bat):\n with open(bld_bat) as fi:\n data = fi.read()\n with open(join(src_dir, 'bld.bat'), 'w') as fo:\n fo.write(msvc_env_cmd())\n for kv in iteritems(env):\n fo.write('set \"%s=%s\"\\n' % kv)\n # more debuggable with echo on\n fo.write('@echo on\\n')\n fo.write(\"REM ===== end generated header =====\\n\")\n fo.write(data)\n\n cmd = [os.environ['COMSPEC'], '/c', 'call', 'bld.bat']\n _check_call(cmd, cwd=src_dir)\n kill_processes()\n fix_staged_scripts()\n", "path": "conda_build/windows.py"}]} | 1,680 | 141 |
gh_patches_debug_20839 | rasdani/github-patches | git_diff | mdn__kuma-7759 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make /$locale/search.json redirect
**Summary**
We get a lot of `NoReverseMatch` on URLs like https://developer.mozilla.org/en-US/search.json
That endpoint disappeared when we switched to the new search API.
Let's make it redirect.
**Additional context**
https://sentry.prod.mozaws.net/operations/mdn-prod/issues/10482841/
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kuma/search/views.py`
Content:
```
1 from urllib.parse import parse_qs, urlencode
2
3 from django.shortcuts import render
4 from django.urls import reverse_lazy
5 from django.views.decorators.cache import never_cache
6 from django.views.decorators.http import require_GET
7 from django.views.generic import RedirectView
8 from ratelimit.decorators import ratelimit
9
10 from kuma.api.v1.search import search as search_api
11 from kuma.core.decorators import shared_cache_control
12
13
14 # Since the search endpoint accepts user input (via query parameters) and its
15 # response is compressed, use rate limiting to mitigate the BREACH attack
16 # (see http://breachattack.com/). It still needs to allow a user to click
17 # the filter switches (bug 1426968).
18 # Alternate: forbid gzip by setting Content-Encoding: identity
19 @never_cache
20 @require_GET
21 @ratelimit(key="user_or_ip", rate="25/m", block=True)
22 def search(request, *args, **kwargs):
23 """
24 The search view.
25
26 --2021-- THIS VIEW IS A HACK! --2021--
27 This Django view exists to server-side render the search results page.
28 But we're moving the search result page to Yari and that one will use a XHR
29 request (to /api/v1/search) from a skeleton page (aka. SPA).
30 But as a way to get to that, we need to transition from the old to the new.
31 So, this page uses the Django view in kuma.api.v1.search.search, which
32 returns a special `JsonResponse` instance whose data we can pluck out
33 to our needs for this old view.
34 Once we've fully moved to the Yari (static + XHR to v1 API) site-search,
35 we can comfortably delete this view.
36 """
37 # The underlying v1 API supports searching without a 'q' but the web
38 # UI doesn't. For example, the search input field requires a value.
39 # So we match that here too.
40 if not request.GET.get("q", "").strip():
41 status = 400
42 context = {"results": {}}
43 else:
44 # TODO consider, if the current locale is *not* en-US, that we force
45 # it to do a search in both locales.
46 # This might come in handy for people searching in a locale where
47 # there's very little results but they'd be happy to get the en-US ones.
48 response = search_api(request, *args, **kwargs)
49 results = response.data
50
51 error = None
52 status = response.status_code
53
54 # Determine if there were validation errors
55 if status == 400:
56 error = ""
57 for key, messages in results["errors"].items():
58 for message in messages:
59 error += f"{key}: {message['message']}\n"
60 else:
61 # Have to rearrange the 'results' in a way the old search expects it.
62 # ...which is as follows:
63 # - `count`: integer number of matched documents
64 # - `previous`: a URL or empty string
65 # - `next`: a URL or empty string
66 # - `query`: string
67 # - `start`: pagination number
68 # - `end`: pagination number
69 # - `documents`:
70 # - `title`
71 # - `locale`
72 # - `slug`
73 # - `excerpt`: string of safe HTML
74 next_url = ""
75 previous_url = ""
76 page = results["metadata"]["page"]
77 size = results["metadata"]["size"]
78 count = results["metadata"]["total"]["value"]
79 query_string = request.META.get("QUERY_STRING")
80 query_string_parsed = parse_qs(query_string)
81 if (page + 1) * size < count:
82 query_string_parsed["page"] = f"{page + 1}"
83 next_url = f"?{urlencode(query_string_parsed, True)}"
84 if page > 1:
85 if page == 2:
86 del query_string_parsed["page"]
87 else:
88 query_string_parsed["page"] = f"{page - 1}"
89 previous_url = f"?{urlencode(query_string_parsed, True)}"
90
91 results = {
92 "count": count,
93 "next": next_url,
94 "previous": previous_url,
95 "query": request.GET.get("q"),
96 "start": (page - 1) * size + 1,
97 "end": page * size,
98 "documents": [
99 {
100 "title": x["title"],
101 "slug": x["slug"],
102 "locale": x["locale"],
103 "excerpt": "<br>".join(x["highlight"].get("body", [])),
104 }
105 for x in results["documents"]
106 ],
107 }
108
109 context = {"results": {"results": None if error else results, "error": error}}
110 return render(request, "search/react.html", context, status=status)
111
112
113 class SearchRedirectView(RedirectView):
114 permanent = True
115
116 def get_redirect_url(self, *args, **kwargs):
117 query_string = self.request.META.get("QUERY_STRING")
118 url = reverse_lazy(
119 "api.v1.search", kwargs={"locale": self.request.LANGUAGE_CODE}
120 )
121 if query_string:
122 url += "?" + query_string
123 return url
124
125
126 @shared_cache_control(s_maxage=60 * 60 * 24 * 7)
127 def plugin(request):
128 """Render an OpenSearch Plugin."""
129 return render(
130 request,
131 "search/plugin.html",
132 {"locale": request.LANGUAGE_CODE},
133 content_type="application/opensearchdescription+xml",
134 )
135
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kuma/search/views.py b/kuma/search/views.py
--- a/kuma/search/views.py
+++ b/kuma/search/views.py
@@ -1,5 +1,6 @@
from urllib.parse import parse_qs, urlencode
+from django.conf import settings
from django.shortcuts import render
from django.urls import reverse_lazy
from django.views.decorators.cache import never_cache
@@ -115,11 +116,14 @@
def get_redirect_url(self, *args, **kwargs):
query_string = self.request.META.get("QUERY_STRING")
- url = reverse_lazy(
- "api.v1.search", kwargs={"locale": self.request.LANGUAGE_CODE}
- )
- if query_string:
- url += "?" + query_string
+ url = reverse_lazy("api.v1.search")
+ qs = parse_qs(query_string)
+ # If you used `/en-Us/search.json` you can skip the `?locale=`
+ # because the default locale in `/api/v1/search` is `en-US`.
+ if self.request.LANGUAGE_CODE.lower() != settings.LANGUAGE_CODE.lower():
+ qs["locale"] = self.request.LANGUAGE_CODE
+ if qs:
+ url += "?" + urlencode(qs, True)
return url
| {"golden_diff": "diff --git a/kuma/search/views.py b/kuma/search/views.py\n--- a/kuma/search/views.py\n+++ b/kuma/search/views.py\n@@ -1,5 +1,6 @@\n from urllib.parse import parse_qs, urlencode\n \n+from django.conf import settings\n from django.shortcuts import render\n from django.urls import reverse_lazy\n from django.views.decorators.cache import never_cache\n@@ -115,11 +116,14 @@\n \n def get_redirect_url(self, *args, **kwargs):\n query_string = self.request.META.get(\"QUERY_STRING\")\n- url = reverse_lazy(\n- \"api.v1.search\", kwargs={\"locale\": self.request.LANGUAGE_CODE}\n- )\n- if query_string:\n- url += \"?\" + query_string\n+ url = reverse_lazy(\"api.v1.search\")\n+ qs = parse_qs(query_string)\n+ # If you used `/en-Us/search.json` you can skip the `?locale=`\n+ # because the default locale in `/api/v1/search` is `en-US`.\n+ if self.request.LANGUAGE_CODE.lower() != settings.LANGUAGE_CODE.lower():\n+ qs[\"locale\"] = self.request.LANGUAGE_CODE\n+ if qs:\n+ url += \"?\" + urlencode(qs, True)\n return url\n", "issue": "Make /$locale/search.json redirect\n**Summary**\r\nWe get a lot of `NoReverseMatch` on URLs like https://developer.mozilla.org/en-US/search.json\r\nThat endpoint disappeared when we switched to the new search API. \r\nLet's make it redirect. \r\n\r\n\r\n**Additional context**\r\nhttps://sentry.prod.mozaws.net/operations/mdn-prod/issues/10482841/\r\n\n", "before_files": [{"content": "from urllib.parse import parse_qs, urlencode\n\nfrom django.shortcuts import render\nfrom django.urls import reverse_lazy\nfrom django.views.decorators.cache import never_cache\nfrom django.views.decorators.http import require_GET\nfrom django.views.generic import RedirectView\nfrom ratelimit.decorators import ratelimit\n\nfrom kuma.api.v1.search import search as search_api\nfrom kuma.core.decorators import shared_cache_control\n\n\n# Since the search endpoint accepts user input (via query parameters) and its\n# response is compressed, use rate limiting to mitigate the BREACH attack\n# (see http://breachattack.com/). It still needs to allow a user to click\n# the filter switches (bug 1426968).\n# Alternate: forbid gzip by setting Content-Encoding: identity\n@never_cache\n@require_GET\n@ratelimit(key=\"user_or_ip\", rate=\"25/m\", block=True)\ndef search(request, *args, **kwargs):\n \"\"\"\n The search view.\n\n --2021-- THIS VIEW IS A HACK! --2021--\n This Django view exists to server-side render the search results page.\n But we're moving the search result page to Yari and that one will use a XHR\n request (to /api/v1/search) from a skeleton page (aka. SPA).\n But as a way to get to that, we need to transition from the old to the new.\n So, this page uses the Django view in kuma.api.v1.search.search, which\n returns a special `JsonResponse` instance whose data we can pluck out\n to our needs for this old view.\n Once we've fully moved to the Yari (static + XHR to v1 API) site-search,\n we can comfortably delete this view.\n \"\"\"\n # The underlying v1 API supports searching without a 'q' but the web\n # UI doesn't. 
For example, the search input field requires a value.\n # So we match that here too.\n if not request.GET.get(\"q\", \"\").strip():\n status = 400\n context = {\"results\": {}}\n else:\n # TODO consider, if the current locale is *not* en-US, that we force\n # it to do a search in both locales.\n # This might come in handy for people searching in a locale where\n # there's very little results but they'd be happy to get the en-US ones.\n response = search_api(request, *args, **kwargs)\n results = response.data\n\n error = None\n status = response.status_code\n\n # Determine if there were validation errors\n if status == 400:\n error = \"\"\n for key, messages in results[\"errors\"].items():\n for message in messages:\n error += f\"{key}: {message['message']}\\n\"\n else:\n # Have to rearrange the 'results' in a way the old search expects it.\n # ...which is as follows:\n # - `count`: integer number of matched documents\n # - `previous`: a URL or empty string\n # - `next`: a URL or empty string\n # - `query`: string\n # - `start`: pagination number\n # - `end`: pagination number\n # - `documents`:\n # - `title`\n # - `locale`\n # - `slug`\n # - `excerpt`: string of safe HTML\n next_url = \"\"\n previous_url = \"\"\n page = results[\"metadata\"][\"page\"]\n size = results[\"metadata\"][\"size\"]\n count = results[\"metadata\"][\"total\"][\"value\"]\n query_string = request.META.get(\"QUERY_STRING\")\n query_string_parsed = parse_qs(query_string)\n if (page + 1) * size < count:\n query_string_parsed[\"page\"] = f\"{page + 1}\"\n next_url = f\"?{urlencode(query_string_parsed, True)}\"\n if page > 1:\n if page == 2:\n del query_string_parsed[\"page\"]\n else:\n query_string_parsed[\"page\"] = f\"{page - 1}\"\n previous_url = f\"?{urlencode(query_string_parsed, True)}\"\n\n results = {\n \"count\": count,\n \"next\": next_url,\n \"previous\": previous_url,\n \"query\": request.GET.get(\"q\"),\n \"start\": (page - 1) * size + 1,\n \"end\": page * size,\n \"documents\": [\n {\n \"title\": x[\"title\"],\n \"slug\": x[\"slug\"],\n \"locale\": x[\"locale\"],\n \"excerpt\": \"<br>\".join(x[\"highlight\"].get(\"body\", [])),\n }\n for x in results[\"documents\"]\n ],\n }\n\n context = {\"results\": {\"results\": None if error else results, \"error\": error}}\n return render(request, \"search/react.html\", context, status=status)\n\n\nclass SearchRedirectView(RedirectView):\n permanent = True\n\n def get_redirect_url(self, *args, **kwargs):\n query_string = self.request.META.get(\"QUERY_STRING\")\n url = reverse_lazy(\n \"api.v1.search\", kwargs={\"locale\": self.request.LANGUAGE_CODE}\n )\n if query_string:\n url += \"?\" + query_string\n return url\n\n\n@shared_cache_control(s_maxage=60 * 60 * 24 * 7)\ndef plugin(request):\n \"\"\"Render an OpenSearch Plugin.\"\"\"\n return render(\n request,\n \"search/plugin.html\",\n {\"locale\": request.LANGUAGE_CODE},\n content_type=\"application/opensearchdescription+xml\",\n )\n", "path": "kuma/search/views.py"}], "after_files": [{"content": "from urllib.parse import parse_qs, urlencode\n\nfrom django.conf import settings\nfrom django.shortcuts import render\nfrom django.urls import reverse_lazy\nfrom django.views.decorators.cache import never_cache\nfrom django.views.decorators.http import require_GET\nfrom django.views.generic import RedirectView\nfrom ratelimit.decorators import ratelimit\n\nfrom kuma.api.v1.search import search as search_api\nfrom kuma.core.decorators import shared_cache_control\n\n\n# Since the search endpoint accepts user input (via query 
parameters) and its\n# response is compressed, use rate limiting to mitigate the BREACH attack\n# (see http://breachattack.com/). It still needs to allow a user to click\n# the filter switches (bug 1426968).\n# Alternate: forbid gzip by setting Content-Encoding: identity\n@never_cache\n@require_GET\n@ratelimit(key=\"user_or_ip\", rate=\"25/m\", block=True)\ndef search(request, *args, **kwargs):\n \"\"\"\n The search view.\n\n --2021-- THIS VIEW IS A HACK! --2021--\n This Django view exists to server-side render the search results page.\n But we're moving the search result page to Yari and that one will use a XHR\n request (to /api/v1/search) from a skeleton page (aka. SPA).\n But as a way to get to that, we need to transition from the old to the new.\n So, this page uses the Django view in kuma.api.v1.search.search, which\n returns a special `JsonResponse` instance whose data we can pluck out\n to our needs for this old view.\n Once we've fully moved to the Yari (static + XHR to v1 API) site-search,\n we can comfortably delete this view.\n \"\"\"\n # The underlying v1 API supports searching without a 'q' but the web\n # UI doesn't. For example, the search input field requires a value.\n # So we match that here too.\n if not request.GET.get(\"q\", \"\").strip():\n status = 400\n context = {\"results\": {}}\n else:\n # TODO consider, if the current locale is *not* en-US, that we force\n # it to do a search in both locales.\n # This might come in handy for people searching in a locale where\n # there's very little results but they'd be happy to get the en-US ones.\n response = search_api(request, *args, **kwargs)\n results = response.data\n\n error = None\n status = response.status_code\n\n # Determine if there were validation errors\n if status == 400:\n error = \"\"\n for key, messages in results[\"errors\"].items():\n for message in messages:\n error += f\"{key}: {message['message']}\\n\"\n else:\n # Have to rearrange the 'results' in a way the old search expects it.\n # ...which is as follows:\n # - `count`: integer number of matched documents\n # - `previous`: a URL or empty string\n # - `next`: a URL or empty string\n # - `query`: string\n # - `start`: pagination number\n # - `end`: pagination number\n # - `documents`:\n # - `title`\n # - `locale`\n # - `slug`\n # - `excerpt`: string of safe HTML\n next_url = \"\"\n previous_url = \"\"\n page = results[\"metadata\"][\"page\"]\n size = results[\"metadata\"][\"size\"]\n count = results[\"metadata\"][\"total\"][\"value\"]\n query_string = request.META.get(\"QUERY_STRING\")\n query_string_parsed = parse_qs(query_string)\n if (page + 1) * size < count:\n query_string_parsed[\"page\"] = f\"{page + 1}\"\n next_url = f\"?{urlencode(query_string_parsed, True)}\"\n if page > 1:\n if page == 2:\n del query_string_parsed[\"page\"]\n else:\n query_string_parsed[\"page\"] = f\"{page - 1}\"\n previous_url = f\"?{urlencode(query_string_parsed, True)}\"\n\n results = {\n \"count\": count,\n \"next\": next_url,\n \"previous\": previous_url,\n \"query\": request.GET.get(\"q\"),\n \"start\": (page - 1) * size + 1,\n \"end\": page * size,\n \"documents\": [\n {\n \"title\": x[\"title\"],\n \"slug\": x[\"slug\"],\n \"locale\": x[\"locale\"],\n \"excerpt\": \"<br>\".join(x[\"highlight\"].get(\"body\", [])),\n }\n for x in results[\"documents\"]\n ],\n }\n\n context = {\"results\": {\"results\": None if error else results, \"error\": error}}\n return render(request, \"search/react.html\", context, status=status)\n\n\nclass SearchRedirectView(RedirectView):\n 
permanent = True\n\n def get_redirect_url(self, *args, **kwargs):\n query_string = self.request.META.get(\"QUERY_STRING\")\n url = reverse_lazy(\"api.v1.search\")\n qs = parse_qs(query_string)\n # If you used `/en-Us/search.json` you can skip the `?locale=`\n # because the default locale in `/api/v1/search` is `en-US`.\n if self.request.LANGUAGE_CODE.lower() != settings.LANGUAGE_CODE.lower():\n qs[\"locale\"] = self.request.LANGUAGE_CODE\n if qs:\n url += \"?\" + urlencode(qs, True)\n return url\n\n\n@shared_cache_control(s_maxage=60 * 60 * 24 * 7)\ndef plugin(request):\n \"\"\"Render an OpenSearch Plugin.\"\"\"\n return render(\n request,\n \"search/plugin.html\",\n {\"locale\": request.LANGUAGE_CODE},\n content_type=\"application/opensearchdescription+xml\",\n )\n", "path": "kuma/search/views.py"}]} | 1,863 | 279 |
gh_patches_debug_4542 | rasdani/github-patches | git_diff | open-mmlab__mmpretrain-122 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
mmcls/models/losses/eval_metrics.py confusion_matrix
confusion_matrix[target_label.long(), pred_label.long()] += 1
I think this code is wrong: `[target_label.long(), pred_label.long()]` lists all the coordinates that need +1, but each coordinate is incremented only once, even when the same (target, pred) pair occurs multiple times.
it should be:
```
for t, p in zip(target_label, pred_label):
    confusion_matrix[t.long(), p.long()] += 1
```
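For reference, a minimal sketch demonstrating the indexing behaviour (assuming a 2-class toy case where the same coordinate occurs twice):

```python
import torch

target_label = torch.tensor([0, 0])
pred_label = torch.tensor([1, 1])

cm = torch.zeros(2, 2)
cm[target_label.long(), pred_label.long()] += 1
print(cm[0, 1])  # tensor(1.) -- duplicate coordinates are written, not accumulated

cm = torch.zeros(2, 2)
for t, p in zip(target_label, pred_label):
    cm[t.long(), p.long()] += 1
print(cm[0, 1])  # tensor(2.) -- every sample is counted
```

An `index_put_((target_label, pred_label), ..., accumulate=True)` call would also count every sample without the Python loop.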
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mmcls/models/losses/eval_metrics.py`
Content:
```
1 import numpy as np
2 import torch
3
4
5 def calculate_confusion_matrix(pred, target):
6 if isinstance(pred, np.ndarray) and isinstance(target, np.ndarray):
7 pred = torch.from_numpy(pred)
8 target = torch.from_numpy(target)
9 elif not (isinstance(pred, torch.Tensor)
10 and isinstance(target, torch.Tensor)):
11 raise TypeError('pred and target should both be'
12 'torch.Tensor or np.ndarray')
13 _, pred_label = pred.topk(1, dim=1)
14 num_classes = pred.size(1)
15 pred_label = pred_label.view(-1)
16 target_label = target.view(-1)
17 assert len(pred_label) == len(target_label)
18 confusion_matrix = torch.zeros(num_classes, num_classes)
19 with torch.no_grad():
20 confusion_matrix[target_label.long(), pred_label.long()] += 1
21 return confusion_matrix
22
23
24 def precision(pred, target):
25 """Calculate macro-averaged precision according to the prediction and target
26
27 Args:
28 pred (torch.Tensor | np.array): The model prediction.
29 target (torch.Tensor | np.array): The target of each prediction.
30
31 Returns:
32 float: The function will return a single float as precision.
33 """
34 confusion_matrix = calculate_confusion_matrix(pred, target)
35 with torch.no_grad():
36 res = confusion_matrix.diag() / torch.clamp(
37 confusion_matrix.sum(0), min=1)
38 res = res.mean().item() * 100
39 return res
40
41
42 def recall(pred, target):
43 """Calculate macro-averaged recall according to the prediction and target
44
45 Args:
46 pred (torch.Tensor | np.array): The model prediction.
47 target (torch.Tensor | np.array): The target of each prediction.
48
49 Returns:
50 float: The function will return a single float as recall.
51 """
52 confusion_matrix = calculate_confusion_matrix(pred, target)
53 with torch.no_grad():
54 res = confusion_matrix.diag() / torch.clamp(
55 confusion_matrix.sum(1), min=1)
56 res = res.mean().item() * 100
57 return res
58
59
60 def f1_score(pred, target):
61 """Calculate macro-averaged F1 score according to the prediction and target
62
63 Args:
64 pred (torch.Tensor | np.array): The model prediction.
65 target (torch.Tensor | np.array): The target of each prediction.
66
67 Returns:
68 float: The function will return a single float as F1 score.
69 """
70 confusion_matrix = calculate_confusion_matrix(pred, target)
71 with torch.no_grad():
72 precision = confusion_matrix.diag() / torch.clamp(
73 confusion_matrix.sum(1), min=1)
74 recall = confusion_matrix.diag() / torch.clamp(
75 confusion_matrix.sum(0), min=1)
76 res = 2 * precision * recall / torch.clamp(
77 precision + recall, min=1e-20)
78 res = torch.where(torch.isnan(res), torch.full_like(res, 0), res)
79 res = res.mean().item() * 100
80 return res
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mmcls/models/losses/eval_metrics.py b/mmcls/models/losses/eval_metrics.py
--- a/mmcls/models/losses/eval_metrics.py
+++ b/mmcls/models/losses/eval_metrics.py
@@ -17,7 +17,8 @@
assert len(pred_label) == len(target_label)
confusion_matrix = torch.zeros(num_classes, num_classes)
with torch.no_grad():
- confusion_matrix[target_label.long(), pred_label.long()] += 1
+ for t, p in zip(target_label, pred_label):
+ confusion_matrix[t.long(), p.long()] += 1
return confusion_matrix
| {"golden_diff": "diff --git a/mmcls/models/losses/eval_metrics.py b/mmcls/models/losses/eval_metrics.py\n--- a/mmcls/models/losses/eval_metrics.py\n+++ b/mmcls/models/losses/eval_metrics.py\n@@ -17,7 +17,8 @@\n assert len(pred_label) == len(target_label)\n confusion_matrix = torch.zeros(num_classes, num_classes)\n with torch.no_grad():\n- confusion_matrix[target_label.long(), pred_label.long()] += 1\n+ for t, p in zip(target_label, pred_label):\n+ confusion_matrix[t.long(), p.long()] += 1\n return confusion_matrix\n", "issue": "mmcls/models/losses/eval_metrics.py confusion_matrix\nconfusion_matrix[target_label.long(), pred_label.long()] += 1\r\nI think this code is wrong, \u3010target_label.long(), pred_label.long()\u3011 will list all the coordinates that need + 1, but only once + 1 will work\r\nit should be:\r\n`for t, p in zip(target_label, pred_label):\r\n confusion_matrix[t.long(), p.long()] += 1 `\r\n\n", "before_files": [{"content": "import numpy as np\nimport torch\n\n\ndef calculate_confusion_matrix(pred, target):\n if isinstance(pred, np.ndarray) and isinstance(target, np.ndarray):\n pred = torch.from_numpy(pred)\n target = torch.from_numpy(target)\n elif not (isinstance(pred, torch.Tensor)\n and isinstance(target, torch.Tensor)):\n raise TypeError('pred and target should both be'\n 'torch.Tensor or np.ndarray')\n _, pred_label = pred.topk(1, dim=1)\n num_classes = pred.size(1)\n pred_label = pred_label.view(-1)\n target_label = target.view(-1)\n assert len(pred_label) == len(target_label)\n confusion_matrix = torch.zeros(num_classes, num_classes)\n with torch.no_grad():\n confusion_matrix[target_label.long(), pred_label.long()] += 1\n return confusion_matrix\n\n\ndef precision(pred, target):\n \"\"\"Calculate macro-averaged precision according to the prediction and target\n\n Args:\n pred (torch.Tensor | np.array): The model prediction.\n target (torch.Tensor | np.array): The target of each prediction.\n\n Returns:\n float: The function will return a single float as precision.\n \"\"\"\n confusion_matrix = calculate_confusion_matrix(pred, target)\n with torch.no_grad():\n res = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(0), min=1)\n res = res.mean().item() * 100\n return res\n\n\ndef recall(pred, target):\n \"\"\"Calculate macro-averaged recall according to the prediction and target\n\n Args:\n pred (torch.Tensor | np.array): The model prediction.\n target (torch.Tensor | np.array): The target of each prediction.\n\n Returns:\n float: The function will return a single float as recall.\n \"\"\"\n confusion_matrix = calculate_confusion_matrix(pred, target)\n with torch.no_grad():\n res = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(1), min=1)\n res = res.mean().item() * 100\n return res\n\n\ndef f1_score(pred, target):\n \"\"\"Calculate macro-averaged F1 score according to the prediction and target\n\n Args:\n pred (torch.Tensor | np.array): The model prediction.\n target (torch.Tensor | np.array): The target of each prediction.\n\n Returns:\n float: The function will return a single float as F1 score.\n \"\"\"\n confusion_matrix = calculate_confusion_matrix(pred, target)\n with torch.no_grad():\n precision = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(1), min=1)\n recall = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(0), min=1)\n res = 2 * precision * recall / torch.clamp(\n precision + recall, min=1e-20)\n res = torch.where(torch.isnan(res), torch.full_like(res, 0), res)\n res = res.mean().item() * 100\n 
return res\n", "path": "mmcls/models/losses/eval_metrics.py"}], "after_files": [{"content": "import numpy as np\nimport torch\n\n\ndef calculate_confusion_matrix(pred, target):\n if isinstance(pred, np.ndarray) and isinstance(target, np.ndarray):\n pred = torch.from_numpy(pred)\n target = torch.from_numpy(target)\n elif not (isinstance(pred, torch.Tensor)\n and isinstance(target, torch.Tensor)):\n raise TypeError('pred and target should both be'\n 'torch.Tensor or np.ndarray')\n _, pred_label = pred.topk(1, dim=1)\n num_classes = pred.size(1)\n pred_label = pred_label.view(-1)\n target_label = target.view(-1)\n assert len(pred_label) == len(target_label)\n confusion_matrix = torch.zeros(num_classes, num_classes)\n with torch.no_grad():\n for t, p in zip(target_label, pred_label):\n confusion_matrix[t.long(), p.long()] += 1\n return confusion_matrix\n\n\ndef precision(pred, target):\n \"\"\"Calculate macro-averaged precision according to the prediction and target\n\n Args:\n pred (torch.Tensor | np.array): The model prediction.\n target (torch.Tensor | np.array): The target of each prediction.\n\n Returns:\n float: The function will return a single float as precision.\n \"\"\"\n confusion_matrix = calculate_confusion_matrix(pred, target)\n with torch.no_grad():\n res = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(0), min=1)\n res = res.mean().item() * 100\n return res\n\n\ndef recall(pred, target):\n \"\"\"Calculate macro-averaged recall according to the prediction and target\n\n Args:\n pred (torch.Tensor | np.array): The model prediction.\n target (torch.Tensor | np.array): The target of each prediction.\n\n Returns:\n float: The function will return a single float as recall.\n \"\"\"\n confusion_matrix = calculate_confusion_matrix(pred, target)\n with torch.no_grad():\n res = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(1), min=1)\n res = res.mean().item() * 100\n return res\n\n\ndef f1_score(pred, target):\n \"\"\"Calculate macro-averaged F1 score according to the prediction and target\n\n Args:\n pred (torch.Tensor | np.array): The model prediction.\n target (torch.Tensor | np.array): The target of each prediction.\n\n Returns:\n float: The function will return a single float as F1 score.\n \"\"\"\n confusion_matrix = calculate_confusion_matrix(pred, target)\n with torch.no_grad():\n precision = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(1), min=1)\n recall = confusion_matrix.diag() / torch.clamp(\n confusion_matrix.sum(0), min=1)\n res = 2 * precision * recall / torch.clamp(\n precision + recall, min=1e-20)\n res = torch.where(torch.isnan(res), torch.full_like(res, 0), res)\n res = res.mean().item() * 100\n return res\n", "path": "mmcls/models/losses/eval_metrics.py"}]} | 1,168 | 142 |
gh_patches_debug_28395 | rasdani/github-patches | git_diff | pantsbuild__pants-16264 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Not able to load resources when using pants vs sbt
**Describe the bug**
When using sbt we are able to call `Thread.currentThread().getContextClassLoader().getResources` and get a list of URLs. When using pants the list is empty.
At the moment this prevents us from using Flyway with Pants.
**Pants version**
2.13.0a1 and main.
**OS**
MacOS
**Additional info**
Example repo to reproduce the issue:
https://github.com/somdoron/test-pants-resources
I think the issue is that Pants only compresses the files into the resources zip and does not add entries for the directories themselves.
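To illustrate that hypothesis (a generic sketch, not Pants' actual build code): a JVM classloader can only enumerate `getResources("some/dir")` when the archive contains explicit directory entries, so a resources jar would need those entries written alongside the files. The resource path below is hypothetical.

```python
import zipfile
from pathlib import Path

files = ["db/migration/V1__init.sql"]  # hypothetical resource path
with zipfile.ZipFile("resources.jar", "w") as jar:
    # Add a zero-byte entry ending in '/' for every parent directory.
    dirs = {f"{p}/" for f in files for p in Path(f).parents if str(p) != "."}
    for d in sorted(dirs):
        jar.writestr(zipfile.ZipInfo(d), "")  # explicit directory entry
    for f in sorted(files):
        jar.writestr(f, "-- contents --")
```

The fix only needs the directory names handed to whatever builds the archive; the patch below takes that route by feeding the parent directories of each file to the `zip` invocation.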
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/python/pants/jvm/resources.py`
Content:
```
1 # Copyright 2021 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 import itertools
5 import logging
6 from itertools import chain
7
8 from pants.core.target_types import ResourcesFieldSet, ResourcesGeneratorFieldSet
9 from pants.core.util_rules import stripped_source_files
10 from pants.core.util_rules.source_files import SourceFilesRequest
11 from pants.core.util_rules.stripped_source_files import StrippedSourceFiles
12 from pants.core.util_rules.system_binaries import ZipBinary
13 from pants.engine.fs import Digest, MergeDigests
14 from pants.engine.internals.selectors import MultiGet
15 from pants.engine.process import Process, ProcessResult
16 from pants.engine.rules import Get, collect_rules, rule
17 from pants.engine.target import SourcesField
18 from pants.engine.unions import UnionRule
19 from pants.jvm import compile
20 from pants.jvm.compile import (
21 ClasspathDependenciesRequest,
22 ClasspathEntry,
23 ClasspathEntryRequest,
24 ClasspathEntryRequests,
25 CompileResult,
26 FallibleClasspathEntries,
27 FallibleClasspathEntry,
28 )
29 from pants.jvm.strip_jar.strip_jar import StripJarRequest
30 from pants.jvm.subsystems import JvmSubsystem
31 from pants.util.logging import LogLevel
32
33 logger = logging.getLogger(__name__)
34
35
36 class JvmResourcesRequest(ClasspathEntryRequest):
37 field_sets = (
38 ResourcesFieldSet,
39 ResourcesGeneratorFieldSet,
40 )
41
42
43 @rule(desc="Assemble resources")
44 async def assemble_resources_jar(
45 zip: ZipBinary,
46 jvm: JvmSubsystem,
47 request: JvmResourcesRequest,
48 ) -> FallibleClasspathEntry:
49 # Request the component's direct dependency classpath, and additionally any prerequisite.
50 # Filter out any dependencies that are generated by our current target so that each resource
51 # only appears in a single input JAR.
52 # NOTE: Generated dependencies will have the same dependencies as the current target, so we
53 # don't need to inspect those dependencies.
54 optional_prereq_request = [*((request.prerequisite,) if request.prerequisite else ())]
55 fallibles = await MultiGet(
56 Get(FallibleClasspathEntries, ClasspathEntryRequests(optional_prereq_request)),
57 Get(FallibleClasspathEntries, ClasspathDependenciesRequest(request, ignore_generated=True)),
58 )
59 direct_dependency_classpath_entries = FallibleClasspathEntries(
60 itertools.chain(*fallibles)
61 ).if_all_succeeded()
62
63 if direct_dependency_classpath_entries is None:
64 return FallibleClasspathEntry(
65 description=str(request.component),
66 result=CompileResult.DEPENDENCY_FAILED,
67 output=None,
68 exit_code=1,
69 )
70
71 source_files = await Get(
72 StrippedSourceFiles,
73 SourceFilesRequest([tgt.get(SourcesField) for tgt in request.component.members]),
74 )
75
76 output_filename = f"{request.component.representative.address.path_safe_spec}.resources.jar"
77 output_files = [output_filename]
78
79 resources_jar_input_digest = source_files.snapshot.digest
80 resources_jar_result = await Get(
81 ProcessResult,
82 Process(
83 argv=[
84 zip.path,
85 output_filename,
86 *source_files.snapshot.files,
87 ],
88 description="Build resources JAR for {request.component}",
89 input_digest=resources_jar_input_digest,
90 output_files=output_files,
91 level=LogLevel.DEBUG,
92 ),
93 )
94
95 output_digest = resources_jar_result.output_digest
96 if jvm.reproducible_jars:
97 output_digest = await Get(Digest, StripJarRequest(output_digest, tuple(output_files)))
98 cpe = ClasspathEntry(output_digest, output_files, [])
99
100 merged_cpe_digest = await Get(
101 Digest,
102 MergeDigests(chain((cpe.digest,), (i.digest for i in direct_dependency_classpath_entries))),
103 )
104
105 merged_cpe = ClasspathEntry.merge(
106 digest=merged_cpe_digest, entries=[cpe, *direct_dependency_classpath_entries]
107 )
108
109 return FallibleClasspathEntry(output_filename, CompileResult.SUCCEEDED, merged_cpe, 0)
110
111
112 def rules():
113 return [
114 *collect_rules(),
115 *compile.rules(),
116 *stripped_source_files.rules(),
117 UnionRule(ClasspathEntryRequest, JvmResourcesRequest),
118 ]
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/python/pants/jvm/resources.py b/src/python/pants/jvm/resources.py
--- a/src/python/pants/jvm/resources.py
+++ b/src/python/pants/jvm/resources.py
@@ -4,6 +4,7 @@
import itertools
import logging
from itertools import chain
+from pathlib import Path
from pants.core.target_types import ResourcesFieldSet, ResourcesGeneratorFieldSet
from pants.core.util_rules import stripped_source_files
@@ -76,6 +77,13 @@
output_filename = f"{request.component.representative.address.path_safe_spec}.resources.jar"
output_files = [output_filename]
+ # #16231: Valid JAR files need the directories of each resource file as well as the files
+ # themselves.
+
+ paths = {Path(filename) for filename in source_files.snapshot.files}
+ directories = {parent for path in paths for parent in path.parents}
+ input_files = {str(path) for path in chain(paths, directories)}
+
resources_jar_input_digest = source_files.snapshot.digest
resources_jar_result = await Get(
ProcessResult,
@@ -83,7 +91,7 @@
argv=[
zip.path,
output_filename,
- *source_files.snapshot.files,
+ *sorted(input_files),
],
description="Build resources JAR for {request.component}",
input_digest=resources_jar_input_digest,
| {"golden_diff": "diff --git a/src/python/pants/jvm/resources.py b/src/python/pants/jvm/resources.py\n--- a/src/python/pants/jvm/resources.py\n+++ b/src/python/pants/jvm/resources.py\n@@ -4,6 +4,7 @@\n import itertools\n import logging\n from itertools import chain\n+from pathlib import Path\n \n from pants.core.target_types import ResourcesFieldSet, ResourcesGeneratorFieldSet\n from pants.core.util_rules import stripped_source_files\n@@ -76,6 +77,13 @@\n output_filename = f\"{request.component.representative.address.path_safe_spec}.resources.jar\"\n output_files = [output_filename]\n \n+ # #16231: Valid JAR files need the directories of each resource file as well as the files\n+ # themselves.\n+\n+ paths = {Path(filename) for filename in source_files.snapshot.files}\n+ directories = {parent for path in paths for parent in path.parents}\n+ input_files = {str(path) for path in chain(paths, directories)}\n+\n resources_jar_input_digest = source_files.snapshot.digest\n resources_jar_result = await Get(\n ProcessResult,\n@@ -83,7 +91,7 @@\n argv=[\n zip.path,\n output_filename,\n- *source_files.snapshot.files,\n+ *sorted(input_files),\n ],\n description=\"Build resources JAR for {request.component}\",\n input_digest=resources_jar_input_digest,\n", "issue": "Not able to load resources when using pants vs sbt\n**Describe the bug**\r\nWhen using sbt we are able to call `Thread.currentThread().getContextClassLoader().getResources` and get a list of URLs. When using pants the list is empty. \r\n\r\nThis at the moment limits us from using Flyway with pants.\r\n\r\n**Pants version**\r\n2.13.0a1 and main.\r\n\r\n**OS**\r\nMacOS\r\n\r\n**Additional info**\r\nExample repo to reproduce the issue:\r\nhttps://github.com/somdoron/test-pants-resources\r\n\r\nI think the issue is, that pants only compress files in the resources zip file and not the directories.\n", "before_files": [{"content": "# Copyright 2021 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nimport itertools\nimport logging\nfrom itertools import chain\n\nfrom pants.core.target_types import ResourcesFieldSet, ResourcesGeneratorFieldSet\nfrom pants.core.util_rules import stripped_source_files\nfrom pants.core.util_rules.source_files import SourceFilesRequest\nfrom pants.core.util_rules.stripped_source_files import StrippedSourceFiles\nfrom pants.core.util_rules.system_binaries import ZipBinary\nfrom pants.engine.fs import Digest, MergeDigests\nfrom pants.engine.internals.selectors import MultiGet\nfrom pants.engine.process import Process, ProcessResult\nfrom pants.engine.rules import Get, collect_rules, rule\nfrom pants.engine.target import SourcesField\nfrom pants.engine.unions import UnionRule\nfrom pants.jvm import compile\nfrom pants.jvm.compile import (\n ClasspathDependenciesRequest,\n ClasspathEntry,\n ClasspathEntryRequest,\n ClasspathEntryRequests,\n CompileResult,\n FallibleClasspathEntries,\n FallibleClasspathEntry,\n)\nfrom pants.jvm.strip_jar.strip_jar import StripJarRequest\nfrom pants.jvm.subsystems import JvmSubsystem\nfrom pants.util.logging import LogLevel\n\nlogger = logging.getLogger(__name__)\n\n\nclass JvmResourcesRequest(ClasspathEntryRequest):\n field_sets = (\n ResourcesFieldSet,\n ResourcesGeneratorFieldSet,\n )\n\n\n@rule(desc=\"Assemble resources\")\nasync def assemble_resources_jar(\n zip: ZipBinary,\n jvm: JvmSubsystem,\n request: JvmResourcesRequest,\n) -> FallibleClasspathEntry:\n # Request the component's direct dependency classpath, and 
additionally any prerequisite.\n # Filter out any dependencies that are generated by our current target so that each resource\n # only appears in a single input JAR.\n # NOTE: Generated dependencies will have the same dependencies as the current target, so we\n # don't need to inspect those dependencies.\n optional_prereq_request = [*((request.prerequisite,) if request.prerequisite else ())]\n fallibles = await MultiGet(\n Get(FallibleClasspathEntries, ClasspathEntryRequests(optional_prereq_request)),\n Get(FallibleClasspathEntries, ClasspathDependenciesRequest(request, ignore_generated=True)),\n )\n direct_dependency_classpath_entries = FallibleClasspathEntries(\n itertools.chain(*fallibles)\n ).if_all_succeeded()\n\n if direct_dependency_classpath_entries is None:\n return FallibleClasspathEntry(\n description=str(request.component),\n result=CompileResult.DEPENDENCY_FAILED,\n output=None,\n exit_code=1,\n )\n\n source_files = await Get(\n StrippedSourceFiles,\n SourceFilesRequest([tgt.get(SourcesField) for tgt in request.component.members]),\n )\n\n output_filename = f\"{request.component.representative.address.path_safe_spec}.resources.jar\"\n output_files = [output_filename]\n\n resources_jar_input_digest = source_files.snapshot.digest\n resources_jar_result = await Get(\n ProcessResult,\n Process(\n argv=[\n zip.path,\n output_filename,\n *source_files.snapshot.files,\n ],\n description=\"Build resources JAR for {request.component}\",\n input_digest=resources_jar_input_digest,\n output_files=output_files,\n level=LogLevel.DEBUG,\n ),\n )\n\n output_digest = resources_jar_result.output_digest\n if jvm.reproducible_jars:\n output_digest = await Get(Digest, StripJarRequest(output_digest, tuple(output_files)))\n cpe = ClasspathEntry(output_digest, output_files, [])\n\n merged_cpe_digest = await Get(\n Digest,\n MergeDigests(chain((cpe.digest,), (i.digest for i in direct_dependency_classpath_entries))),\n )\n\n merged_cpe = ClasspathEntry.merge(\n digest=merged_cpe_digest, entries=[cpe, *direct_dependency_classpath_entries]\n )\n\n return FallibleClasspathEntry(output_filename, CompileResult.SUCCEEDED, merged_cpe, 0)\n\n\ndef rules():\n return [\n *collect_rules(),\n *compile.rules(),\n *stripped_source_files.rules(),\n UnionRule(ClasspathEntryRequest, JvmResourcesRequest),\n ]\n", "path": "src/python/pants/jvm/resources.py"}], "after_files": [{"content": "# Copyright 2021 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nimport itertools\nimport logging\nfrom itertools import chain\nfrom pathlib import Path\n\nfrom pants.core.target_types import ResourcesFieldSet, ResourcesGeneratorFieldSet\nfrom pants.core.util_rules import stripped_source_files\nfrom pants.core.util_rules.source_files import SourceFilesRequest\nfrom pants.core.util_rules.stripped_source_files import StrippedSourceFiles\nfrom pants.core.util_rules.system_binaries import ZipBinary\nfrom pants.engine.fs import Digest, MergeDigests\nfrom pants.engine.internals.selectors import MultiGet\nfrom pants.engine.process import Process, ProcessResult\nfrom pants.engine.rules import Get, collect_rules, rule\nfrom pants.engine.target import SourcesField\nfrom pants.engine.unions import UnionRule\nfrom pants.jvm import compile\nfrom pants.jvm.compile import (\n ClasspathDependenciesRequest,\n ClasspathEntry,\n ClasspathEntryRequest,\n ClasspathEntryRequests,\n CompileResult,\n FallibleClasspathEntries,\n FallibleClasspathEntry,\n)\nfrom pants.jvm.strip_jar.strip_jar 
import StripJarRequest\nfrom pants.jvm.subsystems import JvmSubsystem\nfrom pants.util.logging import LogLevel\n\nlogger = logging.getLogger(__name__)\n\n\nclass JvmResourcesRequest(ClasspathEntryRequest):\n field_sets = (\n ResourcesFieldSet,\n ResourcesGeneratorFieldSet,\n )\n\n\n@rule(desc=\"Assemble resources\")\nasync def assemble_resources_jar(\n zip: ZipBinary,\n jvm: JvmSubsystem,\n request: JvmResourcesRequest,\n) -> FallibleClasspathEntry:\n # Request the component's direct dependency classpath, and additionally any prerequisite.\n # Filter out any dependencies that are generated by our current target so that each resource\n # only appears in a single input JAR.\n # NOTE: Generated dependencies will have the same dependencies as the current target, so we\n # don't need to inspect those dependencies.\n optional_prereq_request = [*((request.prerequisite,) if request.prerequisite else ())]\n fallibles = await MultiGet(\n Get(FallibleClasspathEntries, ClasspathEntryRequests(optional_prereq_request)),\n Get(FallibleClasspathEntries, ClasspathDependenciesRequest(request, ignore_generated=True)),\n )\n direct_dependency_classpath_entries = FallibleClasspathEntries(\n itertools.chain(*fallibles)\n ).if_all_succeeded()\n\n if direct_dependency_classpath_entries is None:\n return FallibleClasspathEntry(\n description=str(request.component),\n result=CompileResult.DEPENDENCY_FAILED,\n output=None,\n exit_code=1,\n )\n\n source_files = await Get(\n StrippedSourceFiles,\n SourceFilesRequest([tgt.get(SourcesField) for tgt in request.component.members]),\n )\n\n output_filename = f\"{request.component.representative.address.path_safe_spec}.resources.jar\"\n output_files = [output_filename]\n\n # #16231: Valid JAR files need the directories of each resource file as well as the files\n # themselves.\n\n paths = {Path(filename) for filename in source_files.snapshot.files}\n directories = {parent for path in paths for parent in path.parents}\n input_files = {str(path) for path in chain(paths, directories)}\n\n resources_jar_input_digest = source_files.snapshot.digest\n resources_jar_result = await Get(\n ProcessResult,\n Process(\n argv=[\n zip.path,\n output_filename,\n *sorted(input_files),\n ],\n description=\"Build resources JAR for {request.component}\",\n input_digest=resources_jar_input_digest,\n output_files=output_files,\n level=LogLevel.DEBUG,\n ),\n )\n\n output_digest = resources_jar_result.output_digest\n if jvm.reproducible_jars:\n output_digest = await Get(Digest, StripJarRequest(output_digest, tuple(output_files)))\n cpe = ClasspathEntry(output_digest, output_files, [])\n\n merged_cpe_digest = await Get(\n Digest,\n MergeDigests(chain((cpe.digest,), (i.digest for i in direct_dependency_classpath_entries))),\n )\n\n merged_cpe = ClasspathEntry.merge(\n digest=merged_cpe_digest, entries=[cpe, *direct_dependency_classpath_entries]\n )\n\n return FallibleClasspathEntry(output_filename, CompileResult.SUCCEEDED, merged_cpe, 0)\n\n\ndef rules():\n return [\n *collect_rules(),\n *compile.rules(),\n *stripped_source_files.rules(),\n UnionRule(ClasspathEntryRequest, JvmResourcesRequest),\n ]\n", "path": "src/python/pants/jvm/resources.py"}]} | 1,542 | 307 |
gh_patches_debug_20578 | rasdani/github-patches | git_diff | google__osv.dev-482 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
PURLs for scoped NPM packages are invalid
The package URLs for scoped NPM packages (e.g. [`@hapi/hoek`](https://osv.dev/vulnerability/GHSA-22h7-7wwg-qmgg)) are invalid. Parsing them with any package URL library fails.
According to [the spec](https://github.com/package-url/purl-spec/blob/master/PURL-SPECIFICATION.rst#rules-for-each-purl-component), segments in the namespace (here: `@hapi`) must be percent-encoded.
So
```
pkg:npm/@hapi/hoek
```
should be
```
pkg:npm/%40hapi/hoek
```
On the same note, the name segment must be percent-encoded, too. I haven't encountered a PURL in OSV where the package name contains characters that'd need encoding, but if this is done for the namespace, it should be considered for the name as well.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lib/osv/purl_helpers.py`
Content:
```
1 # Copyright 2022 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 """PURL conversion utilities."""
15
16 PURL_ECOSYSTEMS = {
17 'crates.io': 'cargo',
18 'Hex': 'hex',
19 'Go': 'golang',
20 'Maven': 'maven',
21 'NuGet': 'nuget',
22 'npm': 'npm',
23 'Packagist': 'composer',
24 'OSS-Fuzz': 'generic',
25 'PyPI': 'pypi',
26 'RubyGems': 'gem',
27 }
28
29
30 def package_to_purl(ecosystem, package_name):
31 """Convert a ecosystem and package name to PURL."""
32 purl_type = PURL_ECOSYSTEMS.get(ecosystem)
33 if not purl_type:
34 return None
35
36 if purl_type == 'maven':
37 # PURLs use / to separate the group ID and the artifact ID.
38 package_name = package_name.replace(':', '/', 1)
39
40 return f'pkg:{purl_type}/{package_name}'
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lib/osv/purl_helpers.py b/lib/osv/purl_helpers.py
--- a/lib/osv/purl_helpers.py
+++ b/lib/osv/purl_helpers.py
@@ -13,6 +13,8 @@
# limitations under the License.
"""PURL conversion utilities."""
+from urllib.parse import quote
+
PURL_ECOSYSTEMS = {
'crates.io': 'cargo',
'Hex': 'hex',
@@ -27,6 +29,12 @@
}
+def _url_encode(package_name):
+ """URL encode a PURL `namespace/name` or `name`."""
+ parts = package_name.split('/')
+ return '/'.join(quote(p) for p in parts)
+
+
def package_to_purl(ecosystem, package_name):
"""Convert a ecosystem and package name to PURL."""
purl_type = PURL_ECOSYSTEMS.get(ecosystem)
@@ -37,4 +45,4 @@
# PURLs use / to separate the group ID and the artifact ID.
package_name = package_name.replace(':', '/', 1)
- return f'pkg:{purl_type}/{package_name}'
+ return f'pkg:{purl_type}/{_url_encode(package_name)}'
| {"golden_diff": "diff --git a/lib/osv/purl_helpers.py b/lib/osv/purl_helpers.py\n--- a/lib/osv/purl_helpers.py\n+++ b/lib/osv/purl_helpers.py\n@@ -13,6 +13,8 @@\n # limitations under the License.\n \"\"\"PURL conversion utilities.\"\"\"\n \n+from urllib.parse import quote\n+\n PURL_ECOSYSTEMS = {\n 'crates.io': 'cargo',\n 'Hex': 'hex',\n@@ -27,6 +29,12 @@\n }\n \n \n+def _url_encode(package_name):\n+ \"\"\"URL encode a PURL `namespace/name` or `name`.\"\"\"\n+ parts = package_name.split('/')\n+ return '/'.join(quote(p) for p in parts)\n+\n+\n def package_to_purl(ecosystem, package_name):\n \"\"\"Convert a ecosystem and package name to PURL.\"\"\"\n purl_type = PURL_ECOSYSTEMS.get(ecosystem)\n@@ -37,4 +45,4 @@\n # PURLs use / to separate the group ID and the artifact ID.\n package_name = package_name.replace(':', '/', 1)\n \n- return f'pkg:{purl_type}/{package_name}'\n+ return f'pkg:{purl_type}/{_url_encode(package_name)}'\n", "issue": "PURLs for scoped NPM packages are invalid\nThe package URLs for scoped NPM packages (e.g. [`@hapi/hoek`](https://osv.dev/vulnerability/GHSA-22h7-7wwg-qmgg)) are invalid. Parsing them with any package URL library fails.\r\n\r\nAccording to [the spec](https://github.com/package-url/purl-spec/blob/master/PURL-SPECIFICATION.rst#rules-for-each-purl-component), segments in the namespace (here: `@hapi`) must be percent-encdoded.\r\n\r\nSo\r\n\r\n```\r\npkg:npm/@hapi/hoek\r\n```\r\n\r\nshould be\r\n\r\n```\r\npkg:npm/%40hapi/hoek\r\n```\r\n\r\nOn the same note, the name segment must be percent-encoded, too. I haven't encountered a PURL in OSV where the package name contains characters that'd need encoding, but if this is done for the namespace, it should be considered for the name as well.\n", "before_files": [{"content": "# Copyright 2022 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"PURL conversion utilities.\"\"\"\n\nPURL_ECOSYSTEMS = {\n 'crates.io': 'cargo',\n 'Hex': 'hex',\n 'Go': 'golang',\n 'Maven': 'maven',\n 'NuGet': 'nuget',\n 'npm': 'npm',\n 'Packagist': 'composer',\n 'OSS-Fuzz': 'generic',\n 'PyPI': 'pypi',\n 'RubyGems': 'gem',\n}\n\n\ndef package_to_purl(ecosystem, package_name):\n \"\"\"Convert a ecosystem and package name to PURL.\"\"\"\n purl_type = PURL_ECOSYSTEMS.get(ecosystem)\n if not purl_type:\n return None\n\n if purl_type == 'maven':\n # PURLs use / to separate the group ID and the artifact ID.\n package_name = package_name.replace(':', '/', 1)\n\n return f'pkg:{purl_type}/{package_name}'\n", "path": "lib/osv/purl_helpers.py"}], "after_files": [{"content": "# Copyright 2022 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 
implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"PURL conversion utilities.\"\"\"\n\nfrom urllib.parse import quote\n\nPURL_ECOSYSTEMS = {\n 'crates.io': 'cargo',\n 'Hex': 'hex',\n 'Go': 'golang',\n 'Maven': 'maven',\n 'NuGet': 'nuget',\n 'npm': 'npm',\n 'Packagist': 'composer',\n 'OSS-Fuzz': 'generic',\n 'PyPI': 'pypi',\n 'RubyGems': 'gem',\n}\n\n\ndef _url_encode(package_name):\n \"\"\"URL encode a PURL `namespace/name` or `name`.\"\"\"\n parts = package_name.split('/')\n return '/'.join(quote(p) for p in parts)\n\n\ndef package_to_purl(ecosystem, package_name):\n \"\"\"Convert a ecosystem and package name to PURL.\"\"\"\n purl_type = PURL_ECOSYSTEMS.get(ecosystem)\n if not purl_type:\n return None\n\n if purl_type == 'maven':\n # PURLs use / to separate the group ID and the artifact ID.\n package_name = package_name.replace(':', '/', 1)\n\n return f'pkg:{purl_type}/{_url_encode(package_name)}'\n", "path": "lib/osv/purl_helpers.py"}]} | 883 | 276 |
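As a quick, self-contained illustration of the encoding rule that the issue and golden diff above describe, the sketch below percent-encodes each `/`-separated segment of a package name before assembling the PURL; the package names are examples, and the assertions match the behavior the spec calls for.

```python
from urllib.parse import quote


def url_encode(package_name):
    """Percent-encode each namespace/name segment, keeping '/' as the separator."""
    return "/".join(quote(part) for part in package_name.split("/"))


def package_to_purl(purl_type, package_name):
    return f"pkg:{purl_type}/{url_encode(package_name)}"


assert package_to_purl("npm", "@hapi/hoek") == "pkg:npm/%40hapi/hoek"
assert package_to_purl("pypi", "requests") == "pkg:pypi/requests"
print(package_to_purl("npm", "@hapi/hoek"))  # pkg:npm/%40hapi/hoek
```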
gh_patches_debug_12874 | rasdani/github-patches | git_diff | vyperlang__vyper-1078 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot convert to address
### Version Information
* vyper Version: 0.2.0b4
### What's your issue about?
Vyper disallows converting to an address. This is a problem because sometimes we need to convert bytes values into an address.
#### Cute Animal Picture

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `vyper/types/convert.py`
Content:
```
1 import ast
2 import warnings
3
4 from vyper.functions.signature import (
5 signature
6 )
7 from vyper.parser.parser_utils import (
8 LLLnode,
9 getpos,
10 byte_array_to_num
11 )
12 from vyper.exceptions import (
13 InvalidLiteralException,
14 TypeMismatchException,
15 ParserException,
16 )
17 from vyper.types import (
18 BaseType,
19 )
20 from vyper.types import (
21 get_type,
22 )
23 from vyper.utils import (
24 DECIMAL_DIVISOR,
25 MemoryPositions,
26 SizeLimits
27 )
28
29
30 @signature(('uint256', 'bytes32', 'bytes'), '*')
31 def to_int128(expr, args, kwargs, context):
32 in_node = args[0]
33 typ, len = get_type(in_node)
34 if typ in ('uint256', 'bytes32'):
35 if in_node.typ.is_literal and not SizeLimits.in_bounds('int128', in_node.value):
36 raise InvalidLiteralException("Number out of range: {}".format(in_node.value), expr)
37 return LLLnode.from_list(
38 ['clamp', ['mload', MemoryPositions.MINNUM], in_node,
39 ['mload', MemoryPositions.MAXNUM]], typ=BaseType('int128', in_node.typ.unit), pos=getpos(expr)
40 )
41 else:
42 return byte_array_to_num(in_node, expr, 'int128')
43
44
45 @signature(('num_literal', 'int128', 'bytes32', 'address'), '*')
46 def to_uint256(expr, args, kwargs, context):
47 in_node = args[0]
48 input_type, len = get_type(in_node)
49
50 if isinstance(in_node, int):
51 if not SizeLimits.in_bounds('uint256', in_node):
52 raise InvalidLiteralException("Number out of range: {}".format(in_node))
53 _unit = in_node.typ.unit if input_type == 'int128' else None
54 return LLLnode.from_list(in_node, typ=BaseType('uint256', _unit), pos=getpos(expr))
55
56 elif isinstance(in_node, LLLnode) and input_type in ('int128', 'num_literal'):
57 _unit = in_node.typ.unit if input_type == 'int128' else None
58 return LLLnode.from_list(['clampge', in_node, 0], typ=BaseType('uint256', _unit), pos=getpos(expr))
59
60 elif isinstance(in_node, LLLnode) and input_type in ('bytes32', 'address'):
61 return LLLnode(value=in_node.value, args=in_node.args, typ=BaseType('uint256'), pos=getpos(expr))
62
63 else:
64 raise InvalidLiteralException("Invalid input for uint256: %r" % in_node, expr)
65
66
67 @signature(('int128', 'uint256'), '*')
68 def to_decimal(expr, args, kwargs, context):
69 input = args[0]
70 if input.typ.typ == 'uint256':
71 return LLLnode.from_list(
72 ['uclample', ['mul', input, DECIMAL_DIVISOR], ['mload', MemoryPositions.MAXDECIMAL]],
73 typ=BaseType('decimal', input.typ.unit, input.typ.positional), pos=getpos(expr)
74 )
75 else:
76 return LLLnode.from_list(
77 ['mul', input, DECIMAL_DIVISOR],
78 typ=BaseType('decimal', input.typ.unit, input.typ.positional),
79 pos=getpos(expr)
80 )
81
82
83 @signature(('int128', 'uint256', 'address', 'bytes'), '*')
84 def to_bytes32(expr, args, kwargs, context):
85 in_arg = args[0]
86 typ, _len = get_type(in_arg)
87
88 if typ == 'bytes':
89
90 if _len > 32:
91 raise TypeMismatchException("Unable to convert bytes[{}] to bytes32, max length is too large.".format(len))
92
93 if in_arg.location == "memory":
94 return LLLnode.from_list(
95 ['mload', ['add', in_arg, 32]], typ=BaseType('bytes32')
96 )
97 elif in_arg.location == "storage":
98 return LLLnode.from_list(
99 ['sload', ['add', ['sha3_32', in_arg], 1]], typ=BaseType('bytes32')
100 )
101
102 else:
103 return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('bytes32'), pos=getpos(expr))
104
105
106 def convert(expr, context):
107
108 if isinstance(expr.args[1], ast.Str):
109 warnings.warn(
110 "String parameter has been removed, see VIP1026). "
111 "Use a vyper type instead.",
112 DeprecationWarning
113 )
114
115 if isinstance(expr.args[1], ast.Name):
116 output_type = expr.args[1].id
117 else:
118 raise ParserException("Invalid conversion type, use valid vyper type.", expr)
119
120 if output_type in conversion_table:
121 return conversion_table[output_type](expr, context)
122 else:
123 raise ParserException("Conversion to {} is invalid.".format(output_type), expr)
124
125
126 conversion_table = {
127 'int128': to_int128,
128 'uint256': to_uint256,
129 'decimal': to_decimal,
130 'bytes32': to_bytes32,
131 }
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/vyper/types/convert.py b/vyper/types/convert.py
--- a/vyper/types/convert.py
+++ b/vyper/types/convert.py
@@ -103,6 +103,13 @@
return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('bytes32'), pos=getpos(expr))
+@signature(('bytes32'), '*')
+def to_address(expr, args, kwargs, context):
+ in_arg = args[0]
+
+ return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('address'), pos=getpos(expr))
+
+
def convert(expr, context):
if isinstance(expr.args[1], ast.Str):
@@ -128,4 +135,5 @@
'uint256': to_uint256,
'decimal': to_decimal,
'bytes32': to_bytes32,
+ 'address': to_address,
}
| {"golden_diff": "diff --git a/vyper/types/convert.py b/vyper/types/convert.py\n--- a/vyper/types/convert.py\n+++ b/vyper/types/convert.py\n@@ -103,6 +103,13 @@\n return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('bytes32'), pos=getpos(expr))\n \n \n+@signature(('bytes32'), '*')\n+def to_address(expr, args, kwargs, context):\n+ in_arg = args[0]\n+\n+ return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('address'), pos=getpos(expr))\n+\n+\n def convert(expr, context):\n \n if isinstance(expr.args[1], ast.Str):\n@@ -128,4 +135,5 @@\n 'uint256': to_uint256,\n 'decimal': to_decimal,\n 'bytes32': to_bytes32,\n+ 'address': to_address,\n }\n", "issue": "Cannot convert to address\n### Version Information\r\n\r\n* vyper Version: 0.2.0b4\r\n\r\n### What's your issue about?\r\nVyper disallows converting to an address. This is a problem because sometimes we need to process stuff from bytes to an address.\r\n\r\n#### Cute Animal Picture\r\n\n", "before_files": [{"content": "import ast\nimport warnings\n\nfrom vyper.functions.signature import (\n signature\n)\nfrom vyper.parser.parser_utils import (\n LLLnode,\n getpos,\n byte_array_to_num\n)\nfrom vyper.exceptions import (\n InvalidLiteralException,\n TypeMismatchException,\n ParserException,\n)\nfrom vyper.types import (\n BaseType,\n)\nfrom vyper.types import (\n get_type,\n)\nfrom vyper.utils import (\n DECIMAL_DIVISOR,\n MemoryPositions,\n SizeLimits\n)\n\n\n@signature(('uint256', 'bytes32', 'bytes'), '*')\ndef to_int128(expr, args, kwargs, context):\n in_node = args[0]\n typ, len = get_type(in_node)\n if typ in ('uint256', 'bytes32'):\n if in_node.typ.is_literal and not SizeLimits.in_bounds('int128', in_node.value):\n raise InvalidLiteralException(\"Number out of range: {}\".format(in_node.value), expr)\n return LLLnode.from_list(\n ['clamp', ['mload', MemoryPositions.MINNUM], in_node,\n ['mload', MemoryPositions.MAXNUM]], typ=BaseType('int128', in_node.typ.unit), pos=getpos(expr)\n )\n else:\n return byte_array_to_num(in_node, expr, 'int128')\n\n\n@signature(('num_literal', 'int128', 'bytes32', 'address'), '*')\ndef to_uint256(expr, args, kwargs, context):\n in_node = args[0]\n input_type, len = get_type(in_node)\n\n if isinstance(in_node, int):\n if not SizeLimits.in_bounds('uint256', in_node):\n raise InvalidLiteralException(\"Number out of range: {}\".format(in_node))\n _unit = in_node.typ.unit if input_type == 'int128' else None\n return LLLnode.from_list(in_node, typ=BaseType('uint256', _unit), pos=getpos(expr))\n\n elif isinstance(in_node, LLLnode) and input_type in ('int128', 'num_literal'):\n _unit = in_node.typ.unit if input_type == 'int128' else None\n return LLLnode.from_list(['clampge', in_node, 0], typ=BaseType('uint256', _unit), pos=getpos(expr))\n\n elif isinstance(in_node, LLLnode) and input_type in ('bytes32', 'address'):\n return LLLnode(value=in_node.value, args=in_node.args, typ=BaseType('uint256'), pos=getpos(expr))\n\n else:\n raise InvalidLiteralException(\"Invalid input for uint256: %r\" % in_node, expr)\n\n\n@signature(('int128', 'uint256'), '*')\ndef to_decimal(expr, args, kwargs, context):\n input = args[0]\n if input.typ.typ == 'uint256':\n return LLLnode.from_list(\n ['uclample', ['mul', input, DECIMAL_DIVISOR], ['mload', MemoryPositions.MAXDECIMAL]],\n typ=BaseType('decimal', input.typ.unit, input.typ.positional), pos=getpos(expr)\n )\n else:\n return LLLnode.from_list(\n ['mul', input, DECIMAL_DIVISOR],\n typ=BaseType('decimal', input.typ.unit, input.typ.positional),\n 
pos=getpos(expr)\n )\n\n\n@signature(('int128', 'uint256', 'address', 'bytes'), '*')\ndef to_bytes32(expr, args, kwargs, context):\n in_arg = args[0]\n typ, _len = get_type(in_arg)\n\n if typ == 'bytes':\n\n if _len > 32:\n raise TypeMismatchException(\"Unable to convert bytes[{}] to bytes32, max length is too large.\".format(len))\n\n if in_arg.location == \"memory\":\n return LLLnode.from_list(\n ['mload', ['add', in_arg, 32]], typ=BaseType('bytes32')\n )\n elif in_arg.location == \"storage\":\n return LLLnode.from_list(\n ['sload', ['add', ['sha3_32', in_arg], 1]], typ=BaseType('bytes32')\n )\n\n else:\n return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('bytes32'), pos=getpos(expr))\n\n\ndef convert(expr, context):\n\n if isinstance(expr.args[1], ast.Str):\n warnings.warn(\n \"String parameter has been removed, see VIP1026). \"\n \"Use a vyper type instead.\",\n DeprecationWarning\n )\n\n if isinstance(expr.args[1], ast.Name):\n output_type = expr.args[1].id\n else:\n raise ParserException(\"Invalid conversion type, use valid vyper type.\", expr)\n\n if output_type in conversion_table:\n return conversion_table[output_type](expr, context)\n else:\n raise ParserException(\"Conversion to {} is invalid.\".format(output_type), expr)\n\n\nconversion_table = {\n 'int128': to_int128,\n 'uint256': to_uint256,\n 'decimal': to_decimal,\n 'bytes32': to_bytes32,\n}\n", "path": "vyper/types/convert.py"}], "after_files": [{"content": "import ast\nimport warnings\n\nfrom vyper.functions.signature import (\n signature\n)\nfrom vyper.parser.parser_utils import (\n LLLnode,\n getpos,\n byte_array_to_num\n)\nfrom vyper.exceptions import (\n InvalidLiteralException,\n TypeMismatchException,\n ParserException,\n)\nfrom vyper.types import (\n BaseType,\n)\nfrom vyper.types import (\n get_type,\n)\nfrom vyper.utils import (\n DECIMAL_DIVISOR,\n MemoryPositions,\n SizeLimits\n)\n\n\n@signature(('uint256', 'bytes32', 'bytes'), '*')\ndef to_int128(expr, args, kwargs, context):\n in_node = args[0]\n typ, len = get_type(in_node)\n if typ in ('uint256', 'bytes32'):\n if in_node.typ.is_literal and not SizeLimits.in_bounds('int128', in_node.value):\n raise InvalidLiteralException(\"Number out of range: {}\".format(in_node.value), expr)\n return LLLnode.from_list(\n ['clamp', ['mload', MemoryPositions.MINNUM], in_node,\n ['mload', MemoryPositions.MAXNUM]], typ=BaseType('int128', in_node.typ.unit), pos=getpos(expr)\n )\n else:\n return byte_array_to_num(in_node, expr, 'int128')\n\n\n@signature(('num_literal', 'int128', 'bytes32', 'address'), '*')\ndef to_uint256(expr, args, kwargs, context):\n in_node = args[0]\n input_type, len = get_type(in_node)\n\n if isinstance(in_node, int):\n if not SizeLimits.in_bounds('uint256', in_node):\n raise InvalidLiteralException(\"Number out of range: {}\".format(in_node))\n _unit = in_node.typ.unit if input_type == 'int128' else None\n return LLLnode.from_list(in_node, typ=BaseType('uint256', _unit), pos=getpos(expr))\n\n elif isinstance(in_node, LLLnode) and input_type in ('int128', 'num_literal'):\n _unit = in_node.typ.unit if input_type == 'int128' else None\n return LLLnode.from_list(['clampge', in_node, 0], typ=BaseType('uint256', _unit), pos=getpos(expr))\n\n elif isinstance(in_node, LLLnode) and input_type in ('bytes32', 'address'):\n return LLLnode(value=in_node.value, args=in_node.args, typ=BaseType('uint256'), pos=getpos(expr))\n\n else:\n raise InvalidLiteralException(\"Invalid input for uint256: %r\" % in_node, expr)\n\n\n@signature(('int128', 'uint256'), 
'*')\ndef to_decimal(expr, args, kwargs, context):\n input = args[0]\n if input.typ.typ == 'uint256':\n return LLLnode.from_list(\n ['uclample', ['mul', input, DECIMAL_DIVISOR], ['mload', MemoryPositions.MAXDECIMAL]],\n typ=BaseType('decimal', input.typ.unit, input.typ.positional), pos=getpos(expr)\n )\n else:\n return LLLnode.from_list(\n ['mul', input, DECIMAL_DIVISOR],\n typ=BaseType('decimal', input.typ.unit, input.typ.positional),\n pos=getpos(expr)\n )\n\n\n@signature(('int128', 'uint256', 'address', 'bytes'), '*')\ndef to_bytes32(expr, args, kwargs, context):\n in_arg = args[0]\n typ, _len = get_type(in_arg)\n\n if typ == 'bytes':\n\n if _len > 32:\n raise TypeMismatchException(\"Unable to convert bytes[{}] to bytes32, max length is too large.\".format(len))\n\n if in_arg.location == \"memory\":\n return LLLnode.from_list(\n ['mload', ['add', in_arg, 32]], typ=BaseType('bytes32')\n )\n elif in_arg.location == \"storage\":\n return LLLnode.from_list(\n ['sload', ['add', ['sha3_32', in_arg], 1]], typ=BaseType('bytes32')\n )\n\n else:\n return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('bytes32'), pos=getpos(expr))\n\n\n@signature(('bytes32'), '*')\ndef to_address(expr, args, kwargs, context):\n in_arg = args[0]\n\n return LLLnode(value=in_arg.value, args=in_arg.args, typ=BaseType('address'), pos=getpos(expr))\n\n\ndef convert(expr, context):\n\n if isinstance(expr.args[1], ast.Str):\n warnings.warn(\n \"String parameter has been removed, see VIP1026). \"\n \"Use a vyper type instead.\",\n DeprecationWarning\n )\n\n if isinstance(expr.args[1], ast.Name):\n output_type = expr.args[1].id\n else:\n raise ParserException(\"Invalid conversion type, use valid vyper type.\", expr)\n\n if output_type in conversion_table:\n return conversion_table[output_type](expr, context)\n else:\n raise ParserException(\"Conversion to {} is invalid.\".format(output_type), expr)\n\n\nconversion_table = {\n 'int128': to_int128,\n 'uint256': to_uint256,\n 'decimal': to_decimal,\n 'bytes32': to_bytes32,\n 'address': to_address,\n}\n", "path": "vyper/types/convert.py"}]} | 1,815 | 215 |
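The vyper patch above works by extending a dispatch table: each target type name maps to a converter function, and `convert` only has to look the name up. Here is a stripped-down, framework-free sketch of that registry pattern; the toy types and converters are stand-ins, not vyper internals. The appeal is that supporting a new target, like the issue's missing `address`, is a local change: define one converter and register it.

```python
def to_int(value):
    return int(value)


def to_hex(value):
    return hex(int(value))


# Adding a new conversion target is one entry here, mirroring how the
# patch registers 'address': to_address in vyper's conversion_table.
conversion_table = {
    "int": to_int,
    "hex": to_hex,
}


def convert(value, output_type):
    try:
        converter = conversion_table[output_type]
    except KeyError:
        raise ValueError(f"Conversion to {output_type} is invalid.") from None
    return converter(value)


print(convert("42", "hex"))  # 0x2a
```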
gh_patches_debug_21469 | rasdani/github-patches | git_diff | cupy__cupy-1999 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Device-to-host copy in `examples/stream/cupy_memcpy.py` is not asynchronous
I'm interested in asynchronous memcpy for better performance. I checked the `cupy_memcpy.py` sample and noticed that the last line `x_pinned_cpu = x_gpu.get()` [1] cannot actually work asynchronously. Unfortunately I don't know how to properly fix it.
One issue is trivial: this line re-binds the variable `x_pinned_cpu`, instead of updating the value bound to this variable.
But there is another, trickier problem. The function `cupy.ndarray.get` creates a new `numpy.ndarray` instance by calling `numpy.empty`, and passes the new host pointer to the `copy_to_host_async` function [2]. IIUC, since the new array is not allocated in pinned memory, the copy cannot be asynchronous with other computations on the GPU.
* [1] https://github.com/cupy/cupy/blob/v5.0.0rc1/examples/stream/cupy_memcpy.py#L24
* [2] https://github.com/cupy/cupy/blob/v5.0.0rc1/cupy/core/core.pyx#L1805
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/stream/cupy_memcpy.py`
Content:
```
1 # nvprof --print-gpu-trace python examples/stream/cupy_memcpy.py
2 import cupy
3 import numpy
4
5 pinned_memory_pool = cupy.cuda.PinnedMemoryPool()
6 cupy.cuda.set_pinned_memory_allocator(pinned_memory_pool.malloc)
7
8
9 def _pin_memory(array):
10 mem = cupy.cuda.alloc_pinned_memory(array.nbytes)
11 ret = numpy.frombuffer(mem, array.dtype, array.size).reshape(array.shape)
12 ret[...] = array
13 return ret
14
15
16 x_cpu = numpy.array([1, 2, 3], dtype=numpy.float32)
17 x_pinned_cpu = _pin_memory(x_cpu)
18 x_gpu = cupy.core.ndarray((3,), dtype=numpy.float32)
19 with cupy.cuda.stream.Stream():
20 x_gpu.set(x_pinned_cpu)
21
22 stream = cupy.cuda.stream.Stream()
23 stream.use()
24 x_pinned_cpu = x_gpu.get()
25
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/stream/cupy_memcpy.py b/examples/stream/cupy_memcpy.py
--- a/examples/stream/cupy_memcpy.py
+++ b/examples/stream/cupy_memcpy.py
@@ -13,12 +13,38 @@
return ret
-x_cpu = numpy.array([1, 2, 3], dtype=numpy.float32)
-x_pinned_cpu = _pin_memory(x_cpu)
-x_gpu = cupy.core.ndarray((3,), dtype=numpy.float32)
-with cupy.cuda.stream.Stream():
- x_gpu.set(x_pinned_cpu)
-
-stream = cupy.cuda.stream.Stream()
-stream.use()
-x_pinned_cpu = x_gpu.get()
+SIZE = 1024 * 1024
+x_cpu_src = numpy.arange(SIZE, dtype=numpy.float32)
+x_gpu_src = cupy.arange(SIZE, dtype=numpy.float32)
+
+
+# synchronous
+stream = cupy.cuda.Stream.null
+start = stream.record()
+x_gpu_dst = cupy.empty(x_cpu_src.shape, x_cpu_src.dtype)
+x_gpu_dst.set(x_cpu_src)
+x_cpu_dst = x_gpu_src.get()
+end = stream.record()
+
+print('Synchronous Device to Host / Host to Device (ms)')
+print(cupy.cuda.get_elapsed_time(start, end))
+
+
+# asynchronous
+x_gpu_dst = cupy.empty(x_cpu_src.shape, x_cpu_src.dtype)
+x_cpu_dst = numpy.empty(x_gpu_src.shape, x_gpu_src.dtype)
+
+x_pinned_cpu_src = _pin_memory(x_cpu_src)
+x_pinned_cpu_dst = _pin_memory(x_cpu_dst)
+
+with cupy.cuda.stream.Stream() as stream_htod:
+ start = stream_htod.record()
+ x_gpu_dst.set(x_pinned_cpu_src)
+ with cupy.cuda.stream.Stream() as stream_dtoh:
+ x_gpu_src.get(out=x_pinned_cpu_dst)
+ stream_dtoh.synchronize()
+ stream_htod.synchronize()
+ end = stream_htod.record()
+
+print('Asynchronous Device to Host / Host to Device (ms)')
+print(cupy.cuda.get_elapsed_time(start, end))
| {"golden_diff": "diff --git a/examples/stream/cupy_memcpy.py b/examples/stream/cupy_memcpy.py\n--- a/examples/stream/cupy_memcpy.py\n+++ b/examples/stream/cupy_memcpy.py\n@@ -13,12 +13,38 @@\n return ret\n \n \n-x_cpu = numpy.array([1, 2, 3], dtype=numpy.float32)\n-x_pinned_cpu = _pin_memory(x_cpu)\n-x_gpu = cupy.core.ndarray((3,), dtype=numpy.float32)\n-with cupy.cuda.stream.Stream():\n- x_gpu.set(x_pinned_cpu)\n-\n-stream = cupy.cuda.stream.Stream()\n-stream.use()\n-x_pinned_cpu = x_gpu.get()\n+SIZE = 1024 * 1024\n+x_cpu_src = numpy.arange(SIZE, dtype=numpy.float32)\n+x_gpu_src = cupy.arange(SIZE, dtype=numpy.float32)\n+\n+\n+# synchronous\n+stream = cupy.cuda.Stream.null\n+start = stream.record()\n+x_gpu_dst = cupy.empty(x_cpu_src.shape, x_cpu_src.dtype)\n+x_gpu_dst.set(x_cpu_src)\n+x_cpu_dst = x_gpu_src.get()\n+end = stream.record()\n+\n+print('Synchronous Device to Host / Host to Device (ms)')\n+print(cupy.cuda.get_elapsed_time(start, end))\n+\n+\n+# asynchronous\n+x_gpu_dst = cupy.empty(x_cpu_src.shape, x_cpu_src.dtype)\n+x_cpu_dst = numpy.empty(x_gpu_src.shape, x_gpu_src.dtype)\n+\n+x_pinned_cpu_src = _pin_memory(x_cpu_src)\n+x_pinned_cpu_dst = _pin_memory(x_cpu_dst)\n+\n+with cupy.cuda.stream.Stream() as stream_htod:\n+ start = stream_htod.record()\n+ x_gpu_dst.set(x_pinned_cpu_src)\n+ with cupy.cuda.stream.Stream() as stream_dtoh:\n+ x_gpu_src.get(out=x_pinned_cpu_dst)\n+ stream_dtoh.synchronize()\n+ stream_htod.synchronize()\n+ end = stream_htod.record()\n+\n+print('Asynchronous Device to Host / Host to Device (ms)')\n+print(cupy.cuda.get_elapsed_time(start, end))\n", "issue": "Device-to-host copy in `examples/stream/cupy_memcpy.py` is not asynchronous\nI've interested in asynchronous memcpy for better performance. I checked the `cupy_memcpy.py` sample, and noticed that the last line `x_pinned_cpu = x_gpu.get()` [1] shouldn't work asynchronously. Unfortunately I don't know how to properly fix it.\r\n\r\nOne issue is trivial: this line re-binds the variable `x_pinned_cpu`, instead of updating the value bound to this variable.\r\n\r\nBut there is another more tricky problem. The function `cupy.ndarray.get` creates the new `numpy.ndarray` instance by calling `numpy.empty`, and passes the new host pointer to the `copy_to_host_async` function [2]. IIUC, as the new array not allocated in pinned memory, the copy couldn't be asynchronous with other computations at the GPU.\r\n\r\n* [1] https://github.com/cupy/cupy/blob/v5.0.0rc1/examples/stream/cupy_memcpy.py#L24\r\n* [2] https://github.com/cupy/cupy/blob/v5.0.0rc1/cupy/core/core.pyx#L1805\n", "before_files": [{"content": "# nvprof --print-gpu-trace python examples/stream/cupy_memcpy.py\nimport cupy\nimport numpy\n\npinned_memory_pool = cupy.cuda.PinnedMemoryPool()\ncupy.cuda.set_pinned_memory_allocator(pinned_memory_pool.malloc)\n\n\ndef _pin_memory(array):\n mem = cupy.cuda.alloc_pinned_memory(array.nbytes)\n ret = numpy.frombuffer(mem, array.dtype, array.size).reshape(array.shape)\n ret[...] 
= array\n return ret\n\n\nx_cpu = numpy.array([1, 2, 3], dtype=numpy.float32)\nx_pinned_cpu = _pin_memory(x_cpu)\nx_gpu = cupy.core.ndarray((3,), dtype=numpy.float32)\nwith cupy.cuda.stream.Stream():\n x_gpu.set(x_pinned_cpu)\n\nstream = cupy.cuda.stream.Stream()\nstream.use()\nx_pinned_cpu = x_gpu.get()\n", "path": "examples/stream/cupy_memcpy.py"}], "after_files": [{"content": "# nvprof --print-gpu-trace python examples/stream/cupy_memcpy.py\nimport cupy\nimport numpy\n\npinned_memory_pool = cupy.cuda.PinnedMemoryPool()\ncupy.cuda.set_pinned_memory_allocator(pinned_memory_pool.malloc)\n\n\ndef _pin_memory(array):\n mem = cupy.cuda.alloc_pinned_memory(array.nbytes)\n ret = numpy.frombuffer(mem, array.dtype, array.size).reshape(array.shape)\n ret[...] = array\n return ret\n\n\nSIZE = 1024 * 1024\nx_cpu_src = numpy.arange(SIZE, dtype=numpy.float32)\nx_gpu_src = cupy.arange(SIZE, dtype=numpy.float32)\n\n\n# synchronous\nstream = cupy.cuda.Stream.null\nstart = stream.record()\nx_gpu_dst = cupy.empty(x_cpu_src.shape, x_cpu_src.dtype)\nx_gpu_dst.set(x_cpu_src)\nx_cpu_dst = x_gpu_src.get()\nend = stream.record()\n\nprint('Synchronous Device to Host / Host to Device (ms)')\nprint(cupy.cuda.get_elapsed_time(start, end))\n\n\n# asynchronous\nx_gpu_dst = cupy.empty(x_cpu_src.shape, x_cpu_src.dtype)\nx_cpu_dst = numpy.empty(x_gpu_src.shape, x_gpu_src.dtype)\n\nx_pinned_cpu_src = _pin_memory(x_cpu_src)\nx_pinned_cpu_dst = _pin_memory(x_cpu_dst)\n\nwith cupy.cuda.stream.Stream() as stream_htod:\n start = stream_htod.record()\n x_gpu_dst.set(x_pinned_cpu_src)\n with cupy.cuda.stream.Stream() as stream_dtoh:\n x_gpu_src.get(out=x_pinned_cpu_dst)\n stream_dtoh.synchronize()\n stream_htod.synchronize()\n end = stream_htod.record()\n\nprint('Asynchronous Device to Host / Host to Device (ms)')\nprint(cupy.cuda.get_elapsed_time(start, end))\n", "path": "examples/stream/cupy_memcpy.py"}]} | 733 | 460 |
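The corrected example's core recipe, pinning the host buffer and issuing the copy on a non-default stream into an explicit destination, can be boiled down to a few lines. This sketch assumes CuPy and a CUDA device are available; the array size is arbitrary.

```python
import cupy
import numpy


def pin(array):
    """Copy a host array into page-locked (pinned) memory."""
    mem = cupy.cuda.alloc_pinned_memory(array.nbytes)
    pinned = numpy.frombuffer(mem, array.dtype, array.size).reshape(array.shape)
    pinned[...] = array
    return pinned


x_gpu = cupy.arange(1 << 20, dtype=cupy.float32)
x_host = pin(numpy.empty(x_gpu.shape, x_gpu.dtype))

with cupy.cuda.Stream() as stream:
    # Writing into the preallocated pinned buffer via out=... keeps the
    # device-to-host copy asynchronous and avoids rebinding the variable.
    x_gpu.get(out=x_host)
    stream.synchronize()

assert bool((x_host == cupy.asnumpy(x_gpu)).all())
```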
gh_patches_debug_14223 | rasdani/github-patches | git_diff | ibis-project__ibis-2556 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CLN: Remove or consolidate dev dependencies from setup.py and environment.yml
I noticed in https://github.com/ibis-project/ibis/pull/2547#issue-529169508 that the dev dependencies are not in sync in https://github.com/ibis-project/ibis/blob/master/setup.py#L63 and https://github.com/ibis-project/ibis/blob/master/environment.yml#L24
`environment.yml` looks more up to date; the dev dependencies in `setup.py` should either be synced with that file or just removed.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 """Ibis setup module."""
3 import pathlib
4 import sys
5
6 from setuptools import find_packages, setup
7
8 import versioneer
9
10 LONG_DESCRIPTION = """
11 Ibis is a productivity-centric Python big data framework.
12
13 See http://ibis-project.org
14 """
15
16 VERSION = sys.version_info.major, sys.version_info.minor
17
18 impala_requires = ['hdfs>=2.0.16', 'sqlalchemy>=1.1,<1.3.7', 'requests']
19 impala_requires.append('impyla[kerberos]>=0.15.0')
20
21 sqlite_requires = ['sqlalchemy>=1.1,<1.3.7']
22 postgres_requires = sqlite_requires + ['psycopg2']
23 mysql_requires = sqlite_requires + ['pymysql']
24
25 omniscidb_requires = ['pymapd==0.24', 'pyarrow']
26 kerberos_requires = ['requests-kerberos']
27 visualization_requires = ['graphviz']
28 clickhouse_requires = [
29 'clickhouse-driver>=0.1.3',
30 'clickhouse-cityhash',
31 ]
32 bigquery_requires = [
33 'google-cloud-bigquery[bqstorage,pandas]>=1.12.0,<2.0.0dev',
34 'pydata-google-auth',
35 ]
36 hdf5_requires = ['tables>=3.0.0']
37
38 parquet_requires = ['pyarrow>=0.12.0']
39 spark_requires = ['pyspark>=2.4.3']
40
41 geospatial_requires = ['geoalchemy2', 'geopandas', 'shapely']
42
43 dask_requires = [
44 'dask[dataframe, array]',
45 ]
46
47 all_requires = (
48 impala_requires
49 + postgres_requires
50 + omniscidb_requires
51 + mysql_requires
52 + kerberos_requires
53 + visualization_requires
54 + clickhouse_requires
55 + bigquery_requires
56 + hdf5_requires
57 + parquet_requires
58 + spark_requires
59 + geospatial_requires
60 + dask_requires
61 )
62
63 develop_requires = all_requires + [
64 'black',
65 'click',
66 'pydocstyle==4.0.1',
67 'flake8',
68 'isort',
69 'mypy',
70 'pre-commit',
71 'pygit2',
72 'pytest>=4.5',
73 ]
74
75 install_requires = [
76 line.strip()
77 for line in pathlib.Path(__file__)
78 .parent.joinpath('requirements.txt')
79 .read_text()
80 .splitlines()
81 ]
82
83 setup(
84 name='ibis-framework',
85 url='https://github.com/ibis-project/ibis',
86 packages=find_packages(),
87 version=versioneer.get_version(),
88 cmdclass=versioneer.get_cmdclass(),
89 install_requires=install_requires,
90 python_requires='>=3.7',
91 extras_require={
92 'all': all_requires,
93 'develop': develop_requires,
94 'impala': impala_requires,
95 'kerberos': kerberos_requires,
96 'postgres': postgres_requires,
97 'omniscidb': omniscidb_requires,
98 'mysql': mysql_requires,
99 'sqlite': sqlite_requires,
100 'visualization': visualization_requires,
101 'clickhouse': clickhouse_requires,
102 'bigquery': bigquery_requires,
103 'hdf5': hdf5_requires,
104 'parquet': parquet_requires,
105 'spark': spark_requires,
106 'geospatial': geospatial_requires,
107 'dask': dask_requires,
108 },
109 description="Productivity-centric Python Big Data Framework",
110 long_description=LONG_DESCRIPTION,
111 classifiers=[
112 'Development Status :: 4 - Beta',
113 'Operating System :: OS Independent',
114 'Intended Audience :: Science/Research',
115 'Programming Language :: Python',
116 'Programming Language :: Python :: 3',
117 'Topic :: Scientific/Engineering',
118 ],
119 license='Apache License, Version 2.0',
120 maintainer="Phillip Cloud",
121 maintainer_email="[email protected]",
122 )
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -60,18 +60,6 @@
+ dask_requires
)
-develop_requires = all_requires + [
- 'black',
- 'click',
- 'pydocstyle==4.0.1',
- 'flake8',
- 'isort',
- 'mypy',
- 'pre-commit',
- 'pygit2',
- 'pytest>=4.5',
-]
-
install_requires = [
line.strip()
for line in pathlib.Path(__file__)
@@ -90,7 +78,6 @@
python_requires='>=3.7',
extras_require={
'all': all_requires,
- 'develop': develop_requires,
'impala': impala_requires,
'kerberos': kerberos_requires,
'postgres': postgres_requires,
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -60,18 +60,6 @@\n + dask_requires\n )\n \n-develop_requires = all_requires + [\n- 'black',\n- 'click',\n- 'pydocstyle==4.0.1',\n- 'flake8',\n- 'isort',\n- 'mypy',\n- 'pre-commit',\n- 'pygit2',\n- 'pytest>=4.5',\n-]\n-\n install_requires = [\n line.strip()\n for line in pathlib.Path(__file__)\n@@ -90,7 +78,6 @@\n python_requires='>=3.7',\n extras_require={\n 'all': all_requires,\n- 'develop': develop_requires,\n 'impala': impala_requires,\n 'kerberos': kerberos_requires,\n 'postgres': postgres_requires,\n", "issue": "CLN: Remove or consolidate dev dependencies from setup.py and environment.yml\nI noticed in https://github.com/ibis-project/ibis/pull/2547#issue-529169508 that the dev dependencies are not in sync in https://github.com/ibis-project/ibis/blob/master/setup.py#L63 and https://github.com/ibis-project/ibis/blob/master/environment.yml#L24\r\n\r\n`environment.yml` looks more up to date; the dev dependencies in `setup.py` should either be synced with that file or just removed.\n", "before_files": [{"content": "#!/usr/bin/env python\n\"\"\"Ibis setup module.\"\"\"\nimport pathlib\nimport sys\n\nfrom setuptools import find_packages, setup\n\nimport versioneer\n\nLONG_DESCRIPTION = \"\"\"\nIbis is a productivity-centric Python big data framework.\n\nSee http://ibis-project.org\n\"\"\"\n\nVERSION = sys.version_info.major, sys.version_info.minor\n\nimpala_requires = ['hdfs>=2.0.16', 'sqlalchemy>=1.1,<1.3.7', 'requests']\nimpala_requires.append('impyla[kerberos]>=0.15.0')\n\nsqlite_requires = ['sqlalchemy>=1.1,<1.3.7']\npostgres_requires = sqlite_requires + ['psycopg2']\nmysql_requires = sqlite_requires + ['pymysql']\n\nomniscidb_requires = ['pymapd==0.24', 'pyarrow']\nkerberos_requires = ['requests-kerberos']\nvisualization_requires = ['graphviz']\nclickhouse_requires = [\n 'clickhouse-driver>=0.1.3',\n 'clickhouse-cityhash',\n]\nbigquery_requires = [\n 'google-cloud-bigquery[bqstorage,pandas]>=1.12.0,<2.0.0dev',\n 'pydata-google-auth',\n]\nhdf5_requires = ['tables>=3.0.0']\n\nparquet_requires = ['pyarrow>=0.12.0']\nspark_requires = ['pyspark>=2.4.3']\n\ngeospatial_requires = ['geoalchemy2', 'geopandas', 'shapely']\n\ndask_requires = [\n 'dask[dataframe, array]',\n]\n\nall_requires = (\n impala_requires\n + postgres_requires\n + omniscidb_requires\n + mysql_requires\n + kerberos_requires\n + visualization_requires\n + clickhouse_requires\n + bigquery_requires\n + hdf5_requires\n + parquet_requires\n + spark_requires\n + geospatial_requires\n + dask_requires\n)\n\ndevelop_requires = all_requires + [\n 'black',\n 'click',\n 'pydocstyle==4.0.1',\n 'flake8',\n 'isort',\n 'mypy',\n 'pre-commit',\n 'pygit2',\n 'pytest>=4.5',\n]\n\ninstall_requires = [\n line.strip()\n for line in pathlib.Path(__file__)\n .parent.joinpath('requirements.txt')\n .read_text()\n .splitlines()\n]\n\nsetup(\n name='ibis-framework',\n url='https://github.com/ibis-project/ibis',\n packages=find_packages(),\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n install_requires=install_requires,\n python_requires='>=3.7',\n extras_require={\n 'all': all_requires,\n 'develop': develop_requires,\n 'impala': impala_requires,\n 'kerberos': kerberos_requires,\n 'postgres': postgres_requires,\n 'omniscidb': omniscidb_requires,\n 'mysql': mysql_requires,\n 'sqlite': sqlite_requires,\n 'visualization': visualization_requires,\n 'clickhouse': clickhouse_requires,\n 'bigquery': bigquery_requires,\n 'hdf5': hdf5_requires,\n 
'parquet': parquet_requires,\n 'spark': spark_requires,\n 'geospatial': geospatial_requires,\n 'dask': dask_requires,\n },\n description=\"Productivity-centric Python Big Data Framework\",\n long_description=LONG_DESCRIPTION,\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Operating System :: OS Independent',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Topic :: Scientific/Engineering',\n ],\n license='Apache License, Version 2.0',\n maintainer=\"Phillip Cloud\",\n maintainer_email=\"[email protected]\",\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\"\"\"Ibis setup module.\"\"\"\nimport pathlib\nimport sys\n\nfrom setuptools import find_packages, setup\n\nimport versioneer\n\nLONG_DESCRIPTION = \"\"\"\nIbis is a productivity-centric Python big data framework.\n\nSee http://ibis-project.org\n\"\"\"\n\nVERSION = sys.version_info.major, sys.version_info.minor\n\nimpala_requires = ['hdfs>=2.0.16', 'sqlalchemy>=1.1,<1.3.7', 'requests']\nimpala_requires.append('impyla[kerberos]>=0.15.0')\n\nsqlite_requires = ['sqlalchemy>=1.1,<1.3.7']\npostgres_requires = sqlite_requires + ['psycopg2']\nmysql_requires = sqlite_requires + ['pymysql']\n\nomniscidb_requires = ['pymapd==0.24', 'pyarrow']\nkerberos_requires = ['requests-kerberos']\nvisualization_requires = ['graphviz']\nclickhouse_requires = [\n 'clickhouse-driver>=0.1.3',\n 'clickhouse-cityhash',\n]\nbigquery_requires = [\n 'google-cloud-bigquery[bqstorage,pandas]>=1.12.0,<2.0.0dev',\n 'pydata-google-auth',\n]\nhdf5_requires = ['tables>=3.0.0']\n\nparquet_requires = ['pyarrow>=0.12.0']\nspark_requires = ['pyspark>=2.4.3']\n\ngeospatial_requires = ['geoalchemy2', 'geopandas', 'shapely']\n\ndask_requires = [\n 'dask[dataframe, array]',\n]\n\nall_requires = (\n impala_requires\n + postgres_requires\n + omniscidb_requires\n + mysql_requires\n + kerberos_requires\n + visualization_requires\n + clickhouse_requires\n + bigquery_requires\n + hdf5_requires\n + parquet_requires\n + spark_requires\n + geospatial_requires\n + dask_requires\n)\n\ninstall_requires = [\n line.strip()\n for line in pathlib.Path(__file__)\n .parent.joinpath('requirements.txt')\n .read_text()\n .splitlines()\n]\n\nsetup(\n name='ibis-framework',\n url='https://github.com/ibis-project/ibis',\n packages=find_packages(),\n version=versioneer.get_version(),\n cmdclass=versioneer.get_cmdclass(),\n install_requires=install_requires,\n python_requires='>=3.7',\n extras_require={\n 'all': all_requires,\n 'impala': impala_requires,\n 'kerberos': kerberos_requires,\n 'postgres': postgres_requires,\n 'omniscidb': omniscidb_requires,\n 'mysql': mysql_requires,\n 'sqlite': sqlite_requires,\n 'visualization': visualization_requires,\n 'clickhouse': clickhouse_requires,\n 'bigquery': bigquery_requires,\n 'hdf5': hdf5_requires,\n 'parquet': parquet_requires,\n 'spark': spark_requires,\n 'geospatial': geospatial_requires,\n 'dask': dask_requires,\n },\n description=\"Productivity-centric Python Big Data Framework\",\n long_description=LONG_DESCRIPTION,\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Operating System :: OS Independent',\n 'Intended Audience :: Science/Research',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Topic :: Scientific/Engineering',\n ],\n license='Apache License, Version 2.0',\n maintainer=\"Phillip Cloud\",\n maintainer_email=\"[email protected]\",\n)\n", "path": "setup.py"}]} | 1,494 | 196 |
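The maintenance problem behind that record, two hand-maintained dependency lists drifting apart, is easy to guard against mechanically. Below is a hypothetical checker sketch: the file name `environment.yml`, the PyYAML dependency, and the sample requirements list are all assumptions for illustration, not ibis tooling.

```python
import re

import yaml  # PyYAML, assumed to be installed


def normalize(requirement):
    """Reduce 'pytest>=4.5'-style specifiers to the bare package name."""
    return re.split(r"[\s<>=!\[]", requirement.strip(), maxsplit=1)[0].lower()


def conda_deps(path="environment.yml"):
    with open(path) as f:
        env = yaml.safe_load(f)
    return {normalize(d) for d in env.get("dependencies", []) if isinstance(d, str)}


def check(setup_requirements):
    missing = {normalize(r) for r in setup_requirements} - conda_deps()
    if missing:
        raise SystemExit(f"Not in environment.yml: {sorted(missing)}")


if __name__ == "__main__":
    check(["black", "flake8", "isort", "mypy", "pytest>=4.5"])
```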
gh_patches_debug_16105 | rasdani/github-patches | git_diff | comic__grand-challenge.org-1812 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Increase width of algorithm result table
The table on the algorithm results page can become wider than the page container if the name of the scan is very long. The user then has to scroll to the right to see the "Open Result in Viewer" button, which is quite confusing.

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/grandchallenge/core/context_processors.py`
Content:
```
1 import logging
2
3 from django.conf import settings
4 from guardian.shortcuts import get_perms
5 from guardian.utils import get_anonymous_user
6
7 from grandchallenge.blogs.models import Post
8 from grandchallenge.policies.models import Policy
9
10 logger = logging.getLogger(__name__)
11
12
13 def challenge(request):
14 try:
15 challenge = request.challenge
16
17 if challenge is None:
18 return {}
19
20 except AttributeError:
21 logger.warning(f"Could not get challenge for request: {request}")
22 return {}
23
24 try:
25 user = request.user
26 except AttributeError:
27 user = get_anonymous_user()
28
29 return {
30 "challenge": challenge,
31 "challenge_perms": get_perms(user, challenge),
32 "user_is_participant": challenge.is_participant(user),
33 "pages": challenge.page_set.all(),
34 }
35
36
37 def deployment_info(*_, **__):
38 return {
39 "google_analytics_id": settings.GOOGLE_ANALYTICS_ID,
40 "geochart_api_key": settings.GOOGLE_MAPS_API_KEY,
41 "COMMIT_ID": settings.COMMIT_ID,
42 }
43
44
45 def debug(*_, **__):
46 return {
47 "DEBUG": settings.DEBUG,
48 "ACTSTREAM_ENABLE": settings.ACTSTREAM_ENABLE,
49 }
50
51
52 def sentry_dsn(*_, **__):
53 return {
54 "SENTRY_DSN": settings.SENTRY_DSN,
55 "SENTRY_ENABLE_JS_REPORTING": settings.SENTRY_ENABLE_JS_REPORTING,
56 }
57
58
59 def footer_links(*_, **__):
60 return {
61 "policy_pages": Policy.objects.all(),
62 "blog_posts": Post.objects.filter(published=True),
63 }
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/app/grandchallenge/core/context_processors.py b/app/grandchallenge/core/context_processors.py
--- a/app/grandchallenge/core/context_processors.py
+++ b/app/grandchallenge/core/context_processors.py
@@ -5,6 +5,7 @@
from guardian.utils import get_anonymous_user
from grandchallenge.blogs.models import Post
+from grandchallenge.participants.models import RegistrationRequest
from grandchallenge.policies.models import Policy
logger = logging.getLogger(__name__)
@@ -31,6 +32,9 @@
"challenge_perms": get_perms(user, challenge),
"user_is_participant": challenge.is_participant(user),
"pages": challenge.page_set.all(),
+ "pending_requests": challenge.registrationrequest_set.filter(
+ status=RegistrationRequest.PENDING
+ ),
}
| {"golden_diff": "diff --git a/app/grandchallenge/core/context_processors.py b/app/grandchallenge/core/context_processors.py\n--- a/app/grandchallenge/core/context_processors.py\n+++ b/app/grandchallenge/core/context_processors.py\n@@ -5,6 +5,7 @@\n from guardian.utils import get_anonymous_user\n \n from grandchallenge.blogs.models import Post\n+from grandchallenge.participants.models import RegistrationRequest\n from grandchallenge.policies.models import Policy\n \n logger = logging.getLogger(__name__)\n@@ -31,6 +32,9 @@\n \"challenge_perms\": get_perms(user, challenge),\n \"user_is_participant\": challenge.is_participant(user),\n \"pages\": challenge.page_set.all(),\n+ \"pending_requests\": challenge.registrationrequest_set.filter(\n+ status=RegistrationRequest.PENDING\n+ ),\n }\n", "issue": "Increase width of algorithm result table\nThe table on the algorithm results page can become wider than the page container if the name of the scan is very long. The user then has to scroll to the right to see the \"Open Result in Viewer\" button, which is quite confusing.\r\n\r\n\n", "before_files": [{"content": "import logging\n\nfrom django.conf import settings\nfrom guardian.shortcuts import get_perms\nfrom guardian.utils import get_anonymous_user\n\nfrom grandchallenge.blogs.models import Post\nfrom grandchallenge.policies.models import Policy\n\nlogger = logging.getLogger(__name__)\n\n\ndef challenge(request):\n try:\n challenge = request.challenge\n\n if challenge is None:\n return {}\n\n except AttributeError:\n logger.warning(f\"Could not get challenge for request: {request}\")\n return {}\n\n try:\n user = request.user\n except AttributeError:\n user = get_anonymous_user()\n\n return {\n \"challenge\": challenge,\n \"challenge_perms\": get_perms(user, challenge),\n \"user_is_participant\": challenge.is_participant(user),\n \"pages\": challenge.page_set.all(),\n }\n\n\ndef deployment_info(*_, **__):\n return {\n \"google_analytics_id\": settings.GOOGLE_ANALYTICS_ID,\n \"geochart_api_key\": settings.GOOGLE_MAPS_API_KEY,\n \"COMMIT_ID\": settings.COMMIT_ID,\n }\n\n\ndef debug(*_, **__):\n return {\n \"DEBUG\": settings.DEBUG,\n \"ACTSTREAM_ENABLE\": settings.ACTSTREAM_ENABLE,\n }\n\n\ndef sentry_dsn(*_, **__):\n return {\n \"SENTRY_DSN\": settings.SENTRY_DSN,\n \"SENTRY_ENABLE_JS_REPORTING\": settings.SENTRY_ENABLE_JS_REPORTING,\n }\n\n\ndef footer_links(*_, **__):\n return {\n \"policy_pages\": Policy.objects.all(),\n \"blog_posts\": Post.objects.filter(published=True),\n }\n", "path": "app/grandchallenge/core/context_processors.py"}], "after_files": [{"content": "import logging\n\nfrom django.conf import settings\nfrom guardian.shortcuts import get_perms\nfrom guardian.utils import get_anonymous_user\n\nfrom grandchallenge.blogs.models import Post\nfrom grandchallenge.participants.models import RegistrationRequest\nfrom grandchallenge.policies.models import Policy\n\nlogger = logging.getLogger(__name__)\n\n\ndef challenge(request):\n try:\n challenge = request.challenge\n\n if challenge is None:\n return {}\n\n except AttributeError:\n logger.warning(f\"Could not get challenge for request: {request}\")\n return {}\n\n try:\n user = request.user\n except AttributeError:\n user = get_anonymous_user()\n\n return {\n \"challenge\": challenge,\n \"challenge_perms\": get_perms(user, challenge),\n \"user_is_participant\": challenge.is_participant(user),\n \"pages\": challenge.page_set.all(),\n \"pending_requests\": challenge.registrationrequest_set.filter(\n status=RegistrationRequest.PENDING\n 
),\n }\n\n\ndef deployment_info(*_, **__):\n return {\n \"google_analytics_id\": settings.GOOGLE_ANALYTICS_ID,\n \"geochart_api_key\": settings.GOOGLE_MAPS_API_KEY,\n \"COMMIT_ID\": settings.COMMIT_ID,\n }\n\n\ndef debug(*_, **__):\n return {\"DEBUG\": settings.DEBUG}\n\n\ndef sentry_dsn(*_, **__):\n return {\n \"SENTRY_DSN\": settings.SENTRY_DSN,\n \"SENTRY_ENABLE_JS_REPORTING\": settings.SENTRY_ENABLE_JS_REPORTING,\n }\n\n\ndef footer_links(*_, **__):\n return {\n \"policy_pages\": Policy.objects.all(),\n \"blog_posts\": Post.objects.filter(published=True),\n }\n", "path": "app/grandchallenge/core/context_processors.py"}]} | 841 | 170 |
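The mechanism that patch relies on, a Django context processor injecting per-request data into every template, is compact enough to show in isolation. In this hedged sketch the model access mirrors the diff above, but the module path, the string status value, and the settings note are illustrative conventions rather than grand-challenge code.

```python
# myapp/context_processors.py (illustrative module path)


def pending_requests(request):
    """Expose a challenge's pending registration requests to all templates."""
    challenge = getattr(request, "challenge", None)
    if challenge is None:
        return {}
    return {
        "pending_requests": challenge.registrationrequest_set.filter(
            status="PENDING"  # the real code uses RegistrationRequest.PENDING
        )
    }


# Register it in settings.py under TEMPLATES[0]["OPTIONS"]["context_processors"],
# e.g. "myapp.context_processors.pending_requests".
```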
gh_patches_debug_25878 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-7567 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
primanti_bros_us: switch to YextSpider as Where2GetIt seemingly no longer used
The store locator at `https://restaurants.primantibros.com/search` now uses Yext APIs for querying store locations, not Where2GetIt.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/primanti_bros_us.py`
Content:
```
1 from locations.categories import Extras, apply_yes_no
2 from locations.hours import DAYS_FULL, OpeningHours
3 from locations.storefinders.where2getit import Where2GetItSpider
4
5
6 class PrimantiBrosUSSpider(Where2GetItSpider):
7 name = "primanti_bros_us"
8 item_attributes = {"brand": "Primanti Bros", "brand_wikidata": "Q7243049"}
9 api_brand_name = "primantibros"
10 api_key = "7CDBB1A2-4AC6-11EB-932C-8917919C4603"
11
12 def parse_item(self, item, location):
13 item["ref"] = location["uid"]
14 item["street_address"] = ", ".join(filter(None, [location.get("address1"), location.get("address2")]))
15 item["website"] = location.get("menuurl")
16 item["opening_hours"] = OpeningHours()
17 hours_string = ""
18 for day_name in DAYS_FULL:
19 hours_string = f"{hours_string} {day_name}: " + location["{}hours".format(day_name.lower())]
20 item["opening_hours"].add_ranges_from_string(hours_string)
21 apply_yes_no(Extras.DRIVE_THROUGH, item, location["has_drive_through"] == "1", False)
22 yield item
23
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/primanti_bros_us.py b/locations/spiders/primanti_bros_us.py
--- a/locations/spiders/primanti_bros_us.py
+++ b/locations/spiders/primanti_bros_us.py
@@ -1,22 +1,18 @@
-from locations.categories import Extras, apply_yes_no
-from locations.hours import DAYS_FULL, OpeningHours
-from locations.storefinders.where2getit import Where2GetItSpider
+from locations.categories import Categories
+from locations.storefinders.yext import YextSpider
-class PrimantiBrosUSSpider(Where2GetItSpider):
+class PrimantiBrosUSSpider(YextSpider):
name = "primanti_bros_us"
- item_attributes = {"brand": "Primanti Bros", "brand_wikidata": "Q7243049"}
- api_brand_name = "primantibros"
- api_key = "7CDBB1A2-4AC6-11EB-932C-8917919C4603"
+ item_attributes = {"brand": "Primanti Bros", "brand_wikidata": "Q7243049", "extras": Categories.RESTAURANT.value}
+ api_key = "7515c25fc685bbdd7c5975b6573c6912"
+ api_version = "20220511"
def parse_item(self, item, location):
- item["ref"] = location["uid"]
- item["street_address"] = ", ".join(filter(None, [location.get("address1"), location.get("address2")]))
- item["website"] = location.get("menuurl")
- item["opening_hours"] = OpeningHours()
- hours_string = ""
- for day_name in DAYS_FULL:
- hours_string = f"{hours_string} {day_name}: " + location["{}hours".format(day_name.lower())]
- item["opening_hours"].add_ranges_from_string(hours_string)
- apply_yes_no(Extras.DRIVE_THROUGH, item, location["has_drive_through"] == "1", False)
+ if "test-location" in item["ref"]:
+ return
+ item["ref"] = location.get("c_pagesURL")
+ item["name"] = location.get("c_searchName")
+ item["website"] = location.get("c_pagesURL")
+ item.pop("twitter", None)
yield item
| {"golden_diff": "diff --git a/locations/spiders/primanti_bros_us.py b/locations/spiders/primanti_bros_us.py\n--- a/locations/spiders/primanti_bros_us.py\n+++ b/locations/spiders/primanti_bros_us.py\n@@ -1,22 +1,18 @@\n-from locations.categories import Extras, apply_yes_no\n-from locations.hours import DAYS_FULL, OpeningHours\n-from locations.storefinders.where2getit import Where2GetItSpider\n+from locations.categories import Categories\n+from locations.storefinders.yext import YextSpider\n \n \n-class PrimantiBrosUSSpider(Where2GetItSpider):\n+class PrimantiBrosUSSpider(YextSpider):\n name = \"primanti_bros_us\"\n- item_attributes = {\"brand\": \"Primanti Bros\", \"brand_wikidata\": \"Q7243049\"}\n- api_brand_name = \"primantibros\"\n- api_key = \"7CDBB1A2-4AC6-11EB-932C-8917919C4603\"\n+ item_attributes = {\"brand\": \"Primanti Bros\", \"brand_wikidata\": \"Q7243049\", \"extras\": Categories.RESTAURANT.value}\n+ api_key = \"7515c25fc685bbdd7c5975b6573c6912\"\n+ api_version = \"20220511\"\n \n def parse_item(self, item, location):\n- item[\"ref\"] = location[\"uid\"]\n- item[\"street_address\"] = \", \".join(filter(None, [location.get(\"address1\"), location.get(\"address2\")]))\n- item[\"website\"] = location.get(\"menuurl\")\n- item[\"opening_hours\"] = OpeningHours()\n- hours_string = \"\"\n- for day_name in DAYS_FULL:\n- hours_string = f\"{hours_string} {day_name}: \" + location[\"{}hours\".format(day_name.lower())]\n- item[\"opening_hours\"].add_ranges_from_string(hours_string)\n- apply_yes_no(Extras.DRIVE_THROUGH, item, location[\"has_drive_through\"] == \"1\", False)\n+ if \"test-location\" in item[\"ref\"]:\n+ return\n+ item[\"ref\"] = location.get(\"c_pagesURL\")\n+ item[\"name\"] = location.get(\"c_searchName\")\n+ item[\"website\"] = location.get(\"c_pagesURL\")\n+ item.pop(\"twitter\", None)\n yield item\n", "issue": "primanti_bros_us: switch to YextSpider as Where2GetIt seemingly no longer used\nThe store locator at `https://restaurants.primantibros.com/search` now uses Yext APIs for querying store locations, not Where2GetIt.\n", "before_files": [{"content": "from locations.categories import Extras, apply_yes_no\nfrom locations.hours import DAYS_FULL, OpeningHours\nfrom locations.storefinders.where2getit import Where2GetItSpider\n\n\nclass PrimantiBrosUSSpider(Where2GetItSpider):\n name = \"primanti_bros_us\"\n item_attributes = {\"brand\": \"Primanti Bros\", \"brand_wikidata\": \"Q7243049\"}\n api_brand_name = \"primantibros\"\n api_key = \"7CDBB1A2-4AC6-11EB-932C-8917919C4603\"\n\n def parse_item(self, item, location):\n item[\"ref\"] = location[\"uid\"]\n item[\"street_address\"] = \", \".join(filter(None, [location.get(\"address1\"), location.get(\"address2\")]))\n item[\"website\"] = location.get(\"menuurl\")\n item[\"opening_hours\"] = OpeningHours()\n hours_string = \"\"\n for day_name in DAYS_FULL:\n hours_string = f\"{hours_string} {day_name}: \" + location[\"{}hours\".format(day_name.lower())]\n item[\"opening_hours\"].add_ranges_from_string(hours_string)\n apply_yes_no(Extras.DRIVE_THROUGH, item, location[\"has_drive_through\"] == \"1\", False)\n yield item\n", "path": "locations/spiders/primanti_bros_us.py"}], "after_files": [{"content": "from locations.categories import Categories\nfrom locations.storefinders.yext import YextSpider\n\n\nclass PrimantiBrosUSSpider(YextSpider):\n name = \"primanti_bros_us\"\n item_attributes = {\"brand\": \"Primanti Bros\", \"brand_wikidata\": \"Q7243049\", \"extras\": Categories.RESTAURANT.value}\n api_key = 
\"7515c25fc685bbdd7c5975b6573c6912\"\n api_version = \"20220511\"\n\n def parse_item(self, item, location):\n if \"test-location\" in item[\"ref\"]:\n return\n item[\"ref\"] = location.get(\"c_pagesURL\")\n item[\"name\"] = location.get(\"c_searchName\")\n item[\"website\"] = location.get(\"c_pagesURL\")\n item.pop(\"twitter\", None)\n yield item\n", "path": "locations/spiders/primanti_bros_us.py"}]} | 648 | 564 |
gh_patches_debug_20381 | rasdani/github-patches | git_diff | scoutapp__scout_apm_python-663 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Track when an exception occurs in a Celery task
Similar to how we do this in other libraries
`tracked_request.tag("error", "true")`
--- END ISSUE ---
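The request maps naturally onto Celery's `task_failure` signal. A minimal sketch of the tagging pattern, mirroring the patch at the end of this record (the `TrackedRequest` API comes from the repository file shown below):
```python
from celery.signals import task_failure

from scout_apm.core.tracked_request import TrackedRequest


def task_failure_callback(task_id=None, **kwargs):
    # Mark the current tracked request as errored, as other integrations do.
    tracked_request = TrackedRequest.instance()
    tracked_request.tag("error", "true")


# Connect once at install time so every failing task gets tagged.
task_failure.connect(task_failure_callback)
```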
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/scout_apm/celery.py`
Content:
```
1 # coding=utf-8
2 from __future__ import absolute_import, division, print_function, unicode_literals
3
4 import datetime as dt
5
6 from celery.signals import before_task_publish, task_postrun, task_prerun
7
8 import scout_apm.core
9 from scout_apm.compat import datetime_to_timestamp
10 from scout_apm.core.config import scout_config
11 from scout_apm.core.tracked_request import TrackedRequest
12
13
14 def before_task_publish_callback(headers=None, properties=None, **kwargs):
15 if "scout_task_start" not in headers:
16 headers["scout_task_start"] = datetime_to_timestamp(dt.datetime.utcnow())
17
18
19 def task_prerun_callback(task=None, **kwargs):
20 tracked_request = TrackedRequest.instance()
21 tracked_request.is_real_request = True
22
23 start = getattr(task.request, "scout_task_start", None)
24 if start is not None:
25 now = datetime_to_timestamp(dt.datetime.utcnow())
26 try:
27 queue_time = now - start
28 except TypeError:
29 pass
30 else:
31 tracked_request.tag("queue_time", queue_time)
32
33 task_id = getattr(task.request, "id", None)
34 if task_id:
35 tracked_request.tag("task_id", task_id)
36 parent_task_id = getattr(task.request, "parent_id", None)
37 if parent_task_id:
38 tracked_request.tag("parent_task_id", parent_task_id)
39
40 delivery_info = task.request.delivery_info
41 tracked_request.tag("is_eager", delivery_info.get("is_eager", False))
42 tracked_request.tag("exchange", delivery_info.get("exchange", "unknown"))
43 tracked_request.tag("priority", delivery_info.get("priority", "unknown"))
44 tracked_request.tag("routing_key", delivery_info.get("routing_key", "unknown"))
45 tracked_request.tag("queue", delivery_info.get("queue", "unknown"))
46
47 tracked_request.start_span(operation=("Job/" + task.name))
48
49
50 def task_postrun_callback(task=None, **kwargs):
51 tracked_request = TrackedRequest.instance()
52 tracked_request.stop_span()
53
54
55 def install(app=None):
56 if app is not None:
57 copy_configuration(app)
58
59 installed = scout_apm.core.install()
60 if not installed:
61 return
62
63 before_task_publish.connect(before_task_publish_callback)
64 task_prerun.connect(task_prerun_callback)
65 task_postrun.connect(task_postrun_callback)
66
67
68 def copy_configuration(app):
69 prefix = "scout_"
70 prefix_len = len(prefix)
71
72 to_set = {}
73 for key, value in app.conf.items():
74 key_lower = key.lower()
75 if key_lower.startswith(prefix) and len(key_lower) > prefix_len:
76 scout_key = key_lower[prefix_len:]
77 to_set[scout_key] = value
78
79 scout_config.set(**to_set)
80
81
82 def uninstall():
83 before_task_publish.disconnect(before_task_publish_callback)
84 task_prerun.disconnect(task_prerun_callback)
85 task_postrun.disconnect(task_postrun_callback)
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/scout_apm/celery.py b/src/scout_apm/celery.py
--- a/src/scout_apm/celery.py
+++ b/src/scout_apm/celery.py
@@ -3,7 +3,7 @@
import datetime as dt
-from celery.signals import before_task_publish, task_postrun, task_prerun
+from celery.signals import before_task_publish, task_failure, task_postrun, task_prerun
import scout_apm.core
from scout_apm.compat import datetime_to_timestamp
@@ -52,6 +52,11 @@
tracked_request.stop_span()
+def task_failure_callback(task_id=None, **kwargs):
+ tracked_request = TrackedRequest.instance()
+ tracked_request.tag("error", "true")
+
+
def install(app=None):
if app is not None:
copy_configuration(app)
@@ -62,6 +67,7 @@
before_task_publish.connect(before_task_publish_callback)
task_prerun.connect(task_prerun_callback)
+ task_failure.connect(task_failure_callback)
task_postrun.connect(task_postrun_callback)
| {"golden_diff": "diff --git a/src/scout_apm/celery.py b/src/scout_apm/celery.py\n--- a/src/scout_apm/celery.py\n+++ b/src/scout_apm/celery.py\n@@ -3,7 +3,7 @@\n \n import datetime as dt\n \n-from celery.signals import before_task_publish, task_postrun, task_prerun\n+from celery.signals import before_task_publish, task_failure, task_postrun, task_prerun\n \n import scout_apm.core\n from scout_apm.compat import datetime_to_timestamp\n@@ -52,6 +52,11 @@\n tracked_request.stop_span()\n \n \n+def task_failure_callback(task_id=None, **kwargs):\n+ tracked_request = TrackedRequest.instance()\n+ tracked_request.tag(\"error\", \"true\")\n+\n+\n def install(app=None):\n if app is not None:\n copy_configuration(app)\n@@ -62,6 +67,7 @@\n \n before_task_publish.connect(before_task_publish_callback)\n task_prerun.connect(task_prerun_callback)\n+ task_failure.connect(task_failure_callback)\n task_postrun.connect(task_postrun_callback)\n", "issue": "Track when an exception occurs in a Celery task\nSimilar to how we do this in other libraries\r\n`tracked_request.tag(\"error\", \"true\")`\r\n\n", "before_files": [{"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport datetime as dt\n\nfrom celery.signals import before_task_publish, task_postrun, task_prerun\n\nimport scout_apm.core\nfrom scout_apm.compat import datetime_to_timestamp\nfrom scout_apm.core.config import scout_config\nfrom scout_apm.core.tracked_request import TrackedRequest\n\n\ndef before_task_publish_callback(headers=None, properties=None, **kwargs):\n if \"scout_task_start\" not in headers:\n headers[\"scout_task_start\"] = datetime_to_timestamp(dt.datetime.utcnow())\n\n\ndef task_prerun_callback(task=None, **kwargs):\n tracked_request = TrackedRequest.instance()\n tracked_request.is_real_request = True\n\n start = getattr(task.request, \"scout_task_start\", None)\n if start is not None:\n now = datetime_to_timestamp(dt.datetime.utcnow())\n try:\n queue_time = now - start\n except TypeError:\n pass\n else:\n tracked_request.tag(\"queue_time\", queue_time)\n\n task_id = getattr(task.request, \"id\", None)\n if task_id:\n tracked_request.tag(\"task_id\", task_id)\n parent_task_id = getattr(task.request, \"parent_id\", None)\n if parent_task_id:\n tracked_request.tag(\"parent_task_id\", parent_task_id)\n\n delivery_info = task.request.delivery_info\n tracked_request.tag(\"is_eager\", delivery_info.get(\"is_eager\", False))\n tracked_request.tag(\"exchange\", delivery_info.get(\"exchange\", \"unknown\"))\n tracked_request.tag(\"priority\", delivery_info.get(\"priority\", \"unknown\"))\n tracked_request.tag(\"routing_key\", delivery_info.get(\"routing_key\", \"unknown\"))\n tracked_request.tag(\"queue\", delivery_info.get(\"queue\", \"unknown\"))\n\n tracked_request.start_span(operation=(\"Job/\" + task.name))\n\n\ndef task_postrun_callback(task=None, **kwargs):\n tracked_request = TrackedRequest.instance()\n tracked_request.stop_span()\n\n\ndef install(app=None):\n if app is not None:\n copy_configuration(app)\n\n installed = scout_apm.core.install()\n if not installed:\n return\n\n before_task_publish.connect(before_task_publish_callback)\n task_prerun.connect(task_prerun_callback)\n task_postrun.connect(task_postrun_callback)\n\n\ndef copy_configuration(app):\n prefix = \"scout_\"\n prefix_len = len(prefix)\n\n to_set = {}\n for key, value in app.conf.items():\n key_lower = key.lower()\n if key_lower.startswith(prefix) and len(key_lower) > prefix_len:\n scout_key = 
key_lower[prefix_len:]\n to_set[scout_key] = value\n\n scout_config.set(**to_set)\n\n\ndef uninstall():\n before_task_publish.disconnect(before_task_publish_callback)\n task_prerun.disconnect(task_prerun_callback)\n task_postrun.disconnect(task_postrun_callback)\n", "path": "src/scout_apm/celery.py"}], "after_files": [{"content": "# coding=utf-8\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport datetime as dt\n\nfrom celery.signals import before_task_publish, task_failure, task_postrun, task_prerun\n\nimport scout_apm.core\nfrom scout_apm.compat import datetime_to_timestamp\nfrom scout_apm.core.config import scout_config\nfrom scout_apm.core.tracked_request import TrackedRequest\n\n\ndef before_task_publish_callback(headers=None, properties=None, **kwargs):\n if \"scout_task_start\" not in headers:\n headers[\"scout_task_start\"] = datetime_to_timestamp(dt.datetime.utcnow())\n\n\ndef task_prerun_callback(task=None, **kwargs):\n tracked_request = TrackedRequest.instance()\n tracked_request.is_real_request = True\n\n start = getattr(task.request, \"scout_task_start\", None)\n if start is not None:\n now = datetime_to_timestamp(dt.datetime.utcnow())\n try:\n queue_time = now - start\n except TypeError:\n pass\n else:\n tracked_request.tag(\"queue_time\", queue_time)\n\n task_id = getattr(task.request, \"id\", None)\n if task_id:\n tracked_request.tag(\"task_id\", task_id)\n parent_task_id = getattr(task.request, \"parent_id\", None)\n if parent_task_id:\n tracked_request.tag(\"parent_task_id\", parent_task_id)\n\n delivery_info = task.request.delivery_info\n tracked_request.tag(\"is_eager\", delivery_info.get(\"is_eager\", False))\n tracked_request.tag(\"exchange\", delivery_info.get(\"exchange\", \"unknown\"))\n tracked_request.tag(\"priority\", delivery_info.get(\"priority\", \"unknown\"))\n tracked_request.tag(\"routing_key\", delivery_info.get(\"routing_key\", \"unknown\"))\n tracked_request.tag(\"queue\", delivery_info.get(\"queue\", \"unknown\"))\n\n tracked_request.start_span(operation=(\"Job/\" + task.name))\n\n\ndef task_postrun_callback(task=None, **kwargs):\n tracked_request = TrackedRequest.instance()\n tracked_request.stop_span()\n\n\ndef task_failure_callback(task_id=None, **kwargs):\n tracked_request = TrackedRequest.instance()\n tracked_request.tag(\"error\", \"true\")\n\n\ndef install(app=None):\n if app is not None:\n copy_configuration(app)\n\n installed = scout_apm.core.install()\n if not installed:\n return\n\n before_task_publish.connect(before_task_publish_callback)\n task_prerun.connect(task_prerun_callback)\n task_failure.connect(task_failure_callback)\n task_postrun.connect(task_postrun_callback)\n\n\ndef copy_configuration(app):\n prefix = \"scout_\"\n prefix_len = len(prefix)\n\n to_set = {}\n for key, value in app.conf.items():\n key_lower = key.lower()\n if key_lower.startswith(prefix) and len(key_lower) > prefix_len:\n scout_key = key_lower[prefix_len:]\n to_set[scout_key] = value\n\n scout_config.set(**to_set)\n\n\ndef uninstall():\n before_task_publish.disconnect(before_task_publish_callback)\n task_prerun.disconnect(task_prerun_callback)\n task_postrun.disconnect(task_postrun_callback)\n", "path": "src/scout_apm/celery.py"}]} | 1,092 | 248 |
gh_patches_debug_5122 | rasdani/github-patches | git_diff | liqd__a4-meinberlin-3044 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[dev/stage] auto-fill-in overwrites my bplan-name
**URL:** https://meinberlin-stage.liqd.net/dashboard/projects/caro-testing-new-bplan-mail-2/bplan/
**user:** initiator adding a bplan
**expected behaviour:** I can use autofill to add my mail-address
**behaviour:** if I do so, the title of the bplan is overwritten by my name, but as it is far up the form I don't notice it.
**important screensize:**
**device & browser:** mac, chrome
**Comment/Question:** is that even something we can influence?
Screenshot?
<img width="673" alt="Bildschirmfoto 2020-07-10 um 11 02 30" src="https://user-images.githubusercontent.com/35491681/87137579-6b0eaf80-c29d-11ea-928f-c888dc8eb430.png">
<img width="673" alt="Bildschirmfoto 2020-07-10 um 11 06 10" src="https://user-images.githubusercontent.com/35491681/87137586-6cd87300-c29d-11ea-965d-74b4ecba8bc8.png">
--- END ISSUE ---
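Browser autofill can be discouraged per field from the Django form itself. A minimal, self-contained sketch of the pattern (the exact `autocomplete`/`autofill` attribute values come from the patch at the end of this record; `autofill` is not a standard HTML attribute, so treat it as best-effort):
```python
from django import forms


class ExampleForm(forms.Form):
    name = forms.CharField()

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        # Hint to the browser not to autofill this field.
        self.fields['name'].widget.attrs.update({
            'autocomplete': 'off', 'autofill': 'off'
        })
```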
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `meinberlin/apps/bplan/forms.py`
Content:
```
1 from django import forms
2
3 from meinberlin.apps.extprojects.forms import ExternalProjectCreateForm
4 from meinberlin.apps.extprojects.forms import ExternalProjectForm
5
6 from . import models
7
8
9 class StatementForm(forms.ModelForm):
10 class Meta:
11 model = models.Statement
12 fields = ['name', 'email', 'statement',
13 'street_number', 'postal_code_city']
14
15
16 class BplanProjectCreateForm(ExternalProjectCreateForm):
17
18 class Meta:
19 model = models.Bplan
20 fields = ['name', 'description', 'tile_image', 'tile_image_copyright']
21
22
23 class BplanProjectForm(ExternalProjectForm):
24
25 class Meta:
26 model = models.Bplan
27 fields = ['name', 'identifier', 'url', 'description', 'tile_image',
28 'tile_image_copyright', 'is_archived', 'office_worker_email',
29 'start_date', 'end_date']
30 required_for_project_publish = ['name', 'url', 'description',
31 'office_worker_email']
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/meinberlin/apps/bplan/forms.py b/meinberlin/apps/bplan/forms.py
--- a/meinberlin/apps/bplan/forms.py
+++ b/meinberlin/apps/bplan/forms.py
@@ -29,3 +29,9 @@
'start_date', 'end_date']
required_for_project_publish = ['name', 'url', 'description',
'office_worker_email']
+
+ def __init__(self, *args, **kwargs):
+ super().__init__(*args, **kwargs)
+ self.fields['name'].widget.attrs.update({
+ 'autocomplete': 'off', 'autofill': 'off'
+ })
| {"golden_diff": "diff --git a/meinberlin/apps/bplan/forms.py b/meinberlin/apps/bplan/forms.py\n--- a/meinberlin/apps/bplan/forms.py\n+++ b/meinberlin/apps/bplan/forms.py\n@@ -29,3 +29,9 @@\n 'start_date', 'end_date']\n required_for_project_publish = ['name', 'url', 'description',\n 'office_worker_email']\n+\n+ def __init__(self, *args, **kwargs):\n+ super().__init__(*args, **kwargs)\n+ self.fields['name'].widget.attrs.update({\n+ 'autocomplete': 'off', 'autofill': 'off'\n+ })\n", "issue": "[dev/stage] auto-fill-in overwrites my bplan-name\n**URL:** https://meinberlin-stage.liqd.net/dashboard/projects/caro-testing-new-bplan-mail-2/bplan/\r\n**user:** initiator addin bplan\r\n**expected behaviour:** I can use autofill to add my mail-address\r\n**behaviour:** if I do so, the title of bplan is overwritten by my name but as it is far up the form I don't notice it.\r\n**important screensize:**\r\n**device & browser:** mac, chrome\r\n**Comment/Question:** is that even something we can influence?\r\n\r\nScreenshot?\r\n<img width=\"673\" alt=\"Bildschirmfoto 2020-07-10 um 11 02 30\" src=\"https://user-images.githubusercontent.com/35491681/87137579-6b0eaf80-c29d-11ea-928f-c888dc8eb430.png\">\r\n<img width=\"673\" alt=\"Bildschirmfoto 2020-07-10 um 11 06 10\" src=\"https://user-images.githubusercontent.com/35491681/87137586-6cd87300-c29d-11ea-965d-74b4ecba8bc8.png\">\r\n\r\n\n", "before_files": [{"content": "from django import forms\n\nfrom meinberlin.apps.extprojects.forms import ExternalProjectCreateForm\nfrom meinberlin.apps.extprojects.forms import ExternalProjectForm\n\nfrom . import models\n\n\nclass StatementForm(forms.ModelForm):\n class Meta:\n model = models.Statement\n fields = ['name', 'email', 'statement',\n 'street_number', 'postal_code_city']\n\n\nclass BplanProjectCreateForm(ExternalProjectCreateForm):\n\n class Meta:\n model = models.Bplan\n fields = ['name', 'description', 'tile_image', 'tile_image_copyright']\n\n\nclass BplanProjectForm(ExternalProjectForm):\n\n class Meta:\n model = models.Bplan\n fields = ['name', 'identifier', 'url', 'description', 'tile_image',\n 'tile_image_copyright', 'is_archived', 'office_worker_email',\n 'start_date', 'end_date']\n required_for_project_publish = ['name', 'url', 'description',\n 'office_worker_email']\n", "path": "meinberlin/apps/bplan/forms.py"}], "after_files": [{"content": "from django import forms\n\nfrom meinberlin.apps.extprojects.forms import ExternalProjectCreateForm\nfrom meinberlin.apps.extprojects.forms import ExternalProjectForm\n\nfrom . import models\n\n\nclass StatementForm(forms.ModelForm):\n class Meta:\n model = models.Statement\n fields = ['name', 'email', 'statement',\n 'street_number', 'postal_code_city']\n\n\nclass BplanProjectCreateForm(ExternalProjectCreateForm):\n\n class Meta:\n model = models.Bplan\n fields = ['name', 'description', 'tile_image', 'tile_image_copyright']\n\n\nclass BplanProjectForm(ExternalProjectForm):\n\n class Meta:\n model = models.Bplan\n fields = ['name', 'identifier', 'url', 'description', 'tile_image',\n 'tile_image_copyright', 'is_archived', 'office_worker_email',\n 'start_date', 'end_date']\n required_for_project_publish = ['name', 'url', 'description',\n 'office_worker_email']\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n self.fields['name'].widget.attrs.update({\n 'autocomplete': 'off', 'autofill': 'off'\n })\n", "path": "meinberlin/apps/bplan/forms.py"}]} | 854 | 146 |
gh_patches_debug_5041 | rasdani/github-patches | git_diff | dask__dask-256 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
dot_graph does not work in stable version
I try to generate visual graphs as [described in documentation](http://dask.pydata.org/en/latest/inspect.html), but get:
`'module' object has no attribute 'to_pydot'`
The graphviz is installed with homebrew. Dask is installed from conda (latest stable release):
```
In [15]: dask.__version__
Out[15]: '0.5.0'
```
The code and traceback are below (I had to replace `blockshape` with `chunks`, otherwise it did not create a task graph):
``` python
In [1]:
import dask.array as da
from dask.dot import dot_graph
In [2]:
x = da.ones((5, 15), chunks=(5, 5))
In [5]:
d = (x + 1).dask
In [6]:
dot_graph(d)
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-6-c797e633866d> in <module>()
----> 1 dot_graph(d)
/Users/koldunov/miniconda/lib/python2.7/site-packages/dask/dot.pyc in dot_graph(d, filename, **kwargs)
73 def dot_graph(d, filename='mydask', **kwargs):
74 dg = to_networkx(d, **kwargs)
---> 75 write_networkx_to_dot(dg, filename=filename)
76
77
/Users/koldunov/miniconda/lib/python2.7/site-packages/dask/dot.pyc in write_networkx_to_dot(dg, filename)
61 def write_networkx_to_dot(dg, filename='mydask'):
62 import os
---> 63 p = nx.to_pydot(dg)
64 p.set_rankdir('BT')
65 with open(filename + '.dot', 'w') as f:
AttributeError: 'module' object has no attribute 'to_pydot'
```
--- END ISSUE ---
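The traceback bottoms out in `nx.to_pydot`, which newer networkx releases no longer expose. A defensive wrapper, mirroring the patch at the end of this record, turns the opaque `AttributeError` into an actionable message (the helper name here is hypothetical; the real patch inlines this in `dask/dot.py`):
```python
import networkx as nx


def to_pydot_safe(dg):
    try:
        return nx.to_pydot(dg)
    except AttributeError:
        raise ImportError("Can not find pydot module. Please install.\n"
                          "    pip install pydot")
```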
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dask/dot.py`
Content:
```
1 from __future__ import absolute_import, division, print_function
2
3 import networkx as nx
4 from dask.core import istask, get_dependencies
5
6
7 def make_hashable(x):
8 try:
9 hash(x)
10 return x
11 except TypeError:
12 return hash(str(x))
13
14
15 def lower(func):
16 while hasattr(func, 'func'):
17 func = func.func
18 return func
19
20 def name(func):
21 try:
22 return lower(func).__name__
23 except AttributeError:
24 return 'func'
25
26
27 def to_networkx(d, data_attributes=None, function_attributes=None):
28 if data_attributes is None:
29 data_attributes = dict()
30 if function_attributes is None:
31 function_attributes = dict()
32
33 g = nx.DiGraph()
34
35 for k, v in sorted(d.items(), key=lambda x: x[0]):
36 g.add_node(k, shape='box', **data_attributes.get(k, dict()))
37 if istask(v):
38 func, args = v[0], v[1:]
39 func_node = make_hashable((v, 'function'))
40 g.add_node(func_node,
41 shape='circle',
42 label=name(func),
43 **function_attributes.get(k, dict()))
44 g.add_edge(func_node, k)
45 for dep in sorted(get_dependencies(d, k)):
46 arg2 = make_hashable(dep)
47 g.add_node(arg2,
48 label=str(dep),
49 shape='box',
50 **data_attributes.get(dep, dict()))
51 g.add_edge(arg2, func_node)
52 else:
53 if v not in d:
54 g.add_node(k, label='%s=%s' % (k, v), **data_attributes.get(k, dict()))
55 else: # alias situation
56 g.add_edge(v, k)
57
58 return g
59
60
61 def write_networkx_to_dot(dg, filename='mydask'):
62 import os
63 p = nx.to_pydot(dg)
64 p.set_rankdir('BT')
65 with open(filename + '.dot', 'w') as f:
66 f.write(p.to_string())
67
68 os.system('dot -Tpdf %s.dot -o %s.pdf' % (filename, filename))
69 os.system('dot -Tpng %s.dot -o %s.png' % (filename, filename))
70 print("Writing graph to %s.pdf" % filename)
71
72
73 def dot_graph(d, filename='mydask', **kwargs):
74 dg = to_networkx(d, **kwargs)
75 write_networkx_to_dot(dg, filename=filename)
76
77
78 if __name__ == '__main__':
79 def add(x, y):
80 return x + y
81 def inc(x):
82 return x + 1
83
84 dsk = {'x': 1, 'y': (inc, 'x'),
85 'a': 2, 'b': (inc, 'a'),
86 'z': (add, 'y', 'b')}
87
88 dot_graph(dsk)
89
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dask/dot.py b/dask/dot.py
--- a/dask/dot.py
+++ b/dask/dot.py
@@ -60,7 +60,11 @@
def write_networkx_to_dot(dg, filename='mydask'):
import os
- p = nx.to_pydot(dg)
+ try:
+ p = nx.to_pydot(dg)
+ except AttributeError:
+ raise ImportError("Can not find pydot module. Please install.\n"
+ " pip install pydot")
p.set_rankdir('BT')
with open(filename + '.dot', 'w') as f:
f.write(p.to_string())
| {"golden_diff": "diff --git a/dask/dot.py b/dask/dot.py\n--- a/dask/dot.py\n+++ b/dask/dot.py\n@@ -60,7 +60,11 @@\n \n def write_networkx_to_dot(dg, filename='mydask'):\n import os\n- p = nx.to_pydot(dg)\n+ try:\n+ p = nx.to_pydot(dg)\n+ except AttributeError:\n+ raise ImportError(\"Can not find pydot module. Please install.\\n\"\n+ \" pip install pydot\")\n p.set_rankdir('BT')\n with open(filename + '.dot', 'w') as f:\n f.write(p.to_string())\n", "issue": "dot_graph does not work in stable version\nI try to generate visual graphs as [described in documentation](http://dask.pydata.org/en/latest/inspect.html), but get:\n`'module' object has no attribute 'to_pydot'`\n\nThe graphviz is installed with homebrew. Dask is installed from conda (latest stable release):\n\n```\nIn [15]: dask.__version__\n\nOut[15]: '0.5.0'\n```\n\nThe code and traceback are below (I had to replace `blockshape` with `chunks`, otherwise it did not create task graph):\n\n``` python\nIn [1]: \nimport dask.array as da\nfrom dask.dot import dot_graph\nIn [2]:\n\nx = da.ones((5, 15), chunks=(5, 5))\nIn [5]:\n\nd = (x + 1).dask\nIn [6]:\n\ndot_graph(d)\n---------------------------------------------------------------------------\nAttributeError Traceback (most recent call last)\n<ipython-input-6-c797e633866d> in <module>()\n----> 1 dot_graph(d)\n\n/Users/koldunov/miniconda/lib/python2.7/site-packages/dask/dot.pyc in dot_graph(d, filename, **kwargs)\n 73 def dot_graph(d, filename='mydask', **kwargs):\n 74 dg = to_networkx(d, **kwargs)\n---> 75 write_networkx_to_dot(dg, filename=filename)\n 76 \n 77 \n\n/Users/koldunov/miniconda/lib/python2.7/site-packages/dask/dot.pyc in write_networkx_to_dot(dg, filename)\n 61 def write_networkx_to_dot(dg, filename='mydask'):\n 62 import os\n---> 63 p = nx.to_pydot(dg)\n 64 p.set_rankdir('BT')\n 65 with open(filename + '.dot', 'w') as f:\n\nAttributeError: 'module' object has no attribute 'to_pydot'\n```\n\n", "before_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport networkx as nx\nfrom dask.core import istask, get_dependencies\n\n\ndef make_hashable(x):\n try:\n hash(x)\n return x\n except TypeError:\n return hash(str(x))\n\n\ndef lower(func):\n while hasattr(func, 'func'):\n func = func.func\n return func\n\ndef name(func):\n try:\n return lower(func).__name__\n except AttributeError:\n return 'func'\n\n\ndef to_networkx(d, data_attributes=None, function_attributes=None):\n if data_attributes is None:\n data_attributes = dict()\n if function_attributes is None:\n function_attributes = dict()\n\n g = nx.DiGraph()\n\n for k, v in sorted(d.items(), key=lambda x: x[0]):\n g.add_node(k, shape='box', **data_attributes.get(k, dict()))\n if istask(v):\n func, args = v[0], v[1:]\n func_node = make_hashable((v, 'function'))\n g.add_node(func_node,\n shape='circle',\n label=name(func),\n **function_attributes.get(k, dict()))\n g.add_edge(func_node, k)\n for dep in sorted(get_dependencies(d, k)):\n arg2 = make_hashable(dep)\n g.add_node(arg2,\n label=str(dep),\n shape='box',\n **data_attributes.get(dep, dict()))\n g.add_edge(arg2, func_node)\n else:\n if v not in d:\n g.add_node(k, label='%s=%s' % (k, v), **data_attributes.get(k, dict()))\n else: # alias situation\n g.add_edge(v, k)\n\n return g\n\n\ndef write_networkx_to_dot(dg, filename='mydask'):\n import os\n p = nx.to_pydot(dg)\n p.set_rankdir('BT')\n with open(filename + '.dot', 'w') as f:\n f.write(p.to_string())\n\n os.system('dot -Tpdf %s.dot -o %s.pdf' % (filename, filename))\n os.system('dot 
-Tpng %s.dot -o %s.png' % (filename, filename))\n print(\"Writing graph to %s.pdf\" % filename)\n\n\ndef dot_graph(d, filename='mydask', **kwargs):\n dg = to_networkx(d, **kwargs)\n write_networkx_to_dot(dg, filename=filename)\n\n\nif __name__ == '__main__':\n def add(x, y):\n return x + y\n def inc(x):\n return x + 1\n\n dsk = {'x': 1, 'y': (inc, 'x'),\n 'a': 2, 'b': (inc, 'a'),\n 'z': (add, 'y', 'b')}\n\n dot_graph(dsk)\n", "path": "dask/dot.py"}], "after_files": [{"content": "from __future__ import absolute_import, division, print_function\n\nimport networkx as nx\nfrom dask.core import istask, get_dependencies\n\n\ndef make_hashable(x):\n try:\n hash(x)\n return x\n except TypeError:\n return hash(str(x))\n\n\ndef lower(func):\n while hasattr(func, 'func'):\n func = func.func\n return func\n\ndef name(func):\n try:\n return lower(func).__name__\n except AttributeError:\n return 'func'\n\n\ndef to_networkx(d, data_attributes=None, function_attributes=None):\n if data_attributes is None:\n data_attributes = dict()\n if function_attributes is None:\n function_attributes = dict()\n\n g = nx.DiGraph()\n\n for k, v in sorted(d.items(), key=lambda x: x[0]):\n g.add_node(k, shape='box', **data_attributes.get(k, dict()))\n if istask(v):\n func, args = v[0], v[1:]\n func_node = make_hashable((v, 'function'))\n g.add_node(func_node,\n shape='circle',\n label=name(func),\n **function_attributes.get(k, dict()))\n g.add_edge(func_node, k)\n for dep in sorted(get_dependencies(d, k)):\n arg2 = make_hashable(dep)\n g.add_node(arg2,\n label=str(dep),\n shape='box',\n **data_attributes.get(dep, dict()))\n g.add_edge(arg2, func_node)\n else:\n if v not in d:\n g.add_node(k, label='%s=%s' % (k, v), **data_attributes.get(k, dict()))\n else: # alias situation\n g.add_edge(v, k)\n\n return g\n\n\ndef write_networkx_to_dot(dg, filename='mydask'):\n import os\n try:\n p = nx.to_pydot(dg)\n except AttributeError:\n raise ImportError(\"Can not find pydot module. Please install.\\n\"\n \" pip install pydot\")\n p.set_rankdir('BT')\n with open(filename + '.dot', 'w') as f:\n f.write(p.to_string())\n\n os.system('dot -Tpdf %s.dot -o %s.pdf' % (filename, filename))\n os.system('dot -Tpng %s.dot -o %s.png' % (filename, filename))\n print(\"Writing graph to %s.pdf\" % filename)\n\n\ndef dot_graph(d, filename='mydask', **kwargs):\n dg = to_networkx(d, **kwargs)\n write_networkx_to_dot(dg, filename=filename)\n\n\nif __name__ == '__main__':\n def add(x, y):\n return x + y\n def inc(x):\n return x + 1\n\n dsk = {'x': 1, 'y': (inc, 'x'),\n 'a': 2, 'b': (inc, 'a'),\n 'z': (add, 'y', 'b')}\n\n dot_graph(dsk)\n", "path": "dask/dot.py"}]} | 1,505 | 151 |
gh_patches_debug_20616 | rasdani/github-patches | git_diff | rasterio__rasterio-1259 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
examples/total.py won't run in Python3
The line `total /= 3` should instead read `total = total / 3`.
--- END ISSUE ---
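Under Python 3, `/=` on an integer NumPy array raises `TypeError`, because the float result of true division cannot be written back into the integer array in place. Either reassign or use floor division, which is what the patch at the end of this record opts for. A minimal illustration:
```python
import numpy as np

total = np.zeros((2, 2), dtype=np.uint16)
total += 7

# total /= 3  # TypeError on Python 3: true divide cannot cast back to uint16
total = total // 3  # floor division keeps the integer dtype
```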
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/sieve.py`
Content:
```
1 #!/usr/bin/env python
2 #
3 # sieve: demonstrate sieving and polygonizing of raster features.
4
5 import subprocess
6
7 import numpy as np
8 import rasterio
9 from rasterio.features import sieve, shapes
10
11
12 # Register GDAL and OGR drivers.
13 with rasterio.Env():
14
15 # Read a raster to be sieved.
16 with rasterio.open('tests/data/shade.tif') as src:
17 shade = src.read(1)
18
19 # Print the number of shapes in the source raster.
20 print("Slope shapes: %d" % len(list(shapes(shade))))
21
22 # Sieve out features 13 pixels or smaller.
23 sieved = sieve(shade, 13, out=np.zeros(src.shape, src.dtypes[0]))
24
25 # Print the number of shapes in the sieved raster.
26 print("Sieved (13) shapes: %d" % len(list(shapes(sieved))))
27
28 # Write out the sieved raster.
29 kwargs = src.meta
30 kwargs['transform'] = kwargs.pop('affine')
31 with rasterio.open('example-sieved.tif', 'w', **kwargs) as dst:
32 dst.write(sieved, indexes=1)
33
34 # Dump out gdalinfo's report card and open (or "eog") the TIFF.
35 print(subprocess.check_output(
36 ['gdalinfo', '-stats', 'example-sieved.tif']))
37 subprocess.call(['open', 'example-sieved.tif'])
38
```
Path: `examples/total.py`
Content:
```
1 import numpy as np
2 import rasterio
3 import subprocess
4
5 with rasterio.Env(CPL_DEBUG=True):
6
7 # Read raster bands directly to Numpy arrays.
8 with rasterio.open('tests/data/RGB.byte.tif') as src:
9 r, g, b = src.read()
10
11 # Combine arrays using the 'iadd' ufunc. Expecting that the sum will
12 # exceed the 8-bit integer range, initialize it as 16-bit. Adding other
13 # arrays to it in-place converts those arrays up and preserves the type
14 # of the total array.
15 total = np.zeros(r.shape, dtype=rasterio.uint16)
16 for band in (r, g, b):
17 total += band
18 total /= 3
19
20 # Write the product as a raster band to a new 8-bit file. For keyword
21 # arguments, we start with the meta attributes of the source file, but
22 # then change the band count to 1, set the dtype to uint8, and specify
23 # LZW compression.
24 kwargs = src.meta
25 kwargs.update(
26 dtype=rasterio.uint8,
27 count=1,
28 compress='lzw')
29
30 with rasterio.open('example-total.tif', 'w', **kwargs) as dst:
31 dst.write(total.astype(rasterio.uint8), indexes=1)
32
33 # Dump out gdalinfo's report card and open the image.
34 info = subprocess.check_output(
35 ['gdalinfo', '-stats', 'example-total.tif'])
36 print(info)
37 subprocess.call(['open', 'example-total.tif'])
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/sieve.py b/examples/sieve.py
--- a/examples/sieve.py
+++ b/examples/sieve.py
@@ -27,7 +27,7 @@
# Write out the sieved raster.
kwargs = src.meta
- kwargs['transform'] = kwargs.pop('affine')
+ kwargs['transform'] = rasterio.transform.guard_transform(kwargs['transform'])
with rasterio.open('example-sieved.tif', 'w', **kwargs) as dst:
dst.write(sieved, indexes=1)
diff --git a/examples/total.py b/examples/total.py
--- a/examples/total.py
+++ b/examples/total.py
@@ -1,3 +1,4 @@
+from __future__ import division
import numpy as np
import rasterio
import subprocess
@@ -15,7 +16,7 @@
total = np.zeros(r.shape, dtype=rasterio.uint16)
for band in (r, g, b):
total += band
- total /= 3
+ total = total // 3
# Write the product as a raster band to a new 8-bit file. For keyword
# arguments, we start with the meta attributes of the source file, but
| {"golden_diff": "diff --git a/examples/sieve.py b/examples/sieve.py\n--- a/examples/sieve.py\n+++ b/examples/sieve.py\n@@ -27,7 +27,7 @@\n \n # Write out the sieved raster.\n kwargs = src.meta\n- kwargs['transform'] = kwargs.pop('affine')\n+ kwargs['transform'] = rasterio.transform.guard_transform(kwargs['transform'])\n with rasterio.open('example-sieved.tif', 'w', **kwargs) as dst:\n dst.write(sieved, indexes=1)\n \ndiff --git a/examples/total.py b/examples/total.py\n--- a/examples/total.py\n+++ b/examples/total.py\n@@ -1,3 +1,4 @@\n+from __future__ import division\n import numpy as np\n import rasterio\n import subprocess\n@@ -15,7 +16,7 @@\n total = np.zeros(r.shape, dtype=rasterio.uint16)\n for band in (r, g, b):\n total += band\n- total /= 3\n+ total = total // 3\n \n # Write the product as a raster band to a new 8-bit file. For keyword\n # arguments, we start with the meta attributes of the source file, but\n", "issue": "examples/total.py won't run in Python3\nThe line `total /= 3` should read instead, `total = total / 3`.\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n#\n# sieve: demonstrate sieving and polygonizing of raster features.\n\nimport subprocess\n\nimport numpy as np\nimport rasterio\nfrom rasterio.features import sieve, shapes\n\n\n# Register GDAL and OGR drivers.\nwith rasterio.Env():\n\n # Read a raster to be sieved.\n with rasterio.open('tests/data/shade.tif') as src:\n shade = src.read(1)\n\n # Print the number of shapes in the source raster.\n print(\"Slope shapes: %d\" % len(list(shapes(shade))))\n\n # Sieve out features 13 pixels or smaller.\n sieved = sieve(shade, 13, out=np.zeros(src.shape, src.dtypes[0]))\n\n # Print the number of shapes in the sieved raster.\n print(\"Sieved (13) shapes: %d\" % len(list(shapes(sieved))))\n\n # Write out the sieved raster.\n kwargs = src.meta\n kwargs['transform'] = kwargs.pop('affine')\n with rasterio.open('example-sieved.tif', 'w', **kwargs) as dst:\n dst.write(sieved, indexes=1)\n\n# Dump out gdalinfo's report card and open (or \"eog\") the TIFF.\nprint(subprocess.check_output(\n ['gdalinfo', '-stats', 'example-sieved.tif']))\nsubprocess.call(['open', 'example-sieved.tif'])\n", "path": "examples/sieve.py"}, {"content": "import numpy as np\nimport rasterio\nimport subprocess\n\nwith rasterio.Env(CPL_DEBUG=True):\n\n # Read raster bands directly to Numpy arrays.\n with rasterio.open('tests/data/RGB.byte.tif') as src:\n r, g, b = src.read()\n\n # Combine arrays using the 'iadd' ufunc. Expecting that the sum will\n # exceed the 8-bit integer range, initialize it as 16-bit. Adding other\n # arrays to it in-place converts those arrays up and preserves the type\n # of the total array.\n total = np.zeros(r.shape, dtype=rasterio.uint16)\n for band in (r, g, b):\n total += band\n total /= 3\n\n # Write the product as a raster band to a new 8-bit file. 
For keyword\n # arguments, we start with the meta attributes of the source file, but\n # then change the band count to 1, set the dtype to uint8, and specify\n # LZW compression.\n kwargs = src.meta\n kwargs.update(\n dtype=rasterio.uint8,\n count=1,\n compress='lzw')\n\n with rasterio.open('example-total.tif', 'w', **kwargs) as dst:\n dst.write(total.astype(rasterio.uint8), indexes=1)\n\n# Dump out gdalinfo's report card and open the image.\ninfo = subprocess.check_output(\n ['gdalinfo', '-stats', 'example-total.tif'])\nprint(info)\nsubprocess.call(['open', 'example-total.tif'])\n", "path": "examples/total.py"}], "after_files": [{"content": "#!/usr/bin/env python\n#\n# sieve: demonstrate sieving and polygonizing of raster features.\n\nimport subprocess\n\nimport numpy as np\nimport rasterio\nfrom rasterio.features import sieve, shapes\n\n\n# Register GDAL and OGR drivers.\nwith rasterio.Env():\n\n # Read a raster to be sieved.\n with rasterio.open('tests/data/shade.tif') as src:\n shade = src.read(1)\n\n # Print the number of shapes in the source raster.\n print(\"Slope shapes: %d\" % len(list(shapes(shade))))\n\n # Sieve out features 13 pixels or smaller.\n sieved = sieve(shade, 13, out=np.zeros(src.shape, src.dtypes[0]))\n\n # Print the number of shapes in the sieved raster.\n print(\"Sieved (13) shapes: %d\" % len(list(shapes(sieved))))\n\n # Write out the sieved raster.\n kwargs = src.meta\n kwargs['transform'] = rasterio.transform.guard_transform(kwargs['transform'])\n with rasterio.open('example-sieved.tif', 'w', **kwargs) as dst:\n dst.write(sieved, indexes=1)\n\n# Dump out gdalinfo's report card and open (or \"eog\") the TIFF.\nprint(subprocess.check_output(\n ['gdalinfo', '-stats', 'example-sieved.tif']))\nsubprocess.call(['open', 'example-sieved.tif'])\n", "path": "examples/sieve.py"}, {"content": "from __future__ import division\nimport numpy as np\nimport rasterio\nimport subprocess\n\nwith rasterio.Env(CPL_DEBUG=True):\n\n # Read raster bands directly to Numpy arrays.\n with rasterio.open('tests/data/RGB.byte.tif') as src:\n r, g, b = src.read()\n\n # Combine arrays using the 'iadd' ufunc. Expecting that the sum will\n # exceed the 8-bit integer range, initialize it as 16-bit. Adding other\n # arrays to it in-place converts those arrays up and preserves the type\n # of the total array.\n total = np.zeros(r.shape, dtype=rasterio.uint16)\n for band in (r, g, b):\n total += band\n total = total // 3\n\n # Write the product as a raster band to a new 8-bit file. For keyword\n # arguments, we start with the meta attributes of the source file, but\n # then change the band count to 1, set the dtype to uint8, and specify\n # LZW compression.\n kwargs = src.meta\n kwargs.update(\n dtype=rasterio.uint8,\n count=1,\n compress='lzw')\n\n with rasterio.open('example-total.tif', 'w', **kwargs) as dst:\n dst.write(total.astype(rasterio.uint8), indexes=1)\n\n# Dump out gdalinfo's report card and open the image.\ninfo = subprocess.check_output(\n ['gdalinfo', '-stats', 'example-total.tif'])\nprint(info)\nsubprocess.call(['open', 'example-total.tif'])\n", "path": "examples/total.py"}]} | 1,090 | 271 |
gh_patches_debug_58650 | rasdani/github-patches | git_diff | googleapis__google-api-python-client-295 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
BatchError is unprintable using default constructor (one string)
This one should be pretty simple, I hope.
Here's the constructor signature: `def __init__(self, reason, resp=None, content=None):`, which doesn't require `resp` to be defined, and I can see it is not defined most of the time, for example, in googleapiclient/http.py.
Then, given the representation method:
```
def __repr__(self):
return '<BatchError %s "%s">' % (self.resp.status, self.reason)
```
Which is also the string method:
```
__str__ = __repr__
```
This results in unprintable exceptions where `resp` is undefined, which is not very helpful when attempting to understand the error (e.g. #164).
--- END ISSUE ---
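The crash happens inside `__repr__` itself, so the exception cannot even be printed. A small stand-alone reproduction of the problem and the guarded fix (a simplified stand-in class; the real patch at the end of this record applies the same guard to `googleapiclient.errors.BatchError`):
```python
class BatchError(Exception):
    """Simplified stand-in for googleapiclient's BatchError, for illustration only."""

    def __init__(self, reason, resp=None, content=None):
        self.reason = reason
        self.resp = resp
        self.content = content

    def __repr__(self):
        # Guard against resp being None before touching resp.status.
        if getattr(self.resp, 'status', None) is None:
            return '<BatchError "%s">' % self.reason
        return '<BatchError %s "%s">' % (self.resp.status, self.reason)

    __str__ = __repr__


print(BatchError('timed out'))  # <BatchError "timed out"> instead of a crash
```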
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `googleapiclient/errors.py`
Content:
```
1 # Copyright 2014 Google Inc. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Errors for the library.
16
17 All exceptions defined by the library
18 should be defined in this file.
19 """
20 from __future__ import absolute_import
21
22 __author__ = '[email protected] (Joe Gregorio)'
23
24 import json
25
26 # Oauth2client < 3 has the positional helper in 'util', >= 3 has it
27 # in '_helpers'.
28 try:
29 from oauth2client import util
30 except ImportError:
31 from oauth2client import _helpers as util
32
33
34 class Error(Exception):
35 """Base error for this module."""
36 pass
37
38
39 class HttpError(Error):
40 """HTTP data was invalid or unexpected."""
41
42 @util.positional(3)
43 def __init__(self, resp, content, uri=None):
44 self.resp = resp
45 if not isinstance(content, bytes):
46 raise TypeError("HTTP content should be bytes")
47 self.content = content
48 self.uri = uri
49
50 def _get_reason(self):
51 """Calculate the reason for the error from the response content."""
52 reason = self.resp.reason
53 try:
54 data = json.loads(self.content.decode('utf-8'))
55 if isinstance(data, dict):
56 reason = data['error']['message']
57 elif isinstance(data, list) and len(data) > 0:
58 first_error = data[0]
59 reason = first_error['error']['message']
60 except (ValueError, KeyError, TypeError):
61 pass
62 if reason is None:
63 reason = ''
64 return reason
65
66 def __repr__(self):
67 if self.uri:
68 return '<HttpError %s when requesting %s returned "%s">' % (
69 self.resp.status, self.uri, self._get_reason().strip())
70 else:
71 return '<HttpError %s "%s">' % (self.resp.status, self._get_reason())
72
73 __str__ = __repr__
74
75
76 class InvalidJsonError(Error):
77 """The JSON returned could not be parsed."""
78 pass
79
80
81 class UnknownFileType(Error):
82 """File type unknown or unexpected."""
83 pass
84
85
86 class UnknownLinkType(Error):
87 """Link type unknown or unexpected."""
88 pass
89
90
91 class UnknownApiNameOrVersion(Error):
92 """No API with that name and version exists."""
93 pass
94
95
96 class UnacceptableMimeTypeError(Error):
97 """That is an unacceptable mimetype for this operation."""
98 pass
99
100
101 class MediaUploadSizeError(Error):
102 """Media is larger than the method can accept."""
103 pass
104
105
106 class ResumableUploadError(HttpError):
107 """Error occured during resumable upload."""
108 pass
109
110
111 class InvalidChunkSizeError(Error):
112 """The given chunksize is not valid."""
113 pass
114
115 class InvalidNotificationError(Error):
116 """The channel Notification is invalid."""
117 pass
118
119 class BatchError(HttpError):
120 """Error occured during batch operations."""
121
122 @util.positional(2)
123 def __init__(self, reason, resp=None, content=None):
124 self.resp = resp
125 self.content = content
126 self.reason = reason
127
128 def __repr__(self):
129 return '<BatchError %s "%s">' % (self.resp.status, self.reason)
130
131 __str__ = __repr__
132
133
134 class UnexpectedMethodError(Error):
135 """Exception raised by RequestMockBuilder on unexpected calls."""
136
137 @util.positional(1)
138 def __init__(self, methodId=None):
139 """Constructor for an UnexpectedMethodError."""
140 super(UnexpectedMethodError, self).__init__(
141 'Received unexpected call %s' % methodId)
142
143
144 class UnexpectedBodyError(Error):
145 """Exception raised by RequestMockBuilder on unexpected bodies."""
146
147 def __init__(self, expected, provided):
148 """Constructor for an UnexpectedMethodError."""
149 super(UnexpectedBodyError, self).__init__(
150 'Expected: [%s] - Provided: [%s]' % (expected, provided))
151
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/googleapiclient/errors.py b/googleapiclient/errors.py
--- a/googleapiclient/errors.py
+++ b/googleapiclient/errors.py
@@ -126,6 +126,9 @@
self.reason = reason
def __repr__(self):
+ if getattr(self.resp, 'status', None) is None:
+ return '<BatchError "%s">' % (self.reason)
+ else:
return '<BatchError %s "%s">' % (self.resp.status, self.reason)
__str__ = __repr__
| {"golden_diff": "diff --git a/googleapiclient/errors.py b/googleapiclient/errors.py\n--- a/googleapiclient/errors.py\n+++ b/googleapiclient/errors.py\n@@ -126,6 +126,9 @@\n self.reason = reason\n \n def __repr__(self):\n+ if getattr(self.resp, 'status', None) is None:\n+ return '<BatchError \"%s\">' % (self.reason)\n+ else:\n return '<BatchError %s \"%s\">' % (self.resp.status, self.reason)\n \n __str__ = __repr__\n", "issue": "BatchError is unprintable using default constructor (one string)\nThis one should be pretty simple, I hope.\n\nHere's the constructor signature: `def __init__(self, reason, resp=None, content=None):`, which doesn't require `resp` to be defined, and I can see it is not defined most of the time, for example, in googleapiclient/http.py.\n\nThen, given the representation method:\n\n```\ndef __repr__(self):\n return '<BatchError %s \"%s\">' % (self.resp.status, self.reason)\n```\n\nWhich is also the string method:\n\n```\n__str__ = __repr__\n```\n\nThis results in unprintable exceptions where `resp` is undefined, which is not very helpful when attempting to understand the error (e.g. #164).\n\n", "before_files": [{"content": "# Copyright 2014 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Errors for the library.\n\nAll exceptions defined by the library\nshould be defined in this file.\n\"\"\"\nfrom __future__ import absolute_import\n\n__author__ = '[email protected] (Joe Gregorio)'\n\nimport json\n\n# Oauth2client < 3 has the positional helper in 'util', >= 3 has it\n# in '_helpers'.\ntry:\n from oauth2client import util\nexcept ImportError:\n from oauth2client import _helpers as util\n\n\nclass Error(Exception):\n \"\"\"Base error for this module.\"\"\"\n pass\n\n\nclass HttpError(Error):\n \"\"\"HTTP data was invalid or unexpected.\"\"\"\n\n @util.positional(3)\n def __init__(self, resp, content, uri=None):\n self.resp = resp\n if not isinstance(content, bytes):\n raise TypeError(\"HTTP content should be bytes\")\n self.content = content\n self.uri = uri\n\n def _get_reason(self):\n \"\"\"Calculate the reason for the error from the response content.\"\"\"\n reason = self.resp.reason\n try:\n data = json.loads(self.content.decode('utf-8'))\n if isinstance(data, dict):\n reason = data['error']['message']\n elif isinstance(data, list) and len(data) > 0:\n first_error = data[0]\n reason = first_error['error']['message']\n except (ValueError, KeyError, TypeError):\n pass\n if reason is None:\n reason = ''\n return reason\n\n def __repr__(self):\n if self.uri:\n return '<HttpError %s when requesting %s returned \"%s\">' % (\n self.resp.status, self.uri, self._get_reason().strip())\n else:\n return '<HttpError %s \"%s\">' % (self.resp.status, self._get_reason())\n\n __str__ = __repr__\n\n\nclass InvalidJsonError(Error):\n \"\"\"The JSON returned could not be parsed.\"\"\"\n pass\n\n\nclass UnknownFileType(Error):\n \"\"\"File type unknown or unexpected.\"\"\"\n pass\n\n\nclass UnknownLinkType(Error):\n \"\"\"Link type unknown or 
unexpected.\"\"\"\n pass\n\n\nclass UnknownApiNameOrVersion(Error):\n \"\"\"No API with that name and version exists.\"\"\"\n pass\n\n\nclass UnacceptableMimeTypeError(Error):\n \"\"\"That is an unacceptable mimetype for this operation.\"\"\"\n pass\n\n\nclass MediaUploadSizeError(Error):\n \"\"\"Media is larger than the method can accept.\"\"\"\n pass\n\n\nclass ResumableUploadError(HttpError):\n \"\"\"Error occured during resumable upload.\"\"\"\n pass\n\n\nclass InvalidChunkSizeError(Error):\n \"\"\"The given chunksize is not valid.\"\"\"\n pass\n\nclass InvalidNotificationError(Error):\n \"\"\"The channel Notification is invalid.\"\"\"\n pass\n\nclass BatchError(HttpError):\n \"\"\"Error occured during batch operations.\"\"\"\n\n @util.positional(2)\n def __init__(self, reason, resp=None, content=None):\n self.resp = resp\n self.content = content\n self.reason = reason\n\n def __repr__(self):\n return '<BatchError %s \"%s\">' % (self.resp.status, self.reason)\n\n __str__ = __repr__\n\n\nclass UnexpectedMethodError(Error):\n \"\"\"Exception raised by RequestMockBuilder on unexpected calls.\"\"\"\n\n @util.positional(1)\n def __init__(self, methodId=None):\n \"\"\"Constructor for an UnexpectedMethodError.\"\"\"\n super(UnexpectedMethodError, self).__init__(\n 'Received unexpected call %s' % methodId)\n\n\nclass UnexpectedBodyError(Error):\n \"\"\"Exception raised by RequestMockBuilder on unexpected bodies.\"\"\"\n\n def __init__(self, expected, provided):\n \"\"\"Constructor for an UnexpectedMethodError.\"\"\"\n super(UnexpectedBodyError, self).__init__(\n 'Expected: [%s] - Provided: [%s]' % (expected, provided))\n", "path": "googleapiclient/errors.py"}], "after_files": [{"content": "# Copyright 2014 Google Inc. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Errors for the library.\n\nAll exceptions defined by the library\nshould be defined in this file.\n\"\"\"\nfrom __future__ import absolute_import\n\n__author__ = '[email protected] (Joe Gregorio)'\n\nimport json\n\n# Oauth2client < 3 has the positional helper in 'util', >= 3 has it\n# in '_helpers'.\ntry:\n from oauth2client import util\nexcept ImportError:\n from oauth2client import _helpers as util\n\n\nclass Error(Exception):\n \"\"\"Base error for this module.\"\"\"\n pass\n\n\nclass HttpError(Error):\n \"\"\"HTTP data was invalid or unexpected.\"\"\"\n\n @util.positional(3)\n def __init__(self, resp, content, uri=None):\n self.resp = resp\n if not isinstance(content, bytes):\n raise TypeError(\"HTTP content should be bytes\")\n self.content = content\n self.uri = uri\n\n def _get_reason(self):\n \"\"\"Calculate the reason for the error from the response content.\"\"\"\n reason = self.resp.reason\n try:\n data = json.loads(self.content.decode('utf-8'))\n if isinstance(data, dict):\n reason = data['error']['message']\n elif isinstance(data, list) and len(data) > 0:\n first_error = data[0]\n reason = first_error['error']['message']\n except (ValueError, KeyError, TypeError):\n pass\n if reason is 
None:\n reason = ''\n return reason\n\n def __repr__(self):\n if self.uri:\n return '<HttpError %s when requesting %s returned \"%s\">' % (\n self.resp.status, self.uri, self._get_reason().strip())\n else:\n return '<HttpError %s \"%s\">' % (self.resp.status, self._get_reason())\n\n __str__ = __repr__\n\n\nclass InvalidJsonError(Error):\n \"\"\"The JSON returned could not be parsed.\"\"\"\n pass\n\n\nclass UnknownFileType(Error):\n \"\"\"File type unknown or unexpected.\"\"\"\n pass\n\n\nclass UnknownLinkType(Error):\n \"\"\"Link type unknown or unexpected.\"\"\"\n pass\n\n\nclass UnknownApiNameOrVersion(Error):\n \"\"\"No API with that name and version exists.\"\"\"\n pass\n\n\nclass UnacceptableMimeTypeError(Error):\n \"\"\"That is an unacceptable mimetype for this operation.\"\"\"\n pass\n\n\nclass MediaUploadSizeError(Error):\n \"\"\"Media is larger than the method can accept.\"\"\"\n pass\n\n\nclass ResumableUploadError(HttpError):\n \"\"\"Error occured during resumable upload.\"\"\"\n pass\n\n\nclass InvalidChunkSizeError(Error):\n \"\"\"The given chunksize is not valid.\"\"\"\n pass\n\nclass InvalidNotificationError(Error):\n \"\"\"The channel Notification is invalid.\"\"\"\n pass\n\nclass BatchError(HttpError):\n \"\"\"Error occured during batch operations.\"\"\"\n\n @util.positional(2)\n def __init__(self, reason, resp=None, content=None):\n self.resp = resp\n self.content = content\n self.reason = reason\n\n def __repr__(self):\n if getattr(self.resp, 'status', None) is None:\n return '<BatchError \"%s\">' % (self.reason)\n else:\n return '<BatchError %s \"%s\">' % (self.resp.status, self.reason)\n\n __str__ = __repr__\n\n\nclass UnexpectedMethodError(Error):\n \"\"\"Exception raised by RequestMockBuilder on unexpected calls.\"\"\"\n\n @util.positional(1)\n def __init__(self, methodId=None):\n \"\"\"Constructor for an UnexpectedMethodError.\"\"\"\n super(UnexpectedMethodError, self).__init__(\n 'Received unexpected call %s' % methodId)\n\n\nclass UnexpectedBodyError(Error):\n \"\"\"Exception raised by RequestMockBuilder on unexpected bodies.\"\"\"\n\n def __init__(self, expected, provided):\n \"\"\"Constructor for an UnexpectedMethodError.\"\"\"\n super(UnexpectedBodyError, self).__init__(\n 'Expected: [%s] - Provided: [%s]' % (expected, provided))\n", "path": "googleapiclient/errors.py"}]} | 1,729 | 124 |
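For the row above, a minimal self-contained sketch of why `__repr__` needs to guard `self.resp`: the constructor defaults `resp` to `None`, so formatting `self.resp.status` makes the exception itself unprintable. The class names below are stand-ins, not googleapiclient's real ones.

```python
# Illustrative stand-ins for googleapiclient's BatchError, before and after the fix.
class BatchErrorBefore(Exception):
    def __init__(self, reason, resp=None, content=None):
        self.resp = resp
        self.content = content
        self.reason = reason

    def __repr__(self):
        # resp defaults to None, so repr() itself blows up:
        return '<BatchError %s "%s">' % (self.resp.status, self.reason)


try:
    repr(BatchErrorBefore("batch failed"))
except AttributeError as exc:
    print(exc)  # 'NoneType' object has no attribute 'status'


class BatchErrorAfter(BatchErrorBefore):
    def __repr__(self):
        if getattr(self.resp, 'status', None) is None:
            return '<BatchError "%s">' % (self.reason,)
        return '<BatchError %s "%s">' % (self.resp.status, self.reason)


print(repr(BatchErrorAfter("batch failed")))  # <BatchError "batch failed">
```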
gh_patches_debug_7422 | rasdani/github-patches | git_diff | Lightning-AI__pytorch-lightning-1091 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update CHANGELOG for 0.7.x
## 🐛 Bug
<!-- A clear and concise description of what the bug is. -->
Update the CHANGELOG according to the recent changes (roughly the last two weeks), especially deprecated items like `data_loader` or `xxxxx_end`
### Additional context
<!-- Add any other context about the problem here. -->
https://github.com/PyTorchLightning/pytorch-lightning/milestone/4
--- END ISSUE ---
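Beyond rewording the message, the fix passes `DeprecationWarning` as the category, which lets callers filter the warning instead of string-matching it. A self-contained sketch mirroring the golden diff further down; the filter call at the end is just one possible use:

```python
import warnings


def data_loader(fn):
    """Sketch of the deprecation shim, mirroring the patched decorator."""
    warnings.warn(
        '`data_loader` decorator deprecated in v0.7.0. Will be removed v0.9.0',
        DeprecationWarning,
    )

    def inner_fx(self):
        return fn(self)
    return inner_fx


# With an explicit category, callers can silence it without matching message text:
warnings.simplefilter('ignore', DeprecationWarning)
```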
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pytorch_lightning/core/decorators.py`
Content:
```
1 import traceback
2 from functools import wraps
3 import warnings
4
5
6 def data_loader(fn):
7 """Decorator to make any fx with this use the lazy property.
8
9 :param fn:
10 :return:
11 """
12 w = 'data_loader decorator deprecated in 0.7.0. Will remove 0.9.0'
13 warnings.warn(w)
14
15 def inner_fx(self):
16 return fn(self)
17 return inner_fx
18
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pytorch_lightning/core/decorators.py b/pytorch_lightning/core/decorators.py
--- a/pytorch_lightning/core/decorators.py
+++ b/pytorch_lightning/core/decorators.py
@@ -6,11 +6,10 @@
def data_loader(fn):
"""Decorator to make any fx with this use the lazy property.
- :param fn:
- :return:
+ Warnings:
+ This decorator deprecated in v0.7.0 and it will be removed v0.9.0.
"""
- w = 'data_loader decorator deprecated in 0.7.0. Will remove 0.9.0'
- warnings.warn(w)
+ warnings.warn('`data_loader` decorator deprecated in v0.7.0. Will be removed v0.9.0', DeprecationWarning)
def inner_fx(self):
return fn(self)
| {"golden_diff": "diff --git a/pytorch_lightning/core/decorators.py b/pytorch_lightning/core/decorators.py\n--- a/pytorch_lightning/core/decorators.py\n+++ b/pytorch_lightning/core/decorators.py\n@@ -6,11 +6,10 @@\n def data_loader(fn):\n \"\"\"Decorator to make any fx with this use the lazy property.\n \n- :param fn:\n- :return:\n+ Warnings:\n+ This decorator deprecated in v0.7.0 and it will be removed v0.9.0.\n \"\"\"\n- w = 'data_loader decorator deprecated in 0.7.0. Will remove 0.9.0'\n- warnings.warn(w)\n+ warnings.warn('`data_loader` decorator deprecated in v0.7.0. Will be removed v0.9.0', DeprecationWarning)\n \n def inner_fx(self):\n return fn(self)\n", "issue": "Update CHANGELOG for 0.7.x\n## \ud83d\udc1b Bug\r\n\r\n<!-- A clear and concise description of what the bug is. -->\r\n\r\nUpdated CHANGELOG according to the reset changes (about last two weeks) especially deprecated items like `data_loader` or `xxxxx_end`\r\n\r\n### Additional context\r\n\r\n<!-- Add any other context about the problem here. -->\r\n\r\nhttps://github.com/PyTorchLightning/pytorch-lightning/milestone/4\n", "before_files": [{"content": "import traceback\nfrom functools import wraps\nimport warnings\n\n\ndef data_loader(fn):\n \"\"\"Decorator to make any fx with this use the lazy property.\n\n :param fn:\n :return:\n \"\"\"\n w = 'data_loader decorator deprecated in 0.7.0. Will remove 0.9.0'\n warnings.warn(w)\n\n def inner_fx(self):\n return fn(self)\n return inner_fx\n", "path": "pytorch_lightning/core/decorators.py"}], "after_files": [{"content": "import traceback\nfrom functools import wraps\nimport warnings\n\n\ndef data_loader(fn):\n \"\"\"Decorator to make any fx with this use the lazy property.\n\n Warnings:\n This decorator deprecated in v0.7.0 and it will be removed v0.9.0.\n \"\"\"\n warnings.warn('`data_loader` decorator deprecated in v0.7.0. Will be removed v0.9.0', DeprecationWarning)\n\n def inner_fx(self):\n return fn(self)\n return inner_fx\n", "path": "pytorch_lightning/core/decorators.py"}]} | 473 | 200 |
gh_patches_debug_38787 | rasdani/github-patches | git_diff | Kinto__kinto-1284 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
500 when creating a new account with a POST and forgetting to put the ID
```
File "kinto/plugins/accounts/views.py", line 112, in process_record
if new[self.model.id_field] != self.request.selected_userid:
KeyError: 'id'
```
--- END ISSUE ---
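The traceback reduces to indexing a dict key before checking that it exists. A minimal, self-contained sketch of the failure and the guard the fix adds (only the `id` field name comes from the traceback; the rest is illustrative):

```python
# Pre-fix shape: process_record() reads new["id"] before validating it,
# so an anonymous POST without data.id surfaces as KeyError -> HTTP 500.
new = {"password": "hunter2"}  # request body that forgot "id"
id_field = "id"

try:
    if new[id_field] != "alice":  # crashes here on the missing key
        pass
except KeyError as exc:
    print("KeyError:", exc)

# Post-fix shape: validate presence first and turn it into a 400 response.
if id_field not in new:
    print("400 Bad Request: data.id: Accounts must have an ID.")
```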
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kinto/plugins/accounts/__init__.py`
Content:
```
1 from kinto.authorization import PERMISSIONS_INHERITANCE_TREE
2 from pyramid.exceptions import ConfigurationError
3
4
5 def includeme(config):
6 config.add_api_capability(
7 'accounts',
8 description='Manage user accounts.',
9 url='https://kinto.readthedocs.io/en/latest/api/1.x/accounts.html')
10
11 config.scan('kinto.plugins.accounts.views')
12
13 PERMISSIONS_INHERITANCE_TREE[''].update({
14 'account:create': {}
15 })
16 PERMISSIONS_INHERITANCE_TREE['account'] = {
17 'write': {'account': ['write']},
18 'read': {'account': ['write', 'read']}
19 }
20
21 # Add some safety to avoid weird behaviour with basicauth default policy.
22 settings = config.get_settings()
23 auth_policies = settings['multiauth.policies']
24 if 'basicauth' in auth_policies and 'account' in auth_policies:
25 if auth_policies.index('basicauth') < auth_policies.index('account'):
26 error_msg = ("'basicauth' should not be mentioned before 'account' "
27 "in 'multiauth.policies' setting.")
28 raise ConfigurationError(error_msg)
29
```
Path: `kinto/plugins/accounts/views.py`
Content:
```
1 import bcrypt
2 import colander
3 from pyramid import httpexceptions
4 from pyramid.decorator import reify
5 from pyramid.security import Authenticated, Everyone
6 from pyramid.settings import aslist
7
8 from kinto.views import NameGenerator
9 from kinto.core import resource
10 from kinto.core.errors import raise_invalid, http_error
11
12
13 def _extract_posted_body_id(request):
14 try:
15 # Anonymous creation with POST.
16 return request.json['data']['id']
17 except (ValueError, KeyError):
18 # Bad POST data.
19 if request.method.lower() == 'post':
20 error_details = {
21 'name': 'data.id',
22 'description': 'data.id in body: Required'
23 }
24 raise_invalid(request, **error_details)
25 # Anonymous GET
26 error_msg = 'Cannot read accounts.'
27 raise http_error(httpexceptions.HTTPUnauthorized(), error=error_msg)
28
29
30 class AccountSchema(resource.ResourceSchema):
31 password = colander.SchemaNode(colander.String())
32
33
34 @resource.register()
35 class Account(resource.ShareableResource):
36
37 schema = AccountSchema
38
39 def __init__(self, request, context):
40 # Store if current user is administrator (before accessing get_parent_id())
41 allowed_from_settings = request.registry.settings.get('account_write_principals', [])
42 context.is_administrator = len(set(aslist(allowed_from_settings)) &
43 set(request.prefixed_principals)) > 0
44 # Shortcut to check if current is anonymous (before get_parent_id()).
45 context.is_anonymous = Authenticated not in request.effective_principals
46
47 super().__init__(request, context)
48
49 # Overwrite the current principal set by ShareableResource.
50 if self.model.current_principal == Everyone or context.is_administrator:
51 # Creation is anonymous, but author with write perm is this:
52 # XXX: only works if policy name is account in settings.
53 self.model.current_principal = 'account:{}'.format(self.model.parent_id)
54
55 @reify
56 def id_generator(self):
57 # This generator is used for ID validation.
58 return NameGenerator()
59
60 def get_parent_id(self, request):
61 # The whole challenge here is that we want to isolate what
62 # authenticated users can list, but give access to everything to
63 # administrators.
64 # Plus when anonymous create accounts, we have to set their parent id
65 # to the same value they would obtain when authenticated.
66 if self.context.is_administrator:
67 if self.context.on_collection:
68 # Accounts created by admin should have userid as parent.
69 if request.method.lower() == 'post':
70 return _extract_posted_body_id(request)
71 else:
72 # Admin see all accounts.
73 return '*'
74 else:
75 # No pattern matching for admin on single record.
76 return request.matchdict['id']
77
78 if not self.context.is_anonymous:
79 # Authenticated users see their own account only.
80 return request.selected_userid
81
82 # Anonymous creation with PUT.
83 if 'id' in request.matchdict:
84 return request.matchdict['id']
85
86 return _extract_posted_body_id(request)
87
88 def collection_post(self):
89 result = super(Account, self).collection_post()
90 if self.context.is_anonymous and self.request.response.status_code == 200:
91 error_details = {
92 'message': 'Account ID %r already exists' % result['data']['id']
93 }
94 raise http_error(httpexceptions.HTTPForbidden(), **error_details)
95 return result
96
97 def process_record(self, new, old=None):
98 new = super(Account, self).process_record(new, old)
99
100 # Store password safely in database as str
101 # (bcrypt.hashpw returns base64 bytes).
102 pwd_str = new["password"].encode(encoding='utf-8')
103 hashed = bcrypt.hashpw(pwd_str, bcrypt.gensalt())
104 new["password"] = hashed.decode(encoding='utf-8')
105
106 # Administrators can reach other accounts and anonymous have no
107 # selected_userid. So do not try to enforce.
108 if self.context.is_administrator or self.context.is_anonymous:
109 return new
110
111 # Otherwise, we force the id to match the authenticated username.
112 if new[self.model.id_field] != self.request.selected_userid:
113 error_details = {
114 'name': 'data.id',
115 'description': 'Username and account ID do not match.',
116 }
117 raise_invalid(self.request, **error_details)
118
119 return new
120
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kinto/plugins/accounts/__init__.py b/kinto/plugins/accounts/__init__.py
--- a/kinto/plugins/accounts/__init__.py
+++ b/kinto/plugins/accounts/__init__.py
@@ -26,3 +26,24 @@
error_msg = ("'basicauth' should not be mentioned before 'account' "
"in 'multiauth.policies' setting.")
raise ConfigurationError(error_msg)
+
+ # We assume anyone in account_create_principals is to create
+ # accounts for other people.
+ # No one can create accounts for other people unless they are an
+ # "admin", defined as someone matching account_write_principals.
+ # Therefore any account that is in account_create_principals
+ # should be in account_write_principals too.
+ creators = set(settings.get('account_create_principals', '').split())
+ admins = set(settings.get('account_write_principals', '').split())
+ cant_create_anything = creators.difference(admins)
+ # system.Everyone isn't an account.
+ cant_create_anything.discard('system.Everyone')
+ if cant_create_anything:
+ message = ('Configuration has some principals in account_create_principals '
+ 'but not in account_write_principals. These principals will only be '
+ 'able to create their own accounts. This may not be what you want.\n'
+ 'If you want these users to be able to create accounts for other users, '
+ 'add them to account_write_principals.\n'
+ 'Affected users: {}'.format(list(cant_create_anything)))
+
+ raise ConfigurationError(message)
diff --git a/kinto/plugins/accounts/views.py b/kinto/plugins/accounts/views.py
--- a/kinto/plugins/accounts/views.py
+++ b/kinto/plugins/accounts/views.py
@@ -27,6 +27,12 @@
raise http_error(httpexceptions.HTTPUnauthorized(), error=error_msg)
+class AccountIdGenerator(NameGenerator):
+ """Allow @ signs in account IDs."""
+
+ regexp = r'^[a-zA-Z0-9][.@a-zA-Z0-9_-]*$'
+
+
class AccountSchema(resource.ResourceSchema):
password = colander.SchemaNode(colander.String())
@@ -55,7 +61,7 @@
@reify
def id_generator(self):
# This generator is used for ID validation.
- return NameGenerator()
+ return AccountIdGenerator()
def get_parent_id(self, request):
# The whole challenge here is that we want to isolate what
@@ -108,6 +114,14 @@
if self.context.is_administrator or self.context.is_anonymous:
return new
+ # Do not let accounts be created without usernames.
+ if self.model.id_field not in new:
+ error_details = {
+ 'name': 'data.id',
+ 'description': 'Accounts must have an ID.',
+ }
+ raise_invalid(self.request, **error_details)
+
# Otherwise, we force the id to match the authenticated username.
if new[self.model.id_field] != self.request.selected_userid:
error_details = {
| {"golden_diff": "diff --git a/kinto/plugins/accounts/__init__.py b/kinto/plugins/accounts/__init__.py\n--- a/kinto/plugins/accounts/__init__.py\n+++ b/kinto/plugins/accounts/__init__.py\n@@ -26,3 +26,24 @@\n error_msg = (\"'basicauth' should not be mentioned before 'account' \"\n \"in 'multiauth.policies' setting.\")\n raise ConfigurationError(error_msg)\n+\n+ # We assume anyone in account_create_principals is to create\n+ # accounts for other people.\n+ # No one can create accounts for other people unless they are an\n+ # \"admin\", defined as someone matching account_write_principals.\n+ # Therefore any account that is in account_create_principals\n+ # should be in account_write_principals too.\n+ creators = set(settings.get('account_create_principals', '').split())\n+ admins = set(settings.get('account_write_principals', '').split())\n+ cant_create_anything = creators.difference(admins)\n+ # system.Everyone isn't an account.\n+ cant_create_anything.discard('system.Everyone')\n+ if cant_create_anything:\n+ message = ('Configuration has some principals in account_create_principals '\n+ 'but not in account_write_principals. These principals will only be '\n+ 'able to create their own accounts. This may not be what you want.\\n'\n+ 'If you want these users to be able to create accounts for other users, '\n+ 'add them to account_write_principals.\\n'\n+ 'Affected users: {}'.format(list(cant_create_anything)))\n+\n+ raise ConfigurationError(message)\ndiff --git a/kinto/plugins/accounts/views.py b/kinto/plugins/accounts/views.py\n--- a/kinto/plugins/accounts/views.py\n+++ b/kinto/plugins/accounts/views.py\n@@ -27,6 +27,12 @@\n raise http_error(httpexceptions.HTTPUnauthorized(), error=error_msg)\n \n \n+class AccountIdGenerator(NameGenerator):\n+ \"\"\"Allow @ signs in account IDs.\"\"\"\n+\n+ regexp = r'^[a-zA-Z0-9][.@a-zA-Z0-9_-]*$'\n+\n+\n class AccountSchema(resource.ResourceSchema):\n password = colander.SchemaNode(colander.String())\n \n@@ -55,7 +61,7 @@\n @reify\n def id_generator(self):\n # This generator is used for ID validation.\n- return NameGenerator()\n+ return AccountIdGenerator()\n \n def get_parent_id(self, request):\n # The whole challenge here is that we want to isolate what\n@@ -108,6 +114,14 @@\n if self.context.is_administrator or self.context.is_anonymous:\n return new\n \n+ # Do not let accounts be created without usernames.\n+ if self.model.id_field not in new:\n+ error_details = {\n+ 'name': 'data.id',\n+ 'description': 'Accounts must have an ID.',\n+ }\n+ raise_invalid(self.request, **error_details)\n+\n # Otherwise, we force the id to match the authenticated username.\n if new[self.model.id_field] != self.request.selected_userid:\n error_details = {\n", "issue": "500 when creating a new account with a POST and forgetting to put the ID\n```\r\n File \"kinto/plugins/accounts/views.py\", line 112, in process_record\r\n if new[self.model.id_field] != self.request.selected_userid:\r\nKeyError: 'id'\r\n```\n", "before_files": [{"content": "from kinto.authorization import PERMISSIONS_INHERITANCE_TREE\nfrom pyramid.exceptions import ConfigurationError\n\n\ndef includeme(config):\n config.add_api_capability(\n 'accounts',\n description='Manage user accounts.',\n url='https://kinto.readthedocs.io/en/latest/api/1.x/accounts.html')\n\n config.scan('kinto.plugins.accounts.views')\n\n PERMISSIONS_INHERITANCE_TREE[''].update({\n 'account:create': {}\n })\n PERMISSIONS_INHERITANCE_TREE['account'] = {\n 'write': {'account': ['write']},\n 'read': {'account': ['write', 'read']}\n 
}\n\n # Add some safety to avoid weird behaviour with basicauth default policy.\n settings = config.get_settings()\n auth_policies = settings['multiauth.policies']\n if 'basicauth' in auth_policies and 'account' in auth_policies:\n if auth_policies.index('basicauth') < auth_policies.index('account'):\n error_msg = (\"'basicauth' should not be mentioned before 'account' \"\n \"in 'multiauth.policies' setting.\")\n raise ConfigurationError(error_msg)\n", "path": "kinto/plugins/accounts/__init__.py"}, {"content": "import bcrypt\nimport colander\nfrom pyramid import httpexceptions\nfrom pyramid.decorator import reify\nfrom pyramid.security import Authenticated, Everyone\nfrom pyramid.settings import aslist\n\nfrom kinto.views import NameGenerator\nfrom kinto.core import resource\nfrom kinto.core.errors import raise_invalid, http_error\n\n\ndef _extract_posted_body_id(request):\n try:\n # Anonymous creation with POST.\n return request.json['data']['id']\n except (ValueError, KeyError):\n # Bad POST data.\n if request.method.lower() == 'post':\n error_details = {\n 'name': 'data.id',\n 'description': 'data.id in body: Required'\n }\n raise_invalid(request, **error_details)\n # Anonymous GET\n error_msg = 'Cannot read accounts.'\n raise http_error(httpexceptions.HTTPUnauthorized(), error=error_msg)\n\n\nclass AccountSchema(resource.ResourceSchema):\n password = colander.SchemaNode(colander.String())\n\n\[email protected]()\nclass Account(resource.ShareableResource):\n\n schema = AccountSchema\n\n def __init__(self, request, context):\n # Store if current user is administrator (before accessing get_parent_id())\n allowed_from_settings = request.registry.settings.get('account_write_principals', [])\n context.is_administrator = len(set(aslist(allowed_from_settings)) &\n set(request.prefixed_principals)) > 0\n # Shortcut to check if current is anonymous (before get_parent_id()).\n context.is_anonymous = Authenticated not in request.effective_principals\n\n super().__init__(request, context)\n\n # Overwrite the current principal set by ShareableResource.\n if self.model.current_principal == Everyone or context.is_administrator:\n # Creation is anonymous, but author with write perm is this:\n # XXX: only works if policy name is account in settings.\n self.model.current_principal = 'account:{}'.format(self.model.parent_id)\n\n @reify\n def id_generator(self):\n # This generator is used for ID validation.\n return NameGenerator()\n\n def get_parent_id(self, request):\n # The whole challenge here is that we want to isolate what\n # authenticated users can list, but give access to everything to\n # administrators.\n # Plus when anonymous create accounts, we have to set their parent id\n # to the same value they would obtain when authenticated.\n if self.context.is_administrator:\n if self.context.on_collection:\n # Accounts created by admin should have userid as parent.\n if request.method.lower() == 'post':\n return _extract_posted_body_id(request)\n else:\n # Admin see all accounts.\n return '*'\n else:\n # No pattern matching for admin on single record.\n return request.matchdict['id']\n\n if not self.context.is_anonymous:\n # Authenticated users see their own account only.\n return request.selected_userid\n\n # Anonymous creation with PUT.\n if 'id' in request.matchdict:\n return request.matchdict['id']\n\n return _extract_posted_body_id(request)\n\n def collection_post(self):\n result = super(Account, self).collection_post()\n if self.context.is_anonymous and self.request.response.status_code == 
200:\n error_details = {\n 'message': 'Account ID %r already exists' % result['data']['id']\n }\n raise http_error(httpexceptions.HTTPForbidden(), **error_details)\n return result\n\n def process_record(self, new, old=None):\n new = super(Account, self).process_record(new, old)\n\n # Store password safely in database as str\n # (bcrypt.hashpw returns base64 bytes).\n pwd_str = new[\"password\"].encode(encoding='utf-8')\n hashed = bcrypt.hashpw(pwd_str, bcrypt.gensalt())\n new[\"password\"] = hashed.decode(encoding='utf-8')\n\n # Administrators can reach other accounts and anonymous have no\n # selected_userid. So do not try to enforce.\n if self.context.is_administrator or self.context.is_anonymous:\n return new\n\n # Otherwise, we force the id to match the authenticated username.\n if new[self.model.id_field] != self.request.selected_userid:\n error_details = {\n 'name': 'data.id',\n 'description': 'Username and account ID do not match.',\n }\n raise_invalid(self.request, **error_details)\n\n return new\n", "path": "kinto/plugins/accounts/views.py"}], "after_files": [{"content": "from kinto.authorization import PERMISSIONS_INHERITANCE_TREE\nfrom pyramid.exceptions import ConfigurationError\n\n\ndef includeme(config):\n config.add_api_capability(\n 'accounts',\n description='Manage user accounts.',\n url='https://kinto.readthedocs.io/en/latest/api/1.x/accounts.html')\n\n config.scan('kinto.plugins.accounts.views')\n\n PERMISSIONS_INHERITANCE_TREE[''].update({\n 'account:create': {}\n })\n PERMISSIONS_INHERITANCE_TREE['account'] = {\n 'write': {'account': ['write']},\n 'read': {'account': ['write', 'read']}\n }\n\n # Add some safety to avoid weird behaviour with basicauth default policy.\n settings = config.get_settings()\n auth_policies = settings['multiauth.policies']\n if 'basicauth' in auth_policies and 'account' in auth_policies:\n if auth_policies.index('basicauth') < auth_policies.index('account'):\n error_msg = (\"'basicauth' should not be mentioned before 'account' \"\n \"in 'multiauth.policies' setting.\")\n raise ConfigurationError(error_msg)\n\n # We assume anyone in account_create_principals is to create\n # accounts for other people.\n # No one can create accounts for other people unless they are an\n # \"admin\", defined as someone matching account_write_principals.\n # Therefore any account that is in account_create_principals\n # should be in account_write_principals too.\n creators = set(settings.get('account_create_principals', '').split())\n admins = set(settings.get('account_write_principals', '').split())\n cant_create_anything = creators.difference(admins)\n # system.Everyone isn't an account.\n cant_create_anything.discard('system.Everyone')\n if cant_create_anything:\n message = ('Configuration has some principals in account_create_principals '\n 'but not in account_write_principals. These principals will only be '\n 'able to create their own accounts. 
This may not be what you want.\\n'\n 'If you want these users to be able to create accounts for other users, '\n 'add them to account_write_principals.\\n'\n 'Affected users: {}'.format(list(cant_create_anything)))\n\n raise ConfigurationError(message)\n", "path": "kinto/plugins/accounts/__init__.py"}, {"content": "import bcrypt\nimport colander\nfrom pyramid import httpexceptions\nfrom pyramid.decorator import reify\nfrom pyramid.security import Authenticated, Everyone\nfrom pyramid.settings import aslist\n\nfrom kinto.views import NameGenerator\nfrom kinto.core import resource\nfrom kinto.core.errors import raise_invalid, http_error\n\n\ndef _extract_posted_body_id(request):\n try:\n # Anonymous creation with POST.\n return request.json['data']['id']\n except (ValueError, KeyError):\n # Bad POST data.\n if request.method.lower() == 'post':\n error_details = {\n 'name': 'data.id',\n 'description': 'data.id in body: Required'\n }\n raise_invalid(request, **error_details)\n # Anonymous GET\n error_msg = 'Cannot read accounts.'\n raise http_error(httpexceptions.HTTPUnauthorized(), error=error_msg)\n\n\nclass AccountIdGenerator(NameGenerator):\n \"\"\"Allow @ signs in account IDs.\"\"\"\n\n regexp = r'^[a-zA-Z0-9][.@a-zA-Z0-9_-]*$'\n\n\nclass AccountSchema(resource.ResourceSchema):\n password = colander.SchemaNode(colander.String())\n\n\[email protected]()\nclass Account(resource.ShareableResource):\n\n schema = AccountSchema\n\n def __init__(self, request, context):\n # Store if current user is administrator (before accessing get_parent_id())\n allowed_from_settings = request.registry.settings.get('account_write_principals', [])\n context.is_administrator = len(set(aslist(allowed_from_settings)) &\n set(request.prefixed_principals)) > 0\n # Shortcut to check if current is anonymous (before get_parent_id()).\n context.is_anonymous = Authenticated not in request.effective_principals\n\n super().__init__(request, context)\n\n # Overwrite the current principal set by ShareableResource.\n if self.model.current_principal == Everyone or context.is_administrator:\n # Creation is anonymous, but author with write perm is this:\n # XXX: only works if policy name is account in settings.\n self.model.current_principal = 'account:{}'.format(self.model.parent_id)\n\n @reify\n def id_generator(self):\n # This generator is used for ID validation.\n return AccountIdGenerator()\n\n def get_parent_id(self, request):\n # The whole challenge here is that we want to isolate what\n # authenticated users can list, but give access to everything to\n # administrators.\n # Plus when anonymous create accounts, we have to set their parent id\n # to the same value they would obtain when authenticated.\n if self.context.is_administrator:\n if self.context.on_collection:\n # Accounts created by admin should have userid as parent.\n if request.method.lower() == 'post':\n return _extract_posted_body_id(request)\n else:\n # Admin see all accounts.\n return '*'\n else:\n # No pattern matching for admin on single record.\n return request.matchdict['id']\n\n if not self.context.is_anonymous:\n # Authenticated users see their own account only.\n return request.selected_userid\n\n # Anonymous creation with PUT.\n if 'id' in request.matchdict:\n return request.matchdict['id']\n\n return _extract_posted_body_id(request)\n\n def collection_post(self):\n result = super(Account, self).collection_post()\n if self.context.is_anonymous and self.request.response.status_code == 200:\n error_details = {\n 'message': 'Account ID %r already 
exists' % result['data']['id']\n }\n raise http_error(httpexceptions.HTTPForbidden(), **error_details)\n return result\n\n def process_record(self, new, old=None):\n new = super(Account, self).process_record(new, old)\n\n # Store password safely in database as str\n # (bcrypt.hashpw returns base64 bytes).\n pwd_str = new[\"password\"].encode(encoding='utf-8')\n hashed = bcrypt.hashpw(pwd_str, bcrypt.gensalt())\n new[\"password\"] = hashed.decode(encoding='utf-8')\n\n # Administrators can reach other accounts and anonymous have no\n # selected_userid. So do not try to enforce.\n if self.context.is_administrator or self.context.is_anonymous:\n return new\n\n # Do not let accounts be created without usernames.\n if self.model.id_field not in new:\n error_details = {\n 'name': 'data.id',\n 'description': 'Accounts must have an ID.',\n }\n raise_invalid(self.request, **error_details)\n\n # Otherwise, we force the id to match the authenticated username.\n if new[self.model.id_field] != self.request.selected_userid:\n error_details = {\n 'name': 'data.id',\n 'description': 'Username and account ID do not match.',\n }\n raise_invalid(self.request, **error_details)\n\n return new\n", "path": "kinto/plugins/accounts/views.py"}]} | 1,834 | 701 |
gh_patches_debug_24286 | rasdani/github-patches | git_diff | e-valuation__EvaP-1822 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Response status code of failed redemption is 200
As @niklasmohrin remarked in [#1790](https://github.com/e-valuation/EvaP/pull/1790/files#r962983692), in `evap.rewards.views.redeem_reward_points`, the status code of failed redemptions (e.g. due to `NotEnoughPoints` or `RedemptionEventExpired`) is set to 200 OK, even though no redemption points were saved.
Instead, the status code should be something like 400 Bad Request to underline that something went wrong.
@niklasmohrin added that `assertContains`, used in some tests in `evap.rewards.tests.test_views.TestIndexView`, needs to be adapted, as it asserts that the status code is 200 by default.
--- END ISSUE ---
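Django's `render()` accepts a `status` keyword, so the view can keep rendering its normal template while reporting failure. A minimal sketch under that assumption (the exception class and template name are stand-ins for the real ones):

```python
from django.shortcuts import render


class NotEnoughPoints(Exception):
    """Stand-in for the rewards exceptions named in the issue."""


def index(request):
    status = 200
    try:
        raise NotEnoughPoints("too few points")  # stand-in for save_redemptions()
    except NotEnoughPoints:
        status = 400  # a failed redemption should not report 200 OK
    # render() forwards status to the HttpResponse it builds:
    return render(request, "rewards_index.html", {}, status=status)
```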
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `evap/rewards/views.py`
Content:
```
1 from datetime import datetime
2
3 from django.contrib import messages
4 from django.core.exceptions import BadRequest, SuspiciousOperation
5 from django.http import HttpResponse
6 from django.shortcuts import get_object_or_404, redirect, render
7 from django.utils.translation import get_language
8 from django.utils.translation import gettext as _
9 from django.views.decorators.http import require_POST
10
11 from evap.evaluation.auth import manager_required, reward_user_required
12 from evap.evaluation.models import Semester
13 from evap.evaluation.tools import AttachmentResponse, get_object_from_dict_pk_entry_or_logged_40x
14 from evap.rewards.exporters import RewardsExporter
15 from evap.rewards.forms import RewardPointRedemptionEventForm
16 from evap.rewards.models import (
17 NoPointsSelected,
18 NotEnoughPoints,
19 RedemptionEventExpired,
20 RewardPointGranting,
21 RewardPointRedemption,
22 RewardPointRedemptionEvent,
23 SemesterActivation,
24 )
25 from evap.rewards.tools import grant_eligible_reward_points_for_semester, reward_points_of_user, save_redemptions
26 from evap.staff.views import semester_view
27
28
29 @reward_user_required
30 def index(request):
31 if request.method == "POST":
32 redemptions = {}
33 try:
34 for key, value in request.POST.items():
35 if key.startswith("points-"):
36 event_id = int(key.rpartition("-")[2])
37 redemptions[event_id] = int(value)
38 except ValueError as e:
39 raise BadRequest from e
40
41 try:
42 save_redemptions(request, redemptions)
43 messages.success(request, _("You successfully redeemed your points."))
44 except (NoPointsSelected, NotEnoughPoints, RedemptionEventExpired) as error:
45 messages.warning(request, error)
46
47 total_points_available = reward_points_of_user(request.user)
48 reward_point_grantings = RewardPointGranting.objects.filter(user_profile=request.user)
49 reward_point_redemptions = RewardPointRedemption.objects.filter(user_profile=request.user)
50 events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__gte=datetime.now()).order_by("date")
51
52 reward_point_actions = []
53 for granting in reward_point_grantings:
54 reward_point_actions.append(
55 (granting.granting_time, _("Reward for") + " " + granting.semester.name, granting.value, "")
56 )
57 for redemption in reward_point_redemptions:
58 reward_point_actions.append((redemption.redemption_time, redemption.event.name, "", redemption.value))
59
60 reward_point_actions.sort(key=lambda action: action[0], reverse=True)
61
62 template_data = dict(
63 reward_point_actions=reward_point_actions,
64 total_points_available=total_points_available,
65 events=events,
66 )
67 return render(request, "rewards_index.html", template_data)
68
69
70 @manager_required
71 def reward_point_redemption_events(request):
72 upcoming_events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__gte=datetime.now()).order_by("date")
73 past_events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__lt=datetime.now()).order_by("-date")
74 template_data = dict(upcoming_events=upcoming_events, past_events=past_events)
75 return render(request, "rewards_reward_point_redemption_events.html", template_data)
76
77
78 @manager_required
79 def reward_point_redemption_event_create(request):
80 event = RewardPointRedemptionEvent()
81 form = RewardPointRedemptionEventForm(request.POST or None, instance=event)
82
83 if form.is_valid():
84 form.save()
85 messages.success(request, _("Successfully created event."))
86 return redirect("rewards:reward_point_redemption_events")
87
88 return render(request, "rewards_reward_point_redemption_event_form.html", dict(form=form))
89
90
91 @manager_required
92 def reward_point_redemption_event_edit(request, event_id):
93 event = get_object_or_404(RewardPointRedemptionEvent, id=event_id)
94 form = RewardPointRedemptionEventForm(request.POST or None, instance=event)
95
96 if form.is_valid():
97 event = form.save()
98
99 messages.success(request, _("Successfully updated event."))
100 return redirect("rewards:reward_point_redemption_events")
101
102 return render(request, "rewards_reward_point_redemption_event_form.html", dict(event=event, form=form))
103
104
105 @require_POST
106 @manager_required
107 def reward_point_redemption_event_delete(request):
108 event = get_object_from_dict_pk_entry_or_logged_40x(RewardPointRedemptionEvent, request.POST, "event_id")
109
110 if not event.can_delete:
111 raise SuspiciousOperation("Deleting redemption event not allowed")
112 event.delete()
113 return HttpResponse() # 200 OK
114
115
116 @manager_required
117 def reward_point_redemption_event_export(request, event_id):
118 event = get_object_or_404(RewardPointRedemptionEvent, id=event_id)
119
120 filename = _("RewardPoints") + f"-{event.date}-{event.name}-{get_language()}.xls"
121 response = AttachmentResponse(filename, content_type="application/vnd.ms-excel")
122
123 RewardsExporter().export(response, event.redemptions_by_user())
124
125 return response
126
127
128 @manager_required
129 def semester_activation(request, semester_id, active):
130 semester = get_object_or_404(Semester, id=semester_id)
131 active = active == "on"
132
133 SemesterActivation.objects.update_or_create(semester=semester, defaults={"is_active": active})
134 if active:
135 grant_eligible_reward_points_for_semester(request, semester)
136
137 return semester_view(request=request, semester_id=semester_id)
138
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/evap/rewards/views.py b/evap/rewards/views.py
--- a/evap/rewards/views.py
+++ b/evap/rewards/views.py
@@ -28,6 +28,8 @@
@reward_user_required
def index(request):
+ # pylint: disable=too-many-locals
+ status = 200
if request.method == "POST":
redemptions = {}
try:
@@ -43,6 +45,7 @@
messages.success(request, _("You successfully redeemed your points."))
except (NoPointsSelected, NotEnoughPoints, RedemptionEventExpired) as error:
messages.warning(request, error)
+ status = 400
total_points_available = reward_points_of_user(request.user)
reward_point_grantings = RewardPointGranting.objects.filter(user_profile=request.user)
@@ -64,7 +67,7 @@
total_points_available=total_points_available,
events=events,
)
- return render(request, "rewards_index.html", template_data)
+ return render(request, "rewards_index.html", template_data, status=status)
@manager_required
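The diff above only changes the view; the test-side adjustment the issue mentions is separate. Since `assertContains` checks `status_code=200` by default, tests covering failed redemptions would pass the expected 400 explicitly. A hypothetical sketch (URL, posted data, and asserted text are assumptions):

```python
from django.test import TestCase


class TestIndexView(TestCase):  # hypothetical test, for illustration only
    def test_failed_redemption_returns_400(self):
        response = self.client.post("/rewards/", {"points-1": "9999"})
        # assertContains defaults to status_code=200; override it for failures:
        self.assertContains(response, "points", status_code=400)
```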
| {"golden_diff": "diff --git a/evap/rewards/views.py b/evap/rewards/views.py\n--- a/evap/rewards/views.py\n+++ b/evap/rewards/views.py\n@@ -28,6 +28,8 @@\n \n @reward_user_required\n def index(request):\n+ # pylint: disable=too-many-locals\n+ status = 200\n if request.method == \"POST\":\n redemptions = {}\n try:\n@@ -43,6 +45,7 @@\n messages.success(request, _(\"You successfully redeemed your points.\"))\n except (NoPointsSelected, NotEnoughPoints, RedemptionEventExpired) as error:\n messages.warning(request, error)\n+ status = 400\n \n total_points_available = reward_points_of_user(request.user)\n reward_point_grantings = RewardPointGranting.objects.filter(user_profile=request.user)\n@@ -64,7 +67,7 @@\n total_points_available=total_points_available,\n events=events,\n )\n- return render(request, \"rewards_index.html\", template_data)\n+ return render(request, \"rewards_index.html\", template_data, status=status)\n \n \n @manager_required\n", "issue": "Response status code of failed redemption is 200\nAs @niklasmohrin remarked in [#1790](https://github.com/e-valuation/EvaP/pull/1790/files#r962983692), in `evap.rewards.views.redeem_reward_points`, the status code of failed redemptions (e.g. due to `NotEnoughPoints` or `RedemptionEventExpired`) is set as 200 OK, even though no redemption points were saved. \r\n\r\nInstead, the status code should be something like 400 Bad Request to underline that something went wrong.\r\n@niklasmohrin added, that `assertContains`, used in some tests in `evap.rewards.tests.test_views.TestIndexView`, needs to adopted, as it asserts that the status code is 200 by default.\n", "before_files": [{"content": "from datetime import datetime\n\nfrom django.contrib import messages\nfrom django.core.exceptions import BadRequest, SuspiciousOperation\nfrom django.http import HttpResponse\nfrom django.shortcuts import get_object_or_404, redirect, render\nfrom django.utils.translation import get_language\nfrom django.utils.translation import gettext as _\nfrom django.views.decorators.http import require_POST\n\nfrom evap.evaluation.auth import manager_required, reward_user_required\nfrom evap.evaluation.models import Semester\nfrom evap.evaluation.tools import AttachmentResponse, get_object_from_dict_pk_entry_or_logged_40x\nfrom evap.rewards.exporters import RewardsExporter\nfrom evap.rewards.forms import RewardPointRedemptionEventForm\nfrom evap.rewards.models import (\n NoPointsSelected,\n NotEnoughPoints,\n RedemptionEventExpired,\n RewardPointGranting,\n RewardPointRedemption,\n RewardPointRedemptionEvent,\n SemesterActivation,\n)\nfrom evap.rewards.tools import grant_eligible_reward_points_for_semester, reward_points_of_user, save_redemptions\nfrom evap.staff.views import semester_view\n\n\n@reward_user_required\ndef index(request):\n if request.method == \"POST\":\n redemptions = {}\n try:\n for key, value in request.POST.items():\n if key.startswith(\"points-\"):\n event_id = int(key.rpartition(\"-\")[2])\n redemptions[event_id] = int(value)\n except ValueError as e:\n raise BadRequest from e\n\n try:\n save_redemptions(request, redemptions)\n messages.success(request, _(\"You successfully redeemed your points.\"))\n except (NoPointsSelected, NotEnoughPoints, RedemptionEventExpired) as error:\n messages.warning(request, error)\n\n total_points_available = reward_points_of_user(request.user)\n reward_point_grantings = RewardPointGranting.objects.filter(user_profile=request.user)\n reward_point_redemptions = 
RewardPointRedemption.objects.filter(user_profile=request.user)\n events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__gte=datetime.now()).order_by(\"date\")\n\n reward_point_actions = []\n for granting in reward_point_grantings:\n reward_point_actions.append(\n (granting.granting_time, _(\"Reward for\") + \" \" + granting.semester.name, granting.value, \"\")\n )\n for redemption in reward_point_redemptions:\n reward_point_actions.append((redemption.redemption_time, redemption.event.name, \"\", redemption.value))\n\n reward_point_actions.sort(key=lambda action: action[0], reverse=True)\n\n template_data = dict(\n reward_point_actions=reward_point_actions,\n total_points_available=total_points_available,\n events=events,\n )\n return render(request, \"rewards_index.html\", template_data)\n\n\n@manager_required\ndef reward_point_redemption_events(request):\n upcoming_events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__gte=datetime.now()).order_by(\"date\")\n past_events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__lt=datetime.now()).order_by(\"-date\")\n template_data = dict(upcoming_events=upcoming_events, past_events=past_events)\n return render(request, \"rewards_reward_point_redemption_events.html\", template_data)\n\n\n@manager_required\ndef reward_point_redemption_event_create(request):\n event = RewardPointRedemptionEvent()\n form = RewardPointRedemptionEventForm(request.POST or None, instance=event)\n\n if form.is_valid():\n form.save()\n messages.success(request, _(\"Successfully created event.\"))\n return redirect(\"rewards:reward_point_redemption_events\")\n\n return render(request, \"rewards_reward_point_redemption_event_form.html\", dict(form=form))\n\n\n@manager_required\ndef reward_point_redemption_event_edit(request, event_id):\n event = get_object_or_404(RewardPointRedemptionEvent, id=event_id)\n form = RewardPointRedemptionEventForm(request.POST or None, instance=event)\n\n if form.is_valid():\n event = form.save()\n\n messages.success(request, _(\"Successfully updated event.\"))\n return redirect(\"rewards:reward_point_redemption_events\")\n\n return render(request, \"rewards_reward_point_redemption_event_form.html\", dict(event=event, form=form))\n\n\n@require_POST\n@manager_required\ndef reward_point_redemption_event_delete(request):\n event = get_object_from_dict_pk_entry_or_logged_40x(RewardPointRedemptionEvent, request.POST, \"event_id\")\n\n if not event.can_delete:\n raise SuspiciousOperation(\"Deleting redemption event not allowed\")\n event.delete()\n return HttpResponse() # 200 OK\n\n\n@manager_required\ndef reward_point_redemption_event_export(request, event_id):\n event = get_object_or_404(RewardPointRedemptionEvent, id=event_id)\n\n filename = _(\"RewardPoints\") + f\"-{event.date}-{event.name}-{get_language()}.xls\"\n response = AttachmentResponse(filename, content_type=\"application/vnd.ms-excel\")\n\n RewardsExporter().export(response, event.redemptions_by_user())\n\n return response\n\n\n@manager_required\ndef semester_activation(request, semester_id, active):\n semester = get_object_or_404(Semester, id=semester_id)\n active = active == \"on\"\n\n SemesterActivation.objects.update_or_create(semester=semester, defaults={\"is_active\": active})\n if active:\n grant_eligible_reward_points_for_semester(request, semester)\n\n return semester_view(request=request, semester_id=semester_id)\n", "path": "evap/rewards/views.py"}], "after_files": [{"content": "from datetime import datetime\n\nfrom django.contrib import 
messages\nfrom django.core.exceptions import BadRequest, SuspiciousOperation\nfrom django.http import HttpResponse\nfrom django.shortcuts import get_object_or_404, redirect, render\nfrom django.utils.translation import get_language\nfrom django.utils.translation import gettext as _\nfrom django.views.decorators.http import require_POST\n\nfrom evap.evaluation.auth import manager_required, reward_user_required\nfrom evap.evaluation.models import Semester\nfrom evap.evaluation.tools import AttachmentResponse, get_object_from_dict_pk_entry_or_logged_40x\nfrom evap.rewards.exporters import RewardsExporter\nfrom evap.rewards.forms import RewardPointRedemptionEventForm\nfrom evap.rewards.models import (\n NoPointsSelected,\n NotEnoughPoints,\n RedemptionEventExpired,\n RewardPointGranting,\n RewardPointRedemption,\n RewardPointRedemptionEvent,\n SemesterActivation,\n)\nfrom evap.rewards.tools import grant_eligible_reward_points_for_semester, reward_points_of_user, save_redemptions\nfrom evap.staff.views import semester_view\n\n\n@reward_user_required\ndef index(request):\n # pylint: disable=too-many-locals\n status = 200\n if request.method == \"POST\":\n redemptions = {}\n try:\n for key, value in request.POST.items():\n if key.startswith(\"points-\"):\n event_id = int(key.rpartition(\"-\")[2])\n redemptions[event_id] = int(value)\n except ValueError as e:\n raise BadRequest from e\n\n try:\n save_redemptions(request, redemptions)\n messages.success(request, _(\"You successfully redeemed your points.\"))\n except (NoPointsSelected, NotEnoughPoints, RedemptionEventExpired) as error:\n messages.warning(request, error)\n status = 400\n\n total_points_available = reward_points_of_user(request.user)\n reward_point_grantings = RewardPointGranting.objects.filter(user_profile=request.user)\n reward_point_redemptions = RewardPointRedemption.objects.filter(user_profile=request.user)\n events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__gte=datetime.now()).order_by(\"date\")\n\n reward_point_actions = []\n for granting in reward_point_grantings:\n reward_point_actions.append(\n (granting.granting_time, _(\"Reward for\") + \" \" + granting.semester.name, granting.value, \"\")\n )\n for redemption in reward_point_redemptions:\n reward_point_actions.append((redemption.redemption_time, redemption.event.name, \"\", redemption.value))\n\n reward_point_actions.sort(key=lambda action: action[0], reverse=True)\n\n template_data = dict(\n reward_point_actions=reward_point_actions,\n total_points_available=total_points_available,\n events=events,\n )\n return render(request, \"rewards_index.html\", template_data, status=status)\n\n\n@manager_required\ndef reward_point_redemption_events(request):\n upcoming_events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__gte=datetime.now()).order_by(\"date\")\n past_events = RewardPointRedemptionEvent.objects.filter(redeem_end_date__lt=datetime.now()).order_by(\"-date\")\n template_data = dict(upcoming_events=upcoming_events, past_events=past_events)\n return render(request, \"rewards_reward_point_redemption_events.html\", template_data)\n\n\n@manager_required\ndef reward_point_redemption_event_create(request):\n event = RewardPointRedemptionEvent()\n form = RewardPointRedemptionEventForm(request.POST or None, instance=event)\n\n if form.is_valid():\n form.save()\n messages.success(request, _(\"Successfully created event.\"))\n return redirect(\"rewards:reward_point_redemption_events\")\n\n return render(request, 
\"rewards_reward_point_redemption_event_form.html\", dict(form=form))\n\n\n@manager_required\ndef reward_point_redemption_event_edit(request, event_id):\n event = get_object_or_404(RewardPointRedemptionEvent, id=event_id)\n form = RewardPointRedemptionEventForm(request.POST or None, instance=event)\n\n if form.is_valid():\n event = form.save()\n\n messages.success(request, _(\"Successfully updated event.\"))\n return redirect(\"rewards:reward_point_redemption_events\")\n\n return render(request, \"rewards_reward_point_redemption_event_form.html\", dict(event=event, form=form))\n\n\n@require_POST\n@manager_required\ndef reward_point_redemption_event_delete(request):\n event = get_object_from_dict_pk_entry_or_logged_40x(RewardPointRedemptionEvent, request.POST, \"event_id\")\n\n if not event.can_delete:\n raise SuspiciousOperation(\"Deleting redemption event not allowed\")\n event.delete()\n return HttpResponse() # 200 OK\n\n\n@manager_required\ndef reward_point_redemption_event_export(request, event_id):\n event = get_object_or_404(RewardPointRedemptionEvent, id=event_id)\n\n filename = _(\"RewardPoints\") + f\"-{event.date}-{event.name}-{get_language()}.xls\"\n response = AttachmentResponse(filename, content_type=\"application/vnd.ms-excel\")\n\n RewardsExporter().export(response, event.redemptions_by_user())\n\n return response\n\n\n@manager_required\ndef semester_activation(request, semester_id, active):\n semester = get_object_or_404(Semester, id=semester_id)\n active = active == \"on\"\n\n SemesterActivation.objects.update_or_create(semester=semester, defaults={\"is_active\": active})\n if active:\n grant_eligible_reward_points_for_semester(request, semester)\n\n return semester_view(request=request, semester_id=semester_id)\n", "path": "evap/rewards/views.py"}]} | 1,924 | 255 |
gh_patches_debug_54565 | rasdani/github-patches | git_diff | dbt-labs__dbt-core-2832 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
set colorama upper bound to <0.4.4
colorama v0.4.4 (released in the last 24 hours) is missing an sdist, which trips up the homebrew packaging step of our [dbt release flow](https://github.com/fishtown-analytics/dbt-release/runs/1249693542). Let's set the [upper bound](https://github.com/fishtown-analytics/dbt/blob/dev/kiyoshi-kuromiya/core/setup.py#L67) to <0.4.4 instead of <0.5 for now.
--- END ISSUE ---
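Homebrew builds from the sdist, which is why a wheels-only release breaks the packaging step. One hedged way to check whether a given release ships an sdist, using PyPI's public JSON API (requires network access; not part of dbt's actual release tooling):

```python
import json
import urllib.request

# Query PyPI's JSON API for the files attached to a specific release.
url = "https://pypi.org/pypi/colorama/0.4.4/json"
with urllib.request.urlopen(url) as resp:
    release = json.load(resp)

has_sdist = any(f["packagetype"] == "sdist" for f in release["urls"])
print("sdist available:", has_sdist)
```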
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/setup.py`
Content:
```
1 #!/usr/bin/env python
2 import os
3 import sys
4
5 if sys.version_info < (3, 6):
6 print('Error: dbt does not support this version of Python.')
7 print('Please upgrade to Python 3.6 or higher.')
8 sys.exit(1)
9
10
11 from setuptools import setup
12 try:
13 from setuptools import find_namespace_packages
14 except ImportError:
15 # the user has a downlevel version of setuptools.
16 print('Error: dbt requires setuptools v40.1.0 or higher.')
17 print('Please upgrade setuptools with "pip install --upgrade setuptools" '
18 'and try again')
19 sys.exit(1)
20
21
22 def read(fname):
23 return open(os.path.join(os.path.dirname(__file__), fname)).read()
24
25
26 package_name = "dbt-core"
27 package_version = "0.18.1rc1"
28 description = """dbt (data build tool) is a command line tool that helps \
29 analysts and engineers transform data in their warehouse more effectively"""
30
31
32 setup(
33 name=package_name,
34 version=package_version,
35 description=description,
36 long_description=description,
37 author="Fishtown Analytics",
38 author_email="[email protected]",
39 url="https://github.com/fishtown-analytics/dbt",
40 packages=find_namespace_packages(include=['dbt', 'dbt.*']),
41 package_data={
42 'dbt': [
43 'include/index.html',
44 'include/global_project/dbt_project.yml',
45 'include/global_project/docs/*.md',
46 'include/global_project/macros/*.sql',
47 'include/global_project/macros/**/*.sql',
48 'include/global_project/macros/**/**/*.sql',
49 'py.typed',
50 ]
51 },
52 test_suite='test',
53 entry_points={
54 'console_scripts': [
55 'dbt = dbt.main:main',
56 ],
57 },
58 scripts=[
59 'scripts/dbt',
60 ],
61 install_requires=[
62 'Jinja2==2.11.2',
63 'PyYAML>=3.11',
64 'sqlparse>=0.2.3,<0.4',
65 'networkx>=2.3,<3',
66 'minimal-snowplow-tracker==0.0.2',
67 'colorama>=0.3.9,<0.5',
68 'agate>=1.6,<2',
69 'isodate>=0.6,<0.7',
70 'json-rpc>=1.12,<2',
71 'werkzeug>=0.15,<0.17',
72 'dataclasses==0.6;python_version<"3.7"',
73 'hologram==0.0.10',
74 'logbook>=1.5,<1.6',
75 'typing-extensions>=3.7.4,<3.8',
76 # the following are all to match snowflake-connector-python
77 'requests>=2.18.0,<2.24.0',
78 'idna<2.10',
79 'cffi>=1.9,<1.15',
80 ],
81 zip_safe=False,
82 classifiers=[
83 'Development Status :: 5 - Production/Stable',
84
85 'License :: OSI Approved :: Apache Software License',
86
87 'Operating System :: Microsoft :: Windows',
88 'Operating System :: MacOS :: MacOS X',
89 'Operating System :: POSIX :: Linux',
90
91 'Programming Language :: Python :: 3.6',
92 'Programming Language :: Python :: 3.7',
93 'Programming Language :: Python :: 3.8',
94 ],
95 python_requires=">=3.6.3",
96 )
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/setup.py b/core/setup.py
--- a/core/setup.py
+++ b/core/setup.py
@@ -64,7 +64,7 @@
'sqlparse>=0.2.3,<0.4',
'networkx>=2.3,<3',
'minimal-snowplow-tracker==0.0.2',
- 'colorama>=0.3.9,<0.5',
+ 'colorama>=0.3.9,<0.4.4',
'agate>=1.6,<2',
'isodate>=0.6,<0.7',
'json-rpc>=1.12,<2',
| {"golden_diff": "diff --git a/core/setup.py b/core/setup.py\n--- a/core/setup.py\n+++ b/core/setup.py\n@@ -64,7 +64,7 @@\n 'sqlparse>=0.2.3,<0.4',\n 'networkx>=2.3,<3',\n 'minimal-snowplow-tracker==0.0.2',\n- 'colorama>=0.3.9,<0.5',\n+ 'colorama>=0.3.9,<0.4.4',\n 'agate>=1.6,<2',\n 'isodate>=0.6,<0.7',\n 'json-rpc>=1.12,<2',\n", "issue": "set colorama upper bound to <0.4.4\ncolorama v0.4.4 (released in the last 24 hours) is missing an sdist, which trips up the homebrew packaging step of our [dbt release flow](https://github.com/fishtown-analytics/dbt-release/runs/1249693542). Let's set the [upper bound](https://github.com/fishtown-analytics/dbt/blob/dev/kiyoshi-kuromiya/core/setup.py#L67) to <0.4.4 instead of <0.5 for now.\n", "before_files": [{"content": "#!/usr/bin/env python\nimport os\nimport sys\n\nif sys.version_info < (3, 6):\n print('Error: dbt does not support this version of Python.')\n print('Please upgrade to Python 3.6 or higher.')\n sys.exit(1)\n\n\nfrom setuptools import setup\ntry:\n from setuptools import find_namespace_packages\nexcept ImportError:\n # the user has a downlevel version of setuptools.\n print('Error: dbt requires setuptools v40.1.0 or higher.')\n print('Please upgrade setuptools with \"pip install --upgrade setuptools\" '\n 'and try again')\n sys.exit(1)\n\n\ndef read(fname):\n return open(os.path.join(os.path.dirname(__file__), fname)).read()\n\n\npackage_name = \"dbt-core\"\npackage_version = \"0.18.1rc1\"\ndescription = \"\"\"dbt (data build tool) is a command line tool that helps \\\nanalysts and engineers transform data in their warehouse more effectively\"\"\"\n\n\nsetup(\n name=package_name,\n version=package_version,\n description=description,\n long_description=description,\n author=\"Fishtown Analytics\",\n author_email=\"[email protected]\",\n url=\"https://github.com/fishtown-analytics/dbt\",\n packages=find_namespace_packages(include=['dbt', 'dbt.*']),\n package_data={\n 'dbt': [\n 'include/index.html',\n 'include/global_project/dbt_project.yml',\n 'include/global_project/docs/*.md',\n 'include/global_project/macros/*.sql',\n 'include/global_project/macros/**/*.sql',\n 'include/global_project/macros/**/**/*.sql',\n 'py.typed',\n ]\n },\n test_suite='test',\n entry_points={\n 'console_scripts': [\n 'dbt = dbt.main:main',\n ],\n },\n scripts=[\n 'scripts/dbt',\n ],\n install_requires=[\n 'Jinja2==2.11.2',\n 'PyYAML>=3.11',\n 'sqlparse>=0.2.3,<0.4',\n 'networkx>=2.3,<3',\n 'minimal-snowplow-tracker==0.0.2',\n 'colorama>=0.3.9,<0.5',\n 'agate>=1.6,<2',\n 'isodate>=0.6,<0.7',\n 'json-rpc>=1.12,<2',\n 'werkzeug>=0.15,<0.17',\n 'dataclasses==0.6;python_version<\"3.7\"',\n 'hologram==0.0.10',\n 'logbook>=1.5,<1.6',\n 'typing-extensions>=3.7.4,<3.8',\n # the following are all to match snowflake-connector-python\n 'requests>=2.18.0,<2.24.0',\n 'idna<2.10',\n 'cffi>=1.9,<1.15',\n ],\n zip_safe=False,\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n\n 'License :: OSI Approved :: Apache Software License',\n\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: MacOS :: MacOS X',\n 'Operating System :: POSIX :: Linux',\n\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n ],\n python_requires=\">=3.6.3\",\n)\n", "path": "core/setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\nimport os\nimport sys\n\nif sys.version_info < (3, 6):\n print('Error: dbt does not support this version of Python.')\n print('Please upgrade to Python 3.6 or 
higher.')\n sys.exit(1)\n\n\nfrom setuptools import setup\ntry:\n from setuptools import find_namespace_packages\nexcept ImportError:\n # the user has a downlevel version of setuptools.\n print('Error: dbt requires setuptools v40.1.0 or higher.')\n print('Please upgrade setuptools with \"pip install --upgrade setuptools\" '\n 'and try again')\n sys.exit(1)\n\n\ndef read(fname):\n return open(os.path.join(os.path.dirname(__file__), fname)).read()\n\n\npackage_name = \"dbt-core\"\npackage_version = \"0.18.1rc1\"\ndescription = \"\"\"dbt (data build tool) is a command line tool that helps \\\nanalysts and engineers transform data in their warehouse more effectively\"\"\"\n\n\nsetup(\n name=package_name,\n version=package_version,\n description=description,\n long_description=description,\n author=\"Fishtown Analytics\",\n author_email=\"[email protected]\",\n url=\"https://github.com/fishtown-analytics/dbt\",\n packages=find_namespace_packages(include=['dbt', 'dbt.*']),\n package_data={\n 'dbt': [\n 'include/index.html',\n 'include/global_project/dbt_project.yml',\n 'include/global_project/docs/*.md',\n 'include/global_project/macros/*.sql',\n 'include/global_project/macros/**/*.sql',\n 'include/global_project/macros/**/**/*.sql',\n 'py.typed',\n ]\n },\n test_suite='test',\n entry_points={\n 'console_scripts': [\n 'dbt = dbt.main:main',\n ],\n },\n scripts=[\n 'scripts/dbt',\n ],\n install_requires=[\n 'Jinja2==2.11.2',\n 'PyYAML>=3.11',\n 'sqlparse>=0.2.3,<0.4',\n 'networkx>=2.3,<3',\n 'minimal-snowplow-tracker==0.0.2',\n 'colorama>=0.3.9,<0.4.4',\n 'agate>=1.6,<2',\n 'isodate>=0.6,<0.7',\n 'json-rpc>=1.12,<2',\n 'werkzeug>=0.15,<0.17',\n 'dataclasses==0.6;python_version<\"3.7\"',\n 'hologram==0.0.10',\n 'logbook>=1.5,<1.6',\n 'typing-extensions>=3.7.4,<3.8',\n # the following are all to match snowflake-connector-python\n 'requests>=2.18.0,<2.24.0',\n 'idna<2.10',\n 'cffi>=1.9,<1.15',\n ],\n zip_safe=False,\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n\n 'License :: OSI Approved :: Apache Software License',\n\n 'Operating System :: Microsoft :: Windows',\n 'Operating System :: MacOS :: MacOS X',\n 'Operating System :: POSIX :: Linux',\n\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n ],\n python_requires=\">=3.6.3\",\n)\n", "path": "core/setup.py"}]} | 1,347 | 148 |
gh_patches_debug_14724 | rasdani/github-patches | git_diff | scikit-hep__pyhf-235 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
loosen numpy requirements for non-extra installs
# Description
We are pretty restrictive in the numpy version range because we try to conform to TF's valid range, but TF is only one of the backends. For a plain `pip install pyhf` we should not force users into a specific range unless we actually require those APIs.
`numpy>=1.14.0` should be enough unless I'm missing something. @kratsg, since you changed this last, do you see any reason to restrict numpy further?
--- END ISSUE ---
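The compatibility reasoning above is easy to sanity-check mechanically. The sketch below is not from the pyhf repository; it only takes the version windows quoted in the issue and the eventual fix, and confirms that a loose core pin still overlaps the TensorFlow extra's window:
```python
from packaging.specifiers import SpecifierSet  # pip install packaging

core_pin = SpecifierSet(">=1.14.0")            # proposed loose core requirement
tf_window = SpecifierSet(">=1.14.0,<=1.14.5")  # pin kept under the tensorflow extra

# Any version in the intersection satisfies both, so a plain `pip install pyhf`
# can stay loose while `pyhf[tensorflow]` still resolves.
candidate = "1.14.3"
print(candidate in core_pin and candidate in tf_window)  # True
```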
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2 setup(
3 name = 'pyhf',
4 version = '0.0.15',
5 description = '(partial) pure python histfactory implementation',
6 url = '',
7 author = 'Lukas Heinrich',
8 author_email = '[email protected]',
9 packages = find_packages(),
10 include_package_data = True,
11 install_requires = [
12 'numpy<=1.14.5,>=1.14.3', # required by tensorflow, mxnet, and us
13 'scipy',
14 'click>=6.0', # for console scripts,
15 'tqdm', # for readxml
16 'six', # for modifiers
17 'jsonschema>=v3.0.0a2', # for utils, alpha-release for draft 6
18 ],
19 extras_require = {
20 'xmlimport': [
21 'uproot',
22 ],
23 'torch': [
24 'torch>=0.4.0'
25 ],
26 'mxnet':[
27 'mxnet>=1.0.0',
28 'requests<2.19.0,>=2.18.4',
29 'numpy<1.15.0,>=1.8.2',
30 'requests<2.19.0,>=2.18.4',
31 ],
32 'tensorflow':[
33 'tensorflow>=1.10.0',
34 'numpy<=1.14.5,>=1.13.3',
35 'setuptools<=39.1.0',
36 ],
37 'develop': [
38 'pyflakes',
39 'pytest>=3.5.1',
40 'pytest-cov>=2.5.1',
41 'pytest-benchmark[histogram]',
42 'pytest-console-scripts',
43 'python-coveralls',
44 'coverage>=4.0', # coveralls
45 'matplotlib',
46 'jupyter',
47 'uproot',
48 'papermill',
49 'graphviz',
50 'sphinx',
51 'sphinxcontrib-bibtex',
52 'sphinxcontrib-napoleon',
53 'sphinx_rtd_theme',
54 'nbsphinx',
55 'jsonpatch'
56 ]
57 },
58 entry_points = {
59 'console_scripts': ['pyhf=pyhf.commandline:pyhf']
60 },
61 dependency_links = [
62 ]
63 )
64
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -9,8 +9,7 @@
packages = find_packages(),
include_package_data = True,
install_requires = [
- 'numpy<=1.14.5,>=1.14.3', # required by tensorflow, mxnet, and us
- 'scipy',
+ 'scipy', # requires numpy, which is required by pyhf, tensorflow, and mxnet
'click>=6.0', # for console scripts,
'tqdm', # for readxml
'six', # for modifiers
@@ -31,7 +30,7 @@
],
'tensorflow':[
'tensorflow>=1.10.0',
- 'numpy<=1.14.5,>=1.13.3',
+ 'numpy<=1.14.5,>=1.14.0', # Lower of 1.14.0 instead of 1.13.3 to ensure doctest pass
'setuptools<=39.1.0',
],
'develop': [
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -9,8 +9,7 @@\n packages = find_packages(),\n include_package_data = True,\n install_requires = [\n- 'numpy<=1.14.5,>=1.14.3', # required by tensorflow, mxnet, and us\n- 'scipy',\n+ 'scipy', # requires numpy, which is required by pyhf, tensorflow, and mxnet\n 'click>=6.0', # for console scripts,\n 'tqdm', # for readxml\n 'six', # for modifiers\n@@ -31,7 +30,7 @@\n ],\n 'tensorflow':[\n 'tensorflow>=1.10.0',\n- 'numpy<=1.14.5,>=1.13.3',\n+ 'numpy<=1.14.5,>=1.14.0', # Lower of 1.14.0 instead of 1.13.3 to ensure doctest pass\n 'setuptools<=39.1.0',\n ],\n 'develop': [\n", "issue": "loosen numpy requirements for non-extra installs\n# Description\r\n\r\nwe are pretty restrictive in the numpy version range due to trying to conform to TF's valid range, but TF is only one of the backends. If just installing `pip install pyhf` we should not force users to a speciic range unless we require the APIs\r\n\r\n`numpy>=1.14.0` should be enough unless i'm missing something. @kratsg since you changed this last, any reason you see to restrict numpy further?\n", "before_files": [{"content": "from setuptools import setup, find_packages\nsetup(\n name = 'pyhf',\n version = '0.0.15',\n description = '(partial) pure python histfactory implementation',\n url = '',\n author = 'Lukas Heinrich',\n author_email = '[email protected]',\n packages = find_packages(),\n include_package_data = True,\n install_requires = [\n 'numpy<=1.14.5,>=1.14.3', # required by tensorflow, mxnet, and us\n 'scipy',\n 'click>=6.0', # for console scripts,\n 'tqdm', # for readxml\n 'six', # for modifiers\n 'jsonschema>=v3.0.0a2', # for utils, alpha-release for draft 6\n ],\n extras_require = {\n 'xmlimport': [\n 'uproot',\n ],\n 'torch': [\n 'torch>=0.4.0'\n ],\n 'mxnet':[\n 'mxnet>=1.0.0',\n 'requests<2.19.0,>=2.18.4',\n 'numpy<1.15.0,>=1.8.2',\n 'requests<2.19.0,>=2.18.4',\n ],\n 'tensorflow':[\n 'tensorflow>=1.10.0',\n 'numpy<=1.14.5,>=1.13.3',\n 'setuptools<=39.1.0',\n ],\n 'develop': [\n 'pyflakes',\n 'pytest>=3.5.1',\n 'pytest-cov>=2.5.1',\n 'pytest-benchmark[histogram]',\n 'pytest-console-scripts',\n 'python-coveralls',\n 'coverage>=4.0', # coveralls\n 'matplotlib',\n 'jupyter',\n 'uproot',\n 'papermill',\n 'graphviz',\n 'sphinx',\n 'sphinxcontrib-bibtex',\n 'sphinxcontrib-napoleon',\n 'sphinx_rtd_theme',\n 'nbsphinx',\n 'jsonpatch'\n ]\n },\n entry_points = {\n 'console_scripts': ['pyhf=pyhf.commandline:pyhf']\n },\n dependency_links = [\n ]\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\nsetup(\n name = 'pyhf',\n version = '0.0.15',\n description = '(partial) pure python histfactory implementation',\n url = '',\n author = 'Lukas Heinrich',\n author_email = '[email protected]',\n packages = find_packages(),\n include_package_data = True,\n install_requires = [\n 'scipy', # requires numpy, which is required by pyhf, tensorflow, and mxnet\n 'click>=6.0', # for console scripts,\n 'tqdm', # for readxml\n 'six', # for modifiers\n 'jsonschema>=v3.0.0a2', # for utils, alpha-release for draft 6\n ],\n extras_require = {\n 'xmlimport': [\n 'uproot',\n ],\n 'torch': [\n 'torch>=0.4.0'\n ],\n 'mxnet':[\n 'mxnet>=1.0.0',\n 'requests<2.19.0,>=2.18.4',\n 'numpy<1.15.0,>=1.8.2',\n 'requests<2.19.0,>=2.18.4',\n ],\n 'tensorflow':[\n 'tensorflow>=1.10.0',\n 'numpy<=1.14.5,>=1.14.0', # Lower of 1.14.0 instead of 1.13.3 to ensure doctest pass\n 'setuptools<=39.1.0',\n ],\n 'develop': [\n 'pyflakes',\n 
'pytest>=3.5.1',\n 'pytest-cov>=2.5.1',\n 'pytest-benchmark[histogram]',\n 'pytest-console-scripts',\n 'python-coveralls',\n 'coverage>=4.0', # coveralls\n 'matplotlib',\n 'jupyter',\n 'uproot',\n 'papermill',\n 'graphviz',\n 'sphinx',\n 'sphinxcontrib-bibtex',\n 'sphinxcontrib-napoleon',\n 'sphinx_rtd_theme',\n 'nbsphinx',\n 'jsonpatch'\n ]\n },\n entry_points = {\n 'console_scripts': ['pyhf=pyhf.commandline:pyhf']\n },\n dependency_links = [\n ]\n)\n", "path": "setup.py"}]} | 995 | 260 |
gh_patches_debug_25841 | rasdani/github-patches | git_diff | saleor__saleor-2825 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Turn Order.paymentStatus field into an enum
Currently `Order.status` is an `OrderStatus` enum but `Order.paymentStatus` is a `String`.
We should make both enums so clients can know all possible values up-front.
--- END ISSUE ---
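The effect of the change is easiest to see from the schema side. The following is an illustrative graphene sketch, not saleor code; the status names are a hypothetical subset of django-payments' `PaymentStatus` choices:
```python
import graphene

# Hypothetical (name, value) choice pairs standing in for PaymentStatus.CHOICES
PaymentStatusEnum = graphene.Enum(
    'PaymentStatusEnum',
    [('WAITING', 'waiting'), ('CONFIRMED', 'confirmed'), ('ERROR', 'error')],
)

class Query(graphene.ObjectType):
    # Before the fix this was a plain graphene.String(); an enum exposes every
    # legal value to clients through schema introspection.
    payment_status = graphene.Field(PaymentStatusEnum)

print(graphene.Schema(query=Query))  # the SDL output lists WAITING/CONFIRMED/ERROR
```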
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/graphql/order/types.py`
Content:
```
1 import graphene
2 from graphene import relay
3
4 from ...order import OrderEvents, models
5 from ..account.types import User
6 from ..core.types.common import CountableDjangoObjectType
7 from ..core.types.money import Money, TaxedMoney
8 from decimal import Decimal
9
10 OrderEventsEnum = graphene.Enum.from_enum(OrderEvents)
11
12
13 class OrderEvent(CountableDjangoObjectType):
14 date = graphene.types.datetime.DateTime(
15 description='Date when event happened at in ISO 8601 format.')
16 type = OrderEventsEnum(description='Order event type')
17 user = graphene.Field(
18 User, id=graphene.Argument(graphene.ID),
19 description='User who performed the action.')
20 message = graphene.String(
21 description='Content of a note added to the order.')
22 email = graphene.String(description='Email of the customer')
23 email_type = graphene.String(
24 description='Type of an email sent to the customer')
25 amount = graphene.Float(description='Amount of money.')
26 quantity = graphene.Int(description='Number of items.')
27 composed_id = graphene.String(
28 description='Composed id of the Fulfillment.')
29
30 class Meta:
31 description = 'History log of the order.'
32 model = models.OrderEvent
33 interfaces = [relay.Node]
34 exclude_fields = ['order', 'parameters']
35
36 def resolve_email(self, info):
37 return self.parameters.get('email', None)
38
39 def resolve_email_type(self, info):
40 return self.parameters.get('email_type', None)
41
42 def resolve_amount(self, info):
43 amount = self.parameters.get('amount', None)
44 return Decimal(amount) if amount else None
45
46 def resolve_quantity(self, info):
47 quantity = self.parameters.get('quantity', None)
48 return int(quantity) if quantity else None
49
50 def resolve_message(self, info):
51 return self.parameters.get('message', None)
52
53 def resolve_composed_id(self, info):
54 return self.parameters.get('composed_id', None)
55
56
57 class Fulfillment(CountableDjangoObjectType):
58 status_display = graphene.String(
59 description='User-friendly fulfillment status.')
60
61 class Meta:
62 description = 'Represents order fulfillment.'
63 interfaces = [relay.Node]
64 model = models.Fulfillment
65 exclude_fields = ['order']
66
67 def resolve_status_display(self, info):
68 return self.get_status_display()
69
70
71 class FulfillmentLine(CountableDjangoObjectType):
72 class Meta:
73 description = 'Represents line of the fulfillment.'
74 interfaces = [relay.Node]
75 model = models.FulfillmentLine
76 exclude_fields = ['fulfillment']
77
78
79 class Order(CountableDjangoObjectType):
80 fulfillments = graphene.List(
81 Fulfillment,
82 required=True,
83 description='List of shipments for the order.')
84 is_paid = graphene.Boolean(
85 description='Informs if an order is fully paid.')
86 number = graphene.String(description='User-friendly number of an order.')
87 payment_status = graphene.String(description='Internal payment status.')
88 payment_status_display = graphene.String(
89 description='User-friendly payment status.')
90 subtotal = graphene.Field(
91 TaxedMoney,
92 description='The sum of line prices not including shipping.')
93 status_display = graphene.String(description='User-friendly order status.')
94 total_authorized = graphene.Field(
95 Money, description='Amount authorized for the order.')
96 total_captured = graphene.Field(
97 Money, description='Amount captured by payment.')
98 events = graphene.List(
99 OrderEvent,
100 description='List of events associated with the order.')
101 user_email = graphene.String(
102 required=False, description='Email address of the customer.')
103
104 class Meta:
105 description = 'Represents an order in the shop.'
106 interfaces = [relay.Node]
107 model = models.Order
108 exclude_fields = [
109 'shipping_price_gross', 'shipping_price_net', 'total_gross',
110 'total_net']
111
112 @staticmethod
113 def resolve_subtotal(obj, info):
114 return obj.get_subtotal()
115
116 @staticmethod
117 def resolve_total_authorized(obj, info):
118 payment = obj.get_last_payment()
119 if payment:
120 return payment.get_total_price().gross
121
122 @staticmethod
123 def resolve_total_captured(obj, info):
124 payment = obj.get_last_payment()
125 if payment:
126 return payment.get_captured_price()
127
128 @staticmethod
129 def resolve_fulfillments(obj, info):
130 return obj.fulfillments.all()
131
132 @staticmethod
133 def resolve_events(obj, info):
134 return obj.events.all()
135
136 @staticmethod
137 def resolve_is_paid(obj, info):
138 return obj.is_fully_paid()
139
140 @staticmethod
141 def resolve_number(obj, info):
142 return str(obj.pk)
143
144 @staticmethod
145 def resolve_payment_status(obj, info):
146 return obj.get_last_payment_status()
147
148 @staticmethod
149 def resolve_payment_status_display(obj, info):
150 return obj.get_last_payment_status_display()
151
152 @staticmethod
153 def resolve_status_display(obj, info):
154 return obj.get_status_display()
155
156 @staticmethod
157 def resolve_user_email(obj, info):
158 if obj.user_email:
159 return obj.user_email
160 if obj.user_id:
161 return obj.user.email
162 return None
163
164
165 class OrderLine(CountableDjangoObjectType):
166 class Meta:
167 description = 'Represents order line of particular order.'
168 model = models.OrderLine
169 interfaces = [relay.Node]
170 exclude_fields = [
171 'order', 'unit_price_gross', 'unit_price_net', 'variant']
172
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/graphql/order/types.py b/saleor/graphql/order/types.py
--- a/saleor/graphql/order/types.py
+++ b/saleor/graphql/order/types.py
@@ -1,13 +1,18 @@
+from decimal import Decimal
+
import graphene
from graphene import relay
+from payments import PaymentStatus
from ...order import OrderEvents, models
from ..account.types import User
from ..core.types.common import CountableDjangoObjectType
from ..core.types.money import Money, TaxedMoney
-from decimal import Decimal
OrderEventsEnum = graphene.Enum.from_enum(OrderEvents)
+PaymentStatusEnum = graphene.Enum(
+ 'PaymentStatusEnum',
+ [(code.upper(), code) for code, name in PaymentStatus.CHOICES])
class OrderEvent(CountableDjangoObjectType):
@@ -84,7 +89,7 @@
is_paid = graphene.Boolean(
description='Informs if an order is fully paid.')
number = graphene.String(description='User-friendly number of an order.')
- payment_status = graphene.String(description='Internal payment status.')
+ payment_status = PaymentStatusEnum(description='Internal payment status.')
payment_status_display = graphene.String(
description='User-friendly payment status.')
subtotal = graphene.Field(
| {"golden_diff": "diff --git a/saleor/graphql/order/types.py b/saleor/graphql/order/types.py\n--- a/saleor/graphql/order/types.py\n+++ b/saleor/graphql/order/types.py\n@@ -1,13 +1,18 @@\n+from decimal import Decimal\n+\n import graphene\n from graphene import relay\n+from payments import PaymentStatus\n \n from ...order import OrderEvents, models\n from ..account.types import User\n from ..core.types.common import CountableDjangoObjectType\n from ..core.types.money import Money, TaxedMoney\n-from decimal import Decimal\n \n OrderEventsEnum = graphene.Enum.from_enum(OrderEvents)\n+PaymentStatusEnum = graphene.Enum(\n+ 'PaymentStatusEnum',\n+ [(code.upper(), code) for code, name in PaymentStatus.CHOICES])\n \n \n class OrderEvent(CountableDjangoObjectType):\n@@ -84,7 +89,7 @@\n is_paid = graphene.Boolean(\n description='Informs if an order is fully paid.')\n number = graphene.String(description='User-friendly number of an order.')\n- payment_status = graphene.String(description='Internal payment status.')\n+ payment_status = PaymentStatusEnum(description='Internal payment status.')\n payment_status_display = graphene.String(\n description='User-friendly payment status.')\n subtotal = graphene.Field(\n", "issue": "Turn Order.paymentStatus field into an enum\nCurrently `Order.status` is an `OrderStatus` enum but `Order.paymentStatus` is a `String`.\r\n\r\nWe should make both enums so clients can know all possible values up-front.\n", "before_files": [{"content": "import graphene\nfrom graphene import relay\n\nfrom ...order import OrderEvents, models\nfrom ..account.types import User\nfrom ..core.types.common import CountableDjangoObjectType\nfrom ..core.types.money import Money, TaxedMoney\nfrom decimal import Decimal\n\nOrderEventsEnum = graphene.Enum.from_enum(OrderEvents)\n\n\nclass OrderEvent(CountableDjangoObjectType):\n date = graphene.types.datetime.DateTime(\n description='Date when event happened at in ISO 8601 format.')\n type = OrderEventsEnum(description='Order event type')\n user = graphene.Field(\n User, id=graphene.Argument(graphene.ID),\n description='User who performed the action.')\n message = graphene.String(\n description='Content of a note added to the order.')\n email = graphene.String(description='Email of the customer')\n email_type = graphene.String(\n description='Type of an email sent to the customer')\n amount = graphene.Float(description='Amount of money.')\n quantity = graphene.Int(description='Number of items.')\n composed_id = graphene.String(\n description='Composed id of the Fulfillment.')\n\n class Meta:\n description = 'History log of the order.'\n model = models.OrderEvent\n interfaces = [relay.Node]\n exclude_fields = ['order', 'parameters']\n\n def resolve_email(self, info):\n return self.parameters.get('email', None)\n\n def resolve_email_type(self, info):\n return self.parameters.get('email_type', None)\n\n def resolve_amount(self, info):\n amount = self.parameters.get('amount', None)\n return Decimal(amount) if amount else None\n\n def resolve_quantity(self, info):\n quantity = self.parameters.get('quantity', None)\n return int(quantity) if quantity else None\n\n def resolve_message(self, info):\n return self.parameters.get('message', None)\n\n def resolve_composed_id(self, info):\n return self.parameters.get('composed_id', None)\n\n\nclass Fulfillment(CountableDjangoObjectType):\n status_display = graphene.String(\n description='User-friendly fulfillment status.')\n\n class Meta:\n description = 'Represents order fulfillment.'\n interfaces = 
[relay.Node]\n model = models.Fulfillment\n exclude_fields = ['order']\n\n def resolve_status_display(self, info):\n return self.get_status_display()\n\n\nclass FulfillmentLine(CountableDjangoObjectType):\n class Meta:\n description = 'Represents line of the fulfillment.'\n interfaces = [relay.Node]\n model = models.FulfillmentLine\n exclude_fields = ['fulfillment']\n\n\nclass Order(CountableDjangoObjectType):\n fulfillments = graphene.List(\n Fulfillment,\n required=True,\n description='List of shipments for the order.')\n is_paid = graphene.Boolean(\n description='Informs if an order is fully paid.')\n number = graphene.String(description='User-friendly number of an order.')\n payment_status = graphene.String(description='Internal payment status.')\n payment_status_display = graphene.String(\n description='User-friendly payment status.')\n subtotal = graphene.Field(\n TaxedMoney,\n description='The sum of line prices not including shipping.')\n status_display = graphene.String(description='User-friendly order status.')\n total_authorized = graphene.Field(\n Money, description='Amount authorized for the order.')\n total_captured = graphene.Field(\n Money, description='Amount captured by payment.')\n events = graphene.List(\n OrderEvent,\n description='List of events associated with the order.')\n user_email = graphene.String(\n required=False, description='Email address of the customer.')\n\n class Meta:\n description = 'Represents an order in the shop.'\n interfaces = [relay.Node]\n model = models.Order\n exclude_fields = [\n 'shipping_price_gross', 'shipping_price_net', 'total_gross',\n 'total_net']\n\n @staticmethod\n def resolve_subtotal(obj, info):\n return obj.get_subtotal()\n\n @staticmethod\n def resolve_total_authorized(obj, info):\n payment = obj.get_last_payment()\n if payment:\n return payment.get_total_price().gross\n\n @staticmethod\n def resolve_total_captured(obj, info):\n payment = obj.get_last_payment()\n if payment:\n return payment.get_captured_price()\n\n @staticmethod\n def resolve_fulfillments(obj, info):\n return obj.fulfillments.all()\n\n @staticmethod\n def resolve_events(obj, info):\n return obj.events.all()\n\n @staticmethod\n def resolve_is_paid(obj, info):\n return obj.is_fully_paid()\n\n @staticmethod\n def resolve_number(obj, info):\n return str(obj.pk)\n\n @staticmethod\n def resolve_payment_status(obj, info):\n return obj.get_last_payment_status()\n\n @staticmethod\n def resolve_payment_status_display(obj, info):\n return obj.get_last_payment_status_display()\n\n @staticmethod\n def resolve_status_display(obj, info):\n return obj.get_status_display()\n\n @staticmethod\n def resolve_user_email(obj, info):\n if obj.user_email:\n return obj.user_email\n if obj.user_id:\n return obj.user.email\n return None\n\n\nclass OrderLine(CountableDjangoObjectType):\n class Meta:\n description = 'Represents order line of particular order.'\n model = models.OrderLine\n interfaces = [relay.Node]\n exclude_fields = [\n 'order', 'unit_price_gross', 'unit_price_net', 'variant']\n", "path": "saleor/graphql/order/types.py"}], "after_files": [{"content": "from decimal import Decimal\n\nimport graphene\nfrom graphene import relay\nfrom payments import PaymentStatus\n\nfrom ...order import OrderEvents, models\nfrom ..account.types import User\nfrom ..core.types.common import CountableDjangoObjectType\nfrom ..core.types.money import Money, TaxedMoney\n\nOrderEventsEnum = graphene.Enum.from_enum(OrderEvents)\nPaymentStatusEnum = graphene.Enum(\n 'PaymentStatusEnum',\n 
[(code.upper(), code) for code, name in PaymentStatus.CHOICES])\n\n\nclass OrderEvent(CountableDjangoObjectType):\n date = graphene.types.datetime.DateTime(\n description='Date when event happened at in ISO 8601 format.')\n type = OrderEventsEnum(description='Order event type')\n user = graphene.Field(\n User, id=graphene.Argument(graphene.ID),\n description='User who performed the action.')\n message = graphene.String(\n description='Content of a note added to the order.')\n email = graphene.String(description='Email of the customer')\n email_type = graphene.String(\n description='Type of an email sent to the customer')\n amount = graphene.Float(description='Amount of money.')\n quantity = graphene.Int(description='Number of items.')\n composed_id = graphene.String(\n description='Composed id of the Fulfillment.')\n\n class Meta:\n description = 'History log of the order.'\n model = models.OrderEvent\n interfaces = [relay.Node]\n exclude_fields = ['order', 'parameters']\n\n def resolve_email(self, info):\n return self.parameters.get('email', None)\n\n def resolve_email_type(self, info):\n return self.parameters.get('email_type', None)\n\n def resolve_amount(self, info):\n amount = self.parameters.get('amount', None)\n return Decimal(amount) if amount else None\n\n def resolve_quantity(self, info):\n quantity = self.parameters.get('quantity', None)\n return int(quantity) if quantity else None\n\n def resolve_message(self, info):\n return self.parameters.get('message', None)\n\n def resolve_composed_id(self, info):\n return self.parameters.get('composed_id', None)\n\n\nclass Fulfillment(CountableDjangoObjectType):\n status_display = graphene.String(\n description='User-friendly fulfillment status.')\n\n class Meta:\n description = 'Represents order fulfillment.'\n interfaces = [relay.Node]\n model = models.Fulfillment\n exclude_fields = ['order']\n\n def resolve_status_display(self, info):\n return self.get_status_display()\n\n\nclass FulfillmentLine(CountableDjangoObjectType):\n class Meta:\n description = 'Represents line of the fulfillment.'\n interfaces = [relay.Node]\n model = models.FulfillmentLine\n exclude_fields = ['fulfillment']\n\n\nclass Order(CountableDjangoObjectType):\n fulfillments = graphene.List(\n Fulfillment,\n required=True,\n description='List of shipments for the order.')\n is_paid = graphene.Boolean(\n description='Informs if an order is fully paid.')\n number = graphene.String(description='User-friendly number of an order.')\n payment_status = PaymentStatusEnum(description='Internal payment status.')\n payment_status_display = graphene.String(\n description='User-friendly payment status.')\n subtotal = graphene.Field(\n TaxedMoney,\n description='The sum of line prices not including shipping.')\n status_display = graphene.String(description='User-friendly order status.')\n total_authorized = graphene.Field(\n Money, description='Amount authorized for the order.')\n total_captured = graphene.Field(\n Money, description='Amount captured by payment.')\n events = graphene.List(\n OrderEvent,\n description='List of events associated with the order.')\n user_email = graphene.String(\n required=False, description='Email address of the customer.')\n\n class Meta:\n description = 'Represents an order in the shop.'\n interfaces = [relay.Node]\n model = models.Order\n exclude_fields = [\n 'shipping_price_gross', 'shipping_price_net', 'total_gross',\n 'total_net']\n\n @staticmethod\n def resolve_subtotal(obj, info):\n return obj.get_subtotal()\n\n @staticmethod\n def 
resolve_total_authorized(obj, info):\n payment = obj.get_last_payment()\n if payment:\n return payment.get_total_price().gross\n\n @staticmethod\n def resolve_total_captured(obj, info):\n payment = obj.get_last_payment()\n if payment:\n return payment.get_captured_price()\n\n @staticmethod\n def resolve_fulfillments(obj, info):\n return obj.fulfillments.all()\n\n @staticmethod\n def resolve_events(obj, info):\n return obj.events.all()\n\n @staticmethod\n def resolve_is_paid(obj, info):\n return obj.is_fully_paid()\n\n @staticmethod\n def resolve_number(obj, info):\n return str(obj.pk)\n\n @staticmethod\n def resolve_payment_status(obj, info):\n return obj.get_last_payment_status()\n\n @staticmethod\n def resolve_payment_status_display(obj, info):\n return obj.get_last_payment_status_display()\n\n @staticmethod\n def resolve_status_display(obj, info):\n return obj.get_status_display()\n\n @staticmethod\n def resolve_user_email(obj, info):\n if obj.user_email:\n return obj.user_email\n if obj.user_id:\n return obj.user.email\n return None\n\n\nclass OrderLine(CountableDjangoObjectType):\n class Meta:\n description = 'Represents order line of particular order.'\n model = models.OrderLine\n interfaces = [relay.Node]\n exclude_fields = [\n 'order', 'unit_price_gross', 'unit_price_net', 'variant']\n", "path": "saleor/graphql/order/types.py"}]} | 1,880 | 269 |
gh_patches_debug_1393 | rasdani/github-patches | git_diff | pytorch__audio-1583 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Use of deprecated `AutoNonVariableTypeMode`.
`AutoNonVariableTypeMode` is deprecated and will be removed in PyTorch 1.10.
https://github.com/pytorch/audio/search?q=AutoNonVariableTypeMode
Migration: https://github.com/pytorch/pytorch/blob/master/docs/cpp/source/notes/inference_mode.rst#migration-guide-from-autononvariabletypemode
cc @carolineechen
--- END ISSUE ---
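The replacement named in the migration guide is the C++ guard `c10::InferenceMode`, which supersedes `at::AutoNonVariableTypeMode` inside extension code. As a hedged, Python-side illustration of the same concept (this is not torchaudio's extension code):
```python
import torch

x = torch.randn(4, requires_grad=True)

# torch.inference_mode() is the Python counterpart of the C++ c10::InferenceMode
# guard recommended over the deprecated AutoNonVariableTypeMode.
with torch.inference_mode():
    y = x * 2

print(y.requires_grad)  # False: autograd bookkeeping is disabled in the block
```
The golden diff recorded for this row, meanwhile, cleans up the Python surface by pruning stale sox-era names from `torchaudio.__all__` rather than touching the C++ guard itself.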
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchaudio/__init__.py`
Content:
```
1 from . import extension # noqa: F401
2 from torchaudio._internal import module_utils as _mod_utils # noqa: F401
3 from torchaudio import (
4 compliance,
5 datasets,
6 functional,
7 kaldi_io,
8 utils,
9 sox_effects,
10 transforms,
11 )
12
13 from torchaudio.backend import (
14 list_audio_backends,
15 get_audio_backend,
16 set_audio_backend,
17 )
18
19 try:
20 from .version import __version__, git_version # noqa: F401
21 except ImportError:
22 pass
23
24 __all__ = [
25 'compliance',
26 'datasets',
27 'functional',
28 'kaldi_io',
29 'utils',
30 'sox_effects',
31 'transforms',
32 'list_audio_backends',
33 'get_audio_backend',
34 'set_audio_backend',
35 'save_encinfo',
36 'sox_signalinfo_t',
37 'sox_encodinginfo_t',
38 'get_sox_option_t',
39 'get_sox_encoding_t',
40 'get_sox_bool',
41 'SignalInfo',
42 'EncodingInfo',
43 ]
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchaudio/__init__.py b/torchaudio/__init__.py
--- a/torchaudio/__init__.py
+++ b/torchaudio/__init__.py
@@ -32,12 +32,4 @@
'list_audio_backends',
'get_audio_backend',
'set_audio_backend',
- 'save_encinfo',
- 'sox_signalinfo_t',
- 'sox_encodinginfo_t',
- 'get_sox_option_t',
- 'get_sox_encoding_t',
- 'get_sox_bool',
- 'SignalInfo',
- 'EncodingInfo',
]
| {"golden_diff": "diff --git a/torchaudio/__init__.py b/torchaudio/__init__.py\n--- a/torchaudio/__init__.py\n+++ b/torchaudio/__init__.py\n@@ -32,12 +32,4 @@\n 'list_audio_backends',\n 'get_audio_backend',\n 'set_audio_backend',\n- 'save_encinfo',\n- 'sox_signalinfo_t',\n- 'sox_encodinginfo_t',\n- 'get_sox_option_t',\n- 'get_sox_encoding_t',\n- 'get_sox_bool',\n- 'SignalInfo',\n- 'EncodingInfo',\n ]\n", "issue": "Use of deprecated `AutoNonVariableTypeMode`.\n`AutoNonVariableTypeMode` is deprecated and will be removed in PyTorch 1.10.\r\n\r\nhttps://github.com/pytorch/audio/search?q=AutoNonVariableTypeMode\r\n\r\nMigration: https://github.com/pytorch/pytorch/blob/master/docs/cpp/source/notes/inference_mode.rst#migration-guide-from-autononvariabletypemode\r\n\r\ncc @carolineechen \n", "before_files": [{"content": "from . import extension # noqa: F401\nfrom torchaudio._internal import module_utils as _mod_utils # noqa: F401\nfrom torchaudio import (\n compliance,\n datasets,\n functional,\n kaldi_io,\n utils,\n sox_effects,\n transforms,\n)\n\nfrom torchaudio.backend import (\n list_audio_backends,\n get_audio_backend,\n set_audio_backend,\n)\n\ntry:\n from .version import __version__, git_version # noqa: F401\nexcept ImportError:\n pass\n\n__all__ = [\n 'compliance',\n 'datasets',\n 'functional',\n 'kaldi_io',\n 'utils',\n 'sox_effects',\n 'transforms',\n 'list_audio_backends',\n 'get_audio_backend',\n 'set_audio_backend',\n 'save_encinfo',\n 'sox_signalinfo_t',\n 'sox_encodinginfo_t',\n 'get_sox_option_t',\n 'get_sox_encoding_t',\n 'get_sox_bool',\n 'SignalInfo',\n 'EncodingInfo',\n]\n", "path": "torchaudio/__init__.py"}], "after_files": [{"content": "from . import extension # noqa: F401\nfrom torchaudio._internal import module_utils as _mod_utils # noqa: F401\nfrom torchaudio import (\n compliance,\n datasets,\n functional,\n kaldi_io,\n utils,\n sox_effects,\n transforms,\n)\n\nfrom torchaudio.backend import (\n list_audio_backends,\n get_audio_backend,\n set_audio_backend,\n)\n\ntry:\n from .version import __version__, git_version # noqa: F401\nexcept ImportError:\n pass\n\n__all__ = [\n 'compliance',\n 'datasets',\n 'functional',\n 'kaldi_io',\n 'utils',\n 'sox_effects',\n 'transforms',\n 'list_audio_backends',\n 'get_audio_backend',\n 'set_audio_backend',\n]\n", "path": "torchaudio/__init__.py"}]} | 663 | 140 |
gh_patches_debug_4807 | rasdani/github-patches | git_diff | bridgecrewio__checkov-5045 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update CKV_AZURE_43 `each.`
**Describe the issue**
The VARIABLE_REFS list in CKV_AZURE_43 (StorageAccountName.py) does not include the `each.` prefix used with the `for_each` meta-argument. Names built from `each.` references cannot be evaluated and should return UNKNOWN, but the check currently returns FAILED, which is incorrect.
**Examples**
```
module "bootstrap" {
source = "../../modules/bootstrap"
for_each = var.bootstrap_storage
create_storage_account = try(each.value.create_storage, true)
name = each.value.name
resource_group_name = try(each.value.resource_group_name, local.resource_group.name)
location = var.location
storage_acl = try(each.value.storage_acl, false)
tags = var.tags
}
```
Within the bootstrap module we use the `azurerm_storage_account` resource:
```
resource "azurerm_storage_account" "this" {
count = var.create_storage_account ? 1 : 0
name = var.name
location = var.location
resource_group_name = var.resource_group_name
min_tls_version = var.min_tls_version
account_replication_type = "LRS"
account_tier = "Standard"
tags = var.tags
queue_properties {
logging {
delete = true
read = true
write = true
version = "1.0"
retention_policy_days = var.retention_policy_days
}
}
network_rules {
default_action = var.storage_acl == true ? "Deny" : "Allow"
ip_rules = var.storage_acl == true ? var.storage_allow_inbound_public_ips : null
virtual_network_subnet_ids = var.storage_acl == true ? var.storage_allow_vnet_subnets : null
}
}
```
And Checkov returns this:
```
Check: CKV_AZURE_43: "Ensure Storage Accounts adhere to the naming rules"
FAILED for resource: module.bootstrap.azurerm_storage_account.this
File: /modules/bootstrap/main.tf:1-25
Calling File: /examples/standalone_vm/main.tf:192-204
Guide: https://docs.bridgecrew.io/docs/ensure-storage-accounts-adhere-to-the-naming-rules
```
**Version (please complete the following information):**
- Checkov Version 2.2.125
**Additional context**
--- END ISSUE ---
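The short-circuit the fix relies on is simple to reproduce in isolation. A minimal sketch, simplified from `StorageAccountName.py` (the exact rendering of an unresolved reference varies, but any name containing `each.` triggers the skip):
```python
VARIABLE_REFS = ("local.", "module.", "var.", "each.")  # "each." is the new entry

# What the check sees when a for_each value cannot be resolved at scan time:
name = "${each.value.name}"

if any(ref in name for ref in VARIABLE_REFS):
    print("UNKNOWN")  # unresolvable reference: skip the naming check
else:
    print("FAILED")   # pre-fix behaviour: the regex check runs and fails
```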
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/terraform/checks/resource/azure/StorageAccountName.py`
Content:
```
1 import re
2 from typing import List, Dict, Any
3
4 from checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck
5 from checkov.common.models.enums import CheckResult, CheckCategories
6
7 STO_NAME_REGEX = re.compile(r"^[a-z0-9]{3,24}$")
8 VARIABLE_REFS = ("local.", "module.", "var.", "random_string.", "random_id.", "random_integer.", "random_pet.",
9 "azurecaf_name")
10
11
12 class StorageAccountName(BaseResourceCheck):
13 def __init__(self) -> None:
14 name = "Ensure Storage Accounts adhere to the naming rules"
15 id = "CKV_AZURE_43"
16 supported_resources = ["azurerm_storage_account"]
17 categories = [CheckCategories.CONVENTION]
18 super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)
19
20 def scan_resource_conf(self, conf: Dict[str, Any]) -> CheckResult:
21 """
22 The Storage Account naming reference:
23 https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview#naming-storage-accounts
24 :param conf: azurerm_storage_account configuration
25 :return: <CheckResult>
26 """
27 name = conf.get("name")
28 if name:
29 name = str(name[0])
30 if any(x in name for x in VARIABLE_REFS):
31 # in the case we couldn't evaluate the name, just ignore
32 return CheckResult.UNKNOWN
33 if re.findall(STO_NAME_REGEX, str(conf["name"][0])):
34 return CheckResult.PASSED
35
36 return CheckResult.FAILED
37
38 def get_evaluated_keys(self) -> List[str]:
39 return ["name"]
40
41
42 check = StorageAccountName()
43
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/terraform/checks/resource/azure/StorageAccountName.py b/checkov/terraform/checks/resource/azure/StorageAccountName.py
--- a/checkov/terraform/checks/resource/azure/StorageAccountName.py
+++ b/checkov/terraform/checks/resource/azure/StorageAccountName.py
@@ -6,7 +6,7 @@
STO_NAME_REGEX = re.compile(r"^[a-z0-9]{3,24}$")
VARIABLE_REFS = ("local.", "module.", "var.", "random_string.", "random_id.", "random_integer.", "random_pet.",
- "azurecaf_name")
+ "azurecaf_name", "each.")
class StorageAccountName(BaseResourceCheck):
| {"golden_diff": "diff --git a/checkov/terraform/checks/resource/azure/StorageAccountName.py b/checkov/terraform/checks/resource/azure/StorageAccountName.py\n--- a/checkov/terraform/checks/resource/azure/StorageAccountName.py\n+++ b/checkov/terraform/checks/resource/azure/StorageAccountName.py\n@@ -6,7 +6,7 @@\n \n STO_NAME_REGEX = re.compile(r\"^[a-z0-9]{3,24}$\")\n VARIABLE_REFS = (\"local.\", \"module.\", \"var.\", \"random_string.\", \"random_id.\", \"random_integer.\", \"random_pet.\",\n- \"azurecaf_name\")\n+ \"azurecaf_name\", \"each.\")\n \n \n class StorageAccountName(BaseResourceCheck):\n", "issue": "Update CKV_AZURE_43 `each.`\n**Describe the issue**\r\nCKV_AZURE_43 StorageAccountName.py VARIABLE_REFS list does not include the `each.` used with for_each meta argument to return UNKNOWN and currently returns FAILED check which is incorrect.\r\n\r\n**Examples**\r\n\r\n```\r\nmodule \"bootstrap\" {\r\n source = \"../../modules/bootstrap\"\r\n\r\n for_each = var.bootstrap_storage\r\n\r\n create_storage_account = try(each.value.create_storage, true)\r\n name = each.value.name\r\n resource_group_name = try(each.value.resource_group_name, local.resource_group.name)\r\n location = var.location\r\n storage_acl = try(each.value.storage_acl, false)\r\n\r\n tags = var.tags\r\n}\r\n```\r\n\r\nWithin the bootstrap module - we use the `azurerm_storage_account` :\r\n\r\n```\r\nresource \"azurerm_storage_account\" \"this\" {\r\n count = var.create_storage_account ? 1 : 0\r\n\r\n name = var.name\r\n location = var.location\r\n resource_group_name = var.resource_group_name\r\n min_tls_version = var.min_tls_version\r\n account_replication_type = \"LRS\"\r\n account_tier = \"Standard\"\r\n tags = var.tags\r\n queue_properties {\r\n logging {\r\n delete = true\r\n read = true\r\n write = true\r\n version = \"1.0\"\r\n retention_policy_days = var.retention_policy_days\r\n }\r\n }\r\n network_rules {\r\n default_action = var.storage_acl == true ? \"Deny\" : \"Allow\"\r\n ip_rules = var.storage_acl == true ? var.storage_allow_inbound_public_ips : null\r\n virtual_network_subnet_ids = var.storage_acl == true ? 
var.storage_allow_vnet_subnets : null\r\n }\r\n}\r\n```\r\n\r\nAnd Checkov returns this :\r\n\r\n```\r\nCheck: CKV_AZURE_43: \"Ensure Storage Accounts adhere to the naming rules\"\r\n FAILED for resource: module.bootstrap.azurerm_storage_account.this\r\n File: /modules/bootstrap/main.tf:1-25\r\n Calling File: /examples/standalone_vm/main.tf:192-204\r\n Guide: https://docs.bridgecrew.io/docs/ensure-storage-accounts-adhere-to-the-naming-rules\r\n```\r\n\r\n**Version (please complete the following information):**\r\n - Checkov Version 2.2.125\r\n\r\n**Additional context**\r\n\n", "before_files": [{"content": "import re\nfrom typing import List, Dict, Any\n\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\nfrom checkov.common.models.enums import CheckResult, CheckCategories\n\nSTO_NAME_REGEX = re.compile(r\"^[a-z0-9]{3,24}$\")\nVARIABLE_REFS = (\"local.\", \"module.\", \"var.\", \"random_string.\", \"random_id.\", \"random_integer.\", \"random_pet.\",\n \"azurecaf_name\")\n\n\nclass StorageAccountName(BaseResourceCheck):\n def __init__(self) -> None:\n name = \"Ensure Storage Accounts adhere to the naming rules\"\n id = \"CKV_AZURE_43\"\n supported_resources = [\"azurerm_storage_account\"]\n categories = [CheckCategories.CONVENTION]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf: Dict[str, Any]) -> CheckResult:\n \"\"\"\n The Storage Account naming reference:\n https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview#naming-storage-accounts\n :param conf: azurerm_storage_account configuration\n :return: <CheckResult>\n \"\"\"\n name = conf.get(\"name\")\n if name:\n name = str(name[0])\n if any(x in name for x in VARIABLE_REFS):\n # in the case we couldn't evaluate the name, just ignore\n return CheckResult.UNKNOWN\n if re.findall(STO_NAME_REGEX, str(conf[\"name\"][0])):\n return CheckResult.PASSED\n\n return CheckResult.FAILED\n\n def get_evaluated_keys(self) -> List[str]:\n return [\"name\"]\n\n\ncheck = StorageAccountName()\n", "path": "checkov/terraform/checks/resource/azure/StorageAccountName.py"}], "after_files": [{"content": "import re\nfrom typing import List, Dict, Any\n\nfrom checkov.terraform.checks.resource.base_resource_check import BaseResourceCheck\nfrom checkov.common.models.enums import CheckResult, CheckCategories\n\nSTO_NAME_REGEX = re.compile(r\"^[a-z0-9]{3,24}$\")\nVARIABLE_REFS = (\"local.\", \"module.\", \"var.\", \"random_string.\", \"random_id.\", \"random_integer.\", \"random_pet.\",\n \"azurecaf_name\", \"each.\")\n\n\nclass StorageAccountName(BaseResourceCheck):\n def __init__(self) -> None:\n name = \"Ensure Storage Accounts adhere to the naming rules\"\n id = \"CKV_AZURE_43\"\n supported_resources = [\"azurerm_storage_account\"]\n categories = [CheckCategories.CONVENTION]\n super().__init__(name=name, id=id, categories=categories, supported_resources=supported_resources)\n\n def scan_resource_conf(self, conf: Dict[str, Any]) -> CheckResult:\n \"\"\"\n The Storage Account naming reference:\n https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview#naming-storage-accounts\n :param conf: azurerm_storage_account configuration\n :return: <CheckResult>\n \"\"\"\n name = conf.get(\"name\")\n if name:\n name = str(name[0])\n if any(x in name for x in VARIABLE_REFS):\n # in the case we couldn't evaluate the name, just ignore\n return CheckResult.UNKNOWN\n if re.findall(STO_NAME_REGEX, 
str(conf[\"name\"][0])):\n return CheckResult.PASSED\n\n return CheckResult.FAILED\n\n def get_evaluated_keys(self) -> List[str]:\n return [\"name\"]\n\n\ncheck = StorageAccountName()\n", "path": "checkov/terraform/checks/resource/azure/StorageAccountName.py"}]} | 1,234 | 156 |
gh_patches_debug_29165 | rasdani/github-patches | git_diff | spack__spack-7545 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
gcc v5.4.0 build fails due to mpfr patching problem
There seems to be a patch application issue in the mpfr-3.1.5 build procedure.
I was expecting something like my previous build:
```
==> Installing mpfr
==> Fetching file://MIRROR_DIR/mirror/mpfr/mpfr-3.1.5.tar.bz2
==> Staging archive: WORKING_DIR/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5/mpfr-3.1.5.tar.bz2
==> Created stage in WORKING_DIR/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5
==> Applied patch vasprintf.patch
==> Applied patch strtofr.patch
==> Building mpfr [AutotoolsPackage]
==> Executing phase: 'autoreconf'
==> Executing phase: 'configure'
==> Executing phase: 'build'
==> Executing phase: 'install'
==> Successfully installed mpfr
Fetch: 0.04s. Build: 9.54s. Total: 9.58s.
[+] WORKING_DIR/opt/spack/linux-centos7-x86_64/gcc-4.8.5/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5
```
When I tried to build the gcc compiler yesterday (and again this morning) the results were strange:
```
==> Installing mpfr
1 out of 1 hunk FAILED -- saving rejects to file VERSION.rej
1 out of 1 hunk FAILED -- saving rejects to file src/mpfr.h.rej
1 out of 1 hunk FAILED -- saving rejects to file src/version.c.rej
==> Fetching file://MIRROR_DIR/mirror/mpfr/mpfr-3.1.5.tar.bz2
==> Staging archive: WORKING_DIR/sat/spack/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5/mpfr-3.1.5.tar.bz2
==> Created stage in WORKING_DIR/sat/spack/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5
==> Patch strtofr.patch failed.
==> Error: ProcessError: Command exited with status 1:
'/usr/bin/patch' '-s' '-p' '1' '-i' 'WORKING_DIR/sat/spack/var/spack/repos/builtin/packages/mpfr/strtofr.patch' '-d' '.'
==> Error: [Errno 2] No such file or directory: 'WORKING_DIR/sat/spack/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5/mpfr-3.1.5/spack-build.out'
```
Not only the error but also the order of the messages seems strange.
A clean clone of the spack repo made no difference:
```console
$ spack install [email protected]
```
Default environment:
```linux-centos7-x86_64/gcc-4.8.5```
--- END ISSUE ---
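When previously working local patches start rejecting hunks like this, a dry run against a freshly extracted tarball shows whether the patch itself has gone stale. A hedged diagnostic sketch (paths illustrative):
```python
import subprocess

# Dry-run the failing patch against a clean mpfr-3.1.5 tree; a non-zero exit
# code plus "hunk FAILED" lines reproduces the report without modifying files.
result = subprocess.run(
    ["patch", "--dry-run", "-p1", "-i", "../strtofr.patch"],
    cwd="mpfr-3.1.5", capture_output=True, text=True,
)
print(result.returncode)
print(result.stdout or result.stderr)
```
The recorded fix below sidesteps the stale local patches entirely, fetching the cumulative upstream `allpatches` file per version and verifying it by checksum.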
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `var/spack/repos/builtin/packages/mpfr/package.py`
Content:
```
1 ##############################################################################
2 # Copyright (c) 2013-2017, Lawrence Livermore National Security, LLC.
3 # Produced at the Lawrence Livermore National Laboratory.
4 #
5 # This file is part of Spack.
6 # Created by Todd Gamblin, [email protected], All rights reserved.
7 # LLNL-CODE-647188
8 #
9 # For details, see https://github.com/spack/spack
10 # Please also see the NOTICE and LICENSE files for our notice and the LGPL.
11 #
12 # This program is free software; you can redistribute it and/or modify
13 # it under the terms of the GNU Lesser General Public License (as
14 # published by the Free Software Foundation) version 2.1, February 1999.
15 #
16 # This program is distributed in the hope that it will be useful, but
17 # WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
18 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
19 # conditions of the GNU Lesser General Public License for more details.
20 #
21 # You should have received a copy of the GNU Lesser General Public
22 # License along with this program; if not, write to the Free Software
23 # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
24 ##############################################################################
25 from spack import *
26
27
28 class Mpfr(AutotoolsPackage):
29 """The MPFR library is a C library for multiple-precision
30 floating-point computations with correct rounding."""
31
32 homepage = "http://www.mpfr.org"
33 url = "https://ftp.gnu.org/gnu/mpfr/mpfr-3.1.5.tar.bz2"
34
35 version('3.1.5', 'b1d23a55588e3b2a13e3be66bc69fd8d')
36 version('3.1.4', 'b8a2f6b0e68bef46e53da2ac439e1cf4')
37 version('3.1.3', '5fdfa3cfa5c86514ee4a241a1affa138')
38 version('3.1.2', 'ee2c3ac63bf0c2359bf08fc3ee094c19')
39
40 # mpir is a drop-in replacement for gmp
41 depends_on('[email protected]:') # 4.2.3 or higher is recommended
42
43 patch('vasprintf.patch', when='@3.1.5')
44 patch('strtofr.patch', when='@3.1.5')
45
46 def configure_args(self):
47 args = [
48 '--with-gmp=' + self.spec['gmp'].prefix,
49 ]
50 return args
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/var/spack/repos/builtin/packages/mpfr/package.py b/var/spack/repos/builtin/packages/mpfr/package.py
--- a/var/spack/repos/builtin/packages/mpfr/package.py
+++ b/var/spack/repos/builtin/packages/mpfr/package.py
@@ -30,18 +30,33 @@
floating-point computations with correct rounding."""
homepage = "http://www.mpfr.org"
- url = "https://ftp.gnu.org/gnu/mpfr/mpfr-3.1.5.tar.bz2"
+ url = "https://ftp.gnu.org/gnu/mpfr/mpfr-4.0.1.tar.bz2"
+ version('4.0.1', '8c21d8ac7460493b2b9f3ef3cc610454')
+ version('4.0.0', 'ef619f3bb68039e35c4a219e06be72d0')
+ version('3.1.6', '320c28198def956aeacdb240b46b8969')
version('3.1.5', 'b1d23a55588e3b2a13e3be66bc69fd8d')
version('3.1.4', 'b8a2f6b0e68bef46e53da2ac439e1cf4')
version('3.1.3', '5fdfa3cfa5c86514ee4a241a1affa138')
version('3.1.2', 'ee2c3ac63bf0c2359bf08fc3ee094c19')
# mpir is a drop-in replacement for gmp
- depends_on('[email protected]:') # 4.2.3 or higher is recommended
+ depends_on('[email protected]:') # 4.2.3 or higher is recommended
+ depends_on('[email protected]:', when='@4.0.0:') # http://www.mpfr.org/mpfr-4.0.0/
- patch('vasprintf.patch', when='@3.1.5')
- patch('strtofr.patch', when='@3.1.5')
+ # Check the Bugs section of old release pages for patches.
+ # http://www.mpfr.org/mpfr-X.Y.Z/#bugs
+ patches = {
+ '3.1.6': '66a5d58364113a21405fc53f4a48f4e8',
+ '3.1.5': '1dc5fe65feb5607b89fe0f410d53b627',
+ '3.1.4': 'd124381573404fe83654c7d5a79aeabf',
+ '3.1.3': 'ebd1d835e0ae2fd8a9339210ccd1d0a8',
+ '3.1.2': '9f96a5c7cac1d6cd983ed9cf7d997074',
+ }
+
+ for ver, checksum in patches.items():
+ patch('http://www.mpfr.org/mpfr-{0}/allpatches'.format(ver),
+ when='@' + ver, sha256=checksum)
def configure_args(self):
args = [
| {"golden_diff": "diff --git a/var/spack/repos/builtin/packages/mpfr/package.py b/var/spack/repos/builtin/packages/mpfr/package.py\n--- a/var/spack/repos/builtin/packages/mpfr/package.py\n+++ b/var/spack/repos/builtin/packages/mpfr/package.py\n@@ -30,18 +30,33 @@\n floating-point computations with correct rounding.\"\"\"\n \n homepage = \"http://www.mpfr.org\"\n- url = \"https://ftp.gnu.org/gnu/mpfr/mpfr-3.1.5.tar.bz2\"\n+ url = \"https://ftp.gnu.org/gnu/mpfr/mpfr-4.0.1.tar.bz2\"\n \n+ version('4.0.1', '8c21d8ac7460493b2b9f3ef3cc610454')\n+ version('4.0.0', 'ef619f3bb68039e35c4a219e06be72d0')\n+ version('3.1.6', '320c28198def956aeacdb240b46b8969')\n version('3.1.5', 'b1d23a55588e3b2a13e3be66bc69fd8d')\n version('3.1.4', 'b8a2f6b0e68bef46e53da2ac439e1cf4')\n version('3.1.3', '5fdfa3cfa5c86514ee4a241a1affa138')\n version('3.1.2', 'ee2c3ac63bf0c2359bf08fc3ee094c19')\n \n # mpir is a drop-in replacement for gmp\n- depends_on('[email protected]:') # 4.2.3 or higher is recommended\n+ depends_on('[email protected]:') # 4.2.3 or higher is recommended\n+ depends_on('[email protected]:', when='@4.0.0:') # http://www.mpfr.org/mpfr-4.0.0/\n \n- patch('vasprintf.patch', when='@3.1.5')\n- patch('strtofr.patch', when='@3.1.5')\n+ # Check the Bugs section of old release pages for patches.\n+ # http://www.mpfr.org/mpfr-X.Y.Z/#bugs\n+ patches = {\n+ '3.1.6': '66a5d58364113a21405fc53f4a48f4e8',\n+ '3.1.5': '1dc5fe65feb5607b89fe0f410d53b627',\n+ '3.1.4': 'd124381573404fe83654c7d5a79aeabf',\n+ '3.1.3': 'ebd1d835e0ae2fd8a9339210ccd1d0a8',\n+ '3.1.2': '9f96a5c7cac1d6cd983ed9cf7d997074',\n+ }\n+\n+ for ver, checksum in patches.items():\n+ patch('http://www.mpfr.org/mpfr-{0}/allpatches'.format(ver),\n+ when='@' + ver, sha256=checksum)\n \n def configure_args(self):\n args = [\n", "issue": "gcc v5.4.0 build fails due to mpfr patching problem\nThere seems to be a patch application issue in the mpfr-3.1.5 build procedure\r\n\r\nI was expecting something like my previous build:\r\n```\r\n==> Installing mpfr\r\n==> Fetching file://MIRROR_DIR/mirror/mpfr/mpfr-3.1.5.tar.bz2\r\n==> Staging archive: WORKING_DIR/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5/mpfr-3.1.5.tar.bz2\r\n==> Created stage in WORKING_DIR/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5\r\n==> Applied patch vasprintf.patch\r\n==> Applied patch strtofr.patch\r\n==> Building mpfr [AutotoolsPackage]\r\n==> Executing phase: 'autoreconf'\r\n==> Executing phase: 'configure'\r\n==> Executing phase: 'build'\r\n==> Executing phase: 'install'\r\n==> Successfully installed mpfr\r\n Fetch: 0.04s. Build: 9.54s. 
Total: 9.58s.\r\n[+] WORKING_DIR/opt/spack/linux-centos7-x86_64/gcc-4.8.5/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5\r\n```\r\nWhen I tried to build the gcc compiler yesterday (and again this morning) the results were strange:\r\n```\r\n==> Installing mpfr\r\n1 out of 1 hunk FAILED -- saving rejects to file VERSION.rej\r\n1 out of 1 hunk FAILED -- saving rejects to file src/mpfr.h.rej\r\n1 out of 1 hunk FAILED -- saving rejects to file src/version.c.rej\r\n==> Fetching file://MIRROR_DIR/mirror/mpfr/mpfr-3.1.5.tar.bz2\r\n==> Staging archive: WORKING_DIR/sat/spack/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5/mpfr-3.1.5.tar.bz2\r\n==> Created stage in WORKING_DIR/sat/spack/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5\r\n==> Patch strtofr.patch failed.\r\n==> Error: ProcessError: Command exited with status 1:\r\n '/usr/bin/patch' '-s' '-p' '1' '-i' 'WORKING_DIR/sat/spack/var/spack/repos/builtin/packages/mpfr/strtofr.patch' '-d' '.'\r\n==> Error: [Errno 2] No such file or directory: 'WORKING_DIR/sat/spack/var/spack/stage/mpfr-3.1.5-rmi7bmi3oaqduvjown2v46snr6ps2zr5/mpfr-3.1.5/spack-build.out'\r\n```\r\nNot only the error, but the order of the messages seem strange.\r\n\r\nA clean clone of the spack repo made no difference\r\n```console\r\n$ spack install [email protected]\r\n```\r\n\r\nDefault environment:\r\n```linux-centos7-x86_64/gcc-4.8.5```\n", "before_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2017, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/spack/spack\n# Please also see the NOTICE and LICENSE files for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Mpfr(AutotoolsPackage):\n \"\"\"The MPFR library is a C library for multiple-precision\n floating-point computations with correct rounding.\"\"\"\n\n homepage = \"http://www.mpfr.org\"\n url = \"https://ftp.gnu.org/gnu/mpfr/mpfr-3.1.5.tar.bz2\"\n\n version('3.1.5', 'b1d23a55588e3b2a13e3be66bc69fd8d')\n version('3.1.4', 'b8a2f6b0e68bef46e53da2ac439e1cf4')\n version('3.1.3', '5fdfa3cfa5c86514ee4a241a1affa138')\n version('3.1.2', 'ee2c3ac63bf0c2359bf08fc3ee094c19')\n\n # mpir is a drop-in replacement for gmp\n depends_on('[email protected]:') # 4.2.3 or higher is recommended\n\n patch('vasprintf.patch', when='@3.1.5')\n patch('strtofr.patch', when='@3.1.5')\n\n def configure_args(self):\n args = [\n '--with-gmp=' + self.spec['gmp'].prefix,\n ]\n return args\n", "path": "var/spack/repos/builtin/packages/mpfr/package.py"}], "after_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2017, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/spack/spack\n# Please also see the NOTICE and LICENSE files for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Mpfr(AutotoolsPackage):\n \"\"\"The MPFR library is a C library for multiple-precision\n floating-point computations with correct rounding.\"\"\"\n\n homepage = \"http://www.mpfr.org\"\n url = \"https://ftp.gnu.org/gnu/mpfr/mpfr-4.0.1.tar.bz2\"\n\n version('4.0.1', '8c21d8ac7460493b2b9f3ef3cc610454')\n version('4.0.0', 'ef619f3bb68039e35c4a219e06be72d0')\n version('3.1.6', '320c28198def956aeacdb240b46b8969')\n version('3.1.5', 'b1d23a55588e3b2a13e3be66bc69fd8d')\n version('3.1.4', 'b8a2f6b0e68bef46e53da2ac439e1cf4')\n version('3.1.3', '5fdfa3cfa5c86514ee4a241a1affa138')\n version('3.1.2', 'ee2c3ac63bf0c2359bf08fc3ee094c19')\n\n # mpir is a drop-in replacement for gmp\n depends_on('[email protected]:') # 4.2.3 or higher is recommended\n depends_on('[email protected]:', when='@4.0.0:') # http://www.mpfr.org/mpfr-4.0.0/\n\n # Check the Bugs section of old release pages for patches.\n # http://www.mpfr.org/mpfr-X.Y.Z/#bugs\n patches = {\n '3.1.6': '66a5d58364113a21405fc53f4a48f4e8',\n '3.1.5': '1dc5fe65feb5607b89fe0f410d53b627',\n '3.1.4': 'd124381573404fe83654c7d5a79aeabf',\n '3.1.3': 'ebd1d835e0ae2fd8a9339210ccd1d0a8',\n '3.1.2': '9f96a5c7cac1d6cd983ed9cf7d997074',\n }\n\n for ver, checksum in patches.items():\n patch('http://www.mpfr.org/mpfr-{0}/allpatches'.format(ver),\n when='@' + ver, sha256=checksum)\n\n def configure_args(self):\n args = [\n '--with-gmp=' + self.spec['gmp'].prefix,\n ]\n return args\n", "path": "var/spack/repos/builtin/packages/mpfr/package.py"}]} | 1,729 | 853 |
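The mpfr fix above replaces hand-maintained `patch(...)` lines with a data-driven idiom: a version-to-checksum table plus one loop that registers a conditional patch per release. A runnable toy sketch of that pattern, with a stub standing in for Spack's real `patch` directive:

```python
# Stub: in a real Spack package, `patch` is the build-system directive
# and takes the same keyword arguments used below.
def patch(url, when, sha256):
    print("would register {} for versions {} (checksum {}...)".format(
        url, when, sha256[:8]))

# Version -> checksum table for the upstream "allpatches" bundles,
# copied from the golden diff.
patches = {
    '3.1.6': '66a5d58364113a21405fc53f4a48f4e8',
    '3.1.5': '1dc5fe65feb5607b89fe0f410d53b627',
}

for ver, checksum in patches.items():
    patch('http://www.mpfr.org/mpfr-{0}/allpatches'.format(ver),
          when='@' + ver, sha256=checksum)
```

In an actual package the same loop sits in the class body, so each `when='@<version>'` spec scopes its patch bundle to exactly that release.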
gh_patches_debug_35167 | rasdani/github-patches | git_diff | translate__pootle-4148 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Translation of the Report Email
I would like to translate the wording of the report email. If you could integrate this kind of template into the .po file, it would be amazing... naturally with the title of the email included, which would be `[(name-site)] Unit #(num) ((lang))`
```
Username: (username)
Current URL: (url)
IP address: (ip_address)
User-Agent: (user_agent)
Unit: (url_string)
Source: (source_string)
Current translation:
Your question or comment:
```
Thx in advance ;)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pootle/apps/contact/views.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 #
4 # Copyright (C) Pootle contributors.
5 #
6 # This file is a part of the Pootle project. It is distributed under the GPL3
7 # or later license. See the LICENSE file for a copy of the license and the
8 # AUTHORS file for copyright and authorship information.
9
10 from django.core.urlresolvers import reverse
11 from django.views.generic import TemplateView
12
13 from contact_form.views import ContactFormView as OriginalContactFormView
14
15 from pootle.core.views import AjaxResponseMixin
16
17 from .forms import ContactForm, ReportForm
18
19
20 SUBJECT_TEMPLATE = 'Unit #%d (%s)'
21 BODY_TEMPLATE = '''
22 Unit: %s
23
24 Source: %s
25
26 Current translation: %s
27
28 Your question or comment:
29 '''
30
31
32 class ContactFormTemplateView(TemplateView):
33 template_name = 'contact_form/contact_form.html'
34
35
36 class ContactFormView(AjaxResponseMixin, OriginalContactFormView):
37 form_class = ContactForm
38 template_name = 'contact_form/xhr_contact_form.html'
39
40 def get_context_data(self, **kwargs):
41 ctx = super(ContactFormView, self).get_context_data(**kwargs)
42 # Provide the form action URL to use in the template that renders the
43 # contact dialog.
44 ctx.update({
45 'contact_form_url': reverse('pootle-contact-xhr'),
46 })
47 return ctx
48
49 def get_initial(self):
50 initial = super(ContactFormView, self).get_initial()
51
52 user = self.request.user
53 if user.is_authenticated():
54 initial.update({
55 'name': user.full_name,
56 'email': user.email,
57 })
58
59 return initial
60
61 def get_success_url(self):
62 # XXX: This is unused. We don't need a `/contact/sent/` URL, but
63 # the parent :cls:`ContactView` enforces us to set some value here
64 return reverse('pootle-contact')
65
66
67 class ReportFormView(ContactFormView):
68 form_class = ReportForm
69
70 def get_context_data(self, **kwargs):
71 ctx = super(ReportFormView, self).get_context_data(**kwargs)
72 # Provide the form action URL to use in the template that renders the
73 # contact dialog.
74 ctx.update({
75 'contact_form_url': reverse('pootle-contact-report-error'),
76 })
77 return ctx
78
79 def get_initial(self):
80 initial = super(ReportFormView, self).get_initial()
81
82 report = self.request.GET.get('report', False)
83 if report:
84 try:
85 from pootle_store.models import Unit
86 uid = int(report)
87 try:
88 unit = Unit.objects.select_related(
89 'store__translation_project__project',
90 ).get(id=uid)
91 if unit.is_accessible_by(self.request.user):
92 unit_absolute_url = self.request.build_absolute_uri(
93 unit.get_translate_url()
94 )
95 initial.update({
96 'subject': SUBJECT_TEMPLATE % (
97 unit.id,
98 unit.store.translation_project.language.code
99 ),
100 'body': BODY_TEMPLATE % (
101 unit_absolute_url,
102 unit.source,
103 unit.target
104 ),
105 'report_email': unit.store.translation_project \
106 .project.report_email,
107 })
108 except Unit.DoesNotExist:
109 pass
110 except ValueError:
111 pass
112
113 return initial
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pootle/apps/contact/views.py b/pootle/apps/contact/views.py
--- a/pootle/apps/contact/views.py
+++ b/pootle/apps/contact/views.py
@@ -8,6 +8,7 @@
# AUTHORS file for copyright and authorship information.
from django.core.urlresolvers import reverse
+from django.template.loader import render_to_string
from django.views.generic import TemplateView
from contact_form.views import ContactFormView as OriginalContactFormView
@@ -17,18 +18,6 @@
from .forms import ContactForm, ReportForm
-SUBJECT_TEMPLATE = 'Unit #%d (%s)'
-BODY_TEMPLATE = '''
-Unit: %s
-
-Source: %s
-
-Current translation: %s
-
-Your question or comment:
-'''
-
-
class ContactFormTemplateView(TemplateView):
template_name = 'contact_form/contact_form.html'
@@ -93,15 +82,18 @@
unit.get_translate_url()
)
initial.update({
- 'subject': SUBJECT_TEMPLATE % (
- unit.id,
- unit.store.translation_project.language.code
- ),
- 'body': BODY_TEMPLATE % (
- unit_absolute_url,
- unit.source,
- unit.target
- ),
+ 'subject': render_to_string(
+ 'contact_form/report_form_subject.txt', {
+ 'unit': unit,
+ 'language': unit.store \
+ .translation_project \
+ .language.code,
+ }),
+ 'body': render_to_string(
+ 'contact_form/report_form_body.txt', {
+ 'unit': unit,
+ 'unit_absolute_url': unit_absolute_url,
+ }),
'report_email': unit.store.translation_project \
.project.report_email,
})
| {"golden_diff": "diff --git a/pootle/apps/contact/views.py b/pootle/apps/contact/views.py\n--- a/pootle/apps/contact/views.py\n+++ b/pootle/apps/contact/views.py\n@@ -8,6 +8,7 @@\n # AUTHORS file for copyright and authorship information.\n \n from django.core.urlresolvers import reverse\n+from django.template.loader import render_to_string\n from django.views.generic import TemplateView\n \n from contact_form.views import ContactFormView as OriginalContactFormView\n@@ -17,18 +18,6 @@\n from .forms import ContactForm, ReportForm\n \n \n-SUBJECT_TEMPLATE = 'Unit #%d (%s)'\n-BODY_TEMPLATE = '''\n-Unit: %s\n-\n-Source: %s\n-\n-Current translation: %s\n-\n-Your question or comment:\n-'''\n-\n-\n class ContactFormTemplateView(TemplateView):\n template_name = 'contact_form/contact_form.html'\n \n@@ -93,15 +82,18 @@\n unit.get_translate_url()\n )\n initial.update({\n- 'subject': SUBJECT_TEMPLATE % (\n- unit.id,\n- unit.store.translation_project.language.code\n- ),\n- 'body': BODY_TEMPLATE % (\n- unit_absolute_url,\n- unit.source,\n- unit.target\n- ),\n+ 'subject': render_to_string(\n+ 'contact_form/report_form_subject.txt', {\n+ 'unit': unit,\n+ 'language': unit.store \\\n+ .translation_project \\\n+ .language.code,\n+ }),\n+ 'body': render_to_string(\n+ 'contact_form/report_form_body.txt', {\n+ 'unit': unit,\n+ 'unit_absolute_url': unit_absolute_url,\n+ }),\n 'report_email': unit.store.translation_project \\\n .project.report_email,\n })\n", "issue": "Translation of the Report Email\nI would like to translate the words of the report email, if you could integrate this kind of template on the po file, it would be amazing... naturally title of the email included, which it would be `[(name-site)] Unit #(num) ((lang))`\n\n```\nUsername: (username)\nCurrent URL: (url)\nIP address: (ip_address)\nUser-Agent: (user_agent)\n\nUnit: (url_string)\n\nSource: (source_string)\n\nCurrent translation: \n\nYour question or comment:\n```\n\nThx in advance ;)\n\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom django.core.urlresolvers import reverse\nfrom django.views.generic import TemplateView\n\nfrom contact_form.views import ContactFormView as OriginalContactFormView\n\nfrom pootle.core.views import AjaxResponseMixin\n\nfrom .forms import ContactForm, ReportForm\n\n\nSUBJECT_TEMPLATE = 'Unit #%d (%s)'\nBODY_TEMPLATE = '''\nUnit: %s\n\nSource: %s\n\nCurrent translation: %s\n\nYour question or comment:\n'''\n\n\nclass ContactFormTemplateView(TemplateView):\n template_name = 'contact_form/contact_form.html'\n\n\nclass ContactFormView(AjaxResponseMixin, OriginalContactFormView):\n form_class = ContactForm\n template_name = 'contact_form/xhr_contact_form.html'\n\n def get_context_data(self, **kwargs):\n ctx = super(ContactFormView, self).get_context_data(**kwargs)\n # Provide the form action URL to use in the template that renders the\n # contact dialog.\n ctx.update({\n 'contact_form_url': reverse('pootle-contact-xhr'),\n })\n return ctx\n\n def get_initial(self):\n initial = super(ContactFormView, self).get_initial()\n\n user = self.request.user\n if user.is_authenticated():\n initial.update({\n 'name': user.full_name,\n 'email': user.email,\n })\n\n return initial\n\n def get_success_url(self):\n # XXX: This is unused. 
We don't need a `/contact/sent/` URL, but\n # the parent :cls:`ContactView` enforces us to set some value here\n return reverse('pootle-contact')\n\n\nclass ReportFormView(ContactFormView):\n form_class = ReportForm\n\n def get_context_data(self, **kwargs):\n ctx = super(ReportFormView, self).get_context_data(**kwargs)\n # Provide the form action URL to use in the template that renders the\n # contact dialog.\n ctx.update({\n 'contact_form_url': reverse('pootle-contact-report-error'),\n })\n return ctx\n\n def get_initial(self):\n initial = super(ReportFormView, self).get_initial()\n\n report = self.request.GET.get('report', False)\n if report:\n try:\n from pootle_store.models import Unit\n uid = int(report)\n try:\n unit = Unit.objects.select_related(\n 'store__translation_project__project',\n ).get(id=uid)\n if unit.is_accessible_by(self.request.user):\n unit_absolute_url = self.request.build_absolute_uri(\n unit.get_translate_url()\n )\n initial.update({\n 'subject': SUBJECT_TEMPLATE % (\n unit.id,\n unit.store.translation_project.language.code\n ),\n 'body': BODY_TEMPLATE % (\n unit_absolute_url,\n unit.source,\n unit.target\n ),\n 'report_email': unit.store.translation_project \\\n .project.report_email,\n })\n except Unit.DoesNotExist:\n pass\n except ValueError:\n pass\n\n return initial\n", "path": "pootle/apps/contact/views.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nfrom django.core.urlresolvers import reverse\nfrom django.template.loader import render_to_string\nfrom django.views.generic import TemplateView\n\nfrom contact_form.views import ContactFormView as OriginalContactFormView\n\nfrom pootle.core.views import AjaxResponseMixin\n\nfrom .forms import ContactForm, ReportForm\n\n\nclass ContactFormTemplateView(TemplateView):\n template_name = 'contact_form/contact_form.html'\n\n\nclass ContactFormView(AjaxResponseMixin, OriginalContactFormView):\n form_class = ContactForm\n template_name = 'contact_form/xhr_contact_form.html'\n\n def get_context_data(self, **kwargs):\n ctx = super(ContactFormView, self).get_context_data(**kwargs)\n # Provide the form action URL to use in the template that renders the\n # contact dialog.\n ctx.update({\n 'contact_form_url': reverse('pootle-contact-xhr'),\n })\n return ctx\n\n def get_initial(self):\n initial = super(ContactFormView, self).get_initial()\n\n user = self.request.user\n if user.is_authenticated():\n initial.update({\n 'name': user.full_name,\n 'email': user.email,\n })\n\n return initial\n\n def get_success_url(self):\n # XXX: This is unused. 
We don't need a `/contact/sent/` URL, but\n # the parent :cls:`ContactView` enforces us to set some value here\n return reverse('pootle-contact')\n\n\nclass ReportFormView(ContactFormView):\n form_class = ReportForm\n\n def get_context_data(self, **kwargs):\n ctx = super(ReportFormView, self).get_context_data(**kwargs)\n # Provide the form action URL to use in the template that renders the\n # contact dialog.\n ctx.update({\n 'contact_form_url': reverse('pootle-contact-report-error'),\n })\n return ctx\n\n def get_initial(self):\n initial = super(ReportFormView, self).get_initial()\n\n report = self.request.GET.get('report', False)\n if report:\n try:\n from pootle_store.models import Unit\n uid = int(report)\n try:\n unit = Unit.objects.select_related(\n 'store__translation_project__project',\n ).get(id=uid)\n if unit.is_accessible_by(self.request.user):\n unit_absolute_url = self.request.build_absolute_uri(\n unit.get_translate_url()\n )\n initial.update({\n 'subject': render_to_string(\n 'contact_form/report_form_subject.txt', {\n 'unit': unit,\n 'language': unit.store \\\n .translation_project \\\n .language.code,\n }),\n 'body': render_to_string(\n 'contact_form/report_form_body.txt', {\n 'unit': unit,\n 'unit_absolute_url': unit_absolute_url,\n }),\n 'report_email': unit.store.translation_project \\\n .project.report_email,\n })\n except Unit.DoesNotExist:\n pass\n except ValueError:\n pass\n\n return initial\n", "path": "pootle/apps/contact/views.py"}]} | 1,325 | 385 |
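The pootle diff answers the request by moving the hard-coded subject and body strings into templates, which is what puts them within reach of Django's translation machinery (`{% trans %}` tags inside the templates end up in the project's .po files). A sketch of the resulting call shape; it assumes a configured Django project plus the two template names from the diff, and the helper `build_report_email` is hypothetical:

```python
from django.template.loader import render_to_string

def build_report_email(unit, unit_absolute_url):
    # Both templates can now wrap their literal text in {% trans %} tags,
    # so the strings become extractable into .po files.
    subject = render_to_string(
        'contact_form/report_form_subject.txt',
        {'unit': unit,
         'language': unit.store.translation_project.language.code},
    ).strip()
    body = render_to_string(
        'contact_form/report_form_body.txt',
        {'unit': unit, 'unit_absolute_url': unit_absolute_url},
    )
    return subject, body
```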
gh_patches_debug_16638 | rasdani/github-patches | git_diff | python-poetry__poetry-6338 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`poetry cache clear` no longer respects `--no-interaction` flag
<!--
Hi there! Thank you for discovering and submitting an issue.
Before you submit this; let's make sure of a few things.
Please make sure the following boxes are ticked if they are correct.
If not, please try and fulfill these first.
-->
<!-- Checked checkbox should look like this: [x] -->
- [x] I am on the [latest](https://github.com/python-poetry/poetry/releases/latest) Poetry version.
- [x] I have searched the [issues](https://github.com/python-poetry/poetry/issues) of this repo and believe that this is not a duplicate.
- [x] If an exception occurs when executing a command, I executed it again in debug mode (`-vvv` option).
<!--
Once those are done, if you're able to fill in the following list with your information,
it'd be very helpful to whoever handles the issue.
-->
- **OS version and name**: Ubuntu 22.04
- **Poetry version**: 1.2.0
- **Link of a [Gist](https://gist.github.com/) with the contents of your pyproject.toml file**: <!-- Gist Link Here -->
## Issue
<!-- Now feel free to write your issue, but please be descriptive! Thanks again 🙌 ❤️ -->
Since poetry version 1.2.0, the `poetry cache clear` command no longer respects the `--no-interaction` flag:
```
$ poetry cache clear --all --no-interaction .
Delete 1882 entries? (yes/no) [no] ^C
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/poetry/console/commands/cache/clear.py`
Content:
```
1 from __future__ import annotations
2
3 import os
4
5 from cleo.helpers import argument
6 from cleo.helpers import option
7
8 from poetry.config.config import Config
9 from poetry.console.commands.command import Command
10
11
12 class CacheClearCommand(Command):
13 name = "cache clear"
14 description = "Clears Poetry's cache."
15
16 arguments = [argument("cache", description="The name of the cache to clear.")]
17 options = [option("all", description="Clear all entries in the cache.")]
18
19 def handle(self) -> int:
20 from cachy import CacheManager
21
22 cache = self.argument("cache")
23
24 parts = cache.split(":")
25 root = parts[0]
26
27 config = Config.create()
28 cache_dir = config.repository_cache_directory / root
29
30 try:
31 cache_dir.relative_to(config.repository_cache_directory)
32 except ValueError:
33 raise ValueError(f"{root} is not a valid repository cache")
34
35 cache = CacheManager(
36 {
37 "default": parts[0],
38 "serializer": "json",
39 "stores": {parts[0]: {"driver": "file", "path": str(cache_dir)}},
40 }
41 )
42
43 if len(parts) == 1:
44 if not self.option("all"):
45 raise RuntimeError(
46 f"Add the --all option if you want to clear all {parts[0]} caches"
47 )
48
49 if not cache_dir.exists():
50 self.line(f"No cache entries for {parts[0]}")
51 return 0
52
53 # Calculate number of entries
54 entries_count = sum(
55 len(files) for _path, _dirs, files in os.walk(str(cache_dir))
56 )
57
58 delete = self.confirm(f"<question>Delete {entries_count} entries?</>")
59 if not delete:
60 return 0
61
62 cache.flush()
63 elif len(parts) == 2:
64 raise RuntimeError(
65 "Only specifying the package name is not yet supported. "
66 "Add a specific version to clear"
67 )
68 elif len(parts) == 3:
69 package = parts[1]
70 version = parts[2]
71
72 if not cache.has(f"{package}:{version}"):
73 self.line(f"No cache entries for {package}:{version}")
74 return 0
75
76 delete = self.confirm(f"Delete cache entry {package}:{version}")
77 if not delete:
78 return 0
79
80 cache.forget(f"{package}:{version}")
81 else:
82 raise ValueError("Invalid cache key")
83
84 return 0
85
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/poetry/console/commands/cache/clear.py b/src/poetry/console/commands/cache/clear.py
--- a/src/poetry/console/commands/cache/clear.py
+++ b/src/poetry/console/commands/cache/clear.py
@@ -55,7 +55,7 @@
len(files) for _path, _dirs, files in os.walk(str(cache_dir))
)
- delete = self.confirm(f"<question>Delete {entries_count} entries?</>")
+ delete = self.confirm(f"<question>Delete {entries_count} entries?</>", True)
if not delete:
return 0
@@ -73,7 +73,7 @@
self.line(f"No cache entries for {package}:{version}")
return 0
- delete = self.confirm(f"Delete cache entry {package}:{version}")
+ delete = self.confirm(f"Delete cache entry {package}:{version}", True)
if not delete:
return 0
| {"golden_diff": "diff --git a/src/poetry/console/commands/cache/clear.py b/src/poetry/console/commands/cache/clear.py\n--- a/src/poetry/console/commands/cache/clear.py\n+++ b/src/poetry/console/commands/cache/clear.py\n@@ -55,7 +55,7 @@\n len(files) for _path, _dirs, files in os.walk(str(cache_dir))\n )\n \n- delete = self.confirm(f\"<question>Delete {entries_count} entries?</>\")\n+ delete = self.confirm(f\"<question>Delete {entries_count} entries?</>\", True)\n if not delete:\n return 0\n \n@@ -73,7 +73,7 @@\n self.line(f\"No cache entries for {package}:{version}\")\n return 0\n \n- delete = self.confirm(f\"Delete cache entry {package}:{version}\")\n+ delete = self.confirm(f\"Delete cache entry {package}:{version}\", True)\n if not delete:\n return 0\n", "issue": "`poetry cache clear` no longer respects `--no-interaction` flag\n<!--\r\n Hi there! Thank you for discovering and submitting an issue.\r\n\r\n Before you submit this; let's make sure of a few things.\r\n Please make sure the following boxes are ticked if they are correct.\r\n If not, please try and fulfill these first.\r\n-->\r\n\r\n<!-- Checked checkbox should look like this: [x] -->\r\n- [x] I am on the [latest](https://github.com/python-poetry/poetry/releases/latest) Poetry version.\r\n- [x] I have searched the [issues](https://github.com/python-poetry/poetry/issues) of this repo and believe that this is not a duplicate.\r\n- [x] If an exception occurs when executing a command, I executed it again in debug mode (`-vvv` option).\r\n\r\n<!--\r\n Once those are done, if you're able to fill in the following list with your information,\r\n it'd be very helpful to whoever handles the issue.\r\n-->\r\n\r\n- **OS version and name**: Ubuntu 22.04\r\n- **Poetry version**: 1.2.0\r\n- **Link of a [Gist](https://gist.github.com/) with the contents of your pyproject.toml file**: <!-- Gist Link Here -->\r\n\r\n## Issue\r\n<!-- Now feel free to write your issue, but please be descriptive! Thanks again \ud83d\ude4c \u2764\ufe0f -->\r\nSince poetry version 1.2.0, the `poetry cache clear` command no longer respects the `--no-interaction` flag:\r\n\r\n```\r\n$ poetry cache clear --all --no-interaction .\r\nDelete 1882 entries? 
(yes/no) [no] ^C\r\n```\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nimport os\n\nfrom cleo.helpers import argument\nfrom cleo.helpers import option\n\nfrom poetry.config.config import Config\nfrom poetry.console.commands.command import Command\n\n\nclass CacheClearCommand(Command):\n name = \"cache clear\"\n description = \"Clears Poetry's cache.\"\n\n arguments = [argument(\"cache\", description=\"The name of the cache to clear.\")]\n options = [option(\"all\", description=\"Clear all entries in the cache.\")]\n\n def handle(self) -> int:\n from cachy import CacheManager\n\n cache = self.argument(\"cache\")\n\n parts = cache.split(\":\")\n root = parts[0]\n\n config = Config.create()\n cache_dir = config.repository_cache_directory / root\n\n try:\n cache_dir.relative_to(config.repository_cache_directory)\n except ValueError:\n raise ValueError(f\"{root} is not a valid repository cache\")\n\n cache = CacheManager(\n {\n \"default\": parts[0],\n \"serializer\": \"json\",\n \"stores\": {parts[0]: {\"driver\": \"file\", \"path\": str(cache_dir)}},\n }\n )\n\n if len(parts) == 1:\n if not self.option(\"all\"):\n raise RuntimeError(\n f\"Add the --all option if you want to clear all {parts[0]} caches\"\n )\n\n if not cache_dir.exists():\n self.line(f\"No cache entries for {parts[0]}\")\n return 0\n\n # Calculate number of entries\n entries_count = sum(\n len(files) for _path, _dirs, files in os.walk(str(cache_dir))\n )\n\n delete = self.confirm(f\"<question>Delete {entries_count} entries?</>\")\n if not delete:\n return 0\n\n cache.flush()\n elif len(parts) == 2:\n raise RuntimeError(\n \"Only specifying the package name is not yet supported. \"\n \"Add a specific version to clear\"\n )\n elif len(parts) == 3:\n package = parts[1]\n version = parts[2]\n\n if not cache.has(f\"{package}:{version}\"):\n self.line(f\"No cache entries for {package}:{version}\")\n return 0\n\n delete = self.confirm(f\"Delete cache entry {package}:{version}\")\n if not delete:\n return 0\n\n cache.forget(f\"{package}:{version}\")\n else:\n raise ValueError(\"Invalid cache key\")\n\n return 0\n", "path": "src/poetry/console/commands/cache/clear.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport os\n\nfrom cleo.helpers import argument\nfrom cleo.helpers import option\n\nfrom poetry.config.config import Config\nfrom poetry.console.commands.command import Command\n\n\nclass CacheClearCommand(Command):\n name = \"cache clear\"\n description = \"Clears Poetry's cache.\"\n\n arguments = [argument(\"cache\", description=\"The name of the cache to clear.\")]\n options = [option(\"all\", description=\"Clear all entries in the cache.\")]\n\n def handle(self) -> int:\n from cachy import CacheManager\n\n cache = self.argument(\"cache\")\n\n parts = cache.split(\":\")\n root = parts[0]\n\n config = Config.create()\n cache_dir = config.repository_cache_directory / root\n\n try:\n cache_dir.relative_to(config.repository_cache_directory)\n except ValueError:\n raise ValueError(f\"{root} is not a valid repository cache\")\n\n cache = CacheManager(\n {\n \"default\": parts[0],\n \"serializer\": \"json\",\n \"stores\": {parts[0]: {\"driver\": \"file\", \"path\": str(cache_dir)}},\n }\n )\n\n if len(parts) == 1:\n if not self.option(\"all\"):\n raise RuntimeError(\n f\"Add the --all option if you want to clear all {parts[0]} caches\"\n )\n\n if not cache_dir.exists():\n self.line(f\"No cache entries for {parts[0]}\")\n return 0\n\n # Calculate number of entries\n 
entries_count = sum(\n len(files) for _path, _dirs, files in os.walk(str(cache_dir))\n )\n\n delete = self.confirm(f\"<question>Delete {entries_count} entries?</>\", True)\n if not delete:\n return 0\n\n cache.flush()\n elif len(parts) == 2:\n raise RuntimeError(\n \"Only specifying the package name is not yet supported. \"\n \"Add a specific version to clear\"\n )\n elif len(parts) == 3:\n package = parts[1]\n version = parts[2]\n\n if not cache.has(f\"{package}:{version}\"):\n self.line(f\"No cache entries for {package}:{version}\")\n return 0\n\n delete = self.confirm(f\"Delete cache entry {package}:{version}\", True)\n if not delete:\n return 0\n\n cache.forget(f\"{package}:{version}\")\n else:\n raise ValueError(\"Invalid cache key\")\n\n return 0\n", "path": "src/poetry/console/commands/cache/clear.py"}]} | 1,318 | 210 |
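The poetry fix is a single extra argument: under `--no-interaction` a confirmation question is supposed to resolve to its default answer rather than block, and the old calls left that default as the implicit `False`. A toy model of the mechanism (illustrative only, not cleo's real implementation):

```python
def confirm(question, default=False, interactive=True):
    """Toy stand-in: a well-behaved CLI returns `default` when not interactive."""
    if not interactive:
        return default
    answer = input("{} (yes/no) [{}] ".format(question,
                                              "yes" if default else "no"))
    return answer.strip().lower().startswith("y") if answer else default

# With the old implicit default of False, a non-interactive run declines:
assert confirm("Delete 1882 entries?", interactive=False) is False
# With the patched call passing True, the deletion goes ahead:
assert confirm("Delete 1882 entries?", True, interactive=False) is True
```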
gh_patches_debug_890 | rasdani/github-patches | git_diff | falconry__falcon-801 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Default OPTIONS responder does not set Content-Length to "0"
Per RFC 7231:
> A server MUST generate a Content-Length field with a value of "0" if no payload body is to be sent in the response.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `falcon/responders.py`
Content:
```
1 # Copyright 2013 by Rackspace Hosting, Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from falcon.errors import HTTPBadRequest
16 from falcon.errors import HTTPNotFound
17 from falcon.status_codes import HTTP_204
18 from falcon.status_codes import HTTP_405
19
20
21 def path_not_found(req, resp, **kwargs):
22 """Raise 404 HTTPNotFound error"""
23 raise HTTPNotFound()
24
25
26 def bad_request(req, resp, **kwargs):
27 """Raise 400 HTTPBadRequest error"""
28 raise HTTPBadRequest('Bad request', 'Invalid HTTP method')
29
30
31 def create_method_not_allowed(allowed_methods):
32 """Creates a responder for "405 Method Not Allowed"
33
34 Args:
35 allowed_methods: A list of HTTP methods (uppercase) that should be
36 returned in the Allow header.
37
38 """
39 allowed = ', '.join(allowed_methods)
40
41 def method_not_allowed(req, resp, **kwargs):
42 resp.status = HTTP_405
43 resp.set_header('Allow', allowed)
44
45 return method_not_allowed
46
47
48 def create_default_options(allowed_methods):
49 """Creates a default responder for the OPTIONS method
50
51 Args:
52 allowed_methods: A list of HTTP methods (uppercase) that should be
53 returned in the Allow header.
54
55 """
56 allowed = ', '.join(allowed_methods)
57
58 def on_options(req, resp, **kwargs):
59 resp.status = HTTP_204
60 resp.set_header('Allow', allowed)
61
62 return on_options
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/falcon/responders.py b/falcon/responders.py
--- a/falcon/responders.py
+++ b/falcon/responders.py
@@ -58,5 +58,6 @@
def on_options(req, resp, **kwargs):
resp.status = HTTP_204
resp.set_header('Allow', allowed)
+ resp.set_header('Content-Length', '0')
return on_options
| {"golden_diff": "diff --git a/falcon/responders.py b/falcon/responders.py\n--- a/falcon/responders.py\n+++ b/falcon/responders.py\n@@ -58,5 +58,6 @@\n def on_options(req, resp, **kwargs):\n resp.status = HTTP_204\n resp.set_header('Allow', allowed)\n+ resp.set_header('Content-Length', '0')\n \n return on_options\n", "issue": "Default OPTIONS responder does not set Content-Length to \"0\"\nPer RFC 7231:\n\n> A server MUST generate a Content-Length field with a value of \"0\" if no payload body is to be sent in the response.\n\n", "before_files": [{"content": "# Copyright 2013 by Rackspace Hosting, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom falcon.errors import HTTPBadRequest\nfrom falcon.errors import HTTPNotFound\nfrom falcon.status_codes import HTTP_204\nfrom falcon.status_codes import HTTP_405\n\n\ndef path_not_found(req, resp, **kwargs):\n \"\"\"Raise 404 HTTPNotFound error\"\"\"\n raise HTTPNotFound()\n\n\ndef bad_request(req, resp, **kwargs):\n \"\"\"Raise 400 HTTPBadRequest error\"\"\"\n raise HTTPBadRequest('Bad request', 'Invalid HTTP method')\n\n\ndef create_method_not_allowed(allowed_methods):\n \"\"\"Creates a responder for \"405 Method Not Allowed\"\n\n Args:\n allowed_methods: A list of HTTP methods (uppercase) that should be\n returned in the Allow header.\n\n \"\"\"\n allowed = ', '.join(allowed_methods)\n\n def method_not_allowed(req, resp, **kwargs):\n resp.status = HTTP_405\n resp.set_header('Allow', allowed)\n\n return method_not_allowed\n\n\ndef create_default_options(allowed_methods):\n \"\"\"Creates a default responder for the OPTIONS method\n\n Args:\n allowed_methods: A list of HTTP methods (uppercase) that should be\n returned in the Allow header.\n\n \"\"\"\n allowed = ', '.join(allowed_methods)\n\n def on_options(req, resp, **kwargs):\n resp.status = HTTP_204\n resp.set_header('Allow', allowed)\n\n return on_options\n", "path": "falcon/responders.py"}], "after_files": [{"content": "# Copyright 2013 by Rackspace Hosting, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom falcon.errors import HTTPBadRequest\nfrom falcon.errors import HTTPNotFound\nfrom falcon.status_codes import HTTP_204\nfrom falcon.status_codes import HTTP_405\n\n\ndef path_not_found(req, resp, **kwargs):\n \"\"\"Raise 404 HTTPNotFound error\"\"\"\n raise HTTPNotFound()\n\n\ndef bad_request(req, resp, **kwargs):\n \"\"\"Raise 400 HTTPBadRequest error\"\"\"\n raise HTTPBadRequest('Bad request', 'Invalid HTTP method')\n\n\ndef 
create_method_not_allowed(allowed_methods):\n \"\"\"Creates a responder for \"405 Method Not Allowed\"\n\n Args:\n allowed_methods: A list of HTTP methods (uppercase) that should be\n returned in the Allow header.\n\n \"\"\"\n allowed = ', '.join(allowed_methods)\n\n def method_not_allowed(req, resp, **kwargs):\n resp.status = HTTP_405\n resp.set_header('Allow', allowed)\n\n return method_not_allowed\n\n\ndef create_default_options(allowed_methods):\n \"\"\"Creates a default responder for the OPTIONS method\n\n Args:\n allowed_methods: A list of HTTP methods (uppercase) that should be\n returned in the Allow header.\n\n \"\"\"\n allowed = ', '.join(allowed_methods)\n\n def on_options(req, resp, **kwargs):\n resp.status = HTTP_204\n resp.set_header('Allow', allowed)\n resp.set_header('Content-Length', '0')\n\n return on_options\n", "path": "falcon/responders.py"}]} | 867 | 92 |
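For reference, the post-fix falcon responder in isolation: apart from the import it is exactly `create_default_options` from the patched file, and the new header line is the only behavioral change (requires the `falcon` package):

```python
from falcon.status_codes import HTTP_204

def create_default_options(allowed_methods):
    """Build a default OPTIONS responder advertising `allowed_methods`."""
    allowed = ', '.join(allowed_methods)

    def on_options(req, resp, **kwargs):
        resp.status = HTTP_204
        resp.set_header('Allow', allowed)
        resp.set_header('Content-Length', '0')  # the fix: explicit zero length

    return on_options
```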
gh_patches_debug_13547 | rasdani/github-patches | git_diff | kartoza__prj.app-263 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Disqus functionality is currently broken
There should be disqus inline chat widgets on each version page and each entry page. Currently these are not working - can we work to fix it.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `django_project/core/settings/project.py`
Content:
```
1 # coding=utf-8
2
3 """Project level settings.
4
5 Adjust these values as needed but don't commit passwords etc. to any public
6 repository!
7 """
8
9 import os # noqa
10 from django.utils.translation import ugettext_lazy as _
11 from .utils import absolute_path
12 from .contrib import * # noqa
13
14 # Project apps
15 INSTALLED_APPS += (
16 'base',
17 'changes',
18 'github_issue',
19 'vota',
20 'disqus',
21 )
22
23 # Due to profile page does not available, this will redirect to home page after login
24 LOGIN_REDIRECT_URL = '/'
25
26 # How many versions to list in each project box
27 PROJECT_VERSION_LIST_SIZE = 10
28
29 # Set debug to false for production
30 DEBUG = TEMPLATE_DEBUG = False
31
32 SOUTH_TESTS_MIGRATE = False
33
34
35 # Set languages which want to be translated
36 LANGUAGES = (
37 ('en', _('English')),
38 ('af', _('Afrikaans')),
39 ('id', _('Indonesian')),
40 ('ko', _('Korean')),
41 )
42
43 # Set storage path for the translation files
44 LOCALE_PATHS = (absolute_path('locale'),)
45
46
47 MIDDLEWARE_CLASSES = (
48 # For nav bar generation
49 'core.custom_middleware.NavContextMiddleware',
50 ) + MIDDLEWARE_CLASSES
51
52 # Project specific javascript files to be pipelined
53 # For third party libs like jquery should go in contrib.py
54 PIPELINE_JS['project'] = {
55 'source_filenames': (
56 'js/csrf-ajax.js',
57 'js/changelog.js',
58 'js/github-issue.js'
59 ),
60 'output_filename': 'js/project.js',
61 }
62
63 # Project specific css files to be pipelined
64 # For third party libs like bootstrap should go in contrib.py
65 PIPELINE_CSS['project'] = {
66 'source_filenames': (
67 'css/changelog.css',
68 ),
69 'output_filename': 'css/project.css',
70 'extra_context': {
71 'media': 'screen, projection',
72 },
73 }
74
```
Path: `django_project/core/settings/contrib.py`
Content:
```
1 # coding=utf-8
2 """
3 core.settings.contrib
4 """
5 from .base import * # noqa
6
7 # Extra installed apps - grapelli needs to be added before others
8 INSTALLED_APPS = (
9 'grappelli',
10 ) + INSTALLED_APPS
11
12 INSTALLED_APPS += (
13 'raven.contrib.django.raven_compat', # enable Raven plugin
14 'crispy_forms',
15 'widget_tweaks', # lets us add some bootstrap css to form elements
16 'easy_thumbnails',
17 'reversion',
18 'rosetta',
19 'embed_video',
20 'django_hashedfilenamestorage',
21 'django_countries', # for sponsor addresses
22 # 'user_map',
23 )
24
25
26 MIGRATION_MODULES = {'accounts': 'core.migration'}
27
28 GRAPPELLI_ADMIN_TITLE = 'Site administration panel'
29
30 STOP_WORDS = (
31 'a', 'an', 'and', 'if', 'is', 'the', 'in', 'i', 'you', 'other',
32 'this', 'that'
33 )
34
35 CRISPY_TEMPLATE_PACK = 'bootstrap3'
36
37 # Easy-thumbnails options
38 THUMBNAIL_SUBDIR = 'thumbnails'
39 THUMBNAIL_ALIASES = {
40 '': {
41 'entry': {'size': (50, 50), 'crop': True},
42 'medium-entry': {'size': (100, 100), 'crop': True},
43 'large-entry': {'size': (400, 300), 'crop': True},
44 'thumb300x200': {'size': (300, 200), 'crop': True},
45 },
46 }
47
48 # Pipeline related settings
49
50 INSTALLED_APPS += (
51 'pipeline',)
52
53 MIDDLEWARE_CLASSES += (
54 # For rosetta localisation
55 'django.middleware.locale.LocaleMiddleware',
56 )
57
58 DEFAULT_FILE_STORAGE = (
59 'django_hashedfilenamestorage.storage.HashedFilenameFileSystemStorage')
60
61 # use underscore template function
62 PIPELINE_TEMPLATE_FUNC = '_.template'
63
64 # enable cached storage - requires uglify.js (node.js)
65 STATICFILES_STORAGE = 'pipeline.storage.PipelineCachedStorage'
66
67 # Contributed / third party js libs for pipeline compression
68 # For hand rolled js for this app, use project.py
69 PIPELINE_JS = {}
70
71 # Contributed / third party css for pipeline compression
72 # For hand rolled css for this app, use project.py
73 PIPELINE_CSS = {}
74
75 # These get enabled in prod.py
76 PIPELINE_ENABLED = False
77 PIPELINE_CSS_COMPRESSOR = None
78 PIPELINE_JS_COMPRESSOR = None
79
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/django_project/core/settings/contrib.py b/django_project/core/settings/contrib.py
--- a/django_project/core/settings/contrib.py
+++ b/django_project/core/settings/contrib.py
@@ -20,8 +20,12 @@
'django_hashedfilenamestorage',
'django_countries', # for sponsor addresses
# 'user_map',
+ 'disqus',
)
+# Set disqus and shortname
+# noinspection PyUnresolvedReferences
+from .secret import DISQUS_WEBSITE_SHORTNAME # noqa
MIGRATION_MODULES = {'accounts': 'core.migration'}
diff --git a/django_project/core/settings/project.py b/django_project/core/settings/project.py
--- a/django_project/core/settings/project.py
+++ b/django_project/core/settings/project.py
@@ -17,7 +17,6 @@
'changes',
'github_issue',
'vota',
- 'disqus',
)
# Due to profile page does not available, this will redirect to home page after login
| {"golden_diff": "diff --git a/django_project/core/settings/contrib.py b/django_project/core/settings/contrib.py\n--- a/django_project/core/settings/contrib.py\n+++ b/django_project/core/settings/contrib.py\n@@ -20,8 +20,12 @@\n 'django_hashedfilenamestorage',\n 'django_countries', # for sponsor addresses\n # 'user_map',\n+ 'disqus',\n )\n \n+# Set disqus and shortname\n+# noinspection PyUnresolvedReferences\n+from .secret import DISQUS_WEBSITE_SHORTNAME # noqa\n \n MIGRATION_MODULES = {'accounts': 'core.migration'}\n \ndiff --git a/django_project/core/settings/project.py b/django_project/core/settings/project.py\n--- a/django_project/core/settings/project.py\n+++ b/django_project/core/settings/project.py\n@@ -17,7 +17,6 @@\n 'changes',\n 'github_issue',\n 'vota',\n- 'disqus',\n )\n \n # Due to profile page does not available, this will redirect to home page after login\n", "issue": "Disqus functionality is currently broken\nThere should be disqus inline chat widgets on each version page and each entry page. Currently these are not working - can we work to fix it.\n\n", "before_files": [{"content": "# coding=utf-8\n\n\"\"\"Project level settings.\n\nAdjust these values as needed but don't commit passwords etc. to any public\nrepository!\n\"\"\"\n\nimport os # noqa\nfrom django.utils.translation import ugettext_lazy as _\nfrom .utils import absolute_path\nfrom .contrib import * # noqa\n\n# Project apps\nINSTALLED_APPS += (\n 'base',\n 'changes',\n 'github_issue',\n 'vota',\n 'disqus',\n)\n\n# Due to profile page does not available, this will redirect to home page after login\nLOGIN_REDIRECT_URL = '/'\n\n# How many versions to list in each project box\nPROJECT_VERSION_LIST_SIZE = 10\n\n# Set debug to false for production\nDEBUG = TEMPLATE_DEBUG = False\n\nSOUTH_TESTS_MIGRATE = False\n\n\n# Set languages which want to be translated\nLANGUAGES = (\n ('en', _('English')),\n ('af', _('Afrikaans')),\n ('id', _('Indonesian')),\n ('ko', _('Korean')),\n)\n\n# Set storage path for the translation files\nLOCALE_PATHS = (absolute_path('locale'),)\n\n\nMIDDLEWARE_CLASSES = (\n # For nav bar generation\n 'core.custom_middleware.NavContextMiddleware',\n) + MIDDLEWARE_CLASSES\n\n# Project specific javascript files to be pipelined\n# For third party libs like jquery should go in contrib.py\nPIPELINE_JS['project'] = {\n 'source_filenames': (\n 'js/csrf-ajax.js',\n 'js/changelog.js',\n 'js/github-issue.js'\n ),\n 'output_filename': 'js/project.js',\n}\n\n# Project specific css files to be pipelined\n# For third party libs like bootstrap should go in contrib.py\nPIPELINE_CSS['project'] = {\n 'source_filenames': (\n 'css/changelog.css',\n ),\n 'output_filename': 'css/project.css',\n 'extra_context': {\n 'media': 'screen, projection',\n },\n}\n", "path": "django_project/core/settings/project.py"}, {"content": "# coding=utf-8\n\"\"\"\ncore.settings.contrib\n\"\"\"\nfrom .base import * # noqa\n\n# Extra installed apps - grapelli needs to be added before others\nINSTALLED_APPS = (\n 'grappelli',\n) + INSTALLED_APPS\n\nINSTALLED_APPS += (\n 'raven.contrib.django.raven_compat', # enable Raven plugin\n 'crispy_forms',\n 'widget_tweaks', # lets us add some bootstrap css to form elements\n 'easy_thumbnails',\n 'reversion',\n 'rosetta',\n 'embed_video',\n 'django_hashedfilenamestorage',\n 'django_countries', # for sponsor addresses\n # 'user_map',\n)\n\n\nMIGRATION_MODULES = {'accounts': 'core.migration'}\n\nGRAPPELLI_ADMIN_TITLE = 'Site administration panel'\n\nSTOP_WORDS = (\n 'a', 'an', 'and', 'if', 'is', 'the', 
'in', 'i', 'you', 'other',\n 'this', 'that'\n)\n\nCRISPY_TEMPLATE_PACK = 'bootstrap3'\n\n# Easy-thumbnails options\nTHUMBNAIL_SUBDIR = 'thumbnails'\nTHUMBNAIL_ALIASES = {\n '': {\n 'entry': {'size': (50, 50), 'crop': True},\n 'medium-entry': {'size': (100, 100), 'crop': True},\n 'large-entry': {'size': (400, 300), 'crop': True},\n 'thumb300x200': {'size': (300, 200), 'crop': True},\n },\n}\n\n# Pipeline related settings\n\nINSTALLED_APPS += (\n 'pipeline',)\n\nMIDDLEWARE_CLASSES += (\n # For rosetta localisation\n 'django.middleware.locale.LocaleMiddleware',\n)\n\nDEFAULT_FILE_STORAGE = (\n 'django_hashedfilenamestorage.storage.HashedFilenameFileSystemStorage')\n\n# use underscore template function\nPIPELINE_TEMPLATE_FUNC = '_.template'\n\n# enable cached storage - requires uglify.js (node.js)\nSTATICFILES_STORAGE = 'pipeline.storage.PipelineCachedStorage'\n\n# Contributed / third party js libs for pipeline compression\n# For hand rolled js for this app, use project.py\nPIPELINE_JS = {}\n\n# Contributed / third party css for pipeline compression\n# For hand rolled css for this app, use project.py\nPIPELINE_CSS = {}\n\n# These get enabled in prod.py\nPIPELINE_ENABLED = False\nPIPELINE_CSS_COMPRESSOR = None\nPIPELINE_JS_COMPRESSOR = None\n", "path": "django_project/core/settings/contrib.py"}], "after_files": [{"content": "# coding=utf-8\n\n\"\"\"Project level settings.\n\nAdjust these values as needed but don't commit passwords etc. to any public\nrepository!\n\"\"\"\n\nimport os # noqa\nfrom django.utils.translation import ugettext_lazy as _\nfrom .utils import absolute_path\nfrom .contrib import * # noqa\n\n# Project apps\nINSTALLED_APPS += (\n 'base',\n 'changes',\n 'github_issue',\n 'vota',\n)\n\n# Due to profile page does not available, this will redirect to home page after login\nLOGIN_REDIRECT_URL = '/'\n\n# How many versions to list in each project box\nPROJECT_VERSION_LIST_SIZE = 10\n\n# Set debug to false for production\nDEBUG = TEMPLATE_DEBUG = False\n\nSOUTH_TESTS_MIGRATE = False\n\n\n# Set languages which want to be translated\nLANGUAGES = (\n ('en', _('English')),\n ('af', _('Afrikaans')),\n ('id', _('Indonesian')),\n ('ko', _('Korean')),\n)\n\n# Set storage path for the translation files\nLOCALE_PATHS = (absolute_path('locale'),)\n\n\nMIDDLEWARE_CLASSES = (\n # For nav bar generation\n 'core.custom_middleware.NavContextMiddleware',\n) + MIDDLEWARE_CLASSES\n\n# Project specific javascript files to be pipelined\n# For third party libs like jquery should go in contrib.py\nPIPELINE_JS['project'] = {\n 'source_filenames': (\n 'js/csrf-ajax.js',\n 'js/changelog.js',\n 'js/github-issue.js'\n ),\n 'output_filename': 'js/project.js',\n}\n\n# Project specific css files to be pipelined\n# For third party libs like bootstrap should go in contrib.py\nPIPELINE_CSS['project'] = {\n 'source_filenames': (\n 'css/changelog.css',\n ),\n 'output_filename': 'css/project.css',\n 'extra_context': {\n 'media': 'screen, projection',\n },\n}\n", "path": "django_project/core/settings/project.py"}, {"content": "# coding=utf-8\n\"\"\"\ncore.settings.contrib\n\"\"\"\nfrom .base import * # noqa\n\n# Extra installed apps - grapelli needs to be added before others\nINSTALLED_APPS = (\n 'grappelli',\n) + INSTALLED_APPS\n\nINSTALLED_APPS += (\n 'raven.contrib.django.raven_compat', # enable Raven plugin\n 'crispy_forms',\n 'widget_tweaks', # lets us add some bootstrap css to form elements\n 'easy_thumbnails',\n 'reversion',\n 'rosetta',\n 'embed_video',\n 'django_hashedfilenamestorage',\n 'django_countries', # 
for sponsor addresses\n # 'user_map',\n 'disqus',\n)\n\n# Set disqus and shortname\n# noinspection PyUnresolvedReferences\nfrom .secret import DISQUS_WEBSITE_SHORTNAME # noqa\n\nMIGRATION_MODULES = {'accounts': 'core.migration'}\n\nGRAPPELLI_ADMIN_TITLE = 'Site administration panel'\n\nSTOP_WORDS = (\n 'a', 'an', 'and', 'if', 'is', 'the', 'in', 'i', 'you', 'other',\n 'this', 'that'\n)\n\nCRISPY_TEMPLATE_PACK = 'bootstrap3'\n\n# Easy-thumbnails options\nTHUMBNAIL_SUBDIR = 'thumbnails'\nTHUMBNAIL_ALIASES = {\n '': {\n 'entry': {'size': (50, 50), 'crop': True},\n 'medium-entry': {'size': (100, 100), 'crop': True},\n 'large-entry': {'size': (400, 300), 'crop': True},\n 'thumb300x200': {'size': (300, 200), 'crop': True},\n },\n}\n\n# Pipeline related settings\n\nINSTALLED_APPS += (\n 'pipeline',)\n\nMIDDLEWARE_CLASSES += (\n # For rosetta localisation\n 'django.middleware.locale.LocaleMiddleware',\n)\n\nDEFAULT_FILE_STORAGE = (\n 'django_hashedfilenamestorage.storage.HashedFilenameFileSystemStorage')\n\n# use underscore template function\nPIPELINE_TEMPLATE_FUNC = '_.template'\n\n# enable cached storage - requires uglify.js (node.js)\nSTATICFILES_STORAGE = 'pipeline.storage.PipelineCachedStorage'\n\n# Contributed / third party js libs for pipeline compression\n# For hand rolled js for this app, use project.py\nPIPELINE_JS = {}\n\n# Contributed / third party css for pipeline compression\n# For hand rolled css for this app, use project.py\nPIPELINE_CSS = {}\n\n# These get enabled in prod.py\nPIPELINE_ENABLED = False\nPIPELINE_CSS_COMPRESSOR = None\nPIPELINE_JS_COMPRESSOR = None\n", "path": "django_project/core/settings/contrib.py"}]} | 1,584 | 228 |
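The prj.app diff is really about the project's layered settings: third-party apps and their configuration live in `contrib.py`, so `disqus` moves there together with an import of `DISQUS_WEBSITE_SHORTNAME` from the local `secret.py`, and is dropped from `project.py`. A runnable toy model of that assembly:

```python
# Toy model of the layered settings assembly (app names match the diff):
base_apps = ('grappelli',)
contrib_apps = ('crispy_forms', 'disqus')      # third-party layer (post-fix home)
project_apps = ('base', 'changes', 'vota')     # first-party layer only

INSTALLED_APPS = base_apps + contrib_apps + project_apps

# django-disqus reads its shortname from settings, so contrib.py now does:
#     from .secret import DISQUS_WEBSITE_SHORTNAME  # noqa
# which assumes secret.py defines that name.
assert INSTALLED_APPS.count('disqus') == 1     # app registered exactly once
```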
gh_patches_debug_20418 | rasdani/github-patches | git_diff | nonebot__nonebot2-238 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug: the built-in single_session plugin has a few bugs
**Describe the problem:**
The built-in `single_session` plugin can only handle `event`s that have a `get_session_id` method; if a `matcher` listens for a `metaevent`, its `run_preprocessor` hook raises an error.
**How to reproduce?**
[This line](https://github.com/nonebot/nonebot2/blob/93ffc93a80cf9e3103eb4a164e7b32ab3cdd0882/nonebot/plugins/single_session.py#L13) restricts the plugin to events that provide `get_session_id`, but events without that method get no extra handling, so an exception is raised.
Besides that, the [conditional below it](https://github.com/nonebot/nonebot2/blob/93ffc93a80cf9e3103eb4a164e7b32ab3cdd0882/nonebot/plugins/single_session.py#L16) is also wrong: an event encountered for the first time should not be ignored.
**Expected result**
The plugin works normally.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nonebot/plugins/single_session.py`
Content:
```
1 from typing import Dict, Optional
2
3 from nonebot.typing import T_State
4 from nonebot.matcher import Matcher
5 from nonebot.adapters import Bot, Event
6 from nonebot.message import run_preprocessor, run_postprocessor, IgnoredException
7
8 _running_matcher: Dict[str, int] = {}
9
10
11 @run_preprocessor
12 async def _(matcher: Matcher, bot: Bot, event: Event, state: T_State):
13 session_id = event.get_session_id()
14 event_id = id(event)
15
16 if _running_matcher.get(session_id, None) != event_id:
17 raise IgnoredException("Annother matcher running")
18
19 _running_matcher[session_id] = event_id
20
21
22 @run_postprocessor
23 async def _(matcher: Matcher, exception: Optional[Exception], bot: Bot, event: Event, state: T_State):
24 session_id = event.get_session_id()
25 if session_id in _running_matcher:
26 del _running_matcher[session_id]
27
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nonebot/plugins/single_session.py b/nonebot/plugins/single_session.py
--- a/nonebot/plugins/single_session.py
+++ b/nonebot/plugins/single_session.py
@@ -10,17 +10,23 @@
@run_preprocessor
async def _(matcher: Matcher, bot: Bot, event: Event, state: T_State):
- session_id = event.get_session_id()
- event_id = id(event)
-
- if _running_matcher.get(session_id, None) != event_id:
+ try:
+ session_id = event.get_session_id()
+ except Exception:
+ return
+ current_event_id = id(event)
+ event_id = _running_matcher.get(session_id, None)
+ if event_id and event_id != current_event_id:
raise IgnoredException("Annother matcher running")
- _running_matcher[session_id] = event_id
+ _running_matcher[session_id] = current_event_id
@run_postprocessor
async def _(matcher: Matcher, exception: Optional[Exception], bot: Bot, event: Event, state: T_State):
- session_id = event.get_session_id()
+ try:
+ session_id = event.get_session_id()
+ except Exception:
+ return
if session_id in _running_matcher:
del _running_matcher[session_id]
| {"golden_diff": "diff --git a/nonebot/plugins/single_session.py b/nonebot/plugins/single_session.py\n--- a/nonebot/plugins/single_session.py\n+++ b/nonebot/plugins/single_session.py\n@@ -10,17 +10,23 @@\n \n @run_preprocessor\n async def _(matcher: Matcher, bot: Bot, event: Event, state: T_State):\n- session_id = event.get_session_id()\n- event_id = id(event)\n-\n- if _running_matcher.get(session_id, None) != event_id:\n+ try:\n+ session_id = event.get_session_id()\n+ except Exception:\n+ return\n+ current_event_id = id(event)\n+ event_id = _running_matcher.get(session_id, None)\n+ if event_id and event_id != current_event_id:\n raise IgnoredException(\"Annother matcher running\")\n \n- _running_matcher[session_id] = event_id\n+ _running_matcher[session_id] = current_event_id\n \n \n @run_postprocessor\n async def _(matcher: Matcher, exception: Optional[Exception], bot: Bot, event: Event, state: T_State):\n- session_id = event.get_session_id()\n+ try:\n+ session_id = event.get_session_id()\n+ except Exception:\n+ return\n if session_id in _running_matcher:\n del _running_matcher[session_id]\n", "issue": "Bug: \u5185\u7f6e\u7684single_session\u63d2\u4ef6\u6709\u4e00\u4e9bbug\n**\u63cf\u8ff0\u95ee\u9898\uff1a**\r\n\r\n\u5185\u7f6e\u7684`single_session`\u63d2\u4ef6\u53ea\u80fd\u5904\u7406\u6709`get_session_id`\u65b9\u6cd5\u7684`event`\uff0c\u5982\u679c\u4e00\u4e2a`matcher`\u76d1\u542c\u4e86`metaevent`\uff0c\u90a3\u4e48\u5176\u4e2d\u7684`run_preprocessor`\u4f1a\u62a5\u9519\r\n\r\n**\u5982\u4f55\u590d\u73b0\uff1f**\r\n\r\n[\u8fd9\u4e00\u884c](https://github.com/nonebot/nonebot2/blob/93ffc93a80cf9e3103eb4a164e7b32ab3cdd0882/nonebot/plugins/single_session.py#L13)\u9650\u5236\u4e86\u53ea\u80fd\u76d1\u542c\u6709`get_session_id`\u7684\u4e8b\u4ef6\uff0c\u4f46\u662f\u5bf9\u6ca1\u6709\u8fd9\u4e2a\u65b9\u6cd5\u7684\u4e8b\u4ef6\u6ca1\u6709\u505a\u989d\u5916\u7684\u5904\u7406\uff0c\u5bfc\u81f4\u62a5\u9519\u3002\r\n\u9664\u6b64\u4e4b\u5916\uff0c\u4e0b\u9762\u7684[\u5224\u65ad\u8bed\u53e5](https://github.com/nonebot/nonebot2/blob/93ffc93a80cf9e3103eb4a164e7b32ab3cdd0882/nonebot/plugins/single_session.py#L16)\u4e5f\u6709\u95ee\u9898\uff0c\u5982\u679c\u8fd9\u4e2a\u4e8b\u4ef6\u7b2c\u4e00\u6b21\u9047\u5230\u7684\u8bdd\u4e0d\u5e94\u8be5\u88ab\u5ffd\u7565\r\n\r\n**\u671f\u671b\u7684\u7ed3\u679c**\r\n\u63d2\u4ef6\u6b63\u5e38\u4f7f\u7528\r\n\r\n````\r\n\n", "before_files": [{"content": "from typing import Dict, Optional\n\nfrom nonebot.typing import T_State\nfrom nonebot.matcher import Matcher\nfrom nonebot.adapters import Bot, Event\nfrom nonebot.message import run_preprocessor, run_postprocessor, IgnoredException\n\n_running_matcher: Dict[str, int] = {}\n\n\n@run_preprocessor\nasync def _(matcher: Matcher, bot: Bot, event: Event, state: T_State):\n session_id = event.get_session_id()\n event_id = id(event)\n\n if _running_matcher.get(session_id, None) != event_id:\n raise IgnoredException(\"Annother matcher running\")\n\n _running_matcher[session_id] = event_id\n\n\n@run_postprocessor\nasync def _(matcher: Matcher, exception: Optional[Exception], bot: Bot, event: Event, state: T_State):\n session_id = event.get_session_id()\n if session_id in _running_matcher:\n del _running_matcher[session_id]\n", "path": "nonebot/plugins/single_session.py"}], "after_files": [{"content": "from typing import Dict, Optional\n\nfrom nonebot.typing import T_State\nfrom nonebot.matcher import Matcher\nfrom nonebot.adapters import Bot, Event\nfrom nonebot.message import run_preprocessor, run_postprocessor, 
IgnoredException\n\n_running_matcher: Dict[str, int] = {}\n\n\n@run_preprocessor\nasync def _(matcher: Matcher, bot: Bot, event: Event, state: T_State):\n try:\n session_id = event.get_session_id()\n except Exception:\n return\n current_event_id = id(event)\n event_id = _running_matcher.get(session_id, None)\n if event_id and event_id != current_event_id:\n raise IgnoredException(\"Annother matcher running\")\n\n _running_matcher[session_id] = current_event_id\n\n\n@run_postprocessor\nasync def _(matcher: Matcher, exception: Optional[Exception], bot: Bot, event: Event, state: T_State):\n try:\n session_id = event.get_session_id()\n except Exception:\n return\n if session_id in _running_matcher:\n del _running_matcher[session_id]\n", "path": "nonebot/plugins/single_session.py"}]} | 773 | 302 |
gh_patches_debug_607 | rasdani/github-patches | git_diff | pex-tool__pex-1446 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.49
On the docket:
+ [ ] Avoid re-using old ~/.pex/code/ caches. #1444
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.48"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.48"
+__version__ = "2.1.49"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.48\"\n+__version__ = \"2.1.49\"\n", "issue": "Release 2.1.49\nOn the docket:\r\n+ [ ] Avoid re-using old ~/.pex/code/ caches. #1444 \n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.48\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.49\"\n", "path": "pex/version.py"}]} | 342 | 96 |
gh_patches_debug_6873 | rasdani/github-patches | git_diff | DDMAL__CantusDB-454 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
required fields
On OldCantus, to create a source you need both a manuscript ID and a siglum (fields marked with asterisk) otherwise it won't create the source.
NewCantus has no asterisks on these fields, and was quite happy to let me make sources with no siglum (though it does tell me to fill out an ID field if I try to submit without it.)
On the chant level, Folio and Sequence seem to be required fields (they are not on OldCantus!) but are not marked as such with asterisks, either.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `django/cantusdb_project/main_app/models/source.py`
Content:
```
1 from django.db import models
2 from main_app.models import BaseModel, Segment
3 from django.contrib.auth import get_user_model
4
5
6 class Source(BaseModel):
7 cursus_choices = [("Monastic", "Monastic"), ("Secular", "Secular")]
8 source_status_choices = [
9 (
10 "Editing process (not all the fields have been proofread)",
11 "Editing process (not all the fields have been proofread)",
12 ),
13 ("Published / Complete", "Published / Complete"),
14 ("Published / Proofread pending", "Published / Proofread pending"),
15 ("Unpublished / Editing process", "Unpublished / Editing process"),
16 ("Unpublished / Indexing process", "Unpublished / Indexing process"),
17 ("Unpublished / Proofread pending", "Unpublished / Proofread pending"),
18 ("Unpublished / Proofreading process", "Unpublished / Proofreading process"),
19 ("Unpublished / No indexing activity", "Unpublished / No indexing activity"),
20 ]
21
22 # The old Cantus uses two fields to jointly control the access to sources.
23 # Here in the new Cantus, we only use one field, and there are two levels: published and unpublished.
24 # Published sources are available to the public.
25 # Unpublished sources are hidden from the list and cannot be accessed by URL until the user logs in.
26 published = models.BooleanField(blank=False, null=False, default=False)
27
28 title = models.CharField(
29 max_length=255,
30 help_text="Full Manuscript Identification (City, Archive, Shelf-mark)",
31 )
32 # the siglum field as implemented on the old Cantus is composed of both the RISM siglum and the shelfmark
33 # it is a human-readable ID for a source
34 siglum = models.CharField(
35 max_length=63,
36 null=True,
37 blank=True,
38 help_text="RISM-style siglum + Shelf-mark (e.g. GB-Ob 202).",
39 )
40 # the RISM siglum uniquely identifies a library or holding institution
41 rism_siglum = models.ForeignKey(
42 "RismSiglum", on_delete=models.PROTECT, null=True, blank=True,
43 )
44 provenance = models.ForeignKey(
45 "Provenance",
46 on_delete=models.PROTECT,
47 help_text="If the origin is unknown, select a location where the source was "
48 "used later in its lifetime and provide details in the "
49 '"Provenance notes" field.',
50 null=True,
51 blank=True,
52 related_name="sources",
53 )
54 provenance_notes = models.TextField(
55 blank=True,
56 null=True,
57 help_text="More exact indication of the provenance (if necessary)",
58 )
59 full_source = models.BooleanField(blank=True, null=True)
60 date = models.CharField(
61 blank=True,
62 null=True,
63 max_length=63,
64 help_text='Date of the manuscript (e.g. "1200s", "1300-1350", etc.)',
65 )
66 century = models.ManyToManyField("Century", related_name="sources", blank=True)
67 notation = models.ManyToManyField("Notation", related_name="sources", blank=True)
68 cursus = models.CharField(
69 blank=True, null=True, choices=cursus_choices, max_length=63
70 )
71 current_editors = models.ManyToManyField(get_user_model(), related_name="sources_user_can_edit", blank=True)
72
73 inventoried_by = models.ManyToManyField(
74 get_user_model(), related_name="inventoried_sources", blank=True
75 )
76 full_text_entered_by = models.ManyToManyField(
77 get_user_model(), related_name="entered_full_text_for_sources", blank=True
78 )
79 melodies_entered_by = models.ManyToManyField(
80 get_user_model(), related_name="entered_melody_for_sources", blank=True
81 )
82 proofreaders = models.ManyToManyField(get_user_model(), related_name="proofread_sources", blank=True)
83 other_editors = models.ManyToManyField(get_user_model(), related_name="edited_sources", blank=True)
84
85
86 segment = models.ForeignKey(
87 "Segment", on_delete=models.PROTECT, blank=True, null=True
88 )
89 source_status = models.CharField(blank=True, null=True, choices=source_status_choices, max_length=255)
90 complete_inventory = models.BooleanField(blank=True, null=True)
91 summary = models.TextField(blank=True, null=True)
92 liturgical_occasions = models.TextField(blank=True, null=True)
93 description = models.TextField(blank=True, null=True)
94 selected_bibliography = models.TextField(blank=True, null=True)
95 image_link = models.URLField(
96 blank=True,
97 null=True,
98 help_text='HTTP link to the image gallery of the source.',
99 )
100 indexing_notes = models.TextField(blank=True, null=True)
101 indexing_date = models.TextField(blank=True, null=True)
102 json_info = models.JSONField(blank=True, null=True)
103 fragmentarium_id = models.CharField(max_length=15, blank=True, null=True)
104 dact_id = models.CharField(max_length=15, blank=True, null=True)
105
106 # number_of_chants and number_of_melodies are used for rendering the source-list page (perhaps among other places)
107 # they are automatically recalculated in main_app.signals.update_source_chant_count and
108 # main_app.signals.update_source_melody_count every time a chant or sequence is saved or deleted
109 number_of_chants = models.IntegerField(blank=True, null=True)
110 number_of_melodies = models.IntegerField(blank=True, null=True)
111
112 def __str__(self):
113 string = '[{s}] {t} ({i})'.format(s=self.rism_siglum, t=self.title, i=self.id)
114 return string
115
116 def save(self, *args, **kwargs):
117 # when creating a source, assign it to "CANTUS Database" segment by default
118 if not self.segment:
119 cantus_db_segment = Segment.objects.get(name="CANTUS Database")
120 self.segment = cantus_db_segment
121 super().save(*args, **kwargs)
122
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/django/cantusdb_project/main_app/models/source.py b/django/cantusdb_project/main_app/models/source.py
--- a/django/cantusdb_project/main_app/models/source.py
+++ b/django/cantusdb_project/main_app/models/source.py
@@ -33,8 +33,8 @@
# it is a human-readable ID for a source
siglum = models.CharField(
max_length=63,
- null=True,
- blank=True,
+ null=False,
+ blank=False,
help_text="RISM-style siglum + Shelf-mark (e.g. GB-Ob 202).",
)
# the RISM siglum uniquely identifies a library or holding institution
| {"golden_diff": "diff --git a/django/cantusdb_project/main_app/models/source.py b/django/cantusdb_project/main_app/models/source.py\n--- a/django/cantusdb_project/main_app/models/source.py\n+++ b/django/cantusdb_project/main_app/models/source.py\n@@ -33,8 +33,8 @@\n # it is a human-readable ID for a source\n siglum = models.CharField(\n max_length=63, \n- null=True, \n- blank=True,\n+ null=False, \n+ blank=False,\n help_text=\"RISM-style siglum + Shelf-mark (e.g. GB-Ob 202).\",\n )\n # the RISM siglum uniquely identifies a library or holding institution\n", "issue": "required fields \nOn OldCantus, to create a source you need both a manuscript ID and a siglum (fields marked with asterisk) otherwise it won't create the source. \r\nNewCantus has no asterisks on these fields, and was quite happy to let me make sources with no siglum (though it does tell me to fill out an ID field if I try to submit without it.)\r\n\r\nOn the chant level, Folio and Sequence seem to be required fields (they are not on OldCantus!) but are not marked as such with asterisks, either. \n", "before_files": [{"content": "from django.db import models\nfrom main_app.models import BaseModel, Segment\nfrom django.contrib.auth import get_user_model\n\n\nclass Source(BaseModel):\n cursus_choices = [(\"Monastic\", \"Monastic\"), (\"Secular\", \"Secular\")]\n source_status_choices = [\n (\n \"Editing process (not all the fields have been proofread)\",\n \"Editing process (not all the fields have been proofread)\",\n ),\n (\"Published / Complete\", \"Published / Complete\"),\n (\"Published / Proofread pending\", \"Published / Proofread pending\"),\n (\"Unpublished / Editing process\", \"Unpublished / Editing process\"),\n (\"Unpublished / Indexing process\", \"Unpublished / Indexing process\"),\n (\"Unpublished / Proofread pending\", \"Unpublished / Proofread pending\"),\n (\"Unpublished / Proofreading process\", \"Unpublished / Proofreading process\"),\n (\"Unpublished / No indexing activity\", \"Unpublished / No indexing activity\"),\n ]\n\n # The old Cantus uses two fields to jointly control the access to sources. \n # Here in the new Cantus, we only use one field, and there are two levels: published and unpublished.\n # Published sources are available to the public. \n # Unpublished sources are hidden from the list and cannot be accessed by URL until the user logs in.\n published = models.BooleanField(blank=False, null=False, default=False)\n\n title = models.CharField(\n max_length=255,\n help_text=\"Full Manuscript Identification (City, Archive, Shelf-mark)\",\n )\n # the siglum field as implemented on the old Cantus is composed of both the RISM siglum and the shelfmark\n # it is a human-readable ID for a source\n siglum = models.CharField(\n max_length=63, \n null=True, \n blank=True,\n help_text=\"RISM-style siglum + Shelf-mark (e.g. 
GB-Ob 202).\",\n )\n # the RISM siglum uniquely identifies a library or holding institution\n rism_siglum = models.ForeignKey(\n \"RismSiglum\", on_delete=models.PROTECT, null=True, blank=True,\n )\n provenance = models.ForeignKey(\n \"Provenance\",\n on_delete=models.PROTECT,\n help_text=\"If the origin is unknown, select a location where the source was \"\n \"used later in its lifetime and provide details in the \"\n '\"Provenance notes\" field.',\n null=True,\n blank=True,\n related_name=\"sources\",\n )\n provenance_notes = models.TextField(\n blank=True,\n null=True,\n help_text=\"More exact indication of the provenance (if necessary)\",\n )\n full_source = models.BooleanField(blank=True, null=True)\n date = models.CharField(\n blank=True,\n null=True,\n max_length=63,\n help_text='Date of the manuscript (e.g. \"1200s\", \"1300-1350\", etc.)',\n )\n century = models.ManyToManyField(\"Century\", related_name=\"sources\", blank=True)\n notation = models.ManyToManyField(\"Notation\", related_name=\"sources\", blank=True)\n cursus = models.CharField(\n blank=True, null=True, choices=cursus_choices, max_length=63\n )\n current_editors = models.ManyToManyField(get_user_model(), related_name=\"sources_user_can_edit\", blank=True)\n \n inventoried_by = models.ManyToManyField(\n get_user_model(), related_name=\"inventoried_sources\", blank=True\n )\n full_text_entered_by = models.ManyToManyField(\n get_user_model(), related_name=\"entered_full_text_for_sources\", blank=True\n )\n melodies_entered_by = models.ManyToManyField(\n get_user_model(), related_name=\"entered_melody_for_sources\", blank=True\n )\n proofreaders = models.ManyToManyField(get_user_model(), related_name=\"proofread_sources\", blank=True)\n other_editors = models.ManyToManyField(get_user_model(), related_name=\"edited_sources\", blank=True)\n \n\n segment = models.ForeignKey(\n \"Segment\", on_delete=models.PROTECT, blank=True, null=True\n )\n source_status = models.CharField(blank=True, null=True, choices=source_status_choices, max_length=255)\n complete_inventory = models.BooleanField(blank=True, null=True)\n summary = models.TextField(blank=True, null=True)\n liturgical_occasions = models.TextField(blank=True, null=True)\n description = models.TextField(blank=True, null=True)\n selected_bibliography = models.TextField(blank=True, null=True)\n image_link = models.URLField(\n blank=True, \n null=True,\n help_text='HTTP link to the image gallery of the source.',\n )\n indexing_notes = models.TextField(blank=True, null=True)\n indexing_date = models.TextField(blank=True, null=True)\n json_info = models.JSONField(blank=True, null=True)\n fragmentarium_id = models.CharField(max_length=15, blank=True, null=True)\n dact_id = models.CharField(max_length=15, blank=True, null=True)\n\n # number_of_chants and number_of_melodies are used for rendering the source-list page (perhaps among other places)\n # they are automatically recalculated in main_app.signals.update_source_chant_count and\n # main_app.signals.update_source_melody_count every time a chant or sequence is saved or deleted\n number_of_chants = models.IntegerField(blank=True, null=True)\n number_of_melodies = models.IntegerField(blank=True, null=True)\n\n def __str__(self):\n string = '[{s}] {t} ({i})'.format(s=self.rism_siglum, t=self.title, i=self.id)\n return string\n\n def save(self, *args, **kwargs):\n # when creating a source, assign it to \"CANTUS Database\" segment by default\n if not self.segment:\n cantus_db_segment = Segment.objects.get(name=\"CANTUS 
Database\")\n self.segment = cantus_db_segment\n super().save(*args, **kwargs)\n", "path": "django/cantusdb_project/main_app/models/source.py"}], "after_files": [{"content": "from django.db import models\nfrom main_app.models import BaseModel, Segment\nfrom django.contrib.auth import get_user_model\n\n\nclass Source(BaseModel):\n cursus_choices = [(\"Monastic\", \"Monastic\"), (\"Secular\", \"Secular\")]\n source_status_choices = [\n (\n \"Editing process (not all the fields have been proofread)\",\n \"Editing process (not all the fields have been proofread)\",\n ),\n (\"Published / Complete\", \"Published / Complete\"),\n (\"Published / Proofread pending\", \"Published / Proofread pending\"),\n (\"Unpublished / Editing process\", \"Unpublished / Editing process\"),\n (\"Unpublished / Indexing process\", \"Unpublished / Indexing process\"),\n (\"Unpublished / Proofread pending\", \"Unpublished / Proofread pending\"),\n (\"Unpublished / Proofreading process\", \"Unpublished / Proofreading process\"),\n (\"Unpublished / No indexing activity\", \"Unpublished / No indexing activity\"),\n ]\n\n # The old Cantus uses two fields to jointly control the access to sources. \n # Here in the new Cantus, we only use one field, and there are two levels: published and unpublished.\n # Published sources are available to the public. \n # Unpublished sources are hidden from the list and cannot be accessed by URL until the user logs in.\n published = models.BooleanField(blank=False, null=False, default=False)\n\n title = models.CharField(\n max_length=255,\n help_text=\"Full Manuscript Identification (City, Archive, Shelf-mark)\",\n )\n # the siglum field as implemented on the old Cantus is composed of both the RISM siglum and the shelfmark\n # it is a human-readable ID for a source\n siglum = models.CharField(\n max_length=63, \n null=False, \n blank=False,\n help_text=\"RISM-style siglum + Shelf-mark (e.g. GB-Ob 202).\",\n )\n # the RISM siglum uniquely identifies a library or holding institution\n rism_siglum = models.ForeignKey(\n \"RismSiglum\", on_delete=models.PROTECT, null=True, blank=True,\n )\n provenance = models.ForeignKey(\n \"Provenance\",\n on_delete=models.PROTECT,\n help_text=\"If the origin is unknown, select a location where the source was \"\n \"used later in its lifetime and provide details in the \"\n '\"Provenance notes\" field.',\n null=True,\n blank=True,\n related_name=\"sources\",\n )\n provenance_notes = models.TextField(\n blank=True,\n null=True,\n help_text=\"More exact indication of the provenance (if necessary)\",\n )\n full_source = models.BooleanField(blank=True, null=True)\n date = models.CharField(\n blank=True,\n null=True,\n max_length=63,\n help_text='Date of the manuscript (e.g. 
\"1200s\", \"1300-1350\", etc.)',\n )\n century = models.ManyToManyField(\"Century\", related_name=\"sources\", blank=True)\n notation = models.ManyToManyField(\"Notation\", related_name=\"sources\", blank=True)\n cursus = models.CharField(\n blank=True, null=True, choices=cursus_choices, max_length=63\n )\n current_editors = models.ManyToManyField(get_user_model(), related_name=\"sources_user_can_edit\", blank=True)\n \n inventoried_by = models.ManyToManyField(\n get_user_model(), related_name=\"inventoried_sources\", blank=True\n )\n full_text_entered_by = models.ManyToManyField(\n get_user_model(), related_name=\"entered_full_text_for_sources\", blank=True\n )\n melodies_entered_by = models.ManyToManyField(\n get_user_model(), related_name=\"entered_melody_for_sources\", blank=True\n )\n proofreaders = models.ManyToManyField(get_user_model(), related_name=\"proofread_sources\", blank=True)\n other_editors = models.ManyToManyField(get_user_model(), related_name=\"edited_sources\", blank=True)\n \n\n segment = models.ForeignKey(\n \"Segment\", on_delete=models.PROTECT, blank=True, null=True\n )\n source_status = models.CharField(blank=True, null=True, choices=source_status_choices, max_length=255)\n complete_inventory = models.BooleanField(blank=True, null=True)\n summary = models.TextField(blank=True, null=True)\n liturgical_occasions = models.TextField(blank=True, null=True)\n description = models.TextField(blank=True, null=True)\n selected_bibliography = models.TextField(blank=True, null=True)\n image_link = models.URLField(\n blank=True, \n null=True,\n help_text='HTTP link to the image gallery of the source.',\n )\n indexing_notes = models.TextField(blank=True, null=True)\n indexing_date = models.TextField(blank=True, null=True)\n json_info = models.JSONField(blank=True, null=True)\n fragmentarium_id = models.CharField(max_length=15, blank=True, null=True)\n dact_id = models.CharField(max_length=15, blank=True, null=True)\n\n # number_of_chants and number_of_melodies are used for rendering the source-list page (perhaps among other places)\n # they are automatically recalculated in main_app.signals.update_source_chant_count and\n # main_app.signals.update_source_melody_count every time a chant or sequence is saved or deleted\n number_of_chants = models.IntegerField(blank=True, null=True)\n number_of_melodies = models.IntegerField(blank=True, null=True)\n\n def __str__(self):\n string = '[{s}] {t} ({i})'.format(s=self.rism_siglum, t=self.title, i=self.id)\n return string\n\n def save(self, *args, **kwargs):\n # when creating a source, assign it to \"CANTUS Database\" segment by default\n if not self.segment:\n cantus_db_segment = Segment.objects.get(name=\"CANTUS Database\")\n self.segment = cantus_db_segment\n super().save(*args, **kwargs)\n", "path": "django/cantusdb_project/main_app/models/source.py"}]} | 1,934 | 166 |
gh_patches_debug_33430 | rasdani/github-patches | git_diff | Mailu__Mailu-1268 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make roundcube log to the console
Roundcube is currently hard to debug because it logs into a special folder (/var/www/html/logs). It should log to stdout/stderr instead.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `webmails/roundcube/start.py`
Content:
```
1 #!/usr/bin/python3
2
3 import os
4 import logging as log
5 import sys
6 from socrate import conf
7
8 log.basicConfig(stream=sys.stderr, level=os.environ.get("LOG_LEVEL", "WARNING"))
9
10 os.environ["MAX_FILESIZE"] = str(int(int(os.environ.get("MESSAGE_SIZE_LIMIT"))*0.66/1048576))
11
12 conf.jinja("/php.ini", os.environ, "/usr/local/etc/php/conf.d/roundcube.ini")
13
14 # Fix some permissions
15 os.system("mkdir -p /data/gpg")
16 os.system("chown -R www-data:www-data /data")
17
18 # Run apache
19 os.execv("/usr/local/bin/apache2-foreground", ["apache2-foreground"])
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/webmails/roundcube/start.py b/webmails/roundcube/start.py
--- a/webmails/roundcube/start.py
+++ b/webmails/roundcube/start.py
@@ -4,16 +4,61 @@
import logging as log
import sys
from socrate import conf
+import subprocess
log.basicConfig(stream=sys.stderr, level=os.environ.get("LOG_LEVEL", "WARNING"))
os.environ["MAX_FILESIZE"] = str(int(int(os.environ.get("MESSAGE_SIZE_LIMIT"))*0.66/1048576))
+db_flavor=os.environ.get("ROUNDCUBE_DB_FLAVOR",os.environ.get("DB_FLAVOR","sqlite"))
+if db_flavor=="sqlite":
+ os.environ["DB_DSNW"]="sqlite:////data/roundcube.db"
+elif db_flavor=="mysql":
+ os.environ["DB_DSNW"]="mysql://%s:%s@%s/%s" % (
+ os.environ.get("ROUNDCUBE_DB_USER","roundcube"),
+ os.environ.get("ROUNDCUBE_DB_PW"),
+ os.environ.get("ROUNDCUBE_DB_HOST",os.environ.get("DB_HOST","database")),
+ os.environ.get("ROUNDCUBE_DB_NAME","roundcube")
+ )
+elif db_flavor=="postgresql":
+ os.environ["DB_DSNW"]="pgsql://%s:%s@%s/%s" % (
+ os.environ.get("ROUNDCUBE_DB_USER","roundcube"),
+ os.environ.get("ROUNDCUBE_DB_PW"),
+ os.environ.get("ROUNDCUBE_DB_HOST",os.environ.get("DB_HOST","database")),
+ os.environ.get("ROUNDCUBE_DB_NAME","roundcube")
+ )
+else:
+ print("Unknown ROUNDCUBE_DB_FLAVOR: %s",db_flavor)
+ exit(1)
+
+
+
conf.jinja("/php.ini", os.environ, "/usr/local/etc/php/conf.d/roundcube.ini")
# Fix some permissions
-os.system("mkdir -p /data/gpg")
-os.system("chown -R www-data:www-data /data")
+os.system("mkdir -p /data/gpg /var/www/html/logs")
+os.system("touch /var/www/html/logs/errors")
+os.system("chown -R www-data:www-data /data /var/www/html/logs")
+
+try:
+ print("Initializing database")
+ result=subprocess.check_output(["/var/www/html/bin/initdb.sh","--dir","/var/www/html/SQL"],stderr=subprocess.STDOUT)
+ print(result.decode())
+except subprocess.CalledProcessError as e:
+ if "already exists" in e.stdout.decode():
+ print("Already initialzed")
+ else:
+ print(e.stdout.decode())
+ quit(1)
+
+try:
+ print("Upgrading database")
+ subprocess.check_call(["/var/www/html/bin/update.sh","--version=?","-y"],stderr=subprocess.STDOUT)
+except subprocess.CalledProcessError as e:
+ quit(1)
+
+# Tail roundcube logs
+subprocess.Popen(["tail","-f","-n","0","/var/www/html/logs/errors"])
# Run apache
os.execv("/usr/local/bin/apache2-foreground", ["apache2-foreground"])
| {"golden_diff": "diff --git a/webmails/roundcube/start.py b/webmails/roundcube/start.py\n--- a/webmails/roundcube/start.py\n+++ b/webmails/roundcube/start.py\n@@ -4,16 +4,61 @@\n import logging as log\n import sys\n from socrate import conf\n+import subprocess\n \n log.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n \n os.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n \n+db_flavor=os.environ.get(\"ROUNDCUBE_DB_FLAVOR\",os.environ.get(\"DB_FLAVOR\",\"sqlite\"))\n+if db_flavor==\"sqlite\":\n+ os.environ[\"DB_DSNW\"]=\"sqlite:////data/roundcube.db\"\n+elif db_flavor==\"mysql\":\n+ os.environ[\"DB_DSNW\"]=\"mysql://%s:%s@%s/%s\" % (\n+ os.environ.get(\"ROUNDCUBE_DB_USER\",\"roundcube\"),\n+ os.environ.get(\"ROUNDCUBE_DB_PW\"),\n+ os.environ.get(\"ROUNDCUBE_DB_HOST\",os.environ.get(\"DB_HOST\",\"database\")),\n+ os.environ.get(\"ROUNDCUBE_DB_NAME\",\"roundcube\")\n+ )\n+elif db_flavor==\"postgresql\":\n+ os.environ[\"DB_DSNW\"]=\"pgsql://%s:%s@%s/%s\" % (\n+ os.environ.get(\"ROUNDCUBE_DB_USER\",\"roundcube\"),\n+ os.environ.get(\"ROUNDCUBE_DB_PW\"),\n+ os.environ.get(\"ROUNDCUBE_DB_HOST\",os.environ.get(\"DB_HOST\",\"database\")),\n+ os.environ.get(\"ROUNDCUBE_DB_NAME\",\"roundcube\")\n+ )\n+else:\n+ print(\"Unknown ROUNDCUBE_DB_FLAVOR: %s\",db_flavor)\n+ exit(1)\n+\n+\n+\n conf.jinja(\"/php.ini\", os.environ, \"/usr/local/etc/php/conf.d/roundcube.ini\")\n \n # Fix some permissions\n-os.system(\"mkdir -p /data/gpg\")\n-os.system(\"chown -R www-data:www-data /data\")\n+os.system(\"mkdir -p /data/gpg /var/www/html/logs\")\n+os.system(\"touch /var/www/html/logs/errors\")\n+os.system(\"chown -R www-data:www-data /data /var/www/html/logs\")\n+\n+try:\n+ print(\"Initializing database\")\n+ result=subprocess.check_output([\"/var/www/html/bin/initdb.sh\",\"--dir\",\"/var/www/html/SQL\"],stderr=subprocess.STDOUT)\n+ print(result.decode())\n+except subprocess.CalledProcessError as e:\n+ if \"already exists\" in e.stdout.decode():\n+ print(\"Already initialzed\")\n+ else:\n+ print(e.stdout.decode())\n+ quit(1)\n+\n+try:\n+ print(\"Upgrading database\")\n+ subprocess.check_call([\"/var/www/html/bin/update.sh\",\"--version=?\",\"-y\"],stderr=subprocess.STDOUT)\n+except subprocess.CalledProcessError as e:\n+ quit(1)\n+\n+# Tail roundcube logs\n+subprocess.Popen([\"tail\",\"-f\",\"-n\",\"0\",\"/var/www/html/logs/errors\"])\n \n # Run apache\n os.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])\n", "issue": "Make roundcube log to the console\nRoundcube is currently hard to debug because it logs into a special folder (/var/www/html/logs). 
It should log to stdout/stderr instead.\n", "before_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport logging as log\nimport sys\nfrom socrate import conf\n\nlog.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n\nos.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n\nconf.jinja(\"/php.ini\", os.environ, \"/usr/local/etc/php/conf.d/roundcube.ini\")\n\n# Fix some permissions\nos.system(\"mkdir -p /data/gpg\")\nos.system(\"chown -R www-data:www-data /data\")\n\n# Run apache\nos.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])\n", "path": "webmails/roundcube/start.py"}], "after_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport logging as log\nimport sys\nfrom socrate import conf\nimport subprocess\n\nlog.basicConfig(stream=sys.stderr, level=os.environ.get(\"LOG_LEVEL\", \"WARNING\"))\n\nos.environ[\"MAX_FILESIZE\"] = str(int(int(os.environ.get(\"MESSAGE_SIZE_LIMIT\"))*0.66/1048576))\n\ndb_flavor=os.environ.get(\"ROUNDCUBE_DB_FLAVOR\",os.environ.get(\"DB_FLAVOR\",\"sqlite\"))\nif db_flavor==\"sqlite\":\n os.environ[\"DB_DSNW\"]=\"sqlite:////data/roundcube.db\"\nelif db_flavor==\"mysql\":\n os.environ[\"DB_DSNW\"]=\"mysql://%s:%s@%s/%s\" % (\n os.environ.get(\"ROUNDCUBE_DB_USER\",\"roundcube\"),\n os.environ.get(\"ROUNDCUBE_DB_PW\"),\n os.environ.get(\"ROUNDCUBE_DB_HOST\",os.environ.get(\"DB_HOST\",\"database\")),\n os.environ.get(\"ROUNDCUBE_DB_NAME\",\"roundcube\")\n )\nelif db_flavor==\"postgresql\":\n os.environ[\"DB_DSNW\"]=\"pgsql://%s:%s@%s/%s\" % (\n os.environ.get(\"ROUNDCUBE_DB_USER\",\"roundcube\"),\n os.environ.get(\"ROUNDCUBE_DB_PW\"),\n os.environ.get(\"ROUNDCUBE_DB_HOST\",os.environ.get(\"DB_HOST\",\"database\")),\n os.environ.get(\"ROUNDCUBE_DB_NAME\",\"roundcube\")\n )\nelse:\n print(\"Unknown ROUNDCUBE_DB_FLAVOR: %s\",db_flavor)\n exit(1)\n\n\n\nconf.jinja(\"/php.ini\", os.environ, \"/usr/local/etc/php/conf.d/roundcube.ini\")\n\n# Fix some permissions\nos.system(\"mkdir -p /data/gpg /var/www/html/logs\")\nos.system(\"touch /var/www/html/logs/errors\")\nos.system(\"chown -R www-data:www-data /data /var/www/html/logs\")\n\ntry:\n print(\"Initializing database\")\n result=subprocess.check_output([\"/var/www/html/bin/initdb.sh\",\"--dir\",\"/var/www/html/SQL\"],stderr=subprocess.STDOUT)\n print(result.decode())\nexcept subprocess.CalledProcessError as e:\n if \"already exists\" in e.stdout.decode():\n print(\"Already initialzed\")\n else:\n print(e.stdout.decode())\n quit(1)\n\ntry:\n print(\"Upgrading database\")\n subprocess.check_call([\"/var/www/html/bin/update.sh\",\"--version=?\",\"-y\"],stderr=subprocess.STDOUT)\nexcept subprocess.CalledProcessError as e:\n quit(1)\n\n# Tail roundcube logs\nsubprocess.Popen([\"tail\",\"-f\",\"-n\",\"0\",\"/var/www/html/logs/errors\"])\n\n# Run apache\nos.execv(\"/usr/local/bin/apache2-foreground\", [\"apache2-foreground\"])\n", "path": "webmails/roundcube/start.py"}]} | 486 | 721 |
gh_patches_debug_13238 | rasdani/github-patches | git_diff | mindsdb__mindsdb-2007 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Response contains 'nan' instead of `null`
if do
```
select null, null, null from information_schema.tables limit 1;
```
then response will be:
```
+------+--------+--------+
| None | None_2 | None_3 |
+------+--------+--------+
| nan | nan | nan |
+------+--------+--------+
```
row values must be `null`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mindsdb/api/mysql/mysql_proxy/utilities/sql.py`
Content:
```
1 import duckdb
2 import pandas as pd
3 from mindsdb_sql import parse_sql
4 from mindsdb_sql.parser.ast import Select, Identifier, BinaryOperation, OrderBy
5 from mindsdb_sql.render.sqlalchemy_render import SqlalchemyRender
6
7 from mindsdb.utilities.log import log
8
9
10 def _remove_table_name(root):
11 if isinstance(root, BinaryOperation):
12 _remove_table_name(root.args[0])
13 _remove_table_name(root.args[1])
14 elif isinstance(root, Identifier):
15 root.parts = [root.parts[-1]]
16
17
18 def query_df(df, query):
19 """ Perform simple query ('select' from one table, without subqueries and joins) on DataFrame.
20
21 Args:
22 df (pandas.DataFrame): data
23 query (mindsdb_sql.parser.ast.Select | str): select query
24
25 Returns:
26 pandas.DataFrame
27 """
28
29 if isinstance(query, str):
30 query_ast = parse_sql(query, dialect='mysql')
31 else:
32 query_ast = query
33
34 if isinstance(query_ast, Select) is False or isinstance(query_ast.from_table, Identifier) is False:
35 raise Exception("Only 'SELECT from TABLE' statements supported for internal query")
36
37 query_ast.from_table.parts = ['df_table']
38 for identifier in query_ast.targets:
39 if isinstance(identifier, Identifier):
40 identifier.parts = [identifier.parts[-1]]
41 if isinstance(query_ast.order_by, list):
42 for orderby in query_ast.order_by:
43 if isinstance(orderby, OrderBy) and isinstance(orderby.field, Identifier):
44 orderby.field.parts = [orderby.field.parts[-1]]
45 _remove_table_name(query_ast.where)
46
47 render = SqlalchemyRender('postgres')
48 try:
49 query_str = render.get_string(query_ast, with_failback=False)
50 except Exception as e:
51 log.error(f"Exception during query casting to 'postgres' dialect. Query: {str(query)}. Error: {e}")
52 query_str = render.get_string(query_ast, with_failback=True)
53
54 res = duckdb.query_df(df, 'df_table', query_str)
55 result_df = res.df()
56 result_df = result_df.where(pd.notnull(result_df), None)
57 return result_df
58
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mindsdb/api/mysql/mysql_proxy/utilities/sql.py b/mindsdb/api/mysql/mysql_proxy/utilities/sql.py
--- a/mindsdb/api/mysql/mysql_proxy/utilities/sql.py
+++ b/mindsdb/api/mysql/mysql_proxy/utilities/sql.py
@@ -1,5 +1,5 @@
import duckdb
-import pandas as pd
+import numpy as np
from mindsdb_sql import parse_sql
from mindsdb_sql.parser.ast import Select, Identifier, BinaryOperation, OrderBy
from mindsdb_sql.render.sqlalchemy_render import SqlalchemyRender
@@ -53,5 +53,5 @@
res = duckdb.query_df(df, 'df_table', query_str)
result_df = res.df()
- result_df = result_df.where(pd.notnull(result_df), None)
+ result_df = result_df.replace({np.nan: None})
return result_df
| {"golden_diff": "diff --git a/mindsdb/api/mysql/mysql_proxy/utilities/sql.py b/mindsdb/api/mysql/mysql_proxy/utilities/sql.py\n--- a/mindsdb/api/mysql/mysql_proxy/utilities/sql.py\n+++ b/mindsdb/api/mysql/mysql_proxy/utilities/sql.py\n@@ -1,5 +1,5 @@\n import duckdb\n-import pandas as pd\n+import numpy as np\n from mindsdb_sql import parse_sql\n from mindsdb_sql.parser.ast import Select, Identifier, BinaryOperation, OrderBy\n from mindsdb_sql.render.sqlalchemy_render import SqlalchemyRender\n@@ -53,5 +53,5 @@\n \n res = duckdb.query_df(df, 'df_table', query_str)\n result_df = res.df()\n- result_df = result_df.where(pd.notnull(result_df), None)\n+ result_df = result_df.replace({np.nan: None})\n return result_df\n", "issue": "Response contains 'nan' instead of `null`\nif do \r\n```\r\nselect null, null, null from information_schema.tables limit 1;\r\n```\r\nthen response will be:\r\n```\r\n+------+--------+--------+\r\n| None | None_2 | None_3 |\r\n+------+--------+--------+\r\n| nan | nan | nan |\r\n+------+--------+--------+\r\n```\r\nrow values must be `null`\r\n\n", "before_files": [{"content": "import duckdb\nimport pandas as pd\nfrom mindsdb_sql import parse_sql\nfrom mindsdb_sql.parser.ast import Select, Identifier, BinaryOperation, OrderBy\nfrom mindsdb_sql.render.sqlalchemy_render import SqlalchemyRender\n\nfrom mindsdb.utilities.log import log\n\n\ndef _remove_table_name(root):\n if isinstance(root, BinaryOperation):\n _remove_table_name(root.args[0])\n _remove_table_name(root.args[1])\n elif isinstance(root, Identifier):\n root.parts = [root.parts[-1]]\n\n\ndef query_df(df, query):\n \"\"\" Perform simple query ('select' from one table, without subqueries and joins) on DataFrame.\n\n Args:\n df (pandas.DataFrame): data\n query (mindsdb_sql.parser.ast.Select | str): select query\n\n Returns:\n pandas.DataFrame\n \"\"\"\n\n if isinstance(query, str):\n query_ast = parse_sql(query, dialect='mysql')\n else:\n query_ast = query\n\n if isinstance(query_ast, Select) is False or isinstance(query_ast.from_table, Identifier) is False:\n raise Exception(\"Only 'SELECT from TABLE' statements supported for internal query\")\n\n query_ast.from_table.parts = ['df_table']\n for identifier in query_ast.targets:\n if isinstance(identifier, Identifier):\n identifier.parts = [identifier.parts[-1]]\n if isinstance(query_ast.order_by, list):\n for orderby in query_ast.order_by:\n if isinstance(orderby, OrderBy) and isinstance(orderby.field, Identifier):\n orderby.field.parts = [orderby.field.parts[-1]]\n _remove_table_name(query_ast.where)\n\n render = SqlalchemyRender('postgres')\n try:\n query_str = render.get_string(query_ast, with_failback=False)\n except Exception as e:\n log.error(f\"Exception during query casting to 'postgres' dialect. Query: {str(query)}. 
Error: {e}\")\n query_str = render.get_string(query_ast, with_failback=True)\n\n res = duckdb.query_df(df, 'df_table', query_str)\n result_df = res.df()\n result_df = result_df.where(pd.notnull(result_df), None)\n return result_df\n", "path": "mindsdb/api/mysql/mysql_proxy/utilities/sql.py"}], "after_files": [{"content": "import duckdb\nimport numpy as np\nfrom mindsdb_sql import parse_sql\nfrom mindsdb_sql.parser.ast import Select, Identifier, BinaryOperation, OrderBy\nfrom mindsdb_sql.render.sqlalchemy_render import SqlalchemyRender\n\nfrom mindsdb.utilities.log import log\n\n\ndef _remove_table_name(root):\n if isinstance(root, BinaryOperation):\n _remove_table_name(root.args[0])\n _remove_table_name(root.args[1])\n elif isinstance(root, Identifier):\n root.parts = [root.parts[-1]]\n\n\ndef query_df(df, query):\n \"\"\" Perform simple query ('select' from one table, without subqueries and joins) on DataFrame.\n\n Args:\n df (pandas.DataFrame): data\n query (mindsdb_sql.parser.ast.Select | str): select query\n\n Returns:\n pandas.DataFrame\n \"\"\"\n\n if isinstance(query, str):\n query_ast = parse_sql(query, dialect='mysql')\n else:\n query_ast = query\n\n if isinstance(query_ast, Select) is False or isinstance(query_ast.from_table, Identifier) is False:\n raise Exception(\"Only 'SELECT from TABLE' statements supported for internal query\")\n\n query_ast.from_table.parts = ['df_table']\n for identifier in query_ast.targets:\n if isinstance(identifier, Identifier):\n identifier.parts = [identifier.parts[-1]]\n if isinstance(query_ast.order_by, list):\n for orderby in query_ast.order_by:\n if isinstance(orderby, OrderBy) and isinstance(orderby.field, Identifier):\n orderby.field.parts = [orderby.field.parts[-1]]\n _remove_table_name(query_ast.where)\n\n render = SqlalchemyRender('postgres')\n try:\n query_str = render.get_string(query_ast, with_failback=False)\n except Exception as e:\n log.error(f\"Exception during query casting to 'postgres' dialect. Query: {str(query)}. Error: {e}\")\n query_str = render.get_string(query_ast, with_failback=True)\n\n res = duckdb.query_df(df, 'df_table', query_str)\n result_df = res.df()\n result_df = result_df.replace({np.nan: None})\n return result_df\n", "path": "mindsdb/api/mysql/mysql_proxy/utilities/sql.py"}]} | 923 | 190 |
gh_patches_debug_28834 | rasdani/github-patches | git_diff | mampfes__hacs_waste_collection_schedule-1837 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug]: ART Trier Germany collecting no more Data
### I Have A Problem With:
A specific source
### What's Your Problem
ART Trier Germany collecting no more Data. It worked till yesterday. I think they have a new homepage.
The Calender is now empty, only one Entry on February 26th: A.R.T. Wichtiger Hinweis!
The link (https://www.art-trier.de/cms/abfuhrtermine-1002.html) in the Description for ART Trier doesn't work anymore. Get a 404 Error Page.
Ver. 1.45.1
### Source (if relevant)
art_trier_de
### Logs
```Shell
no relevant logs
```
### Relevant Configuration
```YAML
- name: art_trier_de
args:
district: "Fellerich"
zip_code: "54456"
```
### Checklist Source Error
- [ ] Use the example parameters for your source (often available in the documentation) (don't forget to restart Home Assistant after changing the configuration)
- [X] Checked that the website of your service provider is still working
- [ ] Tested my attributes on the service provider website (if possible)
- [X] I have tested with the latest version of the integration (master) (for HACS in the 3 dot menu of the integration click on "Redownload" and choose master as version)
### Checklist Sensor Error
- [X] Checked in the Home Assistant Calendar tab if the event names match the types names (if types argument is used)
### Required
- [X] I have searched past (closed AND opened) issues to see if this bug has already been reported, and it hasn't been.
- [X] I understand that people give their precious time for free, and thus I've done my very best to make this problem as easy as possible to investigate.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py`
Content:
```
1 import contextlib
2 from datetime import datetime
3 from typing import Optional
4 from urllib.parse import quote
5
6 import requests
7 from waste_collection_schedule import Collection # type: ignore[attr-defined]
8 from waste_collection_schedule.service.ICS import ICS
9
10 TITLE = "ART Trier"
11 DESCRIPTION = "Source for waste collection of ART Trier."
12 URL = "https://www.art-trier.de"
13 TEST_CASES = {
14 "Trier": {
15 "zip_code": "54296",
16 "district": "Stadt Trier, Universitätsring",
17 }, # # https://www.art-trier.de/ics-feed/54296_trier_universitaetsring_1-1800.ics
18 "Schweich": {
19 "zip_code": "54338",
20 "district": "Schweich (inkl. Issel)",
21 }, # https://www.art-trier.de/ics-feed/54338_schweich_inkl_issel_1-1800.ics
22 "Dreis": {
23 "zip_code": "54518",
24 "district": "Dreis",
25 }, # https://www.art-trier.de/ics-feed/54518_dreis_1-1800.ics
26 "Wittlich Marktplatz": {
27 "zip_code": "54516",
28 "district": "Wittlich, Marktplatz",
29 }, # https://www.art-trier.de/ics-feed/54516_wittlich_marktplatz_1-1800.ics
30 "Wittlich Wengerohr": {
31 "zip_code": "54516",
32 "district": "Wittlich-Wengerohr",
33 }, # https://www.art-trier.de/ics-feed/54516_wittlich%2Dwengerohr_1-1800.ics
34 }
35
36 API_URL = "https://www.art-trier.de/ics-feed"
37 REMINDER_DAY = (
38 "0" # The calendar event should be on the same day as the waste collection
39 )
40 REMINDER_TIME = "0600" # The calendar event should start on any hour of the correct day, so this does not matter much
41 ICON_MAP = {
42 "Altpapier": "mdi:package-variant",
43 "Restmüll": "mdi:trash-can",
44 "Gelber Sack": "mdi:recycle",
45 }
46 SPECIAL_CHARS = str.maketrans(
47 {
48 " ": "_",
49 "ä": "ae",
50 "ü": "ue",
51 "ö": "oe",
52 "ß": "ss",
53 "(": None,
54 ")": None,
55 ",": None,
56 ".": None,
57 }
58 )
59
60
61 class Source:
62 def __init__(self, district: str, zip_code: str):
63 self._district = quote(
64 district.lower().removeprefix("stadt ").translate(SPECIAL_CHARS).strip()
65 )
66 self._zip_code = zip_code
67 self._ics = ICS(regex=r"^A.R.T. Abfuhrtermin: (.*)", split_at=r" & ")
68
69 def fetch(self):
70 url = f"{API_URL}/{self._zip_code}_{self._district}_{REMINDER_DAY}-{REMINDER_TIME}.ics"
71
72 res = requests.get(url)
73 res.raise_for_status()
74
75 schedule = self._ics.convert(res.text)
76
77 return [
78 Collection(date=entry[0], t=entry[1], icon=ICON_MAP.get(entry[1]))
79 for entry in schedule
80 ]
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py b/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py
--- a/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py
+++ b/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py
@@ -1,13 +1,11 @@
-import contextlib
-from datetime import datetime
-from typing import Optional
+import logging
from urllib.parse import quote
import requests
from waste_collection_schedule import Collection # type: ignore[attr-defined]
from waste_collection_schedule.service.ICS import ICS
-TITLE = "ART Trier"
+TITLE = "ART Trier (Depreciated)"
DESCRIPTION = "Source for waste collection of ART Trier."
URL = "https://www.art-trier.de"
TEST_CASES = {
@@ -56,6 +54,7 @@
".": None,
}
)
+LOGGER = logging.getLogger(__name__)
class Source:
@@ -67,7 +66,11 @@
self._ics = ICS(regex=r"^A.R.T. Abfuhrtermin: (.*)", split_at=r" & ")
def fetch(self):
- url = f"{API_URL}/{self._zip_code}_{self._district}_{REMINDER_DAY}-{REMINDER_TIME}.ics"
+ LOGGER.warning(
+ "The ART Trier source is deprecated and might not work with all addresses anymore."
+ " Please use the ICS instead: https://github.com/mampfes/hacs_waste_collection_schedule/blob/master/doc/ics/art_trier_de.md"
+ )
+ url = f"{API_URL}/{self._zip_code}:{self._district}::@{REMINDER_DAY}-{REMINDER_TIME}.ics"
res = requests.get(url)
res.raise_for_status()
| {"golden_diff": "diff --git a/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py b/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py\n--- a/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py\n+++ b/custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py\n@@ -1,13 +1,11 @@\n-import contextlib\n-from datetime import datetime\n-from typing import Optional\n+import logging\n from urllib.parse import quote\n \n import requests\n from waste_collection_schedule import Collection # type: ignore[attr-defined]\n from waste_collection_schedule.service.ICS import ICS\n \n-TITLE = \"ART Trier\"\n+TITLE = \"ART Trier (Depreciated)\"\n DESCRIPTION = \"Source for waste collection of ART Trier.\"\n URL = \"https://www.art-trier.de\"\n TEST_CASES = {\n@@ -56,6 +54,7 @@\n \".\": None,\n }\n )\n+LOGGER = logging.getLogger(__name__)\n \n \n class Source:\n@@ -67,7 +66,11 @@\n self._ics = ICS(regex=r\"^A.R.T. Abfuhrtermin: (.*)\", split_at=r\" & \")\n \n def fetch(self):\n- url = f\"{API_URL}/{self._zip_code}_{self._district}_{REMINDER_DAY}-{REMINDER_TIME}.ics\"\n+ LOGGER.warning(\n+ \"The ART Trier source is deprecated and might not work with all addresses anymore.\"\n+ \" Please use the ICS instead: https://github.com/mampfes/hacs_waste_collection_schedule/blob/master/doc/ics/art_trier_de.md\"\n+ )\n+ url = f\"{API_URL}/{self._zip_code}:{self._district}::@{REMINDER_DAY}-{REMINDER_TIME}.ics\"\n \n res = requests.get(url)\n res.raise_for_status()\n", "issue": "[Bug]: ART Trier Germany collecting no more Data\n### I Have A Problem With:\n\nA specific source\n\n### What's Your Problem\n\nART Trier Germany collecting no more Data. It worked till yesterday. I think they have a new homepage.\r\nThe Calender is now empty, only one Entry on February 26th: A.R.T. Wichtiger Hinweis!\r\nThe link (https://www.art-trier.de/cms/abfuhrtermine-1002.html) in the Description for ART Trier doesn't work anymore. Get a 404 Error Page.\r\n\r\nVer. 
1.45.1\n\n### Source (if relevant)\n\nart_trier_de\n\n### Logs\n\n```Shell\nno relevant logs\n```\n\n\n### Relevant Configuration\n\n```YAML\n- name: art_trier_de\r\n args:\r\n district: \"Fellerich\"\r\n zip_code: \"54456\"\n```\n\n\n### Checklist Source Error\n\n- [ ] Use the example parameters for your source (often available in the documentation) (don't forget to restart Home Assistant after changing the configuration)\n- [X] Checked that the website of your service provider is still working\n- [ ] Tested my attributes on the service provider website (if possible)\n- [X] I have tested with the latest version of the integration (master) (for HACS in the 3 dot menu of the integration click on \"Redownload\" and choose master as version)\n\n### Checklist Sensor Error\n\n- [X] Checked in the Home Assistant Calendar tab if the event names match the types names (if types argument is used)\n\n### Required\n\n- [X] I have searched past (closed AND opened) issues to see if this bug has already been reported, and it hasn't been.\n- [X] I understand that people give their precious time for free, and thus I've done my very best to make this problem as easy as possible to investigate.\n", "before_files": [{"content": "import contextlib\nfrom datetime import datetime\nfrom typing import Optional\nfrom urllib.parse import quote\n\nimport requests\nfrom waste_collection_schedule import Collection # type: ignore[attr-defined]\nfrom waste_collection_schedule.service.ICS import ICS\n\nTITLE = \"ART Trier\"\nDESCRIPTION = \"Source for waste collection of ART Trier.\"\nURL = \"https://www.art-trier.de\"\nTEST_CASES = {\n \"Trier\": {\n \"zip_code\": \"54296\",\n \"district\": \"Stadt Trier, Universit\u00e4tsring\",\n }, # # https://www.art-trier.de/ics-feed/54296_trier_universitaetsring_1-1800.ics\n \"Schweich\": {\n \"zip_code\": \"54338\",\n \"district\": \"Schweich (inkl. Issel)\",\n }, # https://www.art-trier.de/ics-feed/54338_schweich_inkl_issel_1-1800.ics\n \"Dreis\": {\n \"zip_code\": \"54518\",\n \"district\": \"Dreis\",\n }, # https://www.art-trier.de/ics-feed/54518_dreis_1-1800.ics\n \"Wittlich Marktplatz\": {\n \"zip_code\": \"54516\",\n \"district\": \"Wittlich, Marktplatz\",\n }, # https://www.art-trier.de/ics-feed/54516_wittlich_marktplatz_1-1800.ics\n \"Wittlich Wengerohr\": {\n \"zip_code\": \"54516\",\n \"district\": \"Wittlich-Wengerohr\",\n }, # https://www.art-trier.de/ics-feed/54516_wittlich%2Dwengerohr_1-1800.ics\n}\n\nAPI_URL = \"https://www.art-trier.de/ics-feed\"\nREMINDER_DAY = (\n \"0\" # The calendar event should be on the same day as the waste collection\n)\nREMINDER_TIME = \"0600\" # The calendar event should start on any hour of the correct day, so this does not matter much\nICON_MAP = {\n \"Altpapier\": \"mdi:package-variant\",\n \"Restm\u00fcll\": \"mdi:trash-can\",\n \"Gelber Sack\": \"mdi:recycle\",\n}\nSPECIAL_CHARS = str.maketrans(\n {\n \" \": \"_\",\n \"\u00e4\": \"ae\",\n \"\u00fc\": \"ue\",\n \"\u00f6\": \"oe\",\n \"\u00df\": \"ss\",\n \"(\": None,\n \")\": None,\n \",\": None,\n \".\": None,\n }\n)\n\n\nclass Source:\n def __init__(self, district: str, zip_code: str):\n self._district = quote(\n district.lower().removeprefix(\"stadt \").translate(SPECIAL_CHARS).strip()\n )\n self._zip_code = zip_code\n self._ics = ICS(regex=r\"^A.R.T. 
Abfuhrtermin: (.*)\", split_at=r\" & \")\n\n def fetch(self):\n url = f\"{API_URL}/{self._zip_code}_{self._district}_{REMINDER_DAY}-{REMINDER_TIME}.ics\"\n\n res = requests.get(url)\n res.raise_for_status()\n\n schedule = self._ics.convert(res.text)\n\n return [\n Collection(date=entry[0], t=entry[1], icon=ICON_MAP.get(entry[1]))\n for entry in schedule\n ]\n", "path": "custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py"}], "after_files": [{"content": "import logging\nfrom urllib.parse import quote\n\nimport requests\nfrom waste_collection_schedule import Collection # type: ignore[attr-defined]\nfrom waste_collection_schedule.service.ICS import ICS\n\nTITLE = \"ART Trier (Depreciated)\"\nDESCRIPTION = \"Source for waste collection of ART Trier.\"\nURL = \"https://www.art-trier.de\"\nTEST_CASES = {\n \"Trier\": {\n \"zip_code\": \"54296\",\n \"district\": \"Stadt Trier, Universit\u00e4tsring\",\n }, # # https://www.art-trier.de/ics-feed/54296_trier_universitaetsring_1-1800.ics\n \"Schweich\": {\n \"zip_code\": \"54338\",\n \"district\": \"Schweich (inkl. Issel)\",\n }, # https://www.art-trier.de/ics-feed/54338_schweich_inkl_issel_1-1800.ics\n \"Dreis\": {\n \"zip_code\": \"54518\",\n \"district\": \"Dreis\",\n }, # https://www.art-trier.de/ics-feed/54518_dreis_1-1800.ics\n \"Wittlich Marktplatz\": {\n \"zip_code\": \"54516\",\n \"district\": \"Wittlich, Marktplatz\",\n }, # https://www.art-trier.de/ics-feed/54516_wittlich_marktplatz_1-1800.ics\n \"Wittlich Wengerohr\": {\n \"zip_code\": \"54516\",\n \"district\": \"Wittlich-Wengerohr\",\n }, # https://www.art-trier.de/ics-feed/54516_wittlich%2Dwengerohr_1-1800.ics\n}\n\nAPI_URL = \"https://www.art-trier.de/ics-feed\"\nREMINDER_DAY = (\n \"0\" # The calendar event should be on the same day as the waste collection\n)\nREMINDER_TIME = \"0600\" # The calendar event should start on any hour of the correct day, so this does not matter much\nICON_MAP = {\n \"Altpapier\": \"mdi:package-variant\",\n \"Restm\u00fcll\": \"mdi:trash-can\",\n \"Gelber Sack\": \"mdi:recycle\",\n}\nSPECIAL_CHARS = str.maketrans(\n {\n \" \": \"_\",\n \"\u00e4\": \"ae\",\n \"\u00fc\": \"ue\",\n \"\u00f6\": \"oe\",\n \"\u00df\": \"ss\",\n \"(\": None,\n \")\": None,\n \",\": None,\n \".\": None,\n }\n)\nLOGGER = logging.getLogger(__name__)\n\n\nclass Source:\n def __init__(self, district: str, zip_code: str):\n self._district = quote(\n district.lower().removeprefix(\"stadt \").translate(SPECIAL_CHARS).strip()\n )\n self._zip_code = zip_code\n self._ics = ICS(regex=r\"^A.R.T. Abfuhrtermin: (.*)\", split_at=r\" & \")\n\n def fetch(self):\n LOGGER.warning(\n \"The ART Trier source is deprecated and might not work with all addresses anymore.\"\n \" Please use the ICS instead: https://github.com/mampfes/hacs_waste_collection_schedule/blob/master/doc/ics/art_trier_de.md\"\n )\n url = f\"{API_URL}/{self._zip_code}:{self._district}::@{REMINDER_DAY}-{REMINDER_TIME}.ics\"\n\n res = requests.get(url)\n res.raise_for_status()\n\n schedule = self._ics.convert(res.text)\n\n return [\n Collection(date=entry[0], t=entry[1], icon=ICON_MAP.get(entry[1]))\n for entry in schedule\n ]\n", "path": "custom_components/waste_collection_schedule/waste_collection_schedule/source/art_trier_de.py"}]} | 1,625 | 402 |