problem_id (stringlengths 18–22) | source (stringclasses, 1 value) | task_type (stringclasses, 1 value) | in_source_id (stringlengths 13–58) | prompt (stringlengths 1.1k–10.2k) | golden_diff (stringlengths 151–4.94k) | verification_info (stringlengths 582–21k) | num_tokens (int64 271–2.05k) | num_tokens_diff (int64 47–1.02k) |
---|---|---|---|---|---|---|---|---|
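Each row below pairs a GitHub issue with the repository files it touches, a reference patch (`golden_diff`), and a JSON blob (`verification_info`) holding the pre- and post-patch file contents. As a rough usage sketch (the dataset id is assumed from the `source` column and the `train` split name is a guess; the column names and JSON keys are taken from the rows shown below), the columns can be loaded and inspected like this:

```python
import json

from datasets import load_dataset

# Assumed Hub id, taken from the `source` column of the rows below.
ds = load_dataset("rasdani/github-patches", split="train")

row = ds[0]
print(row["problem_id"], row["in_source_id"], row["num_tokens"], row["num_tokens_diff"])

# `verification_info` is a JSON string with the issue text, the golden diff,
# and the file contents before and after the patch.
info = json.loads(row["verification_info"])
print(info["issue"][:120])
print([f["path"] for f in info["before_files"]])
print(info["golden_diff"].splitlines()[0])
```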
gh_patches_debug_13037 | rasdani/github-patches | git_diff | psf__black-2739 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Custom cache directory to be thread/process safe
**Is your feature request related to a problem? Please describe.**
The cache files are not thread/process safe. I've checked multiple projects simultaneously in different processes but get permission errors when one process is writing to the cache while another is reading from it. On linux the fix is pretty easy: set a different temp directory for each process using the ``XDG_CACHE_HOME`` environment variable but there is no equivalent for Windows (there's somewhat an equivalent but you need a rather [specific setup](https://github.com/platformdirs/platformdirs/blob/main/src/platformdirs/windows.py#L157-L165) to use the environment variable).
**Describe the solution you'd like**
The solution that will work for my use case (and I think is the least amount of changes) is to create a custom environment variable that overrides using platformdirs to get the cache directory. Think this could look like:
```python
CACHE_DIR = Path(os.environ.get("BLACK_CACHE_DIR", user_cache_dir("black", version=__version__)))
if not CACHE_DIR.exists():
raise RuntimeError(f"{CACHE_DIR} does not exist")
```
**Describe alternatives you've considered**
1. Add the command line option to turn off cacheing (as this ticket as #248 asked for) (guess is this more work)
2. Add command line option to set the cache directory (guess is this is more work but not a lot)
3. Make the cache dir thread/process safe (guess is this is more work)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/black/cache.py`
Content:
```
1 """Caching of formatted files with feature-based invalidation."""
2
3 import os
4 import pickle
5 from pathlib import Path
6 import tempfile
7 from typing import Dict, Iterable, Set, Tuple
8
9 from platformdirs import user_cache_dir
10
11 from black.mode import Mode
12
13 from _black_version import version as __version__
14
15
16 # types
17 Timestamp = float
18 FileSize = int
19 CacheInfo = Tuple[Timestamp, FileSize]
20 Cache = Dict[str, CacheInfo]
21
22
23 CACHE_DIR = Path(user_cache_dir("black", version=__version__))
24
25
26 def read_cache(mode: Mode) -> Cache:
27 """Read the cache if it exists and is well formed.
28
29 If it is not well formed, the call to write_cache later should resolve the issue.
30 """
31 cache_file = get_cache_file(mode)
32 if not cache_file.exists():
33 return {}
34
35 with cache_file.open("rb") as fobj:
36 try:
37 cache: Cache = pickle.load(fobj)
38 except (pickle.UnpicklingError, ValueError, IndexError):
39 return {}
40
41 return cache
42
43
44 def get_cache_file(mode: Mode) -> Path:
45 return CACHE_DIR / f"cache.{mode.get_cache_key()}.pickle"
46
47
48 def get_cache_info(path: Path) -> CacheInfo:
49 """Return the information used to check if a file is already formatted or not."""
50 stat = path.stat()
51 return stat.st_mtime, stat.st_size
52
53
54 def filter_cached(cache: Cache, sources: Iterable[Path]) -> Tuple[Set[Path], Set[Path]]:
55 """Split an iterable of paths in `sources` into two sets.
56
57 The first contains paths of files that modified on disk or are not in the
58 cache. The other contains paths to non-modified files.
59 """
60 todo, done = set(), set()
61 for src in sources:
62 res_src = src.resolve()
63 if cache.get(str(res_src)) != get_cache_info(res_src):
64 todo.add(src)
65 else:
66 done.add(src)
67 return todo, done
68
69
70 def write_cache(cache: Cache, sources: Iterable[Path], mode: Mode) -> None:
71 """Update the cache file."""
72 cache_file = get_cache_file(mode)
73 try:
74 CACHE_DIR.mkdir(parents=True, exist_ok=True)
75 new_cache = {
76 **cache,
77 **{str(src.resolve()): get_cache_info(src) for src in sources},
78 }
79 with tempfile.NamedTemporaryFile(dir=str(cache_file.parent), delete=False) as f:
80 pickle.dump(new_cache, f, protocol=4)
81 os.replace(f.name, cache_file)
82 except OSError:
83 pass
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/black/cache.py b/src/black/cache.py
--- a/src/black/cache.py
+++ b/src/black/cache.py
@@ -20,7 +20,23 @@
Cache = Dict[str, CacheInfo]
-CACHE_DIR = Path(user_cache_dir("black", version=__version__))
+def get_cache_dir() -> Path:
+ """Get the cache directory used by black.
+
+ Users can customize this directory on all systems using `BLACK_CACHE_DIR`
+ environment variable. By default, the cache directory is the user cache directory
+ under the black application.
+
+ This result is immediately set to a constant `black.cache.CACHE_DIR` as to avoid
+ repeated calls.
+ """
+ # NOTE: Function mostly exists as a clean way to test getting the cache directory.
+ default_cache_dir = user_cache_dir("black", version=__version__)
+ cache_dir = Path(os.environ.get("BLACK_CACHE_DIR", default_cache_dir))
+ return cache_dir
+
+
+CACHE_DIR = get_cache_dir()
def read_cache(mode: Mode) -> Cache:
| {"golden_diff": "diff --git a/src/black/cache.py b/src/black/cache.py\n--- a/src/black/cache.py\n+++ b/src/black/cache.py\n@@ -20,7 +20,23 @@\n Cache = Dict[str, CacheInfo]\n \n \n-CACHE_DIR = Path(user_cache_dir(\"black\", version=__version__))\n+def get_cache_dir() -> Path:\n+ \"\"\"Get the cache directory used by black.\n+\n+ Users can customize this directory on all systems using `BLACK_CACHE_DIR`\n+ environment variable. By default, the cache directory is the user cache directory\n+ under the black application.\n+\n+ This result is immediately set to a constant `black.cache.CACHE_DIR` as to avoid\n+ repeated calls.\n+ \"\"\"\n+ # NOTE: Function mostly exists as a clean way to test getting the cache directory.\n+ default_cache_dir = user_cache_dir(\"black\", version=__version__)\n+ cache_dir = Path(os.environ.get(\"BLACK_CACHE_DIR\", default_cache_dir))\n+ return cache_dir\n+\n+\n+CACHE_DIR = get_cache_dir()\n \n \n def read_cache(mode: Mode) -> Cache:\n", "issue": "Custom cache directory to be thread/process safe\n**Is your feature request related to a problem? Please describe.**\r\n\r\nThe cache files are not thread/process safe. I've checked multiple projects simultaneously in different processes but get permission errors when one process is writing to the cache while another is reading from it. On linux the fix is pretty easy: set a different temp directory for each process using the ``XDG_CACHE_HOME`` environment variable but there is no equivalent for Windows (there's somewhat an equivalent but you need a rather [specific setup](https://github.com/platformdirs/platformdirs/blob/main/src/platformdirs/windows.py#L157-L165) to use the environment variable).\r\n\r\n**Describe the solution you'd like**\r\n\r\nThe solution that will work for my use case (and I think is the least amount of changes) is to create a custom environment variable that overrides using platformdirs to get the cache directory. Think this could look like:\r\n```python\r\nCACHE_DIR = Path(os.environ.get(\"BLACK_CACHE_DIR\", user_cache_dir(\"black\", version=__version__)))\r\nif not CACHE_DIR.exists():\r\n raise RuntimeError(f\"{CACHE_DIR} does not exist\")\r\n```\r\n\r\n**Describe alternatives you've considered**\r\n\r\n1. Add the command line option to turn off cacheing (as this ticket as #248 asked for) (guess is this more work)\r\n2. Add command line option to set the cache directory (guess is this is more work but not a lot)\r\n3. 
Make the cache dir thread/process safe (guess is this is more work)\n", "before_files": [{"content": "\"\"\"Caching of formatted files with feature-based invalidation.\"\"\"\n\nimport os\nimport pickle\nfrom pathlib import Path\nimport tempfile\nfrom typing import Dict, Iterable, Set, Tuple\n\nfrom platformdirs import user_cache_dir\n\nfrom black.mode import Mode\n\nfrom _black_version import version as __version__\n\n\n# types\nTimestamp = float\nFileSize = int\nCacheInfo = Tuple[Timestamp, FileSize]\nCache = Dict[str, CacheInfo]\n\n\nCACHE_DIR = Path(user_cache_dir(\"black\", version=__version__))\n\n\ndef read_cache(mode: Mode) -> Cache:\n \"\"\"Read the cache if it exists and is well formed.\n\n If it is not well formed, the call to write_cache later should resolve the issue.\n \"\"\"\n cache_file = get_cache_file(mode)\n if not cache_file.exists():\n return {}\n\n with cache_file.open(\"rb\") as fobj:\n try:\n cache: Cache = pickle.load(fobj)\n except (pickle.UnpicklingError, ValueError, IndexError):\n return {}\n\n return cache\n\n\ndef get_cache_file(mode: Mode) -> Path:\n return CACHE_DIR / f\"cache.{mode.get_cache_key()}.pickle\"\n\n\ndef get_cache_info(path: Path) -> CacheInfo:\n \"\"\"Return the information used to check if a file is already formatted or not.\"\"\"\n stat = path.stat()\n return stat.st_mtime, stat.st_size\n\n\ndef filter_cached(cache: Cache, sources: Iterable[Path]) -> Tuple[Set[Path], Set[Path]]:\n \"\"\"Split an iterable of paths in `sources` into two sets.\n\n The first contains paths of files that modified on disk or are not in the\n cache. The other contains paths to non-modified files.\n \"\"\"\n todo, done = set(), set()\n for src in sources:\n res_src = src.resolve()\n if cache.get(str(res_src)) != get_cache_info(res_src):\n todo.add(src)\n else:\n done.add(src)\n return todo, done\n\n\ndef write_cache(cache: Cache, sources: Iterable[Path], mode: Mode) -> None:\n \"\"\"Update the cache file.\"\"\"\n cache_file = get_cache_file(mode)\n try:\n CACHE_DIR.mkdir(parents=True, exist_ok=True)\n new_cache = {\n **cache,\n **{str(src.resolve()): get_cache_info(src) for src in sources},\n }\n with tempfile.NamedTemporaryFile(dir=str(cache_file.parent), delete=False) as f:\n pickle.dump(new_cache, f, protocol=4)\n os.replace(f.name, cache_file)\n except OSError:\n pass\n", "path": "src/black/cache.py"}], "after_files": [{"content": "\"\"\"Caching of formatted files with feature-based invalidation.\"\"\"\n\nimport os\nimport pickle\nfrom pathlib import Path\nimport tempfile\nfrom typing import Dict, Iterable, Set, Tuple\n\nfrom platformdirs import user_cache_dir\n\nfrom black.mode import Mode\n\nfrom _black_version import version as __version__\n\n\n# types\nTimestamp = float\nFileSize = int\nCacheInfo = Tuple[Timestamp, FileSize]\nCache = Dict[str, CacheInfo]\n\n\ndef get_cache_dir() -> Path:\n \"\"\"Get the cache directory used by black.\n\n Users can customize this directory on all systems using `BLACK_CACHE_DIR`\n environment variable. 
By default, the cache directory is the user cache directory\n under the black application.\n\n This result is immediately set to a constant `black.cache.CACHE_DIR` as to avoid\n repeated calls.\n \"\"\"\n # NOTE: Function mostly exists as a clean way to test getting the cache directory.\n default_cache_dir = user_cache_dir(\"black\", version=__version__)\n cache_dir = Path(os.environ.get(\"BLACK_CACHE_DIR\", default_cache_dir))\n return cache_dir\n\n\nCACHE_DIR = get_cache_dir()\n\n\ndef read_cache(mode: Mode) -> Cache:\n \"\"\"Read the cache if it exists and is well formed.\n\n If it is not well formed, the call to write_cache later should resolve the issue.\n \"\"\"\n cache_file = get_cache_file(mode)\n if not cache_file.exists():\n return {}\n\n with cache_file.open(\"rb\") as fobj:\n try:\n cache: Cache = pickle.load(fobj)\n except (pickle.UnpicklingError, ValueError, IndexError):\n return {}\n\n return cache\n\n\ndef get_cache_file(mode: Mode) -> Path:\n return CACHE_DIR / f\"cache.{mode.get_cache_key()}.pickle\"\n\n\ndef get_cache_info(path: Path) -> CacheInfo:\n \"\"\"Return the information used to check if a file is already formatted or not.\"\"\"\n stat = path.stat()\n return stat.st_mtime, stat.st_size\n\n\ndef filter_cached(cache: Cache, sources: Iterable[Path]) -> Tuple[Set[Path], Set[Path]]:\n \"\"\"Split an iterable of paths in `sources` into two sets.\n\n The first contains paths of files that modified on disk or are not in the\n cache. The other contains paths to non-modified files.\n \"\"\"\n todo, done = set(), set()\n for src in sources:\n res_src = src.resolve()\n if cache.get(str(res_src)) != get_cache_info(res_src):\n todo.add(src)\n else:\n done.add(src)\n return todo, done\n\n\ndef write_cache(cache: Cache, sources: Iterable[Path], mode: Mode) -> None:\n \"\"\"Update the cache file.\"\"\"\n cache_file = get_cache_file(mode)\n try:\n CACHE_DIR.mkdir(parents=True, exist_ok=True)\n new_cache = {\n **cache,\n **{str(src.resolve()): get_cache_info(src) for src in sources},\n }\n with tempfile.NamedTemporaryFile(dir=str(cache_file.parent), delete=False) as f:\n pickle.dump(new_cache, f, protocol=4)\n os.replace(f.name, cache_file)\n except OSError:\n pass\n", "path": "src/black/cache.py"}]} | 1,297 | 236 |
gh_patches_debug_1081 | rasdani/github-patches | git_diff | mlcommons__GaNDLF-315 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add an easy way to verify installation
**Is your feature request related to a problem? Please describe.**
Currently, we are asking users to run specific commands to verify installation, which can be cumbursome.
**Describe the solution you'd like**
It would be great if this could put in a script (and extended/updated as needed).
**Describe alternatives you've considered**
N.A.
**Additional context**
N.A.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2
3 """The setup script."""
4
5
6 import os
7 from setuptools import setup, find_packages
8 from setuptools.command.install import install
9 from setuptools.command.develop import develop
10 from setuptools.command.egg_info import egg_info
11
12 with open("README.md") as readme_file:
13 readme = readme_file.read()
14
15
16 def git_submodule_update():
17 ## submodule update
18 os.system("git submodule update --init --recursive")
19
20
21 class CustomInstallCommand(install):
22 def run(self):
23 install.run(self)
24 git_submodule_update()
25
26
27 class CustomDevelopCommand(develop):
28 def run(self):
29 develop.run(self)
30 git_submodule_update()
31
32
33 class CustomEggInfoCommand(egg_info):
34 def run(self):
35 egg_info.run(self)
36 git_submodule_update()
37
38
39 # read version.py
40 import sys, re
41
42 try:
43 filepath = "GANDLF/version.py"
44 version_file = open(filepath)
45 (__version__,) = re.findall('__version__ = "(.*)"', version_file.read())
46
47 except Exception as error:
48 __version__ = "0.0.1"
49 sys.stderr.write("Warning: Could not open '%s' due %s\n" % (filepath, error))
50
51 requirements = [
52 "black",
53 "numpy==1.21.0",
54 "scipy",
55 "SimpleITK==2.1.0",
56 "torch>=1.7",
57 "torchvision",
58 "tqdm",
59 "torchio==0.18.57",
60 "pandas",
61 "pylint",
62 "scikit-learn==0.23.1",
63 "pickle5==0.0.11",
64 "setuptools",
65 "seaborn",
66 "pyyaml",
67 "openslide-python",
68 "scikit-image",
69 "matplotlib",
70 "requests>=2.25.0",
71 "pyvips",
72 "pytest",
73 "coverage",
74 "pytest-cov",
75 "psutil",
76 "medcam",
77 "opencv-python",
78 "torchmetrics",
79 "OpenPatchMiner==0.1.6",
80 "pydicom",
81 ]
82
83 setup(
84 name="GANDLF",
85 version=__version__,
86 author="Jose Agraz, Vinayak Ahluwalia, Bhakti Baheti, Spyridon Bakas, Ujjwal Baid, Megh Bhalerao, Brandon Edwards, Karol Gotkowski, Caleb Grenko, Orhun Güley, Ibrahim Ethem Hamamci, Sarthak Pati, Micah Sheller, Juliia Skobleva, Siddhesh Thakur, Spiros Thermos", # alphabetical order
87 author_email="[email protected]",
88 python_requires=">=3.6",
89 packages=find_packages(),
90 cmdclass={ # this ensures git_submodule_update is called during install
91 "install": CustomInstallCommand,
92 "develop": CustomDevelopCommand,
93 "egg_info": CustomEggInfoCommand,
94 },
95 scripts=[
96 "gandlf_run",
97 "gandlf_constructCSV",
98 "gandlf_collectStats",
99 "gandlf_patchMiner",
100 "gandlf_preprocess",
101 "gandlf_anonymizer",
102 ],
103 classifiers=[
104 "Development Status :: 3 - Alpha",
105 "Intended Audience :: Science/Research",
106 "License :: OSI Approved :: BSD License",
107 "Natural Language :: English",
108 "Operating System :: OS Independent",
109 "Programming Language :: Python :: 3.7",
110 "Programming Language :: Python :: 3.8",
111 "Programming Language :: Python :: 3.9",
112 "Topic :: Scientific/Engineering :: Medical Science Apps",
113 ],
114 description=(
115 "PyTorch-based framework that handles segmentation/regression/classification using various DL architectures for medical imaging."
116 ),
117 install_requires=requirements,
118 license="BSD-3-Clause License",
119 long_description=readme,
120 long_description_content_type="text/markdown",
121 include_package_data=True,
122 keywords="semantic, segmentation, regression, classification, data-augmentation, medical-imaging",
123 zip_safe=False,
124 )
125
126 ## windows vips installation
127 if os.name == "nt": # proceed for windows
128 from pathlib import Path
129
130 # download and extract if main dll is absent
131 if not Path("./vips/vips-dev-8.10/bin/libvips-42.dll").exists():
132 print("Downloading and extracting VIPS for Windows")
133 url = "https://github.com/libvips/libvips/releases/download/v8.10.2/vips-dev-w64-all-8.10.2.zip"
134 zip_to_extract = "./vips.zip"
135 import urllib.request, zipfile
136
137 urllib.request.urlretrieve(url, zip_to_extract)
138 z = zipfile.ZipFile(zip_to_extract)
139 z.extractall("./vips")
140 z.close()
141 os.remove(zip_to_extract)
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -99,6 +99,7 @@
"gandlf_patchMiner",
"gandlf_preprocess",
"gandlf_anonymizer",
+ "gandlf_verifyInstall",
],
classifiers=[
"Development Status :: 3 - Alpha",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -99,6 +99,7 @@\n \"gandlf_patchMiner\",\n \"gandlf_preprocess\",\n \"gandlf_anonymizer\",\n+ \"gandlf_verifyInstall\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n", "issue": "Add an easy way to verify installation\n**Is your feature request related to a problem? Please describe.**\r\nCurrently, we are asking users to run specific commands to verify installation, which can be cumbursome.\r\n\r\n**Describe the solution you'd like**\r\nIt would be great if this could put in a script (and extended/updated as needed).\r\n\r\n**Describe alternatives you've considered**\r\nN.A.\r\n\r\n**Additional context**\r\nN.A.\r\n\n", "before_files": [{"content": "#!/usr/bin/env python\n\n\"\"\"The setup script.\"\"\"\n\n\nimport os\nfrom setuptools import setup, find_packages\nfrom setuptools.command.install import install\nfrom setuptools.command.develop import develop\nfrom setuptools.command.egg_info import egg_info\n\nwith open(\"README.md\") as readme_file:\n readme = readme_file.read()\n\n\ndef git_submodule_update():\n ## submodule update\n os.system(\"git submodule update --init --recursive\")\n\n\nclass CustomInstallCommand(install):\n def run(self):\n install.run(self)\n git_submodule_update()\n\n\nclass CustomDevelopCommand(develop):\n def run(self):\n develop.run(self)\n git_submodule_update()\n\n\nclass CustomEggInfoCommand(egg_info):\n def run(self):\n egg_info.run(self)\n git_submodule_update()\n\n\n# read version.py\nimport sys, re\n\ntry:\n filepath = \"GANDLF/version.py\"\n version_file = open(filepath)\n (__version__,) = re.findall('__version__ = \"(.*)\"', version_file.read())\n\nexcept Exception as error:\n __version__ = \"0.0.1\"\n sys.stderr.write(\"Warning: Could not open '%s' due %s\\n\" % (filepath, error))\n\nrequirements = [\n \"black\",\n \"numpy==1.21.0\",\n \"scipy\",\n \"SimpleITK==2.1.0\",\n \"torch>=1.7\",\n \"torchvision\",\n \"tqdm\",\n \"torchio==0.18.57\",\n \"pandas\",\n \"pylint\",\n \"scikit-learn==0.23.1\",\n \"pickle5==0.0.11\",\n \"setuptools\",\n \"seaborn\",\n \"pyyaml\",\n \"openslide-python\",\n \"scikit-image\",\n \"matplotlib\",\n \"requests>=2.25.0\",\n \"pyvips\",\n \"pytest\",\n \"coverage\",\n \"pytest-cov\",\n \"psutil\",\n \"medcam\",\n \"opencv-python\",\n \"torchmetrics\",\n \"OpenPatchMiner==0.1.6\",\n \"pydicom\",\n]\n\nsetup(\n name=\"GANDLF\",\n version=__version__,\n author=\"Jose Agraz, Vinayak Ahluwalia, Bhakti Baheti, Spyridon Bakas, Ujjwal Baid, Megh Bhalerao, Brandon Edwards, Karol Gotkowski, Caleb Grenko, Orhun G\u00fcley, Ibrahim Ethem Hamamci, Sarthak Pati, Micah Sheller, Juliia Skobleva, Siddhesh Thakur, Spiros Thermos\", # alphabetical order\n author_email=\"[email protected]\",\n python_requires=\">=3.6\",\n packages=find_packages(),\n cmdclass={ # this ensures git_submodule_update is called during install\n \"install\": CustomInstallCommand,\n \"develop\": CustomDevelopCommand,\n \"egg_info\": CustomEggInfoCommand,\n },\n scripts=[\n \"gandlf_run\",\n \"gandlf_constructCSV\",\n \"gandlf_collectStats\",\n \"gandlf_patchMiner\",\n \"gandlf_preprocess\",\n \"gandlf_anonymizer\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: BSD License\",\n \"Natural Language :: English\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 
3.9\",\n \"Topic :: Scientific/Engineering :: Medical Science Apps\",\n ],\n description=(\n \"PyTorch-based framework that handles segmentation/regression/classification using various DL architectures for medical imaging.\"\n ),\n install_requires=requirements,\n license=\"BSD-3-Clause License\",\n long_description=readme,\n long_description_content_type=\"text/markdown\",\n include_package_data=True,\n keywords=\"semantic, segmentation, regression, classification, data-augmentation, medical-imaging\",\n zip_safe=False,\n)\n\n## windows vips installation\nif os.name == \"nt\": # proceed for windows\n from pathlib import Path\n\n # download and extract if main dll is absent\n if not Path(\"./vips/vips-dev-8.10/bin/libvips-42.dll\").exists():\n print(\"Downloading and extracting VIPS for Windows\")\n url = \"https://github.com/libvips/libvips/releases/download/v8.10.2/vips-dev-w64-all-8.10.2.zip\"\n zip_to_extract = \"./vips.zip\"\n import urllib.request, zipfile\n\n urllib.request.urlretrieve(url, zip_to_extract)\n z = zipfile.ZipFile(zip_to_extract)\n z.extractall(\"./vips\")\n z.close()\n os.remove(zip_to_extract)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\n\"\"\"The setup script.\"\"\"\n\n\nimport os\nfrom setuptools import setup, find_packages\nfrom setuptools.command.install import install\nfrom setuptools.command.develop import develop\nfrom setuptools.command.egg_info import egg_info\n\nwith open(\"README.md\") as readme_file:\n readme = readme_file.read()\n\n\ndef git_submodule_update():\n ## submodule update\n os.system(\"git submodule update --init --recursive\")\n\n\nclass CustomInstallCommand(install):\n def run(self):\n install.run(self)\n git_submodule_update()\n\n\nclass CustomDevelopCommand(develop):\n def run(self):\n develop.run(self)\n git_submodule_update()\n\n\nclass CustomEggInfoCommand(egg_info):\n def run(self):\n egg_info.run(self)\n git_submodule_update()\n\n\n# read version.py\nimport sys, re\n\ntry:\n filepath = \"GANDLF/version.py\"\n version_file = open(filepath)\n (__version__,) = re.findall('__version__ = \"(.*)\"', version_file.read())\n\nexcept Exception as error:\n __version__ = \"0.0.1\"\n sys.stderr.write(\"Warning: Could not open '%s' due %s\\n\" % (filepath, error))\n\nrequirements = [\n \"black\",\n \"numpy==1.21.0\",\n \"scipy\",\n \"SimpleITK==2.1.0\",\n \"torch>=1.7\",\n \"torchvision\",\n \"tqdm\",\n \"torchio==0.18.57\",\n \"pandas\",\n \"pylint\",\n \"scikit-learn==0.23.1\",\n \"pickle5==0.0.11\",\n \"setuptools\",\n \"seaborn\",\n \"pyyaml\",\n \"openslide-python\",\n \"scikit-image\",\n \"matplotlib\",\n \"requests>=2.25.0\",\n \"pyvips\",\n \"pytest\",\n \"coverage\",\n \"pytest-cov\",\n \"psutil\",\n \"medcam\",\n \"opencv-python\",\n \"torchmetrics\",\n \"OpenPatchMiner==0.1.6\",\n \"pydicom\",\n]\n\nsetup(\n name=\"GANDLF\",\n version=__version__,\n author=\"Jose Agraz, Vinayak Ahluwalia, Bhakti Baheti, Spyridon Bakas, Ujjwal Baid, Megh Bhalerao, Brandon Edwards, Karol Gotkowski, Caleb Grenko, Orhun G\u00fcley, Ibrahim Ethem Hamamci, Sarthak Pati, Micah Sheller, Juliia Skobleva, Siddhesh Thakur, Spiros Thermos\", # alphabetical order\n author_email=\"[email protected]\",\n python_requires=\">=3.6\",\n packages=find_packages(),\n cmdclass={ # this ensures git_submodule_update is called during install\n \"install\": CustomInstallCommand,\n \"develop\": CustomDevelopCommand,\n \"egg_info\": CustomEggInfoCommand,\n },\n scripts=[\n \"gandlf_run\",\n \"gandlf_constructCSV\",\n \"gandlf_collectStats\",\n 
\"gandlf_patchMiner\",\n \"gandlf_preprocess\",\n \"gandlf_anonymizer\",\n \"gandlf_verifyInstall\",\n ],\n classifiers=[\n \"Development Status :: 3 - Alpha\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: BSD License\",\n \"Natural Language :: English\",\n \"Operating System :: OS Independent\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Topic :: Scientific/Engineering :: Medical Science Apps\",\n ],\n description=(\n \"PyTorch-based framework that handles segmentation/regression/classification using various DL architectures for medical imaging.\"\n ),\n install_requires=requirements,\n license=\"BSD-3-Clause License\",\n long_description=readme,\n long_description_content_type=\"text/markdown\",\n include_package_data=True,\n keywords=\"semantic, segmentation, regression, classification, data-augmentation, medical-imaging\",\n zip_safe=False,\n)\n\n## windows vips installation\nif os.name == \"nt\": # proceed for windows\n from pathlib import Path\n\n # download and extract if main dll is absent\n if not Path(\"./vips/vips-dev-8.10/bin/libvips-42.dll\").exists():\n print(\"Downloading and extracting VIPS for Windows\")\n url = \"https://github.com/libvips/libvips/releases/download/v8.10.2/vips-dev-w64-all-8.10.2.zip\"\n zip_to_extract = \"./vips.zip\"\n import urllib.request, zipfile\n\n urllib.request.urlretrieve(url, zip_to_extract)\n z = zipfile.ZipFile(zip_to_extract)\n z.extractall(\"./vips\")\n z.close()\n os.remove(zip_to_extract)\n", "path": "setup.py"}]} | 1,739 | 82 |
gh_patches_debug_27670 | rasdani/github-patches | git_diff | crytic__slither-211 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug in function-id printer
Hi there! There is an issue connected to `function-id` printer that I faced:
when you create a dynamic array with `public` visibility modifier compiler automatically generates a getter-function with `uint256` input parameter. However, Slither thinks that the getter has no input parameters. Thus, the wrong function signature is printed in the output.
Here is a small example.
```
pragma solidity 0.5.7;
contract Example {
uint256[] public example;
}
```
Slither outputs the following table:
```
+-----------+------------+
| Name | ID |
+-----------+------------+
| example() | 0x54353f2f |
+-----------+------------+
```
However, the real `example()` function's signature is `0x477e4a02` in the example.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `slither/printers/summary/function_ids.py`
Content:
```
1 """
2 Module printing summary of the contract
3 """
4 import collections
5 from prettytable import PrettyTable
6 from slither.printers.abstract_printer import AbstractPrinter
7 from slither.utils.colors import blue, green, magenta
8 from slither.utils.function import get_function_id
9
10 class FunctionIds(AbstractPrinter):
11
12 ARGUMENT = 'function-id'
13 HELP = 'Print the keccack256 signature of the functions'
14
15 WIKI = 'https://github.com/trailofbits/slither/wiki/Printer-documentation#function-id'
16
17 def output(self, _filename):
18 """
19 _filename is not used
20 Args:
21 _filename(string)
22 """
23
24 txt = ''
25 for contract in self.slither.contracts_derived:
26 txt += '\n{}:\n'.format(contract.name)
27 table = PrettyTable(['Name', 'ID'])
28 for function in contract.functions:
29 if function.visibility in ['public', 'external']:
30 table.add_row([function.full_name, hex(get_function_id(function.full_name))])
31 for variable in contract.state_variables:
32 if variable.visibility in ['public']:
33 table.add_row([variable.name+'()', hex(get_function_id(variable.name+'()'))])
34 txt += str(table) + '\n'
35
36 self.info(txt)
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/slither/printers/summary/function_ids.py b/slither/printers/summary/function_ids.py
--- a/slither/printers/summary/function_ids.py
+++ b/slither/printers/summary/function_ids.py
@@ -3,6 +3,8 @@
"""
import collections
from prettytable import PrettyTable
+
+from slither.core.solidity_types import ArrayType, MappingType
from slither.printers.abstract_printer import AbstractPrinter
from slither.utils.colors import blue, green, magenta
from slither.utils.function import get_function_id
@@ -30,7 +32,18 @@
table.add_row([function.full_name, hex(get_function_id(function.full_name))])
for variable in contract.state_variables:
if variable.visibility in ['public']:
- table.add_row([variable.name+'()', hex(get_function_id(variable.name+'()'))])
+ variable_getter_args = ""
+ if type(variable.type) is ArrayType:
+ length = 0
+ v = variable
+ while type(v.type) is ArrayType:
+ length += 1
+ v = v.type
+ variable_getter_args = ','.join(["uint256"]*length)
+ elif type(variable.type) is MappingType:
+ variable_getter_args = variable.type.type_from
+
+ table.add_row([f"{variable.name}({variable_getter_args})", hex(get_function_id(f"{variable.name}({variable_getter_args})"))])
txt += str(table) + '\n'
self.info(txt)
| {"golden_diff": "diff --git a/slither/printers/summary/function_ids.py b/slither/printers/summary/function_ids.py\n--- a/slither/printers/summary/function_ids.py\n+++ b/slither/printers/summary/function_ids.py\n@@ -3,6 +3,8 @@\n \"\"\"\n import collections\n from prettytable import PrettyTable\n+\n+from slither.core.solidity_types import ArrayType, MappingType\n from slither.printers.abstract_printer import AbstractPrinter\n from slither.utils.colors import blue, green, magenta\n from slither.utils.function import get_function_id\n@@ -30,7 +32,18 @@\n table.add_row([function.full_name, hex(get_function_id(function.full_name))])\n for variable in contract.state_variables:\n if variable.visibility in ['public']:\n- table.add_row([variable.name+'()', hex(get_function_id(variable.name+'()'))])\n+ variable_getter_args = \"\"\n+ if type(variable.type) is ArrayType:\n+ length = 0\n+ v = variable\n+ while type(v.type) is ArrayType:\n+ length += 1\n+ v = v.type\n+ variable_getter_args = ','.join([\"uint256\"]*length)\n+ elif type(variable.type) is MappingType:\n+ variable_getter_args = variable.type.type_from\n+\n+ table.add_row([f\"{variable.name}({variable_getter_args})\", hex(get_function_id(f\"{variable.name}({variable_getter_args})\"))])\n txt += str(table) + '\\n'\n \n self.info(txt)\n", "issue": "Bug in function-id printer\nHi there! There is an issue connected to `function-id` printer that I faced:\r\n\r\nwhen you create a dynamic array with `public` visibility modifier compiler automatically generates a getter-function with `uint256` input parameter. However, Slither thinks that the getter has no input parameters. Thus, the wrong function signature is printed in the output.\r\n\r\nHere is a small example.\r\n\r\n```\r\npragma solidity 0.5.7;\r\n\r\ncontract Example {\r\n uint256[] public example;\r\n}\r\n```\r\n\r\nSlither outputs the following table: \r\n\r\n```\r\n+-----------+------------+\r\n| Name | ID |\r\n+-----------+------------+\r\n| example() | 0x54353f2f |\r\n+-----------+------------+\r\n```\r\n\r\nHowever, the real `example()` function's signature is `0x477e4a02` in the example.\n", "before_files": [{"content": "\"\"\"\n Module printing summary of the contract\n\"\"\"\nimport collections\nfrom prettytable import PrettyTable\nfrom slither.printers.abstract_printer import AbstractPrinter\nfrom slither.utils.colors import blue, green, magenta\nfrom slither.utils.function import get_function_id\n\nclass FunctionIds(AbstractPrinter):\n\n ARGUMENT = 'function-id'\n HELP = 'Print the keccack256 signature of the functions'\n\n WIKI = 'https://github.com/trailofbits/slither/wiki/Printer-documentation#function-id'\n\n def output(self, _filename):\n \"\"\"\n _filename is not used\n Args:\n _filename(string)\n \"\"\"\n\n txt = ''\n for contract in self.slither.contracts_derived:\n txt += '\\n{}:\\n'.format(contract.name)\n table = PrettyTable(['Name', 'ID'])\n for function in contract.functions:\n if function.visibility in ['public', 'external']:\n table.add_row([function.full_name, hex(get_function_id(function.full_name))])\n for variable in contract.state_variables:\n if variable.visibility in ['public']:\n table.add_row([variable.name+'()', hex(get_function_id(variable.name+'()'))])\n txt += str(table) + '\\n'\n\n self.info(txt)\n", "path": "slither/printers/summary/function_ids.py"}], "after_files": [{"content": "\"\"\"\n Module printing summary of the contract\n\"\"\"\nimport collections\nfrom prettytable import PrettyTable\n\nfrom slither.core.solidity_types import ArrayType, 
MappingType\nfrom slither.printers.abstract_printer import AbstractPrinter\nfrom slither.utils.colors import blue, green, magenta\nfrom slither.utils.function import get_function_id\n\nclass FunctionIds(AbstractPrinter):\n\n ARGUMENT = 'function-id'\n HELP = 'Print the keccack256 signature of the functions'\n\n WIKI = 'https://github.com/trailofbits/slither/wiki/Printer-documentation#function-id'\n\n def output(self, _filename):\n \"\"\"\n _filename is not used\n Args:\n _filename(string)\n \"\"\"\n\n txt = ''\n for contract in self.slither.contracts_derived:\n txt += '\\n{}:\\n'.format(contract.name)\n table = PrettyTable(['Name', 'ID'])\n for function in contract.functions:\n if function.visibility in ['public', 'external']:\n table.add_row([function.full_name, hex(get_function_id(function.full_name))])\n for variable in contract.state_variables:\n if variable.visibility in ['public']:\n variable_getter_args = \"\"\n if type(variable.type) is ArrayType:\n length = 0\n v = variable\n while type(v.type) is ArrayType:\n length += 1\n v = v.type\n variable_getter_args = ','.join([\"uint256\"]*length)\n elif type(variable.type) is MappingType:\n variable_getter_args = variable.type.type_from\n\n table.add_row([f\"{variable.name}({variable_getter_args})\", hex(get_function_id(f\"{variable.name}({variable_getter_args})\"))])\n txt += str(table) + '\\n'\n\n self.info(txt)\n", "path": "slither/printers/summary/function_ids.py"}]} | 788 | 336 |
gh_patches_debug_5271 | rasdani/github-patches | git_diff | safe-global__safe-config-service-145 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Increase max limit for the chains endpoint
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/chains/views.py`
Content:
```
1 from rest_framework import filters
2 from rest_framework.generics import ListAPIView, RetrieveAPIView
3 from rest_framework.pagination import LimitOffsetPagination
4
5 from .models import Chain
6 from .serializers import ChainSerializer
7
8
9 class ChainsListView(ListAPIView):
10 serializer_class = ChainSerializer
11 pagination_class = LimitOffsetPagination
12 pagination_class.max_limit = 10
13 pagination_class.default_limit = 10
14 queryset = Chain.objects.all()
15 filter_backends = [filters.OrderingFilter]
16 ordering_fields = ["relevance", "name"]
17 ordering = [
18 "relevance",
19 "name",
20 ]
21
22
23 class ChainsDetailView(RetrieveAPIView):
24 serializer_class = ChainSerializer
25 queryset = Chain.objects.all()
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/chains/views.py b/src/chains/views.py
--- a/src/chains/views.py
+++ b/src/chains/views.py
@@ -9,7 +9,7 @@
class ChainsListView(ListAPIView):
serializer_class = ChainSerializer
pagination_class = LimitOffsetPagination
- pagination_class.max_limit = 10
+ pagination_class.max_limit = 100
pagination_class.default_limit = 10
queryset = Chain.objects.all()
filter_backends = [filters.OrderingFilter]
| {"golden_diff": "diff --git a/src/chains/views.py b/src/chains/views.py\n--- a/src/chains/views.py\n+++ b/src/chains/views.py\n@@ -9,7 +9,7 @@\n class ChainsListView(ListAPIView):\n serializer_class = ChainSerializer\n pagination_class = LimitOffsetPagination\n- pagination_class.max_limit = 10\n+ pagination_class.max_limit = 100\n pagination_class.default_limit = 10\n queryset = Chain.objects.all()\n filter_backends = [filters.OrderingFilter]\n", "issue": "Increase max limit for the chains endpoint\n\n", "before_files": [{"content": "from rest_framework import filters\nfrom rest_framework.generics import ListAPIView, RetrieveAPIView\nfrom rest_framework.pagination import LimitOffsetPagination\n\nfrom .models import Chain\nfrom .serializers import ChainSerializer\n\n\nclass ChainsListView(ListAPIView):\n serializer_class = ChainSerializer\n pagination_class = LimitOffsetPagination\n pagination_class.max_limit = 10\n pagination_class.default_limit = 10\n queryset = Chain.objects.all()\n filter_backends = [filters.OrderingFilter]\n ordering_fields = [\"relevance\", \"name\"]\n ordering = [\n \"relevance\",\n \"name\",\n ]\n\n\nclass ChainsDetailView(RetrieveAPIView):\n serializer_class = ChainSerializer\n queryset = Chain.objects.all()\n", "path": "src/chains/views.py"}], "after_files": [{"content": "from rest_framework import filters\nfrom rest_framework.generics import ListAPIView, RetrieveAPIView\nfrom rest_framework.pagination import LimitOffsetPagination\n\nfrom .models import Chain\nfrom .serializers import ChainSerializer\n\n\nclass ChainsListView(ListAPIView):\n serializer_class = ChainSerializer\n pagination_class = LimitOffsetPagination\n pagination_class.max_limit = 100\n pagination_class.default_limit = 10\n queryset = Chain.objects.all()\n filter_backends = [filters.OrderingFilter]\n ordering_fields = [\"relevance\", \"name\"]\n ordering = [\n \"relevance\",\n \"name\",\n ]\n\n\nclass ChainsDetailView(RetrieveAPIView):\n serializer_class = ChainSerializer\n queryset = Chain.objects.all()\n", "path": "src/chains/views.py"}]} | 463 | 114 |
gh_patches_debug_24394 | rasdani/github-patches | git_diff | pulp__pulpcore-4182 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Reclaim space for repository fails with Cannot delete some instances of model 'Artifact' because they are referenced through protected foreign keys: 'ContentArtifact.artifact'."
**Version**
3.16, but probably all versions
**Describe the bug**
- Reclaim space for repository fails with the following error.
~~~
Task paused with error: "("Cannot delete some instances of model 'Artifact' because they are referenced through protected foreign keys: 'ContentArtifact.artifact'.", {<ContentArtifact: pk=452959ad-c045-4e85-bf9f-6651ba37f57d>})"
~~~
**To Reproduce**
See BZ
**Additional context**
https://bugzilla.redhat.com/show_bug.cgi?id=2169322
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pulpcore/app/tasks/reclaim_space.py`
Content:
```
1 from pulpcore.app.models import (
2 Artifact,
3 Content,
4 ContentArtifact,
5 ProgressReport,
6 PublishedMetadata,
7 Repository,
8 RepositoryVersion,
9 )
10 from pulpcore.app.util import get_domain
11
12
13 def reclaim_space(repo_pks, keeplist_rv_pks=None, force=False):
14 """
15 This task frees-up disk space by removing Artifact files from the filesystem for Content
16 exclusive to the list of provided repos.
17
18 Note: content marked as `proctected` will be excluded from the reclaim disk space.
19
20 Kwargs:
21 repo_pks (list): A list of repo pks the disk reclaim space is performed on.
22 keeplist_rv_pks (list): A list of repo version pks that will be excluded from the reclaim
23 disk space.
24 force (bool): If True, uploaded content will be taken into account.
25
26 """
27 reclaimed_repos = Repository.objects.filter(pk__in=repo_pks)
28 for repo in reclaimed_repos:
29 repo.invalidate_cache(everything=True)
30
31 domain = get_domain()
32 rest_of_repos = Repository.objects.filter(pulp_domain=domain).exclude(pk__in=repo_pks)
33 c_keep_qs = Content.objects.filter(repositories__in=rest_of_repos)
34 c_reclaim_qs = Content.objects.filter(repositories__in=repo_pks)
35 c_reclaim_qs = c_reclaim_qs.exclude(
36 pk__in=c_keep_qs, pulp_type=PublishedMetadata.get_pulp_type()
37 )
38
39 if keeplist_rv_pks:
40 rv_qs = RepositoryVersion.objects.filter(pk__in=keeplist_rv_pks)
41 rv_content = Content.objects.none()
42 for rv in rv_qs.iterator():
43 rv_content |= rv.content
44 c_reclaim_qs = c_reclaim_qs.exclude(pk__in=rv_content)
45
46 content_distinct = c_reclaim_qs.distinct("pulp_type")
47 unprotected = []
48 for content in content_distinct:
49 if not content.cast().PROTECTED_FROM_RECLAIM:
50 unprotected.append(content.pulp_type)
51
52 ca_qs = ContentArtifact.objects.select_related("content", "artifact").filter(
53 content__in=c_reclaim_qs.values("pk"), artifact__isnull=False
54 )
55 if not force:
56 ca_qs = ca_qs.filter(remoteartifact__isnull=False)
57 artifact_pks = set()
58 ca_to_update = []
59 for ca in ca_qs.iterator():
60 if ca.content.pulp_type in unprotected:
61 artifact_pks.add(ca.artifact.pk)
62 ca.artifact = None
63 ca_to_update.append(ca)
64
65 ContentArtifact.objects.bulk_update(objs=ca_to_update, fields=["artifact"], batch_size=1000)
66 artifacts_to_delete = Artifact.objects.filter(pk__in=artifact_pks)
67 progress_bar = ProgressReport(
68 message="Reclaim disk space",
69 total=artifacts_to_delete.count(),
70 code="reclaim-space.artifact",
71 done=0,
72 state="running",
73 )
74 progress_bar.save()
75
76 counter = 0
77 interval = 100
78 for artifact in artifacts_to_delete.iterator():
79 # we need to manually call delete() because it cleans up the file on the filesystem
80 artifact.delete()
81 progress_bar.done += 1
82 counter += 1
83
84 if counter >= interval:
85 progress_bar.save()
86 counter = 0
87
88 progress_bar.state = "completed"
89 progress_bar.save()
90
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pulpcore/app/tasks/reclaim_space.py b/pulpcore/app/tasks/reclaim_space.py
--- a/pulpcore/app/tasks/reclaim_space.py
+++ b/pulpcore/app/tasks/reclaim_space.py
@@ -1,3 +1,7 @@
+from logging import getLogger
+
+from django.db.models.deletion import ProtectedError
+
from pulpcore.app.models import (
Artifact,
Content,
@@ -9,6 +13,8 @@
)
from pulpcore.app.util import get_domain
+log = getLogger(__name__)
+
def reclaim_space(repo_pks, keeplist_rv_pks=None, force=False):
"""
@@ -76,10 +82,16 @@
counter = 0
interval = 100
for artifact in artifacts_to_delete.iterator():
- # we need to manually call delete() because it cleans up the file on the filesystem
- artifact.delete()
- progress_bar.done += 1
- counter += 1
+ try:
+ # we need to manually call delete() because it cleans up the file on the filesystem
+ artifact.delete()
+ except ProtectedError as e:
+ # Rarely artifact could be shared between to different content units.
+ # Just log and skip the artifact deletion in this case
+ log.info(e)
+ else:
+ progress_bar.done += 1
+ counter += 1
if counter >= interval:
progress_bar.save()
| {"golden_diff": "diff --git a/pulpcore/app/tasks/reclaim_space.py b/pulpcore/app/tasks/reclaim_space.py\n--- a/pulpcore/app/tasks/reclaim_space.py\n+++ b/pulpcore/app/tasks/reclaim_space.py\n@@ -1,3 +1,7 @@\n+from logging import getLogger\n+\n+from django.db.models.deletion import ProtectedError\n+\n from pulpcore.app.models import (\n Artifact,\n Content,\n@@ -9,6 +13,8 @@\n )\n from pulpcore.app.util import get_domain\n \n+log = getLogger(__name__)\n+\n \n def reclaim_space(repo_pks, keeplist_rv_pks=None, force=False):\n \"\"\"\n@@ -76,10 +82,16 @@\n counter = 0\n interval = 100\n for artifact in artifacts_to_delete.iterator():\n- # we need to manually call delete() because it cleans up the file on the filesystem\n- artifact.delete()\n- progress_bar.done += 1\n- counter += 1\n+ try:\n+ # we need to manually call delete() because it cleans up the file on the filesystem\n+ artifact.delete()\n+ except ProtectedError as e:\n+ # Rarely artifact could be shared between to different content units.\n+ # Just log and skip the artifact deletion in this case\n+ log.info(e)\n+ else:\n+ progress_bar.done += 1\n+ counter += 1\n \n if counter >= interval:\n progress_bar.save()\n", "issue": "Reclaim space for repository fails with Cannot delete some instances of model 'Artifact' because they are referenced through protected foreign keys: 'ContentArtifact.artifact'.\"\n**Version**\r\n3.16, but probably all versions\r\n\r\n**Describe the bug**\r\n\r\n- Reclaim space for repository fails with the following error.\r\n\r\n ~~~\r\n Task paused with error: \"(\"Cannot delete some instances of model 'Artifact' because they are referenced through protected foreign keys: 'ContentArtifact.artifact'.\", {<ContentArtifact: pk=452959ad-c045-4e85-bf9f-6651ba37f57d>})\"\r\n ~~~\r\n\r\n**To Reproduce**\r\nSee BZ\r\n\r\n**Additional context**\r\nhttps://bugzilla.redhat.com/show_bug.cgi?id=2169322\r\n\n", "before_files": [{"content": "from pulpcore.app.models import (\n Artifact,\n Content,\n ContentArtifact,\n ProgressReport,\n PublishedMetadata,\n Repository,\n RepositoryVersion,\n)\nfrom pulpcore.app.util import get_domain\n\n\ndef reclaim_space(repo_pks, keeplist_rv_pks=None, force=False):\n \"\"\"\n This task frees-up disk space by removing Artifact files from the filesystem for Content\n exclusive to the list of provided repos.\n\n Note: content marked as `proctected` will be excluded from the reclaim disk space.\n\n Kwargs:\n repo_pks (list): A list of repo pks the disk reclaim space is performed on.\n keeplist_rv_pks (list): A list of repo version pks that will be excluded from the reclaim\n disk space.\n force (bool): If True, uploaded content will be taken into account.\n\n \"\"\"\n reclaimed_repos = Repository.objects.filter(pk__in=repo_pks)\n for repo in reclaimed_repos:\n repo.invalidate_cache(everything=True)\n\n domain = get_domain()\n rest_of_repos = Repository.objects.filter(pulp_domain=domain).exclude(pk__in=repo_pks)\n c_keep_qs = Content.objects.filter(repositories__in=rest_of_repos)\n c_reclaim_qs = Content.objects.filter(repositories__in=repo_pks)\n c_reclaim_qs = c_reclaim_qs.exclude(\n pk__in=c_keep_qs, pulp_type=PublishedMetadata.get_pulp_type()\n )\n\n if keeplist_rv_pks:\n rv_qs = RepositoryVersion.objects.filter(pk__in=keeplist_rv_pks)\n rv_content = Content.objects.none()\n for rv in rv_qs.iterator():\n rv_content |= rv.content\n c_reclaim_qs = c_reclaim_qs.exclude(pk__in=rv_content)\n\n content_distinct = c_reclaim_qs.distinct(\"pulp_type\")\n unprotected = []\n for content in 
content_distinct:\n if not content.cast().PROTECTED_FROM_RECLAIM:\n unprotected.append(content.pulp_type)\n\n ca_qs = ContentArtifact.objects.select_related(\"content\", \"artifact\").filter(\n content__in=c_reclaim_qs.values(\"pk\"), artifact__isnull=False\n )\n if not force:\n ca_qs = ca_qs.filter(remoteartifact__isnull=False)\n artifact_pks = set()\n ca_to_update = []\n for ca in ca_qs.iterator():\n if ca.content.pulp_type in unprotected:\n artifact_pks.add(ca.artifact.pk)\n ca.artifact = None\n ca_to_update.append(ca)\n\n ContentArtifact.objects.bulk_update(objs=ca_to_update, fields=[\"artifact\"], batch_size=1000)\n artifacts_to_delete = Artifact.objects.filter(pk__in=artifact_pks)\n progress_bar = ProgressReport(\n message=\"Reclaim disk space\",\n total=artifacts_to_delete.count(),\n code=\"reclaim-space.artifact\",\n done=0,\n state=\"running\",\n )\n progress_bar.save()\n\n counter = 0\n interval = 100\n for artifact in artifacts_to_delete.iterator():\n # we need to manually call delete() because it cleans up the file on the filesystem\n artifact.delete()\n progress_bar.done += 1\n counter += 1\n\n if counter >= interval:\n progress_bar.save()\n counter = 0\n\n progress_bar.state = \"completed\"\n progress_bar.save()\n", "path": "pulpcore/app/tasks/reclaim_space.py"}], "after_files": [{"content": "from logging import getLogger\n\nfrom django.db.models.deletion import ProtectedError\n\nfrom pulpcore.app.models import (\n Artifact,\n Content,\n ContentArtifact,\n ProgressReport,\n PublishedMetadata,\n Repository,\n RepositoryVersion,\n)\nfrom pulpcore.app.util import get_domain\n\nlog = getLogger(__name__)\n\n\ndef reclaim_space(repo_pks, keeplist_rv_pks=None, force=False):\n \"\"\"\n This task frees-up disk space by removing Artifact files from the filesystem for Content\n exclusive to the list of provided repos.\n\n Note: content marked as `proctected` will be excluded from the reclaim disk space.\n\n Kwargs:\n repo_pks (list): A list of repo pks the disk reclaim space is performed on.\n keeplist_rv_pks (list): A list of repo version pks that will be excluded from the reclaim\n disk space.\n force (bool): If True, uploaded content will be taken into account.\n\n \"\"\"\n reclaimed_repos = Repository.objects.filter(pk__in=repo_pks)\n for repo in reclaimed_repos:\n repo.invalidate_cache(everything=True)\n\n domain = get_domain()\n rest_of_repos = Repository.objects.filter(pulp_domain=domain).exclude(pk__in=repo_pks)\n c_keep_qs = Content.objects.filter(repositories__in=rest_of_repos)\n c_reclaim_qs = Content.objects.filter(repositories__in=repo_pks)\n c_reclaim_qs = c_reclaim_qs.exclude(\n pk__in=c_keep_qs, pulp_type=PublishedMetadata.get_pulp_type()\n )\n\n if keeplist_rv_pks:\n rv_qs = RepositoryVersion.objects.filter(pk__in=keeplist_rv_pks)\n rv_content = Content.objects.none()\n for rv in rv_qs.iterator():\n rv_content |= rv.content\n c_reclaim_qs = c_reclaim_qs.exclude(pk__in=rv_content)\n\n content_distinct = c_reclaim_qs.distinct(\"pulp_type\")\n unprotected = []\n for content in content_distinct:\n if not content.cast().PROTECTED_FROM_RECLAIM:\n unprotected.append(content.pulp_type)\n\n ca_qs = ContentArtifact.objects.select_related(\"content\", \"artifact\").filter(\n content__in=c_reclaim_qs.values(\"pk\"), artifact__isnull=False\n )\n if not force:\n ca_qs = ca_qs.filter(remoteartifact__isnull=False)\n artifact_pks = set()\n ca_to_update = []\n for ca in ca_qs.iterator():\n if ca.content.pulp_type in unprotected:\n artifact_pks.add(ca.artifact.pk)\n ca.artifact = None\n 
ca_to_update.append(ca)\n\n ContentArtifact.objects.bulk_update(objs=ca_to_update, fields=[\"artifact\"], batch_size=1000)\n artifacts_to_delete = Artifact.objects.filter(pk__in=artifact_pks)\n progress_bar = ProgressReport(\n message=\"Reclaim disk space\",\n total=artifacts_to_delete.count(),\n code=\"reclaim-space.artifact\",\n done=0,\n state=\"running\",\n )\n progress_bar.save()\n\n counter = 0\n interval = 100\n for artifact in artifacts_to_delete.iterator():\n try:\n # we need to manually call delete() because it cleans up the file on the filesystem\n artifact.delete()\n except ProtectedError as e:\n # Rarely artifact could be shared between to different content units.\n # Just log and skip the artifact deletion in this case\n log.info(e)\n else:\n progress_bar.done += 1\n counter += 1\n\n if counter >= interval:\n progress_bar.save()\n counter = 0\n\n progress_bar.state = \"completed\"\n progress_bar.save()\n", "path": "pulpcore/app/tasks/reclaim_space.py"}]} | 1,349 | 322 |
gh_patches_debug_5389 | rasdani/github-patches | git_diff | vega__altair-989 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
scatter-matrix example don't fit into screen
https://altair-viz.github.io/gallery/scatter_matrix.html#scatter-matrix

```python
#...
).properties(
width=150, # suggested instead of 250
height=150 # suggested instead of 250
).repeat(
#...
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `altair/vegalite/v2/examples/scatter_matrix.py`
Content:
```
1 """
2 Scatter Matrix
3 --------------
4 An example of using a RepeatChart to construct a multi-panel scatter plot
5 with linked panning and zooming.
6 """
7 # category: scatter plots
8 import altair as alt
9 from vega_datasets import data
10
11 alt.Chart(data.cars.url).mark_circle().encode(
12 alt.X(alt.repeat("column"), type='quantitative'),
13 alt.Y(alt.repeat("row"), type='quantitative'),
14 color='Origin:N'
15 ).properties(
16 width=250,
17 height=250
18 ).repeat(
19 row=['Horsepower', 'Acceleration', 'Miles_per_Gallon'],
20 column=['Miles_per_Gallon', 'Acceleration', 'Horsepower']
21 ).interactive()
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/altair/vegalite/v2/examples/scatter_matrix.py b/altair/vegalite/v2/examples/scatter_matrix.py
--- a/altair/vegalite/v2/examples/scatter_matrix.py
+++ b/altair/vegalite/v2/examples/scatter_matrix.py
@@ -13,8 +13,8 @@
alt.Y(alt.repeat("row"), type='quantitative'),
color='Origin:N'
).properties(
- width=250,
- height=250
+ width=150,
+ height=150
).repeat(
row=['Horsepower', 'Acceleration', 'Miles_per_Gallon'],
column=['Miles_per_Gallon', 'Acceleration', 'Horsepower']
| {"golden_diff": "diff --git a/altair/vegalite/v2/examples/scatter_matrix.py b/altair/vegalite/v2/examples/scatter_matrix.py\n--- a/altair/vegalite/v2/examples/scatter_matrix.py\n+++ b/altair/vegalite/v2/examples/scatter_matrix.py\n@@ -13,8 +13,8 @@\n alt.Y(alt.repeat(\"row\"), type='quantitative'),\n color='Origin:N'\n ).properties(\n- width=250,\n- height=250\n+ width=150,\n+ height=150\n ).repeat(\n row=['Horsepower', 'Acceleration', 'Miles_per_Gallon'],\n column=['Miles_per_Gallon', 'Acceleration', 'Horsepower']\n", "issue": "scatter-matrix example don't fit into screen\nhttps://altair-viz.github.io/gallery/scatter_matrix.html#scatter-matrix\r\n\r\n```python\r\n#...\r\n).properties(\r\n width=150, # suggested instead of 250\r\n height=150 # suggested instead of 250\r\n).repeat(\r\n#...\r\n```\n", "before_files": [{"content": "\"\"\"\nScatter Matrix\n--------------\nAn example of using a RepeatChart to construct a multi-panel scatter plot\nwith linked panning and zooming.\n\"\"\"\n# category: scatter plots\nimport altair as alt\nfrom vega_datasets import data\n\nalt.Chart(data.cars.url).mark_circle().encode(\n alt.X(alt.repeat(\"column\"), type='quantitative'),\n alt.Y(alt.repeat(\"row\"), type='quantitative'),\n color='Origin:N'\n).properties(\n width=250,\n height=250\n).repeat(\n row=['Horsepower', 'Acceleration', 'Miles_per_Gallon'],\n column=['Miles_per_Gallon', 'Acceleration', 'Horsepower']\n).interactive()\n", "path": "altair/vegalite/v2/examples/scatter_matrix.py"}], "after_files": [{"content": "\"\"\"\nScatter Matrix\n--------------\nAn example of using a RepeatChart to construct a multi-panel scatter plot\nwith linked panning and zooming.\n\"\"\"\n# category: scatter plots\nimport altair as alt\nfrom vega_datasets import data\n\nalt.Chart(data.cars.url).mark_circle().encode(\n alt.X(alt.repeat(\"column\"), type='quantitative'),\n alt.Y(alt.repeat(\"row\"), type='quantitative'),\n color='Origin:N'\n).properties(\n width=150,\n height=150\n).repeat(\n row=['Horsepower', 'Acceleration', 'Miles_per_Gallon'],\n column=['Miles_per_Gallon', 'Acceleration', 'Horsepower']\n).interactive()\n", "path": "altair/vegalite/v2/examples/scatter_matrix.py"}]} | 592 | 172 |
gh_patches_debug_16489 | rasdani/github-patches | git_diff | fossasia__open-event-server-6754 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error in migrations
**Describe the bug**
Due to the recent merging of PRs https://github.com/fossasia/open-event-server/pull/6744/ and https://github.com/fossasia/open-event-server/pull/6748,
the PR https://github.com/fossasia/open-event-server/pull/6744 has an invalid migration to remove the 'secret' column from the db, which was unrelated to the PR. Hence we now have two migrations for the same column `secret` from `settings`, which
drop the column on upgrade and create it on downgrade.
**Additional context**
@iamareebjamal - Please check
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py`
Content:
```
1 """empty message
2
3 Revision ID: 6ebafb385765
4 Revises: 30a490ad1609
5 Create Date: 2020-01-17 18:09:31.897988
6
7 """
8
9 from alembic import op
10 import sqlalchemy as sa
11 import sqlalchemy_utils
12
13
14 # revision identifiers, used by Alembic.
15 revision = '6ebafb385765'
16 down_revision = '30a490ad1609'
17
18
19 def upgrade():
20 # ### commands auto generated by Alembic - please adjust! ###
21 op.drop_column('settings', 'secret')
22 op.add_column('ticket_holders', sa.Column('age_group', sa.String(), nullable=True))
23 # ### end Alembic commands ###
24
25
26 def downgrade():
27 # ### commands auto generated by Alembic - please adjust! ###
28 op.drop_column('ticket_holders', 'age_group')
29 op.add_column('settings', sa.Column('secret', sa.VARCHAR(), autoincrement=False, nullable=True))
30 # ### end Alembic commands ###
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py b/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py
--- a/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py
+++ b/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py
@@ -18,7 +18,6 @@
def upgrade():
# ### commands auto generated by Alembic - please adjust! ###
- op.drop_column('settings', 'secret')
op.add_column('ticket_holders', sa.Column('age_group', sa.String(), nullable=True))
# ### end Alembic commands ###
@@ -26,5 +25,4 @@
def downgrade():
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column('ticket_holders', 'age_group')
- op.add_column('settings', sa.Column('secret', sa.VARCHAR(), autoincrement=False, nullable=True))
# ### end Alembic commands ###
| {"golden_diff": "diff --git a/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py b/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py\n--- a/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py\n+++ b/migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py\n@@ -18,7 +18,6 @@\n \n def upgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n- op.drop_column('settings', 'secret')\n op.add_column('ticket_holders', sa.Column('age_group', sa.String(), nullable=True))\n # ### end Alembic commands ###\n \n@@ -26,5 +25,4 @@\n def downgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.drop_column('ticket_holders', 'age_group')\n- op.add_column('settings', sa.Column('secret', sa.VARCHAR(), autoincrement=False, nullable=True))\n # ### end Alembic commands ###\n", "issue": "Error in migrations\n**Describe the bug**\r\nDue to recent merging of PRs https://github.com/fossasia/open-event-server/pull/6744/ https://github.com/fossasia/open-event-server/pull/6748\r\n\r\nThe PR https://github.com/fossasia/open-event-server/pull/6744 has an invalid migration to remove the 'secret' column from the db which was unrelated to PR. Hence now we have two migration for the same column `secret` from `settings` which\r\n drops column on upgrade and creates on downgrade\r\n\r\n\r\n**Additional context**\r\n@iamareebjamal - Please chck\r\n\n", "before_files": [{"content": "\"\"\"empty message\n\nRevision ID: 6ebafb385765\nRevises: 30a490ad1609\nCreate Date: 2020-01-17 18:09:31.897988\n\n\"\"\"\n\nfrom alembic import op\nimport sqlalchemy as sa\nimport sqlalchemy_utils\n\n\n# revision identifiers, used by Alembic.\nrevision = '6ebafb385765'\ndown_revision = '30a490ad1609'\n\n\ndef upgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.drop_column('settings', 'secret')\n op.add_column('ticket_holders', sa.Column('age_group', sa.String(), nullable=True))\n # ### end Alembic commands ###\n\n\ndef downgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.drop_column('ticket_holders', 'age_group')\n op.add_column('settings', sa.Column('secret', sa.VARCHAR(), autoincrement=False, nullable=True))\n # ### end Alembic commands ###\n", "path": "migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py"}], "after_files": [{"content": "\"\"\"empty message\n\nRevision ID: 6ebafb385765\nRevises: 30a490ad1609\nCreate Date: 2020-01-17 18:09:31.897988\n\n\"\"\"\n\nfrom alembic import op\nimport sqlalchemy as sa\nimport sqlalchemy_utils\n\n\n# revision identifiers, used by Alembic.\nrevision = '6ebafb385765'\ndown_revision = '30a490ad1609'\n\n\ndef upgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.add_column('ticket_holders', sa.Column('age_group', sa.String(), nullable=True))\n # ### end Alembic commands ###\n\n\ndef downgrade():\n # ### commands auto generated by Alembic - please adjust! ###\n op.drop_column('ticket_holders', 'age_group')\n # ### end Alembic commands ###\n", "path": "migrations/versions/rev-2020-01-17-18:09:31-6ebafb385765_.py"}]} | 734 | 313 |
gh_patches_debug_13555 | rasdani/github-patches | git_diff | cisagov__manage.get.gov-1461 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Content: Clarify that a domain that appears "available" is not guaranteed
### Story
As an applicant
I want to understand that my requested domain is not guaranteed
so that I can set appropriate expectations for myself and my team.
### Acceptance Criteria
On the ".gov Domain" step in the application process, language appears that clarifies a requested domain is not guaranteed.
### Additional Context
During user testing, most people understood what to input, but it could’ve been more explicit that the domain they entered was not guaranteed, even if available. When prompted, most participants understood the process they would go through to get the domain, but the language could be more explicit.
### Issue Links
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/api/views.py`
Content:
```
1 """Internal API views"""
2 from django.apps import apps
3 from django.views.decorators.http import require_http_methods
4 from django.http import HttpResponse, JsonResponse
5 from django.utils.safestring import mark_safe
6
7 from registrar.templatetags.url_helpers import public_site_url
8 from registrar.utility.errors import GenericError, GenericErrorCodes
9
10 import requests
11
12 from login_required import login_not_required
13
14 from cachetools.func import ttl_cache
15
16 from registrar.utility.s3_bucket import S3ClientError, S3ClientHelper
17
18
19 DOMAIN_FILE_URL = "https://raw.githubusercontent.com/cisagov/dotgov-data/main/current-full.csv"
20
21
22 DOMAIN_API_MESSAGES = {
23 "required": "Enter the .gov domain you want. Don’t include “www” or “.gov.”"
24 " For example, if you want www.city.gov, you would enter “city”"
25 " (without the quotes).",
26 "extra_dots": "Enter the .gov domain you want without any periods.",
27 # message below is considered safe; no user input can be inserted into the message
28 # body; public_site_url() function reads from local app settings and therefore safe
29 "unavailable": mark_safe( # nosec
30 "That domain isn’t available. "
31 "<a class='usa-link' href='{}' target='_blank'>"
32 "Read more about choosing your .gov domain.</a>".format(public_site_url("domains/choosing"))
33 ),
34 "invalid": "Enter a domain using only letters, numbers, or hyphens (though we don't recommend using hyphens).",
35 "success": "That domain is available!",
36 "error": GenericError.get_error_message(GenericErrorCodes.CANNOT_CONTACT_REGISTRY),
37 }
38
39
40 # this file doesn't change that often, nor is it that big, so cache the result
41 # in memory for ten minutes
42 @ttl_cache(ttl=600)
43 def _domains():
44 """Return a list of the current .gov domains.
45
46 Fetch a file from DOMAIN_FILE_URL, parse the CSV for the domain,
47 lowercase everything and return the list.
48 """
49 DraftDomain = apps.get_model("registrar.DraftDomain")
50 # 5 second timeout
51 file_contents = requests.get(DOMAIN_FILE_URL, timeout=5).text
52 domains = set()
53 # skip the first line
54 for line in file_contents.splitlines()[1:]:
55 # get the domain before the first comma
56 domain = line.split(",", 1)[0]
57 # sanity-check the string we got from the file here
58 if DraftDomain.string_could_be_domain(domain):
59 # lowercase everything when we put it in domains
60 domains.add(domain.lower())
61 return domains
62
63
64 def check_domain_available(domain):
65 """Return true if the given domain is available.
66
67 The given domain is lowercased to match against the domains list. If the
68 given domain doesn't end with .gov, ".gov" is added when looking for
69 a match. If check fails, throws a RegistryError.
70 """
71 Domain = apps.get_model("registrar.Domain")
72 if domain.endswith(".gov"):
73 return Domain.available(domain)
74 else:
75 # domain search string doesn't end with .gov, add it on here
76 return Domain.available(domain + ".gov")
77
78
79 @require_http_methods(["GET"])
80 @login_not_required
81 def available(request, domain=""):
82 """Is a given domain available or not.
83
84 Response is a JSON dictionary with the key "available" and value true or
85 false.
86 """
87 domain = request.GET.get("domain", "")
88 DraftDomain = apps.get_model("registrar.DraftDomain")
89 # validate that the given domain could be a domain name and fail early if
90 # not.
91 if not (DraftDomain.string_could_be_domain(domain) or DraftDomain.string_could_be_domain(domain + ".gov")):
92 return JsonResponse({"available": False, "message": DOMAIN_API_MESSAGES["invalid"]})
93 # a domain is available if it is NOT in the list of current domains
94 try:
95 if check_domain_available(domain):
96 return JsonResponse({"available": True, "message": DOMAIN_API_MESSAGES["success"]})
97 else:
98 return JsonResponse({"available": False, "message": DOMAIN_API_MESSAGES["unavailable"]})
99 except Exception:
100 return JsonResponse({"available": False, "message": DOMAIN_API_MESSAGES["error"]})
101
102
103 @require_http_methods(["GET"])
104 @login_not_required
105 def get_current_full(request, file_name="current-full.csv"):
106 """This will return the file content of current-full.csv which is the command
107 output of generate_current_full_report.py. This command iterates through each Domain
108 and returns a CSV representation."""
109 return serve_file(file_name)
110
111
112 @require_http_methods(["GET"])
113 @login_not_required
114 def get_current_federal(request, file_name="current-federal.csv"):
115 """This will return the file content of current-federal.csv which is the command
116 output of generate_current_federal_report.py. This command iterates through each Domain
117 and returns a CSV representation."""
118 return serve_file(file_name)
119
120
121 def serve_file(file_name):
122 """Downloads a file based on a given filepath. Returns a 500 if not found."""
123 s3_client = S3ClientHelper()
124 # Serve the CSV file. If not found, an exception will be thrown.
125 # This will then be caught by flat, causing it to not read it - which is what we want.
126 try:
127 file = s3_client.get_file(file_name, decode_to_utf=True)
128 except S3ClientError as err:
129 # TODO - #1317: Notify operations when auto report generation fails
130 raise err
131
132 response = HttpResponse(file)
133 return response
134
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/api/views.py b/src/api/views.py
--- a/src/api/views.py
+++ b/src/api/views.py
@@ -32,7 +32,9 @@
"Read more about choosing your .gov domain.</a>".format(public_site_url("domains/choosing"))
),
"invalid": "Enter a domain using only letters, numbers, or hyphens (though we don't recommend using hyphens).",
- "success": "That domain is available!",
+ "success": "That domain is available! We’ll try to give you the domain you want, \
+ but it's not guaranteed. After you complete this form, we’ll \
+ evaluate whether your request meets our requirements.",
"error": GenericError.get_error_message(GenericErrorCodes.CANNOT_CONTACT_REGISTRY),
}
| {"golden_diff": "diff --git a/src/api/views.py b/src/api/views.py\n--- a/src/api/views.py\n+++ b/src/api/views.py\n@@ -32,7 +32,9 @@\n \"Read more about choosing your .gov domain.</a>\".format(public_site_url(\"domains/choosing\"))\n ),\n \"invalid\": \"Enter a domain using only letters, numbers, or hyphens (though we don't recommend using hyphens).\",\n- \"success\": \"That domain is available!\",\n+ \"success\": \"That domain is available! We\u2019ll try to give you the domain you want, \\\n+ but it's not guaranteed. After you complete this form, we\u2019ll \\\n+ evaluate whether your request meets our requirements.\",\n \"error\": GenericError.get_error_message(GenericErrorCodes.CANNOT_CONTACT_REGISTRY),\n }\n", "issue": "Content: Clarify that a domain that appears \"available\" is not guaranteed\n### Story\nAs an applicant\nI want to understand that my requested domain is not guaranteed\nso that I can set appropriate expectations for myself and my team.\n\n\n### Acceptance Criteria\nOn the \".gov Domain\" step in the application process, language appears that clarifies a requested domain is not guaranteed.\n\n### Additional Context\nDuring user testing, most people understood what to input, but it could\u2019ve been more explicit that the domain they entered was not guaranteed, even if available. When prompted, most participants understood the process they would go through to get the domain, but the language could be more explicit.\n\n### Issue Links\n_No response_\n", "before_files": [{"content": "\"\"\"Internal API views\"\"\"\nfrom django.apps import apps\nfrom django.views.decorators.http import require_http_methods\nfrom django.http import HttpResponse, JsonResponse\nfrom django.utils.safestring import mark_safe\n\nfrom registrar.templatetags.url_helpers import public_site_url\nfrom registrar.utility.errors import GenericError, GenericErrorCodes\n\nimport requests\n\nfrom login_required import login_not_required\n\nfrom cachetools.func import ttl_cache\n\nfrom registrar.utility.s3_bucket import S3ClientError, S3ClientHelper\n\n\nDOMAIN_FILE_URL = \"https://raw.githubusercontent.com/cisagov/dotgov-data/main/current-full.csv\"\n\n\nDOMAIN_API_MESSAGES = {\n \"required\": \"Enter the .gov domain you want. Don\u2019t include \u201cwww\u201d or \u201c.gov.\u201d\"\n \" For example, if you want www.city.gov, you would enter \u201ccity\u201d\"\n \" (without the quotes).\",\n \"extra_dots\": \"Enter the .gov domain you want without any periods.\",\n # message below is considered safe; no user input can be inserted into the message\n # body; public_site_url() function reads from local app settings and therefore safe\n \"unavailable\": mark_safe( # nosec\n \"That domain isn\u2019t available. 
\"\n \"<a class='usa-link' href='{}' target='_blank'>\"\n \"Read more about choosing your .gov domain.</a>\".format(public_site_url(\"domains/choosing\"))\n ),\n \"invalid\": \"Enter a domain using only letters, numbers, or hyphens (though we don't recommend using hyphens).\",\n \"success\": \"That domain is available!\",\n \"error\": GenericError.get_error_message(GenericErrorCodes.CANNOT_CONTACT_REGISTRY),\n}\n\n\n# this file doesn't change that often, nor is it that big, so cache the result\n# in memory for ten minutes\n@ttl_cache(ttl=600)\ndef _domains():\n \"\"\"Return a list of the current .gov domains.\n\n Fetch a file from DOMAIN_FILE_URL, parse the CSV for the domain,\n lowercase everything and return the list.\n \"\"\"\n DraftDomain = apps.get_model(\"registrar.DraftDomain\")\n # 5 second timeout\n file_contents = requests.get(DOMAIN_FILE_URL, timeout=5).text\n domains = set()\n # skip the first line\n for line in file_contents.splitlines()[1:]:\n # get the domain before the first comma\n domain = line.split(\",\", 1)[0]\n # sanity-check the string we got from the file here\n if DraftDomain.string_could_be_domain(domain):\n # lowercase everything when we put it in domains\n domains.add(domain.lower())\n return domains\n\n\ndef check_domain_available(domain):\n \"\"\"Return true if the given domain is available.\n\n The given domain is lowercased to match against the domains list. If the\n given domain doesn't end with .gov, \".gov\" is added when looking for\n a match. If check fails, throws a RegistryError.\n \"\"\"\n Domain = apps.get_model(\"registrar.Domain\")\n if domain.endswith(\".gov\"):\n return Domain.available(domain)\n else:\n # domain search string doesn't end with .gov, add it on here\n return Domain.available(domain + \".gov\")\n\n\n@require_http_methods([\"GET\"])\n@login_not_required\ndef available(request, domain=\"\"):\n \"\"\"Is a given domain available or not.\n\n Response is a JSON dictionary with the key \"available\" and value true or\n false.\n \"\"\"\n domain = request.GET.get(\"domain\", \"\")\n DraftDomain = apps.get_model(\"registrar.DraftDomain\")\n # validate that the given domain could be a domain name and fail early if\n # not.\n if not (DraftDomain.string_could_be_domain(domain) or DraftDomain.string_could_be_domain(domain + \".gov\")):\n return JsonResponse({\"available\": False, \"message\": DOMAIN_API_MESSAGES[\"invalid\"]})\n # a domain is available if it is NOT in the list of current domains\n try:\n if check_domain_available(domain):\n return JsonResponse({\"available\": True, \"message\": DOMAIN_API_MESSAGES[\"success\"]})\n else:\n return JsonResponse({\"available\": False, \"message\": DOMAIN_API_MESSAGES[\"unavailable\"]})\n except Exception:\n return JsonResponse({\"available\": False, \"message\": DOMAIN_API_MESSAGES[\"error\"]})\n\n\n@require_http_methods([\"GET\"])\n@login_not_required\ndef get_current_full(request, file_name=\"current-full.csv\"):\n \"\"\"This will return the file content of current-full.csv which is the command\n output of generate_current_full_report.py. This command iterates through each Domain\n and returns a CSV representation.\"\"\"\n return serve_file(file_name)\n\n\n@require_http_methods([\"GET\"])\n@login_not_required\ndef get_current_federal(request, file_name=\"current-federal.csv\"):\n \"\"\"This will return the file content of current-federal.csv which is the command\n output of generate_current_federal_report.py. 
This command iterates through each Domain\n and returns a CSV representation.\"\"\"\n return serve_file(file_name)\n\n\ndef serve_file(file_name):\n \"\"\"Downloads a file based on a given filepath. Returns a 500 if not found.\"\"\"\n s3_client = S3ClientHelper()\n # Serve the CSV file. If not found, an exception will be thrown.\n # This will then be caught by flat, causing it to not read it - which is what we want.\n try:\n file = s3_client.get_file(file_name, decode_to_utf=True)\n except S3ClientError as err:\n # TODO - #1317: Notify operations when auto report generation fails\n raise err\n\n response = HttpResponse(file)\n return response\n", "path": "src/api/views.py"}], "after_files": [{"content": "\"\"\"Internal API views\"\"\"\nfrom django.apps import apps\nfrom django.views.decorators.http import require_http_methods\nfrom django.http import HttpResponse, JsonResponse\nfrom django.utils.safestring import mark_safe\n\nfrom registrar.templatetags.url_helpers import public_site_url\nfrom registrar.utility.errors import GenericError, GenericErrorCodes\n\nimport requests\n\nfrom login_required import login_not_required\n\nfrom cachetools.func import ttl_cache\n\nfrom registrar.utility.s3_bucket import S3ClientError, S3ClientHelper\n\n\nDOMAIN_FILE_URL = \"https://raw.githubusercontent.com/cisagov/dotgov-data/main/current-full.csv\"\n\n\nDOMAIN_API_MESSAGES = {\n \"required\": \"Enter the .gov domain you want. Don\u2019t include \u201cwww\u201d or \u201c.gov.\u201d\"\n \" For example, if you want www.city.gov, you would enter \u201ccity\u201d\"\n \" (without the quotes).\",\n \"extra_dots\": \"Enter the .gov domain you want without any periods.\",\n # message below is considered safe; no user input can be inserted into the message\n # body; public_site_url() function reads from local app settings and therefore safe\n \"unavailable\": mark_safe( # nosec\n \"That domain isn\u2019t available. \"\n \"<a class='usa-link' href='{}' target='_blank'>\"\n \"Read more about choosing your .gov domain.</a>\".format(public_site_url(\"domains/choosing\"))\n ),\n \"invalid\": \"Enter a domain using only letters, numbers, or hyphens (though we don't recommend using hyphens).\",\n \"success\": \"That domain is available! We\u2019ll try to give you the domain you want, \\\n but it's not guaranteed. After you complete this form, we\u2019ll \\\n evaluate whether your request meets our requirements.\",\n \"error\": GenericError.get_error_message(GenericErrorCodes.CANNOT_CONTACT_REGISTRY),\n}\n\n\n# this file doesn't change that often, nor is it that big, so cache the result\n# in memory for ten minutes\n@ttl_cache(ttl=600)\ndef _domains():\n \"\"\"Return a list of the current .gov domains.\n\n Fetch a file from DOMAIN_FILE_URL, parse the CSV for the domain,\n lowercase everything and return the list.\n \"\"\"\n DraftDomain = apps.get_model(\"registrar.DraftDomain\")\n # 5 second timeout\n file_contents = requests.get(DOMAIN_FILE_URL, timeout=5).text\n domains = set()\n # skip the first line\n for line in file_contents.splitlines()[1:]:\n # get the domain before the first comma\n domain = line.split(\",\", 1)[0]\n # sanity-check the string we got from the file here\n if DraftDomain.string_could_be_domain(domain):\n # lowercase everything when we put it in domains\n domains.add(domain.lower())\n return domains\n\n\ndef check_domain_available(domain):\n \"\"\"Return true if the given domain is available.\n\n The given domain is lowercased to match against the domains list. 
If the\n given domain doesn't end with .gov, \".gov\" is added when looking for\n a match. If check fails, throws a RegistryError.\n \"\"\"\n Domain = apps.get_model(\"registrar.Domain\")\n if domain.endswith(\".gov\"):\n return Domain.available(domain)\n else:\n # domain search string doesn't end with .gov, add it on here\n return Domain.available(domain + \".gov\")\n\n\n@require_http_methods([\"GET\"])\n@login_not_required\ndef available(request, domain=\"\"):\n \"\"\"Is a given domain available or not.\n\n Response is a JSON dictionary with the key \"available\" and value true or\n false.\n \"\"\"\n domain = request.GET.get(\"domain\", \"\")\n DraftDomain = apps.get_model(\"registrar.DraftDomain\")\n # validate that the given domain could be a domain name and fail early if\n # not.\n if not (DraftDomain.string_could_be_domain(domain) or DraftDomain.string_could_be_domain(domain + \".gov\")):\n return JsonResponse({\"available\": False, \"message\": DOMAIN_API_MESSAGES[\"invalid\"]})\n # a domain is available if it is NOT in the list of current domains\n try:\n if check_domain_available(domain):\n return JsonResponse({\"available\": True, \"message\": DOMAIN_API_MESSAGES[\"success\"]})\n else:\n return JsonResponse({\"available\": False, \"message\": DOMAIN_API_MESSAGES[\"unavailable\"]})\n except Exception:\n return JsonResponse({\"available\": False, \"message\": DOMAIN_API_MESSAGES[\"error\"]})\n\n\n@require_http_methods([\"GET\"])\n@login_not_required\ndef get_current_full(request, file_name=\"current-full.csv\"):\n \"\"\"This will return the file content of current-full.csv which is the command\n output of generate_current_full_report.py. This command iterates through each Domain\n and returns a CSV representation.\"\"\"\n return serve_file(file_name)\n\n\n@require_http_methods([\"GET\"])\n@login_not_required\ndef get_current_federal(request, file_name=\"current-federal.csv\"):\n \"\"\"This will return the file content of current-federal.csv which is the command\n output of generate_current_federal_report.py. This command iterates through each Domain\n and returns a CSV representation.\"\"\"\n return serve_file(file_name)\n\n\ndef serve_file(file_name):\n \"\"\"Downloads a file based on a given filepath. Returns a 500 if not found.\"\"\"\n s3_client = S3ClientHelper()\n # Serve the CSV file. If not found, an exception will be thrown.\n # This will then be caught by flat, causing it to not read it - which is what we want.\n try:\n file = s3_client.get_file(file_name, decode_to_utf=True)\n except S3ClientError as err:\n # TODO - #1317: Notify operations when auto report generation fails\n raise err\n\n response = HttpResponse(file)\n return response\n", "path": "src/api/views.py"}]} | 1,925 | 175 |
gh_patches_debug_5180 | rasdani/github-patches | git_diff | oppia__oppia-7075 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add the practice session experience to the topic viewer
Add an option in the topic viewer page that can start the practice session, based on the skills that are part of the topic.
Add the practice session experience to the topic viewer
Add an option in the topic viewer page that can start the practice session, based on the skills that are part of the topic.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/controllers/topic_viewer.py`
Content:
```
1 # Copyright 2018 The Oppia Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS-IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Controllers for the topic viewer page."""
16
17 from constants import constants
18 from core.controllers import acl_decorators
19 from core.controllers import base
20 from core.domain import story_services
21 from core.domain import topic_services
22 import feconf
23
24
25 class TopicViewerPage(base.BaseHandler):
26 """Renders the topic viewer page."""
27
28 @acl_decorators.can_access_topic_viewer_page
29 def get(self, _):
30 """Handles GET requests."""
31
32 if not constants.ENABLE_NEW_STRUCTURE_PLAYERS:
33 raise self.PageNotFoundException
34
35 self.render_template('dist/topic-viewer-page.mainpage.html')
36
37
38 class TopicPageDataHandler(base.BaseHandler):
39 """Manages the data that needs to be displayed to a learner on the topic
40 viewer page.
41 """
42 GET_HANDLER_ERROR_RETURN_TYPE = feconf.HANDLER_TYPE_JSON
43
44 @acl_decorators.can_access_topic_viewer_page
45 def get(self, topic_name):
46 """Handles GET requests."""
47
48 if not constants.ENABLE_NEW_STRUCTURE_PLAYERS:
49 raise self.PageNotFoundException
50
51 topic = topic_services.get_topic_by_name(topic_name)
52
53 canonical_story_summaries = [
54 story_services.get_story_summary_by_id(
55 canonical_story_id) for canonical_story_id
56 in topic.canonical_story_ids]
57
58 additional_story_summaries = [
59 story_services.get_story_summary_by_id(
60 additional_story_id) for additional_story_id
61 in topic.additional_story_ids]
62
63 canonical_story_dicts = [
64 summary.to_human_readable_dict() for summary
65 in canonical_story_summaries]
66
67 additional_story_dicts = [
68 summary.to_human_readable_dict() for summary
69 in additional_story_summaries]
70
71 uncategorized_skill_ids = topic.get_all_uncategorized_skill_ids()
72 subtopics = topic.get_all_subtopics()
73
74 self.values.update({
75 'topic_name': topic.name,
76 'canonical_story_dicts': canonical_story_dicts,
77 'additional_story_dicts': additional_story_dicts,
78 'uncategorized_skill_ids': uncategorized_skill_ids,
79 'subtopics': subtopics
80 })
81 self.render_json(self.values)
82
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/controllers/topic_viewer.py b/core/controllers/topic_viewer.py
--- a/core/controllers/topic_viewer.py
+++ b/core/controllers/topic_viewer.py
@@ -73,6 +73,7 @@
self.values.update({
'topic_name': topic.name,
+ 'topic_id': topic.id,
'canonical_story_dicts': canonical_story_dicts,
'additional_story_dicts': additional_story_dicts,
'uncategorized_skill_ids': uncategorized_skill_ids,
| {"golden_diff": "diff --git a/core/controllers/topic_viewer.py b/core/controllers/topic_viewer.py\n--- a/core/controllers/topic_viewer.py\n+++ b/core/controllers/topic_viewer.py\n@@ -73,6 +73,7 @@\n \n self.values.update({\n 'topic_name': topic.name,\n+ 'topic_id': topic.id,\n 'canonical_story_dicts': canonical_story_dicts,\n 'additional_story_dicts': additional_story_dicts,\n 'uncategorized_skill_ids': uncategorized_skill_ids,\n", "issue": "Add the practice session experience to the topic viewer\nAdd an option in the topic viewer page that can start the practice session, based on the skills that are part of the topic.\nAdd the practice session experience to the topic viewer\nAdd an option in the topic viewer page that can start the practice session, based on the skills that are part of the topic.\n", "before_files": [{"content": "# Copyright 2018 The Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Controllers for the topic viewer page.\"\"\"\n\nfrom constants import constants\nfrom core.controllers import acl_decorators\nfrom core.controllers import base\nfrom core.domain import story_services\nfrom core.domain import topic_services\nimport feconf\n\n\nclass TopicViewerPage(base.BaseHandler):\n \"\"\"Renders the topic viewer page.\"\"\"\n\n @acl_decorators.can_access_topic_viewer_page\n def get(self, _):\n \"\"\"Handles GET requests.\"\"\"\n\n if not constants.ENABLE_NEW_STRUCTURE_PLAYERS:\n raise self.PageNotFoundException\n\n self.render_template('dist/topic-viewer-page.mainpage.html')\n\n\nclass TopicPageDataHandler(base.BaseHandler):\n \"\"\"Manages the data that needs to be displayed to a learner on the topic\n viewer page.\n \"\"\"\n GET_HANDLER_ERROR_RETURN_TYPE = feconf.HANDLER_TYPE_JSON\n\n @acl_decorators.can_access_topic_viewer_page\n def get(self, topic_name):\n \"\"\"Handles GET requests.\"\"\"\n\n if not constants.ENABLE_NEW_STRUCTURE_PLAYERS:\n raise self.PageNotFoundException\n\n topic = topic_services.get_topic_by_name(topic_name)\n\n canonical_story_summaries = [\n story_services.get_story_summary_by_id(\n canonical_story_id) for canonical_story_id\n in topic.canonical_story_ids]\n\n additional_story_summaries = [\n story_services.get_story_summary_by_id(\n additional_story_id) for additional_story_id\n in topic.additional_story_ids]\n\n canonical_story_dicts = [\n summary.to_human_readable_dict() for summary\n in canonical_story_summaries]\n\n additional_story_dicts = [\n summary.to_human_readable_dict() for summary\n in additional_story_summaries]\n\n uncategorized_skill_ids = topic.get_all_uncategorized_skill_ids()\n subtopics = topic.get_all_subtopics()\n\n self.values.update({\n 'topic_name': topic.name,\n 'canonical_story_dicts': canonical_story_dicts,\n 'additional_story_dicts': additional_story_dicts,\n 'uncategorized_skill_ids': uncategorized_skill_ids,\n 'subtopics': subtopics\n })\n self.render_json(self.values)\n", "path": "core/controllers/topic_viewer.py"}], "after_files": [{"content": "# Copyright 2018 The 
Oppia Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS-IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Controllers for the topic viewer page.\"\"\"\n\nfrom constants import constants\nfrom core.controllers import acl_decorators\nfrom core.controllers import base\nfrom core.domain import story_services\nfrom core.domain import topic_services\nimport feconf\n\n\nclass TopicViewerPage(base.BaseHandler):\n \"\"\"Renders the topic viewer page.\"\"\"\n\n @acl_decorators.can_access_topic_viewer_page\n def get(self, _):\n \"\"\"Handles GET requests.\"\"\"\n\n if not constants.ENABLE_NEW_STRUCTURE_PLAYERS:\n raise self.PageNotFoundException\n\n self.render_template('dist/topic-viewer-page.mainpage.html')\n\n\nclass TopicPageDataHandler(base.BaseHandler):\n \"\"\"Manages the data that needs to be displayed to a learner on the topic\n viewer page.\n \"\"\"\n GET_HANDLER_ERROR_RETURN_TYPE = feconf.HANDLER_TYPE_JSON\n\n @acl_decorators.can_access_topic_viewer_page\n def get(self, topic_name):\n \"\"\"Handles GET requests.\"\"\"\n\n if not constants.ENABLE_NEW_STRUCTURE_PLAYERS:\n raise self.PageNotFoundException\n\n topic = topic_services.get_topic_by_name(topic_name)\n\n canonical_story_summaries = [\n story_services.get_story_summary_by_id(\n canonical_story_id) for canonical_story_id\n in topic.canonical_story_ids]\n\n additional_story_summaries = [\n story_services.get_story_summary_by_id(\n additional_story_id) for additional_story_id\n in topic.additional_story_ids]\n\n canonical_story_dicts = [\n summary.to_human_readable_dict() for summary\n in canonical_story_summaries]\n\n additional_story_dicts = [\n summary.to_human_readable_dict() for summary\n in additional_story_summaries]\n\n uncategorized_skill_ids = topic.get_all_uncategorized_skill_ids()\n subtopics = topic.get_all_subtopics()\n\n self.values.update({\n 'topic_name': topic.name,\n 'topic_id': topic.id,\n 'canonical_story_dicts': canonical_story_dicts,\n 'additional_story_dicts': additional_story_dicts,\n 'uncategorized_skill_ids': uncategorized_skill_ids,\n 'subtopics': subtopics\n })\n self.render_json(self.values)\n", "path": "core/controllers/topic_viewer.py"}]} | 1,058 | 99 |
gh_patches_debug_42262 | rasdani/github-patches | git_diff | open-telemetry__opentelemetry-python-806 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ext/pyramid: add docs page
An entry in the docs is missing for the pyramid instrumentation.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py`
Content:
```
1 # Copyright The OpenTelemetry Authors
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """
16 Pyramid instrumentation supporting `pyramid`_, it can be enabled by
17 using ``PyramidInstrumentor``.
18
19 .. _pyramid: https://docs.pylonsproject.org/projects/pyramid/en/latest/
20
21 Usage
22 -----
23 There are two methods to instrument Pyramid:
24
25 Method 1 (Instrument all Configurators):
26 ----------------------------------------
27 .. code:: python
28
29 from pyramid.config import Configurator
30 from opentelemetry.ext.pyramid import PyramidInstrumentor
31
32 PyramidInstrumentor.instrument()
33
34 config = Configurator()
35
36 # use your config as normal
37 config.add_route('index', '/')
38
39 Method 2 (Instrument one Configurator):
40 ---------------------------------------
41 .. code:: python
42
43 from pyramid.config import Configurator
44 from opentelemetry.ext.pyramid import PyramidInstrumentor
45
46 config = Configurator()
47 PyramidInstrumentor().instrument_config(config)
48
49 # use your config as normal
50 config.add_route('index', '/')
51
52 Using ``pyramid.tweens`` settings:
53 ----------------------------------
54 If you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,
55 you need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,
56 *as well as* instrumenting the config with `PyramidInstrumentor().instrument_config(config)`.
57
58 For example:
59 .. code:: python
60 settings = {
61 'pyramid.tweens', 'opentelemetry.ext.pyramid.trace_tween_factory\\nyour_tween_no_1\\nyour_tween_no_2',
62 }
63 config = Configurator(settings=settings)
64 PyramidInstrumentor.instrument_config(config)
65
66 # use your config as normal.
67 config.add_route('index', '/')
68 ---
69 """
70
71 import typing
72
73 from pyramid.config import Configurator
74 from pyramid.path import caller_package
75 from pyramid.settings import aslist
76 from wrapt import ObjectProxy
77 from wrapt import wrap_function_wrapper as _wrap
78
79 from opentelemetry.ext.pyramid.callbacks import (
80 SETTING_TRACE_ENABLED,
81 TWEEN_NAME,
82 trace_tween_factory,
83 )
84 from opentelemetry.ext.pyramid.version import __version__
85 from opentelemetry.instrumentation.instrumentor import BaseInstrumentor
86 from opentelemetry.instrumentation.utils import unwrap
87 from opentelemetry.trace import TracerProvider, get_tracer
88
89
90 def traced_init(wrapped, instance, args, kwargs):
91 settings = kwargs.get("settings", {})
92 tweens = aslist(settings.get("pyramid.tweens", []))
93
94 if tweens and TWEEN_NAME not in settings:
95 # pyramid.tweens.EXCVIEW is the name of built-in exception view provided by
96 # pyramid. We need our tween to be before it, otherwise unhandled
97 # exceptions will be caught before they reach our tween.
98 tweens = [TWEEN_NAME] + tweens
99
100 settings["pyramid.tweens"] = "\n".join(tweens)
101
102 kwargs["settings"] = settings
103
104 # `caller_package` works by walking a fixed amount of frames up the stack
105 # to find the calling package. So if we let the original `__init__`
106 # function call it, our wrapper will mess things up.
107 if not kwargs.get("package", None):
108 # Get the package for the third frame up from this one.
109 # Default is `level=2` which will give us the package from `wrapt`
110 # instead of the desired package (the caller)
111 kwargs["package"] = caller_package(level=3)
112
113 wrapped(*args, **kwargs)
114 instance.include("opentelemetry.ext.pyramid.callbacks")
115
116
117 class PyramidInstrumentor(BaseInstrumentor):
118 def _instrument(self, **kwargs):
119 """Integrate with Pyramid Python library.
120 https://docs.pylonsproject.org/projects/pyramid/en/latest/
121 """
122 _wrap("pyramid.config", "Configurator.__init__", traced_init)
123
124 def _uninstrument(self, **kwargs):
125 """"Disable Pyramid instrumentation"""
126 unwrap(Configurator, "__init__")
127
128 # pylint:disable=no-self-use
129 def instrument_config(self, config):
130 """Enable instrumentation in a Pyramid configurator.
131
132 Args:
133 config: The Configurator to instrument.
134
135 Returns:
136 An instrumented Configurator.
137 """
138 config.include("opentelemetry.ext.pyramid.callbacks")
139
140 def uninstrument_config(self, config):
141 config.add_settings({SETTING_TRACE_ENABLED: False})
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py b/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py
--- a/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py
+++ b/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py
@@ -20,16 +20,17 @@
Usage
-----
- There are two methods to instrument Pyramid:
+There are two methods to instrument Pyramid:
Method 1 (Instrument all Configurators):
----------------------------------------
+
.. code:: python
from pyramid.config import Configurator
from opentelemetry.ext.pyramid import PyramidInstrumentor
- PyramidInstrumentor.instrument()
+ PyramidInstrumentor().instrument()
config = Configurator()
@@ -38,6 +39,7 @@
Method 2 (Instrument one Configurator):
---------------------------------------
+
.. code:: python
from pyramid.config import Configurator
@@ -49,22 +51,30 @@
# use your config as normal
config.add_route('index', '/')
-Using ``pyramid.tweens`` settings:
-----------------------------------
- If you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,
- you need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,
- *as well as* instrumenting the config with `PyramidInstrumentor().instrument_config(config)`.
+Using ``pyramid.tweens`` setting:
+---------------------------------
+
+If you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,
+you need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,
+*as well as* instrumenting the config as shown above.
+
+For example:
- For example:
.. code:: python
+
+ from pyramid.config import Configurator
+ from opentelemetry.ext.pyramid import PyramidInstrumentor
+
settings = {
'pyramid.tweens', 'opentelemetry.ext.pyramid.trace_tween_factory\\nyour_tween_no_1\\nyour_tween_no_2',
}
config = Configurator(settings=settings)
- PyramidInstrumentor.instrument_config(config)
+ PyramidInstrumentor().instrument_config(config)
# use your config as normal.
config.add_route('index', '/')
+
+API
---
"""
@@ -87,7 +97,7 @@
from opentelemetry.trace import TracerProvider, get_tracer
-def traced_init(wrapped, instance, args, kwargs):
+def _traced_init(wrapped, instance, args, kwargs):
settings = kwargs.get("settings", {})
tweens = aslist(settings.get("pyramid.tweens", []))
@@ -119,7 +129,7 @@
"""Integrate with Pyramid Python library.
https://docs.pylonsproject.org/projects/pyramid/en/latest/
"""
- _wrap("pyramid.config", "Configurator.__init__", traced_init)
+ _wrap("pyramid.config", "Configurator.__init__", _traced_init)
def _uninstrument(self, **kwargs):
""""Disable Pyramid instrumentation"""
@@ -131,9 +141,6 @@
Args:
config: The Configurator to instrument.
-
- Returns:
- An instrumented Configurator.
"""
config.include("opentelemetry.ext.pyramid.callbacks")
| {"golden_diff": "diff --git a/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py b/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py\n--- a/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py\n+++ b/ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py\n@@ -20,16 +20,17 @@\n \n Usage\n -----\n- There are two methods to instrument Pyramid:\n+There are two methods to instrument Pyramid:\n \n Method 1 (Instrument all Configurators):\n ----------------------------------------\n+\n .. code:: python\n \n from pyramid.config import Configurator\n from opentelemetry.ext.pyramid import PyramidInstrumentor\n \n- PyramidInstrumentor.instrument()\n+ PyramidInstrumentor().instrument()\n \n config = Configurator()\n \n@@ -38,6 +39,7 @@\n \n Method 2 (Instrument one Configurator):\n ---------------------------------------\n+\n .. code:: python\n \n from pyramid.config import Configurator\n@@ -49,22 +51,30 @@\n # use your config as normal\n config.add_route('index', '/')\n \n-Using ``pyramid.tweens`` settings:\n-----------------------------------\n- If you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,\n- you need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,\n- *as well as* instrumenting the config with `PyramidInstrumentor().instrument_config(config)`.\n+Using ``pyramid.tweens`` setting:\n+---------------------------------\n+\n+If you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,\n+you need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,\n+*as well as* instrumenting the config as shown above.\n+\n+For example:\n \n- For example:\n .. 
code:: python\n+\n+ from pyramid.config import Configurator\n+ from opentelemetry.ext.pyramid import PyramidInstrumentor\n+\n settings = {\n 'pyramid.tweens', 'opentelemetry.ext.pyramid.trace_tween_factory\\\\nyour_tween_no_1\\\\nyour_tween_no_2',\n }\n config = Configurator(settings=settings)\n- PyramidInstrumentor.instrument_config(config)\n+ PyramidInstrumentor().instrument_config(config)\n \n # use your config as normal.\n config.add_route('index', '/')\n+\n+API\n ---\n \"\"\"\n \n@@ -87,7 +97,7 @@\n from opentelemetry.trace import TracerProvider, get_tracer\n \n \n-def traced_init(wrapped, instance, args, kwargs):\n+def _traced_init(wrapped, instance, args, kwargs):\n settings = kwargs.get(\"settings\", {})\n tweens = aslist(settings.get(\"pyramid.tweens\", []))\n \n@@ -119,7 +129,7 @@\n \"\"\"Integrate with Pyramid Python library.\n https://docs.pylonsproject.org/projects/pyramid/en/latest/\n \"\"\"\n- _wrap(\"pyramid.config\", \"Configurator.__init__\", traced_init)\n+ _wrap(\"pyramid.config\", \"Configurator.__init__\", _traced_init)\n \n def _uninstrument(self, **kwargs):\n \"\"\"\"Disable Pyramid instrumentation\"\"\"\n@@ -131,9 +141,6 @@\n \n Args:\n config: The Configurator to instrument.\n-\n- Returns:\n- An instrumented Configurator.\n \"\"\"\n config.include(\"opentelemetry.ext.pyramid.callbacks\")\n", "issue": "ext/pyramid: add docs page\nAn entry in the docs is missing for the pyramid instrumentation.\n", "before_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nPyramid instrumentation supporting `pyramid`_, it can be enabled by\nusing ``PyramidInstrumentor``.\n\n.. _pyramid: https://docs.pylonsproject.org/projects/pyramid/en/latest/\n\nUsage\n-----\n There are two methods to instrument Pyramid:\n\nMethod 1 (Instrument all Configurators):\n----------------------------------------\n.. code:: python\n\n from pyramid.config import Configurator\n from opentelemetry.ext.pyramid import PyramidInstrumentor\n\n PyramidInstrumentor.instrument()\n\n config = Configurator()\n\n # use your config as normal\n config.add_route('index', '/')\n\nMethod 2 (Instrument one Configurator):\n---------------------------------------\n.. code:: python\n\n from pyramid.config import Configurator\n from opentelemetry.ext.pyramid import PyramidInstrumentor\n\n config = Configurator()\n PyramidInstrumentor().instrument_config(config)\n\n # use your config as normal\n config.add_route('index', '/')\n\nUsing ``pyramid.tweens`` settings:\n----------------------------------\n If you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,\n you need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,\n *as well as* instrumenting the config with `PyramidInstrumentor().instrument_config(config)`.\n\n For example:\n.. 
code:: python\n settings = {\n 'pyramid.tweens', 'opentelemetry.ext.pyramid.trace_tween_factory\\\\nyour_tween_no_1\\\\nyour_tween_no_2',\n }\n config = Configurator(settings=settings)\n PyramidInstrumentor.instrument_config(config)\n\n # use your config as normal.\n config.add_route('index', '/')\n---\n\"\"\"\n\nimport typing\n\nfrom pyramid.config import Configurator\nfrom pyramid.path import caller_package\nfrom pyramid.settings import aslist\nfrom wrapt import ObjectProxy\nfrom wrapt import wrap_function_wrapper as _wrap\n\nfrom opentelemetry.ext.pyramid.callbacks import (\n SETTING_TRACE_ENABLED,\n TWEEN_NAME,\n trace_tween_factory,\n)\nfrom opentelemetry.ext.pyramid.version import __version__\nfrom opentelemetry.instrumentation.instrumentor import BaseInstrumentor\nfrom opentelemetry.instrumentation.utils import unwrap\nfrom opentelemetry.trace import TracerProvider, get_tracer\n\n\ndef traced_init(wrapped, instance, args, kwargs):\n settings = kwargs.get(\"settings\", {})\n tweens = aslist(settings.get(\"pyramid.tweens\", []))\n\n if tweens and TWEEN_NAME not in settings:\n # pyramid.tweens.EXCVIEW is the name of built-in exception view provided by\n # pyramid. We need our tween to be before it, otherwise unhandled\n # exceptions will be caught before they reach our tween.\n tweens = [TWEEN_NAME] + tweens\n\n settings[\"pyramid.tweens\"] = \"\\n\".join(tweens)\n\n kwargs[\"settings\"] = settings\n\n # `caller_package` works by walking a fixed amount of frames up the stack\n # to find the calling package. So if we let the original `__init__`\n # function call it, our wrapper will mess things up.\n if not kwargs.get(\"package\", None):\n # Get the package for the third frame up from this one.\n # Default is `level=2` which will give us the package from `wrapt`\n # instead of the desired package (the caller)\n kwargs[\"package\"] = caller_package(level=3)\n\n wrapped(*args, **kwargs)\n instance.include(\"opentelemetry.ext.pyramid.callbacks\")\n\n\nclass PyramidInstrumentor(BaseInstrumentor):\n def _instrument(self, **kwargs):\n \"\"\"Integrate with Pyramid Python library.\n https://docs.pylonsproject.org/projects/pyramid/en/latest/\n \"\"\"\n _wrap(\"pyramid.config\", \"Configurator.__init__\", traced_init)\n\n def _uninstrument(self, **kwargs):\n \"\"\"\"Disable Pyramid instrumentation\"\"\"\n unwrap(Configurator, \"__init__\")\n\n # pylint:disable=no-self-use\n def instrument_config(self, config):\n \"\"\"Enable instrumentation in a Pyramid configurator.\n\n Args:\n config: The Configurator to instrument.\n\n Returns:\n An instrumented Configurator.\n \"\"\"\n config.include(\"opentelemetry.ext.pyramid.callbacks\")\n\n def uninstrument_config(self, config):\n config.add_settings({SETTING_TRACE_ENABLED: False})\n", "path": "ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py"}], "after_files": [{"content": "# Copyright The OpenTelemetry Authors\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"\nPyramid instrumentation supporting `pyramid`_, it can be 
enabled by\nusing ``PyramidInstrumentor``.\n\n.. _pyramid: https://docs.pylonsproject.org/projects/pyramid/en/latest/\n\nUsage\n-----\nThere are two methods to instrument Pyramid:\n\nMethod 1 (Instrument all Configurators):\n----------------------------------------\n\n.. code:: python\n\n from pyramid.config import Configurator\n from opentelemetry.ext.pyramid import PyramidInstrumentor\n\n PyramidInstrumentor().instrument()\n\n config = Configurator()\n\n # use your config as normal\n config.add_route('index', '/')\n\nMethod 2 (Instrument one Configurator):\n---------------------------------------\n\n.. code:: python\n\n from pyramid.config import Configurator\n from opentelemetry.ext.pyramid import PyramidInstrumentor\n\n config = Configurator()\n PyramidInstrumentor().instrument_config(config)\n\n # use your config as normal\n config.add_route('index', '/')\n\nUsing ``pyramid.tweens`` setting:\n---------------------------------\n\nIf you use Method 2 and then set tweens for your application with the ``pyramid.tweens`` setting,\nyou need to add ``opentelemetry.ext.pyramid.trace_tween_factory`` explicity to the list,\n*as well as* instrumenting the config as shown above.\n\nFor example:\n\n.. code:: python\n\n from pyramid.config import Configurator\n from opentelemetry.ext.pyramid import PyramidInstrumentor\n\n settings = {\n 'pyramid.tweens', 'opentelemetry.ext.pyramid.trace_tween_factory\\\\nyour_tween_no_1\\\\nyour_tween_no_2',\n }\n config = Configurator(settings=settings)\n PyramidInstrumentor().instrument_config(config)\n\n # use your config as normal.\n config.add_route('index', '/')\n\nAPI\n---\n\"\"\"\n\nimport typing\n\nfrom pyramid.config import Configurator\nfrom pyramid.path import caller_package\nfrom pyramid.settings import aslist\nfrom wrapt import ObjectProxy\nfrom wrapt import wrap_function_wrapper as _wrap\n\nfrom opentelemetry.ext.pyramid.callbacks import (\n SETTING_TRACE_ENABLED,\n TWEEN_NAME,\n trace_tween_factory,\n)\nfrom opentelemetry.ext.pyramid.version import __version__\nfrom opentelemetry.instrumentation.instrumentor import BaseInstrumentor\nfrom opentelemetry.instrumentation.utils import unwrap\nfrom opentelemetry.trace import TracerProvider, get_tracer\n\n\ndef _traced_init(wrapped, instance, args, kwargs):\n settings = kwargs.get(\"settings\", {})\n tweens = aslist(settings.get(\"pyramid.tweens\", []))\n\n if tweens and TWEEN_NAME not in settings:\n # pyramid.tweens.EXCVIEW is the name of built-in exception view provided by\n # pyramid. We need our tween to be before it, otherwise unhandled\n # exceptions will be caught before they reach our tween.\n tweens = [TWEEN_NAME] + tweens\n\n settings[\"pyramid.tweens\"] = \"\\n\".join(tweens)\n\n kwargs[\"settings\"] = settings\n\n # `caller_package` works by walking a fixed amount of frames up the stack\n # to find the calling package. 
So if we let the original `__init__`\n # function call it, our wrapper will mess things up.\n if not kwargs.get(\"package\", None):\n # Get the package for the third frame up from this one.\n # Default is `level=2` which will give us the package from `wrapt`\n # instead of the desired package (the caller)\n kwargs[\"package\"] = caller_package(level=3)\n\n wrapped(*args, **kwargs)\n instance.include(\"opentelemetry.ext.pyramid.callbacks\")\n\n\nclass PyramidInstrumentor(BaseInstrumentor):\n def _instrument(self, **kwargs):\n \"\"\"Integrate with Pyramid Python library.\n https://docs.pylonsproject.org/projects/pyramid/en/latest/\n \"\"\"\n _wrap(\"pyramid.config\", \"Configurator.__init__\", _traced_init)\n\n def _uninstrument(self, **kwargs):\n \"\"\"\"Disable Pyramid instrumentation\"\"\"\n unwrap(Configurator, \"__init__\")\n\n # pylint:disable=no-self-use\n def instrument_config(self, config):\n \"\"\"Enable instrumentation in a Pyramid configurator.\n\n Args:\n config: The Configurator to instrument.\n \"\"\"\n config.include(\"opentelemetry.ext.pyramid.callbacks\")\n\n def uninstrument_config(self, config):\n config.add_settings({SETTING_TRACE_ENABLED: False})\n", "path": "ext/opentelemetry-ext-pyramid/src/opentelemetry/ext/pyramid/__init__.py"}]} | 1,691 | 789 |
gh_patches_debug_18633 | rasdani/github-patches | git_diff | microsoft__botbuilder-python-1970 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Version dependency between Botbuilder and Question Answering
### [Github issues](https://github.com/Microsoft/botbuilder-python) should be used for bugs and feature requests. Use [Stack Overflow](https://stackoverflow.com/questions/tagged/botframework) for general "how-to" questions.
## Version
Botbuilder 4.14.0
azure-ai-language-questionanswering 1.0.0
## Describe the bug
When building a Python bot using Question Answering and Botbuilder:
The current version of azure-ai-language-questionanswering (1.0.0) requires at least msrest 0.6.21, but the current Botbuilder packages (-schema/core/ai/etc) at 4.14.2 pin msrest to 0.6.19 (I'm using 4.14.0, which has the same conflict).
There's no way these two packages work together right now because of the msrest version dependency.
## To Reproduce
1. Create a requirements.txt file with these dependencies
azure-ai-language-questionanswering==1.0.0
botbuilder-ai==4.14.0
botbuilder-core==4.14.0
botbuilder-dialogs==4.14.0
botbuilder-integration-aiohttp==4.14.0
botbuilder-schema==4.14.0
2. Install dependencies of a bot project with pip install -r requirements.txt
## Expected behavior
Upgrade Botbuilder to work with msrest version 0.6.21 OR
Decrease Question Answering version dependency of msrest to 0.6.19 or lower
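A rough sketch of the first option is to widen the exact msrest pin in the Botbuilder setup files so that both packages can resolve a common msrest version; the range below is only an assumption for illustration, not a decided fix:
```python
# Sketch for libraries/botbuilder-schema/setup.py (and the other Botbuilder packages):
# widen the pin instead of fixing msrest to a single patch release.
REQUIRES = ["msrest>=0.6.19,<0.7"]  # currently: ["msrest==0.6.19"]
```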
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libraries/botframework-connector/setup.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 import os
5 from setuptools import setup
6
7 NAME = "botframework-connector"
8 VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
9 REQUIRES = [
10 "msrest==0.6.19",
11 "requests>=2.23.0,<2.26",
12 "PyJWT>=1.5.3,<2.0.0",
13 "botbuilder-schema==4.15.0",
14 "msal==1.17.0",
15 ]
16
17 root = os.path.abspath(os.path.dirname(__file__))
18
19 with open(os.path.join(root, "README.rst"), encoding="utf-8") as f:
20 long_description = f.read()
21
22 setup(
23 name=NAME,
24 version=VERSION,
25 description="Microsoft Bot Framework Bot Builder SDK for Python.",
26 author="Microsoft",
27 url="https://www.github.com/Microsoft/botbuilder-python",
28 keywords=["BotFrameworkConnector", "bots", "ai", "botframework", "botbuilder"],
29 install_requires=REQUIRES,
30 packages=[
31 "botframework.connector",
32 "botframework.connector.auth",
33 "botframework.connector.async_mixin",
34 "botframework.connector.operations",
35 "botframework.connector.models",
36 "botframework.connector.aio",
37 "botframework.connector.aio.operations_async",
38 "botframework.connector.skills",
39 "botframework.connector.teams",
40 "botframework.connector.teams.operations",
41 "botframework.connector.token_api",
42 "botframework.connector.token_api.aio",
43 "botframework.connector.token_api.aio.operations_async",
44 "botframework.connector.token_api.models",
45 "botframework.connector.token_api.operations",
46 ],
47 include_package_data=True,
48 long_description=long_description,
49 long_description_content_type="text/x-rst",
50 license="MIT",
51 classifiers=[
52 "Programming Language :: Python :: 3.7",
53 "Intended Audience :: Developers",
54 "License :: OSI Approved :: MIT License",
55 "Operating System :: OS Independent",
56 "Development Status :: 5 - Production/Stable",
57 "Topic :: Scientific/Engineering :: Artificial Intelligence",
58 ],
59 )
60
```
Path: `libraries/botbuilder-schema/setup.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License.
3
4 import os
5 from setuptools import setup
6
7 NAME = "botbuilder-schema"
8 VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
9 REQUIRES = ["msrest==0.6.19"]
10
11 root = os.path.abspath(os.path.dirname(__file__))
12
13 with open(os.path.join(root, "README.rst"), encoding="utf-8") as f:
14 long_description = f.read()
15
16 setup(
17 name=NAME,
18 version=VERSION,
19 description="BotBuilder Schema",
20 author="Microsoft",
21 url="https://github.com/Microsoft/botbuilder-python",
22 keywords=["BotBuilderSchema", "bots", "ai", "botframework", "botbuilder"],
23 long_description=long_description,
24 long_description_content_type="text/x-rst",
25 license="MIT",
26 install_requires=REQUIRES,
27 packages=[
28 "botbuilder.schema",
29 "botbuilder.schema.teams",
30 ],
31 include_package_data=True,
32 classifiers=[
33 "Programming Language :: Python :: 3.7",
34 "Intended Audience :: Developers",
35 "License :: OSI Approved :: MIT License",
36 "Operating System :: OS Independent",
37 "Development Status :: 5 - Production/Stable",
38 "Topic :: Scientific/Engineering :: Artificial Intelligence",
39 ],
40 )
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libraries/botbuilder-schema/setup.py b/libraries/botbuilder-schema/setup.py
--- a/libraries/botbuilder-schema/setup.py
+++ b/libraries/botbuilder-schema/setup.py
@@ -6,7 +6,7 @@
NAME = "botbuilder-schema"
VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
-REQUIRES = ["msrest==0.6.19"]
+REQUIRES = ["msrest==0.6.*"]
root = os.path.abspath(os.path.dirname(__file__))
diff --git a/libraries/botframework-connector/setup.py b/libraries/botframework-connector/setup.py
--- a/libraries/botframework-connector/setup.py
+++ b/libraries/botframework-connector/setup.py
@@ -7,11 +7,11 @@
NAME = "botframework-connector"
VERSION = os.environ["packageVersion"] if "packageVersion" in os.environ else "4.15.0"
REQUIRES = [
- "msrest==0.6.19",
+ "msrest==0.6.*",
"requests>=2.23.0,<2.26",
"PyJWT>=1.5.3,<2.0.0",
"botbuilder-schema==4.15.0",
- "msal==1.17.0",
+ "msal==1.*",
]
root = os.path.abspath(os.path.dirname(__file__))
| {"golden_diff": "diff --git a/libraries/botbuilder-schema/setup.py b/libraries/botbuilder-schema/setup.py\n--- a/libraries/botbuilder-schema/setup.py\n+++ b/libraries/botbuilder-schema/setup.py\n@@ -6,7 +6,7 @@\n \r\n NAME = \"botbuilder-schema\"\r\n VERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\r\n-REQUIRES = [\"msrest==0.6.19\"]\r\n+REQUIRES = [\"msrest==0.6.*\"]\r\n \r\n root = os.path.abspath(os.path.dirname(__file__))\r\n \r\ndiff --git a/libraries/botframework-connector/setup.py b/libraries/botframework-connector/setup.py\n--- a/libraries/botframework-connector/setup.py\n+++ b/libraries/botframework-connector/setup.py\n@@ -7,11 +7,11 @@\n NAME = \"botframework-connector\"\n VERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\n REQUIRES = [\n- \"msrest==0.6.19\",\n+ \"msrest==0.6.*\",\n \"requests>=2.23.0,<2.26\",\n \"PyJWT>=1.5.3,<2.0.0\",\n \"botbuilder-schema==4.15.0\",\n- \"msal==1.17.0\",\n+ \"msal==1.*\",\n ]\n \n root = os.path.abspath(os.path.dirname(__file__))\n", "issue": "Version dependency between Botbuilder and Question Answering\n### [Github issues](https://github.com/Microsoft/botbuilder-python) should be used for bugs and feature requests. Use [Stack Overflow](https://stackoverflow.com/questions/tagged/botframework) for general \"how-to\" questions. \r\n\r\n## Version\r\nBotbuilder 4.14.0\r\nazure-ai-language-questionanswering 1.0.0\r\n\r\n## Describe the bug\r\nWhen building a Python bot using Question Answering and Botbuilder:\r\nThe package azure-ai-language-questionanswering actual version 1.0.0 requires at least msrest version 0.6.21 but the actual version of Botbuilder (-schema/core/ai/etc) 4.14.2 requires msrest to be 0.6.19 (though I'm using 4.14.0 and the version doesn't match either).\r\nThere's no way these two packages work together right now because of the msrest version dependency.\r\n\r\n\r\n## To Reproduce\r\n1. Create a requirements.txt file with these dependencies \r\nazure-ai-language-questionanswering==1.0.0\r\nbotbuilder-ai==4.14.0\r\nbotbuilder-core==4.14.0\r\nbotbuilder-dialogs==4.14.0\r\nbotbuilder-integration-aiohttp==4.14.0\r\nbotbuilder-schema==4.14.0\r\n\r\n2. Install dependencies of a bot project with pip install -r requirements.txt\r\n\r\n\r\n## Expected behavior\r\nUpgrade Botbuilder to work with msrest version 0.6.21 OR\r\nDecrease Question Answering version dependency of msrest to 0.6.19 or lower\r\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. 
All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nNAME = \"botframework-connector\"\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\nREQUIRES = [\n \"msrest==0.6.19\",\n \"requests>=2.23.0,<2.26\",\n \"PyJWT>=1.5.3,<2.0.0\",\n \"botbuilder-schema==4.15.0\",\n \"msal==1.17.0\",\n]\n\nroot = os.path.abspath(os.path.dirname(__file__))\n\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nsetup(\n name=NAME,\n version=VERSION,\n description=\"Microsoft Bot Framework Bot Builder SDK for Python.\",\n author=\"Microsoft\",\n url=\"https://www.github.com/Microsoft/botbuilder-python\",\n keywords=[\"BotFrameworkConnector\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\n install_requires=REQUIRES,\n packages=[\n \"botframework.connector\",\n \"botframework.connector.auth\",\n \"botframework.connector.async_mixin\",\n \"botframework.connector.operations\",\n \"botframework.connector.models\",\n \"botframework.connector.aio\",\n \"botframework.connector.aio.operations_async\",\n \"botframework.connector.skills\",\n \"botframework.connector.teams\",\n \"botframework.connector.teams.operations\",\n \"botframework.connector.token_api\",\n \"botframework.connector.token_api.aio\",\n \"botframework.connector.token_api.aio.operations_async\",\n \"botframework.connector.token_api.models\",\n \"botframework.connector.token_api.operations\",\n ],\n include_package_data=True,\n long_description=long_description,\n long_description_content_type=\"text/x-rst\",\n license=\"MIT\",\n classifiers=[\n \"Programming Language :: Python :: 3.7\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: MIT License\",\n \"Operating System :: OS Independent\",\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n ],\n)\n", "path": "libraries/botframework-connector/setup.py"}, {"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\r\n# Licensed under the MIT License.\r\n\r\nimport os\r\nfrom setuptools import setup\r\n\r\nNAME = \"botbuilder-schema\"\r\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\r\nREQUIRES = [\"msrest==0.6.19\"]\r\n\r\nroot = os.path.abspath(os.path.dirname(__file__))\r\n\r\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\r\n long_description = f.read()\r\n\r\nsetup(\r\n name=NAME,\r\n version=VERSION,\r\n description=\"BotBuilder Schema\",\r\n author=\"Microsoft\",\r\n url=\"https://github.com/Microsoft/botbuilder-python\",\r\n keywords=[\"BotBuilderSchema\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\r\n long_description=long_description,\r\n long_description_content_type=\"text/x-rst\",\r\n license=\"MIT\",\r\n install_requires=REQUIRES,\r\n packages=[\r\n \"botbuilder.schema\",\r\n \"botbuilder.schema.teams\",\r\n ],\r\n include_package_data=True,\r\n classifiers=[\r\n \"Programming Language :: Python :: 3.7\",\r\n \"Intended Audience :: Developers\",\r\n \"License :: OSI Approved :: MIT License\",\r\n \"Operating System :: OS Independent\",\r\n \"Development Status :: 5 - Production/Stable\",\r\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\r\n ],\r\n)\r\n", "path": "libraries/botbuilder-schema/setup.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. 
All rights reserved.\n# Licensed under the MIT License.\n\nimport os\nfrom setuptools import setup\n\nNAME = \"botframework-connector\"\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\nREQUIRES = [\n \"msrest==0.6.*\",\n \"requests>=2.23.0,<2.26\",\n \"PyJWT>=1.5.3,<2.0.0\",\n \"botbuilder-schema==4.15.0\",\n \"msal==1.*\",\n]\n\nroot = os.path.abspath(os.path.dirname(__file__))\n\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\n long_description = f.read()\n\nsetup(\n name=NAME,\n version=VERSION,\n description=\"Microsoft Bot Framework Bot Builder SDK for Python.\",\n author=\"Microsoft\",\n url=\"https://www.github.com/Microsoft/botbuilder-python\",\n keywords=[\"BotFrameworkConnector\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\n install_requires=REQUIRES,\n packages=[\n \"botframework.connector\",\n \"botframework.connector.auth\",\n \"botframework.connector.async_mixin\",\n \"botframework.connector.operations\",\n \"botframework.connector.models\",\n \"botframework.connector.aio\",\n \"botframework.connector.aio.operations_async\",\n \"botframework.connector.skills\",\n \"botframework.connector.teams\",\n \"botframework.connector.teams.operations\",\n \"botframework.connector.token_api\",\n \"botframework.connector.token_api.aio\",\n \"botframework.connector.token_api.aio.operations_async\",\n \"botframework.connector.token_api.models\",\n \"botframework.connector.token_api.operations\",\n ],\n include_package_data=True,\n long_description=long_description,\n long_description_content_type=\"text/x-rst\",\n license=\"MIT\",\n classifiers=[\n \"Programming Language :: Python :: 3.7\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: MIT License\",\n \"Operating System :: OS Independent\",\n \"Development Status :: 5 - Production/Stable\",\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n ],\n)\n", "path": "libraries/botframework-connector/setup.py"}, {"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\r\n# Licensed under the MIT License.\r\n\r\nimport os\r\nfrom setuptools import setup\r\n\r\nNAME = \"botbuilder-schema\"\r\nVERSION = os.environ[\"packageVersion\"] if \"packageVersion\" in os.environ else \"4.15.0\"\r\nREQUIRES = [\"msrest==0.6.*\"]\r\n\r\nroot = os.path.abspath(os.path.dirname(__file__))\r\n\r\nwith open(os.path.join(root, \"README.rst\"), encoding=\"utf-8\") as f:\r\n long_description = f.read()\r\n\r\nsetup(\r\n name=NAME,\r\n version=VERSION,\r\n description=\"BotBuilder Schema\",\r\n author=\"Microsoft\",\r\n url=\"https://github.com/Microsoft/botbuilder-python\",\r\n keywords=[\"BotBuilderSchema\", \"bots\", \"ai\", \"botframework\", \"botbuilder\"],\r\n long_description=long_description,\r\n long_description_content_type=\"text/x-rst\",\r\n license=\"MIT\",\r\n install_requires=REQUIRES,\r\n packages=[\r\n \"botbuilder.schema\",\r\n \"botbuilder.schema.teams\",\r\n ],\r\n include_package_data=True,\r\n classifiers=[\r\n \"Programming Language :: Python :: 3.7\",\r\n \"Intended Audience :: Developers\",\r\n \"License :: OSI Approved :: MIT License\",\r\n \"Operating System :: OS Independent\",\r\n \"Development Status :: 5 - Production/Stable\",\r\n \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\r\n ],\r\n)\r\n", "path": "libraries/botbuilder-schema/setup.py"}]} | 1,605 | 331 |
gh_patches_debug_30898 | rasdani/github-patches | git_diff | mlflow__mlflow-5403 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] XGBoost autologging breaks when metric with @ is used
Thank you for submitting an issue. Please refer to our [issue policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md) for additional information about bug reports. For help with debugging your code, please refer to [Stack Overflow](https://stackoverflow.com/questions/tagged/mlflow).
**Please fill in this bug report template to ensure a timely and thorough response.**
### Willingness to contribute
The MLflow Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the MLflow code base?
- [x] Yes. I can contribute a fix for this bug independently.
- [ ] Yes. I would be willing to contribute a fix for this bug with guidance from the MLflow community.
- [ ] No. I cannot contribute a bug fix at this time.
### System information
- **Have I written custom code (as opposed to using a stock example script provided in MLflow)**: yes
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: macOS 11.6.3
- **MLflow installed from (source or binary)**: binary
- **MLflow version (run ``mlflow --version``)**: 1.23.1
- **Python version**: 3.7.9
- **npm version, if running the dev UI**:
- **Exact command to reproduce**: `python xgb_metrics_bug.py`
### Describe the problem
XGBoost comes with a few metrics with @ in their name, e.g. `ndcg@2`, see [XGBoost parameters](https://xgboost.readthedocs.io/en/stable/parameter.html#learning-task-parameters).
Expected behavior: MLflow should be able to autolog all built-in XGBoost metrics.
Actual behavior: when using a metric like this, the following warning is shown:
> WARNING mlflow.utils.autologging_utils: Encountered unexpected error during xgboost autologging: Invalid metric name: 'train-ndcg@2'. Names may only contain alphanumerics, underscores (_), dashes (-), periods (.), spaces ( ), and slashes (/).
However, not only is this metric discarded; autologging breaks completely, i.e. the other metrics, the model artifact, and the feature importance plot are also not recorded.

I would propose to string-replace `@` with `_at_` in the XGBoost autologging metrics callback so that MLflow is compatible with all XGBoost built-in metrics.
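A rough sketch of that rename, applied to the evaluation results before they are logged (the helper name is illustrative, not an existing MLflow function):

```python
def _sanitize_metric_name(name: str) -> str:
    # MLflow metric names may not contain "@", so map e.g. "ndcg@2" to "ndcg_at_2".
    return name.replace("@", "_at_")

evaluation_result = {"train-ndcg": 0.9, "train-ndcg@2": 0.8}
logged_metrics = {_sanitize_metric_name(k): v for k, v in evaluation_result.items()}
# {"train-ndcg": 0.9, "train-ndcg_at_2": 0.8}
```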
### Code to reproduce issue
`xgb_metrics_bug.py`:
```python
import mlflow
import numpy as np
import xgboost as xgb
mlflow.set_experiment("xgb_autolog_metrics_bug")
for metrics in [["ndcg"], ["ndcg", "ndcg@2"]]:
with mlflow.start_run():
mlflow.xgboost.autolog()
train_matrix = xgb.DMatrix(np.array([[0], [1]]), label=[1, 0])
params = {
"objective": "rank:pairwise",
"eval_metric": metrics,
}
xgb.train(params, train_matrix, evals=[(train_matrix, "train")])
```
### Other info / logs
- I tried this with xgboost versions 1.3.3 and 1.5.1
### What component(s), interfaces, languages, and integrations does this bug affect?
Components
- [ ] `area/artifacts`: Artifact stores and artifact logging
- [ ] `area/build`: Build and test infrastructure for MLflow
- [ ] `area/docs`: MLflow documentation pages
- [ ] `area/examples`: Example code
- [ ] `area/model-registry`: Model Registry service, APIs, and the fluent client calls for Model Registry
- [ ] `area/models`: MLmodel format, model serialization/deserialization, flavors
- [ ] `area/projects`: MLproject format, project running backends
- [ ] `area/scoring`: MLflow Model server, model deployment tools, Spark UDFs
- [ ] `area/server-infra`: MLflow Tracking server backend
- [x] `area/tracking`: Tracking Service, tracking client APIs, autologging
Interface
- [ ] `area/uiux`: Front-end, user experience, plotting, JavaScript, JavaScript dev server
- [ ] `area/docker`: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
- [ ] `area/sqlalchemy`: Use of SQLAlchemy in the Tracking Service or Model Registry
- [ ] `area/windows`: Windows support
Language
- [ ] `language/r`: R APIs and clients
- [ ] `language/java`: Java APIs and clients
- [ ] `language/new`: Proposals for new client languages
Integrations
- [ ] `integrations/azure`: Azure and Azure ML integrations
- [ ] `integrations/sagemaker`: SageMaker integrations
- [ ] `integrations/databricks`: Databricks integrations
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mlflow/xgboost/_autolog.py`
Content:
```
1 from packaging.version import Version
2 import xgboost
3
4 # Suppress a false positive pylint error: https://github.com/PyCQA/pylint/issues/1630
5 # pylint: disable=unused-import
6 from mlflow.utils.autologging_utils import ExceptionSafeAbstractClass
7
8
9 def autolog_callback(env, metrics_logger, eval_results):
10 metrics_logger.record_metrics(dict(env.evaluation_result_list), env.iteration)
11 eval_results.append(dict(env.evaluation_result_list))
12
13
14 IS_TRAINING_CALLBACK_SUPPORTED = Version(xgboost.__version__.replace("SNAPSHOT", "dev")) >= Version(
15 "1.3.0"
16 )
17
18 if IS_TRAINING_CALLBACK_SUPPORTED:
19
20 class AutologCallback(
21 xgboost.callback.TrainingCallback,
22 metaclass=ExceptionSafeAbstractClass,
23 ):
24 def __init__(self, metrics_logger, eval_results):
25 # pylint: disable=super-init-not-called
26 self.metrics_logger = metrics_logger
27 self.eval_results = eval_results
28
29 def after_iteration(self, model, epoch, evals_log):
30 """
31 Run after each iteration. Return True when training should stop.
32 """
33 # `evals_log` is a nested dict (type: Dict[str, Dict[str, List[float]]])
34 # that looks like this:
35 # {
36 # "train": {
37 # "auc": [0.5, 0.6, 0.7, ...],
38 # ...
39 # },
40 # ...
41 # }
42 evaluation_result_dict = {}
43 for data_name, metric_dict in evals_log.items():
44 for metric_name, metric_values_on_each_iter in metric_dict.items():
45 key = "{}-{}".format(data_name, metric_name)
46 # The last element in `metric_values_on_each_iter` corresponds to
47 # the meric on the current iteration
48 evaluation_result_dict[key] = metric_values_on_each_iter[-1]
49
50 self.metrics_logger.record_metrics(evaluation_result_dict, epoch)
51 self.eval_results.append(evaluation_result_dict)
52
53 # Return `False` to indicate training should not stop
54 return False
55
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mlflow/xgboost/_autolog.py b/mlflow/xgboost/_autolog.py
--- a/mlflow/xgboost/_autolog.py
+++ b/mlflow/xgboost/_autolog.py
@@ -6,9 +6,15 @@
from mlflow.utils.autologging_utils import ExceptionSafeAbstractClass
+def _patch_metric_names(metric_dict):
+ # XGBoost provides some metrics with "@", e.g. "ndcg@3" that are not valid MLflow metric names
+ return {metric_name.replace("@", "_at_"): value for metric_name, value in metric_dict.items()}
+
+
def autolog_callback(env, metrics_logger, eval_results):
- metrics_logger.record_metrics(dict(env.evaluation_result_list), env.iteration)
- eval_results.append(dict(env.evaluation_result_list))
+ metric_dict = _patch_metric_names(dict(env.evaluation_result_list))
+ metrics_logger.record_metrics(metric_dict, env.iteration)
+ eval_results.append(metric_dict)
IS_TRAINING_CALLBACK_SUPPORTED = Version(xgboost.__version__.replace("SNAPSHOT", "dev")) >= Version(
@@ -41,10 +47,11 @@
# }
evaluation_result_dict = {}
for data_name, metric_dict in evals_log.items():
+ metric_dict = _patch_metric_names(metric_dict)
for metric_name, metric_values_on_each_iter in metric_dict.items():
key = "{}-{}".format(data_name, metric_name)
# The last element in `metric_values_on_each_iter` corresponds to
- # the meric on the current iteration
+ # the metric on the current iteration
evaluation_result_dict[key] = metric_values_on_each_iter[-1]
self.metrics_logger.record_metrics(evaluation_result_dict, epoch)
| {"golden_diff": "diff --git a/mlflow/xgboost/_autolog.py b/mlflow/xgboost/_autolog.py\n--- a/mlflow/xgboost/_autolog.py\n+++ b/mlflow/xgboost/_autolog.py\n@@ -6,9 +6,15 @@\n from mlflow.utils.autologging_utils import ExceptionSafeAbstractClass\n \n \n+def _patch_metric_names(metric_dict):\n+ # XGBoost provides some metrics with \"@\", e.g. \"ndcg@3\" that are not valid MLflow metric names\n+ return {metric_name.replace(\"@\", \"_at_\"): value for metric_name, value in metric_dict.items()}\n+\n+\n def autolog_callback(env, metrics_logger, eval_results):\n- metrics_logger.record_metrics(dict(env.evaluation_result_list), env.iteration)\n- eval_results.append(dict(env.evaluation_result_list))\n+ metric_dict = _patch_metric_names(dict(env.evaluation_result_list))\n+ metrics_logger.record_metrics(metric_dict, env.iteration)\n+ eval_results.append(metric_dict)\n \n \n IS_TRAINING_CALLBACK_SUPPORTED = Version(xgboost.__version__.replace(\"SNAPSHOT\", \"dev\")) >= Version(\n@@ -41,10 +47,11 @@\n # }\n evaluation_result_dict = {}\n for data_name, metric_dict in evals_log.items():\n+ metric_dict = _patch_metric_names(metric_dict)\n for metric_name, metric_values_on_each_iter in metric_dict.items():\n key = \"{}-{}\".format(data_name, metric_name)\n # The last element in `metric_values_on_each_iter` corresponds to\n- # the meric on the current iteration\n+ # the metric on the current iteration\n evaluation_result_dict[key] = metric_values_on_each_iter[-1]\n \n self.metrics_logger.record_metrics(evaluation_result_dict, epoch)\n", "issue": "[BUG] XGBoost autologging breaks when metric with @ is used\nThank you for submitting an issue. Please refer to our [issue policy](https://www.github.com/mlflow/mlflow/blob/master/ISSUE_POLICY.md) for additional information about bug reports. For help with debugging your code, please refer to [Stack Overflow](https://stackoverflow.com/questions/tagged/mlflow).\r\n\r\n**Please fill in this bug report template to ensure a timely and thorough response.**\r\n\r\n### Willingness to contribute\r\nThe MLflow Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the MLflow code base?\r\n\r\n- [x] Yes. I can contribute a fix for this bug independently.\r\n- [ ] Yes. I would be willing to contribute a fix for this bug with guidance from the MLflow community.\r\n- [ ] No. I cannot contribute a bug fix at this time.\r\n\r\n### System information\r\n- **Have I written custom code (as opposed to using a stock example script provided in MLflow)**: yes\r\n- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: macOS 11.6.3\r\n- **MLflow installed from (source or binary)**: binary\r\n- **MLflow version (run ``mlflow --version``)**: 1.23.1\r\n- **Python version**: 3.7.9\r\n- **npm version, if running the dev UI**:\r\n- **Exact command to reproduce**: `python xgb_metrics_bug.py`\r\n\r\n### Describe the problem\r\nXGBoost comes with a few metrics with @ in their name, e.g. `ndcg@2`, see [XGBoost parameters](https://xgboost.readthedocs.io/en/stable/parameter.html#learning-task-parameters). \r\n\r\nExpected behavior: MLflow should be able to autolog all built-in XGBoost metrics.\r\n\r\nActual behavior: when using a metric like this, the following warning is shown:\r\n\r\n> WARNING mlflow.utils.autologging_utils: Encountered unexpected error during xgboost autologging: Invalid metric name: 'train-ndcg@2'. 
Names may only contain alphanumerics, underscores (_), dashes (-), periods (.), spaces ( ), and slashes (/).\r\n\r\nbut not only this metric is discarded, instead autologging completely breaks, i.e. other metrics, the model artifact and the feature importance plot also are not recorded.\r\n\r\n\r\n\r\nI would propose to string-replace `@` with `_at_` in the XGBoost autologging metrics callback so that MLflow is compatible with all XGBoost built-in metrics.\r\n\r\n### Code to reproduce issue\r\n\r\n`xgb_metrics_bug.py`:\r\n\r\n```python\r\nimport mlflow\r\nimport numpy as np\r\nimport xgboost as xgb\r\n\r\nmlflow.set_experiment(\"xgb_autolog_metrics_bug\")\r\n\r\nfor metrics in [[\"ndcg\"], [\"ndcg\", \"ndcg@2\"]]:\r\n with mlflow.start_run():\r\n mlflow.xgboost.autolog()\r\n\r\n train_matrix = xgb.DMatrix(np.array([[0], [1]]), label=[1, 0])\r\n\r\n params = {\r\n \"objective\": \"rank:pairwise\",\r\n \"eval_metric\": metrics,\r\n }\r\n xgb.train(params, train_matrix, evals=[(train_matrix, \"train\")])\r\n```\r\n\r\n\r\n### Other info / logs\r\n- I tried this with xgboost versions 1.3.3 and 1.5.1\r\n\r\n\r\n### What component(s), interfaces, languages, and integrations does this bug affect?\r\nComponents \r\n- [ ] `area/artifacts`: Artifact stores and artifact logging\r\n- [ ] `area/build`: Build and test infrastructure for MLflow\r\n- [ ] `area/docs`: MLflow documentation pages\r\n- [ ] `area/examples`: Example code\r\n- [ ] `area/model-registry`: Model Registry service, APIs, and the fluent client calls for Model Registry\r\n- [ ] `area/models`: MLmodel format, model serialization/deserialization, flavors\r\n- [ ] `area/projects`: MLproject format, project running backends\r\n- [ ] `area/scoring`: MLflow Model server, model deployment tools, Spark UDFs\r\n- [ ] `area/server-infra`: MLflow Tracking server backend\r\n- [x] `area/tracking`: Tracking Service, tracking client APIs, autologging\r\n\r\nInterface \r\n- [ ] `area/uiux`: Front-end, user experience, plotting, JavaScript, JavaScript dev server\r\n- [ ] `area/docker`: Docker use across MLflow's components, such as MLflow Projects and MLflow Models\r\n- [ ] `area/sqlalchemy`: Use of SQLAlchemy in the Tracking Service or Model Registry\r\n- [ ] `area/windows`: Windows support\r\n\r\nLanguage \r\n- [ ] `language/r`: R APIs and clients\r\n- [ ] `language/java`: Java APIs and clients\r\n- [ ] `language/new`: Proposals for new client languages\r\n\r\nIntegrations\r\n- [ ] `integrations/azure`: Azure and Azure ML integrations\r\n- [ ] `integrations/sagemaker`: SageMaker integrations\r\n- [ ] `integrations/databricks`: Databricks integrations\r\n\n", "before_files": [{"content": "from packaging.version import Version\nimport xgboost\n\n# Suppress a false positive pylint error: https://github.com/PyCQA/pylint/issues/1630\n# pylint: disable=unused-import\nfrom mlflow.utils.autologging_utils import ExceptionSafeAbstractClass\n\n\ndef autolog_callback(env, metrics_logger, eval_results):\n metrics_logger.record_metrics(dict(env.evaluation_result_list), env.iteration)\n eval_results.append(dict(env.evaluation_result_list))\n\n\nIS_TRAINING_CALLBACK_SUPPORTED = Version(xgboost.__version__.replace(\"SNAPSHOT\", \"dev\")) >= Version(\n \"1.3.0\"\n)\n\nif IS_TRAINING_CALLBACK_SUPPORTED:\n\n class AutologCallback(\n xgboost.callback.TrainingCallback,\n metaclass=ExceptionSafeAbstractClass,\n ):\n def __init__(self, metrics_logger, eval_results):\n # pylint: disable=super-init-not-called\n self.metrics_logger = metrics_logger\n self.eval_results = 
eval_results\n\n def after_iteration(self, model, epoch, evals_log):\n \"\"\"\n Run after each iteration. Return True when training should stop.\n \"\"\"\n # `evals_log` is a nested dict (type: Dict[str, Dict[str, List[float]]])\n # that looks like this:\n # {\n # \"train\": {\n # \"auc\": [0.5, 0.6, 0.7, ...],\n # ...\n # },\n # ...\n # }\n evaluation_result_dict = {}\n for data_name, metric_dict in evals_log.items():\n for metric_name, metric_values_on_each_iter in metric_dict.items():\n key = \"{}-{}\".format(data_name, metric_name)\n # The last element in `metric_values_on_each_iter` corresponds to\n # the meric on the current iteration\n evaluation_result_dict[key] = metric_values_on_each_iter[-1]\n\n self.metrics_logger.record_metrics(evaluation_result_dict, epoch)\n self.eval_results.append(evaluation_result_dict)\n\n # Return `False` to indicate training should not stop\n return False\n", "path": "mlflow/xgboost/_autolog.py"}], "after_files": [{"content": "from packaging.version import Version\nimport xgboost\n\n# Suppress a false positive pylint error: https://github.com/PyCQA/pylint/issues/1630\n# pylint: disable=unused-import\nfrom mlflow.utils.autologging_utils import ExceptionSafeAbstractClass\n\n\ndef _patch_metric_names(metric_dict):\n # XGBoost provides some metrics with \"@\", e.g. \"ndcg@3\" that are not valid MLflow metric names\n return {metric_name.replace(\"@\", \"_at_\"): value for metric_name, value in metric_dict.items()}\n\n\ndef autolog_callback(env, metrics_logger, eval_results):\n metric_dict = _patch_metric_names(dict(env.evaluation_result_list))\n metrics_logger.record_metrics(metric_dict, env.iteration)\n eval_results.append(metric_dict)\n\n\nIS_TRAINING_CALLBACK_SUPPORTED = Version(xgboost.__version__.replace(\"SNAPSHOT\", \"dev\")) >= Version(\n \"1.3.0\"\n)\n\nif IS_TRAINING_CALLBACK_SUPPORTED:\n\n class AutologCallback(\n xgboost.callback.TrainingCallback,\n metaclass=ExceptionSafeAbstractClass,\n ):\n def __init__(self, metrics_logger, eval_results):\n # pylint: disable=super-init-not-called\n self.metrics_logger = metrics_logger\n self.eval_results = eval_results\n\n def after_iteration(self, model, epoch, evals_log):\n \"\"\"\n Run after each iteration. Return True when training should stop.\n \"\"\"\n # `evals_log` is a nested dict (type: Dict[str, Dict[str, List[float]]])\n # that looks like this:\n # {\n # \"train\": {\n # \"auc\": [0.5, 0.6, 0.7, ...],\n # ...\n # },\n # ...\n # }\n evaluation_result_dict = {}\n for data_name, metric_dict in evals_log.items():\n metric_dict = _patch_metric_names(metric_dict)\n for metric_name, metric_values_on_each_iter in metric_dict.items():\n key = \"{}-{}\".format(data_name, metric_name)\n # The last element in `metric_values_on_each_iter` corresponds to\n # the metric on the current iteration\n evaluation_result_dict[key] = metric_values_on_each_iter[-1]\n\n self.metrics_logger.record_metrics(evaluation_result_dict, epoch)\n self.eval_results.append(evaluation_result_dict)\n\n # Return `False` to indicate training should not stop\n return False\n", "path": "mlflow/xgboost/_autolog.py"}]} | 1,967 | 384 |
gh_patches_debug_21014 | rasdani/github-patches | git_diff | python__peps-3559 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Infra: Allow dismissing the historical note banner
These notes take up more than a third of my screen and make it significantly harder to read the PEP. I can remove the banner with the developer console on my laptop, but not on my phone.
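One possible direction, sketched below, is for the banner directive to emit a small dismiss element that the theme's CSS/JavaScript can hook to hide the admonition; the class name here is an assumption for illustration:

```python
from docutils import nodes

def add_dismiss_control(admonition_node: nodes.Element) -> None:
    # Append an "×" element that the page styles/scripts could use to close the banner.
    close_button = nodes.paragraph("", "", nodes.Text("×"), classes=["close-button"])
    admonition_node.append(close_button)
```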
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py`
Content:
```
1 """Roles to insert custom admonitions pointing readers to canonical content."""
2
3 from __future__ import annotations
4
5 from docutils import nodes
6 from docutils.parsers import rst
7
8 PYPA_SPEC_BASE_URL = "https://packaging.python.org/en/latest/specifications/"
9
10
11 class PEPBanner(rst.Directive):
12 """Insert a special banner admonition in a PEP document."""
13
14 has_content = True
15 required_arguments = 0
16 optional_arguments = 1
17 final_argument_whitespace = True
18 option_spec = {}
19
20 admonition_pre_template = ""
21 admonition_pre_text = ""
22 admonition_post_text = ""
23
24 admonition_class = nodes.important
25 css_classes = []
26
27
28 def run(self) -> list[nodes.admonition]:
29
30 if self.arguments:
31 link_content = self.arguments[0]
32 pre_text = self.admonition_pre_template.format(
33 link_content=link_content)
34 else:
35 pre_text = self.admonition_pre_text
36
37 pre_text_node = nodes.paragraph(pre_text)
38 pre_text_node.line = self.lineno
39 pre_node, pre_msg = self.state.inline_text(pre_text, self.lineno)
40 pre_text_node.extend(pre_node + pre_msg)
41
42 post_text = self.admonition_post_text
43 post_text_node = nodes.paragraph(post_text)
44 post_text_node.line = self.lineno
45 post_node, post_msg = self.state.inline_text(post_text, self.lineno)
46 post_text_node.extend(post_node + post_msg)
47
48 source_lines = [pre_text] + list(self.content or []) + [post_text]
49 admonition_node = self.admonition_class(
50 "\n".join(source_lines), classes=["pep-banner"] + self.css_classes)
51
52 admonition_node.append(pre_text_node)
53 if self.content:
54 self.state.nested_parse(
55 self.content, self.content_offset, admonition_node)
56 admonition_node.append(post_text_node)
57
58 return [admonition_node]
59
60
61 class CanonicalDocBanner(PEPBanner):
62 """Insert an admonition pointing readers to a PEP's canonical docs."""
63
64 admonition_pre_template = (
65 "This PEP is a historical document. "
66 "The up-to-date, canonical documentation can now be found "
67 "at {link_content}."
68 )
69 admonition_pre_text = (
70 "This PEP is a historical document. "
71 "The up-to-date, canonical documentation can now be found elsewhere."
72 )
73 admonition_post_text = (
74 "See :pep:`1` for how to propose changes."
75 )
76
77 css_classes = ["canonical-doc", "sticky-banner"]
78
79
80
81 class CanonicalPyPASpecBanner(PEPBanner):
82 """Insert a specialized admonition for PyPA packaging specifications."""
83
84 admonition_pre_template = (
85 "This PEP is a historical document. "
86 "The up-to-date, canonical spec, {link_content}, is maintained on "
87 f"the `PyPA specs page <{PYPA_SPEC_BASE_URL}>`__."
88 )
89 admonition_pre_text = (
90 "This PEP is a historical document. "
91 "The up-to-date, canonical specifications are maintained on "
92 f"the `PyPA specs page <{PYPA_SPEC_BASE_URL}>`__."
93 )
94 admonition_post_text = (
95 "See the `PyPA specification update process "
96 "<https://www.pypa.io/en/latest/specifications/#handling-fixes-and-other-minor-updates>`__ "
97 "for how to propose changes."
98 )
99 admonition_class = nodes.attention
100
101 css_classes = ["canonical-pypa-spec", "sticky-banner"]
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py b/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py
--- a/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py
+++ b/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py
@@ -34,6 +34,9 @@
else:
pre_text = self.admonition_pre_text
+ close_button_node = nodes.paragraph('', '', nodes.Text('×'), classes=['close-button'])
+ close_button_node['classes'].append('close-button')
+
pre_text_node = nodes.paragraph(pre_text)
pre_text_node.line = self.lineno
pre_node, pre_msg = self.state.inline_text(pre_text, self.lineno)
@@ -50,6 +53,7 @@
"\n".join(source_lines), classes=["pep-banner"] + self.css_classes)
admonition_node.append(pre_text_node)
+ admonition_node.append(close_button_node)
if self.content:
self.state.nested_parse(
self.content, self.content_offset, admonition_node)
| {"golden_diff": "diff --git a/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py b/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py\n--- a/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py\n+++ b/pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py\n@@ -34,6 +34,9 @@\n else:\n pre_text = self.admonition_pre_text\n \n+ close_button_node = nodes.paragraph('', '', nodes.Text('\u00d7'), classes=['close-button'])\n+ close_button_node['classes'].append('close-button')\n+\n pre_text_node = nodes.paragraph(pre_text)\n pre_text_node.line = self.lineno\n pre_node, pre_msg = self.state.inline_text(pre_text, self.lineno)\n@@ -50,6 +53,7 @@\n \"\\n\".join(source_lines), classes=[\"pep-banner\"] + self.css_classes)\n \n admonition_node.append(pre_text_node)\n+ admonition_node.append(close_button_node)\n if self.content:\n self.state.nested_parse(\n self.content, self.content_offset, admonition_node)\n", "issue": "Infra: Allow dismissing the historical note banner\nThese notes take up more than a 1/3 of my screen and make it significantly harder to read the PEP. I can developer console it out of existence on my laptop, but not on my phone.\n", "before_files": [{"content": "\"\"\"Roles to insert custom admonitions pointing readers to canonical content.\"\"\"\n\nfrom __future__ import annotations\n\nfrom docutils import nodes\nfrom docutils.parsers import rst\n\nPYPA_SPEC_BASE_URL = \"https://packaging.python.org/en/latest/specifications/\"\n\n\nclass PEPBanner(rst.Directive):\n \"\"\"Insert a special banner admonition in a PEP document.\"\"\"\n\n has_content = True\n required_arguments = 0\n optional_arguments = 1\n final_argument_whitespace = True\n option_spec = {}\n\n admonition_pre_template = \"\"\n admonition_pre_text = \"\"\n admonition_post_text = \"\"\n\n admonition_class = nodes.important\n css_classes = []\n\n\n def run(self) -> list[nodes.admonition]:\n\n if self.arguments:\n link_content = self.arguments[0]\n pre_text = self.admonition_pre_template.format(\n link_content=link_content)\n else:\n pre_text = self.admonition_pre_text\n\n pre_text_node = nodes.paragraph(pre_text)\n pre_text_node.line = self.lineno\n pre_node, pre_msg = self.state.inline_text(pre_text, self.lineno)\n pre_text_node.extend(pre_node + pre_msg)\n\n post_text = self.admonition_post_text\n post_text_node = nodes.paragraph(post_text)\n post_text_node.line = self.lineno\n post_node, post_msg = self.state.inline_text(post_text, self.lineno)\n post_text_node.extend(post_node + post_msg)\n\n source_lines = [pre_text] + list(self.content or []) + [post_text]\n admonition_node = self.admonition_class(\n \"\\n\".join(source_lines), classes=[\"pep-banner\"] + self.css_classes)\n\n admonition_node.append(pre_text_node)\n if self.content:\n self.state.nested_parse(\n self.content, self.content_offset, admonition_node)\n admonition_node.append(post_text_node)\n\n return [admonition_node]\n\n\nclass CanonicalDocBanner(PEPBanner):\n \"\"\"Insert an admonition pointing readers to a PEP's canonical docs.\"\"\"\n\n admonition_pre_template = (\n \"This PEP is a historical document. \"\n \"The up-to-date, canonical documentation can now be found \"\n \"at {link_content}.\"\n )\n admonition_pre_text = (\n \"This PEP is a historical document. 
\"\n \"The up-to-date, canonical documentation can now be found elsewhere.\"\n )\n admonition_post_text = (\n \"See :pep:`1` for how to propose changes.\"\n )\n\n css_classes = [\"canonical-doc\", \"sticky-banner\"]\n\n\n\nclass CanonicalPyPASpecBanner(PEPBanner):\n \"\"\"Insert a specialized admonition for PyPA packaging specifications.\"\"\"\n\n admonition_pre_template = (\n \"This PEP is a historical document. \"\n \"The up-to-date, canonical spec, {link_content}, is maintained on \"\n f\"the `PyPA specs page <{PYPA_SPEC_BASE_URL}>`__.\"\n )\n admonition_pre_text = (\n \"This PEP is a historical document. \"\n \"The up-to-date, canonical specifications are maintained on \"\n f\"the `PyPA specs page <{PYPA_SPEC_BASE_URL}>`__.\"\n )\n admonition_post_text = (\n \"See the `PyPA specification update process \"\n \"<https://www.pypa.io/en/latest/specifications/#handling-fixes-and-other-minor-updates>`__ \"\n \"for how to propose changes.\"\n )\n admonition_class = nodes.attention\n\n css_classes = [\"canonical-pypa-spec\", \"sticky-banner\"]\n", "path": "pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py"}], "after_files": [{"content": "\"\"\"Roles to insert custom admonitions pointing readers to canonical content.\"\"\"\n\nfrom __future__ import annotations\n\nfrom docutils import nodes\nfrom docutils.parsers import rst\n\nPYPA_SPEC_BASE_URL = \"https://packaging.python.org/en/latest/specifications/\"\n\n\nclass PEPBanner(rst.Directive):\n \"\"\"Insert a special banner admonition in a PEP document.\"\"\"\n\n has_content = True\n required_arguments = 0\n optional_arguments = 1\n final_argument_whitespace = True\n option_spec = {}\n\n admonition_pre_template = \"\"\n admonition_pre_text = \"\"\n admonition_post_text = \"\"\n\n admonition_class = nodes.important\n css_classes = []\n\n\n def run(self) -> list[nodes.admonition]:\n\n if self.arguments:\n link_content = self.arguments[0]\n pre_text = self.admonition_pre_template.format(\n link_content=link_content)\n else:\n pre_text = self.admonition_pre_text\n\n close_button_node = nodes.paragraph('', '', nodes.Text('\u00d7'), classes=['close-button'])\n close_button_node['classes'].append('close-button')\n\n pre_text_node = nodes.paragraph(pre_text)\n pre_text_node.line = self.lineno\n pre_node, pre_msg = self.state.inline_text(pre_text, self.lineno)\n pre_text_node.extend(pre_node + pre_msg)\n\n post_text = self.admonition_post_text\n post_text_node = nodes.paragraph(post_text)\n post_text_node.line = self.lineno\n post_node, post_msg = self.state.inline_text(post_text, self.lineno)\n post_text_node.extend(post_node + post_msg)\n\n source_lines = [pre_text] + list(self.content or []) + [post_text]\n admonition_node = self.admonition_class(\n \"\\n\".join(source_lines), classes=[\"pep-banner\"] + self.css_classes)\n\n admonition_node.append(pre_text_node)\n admonition_node.append(close_button_node)\n if self.content:\n self.state.nested_parse(\n self.content, self.content_offset, admonition_node)\n admonition_node.append(post_text_node)\n\n return [admonition_node]\n\n\nclass CanonicalDocBanner(PEPBanner):\n \"\"\"Insert an admonition pointing readers to a PEP's canonical docs.\"\"\"\n\n admonition_pre_template = (\n \"This PEP is a historical document. \"\n \"The up-to-date, canonical documentation can now be found \"\n \"at {link_content}.\"\n )\n admonition_pre_text = (\n \"This PEP is a historical document. 
\"\n \"The up-to-date, canonical documentation can now be found elsewhere.\"\n )\n admonition_post_text = (\n \"See :pep:`1` for how to propose changes.\"\n )\n\n css_classes = [\"canonical-doc\", \"sticky-banner\"]\n\n\n\nclass CanonicalPyPASpecBanner(PEPBanner):\n \"\"\"Insert a specialized admonition for PyPA packaging specifications.\"\"\"\n\n admonition_pre_template = (\n \"This PEP is a historical document. \"\n \"The up-to-date, canonical spec, {link_content}, is maintained on \"\n f\"the `PyPA specs page <{PYPA_SPEC_BASE_URL}>`__.\"\n )\n admonition_pre_text = (\n \"This PEP is a historical document. \"\n \"The up-to-date, canonical specifications are maintained on \"\n f\"the `PyPA specs page <{PYPA_SPEC_BASE_URL}>`__.\"\n )\n admonition_post_text = (\n \"See the `PyPA specification update process \"\n \"<https://www.pypa.io/en/latest/specifications/#handling-fixes-and-other-minor-updates>`__ \"\n \"for how to propose changes.\"\n )\n admonition_class = nodes.attention\n\n css_classes = [\"canonical-pypa-spec\", \"sticky-banner\"]\n", "path": "pep_sphinx_extensions/pep_processor/parsing/pep_banner_directive.py"}]} | 1,311 | 263 |
gh_patches_debug_13591 | rasdani/github-patches | git_diff | freqtrade__freqtrade-4189 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
StoplossGuard should include stoploss_on_exchange as well
The document is here: https://www.freqtrade.io/en/latest/configuration/#stoploss-guard
But it should include stoploss_on_exchange as well.
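A minimal sketch of the idea, mirroring the existing filter in StoplossGuard (not a final patch): count an exit as a stoploss when its sell reason is any of the stoploss variants, including the on-exchange one.

```python
from freqtrade.strategy.interface import SellType

STOPLOSS_REASONS = {
    SellType.STOP_LOSS.value,
    SellType.TRAILING_STOP_LOSS.value,
    SellType.STOPLOSS_ON_EXCHANGE.value,
}

def is_recent_stoploss(trade) -> bool:
    # Treat stoploss_on_exchange exits like regular/trailing stoploss exits.
    return str(trade.sell_reason) in STOPLOSS_REASONS and trade.close_profit < 0
```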
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `freqtrade/plugins/protections/stoploss_guard.py`
Content:
```
1
2 import logging
3 from datetime import datetime, timedelta
4 from typing import Any, Dict
5
6 from freqtrade.persistence import Trade
7 from freqtrade.plugins.protections import IProtection, ProtectionReturn
8 from freqtrade.strategy.interface import SellType
9
10
11 logger = logging.getLogger(__name__)
12
13
14 class StoplossGuard(IProtection):
15
16 has_global_stop: bool = True
17 has_local_stop: bool = True
18
19 def __init__(self, config: Dict[str, Any], protection_config: Dict[str, Any]) -> None:
20 super().__init__(config, protection_config)
21
22 self._trade_limit = protection_config.get('trade_limit', 10)
23 self._disable_global_stop = protection_config.get('only_per_pair', False)
24
25 def short_desc(self) -> str:
26 """
27 Short method description - used for startup-messages
28 """
29 return (f"{self.name} - Frequent Stoploss Guard, {self._trade_limit} stoplosses "
30 f"within {self.lookback_period_str}.")
31
32 def _reason(self) -> str:
33 """
34 LockReason to use
35 """
36 return (f'{self._trade_limit} stoplosses in {self._lookback_period} min, '
37 f'locking for {self._stop_duration} min.')
38
39 def _stoploss_guard(self, date_now: datetime, pair: str = None) -> ProtectionReturn:
40 """
41 Evaluate recent trades
42 """
43 look_back_until = date_now - timedelta(minutes=self._lookback_period)
44 # filters = [
45 # Trade.is_open.is_(False),
46 # Trade.close_date > look_back_until,
47 # or_(Trade.sell_reason == SellType.STOP_LOSS.value,
48 # and_(Trade.sell_reason == SellType.TRAILING_STOP_LOSS.value,
49 # Trade.close_profit < 0))
50 # ]
51 # if pair:
52 # filters.append(Trade.pair == pair)
53 # trades = Trade.get_trades(filters).all()
54
55 trades1 = Trade.get_trades_proxy(pair=pair, is_open=False, close_date=look_back_until)
56 trades = [trade for trade in trades1 if str(trade.sell_reason) == SellType.STOP_LOSS.value
57 or (str(trade.sell_reason) == SellType.TRAILING_STOP_LOSS.value
58 and trade.close_profit < 0)]
59
60 if len(trades) > self._trade_limit:
61 self.log_once(f"Trading stopped due to {self._trade_limit} "
62 f"stoplosses within {self._lookback_period} minutes.", logger.info)
63 until = self.calculate_lock_end(trades, self._stop_duration)
64 return True, until, self._reason()
65
66 return False, None, None
67
68 def global_stop(self, date_now: datetime) -> ProtectionReturn:
69 """
70 Stops trading (position entering) for all pairs
71 This must evaluate to true for the whole period of the "cooldown period".
72 :return: Tuple of [bool, until, reason].
73 If true, all pairs will be locked with <reason> until <until>
74 """
75 if self._disable_global_stop:
76 return False, None, None
77 return self._stoploss_guard(date_now, None)
78
79 def stop_per_pair(self, pair: str, date_now: datetime) -> ProtectionReturn:
80 """
81 Stops trading (position entering) for this pair
82 This must evaluate to true for the whole period of the "cooldown period".
83 :return: Tuple of [bool, until, reason].
84 If true, this pair will be locked with <reason> until <until>
85 """
86 return self._stoploss_guard(date_now, pair)
87
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/freqtrade/plugins/protections/stoploss_guard.py b/freqtrade/plugins/protections/stoploss_guard.py
--- a/freqtrade/plugins/protections/stoploss_guard.py
+++ b/freqtrade/plugins/protections/stoploss_guard.py
@@ -53,8 +53,9 @@
# trades = Trade.get_trades(filters).all()
trades1 = Trade.get_trades_proxy(pair=pair, is_open=False, close_date=look_back_until)
- trades = [trade for trade in trades1 if str(trade.sell_reason) == SellType.STOP_LOSS.value
- or (str(trade.sell_reason) == SellType.TRAILING_STOP_LOSS.value
+ trades = [trade for trade in trades1 if (str(trade.sell_reason) in (
+ SellType.TRAILING_STOP_LOSS.value, SellType.STOP_LOSS.value,
+ SellType.STOPLOSS_ON_EXCHANGE.value)
and trade.close_profit < 0)]
if len(trades) > self._trade_limit:
| {"golden_diff": "diff --git a/freqtrade/plugins/protections/stoploss_guard.py b/freqtrade/plugins/protections/stoploss_guard.py\n--- a/freqtrade/plugins/protections/stoploss_guard.py\n+++ b/freqtrade/plugins/protections/stoploss_guard.py\n@@ -53,8 +53,9 @@\n # trades = Trade.get_trades(filters).all()\n \n trades1 = Trade.get_trades_proxy(pair=pair, is_open=False, close_date=look_back_until)\n- trades = [trade for trade in trades1 if str(trade.sell_reason) == SellType.STOP_LOSS.value\n- or (str(trade.sell_reason) == SellType.TRAILING_STOP_LOSS.value\n+ trades = [trade for trade in trades1 if (str(trade.sell_reason) in (\n+ SellType.TRAILING_STOP_LOSS.value, SellType.STOP_LOSS.value,\n+ SellType.STOPLOSS_ON_EXCHANGE.value)\n and trade.close_profit < 0)]\n \n if len(trades) > self._trade_limit:\n", "issue": "StoplossGuard should include stoploss_on_exchange as well\nThe document is here: https://www.freqtrade.io/en/latest/configuration/#stoploss-guard\r\n\r\nBut it should include stoploss_on_exchange as well.\n", "before_files": [{"content": "\nimport logging\nfrom datetime import datetime, timedelta\nfrom typing import Any, Dict\n\nfrom freqtrade.persistence import Trade\nfrom freqtrade.plugins.protections import IProtection, ProtectionReturn\nfrom freqtrade.strategy.interface import SellType\n\n\nlogger = logging.getLogger(__name__)\n\n\nclass StoplossGuard(IProtection):\n\n has_global_stop: bool = True\n has_local_stop: bool = True\n\n def __init__(self, config: Dict[str, Any], protection_config: Dict[str, Any]) -> None:\n super().__init__(config, protection_config)\n\n self._trade_limit = protection_config.get('trade_limit', 10)\n self._disable_global_stop = protection_config.get('only_per_pair', False)\n\n def short_desc(self) -> str:\n \"\"\"\n Short method description - used for startup-messages\n \"\"\"\n return (f\"{self.name} - Frequent Stoploss Guard, {self._trade_limit} stoplosses \"\n f\"within {self.lookback_period_str}.\")\n\n def _reason(self) -> str:\n \"\"\"\n LockReason to use\n \"\"\"\n return (f'{self._trade_limit} stoplosses in {self._lookback_period} min, '\n f'locking for {self._stop_duration} min.')\n\n def _stoploss_guard(self, date_now: datetime, pair: str = None) -> ProtectionReturn:\n \"\"\"\n Evaluate recent trades\n \"\"\"\n look_back_until = date_now - timedelta(minutes=self._lookback_period)\n # filters = [\n # Trade.is_open.is_(False),\n # Trade.close_date > look_back_until,\n # or_(Trade.sell_reason == SellType.STOP_LOSS.value,\n # and_(Trade.sell_reason == SellType.TRAILING_STOP_LOSS.value,\n # Trade.close_profit < 0))\n # ]\n # if pair:\n # filters.append(Trade.pair == pair)\n # trades = Trade.get_trades(filters).all()\n\n trades1 = Trade.get_trades_proxy(pair=pair, is_open=False, close_date=look_back_until)\n trades = [trade for trade in trades1 if str(trade.sell_reason) == SellType.STOP_LOSS.value\n or (str(trade.sell_reason) == SellType.TRAILING_STOP_LOSS.value\n and trade.close_profit < 0)]\n\n if len(trades) > self._trade_limit:\n self.log_once(f\"Trading stopped due to {self._trade_limit} \"\n f\"stoplosses within {self._lookback_period} minutes.\", logger.info)\n until = self.calculate_lock_end(trades, self._stop_duration)\n return True, until, self._reason()\n\n return False, None, None\n\n def global_stop(self, date_now: datetime) -> ProtectionReturn:\n \"\"\"\n Stops trading (position entering) for all pairs\n This must evaluate to true for the whole period of the \"cooldown period\".\n :return: Tuple of [bool, until, reason].\n If true, all 
pairs will be locked with <reason> until <until>\n \"\"\"\n if self._disable_global_stop:\n return False, None, None\n return self._stoploss_guard(date_now, None)\n\n def stop_per_pair(self, pair: str, date_now: datetime) -> ProtectionReturn:\n \"\"\"\n Stops trading (position entering) for this pair\n This must evaluate to true for the whole period of the \"cooldown period\".\n :return: Tuple of [bool, until, reason].\n If true, this pair will be locked with <reason> until <until>\n \"\"\"\n return self._stoploss_guard(date_now, pair)\n", "path": "freqtrade/plugins/protections/stoploss_guard.py"}], "after_files": [{"content": "\nimport logging\nfrom datetime import datetime, timedelta\nfrom typing import Any, Dict\n\nfrom freqtrade.persistence import Trade\nfrom freqtrade.plugins.protections import IProtection, ProtectionReturn\nfrom freqtrade.strategy.interface import SellType\n\n\nlogger = logging.getLogger(__name__)\n\n\nclass StoplossGuard(IProtection):\n\n has_global_stop: bool = True\n has_local_stop: bool = True\n\n def __init__(self, config: Dict[str, Any], protection_config: Dict[str, Any]) -> None:\n super().__init__(config, protection_config)\n\n self._trade_limit = protection_config.get('trade_limit', 10)\n self._disable_global_stop = protection_config.get('only_per_pair', False)\n\n def short_desc(self) -> str:\n \"\"\"\n Short method description - used for startup-messages\n \"\"\"\n return (f\"{self.name} - Frequent Stoploss Guard, {self._trade_limit} stoplosses \"\n f\"within {self.lookback_period_str}.\")\n\n def _reason(self) -> str:\n \"\"\"\n LockReason to use\n \"\"\"\n return (f'{self._trade_limit} stoplosses in {self._lookback_period} min, '\n f'locking for {self._stop_duration} min.')\n\n def _stoploss_guard(self, date_now: datetime, pair: str = None) -> ProtectionReturn:\n \"\"\"\n Evaluate recent trades\n \"\"\"\n look_back_until = date_now - timedelta(minutes=self._lookback_period)\n # filters = [\n # Trade.is_open.is_(False),\n # Trade.close_date > look_back_until,\n # or_(Trade.sell_reason == SellType.STOP_LOSS.value,\n # and_(Trade.sell_reason == SellType.TRAILING_STOP_LOSS.value,\n # Trade.close_profit < 0))\n # ]\n # if pair:\n # filters.append(Trade.pair == pair)\n # trades = Trade.get_trades(filters).all()\n\n trades1 = Trade.get_trades_proxy(pair=pair, is_open=False, close_date=look_back_until)\n trades = [trade for trade in trades1 if (str(trade.sell_reason) in (\n SellType.TRAILING_STOP_LOSS.value, SellType.STOP_LOSS.value,\n SellType.STOPLOSS_ON_EXCHANGE.value)\n and trade.close_profit < 0)]\n\n if len(trades) > self._trade_limit:\n self.log_once(f\"Trading stopped due to {self._trade_limit} \"\n f\"stoplosses within {self._lookback_period} minutes.\", logger.info)\n until = self.calculate_lock_end(trades, self._stop_duration)\n return True, until, self._reason()\n\n return False, None, None\n\n def global_stop(self, date_now: datetime) -> ProtectionReturn:\n \"\"\"\n Stops trading (position entering) for all pairs\n This must evaluate to true for the whole period of the \"cooldown period\".\n :return: Tuple of [bool, until, reason].\n If true, all pairs will be locked with <reason> until <until>\n \"\"\"\n if self._disable_global_stop:\n return False, None, None\n return self._stoploss_guard(date_now, None)\n\n def stop_per_pair(self, pair: str, date_now: datetime) -> ProtectionReturn:\n \"\"\"\n Stops trading (position entering) for this pair\n This must evaluate to true for the whole period of the \"cooldown period\".\n :return: Tuple of [bool, 
until, reason].\n If true, this pair will be locked with <reason> until <until>\n \"\"\"\n return self._stoploss_guard(date_now, pair)\n", "path": "freqtrade/plugins/protections/stoploss_guard.py"}]} | 1,257 | 228 |
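A note on the stoploss_guard record above: the original comprehension chained `==` checks with `or`, so the `close_profit < 0` condition only applied to the trailing-stoploss branch and `STOPLOSS_ON_EXCHANGE` exits were never counted. The accepted patch folds all three stoploss-style exit reasons into a single membership test. Below is a minimal sketch of the same predicate as a standalone helper; the helper name and the set constant are illustrative, not part of the repository:

```python
from freqtrade.strategy.interface import SellType

# Exit reasons that count toward the stoploss guard (illustrative constant).
STOPLOSS_REASONS = {
    SellType.STOP_LOSS.value,
    SellType.TRAILING_STOP_LOSS.value,
    SellType.STOPLOSS_ON_EXCHANGE.value,
}


def counts_as_stoploss(trade) -> bool:
    # Membership test replaces the chained ==/or, and the losing-trade check
    # now applies to every stoploss-style exit, matching the patched filter.
    return str(trade.sell_reason) in STOPLOSS_REASONS and trade.close_profit < 0
```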
gh_patches_debug_4850 | rasdani/github-patches | git_diff | python__python-docs-es-1000 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Translate `library/signal.po`
This needs to reach 100% translated.
Current stats for `library/signal.po`:
- Fuzzy: 0
- Percent translated: 0%
- Entries: 0 / 119
- Untranslated: 119
Please comment here if you want this file to be assigned to you, and a member will assign it to you as soon as possible so you can start working on it.
Remember to follow the steps in our [Contributing Guide](https://python-docs-es.readthedocs.io/page/CONTRIBUTING.html).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `conf.py`
Content:
```
1 # Sphinx configuration file.
2 #
3 # - import original configurations from cpython/Doc/conf.py
4 # - append the path considering the cpython submodule is at ./cpython
5 # - create the symbolic links under ./cpython/locale/es/LC_MESSAGES
6 # - make the build to work under Read the Docs
7 #
8 # The git submodule was created using this Stack Overflow answer
9 # to fetch only the commit that I needed and avoid clonning the whole history
10 # https://stackoverflow.com/a/27445058
11 #
12 # This can be built locally using `sphinx-build` by running
13 #
14 # $ sphinx-build -b html -d _build/doctrees -D language=es . _build/html
15
16 import sys, os, time
17 sys.path.append(os.path.abspath('cpython/Doc/tools/extensions'))
18 sys.path.append(os.path.abspath('cpython/Doc/includes'))
19
20 # Import all the Sphinx settings from cpython
21 sys.path.insert(0, os.path.abspath('cpython/Doc'))
22 from conf import *
23
24 # Call patchlevel with the proper path to get the version from
25 # instead of hardcoding it
26 import patchlevel
27 version, release = patchlevel.get_header_version_info(os.path.abspath('cpython/Doc'))
28
29 project = 'Python en Español'
30 copyright = '2001-%s, Python Software Foundation' % time.strftime('%Y')
31
32 html_theme_path = ['cpython/Doc/tools']
33 templates_path = ['cpython/Doc/tools/templates']
34 html_static_path = ['cpython/Doc/tools/static']
35
36 os.system('mkdir -p cpython/locales/es/')
37 os.system('ln -nfs `pwd` cpython/locales/es/LC_MESSAGES')
38
39 html_short_title = f'Documentación {release}'
40 html_title = f'Documentación de Python en Español -- {release}'
41
42
43 # Extend settings from upstream
44 _exclude_patterns = [
45 # This file is not included and it not marked as :orphan:
46 '*/distutils/_setuptools_disclaimer.rst',
47 ]
48 if 'exclude_patterns' in globals():
49 exclude_patterns += _exclude_patterns
50 else:
51 exclude_patterns = _exclude_patterns
52
53 _extensions = [
54 'sphinx_tabs.tabs',
55 'sphinxemoji.sphinxemoji',
56 ]
57 if 'extensions' in globals():
58 extensions += _extensions
59 else:
60 extensions = _extensions
61
62
63 if not os.environ.get('SPHINX_GETTEXT') == 'True':
64 # Override all the files from ``.overrides`` directory
65 from pathlib import Path
66 overrides_paths = Path('.overrides')
67
68 for path in overrides_paths.glob('**/*.*'):
69 if path.name == 'README.rst' and path.parent == '.overrides':
70 continue
71 destroot = str(path.parent).replace('.overrides', '').lstrip('/')
72 outputdir = Path('cpython/Doc') / destroot / path.name
73 os.system(f'ln -nfs `pwd`/{path.parent}/{path.name} {outputdir}')
74
75 gettext_compact = False
76 locale_dirs = ['../locales', 'cpython/locales'] # relative to the sourcedir
77
78
79 # NOTE: Read the Docs does not support "multi document output".
80 # So, we put all the documentation as a single file for now.
81 _stdauthor = r'Guido van Rossum\\and the Python development team'
82 latex_documents = [
83 ('contents', 'python-docs-es.tex', u'Documentación de Python en Español',
84 _stdauthor, 'manual'),
85 ]
86
87 def setup(app):
88
89 def add_contributing_banner(app, doctree):
90 """
91 Insert a banner at the top of the index.
92
93 This way, we can easily communicate people to help with the translation,
94 pointing them to different resources.
95 """
96
97 if app.builder.format != 'html':
98 # Do not include the banner when building with other formats
99 # (this is useful when using -b gettext)
100 return
101
102 from docutils import nodes, core
103
104 message = '¡Ayúdanos a traducir la documentación oficial de Python al Español! ' \
105 f'Puedes encontrar más información en `Como contribuir </es/{version}/CONTRIBUTING.html>`_. ' \
106 'Ayuda a acercar Python a más personas de habla hispana.'
107
108 paragraph = core.publish_doctree(message)[0]
109 banner = nodes.warning(ids=['contributing-banner'])
110 banner.append(paragraph)
111
112 for document in doctree.traverse(nodes.document):
113 document.insert(0, banner)
114
115 # Change the sourcedir programmatically because Read the Docs always call it with `.`
116 app.srcdir = 'cpython/Doc'
117
118 app.connect('doctree-read', add_contributing_banner)
119
120 # Import the sphinx-autorun manually to avoid this warning
121 # TODO: Remove this code and use just ``extensions.append('sphinx_autorun')`` when
122 # that issue gets fixed
123 # See https://github.com/WhyNotHugo/sphinx-autorun/issues/17
124
125 # WARNING: the sphinx_autorun extension does not declare if it is safe for
126 # parallel reading, assuming it isn't - please ask the extension author to
127 # check and make it explicit
128 # WARNING: doing serial read
129 from sphinx_autorun import RunBlock, AutoRun
130 app.add_directive('runblock', RunBlock)
131 app.connect('builder-inited', AutoRun.builder_init)
132 app.add_config_value('autorun_languages', AutoRun.config, 'env')
133 return {
134 'version': '0.1',
135 'parallel_read_safe': True,
136 'parallel_write_safe': True,
137 }
138
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/conf.py b/conf.py
--- a/conf.py
+++ b/conf.py
@@ -42,8 +42,9 @@
# Extend settings from upstream
_exclude_patterns = [
- # This file is not included and it not marked as :orphan:
- '*/distutils/_setuptools_disclaimer.rst',
+ # This file is not included and it's not marked as :orphan:
+ 'distutils/_setuptools_disclaimer.rst',
+ 'cpython/Doc/distutils/_setuptools_disclaimer.rst',
]
if 'exclude_patterns' in globals():
exclude_patterns += _exclude_patterns
| {"golden_diff": "diff --git a/conf.py b/conf.py\n--- a/conf.py\n+++ b/conf.py\n@@ -42,8 +42,9 @@\n \n # Extend settings from upstream\n _exclude_patterns = [\n- # This file is not included and it not marked as :orphan:\n- '*/distutils/_setuptools_disclaimer.rst',\n+ # This file is not included and it's not marked as :orphan:\n+ 'distutils/_setuptools_disclaimer.rst',\n+ 'cpython/Doc/distutils/_setuptools_disclaimer.rst',\n ]\n if 'exclude_patterns' in globals():\n exclude_patterns += _exclude_patterns\n", "issue": "Translate `library/signal.po`\nThis needs to reach 100% translated.\n\nCurrent stats for `library/signal.po`:\n\n- Fuzzy: 0\n- Percent translated: 0%\n- Entries: 0 / 119\n- Untranslated: 119\n\nPlease, comment here if you want this file to be assigned to you and an member will assign it to you as soon as possible, so you can start working on it.\n\nRemember to follow the steps in our [Contributing Guide](https://python-docs-es.readthedocs.io/page/CONTRIBUTING.html).\n", "before_files": [{"content": "# Sphinx configuration file.\n#\n# - import original configurations from cpython/Doc/conf.py\n# - append the path considering the cpython submodule is at ./cpython\n# - create the symbolic links under ./cpython/locale/es/LC_MESSAGES\n# - make the build to work under Read the Docs\n#\n# The git submodule was created using this Stack Overflow answer\n# to fetch only the commit that I needed and avoid clonning the whole history\n# https://stackoverflow.com/a/27445058\n#\n# This can be built locally using `sphinx-build` by running\n#\n# $ sphinx-build -b html -d _build/doctrees -D language=es . _build/html\n\nimport sys, os, time\nsys.path.append(os.path.abspath('cpython/Doc/tools/extensions'))\nsys.path.append(os.path.abspath('cpython/Doc/includes'))\n\n# Import all the Sphinx settings from cpython\nsys.path.insert(0, os.path.abspath('cpython/Doc'))\nfrom conf import *\n\n# Call patchlevel with the proper path to get the version from\n# instead of hardcoding it\nimport patchlevel\nversion, release = patchlevel.get_header_version_info(os.path.abspath('cpython/Doc'))\n\nproject = 'Python en Espa\u00f1ol'\ncopyright = '2001-%s, Python Software Foundation' % time.strftime('%Y')\n\nhtml_theme_path = ['cpython/Doc/tools']\ntemplates_path = ['cpython/Doc/tools/templates']\nhtml_static_path = ['cpython/Doc/tools/static']\n\nos.system('mkdir -p cpython/locales/es/')\nos.system('ln -nfs `pwd` cpython/locales/es/LC_MESSAGES')\n\nhtml_short_title = f'Documentaci\u00f3n {release}'\nhtml_title = f'Documentaci\u00f3n de Python en Espa\u00f1ol -- {release}'\n\n\n# Extend settings from upstream\n_exclude_patterns = [\n # This file is not included and it not marked as :orphan:\n '*/distutils/_setuptools_disclaimer.rst',\n]\nif 'exclude_patterns' in globals():\n exclude_patterns += _exclude_patterns\nelse:\n exclude_patterns = _exclude_patterns\n\n_extensions = [\n 'sphinx_tabs.tabs',\n 'sphinxemoji.sphinxemoji',\n]\nif 'extensions' in globals():\n extensions += _extensions\nelse:\n extensions = _extensions\n\n\nif not os.environ.get('SPHINX_GETTEXT') == 'True':\n # Override all the files from ``.overrides`` directory\n from pathlib import Path\n overrides_paths = Path('.overrides')\n\n for path in overrides_paths.glob('**/*.*'):\n if path.name == 'README.rst' and path.parent == '.overrides':\n continue\n destroot = str(path.parent).replace('.overrides', '').lstrip('/')\n outputdir = Path('cpython/Doc') / destroot / path.name\n os.system(f'ln -nfs `pwd`/{path.parent}/{path.name} 
{outputdir}')\n\ngettext_compact = False\nlocale_dirs = ['../locales', 'cpython/locales'] # relative to the sourcedir\n\n\n# NOTE: Read the Docs does not support \"multi document output\".\n# So, we put all the documentation as a single file for now.\n_stdauthor = r'Guido van Rossum\\\\and the Python development team'\nlatex_documents = [\n ('contents', 'python-docs-es.tex', u'Documentaci\u00f3n de Python en Espa\u00f1ol',\n _stdauthor, 'manual'),\n]\n\ndef setup(app):\n\n def add_contributing_banner(app, doctree):\n \"\"\"\n Insert a banner at the top of the index.\n\n This way, we can easily communicate people to help with the translation,\n pointing them to different resources.\n \"\"\"\n\n if app.builder.format != 'html':\n # Do not include the banner when building with other formats\n # (this is useful when using -b gettext)\n return\n\n from docutils import nodes, core\n\n message = '\u00a1Ay\u00fadanos a traducir la documentaci\u00f3n oficial de Python al Espa\u00f1ol! ' \\\n f'Puedes encontrar m\u00e1s informaci\u00f3n en `Como contribuir </es/{version}/CONTRIBUTING.html>`_. ' \\\n 'Ayuda a acercar Python a m\u00e1s personas de habla hispana.'\n\n paragraph = core.publish_doctree(message)[0]\n banner = nodes.warning(ids=['contributing-banner'])\n banner.append(paragraph)\n\n for document in doctree.traverse(nodes.document):\n document.insert(0, banner)\n\n # Change the sourcedir programmatically because Read the Docs always call it with `.`\n app.srcdir = 'cpython/Doc'\n\n app.connect('doctree-read', add_contributing_banner)\n\n # Import the sphinx-autorun manually to avoid this warning\n # TODO: Remove this code and use just ``extensions.append('sphinx_autorun')`` when\n # that issue gets fixed\n # See https://github.com/WhyNotHugo/sphinx-autorun/issues/17\n\n # WARNING: the sphinx_autorun extension does not declare if it is safe for\n # parallel reading, assuming it isn't - please ask the extension author to\n # check and make it explicit\n # WARNING: doing serial read\n from sphinx_autorun import RunBlock, AutoRun\n app.add_directive('runblock', RunBlock)\n app.connect('builder-inited', AutoRun.builder_init)\n app.add_config_value('autorun_languages', AutoRun.config, 'env')\n return {\n 'version': '0.1',\n 'parallel_read_safe': True,\n 'parallel_write_safe': True,\n }\n", "path": "conf.py"}], "after_files": [{"content": "# Sphinx configuration file.\n#\n# - import original configurations from cpython/Doc/conf.py\n# - append the path considering the cpython submodule is at ./cpython\n# - create the symbolic links under ./cpython/locale/es/LC_MESSAGES\n# - make the build to work under Read the Docs\n#\n# The git submodule was created using this Stack Overflow answer\n# to fetch only the commit that I needed and avoid clonning the whole history\n# https://stackoverflow.com/a/27445058\n#\n# This can be built locally using `sphinx-build` by running\n#\n# $ sphinx-build -b html -d _build/doctrees -D language=es . 
_build/html\n\nimport sys, os, time\nsys.path.append(os.path.abspath('cpython/Doc/tools/extensions'))\nsys.path.append(os.path.abspath('cpython/Doc/includes'))\n\n# Import all the Sphinx settings from cpython\nsys.path.insert(0, os.path.abspath('cpython/Doc'))\nfrom conf import *\n\n# Call patchlevel with the proper path to get the version from\n# instead of hardcoding it\nimport patchlevel\nversion, release = patchlevel.get_header_version_info(os.path.abspath('cpython/Doc'))\n\nproject = 'Python en Espa\u00f1ol'\ncopyright = '2001-%s, Python Software Foundation' % time.strftime('%Y')\n\nhtml_theme_path = ['cpython/Doc/tools']\ntemplates_path = ['cpython/Doc/tools/templates']\nhtml_static_path = ['cpython/Doc/tools/static']\n\nos.system('mkdir -p cpython/locales/es/')\nos.system('ln -nfs `pwd` cpython/locales/es/LC_MESSAGES')\n\nhtml_short_title = f'Documentaci\u00f3n {release}'\nhtml_title = f'Documentaci\u00f3n de Python en Espa\u00f1ol -- {release}'\n\n\n# Extend settings from upstream\n_exclude_patterns = [\n # This file is not included and it's not marked as :orphan:\n 'distutils/_setuptools_disclaimer.rst',\n 'cpython/Doc/distutils/_setuptools_disclaimer.rst',\n]\nif 'exclude_patterns' in globals():\n exclude_patterns += _exclude_patterns\nelse:\n exclude_patterns = _exclude_patterns\n\n_extensions = [\n 'sphinx_tabs.tabs',\n 'sphinxemoji.sphinxemoji',\n]\nif 'extensions' in globals():\n extensions += _extensions\nelse:\n extensions = _extensions\n\n\nif not os.environ.get('SPHINX_GETTEXT') == 'True':\n # Override all the files from ``.overrides`` directory\n from pathlib import Path\n overrides_paths = Path('.overrides')\n\n for path in overrides_paths.glob('**/*.*'):\n if path.name == 'README.rst' and path.parent == '.overrides':\n continue\n destroot = str(path.parent).replace('.overrides', '').lstrip('/')\n outputdir = Path('cpython/Doc') / destroot / path.name\n os.system(f'ln -nfs `pwd`/{path.parent}/{path.name} {outputdir}')\n\ngettext_compact = False\nlocale_dirs = ['../locales', 'cpython/locales'] # relative to the sourcedir\n\n\n# NOTE: Read the Docs does not support \"multi document output\".\n# So, we put all the documentation as a single file for now.\n_stdauthor = r'Guido van Rossum\\\\and the Python development team'\nlatex_documents = [\n ('contents', 'python-docs-es.tex', u'Documentaci\u00f3n de Python en Espa\u00f1ol',\n _stdauthor, 'manual'),\n]\n\ndef setup(app):\n\n def add_contributing_banner(app, doctree):\n \"\"\"\n Insert a banner at the top of the index.\n\n This way, we can easily communicate people to help with the translation,\n pointing them to different resources.\n \"\"\"\n\n if app.builder.format != 'html':\n # Do not include the banner when building with other formats\n # (this is useful when using -b gettext)\n return\n\n from docutils import nodes, core\n\n message = '\u00a1Ay\u00fadanos a traducir la documentaci\u00f3n oficial de Python al Espa\u00f1ol! ' \\\n f'Puedes encontrar m\u00e1s informaci\u00f3n en `Como contribuir </es/{version}/CONTRIBUTING.html>`_. 
' \\\n 'Ayuda a acercar Python a m\u00e1s personas de habla hispana.'\n\n paragraph = core.publish_doctree(message)[0]\n banner = nodes.warning(ids=['contributing-banner'])\n banner.append(paragraph)\n\n for document in doctree.traverse(nodes.document):\n document.insert(0, banner)\n\n # Change the sourcedir programmatically because Read the Docs always call it with `.`\n app.srcdir = 'cpython/Doc'\n\n app.connect('doctree-read', add_contributing_banner)\n\n # Import the sphinx-autorun manually to avoid this warning\n # TODO: Remove this code and use just ``extensions.append('sphinx_autorun')`` when\n # that issue gets fixed\n # See https://github.com/WhyNotHugo/sphinx-autorun/issues/17\n\n # WARNING: the sphinx_autorun extension does not declare if it is safe for\n # parallel reading, assuming it isn't - please ask the extension author to\n # check and make it explicit\n # WARNING: doing serial read\n from sphinx_autorun import RunBlock, AutoRun\n app.add_directive('runblock', RunBlock)\n app.connect('builder-inited', AutoRun.builder_init)\n app.add_config_value('autorun_languages', AutoRun.config, 'env')\n return {\n 'version': '0.1',\n 'parallel_read_safe': True,\n 'parallel_write_safe': True,\n }\n", "path": "conf.py"}]} | 1,918 | 140 |
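On the conf.py record above: Sphinx matches `exclude_patterns` against paths relative to its source directory, and this configuration repoints `app.srcdir` to `cpython/Doc` inside `setup()`, which is presumably why the patch drops the leading `*/` and lists both the srcdir-relative and the repository-relative form of the path. The snippet below is a rough illustration of why the `*/` prefix fails to match a top-level path; Sphinx uses its own glob matcher, so plain `fnmatch` is only an approximation here:

```python
from fnmatch import fnmatch

docname = "distutils/_setuptools_disclaimer.rst"

# The old pattern requires at least one parent directory before "distutils/".
print(fnmatch(docname, "*/distutils/_setuptools_disclaimer.rst"))  # False
# The patched, prefix-free pattern matches the srcdir-relative path directly.
print(fnmatch(docname, "distutils/_setuptools_disclaimer.rst"))    # True
```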
gh_patches_debug_17067 | rasdani/github-patches | git_diff | keras-team__autokeras-241 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
"numpy.linalg.linalg.LinAlgError: N-th leading minor of the array is not positive definite" exception while training ImageClassifier
<!---
**If you are reporting a bug:**
* Verify that your issue is not being currently addressed by other issues or pull requests.
* Please note that Auto-Keras is only compatible with **Python 3.6**.
* Tag the issue with the `bug report` tag.
-->
### Bug Description
When training ImageClassifier on float inputs of shape 80x92x24 on a binary classification task, this exception is thrown when AutoKeras lists all models created up to that point together with their respective losses and metrics. Here is a more complete log + stack trace: http://termbin.com/gb3n
### Reproducing Steps
* Create a dataset of noise shaped 80x92x24 with ~360 or so samples
* Respective binary labels.
In my case, some values of the dataset are quite large (they vary between 0 and 1_000_000, but I'm not sure that this is related since AutoKeras performs normalization).
* Have it train on a GTX 1070Ti or equivalent in terms of vram for about an hour (can't confirm that it's GPU-related either).
* Wait for the exception
### Expected Behavior
It should work.
### Setup Details
- OS type and version: Ubuntu 18.04
- Python: 3.6
- autokeras: 0.2.14 (installed manually from git)
- all dependencies are installed as stated in requirements.txt for this release, except for:
- tensorflow 1.10, compiled locally.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from distutils.core import setup
2
3 setup(
4 name='autokeras',
5 packages=['autokeras'], # this must be the same as the name above
6 install_requires=['torch==0.4.1', 'torchvision==0.2.1', 'numpy>=1.14.5', 'keras==2.2.2', 'scikit-learn==0.19.1',
7 'tensorflow>=1.10.0', 'tqdm==4.25.0'],
8 version='0.2.17',
9 description='AutoML for deep learning',
10 author='Haifeng Jin',
11 author_email='[email protected]',
12 url='http://autokeras.com',
13 download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.17.tar.gz',
14 keywords=['automl'], # arbitrary keywords
15 classifiers=[]
16 )
17
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -5,12 +5,12 @@
packages=['autokeras'], # this must be the same as the name above
install_requires=['torch==0.4.1', 'torchvision==0.2.1', 'numpy>=1.14.5', 'keras==2.2.2', 'scikit-learn==0.19.1',
'tensorflow>=1.10.0', 'tqdm==4.25.0'],
- version='0.2.17',
+ version='0.2.18',
description='AutoML for deep learning',
author='Haifeng Jin',
author_email='[email protected]',
url='http://autokeras.com',
- download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.17.tar.gz',
+ download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.18.tar.gz',
keywords=['automl'], # arbitrary keywords
classifiers=[]
)
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -5,12 +5,12 @@\n packages=['autokeras'], # this must be the same as the name above\n install_requires=['torch==0.4.1', 'torchvision==0.2.1', 'numpy>=1.14.5', 'keras==2.2.2', 'scikit-learn==0.19.1',\n 'tensorflow>=1.10.0', 'tqdm==4.25.0'],\n- version='0.2.17',\n+ version='0.2.18',\n description='AutoML for deep learning',\n author='Haifeng Jin',\n author_email='[email protected]',\n url='http://autokeras.com',\n- download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.17.tar.gz',\n+ download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.18.tar.gz',\n keywords=['automl'], # arbitrary keywords\n classifiers=[]\n )\n", "issue": "\"numpy.linalg.linalg.LinAlgError: N-th leading minor of the array is not positive definite\" exception while training ImageClassifier\n<!---\r\n**If you are reporting a bug:**\r\n* Verify that your issue is not being currently addressed by other issues or pull requests.\r\n* Please note that Auto-Keras is only compatible with **Python 3.6**.\r\n* Tag the issue with the `bug report` tag.\r\n-->\r\n\r\n### Bug Description\r\n\r\nWhen training ImageClassifier on float inputs of shape 80x92x24 on a binary classification task, this exception is thrown when AutoKeras lists all created models until this point and their respected losses and metrics. Here is a more complete log + stack trace: http://termbin.com/gb3n\r\n\r\n### Reproducing Steps\r\n\r\n* Create a dataset of noise shaped 80x92x24 with ~360 or so samples\r\n* Respective binary labels.\r\n\r\nIn my case, some values of the dataset are quite large (they vary between 0 and 1_000_000, but I'm not sure that this is related since AutoKeras performs normalization).\r\n\r\n* Have it train on a GTX 1070Ti or equivalent in terms of vram for about an hour (can't confirm that it's GPU-related either).\r\n* Wait for the exception\r\n\r\n\r\n### Expected Behavior\r\n\r\nIt should work.\r\n\r\n### Setup Details\r\n - OS type and version: Ubuntu 18.04\r\n - Python: 3.6\r\n - autokeras: 0.2.14 (installed manually from git)\r\n- all dependencies are installed as stated in requirements.txt for this release, unless for:\r\n- tensorflow 1.10, compiled locally. \r\n\n", "before_files": [{"content": "from distutils.core import setup\n\nsetup(\n name='autokeras',\n packages=['autokeras'], # this must be the same as the name above\n install_requires=['torch==0.4.1', 'torchvision==0.2.1', 'numpy>=1.14.5', 'keras==2.2.2', 'scikit-learn==0.19.1',\n 'tensorflow>=1.10.0', 'tqdm==4.25.0'],\n version='0.2.17',\n description='AutoML for deep learning',\n author='Haifeng Jin',\n author_email='[email protected]',\n url='http://autokeras.com',\n download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.17.tar.gz',\n keywords=['automl'], # arbitrary keywords\n classifiers=[]\n)\n", "path": "setup.py"}], "after_files": [{"content": "from distutils.core import setup\n\nsetup(\n name='autokeras',\n packages=['autokeras'], # this must be the same as the name above\n install_requires=['torch==0.4.1', 'torchvision==0.2.1', 'numpy>=1.14.5', 'keras==2.2.2', 'scikit-learn==0.19.1',\n 'tensorflow>=1.10.0', 'tqdm==4.25.0'],\n version='0.2.18',\n description='AutoML for deep learning',\n author='Haifeng Jin',\n author_email='[email protected]',\n url='http://autokeras.com',\n download_url='https://github.com/jhfjhfj1/autokeras/archive/0.2.18.tar.gz',\n keywords=['automl'], # arbitrary keywords\n classifiers=[]\n)\n", "path": "setup.py"}]} | 860 | 266 |
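The autokeras record above is a straight release bump: `version` and `download_url` are edited together because the GitHub archive URL embeds the tag. A purely illustrative refactor, not part of the repository, would keep the two values from drifting apart by deriving the URL from the version string:

```python
# Hypothetical helper for setup.py: bumping the release only touches `version`.
version = "0.2.18"
download_url = (
    "https://github.com/jhfjhfj1/autokeras/archive/{}.tar.gz".format(version)
)
# setup(..., version=version, download_url=download_url, ...) would then stay in sync.
```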
gh_patches_debug_31843 | rasdani/github-patches | git_diff | freedomofpress__securedrop-238 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Overly permissive Database privileges for “securedrop” user
The privileges given to the MySQL user accessing the SecureDrop database
are overly permissive and allow an attacker, after a successful SQL
injection attack, to escalate privileges:
Example: https://github.com/freedomofpress/securedrop/search?q=GRANT+ALL+PRIVILEGES&ref=cmdform
```
echo "Setting up MySQL database..."
mysql -u root -p"$mysql_root" -e "create database securedrop; GRANT ALL PRIVILEGES ON securedrop.* TO 'securedrop'@'localhost' IDENTIFIED BY '$mysql_securedrop';"
```
It should be ensured that the user is granted only the privileges
that are actually needed. In case an attacker manages to spot a SQL
injection vulnerability, the user with an unlimited set of permissions
would be capable of compromising the machine via FILE and other SQL
features (depending on the MySQL version). From what can be seen from
the application’s logic, the database user essentially needs read-write
access to one particular database - and not “GRANT ALL PRIVILEGES”.
Reported as part of the cure53 audit of 0.2 as: SD-01-003
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `install_files/source.config.py`
Content:
```
1 import os
2
3 class BaseConfig(object):
4 DEBUG = False
5 TESTING = False
6 SECRET_KEY='SECRET_KEY_VALUE'
7
8 class ProductionConfig(BaseConfig):
9 pass
10
11 class DevelopmentConfig(BaseConfig):
12 DEBUG = True
13
14 class TestingConfig(BaseConfig):
15 TESTING = True
16 # Tests are simpler if CSRF protection is disabled
17 WTF_CSRF_ENABLED = False
18
19 # data directories - should be on secure media
20 STORE_DIR='/var/www/securedrop/store'
21 GPG_KEY_DIR='/var/www/securedrop/keys'
22
23 # fingerprint of the GPG key to encrypt submissions to
24 JOURNALIST_KEY='APP_GPG_KEY_FINGERPRINT'
25
26 SOURCE_TEMPLATES_DIR='/var/www/securedrop/source_templates'
27 JOURNALIST_TEMPLATES_DIR='/var/www/securedrop/journalist_templates'
28 WORD_LIST='/var/www/securedrop/wordlist'
29
30 NOUNS='/var/www/securedrop/dictionaries/nouns.txt'
31 ADJECTIVES='/var/www/securedrop/dictionaries/adjectives.txt'
32 BCRYPT_ID_SALT='BCRYPT_ID_SALT_VALUE'
33 BCRYPT_GPG_SALT='BCRYPT_GPG_SALT_VALUE'
34 # Default to the production configuration
35 FlaskConfig = ProductionConfig
36
37 if os.environ.get('SECUREDROP_ENV') == 'test':
38 FlaskConfig = TestingConfig
39 TEST_DIR='/tmp/securedrop_test'
40 STORE_DIR=os.path.join(TEST_DIR, 'store')
41 GPG_KEY_DIR=os.path.join(TEST_DIR, 'keys')
42 # test_journalist_key.pub
43 JOURNALIST_KEY='65A1B5FF195B56353CC63DFFCC40EF1228271441'
44
45 # Database Configuration
46
47 # Default to using a sqlite database file for development
48 #DATABASE_ENGINE = 'sqlite'
49 #DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')
50
51 # Uncomment to use mysql (or any other databaes backend supported by
52 # SQLAlchemy). Make sure you have the necessary dependencies installed, and run
53 # `python -c "import db; db.create_tables()"` to initialize the database
54
55 DATABASE_ENGINE = 'mysql'
56 DATABASE_HOST = 'localhost'
57 DATABASE_NAME = 'securedrop'
58 DATABASE_USERNAME = 'securedrop'
59 DATABASE_PASSWORD = ''
60
```
Path: `install_files/document.config.py`
Content:
```
1 import os
2
3 class BaseConfig(object):
4 DEBUG = False
5 TESTING = False
6 SECRET_KEY='SECRET_KEY_VALUE'
7
8 class ProductionConfig(BaseConfig):
9 pass
10
11 class DevelopmentConfig(BaseConfig):
12 DEBUG = True
13
14 class TestingConfig(BaseConfig):
15 TESTING = True
16 # Tests are simpler if CSRF protection is disabled
17 WTF_CSRF_ENABLED = False
18
19 # data directories - should be on secure media
20 STORE_DIR='/var/www/securedrop/store'
21 GPG_KEY_DIR='/var/www/securedrop/keys'
22
23 # fingerprint of the GPG key to encrypt submissions to
24 JOURNALIST_KEY='APP_GPG_KEY_FINGERPRINT'
25
26 SOURCE_TEMPLATES_DIR='/var/www/securedrop/source_templates'
27 JOURNALIST_TEMPLATES_DIR='/var/www/securedrop/journalist_templates'
28 WORD_LIST='/var/www/securedrop/wordlist'
29 NOUNS='/var/www/securedrop/dictionaries/nouns.txt'
30 ADJECTIVES='/var/www/securedrop/dictionaries/adjectives.txt'
31 BCRYPT_ID_SALT='BCRYPT_ID_SALT_VALUE'
32 BCRYPT_GPG_SALT='BCRYPT_GPG_SALT_VALUE'
33
34 # Default to the production configuration
35 FlaskConfig = ProductionConfig
36 SECUREDROP_ROOT=os.path.abspath('/var/www/securedrop')
37
38 if os.environ.get('SECUREDROP_ENV') == 'test':
39 FlaskConfig = TestingConfig
40 TEST_DIR='/tmp/securedrop_test'
41 STORE_DIR=os.path.join(TEST_DIR, 'store')
42 GPG_KEY_DIR=os.path.join(TEST_DIR, 'keys')
43 # test_journalist_key.pub
44 JOURNALIST_KEY='65A1B5FF195B56353CC63DFFCC40EF1228271441'
45
46 # Database Configuration
47
48 # Default to using a sqlite database file for development
49 #DATABASE_ENGINE = 'sqlite'
50 #DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')
51
52 # Uncomment to use mysql (or any other databaes backend supported by
53 # SQLAlchemy). Make sure you have the necessary dependencies installed, and run
54 # `python -c "import db; db.create_tables()"` to initialize the database
55
56 DATABASE_ENGINE = 'mysql'
57 DATABASE_HOST = 'localhost'
58 DATABASE_NAME = 'securedrop'
59 DATABASE_USERNAME = 'document_mysql'
60 DATABASE_PASSWORD = 'MYSQL_USER_PASS'
61
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/install_files/document.config.py b/install_files/document.config.py
--- a/install_files/document.config.py
+++ b/install_files/document.config.py
@@ -46,15 +46,15 @@
# Database Configuration
# Default to using a sqlite database file for development
-#DATABASE_ENGINE = 'sqlite'
-#DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')
+DATABASE_ENGINE = 'sqlite'
+DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')
# Uncomment to use mysql (or any other databaes backend supported by
# SQLAlchemy). Make sure you have the necessary dependencies installed, and run
# `python -c "import db; db.create_tables()"` to initialize the database
-DATABASE_ENGINE = 'mysql'
-DATABASE_HOST = 'localhost'
-DATABASE_NAME = 'securedrop'
-DATABASE_USERNAME = 'document_mysql'
-DATABASE_PASSWORD = 'MYSQL_USER_PASS'
+#DATABASE_ENGINE = 'mysql'
+#DATABASE_HOST = 'localhost'
+#DATABASE_NAME = 'securedrop'
+#DATABASE_USERNAME = 'document_mysql'
+#DATABASE_PASSWORD = 'MYSQL_USER_PASS'
diff --git a/install_files/source.config.py b/install_files/source.config.py
--- a/install_files/source.config.py
+++ b/install_files/source.config.py
@@ -45,15 +45,16 @@
# Database Configuration
# Default to using a sqlite database file for development
-#DATABASE_ENGINE = 'sqlite'
-#DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')
+DATABASE_ENGINE = 'sqlite'
+SECUREDROP_ROOT=os.path.abspath('/var/www/securedrop')
+DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')
# Uncomment to use mysql (or any other databaes backend supported by
# SQLAlchemy). Make sure you have the necessary dependencies installed, and run
# `python -c "import db; db.create_tables()"` to initialize the database
-DATABASE_ENGINE = 'mysql'
-DATABASE_HOST = 'localhost'
-DATABASE_NAME = 'securedrop'
-DATABASE_USERNAME = 'securedrop'
-DATABASE_PASSWORD = ''
+#DATABASE_ENGINE = 'mysql'
+#DATABASE_HOST = 'localhost'
+#DATABASE_NAME = 'securedrop'
+#DATABASE_USERNAME = 'securedrop'
+#DATABASE_PASSWORD = ''
| {"golden_diff": "diff --git a/install_files/document.config.py b/install_files/document.config.py\n--- a/install_files/document.config.py\n+++ b/install_files/document.config.py\n@@ -46,15 +46,15 @@\n # Database Configuration\n \n # Default to using a sqlite database file for development\n-#DATABASE_ENGINE = 'sqlite'\n-#DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n+DATABASE_ENGINE = 'sqlite'\n+DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n \n # Uncomment to use mysql (or any other databaes backend supported by\n # SQLAlchemy). Make sure you have the necessary dependencies installed, and run\n # `python -c \"import db; db.create_tables()\"` to initialize the database\n \n-DATABASE_ENGINE = 'mysql'\n-DATABASE_HOST = 'localhost'\n-DATABASE_NAME = 'securedrop'\n-DATABASE_USERNAME = 'document_mysql'\n-DATABASE_PASSWORD = 'MYSQL_USER_PASS'\n+#DATABASE_ENGINE = 'mysql'\n+#DATABASE_HOST = 'localhost'\n+#DATABASE_NAME = 'securedrop'\n+#DATABASE_USERNAME = 'document_mysql'\n+#DATABASE_PASSWORD = 'MYSQL_USER_PASS'\ndiff --git a/install_files/source.config.py b/install_files/source.config.py\n--- a/install_files/source.config.py\n+++ b/install_files/source.config.py\n@@ -45,15 +45,16 @@\n # Database Configuration\n \n # Default to using a sqlite database file for development\n-#DATABASE_ENGINE = 'sqlite'\n-#DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n+DATABASE_ENGINE = 'sqlite'\n+SECUREDROP_ROOT=os.path.abspath('/var/www/securedrop')\n+DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n \n # Uncomment to use mysql (or any other databaes backend supported by\n # SQLAlchemy). Make sure you have the necessary dependencies installed, and run\n # `python -c \"import db; db.create_tables()\"` to initialize the database\n \n-DATABASE_ENGINE = 'mysql'\n-DATABASE_HOST = 'localhost'\n-DATABASE_NAME = 'securedrop'\n-DATABASE_USERNAME = 'securedrop'\n-DATABASE_PASSWORD = ''\n+#DATABASE_ENGINE = 'mysql'\n+#DATABASE_HOST = 'localhost'\n+#DATABASE_NAME = 'securedrop'\n+#DATABASE_USERNAME = 'securedrop'\n+#DATABASE_PASSWORD = ''\n", "issue": "Overly permissive Database privileges for \u201csecuredrop\u201d user\nThe privileges given the MySQL user accessing the SecureDrop database\nare overly permissive and allow an attacker after a successful SQL\ninjection attack to escalate privileges:\n\nExample: https://github.com/freedomofpress/securedrop/search?q=\nGRANT+ALL+PRIVILEGES&ref=cmdform\n\n```\necho \"Setting up MySQL database...\"\nmysql -u root -p\"$mysql_root\" -e \"create database securedrop; GRANT ALL\nPRIVILEGES ON securedrop.* TO 'securedrop'@'localhost' IDENTIFIED BY\n'$mysql_securedrop';\"\n```\n\nIt should be made sure that the user is only applied with privileges\nthat are actually needed. In case an attacker manages to spot a SQL\ninjection vulnerability, the user with an unlimited set of permissions\nwould be capable of compromising the machine via FILE and other SQL\nfeatures (depending on the MySQL version). 
From what can be seen from\nthe application\u2019s logic, the database user essentially needs read-write\naccess to one particular database - and not \u201cGRANT ALL PRIVILEGES\u201d.\n\nReported as part of the cure53 audit of 0.2 as: SD-01-003\n\n", "before_files": [{"content": "import os\n\nclass BaseConfig(object):\n DEBUG = False\n TESTING = False\n SECRET_KEY='SECRET_KEY_VALUE'\n\nclass ProductionConfig(BaseConfig):\n pass\n\nclass DevelopmentConfig(BaseConfig):\n DEBUG = True\n\nclass TestingConfig(BaseConfig):\n TESTING = True\n # Tests are simpler if CSRF protection is disabled\n WTF_CSRF_ENABLED = False\n\n# data directories - should be on secure media\nSTORE_DIR='/var/www/securedrop/store'\nGPG_KEY_DIR='/var/www/securedrop/keys'\n\n# fingerprint of the GPG key to encrypt submissions to\nJOURNALIST_KEY='APP_GPG_KEY_FINGERPRINT'\n\nSOURCE_TEMPLATES_DIR='/var/www/securedrop/source_templates'\nJOURNALIST_TEMPLATES_DIR='/var/www/securedrop/journalist_templates'\nWORD_LIST='/var/www/securedrop/wordlist'\n\nNOUNS='/var/www/securedrop/dictionaries/nouns.txt'\nADJECTIVES='/var/www/securedrop/dictionaries/adjectives.txt'\nBCRYPT_ID_SALT='BCRYPT_ID_SALT_VALUE'\nBCRYPT_GPG_SALT='BCRYPT_GPG_SALT_VALUE'\n# Default to the production configuration\nFlaskConfig = ProductionConfig\n\nif os.environ.get('SECUREDROP_ENV') == 'test':\n FlaskConfig = TestingConfig\n TEST_DIR='/tmp/securedrop_test'\n STORE_DIR=os.path.join(TEST_DIR, 'store')\n GPG_KEY_DIR=os.path.join(TEST_DIR, 'keys')\n # test_journalist_key.pub\n JOURNALIST_KEY='65A1B5FF195B56353CC63DFFCC40EF1228271441'\n\n# Database Configuration\n\n# Default to using a sqlite database file for development\n#DATABASE_ENGINE = 'sqlite'\n#DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n\n# Uncomment to use mysql (or any other databaes backend supported by\n# SQLAlchemy). 
Make sure you have the necessary dependencies installed, and run\n# `python -c \"import db; db.create_tables()\"` to initialize the database\n\nDATABASE_ENGINE = 'mysql'\nDATABASE_HOST = 'localhost'\nDATABASE_NAME = 'securedrop'\nDATABASE_USERNAME = 'securedrop'\nDATABASE_PASSWORD = ''\n", "path": "install_files/source.config.py"}, {"content": "import os\n\nclass BaseConfig(object):\n DEBUG = False\n TESTING = False\n SECRET_KEY='SECRET_KEY_VALUE'\n\nclass ProductionConfig(BaseConfig):\n pass\n\nclass DevelopmentConfig(BaseConfig):\n DEBUG = True\n\nclass TestingConfig(BaseConfig):\n TESTING = True\n # Tests are simpler if CSRF protection is disabled\n WTF_CSRF_ENABLED = False\n\n# data directories - should be on secure media\nSTORE_DIR='/var/www/securedrop/store'\nGPG_KEY_DIR='/var/www/securedrop/keys'\n\n# fingerprint of the GPG key to encrypt submissions to\nJOURNALIST_KEY='APP_GPG_KEY_FINGERPRINT'\n\nSOURCE_TEMPLATES_DIR='/var/www/securedrop/source_templates'\nJOURNALIST_TEMPLATES_DIR='/var/www/securedrop/journalist_templates'\nWORD_LIST='/var/www/securedrop/wordlist'\nNOUNS='/var/www/securedrop/dictionaries/nouns.txt'\nADJECTIVES='/var/www/securedrop/dictionaries/adjectives.txt'\nBCRYPT_ID_SALT='BCRYPT_ID_SALT_VALUE'\nBCRYPT_GPG_SALT='BCRYPT_GPG_SALT_VALUE'\n\n# Default to the production configuration\nFlaskConfig = ProductionConfig\nSECUREDROP_ROOT=os.path.abspath('/var/www/securedrop') \n\nif os.environ.get('SECUREDROP_ENV') == 'test':\n FlaskConfig = TestingConfig\n TEST_DIR='/tmp/securedrop_test'\n STORE_DIR=os.path.join(TEST_DIR, 'store')\n GPG_KEY_DIR=os.path.join(TEST_DIR, 'keys')\n # test_journalist_key.pub\n JOURNALIST_KEY='65A1B5FF195B56353CC63DFFCC40EF1228271441'\n\n# Database Configuration\n\n# Default to using a sqlite database file for development\n#DATABASE_ENGINE = 'sqlite'\n#DATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n\n# Uncomment to use mysql (or any other databaes backend supported by\n# SQLAlchemy). 
Make sure you have the necessary dependencies installed, and run\n# `python -c \"import db; db.create_tables()\"` to initialize the database\n\nDATABASE_ENGINE = 'mysql'\nDATABASE_HOST = 'localhost'\nDATABASE_NAME = 'securedrop'\nDATABASE_USERNAME = 'document_mysql'\nDATABASE_PASSWORD = 'MYSQL_USER_PASS'\n", "path": "install_files/document.config.py"}], "after_files": [{"content": "import os\n\nclass BaseConfig(object):\n DEBUG = False\n TESTING = False\n SECRET_KEY='SECRET_KEY_VALUE'\n\nclass ProductionConfig(BaseConfig):\n pass\n\nclass DevelopmentConfig(BaseConfig):\n DEBUG = True\n\nclass TestingConfig(BaseConfig):\n TESTING = True\n # Tests are simpler if CSRF protection is disabled\n WTF_CSRF_ENABLED = False\n\n# data directories - should be on secure media\nSTORE_DIR='/var/www/securedrop/store'\nGPG_KEY_DIR='/var/www/securedrop/keys'\n\n# fingerprint of the GPG key to encrypt submissions to\nJOURNALIST_KEY='APP_GPG_KEY_FINGERPRINT'\n\nSOURCE_TEMPLATES_DIR='/var/www/securedrop/source_templates'\nJOURNALIST_TEMPLATES_DIR='/var/www/securedrop/journalist_templates'\nWORD_LIST='/var/www/securedrop/wordlist'\n\nNOUNS='/var/www/securedrop/dictionaries/nouns.txt'\nADJECTIVES='/var/www/securedrop/dictionaries/adjectives.txt'\nBCRYPT_ID_SALT='BCRYPT_ID_SALT_VALUE'\nBCRYPT_GPG_SALT='BCRYPT_GPG_SALT_VALUE'\n# Default to the production configuration\nFlaskConfig = ProductionConfig\n\nif os.environ.get('SECUREDROP_ENV') == 'test':\n FlaskConfig = TestingConfig\n TEST_DIR='/tmp/securedrop_test'\n STORE_DIR=os.path.join(TEST_DIR, 'store')\n GPG_KEY_DIR=os.path.join(TEST_DIR, 'keys')\n # test_journalist_key.pub\n JOURNALIST_KEY='65A1B5FF195B56353CC63DFFCC40EF1228271441'\n\n# Database Configuration\n\n# Default to using a sqlite database file for development\nDATABASE_ENGINE = 'sqlite'\nSECUREDROP_ROOT=os.path.abspath('/var/www/securedrop')\nDATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n\n# Uncomment to use mysql (or any other databaes backend supported by\n# SQLAlchemy). 
Make sure you have the necessary dependencies installed, and run\n# `python -c \"import db; db.create_tables()\"` to initialize the database\n\n#DATABASE_ENGINE = 'mysql'\n#DATABASE_HOST = 'localhost'\n#DATABASE_NAME = 'securedrop'\n#DATABASE_USERNAME = 'securedrop'\n#DATABASE_PASSWORD = ''\n", "path": "install_files/source.config.py"}, {"content": "import os\n\nclass BaseConfig(object):\n DEBUG = False\n TESTING = False\n SECRET_KEY='SECRET_KEY_VALUE'\n\nclass ProductionConfig(BaseConfig):\n pass\n\nclass DevelopmentConfig(BaseConfig):\n DEBUG = True\n\nclass TestingConfig(BaseConfig):\n TESTING = True\n # Tests are simpler if CSRF protection is disabled\n WTF_CSRF_ENABLED = False\n\n# data directories - should be on secure media\nSTORE_DIR='/var/www/securedrop/store'\nGPG_KEY_DIR='/var/www/securedrop/keys'\n\n# fingerprint of the GPG key to encrypt submissions to\nJOURNALIST_KEY='APP_GPG_KEY_FINGERPRINT'\n\nSOURCE_TEMPLATES_DIR='/var/www/securedrop/source_templates'\nJOURNALIST_TEMPLATES_DIR='/var/www/securedrop/journalist_templates'\nWORD_LIST='/var/www/securedrop/wordlist'\nNOUNS='/var/www/securedrop/dictionaries/nouns.txt'\nADJECTIVES='/var/www/securedrop/dictionaries/adjectives.txt'\nBCRYPT_ID_SALT='BCRYPT_ID_SALT_VALUE'\nBCRYPT_GPG_SALT='BCRYPT_GPG_SALT_VALUE'\n\n# Default to the production configuration\nFlaskConfig = ProductionConfig\nSECUREDROP_ROOT=os.path.abspath('/var/www/securedrop') \n\nif os.environ.get('SECUREDROP_ENV') == 'test':\n FlaskConfig = TestingConfig\n TEST_DIR='/tmp/securedrop_test'\n STORE_DIR=os.path.join(TEST_DIR, 'store')\n GPG_KEY_DIR=os.path.join(TEST_DIR, 'keys')\n # test_journalist_key.pub\n JOURNALIST_KEY='65A1B5FF195B56353CC63DFFCC40EF1228271441'\n\n# Database Configuration\n\n# Default to using a sqlite database file for development\nDATABASE_ENGINE = 'sqlite'\nDATABASE_FILE=os.path.join(SECUREDROP_ROOT, 'db.sqlite')\n\n# Uncomment to use mysql (or any other databaes backend supported by\n# SQLAlchemy). Make sure you have the necessary dependencies installed, and run\n# `python -c \"import db; db.create_tables()\"` to initialize the database\n\n#DATABASE_ENGINE = 'mysql'\n#DATABASE_HOST = 'localhost'\n#DATABASE_NAME = 'securedrop'\n#DATABASE_USERNAME = 'document_mysql'\n#DATABASE_PASSWORD = 'MYSQL_USER_PASS'\n", "path": "install_files/document.config.py"}]} | 1,753 | 481 |
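For the securedrop record above: the accepted patch sidesteps the reported problem by making file-backed sqlite the default and commenting out the MySQL settings, rather than narrowing the MySQL grant itself. If the MySQL branch were kept, the least-privilege account the issue asks for would look roughly like the sketch below; the exact privilege list is an assumption based on the issue's "read-write access to one particular database" wording:

```python
import os

# Mirrors the patched defaults: a file-backed sqlite database under the app root.
SECUREDROP_ROOT = os.path.abspath("/var/www/securedrop")
DATABASE_FILE = os.path.join(SECUREDROP_ROOT, "db.sqlite")

# Illustrative only: if MySQL were re-enabled, an installer script could issue a
# narrowly-scoped grant instead of GRANT ALL PRIVILEGES.
LEAST_PRIVILEGE_GRANT = (
    "GRANT SELECT, INSERT, UPDATE, DELETE ON securedrop.* "
    "TO 'securedrop'@'localhost';"
)
```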
gh_patches_debug_158 | rasdani/github-patches | git_diff | mindsdb__mindsdb-2137 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug]: PIP installation error
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Current Behavior
Hello,
using ```pip install mindsdb``` I've got this error:
```
Traceback (most recent call last):
File "C:\Users\lukas\AppData\Local\Programs\Python\Python310\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 363, in <module>
main()
File "C:\Users\lukas\AppData\Local\Programs\Python\Python310\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 345, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "C:\Users\lukas\AppData\Local\Programs\Python\Python310\lib\site-packages\pip\_vendor\pep517\in_process\_in_process.py", line 130, in get_requires_for_build_wheel
return hook(config_settings)
File "C:\Users\lukas\AppData\Local\Temp\pip-build-env-ke4g80_2\overlay\Lib\site-packages\setuptools\build_meta.py", line 177, in get_requires_for_build_wheel
return self._get_build_requires(
File "C:\Users\lukas\AppData\Local\Temp\pip-build-env-ke4g80_2\overlay\Lib\site-packages\setuptools\build_meta.py", line 159, in _get_build_requires
self.run_setup()
File "C:\Users\lukas\AppData\Local\Temp\pip-build-env-ke4g80_2\overlay\Lib\site-packages\setuptools\build_meta.py", line 281, in run_setup
super(_BuildMetaLegacyBackend,
File "C:\Users\lukas\AppData\Local\Temp\pip-build-env-ke4g80_2\overlay\Lib\site-packages\setuptools\build_meta.py", line 174, in run_setup
exec(compile(code, __file__, 'exec'), locals())
File "setup.py", line 10, in <module>
long_description = fh.read()
File "C:\Users\lukas\AppData\Local\Programs\Python\Python310\lib\encodings\cp1250.py", line 23, in decode
return codecs.charmap_decode(input,self.errors,decoding_table)[0]
UnicodeDecodeError: 'charmap' codec can't decode byte 0x90 in position 3404: character maps to <undefined>
```
I am using the latest Python 3.10.4
### Expected Behavior
_No response_
### Steps To Reproduce
_No response_
### Anything else?
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2
3
4 about = {}
5 with open("mindsdb/__about__.py") as fp:
6 exec(fp.read(), about)
7
8
9 with open("README.md", "r") as fh:
10 long_description = fh.read()
11
12
13 def install_deps():
14 """Reads requirements.txt and preprocess it
15 to be feed into setuptools.
16
17 This is the only possible way (we found)
18 how requirements.txt can be reused in setup.py
19 using dependencies from private github repositories.
20
21 Links must be appendend by `-{StringWithAtLeastOneNumber}`
22 or something like that, so e.g. `-9231` works as well as
23 `1.1.0`. This is ignored by the setuptools, but has to be there.
24
25 Warnings:
26 to make pip respect the links, you have to use
27 `--process-dependency-links` switch. So e.g.:
28 `pip install --process-dependency-links {git-url}`
29
30 Returns:
31 list of packages and dependency links.
32 """
33 default = open('requirements.txt', 'r').readlines()
34 new_pkgs = []
35 links = []
36 for resource in default:
37 if 'git+https' in resource:
38 pkg = resource.split('#')[-1]
39 links.append(resource.strip() + '-9876543210')
40 new_pkgs.append(pkg.replace('egg=', '').rstrip())
41 else:
42 new_pkgs.append(resource.strip())
43 return new_pkgs, links
44
45
46 pkgs, new_links = install_deps()
47
48 setup(
49 name=about['__title__'],
50 version=about['__version__'],
51 url=about['__github__'],
52 download_url=about['__pypi__'],
53 license=about['__license__'],
54 author=about['__author__'],
55 author_email=about['__email__'],
56 description=about['__description__'],
57 long_description=long_description,
58 long_description_content_type="text/markdown",
59 packages=find_packages(),
60 install_requires=pkgs,
61 dependency_links=new_links,
62 include_package_data=True,
63 classifiers=[
64 "Programming Language :: Python :: 3",
65 "Operating System :: OS Independent",
66 ],
67 python_requires=">=3.6"
68 )
69
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -6,7 +6,7 @@
exec(fp.read(), about)
-with open("README.md", "r") as fh:
+with open("README.md", "r", encoding="utf8") as fh:
long_description = fh.read()
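
For context, a minimal self-contained sketch of the failure mode and the fix; the demo file name is hypothetical and not part of the original repository.

```python
from pathlib import Path

# Write a README containing non-ASCII characters, as mindsdb's does.
Path("README_demo.md").write_text("Naïve café ☕", encoding="utf8")

# Without an explicit encoding, open() falls back to the platform code page
# (cp1250 in the reported traceback) and raises UnicodeDecodeError.
# Passing encoding="utf8" makes the read deterministic on every platform:
with open("README_demo.md", "r", encoding="utf8") as fh:
    long_description = fh.read()

print(long_description)
```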
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -6,7 +6,7 @@\n exec(fp.read(), about)\n \n \n-with open(\"README.md\", \"r\") as fh:\n+with open(\"README.md\", \"r\", encoding=\"utf8\") as fh:\n long_description = fh.read()\n", "issue": "[Bug]: PIP installation error\n### Is there an existing issue for this?\n\n- [X] I have searched the existing issues\n\n### Current Behavior\n\nHello,\r\nusing ```pip install mindsdb``` I've got this error:\r\n```\r\nTraceback (most recent call last):\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pip\\_vendor\\pep517\\in_process\\_in_process.py\", line 363, in <module>\r\n main()\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pip\\_vendor\\pep517\\in_process\\_in_process.py\", line 345, in main\r\n json_out['return_val'] = hook(**hook_input['kwargs'])\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\pip\\_vendor\\pep517\\in_process\\_in_process.py\", line 130, in get_requires_for_build_wheel\r\n return hook(config_settings)\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Temp\\pip-build-env-ke4g80_2\\overlay\\Lib\\site-packages\\setuptools\\build_meta.py\", line 177, in get_requires_for_build_wheel\r\n return self._get_build_requires(\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Temp\\pip-build-env-ke4g80_2\\overlay\\Lib\\site-packages\\setuptools\\build_meta.py\", line 159, in _get_build_requires\r\n self.run_setup()\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Temp\\pip-build-env-ke4g80_2\\overlay\\Lib\\site-packages\\setuptools\\build_meta.py\", line 281, in run_setup\r\n super(_BuildMetaLegacyBackend,\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Temp\\pip-build-env-ke4g80_2\\overlay\\Lib\\site-packages\\setuptools\\build_meta.py\", line 174, in run_setup\r\n exec(compile(code, __file__, 'exec'), locals())\r\n File \"setup.py\", line 10, in <module>\r\n long_description = fh.read()\r\n File \"C:\\Users\\lukas\\AppData\\Local\\Programs\\Python\\Python310\\lib\\encodings\\cp1250.py\", line 23, in decode\r\n return codecs.charmap_decode(input,self.errors,decoding_table)[0]\r\n UnicodeDecodeError: 'charmap' codec can't decode byte 0x90 in position 3404: character maps to <undefined>\r\n```\r\nI am using the latest Python 3.10.4\n\n### Expected Behavior\n\n_No response_\n\n### Steps To Reproduce\n\n_No response_\n\n### Anything else?\n\n_No response_\n", "before_files": [{"content": "from setuptools import setup, find_packages\n\n\nabout = {}\nwith open(\"mindsdb/__about__.py\") as fp:\n exec(fp.read(), about)\n\n\nwith open(\"README.md\", \"r\") as fh:\n long_description = fh.read()\n\n\ndef install_deps():\n \"\"\"Reads requirements.txt and preprocess it\n to be feed into setuptools.\n\n This is the only possible way (we found)\n how requirements.txt can be reused in setup.py\n using dependencies from private github repositories.\n\n Links must be appendend by `-{StringWithAtLeastOneNumber}`\n or something like that, so e.g. `-9231` works as well as\n `1.1.0`. This is ignored by the setuptools, but has to be there.\n\n Warnings:\n to make pip respect the links, you have to use\n `--process-dependency-links` switch. 
So e.g.:\n `pip install --process-dependency-links {git-url}`\n\n Returns:\n list of packages and dependency links.\n \"\"\"\n default = open('requirements.txt', 'r').readlines()\n new_pkgs = []\n links = []\n for resource in default:\n if 'git+https' in resource:\n pkg = resource.split('#')[-1]\n links.append(resource.strip() + '-9876543210')\n new_pkgs.append(pkg.replace('egg=', '').rstrip())\n else:\n new_pkgs.append(resource.strip())\n return new_pkgs, links\n\n\npkgs, new_links = install_deps()\n\nsetup(\n name=about['__title__'],\n version=about['__version__'],\n url=about['__github__'],\n download_url=about['__pypi__'],\n license=about['__license__'],\n author=about['__author__'],\n author_email=about['__email__'],\n description=about['__description__'],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n packages=find_packages(),\n install_requires=pkgs,\n dependency_links=new_links,\n include_package_data=True,\n classifiers=[\n \"Programming Language :: Python :: 3\",\n \"Operating System :: OS Independent\",\n ],\n python_requires=\">=3.6\"\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\n\n\nabout = {}\nwith open(\"mindsdb/__about__.py\") as fp:\n exec(fp.read(), about)\n\n\nwith open(\"README.md\", \"r\", encoding=\"utf8\") as fh:\n long_description = fh.read()\n\n\ndef install_deps():\n \"\"\"Reads requirements.txt and preprocess it\n to be feed into setuptools.\n\n This is the only possible way (we found)\n how requirements.txt can be reused in setup.py\n using dependencies from private github repositories.\n\n Links must be appendend by `-{StringWithAtLeastOneNumber}`\n or something like that, so e.g. `-9231` works as well as\n `1.1.0`. This is ignored by the setuptools, but has to be there.\n\n Warnings:\n to make pip respect the links, you have to use\n `--process-dependency-links` switch. So e.g.:\n `pip install --process-dependency-links {git-url}`\n\n Returns:\n list of packages and dependency links.\n \"\"\"\n default = open('requirements.txt', 'r').readlines()\n new_pkgs = []\n links = []\n for resource in default:\n if 'git+https' in resource:\n pkg = resource.split('#')[-1]\n links.append(resource.strip() + '-9876543210')\n new_pkgs.append(pkg.replace('egg=', '').rstrip())\n else:\n new_pkgs.append(resource.strip())\n return new_pkgs, links\n\n\npkgs, new_links = install_deps()\n\nsetup(\n name=about['__title__'],\n version=about['__version__'],\n url=about['__github__'],\n download_url=about['__pypi__'],\n license=about['__license__'],\n author=about['__author__'],\n author_email=about['__email__'],\n description=about['__description__'],\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n packages=find_packages(),\n install_requires=pkgs,\n dependency_links=new_links,\n include_package_data=True,\n classifiers=[\n \"Programming Language :: Python :: 3\",\n \"Operating System :: OS Independent\",\n ],\n python_requires=\">=3.6\"\n)\n", "path": "setup.py"}]} | 1,517 | 75 |
gh_patches_debug_20224 | rasdani/github-patches | git_diff | getsentry__sentry-44899 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Project Transfer] Request Approval Email Uses Incorrect Customer Subdomain
### Environment
SaaS (https://sentry.io/)
### Version
n/a
### Link
[relevant transaction](https://sentry.sentry.io/discover/javascript:31e672a100fd44f5ab3ac36d4be4dbd3/?field=title&field=event.type&field=project&field=user.display&field=timestamp&field=replayId&homepage=true&name=All+Events&project=11276&query=url%3A%22https%3A%2F%2Fmagiclamp-mobile-website-appli.sentry.io%2F%2A%22&sort=-timestamp&statsPeriod=14d&yAxis=count%28%29)
### DSN
n/a
### Steps to Reproduce
1. Org A requests a project transfer - supplies an owner email for Org B
2. Org B owner clicks link in resulting "Request for Project Transfer" email
3. Link directs Org B owner to **org-a**.sentry.io/accept-transfer/?data=<some_hash>
### Expected Result
...
3. Link directs Org B owner to **org-b**.sentry.io/accept-transfer/?data=<some_hash>
### Actual Result
Because the Org B owner is not authenticated with org-a.sentry.io, they're asked to log in, which results in a variety of issues.
Here's [a transaction where this happened](https://sentry.sentry.io/discover/javascript:31e672a100fd44f5ab3ac36d4be4dbd3/)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/sentry/api/endpoints/project_transfer.py`
Content:
```
1 import logging
2 from urllib.parse import urlencode
3 from uuid import uuid4
4
5 from django.utils import timezone
6 from rest_framework import status
7 from rest_framework.request import Request
8 from rest_framework.response import Response
9
10 from sentry import audit_log, options, roles
11 from sentry.api.base import region_silo_endpoint
12 from sentry.api.bases.project import ProjectEndpoint, ProjectPermission
13 from sentry.api.decorators import sudo_required
14 from sentry.models import OrganizationMember
15 from sentry.utils.email import MessageBuilder
16 from sentry.utils.signing import sign
17
18 delete_logger = logging.getLogger("sentry.deletions.api")
19
20
21 class RelaxedProjectPermission(ProjectPermission):
22 scope_map = {"POST": ["project:admin"]}
23
24
25 @region_silo_endpoint
26 class ProjectTransferEndpoint(ProjectEndpoint):
27 permission_classes = [RelaxedProjectPermission]
28
29 @sudo_required
30 def post(self, request: Request, project) -> Response:
31 """
32 Transfer a Project
33 ````````````````
34
35 Schedules a project for transfer to a new organization.
36
37 :pparam string organization_slug: the slug of the organization the
38 project belongs to.
39 :pparam string project_slug: the slug of the project to delete.
40 :param string email: email of new owner. must be an organization owner
41 :auth: required
42 """
43 if project.is_internal_project():
44 return Response(
45 '{"error": "Cannot transfer projects internally used by Sentry."}',
46 status=status.HTTP_403_FORBIDDEN,
47 )
48
49 email = request.data.get("email")
50
51 if email is None:
52 return Response(status=status.HTTP_400_BAD_REQUEST)
53
54 if not request.user.is_authenticated:
55 return Response(status=status.HTTP_403_FORBIDDEN)
56
57 try:
58 owner = OrganizationMember.objects.get_members_by_email_and_role(
59 email=email, role=roles.get_top_dog().id
60 )[0]
61 except IndexError:
62 return Response(
63 {"detail": "Could not find an organization owner with that email"},
64 status=status.HTTP_404_NOT_FOUND,
65 )
66
67 organization = project.organization
68 transaction_id = uuid4().hex
69 url_data = sign(
70 actor_id=request.user.id,
71 from_organization_id=organization.id,
72 project_id=project.id,
73 user_id=owner.user_id,
74 transaction_id=transaction_id,
75 )
76
77 context = {
78 "email": email,
79 "from_org": project.organization.name,
80 "project_name": project.slug,
81 "request_time": timezone.now(),
82 "url": organization.absolute_url(
83 "/accept-transfer/", query=urlencode({"data": url_data})
84 ),
85 "requester": request.user,
86 }
87 MessageBuilder(
88 subject="{}Request for Project Transfer".format(options.get("mail.subject-prefix")),
89 template="sentry/emails/transfer_project.txt",
90 html_template="sentry/emails/transfer_project.html",
91 type="org.confirm_project_transfer_request",
92 context=context,
93 ).send_async([email])
94
95 self.create_audit_entry(
96 request=request,
97 organization=project.organization,
98 target_object=project.id,
99 event=audit_log.get_event_id("PROJECT_REQUEST_TRANSFER"),
100 data=project.get_audit_log_data(),
101 transaction_id=transaction_id,
102 )
103
104 return Response(status=status.HTTP_204_NO_CONTENT)
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/sentry/api/endpoints/project_transfer.py b/src/sentry/api/endpoints/project_transfer.py
--- a/src/sentry/api/endpoints/project_transfer.py
+++ b/src/sentry/api/endpoints/project_transfer.py
@@ -13,6 +13,7 @@
from sentry.api.decorators import sudo_required
from sentry.models import OrganizationMember
from sentry.utils.email import MessageBuilder
+from sentry.utils.http import absolute_uri
from sentry.utils.signing import sign
delete_logger = logging.getLogger("sentry.deletions.api")
@@ -79,9 +80,7 @@
"from_org": project.organization.name,
"project_name": project.slug,
"request_time": timezone.now(),
- "url": organization.absolute_url(
- "/accept-transfer/", query=urlencode({"data": url_data})
- ),
+ "url": absolute_uri(f"/accept-transfer/?{urlencode({'data': url_data})}"),
"requester": request.user,
}
MessageBuilder(
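
For context, a small sketch of how the accept-transfer link is assembled after the patch; `url_data` is a placeholder for the signed payload, and the point is that `absolute_uri` resolves against the base Sentry URL rather than the requesting organization's customer domain.

```python
from urllib.parse import urlencode

url_data = "signed-payload-placeholder"  # stands in for the sign(...) output
path = f"/accept-transfer/?{urlencode({'data': url_data})}"
print(path)  # /accept-transfer/?data=signed-payload-placeholder

# Before the patch the link was built with organization.absolute_url(...), which
# pins the URL to the sending organization's subdomain (org-a.sentry.io) and
# pushes the receiving owner through the wrong login flow.
```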
| {"golden_diff": "diff --git a/src/sentry/api/endpoints/project_transfer.py b/src/sentry/api/endpoints/project_transfer.py\n--- a/src/sentry/api/endpoints/project_transfer.py\n+++ b/src/sentry/api/endpoints/project_transfer.py\n@@ -13,6 +13,7 @@\n from sentry.api.decorators import sudo_required\n from sentry.models import OrganizationMember\n from sentry.utils.email import MessageBuilder\n+from sentry.utils.http import absolute_uri\n from sentry.utils.signing import sign\n \n delete_logger = logging.getLogger(\"sentry.deletions.api\")\n@@ -79,9 +80,7 @@\n \"from_org\": project.organization.name,\n \"project_name\": project.slug,\n \"request_time\": timezone.now(),\n- \"url\": organization.absolute_url(\n- \"/accept-transfer/\", query=urlencode({\"data\": url_data})\n- ),\n+ \"url\": absolute_uri(f\"/accept-transfer/?{urlencode({'data': url_data})}\"),\n \"requester\": request.user,\n }\n MessageBuilder(\n", "issue": "[Project Transfer] Request Approval Email Uses Incorrect Customer Subdomain\nr### Environment\r\n\r\nSaaS (https://sentry.io/)\r\n\r\n### Version\r\n\r\nn/a\r\n\r\n### Link\r\n\r\n[relevant transaction](https://sentry.sentry.io/discover/javascript:31e672a100fd44f5ab3ac36d4be4dbd3/?field=title&field=event.type&field=project&field=user.display&field=timestamp&field=replayId&homepage=true&name=All+Events&project=11276&query=url%3A%22https%3A%2F%2Fmagiclamp-mobile-website-appli.sentry.io%2F%2A%22&sort=-timestamp&statsPeriod=14d&yAxis=count%28%29)\r\n\r\n### DSN\r\n\r\nn/a\r\n\r\n### Steps to Reproduce\r\n\r\n1. Org A requests a project transfer - supplies an owner email for Org B\r\n2. Org B owner clicks link in resulting \"Request for Project Transfer\" email\r\n3. Link directs Org B owner to **org-a**.sentry.io/accept-transfer/?data=<some_hash>\r\n\r\n### Expected Result\r\n\r\n...\r\n3. Link directs Org B owner to **org-b**.sentry.io/accept-transfer/?data=<some_hash>\r\n\r\n### Actual Result\r\n\r\nBecause Org B owner is not authenticated with orb-a.sentry.io they're asked to login, which will result in a variety of issues.\r\n\r\nHere's [a transaction where this happened](https://sentry.sentry.io/discover/javascript:31e672a100fd44f5ab3ac36d4be4dbd3/)\n", "before_files": [{"content": "import logging\nfrom urllib.parse import urlencode\nfrom uuid import uuid4\n\nfrom django.utils import timezone\nfrom rest_framework import status\nfrom rest_framework.request import Request\nfrom rest_framework.response import Response\n\nfrom sentry import audit_log, options, roles\nfrom sentry.api.base import region_silo_endpoint\nfrom sentry.api.bases.project import ProjectEndpoint, ProjectPermission\nfrom sentry.api.decorators import sudo_required\nfrom sentry.models import OrganizationMember\nfrom sentry.utils.email import MessageBuilder\nfrom sentry.utils.signing import sign\n\ndelete_logger = logging.getLogger(\"sentry.deletions.api\")\n\n\nclass RelaxedProjectPermission(ProjectPermission):\n scope_map = {\"POST\": [\"project:admin\"]}\n\n\n@region_silo_endpoint\nclass ProjectTransferEndpoint(ProjectEndpoint):\n permission_classes = [RelaxedProjectPermission]\n\n @sudo_required\n def post(self, request: Request, project) -> Response:\n \"\"\"\n Transfer a Project\n ````````````````\n\n Schedules a project for transfer to a new organization.\n\n :pparam string organization_slug: the slug of the organization the\n project belongs to.\n :pparam string project_slug: the slug of the project to delete.\n :param string email: email of new owner. 
must be an organization owner\n :auth: required\n \"\"\"\n if project.is_internal_project():\n return Response(\n '{\"error\": \"Cannot transfer projects internally used by Sentry.\"}',\n status=status.HTTP_403_FORBIDDEN,\n )\n\n email = request.data.get(\"email\")\n\n if email is None:\n return Response(status=status.HTTP_400_BAD_REQUEST)\n\n if not request.user.is_authenticated:\n return Response(status=status.HTTP_403_FORBIDDEN)\n\n try:\n owner = OrganizationMember.objects.get_members_by_email_and_role(\n email=email, role=roles.get_top_dog().id\n )[0]\n except IndexError:\n return Response(\n {\"detail\": \"Could not find an organization owner with that email\"},\n status=status.HTTP_404_NOT_FOUND,\n )\n\n organization = project.organization\n transaction_id = uuid4().hex\n url_data = sign(\n actor_id=request.user.id,\n from_organization_id=organization.id,\n project_id=project.id,\n user_id=owner.user_id,\n transaction_id=transaction_id,\n )\n\n context = {\n \"email\": email,\n \"from_org\": project.organization.name,\n \"project_name\": project.slug,\n \"request_time\": timezone.now(),\n \"url\": organization.absolute_url(\n \"/accept-transfer/\", query=urlencode({\"data\": url_data})\n ),\n \"requester\": request.user,\n }\n MessageBuilder(\n subject=\"{}Request for Project Transfer\".format(options.get(\"mail.subject-prefix\")),\n template=\"sentry/emails/transfer_project.txt\",\n html_template=\"sentry/emails/transfer_project.html\",\n type=\"org.confirm_project_transfer_request\",\n context=context,\n ).send_async([email])\n\n self.create_audit_entry(\n request=request,\n organization=project.organization,\n target_object=project.id,\n event=audit_log.get_event_id(\"PROJECT_REQUEST_TRANSFER\"),\n data=project.get_audit_log_data(),\n transaction_id=transaction_id,\n )\n\n return Response(status=status.HTTP_204_NO_CONTENT)\n", "path": "src/sentry/api/endpoints/project_transfer.py"}], "after_files": [{"content": "import logging\nfrom urllib.parse import urlencode\nfrom uuid import uuid4\n\nfrom django.utils import timezone\nfrom rest_framework import status\nfrom rest_framework.request import Request\nfrom rest_framework.response import Response\n\nfrom sentry import audit_log, options, roles\nfrom sentry.api.base import region_silo_endpoint\nfrom sentry.api.bases.project import ProjectEndpoint, ProjectPermission\nfrom sentry.api.decorators import sudo_required\nfrom sentry.models import OrganizationMember\nfrom sentry.utils.email import MessageBuilder\nfrom sentry.utils.http import absolute_uri\nfrom sentry.utils.signing import sign\n\ndelete_logger = logging.getLogger(\"sentry.deletions.api\")\n\n\nclass RelaxedProjectPermission(ProjectPermission):\n scope_map = {\"POST\": [\"project:admin\"]}\n\n\n@region_silo_endpoint\nclass ProjectTransferEndpoint(ProjectEndpoint):\n permission_classes = [RelaxedProjectPermission]\n\n @sudo_required\n def post(self, request: Request, project) -> Response:\n \"\"\"\n Transfer a Project\n ````````````````\n\n Schedules a project for transfer to a new organization.\n\n :pparam string organization_slug: the slug of the organization the\n project belongs to.\n :pparam string project_slug: the slug of the project to delete.\n :param string email: email of new owner. 
must be an organization owner\n :auth: required\n \"\"\"\n if project.is_internal_project():\n return Response(\n '{\"error\": \"Cannot transfer projects internally used by Sentry.\"}',\n status=status.HTTP_403_FORBIDDEN,\n )\n\n email = request.data.get(\"email\")\n\n if email is None:\n return Response(status=status.HTTP_400_BAD_REQUEST)\n\n if not request.user.is_authenticated:\n return Response(status=status.HTTP_403_FORBIDDEN)\n\n try:\n owner = OrganizationMember.objects.get_members_by_email_and_role(\n email=email, role=roles.get_top_dog().id\n )[0]\n except IndexError:\n return Response(\n {\"detail\": \"Could not find an organization owner with that email\"},\n status=status.HTTP_404_NOT_FOUND,\n )\n\n organization = project.organization\n transaction_id = uuid4().hex\n url_data = sign(\n actor_id=request.user.id,\n from_organization_id=organization.id,\n project_id=project.id,\n user_id=owner.user_id,\n transaction_id=transaction_id,\n )\n\n context = {\n \"email\": email,\n \"from_org\": project.organization.name,\n \"project_name\": project.slug,\n \"request_time\": timezone.now(),\n \"url\": absolute_uri(f\"/accept-transfer/?{urlencode({'data': url_data})}\"),\n \"requester\": request.user,\n }\n MessageBuilder(\n subject=\"{}Request for Project Transfer\".format(options.get(\"mail.subject-prefix\")),\n template=\"sentry/emails/transfer_project.txt\",\n html_template=\"sentry/emails/transfer_project.html\",\n type=\"org.confirm_project_transfer_request\",\n context=context,\n ).send_async([email])\n\n self.create_audit_entry(\n request=request,\n organization=project.organization,\n target_object=project.id,\n event=audit_log.get_event_id(\"PROJECT_REQUEST_TRANSFER\"),\n data=project.get_audit_log_data(),\n transaction_id=transaction_id,\n )\n\n return Response(status=status.HTTP_204_NO_CONTENT)\n", "path": "src/sentry/api/endpoints/project_transfer.py"}]} | 1,548 | 218 |
gh_patches_debug_60614 | rasdani/github-patches | git_diff | cloudtools__troposphere-1670 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Security Hub tags not formatted correctly
Using the following code for a Security Hub resource results in tags that don't follow the structure of Security Hub Cloudformation:
```
hub = tpl.add_resource(Hub(
title="ExampleHub",
Tags=Tags(
Tag("Test", "Test value")
)
))
```
The resulting CF template is as follows:
```
"AWSTemplateFormatVersion": "2010-09-09",
"Description": "Security Hub configuration",
"Resources": {
"ExampleHub": {
"Properties": {
"Tags": [
{
"Key": "Test",
"Value": "Test value"
}
]
},
"Type": "AWS::SecurityHub::Hub"
}
}
}
```
Looking into the CF documentation for security hub, it looks as though the tag JSON is different:
```
{
"Description": "Example Hub with Tags",
"Resources": {
"ExampleHubWithTags": {
"Type": "AWS::SecurityHub::Hub",
"Properties": {
"Tags": {
"key1": "value1",
"key2": "value2"
}
}
}
}
}
```
Trying to apply the changeset from the generated CF template results in error `Property validation failure: [Value of property {/Tags} does not match type {Map}]`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/securityhub.py`
Content:
```
1 # Copyright (c) 2012-2019, Mark Peek <[email protected]>
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5 #
6 # *** Do not modify - this file is autogenerated ***
7 # Resource specification version: 4.1.0
8
9
10 from . import AWSObject
11 from troposphere import Tags
12
13
14 class Hub(AWSObject):
15 resource_type = "AWS::SecurityHub::Hub"
16
17 props = {
18 'Tags': (Tags, False),
19 }
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/securityhub.py b/troposphere/securityhub.py
--- a/troposphere/securityhub.py
+++ b/troposphere/securityhub.py
@@ -8,12 +8,12 @@
from . import AWSObject
-from troposphere import Tags
+from .validators import json_checker
class Hub(AWSObject):
resource_type = "AWS::SecurityHub::Hub"
props = {
- 'Tags': (Tags, False),
+ 'Tags': (json_checker, False),
}
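
For context, a sketch of the intended usage once `Tags` is validated by `json_checker`: Security Hub expects a plain key/value map rather than a list of `Key`/`Value` pairs. This assumes the patched troposphere is installed.

```python
from troposphere import Template
from troposphere.securityhub import Hub

tpl = Template()
tpl.add_resource(Hub(
    "ExampleHubWithTags",
    Tags={"key1": "value1", "key2": "value2"},  # emitted as a Map, as CloudFormation expects
))
print(tpl.to_json())
```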
| {"golden_diff": "diff --git a/troposphere/securityhub.py b/troposphere/securityhub.py\n--- a/troposphere/securityhub.py\n+++ b/troposphere/securityhub.py\n@@ -8,12 +8,12 @@\n \n \n from . import AWSObject\n-from troposphere import Tags\n+from .validators import json_checker\n \n \n class Hub(AWSObject):\n resource_type = \"AWS::SecurityHub::Hub\"\n \n props = {\n- 'Tags': (Tags, False),\n+ 'Tags': (json_checker, False),\n }\n", "issue": "Security Hub tags not formatted correctly\nUsing the following code for a Security Hub resource results in tags that don't follow the structure of Security Hub Cloudformation:\r\n\r\n```\r\nhub = tpl.add_resource(Hub(\r\n title=\"ExampleHub\",\r\n Tags=Tags(\r\n Tag(\"Test\", \"Test value\")\r\n )\r\n ))\r\n```\r\n\r\nThe resulting CF template is as follows:\r\n```\r\n \"AWSTemplateFormatVersion\": \"2010-09-09\",\r\n \"Description\": \"Security Hub configuration\",\r\n \"Resources\": {\r\n \"ExampleHub\": {\r\n \"Properties\": {\r\n \"Tags\": [\r\n {\r\n \"Key\": \"Test\",\r\n \"Value\": \"Test value\"\r\n }\r\n ]\r\n },\r\n \"Type\": \"AWS::SecurityHub::Hub\"\r\n }\r\n }\r\n}\r\n```\r\n\r\nLooking into the CF documentation for security hub, it looks as though the tag JSON is different:\r\n```\r\n{\r\n \"Description\": \"Example Hub with Tags\",\r\n \"Resources\": {\r\n \"ExampleHubWithTags\": {\r\n \"Type\": \"AWS::SecurityHub::Hub\",\r\n \"Properties\": {\r\n \"Tags\": {\r\n \"key1\": \"value1\",\r\n \"key2\": \"value2\"\r\n }\r\n }\r\n }\r\n }\r\n}\r\n```\r\n\r\nTrying to apply the changeset from the generated TF template results in error `Property validation failure: [Value of property {/Tags} does not match type {Map}]`.\n", "before_files": [{"content": "# Copyright (c) 2012-2019, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n#\n# *** Do not modify - this file is autogenerated ***\n# Resource specification version: 4.1.0\n\n\nfrom . import AWSObject\nfrom troposphere import Tags\n\n\nclass Hub(AWSObject):\n resource_type = \"AWS::SecurityHub::Hub\"\n\n props = {\n 'Tags': (Tags, False),\n }\n", "path": "troposphere/securityhub.py"}], "after_files": [{"content": "# Copyright (c) 2012-2019, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n#\n# *** Do not modify - this file is autogenerated ***\n# Resource specification version: 4.1.0\n\n\nfrom . import AWSObject\nfrom .validators import json_checker\n\n\nclass Hub(AWSObject):\n resource_type = \"AWS::SecurityHub::Hub\"\n\n props = {\n 'Tags': (json_checker, False),\n }\n", "path": "troposphere/securityhub.py"}]} | 704 | 116 |
gh_patches_debug_3933 | rasdani/github-patches | git_diff | microsoft__ptvsd-1559 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
attach_socket_import tests hang waiting for "process" event
(https://github.com/microsoft/ptvsd/tree/dbg_adapter_refactor)
When running multiple tests with `start_method` set to `attach_socket_cmdline` or `attach_socket_import`, subsequent tests fail because they cannot reuse the port number. It looks like ptvsd is not getting shut down properly, or the next test starts running before ptvsd exits.
Estimate is uncertain, because this needs to be investigated to determine the root cause.
Only happens on Python 2.7.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/ptvsd/__init__.py`
Content:
```
1 # Copyright (c) Microsoft Corporation. All rights reserved.
2 # Licensed under the MIT License. See LICENSE in the project root
3 # for license information.
4
5 from __future__ import absolute_import, print_function, unicode_literals
6
7 """An implementation of the Debug Adapter Protocol (DAP) for Python.
8
9 https://microsoft.github.io/debug-adapter-protocol/
10 """
11
12 __all__ = [
13 "__version__",
14 "attach",
15 "break_into_debugger",
16 "debug_this_thread",
17 "enable_attach",
18 "is_attached",
19 "wait_for_attach",
20 ]
21
22 # Force absolute path on Python 2.
23 from os import path
24 __file__ = path.abspath(__file__)
25 del path
26
27 from ptvsd import _version
28 __version__ = _version.get_versions()["version"]
29 del _version
30
31 from ptvsd.server.attach_server import (
32 attach,
33 break_into_debugger,
34 debug_this_thread,
35 enable_attach,
36 is_attached,
37 wait_for_attach,
38 )
39
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/ptvsd/__init__.py b/src/ptvsd/__init__.py
--- a/src/ptvsd/__init__.py
+++ b/src/ptvsd/__init__.py
@@ -24,6 +24,14 @@
__file__ = path.abspath(__file__)
del path
+# Preload encodings that we're going to use to avoid import deadlocks on Python 2.
+import codecs
+codecs.lookup('ascii')
+codecs.lookup('utf8')
+codecs.lookup('utf-8')
+codecs.lookup('latin1')
+codecs.lookup('latin-1')
+
from ptvsd import _version
__version__ = _version.get_versions()["version"]
del _version
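
For context, the preload idiom the patch relies on, shown on its own: populating the codec registry at import time means no worker thread later triggers a lazy `encodings` import while another thread holds the import lock, which is the Python 2 deadlock the hanging tests point to.

```python
import codecs

# Touch every codec ptvsd uses so later lookups are cache hits, never imports.
for name in ("ascii", "utf8", "utf-8", "latin1", "latin-1"):
    codecs.lookup(name)
```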
| {"golden_diff": "diff --git a/src/ptvsd/__init__.py b/src/ptvsd/__init__.py\n--- a/src/ptvsd/__init__.py\n+++ b/src/ptvsd/__init__.py\n@@ -24,6 +24,14 @@\n __file__ = path.abspath(__file__)\n del path\n \n+# Preload encodings that we're going to use to avoid import deadlocks on Python 2.\n+import codecs\n+codecs.lookup('ascii')\n+codecs.lookup('utf8')\n+codecs.lookup('utf-8')\n+codecs.lookup('latin1')\n+codecs.lookup('latin-1')\n+\n from ptvsd import _version\n __version__ = _version.get_versions()[\"version\"]\n del _version\n", "issue": "attach_socket_import tests hang waiting for \"process\" event\n(https://github.com/microsoft/ptvsd/tree/dbg_adapter_refactor)\r\n\r\nWhen running multiple tests with `start_method` set to `attach_socket_cmdline` or `attach_socket_import`, subsequent tests fail because they cannot reuse the port number. It looks like ptvsd is not getting shut down properly, or the next test starts running before ptvsd exits. \r\n\r\nEstimate is uncertain, because this needs to be investigated to determine the root cause.\r\n\r\nOnly happens on Python 2.7.\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See LICENSE in the project root\n# for license information.\n\nfrom __future__ import absolute_import, print_function, unicode_literals\n\n\"\"\"An implementation of the Debug Adapter Protocol (DAP) for Python.\n\nhttps://microsoft.github.io/debug-adapter-protocol/\n\"\"\"\n\n__all__ = [\n \"__version__\",\n \"attach\",\n \"break_into_debugger\",\n \"debug_this_thread\",\n \"enable_attach\",\n \"is_attached\",\n \"wait_for_attach\",\n]\n\n# Force absolute path on Python 2.\nfrom os import path\n__file__ = path.abspath(__file__)\ndel path\n\nfrom ptvsd import _version\n__version__ = _version.get_versions()[\"version\"]\ndel _version\n\nfrom ptvsd.server.attach_server import (\n attach,\n break_into_debugger,\n debug_this_thread,\n enable_attach,\n is_attached,\n wait_for_attach,\n)\n", "path": "src/ptvsd/__init__.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation. All rights reserved.\n# Licensed under the MIT License. See LICENSE in the project root\n# for license information.\n\nfrom __future__ import absolute_import, print_function, unicode_literals\n\n\"\"\"An implementation of the Debug Adapter Protocol (DAP) for Python.\n\nhttps://microsoft.github.io/debug-adapter-protocol/\n\"\"\"\n\n__all__ = [\n \"__version__\",\n \"attach\",\n \"break_into_debugger\",\n \"debug_this_thread\",\n \"enable_attach\",\n \"is_attached\",\n \"wait_for_attach\",\n]\n\n# Force absolute path on Python 2.\nfrom os import path\n__file__ = path.abspath(__file__)\ndel path\n\n# Preload encodings that we're going to use to avoid import deadlocks on Python 2.\nimport codecs\ncodecs.lookup('ascii')\ncodecs.lookup('utf8')\ncodecs.lookup('utf-8')\ncodecs.lookup('latin1')\ncodecs.lookup('latin-1')\n\nfrom ptvsd import _version\n__version__ = _version.get_versions()[\"version\"]\ndel _version\n\nfrom ptvsd.server.attach_server import (\n attach,\n break_into_debugger,\n debug_this_thread,\n enable_attach,\n is_attached,\n wait_for_attach,\n)\n", "path": "src/ptvsd/__init__.py"}]} | 658 | 163 |
gh_patches_debug_47811 | rasdani/github-patches | git_diff | CTFd__CTFd-1921 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Stub issue for ctfcli #13
https://github.com/CTFd/ctfcli/issues/13
This needs to be resolved in CTFd most likely.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `CTFd/plugins/dynamic_challenges/__init__.py`
Content:
```
1 from __future__ import division # Use floating point for math calculations
2
3 import math
4
5 from flask import Blueprint
6
7 from CTFd.models import Challenges, Solves, db
8 from CTFd.plugins import register_plugin_assets_directory
9 from CTFd.plugins.challenges import CHALLENGE_CLASSES, BaseChallenge
10 from CTFd.plugins.migrations import upgrade
11 from CTFd.utils.modes import get_model
12
13
14 class DynamicChallenge(Challenges):
15 __mapper_args__ = {"polymorphic_identity": "dynamic"}
16 id = db.Column(
17 db.Integer, db.ForeignKey("challenges.id", ondelete="CASCADE"), primary_key=True
18 )
19 initial = db.Column(db.Integer, default=0)
20 minimum = db.Column(db.Integer, default=0)
21 decay = db.Column(db.Integer, default=0)
22
23 def __init__(self, *args, **kwargs):
24 super(DynamicChallenge, self).__init__(**kwargs)
25 self.initial = kwargs["value"]
26
27
28 class DynamicValueChallenge(BaseChallenge):
29 id = "dynamic" # Unique identifier used to register challenges
30 name = "dynamic" # Name of a challenge type
31 templates = { # Handlebars templates used for each aspect of challenge editing & viewing
32 "create": "/plugins/dynamic_challenges/assets/create.html",
33 "update": "/plugins/dynamic_challenges/assets/update.html",
34 "view": "/plugins/dynamic_challenges/assets/view.html",
35 }
36 scripts = { # Scripts that are loaded when a template is loaded
37 "create": "/plugins/dynamic_challenges/assets/create.js",
38 "update": "/plugins/dynamic_challenges/assets/update.js",
39 "view": "/plugins/dynamic_challenges/assets/view.js",
40 }
41 # Route at which files are accessible. This must be registered using register_plugin_assets_directory()
42 route = "/plugins/dynamic_challenges/assets/"
43 # Blueprint used to access the static_folder directory.
44 blueprint = Blueprint(
45 "dynamic_challenges",
46 __name__,
47 template_folder="templates",
48 static_folder="assets",
49 )
50 challenge_model = DynamicChallenge
51
52 @classmethod
53 def calculate_value(cls, challenge):
54 Model = get_model()
55
56 solve_count = (
57 Solves.query.join(Model, Solves.account_id == Model.id)
58 .filter(
59 Solves.challenge_id == challenge.id,
60 Model.hidden == False,
61 Model.banned == False,
62 )
63 .count()
64 )
65
66 # If the solve count is 0 we shouldn't manipulate the solve count to
67 # let the math update back to normal
68 if solve_count != 0:
69 # We subtract -1 to allow the first solver to get max point value
70 solve_count -= 1
71
72 # It is important that this calculation takes into account floats.
73 # Hence this file uses from __future__ import division
74 value = (
75 ((challenge.minimum - challenge.initial) / (challenge.decay ** 2))
76 * (solve_count ** 2)
77 ) + challenge.initial
78
79 value = math.ceil(value)
80
81 if value < challenge.minimum:
82 value = challenge.minimum
83
84 challenge.value = value
85 db.session.commit()
86 return challenge
87
88 @classmethod
89 def read(cls, challenge):
90 """
91 This method is in used to access the data of a challenge in a format processable by the front end.
92
93 :param challenge:
94 :return: Challenge object, data dictionary to be returned to the user
95 """
96 challenge = DynamicChallenge.query.filter_by(id=challenge.id).first()
97 data = {
98 "id": challenge.id,
99 "name": challenge.name,
100 "value": challenge.value,
101 "initial": challenge.initial,
102 "decay": challenge.decay,
103 "minimum": challenge.minimum,
104 "description": challenge.description,
105 "category": challenge.category,
106 "state": challenge.state,
107 "max_attempts": challenge.max_attempts,
108 "type": challenge.type,
109 "type_data": {
110 "id": cls.id,
111 "name": cls.name,
112 "templates": cls.templates,
113 "scripts": cls.scripts,
114 },
115 }
116 return data
117
118 @classmethod
119 def update(cls, challenge, request):
120 """
121 This method is used to update the information associated with a challenge. This should be kept strictly to the
122 Challenges table and any child tables.
123
124 :param challenge:
125 :param request:
126 :return:
127 """
128 data = request.form or request.get_json()
129
130 for attr, value in data.items():
131 # We need to set these to floats so that the next operations don't operate on strings
132 if attr in ("initial", "minimum", "decay"):
133 value = float(value)
134 setattr(challenge, attr, value)
135
136 return DynamicValueChallenge.calculate_value(challenge)
137
138 @classmethod
139 def solve(cls, user, team, challenge, request):
140 super().solve(user, team, challenge, request)
141
142 DynamicValueChallenge.calculate_value(challenge)
143
144
145 def load(app):
146 upgrade()
147 CHALLENGE_CLASSES["dynamic"] = DynamicValueChallenge
148 register_plugin_assets_directory(
149 app, base_path="/plugins/dynamic_challenges/assets/"
150 )
151
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/CTFd/plugins/dynamic_challenges/__init__.py b/CTFd/plugins/dynamic_challenges/__init__.py
--- a/CTFd/plugins/dynamic_challenges/__init__.py
+++ b/CTFd/plugins/dynamic_challenges/__init__.py
@@ -22,7 +22,7 @@
def __init__(self, *args, **kwargs):
super(DynamicChallenge, self).__init__(**kwargs)
- self.initial = kwargs["value"]
+ self.value = kwargs["initial"]
class DynamicValueChallenge(BaseChallenge):
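
For context, the patch seeds `value` from the `initial` keyword instead of the reverse, so a ctfcli-style payload that supplies only `initial`/`decay`/`minimum` now constructs cleanly. Below is a quick, self-contained check of the decay curve that `calculate_value` applies afterwards, with hypothetical parameters.

```python
import math

def dynamic_value(initial, minimum, decay, solve_count):
    """Mirror of DynamicValueChallenge.calculate_value's scoring math."""
    if solve_count != 0:
        solve_count -= 1  # the first solver still earns the full value
    value = ((minimum - initial) / (decay ** 2)) * (solve_count ** 2) + initial
    return max(math.ceil(value), minimum)

print([dynamic_value(500, 100, 10, n) for n in (0, 1, 5, 10, 50)])
# [500, 500, 436, 176, 100]
```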
| {"golden_diff": "diff --git a/CTFd/plugins/dynamic_challenges/__init__.py b/CTFd/plugins/dynamic_challenges/__init__.py\n--- a/CTFd/plugins/dynamic_challenges/__init__.py\n+++ b/CTFd/plugins/dynamic_challenges/__init__.py\n@@ -22,7 +22,7 @@\n \n def __init__(self, *args, **kwargs):\n super(DynamicChallenge, self).__init__(**kwargs)\n- self.initial = kwargs[\"value\"]\n+ self.value = kwargs[\"initial\"]\n \n \n class DynamicValueChallenge(BaseChallenge):\n", "issue": "Stub issue for ctfcli #13\nhttps://github.com/CTFd/ctfcli/issues/13\r\n\r\nThis needs to be resolved in CTFd most likely.\n", "before_files": [{"content": "from __future__ import division # Use floating point for math calculations\n\nimport math\n\nfrom flask import Blueprint\n\nfrom CTFd.models import Challenges, Solves, db\nfrom CTFd.plugins import register_plugin_assets_directory\nfrom CTFd.plugins.challenges import CHALLENGE_CLASSES, BaseChallenge\nfrom CTFd.plugins.migrations import upgrade\nfrom CTFd.utils.modes import get_model\n\n\nclass DynamicChallenge(Challenges):\n __mapper_args__ = {\"polymorphic_identity\": \"dynamic\"}\n id = db.Column(\n db.Integer, db.ForeignKey(\"challenges.id\", ondelete=\"CASCADE\"), primary_key=True\n )\n initial = db.Column(db.Integer, default=0)\n minimum = db.Column(db.Integer, default=0)\n decay = db.Column(db.Integer, default=0)\n\n def __init__(self, *args, **kwargs):\n super(DynamicChallenge, self).__init__(**kwargs)\n self.initial = kwargs[\"value\"]\n\n\nclass DynamicValueChallenge(BaseChallenge):\n id = \"dynamic\" # Unique identifier used to register challenges\n name = \"dynamic\" # Name of a challenge type\n templates = { # Handlebars templates used for each aspect of challenge editing & viewing\n \"create\": \"/plugins/dynamic_challenges/assets/create.html\",\n \"update\": \"/plugins/dynamic_challenges/assets/update.html\",\n \"view\": \"/plugins/dynamic_challenges/assets/view.html\",\n }\n scripts = { # Scripts that are loaded when a template is loaded\n \"create\": \"/plugins/dynamic_challenges/assets/create.js\",\n \"update\": \"/plugins/dynamic_challenges/assets/update.js\",\n \"view\": \"/plugins/dynamic_challenges/assets/view.js\",\n }\n # Route at which files are accessible. 
This must be registered using register_plugin_assets_directory()\n route = \"/plugins/dynamic_challenges/assets/\"\n # Blueprint used to access the static_folder directory.\n blueprint = Blueprint(\n \"dynamic_challenges\",\n __name__,\n template_folder=\"templates\",\n static_folder=\"assets\",\n )\n challenge_model = DynamicChallenge\n\n @classmethod\n def calculate_value(cls, challenge):\n Model = get_model()\n\n solve_count = (\n Solves.query.join(Model, Solves.account_id == Model.id)\n .filter(\n Solves.challenge_id == challenge.id,\n Model.hidden == False,\n Model.banned == False,\n )\n .count()\n )\n\n # If the solve count is 0 we shouldn't manipulate the solve count to\n # let the math update back to normal\n if solve_count != 0:\n # We subtract -1 to allow the first solver to get max point value\n solve_count -= 1\n\n # It is important that this calculation takes into account floats.\n # Hence this file uses from __future__ import division\n value = (\n ((challenge.minimum - challenge.initial) / (challenge.decay ** 2))\n * (solve_count ** 2)\n ) + challenge.initial\n\n value = math.ceil(value)\n\n if value < challenge.minimum:\n value = challenge.minimum\n\n challenge.value = value\n db.session.commit()\n return challenge\n\n @classmethod\n def read(cls, challenge):\n \"\"\"\n This method is in used to access the data of a challenge in a format processable by the front end.\n\n :param challenge:\n :return: Challenge object, data dictionary to be returned to the user\n \"\"\"\n challenge = DynamicChallenge.query.filter_by(id=challenge.id).first()\n data = {\n \"id\": challenge.id,\n \"name\": challenge.name,\n \"value\": challenge.value,\n \"initial\": challenge.initial,\n \"decay\": challenge.decay,\n \"minimum\": challenge.minimum,\n \"description\": challenge.description,\n \"category\": challenge.category,\n \"state\": challenge.state,\n \"max_attempts\": challenge.max_attempts,\n \"type\": challenge.type,\n \"type_data\": {\n \"id\": cls.id,\n \"name\": cls.name,\n \"templates\": cls.templates,\n \"scripts\": cls.scripts,\n },\n }\n return data\n\n @classmethod\n def update(cls, challenge, request):\n \"\"\"\n This method is used to update the information associated with a challenge. 
This should be kept strictly to the\n Challenges table and any child tables.\n\n :param challenge:\n :param request:\n :return:\n \"\"\"\n data = request.form or request.get_json()\n\n for attr, value in data.items():\n # We need to set these to floats so that the next operations don't operate on strings\n if attr in (\"initial\", \"minimum\", \"decay\"):\n value = float(value)\n setattr(challenge, attr, value)\n\n return DynamicValueChallenge.calculate_value(challenge)\n\n @classmethod\n def solve(cls, user, team, challenge, request):\n super().solve(user, team, challenge, request)\n\n DynamicValueChallenge.calculate_value(challenge)\n\n\ndef load(app):\n upgrade()\n CHALLENGE_CLASSES[\"dynamic\"] = DynamicValueChallenge\n register_plugin_assets_directory(\n app, base_path=\"/plugins/dynamic_challenges/assets/\"\n )\n", "path": "CTFd/plugins/dynamic_challenges/__init__.py"}], "after_files": [{"content": "from __future__ import division # Use floating point for math calculations\n\nimport math\n\nfrom flask import Blueprint\n\nfrom CTFd.models import Challenges, Solves, db\nfrom CTFd.plugins import register_plugin_assets_directory\nfrom CTFd.plugins.challenges import CHALLENGE_CLASSES, BaseChallenge\nfrom CTFd.plugins.migrations import upgrade\nfrom CTFd.utils.modes import get_model\n\n\nclass DynamicChallenge(Challenges):\n __mapper_args__ = {\"polymorphic_identity\": \"dynamic\"}\n id = db.Column(\n db.Integer, db.ForeignKey(\"challenges.id\", ondelete=\"CASCADE\"), primary_key=True\n )\n initial = db.Column(db.Integer, default=0)\n minimum = db.Column(db.Integer, default=0)\n decay = db.Column(db.Integer, default=0)\n\n def __init__(self, *args, **kwargs):\n super(DynamicChallenge, self).__init__(**kwargs)\n self.value = kwargs[\"initial\"]\n\n\nclass DynamicValueChallenge(BaseChallenge):\n id = \"dynamic\" # Unique identifier used to register challenges\n name = \"dynamic\" # Name of a challenge type\n templates = { # Handlebars templates used for each aspect of challenge editing & viewing\n \"create\": \"/plugins/dynamic_challenges/assets/create.html\",\n \"update\": \"/plugins/dynamic_challenges/assets/update.html\",\n \"view\": \"/plugins/dynamic_challenges/assets/view.html\",\n }\n scripts = { # Scripts that are loaded when a template is loaded\n \"create\": \"/plugins/dynamic_challenges/assets/create.js\",\n \"update\": \"/plugins/dynamic_challenges/assets/update.js\",\n \"view\": \"/plugins/dynamic_challenges/assets/view.js\",\n }\n # Route at which files are accessible. 
This must be registered using register_plugin_assets_directory()\n route = \"/plugins/dynamic_challenges/assets/\"\n # Blueprint used to access the static_folder directory.\n blueprint = Blueprint(\n \"dynamic_challenges\",\n __name__,\n template_folder=\"templates\",\n static_folder=\"assets\",\n )\n challenge_model = DynamicChallenge\n\n @classmethod\n def calculate_value(cls, challenge):\n Model = get_model()\n\n solve_count = (\n Solves.query.join(Model, Solves.account_id == Model.id)\n .filter(\n Solves.challenge_id == challenge.id,\n Model.hidden == False,\n Model.banned == False,\n )\n .count()\n )\n\n # If the solve count is 0 we shouldn't manipulate the solve count to\n # let the math update back to normal\n if solve_count != 0:\n # We subtract -1 to allow the first solver to get max point value\n solve_count -= 1\n\n # It is important that this calculation takes into account floats.\n # Hence this file uses from __future__ import division\n value = (\n ((challenge.minimum - challenge.initial) / (challenge.decay ** 2))\n * (solve_count ** 2)\n ) + challenge.initial\n\n value = math.ceil(value)\n\n if value < challenge.minimum:\n value = challenge.minimum\n\n challenge.value = value\n db.session.commit()\n return challenge\n\n @classmethod\n def read(cls, challenge):\n \"\"\"\n This method is in used to access the data of a challenge in a format processable by the front end.\n\n :param challenge:\n :return: Challenge object, data dictionary to be returned to the user\n \"\"\"\n challenge = DynamicChallenge.query.filter_by(id=challenge.id).first()\n data = {\n \"id\": challenge.id,\n \"name\": challenge.name,\n \"value\": challenge.value,\n \"initial\": challenge.initial,\n \"decay\": challenge.decay,\n \"minimum\": challenge.minimum,\n \"description\": challenge.description,\n \"category\": challenge.category,\n \"state\": challenge.state,\n \"max_attempts\": challenge.max_attempts,\n \"type\": challenge.type,\n \"type_data\": {\n \"id\": cls.id,\n \"name\": cls.name,\n \"templates\": cls.templates,\n \"scripts\": cls.scripts,\n },\n }\n return data\n\n @classmethod\n def update(cls, challenge, request):\n \"\"\"\n This method is used to update the information associated with a challenge. This should be kept strictly to the\n Challenges table and any child tables.\n\n :param challenge:\n :param request:\n :return:\n \"\"\"\n data = request.form or request.get_json()\n\n for attr, value in data.items():\n # We need to set these to floats so that the next operations don't operate on strings\n if attr in (\"initial\", \"minimum\", \"decay\"):\n value = float(value)\n setattr(challenge, attr, value)\n\n return DynamicValueChallenge.calculate_value(challenge)\n\n @classmethod\n def solve(cls, user, team, challenge, request):\n super().solve(user, team, challenge, request)\n\n DynamicValueChallenge.calculate_value(challenge)\n\n\ndef load(app):\n upgrade()\n CHALLENGE_CLASSES[\"dynamic\"] = DynamicValueChallenge\n register_plugin_assets_directory(\n app, base_path=\"/plugins/dynamic_challenges/assets/\"\n )\n", "path": "CTFd/plugins/dynamic_challenges/__init__.py"}]} | 1,761 | 125 |
gh_patches_debug_21182 | rasdani/github-patches | git_diff | saleor__saleor-527 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
302 Status in Shipping Method.
**TL;DR** When I enter the "Shipping Address", either as anonymous or by creating a new address, it makes me submit it again and again.
Using c79bdbb13c399b253e3a26480d01f1cb219b4700
I get this most annoying bug and haven't been able to solve it for days. It only happens on the production site, not on the test or development environments. I really don't understand where I am going wrong.
What should really happen: `Shipping Address(302) -> Shipping Method(200)`
Problem:
`Shipping Address(302) -> Shipping Method(302) -> Shipping Address(200)`.
This means the user has to submit the form again and again.
Cases when `Shipping Method(200)` does occur:
- 3 to 5 attempts of submitting the form.
- 2 to 5 minutes of waiting to submit again.
I have even tried the `old checkout`, where the billing address was the first form; the result is still the same.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/core/__init__.py`
Content:
```
1 from __future__ import unicode_literals
2
3 from babel.numbers import get_territory_currencies
4 from django.conf import settings
5 from django_countries import countries
6 from django_countries.fields import Country
7 from geolite2 import geolite2
8
9 TOKEN_PATTERN = ('(?P<token>[0-9a-z]{8}-[0-9a-z]{4}-[0-9a-z]{4}-[0-9a-z]{4}'
10 '-[0-9a-z]{12})')
11
12
13 def get_country_by_ip(ip_address):
14 reader = geolite2.reader()
15 geo_data = reader.get(ip_address)
16 geolite2.close()
17 if geo_data and 'country' in geo_data and 'iso_code' in geo_data['country']:
18 country_iso_code = geo_data['country']['iso_code']
19 if country_iso_code in countries:
20 return Country(country_iso_code)
21
22
23 def get_currency_for_country(country):
24 currencies = get_territory_currencies(country.code)
25 if len(currencies):
26 main_currency = currencies[0]
27 if main_currency in settings.AVAILABLE_CURRENCIES:
28 return main_currency
29 return settings.DEFAULT_CURRENCY
30
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/core/__init__.py b/saleor/core/__init__.py
--- a/saleor/core/__init__.py
+++ b/saleor/core/__init__.py
@@ -2,6 +2,7 @@
from babel.numbers import get_territory_currencies
from django.conf import settings
+from django.core.checks import register, Tags, Warning
from django_countries import countries
from django_countries.fields import Country
from geolite2 import geolite2
@@ -27,3 +28,19 @@
if main_currency in settings.AVAILABLE_CURRENCIES:
return main_currency
return settings.DEFAULT_CURRENCY
+
+
+@register()
+def check_session_caching(app_configs, **kwargs):
+ errors = []
+ cached_engines = {
+ 'django.contrib.sessions.backends.cache',
+ 'django.contrib.sessions.backends.cached_db'}
+ if ('locmem' in settings.CACHES['default']['BACKEND'] and
+ settings.SESSION_ENGINE in cached_engines):
+ errors.append(
+ Warning(
+ 'Session caching cannot work with locmem backend',
+ 'User sessions need to be globally shared, use a cache server like Redis.',
+ 'saleor.W001'))
+ return errors
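
For context, the configuration pattern the new check warns about, sketched with illustrative values: with cached sessions on top of a per-process locmem cache, each gunicorn/uwsgi worker keeps its own session store, so the address saved by one request is invisible to the next one and the checkout keeps redirecting back.

```python
# settings.py (illustrative)
CACHES = {
    "default": {
        "BACKEND": "django.core.cache.backends.locmem.LocMemCache",
    }
}
SESSION_ENGINE = "django.contrib.sessions.backends.cached_db"  # or ...backends.cache

# A shared cache server avoids the intermittent 302 loop, e.g.:
# CACHES["default"]["BACKEND"] = "django_redis.cache.RedisCache"
# CACHES["default"]["LOCATION"] = "redis://127.0.0.1:6379/1"
```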
| {"golden_diff": "diff --git a/saleor/core/__init__.py b/saleor/core/__init__.py\n--- a/saleor/core/__init__.py\n+++ b/saleor/core/__init__.py\n@@ -2,6 +2,7 @@\n \n from babel.numbers import get_territory_currencies\n from django.conf import settings\n+from django.core.checks import register, Tags, Warning\n from django_countries import countries\n from django_countries.fields import Country\n from geolite2 import geolite2\n@@ -27,3 +28,19 @@\n if main_currency in settings.AVAILABLE_CURRENCIES:\n return main_currency\n return settings.DEFAULT_CURRENCY\n+\n+\n+@register()\n+def check_session_caching(app_configs, **kwargs):\n+ errors = []\n+ cached_engines = {\n+ 'django.contrib.sessions.backends.cache',\n+ 'django.contrib.sessions.backends.cached_db'}\n+ if ('locmem' in settings.CACHES['default']['BACKEND'] and\n+ settings.SESSION_ENGINE in cached_engines):\n+ errors.append(\n+ Warning(\n+ 'Session caching cannot work with locmem backend',\n+ 'User sessions need to be globally shared, use a cache server like Redis.',\n+ 'saleor.W001'))\n+ return errors\n", "issue": "302 Status in Shipping Method. \n**TD;LR** When I enter the \"Shipping Address\" entire as anonymous or create a new address. It makes me submit it again and again.\n\nUsing c79bdbb13c399b253e3a26480d01f1cb219b4700\n\nI get this most annoying bug. I can't solve from days. It only happens in the production site, not on the test as well as development. I really don't understand where I am going wrong.\n\nWhat should really happen `Shipping Address(302) -> Shipping Method(200)`\n\nProblem:\n`Shipping Address(302) -> Shipping Method(302) -> Shipping Address(200)`.\nThis mean the users has to submit the form again and again.\n\nCases when `Shipping Method(200)` \n- 3 to 5 attempts of submitting the form.\n- 2 to 5 minutes of waiting to submit again.\n\nI have even done an attempt using the `old checkout`, when the billing address was the first form, Still result is the same.\n\n", "before_files": [{"content": "from __future__ import unicode_literals\n\nfrom babel.numbers import get_territory_currencies\nfrom django.conf import settings\nfrom django_countries import countries\nfrom django_countries.fields import Country\nfrom geolite2 import geolite2\n\nTOKEN_PATTERN = ('(?P<token>[0-9a-z]{8}-[0-9a-z]{4}-[0-9a-z]{4}-[0-9a-z]{4}'\n '-[0-9a-z]{12})')\n\n\ndef get_country_by_ip(ip_address):\n reader = geolite2.reader()\n geo_data = reader.get(ip_address)\n geolite2.close()\n if geo_data and 'country' in geo_data and 'iso_code' in geo_data['country']:\n country_iso_code = geo_data['country']['iso_code']\n if country_iso_code in countries:\n return Country(country_iso_code)\n\n\ndef get_currency_for_country(country):\n currencies = get_territory_currencies(country.code)\n if len(currencies):\n main_currency = currencies[0]\n if main_currency in settings.AVAILABLE_CURRENCIES:\n return main_currency\n return settings.DEFAULT_CURRENCY\n", "path": "saleor/core/__init__.py"}], "after_files": [{"content": "from __future__ import unicode_literals\n\nfrom babel.numbers import get_territory_currencies\nfrom django.conf import settings\nfrom django.core.checks import register, Tags, Warning\nfrom django_countries import countries\nfrom django_countries.fields import Country\nfrom geolite2 import geolite2\n\nTOKEN_PATTERN = ('(?P<token>[0-9a-z]{8}-[0-9a-z]{4}-[0-9a-z]{4}-[0-9a-z]{4}'\n '-[0-9a-z]{12})')\n\n\ndef get_country_by_ip(ip_address):\n reader = geolite2.reader()\n geo_data = reader.get(ip_address)\n geolite2.close()\n if geo_data and 
'country' in geo_data and 'iso_code' in geo_data['country']:\n country_iso_code = geo_data['country']['iso_code']\n if country_iso_code in countries:\n return Country(country_iso_code)\n\n\ndef get_currency_for_country(country):\n currencies = get_territory_currencies(country.code)\n if len(currencies):\n main_currency = currencies[0]\n if main_currency in settings.AVAILABLE_CURRENCIES:\n return main_currency\n return settings.DEFAULT_CURRENCY\n\n\n@register()\ndef check_session_caching(app_configs, **kwargs):\n errors = []\n cached_engines = {\n 'django.contrib.sessions.backends.cache',\n 'django.contrib.sessions.backends.cached_db'}\n if ('locmem' in settings.CACHES['default']['BACKEND'] and\n settings.SESSION_ENGINE in cached_engines):\n errors.append(\n Warning(\n 'Session caching cannot work with locmem backend',\n 'User sessions need to be globally shared, use a cache server like Redis.',\n 'saleor.W001'))\n return errors\n", "path": "saleor/core/__init__.py"}]} | 812 | 284 |
gh_patches_debug_5118 | rasdani/github-patches | git_diff | akvo__akvo-rsr-2254 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Create a 500 page
Create a (working) 500 page with a message like:
> Oops something went wrong. We have been notified and will take action asap. Please check in later or contact us at xyz. RSR team
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rsr/views/error.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """
3 Akvo RSR is covered by the GNU Affero General Public License.
4 See more details in the license.txt file located at the root folder of the
5 Akvo RSR module. For additional details on the GNU license please
6 see < http://www.gnu.org/licenses/agpl.html >.
7 """
8
9 from django.http import HttpResponse
10
11
12 def server_error(request, template_name='500.html'):
13 HttpResponse("Server Error - 500")
14
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rsr/views/error.py b/akvo/rsr/views/error.py
--- a/akvo/rsr/views/error.py
+++ b/akvo/rsr/views/error.py
@@ -6,8 +6,12 @@
see < http://www.gnu.org/licenses/agpl.html >.
"""
-from django.http import HttpResponse
+from django.shortcuts import render_to_response
+from django.template import RequestContext
def server_error(request, template_name='500.html'):
- HttpResponse("Server Error - 500")
+ response = render_to_response('500.html', {}, context_instance=RequestContext(request))
+ response.status_code = 500
+ return response
+
| {"golden_diff": "diff --git a/akvo/rsr/views/error.py b/akvo/rsr/views/error.py\n--- a/akvo/rsr/views/error.py\n+++ b/akvo/rsr/views/error.py\n@@ -6,8 +6,12 @@\n see < http://www.gnu.org/licenses/agpl.html >.\n \"\"\"\n \n-from django.http import HttpResponse\n+from django.shortcuts import render_to_response\n+from django.template import RequestContext\n \n \n def server_error(request, template_name='500.html'):\n- HttpResponse(\"Server Error - 500\")\n+ response = render_to_response('500.html', {}, context_instance=RequestContext(request))\n+ response.status_code = 500\n+ return response\n+\n", "issue": "Create a 500 page\nCreate a (working) 500 page with a message like:\n\n> Oops something went wrong. We have been notified and will take action asap. Please check in later or contact us at xyz. RSR team\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"\n Akvo RSR is covered by the GNU Affero General Public License.\n See more details in the license.txt file located at the root folder of the\n Akvo RSR module. For additional details on the GNU license please\n see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom django.http import HttpResponse\n\n\ndef server_error(request, template_name='500.html'):\n HttpResponse(\"Server Error - 500\")\n", "path": "akvo/rsr/views/error.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"\n Akvo RSR is covered by the GNU Affero General Public License.\n See more details in the license.txt file located at the root folder of the\n Akvo RSR module. For additional details on the GNU license please\n see < http://www.gnu.org/licenses/agpl.html >.\n\"\"\"\n\nfrom django.shortcuts import render_to_response\nfrom django.template import RequestContext\n\n\ndef server_error(request, template_name='500.html'):\n response = render_to_response('500.html', {}, context_instance=RequestContext(request))\n response.status_code = 500\n return response\n\n", "path": "akvo/rsr/views/error.py"}]} | 439 | 157 |
gh_patches_debug_31160 | rasdani/github-patches | git_diff | ibis-project__ibis-2308 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Impala pandas_interop.py DataFrameWriter.write_csv() fails with PermissionError on Windows Platforms
pandas_interop.py DataFrameWriter.write_csv() fails with PermissionError on Windows Platforms due to how Pythons [tempfile.NamedTemporaryFile()](https://docs.python.org/3/library/tempfile.html#tempfile.NamedTemporaryFile) works.
Specifically, if the OS is Windows and the NamedTemporaryFile delete parameter is True, the IO operation will fail, because Python will open the file and then Pandas will attempt to open the file again. See [this SO answer ](https://stackoverflow.com/a/23212515/3744998)for more information.
I have replicated the issue manually and verified that setting the delete parameter to False resolves the issue. Perhaps we can add a platform check and explicitly set the delete parameter based on the platform? Then after we [exit the context statements scope](https://github.com/ibis-project/ibis/blob/master/ibis/impala/pandas_interop.py#L80), explicitly delete the temporary file. I imagine it could look something like the following.
``` python
import os
def write_csv(self, path):
if os.name == 'nt':
# Should probably think of a better name for this boolean
delete = False
else:
delete = True
with tempfile.NamedTemporaryFile(delete=delete) as f:
tmp_name = f.name
# Write the DataFrame to the temporary file path
if options.verbose:
util.log(
'Writing DataFrame to temporary file {}'.format(f.name)
)
self.df.to_csv(
f.name,
header=False,
index=False,
sep=',',
quoting=csv.QUOTE_NONE,
escapechar='\\',
na_rep='#NULL',
)
f.seek(0)
if options.verbose:
util.log('Writing CSV to: {0}'.format(path))
self.hdfs.put(path, f.name)
if not delete:
os.remove(tmp_name)
return path
```
I don't know that much about the Ibis code base, so this is just a first pass. I hope it's helpful.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ibis/impala/pandas_interop.py`
Content:
```
1 # Copyright 2014 Cloudera Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import csv
16 import tempfile
17
18 from posixpath import join as pjoin
19
20 import ibis.common.exceptions as com
21 import ibis.expr.schema as sch
22 import ibis.util as util
23 from ibis.config import options
24
25
26 class DataFrameWriter:
27
28 """
29 Interface class for writing pandas objects to Impala tables
30
31 Class takes ownership of any temporary data written to HDFS
32 """
33
34 def __init__(self, client, df, path=None):
35 self.client = client
36 self.hdfs = client.hdfs
37
38 self.df = df
39
40 self.temp_hdfs_dirs = []
41
42 def write_temp_csv(self):
43 temp_hdfs_dir = pjoin(
44 options.impala.temp_hdfs_path, 'pandas_{}'.format(util.guid())
45 )
46 self.hdfs.mkdir(temp_hdfs_dir)
47
48 # Keep track of the temporary HDFS file
49 self.temp_hdfs_dirs.append(temp_hdfs_dir)
50
51 # Write the file to HDFS
52 hdfs_path = pjoin(temp_hdfs_dir, '0.csv')
53
54 self.write_csv(hdfs_path)
55
56 return temp_hdfs_dir
57
58 def write_csv(self, path):
59 with tempfile.NamedTemporaryFile() as f:
60 # Write the DataFrame to the temporary file path
61 if options.verbose:
62 util.log(
63 'Writing DataFrame to temporary file {}'.format(f.name)
64 )
65
66 self.df.to_csv(
67 f.name,
68 header=False,
69 index=False,
70 sep=',',
71 quoting=csv.QUOTE_NONE,
72 escapechar='\\',
73 na_rep='#NULL',
74 )
75 f.seek(0)
76
77 if options.verbose:
78 util.log('Writing CSV to: {0}'.format(path))
79
80 self.hdfs.put(path, f.name)
81 return path
82
83 def get_schema(self):
84 # define a temporary table using delimited data
85 return sch.infer(self.df)
86
87 def delimited_table(self, csv_dir, name=None, database=None):
88 temp_delimited_name = 'ibis_tmp_pandas_{0}'.format(util.guid())
89 schema = self.get_schema()
90
91 return self.client.delimited_file(
92 csv_dir,
93 schema,
94 name=temp_delimited_name,
95 database=database,
96 delimiter=',',
97 na_rep='#NULL',
98 escapechar='\\\\',
99 external=True,
100 persist=False,
101 )
102
103 def __del__(self):
104 try:
105 self.cleanup()
106 except com.IbisError:
107 pass
108
109 def cleanup(self):
110 for path in self.temp_hdfs_dirs:
111 self.hdfs.rmdir(path)
112 self.temp_hdfs_dirs = []
113 self.csv_dir = None
114
115
116 def write_temp_dataframe(client, df):
117 writer = DataFrameWriter(client, df)
118 path = writer.write_temp_csv()
119 return writer, writer.delimited_table(path)
120
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ibis/impala/pandas_interop.py b/ibis/impala/pandas_interop.py
--- a/ibis/impala/pandas_interop.py
+++ b/ibis/impala/pandas_interop.py
@@ -13,8 +13,8 @@
# limitations under the License.
import csv
+import os
import tempfile
-
from posixpath import join as pjoin
import ibis.common.exceptions as com
@@ -56,15 +56,21 @@
return temp_hdfs_dir
def write_csv(self, path):
- with tempfile.NamedTemporaryFile() as f:
+ # Use a temporary dir instead of a temporary file
+ # to provide Windows support and avoid #2267
+ # https://github.com/ibis-project/ibis/issues/2267
+ with tempfile.TemporaryDirectory() as f:
# Write the DataFrame to the temporary file path
+ tmp_file_path = os.path.join(f, 'impala_temp_file.csv')
if options.verbose:
util.log(
- 'Writing DataFrame to temporary file {}'.format(f.name)
+ 'Writing DataFrame to temporary directory {}'.format(
+ tmp_file_path
+ )
)
self.df.to_csv(
- f.name,
+ tmp_file_path,
header=False,
index=False,
sep=',',
@@ -72,12 +78,11 @@
escapechar='\\',
na_rep='#NULL',
)
- f.seek(0)
if options.verbose:
util.log('Writing CSV to: {0}'.format(path))
- self.hdfs.put(path, f.name)
+ self.hdfs.put(path, tmp_file_path)
return path
def get_schema(self):
| {"golden_diff": "diff --git a/ibis/impala/pandas_interop.py b/ibis/impala/pandas_interop.py\n--- a/ibis/impala/pandas_interop.py\n+++ b/ibis/impala/pandas_interop.py\n@@ -13,8 +13,8 @@\n # limitations under the License.\n \n import csv\n+import os\n import tempfile\n-\n from posixpath import join as pjoin\n \n import ibis.common.exceptions as com\n@@ -56,15 +56,21 @@\n return temp_hdfs_dir\n \n def write_csv(self, path):\n- with tempfile.NamedTemporaryFile() as f:\n+ # Use a temporary dir instead of a temporary file\n+ # to provide Windows support and avoid #2267\n+ # https://github.com/ibis-project/ibis/issues/2267\n+ with tempfile.TemporaryDirectory() as f:\n # Write the DataFrame to the temporary file path\n+ tmp_file_path = os.path.join(f, 'impala_temp_file.csv')\n if options.verbose:\n util.log(\n- 'Writing DataFrame to temporary file {}'.format(f.name)\n+ 'Writing DataFrame to temporary directory {}'.format(\n+ tmp_file_path\n+ )\n )\n \n self.df.to_csv(\n- f.name,\n+ tmp_file_path,\n header=False,\n index=False,\n sep=',',\n@@ -72,12 +78,11 @@\n escapechar='\\\\',\n na_rep='#NULL',\n )\n- f.seek(0)\n \n if options.verbose:\n util.log('Writing CSV to: {0}'.format(path))\n \n- self.hdfs.put(path, f.name)\n+ self.hdfs.put(path, tmp_file_path)\n return path\n \n def get_schema(self):\n", "issue": "Impala pandas_interop.py DataFrameWriter.write_csv() fails with PermissionError on Windows Platforms\npandas_interop.py DataFrameWriter.write_csv() fails with PermissionError on Windows Platforms due to how Pythons [tempfile.NamedTemporaryFile()](https://docs.python.org/3/library/tempfile.html#tempfile.NamedTemporaryFile) works. \r\n\r\nSpecifically, if the OS is Windows and the NamedTemporaryFile delete parameter is True, the IO operation will fail, because Python will open the file and then Pandas will attempt to open the file again. See [this SO answer ](https://stackoverflow.com/a/23212515/3744998)for more information. \r\n\r\nI have replicated the issue manually and verified that setting the delete parameter to False resolves the issue. Perhaps we can add a platform check and explicitly set the delete parameter based on the platform? Then after we [exit the context statements scope](https://github.com/ibis-project/ibis/blob/master/ibis/impala/pandas_interop.py#L80), explicitly delete the temporary file. I imagine it could look something like the following.\r\n\r\n``` python\r\n\r\n import os\r\n\r\n def write_csv(self, path):\r\n if os.name == 'nt':\r\n # Should probably think of a better name for this boolean\r\n delete = False\r\n else:\r\n delete = True\r\n\r\n with tempfile.NamedTemporaryFile(delete=delete) as f:\r\n tmp_name = f.name\r\n # Write the DataFrame to the temporary file path\r\n if options.verbose:\r\n util.log(\r\n 'Writing DataFrame to temporary file {}'.format(f.name)\r\n )\r\n\r\n self.df.to_csv(\r\n f.name,\r\n header=False,\r\n index=False,\r\n sep=',',\r\n quoting=csv.QUOTE_NONE,\r\n escapechar='\\\\',\r\n na_rep='#NULL',\r\n )\r\n f.seek(0)\r\n\r\n if options.verbose:\r\n util.log('Writing CSV to: {0}'.format(path))\r\n\r\n self.hdfs.put(path, f.name)\r\n\r\n if not delete:\r\n os.remove(tmp_name)\r\n\r\n return path\r\n```\r\n\r\n\r\nI don't know that much about the Ibis code base, so this is just a first pass. I hope it's helpful. 
\n", "before_files": [{"content": "# Copyright 2014 Cloudera Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport csv\nimport tempfile\n\nfrom posixpath import join as pjoin\n\nimport ibis.common.exceptions as com\nimport ibis.expr.schema as sch\nimport ibis.util as util\nfrom ibis.config import options\n\n\nclass DataFrameWriter:\n\n \"\"\"\n Interface class for writing pandas objects to Impala tables\n\n Class takes ownership of any temporary data written to HDFS\n \"\"\"\n\n def __init__(self, client, df, path=None):\n self.client = client\n self.hdfs = client.hdfs\n\n self.df = df\n\n self.temp_hdfs_dirs = []\n\n def write_temp_csv(self):\n temp_hdfs_dir = pjoin(\n options.impala.temp_hdfs_path, 'pandas_{}'.format(util.guid())\n )\n self.hdfs.mkdir(temp_hdfs_dir)\n\n # Keep track of the temporary HDFS file\n self.temp_hdfs_dirs.append(temp_hdfs_dir)\n\n # Write the file to HDFS\n hdfs_path = pjoin(temp_hdfs_dir, '0.csv')\n\n self.write_csv(hdfs_path)\n\n return temp_hdfs_dir\n\n def write_csv(self, path):\n with tempfile.NamedTemporaryFile() as f:\n # Write the DataFrame to the temporary file path\n if options.verbose:\n util.log(\n 'Writing DataFrame to temporary file {}'.format(f.name)\n )\n\n self.df.to_csv(\n f.name,\n header=False,\n index=False,\n sep=',',\n quoting=csv.QUOTE_NONE,\n escapechar='\\\\',\n na_rep='#NULL',\n )\n f.seek(0)\n\n if options.verbose:\n util.log('Writing CSV to: {0}'.format(path))\n\n self.hdfs.put(path, f.name)\n return path\n\n def get_schema(self):\n # define a temporary table using delimited data\n return sch.infer(self.df)\n\n def delimited_table(self, csv_dir, name=None, database=None):\n temp_delimited_name = 'ibis_tmp_pandas_{0}'.format(util.guid())\n schema = self.get_schema()\n\n return self.client.delimited_file(\n csv_dir,\n schema,\n name=temp_delimited_name,\n database=database,\n delimiter=',',\n na_rep='#NULL',\n escapechar='\\\\\\\\',\n external=True,\n persist=False,\n )\n\n def __del__(self):\n try:\n self.cleanup()\n except com.IbisError:\n pass\n\n def cleanup(self):\n for path in self.temp_hdfs_dirs:\n self.hdfs.rmdir(path)\n self.temp_hdfs_dirs = []\n self.csv_dir = None\n\n\ndef write_temp_dataframe(client, df):\n writer = DataFrameWriter(client, df)\n path = writer.write_temp_csv()\n return writer, writer.delimited_table(path)\n", "path": "ibis/impala/pandas_interop.py"}], "after_files": [{"content": "# Copyright 2014 Cloudera Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport csv\nimport os\nimport 
tempfile\nfrom posixpath import join as pjoin\n\nimport ibis.common.exceptions as com\nimport ibis.expr.schema as sch\nimport ibis.util as util\nfrom ibis.config import options\n\n\nclass DataFrameWriter:\n\n \"\"\"\n Interface class for writing pandas objects to Impala tables\n\n Class takes ownership of any temporary data written to HDFS\n \"\"\"\n\n def __init__(self, client, df, path=None):\n self.client = client\n self.hdfs = client.hdfs\n\n self.df = df\n\n self.temp_hdfs_dirs = []\n\n def write_temp_csv(self):\n temp_hdfs_dir = pjoin(\n options.impala.temp_hdfs_path, 'pandas_{}'.format(util.guid())\n )\n self.hdfs.mkdir(temp_hdfs_dir)\n\n # Keep track of the temporary HDFS file\n self.temp_hdfs_dirs.append(temp_hdfs_dir)\n\n # Write the file to HDFS\n hdfs_path = pjoin(temp_hdfs_dir, '0.csv')\n\n self.write_csv(hdfs_path)\n\n return temp_hdfs_dir\n\n def write_csv(self, path):\n # Use a temporary dir instead of a temporary file\n # to provide Windows support and avoid #2267\n # https://github.com/ibis-project/ibis/issues/2267\n with tempfile.TemporaryDirectory() as f:\n # Write the DataFrame to the temporary file path\n tmp_file_path = os.path.join(f, 'impala_temp_file.csv')\n if options.verbose:\n util.log(\n 'Writing DataFrame to temporary directory {}'.format(\n tmp_file_path\n )\n )\n\n self.df.to_csv(\n tmp_file_path,\n header=False,\n index=False,\n sep=',',\n quoting=csv.QUOTE_NONE,\n escapechar='\\\\',\n na_rep='#NULL',\n )\n\n if options.verbose:\n util.log('Writing CSV to: {0}'.format(path))\n\n self.hdfs.put(path, tmp_file_path)\n return path\n\n def get_schema(self):\n # define a temporary table using delimited data\n return sch.infer(self.df)\n\n def delimited_table(self, csv_dir, name=None, database=None):\n temp_delimited_name = 'ibis_tmp_pandas_{0}'.format(util.guid())\n schema = self.get_schema()\n\n return self.client.delimited_file(\n csv_dir,\n schema,\n name=temp_delimited_name,\n database=database,\n delimiter=',',\n na_rep='#NULL',\n escapechar='\\\\\\\\',\n external=True,\n persist=False,\n )\n\n def __del__(self):\n try:\n self.cleanup()\n except com.IbisError:\n pass\n\n def cleanup(self):\n for path in self.temp_hdfs_dirs:\n self.hdfs.rmdir(path)\n self.temp_hdfs_dirs = []\n self.csv_dir = None\n\n\ndef write_temp_dataframe(client, df):\n writer = DataFrameWriter(client, df)\n path = writer.write_temp_csv()\n return writer, writer.delimited_table(path)\n", "path": "ibis/impala/pandas_interop.py"}]} | 1,720 | 395 |
gh_patches_debug_4876 | rasdani/github-patches | git_diff | akvo__akvo-rsr-1921 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Project editor crash for conditions without a text
## Test plan
1. Open the project editor, and click the 'Show advanced IATI fields' button
2. In the results and indicators section, add a project condition with only a type (no condition text)
3. Save
4. Refresh the page
5. **RESULT:** No error should be shown and the condition should state "No condition specified"
## Issue description
Hasn't happened on Live yet, but it did on Test. See http://sentry.support.akvo-ops.org/rsr/test/group/690/.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rsr/models/project_condition.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7
8 from django.db import models
9 from django.utils.translation import ugettext_lazy as _
10
11 from ..fields import ValidXMLCharField
12
13 from akvo.codelists.models import ConditionType
14 from akvo.codelists.store.codelists_v201 import CONDITION_TYPE
15 from akvo.utils import codelist_choices, codelist_value
16
17
18 class ProjectCondition(models.Model):
19 project = models.ForeignKey('Project', verbose_name=_(u'project'), related_name='conditions')
20 text = ValidXMLCharField(_(u'condition'), blank=True, max_length=100,
21 help_text=_(u'(100 characters)'))
22 type = ValidXMLCharField(_(u'condition type'), blank=True, max_length=1,
23 choices=codelist_choices(CONDITION_TYPE))
24
25 def __unicode__(self):
26 return self.text if self.text else _(u'No condition specified')
27
28 def iati_type(self):
29 return codelist_value(ConditionType, self, 'type')
30
31 class Meta:
32 app_label = 'rsr'
33 verbose_name = _(u'condition')
34 verbose_name_plural = _(u'conditions')
35
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rsr/models/project_condition.py b/akvo/rsr/models/project_condition.py
--- a/akvo/rsr/models/project_condition.py
+++ b/akvo/rsr/models/project_condition.py
@@ -23,7 +23,7 @@
choices=codelist_choices(CONDITION_TYPE))
def __unicode__(self):
- return self.text if self.text else _(u'No condition specified')
+ return self.text if self.text else u'%s' % _(u'No condition specified')
def iati_type(self):
return codelist_value(ConditionType, self, 'type')
| {"golden_diff": "diff --git a/akvo/rsr/models/project_condition.py b/akvo/rsr/models/project_condition.py\n--- a/akvo/rsr/models/project_condition.py\n+++ b/akvo/rsr/models/project_condition.py\n@@ -23,7 +23,7 @@\n choices=codelist_choices(CONDITION_TYPE))\n \n def __unicode__(self):\n- return self.text if self.text else _(u'No condition specified')\n+ return self.text if self.text else u'%s' % _(u'No condition specified')\n \n def iati_type(self):\n return codelist_value(ConditionType, self, 'type')\n", "issue": "Project editor crash for conditions without a text\n## Test plan\n1. Open the project editor, and click the 'Show advanced IATI fields' button\n2. In the results and indicators section, add a project condition with only a type (no condition text)\n3. Save\n4. Refresh the page\n5. **RESULT:** No error should be shown and the condition should state \"No condition specified\"\n## Issue description\n\nHasn't happened on Live yet, but it did on Test. See http://sentry.support.akvo-ops.org/rsr/test/group/690/.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom django.db import models\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom ..fields import ValidXMLCharField\n\nfrom akvo.codelists.models import ConditionType\nfrom akvo.codelists.store.codelists_v201 import CONDITION_TYPE\nfrom akvo.utils import codelist_choices, codelist_value\n\n\nclass ProjectCondition(models.Model):\n project = models.ForeignKey('Project', verbose_name=_(u'project'), related_name='conditions')\n text = ValidXMLCharField(_(u'condition'), blank=True, max_length=100,\n help_text=_(u'(100 characters)'))\n type = ValidXMLCharField(_(u'condition type'), blank=True, max_length=1,\n choices=codelist_choices(CONDITION_TYPE))\n\n def __unicode__(self):\n return self.text if self.text else _(u'No condition specified')\n\n def iati_type(self):\n return codelist_value(ConditionType, self, 'type')\n\n class Meta:\n app_label = 'rsr'\n verbose_name = _(u'condition')\n verbose_name_plural = _(u'conditions')\n", "path": "akvo/rsr/models/project_condition.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\nfrom django.db import models\nfrom django.utils.translation import ugettext_lazy as _\n\nfrom ..fields import ValidXMLCharField\n\nfrom akvo.codelists.models import ConditionType\nfrom akvo.codelists.store.codelists_v201 import CONDITION_TYPE\nfrom akvo.utils import codelist_choices, codelist_value\n\n\nclass ProjectCondition(models.Model):\n project = models.ForeignKey('Project', verbose_name=_(u'project'), related_name='conditions')\n text = ValidXMLCharField(_(u'condition'), blank=True, max_length=100,\n help_text=_(u'(100 characters)'))\n type = ValidXMLCharField(_(u'condition type'), blank=True, max_length=1,\n choices=codelist_choices(CONDITION_TYPE))\n\n def __unicode__(self):\n return self.text if self.text else u'%s' % _(u'No condition specified')\n\n def iati_type(self):\n return codelist_value(ConditionType, self, 'type')\n\n class Meta:\n app_label = 
'rsr'\n verbose_name = _(u'condition')\n verbose_name_plural = _(u'conditions')\n", "path": "akvo/rsr/models/project_condition.py"}]} | 759 | 136 |
gh_patches_debug_3195 | rasdani/github-patches | git_diff | hartwork__jawanndenn-225 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
missing rapid-json dependency for pip
installing via pip3 does not also install the rapid-json dependency.
missing rapid-json dependency for pip
installing via pip3 does not also install the rapid-json dependency.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #! /usr/bin/env python3
2 # Copyright (C) 2016 Sebastian Pipping <[email protected]>
3 # Licensed under GNU Affero GPL v3 or later
4
5 import os
6
7 from setuptools import find_packages, setup
8
9 from jawanndenn.metadata import APP_NAME, VERSION_STR
10
11
12 def _read(filename):
13 with open(filename) as f:
14 return f.read()
15
16
17 _tests_require = [
18 'factory-boy>=2.12.0',
19 'parameterized>=0.7.1',
20 ]
21
22 _extras_require = {
23 'tests': _tests_require,
24 }
25
26
27 def _collect_package_data(top_directory):
28 for root, dirs, files in os.walk(os.path.join(top_directory, 'static')):
29 if files:
30 relative_root = os.path.relpath(root, top_directory)
31 yield os.path.join(relative_root, '*')
32
33
34 if __name__ == '__main__':
35 setup(
36 name=APP_NAME,
37 version=VERSION_STR,
38
39 license='AGPLv3+',
40 description='Libre alternative to Doodle',
41 long_description=_read('README.rst'),
42
43 author='Sebastian Pipping',
44 author_email='[email protected]',
45 url='https://github.com/hartwork/jawanndenn',
46
47 python_requires='>=3.6',
48 install_requires=[
49 'django>=2.2.7',
50 'django-extensions>=2.2.5',
51 'django-ratelimit>=2.0.0',
52 'djangorestframework>=3.11.0',
53 'gunicorn>=20.0.4',
54 'gunicorn-color>=0.1.0',
55 ],
56 extras_require=_extras_require,
57 tests_require=_tests_require,
58
59 packages=find_packages(),
60 package_data={
61 APP_NAME: list(_collect_package_data(APP_NAME)),
62 },
63
64 entry_points={
65 'console_scripts': [
66 f'{APP_NAME} = {APP_NAME}.__main__:main',
67 ],
68 },
69
70 classifiers=[
71 'Development Status :: 4 - Beta',
72 'Framework :: Django',
73 'Framework :: Django :: 2.2',
74 'License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)', # noqa: E501
75 'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)', # noqa: E501
76 'Programming Language :: JavaScript',
77 'Programming Language :: Python',
78 'Programming Language :: Python :: 3',
79 'Programming Language :: Python :: 3.6',
80 'Programming Language :: Python :: 3.7',
81 'Programming Language :: Python :: 3.8',
82 'Programming Language :: Python :: 3 :: Only',
83 'Topic :: Internet :: WWW/HTTP :: WSGI :: Application',
84 'Topic :: Office/Business :: Scheduling',
85 ]
86 )
87
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -52,6 +52,8 @@
'djangorestframework>=3.11.0',
'gunicorn>=20.0.4',
'gunicorn-color>=0.1.0',
+ 'python-dateutil>=2.8.1',
+ 'python-rapidjson>=1.0',
],
extras_require=_extras_require,
tests_require=_tests_require,
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -52,6 +52,8 @@\n 'djangorestframework>=3.11.0',\n 'gunicorn>=20.0.4',\n 'gunicorn-color>=0.1.0',\n+ 'python-dateutil>=2.8.1',\n+ 'python-rapidjson>=1.0',\n ],\n extras_require=_extras_require,\n tests_require=_tests_require,\n", "issue": "missing rapid-json dependency for pip\ninstalling via pip3 does not also install the rapid-json dependency.\nmissing rapid-json dependency for pip\ninstalling via pip3 does not also install the rapid-json dependency.\n", "before_files": [{"content": "#! /usr/bin/env python3\n# Copyright (C) 2016 Sebastian Pipping <[email protected]>\n# Licensed under GNU Affero GPL v3 or later\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nfrom jawanndenn.metadata import APP_NAME, VERSION_STR\n\n\ndef _read(filename):\n with open(filename) as f:\n return f.read()\n\n\n_tests_require = [\n 'factory-boy>=2.12.0',\n 'parameterized>=0.7.1',\n]\n\n_extras_require = {\n 'tests': _tests_require,\n}\n\n\ndef _collect_package_data(top_directory):\n for root, dirs, files in os.walk(os.path.join(top_directory, 'static')):\n if files:\n relative_root = os.path.relpath(root, top_directory)\n yield os.path.join(relative_root, '*')\n\n\nif __name__ == '__main__':\n setup(\n name=APP_NAME,\n version=VERSION_STR,\n\n license='AGPLv3+',\n description='Libre alternative to Doodle',\n long_description=_read('README.rst'),\n\n author='Sebastian Pipping',\n author_email='[email protected]',\n url='https://github.com/hartwork/jawanndenn',\n\n python_requires='>=3.6',\n install_requires=[\n 'django>=2.2.7',\n 'django-extensions>=2.2.5',\n 'django-ratelimit>=2.0.0',\n 'djangorestframework>=3.11.0',\n 'gunicorn>=20.0.4',\n 'gunicorn-color>=0.1.0',\n ],\n extras_require=_extras_require,\n tests_require=_tests_require,\n\n packages=find_packages(),\n package_data={\n APP_NAME: list(_collect_package_data(APP_NAME)),\n },\n\n entry_points={\n 'console_scripts': [\n f'{APP_NAME} = {APP_NAME}.__main__:main',\n ],\n },\n\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Framework :: Django',\n 'Framework :: Django :: 2.2',\n 'License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)', # noqa: E501\n 'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)', # noqa: E501\n 'Programming Language :: JavaScript',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Internet :: WWW/HTTP :: WSGI :: Application',\n 'Topic :: Office/Business :: Scheduling',\n ]\n )\n", "path": "setup.py"}], "after_files": [{"content": "#! 
/usr/bin/env python3\n# Copyright (C) 2016 Sebastian Pipping <[email protected]>\n# Licensed under GNU Affero GPL v3 or later\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nfrom jawanndenn.metadata import APP_NAME, VERSION_STR\n\n\ndef _read(filename):\n with open(filename) as f:\n return f.read()\n\n\n_tests_require = [\n 'factory-boy>=2.12.0',\n 'parameterized>=0.7.1',\n]\n\n_extras_require = {\n 'tests': _tests_require,\n}\n\n\ndef _collect_package_data(top_directory):\n for root, dirs, files in os.walk(os.path.join(top_directory, 'static')):\n if files:\n relative_root = os.path.relpath(root, top_directory)\n yield os.path.join(relative_root, '*')\n\n\nif __name__ == '__main__':\n setup(\n name=APP_NAME,\n version=VERSION_STR,\n\n license='AGPLv3+',\n description='Libre alternative to Doodle',\n long_description=_read('README.rst'),\n\n author='Sebastian Pipping',\n author_email='[email protected]',\n url='https://github.com/hartwork/jawanndenn',\n\n python_requires='>=3.6',\n install_requires=[\n 'django>=2.2.7',\n 'django-extensions>=2.2.5',\n 'django-ratelimit>=2.0.0',\n 'djangorestframework>=3.11.0',\n 'gunicorn>=20.0.4',\n 'gunicorn-color>=0.1.0',\n 'python-dateutil>=2.8.1',\n 'python-rapidjson>=1.0',\n ],\n extras_require=_extras_require,\n tests_require=_tests_require,\n\n packages=find_packages(),\n package_data={\n APP_NAME: list(_collect_package_data(APP_NAME)),\n },\n\n entry_points={\n 'console_scripts': [\n f'{APP_NAME} = {APP_NAME}.__main__:main',\n ],\n },\n\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'Framework :: Django',\n 'Framework :: Django :: 2.2',\n 'License :: OSI Approved :: GNU Affero General Public License v3 or later (AGPLv3+)', # noqa: E501\n 'License :: OSI Approved :: GNU General Public License v3 or later (GPLv3+)', # noqa: E501\n 'Programming Language :: JavaScript',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Programming Language :: Python :: 3.8',\n 'Programming Language :: Python :: 3 :: Only',\n 'Topic :: Internet :: WWW/HTTP :: WSGI :: Application',\n 'Topic :: Office/Business :: Scheduling',\n ]\n )\n", "path": "setup.py"}]} | 1,097 | 111 |
gh_patches_debug_18134 | rasdani/github-patches | git_diff | ocadotechnology__aimmo-350 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Functional Test 1 - Effect expiry
We want to check that the effect can expire. This would be as follows:
Place an avatar on the map in one of the cells.
Place a pickup somewhere.
Make a move in some direction, towards the pickup.
Call `avatar.pickup.effect.onturn()` 10 times until the effect expires.
Make sure that the avatar holds *no effect* (not pickup!).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `aimmo-game/simulation/effects.py`
Content:
```
1 from abc import ABCMeta, abstractmethod
2
3 import math
4
5 import sys
6
7
8 class _Effect(object):
9 __metaclass__ = ABCMeta
10
11 def __init__(self, avatar):
12 self._avatar = avatar
13 self.is_expired = False
14
15 @abstractmethod
16 def on_turn(self):
17 raise NotImplementedError()
18
19
20 class _TimedEffect(_Effect):
21 __metaclass__ = ABCMeta
22 EFFECT_TIME = 10
23
24 def __init__(self, *args):
25 super(_TimedEffect, self).__init__(*args)
26 self._time_remaining = self.EFFECT_TIME
27
28 def remove(self):
29 try:
30 self._avatar.effects.remove(self)
31 except KeyError as e:
32 raise KeyError("The avatar object does not exist! Cannot remove the effect.")
33
34 def on_turn(self):
35 self._time_remaining -= 1
36 if self._time_remaining <= 0:
37 self.is_expired = True
38
39
40 class InvulnerabilityPickupEffect(_TimedEffect):
41 def __init__(self, *args):
42 super(InvulnerabilityPickupEffect, self).__init__(*args)
43 self._avatar.resistance += 1000
44
45 def remove(self):
46 super(InvulnerabilityPickupEffect, self).remove()
47 self._avatar.resistance -= 1000
48
49
50 class DamageBoostPickupEffect(_TimedEffect):
51 def __init__(self, damage_boost, *args):
52 # For now we have no maximum damage boost constraints, so infinity needs to be casted into a max int.
53 if math.isinf(damage_boost):
54 damage_boost = sys.maxsize
55
56 self._damage_boost = int(round(damage_boost))
57 super(DamageBoostPickupEffect, self).__init__(*args)
58 self._avatar.attack_strength += self._damage_boost
59
60 def remove(self):
61 super(DamageBoostPickupEffect, self).remove()
62 self._avatar.attack_strength -= self._damage_boost
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/aimmo-game/simulation/effects.py b/aimmo-game/simulation/effects.py
--- a/aimmo-game/simulation/effects.py
+++ b/aimmo-game/simulation/effects.py
@@ -1,9 +1,10 @@
from abc import ABCMeta, abstractmethod
import math
-
import sys
+INVULNERABILITY_RESISTANCE = 1000
+
class _Effect(object):
__metaclass__ = ABCMeta
@@ -40,11 +41,11 @@
class InvulnerabilityPickupEffect(_TimedEffect):
def __init__(self, *args):
super(InvulnerabilityPickupEffect, self).__init__(*args)
- self._avatar.resistance += 1000
+ self._avatar.resistance += INVULNERABILITY_RESISTANCE
def remove(self):
super(InvulnerabilityPickupEffect, self).remove()
- self._avatar.resistance -= 1000
+ self._avatar.resistance -= INVULNERABILITY_RESISTANCE
class DamageBoostPickupEffect(_TimedEffect):
| {"golden_diff": "diff --git a/aimmo-game/simulation/effects.py b/aimmo-game/simulation/effects.py\n--- a/aimmo-game/simulation/effects.py\n+++ b/aimmo-game/simulation/effects.py\n@@ -1,9 +1,10 @@\n from abc import ABCMeta, abstractmethod\n \n import math\n-\n import sys\n \n+INVULNERABILITY_RESISTANCE = 1000\n+\n \n class _Effect(object):\n __metaclass__ = ABCMeta\n@@ -40,11 +41,11 @@\n class InvulnerabilityPickupEffect(_TimedEffect):\n def __init__(self, *args):\n super(InvulnerabilityPickupEffect, self).__init__(*args)\n- self._avatar.resistance += 1000\n+ self._avatar.resistance += INVULNERABILITY_RESISTANCE\n \n def remove(self):\n super(InvulnerabilityPickupEffect, self).remove()\n- self._avatar.resistance -= 1000\n+ self._avatar.resistance -= INVULNERABILITY_RESISTANCE\n \n \n class DamageBoostPickupEffect(_TimedEffect):\n", "issue": "Functional Test 1 - Effect expiry \nWe want to check that the effect can expire. This would be as follows:\n\nPlace an avatar on the map in one of the cells.\nPlace a pickup somewhere.\nMake a move in some direction, towards the pickup.\nCall `avatar.pickup.effect.onturn()` 10 times until the effect expires.\nMake sure that the avatar holds *no effect* (not pickup!).\n", "before_files": [{"content": "from abc import ABCMeta, abstractmethod\n\nimport math\n\nimport sys\n\n\nclass _Effect(object):\n __metaclass__ = ABCMeta\n\n def __init__(self, avatar):\n self._avatar = avatar\n self.is_expired = False\n\n @abstractmethod\n def on_turn(self):\n raise NotImplementedError()\n\n\nclass _TimedEffect(_Effect):\n __metaclass__ = ABCMeta\n EFFECT_TIME = 10\n\n def __init__(self, *args):\n super(_TimedEffect, self).__init__(*args)\n self._time_remaining = self.EFFECT_TIME\n\n def remove(self):\n try:\n self._avatar.effects.remove(self)\n except KeyError as e:\n raise KeyError(\"The avatar object does not exist! 
Cannot remove the effect.\")\n\n def on_turn(self):\n self._time_remaining -= 1\n if self._time_remaining <= 0:\n self.is_expired = True\n\n\nclass InvulnerabilityPickupEffect(_TimedEffect):\n def __init__(self, *args):\n super(InvulnerabilityPickupEffect, self).__init__(*args)\n self._avatar.resistance += 1000\n\n def remove(self):\n super(InvulnerabilityPickupEffect, self).remove()\n self._avatar.resistance -= 1000\n\n\nclass DamageBoostPickupEffect(_TimedEffect):\n def __init__(self, damage_boost, *args):\n # For now we have no maximum damage boost constraints, so infinity needs to be casted into a max int.\n if math.isinf(damage_boost):\n damage_boost = sys.maxsize\n\n self._damage_boost = int(round(damage_boost))\n super(DamageBoostPickupEffect, self).__init__(*args)\n self._avatar.attack_strength += self._damage_boost\n\n def remove(self):\n super(DamageBoostPickupEffect, self).remove()\n self._avatar.attack_strength -= self._damage_boost\n", "path": "aimmo-game/simulation/effects.py"}], "after_files": [{"content": "from abc import ABCMeta, abstractmethod\n\nimport math\nimport sys\n\nINVULNERABILITY_RESISTANCE = 1000\n\n\nclass _Effect(object):\n __metaclass__ = ABCMeta\n\n def __init__(self, avatar):\n self._avatar = avatar\n self.is_expired = False\n\n @abstractmethod\n def on_turn(self):\n raise NotImplementedError()\n\n\nclass _TimedEffect(_Effect):\n __metaclass__ = ABCMeta\n EFFECT_TIME = 10\n\n def __init__(self, *args):\n super(_TimedEffect, self).__init__(*args)\n self._time_remaining = self.EFFECT_TIME\n\n def remove(self):\n try:\n self._avatar.effects.remove(self)\n except KeyError as e:\n raise KeyError(\"The avatar object does not exist! Cannot remove the effect.\")\n\n def on_turn(self):\n self._time_remaining -= 1\n if self._time_remaining <= 0:\n self.is_expired = True\n\n\nclass InvulnerabilityPickupEffect(_TimedEffect):\n def __init__(self, *args):\n super(InvulnerabilityPickupEffect, self).__init__(*args)\n self._avatar.resistance += INVULNERABILITY_RESISTANCE\n\n def remove(self):\n super(InvulnerabilityPickupEffect, self).remove()\n self._avatar.resistance -= INVULNERABILITY_RESISTANCE\n\n\nclass DamageBoostPickupEffect(_TimedEffect):\n def __init__(self, damage_boost, *args):\n # For now we have no maximum damage boost constraints, so infinity needs to be casted into a max int.\n if math.isinf(damage_boost):\n damage_boost = sys.maxsize\n\n self._damage_boost = int(round(damage_boost))\n super(DamageBoostPickupEffect, self).__init__(*args)\n self._avatar.attack_strength += self._damage_boost\n\n def remove(self):\n super(DamageBoostPickupEffect, self).remove()\n self._avatar.attack_strength -= self._damage_boost\n", "path": "aimmo-game/simulation/effects.py"}]} | 892 | 245 |
gh_patches_debug_12044 | rasdani/github-patches | git_diff | chainer__chainer-5333 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
PrintHook may fail
* Conditions
```
>>> chainer.print_runtime_info()
Platform: Darwin-17.7.0-x86_64-i386-64bit
Chainer: 5.0.0b4
NumPy: 1.16.0.dev0+591880b
CuPy: Not Available
```
* Code to reproduce
```
>>> x = chainer.Variable(np.random.rand(2, 3).astype('f'))
>>> lstm = chainer.links.LSTM(3)
>>> y = lstm(x)
>>> with chainer.function_hooks.PrintHook():
... chainer.functions.sum(y).backward()
...
```
* Error messages, stack traces, or logs
```
function Sum
input data
<variable at 0x1152b3438>
- device: CPU
- backend: <class 'numpy.ndarray'>
- shape: (2, 3)
- dtype: float32
- statistics: mean=-0.08529118, std=0.18154962
- grad: None
function Sum
input data
(removed)
output gradient
<variable at 0x1152b33c8>
- device: CPU
- backend: <class 'numpy.ndarray'>
- shape: ()
- dtype: float32
- statistics: mean=0.00000000, std=0.00000000
- grad: mean=1.00000000, std=0.00000000
function BroadcastTo
input data
<variable at 0x1152b3710>
- device: CPU
- backend: <class 'numpy.ndarray'>
- shape: ()
- dtype: float32
- statistics: mean=1.00000000, std=0.00000000
- grad: None
function LSTM
input data
<variable at 0x1152b3668>
- device: CPU
- backend: <class 'numpy.ndarray'>
- shape: (2, 3)
- dtype: float32
- statistics: mean=0.00000000, std=0.00000000
- grad: None
<variable at 0x1152b3668>
- device: CPU
- backend: <class 'numpy.ndarray'>
- shape: (2, 12)
- dtype: float32
- statistics: mean=0.53247088, std=0.91164017
- grad: None
output gradient
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "/Users/tos/GitHub/chainer/chainer/variable.py", line 963, in backward
self._backward_main(retain_grad, loss_scale)
File "/Users/tos/GitHub/chainer/chainer/variable.py", line 1025, in _backward_main
hook.backward_preprocess(func, in_data, out_grad_data)
File "/Users/tos/GitHub/chainer/chainer/function_hooks/debug_print.py", line 87, in backward_preprocess
self._process(function, in_data, out_grad)
File "/Users/tos/GitHub/chainer/chainer/function_hooks/debug_print.py", line 77, in _process
v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))
AttributeError: 'NoneType' object has no attribute 'dtype'
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `chainer/function_hooks/debug_print.py`
Content:
```
1 import sys
2 import warnings
3
4 from chainer.backends import cuda
5 from chainer import function_hook
6 from chainer import variable
7
8
9 class PrintHook(function_hook.FunctionHook):
10 """Function hook that prints debug information.
11
12 This function hook outputs the debug information of input arguments of
13 ``forward`` and ``backward`` methods involved in the hooked functions
14 at preprocessing time (that is, just before each method is called).
15
16 Unlike simple "debug print" technique, where users insert print functions
17 at every function to be inspected, we can show the information
18 of all functions involved with single ``with`` statement.
19
20 Further, this hook enables us to show the information of
21 ``backward`` methods without inserting print functions into
22 Chainer's library code.
23
24 Args:
25 sep: *(deprecated since v4.0.0)* Ignored.
26 end: Character to be added at the end of print function.
27 file: Output file_like object that that redirect to.
28 flush: If ``True``, this hook forcibly flushes the text stream
29 at the end of preprocessing.
30
31 .. admonition:: Example
32
33 The basic usage is to use it with ``with`` statement.
34
35 >>> from chainer import function_hooks
36 >>> l = L.Linear(10, 10)
37 >>> x = chainer.Variable(np.zeros((1, 10), np.float32))
38 >>> with chainer.function_hooks.PrintHook():
39 ... y = l(x)
40 ... z = F.sum(y)
41 ... z.backward() # doctest:+SKIP
42
43 In this example, ``PrintHook`` shows the debug information of
44 forward propagation of ``LinearFunction`` (which is implicitly
45 called by ``l``) and ``Sum`` (called by ``F.sum``)
46 and backward propagation of ``z`` and ``y``.
47
48 """
49
50 name = 'PrintHook'
51
52 def __init__(self, sep=None, end='\n', file=sys.stdout, flush=True):
53 if sep is not None:
54 warnings.warn('sep argument in chainer.function_hooks.PrintHook '
55 'is deprecated.', DeprecationWarning)
56 self.sep = sep # Keep sep because it was originally documented
57 self.end = end
58 self.file = file
59 self.flush = flush
60
61 def _print(self, msg):
62 self.file.write(msg + self.end)
63
64 def _process(self, function, in_data, out_grad=None):
65 self._print('function\t{}'.format(function.label))
66 self._print('input data')
67 for d in in_data:
68 if d is None:
69 # Some inputs can be removed with `retain_grad`.
70 self._print('(removed)')
71 continue
72 self._print(variable.Variable(d).debug_print())
73 if out_grad is not None:
74 self._print('output gradient')
75 for d in out_grad:
76 xp = cuda.get_array_module(d)
77 v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))
78 v.grad = d
79 self._print(v.debug_print())
80 if self.flush:
81 self.file.flush()
82
83 def forward_preprocess(self, function, in_data):
84 self._process(function, in_data)
85
86 def backward_preprocess(self, function, in_data, out_grad):
87 self._process(function, in_data, out_grad)
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/chainer/function_hooks/debug_print.py b/chainer/function_hooks/debug_print.py
--- a/chainer/function_hooks/debug_print.py
+++ b/chainer/function_hooks/debug_print.py
@@ -73,9 +73,12 @@
if out_grad is not None:
self._print('output gradient')
for d in out_grad:
- xp = cuda.get_array_module(d)
- v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))
- v.grad = d
+ if d is None:
+ v = variable.Variable()
+ else:
+ xp = cuda.get_array_module(d)
+ v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))
+ v.grad = d
self._print(v.debug_print())
if self.flush:
self.file.flush()
| {"golden_diff": "diff --git a/chainer/function_hooks/debug_print.py b/chainer/function_hooks/debug_print.py\n--- a/chainer/function_hooks/debug_print.py\n+++ b/chainer/function_hooks/debug_print.py\n@@ -73,9 +73,12 @@\n if out_grad is not None:\n self._print('output gradient')\n for d in out_grad:\n- xp = cuda.get_array_module(d)\n- v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))\n- v.grad = d\n+ if d is None:\n+ v = variable.Variable()\n+ else:\n+ xp = cuda.get_array_module(d)\n+ v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))\n+ v.grad = d\n self._print(v.debug_print())\n if self.flush:\n self.file.flush()\n", "issue": "PrintHook may fail\n* Conditions\r\n```\r\n>>> chainer.print_runtime_info()\r\nPlatform: Darwin-17.7.0-x86_64-i386-64bit\r\nChainer: 5.0.0b4\r\nNumPy: 1.16.0.dev0+591880b\r\nCuPy: Not Available\r\n```\r\n* Code to reproduce\r\n```\r\n>>> x = chainer.Variable(np.random.rand(2, 3).astype('f'))\r\n>>> lstm = chainer.links.LSTM(3)\r\n>>> y = lstm(x)\r\n>>> with chainer.function_hooks.PrintHook():\r\n... chainer.functions.sum(y).backward()\r\n...\r\n```\r\n\r\n* Error messages, stack traces, or logs\r\n```\r\nfunction\tSum\r\ninput data\r\n<variable at 0x1152b3438>\r\n- device: CPU\r\n- backend: <class 'numpy.ndarray'>\r\n- shape: (2, 3)\r\n- dtype: float32\r\n- statistics: mean=-0.08529118, std=0.18154962\r\n- grad: None\r\nfunction\tSum\r\ninput data\r\n(removed)\r\noutput gradient\r\n<variable at 0x1152b33c8>\r\n- device: CPU\r\n- backend: <class 'numpy.ndarray'>\r\n- shape: ()\r\n- dtype: float32\r\n- statistics: mean=0.00000000, std=0.00000000\r\n- grad: mean=1.00000000, std=0.00000000\r\nfunction\tBroadcastTo\r\ninput data\r\n<variable at 0x1152b3710>\r\n- device: CPU\r\n- backend: <class 'numpy.ndarray'>\r\n- shape: ()\r\n- dtype: float32\r\n- statistics: mean=1.00000000, std=0.00000000\r\n- grad: None\r\nfunction\tLSTM\r\ninput data\r\n<variable at 0x1152b3668>\r\n- device: CPU\r\n- backend: <class 'numpy.ndarray'>\r\n- shape: (2, 3)\r\n- dtype: float32\r\n- statistics: mean=0.00000000, std=0.00000000\r\n- grad: None\r\n<variable at 0x1152b3668>\r\n- device: CPU\r\n- backend: <class 'numpy.ndarray'>\r\n- shape: (2, 12)\r\n- dtype: float32\r\n- statistics: mean=0.53247088, std=0.91164017\r\n- grad: None\r\noutput gradient\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 2, in <module>\r\n File \"/Users/tos/GitHub/chainer/chainer/variable.py\", line 963, in backward\r\n self._backward_main(retain_grad, loss_scale)\r\n File \"/Users/tos/GitHub/chainer/chainer/variable.py\", line 1025, in _backward_main\r\n hook.backward_preprocess(func, in_data, out_grad_data)\r\n File \"/Users/tos/GitHub/chainer/chainer/function_hooks/debug_print.py\", line 87, in backward_preprocess\r\n self._process(function, in_data, out_grad)\r\n File \"/Users/tos/GitHub/chainer/chainer/function_hooks/debug_print.py\", line 77, in _process\r\n v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))\r\nAttributeError: 'NoneType' object has no attribute 'dtype'\r\n```\n", "before_files": [{"content": "import sys\nimport warnings\n\nfrom chainer.backends import cuda\nfrom chainer import function_hook\nfrom chainer import variable\n\n\nclass PrintHook(function_hook.FunctionHook):\n \"\"\"Function hook that prints debug information.\n\n This function hook outputs the debug information of input arguments of\n ``forward`` and ``backward`` methods involved in the hooked functions\n at preprocessing time (that is, just before each method is called).\n\n Unlike simple \"debug 
print\" technique, where users insert print functions\n at every function to be inspected, we can show the information\n of all functions involved with single ``with`` statement.\n\n Further, this hook enables us to show the information of\n ``backward`` methods without inserting print functions into\n Chainer's library code.\n\n Args:\n sep: *(deprecated since v4.0.0)* Ignored.\n end: Character to be added at the end of print function.\n file: Output file_like object that that redirect to.\n flush: If ``True``, this hook forcibly flushes the text stream\n at the end of preprocessing.\n\n .. admonition:: Example\n\n The basic usage is to use it with ``with`` statement.\n\n >>> from chainer import function_hooks\n >>> l = L.Linear(10, 10)\n >>> x = chainer.Variable(np.zeros((1, 10), np.float32))\n >>> with chainer.function_hooks.PrintHook():\n ... y = l(x)\n ... z = F.sum(y)\n ... z.backward() # doctest:+SKIP\n\n In this example, ``PrintHook`` shows the debug information of\n forward propagation of ``LinearFunction`` (which is implicitly\n called by ``l``) and ``Sum`` (called by ``F.sum``)\n and backward propagation of ``z`` and ``y``.\n\n \"\"\"\n\n name = 'PrintHook'\n\n def __init__(self, sep=None, end='\\n', file=sys.stdout, flush=True):\n if sep is not None:\n warnings.warn('sep argument in chainer.function_hooks.PrintHook '\n 'is deprecated.', DeprecationWarning)\n self.sep = sep # Keep sep because it was originally documented\n self.end = end\n self.file = file\n self.flush = flush\n\n def _print(self, msg):\n self.file.write(msg + self.end)\n\n def _process(self, function, in_data, out_grad=None):\n self._print('function\\t{}'.format(function.label))\n self._print('input data')\n for d in in_data:\n if d is None:\n # Some inputs can be removed with `retain_grad`.\n self._print('(removed)')\n continue\n self._print(variable.Variable(d).debug_print())\n if out_grad is not None:\n self._print('output gradient')\n for d in out_grad:\n xp = cuda.get_array_module(d)\n v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))\n v.grad = d\n self._print(v.debug_print())\n if self.flush:\n self.file.flush()\n\n def forward_preprocess(self, function, in_data):\n self._process(function, in_data)\n\n def backward_preprocess(self, function, in_data, out_grad):\n self._process(function, in_data, out_grad)\n", "path": "chainer/function_hooks/debug_print.py"}], "after_files": [{"content": "import sys\nimport warnings\n\nfrom chainer.backends import cuda\nfrom chainer import function_hook\nfrom chainer import variable\n\n\nclass PrintHook(function_hook.FunctionHook):\n \"\"\"Function hook that prints debug information.\n\n This function hook outputs the debug information of input arguments of\n ``forward`` and ``backward`` methods involved in the hooked functions\n at preprocessing time (that is, just before each method is called).\n\n Unlike simple \"debug print\" technique, where users insert print functions\n at every function to be inspected, we can show the information\n of all functions involved with single ``with`` statement.\n\n Further, this hook enables us to show the information of\n ``backward`` methods without inserting print functions into\n Chainer's library code.\n\n Args:\n sep: *(deprecated since v4.0.0)* Ignored.\n end: Character to be added at the end of print function.\n file: Output file_like object that that redirect to.\n flush: If ``True``, this hook forcibly flushes the text stream\n at the end of preprocessing.\n\n .. 
admonition:: Example\n\n The basic usage is to use it with ``with`` statement.\n\n >>> from chainer import function_hooks\n >>> l = L.Linear(10, 10)\n >>> x = chainer.Variable(np.zeros((1, 10), np.float32))\n >>> with chainer.function_hooks.PrintHook():\n ... y = l(x)\n ... z = F.sum(y)\n ... z.backward() # doctest:+SKIP\n\n In this example, ``PrintHook`` shows the debug information of\n forward propagation of ``LinearFunction`` (which is implicitly\n called by ``l``) and ``Sum`` (called by ``F.sum``)\n and backward propagation of ``z`` and ``y``.\n\n \"\"\"\n\n name = 'PrintHook'\n\n def __init__(self, sep=None, end='\\n', file=sys.stdout, flush=True):\n if sep is not None:\n warnings.warn('sep argument in chainer.function_hooks.PrintHook '\n 'is deprecated.', DeprecationWarning)\n self.sep = sep # Keep sep because it was originally documented\n self.end = end\n self.file = file\n self.flush = flush\n\n def _print(self, msg):\n self.file.write(msg + self.end)\n\n def _process(self, function, in_data, out_grad=None):\n self._print('function\\t{}'.format(function.label))\n self._print('input data')\n for d in in_data:\n if d is None:\n # Some inputs can be removed with `retain_grad`.\n self._print('(removed)')\n continue\n self._print(variable.Variable(d).debug_print())\n if out_grad is not None:\n self._print('output gradient')\n for d in out_grad:\n if d is None:\n v = variable.Variable()\n else:\n xp = cuda.get_array_module(d)\n v = variable.Variable(xp.zeros_like(d, dtype=d.dtype))\n v.grad = d\n self._print(v.debug_print())\n if self.flush:\n self.file.flush()\n\n def forward_preprocess(self, function, in_data):\n self._process(function, in_data)\n\n def backward_preprocess(self, function, in_data, out_grad):\n self._process(function, in_data, out_grad)\n", "path": "chainer/function_hooks/debug_print.py"}]} | 1,958 | 178 |
gh_patches_debug_18139 | rasdani/github-patches | git_diff | Qiskit__qiskit-10411 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Deepcopy of control flow with standalone clbit results in invalid circuit
### Environment
- **Qiskit Terra version**: 0.24.1 (but also on main)
- **Python version**: 3.11
- **Operating system**: Linux
### What is happening?
When running a deepcopy of a circuit object using a clbit as a standalone bit as the condition for a control flow operation results in the deepcopied circuit being invalid so that qasm3 or qpy serialization, and visualization will fail because the internal state of the circuit is corrupted and not valid in the copy.
### How can we reproduce the issue?
```python
import copy
from qiskit.circuit import QuantumCircuit, Qubit, Clbit, ClassicalRegister, QuantumRegister, Gate, Parameter
bits = [Qubit(), Qubit(), Clbit()]
qc = QuantumCircuit(bits)
with qc.if_test((qc.clbits[0], 1)):
qc.x(0)
copy_qc = copy.deepcopy(qc)
print(copy_qc)
```
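
A small probe, continuing from the reproduction above, makes the corruption visible without printing. This is an illustrative sketch added here for clarity (it assumes the `CircuitInstruction`-style `.data` access of recent Terra releases), not part of the original report:

```python
# Continuing from the snippet above: inspect the copied IfElseOp's condition.
copied_if = copy_qc.data[0].operation
cond_bit, _value = copied_if.condition

# On affected versions this prints False: the condition references a Clbit
# object that is not among copy_qc's clbits, so QPY/QASM3 export and drawing
# later fail when they try to look the bit up.
print(cond_bit in copy_qc.clbits)
```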
### What should happen?
It should print the visualization of the circuit. Instead, the copy raises an exception (stack trace) during the circuit-to-DAG conversion. You can also replace `print(copy_qc)` with something like:
```
import io
from qiskit.qpy import dump
with io.BytesIO() as fd:
dump(copy_qc, fd)
```
This will also fail when looking up the clbit.
### Any suggestions?
I think the special `__deepcopy__` handling in the `qiskit.circuit.Instruction` class is not valid for the `ControlFlowOp` when the `.condition` attribute is a clbit. It might just be simplest to revert to the default deepcopy behavior for `ControlFlowOp`.
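
For orientation, the patch shown later in this record takes a related route: it makes `Bit` copies preserve identity. A minimal sketch of that idea follows; it is illustrative only and is not the verbatim upstream change:

```python
import copy


class IdentityCopyBit:
    """Toy stand-in for a register-less bit that survives copying intact."""

    def __copy__(self):
        # Bits are immutable, so a shallow copy can be the object itself.
        return self

    def __deepcopy__(self, memo=None):
        # Returning self keeps clbit lists and control-flow conditions
        # pointing at the very same object after copy.deepcopy.
        return self


b = IdentityCopyBit()
assert copy.deepcopy([b])[0] is b
```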
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qiskit/circuit/bit.py`
Content:
```
1 # This code is part of Qiskit.
2 #
3 # (C) Copyright IBM 2019.
4 #
5 # This code is licensed under the Apache License, Version 2.0. You may
6 # obtain a copy of this license in the LICENSE.txt file in the root directory
7 # of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.
8 #
9 # Any modifications or derivative works of this code must retain this
10 # copyright notice, and modified files need to carry a notice indicating
11 # that they have been altered from the originals.
12
13 """
14 Quantum bit and Classical bit objects.
15 """
16
17 from qiskit.circuit.exceptions import CircuitError
18 from qiskit.utils.deprecation import deprecate_func
19
20
21 class Bit:
22 """Implement a generic bit.
23
24 .. note::
25 This class should not be instantiated directly. This is just a superclass
26 for :class:`~.Clbit` and :class:`~.Qubit`.
27
28 """
29
30 __slots__ = {"_register", "_index", "_hash", "_repr"}
31
32 def __init__(self, register=None, index=None):
33 """Create a new generic bit."""
34 if (register, index) == (None, None):
35 self._register = None
36 self._index = None
37 # To sidestep the overridden Bit.__hash__ and use the default hash
38 # algorithm (only new-style Bits), call default object hash method.
39 self._hash = object.__hash__(self)
40 else:
41 try:
42 index = int(index)
43 except Exception as ex:
44 raise CircuitError(
45 f"index needs to be castable to an int: type {type(index)} was provided"
46 ) from ex
47
48 if index < 0:
49 index += register.size
50
51 if index >= register.size:
52 raise CircuitError(
53 f"index must be under the size of the register: {index} was provided"
54 )
55
56 self._register = register
57 self._index = index
58 self._hash = hash((self._register, self._index))
59 self._repr = f"{self.__class__.__name__}({self._register}, {self._index})"
60
61 @property
62 @deprecate_func(
63 is_property=True,
64 since="0.17",
65 additional_msg=(
66 "Instead, use :meth:`~qiskit.circuit.quantumcircuit.QuantumCircuit.find_bit` to find "
67 "all the containing registers within a circuit and the index of the bit within the "
68 "circuit."
69 ),
70 )
71 def register(self): # pylint: disable=bad-docstring-quotes
72 """Get the register of an old-style bit.
73
74 In modern Qiskit Terra (version 0.17+), bits are the fundamental object and registers are
75 aliases to collections of bits. A bit can be in many registers depending on the circuit, so
76 a single containing register is no longer a property of a bit. It is an error to access
77 this attribute on bits that were not constructed as "owned" by a register."""
78 if (self._register, self._index) == (None, None):
79 raise CircuitError("Attempt to query register of a new-style Bit.")
80
81 return self._register
82
83 @property
84 @deprecate_func(
85 is_property=True,
86 since="0.17",
87 additional_msg=(
88 "Instead, use :meth:`~qiskit.circuit.quantumcircuit.QuantumCircuit.find_bit` to find "
89 "all the containing registers within a circuit and the index of the bit within the "
90 "circuit."
91 ),
92 )
93 def index(self): # pylint: disable=bad-docstring-quotes
94 """Get the index of an old-style bit in the register that owns it.
95
96 In modern Qiskit Terra (version 0.17+), bits are the fundamental object and registers are
97 aliases to collections of bits. A bit can be in many registers depending on the circuit, so
98 a single containing register is no longer a property of a bit. It is an error to access
99 this attribute on bits that were not constructed as "owned" by a register."""
100 if (self._register, self._index) == (None, None):
101 raise CircuitError("Attempt to query index of a new-style Bit.")
102
103 return self._index
104
105 def __repr__(self):
106 """Return the official string representing the bit."""
107 if (self._register, self._index) == (None, None):
108 # Similar to __hash__, use default repr method for new-style Bits.
109 return object.__repr__(self)
110 return self._repr
111
112 def __hash__(self):
113 return self._hash
114
115 def __eq__(self, other):
116 if (self._register, self._index) == (None, None):
117 return other is self
118
119 try:
120 return self._repr == other._repr
121 except AttributeError:
122 return False
123
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/qiskit/circuit/bit.py b/qiskit/circuit/bit.py
--- a/qiskit/circuit/bit.py
+++ b/qiskit/circuit/bit.py
@@ -13,6 +13,7 @@
"""
Quantum bit and Classical bit objects.
"""
+import copy
from qiskit.circuit.exceptions import CircuitError
from qiskit.utils.deprecation import deprecate_func
@@ -120,3 +121,20 @@
return self._repr == other._repr
except AttributeError:
return False
+
+ def __copy__(self):
+ # Bits are immutable.
+ return self
+
+ def __deepcopy__(self, memo=None):
+ if (self._register, self._index) == (None, None):
+ return self
+
+ # Old-style bits need special handling for now, since some code seems
+ # to rely on their registers getting deep-copied.
+ bit = type(self).__new__(type(self))
+ bit._register = copy.deepcopy(self._register, memo)
+ bit._index = self._index
+ bit._hash = self._hash
+ bit._repr = self._repr
+ return bit
| {"golden_diff": "diff --git a/qiskit/circuit/bit.py b/qiskit/circuit/bit.py\n--- a/qiskit/circuit/bit.py\n+++ b/qiskit/circuit/bit.py\n@@ -13,6 +13,7 @@\n \"\"\"\n Quantum bit and Classical bit objects.\n \"\"\"\n+import copy\n \n from qiskit.circuit.exceptions import CircuitError\n from qiskit.utils.deprecation import deprecate_func\n@@ -120,3 +121,20 @@\n return self._repr == other._repr\n except AttributeError:\n return False\n+\n+ def __copy__(self):\n+ # Bits are immutable.\n+ return self\n+\n+ def __deepcopy__(self, memo=None):\n+ if (self._register, self._index) == (None, None):\n+ return self\n+\n+ # Old-style bits need special handling for now, since some code seems\n+ # to rely on their registers getting deep-copied.\n+ bit = type(self).__new__(type(self))\n+ bit._register = copy.deepcopy(self._register, memo)\n+ bit._index = self._index\n+ bit._hash = self._hash\n+ bit._repr = self._repr\n+ return bit\n", "issue": "Deepcopy of control flow with standalone clbit results in invalid circuit\n### Environment\n\n- **Qiskit Terra version**: 0.24.1 (but also on main)\r\n- **Python version**: 3.11\r\n- **Operating system**: Linux\r\n\n\n### What is happening?\n\nWhen running a deepcopy of a circuit object using a clbit as a standalone bit as the condition for a control flow operation results in the deepcopied circuit being invalid so that qasm3 or qpy serialization, and visualization will fail because the internal state of the circuit is corrupted and not valid in the copy.\n\n### How can we reproduce the issue?\n\n```python\r\nimport copy\r\n\r\nfrom qiskit.circuit import QuantumCircuit, Qubit, Clbit, ClassicalRegister, QuantumRegister, Gate, Parameter\r\n\r\nbits = [Qubit(), Qubit(), Clbit()]\r\nqc = QuantumCircuit(bits)\r\nwith qc.if_test((qc.clbits[0], 1)):\r\n qc.x(0)\r\n\r\ncopy_qc = copy.deepcopy(qc)\r\nprint(copy_qc)\r\n```\n\n### What should happen?\n\nIt should print the visualization of the circuit. This will stack trace though during the circuit to dag conversion. You can also replace `print(copy_qc)` with something like:\r\n\r\n```\r\nimport io\r\nfrom qiskit.qpy import dump\r\n\r\nwith io.BytesIO() as fd:\r\n dump(copy_qc, fd)\r\n```\r\nthis will also fail on a lookup of the clbit.\n\n### Any suggestions?\n\nI think the special `__deepcopy__` handling in the `qiskit.circuit.Instruction` class is not valid for the `ControlFlowOp` when the `.condition` attribute is a clbit. It might just be simplest to revert to the default deepcopy behavior for `ControlFlowOp`.\n", "before_files": [{"content": "# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2019.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"\nQuantum bit and Classical bit objects.\n\"\"\"\n\nfrom qiskit.circuit.exceptions import CircuitError\nfrom qiskit.utils.deprecation import deprecate_func\n\n\nclass Bit:\n \"\"\"Implement a generic bit.\n\n .. note::\n This class should not be instantiated directly. 
This is just a superclass\n for :class:`~.Clbit` and :class:`~.Qubit`.\n\n \"\"\"\n\n __slots__ = {\"_register\", \"_index\", \"_hash\", \"_repr\"}\n\n def __init__(self, register=None, index=None):\n \"\"\"Create a new generic bit.\"\"\"\n if (register, index) == (None, None):\n self._register = None\n self._index = None\n # To sidestep the overridden Bit.__hash__ and use the default hash\n # algorithm (only new-style Bits), call default object hash method.\n self._hash = object.__hash__(self)\n else:\n try:\n index = int(index)\n except Exception as ex:\n raise CircuitError(\n f\"index needs to be castable to an int: type {type(index)} was provided\"\n ) from ex\n\n if index < 0:\n index += register.size\n\n if index >= register.size:\n raise CircuitError(\n f\"index must be under the size of the register: {index} was provided\"\n )\n\n self._register = register\n self._index = index\n self._hash = hash((self._register, self._index))\n self._repr = f\"{self.__class__.__name__}({self._register}, {self._index})\"\n\n @property\n @deprecate_func(\n is_property=True,\n since=\"0.17\",\n additional_msg=(\n \"Instead, use :meth:`~qiskit.circuit.quantumcircuit.QuantumCircuit.find_bit` to find \"\n \"all the containing registers within a circuit and the index of the bit within the \"\n \"circuit.\"\n ),\n )\n def register(self): # pylint: disable=bad-docstring-quotes\n \"\"\"Get the register of an old-style bit.\n\n In modern Qiskit Terra (version 0.17+), bits are the fundamental object and registers are\n aliases to collections of bits. A bit can be in many registers depending on the circuit, so\n a single containing register is no longer a property of a bit. It is an error to access\n this attribute on bits that were not constructed as \"owned\" by a register.\"\"\"\n if (self._register, self._index) == (None, None):\n raise CircuitError(\"Attempt to query register of a new-style Bit.\")\n\n return self._register\n\n @property\n @deprecate_func(\n is_property=True,\n since=\"0.17\",\n additional_msg=(\n \"Instead, use :meth:`~qiskit.circuit.quantumcircuit.QuantumCircuit.find_bit` to find \"\n \"all the containing registers within a circuit and the index of the bit within the \"\n \"circuit.\"\n ),\n )\n def index(self): # pylint: disable=bad-docstring-quotes\n \"\"\"Get the index of an old-style bit in the register that owns it.\n\n In modern Qiskit Terra (version 0.17+), bits are the fundamental object and registers are\n aliases to collections of bits. A bit can be in many registers depending on the circuit, so\n a single containing register is no longer a property of a bit. 
It is an error to access\n this attribute on bits that were not constructed as \"owned\" by a register.\"\"\"\n if (self._register, self._index) == (None, None):\n raise CircuitError(\"Attempt to query index of a new-style Bit.\")\n\n return self._index\n\n def __repr__(self):\n \"\"\"Return the official string representing the bit.\"\"\"\n if (self._register, self._index) == (None, None):\n # Similar to __hash__, use default repr method for new-style Bits.\n return object.__repr__(self)\n return self._repr\n\n def __hash__(self):\n return self._hash\n\n def __eq__(self, other):\n if (self._register, self._index) == (None, None):\n return other is self\n\n try:\n return self._repr == other._repr\n except AttributeError:\n return False\n", "path": "qiskit/circuit/bit.py"}], "after_files": [{"content": "# This code is part of Qiskit.\n#\n# (C) Copyright IBM 2019.\n#\n# This code is licensed under the Apache License, Version 2.0. You may\n# obtain a copy of this license in the LICENSE.txt file in the root directory\n# of this source tree or at http://www.apache.org/licenses/LICENSE-2.0.\n#\n# Any modifications or derivative works of this code must retain this\n# copyright notice, and modified files need to carry a notice indicating\n# that they have been altered from the originals.\n\n\"\"\"\nQuantum bit and Classical bit objects.\n\"\"\"\nimport copy\n\nfrom qiskit.circuit.exceptions import CircuitError\nfrom qiskit.utils.deprecation import deprecate_func\n\n\nclass Bit:\n \"\"\"Implement a generic bit.\n\n .. note::\n This class should not be instantiated directly. This is just a superclass\n for :class:`~.Clbit` and :class:`~.Qubit`.\n\n \"\"\"\n\n __slots__ = {\"_register\", \"_index\", \"_hash\", \"_repr\"}\n\n def __init__(self, register=None, index=None):\n \"\"\"Create a new generic bit.\"\"\"\n if (register, index) == (None, None):\n self._register = None\n self._index = None\n # To sidestep the overridden Bit.__hash__ and use the default hash\n # algorithm (only new-style Bits), call default object hash method.\n self._hash = object.__hash__(self)\n else:\n try:\n index = int(index)\n except Exception as ex:\n raise CircuitError(\n f\"index needs to be castable to an int: type {type(index)} was provided\"\n ) from ex\n\n if index < 0:\n index += register.size\n\n if index >= register.size:\n raise CircuitError(\n f\"index must be under the size of the register: {index} was provided\"\n )\n\n self._register = register\n self._index = index\n self._hash = hash((self._register, self._index))\n self._repr = f\"{self.__class__.__name__}({self._register}, {self._index})\"\n\n @property\n @deprecate_func(\n is_property=True,\n since=\"0.17\",\n additional_msg=(\n \"Instead, use :meth:`~qiskit.circuit.quantumcircuit.QuantumCircuit.find_bit` to find \"\n \"all the containing registers within a circuit and the index of the bit within the \"\n \"circuit.\"\n ),\n )\n def register(self): # pylint: disable=bad-docstring-quotes\n \"\"\"Get the register of an old-style bit.\n\n In modern Qiskit Terra (version 0.17+), bits are the fundamental object and registers are\n aliases to collections of bits. A bit can be in many registers depending on the circuit, so\n a single containing register is no longer a property of a bit. 
It is an error to access\n this attribute on bits that were not constructed as \"owned\" by a register.\"\"\"\n if (self._register, self._index) == (None, None):\n raise CircuitError(\"Attempt to query register of a new-style Bit.\")\n\n return self._register\n\n @property\n @deprecate_func(\n is_property=True,\n since=\"0.17\",\n additional_msg=(\n \"Instead, use :meth:`~qiskit.circuit.quantumcircuit.QuantumCircuit.find_bit` to find \"\n \"all the containing registers within a circuit and the index of the bit within the \"\n \"circuit.\"\n ),\n )\n def index(self): # pylint: disable=bad-docstring-quotes\n \"\"\"Get the index of an old-style bit in the register that owns it.\n\n In modern Qiskit Terra (version 0.17+), bits are the fundamental object and registers are\n aliases to collections of bits. A bit can be in many registers depending on the circuit, so\n a single containing register is no longer a property of a bit. It is an error to access\n this attribute on bits that were not constructed as \"owned\" by a register.\"\"\"\n if (self._register, self._index) == (None, None):\n raise CircuitError(\"Attempt to query index of a new-style Bit.\")\n\n return self._index\n\n def __repr__(self):\n \"\"\"Return the official string representing the bit.\"\"\"\n if (self._register, self._index) == (None, None):\n # Similar to __hash__, use default repr method for new-style Bits.\n return object.__repr__(self)\n return self._repr\n\n def __hash__(self):\n return self._hash\n\n def __eq__(self, other):\n if (self._register, self._index) == (None, None):\n return other is self\n\n try:\n return self._repr == other._repr\n except AttributeError:\n return False\n\n def __copy__(self):\n # Bits are immutable.\n return self\n\n def __deepcopy__(self, memo=None):\n if (self._register, self._index) == (None, None):\n return self\n\n # Old-style bits need special handling for now, since some code seems\n # to rely on their registers getting deep-copied.\n bit = type(self).__new__(type(self))\n bit._register = copy.deepcopy(self._register, memo)\n bit._index = self._index\n bit._hash = self._hash\n bit._repr = self._repr\n return bit\n", "path": "qiskit/circuit/bit.py"}]} | 1,978 | 272 |
gh_patches_debug_4004 | rasdani/github-patches | git_diff | tobymao__sqlglot-552 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
cast to map with clickhouse dialect ignores case
Yet another ClickHouse case-sensitivity bug :)
ClickHouse CLI:
```
albatross :) select cast(map('a', 1, 'b', 2) as Map(String, Int8));
SELECT CAST(map('a', 1, 'b', 2), 'Map(String, Int8)')
Query id: bab188fb-69f7-4c0f-9bde-ac524a08f6e8
┌─CAST(map('a', 1, 'b', 2), 'Map(String, Int8)')─┐
│ {'a':1,'b':2} │
└────────────────────────────────────────────────┘
1 row in set. Elapsed: 0.001 sec.
```
sqlglot:
```
In [7]: sg.parse_one("cast(map('a', 1, 'b', 2) as Map(String, Int8))", read="clickhouse").sql(dialect="clickhouse")
Out[7]: "CAST(map('a', 1, 'b', 2) AS MAP(TEXT, TINYINT))"
```
Perhaps there's a way to disable all case transformations for the ClickHouse dialect.
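
The patch below addresses this by adding the ClickHouse spellings to the generator's `TYPE_MAPPING`. Until such a fix lands, a user-side stopgap along the same lines might look like the sketch here (an assumption-laden example; the exact attribute layout of your sqlglot version may differ):

```python
from sqlglot import exp
from sqlglot.dialects.clickhouse import ClickHouse

# Map the affected composite types to ClickHouse's case-sensitive names so
# generated SQL such as CAST(... AS Map(String, Int8)) round-trips correctly.
ClickHouse.Generator.TYPE_MAPPING = {
    **ClickHouse.Generator.TYPE_MAPPING,
    exp.DataType.Type.MAP: "Map",
    exp.DataType.Type.ARRAY: "Array",
    exp.DataType.Type.STRUCT: "Tuple",
}
```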
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sqlglot/dialects/clickhouse.py`
Content:
```
1 from sqlglot import exp
2 from sqlglot.dialects.dialect import Dialect, inline_array_sql, var_map_sql
3 from sqlglot.generator import Generator
4 from sqlglot.helper import csv
5 from sqlglot.parser import Parser, parse_var_map
6 from sqlglot.tokens import Tokenizer, TokenType
7
8
9 def _lower_func(sql):
10 index = sql.index("(")
11 return sql[:index].lower() + sql[index:]
12
13
14 class ClickHouse(Dialect):
15 normalize_functions = None
16 null_ordering = "nulls_are_last"
17
18 class Tokenizer(Tokenizer):
19 IDENTIFIERS = ['"', "`"]
20
21 KEYWORDS = {
22 **Tokenizer.KEYWORDS,
23 "NULLABLE": TokenType.NULLABLE,
24 "FINAL": TokenType.FINAL,
25 "DATETIME64": TokenType.DATETIME,
26 "INT8": TokenType.TINYINT,
27 "INT16": TokenType.SMALLINT,
28 "INT32": TokenType.INT,
29 "INT64": TokenType.BIGINT,
30 "FLOAT32": TokenType.FLOAT,
31 "FLOAT64": TokenType.DOUBLE,
32 }
33
34 class Parser(Parser):
35 FUNCTIONS = {
36 **Parser.FUNCTIONS,
37 "MAP": parse_var_map,
38 }
39
40 def _parse_table(self, schema=False):
41 this = super()._parse_table(schema)
42
43 if self._match(TokenType.FINAL):
44 this = self.expression(exp.Final, this=this)
45
46 return this
47
48 class Generator(Generator):
49 STRUCT_DELIMITER = ("(", ")")
50
51 TYPE_MAPPING = {
52 **Generator.TYPE_MAPPING,
53 exp.DataType.Type.NULLABLE: "Nullable",
54 exp.DataType.Type.DATETIME: "DateTime64",
55 }
56
57 TRANSFORMS = {
58 **Generator.TRANSFORMS,
59 exp.Array: inline_array_sql,
60 exp.StrPosition: lambda self, e: f"position({csv(self.sql(e, 'this'), self.sql(e, 'substr'), self.sql(e, 'position'))})",
61 exp.Final: lambda self, e: f"{self.sql(e, 'this')} FINAL",
62 exp.Map: lambda self, e: _lower_func(var_map_sql(self, e)),
63 exp.VarMap: lambda self, e: _lower_func(var_map_sql(self, e)),
64 }
65
66 EXPLICIT_UNION = True
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sqlglot/dialects/clickhouse.py b/sqlglot/dialects/clickhouse.py
--- a/sqlglot/dialects/clickhouse.py
+++ b/sqlglot/dialects/clickhouse.py
@@ -51,6 +51,9 @@
**Generator.TYPE_MAPPING,
exp.DataType.Type.NULLABLE: "Nullable",
exp.DataType.Type.DATETIME: "DateTime64",
+ exp.DataType.Type.MAP: "Map",
+ exp.DataType.Type.ARRAY: "Array",
+ exp.DataType.Type.STRUCT: "Tuple",
}
TRANSFORMS = {
| {"golden_diff": "diff --git a/sqlglot/dialects/clickhouse.py b/sqlglot/dialects/clickhouse.py\n--- a/sqlglot/dialects/clickhouse.py\n+++ b/sqlglot/dialects/clickhouse.py\n@@ -51,6 +51,9 @@\n **Generator.TYPE_MAPPING,\n exp.DataType.Type.NULLABLE: \"Nullable\",\n exp.DataType.Type.DATETIME: \"DateTime64\",\n+ exp.DataType.Type.MAP: \"Map\",\n+ exp.DataType.Type.ARRAY: \"Array\",\n+ exp.DataType.Type.STRUCT: \"Tuple\",\n }\n \n TRANSFORMS = {\n", "issue": "cast to map with clickhouse dialect ignores case\nYet another clickhouse case sensitivity bug :)\r\n\r\n\r\nClickHouse CLI:\r\n\r\n```\r\nalbatross :) select cast(map('a', 1, 'b', 2) as Map(String, Int8));\r\n\r\nSELECT CAST(map('a', 1, 'b', 2), 'Map(String, Int8)')\r\n\r\nQuery id: bab188fb-69f7-4c0f-9bde-ac524a08f6e8\r\n\r\n\u250c\u2500CAST(map('a', 1, 'b', 2), 'Map(String, Int8)')\u2500\u2510\r\n\u2502 {'a':1,'b':2} \u2502\r\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\r\n\r\n1 row in set. Elapsed: 0.001 sec.\r\n```\r\n\r\nsqlglot:\r\n\r\n```\r\nIn [7]: sg.parse_one(\"cast(map('a', 1, 'b', 2) as Map(String, Int8))\", read=\"clickhouse\").sql(dialect=\"clickhouse\")\r\nOut[7]: \"CAST(map('a', 1, 'b', 2) AS MAP(TEXT, TINYINT))\"\r\n```\r\n\r\nPerhaps there's a way to disable all case transformations for the clickhouse dialect.\n", "before_files": [{"content": "from sqlglot import exp\nfrom sqlglot.dialects.dialect import Dialect, inline_array_sql, var_map_sql\nfrom sqlglot.generator import Generator\nfrom sqlglot.helper import csv\nfrom sqlglot.parser import Parser, parse_var_map\nfrom sqlglot.tokens import Tokenizer, TokenType\n\n\ndef _lower_func(sql):\n index = sql.index(\"(\")\n return sql[:index].lower() + sql[index:]\n\n\nclass ClickHouse(Dialect):\n normalize_functions = None\n null_ordering = \"nulls_are_last\"\n\n class Tokenizer(Tokenizer):\n IDENTIFIERS = ['\"', \"`\"]\n\n KEYWORDS = {\n **Tokenizer.KEYWORDS,\n \"NULLABLE\": TokenType.NULLABLE,\n \"FINAL\": TokenType.FINAL,\n \"DATETIME64\": TokenType.DATETIME,\n \"INT8\": TokenType.TINYINT,\n \"INT16\": TokenType.SMALLINT,\n \"INT32\": TokenType.INT,\n \"INT64\": TokenType.BIGINT,\n \"FLOAT32\": TokenType.FLOAT,\n \"FLOAT64\": TokenType.DOUBLE,\n }\n\n class Parser(Parser):\n FUNCTIONS = {\n **Parser.FUNCTIONS,\n \"MAP\": parse_var_map,\n }\n\n def _parse_table(self, schema=False):\n this = super()._parse_table(schema)\n\n if self._match(TokenType.FINAL):\n this = self.expression(exp.Final, this=this)\n\n return this\n\n class Generator(Generator):\n STRUCT_DELIMITER = (\"(\", \")\")\n\n TYPE_MAPPING = {\n **Generator.TYPE_MAPPING,\n exp.DataType.Type.NULLABLE: \"Nullable\",\n exp.DataType.Type.DATETIME: \"DateTime64\",\n }\n\n TRANSFORMS = {\n **Generator.TRANSFORMS,\n exp.Array: inline_array_sql,\n exp.StrPosition: lambda self, e: f\"position({csv(self.sql(e, 'this'), self.sql(e, 'substr'), self.sql(e, 'position'))})\",\n exp.Final: lambda self, e: f\"{self.sql(e, 'this')} FINAL\",\n exp.Map: lambda self, e: _lower_func(var_map_sql(self, e)),\n exp.VarMap: lambda self, e: _lower_func(var_map_sql(self, e)),\n }\n\n EXPLICIT_UNION = True\n", "path": "sqlglot/dialects/clickhouse.py"}], "after_files": [{"content": "from sqlglot import exp\nfrom sqlglot.dialects.dialect import Dialect, inline_array_sql, var_map_sql\nfrom 
sqlglot.generator import Generator\nfrom sqlglot.parser import Parser, parse_var_map\nfrom sqlglot.tokens import Tokenizer, TokenType\n\n\ndef _lower_func(sql):\n index = sql.index(\"(\")\n return sql[:index].lower() + sql[index:]\n\n\nclass ClickHouse(Dialect):\n normalize_functions = None\n null_ordering = \"nulls_are_last\"\n\n class Tokenizer(Tokenizer):\n IDENTIFIERS = ['\"', \"`\"]\n\n KEYWORDS = {\n **Tokenizer.KEYWORDS,\n \"NULLABLE\": TokenType.NULLABLE,\n \"FINAL\": TokenType.FINAL,\n \"DATETIME64\": TokenType.DATETIME,\n \"INT8\": TokenType.TINYINT,\n \"INT16\": TokenType.SMALLINT,\n \"INT32\": TokenType.INT,\n \"INT64\": TokenType.BIGINT,\n \"FLOAT32\": TokenType.FLOAT,\n \"FLOAT64\": TokenType.DOUBLE,\n }\n\n class Parser(Parser):\n FUNCTIONS = {\n **Parser.FUNCTIONS,\n \"MAP\": parse_var_map,\n }\n\n def _parse_table(self, schema=False):\n this = super()._parse_table(schema)\n\n if self._match(TokenType.FINAL):\n this = self.expression(exp.Final, this=this)\n\n return this\n\n class Generator(Generator):\n STRUCT_DELIMITER = (\"(\", \")\")\n\n TYPE_MAPPING = {\n **Generator.TYPE_MAPPING,\n exp.DataType.Type.NULLABLE: \"Nullable\",\n exp.DataType.Type.DATETIME: \"DateTime64\",\n exp.DataType.Type.MAP: \"Map\",\n exp.DataType.Type.ARRAY: \"Array\",\n exp.DataType.Type.STRUCT: \"Tuple\",\n }\n\n TRANSFORMS = {\n **Generator.TRANSFORMS,\n exp.Array: inline_array_sql,\n exp.Final: lambda self, e: f\"{self.sql(e, 'this')} FINAL\",\n exp.Map: lambda self, e: _lower_func(var_map_sql(self, e)),\n exp.VarMap: lambda self, e: _lower_func(var_map_sql(self, e)),\n }\n\n EXPLICIT_UNION = True\n", "path": "sqlglot/dialects/clickhouse.py"}]} | 1,156 | 135 |
gh_patches_debug_14670 | rasdani/github-patches | git_diff | ipython__ipython-3669 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
AttributeError: 'BlockingKernelClient' object has no attribute 'started_channels'
I have an app that is centered around IPython running in qt (https://github.com/neuropy/neuropy/blob/master/neuropy/main.py). Yesterday I updated IPython from git, and have had to change a few things since the refactoring about 3 months ago, which split a kernel client off of the kernel manager (see #3011). In the process, I came across the error in the title. This error only happens when I try to create a normal two-process app, but not when creating an inprocess app, ie, it happens with QtKernelManager, but not with QtInProcessKernelManager. I think a BlockingKernelClient happens to be the default for QtKernelClient, which may be due to #3222. Not sure how to fix this in IPython, but to get around it, I set the client_factory to QtKernelClient, and then calling QtKernelManager.client() works fine. My minimalist example is in this gist: https://gist.github.com/mspacek/6018206
Can this be fixed so that setting up the 2-process app doesn't require the extra step of changing the client_factory?
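
For reference, a sketch of the workaround described above, assuming the IPython 1.x Qt module layout; it simply points the manager's client factory at the Qt client before requesting one:

```python
from IPython.qt.manager import QtKernelManager
from IPython.qt.client import QtKernelClient

km = QtKernelManager()
km.client_factory = QtKernelClient   # avoid the BlockingKernelClient default
km.start_kernel()

kc = km.client()        # now a QtKernelClient with the expected Qt signals
kc.start_channels()
```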
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/qt/manager.py`
Content:
```
1 """ Defines a KernelClient that provides signals and slots.
2 """
3
4 from IPython.external.qt import QtCore
5
6 # Local imports
7 from IPython.utils.traitlets import Bool
8
9 from IPython.kernel import KernelManager
10 from IPython.kernel.restarter import KernelRestarter
11
12 from .kernel_mixins import QtKernelManagerMixin, QtKernelRestarterMixin
13
14
15 class QtKernelRestarter(KernelRestarter, QtKernelRestarterMixin):
16
17 def start(self):
18 if self._timer is None:
19 self._timer = QtCore.QTimer()
20 self._timer.timeout.connect(self.poll)
21 self._timer.start(self.time_to_dead * 1000)
22
23 def stop(self):
24 self._timer.stop()
25
26 def poll(self):
27 super(QtKernelRestarter, self).poll()
28
29
30 class QtKernelManager(KernelManager, QtKernelManagerMixin):
31 """A KernelManager with Qt signals for restart"""
32
33 autorestart = Bool(True, config=True)
34
35 def start_restarter(self):
36 if self.autorestart and self.has_kernel:
37 if self._restarter is None:
38 self._restarter = QtKernelRestarter(
39 kernel_manager=self,
40 parent=self,
41 log=self.log,
42 )
43 self._restarter.add_callback(self._handle_kernel_restarted)
44 self._restarter.start()
45
46 def stop_restarter(self):
47 if self.autorestart:
48 if self._restarter is not None:
49 self._restarter.stop()
50
51 def _handle_kernel_restarted(self):
52 self.kernel_restarted.emit()
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/IPython/qt/manager.py b/IPython/qt/manager.py
--- a/IPython/qt/manager.py
+++ b/IPython/qt/manager.py
@@ -4,7 +4,7 @@
from IPython.external.qt import QtCore
# Local imports
-from IPython.utils.traitlets import Bool
+from IPython.utils.traitlets import Bool, DottedObjectName
from IPython.kernel import KernelManager
from IPython.kernel.restarter import KernelRestarter
@@ -30,6 +30,7 @@
class QtKernelManager(KernelManager, QtKernelManagerMixin):
"""A KernelManager with Qt signals for restart"""
+ client_class = DottedObjectName('IPython.qt.client.QtKernelClient')
autorestart = Bool(True, config=True)
def start_restarter(self):
| {"golden_diff": "diff --git a/IPython/qt/manager.py b/IPython/qt/manager.py\n--- a/IPython/qt/manager.py\n+++ b/IPython/qt/manager.py\n@@ -4,7 +4,7 @@\n from IPython.external.qt import QtCore\n \n # Local imports\n-from IPython.utils.traitlets import Bool\n+from IPython.utils.traitlets import Bool, DottedObjectName\n \n from IPython.kernel import KernelManager\n from IPython.kernel.restarter import KernelRestarter\n@@ -30,6 +30,7 @@\n class QtKernelManager(KernelManager, QtKernelManagerMixin):\n \"\"\"A KernelManager with Qt signals for restart\"\"\"\n \n+ client_class = DottedObjectName('IPython.qt.client.QtKernelClient')\n autorestart = Bool(True, config=True)\n \n def start_restarter(self):\n", "issue": "AttributeError: 'BlockingKernelClient' object has no attribute 'started_channels'\nI have an app that is centered around IPython running in qt (https://github.com/neuropy/neuropy/blob/master/neuropy/main.py). Yesterday I updated IPython from git, and have had to change a few things since the refactoring about 3 months ago, which split a kernel client off of the kernel manager (see #3011). In the process, I came across the error in the title. This error only happens when I try to create a normal two-process app, but not when creating an inprocess app, ie, it happens with QtKernelManager, but not with QtInProcessKernelManager. I think a BlockingKernelClient happens to be the default for QtKernelClient, which may be due to #3222. Not sure how to fix this in IPython, but to get around it, I set the client_factory to QtKernelClient, and then calling QtKernelManager.client() works fine. My minimalist example is in this gist: https://gist.github.com/mspacek/6018206\n\nCan this be fixed so that setting up the 2-process app doesn't require the extra step of changing the client_factory?\n\n", "before_files": [{"content": "\"\"\" Defines a KernelClient that provides signals and slots.\n\"\"\"\n\nfrom IPython.external.qt import QtCore\n\n# Local imports\nfrom IPython.utils.traitlets import Bool\n\nfrom IPython.kernel import KernelManager\nfrom IPython.kernel.restarter import KernelRestarter\n\nfrom .kernel_mixins import QtKernelManagerMixin, QtKernelRestarterMixin\n\n\nclass QtKernelRestarter(KernelRestarter, QtKernelRestarterMixin):\n\n def start(self):\n if self._timer is None:\n self._timer = QtCore.QTimer()\n self._timer.timeout.connect(self.poll)\n self._timer.start(self.time_to_dead * 1000)\n\n def stop(self):\n self._timer.stop()\n\n def poll(self):\n super(QtKernelRestarter, self).poll()\n\n\nclass QtKernelManager(KernelManager, QtKernelManagerMixin):\n \"\"\"A KernelManager with Qt signals for restart\"\"\"\n\n autorestart = Bool(True, config=True)\n\n def start_restarter(self):\n if self.autorestart and self.has_kernel:\n if self._restarter is None:\n self._restarter = QtKernelRestarter(\n kernel_manager=self,\n parent=self,\n log=self.log,\n )\n self._restarter.add_callback(self._handle_kernel_restarted)\n self._restarter.start()\n\n def stop_restarter(self):\n if self.autorestart:\n if self._restarter is not None:\n self._restarter.stop()\n\n def _handle_kernel_restarted(self):\n self.kernel_restarted.emit()\n", "path": "IPython/qt/manager.py"}], "after_files": [{"content": "\"\"\" Defines a KernelClient that provides signals and slots.\n\"\"\"\n\nfrom IPython.external.qt import QtCore\n\n# Local imports\nfrom IPython.utils.traitlets import Bool, DottedObjectName\n\nfrom IPython.kernel import KernelManager\nfrom IPython.kernel.restarter import KernelRestarter\n\nfrom .kernel_mixins 
import QtKernelManagerMixin, QtKernelRestarterMixin\n\n\nclass QtKernelRestarter(KernelRestarter, QtKernelRestarterMixin):\n\n def start(self):\n if self._timer is None:\n self._timer = QtCore.QTimer()\n self._timer.timeout.connect(self.poll)\n self._timer.start(self.time_to_dead * 1000)\n\n def stop(self):\n self._timer.stop()\n\n def poll(self):\n super(QtKernelRestarter, self).poll()\n\n\nclass QtKernelManager(KernelManager, QtKernelManagerMixin):\n \"\"\"A KernelManager with Qt signals for restart\"\"\"\n\n client_class = DottedObjectName('IPython.qt.client.QtKernelClient')\n autorestart = Bool(True, config=True)\n\n def start_restarter(self):\n if self.autorestart and self.has_kernel:\n if self._restarter is None:\n self._restarter = QtKernelRestarter(\n kernel_manager=self,\n parent=self,\n log=self.log,\n )\n self._restarter.add_callback(self._handle_kernel_restarted)\n self._restarter.start()\n\n def stop_restarter(self):\n if self.autorestart:\n if self._restarter is not None:\n self._restarter.stop()\n\n def _handle_kernel_restarted(self):\n self.kernel_restarted.emit()\n", "path": "IPython/qt/manager.py"}]} | 951 | 177 |
gh_patches_debug_67296 | rasdani/github-patches | git_diff | qutip__qutip-1390 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Installation of qutip shows various warnings on ubuntu 20.04.1
**Describe the bug**
Installing qutip on Ubuntu 20.04.1 shows the following warnings:
```
$ sudo apt install python3-qutip
…
Entpacken von python3-qutip (4.4.1-6build1) ...
python3-qutip (4.4.1-6build1) wird eingerichtet ...
/usr/lib/python3/dist-packages/qutip/_mkl/spmv.py:53: SyntaxWarning: "is"
with a literal. Did you mean "=="?
if x.ndim is 1:
/usr/lib/python3/dist-packages/qutip/qobjevo.py:776: SyntaxWarning: "is no
t" with a literal. Did you mean "!="?
if self.compiled and self.compiled.split()[2] is not "cte":
/usr/lib/python3/dist-packages/qutip/qobjevo.py:1045: SyntaxWarning: "is"
with a literal. Did you mean "=="?
elif op1.type is "array":
/usr/lib/python3/dist-packages/qutip/qobjevo.py:1070: SyntaxWarning: "is"
with a literal. Did you mean "=="?
elif self.ops[_set[0]].type is "string":
/usr/lib/python3/dist-packages/qutip/qobjevo.py:1079: SyntaxWarning: "is"
with a literal. Did you mean "=="?
elif self.ops[_set[0]].type is "array":
/usr/lib/python3/dist-packages/qutip/qobjevo.py:1534: SyntaxWarning: "is n
ot" with a literal. Did you mean "!="?
for key in self.__dict__ if key is not "compiled_qobjevo"}
```
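
All of these warnings share one root cause: comparing against a literal with `is`/`is not` tests object identity instead of value equality, which Python 3.8 now flags. A minimal before/after illustration, matching the spirit of the patch shown below:

```python
import numpy as np

x = np.zeros(3)

# Old spelling (emits SyntaxWarning on Python 3.8+ and only works because
# CPython happens to intern small integers):
#     if x.ndim is 1: ...
# Correct spelling compares values, which is what the code intends:
if x.ndim == 1:
    print("1-D input")
```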
**To Reproduce**
Install qutip on Ubuntu 20.04.1 via `sudo apt install python3-qutip`.
**Expected behavior**
No warnings during the installation of qutip.
**Your Environment**
```
>>> qutip.about()
QuTiP: Quantum Toolbox in Python
================================
Copyright (c) QuTiP team 2011 and later.
Original developers: R. J. Johansson & P. D. Nation.
Current admin team: Alexander Pitchford, Paul D. Nation, Nathan Shammah, Shahnawaz Ahmed, Neill Lambert, and Eric Giguère.
Project Manager: Franco Nori.
Currently developed through wide collaboration. See https://github.com/qutip for details.
QuTiP Version: 4.4.1
Numpy Version: 1.17.4
Scipy Version: 1.4.1
Cython Version: 0.29.14
Matplotlib Version: 3.1.2
Python Version: 3.8.5
Number of CPUs: 2
BLAS Info: OPENBLAS
OPENMP Installed: True
INTEL MKL Ext: False
Platform Info: Linux (x86_64)
Installation path: /usr/lib/python3/dist-packages/qutip
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qutip/_mkl/spmv.py`
Content:
```
1 # This file is part of QuTiP: Quantum Toolbox in Python.
2 #
3 # Copyright (c) 2011 and later, Paul D. Nation.
4 # All rights reserved.
5 #
6 # Redistribution and use in source and binary forms, with or without
7 # modification, are permitted provided that the following conditions are
8 # met:
9 #
10 # 1. Redistributions of source code must retain the above copyright notice,
11 # this list of conditions and the following disclaimer.
12 #
13 # 2. Redistributions in binary form must reproduce the above copyright
14 # notice, this list of conditions and the following disclaimer in the
15 # documentation and/or other materials provided with the distribution.
16 #
17 # 3. Neither the name of the QuTiP: Quantum Toolbox in Python nor the names
18 # of its contributors may be used to endorse or promote products derived
19 # from this software without specific prior written permission.
20 #
21 # THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
22 # "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
23 # LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
24 # PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
25 # HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
26 # SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
27 # LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
28 # DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
29 # THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
30 # (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
31 # OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
32 ###############################################################################
33 import numpy as np
34 import scipy.sparse as sp
35 import ctypes
36 from ctypes import POINTER,c_int,c_char,c_double, byref
37 from numpy import ctypeslib
38 import qutip.settings as qset
39 zcsrgemv = qset.mkl_lib.mkl_cspblas_zcsrgemv
40
41 def mkl_spmv(A, x):
42 """
43 sparse csr_spmv using MKL
44 """
45 (m,n) = A.shape
46
47 # Pointers to data of the matrix
48 data = A.data.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))
49 indptr = A.indptr.ctypes.data_as(POINTER(c_int))
50 indices = A.indices.ctypes.data_as(POINTER(c_int))
51
52 # Allocate output, using same conventions as input
53 if x.ndim is 1:
54 y = np.empty(m,dtype=np.complex,order='C')
55 elif x.ndim==2 and x.shape[1]==1:
56 y = np.empty((m,1),dtype=np.complex,order='C')
57 else:
58 raise Exception('Input vector must be 1D row or 2D column vector')
59
60 np_x = x.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))
61 np_y = y.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))
62 # now call MKL. This returns the answer in np_y, which points to y
63 zcsrgemv(byref(c_char(bytes(b'N'))), byref(c_int(m)), data ,indptr, indices, np_x, np_y )
64 return y
65
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/qutip/_mkl/spmv.py b/qutip/_mkl/spmv.py
--- a/qutip/_mkl/spmv.py
+++ b/qutip/_mkl/spmv.py
@@ -50,7 +50,7 @@
indices = A.indices.ctypes.data_as(POINTER(c_int))
# Allocate output, using same conventions as input
- if x.ndim is 1:
+ if x.ndim == 1:
y = np.empty(m,dtype=np.complex,order='C')
elif x.ndim==2 and x.shape[1]==1:
y = np.empty((m,1),dtype=np.complex,order='C')
| {"golden_diff": "diff --git a/qutip/_mkl/spmv.py b/qutip/_mkl/spmv.py\n--- a/qutip/_mkl/spmv.py\n+++ b/qutip/_mkl/spmv.py\n@@ -50,7 +50,7 @@\n indices = A.indices.ctypes.data_as(POINTER(c_int))\n \n # Allocate output, using same conventions as input\n- if x.ndim is 1:\n+ if x.ndim == 1:\n y = np.empty(m,dtype=np.complex,order='C')\n elif x.ndim==2 and x.shape[1]==1:\n y = np.empty((m,1),dtype=np.complex,order='C')\n", "issue": "Installation of qutip shows various warnings on ubuntu 20.04.1\n**Describe the bug**\r\nInstalling the qutip on ubuntu 20.04.1 shows the following warnings:\r\n```\r\n$ sudo apt install python3-qutip \r\n\u2026\r\nEntpacken von python3-qutip (4.4.1-6build1) ...\r\npython3-qutip (4.4.1-6build1) wird eingerichtet ...\r\n/usr/lib/python3/dist-packages/qutip/_mkl/spmv.py:53: SyntaxWarning: \"is\" \r\nwith a literal. Did you mean \"==\"?\r\n if x.ndim is 1:\r\n/usr/lib/python3/dist-packages/qutip/qobjevo.py:776: SyntaxWarning: \"is no\r\nt\" with a literal. Did you mean \"!=\"?\r\n if self.compiled and self.compiled.split()[2] is not \"cte\":\r\n/usr/lib/python3/dist-packages/qutip/qobjevo.py:1045: SyntaxWarning: \"is\" \r\nwith a literal. Did you mean \"==\"?\r\n elif op1.type is \"array\":\r\n/usr/lib/python3/dist-packages/qutip/qobjevo.py:1070: SyntaxWarning: \"is\" \r\nwith a literal. Did you mean \"==\"?\r\n elif self.ops[_set[0]].type is \"string\":\r\n/usr/lib/python3/dist-packages/qutip/qobjevo.py:1079: SyntaxWarning: \"is\" \r\nwith a literal. Did you mean \"==\"?\r\n elif self.ops[_set[0]].type is \"array\":\r\n/usr/lib/python3/dist-packages/qutip/qobjevo.py:1534: SyntaxWarning: \"is n\r\not\" with a literal. Did you mean \"!=\"?\r\n for key in self.__dict__ if key is not \"compiled_qobjevo\"}\r\n```\r\n\r\n\r\n**To Reproduce**\r\nInstall qutip on ubuntu 20.04.1 via `sudo apt install python3-qutip`.\r\n\r\n**Expected behavior**\r\nNo warnings during the installation of qutip.\r\n\r\n**Your Environment**\r\n```\r\n>>> qutip.about()\r\n\r\nQuTiP: Quantum Toolbox in Python\r\n================================\r\nCopyright (c) QuTiP team 2011 and later.\r\nOriginal developers: R. J. Johansson & P. D. Nation.\r\nCurrent admin team: Alexander Pitchford, Paul D. Nation, Nathan Shammah, Shahnawaz Ahmed, Neill Lambert, and Eric Gigu\u00e8re.\r\nProject Manager: Franco Nori.\r\nCurrently developed through wide collaboration. See https://github.com/qutip for details.\r\n\r\nQuTiP Version: 4.4.1\r\nNumpy Version: 1.17.4\r\nScipy Version: 1.4.1\r\nCython Version: 0.29.14\r\nMatplotlib Version: 3.1.2\r\nPython Version: 3.8.5\r\nNumber of CPUs: 2\r\nBLAS Info: OPENBLAS\r\nOPENMP Installed: True\r\nINTEL MKL Ext: False\r\nPlatform Info: Linux (x86_64)\r\nInstallation path: /usr/lib/python3/dist-packages/qutip\r\n```\n", "before_files": [{"content": "# This file is part of QuTiP: Quantum Toolbox in Python.\n#\n# Copyright (c) 2011 and later, Paul D. Nation.\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are\n# met:\n#\n# 1. Redistributions of source code must retain the above copyright notice,\n# this list of conditions and the following disclaimer.\n#\n# 2. Redistributions in binary form must reproduce the above copyright\n# notice, this list of conditions and the following disclaimer in the\n# documentation and/or other materials provided with the distribution.\n#\n# 3. 
Neither the name of the QuTiP: Quantum Toolbox in Python nor the names\n# of its contributors may be used to endorse or promote products derived\n# from this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n# \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\n# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A\n# PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\n# HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\n# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\n# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n###############################################################################\nimport numpy as np\nimport scipy.sparse as sp\nimport ctypes\nfrom ctypes import POINTER,c_int,c_char,c_double, byref\nfrom numpy import ctypeslib\nimport qutip.settings as qset\nzcsrgemv = qset.mkl_lib.mkl_cspblas_zcsrgemv\n\ndef mkl_spmv(A, x):\n \"\"\"\n sparse csr_spmv using MKL\n \"\"\"\n (m,n) = A.shape\n\n # Pointers to data of the matrix\n data = A.data.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))\n indptr = A.indptr.ctypes.data_as(POINTER(c_int))\n indices = A.indices.ctypes.data_as(POINTER(c_int))\n\n # Allocate output, using same conventions as input\n if x.ndim is 1:\n y = np.empty(m,dtype=np.complex,order='C')\n elif x.ndim==2 and x.shape[1]==1:\n y = np.empty((m,1),dtype=np.complex,order='C')\n else:\n raise Exception('Input vector must be 1D row or 2D column vector')\n \n np_x = x.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))\n np_y = y.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))\n # now call MKL. This returns the answer in np_y, which points to y\n zcsrgemv(byref(c_char(bytes(b'N'))), byref(c_int(m)), data ,indptr, indices, np_x, np_y ) \n return y\n", "path": "qutip/_mkl/spmv.py"}], "after_files": [{"content": "# This file is part of QuTiP: Quantum Toolbox in Python.\n#\n# Copyright (c) 2011 and later, Paul D. Nation.\n# All rights reserved.\n#\n# Redistribution and use in source and binary forms, with or without\n# modification, are permitted provided that the following conditions are\n# met:\n#\n# 1. Redistributions of source code must retain the above copyright notice,\n# this list of conditions and the following disclaimer.\n#\n# 2. Redistributions in binary form must reproduce the above copyright\n# notice, this list of conditions and the following disclaimer in the\n# documentation and/or other materials provided with the distribution.\n#\n# 3. Neither the name of the QuTiP: Quantum Toolbox in Python nor the names\n# of its contributors may be used to endorse or promote products derived\n# from this software without specific prior written permission.\n#\n# THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS\n# \"AS IS\" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT\n# LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A\n# PARTICULAR PURPOSE ARE DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT\n# HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\n# SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\n# LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,\n# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY\n# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT\n# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\n# OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.\n###############################################################################\nimport numpy as np\nimport scipy.sparse as sp\nimport ctypes\nfrom ctypes import POINTER,c_int,c_char,c_double, byref\nfrom numpy import ctypeslib\nimport qutip.settings as qset\nzcsrgemv = qset.mkl_lib.mkl_cspblas_zcsrgemv\n\ndef mkl_spmv(A, x):\n \"\"\"\n sparse csr_spmv using MKL\n \"\"\"\n (m,n) = A.shape\n\n # Pointers to data of the matrix\n data = A.data.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))\n indptr = A.indptr.ctypes.data_as(POINTER(c_int))\n indices = A.indices.ctypes.data_as(POINTER(c_int))\n\n # Allocate output, using same conventions as input\n if x.ndim == 1:\n y = np.empty(m,dtype=np.complex,order='C')\n elif x.ndim==2 and x.shape[1]==1:\n y = np.empty((m,1),dtype=np.complex,order='C')\n else:\n raise Exception('Input vector must be 1D row or 2D column vector')\n \n np_x = x.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))\n np_y = y.ctypes.data_as(ctypeslib.ndpointer(np.complex128, ndim=1, flags='C'))\n # now call MKL. This returns the answer in np_y, which points to y\n zcsrgemv(byref(c_char(bytes(b'N'))), byref(c_int(m)), data ,indptr, indices, np_x, np_y ) \n return y\n", "path": "qutip/_mkl/spmv.py"}]} | 1,844 | 154 |
gh_patches_debug_21565 | rasdani/github-patches | git_diff | OCA__server-tools-508 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fix travis build on 9.0 branch
the 9.0 branch of the project is red, with really strange errors about computed fields. There have been such errors since the merge of #469 but for some reason they were neither fatal nor breaking the build.
Splitting `database_cleanup` tests in .travis.yml seems to let the other modules go green, but the tests for database_cleanup are red.
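
The patch at the end of this record replaces the menu-visibility check in `create()` with a plain group-membership test. A minimal sketch of that pattern for reference (illustrative model and names, not the actual module):

```python
from openerp import api, models
from openerp.exceptions import AccessDenied


class ExamplePurgeLine(models.AbstractModel):
    _name = 'example.purge.line'

    @api.model
    def create(self, values):
        # Only users in the Settings manager group may create purge lines.
        if self.env.ref('base.group_erp_manager') not in self.env.user.groups_id:
            raise AccessDenied
        return super(ExamplePurgeLine, self).create(values)
```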
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `database_cleanup/models/purge_wizard.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # © 2014-2016 Therp BV <http://therp.nl>
3 # License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).
4 import logging
5 from openerp import _, api, fields, models
6 from openerp.exceptions import AccessDenied
7
8
9 class CleanupPurgeLine(models.AbstractModel):
10 """ Abstract base class for the purge wizard lines """
11 _name = 'cleanup.purge.line'
12 _order = 'name'
13
14 name = fields.Char('Name', readonly=True)
15 purged = fields.Boolean('Purged', readonly=True)
16 wizard_id = fields.Many2one('cleanup.purge.wizard')
17
18 logger = logging.getLogger('openerp.addons.database_cleanup')
19
20 @api.multi
21 def purge(self):
22 raise NotImplementedError
23
24 @api.model
25 def create(self, values):
26 # make sure the user trying this is actually supposed to do it
27 if not self.env.ref('database_cleanup.menu_database_cleanup')\
28 .parent_id._filter_visible_menus():
29 raise AccessDenied
30 return super(CleanupPurgeLine, self).create(values)
31
32
33 class PurgeWizard(models.AbstractModel):
34 """ Abstract base class for the purge wizards """
35 _name = 'cleanup.purge.wizard'
36 _description = 'Purge stuff'
37
38 @api.model
39 def default_get(self, fields_list):
40 res = super(PurgeWizard, self).default_get(fields_list)
41 if 'purge_line_ids' in fields_list:
42 res['purge_line_ids'] = self.find()
43 return res
44
45 @api.multi
46 def find(self):
47 raise NotImplementedError
48
49 @api.multi
50 def purge_all(self):
51 self.mapped('purge_line_ids').purge()
52 return True
53
54 @api.model
55 def get_wizard_action(self):
56 wizard = self.create({})
57 return {
58 'type': 'ir.actions.act_window',
59 'name': wizard.display_name,
60 'views': [(False, 'form')],
61 'res_model': self._name,
62 'res_id': wizard.id,
63 'flags': {
64 'action_buttons': False,
65 'sidebar': False,
66 },
67 }
68
69 @api.multi
70 def select_lines(self):
71 return {
72 'type': 'ir.actions.act_window',
73 'name': _('Select lines to purge'),
74 'views': [(False, 'tree'), (False, 'form')],
75 'res_model': self._fields['purge_line_ids'].comodel_name,
76 'domain': [('wizard_id', 'in', self.ids)],
77 }
78
79 @api.multi
80 def name_get(self):
81 return [
82 (this.id, self._description)
83 for this in self
84 ]
85
86 @api.model
87 def create(self, values):
88 # make sure the user trying this is actually supposed to do it
89 if not self.env.ref('database_cleanup.menu_database_cleanup')\
90 .parent_id._filter_visible_menus():
91 raise AccessDenied
92 return super(PurgeWizard, self).create(values)
93
94 purge_line_ids = fields.One2many('cleanup.purge.line', 'wizard_id')
95
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/database_cleanup/models/purge_wizard.py b/database_cleanup/models/purge_wizard.py
--- a/database_cleanup/models/purge_wizard.py
+++ b/database_cleanup/models/purge_wizard.py
@@ -24,8 +24,8 @@
@api.model
def create(self, values):
# make sure the user trying this is actually supposed to do it
- if not self.env.ref('database_cleanup.menu_database_cleanup')\
- .parent_id._filter_visible_menus():
+ if self.env.ref(
+ 'base.group_erp_manager') not in self.env.user.groups_id:
raise AccessDenied
return super(CleanupPurgeLine, self).create(values)
@@ -86,8 +86,8 @@
@api.model
def create(self, values):
# make sure the user trying this is actually supposed to do it
- if not self.env.ref('database_cleanup.menu_database_cleanup')\
- .parent_id._filter_visible_menus():
+ if self.env.ref(
+ 'base.group_erp_manager') not in self.env.user.groups_id:
raise AccessDenied
return super(PurgeWizard, self).create(values)
| {"golden_diff": "diff --git a/database_cleanup/models/purge_wizard.py b/database_cleanup/models/purge_wizard.py\n--- a/database_cleanup/models/purge_wizard.py\n+++ b/database_cleanup/models/purge_wizard.py\n@@ -24,8 +24,8 @@\n @api.model\n def create(self, values):\n # make sure the user trying this is actually supposed to do it\n- if not self.env.ref('database_cleanup.menu_database_cleanup')\\\n- .parent_id._filter_visible_menus():\n+ if self.env.ref(\n+ 'base.group_erp_manager') not in self.env.user.groups_id:\n raise AccessDenied\n return super(CleanupPurgeLine, self).create(values)\n \n@@ -86,8 +86,8 @@\n @api.model\n def create(self, values):\n # make sure the user trying this is actually supposed to do it\n- if not self.env.ref('database_cleanup.menu_database_cleanup')\\\n- .parent_id._filter_visible_menus():\n+ if self.env.ref(\n+ 'base.group_erp_manager') not in self.env.user.groups_id:\n raise AccessDenied\n return super(PurgeWizard, self).create(values)\n", "issue": "Fix travis build on 9.0 branch\nthe 9.0 branch of the project is red, with really strange errors about computed fields. There have been such errors since the merge of #469 but for some reason they were neither fatal nor breaking the build. \n\nSplitting `database_cleanup` tests in .travis.yml seems to let the other modules go green, but the tests for database_cleanup are red. \n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# \u00a9 2014-2016 Therp BV <http://therp.nl>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\nimport logging\nfrom openerp import _, api, fields, models\nfrom openerp.exceptions import AccessDenied\n\n\nclass CleanupPurgeLine(models.AbstractModel):\n \"\"\" Abstract base class for the purge wizard lines \"\"\"\n _name = 'cleanup.purge.line'\n _order = 'name'\n\n name = fields.Char('Name', readonly=True)\n purged = fields.Boolean('Purged', readonly=True)\n wizard_id = fields.Many2one('cleanup.purge.wizard')\n\n logger = logging.getLogger('openerp.addons.database_cleanup')\n\n @api.multi\n def purge(self):\n raise NotImplementedError\n\n @api.model\n def create(self, values):\n # make sure the user trying this is actually supposed to do it\n if not self.env.ref('database_cleanup.menu_database_cleanup')\\\n .parent_id._filter_visible_menus():\n raise AccessDenied\n return super(CleanupPurgeLine, self).create(values)\n\n\nclass PurgeWizard(models.AbstractModel):\n \"\"\" Abstract base class for the purge wizards \"\"\"\n _name = 'cleanup.purge.wizard'\n _description = 'Purge stuff'\n\n @api.model\n def default_get(self, fields_list):\n res = super(PurgeWizard, self).default_get(fields_list)\n if 'purge_line_ids' in fields_list:\n res['purge_line_ids'] = self.find()\n return res\n\n @api.multi\n def find(self):\n raise NotImplementedError\n\n @api.multi\n def purge_all(self):\n self.mapped('purge_line_ids').purge()\n return True\n\n @api.model\n def get_wizard_action(self):\n wizard = self.create({})\n return {\n 'type': 'ir.actions.act_window',\n 'name': wizard.display_name,\n 'views': [(False, 'form')],\n 'res_model': self._name,\n 'res_id': wizard.id,\n 'flags': {\n 'action_buttons': False,\n 'sidebar': False,\n },\n }\n\n @api.multi\n def select_lines(self):\n return {\n 'type': 'ir.actions.act_window',\n 'name': _('Select lines to purge'),\n 'views': [(False, 'tree'), (False, 'form')],\n 'res_model': self._fields['purge_line_ids'].comodel_name,\n 'domain': [('wizard_id', 'in', self.ids)],\n }\n\n @api.multi\n def name_get(self):\n return [\n (this.id, 
self._description)\n for this in self\n ]\n\n @api.model\n def create(self, values):\n # make sure the user trying this is actually supposed to do it\n if not self.env.ref('database_cleanup.menu_database_cleanup')\\\n .parent_id._filter_visible_menus():\n raise AccessDenied\n return super(PurgeWizard, self).create(values)\n\n purge_line_ids = fields.One2many('cleanup.purge.line', 'wizard_id')\n", "path": "database_cleanup/models/purge_wizard.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# \u00a9 2014-2016 Therp BV <http://therp.nl>\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\nimport logging\nfrom openerp import _, api, fields, models\nfrom openerp.exceptions import AccessDenied\n\n\nclass CleanupPurgeLine(models.AbstractModel):\n \"\"\" Abstract base class for the purge wizard lines \"\"\"\n _name = 'cleanup.purge.line'\n _order = 'name'\n\n name = fields.Char('Name', readonly=True)\n purged = fields.Boolean('Purged', readonly=True)\n wizard_id = fields.Many2one('cleanup.purge.wizard')\n\n logger = logging.getLogger('openerp.addons.database_cleanup')\n\n @api.multi\n def purge(self):\n raise NotImplementedError\n\n @api.model\n def create(self, values):\n # make sure the user trying this is actually supposed to do it\n if self.env.ref(\n 'base.group_erp_manager') not in self.env.user.groups_id:\n raise AccessDenied\n return super(CleanupPurgeLine, self).create(values)\n\n\nclass PurgeWizard(models.AbstractModel):\n \"\"\" Abstract base class for the purge wizards \"\"\"\n _name = 'cleanup.purge.wizard'\n _description = 'Purge stuff'\n\n @api.model\n def default_get(self, fields_list):\n res = super(PurgeWizard, self).default_get(fields_list)\n if 'purge_line_ids' in fields_list:\n res['purge_line_ids'] = self.find()\n return res\n\n @api.multi\n def find(self):\n raise NotImplementedError\n\n @api.multi\n def purge_all(self):\n self.mapped('purge_line_ids').purge()\n return True\n\n @api.model\n def get_wizard_action(self):\n wizard = self.create({})\n return {\n 'type': 'ir.actions.act_window',\n 'name': wizard.display_name,\n 'views': [(False, 'form')],\n 'res_model': self._name,\n 'res_id': wizard.id,\n 'flags': {\n 'action_buttons': False,\n 'sidebar': False,\n },\n }\n\n @api.multi\n def select_lines(self):\n return {\n 'type': 'ir.actions.act_window',\n 'name': _('Select lines to purge'),\n 'views': [(False, 'tree'), (False, 'form')],\n 'res_model': self._fields['purge_line_ids'].comodel_name,\n 'domain': [('wizard_id', 'in', self.ids)],\n }\n\n @api.multi\n def name_get(self):\n return [\n (this.id, self._description)\n for this in self\n ]\n\n @api.model\n def create(self, values):\n # make sure the user trying this is actually supposed to do it\n if self.env.ref(\n 'base.group_erp_manager') not in self.env.user.groups_id:\n raise AccessDenied\n return super(PurgeWizard, self).create(values)\n\n purge_line_ids = fields.One2many('cleanup.purge.line', 'wizard_id')\n", "path": "database_cleanup/models/purge_wizard.py"}]} | 1,222 | 257 |
gh_patches_debug_5371 | rasdani/github-patches | git_diff | getnikola__nikola-3511 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Incorrect time zone guessing on macOS
On macOS Big Sur 11.2, we guess the full path to the zoneinfo file (instead of just the zone name).
```
$ nikola init
…
Time zone [private/var/db/timezone/tz/2021a.1.0/zoneinfo/Europe/Warsaw]:
```
--- END ISSUE ---
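A behavioural detail of `dateutil` is central to localizing this bug: `dateutil.tz.gettz` signals an unknown zone name by returning `None` rather than raising, so a bare `try`/`except` around the call never rejects a bad candidate such as the full path shown above. A minimal sketch (the zone strings are only illustrative):

```python
import dateutil.tz

# gettz() returns None for names it cannot resolve instead of raising an
# exception, so try/except alone cannot tell a valid zone name from junk.
print(dateutil.tz.gettz("Europe/Warsaw") is None)    # False on a machine with zoneinfo data
print(dateutil.tz.gettz("not/a/real/zone") is None)  # True
```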
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nikola/packages/tzlocal/unix.py`
Content:
```
1 """Unix support for tzlocal."""
2 import os
3 import re
4
5 import dateutil.tz
6
7 _cache_tz = None
8
9
10 def _try_tz_from_env():
11 tzenv = os.environ.get("TZ")
12 if tzenv and tzenv[0] == ":":
13 tzenv = tzenv[1:]
14 try:
15 if tzenv:
16 dateutil.tz.gettz(tzenv)
17 return tzenv
18 except Exception:
19 pass
20
21
22 def _get_localzone(_root="/"):
23 """Try to find the local timezone configuration.
24
25 The parameter _root makes the function look for files like /etc/localtime
26 beneath the _root directory. This is primarily used by the tests.
27 In normal usage you call the function without parameters.
28 """
29 tzenv = _try_tz_from_env()
30 if tzenv:
31 return tzenv
32
33 # Are we under Termux on Android?
34 if os.path.exists("/system/bin/getprop"):
35 import subprocess
36
37 androidtz = (
38 subprocess.check_output(["getprop", "persist.sys.timezone"])
39 .strip()
40 .decode()
41 )
42 return androidtz
43
44 # Now look for distribution specific configuration files
45 # that contain the timezone name.
46 for configfile in ("etc/timezone", "var/db/zoneinfo"):
47 tzpath = os.path.join(_root, configfile)
48 try:
49 with open(tzpath, "rb") as tzfile:
50 data = tzfile.read()
51
52 # Issue #3 was that /etc/timezone was a zoneinfo file.
53 # That's a misconfiguration, but we need to handle it gracefully:
54 if data[:5] == b"TZif2":
55 continue
56
57 etctz = data.strip().decode()
58 if not etctz:
59 # Empty file, skip
60 continue
61 for etctz in data.decode().splitlines():
62 # Get rid of host definitions and comments:
63 if " " in etctz:
64 etctz, dummy = etctz.split(" ", 1)
65 if "#" in etctz:
66 etctz, dummy = etctz.split("#", 1)
67 if not etctz:
68 continue
69 tz = etctz.replace(" ", "_")
70 return tz
71
72 except IOError:
73 # File doesn't exist or is a directory
74 continue
75
76 # CentOS has a ZONE setting in /etc/sysconfig/clock,
77 # OpenSUSE has a TIMEZONE setting in /etc/sysconfig/clock and
78 # Gentoo has a TIMEZONE setting in /etc/conf.d/clock
79 # We look through these files for a timezone:
80
81 zone_re = re.compile(r"\s*ZONE\s*=\s*\"")
82 timezone_re = re.compile(r"\s*TIMEZONE\s*=\s*\"")
83 end_re = re.compile('"')
84
85 for filename in ("etc/sysconfig/clock", "etc/conf.d/clock"):
86 tzpath = os.path.join(_root, filename)
87 try:
88 with open(tzpath, "rt") as tzfile:
89 data = tzfile.readlines()
90
91 for line in data:
92 # Look for the ZONE= setting.
93 match = zone_re.match(line)
94 if match is None:
95 # No ZONE= setting. Look for the TIMEZONE= setting.
96 match = timezone_re.match(line)
97 if match is not None:
98 # Some setting existed
99 line = line[match.end():]
100 etctz = line[: end_re.search(line).start()]
101
102 # We found a timezone
103 tz = etctz.replace(" ", "_")
104 return tz
105
106 except IOError:
107 # File doesn't exist or is a directory
108 continue
109
110 # systemd distributions use symlinks that include the zone name,
111 # see manpage of localtime(5) and timedatectl(1)
112 tzpath = os.path.join(_root, "etc/localtime")
113 if os.path.exists(tzpath) and os.path.islink(tzpath):
114 tzpath = os.path.realpath(tzpath)
115 start = tzpath.find("/") + 1
116 while start != 0:
117 tzpath = tzpath[start:]
118 try:
119 dateutil.tz.gettz(tzpath)
120 return tzpath
121 except Exception:
122 pass
123 start = tzpath.find("/") + 1
124
125 # Nothing found, return UTC
126 return None
127
128
129 def get_localzone():
130 """Get the computers configured local timezone, if any."""
131 global _cache_tz
132 if _cache_tz is None:
133 _cache_tz = _get_localzone()
134
135 return _cache_tz
136
137
138 def reload_localzone():
139 """Reload the cached localzone. You need to call this if the timezone has changed."""
140 global _cache_tz
141 _cache_tz = _get_localzone()
142 return _cache_tz
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nikola/packages/tzlocal/unix.py b/nikola/packages/tzlocal/unix.py
--- a/nikola/packages/tzlocal/unix.py
+++ b/nikola/packages/tzlocal/unix.py
@@ -116,8 +116,9 @@
while start != 0:
tzpath = tzpath[start:]
try:
- dateutil.tz.gettz(tzpath)
- return tzpath
+ tested_tz = dateutil.tz.gettz(tzpath)
+ if tested_tz:
+ return tzpath
except Exception:
pass
start = tzpath.find("/") + 1
| {"golden_diff": "diff --git a/nikola/packages/tzlocal/unix.py b/nikola/packages/tzlocal/unix.py\n--- a/nikola/packages/tzlocal/unix.py\n+++ b/nikola/packages/tzlocal/unix.py\n@@ -116,8 +116,9 @@\n while start != 0:\n tzpath = tzpath[start:]\n try:\n- dateutil.tz.gettz(tzpath)\n- return tzpath\n+ tested_tz = dateutil.tz.gettz(tzpath)\n+ if tested_tz:\n+ return tzpath\n except Exception:\n pass\n start = tzpath.find(\"/\") + 1\n", "issue": "Incorrect time zone guessing on macOS\nOn macOS Big Sur 11.2, we guess the full path to the zoneinfo file (instead of just the zone name).\r\n\r\n```\r\n$ nikola init\r\n\u2026\r\nTime zone [private/var/db/timezone/tz/2021a.1.0/zoneinfo/Europe/Warsaw]:\r\n```\n", "before_files": [{"content": "\"\"\"Unix support for tzlocal.\"\"\"\nimport os\nimport re\n\nimport dateutil.tz\n\n_cache_tz = None\n\n\ndef _try_tz_from_env():\n tzenv = os.environ.get(\"TZ\")\n if tzenv and tzenv[0] == \":\":\n tzenv = tzenv[1:]\n try:\n if tzenv:\n dateutil.tz.gettz(tzenv)\n return tzenv\n except Exception:\n pass\n\n\ndef _get_localzone(_root=\"/\"):\n \"\"\"Try to find the local timezone configuration.\n\n The parameter _root makes the function look for files like /etc/localtime\n beneath the _root directory. This is primarily used by the tests.\n In normal usage you call the function without parameters.\n \"\"\"\n tzenv = _try_tz_from_env()\n if tzenv:\n return tzenv\n\n # Are we under Termux on Android?\n if os.path.exists(\"/system/bin/getprop\"):\n import subprocess\n\n androidtz = (\n subprocess.check_output([\"getprop\", \"persist.sys.timezone\"])\n .strip()\n .decode()\n )\n return androidtz\n\n # Now look for distribution specific configuration files\n # that contain the timezone name.\n for configfile in (\"etc/timezone\", \"var/db/zoneinfo\"):\n tzpath = os.path.join(_root, configfile)\n try:\n with open(tzpath, \"rb\") as tzfile:\n data = tzfile.read()\n\n # Issue #3 was that /etc/timezone was a zoneinfo file.\n # That's a misconfiguration, but we need to handle it gracefully:\n if data[:5] == b\"TZif2\":\n continue\n\n etctz = data.strip().decode()\n if not etctz:\n # Empty file, skip\n continue\n for etctz in data.decode().splitlines():\n # Get rid of host definitions and comments:\n if \" \" in etctz:\n etctz, dummy = etctz.split(\" \", 1)\n if \"#\" in etctz:\n etctz, dummy = etctz.split(\"#\", 1)\n if not etctz:\n continue\n tz = etctz.replace(\" \", \"_\")\n return tz\n\n except IOError:\n # File doesn't exist or is a directory\n continue\n\n # CentOS has a ZONE setting in /etc/sysconfig/clock,\n # OpenSUSE has a TIMEZONE setting in /etc/sysconfig/clock and\n # Gentoo has a TIMEZONE setting in /etc/conf.d/clock\n # We look through these files for a timezone:\n\n zone_re = re.compile(r\"\\s*ZONE\\s*=\\s*\\\"\")\n timezone_re = re.compile(r\"\\s*TIMEZONE\\s*=\\s*\\\"\")\n end_re = re.compile('\"')\n\n for filename in (\"etc/sysconfig/clock\", \"etc/conf.d/clock\"):\n tzpath = os.path.join(_root, filename)\n try:\n with open(tzpath, \"rt\") as tzfile:\n data = tzfile.readlines()\n\n for line in data:\n # Look for the ZONE= setting.\n match = zone_re.match(line)\n if match is None:\n # No ZONE= setting. 
Look for the TIMEZONE= setting.\n match = timezone_re.match(line)\n if match is not None:\n # Some setting existed\n line = line[match.end():]\n etctz = line[: end_re.search(line).start()]\n\n # We found a timezone\n tz = etctz.replace(\" \", \"_\")\n return tz\n\n except IOError:\n # File doesn't exist or is a directory\n continue\n\n # systemd distributions use symlinks that include the zone name,\n # see manpage of localtime(5) and timedatectl(1)\n tzpath = os.path.join(_root, \"etc/localtime\")\n if os.path.exists(tzpath) and os.path.islink(tzpath):\n tzpath = os.path.realpath(tzpath)\n start = tzpath.find(\"/\") + 1\n while start != 0:\n tzpath = tzpath[start:]\n try:\n dateutil.tz.gettz(tzpath)\n return tzpath\n except Exception:\n pass\n start = tzpath.find(\"/\") + 1\n\n # Nothing found, return UTC\n return None\n\n\ndef get_localzone():\n \"\"\"Get the computers configured local timezone, if any.\"\"\"\n global _cache_tz\n if _cache_tz is None:\n _cache_tz = _get_localzone()\n\n return _cache_tz\n\n\ndef reload_localzone():\n \"\"\"Reload the cached localzone. You need to call this if the timezone has changed.\"\"\"\n global _cache_tz\n _cache_tz = _get_localzone()\n return _cache_tz\n", "path": "nikola/packages/tzlocal/unix.py"}], "after_files": [{"content": "\"\"\"Unix support for tzlocal.\"\"\"\nimport os\nimport re\n\nimport dateutil.tz\n\n_cache_tz = None\n\n\ndef _try_tz_from_env():\n tzenv = os.environ.get(\"TZ\")\n if tzenv and tzenv[0] == \":\":\n tzenv = tzenv[1:]\n try:\n if tzenv:\n dateutil.tz.gettz(tzenv)\n return tzenv\n except Exception:\n pass\n\n\ndef _get_localzone(_root=\"/\"):\n \"\"\"Try to find the local timezone configuration.\n\n The parameter _root makes the function look for files like /etc/localtime\n beneath the _root directory. 
This is primarily used by the tests.\n In normal usage you call the function without parameters.\n \"\"\"\n tzenv = _try_tz_from_env()\n if tzenv:\n return tzenv\n\n # Are we under Termux on Android?\n if os.path.exists(\"/system/bin/getprop\"):\n import subprocess\n\n androidtz = (\n subprocess.check_output([\"getprop\", \"persist.sys.timezone\"])\n .strip()\n .decode()\n )\n return androidtz\n\n # Now look for distribution specific configuration files\n # that contain the timezone name.\n for configfile in (\"etc/timezone\", \"var/db/zoneinfo\"):\n tzpath = os.path.join(_root, configfile)\n try:\n with open(tzpath, \"rb\") as tzfile:\n data = tzfile.read()\n\n # Issue #3 was that /etc/timezone was a zoneinfo file.\n # That's a misconfiguration, but we need to handle it gracefully:\n if data[:5] == b\"TZif2\":\n continue\n\n etctz = data.strip().decode()\n if not etctz:\n # Empty file, skip\n continue\n for etctz in data.decode().splitlines():\n # Get rid of host definitions and comments:\n if \" \" in etctz:\n etctz, dummy = etctz.split(\" \", 1)\n if \"#\" in etctz:\n etctz, dummy = etctz.split(\"#\", 1)\n if not etctz:\n continue\n tz = etctz.replace(\" \", \"_\")\n return tz\n\n except IOError:\n # File doesn't exist or is a directory\n continue\n\n # CentOS has a ZONE setting in /etc/sysconfig/clock,\n # OpenSUSE has a TIMEZONE setting in /etc/sysconfig/clock and\n # Gentoo has a TIMEZONE setting in /etc/conf.d/clock\n # We look through these files for a timezone:\n\n zone_re = re.compile(r\"\\s*ZONE\\s*=\\s*\\\"\")\n timezone_re = re.compile(r\"\\s*TIMEZONE\\s*=\\s*\\\"\")\n end_re = re.compile('\"')\n\n for filename in (\"etc/sysconfig/clock\", \"etc/conf.d/clock\"):\n tzpath = os.path.join(_root, filename)\n try:\n with open(tzpath, \"rt\") as tzfile:\n data = tzfile.readlines()\n\n for line in data:\n # Look for the ZONE= setting.\n match = zone_re.match(line)\n if match is None:\n # No ZONE= setting. Look for the TIMEZONE= setting.\n match = timezone_re.match(line)\n if match is not None:\n # Some setting existed\n line = line[match.end():]\n etctz = line[: end_re.search(line).start()]\n\n # We found a timezone\n tz = etctz.replace(\" \", \"_\")\n return tz\n\n except IOError:\n # File doesn't exist or is a directory\n continue\n\n # systemd distributions use symlinks that include the zone name,\n # see manpage of localtime(5) and timedatectl(1)\n tzpath = os.path.join(_root, \"etc/localtime\")\n if os.path.exists(tzpath) and os.path.islink(tzpath):\n tzpath = os.path.realpath(tzpath)\n start = tzpath.find(\"/\") + 1\n while start != 0:\n tzpath = tzpath[start:]\n try:\n tested_tz = dateutil.tz.gettz(tzpath)\n if tested_tz:\n return tzpath\n except Exception:\n pass\n start = tzpath.find(\"/\") + 1\n\n # Nothing found, return UTC\n return None\n\n\ndef get_localzone():\n \"\"\"Get the computers configured local timezone, if any.\"\"\"\n global _cache_tz\n if _cache_tz is None:\n _cache_tz = _get_localzone()\n\n return _cache_tz\n\n\ndef reload_localzone():\n \"\"\"Reload the cached localzone. You need to call this if the timezone has changed.\"\"\"\n global _cache_tz\n _cache_tz = _get_localzone()\n return _cache_tz\n", "path": "nikola/packages/tzlocal/unix.py"}]} | 1,730 | 149 |
gh_patches_debug_26724 | rasdani/github-patches | git_diff | ivy-llc__ivy-14028 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
diag_indices_from
mask_indices
--- END ISSUE ---
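For context, the two functions requested above mirror NumPy's public `np.diag_indices_from` and `np.mask_indices`; a short sketch of the reference behaviour the frontend is expected to reproduce (array contents are illustrative):

```python
import numpy as np

a = np.arange(16).reshape(4, 4)
rows, cols = np.diag_indices_from(a)   # index arrays for the main diagonal of a square array
a[rows, cols] = 0                      # e.g. zero out the diagonal in place

iu = np.mask_indices(3, np.triu, k=1)  # indices where triu(ones((3, 3)), k=1) is non-zero
print(np.ones((3, 3))[iu])             # [1. 1. 1.]
```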
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/jax/numpy/indexing.py`
Content:
```
1 # local
2 import ivy
3 from ivy.functional.frontends.jax.func_wrapper import (
4 to_ivy_arrays_and_back,
5 )
6
7
8 @to_ivy_arrays_and_back
9 def diagonal(a, offset=0, axis1=0, axis2=1):
10 return ivy.diagonal(a, offset=offset, axis1=axis1, axis2=axis2)
11
12
13 @to_ivy_arrays_and_back
14 def diag(v, k=0):
15 return ivy.diag(v, k=k)
16
17
18 @to_ivy_arrays_and_back
19 def diag_indices(n, ndim=2):
20 idx = ivy.arange(n, dtype=int)
21 return (idx,) * ndim
22
23
24 # take_along_axis
25 @to_ivy_arrays_and_back
26 def take_along_axis(arr, indices, axis, mode="fill"):
27 return ivy.take_along_axis(arr, indices, axis, mode=mode)
28
29
30 @to_ivy_arrays_and_back
31 def tril_indices(n_rows, n_cols=None, k=0):
32 return ivy.tril_indices(n_rows, n_cols, k)
33
34
35 @to_ivy_arrays_and_back
36 def triu_indices(n, k=0, m=None):
37 return ivy.triu_indices(n, m, k)
38
39
40 @to_ivy_arrays_and_back
41 def triu_indices_from(arr, k=0):
42 return ivy.triu_indices(arr.shape[-2], arr.shape[-1], k)
43
44
45 def tril_indices_from(arr, k=0):
46 return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)
47
48
49 # unravel_index
50 @to_ivy_arrays_and_back
51 def unravel_index(indices, shape):
52 ret = [x.astype("int64") for x in ivy.unravel_index(indices, shape)]
53 return tuple(ret)
54
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/jax/numpy/indexing.py b/ivy/functional/frontends/jax/numpy/indexing.py
--- a/ivy/functional/frontends/jax/numpy/indexing.py
+++ b/ivy/functional/frontends/jax/numpy/indexing.py
@@ -1,3 +1,6 @@
+# global
+import inspect
+
# local
import ivy
from ivy.functional.frontends.jax.func_wrapper import (
@@ -42,6 +45,7 @@
return ivy.triu_indices(arr.shape[-2], arr.shape[-1], k)
+@to_ivy_arrays_and_back
def tril_indices_from(arr, k=0):
return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)
@@ -51,3 +55,28 @@
def unravel_index(indices, shape):
ret = [x.astype("int64") for x in ivy.unravel_index(indices, shape)]
return tuple(ret)
+
+
+@to_ivy_arrays_and_back
+def mask_indices(n, mask_func, k=0):
+ mask_func_obj = inspect.unwrap(mask_func)
+ mask_func_name = mask_func_obj.__name__
+ try:
+ ivy_mask_func_obj = getattr(ivy.functional.frontends.jax.numpy, mask_func_name)
+ a = ivy.ones((n, n))
+ mask = ivy_mask_func_obj(a, k=k)
+ indices = ivy.argwhere(mask.ivy_array)
+ return indices[:, 0], indices[:, 1]
+ except AttributeError as e:
+ print(f"Attribute error: {e}")
+
+
+@to_ivy_arrays_and_back
+def diag_indices_from(arr):
+ print(arr)
+ n = arr.shape[0]
+ ndim = ivy.get_num_dims(arr)
+ if not all(arr.shape[i] == n for i in range(ndim)):
+ raise ValueError("All dimensions of input must be of equal length")
+ idx = ivy.arange(n, dtype=int)
+ return (idx,) * ndim
| {"golden_diff": "diff --git a/ivy/functional/frontends/jax/numpy/indexing.py b/ivy/functional/frontends/jax/numpy/indexing.py\n--- a/ivy/functional/frontends/jax/numpy/indexing.py\n+++ b/ivy/functional/frontends/jax/numpy/indexing.py\n@@ -1,3 +1,6 @@\n+# global\n+import inspect\n+\n # local\n import ivy\n from ivy.functional.frontends.jax.func_wrapper import (\n@@ -42,6 +45,7 @@\n return ivy.triu_indices(arr.shape[-2], arr.shape[-1], k)\n \n \n+@to_ivy_arrays_and_back\n def tril_indices_from(arr, k=0):\n return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)\n \n@@ -51,3 +55,28 @@\n def unravel_index(indices, shape):\n ret = [x.astype(\"int64\") for x in ivy.unravel_index(indices, shape)]\n return tuple(ret)\n+\n+\n+@to_ivy_arrays_and_back\n+def mask_indices(n, mask_func, k=0):\n+ mask_func_obj = inspect.unwrap(mask_func)\n+ mask_func_name = mask_func_obj.__name__\n+ try:\n+ ivy_mask_func_obj = getattr(ivy.functional.frontends.jax.numpy, mask_func_name)\n+ a = ivy.ones((n, n))\n+ mask = ivy_mask_func_obj(a, k=k)\n+ indices = ivy.argwhere(mask.ivy_array)\n+ return indices[:, 0], indices[:, 1]\n+ except AttributeError as e:\n+ print(f\"Attribute error: {e}\")\n+\n+\n+@to_ivy_arrays_and_back\n+def diag_indices_from(arr):\n+ print(arr)\n+ n = arr.shape[0]\n+ ndim = ivy.get_num_dims(arr)\n+ if not all(arr.shape[i] == n for i in range(ndim)):\n+ raise ValueError(\"All dimensions of input must be of equal length\")\n+ idx = ivy.arange(n, dtype=int)\n+ return (idx,) * ndim\n", "issue": "diag_indices_from\n\nmask_indices\n\n", "before_files": [{"content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@to_ivy_arrays_and_back\ndef diagonal(a, offset=0, axis1=0, axis2=1):\n return ivy.diagonal(a, offset=offset, axis1=axis1, axis2=axis2)\n\n\n@to_ivy_arrays_and_back\ndef diag(v, k=0):\n return ivy.diag(v, k=k)\n\n\n@to_ivy_arrays_and_back\ndef diag_indices(n, ndim=2):\n idx = ivy.arange(n, dtype=int)\n return (idx,) * ndim\n\n\n# take_along_axis\n@to_ivy_arrays_and_back\ndef take_along_axis(arr, indices, axis, mode=\"fill\"):\n return ivy.take_along_axis(arr, indices, axis, mode=mode)\n\n\n@to_ivy_arrays_and_back\ndef tril_indices(n_rows, n_cols=None, k=0):\n return ivy.tril_indices(n_rows, n_cols, k)\n\n\n@to_ivy_arrays_and_back\ndef triu_indices(n, k=0, m=None):\n return ivy.triu_indices(n, m, k)\n\n\n@to_ivy_arrays_and_back\ndef triu_indices_from(arr, k=0):\n return ivy.triu_indices(arr.shape[-2], arr.shape[-1], k)\n\n\ndef tril_indices_from(arr, k=0):\n return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)\n\n\n# unravel_index\n@to_ivy_arrays_and_back\ndef unravel_index(indices, shape):\n ret = [x.astype(\"int64\") for x in ivy.unravel_index(indices, shape)]\n return tuple(ret)\n", "path": "ivy/functional/frontends/jax/numpy/indexing.py"}], "after_files": [{"content": "# global\nimport inspect\n\n# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@to_ivy_arrays_and_back\ndef diagonal(a, offset=0, axis1=0, axis2=1):\n return ivy.diagonal(a, offset=offset, axis1=axis1, axis2=axis2)\n\n\n@to_ivy_arrays_and_back\ndef diag(v, k=0):\n return ivy.diag(v, k=k)\n\n\n@to_ivy_arrays_and_back\ndef diag_indices(n, ndim=2):\n idx = ivy.arange(n, dtype=int)\n return (idx,) * ndim\n\n\n# take_along_axis\n@to_ivy_arrays_and_back\ndef take_along_axis(arr, indices, axis, mode=\"fill\"):\n return ivy.take_along_axis(arr, indices, axis, mode=mode)\n\n\n@to_ivy_arrays_and_back\ndef 
tril_indices(n_rows, n_cols=None, k=0):\n return ivy.tril_indices(n_rows, n_cols, k)\n\n\n@to_ivy_arrays_and_back\ndef triu_indices(n, k=0, m=None):\n return ivy.triu_indices(n, m, k)\n\n\n@to_ivy_arrays_and_back\ndef triu_indices_from(arr, k=0):\n return ivy.triu_indices(arr.shape[-2], arr.shape[-1], k)\n\n\n@to_ivy_arrays_and_back\ndef tril_indices_from(arr, k=0):\n return ivy.tril_indices(arr.shape[-2], arr.shape[-1], k)\n\n\n# unravel_index\n@to_ivy_arrays_and_back\ndef unravel_index(indices, shape):\n ret = [x.astype(\"int64\") for x in ivy.unravel_index(indices, shape)]\n return tuple(ret)\n\n\n@to_ivy_arrays_and_back\ndef mask_indices(n, mask_func, k=0):\n mask_func_obj = inspect.unwrap(mask_func)\n mask_func_name = mask_func_obj.__name__\n try:\n ivy_mask_func_obj = getattr(ivy.functional.frontends.jax.numpy, mask_func_name)\n a = ivy.ones((n, n))\n mask = ivy_mask_func_obj(a, k=k)\n indices = ivy.argwhere(mask.ivy_array)\n return indices[:, 0], indices[:, 1]\n except AttributeError as e:\n print(f\"Attribute error: {e}\")\n\n\n@to_ivy_arrays_and_back\ndef diag_indices_from(arr):\n print(arr)\n n = arr.shape[0]\n ndim = ivy.get_num_dims(arr)\n if not all(arr.shape[i] == n for i in range(ndim)):\n raise ValueError(\"All dimensions of input must be of equal length\")\n idx = ivy.arange(n, dtype=int)\n return (idx,) * ndim\n", "path": "ivy/functional/frontends/jax/numpy/indexing.py"}]} | 767 | 464 |
gh_patches_debug_20739 | rasdani/github-patches | git_diff | googleapis__python-bigquery-725 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
free(): invalid pointer Aborted error when importing from google.cloud import bigquery in python
I started facing this peculiar issue while importing BigQuery in Python scripts as well as in the terminal.
Can anyone help me out here?
I even tried to downgrade the bigquery package from 2.20 to 2.1, but it still gives me the same error.
```
root@f6968ce75f9d:~# python
Python 3.7.10 (default, May 28 2021, 10:36:10)
[GCC 8.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from google.cloud import bigquery
free(): invalid pointer
Aborted
```
<img width="858" alt="Screenshot 2021-06-07 at 10 31 45 PM" src="https://user-images.githubusercontent.com/20011105/121060374-2c9cb000-c7e0-11eb-8819-8153137ac41a.png">
--- END ISSUE ---
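The crash occurs at import time, and the accepted fix below raises the `grpcio` floor to 1.38.1, so a practical first step is to check which `grpcio` is installed without triggering the crashing import. A small sketch (distribution names as published on PyPI):

```python
from importlib import metadata  # Python 3.8+; on 3.7 use the importlib_metadata backport

# Reads versions from installed package metadata, so it works even when
# "from google.cloud import bigquery" itself aborts the interpreter.
print(metadata.version("grpcio"))
print(metadata.version("google-cloud-bigquery"))
```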
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright 2018 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import io
16 import os
17
18 import setuptools
19
20
21 # Package metadata.
22
23 name = "google-cloud-bigquery"
24 description = "Google BigQuery API client library"
25
26 # Should be one of:
27 # 'Development Status :: 3 - Alpha'
28 # 'Development Status :: 4 - Beta'
29 # 'Development Status :: 5 - Production/Stable'
30 release_status = "Development Status :: 5 - Production/Stable"
31 dependencies = [
32 "google-api-core[grpc] >= 1.29.0, < 2.0.0dev",
33 "proto-plus >= 1.10.0",
34 "google-cloud-core >= 1.4.1, < 2.0dev",
35 "google-resumable-media >= 0.6.0, < 2.0dev",
36 "packaging >= 14.3",
37 "protobuf >= 3.12.0",
38 "requests >= 2.18.0, < 3.0.0dev",
39 ]
40 extras = {
41 "bqstorage": [
42 "google-cloud-bigquery-storage >= 2.0.0, <3.0.0dev",
43 # Due to an issue in pip's dependency resolver, the `grpc` extra is not
44 # installed, even though `google-cloud-bigquery-storage` specifies it
45 # as `google-api-core[grpc]`. We thus need to explicitly specify it here.
46 # See: https://github.com/googleapis/python-bigquery/issues/83 The
47 # grpc.Channel.close() method isn't added until 1.32.0.
48 # https://github.com/grpc/grpc/pull/15254
49 "grpcio >= 1.32.0, < 2.0dev",
50 "pyarrow >= 1.0.0, < 5.0dev",
51 ],
52 "pandas": ["pandas>=0.23.0", "pyarrow >= 1.0.0, < 5.0dev"],
53 "bignumeric_type": ["pyarrow >= 3.0.0, < 5.0dev"],
54 "tqdm": ["tqdm >= 4.7.4, <5.0.0dev"],
55 "opentelemetry": [
56 "opentelemetry-api >= 0.11b0",
57 "opentelemetry-sdk >= 0.11b0",
58 "opentelemetry-instrumentation >= 0.11b0",
59 ],
60 }
61
62 all_extras = []
63
64 for extra in extras:
65 # Exclude this extra from all to avoid overly strict dependencies on core
66 # libraries such as pyarrow.
67 # https://github.com/googleapis/python-bigquery/issues/563
68 if extra in {"bignumeric_type"}:
69 continue
70 all_extras.extend(extras[extra])
71
72 extras["all"] = all_extras
73
74 # Setup boilerplate below this line.
75
76 package_root = os.path.abspath(os.path.dirname(__file__))
77
78 readme_filename = os.path.join(package_root, "README.rst")
79 with io.open(readme_filename, encoding="utf-8") as readme_file:
80 readme = readme_file.read()
81
82 version = {}
83 with open(os.path.join(package_root, "google/cloud/bigquery/version.py")) as fp:
84 exec(fp.read(), version)
85 version = version["__version__"]
86
87 # Only include packages under the 'google' namespace. Do not include tests,
88 # benchmarks, etc.
89 packages = [
90 package
91 for package in setuptools.PEP420PackageFinder.find()
92 if package.startswith("google")
93 ]
94
95 # Determine which namespaces are needed.
96 namespaces = ["google"]
97 if "google.cloud" in packages:
98 namespaces.append("google.cloud")
99
100
101 setuptools.setup(
102 name=name,
103 version=version,
104 description=description,
105 long_description=readme,
106 author="Google LLC",
107 author_email="[email protected]",
108 license="Apache 2.0",
109 url="https://github.com/googleapis/python-bigquery",
110 classifiers=[
111 release_status,
112 "Intended Audience :: Developers",
113 "License :: OSI Approved :: Apache Software License",
114 "Programming Language :: Python",
115 "Programming Language :: Python :: 3",
116 "Programming Language :: Python :: 3.6",
117 "Programming Language :: Python :: 3.7",
118 "Programming Language :: Python :: 3.8",
119 "Programming Language :: Python :: 3.9",
120 "Operating System :: OS Independent",
121 "Topic :: Internet",
122 ],
123 platforms="Posix; MacOS X; Windows",
124 packages=packages,
125 namespace_packages=namespaces,
126 install_requires=dependencies,
127 extras_require=extras,
128 python_requires=">=3.6, <3.10",
129 include_package_data=True,
130 zip_safe=False,
131 )
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -29,6 +29,7 @@
# 'Development Status :: 5 - Production/Stable'
release_status = "Development Status :: 5 - Production/Stable"
dependencies = [
+ "grpcio >= 1.38.1, < 2.0dev", # https://github.com/googleapis/python-bigquery/issues/695
"google-api-core[grpc] >= 1.29.0, < 2.0.0dev",
"proto-plus >= 1.10.0",
"google-cloud-core >= 1.4.1, < 2.0dev",
@@ -46,7 +47,7 @@
# See: https://github.com/googleapis/python-bigquery/issues/83 The
# grpc.Channel.close() method isn't added until 1.32.0.
# https://github.com/grpc/grpc/pull/15254
- "grpcio >= 1.32.0, < 2.0dev",
+ "grpcio >= 1.38.1, < 2.0dev",
"pyarrow >= 1.0.0, < 5.0dev",
],
"pandas": ["pandas>=0.23.0", "pyarrow >= 1.0.0, < 5.0dev"],
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -29,6 +29,7 @@\n # 'Development Status :: 5 - Production/Stable'\n release_status = \"Development Status :: 5 - Production/Stable\"\n dependencies = [\n+ \"grpcio >= 1.38.1, < 2.0dev\", # https://github.com/googleapis/python-bigquery/issues/695\n \"google-api-core[grpc] >= 1.29.0, < 2.0.0dev\",\n \"proto-plus >= 1.10.0\",\n \"google-cloud-core >= 1.4.1, < 2.0dev\",\n@@ -46,7 +47,7 @@\n # See: https://github.com/googleapis/python-bigquery/issues/83 The\n # grpc.Channel.close() method isn't added until 1.32.0.\n # https://github.com/grpc/grpc/pull/15254\n- \"grpcio >= 1.32.0, < 2.0dev\",\n+ \"grpcio >= 1.38.1, < 2.0dev\",\n \"pyarrow >= 1.0.0, < 5.0dev\",\n ],\n \"pandas\": [\"pandas>=0.23.0\", \"pyarrow >= 1.0.0, < 5.0dev\"],\n", "issue": "free(): invalid pointer Aborted error when importing from google.cloud import bigquery in python\nI started facing this peculiar issue while import importing BigQuery in python scripts as well as in the terminal. \r\nCan anyone help me out here?\r\nI even tried to downgrade the bigquery package from 2.20 to 2.1 its still give me the same error\r\n\r\n```\r\nroot@f6968ce75f9d:~# python \r\nPython 3.7.10 (default, May 28 2021, 10:36:10) \r\n[GCC 8.3.0] on linux\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> from google.cloud import bigquery\r\nfree(): invalid pointer\r\nAborted\r\n```\r\n\r\n<img width=\"858\" alt=\"Screenshot 2021-06-07 at 10 31 45 PM\" src=\"https://user-images.githubusercontent.com/20011105/121060374-2c9cb000-c7e0-11eb-8819-8153137ac41a.png\">\r\n\r\n\r\n\n", "before_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport os\n\nimport setuptools\n\n\n# Package metadata.\n\nname = \"google-cloud-bigquery\"\ndescription = \"Google BigQuery API client library\"\n\n# Should be one of:\n# 'Development Status :: 3 - Alpha'\n# 'Development Status :: 4 - Beta'\n# 'Development Status :: 5 - Production/Stable'\nrelease_status = \"Development Status :: 5 - Production/Stable\"\ndependencies = [\n \"google-api-core[grpc] >= 1.29.0, < 2.0.0dev\",\n \"proto-plus >= 1.10.0\",\n \"google-cloud-core >= 1.4.1, < 2.0dev\",\n \"google-resumable-media >= 0.6.0, < 2.0dev\",\n \"packaging >= 14.3\",\n \"protobuf >= 3.12.0\",\n \"requests >= 2.18.0, < 3.0.0dev\",\n]\nextras = {\n \"bqstorage\": [\n \"google-cloud-bigquery-storage >= 2.0.0, <3.0.0dev\",\n # Due to an issue in pip's dependency resolver, the `grpc` extra is not\n # installed, even though `google-cloud-bigquery-storage` specifies it\n # as `google-api-core[grpc]`. 
We thus need to explicitly specify it here.\n # See: https://github.com/googleapis/python-bigquery/issues/83 The\n # grpc.Channel.close() method isn't added until 1.32.0.\n # https://github.com/grpc/grpc/pull/15254\n \"grpcio >= 1.32.0, < 2.0dev\",\n \"pyarrow >= 1.0.0, < 5.0dev\",\n ],\n \"pandas\": [\"pandas>=0.23.0\", \"pyarrow >= 1.0.0, < 5.0dev\"],\n \"bignumeric_type\": [\"pyarrow >= 3.0.0, < 5.0dev\"],\n \"tqdm\": [\"tqdm >= 4.7.4, <5.0.0dev\"],\n \"opentelemetry\": [\n \"opentelemetry-api >= 0.11b0\",\n \"opentelemetry-sdk >= 0.11b0\",\n \"opentelemetry-instrumentation >= 0.11b0\",\n ],\n}\n\nall_extras = []\n\nfor extra in extras:\n # Exclude this extra from all to avoid overly strict dependencies on core\n # libraries such as pyarrow.\n # https://github.com/googleapis/python-bigquery/issues/563\n if extra in {\"bignumeric_type\"}:\n continue\n all_extras.extend(extras[extra])\n\nextras[\"all\"] = all_extras\n\n# Setup boilerplate below this line.\n\npackage_root = os.path.abspath(os.path.dirname(__file__))\n\nreadme_filename = os.path.join(package_root, \"README.rst\")\nwith io.open(readme_filename, encoding=\"utf-8\") as readme_file:\n readme = readme_file.read()\n\nversion = {}\nwith open(os.path.join(package_root, \"google/cloud/bigquery/version.py\")) as fp:\n exec(fp.read(), version)\nversion = version[\"__version__\"]\n\n# Only include packages under the 'google' namespace. Do not include tests,\n# benchmarks, etc.\npackages = [\n package\n for package in setuptools.PEP420PackageFinder.find()\n if package.startswith(\"google\")\n]\n\n# Determine which namespaces are needed.\nnamespaces = [\"google\"]\nif \"google.cloud\" in packages:\n namespaces.append(\"google.cloud\")\n\n\nsetuptools.setup(\n name=name,\n version=version,\n description=description,\n long_description=readme,\n author=\"Google LLC\",\n author_email=\"[email protected]\",\n license=\"Apache 2.0\",\n url=\"https://github.com/googleapis/python-bigquery\",\n classifiers=[\n release_status,\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet\",\n ],\n platforms=\"Posix; MacOS X; Windows\",\n packages=packages,\n namespace_packages=namespaces,\n install_requires=dependencies,\n extras_require=extras,\n python_requires=\">=3.6, <3.10\",\n include_package_data=True,\n zip_safe=False,\n)\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport os\n\nimport setuptools\n\n\n# Package metadata.\n\nname = \"google-cloud-bigquery\"\ndescription = \"Google BigQuery API client library\"\n\n# Should be one of:\n# 'Development Status :: 3 - Alpha'\n# 'Development 
Status :: 4 - Beta'\n# 'Development Status :: 5 - Production/Stable'\nrelease_status = \"Development Status :: 5 - Production/Stable\"\ndependencies = [\n \"grpcio >= 1.38.1, < 2.0dev\", # https://github.com/googleapis/python-bigquery/issues/695\n \"google-api-core[grpc] >= 1.29.0, < 2.0.0dev\",\n \"proto-plus >= 1.10.0\",\n \"google-cloud-core >= 1.4.1, < 2.0dev\",\n \"google-resumable-media >= 0.6.0, < 2.0dev\",\n \"packaging >= 14.3\",\n \"protobuf >= 3.12.0\",\n \"requests >= 2.18.0, < 3.0.0dev\",\n]\nextras = {\n \"bqstorage\": [\n \"google-cloud-bigquery-storage >= 2.0.0, <3.0.0dev\",\n # Due to an issue in pip's dependency resolver, the `grpc` extra is not\n # installed, even though `google-cloud-bigquery-storage` specifies it\n # as `google-api-core[grpc]`. We thus need to explicitly specify it here.\n # See: https://github.com/googleapis/python-bigquery/issues/83 The\n # grpc.Channel.close() method isn't added until 1.32.0.\n # https://github.com/grpc/grpc/pull/15254\n \"grpcio >= 1.38.1, < 2.0dev\",\n \"pyarrow >= 1.0.0, < 5.0dev\",\n ],\n \"pandas\": [\"pandas>=0.23.0\", \"pyarrow >= 1.0.0, < 5.0dev\"],\n \"bignumeric_type\": [\"pyarrow >= 3.0.0, < 5.0dev\"],\n \"tqdm\": [\"tqdm >= 4.7.4, <5.0.0dev\"],\n \"opentelemetry\": [\n \"opentelemetry-api >= 0.11b0\",\n \"opentelemetry-sdk >= 0.11b0\",\n \"opentelemetry-instrumentation >= 0.11b0\",\n ],\n}\n\nall_extras = []\n\nfor extra in extras:\n # Exclude this extra from all to avoid overly strict dependencies on core\n # libraries such as pyarrow.\n # https://github.com/googleapis/python-bigquery/issues/563\n if extra in {\"bignumeric_type\"}:\n continue\n all_extras.extend(extras[extra])\n\nextras[\"all\"] = all_extras\n\n# Setup boilerplate below this line.\n\npackage_root = os.path.abspath(os.path.dirname(__file__))\n\nreadme_filename = os.path.join(package_root, \"README.rst\")\nwith io.open(readme_filename, encoding=\"utf-8\") as readme_file:\n readme = readme_file.read()\n\nversion = {}\nwith open(os.path.join(package_root, \"google/cloud/bigquery/version.py\")) as fp:\n exec(fp.read(), version)\nversion = version[\"__version__\"]\n\n# Only include packages under the 'google' namespace. Do not include tests,\n# benchmarks, etc.\npackages = [\n package\n for package in setuptools.PEP420PackageFinder.find()\n if package.startswith(\"google\")\n]\n\n# Determine which namespaces are needed.\nnamespaces = [\"google\"]\nif \"google.cloud\" in packages:\n namespaces.append(\"google.cloud\")\n\n\nsetuptools.setup(\n name=name,\n version=version,\n description=description,\n long_description=readme,\n author=\"Google LLC\",\n author_email=\"[email protected]\",\n license=\"Apache 2.0\",\n url=\"https://github.com/googleapis/python-bigquery\",\n classifiers=[\n release_status,\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Operating System :: OS Independent\",\n \"Topic :: Internet\",\n ],\n platforms=\"Posix; MacOS X; Windows\",\n packages=packages,\n namespace_packages=namespaces,\n install_requires=dependencies,\n extras_require=extras,\n python_requires=\">=3.6, <3.10\",\n include_package_data=True,\n zip_safe=False,\n)\n", "path": "setup.py"}]} | 1,989 | 322 |
gh_patches_debug_1501 | rasdani/github-patches | git_diff | googleapis__google-cloud-python-6332 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 'api_core-1.6.0a2'? or 'api_core-1.5.1'?
I'd like to use the changes from https://github.com/googleapis/google-cloud-python/pull/6310 in the library I'm working on.
Not sure about the version number for that one, since:
- I don't know what else has been released since 1.6.0a1
- I don't know what is intended by 1.6.0a1 in https://github.com/googleapis/google-cloud-python/pull/6267 (is it a pre-release?)
--- END ISSUE ---
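On the question raised above: under PEP 440 a suffix such as `a1` marks an alpha pre-release, which `pip` skips during normal dependency resolution unless `--pre` is passed or a pre-release is requested explicitly. A quick check with the `packaging` library:

```python
from packaging.version import Version

print(Version("1.6.0a1").is_prerelease)        # True  -- alpha pre-release of 1.6.0
print(Version("1.5.1") < Version("1.6.0a1"))   # True  -- 1.5.1 sorts before any 1.6.0 pre-release
```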
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `api_core/setup.py`
Content:
```
1 # Copyright 2018 Google LLC
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import io
16 import os
17
18 import setuptools
19
20
21 # Package metadata.
22
23 name = 'google-api-core'
24 description = 'Google API client core library'
25 version = '1.6.0a1'
26 # Should be one of:
27 # 'Development Status :: 3 - Alpha'
28 # 'Development Status :: 4 - Beta'
29 # 'Development Status :: 5 - Production/Stable'
30 release_status = 'Development Status :: 5 - Production/Stable'
31 dependencies = [
32 'googleapis-common-protos<2.0dev,>=1.5.3',
33 'protobuf>=3.4.0',
34 'google-auth<2.0.0dev,>=0.4.0',
35 'requests<3.0.0dev,>=2.18.0',
36 'setuptools>=34.0.0',
37 'six>=1.10.0',
38 'pytz',
39 'futures>=3.2.0;python_version<"3.2"'
40 ]
41 extras = {
42 'grpc': 'grpcio>=1.8.2',
43 'grpcio-gcp': 'grpcio-gcp>=0.2.2'
44 }
45
46
47 # Setup boilerplate below this line.
48
49 package_root = os.path.abspath(os.path.dirname(__file__))
50
51 readme_filename = os.path.join(package_root, 'README.rst')
52 with io.open(readme_filename, encoding='utf-8') as readme_file:
53 readme = readme_file.read()
54
55 # Only include packages under the 'google' namespace. Do not include tests,
56 # benchmarks, etc.
57 packages = [
58 package for package in setuptools.find_packages()
59 if package.startswith('google')]
60
61 # Determine which namespaces are needed.
62 namespaces = ['google']
63 if 'google.cloud' in packages:
64 namespaces.append('google.cloud')
65
66
67 setuptools.setup(
68 name=name,
69 version=version,
70 description=description,
71 long_description=readme,
72 author='Google LLC',
73 author_email='[email protected]',
74 license='Apache 2.0',
75 url='https://github.com/GoogleCloudPlatform/google-cloud-python',
76 classifiers=[
77 release_status,
78 'Intended Audience :: Developers',
79 'License :: OSI Approved :: Apache Software License',
80 'Programming Language :: Python',
81 'Programming Language :: Python :: 2',
82 'Programming Language :: Python :: 2.7',
83 'Programming Language :: Python :: 3',
84 'Programming Language :: Python :: 3.5',
85 'Programming Language :: Python :: 3.6',
86 'Programming Language :: Python :: 3.7',
87 'Operating System :: OS Independent',
88 'Topic :: Internet',
89 ],
90 platforms='Posix; MacOS X; Windows',
91 packages=packages,
92 namespace_packages=namespaces,
93 install_requires=dependencies,
94 extras_require=extras,
95 include_package_data=True,
96 zip_safe=False,
97 )
98
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/api_core/setup.py b/api_core/setup.py
--- a/api_core/setup.py
+++ b/api_core/setup.py
@@ -22,7 +22,7 @@
name = 'google-api-core'
description = 'Google API client core library'
-version = '1.6.0a1'
+version = '1.5.1'
# Should be one of:
# 'Development Status :: 3 - Alpha'
# 'Development Status :: 4 - Beta'
| {"golden_diff": "diff --git a/api_core/setup.py b/api_core/setup.py\n--- a/api_core/setup.py\n+++ b/api_core/setup.py\n@@ -22,7 +22,7 @@\n \n name = 'google-api-core'\n description = 'Google API client core library'\n-version = '1.6.0a1'\n+version = '1.5.1'\n # Should be one of:\n # 'Development Status :: 3 - Alpha'\n # 'Development Status :: 4 - Beta'\n", "issue": "Release 'api_core-1.6.0a2'? or 'api_core-1.5.1'?\nI'd like to use the changes from https://github.com/googleapis/google-cloud-python/pull/6310 in the library I'm working on.\r\n\r\nNot sure about the version number for that one, since:\r\n\r\n- I don't know what else has been released since 1.6.0a1\r\n- I don't know what is intended by 1.6.0a1 in https://github.com/googleapis/google-cloud-python/pull/6267 (is it a pre-release?)\n", "before_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport os\n\nimport setuptools\n\n\n# Package metadata.\n\nname = 'google-api-core'\ndescription = 'Google API client core library'\nversion = '1.6.0a1'\n# Should be one of:\n# 'Development Status :: 3 - Alpha'\n# 'Development Status :: 4 - Beta'\n# 'Development Status :: 5 - Production/Stable'\nrelease_status = 'Development Status :: 5 - Production/Stable'\ndependencies = [\n 'googleapis-common-protos<2.0dev,>=1.5.3',\n 'protobuf>=3.4.0',\n 'google-auth<2.0.0dev,>=0.4.0',\n 'requests<3.0.0dev,>=2.18.0',\n 'setuptools>=34.0.0',\n 'six>=1.10.0',\n 'pytz',\n 'futures>=3.2.0;python_version<\"3.2\"'\n]\nextras = {\n 'grpc': 'grpcio>=1.8.2',\n 'grpcio-gcp': 'grpcio-gcp>=0.2.2'\n}\n\n\n# Setup boilerplate below this line.\n\npackage_root = os.path.abspath(os.path.dirname(__file__))\n\nreadme_filename = os.path.join(package_root, 'README.rst')\nwith io.open(readme_filename, encoding='utf-8') as readme_file:\n readme = readme_file.read()\n\n# Only include packages under the 'google' namespace. 
Do not include tests,\n# benchmarks, etc.\npackages = [\n package for package in setuptools.find_packages()\n if package.startswith('google')]\n\n# Determine which namespaces are needed.\nnamespaces = ['google']\nif 'google.cloud' in packages:\n namespaces.append('google.cloud')\n\n\nsetuptools.setup(\n name=name,\n version=version,\n description=description,\n long_description=readme,\n author='Google LLC',\n author_email='[email protected]',\n license='Apache 2.0',\n url='https://github.com/GoogleCloudPlatform/google-cloud-python',\n classifiers=[\n release_status,\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: Apache Software License',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Operating System :: OS Independent',\n 'Topic :: Internet',\n ],\n platforms='Posix; MacOS X; Windows',\n packages=packages,\n namespace_packages=namespaces,\n install_requires=dependencies,\n extras_require=extras,\n include_package_data=True,\n zip_safe=False,\n)\n", "path": "api_core/setup.py"}], "after_files": [{"content": "# Copyright 2018 Google LLC\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport io\nimport os\n\nimport setuptools\n\n\n# Package metadata.\n\nname = 'google-api-core'\ndescription = 'Google API client core library'\nversion = '1.5.1'\n# Should be one of:\n# 'Development Status :: 3 - Alpha'\n# 'Development Status :: 4 - Beta'\n# 'Development Status :: 5 - Production/Stable'\nrelease_status = 'Development Status :: 5 - Production/Stable'\ndependencies = [\n 'googleapis-common-protos<2.0dev,>=1.5.3',\n 'protobuf>=3.4.0',\n 'google-auth<2.0.0dev,>=0.4.0',\n 'requests<3.0.0dev,>=2.18.0',\n 'setuptools>=34.0.0',\n 'six>=1.10.0',\n 'pytz',\n 'futures>=3.2.0;python_version<\"3.2\"'\n]\nextras = {\n 'grpc': 'grpcio>=1.8.2',\n 'grpcio-gcp': 'grpcio-gcp>=0.2.2'\n}\n\n\n# Setup boilerplate below this line.\n\npackage_root = os.path.abspath(os.path.dirname(__file__))\n\nreadme_filename = os.path.join(package_root, 'README.rst')\nwith io.open(readme_filename, encoding='utf-8') as readme_file:\n readme = readme_file.read()\n\n# Only include packages under the 'google' namespace. 
Do not include tests,\n# benchmarks, etc.\npackages = [\n package for package in setuptools.find_packages()\n if package.startswith('google')]\n\n# Determine which namespaces are needed.\nnamespaces = ['google']\nif 'google.cloud' in packages:\n namespaces.append('google.cloud')\n\n\nsetuptools.setup(\n name=name,\n version=version,\n description=description,\n long_description=readme,\n author='Google LLC',\n author_email='[email protected]',\n license='Apache 2.0',\n url='https://github.com/GoogleCloudPlatform/google-cloud-python',\n classifiers=[\n release_status,\n 'Intended Audience :: Developers',\n 'License :: OSI Approved :: Apache Software License',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Programming Language :: Python :: 3.7',\n 'Operating System :: OS Independent',\n 'Topic :: Internet',\n ],\n platforms='Posix; MacOS X; Windows',\n packages=packages,\n namespace_packages=namespaces,\n install_requires=dependencies,\n extras_require=extras,\n include_package_data=True,\n zip_safe=False,\n)\n", "path": "api_core/setup.py"}]} | 1,319 | 104 |
gh_patches_debug_22057 | rasdani/github-patches | git_diff | facebookresearch__hydra-2713 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Hydra ray launcher requirements update
# 🚀 Feature Request
The requirements of `hydra-ray-launcher` are too restrictive. Please provide wider bounds. In particular, so that it is compatible with the latest version of ray and with python 3.10
```
install_requires=[
"boto3==1.22.6",
"hydra-core>=1.1.2",
"ray[default]==1.12.0",
"aiohttp==3.8.1",
"cloudpickle==2.0.0",
"pickle5==0.0.11; python_version < '3.8'",
],
```
## Motivation
**Is your feature request related to a problem? Please describe.**
There are multiple issues pointing to this problem #2407 , #2336 and many others
## Pitch
**Describe the solution you'd like**
<!-- A clear and concise description of what you want to happen. -->
**Describe alternatives you've considered**
<!-- A clear and concise description of any alternative solutions or features you've considered. -->
**Are you willing to open a pull request?** (See [CONTRIBUTING](../../CONTRIBUTING.md))
## Additional context
Add any other context or screenshots about the feature request here.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2
3 __version__ = "1.2.0"
4
```
Path: `plugins/hydra_ray_launcher/setup.py`
Content:
```
1 # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
2 # type: ignore
3 from pathlib import Path
4
5 from read_version import read_version
6 from setuptools import find_namespace_packages, setup
7
8 setup(
9 name="hydra-ray-launcher",
10 version=read_version("hydra_plugins/hydra_ray_launcher", "__init__.py"),
11 author="Jieru Hu",
12 author_email="[email protected]",
13 description="Hydra Ray launcher plugin",
14 long_description=(Path(__file__).parent / "README.md").read_text(),
15 long_description_content_type="text/markdown",
16 url="https://github.com/facebookresearch/hydra/",
17 packages=find_namespace_packages(include=["hydra_plugins.*"]),
18 classifiers=[
19 "License :: OSI Approved :: MIT License",
20 "Programming Language :: Python :: 3.7",
21 "Programming Language :: Python :: 3.8",
22 "Programming Language :: Python :: 3.9",
23 "Operating System :: MacOS",
24 "Operating System :: POSIX :: Linux",
25 ],
26 install_requires=[
27 "boto3==1.22.6",
28 "hydra-core>=1.1.2",
29 "ray[default]~=1.12",
30 "aiohttp==3.8.1",
31 "cloudpickle==2.0.0",
32 "pickle5==0.0.11; python_version < '3.8'",
33 ],
34 include_package_data=True,
35 )
36
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py b/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py
--- a/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py
+++ b/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py
@@ -1,3 +1,3 @@
# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved
-__version__ = "1.2.0"
+__version__ = "1.2.1"
diff --git a/plugins/hydra_ray_launcher/setup.py b/plugins/hydra_ray_launcher/setup.py
--- a/plugins/hydra_ray_launcher/setup.py
+++ b/plugins/hydra_ray_launcher/setup.py
@@ -20,16 +20,18 @@
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
+ "Programming Language :: Python :: 3.10",
+ "Programming Language :: Python :: 3.11",
"Operating System :: MacOS",
"Operating System :: POSIX :: Linux",
],
install_requires=[
- "boto3==1.22.6",
+ "boto3",
"hydra-core>=1.1.2",
- "ray[default]~=1.12",
- "aiohttp==3.8.1",
- "cloudpickle==2.0.0",
- "pickle5==0.0.11; python_version < '3.8'",
+ "ray[default]<3",
+ "aiohttp<4",
+ "cloudpickle<3",
+ "pickle5==0.0.12; python_version < '3.8'",
],
include_package_data=True,
)
| {"golden_diff": "diff --git a/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py b/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py\n--- a/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py\n+++ b/plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py\n@@ -1,3 +1,3 @@\n # Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n \n-__version__ = \"1.2.0\"\n+__version__ = \"1.2.1\"\ndiff --git a/plugins/hydra_ray_launcher/setup.py b/plugins/hydra_ray_launcher/setup.py\n--- a/plugins/hydra_ray_launcher/setup.py\n+++ b/plugins/hydra_ray_launcher/setup.py\n@@ -20,16 +20,18 @@\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n+ \"Programming Language :: Python :: 3.10\",\n+ \"Programming Language :: Python :: 3.11\",\n \"Operating System :: MacOS\",\n \"Operating System :: POSIX :: Linux\",\n ],\n install_requires=[\n- \"boto3==1.22.6\",\n+ \"boto3\",\n \"hydra-core>=1.1.2\",\n- \"ray[default]~=1.12\",\n- \"aiohttp==3.8.1\",\n- \"cloudpickle==2.0.0\",\n- \"pickle5==0.0.11; python_version < '3.8'\",\n+ \"ray[default]<3\",\n+ \"aiohttp<4\",\n+ \"cloudpickle<3\",\n+ \"pickle5==0.0.12; python_version < '3.8'\",\n ],\n include_package_data=True,\n )\n", "issue": "Hydra ray launcher requirements update\n# \ud83d\ude80 Feature Request\r\n\r\nThe requirements of `hydra-ray-launcher` are too restrictive. Please provide wider bounds. In particular, so that is compatible with the latest version of ray and with python 3.10\r\n\r\n```\r\ninstall_requires=[\r\n \"boto3==1.22.6\",\r\n \"hydra-core>=1.1.2\",\r\n \"ray[default]==1.12.0\",\r\n \"aiohttp==3.8.1\",\r\n \"cloudpickle==2.0.0\",\r\n \"pickle5==0.0.11; python_version < '3.8'\",\r\n ],\r\n```\r\n\r\n## Motivation\r\n\r\n**Is your feature request related to a problem? Please describe.**\r\n\r\nThere are multiple issues pointing to this problem #2407 , #2336 and many others\r\n\r\n## Pitch\r\n\r\n**Describe the solution you'd like**\r\n<!-- A clear and concise description of what you want to happen. -->\r\n\r\n**Describe alternatives you've considered**\r\n<!-- A clear and concise description of any alternative solutions or features you've considered. -->\r\n\r\n**Are you willing to open a pull request?** (See [CONTRIBUTING](../../CONTRIBUTING.md))\r\n\r\n## Additional context\r\nAdd any other context or screenshots about the feature request here.\r\n\n", "before_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n\n__version__ = \"1.2.0\"\n", "path": "plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py"}, {"content": "# Copyright (c) Facebook, Inc. and its affiliates. 
All Rights Reserved\n# type: ignore\nfrom pathlib import Path\n\nfrom read_version import read_version\nfrom setuptools import find_namespace_packages, setup\n\nsetup(\n name=\"hydra-ray-launcher\",\n version=read_version(\"hydra_plugins/hydra_ray_launcher\", \"__init__.py\"),\n author=\"Jieru Hu\",\n author_email=\"[email protected]\",\n description=\"Hydra Ray launcher plugin\",\n long_description=(Path(__file__).parent / \"README.md\").read_text(),\n long_description_content_type=\"text/markdown\",\n url=\"https://github.com/facebookresearch/hydra/\",\n packages=find_namespace_packages(include=[\"hydra_plugins.*\"]),\n classifiers=[\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Operating System :: MacOS\",\n \"Operating System :: POSIX :: Linux\",\n ],\n install_requires=[\n \"boto3==1.22.6\",\n \"hydra-core>=1.1.2\",\n \"ray[default]~=1.12\",\n \"aiohttp==3.8.1\",\n \"cloudpickle==2.0.0\",\n \"pickle5==0.0.11; python_version < '3.8'\",\n ],\n include_package_data=True,\n)\n", "path": "plugins/hydra_ray_launcher/setup.py"}], "after_files": [{"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n\n__version__ = \"1.2.1\"\n", "path": "plugins/hydra_ray_launcher/hydra_plugins/hydra_ray_launcher/__init__.py"}, {"content": "# Copyright (c) Facebook, Inc. and its affiliates. All Rights Reserved\n# type: ignore\nfrom pathlib import Path\n\nfrom read_version import read_version\nfrom setuptools import find_namespace_packages, setup\n\nsetup(\n name=\"hydra-ray-launcher\",\n version=read_version(\"hydra_plugins/hydra_ray_launcher\", \"__init__.py\"),\n author=\"Jieru Hu\",\n author_email=\"[email protected]\",\n description=\"Hydra Ray launcher plugin\",\n long_description=(Path(__file__).parent / \"README.md\").read_text(),\n long_description_content_type=\"text/markdown\",\n url=\"https://github.com/facebookresearch/hydra/\",\n packages=find_namespace_packages(include=[\"hydra_plugins.*\"]),\n classifiers=[\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n \"Operating System :: MacOS\",\n \"Operating System :: POSIX :: Linux\",\n ],\n install_requires=[\n \"boto3\",\n \"hydra-core>=1.1.2\",\n \"ray[default]<3\",\n \"aiohttp<4\",\n \"cloudpickle<3\",\n \"pickle5==0.0.12; python_version < '3.8'\",\n ],\n include_package_data=True,\n)\n", "path": "plugins/hydra_ray_launcher/setup.py"}]} | 986 | 431 |
gh_patches_debug_1855 | rasdani/github-patches | git_diff | flairNLP__flair-300 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update torch version to 1.0.0
torch version 1.0.0 is available. Use torch version 1.0.0 in Flair.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2
3 setup(
4 name='flair',
5 version='0.3.2',
6 description='A very simple framework for state-of-the-art NLP',
7 long_description=open("README.md", encoding='utf-8').read(),
8 long_description_content_type="text/markdown",
9 author='Alan Akbik',
10 author_email='[email protected]',
11 url='https://github.com/zalandoresearch/flair',
12 packages=find_packages(exclude='test'), # same as name
13 license='MIT',
14 install_requires=[
15 'torch==0.4.1',
16 'gensim==3.4.0',
17 'typing==3.6.4',
18 'tqdm==4.26.0',
19 'segtok==1.5.7',
20 'matplotlib==3.0.0',
21 'mpld3==0.3',
22 'sklearn',
23 'sqlitedict==1.6.0',
24 'deprecated==1.2.4',
25 'hyperopt==0.1.1',
26 'pytorch-pretrained-bert==0.3.0'
27 ],
28 include_package_data=True,
29 python_requires='>=3.6',
30 )
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -12,7 +12,7 @@
packages=find_packages(exclude='test'), # same as name
license='MIT',
install_requires=[
- 'torch==0.4.1',
+ 'torch==1.0.0',
'gensim==3.4.0',
'typing==3.6.4',
'tqdm==4.26.0',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -12,7 +12,7 @@\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n- 'torch==0.4.1',\n+ 'torch==1.0.0',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n 'tqdm==4.26.0',\n", "issue": "Update torch version to 1.0.0\ntorch version 1.0.0 is available. Use torch version 1.0.0 in Flair.\n", "before_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='flair',\n version='0.3.2',\n description='A very simple framework for state-of-the-art NLP',\n long_description=open(\"README.md\", encoding='utf-8').read(),\n long_description_content_type=\"text/markdown\",\n author='Alan Akbik',\n author_email='[email protected]',\n url='https://github.com/zalandoresearch/flair',\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n 'torch==0.4.1',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n 'tqdm==4.26.0',\n 'segtok==1.5.7',\n 'matplotlib==3.0.0',\n 'mpld3==0.3',\n 'sklearn',\n 'sqlitedict==1.6.0',\n 'deprecated==1.2.4',\n 'hyperopt==0.1.1',\n 'pytorch-pretrained-bert==0.3.0'\n ],\n include_package_data=True,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='flair',\n version='0.3.2',\n description='A very simple framework for state-of-the-art NLP',\n long_description=open(\"README.md\", encoding='utf-8').read(),\n long_description_content_type=\"text/markdown\",\n author='Alan Akbik',\n author_email='[email protected]',\n url='https://github.com/zalandoresearch/flair',\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n 'torch==1.0.0',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n 'tqdm==4.26.0',\n 'segtok==1.5.7',\n 'matplotlib==3.0.0',\n 'mpld3==0.3',\n 'sklearn',\n 'sqlitedict==1.6.0',\n 'deprecated==1.2.4',\n 'hyperopt==0.1.1',\n 'pytorch-pretrained-bert==0.3.0'\n ],\n include_package_data=True,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}]} | 618 | 112 |
gh_patches_debug_18221 | rasdani/github-patches | git_diff | PrefectHQ__prefect-2686 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Google Imports are Tied Together
## Description
*A clear description of the bug*
I’m using the new `GCSResult` and I’m getting an import error when I don’t also specify `google-cloud-bigquery` as a dependency, since their imports occur in the same file, I think?
```
Unexpected error: ImportError("cannot import name 'bigquery' from 'google.cloud' (unknown location)")
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/prefect/engine/runner.py", line 48, in inner
new_state = method(self, state, *args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/prefect/engine/task_runner.py", line 986, in get_task_run_state
result = self.result.write(value, filename="output", **prefect.context)
File "/usr/local/lib/python3.7/site-packages/prefect/engine/results/gcs_result.py", line 73, in write
self.gcs_bucket.blob(new.location).upload_from_string(binary_data)
File "/usr/local/lib/python3.7/site-packages/prefect/engine/results/gcs_result.py", line 35, in gcs_bucket
from prefect.utilities.gcp import get_storage_client
File "/usr/local/lib/python3.7/site-packages/prefect/utilities/gcp.py", line 6, in <module>
from google.cloud import bigquery, storage
ImportError: cannot import name 'bigquery' from 'google.cloud' (unknown location)
```
https://cloud.prefect.io/prefect/flow-run/6704aa4e-ba9b-40ed-a4f8-386920839a8e?logId=75b1fc01-0ee8-4061-ab8b-5481e6123a79
On a cool note, changing to `python_dependencies=["prefect[google]"]` did work 🎉
## Expected Behavior
*What did you expect to happen instead?*
I'd like to be able to specify one import in isolation (in this case `google-cloud-storage`)
## Reproduction
*A minimal example that exhibits the behavior.*
```
from prefect import task, Flow
from prefect.tasks.notifications.slack_task import SlackTask
from prefect.schedules import CronSchedule
from prefect.environments.storage import Docker
from prefect.engine.results import GCSResult
import pendulum
import datetime
@task(name="Get Week Message", max_retries=5, retry_delay=datetime.timedelta(seconds=5))
def get_week_message():
prefects_birthday = pendulum.date(2018, 1, 17)
current_week = prefects_birthday.diff(pendulum.now()).in_weeks()
return f"Hello, Jeremiah! It is week {current_week}."
send_message = SlackTask(
name="Slack Jeremiah",
max_retries=5,
retry_delay=datetime.timedelta(seconds=5),
webhook_secret="SLACK_WEBHOOK",
)
schedule = CronSchedule(cron="50 11 * * MON", start_date=pendulum.now(tz="US/Eastern"))
storage = Docker(
base_image="prefecthq/prefect:latest-python3.7",
registry_url=URL,
python_dependencies=["google-cloud-storage"],
files={
FILE_LOCATION: FILENAME
},
env_vars={"GOOGLE_APPLICATION_CREDENTIALS": FILENAME},
)
gcs_result = GCSResult(bucket="what_week_is_it_results")
with Flow(
name="What Week is It?", schedule=schedule, storage=storage, result=gcs_result
) as flow:
week_message = get_week_message()
result = send_message(message=week_message)
```
## Environment
*Any additional information about your environment*
*Optionally run `prefect diagnostics` from the command line and paste the information here*
```
{
"config_overrides": {
"cloud": {
"auth_token": true,
"use_local_secrets": true
},
"context": {
"secrets": false
},
"home_dir": true
},
"env_vars": [],
"system_information": {
"platform": "Darwin-19.4.0-x86_64-i386-64bit",
"prefect_version": "0.11.2",
"python_version": "3.7.7"
}
}
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/prefect/utilities/gcp.py`
Content:
```
1 """
2 Utility functions for interacting with Google Cloud.
3 """
4 import prefect
5
6 from google.cloud import bigquery, storage
7 from google.oauth2.service_account import Credentials
8
9
10 def get_google_client(submodule, credentials: dict = None, project: str = None):
11 """
12 Utility function for loading Google Client objects from a given set of credentials.
13
14 Args:
15 - submodule: a Python submodule with a Client attribute
16 - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if
17 not provided, will attempt to load the Client using ambient environment settings
18 - project (str, optional): the Google project to point the Client to; if not provided, Client defaults
19 will be used
20
21 Returns:
22 - Client: an initialized and authenticated Google Client
23 """
24 Client = getattr(submodule, "Client")
25 credentials = credentials or prefect.context.get("secrets", {}).get(
26 "GCP_CREDENTIALS"
27 )
28 if credentials is not None:
29 credentials = Credentials.from_service_account_info(credentials)
30 project = project or credentials.project_id
31 client = Client(project=project, credentials=credentials)
32 else:
33 client = Client(project=project)
34 return client
35
36
37 def get_storage_client(credentials: dict = None, project: str = None):
38 """
39 Utility function for instantiating a Google Storage Client from a given set of credentials.
40
41 Args:
42 - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if
43 not provided, will attempt to load the Client using ambient environment settings
44 - project (str, optional): the Google project to point the Client to; if not provided, Client defaults
45 will be used
46
47 Returns:
48 - Client: an initialized and authenticated Google Client
49 """
50 return get_google_client(storage, credentials=credentials, project=project)
51
52
53 def get_bigquery_client(credentials: dict = None, project: str = None):
54 """
55 Utility function for instantiating a Google BigQuery Client from a given set of credentials.
56
57 Args:
58 - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if
59 not provided, will attempt to load the Client using ambient environment settings
60 - project (str, optional): the Google project to point the Client to; if not provided, Client defaults
61 will be used
62
63 Returns:
64 - Client: an initialized and authenticated Google Client
65 """
66 return get_google_client(bigquery, credentials=credentials, project=project)
67
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/prefect/utilities/gcp.py b/src/prefect/utilities/gcp.py
--- a/src/prefect/utilities/gcp.py
+++ b/src/prefect/utilities/gcp.py
@@ -3,7 +3,6 @@
"""
import prefect
-from google.cloud import bigquery, storage
from google.oauth2.service_account import Credentials
@@ -47,6 +46,8 @@
Returns:
- Client: an initialized and authenticated Google Client
"""
+ from google.cloud import storage
+
return get_google_client(storage, credentials=credentials, project=project)
@@ -63,4 +64,6 @@
Returns:
- Client: an initialized and authenticated Google Client
"""
+ from google.cloud import bigquery
+
return get_google_client(bigquery, credentials=credentials, project=project)
| {"golden_diff": "diff --git a/src/prefect/utilities/gcp.py b/src/prefect/utilities/gcp.py\n--- a/src/prefect/utilities/gcp.py\n+++ b/src/prefect/utilities/gcp.py\n@@ -3,7 +3,6 @@\n \"\"\"\n import prefect\n \n-from google.cloud import bigquery, storage\n from google.oauth2.service_account import Credentials\n \n \n@@ -47,6 +46,8 @@\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n+ from google.cloud import storage\n+\n return get_google_client(storage, credentials=credentials, project=project)\n \n \n@@ -63,4 +64,6 @@\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n+ from google.cloud import bigquery\n+\n return get_google_client(bigquery, credentials=credentials, project=project)\n", "issue": "Google Imports are Tied Together\n## Description\r\n*A clear description of the bug*\r\nI\u2019m using the new `GCSResult` and I\u2019m getting an import error when I don\u2019t also specify `google-cloud-bigquery` as a dependency since they\u2019re imports occur in the same file, I think?\r\n\r\n```\r\nUnexpected error: ImportError(\"cannot import name 'bigquery' from 'google.cloud' (unknown location)\")\r\nTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.7/site-packages/prefect/engine/runner.py\", line 48, in inner\r\n new_state = method(self, state, *args, **kwargs)\r\n File \"/usr/local/lib/python3.7/site-packages/prefect/engine/task_runner.py\", line 986, in get_task_run_state\r\n result = self.result.write(value, filename=\"output\", **prefect.context)\r\n File \"/usr/local/lib/python3.7/site-packages/prefect/engine/results/gcs_result.py\", line 73, in write\r\n self.gcs_bucket.blob(new.location).upload_from_string(binary_data)\r\n File \"/usr/local/lib/python3.7/site-packages/prefect/engine/results/gcs_result.py\", line 35, in gcs_bucket\r\n from prefect.utilities.gcp import get_storage_client\r\n File \"/usr/local/lib/python3.7/site-packages/prefect/utilities/gcp.py\", line 6, in <module>\r\n from google.cloud import bigquery, storage\r\nImportError: cannot import name 'bigquery' from 'google.cloud' (unknown location)\r\n```\r\nhttps://cloud.prefect.io/prefect/flow-run/6704aa4e-ba9b-40ed-a4f8-386920839a8e?logId=75b1fc01-0ee8-4061-ab8b-5481e6123a79\r\n\r\nOn a cool note, changing to `python_dependencies=[\"prefect[google]\"]` did work \ud83c\udf89 \r\n\r\n## Expected Behavior\r\n*What did you expect to happen instead?*\r\nI'd like to be able to specify one import in insolation (in this case `google-cloud-storage`) \r\n\r\n\r\n## Reproduction\r\n*A minimal example that exhibits the behavior.*\r\n```\r\nfrom prefect import task, Flow\r\nfrom prefect.tasks.notifications.slack_task import SlackTask\r\nfrom prefect.schedules import CronSchedule\r\nfrom prefect.environments.storage import Docker\r\nfrom prefect.engine.results import GCSResult\r\n\r\nimport pendulum\r\nimport datetime\r\n\r\n\r\n@task(name=\"Get Week Message\", max_retries=5, retry_delay=datetime.timedelta(seconds=5))\r\ndef get_week_message():\r\n prefects_birthday = pendulum.date(2018, 1, 17)\r\n current_week = prefects_birthday.diff(pendulum.now()).in_weeks()\r\n return f\"Hello, Jeremiah! 
It is week {current_week}.\"\r\n\r\n\r\nsend_message = SlackTask(\r\n name=\"Slack Jeremiah\",\r\n max_retries=5,\r\n retry_delay=datetime.timedelta(seconds=5),\r\n webhook_secret=\"SLACK_WEBHOOK\",\r\n)\r\n\r\nschedule = CronSchedule(cron=\"50 11 * * MON\", start_date=pendulum.now(tz=\"US/Eastern\"))\r\n\r\nstorage = Docker(\r\n base_image=\"prefecthq/prefect:latest-python3.7\",\r\n registry_url=URL,\r\n python_dependencies=[\"google-cloud-storage\"],\r\n files={\r\n FILE_LOCATION: FILENAME\r\n },\r\n env_vars={\"GOOGLE_APPLICATION_CREDENTIALS\": FILENAME},\r\n)\r\n\r\ngcs_result = GCSResult(bucket=\"what_week_is_it_results\")\r\n\r\nwith Flow(\r\n name=\"What Week is It?\", schedule=schedule, storage=storage, result=gcs_result\r\n) as flow:\r\n week_message = get_week_message()\r\n result = send_message(message=week_message)\r\n```\r\n\r\n\r\n\r\n## Environment\r\n*Any additional information about your environment*\r\n\r\n\r\n*Optionally run `prefect diagnostics` from the command line and paste the information here*\r\n```\r\n{\r\n \"config_overrides\": {\r\n \"cloud\": {\r\n \"auth_token\": true,\r\n \"use_local_secrets\": true\r\n },\r\n \"context\": {\r\n \"secrets\": false\r\n },\r\n \"home_dir\": true\r\n },\r\n \"env_vars\": [],\r\n \"system_information\": {\r\n \"platform\": \"Darwin-19.4.0-x86_64-i386-64bit\",\r\n \"prefect_version\": \"0.11.2\",\r\n \"python_version\": \"3.7.7\"\r\n }\r\n}\r\n```\n", "before_files": [{"content": "\"\"\"\nUtility functions for interacting with Google Cloud.\n\"\"\"\nimport prefect\n\nfrom google.cloud import bigquery, storage\nfrom google.oauth2.service_account import Credentials\n\n\ndef get_google_client(submodule, credentials: dict = None, project: str = None):\n \"\"\"\n Utility function for loading Google Client objects from a given set of credentials.\n\n Args:\n - submodule: a Python submodule with a Client attribute\n - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if\n not provided, will attempt to load the Client using ambient environment settings\n - project (str, optional): the Google project to point the Client to; if not provided, Client defaults\n will be used\n\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n Client = getattr(submodule, \"Client\")\n credentials = credentials or prefect.context.get(\"secrets\", {}).get(\n \"GCP_CREDENTIALS\"\n )\n if credentials is not None:\n credentials = Credentials.from_service_account_info(credentials)\n project = project or credentials.project_id\n client = Client(project=project, credentials=credentials)\n else:\n client = Client(project=project)\n return client\n\n\ndef get_storage_client(credentials: dict = None, project: str = None):\n \"\"\"\n Utility function for instantiating a Google Storage Client from a given set of credentials.\n\n Args:\n - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if\n not provided, will attempt to load the Client using ambient environment settings\n - project (str, optional): the Google project to point the Client to; if not provided, Client defaults\n will be used\n\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n return get_google_client(storage, credentials=credentials, project=project)\n\n\ndef get_bigquery_client(credentials: dict = None, project: str = None):\n \"\"\"\n Utility function for instantiating a Google BigQuery Client from a given set of credentials.\n\n Args:\n - credentials 
(dict, optional): a dictionary of Google credentials used to initialize the Client; if\n not provided, will attempt to load the Client using ambient environment settings\n - project (str, optional): the Google project to point the Client to; if not provided, Client defaults\n will be used\n\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n return get_google_client(bigquery, credentials=credentials, project=project)\n", "path": "src/prefect/utilities/gcp.py"}], "after_files": [{"content": "\"\"\"\nUtility functions for interacting with Google Cloud.\n\"\"\"\nimport prefect\n\nfrom google.oauth2.service_account import Credentials\n\n\ndef get_google_client(submodule, credentials: dict = None, project: str = None):\n \"\"\"\n Utility function for loading Google Client objects from a given set of credentials.\n\n Args:\n - submodule: a Python submodule with a Client attribute\n - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if\n not provided, will attempt to load the Client using ambient environment settings\n - project (str, optional): the Google project to point the Client to; if not provided, Client defaults\n will be used\n\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n Client = getattr(submodule, \"Client\")\n credentials = credentials or prefect.context.get(\"secrets\", {}).get(\n \"GCP_CREDENTIALS\"\n )\n if credentials is not None:\n credentials = Credentials.from_service_account_info(credentials)\n project = project or credentials.project_id\n client = Client(project=project, credentials=credentials)\n else:\n client = Client(project=project)\n return client\n\n\ndef get_storage_client(credentials: dict = None, project: str = None):\n \"\"\"\n Utility function for instantiating a Google Storage Client from a given set of credentials.\n\n Args:\n - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if\n not provided, will attempt to load the Client using ambient environment settings\n - project (str, optional): the Google project to point the Client to; if not provided, Client defaults\n will be used\n\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n from google.cloud import storage\n\n return get_google_client(storage, credentials=credentials, project=project)\n\n\ndef get_bigquery_client(credentials: dict = None, project: str = None):\n \"\"\"\n Utility function for instantiating a Google BigQuery Client from a given set of credentials.\n\n Args:\n - credentials (dict, optional): a dictionary of Google credentials used to initialize the Client; if\n not provided, will attempt to load the Client using ambient environment settings\n - project (str, optional): the Google project to point the Client to; if not provided, Client defaults\n will be used\n\n Returns:\n - Client: an initialized and authenticated Google Client\n \"\"\"\n from google.cloud import bigquery\n\n return get_google_client(bigquery, credentials=credentials, project=project)\n", "path": "src/prefect/utilities/gcp.py"}]} | 1,906 | 188 |
gh_patches_debug_12930 | rasdani/github-patches | git_diff | spyder-ide__spyder-5128 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error when trying to load third-party plugins
## Description
**What steps will reproduce the problem?**
1. Start Spyder.
2.
3.
**What is the expected output? What do you see instead?**
It shows right away after starting Spyder.
**Please provide any additional information below**
A message box shows this:
module = _import_module_from_path(module_name, plugin_path)
module = spec.loader.load_module(module_name)
AttributeError: 'NoneType' object has no attribute 'load_module'

Spyder functions fine, no problem after closing the message box.
## Version and main components
* Spyder Version: 3.2.0
* Python Version: 3.5.3
* Qt Versions: 5.6.2, PyQt5 5.6 on Darwin
## Dependencies
```
pyflakes >=0.6.0 : 1.5.0 (OK)
pycodestyle >=2.3: 2.3.1 (OK)
pygments >=2.0 : 2.2.0 (OK)
pandas >=0.13.1 : 0.19.2 (OK)
numpy >=1.7 : 1.12.1 (OK)
sphinx >=0.6.6 : 1.5.4 (OK)
rope >=0.9.4 : 0.10.5 (OK)
jedi >=0.9.0 : 0.10.2 (OK)
psutil >=0.3 : 5.2.1 (OK)
nbconvert >=4.0 : 5.1.1 (OK)
sympy >=0.7.3 : None (NOK)
cython >=0.21 : None (NOK)
qtconsole >=4.2.0: 4.3.0 (OK)
IPython >=4.0 : 6.1.0 (OK)
pylint >=0.25 : 1.6.4 (OK)
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `spyder/otherplugins.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright © Spyder Project Contributors
4 # Licensed under the terms of the MIT License
5 # (see spyder/__init__.py for details)
6
7 """
8 Spyder third-party plugins configuration management
9 """
10
11 import os
12 import os.path as osp
13 import sys
14 import traceback
15
16 # Local imports
17 from spyder.config.base import get_conf_path
18 from spyder.py3compat import PY2
19
20 if PY2:
21 import imp
22 else:
23 import importlib
24
25
26 USER_PLUGIN_DIR = "plugins"
27 PLUGIN_PREFIX = "spyder_"
28 IO_PREFIX = PLUGIN_PREFIX + "io_"
29
30
31 def get_spyderplugins_mods(io=False):
32 """Import modules from plugins package and return the list"""
33 # Create user directory
34 user_plugin_path = osp.join(get_conf_path(), USER_PLUGIN_DIR)
35 if not osp.isdir(user_plugin_path):
36 os.makedirs(user_plugin_path)
37
38 modlist, modnames = [], []
39
40 # The user plugins directory is given the priority when looking for modules
41 for plugin_path in [user_plugin_path] + sys.path:
42 _get_spyderplugins(plugin_path, io, modnames, modlist)
43 return modlist
44
45
46 def _get_spyderplugins(plugin_path, is_io, modnames, modlist):
47 """Scan the directory `plugin_path` for plugin packages and loads them."""
48 if not osp.isdir(plugin_path):
49 return
50
51 for name in os.listdir(plugin_path):
52 # This is needed in order to register the spyder_io_hdf5 plugin.
53 # See issue 4487
54 # Is this a Spyder plugin?
55 if not name.startswith(PLUGIN_PREFIX):
56 continue
57 # Ensure right type of plugin
58 if is_io != name.startswith(IO_PREFIX):
59 continue
60
61 # Import the plugin
62 _import_plugin(name, plugin_path, modnames, modlist)
63
64
65 def _import_plugin(module_name, plugin_path, modnames, modlist):
66 """Import the plugin `module_name` from `plugin_path`, add it to `modlist`
67 and adds its name to `modnames`.
68 """
69 if module_name in modnames:
70 return
71 try:
72 # First add a mock module with the LOCALEPATH attribute so that the
73 # helper method can find the locale on import
74 mock = _ModuleMock()
75 mock.LOCALEPATH = osp.join(plugin_path, module_name, 'locale')
76 sys.modules[module_name] = mock
77
78 if osp.isdir(osp.join(plugin_path, module_name)):
79 module = _import_module_from_path(module_name, plugin_path)
80 else:
81 module = None
82
83 # Then restore the actual loaded module instead of the mock
84 if module:
85 sys.modules[module_name] = module
86 modlist.append(module)
87 modnames.append(module_name)
88 except Exception:
89 sys.stderr.write("ERROR: 3rd party plugin import failed for "
90 "`{0}`\n".format(module_name))
91 traceback.print_exc(file=sys.stderr)
92
93
94 def _import_module_from_path(module_name, plugin_path):
95 """Imports `module_name` from `plugin_path`.
96
97 Return None if no module is found.
98 """
99 module = None
100 if PY2:
101 info = imp.find_module(module_name, [plugin_path])
102 if info:
103 module = imp.load_module(module_name, *info)
104 elif sys.version_info[0:2] <= (3, 3):
105 loader = importlib.machinery.PathFinder.find_module(
106 module_name,
107 [plugin_path])
108 if loader:
109 module = loader.load_module(module_name)
110 else: # Python 3.4+
111 spec = importlib.machinery.PathFinder.find_spec(
112 module_name,
113 [plugin_path])
114 if spec:
115 module = spec.loader.load_module(module_name)
116 return module
117
118
119 class _ModuleMock():
120 """This mock module is added to sys.modules on plugin load to add the
121 location of the LOCALEDATA so that the module loads succesfully.
122 Once loaded the module is replaced by the actual loaded module object.
123 """
124 pass
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/spyder/otherplugins.py b/spyder/otherplugins.py
--- a/spyder/otherplugins.py
+++ b/spyder/otherplugins.py
@@ -54,10 +54,16 @@
# Is this a Spyder plugin?
if not name.startswith(PLUGIN_PREFIX):
continue
+
# Ensure right type of plugin
if is_io != name.startswith(IO_PREFIX):
continue
+ # Skip names that end in certain suffixes
+ forbidden_suffixes = ['dist-info', 'egg.info', 'egg-link']
+ if any([name.endswith(s) for s in forbidden_suffixes]):
+ continue
+
# Import the plugin
_import_plugin(name, plugin_path, modnames, modlist)
| {"golden_diff": "diff --git a/spyder/otherplugins.py b/spyder/otherplugins.py\n--- a/spyder/otherplugins.py\n+++ b/spyder/otherplugins.py\n@@ -54,10 +54,16 @@\n # Is this a Spyder plugin?\r\n if not name.startswith(PLUGIN_PREFIX):\r\n continue\r\n+\r\n # Ensure right type of plugin\r\n if is_io != name.startswith(IO_PREFIX):\r\n continue\r\n \r\n+ # Skip names that end in certain suffixes\r\n+ forbidden_suffixes = ['dist-info', 'egg.info', 'egg-link']\r\n+ if any([name.endswith(s) for s in forbidden_suffixes]):\r\n+ continue\r\n+\r\n # Import the plugin\r\n _import_plugin(name, plugin_path, modnames, modlist)\n", "issue": "Error when trying to load third-party plugins\n## Description\r\n\r\n**What steps will reproduce the problem?**\r\n\r\n1. Start sypder.\r\n2. \r\n3. \r\n\r\n**What is the expected output? What do you see instead?**\r\nIt shows right away after starting Spyder.\r\n\r\n**Please provide any additional information below**\r\nA message box show this:\r\n module = _import_module_from_path(module_name, plugin_path)\r\n module = spec.loader.load_module(module_name)\r\nAttributeError: 'NoneType' object has no attribute 'load_module'\r\n\r\n\r\nSpyder functions fine, no problem after close the message box.\r\n## Version and main components\r\n\r\n* Spyder Version: 3.2.0 \r\n* Python Version: 3.5.3\r\n* Qt Versions: 5.6.2, PyQt5 5.6 on Darwin\r\n\r\n## Dependencies\r\n```\r\npyflakes >=0.6.0 : 1.5.0 (OK)\r\npycodestyle >=2.3: 2.3.1 (OK)\r\npygments >=2.0 : 2.2.0 (OK)\r\npandas >=0.13.1 : 0.19.2 (OK)\r\nnumpy >=1.7 : 1.12.1 (OK)\r\nsphinx >=0.6.6 : 1.5.4 (OK)\r\nrope >=0.9.4 : 0.10.5 (OK)\r\njedi >=0.9.0 : 0.10.2 (OK)\r\npsutil >=0.3 : 5.2.1 (OK)\r\nnbconvert >=4.0 : 5.1.1 (OK)\r\nsympy >=0.7.3 : None (NOK)\r\ncython >=0.21 : None (NOK)\r\nqtconsole >=4.2.0: 4.3.0 (OK)\r\nIPython >=4.0 : 6.1.0 (OK)\r\npylint >=0.25 : 1.6.4 (OK)\r\n\r\n```\r\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\r\n#\r\n# Copyright \u00a9 Spyder Project Contributors\r\n# Licensed under the terms of the MIT License\r\n# (see spyder/__init__.py for details)\r\n\r\n\"\"\"\r\nSpyder third-party plugins configuration management\r\n\"\"\"\r\n\r\nimport os\r\nimport os.path as osp\r\nimport sys\r\nimport traceback\r\n\r\n# Local imports\r\nfrom spyder.config.base import get_conf_path\r\nfrom spyder.py3compat import PY2\r\n\r\nif PY2:\r\n import imp\r\nelse:\r\n import importlib\r\n\r\n\r\nUSER_PLUGIN_DIR = \"plugins\"\r\nPLUGIN_PREFIX = \"spyder_\"\r\nIO_PREFIX = PLUGIN_PREFIX + \"io_\"\r\n\r\n\r\ndef get_spyderplugins_mods(io=False):\r\n \"\"\"Import modules from plugins package and return the list\"\"\"\r\n # Create user directory\r\n user_plugin_path = osp.join(get_conf_path(), USER_PLUGIN_DIR)\r\n if not osp.isdir(user_plugin_path):\r\n os.makedirs(user_plugin_path)\r\n\r\n modlist, modnames = [], []\r\n\r\n # The user plugins directory is given the priority when looking for modules\r\n for plugin_path in [user_plugin_path] + sys.path:\r\n _get_spyderplugins(plugin_path, io, modnames, modlist)\r\n return modlist\r\n\r\n\r\ndef _get_spyderplugins(plugin_path, is_io, modnames, modlist):\r\n \"\"\"Scan the directory `plugin_path` for plugin packages and loads them.\"\"\"\r\n if not osp.isdir(plugin_path):\r\n return\r\n\r\n for name in os.listdir(plugin_path):\r\n # This is needed in order to register the spyder_io_hdf5 plugin.\r\n # See issue 4487\r\n # Is this a Spyder plugin?\r\n if not name.startswith(PLUGIN_PREFIX):\r\n continue\r\n # Ensure right type of plugin\r\n if is_io != 
name.startswith(IO_PREFIX):\r\n continue\r\n\r\n # Import the plugin\r\n _import_plugin(name, plugin_path, modnames, modlist)\r\n\r\n\r\ndef _import_plugin(module_name, plugin_path, modnames, modlist):\r\n \"\"\"Import the plugin `module_name` from `plugin_path`, add it to `modlist`\r\n and adds its name to `modnames`.\r\n \"\"\"\r\n if module_name in modnames:\r\n return\r\n try:\r\n # First add a mock module with the LOCALEPATH attribute so that the\r\n # helper method can find the locale on import\r\n mock = _ModuleMock()\r\n mock.LOCALEPATH = osp.join(plugin_path, module_name, 'locale')\r\n sys.modules[module_name] = mock\r\n\r\n if osp.isdir(osp.join(plugin_path, module_name)):\r\n module = _import_module_from_path(module_name, plugin_path)\r\n else:\r\n module = None\r\n\r\n # Then restore the actual loaded module instead of the mock\r\n if module:\r\n sys.modules[module_name] = module\r\n modlist.append(module)\r\n modnames.append(module_name)\r\n except Exception:\r\n sys.stderr.write(\"ERROR: 3rd party plugin import failed for \"\r\n \"`{0}`\\n\".format(module_name))\r\n traceback.print_exc(file=sys.stderr)\r\n\r\n\r\ndef _import_module_from_path(module_name, plugin_path):\r\n \"\"\"Imports `module_name` from `plugin_path`.\r\n\r\n Return None if no module is found.\r\n \"\"\"\r\n module = None\r\n if PY2:\r\n info = imp.find_module(module_name, [plugin_path])\r\n if info:\r\n module = imp.load_module(module_name, *info)\r\n elif sys.version_info[0:2] <= (3, 3):\r\n loader = importlib.machinery.PathFinder.find_module(\r\n module_name,\r\n [plugin_path])\r\n if loader:\r\n module = loader.load_module(module_name)\r\n else: # Python 3.4+\r\n spec = importlib.machinery.PathFinder.find_spec(\r\n module_name,\r\n [plugin_path])\r\n if spec:\r\n module = spec.loader.load_module(module_name)\r\n return module\r\n\r\n\r\nclass _ModuleMock():\r\n \"\"\"This mock module is added to sys.modules on plugin load to add the\r\n location of the LOCALEDATA so that the module loads succesfully.\r\n Once loaded the module is replaced by the actual loaded module object.\r\n \"\"\"\r\n pass\r\n", "path": "spyder/otherplugins.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\r\n#\r\n# Copyright \u00a9 Spyder Project Contributors\r\n# Licensed under the terms of the MIT License\r\n# (see spyder/__init__.py for details)\r\n\r\n\"\"\"\r\nSpyder third-party plugins configuration management\r\n\"\"\"\r\n\r\nimport os\r\nimport os.path as osp\r\nimport sys\r\nimport traceback\r\n\r\n# Local imports\r\nfrom spyder.config.base import get_conf_path\r\nfrom spyder.py3compat import PY2\r\n\r\nif PY2:\r\n import imp\r\nelse:\r\n import importlib\r\n\r\n\r\nUSER_PLUGIN_DIR = \"plugins\"\r\nPLUGIN_PREFIX = \"spyder_\"\r\nIO_PREFIX = PLUGIN_PREFIX + \"io_\"\r\n\r\n\r\ndef get_spyderplugins_mods(io=False):\r\n \"\"\"Import modules from plugins package and return the list\"\"\"\r\n # Create user directory\r\n user_plugin_path = osp.join(get_conf_path(), USER_PLUGIN_DIR)\r\n if not osp.isdir(user_plugin_path):\r\n os.makedirs(user_plugin_path)\r\n\r\n modlist, modnames = [], []\r\n\r\n # The user plugins directory is given the priority when looking for modules\r\n for plugin_path in [user_plugin_path] + sys.path:\r\n _get_spyderplugins(plugin_path, io, modnames, modlist)\r\n return modlist\r\n\r\n\r\ndef _get_spyderplugins(plugin_path, is_io, modnames, modlist):\r\n \"\"\"Scan the directory `plugin_path` for plugin packages and loads them.\"\"\"\r\n if not osp.isdir(plugin_path):\r\n return\r\n\r\n for name in 
os.listdir(plugin_path):\r\n # This is needed in order to register the spyder_io_hdf5 plugin.\r\n # See issue 4487\r\n # Is this a Spyder plugin?\r\n if not name.startswith(PLUGIN_PREFIX):\r\n continue\r\n\r\n # Ensure right type of plugin\r\n if is_io != name.startswith(IO_PREFIX):\r\n continue\r\n\r\n # Skip names that end in certain suffixes\r\n forbidden_suffixes = ['dist-info', 'egg.info', 'egg-link']\r\n if any([name.endswith(s) for s in forbidden_suffixes]):\r\n continue\r\n\r\n # Import the plugin\r\n _import_plugin(name, plugin_path, modnames, modlist)\r\n\r\n\r\ndef _import_plugin(module_name, plugin_path, modnames, modlist):\r\n \"\"\"Import the plugin `module_name` from `plugin_path`, add it to `modlist`\r\n and adds its name to `modnames`.\r\n \"\"\"\r\n if module_name in modnames:\r\n return\r\n try:\r\n # First add a mock module with the LOCALEPATH attribute so that the\r\n # helper method can find the locale on import\r\n mock = _ModuleMock()\r\n mock.LOCALEPATH = osp.join(plugin_path, module_name, 'locale')\r\n sys.modules[module_name] = mock\r\n\r\n if osp.isdir(osp.join(plugin_path, module_name)):\r\n module = _import_module_from_path(module_name, plugin_path)\r\n else:\r\n module = None\r\n\r\n # Then restore the actual loaded module instead of the mock\r\n if module:\r\n sys.modules[module_name] = module\r\n modlist.append(module)\r\n modnames.append(module_name)\r\n except Exception:\r\n sys.stderr.write(\"ERROR: 3rd party plugin import failed for \"\r\n \"`{0}`\\n\".format(module_name))\r\n traceback.print_exc(file=sys.stderr)\r\n\r\n\r\ndef _import_module_from_path(module_name, plugin_path):\r\n \"\"\"Imports `module_name` from `plugin_path`.\r\n\r\n Return None if no module is found.\r\n \"\"\"\r\n module = None\r\n if PY2:\r\n info = imp.find_module(module_name, [plugin_path])\r\n if info:\r\n module = imp.load_module(module_name, *info)\r\n elif sys.version_info[0:2] <= (3, 3):\r\n loader = importlib.machinery.PathFinder.find_module(\r\n module_name,\r\n [plugin_path])\r\n if loader:\r\n module = loader.load_module(module_name)\r\n else: # Python 3.4+\r\n spec = importlib.machinery.PathFinder.find_spec(\r\n module_name,\r\n [plugin_path])\r\n if spec:\r\n module = spec.loader.load_module(module_name)\r\n return module\r\n\r\n\r\nclass _ModuleMock():\r\n \"\"\"This mock module is added to sys.modules on plugin load to add the\r\n location of the LOCALEDATA so that the module loads succesfully.\r\n Once loaded the module is replaced by the actual loaded module object.\r\n \"\"\"\r\n pass\r\n", "path": "spyder/otherplugins.py"}]} | 1,947 | 172 |
gh_patches_debug_7825 | rasdani/github-patches | git_diff | spack__spack-4606 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
BUG: mxml / adios link error
This seems to be related to the pull request https://github.com/LLNL/spack/pull/3696 that I abandoned a few months back, but which has now come back to haunt me.
If I use the current spack setup for adios and mxml, I receive the following error when I try to link against the adios library:
ld: /home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a(mxml-attr.o):
relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC
/home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a: error adding symbols: Bad value
collect2: error: ld returned 1 exit status
Interestingly enough, the adios library is always being built as a `.a`, regardless of +shared, ~shared, whatever.
If I avoid the external mxml library for ADIOS - relying instead on adios' own internal copy - I don't see this issue.
I'll post a commit and pull-request and see if someone can tell me why.
BUG: mxml / adios link error
This seems to be related to the pull request https://github.com/LLNL/spack/pull/3696 that I abandoned a few months back, but which has now come back to haunt me.
If I use the current spack setup for adios and mxml, I receive the following error when I try to link against the adios library:
ld: /home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a(mxml-attr.o):
relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC
/home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a: error adding symbols: Bad value
collect2: error: ld returned 1 exit status
Interestingly enough, the adios library is always being built as a `.a`, regardless of +shared, ~shared, whatever.
If I avoid the external mxml library for ADIOS - relying instead on adios' own internal copy - I don't see this issue.
I'll post a commit and pull-request and see if someone can tell me why.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `var/spack/repos/builtin/packages/mxml/package.py`
Content:
```
1 ##############################################################################
2 # Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.
3 # Produced at the Lawrence Livermore National Laboratory.
4 #
5 # This file is part of Spack.
6 # Created by Todd Gamblin, [email protected], All rights reserved.
7 # LLNL-CODE-647188
8 #
9 # For details, see https://github.com/llnl/spack
10 # Please also see the NOTICE and LICENSE files for our notice and the LGPL.
11 #
12 # This program is free software; you can redistribute it and/or modify
13 # it under the terms of the GNU Lesser General Public License (as
14 # published by the Free Software Foundation) version 2.1, February 1999.
15 #
16 # This program is distributed in the hope that it will be useful, but
17 # WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF
18 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and
19 # conditions of the GNU Lesser General Public License for more details.
20 #
21 # You should have received a copy of the GNU Lesser General Public
22 # License along with this program; if not, write to the Free Software
23 # Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
24 ##############################################################################
25 from spack import *
26
27
28 class Mxml(AutotoolsPackage):
29 """Mini-XML is a small XML library that you can use to read and write XML
30 and XML-like data files in your application without requiring large
31 non-standard libraries.
32 """
33
34 homepage = "http://michaelrsweet.github.io/mxml/"
35 url = "https://github.com/michaelrsweet/mxml/releases/download/release-2.10/mxml-2.10.tar.gz"
36
37 version('2.10', '8804c961a24500a95690ef287d150abe')
38 version('2.9', 'e21cad0f7aacd18f942aa0568a8dee19')
39 version('2.8', 'd85ee6d30de053581242c4a86e79a5d2')
40 version('2.7', '76f2ae49bf0f5745d5cb5d9507774dc9')
41 version('2.6', '68977789ae64985dddbd1a1a1652642e')
42 version('2.5', 'f706377fba630b39fa02fd63642b17e5')
43
44 def url_for_version(self, version):
45 if version <= Version('2.7'):
46 return 'https://github.com/michaelrsweet/mxml/archive/release-{0}.tar.gz'.format(version)
47 else:
48 return 'https://github.com/michaelrsweet/mxml/releases/download/release-{0}/mxml-{0}.tar.gz'.format(version)
49
50 def configure_args(self):
51 # Default is non-shared, but avoid any future surprises
52 return ['--disable-shared']
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/var/spack/repos/builtin/packages/mxml/package.py b/var/spack/repos/builtin/packages/mxml/package.py
--- a/var/spack/repos/builtin/packages/mxml/package.py
+++ b/var/spack/repos/builtin/packages/mxml/package.py
@@ -48,5 +48,9 @@
return 'https://github.com/michaelrsweet/mxml/releases/download/release-{0}/mxml-{0}.tar.gz'.format(version)
def configure_args(self):
- # Default is non-shared, but avoid any future surprises
- return ['--disable-shared']
+ return [
+ # ADIOS build with -fPIC, so we need it too (avoid linkage issue)
+ 'CFLAGS=-fPIC',
+ # Default is non-shared, but avoid any future surprises
+ '--disable-shared',
+ ]
| {"golden_diff": "diff --git a/var/spack/repos/builtin/packages/mxml/package.py b/var/spack/repos/builtin/packages/mxml/package.py\n--- a/var/spack/repos/builtin/packages/mxml/package.py\n+++ b/var/spack/repos/builtin/packages/mxml/package.py\n@@ -48,5 +48,9 @@\n return 'https://github.com/michaelrsweet/mxml/releases/download/release-{0}/mxml-{0}.tar.gz'.format(version)\n \n def configure_args(self):\n- # Default is non-shared, but avoid any future surprises\n- return ['--disable-shared']\n+ return [\n+ # ADIOS build with -fPIC, so we need it too (avoid linkage issue)\n+ 'CFLAGS=-fPIC',\n+ # Default is non-shared, but avoid any future surprises\n+ '--disable-shared',\n+ ]\n", "issue": "BUG: mxml / adios link error\nThis seems to be related to the pull request https://github.com/LLNL/spack/pull/3696 that I abandoned a few months back, but which has now come back to haunt me.\r\nIf I use the current spack setup for adios and mxml, I receive the following error when I try to link against the adios library:\r\n\r\n ld: /home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a(mxml-attr.o):\r\n relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC\r\n /home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a: error adding symbols: Bad value\r\ncollect2: error: ld returned 1 exit status\r\n\r\nInterestingly enough, the adios library is always being built as a `.a`, regardless of +shared, ~shared, whatever.\r\n\r\nIf I avoid the external mxml library for ADIOS - relying instead on adios' own internal copy - I don't see this issue.\r\nI'll post a commit and pull-request and see if someone can tell me why.\nBUG: mxml / adios link error\nThis seems to be related to the pull request https://github.com/LLNL/spack/pull/3696 that I abandoned a few months back, but which has now come back to haunt me.\r\nIf I use the current spack setup for adios and mxml, I receive the following error when I try to link against the adios library:\r\n\r\n ld: /home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a(mxml-attr.o):\r\n relocation R_X86_64_32 against `.rodata.str1.1' can not be used when making a shared object; recompile with -fPIC\r\n /home/mol/Software/spack/opt/spack/linux-opensuse42-x86_64/gcc-4.8/mxml-2.10-zxpuyumt66b7ehrlidmcax7li5ekhr6x/lib64/libmxml.a: error adding symbols: Bad value\r\ncollect2: error: ld returned 1 exit status\r\n\r\nInterestingly enough, the adios library is always being built as a `.a`, regardless of +shared, ~shared, whatever.\r\n\r\nIf I avoid the external mxml library for ADIOS - relying instead on adios' own internal copy - I don't see this issue.\r\nI'll post a commit and pull-request and see if someone can tell me why.\n", "before_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/llnl/spack\n# Please also see the NOTICE and LICENSE files for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU 
Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Mxml(AutotoolsPackage):\n \"\"\"Mini-XML is a small XML library that you can use to read and write XML\n and XML-like data files in your application without requiring large\n non-standard libraries.\n \"\"\"\n\n homepage = \"http://michaelrsweet.github.io/mxml/\"\n url = \"https://github.com/michaelrsweet/mxml/releases/download/release-2.10/mxml-2.10.tar.gz\"\n\n version('2.10', '8804c961a24500a95690ef287d150abe')\n version('2.9', 'e21cad0f7aacd18f942aa0568a8dee19')\n version('2.8', 'd85ee6d30de053581242c4a86e79a5d2')\n version('2.7', '76f2ae49bf0f5745d5cb5d9507774dc9')\n version('2.6', '68977789ae64985dddbd1a1a1652642e')\n version('2.5', 'f706377fba630b39fa02fd63642b17e5')\n\n def url_for_version(self, version):\n if version <= Version('2.7'):\n return 'https://github.com/michaelrsweet/mxml/archive/release-{0}.tar.gz'.format(version)\n else:\n return 'https://github.com/michaelrsweet/mxml/releases/download/release-{0}/mxml-{0}.tar.gz'.format(version)\n\n def configure_args(self):\n # Default is non-shared, but avoid any future surprises\n return ['--disable-shared']\n", "path": "var/spack/repos/builtin/packages/mxml/package.py"}], "after_files": [{"content": "##############################################################################\n# Copyright (c) 2013-2016, Lawrence Livermore National Security, LLC.\n# Produced at the Lawrence Livermore National Laboratory.\n#\n# This file is part of Spack.\n# Created by Todd Gamblin, [email protected], All rights reserved.\n# LLNL-CODE-647188\n#\n# For details, see https://github.com/llnl/spack\n# Please also see the NOTICE and LICENSE files for our notice and the LGPL.\n#\n# This program is free software; you can redistribute it and/or modify\n# it under the terms of the GNU Lesser General Public License (as\n# published by the Free Software Foundation) version 2.1, February 1999.\n#\n# This program is distributed in the hope that it will be useful, but\n# WITHOUT ANY WARRANTY; without even the IMPLIED WARRANTY OF\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the terms and\n# conditions of the GNU Lesser General Public License for more details.\n#\n# You should have received a copy of the GNU Lesser General Public\n# License along with this program; if not, write to the Free Software\n# Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA\n##############################################################################\nfrom spack import *\n\n\nclass Mxml(AutotoolsPackage):\n \"\"\"Mini-XML is a small XML library that you can use to read and write XML\n and XML-like data files in your application without requiring large\n non-standard libraries.\n \"\"\"\n\n homepage = \"http://michaelrsweet.github.io/mxml/\"\n url = \"https://github.com/michaelrsweet/mxml/releases/download/release-2.10/mxml-2.10.tar.gz\"\n\n version('2.10', '8804c961a24500a95690ef287d150abe')\n version('2.9', 'e21cad0f7aacd18f942aa0568a8dee19')\n version('2.8', 'd85ee6d30de053581242c4a86e79a5d2')\n version('2.7', '76f2ae49bf0f5745d5cb5d9507774dc9')\n version('2.6', '68977789ae64985dddbd1a1a1652642e')\n version('2.5', 'f706377fba630b39fa02fd63642b17e5')\n\n def url_for_version(self, version):\n if version <= Version('2.7'):\n return 'https://github.com/michaelrsweet/mxml/archive/release-{0}.tar.gz'.format(version)\n else:\n return 'https://github.com/michaelrsweet/mxml/releases/download/release-{0}/mxml-{0}.tar.gz'.format(version)\n\n def configure_args(self):\n return [\n # ADIOS build with -fPIC, so we need it too (avoid linkage issue)\n 'CFLAGS=-fPIC',\n # Default is non-shared, but avoid any future surprises\n '--disable-shared',\n ]\n", "path": "var/spack/repos/builtin/packages/mxml/package.py"}]} | 1,779 | 184 |
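A note on the mxml/ADIOS patch above: the relocation error appears because ADIOS builds its own objects with -fPIC and then links the static libmxml.a into a shared library, so every object in that archive has to be position-independent as well. The sketch below is plain, standalone Python (not real Spack code; only the configure_args name is taken from the package, the prefix path is made up) showing how the patched flag list would land on a ./configure command line.

```python
# Minimal sketch: mirrors the configure_args() list from the patched
# package and shows how it would appear on a ./configure invocation.
# Illustrative only; it does not import or run Spack.
def configure_args():
    return [
        # ADIOS builds with -fPIC, so libmxml.a must as well, otherwise
        # linking the archive into a shared object fails with
        # "relocation R_X86_64_32 ... recompile with -fPIC".
        "CFLAGS=-fPIC",
        # Default is non-shared, but avoid any future surprises.
        "--disable-shared",
    ]


if __name__ == "__main__":
    cmd = ["./configure", "--prefix=/opt/mxml"] + configure_args()
    print(" ".join(cmd))
```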
gh_patches_debug_29724 | rasdani/github-patches | git_diff | pallets__click-545 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bash completion for MultiCommands returns subcommands twice until it stops.
Python 2.7.10.
Broken in click >= 6; works in click 5.1.
```
manage hello hello world world
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `click/_bashcomplete.py`
Content:
```
1 import os
2 import re
3 from .utils import echo
4 from .parser import split_arg_string
5 from .core import MultiCommand, Option
6
7
8 COMPLETION_SCRIPT = '''
9 %(complete_func)s() {
10 COMPREPLY=( $( env COMP_WORDS="${COMP_WORDS[*]}" \\
11 COMP_CWORD=$COMP_CWORD \\
12 %(autocomplete_var)s=complete $1 ) )
13 return 0
14 }
15
16 complete -F %(complete_func)s -o default %(script_names)s
17 '''
18
19 _invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')
20
21
22 def get_completion_script(prog_name, complete_var):
23 cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))
24 return (COMPLETION_SCRIPT % {
25 'complete_func': '_%s_completion' % cf_name,
26 'script_names': prog_name,
27 'autocomplete_var': complete_var,
28 }).strip() + ';'
29
30
31 def resolve_ctx(cli, prog_name, args):
32 ctx = cli.make_context(prog_name, args, resilient_parsing=True)
33 while ctx.args and isinstance(ctx.command, MultiCommand):
34 cmd = ctx.command.get_command(ctx, ctx.args[0])
35 if cmd is None:
36 return None
37 ctx = cmd.make_context(ctx.args[0], ctx.args[1:], parent=ctx,
38 resilient_parsing=True)
39 return ctx
40
41
42 def do_complete(cli, prog_name):
43 cwords = split_arg_string(os.environ['COMP_WORDS'])
44 cword = int(os.environ['COMP_CWORD'])
45 args = cwords[1:cword]
46 try:
47 incomplete = cwords[cword]
48 except IndexError:
49 incomplete = ''
50
51 ctx = resolve_ctx(cli, prog_name, args)
52 if ctx is None:
53 return True
54
55 choices = []
56 if incomplete and not incomplete[:1].isalnum():
57 for param in ctx.command.params:
58 if not isinstance(param, Option):
59 continue
60 choices.extend(param.opts)
61 choices.extend(param.secondary_opts)
62 elif isinstance(ctx.command, MultiCommand):
63 choices.extend(ctx.command.list_commands(ctx))
64
65 for item in choices:
66 if item.startswith(incomplete):
67 echo(item)
68
69 return True
70
71
72 def bashcomplete(cli, prog_name, complete_var, complete_instr):
73 if complete_instr == 'source':
74 echo(get_completion_script(prog_name, complete_var))
75 return True
76 elif complete_instr == 'complete':
77 return do_complete(cli, prog_name)
78 return False
79
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/click/_bashcomplete.py b/click/_bashcomplete.py
--- a/click/_bashcomplete.py
+++ b/click/_bashcomplete.py
@@ -30,27 +30,19 @@
def resolve_ctx(cli, prog_name, args):
ctx = cli.make_context(prog_name, args, resilient_parsing=True)
- while ctx.args and isinstance(ctx.command, MultiCommand):
- cmd = ctx.command.get_command(ctx, ctx.args[0])
+ while ctx.args + ctx.protected_args and isinstance(ctx.command, MultiCommand):
+ a = ctx.args + ctx.protected_args
+ cmd = ctx.command.get_command(ctx, a[0])
if cmd is None:
return None
- ctx = cmd.make_context(ctx.args[0], ctx.args[1:], parent=ctx,
- resilient_parsing=True)
+ ctx = cmd.make_context(a[0], a[1:], parent=ctx, resilient_parsing=True)
return ctx
-def do_complete(cli, prog_name):
- cwords = split_arg_string(os.environ['COMP_WORDS'])
- cword = int(os.environ['COMP_CWORD'])
- args = cwords[1:cword]
- try:
- incomplete = cwords[cword]
- except IndexError:
- incomplete = ''
-
+def get_choices(cli, prog_name, args, incomplete):
ctx = resolve_ctx(cli, prog_name, args)
if ctx is None:
- return True
+ return
choices = []
if incomplete and not incomplete[:1].isalnum():
@@ -64,7 +56,20 @@
for item in choices:
if item.startswith(incomplete):
- echo(item)
+ yield item
+
+
+def do_complete(cli, prog_name):
+ cwords = split_arg_string(os.environ['COMP_WORDS'])
+ cword = int(os.environ['COMP_CWORD'])
+ args = cwords[1:cword]
+ try:
+ incomplete = cwords[cword]
+ except IndexError:
+ incomplete = ''
+
+ for item in get_choices(cli, prog_name, args, incomplete):
+ echo(item)
return True
| {"golden_diff": "diff --git a/click/_bashcomplete.py b/click/_bashcomplete.py\n--- a/click/_bashcomplete.py\n+++ b/click/_bashcomplete.py\n@@ -30,27 +30,19 @@\n \n def resolve_ctx(cli, prog_name, args):\n ctx = cli.make_context(prog_name, args, resilient_parsing=True)\n- while ctx.args and isinstance(ctx.command, MultiCommand):\n- cmd = ctx.command.get_command(ctx, ctx.args[0])\n+ while ctx.args + ctx.protected_args and isinstance(ctx.command, MultiCommand):\n+ a = ctx.args + ctx.protected_args\n+ cmd = ctx.command.get_command(ctx, a[0])\n if cmd is None:\n return None\n- ctx = cmd.make_context(ctx.args[0], ctx.args[1:], parent=ctx,\n- resilient_parsing=True)\n+ ctx = cmd.make_context(a[0], a[1:], parent=ctx, resilient_parsing=True)\n return ctx\n \n \n-def do_complete(cli, prog_name):\n- cwords = split_arg_string(os.environ['COMP_WORDS'])\n- cword = int(os.environ['COMP_CWORD'])\n- args = cwords[1:cword]\n- try:\n- incomplete = cwords[cword]\n- except IndexError:\n- incomplete = ''\n-\n+def get_choices(cli, prog_name, args, incomplete):\n ctx = resolve_ctx(cli, prog_name, args)\n if ctx is None:\n- return True\n+ return\n \n choices = []\n if incomplete and not incomplete[:1].isalnum():\n@@ -64,7 +56,20 @@\n \n for item in choices:\n if item.startswith(incomplete):\n- echo(item)\n+ yield item\n+\n+\n+def do_complete(cli, prog_name):\n+ cwords = split_arg_string(os.environ['COMP_WORDS'])\n+ cword = int(os.environ['COMP_CWORD'])\n+ args = cwords[1:cword]\n+ try:\n+ incomplete = cwords[cword]\n+ except IndexError:\n+ incomplete = ''\n+\n+ for item in get_choices(cli, prog_name, args, incomplete):\n+ echo(item)\n \n return True\n", "issue": "Bash completion for MultiCommands returns subcommands twice until it stops.\nPython 2.7.10\nbroken in click >= 6\nok in click (5.1)\n\n```\nmanage hello hello world world\n```\n\n", "before_files": [{"content": "import os\nimport re\nfrom .utils import echo\nfrom .parser import split_arg_string\nfrom .core import MultiCommand, Option\n\n\nCOMPLETION_SCRIPT = '''\n%(complete_func)s() {\n COMPREPLY=( $( env COMP_WORDS=\"${COMP_WORDS[*]}\" \\\\\n COMP_CWORD=$COMP_CWORD \\\\\n %(autocomplete_var)s=complete $1 ) )\n return 0\n}\n\ncomplete -F %(complete_func)s -o default %(script_names)s\n'''\n\n_invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')\n\n\ndef get_completion_script(prog_name, complete_var):\n cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))\n return (COMPLETION_SCRIPT % {\n 'complete_func': '_%s_completion' % cf_name,\n 'script_names': prog_name,\n 'autocomplete_var': complete_var,\n }).strip() + ';'\n\n\ndef resolve_ctx(cli, prog_name, args):\n ctx = cli.make_context(prog_name, args, resilient_parsing=True)\n while ctx.args and isinstance(ctx.command, MultiCommand):\n cmd = ctx.command.get_command(ctx, ctx.args[0])\n if cmd is None:\n return None\n ctx = cmd.make_context(ctx.args[0], ctx.args[1:], parent=ctx,\n resilient_parsing=True)\n return ctx\n\n\ndef do_complete(cli, prog_name):\n cwords = split_arg_string(os.environ['COMP_WORDS'])\n cword = int(os.environ['COMP_CWORD'])\n args = cwords[1:cword]\n try:\n incomplete = cwords[cword]\n except IndexError:\n incomplete = ''\n\n ctx = resolve_ctx(cli, prog_name, args)\n if ctx is None:\n return True\n\n choices = []\n if incomplete and not incomplete[:1].isalnum():\n for param in ctx.command.params:\n if not isinstance(param, Option):\n continue\n choices.extend(param.opts)\n choices.extend(param.secondary_opts)\n elif isinstance(ctx.command, MultiCommand):\n 
choices.extend(ctx.command.list_commands(ctx))\n\n for item in choices:\n if item.startswith(incomplete):\n echo(item)\n\n return True\n\n\ndef bashcomplete(cli, prog_name, complete_var, complete_instr):\n if complete_instr == 'source':\n echo(get_completion_script(prog_name, complete_var))\n return True\n elif complete_instr == 'complete':\n return do_complete(cli, prog_name)\n return False\n", "path": "click/_bashcomplete.py"}], "after_files": [{"content": "import os\nimport re\nfrom .utils import echo\nfrom .parser import split_arg_string\nfrom .core import MultiCommand, Option\n\n\nCOMPLETION_SCRIPT = '''\n%(complete_func)s() {\n COMPREPLY=( $( env COMP_WORDS=\"${COMP_WORDS[*]}\" \\\\\n COMP_CWORD=$COMP_CWORD \\\\\n %(autocomplete_var)s=complete $1 ) )\n return 0\n}\n\ncomplete -F %(complete_func)s -o default %(script_names)s\n'''\n\n_invalid_ident_char_re = re.compile(r'[^a-zA-Z0-9_]')\n\n\ndef get_completion_script(prog_name, complete_var):\n cf_name = _invalid_ident_char_re.sub('', prog_name.replace('-', '_'))\n return (COMPLETION_SCRIPT % {\n 'complete_func': '_%s_completion' % cf_name,\n 'script_names': prog_name,\n 'autocomplete_var': complete_var,\n }).strip() + ';'\n\n\ndef resolve_ctx(cli, prog_name, args):\n ctx = cli.make_context(prog_name, args, resilient_parsing=True)\n while ctx.args + ctx.protected_args and isinstance(ctx.command, MultiCommand):\n a = ctx.args + ctx.protected_args\n cmd = ctx.command.get_command(ctx, a[0])\n if cmd is None:\n return None\n ctx = cmd.make_context(a[0], a[1:], parent=ctx, resilient_parsing=True)\n return ctx\n\n\ndef get_choices(cli, prog_name, args, incomplete):\n ctx = resolve_ctx(cli, prog_name, args)\n if ctx is None:\n return\n\n choices = []\n if incomplete and not incomplete[:1].isalnum():\n for param in ctx.command.params:\n if not isinstance(param, Option):\n continue\n choices.extend(param.opts)\n choices.extend(param.secondary_opts)\n elif isinstance(ctx.command, MultiCommand):\n choices.extend(ctx.command.list_commands(ctx))\n\n for item in choices:\n if item.startswith(incomplete):\n yield item\n\n\ndef do_complete(cli, prog_name):\n cwords = split_arg_string(os.environ['COMP_WORDS'])\n cword = int(os.environ['COMP_CWORD'])\n args = cwords[1:cword]\n try:\n incomplete = cwords[cword]\n except IndexError:\n incomplete = ''\n\n for item in get_choices(cli, prog_name, args, incomplete):\n echo(item)\n\n return True\n\n\ndef bashcomplete(cli, prog_name, complete_var, complete_instr):\n if complete_instr == 'source':\n echo(get_completion_script(prog_name, complete_var))\n return True\n elif complete_instr == 'complete':\n return do_complete(cli, prog_name)\n return False\n", "path": "click/_bashcomplete.py"}]} | 983 | 478 |
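A note on the click completion patch above: the fix splits candidate lookup into a get_choices generator, walks ctx.protected_args during context resolution, and leaves printing to do_complete, so each matching subcommand should be emitted only once. The sketch below is a simplified, self-contained imitation of that generator/printer split; the SUBCOMMANDS list and the simulated COMP_WORDS value stand in for what click derives from a real MultiCommand and the shell environment.

```python
# Simplified sketch of the post-patch completion shape: a generator that
# yields each matching candidate once, and a printer that only echoes
# what the generator produces. SUBCOMMANDS stands in for whatever a real
# MultiCommand would report via list_commands().
SUBCOMMANDS = ["hello", "world", "goodbye"]


def get_choices(args, incomplete):
    for name in SUBCOMMANDS:
        if name.startswith(incomplete):
            yield name


def do_complete(comp_words, comp_cword):
    words = comp_words.split()
    args = words[1:comp_cword]
    incomplete = words[comp_cword] if comp_cword < len(words) else ""
    for item in get_choices(args, incomplete):
        print(item)
    return True


if __name__ == "__main__":
    # Simulates COMP_WORDS="manage he" COMP_CWORD=1 from the shell hook.
    do_complete("manage he", 1)  # prints "hello" exactly once
```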
gh_patches_debug_14200 | rasdani/github-patches | git_diff | mitmproxy__mitmproxy-898 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Client Address is not displayed for flows from dumpfiles
I have two clients connecting to mitmproxy now, and I'd like to see the client IP for each request in the "Detail" tab.
Maybe show the server's IP too, since it can differ when connections come from different local IPs (e.g. 127.0.0.1 and 192.168.1.101).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libmproxy/console/flowdetailview.py`
Content:
```
1 from __future__ import absolute_import
2 import urwid
3 from . import common, searchable
4 from .. import utils
5
6
7 def maybe_timestamp(base, attr):
8 if base and getattr(base, attr):
9 return utils.format_timestamp_with_milli(getattr(base, attr))
10 else:
11 return "active"
12 pass
13
14
15 def flowdetails(state, flow):
16 text = []
17
18 cc = flow.client_conn
19 sc = flow.server_conn
20 req = flow.request
21 resp = flow.response
22
23 if sc:
24 text.append(urwid.Text([("head", "Server Connection:")]))
25 parts = [
26 ["Address", "%s:%s" % sc.address()],
27 ]
28
29 text.extend(
30 common.format_keyvals(parts, key="key", val="text", indent=4)
31 )
32
33 c = sc.cert
34 if c:
35 text.append(urwid.Text([("head", "Server Certificate:")]))
36 parts = [
37 ["Type", "%s, %s bits" % c.keyinfo],
38 ["SHA1 digest", c.digest("sha1")],
39 ["Valid to", str(c.notafter)],
40 ["Valid from", str(c.notbefore)],
41 ["Serial", str(c.serial)],
42 [
43 "Subject",
44 urwid.BoxAdapter(
45 urwid.ListBox(
46 common.format_keyvals(
47 c.subject,
48 key="highlight",
49 val="text"
50 )
51 ),
52 len(c.subject)
53 )
54 ],
55 [
56 "Issuer",
57 urwid.BoxAdapter(
58 urwid.ListBox(
59 common.format_keyvals(
60 c.issuer, key="highlight", val="text"
61 )
62 ),
63 len(c.issuer)
64 )
65 ]
66 ]
67
68 if c.altnames:
69 parts.append(
70 [
71 "Alt names",
72 ", ".join(c.altnames)
73 ]
74 )
75 text.extend(
76 common.format_keyvals(parts, key="key", val="text", indent=4)
77 )
78
79 if cc:
80 text.append(urwid.Text([("head", "Client Connection:")]))
81
82 parts = [
83 ["Address", "%s:%s" % cc.address()],
84 # ["Requests", "%s"%cc.requestcount],
85 ]
86
87 text.extend(
88 common.format_keyvals(parts, key="key", val="text", indent=4)
89 )
90
91 parts = []
92
93 parts.append(
94 [
95 "Client conn. established",
96 maybe_timestamp(cc, "timestamp_start")
97 ]
98 )
99 parts.append(
100 [
101 "Server conn. initiated",
102 maybe_timestamp(sc, "timestamp_start")
103 ]
104 )
105 parts.append(
106 [
107 "Server conn. TCP handshake",
108 maybe_timestamp(sc, "timestamp_tcp_setup")
109 ]
110 )
111 if sc.ssl_established:
112 parts.append(
113 [
114 "Server conn. SSL handshake",
115 maybe_timestamp(sc, "timestamp_ssl_setup")
116 ]
117 )
118 parts.append(
119 [
120 "Client conn. SSL handshake",
121 maybe_timestamp(cc, "timestamp_ssl_setup")
122 ]
123 )
124 parts.append(
125 [
126 "First request byte",
127 maybe_timestamp(req, "timestamp_start")
128 ]
129 )
130 parts.append(
131 [
132 "Request complete",
133 maybe_timestamp(req, "timestamp_end")
134 ]
135 )
136 parts.append(
137 [
138 "First response byte",
139 maybe_timestamp(resp, "timestamp_start")
140 ]
141 )
142 parts.append(
143 [
144 "Response complete",
145 maybe_timestamp(resp, "timestamp_end")
146 ]
147 )
148
149 # sort operations by timestamp
150 parts = sorted(parts, key=lambda p: p[1])
151
152 text.append(urwid.Text([("head", "Timing:")]))
153 text.extend(common.format_keyvals(parts, key="key", val="text", indent=4))
154 return searchable.Searchable(state, text)
155
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libmproxy/console/flowdetailview.py b/libmproxy/console/flowdetailview.py
--- a/libmproxy/console/flowdetailview.py
+++ b/libmproxy/console/flowdetailview.py
@@ -20,7 +20,7 @@
req = flow.request
resp = flow.response
- if sc:
+ if sc is not None:
text.append(urwid.Text([("head", "Server Connection:")]))
parts = [
["Address", "%s:%s" % sc.address()],
@@ -76,7 +76,7 @@
common.format_keyvals(parts, key="key", val="text", indent=4)
)
- if cc:
+ if cc is not None:
text.append(urwid.Text([("head", "Client Connection:")]))
parts = [
| {"golden_diff": "diff --git a/libmproxy/console/flowdetailview.py b/libmproxy/console/flowdetailview.py\n--- a/libmproxy/console/flowdetailview.py\n+++ b/libmproxy/console/flowdetailview.py\n@@ -20,7 +20,7 @@\n req = flow.request\n resp = flow.response\n \n- if sc:\n+ if sc is not None:\n text.append(urwid.Text([(\"head\", \"Server Connection:\")]))\n parts = [\n [\"Address\", \"%s:%s\" % sc.address()],\n@@ -76,7 +76,7 @@\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n \n- if cc:\n+ if cc is not None:\n text.append(urwid.Text([(\"head\", \"Client Connection:\")]))\n \n parts = [\n", "issue": "Client Address is not displayed for flows from dumpfiles\nI have two clients connecting to mitmproxy now, and I'd like to see the client IP for each request in the \"Detail\" tab.\n\nMaybe show the server's IP too, since it can differ when connections come from different local IPs (e.g. 127.0.0.1 and 192.168.1.101).\n\n", "before_files": [{"content": "from __future__ import absolute_import\nimport urwid\nfrom . import common, searchable\nfrom .. import utils\n\n\ndef maybe_timestamp(base, attr):\n if base and getattr(base, attr):\n return utils.format_timestamp_with_milli(getattr(base, attr))\n else:\n return \"active\"\n pass\n\n\ndef flowdetails(state, flow):\n text = []\n\n cc = flow.client_conn\n sc = flow.server_conn\n req = flow.request\n resp = flow.response\n\n if sc:\n text.append(urwid.Text([(\"head\", \"Server Connection:\")]))\n parts = [\n [\"Address\", \"%s:%s\" % sc.address()],\n ]\n\n text.extend(\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n\n c = sc.cert\n if c:\n text.append(urwid.Text([(\"head\", \"Server Certificate:\")]))\n parts = [\n [\"Type\", \"%s, %s bits\" % c.keyinfo],\n [\"SHA1 digest\", c.digest(\"sha1\")],\n [\"Valid to\", str(c.notafter)],\n [\"Valid from\", str(c.notbefore)],\n [\"Serial\", str(c.serial)],\n [\n \"Subject\",\n urwid.BoxAdapter(\n urwid.ListBox(\n common.format_keyvals(\n c.subject,\n key=\"highlight\",\n val=\"text\"\n )\n ),\n len(c.subject)\n )\n ],\n [\n \"Issuer\",\n urwid.BoxAdapter(\n urwid.ListBox(\n common.format_keyvals(\n c.issuer, key=\"highlight\", val=\"text\"\n )\n ),\n len(c.issuer)\n )\n ]\n ]\n\n if c.altnames:\n parts.append(\n [\n \"Alt names\",\n \", \".join(c.altnames)\n ]\n )\n text.extend(\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n\n if cc:\n text.append(urwid.Text([(\"head\", \"Client Connection:\")]))\n\n parts = [\n [\"Address\", \"%s:%s\" % cc.address()],\n # [\"Requests\", \"%s\"%cc.requestcount],\n ]\n\n text.extend(\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n\n parts = []\n\n parts.append(\n [\n \"Client conn. established\",\n maybe_timestamp(cc, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Server conn. initiated\",\n maybe_timestamp(sc, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Server conn. TCP handshake\",\n maybe_timestamp(sc, \"timestamp_tcp_setup\")\n ]\n )\n if sc.ssl_established:\n parts.append(\n [\n \"Server conn. SSL handshake\",\n maybe_timestamp(sc, \"timestamp_ssl_setup\")\n ]\n )\n parts.append(\n [\n \"Client conn. 
SSL handshake\",\n maybe_timestamp(cc, \"timestamp_ssl_setup\")\n ]\n )\n parts.append(\n [\n \"First request byte\",\n maybe_timestamp(req, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Request complete\",\n maybe_timestamp(req, \"timestamp_end\")\n ]\n )\n parts.append(\n [\n \"First response byte\",\n maybe_timestamp(resp, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Response complete\",\n maybe_timestamp(resp, \"timestamp_end\")\n ]\n )\n\n # sort operations by timestamp\n parts = sorted(parts, key=lambda p: p[1])\n\n text.append(urwid.Text([(\"head\", \"Timing:\")]))\n text.extend(common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4))\n return searchable.Searchable(state, text)\n", "path": "libmproxy/console/flowdetailview.py"}], "after_files": [{"content": "from __future__ import absolute_import\nimport urwid\nfrom . import common, searchable\nfrom .. import utils\n\n\ndef maybe_timestamp(base, attr):\n if base and getattr(base, attr):\n return utils.format_timestamp_with_milli(getattr(base, attr))\n else:\n return \"active\"\n pass\n\n\ndef flowdetails(state, flow):\n text = []\n\n cc = flow.client_conn\n sc = flow.server_conn\n req = flow.request\n resp = flow.response\n\n if sc is not None:\n text.append(urwid.Text([(\"head\", \"Server Connection:\")]))\n parts = [\n [\"Address\", \"%s:%s\" % sc.address()],\n ]\n\n text.extend(\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n\n c = sc.cert\n if c:\n text.append(urwid.Text([(\"head\", \"Server Certificate:\")]))\n parts = [\n [\"Type\", \"%s, %s bits\" % c.keyinfo],\n [\"SHA1 digest\", c.digest(\"sha1\")],\n [\"Valid to\", str(c.notafter)],\n [\"Valid from\", str(c.notbefore)],\n [\"Serial\", str(c.serial)],\n [\n \"Subject\",\n urwid.BoxAdapter(\n urwid.ListBox(\n common.format_keyvals(\n c.subject,\n key=\"highlight\",\n val=\"text\"\n )\n ),\n len(c.subject)\n )\n ],\n [\n \"Issuer\",\n urwid.BoxAdapter(\n urwid.ListBox(\n common.format_keyvals(\n c.issuer, key=\"highlight\", val=\"text\"\n )\n ),\n len(c.issuer)\n )\n ]\n ]\n\n if c.altnames:\n parts.append(\n [\n \"Alt names\",\n \", \".join(c.altnames)\n ]\n )\n text.extend(\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n\n if cc is not None:\n text.append(urwid.Text([(\"head\", \"Client Connection:\")]))\n\n parts = [\n [\"Address\", \"%s:%s\" % cc.address()],\n # [\"Requests\", \"%s\"%cc.requestcount],\n ]\n\n text.extend(\n common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4)\n )\n\n parts = []\n\n parts.append(\n [\n \"Client conn. established\",\n maybe_timestamp(cc, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Server conn. initiated\",\n maybe_timestamp(sc, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Server conn. TCP handshake\",\n maybe_timestamp(sc, \"timestamp_tcp_setup\")\n ]\n )\n if sc.ssl_established:\n parts.append(\n [\n \"Server conn. SSL handshake\",\n maybe_timestamp(sc, \"timestamp_ssl_setup\")\n ]\n )\n parts.append(\n [\n \"Client conn. 
SSL handshake\",\n maybe_timestamp(cc, \"timestamp_ssl_setup\")\n ]\n )\n parts.append(\n [\n \"First request byte\",\n maybe_timestamp(req, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Request complete\",\n maybe_timestamp(req, \"timestamp_end\")\n ]\n )\n parts.append(\n [\n \"First response byte\",\n maybe_timestamp(resp, \"timestamp_start\")\n ]\n )\n parts.append(\n [\n \"Response complete\",\n maybe_timestamp(resp, \"timestamp_end\")\n ]\n )\n\n # sort operations by timestamp\n parts = sorted(parts, key=lambda p: p[1])\n\n text.append(urwid.Text([(\"head\", \"Timing:\")]))\n text.extend(common.format_keyvals(parts, key=\"key\", val=\"text\", indent=4))\n return searchable.Searchable(state, text)\n", "path": "libmproxy/console/flowdetailview.py"}]} | 1,522 | 184 |
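A note on the mitmproxy patch above: it swaps truthiness checks (`if sc:`, `if cc:`) for `is not None` checks, which matters when a connection object read back from a dump file exists but evaluates as falsy, so its address was silently skipped. The snippet below is a small self-contained illustration of that pitfall; the Conn class is hypothetical and only mimics an object whose boolean value tracks whether it is live.

```python
# Hypothetical connection object: present (not None) but "not live",
# so it is falsy even though its address is still worth displaying.
class Conn:
    def __init__(self, address, live):
        self.address = address
        self.live = live

    def __bool__(self):  # __nonzero__ on Python 2
        return self.live


restored = Conn(("127.0.0.1", 54321), live=False)

if restored:                # truthiness check: skipped, address never shown
    print("truthy:", restored.address)

if restored is not None:    # identity check: runs, address is displayed
    print("not None:", restored.address)
```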
gh_patches_debug_16006 | rasdani/github-patches | git_diff | openvinotoolkit__datumaro-275 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CLI help usage is misleading
The usage recommendation in the CLI help message is misleading.
**Steps:**
```
datum -h
```
**Expected result:**
Correct form for usage recommendation
**Current result:**
```
usage: datumaro [-h] [--version] [--loglevel LOGLEVEL]
```
``` Run 'datumaro COMMAND --help' ```
**Notes:**
`datumaro` is not the correct form and results in "command not found". The same applies to the command-specific help messages.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `datumaro/cli/__main__.py`
Content:
```
1
2 # Copyright (C) 2019-2021 Intel Corporation
3 #
4 # SPDX-License-Identifier: MIT
5
6 import argparse
7 import logging as log
8 import sys
9
10 from . import contexts, commands
11 from .util import CliException, add_subparser
12 from ..version import VERSION
13
14
15 _log_levels = {
16 'debug': log.DEBUG,
17 'info': log.INFO,
18 'warning': log.WARNING,
19 'error': log.ERROR,
20 'critical': log.CRITICAL
21 }
22
23 def loglevel(name):
24 return _log_levels[name]
25
26 class _LogManager:
27 @classmethod
28 def init_logger(cls, args=None):
29 # Define minimalistic parser only to obtain loglevel
30 parser = argparse.ArgumentParser(add_help=False)
31 cls._define_loglevel_option(parser)
32 args, _ = parser.parse_known_args(args)
33
34 log.basicConfig(format='%(asctime)s %(levelname)s: %(message)s',
35 level=args.loglevel)
36
37 @staticmethod
38 def _define_loglevel_option(parser):
39 parser.add_argument('--loglevel', type=loglevel, default='info',
40 help="Logging level (options: %s; default: %s)" % \
41 (', '.join(_log_levels.keys()), "%(default)s"))
42 return parser
43
44
45 def _make_subcommands_help(commands, help_line_start=0):
46 desc = ""
47 for command_name, _, command_help in commands:
48 desc += (" %-" + str(max(0, help_line_start - 2 - 1)) + "s%s\n") % \
49 (command_name, command_help)
50 return desc
51
52 def make_parser():
53 parser = argparse.ArgumentParser(prog="datumaro",
54 description="Dataset Framework",
55 formatter_class=argparse.RawDescriptionHelpFormatter)
56
57 parser.add_argument('--version', action='version', version=VERSION)
58 _LogManager._define_loglevel_option(parser)
59
60 known_contexts = [
61 ('project', contexts.project, "Actions with project (deprecated)"),
62 ('source', contexts.source, "Actions with data sources"),
63 ('model', contexts.model, "Actions with models"),
64 ]
65 known_commands = [
66 ('create', commands.create, "Create project"),
67 ('import', commands.import_, "Create project from existing dataset"),
68 ('add', commands.add, "Add data source to project"),
69 ('remove', commands.remove, "Remove data source from project"),
70 ('export', commands.export, "Export project in some format"),
71 ('filter', commands.filter, "Filter project"),
72 ('transform', commands.transform, "Transform project"),
73 ('merge', commands.merge, "Merge projects"),
74 ('convert', commands.convert, "Convert dataset into another format"),
75 ('diff', commands.diff, "Compare projects with intersection"),
76 ('ediff', commands.ediff, "Compare projects for equality"),
77 ('stats', commands.stats, "Compute project statistics"),
78 ('info', commands.info, "Print project info"),
79 ('explain', commands.explain, "Run Explainable AI algorithm for model"),
80 ('validate', commands.validate, "Validate project")
81 ]
82
83 # Argparse doesn't support subparser groups:
84 # https://stackoverflow.com/questions/32017020/grouping-argparse-subparser-arguments
85 help_line_start = max((len(e[0]) for e in known_contexts + known_commands),
86 default=0)
87 help_line_start = max((2 + help_line_start) // 4 + 1, 6) * 4 # align to tabs
88 subcommands_desc = ""
89 if known_contexts:
90 subcommands_desc += "Contexts:\n"
91 subcommands_desc += _make_subcommands_help(known_contexts,
92 help_line_start)
93 if known_commands:
94 if subcommands_desc:
95 subcommands_desc += "\n"
96 subcommands_desc += "Commands:\n"
97 subcommands_desc += _make_subcommands_help(known_commands,
98 help_line_start)
99 if subcommands_desc:
100 subcommands_desc += \
101 "\nRun '%s COMMAND --help' for more information on a command." % \
102 parser.prog
103
104 subcommands = parser.add_subparsers(title=subcommands_desc,
105 description="", help=argparse.SUPPRESS)
106 for command_name, command, _ in known_contexts + known_commands:
107 add_subparser(subcommands, command_name, command.build_parser)
108
109 return parser
110
111
112 def main(args=None):
113 _LogManager.init_logger(args)
114
115 parser = make_parser()
116 args = parser.parse_args(args)
117
118 if 'command' not in args:
119 parser.print_help()
120 return 1
121
122 try:
123 return args.command(args)
124 except CliException as e:
125 log.error(e)
126 return 1
127 except Exception as e:
128 log.error(e)
129 raise
130
131
132 if __name__ == '__main__':
133 sys.exit(main())
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/datumaro/cli/__main__.py b/datumaro/cli/__main__.py
--- a/datumaro/cli/__main__.py
+++ b/datumaro/cli/__main__.py
@@ -5,6 +5,7 @@
import argparse
import logging as log
+import os.path as osp
import sys
from . import contexts, commands
@@ -50,9 +51,11 @@
return desc
def make_parser():
- parser = argparse.ArgumentParser(prog="datumaro",
+ parser = argparse.ArgumentParser(
description="Dataset Framework",
formatter_class=argparse.RawDescriptionHelpFormatter)
+ if parser.prog == osp.basename(__file__): # python -m datumaro ...
+ parser.prog = 'datumaro'
parser.add_argument('--version', action='version', version=VERSION)
_LogManager._define_loglevel_option(parser)
| {"golden_diff": "diff --git a/datumaro/cli/__main__.py b/datumaro/cli/__main__.py\n--- a/datumaro/cli/__main__.py\n+++ b/datumaro/cli/__main__.py\n@@ -5,6 +5,7 @@\n \n import argparse\n import logging as log\n+import os.path as osp\n import sys\n \n from . import contexts, commands\n@@ -50,9 +51,11 @@\n return desc\n \n def make_parser():\n- parser = argparse.ArgumentParser(prog=\"datumaro\",\n+ parser = argparse.ArgumentParser(\n description=\"Dataset Framework\",\n formatter_class=argparse.RawDescriptionHelpFormatter)\n+ if parser.prog == osp.basename(__file__): # python -m datumaro ...\n+ parser.prog = 'datumaro'\n \n parser.add_argument('--version', action='version', version=VERSION)\n _LogManager._define_loglevel_option(parser)\n", "issue": "CLI help usage is misleading\nRecommendation of usage in cli help message is misleading.\r\n\r\n**Steps:**\r\n```\r\ndatum -h\r\n```\r\n**Expected result:**\r\nCorrect form for usage recommendation\r\n\r\n**Current result:**\r\n```\r\nusage: datumaro [-h] [--version] [--loglevel LOGLEVEL]\r\n```\r\n``` Run 'datumaro COMMAND --help```\r\n\r\n**Notes:**\r\ndatumaro is not correct form, and results in \"command not found\". The same is command specific help messages. \r\n\n", "before_files": [{"content": "\n# Copyright (C) 2019-2021 Intel Corporation\n#\n# SPDX-License-Identifier: MIT\n\nimport argparse\nimport logging as log\nimport sys\n\nfrom . import contexts, commands\nfrom .util import CliException, add_subparser\nfrom ..version import VERSION\n\n\n_log_levels = {\n 'debug': log.DEBUG,\n 'info': log.INFO,\n 'warning': log.WARNING,\n 'error': log.ERROR,\n 'critical': log.CRITICAL\n}\n\ndef loglevel(name):\n return _log_levels[name]\n\nclass _LogManager:\n @classmethod\n def init_logger(cls, args=None):\n # Define minimalistic parser only to obtain loglevel\n parser = argparse.ArgumentParser(add_help=False)\n cls._define_loglevel_option(parser)\n args, _ = parser.parse_known_args(args)\n\n log.basicConfig(format='%(asctime)s %(levelname)s: %(message)s',\n level=args.loglevel)\n\n @staticmethod\n def _define_loglevel_option(parser):\n parser.add_argument('--loglevel', type=loglevel, default='info',\n help=\"Logging level (options: %s; default: %s)\" % \\\n (', '.join(_log_levels.keys()), \"%(default)s\"))\n return parser\n\n\ndef _make_subcommands_help(commands, help_line_start=0):\n desc = \"\"\n for command_name, _, command_help in commands:\n desc += (\" %-\" + str(max(0, help_line_start - 2 - 1)) + \"s%s\\n\") % \\\n (command_name, command_help)\n return desc\n\ndef make_parser():\n parser = argparse.ArgumentParser(prog=\"datumaro\",\n description=\"Dataset Framework\",\n formatter_class=argparse.RawDescriptionHelpFormatter)\n\n parser.add_argument('--version', action='version', version=VERSION)\n _LogManager._define_loglevel_option(parser)\n\n known_contexts = [\n ('project', contexts.project, \"Actions with project (deprecated)\"),\n ('source', contexts.source, \"Actions with data sources\"),\n ('model', contexts.model, \"Actions with models\"),\n ]\n known_commands = [\n ('create', commands.create, \"Create project\"),\n ('import', commands.import_, \"Create project from existing dataset\"),\n ('add', commands.add, \"Add data source to project\"),\n ('remove', commands.remove, \"Remove data source from project\"),\n ('export', commands.export, \"Export project in some format\"),\n ('filter', commands.filter, \"Filter project\"),\n ('transform', commands.transform, \"Transform project\"),\n ('merge', commands.merge, \"Merge projects\"),\n 
('convert', commands.convert, \"Convert dataset into another format\"),\n ('diff', commands.diff, \"Compare projects with intersection\"),\n ('ediff', commands.ediff, \"Compare projects for equality\"),\n ('stats', commands.stats, \"Compute project statistics\"),\n ('info', commands.info, \"Print project info\"),\n ('explain', commands.explain, \"Run Explainable AI algorithm for model\"),\n ('validate', commands.validate, \"Validate project\")\n ]\n\n # Argparse doesn't support subparser groups:\n # https://stackoverflow.com/questions/32017020/grouping-argparse-subparser-arguments\n help_line_start = max((len(e[0]) for e in known_contexts + known_commands),\n default=0)\n help_line_start = max((2 + help_line_start) // 4 + 1, 6) * 4 # align to tabs\n subcommands_desc = \"\"\n if known_contexts:\n subcommands_desc += \"Contexts:\\n\"\n subcommands_desc += _make_subcommands_help(known_contexts,\n help_line_start)\n if known_commands:\n if subcommands_desc:\n subcommands_desc += \"\\n\"\n subcommands_desc += \"Commands:\\n\"\n subcommands_desc += _make_subcommands_help(known_commands,\n help_line_start)\n if subcommands_desc:\n subcommands_desc += \\\n \"\\nRun '%s COMMAND --help' for more information on a command.\" % \\\n parser.prog\n\n subcommands = parser.add_subparsers(title=subcommands_desc,\n description=\"\", help=argparse.SUPPRESS)\n for command_name, command, _ in known_contexts + known_commands:\n add_subparser(subcommands, command_name, command.build_parser)\n\n return parser\n\n\ndef main(args=None):\n _LogManager.init_logger(args)\n\n parser = make_parser()\n args = parser.parse_args(args)\n\n if 'command' not in args:\n parser.print_help()\n return 1\n\n try:\n return args.command(args)\n except CliException as e:\n log.error(e)\n return 1\n except Exception as e:\n log.error(e)\n raise\n\n\nif __name__ == '__main__':\n sys.exit(main())", "path": "datumaro/cli/__main__.py"}], "after_files": [{"content": "\n# Copyright (C) 2019-2021 Intel Corporation\n#\n# SPDX-License-Identifier: MIT\n\nimport argparse\nimport logging as log\nimport os.path as osp\nimport sys\n\nfrom . 
import contexts, commands\nfrom .util import CliException, add_subparser\nfrom ..version import VERSION\n\n\n_log_levels = {\n 'debug': log.DEBUG,\n 'info': log.INFO,\n 'warning': log.WARNING,\n 'error': log.ERROR,\n 'critical': log.CRITICAL\n}\n\ndef loglevel(name):\n return _log_levels[name]\n\nclass _LogManager:\n @classmethod\n def init_logger(cls, args=None):\n # Define minimalistic parser only to obtain loglevel\n parser = argparse.ArgumentParser(add_help=False)\n cls._define_loglevel_option(parser)\n args, _ = parser.parse_known_args(args)\n\n log.basicConfig(format='%(asctime)s %(levelname)s: %(message)s',\n level=args.loglevel)\n\n @staticmethod\n def _define_loglevel_option(parser):\n parser.add_argument('--loglevel', type=loglevel, default='info',\n help=\"Logging level (options: %s; default: %s)\" % \\\n (', '.join(_log_levels.keys()), \"%(default)s\"))\n return parser\n\n\ndef _make_subcommands_help(commands, help_line_start=0):\n desc = \"\"\n for command_name, _, command_help in commands:\n desc += (\" %-\" + str(max(0, help_line_start - 2 - 1)) + \"s%s\\n\") % \\\n (command_name, command_help)\n return desc\n\ndef make_parser():\n parser = argparse.ArgumentParser(\n description=\"Dataset Framework\",\n formatter_class=argparse.RawDescriptionHelpFormatter)\n if parser.prog == osp.basename(__file__): # python -m datumaro ...\n parser.prog = 'datumaro'\n\n parser.add_argument('--version', action='version', version=VERSION)\n _LogManager._define_loglevel_option(parser)\n\n known_contexts = [\n ('project', contexts.project, \"Actions with project (deprecated)\"),\n ('source', contexts.source, \"Actions with data sources\"),\n ('model', contexts.model, \"Actions with models\"),\n ]\n known_commands = [\n ('create', commands.create, \"Create project\"),\n ('import', commands.import_, \"Create project from existing dataset\"),\n ('add', commands.add, \"Add data source to project\"),\n ('remove', commands.remove, \"Remove data source from project\"),\n ('export', commands.export, \"Export project in some format\"),\n ('filter', commands.filter, \"Filter project\"),\n ('transform', commands.transform, \"Transform project\"),\n ('merge', commands.merge, \"Merge projects\"),\n ('convert', commands.convert, \"Convert dataset into another format\"),\n ('diff', commands.diff, \"Compare projects with intersection\"),\n ('ediff', commands.ediff, \"Compare projects for equality\"),\n ('stats', commands.stats, \"Compute project statistics\"),\n ('info', commands.info, \"Print project info\"),\n ('explain', commands.explain, \"Run Explainable AI algorithm for model\"),\n ('validate', commands.validate, \"Validate project\")\n ]\n\n # Argparse doesn't support subparser groups:\n # https://stackoverflow.com/questions/32017020/grouping-argparse-subparser-arguments\n help_line_start = max((len(e[0]) for e in known_contexts + known_commands),\n default=0)\n help_line_start = max((2 + help_line_start) // 4 + 1, 6) * 4 # align to tabs\n subcommands_desc = \"\"\n if known_contexts:\n subcommands_desc += \"Contexts:\\n\"\n subcommands_desc += _make_subcommands_help(known_contexts,\n help_line_start)\n if known_commands:\n if subcommands_desc:\n subcommands_desc += \"\\n\"\n subcommands_desc += \"Commands:\\n\"\n subcommands_desc += _make_subcommands_help(known_commands,\n help_line_start)\n if subcommands_desc:\n subcommands_desc += \\\n \"\\nRun '%s COMMAND --help' for more information on a command.\" % \\\n parser.prog\n\n subcommands = parser.add_subparsers(title=subcommands_desc,\n 
description=\"\", help=argparse.SUPPRESS)\n for command_name, command, _ in known_contexts + known_commands:\n add_subparser(subcommands, command_name, command.build_parser)\n\n return parser\n\n\ndef main(args=None):\n _LogManager.init_logger(args)\n\n parser = make_parser()\n args = parser.parse_args(args)\n\n if 'command' not in args:\n parser.print_help()\n return 1\n\n try:\n return args.command(args)\n except CliException as e:\n log.error(e)\n return 1\n except Exception as e:\n log.error(e)\n raise\n\n\nif __name__ == '__main__':\n sys.exit(main())", "path": "datumaro/cli/__main__.py"}]} | 1,699 | 193 |
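A note on the datumaro patch above: argparse takes parser.prog from the basename of sys.argv[0], so hard-coding prog="datumaro" made the help advertise a name that differs from the installed datum command, while relying on the default alone would print "__main__.py" when the tool is run as python -m datumaro; the patch overrides prog only in that second case. The snippet below is a standalone illustration of the mechanism; the "datum" name used in the override is illustrative.

```python
# Standalone illustration of how argparse picks the program name and how
# a patch-style override changes the usage line.
import argparse
import os.path as osp

parser = argparse.ArgumentParser(description="Dataset Framework")
print("default prog:", parser.prog)  # basename of sys.argv[0]

# When launched as `python -m somepackage`, sys.argv[0] points at the
# package's __main__.py, so prog would read "__main__.py"; override it
# with the real console-script name in that case only.
if parser.prog == osp.basename(__file__):
    parser.prog = "datum"

print(parser.format_usage().rstrip())  # e.g. "usage: datum [-h]"
```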
gh_patches_debug_15173 | rasdani/github-patches | git_diff | searxng__searxng-131 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
searx_extra/update/update_languages.py: crash on peertube engine
The URL https://peer.tube/api/v1/videos/languages doesn't exist anymore:
https://github.com/searxng/searxng/blob/ee83c99d2b40efaaf4f3e6635e7a43f7731cb128/searx/engines/peertube.py#L24-L25
See also https://framagit.org/framasoft/peertube/search-index/-/tree/master/server/controllers/api
----
See https://framagit.org/framasoft/peertube/search-index/-/blob/master/client/src/views/Search.vue#L644-667
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `searx/engines/peertube.py`
Content:
```
1 # SPDX-License-Identifier: AGPL-3.0-or-later
2 """
3 peertube (Videos)
4 """
5
6 from json import loads
7 from datetime import datetime
8 from urllib.parse import urlencode
9 from searx.utils import html_to_text
10
11 # about
12 about = {
13 "website": 'https://joinpeertube.org',
14 "wikidata_id": 'Q50938515',
15 "official_api_documentation": 'https://docs.joinpeertube.org/api-rest-reference.html',
16 "use_official_api": True,
17 "require_api_key": False,
18 "results": 'JSON',
19 }
20
21 # engine dependent config
22 categories = ["videos"]
23 paging = True
24 base_url = "https://peer.tube"
25 supported_languages_url = base_url + "/api/v1/videos/languages"
26
27
28 # do search-request
29 def request(query, params):
30 sanitized_url = base_url.rstrip("/")
31 pageno = (params["pageno"] - 1) * 15
32 search_url = sanitized_url + "/api/v1/search/videos/?pageno={pageno}&{query}"
33 query_dict = {"search": query}
34 language = params["language"].split("-")[0]
35 # pylint: disable=undefined-variable
36 if "all" != language and language in supported_languages:
37 query_dict["languageOneOf"] = language
38 params["url"] = search_url.format(
39 query=urlencode(query_dict), pageno=pageno
40 )
41 return params
42
43
44 def _get_offset_from_pageno(pageno):
45 return (pageno - 1) * 15 + 1
46
47
48 # get response from search-request
49 def response(resp):
50 sanitized_url = base_url.rstrip("/")
51 results = []
52
53 search_res = loads(resp.text)
54
55 embedded_url = (
56 '<iframe width="560" height="315" sandbox="allow-same-origin allow-scripts allow-popups" '
57 + 'src="'
58 + sanitized_url
59 + '{embed_path}" frameborder="0" allowfullscreen></iframe>'
60 )
61 # return empty array if there are no results
62 if "data" not in search_res:
63 return []
64
65 # parse results
66 for res in search_res["data"]:
67 title = res["name"]
68 url = sanitized_url + "/videos/watch/" + res["uuid"]
69 description = res["description"]
70 if description:
71 content = html_to_text(res["description"])
72 else:
73 content = ""
74 thumbnail = sanitized_url + res["thumbnailPath"]
75 publishedDate = datetime.strptime(res["publishedAt"], "%Y-%m-%dT%H:%M:%S.%fZ")
76 embedded = embedded_url.format(embed_path=res["embedPath"])
77
78 results.append(
79 {
80 "template": "videos.html",
81 "url": url,
82 "title": title,
83 "content": content,
84 "publishedDate": publishedDate,
85 "embedded": embedded,
86 "thumbnail": thumbnail,
87 }
88 )
89
90 # return results
91 return results
92
93
94 def _fetch_supported_languages(resp):
95 peertube_languages = list(loads(resp.text).keys())
96 return peertube_languages
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/searx/engines/peertube.py b/searx/engines/peertube.py
--- a/searx/engines/peertube.py
+++ b/searx/engines/peertube.py
@@ -22,7 +22,9 @@
categories = ["videos"]
paging = True
base_url = "https://peer.tube"
-supported_languages_url = base_url + "/api/v1/videos/languages"
+supported_languages_url = (
+ 'https://framagit.org/framasoft/peertube/search-index/-/raw/master/client/src/views/Search.vue'
+)
# do search-request
@@ -92,5 +94,9 @@
def _fetch_supported_languages(resp):
- peertube_languages = list(loads(resp.text).keys())
+ import re
+
+ # https://docs.python.org/3/howto/regex.html#greedy-versus-non-greedy
+ videolanguages = re.search(r"videoLanguages \(\) \{(.*?)\]", resp.text, re.DOTALL)
+ peertube_languages = [m.group(1) for m in re.finditer(r"\{ id: '([a-z]+)', label:", videolanguages.group(1))]
return peertube_languages
| {"golden_diff": "diff --git a/searx/engines/peertube.py b/searx/engines/peertube.py\n--- a/searx/engines/peertube.py\n+++ b/searx/engines/peertube.py\n@@ -22,7 +22,9 @@\n categories = [\"videos\"]\n paging = True\n base_url = \"https://peer.tube\"\n-supported_languages_url = base_url + \"/api/v1/videos/languages\"\n+supported_languages_url = (\n+ 'https://framagit.org/framasoft/peertube/search-index/-/raw/master/client/src/views/Search.vue'\n+)\n \n \n # do search-request\n@@ -92,5 +94,9 @@\n \n \n def _fetch_supported_languages(resp):\n- peertube_languages = list(loads(resp.text).keys())\n+ import re\n+\n+ # https://docs.python.org/3/howto/regex.html#greedy-versus-non-greedy\n+ videolanguages = re.search(r\"videoLanguages \\(\\) \\{(.*?)\\]\", resp.text, re.DOTALL)\n+ peertube_languages = [m.group(1) for m in re.finditer(r\"\\{ id: '([a-z]+)', label:\", videolanguages.group(1))]\n return peertube_languages\n", "issue": "searx_extra/update/update_languages.py : crash on peertube engine\nThe URL https://peer.tube/api/v1/videos/languages doesn't exist anymore:\r\nhttps://github.com/searxng/searxng/blob/ee83c99d2b40efaaf4f3e6635e7a43f7731cb128/searx/engines/peertube.py#L24-L25\r\n\r\nSee also https://framagit.org/framasoft/peertube/search-index/-/tree/master/server/controllers/api\r\n\r\n----\r\n\r\nSee https://framagit.org/framasoft/peertube/search-index/-/blob/master/client/src/views/Search.vue#L644-667\n", "before_files": [{"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n\"\"\"\n peertube (Videos)\n\"\"\"\n\nfrom json import loads\nfrom datetime import datetime\nfrom urllib.parse import urlencode\nfrom searx.utils import html_to_text\n\n# about\nabout = {\n \"website\": 'https://joinpeertube.org',\n \"wikidata_id\": 'Q50938515',\n \"official_api_documentation\": 'https://docs.joinpeertube.org/api-rest-reference.html',\n \"use_official_api\": True,\n \"require_api_key\": False,\n \"results\": 'JSON',\n}\n\n# engine dependent config\ncategories = [\"videos\"]\npaging = True\nbase_url = \"https://peer.tube\"\nsupported_languages_url = base_url + \"/api/v1/videos/languages\"\n\n\n# do search-request\ndef request(query, params):\n sanitized_url = base_url.rstrip(\"/\")\n pageno = (params[\"pageno\"] - 1) * 15\n search_url = sanitized_url + \"/api/v1/search/videos/?pageno={pageno}&{query}\"\n query_dict = {\"search\": query}\n language = params[\"language\"].split(\"-\")[0]\n # pylint: disable=undefined-variable\n if \"all\" != language and language in supported_languages:\n query_dict[\"languageOneOf\"] = language\n params[\"url\"] = search_url.format(\n query=urlencode(query_dict), pageno=pageno\n )\n return params\n\n\ndef _get_offset_from_pageno(pageno):\n return (pageno - 1) * 15 + 1\n\n\n# get response from search-request\ndef response(resp):\n sanitized_url = base_url.rstrip(\"/\")\n results = []\n\n search_res = loads(resp.text)\n\n embedded_url = (\n '<iframe width=\"560\" height=\"315\" sandbox=\"allow-same-origin allow-scripts allow-popups\" '\n + 'src=\"'\n + sanitized_url\n + '{embed_path}\" frameborder=\"0\" allowfullscreen></iframe>'\n )\n # return empty array if there are no results\n if \"data\" not in search_res:\n return []\n\n # parse results\n for res in search_res[\"data\"]:\n title = res[\"name\"]\n url = sanitized_url + \"/videos/watch/\" + res[\"uuid\"]\n description = res[\"description\"]\n if description:\n content = html_to_text(res[\"description\"])\n else:\n content = \"\"\n thumbnail = sanitized_url + res[\"thumbnailPath\"]\n publishedDate = 
datetime.strptime(res[\"publishedAt\"], \"%Y-%m-%dT%H:%M:%S.%fZ\")\n embedded = embedded_url.format(embed_path=res[\"embedPath\"])\n\n results.append(\n {\n \"template\": \"videos.html\",\n \"url\": url,\n \"title\": title,\n \"content\": content,\n \"publishedDate\": publishedDate,\n \"embedded\": embedded,\n \"thumbnail\": thumbnail,\n }\n )\n\n # return results\n return results\n\n\ndef _fetch_supported_languages(resp):\n peertube_languages = list(loads(resp.text).keys())\n return peertube_languages\n", "path": "searx/engines/peertube.py"}], "after_files": [{"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n\"\"\"\n peertube (Videos)\n\"\"\"\n\nfrom json import loads\nfrom datetime import datetime\nfrom urllib.parse import urlencode\nfrom searx.utils import html_to_text\n\n# about\nabout = {\n \"website\": 'https://joinpeertube.org',\n \"wikidata_id\": 'Q50938515',\n \"official_api_documentation\": 'https://docs.joinpeertube.org/api-rest-reference.html',\n \"use_official_api\": True,\n \"require_api_key\": False,\n \"results\": 'JSON',\n}\n\n# engine dependent config\ncategories = [\"videos\"]\npaging = True\nbase_url = \"https://peer.tube\"\nsupported_languages_url = (\n 'https://framagit.org/framasoft/peertube/search-index/-/raw/master/client/src/views/Search.vue'\n)\n\n\n# do search-request\ndef request(query, params):\n sanitized_url = base_url.rstrip(\"/\")\n pageno = (params[\"pageno\"] - 1) * 15\n search_url = sanitized_url + \"/api/v1/search/videos/?pageno={pageno}&{query}\"\n query_dict = {\"search\": query}\n language = params[\"language\"].split(\"-\")[0]\n # pylint: disable=undefined-variable\n if \"all\" != language and language in supported_languages:\n query_dict[\"languageOneOf\"] = language\n params[\"url\"] = search_url.format(\n query=urlencode(query_dict), pageno=pageno\n )\n return params\n\n\ndef _get_offset_from_pageno(pageno):\n return (pageno - 1) * 15 + 1\n\n\n# get response from search-request\ndef response(resp):\n sanitized_url = base_url.rstrip(\"/\")\n results = []\n\n search_res = loads(resp.text)\n\n embedded_url = (\n '<iframe width=\"560\" height=\"315\" sandbox=\"allow-same-origin allow-scripts allow-popups\" '\n + 'src=\"'\n + sanitized_url\n + '{embed_path}\" frameborder=\"0\" allowfullscreen></iframe>'\n )\n # return empty array if there are no results\n if \"data\" not in search_res:\n return []\n\n # parse results\n for res in search_res[\"data\"]:\n title = res[\"name\"]\n url = sanitized_url + \"/videos/watch/\" + res[\"uuid\"]\n description = res[\"description\"]\n if description:\n content = html_to_text(res[\"description\"])\n else:\n content = \"\"\n thumbnail = sanitized_url + res[\"thumbnailPath\"]\n publishedDate = datetime.strptime(res[\"publishedAt\"], \"%Y-%m-%dT%H:%M:%S.%fZ\")\n embedded = embedded_url.format(embed_path=res[\"embedPath\"])\n\n results.append(\n {\n \"template\": \"videos.html\",\n \"url\": url,\n \"title\": title,\n \"content\": content,\n \"publishedDate\": publishedDate,\n \"embedded\": embedded,\n \"thumbnail\": thumbnail,\n }\n )\n\n # return results\n return results\n\n\ndef _fetch_supported_languages(resp):\n import re\n\n # https://docs.python.org/3/howto/regex.html#greedy-versus-non-greedy\n videolanguages = re.search(r\"videoLanguages \\(\\) \\{(.*?)\\]\", resp.text, re.DOTALL)\n peertube_languages = [m.group(1) for m in re.finditer(r\"\\{ id: '([a-z]+)', label:\", videolanguages.group(1))]\n return peertube_languages\n", "path": "searx/engines/peertube.py"}]} | 1,297 | 283 |
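A note on the peertube patch above: with the /api/v1/videos/languages endpoint gone, the new _fetch_supported_languages pulls the language ids out of the search-index Search.vue source using two regular expressions. The snippet below runs those same expressions against a hard-coded, made-up excerpt shaped like the videoLanguages() computed property, so the extraction can be checked offline without fetching the real file.

```python
# The two regexes from the patched _fetch_supported_languages(), applied
# to a made-up excerpt that mimics the videoLanguages() block in the
# search-index Search.vue.
import re

SAMPLE_VUE = """
  videoLanguages () {
    return [
      { id: 'en', label: this.$gettext('English') },
      { id: 'fr', label: this.$gettext('French') },
      { id: 'de', label: this.$gettext('German') }
    ]
  },
"""

block = re.search(r"videoLanguages \(\) \{(.*?)\]", SAMPLE_VUE, re.DOTALL)
languages = [m.group(1)
             for m in re.finditer(r"\{ id: '([a-z]+)', label:", block.group(1))]
print(languages)  # ['en', 'fr', 'de']
```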
gh_patches_debug_12234 | rasdani/github-patches | git_diff | ivy-llc__ivy-15998 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
multiply
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/paddle/tensor/math.py`
Content:
```
1 # global
2 import ivy
3 from ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes
4 from ivy.functional.frontends.paddle.func_wrapper import (
5 to_ivy_arrays_and_back,
6 )
7
8
9 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
10 @to_ivy_arrays_and_back
11 def sin(x, name=None):
12 return ivy.sin(x)
13
14
15 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
16 @to_ivy_arrays_and_back
17 def cos(x, name=None):
18 return ivy.cos(x)
19
20
21 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
22 @to_ivy_arrays_and_back
23 def acos(x, name=None):
24 return ivy.acos(x)
25
26
27 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
28 @to_ivy_arrays_and_back
29 def cosh(x, name=None):
30 return ivy.cosh(x)
31
32
33 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
34 @to_ivy_arrays_and_back
35 def tanh(x, name=None):
36 return ivy.tanh(x)
37
38
39 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
40 @to_ivy_arrays_and_back
41 def acosh(x, name=None):
42 return ivy.acosh(x)
43
44
45 @with_supported_dtypes({"2.4.2 and below": ("float32", "float64")}, "paddle")
46 @to_ivy_arrays_and_back
47 def asin(x, name=None):
48 return ivy.asin(x)
49
50
51 @with_supported_dtypes({"2.4.2 and below": ("float32", "float64")}, "paddle")
52 @to_ivy_arrays_and_back
53 def log_softmax(x, name=None):
54 return ivy.log_softmax(x)
55
56
57 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
58 @to_ivy_arrays_and_back
59 def log(x, name=None):
60 return ivy.log(x)
61
62
63 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
64 @to_ivy_arrays_and_back
65 def divide(x, y, name=None):
66 return ivy.divide(x, y)
67
68
69 @with_supported_dtypes({"2.4.2 and below": ("float32", "float64")}, "paddle")
70 @to_ivy_arrays_and_back
71 def sqrt(x, name=None):
72 return ivy.sqrt(x)
73
74
75 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
76 @to_ivy_arrays_and_back
77 def atanh(x, name=None):
78 return ivy.atanh(x)
79
80
81 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
82 @to_ivy_arrays_and_back
83 def atan(x, name=None):
84 return ivy.atan(x)
85
86
87 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
88 @to_ivy_arrays_and_back
89 def round(x, name=None):
90 return ivy.round(x)
91
92
93 @with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
94 @to_ivy_arrays_and_back
95 def ceil(x, name=None):
96 return ivy.ceil(x)
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py
--- a/ivy/functional/frontends/paddle/tensor/math.py
+++ b/ivy/functional/frontends/paddle/tensor/math.py
@@ -66,6 +66,24 @@
return ivy.divide(x, y)
+@with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
+@to_ivy_arrays_and_back
+def multiply(x, y, name=None):
+ return ivy.multiply(x, y)
+
+
+@with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
+@to_ivy_arrays_and_back
+def add(x, y, name=None):
+ return ivy.add(x, y)
+
+
+@with_unsupported_dtypes({"2.4.2 and below": ("float16", "bfloat16")}, "paddle")
+@to_ivy_arrays_and_back
+def subtract(x, y, name=None):
+ return ivy.subtract(x, y)
+
+
@with_supported_dtypes({"2.4.2 and below": ("float32", "float64")}, "paddle")
@to_ivy_arrays_and_back
def sqrt(x, name=None):
| {"golden_diff": "diff --git a/ivy/functional/frontends/paddle/tensor/math.py b/ivy/functional/frontends/paddle/tensor/math.py\n--- a/ivy/functional/frontends/paddle/tensor/math.py\n+++ b/ivy/functional/frontends/paddle/tensor/math.py\n@@ -66,6 +66,24 @@\n return ivy.divide(x, y)\n \n \n+@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n+@to_ivy_arrays_and_back\n+def multiply(x, y, name=None):\n+ return ivy.multiply(x, y)\n+\n+\n+@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n+@to_ivy_arrays_and_back\n+def add(x, y, name=None):\n+ return ivy.add(x, y)\n+\n+\n+@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n+@to_ivy_arrays_and_back\n+def subtract(x, y, name=None):\n+ return ivy.subtract(x, y)\n+\n+\n @with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n @to_ivy_arrays_and_back\n def sqrt(x, name=None):\n", "issue": "multiply\n\n", "before_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sin(x, name=None):\n return ivy.sin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cos(x, name=None):\n return ivy.cos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acos(x, name=None):\n return ivy.acos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cosh(x, name=None):\n return ivy.cosh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef tanh(x, name=None):\n return ivy.tanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acosh(x, name=None):\n return ivy.acosh(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef asin(x, name=None):\n return ivy.asin(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef log_softmax(x, name=None):\n return ivy.log_softmax(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef log(x, name=None):\n return ivy.log(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef divide(x, y, name=None):\n return ivy.divide(x, y)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sqrt(x, name=None):\n return ivy.sqrt(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef atanh(x, name=None):\n return ivy.atanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef atan(x, name=None):\n return ivy.atan(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef round(x, name=None):\n return 
ivy.round(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef ceil(x, name=None):\n return ivy.ceil(x)\n", "path": "ivy/functional/frontends/paddle/tensor/math.py"}], "after_files": [{"content": "# global\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes, with_supported_dtypes\nfrom ivy.functional.frontends.paddle.func_wrapper import (\n to_ivy_arrays_and_back,\n)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sin(x, name=None):\n return ivy.sin(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cos(x, name=None):\n return ivy.cos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acos(x, name=None):\n return ivy.acos(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef cosh(x, name=None):\n return ivy.cosh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef tanh(x, name=None):\n return ivy.tanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef acosh(x, name=None):\n return ivy.acosh(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef asin(x, name=None):\n return ivy.asin(x)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef log_softmax(x, name=None):\n return ivy.log_softmax(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef log(x, name=None):\n return ivy.log(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef divide(x, y, name=None):\n return ivy.divide(x, y)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef multiply(x, y, name=None):\n return ivy.multiply(x, y)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef add(x, y, name=None):\n return ivy.add(x, y)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef subtract(x, y, name=None):\n return ivy.subtract(x, y)\n\n\n@with_supported_dtypes({\"2.4.2 and below\": (\"float32\", \"float64\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef sqrt(x, name=None):\n return ivy.sqrt(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef atanh(x, name=None):\n return ivy.atanh(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef atan(x, name=None):\n return ivy.atan(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef round(x, name=None):\n return ivy.round(x)\n\n\n@with_unsupported_dtypes({\"2.4.2 and below\": (\"float16\", \"bfloat16\")}, \"paddle\")\n@to_ivy_arrays_and_back\ndef ceil(x, name=None):\n return ivy.ceil(x)\n", "path": "ivy/functional/frontends/paddle/tensor/math.py"}]} | 1,333 | 316 |
gh_patches_debug_59756 | rasdani/github-patches | git_diff | pyca__cryptography-3819 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Signer/Verifier deprecation warning has wrong stacklevel
Seeing this with Cryptography 2.0:
```
.../python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py:477: DeprecationWarning: signer and verifier have been deprecated. Please use sign and verify instead.
_warn_sign_verify_deprecated()
.../python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py:382: DeprecationWarning: signer and verifier have been deprecated. Please use sign and verify instead.
_warn_sign_verify_deprecated()
```
I see a few open issues related to deprecations (e.g. #3794), but I'm not sure if any of them cover this particular message.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cryptography/hazmat/backends/openssl/utils.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 from __future__ import absolute_import, division, print_function
6
7 import warnings
8
9 from cryptography import utils
10 from cryptography.hazmat.primitives import hashes
11 from cryptography.hazmat.primitives.asymmetric.utils import Prehashed
12
13
14 def _calculate_digest_and_algorithm(backend, data, algorithm):
15 if not isinstance(algorithm, Prehashed):
16 hash_ctx = hashes.Hash(algorithm, backend)
17 hash_ctx.update(data)
18 data = hash_ctx.finalize()
19 else:
20 algorithm = algorithm._algorithm
21
22 if len(data) != algorithm.digest_size:
23 raise ValueError(
24 "The provided data must be the same length as the hash "
25 "algorithm's digest size."
26 )
27
28 return (data, algorithm)
29
30
31 def _check_not_prehashed(signature_algorithm):
32 if isinstance(signature_algorithm, Prehashed):
33 raise TypeError(
34 "Prehashed is only supported in the sign and verify methods. "
35 "It cannot be used with signer or verifier."
36 )
37
38
39 def _warn_sign_verify_deprecated():
40 warnings.warn(
41 "signer and verifier have been deprecated. Please use sign "
42 "and verify instead.",
43 utils.PersistentlyDeprecated,
44 stacklevel=2
45 )
46
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cryptography/hazmat/backends/openssl/utils.py b/src/cryptography/hazmat/backends/openssl/utils.py
--- a/src/cryptography/hazmat/backends/openssl/utils.py
+++ b/src/cryptography/hazmat/backends/openssl/utils.py
@@ -41,5 +41,5 @@
"signer and verifier have been deprecated. Please use sign "
"and verify instead.",
utils.PersistentlyDeprecated,
- stacklevel=2
+ stacklevel=3
)
| {"golden_diff": "diff --git a/src/cryptography/hazmat/backends/openssl/utils.py b/src/cryptography/hazmat/backends/openssl/utils.py\n--- a/src/cryptography/hazmat/backends/openssl/utils.py\n+++ b/src/cryptography/hazmat/backends/openssl/utils.py\n@@ -41,5 +41,5 @@\n \"signer and verifier have been deprecated. Please use sign \"\n \"and verify instead.\",\n utils.PersistentlyDeprecated,\n- stacklevel=2\n+ stacklevel=3\n )\n", "issue": "Signer/Verifier deprecation warning has wrong stacklevel\nSeeing this with Cryptography 2.0:\r\n\r\n```\r\n.../python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py:477: DeprecationWarning: signer and verifier have been deprecated. Please use sign and verify instead.\r\n _warn_sign_verify_deprecated()\r\n.../python3.5/site-packages/cryptography/hazmat/backends/openssl/rsa.py:382: DeprecationWarning: signer and verifier have been deprecated. Please use sign and verify instead.\r\n _warn_sign_verify_deprecated()\r\n```\r\n\r\nI see a few open issues related to deprecations (e.g. #3794), but I'm not sure if any of them cover this particular message.\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport warnings\n\nfrom cryptography import utils\nfrom cryptography.hazmat.primitives import hashes\nfrom cryptography.hazmat.primitives.asymmetric.utils import Prehashed\n\n\ndef _calculate_digest_and_algorithm(backend, data, algorithm):\n if not isinstance(algorithm, Prehashed):\n hash_ctx = hashes.Hash(algorithm, backend)\n hash_ctx.update(data)\n data = hash_ctx.finalize()\n else:\n algorithm = algorithm._algorithm\n\n if len(data) != algorithm.digest_size:\n raise ValueError(\n \"The provided data must be the same length as the hash \"\n \"algorithm's digest size.\"\n )\n\n return (data, algorithm)\n\n\ndef _check_not_prehashed(signature_algorithm):\n if isinstance(signature_algorithm, Prehashed):\n raise TypeError(\n \"Prehashed is only supported in the sign and verify methods. \"\n \"It cannot be used with signer or verifier.\"\n )\n\n\ndef _warn_sign_verify_deprecated():\n warnings.warn(\n \"signer and verifier have been deprecated. Please use sign \"\n \"and verify instead.\",\n utils.PersistentlyDeprecated,\n stacklevel=2\n )\n", "path": "src/cryptography/hazmat/backends/openssl/utils.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport warnings\n\nfrom cryptography import utils\nfrom cryptography.hazmat.primitives import hashes\nfrom cryptography.hazmat.primitives.asymmetric.utils import Prehashed\n\n\ndef _calculate_digest_and_algorithm(backend, data, algorithm):\n if not isinstance(algorithm, Prehashed):\n hash_ctx = hashes.Hash(algorithm, backend)\n hash_ctx.update(data)\n data = hash_ctx.finalize()\n else:\n algorithm = algorithm._algorithm\n\n if len(data) != algorithm.digest_size:\n raise ValueError(\n \"The provided data must be the same length as the hash \"\n \"algorithm's digest size.\"\n )\n\n return (data, algorithm)\n\n\ndef _check_not_prehashed(signature_algorithm):\n if isinstance(signature_algorithm, Prehashed):\n raise TypeError(\n \"Prehashed is only supported in the sign and verify methods. \"\n \"It cannot be used with signer or verifier.\"\n )\n\n\ndef _warn_sign_verify_deprecated():\n warnings.warn(\n \"signer and verifier have been deprecated. Please use sign \"\n \"and verify instead.\",\n utils.PersistentlyDeprecated,\n stacklevel=3\n )\n", "path": "src/cryptography/hazmat/backends/openssl/utils.py"}]} | 812 | 115 |
gh_patches_debug_23200 | rasdani/github-patches | git_diff | RedHatInsights__insights-core-3248 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update pydoc for Messages (/var/log/messages)
Correct the examples: https://github.com/RedHatInsights/insights-core/blob/master/insights/parsers/messages.py#L32
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `insights/parsers/messages.py`
Content:
```
1 """
2 Messages file ``/var/log/messages``
3 ===================================
4 """
5
6 from .. import Syslog, parser
7 from insights.specs import Specs
8
9
10 @parser(Specs.messages)
11 class Messages(Syslog):
12 """
13 Read the ``/var/log/messages`` file.
14
15 .. note::
16 Please refer to its super-class :class:`insights.core.Syslog` for more
17 details.
18
19 Sample log lines::
20
21 May 18 15:13:34 lxc-rhel68-sat56 jabberd/sm[11057]: session started: [email protected]/superclient
22 May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon
23 May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: Launching a JVM...
24 May 18 15:24:28 lxc-rhel68-sat56 yum[11597]: Installed: lynx-2.8.6-27.el6.x86_64
25 May 18 15:36:19 lxc-rhel68-sat56 yum[11954]: Updated: sos-3.2-40.el6.noarch
26
27 .. note::
28 Because /var/log/messages timestamps by default have no year,
29 the year of the logs will be inferred from the year in your timestamp.
30 This will also work around December/January crossovers.
31
32 Examples:
33 >>> Messages.filters.append('wrapper')
34 >>> Messages.token_scan('daemon_start', 'Wrapper Started as Daemon')
35 >>> msgs = shared[Messages]
36 >>> len(msgs.lines)
37 >>> wrapper_msgs = msgs.get('wrapper') # Can only rely on lines filtered being present
38 >>> wrapper_msgs[0]
39 {'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56',
40 'procname': wrapper[11375]', 'message': '--> Wrapper Started as Daemon',
41 'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon'
42 }
43 >>> msgs.daemon_start # Token set if matching lines present in logs
44 True
45 """
46 pass
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/insights/parsers/messages.py b/insights/parsers/messages.py
--- a/insights/parsers/messages.py
+++ b/insights/parsers/messages.py
@@ -30,17 +30,14 @@
This will also work around December/January crossovers.
Examples:
- >>> Messages.filters.append('wrapper')
>>> Messages.token_scan('daemon_start', 'Wrapper Started as Daemon')
- >>> msgs = shared[Messages]
+ >>> type(msgs)
+ <class 'insights.parsers.messages.Messages'>
>>> len(msgs.lines)
+ 9
>>> wrapper_msgs = msgs.get('wrapper') # Can only rely on lines filtered being present
- >>> wrapper_msgs[0]
- {'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56',
- 'procname': wrapper[11375]', 'message': '--> Wrapper Started as Daemon',
- 'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon'
- }
- >>> msgs.daemon_start # Token set if matching lines present in logs
+ >>> result = {'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon', 'message': '--> Wrapper Started as Daemon', 'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56', 'procname': 'wrapper[11375]'}
+ >>> wrapper_msgs[0] == result
True
"""
pass
| {"golden_diff": "diff --git a/insights/parsers/messages.py b/insights/parsers/messages.py\n--- a/insights/parsers/messages.py\n+++ b/insights/parsers/messages.py\n@@ -30,17 +30,14 @@\n This will also work around December/January crossovers.\n \n Examples:\n- >>> Messages.filters.append('wrapper')\n >>> Messages.token_scan('daemon_start', 'Wrapper Started as Daemon')\n- >>> msgs = shared[Messages]\n+ >>> type(msgs)\n+ <class 'insights.parsers.messages.Messages'>\n >>> len(msgs.lines)\n+ 9\n >>> wrapper_msgs = msgs.get('wrapper') # Can only rely on lines filtered being present\n- >>> wrapper_msgs[0]\n- {'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56',\n- 'procname': wrapper[11375]', 'message': '--> Wrapper Started as Daemon',\n- 'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon'\n- }\n- >>> msgs.daemon_start # Token set if matching lines present in logs\n+ >>> result = {'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon', 'message': '--> Wrapper Started as Daemon', 'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56', 'procname': 'wrapper[11375]'}\n+ >>> wrapper_msgs[0] == result\n True\n \"\"\"\n pass\n", "issue": "Update pydoc for Messages (/var/log/messages)\nCorrect the examples: https://github.com/RedHatInsights/insights-core/blob/master/insights/parsers/messages.py#L32\n", "before_files": [{"content": "\"\"\"\nMessages file ``/var/log/messages``\n===================================\n\"\"\"\n\nfrom .. import Syslog, parser\nfrom insights.specs import Specs\n\n\n@parser(Specs.messages)\nclass Messages(Syslog):\n \"\"\"\n Read the ``/var/log/messages`` file.\n\n .. note::\n Please refer to its super-class :class:`insights.core.Syslog` for more\n details.\n\n Sample log lines::\n\n May 18 15:13:34 lxc-rhel68-sat56 jabberd/sm[11057]: session started: [email protected]/superclient\n May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon\n May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: Launching a JVM...\n May 18 15:24:28 lxc-rhel68-sat56 yum[11597]: Installed: lynx-2.8.6-27.el6.x86_64\n May 18 15:36:19 lxc-rhel68-sat56 yum[11954]: Updated: sos-3.2-40.el6.noarch\n\n .. note::\n Because /var/log/messages timestamps by default have no year,\n the year of the logs will be inferred from the year in your timestamp.\n This will also work around December/January crossovers.\n\n Examples:\n >>> Messages.filters.append('wrapper')\n >>> Messages.token_scan('daemon_start', 'Wrapper Started as Daemon')\n >>> msgs = shared[Messages]\n >>> len(msgs.lines)\n >>> wrapper_msgs = msgs.get('wrapper') # Can only rely on lines filtered being present\n >>> wrapper_msgs[0]\n {'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56',\n 'procname': wrapper[11375]', 'message': '--> Wrapper Started as Daemon',\n 'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon'\n }\n >>> msgs.daemon_start # Token set if matching lines present in logs\n True\n \"\"\"\n pass\n", "path": "insights/parsers/messages.py"}], "after_files": [{"content": "\"\"\"\nMessages file ``/var/log/messages``\n===================================\n\"\"\"\n\nfrom .. import Syslog, parser\nfrom insights.specs import Specs\n\n\n@parser(Specs.messages)\nclass Messages(Syslog):\n \"\"\"\n Read the ``/var/log/messages`` file.\n\n .. 
note::\n Please refer to its super-class :class:`insights.core.Syslog` for more\n details.\n\n Sample log lines::\n\n May 18 15:13:34 lxc-rhel68-sat56 jabberd/sm[11057]: session started: [email protected]/superclient\n May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon\n May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: Launching a JVM...\n May 18 15:24:28 lxc-rhel68-sat56 yum[11597]: Installed: lynx-2.8.6-27.el6.x86_64\n May 18 15:36:19 lxc-rhel68-sat56 yum[11954]: Updated: sos-3.2-40.el6.noarch\n\n .. note::\n Because /var/log/messages timestamps by default have no year,\n the year of the logs will be inferred from the year in your timestamp.\n This will also work around December/January crossovers.\n\n Examples:\n >>> Messages.token_scan('daemon_start', 'Wrapper Started as Daemon')\n >>> type(msgs)\n <class 'insights.parsers.messages.Messages'>\n >>> len(msgs.lines)\n 9\n >>> wrapper_msgs = msgs.get('wrapper') # Can only rely on lines filtered being present\n >>> result = {'raw_message': 'May 18 15:13:36 lxc-rhel68-sat56 wrapper[11375]: --> Wrapper Started as Daemon', 'message': '--> Wrapper Started as Daemon', 'timestamp': 'May 18 15:13:36', 'hostname': 'lxc-rhel68-sat56', 'procname': 'wrapper[11375]'}\n >>> wrapper_msgs[0] == result\n True\n \"\"\"\n pass\n", "path": "insights/parsers/messages.py"}]} | 958 | 408 |
gh_patches_debug_11114 | rasdani/github-patches | git_diff | freedomofpress__securedrop-1890 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tails 3.0 does not restore the original filename when uncompressing a file
# Bug
Tails 3.0 does not preserve the filename when extracting an archive.
## Description
When extracting an archive, Tails 3.0 will rename the file inside the archive to _data_.
## Steps to Reproduce
Download a SecureDrop submission, decrypt the submission and extract the archive.
## Expected Behavior
Tails should preserve the filename and leave you with, say, _code_name.pdf_.
## Actual Behavior
Tails 3.0 leaves you with a file called _data_ instead.
## Comments
I suspect this is an issue with Tails 3.0 and not the way SecureDrop creates the archive in the first place. Filing here just in case.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `install_files/ansible-base/roles/tails-config/files/securedrop_init.py`
Content:
```
1 #!/usr/bin/python
2
3 import os
4 import sys
5 import subprocess
6
7
8 # check for root
9 if os.geteuid() != 0:
10 sys.exit('You need to run this as root')
11
12 # paths
13 path_torrc_additions = '/home/amnesia/Persistent/.securedrop/torrc_additions'
14 path_torrc_backup = '/etc/tor/torrc.bak'
15 path_torrc = '/etc/tor/torrc'
16
17 # load torrc_additions
18 if os.path.isfile(path_torrc_additions):
19 torrc_additions = open(path_torrc_additions).read()
20 else:
21 sys.exit('Error opening {0} for reading'.format(path_torrc_additions))
22
23 # load torrc
24 if os.path.isfile(path_torrc_backup):
25 torrc = open(path_torrc_backup).read()
26 else:
27 if os.path.isfile(path_torrc):
28 torrc = open(path_torrc).read()
29 else:
30 sys.exit('Error opening {0} for reading'.format(path_torrc))
31
32 # save a backup
33 open(path_torrc_backup, 'w').write(torrc)
34
35 # append the additions
36 open(path_torrc, 'w').write(torrc + torrc_additions)
37
38 # reload tor
39 try:
40 subprocess.check_call(['systemctl', 'reload', '[email protected]'])
41 except subprocess.CalledProcessError:
42 sys.exit('Error reloading Tor')
43
44 # notify the user
45 subprocess.call(['tails-notify-user',
46 'SecureDrop successfully auto-configured!',
47 'You can now access the Journalist Interface.\nIf you are an admin, you can now SSH to the servers.'])
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/install_files/ansible-base/roles/tails-config/files/securedrop_init.py b/install_files/ansible-base/roles/tails-config/files/securedrop_init.py
--- a/install_files/ansible-base/roles/tails-config/files/securedrop_init.py
+++ b/install_files/ansible-base/roles/tails-config/files/securedrop_init.py
@@ -41,6 +41,13 @@
except subprocess.CalledProcessError:
sys.exit('Error reloading Tor')
+# Turn off "automatic-decompression" in Nautilus to ensure the original
+# submission filename is restored (see
+# https://github.com/freedomofpress/securedrop/issues/1862#issuecomment-311519750).
+subprocess.call(['/usr/bin/dconf', 'write',
+ '/org/gnome/nautilus/preferences/automatic-decompression',
+ 'false'])
+
# notify the user
subprocess.call(['tails-notify-user',
'SecureDrop successfully auto-configured!',
| {"golden_diff": "diff --git a/install_files/ansible-base/roles/tails-config/files/securedrop_init.py b/install_files/ansible-base/roles/tails-config/files/securedrop_init.py\n--- a/install_files/ansible-base/roles/tails-config/files/securedrop_init.py\n+++ b/install_files/ansible-base/roles/tails-config/files/securedrop_init.py\n@@ -41,6 +41,13 @@\n except subprocess.CalledProcessError:\n sys.exit('Error reloading Tor')\n \n+# Turn off \"automatic-decompression\" in Nautilus to ensure the original\n+# submission filename is restored (see\n+# https://github.com/freedomofpress/securedrop/issues/1862#issuecomment-311519750).\n+subprocess.call(['/usr/bin/dconf', 'write',\n+ '/org/gnome/nautilus/preferences/automatic-decompression',\n+ 'false'])\n+\n # notify the user\n subprocess.call(['tails-notify-user',\n 'SecureDrop successfully auto-configured!',\n", "issue": "Tails 3.0 does not restore the original filename when uncompressing a file\n# Bug\r\n\r\nTails 3.0 does not preserve the filename when extracting an archive.\r\n\r\n## Description\r\n\r\nWhen extracting an archive, Tails 3.0 will rename the file inside the archive to _data_.\r\n\r\n## Steps to Reproduce\r\n\r\nDownload a SecureDrop submission, decrypt the submission and extract the archive.\r\n\r\n## Expected Behavior\r\n\r\nTails should preserve the filename and leave you with, say, _code_name.pdf_.\r\n\r\n## Actual Behavior\r\n\r\nTails 3.0 leaves you with a file called _data_ instead.\r\n\r\n## Comments\r\n\r\nI suspect this is an issue with Tails 3.0 and not the way SecureDrop creates the archive in the first place. Filing here just in case.\n", "before_files": [{"content": "#!/usr/bin/python\n\nimport os\nimport sys\nimport subprocess\n\n\n# check for root\nif os.geteuid() != 0:\n sys.exit('You need to run this as root')\n\n# paths\npath_torrc_additions = '/home/amnesia/Persistent/.securedrop/torrc_additions'\npath_torrc_backup = '/etc/tor/torrc.bak'\npath_torrc = '/etc/tor/torrc'\n\n# load torrc_additions\nif os.path.isfile(path_torrc_additions):\n torrc_additions = open(path_torrc_additions).read()\nelse:\n sys.exit('Error opening {0} for reading'.format(path_torrc_additions))\n\n# load torrc\nif os.path.isfile(path_torrc_backup):\n torrc = open(path_torrc_backup).read()\nelse:\n if os.path.isfile(path_torrc):\n torrc = open(path_torrc).read()\n else:\n sys.exit('Error opening {0} for reading'.format(path_torrc))\n\n # save a backup\n open(path_torrc_backup, 'w').write(torrc)\n\n# append the additions\nopen(path_torrc, 'w').write(torrc + torrc_additions)\n\n# reload tor\ntry:\n subprocess.check_call(['systemctl', 'reload', '[email protected]'])\nexcept subprocess.CalledProcessError:\n sys.exit('Error reloading Tor')\n\n# notify the user\nsubprocess.call(['tails-notify-user',\n 'SecureDrop successfully auto-configured!',\n 'You can now access the Journalist Interface.\\nIf you are an admin, you can now SSH to the servers.'])\n", "path": "install_files/ansible-base/roles/tails-config/files/securedrop_init.py"}], "after_files": [{"content": "#!/usr/bin/python\n\nimport os\nimport sys\nimport subprocess\n\n\n# check for root\nif os.geteuid() != 0:\n sys.exit('You need to run this as root')\n\n# paths\npath_torrc_additions = '/home/amnesia/Persistent/.securedrop/torrc_additions'\npath_torrc_backup = '/etc/tor/torrc.bak'\npath_torrc = '/etc/tor/torrc'\n\n# load torrc_additions\nif os.path.isfile(path_torrc_additions):\n torrc_additions = open(path_torrc_additions).read()\nelse:\n sys.exit('Error opening {0} for 
reading'.format(path_torrc_additions))\n\n# load torrc\nif os.path.isfile(path_torrc_backup):\n torrc = open(path_torrc_backup).read()\nelse:\n if os.path.isfile(path_torrc):\n torrc = open(path_torrc).read()\n else:\n sys.exit('Error opening {0} for reading'.format(path_torrc))\n\n # save a backup\n open(path_torrc_backup, 'w').write(torrc)\n\n# append the additions\nopen(path_torrc, 'w').write(torrc + torrc_additions)\n\n# reload tor\ntry:\n subprocess.check_call(['systemctl', 'reload', '[email protected]'])\nexcept subprocess.CalledProcessError:\n sys.exit('Error reloading Tor')\n\n# Turn off \"automatic-decompression\" in Nautilus to ensure the original\n# submission filename is restored (see\n# https://github.com/freedomofpress/securedrop/issues/1862#issuecomment-311519750).\nsubprocess.call(['/usr/bin/dconf', 'write',\n '/org/gnome/nautilus/preferences/automatic-decompression',\n 'false'])\n\n# notify the user\nsubprocess.call(['tails-notify-user',\n 'SecureDrop successfully auto-configured!',\n 'You can now access the Journalist Interface.\\nIf you are an admin, you can now SSH to the servers.'])\n", "path": "install_files/ansible-base/roles/tails-config/files/securedrop_init.py"}]} | 890 | 218 |
gh_patches_debug_13980 | rasdani/github-patches | git_diff | ocf__ocfweb-185 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
libsass>0.10.0 breaks dh-virtuaenv
https://jenkins.ocf.berkeley.edu/view/ocfweb-deploy/job/ocfweb-build-deb/232/console
```
# build sass, output static files
debian/ocfweb/usr/share/python/ocfweb/bin/python setup.py build_sass
running build_sass
building 'ocfweb' sass
/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/setuptools/dist.py:294: UserWarning: The version specified ('2016.01.30.23.27-git7cd5ebbd') is an invalid version, this may not work as expected with newer versions of setuptools, pip, and PyPI. Please see PEP 440 for more details.
"details." % self.metadata.version
Traceback (most recent call last):
File "setup.py", line 44, in <module>
'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',
File "/usr/lib/python3.4/distutils/core.py", line 148, in setup
dist.run_commands()
File "/usr/lib/python3.4/distutils/dist.py", line 955, in run_commands
self.run_command(cmd)
File "/usr/lib/python3.4/distutils/dist.py", line 974, in run_command
cmd_obj.run()
File "/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/distutils.py", line 123, in run
output_style=self.output_style
File "/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/builder.py", line 177, in build
).values()
File "/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/builder.py", line 73, in build_directory
_root_css=_root_css)
File "/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/builder.py", line 63, in build_directory
include_paths=[_root_sass])
File "/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sass.py", line 644, in compile
raise CompileError(v)
sass.CompileError: b'Error: ".page-staff-hours .hour .ocf-staffhours-faces" failed to @extend ".list-inline".\n The selector ".list-inline" was not found.\n Use "@extend .list-inline !optional" if the extend should be able to fail.\n on line 37 of ocfweb/static/scss/pages/staff-hours.scss\n>> @extend .list-inline;\n ------------^\n'
debian/rules:17: recipe for target 'override_dh_virtualenv' failed
```
reproduces locally, but only with `make builddeb` and not the regular `make scss`
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import find_packages
2 from setuptools import setup
3
4 try:
5 with open('.version') as f:
6 VERSION = f.readline().strip()
7 except IOError:
8 VERSION = 'unknown'
9
10 setup(
11 name='ocfweb',
12 version=VERSION,
13 packages=find_packages(exclude=['debian', 'venv']),
14 include_package_data=True,
15 url='https://www.ocf.berkeley.edu/',
16 author='Open Computing Facility',
17 author_email='[email protected]',
18 install_requires=[
19 'cached-property',
20 'celery[redis]',
21 'django-bootstrap-form',
22 'django-ipware',
23 'django-mathfilters',
24 'django-redis',
25 'django>=1.10,<1.10.999',
26 'gunicorn',
27 'libsass<=0.10.0', # https://github.com/ocf/ocfweb/issues/128
28 'matplotlib',
29 'mistune',
30 'numpy',
31 'ocflib',
32 'pycrypto',
33 'pygments',
34 'pymysql',
35 'python-dateutil',
36 ],
37 sass_manifests={
38 'ocfweb': ('static/scss',), # XXX: must be tuple
39 },
40 entry_points={
41 'console_scripts': [
42 'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',
43 ],
44 },
45 classifiers=[
46 'Programming Language :: Python :: 3',
47 ],
48 )
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -24,7 +24,7 @@
'django-redis',
'django>=1.10,<1.10.999',
'gunicorn',
- 'libsass<=0.10.0', # https://github.com/ocf/ocfweb/issues/128
+ 'libsass',
'matplotlib',
'mistune',
'numpy',
@@ -34,9 +34,6 @@
'pymysql',
'python-dateutil',
],
- sass_manifests={
- 'ocfweb': ('static/scss',), # XXX: must be tuple
- },
entry_points={
'console_scripts': [
'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -24,7 +24,7 @@\n 'django-redis',\n 'django>=1.10,<1.10.999',\n 'gunicorn',\n- 'libsass<=0.10.0', # https://github.com/ocf/ocfweb/issues/128\n+ 'libsass',\n 'matplotlib',\n 'mistune',\n 'numpy',\n@@ -34,9 +34,6 @@\n 'pymysql',\n 'python-dateutil',\n ],\n- sass_manifests={\n- 'ocfweb': ('static/scss',), # XXX: must be tuple\n- },\n entry_points={\n 'console_scripts': [\n 'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',\n", "issue": "libsass>0.10.0 breaks dh-virtuaenv\nhttps://jenkins.ocf.berkeley.edu/view/ocfweb-deploy/job/ocfweb-build-deb/232/console\n\n```\n# build sass, output static files\ndebian/ocfweb/usr/share/python/ocfweb/bin/python setup.py build_sass\nrunning build_sass\nbuilding 'ocfweb' sass\n/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/setuptools/dist.py:294: UserWarning: The version specified ('2016.01.30.23.27-git7cd5ebbd') is an invalid version, this may not work as expected with newer versions of setuptools, pip, and PyPI. Please see PEP 440 for more details.\n \"details.\" % self.metadata.version\nTraceback (most recent call last):\n File \"setup.py\", line 44, in <module>\n 'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',\n File \"/usr/lib/python3.4/distutils/core.py\", line 148, in setup\n dist.run_commands()\n File \"/usr/lib/python3.4/distutils/dist.py\", line 955, in run_commands\n self.run_command(cmd)\n File \"/usr/lib/python3.4/distutils/dist.py\", line 974, in run_command\n cmd_obj.run()\n File \"/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/distutils.py\", line 123, in run\n output_style=self.output_style\n File \"/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/builder.py\", line 177, in build\n ).values()\n File \"/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/builder.py\", line 73, in build_directory\n _root_css=_root_css)\n File \"/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sassutils/builder.py\", line 63, in build_directory\n include_paths=[_root_sass])\n File \"/opt/jenkins/slave/workspace/ocfweb-build-deb/src/debian/ocfweb/usr/share/python/ocfweb/lib/python3.4/site-packages/sass.py\", line 644, in compile\n raise CompileError(v)\nsass.CompileError: b'Error: \".page-staff-hours .hour .ocf-staffhours-faces\" failed to @extend \".list-inline\".\\n The selector \".list-inline\" was not found.\\n Use \"@extend .list-inline !optional\" if the extend should be able to fail.\\n on line 37 of ocfweb/static/scss/pages/staff-hours.scss\\n>> @extend .list-inline;\\n ------------^\\n'\ndebian/rules:17: recipe for target 'override_dh_virtualenv' failed\n```\n\nreproduces locally, but only with `make builddeb` and not the regular `make scss`\n\n", "before_files": [{"content": "from setuptools import find_packages\nfrom setuptools import setup\n\ntry:\n with open('.version') as f:\n VERSION = f.readline().strip()\nexcept IOError:\n VERSION = 'unknown'\n\nsetup(\n name='ocfweb',\n version=VERSION,\n packages=find_packages(exclude=['debian', 'venv']),\n include_package_data=True,\n url='https://www.ocf.berkeley.edu/',\n author='Open Computing Facility',\n 
author_email='[email protected]',\n install_requires=[\n 'cached-property',\n 'celery[redis]',\n 'django-bootstrap-form',\n 'django-ipware',\n 'django-mathfilters',\n 'django-redis',\n 'django>=1.10,<1.10.999',\n 'gunicorn',\n 'libsass<=0.10.0', # https://github.com/ocf/ocfweb/issues/128\n 'matplotlib',\n 'mistune',\n 'numpy',\n 'ocflib',\n 'pycrypto',\n 'pygments',\n 'pymysql',\n 'python-dateutil',\n ],\n sass_manifests={\n 'ocfweb': ('static/scss',), # XXX: must be tuple\n },\n entry_points={\n 'console_scripts': [\n 'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',\n ],\n },\n classifiers=[\n 'Programming Language :: Python :: 3',\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import find_packages\nfrom setuptools import setup\n\ntry:\n with open('.version') as f:\n VERSION = f.readline().strip()\nexcept IOError:\n VERSION = 'unknown'\n\nsetup(\n name='ocfweb',\n version=VERSION,\n packages=find_packages(exclude=['debian', 'venv']),\n include_package_data=True,\n url='https://www.ocf.berkeley.edu/',\n author='Open Computing Facility',\n author_email='[email protected]',\n install_requires=[\n 'cached-property',\n 'celery[redis]',\n 'django-bootstrap-form',\n 'django-ipware',\n 'django-mathfilters',\n 'django-redis',\n 'django>=1.10,<1.10.999',\n 'gunicorn',\n 'libsass',\n 'matplotlib',\n 'mistune',\n 'numpy',\n 'ocflib',\n 'pycrypto',\n 'pygments',\n 'pymysql',\n 'python-dateutil',\n ],\n entry_points={\n 'console_scripts': [\n 'ocfweb-run-periodic-functions = ocfweb.bin.run_periodic_functions:main',\n ],\n },\n classifiers=[\n 'Programming Language :: Python :: 3',\n ],\n)\n", "path": "setup.py"}]} | 1,444 | 199 |
gh_patches_debug_26182 | rasdani/github-patches | git_diff | scalableminds__webknossos-libs-641 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Persist log file
When running the cuber, there usually is a lot of output (overflowing terminal buffers). It would be great if all of that output would be persisted in a log file as well to allow to easily scan and search through it, afterwards.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `webknossos/webknossos/utils.py`
Content:
```
1 import argparse
2 import calendar
3 import functools
4 import json
5 import logging
6 import os
7 import time
8 from concurrent.futures import as_completed
9 from concurrent.futures._base import Future
10 from datetime import datetime
11 from multiprocessing import cpu_count
12 from pathlib import Path
13 from typing import Any, Callable, Iterable, List, Optional, Union
14
15 import rich
16 from cluster_tools import WrappedProcessPoolExecutor, get_executor
17 from cluster_tools.schedulers.cluster_executor import ClusterExecutor
18 from rich.progress import Progress
19
20 times = {}
21
22
23 def time_start(identifier: str) -> None:
24 times[identifier] = time.time()
25 logging.debug("{} started".format(identifier))
26
27
28 def time_stop(identifier: str) -> None:
29 _time = times.pop(identifier)
30 logging.debug("{} took {:.8f}s".format(identifier, time.time() - _time))
31
32
33 def get_executor_for_args(
34 args: Optional[argparse.Namespace],
35 ) -> Union[ClusterExecutor, WrappedProcessPoolExecutor]:
36 executor = None
37 if args is None:
38 # For backwards compatibility with code from other packages
39 # we allow args to be None. In this case we are defaulting
40 # to these values:
41 jobs = cpu_count()
42 executor = get_executor("multiprocessing", max_workers=jobs)
43 logging.info("Using pool of {} workers.".format(jobs))
44 elif args.distribution_strategy == "multiprocessing":
45 # Also accept "processes" instead of job to be compatible with segmentation-tools.
46 # In the long run, the args should be unified and provided by the clustertools.
47 if "jobs" in args:
48 jobs = args.jobs
49 elif "processes" in args:
50 jobs = args.processes
51 else:
52 jobs = cpu_count()
53
54 executor = get_executor("multiprocessing", max_workers=jobs)
55 logging.info("Using pool of {} workers.".format(jobs))
56 elif args.distribution_strategy in ("slurm", "kubernetes"):
57 if args.job_resources is None:
58 resources_example = (
59 '{"mem": "1G"}'
60 if args.distribution_strategy == "slurm"
61 else '{"memory": "1G"}'
62 )
63 raise argparse.ArgumentTypeError(
64 f"Job resources (--job_resources) has to be provided when using {args.distribution_strategy} as distribution strategy. Example: --job_resources='{resources_example}'"
65 )
66
67 executor = get_executor(
68 args.distribution_strategy,
69 debug=True,
70 keep_logs=True,
71 job_resources=json.loads(args.job_resources),
72 )
73 logging.info(f"Using {args.distribution_strategy} cluster.")
74 else:
75 logging.error(
76 "Unknown distribution strategy: {}".format(args.distribution_strategy)
77 )
78
79 return executor
80
81
82 F = Callable[..., Any]
83
84
85 def named_partial(func: F, *args: Any, **kwargs: Any) -> F:
86 # Propagate __name__ and __doc__ attributes to partial function
87 partial_func = functools.partial(func, *args, **kwargs)
88 functools.update_wrapper(partial_func, func)
89 if hasattr(func, "__annotations__"):
90 # Generic types cannot be pickled in Python <= 3.6, see https://github.com/python/typing/issues/511
91 partial_func.__annotations__ = {}
92 return partial_func
93
94
95 def wait_and_ensure_success(
96 futures: List[Future], progress_desc: Optional[str] = None
97 ) -> List[Any]:
98 """Waits for all futures to complete and raises an exception
99 as soon as a future resolves with an error."""
100
101 results = []
102 if progress_desc is None:
103 for fut in as_completed(futures):
104 results.append(fut.result())
105 else:
106 with get_rich_progress() as progress:
107 task = progress.add_task(progress_desc, total=len(futures))
108 for fut in as_completed(futures):
109 results.append(fut.result())
110 progress.update(task, advance=1)
111 return results
112
113
114 def snake_to_camel_case(snake_case_name: str) -> str:
115 parts = snake_case_name.split("_")
116 return parts[0] + "".join(part.title() for part in parts[1:])
117
118
119 def get_chunks(arr: List[Any], chunk_size: int) -> Iterable[List[Any]]:
120 for i in range(0, len(arr), chunk_size):
121 yield arr[i : i + chunk_size]
122
123
124 def time_since_epoch_in_ms() -> int:
125 d = datetime.utcnow()
126 unixtime = calendar.timegm(d.utctimetuple())
127 return unixtime * 1000
128
129
130 def copy_directory_with_symlinks(
131 src_path: Path,
132 dst_path: Path,
133 ignore: Iterable[str] = tuple(),
134 make_relative: bool = False,
135 ) -> None:
136 """
137 Links all directories in src_path / dir_name to dst_path / dir_name.
138 """
139 for item in src_path.iterdir():
140 if item.name not in ignore:
141 symlink_path = dst_path / item.name
142 if make_relative:
143 rel_or_abspath = Path(os.path.relpath(item, symlink_path.parent))
144 else:
145 rel_or_abspath = item.resolve()
146 symlink_path.symlink_to(rel_or_abspath)
147
148
149 def setup_logging(args: argparse.Namespace) -> None:
150 logging.basicConfig(
151 level=(logging.DEBUG if args.verbose else logging.INFO),
152 format="%(asctime)s %(levelname)s %(message)s",
153 )
154
155
156 def add_verbose_flag(parser: argparse.ArgumentParser) -> None:
157 parser.add_argument(
158 "--verbose", help="Verbose output", dest="verbose", action="store_true"
159 )
160
161 parser.set_defaults(verbose=False)
162
163
164 def get_rich_progress() -> Progress:
165 return Progress(
166 "[progress.description]{task.description}",
167 rich.progress.BarColumn(),
168 "[progress.percentage]{task.percentage:>3.0f}%",
169 rich.progress.TimeElapsedColumn(),
170 "|",
171 rich.progress.TimeRemainingColumn(),
172 )
173
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/webknossos/webknossos/utils.py b/webknossos/webknossos/utils.py
--- a/webknossos/webknossos/utils.py
+++ b/webknossos/webknossos/utils.py
@@ -4,6 +4,7 @@
import json
import logging
import os
+import sys
import time
from concurrent.futures import as_completed
from concurrent.futures._base import Future
@@ -147,10 +148,28 @@
def setup_logging(args: argparse.Namespace) -> None:
- logging.basicConfig(
- level=(logging.DEBUG if args.verbose else logging.INFO),
- format="%(asctime)s %(levelname)s %(message)s",
- )
+ log_path = Path(f"./logs/cuber_{time.strftime('%Y-%m-%d_%H%M%S')}.txt")
+
+ console_log_level = logging.DEBUG if args.verbose else logging.INFO
+ file_log_level = logging.DEBUG
+
+ logging_formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
+
+ # Always set the global log level to the more verbose of console_log_level and
+ # file_log_level to allow to log with different log levels to console and files.
+ root_logger = logging.getLogger()
+ root_logger.setLevel(min(console_log_level, file_log_level))
+
+ console = logging.StreamHandler(sys.stdout)
+ console.setLevel(console_log_level)
+ console.setFormatter(logging_formatter)
+ root_logger.addHandler(console)
+
+ log_path.parent.mkdir(parents=True, exist_ok=True)
+ file_handler = logging.FileHandler(log_path, mode="w", encoding="UTF-8")
+ file_handler.setLevel(file_log_level)
+ file_handler.setFormatter(logging_formatter)
+ root_logger.addHandler(file_handler)
def add_verbose_flag(parser: argparse.ArgumentParser) -> None:
| {"golden_diff": "diff --git a/webknossos/webknossos/utils.py b/webknossos/webknossos/utils.py\n--- a/webknossos/webknossos/utils.py\n+++ b/webknossos/webknossos/utils.py\n@@ -4,6 +4,7 @@\n import json\n import logging\n import os\n+import sys\n import time\n from concurrent.futures import as_completed\n from concurrent.futures._base import Future\n@@ -147,10 +148,28 @@\n \n \n def setup_logging(args: argparse.Namespace) -> None:\n- logging.basicConfig(\n- level=(logging.DEBUG if args.verbose else logging.INFO),\n- format=\"%(asctime)s %(levelname)s %(message)s\",\n- )\n+ log_path = Path(f\"./logs/cuber_{time.strftime('%Y-%m-%d_%H%M%S')}.txt\")\n+\n+ console_log_level = logging.DEBUG if args.verbose else logging.INFO\n+ file_log_level = logging.DEBUG\n+\n+ logging_formatter = logging.Formatter(\"%(asctime)s %(levelname)s %(message)s\")\n+\n+ # Always set the global log level to the more verbose of console_log_level and\n+ # file_log_level to allow to log with different log levels to console and files.\n+ root_logger = logging.getLogger()\n+ root_logger.setLevel(min(console_log_level, file_log_level))\n+\n+ console = logging.StreamHandler(sys.stdout)\n+ console.setLevel(console_log_level)\n+ console.setFormatter(logging_formatter)\n+ root_logger.addHandler(console)\n+\n+ log_path.parent.mkdir(parents=True, exist_ok=True)\n+ file_handler = logging.FileHandler(log_path, mode=\"w\", encoding=\"UTF-8\")\n+ file_handler.setLevel(file_log_level)\n+ file_handler.setFormatter(logging_formatter)\n+ root_logger.addHandler(file_handler)\n \n \n def add_verbose_flag(parser: argparse.ArgumentParser) -> None:\n", "issue": "Persist log file\nWhen running the cuber, there usually is a lot of output (overflowing terminal buffers). It would be great if all of that output would be persisted in a log file as well to allow to easily scan and search through it, afterwards.\n", "before_files": [{"content": "import argparse\nimport calendar\nimport functools\nimport json\nimport logging\nimport os\nimport time\nfrom concurrent.futures import as_completed\nfrom concurrent.futures._base import Future\nfrom datetime import datetime\nfrom multiprocessing import cpu_count\nfrom pathlib import Path\nfrom typing import Any, Callable, Iterable, List, Optional, Union\n\nimport rich\nfrom cluster_tools import WrappedProcessPoolExecutor, get_executor\nfrom cluster_tools.schedulers.cluster_executor import ClusterExecutor\nfrom rich.progress import Progress\n\ntimes = {}\n\n\ndef time_start(identifier: str) -> None:\n times[identifier] = time.time()\n logging.debug(\"{} started\".format(identifier))\n\n\ndef time_stop(identifier: str) -> None:\n _time = times.pop(identifier)\n logging.debug(\"{} took {:.8f}s\".format(identifier, time.time() - _time))\n\n\ndef get_executor_for_args(\n args: Optional[argparse.Namespace],\n) -> Union[ClusterExecutor, WrappedProcessPoolExecutor]:\n executor = None\n if args is None:\n # For backwards compatibility with code from other packages\n # we allow args to be None. 
In this case we are defaulting\n # to these values:\n jobs = cpu_count()\n executor = get_executor(\"multiprocessing\", max_workers=jobs)\n logging.info(\"Using pool of {} workers.\".format(jobs))\n elif args.distribution_strategy == \"multiprocessing\":\n # Also accept \"processes\" instead of job to be compatible with segmentation-tools.\n # In the long run, the args should be unified and provided by the clustertools.\n if \"jobs\" in args:\n jobs = args.jobs\n elif \"processes\" in args:\n jobs = args.processes\n else:\n jobs = cpu_count()\n\n executor = get_executor(\"multiprocessing\", max_workers=jobs)\n logging.info(\"Using pool of {} workers.\".format(jobs))\n elif args.distribution_strategy in (\"slurm\", \"kubernetes\"):\n if args.job_resources is None:\n resources_example = (\n '{\"mem\": \"1G\"}'\n if args.distribution_strategy == \"slurm\"\n else '{\"memory\": \"1G\"}'\n )\n raise argparse.ArgumentTypeError(\n f\"Job resources (--job_resources) has to be provided when using {args.distribution_strategy} as distribution strategy. Example: --job_resources='{resources_example}'\"\n )\n\n executor = get_executor(\n args.distribution_strategy,\n debug=True,\n keep_logs=True,\n job_resources=json.loads(args.job_resources),\n )\n logging.info(f\"Using {args.distribution_strategy} cluster.\")\n else:\n logging.error(\n \"Unknown distribution strategy: {}\".format(args.distribution_strategy)\n )\n\n return executor\n\n\nF = Callable[..., Any]\n\n\ndef named_partial(func: F, *args: Any, **kwargs: Any) -> F:\n # Propagate __name__ and __doc__ attributes to partial function\n partial_func = functools.partial(func, *args, **kwargs)\n functools.update_wrapper(partial_func, func)\n if hasattr(func, \"__annotations__\"):\n # Generic types cannot be pickled in Python <= 3.6, see https://github.com/python/typing/issues/511\n partial_func.__annotations__ = {}\n return partial_func\n\n\ndef wait_and_ensure_success(\n futures: List[Future], progress_desc: Optional[str] = None\n) -> List[Any]:\n \"\"\"Waits for all futures to complete and raises an exception\n as soon as a future resolves with an error.\"\"\"\n\n results = []\n if progress_desc is None:\n for fut in as_completed(futures):\n results.append(fut.result())\n else:\n with get_rich_progress() as progress:\n task = progress.add_task(progress_desc, total=len(futures))\n for fut in as_completed(futures):\n results.append(fut.result())\n progress.update(task, advance=1)\n return results\n\n\ndef snake_to_camel_case(snake_case_name: str) -> str:\n parts = snake_case_name.split(\"_\")\n return parts[0] + \"\".join(part.title() for part in parts[1:])\n\n\ndef get_chunks(arr: List[Any], chunk_size: int) -> Iterable[List[Any]]:\n for i in range(0, len(arr), chunk_size):\n yield arr[i : i + chunk_size]\n\n\ndef time_since_epoch_in_ms() -> int:\n d = datetime.utcnow()\n unixtime = calendar.timegm(d.utctimetuple())\n return unixtime * 1000\n\n\ndef copy_directory_with_symlinks(\n src_path: Path,\n dst_path: Path,\n ignore: Iterable[str] = tuple(),\n make_relative: bool = False,\n) -> None:\n \"\"\"\n Links all directories in src_path / dir_name to dst_path / dir_name.\n \"\"\"\n for item in src_path.iterdir():\n if item.name not in ignore:\n symlink_path = dst_path / item.name\n if make_relative:\n rel_or_abspath = Path(os.path.relpath(item, symlink_path.parent))\n else:\n rel_or_abspath = item.resolve()\n symlink_path.symlink_to(rel_or_abspath)\n\n\ndef setup_logging(args: argparse.Namespace) -> None:\n logging.basicConfig(\n level=(logging.DEBUG if 
args.verbose else logging.INFO),\n format=\"%(asctime)s %(levelname)s %(message)s\",\n )\n\n\ndef add_verbose_flag(parser: argparse.ArgumentParser) -> None:\n parser.add_argument(\n \"--verbose\", help=\"Verbose output\", dest=\"verbose\", action=\"store_true\"\n )\n\n parser.set_defaults(verbose=False)\n\n\ndef get_rich_progress() -> Progress:\n return Progress(\n \"[progress.description]{task.description}\",\n rich.progress.BarColumn(),\n \"[progress.percentage]{task.percentage:>3.0f}%\",\n rich.progress.TimeElapsedColumn(),\n \"|\",\n rich.progress.TimeRemainingColumn(),\n )\n", "path": "webknossos/webknossos/utils.py"}], "after_files": [{"content": "import argparse\nimport calendar\nimport functools\nimport json\nimport logging\nimport os\nimport sys\nimport time\nfrom concurrent.futures import as_completed\nfrom concurrent.futures._base import Future\nfrom datetime import datetime\nfrom multiprocessing import cpu_count\nfrom pathlib import Path\nfrom typing import Any, Callable, Iterable, List, Optional, Union\n\nimport rich\nfrom cluster_tools import WrappedProcessPoolExecutor, get_executor\nfrom cluster_tools.schedulers.cluster_executor import ClusterExecutor\nfrom rich.progress import Progress\n\ntimes = {}\n\n\ndef time_start(identifier: str) -> None:\n times[identifier] = time.time()\n logging.debug(\"{} started\".format(identifier))\n\n\ndef time_stop(identifier: str) -> None:\n _time = times.pop(identifier)\n logging.debug(\"{} took {:.8f}s\".format(identifier, time.time() - _time))\n\n\ndef get_executor_for_args(\n args: Optional[argparse.Namespace],\n) -> Union[ClusterExecutor, WrappedProcessPoolExecutor]:\n executor = None\n if args is None:\n # For backwards compatibility with code from other packages\n # we allow args to be None. In this case we are defaulting\n # to these values:\n jobs = cpu_count()\n executor = get_executor(\"multiprocessing\", max_workers=jobs)\n logging.info(\"Using pool of {} workers.\".format(jobs))\n elif args.distribution_strategy == \"multiprocessing\":\n # Also accept \"processes\" instead of job to be compatible with segmentation-tools.\n # In the long run, the args should be unified and provided by the clustertools.\n if \"jobs\" in args:\n jobs = args.jobs\n elif \"processes\" in args:\n jobs = args.processes\n else:\n jobs = cpu_count()\n\n executor = get_executor(\"multiprocessing\", max_workers=jobs)\n logging.info(\"Using pool of {} workers.\".format(jobs))\n elif args.distribution_strategy in (\"slurm\", \"kubernetes\"):\n if args.job_resources is None:\n resources_example = (\n '{\"mem\": \"1G\"}'\n if args.distribution_strategy == \"slurm\"\n else '{\"memory\": \"1G\"}'\n )\n raise argparse.ArgumentTypeError(\n f\"Job resources (--job_resources) has to be provided when using {args.distribution_strategy} as distribution strategy. 
Example: --job_resources='{resources_example}'\"\n )\n\n executor = get_executor(\n args.distribution_strategy,\n debug=True,\n keep_logs=True,\n job_resources=json.loads(args.job_resources),\n )\n logging.info(f\"Using {args.distribution_strategy} cluster.\")\n else:\n logging.error(\n \"Unknown distribution strategy: {}\".format(args.distribution_strategy)\n )\n\n return executor\n\n\nF = Callable[..., Any]\n\n\ndef named_partial(func: F, *args: Any, **kwargs: Any) -> F:\n # Propagate __name__ and __doc__ attributes to partial function\n partial_func = functools.partial(func, *args, **kwargs)\n functools.update_wrapper(partial_func, func)\n if hasattr(func, \"__annotations__\"):\n # Generic types cannot be pickled in Python <= 3.6, see https://github.com/python/typing/issues/511\n partial_func.__annotations__ = {}\n return partial_func\n\n\ndef wait_and_ensure_success(\n futures: List[Future], progress_desc: Optional[str] = None\n) -> List[Any]:\n \"\"\"Waits for all futures to complete and raises an exception\n as soon as a future resolves with an error.\"\"\"\n\n results = []\n if progress_desc is None:\n for fut in as_completed(futures):\n results.append(fut.result())\n else:\n with get_rich_progress() as progress:\n task = progress.add_task(progress_desc, total=len(futures))\n for fut in as_completed(futures):\n results.append(fut.result())\n progress.update(task, advance=1)\n return results\n\n\ndef snake_to_camel_case(snake_case_name: str) -> str:\n parts = snake_case_name.split(\"_\")\n return parts[0] + \"\".join(part.title() for part in parts[1:])\n\n\ndef get_chunks(arr: List[Any], chunk_size: int) -> Iterable[List[Any]]:\n for i in range(0, len(arr), chunk_size):\n yield arr[i : i + chunk_size]\n\n\ndef time_since_epoch_in_ms() -> int:\n d = datetime.utcnow()\n unixtime = calendar.timegm(d.utctimetuple())\n return unixtime * 1000\n\n\ndef copy_directory_with_symlinks(\n src_path: Path,\n dst_path: Path,\n ignore: Iterable[str] = tuple(),\n make_relative: bool = False,\n) -> None:\n \"\"\"\n Links all directories in src_path / dir_name to dst_path / dir_name.\n \"\"\"\n for item in src_path.iterdir():\n if item.name not in ignore:\n symlink_path = dst_path / item.name\n if make_relative:\n rel_or_abspath = Path(os.path.relpath(item, symlink_path.parent))\n else:\n rel_or_abspath = item.resolve()\n symlink_path.symlink_to(rel_or_abspath)\n\n\ndef setup_logging(args: argparse.Namespace) -> None:\n log_path = Path(f\"./logs/cuber_{time.strftime('%Y-%m-%d_%H%M%S')}.txt\")\n\n console_log_level = logging.DEBUG if args.verbose else logging.INFO\n file_log_level = logging.DEBUG\n\n logging_formatter = logging.Formatter(\"%(asctime)s %(levelname)s %(message)s\")\n\n # Always set the global log level to the more verbose of console_log_level and\n # file_log_level to allow to log with different log levels to console and files.\n root_logger = logging.getLogger()\n root_logger.setLevel(min(console_log_level, file_log_level))\n\n console = logging.StreamHandler(sys.stdout)\n console.setLevel(console_log_level)\n console.setFormatter(logging_formatter)\n root_logger.addHandler(console)\n\n log_path.parent.mkdir(parents=True, exist_ok=True)\n file_handler = logging.FileHandler(log_path, mode=\"w\", encoding=\"UTF-8\")\n file_handler.setLevel(file_log_level)\n file_handler.setFormatter(logging_formatter)\n root_logger.addHandler(file_handler)\n\n\ndef add_verbose_flag(parser: argparse.ArgumentParser) -> None:\n parser.add_argument(\n \"--verbose\", help=\"Verbose output\", dest=\"verbose\", 
action=\"store_true\"\n )\n\n parser.set_defaults(verbose=False)\n\n\ndef get_rich_progress() -> Progress:\n return Progress(\n \"[progress.description]{task.description}\",\n rich.progress.BarColumn(),\n \"[progress.percentage]{task.percentage:>3.0f}%\",\n rich.progress.TimeElapsedColumn(),\n \"|\",\n rich.progress.TimeRemainingColumn(),\n )\n", "path": "webknossos/webknossos/utils.py"}]} | 1,988 | 400 |
gh_patches_debug_41281 | rasdani/github-patches | git_diff | mozilla__bugbug-310 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make NLP dependencies optional
Some dependencies are only needed in some images, so we should make those dependencies optional and install them only when necessary.
For example, Spacy is the biggest dependency, but it is currently not really used by any image.
--- END ISSUE ---
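A minimal sketch of the usual pattern this asks for — guard the heavy import and fail only when the feature is actually used. Names and messages here are illustrative assumptions, not the project's actual API:
```python
# Sketch: guard the optional import so the base install works without the NLP extra.
HAS_NLP_DEPS = True
try:
    import spacy  # large optional dependency
except ImportError:
    HAS_NLP_DEPS = False

OPT_MSG = (
    "Optional NLP dependencies are missing; "
    "install them with: pip install bugbug[nlp]"
)


def load_nlp_model():
    """Load the spaCy pipeline only when the NLP extra is actually installed."""
    if not HAS_NLP_DEPS:
        raise NotImplementedError(OPT_MSG)
    return spacy.load("en_core_web_sm")
```
On the packaging side, the same split is normally declared through a setuptools `extras_require` entry so that `pip install bugbug[nlp]` pulls in the heavy packages.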
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # This Source Code Form is subject to the terms of the Mozilla Public
3 # License, v. 2.0. If a copy of the MPL was not distributed with this file,
4 # You can obtain one at http://mozilla.org/MPL/2.0/.
5
6 import os
7
8 from setuptools import find_packages, setup
9
10 here = os.path.dirname(__file__)
11
12
13 def read_requirements(file_):
14 requires = []
15 links = []
16 with open(os.path.join(here, file_)) as f:
17 for line in f.readlines():
18 line = line.strip()
19
20 if line.startswith("https://"):
21 links.append(line + "-1.0.0")
22 extras = ""
23 if "[" in line:
24 extras = "[" + line.split("[")[1].split("]")[0] + "]"
25 line = line.split("#")[1].split("egg=")[1] + extras
26 elif line == "" or line.startswith("#") or line.startswith("-"):
27 continue
28 line = line.split("#")[0].strip()
29 requires.append(line)
30
31 return sorted(list(set(requires))), links
32
33
34 install_requires, dependency_links = read_requirements("requirements.txt")
35
36
37 with open(os.path.join(here, "VERSION")) as f:
38 version = f.read().strip()
39
40 setup(
41 name="bugbug",
42 version=version,
43 description="ML tools for Mozilla projects",
44 author="Marco Castelluccio",
45 author_email="[email protected]",
46 install_requires=install_requires,
47 dependency_links=dependency_links,
48 packages=find_packages(exclude=["contrib", "docs", "tests"]),
49 include_package_data=True,
50 license="MPL2",
51 entry_points={
52 "console_scripts": [
53 "bugbug-data-commits = scripts.commit_retriever:main",
54 "bugbug-data-bugzilla = scripts.bug_retriever:main",
55 "bugbug-train = scripts.trainer:main",
56 ]
57 },
58 )
59
```
Path: `bugbug/nlp.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # This Source Code Form is subject to the terms of the Mozilla Public
3 # License, v. 2.0. If a copy of the MPL was not distributed with this file,
4 # You can obtain one at http://mozilla.org/MPL/2.0/.
5
6 from collections import defaultdict
7 from functools import lru_cache
8
9 import numpy as np
10 import spacy
11 from gensim.models import KeyedVectors
12 from sklearn.base import BaseEstimator, TransformerMixin
13 from sklearn.feature_extraction.text import TfidfVectorizer
14 from spacy.tokenizer import Tokenizer
15
16 nlp = spacy.load("en_core_web_sm")
17
18
19 def spacy_token_lemmatizer(text):
20 if len(text) > nlp.max_length:
21 text = text[: nlp.max_length - 1]
22 doc = nlp(text)
23 return [token.lemma_ for token in doc]
24
25
26 class SpacyVectorizer(TfidfVectorizer):
27 def __init__(self, *args, **kwargs):
28 super().__init__(tokenizer=spacy_token_lemmatizer, *args, **kwargs)
29
30
31 @lru_cache()
32 def get_word_embeddings():
33 word_embeddings = KeyedVectors.load_word2vec_format("wiki-news-300d-1M-subword.vec")
34 word_embeddings.init_sims(replace=True)
35 return word_embeddings
36
37
38 class MeanEmbeddingTransformer(BaseEstimator, TransformerMixin):
39 def __init__(self):
40 self.model = get_word_embeddings()
41 self.dim = len(self.model["if"])
42
43 def fit(self, x, y=None):
44 return self
45
46 def transform(self, data):
47 tokenizer = Tokenizer(nlp.vocab)
48 return np.array(
49 [
50 np.mean(
51 [
52 self.model[w.text.lower()]
53 for w in words
54 if w.text.lower() in self.model
55 ]
56 or [np.zeros(self.dim)],
57 axis=0,
58 )
59 for words in tokenizer.pipe(data)
60 ]
61 )
62
63 def get_feature_names(self):
64 return np.array([f"_{i}" for i in range(self.dim)], dtype=object)
65
66
67 class TfidfMeanEmbeddingTransformer(MeanEmbeddingTransformer):
68 def __init__(self):
69 super().__init__()
70 self.word2weight = None
71
72 def fit(self, X, y=None):
73 tfidf = TfidfVectorizer(analyzer=lambda x: x)
74 tfidf.fit(X)
75
76 # If a word was never seen, it must be at least as infrequent as any of the known words.
77 # So, the default idf is the max of known idfs.
78 max_idf = max(tfidf.idf_)
79 self.word2weight = defaultdict(
80 lambda: max_idf, [(w, tfidf.idf_[i]) for w, i in tfidf.vocabulary_.items()]
81 )
82
83 return self
84
85 def transform(self, data):
86 tokenizer = Tokenizer(nlp.vocab)
87 return np.array(
88 [
89 np.mean(
90 [
91 self.model[w.text.lower()] * self.word2weight[w.text.lower()]
92 for w in words
93 if w.text.lower() in self.model
94 ]
95 or [np.zeros(self.dim)],
96 axis=0,
97 )
98 for words in tokenizer.pipe(data)
99 ]
100 )
101
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bugbug/nlp.py b/bugbug/nlp.py
--- a/bugbug/nlp.py
+++ b/bugbug/nlp.py
@@ -3,17 +3,41 @@
# License, v. 2.0. If a copy of the MPL was not distributed with this file,
# You can obtain one at http://mozilla.org/MPL/2.0/.
+import sys
from collections import defaultdict
from functools import lru_cache
import numpy as np
-import spacy
-from gensim.models import KeyedVectors
from sklearn.base import BaseEstimator, TransformerMixin
from sklearn.feature_extraction.text import TfidfVectorizer
-from spacy.tokenizer import Tokenizer
-nlp = spacy.load("en_core_web_sm")
+HAS_OPTIONAL_DEPENDENCIES = False
+
+try:
+ import spacy
+
+ from spacy.tokenizer import Tokenizer
+ from gensim.models import KeyedVectors
+
+ HAS_OPTIONAL_DEPENDENCIES = True
+except ImportError:
+ pass
+
+try:
+ if HAS_OPTIONAL_DEPENDENCIES:
+ nlp = spacy.load("en_core_web_sm")
+except OSError:
+ msg = (
+ "Spacy model is missing, install it with: "
+ f"{sys.executable} -m spacy download en_core_web_sm"
+ )
+ print(msg, file=sys.stderr)
+
+OPT_MSG_MISSING = (
+ "Optional dependencies are missing, install them with: pip install bugbug[nlp]\n"
+ "You might need also to download the models with: "
+ f"{sys.executable} -m spacy download en_core_web_sm"
+)
def spacy_token_lemmatizer(text):
@@ -25,6 +49,11 @@
class SpacyVectorizer(TfidfVectorizer):
def __init__(self, *args, **kwargs):
+
+ # Detect when the Spacy optional dependency is missing
+ if not HAS_OPTIONAL_DEPENDENCIES:
+ raise NotImplementedError(OPT_MSG_MISSING)
+
super().__init__(tokenizer=spacy_token_lemmatizer, *args, **kwargs)
@@ -37,6 +66,10 @@
class MeanEmbeddingTransformer(BaseEstimator, TransformerMixin):
def __init__(self):
+ # Detect when the Gensim optional dependency are missing
+ if not HAS_OPTIONAL_DEPENDENCIES:
+ raise NotImplementedError(OPT_MSG_MISSING)
+
self.model = get_word_embeddings()
self.dim = len(self.model["if"])
diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -37,6 +37,20 @@
with open(os.path.join(here, "VERSION")) as f:
version = f.read().strip()
+# Read the extra requirements
+extras = ["nlp"]
+
+extras_require = {}
+
+for extra in extras:
+ extras_install, extra_links = read_requirements("extra-%s-requirements.txt" % extra)
+
+ # Merge the dependency links
+ dependency_links.extend(extra_links)
+
+ extras_require[extra] = extras_install
+
+
setup(
name="bugbug",
version=version,
@@ -44,6 +58,7 @@
author="Marco Castelluccio",
author_email="[email protected]",
install_requires=install_requires,
+ extras_require=extras_require,
dependency_links=dependency_links,
packages=find_packages(exclude=["contrib", "docs", "tests"]),
include_package_data=True,
@@ -55,4 +70,9 @@
"bugbug-train = scripts.trainer:main",
]
},
+ classifiers=[
+ "Programming Language :: Python :: 3.6",
+ "Programming Language :: Python :: 3 :: Only",
+ "License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)",
+ ],
)
| {"golden_diff": "diff --git a/bugbug/nlp.py b/bugbug/nlp.py\n--- a/bugbug/nlp.py\n+++ b/bugbug/nlp.py\n@@ -3,17 +3,41 @@\n # License, v. 2.0. If a copy of the MPL was not distributed with this file,\n # You can obtain one at http://mozilla.org/MPL/2.0/.\n \n+import sys\n from collections import defaultdict\n from functools import lru_cache\n \n import numpy as np\n-import spacy\n-from gensim.models import KeyedVectors\n from sklearn.base import BaseEstimator, TransformerMixin\n from sklearn.feature_extraction.text import TfidfVectorizer\n-from spacy.tokenizer import Tokenizer\n \n-nlp = spacy.load(\"en_core_web_sm\")\n+HAS_OPTIONAL_DEPENDENCIES = False\n+\n+try:\n+ import spacy\n+\n+ from spacy.tokenizer import Tokenizer\n+ from gensim.models import KeyedVectors\n+\n+ HAS_OPTIONAL_DEPENDENCIES = True\n+except ImportError:\n+ pass\n+\n+try:\n+ if HAS_OPTIONAL_DEPENDENCIES:\n+ nlp = spacy.load(\"en_core_web_sm\")\n+except OSError:\n+ msg = (\n+ \"Spacy model is missing, install it with: \"\n+ f\"{sys.executable} -m spacy download en_core_web_sm\"\n+ )\n+ print(msg, file=sys.stderr)\n+\n+OPT_MSG_MISSING = (\n+ \"Optional dependencies are missing, install them with: pip install bugbug[nlp]\\n\"\n+ \"You might need also to download the models with: \"\n+ f\"{sys.executable} -m spacy download en_core_web_sm\"\n+)\n \n \n def spacy_token_lemmatizer(text):\n@@ -25,6 +49,11 @@\n \n class SpacyVectorizer(TfidfVectorizer):\n def __init__(self, *args, **kwargs):\n+\n+ # Detect when the Spacy optional dependency is missing\n+ if not HAS_OPTIONAL_DEPENDENCIES:\n+ raise NotImplementedError(OPT_MSG_MISSING)\n+\n super().__init__(tokenizer=spacy_token_lemmatizer, *args, **kwargs)\n \n \n@@ -37,6 +66,10 @@\n \n class MeanEmbeddingTransformer(BaseEstimator, TransformerMixin):\n def __init__(self):\n+ # Detect when the Gensim optional dependency are missing\n+ if not HAS_OPTIONAL_DEPENDENCIES:\n+ raise NotImplementedError(OPT_MSG_MISSING)\n+\n self.model = get_word_embeddings()\n self.dim = len(self.model[\"if\"])\n \ndiff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -37,6 +37,20 @@\n with open(os.path.join(here, \"VERSION\")) as f:\n version = f.read().strip()\n \n+# Read the extra requirements\n+extras = [\"nlp\"]\n+\n+extras_require = {}\n+\n+for extra in extras:\n+ extras_install, extra_links = read_requirements(\"extra-%s-requirements.txt\" % extra)\n+\n+ # Merge the dependency links\n+ dependency_links.extend(extra_links)\n+\n+ extras_require[extra] = extras_install\n+\n+\n setup(\n name=\"bugbug\",\n version=version,\n@@ -44,6 +58,7 @@\n author=\"Marco Castelluccio\",\n author_email=\"[email protected]\",\n install_requires=install_requires,\n+ extras_require=extras_require,\n dependency_links=dependency_links,\n packages=find_packages(exclude=[\"contrib\", \"docs\", \"tests\"]),\n include_package_data=True,\n@@ -55,4 +70,9 @@\n \"bugbug-train = scripts.trainer:main\",\n ]\n },\n+ classifiers=[\n+ \"Programming Language :: Python :: 3.6\",\n+ \"Programming Language :: Python :: 3 :: Only\",\n+ \"License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)\",\n+ ],\n )\n", "issue": "Make NLP dependencies optional\nSome dependencies are only needed in some images, we should make some dependencies optional so that they are only installed when necessary.\r\n\r\nFor example, Spacy is the biggest dependency, but it is currently not really used by any image.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla 
Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nhere = os.path.dirname(__file__)\n\n\ndef read_requirements(file_):\n requires = []\n links = []\n with open(os.path.join(here, file_)) as f:\n for line in f.readlines():\n line = line.strip()\n\n if line.startswith(\"https://\"):\n links.append(line + \"-1.0.0\")\n extras = \"\"\n if \"[\" in line:\n extras = \"[\" + line.split(\"[\")[1].split(\"]\")[0] + \"]\"\n line = line.split(\"#\")[1].split(\"egg=\")[1] + extras\n elif line == \"\" or line.startswith(\"#\") or line.startswith(\"-\"):\n continue\n line = line.split(\"#\")[0].strip()\n requires.append(line)\n\n return sorted(list(set(requires))), links\n\n\ninstall_requires, dependency_links = read_requirements(\"requirements.txt\")\n\n\nwith open(os.path.join(here, \"VERSION\")) as f:\n version = f.read().strip()\n\nsetup(\n name=\"bugbug\",\n version=version,\n description=\"ML tools for Mozilla projects\",\n author=\"Marco Castelluccio\",\n author_email=\"[email protected]\",\n install_requires=install_requires,\n dependency_links=dependency_links,\n packages=find_packages(exclude=[\"contrib\", \"docs\", \"tests\"]),\n include_package_data=True,\n license=\"MPL2\",\n entry_points={\n \"console_scripts\": [\n \"bugbug-data-commits = scripts.commit_retriever:main\",\n \"bugbug-data-bugzilla = scripts.bug_retriever:main\",\n \"bugbug-train = scripts.trainer:main\",\n ]\n },\n)\n", "path": "setup.py"}, {"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nfrom collections import defaultdict\nfrom functools import lru_cache\n\nimport numpy as np\nimport spacy\nfrom gensim.models import KeyedVectors\nfrom sklearn.base import BaseEstimator, TransformerMixin\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom spacy.tokenizer import Tokenizer\n\nnlp = spacy.load(\"en_core_web_sm\")\n\n\ndef spacy_token_lemmatizer(text):\n if len(text) > nlp.max_length:\n text = text[: nlp.max_length - 1]\n doc = nlp(text)\n return [token.lemma_ for token in doc]\n\n\nclass SpacyVectorizer(TfidfVectorizer):\n def __init__(self, *args, **kwargs):\n super().__init__(tokenizer=spacy_token_lemmatizer, *args, **kwargs)\n\n\n@lru_cache()\ndef get_word_embeddings():\n word_embeddings = KeyedVectors.load_word2vec_format(\"wiki-news-300d-1M-subword.vec\")\n word_embeddings.init_sims(replace=True)\n return word_embeddings\n\n\nclass MeanEmbeddingTransformer(BaseEstimator, TransformerMixin):\n def __init__(self):\n self.model = get_word_embeddings()\n self.dim = len(self.model[\"if\"])\n\n def fit(self, x, y=None):\n return self\n\n def transform(self, data):\n tokenizer = Tokenizer(nlp.vocab)\n return np.array(\n [\n np.mean(\n [\n self.model[w.text.lower()]\n for w in words\n if w.text.lower() in self.model\n ]\n or [np.zeros(self.dim)],\n axis=0,\n )\n for words in tokenizer.pipe(data)\n ]\n )\n\n def get_feature_names(self):\n return np.array([f\"_{i}\" for i in range(self.dim)], dtype=object)\n\n\nclass TfidfMeanEmbeddingTransformer(MeanEmbeddingTransformer):\n def __init__(self):\n super().__init__()\n self.word2weight = None\n\n def fit(self, X, y=None):\n tfidf = TfidfVectorizer(analyzer=lambda x: x)\n tfidf.fit(X)\n\n # If a word was never seen, it must be at 
least as infrequent as any of the known words.\n # So, the default idf is the max of known idfs.\n max_idf = max(tfidf.idf_)\n self.word2weight = defaultdict(\n lambda: max_idf, [(w, tfidf.idf_[i]) for w, i in tfidf.vocabulary_.items()]\n )\n\n return self\n\n def transform(self, data):\n tokenizer = Tokenizer(nlp.vocab)\n return np.array(\n [\n np.mean(\n [\n self.model[w.text.lower()] * self.word2weight[w.text.lower()]\n for w in words\n if w.text.lower() in self.model\n ]\n or [np.zeros(self.dim)],\n axis=0,\n )\n for words in tokenizer.pipe(data)\n ]\n )\n", "path": "bugbug/nlp.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport os\n\nfrom setuptools import find_packages, setup\n\nhere = os.path.dirname(__file__)\n\n\ndef read_requirements(file_):\n requires = []\n links = []\n with open(os.path.join(here, file_)) as f:\n for line in f.readlines():\n line = line.strip()\n\n if line.startswith(\"https://\"):\n links.append(line + \"-1.0.0\")\n extras = \"\"\n if \"[\" in line:\n extras = \"[\" + line.split(\"[\")[1].split(\"]\")[0] + \"]\"\n line = line.split(\"#\")[1].split(\"egg=\")[1] + extras\n elif line == \"\" or line.startswith(\"#\") or line.startswith(\"-\"):\n continue\n line = line.split(\"#\")[0].strip()\n requires.append(line)\n\n return sorted(list(set(requires))), links\n\n\ninstall_requires, dependency_links = read_requirements(\"requirements.txt\")\n\n\nwith open(os.path.join(here, \"VERSION\")) as f:\n version = f.read().strip()\n\n# Read the extra requirements\nextras = [\"nlp\"]\n\nextras_require = {}\n\nfor extra in extras:\n extras_install, extra_links = read_requirements(\"extra-%s-requirements.txt\" % extra)\n\n # Merge the dependency links\n dependency_links.extend(extra_links)\n\n extras_require[extra] = extras_install\n\n\nsetup(\n name=\"bugbug\",\n version=version,\n description=\"ML tools for Mozilla projects\",\n author=\"Marco Castelluccio\",\n author_email=\"[email protected]\",\n install_requires=install_requires,\n extras_require=extras_require,\n dependency_links=dependency_links,\n packages=find_packages(exclude=[\"contrib\", \"docs\", \"tests\"]),\n include_package_data=True,\n license=\"MPL2\",\n entry_points={\n \"console_scripts\": [\n \"bugbug-data-commits = scripts.commit_retriever:main\",\n \"bugbug-data-bugzilla = scripts.bug_retriever:main\",\n \"bugbug-train = scripts.trainer:main\",\n ]\n },\n classifiers=[\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"License :: OSI Approved :: Mozilla Public License 2.0 (MPL 2.0)\",\n ],\n)\n", "path": "setup.py"}, {"content": "# -*- coding: utf-8 -*-\n# This Source Code Form is subject to the terms of the Mozilla Public\n# License, v. 2.0. 
If a copy of the MPL was not distributed with this file,\n# You can obtain one at http://mozilla.org/MPL/2.0/.\n\nimport sys\nfrom collections import defaultdict\nfrom functools import lru_cache\n\nimport numpy as np\nfrom sklearn.base import BaseEstimator, TransformerMixin\nfrom sklearn.feature_extraction.text import TfidfVectorizer\n\nHAS_OPTIONAL_DEPENDENCIES = False\n\ntry:\n import spacy\n\n from spacy.tokenizer import Tokenizer\n from gensim.models import KeyedVectors\n\n HAS_OPTIONAL_DEPENDENCIES = True\nexcept ImportError:\n pass\n\ntry:\n if HAS_OPTIONAL_DEPENDENCIES:\n nlp = spacy.load(\"en_core_web_sm\")\nexcept OSError:\n msg = (\n \"Spacy model is missing, install it with: \"\n f\"{sys.executable} -m spacy download en_core_web_sm\"\n )\n print(msg, file=sys.stderr)\n\nOPT_MSG_MISSING = (\n \"Optional dependencies are missing, install them with: pip install bugbug[nlp]\\n\"\n \"You might need also to download the models with: \"\n f\"{sys.executable} -m spacy download en_core_web_sm\"\n)\n\n\ndef spacy_token_lemmatizer(text):\n if len(text) > nlp.max_length:\n text = text[: nlp.max_length - 1]\n doc = nlp(text)\n return [token.lemma_ for token in doc]\n\n\nclass SpacyVectorizer(TfidfVectorizer):\n def __init__(self, *args, **kwargs):\n\n # Detect when the Spacy optional dependency is missing\n if not HAS_OPTIONAL_DEPENDENCIES:\n raise NotImplementedError(OPT_MSG_MISSING)\n\n super().__init__(tokenizer=spacy_token_lemmatizer, *args, **kwargs)\n\n\n@lru_cache()\ndef get_word_embeddings():\n word_embeddings = KeyedVectors.load_word2vec_format(\"wiki-news-300d-1M-subword.vec\")\n word_embeddings.init_sims(replace=True)\n return word_embeddings\n\n\nclass MeanEmbeddingTransformer(BaseEstimator, TransformerMixin):\n def __init__(self):\n # Detect when the Gensim optional dependency are missing\n if not HAS_OPTIONAL_DEPENDENCIES:\n raise NotImplementedError(OPT_MSG_MISSING)\n\n self.model = get_word_embeddings()\n self.dim = len(self.model[\"if\"])\n\n def fit(self, x, y=None):\n return self\n\n def transform(self, data):\n tokenizer = Tokenizer(nlp.vocab)\n return np.array(\n [\n np.mean(\n [\n self.model[w.text.lower()]\n for w in words\n if w.text.lower() in self.model\n ]\n or [np.zeros(self.dim)],\n axis=0,\n )\n for words in tokenizer.pipe(data)\n ]\n )\n\n def get_feature_names(self):\n return np.array([f\"_{i}\" for i in range(self.dim)], dtype=object)\n\n\nclass TfidfMeanEmbeddingTransformer(MeanEmbeddingTransformer):\n def __init__(self):\n super().__init__()\n self.word2weight = None\n\n def fit(self, X, y=None):\n tfidf = TfidfVectorizer(analyzer=lambda x: x)\n tfidf.fit(X)\n\n # If a word was never seen, it must be at least as infrequent as any of the known words.\n # So, the default idf is the max of known idfs.\n max_idf = max(tfidf.idf_)\n self.word2weight = defaultdict(\n lambda: max_idf, [(w, tfidf.idf_[i]) for w, i in tfidf.vocabulary_.items()]\n )\n\n return self\n\n def transform(self, data):\n tokenizer = Tokenizer(nlp.vocab)\n return np.array(\n [\n np.mean(\n [\n self.model[w.text.lower()] * self.word2weight[w.text.lower()]\n for w in words\n if w.text.lower() in self.model\n ]\n or [np.zeros(self.dim)],\n axis=0,\n )\n for words in tokenizer.pipe(data)\n ]\n )\n", "path": "bugbug/nlp.py"}]} | 1,765 | 865 |
gh_patches_debug_11621 | rasdani/github-patches | git_diff | apache__airflow-12057 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Exception logging success function instead of failure
https://github.com/apache/airflow/blob/fdd9b6f65b608c516b8a062b058972d9a45ec9e3/airflow/sensors/sql_sensor.py#L97
--- END ISSUE ---
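For context, the linked line raises an error about `self.failure` not being callable but interpolates `self.success` into the message. A standalone restatement of that branch with the message corrected (it uses `ValueError` purely to keep the sketch dependency-free; the sensor itself raises `AirflowException`):
```python
from typing import Any, Callable, Optional


def evaluate_failure_criteria(first_cell: Any, failure: Optional[Callable[[Any], bool]]) -> None:
    """Restatement of SqlSensor.poke()'s failure branch with the corrected message."""
    if failure is None:
        return
    if not callable(failure):
        # The message must reference the failure callable, not the success one.
        raise ValueError(f"failure is present, but not callable -> {failure}")
    if failure(first_cell):
        raise ValueError(f"Failure criteria met. failure({first_cell}) returned True")
```
Passing `failure=None` simply returns, while passing a non-callable raises with the offending object named in the message.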
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `airflow/sensors/sql_sensor.py`
Content:
```
1 #
2 # Licensed to the Apache Software Foundation (ASF) under one
3 # or more contributor license agreements. See the NOTICE file
4 # distributed with this work for additional information
5 # regarding copyright ownership. The ASF licenses this file
6 # to you under the Apache License, Version 2.0 (the
7 # "License"); you may not use this file except in compliance
8 # with the License. You may obtain a copy of the License at
9 #
10 # http://www.apache.org/licenses/LICENSE-2.0
11 #
12 # Unless required by applicable law or agreed to in writing,
13 # software distributed under the License is distributed on an
14 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
15 # KIND, either express or implied. See the License for the
16 # specific language governing permissions and limitations
17 # under the License.
18
19 from typing import Iterable
20
21 from airflow.exceptions import AirflowException
22 from airflow.hooks.base_hook import BaseHook
23 from airflow.sensors.base_sensor_operator import BaseSensorOperator
24 from airflow.utils.decorators import apply_defaults
25
26
27 class SqlSensor(BaseSensorOperator):
28 """
29 Runs a sql statement repeatedly until a criteria is met. It will keep trying until
30 success or failure criteria are met, or if the first cell is not in (0, '0', '', None).
31 Optional success and failure callables are called with the first cell returned as the argument.
32 If success callable is defined the sensor will keep retrying until the criteria is met.
33 If failure callable is defined and the criteria is met the sensor will raise AirflowException.
34 Failure criteria is evaluated before success criteria. A fail_on_empty boolean can also
35 be passed to the sensor in which case it will fail if no rows have been returned
36
37 :param conn_id: The connection to run the sensor against
38 :type conn_id: str
39 :param sql: The sql to run. To pass, it needs to return at least one cell
40 that contains a non-zero / empty string value.
41 :type sql: str
42 :param parameters: The parameters to render the SQL query with (optional).
43 :type parameters: dict or iterable
44 :param success: Success criteria for the sensor is a Callable that takes first_cell
45 as the only argument, and returns a boolean (optional).
46 :type: success: Optional<Callable[[Any], bool]>
47 :param failure: Failure criteria for the sensor is a Callable that takes first_cell
48 as the only argument and return a boolean (optional).
49 :type: failure: Optional<Callable[[Any], bool]>
50 :param fail_on_empty: Explicitly fail on no rows returned.
51 :type: fail_on_empty: bool
52 """
53
54 template_fields: Iterable[str] = ('sql',)
55 template_ext: Iterable[str] = (
56 '.hql',
57 '.sql',
58 )
59 ui_color = '#7c7287'
60
61 @apply_defaults
62 def __init__(
63 self, *, conn_id, sql, parameters=None, success=None, failure=None, fail_on_empty=False, **kwargs
64 ):
65 self.conn_id = conn_id
66 self.sql = sql
67 self.parameters = parameters
68 self.success = success
69 self.failure = failure
70 self.fail_on_empty = fail_on_empty
71 super().__init__(**kwargs)
72
73 def _get_hook(self):
74 conn = BaseHook.get_connection(self.conn_id)
75
76 allowed_conn_type = {
77 'google_cloud_platform',
78 'jdbc',
79 'mssql',
80 'mysql',
81 'odbc',
82 'oracle',
83 'postgres',
84 'presto',
85 'snowflake',
86 'sqlite',
87 'vertica',
88 }
89 if conn.conn_type not in allowed_conn_type:
90 raise AirflowException(
91 "The connection type is not supported by SqlSensor. "
92 + "Supported connection types: {}".format(list(allowed_conn_type))
93 )
94 return conn.get_hook()
95
96 def poke(self, context):
97 hook = self._get_hook()
98
99 self.log.info('Poking: %s (with parameters %s)', self.sql, self.parameters)
100 records = hook.get_records(self.sql, self.parameters)
101 if not records:
102 if self.fail_on_empty:
103 raise AirflowException("No rows returned, raising as per fail_on_empty flag")
104 else:
105 return False
106 first_cell = records[0][0]
107 if self.failure is not None:
108 if callable(self.failure):
109 if self.failure(first_cell):
110 raise AirflowException(f"Failure criteria met. self.failure({first_cell}) returned True")
111 else:
112 raise AirflowException(f"self.failure is present, but not callable -> {self.success}")
113 if self.success is not None:
114 if callable(self.success):
115 return self.success(first_cell)
116 else:
117 raise AirflowException(f"self.success is present, but not callable -> {self.success}")
118 return bool(first_cell)
119
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/airflow/sensors/sql_sensor.py b/airflow/sensors/sql_sensor.py
--- a/airflow/sensors/sql_sensor.py
+++ b/airflow/sensors/sql_sensor.py
@@ -109,7 +109,7 @@
if self.failure(first_cell):
raise AirflowException(f"Failure criteria met. self.failure({first_cell}) returned True")
else:
- raise AirflowException(f"self.failure is present, but not callable -> {self.success}")
+ raise AirflowException(f"self.failure is present, but not callable -> {self.failure}")
if self.success is not None:
if callable(self.success):
return self.success(first_cell)
| {"golden_diff": "diff --git a/airflow/sensors/sql_sensor.py b/airflow/sensors/sql_sensor.py\n--- a/airflow/sensors/sql_sensor.py\n+++ b/airflow/sensors/sql_sensor.py\n@@ -109,7 +109,7 @@\n if self.failure(first_cell):\n raise AirflowException(f\"Failure criteria met. self.failure({first_cell}) returned True\")\n else:\n- raise AirflowException(f\"self.failure is present, but not callable -> {self.success}\")\n+ raise AirflowException(f\"self.failure is present, but not callable -> {self.failure}\")\n if self.success is not None:\n if callable(self.success):\n return self.success(first_cell)\n", "issue": "Exception logging success function instead of failure\nhttps://github.com/apache/airflow/blob/fdd9b6f65b608c516b8a062b058972d9a45ec9e3/airflow/sensors/sql_sensor.py#L97\r\n\r\n\n", "before_files": [{"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\nfrom typing import Iterable\n\nfrom airflow.exceptions import AirflowException\nfrom airflow.hooks.base_hook import BaseHook\nfrom airflow.sensors.base_sensor_operator import BaseSensorOperator\nfrom airflow.utils.decorators import apply_defaults\n\n\nclass SqlSensor(BaseSensorOperator):\n \"\"\"\n Runs a sql statement repeatedly until a criteria is met. It will keep trying until\n success or failure criteria are met, or if the first cell is not in (0, '0', '', None).\n Optional success and failure callables are called with the first cell returned as the argument.\n If success callable is defined the sensor will keep retrying until the criteria is met.\n If failure callable is defined and the criteria is met the sensor will raise AirflowException.\n Failure criteria is evaluated before success criteria. A fail_on_empty boolean can also\n be passed to the sensor in which case it will fail if no rows have been returned\n\n :param conn_id: The connection to run the sensor against\n :type conn_id: str\n :param sql: The sql to run. 
To pass, it needs to return at least one cell\n that contains a non-zero / empty string value.\n :type sql: str\n :param parameters: The parameters to render the SQL query with (optional).\n :type parameters: dict or iterable\n :param success: Success criteria for the sensor is a Callable that takes first_cell\n as the only argument, and returns a boolean (optional).\n :type: success: Optional<Callable[[Any], bool]>\n :param failure: Failure criteria for the sensor is a Callable that takes first_cell\n as the only argument and return a boolean (optional).\n :type: failure: Optional<Callable[[Any], bool]>\n :param fail_on_empty: Explicitly fail on no rows returned.\n :type: fail_on_empty: bool\n \"\"\"\n\n template_fields: Iterable[str] = ('sql',)\n template_ext: Iterable[str] = (\n '.hql',\n '.sql',\n )\n ui_color = '#7c7287'\n\n @apply_defaults\n def __init__(\n self, *, conn_id, sql, parameters=None, success=None, failure=None, fail_on_empty=False, **kwargs\n ):\n self.conn_id = conn_id\n self.sql = sql\n self.parameters = parameters\n self.success = success\n self.failure = failure\n self.fail_on_empty = fail_on_empty\n super().__init__(**kwargs)\n\n def _get_hook(self):\n conn = BaseHook.get_connection(self.conn_id)\n\n allowed_conn_type = {\n 'google_cloud_platform',\n 'jdbc',\n 'mssql',\n 'mysql',\n 'odbc',\n 'oracle',\n 'postgres',\n 'presto',\n 'snowflake',\n 'sqlite',\n 'vertica',\n }\n if conn.conn_type not in allowed_conn_type:\n raise AirflowException(\n \"The connection type is not supported by SqlSensor. \"\n + \"Supported connection types: {}\".format(list(allowed_conn_type))\n )\n return conn.get_hook()\n\n def poke(self, context):\n hook = self._get_hook()\n\n self.log.info('Poking: %s (with parameters %s)', self.sql, self.parameters)\n records = hook.get_records(self.sql, self.parameters)\n if not records:\n if self.fail_on_empty:\n raise AirflowException(\"No rows returned, raising as per fail_on_empty flag\")\n else:\n return False\n first_cell = records[0][0]\n if self.failure is not None:\n if callable(self.failure):\n if self.failure(first_cell):\n raise AirflowException(f\"Failure criteria met. self.failure({first_cell}) returned True\")\n else:\n raise AirflowException(f\"self.failure is present, but not callable -> {self.success}\")\n if self.success is not None:\n if callable(self.success):\n return self.success(first_cell)\n else:\n raise AirflowException(f\"self.success is present, but not callable -> {self.success}\")\n return bool(first_cell)\n", "path": "airflow/sensors/sql_sensor.py"}], "after_files": [{"content": "#\n# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. 
See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\nfrom typing import Iterable\n\nfrom airflow.exceptions import AirflowException\nfrom airflow.hooks.base_hook import BaseHook\nfrom airflow.sensors.base_sensor_operator import BaseSensorOperator\nfrom airflow.utils.decorators import apply_defaults\n\n\nclass SqlSensor(BaseSensorOperator):\n \"\"\"\n Runs a sql statement repeatedly until a criteria is met. It will keep trying until\n success or failure criteria are met, or if the first cell is not in (0, '0', '', None).\n Optional success and failure callables are called with the first cell returned as the argument.\n If success callable is defined the sensor will keep retrying until the criteria is met.\n If failure callable is defined and the criteria is met the sensor will raise AirflowException.\n Failure criteria is evaluated before success criteria. A fail_on_empty boolean can also\n be passed to the sensor in which case it will fail if no rows have been returned\n\n :param conn_id: The connection to run the sensor against\n :type conn_id: str\n :param sql: The sql to run. To pass, it needs to return at least one cell\n that contains a non-zero / empty string value.\n :type sql: str\n :param parameters: The parameters to render the SQL query with (optional).\n :type parameters: dict or iterable\n :param success: Success criteria for the sensor is a Callable that takes first_cell\n as the only argument, and returns a boolean (optional).\n :type: success: Optional<Callable[[Any], bool]>\n :param failure: Failure criteria for the sensor is a Callable that takes first_cell\n as the only argument and return a boolean (optional).\n :type: failure: Optional<Callable[[Any], bool]>\n :param fail_on_empty: Explicitly fail on no rows returned.\n :type: fail_on_empty: bool\n \"\"\"\n\n template_fields: Iterable[str] = ('sql',)\n template_ext: Iterable[str] = (\n '.hql',\n '.sql',\n )\n ui_color = '#7c7287'\n\n @apply_defaults\n def __init__(\n self, *, conn_id, sql, parameters=None, success=None, failure=None, fail_on_empty=False, **kwargs\n ):\n self.conn_id = conn_id\n self.sql = sql\n self.parameters = parameters\n self.success = success\n self.failure = failure\n self.fail_on_empty = fail_on_empty\n super().__init__(**kwargs)\n\n def _get_hook(self):\n conn = BaseHook.get_connection(self.conn_id)\n\n allowed_conn_type = {\n 'google_cloud_platform',\n 'jdbc',\n 'mssql',\n 'mysql',\n 'odbc',\n 'oracle',\n 'postgres',\n 'presto',\n 'snowflake',\n 'sqlite',\n 'vertica',\n }\n if conn.conn_type not in allowed_conn_type:\n raise AirflowException(\n \"The connection type is not supported by SqlSensor. \"\n + \"Supported connection types: {}\".format(list(allowed_conn_type))\n )\n return conn.get_hook()\n\n def poke(self, context):\n hook = self._get_hook()\n\n self.log.info('Poking: %s (with parameters %s)', self.sql, self.parameters)\n records = hook.get_records(self.sql, self.parameters)\n if not records:\n if self.fail_on_empty:\n raise AirflowException(\"No rows returned, raising as per fail_on_empty flag\")\n else:\n return False\n first_cell = records[0][0]\n if self.failure is not None:\n if callable(self.failure):\n if self.failure(first_cell):\n raise AirflowException(f\"Failure criteria met. 
self.failure({first_cell}) returned True\")\n else:\n raise AirflowException(f\"self.failure is present, but not callable -> {self.failure}\")\n if self.success is not None:\n if callable(self.success):\n return self.success(first_cell)\n else:\n raise AirflowException(f\"self.success is present, but not callable -> {self.success}\")\n return bool(first_cell)\n", "path": "airflow/sensors/sql_sensor.py"}]} | 1,626 | 151 |
gh_patches_debug_8494 | rasdani/github-patches | git_diff | cloudtools__troposphere-1040 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Appsync ApiKey Expires needs to be an int
The expires prop on the appsync ApiKey class needs to be an int; otherwise the decimal place causes CloudFormation to throw a `Can not deserialize value of type java.lang.Long from String` error.
--- END ISSUE ---
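A minimal sketch of the kind of change this implies — validating `Expires` with an integer validator so the rendered template never contains a decimal point. Property names follow the file below; `str` stands in for the project's `basestring`, so treat the snippet as illustrative:
```python
# Sketch: use the integer validator so Expires serializes as a whole number.
from troposphere import AWSObject
from troposphere.validators import integer


class ApiKey(AWSObject):
    resource_type = "AWS::AppSync::ApiKey"

    props = {
        'ApiId': (str, True),
        'Description': (str, False),
        'Expires': (integer, False),  # was float; a trailing ".0" breaks CloudFormation
    }
```
An epoch value such as `1609459200` then serializes without a trailing `.0`.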
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/appsync.py`
Content:
```
1 # Copyright (c) 2012-2017, Mark Peek <[email protected]>
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5
6 from . import AWSObject, AWSProperty
7 from .validators import boolean
8
9
10 class ApiKey(AWSObject):
11 resource_type = "AWS::AppSync::ApiKey"
12
13 props = {
14 'ApiId': (basestring, True),
15 'Description': (basestring, False),
16 'Expires': (float, False),
17 }
18
19
20 class DynamoDBConfig(AWSProperty):
21 props = {
22 'AwsRegion': (basestring, True),
23 'TableName': (basestring, True),
24 'UseCallerCredentials': (boolean, False),
25 }
26
27
28 class ElasticsearchConfig(AWSProperty):
29 props = {
30 'AwsRegion': (basestring, True),
31 'Endpoint': (basestring, True),
32 }
33
34
35 class LambdaConfig(AWSProperty):
36 props = {
37 'LambdaFunctionArn': (basestring, True),
38 }
39
40
41 class DataSource(AWSObject):
42 resource_type = "AWS::AppSync::DataSource"
43
44 props = {
45 'ApiId': (basestring, True),
46 'Description': (basestring, False),
47 'DynamoDBConfig': (DynamoDBConfig, False),
48 'ElasticsearchConfig': (ElasticsearchConfig, False),
49 'LambdaConfig': (LambdaConfig, False),
50 'Name': (basestring, True),
51 'ServiceRoleArn': (basestring, False),
52 'Type': (basestring, True),
53 }
54
55
56 class LogConfig(AWSProperty):
57 props = {
58 'CloudWatchLogsRoleArn': (basestring, False),
59 'FieldLogLevel': (basestring, False),
60 }
61
62
63 class OpenIDConnectConfig(AWSProperty):
64 props = {
65 'AuthTTL': (float, False),
66 'ClientId': (basestring, False),
67 'IatTTL': (float, False),
68 'Issuer': (basestring, True),
69 }
70
71
72 class UserPoolConfig(AWSProperty):
73 props = {
74 'AppIdClientRegex': (basestring, False),
75 'AwsRegion': (basestring, False),
76 'DefaultAction': (basestring, False),
77 'UserPoolId': (basestring, False),
78 }
79
80
81 class GraphQLApi(AWSObject):
82 resource_type = "AWS::AppSync::GraphQLApi"
83
84 props = {
85 'AuthenticationType': (basestring, True),
86 'LogConfig': (LogConfig, False),
87 'Name': (basestring, True),
88 'OpenIDConnectConfig': (OpenIDConnectConfig, False),
89 'UserPoolConfig': (UserPoolConfig, False),
90 }
91
92
93 class GraphQLSchema(AWSObject):
94 resource_type = "AWS::AppSync::GraphQLSchema"
95
96 props = {
97 'ApiId': (basestring, True),
98 'Definition': (basestring, False),
99 'DefinitionS3Location': (basestring, False),
100 }
101
102
103 class Resolver(AWSObject):
104 resource_type = "AWS::AppSync::Resolver"
105
106 props = {
107 'ApiId': (basestring, True),
108 'DataSourceName': (basestring, True),
109 'FieldName': (basestring, True),
110 'RequestMappingTemplate': (basestring, False),
111 'RequestMappingTemplateS3Location': (basestring, False),
112 'ResponseMappingTemplate': (basestring, False),
113 'ResponseMappingTemplateS3Location': (basestring, False),
114 'TypeName': (basestring, True),
115 }
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/appsync.py b/troposphere/appsync.py
--- a/troposphere/appsync.py
+++ b/troposphere/appsync.py
@@ -4,7 +4,7 @@
# See LICENSE file for full license.
from . import AWSObject, AWSProperty
-from .validators import boolean
+from .validators import boolean, integer
class ApiKey(AWSObject):
@@ -13,7 +13,7 @@
props = {
'ApiId': (basestring, True),
'Description': (basestring, False),
- 'Expires': (float, False),
+ 'Expires': (integer, False),
}
| {"golden_diff": "diff --git a/troposphere/appsync.py b/troposphere/appsync.py\n--- a/troposphere/appsync.py\n+++ b/troposphere/appsync.py\n@@ -4,7 +4,7 @@\n # See LICENSE file for full license.\n \n from . import AWSObject, AWSProperty\n-from .validators import boolean\n+from .validators import boolean, integer\n \n \n class ApiKey(AWSObject):\n@@ -13,7 +13,7 @@\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n- 'Expires': (float, False),\n+ 'Expires': (integer, False),\n }\n", "issue": "Appsync ApiKey Expires needs to be an int\nThe expires prop on the appsync ApiKey class needs to be an int otherwise the decimal place causes cloudformation to throw a `Can not deserialize value of type java.lang.Long from String` error. \n", "before_files": [{"content": "# Copyright (c) 2012-2017, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import boolean\n\n\nclass ApiKey(AWSObject):\n resource_type = \"AWS::AppSync::ApiKey\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'Expires': (float, False),\n }\n\n\nclass DynamoDBConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'TableName': (basestring, True),\n 'UseCallerCredentials': (boolean, False),\n }\n\n\nclass ElasticsearchConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'Endpoint': (basestring, True),\n }\n\n\nclass LambdaConfig(AWSProperty):\n props = {\n 'LambdaFunctionArn': (basestring, True),\n }\n\n\nclass DataSource(AWSObject):\n resource_type = \"AWS::AppSync::DataSource\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'DynamoDBConfig': (DynamoDBConfig, False),\n 'ElasticsearchConfig': (ElasticsearchConfig, False),\n 'LambdaConfig': (LambdaConfig, False),\n 'Name': (basestring, True),\n 'ServiceRoleArn': (basestring, False),\n 'Type': (basestring, True),\n }\n\n\nclass LogConfig(AWSProperty):\n props = {\n 'CloudWatchLogsRoleArn': (basestring, False),\n 'FieldLogLevel': (basestring, False),\n }\n\n\nclass OpenIDConnectConfig(AWSProperty):\n props = {\n 'AuthTTL': (float, False),\n 'ClientId': (basestring, False),\n 'IatTTL': (float, False),\n 'Issuer': (basestring, True),\n }\n\n\nclass UserPoolConfig(AWSProperty):\n props = {\n 'AppIdClientRegex': (basestring, False),\n 'AwsRegion': (basestring, False),\n 'DefaultAction': (basestring, False),\n 'UserPoolId': (basestring, False),\n }\n\n\nclass GraphQLApi(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLApi\"\n\n props = {\n 'AuthenticationType': (basestring, True),\n 'LogConfig': (LogConfig, False),\n 'Name': (basestring, True),\n 'OpenIDConnectConfig': (OpenIDConnectConfig, False),\n 'UserPoolConfig': (UserPoolConfig, False),\n }\n\n\nclass GraphQLSchema(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLSchema\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Definition': (basestring, False),\n 'DefinitionS3Location': (basestring, False),\n }\n\n\nclass Resolver(AWSObject):\n resource_type = \"AWS::AppSync::Resolver\"\n\n props = {\n 'ApiId': (basestring, True),\n 'DataSourceName': (basestring, True),\n 'FieldName': (basestring, True),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n 'ResponseMappingTemplate': (basestring, False),\n 'ResponseMappingTemplateS3Location': (basestring, False),\n 'TypeName': (basestring, True),\n }\n", "path": "troposphere/appsync.py"}], 
"after_files": [{"content": "# Copyright (c) 2012-2017, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\nfrom . import AWSObject, AWSProperty\nfrom .validators import boolean, integer\n\n\nclass ApiKey(AWSObject):\n resource_type = \"AWS::AppSync::ApiKey\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'Expires': (integer, False),\n }\n\n\nclass DynamoDBConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'TableName': (basestring, True),\n 'UseCallerCredentials': (boolean, False),\n }\n\n\nclass ElasticsearchConfig(AWSProperty):\n props = {\n 'AwsRegion': (basestring, True),\n 'Endpoint': (basestring, True),\n }\n\n\nclass LambdaConfig(AWSProperty):\n props = {\n 'LambdaFunctionArn': (basestring, True),\n }\n\n\nclass DataSource(AWSObject):\n resource_type = \"AWS::AppSync::DataSource\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Description': (basestring, False),\n 'DynamoDBConfig': (DynamoDBConfig, False),\n 'ElasticsearchConfig': (ElasticsearchConfig, False),\n 'LambdaConfig': (LambdaConfig, False),\n 'Name': (basestring, True),\n 'ServiceRoleArn': (basestring, False),\n 'Type': (basestring, True),\n }\n\n\nclass LogConfig(AWSProperty):\n props = {\n 'CloudWatchLogsRoleArn': (basestring, False),\n 'FieldLogLevel': (basestring, False),\n }\n\n\nclass OpenIDConnectConfig(AWSProperty):\n props = {\n 'AuthTTL': (float, False),\n 'ClientId': (basestring, False),\n 'IatTTL': (float, False),\n 'Issuer': (basestring, True),\n }\n\n\nclass UserPoolConfig(AWSProperty):\n props = {\n 'AppIdClientRegex': (basestring, False),\n 'AwsRegion': (basestring, False),\n 'DefaultAction': (basestring, False),\n 'UserPoolId': (basestring, False),\n }\n\n\nclass GraphQLApi(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLApi\"\n\n props = {\n 'AuthenticationType': (basestring, True),\n 'LogConfig': (LogConfig, False),\n 'Name': (basestring, True),\n 'OpenIDConnectConfig': (OpenIDConnectConfig, False),\n 'UserPoolConfig': (UserPoolConfig, False),\n }\n\n\nclass GraphQLSchema(AWSObject):\n resource_type = \"AWS::AppSync::GraphQLSchema\"\n\n props = {\n 'ApiId': (basestring, True),\n 'Definition': (basestring, False),\n 'DefinitionS3Location': (basestring, False),\n }\n\n\nclass Resolver(AWSObject):\n resource_type = \"AWS::AppSync::Resolver\"\n\n props = {\n 'ApiId': (basestring, True),\n 'DataSourceName': (basestring, True),\n 'FieldName': (basestring, True),\n 'RequestMappingTemplate': (basestring, False),\n 'RequestMappingTemplateS3Location': (basestring, False),\n 'ResponseMappingTemplate': (basestring, False),\n 'ResponseMappingTemplateS3Location': (basestring, False),\n 'TypeName': (basestring, True),\n }\n", "path": "troposphere/appsync.py"}]} | 1,332 | 146 |
gh_patches_debug_2684 | rasdani/github-patches | git_diff | CTFd__CTFd-1918 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Users in admin scoreboard show user position instead of team position
In teams mode on the admin panel, users are shown with their user position on the scoreboard instead of their team's position. We should be showing both.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `CTFd/admin/users.py`
Content:
```
1 from flask import render_template, request, url_for
2 from sqlalchemy.sql import not_
3
4 from CTFd.admin import admin
5 from CTFd.models import Challenges, Tracking, Users
6 from CTFd.utils import get_config
7 from CTFd.utils.decorators import admins_only
8 from CTFd.utils.modes import TEAMS_MODE
9
10
11 @admin.route("/admin/users")
12 @admins_only
13 def users_listing():
14 q = request.args.get("q")
15 field = request.args.get("field")
16 page = abs(request.args.get("page", 1, type=int))
17 filters = []
18 users = []
19
20 if q:
21 # The field exists as an exposed column
22 if Users.__mapper__.has_property(field):
23 filters.append(getattr(Users, field).like("%{}%".format(q)))
24
25 if q and field == "ip":
26 users = (
27 Users.query.join(Tracking, Users.id == Tracking.user_id)
28 .filter(Tracking.ip.like("%{}%".format(q)))
29 .order_by(Users.id.asc())
30 .paginate(page=page, per_page=50)
31 )
32 else:
33 users = (
34 Users.query.filter(*filters)
35 .order_by(Users.id.asc())
36 .paginate(page=page, per_page=50)
37 )
38
39 args = dict(request.args)
40 args.pop("page", 1)
41
42 return render_template(
43 "admin/users/users.html",
44 users=users,
45 prev_page=url_for(request.endpoint, page=users.prev_num, **args),
46 next_page=url_for(request.endpoint, page=users.next_num, **args),
47 q=q,
48 field=field,
49 )
50
51
52 @admin.route("/admin/users/new")
53 @admins_only
54 def users_new():
55 return render_template("admin/users/new.html")
56
57
58 @admin.route("/admin/users/<int:user_id>")
59 @admins_only
60 def users_detail(user_id):
61 # Get user object
62 user = Users.query.filter_by(id=user_id).first_or_404()
63
64 # Get the user's solves
65 solves = user.get_solves(admin=True)
66
67 # Get challenges that the user is missing
68 if get_config("user_mode") == TEAMS_MODE:
69 if user.team:
70 all_solves = user.team.get_solves(admin=True)
71 else:
72 all_solves = user.get_solves(admin=True)
73 else:
74 all_solves = user.get_solves(admin=True)
75
76 solve_ids = [s.challenge_id for s in all_solves]
77 missing = Challenges.query.filter(not_(Challenges.id.in_(solve_ids))).all()
78
79 # Get IP addresses that the User has used
80 addrs = (
81 Tracking.query.filter_by(user_id=user_id).order_by(Tracking.date.desc()).all()
82 )
83
84 # Get Fails
85 fails = user.get_fails(admin=True)
86
87 # Get Awards
88 awards = user.get_awards(admin=True)
89
90 # Get user properties
91 score = user.get_score(admin=True)
92 place = user.get_place(admin=True)
93
94 return render_template(
95 "admin/users/user.html",
96 solves=solves,
97 user=user,
98 addrs=addrs,
99 score=score,
100 missing=missing,
101 place=place,
102 fails=fails,
103 awards=awards,
104 )
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/CTFd/admin/users.py b/CTFd/admin/users.py
--- a/CTFd/admin/users.py
+++ b/CTFd/admin/users.py
@@ -88,8 +88,8 @@
awards = user.get_awards(admin=True)
# Get user properties
- score = user.get_score(admin=True)
- place = user.get_place(admin=True)
+ score = user.account.get_score(admin=True)
+ place = user.account.get_place(admin=True)
return render_template(
"admin/users/user.html",
| {"golden_diff": "diff --git a/CTFd/admin/users.py b/CTFd/admin/users.py\n--- a/CTFd/admin/users.py\n+++ b/CTFd/admin/users.py\n@@ -88,8 +88,8 @@\n awards = user.get_awards(admin=True)\n \n # Get user properties\n- score = user.get_score(admin=True)\n- place = user.get_place(admin=True)\n+ score = user.account.get_score(admin=True)\n+ place = user.account.get_place(admin=True)\n \n return render_template(\n \"admin/users/user.html\",\n", "issue": "Users in admin scoreboard show user position instead of team position\nIn teams mode on the admin panel, users are shown with their user position on the scoreboard instead of their teams position. We should be showing both. \n", "before_files": [{"content": "from flask import render_template, request, url_for\nfrom sqlalchemy.sql import not_\n\nfrom CTFd.admin import admin\nfrom CTFd.models import Challenges, Tracking, Users\nfrom CTFd.utils import get_config\nfrom CTFd.utils.decorators import admins_only\nfrom CTFd.utils.modes import TEAMS_MODE\n\n\[email protected](\"/admin/users\")\n@admins_only\ndef users_listing():\n q = request.args.get(\"q\")\n field = request.args.get(\"field\")\n page = abs(request.args.get(\"page\", 1, type=int))\n filters = []\n users = []\n\n if q:\n # The field exists as an exposed column\n if Users.__mapper__.has_property(field):\n filters.append(getattr(Users, field).like(\"%{}%\".format(q)))\n\n if q and field == \"ip\":\n users = (\n Users.query.join(Tracking, Users.id == Tracking.user_id)\n .filter(Tracking.ip.like(\"%{}%\".format(q)))\n .order_by(Users.id.asc())\n .paginate(page=page, per_page=50)\n )\n else:\n users = (\n Users.query.filter(*filters)\n .order_by(Users.id.asc())\n .paginate(page=page, per_page=50)\n )\n\n args = dict(request.args)\n args.pop(\"page\", 1)\n\n return render_template(\n \"admin/users/users.html\",\n users=users,\n prev_page=url_for(request.endpoint, page=users.prev_num, **args),\n next_page=url_for(request.endpoint, page=users.next_num, **args),\n q=q,\n field=field,\n )\n\n\[email protected](\"/admin/users/new\")\n@admins_only\ndef users_new():\n return render_template(\"admin/users/new.html\")\n\n\[email protected](\"/admin/users/<int:user_id>\")\n@admins_only\ndef users_detail(user_id):\n # Get user object\n user = Users.query.filter_by(id=user_id).first_or_404()\n\n # Get the user's solves\n solves = user.get_solves(admin=True)\n\n # Get challenges that the user is missing\n if get_config(\"user_mode\") == TEAMS_MODE:\n if user.team:\n all_solves = user.team.get_solves(admin=True)\n else:\n all_solves = user.get_solves(admin=True)\n else:\n all_solves = user.get_solves(admin=True)\n\n solve_ids = [s.challenge_id for s in all_solves]\n missing = Challenges.query.filter(not_(Challenges.id.in_(solve_ids))).all()\n\n # Get IP addresses that the User has used\n addrs = (\n Tracking.query.filter_by(user_id=user_id).order_by(Tracking.date.desc()).all()\n )\n\n # Get Fails\n fails = user.get_fails(admin=True)\n\n # Get Awards\n awards = user.get_awards(admin=True)\n\n # Get user properties\n score = user.get_score(admin=True)\n place = user.get_place(admin=True)\n\n return render_template(\n \"admin/users/user.html\",\n solves=solves,\n user=user,\n addrs=addrs,\n score=score,\n missing=missing,\n place=place,\n fails=fails,\n awards=awards,\n )\n", "path": "CTFd/admin/users.py"}], "after_files": [{"content": "from flask import render_template, request, url_for\nfrom sqlalchemy.sql import not_\n\nfrom CTFd.admin import admin\nfrom CTFd.models import Challenges, Tracking, Users\nfrom 
CTFd.utils import get_config\nfrom CTFd.utils.decorators import admins_only\nfrom CTFd.utils.modes import TEAMS_MODE\n\n\[email protected](\"/admin/users\")\n@admins_only\ndef users_listing():\n q = request.args.get(\"q\")\n field = request.args.get(\"field\")\n page = abs(request.args.get(\"page\", 1, type=int))\n filters = []\n users = []\n\n if q:\n # The field exists as an exposed column\n if Users.__mapper__.has_property(field):\n filters.append(getattr(Users, field).like(\"%{}%\".format(q)))\n\n if q and field == \"ip\":\n users = (\n Users.query.join(Tracking, Users.id == Tracking.user_id)\n .filter(Tracking.ip.like(\"%{}%\".format(q)))\n .order_by(Users.id.asc())\n .paginate(page=page, per_page=50)\n )\n else:\n users = (\n Users.query.filter(*filters)\n .order_by(Users.id.asc())\n .paginate(page=page, per_page=50)\n )\n\n args = dict(request.args)\n args.pop(\"page\", 1)\n\n return render_template(\n \"admin/users/users.html\",\n users=users,\n prev_page=url_for(request.endpoint, page=users.prev_num, **args),\n next_page=url_for(request.endpoint, page=users.next_num, **args),\n q=q,\n field=field,\n )\n\n\[email protected](\"/admin/users/new\")\n@admins_only\ndef users_new():\n return render_template(\"admin/users/new.html\")\n\n\[email protected](\"/admin/users/<int:user_id>\")\n@admins_only\ndef users_detail(user_id):\n # Get user object\n user = Users.query.filter_by(id=user_id).first_or_404()\n\n # Get the user's solves\n solves = user.get_solves(admin=True)\n\n # Get challenges that the user is missing\n if get_config(\"user_mode\") == TEAMS_MODE:\n if user.team:\n all_solves = user.team.get_solves(admin=True)\n else:\n all_solves = user.get_solves(admin=True)\n else:\n all_solves = user.get_solves(admin=True)\n\n solve_ids = [s.challenge_id for s in all_solves]\n missing = Challenges.query.filter(not_(Challenges.id.in_(solve_ids))).all()\n\n # Get IP addresses that the User has used\n addrs = (\n Tracking.query.filter_by(user_id=user_id).order_by(Tracking.date.desc()).all()\n )\n\n # Get Fails\n fails = user.get_fails(admin=True)\n\n # Get Awards\n awards = user.get_awards(admin=True)\n\n # Get user properties\n score = user.account.get_score(admin=True)\n place = user.account.get_place(admin=True)\n\n return render_template(\n \"admin/users/user.html\",\n solves=solves,\n user=user,\n addrs=addrs,\n score=score,\n missing=missing,\n place=place,\n fails=fails,\n awards=awards,\n )\n", "path": "CTFd/admin/users.py"}]} | 1,217 | 120 |
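A minimal sketch of the pattern the golden diff above applies: in teams mode the admin page should read score and place from `user.account` (assumed to resolve to the team in teams mode and to the user itself otherwise) rather than from the user row directly. The helper name below is hypothetical.

```python
# Hypothetical helper mirroring the fix above; `user.account` is assumed to be
# the team when CTFd runs in teams mode and the user itself in users mode.
def admin_user_standing(user):
    account = user.account
    return {
        "score": account.get_score(admin=True),
        "place": account.get_place(admin=True),
    }
```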
gh_patches_debug_7313 | rasdani/github-patches | git_diff | MycroftAI__mycroft-core-1195 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Calling the pause() function does not pause the playback
This [pause() function](https://github.com/MycroftAI/mycroft-core/blob/a9d627c979b98053719200c6084f3b31a9a97c62/mycroft/skills/audioservice.py#L83) does not have the desired effect of pausing the playback.
`self.emitter.emit(Message('mycroft.sudio.service.stop'))`
This stops the playback, but the Audioservice does not expose a stop function. I could make the commit to add the stop() function if agreed upon.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mycroft/skills/audioservice.py`
Content:
```
1 # Copyright 2017 Mycroft AI Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 #
15 import time
16
17 from os.path import abspath
18
19 from mycroft.messagebus.message import Message
20
21
22 def ensure_uri(s):
23 """
24 Interprete paths as file:// uri's
25
26 Args:
27 s: string to be checked
28
29 Returns:
30 if s is uri, s is returned otherwise file:// is prepended
31 """
32 if '://' not in s:
33 return 'file://' + abspath(s)
34 else:
35 return s
36
37
38 class AudioService():
39 """
40 AudioService object for interacting with the audio subsystem
41
42 Args:
43 emitter: eventemitter or websocket object
44 """
45
46 def __init__(self, emitter):
47 self.emitter = emitter
48 self.emitter.on('mycroft.audio.service.track_info_reply',
49 self._track_info)
50 self.info = None
51
52 def _track_info(self, message=None):
53 """
54 Handler for catching returning track info
55 """
56 self.info = message.data
57
58 def play(self, tracks=[], utterance=''):
59 """ Start playback.
60
61 Args:
62 tracks: track uri or list of track uri's
63 utterance: forward utterance for further processing by the
64 audio service.
65 """
66 if isinstance(tracks, basestring):
67 tracks = [tracks]
68 elif not isinstance(tracks, list):
69 raise ValueError
70 tracks = [ensure_uri(t) for t in tracks]
71 self.emitter.emit(Message('mycroft.audio.service.play',
72 data={'tracks': tracks,
73 'utterance': utterance}))
74
75 def next(self):
76 """ Change to next track. """
77 self.emitter.emit(Message('mycroft.audio.service.next'))
78
79 def prev(self):
80 """ Change to previous track. """
81 self.emitter.emit(Message('mycroft.audio.service.prev'))
82
83 def pause(self):
84 """ Pause playback. """
85 self.emitter.emit(Message('mycroft.audio.service.pause'))
86
87 def resume(self):
88 """ Resume paused playback. """
89 self.emitter.emit(Message('mycroft.audio.service.resume'))
90
91 def track_info(self):
92 """ Request information of current playing track.
93
94 Returns:
95 Dict with track info.
96 """
97 self.info = None
98 self.emitter.emit(Message('mycroft.audio.service.track_info'))
99 wait = 5.0
100 while self.info is None and wait >= 0:
101 time.sleep(0.1)
102 wait -= 0.1
103
104 return self.info or {}
105
106 @property
107 def is_playing(self):
108 return self.track_info() != {}
109
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mycroft/skills/audioservice.py b/mycroft/skills/audioservice.py
--- a/mycroft/skills/audioservice.py
+++ b/mycroft/skills/audioservice.py
@@ -72,6 +72,10 @@
data={'tracks': tracks,
'utterance': utterance}))
+ def stop(self):
+ """ Stop the track. """
+ self.emitter.emit(Message('mycroft.audio.service.stop'))
+
def next(self):
""" Change to next track. """
self.emitter.emit(Message('mycroft.audio.service.next'))
| {"golden_diff": "diff --git a/mycroft/skills/audioservice.py b/mycroft/skills/audioservice.py\n--- a/mycroft/skills/audioservice.py\n+++ b/mycroft/skills/audioservice.py\n@@ -72,6 +72,10 @@\n data={'tracks': tracks,\n 'utterance': utterance}))\n \n+ def stop(self):\n+ \"\"\" Stop the track. \"\"\"\n+ self.emitter.emit(Message('mycroft.audio.service.stop'))\n+\n def next(self):\n \"\"\" Change to next track. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.next'))\n", "issue": "Calling the pause() function does not pause the playback\nThis [pause() function](https://github.com/MycroftAI/mycroft-core/blob/a9d627c979b98053719200c6084f3b31a9a97c62/mycroft/skills/audioservice.py#L83) does not have the desired effect of pausing the playback. \r\n`self.emitter.emit(Message('mycroft.sudio.service.stop'))`\r\nThis stops the playback, but the Audioservice does not expose a stop function. I could make the commit to add the stop() function if agreed upon.\n", "before_files": [{"content": "# Copyright 2017 Mycroft AI Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n#\nimport time\n\nfrom os.path import abspath\n\nfrom mycroft.messagebus.message import Message\n\n\ndef ensure_uri(s):\n \"\"\"\n Interprete paths as file:// uri's\n\n Args:\n s: string to be checked\n\n Returns:\n if s is uri, s is returned otherwise file:// is prepended\n \"\"\"\n if '://' not in s:\n return 'file://' + abspath(s)\n else:\n return s\n\n\nclass AudioService():\n \"\"\"\n AudioService object for interacting with the audio subsystem\n\n Args:\n emitter: eventemitter or websocket object\n \"\"\"\n\n def __init__(self, emitter):\n self.emitter = emitter\n self.emitter.on('mycroft.audio.service.track_info_reply',\n self._track_info)\n self.info = None\n\n def _track_info(self, message=None):\n \"\"\"\n Handler for catching returning track info\n \"\"\"\n self.info = message.data\n\n def play(self, tracks=[], utterance=''):\n \"\"\" Start playback.\n\n Args:\n tracks: track uri or list of track uri's\n utterance: forward utterance for further processing by the\n audio service.\n \"\"\"\n if isinstance(tracks, basestring):\n tracks = [tracks]\n elif not isinstance(tracks, list):\n raise ValueError\n tracks = [ensure_uri(t) for t in tracks]\n self.emitter.emit(Message('mycroft.audio.service.play',\n data={'tracks': tracks,\n 'utterance': utterance}))\n\n def next(self):\n \"\"\" Change to next track. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.next'))\n\n def prev(self):\n \"\"\" Change to previous track. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.prev'))\n\n def pause(self):\n \"\"\" Pause playback. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.pause'))\n\n def resume(self):\n \"\"\" Resume paused playback. 
\"\"\"\n self.emitter.emit(Message('mycroft.audio.service.resume'))\n\n def track_info(self):\n \"\"\" Request information of current playing track.\n\n Returns:\n Dict with track info.\n \"\"\"\n self.info = None\n self.emitter.emit(Message('mycroft.audio.service.track_info'))\n wait = 5.0\n while self.info is None and wait >= 0:\n time.sleep(0.1)\n wait -= 0.1\n\n return self.info or {}\n\n @property\n def is_playing(self):\n return self.track_info() != {}\n", "path": "mycroft/skills/audioservice.py"}], "after_files": [{"content": "# Copyright 2017 Mycroft AI Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n#\nimport time\n\nfrom os.path import abspath\n\nfrom mycroft.messagebus.message import Message\n\n\ndef ensure_uri(s):\n \"\"\"\n Interprete paths as file:// uri's\n\n Args:\n s: string to be checked\n\n Returns:\n if s is uri, s is returned otherwise file:// is prepended\n \"\"\"\n if '://' not in s:\n return 'file://' + abspath(s)\n else:\n return s\n\n\nclass AudioService():\n \"\"\"\n AudioService object for interacting with the audio subsystem\n\n Args:\n emitter: eventemitter or websocket object\n \"\"\"\n\n def __init__(self, emitter):\n self.emitter = emitter\n self.emitter.on('mycroft.audio.service.track_info_reply',\n self._track_info)\n self.info = None\n\n def _track_info(self, message=None):\n \"\"\"\n Handler for catching returning track info\n \"\"\"\n self.info = message.data\n\n def play(self, tracks=[], utterance=''):\n \"\"\" Start playback.\n\n Args:\n tracks: track uri or list of track uri's\n utterance: forward utterance for further processing by the\n audio service.\n \"\"\"\n if isinstance(tracks, basestring):\n tracks = [tracks]\n elif not isinstance(tracks, list):\n raise ValueError\n tracks = [ensure_uri(t) for t in tracks]\n self.emitter.emit(Message('mycroft.audio.service.play',\n data={'tracks': tracks,\n 'utterance': utterance}))\n\n def stop(self):\n \"\"\" Stop the track. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.stop'))\n\n def next(self):\n \"\"\" Change to next track. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.next'))\n\n def prev(self):\n \"\"\" Change to previous track. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.prev'))\n\n def pause(self):\n \"\"\" Pause playback. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.pause'))\n\n def resume(self):\n \"\"\" Resume paused playback. \"\"\"\n self.emitter.emit(Message('mycroft.audio.service.resume'))\n\n def track_info(self):\n \"\"\" Request information of current playing track.\n\n Returns:\n Dict with track info.\n \"\"\"\n self.info = None\n self.emitter.emit(Message('mycroft.audio.service.track_info'))\n wait = 5.0\n while self.info is None and wait >= 0:\n time.sleep(0.1)\n wait -= 0.1\n\n return self.info or {}\n\n @property\n def is_playing(self):\n return self.track_info() != {}\n", "path": "mycroft/skills/audioservice.py"}]} | 1,302 | 132 |
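A short usage sketch of the patched `AudioService`; the emitter object and the track URI are placeholders for whatever the calling skill already holds.

```python
# Sketch assuming the AudioService class from the patch above.
from mycroft.skills.audioservice import AudioService

def demo(emitter):
    service = AudioService(emitter)            # emitter: the skill's websocket/event emitter
    service.play(['file:///tmp/example.mp3'])  # emits 'mycroft.audio.service.play'
    service.pause()                            # emits 'mycroft.audio.service.pause'
    service.stop()                             # emits 'mycroft.audio.service.stop' (the newly added method)
```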
gh_patches_debug_14746 | rasdani/github-patches | git_diff | GoogleCloudPlatform__PerfKitBenchmarker-563 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Project id not respected when deleting disk in Cloudstack
I forgot to pass project_id as an argument when deleting the volume.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `perfkitbenchmarker/cloudstack/cloudstack_disk.py`
Content:
```
1 # Copyright 2014 Google Inc. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 """Module containing classes related to CloudStack disks.
15
16 Disks can be created, deleted, attached to VMs, and detached from VMs.
17 """
18
19 from perfkitbenchmarker import disk
20 from perfkitbenchmarker import vm_util
21 from perfkitbenchmarker import flags
22 from perfkitbenchmarker.cloudstack import util
23
24 import string
25 import logging
26
27 FLAGS = flags.FLAGS
28
29
30 class CloudStackDisk(disk.BaseDisk):
31 """Object representing a Cloudstack Disk."""
32
33
34 def __init__(self, disk_spec, name, zone_id, project_id=None):
35 super(CloudStackDisk, self).__init__(disk_spec)
36
37 self.cs = util.CsClient(
38 FLAGS.CS_API_URL,
39 FLAGS.CS_API_KEY,
40 FLAGS.CS_API_SECRET
41 )
42
43 self.attached_vm_name = None
44 self.attached_vm_id = None
45 self.name = name
46
47 self.zone_id = zone_id
48 self.project_id = project_id
49
50 self.disk_offering_id = self._GetBestOfferingId(self.disk_size)
51 assert self.disk_offering_id, "Unable get disk offering of given size"
52
53 if self.disk_spec.disk_type:
54 logging.warn("Cloudstack does not support disk types")
55
56
57 @vm_util.Retry(max_retries=3)
58 def _Create(self):
59 """Creates the disk."""
60
61
62 volume = self.cs.create_volume(self.name,
63 self.disk_offering_id,
64 self.zone_id,
65 self.project_id)
66
67 assert volume, "Unable to create volume"
68
69 self.volume_id = volume['id']
70 self.disk_type = volume['type']
71 self.actual_disk_size = int(volume['size']) / (2 ** 30) # In GB
72
73
74 def _Delete(self):
75 """Deletes the disk."""
76 vol = self.cs.get_volume(self.name)
77 if vol:
78 self.cs.delete_volume(self.volume_id)
79
80
81 def _Exists(self):
82 """Returns true if the disk exists."""
83 vol = self.cs.get_volume(self.name, self.project_id)
84 if vol:
85 return True
86 return False
87
88
89 @vm_util.Retry(max_retries=3)
90 def Attach(self, vm):
91 """Attaches the disk to a VM.
92
93 Args:
94 vm: The CloudStackVirtualMachine instance to which
95 the disk will be attached.
96
97 """
98
99 res = self.cs.attach_volume(self.volume_id, vm.id)
100 assert res, "Unable to attach volume"
101
102 self.device_id = res['deviceid']
103
104 self.device_path = "/dev/xvd" + \
105 str(string.ascii_lowercase[self.device_id])
106
107
108
109 def Detach(self):
110 """Detaches the disk from a VM."""
111
112 self.cs.detach_volume(self.volume_id)
113
114
115 def _GetBestOfferingId(self, disk_size):
116 """ Given a disk_size (in GB), try to find a disk
117 offering that is atleast as big as the requested
118 one.
119 """
120
121 disk_offerings = self.cs.list_disk_offerings()
122 sorted_do = sorted(disk_offerings, key=lambda x: x['disksize'])
123
124 for do in sorted_do:
125 if int(do['disksize']) >= disk_size:
126 return do['id']
127
128 return None
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/perfkitbenchmarker/cloudstack/cloudstack_disk.py b/perfkitbenchmarker/cloudstack/cloudstack_disk.py
--- a/perfkitbenchmarker/cloudstack/cloudstack_disk.py
+++ b/perfkitbenchmarker/cloudstack/cloudstack_disk.py
@@ -50,7 +50,7 @@
self.disk_offering_id = self._GetBestOfferingId(self.disk_size)
assert self.disk_offering_id, "Unable get disk offering of given size"
- if self.disk_spec.disk_type:
+ if disk_spec.disk_type:
logging.warn("Cloudstack does not support disk types")
@@ -73,7 +73,7 @@
def _Delete(self):
"""Deletes the disk."""
- vol = self.cs.get_volume(self.name)
+ vol = self.cs.get_volume(self.name, self.project_id)
if vol:
self.cs.delete_volume(self.volume_id)
| {"golden_diff": "diff --git a/perfkitbenchmarker/cloudstack/cloudstack_disk.py b/perfkitbenchmarker/cloudstack/cloudstack_disk.py\n--- a/perfkitbenchmarker/cloudstack/cloudstack_disk.py\n+++ b/perfkitbenchmarker/cloudstack/cloudstack_disk.py\n@@ -50,7 +50,7 @@\n self.disk_offering_id = self._GetBestOfferingId(self.disk_size)\n assert self.disk_offering_id, \"Unable get disk offering of given size\"\n \n- if self.disk_spec.disk_type:\n+ if disk_spec.disk_type:\n logging.warn(\"Cloudstack does not support disk types\")\n \n \n@@ -73,7 +73,7 @@\n \n def _Delete(self):\n \"\"\"Deletes the disk.\"\"\"\n- vol = self.cs.get_volume(self.name)\n+ vol = self.cs.get_volume(self.name, self.project_id)\n if vol:\n self.cs.delete_volume(self.volume_id)\n", "issue": "Project id not respected when deleting disk in Cloudstack\nI forgot to pass project_id as an agrugment when deleting volume. \n\n", "before_files": [{"content": "# Copyright 2014 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Module containing classes related to CloudStack disks.\n\nDisks can be created, deleted, attached to VMs, and detached from VMs.\n\"\"\"\n\nfrom perfkitbenchmarker import disk\nfrom perfkitbenchmarker import vm_util\nfrom perfkitbenchmarker import flags\nfrom perfkitbenchmarker.cloudstack import util\n\nimport string\nimport logging\n\nFLAGS = flags.FLAGS\n\n\nclass CloudStackDisk(disk.BaseDisk):\n \"\"\"Object representing a Cloudstack Disk.\"\"\"\n\n\n def __init__(self, disk_spec, name, zone_id, project_id=None):\n super(CloudStackDisk, self).__init__(disk_spec)\n\n self.cs = util.CsClient(\n FLAGS.CS_API_URL,\n FLAGS.CS_API_KEY,\n FLAGS.CS_API_SECRET\n )\n\n self.attached_vm_name = None\n self.attached_vm_id = None\n self.name = name\n\n self.zone_id = zone_id\n self.project_id = project_id\n\n self.disk_offering_id = self._GetBestOfferingId(self.disk_size)\n assert self.disk_offering_id, \"Unable get disk offering of given size\"\n\n if self.disk_spec.disk_type:\n logging.warn(\"Cloudstack does not support disk types\")\n\n\n @vm_util.Retry(max_retries=3)\n def _Create(self):\n \"\"\"Creates the disk.\"\"\"\n\n\n volume = self.cs.create_volume(self.name,\n self.disk_offering_id,\n self.zone_id,\n self.project_id)\n\n assert volume, \"Unable to create volume\"\n\n self.volume_id = volume['id']\n self.disk_type = volume['type']\n self.actual_disk_size = int(volume['size']) / (2 ** 30) # In GB\n\n\n def _Delete(self):\n \"\"\"Deletes the disk.\"\"\"\n vol = self.cs.get_volume(self.name)\n if vol:\n self.cs.delete_volume(self.volume_id)\n\n\n def _Exists(self):\n \"\"\"Returns true if the disk exists.\"\"\"\n vol = self.cs.get_volume(self.name, self.project_id)\n if vol:\n return True\n return False\n\n\n @vm_util.Retry(max_retries=3)\n def Attach(self, vm):\n \"\"\"Attaches the disk to a VM.\n\n Args:\n vm: The CloudStackVirtualMachine instance to which\n the disk will be attached.\n\n \"\"\"\n\n res = self.cs.attach_volume(self.volume_id, vm.id)\n assert res, \"Unable to 
attach volume\"\n\n self.device_id = res['deviceid']\n\n self.device_path = \"/dev/xvd\" + \\\n str(string.ascii_lowercase[self.device_id])\n\n\n\n def Detach(self):\n \"\"\"Detaches the disk from a VM.\"\"\"\n\n self.cs.detach_volume(self.volume_id)\n\n\n def _GetBestOfferingId(self, disk_size):\n \"\"\" Given a disk_size (in GB), try to find a disk\n offering that is atleast as big as the requested\n one.\n \"\"\"\n\n disk_offerings = self.cs.list_disk_offerings()\n sorted_do = sorted(disk_offerings, key=lambda x: x['disksize'])\n\n for do in sorted_do:\n if int(do['disksize']) >= disk_size:\n return do['id']\n\n return None\n", "path": "perfkitbenchmarker/cloudstack/cloudstack_disk.py"}], "after_files": [{"content": "# Copyright 2014 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Module containing classes related to CloudStack disks.\n\nDisks can be created, deleted, attached to VMs, and detached from VMs.\n\"\"\"\n\nfrom perfkitbenchmarker import disk\nfrom perfkitbenchmarker import vm_util\nfrom perfkitbenchmarker import flags\nfrom perfkitbenchmarker.cloudstack import util\n\nimport string\nimport logging\n\nFLAGS = flags.FLAGS\n\n\nclass CloudStackDisk(disk.BaseDisk):\n \"\"\"Object representing a Cloudstack Disk.\"\"\"\n\n\n def __init__(self, disk_spec, name, zone_id, project_id=None):\n super(CloudStackDisk, self).__init__(disk_spec)\n\n self.cs = util.CsClient(\n FLAGS.CS_API_URL,\n FLAGS.CS_API_KEY,\n FLAGS.CS_API_SECRET\n )\n\n self.attached_vm_name = None\n self.attached_vm_id = None\n self.name = name\n\n self.zone_id = zone_id\n self.project_id = project_id\n\n self.disk_offering_id = self._GetBestOfferingId(self.disk_size)\n assert self.disk_offering_id, \"Unable get disk offering of given size\"\n\n if disk_spec.disk_type:\n logging.warn(\"Cloudstack does not support disk types\")\n\n\n @vm_util.Retry(max_retries=3)\n def _Create(self):\n \"\"\"Creates the disk.\"\"\"\n\n\n volume = self.cs.create_volume(self.name,\n self.disk_offering_id,\n self.zone_id,\n self.project_id)\n\n assert volume, \"Unable to create volume\"\n\n self.volume_id = volume['id']\n self.disk_type = volume['type']\n self.actual_disk_size = int(volume['size']) / (2 ** 30) # In GB\n\n\n def _Delete(self):\n \"\"\"Deletes the disk.\"\"\"\n vol = self.cs.get_volume(self.name, self.project_id)\n if vol:\n self.cs.delete_volume(self.volume_id)\n\n\n def _Exists(self):\n \"\"\"Returns true if the disk exists.\"\"\"\n vol = self.cs.get_volume(self.name, self.project_id)\n if vol:\n return True\n return False\n\n\n @vm_util.Retry(max_retries=3)\n def Attach(self, vm):\n \"\"\"Attaches the disk to a VM.\n\n Args:\n vm: The CloudStackVirtualMachine instance to which\n the disk will be attached.\n\n \"\"\"\n\n res = self.cs.attach_volume(self.volume_id, vm.id)\n assert res, \"Unable to attach volume\"\n\n self.device_id = res['deviceid']\n\n self.device_path = \"/dev/xvd\" + \\\n str(string.ascii_lowercase[self.device_id])\n\n\n\n def Detach(self):\n \"\"\"Detaches the 
disk from a VM.\"\"\"\n\n self.cs.detach_volume(self.volume_id)\n\n\n def _GetBestOfferingId(self, disk_size):\n \"\"\" Given a disk_size (in GB), try to find a disk\n offering that is atleast as big as the requested\n one.\n \"\"\"\n\n disk_offerings = self.cs.list_disk_offerings()\n sorted_do = sorted(disk_offerings, key=lambda x: x['disksize'])\n\n for do in sorted_do:\n if int(do['disksize']) >= disk_size:\n return do['id']\n\n return None\n", "path": "perfkitbenchmarker/cloudstack/cloudstack_disk.py"}]} | 1,415 | 198 |
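The essence of the golden diff above in isolation: the volume lookup during deletion must be scoped to the same project the volume was created under, otherwise `get_volume` presumably finds nothing and the delete call is skipped.

```python
# Mirrors the corrected _Delete from the patch above (not a new implementation).
def _Delete(self):
    """Deletes the disk."""
    # Pass project_id so volumes created inside a CloudStack project are found.
    vol = self.cs.get_volume(self.name, self.project_id)
    if vol:
        self.cs.delete_volume(self.volume_id)
```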
gh_patches_debug_21325 | rasdani/github-patches | git_diff | akvo__akvo-rsr-3020 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Incorrect mandatory validation for qualitative indicators
The indicator measure for qualitative indicators is hidden, but the code checking for completion of sections doesn't ignore that field, and considers it mandatory.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/iati/checks/fields/results.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7
8 def results(project):
9 """
10 Check if result has a type, title and at least one indicator.
11 Check if indicator has a measure and title.
12 Check if indicator baseline has year and value.
13 Check if indicator period has a start and end date, and start before end.
14
15 :param project: Project object
16 :return: All checks passed boolean, [Check results]
17 """
18 checks = []
19 all_checks_passed = True
20
21 for result in project.results.all():
22 if not result.type:
23 all_checks_passed = False
24 checks.append((u'error', u'result (id: %s) has no type specified' % str(result.pk)))
25
26 if not result.title:
27 all_checks_passed = False
28 checks.append((u'error', u'result (id: %s) has no title specified' % str(result.pk)))
29
30 if not result.indicators.all():
31 all_checks_passed = False
32 checks.append((u'error', u'result (id: %s) has no indicator(s)' % str(result.pk)))
33
34 for indicator in result.indicators.all():
35 if not indicator.measure:
36 all_checks_passed = False
37 checks.append((u'error', u'indicator (id: %s) has no measure specified' %
38 str(indicator.pk)))
39
40 if not indicator.title:
41 all_checks_passed = False
42 checks.append((u'error', u'indicator (id: %s) has no title specified' %
43 str(indicator.pk)))
44
45 if (indicator.baseline_value and not indicator.baseline_year) or \
46 (not indicator.baseline_value and indicator.baseline_year):
47 all_checks_passed = False
48 checks.append((u'error', u'indicator (id: %s) baseline has no value or year '
49 u'specified' % str(indicator.pk)))
50
51 for reference in indicator.references.all():
52 if not reference.reference:
53 all_checks_passed = False
54 checks.append((u'error', u'indicator reference (id: %s) has no code '
55 u'specified' % str(reference.pk)))
56
57 if not reference.vocabulary:
58 all_checks_passed = False
59 checks.append((u'error', u'indicator reference (id: %s) has no vocabulary '
60 u'specified' % str(reference.pk)))
61
62 if reference.vocabulary == '99' and not reference.vocabulary_uri:
63 all_checks_passed = False
64 checks.append((u'error', u'indicator reference (id: %s) has vocabulary 99 '
65 u'(reporting organisation) but no vocabulary URI '
66 u'specified' % str(reference.pk)))
67
68 for period in indicator.periods.all():
69 if not period.period_start:
70 all_checks_passed = False
71 checks.append((u'error', u'indicator period (id: %s) has no start date '
72 u'specified' % str(period.pk)))
73
74 if not period.period_end:
75 all_checks_passed = False
76 checks.append((u'error', u'indicator period (id: %s) has no end date '
77 u'specified' % str(period.pk)))
78
79 if period.period_start and period.period_end and \
80 period.period_start > period.period_end:
81 all_checks_passed = False
82 checks.append((u'error', u'indicator period (id: %s) has a start date '
83 u'later than the end date' % str(period.pk)))
84
85 if not period.target_value and (period.target_comment or
86 period.target_locations.all() or
87 period.target_dimensions.all()):
88 all_checks_passed = False
89 checks.append((u'error', u'indicator period (id: %s) has no target value, but '
90 u'does have a target comment, target location(s) or '
91 u'target dimension(s)' % str(period.pk)))
92
93 if not period.actual_value and (period.actual_comment or
94 period.actual_locations.all() or
95 period.actual_dimensions.all()):
96 all_checks_passed = False
97 checks.append((u'error', u'indicator period (id: %s) has no actual value, but '
98 u'does have an actual comment, actual location(s) or '
99 u'actual dimension(s)' % str(period.pk)))
100
101 if project.results.all() and all_checks_passed:
102 checks.append((u'success', u'has valid result(s)'))
103
104 return all_checks_passed, checks
105
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/iati/checks/fields/results.py b/akvo/iati/checks/fields/results.py
--- a/akvo/iati/checks/fields/results.py
+++ b/akvo/iati/checks/fields/results.py
@@ -4,6 +4,8 @@
# See more details in the license.txt file located at the root folder of the Akvo RSR module.
# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
+from akvo.rsr.models.result.utils import QUALITATIVE
+
def results(project):
"""
@@ -32,7 +34,7 @@
checks.append((u'error', u'result (id: %s) has no indicator(s)' % str(result.pk)))
for indicator in result.indicators.all():
- if not indicator.measure:
+ if indicator.type != QUALITATIVE and not indicator.measure:
all_checks_passed = False
checks.append((u'error', u'indicator (id: %s) has no measure specified' %
str(indicator.pk)))
| {"golden_diff": "diff --git a/akvo/iati/checks/fields/results.py b/akvo/iati/checks/fields/results.py\n--- a/akvo/iati/checks/fields/results.py\n+++ b/akvo/iati/checks/fields/results.py\n@@ -4,6 +4,8 @@\n # See more details in the license.txt file located at the root folder of the Akvo RSR module.\n # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n \n+from akvo.rsr.models.result.utils import QUALITATIVE\n+\n \n def results(project):\n \"\"\"\n@@ -32,7 +34,7 @@\n checks.append((u'error', u'result (id: %s) has no indicator(s)' % str(result.pk)))\n \n for indicator in result.indicators.all():\n- if not indicator.measure:\n+ if indicator.type != QUALITATIVE and not indicator.measure:\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) has no measure specified' %\n str(indicator.pk)))\n", "issue": "Incorrect mandatory validation for qualitative indicators\nThe indicator measure for qualitative indicators is hidden, but the code checking for completion of sections doesn't ignore that field, and considers it mandatory. \n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\n\ndef results(project):\n \"\"\"\n Check if result has a type, title and at least one indicator.\n Check if indicator has a measure and title.\n Check if indicator baseline has year and value.\n Check if indicator period has a start and end date, and start before end.\n\n :param project: Project object\n :return: All checks passed boolean, [Check results]\n \"\"\"\n checks = []\n all_checks_passed = True\n\n for result in project.results.all():\n if not result.type:\n all_checks_passed = False\n checks.append((u'error', u'result (id: %s) has no type specified' % str(result.pk)))\n\n if not result.title:\n all_checks_passed = False\n checks.append((u'error', u'result (id: %s) has no title specified' % str(result.pk)))\n\n if not result.indicators.all():\n all_checks_passed = False\n checks.append((u'error', u'result (id: %s) has no indicator(s)' % str(result.pk)))\n\n for indicator in result.indicators.all():\n if not indicator.measure:\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) has no measure specified' %\n str(indicator.pk)))\n\n if not indicator.title:\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) has no title specified' %\n str(indicator.pk)))\n\n if (indicator.baseline_value and not indicator.baseline_year) or \\\n (not indicator.baseline_value and indicator.baseline_year):\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) baseline has no value or year '\n u'specified' % str(indicator.pk)))\n\n for reference in indicator.references.all():\n if not reference.reference:\n all_checks_passed = False\n checks.append((u'error', u'indicator reference (id: %s) has no code '\n u'specified' % str(reference.pk)))\n\n if not reference.vocabulary:\n all_checks_passed = False\n checks.append((u'error', u'indicator reference (id: %s) has no vocabulary '\n u'specified' % str(reference.pk)))\n\n if reference.vocabulary == '99' and not reference.vocabulary_uri:\n all_checks_passed = False\n checks.append((u'error', u'indicator reference (id: %s) has vocabulary 99 '\n u'(reporting organisation) but no vocabulary URI 
'\n u'specified' % str(reference.pk)))\n\n for period in indicator.periods.all():\n if not period.period_start:\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no start date '\n u'specified' % str(period.pk)))\n\n if not period.period_end:\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no end date '\n u'specified' % str(period.pk)))\n\n if period.period_start and period.period_end and \\\n period.period_start > period.period_end:\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has a start date '\n u'later than the end date' % str(period.pk)))\n\n if not period.target_value and (period.target_comment or\n period.target_locations.all() or\n period.target_dimensions.all()):\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no target value, but '\n u'does have a target comment, target location(s) or '\n u'target dimension(s)' % str(period.pk)))\n\n if not period.actual_value and (period.actual_comment or\n period.actual_locations.all() or\n period.actual_dimensions.all()):\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no actual value, but '\n u'does have an actual comment, actual location(s) or '\n u'actual dimension(s)' % str(period.pk)))\n\n if project.results.all() and all_checks_passed:\n checks.append((u'success', u'has valid result(s)'))\n\n return all_checks_passed, checks\n", "path": "akvo/iati/checks/fields/results.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom akvo.rsr.models.result.utils import QUALITATIVE\n\n\ndef results(project):\n \"\"\"\n Check if result has a type, title and at least one indicator.\n Check if indicator has a measure and title.\n Check if indicator baseline has year and value.\n Check if indicator period has a start and end date, and start before end.\n\n :param project: Project object\n :return: All checks passed boolean, [Check results]\n \"\"\"\n checks = []\n all_checks_passed = True\n\n for result in project.results.all():\n if not result.type:\n all_checks_passed = False\n checks.append((u'error', u'result (id: %s) has no type specified' % str(result.pk)))\n\n if not result.title:\n all_checks_passed = False\n checks.append((u'error', u'result (id: %s) has no title specified' % str(result.pk)))\n\n if not result.indicators.all():\n all_checks_passed = False\n checks.append((u'error', u'result (id: %s) has no indicator(s)' % str(result.pk)))\n\n for indicator in result.indicators.all():\n if indicator.type != QUALITATIVE and not indicator.measure:\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) has no measure specified' %\n str(indicator.pk)))\n\n if not indicator.title:\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) has no title specified' %\n str(indicator.pk)))\n\n if (indicator.baseline_value and not indicator.baseline_year) or \\\n (not indicator.baseline_value and indicator.baseline_year):\n all_checks_passed = False\n checks.append((u'error', u'indicator (id: %s) baseline has no value or year '\n u'specified' % str(indicator.pk)))\n\n for reference in indicator.references.all():\n if not reference.reference:\n all_checks_passed 
= False\n checks.append((u'error', u'indicator reference (id: %s) has no code '\n u'specified' % str(reference.pk)))\n\n if not reference.vocabulary:\n all_checks_passed = False\n checks.append((u'error', u'indicator reference (id: %s) has no vocabulary '\n u'specified' % str(reference.pk)))\n\n if reference.vocabulary == '99' and not reference.vocabulary_uri:\n all_checks_passed = False\n checks.append((u'error', u'indicator reference (id: %s) has vocabulary 99 '\n u'(reporting organisation) but no vocabulary URI '\n u'specified' % str(reference.pk)))\n\n for period in indicator.periods.all():\n if not period.period_start:\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no start date '\n u'specified' % str(period.pk)))\n\n if not period.period_end:\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no end date '\n u'specified' % str(period.pk)))\n\n if period.period_start and period.period_end and \\\n period.period_start > period.period_end:\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has a start date '\n u'later than the end date' % str(period.pk)))\n\n if not period.target_value and (period.target_comment or\n period.target_locations.all() or\n period.target_dimensions.all()):\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no target value, but '\n u'does have a target comment, target location(s) or '\n u'target dimension(s)' % str(period.pk)))\n\n if not period.actual_value and (period.actual_comment or\n period.actual_locations.all() or\n period.actual_dimensions.all()):\n all_checks_passed = False\n checks.append((u'error', u'indicator period (id: %s) has no actual value, but '\n u'does have an actual comment, actual location(s) or '\n u'actual dimension(s)' % str(period.pk)))\n\n if project.results.all() and all_checks_passed:\n checks.append((u'success', u'has valid result(s)'))\n\n return all_checks_passed, checks\n", "path": "akvo/iati/checks/fields/results.py"}]} | 1,526 | 240 |
gh_patches_debug_16340 | rasdani/github-patches | git_diff | qutip__qutip-2335 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Typos in qutip.expect()'s docstring
### Bug Description
Two words in the docstring contain typos. See below.
### Code to Reproduce the Bug
```shell
from qutip import expect
print(expect.__doc__)
```
### Code Output
```shell
Parameters
----------
oper : qobj/array-like
A single or a `list` or operators for expectation value.
state : qobj/array-like
A single or a `list` of quantum states or density matrices.
Returns
-------
expt : float/complex/array-like
Expectation value. ``real`` if `oper` is Hermitian, ``complex``
otherwise. A (nested) array of expectaction values of state or operator
are arrays.
```
### Expected Behaviour
Parameters
----------
oper : qobj/array-like
A single or a `list` **~~or~~** **_of_** operators for expectation value.
state : qobj/array-like
A single or a `list` of quantum states or density matrices.
Returns
-------
expt : float/complex/array-like
Expectation value. ``real`` if `oper` is Hermitian, ``complex``
otherwise. A (nested) array of expectaction values **~~of~~** **_if_** state or operator
are arrays.
### Your Environment
```shell
QuTiP Version: 4.7.5
Numpy Version: 1.26.4
Scipy Version: 1.11.4
Cython Version: None
Matplotlib Version: 3.8.3
Python Version: 3.12.2
Number of CPUs: 8
BLAS Info: Generic
OPENMP Installed: False
INTEL MKL Ext: False
Platform Info: Windows (AMD64)
```
### Additional Context
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `qutip/core/expect.py`
Content:
```
1 __all__ = ['expect', 'variance']
2
3 import numpy as np
4
5 from .qobj import Qobj
6 from . import data as _data
7
8
9 def expect(oper, state):
10 """
11 Calculate the expectation value for operator(s) and state(s). The
12 expectation of state ``k`` on operator ``A`` is defined as
13 ``k.dag() @ A @ k``, and for density matrix ``R`` on operator ``A`` it is
14 ``trace(A @ R)``.
15
16 Parameters
17 ----------
18 oper : qobj/array-like
19 A single or a `list` or operators for expectation value.
20
21 state : qobj/array-like
22 A single or a `list` of quantum states or density matrices.
23
24 Returns
25 -------
26 expt : float/complex/array-like
27 Expectation value. ``real`` if ``oper`` is Hermitian, ``complex``
28 otherwise. A (nested) array of expectaction values of state or operator
29 are arrays.
30
31 Examples
32 --------
33 >>> expect(num(4), basis(4, 3)) == 3 # doctest: +NORMALIZE_WHITESPACE
34 True
35
36 """
37 if isinstance(state, Qobj) and isinstance(oper, Qobj):
38 return _single_qobj_expect(oper, state)
39
40 elif isinstance(oper, (list, np.ndarray)):
41 if isinstance(state, Qobj):
42 dtype = np.complex128
43 if all(op.isherm for op in oper) and (state.isket or state.isherm):
44 dtype = np.float64
45 return np.array([_single_qobj_expect(op, state) for op in oper],
46 dtype=dtype)
47 return [expect(op, state) for op in oper]
48
49 elif isinstance(state, (list, np.ndarray)):
50 dtype = np.complex128
51 if oper.isherm and all(op.isherm or op.isket for op in state):
52 dtype = np.float64
53 return np.array([_single_qobj_expect(oper, x) for x in state],
54 dtype=dtype)
55 raise TypeError('Arguments must be quantum objects')
56
57
58 def _single_qobj_expect(oper, state):
59 """
60 Private function used by expect to calculate expectation values of Qobjs.
61 """
62 if not oper.isoper or not (state.isket or state.isoper):
63 raise TypeError('invalid operand types')
64 if oper.dims[1] != state.dims[0]:
65 msg = (
66 "incompatible dimensions "
67 + str(oper.dims[1]) + " and " + str(state.dims[0])
68 )
69 raise ValueError(msg)
70 out = _data.expect(oper.data, state.data)
71
72 # This ensures that expect can return something that is not a number such
73 # as a `tensorflow.Tensor` in qutip-tensorflow.
74 return out.real if (oper.isherm
75 and (state.isket or state.isherm)
76 and hasattr(out, "real")
77 ) else out
78
79
80 def variance(oper, state):
81 """
82 Variance of an operator for the given state vector or density matrix.
83
84 Parameters
85 ----------
86 oper : qobj
87 Operator for expectation value.
88
89 state : qobj/list
90 A single or ``list`` of quantum states or density matrices..
91
92 Returns
93 -------
94 var : float
95 Variance of operator 'oper' for given state.
96
97 """
98 return expect(oper**2, state) - expect(oper, state)**2
99
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/qutip/core/expect.py b/qutip/core/expect.py
--- a/qutip/core/expect.py
+++ b/qutip/core/expect.py
@@ -16,7 +16,7 @@
Parameters
----------
oper : qobj/array-like
- A single or a `list` or operators for expectation value.
+ A single or a `list` of operators for expectation value.
state : qobj/array-like
A single or a `list` of quantum states or density matrices.
@@ -25,8 +25,8 @@
-------
expt : float/complex/array-like
Expectation value. ``real`` if ``oper`` is Hermitian, ``complex``
- otherwise. A (nested) array of expectaction values of state or operator
- are arrays.
+ otherwise. A (nested) array of expectaction values if ``state`` or
+ ``oper`` are arrays.
Examples
--------
| {"golden_diff": "diff --git a/qutip/core/expect.py b/qutip/core/expect.py\n--- a/qutip/core/expect.py\n+++ b/qutip/core/expect.py\n@@ -16,7 +16,7 @@\n Parameters\n ----------\n oper : qobj/array-like\n- A single or a `list` or operators for expectation value.\n+ A single or a `list` of operators for expectation value.\n \n state : qobj/array-like\n A single or a `list` of quantum states or density matrices.\n@@ -25,8 +25,8 @@\n -------\n expt : float/complex/array-like\n Expectation value. ``real`` if ``oper`` is Hermitian, ``complex``\n- otherwise. A (nested) array of expectaction values of state or operator\n- are arrays.\n+ otherwise. A (nested) array of expectaction values if ``state`` or\n+ ``oper`` are arrays.\n \n Examples\n --------\n", "issue": "Typos in qutip.expect()'s docstring\n### Bug Description\r\n\r\nTwo words in the docstring contain typos. See below.\r\n\r\n### Code to Reproduce the Bug\r\n\r\n```shell\r\nfrom qutip import expect\r\nprint(expect.__doc__)\r\n```\r\n\r\n\r\n### Code Output\r\n\r\n```shell\r\nParameters\r\n----------\r\noper : qobj/array-like\r\n A single or a `list` or operators for expectation value.\r\n\r\nstate : qobj/array-like\r\n A single or a `list` of quantum states or density matrices.\r\n\r\nReturns\r\n-------\r\nexpt : float/complex/array-like\r\n Expectation value. ``real`` if `oper` is Hermitian, ``complex``\r\n otherwise. A (nested) array of expectaction values of state or operator\r\n are arrays.\r\n```\r\n\r\n\r\n### Expected Behaviour\r\n\r\nParameters\r\n----------\r\noper : qobj/array-like\r\n A single or a `list` **~~or~~** **_of_** operators for expectation value.\r\n\r\nstate : qobj/array-like\r\n A single or a `list` of quantum states or density matrices.\r\n\r\nReturns\r\n-------\r\nexpt : float/complex/array-like\r\n Expectation value. ``real`` if `oper` is Hermitian, ``complex``\r\n otherwise. A (nested) array of expectaction values **~~of~~** **_if_** state or operator\r\n are arrays.\r\n\r\n### Your Environment\r\n\r\n```shell\r\nQuTiP Version: 4.7.5\r\nNumpy Version: 1.26.4\r\nScipy Version: 1.11.4\r\nCython Version: None\r\nMatplotlib Version: 3.8.3\r\nPython Version: 3.12.2\r\nNumber of CPUs: 8\r\nBLAS Info: Generic\r\nOPENMP Installed: False\r\nINTEL MKL Ext: False\r\nPlatform Info: Windows (AMD64)\r\n```\r\n\r\n\r\n### Additional Context\r\n\r\n_No response_\n", "before_files": [{"content": "__all__ = ['expect', 'variance']\n\nimport numpy as np\n\nfrom .qobj import Qobj\nfrom . import data as _data\n\n\ndef expect(oper, state):\n \"\"\"\n Calculate the expectation value for operator(s) and state(s). The\n expectation of state ``k`` on operator ``A`` is defined as\n ``k.dag() @ A @ k``, and for density matrix ``R`` on operator ``A`` it is\n ``trace(A @ R)``.\n\n Parameters\n ----------\n oper : qobj/array-like\n A single or a `list` or operators for expectation value.\n\n state : qobj/array-like\n A single or a `list` of quantum states or density matrices.\n\n Returns\n -------\n expt : float/complex/array-like\n Expectation value. ``real`` if ``oper`` is Hermitian, ``complex``\n otherwise. 
A (nested) array of expectaction values of state or operator\n are arrays.\n\n Examples\n --------\n >>> expect(num(4), basis(4, 3)) == 3 # doctest: +NORMALIZE_WHITESPACE\n True\n\n \"\"\"\n if isinstance(state, Qobj) and isinstance(oper, Qobj):\n return _single_qobj_expect(oper, state)\n\n elif isinstance(oper, (list, np.ndarray)):\n if isinstance(state, Qobj):\n dtype = np.complex128\n if all(op.isherm for op in oper) and (state.isket or state.isherm):\n dtype = np.float64\n return np.array([_single_qobj_expect(op, state) for op in oper],\n dtype=dtype)\n return [expect(op, state) for op in oper]\n\n elif isinstance(state, (list, np.ndarray)):\n dtype = np.complex128\n if oper.isherm and all(op.isherm or op.isket for op in state):\n dtype = np.float64\n return np.array([_single_qobj_expect(oper, x) for x in state],\n dtype=dtype)\n raise TypeError('Arguments must be quantum objects')\n\n\ndef _single_qobj_expect(oper, state):\n \"\"\"\n Private function used by expect to calculate expectation values of Qobjs.\n \"\"\"\n if not oper.isoper or not (state.isket or state.isoper):\n raise TypeError('invalid operand types')\n if oper.dims[1] != state.dims[0]:\n msg = (\n \"incompatible dimensions \"\n + str(oper.dims[1]) + \" and \" + str(state.dims[0])\n )\n raise ValueError(msg)\n out = _data.expect(oper.data, state.data)\n\n # This ensures that expect can return something that is not a number such\n # as a `tensorflow.Tensor` in qutip-tensorflow.\n return out.real if (oper.isherm\n and (state.isket or state.isherm)\n and hasattr(out, \"real\")\n ) else out\n\n\ndef variance(oper, state):\n \"\"\"\n Variance of an operator for the given state vector or density matrix.\n\n Parameters\n ----------\n oper : qobj\n Operator for expectation value.\n\n state : qobj/list\n A single or ``list`` of quantum states or density matrices..\n\n Returns\n -------\n var : float\n Variance of operator 'oper' for given state.\n\n \"\"\"\n return expect(oper**2, state) - expect(oper, state)**2\n", "path": "qutip/core/expect.py"}], "after_files": [{"content": "__all__ = ['expect', 'variance']\n\nimport numpy as np\n\nfrom .qobj import Qobj\nfrom . import data as _data\n\n\ndef expect(oper, state):\n \"\"\"\n Calculate the expectation value for operator(s) and state(s). The\n expectation of state ``k`` on operator ``A`` is defined as\n ``k.dag() @ A @ k``, and for density matrix ``R`` on operator ``A`` it is\n ``trace(A @ R)``.\n\n Parameters\n ----------\n oper : qobj/array-like\n A single or a `list` of operators for expectation value.\n\n state : qobj/array-like\n A single or a `list` of quantum states or density matrices.\n\n Returns\n -------\n expt : float/complex/array-like\n Expectation value. ``real`` if ``oper`` is Hermitian, ``complex``\n otherwise. 
A (nested) array of expectaction values if ``state`` or\n ``oper`` are arrays.\n\n Examples\n --------\n >>> expect(num(4), basis(4, 3)) == 3 # doctest: +NORMALIZE_WHITESPACE\n True\n\n \"\"\"\n if isinstance(state, Qobj) and isinstance(oper, Qobj):\n return _single_qobj_expect(oper, state)\n\n elif isinstance(oper, (list, np.ndarray)):\n if isinstance(state, Qobj):\n dtype = np.complex128\n if all(op.isherm for op in oper) and (state.isket or state.isherm):\n dtype = np.float64\n return np.array([_single_qobj_expect(op, state) for op in oper],\n dtype=dtype)\n return [expect(op, state) for op in oper]\n\n elif isinstance(state, (list, np.ndarray)):\n dtype = np.complex128\n if oper.isherm and all(op.isherm or op.isket for op in state):\n dtype = np.float64\n return np.array([_single_qobj_expect(oper, x) for x in state],\n dtype=dtype)\n raise TypeError('Arguments must be quantum objects')\n\n\ndef _single_qobj_expect(oper, state):\n \"\"\"\n Private function used by expect to calculate expectation values of Qobjs.\n \"\"\"\n if not oper.isoper or not (state.isket or state.isoper):\n raise TypeError('invalid operand types')\n if oper.dims[1] != state.dims[0]:\n msg = (\n \"incompatible dimensions \"\n + str(oper.dims[1]) + \" and \" + str(state.dims[0])\n )\n raise ValueError(msg)\n out = _data.expect(oper.data, state.data)\n\n # This ensures that expect can return something that is not a number such\n # as a `tensorflow.Tensor` in qutip-tensorflow.\n return out.real if (oper.isherm\n and (state.isket or state.isherm)\n and hasattr(out, \"real\")\n ) else out\n\n\ndef variance(oper, state):\n \"\"\"\n Variance of an operator for the given state vector or density matrix.\n\n Parameters\n ----------\n oper : qobj\n Operator for expectation value.\n\n state : qobj/list\n A single or ``list`` of quantum states or density matrices..\n\n Returns\n -------\n var : float\n Variance of operator 'oper' for given state.\n\n \"\"\"\n return expect(oper**2, state) - expect(oper, state)**2\n", "path": "qutip/core/expect.py"}]} | 1,644 | 220 |
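For reference, a minimal usage sketch of the `qutip.expect` API whose docstring is corrected in the entry above; the first call mirrors the doctest already present in the module, while the list form exercises the documented `list`-of-operators support (assumes QuTiP is installed):

```python
from qutip import basis, num, expect

# Hermitian operator on a ket: a real scalar comes back.
print(expect(num(4), basis(4, 3)))                 # 3.0, matching the doctest in the file

# A list of operators against a single state: an array of expectation values.
print(expect([num(4), num(4) ** 2], basis(4, 3)))  # array([3., 9.])
```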
gh_patches_debug_505 | rasdani/github-patches | git_diff | googleapis__google-cloud-python-3693 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Increment threadsafety on BigQuery DB-API interface when httplib2 is removed
From https://github.com/GoogleCloudPlatform/google-cloud-python/pull/2921/files/5b29a9e42a40f7c74ced0b7a7e9db0add52a20a1#r123596828
Blocked by https://github.com/GoogleCloudPlatform/google-cloud-python/issues/1998
--- END ISSUE ---
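For context, PEP 249 defines the module-level `threadsafety` constant: level 1 means threads may share the module but not connections, while level 2 means connections may be shared as well (cursors still may not). A minimal sketch of how calling code might branch on that constant before reusing a connection across threads; the helper below is illustrative and not part of this repository:

```python
from google.cloud.bigquery import dbapi

def get_connection(connection_factory, shared_connection=None):
    """Reuse one connection across threads only if the module advertises level >= 2."""
    if dbapi.threadsafety >= 2 and shared_connection is not None:
        # Level 2: threads may share the module and connections (but not cursors).
        return shared_connection
    # Level 1: threads may share the module only; create a connection per thread.
    return connection_factory()
```

At level 1 each worker thread should call the factory itself; once the constant is raised to 2, a single connection can be handed to every worker, with each worker still opening its own cursor.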
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bigquery/google/cloud/bigquery/dbapi/__init__.py`
Content:
```
1 # Copyright 2017 Google Inc.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Google BigQuery implementation of the Database API Specification v2.0.
16
17 This module implements the `Python Database API Specification v2.0 (DB-API)`_
18 for Google BigQuery.
19
20 .. _Python Database API Specification v2.0 (DB-API):
21 https://www.python.org/dev/peps/pep-0249/
22
23 .. warning::
24 The ``dbapi`` module is **alpha**. The implementation is not complete. It
25 might be changed in backward-incompatible ways and is not subject to any SLA
26 or deprecation policy.
27 """
28
29 from google.cloud.bigquery.dbapi.connection import connect
30 from google.cloud.bigquery.dbapi.connection import Connection
31 from google.cloud.bigquery.dbapi.cursor import Cursor
32 from google.cloud.bigquery.dbapi.exceptions import Warning
33 from google.cloud.bigquery.dbapi.exceptions import Error
34 from google.cloud.bigquery.dbapi.exceptions import InterfaceError
35 from google.cloud.bigquery.dbapi.exceptions import DatabaseError
36 from google.cloud.bigquery.dbapi.exceptions import DataError
37 from google.cloud.bigquery.dbapi.exceptions import OperationalError
38 from google.cloud.bigquery.dbapi.exceptions import IntegrityError
39 from google.cloud.bigquery.dbapi.exceptions import InternalError
40 from google.cloud.bigquery.dbapi.exceptions import ProgrammingError
41 from google.cloud.bigquery.dbapi.exceptions import NotSupportedError
42 from google.cloud.bigquery.dbapi.types import Binary
43 from google.cloud.bigquery.dbapi.types import Date
44 from google.cloud.bigquery.dbapi.types import DateFromTicks
45 from google.cloud.bigquery.dbapi.types import Time
46 from google.cloud.bigquery.dbapi.types import TimeFromTicks
47 from google.cloud.bigquery.dbapi.types import Timestamp
48 from google.cloud.bigquery.dbapi.types import TimestampFromTicks
49 from google.cloud.bigquery.dbapi.types import BINARY
50 from google.cloud.bigquery.dbapi.types import DATETIME
51 from google.cloud.bigquery.dbapi.types import NUMBER
52 from google.cloud.bigquery.dbapi.types import ROWID
53 from google.cloud.bigquery.dbapi.types import STRING
54
55
56 apilevel = '2.0'
57
58 # Threads may share the module, but not connections.
59 threadsafety = 1
60
61 paramstyle = 'pyformat'
62
63 __all__ = [
64 'apilevel', 'threadsafety', 'paramstyle', 'connect', 'Connection',
65 'Cursor', 'Warning', 'Error', 'InterfaceError', 'DatabaseError',
66 'DataError', 'OperationalError', 'IntegrityError', 'InternalError',
67 'ProgrammingError', 'NotSupportedError', 'Binary', 'Date', 'DateFromTicks',
68 'Time', 'TimeFromTicks', 'Timestamp', 'TimestampFromTicks', 'BINARY',
69 'DATETIME', 'NUMBER', 'ROWID', 'STRING',
70 ]
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bigquery/google/cloud/bigquery/dbapi/__init__.py b/bigquery/google/cloud/bigquery/dbapi/__init__.py
--- a/bigquery/google/cloud/bigquery/dbapi/__init__.py
+++ b/bigquery/google/cloud/bigquery/dbapi/__init__.py
@@ -55,8 +55,8 @@
apilevel = '2.0'
-# Threads may share the module, but not connections.
-threadsafety = 1
+# Threads may share the module and connections, but not cursors.
+threadsafety = 2
paramstyle = 'pyformat'
| {"golden_diff": "diff --git a/bigquery/google/cloud/bigquery/dbapi/__init__.py b/bigquery/google/cloud/bigquery/dbapi/__init__.py\n--- a/bigquery/google/cloud/bigquery/dbapi/__init__.py\n+++ b/bigquery/google/cloud/bigquery/dbapi/__init__.py\n@@ -55,8 +55,8 @@\n \n apilevel = '2.0'\n \n-# Threads may share the module, but not connections.\n-threadsafety = 1\n+# Threads may share the module and connections, but not cursors.\n+threadsafety = 2\n \n paramstyle = 'pyformat'\n", "issue": "Increment threadsafety on BigQuery DB-API interface when httplib2 is removed\nFrom https://github.com/GoogleCloudPlatform/google-cloud-python/pull/2921/files/5b29a9e42a40f7c74ced0b7a7e9db0add52a20a1#r123596828\r\n\r\nBlocked by https://github.com/GoogleCloudPlatform/google-cloud-python/issues/1998\n", "before_files": [{"content": "# Copyright 2017 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Google BigQuery implementation of the Database API Specification v2.0.\n\nThis module implements the `Python Database API Specification v2.0 (DB-API)`_\nfor Google BigQuery.\n\n.. _Python Database API Specification v2.0 (DB-API):\n https://www.python.org/dev/peps/pep-0249/\n\n.. warning::\n The ``dbapi`` module is **alpha**. The implementation is not complete. 
It\n might be changed in backward-incompatible ways and is not subject to any SLA\n or deprecation policy.\n\"\"\"\n\nfrom google.cloud.bigquery.dbapi.connection import connect\nfrom google.cloud.bigquery.dbapi.connection import Connection\nfrom google.cloud.bigquery.dbapi.cursor import Cursor\nfrom google.cloud.bigquery.dbapi.exceptions import Warning\nfrom google.cloud.bigquery.dbapi.exceptions import Error\nfrom google.cloud.bigquery.dbapi.exceptions import InterfaceError\nfrom google.cloud.bigquery.dbapi.exceptions import DatabaseError\nfrom google.cloud.bigquery.dbapi.exceptions import DataError\nfrom google.cloud.bigquery.dbapi.exceptions import OperationalError\nfrom google.cloud.bigquery.dbapi.exceptions import IntegrityError\nfrom google.cloud.bigquery.dbapi.exceptions import InternalError\nfrom google.cloud.bigquery.dbapi.exceptions import ProgrammingError\nfrom google.cloud.bigquery.dbapi.exceptions import NotSupportedError\nfrom google.cloud.bigquery.dbapi.types import Binary\nfrom google.cloud.bigquery.dbapi.types import Date\nfrom google.cloud.bigquery.dbapi.types import DateFromTicks\nfrom google.cloud.bigquery.dbapi.types import Time\nfrom google.cloud.bigquery.dbapi.types import TimeFromTicks\nfrom google.cloud.bigquery.dbapi.types import Timestamp\nfrom google.cloud.bigquery.dbapi.types import TimestampFromTicks\nfrom google.cloud.bigquery.dbapi.types import BINARY\nfrom google.cloud.bigquery.dbapi.types import DATETIME\nfrom google.cloud.bigquery.dbapi.types import NUMBER\nfrom google.cloud.bigquery.dbapi.types import ROWID\nfrom google.cloud.bigquery.dbapi.types import STRING\n\n\napilevel = '2.0'\n\n# Threads may share the module, but not connections.\nthreadsafety = 1\n\nparamstyle = 'pyformat'\n\n__all__ = [\n 'apilevel', 'threadsafety', 'paramstyle', 'connect', 'Connection',\n 'Cursor', 'Warning', 'Error', 'InterfaceError', 'DatabaseError',\n 'DataError', 'OperationalError', 'IntegrityError', 'InternalError',\n 'ProgrammingError', 'NotSupportedError', 'Binary', 'Date', 'DateFromTicks',\n 'Time', 'TimeFromTicks', 'Timestamp', 'TimestampFromTicks', 'BINARY',\n 'DATETIME', 'NUMBER', 'ROWID', 'STRING',\n]\n", "path": "bigquery/google/cloud/bigquery/dbapi/__init__.py"}], "after_files": [{"content": "# Copyright 2017 Google Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Google BigQuery implementation of the Database API Specification v2.0.\n\nThis module implements the `Python Database API Specification v2.0 (DB-API)`_\nfor Google BigQuery.\n\n.. _Python Database API Specification v2.0 (DB-API):\n https://www.python.org/dev/peps/pep-0249/\n\n.. warning::\n The ``dbapi`` module is **alpha**. The implementation is not complete. 
It\n might be changed in backward-incompatible ways and is not subject to any SLA\n or deprecation policy.\n\"\"\"\n\nfrom google.cloud.bigquery.dbapi.connection import connect\nfrom google.cloud.bigquery.dbapi.connection import Connection\nfrom google.cloud.bigquery.dbapi.cursor import Cursor\nfrom google.cloud.bigquery.dbapi.exceptions import Warning\nfrom google.cloud.bigquery.dbapi.exceptions import Error\nfrom google.cloud.bigquery.dbapi.exceptions import InterfaceError\nfrom google.cloud.bigquery.dbapi.exceptions import DatabaseError\nfrom google.cloud.bigquery.dbapi.exceptions import DataError\nfrom google.cloud.bigquery.dbapi.exceptions import OperationalError\nfrom google.cloud.bigquery.dbapi.exceptions import IntegrityError\nfrom google.cloud.bigquery.dbapi.exceptions import InternalError\nfrom google.cloud.bigquery.dbapi.exceptions import ProgrammingError\nfrom google.cloud.bigquery.dbapi.exceptions import NotSupportedError\nfrom google.cloud.bigquery.dbapi.types import Binary\nfrom google.cloud.bigquery.dbapi.types import Date\nfrom google.cloud.bigquery.dbapi.types import DateFromTicks\nfrom google.cloud.bigquery.dbapi.types import Time\nfrom google.cloud.bigquery.dbapi.types import TimeFromTicks\nfrom google.cloud.bigquery.dbapi.types import Timestamp\nfrom google.cloud.bigquery.dbapi.types import TimestampFromTicks\nfrom google.cloud.bigquery.dbapi.types import BINARY\nfrom google.cloud.bigquery.dbapi.types import DATETIME\nfrom google.cloud.bigquery.dbapi.types import NUMBER\nfrom google.cloud.bigquery.dbapi.types import ROWID\nfrom google.cloud.bigquery.dbapi.types import STRING\n\n\napilevel = '2.0'\n\n# Threads may share the module and connections, but not cursors.\nthreadsafety = 2\n\nparamstyle = 'pyformat'\n\n__all__ = [\n 'apilevel', 'threadsafety', 'paramstyle', 'connect', 'Connection',\n 'Cursor', 'Warning', 'Error', 'InterfaceError', 'DatabaseError',\n 'DataError', 'OperationalError', 'IntegrityError', 'InternalError',\n 'ProgrammingError', 'NotSupportedError', 'Binary', 'Date', 'DateFromTicks',\n 'Time', 'TimeFromTicks', 'Timestamp', 'TimestampFromTicks', 'BINARY',\n 'DATETIME', 'NUMBER', 'ROWID', 'STRING',\n]\n", "path": "bigquery/google/cloud/bigquery/dbapi/__init__.py"}]} | 1,213 | 130 |
gh_patches_debug_25995 | rasdani/github-patches | git_diff | pytorch__vision-2979 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[DOC] Documentation for sigmoid focal loss
## 📚 Documentation
Sigmoid focal loss was added to torchvision, but I think docs are not generated for it
It was initialized in `init` as well [here](https://github.com/pytorch/vision/blob/v0.8.0/torchvision/ops/__init__.py#L11)
I think it's a really minor fix; we might need `..autodoc::sigmoid_focal_loss` and it should be fine.
--- END ISSUE ---
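For illustration, a minimal usage sketch of the function whose documentation is at issue; the signature follows the file shown below, and the `torchvision.ops` import path is the one referenced in the issue link:

```python
import torch
from torchvision.ops import sigmoid_focal_loss

inputs = torch.randn(8, 4)                      # raw logits, arbitrary shape
targets = torch.randint(0, 2, (8, 4)).float()   # 1 = positive class, 0 = negative class

loss = sigmoid_focal_loss(inputs, targets, alpha=0.25, gamma=2, reduction="mean")
print(loss.item())
```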
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `torchvision/ops/focal_loss.py`
Content:
```
1 import torch
2 import torch.nn.functional as F
3
4
5 def sigmoid_focal_loss(
6 inputs,
7 targets,
8 alpha: float = 0.25,
9 gamma: float = 2,
10 reduction: str = "none",
11 ):
12 """
13 Original implementation from https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/focal_loss.py .
14 Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002.
15 Args:
16 inputs: A float tensor of arbitrary shape.
17 The predictions for each example.
18 targets: A float tensor with the same shape as inputs. Stores the binary
19 classification label for each element in inputs
20 (0 for the negative class and 1 for the positive class).
21 alpha: (optional) Weighting factor in range (0,1) to balance
22 positive vs negative examples or -1 for ignore. Default = 0.25
23 gamma: Exponent of the modulating factor (1 - p_t) to
24 balance easy vs hard examples.
25 reduction: 'none' | 'mean' | 'sum'
26 'none': No reduction will be applied to the output.
27 'mean': The output will be averaged.
28 'sum': The output will be summed.
29 Returns:
30 Loss tensor with the reduction option applied.
31 """
32 p = torch.sigmoid(inputs)
33 ce_loss = F.binary_cross_entropy_with_logits(
34 inputs, targets, reduction="none"
35 )
36 p_t = p * targets + (1 - p) * (1 - targets)
37 loss = ce_loss * ((1 - p_t) ** gamma)
38
39 if alpha >= 0:
40 alpha_t = alpha * targets + (1 - alpha) * (1 - targets)
41 loss = alpha_t * loss
42
43 if reduction == "mean":
44 loss = loss.mean()
45 elif reduction == "sum":
46 loss = loss.sum()
47
48 return loss
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/torchvision/ops/focal_loss.py b/torchvision/ops/focal_loss.py
--- a/torchvision/ops/focal_loss.py
+++ b/torchvision/ops/focal_loss.py
@@ -3,8 +3,8 @@
def sigmoid_focal_loss(
- inputs,
- targets,
+ inputs: torch.Tensor,
+ targets: torch.Tensor,
alpha: float = 0.25,
gamma: float = 2,
reduction: str = "none",
@@ -12,11 +12,12 @@
"""
Original implementation from https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/focal_loss.py .
Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002.
- Args:
+
+ Arguments:
inputs: A float tensor of arbitrary shape.
The predictions for each example.
targets: A float tensor with the same shape as inputs. Stores the binary
- classification label for each element in inputs
+ classification label for each element in inputs
(0 for the negative class and 1 for the positive class).
alpha: (optional) Weighting factor in range (0,1) to balance
positive vs negative examples or -1 for ignore. Default = 0.25
| {"golden_diff": "diff --git a/torchvision/ops/focal_loss.py b/torchvision/ops/focal_loss.py\n--- a/torchvision/ops/focal_loss.py\n+++ b/torchvision/ops/focal_loss.py\n@@ -3,8 +3,8 @@\n \n \n def sigmoid_focal_loss(\n- inputs,\n- targets,\n+ inputs: torch.Tensor,\n+ targets: torch.Tensor,\n alpha: float = 0.25,\n gamma: float = 2,\n reduction: str = \"none\",\n@@ -12,11 +12,12 @@\n \"\"\"\n Original implementation from https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/focal_loss.py .\n Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002.\n- Args:\n+\n+ Arguments:\n inputs: A float tensor of arbitrary shape.\n The predictions for each example.\n targets: A float tensor with the same shape as inputs. Stores the binary\n- classification label for each element in inputs\n+ classification label for each element in inputs\n (0 for the negative class and 1 for the positive class).\n alpha: (optional) Weighting factor in range (0,1) to balance\n positive vs negative examples or -1 for ignore. Default = 0.25\n", "issue": "[DOC] Documentation for sigmoid focal loss\n## \ud83d\udcda Documentation\r\n\r\nSigmoid focal loss was added to torchvision, but I think docs are not generated for it\r\n\r\nIt was initalized in `init` as well [here](https://github.com/pytorch/vision/blob/v0.8.0/torchvision/ops/__init__.py#L11)\r\n\r\nI think it's really minor fix, we might need `..autodoc::sigmoid_focal_loss` and it should be fine.\n", "before_files": [{"content": "import torch\nimport torch.nn.functional as F\n\n\ndef sigmoid_focal_loss(\n inputs,\n targets,\n alpha: float = 0.25,\n gamma: float = 2,\n reduction: str = \"none\",\n):\n \"\"\"\n Original implementation from https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/focal_loss.py .\n Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002.\n Args:\n inputs: A float tensor of arbitrary shape.\n The predictions for each example.\n targets: A float tensor with the same shape as inputs. Stores the binary\n classification label for each element in inputs\n (0 for the negative class and 1 for the positive class).\n alpha: (optional) Weighting factor in range (0,1) to balance\n positive vs negative examples or -1 for ignore. 
Default = 0.25\n gamma: Exponent of the modulating factor (1 - p_t) to\n balance easy vs hard examples.\n reduction: 'none' | 'mean' | 'sum'\n 'none': No reduction will be applied to the output.\n 'mean': The output will be averaged.\n 'sum': The output will be summed.\n Returns:\n Loss tensor with the reduction option applied.\n \"\"\"\n p = torch.sigmoid(inputs)\n ce_loss = F.binary_cross_entropy_with_logits(\n inputs, targets, reduction=\"none\"\n )\n p_t = p * targets + (1 - p) * (1 - targets)\n loss = ce_loss * ((1 - p_t) ** gamma)\n\n if alpha >= 0:\n alpha_t = alpha * targets + (1 - alpha) * (1 - targets)\n loss = alpha_t * loss\n\n if reduction == \"mean\":\n loss = loss.mean()\n elif reduction == \"sum\":\n loss = loss.sum()\n\n return loss\n", "path": "torchvision/ops/focal_loss.py"}], "after_files": [{"content": "import torch\nimport torch.nn.functional as F\n\n\ndef sigmoid_focal_loss(\n inputs: torch.Tensor,\n targets: torch.Tensor,\n alpha: float = 0.25,\n gamma: float = 2,\n reduction: str = \"none\",\n):\n \"\"\"\n Original implementation from https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/focal_loss.py .\n Loss used in RetinaNet for dense detection: https://arxiv.org/abs/1708.02002.\n\n Arguments:\n inputs: A float tensor of arbitrary shape.\n The predictions for each example.\n targets: A float tensor with the same shape as inputs. Stores the binary\n classification label for each element in inputs\n (0 for the negative class and 1 for the positive class).\n alpha: (optional) Weighting factor in range (0,1) to balance\n positive vs negative examples or -1 for ignore. Default = 0.25\n gamma: Exponent of the modulating factor (1 - p_t) to\n balance easy vs hard examples.\n reduction: 'none' | 'mean' | 'sum'\n 'none': No reduction will be applied to the output.\n 'mean': The output will be averaged.\n 'sum': The output will be summed.\n Returns:\n Loss tensor with the reduction option applied.\n \"\"\"\n p = torch.sigmoid(inputs)\n ce_loss = F.binary_cross_entropy_with_logits(\n inputs, targets, reduction=\"none\"\n )\n p_t = p * targets + (1 - p) * (1 - targets)\n loss = ce_loss * ((1 - p_t) ** gamma)\n\n if alpha >= 0:\n alpha_t = alpha * targets + (1 - alpha) * (1 - targets)\n loss = alpha_t * loss\n\n if reduction == \"mean\":\n loss = loss.mean()\n elif reduction == \"sum\":\n loss = loss.sum()\n\n return loss\n", "path": "torchvision/ops/focal_loss.py"}]} | 880 | 301 |
gh_patches_debug_41570 | rasdani/github-patches | git_diff | genialis__resolwe-328 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
500 error returned when permission does not exist
We should return a 400 Bad Request response.
Failing unit test: https://github.com/genialis/resolwe/pull/309
--- END ISSUE ---
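For illustration, one way to surface an unknown permission as a client error is to validate the codename before handing it to the permissions backend; DRF's `ParseError` renders as HTTP 400, whereas an unhandled lookup failure in the backend is what otherwise bubbles up as a 500. The helper name below is hypothetical:

```python
from rest_framework import exceptions

def validate_permission(perm, full_permissions, content_type):
    """Translate an unknown permission into a 400 instead of an uncaught 500."""
    perm_codename = '{}_{}'.format(perm.lower(), content_type)
    if perm_codename not in full_permissions:
        # exceptions.ParseError is rendered by DRF as "400 Bad Request".
        raise exceptions.ParseError("Unknown permission: {}".format(perm))
    return perm_codename
```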
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `resolwe/permissions/utils.py`
Content:
```
1 """.. Ignore pydocstyle D400.
2
3 =================
4 Permissions utils
5 =================
6
7 .. autofunction:: copy_permissions
8
9 """
10 from __future__ import absolute_import, division, print_function, unicode_literals
11
12 import copy
13
14 from django.contrib.auth import get_user_model
15 from django.contrib.auth.models import AnonymousUser, Group
16 from django.contrib.contenttypes.models import ContentType
17
18 from guardian.models import GroupObjectPermission, UserObjectPermission
19 from guardian.shortcuts import assign_perm, remove_perm
20 from rest_framework import exceptions
21
22
23 def copy_permissions(src_obj, dest_obj):
24 """Copy permissions form ``src_obj`` to ``dest_obj``."""
25 src_obj_ctype = ContentType.objects.get_for_model(src_obj)
26 dest_obj_ctype = ContentType.objects.get_for_model(dest_obj)
27
28 if src_obj_ctype != dest_obj_ctype:
29 raise AssertionError('Content types of source and destination objects are not equal.')
30
31 for perm in UserObjectPermission.objects.filter(object_pk=src_obj.pk, content_type=src_obj_ctype):
32 assign_perm(perm.permission.codename, perm.user, dest_obj)
33 for perm in GroupObjectPermission.objects.filter(object_pk=src_obj.pk, content_type=src_obj_ctype):
34 assign_perm(perm.permission.codename, perm.group, dest_obj)
35
36
37 def fetch_user(query):
38 """Get user by ``pk`` or ``username``. Return ``None`` if doesn't exist."""
39 user_filter = {'pk': query} if query.isdigit() else {'username': query}
40 return get_user_model().objects.get(**user_filter)
41
42
43 def fetch_group(query):
44 """Get group by ``pk`` or ``name``. Return ``None`` if doesn't exist."""
45 group_filter = {'pk': query} if query.isdigit() else {'name': query}
46 return Group.objects.get(**group_filter)
47
48
49 def check_owner_permission(payload, allow_user_owner):
50 """Raise ``PermissionDenied``if ``owner`` found in ``data``."""
51 for entity_type in ['users', 'groups']:
52 for perm_type in ['add', 'remove']:
53 for perms in payload.get(entity_type, {}).get(perm_type, {}).values():
54 if 'owner' in perms:
55 if entity_type == 'users' and allow_user_owner:
56 continue
57
58 if entity_type == 'groups':
59 raise exceptions.ParseError("Owner permission cannot be assigned to a group")
60
61 raise exceptions.PermissionDenied("Only owners can grant/revoke owner permission")
62
63
64 def check_public_permissions(payload):
65 """Raise ``PermissionDenied`` if public permissions are too open."""
66 allowed_public_permissions = ['view', 'add', 'download']
67 for perm_type in ['add', 'remove']:
68 for perm in payload.get('public', {}).get(perm_type, []):
69 if perm not in allowed_public_permissions:
70 raise exceptions.PermissionDenied("Permissions for public users are too open")
71
72
73 def check_user_permissions(payload, user_pk):
74 """Raise ``PermissionDenied`` if ``payload`` includes ``user_pk``."""
75 for perm_type in ['add', 'remove']:
76 user_pks = payload.get('users', {}).get(perm_type, {}).keys()
77 if user_pk in user_pks:
78 raise exceptions.PermissionDenied("You cannot change your own permissions")
79
80
81 def remove_permission(payload, permission):
82 """Remove all occurrences of ``permission`` from ``payload``."""
83 payload = copy.deepcopy(payload)
84
85 for entity_type in ['users', 'groups']:
86 for perm_type in ['add', 'remove']:
87 for perms in payload.get(entity_type, {}).get(perm_type, {}).values():
88 if permission in perms:
89 perms.remove(permission)
90
91 for perm_type in ['add', 'remove']:
92 perms = payload.get('public', {}).get(perm_type, [])
93 if permission in perms:
94 perms.remove(permission)
95
96 return payload
97
98
99 def update_permission(obj, data):
100 """Update object permissions."""
101 content_type = ContentType.objects.get_for_model(obj)
102 full_permissions = list(zip(*obj._meta.permissions))[0] # pylint: disable=protected-access
103
104 def set_permissions(entity_type, perm_type):
105 """Set object permissions."""
106 perm_func = assign_perm if perm_type == 'add' else remove_perm
107 fetch_fn = fetch_user if entity_type == 'users' else fetch_group
108
109 for entity_id in data.get(entity_type, {}).get(perm_type, []):
110 entity = fetch_fn(entity_id)
111 if entity:
112 perms = data[entity_type][perm_type][entity_id]
113 if perms == u'ALL':
114 perms = full_permissions
115 for perm in perms:
116 perm_func('{}_{}'.format(perm.lower(), content_type), entity, obj)
117
118 set_permissions('users', 'add')
119 set_permissions('users', 'remove')
120 set_permissions('groups', 'add')
121 set_permissions('groups', 'remove')
122
123 def set_public_permissions(perm_type):
124 """Set public permissions."""
125 perm_func = assign_perm if perm_type == 'add' else remove_perm
126 user = AnonymousUser()
127 perms = data.get('public', {}).get(perm_type, [])
128 if perms == u'ALL':
129 perms = full_permissions
130 for perm in perms:
131 perm_func('{}_{}'.format(perm.lower(), content_type), user, obj)
132
133 set_public_permissions('add')
134 set_public_permissions('remove')
135
136
137 def assign_contributor_permissions(obj):
138 """Assign all permissions to object's contributor."""
139 for permission in list(zip(*obj._meta.permissions))[0]: # pylint: disable=protected-access
140 assign_perm(permission, obj.contributor, obj)
141
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/resolwe/permissions/utils.py b/resolwe/permissions/utils.py
--- a/resolwe/permissions/utils.py
+++ b/resolwe/permissions/utils.py
@@ -14,6 +14,7 @@
from django.contrib.auth import get_user_model
from django.contrib.auth.models import AnonymousUser, Group
from django.contrib.contenttypes.models import ContentType
+from django.db import transaction
from guardian.models import GroupObjectPermission, UserObjectPermission
from guardian.shortcuts import assign_perm, remove_perm
@@ -101,6 +102,35 @@
content_type = ContentType.objects.get_for_model(obj)
full_permissions = list(zip(*obj._meta.permissions))[0] # pylint: disable=protected-access
+ def apply_perm(perm_func, perms, entity):
+ """Apply permissions using given ``perm_func``.
+
+ ``perm_func`` is intended to be ``assign_perms`` or
+ ``remove_perms`` shortcut function from ``django-guardian``, but
+ can be any function that accepts permission codename,
+ user/group and object parameters (in this order).
+
+ If given permission does not exist, ``exceptions.ParseError`` is
+ raised.
+
+ "ALL" passed as ``perms`` parameter, will call ``perm_function``
+ with ``full_permissions`` list.
+
+ :param func perm_func: Permissions function to be applied
+ :param list params: list of params to be allpied
+ :param entity: user or group to be passed to ``perm_func``
+ :type entity: `~django.contrib.auth.models.User` or
+ `~django.contrib.auth.models.Group`
+
+ """
+ if perms == u'ALL':
+ perms = full_permissions
+ for perm in perms:
+ perm_codename = '{}_{}'.format(perm.lower(), content_type)
+ if perm_codename not in full_permissions:
+ raise exceptions.ParseError("Unknown permission: {}".format(perm))
+ perm_func(perm_codename, entity, obj)
+
def set_permissions(entity_type, perm_type):
"""Set object permissions."""
perm_func = assign_perm if perm_type == 'add' else remove_perm
@@ -110,28 +140,22 @@
entity = fetch_fn(entity_id)
if entity:
perms = data[entity_type][perm_type][entity_id]
- if perms == u'ALL':
- perms = full_permissions
- for perm in perms:
- perm_func('{}_{}'.format(perm.lower(), content_type), entity, obj)
-
- set_permissions('users', 'add')
- set_permissions('users', 'remove')
- set_permissions('groups', 'add')
- set_permissions('groups', 'remove')
+ apply_perm(perm_func, perms, entity)
def set_public_permissions(perm_type):
"""Set public permissions."""
perm_func = assign_perm if perm_type == 'add' else remove_perm
user = AnonymousUser()
perms = data.get('public', {}).get(perm_type, [])
- if perms == u'ALL':
- perms = full_permissions
- for perm in perms:
- perm_func('{}_{}'.format(perm.lower(), content_type), user, obj)
-
- set_public_permissions('add')
- set_public_permissions('remove')
+ apply_perm(perm_func, perms, user)
+
+ with transaction.atomic():
+ set_permissions('users', 'add')
+ set_permissions('users', 'remove')
+ set_permissions('groups', 'add')
+ set_permissions('groups', 'remove')
+ set_public_permissions('add')
+ set_public_permissions('remove')
def assign_contributor_permissions(obj):
| {"golden_diff": "diff --git a/resolwe/permissions/utils.py b/resolwe/permissions/utils.py\n--- a/resolwe/permissions/utils.py\n+++ b/resolwe/permissions/utils.py\n@@ -14,6 +14,7 @@\n from django.contrib.auth import get_user_model\n from django.contrib.auth.models import AnonymousUser, Group\n from django.contrib.contenttypes.models import ContentType\n+from django.db import transaction\n \n from guardian.models import GroupObjectPermission, UserObjectPermission\n from guardian.shortcuts import assign_perm, remove_perm\n@@ -101,6 +102,35 @@\n content_type = ContentType.objects.get_for_model(obj)\n full_permissions = list(zip(*obj._meta.permissions))[0] # pylint: disable=protected-access\n \n+ def apply_perm(perm_func, perms, entity):\n+ \"\"\"Apply permissions using given ``perm_func``.\n+\n+ ``perm_func`` is intended to be ``assign_perms`` or\n+ ``remove_perms`` shortcut function from ``django-guardian``, but\n+ can be any function that accepts permission codename,\n+ user/group and object parameters (in this order).\n+\n+ If given permission does not exist, ``exceptions.ParseError`` is\n+ raised.\n+\n+ \"ALL\" passed as ``perms`` parameter, will call ``perm_function``\n+ with ``full_permissions`` list.\n+\n+ :param func perm_func: Permissions function to be applied\n+ :param list params: list of params to be allpied\n+ :param entity: user or group to be passed to ``perm_func``\n+ :type entity: `~django.contrib.auth.models.User` or\n+ `~django.contrib.auth.models.Group`\n+\n+ \"\"\"\n+ if perms == u'ALL':\n+ perms = full_permissions\n+ for perm in perms:\n+ perm_codename = '{}_{}'.format(perm.lower(), content_type)\n+ if perm_codename not in full_permissions:\n+ raise exceptions.ParseError(\"Unknown permission: {}\".format(perm))\n+ perm_func(perm_codename, entity, obj)\n+\n def set_permissions(entity_type, perm_type):\n \"\"\"Set object permissions.\"\"\"\n perm_func = assign_perm if perm_type == 'add' else remove_perm\n@@ -110,28 +140,22 @@\n entity = fetch_fn(entity_id)\n if entity:\n perms = data[entity_type][perm_type][entity_id]\n- if perms == u'ALL':\n- perms = full_permissions\n- for perm in perms:\n- perm_func('{}_{}'.format(perm.lower(), content_type), entity, obj)\n-\n- set_permissions('users', 'add')\n- set_permissions('users', 'remove')\n- set_permissions('groups', 'add')\n- set_permissions('groups', 'remove')\n+ apply_perm(perm_func, perms, entity)\n \n def set_public_permissions(perm_type):\n \"\"\"Set public permissions.\"\"\"\n perm_func = assign_perm if perm_type == 'add' else remove_perm\n user = AnonymousUser()\n perms = data.get('public', {}).get(perm_type, [])\n- if perms == u'ALL':\n- perms = full_permissions\n- for perm in perms:\n- perm_func('{}_{}'.format(perm.lower(), content_type), user, obj)\n-\n- set_public_permissions('add')\n- set_public_permissions('remove')\n+ apply_perm(perm_func, perms, user)\n+\n+ with transaction.atomic():\n+ set_permissions('users', 'add')\n+ set_permissions('users', 'remove')\n+ set_permissions('groups', 'add')\n+ set_permissions('groups', 'remove')\n+ set_public_permissions('add')\n+ set_public_permissions('remove')\n \n \n def assign_contributor_permissions(obj):\n", "issue": "500 error returned when permission does not exist\nWe should return 400 Bad request response.\r\n\r\nFailing unit test: https://github.com/genialis/resolwe/pull/309\n", "before_files": [{"content": "\"\"\".. Ignore pydocstyle D400.\n\n=================\nPermissions utils\n=================\n\n.. 
autofunction:: copy_permissions\n\n\"\"\"\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport copy\n\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.auth.models import AnonymousUser, Group\nfrom django.contrib.contenttypes.models import ContentType\n\nfrom guardian.models import GroupObjectPermission, UserObjectPermission\nfrom guardian.shortcuts import assign_perm, remove_perm\nfrom rest_framework import exceptions\n\n\ndef copy_permissions(src_obj, dest_obj):\n \"\"\"Copy permissions form ``src_obj`` to ``dest_obj``.\"\"\"\n src_obj_ctype = ContentType.objects.get_for_model(src_obj)\n dest_obj_ctype = ContentType.objects.get_for_model(dest_obj)\n\n if src_obj_ctype != dest_obj_ctype:\n raise AssertionError('Content types of source and destination objects are not equal.')\n\n for perm in UserObjectPermission.objects.filter(object_pk=src_obj.pk, content_type=src_obj_ctype):\n assign_perm(perm.permission.codename, perm.user, dest_obj)\n for perm in GroupObjectPermission.objects.filter(object_pk=src_obj.pk, content_type=src_obj_ctype):\n assign_perm(perm.permission.codename, perm.group, dest_obj)\n\n\ndef fetch_user(query):\n \"\"\"Get user by ``pk`` or ``username``. Return ``None`` if doesn't exist.\"\"\"\n user_filter = {'pk': query} if query.isdigit() else {'username': query}\n return get_user_model().objects.get(**user_filter)\n\n\ndef fetch_group(query):\n \"\"\"Get group by ``pk`` or ``name``. Return ``None`` if doesn't exist.\"\"\"\n group_filter = {'pk': query} if query.isdigit() else {'name': query}\n return Group.objects.get(**group_filter)\n\n\ndef check_owner_permission(payload, allow_user_owner):\n \"\"\"Raise ``PermissionDenied``if ``owner`` found in ``data``.\"\"\"\n for entity_type in ['users', 'groups']:\n for perm_type in ['add', 'remove']:\n for perms in payload.get(entity_type, {}).get(perm_type, {}).values():\n if 'owner' in perms:\n if entity_type == 'users' and allow_user_owner:\n continue\n\n if entity_type == 'groups':\n raise exceptions.ParseError(\"Owner permission cannot be assigned to a group\")\n\n raise exceptions.PermissionDenied(\"Only owners can grant/revoke owner permission\")\n\n\ndef check_public_permissions(payload):\n \"\"\"Raise ``PermissionDenied`` if public permissions are too open.\"\"\"\n allowed_public_permissions = ['view', 'add', 'download']\n for perm_type in ['add', 'remove']:\n for perm in payload.get('public', {}).get(perm_type, []):\n if perm not in allowed_public_permissions:\n raise exceptions.PermissionDenied(\"Permissions for public users are too open\")\n\n\ndef check_user_permissions(payload, user_pk):\n \"\"\"Raise ``PermissionDenied`` if ``payload`` includes ``user_pk``.\"\"\"\n for perm_type in ['add', 'remove']:\n user_pks = payload.get('users', {}).get(perm_type, {}).keys()\n if user_pk in user_pks:\n raise exceptions.PermissionDenied(\"You cannot change your own permissions\")\n\n\ndef remove_permission(payload, permission):\n \"\"\"Remove all occurrences of ``permission`` from ``payload``.\"\"\"\n payload = copy.deepcopy(payload)\n\n for entity_type in ['users', 'groups']:\n for perm_type in ['add', 'remove']:\n for perms in payload.get(entity_type, {}).get(perm_type, {}).values():\n if permission in perms:\n perms.remove(permission)\n\n for perm_type in ['add', 'remove']:\n perms = payload.get('public', {}).get(perm_type, [])\n if permission in perms:\n perms.remove(permission)\n\n return payload\n\n\ndef update_permission(obj, data):\n \"\"\"Update object 
permissions.\"\"\"\n content_type = ContentType.objects.get_for_model(obj)\n full_permissions = list(zip(*obj._meta.permissions))[0] # pylint: disable=protected-access\n\n def set_permissions(entity_type, perm_type):\n \"\"\"Set object permissions.\"\"\"\n perm_func = assign_perm if perm_type == 'add' else remove_perm\n fetch_fn = fetch_user if entity_type == 'users' else fetch_group\n\n for entity_id in data.get(entity_type, {}).get(perm_type, []):\n entity = fetch_fn(entity_id)\n if entity:\n perms = data[entity_type][perm_type][entity_id]\n if perms == u'ALL':\n perms = full_permissions\n for perm in perms:\n perm_func('{}_{}'.format(perm.lower(), content_type), entity, obj)\n\n set_permissions('users', 'add')\n set_permissions('users', 'remove')\n set_permissions('groups', 'add')\n set_permissions('groups', 'remove')\n\n def set_public_permissions(perm_type):\n \"\"\"Set public permissions.\"\"\"\n perm_func = assign_perm if perm_type == 'add' else remove_perm\n user = AnonymousUser()\n perms = data.get('public', {}).get(perm_type, [])\n if perms == u'ALL':\n perms = full_permissions\n for perm in perms:\n perm_func('{}_{}'.format(perm.lower(), content_type), user, obj)\n\n set_public_permissions('add')\n set_public_permissions('remove')\n\n\ndef assign_contributor_permissions(obj):\n \"\"\"Assign all permissions to object's contributor.\"\"\"\n for permission in list(zip(*obj._meta.permissions))[0]: # pylint: disable=protected-access\n assign_perm(permission, obj.contributor, obj)\n", "path": "resolwe/permissions/utils.py"}], "after_files": [{"content": "\"\"\".. Ignore pydocstyle D400.\n\n=================\nPermissions utils\n=================\n\n.. autofunction:: copy_permissions\n\n\"\"\"\nfrom __future__ import absolute_import, division, print_function, unicode_literals\n\nimport copy\n\nfrom django.contrib.auth import get_user_model\nfrom django.contrib.auth.models import AnonymousUser, Group\nfrom django.contrib.contenttypes.models import ContentType\nfrom django.db import transaction\n\nfrom guardian.models import GroupObjectPermission, UserObjectPermission\nfrom guardian.shortcuts import assign_perm, remove_perm\nfrom rest_framework import exceptions\n\n\ndef copy_permissions(src_obj, dest_obj):\n \"\"\"Copy permissions form ``src_obj`` to ``dest_obj``.\"\"\"\n src_obj_ctype = ContentType.objects.get_for_model(src_obj)\n dest_obj_ctype = ContentType.objects.get_for_model(dest_obj)\n\n if src_obj_ctype != dest_obj_ctype:\n raise AssertionError('Content types of source and destination objects are not equal.')\n\n for perm in UserObjectPermission.objects.filter(object_pk=src_obj.pk, content_type=src_obj_ctype):\n assign_perm(perm.permission.codename, perm.user, dest_obj)\n for perm in GroupObjectPermission.objects.filter(object_pk=src_obj.pk, content_type=src_obj_ctype):\n assign_perm(perm.permission.codename, perm.group, dest_obj)\n\n\ndef fetch_user(query):\n \"\"\"Get user by ``pk`` or ``username``. Return ``None`` if doesn't exist.\"\"\"\n user_filter = {'pk': query} if query.isdigit() else {'username': query}\n return get_user_model().objects.get(**user_filter)\n\n\ndef fetch_group(query):\n \"\"\"Get group by ``pk`` or ``name``. 
Return ``None`` if doesn't exist.\"\"\"\n group_filter = {'pk': query} if query.isdigit() else {'name': query}\n return Group.objects.get(**group_filter)\n\n\ndef check_owner_permission(payload, allow_user_owner):\n \"\"\"Raise ``PermissionDenied``if ``owner`` found in ``data``.\"\"\"\n for entity_type in ['users', 'groups']:\n for perm_type in ['add', 'remove']:\n for perms in payload.get(entity_type, {}).get(perm_type, {}).values():\n if 'owner' in perms:\n if entity_type == 'users' and allow_user_owner:\n continue\n\n if entity_type == 'groups':\n raise exceptions.ParseError(\"Owner permission cannot be assigned to a group\")\n\n raise exceptions.PermissionDenied(\"Only owners can grant/revoke owner permission\")\n\n\ndef check_public_permissions(payload):\n \"\"\"Raise ``PermissionDenied`` if public permissions are too open.\"\"\"\n allowed_public_permissions = ['view', 'add', 'download']\n for perm_type in ['add', 'remove']:\n for perm in payload.get('public', {}).get(perm_type, []):\n if perm not in allowed_public_permissions:\n raise exceptions.PermissionDenied(\"Permissions for public users are too open\")\n\n\ndef check_user_permissions(payload, user_pk):\n \"\"\"Raise ``PermissionDenied`` if ``payload`` includes ``user_pk``.\"\"\"\n for perm_type in ['add', 'remove']:\n user_pks = payload.get('users', {}).get(perm_type, {}).keys()\n if user_pk in user_pks:\n raise exceptions.PermissionDenied(\"You cannot change your own permissions\")\n\n\ndef remove_permission(payload, permission):\n \"\"\"Remove all occurrences of ``permission`` from ``payload``.\"\"\"\n payload = copy.deepcopy(payload)\n\n for entity_type in ['users', 'groups']:\n for perm_type in ['add', 'remove']:\n for perms in payload.get(entity_type, {}).get(perm_type, {}).values():\n if permission in perms:\n perms.remove(permission)\n\n for perm_type in ['add', 'remove']:\n perms = payload.get('public', {}).get(perm_type, [])\n if permission in perms:\n perms.remove(permission)\n\n return payload\n\n\ndef update_permission(obj, data):\n \"\"\"Update object permissions.\"\"\"\n content_type = ContentType.objects.get_for_model(obj)\n full_permissions = list(zip(*obj._meta.permissions))[0] # pylint: disable=protected-access\n\n def apply_perm(perm_func, perms, entity):\n \"\"\"Apply permissions using given ``perm_func``.\n\n ``perm_func`` is intended to be ``assign_perms`` or\n ``remove_perms`` shortcut function from ``django-guardian``, but\n can be any function that accepts permission codename,\n user/group and object parameters (in this order).\n\n If given permission does not exist, ``exceptions.ParseError`` is\n raised.\n\n \"ALL\" passed as ``perms`` parameter, will call ``perm_function``\n with ``full_permissions`` list.\n\n :param func perm_func: Permissions function to be applied\n :param list params: list of params to be allpied\n :param entity: user or group to be passed to ``perm_func``\n :type entity: `~django.contrib.auth.models.User` or\n `~django.contrib.auth.models.Group`\n\n \"\"\"\n if perms == u'ALL':\n perms = full_permissions\n for perm in perms:\n perm_codename = '{}_{}'.format(perm.lower(), content_type)\n if perm_codename not in full_permissions:\n raise exceptions.ParseError(\"Unknown permission: {}\".format(perm))\n perm_func(perm_codename, entity, obj)\n\n def set_permissions(entity_type, perm_type):\n \"\"\"Set object permissions.\"\"\"\n perm_func = assign_perm if perm_type == 'add' else remove_perm\n fetch_fn = fetch_user if entity_type == 'users' else fetch_group\n\n for entity_id in 
data.get(entity_type, {}).get(perm_type, []):\n entity = fetch_fn(entity_id)\n if entity:\n perms = data[entity_type][perm_type][entity_id]\n apply_perm(perm_func, perms, entity)\n\n def set_public_permissions(perm_type):\n \"\"\"Set public permissions.\"\"\"\n perm_func = assign_perm if perm_type == 'add' else remove_perm\n user = AnonymousUser()\n perms = data.get('public', {}).get(perm_type, [])\n apply_perm(perm_func, perms, user)\n\n with transaction.atomic():\n set_permissions('users', 'add')\n set_permissions('users', 'remove')\n set_permissions('groups', 'add')\n set_permissions('groups', 'remove')\n set_public_permissions('add')\n set_public_permissions('remove')\n\n\ndef assign_contributor_permissions(obj):\n \"\"\"Assign all permissions to object's contributor.\"\"\"\n for permission in list(zip(*obj._meta.permissions))[0]: # pylint: disable=protected-access\n assign_perm(permission, obj.contributor, obj)\n", "path": "resolwe/permissions/utils.py"}]} | 1,810 | 814 |
gh_patches_debug_3415 | rasdani/github-patches | git_diff | pyro-ppl__numpyro-1041 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
New versions of sphinx and jinja2 break docs linting
This is observed by @tcbegley in #1034
--- END ISSUE ---
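For context, the documentation dependencies live in the `doc` extra of `setup.py`; a sketch of how that extra can be declared so that docs builds follow current Sphinx releases instead of pinning the old `jinja2`/`sphinx` versions (the `nbsphinx` minimum is taken from the accompanying patch):

```python
# setup.py (excerpt) -- documentation dependencies declared as an extra.
extras_require = {
    "doc": [
        "ipython",          # sphinx needs this to render code cells
        "nbsphinx>=0.8.5",  # minimum version taken from the accompanying patch
        "sphinx",           # unpinned so the docs build tracks current releases
        "sphinx_rtd_theme",
        "sphinx-gallery",
    ],
}
```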
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 # Copyright Contributors to the Pyro project.
2 # SPDX-License-Identifier: Apache-2.0
3
4 from __future__ import absolute_import, division, print_function
5
6 import os
7 import sys
8
9 from setuptools import find_packages, setup
10
11 PROJECT_PATH = os.path.dirname(os.path.abspath(__file__))
12
13 # Find version
14 for line in open(os.path.join(PROJECT_PATH, "numpyro", "version.py")):
15 if line.startswith("__version__ = "):
16 version = line.strip().split()[2][1:-1]
17
18 # READ README.md for long description on PyPi.
19 try:
20 long_description = open("README.md", encoding="utf-8").read()
21 except Exception as e:
22 sys.stderr.write("Failed to read README.md:\n {}\n".format(e))
23 sys.stderr.flush()
24 long_description = ""
25
26
27 setup(
28 name="numpyro",
29 version=version,
30 description="Pyro PPL on NumPy",
31 packages=find_packages(include=["numpyro", "numpyro.*"]),
32 url="https://github.com/pyro-ppl/numpyro",
33 author="Uber AI Labs",
34 install_requires=[
35 "jax>=0.2.11",
36 "jaxlib>=0.1.62",
37 "tqdm",
38 ],
39 extras_require={
40 "doc": [
41 "ipython", # sphinx needs this to render codes
42 "jinja2<3.0.0",
43 "nbsphinx",
44 "sphinx<4.0.0",
45 "sphinx_rtd_theme",
46 "sphinx-gallery",
47 ],
48 "test": [
49 "black",
50 "flake8",
51 "isort>=5.0",
52 "pytest>=4.1",
53 "pyro-api>=0.1.1",
54 "scipy>=1.1",
55 ],
56 "dev": [
57 "dm-haiku",
58 "flax",
59 # TODO: bump funsor version before the release
60 "funsor @ git+https://github.com/pyro-ppl/funsor.git@d5574988665dd822ec64e41f2b54b9dc929959dc",
61 "graphviz",
62 "optax==0.0.6",
63 # TODO: change this to tensorflow_probability>0.12.1 when the next version
64 # of tfp is released. The current release is not compatible with jax>=0.2.12.
65 "tfp-nightly",
66 ],
67 "examples": ["arviz", "jupyter", "matplotlib", "pandas", "seaborn"],
68 },
69 long_description=long_description,
70 long_description_content_type="text/markdown",
71 keywords="probabilistic machine learning bayesian statistics",
72 license="Apache License 2.0",
73 classifiers=[
74 "Intended Audience :: Developers",
75 "Intended Audience :: Education",
76 "Intended Audience :: Science/Research",
77 "License :: OSI Approved :: Apache Software License",
78 "Operating System :: POSIX :: Linux",
79 "Operating System :: MacOS :: MacOS X",
80 "Programming Language :: Python :: 3.6",
81 "Programming Language :: Python :: 3.7",
82 "Programming Language :: Python :: 3.8",
83 "Programming Language :: Python :: 3.9",
84 ],
85 )
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -39,9 +39,8 @@
extras_require={
"doc": [
"ipython", # sphinx needs this to render codes
- "jinja2<3.0.0",
- "nbsphinx",
- "sphinx<4.0.0",
+ "nbsphinx>=0.8.5",
+ "sphinx",
"sphinx_rtd_theme",
"sphinx-gallery",
],
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -39,9 +39,8 @@\n extras_require={\n \"doc\": [\n \"ipython\", # sphinx needs this to render codes\n- \"jinja2<3.0.0\",\n- \"nbsphinx\",\n- \"sphinx<4.0.0\",\n+ \"nbsphinx>=0.8.5\",\n+ \"sphinx\",\n \"sphinx_rtd_theme\",\n \"sphinx-gallery\",\n ],\n", "issue": "New versions of sphinx and jinja2 break docs linting\nThis is observed by @tcbegley in #1034\n", "before_files": [{"content": "# Copyright Contributors to the Pyro project.\n# SPDX-License-Identifier: Apache-2.0\n\nfrom __future__ import absolute_import, division, print_function\n\nimport os\nimport sys\n\nfrom setuptools import find_packages, setup\n\nPROJECT_PATH = os.path.dirname(os.path.abspath(__file__))\n\n# Find version\nfor line in open(os.path.join(PROJECT_PATH, \"numpyro\", \"version.py\")):\n if line.startswith(\"__version__ = \"):\n version = line.strip().split()[2][1:-1]\n\n# READ README.md for long description on PyPi.\ntry:\n long_description = open(\"README.md\", encoding=\"utf-8\").read()\nexcept Exception as e:\n sys.stderr.write(\"Failed to read README.md:\\n {}\\n\".format(e))\n sys.stderr.flush()\n long_description = \"\"\n\n\nsetup(\n name=\"numpyro\",\n version=version,\n description=\"Pyro PPL on NumPy\",\n packages=find_packages(include=[\"numpyro\", \"numpyro.*\"]),\n url=\"https://github.com/pyro-ppl/numpyro\",\n author=\"Uber AI Labs\",\n install_requires=[\n \"jax>=0.2.11\",\n \"jaxlib>=0.1.62\",\n \"tqdm\",\n ],\n extras_require={\n \"doc\": [\n \"ipython\", # sphinx needs this to render codes\n \"jinja2<3.0.0\",\n \"nbsphinx\",\n \"sphinx<4.0.0\",\n \"sphinx_rtd_theme\",\n \"sphinx-gallery\",\n ],\n \"test\": [\n \"black\",\n \"flake8\",\n \"isort>=5.0\",\n \"pytest>=4.1\",\n \"pyro-api>=0.1.1\",\n \"scipy>=1.1\",\n ],\n \"dev\": [\n \"dm-haiku\",\n \"flax\",\n # TODO: bump funsor version before the release\n \"funsor @ git+https://github.com/pyro-ppl/funsor.git@d5574988665dd822ec64e41f2b54b9dc929959dc\",\n \"graphviz\",\n \"optax==0.0.6\",\n # TODO: change this to tensorflow_probability>0.12.1 when the next version\n # of tfp is released. 
The current release is not compatible with jax>=0.2.12.\n \"tfp-nightly\",\n ],\n \"examples\": [\"arviz\", \"jupyter\", \"matplotlib\", \"pandas\", \"seaborn\"],\n },\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n keywords=\"probabilistic machine learning bayesian statistics\",\n license=\"Apache License 2.0\",\n classifiers=[\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Education\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "# Copyright Contributors to the Pyro project.\n# SPDX-License-Identifier: Apache-2.0\n\nfrom __future__ import absolute_import, division, print_function\n\nimport os\nimport sys\n\nfrom setuptools import find_packages, setup\n\nPROJECT_PATH = os.path.dirname(os.path.abspath(__file__))\n\n# Find version\nfor line in open(os.path.join(PROJECT_PATH, \"numpyro\", \"version.py\")):\n if line.startswith(\"__version__ = \"):\n version = line.strip().split()[2][1:-1]\n\n# READ README.md for long description on PyPi.\ntry:\n long_description = open(\"README.md\", encoding=\"utf-8\").read()\nexcept Exception as e:\n sys.stderr.write(\"Failed to read README.md:\\n {}\\n\".format(e))\n sys.stderr.flush()\n long_description = \"\"\n\n\nsetup(\n name=\"numpyro\",\n version=version,\n description=\"Pyro PPL on NumPy\",\n packages=find_packages(include=[\"numpyro\", \"numpyro.*\"]),\n url=\"https://github.com/pyro-ppl/numpyro\",\n author=\"Uber AI Labs\",\n install_requires=[\n \"jax>=0.2.11\",\n \"jaxlib>=0.1.62\",\n \"tqdm\",\n ],\n extras_require={\n \"doc\": [\n \"ipython\", # sphinx needs this to render codes\n \"nbsphinx>=0.8.5\",\n \"sphinx\",\n \"sphinx_rtd_theme\",\n \"sphinx-gallery\",\n ],\n \"test\": [\n \"black\",\n \"flake8\",\n \"isort>=5.0\",\n \"pytest>=4.1\",\n \"pyro-api>=0.1.1\",\n \"scipy>=1.1\",\n ],\n \"dev\": [\n \"dm-haiku\",\n \"flax\",\n # TODO: bump funsor version before the release\n \"funsor @ git+https://github.com/pyro-ppl/funsor.git@d5574988665dd822ec64e41f2b54b9dc929959dc\",\n \"graphviz\",\n \"optax==0.0.6\",\n # TODO: change this to tensorflow_probability>0.12.1 when the next version\n # of tfp is released. The current release is not compatible with jax>=0.2.12.\n \"tfp-nightly\",\n ],\n \"examples\": [\"arviz\", \"jupyter\", \"matplotlib\", \"pandas\", \"seaborn\"],\n },\n long_description=long_description,\n long_description_content_type=\"text/markdown\",\n keywords=\"probabilistic machine learning bayesian statistics\",\n license=\"Apache License 2.0\",\n classifiers=[\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Education\",\n \"Intended Audience :: Science/Research\",\n \"License :: OSI Approved :: Apache Software License\",\n \"Operating System :: POSIX :: Linux\",\n \"Operating System :: MacOS :: MacOS X\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n ],\n)\n", "path": "setup.py"}]} | 1,179 | 123 |
gh_patches_debug_15692 | rasdani/github-patches | git_diff | ethereum__web3.py-1198 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Drop repeated rpc endpoints
* Version: 4.6
### What was wrong?
web3.py has a few repeated endpoints, for example: `web3.version.net` and `web3.net.version`
### How can it be fixed?
Deprecate the endpoint APIs that don't mirror the JSON-RPC spec namespacing, in favor of those that do.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `web3/version.py`
Content:
```
1 from web3.module import (
2 Module,
3 )
4
5
6 class Version(Module):
7 @property
8 def api(self):
9 from web3 import __version__
10 return __version__
11
12 @property
13 def node(self):
14 return self.web3.manager.request_blocking("web3_clientVersion", [])
15
16 @property
17 def network(self):
18 return self.web3.manager.request_blocking("net_version", [])
19
20 @property
21 def ethereum(self):
22 return self.web3.manager.request_blocking("eth_protocolVersion", [])
23
```
Path: `web3/miner.py`
Content:
```
1 from web3.module import (
2 Module,
3 )
4
5
6 class Miner(Module):
7 @property
8 def hashrate(self):
9 return self.web3.manager.request_blocking("eth_hashrate", [])
10
11 def makeDAG(self, number):
12 return self.web3.manager.request_blocking("miner_makeDag", [number])
13
14 def setExtra(self, extra):
15 return self.web3.manager.request_blocking("miner_setExtra", [extra])
16
17 def setEtherBase(self, etherbase):
18 return self.web3.manager.request_blocking("miner_setEtherbase", [etherbase])
19
20 def setGasPrice(self, gas_price):
21 return self.web3.manager.request_blocking(
22 "miner_setGasPrice", [gas_price],
23 )
24
25 def start(self, num_threads):
26 return self.web3.manager.request_blocking(
27 "miner_start", [num_threads],
28 )
29
30 def stop(self):
31 return self.web3.manager.request_blocking("miner_stop", [])
32
33 def startAutoDAG(self):
34 return self.web3.manager.request_blocking("miner_startAutoDag", [])
35
36 def stopAutoDAG(self):
37 return self.web3.manager.request_blocking("miner_stopAutoDag", [])
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/web3/miner.py b/web3/miner.py
--- a/web3/miner.py
+++ b/web3/miner.py
@@ -4,10 +4,6 @@
class Miner(Module):
- @property
- def hashrate(self):
- return self.web3.manager.request_blocking("eth_hashrate", [])
-
def makeDAG(self, number):
return self.web3.manager.request_blocking("miner_makeDag", [number])
diff --git a/web3/version.py b/web3/version.py
--- a/web3/version.py
+++ b/web3/version.py
@@ -13,10 +13,6 @@
def node(self):
return self.web3.manager.request_blocking("web3_clientVersion", [])
- @property
- def network(self):
- return self.web3.manager.request_blocking("net_version", [])
-
@property
def ethereum(self):
return self.web3.manager.request_blocking("eth_protocolVersion", [])
| {"golden_diff": "diff --git a/web3/miner.py b/web3/miner.py\n--- a/web3/miner.py\n+++ b/web3/miner.py\n@@ -4,10 +4,6 @@\n \n \n class Miner(Module):\n- @property\n- def hashrate(self):\n- return self.web3.manager.request_blocking(\"eth_hashrate\", [])\n-\n def makeDAG(self, number):\n return self.web3.manager.request_blocking(\"miner_makeDag\", [number])\n \ndiff --git a/web3/version.py b/web3/version.py\n--- a/web3/version.py\n+++ b/web3/version.py\n@@ -13,10 +13,6 @@\n def node(self):\n return self.web3.manager.request_blocking(\"web3_clientVersion\", [])\n \n- @property\n- def network(self):\n- return self.web3.manager.request_blocking(\"net_version\", [])\n-\n @property\n def ethereum(self):\n return self.web3.manager.request_blocking(\"eth_protocolVersion\", [])\n", "issue": "Drop repeated rpc endpoints\n* Version: 4.6\r\n\r\n### What was wrong?\r\n\r\nweb3.py has a few repeated endpoints, for example: `web3.version.net` and `web3.net.version`\r\n\r\n\r\n### How can it be fixed?\r\nDeprecate the endpoint api's that don't mirror the json-rpc spec namespacing, in favor of those that do.\n", "before_files": [{"content": "from web3.module import (\n Module,\n)\n\n\nclass Version(Module):\n @property\n def api(self):\n from web3 import __version__\n return __version__\n\n @property\n def node(self):\n return self.web3.manager.request_blocking(\"web3_clientVersion\", [])\n\n @property\n def network(self):\n return self.web3.manager.request_blocking(\"net_version\", [])\n\n @property\n def ethereum(self):\n return self.web3.manager.request_blocking(\"eth_protocolVersion\", [])\n", "path": "web3/version.py"}, {"content": "from web3.module import (\n Module,\n)\n\n\nclass Miner(Module):\n @property\n def hashrate(self):\n return self.web3.manager.request_blocking(\"eth_hashrate\", [])\n\n def makeDAG(self, number):\n return self.web3.manager.request_blocking(\"miner_makeDag\", [number])\n\n def setExtra(self, extra):\n return self.web3.manager.request_blocking(\"miner_setExtra\", [extra])\n\n def setEtherBase(self, etherbase):\n return self.web3.manager.request_blocking(\"miner_setEtherbase\", [etherbase])\n\n def setGasPrice(self, gas_price):\n return self.web3.manager.request_blocking(\n \"miner_setGasPrice\", [gas_price],\n )\n\n def start(self, num_threads):\n return self.web3.manager.request_blocking(\n \"miner_start\", [num_threads],\n )\n\n def stop(self):\n return self.web3.manager.request_blocking(\"miner_stop\", [])\n\n def startAutoDAG(self):\n return self.web3.manager.request_blocking(\"miner_startAutoDag\", [])\n\n def stopAutoDAG(self):\n return self.web3.manager.request_blocking(\"miner_stopAutoDag\", [])\n", "path": "web3/miner.py"}], "after_files": [{"content": "from web3.module import (\n Module,\n)\n\n\nclass Version(Module):\n @property\n def api(self):\n from web3 import __version__\n return __version__\n\n @property\n def node(self):\n return self.web3.manager.request_blocking(\"web3_clientVersion\", [])\n\n @property\n def ethereum(self):\n return self.web3.manager.request_blocking(\"eth_protocolVersion\", [])\n", "path": "web3/version.py"}, {"content": "from web3.module import (\n Module,\n)\n\n\nclass Miner(Module):\n def makeDAG(self, number):\n return self.web3.manager.request_blocking(\"miner_makeDag\", [number])\n\n def setExtra(self, extra):\n return self.web3.manager.request_blocking(\"miner_setExtra\", [extra])\n\n def setEtherBase(self, etherbase):\n return self.web3.manager.request_blocking(\"miner_setEtherbase\", [etherbase])\n\n def setGasPrice(self, 
gas_price):\n return self.web3.manager.request_blocking(\n \"miner_setGasPrice\", [gas_price],\n )\n\n def start(self, num_threads):\n return self.web3.manager.request_blocking(\n \"miner_start\", [num_threads],\n )\n\n def stop(self):\n return self.web3.manager.request_blocking(\"miner_stop\", [])\n\n def startAutoDAG(self):\n return self.web3.manager.request_blocking(\"miner_startAutoDag\", [])\n\n def stopAutoDAG(self):\n return self.web3.manager.request_blocking(\"miner_stopAutoDag\", [])\n", "path": "web3/miner.py"}]} | 820 | 215 |
gh_patches_debug_20807 | rasdani/github-patches | git_diff | google__flax-2540 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make RNG name configurable in Dropout
Currently `'dropout'` is hard-coded as an argument of `make_rng` inside `Dropout`. However, when implementing support for "recurrent dropout" in an LSTMCell or similar you need two kinds of dropout:
1. A regular dropout which is applied to the inputs with a different mask at each step.
2. A "recurrent dropout" that is applied to the state with the same mask at each step.
To implement `2`, a possibility is to set the RNG name to `'recurrent_dropout'` on the Dropout layer applied to the state and guarantee that each step uses the same random state. From `nn.scan`'s perspective this would look like:
```python
nn.scan(..., split_rngs={'dropout': True, 'recurrent_dropout': False})
```
The proposal is to add an `rng_name` (or similar) attribute to `Dropout` so we are able to support these kinds of use-cases. The alternative would be to create a separate `RecurrentDropout` layer with the same code but a different hard-coded value.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `flax/linen/stochastic.py`
Content:
```
1 # Copyright 2022 The Flax Authors.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Stochastic modules."""
16
17 from typing import Optional, Sequence
18
19 from flax.linen.module import compact
20 from flax.linen.module import merge_param
21 from flax.linen.module import Module
22 from jax import lax
23 from jax import random
24 import jax.numpy as jnp
25
26
27 class Dropout(Module):
28 """Create a dropout layer.
29
30 Note: When using :meth:`Module.apply() <flax.linen.Module.apply>`, make sure
31 to include an RNG seed named `'dropout'`. For example::
32
33 model.apply({'params': params}, inputs=inputs, train=True, rngs={'dropout': dropout_rng})`
34
35 Attributes:
36 rate: the dropout probability. (_not_ the keep rate!)
37 broadcast_dims: dimensions that will share the same dropout mask
38 deterministic: if false the inputs are scaled by `1 / (1 - rate)` and
39 masked, whereas if true, no mask is applied and the inputs are returned
40 as is.
41 """
42 rate: float
43 broadcast_dims: Sequence[int] = ()
44 deterministic: Optional[bool] = None
45
46 @compact
47 def __call__(self, inputs, deterministic: Optional[bool] = None):
48 """Applies a random dropout mask to the input.
49
50 Args:
51 inputs: the inputs that should be randomly masked.
52 deterministic: if false the inputs are scaled by `1 / (1 - rate)` and
53 masked, whereas if true, no mask is applied and the inputs are returned
54 as is.
55
56 Returns:
57 The masked inputs reweighted to preserve mean.
58 """
59 deterministic = merge_param(
60 'deterministic', self.deterministic, deterministic)
61 if self.rate == 0.:
62 return inputs
63 # Prevent gradient NaNs in 1.0 edge-case.
64 if self.rate == 1.0:
65 return jnp.zeros_like(inputs)
66 keep_prob = 1. - self.rate
67 if deterministic:
68 return inputs
69 else:
70 rng = self.make_rng('dropout')
71 broadcast_shape = list(inputs.shape)
72 for dim in self.broadcast_dims:
73 broadcast_shape[dim] = 1
74 mask = random.bernoulli(rng, p=keep_prob, shape=broadcast_shape)
75 mask = jnp.broadcast_to(mask, inputs.shape)
76 return lax.select(mask, inputs / keep_prob, jnp.zeros_like(inputs))
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/flax/linen/stochastic.py b/flax/linen/stochastic.py
--- a/flax/linen/stochastic.py
+++ b/flax/linen/stochastic.py
@@ -38,10 +38,12 @@
deterministic: if false the inputs are scaled by `1 / (1 - rate)` and
masked, whereas if true, no mask is applied and the inputs are returned
as is.
+ rng_collection: the rng collection name to use when requesting an rng key.
"""
rate: float
broadcast_dims: Sequence[int] = ()
deterministic: Optional[bool] = None
+ rng_collection: str = 'dropout'
@compact
def __call__(self, inputs, deterministic: Optional[bool] = None):
@@ -67,7 +69,7 @@
if deterministic:
return inputs
else:
- rng = self.make_rng('dropout')
+ rng = self.make_rng(self.rng_collection)
broadcast_shape = list(inputs.shape)
for dim in self.broadcast_dims:
broadcast_shape[dim] = 1
| {"golden_diff": "diff --git a/flax/linen/stochastic.py b/flax/linen/stochastic.py\n--- a/flax/linen/stochastic.py\n+++ b/flax/linen/stochastic.py\n@@ -38,10 +38,12 @@\n deterministic: if false the inputs are scaled by `1 / (1 - rate)` and\n masked, whereas if true, no mask is applied and the inputs are returned\n as is.\n+ rng_collection: the rng collection name to use when requesting an rng key.\n \"\"\"\n rate: float\n broadcast_dims: Sequence[int] = ()\n deterministic: Optional[bool] = None\n+ rng_collection: str = 'dropout'\n \n @compact\n def __call__(self, inputs, deterministic: Optional[bool] = None):\n@@ -67,7 +69,7 @@\n if deterministic:\n return inputs\n else:\n- rng = self.make_rng('dropout')\n+ rng = self.make_rng(self.rng_collection)\n broadcast_shape = list(inputs.shape)\n for dim in self.broadcast_dims:\n broadcast_shape[dim] = 1\n", "issue": "Make RNG name configurable in Dropout\nCurrently `'dropout'` is hard-coded as an argument of `make_rng` inside `Dropout`. However, when implementing support for \"recurrent dropout\" in an LSTMCell or similar you need two kinds of dropout:\r\n1. A regular dropout which is applied to the inputs with a different mask at each step.\r\n2. A \"recurrent dropout\" that is applied to the state with the same mask at each step.\r\n\r\nTo implement `2` a possibility is to set the RNG name to `'recurrent_dropout'` on the Dropout layer applied to the state and guarantee that each step uses the same random state. From `nn.scan`'s perspective the would look like this:\r\n\r\n```python\r\nnn.scan(..., split_rngs={'dropout': True, 'recurrent_dropout': False})\r\n```\r\nThe proposal is to add an `rng_name` (or similar) attribute to `Dropout` so we are able support these kind of use-cases. The alternative would be to create a separate `RecurrentDropout` layer with the same code but different hard-coded value.\n", "before_files": [{"content": "# Copyright 2022 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Stochastic modules.\"\"\"\n\nfrom typing import Optional, Sequence\n\nfrom flax.linen.module import compact\nfrom flax.linen.module import merge_param\nfrom flax.linen.module import Module\nfrom jax import lax\nfrom jax import random\nimport jax.numpy as jnp\n\n\nclass Dropout(Module):\n \"\"\"Create a dropout layer.\n\n Note: When using :meth:`Module.apply() <flax.linen.Module.apply>`, make sure\n to include an RNG seed named `'dropout'`. For example::\n\n model.apply({'params': params}, inputs=inputs, train=True, rngs={'dropout': dropout_rng})`\n\n Attributes:\n rate: the dropout probability. 
(_not_ the keep rate!)\n broadcast_dims: dimensions that will share the same dropout mask\n deterministic: if false the inputs are scaled by `1 / (1 - rate)` and\n masked, whereas if true, no mask is applied and the inputs are returned\n as is.\n \"\"\"\n rate: float\n broadcast_dims: Sequence[int] = ()\n deterministic: Optional[bool] = None\n\n @compact\n def __call__(self, inputs, deterministic: Optional[bool] = None):\n \"\"\"Applies a random dropout mask to the input.\n\n Args:\n inputs: the inputs that should be randomly masked.\n deterministic: if false the inputs are scaled by `1 / (1 - rate)` and\n masked, whereas if true, no mask is applied and the inputs are returned\n as is.\n\n Returns:\n The masked inputs reweighted to preserve mean.\n \"\"\"\n deterministic = merge_param(\n 'deterministic', self.deterministic, deterministic)\n if self.rate == 0.:\n return inputs\n # Prevent gradient NaNs in 1.0 edge-case.\n if self.rate == 1.0:\n return jnp.zeros_like(inputs)\n keep_prob = 1. - self.rate\n if deterministic:\n return inputs\n else:\n rng = self.make_rng('dropout')\n broadcast_shape = list(inputs.shape)\n for dim in self.broadcast_dims:\n broadcast_shape[dim] = 1\n mask = random.bernoulli(rng, p=keep_prob, shape=broadcast_shape)\n mask = jnp.broadcast_to(mask, inputs.shape)\n return lax.select(mask, inputs / keep_prob, jnp.zeros_like(inputs))\n", "path": "flax/linen/stochastic.py"}], "after_files": [{"content": "# Copyright 2022 The Flax Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Stochastic modules.\"\"\"\n\nfrom typing import Optional, Sequence\n\nfrom flax.linen.module import compact\nfrom flax.linen.module import merge_param\nfrom flax.linen.module import Module\nfrom jax import lax\nfrom jax import random\nimport jax.numpy as jnp\n\n\nclass Dropout(Module):\n \"\"\"Create a dropout layer.\n\n Note: When using :meth:`Module.apply() <flax.linen.Module.apply>`, make sure\n to include an RNG seed named `'dropout'`. For example::\n\n model.apply({'params': params}, inputs=inputs, train=True, rngs={'dropout': dropout_rng})`\n\n Attributes:\n rate: the dropout probability. 
(_not_ the keep rate!)\n broadcast_dims: dimensions that will share the same dropout mask\n deterministic: if false the inputs are scaled by `1 / (1 - rate)` and\n masked, whereas if true, no mask is applied and the inputs are returned\n as is.\n rng_collection: the rng collection name to use when requesting an rng key.\n \"\"\"\n rate: float\n broadcast_dims: Sequence[int] = ()\n deterministic: Optional[bool] = None\n rng_collection: str = 'dropout'\n\n @compact\n def __call__(self, inputs, deterministic: Optional[bool] = None):\n \"\"\"Applies a random dropout mask to the input.\n\n Args:\n inputs: the inputs that should be randomly masked.\n deterministic: if false the inputs are scaled by `1 / (1 - rate)` and\n masked, whereas if true, no mask is applied and the inputs are returned\n as is.\n\n Returns:\n The masked inputs reweighted to preserve mean.\n \"\"\"\n deterministic = merge_param(\n 'deterministic', self.deterministic, deterministic)\n if self.rate == 0.:\n return inputs\n # Prevent gradient NaNs in 1.0 edge-case.\n if self.rate == 1.0:\n return jnp.zeros_like(inputs)\n keep_prob = 1. - self.rate\n if deterministic:\n return inputs\n else:\n rng = self.make_rng(self.rng_collection)\n broadcast_shape = list(inputs.shape)\n for dim in self.broadcast_dims:\n broadcast_shape[dim] = 1\n mask = random.bernoulli(rng, p=keep_prob, shape=broadcast_shape)\n mask = jnp.broadcast_to(mask, inputs.shape)\n return lax.select(mask, inputs / keep_prob, jnp.zeros_like(inputs))\n", "path": "flax/linen/stochastic.py"}]} | 1,281 | 245 |
gh_patches_debug_3497 | rasdani/github-patches | git_diff | ManimCommunity__manim-646 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The config value `html_favicon' has type `PosixPath'; expected `str'.
BTW when building the docs now I get
```
WARNING: The config value `html_favicon' has type `PosixPath'; expected `str'.
```
_Originally posted by @leotrs in https://github.com/ManimCommunity/manim/pull/631#issuecomment-719075737_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/source/conf.py`
Content:
```
1 # Configuration file for the Sphinx documentation builder.
2 #
3 # This file only contains a selection of the most common options. For a full
4 # list see the documentation:
5 # https://www.sphinx-doc.org/en/master/usage/configuration.html
6
7 # -- Path setup --------------------------------------------------------------
8
9 # If extensions (or modules to document with autodoc) are in another directory,
10 # add these directories to sys.path here. If the directory is relative to the
11 # documentation root, use os.path.abspath to make it absolute, like shown here.
12
13 import os
14 import subprocess
15 import sys
16 from distutils.sysconfig import get_python_lib
17 from pathlib import Path
18
19 sys.path.insert(0, os.path.abspath("."))
20
21
22 if os.environ.get("READTHEDOCS") == "True":
23 site_path = get_python_lib()
24 # bindings for pangocffi, cairocffi, pangocairocffi need to be generated
25 subprocess.run(["python", "pangocffi/ffi_build.py"], cwd=site_path)
26 subprocess.run(["python", "cairocffi/ffi_build.py"], cwd=site_path)
27 subprocess.run(["python", "pangocairocffi/ffi_build.py"], cwd=site_path)
28 # we need to add ffmpeg to the path
29 ffmpeg_path = os.path.join(site_path, "imageio_ffmpeg", "binaries")
30 # the included binary is named ffmpeg-linux..., create a symlink
31 [ffmpeg_bin] = [
32 file for file in os.listdir(ffmpeg_path) if file.startswith("ffmpeg-")
33 ]
34 os.symlink(
35 os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, "ffmpeg")
36 )
37 os.environ["PATH"] += os.pathsep + ffmpeg_path
38
39
40 # -- Project information -----------------------------------------------------
41
42 project = "Manim"
43 copyright = "2020, The Manim Community Dev Team"
44 author = "The Manim Community Dev Team"
45
46
47 # -- General configuration ---------------------------------------------------
48
49 # Add any Sphinx extension module names here, as strings. They can be
50 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
51 # ones.
52 extensions = [
53 "sphinx.ext.autodoc",
54 "recommonmark",
55 "sphinx_copybutton",
56 "sphinx.ext.napoleon",
57 "sphinx.ext.autosummary",
58 "sphinx.ext.doctest",
59 "manim_directive",
60 ]
61
62 # Automatically generate stub pages when using the .. autosummary directive
63 autosummary_generate = True
64
65 # Add any paths that contain templates here, relative to this directory.
66 templates_path = ["_templates"]
67
68 # List of patterns, relative to source directory, that match files and
69 # directories to ignore when looking for source files.
70 # This pattern also affects html_static_path and html_extra_path.
71 exclude_patterns = []
72
73
74 # -- Options for HTML output -------------------------------------------------
75
76 # The theme to use for HTML and HTML Help pages. See the documentation for
77 # a list of builtin themes.
78 #
79 import guzzle_sphinx_theme
80
81 html_theme_path = guzzle_sphinx_theme.html_theme_path()
82 html_theme = "guzzle_sphinx_theme"
83 html_favicon = Path("_static/favicon.ico")
84
85 # There's a standing issue with Sphinx's new-style sidebars. This is a
86 # workaround. Taken from
87 # https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826
88 html_sidebars = {"**": ["logo-text.html", "globaltoc.html", "searchbox.html"]}
89
90 # Register the theme as an extension to generate a sitemap.xml
91 extensions.append("guzzle_sphinx_theme")
92
93 # Add any paths that contain custom static files (such as style sheets) here,
94 # relative to this directory. They are copied after the builtin static files,
95 # so a file named "default.css" will overwrite the builtin "default.css".
96 html_static_path = ["_static"]
97
98 # This specifies any additional css files that will override the theme's
99 html_css_files = ["custom.css"]
100
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docs/source/conf.py b/docs/source/conf.py
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -80,7 +80,7 @@
html_theme_path = guzzle_sphinx_theme.html_theme_path()
html_theme = "guzzle_sphinx_theme"
-html_favicon = Path("_static/favicon.ico")
+html_favicon = str(Path("_static/favicon.ico"))
# There's a standing issue with Sphinx's new-style sidebars. This is a
# workaround. Taken from
| {"golden_diff": "diff --git a/docs/source/conf.py b/docs/source/conf.py\n--- a/docs/source/conf.py\n+++ b/docs/source/conf.py\n@@ -80,7 +80,7 @@\n \n html_theme_path = guzzle_sphinx_theme.html_theme_path()\n html_theme = \"guzzle_sphinx_theme\"\n-html_favicon = Path(\"_static/favicon.ico\")\n+html_favicon = str(Path(\"_static/favicon.ico\"))\n \n # There's a standing issue with Sphinx's new-style sidebars. This is a\n # workaround. Taken from\n", "issue": "The config value `html_favicon' has type `PosixPath'; expected `str'.\nBTW when building the docs now I get\r\n```\r\nWARNING: The config value `html_favicon' has type `PosixPath'; expected `str'.\r\n```\r\n\r\n_Originally posted by @leotrs in https://github.com/ManimCommunity/manim/pull/631#issuecomment-719075737_\n", "before_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n\nimport os\nimport subprocess\nimport sys\nfrom distutils.sysconfig import get_python_lib\nfrom pathlib import Path\n\nsys.path.insert(0, os.path.abspath(\".\"))\n\n\nif os.environ.get(\"READTHEDOCS\") == \"True\":\n site_path = get_python_lib()\n # bindings for pangocffi, cairocffi, pangocairocffi need to be generated\n subprocess.run([\"python\", \"pangocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"cairocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"pangocairocffi/ffi_build.py\"], cwd=site_path)\n # we need to add ffmpeg to the path\n ffmpeg_path = os.path.join(site_path, \"imageio_ffmpeg\", \"binaries\")\n # the included binary is named ffmpeg-linux..., create a symlink\n [ffmpeg_bin] = [\n file for file in os.listdir(ffmpeg_path) if file.startswith(\"ffmpeg-\")\n ]\n os.symlink(\n os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, \"ffmpeg\")\n )\n os.environ[\"PATH\"] += os.pathsep + ffmpeg_path\n\n\n# -- Project information -----------------------------------------------------\n\nproject = \"Manim\"\ncopyright = \"2020, The Manim Community Dev Team\"\nauthor = \"The Manim Community Dev Team\"\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx.ext.autodoc\",\n \"recommonmark\",\n \"sphinx_copybutton\",\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n \"manim_directive\",\n]\n\n# Automatically generate stub pages when using the .. 
autosummary directive\nautosummary_generate = True\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = [\"_templates\"]\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nimport guzzle_sphinx_theme\n\nhtml_theme_path = guzzle_sphinx_theme.html_theme_path()\nhtml_theme = \"guzzle_sphinx_theme\"\nhtml_favicon = Path(\"_static/favicon.ico\")\n\n# There's a standing issue with Sphinx's new-style sidebars. This is a\n# workaround. Taken from\n# https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826\nhtml_sidebars = {\"**\": [\"logo-text.html\", \"globaltoc.html\", \"searchbox.html\"]}\n\n# Register the theme as an extension to generate a sitemap.xml\nextensions.append(\"guzzle_sphinx_theme\")\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = [\"_static\"]\n\n# This specifies any additional css files that will override the theme's\nhtml_css_files = [\"custom.css\"]\n", "path": "docs/source/conf.py"}], "after_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n\nimport os\nimport subprocess\nimport sys\nfrom distutils.sysconfig import get_python_lib\nfrom pathlib import Path\n\nsys.path.insert(0, os.path.abspath(\".\"))\n\n\nif os.environ.get(\"READTHEDOCS\") == \"True\":\n site_path = get_python_lib()\n # bindings for pangocffi, cairocffi, pangocairocffi need to be generated\n subprocess.run([\"python\", \"pangocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"cairocffi/ffi_build.py\"], cwd=site_path)\n subprocess.run([\"python\", \"pangocairocffi/ffi_build.py\"], cwd=site_path)\n # we need to add ffmpeg to the path\n ffmpeg_path = os.path.join(site_path, \"imageio_ffmpeg\", \"binaries\")\n # the included binary is named ffmpeg-linux..., create a symlink\n [ffmpeg_bin] = [\n file for file in os.listdir(ffmpeg_path) if file.startswith(\"ffmpeg-\")\n ]\n os.symlink(\n os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, \"ffmpeg\")\n )\n os.environ[\"PATH\"] += os.pathsep + ffmpeg_path\n\n\n# -- Project information -----------------------------------------------------\n\nproject = \"Manim\"\ncopyright = \"2020, The Manim Community Dev Team\"\nauthor = \"The Manim Community Dev Team\"\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. 
They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx.ext.autodoc\",\n \"recommonmark\",\n \"sphinx_copybutton\",\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n \"manim_directive\",\n]\n\n# Automatically generate stub pages when using the .. autosummary directive\nautosummary_generate = True\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = [\"_templates\"]\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nimport guzzle_sphinx_theme\n\nhtml_theme_path = guzzle_sphinx_theme.html_theme_path()\nhtml_theme = \"guzzle_sphinx_theme\"\nhtml_favicon = str(Path(\"_static/favicon.ico\"))\n\n# There's a standing issue with Sphinx's new-style sidebars. This is a\n# workaround. Taken from\n# https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826\nhtml_sidebars = {\"**\": [\"logo-text.html\", \"globaltoc.html\", \"searchbox.html\"]}\n\n# Register the theme as an extension to generate a sitemap.xml\nextensions.append(\"guzzle_sphinx_theme\")\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = [\"_static\"]\n\n# This specifies any additional css files that will override the theme's\nhtml_css_files = [\"custom.css\"]\n", "path": "docs/source/conf.py"}]} | 1,408 | 113 |
gh_patches_debug_41816 | rasdani/github-patches | git_diff | fonttools__fonttools-804 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[T2CharStringPen] Rounding coordinates?
Currently, [T2CharStringPen](https://github.com/fonttools/fonttools/blob/fcd697a328b439165b3e9d04904eb73f065b6838/Lib/fontTools/pens/t2CharStringPen.py#L78) always rounds glyph coordinates and casts them to int, but this does not seem like a good idea. I think we should remove this behavior, or make it optional.
cf. TTGlyphPen (#526)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `Lib/fontTools/pens/t2CharStringPen.py`
Content:
```
1 # Copyright (c) 2009 Type Supply LLC
2 # Author: Tal Leming
3
4
5 from __future__ import print_function, division, absolute_import
6
7 from fontTools.misc.py23 import *
8 from fontTools.misc.psCharStrings import T2CharString
9 from fontTools.pens.basePen import BasePen
10
11
12 def roundInt(v):
13 return int(round(v))
14
15
16 def roundIntPoint(point):
17 x, y = point
18 return roundInt(x), roundInt(y)
19
20
21 class RelativeCoordinatePen(BasePen):
22
23 def __init__(self, glyphSet):
24 BasePen.__init__(self, glyphSet)
25 self._lastX = None
26 self._lastY = None
27 self._heldAbsoluteMove = None
28
29 def _makePointRelative(self, pt):
30 absX, absY = pt
31 absX = absX
32 absY = absY
33 # no points have been added
34 # so no conversion is needed
35 if self._lastX is None:
36 relX, relY = absX, absY
37 # otherwise calculate the relative coordinates
38 else:
39 relX = absX - self._lastX
40 relY = absY - self._lastY
41 # store the absolute coordinates
42 self._lastX = absX
43 self._lastY = absY
44 # now return the relative coordinates
45 return relX, relY
46
47 def _moveTo(self, pt):
48 self._heldAbsoluteMove = pt
49
50 def _releaseHeldMove(self):
51 if self._heldAbsoluteMove is not None:
52 pt = self._makePointRelative(self._heldAbsoluteMove)
53 self._relativeMoveTo(pt)
54 self._heldAbsoluteMove = None
55
56 def _relativeMoveTo(self, pt):
57 raise NotImplementedError
58
59 def _lineTo(self, pt):
60 self._releaseHeldMove()
61 pt = self._makePointRelative(pt)
62 self._relativeLineTo(pt)
63
64 def _relativeLineTo(self, pt):
65 raise NotImplementedError
66
67 def _curveToOne(self, pt1, pt2, pt3):
68 self._releaseHeldMove()
69 pt1 = self._makePointRelative(pt1)
70 pt2 = self._makePointRelative(pt2)
71 pt3 = self._makePointRelative(pt3)
72 self._relativeCurveToOne(pt1, pt2, pt3)
73
74 def _relativeCurveToOne(self, pt1, pt2, pt3):
75 raise NotImplementedError
76
77
78 class T2CharStringPen(RelativeCoordinatePen):
79
80 def __init__(self, width, glyphSet):
81 RelativeCoordinatePen.__init__(self, glyphSet)
82 self._heldMove = None
83 self._program = []
84 if width is not None:
85 self._program.append(roundInt(width))
86
87 def _moveTo(self, pt):
88 RelativeCoordinatePen._moveTo(self, roundIntPoint(pt))
89
90 def _relativeMoveTo(self, pt):
91 pt = roundIntPoint(pt)
92 x, y = pt
93 self._heldMove = [x, y, "rmoveto"]
94
95 def _storeHeldMove(self):
96 if self._heldMove is not None:
97 self._program.extend(self._heldMove)
98 self._heldMove = None
99
100 def _lineTo(self, pt):
101 RelativeCoordinatePen._lineTo(self, roundIntPoint(pt))
102
103 def _relativeLineTo(self, pt):
104 self._storeHeldMove()
105 pt = roundIntPoint(pt)
106 x, y = pt
107 self._program.extend([x, y, "rlineto"])
108
109 def _curveToOne(self, pt1, pt2, pt3):
110 RelativeCoordinatePen._curveToOne(self, roundIntPoint(pt1), roundIntPoint(pt2), roundIntPoint(pt3))
111
112 def _relativeCurveToOne(self, pt1, pt2, pt3):
113 self._storeHeldMove()
114 pt1 = roundIntPoint(pt1)
115 pt2 = roundIntPoint(pt2)
116 pt3 = roundIntPoint(pt3)
117 x1, y1 = pt1
118 x2, y2 = pt2
119 x3, y3 = pt3
120 self._program.extend([x1, y1, x2, y2, x3, y3, "rrcurveto"])
121
122 def _closePath(self):
123 pass
124
125 def _endPath(self):
126 pass
127
128 def getCharString(self, private=None, globalSubrs=None):
129 program = self._program + ["endchar"]
130 charString = T2CharString(program=program, private=private, globalSubrs=globalSubrs)
131 return charString
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/Lib/fontTools/pens/t2CharStringPen.py b/Lib/fontTools/pens/t2CharStringPen.py
--- a/Lib/fontTools/pens/t2CharStringPen.py
+++ b/Lib/fontTools/pens/t2CharStringPen.py
@@ -9,15 +9,6 @@
from fontTools.pens.basePen import BasePen
-def roundInt(v):
- return int(round(v))
-
-
-def roundIntPoint(point):
- x, y = point
- return roundInt(x), roundInt(y)
-
-
class RelativeCoordinatePen(BasePen):
def __init__(self, glyphSet):
@@ -75,20 +66,45 @@
raise NotImplementedError
+def makeRoundFunc(tolerance):
+ if tolerance < 0:
+ raise ValueError("Rounding tolerance must be positive")
+
+ def _round(number):
+ if tolerance == 0:
+ return number # no-op
+ rounded = round(number)
+ # return rounded integer if the tolerance >= 0.5, or if the absolute
+ # difference between the original float and the rounded integer is
+ # within the tolerance
+ if tolerance >= .5 or abs(rounded - number) <= tolerance:
+ return rounded
+ else:
+ # else return the value un-rounded
+ return number
+
+ def roundPoint(point):
+ x, y = point
+ return _round(x), _round(y)
+
+ return roundPoint
+
+
class T2CharStringPen(RelativeCoordinatePen):
- def __init__(self, width, glyphSet):
+ def __init__(self, width, glyphSet, roundTolerance=0.5):
RelativeCoordinatePen.__init__(self, glyphSet)
+ self.roundPoint = makeRoundFunc(roundTolerance)
self._heldMove = None
self._program = []
if width is not None:
- self._program.append(roundInt(width))
+ self._program.append(round(width))
def _moveTo(self, pt):
- RelativeCoordinatePen._moveTo(self, roundIntPoint(pt))
+ RelativeCoordinatePen._moveTo(self, self.roundPoint(pt))
def _relativeMoveTo(self, pt):
- pt = roundIntPoint(pt)
+ pt = self.roundPoint(pt)
x, y = pt
self._heldMove = [x, y, "rmoveto"]
@@ -98,22 +114,25 @@
self._heldMove = None
def _lineTo(self, pt):
- RelativeCoordinatePen._lineTo(self, roundIntPoint(pt))
+ RelativeCoordinatePen._lineTo(self, self.roundPoint(pt))
def _relativeLineTo(self, pt):
self._storeHeldMove()
- pt = roundIntPoint(pt)
+ pt = self.roundPoint(pt)
x, y = pt
self._program.extend([x, y, "rlineto"])
def _curveToOne(self, pt1, pt2, pt3):
- RelativeCoordinatePen._curveToOne(self, roundIntPoint(pt1), roundIntPoint(pt2), roundIntPoint(pt3))
+ RelativeCoordinatePen._curveToOne(self,
+ self.roundPoint(pt1),
+ self.roundPoint(pt2),
+ self.roundPoint(pt3))
def _relativeCurveToOne(self, pt1, pt2, pt3):
self._storeHeldMove()
- pt1 = roundIntPoint(pt1)
- pt2 = roundIntPoint(pt2)
- pt3 = roundIntPoint(pt3)
+ pt1 = self.roundPoint(pt1)
+ pt2 = self.roundPoint(pt2)
+ pt3 = self.roundPoint(pt3)
x1, y1 = pt1
x2, y2 = pt2
x3, y3 = pt3
@@ -127,5 +146,6 @@
def getCharString(self, private=None, globalSubrs=None):
program = self._program + ["endchar"]
- charString = T2CharString(program=program, private=private, globalSubrs=globalSubrs)
+ charString = T2CharString(
+ program=program, private=private, globalSubrs=globalSubrs)
return charString
| {"golden_diff": "diff --git a/Lib/fontTools/pens/t2CharStringPen.py b/Lib/fontTools/pens/t2CharStringPen.py\n--- a/Lib/fontTools/pens/t2CharStringPen.py\n+++ b/Lib/fontTools/pens/t2CharStringPen.py\n@@ -9,15 +9,6 @@\n from fontTools.pens.basePen import BasePen\n \n \n-def roundInt(v):\n- return int(round(v))\n-\n-\n-def roundIntPoint(point):\n- x, y = point\n- return roundInt(x), roundInt(y)\n-\n-\n class RelativeCoordinatePen(BasePen):\n \n def __init__(self, glyphSet):\n@@ -75,20 +66,45 @@\n raise NotImplementedError\n \n \n+def makeRoundFunc(tolerance):\n+ if tolerance < 0:\n+ raise ValueError(\"Rounding tolerance must be positive\")\n+\n+ def _round(number):\n+ if tolerance == 0:\n+ return number # no-op\n+ rounded = round(number)\n+ # return rounded integer if the tolerance >= 0.5, or if the absolute\n+ # difference between the original float and the rounded integer is\n+ # within the tolerance\n+ if tolerance >= .5 or abs(rounded - number) <= tolerance:\n+ return rounded\n+ else:\n+ # else return the value un-rounded\n+ return number\n+\n+ def roundPoint(point):\n+ x, y = point\n+ return _round(x), _round(y)\n+\n+ return roundPoint\n+\n+\n class T2CharStringPen(RelativeCoordinatePen):\n \n- def __init__(self, width, glyphSet):\n+ def __init__(self, width, glyphSet, roundTolerance=0.5):\n RelativeCoordinatePen.__init__(self, glyphSet)\n+ self.roundPoint = makeRoundFunc(roundTolerance)\n self._heldMove = None\n self._program = []\n if width is not None:\n- self._program.append(roundInt(width))\n+ self._program.append(round(width))\n \n def _moveTo(self, pt):\n- RelativeCoordinatePen._moveTo(self, roundIntPoint(pt))\n+ RelativeCoordinatePen._moveTo(self, self.roundPoint(pt))\n \n def _relativeMoveTo(self, pt):\n- pt = roundIntPoint(pt)\n+ pt = self.roundPoint(pt)\n x, y = pt\n self._heldMove = [x, y, \"rmoveto\"]\n \n@@ -98,22 +114,25 @@\n self._heldMove = None\n \n def _lineTo(self, pt):\n- RelativeCoordinatePen._lineTo(self, roundIntPoint(pt))\n+ RelativeCoordinatePen._lineTo(self, self.roundPoint(pt))\n \n def _relativeLineTo(self, pt):\n self._storeHeldMove()\n- pt = roundIntPoint(pt)\n+ pt = self.roundPoint(pt)\n x, y = pt\n self._program.extend([x, y, \"rlineto\"])\n \n def _curveToOne(self, pt1, pt2, pt3):\n- RelativeCoordinatePen._curveToOne(self, roundIntPoint(pt1), roundIntPoint(pt2), roundIntPoint(pt3))\n+ RelativeCoordinatePen._curveToOne(self,\n+ self.roundPoint(pt1),\n+ self.roundPoint(pt2),\n+ self.roundPoint(pt3))\n \n def _relativeCurveToOne(self, pt1, pt2, pt3):\n self._storeHeldMove()\n- pt1 = roundIntPoint(pt1)\n- pt2 = roundIntPoint(pt2)\n- pt3 = roundIntPoint(pt3)\n+ pt1 = self.roundPoint(pt1)\n+ pt2 = self.roundPoint(pt2)\n+ pt3 = self.roundPoint(pt3)\n x1, y1 = pt1\n x2, y2 = pt2\n x3, y3 = pt3\n@@ -127,5 +146,6 @@\n \n def getCharString(self, private=None, globalSubrs=None):\n program = self._program + [\"endchar\"]\n- charString = T2CharString(program=program, private=private, globalSubrs=globalSubrs)\n+ charString = T2CharString(\n+ program=program, private=private, globalSubrs=globalSubrs)\n return charString\n", "issue": "[T2CharStringPen] Rounding coordinates?\nCurrently, [T2CharStringPen](https://github.com/fonttools/fonttools/blob/fcd697a328b439165b3e9d04904eb73f065b6838/Lib/fontTools/pens/t2CharStringPen.py#L78) always rounds glyph coordinates and casts to int, but it does not seem a good idea. I think we should remove this behavior, or make it optional.\r\n\r\ncf. 
TTGlyphPen (#526)\n", "before_files": [{"content": "# Copyright (c) 2009 Type Supply LLC\n# Author: Tal Leming\n\n\nfrom __future__ import print_function, division, absolute_import\n\nfrom fontTools.misc.py23 import *\nfrom fontTools.misc.psCharStrings import T2CharString\nfrom fontTools.pens.basePen import BasePen\n\n\ndef roundInt(v):\n return int(round(v))\n\n\ndef roundIntPoint(point):\n x, y = point\n return roundInt(x), roundInt(y)\n\n\nclass RelativeCoordinatePen(BasePen):\n\n def __init__(self, glyphSet):\n BasePen.__init__(self, glyphSet)\n self._lastX = None\n self._lastY = None\n self._heldAbsoluteMove = None\n\n def _makePointRelative(self, pt):\n absX, absY = pt\n absX = absX\n absY = absY\n # no points have been added\n # so no conversion is needed\n if self._lastX is None:\n relX, relY = absX, absY\n # otherwise calculate the relative coordinates\n else:\n relX = absX - self._lastX\n relY = absY - self._lastY\n # store the absolute coordinates\n self._lastX = absX\n self._lastY = absY\n # now return the relative coordinates\n return relX, relY\n\n def _moveTo(self, pt):\n self._heldAbsoluteMove = pt\n\n def _releaseHeldMove(self):\n if self._heldAbsoluteMove is not None:\n pt = self._makePointRelative(self._heldAbsoluteMove)\n self._relativeMoveTo(pt)\n self._heldAbsoluteMove = None\n\n def _relativeMoveTo(self, pt):\n raise NotImplementedError\n\n def _lineTo(self, pt):\n self._releaseHeldMove()\n pt = self._makePointRelative(pt)\n self._relativeLineTo(pt)\n\n def _relativeLineTo(self, pt):\n raise NotImplementedError\n\n def _curveToOne(self, pt1, pt2, pt3):\n self._releaseHeldMove()\n pt1 = self._makePointRelative(pt1)\n pt2 = self._makePointRelative(pt2)\n pt3 = self._makePointRelative(pt3)\n self._relativeCurveToOne(pt1, pt2, pt3)\n\n def _relativeCurveToOne(self, pt1, pt2, pt3):\n raise NotImplementedError\n\n\nclass T2CharStringPen(RelativeCoordinatePen):\n\n def __init__(self, width, glyphSet):\n RelativeCoordinatePen.__init__(self, glyphSet)\n self._heldMove = None\n self._program = []\n if width is not None:\n self._program.append(roundInt(width))\n\n def _moveTo(self, pt):\n RelativeCoordinatePen._moveTo(self, roundIntPoint(pt))\n\n def _relativeMoveTo(self, pt):\n pt = roundIntPoint(pt)\n x, y = pt\n self._heldMove = [x, y, \"rmoveto\"]\n\n def _storeHeldMove(self):\n if self._heldMove is not None:\n self._program.extend(self._heldMove)\n self._heldMove = None\n\n def _lineTo(self, pt):\n RelativeCoordinatePen._lineTo(self, roundIntPoint(pt))\n\n def _relativeLineTo(self, pt):\n self._storeHeldMove()\n pt = roundIntPoint(pt)\n x, y = pt\n self._program.extend([x, y, \"rlineto\"])\n\n def _curveToOne(self, pt1, pt2, pt3):\n RelativeCoordinatePen._curveToOne(self, roundIntPoint(pt1), roundIntPoint(pt2), roundIntPoint(pt3))\n\n def _relativeCurveToOne(self, pt1, pt2, pt3):\n self._storeHeldMove()\n pt1 = roundIntPoint(pt1)\n pt2 = roundIntPoint(pt2)\n pt3 = roundIntPoint(pt3)\n x1, y1 = pt1\n x2, y2 = pt2\n x3, y3 = pt3\n self._program.extend([x1, y1, x2, y2, x3, y3, \"rrcurveto\"])\n\n def _closePath(self):\n pass\n\n def _endPath(self):\n pass\n\n def getCharString(self, private=None, globalSubrs=None):\n program = self._program + [\"endchar\"]\n charString = T2CharString(program=program, private=private, globalSubrs=globalSubrs)\n return charString\n", "path": "Lib/fontTools/pens/t2CharStringPen.py"}], "after_files": [{"content": "# Copyright (c) 2009 Type Supply LLC\n# Author: Tal Leming\n\n\nfrom __future__ import print_function, division, 
absolute_import\n\nfrom fontTools.misc.py23 import *\nfrom fontTools.misc.psCharStrings import T2CharString\nfrom fontTools.pens.basePen import BasePen\n\n\nclass RelativeCoordinatePen(BasePen):\n\n def __init__(self, glyphSet):\n BasePen.__init__(self, glyphSet)\n self._lastX = None\n self._lastY = None\n self._heldAbsoluteMove = None\n\n def _makePointRelative(self, pt):\n absX, absY = pt\n absX = absX\n absY = absY\n # no points have been added\n # so no conversion is needed\n if self._lastX is None:\n relX, relY = absX, absY\n # otherwise calculate the relative coordinates\n else:\n relX = absX - self._lastX\n relY = absY - self._lastY\n # store the absolute coordinates\n self._lastX = absX\n self._lastY = absY\n # now return the relative coordinates\n return relX, relY\n\n def _moveTo(self, pt):\n self._heldAbsoluteMove = pt\n\n def _releaseHeldMove(self):\n if self._heldAbsoluteMove is not None:\n pt = self._makePointRelative(self._heldAbsoluteMove)\n self._relativeMoveTo(pt)\n self._heldAbsoluteMove = None\n\n def _relativeMoveTo(self, pt):\n raise NotImplementedError\n\n def _lineTo(self, pt):\n self._releaseHeldMove()\n pt = self._makePointRelative(pt)\n self._relativeLineTo(pt)\n\n def _relativeLineTo(self, pt):\n raise NotImplementedError\n\n def _curveToOne(self, pt1, pt2, pt3):\n self._releaseHeldMove()\n pt1 = self._makePointRelative(pt1)\n pt2 = self._makePointRelative(pt2)\n pt3 = self._makePointRelative(pt3)\n self._relativeCurveToOne(pt1, pt2, pt3)\n\n def _relativeCurveToOne(self, pt1, pt2, pt3):\n raise NotImplementedError\n\n\ndef makeRoundFunc(tolerance):\n if tolerance < 0:\n raise ValueError(\"Rounding tolerance must be positive\")\n\n def _round(number):\n if tolerance == 0:\n return number # no-op\n rounded = round(number)\n # return rounded integer if the tolerance >= 0.5, or if the absolute\n # difference between the original float and the rounded integer is\n # within the tolerance\n if tolerance >= .5 or abs(rounded - number) <= tolerance:\n return rounded\n else:\n # else return the value un-rounded\n return number\n\n def roundPoint(point):\n x, y = point\n return _round(x), _round(y)\n\n return roundPoint\n\n\nclass T2CharStringPen(RelativeCoordinatePen):\n\n def __init__(self, width, glyphSet, roundTolerance=0.5):\n RelativeCoordinatePen.__init__(self, glyphSet)\n self.roundPoint = makeRoundFunc(roundTolerance)\n self._heldMove = None\n self._program = []\n if width is not None:\n self._program.append(round(width))\n\n def _moveTo(self, pt):\n RelativeCoordinatePen._moveTo(self, self.roundPoint(pt))\n\n def _relativeMoveTo(self, pt):\n pt = self.roundPoint(pt)\n x, y = pt\n self._heldMove = [x, y, \"rmoveto\"]\n\n def _storeHeldMove(self):\n if self._heldMove is not None:\n self._program.extend(self._heldMove)\n self._heldMove = None\n\n def _lineTo(self, pt):\n RelativeCoordinatePen._lineTo(self, self.roundPoint(pt))\n\n def _relativeLineTo(self, pt):\n self._storeHeldMove()\n pt = self.roundPoint(pt)\n x, y = pt\n self._program.extend([x, y, \"rlineto\"])\n\n def _curveToOne(self, pt1, pt2, pt3):\n RelativeCoordinatePen._curveToOne(self,\n self.roundPoint(pt1),\n self.roundPoint(pt2),\n self.roundPoint(pt3))\n\n def _relativeCurveToOne(self, pt1, pt2, pt3):\n self._storeHeldMove()\n pt1 = self.roundPoint(pt1)\n pt2 = self.roundPoint(pt2)\n pt3 = self.roundPoint(pt3)\n x1, y1 = pt1\n x2, y2 = pt2\n x3, y3 = pt3\n self._program.extend([x1, y1, x2, y2, x3, y3, \"rrcurveto\"])\n\n def _closePath(self):\n pass\n\n def _endPath(self):\n pass\n\n def 
getCharString(self, private=None, globalSubrs=None):\n program = self._program + [\"endchar\"]\n charString = T2CharString(\n program=program, private=private, globalSubrs=globalSubrs)\n return charString\n", "path": "Lib/fontTools/pens/t2CharStringPen.py"}]} | 1,708 | 965 |
gh_patches_debug_17604 | rasdani/github-patches | git_diff | DataDog__dd-trace-py-4250 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot call sqlite3.backup(db) on a TracedSQLite object
Thanks for taking the time for reporting an issue!
Before reporting an issue on dd-trace-py, please be sure to provide all
necessary information.
If you're hitting a bug, make sure that you're using the latest version of this
library.
### Which version of dd-trace-py are you using?
1.5.0
### Which version of pip are you using?
21.1.1
_ddtrace requires pip>=18 to install one of our pre-built wheels_
### Which version of the libraries are you using?
You can copy/paste the output of `pip freeze` here.
```
ddtrace==1.5.0
```
### How can we reproduce your problem?
```
from ddtrace import config, patch_all
import sqlite3
config.env = "test" # the environment the application is in
config.service = "app" # name of your application
config.version = "v1" # version of your application
patch_all()
src = sqlite3.connect("1.db")
dst = sqlite3.connect("2.db")
with dst:
src.backup(dst, pages=1)
dst.close()
src.close()
```
### What is the result that you get?
The following TypeError
```
TypeError: backup() argument 1 must be sqlite3.Connection, not TracedSQLite
```
### What is the result that you expected?
The function should succeed without error.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ddtrace/contrib/sqlite3/patch.py`
Content:
```
1 import os
2 import sqlite3
3 import sqlite3.dbapi2
4
5 from ddtrace import config
6 from ddtrace.vendor import wrapt
7
8 from ...contrib.dbapi import FetchTracedCursor
9 from ...contrib.dbapi import TracedConnection
10 from ...contrib.dbapi import TracedCursor
11 from ...internal.utils.formats import asbool
12 from ...pin import Pin
13
14
15 # Original connect method
16 _connect = sqlite3.connect
17
18 config._add(
19 "sqlite",
20 dict(
21 _default_service="sqlite",
22 _dbapi_span_name_prefix="sqlite",
23 trace_fetch_methods=asbool(os.getenv("DD_SQLITE_TRACE_FETCH_METHODS", default=False)),
24 ),
25 )
26
27
28 def patch():
29 wrapped = wrapt.FunctionWrapper(_connect, traced_connect)
30
31 setattr(sqlite3, "connect", wrapped)
32 setattr(sqlite3.dbapi2, "connect", wrapped)
33
34
35 def unpatch():
36 sqlite3.connect = _connect
37 sqlite3.dbapi2.connect = _connect
38
39
40 def traced_connect(func, _, args, kwargs):
41 conn = func(*args, **kwargs)
42 return patch_conn(conn)
43
44
45 def patch_conn(conn):
46 wrapped = TracedSQLite(conn)
47 Pin().onto(wrapped)
48 return wrapped
49
50
51 class TracedSQLiteCursor(TracedCursor):
52 def executemany(self, *args, **kwargs):
53 # DEV: SQLite3 Cursor.execute always returns back the cursor instance
54 super(TracedSQLiteCursor, self).executemany(*args, **kwargs)
55 return self
56
57 def execute(self, *args, **kwargs):
58 # DEV: SQLite3 Cursor.execute always returns back the cursor instance
59 super(TracedSQLiteCursor, self).execute(*args, **kwargs)
60 return self
61
62
63 class TracedSQLiteFetchCursor(TracedSQLiteCursor, FetchTracedCursor):
64 pass
65
66
67 class TracedSQLite(TracedConnection):
68 def __init__(self, conn, pin=None, cursor_cls=None):
69 if not cursor_cls:
70 # Do not trace `fetch*` methods by default
71 cursor_cls = TracedSQLiteFetchCursor if config.sqlite.trace_fetch_methods else TracedSQLiteCursor
72
73 super(TracedSQLite, self).__init__(conn, pin=pin, cfg=config.sqlite, cursor_cls=cursor_cls)
74
75 def execute(self, *args, **kwargs):
76 # sqlite has a few extra sugar functions
77 return self.cursor().execute(*args, **kwargs)
78
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ddtrace/contrib/sqlite3/patch.py b/ddtrace/contrib/sqlite3/patch.py
--- a/ddtrace/contrib/sqlite3/patch.py
+++ b/ddtrace/contrib/sqlite3/patch.py
@@ -1,6 +1,7 @@
import os
import sqlite3
import sqlite3.dbapi2
+import sys
from ddtrace import config
from ddtrace.vendor import wrapt
@@ -75,3 +76,13 @@
def execute(self, *args, **kwargs):
# sqlite has a few extra sugar functions
return self.cursor().execute(*args, **kwargs)
+
+ # backup was added in Python 3.7
+ if sys.version_info >= (3, 7, 0):
+
+ def backup(self, target, *args, **kwargs):
+ # sqlite3 checks the type of `target`, it cannot be a wrapped connection
+ # https://github.com/python/cpython/blob/4652093e1b816b78e9a585d671a807ce66427417/Modules/_sqlite/connection.c#L1897-L1899
+ if isinstance(target, TracedConnection):
+ target = target.__wrapped__
+ return self.__wrapped__.backup(target, *args, **kwargs)
| {"golden_diff": "diff --git a/ddtrace/contrib/sqlite3/patch.py b/ddtrace/contrib/sqlite3/patch.py\n--- a/ddtrace/contrib/sqlite3/patch.py\n+++ b/ddtrace/contrib/sqlite3/patch.py\n@@ -1,6 +1,7 @@\n import os\n import sqlite3\n import sqlite3.dbapi2\n+import sys\n \n from ddtrace import config\n from ddtrace.vendor import wrapt\n@@ -75,3 +76,13 @@\n def execute(self, *args, **kwargs):\n # sqlite has a few extra sugar functions\n return self.cursor().execute(*args, **kwargs)\n+\n+ # backup was added in Python 3.7\n+ if sys.version_info >= (3, 7, 0):\n+\n+ def backup(self, target, *args, **kwargs):\n+ # sqlite3 checks the type of `target`, it cannot be a wrapped connection\n+ # https://github.com/python/cpython/blob/4652093e1b816b78e9a585d671a807ce66427417/Modules/_sqlite/connection.c#L1897-L1899\n+ if isinstance(target, TracedConnection):\n+ target = target.__wrapped__\n+ return self.__wrapped__.backup(target, *args, **kwargs)\n", "issue": "Cannot call sqlite3.backup(db) on a TracedSQLite object\nThanks for taking the time for reporting an issue!\r\n\r\nBefore reporting an issue on dd-trace-py, please be sure to provide all\r\nnecessary information.\r\n\r\nIf you're hitting a bug, make sure that you're using the latest version of this\r\nlibrary.\r\n\r\n### Which version of dd-trace-py are you using?\r\n1.5.0\r\n### Which version of pip are you using?\r\n21.1.1\r\n_ddtrace requires pip>=18 to install one of our pre-built wheels_\r\n\r\n### Which version of the libraries are you using?\r\n\r\nYou can copy/paste the output of `pip freeze` here.\r\n\r\n```\r\nddtrace==1.5.0\r\n```\r\n\r\n### How can we reproduce your problem?\r\n\r\n```\r\nfrom ddtrace import config, patch_all\r\nimport sqlite3\r\n\r\nconfig.env = \"test\" # the environment the application is in\r\nconfig.service = \"app\" # name of your application\r\nconfig.version = \"v1\" # version of your application\r\npatch_all()\r\n\r\nsrc = sqlite3.connect(\"1.db\")\r\ndst = sqlite3.connect(\"2.db\")\r\nwith dst:\r\n src.backup(dst, pages=1)\r\ndst.close()\r\nsrc.close()\r\n```\r\n\r\n### What is the result that you get?\r\n\r\nThe following TypeError\r\n```\r\nTypeError: backup() argument 1 must be sqlite3.Connection, not TracedSQLite\r\n```\r\n\r\n### What is the result that you expected?\r\n\r\nThe function should succeed without error.\r\n\n", "before_files": [{"content": "import os\nimport sqlite3\nimport sqlite3.dbapi2\n\nfrom ddtrace import config\nfrom ddtrace.vendor import wrapt\n\nfrom ...contrib.dbapi import FetchTracedCursor\nfrom ...contrib.dbapi import TracedConnection\nfrom ...contrib.dbapi import TracedCursor\nfrom ...internal.utils.formats import asbool\nfrom ...pin import Pin\n\n\n# Original connect method\n_connect = sqlite3.connect\n\nconfig._add(\n \"sqlite\",\n dict(\n _default_service=\"sqlite\",\n _dbapi_span_name_prefix=\"sqlite\",\n trace_fetch_methods=asbool(os.getenv(\"DD_SQLITE_TRACE_FETCH_METHODS\", default=False)),\n ),\n)\n\n\ndef patch():\n wrapped = wrapt.FunctionWrapper(_connect, traced_connect)\n\n setattr(sqlite3, \"connect\", wrapped)\n setattr(sqlite3.dbapi2, \"connect\", wrapped)\n\n\ndef unpatch():\n sqlite3.connect = _connect\n sqlite3.dbapi2.connect = _connect\n\n\ndef traced_connect(func, _, args, kwargs):\n conn = func(*args, **kwargs)\n return patch_conn(conn)\n\n\ndef patch_conn(conn):\n wrapped = TracedSQLite(conn)\n Pin().onto(wrapped)\n return wrapped\n\n\nclass TracedSQLiteCursor(TracedCursor):\n def executemany(self, *args, **kwargs):\n # DEV: SQLite3 Cursor.execute always returns 
back the cursor instance\n super(TracedSQLiteCursor, self).executemany(*args, **kwargs)\n return self\n\n def execute(self, *args, **kwargs):\n # DEV: SQLite3 Cursor.execute always returns back the cursor instance\n super(TracedSQLiteCursor, self).execute(*args, **kwargs)\n return self\n\n\nclass TracedSQLiteFetchCursor(TracedSQLiteCursor, FetchTracedCursor):\n pass\n\n\nclass TracedSQLite(TracedConnection):\n def __init__(self, conn, pin=None, cursor_cls=None):\n if not cursor_cls:\n # Do not trace `fetch*` methods by default\n cursor_cls = TracedSQLiteFetchCursor if config.sqlite.trace_fetch_methods else TracedSQLiteCursor\n\n super(TracedSQLite, self).__init__(conn, pin=pin, cfg=config.sqlite, cursor_cls=cursor_cls)\n\n def execute(self, *args, **kwargs):\n # sqlite has a few extra sugar functions\n return self.cursor().execute(*args, **kwargs)\n", "path": "ddtrace/contrib/sqlite3/patch.py"}], "after_files": [{"content": "import os\nimport sqlite3\nimport sqlite3.dbapi2\nimport sys\n\nfrom ddtrace import config\nfrom ddtrace.vendor import wrapt\n\nfrom ...contrib.dbapi import FetchTracedCursor\nfrom ...contrib.dbapi import TracedConnection\nfrom ...contrib.dbapi import TracedCursor\nfrom ...internal.utils.formats import asbool\nfrom ...pin import Pin\n\n\n# Original connect method\n_connect = sqlite3.connect\n\nconfig._add(\n \"sqlite\",\n dict(\n _default_service=\"sqlite\",\n _dbapi_span_name_prefix=\"sqlite\",\n trace_fetch_methods=asbool(os.getenv(\"DD_SQLITE_TRACE_FETCH_METHODS\", default=False)),\n ),\n)\n\n\ndef patch():\n wrapped = wrapt.FunctionWrapper(_connect, traced_connect)\n\n setattr(sqlite3, \"connect\", wrapped)\n setattr(sqlite3.dbapi2, \"connect\", wrapped)\n\n\ndef unpatch():\n sqlite3.connect = _connect\n sqlite3.dbapi2.connect = _connect\n\n\ndef traced_connect(func, _, args, kwargs):\n conn = func(*args, **kwargs)\n return patch_conn(conn)\n\n\ndef patch_conn(conn):\n wrapped = TracedSQLite(conn)\n Pin().onto(wrapped)\n return wrapped\n\n\nclass TracedSQLiteCursor(TracedCursor):\n def executemany(self, *args, **kwargs):\n # DEV: SQLite3 Cursor.execute always returns back the cursor instance\n super(TracedSQLiteCursor, self).executemany(*args, **kwargs)\n return self\n\n def execute(self, *args, **kwargs):\n # DEV: SQLite3 Cursor.execute always returns back the cursor instance\n super(TracedSQLiteCursor, self).execute(*args, **kwargs)\n return self\n\n\nclass TracedSQLiteFetchCursor(TracedSQLiteCursor, FetchTracedCursor):\n pass\n\n\nclass TracedSQLite(TracedConnection):\n def __init__(self, conn, pin=None, cursor_cls=None):\n if not cursor_cls:\n # Do not trace `fetch*` methods by default\n cursor_cls = TracedSQLiteFetchCursor if config.sqlite.trace_fetch_methods else TracedSQLiteCursor\n\n super(TracedSQLite, self).__init__(conn, pin=pin, cfg=config.sqlite, cursor_cls=cursor_cls)\n\n def execute(self, *args, **kwargs):\n # sqlite has a few extra sugar functions\n return self.cursor().execute(*args, **kwargs)\n\n # backup was added in Python 3.7\n if sys.version_info >= (3, 7, 0):\n\n def backup(self, target, *args, **kwargs):\n # sqlite3 checks the type of `target`, it cannot be a wrapped connection\n # https://github.com/python/cpython/blob/4652093e1b816b78e9a585d671a807ce66427417/Modules/_sqlite/connection.c#L1897-L1899\n if isinstance(target, TracedConnection):\n target = target.__wrapped__\n return self.__wrapped__.backup(target, *args, **kwargs)\n", "path": "ddtrace/contrib/sqlite3/patch.py"}]} | 1,259 | 307 |
gh_patches_debug_7048 | rasdani/github-patches | git_diff | airctic__icevision-821 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Adding Non Unique Names To ClassMap Must Be Illegal
Currently, `add_name` doesn't check if the added `name` is a duplicate or not. This should be illegal, as classes must be unique
https://github.com/airctic/icevision/blob/82f3c7322d5adbc41d6a1b7b9e4327eb32ad23c4/icevision/core/class_map.py#L47-L51
--- END ISSUE ---
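One way to make duplicates illegal is to have `add_name` refuse them before touching any state. A minimal sketch against the `ClassMap` internals shown below (the exact exception type and message are a choice, not something the issue prescribes):
```python
def add_name(self, name: str) -> int:
    # Every class name must map to exactly one id, so reject repeats up front.
    if name in self._class2id:
        raise ValueError(f"'{name}' already exists in the ClassMap")
    self._id2class.append(name)
    id = len(self._class2id)
    self._class2id[name] = id
    return id
```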
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `icevision/core/class_map.py`
Content:
```
1 __all__ = ["ClassMap", "BACKGROUND"]
2
3 from icevision.imports import *
4
5 BACKGROUND = "background"
6
7
8 class ClassMap:
9 """Utility class for mapping between class name and id."""
10
11 def __init__(
12 self,
13 classes: Optional[Sequence[str]] = None,
14 background: Optional[str] = BACKGROUND,
15 ):
16 self._lock = True
17
18 self._id2class = copy(list(classes)) if classes else []
19 # insert background if required
20 self._background = background
21 if self._background is not None:
22 try:
23 self._id2class.remove(self._background)
24 except ValueError:
25 pass
26 # background is always index zero
27 self._id2class.insert(0, self._background)
28
29 self._class2id = {name: i for i, name in enumerate(self._id2class)}
30
31 @property
32 def num_classes(self):
33 return len(self)
34
35 def get_by_id(self, id: int) -> str:
36 return self._id2class[id]
37
38 def get_by_name(self, name: str) -> int:
39 try:
40 return self._class2id[name]
41 except KeyError as e:
42 if not self._lock:
43 return self.add_name(name)
44 else:
45 raise e
46
47 def add_name(self, name) -> int:
48 self._id2class.append(name)
49 id = len(self._class2id)
50 self._class2id[name] = id
51 return id
52
53 def lock(self):
54 self._lock = True
55 return self
56
57 def unlock(self):
58 self._lock = False
59 return self
60
61 def __eq__(self, other) -> bool:
62 if isinstance(other, ClassMap):
63 return self.__dict__ == other.__dict__
64 return False
65
66 def __len__(self):
67 return len(self._id2class)
68
69 def __repr__(self):
70 return f"<ClassMap: {self._class2id.__repr__()}>"
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/icevision/core/class_map.py b/icevision/core/class_map.py
--- a/icevision/core/class_map.py
+++ b/icevision/core/class_map.py
@@ -44,7 +44,13 @@
else:
raise e
- def add_name(self, name) -> int:
+ def add_name(self, name: str) -> int:
+ # Raise error if trying to add duplicate value
+ if name in self._id2class:
+ raise ValueError(
+ f"'{name}' already exists in the ClassMap. You can only add new labels that are unique"
+ )
+
self._id2class.append(name)
id = len(self._class2id)
self._class2id[name] = id
| {"golden_diff": "diff --git a/icevision/core/class_map.py b/icevision/core/class_map.py\n--- a/icevision/core/class_map.py\n+++ b/icevision/core/class_map.py\n@@ -44,7 +44,13 @@\n else:\n raise e\n \n- def add_name(self, name) -> int:\n+ def add_name(self, name: str) -> int:\n+ # Raise error if trying to add duplicate value\n+ if name in self._id2class:\n+ raise ValueError(\n+ f\"'{name}' already exists in the ClassMap. You can only add new labels that are unique\"\n+ )\n+\n self._id2class.append(name)\n id = len(self._class2id)\n self._class2id[name] = id\n", "issue": "Adding Non Unique Names To ClassMap Must Be Illegal\nCurrently, `add_name` doesn't check if the added `name` is a duplicate or not. This should be illegal, as classes must be unique\r\n\r\nhttps://github.com/airctic/icevision/blob/82f3c7322d5adbc41d6a1b7b9e4327eb32ad23c4/icevision/core/class_map.py#L47-L51\n", "before_files": [{"content": "__all__ = [\"ClassMap\", \"BACKGROUND\"]\n\nfrom icevision.imports import *\n\nBACKGROUND = \"background\"\n\n\nclass ClassMap:\n \"\"\"Utility class for mapping between class name and id.\"\"\"\n\n def __init__(\n self,\n classes: Optional[Sequence[str]] = None,\n background: Optional[str] = BACKGROUND,\n ):\n self._lock = True\n\n self._id2class = copy(list(classes)) if classes else []\n # insert background if required\n self._background = background\n if self._background is not None:\n try:\n self._id2class.remove(self._background)\n except ValueError:\n pass\n # background is always index zero\n self._id2class.insert(0, self._background)\n\n self._class2id = {name: i for i, name in enumerate(self._id2class)}\n\n @property\n def num_classes(self):\n return len(self)\n\n def get_by_id(self, id: int) -> str:\n return self._id2class[id]\n\n def get_by_name(self, name: str) -> int:\n try:\n return self._class2id[name]\n except KeyError as e:\n if not self._lock:\n return self.add_name(name)\n else:\n raise e\n\n def add_name(self, name) -> int:\n self._id2class.append(name)\n id = len(self._class2id)\n self._class2id[name] = id\n return id\n\n def lock(self):\n self._lock = True\n return self\n\n def unlock(self):\n self._lock = False\n return self\n\n def __eq__(self, other) -> bool:\n if isinstance(other, ClassMap):\n return self.__dict__ == other.__dict__\n return False\n\n def __len__(self):\n return len(self._id2class)\n\n def __repr__(self):\n return f\"<ClassMap: {self._class2id.__repr__()}>\"\n", "path": "icevision/core/class_map.py"}], "after_files": [{"content": "__all__ = [\"ClassMap\", \"BACKGROUND\"]\n\nfrom icevision.imports import *\n\nBACKGROUND = \"background\"\n\n\nclass ClassMap:\n \"\"\"Utility class for mapping between class name and id.\"\"\"\n\n def __init__(\n self,\n classes: Optional[Sequence[str]] = None,\n background: Optional[str] = BACKGROUND,\n ):\n self._lock = True\n\n self._id2class = copy(list(classes)) if classes else []\n # insert background if required\n self._background = background\n if self._background is not None:\n try:\n self._id2class.remove(self._background)\n except ValueError:\n pass\n # background is always index zero\n self._id2class.insert(0, self._background)\n\n self._class2id = {name: i for i, name in enumerate(self._id2class)}\n\n @property\n def num_classes(self):\n return len(self)\n\n def get_by_id(self, id: int) -> str:\n return self._id2class[id]\n\n def get_by_name(self, name: str) -> int:\n try:\n return self._class2id[name]\n except KeyError as e:\n if not self._lock:\n return self.add_name(name)\n else:\n raise e\n\n def 
add_name(self, name: str) -> int:\n # Raise error if trying to add duplicate value\n if name in self._id2class:\n raise ValueError(\n f\"'{name}' already exists in the ClassMap. You can only add new labels that are unique\"\n )\n\n self._id2class.append(name)\n id = len(self._class2id)\n self._class2id[name] = id\n return id\n\n def lock(self):\n self._lock = True\n return self\n\n def unlock(self):\n self._lock = False\n return self\n\n def __eq__(self, other) -> bool:\n if isinstance(other, ClassMap):\n return self.__dict__ == other.__dict__\n return False\n\n def __len__(self):\n return len(self._id2class)\n\n def __repr__(self):\n return f\"<ClassMap: {self._class2id.__repr__()}>\"\n", "path": "icevision/core/class_map.py"}]} | 939 | 172 |
gh_patches_debug_42527 | rasdani/github-patches | git_diff | conan-io__conan-center-index-2077 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[request] cpp-taskflow/2.5.0
### Package Details
* Package Name/Version: **cpp-taskflow/2.5.0**
* Changelog: **https://taskflow.github.io/taskflow/release-2-5-0.html**
Please note that `cpp-taskflow` has been renamed to `taskflow`, so a new recipe needs to be created.
P.S. Can you tell me why recipes are not updated automatically when a new library version is released? Lack of automation? Will this be fixed in a future Conan Center pipeline?
--- END ISSUE ---
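Because the package reference changes with the rename, consumers would switch their requirement from `cpp-taskflow/<version>` to the new name once the recipe lands. A hypothetical consumer `conanfile.py` fragment (the version string comes from this request, everything else is illustrative):
```python
from conans import ConanFile


class ConsumerConan(ConanFile):
    settings = "os", "compiler", "build_type", "arch"
    # Reference the renamed package; the old cpp-taskflow/x.y.z reference no longer applies.
    requires = "taskflow/2.5.0"
```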
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `recipes/taskflow/all/conanfile.py`
Content:
```
1 from conans import ConanFile, tools
2 from conans.errors import ConanInvalidConfiguration
3 from conans.model.version import Version
4 import os
5
6
7 class TaskflowConan(ConanFile):
8 name = "taskflow"
9 description = "A fast C++ header-only library to help you quickly write parallel programs with complex task dependencies."
10 topics = ("conan", "taskflow", "tasking", "parallelism")
11 url = "https://github.com/conan-io/conan-center-index"
12 homepage = "https://github.com/taskflow/taskflow"
13 license = "MIT"
14
15 no_copy_source = True
16
17 settings = "os", "compiler"
18
19 _source_subfolder = "source_subfolder"
20
21 def configure(self):
22 compiler = str(self.settings.compiler)
23 compiler_version = tools.Version(self.settings.compiler.version)
24 min_req_cppstd = "17" if tools.Version(self.version) <= "2.2.0" else "14"
25
26 if self.settings.compiler.cppstd:
27 tools.check_min_cppstd(self, min_req_cppstd)
28 else:
29 self.output.warn("%s recipe lacks information about the %s compiler"
30 " standard version support" % (self.name, compiler))
31
32 minimal_version = {
33 "17" : {
34 "Visual Studio": "16",
35 "gcc": "7.3",
36 "clang": "6.0",
37 "apple-clang": "10.0"
38 },
39 "14" : {
40 "Visual Studio": "15",
41 "gcc": "5",
42 "clang": "4.0",
43 "apple-clang": "8.0"
44 }
45 }
46
47 if compiler not in minimal_version[min_req_cppstd]:
48 self.output.info("%s requires a compiler that supports at least C++%s" % (self.name, min_req_cppstd))
49 return
50
51 # Exclude compilers not supported by taskflow
52 if compiler_version < minimal_version[min_req_cppstd][compiler]:
53 raise ConanInvalidConfiguration("%s requires a compiler that supports"
54 " at least C++%s. %s %s is not"
55 " supported." % (self.name, min_req_cppstd, compiler, Version(self.settings.compiler.version.value)))
56
57 def source(self):
58 tools.get(**self.conan_data["sources"][self.version])
59 extracted_dir = self.name + "-" + self.version
60 os.rename(extracted_dir, self._source_subfolder)
61
62 def package(self):
63 self.copy(pattern="LICENSE", dst="licenses", src=self._source_subfolder)
64 self.copy(pattern="*", dst="include/taskflow", src=os.path.join(self._source_subfolder, "taskflow"))
65
66 def package_id(self):
67 self.info.header_only()
68
69 def package_info(self):
70 if self.settings.os == "Linux":
71 self.cpp_info.system_libs.append("pthread")
72 if self.settings.compiler == "Visual Studio":
73 self.cpp_info.defines.append("_ENABLE_EXTENDED_ALIGNED_STORAGE")
74
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/recipes/taskflow/all/conanfile.py b/recipes/taskflow/all/conanfile.py
--- a/recipes/taskflow/all/conanfile.py
+++ b/recipes/taskflow/all/conanfile.py
@@ -1,8 +1,7 @@
-from conans import ConanFile, tools
-from conans.errors import ConanInvalidConfiguration
-from conans.model.version import Version
import os
+from conans import ConanFile, tools
+from conans.errors import ConanInvalidConfiguration
class TaskflowConan(ConanFile):
name = "taskflow"
@@ -16,43 +15,41 @@
settings = "os", "compiler"
- _source_subfolder = "source_subfolder"
+ @property
+ def _source_subfolder(self):
+ return "source_subfolder"
def configure(self):
- compiler = str(self.settings.compiler)
- compiler_version = tools.Version(self.settings.compiler.version)
- min_req_cppstd = "17" if tools.Version(self.version) <= "2.2.0" else "14"
+ minimal_cpp_standard = "17" if tools.Version(self.version) <= "2.2.0" else "14"
if self.settings.compiler.cppstd:
- tools.check_min_cppstd(self, min_req_cppstd)
- else:
- self.output.warn("%s recipe lacks information about the %s compiler"
- " standard version support" % (self.name, compiler))
+ tools.check_min_cppstd(self, minimal_cpp_standard)
minimal_version = {
- "17" : {
+ "17": {
"Visual Studio": "16",
"gcc": "7.3",
"clang": "6.0",
"apple-clang": "10.0"
},
- "14" : {
+ "14": {
"Visual Studio": "15",
"gcc": "5",
"clang": "4.0",
"apple-clang": "8.0"
}
- }
+ }[minimal_cpp_standard]
- if compiler not in minimal_version[min_req_cppstd]:
- self.output.info("%s requires a compiler that supports at least C++%s" % (self.name, min_req_cppstd))
+ compiler = str(self.settings.compiler)
+ if compiler not in minimal_version:
+ self.output.warn(
+ "%s recipe lacks information about the %s compiler standard version support" % (self.name, compiler))
+ self.output.warn(
+ "%s requires a compiler that supports at least C++%s" % (self.name, minimal_cpp_standard))
return
-
- # Exclude compilers not supported by taskflow
- if compiler_version < minimal_version[min_req_cppstd][compiler]:
- raise ConanInvalidConfiguration("%s requires a compiler that supports"
- " at least C++%s. %s %s is not"
- " supported." % (self.name, min_req_cppstd, compiler, Version(self.settings.compiler.version.value)))
+ version = tools.Version(self.settings.compiler.version)
+ if version < minimal_version[compiler]:
+ raise ConanInvalidConfiguration("%s requires a compiler that supports at least C++%s" % (self.name, minimal_cpp_standard))
def source(self):
tools.get(**self.conan_data["sources"][self.version])
@@ -61,7 +58,7 @@
def package(self):
self.copy(pattern="LICENSE", dst="licenses", src=self._source_subfolder)
- self.copy(pattern="*", dst="include/taskflow", src=os.path.join(self._source_subfolder, "taskflow"))
+ self.copy(pattern="*", dst=os.path.join("include", "taskflow"), src=os.path.join(self._source_subfolder, "taskflow"))
def package_id(self):
self.info.header_only()
@@ -71,4 +68,5 @@
self.cpp_info.system_libs.append("pthread")
if self.settings.compiler == "Visual Studio":
self.cpp_info.defines.append("_ENABLE_EXTENDED_ALIGNED_STORAGE")
-
+ self.cpp_info.names["cmake_find_package"] = "Taskflow"
+ self.cpp_info.names["cmake_find_package_multi"] = "Taskflow"
| {"golden_diff": "diff --git a/recipes/taskflow/all/conanfile.py b/recipes/taskflow/all/conanfile.py\n--- a/recipes/taskflow/all/conanfile.py\n+++ b/recipes/taskflow/all/conanfile.py\n@@ -1,8 +1,7 @@\n-from conans import ConanFile, tools\n-from conans.errors import ConanInvalidConfiguration\n-from conans.model.version import Version\n import os\n \n+from conans import ConanFile, tools\n+from conans.errors import ConanInvalidConfiguration\n \n class TaskflowConan(ConanFile):\n name = \"taskflow\"\n@@ -16,43 +15,41 @@\n \n settings = \"os\", \"compiler\"\n \n- _source_subfolder = \"source_subfolder\"\n+ @property\n+ def _source_subfolder(self):\n+ return \"source_subfolder\"\n \n def configure(self):\n- compiler = str(self.settings.compiler)\n- compiler_version = tools.Version(self.settings.compiler.version)\n- min_req_cppstd = \"17\" if tools.Version(self.version) <= \"2.2.0\" else \"14\"\n+ minimal_cpp_standard = \"17\" if tools.Version(self.version) <= \"2.2.0\" else \"14\"\n \n if self.settings.compiler.cppstd:\n- tools.check_min_cppstd(self, min_req_cppstd)\n- else:\n- self.output.warn(\"%s recipe lacks information about the %s compiler\"\n- \" standard version support\" % (self.name, compiler))\n+ tools.check_min_cppstd(self, minimal_cpp_standard)\n \n minimal_version = {\n- \"17\" : {\n+ \"17\": {\n \"Visual Studio\": \"16\",\n \"gcc\": \"7.3\",\n \"clang\": \"6.0\",\n \"apple-clang\": \"10.0\"\n },\n- \"14\" : {\n+ \"14\": {\n \"Visual Studio\": \"15\",\n \"gcc\": \"5\",\n \"clang\": \"4.0\",\n \"apple-clang\": \"8.0\"\n }\n- }\n+ }[minimal_cpp_standard]\n \n- if compiler not in minimal_version[min_req_cppstd]:\n- self.output.info(\"%s requires a compiler that supports at least C++%s\" % (self.name, min_req_cppstd))\n+ compiler = str(self.settings.compiler)\n+ if compiler not in minimal_version:\n+ self.output.warn(\n+ \"%s recipe lacks information about the %s compiler standard version support\" % (self.name, compiler))\n+ self.output.warn(\n+ \"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n return\n-\n- # Exclude compilers not supported by taskflow\n- if compiler_version < minimal_version[min_req_cppstd][compiler]:\n- raise ConanInvalidConfiguration(\"%s requires a compiler that supports\"\n- \" at least C++%s. 
%s %s is not\"\n- \" supported.\" % (self.name, min_req_cppstd, compiler, Version(self.settings.compiler.version.value)))\n+ version = tools.Version(self.settings.compiler.version)\n+ if version < minimal_version[compiler]:\n+ raise ConanInvalidConfiguration(\"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n \n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n@@ -61,7 +58,7 @@\n \n def package(self):\n self.copy(pattern=\"LICENSE\", dst=\"licenses\", src=self._source_subfolder)\n- self.copy(pattern=\"*\", dst=\"include/taskflow\", src=os.path.join(self._source_subfolder, \"taskflow\"))\n+ self.copy(pattern=\"*\", dst=os.path.join(\"include\", \"taskflow\"), src=os.path.join(self._source_subfolder, \"taskflow\"))\n \n def package_id(self):\n self.info.header_only()\n@@ -71,4 +68,5 @@\n self.cpp_info.system_libs.append(\"pthread\")\n if self.settings.compiler == \"Visual Studio\":\n self.cpp_info.defines.append(\"_ENABLE_EXTENDED_ALIGNED_STORAGE\")\n-\n+ self.cpp_info.names[\"cmake_find_package\"] = \"Taskflow\"\n+ self.cpp_info.names[\"cmake_find_package_multi\"] = \"Taskflow\"\n", "issue": "[request] cpp-taskflow/2.5.0\n### Package Details\r\n * Package Name/Version: **cpp-taskflow/2.5.0**\r\n * Changelog: **https://taskflow.github.io/taskflow/release-2-5-0.html**\r\n\r\n\r\nPlease note that `cpp-taskflow` is renamed to `taskflow`. So new recipe shall be created.\r\n\r\nP.S. Can you tell me please, why recipes is not updated automatically, when new library version is released? Lask of automation? Will it be fixed in future conan center pipeline?\r\n\n", "before_files": [{"content": "from conans import ConanFile, tools\nfrom conans.errors import ConanInvalidConfiguration\nfrom conans.model.version import Version\nimport os\n\n\nclass TaskflowConan(ConanFile):\n name = \"taskflow\"\n description = \"A fast C++ header-only library to help you quickly write parallel programs with complex task dependencies.\"\n topics = (\"conan\", \"taskflow\", \"tasking\", \"parallelism\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/taskflow/taskflow\"\n license = \"MIT\"\n\n no_copy_source = True\n\n settings = \"os\", \"compiler\"\n\n _source_subfolder = \"source_subfolder\"\n\n def configure(self):\n compiler = str(self.settings.compiler)\n compiler_version = tools.Version(self.settings.compiler.version)\n min_req_cppstd = \"17\" if tools.Version(self.version) <= \"2.2.0\" else \"14\"\n\n if self.settings.compiler.cppstd:\n tools.check_min_cppstd(self, min_req_cppstd)\n else:\n self.output.warn(\"%s recipe lacks information about the %s compiler\"\n \" standard version support\" % (self.name, compiler))\n\n minimal_version = {\n \"17\" : {\n \"Visual Studio\": \"16\",\n \"gcc\": \"7.3\",\n \"clang\": \"6.0\",\n \"apple-clang\": \"10.0\"\n },\n \"14\" : {\n \"Visual Studio\": \"15\",\n \"gcc\": \"5\",\n \"clang\": \"4.0\",\n \"apple-clang\": \"8.0\"\n }\n }\n\n if compiler not in minimal_version[min_req_cppstd]:\n self.output.info(\"%s requires a compiler that supports at least C++%s\" % (self.name, min_req_cppstd))\n return\n\n # Exclude compilers not supported by taskflow\n if compiler_version < minimal_version[min_req_cppstd][compiler]:\n raise ConanInvalidConfiguration(\"%s requires a compiler that supports\"\n \" at least C++%s. 
%s %s is not\"\n \" supported.\" % (self.name, min_req_cppstd, compiler, Version(self.settings.compiler.version.value)))\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = self.name + \"-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def package(self):\n self.copy(pattern=\"LICENSE\", dst=\"licenses\", src=self._source_subfolder)\n self.copy(pattern=\"*\", dst=\"include/taskflow\", src=os.path.join(self._source_subfolder, \"taskflow\"))\n\n def package_id(self):\n self.info.header_only()\n\n def package_info(self):\n if self.settings.os == \"Linux\":\n self.cpp_info.system_libs.append(\"pthread\")\n if self.settings.compiler == \"Visual Studio\":\n self.cpp_info.defines.append(\"_ENABLE_EXTENDED_ALIGNED_STORAGE\")\n\n", "path": "recipes/taskflow/all/conanfile.py"}], "after_files": [{"content": "import os\n\nfrom conans import ConanFile, tools\nfrom conans.errors import ConanInvalidConfiguration\n\nclass TaskflowConan(ConanFile):\n name = \"taskflow\"\n description = \"A fast C++ header-only library to help you quickly write parallel programs with complex task dependencies.\"\n topics = (\"conan\", \"taskflow\", \"tasking\", \"parallelism\")\n url = \"https://github.com/conan-io/conan-center-index\"\n homepage = \"https://github.com/taskflow/taskflow\"\n license = \"MIT\"\n\n no_copy_source = True\n\n settings = \"os\", \"compiler\"\n\n @property\n def _source_subfolder(self):\n return \"source_subfolder\"\n\n def configure(self):\n minimal_cpp_standard = \"17\" if tools.Version(self.version) <= \"2.2.0\" else \"14\"\n\n if self.settings.compiler.cppstd:\n tools.check_min_cppstd(self, minimal_cpp_standard)\n\n minimal_version = {\n \"17\": {\n \"Visual Studio\": \"16\",\n \"gcc\": \"7.3\",\n \"clang\": \"6.0\",\n \"apple-clang\": \"10.0\"\n },\n \"14\": {\n \"Visual Studio\": \"15\",\n \"gcc\": \"5\",\n \"clang\": \"4.0\",\n \"apple-clang\": \"8.0\"\n }\n }[minimal_cpp_standard]\n\n compiler = str(self.settings.compiler)\n if compiler not in minimal_version:\n self.output.warn(\n \"%s recipe lacks information about the %s compiler standard version support\" % (self.name, compiler))\n self.output.warn(\n \"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n return\n version = tools.Version(self.settings.compiler.version)\n if version < minimal_version[compiler]:\n raise ConanInvalidConfiguration(\"%s requires a compiler that supports at least C++%s\" % (self.name, minimal_cpp_standard))\n\n def source(self):\n tools.get(**self.conan_data[\"sources\"][self.version])\n extracted_dir = self.name + \"-\" + self.version\n os.rename(extracted_dir, self._source_subfolder)\n\n def package(self):\n self.copy(pattern=\"LICENSE\", dst=\"licenses\", src=self._source_subfolder)\n self.copy(pattern=\"*\", dst=os.path.join(\"include\", \"taskflow\"), src=os.path.join(self._source_subfolder, \"taskflow\"))\n\n def package_id(self):\n self.info.header_only()\n\n def package_info(self):\n if self.settings.os == \"Linux\":\n self.cpp_info.system_libs.append(\"pthread\")\n if self.settings.compiler == \"Visual Studio\":\n self.cpp_info.defines.append(\"_ENABLE_EXTENDED_ALIGNED_STORAGE\")\n self.cpp_info.names[\"cmake_find_package\"] = \"Taskflow\"\n self.cpp_info.names[\"cmake_find_package_multi\"] = \"Taskflow\"\n", "path": "recipes/taskflow/all/conanfile.py"}]} | 1,169 | 932 |
gh_patches_debug_3808 | rasdani/github-patches | git_diff | ManimCommunity__manim-2197 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Create a test for resolution configuration
## Enhancement proposal
A test for the `--resolution` configuration option. It should be placed [here](https://github.com/ManimCommunity/manim/blob/main/tests/test_scene_rendering/test_cli_flags.py) and [here](https://github.com/ManimCommunity/manim/blob/main/tests/test_scene_rendering/opengl/test_cli_flags_opengl.py) (the OpenGL test can be marked as `xfail` if setting resolution does not work properly on OpenGL). Test different resolutions as well as different ways of separating the resolutions (width and height separated by `;`, `,`, and `-`).
## Additional comments
<!-- Add further context that you think might be relevant. -->
--- END ISSUE ---
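A rough shape such a test could take, reusing the `capture` and `get_video_metadata` helpers from `manim/utils/commands.py` shown below; the scene name, fixture, CLI invocation, and output layout are assumptions that would need to match the existing CLI tests' conventions:
```python
import pytest

from manim.utils.commands import capture, get_video_metadata


@pytest.mark.parametrize("separator", [";", ",", "-"])
def test_resolution_flag(tmp_path, simple_scenes_path, separator):
    # Render a known scene at a non-default resolution, then inspect the output file.
    resolution = f"400{separator}200"
    command = [
        "python", "-m", "manim",
        str(simple_scenes_path), "SquareToCircle",
        "--media_dir", str(tmp_path),
        "--resolution", resolution,
    ]
    _, err, exit_code = capture(command)
    assert exit_code == 0, err

    # Output layout is an assumption, so search for the rendered file instead of hardcoding it.
    video = next((tmp_path / "videos").rglob("SquareToCircle.mp4"))
    metadata = get_video_metadata(str(video))  # pass a plain string path to the helper
    assert (int(metadata["width"]), int(metadata["height"])) == (400, 200)
```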
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `manim/utils/commands.py`
Content:
```
1 import json
2 import os
3 from subprocess import run
4 from typing import Any, Dict, List
5
6 __all__ = [
7 "capture",
8 "get_video_metadata",
9 "get_dir_layout",
10 ]
11
12
13 def capture(command, cwd=None, command_input=None):
14 p = run(command, cwd=cwd, input=command_input, capture_output=True, text=True)
15 out, err = p.stdout, p.stderr
16 return out, err, p.returncode
17
18
19 def get_video_metadata(path_to_video: str) -> Dict[str, Any]:
20 command = [
21 "ffprobe",
22 "-v",
23 "error",
24 "-select_streams",
25 "v:0",
26 "-show_entries",
27 "stream=width,height,nb_frames,duration,avg_frame_rate,codec_name",
28 "-print_format",
29 "json",
30 path_to_video,
31 ]
32 config, err, exitcode = capture(command)
33 assert exitcode == 0, f"FFprobe error: {err}"
34 return json.loads(config)["streams"][0]
35
36
37 def get_dir_layout(dirpath: str) -> List[str]:
38 """Get list of paths relative to dirpath of all files in dir and subdirs recursively."""
39 index_files: List[str] = []
40 for root, dirs, files in os.walk(dirpath):
41 for file in files:
42 index_files.append(f"{os.path.relpath(os.path.join(root, file), dirpath)}")
43 return index_files
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/manim/utils/commands.py b/manim/utils/commands.py
--- a/manim/utils/commands.py
+++ b/manim/utils/commands.py
@@ -27,7 +27,7 @@
"stream=width,height,nb_frames,duration,avg_frame_rate,codec_name",
"-print_format",
"json",
- path_to_video,
+ str(path_to_video),
]
config, err, exitcode = capture(command)
assert exitcode == 0, f"FFprobe error: {err}"
| {"golden_diff": "diff --git a/manim/utils/commands.py b/manim/utils/commands.py\n--- a/manim/utils/commands.py\n+++ b/manim/utils/commands.py\n@@ -27,7 +27,7 @@\n \"stream=width,height,nb_frames,duration,avg_frame_rate,codec_name\",\n \"-print_format\",\n \"json\",\n- path_to_video,\n+ str(path_to_video),\n ]\n config, err, exitcode = capture(command)\n assert exitcode == 0, f\"FFprobe error: {err}\"\n", "issue": "Create a test for resolution configuration\n## Enhancement proposal\r\nA test for the `--resolution` configuration option. It should be placed [here](https://github.com/ManimCommunity/manim/blob/main/tests/test_scene_rendering/test_cli_flags.py) and [here](https://github.com/ManimCommunity/manim/blob/main/tests/test_scene_rendering/opengl/test_cli_flags_opengl.py) (the OpenGL test can be marked as `xfail` if setting resolution does not work properly on OpenGL). Test different resolutions as well as different ways of separating the resolutions (width and height separated by `;`, `,`, and `-`).\r\n\r\n## Additional comments\r\n<!-- Add further context that you think might be relevant. -->\r\n\n", "before_files": [{"content": "import json\nimport os\nfrom subprocess import run\nfrom typing import Any, Dict, List\n\n__all__ = [\n \"capture\",\n \"get_video_metadata\",\n \"get_dir_layout\",\n]\n\n\ndef capture(command, cwd=None, command_input=None):\n p = run(command, cwd=cwd, input=command_input, capture_output=True, text=True)\n out, err = p.stdout, p.stderr\n return out, err, p.returncode\n\n\ndef get_video_metadata(path_to_video: str) -> Dict[str, Any]:\n command = [\n \"ffprobe\",\n \"-v\",\n \"error\",\n \"-select_streams\",\n \"v:0\",\n \"-show_entries\",\n \"stream=width,height,nb_frames,duration,avg_frame_rate,codec_name\",\n \"-print_format\",\n \"json\",\n path_to_video,\n ]\n config, err, exitcode = capture(command)\n assert exitcode == 0, f\"FFprobe error: {err}\"\n return json.loads(config)[\"streams\"][0]\n\n\ndef get_dir_layout(dirpath: str) -> List[str]:\n \"\"\"Get list of paths relative to dirpath of all files in dir and subdirs recursively.\"\"\"\n index_files: List[str] = []\n for root, dirs, files in os.walk(dirpath):\n for file in files:\n index_files.append(f\"{os.path.relpath(os.path.join(root, file), dirpath)}\")\n return index_files\n", "path": "manim/utils/commands.py"}], "after_files": [{"content": "import json\nimport os\nfrom subprocess import run\nfrom typing import Any, Dict, List\n\n__all__ = [\n \"capture\",\n \"get_video_metadata\",\n \"get_dir_layout\",\n]\n\n\ndef capture(command, cwd=None, command_input=None):\n p = run(command, cwd=cwd, input=command_input, capture_output=True, text=True)\n out, err = p.stdout, p.stderr\n return out, err, p.returncode\n\n\ndef get_video_metadata(path_to_video: str) -> Dict[str, Any]:\n command = [\n \"ffprobe\",\n \"-v\",\n \"error\",\n \"-select_streams\",\n \"v:0\",\n \"-show_entries\",\n \"stream=width,height,nb_frames,duration,avg_frame_rate,codec_name\",\n \"-print_format\",\n \"json\",\n str(path_to_video),\n ]\n config, err, exitcode = capture(command)\n assert exitcode == 0, f\"FFprobe error: {err}\"\n return json.loads(config)[\"streams\"][0]\n\n\ndef get_dir_layout(dirpath: str) -> List[str]:\n \"\"\"Get list of paths relative to dirpath of all files in dir and subdirs recursively.\"\"\"\n index_files: List[str] = []\n for root, dirs, files in os.walk(dirpath):\n for file in files:\n index_files.append(f\"{os.path.relpath(os.path.join(root, file), dirpath)}\")\n return index_files\n", "path": 
"manim/utils/commands.py"}]} | 799 | 119 |
gh_patches_debug_28244 | rasdani/github-patches | git_diff | scrapy__scrapy-3383 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Contracts for FormRequest
At the moment there's no way to create a contract for a callback that is passed a response from `FormRequest`.
How about something simple like this (`ContractsManager.from_method`):
```python
request = FormRequest(**kwargs) if 'formdata' in kwargs else Request(**kwargs)
```
--- END ISSUE ---
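One shape a fix could take is to let each contract declare which request class the crawl should build, and have `ContractsManager.from_method` honour that declaration instead of hard-coding `Request`. A hypothetical contract using such a hook (the `@form key=value` docstring syntax and its parsing are assumptions, not existing Scrapy behaviour):
```python
from scrapy.contracts import Contract
from scrapy.http import FormRequest


class FormContract(Contract):
    """Hypothetical '@form key=value ...' contract that fetches the page with FormRequest."""

    name = "form"
    request_cls = FormRequest  # a manager that respects this would build FormRequest(**kwargs)

    def adjust_request_args(self, args):
        # Turn the docstring tokens into formdata for the request.
        args["formdata"] = dict(token.split("=", 1) for token in self.args)
        return args
```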
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/contracts/__init__.py`
Content:
```
1 import sys
2 import re
3 from functools import wraps
4 from inspect import getmembers
5 from unittest import TestCase
6
7 from scrapy.http import Request
8 from scrapy.utils.spider import iterate_spider_output
9 from scrapy.utils.python import get_spec
10
11
12 class ContractsManager(object):
13 contracts = {}
14
15 def __init__(self, contracts):
16 for contract in contracts:
17 self.contracts[contract.name] = contract
18
19 def tested_methods_from_spidercls(self, spidercls):
20 methods = []
21 for key, value in getmembers(spidercls):
22 if (callable(value) and value.__doc__ and
23 re.search(r'^\s*@', value.__doc__, re.MULTILINE)):
24 methods.append(key)
25
26 return methods
27
28 def extract_contracts(self, method):
29 contracts = []
30 for line in method.__doc__.split('\n'):
31 line = line.strip()
32
33 if line.startswith('@'):
34 name, args = re.match(r'@(\w+)\s*(.*)', line).groups()
35 args = re.split(r'\s+', args)
36
37 contracts.append(self.contracts[name](method, *args))
38
39 return contracts
40
41 def from_spider(self, spider, results):
42 requests = []
43 for method in self.tested_methods_from_spidercls(type(spider)):
44 bound_method = spider.__getattribute__(method)
45 requests.append(self.from_method(bound_method, results))
46
47 return requests
48
49 def from_method(self, method, results):
50 contracts = self.extract_contracts(method)
51 if contracts:
52 # calculate request args
53 args, kwargs = get_spec(Request.__init__)
54 kwargs['callback'] = method
55 for contract in contracts:
56 kwargs = contract.adjust_request_args(kwargs)
57
58 # create and prepare request
59 args.remove('self')
60 if set(args).issubset(set(kwargs)):
61 request = Request(**kwargs)
62
63 # execute pre and post hooks in order
64 for contract in reversed(contracts):
65 request = contract.add_pre_hook(request, results)
66 for contract in contracts:
67 request = contract.add_post_hook(request, results)
68
69 self._clean_req(request, method, results)
70 return request
71
72 def _clean_req(self, request, method, results):
73 """ stop the request from returning objects and records any errors """
74
75 cb = request.callback
76
77 @wraps(cb)
78 def cb_wrapper(response):
79 try:
80 output = cb(response)
81 output = list(iterate_spider_output(output))
82 except:
83 case = _create_testcase(method, 'callback')
84 results.addError(case, sys.exc_info())
85
86 def eb_wrapper(failure):
87 case = _create_testcase(method, 'errback')
88 exc_info = failure.type, failure.value, failure.getTracebackObject()
89 results.addError(case, exc_info)
90
91 request.callback = cb_wrapper
92 request.errback = eb_wrapper
93
94
95 class Contract(object):
96 """ Abstract class for contracts """
97
98 def __init__(self, method, *args):
99 self.testcase_pre = _create_testcase(method, '@%s pre-hook' % self.name)
100 self.testcase_post = _create_testcase(method, '@%s post-hook' % self.name)
101 self.args = args
102
103 def add_pre_hook(self, request, results):
104 if hasattr(self, 'pre_process'):
105 cb = request.callback
106
107 @wraps(cb)
108 def wrapper(response):
109 try:
110 results.startTest(self.testcase_pre)
111 self.pre_process(response)
112 results.stopTest(self.testcase_pre)
113 except AssertionError:
114 results.addFailure(self.testcase_pre, sys.exc_info())
115 except Exception:
116 results.addError(self.testcase_pre, sys.exc_info())
117 else:
118 results.addSuccess(self.testcase_pre)
119 finally:
120 return list(iterate_spider_output(cb(response)))
121
122 request.callback = wrapper
123
124 return request
125
126 def add_post_hook(self, request, results):
127 if hasattr(self, 'post_process'):
128 cb = request.callback
129
130 @wraps(cb)
131 def wrapper(response):
132 output = list(iterate_spider_output(cb(response)))
133 try:
134 results.startTest(self.testcase_post)
135 self.post_process(output)
136 results.stopTest(self.testcase_post)
137 except AssertionError:
138 results.addFailure(self.testcase_post, sys.exc_info())
139 except Exception:
140 results.addError(self.testcase_post, sys.exc_info())
141 else:
142 results.addSuccess(self.testcase_post)
143 finally:
144 return output
145
146 request.callback = wrapper
147
148 return request
149
150 def adjust_request_args(self, args):
151 return args
152
153
154 def _create_testcase(method, desc):
155 spider = method.__self__.name
156
157 class ContractTestCase(TestCase):
158 def __str__(_self):
159 return "[%s] %s (%s)" % (spider, method.__name__, desc)
160
161 name = '%s_%s' % (spider, method.__name__)
162 setattr(ContractTestCase, name, lambda x: x)
163 return ContractTestCase(name)
164
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scrapy/contracts/__init__.py b/scrapy/contracts/__init__.py
--- a/scrapy/contracts/__init__.py
+++ b/scrapy/contracts/__init__.py
@@ -49,16 +49,22 @@
def from_method(self, method, results):
contracts = self.extract_contracts(method)
if contracts:
+ request_cls = Request
+ for contract in contracts:
+ if contract.request_cls is not None:
+ request_cls = contract.request_cls
+
# calculate request args
- args, kwargs = get_spec(Request.__init__)
+ args, kwargs = get_spec(request_cls.__init__)
kwargs['callback'] = method
for contract in contracts:
kwargs = contract.adjust_request_args(kwargs)
- # create and prepare request
args.remove('self')
+
+ # check if all positional arguments are defined in kwargs
if set(args).issubset(set(kwargs)):
- request = Request(**kwargs)
+ request = request_cls(**kwargs)
# execute pre and post hooks in order
for contract in reversed(contracts):
@@ -94,6 +100,7 @@
class Contract(object):
""" Abstract class for contracts """
+ request_cls = None
def __init__(self, method, *args):
self.testcase_pre = _create_testcase(method, '@%s pre-hook' % self.name)
| {"golden_diff": "diff --git a/scrapy/contracts/__init__.py b/scrapy/contracts/__init__.py\n--- a/scrapy/contracts/__init__.py\n+++ b/scrapy/contracts/__init__.py\n@@ -49,16 +49,22 @@\n def from_method(self, method, results):\n contracts = self.extract_contracts(method)\n if contracts:\n+ request_cls = Request\n+ for contract in contracts:\n+ if contract.request_cls is not None:\n+ request_cls = contract.request_cls\n+\n # calculate request args\n- args, kwargs = get_spec(Request.__init__)\n+ args, kwargs = get_spec(request_cls.__init__)\n kwargs['callback'] = method\n for contract in contracts:\n kwargs = contract.adjust_request_args(kwargs)\n \n- # create and prepare request\n args.remove('self')\n+\n+ # check if all positional arguments are defined in kwargs\n if set(args).issubset(set(kwargs)):\n- request = Request(**kwargs)\n+ request = request_cls(**kwargs)\n \n # execute pre and post hooks in order\n for contract in reversed(contracts):\n@@ -94,6 +100,7 @@\n \n class Contract(object):\n \"\"\" Abstract class for contracts \"\"\"\n+ request_cls = None\n \n def __init__(self, method, *args):\n self.testcase_pre = _create_testcase(method, '@%s pre-hook' % self.name)\n", "issue": "Contracts for FormRequest\nAt the moment there's no way to create a contract for a callback that is passed a response from `FormRequest`.\r\n\r\nHow about something simple like this (`ContractsManager.from_method`):\r\n```python\r\nrequest = FormRequest(**kwargs) if 'formdata' in kwargs else Request(**kwargs)\r\n```\n", "before_files": [{"content": "import sys\nimport re\nfrom functools import wraps\nfrom inspect import getmembers\nfrom unittest import TestCase\n\nfrom scrapy.http import Request\nfrom scrapy.utils.spider import iterate_spider_output\nfrom scrapy.utils.python import get_spec\n\n\nclass ContractsManager(object):\n contracts = {}\n\n def __init__(self, contracts):\n for contract in contracts:\n self.contracts[contract.name] = contract\n\n def tested_methods_from_spidercls(self, spidercls):\n methods = []\n for key, value in getmembers(spidercls):\n if (callable(value) and value.__doc__ and\n re.search(r'^\\s*@', value.__doc__, re.MULTILINE)):\n methods.append(key)\n\n return methods\n\n def extract_contracts(self, method):\n contracts = []\n for line in method.__doc__.split('\\n'):\n line = line.strip()\n\n if line.startswith('@'):\n name, args = re.match(r'@(\\w+)\\s*(.*)', line).groups()\n args = re.split(r'\\s+', args)\n\n contracts.append(self.contracts[name](method, *args))\n\n return contracts\n\n def from_spider(self, spider, results):\n requests = []\n for method in self.tested_methods_from_spidercls(type(spider)):\n bound_method = spider.__getattribute__(method)\n requests.append(self.from_method(bound_method, results))\n\n return requests\n\n def from_method(self, method, results):\n contracts = self.extract_contracts(method)\n if contracts:\n # calculate request args\n args, kwargs = get_spec(Request.__init__)\n kwargs['callback'] = method\n for contract in contracts:\n kwargs = contract.adjust_request_args(kwargs)\n\n # create and prepare request\n args.remove('self')\n if set(args).issubset(set(kwargs)):\n request = Request(**kwargs)\n\n # execute pre and post hooks in order\n for contract in reversed(contracts):\n request = contract.add_pre_hook(request, results)\n for contract in contracts:\n request = contract.add_post_hook(request, results)\n\n self._clean_req(request, method, results)\n return request\n\n def _clean_req(self, request, method, results):\n \"\"\" stop the request 
from returning objects and records any errors \"\"\"\n\n cb = request.callback\n\n @wraps(cb)\n def cb_wrapper(response):\n try:\n output = cb(response)\n output = list(iterate_spider_output(output))\n except:\n case = _create_testcase(method, 'callback')\n results.addError(case, sys.exc_info())\n\n def eb_wrapper(failure):\n case = _create_testcase(method, 'errback')\n exc_info = failure.type, failure.value, failure.getTracebackObject()\n results.addError(case, exc_info)\n\n request.callback = cb_wrapper\n request.errback = eb_wrapper\n\n\nclass Contract(object):\n \"\"\" Abstract class for contracts \"\"\"\n\n def __init__(self, method, *args):\n self.testcase_pre = _create_testcase(method, '@%s pre-hook' % self.name)\n self.testcase_post = _create_testcase(method, '@%s post-hook' % self.name)\n self.args = args\n\n def add_pre_hook(self, request, results):\n if hasattr(self, 'pre_process'):\n cb = request.callback\n\n @wraps(cb)\n def wrapper(response):\n try:\n results.startTest(self.testcase_pre)\n self.pre_process(response)\n results.stopTest(self.testcase_pre)\n except AssertionError:\n results.addFailure(self.testcase_pre, sys.exc_info())\n except Exception:\n results.addError(self.testcase_pre, sys.exc_info())\n else:\n results.addSuccess(self.testcase_pre)\n finally:\n return list(iterate_spider_output(cb(response)))\n\n request.callback = wrapper\n\n return request\n\n def add_post_hook(self, request, results):\n if hasattr(self, 'post_process'):\n cb = request.callback\n\n @wraps(cb)\n def wrapper(response):\n output = list(iterate_spider_output(cb(response)))\n try:\n results.startTest(self.testcase_post)\n self.post_process(output)\n results.stopTest(self.testcase_post)\n except AssertionError:\n results.addFailure(self.testcase_post, sys.exc_info())\n except Exception:\n results.addError(self.testcase_post, sys.exc_info())\n else:\n results.addSuccess(self.testcase_post)\n finally:\n return output\n\n request.callback = wrapper\n\n return request\n\n def adjust_request_args(self, args):\n return args\n\n\ndef _create_testcase(method, desc):\n spider = method.__self__.name\n\n class ContractTestCase(TestCase):\n def __str__(_self):\n return \"[%s] %s (%s)\" % (spider, method.__name__, desc)\n\n name = '%s_%s' % (spider, method.__name__)\n setattr(ContractTestCase, name, lambda x: x)\n return ContractTestCase(name)\n", "path": "scrapy/contracts/__init__.py"}], "after_files": [{"content": "import sys\nimport re\nfrom functools import wraps\nfrom inspect import getmembers\nfrom unittest import TestCase\n\nfrom scrapy.http import Request\nfrom scrapy.utils.spider import iterate_spider_output\nfrom scrapy.utils.python import get_spec\n\n\nclass ContractsManager(object):\n contracts = {}\n\n def __init__(self, contracts):\n for contract in contracts:\n self.contracts[contract.name] = contract\n\n def tested_methods_from_spidercls(self, spidercls):\n methods = []\n for key, value in getmembers(spidercls):\n if (callable(value) and value.__doc__ and\n re.search(r'^\\s*@', value.__doc__, re.MULTILINE)):\n methods.append(key)\n\n return methods\n\n def extract_contracts(self, method):\n contracts = []\n for line in method.__doc__.split('\\n'):\n line = line.strip()\n\n if line.startswith('@'):\n name, args = re.match(r'@(\\w+)\\s*(.*)', line).groups()\n args = re.split(r'\\s+', args)\n\n contracts.append(self.contracts[name](method, *args))\n\n return contracts\n\n def from_spider(self, spider, results):\n requests = []\n for method in 
self.tested_methods_from_spidercls(type(spider)):\n bound_method = spider.__getattribute__(method)\n requests.append(self.from_method(bound_method, results))\n\n return requests\n\n def from_method(self, method, results):\n contracts = self.extract_contracts(method)\n if contracts:\n request_cls = Request\n for contract in contracts:\n if contract.request_cls is not None:\n request_cls = contract.request_cls\n\n # calculate request args\n args, kwargs = get_spec(request_cls.__init__)\n kwargs['callback'] = method\n for contract in contracts:\n kwargs = contract.adjust_request_args(kwargs)\n\n args.remove('self')\n\n # check if all positional arguments are defined in kwargs\n if set(args).issubset(set(kwargs)):\n request = request_cls(**kwargs)\n\n # execute pre and post hooks in order\n for contract in reversed(contracts):\n request = contract.add_pre_hook(request, results)\n for contract in contracts:\n request = contract.add_post_hook(request, results)\n\n self._clean_req(request, method, results)\n return request\n\n def _clean_req(self, request, method, results):\n \"\"\" stop the request from returning objects and records any errors \"\"\"\n\n cb = request.callback\n\n @wraps(cb)\n def cb_wrapper(response):\n try:\n output = cb(response)\n output = list(iterate_spider_output(output))\n except:\n case = _create_testcase(method, 'callback')\n results.addError(case, sys.exc_info())\n\n def eb_wrapper(failure):\n case = _create_testcase(method, 'errback')\n exc_info = failure.type, failure.value, failure.getTracebackObject()\n results.addError(case, exc_info)\n\n request.callback = cb_wrapper\n request.errback = eb_wrapper\n\n\nclass Contract(object):\n \"\"\" Abstract class for contracts \"\"\"\n request_cls = None\n\n def __init__(self, method, *args):\n self.testcase_pre = _create_testcase(method, '@%s pre-hook' % self.name)\n self.testcase_post = _create_testcase(method, '@%s post-hook' % self.name)\n self.args = args\n\n def add_pre_hook(self, request, results):\n if hasattr(self, 'pre_process'):\n cb = request.callback\n\n @wraps(cb)\n def wrapper(response):\n try:\n results.startTest(self.testcase_pre)\n self.pre_process(response)\n results.stopTest(self.testcase_pre)\n except AssertionError:\n results.addFailure(self.testcase_pre, sys.exc_info())\n except Exception:\n results.addError(self.testcase_pre, sys.exc_info())\n else:\n results.addSuccess(self.testcase_pre)\n finally:\n return list(iterate_spider_output(cb(response)))\n\n request.callback = wrapper\n\n return request\n\n def add_post_hook(self, request, results):\n if hasattr(self, 'post_process'):\n cb = request.callback\n\n @wraps(cb)\n def wrapper(response):\n output = list(iterate_spider_output(cb(response)))\n try:\n results.startTest(self.testcase_post)\n self.post_process(output)\n results.stopTest(self.testcase_post)\n except AssertionError:\n results.addFailure(self.testcase_post, sys.exc_info())\n except Exception:\n results.addError(self.testcase_post, sys.exc_info())\n else:\n results.addSuccess(self.testcase_post)\n finally:\n return output\n\n request.callback = wrapper\n\n return request\n\n def adjust_request_args(self, args):\n return args\n\n\ndef _create_testcase(method, desc):\n spider = method.__self__.name\n\n class ContractTestCase(TestCase):\n def __str__(_self):\n return \"[%s] %s (%s)\" % (spider, method.__name__, desc)\n\n name = '%s_%s' % (spider, method.__name__)\n setattr(ContractTestCase, name, lambda x: x)\n return ContractTestCase(name)\n", "path": "scrapy/contracts/__init__.py"}]} | 1,796 
| 311 |
gh_patches_debug_18067 | rasdani/github-patches | git_diff | ivy-llc__ivy-13637 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
beta
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/jax/random.py`
Content:
```
1 # local
2 import ivy
3 from ivy.functional.frontends.jax.func_wrapper import (
4 to_ivy_arrays_and_back,
5 handle_jax_dtype,
6 )
7
8
9 @to_ivy_arrays_and_back
10 def PRNGKey(seed):
11 return ivy.array([0, seed % 4294967295 - (seed // 4294967295)], dtype=ivy.int64)
12
13
14 @handle_jax_dtype
15 @to_ivy_arrays_and_back
16 def uniform(key, shape=(), dtype=None, minval=0.0, maxval=1.0):
17 return ivy.random_uniform(
18 low=minval, high=maxval, shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1])
19 )
20
21
22 @handle_jax_dtype
23 @to_ivy_arrays_and_back
24 def normal(key, shape=(), dtype=None):
25 return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))
26
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/jax/random.py b/ivy/functional/frontends/jax/random.py
--- a/ivy/functional/frontends/jax/random.py
+++ b/ivy/functional/frontends/jax/random.py
@@ -1,5 +1,6 @@
# local
import ivy
+from ivy.func_wrapper import with_unsupported_dtypes
from ivy.functional.frontends.jax.func_wrapper import (
to_ivy_arrays_and_back,
handle_jax_dtype,
@@ -23,3 +24,24 @@
@to_ivy_arrays_and_back
def normal(key, shape=(), dtype=None):
return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))
+
+
+def _get_seed(key):
+ key1, key2 = int(key[0]), int(key[1])
+ return ivy.to_scalar(int("".join(map(str, [key1, key2]))))
+
+
+@handle_jax_dtype
+@to_ivy_arrays_and_back
+@with_unsupported_dtypes(
+ {
+ "0.3.14 and below": (
+ "float16",
+ "bfloat16",
+ )
+ },
+ "jax",
+)
+def beta(key, a, b, shape=None, dtype=None):
+ seed = _get_seed(key)
+ return ivy.beta(a, b, shape=shape, dtype=dtype, seed=seed)
| {"golden_diff": "diff --git a/ivy/functional/frontends/jax/random.py b/ivy/functional/frontends/jax/random.py\n--- a/ivy/functional/frontends/jax/random.py\n+++ b/ivy/functional/frontends/jax/random.py\n@@ -1,5 +1,6 @@\n # local\n import ivy\n+from ivy.func_wrapper import with_unsupported_dtypes\n from ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n@@ -23,3 +24,24 @@\n @to_ivy_arrays_and_back\n def normal(key, shape=(), dtype=None):\n return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))\n+\n+\n+def _get_seed(key):\n+ key1, key2 = int(key[0]), int(key[1])\n+ return ivy.to_scalar(int(\"\".join(map(str, [key1, key2]))))\n+\n+\n+@handle_jax_dtype\n+@to_ivy_arrays_and_back\n+@with_unsupported_dtypes(\n+ {\n+ \"0.3.14 and below\": (\n+ \"float16\",\n+ \"bfloat16\",\n+ )\n+ },\n+ \"jax\",\n+)\n+def beta(key, a, b, shape=None, dtype=None):\n+ seed = _get_seed(key)\n+ return ivy.beta(a, b, shape=shape, dtype=dtype, seed=seed)\n", "issue": "beta\n\n", "before_files": [{"content": "# local\nimport ivy\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n)\n\n\n@to_ivy_arrays_and_back\ndef PRNGKey(seed):\n return ivy.array([0, seed % 4294967295 - (seed // 4294967295)], dtype=ivy.int64)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef uniform(key, shape=(), dtype=None, minval=0.0, maxval=1.0):\n return ivy.random_uniform(\n low=minval, high=maxval, shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1])\n )\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef normal(key, shape=(), dtype=None):\n return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))\n", "path": "ivy/functional/frontends/jax/random.py"}], "after_files": [{"content": "# local\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\nfrom ivy.functional.frontends.jax.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_jax_dtype,\n)\n\n\n@to_ivy_arrays_and_back\ndef PRNGKey(seed):\n return ivy.array([0, seed % 4294967295 - (seed // 4294967295)], dtype=ivy.int64)\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef uniform(key, shape=(), dtype=None, minval=0.0, maxval=1.0):\n return ivy.random_uniform(\n low=minval, high=maxval, shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1])\n )\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\ndef normal(key, shape=(), dtype=None):\n return ivy.random_normal(shape=shape, dtype=dtype, seed=ivy.to_scalar(key[1]))\n\n\ndef _get_seed(key):\n key1, key2 = int(key[0]), int(key[1])\n return ivy.to_scalar(int(\"\".join(map(str, [key1, key2]))))\n\n\n@handle_jax_dtype\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes(\n {\n \"0.3.14 and below\": (\n \"float16\",\n \"bfloat16\",\n )\n },\n \"jax\",\n)\ndef beta(key, a, b, shape=None, dtype=None):\n seed = _get_seed(key)\n return ivy.beta(a, b, shape=shape, dtype=dtype, seed=seed)\n", "path": "ivy/functional/frontends/jax/random.py"}]} | 526 | 327 |
gh_patches_debug_28361 | rasdani/github-patches | git_diff | getsentry__sentry-41334 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pip-compile and CUSTOM_COMPILE_COMMAND
### Environment
SaaS (https://sentry.io/)
### Version
_No response_
### Steps to Reproduce
JFYI
https://github.com/getsentry/sentry/blob/2e36d4d3a3926e9efe06287e59d27be1c40300fa/tools/freeze_requirements.py#L13
There is a `CUSTOM_COMPILE_COMMAND` environment variable to [customize](https://github.com/jazzband/pip-tools#configuration) the header:
```
$ CUSTOM_COMPILE_COMMAND="make freeze-requirements" pip-compile
#
# This file is autogenerated by pip-compile with python 3.11
# To update, run:
#
# make freeze-requirements
#
....
```
### Expected Result
N/A
### Actual Result
N/A
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tools/freeze_requirements.py`
Content:
```
1 from __future__ import annotations
2
3 import argparse
4 from concurrent.futures import Future, ThreadPoolExecutor
5 from os.path import abspath
6 from subprocess import CalledProcessError, run
7 from typing import Sequence
8
9 from tools.lib import gitroot
10
11
12 def worker(args: tuple[str, ...]) -> None:
13 # pip-compile doesn't let you customize the header, so we write
14 # one ourselves. However, pip-compile needs -o DEST otherwise
15 # it will bump >= pins even if they're satisfied. So, we need to
16 # unfortunately rewrite the whole file.
17 dest = args[-1]
18 try:
19 run(args, check=True, capture_output=True)
20 except CalledProcessError as e:
21 raise e
22
23 with open(dest, "rb+") as f:
24 content = f.read()
25 f.seek(0, 0)
26 f.write(
27 b"""# DO NOT MODIFY. This file was generated with `make freeze-requirements`.
28
29 """
30 + content
31 )
32
33
34 def check_futures(futures: list[Future[None]]) -> int:
35 rc = 0
36 for future in futures:
37 try:
38 future.result()
39 except CalledProcessError as e:
40 rc = 1
41 print(
42 f"""`{e.cmd}` returned code {e.returncode}
43
44 stdout:
45 {e.stdout.decode()}
46
47 stderr:
48 {e.stderr.decode()}
49 """
50 )
51 return rc
52
53
54 def main(argv: Sequence[str] | None = None) -> int:
55 parser = argparse.ArgumentParser()
56 parser.parse_args(argv)
57
58 base_path = abspath(gitroot())
59
60 base_cmd = (
61 "pip-compile",
62 "--allow-unsafe",
63 "--no-annotate",
64 "--no-header",
65 "--quiet",
66 "--strip-extras",
67 "--index-url=https://pypi.devinfra.sentry.io/simple",
68 )
69
70 executor = ThreadPoolExecutor(max_workers=2)
71 futures = [
72 executor.submit(
73 worker,
74 (
75 *base_cmd,
76 f"{base_path}/requirements-base.txt",
77 f"{base_path}/requirements-getsentry.txt",
78 "-o",
79 f"{base_path}/requirements-frozen.txt",
80 ),
81 ),
82 executor.submit(
83 worker,
84 (
85 *base_cmd,
86 f"{base_path}/requirements-base.txt",
87 f"{base_path}/requirements-getsentry.txt",
88 f"{base_path}/requirements-dev.txt",
89 "-o",
90 f"{base_path}/requirements-dev-frozen.txt",
91 ),
92 ),
93 ]
94
95 rc = check_futures(futures)
96 executor.shutdown()
97 return rc
98
99
100 if __name__ == "__main__":
101 raise SystemExit(main())
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/tools/freeze_requirements.py b/tools/freeze_requirements.py
--- a/tools/freeze_requirements.py
+++ b/tools/freeze_requirements.py
@@ -1,6 +1,7 @@
from __future__ import annotations
import argparse
+import os
from concurrent.futures import Future, ThreadPoolExecutor
from os.path import abspath
from subprocess import CalledProcessError, run
@@ -10,25 +11,10 @@
def worker(args: tuple[str, ...]) -> None:
- # pip-compile doesn't let you customize the header, so we write
- # one ourselves. However, pip-compile needs -o DEST otherwise
- # it will bump >= pins even if they're satisfied. So, we need to
- # unfortunately rewrite the whole file.
- dest = args[-1]
- try:
- run(args, check=True, capture_output=True)
- except CalledProcessError as e:
- raise e
-
- with open(dest, "rb+") as f:
- content = f.read()
- f.seek(0, 0)
- f.write(
- b"""# DO NOT MODIFY. This file was generated with `make freeze-requirements`.
+ env = os.environ.copy()
+ env["CUSTOM_COMPILE_COMMAND"] = "make freeze-requirements"
-"""
- + content
- )
+ run(args, check=True, capture_output=True, env=env)
def check_futures(futures: list[Future[None]]) -> int:
@@ -61,7 +47,6 @@
"pip-compile",
"--allow-unsafe",
"--no-annotate",
- "--no-header",
"--quiet",
"--strip-extras",
"--index-url=https://pypi.devinfra.sentry.io/simple",
| {"golden_diff": "diff --git a/tools/freeze_requirements.py b/tools/freeze_requirements.py\n--- a/tools/freeze_requirements.py\n+++ b/tools/freeze_requirements.py\n@@ -1,6 +1,7 @@\n from __future__ import annotations\n \n import argparse\n+import os\n from concurrent.futures import Future, ThreadPoolExecutor\n from os.path import abspath\n from subprocess import CalledProcessError, run\n@@ -10,25 +11,10 @@\n \n \n def worker(args: tuple[str, ...]) -> None:\n- # pip-compile doesn't let you customize the header, so we write\n- # one ourselves. However, pip-compile needs -o DEST otherwise\n- # it will bump >= pins even if they're satisfied. So, we need to\n- # unfortunately rewrite the whole file.\n- dest = args[-1]\n- try:\n- run(args, check=True, capture_output=True)\n- except CalledProcessError as e:\n- raise e\n-\n- with open(dest, \"rb+\") as f:\n- content = f.read()\n- f.seek(0, 0)\n- f.write(\n- b\"\"\"# DO NOT MODIFY. This file was generated with `make freeze-requirements`.\n+ env = os.environ.copy()\n+ env[\"CUSTOM_COMPILE_COMMAND\"] = \"make freeze-requirements\"\n \n-\"\"\"\n- + content\n- )\n+ run(args, check=True, capture_output=True, env=env)\n \n \n def check_futures(futures: list[Future[None]]) -> int:\n@@ -61,7 +47,6 @@\n \"pip-compile\",\n \"--allow-unsafe\",\n \"--no-annotate\",\n- \"--no-header\",\n \"--quiet\",\n \"--strip-extras\",\n \"--index-url=https://pypi.devinfra.sentry.io/simple\",\n", "issue": "pip-compile and CUSTOM_COMPILE_COMMAND\n### Environment\r\n\r\nSaaS (https://sentry.io/)\r\n\r\n### Version\r\n\r\n_No response_\r\n\r\n### Steps to Reproduce\r\n\r\nJFYI\r\n\r\nhttps://github.com/getsentry/sentry/blob/2e36d4d3a3926e9efe06287e59d27be1c40300fa/tools/freeze_requirements.py#L13\r\n\r\nThere is `CUSTOM_COMPILE_COMMAND` env to [customize](https://github.com/jazzband/pip-tools#configuration) the header:\r\n\r\n\r\n```\r\n$ CUSTOM_COMPILE_COMMAND=\"make freeze-requirements\" pip-compile\r\n#\r\n# This file is autogenerated by pip-compile with python 3.11\r\n# To update, run:\r\n#\r\n# make freeze-requirements\r\n#\r\n....\r\n```\r\n\r\n### Expected Result\r\n\r\nN/A\r\n\r\n### Actual Result\r\n\r\nN/A\n", "before_files": [{"content": "from __future__ import annotations\n\nimport argparse\nfrom concurrent.futures import Future, ThreadPoolExecutor\nfrom os.path import abspath\nfrom subprocess import CalledProcessError, run\nfrom typing import Sequence\n\nfrom tools.lib import gitroot\n\n\ndef worker(args: tuple[str, ...]) -> None:\n # pip-compile doesn't let you customize the header, so we write\n # one ourselves. However, pip-compile needs -o DEST otherwise\n # it will bump >= pins even if they're satisfied. So, we need to\n # unfortunately rewrite the whole file.\n dest = args[-1]\n try:\n run(args, check=True, capture_output=True)\n except CalledProcessError as e:\n raise e\n\n with open(dest, \"rb+\") as f:\n content = f.read()\n f.seek(0, 0)\n f.write(\n b\"\"\"# DO NOT MODIFY. 
This file was generated with `make freeze-requirements`.\n\n\"\"\"\n + content\n )\n\n\ndef check_futures(futures: list[Future[None]]) -> int:\n rc = 0\n for future in futures:\n try:\n future.result()\n except CalledProcessError as e:\n rc = 1\n print(\n f\"\"\"`{e.cmd}` returned code {e.returncode}\n\nstdout:\n{e.stdout.decode()}\n\nstderr:\n{e.stderr.decode()}\n\"\"\"\n )\n return rc\n\n\ndef main(argv: Sequence[str] | None = None) -> int:\n parser = argparse.ArgumentParser()\n parser.parse_args(argv)\n\n base_path = abspath(gitroot())\n\n base_cmd = (\n \"pip-compile\",\n \"--allow-unsafe\",\n \"--no-annotate\",\n \"--no-header\",\n \"--quiet\",\n \"--strip-extras\",\n \"--index-url=https://pypi.devinfra.sentry.io/simple\",\n )\n\n executor = ThreadPoolExecutor(max_workers=2)\n futures = [\n executor.submit(\n worker,\n (\n *base_cmd,\n f\"{base_path}/requirements-base.txt\",\n f\"{base_path}/requirements-getsentry.txt\",\n \"-o\",\n f\"{base_path}/requirements-frozen.txt\",\n ),\n ),\n executor.submit(\n worker,\n (\n *base_cmd,\n f\"{base_path}/requirements-base.txt\",\n f\"{base_path}/requirements-getsentry.txt\",\n f\"{base_path}/requirements-dev.txt\",\n \"-o\",\n f\"{base_path}/requirements-dev-frozen.txt\",\n ),\n ),\n ]\n\n rc = check_futures(futures)\n executor.shutdown()\n return rc\n\n\nif __name__ == \"__main__\":\n raise SystemExit(main())\n", "path": "tools/freeze_requirements.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport argparse\nimport os\nfrom concurrent.futures import Future, ThreadPoolExecutor\nfrom os.path import abspath\nfrom subprocess import CalledProcessError, run\nfrom typing import Sequence\n\nfrom tools.lib import gitroot\n\n\ndef worker(args: tuple[str, ...]) -> None:\n env = os.environ.copy()\n env[\"CUSTOM_COMPILE_COMMAND\"] = \"make freeze-requirements\"\n\n run(args, check=True, capture_output=True, env=env)\n\n\ndef check_futures(futures: list[Future[None]]) -> int:\n rc = 0\n for future in futures:\n try:\n future.result()\n except CalledProcessError as e:\n rc = 1\n print(\n f\"\"\"`{e.cmd}` returned code {e.returncode}\n\nstdout:\n{e.stdout.decode()}\n\nstderr:\n{e.stderr.decode()}\n\"\"\"\n )\n return rc\n\n\ndef main(argv: Sequence[str] | None = None) -> int:\n parser = argparse.ArgumentParser()\n parser.parse_args(argv)\n\n base_path = abspath(gitroot())\n\n base_cmd = (\n \"pip-compile\",\n \"--allow-unsafe\",\n \"--no-annotate\",\n \"--quiet\",\n \"--strip-extras\",\n \"--index-url=https://pypi.devinfra.sentry.io/simple\",\n )\n\n executor = ThreadPoolExecutor(max_workers=2)\n futures = [\n executor.submit(\n worker,\n (\n *base_cmd,\n f\"{base_path}/requirements-base.txt\",\n f\"{base_path}/requirements-getsentry.txt\",\n \"-o\",\n f\"{base_path}/requirements-frozen.txt\",\n ),\n ),\n executor.submit(\n worker,\n (\n *base_cmd,\n f\"{base_path}/requirements-base.txt\",\n f\"{base_path}/requirements-getsentry.txt\",\n f\"{base_path}/requirements-dev.txt\",\n \"-o\",\n f\"{base_path}/requirements-dev-frozen.txt\",\n ),\n ),\n ]\n\n rc = check_futures(futures)\n executor.shutdown()\n return rc\n\n\nif __name__ == \"__main__\":\n raise SystemExit(main())\n", "path": "tools/freeze_requirements.py"}]} | 1,223 | 395 |
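One clarifying note on the fix above: rather than rewriting the generated file to inject a header, the patch relies on pip-tools' documented `CUSTOM_COMPILE_COMMAND` environment variable, which pip-compile echoes into the header it writes itself. The sketch below shows only the general subprocess-with-augmented-environment pattern; the function name is made up for illustration and the snippet is not the project's module.

```python
import os
import subprocess


def run_pip_compile(args):
    """Run a pip-compile command with the header override that pip-tools reads."""
    env = os.environ.copy()
    env["CUSTOM_COMPILE_COMMAND"] = "make freeze-requirements"
    # check=True surfaces failures as CalledProcessError; output is captured.
    subprocess.run(args, check=True, capture_output=True, env=env)
```

Because the header is now emitted by pip-compile itself, the `--no-header` flag is no longer wanted, which is why the diff also drops it from the base command.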
gh_patches_debug_9494 | rasdani/github-patches | git_diff | quantopian__zipline-382 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make ta-lib optional
`ta-lib` is causing us many installation problems. We should thus make it optional.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 #
3 # Copyright 2013 Quantopian, Inc.
4 #
5 # Licensed under the Apache License, Version 2.0 (the "License");
6 # you may not use this file except in compliance with the License.
7 # You may obtain a copy of the License at
8 #
9 # http://www.apache.org/licenses/LICENSE-2.0
10 #
11 # Unless required by applicable law or agreed to in writing, software
12 # distributed under the License is distributed on an "AS IS" BASIS,
13 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
14 # See the License for the specific language governing permissions and
15 # limitations under the License.
16 import sys
17
18 from setuptools import setup, find_packages
19
20 LONG_DESCRIPTION = None
21 README_MARKDOWN = None
22
23 with open('README.md') as markdown_source:
24 README_MARKDOWN = markdown_source.read()
25
26 if 'upload' in sys.argv:
27 # Converts the README.md file to ReST, since PyPI uses ReST for formatting,
28 # This allows to have one canonical README file, being the README.md
29 # The conversion only needs to be done on upload.
30 # Otherwise, the pandoc import and errors that are thrown when
31 # pandoc are both overhead and a source of confusion for general
32 # usage/installation.
33 import pandoc
34 pandoc.core.PANDOC_PATH = 'pandoc'
35 doc = pandoc.Document()
36 doc.markdown = README_MARKDOWN
37 LONG_DESCRIPTION = doc.rst
38 else:
39 # If pandoc isn't installed, e.g. when downloading from pip,
40 # just use the regular README.
41 LONG_DESCRIPTION = README_MARKDOWN
42
43 setup(
44 name='zipline',
45 version='0.7.0',
46 description='A backtester for financial algorithms.',
47 author='Quantopian Inc.',
48 author_email='[email protected]',
49 packages=find_packages(),
50 scripts=['scripts/run_algo.py'],
51 long_description=LONG_DESCRIPTION,
52 license='Apache 2.0',
53 classifiers=[
54 'Development Status :: 4 - Beta',
55 'License :: OSI Approved :: Apache Software License',
56 'Natural Language :: English',
57 'Programming Language :: Python',
58 'Programming Language :: Python :: 2.7',
59 'Programming Language :: Python :: 3.3',
60 'Operating System :: OS Independent',
61 'Intended Audience :: Science/Research',
62 'Topic :: Office/Business :: Financial',
63 'Topic :: Scientific/Engineering :: Information Analysis',
64 'Topic :: System :: Distributed Computing',
65 ],
66 install_requires=[
67 'Logbook',
68 'pytz',
69 'requests',
70 'numpy',
71 'pandas',
72 'six'
73 ],
74 url="https://github.com/quantopian/zipline"
75 )
76
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -1,6 +1,6 @@
#!/usr/bin/env python
#
-# Copyright 2013 Quantopian, Inc.
+# Copyright 2014 Quantopian, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
@@ -71,5 +71,8 @@
'pandas',
'six'
],
+ extras_require = {
+ 'talib': ["talib"],
+ },
url="https://github.com/quantopian/zipline"
)
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -1,6 +1,6 @@\n #!/usr/bin/env python\n #\n-# Copyright 2013 Quantopian, Inc.\n+# Copyright 2014 Quantopian, Inc.\n #\n # Licensed under the Apache License, Version 2.0 (the \"License\");\n # you may not use this file except in compliance with the License.\n@@ -71,5 +71,8 @@\n 'pandas',\n 'six'\n ],\n+ extras_require = {\n+ 'talib': [\"talib\"],\n+ },\n url=\"https://github.com/quantopian/zipline\"\n )\n", "issue": "Make ta-lib optional\n`ta-lib` is causing us many installation problems. We should thus make it optional.\n\n", "before_files": [{"content": "#!/usr/bin/env python\n#\n# Copyright 2013 Quantopian, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport sys\n\nfrom setuptools import setup, find_packages\n\nLONG_DESCRIPTION = None\nREADME_MARKDOWN = None\n\nwith open('README.md') as markdown_source:\n README_MARKDOWN = markdown_source.read()\n\nif 'upload' in sys.argv:\n # Converts the README.md file to ReST, since PyPI uses ReST for formatting,\n # This allows to have one canonical README file, being the README.md\n # The conversion only needs to be done on upload.\n # Otherwise, the pandoc import and errors that are thrown when\n # pandoc are both overhead and a source of confusion for general\n # usage/installation.\n import pandoc\n pandoc.core.PANDOC_PATH = 'pandoc'\n doc = pandoc.Document()\n doc.markdown = README_MARKDOWN\n LONG_DESCRIPTION = doc.rst\nelse:\n # If pandoc isn't installed, e.g. 
when downloading from pip,\n # just use the regular README.\n LONG_DESCRIPTION = README_MARKDOWN\n\nsetup(\n name='zipline',\n version='0.7.0',\n description='A backtester for financial algorithms.',\n author='Quantopian Inc.',\n author_email='[email protected]',\n packages=find_packages(),\n scripts=['scripts/run_algo.py'],\n long_description=LONG_DESCRIPTION,\n license='Apache 2.0',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'License :: OSI Approved :: Apache Software License',\n 'Natural Language :: English',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3.3',\n 'Operating System :: OS Independent',\n 'Intended Audience :: Science/Research',\n 'Topic :: Office/Business :: Financial',\n 'Topic :: Scientific/Engineering :: Information Analysis',\n 'Topic :: System :: Distributed Computing',\n ],\n install_requires=[\n 'Logbook',\n 'pytz',\n 'requests',\n 'numpy',\n 'pandas',\n 'six'\n ],\n url=\"https://github.com/quantopian/zipline\"\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n#\n# Copyright 2014 Quantopian, Inc.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport sys\n\nfrom setuptools import setup, find_packages\n\nLONG_DESCRIPTION = None\nREADME_MARKDOWN = None\n\nwith open('README.md') as markdown_source:\n README_MARKDOWN = markdown_source.read()\n\nif 'upload' in sys.argv:\n # Converts the README.md file to ReST, since PyPI uses ReST for formatting,\n # This allows to have one canonical README file, being the README.md\n # The conversion only needs to be done on upload.\n # Otherwise, the pandoc import and errors that are thrown when\n # pandoc are both overhead and a source of confusion for general\n # usage/installation.\n import pandoc\n pandoc.core.PANDOC_PATH = 'pandoc'\n doc = pandoc.Document()\n doc.markdown = README_MARKDOWN\n LONG_DESCRIPTION = doc.rst\nelse:\n # If pandoc isn't installed, e.g. 
when downloading from pip,\n # just use the regular README.\n LONG_DESCRIPTION = README_MARKDOWN\n\nsetup(\n name='zipline',\n version='0.7.0',\n description='A backtester for financial algorithms.',\n author='Quantopian Inc.',\n author_email='[email protected]',\n packages=find_packages(),\n scripts=['scripts/run_algo.py'],\n long_description=LONG_DESCRIPTION,\n license='Apache 2.0',\n classifiers=[\n 'Development Status :: 4 - Beta',\n 'License :: OSI Approved :: Apache Software License',\n 'Natural Language :: English',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3.3',\n 'Operating System :: OS Independent',\n 'Intended Audience :: Science/Research',\n 'Topic :: Office/Business :: Financial',\n 'Topic :: Scientific/Engineering :: Information Analysis',\n 'Topic :: System :: Distributed Computing',\n ],\n install_requires=[\n 'Logbook',\n 'pytz',\n 'requests',\n 'numpy',\n 'pandas',\n 'six'\n ],\n extras_require = {\n 'talib': [\"talib\"],\n },\n url=\"https://github.com/quantopian/zipline\"\n)\n", "path": "setup.py"}]} | 1,011 | 153 |
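A brief note on the change above: moving `talib` into `extras_require` means a default install no longer attempts to build the TA-Lib bindings, while users who want them can opt in (typically with something like `pip install zipline[talib]`). A common companion pattern, sketched below under the assumption that the extra keeps the name `talib`, is to import the optional module lazily and raise a helpful error when it is absent; this is illustrative, not code from the repository.

```python
# Optional-dependency import guard for an extras_require entry named "talib".
try:
    import talib  # noqa: F401
    HAVE_TALIB = True
except ImportError:
    HAVE_TALIB = False


def require_talib():
    """Raise a clear error if the optional TA-Lib extra is not installed."""
    if not HAVE_TALIB:
        raise ImportError(
            "ta-lib is not installed; install the optional extra, "
            "e.g. `pip install zipline[talib]`"
        )
```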
gh_patches_debug_50307 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-3345 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Spider konagrill is broken
During the global build at 2021-06-16-14-42-20, spider **konagrill** failed with **0 features** and **24 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-06-16-14-42-20/logs/konagrill.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-06-16-14-42-20/output/konagrill.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-06-16-14-42-20/output/konagrill.geojson))
Spider konagrill is broken
During the global build at 2021-10-13-14-42-23, spider **konagrill** failed with **0 features** and **24 errors**.
Here's [the log](https://data.alltheplaces.xyz/runs/2021-10-13-14-42-23/logs/konagrill.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-10-13-14-42-23/output/konagrill.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-10-13-14-42-23/output/konagrill.geojson))
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/kona_grill.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import json
3 import scrapy
4
5 from locations.items import GeojsonPointItem
6 from locations.hours import OpeningHours
7
8 STATES = [
9 "AL", "AK", "AZ", "AR", "CA", "CO", "CT", "DC", "DE", "FL",
10 "GA", "HI", "ID", "IL", "IN", "IA", "KS", "KY", "LA", "ME",
11 "MD", "MA", "MI", "MN", "MS", "MO", "MT", "NE", "NV", "NH",
12 "NJ", "NM", "NY", "NC", "ND", "OH", "OK", "OR", "PA", "RI",
13 "SC", "SD", "TN", "TX", "UT", "VT", "VA", "WA", "WV", "WI",
14 "WY",
15 ]
16
17 WEEKDAYS = ["Mo", "Tu", "We", "Th", "Fr", "Sa", "Su"]
18
19
20 class KonaGrillSpider(scrapy.Spider):
21 download_delay = 0.2
22 name = "konagrill"
23 item_attributes = {"brand": "Kona Grill", "brand_wikidata": "Q6428706"}
24 allowed_domains = ["konagrill.com"]
25
26 def start_requests(self):
27 url_by_state = "https://www.konagrill.com/ajax/getlocationsbystate"
28 headers = {"content-type": "application/x-www-form-urlencoded"}
29
30 # Get store id per state
31 for state in STATES:
32 yield scrapy.http.Request(
33 url_by_state,
34 method="POST",
35 body="state={}".format(state),
36 callback=self.parse,
37 headers=headers,
38 )
39
40 def parse(self, response):
41 store_data = json.loads(response.text)
42 url_location_details = "https://www.konagrill.com/ajax/getlocationdetails"
43 headers = {"content-type": "application/x-www-form-urlencoded"}
44 store_ids = []
45
46 if not store_data.get("data"):
47 return
48
49 store_ids += [s.get("id") for _, s in store_data.get("data").items()]
50
51 # Get store details
52 for i in store_ids:
53 yield scrapy.http.Request(
54 url_location_details,
55 method="POST",
56 body="id={}".format(i),
57 callback=self.parse_store,
58 headers=headers,
59 )
60
61 def parse_store(self, response):
62 response_data = json.loads(response.text)
63 if not response_data.get("data"):
64 return
65
66 store = response_data.get("data")
67 dh = store.get("dininghours")
68 # Data is inconsistent some keys were found with a trailing space
69 opening_hours = self.parse_hours(dh.get("dining hours") or dh.get("dining hours "))
70 properties = {
71 "addr_full": store.get("address"),
72 "city": store.get("city"),
73 "extras": {"email": store.get("email"),},
74 "lat": store.get("latitude"),
75 "lon": store.get("longitude"),
76 "name": store.get("title"),
77 "opening_hours": opening_hours,
78 "phone": store.get("phone_number"),
79 "postcode": store.get("zip"),
80 "ref": store.get("id"),
81 "state": store.get("state"),
82 "website": store.get("order_online_url"),
83 }
84
85 yield GeojsonPointItem(**properties)
86
87 def parse_hours(self, hours):
88 oh = OpeningHours()
89
90 for t in hours:
91 days = self.parse_days(t.get("days"))
92 open_time, close_time = t.get("hours").split("-")
93 ot = open_time.strip()
94 ct = close_time.strip()
95 for day in days:
96 oh.add_range(day=day, open_time=ot, close_time=ct, time_format="%I%p")
97
98 return oh.as_opening_hours()
99
100 def parse_days(self, days):
101 """Parse day ranges and returns a list of days it represent
102 The following formats are considered:
103 - Single day, e.g. "Mon", "Monday"
104 - Range, e.g. "Mon-Fri", "Tue-Sund", "Sat-Sunday"
105 - Two days, e.g. "Sat & Sun", "Friday & Su"
106
107 Returns a list with the weekdays
108 """
109 parsed_days = []
110
111 # Range
112 # Produce a list of weekdays between two days e.g. su-sa, mo-th, etc.
113 if "-" in days:
114 d = days.split("-")
115 r = [i.strip()[:2] for i in d]
116 s = WEEKDAYS.index(r[0].title())
117 e = WEEKDAYS.index(r[1].title())
118 if s <= e:
119 return WEEKDAYS[s : e + 1]
120 else:
121 return WEEKDAYS[s:] + WEEKDAYS[: e + 1]
122 # Two days
123 if "&" in days:
124 d = days.split("&")
125 return [i.strip()[:2].title() for i in d]
126 # Single days
127 else:
128 return [days.strip()[:2].title()]
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/kona_grill.py b/locations/spiders/kona_grill.py
--- a/locations/spiders/kona_grill.py
+++ b/locations/spiders/kona_grill.py
@@ -88,6 +88,10 @@
oh = OpeningHours()
for t in hours:
+ # Some day entries contain invalid week data, e.g. "Brunch"
+ # "Brunch" is a special dining hour that is contained in regular hours, ignore it
+ if "Brunch" in t.get("days"):
+ continue
days = self.parse_days(t.get("days"))
open_time, close_time = t.get("hours").split("-")
ot = open_time.strip()
| {"golden_diff": "diff --git a/locations/spiders/kona_grill.py b/locations/spiders/kona_grill.py\n--- a/locations/spiders/kona_grill.py\n+++ b/locations/spiders/kona_grill.py\n@@ -88,6 +88,10 @@\n oh = OpeningHours()\n \n for t in hours:\n+ # Some day entries contain invalid week data, e.g. \"Brunch\"\n+ # \"Brunch\" is a special dining hour that is contained in regular hours, ignore it\n+ if \"Brunch\" in t.get(\"days\"):\n+ continue\n days = self.parse_days(t.get(\"days\"))\n open_time, close_time = t.get(\"hours\").split(\"-\")\n ot = open_time.strip()\n", "issue": "Spider konagrill is broken\nDuring the global build at 2021-06-16-14-42-20, spider **konagrill** failed with **0 features** and **24 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-06-16-14-42-20/logs/konagrill.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-06-16-14-42-20/output/konagrill.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-06-16-14-42-20/output/konagrill.geojson))\nSpider konagrill is broken\nDuring the global build at 2021-10-13-14-42-23, spider **konagrill** failed with **0 features** and **24 errors**.\n\nHere's [the log](https://data.alltheplaces.xyz/runs/2021-10-13-14-42-23/logs/konagrill.txt) and [the output](https://data.alltheplaces.xyz/runs/2021-10-13-14-42-23/output/konagrill.geojson) ([on a map](https://data.alltheplaces.xyz/map.html?show=https://data.alltheplaces.xyz/runs/2021-10-13-14-42-23/output/konagrill.geojson))\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport json\nimport scrapy\n\nfrom locations.items import GeojsonPointItem\nfrom locations.hours import OpeningHours\n\nSTATES = [\n \"AL\", \"AK\", \"AZ\", \"AR\", \"CA\", \"CO\", \"CT\", \"DC\", \"DE\", \"FL\",\n \"GA\", \"HI\", \"ID\", \"IL\", \"IN\", \"IA\", \"KS\", \"KY\", \"LA\", \"ME\",\n \"MD\", \"MA\", \"MI\", \"MN\", \"MS\", \"MO\", \"MT\", \"NE\", \"NV\", \"NH\",\n \"NJ\", \"NM\", \"NY\", \"NC\", \"ND\", \"OH\", \"OK\", \"OR\", \"PA\", \"RI\",\n \"SC\", \"SD\", \"TN\", \"TX\", \"UT\", \"VT\", \"VA\", \"WA\", \"WV\", \"WI\",\n \"WY\",\n]\n\nWEEKDAYS = [\"Mo\", \"Tu\", \"We\", \"Th\", \"Fr\", \"Sa\", \"Su\"]\n\n\nclass KonaGrillSpider(scrapy.Spider):\n download_delay = 0.2\n name = \"konagrill\"\n item_attributes = {\"brand\": \"Kona Grill\", \"brand_wikidata\": \"Q6428706\"}\n allowed_domains = [\"konagrill.com\"]\n\n def start_requests(self):\n url_by_state = \"https://www.konagrill.com/ajax/getlocationsbystate\"\n headers = {\"content-type\": \"application/x-www-form-urlencoded\"}\n\n # Get store id per state\n for state in STATES:\n yield scrapy.http.Request(\n url_by_state,\n method=\"POST\",\n body=\"state={}\".format(state),\n callback=self.parse,\n headers=headers,\n )\n\n def parse(self, response):\n store_data = json.loads(response.text)\n url_location_details = \"https://www.konagrill.com/ajax/getlocationdetails\"\n headers = {\"content-type\": \"application/x-www-form-urlencoded\"}\n store_ids = []\n\n if not store_data.get(\"data\"):\n return\n\n store_ids += [s.get(\"id\") for _, s in store_data.get(\"data\").items()]\n\n # Get store details\n for i in store_ids:\n yield scrapy.http.Request(\n url_location_details,\n method=\"POST\",\n body=\"id={}\".format(i),\n callback=self.parse_store,\n headers=headers,\n )\n\n def parse_store(self, response):\n response_data = json.loads(response.text)\n if not response_data.get(\"data\"):\n return\n\n store = response_data.get(\"data\")\n dh = store.get(\"dininghours\")\n 
# Data is inconsistent some keys were found with a trailing space\n opening_hours = self.parse_hours(dh.get(\"dining hours\") or dh.get(\"dining hours \"))\n properties = {\n \"addr_full\": store.get(\"address\"),\n \"city\": store.get(\"city\"),\n \"extras\": {\"email\": store.get(\"email\"),},\n \"lat\": store.get(\"latitude\"),\n \"lon\": store.get(\"longitude\"),\n \"name\": store.get(\"title\"),\n \"opening_hours\": opening_hours,\n \"phone\": store.get(\"phone_number\"),\n \"postcode\": store.get(\"zip\"),\n \"ref\": store.get(\"id\"),\n \"state\": store.get(\"state\"),\n \"website\": store.get(\"order_online_url\"),\n }\n\n yield GeojsonPointItem(**properties)\n\n def parse_hours(self, hours):\n oh = OpeningHours()\n\n for t in hours:\n days = self.parse_days(t.get(\"days\"))\n open_time, close_time = t.get(\"hours\").split(\"-\")\n ot = open_time.strip()\n ct = close_time.strip()\n for day in days:\n oh.add_range(day=day, open_time=ot, close_time=ct, time_format=\"%I%p\")\n\n return oh.as_opening_hours()\n\n def parse_days(self, days):\n \"\"\"Parse day ranges and returns a list of days it represent\n The following formats are considered:\n - Single day, e.g. \"Mon\", \"Monday\"\n - Range, e.g. \"Mon-Fri\", \"Tue-Sund\", \"Sat-Sunday\"\n - Two days, e.g. \"Sat & Sun\", \"Friday & Su\"\n\n Returns a list with the weekdays\n \"\"\"\n parsed_days = []\n\n # Range\n # Produce a list of weekdays between two days e.g. su-sa, mo-th, etc.\n if \"-\" in days:\n d = days.split(\"-\")\n r = [i.strip()[:2] for i in d]\n s = WEEKDAYS.index(r[0].title())\n e = WEEKDAYS.index(r[1].title())\n if s <= e:\n return WEEKDAYS[s : e + 1]\n else:\n return WEEKDAYS[s:] + WEEKDAYS[: e + 1]\n # Two days\n if \"&\" in days:\n d = days.split(\"&\")\n return [i.strip()[:2].title() for i in d]\n # Single days\n else:\n return [days.strip()[:2].title()]\n", "path": "locations/spiders/kona_grill.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport json\nimport scrapy\n\nfrom locations.items import GeojsonPointItem\nfrom locations.hours import OpeningHours\n\nSTATES = [\n \"AL\", \"AK\", \"AZ\", \"AR\", \"CA\", \"CO\", \"CT\", \"DC\", \"DE\", \"FL\",\n \"GA\", \"HI\", \"ID\", \"IL\", \"IN\", \"IA\", \"KS\", \"KY\", \"LA\", \"ME\",\n \"MD\", \"MA\", \"MI\", \"MN\", \"MS\", \"MO\", \"MT\", \"NE\", \"NV\", \"NH\",\n \"NJ\", \"NM\", \"NY\", \"NC\", \"ND\", \"OH\", \"OK\", \"OR\", \"PA\", \"RI\",\n \"SC\", \"SD\", \"TN\", \"TX\", \"UT\", \"VT\", \"VA\", \"WA\", \"WV\", \"WI\",\n \"WY\",\n]\n\nWEEKDAYS = [\"Mo\", \"Tu\", \"We\", \"Th\", \"Fr\", \"Sa\", \"Su\"]\n\n\nclass KonaGrillSpider(scrapy.Spider):\n download_delay = 0.2\n name = \"konagrill\"\n item_attributes = {\"brand\": \"Kona Grill\", \"brand_wikidata\": \"Q6428706\"}\n allowed_domains = [\"konagrill.com\"]\n\n def start_requests(self):\n url_by_state = \"https://www.konagrill.com/ajax/getlocationsbystate\"\n headers = {\"content-type\": \"application/x-www-form-urlencoded\"}\n\n # Get store id per state\n for state in STATES:\n yield scrapy.http.Request(\n url_by_state,\n method=\"POST\",\n body=\"state={}\".format(state),\n callback=self.parse,\n headers=headers,\n )\n\n def parse(self, response):\n store_data = json.loads(response.text)\n url_location_details = \"https://www.konagrill.com/ajax/getlocationdetails\"\n headers = {\"content-type\": \"application/x-www-form-urlencoded\"}\n store_ids = []\n\n if not store_data.get(\"data\"):\n return\n\n store_ids += [s.get(\"id\") for _, s in store_data.get(\"data\").items()]\n\n # Get store 
details\n for i in store_ids:\n yield scrapy.http.Request(\n url_location_details,\n method=\"POST\",\n body=\"id={}\".format(i),\n callback=self.parse_store,\n headers=headers,\n )\n\n def parse_store(self, response):\n response_data = json.loads(response.text)\n if not response_data.get(\"data\"):\n return\n\n store = response_data.get(\"data\")\n dh = store.get(\"dininghours\")\n # Data is inconsistent some keys were found with a trailing space\n opening_hours = self.parse_hours(dh.get(\"dining hours\") or dh.get(\"dining hours \"))\n properties = {\n \"addr_full\": store.get(\"address\"),\n \"city\": store.get(\"city\"),\n \"extras\": {\"email\": store.get(\"email\"),},\n \"lat\": store.get(\"latitude\"),\n \"lon\": store.get(\"longitude\"),\n \"name\": store.get(\"title\"),\n \"opening_hours\": opening_hours,\n \"phone\": store.get(\"phone_number\"),\n \"postcode\": store.get(\"zip\"),\n \"ref\": store.get(\"id\"),\n \"state\": store.get(\"state\"),\n \"website\": store.get(\"order_online_url\"),\n }\n\n yield GeojsonPointItem(**properties)\n\n def parse_hours(self, hours):\n oh = OpeningHours()\n\n for t in hours:\n # Some day entries contain invalid week data, e.g. \"Brunch\"\n # \"Brunch\" is a special dining hour that is contained in regular hours, ignore it\n if \"Brunch\" in t.get(\"days\"):\n continue\n days = self.parse_days(t.get(\"days\"))\n open_time, close_time = t.get(\"hours\").split(\"-\")\n ot = open_time.strip()\n ct = close_time.strip()\n for day in days:\n oh.add_range(day=day, open_time=ot, close_time=ct, time_format=\"%I%p\")\n\n return oh.as_opening_hours()\n\n def parse_days(self, days):\n \"\"\"Parse day ranges and returns a list of days it represent\n The following formats are considered:\n - Single day, e.g. \"Mon\", \"Monday\"\n - Range, e.g. \"Mon-Fri\", \"Tue-Sund\", \"Sat-Sunday\"\n - Two days, e.g. \"Sat & Sun\", \"Friday & Su\"\n\n Returns a list with the weekdays\n \"\"\"\n parsed_days = []\n\n # Range\n # Produce a list of weekdays between two days e.g. su-sa, mo-th, etc.\n if \"-\" in days:\n d = days.split(\"-\")\n r = [i.strip()[:2] for i in d]\n s = WEEKDAYS.index(r[0].title())\n e = WEEKDAYS.index(r[1].title())\n if s <= e:\n return WEEKDAYS[s : e + 1]\n else:\n return WEEKDAYS[s:] + WEEKDAYS[: e + 1]\n # Two days\n if \"&\" in days:\n d = days.split(\"&\")\n return [i.strip()[:2].title() for i in d]\n # Single days\n else:\n return [days.strip()[:2].title()]\n", "path": "locations/spiders/kona_grill.py"}]} | 2,036 | 164 |
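A short note on the spider fix above: the crash came from dining-hour entries whose `days` field is not a weekday range at all (the site labels one entry "Brunch"), so `parse_days` had nothing sensible to index, and the patch simply skips those rows. The sketch below restates the day-parsing rules from the spider's docstring together with that guard so the behaviour can be checked in isolation; it is an illustrative standalone script, not the project's module.

```python
WEEKDAYS = ["Mo", "Tu", "We", "Th", "Fr", "Sa", "Su"]


def parse_days(days):
    # Range such as "Mon-Fri" or "Sat-Sunday"
    if "-" in days:
        start, end = [part.strip()[:2].title() for part in days.split("-")]
        s, e = WEEKDAYS.index(start), WEEKDAYS.index(end)
        return WEEKDAYS[s:e + 1] if s <= e else WEEKDAYS[s:] + WEEKDAYS[:e + 1]
    # Two days such as "Sat & Sun"
    if "&" in days:
        return [part.strip()[:2].title() for part in days.split("&")]
    # Single day such as "Mon"
    return [days.strip()[:2].title()]


for label in ("Mon-Fri", "Sat & Sun", "Brunch"):
    if "Brunch" in label:  # the guard added by the patch
        continue
    print(label, parse_days(label))
```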
gh_patches_debug_3571 | rasdani/github-patches | git_diff | nerfstudio-project__nerfstudio-2076 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Doc Description Wrong
Hello,
I find that the Python docstring of [get_depth_image_from_path](https://github.com/nerfstudio-project/nerfstudio/blob/main/nerfstudio/data/utils/data_utils.py) is wrong about the returned tensor shape:
it should be [height, width, 1], not [width, height, 1].

a simple verification:

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nerfstudio/data/utils/data_utils.py`
Content:
```
1 # Copyright 2022 the Regents of the University of California, Nerfstudio Team and contributors. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Utility functions to allow easy re-use of common operations across dataloaders"""
16 from pathlib import Path
17 from typing import List, Tuple, Union
18
19 import cv2
20 import numpy as np
21 import torch
22 from PIL import Image
23
24
25 def get_image_mask_tensor_from_path(filepath: Path, scale_factor: float = 1.0) -> torch.Tensor:
26 """
27 Utility function to read a mask image from the given path and return a boolean tensor
28 """
29 pil_mask = Image.open(filepath)
30 if scale_factor != 1.0:
31 width, height = pil_mask.size
32 newsize = (int(width * scale_factor), int(height * scale_factor))
33 pil_mask = pil_mask.resize(newsize, resample=Image.NEAREST)
34 mask_tensor = torch.from_numpy(np.array(pil_mask)).unsqueeze(-1).bool()
35 if len(mask_tensor.shape) != 3:
36 raise ValueError("The mask image should have 1 channel")
37 return mask_tensor
38
39
40 def get_semantics_and_mask_tensors_from_path(
41 filepath: Path, mask_indices: Union[List, torch.Tensor], scale_factor: float = 1.0
42 ) -> Tuple[torch.Tensor, torch.Tensor]:
43 """
44 Utility function to read segmentation from the given filepath
45 If no mask is required - use mask_indices = []
46 """
47 if isinstance(mask_indices, List):
48 mask_indices = torch.tensor(mask_indices, dtype=torch.int64).view(1, 1, -1)
49 pil_image = Image.open(filepath)
50 if scale_factor != 1.0:
51 width, height = pil_image.size
52 newsize = (int(width * scale_factor), int(height * scale_factor))
53 pil_image = pil_image.resize(newsize, resample=Image.NEAREST)
54 semantics = torch.from_numpy(np.array(pil_image, dtype="int64"))[..., None]
55 mask = torch.sum(semantics == mask_indices, dim=-1, keepdim=True) == 0
56 return semantics, mask
57
58
59 def get_depth_image_from_path(
60 filepath: Path,
61 height: int,
62 width: int,
63 scale_factor: float,
64 interpolation: int = cv2.INTER_NEAREST,
65 ) -> torch.Tensor:
66 """Loads, rescales and resizes depth images.
67 Filepath points to a 16-bit or 32-bit depth image, or a numpy array `*.npy`.
68
69 Args:
70 filepath: Path to depth image.
71 height: Target depth image height.
72 width: Target depth image width.
73 scale_factor: Factor by which to scale depth image.
74 interpolation: Depth value interpolation for resizing.
75
76 Returns:
77 Depth image torch tensor with shape [width, height, 1].
78 """
79 if filepath.suffix == ".npy":
80 image = np.load(filepath) * scale_factor
81 image = cv2.resize(image, (width, height), interpolation=interpolation)
82 else:
83 image = cv2.imread(str(filepath.absolute()), cv2.IMREAD_ANYDEPTH)
84 image = image.astype(np.float64) * scale_factor
85 image = cv2.resize(image, (width, height), interpolation=interpolation)
86 return torch.from_numpy(image[:, :, np.newaxis])
87
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nerfstudio/data/utils/data_utils.py b/nerfstudio/data/utils/data_utils.py
--- a/nerfstudio/data/utils/data_utils.py
+++ b/nerfstudio/data/utils/data_utils.py
@@ -74,7 +74,7 @@
interpolation: Depth value interpolation for resizing.
Returns:
- Depth image torch tensor with shape [width, height, 1].
+ Depth image torch tensor with shape [height, width, 1].
"""
if filepath.suffix == ".npy":
image = np.load(filepath) * scale_factor
| {"golden_diff": "diff --git a/nerfstudio/data/utils/data_utils.py b/nerfstudio/data/utils/data_utils.py\n--- a/nerfstudio/data/utils/data_utils.py\n+++ b/nerfstudio/data/utils/data_utils.py\n@@ -74,7 +74,7 @@\n interpolation: Depth value interpolation for resizing.\n \n Returns:\n- Depth image torch tensor with shape [width, height, 1].\n+ Depth image torch tensor with shape [height, width, 1].\n \"\"\"\n if filepath.suffix == \".npy\":\n image = np.load(filepath) * scale_factor\n", "issue": "Doc Description Wrong\nHello, \r\n\r\nI find python doc in [get_depth_image_from_path](https://github.com/nerfstudio-project/nerfstudio/blob/main/nerfstudio/data/utils/data_utils.py) is wrong about the return tensor shape, \r\n\r\nit should be [height, width, 1] not [width, height, 1].\r\n\r\n\r\n\r\na simple verification:\r\n\r\n\r\n\r\n\n", "before_files": [{"content": "# Copyright 2022 the Regents of the University of California, Nerfstudio Team and contributors. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Utility functions to allow easy re-use of common operations across dataloaders\"\"\"\nfrom pathlib import Path\nfrom typing import List, Tuple, Union\n\nimport cv2\nimport numpy as np\nimport torch\nfrom PIL import Image\n\n\ndef get_image_mask_tensor_from_path(filepath: Path, scale_factor: float = 1.0) -> torch.Tensor:\n \"\"\"\n Utility function to read a mask image from the given path and return a boolean tensor\n \"\"\"\n pil_mask = Image.open(filepath)\n if scale_factor != 1.0:\n width, height = pil_mask.size\n newsize = (int(width * scale_factor), int(height * scale_factor))\n pil_mask = pil_mask.resize(newsize, resample=Image.NEAREST)\n mask_tensor = torch.from_numpy(np.array(pil_mask)).unsqueeze(-1).bool()\n if len(mask_tensor.shape) != 3:\n raise ValueError(\"The mask image should have 1 channel\")\n return mask_tensor\n\n\ndef get_semantics_and_mask_tensors_from_path(\n filepath: Path, mask_indices: Union[List, torch.Tensor], scale_factor: float = 1.0\n) -> Tuple[torch.Tensor, torch.Tensor]:\n \"\"\"\n Utility function to read segmentation from the given filepath\n If no mask is required - use mask_indices = []\n \"\"\"\n if isinstance(mask_indices, List):\n mask_indices = torch.tensor(mask_indices, dtype=torch.int64).view(1, 1, -1)\n pil_image = Image.open(filepath)\n if scale_factor != 1.0:\n width, height = pil_image.size\n newsize = (int(width * scale_factor), int(height * scale_factor))\n pil_image = pil_image.resize(newsize, resample=Image.NEAREST)\n semantics = torch.from_numpy(np.array(pil_image, dtype=\"int64\"))[..., None]\n mask = torch.sum(semantics == mask_indices, dim=-1, keepdim=True) == 0\n return semantics, mask\n\n\ndef get_depth_image_from_path(\n filepath: Path,\n height: int,\n width: int,\n scale_factor: float,\n interpolation: int = cv2.INTER_NEAREST,\n) -> torch.Tensor:\n \"\"\"Loads, rescales and resizes depth images.\n Filepath points to a 16-bit or 32-bit depth image, or a numpy array `*.npy`.\n\n Args:\n filepath: Path to 
depth image.\n height: Target depth image height.\n width: Target depth image width.\n scale_factor: Factor by which to scale depth image.\n interpolation: Depth value interpolation for resizing.\n\n Returns:\n Depth image torch tensor with shape [width, height, 1].\n \"\"\"\n if filepath.suffix == \".npy\":\n image = np.load(filepath) * scale_factor\n image = cv2.resize(image, (width, height), interpolation=interpolation)\n else:\n image = cv2.imread(str(filepath.absolute()), cv2.IMREAD_ANYDEPTH)\n image = image.astype(np.float64) * scale_factor\n image = cv2.resize(image, (width, height), interpolation=interpolation)\n return torch.from_numpy(image[:, :, np.newaxis])\n", "path": "nerfstudio/data/utils/data_utils.py"}], "after_files": [{"content": "# Copyright 2022 the Regents of the University of California, Nerfstudio Team and contributors. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Utility functions to allow easy re-use of common operations across dataloaders\"\"\"\nfrom pathlib import Path\nfrom typing import List, Tuple, Union\n\nimport cv2\nimport numpy as np\nimport torch\nfrom PIL import Image\n\n\ndef get_image_mask_tensor_from_path(filepath: Path, scale_factor: float = 1.0) -> torch.Tensor:\n \"\"\"\n Utility function to read a mask image from the given path and return a boolean tensor\n \"\"\"\n pil_mask = Image.open(filepath)\n if scale_factor != 1.0:\n width, height = pil_mask.size\n newsize = (int(width * scale_factor), int(height * scale_factor))\n pil_mask = pil_mask.resize(newsize, resample=Image.NEAREST)\n mask_tensor = torch.from_numpy(np.array(pil_mask)).unsqueeze(-1).bool()\n if len(mask_tensor.shape) != 3:\n raise ValueError(\"The mask image should have 1 channel\")\n return mask_tensor\n\n\ndef get_semantics_and_mask_tensors_from_path(\n filepath: Path, mask_indices: Union[List, torch.Tensor], scale_factor: float = 1.0\n) -> Tuple[torch.Tensor, torch.Tensor]:\n \"\"\"\n Utility function to read segmentation from the given filepath\n If no mask is required - use mask_indices = []\n \"\"\"\n if isinstance(mask_indices, List):\n mask_indices = torch.tensor(mask_indices, dtype=torch.int64).view(1, 1, -1)\n pil_image = Image.open(filepath)\n if scale_factor != 1.0:\n width, height = pil_image.size\n newsize = (int(width * scale_factor), int(height * scale_factor))\n pil_image = pil_image.resize(newsize, resample=Image.NEAREST)\n semantics = torch.from_numpy(np.array(pil_image, dtype=\"int64\"))[..., None]\n mask = torch.sum(semantics == mask_indices, dim=-1, keepdim=True) == 0\n return semantics, mask\n\n\ndef get_depth_image_from_path(\n filepath: Path,\n height: int,\n width: int,\n scale_factor: float,\n interpolation: int = cv2.INTER_NEAREST,\n) -> torch.Tensor:\n \"\"\"Loads, rescales and resizes depth images.\n Filepath points to a 16-bit or 32-bit depth image, or a numpy array `*.npy`.\n\n Args:\n filepath: Path to depth image.\n height: Target depth image height.\n width: Target depth image width.\n scale_factor: Factor by which 
to scale depth image.\n interpolation: Depth value interpolation for resizing.\n\n Returns:\n Depth image torch tensor with shape [height, width, 1].\n \"\"\"\n if filepath.suffix == \".npy\":\n image = np.load(filepath) * scale_factor\n image = cv2.resize(image, (width, height), interpolation=interpolation)\n else:\n image = cv2.imread(str(filepath.absolute()), cv2.IMREAD_ANYDEPTH)\n image = image.astype(np.float64) * scale_factor\n image = cv2.resize(image, (width, height), interpolation=interpolation)\n return torch.from_numpy(image[:, :, np.newaxis])\n", "path": "nerfstudio/data/utils/data_utils.py"}]} | 1,452 | 128 |
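To make the corrected docstring concrete: `cv2.resize` takes its `dsize` argument as `(width, height)`, but the resulting NumPy array is indexed `(height, width)`, so after appending a channel axis the loader returns a `[height, width, 1]` tensor. Below is a minimal verification along the lines of the screenshots referenced in the issue (which are not reproduced here); it assumes `numpy` and `opencv-python` are installed, and the sizes are arbitrary examples.

```python
import cv2
import numpy as np

height, width = 480, 640
depth = np.zeros((100, 200), dtype=np.float32)   # arbitrary source depth map
resized = cv2.resize(depth, (width, height))     # dsize is (width, height)
tensor = resized[:, :, np.newaxis]
print(tensor.shape)                              # (480, 640, 1) == [height, width, 1]
```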
gh_patches_debug_9412 | rasdani/github-patches | git_diff | mitmproxy__mitmproxy-3099 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
allow_remote=false does not prevent remote access
##### Steps to reproduce the problem:
1. Run mitmproxy on a publicly routable host, with default configuration including `listen_host=""`, `listen_port=8080`, and `allow_remote=false`.
2. From a host on a different network, send a request through that instance of mitmproxy, e.g. with `curl --proxy http://your-host.example:8080`
The default `allow_remote=false` should prevent this request from succeeding. However, it is served by mitmproxy just fine.
##### Any other comments? What have you tried so far?
I have a laptop sitting in the “DMZ” of a home router, which is globally IPv4 routable. I also have a VPS which is globally IPv4 routable. Both the laptop and the VPS are running Ubuntu 16.04 “Xenial Xerus”. I can reproduce the problem with mitmproxy running on the VPS and curl on the laptop, as well as vice-versa.
Both tcpdump and mitmproxy’s own Details pane show the request as originating from a remote network.
I only noticed this because I saw strange flows in a mitmproxy instance that I spun up on the laptop.
##### System information
Mitmproxy: 3.0.3 binary
Python: 3.5.2
OpenSSL: OpenSSL 1.1.0g 2 Nov 2017
Platform: Linux-4.4.0-116-generic-x86_64-with-debian-stretch-sid
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mitmproxy/addons/allowremote.py`
Content:
```
1 import ipaddress
2 from mitmproxy import ctx
3
4
5 class AllowRemote:
6 def load(self, loader):
7 loader.add_option(
8 "allow_remote", bool, False,
9 """
10 Allow remote clients to connect to proxy. If set to false,
11 client will not be able to connect to proxy unless it is on the same network
12 or the proxyauth option is set
13 """
14 )
15
16 def clientconnect(self, layer):
17 address = layer.client_conn.address
18
19 accept_connection = (
20 ctx.options.allow_remote or
21 ipaddress.ip_address(address[0]).is_private or
22 ctx.options.proxyauth is not None
23 )
24
25 if not accept_connection:
26 layer.reply.kill()
27 ctx.log.warn("Client connection was killed because allow_remote option is set to false, "
28 "client IP was not a private IP and proxyauth was not set.\n"
29 "To allow remote connections set allow_remote option to true or set proxyauth option.")
30
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mitmproxy/addons/allowremote.py b/mitmproxy/addons/allowremote.py
--- a/mitmproxy/addons/allowremote.py
+++ b/mitmproxy/addons/allowremote.py
@@ -14,11 +14,13 @@
)
def clientconnect(self, layer):
- address = layer.client_conn.address
+ address = ipaddress.ip_address(layer.client_conn.address[0])
+ if isinstance(address, ipaddress.IPv6Address):
+ address = address.ipv4_mapped or address
accept_connection = (
ctx.options.allow_remote or
- ipaddress.ip_address(address[0]).is_private or
+ ipaddress.ip_address(address).is_private or
ctx.options.proxyauth is not None
)
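Note on the fix above: the patch unwraps IPv4-mapped IPv6 addresses before the privacy check, which suggests the dual-stack listener was reporting remote IPv4 clients as `::ffff:a.b.c.d` and the unmapped form was passing `is_private`. A minimal standalone sketch of the corrected check, using only the standard-library `ipaddress` module (the host strings below are illustrative examples, not taken from the issue), might look like this:

```python
import ipaddress

def client_is_local(host: str) -> bool:
    """Mirror of the patched check: unwrap IPv4-mapped IPv6 before testing is_private."""
    addr = ipaddress.ip_address(host)
    if isinstance(addr, ipaddress.IPv6Address):
        # A dual-stack listener reports IPv4 clients as ::ffff:a.b.c.d; compare the
        # underlying IPv4 address, otherwise a public client can look "private".
        addr = addr.ipv4_mapped or addr
    return addr.is_private

if __name__ == "__main__":
    for host in ("127.0.0.1", "192.168.1.20", "::ffff:192.168.1.20", "::ffff:203.0.113.7"):
        print(f"{host:>22} local={client_is_local(host)}")
```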
| {"golden_diff": "diff --git a/mitmproxy/addons/allowremote.py b/mitmproxy/addons/allowremote.py\n--- a/mitmproxy/addons/allowremote.py\n+++ b/mitmproxy/addons/allowremote.py\n@@ -14,11 +14,13 @@\n )\n \n def clientconnect(self, layer):\n- address = layer.client_conn.address\n+ address = ipaddress.ip_address(layer.client_conn.address[0])\n+ if isinstance(address, ipaddress.IPv6Address):\n+ address = address.ipv4_mapped or address\n \n accept_connection = (\n ctx.options.allow_remote or\n- ipaddress.ip_address(address[0]).is_private or\n+ ipaddress.ip_address(address).is_private or\n ctx.options.proxyauth is not None\n )\n", "issue": "allow_remote=false does not prevent remote access\n##### Steps to reproduce the problem:\r\n\r\n1. Run mitmproxy on a publicly routable host, with default configuration including `listen_host=\"\"`, `listen_port=8080`, and `allow_remote=false`.\r\n2. From a host on a different network, send a request through that instance of mitmproxy, e.g. with `curl --proxy http://your-host.example:8080`\r\n\r\nThe default `allow_remote=false` should prevent this request from succeeding. However, it is served by mitmproxy just fine.\r\n\r\n##### Any other comments? What have you tried so far?\r\n\r\nI have a laptop sitting in the \u201cDMZ\u201d of a home router, which is globally IPv4 routable. I also have a VPS which is globally IPv4 routable. Both the laptop and the VPS are running Ubuntu 16.04 \u201cXenial Xerus\u201d. I can reproduce the problem with mitmproxy running on the VPS and curl on the laptop, as well as vice-versa.\r\n\r\nBoth tcpdump and mitmproxy\u2019s own Details pane show the request as originating from a remote network.\r\n\r\nI only noticed this because I saw strange flows in a mitmproxy instance that I spun up on the laptop.\r\n\r\n##### System information\r\n\r\nMitmproxy: 3.0.3 binary\r\nPython: 3.5.2\r\nOpenSSL: OpenSSL 1.1.0g 2 Nov 2017\r\nPlatform: Linux-4.4.0-116-generic-x86_64-with-debian-stretch-sid\r\n\n", "before_files": [{"content": "import ipaddress\nfrom mitmproxy import ctx\n\n\nclass AllowRemote:\n def load(self, loader):\n loader.add_option(\n \"allow_remote\", bool, False,\n \"\"\"\n Allow remote clients to connect to proxy. If set to false,\n client will not be able to connect to proxy unless it is on the same network\n or the proxyauth option is set\n \"\"\"\n )\n\n def clientconnect(self, layer):\n address = layer.client_conn.address\n\n accept_connection = (\n ctx.options.allow_remote or\n ipaddress.ip_address(address[0]).is_private or\n ctx.options.proxyauth is not None\n )\n\n if not accept_connection:\n layer.reply.kill()\n ctx.log.warn(\"Client connection was killed because allow_remote option is set to false, \"\n \"client IP was not a private IP and proxyauth was not set.\\n\"\n \"To allow remote connections set allow_remote option to true or set proxyauth option.\")\n", "path": "mitmproxy/addons/allowremote.py"}], "after_files": [{"content": "import ipaddress\nfrom mitmproxy import ctx\n\n\nclass AllowRemote:\n def load(self, loader):\n loader.add_option(\n \"allow_remote\", bool, False,\n \"\"\"\n Allow remote clients to connect to proxy. 
If set to false,\n client will not be able to connect to proxy unless it is on the same network\n or the proxyauth option is set\n \"\"\"\n )\n\n def clientconnect(self, layer):\n address = ipaddress.ip_address(layer.client_conn.address[0])\n if isinstance(address, ipaddress.IPv6Address):\n address = address.ipv4_mapped or address\n\n accept_connection = (\n ctx.options.allow_remote or\n ipaddress.ip_address(address).is_private or\n ctx.options.proxyauth is not None\n )\n\n if not accept_connection:\n layer.reply.kill()\n ctx.log.warn(\"Client connection was killed because allow_remote option is set to false, \"\n \"client IP was not a private IP and proxyauth was not set.\\n\"\n \"To allow remote connections set allow_remote option to true or set proxyauth option.\")\n", "path": "mitmproxy/addons/allowremote.py"}]} | 864 | 170 |
gh_patches_debug_10238 | rasdani/github-patches | git_diff | quantumlib__Cirq-4003 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
optimized_for_sycamore fails when you measure more than 16 qubits
```python
def thing():
q = cirq.LineQubit.range(17)
c = cirq.Circuit(cirq.H.on_each(*q), cirq.measure(*q, key='z'))
c = cg.optimized_for_sycamore(c)
return c
thing()
```
```
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-44-accaace1183a> in <module>
4 c = cg.optimized_for_sycamore(c)
5 return c
----> 6 thing()
<ipython-input-44-accaace1183a> in thing()
2 q = cirq.LineQubit.range(17)
3 c = cirq.Circuit(cirq.H.on_each(*q), cirq.measure(*q, key='z'))
----> 4 c = cg.optimized_for_sycamore(c)
5 return c
6 thing()
~/ecc/cirq/cirq/google/optimizers/optimize_for_sycamore.py in optimized_for_sycamore(circuit, new_device, qubit_map, optimizer_type, tolerance, tabulation_resolution)
159 opts = _OPTIMIZER_TYPES[optimizer_type](tolerance=tolerance, tabulation=tabulation)
160 for optimizer in opts:
--> 161 optimizer(copy)
162
163 return circuits.Circuit(
~/ecc/cirq/cirq/optimizers/drop_negligible.py in optimize_circuit(self, circuit)
37 for moment_index, moment in enumerate(circuit):
38 for op in moment.operations:
---> 39 if op is not None and protocols.trace_distance_bound(op) <= self.tolerance:
40 deletions.append((moment_index, op))
41 circuit.batch_remove(deletions)
~/ecc/cirq/cirq/protocols/trace_distance_bound.py in trace_distance_bound(val)
71
72 for strat in strats:
---> 73 result = strat(val)
74 if result is None:
75 break
~/ecc/cirq/cirq/protocols/trace_distance_bound.py in _strat_distance_from_unitary(val)
96 def _strat_distance_from_unitary(val: Any) -> Optional[float]:
97 """Attempts to compute a value's trace_distance_bound from its unitary."""
---> 98 u = unitary_protocol.unitary(val, default=None)
99
100 if u is None:
~/ecc/cirq/cirq/protocols/unitary_protocol.py in unitary(val, default)
127 ]
128 for strat in strats:
--> 129 result = strat(val)
130 if result is None:
131 break
~/ecc/cirq/cirq/protocols/unitary_protocol.py in _strat_unitary_from_apply_unitary(val)
173
174 # Apply unitary effect to an identity matrix.
--> 175 state = qis.eye_tensor(val_qid_shape, dtype=np.complex128)
176 buffer = np.empty_like(state)
177 result = method(ApplyUnitaryArgs(state, buffer, range(len(val_qid_shape))))
~/ecc/cirq/cirq/qis/states.py in eye_tensor(half_shape, dtype)
669 """
670 identity = np.eye(np.prod(half_shape, dtype=int), dtype=dtype)
--> 671 identity.shape = half_shape * 2
672 return identity
ValueError: maximum supported dimension for an ndarray is 32, found 34
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cirq/optimizers/drop_negligible.py`
Content:
```
1 # Copyright 2018 The Cirq Developers
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """An optimization pass that removes operations with tiny effects."""
16
17 from typing import List, Tuple, TYPE_CHECKING
18
19 from cirq import protocols
20 from cirq.circuits import circuit as _circuit
21
22 if TYPE_CHECKING:
23 from cirq import ops
24
25
26 class DropNegligible:
27 """An optimization pass that removes operations with tiny effects."""
28
29 def __init__(self, tolerance: float = 1e-8) -> None:
30 self.tolerance = tolerance
31
32 def __call__(self, circuit: _circuit.Circuit):
33 self.optimize_circuit(circuit)
34
35 def optimize_circuit(self, circuit: _circuit.Circuit) -> None:
36 deletions: List[Tuple[int, ops.Operation]] = []
37 for moment_index, moment in enumerate(circuit):
38 for op in moment.operations:
39 if op is not None and protocols.trace_distance_bound(op) <= self.tolerance:
40 deletions.append((moment_index, op))
41 circuit.batch_remove(deletions)
42
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cirq/optimizers/drop_negligible.py b/cirq/optimizers/drop_negligible.py
--- a/cirq/optimizers/drop_negligible.py
+++ b/cirq/optimizers/drop_negligible.py
@@ -36,6 +36,8 @@
deletions: List[Tuple[int, ops.Operation]] = []
for moment_index, moment in enumerate(circuit):
for op in moment.operations:
- if op is not None and protocols.trace_distance_bound(op) <= self.tolerance:
+ if protocols.is_measurement(op):
+ continue
+ if protocols.trace_distance_bound(op) <= self.tolerance:
deletions.append((moment_index, op))
circuit.batch_remove(deletions)
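Note on the fix above: measurements have no unitary, so asking for a trace-distance bound on a 17-qubit measurement forces construction of a tensor with 34 axes and hits NumPy's dimension limit (32 axes on the NumPy versions reported in the traceback). A rough standalone version of the patched pass, written against the public `cirq` protocol functions used in the diff (a sketch, not the library code), is:

```python
import cirq

def drop_negligible_keep_measurements(circuit: cirq.Circuit, tolerance: float = 1e-8) -> None:
    """Remove near-identity operations in place, leaving measurements untouched."""
    deletions = []
    for moment_index, moment in enumerate(circuit):
        for op in moment.operations:
            if cirq.is_measurement(op):
                continue  # no unitary to bound; keep the measurement
            if cirq.trace_distance_bound(op) <= tolerance:
                deletions.append((moment_index, op))
    circuit.batch_remove(deletions)

if __name__ == "__main__":
    qubits = cirq.LineQubit.range(17)
    c = cirq.Circuit(cirq.H.on_each(*qubits), cirq.measure(*qubits, key="z"))
    drop_negligible_keep_measurements(c)  # no longer raises on the wide measurement
    print(len(list(c.all_operations())))
```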
| {"golden_diff": "diff --git a/cirq/optimizers/drop_negligible.py b/cirq/optimizers/drop_negligible.py\n--- a/cirq/optimizers/drop_negligible.py\n+++ b/cirq/optimizers/drop_negligible.py\n@@ -36,6 +36,8 @@\n deletions: List[Tuple[int, ops.Operation]] = []\n for moment_index, moment in enumerate(circuit):\n for op in moment.operations:\n- if op is not None and protocols.trace_distance_bound(op) <= self.tolerance:\n+ if protocols.is_measurement(op):\n+ continue\n+ if protocols.trace_distance_bound(op) <= self.tolerance:\n deletions.append((moment_index, op))\n circuit.batch_remove(deletions)\n", "issue": "optimized_for_sycamore fails when you measure more than 16 qubits\n```python\r\ndef thing():\r\n q = cirq.LineQubit.range(17)\r\n c = cirq.Circuit(cirq.H.on_each(*q), cirq.measure(*q, key='z'))\r\n c = cg.optimized_for_sycamore(c)\r\n return c\r\nthing() \r\n```\r\n\r\n```\r\n---------------------------------------------------------------------------\r\nValueError Traceback (most recent call last)\r\n<ipython-input-44-accaace1183a> in <module>\r\n 4 c = cg.optimized_for_sycamore(c)\r\n 5 return c\r\n----> 6 thing()\r\n\r\n<ipython-input-44-accaace1183a> in thing()\r\n 2 q = cirq.LineQubit.range(17)\r\n 3 c = cirq.Circuit(cirq.H.on_each(*q), cirq.measure(*q, key='z'))\r\n----> 4 c = cg.optimized_for_sycamore(c)\r\n 5 return c\r\n 6 thing()\r\n\r\n~/ecc/cirq/cirq/google/optimizers/optimize_for_sycamore.py in optimized_for_sycamore(circuit, new_device, qubit_map, optimizer_type, tolerance, tabulation_resolution)\r\n 159 opts = _OPTIMIZER_TYPES[optimizer_type](tolerance=tolerance, tabulation=tabulation)\r\n 160 for optimizer in opts:\r\n--> 161 optimizer(copy)\r\n 162 \r\n 163 return circuits.Circuit(\r\n\r\n~/ecc/cirq/cirq/optimizers/drop_negligible.py in optimize_circuit(self, circuit)\r\n 37 for moment_index, moment in enumerate(circuit):\r\n 38 for op in moment.operations:\r\n---> 39 if op is not None and protocols.trace_distance_bound(op) <= self.tolerance:\r\n 40 deletions.append((moment_index, op))\r\n 41 circuit.batch_remove(deletions)\r\n\r\n~/ecc/cirq/cirq/protocols/trace_distance_bound.py in trace_distance_bound(val)\r\n 71 \r\n 72 for strat in strats:\r\n---> 73 result = strat(val)\r\n 74 if result is None:\r\n 75 break\r\n\r\n~/ecc/cirq/cirq/protocols/trace_distance_bound.py in _strat_distance_from_unitary(val)\r\n 96 def _strat_distance_from_unitary(val: Any) -> Optional[float]:\r\n 97 \"\"\"Attempts to compute a value's trace_distance_bound from its unitary.\"\"\"\r\n---> 98 u = unitary_protocol.unitary(val, default=None)\r\n 99 \r\n 100 if u is None:\r\n\r\n~/ecc/cirq/cirq/protocols/unitary_protocol.py in unitary(val, default)\r\n 127 ]\r\n 128 for strat in strats:\r\n--> 129 result = strat(val)\r\n 130 if result is None:\r\n 131 break\r\n\r\n~/ecc/cirq/cirq/protocols/unitary_protocol.py in _strat_unitary_from_apply_unitary(val)\r\n 173 \r\n 174 # Apply unitary effect to an identity matrix.\r\n--> 175 state = qis.eye_tensor(val_qid_shape, dtype=np.complex128)\r\n 176 buffer = np.empty_like(state)\r\n 177 result = method(ApplyUnitaryArgs(state, buffer, range(len(val_qid_shape))))\r\n\r\n~/ecc/cirq/cirq/qis/states.py in eye_tensor(half_shape, dtype)\r\n 669 \"\"\"\r\n 670 identity = np.eye(np.prod(half_shape, dtype=int), dtype=dtype)\r\n--> 671 identity.shape = half_shape * 2\r\n 672 return identity\r\n\r\nValueError: maximum supported dimension for an ndarray is 32, found 34\r\n```\n", "before_files": [{"content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the 
Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"An optimization pass that removes operations with tiny effects.\"\"\"\n\nfrom typing import List, Tuple, TYPE_CHECKING\n\nfrom cirq import protocols\nfrom cirq.circuits import circuit as _circuit\n\nif TYPE_CHECKING:\n from cirq import ops\n\n\nclass DropNegligible:\n \"\"\"An optimization pass that removes operations with tiny effects.\"\"\"\n\n def __init__(self, tolerance: float = 1e-8) -> None:\n self.tolerance = tolerance\n\n def __call__(self, circuit: _circuit.Circuit):\n self.optimize_circuit(circuit)\n\n def optimize_circuit(self, circuit: _circuit.Circuit) -> None:\n deletions: List[Tuple[int, ops.Operation]] = []\n for moment_index, moment in enumerate(circuit):\n for op in moment.operations:\n if op is not None and protocols.trace_distance_bound(op) <= self.tolerance:\n deletions.append((moment_index, op))\n circuit.batch_remove(deletions)\n", "path": "cirq/optimizers/drop_negligible.py"}], "after_files": [{"content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"An optimization pass that removes operations with tiny effects.\"\"\"\n\nfrom typing import List, Tuple, TYPE_CHECKING\n\nfrom cirq import protocols\nfrom cirq.circuits import circuit as _circuit\n\nif TYPE_CHECKING:\n from cirq import ops\n\n\nclass DropNegligible:\n \"\"\"An optimization pass that removes operations with tiny effects.\"\"\"\n\n def __init__(self, tolerance: float = 1e-8) -> None:\n self.tolerance = tolerance\n\n def __call__(self, circuit: _circuit.Circuit):\n self.optimize_circuit(circuit)\n\n def optimize_circuit(self, circuit: _circuit.Circuit) -> None:\n deletions: List[Tuple[int, ops.Operation]] = []\n for moment_index, moment in enumerate(circuit):\n for op in moment.operations:\n if protocols.is_measurement(op):\n continue\n if protocols.trace_distance_bound(op) <= self.tolerance:\n deletions.append((moment_index, op))\n circuit.batch_remove(deletions)\n", "path": "cirq/optimizers/drop_negligible.py"}]} | 1,573 | 163 |
gh_patches_debug_22077 | rasdani/github-patches | git_diff | spack__spack-13605 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pybind11: fix get_include()
The `py-pybind11` package has a python module helper of the form:
```python
import pybind11 as py
print(py.get_include())
```
which helps downstream projects that are not using the CMake config package (which probably everyone is using) to query the install location.
`get_include()` currently points to the spec's Python include location instead of the package's include.
Diff for a new unit test:
```diff
diff --git a/var/spack/repos/builtin/packages/py-pybind11/package.py b/var/spack/repos/builtin/packages/py-pybind11/package.py
index cd482ceb3..0a86afb97 100644
--- a/var/spack/repos/builtin/packages/py-pybind11/package.py
+++ b/var/spack/repos/builtin/packages/py-pybind11/package.py
@@ -60,3 +60,17 @@ class PyPybind11(CMakePackage):
super(PyPybind11, self).install(spec, prefix)
setup_py('install', '--single-version-externally-managed', '--root=/',
'--prefix={0}'.format(prefix))
+
+ @run_after('install')
+ @on_package_attributes(run_tests=True)
+ def test(self):
+ with working_dir('spack-test', create=True):
+ # test include helper points to right location
+ module = "pybind11"
+ python = Executable(self.spec['python'].command.path)
+ python('-c', 'import {0}'.format(module))
+ inc = python('-c', 'import {0} as py; print(py.get_include())'.format(
+ module), output=str)
+ print("INC:")
+ print(inc)
+ assert inc == self.prefix.include
```
Similar to:
- https://github.com/conda-forge/pybind11-feedstock/pull/32
- https://github.com/pybind/pybind11/pull/1877
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `var/spack/repos/builtin/packages/py-pybind11/package.py`
Content:
```
1 # Copyright 2013-2019 Lawrence Livermore National Security, LLC and other
2 # Spack Project Developers. See the top-level COPYRIGHT file for details.
3 #
4 # SPDX-License-Identifier: (Apache-2.0 OR MIT)
5
6 from spack import *
7
8
9 class PyPybind11(CMakePackage):
10 """pybind11 -- Seamless operability between C++11 and Python.
11
12 pybind11 is a lightweight header-only library that exposes C++ types in
13 Python and vice versa, mainly to create Python bindings of existing C++
14 code. Its goals and syntax are similar to the excellent Boost.Python
15 library by David Abrahams: to minimize boilerplate code in traditional
16 extension modules by inferring type information using compile-time
17 introspection."""
18
19 homepage = "https://pybind11.readthedocs.io"
20 url = "https://github.com/pybind/pybind11/archive/v2.1.0.tar.gz"
21 git = "https://github.com/pybind/pybind11.git"
22
23 maintainers = ['ax3l']
24
25 version('master', branch='master')
26 version('2.4.3', sha256='1eed57bc6863190e35637290f97a20c81cfe4d9090ac0a24f3bbf08f265eb71d')
27 version('2.3.0', sha256='0f34838f2c8024a6765168227ba587b3687729ebf03dc912f88ff75c7aa9cfe8')
28 version('2.2.4', sha256='b69e83658513215b8d1443544d0549b7d231b9f201f6fc787a2b2218b408181e')
29 version('2.2.3', sha256='3a3b7b651afab1c5ba557f4c37d785a522b8030dfc765da26adc2ecd1de940ea')
30 version('2.2.2', sha256='b639a2b2cbf1c467849660801c4665ffc1a4d0a9e153ae1996ed6f21c492064e')
31 version('2.2.1', sha256='f8bd1509578b2a1e7407d52e6ee8afe64268909a1bbda620ca407318598927e7')
32 version('2.2.0', sha256='1b0fda17c650c493f5862902e90f426df6751da8c0b58c05983ab009951ed769')
33 version('2.1.1', sha256='f2c6874f1ea5b4ad4ffffe352413f7d2cd1a49f9050940805c2a082348621540')
34 version('2.1.0', sha256='2860f2b8d0c9f65f0698289a161385f59d099b7ead1bf64e8993c486f2b93ee0')
35
36 depends_on('py-pytest', type='test')
37 depends_on('py-setuptools', type='build')
38
39 extends('python')
40
41 # compiler support
42 conflicts('%gcc@:4.7')
43 conflicts('%clang@:3.2')
44 conflicts('%intel@:16')
45
46 def cmake_args(self):
47 args = []
48 args.append('-DPYTHON_EXECUTABLE:FILEPATH=%s'
49 % self.spec['python'].command.path)
50 args += [
51 '-DPYBIND11_TEST:BOOL={0}'.format(
52 'ON' if self.run_tests else 'OFF')
53 ]
54 return args
55
56 def setup_build_environment(self, env):
57 env.set('PYBIND11_USE_CMAKE', 1)
58
59 def install(self, spec, prefix):
60 super(PyPybind11, self).install(spec, prefix)
61 setup_py('install', '--single-version-externally-managed', '--root=/',
62 '--prefix={0}'.format(prefix))
63
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/var/spack/repos/builtin/packages/py-pybind11/package.py b/var/spack/repos/builtin/packages/py-pybind11/package.py
--- a/var/spack/repos/builtin/packages/py-pybind11/package.py
+++ b/var/spack/repos/builtin/packages/py-pybind11/package.py
@@ -56,7 +56,27 @@
def setup_build_environment(self, env):
env.set('PYBIND11_USE_CMAKE', 1)
+ def patch(self):
+ """ see https://github.com/spack/spack/issues/13559 """
+ filter_file('import sys',
+ 'import sys; return "{0}"'.format(self.prefix.include),
+ 'pybind11/__init__.py',
+ string=True)
+
def install(self, spec, prefix):
super(PyPybind11, self).install(spec, prefix)
setup_py('install', '--single-version-externally-managed', '--root=/',
'--prefix={0}'.format(prefix))
+
+ @run_after('install')
+ @on_package_attributes(run_tests=True)
+ def test(self):
+ with working_dir('spack-test', create=True):
+ # test include helper points to right location
+ python = self.spec['python'].command
+ inc = python(
+ '-c',
+ 'import pybind11 as py; ' +
+ self.spec['python'].package.print_string('py.get_include()'),
+ output=str)
+ assert inc.strip() == str(self.prefix.include)
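Note on the fix above: the patched package rewrites `pybind11/__init__.py` so that `get_include()` reports the Spack prefix, and the post-install test asserts exactly that. Outside of Spack, the same property can be checked against any interpreter with a small helper; the function name and the expected-path argument below are illustrative assumptions, and only `pybind11.get_include()` itself is the real API under test:

```python
import subprocess
import sys

def pybind11_include_matches(python_exe: str, expected_include: str) -> bool:
    """Ask the given interpreter where pybind11 reports its headers and compare."""
    out = subprocess.check_output(
        [python_exe, "-c", "import pybind11; print(pybind11.get_include())"],
        text=True,
    )
    return out.strip() == expected_include

if __name__ == "__main__":
    print(pybind11_include_matches(sys.executable, "/expected/prefix/include"))
```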
| {"golden_diff": "diff --git a/var/spack/repos/builtin/packages/py-pybind11/package.py b/var/spack/repos/builtin/packages/py-pybind11/package.py\n--- a/var/spack/repos/builtin/packages/py-pybind11/package.py\n+++ b/var/spack/repos/builtin/packages/py-pybind11/package.py\n@@ -56,7 +56,27 @@\n def setup_build_environment(self, env):\n env.set('PYBIND11_USE_CMAKE', 1)\n \n+ def patch(self):\n+ \"\"\" see https://github.com/spack/spack/issues/13559 \"\"\"\n+ filter_file('import sys',\n+ 'import sys; return \"{0}\"'.format(self.prefix.include),\n+ 'pybind11/__init__.py',\n+ string=True)\n+\n def install(self, spec, prefix):\n super(PyPybind11, self).install(spec, prefix)\n setup_py('install', '--single-version-externally-managed', '--root=/',\n '--prefix={0}'.format(prefix))\n+\n+ @run_after('install')\n+ @on_package_attributes(run_tests=True)\n+ def test(self):\n+ with working_dir('spack-test', create=True):\n+ # test include helper points to right location\n+ python = self.spec['python'].command\n+ inc = python(\n+ '-c',\n+ 'import pybind11 as py; ' +\n+ self.spec['python'].package.print_string('py.get_include()'),\n+ output=str)\n+ assert inc.strip() == str(self.prefix.include)\n", "issue": "pybind11: fix get_include()\nThe `py-pybind11` package has a python module helper of the form:\r\n\r\n```python\r\nimport pybind11 as py\r\nprint(py.get_include())\r\n```\r\n\r\nwhich helps downstream, if they are not using the CMake config package (which probably everyone is using), to query the install location.\r\n\r\n`get_include()` currently points to the spec's Python include location instead of the package's include.\r\n\r\nDiff for a new unit test:\r\n```diff\r\ndiff --git a/var/spack/repos/builtin/packages/py-pybind11/package.py b/var/spack/repos/builtin/packages/py-pybind11/package.py\r\nindex cd482ceb3..0a86afb97 100644\r\n--- a/var/spack/repos/builtin/packages/py-pybind11/package.py\r\n+++ b/var/spack/repos/builtin/packages/py-pybind11/package.py\r\n@@ -60,3 +60,17 @@ class PyPybind11(CMakePackage):\r\n super(PyPybind11, self).install(spec, prefix)\r\n setup_py('install', '--single-version-externally-managed', '--root=/',\r\n '--prefix={0}'.format(prefix))\r\n+\r\n+ @run_after('install')\r\n+ @on_package_attributes(run_tests=True)\r\n+ def test(self):\r\n+ with working_dir('spack-test', create=True):\r\n+ # test include helper points to right location\r\n+ module = \"pybind11\"\r\n+ python = Executable(self.spec['python'].command.path)\r\n+ python('-c', 'import {0}'.format(module))\r\n+ inc = python('-c', 'import {0} as py; print(py.get_include())'.format(\r\n+ module), output=str)\r\n+ print(\"INC:\")\r\n+ print(inc)\r\n+ assert inc == self.prefix.include\r\n```\r\n\r\nSimilar to:\r\n- https://github.com/conda-forge/pybind11-feedstock/pull/32\r\n- https://github.com/pybind/pybind11/pull/1877\n", "before_files": [{"content": "# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other\n# Spack Project Developers. See the top-level COPYRIGHT file for details.\n#\n# SPDX-License-Identifier: (Apache-2.0 OR MIT)\n\nfrom spack import *\n\n\nclass PyPybind11(CMakePackage):\n \"\"\"pybind11 -- Seamless operability between C++11 and Python.\n\n pybind11 is a lightweight header-only library that exposes C++ types in\n Python and vice versa, mainly to create Python bindings of existing C++\n code. 
Its goals and syntax are similar to the excellent Boost.Python\n library by David Abrahams: to minimize boilerplate code in traditional\n extension modules by inferring type information using compile-time\n introspection.\"\"\"\n\n homepage = \"https://pybind11.readthedocs.io\"\n url = \"https://github.com/pybind/pybind11/archive/v2.1.0.tar.gz\"\n git = \"https://github.com/pybind/pybind11.git\"\n\n maintainers = ['ax3l']\n\n version('master', branch='master')\n version('2.4.3', sha256='1eed57bc6863190e35637290f97a20c81cfe4d9090ac0a24f3bbf08f265eb71d')\n version('2.3.0', sha256='0f34838f2c8024a6765168227ba587b3687729ebf03dc912f88ff75c7aa9cfe8')\n version('2.2.4', sha256='b69e83658513215b8d1443544d0549b7d231b9f201f6fc787a2b2218b408181e')\n version('2.2.3', sha256='3a3b7b651afab1c5ba557f4c37d785a522b8030dfc765da26adc2ecd1de940ea')\n version('2.2.2', sha256='b639a2b2cbf1c467849660801c4665ffc1a4d0a9e153ae1996ed6f21c492064e')\n version('2.2.1', sha256='f8bd1509578b2a1e7407d52e6ee8afe64268909a1bbda620ca407318598927e7')\n version('2.2.0', sha256='1b0fda17c650c493f5862902e90f426df6751da8c0b58c05983ab009951ed769')\n version('2.1.1', sha256='f2c6874f1ea5b4ad4ffffe352413f7d2cd1a49f9050940805c2a082348621540')\n version('2.1.0', sha256='2860f2b8d0c9f65f0698289a161385f59d099b7ead1bf64e8993c486f2b93ee0')\n\n depends_on('py-pytest', type='test')\n depends_on('py-setuptools', type='build')\n\n extends('python')\n\n # compiler support\n conflicts('%gcc@:4.7')\n conflicts('%clang@:3.2')\n conflicts('%intel@:16')\n\n def cmake_args(self):\n args = []\n args.append('-DPYTHON_EXECUTABLE:FILEPATH=%s'\n % self.spec['python'].command.path)\n args += [\n '-DPYBIND11_TEST:BOOL={0}'.format(\n 'ON' if self.run_tests else 'OFF')\n ]\n return args\n\n def setup_build_environment(self, env):\n env.set('PYBIND11_USE_CMAKE', 1)\n\n def install(self, spec, prefix):\n super(PyPybind11, self).install(spec, prefix)\n setup_py('install', '--single-version-externally-managed', '--root=/',\n '--prefix={0}'.format(prefix))\n", "path": "var/spack/repos/builtin/packages/py-pybind11/package.py"}], "after_files": [{"content": "# Copyright 2013-2019 Lawrence Livermore National Security, LLC and other\n# Spack Project Developers. See the top-level COPYRIGHT file for details.\n#\n# SPDX-License-Identifier: (Apache-2.0 OR MIT)\n\nfrom spack import *\n\n\nclass PyPybind11(CMakePackage):\n \"\"\"pybind11 -- Seamless operability between C++11 and Python.\n\n pybind11 is a lightweight header-only library that exposes C++ types in\n Python and vice versa, mainly to create Python bindings of existing C++\n code. 
Its goals and syntax are similar to the excellent Boost.Python\n library by David Abrahams: to minimize boilerplate code in traditional\n extension modules by inferring type information using compile-time\n introspection.\"\"\"\n\n homepage = \"https://pybind11.readthedocs.io\"\n url = \"https://github.com/pybind/pybind11/archive/v2.1.0.tar.gz\"\n git = \"https://github.com/pybind/pybind11.git\"\n\n maintainers = ['ax3l']\n\n version('master', branch='master')\n version('2.4.3', sha256='1eed57bc6863190e35637290f97a20c81cfe4d9090ac0a24f3bbf08f265eb71d')\n version('2.3.0', sha256='0f34838f2c8024a6765168227ba587b3687729ebf03dc912f88ff75c7aa9cfe8')\n version('2.2.4', sha256='b69e83658513215b8d1443544d0549b7d231b9f201f6fc787a2b2218b408181e')\n version('2.2.3', sha256='3a3b7b651afab1c5ba557f4c37d785a522b8030dfc765da26adc2ecd1de940ea')\n version('2.2.2', sha256='b639a2b2cbf1c467849660801c4665ffc1a4d0a9e153ae1996ed6f21c492064e')\n version('2.2.1', sha256='f8bd1509578b2a1e7407d52e6ee8afe64268909a1bbda620ca407318598927e7')\n version('2.2.0', sha256='1b0fda17c650c493f5862902e90f426df6751da8c0b58c05983ab009951ed769')\n version('2.1.1', sha256='f2c6874f1ea5b4ad4ffffe352413f7d2cd1a49f9050940805c2a082348621540')\n version('2.1.0', sha256='2860f2b8d0c9f65f0698289a161385f59d099b7ead1bf64e8993c486f2b93ee0')\n\n depends_on('py-pytest', type='test')\n depends_on('py-setuptools', type='build')\n\n extends('python')\n\n # compiler support\n conflicts('%gcc@:4.7')\n conflicts('%clang@:3.2')\n conflicts('%intel@:16')\n\n def cmake_args(self):\n args = []\n args.append('-DPYTHON_EXECUTABLE:FILEPATH=%s'\n % self.spec['python'].command.path)\n args += [\n '-DPYBIND11_TEST:BOOL={0}'.format(\n 'ON' if self.run_tests else 'OFF')\n ]\n return args\n\n def setup_build_environment(self, env):\n env.set('PYBIND11_USE_CMAKE', 1)\n\n def patch(self):\n \"\"\" see https://github.com/spack/spack/issues/13559 \"\"\"\n filter_file('import sys',\n 'import sys; return \"{0}\"'.format(self.prefix.include),\n 'pybind11/__init__.py',\n string=True)\n\n def install(self, spec, prefix):\n super(PyPybind11, self).install(spec, prefix)\n setup_py('install', '--single-version-externally-managed', '--root=/',\n '--prefix={0}'.format(prefix))\n\n @run_after('install')\n @on_package_attributes(run_tests=True)\n def test(self):\n with working_dir('spack-test', create=True):\n # test include helper points to right location\n python = self.spec['python'].command\n inc = python(\n '-c',\n 'import pybind11 as py; ' +\n self.spec['python'].package.print_string('py.get_include()'),\n output=str)\n assert inc.strip() == str(self.prefix.include)\n", "path": "var/spack/repos/builtin/packages/py-pybind11/package.py"}]} | 1,962 | 347 |
gh_patches_debug_9405 | rasdani/github-patches | git_diff | ivy-llc__ivy-17873 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
dropout1d
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/torch/nn/functional/dropout_functions.py`
Content:
```
1 # local
2 import ivy
3 from ivy.func_wrapper import with_unsupported_dtypes
4
5 from ivy.functional.frontends.torch.func_wrapper import to_ivy_arrays_and_back
6
7
8 @to_ivy_arrays_and_back
9 @with_unsupported_dtypes({"2.0.1 and below": ("float16",)}, "torch")
10 def dropout(input, p=0.5, training=True, inplace=False):
11 return ivy.dropout(input, p, training=training)
12
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/torch/nn/functional/dropout_functions.py b/ivy/functional/frontends/torch/nn/functional/dropout_functions.py
--- a/ivy/functional/frontends/torch/nn/functional/dropout_functions.py
+++ b/ivy/functional/frontends/torch/nn/functional/dropout_functions.py
@@ -9,3 +9,11 @@
@with_unsupported_dtypes({"2.0.1 and below": ("float16",)}, "torch")
def dropout(input, p=0.5, training=True, inplace=False):
return ivy.dropout(input, p, training=training)
+
+
+@to_ivy_arrays_and_back
+@with_unsupported_dtypes({"2.0.1 and below": ("float16",)}, "torch")
+def dropout1d(input, p=0.5, training=True, inplace=False):
+ if inplace:
+ return ivy.dropout1d(input, p, training=training, data_format="NCW", out=input)
+ return ivy.dropout1d(input, p, training=training, data_format="NCW")
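For context, `torch.nn.functional.dropout1d` zeroes whole channels of an NCW tensor and rescales the survivors, and the new frontend function simply forwards to `ivy.dropout1d` as shown in the diff. A rough NumPy reference of that channel-dropout behaviour (an illustrative sketch, not the ivy implementation) is:

```python
import numpy as np

def dropout1d_reference(x: np.ndarray, p: float = 0.5, training: bool = True) -> np.ndarray:
    """Channel-wise dropout for an (N, C, W) array with inverted scaling."""
    if not training or p == 0.0:
        return x
    n, c, _ = x.shape
    keep = (np.random.rand(n, c, 1) >= p).astype(x.dtype)  # drop whole channels
    return x * keep / (1.0 - p)

if __name__ == "__main__":
    x = np.ones((2, 4, 8), dtype=np.float32)
    print(dropout1d_reference(x, p=0.5)[0, :, 0])  # each channel is either 0 or 2
```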
| {"golden_diff": "diff --git a/ivy/functional/frontends/torch/nn/functional/dropout_functions.py b/ivy/functional/frontends/torch/nn/functional/dropout_functions.py\n--- a/ivy/functional/frontends/torch/nn/functional/dropout_functions.py\n+++ b/ivy/functional/frontends/torch/nn/functional/dropout_functions.py\n@@ -9,3 +9,11 @@\n @with_unsupported_dtypes({\"2.0.1 and below\": (\"float16\",)}, \"torch\")\n def dropout(input, p=0.5, training=True, inplace=False):\n return ivy.dropout(input, p, training=training)\n+\n+\n+@to_ivy_arrays_and_back\n+@with_unsupported_dtypes({\"2.0.1 and below\": (\"float16\",)}, \"torch\")\n+def dropout1d(input, p=0.5, training=True, inplace=False):\n+ if inplace:\n+ return ivy.dropout1d(input, p, training=training, data_format=\"NCW\", out=input)\n+ return ivy.dropout1d(input, p, training=training, data_format=\"NCW\")\n", "issue": "dropout1d\n\n", "before_files": [{"content": "# local\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\n\nfrom ivy.functional.frontends.torch.func_wrapper import to_ivy_arrays_and_back\n\n\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes({\"2.0.1 and below\": (\"float16\",)}, \"torch\")\ndef dropout(input, p=0.5, training=True, inplace=False):\n return ivy.dropout(input, p, training=training)\n", "path": "ivy/functional/frontends/torch/nn/functional/dropout_functions.py"}], "after_files": [{"content": "# local\nimport ivy\nfrom ivy.func_wrapper import with_unsupported_dtypes\n\nfrom ivy.functional.frontends.torch.func_wrapper import to_ivy_arrays_and_back\n\n\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes({\"2.0.1 and below\": (\"float16\",)}, \"torch\")\ndef dropout(input, p=0.5, training=True, inplace=False):\n return ivy.dropout(input, p, training=training)\n\n\n@to_ivy_arrays_and_back\n@with_unsupported_dtypes({\"2.0.1 and below\": (\"float16\",)}, \"torch\")\ndef dropout1d(input, p=0.5, training=True, inplace=False):\n if inplace:\n return ivy.dropout1d(input, p, training=training, data_format=\"NCW\", out=input)\n return ivy.dropout1d(input, p, training=training, data_format=\"NCW\")\n", "path": "ivy/functional/frontends/torch/nn/functional/dropout_functions.py"}]} | 390 | 251 |
gh_patches_debug_25242 | rasdani/github-patches | git_diff | fedora-infra__bodhi-5479 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Updates sometimes get stuck in pending state
From quick look, it seems that the composer does:
1. move from `f*-updates-candidate` to `f*-updates-testing`
2. do stuff
3. untag from `f*-updates-testing-pending`
4. mark update state as testing
If the composer hangs at step 2, the update remains stuck in pending, since the builds are no longer tagged in `f*-updates-candidate`. We should find a solution.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bodhi-server/bodhi/server/tasks/check_signed_builds.py`
Content:
```
1 # Copyright © 2017 Red Hat, Inc.
2 #
3 # This file is part of Bodhi.
4 #
5 # This program is free software; you can redistribute it and/or
6 # modify it under the terms of the GNU General Public License
7 # as published by the Free Software Foundation; either version 2
8 # of the License, or (at your option) any later version.
9 #
10 # This program is distributed in the hope that it will be useful,
11 # but WITHOUT ANY WARRANTY; without even the implied warranty of
12 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
13 # GNU General Public License for more details.
14 #
15 # You should have received a copy of the GNU General Public License
16 # along with this program; if not, write to the Free Software
17 # Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.
18
19 """
20 Avoid Updates being stuck in pending.
21
22 It may happen that Bodhi misses fedora-messaging messages announcing builds
23 have been signed.
24 In these cases, the Update remain stuck in pending until a manual intervention.
25
26 This script will cycle through builds of Updates in pending status and update
27 the signed status in the db to match the tags found in Koji.
28 """
29
30 import logging
31 from datetime import datetime, timedelta
32
33 from bodhi.server import buildsys, models
34 from bodhi.server.config import config
35 from bodhi.server.util import transactional_session_maker
36
37
38 log = logging.getLogger(__name__)
39
40
41 def main():
42 """Check build tags and sign those we missed."""
43 db_factory = transactional_session_maker()
44 older_than = datetime.utcnow() - timedelta(days=config.get('check_signed_builds_delay'))
45 with db_factory() as session:
46 updates = models.Update.query.filter(
47 models.Update.status == models.UpdateStatus.pending
48 ).filter(
49 models.Update.release_id == models.Release.id
50 ).filter(
51 models.Release.state.in_([
52 models.ReleaseState.current,
53 models.ReleaseState.pending,
54 models.ReleaseState.frozen,
55 ])
56 ).all()
57
58 if len(updates) == 0:
59 log.debug('No stuck Updates found')
60 return
61
62 kc = buildsys.get_session()
63 stuck_builds = []
64 overlooked_builds = []
65
66 for update in updates:
67 # Let Bodhi have its times
68 if update.date_submitted >= older_than:
69 continue
70 builds = update.builds
71 # Clean Updates with no builds
72 if len(builds) == 0:
73 log.debug(f'Obsoleting empty update {update.alias}')
74 update.obsolete(session)
75 session.flush()
76 continue
77 pending_signing_tag = update.release.pending_signing_tag
78 pending_testing_tag = update.release.pending_testing_tag
79 for build in builds:
80 if build.signed:
81 log.debug(f'{build.nvr} already marked as signed')
82 continue
83 build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]
84 if pending_signing_tag not in build_tags and pending_testing_tag in build_tags:
85 # Our composer missed the message that the build got signed
86 log.debug(f'Changing signed status of {build.nvr}')
87 build.signed = True
88 elif pending_signing_tag in build_tags and pending_testing_tag not in build_tags:
89 # autosign missed the message that the build is waiting to be signed
90 log.debug(f'{build.nvr} is stuck waiting to be signed, let\'s try again')
91 stuck_builds.append((build.nvr, pending_signing_tag))
92 elif (pending_signing_tag not in build_tags
93 and pending_testing_tag not in build_tags):
94 # this means that an update has been created but we never tagged the build
95 # as pending-signing
96 log.debug(f'Oh, no! We\'ve never sent {build.nvr} for signing, let\'s fix it')
97 overlooked_builds.append((build.nvr, pending_signing_tag))
98 session.flush()
99
100 if stuck_builds:
101 kc.multicall = True
102 for b, t in stuck_builds:
103 kc.untagBuild(t, b, force=True)
104 kc.multiCall()
105 for b, t in stuck_builds:
106 kc.tagBuild(t, b, force=True)
107 kc.multiCall()
108
109 if overlooked_builds:
110 kc.multicall = True
111 for b, t in overlooked_builds:
112 kc.tagBuild(t, b, force=True)
113 kc.multiCall()
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bodhi-server/bodhi/server/tasks/check_signed_builds.py b/bodhi-server/bodhi/server/tasks/check_signed_builds.py
--- a/bodhi-server/bodhi/server/tasks/check_signed_builds.py
+++ b/bodhi-server/bodhi/server/tasks/check_signed_builds.py
@@ -77,10 +77,16 @@
pending_signing_tag = update.release.pending_signing_tag
pending_testing_tag = update.release.pending_testing_tag
for build in builds:
+ build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]
if build.signed:
log.debug(f'{build.nvr} already marked as signed')
+ if (update.release.testing_tag in build_tags
+ and update.release.candidate_tag not in build_tags):
+ # The update was probably ejected from a compose and is stuck
+ log.debug(f'Resubmitting {update.alias} to testing')
+ update.set_request(session, models.UpdateRequest.testing, 'bodhi')
+ break
continue
- build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]
if pending_signing_tag not in build_tags and pending_testing_tag in build_tags:
# Our composer missed the message that the build got signed
log.debug(f'Changing signed status of {build.nvr}')
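Note on the fix above: the added branch covers a fourth state, builds that are already signed and carry the release's testing tag but not the candidate tag, which is the fingerprint of an update ejected mid-compose; such updates are resubmitted to testing. The decision table, restated as plain Python over any object exposing the same tag attributes (a sketch, not Bodhi code), is:

```python
def next_action(build_tags: set, signed: bool, release) -> str:
    """One build's disposition; `release` needs the four tag attributes used below."""
    if signed:
        if release.testing_tag in build_tags and release.candidate_tag not in build_tags:
            return "resubmit-update-to-testing"   # likely ejected from a compose
        return "nothing-to-do"
    if release.pending_signing_tag not in build_tags and release.pending_testing_tag in build_tags:
        return "mark-signed"                      # Bodhi missed the signing message
    if release.pending_signing_tag in build_tags and release.pending_testing_tag not in build_tags:
        return "retag-for-signing"                # autosign missed it; untag and retag
    return "send-for-signing"                     # never tagged pending-signing at all
```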
| {"golden_diff": "diff --git a/bodhi-server/bodhi/server/tasks/check_signed_builds.py b/bodhi-server/bodhi/server/tasks/check_signed_builds.py\n--- a/bodhi-server/bodhi/server/tasks/check_signed_builds.py\n+++ b/bodhi-server/bodhi/server/tasks/check_signed_builds.py\n@@ -77,10 +77,16 @@\n pending_signing_tag = update.release.pending_signing_tag\n pending_testing_tag = update.release.pending_testing_tag\n for build in builds:\n+ build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]\n if build.signed:\n log.debug(f'{build.nvr} already marked as signed')\n+ if (update.release.testing_tag in build_tags\n+ and update.release.candidate_tag not in build_tags):\n+ # The update was probably ejected from a compose and is stuck\n+ log.debug(f'Resubmitting {update.alias} to testing')\n+ update.set_request(session, models.UpdateRequest.testing, 'bodhi')\n+ break\n continue\n- build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]\n if pending_signing_tag not in build_tags and pending_testing_tag in build_tags:\n # Our composer missed the message that the build got signed\n log.debug(f'Changing signed status of {build.nvr}')\n", "issue": "Updates sometimes get stuck in pending state\nFrom quick look, it seems that the composer does:\r\n1. move from `f*-updates-candidate` to `f*-updates-testing`\r\n2. do stuff\r\n3. untag from `f*-updates-testing-pending`\r\n4. mark update state as testing\r\n\r\nIf the composer hang on 2 the update remains stuck in pending as the builds are not tagged anymore in `f*-updates-candidate`. We should find a solution.\n", "before_files": [{"content": "# Copyright \u00a9 2017 Red Hat, Inc.\n#\n# This file is part of Bodhi.\n#\n# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program; if not, write to the Free Software\n# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n\n\"\"\"\nAvoid Updates being stuck in pending.\n\nIt may happen that Bodhi misses fedora-messaging messages announcing builds\nhave been signed.\nIn these cases, the Update remain stuck in pending until a manual intervention.\n\nThis script will cycle through builds of Updates in pending status and update\nthe signed status in the db to match the tags found in Koji.\n\"\"\"\n\nimport logging\nfrom datetime import datetime, timedelta\n\nfrom bodhi.server import buildsys, models\nfrom bodhi.server.config import config\nfrom bodhi.server.util import transactional_session_maker\n\n\nlog = logging.getLogger(__name__)\n\n\ndef main():\n \"\"\"Check build tags and sign those we missed.\"\"\"\n db_factory = transactional_session_maker()\n older_than = datetime.utcnow() - timedelta(days=config.get('check_signed_builds_delay'))\n with db_factory() as session:\n updates = models.Update.query.filter(\n models.Update.status == models.UpdateStatus.pending\n ).filter(\n models.Update.release_id == models.Release.id\n ).filter(\n models.Release.state.in_([\n models.ReleaseState.current,\n models.ReleaseState.pending,\n models.ReleaseState.frozen,\n ])\n ).all()\n\n if len(updates) == 0:\n log.debug('No stuck Updates found')\n return\n\n kc = buildsys.get_session()\n stuck_builds = []\n overlooked_builds = []\n\n for update in updates:\n # Let Bodhi have its times\n if update.date_submitted >= older_than:\n continue\n builds = update.builds\n # Clean Updates with no builds\n if len(builds) == 0:\n log.debug(f'Obsoleting empty update {update.alias}')\n update.obsolete(session)\n session.flush()\n continue\n pending_signing_tag = update.release.pending_signing_tag\n pending_testing_tag = update.release.pending_testing_tag\n for build in builds:\n if build.signed:\n log.debug(f'{build.nvr} already marked as signed')\n continue\n build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]\n if pending_signing_tag not in build_tags and pending_testing_tag in build_tags:\n # Our composer missed the message that the build got signed\n log.debug(f'Changing signed status of {build.nvr}')\n build.signed = True\n elif pending_signing_tag in build_tags and pending_testing_tag not in build_tags:\n # autosign missed the message that the build is waiting to be signed\n log.debug(f'{build.nvr} is stuck waiting to be signed, let\\'s try again')\n stuck_builds.append((build.nvr, pending_signing_tag))\n elif (pending_signing_tag not in build_tags\n and pending_testing_tag not in build_tags):\n # this means that an update has been created but we never tagged the build\n # as pending-signing\n log.debug(f'Oh, no! 
We\\'ve never sent {build.nvr} for signing, let\\'s fix it')\n overlooked_builds.append((build.nvr, pending_signing_tag))\n session.flush()\n\n if stuck_builds:\n kc.multicall = True\n for b, t in stuck_builds:\n kc.untagBuild(t, b, force=True)\n kc.multiCall()\n for b, t in stuck_builds:\n kc.tagBuild(t, b, force=True)\n kc.multiCall()\n\n if overlooked_builds:\n kc.multicall = True\n for b, t in overlooked_builds:\n kc.tagBuild(t, b, force=True)\n kc.multiCall()\n", "path": "bodhi-server/bodhi/server/tasks/check_signed_builds.py"}], "after_files": [{"content": "# Copyright \u00a9 2017 Red Hat, Inc.\n#\n# This file is part of Bodhi.\n#\n# This program is free software; you can redistribute it and/or\n# modify it under the terms of the GNU General Public License\n# as published by the Free Software Foundation; either version 2\n# of the License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU General Public License for more details.\n#\n# You should have received a copy of the GNU General Public License\n# along with this program; if not, write to the Free Software\n# Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301, USA.\n\n\"\"\"\nAvoid Updates being stuck in pending.\n\nIt may happen that Bodhi misses fedora-messaging messages announcing builds\nhave been signed.\nIn these cases, the Update remain stuck in pending until a manual intervention.\n\nThis script will cycle through builds of Updates in pending status and update\nthe signed status in the db to match the tags found in Koji.\n\"\"\"\n\nimport logging\nfrom datetime import datetime, timedelta\n\nfrom bodhi.server import buildsys, models\nfrom bodhi.server.config import config\nfrom bodhi.server.util import transactional_session_maker\n\n\nlog = logging.getLogger(__name__)\n\n\ndef main():\n \"\"\"Check build tags and sign those we missed.\"\"\"\n db_factory = transactional_session_maker()\n older_than = datetime.utcnow() - timedelta(days=config.get('check_signed_builds_delay'))\n with db_factory() as session:\n updates = models.Update.query.filter(\n models.Update.status == models.UpdateStatus.pending\n ).filter(\n models.Update.release_id == models.Release.id\n ).filter(\n models.Release.state.in_([\n models.ReleaseState.current,\n models.ReleaseState.pending,\n models.ReleaseState.frozen,\n ])\n ).all()\n\n if len(updates) == 0:\n log.debug('No stuck Updates found')\n return\n\n kc = buildsys.get_session()\n stuck_builds = []\n overlooked_builds = []\n\n for update in updates:\n # Let Bodhi have its times\n if update.date_submitted >= older_than:\n continue\n builds = update.builds\n # Clean Updates with no builds\n if len(builds) == 0:\n log.debug(f'Obsoleting empty update {update.alias}')\n update.obsolete(session)\n session.flush()\n continue\n pending_signing_tag = update.release.pending_signing_tag\n pending_testing_tag = update.release.pending_testing_tag\n for build in builds:\n build_tags = [t['name'] for t in kc.listTags(build=build.nvr)]\n if build.signed:\n log.debug(f'{build.nvr} already marked as signed')\n if (update.release.testing_tag in build_tags\n and update.release.candidate_tag not in build_tags):\n # The update was probably ejected from a compose and is stuck\n log.debug(f'Resubmitting {update.alias} to testing')\n update.set_request(session, models.UpdateRequest.testing, 'bodhi')\n 
break\n continue\n if pending_signing_tag not in build_tags and pending_testing_tag in build_tags:\n # Our composer missed the message that the build got signed\n log.debug(f'Changing signed status of {build.nvr}')\n build.signed = True\n elif pending_signing_tag in build_tags and pending_testing_tag not in build_tags:\n # autosign missed the message that the build is waiting to be signed\n log.debug(f'{build.nvr} is stuck waiting to be signed, let\\'s try again')\n stuck_builds.append((build.nvr, pending_signing_tag))\n elif (pending_signing_tag not in build_tags\n and pending_testing_tag not in build_tags):\n # this means that an update has been created but we never tagged the build\n # as pending-signing\n log.debug(f'Oh, no! We\\'ve never sent {build.nvr} for signing, let\\'s fix it')\n overlooked_builds.append((build.nvr, pending_signing_tag))\n session.flush()\n\n if stuck_builds:\n kc.multicall = True\n for b, t in stuck_builds:\n kc.untagBuild(t, b, force=True)\n kc.multiCall()\n for b, t in stuck_builds:\n kc.tagBuild(t, b, force=True)\n kc.multiCall()\n\n if overlooked_builds:\n kc.multicall = True\n for b, t in overlooked_builds:\n kc.tagBuild(t, b, force=True)\n kc.multiCall()\n", "path": "bodhi-server/bodhi/server/tasks/check_signed_builds.py"}]} | 1,561 | 300 |
gh_patches_debug_10905 | rasdani/github-patches | git_diff | saleor__saleor-1775 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improving product variant behaviour
### Steps to reproduce the problem
1. Choose any product in dashboard
2. Remove all variants
3. Try to go to the product page in storefront
### What I expected to happen
Product page in storefront with "UNAVAILABLE" label.
### What happened instead/how it failed
500 code.
It crashes with a `list index out of range` error, because in the `update_field_data` method of `VariantChoiceField` the lines
```
if self.queryset.count() < 2:
self.widget = forms.HiddenInput(
{'value': variants.all()[0].pk})
```
are trying to index an empty queryset.
A possible solution would be to skip this block of code when `variants.all()` returns an empty queryset.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/product/forms.py`
Content:
```
1 import json
2
3 from django import forms
4 from django.utils.encoding import smart_text
5 from django.utils.translation import pgettext_lazy
6 from django_prices.templatetags.prices_i18n import gross
7
8 from ..cart.forms import AddToCartForm
9
10
11 class VariantChoiceField(forms.ModelChoiceField):
12 discounts = None
13
14 def label_from_instance(self, obj):
15 variant_label = smart_text(obj)
16 label = pgettext_lazy(
17 'Variant choice field label',
18 '%(variant_label)s - %(price)s') % {
19 'variant_label': variant_label,
20 'price': gross(
21 obj.get_price_per_item(discounts=self.discounts))}
22 return label
23
24 def update_field_data(self, variants, cart):
25 """Initialize variant picker metadata."""
26 self.queryset = variants
27 self.discounts = cart.discounts
28 self.empty_label = None
29 images_map = {
30 variant.pk: [
31 vi.image.image.url for vi in variant.variant_images.all()]
32 for variant in variants.all()}
33 self.widget.attrs['data-images'] = json.dumps(images_map)
34 # Don't display select input if there are less than two variants
35 if self.queryset.count() < 2:
36 self.widget = forms.HiddenInput(
37 {'value': variants.all()[0].pk})
38
39
40 class ProductForm(AddToCartForm):
41 variant = VariantChoiceField(queryset=None)
42
43 def __init__(self, *args, **kwargs):
44 super().__init__(*args, **kwargs)
45 variant_field = self.fields['variant']
46 variant_field.update_field_data(self.product.variants, self.cart)
47
48 def get_variant(self, cleaned_data):
49 return cleaned_data.get('variant')
50
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/product/forms.py b/saleor/product/forms.py
--- a/saleor/product/forms.py
+++ b/saleor/product/forms.py
@@ -31,8 +31,8 @@
vi.image.image.url for vi in variant.variant_images.all()]
for variant in variants.all()}
self.widget.attrs['data-images'] = json.dumps(images_map)
- # Don't display select input if there are less than two variants
- if self.queryset.count() < 2:
+ # Don't display select input if there is only one variant.
+ if self.queryset.count() == 1:
self.widget = forms.HiddenInput(
{'value': variants.all()[0].pk})
| {"golden_diff": "diff --git a/saleor/product/forms.py b/saleor/product/forms.py\n--- a/saleor/product/forms.py\n+++ b/saleor/product/forms.py\n@@ -31,8 +31,8 @@\n vi.image.image.url for vi in variant.variant_images.all()]\n for variant in variants.all()}\n self.widget.attrs['data-images'] = json.dumps(images_map)\n- # Don't display select input if there are less than two variants\n- if self.queryset.count() < 2:\n+ # Don't display select input if there is only one variant.\n+ if self.queryset.count() == 1:\n self.widget = forms.HiddenInput(\n {'value': variants.all()[0].pk})\n", "issue": "Improving product variant behaviour\n### Steps to reproduce the problem\r\n\r\n1. Choose any product in dashboard\r\n2. Remove all variants\r\n3. Try to go to the product page in storefront\r\n\r\n### What I expected to happen\r\n\r\nProduct page in storefront with \"UNAVAILABLE\" label.\r\n\r\n### What happened instead/how it failed\r\n500 code. \r\n\r\n\r\nIt is getting crashed with `list index out of range` error, because in method `update_field_data` in `VariantChoiceField` lines\r\n```\r\n if self.queryset.count() < 2:\r\n self.widget = forms.HiddenInput(\r\n {'value': variants.all()[0].pk})\r\n```\r\nare trying to get index of empty query set.\r\nPossible solution would be to omit this block of code if `variants.all()` is `False`.\n", "before_files": [{"content": "import json\n\nfrom django import forms\nfrom django.utils.encoding import smart_text\nfrom django.utils.translation import pgettext_lazy\nfrom django_prices.templatetags.prices_i18n import gross\n\nfrom ..cart.forms import AddToCartForm\n\n\nclass VariantChoiceField(forms.ModelChoiceField):\n discounts = None\n\n def label_from_instance(self, obj):\n variant_label = smart_text(obj)\n label = pgettext_lazy(\n 'Variant choice field label',\n '%(variant_label)s - %(price)s') % {\n 'variant_label': variant_label,\n 'price': gross(\n obj.get_price_per_item(discounts=self.discounts))}\n return label\n\n def update_field_data(self, variants, cart):\n \"\"\"Initialize variant picker metadata.\"\"\"\n self.queryset = variants\n self.discounts = cart.discounts\n self.empty_label = None\n images_map = {\n variant.pk: [\n vi.image.image.url for vi in variant.variant_images.all()]\n for variant in variants.all()}\n self.widget.attrs['data-images'] = json.dumps(images_map)\n # Don't display select input if there are less than two variants\n if self.queryset.count() < 2:\n self.widget = forms.HiddenInput(\n {'value': variants.all()[0].pk})\n\n\nclass ProductForm(AddToCartForm):\n variant = VariantChoiceField(queryset=None)\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n variant_field = self.fields['variant']\n variant_field.update_field_data(self.product.variants, self.cart)\n\n def get_variant(self, cleaned_data):\n return cleaned_data.get('variant')\n", "path": "saleor/product/forms.py"}], "after_files": [{"content": "import json\n\nfrom django import forms\nfrom django.utils.encoding import smart_text\nfrom django.utils.translation import pgettext_lazy\nfrom django_prices.templatetags.prices_i18n import gross\n\nfrom ..cart.forms import AddToCartForm\n\n\nclass VariantChoiceField(forms.ModelChoiceField):\n discounts = None\n\n def label_from_instance(self, obj):\n variant_label = smart_text(obj)\n label = pgettext_lazy(\n 'Variant choice field label',\n '%(variant_label)s - %(price)s') % {\n 'variant_label': variant_label,\n 'price': gross(\n obj.get_price_per_item(discounts=self.discounts))}\n return label\n\n def 
update_field_data(self, variants, cart):\n \"\"\"Initialize variant picker metadata.\"\"\"\n self.queryset = variants\n self.discounts = cart.discounts\n self.empty_label = None\n images_map = {\n variant.pk: [\n vi.image.image.url for vi in variant.variant_images.all()]\n for variant in variants.all()}\n self.widget.attrs['data-images'] = json.dumps(images_map)\n # Don't display select input if there is only one variant.\n if self.queryset.count() == 1:\n self.widget = forms.HiddenInput(\n {'value': variants.all()[0].pk})\n\n\nclass ProductForm(AddToCartForm):\n variant = VariantChoiceField(queryset=None)\n\n def __init__(self, *args, **kwargs):\n super().__init__(*args, **kwargs)\n variant_field = self.fields['variant']\n variant_field.update_field_data(self.product.variants, self.cart)\n\n def get_variant(self, cleaned_data):\n return cleaned_data.get('variant')\n", "path": "saleor/product/forms.py"}]} | 877 | 156 |
gh_patches_debug_6255 | rasdani/github-patches | git_diff | qtile__qtile-180 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
using Qtile.cmd_{info, debug} doesn't switch logging level
It seems that critical, warning, and error do all work, though.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `libqtile/log_utils.py`
Content:
```
1 import logging
2 import os
3 import sys
4 from logging import getLogger, StreamHandler
5
6
7 class ColorFormatter(logging.Formatter):
8 """Logging formatter adding console colors to the output.
9 """
10 black, red, green, yellow, blue, magenta, cyan, white = range(8)
11 colors = {
12 'WARNING': yellow,
13 'INFO': green,
14 'DEBUG': blue,
15 'CRITICAL': yellow,
16 'ERROR': red,
17 'RED': red,
18 'GREEN': green,
19 'YELLOW': yellow,
20 'BLUE': blue,
21 'MAGENTA': magenta,
22 'CYAN': cyan,
23 'WHITE': white}
24 reset_seq = '\033[0m'
25 color_seq = '\033[%dm'
26 bold_seq = '\033[1m'
27
28 def format(self, record):
29 """Format the record with colors."""
30 color = self.color_seq % (30 + self.colors[record.levelname])
31 message = logging.Formatter.format(self, record)
32 message = message.replace('$RESET', self.reset_seq)\
33 .replace('$BOLD', self.bold_seq)\
34 .replace('$COLOR', color)
35 for color, value in self.colors.items():
36 message = message.replace(
37 '$' + color, self.color_seq % (value + 30))\
38 .replace('$BG' + color, self.color_seq % (value + 40))\
39 .replace('$BG-' + color, self.color_seq % (value + 40))
40 return message + self.reset_seq
41
42
43 def init_log(log_level=logging.WARNING, logger='qtile'):
44 handler = logging.FileHandler(
45 os.path.expanduser('~/.%s.log' % logger))
46 handler.setLevel(logging.WARNING)
47 handler.setFormatter(
48 logging.Formatter(
49 "%(asctime)s %(levelname)s %(funcName)s:%(lineno)d %(message)s"))
50 log = getLogger(logger)
51 log.setLevel(log_level)
52 log.addHandler(handler)
53 log.warning('Starting %s' % logger.title())
54 handler = StreamHandler(sys.stderr)
55 handler.setFormatter(
56 ColorFormatter(
57 '$RESET$COLOR%(asctime)s $BOLD$COLOR%(name)s'
58 ' %(funcName)s:%(lineno)d $RESET %(message)s'))
59 log.addHandler(handler)
60 return log
61
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/libqtile/log_utils.py b/libqtile/log_utils.py
--- a/libqtile/log_utils.py
+++ b/libqtile/log_utils.py
@@ -43,7 +43,6 @@
def init_log(log_level=logging.WARNING, logger='qtile'):
handler = logging.FileHandler(
os.path.expanduser('~/.%s.log' % logger))
- handler.setLevel(logging.WARNING)
handler.setFormatter(
logging.Formatter(
"%(asctime)s %(levelname)s %(funcName)s:%(lineno)d %(message)s"))
| {"golden_diff": "diff --git a/libqtile/log_utils.py b/libqtile/log_utils.py\n--- a/libqtile/log_utils.py\n+++ b/libqtile/log_utils.py\n@@ -43,7 +43,6 @@\n def init_log(log_level=logging.WARNING, logger='qtile'):\n handler = logging.FileHandler(\n os.path.expanduser('~/.%s.log' % logger))\n- handler.setLevel(logging.WARNING)\n handler.setFormatter(\n logging.Formatter(\n \"%(asctime)s %(levelname)s %(funcName)s:%(lineno)d %(message)s\"))\n", "issue": "using Qtile.cmd_{info, debug} doesn't switch logging level\nIt seems that critical, warning, and error do all work, though.\n\n", "before_files": [{"content": "import logging\nimport os\nimport sys\nfrom logging import getLogger, StreamHandler\n\n\nclass ColorFormatter(logging.Formatter):\n \"\"\"Logging formatter adding console colors to the output.\n \"\"\"\n black, red, green, yellow, blue, magenta, cyan, white = range(8)\n colors = {\n 'WARNING': yellow,\n 'INFO': green,\n 'DEBUG': blue,\n 'CRITICAL': yellow,\n 'ERROR': red,\n 'RED': red,\n 'GREEN': green,\n 'YELLOW': yellow,\n 'BLUE': blue,\n 'MAGENTA': magenta,\n 'CYAN': cyan,\n 'WHITE': white}\n reset_seq = '\\033[0m'\n color_seq = '\\033[%dm'\n bold_seq = '\\033[1m'\n\n def format(self, record):\n \"\"\"Format the record with colors.\"\"\"\n color = self.color_seq % (30 + self.colors[record.levelname])\n message = logging.Formatter.format(self, record)\n message = message.replace('$RESET', self.reset_seq)\\\n .replace('$BOLD', self.bold_seq)\\\n .replace('$COLOR', color)\n for color, value in self.colors.items():\n message = message.replace(\n '$' + color, self.color_seq % (value + 30))\\\n .replace('$BG' + color, self.color_seq % (value + 40))\\\n .replace('$BG-' + color, self.color_seq % (value + 40))\n return message + self.reset_seq\n\n\ndef init_log(log_level=logging.WARNING, logger='qtile'):\n handler = logging.FileHandler(\n os.path.expanduser('~/.%s.log' % logger))\n handler.setLevel(logging.WARNING)\n handler.setFormatter(\n logging.Formatter(\n \"%(asctime)s %(levelname)s %(funcName)s:%(lineno)d %(message)s\"))\n log = getLogger(logger)\n log.setLevel(log_level)\n log.addHandler(handler)\n log.warning('Starting %s' % logger.title())\n handler = StreamHandler(sys.stderr)\n handler.setFormatter(\n ColorFormatter(\n '$RESET$COLOR%(asctime)s $BOLD$COLOR%(name)s'\n ' %(funcName)s:%(lineno)d $RESET %(message)s'))\n log.addHandler(handler)\n return log\n", "path": "libqtile/log_utils.py"}], "after_files": [{"content": "import logging\nimport os\nimport sys\nfrom logging import getLogger, StreamHandler\n\n\nclass ColorFormatter(logging.Formatter):\n \"\"\"Logging formatter adding console colors to the output.\n \"\"\"\n black, red, green, yellow, blue, magenta, cyan, white = range(8)\n colors = {\n 'WARNING': yellow,\n 'INFO': green,\n 'DEBUG': blue,\n 'CRITICAL': yellow,\n 'ERROR': red,\n 'RED': red,\n 'GREEN': green,\n 'YELLOW': yellow,\n 'BLUE': blue,\n 'MAGENTA': magenta,\n 'CYAN': cyan,\n 'WHITE': white}\n reset_seq = '\\033[0m'\n color_seq = '\\033[%dm'\n bold_seq = '\\033[1m'\n\n def format(self, record):\n \"\"\"Format the record with colors.\"\"\"\n color = self.color_seq % (30 + self.colors[record.levelname])\n message = logging.Formatter.format(self, record)\n message = message.replace('$RESET', self.reset_seq)\\\n .replace('$BOLD', self.bold_seq)\\\n .replace('$COLOR', color)\n for color, value in self.colors.items():\n message = message.replace(\n '$' + color, self.color_seq % (value + 30))\\\n .replace('$BG' + color, self.color_seq % (value + 40))\\\n .replace('$BG-' 
+ color, self.color_seq % (value + 40))\n return message + self.reset_seq\n\n\ndef init_log(log_level=logging.WARNING, logger='qtile'):\n handler = logging.FileHandler(\n os.path.expanduser('~/.%s.log' % logger))\n handler.setFormatter(\n logging.Formatter(\n \"%(asctime)s %(levelname)s %(funcName)s:%(lineno)d %(message)s\"))\n log = getLogger(logger)\n log.setLevel(log_level)\n log.addHandler(handler)\n log.warning('Starting %s' % logger.title())\n handler = StreamHandler(sys.stderr)\n handler.setFormatter(\n ColorFormatter(\n '$RESET$COLOR%(asctime)s $BOLD$COLOR%(name)s'\n ' %(funcName)s:%(lineno)d $RESET %(message)s'))\n log.addHandler(handler)\n return log\n", "path": "libqtile/log_utils.py"}]} | 896 | 118 |
gh_patches_debug_20514 | rasdani/github-patches | git_diff | liqd__a4-product-149 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
tile images on partner page are not cut to same size

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `liqd_product/config/urls.py`
Content:
```
1 """Beteiligung.in URL Configuration."""
2
3 from ckeditor_uploader import views as ck_views
4 from django.conf import settings
5 from django.conf.urls import include
6 from django.conf.urls import url
7 from django.contrib import admin
8 from django.views.decorators.cache import never_cache
9 from django.views.i18n import javascript_catalog
10 from rest_framework import routers
11
12 from adhocracy4.api import routers as a4routers
13 from adhocracy4.comments.api import CommentViewSet
14 from adhocracy4.follows.api import FollowViewSet
15 from adhocracy4.ratings.api import RatingViewSet
16 from adhocracy4.reports.api import ReportViewSet
17 from liqd_product.apps.partners.urlresolvers import partner_patterns
18 from liqd_product.apps.users.decorators import user_is_project_admin
19 from meinberlin.apps.documents.api import DocumentViewSet
20 from meinberlin.apps.polls.api import PollViewSet
21 from meinberlin.apps.polls.api import VoteViewSet
22 from meinberlin.apps.polls.routers import QuestionDefaultRouter
23
24 js_info_dict = {
25 'packages': ('adhocracy4.comments',),
26 }
27
28 router = routers.DefaultRouter()
29 router.register(r'follows', FollowViewSet, base_name='follows')
30 router.register(r'reports', ReportViewSet, base_name='reports')
31 router.register(r'polls', PollViewSet, base_name='polls')
32
33 module_router = a4routers.ModuleDefaultRouter()
34 # FIXME: rename to 'chapters'
35 module_router.register(r'documents', DocumentViewSet, base_name='chapters')
36
37 orga_router = a4routers.OrganisationDefaultRouter()
38
39 ct_router = a4routers.ContentTypeDefaultRouter()
40 ct_router.register(r'comments', CommentViewSet, base_name='comments')
41 ct_router.register(r'ratings', RatingViewSet, base_name='ratings')
42
43 question_router = QuestionDefaultRouter()
44 question_router.register(r'vote', VoteViewSet, base_name='vote')
45
46
47 urlpatterns = [
48 # General platform urls
49 url(r'^django-admin/', include(admin.site.urls)),
50 url(r'^admin/', include('wagtail.wagtailadmin.urls')),
51
52 url(r'^accounts/', include('allauth.urls')),
53 url(r'^account/', include('liqd_product.apps.account.urls')),
54 url(r'^embed/', include('meinberlin.apps.embed.urls')),
55 url(r'^dashboard/', include('meinberlin.apps.dashboard2.urls')),
56 url(r'^profile/', include('liqd_product.apps.users.urls')),
57
58 # API urls
59 url(r'^api/', include(ct_router.urls)),
60 url(r'^api/', include(module_router.urls)),
61 url(r'^api/', include(orga_router.urls)),
62 url(r'^api/', include(question_router.urls)),
63 url(r'^api/', include(router.urls)),
64
65 url(r'^upload/', user_is_project_admin(ck_views.upload),
66 name='ckeditor_upload'),
67 url(r'^browse/', never_cache(user_is_project_admin(ck_views.browse)),
68 name='ckeditor_browse'),
69
70 url(r'^jsi18n/$', javascript_catalog,
71 js_info_dict, name='javascript-catalog'),
72
73 # Urls within the context of a partner
74 partner_patterns(
75 url(r'^modules/', include('adhocracy4.modules.urls')),
76 url(r'^projects/', include('adhocracy4.projects.urls')),
77 url(r'^offlineevents/', include('meinberlin.apps.offlineevents.urls',
78 namespace='meinberlin_offlineevents')),
79 url(r'^ideas/', include(r'meinberlin.apps.ideas.urls',
80 namespace='meinberlin_ideas')),
81 url(r'^mapideas/', include('meinberlin.apps.mapideas.urls',
82 namespace='meinberlin_mapideas')),
83 url(r'^text/', include('meinberlin.apps.documents.urls',
84 namespace='meinberlin_documents')),
85 ),
86
87 url(r'', include('liqd_product.apps.partners.urls')),
88 url(r'', include('wagtail.wagtailcore.urls'))
89 ]
90
91
92 if settings.DEBUG:
93 from django.conf.urls.static import static
94 from django.contrib.staticfiles.urls import staticfiles_urlpatterns
95
96 # Serve static and media locally
97 urlpatterns += staticfiles_urlpatterns()
98 urlpatterns += static(settings.MEDIA_URL,
99 document_root=settings.MEDIA_ROOT)
100 try:
101 import debug_toolbar
102 except ImportError:
103 pass
104 else:
105 urlpatterns = [
106 url(r'^__debug__/', include(debug_toolbar.urls)),
107 ] + urlpatterns
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/liqd_product/config/urls.py b/liqd_product/config/urls.py
--- a/liqd_product/config/urls.py
+++ b/liqd_product/config/urls.py
@@ -16,6 +16,7 @@
from adhocracy4.reports.api import ReportViewSet
from liqd_product.apps.partners.urlresolvers import partner_patterns
from liqd_product.apps.users.decorators import user_is_project_admin
+from meinberlin.apps.contrib import views as contrib_views
from meinberlin.apps.documents.api import DocumentViewSet
from meinberlin.apps.polls.api import PollViewSet
from meinberlin.apps.polls.api import VoteViewSet
@@ -67,6 +68,7 @@
url(r'^browse/', never_cache(user_is_project_admin(ck_views.browse)),
name='ckeditor_browse'),
+ url(r'^components/$', contrib_views.ComponentLibraryView.as_view()),
url(r'^jsi18n/$', javascript_catalog,
js_info_dict, name='javascript-catalog'),
| {"golden_diff": "diff --git a/liqd_product/config/urls.py b/liqd_product/config/urls.py\n--- a/liqd_product/config/urls.py\n+++ b/liqd_product/config/urls.py\n@@ -16,6 +16,7 @@\n from adhocracy4.reports.api import ReportViewSet\n from liqd_product.apps.partners.urlresolvers import partner_patterns\n from liqd_product.apps.users.decorators import user_is_project_admin\n+from meinberlin.apps.contrib import views as contrib_views\n from meinberlin.apps.documents.api import DocumentViewSet\n from meinberlin.apps.polls.api import PollViewSet\n from meinberlin.apps.polls.api import VoteViewSet\n@@ -67,6 +68,7 @@\n url(r'^browse/', never_cache(user_is_project_admin(ck_views.browse)),\n name='ckeditor_browse'),\n \n+ url(r'^components/$', contrib_views.ComponentLibraryView.as_view()),\n url(r'^jsi18n/$', javascript_catalog,\n js_info_dict, name='javascript-catalog'),\n", "issue": "tile images on partner page are not cut to same size\n\r\n\n", "before_files": [{"content": "\"\"\"Beteiligung.in URL Configuration.\"\"\"\n\nfrom ckeditor_uploader import views as ck_views\nfrom django.conf import settings\nfrom django.conf.urls import include\nfrom django.conf.urls import url\nfrom django.contrib import admin\nfrom django.views.decorators.cache import never_cache\nfrom django.views.i18n import javascript_catalog\nfrom rest_framework import routers\n\nfrom adhocracy4.api import routers as a4routers\nfrom adhocracy4.comments.api import CommentViewSet\nfrom adhocracy4.follows.api import FollowViewSet\nfrom adhocracy4.ratings.api import RatingViewSet\nfrom adhocracy4.reports.api import ReportViewSet\nfrom liqd_product.apps.partners.urlresolvers import partner_patterns\nfrom liqd_product.apps.users.decorators import user_is_project_admin\nfrom meinberlin.apps.documents.api import DocumentViewSet\nfrom meinberlin.apps.polls.api import PollViewSet\nfrom meinberlin.apps.polls.api import VoteViewSet\nfrom meinberlin.apps.polls.routers import QuestionDefaultRouter\n\njs_info_dict = {\n 'packages': ('adhocracy4.comments',),\n}\n\nrouter = routers.DefaultRouter()\nrouter.register(r'follows', FollowViewSet, base_name='follows')\nrouter.register(r'reports', ReportViewSet, base_name='reports')\nrouter.register(r'polls', PollViewSet, base_name='polls')\n\nmodule_router = a4routers.ModuleDefaultRouter()\n# FIXME: rename to 'chapters'\nmodule_router.register(r'documents', DocumentViewSet, base_name='chapters')\n\norga_router = a4routers.OrganisationDefaultRouter()\n\nct_router = a4routers.ContentTypeDefaultRouter()\nct_router.register(r'comments', CommentViewSet, base_name='comments')\nct_router.register(r'ratings', RatingViewSet, base_name='ratings')\n\nquestion_router = QuestionDefaultRouter()\nquestion_router.register(r'vote', VoteViewSet, base_name='vote')\n\n\nurlpatterns = [\n # General platform urls\n url(r'^django-admin/', include(admin.site.urls)),\n url(r'^admin/', include('wagtail.wagtailadmin.urls')),\n\n url(r'^accounts/', include('allauth.urls')),\n url(r'^account/', include('liqd_product.apps.account.urls')),\n url(r'^embed/', include('meinberlin.apps.embed.urls')),\n url(r'^dashboard/', include('meinberlin.apps.dashboard2.urls')),\n url(r'^profile/', include('liqd_product.apps.users.urls')),\n\n # API urls\n url(r'^api/', include(ct_router.urls)),\n url(r'^api/', include(module_router.urls)),\n url(r'^api/', include(orga_router.urls)),\n url(r'^api/', include(question_router.urls)),\n url(r'^api/', include(router.urls)),\n\n url(r'^upload/', user_is_project_admin(ck_views.upload),\n 
name='ckeditor_upload'),\n url(r'^browse/', never_cache(user_is_project_admin(ck_views.browse)),\n name='ckeditor_browse'),\n\n url(r'^jsi18n/$', javascript_catalog,\n js_info_dict, name='javascript-catalog'),\n\n # Urls within the context of a partner\n partner_patterns(\n url(r'^modules/', include('adhocracy4.modules.urls')),\n url(r'^projects/', include('adhocracy4.projects.urls')),\n url(r'^offlineevents/', include('meinberlin.apps.offlineevents.urls',\n namespace='meinberlin_offlineevents')),\n url(r'^ideas/', include(r'meinberlin.apps.ideas.urls',\n namespace='meinberlin_ideas')),\n url(r'^mapideas/', include('meinberlin.apps.mapideas.urls',\n namespace='meinberlin_mapideas')),\n url(r'^text/', include('meinberlin.apps.documents.urls',\n namespace='meinberlin_documents')),\n ),\n\n url(r'', include('liqd_product.apps.partners.urls')),\n url(r'', include('wagtail.wagtailcore.urls'))\n]\n\n\nif settings.DEBUG:\n from django.conf.urls.static import static\n from django.contrib.staticfiles.urls import staticfiles_urlpatterns\n\n # Serve static and media locally\n urlpatterns += staticfiles_urlpatterns()\n urlpatterns += static(settings.MEDIA_URL,\n document_root=settings.MEDIA_ROOT)\n try:\n import debug_toolbar\n except ImportError:\n pass\n else:\n urlpatterns = [\n url(r'^__debug__/', include(debug_toolbar.urls)),\n ] + urlpatterns\n", "path": "liqd_product/config/urls.py"}], "after_files": [{"content": "\"\"\"Beteiligung.in URL Configuration.\"\"\"\n\nfrom ckeditor_uploader import views as ck_views\nfrom django.conf import settings\nfrom django.conf.urls import include\nfrom django.conf.urls import url\nfrom django.contrib import admin\nfrom django.views.decorators.cache import never_cache\nfrom django.views.i18n import javascript_catalog\nfrom rest_framework import routers\n\nfrom adhocracy4.api import routers as a4routers\nfrom adhocracy4.comments.api import CommentViewSet\nfrom adhocracy4.follows.api import FollowViewSet\nfrom adhocracy4.ratings.api import RatingViewSet\nfrom adhocracy4.reports.api import ReportViewSet\nfrom liqd_product.apps.partners.urlresolvers import partner_patterns\nfrom liqd_product.apps.users.decorators import user_is_project_admin\nfrom meinberlin.apps.contrib import views as contrib_views\nfrom meinberlin.apps.documents.api import DocumentViewSet\nfrom meinberlin.apps.polls.api import PollViewSet\nfrom meinberlin.apps.polls.api import VoteViewSet\nfrom meinberlin.apps.polls.routers import QuestionDefaultRouter\n\njs_info_dict = {\n 'packages': ('adhocracy4.comments',),\n}\n\nrouter = routers.DefaultRouter()\nrouter.register(r'follows', FollowViewSet, base_name='follows')\nrouter.register(r'reports', ReportViewSet, base_name='reports')\nrouter.register(r'polls', PollViewSet, base_name='polls')\n\nmodule_router = a4routers.ModuleDefaultRouter()\n# FIXME: rename to 'chapters'\nmodule_router.register(r'documents', DocumentViewSet, base_name='chapters')\n\norga_router = a4routers.OrganisationDefaultRouter()\n\nct_router = a4routers.ContentTypeDefaultRouter()\nct_router.register(r'comments', CommentViewSet, base_name='comments')\nct_router.register(r'ratings', RatingViewSet, base_name='ratings')\n\nquestion_router = QuestionDefaultRouter()\nquestion_router.register(r'vote', VoteViewSet, base_name='vote')\n\n\nurlpatterns = [\n # General platform urls\n url(r'^django-admin/', include(admin.site.urls)),\n url(r'^admin/', include('wagtail.wagtailadmin.urls')),\n\n url(r'^accounts/', include('allauth.urls')),\n url(r'^account/', 
include('liqd_product.apps.account.urls')),\n url(r'^embed/', include('meinberlin.apps.embed.urls')),\n url(r'^dashboard/', include('meinberlin.apps.dashboard2.urls')),\n url(r'^profile/', include('liqd_product.apps.users.urls')),\n\n # API urls\n url(r'^api/', include(ct_router.urls)),\n url(r'^api/', include(module_router.urls)),\n url(r'^api/', include(orga_router.urls)),\n url(r'^api/', include(question_router.urls)),\n url(r'^api/', include(router.urls)),\n\n url(r'^upload/', user_is_project_admin(ck_views.upload),\n name='ckeditor_upload'),\n url(r'^browse/', never_cache(user_is_project_admin(ck_views.browse)),\n name='ckeditor_browse'),\n\n url(r'^components/$', contrib_views.ComponentLibraryView.as_view()),\n url(r'^jsi18n/$', javascript_catalog,\n js_info_dict, name='javascript-catalog'),\n\n # Urls within the context of a partner\n partner_patterns(\n url(r'^modules/', include('adhocracy4.modules.urls')),\n url(r'^projects/', include('adhocracy4.projects.urls')),\n url(r'^offlineevents/', include('meinberlin.apps.offlineevents.urls',\n namespace='meinberlin_offlineevents')),\n url(r'^ideas/', include(r'meinberlin.apps.ideas.urls',\n namespace='meinberlin_ideas')),\n url(r'^mapideas/', include('meinberlin.apps.mapideas.urls',\n namespace='meinberlin_mapideas')),\n url(r'^text/', include('meinberlin.apps.documents.urls',\n namespace='meinberlin_documents')),\n ),\n\n url(r'', include('liqd_product.apps.partners.urls')),\n url(r'', include('wagtail.wagtailcore.urls'))\n]\n\n\nif settings.DEBUG:\n from django.conf.urls.static import static\n from django.contrib.staticfiles.urls import staticfiles_urlpatterns\n\n # Serve static and media locally\n urlpatterns += staticfiles_urlpatterns()\n urlpatterns += static(settings.MEDIA_URL,\n document_root=settings.MEDIA_ROOT)\n try:\n import debug_toolbar\n except ImportError:\n pass\n else:\n urlpatterns = [\n url(r'^__debug__/', include(debug_toolbar.urls)),\n ] + urlpatterns\n", "path": "liqd_product/config/urls.py"}]} | 1,511 | 216 |
gh_patches_debug_30361 | rasdani/github-patches | git_diff | pytorch__ignite-1771 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Patch MNIST dataset downloading to fix CI
Currently, there is an issue with downloading MNIST dataset using torchvision. Let's introduce the following patch to our CI to fix it:
- https://github.com/pytorch/vision/issues/3500#issuecomment-790491487
Where to put that:
- create new step with the patch before [here](https://github.com/pytorch/ignite/blob/700f0e1325efc5dc0dce88d26284e51bc2a7c87c/.github/workflows/unit-tests.yml#L106)
- add patch [here](https://github.com/pytorch/ignite/blob/700f0e1325efc5dc0dce88d26284e51bc2a7c87c/.circleci/config.yml#L147).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `examples/mnist/mnist_patch.py`
Content:
```
1 """Patch to fix MNIST download issue as described here:
2 - https://github.com/pytorch/ignite/issues/1737
3 - https://github.com/pytorch/vision/issues/3500
4 """
5
6 import os
7 import subprocess as sp
8
9 import torch
10 from torchvision.datasets.mnist import MNIST, read_image_file, read_label_file
11 from torchvision.datasets.utils import extract_archive
12
13
14 def patched_download(self):
15 """wget patched download method.
16 """
17 if self._check_exists():
18 return
19
20 os.makedirs(self.raw_folder, exist_ok=True)
21 os.makedirs(self.processed_folder, exist_ok=True)
22
23 # download files
24 for url, md5 in self.resources:
25 filename = url.rpartition("/")[2]
26 download_root = os.path.expanduser(self.raw_folder)
27 extract_root = None
28 remove_finished = False
29
30 if extract_root is None:
31 extract_root = download_root
32 if not filename:
33 filename = os.path.basename(url)
34
35 # Use wget to download archives
36 sp.run(["wget", url, "-P", download_root])
37
38 archive = os.path.join(download_root, filename)
39 print("Extracting {} to {}".format(archive, extract_root))
40 extract_archive(archive, extract_root, remove_finished)
41
42 # process and save as torch files
43 print("Processing...")
44
45 training_set = (
46 read_image_file(os.path.join(self.raw_folder, "train-images-idx3-ubyte")),
47 read_label_file(os.path.join(self.raw_folder, "train-labels-idx1-ubyte")),
48 )
49 test_set = (
50 read_image_file(os.path.join(self.raw_folder, "t10k-images-idx3-ubyte")),
51 read_label_file(os.path.join(self.raw_folder, "t10k-labels-idx1-ubyte")),
52 )
53 with open(os.path.join(self.processed_folder, self.training_file), "wb") as f:
54 torch.save(training_set, f)
55 with open(os.path.join(self.processed_folder, self.test_file), "wb") as f:
56 torch.save(test_set, f)
57
58 print("Done!")
59
60
61 def main():
62 # Patch download method
63 MNIST.download = patched_download
64 # Download MNIST
65 MNIST(".", download=True)
66
67
68 if __name__ == "__main__":
69 main()
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/examples/mnist/mnist_patch.py b/examples/mnist/mnist_patch.py
deleted file mode 100644
--- a/examples/mnist/mnist_patch.py
+++ /dev/null
@@ -1,69 +0,0 @@
-"""Patch to fix MNIST download issue as described here:
-- https://github.com/pytorch/ignite/issues/1737
-- https://github.com/pytorch/vision/issues/3500
-"""
-
-import os
-import subprocess as sp
-
-import torch
-from torchvision.datasets.mnist import MNIST, read_image_file, read_label_file
-from torchvision.datasets.utils import extract_archive
-
-
-def patched_download(self):
- """wget patched download method.
- """
- if self._check_exists():
- return
-
- os.makedirs(self.raw_folder, exist_ok=True)
- os.makedirs(self.processed_folder, exist_ok=True)
-
- # download files
- for url, md5 in self.resources:
- filename = url.rpartition("/")[2]
- download_root = os.path.expanduser(self.raw_folder)
- extract_root = None
- remove_finished = False
-
- if extract_root is None:
- extract_root = download_root
- if not filename:
- filename = os.path.basename(url)
-
- # Use wget to download archives
- sp.run(["wget", url, "-P", download_root])
-
- archive = os.path.join(download_root, filename)
- print("Extracting {} to {}".format(archive, extract_root))
- extract_archive(archive, extract_root, remove_finished)
-
- # process and save as torch files
- print("Processing...")
-
- training_set = (
- read_image_file(os.path.join(self.raw_folder, "train-images-idx3-ubyte")),
- read_label_file(os.path.join(self.raw_folder, "train-labels-idx1-ubyte")),
- )
- test_set = (
- read_image_file(os.path.join(self.raw_folder, "t10k-images-idx3-ubyte")),
- read_label_file(os.path.join(self.raw_folder, "t10k-labels-idx1-ubyte")),
- )
- with open(os.path.join(self.processed_folder, self.training_file), "wb") as f:
- torch.save(training_set, f)
- with open(os.path.join(self.processed_folder, self.test_file), "wb") as f:
- torch.save(test_set, f)
-
- print("Done!")
-
-
-def main():
- # Patch download method
- MNIST.download = patched_download
- # Download MNIST
- MNIST(".", download=True)
-
-
-if __name__ == "__main__":
- main()
| {"golden_diff": "diff --git a/examples/mnist/mnist_patch.py b/examples/mnist/mnist_patch.py\ndeleted file mode 100644\n--- a/examples/mnist/mnist_patch.py\n+++ /dev/null\n@@ -1,69 +0,0 @@\n-\"\"\"Patch to fix MNIST download issue as described here:\n-- https://github.com/pytorch/ignite/issues/1737\n-- https://github.com/pytorch/vision/issues/3500\n-\"\"\"\n-\n-import os\n-import subprocess as sp\n-\n-import torch\n-from torchvision.datasets.mnist import MNIST, read_image_file, read_label_file\n-from torchvision.datasets.utils import extract_archive\n-\n-\n-def patched_download(self):\n- \"\"\"wget patched download method.\n- \"\"\"\n- if self._check_exists():\n- return\n-\n- os.makedirs(self.raw_folder, exist_ok=True)\n- os.makedirs(self.processed_folder, exist_ok=True)\n-\n- # download files\n- for url, md5 in self.resources:\n- filename = url.rpartition(\"/\")[2]\n- download_root = os.path.expanduser(self.raw_folder)\n- extract_root = None\n- remove_finished = False\n-\n- if extract_root is None:\n- extract_root = download_root\n- if not filename:\n- filename = os.path.basename(url)\n-\n- # Use wget to download archives\n- sp.run([\"wget\", url, \"-P\", download_root])\n-\n- archive = os.path.join(download_root, filename)\n- print(\"Extracting {} to {}\".format(archive, extract_root))\n- extract_archive(archive, extract_root, remove_finished)\n-\n- # process and save as torch files\n- print(\"Processing...\")\n-\n- training_set = (\n- read_image_file(os.path.join(self.raw_folder, \"train-images-idx3-ubyte\")),\n- read_label_file(os.path.join(self.raw_folder, \"train-labels-idx1-ubyte\")),\n- )\n- test_set = (\n- read_image_file(os.path.join(self.raw_folder, \"t10k-images-idx3-ubyte\")),\n- read_label_file(os.path.join(self.raw_folder, \"t10k-labels-idx1-ubyte\")),\n- )\n- with open(os.path.join(self.processed_folder, self.training_file), \"wb\") as f:\n- torch.save(training_set, f)\n- with open(os.path.join(self.processed_folder, self.test_file), \"wb\") as f:\n- torch.save(test_set, f)\n-\n- print(\"Done!\")\n-\n-\n-def main():\n- # Patch download method\n- MNIST.download = patched_download\n- # Download MNIST\n- MNIST(\".\", download=True)\n-\n-\n-if __name__ == \"__main__\":\n- main()\n", "issue": "Patch MNIST dataset downloading to fix CI\nCurrently, there is an issue with downloading MNIST dataset using torchvision. 
Let's introduce the following patch to our CI to fix it:\r\n- https://github.com/pytorch/vision/issues/3500#issuecomment-790491487\r\n\r\nWhere to put that:\r\n- create new step with the patch before [here](https://github.com/pytorch/ignite/blob/700f0e1325efc5dc0dce88d26284e51bc2a7c87c/.github/workflows/unit-tests.yml#L106)\r\n- add patch [here](https://github.com/pytorch/ignite/blob/700f0e1325efc5dc0dce88d26284e51bc2a7c87c/.circleci/config.yml#L147).\n", "before_files": [{"content": "\"\"\"Patch to fix MNIST download issue as described here:\n- https://github.com/pytorch/ignite/issues/1737\n- https://github.com/pytorch/vision/issues/3500\n\"\"\"\n\nimport os\nimport subprocess as sp\n\nimport torch\nfrom torchvision.datasets.mnist import MNIST, read_image_file, read_label_file\nfrom torchvision.datasets.utils import extract_archive\n\n\ndef patched_download(self):\n \"\"\"wget patched download method.\n \"\"\"\n if self._check_exists():\n return\n\n os.makedirs(self.raw_folder, exist_ok=True)\n os.makedirs(self.processed_folder, exist_ok=True)\n\n # download files\n for url, md5 in self.resources:\n filename = url.rpartition(\"/\")[2]\n download_root = os.path.expanduser(self.raw_folder)\n extract_root = None\n remove_finished = False\n\n if extract_root is None:\n extract_root = download_root\n if not filename:\n filename = os.path.basename(url)\n\n # Use wget to download archives\n sp.run([\"wget\", url, \"-P\", download_root])\n\n archive = os.path.join(download_root, filename)\n print(\"Extracting {} to {}\".format(archive, extract_root))\n extract_archive(archive, extract_root, remove_finished)\n\n # process and save as torch files\n print(\"Processing...\")\n\n training_set = (\n read_image_file(os.path.join(self.raw_folder, \"train-images-idx3-ubyte\")),\n read_label_file(os.path.join(self.raw_folder, \"train-labels-idx1-ubyte\")),\n )\n test_set = (\n read_image_file(os.path.join(self.raw_folder, \"t10k-images-idx3-ubyte\")),\n read_label_file(os.path.join(self.raw_folder, \"t10k-labels-idx1-ubyte\")),\n )\n with open(os.path.join(self.processed_folder, self.training_file), \"wb\") as f:\n torch.save(training_set, f)\n with open(os.path.join(self.processed_folder, self.test_file), \"wb\") as f:\n torch.save(test_set, f)\n\n print(\"Done!\")\n\n\ndef main():\n # Patch download method\n MNIST.download = patched_download\n # Download MNIST\n MNIST(\".\", download=True)\n\n\nif __name__ == \"__main__\":\n main()\n", "path": "examples/mnist/mnist_patch.py"}], "after_files": [{"content": null, "path": "examples/mnist/mnist_patch.py"}]} | 1,095 | 600 |
gh_patches_debug_8885 | rasdani/github-patches | git_diff | certbot__certbot-5861 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[1] Standardize how you’ll run mypy
From http://blog.zulip.org/2016/10/13/static-types-in-python-oh-mypy/:
Write tooling to [install](https://github.com/zulip/zulip/blob/master/tools/install-mypy) and [run](https://github.com/zulip/zulip/blob/master/tools/run-mypy) `mypy` against your codebase, so that everyone using the project can run the type checker the same way. Two features are important in how you run mypy:
- Support for determining which files should be checked (a whitelist/exclude list is useful!).
- Specifying the correct flags for your project at this time. For a Python 2 project, I recommend starting with `mypy --py2 --silent-imports --fast-parser -i <paths>`. You should be able to do this using a [mypy.ini](http://mypy.readthedocs.io/en/latest/config_file.html) file.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import codecs
2 import os
3 import re
4 import sys
5
6 from setuptools import setup
7 from setuptools import find_packages
8
9 # Workaround for http://bugs.python.org/issue8876, see
10 # http://bugs.python.org/issue8876#msg208792
11 # This can be removed when using Python 2.7.9 or later:
12 # https://hg.python.org/cpython/raw-file/v2.7.9/Misc/NEWS
13 if os.path.abspath(__file__).split(os.path.sep)[1] == 'vagrant':
14 del os.link
15
16
17 def read_file(filename, encoding='utf8'):
18 """Read unicode from given file."""
19 with codecs.open(filename, encoding=encoding) as fd:
20 return fd.read()
21
22
23 here = os.path.abspath(os.path.dirname(__file__))
24
25 # read version number (and other metadata) from package init
26 init_fn = os.path.join(here, 'certbot', '__init__.py')
27 meta = dict(re.findall(r"""__([a-z]+)__ = '([^']+)""", read_file(init_fn)))
28
29 readme = read_file(os.path.join(here, 'README.rst'))
30 changes = read_file(os.path.join(here, 'CHANGES.rst'))
31 version = meta['version']
32
33 # This package relies on PyOpenSSL, requests, and six, however, it isn't
34 # specified here to avoid masking the more specific request requirements in
35 # acme. See https://github.com/pypa/pip/issues/988 for more info.
36 install_requires = [
37 'acme>=0.22.1',
38 # We technically need ConfigArgParse 0.10.0 for Python 2.6 support, but
39 # saying so here causes a runtime error against our temporary fork of 0.9.3
40 # in which we added 2.6 support (see #2243), so we relax the requirement.
41 'ConfigArgParse>=0.9.3',
42 'configobj',
43 'cryptography>=1.2', # load_pem_x509_certificate
44 'josepy',
45 'mock',
46 'parsedatetime>=1.3', # Calendar.parseDT
47 'pyrfc3339',
48 'pytz',
49 'setuptools',
50 'zope.component',
51 'zope.interface',
52 ]
53
54 dev_extras = [
55 # Pin astroid==1.3.5, pylint==1.4.2 as a workaround for #289
56 'astroid==1.3.5',
57 'coverage',
58 'ipdb',
59 'pytest',
60 'pytest-cov',
61 'pytest-xdist',
62 'pylint==1.4.2', # upstream #248
63 'tox',
64 'twine',
65 'wheel',
66 ]
67
68 docs_extras = [
69 'repoze.sphinx.autointerface',
70 # autodoc_member_order = 'bysource', autodoc_default_flags, and #4686
71 'Sphinx >=1.0,<=1.5.6',
72 'sphinx_rtd_theme',
73 ]
74
75 setup(
76 name='certbot',
77 version=version,
78 description="ACME client",
79 long_description=readme, # later: + '\n\n' + changes
80 url='https://github.com/letsencrypt/letsencrypt',
81 author="Certbot Project",
82 author_email='[email protected]',
83 license='Apache License 2.0',
84 python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
85 classifiers=[
86 'Development Status :: 3 - Alpha',
87 'Environment :: Console',
88 'Environment :: Console :: Curses',
89 'Intended Audience :: System Administrators',
90 'License :: OSI Approved :: Apache Software License',
91 'Operating System :: POSIX :: Linux',
92 'Programming Language :: Python',
93 'Programming Language :: Python :: 2',
94 'Programming Language :: Python :: 2.7',
95 'Programming Language :: Python :: 3',
96 'Programming Language :: Python :: 3.4',
97 'Programming Language :: Python :: 3.5',
98 'Programming Language :: Python :: 3.6',
99 'Topic :: Internet :: WWW/HTTP',
100 'Topic :: Security',
101 'Topic :: System :: Installation/Setup',
102 'Topic :: System :: Networking',
103 'Topic :: System :: Systems Administration',
104 'Topic :: Utilities',
105 ],
106
107 packages=find_packages(exclude=['docs', 'examples', 'tests', 'venv']),
108 include_package_data=True,
109
110 install_requires=install_requires,
111 extras_require={
112 'dev': dev_extras,
113 'docs': docs_extras,
114 },
115
116 # to test all packages run "python setup.py test -s
117 # {acme,certbot_apache,certbot_nginx}"
118 test_suite='certbot',
119
120 entry_points={
121 'console_scripts': [
122 'certbot = certbot.main:main',
123 ],
124 'certbot.plugins': [
125 'manual = certbot.plugins.manual:Authenticator',
126 'null = certbot.plugins.null:Installer',
127 'standalone = certbot.plugins.standalone:Authenticator',
128 'webroot = certbot.plugins.webroot:Authenticator',
129 ],
130 },
131 )
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -65,6 +65,10 @@
'wheel',
]
+dev3_extras = [
+ 'mypy',
+]
+
docs_extras = [
'repoze.sphinx.autointerface',
# autodoc_member_order = 'bysource', autodoc_default_flags, and #4686
@@ -110,6 +114,7 @@
install_requires=install_requires,
extras_require={
'dev': dev_extras,
+ 'dev3': dev3_extras,
'docs': docs_extras,
},
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -65,6 +65,10 @@\n 'wheel',\n ]\n \n+dev3_extras = [\n+ 'mypy',\n+]\n+\n docs_extras = [\n 'repoze.sphinx.autointerface',\n # autodoc_member_order = 'bysource', autodoc_default_flags, and #4686\n@@ -110,6 +114,7 @@\n install_requires=install_requires,\n extras_require={\n 'dev': dev_extras,\n+ 'dev3': dev3_extras,\n 'docs': docs_extras,\n },\n", "issue": "[1] Standardize how you\u2019ll run mypy\nFrom http://blog.zulip.org/2016/10/13/static-types-in-python-oh-mypy/:\r\n\r\nWrite tooling to [install](https://github.com/zulip/zulip/blob/master/tools/install-mypy) and [run](https://github.com/zulip/zulip/blob/master/tools/run-mypy) `mypy` against your codebase, so that everyone using the project can run the type checker the same way. Two features are important in how you run mypy:\r\n\r\n- Support for determining which files should be checked (a whitelist/exclude list is useful!).\r\n- Specifying the correct flags for your project at this time. For a Python 2 project, I recommend starting with `mypy --py2 --silent-imports --fast-parser -i <paths>`. You should be able to do this using a [mypy.ini](http://mypy.readthedocs.io/en/latest/config_file.html) file.\n", "before_files": [{"content": "import codecs\nimport os\nimport re\nimport sys\n\nfrom setuptools import setup\nfrom setuptools import find_packages\n\n# Workaround for http://bugs.python.org/issue8876, see\n# http://bugs.python.org/issue8876#msg208792\n# This can be removed when using Python 2.7.9 or later:\n# https://hg.python.org/cpython/raw-file/v2.7.9/Misc/NEWS\nif os.path.abspath(__file__).split(os.path.sep)[1] == 'vagrant':\n del os.link\n\n\ndef read_file(filename, encoding='utf8'):\n \"\"\"Read unicode from given file.\"\"\"\n with codecs.open(filename, encoding=encoding) as fd:\n return fd.read()\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n# read version number (and other metadata) from package init\ninit_fn = os.path.join(here, 'certbot', '__init__.py')\nmeta = dict(re.findall(r\"\"\"__([a-z]+)__ = '([^']+)\"\"\", read_file(init_fn)))\n\nreadme = read_file(os.path.join(here, 'README.rst'))\nchanges = read_file(os.path.join(here, 'CHANGES.rst'))\nversion = meta['version']\n\n# This package relies on PyOpenSSL, requests, and six, however, it isn't\n# specified here to avoid masking the more specific request requirements in\n# acme. 
See https://github.com/pypa/pip/issues/988 for more info.\ninstall_requires = [\n 'acme>=0.22.1',\n # We technically need ConfigArgParse 0.10.0 for Python 2.6 support, but\n # saying so here causes a runtime error against our temporary fork of 0.9.3\n # in which we added 2.6 support (see #2243), so we relax the requirement.\n 'ConfigArgParse>=0.9.3',\n 'configobj',\n 'cryptography>=1.2', # load_pem_x509_certificate\n 'josepy',\n 'mock',\n 'parsedatetime>=1.3', # Calendar.parseDT\n 'pyrfc3339',\n 'pytz',\n 'setuptools',\n 'zope.component',\n 'zope.interface',\n]\n\ndev_extras = [\n # Pin astroid==1.3.5, pylint==1.4.2 as a workaround for #289\n 'astroid==1.3.5',\n 'coverage',\n 'ipdb',\n 'pytest',\n 'pytest-cov',\n 'pytest-xdist',\n 'pylint==1.4.2', # upstream #248\n 'tox',\n 'twine',\n 'wheel',\n]\n\ndocs_extras = [\n 'repoze.sphinx.autointerface',\n # autodoc_member_order = 'bysource', autodoc_default_flags, and #4686\n 'Sphinx >=1.0,<=1.5.6',\n 'sphinx_rtd_theme',\n]\n\nsetup(\n name='certbot',\n version=version,\n description=\"ACME client\",\n long_description=readme, # later: + '\\n\\n' + changes\n url='https://github.com/letsencrypt/letsencrypt',\n author=\"Certbot Project\",\n author_email='[email protected]',\n license='Apache License 2.0',\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',\n classifiers=[\n 'Development Status :: 3 - Alpha',\n 'Environment :: Console',\n 'Environment :: Console :: Curses',\n 'Intended Audience :: System Administrators',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: POSIX :: Linux',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Internet :: WWW/HTTP',\n 'Topic :: Security',\n 'Topic :: System :: Installation/Setup',\n 'Topic :: System :: Networking',\n 'Topic :: System :: Systems Administration',\n 'Topic :: Utilities',\n ],\n\n packages=find_packages(exclude=['docs', 'examples', 'tests', 'venv']),\n include_package_data=True,\n\n install_requires=install_requires,\n extras_require={\n 'dev': dev_extras,\n 'docs': docs_extras,\n },\n\n # to test all packages run \"python setup.py test -s\n # {acme,certbot_apache,certbot_nginx}\"\n test_suite='certbot',\n\n entry_points={\n 'console_scripts': [\n 'certbot = certbot.main:main',\n ],\n 'certbot.plugins': [\n 'manual = certbot.plugins.manual:Authenticator',\n 'null = certbot.plugins.null:Installer',\n 'standalone = certbot.plugins.standalone:Authenticator',\n 'webroot = certbot.plugins.webroot:Authenticator',\n ],\n },\n)\n", "path": "setup.py"}], "after_files": [{"content": "import codecs\nimport os\nimport re\nimport sys\n\nfrom setuptools import setup\nfrom setuptools import find_packages\n\n# Workaround for http://bugs.python.org/issue8876, see\n# http://bugs.python.org/issue8876#msg208792\n# This can be removed when using Python 2.7.9 or later:\n# https://hg.python.org/cpython/raw-file/v2.7.9/Misc/NEWS\nif os.path.abspath(__file__).split(os.path.sep)[1] == 'vagrant':\n del os.link\n\n\ndef read_file(filename, encoding='utf8'):\n \"\"\"Read unicode from given file.\"\"\"\n with codecs.open(filename, encoding=encoding) as fd:\n return fd.read()\n\n\nhere = os.path.abspath(os.path.dirname(__file__))\n\n# read version number (and other metadata) from package init\ninit_fn = os.path.join(here, 'certbot', 
'__init__.py')\nmeta = dict(re.findall(r\"\"\"__([a-z]+)__ = '([^']+)\"\"\", read_file(init_fn)))\n\nreadme = read_file(os.path.join(here, 'README.rst'))\nchanges = read_file(os.path.join(here, 'CHANGES.rst'))\nversion = meta['version']\n\n# This package relies on PyOpenSSL, requests, and six, however, it isn't\n# specified here to avoid masking the more specific request requirements in\n# acme. See https://github.com/pypa/pip/issues/988 for more info.\ninstall_requires = [\n 'acme>=0.22.1',\n # We technically need ConfigArgParse 0.10.0 for Python 2.6 support, but\n # saying so here causes a runtime error against our temporary fork of 0.9.3\n # in which we added 2.6 support (see #2243), so we relax the requirement.\n 'ConfigArgParse>=0.9.3',\n 'configobj',\n 'cryptography>=1.2', # load_pem_x509_certificate\n 'josepy',\n 'mock',\n 'parsedatetime>=1.3', # Calendar.parseDT\n 'pyrfc3339',\n 'pytz',\n 'setuptools',\n 'zope.component',\n 'zope.interface',\n]\n\ndev_extras = [\n # Pin astroid==1.3.5, pylint==1.4.2 as a workaround for #289\n 'astroid==1.3.5',\n 'coverage',\n 'ipdb',\n 'pytest',\n 'pytest-cov',\n 'pytest-xdist',\n 'pylint==1.4.2', # upstream #248\n 'tox',\n 'twine',\n 'wheel',\n]\n\ndev3_extras = [\n 'mypy',\n]\n\ndocs_extras = [\n 'repoze.sphinx.autointerface',\n # autodoc_member_order = 'bysource', autodoc_default_flags, and #4686\n 'Sphinx >=1.0,<=1.5.6',\n 'sphinx_rtd_theme',\n]\n\nsetup(\n name='certbot',\n version=version,\n description=\"ACME client\",\n long_description=readme, # later: + '\\n\\n' + changes\n url='https://github.com/letsencrypt/letsencrypt',\n author=\"Certbot Project\",\n author_email='[email protected]',\n license='Apache License 2.0',\n python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',\n classifiers=[\n 'Development Status :: 3 - Alpha',\n 'Environment :: Console',\n 'Environment :: Console :: Curses',\n 'Intended Audience :: System Administrators',\n 'License :: OSI Approved :: Apache Software License',\n 'Operating System :: POSIX :: Linux',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Topic :: Internet :: WWW/HTTP',\n 'Topic :: Security',\n 'Topic :: System :: Installation/Setup',\n 'Topic :: System :: Networking',\n 'Topic :: System :: Systems Administration',\n 'Topic :: Utilities',\n ],\n\n packages=find_packages(exclude=['docs', 'examples', 'tests', 'venv']),\n include_package_data=True,\n\n install_requires=install_requires,\n extras_require={\n 'dev': dev_extras,\n 'dev3': dev3_extras,\n 'docs': docs_extras,\n },\n\n # to test all packages run \"python setup.py test -s\n # {acme,certbot_apache,certbot_nginx}\"\n test_suite='certbot',\n\n entry_points={\n 'console_scripts': [\n 'certbot = certbot.main:main',\n ],\n 'certbot.plugins': [\n 'manual = certbot.plugins.manual:Authenticator',\n 'null = certbot.plugins.null:Installer',\n 'standalone = certbot.plugins.standalone:Authenticator',\n 'webroot = certbot.plugins.webroot:Authenticator',\n ],\n },\n)\n", "path": "setup.py"}]} | 1,917 | 151 |
gh_patches_debug_15872 | rasdani/github-patches | git_diff | azavea__raster-vision-427 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Fill out model-defaults for tensorflow-od
Currently we only have 1 model in model_defaults.json:
https://github.com/azavea/raster-vision/blob/feature/api-refactor/src/rastervision/backend/model_defaults.json#L2
We need to fill it out to include each of these configurations, matched up with each of the pretrained weights from the model zoo:
#### Configs
https://github.com/azavea/models/tree/master/research/object_detection/samples/configs
#### Weights
https://github.com/azavea/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/rastervision/backend/api.py`
Content:
```
1 # flake8: noqa
2
3 # Registry keys
4
5 BACKEND = 'BACKEND'
6
7 ## Backend Keys
8
9 TF_OBJECT_DETECTION = 'TF_OBJECT_DETECTION'
10 KERAS_CLASSIFICATION = 'KERAS_CLASSIFICATION'
11
12 ## Model keys
13
14 ### TF Object Detection
15 SSD_MOBILENET_V1_COCO = 'SSD_MOBILENET_V1_COCO'
16
17 ## Keras Classificaiton
18 RESNET50_IMAGENET = 'RESNET50_IMAGENET'
19
20 from .backend_config import BackendConfig
21
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/rastervision/backend/api.py b/src/rastervision/backend/api.py
--- a/src/rastervision/backend/api.py
+++ b/src/rastervision/backend/api.py
@@ -13,6 +13,20 @@
### TF Object Detection
SSD_MOBILENET_V1_COCO = 'SSD_MOBILENET_V1_COCO'
+SSD_MOBILENET_V2_COCO = 'SSD_MOBILENET_V2_COCO'
+SSDLITE_MOBILENET_V2_COCO = 'SSDLITE_MOBILENET_V2_COCO'
+SSD_INCEPTION_V2_COCO = 'SSD_INCEPTION_V2_COCO'
+FASTER_RCNN_INCEPTION_V2_COCO = 'FASTER_RCNN_INCEPTION_V2_COCO'
+FASTER_RCNN_RESNET50_COCO = 'FASTER_RCNN_RESNET50_COCO'
+RFCN_RESNET101_COCO = 'RFCN_RESNET101_COCO'
+FASTER_RCNN_RESNET101_COCO = 'FASTER_RCNN_RESNET101_COCO'
+FASTER_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO = \
+'FASTER_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO'
+MASK_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO = \
+'MASK_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO'
+MASK_RCNN_INCEPTION_V2_COCO = 'MASK_RCNN_INCEPTION_V2_COCO'
+MASK_RCNN_RESNET101_ATROUS_COCO = 'MASK_RCNN_RESNET101_ATROUS_COCO'
+MASK_RCNN_RESNET50_ATROUS_COCO = 'MASK_RCNN_RESNET50_ATROUS_COCO'
## Keras Classificaiton
RESNET50_IMAGENET = 'RESNET50_IMAGENET'
| {"golden_diff": "diff --git a/src/rastervision/backend/api.py b/src/rastervision/backend/api.py\n--- a/src/rastervision/backend/api.py\n+++ b/src/rastervision/backend/api.py\n@@ -13,6 +13,20 @@\n \n ### TF Object Detection\n SSD_MOBILENET_V1_COCO = 'SSD_MOBILENET_V1_COCO'\n+SSD_MOBILENET_V2_COCO = 'SSD_MOBILENET_V2_COCO'\n+SSDLITE_MOBILENET_V2_COCO = 'SSDLITE_MOBILENET_V2_COCO'\n+SSD_INCEPTION_V2_COCO = 'SSD_INCEPTION_V2_COCO'\n+FASTER_RCNN_INCEPTION_V2_COCO = 'FASTER_RCNN_INCEPTION_V2_COCO'\n+FASTER_RCNN_RESNET50_COCO = 'FASTER_RCNN_RESNET50_COCO'\n+RFCN_RESNET101_COCO = 'RFCN_RESNET101_COCO'\n+FASTER_RCNN_RESNET101_COCO = 'FASTER_RCNN_RESNET101_COCO'\n+FASTER_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO = \\\n+'FASTER_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO'\n+MASK_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO = \\\n+'MASK_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO'\n+MASK_RCNN_INCEPTION_V2_COCO = 'MASK_RCNN_INCEPTION_V2_COCO'\n+MASK_RCNN_RESNET101_ATROUS_COCO = 'MASK_RCNN_RESNET101_ATROUS_COCO'\n+MASK_RCNN_RESNET50_ATROUS_COCO = 'MASK_RCNN_RESNET50_ATROUS_COCO'\n \n ## Keras Classificaiton\n RESNET50_IMAGENET = 'RESNET50_IMAGENET'\n", "issue": "Fill out model-defaults for tensorflow-od\nCurrently we only have 1 model in model_defaults.json: \r\n\r\nhttps://github.com/azavea/raster-vision/blob/feature/api-refactor/src/rastervision/backend/model_defaults.json#L2\r\n\r\nWe need to fill it out to include each of these configurations, matched up with each of the pretrained weights from the model zoo:\r\n\r\n#### Configs\r\nhttps://github.com/azavea/models/tree/master/research/object_detection/samples/configs\r\n\r\n#### Weights\r\nhttps://github.com/azavea/models/blob/master/research/object_detection/g3doc/detection_model_zoo.md\n", "before_files": [{"content": "# flake8: noqa\n\n# Registry keys\n\nBACKEND = 'BACKEND'\n\n## Backend Keys\n\nTF_OBJECT_DETECTION = 'TF_OBJECT_DETECTION'\nKERAS_CLASSIFICATION = 'KERAS_CLASSIFICATION'\n\n## Model keys\n\n### TF Object Detection\nSSD_MOBILENET_V1_COCO = 'SSD_MOBILENET_V1_COCO'\n\n## Keras Classificaiton\nRESNET50_IMAGENET = 'RESNET50_IMAGENET'\n\nfrom .backend_config import BackendConfig\n", "path": "src/rastervision/backend/api.py"}], "after_files": [{"content": "# flake8: noqa\n\n# Registry keys\n\nBACKEND = 'BACKEND'\n\n## Backend Keys\n\nTF_OBJECT_DETECTION = 'TF_OBJECT_DETECTION'\nKERAS_CLASSIFICATION = 'KERAS_CLASSIFICATION'\n\n## Model keys\n\n### TF Object Detection\nSSD_MOBILENET_V1_COCO = 'SSD_MOBILENET_V1_COCO'\nSSD_MOBILENET_V2_COCO = 'SSD_MOBILENET_V2_COCO'\nSSDLITE_MOBILENET_V2_COCO = 'SSDLITE_MOBILENET_V2_COCO'\nSSD_INCEPTION_V2_COCO = 'SSD_INCEPTION_V2_COCO'\nFASTER_RCNN_INCEPTION_V2_COCO = 'FASTER_RCNN_INCEPTION_V2_COCO'\nFASTER_RCNN_RESNET50_COCO = 'FASTER_RCNN_RESNET50_COCO'\nRFCN_RESNET101_COCO = 'RFCN_RESNET101_COCO'\nFASTER_RCNN_RESNET101_COCO = 'FASTER_RCNN_RESNET101_COCO'\nFASTER_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO = \\\n'FASTER_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO'\nMASK_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO = \\\n'MASK_RCNN_INCEPTION_RESNET_V2_ATROUS_COCO'\nMASK_RCNN_INCEPTION_V2_COCO = 'MASK_RCNN_INCEPTION_V2_COCO'\nMASK_RCNN_RESNET101_ATROUS_COCO = 'MASK_RCNN_RESNET101_ATROUS_COCO'\nMASK_RCNN_RESNET50_ATROUS_COCO = 'MASK_RCNN_RESNET50_ATROUS_COCO'\n\n## Keras Classificaiton\nRESNET50_IMAGENET = 'RESNET50_IMAGENET'\n\nfrom .backend_config import BackendConfig\n", "path": "src/rastervision/backend/api.py"}]} | 545 | 441 |
gh_patches_debug_37445 | rasdani/github-patches | git_diff | hpcaitech__ColossalAI-4815 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[tensor] fix some unittests
[tensor] fix some unittests
[tensor] fix some unittests
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `op_builder/gptq.py`
Content:
```
1 import os
2 import torch
3 import re
4
5 from .builder import Builder
6 from .utils import append_nvcc_threads, get_cuda_cc_flag
7
8 class GPTQBuilder(Builder):
9
10 NAME = "cu_gptq"
11 PREBUILT_IMPORT_PATH = "colossalai._C.cu_gptq"
12
13 def __init__(self):
14 super().__init__(name=GPTQBuilder.NAME,
15 prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)
16
17
18 def include_dirs(self):
19 ret = [self.csrc_abs_path("gptq"), self.get_cuda_home_include()]
20 return ret
21
22 def sources_files(self):
23 ret = [
24 self.csrc_abs_path(fname) for fname in [
25 'gptq/linear_gptq.cpp',
26 'gptq/column_remap.cu',
27 'gptq/cuda_buffers.cu',
28 'gptq/q4_matmul.cu',
29 'gptq/q4_matrix.cu'
30 ]
31 ]
32 return ret
33
34 def cxx_flags(self):
35 return ['-O3'] + self.version_dependent_macros
36
37 def nvcc_flags(self):
38 extra_cuda_flags = ['-v',
39 '-std=c++14', '-U__CUDA_NO_HALF_OPERATORS__', '-U__CUDA_NO_HALF_CONVERSIONS__',
40 '-U__CUDA_NO_HALF2_OPERATORS__', '-DTHRUST_IGNORE_CUB_VERSION_CHECK', "-lcublas", "-std=c++17"
41 ]
42
43
44 for arch in torch.cuda.get_arch_list():
45 res = re.search(r'sm_(\d+)', arch)
46 if res:
47 arch_cap = res[1]
48 if int(arch_cap) >= 80:
49 extra_cuda_flags.extend(['-gencode', f'arch=compute_{arch_cap},code={arch}'])
50
51 ret = ['-O3', '--use_fast_math'] + self.version_dependent_macros + extra_cuda_flags
52 return append_nvcc_threads(ret)
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/op_builder/gptq.py b/op_builder/gptq.py
--- a/op_builder/gptq.py
+++ b/op_builder/gptq.py
@@ -1,19 +1,17 @@
-import os
-import torch
import re
+import torch
+
from .builder import Builder
-from .utils import append_nvcc_threads, get_cuda_cc_flag
+from .utils import append_nvcc_threads
-class GPTQBuilder(Builder):
+class GPTQBuilder(Builder):
NAME = "cu_gptq"
PREBUILT_IMPORT_PATH = "colossalai._C.cu_gptq"
def __init__(self):
- super().__init__(name=GPTQBuilder.NAME,
- prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)
-
+ super().__init__(name=GPTQBuilder.NAME, prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)
def include_dirs(self):
ret = [self.csrc_abs_path("gptq"), self.get_cuda_home_include()]
@@ -21,32 +19,38 @@
def sources_files(self):
ret = [
- self.csrc_abs_path(fname) for fname in [
- 'gptq/linear_gptq.cpp',
- 'gptq/column_remap.cu',
- 'gptq/cuda_buffers.cu',
- 'gptq/q4_matmul.cu',
- 'gptq/q4_matrix.cu'
+ self.csrc_abs_path(fname)
+ for fname in [
+ "gptq/linear_gptq.cpp",
+ "gptq/column_remap.cu",
+ "gptq/cuda_buffers.cu",
+ "gptq/q4_matmul.cu",
+ "gptq/q4_matrix.cu",
]
]
return ret
def cxx_flags(self):
- return ['-O3'] + self.version_dependent_macros
+ return ["-O3"] + self.version_dependent_macros
def nvcc_flags(self):
- extra_cuda_flags = ['-v',
- '-std=c++14', '-U__CUDA_NO_HALF_OPERATORS__', '-U__CUDA_NO_HALF_CONVERSIONS__',
- '-U__CUDA_NO_HALF2_OPERATORS__', '-DTHRUST_IGNORE_CUB_VERSION_CHECK', "-lcublas", "-std=c++17"
+ extra_cuda_flags = [
+ "-v",
+ "-std=c++14",
+ "-U__CUDA_NO_HALF_OPERATORS__",
+ "-U__CUDA_NO_HALF_CONVERSIONS__",
+ "-U__CUDA_NO_HALF2_OPERATORS__",
+ "-DTHRUST_IGNORE_CUB_VERSION_CHECK",
+ "-lcublas",
+ "-std=c++17",
]
-
for arch in torch.cuda.get_arch_list():
- res = re.search(r'sm_(\d+)', arch)
+ res = re.search(r"sm_(\d+)", arch)
if res:
arch_cap = res[1]
if int(arch_cap) >= 80:
- extra_cuda_flags.extend(['-gencode', f'arch=compute_{arch_cap},code={arch}'])
+ extra_cuda_flags.extend(["-gencode", f"arch=compute_{arch_cap},code={arch}"])
- ret = ['-O3', '--use_fast_math'] + self.version_dependent_macros + extra_cuda_flags
- return append_nvcc_threads(ret)
\ No newline at end of file
+ ret = ["-O3", "--use_fast_math"] + self.version_dependent_macros + extra_cuda_flags
+ return append_nvcc_threads(ret)
| {"golden_diff": "diff --git a/op_builder/gptq.py b/op_builder/gptq.py\n--- a/op_builder/gptq.py\n+++ b/op_builder/gptq.py\n@@ -1,19 +1,17 @@\n-import os\n-import torch\n import re\n \n+import torch\n+\n from .builder import Builder\n-from .utils import append_nvcc_threads, get_cuda_cc_flag\n+from .utils import append_nvcc_threads\n \n-class GPTQBuilder(Builder):\n \n+class GPTQBuilder(Builder):\n NAME = \"cu_gptq\"\n PREBUILT_IMPORT_PATH = \"colossalai._C.cu_gptq\"\n \n def __init__(self):\n- super().__init__(name=GPTQBuilder.NAME,\n- prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)\n-\n+ super().__init__(name=GPTQBuilder.NAME, prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)\n \n def include_dirs(self):\n ret = [self.csrc_abs_path(\"gptq\"), self.get_cuda_home_include()]\n@@ -21,32 +19,38 @@\n \n def sources_files(self):\n ret = [\n- self.csrc_abs_path(fname) for fname in [\n- 'gptq/linear_gptq.cpp',\n- 'gptq/column_remap.cu',\n- 'gptq/cuda_buffers.cu',\n- 'gptq/q4_matmul.cu',\n- 'gptq/q4_matrix.cu'\n+ self.csrc_abs_path(fname)\n+ for fname in [\n+ \"gptq/linear_gptq.cpp\",\n+ \"gptq/column_remap.cu\",\n+ \"gptq/cuda_buffers.cu\",\n+ \"gptq/q4_matmul.cu\",\n+ \"gptq/q4_matrix.cu\",\n ]\n ]\n return ret\n \n def cxx_flags(self):\n- return ['-O3'] + self.version_dependent_macros\n+ return [\"-O3\"] + self.version_dependent_macros\n \n def nvcc_flags(self):\n- extra_cuda_flags = ['-v',\n- '-std=c++14', '-U__CUDA_NO_HALF_OPERATORS__', '-U__CUDA_NO_HALF_CONVERSIONS__',\n- '-U__CUDA_NO_HALF2_OPERATORS__', '-DTHRUST_IGNORE_CUB_VERSION_CHECK', \"-lcublas\", \"-std=c++17\"\n+ extra_cuda_flags = [\n+ \"-v\",\n+ \"-std=c++14\",\n+ \"-U__CUDA_NO_HALF_OPERATORS__\",\n+ \"-U__CUDA_NO_HALF_CONVERSIONS__\",\n+ \"-U__CUDA_NO_HALF2_OPERATORS__\",\n+ \"-DTHRUST_IGNORE_CUB_VERSION_CHECK\",\n+ \"-lcublas\",\n+ \"-std=c++17\",\n ]\n \n-\n for arch in torch.cuda.get_arch_list():\n- res = re.search(r'sm_(\\d+)', arch)\n+ res = re.search(r\"sm_(\\d+)\", arch)\n if res:\n arch_cap = res[1]\n if int(arch_cap) >= 80:\n- extra_cuda_flags.extend(['-gencode', f'arch=compute_{arch_cap},code={arch}'])\n+ extra_cuda_flags.extend([\"-gencode\", f\"arch=compute_{arch_cap},code={arch}\"])\n \n- ret = ['-O3', '--use_fast_math'] + self.version_dependent_macros + extra_cuda_flags\n- return append_nvcc_threads(ret)\n\\ No newline at end of file\n+ ret = [\"-O3\", \"--use_fast_math\"] + self.version_dependent_macros + extra_cuda_flags\n+ return append_nvcc_threads(ret)\n", "issue": "[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n[tensor] fix some unittests\n\n", "before_files": [{"content": "import os\nimport torch\nimport re\n\nfrom .builder import Builder\nfrom .utils import append_nvcc_threads, get_cuda_cc_flag\n\nclass GPTQBuilder(Builder):\n\n NAME = \"cu_gptq\"\n PREBUILT_IMPORT_PATH = \"colossalai._C.cu_gptq\"\n\n def __init__(self):\n super().__init__(name=GPTQBuilder.NAME,\n prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)\n\n\n def include_dirs(self):\n ret = [self.csrc_abs_path(\"gptq\"), self.get_cuda_home_include()]\n return ret\n\n def sources_files(self):\n ret = [\n self.csrc_abs_path(fname) for fname in [\n 'gptq/linear_gptq.cpp',\n 'gptq/column_remap.cu',\n 'gptq/cuda_buffers.cu',\n 'gptq/q4_matmul.cu',\n 'gptq/q4_matrix.cu'\n ]\n ]\n return ret\n\n def cxx_flags(self):\n return ['-O3'] + self.version_dependent_macros\n\n def nvcc_flags(self):\n extra_cuda_flags = ['-v',\n '-std=c++14', '-U__CUDA_NO_HALF_OPERATORS__', '-U__CUDA_NO_HALF_CONVERSIONS__',\n 
'-U__CUDA_NO_HALF2_OPERATORS__', '-DTHRUST_IGNORE_CUB_VERSION_CHECK', \"-lcublas\", \"-std=c++17\"\n ]\n\n\n for arch in torch.cuda.get_arch_list():\n res = re.search(r'sm_(\\d+)', arch)\n if res:\n arch_cap = res[1]\n if int(arch_cap) >= 80:\n extra_cuda_flags.extend(['-gencode', f'arch=compute_{arch_cap},code={arch}'])\n\n ret = ['-O3', '--use_fast_math'] + self.version_dependent_macros + extra_cuda_flags\n return append_nvcc_threads(ret)", "path": "op_builder/gptq.py"}], "after_files": [{"content": "import re\n\nimport torch\n\nfrom .builder import Builder\nfrom .utils import append_nvcc_threads\n\n\nclass GPTQBuilder(Builder):\n NAME = \"cu_gptq\"\n PREBUILT_IMPORT_PATH = \"colossalai._C.cu_gptq\"\n\n def __init__(self):\n super().__init__(name=GPTQBuilder.NAME, prebuilt_import_path=GPTQBuilder.PREBUILT_IMPORT_PATH)\n\n def include_dirs(self):\n ret = [self.csrc_abs_path(\"gptq\"), self.get_cuda_home_include()]\n return ret\n\n def sources_files(self):\n ret = [\n self.csrc_abs_path(fname)\n for fname in [\n \"gptq/linear_gptq.cpp\",\n \"gptq/column_remap.cu\",\n \"gptq/cuda_buffers.cu\",\n \"gptq/q4_matmul.cu\",\n \"gptq/q4_matrix.cu\",\n ]\n ]\n return ret\n\n def cxx_flags(self):\n return [\"-O3\"] + self.version_dependent_macros\n\n def nvcc_flags(self):\n extra_cuda_flags = [\n \"-v\",\n \"-std=c++14\",\n \"-U__CUDA_NO_HALF_OPERATORS__\",\n \"-U__CUDA_NO_HALF_CONVERSIONS__\",\n \"-U__CUDA_NO_HALF2_OPERATORS__\",\n \"-DTHRUST_IGNORE_CUB_VERSION_CHECK\",\n \"-lcublas\",\n \"-std=c++17\",\n ]\n\n for arch in torch.cuda.get_arch_list():\n res = re.search(r\"sm_(\\d+)\", arch)\n if res:\n arch_cap = res[1]\n if int(arch_cap) >= 80:\n extra_cuda_flags.extend([\"-gencode\", f\"arch=compute_{arch_cap},code={arch}\"])\n\n ret = [\"-O3\", \"--use_fast_math\"] + self.version_dependent_macros + extra_cuda_flags\n return append_nvcc_threads(ret)\n", "path": "op_builder/gptq.py"}]} | 809 | 815 |
gh_patches_debug_35667 | rasdani/github-patches | git_diff | scrapy__scrapy-4799 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Usage of HttpCompressionMiddleware needs to be reflected in Scrapy stats
## Summary
Usage of `HttpCompressionMiddleware` needs to be reflected in Scrapy stats.
## Motivation
In order to estimate Scrapy memory usage efficiency and prevent memory leaks like [this](https://stackoverflow.com/q/63936759/10884791),
I will need to know:
1. the number of request/response objects that can be active (this can be obtained with [`trackref`](https://docs.scrapy.org/en/latest/topics/leaks.html#debugging-memory-leaks-with-trackref))
2. the amount of memory required to store that number of request/response objects.
A lot of websites use compression to reduce traffic. In this case I would like to calculate the average size of the **decompressed** responses to estimate p.2.
The decompression process means that at some point the application has to allocate memory to hold both the compressed and the decompressed response body, and I need to know these sizes to get a more complete picture of Scrapy memory usage.
Also, the decompressed body will be several times larger than the compressed response, which in itself affects Scrapy memory usage.
## Describe alternatives you've considered
The easiest one is to change the priority of the `DownloaderStats` middleware and check the difference in the `downloader/response_bytes` stats parameter:
```
custom_settings = {
    "DOWNLOAD_DELAY": 1,
    "DOWNLOADER_MIDDLEWARES": {
        "scrapy.downloadermiddlewares.stats.DownloaderStats": 50,
    },
}
```
Stats from quotes.toscrape.com spider (it uses `gzip` compression) with default settings:
```
{'downloader/request_bytes': 2642,
'downloader/request_count': 10,
'downloader/request_method_count/GET': 10,
'downloader/response_bytes': 24534,
```
And with the priority of the `DownloaderStats` middleware changed:
```
{'downloader/request_bytes': 912, # size reduced as it didn't count size of request headers populated by downloader middlewares
'downloader/request_count': 10,
'downloader/request_method_count/GET': 10,
'downloader/response_bytes': 110191, # it counted size of decompressed data
```
Average size of a compressed response (default settings): 2453 bytes.
Average size of a decompressed response: 11019 bytes (~4.5 times larger).
## Additional context
A potential solution is to add something like this:
` self.stats.inc_value('decompressed_bytes', spider=spider)`
into the `process_response` method of `HttpCompressionMiddleware`.
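As a rough sketch (the stat name and the exact guard are placeholders, and it assumes the middleware is handed `crawler.stats` via `from_crawler`), it could look roughly like:
```python
# inside HttpCompressionMiddleware.process_response, right after decoding
decoded_body = self._decode(response.body, encoding.lower())
if self.stats:  # assumes a `stats` attribute injected in from_crawler(crawler.stats)
    self.stats.inc_value('decompressed_bytes', len(decoded_body), spider=spider)
```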
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scrapy/downloadermiddlewares/httpcompression.py`
Content:
```
1 import io
2 import zlib
3
4 from scrapy.utils.gz import gunzip
5 from scrapy.http import Response, TextResponse
6 from scrapy.responsetypes import responsetypes
7 from scrapy.exceptions import NotConfigured
8
9
10 ACCEPTED_ENCODINGS = [b'gzip', b'deflate']
11
12 try:
13 import brotli
14 ACCEPTED_ENCODINGS.append(b'br')
15 except ImportError:
16 pass
17
18 try:
19 import zstandard
20 ACCEPTED_ENCODINGS.append(b'zstd')
21 except ImportError:
22 pass
23
24
25 class HttpCompressionMiddleware:
26 """This middleware allows compressed (gzip, deflate) traffic to be
27 sent/received from web sites"""
28 @classmethod
29 def from_crawler(cls, crawler):
30 if not crawler.settings.getbool('COMPRESSION_ENABLED'):
31 raise NotConfigured
32 return cls()
33
34 def process_request(self, request, spider):
35 request.headers.setdefault('Accept-Encoding',
36 b", ".join(ACCEPTED_ENCODINGS))
37
38 def process_response(self, request, response, spider):
39
40 if request.method == 'HEAD':
41 return response
42 if isinstance(response, Response):
43 content_encoding = response.headers.getlist('Content-Encoding')
44 if content_encoding:
45 encoding = content_encoding.pop()
46 decoded_body = self._decode(response.body, encoding.lower())
47 respcls = responsetypes.from_args(
48 headers=response.headers, url=response.url, body=decoded_body
49 )
50 kwargs = dict(cls=respcls, body=decoded_body)
51 if issubclass(respcls, TextResponse):
52 # force recalculating the encoding until we make sure the
53 # responsetypes guessing is reliable
54 kwargs['encoding'] = None
55 response = response.replace(**kwargs)
56 if not content_encoding:
57 del response.headers['Content-Encoding']
58
59 return response
60
61 def _decode(self, body, encoding):
62 if encoding == b'gzip' or encoding == b'x-gzip':
63 body = gunzip(body)
64
65 if encoding == b'deflate':
66 try:
67 body = zlib.decompress(body)
68 except zlib.error:
69 # ugly hack to work with raw deflate content that may
70 # be sent by microsoft servers. For more information, see:
71 # http://carsten.codimi.de/gzip.yaws/
72 # http://www.port80software.com/200ok/archive/2005/10/31/868.aspx
73 # http://www.gzip.org/zlib/zlib_faq.html#faq38
74 body = zlib.decompress(body, -15)
75 if encoding == b'br' and b'br' in ACCEPTED_ENCODINGS:
76 body = brotli.decompress(body)
77 if encoding == b'zstd' and b'zstd' in ACCEPTED_ENCODINGS:
78 # Using its streaming API since its simple API could handle only cases
79 # where there is content size data embedded in the frame
80 reader = zstandard.ZstdDecompressor().stream_reader(io.BytesIO(body))
81 body = reader.read()
82 return body
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scrapy/downloadermiddlewares/httpcompression.py b/scrapy/downloadermiddlewares/httpcompression.py
--- a/scrapy/downloadermiddlewares/httpcompression.py
+++ b/scrapy/downloadermiddlewares/httpcompression.py
@@ -1,10 +1,12 @@
import io
+import warnings
import zlib
-from scrapy.utils.gz import gunzip
+from scrapy.exceptions import NotConfigured
from scrapy.http import Response, TextResponse
from scrapy.responsetypes import responsetypes
-from scrapy.exceptions import NotConfigured
+from scrapy.utils.deprecate import ScrapyDeprecationWarning
+from scrapy.utils.gz import gunzip
ACCEPTED_ENCODINGS = [b'gzip', b'deflate']
@@ -25,11 +27,25 @@
class HttpCompressionMiddleware:
"""This middleware allows compressed (gzip, deflate) traffic to be
sent/received from web sites"""
+ def __init__(self, stats=None):
+ self.stats = stats
+
@classmethod
def from_crawler(cls, crawler):
if not crawler.settings.getbool('COMPRESSION_ENABLED'):
raise NotConfigured
- return cls()
+ try:
+ return cls(stats=crawler.stats)
+ except TypeError:
+ warnings.warn(
+ "HttpCompressionMiddleware subclasses must either modify "
+ "their '__init__' method to support a 'stats' parameter or "
+ "reimplement the 'from_crawler' method.",
+ ScrapyDeprecationWarning,
+ )
+ result = cls()
+ result.stats = crawler.stats
+ return result
def process_request(self, request, spider):
request.headers.setdefault('Accept-Encoding',
@@ -44,6 +60,9 @@
if content_encoding:
encoding = content_encoding.pop()
decoded_body = self._decode(response.body, encoding.lower())
+ if self.stats:
+ self.stats.inc_value('httpcompression/response_bytes', len(decoded_body), spider=spider)
+ self.stats.inc_value('httpcompression/response_count', spider=spider)
respcls = responsetypes.from_args(
headers=response.headers, url=response.url, body=decoded_body
)
| {"golden_diff": "diff --git a/scrapy/downloadermiddlewares/httpcompression.py b/scrapy/downloadermiddlewares/httpcompression.py\n--- a/scrapy/downloadermiddlewares/httpcompression.py\n+++ b/scrapy/downloadermiddlewares/httpcompression.py\n@@ -1,10 +1,12 @@\n import io\n+import warnings\n import zlib\n \n-from scrapy.utils.gz import gunzip\n+from scrapy.exceptions import NotConfigured\n from scrapy.http import Response, TextResponse\n from scrapy.responsetypes import responsetypes\n-from scrapy.exceptions import NotConfigured\n+from scrapy.utils.deprecate import ScrapyDeprecationWarning\n+from scrapy.utils.gz import gunzip\n \n \n ACCEPTED_ENCODINGS = [b'gzip', b'deflate']\n@@ -25,11 +27,25 @@\n class HttpCompressionMiddleware:\n \"\"\"This middleware allows compressed (gzip, deflate) traffic to be\n sent/received from web sites\"\"\"\n+ def __init__(self, stats=None):\n+ self.stats = stats\n+\n @classmethod\n def from_crawler(cls, crawler):\n if not crawler.settings.getbool('COMPRESSION_ENABLED'):\n raise NotConfigured\n- return cls()\n+ try:\n+ return cls(stats=crawler.stats)\n+ except TypeError:\n+ warnings.warn(\n+ \"HttpCompressionMiddleware subclasses must either modify \"\n+ \"their '__init__' method to support a 'stats' parameter or \"\n+ \"reimplement the 'from_crawler' method.\",\n+ ScrapyDeprecationWarning,\n+ )\n+ result = cls()\n+ result.stats = crawler.stats\n+ return result\n \n def process_request(self, request, spider):\n request.headers.setdefault('Accept-Encoding',\n@@ -44,6 +60,9 @@\n if content_encoding:\n encoding = content_encoding.pop()\n decoded_body = self._decode(response.body, encoding.lower())\n+ if self.stats:\n+ self.stats.inc_value('httpcompression/response_bytes', len(decoded_body), spider=spider)\n+ self.stats.inc_value('httpcompression/response_count', spider=spider)\n respcls = responsetypes.from_args(\n headers=response.headers, url=response.url, body=decoded_body\n )\n", "issue": "Usage of HttpCompressionMiddleware needs to be reflected in Scrapy stats\n## Summary\r\nUsage of `HttpCompressionMiddleware` needs to be relfected in Scrapy stats.\r\n## Motivation\r\nIn order to estimate scrapy memory usage efficiency and prevent.. memory leaks like [this](https://stackoverflow.com/q/63936759/10884791).\r\nI will need to know:\r\n1. number of request/response objects that can be active (can be achieved by using [`trackref`](https://docs.scrapy.org/en/latest/topics/leaks.html#debugging-memory-leaks-with-trackref) )\r\n2. size of memory required to store that number of request/response objects. \r\n\r\nA lot of websites use compression to reduce traffic. 
In this case I would like to calculate average size of **decompressed** responses to estimate p.2.\r\n\r\nDecompression process means that at some point application will require to allocate memory to store both compressed and decompressed response body and I will need to know this sizes to have more complete vision of scrapy memory usage.\r\n\r\nAlso size of decompressed body will be several times more than size of compressed response and it will affect scrapy memory usage.\r\n\r\n## Describe alternatives you've considered\r\nThe easiest one - is to change priority of `DownloaderStats` middleware and check difference in `downloader/response_bytes` stats parameter.\r\n```\r\n custom_settings = {\"DOWNLOAD_DELAY\":1,\r\n \"DOWNLOADER_MIDDLEWARES\":{\r\n 'scrapy.downloadermiddlewares.stats.DownloaderStats':50\r\n }\r\n```\r\nStats from quotes.toscrape.com spider (it uses `gzip` compression) with default settings:\r\n\r\n```\r\n{'downloader/request_bytes': 2642,\r\n 'downloader/request_count': 10,\r\n 'downloader/request_method_count/GET': 10,\r\n 'downloader/response_bytes': 24534,\r\n```\r\n \r\nAnd with changed priority of `DownloaderStats` middleware:\r\n\r\n```\r\n{'downloader/request_bytes': 912, # size reduced as it didn't count size of request headers populated by downloader middlewares\r\n 'downloader/request_count': 10,\r\n 'downloader/request_method_count/GET': 10,\r\n 'downloader/response_bytes': 110191, # it counted size of decompressed data \r\n```\r\n\r\nAverage size of compressed response (by default) - 2453 bytes.\r\nAverage size of decompressed response - 11019 bytes (~4.5 times more).\r\n\r\n## Additional context\r\nPotential solution is to add something like this:\r\n` self.stats.inc_value('decompressed_bytes', spider=spider)`\r\ninto `process_response` method of `HttpCompressionMiddleware`\n", "before_files": [{"content": "import io\nimport zlib\n\nfrom scrapy.utils.gz import gunzip\nfrom scrapy.http import Response, TextResponse\nfrom scrapy.responsetypes import responsetypes\nfrom scrapy.exceptions import NotConfigured\n\n\nACCEPTED_ENCODINGS = [b'gzip', b'deflate']\n\ntry:\n import brotli\n ACCEPTED_ENCODINGS.append(b'br')\nexcept ImportError:\n pass\n\ntry:\n import zstandard\n ACCEPTED_ENCODINGS.append(b'zstd')\nexcept ImportError:\n pass\n\n\nclass HttpCompressionMiddleware:\n \"\"\"This middleware allows compressed (gzip, deflate) traffic to be\n sent/received from web sites\"\"\"\n @classmethod\n def from_crawler(cls, crawler):\n if not crawler.settings.getbool('COMPRESSION_ENABLED'):\n raise NotConfigured\n return cls()\n\n def process_request(self, request, spider):\n request.headers.setdefault('Accept-Encoding',\n b\", \".join(ACCEPTED_ENCODINGS))\n\n def process_response(self, request, response, spider):\n\n if request.method == 'HEAD':\n return response\n if isinstance(response, Response):\n content_encoding = response.headers.getlist('Content-Encoding')\n if content_encoding:\n encoding = content_encoding.pop()\n decoded_body = self._decode(response.body, encoding.lower())\n respcls = responsetypes.from_args(\n headers=response.headers, url=response.url, body=decoded_body\n )\n kwargs = dict(cls=respcls, body=decoded_body)\n if issubclass(respcls, TextResponse):\n # force recalculating the encoding until we make sure the\n # responsetypes guessing is reliable\n kwargs['encoding'] = None\n response = response.replace(**kwargs)\n if not content_encoding:\n del response.headers['Content-Encoding']\n\n return response\n\n def _decode(self, body, encoding):\n if 
encoding == b'gzip' or encoding == b'x-gzip':\n body = gunzip(body)\n\n if encoding == b'deflate':\n try:\n body = zlib.decompress(body)\n except zlib.error:\n # ugly hack to work with raw deflate content that may\n # be sent by microsoft servers. For more information, see:\n # http://carsten.codimi.de/gzip.yaws/\n # http://www.port80software.com/200ok/archive/2005/10/31/868.aspx\n # http://www.gzip.org/zlib/zlib_faq.html#faq38\n body = zlib.decompress(body, -15)\n if encoding == b'br' and b'br' in ACCEPTED_ENCODINGS:\n body = brotli.decompress(body)\n if encoding == b'zstd' and b'zstd' in ACCEPTED_ENCODINGS:\n # Using its streaming API since its simple API could handle only cases\n # where there is content size data embedded in the frame\n reader = zstandard.ZstdDecompressor().stream_reader(io.BytesIO(body))\n body = reader.read()\n return body\n", "path": "scrapy/downloadermiddlewares/httpcompression.py"}], "after_files": [{"content": "import io\nimport warnings\nimport zlib\n\nfrom scrapy.exceptions import NotConfigured\nfrom scrapy.http import Response, TextResponse\nfrom scrapy.responsetypes import responsetypes\nfrom scrapy.utils.deprecate import ScrapyDeprecationWarning\nfrom scrapy.utils.gz import gunzip\n\n\nACCEPTED_ENCODINGS = [b'gzip', b'deflate']\n\ntry:\n import brotli\n ACCEPTED_ENCODINGS.append(b'br')\nexcept ImportError:\n pass\n\ntry:\n import zstandard\n ACCEPTED_ENCODINGS.append(b'zstd')\nexcept ImportError:\n pass\n\n\nclass HttpCompressionMiddleware:\n \"\"\"This middleware allows compressed (gzip, deflate) traffic to be\n sent/received from web sites\"\"\"\n def __init__(self, stats=None):\n self.stats = stats\n\n @classmethod\n def from_crawler(cls, crawler):\n if not crawler.settings.getbool('COMPRESSION_ENABLED'):\n raise NotConfigured\n try:\n return cls(stats=crawler.stats)\n except TypeError:\n warnings.warn(\n \"HttpCompressionMiddleware subclasses must either modify \"\n \"their '__init__' method to support a 'stats' parameter or \"\n \"reimplement the 'from_crawler' method.\",\n ScrapyDeprecationWarning,\n )\n result = cls()\n result.stats = crawler.stats\n return result\n\n def process_request(self, request, spider):\n request.headers.setdefault('Accept-Encoding',\n b\", \".join(ACCEPTED_ENCODINGS))\n\n def process_response(self, request, response, spider):\n\n if request.method == 'HEAD':\n return response\n if isinstance(response, Response):\n content_encoding = response.headers.getlist('Content-Encoding')\n if content_encoding:\n encoding = content_encoding.pop()\n decoded_body = self._decode(response.body, encoding.lower())\n if self.stats:\n self.stats.inc_value('httpcompression/response_bytes', len(decoded_body), spider=spider)\n self.stats.inc_value('httpcompression/response_count', spider=spider)\n respcls = responsetypes.from_args(\n headers=response.headers, url=response.url, body=decoded_body\n )\n kwargs = dict(cls=respcls, body=decoded_body)\n if issubclass(respcls, TextResponse):\n # force recalculating the encoding until we make sure the\n # responsetypes guessing is reliable\n kwargs['encoding'] = None\n response = response.replace(**kwargs)\n if not content_encoding:\n del response.headers['Content-Encoding']\n\n return response\n\n def _decode(self, body, encoding):\n if encoding == b'gzip' or encoding == b'x-gzip':\n body = gunzip(body)\n\n if encoding == b'deflate':\n try:\n body = zlib.decompress(body)\n except zlib.error:\n # ugly hack to work with raw deflate content that may\n # be sent by microsoft servers. 
For more information, see:\n # http://carsten.codimi.de/gzip.yaws/\n # http://www.port80software.com/200ok/archive/2005/10/31/868.aspx\n # http://www.gzip.org/zlib/zlib_faq.html#faq38\n body = zlib.decompress(body, -15)\n if encoding == b'br' and b'br' in ACCEPTED_ENCODINGS:\n body = brotli.decompress(body)\n if encoding == b'zstd' and b'zstd' in ACCEPTED_ENCODINGS:\n # Using its streaming API since its simple API could handle only cases\n # where there is content size data embedded in the frame\n reader = zstandard.ZstdDecompressor().stream_reader(io.BytesIO(body))\n body = reader.read()\n return body\n", "path": "scrapy/downloadermiddlewares/httpcompression.py"}]} | 1,644 | 471 |
gh_patches_debug_35340 | rasdani/github-patches | git_diff | microsoft__playwright-python-86 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update versions in README.md on Playwright roll
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `build_driver.py`
Content:
```
1 # Copyright (c) Microsoft Corporation.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import gzip
16 import os
17 import shutil
18 import subprocess
19
20 driver_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "driver")
21 package_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "playwright")
22 drivers_path = os.path.join(package_path, "drivers")
23
24 if os.path.exists(os.path.join(driver_path, "package-lock.json")):
25 os.remove(os.path.join(driver_path, "package-lock.json"))
26 if os.path.exists(os.path.join(driver_path, "node_modules")):
27 shutil.rmtree(os.path.join(driver_path, "node_modules"))
28 if os.path.exists(os.path.join(driver_path, "out")):
29 shutil.rmtree(os.path.join(driver_path, "out"))
30
31 subprocess.run("npm i", cwd=driver_path, shell=True)
32 subprocess.run("npm run bake", cwd=driver_path, shell=True)
33
34 for driver in ["driver-linux", "driver-macos", "driver-win.exe"]:
35 if os.path.exists(os.path.join(package_path, driver)):
36 os.remove(os.path.join(package_path, driver))
37
38 in_path = os.path.join(driver_path, "out", driver)
39 out_path = os.path.join(drivers_path, driver + ".gz")
40 with open(in_path, "rb") as f_in, gzip.open(out_path, "wb") as f_out:
41 shutil.copyfileobj(f_in, f_out)
42
43 shutil.copyfile(
44 os.path.join(driver_path, "node_modules", "playwright", "browsers.json"),
45 os.path.join(drivers_path, "browsers.json"),
46 )
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/build_driver.py b/build_driver.py
--- a/build_driver.py
+++ b/build_driver.py
@@ -14,33 +14,52 @@
import gzip
import os
+import re
import shutil
import subprocess
+from pathlib import Path
-driver_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "driver")
-package_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), "playwright")
-drivers_path = os.path.join(package_path, "drivers")
+_dirname = Path(os.path.dirname(os.path.abspath(__file__)))
-if os.path.exists(os.path.join(driver_path, "package-lock.json")):
- os.remove(os.path.join(driver_path, "package-lock.json"))
-if os.path.exists(os.path.join(driver_path, "node_modules")):
- shutil.rmtree(os.path.join(driver_path, "node_modules"))
-if os.path.exists(os.path.join(driver_path, "out")):
- shutil.rmtree(os.path.join(driver_path, "out"))
+driver_path = _dirname / "driver"
+package_path = _dirname / "playwright"
+drivers_path = package_path / "drivers"
+
+if (driver_path / "package-lock.json").exists():
+ os.remove(driver_path / "package-lock.json")
+if (driver_path / "node_modules").exists():
+ shutil.rmtree(driver_path / "node_modules")
+if (driver_path / "out").exists():
+ shutil.rmtree(driver_path / "out")
subprocess.run("npm i", cwd=driver_path, shell=True)
subprocess.run("npm run bake", cwd=driver_path, shell=True)
for driver in ["driver-linux", "driver-macos", "driver-win.exe"]:
- if os.path.exists(os.path.join(package_path, driver)):
- os.remove(os.path.join(package_path, driver))
+ if (package_path / driver).exists():
+ os.remove((package_path / driver))
- in_path = os.path.join(driver_path, "out", driver)
- out_path = os.path.join(drivers_path, driver + ".gz")
+ in_path = driver_path / "out" / driver
+ out_path = drivers_path / (driver + ".gz")
with open(in_path, "rb") as f_in, gzip.open(out_path, "wb") as f_out:
shutil.copyfileobj(f_in, f_out)
+node_modules_playwright = driver_path / "node_modules" / "playwright"
+
shutil.copyfile(
- os.path.join(driver_path, "node_modules", "playwright", "browsers.json"),
- os.path.join(drivers_path, "browsers.json"),
+ node_modules_playwright / "browsers.json", drivers_path / "browsers.json",
)
+
+upstream_readme = (node_modules_playwright / "README.md").read_text()
+pw_python_readme = (_dirname / "README.md").read_text()
+
+matches = re.findall(r"<!-- GEN:(.*?) -->(.*?)<!-- GEN:stop -->", upstream_readme)
+
+for key, value in matches:
+ pw_python_readme = re.sub(
+ rf"(<!-- GEN:{key} -->).*?(<!-- GEN:stop -->)",
+ f"<!-- GEN:{key} -->{value}<!-- GEN:stop -->",
+ pw_python_readme,
+ )
+
+(_dirname / "README.md").write_text(pw_python_readme)
| {"golden_diff": "diff --git a/build_driver.py b/build_driver.py\n--- a/build_driver.py\n+++ b/build_driver.py\n@@ -14,33 +14,52 @@\n \n import gzip\n import os\n+import re\n import shutil\n import subprocess\n+from pathlib import Path\n \n-driver_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"driver\")\n-package_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"playwright\")\n-drivers_path = os.path.join(package_path, \"drivers\")\n+_dirname = Path(os.path.dirname(os.path.abspath(__file__)))\n \n-if os.path.exists(os.path.join(driver_path, \"package-lock.json\")):\n- os.remove(os.path.join(driver_path, \"package-lock.json\"))\n-if os.path.exists(os.path.join(driver_path, \"node_modules\")):\n- shutil.rmtree(os.path.join(driver_path, \"node_modules\"))\n-if os.path.exists(os.path.join(driver_path, \"out\")):\n- shutil.rmtree(os.path.join(driver_path, \"out\"))\n+driver_path = _dirname / \"driver\"\n+package_path = _dirname / \"playwright\"\n+drivers_path = package_path / \"drivers\"\n+\n+if (driver_path / \"package-lock.json\").exists():\n+ os.remove(driver_path / \"package-lock.json\")\n+if (driver_path / \"node_modules\").exists():\n+ shutil.rmtree(driver_path / \"node_modules\")\n+if (driver_path / \"out\").exists():\n+ shutil.rmtree(driver_path / \"out\")\n \n subprocess.run(\"npm i\", cwd=driver_path, shell=True)\n subprocess.run(\"npm run bake\", cwd=driver_path, shell=True)\n \n for driver in [\"driver-linux\", \"driver-macos\", \"driver-win.exe\"]:\n- if os.path.exists(os.path.join(package_path, driver)):\n- os.remove(os.path.join(package_path, driver))\n+ if (package_path / driver).exists():\n+ os.remove((package_path / driver))\n \n- in_path = os.path.join(driver_path, \"out\", driver)\n- out_path = os.path.join(drivers_path, driver + \".gz\")\n+ in_path = driver_path / \"out\" / driver\n+ out_path = drivers_path / (driver + \".gz\")\n with open(in_path, \"rb\") as f_in, gzip.open(out_path, \"wb\") as f_out:\n shutil.copyfileobj(f_in, f_out)\n \n+node_modules_playwright = driver_path / \"node_modules\" / \"playwright\"\n+\n shutil.copyfile(\n- os.path.join(driver_path, \"node_modules\", \"playwright\", \"browsers.json\"),\n- os.path.join(drivers_path, \"browsers.json\"),\n+ node_modules_playwright / \"browsers.json\", drivers_path / \"browsers.json\",\n )\n+\n+upstream_readme = (node_modules_playwright / \"README.md\").read_text()\n+pw_python_readme = (_dirname / \"README.md\").read_text()\n+\n+matches = re.findall(r\"<!-- GEN:(.*?) 
-->(.*?)<!-- GEN:stop -->\", upstream_readme)\n+\n+for key, value in matches:\n+ pw_python_readme = re.sub(\n+ rf\"(<!-- GEN:{key} -->).*?(<!-- GEN:stop -->)\",\n+ f\"<!-- GEN:{key} -->{value}<!-- GEN:stop -->\",\n+ pw_python_readme,\n+ )\n+\n+(_dirname / \"README.md\").write_text(pw_python_readme)\n", "issue": "Update versions in README.md on Playwright roll\n\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport gzip\nimport os\nimport shutil\nimport subprocess\n\ndriver_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"driver\")\npackage_path = os.path.join(os.path.dirname(os.path.abspath(__file__)), \"playwright\")\ndrivers_path = os.path.join(package_path, \"drivers\")\n\nif os.path.exists(os.path.join(driver_path, \"package-lock.json\")):\n os.remove(os.path.join(driver_path, \"package-lock.json\"))\nif os.path.exists(os.path.join(driver_path, \"node_modules\")):\n shutil.rmtree(os.path.join(driver_path, \"node_modules\"))\nif os.path.exists(os.path.join(driver_path, \"out\")):\n shutil.rmtree(os.path.join(driver_path, \"out\"))\n\nsubprocess.run(\"npm i\", cwd=driver_path, shell=True)\nsubprocess.run(\"npm run bake\", cwd=driver_path, shell=True)\n\nfor driver in [\"driver-linux\", \"driver-macos\", \"driver-win.exe\"]:\n if os.path.exists(os.path.join(package_path, driver)):\n os.remove(os.path.join(package_path, driver))\n\n in_path = os.path.join(driver_path, \"out\", driver)\n out_path = os.path.join(drivers_path, driver + \".gz\")\n with open(in_path, \"rb\") as f_in, gzip.open(out_path, \"wb\") as f_out:\n shutil.copyfileobj(f_in, f_out)\n\nshutil.copyfile(\n os.path.join(driver_path, \"node_modules\", \"playwright\", \"browsers.json\"),\n os.path.join(drivers_path, \"browsers.json\"),\n)\n", "path": "build_driver.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nimport gzip\nimport os\nimport re\nimport shutil\nimport subprocess\nfrom pathlib import Path\n\n_dirname = Path(os.path.dirname(os.path.abspath(__file__)))\n\ndriver_path = _dirname / \"driver\"\npackage_path = _dirname / \"playwright\"\ndrivers_path = package_path / \"drivers\"\n\nif (driver_path / \"package-lock.json\").exists():\n os.remove(driver_path / \"package-lock.json\")\nif (driver_path / \"node_modules\").exists():\n shutil.rmtree(driver_path / \"node_modules\")\nif (driver_path / \"out\").exists():\n shutil.rmtree(driver_path / 
\"out\")\n\nsubprocess.run(\"npm i\", cwd=driver_path, shell=True)\nsubprocess.run(\"npm run bake\", cwd=driver_path, shell=True)\n\nfor driver in [\"driver-linux\", \"driver-macos\", \"driver-win.exe\"]:\n if (package_path / driver).exists():\n os.remove((package_path / driver))\n\n in_path = driver_path / \"out\" / driver\n out_path = drivers_path / (driver + \".gz\")\n with open(in_path, \"rb\") as f_in, gzip.open(out_path, \"wb\") as f_out:\n shutil.copyfileobj(f_in, f_out)\n\nnode_modules_playwright = driver_path / \"node_modules\" / \"playwright\"\n\nshutil.copyfile(\n node_modules_playwright / \"browsers.json\", drivers_path / \"browsers.json\",\n)\n\nupstream_readme = (node_modules_playwright / \"README.md\").read_text()\npw_python_readme = (_dirname / \"README.md\").read_text()\n\nmatches = re.findall(r\"<!-- GEN:(.*?) -->(.*?)<!-- GEN:stop -->\", upstream_readme)\n\nfor key, value in matches:\n pw_python_readme = re.sub(\n rf\"(<!-- GEN:{key} -->).*?(<!-- GEN:stop -->)\",\n f\"<!-- GEN:{key} -->{value}<!-- GEN:stop -->\",\n pw_python_readme,\n )\n\n(_dirname / \"README.md\").write_text(pw_python_readme)\n", "path": "build_driver.py"}]} | 808 | 733 |
gh_patches_debug_1213 | rasdani/github-patches | git_diff | scalableminds__webknossos-libs-312 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Convenience for wkcuber.api
To open/create a dataset with the cool new high-level API, the following code is required:
```python
from wkcuber.api.Dataset import WKDataset
from pathlib import Path
ds1 = WKDataset.create(Path("path") / "to" / "dataset1", scale=(128,128,128))
ds2 = WKDataset.open(Path("path") / "to" / "dataset2")
```
For one-off scripts, I think it could be a bit more convenient if we had an API like this:
```python
from wkcuber import WKDataset
ds1 = WKDataset.create("path/to/dataset1", scale=(128, 128, 128))
ds2 = WKDataset.open("path/to/dataset2")
```
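Part of this (the shorter import path) might be as simple as a re-export in the package root, sketched below; accepting plain strings on top of that would presumably just mean having `WKDataset.create`/`WKDataset.open` wrap their path argument in `pathlib.Path` before using it.
```python
# wkcuber/__init__.py (sketch): re-export so `from wkcuber import WKDataset` works
from .api.Dataset import WKDataset
```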
Any thoughts? @rschwanhold @jstriebel @philippotto
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wkcuber/__init__.py`
Content:
```
1 from .cubing import cubing
2 from .downsampling import downsample_mags
3 from .compress import compress_mag
4 from .metadata import write_webknossos_metadata
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wkcuber/__init__.py b/wkcuber/__init__.py
--- a/wkcuber/__init__.py
+++ b/wkcuber/__init__.py
@@ -1,4 +1,6 @@
+from .api.Dataset import WKDataset
from .cubing import cubing
from .downsampling import downsample_mags
from .compress import compress_mag
+from .mag import Mag
from .metadata import write_webknossos_metadata
| {"golden_diff": "diff --git a/wkcuber/__init__.py b/wkcuber/__init__.py\n--- a/wkcuber/__init__.py\n+++ b/wkcuber/__init__.py\n@@ -1,4 +1,6 @@\n+from .api.Dataset import WKDataset\n from .cubing import cubing\n from .downsampling import downsample_mags\n from .compress import compress_mag\n+from .mag import Mag\n from .metadata import write_webknossos_metadata\n", "issue": "Convenience for wkcuber.api\nTo open/create a dataset with the cool new high-level API the following code is required:\r\n\r\n```python\r\nfrom wkcuber.api.Dataset import WKDataset\r\nfrom pathlib import Path\r\n\r\nds1 = WKDataset.create(Path(\"path\") / \"to\" / \"dataset1\", scale=(128,128,128))\r\nds2 = WKDataset.open(Path(\"path\") / \"to\" / \"dataset2\")\r\n\r\n```\r\n\r\nFor one-off scripts, I think that could be a bit more convenient, if we had an API like this\r\n\r\n```python\r\nfrom wkcuber import WKDataset\r\n\r\nds1 = WKDataset.create(\"path/to/dataset1\", scale=(128, 128, 128))\r\nds2 = WKDataset.open(\"path/to/dataset2\")\r\n```\r\n\r\nAny thoughts? @rschwanhold @jstriebel @philippotto \r\n\n", "before_files": [{"content": "from .cubing import cubing\nfrom .downsampling import downsample_mags\nfrom .compress import compress_mag\nfrom .metadata import write_webknossos_metadata\n", "path": "wkcuber/__init__.py"}], "after_files": [{"content": "from .api.Dataset import WKDataset\nfrom .cubing import cubing\nfrom .downsampling import downsample_mags\nfrom .compress import compress_mag\nfrom .mag import Mag\nfrom .metadata import write_webknossos_metadata\n", "path": "wkcuber/__init__.py"}]} | 499 | 103 |
gh_patches_debug_15531 | rasdani/github-patches | git_diff | tensorflow__addons-2299 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Enable SSE4.2 and AVX support during build
So the pip-installed TF is not built with these instruction sets by default, but modern-ish CPUs do support them (roughly CPUs after 2012).
We could try this and see if there are any improvements in test times and weigh the benefits. If nothing else, we can add it as a flag for building from source. Currently TF-IO does this by default:
https://github.com/tensorflow/io/blob/master/.github/workflows/build.yml#L13
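If we go the opt-in route first, a minimal sketch of what this could look like in `configure.py` (the environment variable name and exact copts below are placeholders, not a decided interface, and MSVC would need `/arch:AVX2`-style flags instead):
```python
# hypothetical opt-in block for configure.py; gcc/clang-style flags only
if os.getenv("TF_ADDONS_NATIVE_OPT", "0") == "1":
    write("build --copt=-msse4.2")
    write("build --copt=-mavx")
```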
@perfinion do we know if this is on the roadmap for default TF installations?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `configure.py`
Content:
```
1 # Copyright 2020 The TensorFlow Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 # ==============================================================================
15 # Usage: python configure.py
16 #
17
18
19 import os
20 import pathlib
21 import platform
22 import logging
23
24 import tensorflow as tf
25
26 _TFA_BAZELRC = ".bazelrc"
27
28
29 # Writes variables to bazelrc file
30 def write(line):
31 with open(_TFA_BAZELRC, "a") as f:
32 f.write(line + "\n")
33
34
35 def write_action_env(var_name, var):
36 write('build --action_env {}="{}"'.format(var_name, var))
37
38
39 def is_macos():
40 return platform.system() == "Darwin"
41
42
43 def is_windows():
44 return platform.system() == "Windows"
45
46
47 def is_raspi_arm():
48 return os.uname()[4] == "armv7l"
49
50
51 def get_tf_header_dir():
52 import tensorflow as tf
53
54 tf_header_dir = tf.sysconfig.get_compile_flags()[0][2:]
55 if is_windows():
56 tf_header_dir = tf_header_dir.replace("\\", "/")
57 return tf_header_dir
58
59
60 def get_tf_shared_lib_dir():
61 import tensorflow as tf
62
63 # OS Specific parsing
64 if is_windows():
65 tf_shared_lib_dir = tf.sysconfig.get_compile_flags()[0][2:-7] + "python"
66 return tf_shared_lib_dir.replace("\\", "/")
67 elif is_raspi_arm():
68 return tf.sysconfig.get_compile_flags()[0][2:-7] + "python"
69 else:
70 return tf.sysconfig.get_link_flags()[0][2:]
71
72
73 # Converts the linkflag namespec to the full shared library name
74 def get_shared_lib_name():
75 import tensorflow as tf
76
77 namespec = tf.sysconfig.get_link_flags()
78 if is_macos():
79 # MacOS
80 return "lib" + namespec[1][2:] + ".dylib"
81 elif is_windows():
82 # Windows
83 return "_pywrap_tensorflow_internal.lib"
84 elif is_raspi_arm():
85 # The below command for linux would return an empty list
86 return "_pywrap_tensorflow_internal.so"
87 else:
88 # Linux
89 return namespec[1][3:]
90
91
92 def create_build_configuration():
93 print()
94 print("Configuring TensorFlow Addons to be built from source...")
95
96 if os.path.isfile(_TFA_BAZELRC):
97 os.remove(_TFA_BAZELRC)
98
99 logging.disable(logging.WARNING)
100
101 write_action_env("TF_HEADER_DIR", get_tf_header_dir())
102 write_action_env("TF_SHARED_LIBRARY_DIR", get_tf_shared_lib_dir())
103 write_action_env("TF_SHARED_LIBRARY_NAME", get_shared_lib_name())
104 write_action_env("TF_CXX11_ABI_FLAG", tf.sysconfig.CXX11_ABI_FLAG)
105
106 write("build --spawn_strategy=standalone")
107 write("build --strategy=Genrule=standalone")
108 write("build -c opt")
109
110 if is_windows():
111 write("build --config=windows")
112 write("build:windows --copt=/experimental:preprocessor")
113 write("build:windows --host_copt=/experimental:preprocessor")
114
115 if os.getenv("TF_NEED_CUDA", "0") == "1":
116 print("> Building GPU & CPU ops")
117 configure_cuda()
118 else:
119 print("> Building only CPU ops")
120
121 print()
122 print("Build configurations successfully written to", _TFA_BAZELRC, ":\n")
123 print(pathlib.Path(_TFA_BAZELRC).read_text())
124
125
126 def configure_cuda():
127 write_action_env("TF_NEED_CUDA", "1")
128 write_action_env(
129 "CUDA_TOOLKIT_PATH", os.getenv("CUDA_TOOLKIT_PATH", "/usr/local/cuda")
130 )
131 write_action_env(
132 "CUDNN_INSTALL_PATH",
133 os.getenv("CUDNN_INSTALL_PATH", "/usr/lib/x86_64-linux-gnu"),
134 )
135 write_action_env("TF_CUDA_VERSION", os.getenv("TF_CUDA_VERSION", "11"))
136 write_action_env("TF_CUDNN_VERSION", os.getenv("TF_CUDNN_VERSION", "8"))
137
138 write("test --config=cuda")
139 write("build --config=cuda")
140 write("build:cuda --define=using_cuda=true --define=using_cuda_nvcc=true")
141 write("build:cuda --crosstool_top=@local_config_cuda//crosstool:toolchain")
142
143
144 if __name__ == "__main__":
145 create_build_configuration()
146
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/configure.py b/configure.py
--- a/configure.py
+++ b/configure.py
@@ -44,6 +44,10 @@
return platform.system() == "Windows"
+def is_linux():
+ return platform.system() == "Linux"
+
+
def is_raspi_arm():
return os.uname()[4] == "armv7l"
@@ -111,6 +115,10 @@
write("build --config=windows")
write("build:windows --copt=/experimental:preprocessor")
write("build:windows --host_copt=/experimental:preprocessor")
+ write("build:windows --copt=/arch=AVX2")
+
+ if is_macos() or is_linux():
+ write("build --copt=-mavx2")
if os.getenv("TF_NEED_CUDA", "0") == "1":
print("> Building GPU & CPU ops")
| {"golden_diff": "diff --git a/configure.py b/configure.py\n--- a/configure.py\n+++ b/configure.py\n@@ -44,6 +44,10 @@\n return platform.system() == \"Windows\"\n \n \n+def is_linux():\n+ return platform.system() == \"Linux\"\n+\n+\n def is_raspi_arm():\n return os.uname()[4] == \"armv7l\"\n \n@@ -111,6 +115,10 @@\n write(\"build --config=windows\")\n write(\"build:windows --copt=/experimental:preprocessor\")\n write(\"build:windows --host_copt=/experimental:preprocessor\")\n+ write(\"build:windows --copt=/arch=AVX2\")\n+\n+ if is_macos() or is_linux():\n+ write(\"build --copt=-mavx2\")\n \n if os.getenv(\"TF_NEED_CUDA\", \"0\") == \"1\":\n print(\"> Building GPU & CPU ops\")\n", "issue": "Enable SSE4.2 and AVX support during build\nSo the pip installed TF does not support these instruction sets by default, but modern-ish CPUs do. (Roughly CPUs after 2012).\r\n\r\nWe could try this and see if there are any improvements in test times and weight the benefits. If nothing else we can add it as a flag for building from source. Currently TF-IO does this by default:\r\nhttps://github.com/tensorflow/io/blob/master/.github/workflows/build.yml#L13\r\n\r\n@perfinion do we know if this is on the roadmap for default TF installations?\n", "before_files": [{"content": "# Copyright 2020 The TensorFlow Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n# Usage: python configure.py\n#\n\n\nimport os\nimport pathlib\nimport platform\nimport logging\n\nimport tensorflow as tf\n\n_TFA_BAZELRC = \".bazelrc\"\n\n\n# Writes variables to bazelrc file\ndef write(line):\n with open(_TFA_BAZELRC, \"a\") as f:\n f.write(line + \"\\n\")\n\n\ndef write_action_env(var_name, var):\n write('build --action_env {}=\"{}\"'.format(var_name, var))\n\n\ndef is_macos():\n return platform.system() == \"Darwin\"\n\n\ndef is_windows():\n return platform.system() == \"Windows\"\n\n\ndef is_raspi_arm():\n return os.uname()[4] == \"armv7l\"\n\n\ndef get_tf_header_dir():\n import tensorflow as tf\n\n tf_header_dir = tf.sysconfig.get_compile_flags()[0][2:]\n if is_windows():\n tf_header_dir = tf_header_dir.replace(\"\\\\\", \"/\")\n return tf_header_dir\n\n\ndef get_tf_shared_lib_dir():\n import tensorflow as tf\n\n # OS Specific parsing\n if is_windows():\n tf_shared_lib_dir = tf.sysconfig.get_compile_flags()[0][2:-7] + \"python\"\n return tf_shared_lib_dir.replace(\"\\\\\", \"/\")\n elif is_raspi_arm():\n return tf.sysconfig.get_compile_flags()[0][2:-7] + \"python\"\n else:\n return tf.sysconfig.get_link_flags()[0][2:]\n\n\n# Converts the linkflag namespec to the full shared library name\ndef get_shared_lib_name():\n import tensorflow as tf\n\n namespec = tf.sysconfig.get_link_flags()\n if is_macos():\n # MacOS\n return \"lib\" + namespec[1][2:] + \".dylib\"\n elif is_windows():\n # Windows\n return \"_pywrap_tensorflow_internal.lib\"\n elif is_raspi_arm():\n # The below command for linux would return an empty list\n return 
\"_pywrap_tensorflow_internal.so\"\n else:\n # Linux\n return namespec[1][3:]\n\n\ndef create_build_configuration():\n print()\n print(\"Configuring TensorFlow Addons to be built from source...\")\n\n if os.path.isfile(_TFA_BAZELRC):\n os.remove(_TFA_BAZELRC)\n\n logging.disable(logging.WARNING)\n\n write_action_env(\"TF_HEADER_DIR\", get_tf_header_dir())\n write_action_env(\"TF_SHARED_LIBRARY_DIR\", get_tf_shared_lib_dir())\n write_action_env(\"TF_SHARED_LIBRARY_NAME\", get_shared_lib_name())\n write_action_env(\"TF_CXX11_ABI_FLAG\", tf.sysconfig.CXX11_ABI_FLAG)\n\n write(\"build --spawn_strategy=standalone\")\n write(\"build --strategy=Genrule=standalone\")\n write(\"build -c opt\")\n\n if is_windows():\n write(\"build --config=windows\")\n write(\"build:windows --copt=/experimental:preprocessor\")\n write(\"build:windows --host_copt=/experimental:preprocessor\")\n\n if os.getenv(\"TF_NEED_CUDA\", \"0\") == \"1\":\n print(\"> Building GPU & CPU ops\")\n configure_cuda()\n else:\n print(\"> Building only CPU ops\")\n\n print()\n print(\"Build configurations successfully written to\", _TFA_BAZELRC, \":\\n\")\n print(pathlib.Path(_TFA_BAZELRC).read_text())\n\n\ndef configure_cuda():\n write_action_env(\"TF_NEED_CUDA\", \"1\")\n write_action_env(\n \"CUDA_TOOLKIT_PATH\", os.getenv(\"CUDA_TOOLKIT_PATH\", \"/usr/local/cuda\")\n )\n write_action_env(\n \"CUDNN_INSTALL_PATH\",\n os.getenv(\"CUDNN_INSTALL_PATH\", \"/usr/lib/x86_64-linux-gnu\"),\n )\n write_action_env(\"TF_CUDA_VERSION\", os.getenv(\"TF_CUDA_VERSION\", \"11\"))\n write_action_env(\"TF_CUDNN_VERSION\", os.getenv(\"TF_CUDNN_VERSION\", \"8\"))\n\n write(\"test --config=cuda\")\n write(\"build --config=cuda\")\n write(\"build:cuda --define=using_cuda=true --define=using_cuda_nvcc=true\")\n write(\"build:cuda --crosstool_top=@local_config_cuda//crosstool:toolchain\")\n\n\nif __name__ == \"__main__\":\n create_build_configuration()\n", "path": "configure.py"}], "after_files": [{"content": "# Copyright 2020 The TensorFlow Authors. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n# ==============================================================================\n# Usage: python configure.py\n#\n\n\nimport os\nimport pathlib\nimport platform\nimport logging\n\nimport tensorflow as tf\n\n_TFA_BAZELRC = \".bazelrc\"\n\n\n# Writes variables to bazelrc file\ndef write(line):\n with open(_TFA_BAZELRC, \"a\") as f:\n f.write(line + \"\\n\")\n\n\ndef write_action_env(var_name, var):\n write('build --action_env {}=\"{}\"'.format(var_name, var))\n\n\ndef is_macos():\n return platform.system() == \"Darwin\"\n\n\ndef is_windows():\n return platform.system() == \"Windows\"\n\n\ndef is_linux():\n return platform.system() == \"Linux\"\n\n\ndef is_raspi_arm():\n return os.uname()[4] == \"armv7l\"\n\n\ndef get_tf_header_dir():\n import tensorflow as tf\n\n tf_header_dir = tf.sysconfig.get_compile_flags()[0][2:]\n if is_windows():\n tf_header_dir = tf_header_dir.replace(\"\\\\\", \"/\")\n return tf_header_dir\n\n\ndef get_tf_shared_lib_dir():\n import tensorflow as tf\n\n # OS Specific parsing\n if is_windows():\n tf_shared_lib_dir = tf.sysconfig.get_compile_flags()[0][2:-7] + \"python\"\n return tf_shared_lib_dir.replace(\"\\\\\", \"/\")\n elif is_raspi_arm():\n return tf.sysconfig.get_compile_flags()[0][2:-7] + \"python\"\n else:\n return tf.sysconfig.get_link_flags()[0][2:]\n\n\n# Converts the linkflag namespec to the full shared library name\ndef get_shared_lib_name():\n import tensorflow as tf\n\n namespec = tf.sysconfig.get_link_flags()\n if is_macos():\n # MacOS\n return \"lib\" + namespec[1][2:] + \".dylib\"\n elif is_windows():\n # Windows\n return \"_pywrap_tensorflow_internal.lib\"\n elif is_raspi_arm():\n # The below command for linux would return an empty list\n return \"_pywrap_tensorflow_internal.so\"\n else:\n # Linux\n return namespec[1][3:]\n\n\ndef create_build_configuration():\n print()\n print(\"Configuring TensorFlow Addons to be built from source...\")\n\n if os.path.isfile(_TFA_BAZELRC):\n os.remove(_TFA_BAZELRC)\n\n logging.disable(logging.WARNING)\n\n write_action_env(\"TF_HEADER_DIR\", get_tf_header_dir())\n write_action_env(\"TF_SHARED_LIBRARY_DIR\", get_tf_shared_lib_dir())\n write_action_env(\"TF_SHARED_LIBRARY_NAME\", get_shared_lib_name())\n write_action_env(\"TF_CXX11_ABI_FLAG\", tf.sysconfig.CXX11_ABI_FLAG)\n\n write(\"build --spawn_strategy=standalone\")\n write(\"build --strategy=Genrule=standalone\")\n write(\"build -c opt\")\n\n if is_windows():\n write(\"build --config=windows\")\n write(\"build:windows --copt=/experimental:preprocessor\")\n write(\"build:windows --host_copt=/experimental:preprocessor\")\n write(\"build:windows --copt=/arch=AVX2\")\n\n if is_macos() or is_linux():\n write(\"build --copt=-mavx2\")\n\n if os.getenv(\"TF_NEED_CUDA\", \"0\") == \"1\":\n print(\"> Building GPU & CPU ops\")\n configure_cuda()\n else:\n print(\"> Building only CPU ops\")\n\n print()\n print(\"Build configurations successfully written to\", _TFA_BAZELRC, \":\\n\")\n 
print(pathlib.Path(_TFA_BAZELRC).read_text())\n\n\ndef configure_cuda():\n write_action_env(\"TF_NEED_CUDA\", \"1\")\n write_action_env(\n \"CUDA_TOOLKIT_PATH\", os.getenv(\"CUDA_TOOLKIT_PATH\", \"/usr/local/cuda\")\n )\n write_action_env(\n \"CUDNN_INSTALL_PATH\",\n os.getenv(\"CUDNN_INSTALL_PATH\", \"/usr/lib/x86_64-linux-gnu\"),\n )\n write_action_env(\"TF_CUDA_VERSION\", os.getenv(\"TF_CUDA_VERSION\", \"11\"))\n write_action_env(\"TF_CUDNN_VERSION\", os.getenv(\"TF_CUDNN_VERSION\", \"8\"))\n\n write(\"test --config=cuda\")\n write(\"build --config=cuda\")\n write(\"build:cuda --define=using_cuda=true --define=using_cuda_nvcc=true\")\n write(\"build:cuda --crosstool_top=@local_config_cuda//crosstool:toolchain\")\n\n\nif __name__ == \"__main__\":\n create_build_configuration()\n", "path": "configure.py"}]} | 1,804 | 211 |
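The patch above gates the extra AVX2 compiler flags on the host platform. A rough, standalone sketch of that gating logic (helper names mirror the ones in configure.py and the flag strings are taken from the diff; this is illustrative, not the upstream code):

```python
import platform

def is_windows():
    return platform.system() == "Windows"

def is_macos():
    return platform.system() == "Darwin"

def is_linux():
    return platform.system() == "Linux"

def avx2_bazelrc_lines():
    # MSVC and GCC/Clang spell the AVX2 option differently,
    # so the flag has to be chosen per platform.
    if is_windows():
        return ["build:windows --copt=/arch=AVX2"]
    if is_macos() or is_linux():
        return ["build --copt=-mavx2"]
    return []

print("\n".join(avx2_bazelrc_lines()))
```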
gh_patches_debug_29909 | rasdani/github-patches | git_diff | nf-core__tools-2031 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bump minimum required Nextflow version
### Description of feature
The latest stable release brings lots of new features that we probably want to use at the module level (e.g. `bin` directories).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `nf_core/lint/readme.py`
Content:
```
1 import os
2 import re
3
4
5 def readme(self):
6 """Repository ``README.md`` tests
7
8 The ``README.md`` files for a project are very important and must meet some requirements:
9
10 * Nextflow badge
11
12 * If no Nextflow badge is found, a warning is given
13 * If a badge is found but the version doesn't match the minimum version in the config file, the test fails
14 * Example badge code:
15
16 .. code-block:: md
17
18 [](https://www.nextflow.io/)
19
20 * Bioconda badge
21
22 * If your pipeline contains a file called ``environment.yml`` in the root directory, a bioconda badge is required
23 * Required badge code:
24
25 .. code-block:: md
26
27 [](https://bioconda.github.io/)
28
29 .. note:: These badges are a markdown image ```` *inside* a markdown link ``[markdown image](<link URL>)``, so a bit fiddly to write.
30 """
31 passed = []
32 warned = []
33 failed = []
34
35 # Remove field that should be ignored according to the linting config
36 ignore_configs = self.lint_config.get("readme", [])
37
38 with open(os.path.join(self.wf_path, "README.md"), "r") as fh:
39 content = fh.read()
40
41 if "nextflow_badge" not in ignore_configs:
42 # Check that there is a readme badge showing the minimum required version of Nextflow
43 # [](https://www.nextflow.io/)
44 # and that it has the correct version
45 nf_badge_re = r"\[!\[Nextflow\]\(https://img\.shields\.io/badge/nextflow%20DSL2-!?(?:%E2%89%A5|%3E%3D)([\d\.]+)-23aa62\.svg\)\]\(https://www\.nextflow\.io/\)"
46 match = re.search(nf_badge_re, content)
47 if match:
48 nf_badge_version = match.group(1).strip("'\"")
49 try:
50 if nf_badge_version != self.minNextflowVersion:
51 raise AssertionError()
52 except (AssertionError, KeyError):
53 failed.append(
54 f"README Nextflow minimum version badge does not match config. Badge: `{nf_badge_version}`, "
55 f"Config: `{self.minNextflowVersion}`"
56 )
57 else:
58 passed.append(
59 f"README Nextflow minimum version badge matched config. Badge: `{nf_badge_version}`, "
60 f"Config: `{self.minNextflowVersion}`"
61 )
62 else:
63 warned.append("README did not have a Nextflow minimum version badge.")
64
65 # Check that the minimum version mentioned in the quick start section is consistent
66 # Looking for: "1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=21.10.3`)"
67 nf_version_re = r"1\.\s*Install\s*\[`Nextflow`\]\(https://www.nextflow.io/docs/latest/getstarted.html#installation\)\s*\(`>=(\d*\.\d*\.\d*)`\)"
68 match = re.search(nf_version_re, content)
69 if match:
70 nf_quickstart_version = match.group(1)
71 try:
72 if nf_quickstart_version != self.minNextflowVersion:
73 raise AssertionError()
74 except (AssertionError, KeyError):
75 failed.append(
76 f"README Nextflow minimium version in Quick Start section does not match config. README: `{nf_quickstart_version}`, Config `{self.minNextflowVersion}`"
77 )
78 else:
79 passed.append(
80 f"README Nextflow minimum version in Quick Start section matched config. README: `{nf_quickstart_version}`, Config: `{self.minNextflowVersion}`"
81 )
82 else:
83 warned.append("README did not have a Nextflow minimum version mentioned in Quick Start section.")
84
85 return {"passed": passed, "warned": warned, "failed": failed}
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/nf_core/lint/readme.py b/nf_core/lint/readme.py
--- a/nf_core/lint/readme.py
+++ b/nf_core/lint/readme.py
@@ -40,7 +40,7 @@
if "nextflow_badge" not in ignore_configs:
# Check that there is a readme badge showing the minimum required version of Nextflow
- # [](https://www.nextflow.io/)
+ # [](https://www.nextflow.io/)
# and that it has the correct version
nf_badge_re = r"\[!\[Nextflow\]\(https://img\.shields\.io/badge/nextflow%20DSL2-!?(?:%E2%89%A5|%3E%3D)([\d\.]+)-23aa62\.svg\)\]\(https://www\.nextflow\.io/\)"
match = re.search(nf_badge_re, content)
@@ -63,7 +63,7 @@
warned.append("README did not have a Nextflow minimum version badge.")
# Check that the minimum version mentioned in the quick start section is consistent
- # Looking for: "1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=21.10.3`)"
+ # Looking for: "1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=22.10.1`)"
nf_version_re = r"1\.\s*Install\s*\[`Nextflow`\]\(https://www.nextflow.io/docs/latest/getstarted.html#installation\)\s*\(`>=(\d*\.\d*\.\d*)`\)"
match = re.search(nf_version_re, content)
if match:
| {"golden_diff": "diff --git a/nf_core/lint/readme.py b/nf_core/lint/readme.py\n--- a/nf_core/lint/readme.py\n+++ b/nf_core/lint/readme.py\n@@ -40,7 +40,7 @@\n \n if \"nextflow_badge\" not in ignore_configs:\n # Check that there is a readme badge showing the minimum required version of Nextflow\n- # [](https://www.nextflow.io/)\n+ # [](https://www.nextflow.io/)\n # and that it has the correct version\n nf_badge_re = r\"\\[!\\[Nextflow\\]\\(https://img\\.shields\\.io/badge/nextflow%20DSL2-!?(?:%E2%89%A5|%3E%3D)([\\d\\.]+)-23aa62\\.svg\\)\\]\\(https://www\\.nextflow\\.io/\\)\"\n match = re.search(nf_badge_re, content)\n@@ -63,7 +63,7 @@\n warned.append(\"README did not have a Nextflow minimum version badge.\")\n \n # Check that the minimum version mentioned in the quick start section is consistent\n- # Looking for: \"1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=21.10.3`)\"\n+ # Looking for: \"1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=22.10.1`)\"\n nf_version_re = r\"1\\.\\s*Install\\s*\\[`Nextflow`\\]\\(https://www.nextflow.io/docs/latest/getstarted.html#installation\\)\\s*\\(`>=(\\d*\\.\\d*\\.\\d*)`\\)\"\n match = re.search(nf_version_re, content)\n if match:\n", "issue": "Bump minimum required Nextflow version\n### Description of feature\n\nLatest stable release brings lots of new features that we probably want to use at module level (eg. `bin` directories).\n", "before_files": [{"content": "import os\nimport re\n\n\ndef readme(self):\n \"\"\"Repository ``README.md`` tests\n\n The ``README.md`` files for a project are very important and must meet some requirements:\n\n * Nextflow badge\n\n * If no Nextflow badge is found, a warning is given\n * If a badge is found but the version doesn't match the minimum version in the config file, the test fails\n * Example badge code:\n\n .. code-block:: md\n\n [](https://www.nextflow.io/)\n\n * Bioconda badge\n\n * If your pipeline contains a file called ``environment.yml`` in the root directory, a bioconda badge is required\n * Required badge code:\n\n .. code-block:: md\n\n [](https://bioconda.github.io/)\n\n .. note:: These badges are a markdown image ```` *inside* a markdown link ``[markdown image](<link URL>)``, so a bit fiddly to write.\n \"\"\"\n passed = []\n warned = []\n failed = []\n\n # Remove field that should be ignored according to the linting config\n ignore_configs = self.lint_config.get(\"readme\", [])\n\n with open(os.path.join(self.wf_path, \"README.md\"), \"r\") as fh:\n content = fh.read()\n\n if \"nextflow_badge\" not in ignore_configs:\n # Check that there is a readme badge showing the minimum required version of Nextflow\n # [](https://www.nextflow.io/)\n # and that it has the correct version\n nf_badge_re = r\"\\[!\\[Nextflow\\]\\(https://img\\.shields\\.io/badge/nextflow%20DSL2-!?(?:%E2%89%A5|%3E%3D)([\\d\\.]+)-23aa62\\.svg\\)\\]\\(https://www\\.nextflow\\.io/\\)\"\n match = re.search(nf_badge_re, content)\n if match:\n nf_badge_version = match.group(1).strip(\"'\\\"\")\n try:\n if nf_badge_version != self.minNextflowVersion:\n raise AssertionError()\n except (AssertionError, KeyError):\n failed.append(\n f\"README Nextflow minimum version badge does not match config. Badge: `{nf_badge_version}`, \"\n f\"Config: `{self.minNextflowVersion}`\"\n )\n else:\n passed.append(\n f\"README Nextflow minimum version badge matched config. 
Badge: `{nf_badge_version}`, \"\n f\"Config: `{self.minNextflowVersion}`\"\n )\n else:\n warned.append(\"README did not have a Nextflow minimum version badge.\")\n\n # Check that the minimum version mentioned in the quick start section is consistent\n # Looking for: \"1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=21.10.3`)\"\n nf_version_re = r\"1\\.\\s*Install\\s*\\[`Nextflow`\\]\\(https://www.nextflow.io/docs/latest/getstarted.html#installation\\)\\s*\\(`>=(\\d*\\.\\d*\\.\\d*)`\\)\"\n match = re.search(nf_version_re, content)\n if match:\n nf_quickstart_version = match.group(1)\n try:\n if nf_quickstart_version != self.minNextflowVersion:\n raise AssertionError()\n except (AssertionError, KeyError):\n failed.append(\n f\"README Nextflow minimium version in Quick Start section does not match config. README: `{nf_quickstart_version}`, Config `{self.minNextflowVersion}`\"\n )\n else:\n passed.append(\n f\"README Nextflow minimum version in Quick Start section matched config. README: `{nf_quickstart_version}`, Config: `{self.minNextflowVersion}`\"\n )\n else:\n warned.append(\"README did not have a Nextflow minimum version mentioned in Quick Start section.\")\n\n return {\"passed\": passed, \"warned\": warned, \"failed\": failed}\n", "path": "nf_core/lint/readme.py"}], "after_files": [{"content": "import os\nimport re\n\n\ndef readme(self):\n \"\"\"Repository ``README.md`` tests\n\n The ``README.md`` files for a project are very important and must meet some requirements:\n\n * Nextflow badge\n\n * If no Nextflow badge is found, a warning is given\n * If a badge is found but the version doesn't match the minimum version in the config file, the test fails\n * Example badge code:\n\n .. code-block:: md\n\n [](https://www.nextflow.io/)\n\n * Bioconda badge\n\n * If your pipeline contains a file called ``environment.yml`` in the root directory, a bioconda badge is required\n * Required badge code:\n\n .. code-block:: md\n\n [](https://bioconda.github.io/)\n\n .. note:: These badges are a markdown image ```` *inside* a markdown link ``[markdown image](<link URL>)``, so a bit fiddly to write.\n \"\"\"\n passed = []\n warned = []\n failed = []\n\n # Remove field that should be ignored according to the linting config\n ignore_configs = self.lint_config.get(\"readme\", [])\n\n with open(os.path.join(self.wf_path, \"README.md\"), \"r\") as fh:\n content = fh.read()\n\n if \"nextflow_badge\" not in ignore_configs:\n # Check that there is a readme badge showing the minimum required version of Nextflow\n # [](https://www.nextflow.io/)\n # and that it has the correct version\n nf_badge_re = r\"\\[!\\[Nextflow\\]\\(https://img\\.shields\\.io/badge/nextflow%20DSL2-!?(?:%E2%89%A5|%3E%3D)([\\d\\.]+)-23aa62\\.svg\\)\\]\\(https://www\\.nextflow\\.io/\\)\"\n match = re.search(nf_badge_re, content)\n if match:\n nf_badge_version = match.group(1).strip(\"'\\\"\")\n try:\n if nf_badge_version != self.minNextflowVersion:\n raise AssertionError()\n except (AssertionError, KeyError):\n failed.append(\n f\"README Nextflow minimum version badge does not match config. Badge: `{nf_badge_version}`, \"\n f\"Config: `{self.minNextflowVersion}`\"\n )\n else:\n passed.append(\n f\"README Nextflow minimum version badge matched config. 
Badge: `{nf_badge_version}`, \"\n f\"Config: `{self.minNextflowVersion}`\"\n )\n else:\n warned.append(\"README did not have a Nextflow minimum version badge.\")\n\n # Check that the minimum version mentioned in the quick start section is consistent\n # Looking for: \"1. Install [`Nextflow`](https://www.nextflow.io/docs/latest/getstarted.html#installation) (`>=22.10.1`)\"\n nf_version_re = r\"1\\.\\s*Install\\s*\\[`Nextflow`\\]\\(https://www.nextflow.io/docs/latest/getstarted.html#installation\\)\\s*\\(`>=(\\d*\\.\\d*\\.\\d*)`\\)\"\n match = re.search(nf_version_re, content)\n if match:\n nf_quickstart_version = match.group(1)\n try:\n if nf_quickstart_version != self.minNextflowVersion:\n raise AssertionError()\n except (AssertionError, KeyError):\n failed.append(\n f\"README Nextflow minimium version in Quick Start section does not match config. README: `{nf_quickstart_version}`, Config `{self.minNextflowVersion}`\"\n )\n else:\n passed.append(\n f\"README Nextflow minimum version in Quick Start section matched config. README: `{nf_quickstart_version}`, Config: `{self.minNextflowVersion}`\"\n )\n else:\n warned.append(\"README did not have a Nextflow minimum version mentioned in Quick Start section.\")\n\n return {\"passed\": passed, \"warned\": warned, \"failed\": failed}\n", "path": "nf_core/lint/readme.py"}]} | 1,437 | 490 |
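The lint function above pulls the minimum Nextflow version out of the README badge with a regular expression and compares it to `minNextflowVersion` from the pipeline config. A self-contained illustration of just the matching step, using a made-up badge string (the regex is copied from readme.py):

```python
import re

# Hypothetical README badge; the version is only an example.
readme = (
    "[![Nextflow](https://img.shields.io/badge/nextflow%20DSL2-"
    "%E2%89%A522.10.1-23aa62.svg)](https://www.nextflow.io/)"
)

nf_badge_re = (
    r"\[!\[Nextflow\]\(https://img\.shields\.io/badge/nextflow%20DSL2-"
    r"!?(?:%E2%89%A5|%3E%3D)([\d\.]+)-23aa62\.svg\)\]\(https://www\.nextflow\.io/\)"
)

match = re.search(nf_badge_re, readme)
if match:
    print(match.group(1))  # -> 22.10.1, compared against minNextflowVersion
```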
gh_patches_debug_937 | rasdani/github-patches | git_diff | boto__boto-2166 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Invalid path check in euca-bundle-image
The -i option uses convert_file in boto/roboto/param.py to verify that the path passed is, indeed, a file. This fails unless the path specified is a plain regular file, which is an unnecessary restriction. Indeed, not requiring a regular file is sort of the whole point of Unix having a /dev in the first place: everything is a file.
The code calls os.path.isfile(value) in convert_file(). It should instead check os.path.exists(value) and not os.path.isdir(value). Directories are the only type of file that needs to be treated as special in the normal course of events.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `boto/roboto/param.py`
Content:
```
1 # Copyright (c) 2010 Mitch Garnaat http://garnaat.org/
2 # Copyright (c) 2010, Eucalyptus Systems, Inc.
3 #
4 # Permission is hereby granted, free of charge, to any person obtaining a
5 # copy of this software and associated documentation files (the
6 # "Software"), to deal in the Software without restriction, including
7 # without limitation the rights to use, copy, modify, merge, publish, dis-
8 # tribute, sublicense, and/or sell copies of the Software, and to permit
9 # persons to whom the Software is furnished to do so, subject to the fol-
10 # lowing conditions:
11 #
12 # The above copyright notice and this permission notice shall be included
13 # in all copies or substantial portions of the Software.
14 #
15 # THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
16 # OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-
17 # ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT
18 # SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
19 # WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20 # OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS
21 # IN THE SOFTWARE.
22
23 import os
24
25 class Converter(object):
26
27 @classmethod
28 def convert_string(cls, param, value):
29 # TODO: could do length validation, etc. here
30 if not isinstance(value, basestring):
31 raise ValueError
32 return value
33
34 @classmethod
35 def convert_integer(cls, param, value):
36 # TODO: could do range checking here
37 return int(value)
38
39 @classmethod
40 def convert_boolean(cls, param, value):
41 """
42 For command line arguments, just the presence
43 of the option means True so just return True
44 """
45 return True
46
47 @classmethod
48 def convert_file(cls, param, value):
49 if os.path.isfile(value):
50 return value
51 raise ValueError
52
53 @classmethod
54 def convert_dir(cls, param, value):
55 if os.path.isdir(value):
56 return value
57 raise ValueError
58
59 @classmethod
60 def convert(cls, param, value):
61 try:
62 if hasattr(cls, 'convert_'+param.ptype):
63 mthd = getattr(cls, 'convert_'+param.ptype)
64 else:
65 mthd = cls.convert_string
66 return mthd(param, value)
67 except:
68 raise ValidationException(param, '')
69
70 class Param(Converter):
71
72 def __init__(self, name=None, ptype='string', optional=True,
73 short_name=None, long_name=None, doc='',
74 metavar=None, cardinality=1, default=None,
75 choices=None, encoder=None, request_param=True):
76 self.name = name
77 self.ptype = ptype
78 self.optional = optional
79 self.short_name = short_name
80 self.long_name = long_name
81 self.doc = doc
82 self.metavar = metavar
83 self.cardinality = cardinality
84 self.default = default
85 self.choices = choices
86 self.encoder = encoder
87 self.request_param = request_param
88
89 @property
90 def optparse_long_name(self):
91 ln = None
92 if self.long_name:
93 ln = '--%s' % self.long_name
94 return ln
95
96 @property
97 def synopsis_long_name(self):
98 ln = None
99 if self.long_name:
100 ln = '--%s' % self.long_name
101 return ln
102
103 @property
104 def getopt_long_name(self):
105 ln = None
106 if self.long_name:
107 ln = '%s' % self.long_name
108 if self.ptype != 'boolean':
109 ln += '='
110 return ln
111
112 @property
113 def optparse_short_name(self):
114 sn = None
115 if self.short_name:
116 sn = '-%s' % self.short_name
117 return sn
118
119 @property
120 def synopsis_short_name(self):
121 sn = None
122 if self.short_name:
123 sn = '-%s' % self.short_name
124 return sn
125
126 @property
127 def getopt_short_name(self):
128 sn = None
129 if self.short_name:
130 sn = '%s' % self.short_name
131 if self.ptype != 'boolean':
132 sn += ':'
133 return sn
134
135 def convert(self, value):
136 """
137 Convert a string value as received in the command line
138 tools and convert to the appropriate type of value.
139 Raise a ValidationError if the value can't be converted.
140
141 :type value: str
142 :param value: The value to convert. This should always
143 be a string.
144 """
145 return super(Param, self).convert(self,value)
146
147
148
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/boto/roboto/param.py b/boto/roboto/param.py
--- a/boto/roboto/param.py
+++ b/boto/roboto/param.py
@@ -46,7 +46,7 @@
@classmethod
def convert_file(cls, param, value):
- if os.path.isfile(value):
+ if os.path.exists(value) and not os.path.isdir(value):
return value
raise ValueError
| {"golden_diff": "diff --git a/boto/roboto/param.py b/boto/roboto/param.py\n--- a/boto/roboto/param.py\n+++ b/boto/roboto/param.py\n@@ -46,7 +46,7 @@\n \n @classmethod\n def convert_file(cls, param, value):\n- if os.path.isfile(value):\n+ if os.path.exists(value) and not os.path.isdir(value):\n return value\n raise ValueError\n", "issue": "Invalid path check in euca-bundle-image\nThe -i option uses convert_file in boto/roboto/param.py to verify that the path passed is, indeed, a file. This fails unless the path specified is a boring old file which is not necessary. Indeed it not being necessary is sort of the whole point in unix having a /dev in the first place. Everything is a file.\n\nThe code calls os.path.isfile(value) in convert_file(). It should call os.path.exists(value) and not os.path.isdir(value). Directories are the only types of files which need to be considered special in the normal course of events.\n\n", "before_files": [{"content": "# Copyright (c) 2010 Mitch Garnaat http://garnaat.org/\n# Copyright (c) 2010, Eucalyptus Systems, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, including\n# without limitation the rights to use, copy, modify, merge, publish, dis-\n# tribute, sublicense, and/or sell copies of the Software, and to permit\n# persons to whom the Software is furnished to do so, subject to the fol-\n# lowing conditions:\n#\n# The above copyright notice and this permission notice shall be included\n# in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\n# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-\n# ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT\n# SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\n# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\n# IN THE SOFTWARE.\n\nimport os\n\nclass Converter(object):\n\n @classmethod\n def convert_string(cls, param, value):\n # TODO: could do length validation, etc. 
here\n if not isinstance(value, basestring):\n raise ValueError\n return value\n\n @classmethod\n def convert_integer(cls, param, value):\n # TODO: could do range checking here\n return int(value)\n\n @classmethod\n def convert_boolean(cls, param, value):\n \"\"\"\n For command line arguments, just the presence\n of the option means True so just return True\n \"\"\"\n return True\n\n @classmethod\n def convert_file(cls, param, value):\n if os.path.isfile(value):\n return value\n raise ValueError\n\n @classmethod\n def convert_dir(cls, param, value):\n if os.path.isdir(value):\n return value\n raise ValueError\n\n @classmethod\n def convert(cls, param, value):\n try:\n if hasattr(cls, 'convert_'+param.ptype):\n mthd = getattr(cls, 'convert_'+param.ptype)\n else:\n mthd = cls.convert_string\n return mthd(param, value)\n except:\n raise ValidationException(param, '')\n\nclass Param(Converter):\n\n def __init__(self, name=None, ptype='string', optional=True,\n short_name=None, long_name=None, doc='',\n metavar=None, cardinality=1, default=None,\n choices=None, encoder=None, request_param=True):\n self.name = name\n self.ptype = ptype\n self.optional = optional\n self.short_name = short_name\n self.long_name = long_name\n self.doc = doc\n self.metavar = metavar\n self.cardinality = cardinality\n self.default = default\n self.choices = choices\n self.encoder = encoder\n self.request_param = request_param\n\n @property\n def optparse_long_name(self):\n ln = None\n if self.long_name:\n ln = '--%s' % self.long_name\n return ln\n\n @property\n def synopsis_long_name(self):\n ln = None\n if self.long_name:\n ln = '--%s' % self.long_name\n return ln\n\n @property\n def getopt_long_name(self):\n ln = None\n if self.long_name:\n ln = '%s' % self.long_name\n if self.ptype != 'boolean':\n ln += '='\n return ln\n\n @property\n def optparse_short_name(self):\n sn = None\n if self.short_name:\n sn = '-%s' % self.short_name\n return sn\n\n @property\n def synopsis_short_name(self):\n sn = None\n if self.short_name:\n sn = '-%s' % self.short_name\n return sn\n\n @property\n def getopt_short_name(self):\n sn = None\n if self.short_name:\n sn = '%s' % self.short_name\n if self.ptype != 'boolean':\n sn += ':'\n return sn\n\n def convert(self, value):\n \"\"\"\n Convert a string value as received in the command line\n tools and convert to the appropriate type of value.\n Raise a ValidationError if the value can't be converted.\n\n :type value: str\n :param value: The value to convert. 
This should always\n be a string.\n \"\"\"\n return super(Param, self).convert(self,value)\n\n\n", "path": "boto/roboto/param.py"}], "after_files": [{"content": "# Copyright (c) 2010 Mitch Garnaat http://garnaat.org/\n# Copyright (c) 2010, Eucalyptus Systems, Inc.\n#\n# Permission is hereby granted, free of charge, to any person obtaining a\n# copy of this software and associated documentation files (the\n# \"Software\"), to deal in the Software without restriction, including\n# without limitation the rights to use, copy, modify, merge, publish, dis-\n# tribute, sublicense, and/or sell copies of the Software, and to permit\n# persons to whom the Software is furnished to do so, subject to the fol-\n# lowing conditions:\n#\n# The above copyright notice and this permission notice shall be included\n# in all copies or substantial portions of the Software.\n#\n# THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS\n# OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABIL-\n# ITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT\n# SHALL THE AUTHOR BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,\n# WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\n# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS\n# IN THE SOFTWARE.\n\nimport os\n\nclass Converter(object):\n\n @classmethod\n def convert_string(cls, param, value):\n # TODO: could do length validation, etc. here\n if not isinstance(value, basestring):\n raise ValueError\n return value\n\n @classmethod\n def convert_integer(cls, param, value):\n # TODO: could do range checking here\n return int(value)\n\n @classmethod\n def convert_boolean(cls, param, value):\n \"\"\"\n For command line arguments, just the presence\n of the option means True so just return True\n \"\"\"\n return True\n\n @classmethod\n def convert_file(cls, param, value):\n if os.path.exists(value) and not os.path.isdir(value):\n return value\n raise ValueError\n\n @classmethod\n def convert_dir(cls, param, value):\n if os.path.isdir(value):\n return value\n raise ValueError\n\n @classmethod\n def convert(cls, param, value):\n try:\n if hasattr(cls, 'convert_'+param.ptype):\n mthd = getattr(cls, 'convert_'+param.ptype)\n else:\n mthd = cls.convert_string\n return mthd(param, value)\n except:\n raise ValidationException(param, '')\n\nclass Param(Converter):\n\n def __init__(self, name=None, ptype='string', optional=True,\n short_name=None, long_name=None, doc='',\n metavar=None, cardinality=1, default=None,\n choices=None, encoder=None, request_param=True):\n self.name = name\n self.ptype = ptype\n self.optional = optional\n self.short_name = short_name\n self.long_name = long_name\n self.doc = doc\n self.metavar = metavar\n self.cardinality = cardinality\n self.default = default\n self.choices = choices\n self.encoder = encoder\n self.request_param = request_param\n\n @property\n def optparse_long_name(self):\n ln = None\n if self.long_name:\n ln = '--%s' % self.long_name\n return ln\n\n @property\n def synopsis_long_name(self):\n ln = None\n if self.long_name:\n ln = '--%s' % self.long_name\n return ln\n\n @property\n def getopt_long_name(self):\n ln = None\n if self.long_name:\n ln = '%s' % self.long_name\n if self.ptype != 'boolean':\n ln += '='\n return ln\n\n @property\n def optparse_short_name(self):\n sn = None\n if self.short_name:\n sn = '-%s' % self.short_name\n return sn\n\n @property\n def synopsis_short_name(self):\n sn = None\n if self.short_name:\n sn = '-%s' % 
self.short_name\n return sn\n\n @property\n def getopt_short_name(self):\n sn = None\n if self.short_name:\n sn = '%s' % self.short_name\n if self.ptype != 'boolean':\n sn += ':'\n return sn\n\n def convert(self, value):\n \"\"\"\n Convert a string value as received in the command line\n tools and convert to the appropriate type of value.\n Raise a ValidationError if the value can't be converted.\n\n :type value: str\n :param value: The value to convert. This should always\n be a string.\n \"\"\"\n return super(Param, self).convert(self,value)\n\n\n", "path": "boto/roboto/param.py"}]} | 1,750 | 102 |
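The one-line change above is easy to check in isolation. A quick sketch, outside boto, comparing the old and new predicates on a few Unix paths (the paths are illustrative and assume a typical Linux layout):

```python
import os

def old_check(value):
    # Original test: only plain regular files pass.
    return os.path.isfile(value)

def new_check(value):
    # Patched test: anything that exists and is not a directory passes.
    return os.path.exists(value) and not os.path.isdir(value)

for path in ("/dev/null", "/etc/hosts", "/tmp"):
    print(path, old_check(path), new_check(path))

# Expected output on a typical Linux system:
#   /dev/null False True    (device nodes are now accepted)
#   /etc/hosts True True
#   /tmp False False        (directories are still rejected)
```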
gh_patches_debug_526 | rasdani/github-patches | git_diff | Parsl__parsl-2302 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove parsl container bits
This issue is to remind us to remove Parsl container support and update the docs as soon as the funcX executor is integrated; we should switch to recommending container support through it.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docker/app1/app1.py`
Content:
```
1
2 def predict(list_items):
3 """Returns the double of the items"""
4 return [i*2 for i in list_items]
5
```
Path: `docker/app2/app2.py`
Content:
```
1
2 def predict(list_items):
3 """Returns items+10"""
4 return [i+10 for i in list_items]
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docker/app1/app1.py b/docker/app1/app1.py
deleted file mode 100644
--- a/docker/app1/app1.py
+++ /dev/null
@@ -1,4 +0,0 @@
-
-def predict(list_items):
- """Returns the double of the items"""
- return [i*2 for i in list_items]
diff --git a/docker/app2/app2.py b/docker/app2/app2.py
deleted file mode 100644
--- a/docker/app2/app2.py
+++ /dev/null
@@ -1,4 +0,0 @@
-
-def predict(list_items):
- """Returns items+10"""
- return [i+10 for i in list_items]
| {"golden_diff": "diff --git a/docker/app1/app1.py b/docker/app1/app1.py\ndeleted file mode 100644\n--- a/docker/app1/app1.py\n+++ /dev/null\n@@ -1,4 +0,0 @@\n-\n-def predict(list_items):\n- \"\"\"Returns the double of the items\"\"\"\n- return [i*2 for i in list_items]\ndiff --git a/docker/app2/app2.py b/docker/app2/app2.py\ndeleted file mode 100644\n--- a/docker/app2/app2.py\n+++ /dev/null\n@@ -1,4 +0,0 @@\n-\n-def predict(list_items):\n- \"\"\"Returns items+10\"\"\"\n- return [i+10 for i in list_items]\n", "issue": "Remove parsl container bits\nThis issue is to remind us to remove Parsl container support and update the docs as soon as the funcX executor is integrated-- we should switch to recommending container support through it.\n", "before_files": [{"content": "\ndef predict(list_items):\n \"\"\"Returns the double of the items\"\"\"\n return [i*2 for i in list_items]\n", "path": "docker/app1/app1.py"}, {"content": "\ndef predict(list_items):\n \"\"\"Returns items+10\"\"\"\n return [i+10 for i in list_items]\n", "path": "docker/app2/app2.py"}], "after_files": [{"content": null, "path": "docker/app1/app1.py"}, {"content": null, "path": "docker/app2/app2.py"}]} | 383 | 164 |
gh_patches_debug_1401 | rasdani/github-patches | git_diff | ktbyers__netmiko-1073 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Huawei vrpv8 commit func issue
After committing changes on Huawei VRPv8, the CLI on the device looks like this:
```
[~HUAWEI]dot1x enable
[*HUAWEI]snmp-agent sys-info version all
Warning: SNMPv1/SNMPv2c is not secure, and SNMPv3 in either authentication or privacy mode is recommended.
[*HUAWEI]commit
[~HUAWEI]
```
with the following code:
```
from netmiko import Netmiko
device = {
"host": "10.0.0.3",
"username": "yyy",
"password": "xxx",
"device_type": "huawei_vrpv8",
"session_log": "log_file2.txt"
}
config_commands = ['dot1x enable','snmp-agent sys-info version all']
net_connect = Netmiko(**device)
output = net_connect.send_config_set(config_commands,exit_config_mode=False)
output += net_connect.commit()
print(output)
```
I got this error:
```
Traceback (most recent call last):
File "/home/kafooo/PycharmProjects/nornir_scripts/venv/huawei_netmiko_test.py", line 18, in <module>
output2 = net_connect.commit()
File "/home/kafooo/PycharmProjects/nornir_scripts/venv/lib/python3.6/site-packages/netmiko/huawei/huawei_ssh.py", line 114, in commit
strip_command=False, delay_factor=delay_factor)
File "/home/kafooo/PycharmProjects/nornir_scripts/venv/lib/python3.6/site-packages/netmiko/base_connection.py", line 1206, in send_command_expect
return self.send_command(*args, **kwargs)
File "/home/kafooo/PycharmProjects/nornir_scripts/venv/lib/python3.6/site-packages/netmiko/base_connection.py", line 1188, in send_command
search_pattern))
OSError: Search pattern never detected in send_command_expect: \[\*HUAWEI\]
```
It looks like Netmiko is expecting `[*hostname]` after the commit, but in reality there is `[~hostname]` after the commit.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `netmiko/huawei/huawei_ssh.py`
Content:
```
1 from __future__ import print_function
2 from __future__ import unicode_literals
3 import time
4 import re
5 from netmiko.cisco_base_connection import CiscoSSHConnection
6 from netmiko import log
7
8
9 class HuaweiSSH(CiscoSSHConnection):
10 def session_preparation(self):
11 """Prepare the session after the connection has been established."""
12 self._test_channel_read()
13 self.set_base_prompt()
14 self.disable_paging(command="screen-length 0 temporary")
15 # Clear the read buffer
16 time.sleep(0.3 * self.global_delay_factor)
17 self.clear_buffer()
18
19 def config_mode(self, config_command="system-view"):
20 """Enter configuration mode."""
21 return super(HuaweiSSH, self).config_mode(config_command=config_command)
22
23 def exit_config_mode(self, exit_config="return", pattern=r">"):
24 """Exit configuration mode."""
25 return super(HuaweiSSH, self).exit_config_mode(
26 exit_config=exit_config, pattern=pattern
27 )
28
29 def check_config_mode(self, check_string="]"):
30 """Checks whether in configuration mode. Returns a boolean."""
31 return super(HuaweiSSH, self).check_config_mode(check_string=check_string)
32
33 def check_enable_mode(self, *args, **kwargs):
34 """Huawei has no enable mode."""
35 pass
36
37 def enable(self, *args, **kwargs):
38 """Huawei has no enable mode."""
39 return ""
40
41 def exit_enable_mode(self, *args, **kwargs):
42 """Huawei has no enable mode."""
43 return ""
44
45 def set_base_prompt(
46 self, pri_prompt_terminator=">", alt_prompt_terminator="]", delay_factor=1
47 ):
48 """
49 Sets self.base_prompt
50
51 Used as delimiter for stripping of trailing prompt in output.
52
53 Should be set to something that is general and applies in multiple contexts. For Comware
54 this will be the router prompt with < > or [ ] stripped off.
55
56 This will be set on logging in, but not when entering system-view
57 """
58 log.debug("In set_base_prompt")
59 delay_factor = self.select_delay_factor(delay_factor)
60 self.clear_buffer()
61 self.write_channel(self.RETURN)
62 time.sleep(0.5 * delay_factor)
63
64 prompt = self.read_channel()
65 prompt = self.normalize_linefeeds(prompt)
66
67 # If multiple lines in the output take the last line
68 prompt = prompt.split(self.RESPONSE_RETURN)[-1]
69 prompt = prompt.strip()
70
71 # Check that ends with a valid terminator character
72 if not prompt[-1] in (pri_prompt_terminator, alt_prompt_terminator):
73 raise ValueError("Router prompt not found: {0}".format(prompt))
74
75 # Strip off any leading HRP_. characters for USGv5 HA
76 prompt = re.sub(r"^HRP_.", "", prompt, flags=re.M)
77
78 # Strip off leading and trailing terminator
79 prompt = prompt[1:-1]
80 prompt = prompt.strip()
81 self.base_prompt = prompt
82 log.debug("prompt: {0}".format(self.base_prompt))
83
84 return self.base_prompt
85
86 def save_config(self, cmd="save", confirm=False, confirm_response=""):
87 """ Save Config for HuaweiSSH"""
88 return super(HuaweiSSH, self).save_config(cmd=cmd, confirm=confirm)
89
90
91 class HuaweiVrpv8SSH(HuaweiSSH):
92 def commit(self, comment="", delay_factor=1):
93 """
94 Commit the candidate configuration.
95
96 Commit the entered configuration. Raise an error and return the failure
97 if the commit fails.
98
99 default:
100 command_string = commit
101 comment:
102 command_string = commit comment <comment>
103
104 """
105 delay_factor = self.select_delay_factor(delay_factor)
106 error_marker = "Failed to generate committed config"
107 command_string = "commit"
108
109 if comment:
110 command_string += ' comment "{}"'.format(comment)
111
112 output = self.config_mode()
113 output += self.send_command_expect(
114 command_string,
115 strip_prompt=False,
116 strip_command=False,
117 delay_factor=delay_factor,
118 )
119 output += self.exit_config_mode()
120
121 if error_marker in output:
122 raise ValueError(
123 "Commit failed with following errors:\n\n{}".format(output)
124 )
125 return output
126
127 def save_config(self, cmd="", confirm=True, confirm_response=""):
128 """Not Implemented"""
129 raise NotImplementedError
130
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/netmiko/huawei/huawei_ssh.py b/netmiko/huawei/huawei_ssh.py
--- a/netmiko/huawei/huawei_ssh.py
+++ b/netmiko/huawei/huawei_ssh.py
@@ -115,6 +115,7 @@
strip_prompt=False,
strip_command=False,
delay_factor=delay_factor,
+ expect_string=r"]",
)
output += self.exit_config_mode()
| {"golden_diff": "diff --git a/netmiko/huawei/huawei_ssh.py b/netmiko/huawei/huawei_ssh.py\n--- a/netmiko/huawei/huawei_ssh.py\n+++ b/netmiko/huawei/huawei_ssh.py\n@@ -115,6 +115,7 @@\n strip_prompt=False,\n strip_command=False,\n delay_factor=delay_factor,\n+ expect_string=r\"]\",\n )\n output += self.exit_config_mode()\n", "issue": "Huawei vrpv8 commit func issue\nAfter commiting changes on huawei vrpv8, cli on devices look like this: \r\n```\r\n[~HUAWEI]dot1x enable\r\n[*HUAWEI]snmp-agent sys-info version all\r\nWarning: SNMPv1/SNMPv2c is not secure, and SNMPv3 in either authentication or privacy mode is recommended.\r\n[*HUAWEI]commit\r\n[~HUAWEI]\r\n```\r\n\r\n\r\nwith following code: \r\n\r\n```\r\nfrom netmiko import Netmiko\r\n\r\ndevice = {\r\n \"host\": \"10.0.0.3\",\r\n \"username\": \"yyy\",\r\n \"password\": \"xxx\",\r\n \"device_type\": \"huawei_vrpv8\",\r\n \"session_log\": \"log_file2.txt\"\r\n}\r\nconfig_commands = ['dot1x enable','snmp-agent sys-info version all']\r\nnet_connect = Netmiko(**device)\r\n\r\noutput = net_connect.send_config_set(config_commands,exit_config_mode=False)\r\noutput += net_connect.commit()\r\nprint(output)\r\n```\r\n\r\ni got this error: \r\n\r\n```\r\nTraceback (most recent call last):\r\n File \"/home/kafooo/PycharmProjects/nornir_scripts/venv/huawei_netmiko_test.py\", line 18, in <module>\r\n output2 = net_connect.commit()\r\n File \"/home/kafooo/PycharmProjects/nornir_scripts/venv/lib/python3.6/site-packages/netmiko/huawei/huawei_ssh.py\", line 114, in commit\r\n strip_command=False, delay_factor=delay_factor)\r\n File \"/home/kafooo/PycharmProjects/nornir_scripts/venv/lib/python3.6/site-packages/netmiko/base_connection.py\", line 1206, in send_command_expect\r\n return self.send_command(*args, **kwargs)\r\n File \"/home/kafooo/PycharmProjects/nornir_scripts/venv/lib/python3.6/site-packages/netmiko/base_connection.py\", line 1188, in send_command\r\n search_pattern))\r\nOSError: Search pattern never detected in send_command_expect: \\[\\*HUAWEI\\]\r\n```\r\n\r\n\r\nlooks like netmiko is expecting [*hostname] after commit, but in reality there is [~hostname] after commit\n", "before_files": [{"content": "from __future__ import print_function\nfrom __future__ import unicode_literals\nimport time\nimport re\nfrom netmiko.cisco_base_connection import CiscoSSHConnection\nfrom netmiko import log\n\n\nclass HuaweiSSH(CiscoSSHConnection):\n def session_preparation(self):\n \"\"\"Prepare the session after the connection has been established.\"\"\"\n self._test_channel_read()\n self.set_base_prompt()\n self.disable_paging(command=\"screen-length 0 temporary\")\n # Clear the read buffer\n time.sleep(0.3 * self.global_delay_factor)\n self.clear_buffer()\n\n def config_mode(self, config_command=\"system-view\"):\n \"\"\"Enter configuration mode.\"\"\"\n return super(HuaweiSSH, self).config_mode(config_command=config_command)\n\n def exit_config_mode(self, exit_config=\"return\", pattern=r\">\"):\n \"\"\"Exit configuration mode.\"\"\"\n return super(HuaweiSSH, self).exit_config_mode(\n exit_config=exit_config, pattern=pattern\n )\n\n def check_config_mode(self, check_string=\"]\"):\n \"\"\"Checks whether in configuration mode. 
Returns a boolean.\"\"\"\n return super(HuaweiSSH, self).check_config_mode(check_string=check_string)\n\n def check_enable_mode(self, *args, **kwargs):\n \"\"\"Huawei has no enable mode.\"\"\"\n pass\n\n def enable(self, *args, **kwargs):\n \"\"\"Huawei has no enable mode.\"\"\"\n return \"\"\n\n def exit_enable_mode(self, *args, **kwargs):\n \"\"\"Huawei has no enable mode.\"\"\"\n return \"\"\n\n def set_base_prompt(\n self, pri_prompt_terminator=\">\", alt_prompt_terminator=\"]\", delay_factor=1\n ):\n \"\"\"\n Sets self.base_prompt\n\n Used as delimiter for stripping of trailing prompt in output.\n\n Should be set to something that is general and applies in multiple contexts. For Comware\n this will be the router prompt with < > or [ ] stripped off.\n\n This will be set on logging in, but not when entering system-view\n \"\"\"\n log.debug(\"In set_base_prompt\")\n delay_factor = self.select_delay_factor(delay_factor)\n self.clear_buffer()\n self.write_channel(self.RETURN)\n time.sleep(0.5 * delay_factor)\n\n prompt = self.read_channel()\n prompt = self.normalize_linefeeds(prompt)\n\n # If multiple lines in the output take the last line\n prompt = prompt.split(self.RESPONSE_RETURN)[-1]\n prompt = prompt.strip()\n\n # Check that ends with a valid terminator character\n if not prompt[-1] in (pri_prompt_terminator, alt_prompt_terminator):\n raise ValueError(\"Router prompt not found: {0}\".format(prompt))\n\n # Strip off any leading HRP_. characters for USGv5 HA\n prompt = re.sub(r\"^HRP_.\", \"\", prompt, flags=re.M)\n\n # Strip off leading and trailing terminator\n prompt = prompt[1:-1]\n prompt = prompt.strip()\n self.base_prompt = prompt\n log.debug(\"prompt: {0}\".format(self.base_prompt))\n\n return self.base_prompt\n\n def save_config(self, cmd=\"save\", confirm=False, confirm_response=\"\"):\n \"\"\" Save Config for HuaweiSSH\"\"\"\n return super(HuaweiSSH, self).save_config(cmd=cmd, confirm=confirm)\n\n\nclass HuaweiVrpv8SSH(HuaweiSSH):\n def commit(self, comment=\"\", delay_factor=1):\n \"\"\"\n Commit the candidate configuration.\n\n Commit the entered configuration. 
Raise an error and return the failure\n if the commit fails.\n\n default:\n command_string = commit\n comment:\n command_string = commit comment <comment>\n\n \"\"\"\n delay_factor = self.select_delay_factor(delay_factor)\n error_marker = \"Failed to generate committed config\"\n command_string = \"commit\"\n\n if comment:\n command_string += ' comment \"{}\"'.format(comment)\n\n output = self.config_mode()\n output += self.send_command_expect(\n command_string,\n strip_prompt=False,\n strip_command=False,\n delay_factor=delay_factor,\n )\n output += self.exit_config_mode()\n\n if error_marker in output:\n raise ValueError(\n \"Commit failed with following errors:\\n\\n{}\".format(output)\n )\n return output\n\n def save_config(self, cmd=\"\", confirm=True, confirm_response=\"\"):\n \"\"\"Not Implemented\"\"\"\n raise NotImplementedError\n", "path": "netmiko/huawei/huawei_ssh.py"}], "after_files": [{"content": "from __future__ import print_function\nfrom __future__ import unicode_literals\nimport time\nimport re\nfrom netmiko.cisco_base_connection import CiscoSSHConnection\nfrom netmiko import log\n\n\nclass HuaweiSSH(CiscoSSHConnection):\n def session_preparation(self):\n \"\"\"Prepare the session after the connection has been established.\"\"\"\n self._test_channel_read()\n self.set_base_prompt()\n self.disable_paging(command=\"screen-length 0 temporary\")\n # Clear the read buffer\n time.sleep(0.3 * self.global_delay_factor)\n self.clear_buffer()\n\n def config_mode(self, config_command=\"system-view\"):\n \"\"\"Enter configuration mode.\"\"\"\n return super(HuaweiSSH, self).config_mode(config_command=config_command)\n\n def exit_config_mode(self, exit_config=\"return\", pattern=r\">\"):\n \"\"\"Exit configuration mode.\"\"\"\n return super(HuaweiSSH, self).exit_config_mode(\n exit_config=exit_config, pattern=pattern\n )\n\n def check_config_mode(self, check_string=\"]\"):\n \"\"\"Checks whether in configuration mode. Returns a boolean.\"\"\"\n return super(HuaweiSSH, self).check_config_mode(check_string=check_string)\n\n def check_enable_mode(self, *args, **kwargs):\n \"\"\"Huawei has no enable mode.\"\"\"\n pass\n\n def enable(self, *args, **kwargs):\n \"\"\"Huawei has no enable mode.\"\"\"\n return \"\"\n\n def exit_enable_mode(self, *args, **kwargs):\n \"\"\"Huawei has no enable mode.\"\"\"\n return \"\"\n\n def set_base_prompt(\n self, pri_prompt_terminator=\">\", alt_prompt_terminator=\"]\", delay_factor=1\n ):\n \"\"\"\n Sets self.base_prompt\n\n Used as delimiter for stripping of trailing prompt in output.\n\n Should be set to something that is general and applies in multiple contexts. For Comware\n this will be the router prompt with < > or [ ] stripped off.\n\n This will be set on logging in, but not when entering system-view\n \"\"\"\n log.debug(\"In set_base_prompt\")\n delay_factor = self.select_delay_factor(delay_factor)\n self.clear_buffer()\n self.write_channel(self.RETURN)\n time.sleep(0.5 * delay_factor)\n\n prompt = self.read_channel()\n prompt = self.normalize_linefeeds(prompt)\n\n # If multiple lines in the output take the last line\n prompt = prompt.split(self.RESPONSE_RETURN)[-1]\n prompt = prompt.strip()\n\n # Check that ends with a valid terminator character\n if not prompt[-1] in (pri_prompt_terminator, alt_prompt_terminator):\n raise ValueError(\"Router prompt not found: {0}\".format(prompt))\n\n # Strip off any leading HRP_. 
characters for USGv5 HA\n prompt = re.sub(r\"^HRP_.\", \"\", prompt, flags=re.M)\n\n # Strip off leading and trailing terminator\n prompt = prompt[1:-1]\n prompt = prompt.strip()\n self.base_prompt = prompt\n log.debug(\"prompt: {0}\".format(self.base_prompt))\n\n return self.base_prompt\n\n def save_config(self, cmd=\"save\", confirm=False, confirm_response=\"\"):\n \"\"\" Save Config for HuaweiSSH\"\"\"\n return super(HuaweiSSH, self).save_config(cmd=cmd, confirm=confirm)\n\n\nclass HuaweiVrpv8SSH(HuaweiSSH):\n def commit(self, comment=\"\", delay_factor=1):\n \"\"\"\n Commit the candidate configuration.\n\n Commit the entered configuration. Raise an error and return the failure\n if the commit fails.\n\n default:\n command_string = commit\n comment:\n command_string = commit comment <comment>\n\n \"\"\"\n delay_factor = self.select_delay_factor(delay_factor)\n error_marker = \"Failed to generate committed config\"\n command_string = \"commit\"\n\n if comment:\n command_string += ' comment \"{}\"'.format(comment)\n\n output = self.config_mode()\n output += self.send_command_expect(\n command_string,\n strip_prompt=False,\n strip_command=False,\n delay_factor=delay_factor,\n expect_string=r\"]\",\n )\n output += self.exit_config_mode()\n\n if error_marker in output:\n raise ValueError(\n \"Commit failed with following errors:\\n\\n{}\".format(output)\n )\n return output\n\n def save_config(self, cmd=\"\", confirm=True, confirm_response=\"\"):\n \"\"\"Not Implemented\"\"\"\n raise NotImplementedError\n", "path": "netmiko/huawei/huawei_ssh.py"}]} | 1,987 | 104 |
gh_patches_debug_17030 | rasdani/github-patches | git_diff | apache__tvm-6499 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[uTVM] Use an alternative CRC Library
The 3rdparty crc library introduced in https://github.com/apache/incubator-tvm/pull/6334 has a license problem.
We will need to replace it with a new implementation or an alternative library.
--- END ISSUE ---
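For context, the crc16 code being replaced supplies a single CRC-16 checksum routine used for packet integrity checks in the device-side RPC code. As a minimal sketch of what such a routine computes — assuming the common CRC-16/CCITT-FALSE parameters here, which may not match the exact variant the runtime uses, and noting that the real replacement links C sources rather than Python — it can be written bitwise as:

```python
def crc16_ccitt_false(data: bytes) -> int:
    """Bitwise CRC-16/CCITT-FALSE: poly 0x1021, init 0xFFFF, no reflection, no final XOR."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc


if __name__ == "__main__":
    # Standard check value for this variant over the ASCII string "123456789".
    assert crc16_ccitt_false(b"123456789") == 0x29B1
```

Whichever library is swapped in only has to agree with the other end of the link on the CRC variant; a table-driven version is the usual speed optimization.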
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/tvm/micro/build.py`
Content:
```
1 # Licensed to the Apache Software Foundation (ASF) under one
2 # or more contributor license agreements. See the NOTICE file
3 # distributed with this work for additional information
4 # regarding copyright ownership. The ASF licenses this file
5 # to you under the Apache License, Version 2.0 (the
6 # "License"); you may not use this file except in compliance
7 # with the License. You may obtain a copy of the License at
8 #
9 # http://www.apache.org/licenses/LICENSE-2.0
10 #
11 # Unless required by applicable law or agreed to in writing,
12 # software distributed under the License is distributed on an
13 # "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
14 # KIND, either express or implied. See the License for the
15 # specific language governing permissions and limitations
16 # under the License.
17
18 """Defines top-level glue functions for building microTVM artifacts."""
19
20 import copy
21 import logging
22 import os
23 import re
24 from tvm.contrib import util
25
26
27 _LOG = logging.getLogger(__name__)
28
29
30 class Workspace:
31 """Defines helper functions for manipulating temporary compilation workspaces."""
32
33 def __init__(self, root=None, debug=False):
34 if debug or root is not None:
35 with util.TempDirectory.set_keep_for_debug():
36 self.tempdir = util.tempdir(custom_path=root)
37 _LOG.info("Created debug mode workspace at: %s", self.tempdir.temp_dir)
38 else:
39 self.tempdir = util.tempdir()
40
41 def relpath(self, path):
42 return self.tempdir.relpath(path)
43
44 def listdir(self):
45 return self.tempdir.listdir()
46
47 @property
48 def path(self):
49 return self.tempdir.temp_dir
50
51
52 # Required C runtime libraries, in link order.
53 CRT_RUNTIME_LIB_NAMES = ["utvm_rpc_server", "utvm_rpc_common", "common"]
54
55
56 TVM_ROOT_DIR = os.path.realpath(os.path.join(os.path.dirname(__file__), "..", "..", ".."))
57
58
59 CRT_ROOT_DIR = os.path.join(TVM_ROOT_DIR, "src", "runtime", "crt")
60
61
62 RUNTIME_LIB_SRC_DIRS = [os.path.join(CRT_ROOT_DIR, n) for n in CRT_RUNTIME_LIB_NAMES] + [
63 os.path.join(
64 TVM_ROOT_DIR,
65 "3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/TARGET_SDK_11/" "libraries/crc16",
66 )
67 ]
68
69
70 RUNTIME_SRC_REGEX = re.compile(r"^.*\.cc?$", re.IGNORECASE)
71
72
73 _CRT_DEFAULT_OPTIONS = {
74 "ccflags": ["-std=c++11"],
75 "ldflags": ["-std=gnu++14"],
76 "include_dirs": [
77 f"{TVM_ROOT_DIR}/include",
78 f"{TVM_ROOT_DIR}/3rdparty/dlpack/include",
79 f"{TVM_ROOT_DIR}/3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/"
80 "TARGET_SDK_11/libraries/crc16/",
81 f"{TVM_ROOT_DIR}/3rdparty/dmlc-core/include",
82 f"{CRT_ROOT_DIR}/include",
83 ],
84 "profile": {"common": ["-Wno-unused-variable"]},
85 }
86
87
88 def default_options(target_include_dir):
89 """Return default opts passed to Compile commands."""
90 bin_opts = copy.deepcopy(_CRT_DEFAULT_OPTIONS)
91 bin_opts["include_dirs"].append(target_include_dir)
92 lib_opts = copy.deepcopy(bin_opts)
93 lib_opts["profile"]["common"].append("-Werror")
94 lib_opts["cflags"] = ["-Wno-error=incompatible-pointer-types"]
95 return {"bin_opts": bin_opts, "lib_opts": lib_opts}
96
97
98 def build_static_runtime(workspace, compiler, module, lib_opts=None, bin_opts=None):
99 """Build the on-device runtime, statically linking the given modules.
100
101 Parameters
102 ----------
103 compiler : tvm.micro.Compiler
104 Compiler instance used to build the runtime.
105
106 module : IRModule
107 Module to statically link.
108
109 lib_opts : dict
110 Extra kwargs passed to library(),
111
112 bin_opts : dict
113 Extra kwargs passed to binary(),
114
115 Returns
116 -------
117 MicroBinary :
118 The compiled runtime.
119 """
120 lib_opts = _CRT_DEFAULT_OPTIONS if lib_opts is None else lib_opts
121 bin_opts = _CRT_DEFAULT_OPTIONS if bin_opts is None else bin_opts
122
123 mod_build_dir = workspace.relpath(os.path.join("build", "module"))
124 os.makedirs(mod_build_dir)
125 mod_src_dir = workspace.relpath(os.path.join("src", "module"))
126 os.makedirs(mod_src_dir)
127 mod_src_path = os.path.join(mod_src_dir, "module.c")
128 module.save(mod_src_path, "cc")
129
130 libs = []
131 for lib_src_dir in RUNTIME_LIB_SRC_DIRS:
132 lib_name = os.path.basename(lib_src_dir)
133 lib_build_dir = workspace.relpath(f"build/{lib_name}")
134 os.makedirs(lib_build_dir)
135
136 lib_srcs = []
137 for p in os.listdir(lib_src_dir):
138 if RUNTIME_SRC_REGEX.match(p):
139 lib_srcs.append(os.path.join(lib_src_dir, p))
140
141 libs.append(compiler.library(lib_build_dir, lib_srcs, lib_opts))
142
143 libs.append(compiler.library(mod_build_dir, [mod_src_path], lib_opts))
144
145 runtime_build_dir = workspace.relpath(f"build/runtime")
146 os.makedirs(runtime_build_dir)
147 return compiler.binary(runtime_build_dir, libs, bin_opts)
148
```
--- END FILES ---
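For orientation, the helpers above are typically wired together roughly as follows. This is a sketch only: the `compiler` object (a `tvm.micro.Compiler` implementation, per the docstring) and `c_mod` (a C-source `runtime.Module`, e.g. produced by `relay.build` with a C target) are assumed to be supplied by the caller, and `target_include_dir` is assumed to hold the target's `crt_config.h`.

```python
from tvm.micro.build import Workspace, default_options, build_static_runtime


def build_runtime_for(compiler, c_mod, target_include_dir):
    """Sketch: compile the CRT libraries plus one generated C module into a MicroBinary."""
    workspace = Workspace(debug=True)           # debug=True keeps temp dirs around for inspection
    opts = default_options(target_include_dir)  # adds the target's include dir to the CRT defaults
    return build_static_runtime(
        workspace,
        compiler,
        c_mod,
        lib_opts=opts["lib_opts"],
        bin_opts=opts["bin_opts"],
    )
```

Each CRT library (and the generated module) gets its own build directory under the workspace, which is what lets them be compiled with separate flags before being linked into the final binary.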
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/python/tvm/micro/build.py b/python/tvm/micro/build.py
--- a/python/tvm/micro/build.py
+++ b/python/tvm/micro/build.py
@@ -60,10 +60,7 @@
RUNTIME_LIB_SRC_DIRS = [os.path.join(CRT_ROOT_DIR, n) for n in CRT_RUNTIME_LIB_NAMES] + [
- os.path.join(
- TVM_ROOT_DIR,
- "3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/TARGET_SDK_11/" "libraries/crc16",
- )
+ os.path.join(TVM_ROOT_DIR, "3rdparty/libcrc/src")
]
@@ -76,8 +73,7 @@
"include_dirs": [
f"{TVM_ROOT_DIR}/include",
f"{TVM_ROOT_DIR}/3rdparty/dlpack/include",
- f"{TVM_ROOT_DIR}/3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/"
- "TARGET_SDK_11/libraries/crc16/",
+ f"{TVM_ROOT_DIR}/3rdparty/libcrc/include",
f"{TVM_ROOT_DIR}/3rdparty/dmlc-core/include",
f"{CRT_ROOT_DIR}/include",
],
| {"golden_diff": "diff --git a/python/tvm/micro/build.py b/python/tvm/micro/build.py\n--- a/python/tvm/micro/build.py\n+++ b/python/tvm/micro/build.py\n@@ -60,10 +60,7 @@\n \n \n RUNTIME_LIB_SRC_DIRS = [os.path.join(CRT_ROOT_DIR, n) for n in CRT_RUNTIME_LIB_NAMES] + [\n- os.path.join(\n- TVM_ROOT_DIR,\n- \"3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/TARGET_SDK_11/\" \"libraries/crc16\",\n- )\n+ os.path.join(TVM_ROOT_DIR, \"3rdparty/libcrc/src\")\n ]\n \n \n@@ -76,8 +73,7 @@\n \"include_dirs\": [\n f\"{TVM_ROOT_DIR}/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/dlpack/include\",\n- f\"{TVM_ROOT_DIR}/3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/\"\n- \"TARGET_SDK_11/libraries/crc16/\",\n+ f\"{TVM_ROOT_DIR}/3rdparty/libcrc/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/dmlc-core/include\",\n f\"{CRT_ROOT_DIR}/include\",\n ],\n", "issue": "[uTVM] Use an alternative CRC Library\nThe 3rdparty crc library introduced in https://github.com/apache/incubator-tvm/pull/6334 has a license problem.\r\n\r\n\r\nWe will need to replace it with a new impl or an alternative library\r\n\n", "before_files": [{"content": "# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. 
See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\n\"\"\"Defines top-level glue functions for building microTVM artifacts.\"\"\"\n\nimport copy\nimport logging\nimport os\nimport re\nfrom tvm.contrib import util\n\n\n_LOG = logging.getLogger(__name__)\n\n\nclass Workspace:\n \"\"\"Defines helper functions for manipulating temporary compilation workspaces.\"\"\"\n\n def __init__(self, root=None, debug=False):\n if debug or root is not None:\n with util.TempDirectory.set_keep_for_debug():\n self.tempdir = util.tempdir(custom_path=root)\n _LOG.info(\"Created debug mode workspace at: %s\", self.tempdir.temp_dir)\n else:\n self.tempdir = util.tempdir()\n\n def relpath(self, path):\n return self.tempdir.relpath(path)\n\n def listdir(self):\n return self.tempdir.listdir()\n\n @property\n def path(self):\n return self.tempdir.temp_dir\n\n\n# Required C runtime libraries, in link order.\nCRT_RUNTIME_LIB_NAMES = [\"utvm_rpc_server\", \"utvm_rpc_common\", \"common\"]\n\n\nTVM_ROOT_DIR = os.path.realpath(os.path.join(os.path.dirname(__file__), \"..\", \"..\", \"..\"))\n\n\nCRT_ROOT_DIR = os.path.join(TVM_ROOT_DIR, \"src\", \"runtime\", \"crt\")\n\n\nRUNTIME_LIB_SRC_DIRS = [os.path.join(CRT_ROOT_DIR, n) for n in CRT_RUNTIME_LIB_NAMES] + [\n os.path.join(\n TVM_ROOT_DIR,\n \"3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/TARGET_SDK_11/\" \"libraries/crc16\",\n )\n]\n\n\nRUNTIME_SRC_REGEX = re.compile(r\"^.*\\.cc?$\", re.IGNORECASE)\n\n\n_CRT_DEFAULT_OPTIONS = {\n \"ccflags\": [\"-std=c++11\"],\n \"ldflags\": [\"-std=gnu++14\"],\n \"include_dirs\": [\n f\"{TVM_ROOT_DIR}/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/dlpack/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/mbed-os/targets/TARGET_NORDIC/TARGET_NRF5x/\"\n \"TARGET_SDK_11/libraries/crc16/\",\n f\"{TVM_ROOT_DIR}/3rdparty/dmlc-core/include\",\n f\"{CRT_ROOT_DIR}/include\",\n ],\n \"profile\": {\"common\": [\"-Wno-unused-variable\"]},\n}\n\n\ndef default_options(target_include_dir):\n \"\"\"Return default opts passed to Compile commands.\"\"\"\n bin_opts = copy.deepcopy(_CRT_DEFAULT_OPTIONS)\n bin_opts[\"include_dirs\"].append(target_include_dir)\n lib_opts = copy.deepcopy(bin_opts)\n lib_opts[\"profile\"][\"common\"].append(\"-Werror\")\n lib_opts[\"cflags\"] = [\"-Wno-error=incompatible-pointer-types\"]\n return {\"bin_opts\": bin_opts, \"lib_opts\": lib_opts}\n\n\ndef build_static_runtime(workspace, compiler, module, lib_opts=None, bin_opts=None):\n \"\"\"Build the on-device runtime, statically linking the given modules.\n\n Parameters\n ----------\n compiler : tvm.micro.Compiler\n Compiler instance used to build the runtime.\n\n module : IRModule\n Module to statically link.\n\n lib_opts : dict\n Extra kwargs passed to library(),\n\n bin_opts : dict\n Extra kwargs passed to binary(),\n\n Returns\n -------\n MicroBinary :\n The compiled runtime.\n \"\"\"\n lib_opts = _CRT_DEFAULT_OPTIONS if lib_opts is None else lib_opts\n bin_opts = _CRT_DEFAULT_OPTIONS if bin_opts is None else bin_opts\n\n mod_build_dir = workspace.relpath(os.path.join(\"build\", \"module\"))\n os.makedirs(mod_build_dir)\n mod_src_dir = workspace.relpath(os.path.join(\"src\", \"module\"))\n os.makedirs(mod_src_dir)\n mod_src_path = os.path.join(mod_src_dir, \"module.c\")\n module.save(mod_src_path, \"cc\")\n\n libs = []\n for lib_src_dir in RUNTIME_LIB_SRC_DIRS:\n lib_name = os.path.basename(lib_src_dir)\n lib_build_dir = workspace.relpath(f\"build/{lib_name}\")\n os.makedirs(lib_build_dir)\n\n lib_srcs = []\n for p in 
os.listdir(lib_src_dir):\n if RUNTIME_SRC_REGEX.match(p):\n lib_srcs.append(os.path.join(lib_src_dir, p))\n\n libs.append(compiler.library(lib_build_dir, lib_srcs, lib_opts))\n\n libs.append(compiler.library(mod_build_dir, [mod_src_path], lib_opts))\n\n runtime_build_dir = workspace.relpath(f\"build/runtime\")\n os.makedirs(runtime_build_dir)\n return compiler.binary(runtime_build_dir, libs, bin_opts)\n", "path": "python/tvm/micro/build.py"}], "after_files": [{"content": "# Licensed to the Apache Software Foundation (ASF) under one\n# or more contributor license agreements. See the NOTICE file\n# distributed with this work for additional information\n# regarding copyright ownership. The ASF licenses this file\n# to you under the Apache License, Version 2.0 (the\n# \"License\"); you may not use this file except in compliance\n# with the License. You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing,\n# software distributed under the License is distributed on an\n# \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY\n# KIND, either express or implied. See the License for the\n# specific language governing permissions and limitations\n# under the License.\n\n\"\"\"Defines top-level glue functions for building microTVM artifacts.\"\"\"\n\nimport copy\nimport logging\nimport os\nimport re\nfrom tvm.contrib import util\n\n\n_LOG = logging.getLogger(__name__)\n\n\nclass Workspace:\n \"\"\"Defines helper functions for manipulating temporary compilation workspaces.\"\"\"\n\n def __init__(self, root=None, debug=False):\n if debug or root is not None:\n with util.TempDirectory.set_keep_for_debug():\n self.tempdir = util.tempdir(custom_path=root)\n _LOG.info(\"Created debug mode workspace at: %s\", self.tempdir.temp_dir)\n else:\n self.tempdir = util.tempdir()\n\n def relpath(self, path):\n return self.tempdir.relpath(path)\n\n def listdir(self):\n return self.tempdir.listdir()\n\n @property\n def path(self):\n return self.tempdir.temp_dir\n\n\n# Required C runtime libraries, in link order.\nCRT_RUNTIME_LIB_NAMES = [\"utvm_rpc_server\", \"utvm_rpc_common\", \"common\"]\n\n\nTVM_ROOT_DIR = os.path.realpath(os.path.join(os.path.dirname(__file__), \"..\", \"..\", \"..\"))\n\n\nCRT_ROOT_DIR = os.path.join(TVM_ROOT_DIR, \"src\", \"runtime\", \"crt\")\n\n\nRUNTIME_LIB_SRC_DIRS = [os.path.join(CRT_ROOT_DIR, n) for n in CRT_RUNTIME_LIB_NAMES] + [\n os.path.join(TVM_ROOT_DIR, \"3rdparty/libcrc/src\")\n]\n\n\nRUNTIME_SRC_REGEX = re.compile(r\"^.*\\.cc?$\", re.IGNORECASE)\n\n\n_CRT_DEFAULT_OPTIONS = {\n \"ccflags\": [\"-std=c++11\"],\n \"ldflags\": [\"-std=gnu++14\"],\n \"include_dirs\": [\n f\"{TVM_ROOT_DIR}/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/dlpack/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/libcrc/include\",\n f\"{TVM_ROOT_DIR}/3rdparty/dmlc-core/include\",\n f\"{CRT_ROOT_DIR}/include\",\n ],\n \"profile\": {\"common\": [\"-Wno-unused-variable\"]},\n}\n\n\ndef default_options(target_include_dir):\n \"\"\"Return default opts passed to Compile commands.\"\"\"\n bin_opts = copy.deepcopy(_CRT_DEFAULT_OPTIONS)\n bin_opts[\"include_dirs\"].append(target_include_dir)\n lib_opts = copy.deepcopy(bin_opts)\n lib_opts[\"profile\"][\"common\"].append(\"-Werror\")\n lib_opts[\"cflags\"] = [\"-Wno-error=incompatible-pointer-types\"]\n return {\"bin_opts\": bin_opts, \"lib_opts\": lib_opts}\n\n\ndef build_static_runtime(workspace, compiler, module, lib_opts=None, bin_opts=None):\n \"\"\"Build the on-device runtime, 
statically linking the given modules.\n\n Parameters\n ----------\n compiler : tvm.micro.Compiler\n Compiler instance used to build the runtime.\n\n module : IRModule\n Module to statically link.\n\n lib_opts : dict\n Extra kwargs passed to library(),\n\n bin_opts : dict\n Extra kwargs passed to binary(),\n\n Returns\n -------\n MicroBinary :\n The compiled runtime.\n \"\"\"\n lib_opts = _CRT_DEFAULT_OPTIONS if lib_opts is None else lib_opts\n bin_opts = _CRT_DEFAULT_OPTIONS if bin_opts is None else bin_opts\n\n mod_build_dir = workspace.relpath(os.path.join(\"build\", \"module\"))\n os.makedirs(mod_build_dir)\n mod_src_dir = workspace.relpath(os.path.join(\"src\", \"module\"))\n os.makedirs(mod_src_dir)\n mod_src_path = os.path.join(mod_src_dir, \"module.c\")\n module.save(mod_src_path, \"cc\")\n\n libs = []\n for lib_src_dir in RUNTIME_LIB_SRC_DIRS:\n lib_name = os.path.basename(lib_src_dir)\n lib_build_dir = workspace.relpath(f\"build/{lib_name}\")\n os.makedirs(lib_build_dir)\n\n lib_srcs = []\n for p in os.listdir(lib_src_dir):\n if RUNTIME_SRC_REGEX.match(p):\n lib_srcs.append(os.path.join(lib_src_dir, p))\n\n libs.append(compiler.library(lib_build_dir, lib_srcs, lib_opts))\n\n libs.append(compiler.library(mod_build_dir, [mod_src_path], lib_opts))\n\n runtime_build_dir = workspace.relpath(f\"build/runtime\")\n os.makedirs(runtime_build_dir)\n return compiler.binary(runtime_build_dir, libs, bin_opts)\n", "path": "python/tvm/micro/build.py"}]} | 1,832 | 285 |