problem_id (stringlengths 18-22) | source (stringclasses, 1 value) | task_type (stringclasses, 1 value) | in_source_id (stringlengths 13-58) | prompt (stringlengths 1.1k-10.2k) | golden_diff (stringlengths 151-4.94k) | verification_info (stringlengths 582-21k) | num_tokens (int64, 271-2.05k) | num_tokens_diff (int64, 47-1.02k)
---|---|---|---|---|---|---|---|---|
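Each row below pairs a GitHub issue with the relevant source files, the reference ("golden") patch, and a JSON verification record. A quick way to explore the rows programmatically; this is a sketch that assumes the dataset is published on the Hugging Face Hub under the name shown in the `source` column and exposes a `train` split:

```python
from datasets import load_dataset

# Hypothetical loading snippet: dataset name taken from the `source` column,
# the split name is an assumption.
ds = load_dataset("rasdani/github-patches", split="train")
row = ds[0]
print(row["problem_id"], row["in_source_id"])  # e.g. gh_patches_debug_55627 xonsh__xonsh-3527
print(row["golden_diff"][:120])                # start of the reference patch
```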
gh_patches_debug_55627 | rasdani/github-patches | git_diff | xonsh__xonsh-3527 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Weird completion issue
<!--- Provide a general summary of the issue in the Title above -->
<!--- If you have a question along the lines of "How do I do this Bash command in xonsh"
please first look over the Bash to Xonsh translation guide: https://xon.sh/bash_to_xsh.html
If you don't find an answer there, please do open an issue! -->
## xonfig
<details>
```
$ xonfig
+------------------+-----------------+
| xonsh | 0.9.12 |
| Python | 3.7.4 |
| PLY | 3.11 |
| have readline | True |
| prompt toolkit | 2.0.9 |
| shell type | prompt_toolkit2 |
| pygments | 2.4.2 |
| on posix | True |
| on linux | False |
| on darwin | True |
| on windows | False |
| on cygwin | False |
| on msys2 | False |
| is superuser | False |
| default encoding | utf-8 |
| xonsh encoding | utf-8 |
| encoding errors | surrogateescape |
+------------------+-----------------+
```
</details>
## Expected Behavior
<!--- Tell us what should happen -->
Tab completion behind shell command `vim` should work
## Current Behavior
<!--- Tell us what happens instead of the expected behavior -->
<!--- If part of your bug report is a traceback, please first enter debug mode before triggering the error
To enter debug mode, set the environment variable `XONSH_DEBUG=1` _before_ starting `xonsh`.
On Linux and OSX, an easy way to to do this is to run `env XONSH_DEBUG=1 xonsh` -->
existing file is not being found by completion (see screenshot).
As you can see in the lower part of the screenshot, the file `pip_packages_to_install.txt` exists in the current folder but isn't found when used behind the shell command `vim` (but does work behind `cat`).
Is this maybe created by interfering completions installed elsewhere? Maybe some vim completions from homebrew?
<img width="822" alt="Screenshot 2019-10-31 14 11 02" src="https://user-images.githubusercontent.com/69774/67982582-99090380-fbe8-11e9-839a-b6fd0536a3ed.png">
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `xonsh/completers/pip.py`
Content:
```
1 """Completers for pip."""
2 # pylint: disable=invalid-name, missing-docstring, unsupported-membership-test
3 # pylint: disable=unused-argument, not-an-iterable
4 import re
5 import subprocess
6
7 import xonsh.lazyasd as xl
8
9
10 @xl.lazyobject
11 def PIP_RE():
12 return re.compile(r"\bx?pip(?:\d|\.)*")
13
14
15 @xl.lazyobject
16 def PIP_LIST_RE():
17 return re.compile(r"\bx?pip(?:\d|\.)* (?:uninstall|show)")
18
19
20 @xl.lazyobject
21 def ALL_COMMANDS():
22 try:
23 help_text = str(
24 subprocess.check_output(["pip", "--help"], stderr=subprocess.DEVNULL)
25 )
26 except FileNotFoundError:
27 return []
28 commands = re.findall(r" (\w+) ", help_text)
29 return [c for c in commands if c not in ["completion", "help"]]
30
31
32 def complete_pip(prefix, line, begidx, endidx, ctx):
33 """Completes python's package manager pip"""
34 line_len = len(line.split())
35 if (
36 (line_len > 3)
37 or (line_len > 2 and line.endswith(" "))
38 or (not PIP_RE.search(line))
39 ):
40 return
41 if PIP_LIST_RE.search(line):
42 try:
43 items = subprocess.check_output(["pip", "list"], stderr=subprocess.DEVNULL)
44 except FileNotFoundError:
45 return set()
46 items = items.decode("utf-8").splitlines()
47 return set(i.split()[0] for i in items if i.split()[0].startswith(prefix))
48
49 if (line_len > 1 and line.endswith(" ")) or line_len > 2:
50 # "pip show " -> no complete (note space)
51 return
52 if prefix not in ALL_COMMANDS:
53 suggestions = [c for c in ALL_COMMANDS if c.startswith(prefix)]
54 if suggestions:
55 return suggestions, len(prefix)
56 return ALL_COMMANDS, len(prefix)
57
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/xonsh/completers/pip.py b/xonsh/completers/pip.py
--- a/xonsh/completers/pip.py
+++ b/xonsh/completers/pip.py
@@ -9,12 +9,12 @@
@xl.lazyobject
def PIP_RE():
- return re.compile(r"\bx?pip(?:\d|\.)*")
+ return re.compile(r"\bx?pip(?:\d|\.)*\b")
@xl.lazyobject
def PIP_LIST_RE():
- return re.compile(r"\bx?pip(?:\d|\.)* (?:uninstall|show)")
+ return re.compile(r"\bx?pip(?:\d|\.)*\b (?:uninstall|show)")
@xl.lazyobject
| {"golden_diff": "diff --git a/xonsh/completers/pip.py b/xonsh/completers/pip.py\n--- a/xonsh/completers/pip.py\n+++ b/xonsh/completers/pip.py\n@@ -9,12 +9,12 @@\n \n @xl.lazyobject\n def PIP_RE():\n- return re.compile(r\"\\bx?pip(?:\\d|\\.)*\")\n+ return re.compile(r\"\\bx?pip(?:\\d|\\.)*\\b\")\n \n \n @xl.lazyobject\n def PIP_LIST_RE():\n- return re.compile(r\"\\bx?pip(?:\\d|\\.)* (?:uninstall|show)\")\n+ return re.compile(r\"\\bx?pip(?:\\d|\\.)*\\b (?:uninstall|show)\")\n \n \n @xl.lazyobject\n", "issue": "Weird completion issue\n<!--- Provide a general summary of the issue in the Title above -->\r\n<!--- If you have a question along the lines of \"How do I do this Bash command in xonsh\"\r\nplease first look over the Bash to Xonsh translation guide: https://xon.sh/bash_to_xsh.html\r\nIf you don't find an answer there, please do open an issue! -->\r\n\r\n## xonfig\r\n\r\n<details>\r\n\r\n```\r\n$ xonfig\r\n+------------------+-----------------+\r\n| xonsh | 0.9.12 |\r\n| Python | 3.7.4 |\r\n| PLY | 3.11 |\r\n| have readline | True |\r\n| prompt toolkit | 2.0.9 |\r\n| shell type | prompt_toolkit2 |\r\n| pygments | 2.4.2 |\r\n| on posix | True |\r\n| on linux | False |\r\n| on darwin | True |\r\n| on windows | False |\r\n| on cygwin | False |\r\n| on msys2 | False |\r\n| is superuser | False |\r\n| default encoding | utf-8 |\r\n| xonsh encoding | utf-8 |\r\n| encoding errors | surrogateescape |\r\n+------------------+-----------------+\r\n```\r\n\r\n</details>\r\n\r\n## Expected Behavior\r\n<!--- Tell us what should happen -->\r\nTab completion behind shell command `vim` should work\r\n\r\n## Current Behavior\r\n<!--- Tell us what happens instead of the expected behavior -->\r\n<!--- If part of your bug report is a traceback, please first enter debug mode before triggering the error\r\nTo enter debug mode, set the environment variable `XONSH_DEBUG=1` _before_ starting `xonsh`.\r\nOn Linux and OSX, an easy way to to do this is to run `env XONSH_DEBUG=1 xonsh` -->\r\nexisting file is not being found by completion (see screenshot).\r\nAs you can see in the lower part of the screenshot, the file `pip_packages_to_install.txt` exists in the current folder but isn't found when used behind the shell command `vim` (but does work behind `cat`).\r\nIs this maybe created by interfering completions installed elsewhere? 
Maybe some vim completions from homebrew?\r\n\r\n\r\n<img width=\"822\" alt=\"Screenshot 2019-10-31 14 11 02\" src=\"https://user-images.githubusercontent.com/69774/67982582-99090380-fbe8-11e9-839a-b6fd0536a3ed.png\">\n", "before_files": [{"content": "\"\"\"Completers for pip.\"\"\"\n# pylint: disable=invalid-name, missing-docstring, unsupported-membership-test\n# pylint: disable=unused-argument, not-an-iterable\nimport re\nimport subprocess\n\nimport xonsh.lazyasd as xl\n\n\[email protected]\ndef PIP_RE():\n return re.compile(r\"\\bx?pip(?:\\d|\\.)*\")\n\n\[email protected]\ndef PIP_LIST_RE():\n return re.compile(r\"\\bx?pip(?:\\d|\\.)* (?:uninstall|show)\")\n\n\[email protected]\ndef ALL_COMMANDS():\n try:\n help_text = str(\n subprocess.check_output([\"pip\", \"--help\"], stderr=subprocess.DEVNULL)\n )\n except FileNotFoundError:\n return []\n commands = re.findall(r\" (\\w+) \", help_text)\n return [c for c in commands if c not in [\"completion\", \"help\"]]\n\n\ndef complete_pip(prefix, line, begidx, endidx, ctx):\n \"\"\"Completes python's package manager pip\"\"\"\n line_len = len(line.split())\n if (\n (line_len > 3)\n or (line_len > 2 and line.endswith(\" \"))\n or (not PIP_RE.search(line))\n ):\n return\n if PIP_LIST_RE.search(line):\n try:\n items = subprocess.check_output([\"pip\", \"list\"], stderr=subprocess.DEVNULL)\n except FileNotFoundError:\n return set()\n items = items.decode(\"utf-8\").splitlines()\n return set(i.split()[0] for i in items if i.split()[0].startswith(prefix))\n\n if (line_len > 1 and line.endswith(\" \")) or line_len > 2:\n # \"pip show \" -> no complete (note space)\n return\n if prefix not in ALL_COMMANDS:\n suggestions = [c for c in ALL_COMMANDS if c.startswith(prefix)]\n if suggestions:\n return suggestions, len(prefix)\n return ALL_COMMANDS, len(prefix)\n", "path": "xonsh/completers/pip.py"}], "after_files": [{"content": "\"\"\"Completers for pip.\"\"\"\n# pylint: disable=invalid-name, missing-docstring, unsupported-membership-test\n# pylint: disable=unused-argument, not-an-iterable\nimport re\nimport subprocess\n\nimport xonsh.lazyasd as xl\n\n\[email protected]\ndef PIP_RE():\n return re.compile(r\"\\bx?pip(?:\\d|\\.)*\\b\")\n\n\[email protected]\ndef PIP_LIST_RE():\n return re.compile(r\"\\bx?pip(?:\\d|\\.)*\\b (?:uninstall|show)\")\n\n\[email protected]\ndef ALL_COMMANDS():\n try:\n help_text = str(\n subprocess.check_output([\"pip\", \"--help\"], stderr=subprocess.DEVNULL)\n )\n except FileNotFoundError:\n return []\n commands = re.findall(r\" (\\w+) \", help_text)\n return [c for c in commands if c not in [\"completion\", \"help\"]]\n\n\ndef complete_pip(prefix, line, begidx, endidx, ctx):\n \"\"\"Completes python's package manager pip\"\"\"\n line_len = len(line.split())\n if (\n (line_len > 3)\n or (line_len > 2 and line.endswith(\" \"))\n or (not PIP_RE.search(line))\n ):\n return\n if PIP_LIST_RE.search(line):\n try:\n items = subprocess.check_output([\"pip\", \"list\"], stderr=subprocess.DEVNULL)\n except FileNotFoundError:\n return set()\n items = items.decode(\"utf-8\").splitlines()\n return set(i.split()[0] for i in items if i.split()[0].startswith(prefix))\n\n if (line_len > 1 and line.endswith(\" \")) or line_len > 2:\n # \"pip show \" -> no complete (note space)\n return\n if prefix not in ALL_COMMANDS:\n suggestions = [c for c in ALL_COMMANDS if c.startswith(prefix)]\n if suggestions:\n return suggestions, len(prefix)\n return ALL_COMMANDS, len(prefix)\n", "path": "xonsh/completers/pip.py"}]} | 1,367 | 173 |
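Why the trailing `\b` anchor in the patch above matters: without it, `PIP_RE` matches the `pip` prefix inside ordinary file names such as `pip_packages_to_install.txt`, so the pip completer hijacks completion for unrelated commands like `vim`. A small reproduction sketch:

```python
import re

OLD_PIP_RE = re.compile(r"\bx?pip(?:\d|\.)*")    # before the patch
NEW_PIP_RE = re.compile(r"\bx?pip(?:\d|\.)*\b")  # after the patch

line = "vim pip_packages_to_install.txt"
print(OLD_PIP_RE.search(line))  # matches 'pip' inside the filename -> pip completer takes over
print(NEW_PIP_RE.search(line))  # None -> normal file completion applies again

# Real pip invocations still match with the word boundary in place.
print(NEW_PIP_RE.search("pip3 install requests"))  # matches 'pip3'
```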
gh_patches_debug_33399 | rasdani/github-patches | git_diff | plotly__plotly.py-1832 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Does Plotly 4.2.0 depend on scikit-image?
I failed to `import plotly.figure_factory` because plotly seems not to install `scikit-image` when running `pip install -U plotly`. After I manually installed `scikit-image`, `import plotly.figure_factory` worked.
This was not a problem in version 4.1.1.
But the source code shows it depends on it.
https://github.com/plotly/plotly.py/blob/b7ad5433c4e0882715781fa6c4816fc7fff62965/packages/python/plotly/plotly/figure_factory/_ternary_contour.py#L11
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `packages/python/plotly/plotly/express/__init__.py`
Content:
```
1 """
2 `plotly_express` is a terse, consistent, high-level wrapper around `plotly` for rapid \
3 data exploration and figure generation. See the gallery at https://plotly.github.io/plotly_express
4 """
5
6 from ._chart_types import ( # noqa: F401
7 scatter,
8 scatter_3d,
9 scatter_polar,
10 scatter_ternary,
11 scatter_mapbox,
12 scatter_geo,
13 line,
14 line_3d,
15 line_polar,
16 line_ternary,
17 line_mapbox,
18 line_geo,
19 area,
20 bar,
21 bar_polar,
22 violin,
23 box,
24 strip,
25 histogram,
26 scatter_matrix,
27 parallel_coordinates,
28 parallel_categories,
29 choropleth,
30 density_contour,
31 density_heatmap,
32 )
33
34 from ._core import ( # noqa: F401
35 set_mapbox_access_token,
36 defaults,
37 get_trendline_results,
38 )
39
40 from . import data, colors # noqa: F401
41
42 __all__ = [
43 "scatter",
44 "scatter_3d",
45 "scatter_polar",
46 "scatter_ternary",
47 "scatter_mapbox",
48 "scatter_geo",
49 "scatter_matrix",
50 "density_contour",
51 "density_heatmap",
52 "line",
53 "line_3d",
54 "line_polar",
55 "line_ternary",
56 "line_mapbox",
57 "line_geo",
58 "parallel_coordinates",
59 "parallel_categories",
60 "area",
61 "bar",
62 "bar_polar",
63 "violin",
64 "box",
65 "strip",
66 "histogram",
67 "choropleth",
68 "data",
69 "colors",
70 "set_mapbox_access_token",
71 "get_trendline_results",
72 ]
73
```
Path: `packages/python/plotly/plotly/figure_factory/__init__.py`
Content:
```
1 from __future__ import absolute_import
2
3 from plotly import optional_imports
4
5 # Require that numpy exists for figure_factory
6 np = optional_imports.get_module("numpy")
7 if np is None:
8 raise ImportError(
9 """\
10 The figure factory module requires the numpy package"""
11 )
12
13
14 from plotly.figure_factory._2d_density import create_2d_density
15 from plotly.figure_factory._annotated_heatmap import create_annotated_heatmap
16 from plotly.figure_factory._bullet import create_bullet
17 from plotly.figure_factory._candlestick import create_candlestick
18 from plotly.figure_factory._dendrogram import create_dendrogram
19 from plotly.figure_factory._distplot import create_distplot
20 from plotly.figure_factory._facet_grid import create_facet_grid
21 from plotly.figure_factory._gantt import create_gantt
22 from plotly.figure_factory._ohlc import create_ohlc
23 from plotly.figure_factory._quiver import create_quiver
24 from plotly.figure_factory._scatterplot import create_scatterplotmatrix
25 from plotly.figure_factory._streamline import create_streamline
26 from plotly.figure_factory._table import create_table
27 from plotly.figure_factory._ternary_contour import create_ternary_contour
28 from plotly.figure_factory._trisurf import create_trisurf
29 from plotly.figure_factory._violin import create_violin
30
31 if optional_imports.get_module("pandas") is not None:
32 from plotly.figure_factory._county_choropleth import create_choropleth
33
34 __all__ = [
35 "create_2d_density",
36 "create_annotated_heatmap",
37 "create_bullet",
38 "create_candlestick",
39 "create_dendrogram",
40 "create_distplot",
41 "create_facet_grid",
42 "create_gantt",
43 "create_ohlc",
44 "create_quiver",
45 "create_scatterplotmatrix",
46 "create_streamline",
47 "create_table",
48 "create_ternary_contour",
49 "create_trisurf",
50 "create_violin",
51 ]
52
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/packages/python/plotly/plotly/express/__init__.py b/packages/python/plotly/plotly/express/__init__.py
--- a/packages/python/plotly/plotly/express/__init__.py
+++ b/packages/python/plotly/plotly/express/__init__.py
@@ -2,6 +2,16 @@
`plotly_express` is a terse, consistent, high-level wrapper around `plotly` for rapid \
data exploration and figure generation. See the gallery at https://plotly.github.io/plotly_express
"""
+from __future__ import absolute_import
+from plotly import optional_imports
+
+pd = optional_imports.get_module("pandas")
+if pd is None:
+ raise ImportError(
+ """\
+Plotly express requires pandas to be installed."""
+ )
+
from ._chart_types import ( # noqa: F401
scatter,
diff --git a/packages/python/plotly/plotly/figure_factory/__init__.py b/packages/python/plotly/plotly/figure_factory/__init__.py
--- a/packages/python/plotly/plotly/figure_factory/__init__.py
+++ b/packages/python/plotly/plotly/figure_factory/__init__.py
@@ -24,18 +24,31 @@
from plotly.figure_factory._scatterplot import create_scatterplotmatrix
from plotly.figure_factory._streamline import create_streamline
from plotly.figure_factory._table import create_table
-from plotly.figure_factory._ternary_contour import create_ternary_contour
from plotly.figure_factory._trisurf import create_trisurf
from plotly.figure_factory._violin import create_violin
if optional_imports.get_module("pandas") is not None:
from plotly.figure_factory._county_choropleth import create_choropleth
+else:
+
+ def create_choropleth(*args, **kwargs):
+ raise ImportError("Please install pandas to use `create_choropleth`")
+
+
+if optional_imports.get_module("skimage") is not None:
+ from plotly.figure_factory._ternary_contour import create_ternary_contour
+else:
+
+ def create_ternary_contour(*args, **kwargs):
+ raise ImportError("Please install scikit-image to use `create_ternary_contour`")
+
__all__ = [
"create_2d_density",
"create_annotated_heatmap",
"create_bullet",
"create_candlestick",
+ "create_choropleth",
"create_dendrogram",
"create_distplot",
"create_facet_grid",
| {"golden_diff": "diff --git a/packages/python/plotly/plotly/express/__init__.py b/packages/python/plotly/plotly/express/__init__.py\n--- a/packages/python/plotly/plotly/express/__init__.py\n+++ b/packages/python/plotly/plotly/express/__init__.py\n@@ -2,6 +2,16 @@\n `plotly_express` is a terse, consistent, high-level wrapper around `plotly` for rapid \\\n data exploration and figure generation. See the gallery at https://plotly.github.io/plotly_express\n \"\"\"\n+from __future__ import absolute_import\n+from plotly import optional_imports\n+\n+pd = optional_imports.get_module(\"pandas\")\n+if pd is None:\n+ raise ImportError(\n+ \"\"\"\\\n+Plotly express requires pandas to be installed.\"\"\"\n+ )\n+\n \n from ._chart_types import ( # noqa: F401\n scatter,\ndiff --git a/packages/python/plotly/plotly/figure_factory/__init__.py b/packages/python/plotly/plotly/figure_factory/__init__.py\n--- a/packages/python/plotly/plotly/figure_factory/__init__.py\n+++ b/packages/python/plotly/plotly/figure_factory/__init__.py\n@@ -24,18 +24,31 @@\n from plotly.figure_factory._scatterplot import create_scatterplotmatrix\n from plotly.figure_factory._streamline import create_streamline\n from plotly.figure_factory._table import create_table\n-from plotly.figure_factory._ternary_contour import create_ternary_contour\n from plotly.figure_factory._trisurf import create_trisurf\n from plotly.figure_factory._violin import create_violin\n \n if optional_imports.get_module(\"pandas\") is not None:\n from plotly.figure_factory._county_choropleth import create_choropleth\n+else:\n+\n+ def create_choropleth(*args, **kwargs):\n+ raise ImportError(\"Please install pandas to use `create_choropleth`\")\n+\n+\n+if optional_imports.get_module(\"skimage\") is not None:\n+ from plotly.figure_factory._ternary_contour import create_ternary_contour\n+else:\n+\n+ def create_ternary_contour(*args, **kwargs):\n+ raise ImportError(\"Please install scikit-image to use `create_ternary_contour`\")\n+\n \n __all__ = [\n \"create_2d_density\",\n \"create_annotated_heatmap\",\n \"create_bullet\",\n \"create_candlestick\",\n+ \"create_choropleth\",\n \"create_dendrogram\",\n \"create_distplot\",\n \"create_facet_grid\",\n", "issue": "Does Plotly 4.2.0 depend on scikit-image?\nI failed to `import plotly.figure_factory` because plotly seems not to install `scikit-image` when running `pip install -U plotly`. After I manually installed `scikit-image`, `import plotly.figure_factory` worked.\r\n\r\nThis was not a problem in version 4.1.1.\r\n\r\nBut the source code shows it depends on it.\r\nhttps://github.com/plotly/plotly.py/blob/b7ad5433c4e0882715781fa6c4816fc7fff62965/packages/python/plotly/plotly/figure_factory/_ternary_contour.py#L11\n", "before_files": [{"content": "\"\"\"\n`plotly_express` is a terse, consistent, high-level wrapper around `plotly` for rapid \\\ndata exploration and figure generation. See the gallery at https://plotly.github.io/plotly_express\n\"\"\"\n\nfrom ._chart_types import ( # noqa: F401\n scatter,\n scatter_3d,\n scatter_polar,\n scatter_ternary,\n scatter_mapbox,\n scatter_geo,\n line,\n line_3d,\n line_polar,\n line_ternary,\n line_mapbox,\n line_geo,\n area,\n bar,\n bar_polar,\n violin,\n box,\n strip,\n histogram,\n scatter_matrix,\n parallel_coordinates,\n parallel_categories,\n choropleth,\n density_contour,\n density_heatmap,\n)\n\nfrom ._core import ( # noqa: F401\n set_mapbox_access_token,\n defaults,\n get_trendline_results,\n)\n\nfrom . 
import data, colors # noqa: F401\n\n__all__ = [\n \"scatter\",\n \"scatter_3d\",\n \"scatter_polar\",\n \"scatter_ternary\",\n \"scatter_mapbox\",\n \"scatter_geo\",\n \"scatter_matrix\",\n \"density_contour\",\n \"density_heatmap\",\n \"line\",\n \"line_3d\",\n \"line_polar\",\n \"line_ternary\",\n \"line_mapbox\",\n \"line_geo\",\n \"parallel_coordinates\",\n \"parallel_categories\",\n \"area\",\n \"bar\",\n \"bar_polar\",\n \"violin\",\n \"box\",\n \"strip\",\n \"histogram\",\n \"choropleth\",\n \"data\",\n \"colors\",\n \"set_mapbox_access_token\",\n \"get_trendline_results\",\n]\n", "path": "packages/python/plotly/plotly/express/__init__.py"}, {"content": "from __future__ import absolute_import\n\nfrom plotly import optional_imports\n\n# Require that numpy exists for figure_factory\nnp = optional_imports.get_module(\"numpy\")\nif np is None:\n raise ImportError(\n \"\"\"\\\nThe figure factory module requires the numpy package\"\"\"\n )\n\n\nfrom plotly.figure_factory._2d_density import create_2d_density\nfrom plotly.figure_factory._annotated_heatmap import create_annotated_heatmap\nfrom plotly.figure_factory._bullet import create_bullet\nfrom plotly.figure_factory._candlestick import create_candlestick\nfrom plotly.figure_factory._dendrogram import create_dendrogram\nfrom plotly.figure_factory._distplot import create_distplot\nfrom plotly.figure_factory._facet_grid import create_facet_grid\nfrom plotly.figure_factory._gantt import create_gantt\nfrom plotly.figure_factory._ohlc import create_ohlc\nfrom plotly.figure_factory._quiver import create_quiver\nfrom plotly.figure_factory._scatterplot import create_scatterplotmatrix\nfrom plotly.figure_factory._streamline import create_streamline\nfrom plotly.figure_factory._table import create_table\nfrom plotly.figure_factory._ternary_contour import create_ternary_contour\nfrom plotly.figure_factory._trisurf import create_trisurf\nfrom plotly.figure_factory._violin import create_violin\n\nif optional_imports.get_module(\"pandas\") is not None:\n from plotly.figure_factory._county_choropleth import create_choropleth\n\n__all__ = [\n \"create_2d_density\",\n \"create_annotated_heatmap\",\n \"create_bullet\",\n \"create_candlestick\",\n \"create_dendrogram\",\n \"create_distplot\",\n \"create_facet_grid\",\n \"create_gantt\",\n \"create_ohlc\",\n \"create_quiver\",\n \"create_scatterplotmatrix\",\n \"create_streamline\",\n \"create_table\",\n \"create_ternary_contour\",\n \"create_trisurf\",\n \"create_violin\",\n]\n", "path": "packages/python/plotly/plotly/figure_factory/__init__.py"}], "after_files": [{"content": "\"\"\"\n`plotly_express` is a terse, consistent, high-level wrapper around `plotly` for rapid \\\ndata exploration and figure generation. 
See the gallery at https://plotly.github.io/plotly_express\n\"\"\"\nfrom __future__ import absolute_import\nfrom plotly import optional_imports\n\npd = optional_imports.get_module(\"pandas\")\nif pd is None:\n raise ImportError(\n \"\"\"\\\nPlotly express requires pandas to be installed.\"\"\"\n )\n\n\nfrom ._chart_types import ( # noqa: F401\n scatter,\n scatter_3d,\n scatter_polar,\n scatter_ternary,\n scatter_mapbox,\n scatter_geo,\n line,\n line_3d,\n line_polar,\n line_ternary,\n line_mapbox,\n line_geo,\n area,\n bar,\n bar_polar,\n violin,\n box,\n strip,\n histogram,\n scatter_matrix,\n parallel_coordinates,\n parallel_categories,\n choropleth,\n density_contour,\n density_heatmap,\n)\n\nfrom ._core import ( # noqa: F401\n set_mapbox_access_token,\n defaults,\n get_trendline_results,\n)\n\nfrom . import data, colors # noqa: F401\n\n__all__ = [\n \"scatter\",\n \"scatter_3d\",\n \"scatter_polar\",\n \"scatter_ternary\",\n \"scatter_mapbox\",\n \"scatter_geo\",\n \"scatter_matrix\",\n \"density_contour\",\n \"density_heatmap\",\n \"line\",\n \"line_3d\",\n \"line_polar\",\n \"line_ternary\",\n \"line_mapbox\",\n \"line_geo\",\n \"parallel_coordinates\",\n \"parallel_categories\",\n \"area\",\n \"bar\",\n \"bar_polar\",\n \"violin\",\n \"box\",\n \"strip\",\n \"histogram\",\n \"choropleth\",\n \"data\",\n \"colors\",\n \"set_mapbox_access_token\",\n \"get_trendline_results\",\n]\n", "path": "packages/python/plotly/plotly/express/__init__.py"}, {"content": "from __future__ import absolute_import\n\nfrom plotly import optional_imports\n\n# Require that numpy exists for figure_factory\nnp = optional_imports.get_module(\"numpy\")\nif np is None:\n raise ImportError(\n \"\"\"\\\nThe figure factory module requires the numpy package\"\"\"\n )\n\n\nfrom plotly.figure_factory._2d_density import create_2d_density\nfrom plotly.figure_factory._annotated_heatmap import create_annotated_heatmap\nfrom plotly.figure_factory._bullet import create_bullet\nfrom plotly.figure_factory._candlestick import create_candlestick\nfrom plotly.figure_factory._dendrogram import create_dendrogram\nfrom plotly.figure_factory._distplot import create_distplot\nfrom plotly.figure_factory._facet_grid import create_facet_grid\nfrom plotly.figure_factory._gantt import create_gantt\nfrom plotly.figure_factory._ohlc import create_ohlc\nfrom plotly.figure_factory._quiver import create_quiver\nfrom plotly.figure_factory._scatterplot import create_scatterplotmatrix\nfrom plotly.figure_factory._streamline import create_streamline\nfrom plotly.figure_factory._table import create_table\nfrom plotly.figure_factory._trisurf import create_trisurf\nfrom plotly.figure_factory._violin import create_violin\n\nif optional_imports.get_module(\"pandas\") is not None:\n from plotly.figure_factory._county_choropleth import create_choropleth\nelse:\n\n def create_choropleth(*args, **kwargs):\n raise ImportError(\"Please install pandas to use `create_choropleth`\")\n\n\nif optional_imports.get_module(\"skimage\") is not None:\n from plotly.figure_factory._ternary_contour import create_ternary_contour\nelse:\n\n def create_ternary_contour(*args, **kwargs):\n raise ImportError(\"Please install scikit-image to use `create_ternary_contour`\")\n\n\n__all__ = [\n \"create_2d_density\",\n \"create_annotated_heatmap\",\n \"create_bullet\",\n \"create_candlestick\",\n \"create_choropleth\",\n \"create_dendrogram\",\n \"create_distplot\",\n \"create_facet_grid\",\n \"create_gantt\",\n \"create_ohlc\",\n \"create_quiver\",\n 
\"create_scatterplotmatrix\",\n \"create_streamline\",\n \"create_table\",\n \"create_ternary_contour\",\n \"create_trisurf\",\n \"create_violin\",\n]\n", "path": "packages/python/plotly/plotly/figure_factory/__init__.py"}]} | 1,512 | 587 |
gh_patches_debug_33729 | rasdani/github-patches | git_diff | translate__pootle-5882 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Dont create project if command fails in init_fs_project
atm if for some reason this command fails it leaves a project behind
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pootle/apps/pootle_fs/management/commands/init_fs_project.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright (C) Pootle contributors.
4 #
5 # This file is a part of the Pootle project. It is distributed under the GPL3
6 # or later license. See the LICENSE file for a copy of the license and the
7 # AUTHORS file for copyright and authorship information.
8
9 import logging
10
11 from django.core.exceptions import ValidationError
12 from django.core.management import BaseCommand, CommandError
13
14 from pootle_format.models import Format
15 from pootle_fs.utils import FSPlugin, parse_fs_url
16 from pootle_language.models import Language
17 from pootle_project.models import Project
18
19
20 logger = logging.getLogger('pootle.fs')
21
22
23 class Command(BaseCommand):
24 help = "Init a new Pootle FS project."
25
26 def add_arguments(self, parser):
27 parser.add_argument(
28 'code',
29 metavar='CODE',
30 help='Project code'
31 )
32 parser.add_argument(
33 'fs',
34 metavar='FS_URL',
35 help='FS url "filesystem_type+/repo/path/"'
36 )
37 parser.add_argument(
38 'translation_mapping',
39 help='Translation mapping "<language_code>/<filename>.<ext>"',
40 metavar='TRANSLATION_MAPPING'
41 )
42 parser.add_argument(
43 '-n', '--name',
44 action='store',
45 dest='name',
46 nargs='?',
47 help='Project name',
48 )
49 parser.add_argument(
50 '--filetypes',
51 action='append',
52 dest='filetypes',
53 help='File types',
54 )
55 parser.add_argument(
56 '--checkstyle',
57 action='store',
58 dest='checkstyle',
59 help='Checkstyle',
60 nargs='?',
61 default='standard'
62 )
63 parser.add_argument(
64 '-l', '--source-language',
65 action='store',
66 dest='source_language',
67 help="Code for the project's source language",
68 nargs='?',
69 default='en'
70 )
71 parser.add_argument(
72 '--nosync',
73 action='store_false',
74 dest='sync',
75 help='Flag if sync is unnecessary',
76 default=True
77 )
78
79 def handle(self, **options):
80 source_language_code = options['source_language']
81 try:
82 source_language = Language.objects.get(code=source_language_code)
83 except Language.DoesNotExist as e:
84 self.stdout.write('%s: Unknown language code.' %
85 source_language_code)
86 raise CommandError(e)
87
88 fs_type, fs_url = parse_fs_url(options['fs'])
89 code = options['code']
90 name = options['name'] or code.capitalize()
91
92 try:
93 project = Project.objects.create(
94 code=code,
95 fullname=name,
96 treestyle='pootle_fs',
97 checkstyle=options['checkstyle'],
98 source_language=source_language)
99 except ValidationError as e:
100 raise CommandError(e)
101
102 for filetype in options["filetypes"] or ["po"]:
103 try:
104 filetype = Format.objects.get(name=filetype)
105 project.filetypes.add(filetype)
106 except Format.DoesNotExist as e:
107 raise CommandError(e)
108
109 project.config['pootle_fs.fs_type'] = fs_type
110 project.config['pootle_fs.fs_url'] = fs_url
111 project.config['pootle_fs.translation_mappings'] = {
112 'default': options['translation_mapping']
113 }
114 if options['sync']:
115 plugin = FSPlugin(project)
116 plugin.fetch()
117 plugin.add()
118 plugin.sync()
119
```
Path: `pootle/apps/pootle_fs/localfs.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 #
3 # Copyright (C) Pootle contributors.
4 #
5 # This file is a part of the Pootle project. It is distributed under the GPL3
6 # or later license. See the LICENSE file for a copy of the license and the
7 # AUTHORS file for copyright and authorship information.
8
9 import logging
10 import uuid
11
12 import dirsync
13
14 from django import forms
15
16 from pootle.core.delegate import revision
17 from pootle_project.models import Project
18
19 from .plugin import Plugin
20
21
22 class LocalFSPlugin(Plugin):
23
24 fs_type = "localfs"
25 _pulled = False
26
27 @property
28 def latest_hash(self):
29 return revision.get(Project)(
30 self.project).get(key="pootle.fs.fs_hash")
31
32 def push(self, response):
33 dirsync.sync(
34 self.project.local_fs_path,
35 self.fs_url,
36 "sync",
37 purge=True,
38 logger=logging.getLogger(dirsync.__name__))
39 return response
40
41 def fetch(self):
42 synced = dirsync.sync(
43 self.fs_url,
44 self.project.local_fs_path,
45 "sync",
46 create=True,
47 purge=True,
48 logger=logging.getLogger(dirsync.__name__))
49 if synced:
50 revision.get(Project)(self.project).set(
51 keys=["pootle.fs.fs_hash"], value=uuid.uuid4().hex)
52
53
54 class LocalFSUrlValidator(object):
55
56 help_text = "Enter an absolute path to a directory on your filesystem"
57
58 def validate(self, url):
59 if not url.startswith("/"):
60 raise forms.ValidationError(self.help_text)
61
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pootle/apps/pootle_fs/localfs.py b/pootle/apps/pootle_fs/localfs.py
--- a/pootle/apps/pootle_fs/localfs.py
+++ b/pootle/apps/pootle_fs/localfs.py
@@ -16,6 +16,7 @@
from pootle.core.delegate import revision
from pootle_project.models import Project
+from .exceptions import FSFetchError
from .plugin import Plugin
@@ -39,13 +40,16 @@
return response
def fetch(self):
- synced = dirsync.sync(
- self.fs_url,
- self.project.local_fs_path,
- "sync",
- create=True,
- purge=True,
- logger=logging.getLogger(dirsync.__name__))
+ try:
+ synced = dirsync.sync(
+ self.fs_url,
+ self.project.local_fs_path,
+ "sync",
+ create=True,
+ purge=True,
+ logger=logging.getLogger(dirsync.__name__))
+ except ValueError as e:
+ raise FSFetchError(e)
if synced:
revision.get(Project)(self.project).set(
keys=["pootle.fs.fs_hash"], value=uuid.uuid4().hex)
diff --git a/pootle/apps/pootle_fs/management/commands/init_fs_project.py b/pootle/apps/pootle_fs/management/commands/init_fs_project.py
--- a/pootle/apps/pootle_fs/management/commands/init_fs_project.py
+++ b/pootle/apps/pootle_fs/management/commands/init_fs_project.py
@@ -12,6 +12,7 @@
from django.core.management import BaseCommand, CommandError
from pootle_format.models import Format
+from pootle_fs.exceptions import FSFetchError
from pootle_fs.utils import FSPlugin, parse_fs_url
from pootle_language.models import Language
from pootle_project.models import Project
@@ -112,7 +113,11 @@
'default': options['translation_mapping']
}
if options['sync']:
- plugin = FSPlugin(project)
- plugin.fetch()
- plugin.add()
- plugin.sync()
+ try:
+ plugin = FSPlugin(project)
+ plugin.fetch()
+ plugin.add()
+ plugin.sync()
+ except FSFetchError as e:
+ project.delete()
+ raise CommandError(e)
| {"golden_diff": "diff --git a/pootle/apps/pootle_fs/localfs.py b/pootle/apps/pootle_fs/localfs.py\n--- a/pootle/apps/pootle_fs/localfs.py\n+++ b/pootle/apps/pootle_fs/localfs.py\n@@ -16,6 +16,7 @@\n from pootle.core.delegate import revision\n from pootle_project.models import Project\n \n+from .exceptions import FSFetchError\n from .plugin import Plugin\n \n \n@@ -39,13 +40,16 @@\n return response\n \n def fetch(self):\n- synced = dirsync.sync(\n- self.fs_url,\n- self.project.local_fs_path,\n- \"sync\",\n- create=True,\n- purge=True,\n- logger=logging.getLogger(dirsync.__name__))\n+ try:\n+ synced = dirsync.sync(\n+ self.fs_url,\n+ self.project.local_fs_path,\n+ \"sync\",\n+ create=True,\n+ purge=True,\n+ logger=logging.getLogger(dirsync.__name__))\n+ except ValueError as e:\n+ raise FSFetchError(e)\n if synced:\n revision.get(Project)(self.project).set(\n keys=[\"pootle.fs.fs_hash\"], value=uuid.uuid4().hex)\ndiff --git a/pootle/apps/pootle_fs/management/commands/init_fs_project.py b/pootle/apps/pootle_fs/management/commands/init_fs_project.py\n--- a/pootle/apps/pootle_fs/management/commands/init_fs_project.py\n+++ b/pootle/apps/pootle_fs/management/commands/init_fs_project.py\n@@ -12,6 +12,7 @@\n from django.core.management import BaseCommand, CommandError\n \n from pootle_format.models import Format\n+from pootle_fs.exceptions import FSFetchError\n from pootle_fs.utils import FSPlugin, parse_fs_url\n from pootle_language.models import Language\n from pootle_project.models import Project\n@@ -112,7 +113,11 @@\n 'default': options['translation_mapping']\n }\n if options['sync']:\n- plugin = FSPlugin(project)\n- plugin.fetch()\n- plugin.add()\n- plugin.sync()\n+ try:\n+ plugin = FSPlugin(project)\n+ plugin.fetch()\n+ plugin.add()\n+ plugin.sync()\n+ except FSFetchError as e:\n+ project.delete()\n+ raise CommandError(e)\n", "issue": "Dont create project if command fails in init_fs_project\natm if for some reason this command fails it leaves a project behind\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. 
See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nimport logging\n\nfrom django.core.exceptions import ValidationError\nfrom django.core.management import BaseCommand, CommandError\n\nfrom pootle_format.models import Format\nfrom pootle_fs.utils import FSPlugin, parse_fs_url\nfrom pootle_language.models import Language\nfrom pootle_project.models import Project\n\n\nlogger = logging.getLogger('pootle.fs')\n\n\nclass Command(BaseCommand):\n help = \"Init a new Pootle FS project.\"\n\n def add_arguments(self, parser):\n parser.add_argument(\n 'code',\n metavar='CODE',\n help='Project code'\n )\n parser.add_argument(\n 'fs',\n metavar='FS_URL',\n help='FS url \"filesystem_type+/repo/path/\"'\n )\n parser.add_argument(\n 'translation_mapping',\n help='Translation mapping \"<language_code>/<filename>.<ext>\"',\n metavar='TRANSLATION_MAPPING'\n )\n parser.add_argument(\n '-n', '--name',\n action='store',\n dest='name',\n nargs='?',\n help='Project name',\n )\n parser.add_argument(\n '--filetypes',\n action='append',\n dest='filetypes',\n help='File types',\n )\n parser.add_argument(\n '--checkstyle',\n action='store',\n dest='checkstyle',\n help='Checkstyle',\n nargs='?',\n default='standard'\n )\n parser.add_argument(\n '-l', '--source-language',\n action='store',\n dest='source_language',\n help=\"Code for the project's source language\",\n nargs='?',\n default='en'\n )\n parser.add_argument(\n '--nosync',\n action='store_false',\n dest='sync',\n help='Flag if sync is unnecessary',\n default=True\n )\n\n def handle(self, **options):\n source_language_code = options['source_language']\n try:\n source_language = Language.objects.get(code=source_language_code)\n except Language.DoesNotExist as e:\n self.stdout.write('%s: Unknown language code.' %\n source_language_code)\n raise CommandError(e)\n\n fs_type, fs_url = parse_fs_url(options['fs'])\n code = options['code']\n name = options['name'] or code.capitalize()\n\n try:\n project = Project.objects.create(\n code=code,\n fullname=name,\n treestyle='pootle_fs',\n checkstyle=options['checkstyle'],\n source_language=source_language)\n except ValidationError as e:\n raise CommandError(e)\n\n for filetype in options[\"filetypes\"] or [\"po\"]:\n try:\n filetype = Format.objects.get(name=filetype)\n project.filetypes.add(filetype)\n except Format.DoesNotExist as e:\n raise CommandError(e)\n\n project.config['pootle_fs.fs_type'] = fs_type\n project.config['pootle_fs.fs_url'] = fs_url\n project.config['pootle_fs.translation_mappings'] = {\n 'default': options['translation_mapping']\n }\n if options['sync']:\n plugin = FSPlugin(project)\n plugin.fetch()\n plugin.add()\n plugin.sync()\n", "path": "pootle/apps/pootle_fs/management/commands/init_fs_project.py"}, {"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. 
See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nimport logging\nimport uuid\n\nimport dirsync\n\nfrom django import forms\n\nfrom pootle.core.delegate import revision\nfrom pootle_project.models import Project\n\nfrom .plugin import Plugin\n\n\nclass LocalFSPlugin(Plugin):\n\n fs_type = \"localfs\"\n _pulled = False\n\n @property\n def latest_hash(self):\n return revision.get(Project)(\n self.project).get(key=\"pootle.fs.fs_hash\")\n\n def push(self, response):\n dirsync.sync(\n self.project.local_fs_path,\n self.fs_url,\n \"sync\",\n purge=True,\n logger=logging.getLogger(dirsync.__name__))\n return response\n\n def fetch(self):\n synced = dirsync.sync(\n self.fs_url,\n self.project.local_fs_path,\n \"sync\",\n create=True,\n purge=True,\n logger=logging.getLogger(dirsync.__name__))\n if synced:\n revision.get(Project)(self.project).set(\n keys=[\"pootle.fs.fs_hash\"], value=uuid.uuid4().hex)\n\n\nclass LocalFSUrlValidator(object):\n\n help_text = \"Enter an absolute path to a directory on your filesystem\"\n\n def validate(self, url):\n if not url.startswith(\"/\"):\n raise forms.ValidationError(self.help_text)\n", "path": "pootle/apps/pootle_fs/localfs.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nimport logging\n\nfrom django.core.exceptions import ValidationError\nfrom django.core.management import BaseCommand, CommandError\n\nfrom pootle_format.models import Format\nfrom pootle_fs.exceptions import FSFetchError\nfrom pootle_fs.utils import FSPlugin, parse_fs_url\nfrom pootle_language.models import Language\nfrom pootle_project.models import Project\n\n\nlogger = logging.getLogger('pootle.fs')\n\n\nclass Command(BaseCommand):\n help = \"Init a new Pootle FS project.\"\n\n def add_arguments(self, parser):\n parser.add_argument(\n 'code',\n metavar='CODE',\n help='Project code'\n )\n parser.add_argument(\n 'fs',\n metavar='FS_URL',\n help='FS url \"filesystem_type+/repo/path/\"'\n )\n parser.add_argument(\n 'translation_mapping',\n help='Translation mapping \"<language_code>/<filename>.<ext>\"',\n metavar='TRANSLATION_MAPPING'\n )\n parser.add_argument(\n '-n', '--name',\n action='store',\n dest='name',\n nargs='?',\n help='Project name',\n )\n parser.add_argument(\n '--filetypes',\n action='append',\n dest='filetypes',\n help='File types',\n )\n parser.add_argument(\n '--checkstyle',\n action='store',\n dest='checkstyle',\n help='Checkstyle',\n nargs='?',\n default='standard'\n )\n parser.add_argument(\n '-l', '--source-language',\n action='store',\n dest='source_language',\n help=\"Code for the project's source language\",\n nargs='?',\n default='en'\n )\n parser.add_argument(\n '--nosync',\n action='store_false',\n dest='sync',\n help='Flag if sync is unnecessary',\n default=True\n )\n\n def handle(self, **options):\n source_language_code = options['source_language']\n try:\n source_language = Language.objects.get(code=source_language_code)\n except Language.DoesNotExist as e:\n self.stdout.write('%s: Unknown language code.' 
%\n source_language_code)\n raise CommandError(e)\n\n fs_type, fs_url = parse_fs_url(options['fs'])\n code = options['code']\n name = options['name'] or code.capitalize()\n\n try:\n project = Project.objects.create(\n code=code,\n fullname=name,\n treestyle='pootle_fs',\n checkstyle=options['checkstyle'],\n source_language=source_language)\n except ValidationError as e:\n raise CommandError(e)\n\n for filetype in options[\"filetypes\"] or [\"po\"]:\n try:\n filetype = Format.objects.get(name=filetype)\n project.filetypes.add(filetype)\n except Format.DoesNotExist as e:\n raise CommandError(e)\n\n project.config['pootle_fs.fs_type'] = fs_type\n project.config['pootle_fs.fs_url'] = fs_url\n project.config['pootle_fs.translation_mappings'] = {\n 'default': options['translation_mapping']\n }\n if options['sync']:\n try:\n plugin = FSPlugin(project)\n plugin.fetch()\n plugin.add()\n plugin.sync()\n except FSFetchError as e:\n project.delete()\n raise CommandError(e)\n", "path": "pootle/apps/pootle_fs/management/commands/init_fs_project.py"}, {"content": "# -*- coding: utf-8 -*-\n#\n# Copyright (C) Pootle contributors.\n#\n# This file is a part of the Pootle project. It is distributed under the GPL3\n# or later license. See the LICENSE file for a copy of the license and the\n# AUTHORS file for copyright and authorship information.\n\nimport logging\nimport uuid\n\nimport dirsync\n\nfrom django import forms\n\nfrom pootle.core.delegate import revision\nfrom pootle_project.models import Project\n\nfrom .exceptions import FSFetchError\nfrom .plugin import Plugin\n\n\nclass LocalFSPlugin(Plugin):\n\n fs_type = \"localfs\"\n _pulled = False\n\n @property\n def latest_hash(self):\n return revision.get(Project)(\n self.project).get(key=\"pootle.fs.fs_hash\")\n\n def push(self, response):\n dirsync.sync(\n self.project.local_fs_path,\n self.fs_url,\n \"sync\",\n purge=True,\n logger=logging.getLogger(dirsync.__name__))\n return response\n\n def fetch(self):\n try:\n synced = dirsync.sync(\n self.fs_url,\n self.project.local_fs_path,\n \"sync\",\n create=True,\n purge=True,\n logger=logging.getLogger(dirsync.__name__))\n except ValueError as e:\n raise FSFetchError(e)\n if synced:\n revision.get(Project)(self.project).set(\n keys=[\"pootle.fs.fs_hash\"], value=uuid.uuid4().hex)\n\n\nclass LocalFSUrlValidator(object):\n\n help_text = \"Enter an absolute path to a directory on your filesystem\"\n\n def validate(self, url):\n if not url.startswith(\"/\"):\n raise forms.ValidationError(self.help_text)\n", "path": "pootle/apps/pootle_fs/localfs.py"}]} | 1,750 | 525 |
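The golden diff above is an instance of a general cleanup-on-failure pattern: persist first, wrap the fallible follow-up steps, and roll the persisted record back before re-raising. A condensed sketch of the behavior the patch adds, using the names from the diff (`FSPlugin`, `FSFetchError`, `CommandError`):

```python
from django.core.management import CommandError
from pootle_fs.exceptions import FSFetchError
from pootle_fs.utils import FSPlugin

def sync_or_rollback(project, sync=True):
    """Mirror of the patched handle() tail: delete the project if sync fails."""
    if not sync:
        return project
    try:
        plugin = FSPlugin(project)
        plugin.fetch()   # localfs.fetch() now raises FSFetchError on a bad fs_url
        plugin.add()
        plugin.sync()
    except FSFetchError as e:
        project.delete()  # don't leave a half-initialized project behind
        raise CommandError(e)
    return project
```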
gh_patches_debug_34572 | rasdani/github-patches | git_diff | SigmaHQ__sigma-1895 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
sigma2attack does not support collections
Collections parsing happens only in [`collection.py`](tools/sigma/parser/collection.py) it seems but [`sigma2attack`](/tools/sigma/sigma2attack.py#L24) uses good old `yaml.safe_load` on his own.
that leads to errors when parsing and rules being ignored
```
[snip]
Ignoring rule rules\windows\other\win_tool_psexec.yml (parsing failed)
Ignoring rule rules\windows\powershell\win_powershell_web_request.yml (parsing failed)
Ignoring rule rules\windows\process_access\sysmon_cmstp_execution.yml (parsing failed)
Ignoring rule rules\windows\process_creation\win_apt_chafer_mar18.yml (parsing failed)
Ignoring rule rules\windows\process_creation\win_apt_empiremonkey.yml (parsing failed)
Ignoring rule rules\windows\process_creation\win_apt_gallium.yml (parsing failed)
[snip]
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tools/sigma/sigma2attack.py`
Content:
```
1 #!/usr/bin/env python3
2
3 import argparse
4 import glob
5 import json
6 import os
7 import sys
8
9 import yaml
10
11 def main():
12 parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
13 parser.add_argument("--rules-directory", "-d", dest="rules_dir", default="rules", help="Directory to read rules from")
14 parser.add_argument("--out-file", "-o", dest="out_file", default="heatmap.json", help="File to write the JSON layer to")
15 parser.add_argument("--no-comment", dest="no_comment", action="store_true", help="Don't store rule names in comments")
16 args = parser.parse_args()
17
18 rule_files = glob.glob(os.path.join(args.rules_dir, "**/*.yml"), recursive=True)
19 techniques_to_rules = {}
20 curr_max_technique_count = 0
21 num_rules_used = 0
22 for rule_file in rule_files:
23 try:
24 rule = yaml.safe_load(open(rule_file, encoding="utf-8").read())
25 except yaml.YAMLError:
26 sys.stderr.write("Ignoring rule " + rule_file + " (parsing failed)\n")
27 continue
28 if "tags" not in rule:
29 sys.stderr.write("Ignoring rule " + rule_file + " (no tags)\n")
30 continue
31 tags = rule["tags"]
32 for tag in tags:
33 if tag.lower().startswith("attack.t"):
34 technique_id = tag[len("attack."):].upper()
35 num_rules_used += 1
36 if technique_id not in techniques_to_rules:
37 techniques_to_rules[technique_id] = []
38 techniques_to_rules[technique_id].append(os.path.basename(rule_file))
39 curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))
40
41
42 scores = []
43 for technique in techniques_to_rules:
44 entry = {
45 "techniqueID": technique,
46 "score": len(techniques_to_rules[technique]),
47 }
48 if not args.no_comment:
49 entry["comment"] = "\n".join(techniques_to_rules[technique])
50
51 scores.append(entry)
52
53 output = {
54 "domain": "mitre-enterprise",
55 "name": "Sigma rules heatmap",
56 "gradient": {
57 "colors": [
58 "#ffffff",
59 "#ff6666"
60 ],
61 "maxValue": curr_max_technique_count,
62 "minValue": 0
63 },
64 "versions": {
65 "navigator": "4.0",
66 "layer": "4.0"
67 },
68 "techniques": scores,
69 }
70
71 with open(args.out_file, "w") as f:
72 f.write(json.dumps(output))
73 print("[*] Layer file written in " + args.out_file + " (" + str(num_rules_used) + " rules)")
74
75 if __name__ == "__main__":
76 main()
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/tools/sigma/sigma2attack.py b/tools/sigma/sigma2attack.py
--- a/tools/sigma/sigma2attack.py
+++ b/tools/sigma/sigma2attack.py
@@ -8,6 +8,7 @@
import yaml
+
def main():
parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)
parser.add_argument("--rules-directory", "-d", dest="rules_dir", default="rules", help="Directory to read rules from")
@@ -20,24 +21,25 @@
curr_max_technique_count = 0
num_rules_used = 0
for rule_file in rule_files:
- try:
- rule = yaml.safe_load(open(rule_file, encoding="utf-8").read())
- except yaml.YAMLError:
- sys.stderr.write("Ignoring rule " + rule_file + " (parsing failed)\n")
- continue
- if "tags" not in rule:
- sys.stderr.write("Ignoring rule " + rule_file + " (no tags)\n")
- continue
- tags = rule["tags"]
- for tag in tags:
- if tag.lower().startswith("attack.t"):
- technique_id = tag[len("attack."):].upper()
- num_rules_used += 1
- if technique_id not in techniques_to_rules:
- techniques_to_rules[technique_id] = []
- techniques_to_rules[technique_id].append(os.path.basename(rule_file))
- curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))
-
+ with open(rule_file,encoding='utf-8') as f:
+ docs = yaml.load_all(f, Loader=yaml.FullLoader)
+ double = False
+ for rule in docs:
+ if "tags" not in rule :
+ if double == False : # Only 1 warning
+ sys.stderr.write("Ignoring rule " + rule_file + " (no tags)\n")
+ double = True # action globle no tag
+ continue
+ tags = rule["tags"]
+ double = True
+ for tag in tags:
+ if tag.lower().startswith("attack.t"):
+ technique_id = tag[len("attack."):].upper()
+ num_rules_used += 1
+ if technique_id not in techniques_to_rules:
+ techniques_to_rules[technique_id] = []
+ techniques_to_rules[technique_id].append(os.path.basename(rule_file))
+ curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))
scores = []
for technique in techniques_to_rules:
| {"golden_diff": "diff --git a/tools/sigma/sigma2attack.py b/tools/sigma/sigma2attack.py\n--- a/tools/sigma/sigma2attack.py\n+++ b/tools/sigma/sigma2attack.py\n@@ -8,6 +8,7 @@\n \n import yaml\n \n+\n def main():\n parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)\n parser.add_argument(\"--rules-directory\", \"-d\", dest=\"rules_dir\", default=\"rules\", help=\"Directory to read rules from\")\n@@ -20,24 +21,25 @@\n curr_max_technique_count = 0\n num_rules_used = 0\n for rule_file in rule_files:\n- try:\n- rule = yaml.safe_load(open(rule_file, encoding=\"utf-8\").read())\n- except yaml.YAMLError:\n- sys.stderr.write(\"Ignoring rule \" + rule_file + \" (parsing failed)\\n\")\n- continue\n- if \"tags\" not in rule:\n- sys.stderr.write(\"Ignoring rule \" + rule_file + \" (no tags)\\n\")\n- continue\n- tags = rule[\"tags\"]\n- for tag in tags:\n- if tag.lower().startswith(\"attack.t\"):\n- technique_id = tag[len(\"attack.\"):].upper()\n- num_rules_used += 1\n- if technique_id not in techniques_to_rules:\n- techniques_to_rules[technique_id] = []\n- techniques_to_rules[technique_id].append(os.path.basename(rule_file))\n- curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))\n-\n+ with open(rule_file,encoding='utf-8') as f:\n+ docs = yaml.load_all(f, Loader=yaml.FullLoader)\n+ double = False\n+ for rule in docs:\n+ if \"tags\" not in rule :\n+ if double == False : # Only 1 warning\n+ sys.stderr.write(\"Ignoring rule \" + rule_file + \" (no tags)\\n\")\n+ double = True # action globle no tag\n+ continue\n+ tags = rule[\"tags\"]\n+ double = True\n+ for tag in tags:\n+ if tag.lower().startswith(\"attack.t\"):\n+ technique_id = tag[len(\"attack.\"):].upper()\n+ num_rules_used += 1\n+ if technique_id not in techniques_to_rules:\n+ techniques_to_rules[technique_id] = []\n+ techniques_to_rules[technique_id].append(os.path.basename(rule_file))\n+ curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))\n \n scores = []\n for technique in techniques_to_rules:\n", "issue": "sigma2attack does not support collections\nCollections parsing happens only in [`collection.py`](tools/sigma/parser/collection.py) it seems but [`sigma2attack`](/tools/sigma/sigma2attack.py#L24) uses good old `yaml.safe_load` on his own.\r\n\r\nthat leads to errors when parsing and rules being ignored\r\n\r\n```\r\n[snip]\r\nIgnoring rule rules\\windows\\other\\win_tool_psexec.yml (parsing failed)\r\nIgnoring rule rules\\windows\\powershell\\win_powershell_web_request.yml (parsing failed)\r\nIgnoring rule rules\\windows\\process_access\\sysmon_cmstp_execution.yml (parsing failed)\r\nIgnoring rule rules\\windows\\process_creation\\win_apt_chafer_mar18.yml (parsing failed)\r\nIgnoring rule rules\\windows\\process_creation\\win_apt_empiremonkey.yml (parsing failed)\r\nIgnoring rule rules\\windows\\process_creation\\win_apt_gallium.yml (parsing failed)\r\n[snip]\r\n```\n", "before_files": [{"content": "#!/usr/bin/env python3\n\nimport argparse\nimport glob\nimport json\nimport os\nimport sys\n\nimport yaml\n\ndef main():\n parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)\n parser.add_argument(\"--rules-directory\", \"-d\", dest=\"rules_dir\", default=\"rules\", help=\"Directory to read rules from\")\n parser.add_argument(\"--out-file\", \"-o\", dest=\"out_file\", default=\"heatmap.json\", help=\"File to write the JSON layer to\")\n parser.add_argument(\"--no-comment\", 
dest=\"no_comment\", action=\"store_true\", help=\"Don't store rule names in comments\")\n args = parser.parse_args()\n\n rule_files = glob.glob(os.path.join(args.rules_dir, \"**/*.yml\"), recursive=True)\n techniques_to_rules = {}\n curr_max_technique_count = 0\n num_rules_used = 0\n for rule_file in rule_files:\n try:\n rule = yaml.safe_load(open(rule_file, encoding=\"utf-8\").read())\n except yaml.YAMLError:\n sys.stderr.write(\"Ignoring rule \" + rule_file + \" (parsing failed)\\n\")\n continue\n if \"tags\" not in rule:\n sys.stderr.write(\"Ignoring rule \" + rule_file + \" (no tags)\\n\")\n continue\n tags = rule[\"tags\"]\n for tag in tags:\n if tag.lower().startswith(\"attack.t\"):\n technique_id = tag[len(\"attack.\"):].upper()\n num_rules_used += 1\n if technique_id not in techniques_to_rules:\n techniques_to_rules[technique_id] = []\n techniques_to_rules[technique_id].append(os.path.basename(rule_file))\n curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))\n\n\n scores = []\n for technique in techniques_to_rules:\n entry = {\n \"techniqueID\": technique, \n \"score\": len(techniques_to_rules[technique]), \n }\n if not args.no_comment:\n entry[\"comment\"] = \"\\n\".join(techniques_to_rules[technique])\n\n scores.append(entry)\n\n output = {\n \"domain\": \"mitre-enterprise\",\n \"name\": \"Sigma rules heatmap\",\n \"gradient\": {\n \"colors\": [\n \"#ffffff\",\n \"#ff6666\"\n ],\n \"maxValue\": curr_max_technique_count,\n \"minValue\": 0\n },\n \"versions\": {\n \"navigator\": \"4.0\",\n \"layer\": \"4.0\"\n },\n \"techniques\": scores,\n }\n\n with open(args.out_file, \"w\") as f:\n f.write(json.dumps(output))\n print(\"[*] Layer file written in \" + args.out_file + \" (\" + str(num_rules_used) + \" rules)\")\n\nif __name__ == \"__main__\":\n main()\n", "path": "tools/sigma/sigma2attack.py"}], "after_files": [{"content": "#!/usr/bin/env python3\n\nimport argparse\nimport glob\nimport json\nimport os\nimport sys\n\nimport yaml\n\n\ndef main():\n parser = argparse.ArgumentParser(formatter_class=argparse.ArgumentDefaultsHelpFormatter)\n parser.add_argument(\"--rules-directory\", \"-d\", dest=\"rules_dir\", default=\"rules\", help=\"Directory to read rules from\")\n parser.add_argument(\"--out-file\", \"-o\", dest=\"out_file\", default=\"heatmap.json\", help=\"File to write the JSON layer to\")\n parser.add_argument(\"--no-comment\", dest=\"no_comment\", action=\"store_true\", help=\"Don't store rule names in comments\")\n args = parser.parse_args()\n\n rule_files = glob.glob(os.path.join(args.rules_dir, \"**/*.yml\"), recursive=True)\n techniques_to_rules = {}\n curr_max_technique_count = 0\n num_rules_used = 0\n for rule_file in rule_files:\n with open(rule_file,encoding='utf-8') as f:\n docs = yaml.load_all(f, Loader=yaml.FullLoader)\n double = False\n for rule in docs:\n if \"tags\" not in rule :\n if double == False : # Only 1 warning\n sys.stderr.write(\"Ignoring rule \" + rule_file + \" (no tags)\\n\")\n double = True # action globle no tag\n continue\n tags = rule[\"tags\"]\n double = True\n for tag in tags:\n if tag.lower().startswith(\"attack.t\"):\n technique_id = tag[len(\"attack.\"):].upper()\n num_rules_used += 1\n if technique_id not in techniques_to_rules:\n techniques_to_rules[technique_id] = []\n techniques_to_rules[technique_id].append(os.path.basename(rule_file))\n curr_max_technique_count = max(curr_max_technique_count, len(techniques_to_rules[technique_id]))\n\n scores = []\n for technique in techniques_to_rules:\n 
entry = {\n \"techniqueID\": technique, \n \"score\": len(techniques_to_rules[technique]), \n }\n if not args.no_comment:\n entry[\"comment\"] = \"\\n\".join(techniques_to_rules[technique])\n\n scores.append(entry)\n\n output = {\n \"domain\": \"mitre-enterprise\",\n \"name\": \"Sigma rules heatmap\",\n \"gradient\": {\n \"colors\": [\n \"#ffffff\",\n \"#ff6666\"\n ],\n \"maxValue\": curr_max_technique_count,\n \"minValue\": 0\n },\n \"versions\": {\n \"navigator\": \"4.0\",\n \"layer\": \"4.0\"\n },\n \"techniques\": scores,\n }\n\n with open(args.out_file, \"w\") as f:\n f.write(json.dumps(output))\n print(\"[*] Layer file written in \" + args.out_file + \" (\" + str(num_rules_used) + \" rules)\")\n\nif __name__ == \"__main__\":\n main()\n", "path": "tools/sigma/sigma2attack.py"}]} | 1,241 | 593 |
gh_patches_debug_11125 | rasdani/github-patches | git_diff | Kinto__kinto-2108 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
memcached cache backend does not really test memcached
@autrilla reported that when he moved Kinto to GCP, the heartbeat reported success for all backends, including cache, even though there is no `memcached`. I think the problem is here:
https://github.com/Kinto/kinto/blob/61494caae2bb8fe342f32f6a5d89f40ed2b62dff/kinto/core/cache/__init__.py#L87-L91
The heartbeat for the cache backend just sends two messages which don't expect responses. For something like memcached which communicates over UDP, these messages can "succeed" by silently going nowhere.
--- END ISSUE ---
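To make the failure mode concrete, here is a minimal sketch (the `DeadCache` class and helper names are hypothetical, not Kinto code) showing why a write-only heartbeat reports success against an unreachable backend, and how reading the value back closes the gap:

```python
class DeadCache:
    """Stand-in for an unreachable memcached: writes vanish silently."""

    def set(self, key, value, ttl):
        pass  # the datagram goes nowhere and no error is raised

    def delete(self, key):
        pass

    def get(self, key):
        return None  # nothing was ever stored


def naive_ping(backend):
    backend.set("__heartbeat__", "alive", 3600)
    return True  # reports "healthy" even against DeadCache


def verified_ping(backend):
    backend.set("__heartbeat__", "alive", 3600)
    return backend.get("__heartbeat__") == "alive"  # round-trips the value


assert naive_ping(DeadCache()) is True       # false positive
assert verified_ping(DeadCache()) is False   # correctly reports the outage
```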
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kinto/core/cache/__init__.py`
Content:
```
1 import logging
2 import random
3
4
5 logger = logging.getLogger(__name__)
6
7
8 _HEARTBEAT_DELETE_RATE = 0.5
9 _HEARTBEAT_KEY = "__heartbeat__"
10 _HEARTBEAT_TTL_SECONDS = 3600
11
12
13 class CacheBase:
14 def __init__(self, *args, **kwargs):
15 self.prefix = kwargs["cache_prefix"]
16 self.max_size_bytes = kwargs.get("cache_max_size_bytes")
17
18 def initialize_schema(self, dry_run=False):
19 """Create every necessary objects (like tables or indices) in the
20 backend.
21
22 This is executed when the ``kinto migrate`` command is run.
23
24 :param bool dry_run: simulate instead of executing the operations.
25 """
26 raise NotImplementedError
27
28 def flush(self):
29 """Delete every values."""
30 raise NotImplementedError
31
32 def ttl(self, key):
33 """Obtain the expiration value of the specified `key`.
34
35 :param str key: key
36 :returns: number of seconds or negative if no TTL.
37 :rtype: float
38 """
39 raise NotImplementedError
40
41 def expire(self, key, ttl):
42 """Set the expiration value `ttl` for the specified `key`.
43
44 :param str key: key
45 :param float ttl: number of seconds
46 """
47 raise NotImplementedError
48
49 def set(self, key, value, ttl):
50 """Store a value with the specified `key`.
51
52 :param str key: key
53 :param str value: value to store
54 :param float ttl: expire after number of seconds
55 """
56 raise NotImplementedError
57
58 def get(self, key):
59 """Obtain the value of the specified `key`.
60
61 :param str key: key
62 :returns: the stored value or None if missing.
63 :rtype: str
64 """
65 raise NotImplementedError
66
67 def delete(self, key):
68 """Delete the value of the specified `key`.
69
70 :param str key: key
71 """
72 raise NotImplementedError
73
74
75 def heartbeat(backend):
76 def ping(request):
77 """Test that cache backend is operational.
78
79 :param request: current request object
80 :type request: :class:`~pyramid:pyramid.request.Request`
81 :returns: ``True`` is everything is ok, ``False`` otherwise.
82 :rtype: bool
83 """
84 # No specific case for readonly mode because the cache should
85 # continue to work in that mode.
86 try:
87 if random.SystemRandom().random() < _HEARTBEAT_DELETE_RATE:
88 backend.delete(_HEARTBEAT_KEY)
89 else:
90 backend.set(_HEARTBEAT_KEY, "alive", _HEARTBEAT_TTL_SECONDS)
91 return True
92 except Exception:
93 logger.exception("Heartbeat Failure")
94 return False
95
96 return ping
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kinto/core/cache/__init__.py b/kinto/core/cache/__init__.py
--- a/kinto/core/cache/__init__.py
+++ b/kinto/core/cache/__init__.py
@@ -86,9 +86,9 @@
try:
if random.SystemRandom().random() < _HEARTBEAT_DELETE_RATE:
backend.delete(_HEARTBEAT_KEY)
- else:
- backend.set(_HEARTBEAT_KEY, "alive", _HEARTBEAT_TTL_SECONDS)
- return True
+ return backend.get(_HEARTBEAT_KEY) is None
+ backend.set(_HEARTBEAT_KEY, "alive", _HEARTBEAT_TTL_SECONDS)
+ return backend.get(_HEARTBEAT_KEY) == "alive"
except Exception:
logger.exception("Heartbeat Failure")
return False
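With the patched heartbeat, both branches read the key back, so a regression test can use a backend stub whose reads never reflect writes. A hypothetical pytest-style sketch (not from Kinto's test suite; assumes `heartbeat` is importable from `kinto.core.cache`):

```python
from kinto.core.cache import heartbeat


def test_heartbeat_detects_silent_backend():
    class SilentBackend:
        def set(self, key, value, ttl):
            pass  # write is dropped

        def delete(self, key):
            pass

        def get(self, key):
            return "stale"  # never matches what the heartbeat wrote

    ping = heartbeat(SilentBackend())
    # Whichever branch the random draw takes, the read-back check fails:
    assert ping(request=None) is False
```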
| {"golden_diff": "diff --git a/kinto/core/cache/__init__.py b/kinto/core/cache/__init__.py\n--- a/kinto/core/cache/__init__.py\n+++ b/kinto/core/cache/__init__.py\n@@ -86,9 +86,9 @@\n try:\n if random.SystemRandom().random() < _HEARTBEAT_DELETE_RATE:\n backend.delete(_HEARTBEAT_KEY)\n- else:\n- backend.set(_HEARTBEAT_KEY, \"alive\", _HEARTBEAT_TTL_SECONDS)\n- return True\n+ return backend.get(_HEARTBEAT_KEY) is None\n+ backend.set(_HEARTBEAT_KEY, \"alive\", _HEARTBEAT_TTL_SECONDS)\n+ return backend.get(_HEARTBEAT_KEY) == \"alive\"\n except Exception:\n logger.exception(\"Heartbeat Failure\")\n return False\n", "issue": "memcached cache backend does not really test memcached\n@autrilla reported that when he moved Kinto to GCP, the heartbeat reported success for all backends, including cache, even though there is no `memcached`. I think the problem is here:\r\n\r\nhttps://github.com/Kinto/kinto/blob/61494caae2bb8fe342f32f6a5d89f40ed2b62dff/kinto/core/cache/__init__.py#L87-L91\r\n\r\nThe heartbeat for the cache backend just sends two messages which don't expect responses. For something like memcached which communicates over UDP, these messages can \"succeed\" by silently going nowhere.\nmemcached cache backend does not really test memcached\n@autrilla reported that when he moved Kinto to GCP, the heartbeat reported success for all backends, including cache, even though there is no `memcached`. I think the problem is here:\r\n\r\nhttps://github.com/Kinto/kinto/blob/61494caae2bb8fe342f32f6a5d89f40ed2b62dff/kinto/core/cache/__init__.py#L87-L91\r\n\r\nThe heartbeat for the cache backend just sends two messages which don't expect responses. For something like memcached which communicates over UDP, these messages can \"succeed\" by silently going nowhere.\n", "before_files": [{"content": "import logging\nimport random\n\n\nlogger = logging.getLogger(__name__)\n\n\n_HEARTBEAT_DELETE_RATE = 0.5\n_HEARTBEAT_KEY = \"__heartbeat__\"\n_HEARTBEAT_TTL_SECONDS = 3600\n\n\nclass CacheBase:\n def __init__(self, *args, **kwargs):\n self.prefix = kwargs[\"cache_prefix\"]\n self.max_size_bytes = kwargs.get(\"cache_max_size_bytes\")\n\n def initialize_schema(self, dry_run=False):\n \"\"\"Create every necessary objects (like tables or indices) in the\n backend.\n\n This is executed when the ``kinto migrate`` command is run.\n\n :param bool dry_run: simulate instead of executing the operations.\n \"\"\"\n raise NotImplementedError\n\n def flush(self):\n \"\"\"Delete every values.\"\"\"\n raise NotImplementedError\n\n def ttl(self, key):\n \"\"\"Obtain the expiration value of the specified `key`.\n\n :param str key: key\n :returns: number of seconds or negative if no TTL.\n :rtype: float\n \"\"\"\n raise NotImplementedError\n\n def expire(self, key, ttl):\n \"\"\"Set the expiration value `ttl` for the specified `key`.\n\n :param str key: key\n :param float ttl: number of seconds\n \"\"\"\n raise NotImplementedError\n\n def set(self, key, value, ttl):\n \"\"\"Store a value with the specified `key`.\n\n :param str key: key\n :param str value: value to store\n :param float ttl: expire after number of seconds\n \"\"\"\n raise NotImplementedError\n\n def get(self, key):\n \"\"\"Obtain the value of the specified `key`.\n\n :param str key: key\n :returns: the stored value or None if missing.\n :rtype: str\n \"\"\"\n raise NotImplementedError\n\n def delete(self, key):\n \"\"\"Delete the value of the specified `key`.\n\n :param str key: key\n \"\"\"\n raise NotImplementedError\n\n\ndef heartbeat(backend):\n def 
ping(request):\n \"\"\"Test that cache backend is operational.\n\n :param request: current request object\n :type request: :class:`~pyramid:pyramid.request.Request`\n :returns: ``True`` is everything is ok, ``False`` otherwise.\n :rtype: bool\n \"\"\"\n # No specific case for readonly mode because the cache should\n # continue to work in that mode.\n try:\n if random.SystemRandom().random() < _HEARTBEAT_DELETE_RATE:\n backend.delete(_HEARTBEAT_KEY)\n else:\n backend.set(_HEARTBEAT_KEY, \"alive\", _HEARTBEAT_TTL_SECONDS)\n return True\n except Exception:\n logger.exception(\"Heartbeat Failure\")\n return False\n\n return ping\n", "path": "kinto/core/cache/__init__.py"}], "after_files": [{"content": "import logging\nimport random\n\n\nlogger = logging.getLogger(__name__)\n\n\n_HEARTBEAT_DELETE_RATE = 0.5\n_HEARTBEAT_KEY = \"__heartbeat__\"\n_HEARTBEAT_TTL_SECONDS = 3600\n\n\nclass CacheBase:\n def __init__(self, *args, **kwargs):\n self.prefix = kwargs[\"cache_prefix\"]\n self.max_size_bytes = kwargs.get(\"cache_max_size_bytes\")\n\n def initialize_schema(self, dry_run=False):\n \"\"\"Create every necessary objects (like tables or indices) in the\n backend.\n\n This is executed when the ``kinto migrate`` command is run.\n\n :param bool dry_run: simulate instead of executing the operations.\n \"\"\"\n raise NotImplementedError\n\n def flush(self):\n \"\"\"Delete every values.\"\"\"\n raise NotImplementedError\n\n def ttl(self, key):\n \"\"\"Obtain the expiration value of the specified `key`.\n\n :param str key: key\n :returns: number of seconds or negative if no TTL.\n :rtype: float\n \"\"\"\n raise NotImplementedError\n\n def expire(self, key, ttl):\n \"\"\"Set the expiration value `ttl` for the specified `key`.\n\n :param str key: key\n :param float ttl: number of seconds\n \"\"\"\n raise NotImplementedError\n\n def set(self, key, value, ttl):\n \"\"\"Store a value with the specified `key`.\n\n :param str key: key\n :param str value: value to store\n :param float ttl: expire after number of seconds\n \"\"\"\n raise NotImplementedError\n\n def get(self, key):\n \"\"\"Obtain the value of the specified `key`.\n\n :param str key: key\n :returns: the stored value or None if missing.\n :rtype: str\n \"\"\"\n raise NotImplementedError\n\n def delete(self, key):\n \"\"\"Delete the value of the specified `key`.\n\n :param str key: key\n \"\"\"\n raise NotImplementedError\n\n\ndef heartbeat(backend):\n def ping(request):\n \"\"\"Test that cache backend is operational.\n\n :param request: current request object\n :type request: :class:`~pyramid:pyramid.request.Request`\n :returns: ``True`` is everything is ok, ``False`` otherwise.\n :rtype: bool\n \"\"\"\n # No specific case for readonly mode because the cache should\n # continue to work in that mode.\n try:\n if random.SystemRandom().random() < _HEARTBEAT_DELETE_RATE:\n backend.delete(_HEARTBEAT_KEY)\n return backend.get(_HEARTBEAT_KEY) is None\n backend.set(_HEARTBEAT_KEY, \"alive\", _HEARTBEAT_TTL_SECONDS)\n return backend.get(_HEARTBEAT_KEY) == \"alive\"\n except Exception:\n logger.exception(\"Heartbeat Failure\")\n return False\n\n return ping\n", "path": "kinto/core/cache/__init__.py"}]} | 1,355 | 188 |
gh_patches_debug_27587 | rasdani/github-patches | git_diff | ocadotechnology__aimmo-101 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
in portal/rr, when you are redirected to home, you are redirected to AI:MMO home
The website now loads the aimmo URLs.
But each time the website is supposed to redirect you to the portal home, it now redirects to the AI:MMO login page instead.
This is probably because both URL patterns are named the same in their respective urls.py files, and the website imports both, ending up with the aimmo URLs.
--- END ISSUE ---
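The suspected mechanism is easy to model: Django keeps one reverse entry per URL name, so two apps registering the same name shadow each other. A minimal, self-contained model of the clash (hypothetical registration order, not the real URLconfs; see the rename in the patch below):

```python
url_names = {}


def register(name, path):
    if name in url_names:
        print(f"name clash: {name!r} already maps to {url_names[name]!r}")
    url_names[name] = path


register("home", "/portal/")  # the portal's home page
register("home", "/aimmo/")   # the aimmo app reuses the name, so any
                              # redirect built from 'home' is ambiguous

assert url_names["home"] == "/aimmo/"  # in this model the later import wins
```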
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `players/autoconfig.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 # Code for Life
3 #
4 # Copyright (C) 2015, Ocado Innovation Limited
5 #
6 # This program is free software: you can redistribute it and/or modify
7 # it under the terms of the GNU Affero General Public License as
8 # published by the Free Software Foundation, either version 3 of the
9 # License, or (at your option) any later version.
10 #
11 # This program is distributed in the hope that it will be useful,
12 # but WITHOUT ANY WARRANTY; without even the implied warranty of
13 # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
14 # GNU Affero General Public License for more details.
15 #
16 # You should have received a copy of the GNU Affero General Public License
17 # along with this program. If not, see <http://www.gnu.org/licenses/>.
18 #
19 # ADDITIONAL TERMS – Section 7 GNU General Public Licence
20 #
21 # This licence does not grant any right, title or interest in any “Ocado” logos,
22 # trade names or the trademark “Ocado” or any other trademarks or domain names
23 # owned by Ocado Innovation Limited or the Ocado group of companies or any other
24 # distinctive brand features of “Ocado” as may be secured from time to time. You
25 # must not distribute any modification of this program using the trademark
26 # “Ocado” or claim any affiliation or association with Ocado or its employees.
27 #
28 # You are not authorised to use the name Ocado (or any of its trade names) or
29 # the names of any author or contributor in advertising or for publicity purposes
30 # pertaining to the distribution of this program, without the prior written
31 # authorisation of Ocado.
32 #
33 # Any propagation, distribution or conveyance of this program must include this
34 # copyright notice and these terms. You must not misrepresent the origins of this
35 # program; modified versions of the program must be marked as such and not
36 # identified as the original program.
37 '''Players autoconfig'''
38
39 DEFAULT_SETTINGS = {
40 'AUTOCONFIG_INDEX_VIEW': 'home',
41 'STATIC_URL': '/static/',
42 }
43
44 SETTINGS = {
45 'INSTALLED_APPS': [
46 'django.contrib.auth',
47 'django.contrib.messages',
48 'django.contrib.staticfiles',
49 'django_js_reverse',
50 ],
51 'TEMPLATES': [
52 {
53 'BACKEND': 'django.template.backends.django.DjangoTemplates',
54 'APP_DIRS': True,
55 'OPTIONS': {
56 'context_processors': [
57 'django.template.context_processors.debug',
58 'django.template.context_processors.request',
59 'django.contrib.auth.context_processors.auth',
60 'django.contrib.messages.context_processors.messages',
61 ]
62 }
63 }
64 ],
65 'USE_TZ': True,
66 }
67
```
Path: `players/urls.py`
Content:
```
1 from django.conf.urls import url
2 from django.views.generic import TemplateView
3 from django.contrib.auth.decorators import login_required
4 from django.contrib.admin.views.decorators import staff_member_required
5
6 from players import views
7
8 urlpatterns = [
9 url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='home'),
10
11 url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='program'),
12 url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='watch'),
13 url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='statistics'),
14
15 url(r'^api/code/$', staff_member_required(views.code), name='code'),
16 url(r'^api/games/$', views.games, name='games'),
17
18 url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='js_reverse'), # TODO: Pull request to make django_js_reverse.urls
19 ]
20
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/players/autoconfig.py b/players/autoconfig.py
--- a/players/autoconfig.py
+++ b/players/autoconfig.py
@@ -37,7 +37,7 @@
'''Players autoconfig'''
DEFAULT_SETTINGS = {
- 'AUTOCONFIG_INDEX_VIEW': 'home',
+ 'AUTOCONFIG_INDEX_VIEW': 'aimmo/home',
'STATIC_URL': '/static/',
}
diff --git a/players/urls.py b/players/urls.py
--- a/players/urls.py
+++ b/players/urls.py
@@ -6,14 +6,14 @@
from players import views
urlpatterns = [
- url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='home'),
+ url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='aimmo/home'),
- url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='program'),
- url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='watch'),
- url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='statistics'),
+ url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='aimmo/program'),
+ url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='aimmo/watch'),
+ url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='aimmo/statistics'),
- url(r'^api/code/$', staff_member_required(views.code), name='code'),
- url(r'^api/games/$', views.games, name='games'),
+ url(r'^api/code/$', staff_member_required(views.code), name='aimmo/code'),
+ url(r'^api/games/$', views.games, name='aimmo/games'),
- url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='js_reverse'), # TODO: Pull request to make django_js_reverse.urls
+ url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='aimmo/js_reverse'), # TODO: Pull request to make django_js_reverse.urls
]
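One follow-up the rename implies (an assumption on my part; it is not shown in this diff): any code or template that reverses the old names must switch to the prefixed ones, e.g.:

```python
from django.core.urlresolvers import reverse

reverse('aimmo/home')  # was reverse('home')
# and in templates: {% url 'aimmo/watch' %} instead of {% url 'watch' %}
```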
| {"golden_diff": "diff --git a/players/autoconfig.py b/players/autoconfig.py\n--- a/players/autoconfig.py\n+++ b/players/autoconfig.py\n@@ -37,7 +37,7 @@\n '''Players autoconfig'''\n \n DEFAULT_SETTINGS = {\n- 'AUTOCONFIG_INDEX_VIEW': 'home',\n+ 'AUTOCONFIG_INDEX_VIEW': 'aimmo/home',\n 'STATIC_URL': '/static/',\n }\n \ndiff --git a/players/urls.py b/players/urls.py\n--- a/players/urls.py\n+++ b/players/urls.py\n@@ -6,14 +6,14 @@\n from players import views\n \n urlpatterns = [\n- url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='home'),\n+ url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='aimmo/home'),\n \n- url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='program'),\n- url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='watch'),\n- url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='statistics'),\n+ url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='aimmo/program'),\n+ url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='aimmo/watch'),\n+ url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='aimmo/statistics'),\n \n- url(r'^api/code/$', staff_member_required(views.code), name='code'),\n- url(r'^api/games/$', views.games, name='games'),\n+ url(r'^api/code/$', staff_member_required(views.code), name='aimmo/code'),\n+ url(r'^api/games/$', views.games, name='aimmo/games'),\n \n- url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='js_reverse'), # TODO: Pull request to make django_js_reverse.urls\n+ url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='aimmo/js_reverse'), # TODO: Pull request to make django_js_reverse.urls\n ]\n", "issue": "in portal/rr, when you are redirected to home, you are redirected to AI:MMO home\nThe website now loads the aimmo urls.\nBut now each time the website is supposed to redirect you to the portal home, it redirects to the AI:MMO login page.\nProbably because both urls are named the same in their respective urls.py and the website imports both, finishing with aimmo urls?\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n# Code for Life\n#\n# Copyright (C) 2015, Ocado Innovation Limited\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. 
If not, see <http://www.gnu.org/licenses/>.\n#\n# ADDITIONAL TERMS \u2013 Section 7 GNU General Public Licence\n#\n# This licence does not grant any right, title or interest in any \u201cOcado\u201d logos,\n# trade names or the trademark \u201cOcado\u201d or any other trademarks or domain names\n# owned by Ocado Innovation Limited or the Ocado group of companies or any other\n# distinctive brand features of \u201cOcado\u201d as may be secured from time to time. You\n# must not distribute any modification of this program using the trademark\n# \u201cOcado\u201d or claim any affiliation or association with Ocado or its employees.\n#\n# You are not authorised to use the name Ocado (or any of its trade names) or\n# the names of any author or contributor in advertising or for publicity purposes\n# pertaining to the distribution of this program, without the prior written\n# authorisation of Ocado.\n#\n# Any propagation, distribution or conveyance of this program must include this\n# copyright notice and these terms. You must not misrepresent the origins of this\n# program; modified versions of the program must be marked as such and not\n# identified as the original program.\n'''Players autoconfig'''\n\nDEFAULT_SETTINGS = {\n 'AUTOCONFIG_INDEX_VIEW': 'home',\n 'STATIC_URL': '/static/',\n}\n\nSETTINGS = {\n 'INSTALLED_APPS': [\n 'django.contrib.auth',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django_js_reverse',\n ],\n 'TEMPLATES': [\n {\n 'BACKEND': 'django.template.backends.django.DjangoTemplates',\n 'APP_DIRS': True,\n 'OPTIONS': {\n 'context_processors': [\n 'django.template.context_processors.debug',\n 'django.template.context_processors.request',\n 'django.contrib.auth.context_processors.auth',\n 'django.contrib.messages.context_processors.messages',\n ]\n }\n }\n ],\n 'USE_TZ': True,\n}\n", "path": "players/autoconfig.py"}, {"content": "from django.conf.urls import url\nfrom django.views.generic import TemplateView\nfrom django.contrib.auth.decorators import login_required\nfrom django.contrib.admin.views.decorators import staff_member_required\n\nfrom players import views\n\nurlpatterns = [\n url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='home'),\n\n url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='program'),\n url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='watch'),\n url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='statistics'),\n\n url(r'^api/code/$', staff_member_required(views.code), name='code'),\n url(r'^api/games/$', views.games, name='games'),\n\n url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='js_reverse'), # TODO: Pull request to make django_js_reverse.urls\n]\n", "path": "players/urls.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n# Code for Life\n#\n# Copyright (C) 2015, Ocado Innovation Limited\n#\n# This program is free software: you can redistribute it and/or modify\n# it under the terms of the GNU Affero General Public License as\n# published by the Free Software Foundation, either version 3 of the\n# License, or (at your option) any later version.\n#\n# This program is distributed in the hope that it will be useful,\n# but WITHOUT ANY WARRANTY; without even the implied warranty of\n# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the\n# GNU Affero General Public License for more details.\n#\n# You should have received a copy of the GNU Affero General Public License\n# along with this program. If not, see <http://www.gnu.org/licenses/>.\n#\n# ADDITIONAL TERMS \u2013 Section 7 GNU General Public Licence\n#\n# This licence does not grant any right, title or interest in any \u201cOcado\u201d logos,\n# trade names or the trademark \u201cOcado\u201d or any other trademarks or domain names\n# owned by Ocado Innovation Limited or the Ocado group of companies or any other\n# distinctive brand features of \u201cOcado\u201d as may be secured from time to time. You\n# must not distribute any modification of this program using the trademark\n# \u201cOcado\u201d or claim any affiliation or association with Ocado or its employees.\n#\n# You are not authorised to use the name Ocado (or any of its trade names) or\n# the names of any author or contributor in advertising or for publicity purposes\n# pertaining to the distribution of this program, without the prior written\n# authorisation of Ocado.\n#\n# Any propagation, distribution or conveyance of this program must include this\n# copyright notice and these terms. You must not misrepresent the origins of this\n# program; modified versions of the program must be marked as such and not\n# identified as the original program.\n'''Players autoconfig'''\n\nDEFAULT_SETTINGS = {\n 'AUTOCONFIG_INDEX_VIEW': 'aimmo/home',\n 'STATIC_URL': '/static/',\n}\n\nSETTINGS = {\n 'INSTALLED_APPS': [\n 'django.contrib.auth',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django_js_reverse',\n ],\n 'TEMPLATES': [\n {\n 'BACKEND': 'django.template.backends.django.DjangoTemplates',\n 'APP_DIRS': True,\n 'OPTIONS': {\n 'context_processors': [\n 'django.template.context_processors.debug',\n 'django.template.context_processors.request',\n 'django.contrib.auth.context_processors.auth',\n 'django.contrib.messages.context_processors.messages',\n ]\n }\n }\n ],\n 'USE_TZ': True,\n}\n", "path": "players/autoconfig.py"}, {"content": "from django.conf.urls import url\nfrom django.views.generic import TemplateView\nfrom django.contrib.auth.decorators import login_required\nfrom django.contrib.admin.views.decorators import staff_member_required\n\nfrom players import views\n\nurlpatterns = [\n url(r'^$', staff_member_required(TemplateView.as_view(template_name='players/home.html')), name='aimmo/home'),\n\n url(r'^program/$', staff_member_required(login_required(TemplateView.as_view(template_name='players/program.html'))), name='aimmo/program'),\n url(r'^watch/$', staff_member_required(views.WatchView.as_view()), name='aimmo/watch'),\n url(r'^statistics/$', staff_member_required(TemplateView.as_view(template_name='players/statistics.html')), name='aimmo/statistics'),\n\n url(r'^api/code/$', staff_member_required(views.code), name='aimmo/code'),\n url(r'^api/games/$', views.games, name='aimmo/games'),\n\n url(r'^jsreverse/$', 'django_js_reverse.views.urls_js', name='aimmo/js_reverse'), # TODO: Pull request to make django_js_reverse.urls\n]\n", "path": "players/urls.py"}]} | 1,298 | 502 |
gh_patches_debug_33169 | rasdani/github-patches | git_diff | streamlink__streamlink-2388 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Playtv uk
playtv.fr is working fine; is there any possibility to support http://uk.play.tv/? It looks like both the France and UK sites are owned by the same company. Since streamlink already supports playtv.fr, I hope support for UK Play TV can be added as well.
http://uk.play.tv/live-tv/363/pick-tv/
http://uk.play.tv/live-tv/752/itv4-1/
http://uk.play.tv/live-tv/1106/itv3-1/
http://uk.play.tv/live-tv/1105/itv-1/
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/playtv.py`
Content:
```
1 import re
2
3 from streamlink.plugin import Plugin
4 from streamlink.plugin.api import validate
5 from streamlink.stream import HDSStream, HLSStream
6
7
8 class PlayTV(Plugin):
9 FORMATS_URL = 'http://playtv.fr/player/initialize/{0}/'
10 API_URL = 'http://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'
11
12 _url_re = re.compile(r'http://(?:playtv\.fr/television|play\.tv/live-tv/\d+)/(?P<channel>[^/]+)/?')
13
14 _formats_schema = validate.Schema({
15 'streams': validate.any(
16 [],
17 {
18 validate.text: validate.Schema({
19 validate.text: {
20 'bitrates': validate.all([
21 validate.Schema({
22 'value': int
23 })
24 ])
25 }
26 })
27 }
28 )
29 })
30 _api_schema = validate.Schema({
31 'url': validate.url()
32 })
33
34 @classmethod
35 def can_handle_url(cls, url):
36 return PlayTV._url_re.match(url)
37
38 def _get_streams(self):
39 match = self._url_re.match(self.url)
40 channel = match.group('channel')
41
42 res = self.session.http.get(self.FORMATS_URL.format(channel))
43 streams = self.session.http.json(res, schema=self._formats_schema)['streams']
44 if streams == []:
45 self.logger.error('Channel may be geo-restricted, not directly provided by PlayTV or not freely available')
46 return
47
48 for language in streams:
49 for protocol, bitrates in list(streams[language].items()):
50 # - Ignore non-supported protocols (RTSP, DASH)
51 # - Ignore deprecated Flash (RTMPE/HDS) streams (PlayTV doesn't provide anymore a Flash player)
52 if protocol in ['rtsp', 'flash', 'dash', 'hds']:
53 continue
54
55 for bitrate in bitrates['bitrates']:
56 if bitrate['value'] == 0:
57 continue
58 api_url = self.API_URL.format(channel, protocol, language, bitrate['value'])
59 res = self.session.http.get(api_url)
60 video_url = self.session.http.json(res, schema=self._api_schema)['url']
61 bs = '{0}k'.format(bitrate['value'])
62
63 if protocol == 'hls':
64 for _, stream in HLSStream.parse_variant_playlist(self.session, video_url).items():
65 yield bs, stream
66 elif protocol == 'hds':
67 for _, stream in HDSStream.parse_manifest(self.session, video_url).items():
68 yield bs, stream
69
70
71 __plugin__ = PlayTV
72
```
--- END FILES ---
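Before proposing a patch, it helps to confirm that the plugin's URL regex is the blocker. The check below copies the pattern from the file above and shows that UK URLs are rejected, then relaxes it with an optional subdomain group (an illustration of the shape of the fix, not necessarily the exact shipped pattern):

```python
import re

uk_url = 'http://uk.play.tv/live-tv/363/pick-tv/'

old = re.compile(r'http://(?:playtv\.fr/television|play\.tv/live-tv/\d+)/(?P<channel>[^/]+)/?')
assert old.match(uk_url) is None  # 'uk.play.tv' matches neither alternative

new = re.compile(r'https?://(?:playtv\.fr/television|(?:\w+\.)?play\.tv/live-tv/\d+)/(?P<channel>[^/]+)/?')
assert new.match(uk_url).group('channel') == 'pick-tv'
```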
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/playtv.py b/src/streamlink/plugins/playtv.py
--- a/src/streamlink/plugins/playtv.py
+++ b/src/streamlink/plugins/playtv.py
@@ -1,3 +1,5 @@
+import base64
+import json
import re
from streamlink.plugin import Plugin
@@ -5,11 +7,17 @@
from streamlink.stream import HDSStream, HLSStream
+def jwt_decode(token):
+ info, payload, sig = token.split(".")
+ data = base64.urlsafe_b64decode(payload + '=' * (-len(payload) % 4))
+ return json.loads(data)
+
+
class PlayTV(Plugin):
- FORMATS_URL = 'http://playtv.fr/player/initialize/{0}/'
- API_URL = 'http://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'
+ FORMATS_URL = 'https://playtv.fr/player/initialize/{0}/'
+ API_URL = 'https://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'
- _url_re = re.compile(r'http://(?:playtv\.fr/television|play\.tv/live-tv/\d+)/(?P<channel>[^/]+)/?')
+ _url_re = re.compile(r'https?://(?:playtv\.fr/television|(:?\w+\.)?play\.tv/live-tv/\d+)/(?P<channel>[^/]+)/?')
_formats_schema = validate.Schema({
'streams': validate.any(
@@ -27,9 +35,13 @@
}
)
})
- _api_schema = validate.Schema({
- 'url': validate.url()
- })
+
+ _api_schema = validate.Schema(
+ validate.transform(lambda x: jwt_decode(x)),
+ {
+ 'url': validate.url()
+ }
+ )
@classmethod
def can_handle_url(cls, url):
@@ -57,7 +69,7 @@
continue
api_url = self.API_URL.format(channel, protocol, language, bitrate['value'])
res = self.session.http.get(api_url)
- video_url = self.session.http.json(res, schema=self._api_schema)['url']
+ video_url = self._api_schema.validate(res.text)['url']
bs = '{0}k'.format(bitrate['value'])
if protocol == 'hls':
| {"golden_diff": "diff --git a/src/streamlink/plugins/playtv.py b/src/streamlink/plugins/playtv.py\n--- a/src/streamlink/plugins/playtv.py\n+++ b/src/streamlink/plugins/playtv.py\n@@ -1,3 +1,5 @@\n+import base64\n+import json\n import re\n \n from streamlink.plugin import Plugin\n@@ -5,11 +7,17 @@\n from streamlink.stream import HDSStream, HLSStream\n \n \n+def jwt_decode(token):\n+ info, payload, sig = token.split(\".\")\n+ data = base64.urlsafe_b64decode(payload + '=' * (-len(payload) % 4))\n+ return json.loads(data)\n+\n+\n class PlayTV(Plugin):\n- FORMATS_URL = 'http://playtv.fr/player/initialize/{0}/'\n- API_URL = 'http://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'\n+ FORMATS_URL = 'https://playtv.fr/player/initialize/{0}/'\n+ API_URL = 'https://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'\n \n- _url_re = re.compile(r'http://(?:playtv\\.fr/television|play\\.tv/live-tv/\\d+)/(?P<channel>[^/]+)/?')\n+ _url_re = re.compile(r'https?://(?:playtv\\.fr/television|(:?\\w+\\.)?play\\.tv/live-tv/\\d+)/(?P<channel>[^/]+)/?')\n \n _formats_schema = validate.Schema({\n 'streams': validate.any(\n@@ -27,9 +35,13 @@\n }\n )\n })\n- _api_schema = validate.Schema({\n- 'url': validate.url()\n- })\n+\n+ _api_schema = validate.Schema(\n+ validate.transform(lambda x: jwt_decode(x)),\n+ {\n+ 'url': validate.url()\n+ }\n+ )\n \n @classmethod\n def can_handle_url(cls, url):\n@@ -57,7 +69,7 @@\n continue\n api_url = self.API_URL.format(channel, protocol, language, bitrate['value'])\n res = self.session.http.get(api_url)\n- video_url = self.session.http.json(res, schema=self._api_schema)['url']\n+ video_url = self._api_schema.validate(res.text)['url']\n bs = '{0}k'.format(bitrate['value'])\n \n if protocol == 'hls':\n", "issue": "Playtv uk\nplaytv.fr working fine is there any possibility to run http://uk.play.tv/ looks like both sites france and uk owned by same company. As streamlink support tvplay.fr. hope u get support for uk playtv.\r\nhttp://uk.play.tv/live-tv/363/pick-tv/\r\nhttp://uk.play.tv/live-tv/752/itv4-1/\r\nhttp://uk.play.tv/live-tv/1106/itv3-1/\r\nhttp://uk.play.tv/live-tv/1105/itv-1/\nPlaytv uk\nplaytv.fr working fine is there any possibility to run http://uk.play.tv/ looks like both sites france and uk owned by same company. As streamlink support tvplay.fr. 
hope u get support for uk playtv.\r\nhttp://uk.play.tv/live-tv/363/pick-tv/\r\nhttp://uk.play.tv/live-tv/752/itv4-1/\r\nhttp://uk.play.tv/live-tv/1106/itv3-1/\r\nhttp://uk.play.tv/live-tv/1105/itv-1/\n", "before_files": [{"content": "import re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HDSStream, HLSStream\n\n\nclass PlayTV(Plugin):\n FORMATS_URL = 'http://playtv.fr/player/initialize/{0}/'\n API_URL = 'http://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'\n\n _url_re = re.compile(r'http://(?:playtv\\.fr/television|play\\.tv/live-tv/\\d+)/(?P<channel>[^/]+)/?')\n\n _formats_schema = validate.Schema({\n 'streams': validate.any(\n [],\n {\n validate.text: validate.Schema({\n validate.text: {\n 'bitrates': validate.all([\n validate.Schema({\n 'value': int\n })\n ])\n }\n })\n }\n )\n })\n _api_schema = validate.Schema({\n 'url': validate.url()\n })\n\n @classmethod\n def can_handle_url(cls, url):\n return PlayTV._url_re.match(url)\n\n def _get_streams(self):\n match = self._url_re.match(self.url)\n channel = match.group('channel')\n\n res = self.session.http.get(self.FORMATS_URL.format(channel))\n streams = self.session.http.json(res, schema=self._formats_schema)['streams']\n if streams == []:\n self.logger.error('Channel may be geo-restricted, not directly provided by PlayTV or not freely available')\n return\n\n for language in streams:\n for protocol, bitrates in list(streams[language].items()):\n # - Ignore non-supported protocols (RTSP, DASH)\n # - Ignore deprecated Flash (RTMPE/HDS) streams (PlayTV doesn't provide anymore a Flash player)\n if protocol in ['rtsp', 'flash', 'dash', 'hds']:\n continue\n\n for bitrate in bitrates['bitrates']:\n if bitrate['value'] == 0:\n continue\n api_url = self.API_URL.format(channel, protocol, language, bitrate['value'])\n res = self.session.http.get(api_url)\n video_url = self.session.http.json(res, schema=self._api_schema)['url']\n bs = '{0}k'.format(bitrate['value'])\n\n if protocol == 'hls':\n for _, stream in HLSStream.parse_variant_playlist(self.session, video_url).items():\n yield bs, stream\n elif protocol == 'hds':\n for _, stream in HDSStream.parse_manifest(self.session, video_url).items():\n yield bs, stream\n\n\n__plugin__ = PlayTV\n", "path": "src/streamlink/plugins/playtv.py"}], "after_files": [{"content": "import base64\nimport json\nimport re\n\nfrom streamlink.plugin import Plugin\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream import HDSStream, HLSStream\n\n\ndef jwt_decode(token):\n info, payload, sig = token.split(\".\")\n data = base64.urlsafe_b64decode(payload + '=' * (-len(payload) % 4))\n return json.loads(data)\n\n\nclass PlayTV(Plugin):\n FORMATS_URL = 'https://playtv.fr/player/initialize/{0}/'\n API_URL = 'https://playtv.fr/player/play/{0}/?format={1}&language={2}&bitrate={3}'\n\n _url_re = re.compile(r'https?://(?:playtv\\.fr/television|(:?\\w+\\.)?play\\.tv/live-tv/\\d+)/(?P<channel>[^/]+)/?')\n\n _formats_schema = validate.Schema({\n 'streams': validate.any(\n [],\n {\n validate.text: validate.Schema({\n validate.text: {\n 'bitrates': validate.all([\n validate.Schema({\n 'value': int\n })\n ])\n }\n })\n }\n )\n })\n\n _api_schema = validate.Schema(\n validate.transform(lambda x: jwt_decode(x)),\n {\n 'url': validate.url()\n }\n )\n\n @classmethod\n def can_handle_url(cls, url):\n return PlayTV._url_re.match(url)\n\n def _get_streams(self):\n match = self._url_re.match(self.url)\n channel = match.group('channel')\n\n 
res = self.session.http.get(self.FORMATS_URL.format(channel))\n streams = self.session.http.json(res, schema=self._formats_schema)['streams']\n if streams == []:\n self.logger.error('Channel may be geo-restricted, not directly provided by PlayTV or not freely available')\n return\n\n for language in streams:\n for protocol, bitrates in list(streams[language].items()):\n # - Ignore non-supported protocols (RTSP, DASH)\n # - Ignore deprecated Flash (RTMPE/HDS) streams (PlayTV doesn't provide anymore a Flash player)\n if protocol in ['rtsp', 'flash', 'dash', 'hds']:\n continue\n\n for bitrate in bitrates['bitrates']:\n if bitrate['value'] == 0:\n continue\n api_url = self.API_URL.format(channel, protocol, language, bitrate['value'])\n res = self.session.http.get(api_url)\n video_url = self._api_schema.validate(res.text)['url']\n bs = '{0}k'.format(bitrate['value'])\n\n if protocol == 'hls':\n for _, stream in HLSStream.parse_variant_playlist(self.session, video_url).items():\n yield bs, stream\n elif protocol == 'hds':\n for _, stream in HDSStream.parse_manifest(self.session, video_url).items():\n yield bs, stream\n\n\n__plugin__ = PlayTV\n", "path": "src/streamlink/plugins/playtv.py"}]} | 1,212 | 546 |
gh_patches_debug_3132 | rasdani/github-patches | git_diff | pantsbuild__pants-6037 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
contrib go plugin not able to recognize meta tag if meta ends with />
The regex only recognizes `<meta xxxxxxxxxxxxx >` but not `<meta xxxxxxxxxx />`.
--- END ISSUE ---
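A minimal reproduction with a simplified pattern of the same shape as the reader's (not the exact regex from the file below) shows the miss and the one-character fix:

```python
import re

tag = '<meta name="go-import" content="example.org/x git https://example.org/x" />'

strict = re.compile(r"<meta\s+name=['\"]go-import['\"]\s+content=['\"][^'\"]+['\"]\s*>")
assert strict.search(tag) is None  # the trailing '/' blocks the match

relaxed = re.compile(r"<meta\s+name=['\"]go-import['\"]\s+content=['\"][^'\"]+['\"]\s*/?>")
assert relaxed.search(tag) is not None  # '/?' accepts both '>' and '/>'
```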
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py`
Content:
```
1 # coding=utf-8
2 # Copyright 2016 Pants project contributors (see CONTRIBUTORS.md).
3 # Licensed under the Apache License, Version 2.0 (see LICENSE).
4
5 from __future__ import (absolute_import, division, generators, nested_scopes, print_function,
6 unicode_literals, with_statement)
7
8 import re
9
10 import requests
11 from pants.subsystem.subsystem import Subsystem
12 from pants.util.memo import memoized_method
13
14 from pants.contrib.go.subsystems.imported_repo import ImportedRepo
15
16
17 class GoImportMetaTagReader(Subsystem):
18 """Implements a reader for the <meta name="go-import"> protocol.
19
20 See https://golang.org/cmd/go/#hdr-Remote_import_paths .
21 """
22 options_scope = 'go-import-metatag-reader'
23
24 @classmethod
25 def register_options(cls, register):
26 super(GoImportMetaTagReader, cls).register_options(register)
27 register('--retries', type=int, default=1, advanced=True,
28 help='How many times to retry when fetching meta tags.')
29
30 _META_IMPORT_REGEX = re.compile(r"""
31 <meta
32 \s+
33 name=['"]go-import['"]
34 \s+
35 content=['"](?P<root>[^\s]+)\s+(?P<vcs>[^\s]+)\s+(?P<url>[^\s]+)['"]
36 \s*
37 >""", flags=re.VERBOSE)
38
39 @classmethod
40 def find_meta_tags(cls, page_html):
41 """Returns the content of the meta tag if found inside of the provided HTML."""
42
43 return cls._META_IMPORT_REGEX.findall(page_html)
44
45 @memoized_method
46 def get_imported_repo(self, import_path):
47 """Looks for a go-import meta tag for the provided import_path.
48
49 Returns an ImportedRepo instance with the information in the meta tag,
50 or None if no go-import meta tag is found.
51 """
52 try:
53 session = requests.session()
54 # TODO: Support https with (optional) fallback to http, as Go does.
55 # See https://github.com/pantsbuild/pants/issues/3503.
56 session.mount("http://",
57 requests.adapters.HTTPAdapter(max_retries=self.get_options().retries))
58 page_data = session.get('http://{import_path}?go-get=1'.format(import_path=import_path))
59 except requests.ConnectionError:
60 return None
61
62 if not page_data:
63 return None
64
65 # Return the first match, rather than doing some kind of longest prefix search.
66 # Hopefully no one returns multiple valid go-import meta tags.
67 for (root, vcs, url) in self.find_meta_tags(page_data.text):
68 if root and vcs and url:
69 # Check to make sure returned root is an exact match to the provided import path. If it is
70 # not then run a recursive check on the returned and return the values provided by that call.
71 if root == import_path:
72 return ImportedRepo(root, vcs, url)
73 elif import_path.startswith(root):
74 return self.get_imported_repo(root)
75
76 return None
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py b/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py
--- a/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py
+++ b/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py
@@ -34,7 +34,7 @@
\s+
content=['"](?P<root>[^\s]+)\s+(?P<vcs>[^\s]+)\s+(?P<url>[^\s]+)['"]
\s*
- >""", flags=re.VERBOSE)
+ /?>""", flags=re.VERBOSE)
@classmethod
def find_meta_tags(cls, page_html):
| {"golden_diff": "diff --git a/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py b/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py\n--- a/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py\n+++ b/contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py\n@@ -34,7 +34,7 @@\n \\s+\n content=['\"](?P<root>[^\\s]+)\\s+(?P<vcs>[^\\s]+)\\s+(?P<url>[^\\s]+)['\"]\n \\s*\n- >\"\"\", flags=re.VERBOSE)\n+ /?>\"\"\", flags=re.VERBOSE)\n \n @classmethod\n def find_meta_tags(cls, page_html):\n", "issue": "contrib go plugin not able to recognize meta tag if meta ends with />\nThe regex only recognize `<meta xxxxxxxxxxxxx >` but not `<meta xxxxxxxxxx />`.\n", "before_files": [{"content": "# coding=utf-8\n# Copyright 2016 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import (absolute_import, division, generators, nested_scopes, print_function,\n unicode_literals, with_statement)\n\nimport re\n\nimport requests\nfrom pants.subsystem.subsystem import Subsystem\nfrom pants.util.memo import memoized_method\n\nfrom pants.contrib.go.subsystems.imported_repo import ImportedRepo\n\n\nclass GoImportMetaTagReader(Subsystem):\n \"\"\"Implements a reader for the <meta name=\"go-import\"> protocol.\n\n See https://golang.org/cmd/go/#hdr-Remote_import_paths .\n \"\"\"\n options_scope = 'go-import-metatag-reader'\n\n @classmethod\n def register_options(cls, register):\n super(GoImportMetaTagReader, cls).register_options(register)\n register('--retries', type=int, default=1, advanced=True,\n help='How many times to retry when fetching meta tags.')\n\n _META_IMPORT_REGEX = re.compile(r\"\"\"\n <meta\n \\s+\n name=['\"]go-import['\"]\n \\s+\n content=['\"](?P<root>[^\\s]+)\\s+(?P<vcs>[^\\s]+)\\s+(?P<url>[^\\s]+)['\"]\n \\s*\n >\"\"\", flags=re.VERBOSE)\n\n @classmethod\n def find_meta_tags(cls, page_html):\n \"\"\"Returns the content of the meta tag if found inside of the provided HTML.\"\"\"\n\n return cls._META_IMPORT_REGEX.findall(page_html)\n\n @memoized_method\n def get_imported_repo(self, import_path):\n \"\"\"Looks for a go-import meta tag for the provided import_path.\n\n Returns an ImportedRepo instance with the information in the meta tag,\n or None if no go-import meta tag is found.\n \"\"\"\n try:\n session = requests.session()\n # TODO: Support https with (optional) fallback to http, as Go does.\n # See https://github.com/pantsbuild/pants/issues/3503.\n session.mount(\"http://\",\n requests.adapters.HTTPAdapter(max_retries=self.get_options().retries))\n page_data = session.get('http://{import_path}?go-get=1'.format(import_path=import_path))\n except requests.ConnectionError:\n return None\n\n if not page_data:\n return None\n\n # Return the first match, rather than doing some kind of longest prefix search.\n # Hopefully no one returns multiple valid go-import meta tags.\n for (root, vcs, url) in self.find_meta_tags(page_data.text):\n if root and vcs and url:\n # Check to make sure returned root is an exact match to the provided import path. 
If it is\n # not then run a recursive check on the returned and return the values provided by that call.\n if root == import_path:\n return ImportedRepo(root, vcs, url)\n elif import_path.startswith(root):\n return self.get_imported_repo(root)\n\n return None\n", "path": "contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py"}], "after_files": [{"content": "# coding=utf-8\n# Copyright 2016 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\nfrom __future__ import (absolute_import, division, generators, nested_scopes, print_function,\n unicode_literals, with_statement)\n\nimport re\n\nimport requests\nfrom pants.subsystem.subsystem import Subsystem\nfrom pants.util.memo import memoized_method\n\nfrom pants.contrib.go.subsystems.imported_repo import ImportedRepo\n\n\nclass GoImportMetaTagReader(Subsystem):\n \"\"\"Implements a reader for the <meta name=\"go-import\"> protocol.\n\n See https://golang.org/cmd/go/#hdr-Remote_import_paths .\n \"\"\"\n options_scope = 'go-import-metatag-reader'\n\n @classmethod\n def register_options(cls, register):\n super(GoImportMetaTagReader, cls).register_options(register)\n register('--retries', type=int, default=1, advanced=True,\n help='How many times to retry when fetching meta tags.')\n\n _META_IMPORT_REGEX = re.compile(r\"\"\"\n <meta\n \\s+\n name=['\"]go-import['\"]\n \\s+\n content=['\"](?P<root>[^\\s]+)\\s+(?P<vcs>[^\\s]+)\\s+(?P<url>[^\\s]+)['\"]\n \\s*\n /?>\"\"\", flags=re.VERBOSE)\n\n @classmethod\n def find_meta_tags(cls, page_html):\n \"\"\"Returns the content of the meta tag if found inside of the provided HTML.\"\"\"\n\n return cls._META_IMPORT_REGEX.findall(page_html)\n\n @memoized_method\n def get_imported_repo(self, import_path):\n \"\"\"Looks for a go-import meta tag for the provided import_path.\n\n Returns an ImportedRepo instance with the information in the meta tag,\n or None if no go-import meta tag is found.\n \"\"\"\n try:\n session = requests.session()\n # TODO: Support https with (optional) fallback to http, as Go does.\n # See https://github.com/pantsbuild/pants/issues/3503.\n session.mount(\"http://\",\n requests.adapters.HTTPAdapter(max_retries=self.get_options().retries))\n page_data = session.get('http://{import_path}?go-get=1'.format(import_path=import_path))\n except requests.ConnectionError:\n return None\n\n if not page_data:\n return None\n\n # Return the first match, rather than doing some kind of longest prefix search.\n # Hopefully no one returns multiple valid go-import meta tags.\n for (root, vcs, url) in self.find_meta_tags(page_data.text):\n if root and vcs and url:\n # Check to make sure returned root is an exact match to the provided import path. If it is\n # not then run a recursive check on the returned and return the values provided by that call.\n if root == import_path:\n return ImportedRepo(root, vcs, url)\n elif import_path.startswith(root):\n return self.get_imported_repo(root)\n\n return None\n", "path": "contrib/go/src/python/pants/contrib/go/subsystems/go_import_meta_tag_reader.py"}]} | 1,111 | 176 |
gh_patches_debug_18605 | rasdani/github-patches | git_diff | opensearch-project__opensearch-build-287 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add support for running bwctest.sh
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `bundle-workflow/src/paths/tree_walker.py`
Content:
```
1 # SPDX-License-Identifier: Apache-2.0
2 #
3 # The OpenSearch Contributors require contributions made to
4 # this file be licensed under the Apache-2.0 license or a
5 # compatible open source license.
6
7 import os
8
9
10 def walk(root):
11 print(f'Walking tree from {root}')
12 for dir, dirs, files in os.walk(root):
13 for file_name in files:
14 absolute_path = os.path.join(dir, file_name)
15 relative_path = os.path.relpath(absolute_path, root)
16 yield (os.path.realpath(absolute_path), relative_path)
17
```
Path: `bundle-workflow/src/system/execute.py`
Content:
```
1 # SPDX-License-Identifier: Apache-2.0
2 #
3 # The OpenSearch Contributors require contributions made to
4 # this file be licensed under the Apache-2.0 license or a
5 # compatible open source license.
6
7 import subprocess
8
9
10 def execute(command, dir, capture=True, raise_on_failure=True):
11 """
12 Execute a shell command inside a directory.
13 :param command: The shell command to execute.
14 :param dir: The full path to the directory that the command should be executed in.
15 :returns a tuple containing the exit code, stdout, and stderr.
16 """
17 print(f'Executing "{command}" in {dir}')
18 result = subprocess.run(command, cwd=dir, shell=True, capture_output=capture, text=True)
19 if raise_on_failure:
20 result.check_returncode()
21 return (result.returncode, result.stdout, result.stderr)
22
```
--- END FILES ---
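The `execute` helper above is general enough to drive any component test script. A hypothetical invocation for `bwctest.sh` (illustrative only — the wiring that would call this is not part of this issue's files, and the import path assumes `src` is on `sys.path`):

```python
from system.execute import execute

# Run a component's backwards-compatibility tests without raising on failure,
# so the caller can report the captured output itself.
returncode, stdout, stderr = execute(
    "./bwctest.sh", "/path/to/component", raise_on_failure=False)
if returncode != 0:
    print("bwctest failed:\n" + stderr)
```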
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/bundle-workflow/src/paths/tree_walker.py b/bundle-workflow/src/paths/tree_walker.py
--- a/bundle-workflow/src/paths/tree_walker.py
+++ b/bundle-workflow/src/paths/tree_walker.py
@@ -8,7 +8,7 @@
def walk(root):
- print(f'Walking tree from {root}')
+ print(f"Walking tree from {root}")
for dir, dirs, files in os.walk(root):
for file_name in files:
absolute_path = os.path.join(dir, file_name)
diff --git a/bundle-workflow/src/system/execute.py b/bundle-workflow/src/system/execute.py
--- a/bundle-workflow/src/system/execute.py
+++ b/bundle-workflow/src/system/execute.py
@@ -15,7 +15,9 @@
:returns a tuple containing the exit code, stdout, and stderr.
"""
print(f'Executing "{command}" in {dir}')
- result = subprocess.run(command, cwd=dir, shell=True, capture_output=capture, text=True)
+ result = subprocess.run(
+ command, cwd=dir, shell=True, capture_output=capture, text=True
+ )
if raise_on_failure:
result.check_returncode()
return (result.returncode, result.stdout, result.stderr)
| {"golden_diff": "diff --git a/bundle-workflow/src/paths/tree_walker.py b/bundle-workflow/src/paths/tree_walker.py\n--- a/bundle-workflow/src/paths/tree_walker.py\n+++ b/bundle-workflow/src/paths/tree_walker.py\n@@ -8,7 +8,7 @@\n \n \n def walk(root):\n- print(f'Walking tree from {root}')\n+ print(f\"Walking tree from {root}\")\n for dir, dirs, files in os.walk(root):\n for file_name in files:\n absolute_path = os.path.join(dir, file_name)\ndiff --git a/bundle-workflow/src/system/execute.py b/bundle-workflow/src/system/execute.py\n--- a/bundle-workflow/src/system/execute.py\n+++ b/bundle-workflow/src/system/execute.py\n@@ -15,7 +15,9 @@\n :returns a tuple containing the exit code, stdout, and stderr.\n \"\"\"\n print(f'Executing \"{command}\" in {dir}')\n- result = subprocess.run(command, cwd=dir, shell=True, capture_output=capture, text=True)\n+ result = subprocess.run(\n+ command, cwd=dir, shell=True, capture_output=capture, text=True\n+ )\n if raise_on_failure:\n result.check_returncode()\n return (result.returncode, result.stdout, result.stderr)\n", "issue": "Add support for running bwctest.sh\n\n", "before_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n#\n# The OpenSearch Contributors require contributions made to\n# this file be licensed under the Apache-2.0 license or a\n# compatible open source license.\n\nimport os\n\n\ndef walk(root):\n print(f'Walking tree from {root}')\n for dir, dirs, files in os.walk(root):\n for file_name in files:\n absolute_path = os.path.join(dir, file_name)\n relative_path = os.path.relpath(absolute_path, root)\n yield (os.path.realpath(absolute_path), relative_path)\n", "path": "bundle-workflow/src/paths/tree_walker.py"}, {"content": "# SPDX-License-Identifier: Apache-2.0\n#\n# The OpenSearch Contributors require contributions made to\n# this file be licensed under the Apache-2.0 license or a\n# compatible open source license.\n\nimport subprocess\n\n\ndef execute(command, dir, capture=True, raise_on_failure=True):\n \"\"\"\n Execute a shell command inside a directory.\n :param command: The shell command to execute.\n :param dir: The full path to the directory that the command should be executed in.\n :returns a tuple containing the exit code, stdout, and stderr.\n \"\"\"\n print(f'Executing \"{command}\" in {dir}')\n result = subprocess.run(command, cwd=dir, shell=True, capture_output=capture, text=True)\n if raise_on_failure:\n result.check_returncode()\n return (result.returncode, result.stdout, result.stderr)\n", "path": "bundle-workflow/src/system/execute.py"}], "after_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n#\n# The OpenSearch Contributors require contributions made to\n# this file be licensed under the Apache-2.0 license or a\n# compatible open source license.\n\nimport os\n\n\ndef walk(root):\n print(f\"Walking tree from {root}\")\n for dir, dirs, files in os.walk(root):\n for file_name in files:\n absolute_path = os.path.join(dir, file_name)\n relative_path = os.path.relpath(absolute_path, root)\n yield (os.path.realpath(absolute_path), relative_path)\n", "path": "bundle-workflow/src/paths/tree_walker.py"}, {"content": "# SPDX-License-Identifier: Apache-2.0\n#\n# The OpenSearch Contributors require contributions made to\n# this file be licensed under the Apache-2.0 license or a\n# compatible open source license.\n\nimport subprocess\n\n\ndef execute(command, dir, capture=True, raise_on_failure=True):\n \"\"\"\n Execute a shell command inside a directory.\n :param command: The shell command to execute.\n 
:param dir: The full path to the directory that the command should be executed in.\n :returns a tuple containing the exit code, stdout, and stderr.\n \"\"\"\n print(f'Executing \"{command}\" in {dir}')\n result = subprocess.run(\n command, cwd=dir, shell=True, capture_output=capture, text=True\n )\n if raise_on_failure:\n result.check_returncode()\n return (result.returncode, result.stdout, result.stderr)\n", "path": "bundle-workflow/src/system/execute.py"}]} | 654 | 290 |
gh_patches_debug_31112 | rasdani/github-patches | git_diff | quantumlib__Cirq-1646 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Complex numbers can be approximately equal to integers
but `approx_eq` incorrectly disagrees.
--- END ISSUE ---
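A minimal reproduction sketch of the reported behavior, assuming the protocol is exposed as `cirq.approx_eq` (the comparison logic itself is in the file below):

```python
import cirq

# 1 and 1+0j are numerically identical, but the type gates in approx_eq
# reject the mixed int/complex pair before any tolerance check runs,
# so both calls return False today.
print(cirq.approx_eq(1, 1 + 0j, atol=1e-8))  # False, arguably should be True
print(cirq.approx_eq(1 + 0j, 1, atol=1e-8))  # False, arguably should be True
```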
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cirq/protocols/approximate_equality.py`
Content:
```
1 # Copyright 2018 The Cirq Developers
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # https://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from typing import Any, Union
16
17 import numpy as np
18
19 from typing_extensions import Protocol
20
21
22 class SupportsApproximateEquality(Protocol):
23 """Object which can be compared approximately."""
24
25 def _approx_eq_(
26 self,
27 other: Any,
28 *,
29 atol: Union[int, float]
30 ) -> bool:
31 """Approximate comparator.
32
33 Types implementing this protocol define their own logic for approximate
34 comparison with other types.
35
36 Args:
37 other: Target object for approximate comparison.
38 atol: The minimum absolute tolerance. See np.isclose() documentation
39 for details.
40
41 Returns:
42 True if objects are approximately equal, False otherwise. Returns
43 NotImplemented when approximate equality is not implemented for
44 given types.
45 """
46
47
48 def approx_eq(val: Any, other: Any, *, atol: Union[int, float] = 1e-8) -> bool:
49 """Approximately compares two objects.
50
51 If `val` implements SupportsApproxEquality protocol then it is invoked and
52 takes precedence over all other checks:
53 - For primitive numeric types `int` and `float` approximate equality is
54 delegated to math.isclose().
55 - For complex primitive type the real and imaginary parts are treated
56 independently and compared using math.isclose().
57 - For `val` and `other` both iterable of the same length, consecutive
58 elements are compared recursively. Types of `val` and `other` does not
59 necessarily needs to match each other. They just need to be iterable and
60 have the same structure.
61
62 Args:
63 val: Source object for approximate comparison.
64 other: Target object for approximate comparison.
65 atol: The minimum absolute tolerance. See np.isclose() documentation for
66 details. Defaults to 1e-8 which matches np.isclose() default
67 absolute tolerance.
68
69 Returns:
70 True if objects are approximately equal, False otherwise.
71 """
72
73 # Check if val defines approximate equality via _approx_eq_. This takes
74 # precedence over all other overloads.
75 approx_eq_getter = getattr(val, '_approx_eq_', None)
76 if approx_eq_getter is not None:
77 result = approx_eq_getter(other, atol)
78 if result is not NotImplemented:
79 return result
80
81 # The same for other to make approx_eq symmetric.
82 other_approx_eq_getter = getattr(other, '_approx_eq_', None)
83 if other_approx_eq_getter is not None:
84 result = other_approx_eq_getter(val, atol)
85 if result is not NotImplemented:
86 return result
87
88 # Compare primitive types directly.
89 if isinstance(val, (int, float)):
90 if not isinstance(other, (int, float)):
91 return False
92 return _isclose(val, other, atol=atol)
93
94 if isinstance(val, complex):
95 if not isinstance(other, complex):
96 return False
97 return _isclose(val, other, atol=atol)
98
99 # Try to compare source and target recursively, assuming they're iterable.
100 result = _approx_eq_iterables(val, other, atol=atol)
101
102 # Fallback to __eq__() when anything else fails.
103 if result is NotImplemented:
104 return val == other
105 return result
106
107
108 def _approx_eq_iterables(val: Any, other: Any, *,
109 atol: Union[int, float]) -> bool:
110 """Iterates over arguments and calls approx_eq recursively.
111
112 Types of `val` and `other` does not necessarily needs to match each other.
113 They just need to be iterable of the same length and have the same
114 structure, approx_eq() will be called on each consecutive element of `val`
115 and `other`.
116
117 Args:
118 val: Source for approximate comparison.
119 other: Target for approximate comparison.
120 atol: The minimum absolute tolerance. See np.isclose() documentation for
121 details.
122
123 Returns:
124 True if objects are approximately equal, False otherwise. Returns
125 NotImplemented when approximate equality is not implemented for given
126 types.
127 """
128
129 def get_iter(iterable):
130 try:
131 return iter(iterable)
132 except TypeError:
133 return None
134
135 val_it = get_iter(val)
136 other_it = get_iter(other)
137
138 if val_it is not None and other_it is not None:
139 while True:
140 try:
141 val_next = next(val_it)
142 except StopIteration:
143 try:
144 next(other_it)
145 return False
146 except StopIteration:
147 return True
148
149 try:
150 other_next = next(other_it)
151 except StopIteration:
152 return False
153
154 result = approx_eq(val_next, other_next, atol=atol)
155 if result is not True:
156 return result
157
158 return NotImplemented
159
160
161 def _isclose(a: Any, b: Any, *, atol: Union[int, float]) -> bool:
162 """Convenience wrapper around np.isclose."""
163 return True if np.isclose([a], [b], atol=atol, rtol=0.0)[0] else False
164
```
--- END FILES ---
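Note that `np.isclose` itself already handles mixed numeric inputs, which narrows the problem to the `isinstance` gates in `approx_eq` rather than `_isclose`; a quick standalone check (pure NumPy, independent of Cirq):

```python
import numpy as np

# np.isclose compares complex and real values without complaint...
print(np.isclose([1 + 0j], [1], atol=1e-8, rtol=0.0)[0])  # True
# ...so the early `isinstance(other, (int, float))` / `isinstance(other, complex)`
# checks in approx_eq are what reject int-vs-complex pairs before _isclose runs.
```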
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cirq/protocols/approximate_equality.py b/cirq/protocols/approximate_equality.py
--- a/cirq/protocols/approximate_equality.py
+++ b/cirq/protocols/approximate_equality.py
@@ -13,7 +13,10 @@
# limitations under the License.
from typing import Any, Union
+from fractions import Fraction
+from decimal import Decimal
+import numbers
import numpy as np
from typing_extensions import Protocol
@@ -86,15 +89,12 @@
return result
# Compare primitive types directly.
- if isinstance(val, (int, float)):
- if not isinstance(other, (int, float)):
+ if isinstance(val, numbers.Number):
+ if not isinstance(other, numbers.Number):
return False
- return _isclose(val, other, atol=atol)
-
- if isinstance(val, complex):
- if not isinstance(other, complex):
- return False
- return _isclose(val, other, atol=atol)
+ result = _isclose(val, other, atol=atol)
+ if result is not NotImplemented:
+ return result
# Try to compare source and target recursively, assuming they're iterable.
result = _approx_eq_iterables(val, other, atol=atol)
@@ -160,4 +160,19 @@
def _isclose(a: Any, b: Any, *, atol: Union[int, float]) -> bool:
"""Convenience wrapper around np.isclose."""
- return True if np.isclose([a], [b], atol=atol, rtol=0.0)[0] else False
+
+ # support casting some standard numeric types
+ x1 = np.asarray([a])
+ if isinstance(a, (Fraction, Decimal)):
+ x1 = x1.astype(np.float64)
+ x2 = np.asarray([b])
+ if isinstance(b, (Fraction, Decimal)):
+ x2 = x2.astype(np.float64)
+
+ # workaround np.isfinite type limitations. Cast to bool to avoid np.bool_
+ try:
+ result = bool(np.isclose(x1, x2, atol=atol, rtol=0.0)[0])
+ except TypeError:
+ return NotImplemented
+
+ return result
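With the handlers rewritten along these lines, any pair of `numbers.Number` instances funnels through `_isclose`; a hedged sketch of the resulting behavior:

```python
from decimal import Decimal
from fractions import Fraction

import cirq

# int vs. complex now reaches _isclose instead of the old early return.
assert cirq.approx_eq(1, 1 + 0j, atol=1e-8)
# Fraction and Decimal are cast to float64 inside _isclose before np.isclose runs.
assert cirq.approx_eq(Fraction(1, 3), 1 / 3, atol=1e-6)
assert cirq.approx_eq(Decimal("0.5"), 0.5, atol=1e-8)
```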
| {"golden_diff": "diff --git a/cirq/protocols/approximate_equality.py b/cirq/protocols/approximate_equality.py\n--- a/cirq/protocols/approximate_equality.py\n+++ b/cirq/protocols/approximate_equality.py\n@@ -13,7 +13,10 @@\n # limitations under the License.\n \n from typing import Any, Union\n+from fractions import Fraction\n+from decimal import Decimal\n \n+import numbers\n import numpy as np\n \n from typing_extensions import Protocol\n@@ -86,15 +89,12 @@\n return result\n \n # Compare primitive types directly.\n- if isinstance(val, (int, float)):\n- if not isinstance(other, (int, float)):\n+ if isinstance(val, numbers.Number):\n+ if not isinstance(other, numbers.Number):\n return False\n- return _isclose(val, other, atol=atol)\n-\n- if isinstance(val, complex):\n- if not isinstance(other, complex):\n- return False\n- return _isclose(val, other, atol=atol)\n+ result = _isclose(val, other, atol=atol)\n+ if result is not NotImplemented:\n+ return result\n \n # Try to compare source and target recursively, assuming they're iterable.\n result = _approx_eq_iterables(val, other, atol=atol)\n@@ -160,4 +160,19 @@\n \n def _isclose(a: Any, b: Any, *, atol: Union[int, float]) -> bool:\n \"\"\"Convenience wrapper around np.isclose.\"\"\"\n- return True if np.isclose([a], [b], atol=atol, rtol=0.0)[0] else False\n+\n+ # support casting some standard numeric types\n+ x1 = np.asarray([a])\n+ if isinstance(a, (Fraction, Decimal)):\n+ x1 = x1.astype(np.float64)\n+ x2 = np.asarray([b])\n+ if isinstance(b, (Fraction, Decimal)):\n+ x2 = x2.astype(np.float64)\n+\n+ # workaround np.isfinite type limitations. Cast to bool to avoid np.bool_\n+ try:\n+ result = bool(np.isclose(x1, x2, atol=atol, rtol=0.0)[0])\n+ except TypeError:\n+ return NotImplemented\n+\n+ return result\n", "issue": "Complex numbers can be approximately equal to integers\nbut approx_eq incorrectly disagrees\n", "before_files": [{"content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Any, Union\n\nimport numpy as np\n\nfrom typing_extensions import Protocol\n\n\nclass SupportsApproximateEquality(Protocol):\n \"\"\"Object which can be compared approximately.\"\"\"\n\n def _approx_eq_(\n self,\n other: Any,\n *,\n atol: Union[int, float]\n ) -> bool:\n \"\"\"Approximate comparator.\n\n Types implementing this protocol define their own logic for approximate\n comparison with other types.\n\n Args:\n other: Target object for approximate comparison.\n atol: The minimum absolute tolerance. See np.isclose() documentation\n for details.\n\n Returns:\n True if objects are approximately equal, False otherwise. 
Returns\n NotImplemented when approximate equality is not implemented for\n given types.\n \"\"\"\n\n\ndef approx_eq(val: Any, other: Any, *, atol: Union[int, float] = 1e-8) -> bool:\n \"\"\"Approximately compares two objects.\n\n If `val` implements SupportsApproxEquality protocol then it is invoked and\n takes precedence over all other checks:\n - For primitive numeric types `int` and `float` approximate equality is\n delegated to math.isclose().\n - For complex primitive type the real and imaginary parts are treated\n independently and compared using math.isclose().\n - For `val` and `other` both iterable of the same length, consecutive\n elements are compared recursively. Types of `val` and `other` does not\n necessarily needs to match each other. They just need to be iterable and\n have the same structure.\n\n Args:\n val: Source object for approximate comparison.\n other: Target object for approximate comparison.\n atol: The minimum absolute tolerance. See np.isclose() documentation for\n details. Defaults to 1e-8 which matches np.isclose() default\n absolute tolerance.\n\n Returns:\n True if objects are approximately equal, False otherwise.\n \"\"\"\n\n # Check if val defines approximate equality via _approx_eq_. This takes\n # precedence over all other overloads.\n approx_eq_getter = getattr(val, '_approx_eq_', None)\n if approx_eq_getter is not None:\n result = approx_eq_getter(other, atol)\n if result is not NotImplemented:\n return result\n\n # The same for other to make approx_eq symmetric.\n other_approx_eq_getter = getattr(other, '_approx_eq_', None)\n if other_approx_eq_getter is not None:\n result = other_approx_eq_getter(val, atol)\n if result is not NotImplemented:\n return result\n\n # Compare primitive types directly.\n if isinstance(val, (int, float)):\n if not isinstance(other, (int, float)):\n return False\n return _isclose(val, other, atol=atol)\n\n if isinstance(val, complex):\n if not isinstance(other, complex):\n return False\n return _isclose(val, other, atol=atol)\n\n # Try to compare source and target recursively, assuming they're iterable.\n result = _approx_eq_iterables(val, other, atol=atol)\n\n # Fallback to __eq__() when anything else fails.\n if result is NotImplemented:\n return val == other\n return result\n\n\ndef _approx_eq_iterables(val: Any, other: Any, *,\n atol: Union[int, float]) -> bool:\n \"\"\"Iterates over arguments and calls approx_eq recursively.\n\n Types of `val` and `other` does not necessarily needs to match each other.\n They just need to be iterable of the same length and have the same\n structure, approx_eq() will be called on each consecutive element of `val`\n and `other`.\n\n Args:\n val: Source for approximate comparison.\n other: Target for approximate comparison.\n atol: The minimum absolute tolerance. See np.isclose() documentation for\n details.\n\n Returns:\n True if objects are approximately equal, False otherwise. 
Returns\n NotImplemented when approximate equality is not implemented for given\n types.\n \"\"\"\n\n def get_iter(iterable):\n try:\n return iter(iterable)\n except TypeError:\n return None\n\n val_it = get_iter(val)\n other_it = get_iter(other)\n\n if val_it is not None and other_it is not None:\n while True:\n try:\n val_next = next(val_it)\n except StopIteration:\n try:\n next(other_it)\n return False\n except StopIteration:\n return True\n\n try:\n other_next = next(other_it)\n except StopIteration:\n return False\n\n result = approx_eq(val_next, other_next, atol=atol)\n if result is not True:\n return result\n\n return NotImplemented\n\n\ndef _isclose(a: Any, b: Any, *, atol: Union[int, float]) -> bool:\n \"\"\"Convenience wrapper around np.isclose.\"\"\"\n return True if np.isclose([a], [b], atol=atol, rtol=0.0)[0] else False\n", "path": "cirq/protocols/approximate_equality.py"}], "after_files": [{"content": "# Copyright 2018 The Cirq Developers\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom typing import Any, Union\nfrom fractions import Fraction\nfrom decimal import Decimal\n\nimport numbers\nimport numpy as np\n\nfrom typing_extensions import Protocol\n\n\nclass SupportsApproximateEquality(Protocol):\n \"\"\"Object which can be compared approximately.\"\"\"\n\n def _approx_eq_(\n self,\n other: Any,\n *,\n atol: Union[int, float]\n ) -> bool:\n \"\"\"Approximate comparator.\n\n Types implementing this protocol define their own logic for approximate\n comparison with other types.\n\n Args:\n other: Target object for approximate comparison.\n atol: The minimum absolute tolerance. See np.isclose() documentation\n for details.\n\n Returns:\n True if objects are approximately equal, False otherwise. Returns\n NotImplemented when approximate equality is not implemented for\n given types.\n \"\"\"\n\n\ndef approx_eq(val: Any, other: Any, *, atol: Union[int, float] = 1e-8) -> bool:\n \"\"\"Approximately compares two objects.\n\n If `val` implements SupportsApproxEquality protocol then it is invoked and\n takes precedence over all other checks:\n - For primitive numeric types `int` and `float` approximate equality is\n delegated to math.isclose().\n - For complex primitive type the real and imaginary parts are treated\n independently and compared using math.isclose().\n - For `val` and `other` both iterable of the same length, consecutive\n elements are compared recursively. Types of `val` and `other` does not\n necessarily needs to match each other. They just need to be iterable and\n have the same structure.\n\n Args:\n val: Source object for approximate comparison.\n other: Target object for approximate comparison.\n atol: The minimum absolute tolerance. See np.isclose() documentation for\n details. Defaults to 1e-8 which matches np.isclose() default\n absolute tolerance.\n\n Returns:\n True if objects are approximately equal, False otherwise.\n \"\"\"\n\n # Check if val defines approximate equality via _approx_eq_. 
This takes\n # precedence over all other overloads.\n approx_eq_getter = getattr(val, '_approx_eq_', None)\n if approx_eq_getter is not None:\n result = approx_eq_getter(other, atol)\n if result is not NotImplemented:\n return result\n\n # The same for other to make approx_eq symmetric.\n other_approx_eq_getter = getattr(other, '_approx_eq_', None)\n if other_approx_eq_getter is not None:\n result = other_approx_eq_getter(val, atol)\n if result is not NotImplemented:\n return result\n\n # Compare primitive types directly.\n if isinstance(val, numbers.Number):\n if not isinstance(other, numbers.Number):\n return False\n result = _isclose(val, other, atol=atol)\n if result is not NotImplemented:\n return result\n\n # Try to compare source and target recursively, assuming they're iterable.\n result = _approx_eq_iterables(val, other, atol=atol)\n\n # Fallback to __eq__() when anything else fails.\n if result is NotImplemented:\n return val == other\n return result\n\n\ndef _approx_eq_iterables(val: Any, other: Any, *,\n atol: Union[int, float]) -> bool:\n \"\"\"Iterates over arguments and calls approx_eq recursively.\n\n Types of `val` and `other` does not necessarily needs to match each other.\n They just need to be iterable of the same length and have the same\n structure, approx_eq() will be called on each consecutive element of `val`\n and `other`.\n\n Args:\n val: Source for approximate comparison.\n other: Target for approximate comparison.\n atol: The minimum absolute tolerance. See np.isclose() documentation for\n details.\n\n Returns:\n True if objects are approximately equal, False otherwise. Returns\n NotImplemented when approximate equality is not implemented for given\n types.\n \"\"\"\n\n def get_iter(iterable):\n try:\n return iter(iterable)\n except TypeError:\n return None\n\n val_it = get_iter(val)\n other_it = get_iter(other)\n\n if val_it is not None and other_it is not None:\n while True:\n try:\n val_next = next(val_it)\n except StopIteration:\n try:\n next(other_it)\n return False\n except StopIteration:\n return True\n\n try:\n other_next = next(other_it)\n except StopIteration:\n return False\n\n result = approx_eq(val_next, other_next, atol=atol)\n if result is not True:\n return result\n\n return NotImplemented\n\n\ndef _isclose(a: Any, b: Any, *, atol: Union[int, float]) -> bool:\n \"\"\"Convenience wrapper around np.isclose.\"\"\"\n\n # support casting some standard numeric types\n x1 = np.asarray([a])\n if isinstance(a, (Fraction, Decimal)):\n x1 = x1.astype(np.float64)\n x2 = np.asarray([b])\n if isinstance(b, (Fraction, Decimal)):\n x2 = x2.astype(np.float64)\n\n # workaround np.isfinite type limitations. Cast to bool to avoid np.bool_\n try:\n result = bool(np.isclose(x1, x2, atol=atol, rtol=0.0)[0])\n except TypeError:\n return NotImplemented\n\n return result\n", "path": "cirq/protocols/approximate_equality.py"}]} | 1,870 | 517 |
gh_patches_debug_28976 | rasdani/github-patches | git_diff | huggingface__dataset-viewer-2415 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Replace `DatasetModuleNotInstalledError` errors with `DatasetWithScriptNotSupportedError`
We should never have a `DatasetModuleNotInstalledError` error, because we should return a `DatasetWithScriptNotSupportedError` error before reaching that point.
See https://github.com/huggingface/datasets-server/issues/1067#issuecomment-1924305954
--- END ISSUE ---
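One way to read the requested precedence, sketched against the exception types used in the file below (a sketch of intent, not the final patch):

```python
from datasets import get_dataset_config_names
from libcommon.exceptions import (
    ConfigNamesError,
    DatasetModuleNotInstalledError,
    DatasetWithScriptNotSupportedError,
)

def config_names_with_precedence(dataset: str) -> list[str]:
    # Key point from the issue: the trust_remote_code check must win before
    # an ImportError can ever surface as DatasetModuleNotInstalledError.
    try:
        return get_dataset_config_names(path=dataset, trust_remote_code=False)
    except ValueError as err:
        if "trust_remote_code" in str(err):
            raise DatasetWithScriptNotSupportedError("dataset script not supported") from err
        raise ConfigNamesError("cannot get config names", cause=err) from err
    except ImportError as err:
        # only reachable for datasets explicitly allow-listed to run scripts
        raise DatasetModuleNotInstalledError("missing dataset module", cause=err) from err
```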
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `services/worker/src/worker/job_runners/dataset/config_names.py`
Content:
```
1 # SPDX-License-Identifier: Apache-2.0
2 # Copyright 2022 The HuggingFace Authors.
3
4 import logging
5 from typing import Optional
6
7 from datasets import get_dataset_config_names
8 from datasets.data_files import EmptyDatasetError as _EmptyDatasetError
9 from libcommon.exceptions import (
10 ConfigNamesError,
11 DatasetModuleNotInstalledError,
12 DatasetWithScriptNotSupportedError,
13 DatasetWithTooManyConfigsError,
14 EmptyDatasetError,
15 )
16
17 from worker.dtos import CompleteJobResult, ConfigNameItem, DatasetConfigNamesResponse
18 from worker.job_runners.dataset.dataset_job_runner import (
19 DatasetJobRunnerWithDatasetsCache,
20 )
21 from worker.utils import resolve_trust_remote_code
22
23
24 def compute_config_names_response(
25 dataset: str,
26 max_number: int,
27 dataset_scripts_allow_list: list[str],
28 hf_token: Optional[str] = None,
29 ) -> DatasetConfigNamesResponse:
30 """
31 Get the response of 'dataset-config-names' for one specific dataset on huggingface.co.
32 Dataset can be gated if you pass an acceptable token.
33 It is assumed that the dataset exists and can be accessed using the token.
34
35 Args:
36 dataset (`str`):
37 A namespace (user or an organization) and a repo name separated by a `/`.
38 max_number (`int`):
39 The maximum number of configs for a dataset.
40 dataset_scripts_allow_list (`list[str]`):
41 List of datasets for which we support dataset scripts.
42 Unix shell-style wildcards also work in the dataset name for namespaced datasets,
43 for example `some_namespace/*` to refer to all the datasets in the `some_namespace` namespace.
44 The keyword `{{ALL_DATASETS_WITH_NO_NAMESPACE}}` refers to all the datasets without namespace.
45 hf_token (`str`, *optional*):
46 An authentication token (See https://huggingface.co/settings/token)
47
48 Raises:
49 [~`libcommon.exceptions.EmptyDatasetError`]:
50 The dataset is empty.
51 [~`libcommon.exceptions.DatasetModuleNotInstalledError`]:
52 The dataset tries to import a module that is not installed.
53 [~`libcommon.exceptions.ConfigNamesError`]:
54 If the list of configs could not be obtained using the datasets library.
55 [~`libcommon.exceptions.DatasetWithScriptNotSupportedError`]:
56 If the dataset has a dataset script and is not in the allow list.
57
58 Returns:
59 `DatasetConfigNamesResponse`: An object with the list of config names.
60 """
61 logging.info(f"get 'dateset-config-names' for {dataset=}")
62 # get the list of splits in streaming mode
63 try:
64 config_name_items: list[ConfigNameItem] = [
65 {"dataset": dataset, "config": str(config)}
66 for config in sorted(
67 get_dataset_config_names(
68 path=dataset,
69 token=hf_token,
70 trust_remote_code=resolve_trust_remote_code(
71 dataset=dataset, allow_list=dataset_scripts_allow_list
72 ),
73 )
74 )
75 ]
76 except _EmptyDatasetError as err:
77 raise EmptyDatasetError("The dataset is empty.", cause=err) from err
78 except ImportError as err:
79 raise DatasetModuleNotInstalledError(
80 "The dataset tries to import a module that is not installed.", cause=err
81 ) from err
82 except Exception as err:
83 if isinstance(err, ValueError) and "trust_remote_code" in str(err):
84 raise DatasetWithScriptNotSupportedError(
85 "The dataset viewer doesn't support this dataset because it runs "
86 "arbitrary python code. Please open a discussion in the discussion tab "
87 "if you think this is an error and tag @lhoestq and @severo."
88 ) from err
89 raise ConfigNamesError("Cannot get the config names for the dataset.", cause=err) from err
90
91 number_of_configs = len(config_name_items)
92 if number_of_configs > max_number:
93 raise DatasetWithTooManyConfigsError(
94 f"The maximum number of configs allowed is {max_number}, dataset has {number_of_configs} configs."
95 )
96
97 return DatasetConfigNamesResponse(config_names=config_name_items)
98
99
100 class DatasetConfigNamesJobRunner(DatasetJobRunnerWithDatasetsCache):
101 @staticmethod
102 def get_job_type() -> str:
103 return "dataset-config-names"
104
105 def compute(self) -> CompleteJobResult:
106 return CompleteJobResult(
107 compute_config_names_response(
108 dataset=self.dataset,
109 hf_token=self.app_config.common.hf_token,
110 max_number=self.app_config.config_names.max_number,
111 dataset_scripts_allow_list=self.app_config.common.dataset_scripts_allow_list,
112 )
113 )
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/services/worker/src/worker/job_runners/dataset/config_names.py b/services/worker/src/worker/job_runners/dataset/config_names.py
--- a/services/worker/src/worker/job_runners/dataset/config_names.py
+++ b/services/worker/src/worker/job_runners/dataset/config_names.py
@@ -75,18 +75,21 @@
]
except _EmptyDatasetError as err:
raise EmptyDatasetError("The dataset is empty.", cause=err) from err
- except ImportError as err:
- raise DatasetModuleNotInstalledError(
- "The dataset tries to import a module that is not installed.", cause=err
- ) from err
- except Exception as err:
- if isinstance(err, ValueError) and "trust_remote_code" in str(err):
+ except ValueError as err:
+ if "trust_remote_code" in str(err):
raise DatasetWithScriptNotSupportedError(
"The dataset viewer doesn't support this dataset because it runs "
"arbitrary python code. Please open a discussion in the discussion tab "
"if you think this is an error and tag @lhoestq and @severo."
) from err
raise ConfigNamesError("Cannot get the config names for the dataset.", cause=err) from err
+ except ImportError as err:
+ # this should only happen if the dataset is in the allow list, which should soon disappear
+ raise DatasetModuleNotInstalledError(
+ "The dataset tries to import a module that is not installed.", cause=err
+ ) from err
+ except Exception as err:
+ raise ConfigNamesError("Cannot get the config names for the dataset.", cause=err) from err
number_of_configs = len(config_name_items)
if number_of_configs > max_number:
| {"golden_diff": "diff --git a/services/worker/src/worker/job_runners/dataset/config_names.py b/services/worker/src/worker/job_runners/dataset/config_names.py\n--- a/services/worker/src/worker/job_runners/dataset/config_names.py\n+++ b/services/worker/src/worker/job_runners/dataset/config_names.py\n@@ -75,18 +75,21 @@\n ]\n except _EmptyDatasetError as err:\n raise EmptyDatasetError(\"The dataset is empty.\", cause=err) from err\n- except ImportError as err:\n- raise DatasetModuleNotInstalledError(\n- \"The dataset tries to import a module that is not installed.\", cause=err\n- ) from err\n- except Exception as err:\n- if isinstance(err, ValueError) and \"trust_remote_code\" in str(err):\n+ except ValueError as err:\n+ if \"trust_remote_code\" in str(err):\n raise DatasetWithScriptNotSupportedError(\n \"The dataset viewer doesn't support this dataset because it runs \"\n \"arbitrary python code. Please open a discussion in the discussion tab \"\n \"if you think this is an error and tag @lhoestq and @severo.\"\n ) from err\n raise ConfigNamesError(\"Cannot get the config names for the dataset.\", cause=err) from err\n+ except ImportError as err:\n+ # this should only happen if the dataset is in the allow list, which should soon disappear\n+ raise DatasetModuleNotInstalledError(\n+ \"The dataset tries to import a module that is not installed.\", cause=err\n+ ) from err\n+ except Exception as err:\n+ raise ConfigNamesError(\"Cannot get the config names for the dataset.\", cause=err) from err\n \n number_of_configs = len(config_name_items)\n if number_of_configs > max_number:\n", "issue": "Replace `DatasetModuleNotInstalledError` errors with `DatasetWithScriptNotSupportedError`\nWe should never have a `DatasetModuleNotInstalledError` error, because we should return a `DatasetWithScriptNotSupportedError` error before\r\n\r\nSee https://github.com/huggingface/datasets-server/issues/1067#issuecomment-1924305954\n", "before_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n# Copyright 2022 The HuggingFace Authors.\n\nimport logging\nfrom typing import Optional\n\nfrom datasets import get_dataset_config_names\nfrom datasets.data_files import EmptyDatasetError as _EmptyDatasetError\nfrom libcommon.exceptions import (\n ConfigNamesError,\n DatasetModuleNotInstalledError,\n DatasetWithScriptNotSupportedError,\n DatasetWithTooManyConfigsError,\n EmptyDatasetError,\n)\n\nfrom worker.dtos import CompleteJobResult, ConfigNameItem, DatasetConfigNamesResponse\nfrom worker.job_runners.dataset.dataset_job_runner import (\n DatasetJobRunnerWithDatasetsCache,\n)\nfrom worker.utils import resolve_trust_remote_code\n\n\ndef compute_config_names_response(\n dataset: str,\n max_number: int,\n dataset_scripts_allow_list: list[str],\n hf_token: Optional[str] = None,\n) -> DatasetConfigNamesResponse:\n \"\"\"\n Get the response of 'dataset-config-names' for one specific dataset on huggingface.co.\n Dataset can be gated if you pass an acceptable token.\n It is assumed that the dataset exists and can be accessed using the token.\n\n Args:\n dataset (`str`):\n A namespace (user or an organization) and a repo name separated by a `/`.\n max_number (`int`):\n The maximum number of configs for a dataset.\n dataset_scripts_allow_list (`list[str]`):\n List of datasets for which we support dataset scripts.\n Unix shell-style wildcards also work in the dataset name for namespaced datasets,\n for example `some_namespace/*` to refer to all the datasets in the `some_namespace` namespace.\n The keyword 
`{{ALL_DATASETS_WITH_NO_NAMESPACE}}` refers to all the datasets without namespace.\n hf_token (`str`, *optional*):\n An authentication token (See https://huggingface.co/settings/token)\n\n Raises:\n [~`libcommon.exceptions.EmptyDatasetError`]:\n The dataset is empty.\n [~`libcommon.exceptions.DatasetModuleNotInstalledError`]:\n The dataset tries to import a module that is not installed.\n [~`libcommon.exceptions.ConfigNamesError`]:\n If the list of configs could not be obtained using the datasets library.\n [~`libcommon.exceptions.DatasetWithScriptNotSupportedError`]:\n If the dataset has a dataset script and is not in the allow list.\n\n Returns:\n `DatasetConfigNamesResponse`: An object with the list of config names.\n \"\"\"\n logging.info(f\"get 'dateset-config-names' for {dataset=}\")\n # get the list of splits in streaming mode\n try:\n config_name_items: list[ConfigNameItem] = [\n {\"dataset\": dataset, \"config\": str(config)}\n for config in sorted(\n get_dataset_config_names(\n path=dataset,\n token=hf_token,\n trust_remote_code=resolve_trust_remote_code(\n dataset=dataset, allow_list=dataset_scripts_allow_list\n ),\n )\n )\n ]\n except _EmptyDatasetError as err:\n raise EmptyDatasetError(\"The dataset is empty.\", cause=err) from err\n except ImportError as err:\n raise DatasetModuleNotInstalledError(\n \"The dataset tries to import a module that is not installed.\", cause=err\n ) from err\n except Exception as err:\n if isinstance(err, ValueError) and \"trust_remote_code\" in str(err):\n raise DatasetWithScriptNotSupportedError(\n \"The dataset viewer doesn't support this dataset because it runs \"\n \"arbitrary python code. Please open a discussion in the discussion tab \"\n \"if you think this is an error and tag @lhoestq and @severo.\"\n ) from err\n raise ConfigNamesError(\"Cannot get the config names for the dataset.\", cause=err) from err\n\n number_of_configs = len(config_name_items)\n if number_of_configs > max_number:\n raise DatasetWithTooManyConfigsError(\n f\"The maximum number of configs allowed is {max_number}, dataset has {number_of_configs} configs.\"\n )\n\n return DatasetConfigNamesResponse(config_names=config_name_items)\n\n\nclass DatasetConfigNamesJobRunner(DatasetJobRunnerWithDatasetsCache):\n @staticmethod\n def get_job_type() -> str:\n return \"dataset-config-names\"\n\n def compute(self) -> CompleteJobResult:\n return CompleteJobResult(\n compute_config_names_response(\n dataset=self.dataset,\n hf_token=self.app_config.common.hf_token,\n max_number=self.app_config.config_names.max_number,\n dataset_scripts_allow_list=self.app_config.common.dataset_scripts_allow_list,\n )\n )\n", "path": "services/worker/src/worker/job_runners/dataset/config_names.py"}], "after_files": [{"content": "# SPDX-License-Identifier: Apache-2.0\n# Copyright 2022 The HuggingFace Authors.\n\nimport logging\nfrom typing import Optional\n\nfrom datasets import get_dataset_config_names\nfrom datasets.data_files import EmptyDatasetError as _EmptyDatasetError\nfrom libcommon.exceptions import (\n ConfigNamesError,\n DatasetModuleNotInstalledError,\n DatasetWithScriptNotSupportedError,\n DatasetWithTooManyConfigsError,\n EmptyDatasetError,\n)\n\nfrom worker.dtos import CompleteJobResult, ConfigNameItem, DatasetConfigNamesResponse\nfrom worker.job_runners.dataset.dataset_job_runner import (\n DatasetJobRunnerWithDatasetsCache,\n)\nfrom worker.utils import resolve_trust_remote_code\n\n\ndef compute_config_names_response(\n dataset: str,\n max_number: int,\n dataset_scripts_allow_list: 
list[str],\n hf_token: Optional[str] = None,\n) -> DatasetConfigNamesResponse:\n \"\"\"\n Get the response of 'dataset-config-names' for one specific dataset on huggingface.co.\n Dataset can be gated if you pass an acceptable token.\n It is assumed that the dataset exists and can be accessed using the token.\n\n Args:\n dataset (`str`):\n A namespace (user or an organization) and a repo name separated by a `/`.\n max_number (`int`):\n The maximum number of configs for a dataset.\n dataset_scripts_allow_list (`list[str]`):\n List of datasets for which we support dataset scripts.\n Unix shell-style wildcards also work in the dataset name for namespaced datasets,\n for example `some_namespace/*` to refer to all the datasets in the `some_namespace` namespace.\n The keyword `{{ALL_DATASETS_WITH_NO_NAMESPACE}}` refers to all the datasets without namespace.\n hf_token (`str`, *optional*):\n An authentication token (See https://huggingface.co/settings/token)\n\n Raises:\n [~`libcommon.exceptions.EmptyDatasetError`]:\n The dataset is empty.\n [~`libcommon.exceptions.DatasetModuleNotInstalledError`]:\n The dataset tries to import a module that is not installed.\n [~`libcommon.exceptions.ConfigNamesError`]:\n If the list of configs could not be obtained using the datasets library.\n [~`libcommon.exceptions.DatasetWithScriptNotSupportedError`]:\n If the dataset has a dataset script and is not in the allow list.\n\n Returns:\n `DatasetConfigNamesResponse`: An object with the list of config names.\n \"\"\"\n logging.info(f\"get 'dateset-config-names' for {dataset=}\")\n # get the list of splits in streaming mode\n try:\n config_name_items: list[ConfigNameItem] = [\n {\"dataset\": dataset, \"config\": str(config)}\n for config in sorted(\n get_dataset_config_names(\n path=dataset,\n token=hf_token,\n trust_remote_code=resolve_trust_remote_code(\n dataset=dataset, allow_list=dataset_scripts_allow_list\n ),\n )\n )\n ]\n except _EmptyDatasetError as err:\n raise EmptyDatasetError(\"The dataset is empty.\", cause=err) from err\n except ValueError as err:\n if \"trust_remote_code\" in str(err):\n raise DatasetWithScriptNotSupportedError(\n \"The dataset viewer doesn't support this dataset because it runs \"\n \"arbitrary python code. 
Please open a discussion in the discussion tab \"\n \"if you think this is an error and tag @lhoestq and @severo.\"\n ) from err\n raise ConfigNamesError(\"Cannot get the config names for the dataset.\", cause=err) from err\n except ImportError as err:\n # this should only happen if the dataset is in the allow list, which should soon disappear\n raise DatasetModuleNotInstalledError(\n \"The dataset tries to import a module that is not installed.\", cause=err\n ) from err\n except Exception as err:\n raise ConfigNamesError(\"Cannot get the config names for the dataset.\", cause=err) from err\n\n number_of_configs = len(config_name_items)\n if number_of_configs > max_number:\n raise DatasetWithTooManyConfigsError(\n f\"The maximum number of configs allowed is {max_number}, dataset has {number_of_configs} configs.\"\n )\n\n return DatasetConfigNamesResponse(config_names=config_name_items)\n\n\nclass DatasetConfigNamesJobRunner(DatasetJobRunnerWithDatasetsCache):\n @staticmethod\n def get_job_type() -> str:\n return \"dataset-config-names\"\n\n def compute(self) -> CompleteJobResult:\n return CompleteJobResult(\n compute_config_names_response(\n dataset=self.dataset,\n hf_token=self.app_config.common.hf_token,\n max_number=self.app_config.config_names.max_number,\n dataset_scripts_allow_list=self.app_config.common.dataset_scripts_allow_list,\n )\n )\n", "path": "services/worker/src/worker/job_runners/dataset/config_names.py"}]} | 1,561 | 394 |
gh_patches_debug_185 | rasdani/github-patches | git_diff | pyqtgraph__pyqtgraph-868 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Crash on closing Matplotlib export
E.g. when opening the Matplotlib exporter multiple times and closing the windows again, Python crashes with a segmentation fault.
This is caused by the Matplotlib QMainWindow listening to the closeEvent and deleting the only reference to the window before it is closed properly.
--- END ISSUE ---
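A minimal reproduction sketch (an interactive Qt session is assumed; the plot data is arbitrary):

```python
import pyqtgraph as pg
import pyqtgraph.exporters  # makes MatplotlibExporter importable

plt = pg.plot([0, 1, 2, 3], [1, 3, 2, 4])
exporter = pg.exporters.MatplotlibExporter(plt.plotItem)
exporter.export()  # opens a MatplotlibWindow
exporter.export()  # ...and a second one

# Closing either window runs MatplotlibWindow.closeEvent, which removes the
# last Python reference while Qt is still processing the close -> segfault.
```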
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyqtgraph/exporters/Matplotlib.py`
Content:
```
1 from ..Qt import QtGui, QtCore
2 from .Exporter import Exporter
3 from .. import PlotItem
4 from .. import functions as fn
5
6 __all__ = ['MatplotlibExporter']
7
8 """
9 It is helpful when using the matplotlib Exporter if your
10 .matplotlib/matplotlibrc file is configured appropriately.
11 The following are suggested for getting usable PDF output that
12 can be edited in Illustrator, etc.
13
14 backend : Qt4Agg
15 text.usetex : True # Assumes you have a findable LaTeX installation
16 interactive : False
17 font.family : sans-serif
18 font.sans-serif : 'Arial' # (make first in list)
19 mathtext.default : sf
20 figure.facecolor : white # personal preference
21 # next setting allows pdf font to be readable in Adobe Illustrator
22 pdf.fonttype : 42 # set fonts to TrueType (otherwise it will be 3
23 # and the text will be vectorized.
24 text.dvipnghack : True # primarily to clean up font appearance on Mac
25
26 The advantage is that there is less to do to get an exported file cleaned and ready for
27 publication. Fonts are not vectorized (outlined), and window colors are white.
28
29 """
30
31 class MatplotlibExporter(Exporter):
32 Name = "Matplotlib Window"
33 windows = []
34 def __init__(self, item):
35 Exporter.__init__(self, item)
36
37 def parameters(self):
38 return None
39
40 def cleanAxes(self, axl):
41 if type(axl) is not list:
42 axl = [axl]
43 for ax in axl:
44 if ax is None:
45 continue
46 for loc, spine in ax.spines.items():
47 if loc in ['left', 'bottom']:
48 pass
49 elif loc in ['right', 'top']:
50 spine.set_color('none')
51 # do not draw the spine
52 else:
53 raise ValueError('Unknown spine location: %s' % loc)
54 # turn off ticks when there is no spine
55 ax.xaxis.set_ticks_position('bottom')
56
57 def export(self, fileName=None):
58
59 if isinstance(self.item, PlotItem):
60 mpw = MatplotlibWindow()
61 MatplotlibExporter.windows.append(mpw)
62
63 stdFont = 'Arial'
64
65 fig = mpw.getFigure()
66
67 # get labels from the graphic item
68 xlabel = self.item.axes['bottom']['item'].label.toPlainText()
69 ylabel = self.item.axes['left']['item'].label.toPlainText()
70 title = self.item.titleLabel.text
71
72 ax = fig.add_subplot(111, title=title)
73 ax.clear()
74 self.cleanAxes(ax)
75 #ax.grid(True)
76 for item in self.item.curves:
77 x, y = item.getData()
78 opts = item.opts
79 pen = fn.mkPen(opts['pen'])
80 if pen.style() == QtCore.Qt.NoPen:
81 linestyle = ''
82 else:
83 linestyle = '-'
84 color = tuple([c/255. for c in fn.colorTuple(pen.color())])
85 symbol = opts['symbol']
86 if symbol == 't':
87 symbol = '^'
88 symbolPen = fn.mkPen(opts['symbolPen'])
89 symbolBrush = fn.mkBrush(opts['symbolBrush'])
90 markeredgecolor = tuple([c/255. for c in fn.colorTuple(symbolPen.color())])
91 markerfacecolor = tuple([c/255. for c in fn.colorTuple(symbolBrush.color())])
92 markersize = opts['symbolSize']
93
94 if opts['fillLevel'] is not None and opts['fillBrush'] is not None:
95 fillBrush = fn.mkBrush(opts['fillBrush'])
96 fillcolor = tuple([c/255. for c in fn.colorTuple(fillBrush.color())])
97 ax.fill_between(x=x, y1=y, y2=opts['fillLevel'], facecolor=fillcolor)
98
99 pl = ax.plot(x, y, marker=symbol, color=color, linewidth=pen.width(),
100 linestyle=linestyle, markeredgecolor=markeredgecolor, markerfacecolor=markerfacecolor,
101 markersize=markersize)
102 xr, yr = self.item.viewRange()
103 ax.set_xbound(*xr)
104 ax.set_ybound(*yr)
105 ax.set_xlabel(xlabel) # place the labels.
106 ax.set_ylabel(ylabel)
107 mpw.draw()
108 else:
109 raise Exception("Matplotlib export currently only works with plot items")
110
111 MatplotlibExporter.register()
112
113
114 class MatplotlibWindow(QtGui.QMainWindow):
115 def __init__(self):
116 from ..widgets import MatplotlibWidget
117 QtGui.QMainWindow.__init__(self)
118 self.mpl = MatplotlibWidget.MatplotlibWidget()
119 self.setCentralWidget(self.mpl)
120 self.show()
121
122 def __getattr__(self, attr):
123 return getattr(self.mpl, attr)
124
125 def closeEvent(self, ev):
126 MatplotlibExporter.windows.remove(self)
127
128
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pyqtgraph/exporters/Matplotlib.py b/pyqtgraph/exporters/Matplotlib.py
--- a/pyqtgraph/exporters/Matplotlib.py
+++ b/pyqtgraph/exporters/Matplotlib.py
@@ -124,5 +124,4 @@
def closeEvent(self, ev):
MatplotlibExporter.windows.remove(self)
-
-
+ self.deleteLater()
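The one-line fix follows the usual Qt idiom: drop your own bookkeeping in `closeEvent`, then let Qt destroy the widget only after the event has fully unwound. In isolation (a generic sketch; the registry name is hypothetical):

```python
from pyqtgraph.Qt import QtGui

open_windows = []  # module-level registry, mirroring MatplotlibExporter.windows

class TrackedWindow(QtGui.QMainWindow):
    def closeEvent(self, ev):
        open_windows.remove(self)  # drop the Python-side reference...
        self.deleteLater()         # ...and let Qt free the C++ object after
                                   # the close event has finished processing
```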
| {"golden_diff": "diff --git a/pyqtgraph/exporters/Matplotlib.py b/pyqtgraph/exporters/Matplotlib.py\n--- a/pyqtgraph/exporters/Matplotlib.py\n+++ b/pyqtgraph/exporters/Matplotlib.py\n@@ -124,5 +124,4 @@\n \n def closeEvent(self, ev):\n MatplotlibExporter.windows.remove(self)\n-\n-\n+ self.deleteLater()\n", "issue": "Crash on closing Matplotlib export\nE.g. when opening the Matplotlib exporter multiple times, and closing the windows again, Python crashes with a segmentation fault.\r\n\r\nThis is caused by the Matplotlib QMainWindow listening to the closeEvent and deleting the only reference of the window before it is closed properly.\n", "before_files": [{"content": "from ..Qt import QtGui, QtCore\nfrom .Exporter import Exporter\nfrom .. import PlotItem\nfrom .. import functions as fn\n\n__all__ = ['MatplotlibExporter']\n\n\"\"\"\nIt is helpful when using the matplotlib Exporter if your\n.matplotlib/matplotlibrc file is configured appropriately.\nThe following are suggested for getting usable PDF output that\ncan be edited in Illustrator, etc.\n\nbackend : Qt4Agg\ntext.usetex : True # Assumes you have a findable LaTeX installation\ninteractive : False\nfont.family : sans-serif\nfont.sans-serif : 'Arial' # (make first in list)\nmathtext.default : sf\nfigure.facecolor : white # personal preference\n# next setting allows pdf font to be readable in Adobe Illustrator\npdf.fonttype : 42 # set fonts to TrueType (otherwise it will be 3\n # and the text will be vectorized.\ntext.dvipnghack : True # primarily to clean up font appearance on Mac\n\nThe advantage is that there is less to do to get an exported file cleaned and ready for\npublication. Fonts are not vectorized (outlined), and window colors are white.\n\n\"\"\"\n \nclass MatplotlibExporter(Exporter):\n Name = \"Matplotlib Window\"\n windows = []\n def __init__(self, item):\n Exporter.__init__(self, item)\n \n def parameters(self):\n return None\n\n def cleanAxes(self, axl):\n if type(axl) is not list:\n axl = [axl]\n for ax in axl:\n if ax is None:\n continue\n for loc, spine in ax.spines.items():\n if loc in ['left', 'bottom']:\n pass\n elif loc in ['right', 'top']:\n spine.set_color('none')\n # do not draw the spine\n else:\n raise ValueError('Unknown spine location: %s' % loc)\n # turn off ticks when there is no spine\n ax.xaxis.set_ticks_position('bottom')\n \n def export(self, fileName=None):\n \n if isinstance(self.item, PlotItem):\n mpw = MatplotlibWindow()\n MatplotlibExporter.windows.append(mpw)\n\n stdFont = 'Arial'\n \n fig = mpw.getFigure()\n \n # get labels from the graphic item\n xlabel = self.item.axes['bottom']['item'].label.toPlainText()\n ylabel = self.item.axes['left']['item'].label.toPlainText()\n title = self.item.titleLabel.text\n\n ax = fig.add_subplot(111, title=title)\n ax.clear()\n self.cleanAxes(ax)\n #ax.grid(True)\n for item in self.item.curves:\n x, y = item.getData()\n opts = item.opts\n pen = fn.mkPen(opts['pen'])\n if pen.style() == QtCore.Qt.NoPen:\n linestyle = ''\n else:\n linestyle = '-'\n color = tuple([c/255. for c in fn.colorTuple(pen.color())])\n symbol = opts['symbol']\n if symbol == 't':\n symbol = '^'\n symbolPen = fn.mkPen(opts['symbolPen'])\n symbolBrush = fn.mkBrush(opts['symbolBrush'])\n markeredgecolor = tuple([c/255. for c in fn.colorTuple(symbolPen.color())])\n markerfacecolor = tuple([c/255. 
for c in fn.colorTuple(symbolBrush.color())])\n markersize = opts['symbolSize']\n \n if opts['fillLevel'] is not None and opts['fillBrush'] is not None:\n fillBrush = fn.mkBrush(opts['fillBrush'])\n fillcolor = tuple([c/255. for c in fn.colorTuple(fillBrush.color())])\n ax.fill_between(x=x, y1=y, y2=opts['fillLevel'], facecolor=fillcolor)\n \n pl = ax.plot(x, y, marker=symbol, color=color, linewidth=pen.width(), \n linestyle=linestyle, markeredgecolor=markeredgecolor, markerfacecolor=markerfacecolor,\n markersize=markersize)\n xr, yr = self.item.viewRange()\n ax.set_xbound(*xr)\n ax.set_ybound(*yr)\n ax.set_xlabel(xlabel) # place the labels.\n ax.set_ylabel(ylabel)\n mpw.draw()\n else:\n raise Exception(\"Matplotlib export currently only works with plot items\")\n \nMatplotlibExporter.register() \n \n\nclass MatplotlibWindow(QtGui.QMainWindow):\n def __init__(self):\n from ..widgets import MatplotlibWidget\n QtGui.QMainWindow.__init__(self)\n self.mpl = MatplotlibWidget.MatplotlibWidget()\n self.setCentralWidget(self.mpl)\n self.show()\n \n def __getattr__(self, attr):\n return getattr(self.mpl, attr)\n \n def closeEvent(self, ev):\n MatplotlibExporter.windows.remove(self)\n\n\n", "path": "pyqtgraph/exporters/Matplotlib.py"}], "after_files": [{"content": "from ..Qt import QtGui, QtCore\nfrom .Exporter import Exporter\nfrom .. import PlotItem\nfrom .. import functions as fn\n\n__all__ = ['MatplotlibExporter']\n\n\"\"\"\nIt is helpful when using the matplotlib Exporter if your\n.matplotlib/matplotlibrc file is configured appropriately.\nThe following are suggested for getting usable PDF output that\ncan be edited in Illustrator, etc.\n\nbackend : Qt4Agg\ntext.usetex : True # Assumes you have a findable LaTeX installation\ninteractive : False\nfont.family : sans-serif\nfont.sans-serif : 'Arial' # (make first in list)\nmathtext.default : sf\nfigure.facecolor : white # personal preference\n# next setting allows pdf font to be readable in Adobe Illustrator\npdf.fonttype : 42 # set fonts to TrueType (otherwise it will be 3\n # and the text will be vectorized.\ntext.dvipnghack : True # primarily to clean up font appearance on Mac\n\nThe advantage is that there is less to do to get an exported file cleaned and ready for\npublication. 
Fonts are not vectorized (outlined), and window colors are white.\n\n\"\"\"\n \nclass MatplotlibExporter(Exporter):\n Name = \"Matplotlib Window\"\n windows = []\n def __init__(self, item):\n Exporter.__init__(self, item)\n \n def parameters(self):\n return None\n\n def cleanAxes(self, axl):\n if type(axl) is not list:\n axl = [axl]\n for ax in axl:\n if ax is None:\n continue\n for loc, spine in ax.spines.items():\n if loc in ['left', 'bottom']:\n pass\n elif loc in ['right', 'top']:\n spine.set_color('none')\n # do not draw the spine\n else:\n raise ValueError('Unknown spine location: %s' % loc)\n # turn off ticks when there is no spine\n ax.xaxis.set_ticks_position('bottom')\n \n def export(self, fileName=None):\n \n if isinstance(self.item, PlotItem):\n mpw = MatplotlibWindow()\n MatplotlibExporter.windows.append(mpw)\n\n stdFont = 'Arial'\n \n fig = mpw.getFigure()\n \n # get labels from the graphic item\n xlabel = self.item.axes['bottom']['item'].label.toPlainText()\n ylabel = self.item.axes['left']['item'].label.toPlainText()\n title = self.item.titleLabel.text\n\n ax = fig.add_subplot(111, title=title)\n ax.clear()\n self.cleanAxes(ax)\n #ax.grid(True)\n for item in self.item.curves:\n x, y = item.getData()\n opts = item.opts\n pen = fn.mkPen(opts['pen'])\n if pen.style() == QtCore.Qt.NoPen:\n linestyle = ''\n else:\n linestyle = '-'\n color = tuple([c/255. for c in fn.colorTuple(pen.color())])\n symbol = opts['symbol']\n if symbol == 't':\n symbol = '^'\n symbolPen = fn.mkPen(opts['symbolPen'])\n symbolBrush = fn.mkBrush(opts['symbolBrush'])\n markeredgecolor = tuple([c/255. for c in fn.colorTuple(symbolPen.color())])\n markerfacecolor = tuple([c/255. for c in fn.colorTuple(symbolBrush.color())])\n markersize = opts['symbolSize']\n \n if opts['fillLevel'] is not None and opts['fillBrush'] is not None:\n fillBrush = fn.mkBrush(opts['fillBrush'])\n fillcolor = tuple([c/255. for c in fn.colorTuple(fillBrush.color())])\n ax.fill_between(x=x, y1=y, y2=opts['fillLevel'], facecolor=fillcolor)\n \n pl = ax.plot(x, y, marker=symbol, color=color, linewidth=pen.width(), \n linestyle=linestyle, markeredgecolor=markeredgecolor, markerfacecolor=markerfacecolor,\n markersize=markersize)\n xr, yr = self.item.viewRange()\n ax.set_xbound(*xr)\n ax.set_ybound(*yr)\n ax.set_xlabel(xlabel) # place the labels.\n ax.set_ylabel(ylabel)\n mpw.draw()\n else:\n raise Exception(\"Matplotlib export currently only works with plot items\")\n \nMatplotlibExporter.register() \n \n\nclass MatplotlibWindow(QtGui.QMainWindow):\n def __init__(self):\n from ..widgets import MatplotlibWidget\n QtGui.QMainWindow.__init__(self)\n self.mpl = MatplotlibWidget.MatplotlibWidget()\n self.setCentralWidget(self.mpl)\n self.show()\n \n def __getattr__(self, attr):\n return getattr(self.mpl, attr)\n \n def closeEvent(self, ev):\n MatplotlibExporter.windows.remove(self)\n self.deleteLater()\n", "path": "pyqtgraph/exporters/Matplotlib.py"}]} | 1,650 | 87 |
gh_patches_debug_24516 | rasdani/github-patches | git_diff | dbt-labs__dbt-core-3275 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
git package: warn if specified revision master, main
Follow up from #3057, #3104
Previously, if a git package was unpinned, dbt tried to install it from `master` (hard coded). In #3104, this was fixed to use `HEAD` (default branch, regardless of name) and to continue warning if unpinned.
We'd like to warn as well if the user specifies the package `revision` as `'master'` or `'main'`, since either almost certainly represents the default branch. Users can disable the warning with `warn-unpinned: false`.
### Example
```yml
packages:
- git: https://github.com/fishtown-analytics/dbt-codegen
revision: master
- git: https://github.com/tailsdotcom/dbt_artifacts
revision: main
- git: https://gitlab.com/gitlab-data/snowflake_spend
- package: fishtown-analytics/audit_helper
version: 0.3.0
```
<details>
<summary> <code>$ dbt deps</code> </summary>
```
Running with dbt=0.19.0
WARNING: The git package "https://github.com/fishtown-analytics/dbt-codegen"
is pinned to the "master" branch.
This can introduce breaking changes into your project without warning!
See https://docs.getdbt.com/docs/package-management#section-specifying-package-versions
WARNING: The git package "https://github.com/tailsdotcom/dbt_artifacts"
is pinned to the "main" branch.
This can introduce breaking changes into your project without warning!
See https://docs.getdbt.com/docs/package-management#section-specifying-package-versions
WARNING: The git package "https://gitlab.com/gitlab-data/snowflake_spend"
is not pinned, using HEAD (default branch).
This can introduce breaking changes into your project without warning!
See https://docs.getdbt.com/docs/package-management#section-specifying-package-versions
WARNING: The git package "https://github.com/fishtown-analytics/dbt-codegen"
is pinned to the "master" branch.
This can introduce breaking changes into your project without warning!
See https://docs.getdbt.com/docs/package-management#section-specifying-package-versions
WARNING: The git package "https://github.com/tailsdotcom/dbt_artifacts"
is pinned to the "main" branch.
This can introduce breaking changes into your project without warning!
See https://docs.getdbt.com/docs/package-management#section-specifying-package-versions
WARNING: The git package "https://gitlab.com/gitlab-data/snowflake_spend"
is not pinned, using HEAD (default branch).
This can introduce breaking changes into your project without warning!
See https://docs.getdbt.com/docs/package-management#section-specifying-package-versions
Installing https://github.com/fishtown-analytics/dbt-codegen@master
Installed from revision master
Installing https://github.com/tailsdotcom/dbt_artifacts@main
Installed from revision main
Installing https://gitlab.com/gitlab-data/snowflake_spend@HEAD
Installed from HEAD (default branch)
Installing fishtown-analytics/[email protected]
Installed from version 0.3.0
Installing fishtown-analytics/[email protected]
Installed from version 0.6.4
```
</details>
```yml
packages:
- git: https://github.com/fishtown-analytics/dbt-codegen
revision: master
warn-unpinned: false
- git: https://github.com/tailsdotcom/dbt_artifacts
revision: main
warn-unpinned: false
- git: https://gitlab.com/gitlab-data/snowflake_spend
warn-unpinned: false
- package: fishtown-analytics/audit_helper
version: 0.3.0
```
<details>
<summary> <code>$ dbt deps</code> </summary>
```
Running with dbt=0.19.0
Installing https://github.com/fishtown-analytics/dbt-codegen@master
Installed from revision master
Installing https://github.com/tailsdotcom/dbt_artifacts@main
Installed from revision main
Installing https://gitlab.com/gitlab-data/snowflake_spend@HEAD
Installed from HEAD (default branch)
Installing fishtown-analytics/[email protected]
Installed from version 0.3.0
Installing fishtown-analytics/[email protected]
Installed from version 0.6.4
```
</details>
### Checklist
- [x] I have signed the [CLA](https://docs.getdbt.com/docs/contributor-license-agreements)
- [x] I have run this code in development and it appears to resolve the stated issue
- ~This PR includes tests, or tests are not required/relevant for this PR~
- [x] I have updated the `CHANGELOG.md` and added information about my change to the "dbt next" section.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/dbt/clients/registry.py`
Content:
```
1 from functools import wraps
2 import requests
3 from dbt.exceptions import RegistryException
4 from dbt.utils import memoized
5 from dbt.logger import GLOBAL_LOGGER as logger
6 import os
7 import time
8
9 if os.getenv('DBT_PACKAGE_HUB_URL'):
10 DEFAULT_REGISTRY_BASE_URL = os.getenv('DBT_PACKAGE_HUB_URL')
11 else:
12 DEFAULT_REGISTRY_BASE_URL = 'https://hub.getdbt.com/'
13
14
15 def _get_url(url, registry_base_url=None):
16 if registry_base_url is None:
17 registry_base_url = DEFAULT_REGISTRY_BASE_URL
18
19 return '{}{}'.format(registry_base_url, url)
20
21
22 def _wrap_exceptions(fn):
23 @wraps(fn)
24 def wrapper(*args, **kwargs):
25 max_attempts = 5
26 attempt = 0
27 while True:
28 attempt += 1
29 try:
30 return fn(*args, **kwargs)
31 except requests.exceptions.ConnectionError as exc:
32 if attempt < max_attempts:
33 time.sleep(1)
34 continue
35
36 raise RegistryException(
37 'Unable to connect to registry hub'
38 ) from exc
39 return wrapper
40
41
42 @_wrap_exceptions
43 def _get(path, registry_base_url=None):
44 url = _get_url(path, registry_base_url)
45 logger.debug('Making package registry request: GET {}'.format(url))
46 resp = requests.get(url)
47 logger.debug('Response from registry: GET {} {}'.format(url,
48 resp.status_code))
49 resp.raise_for_status()
50 return resp.json()
51
52
53 def index(registry_base_url=None):
54 return _get('api/v1/index.json', registry_base_url)
55
56
57 index_cached = memoized(index)
58
59
60 def packages(registry_base_url=None):
61 return _get('api/v1/packages.json', registry_base_url)
62
63
64 def package(name, registry_base_url=None):
65 return _get('api/v1/{}.json'.format(name), registry_base_url)
66
67
68 def package_version(name, version, registry_base_url=None):
69 return _get('api/v1/{}/{}.json'.format(name, version), registry_base_url)
70
71
72 def get_available_versions(name):
73 response = package(name)
74 return list(response['versions'])
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/dbt/clients/registry.py b/core/dbt/clients/registry.py
--- a/core/dbt/clients/registry.py
+++ b/core/dbt/clients/registry.py
@@ -28,11 +28,10 @@
attempt += 1
try:
return fn(*args, **kwargs)
- except requests.exceptions.ConnectionError as exc:
+ except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as exc:
if attempt < max_attempts:
time.sleep(1)
continue
-
raise RegistryException(
'Unable to connect to registry hub'
) from exc
@@ -43,7 +42,7 @@
def _get(path, registry_base_url=None):
url = _get_url(path, registry_base_url)
logger.debug('Making package registry request: GET {}'.format(url))
- resp = requests.get(url)
+ resp = requests.get(url, timeout=30)
logger.debug('Response from registry: GET {} {}'.format(url,
resp.status_code))
resp.raise_for_status()
| {"golden_diff": "diff --git a/core/dbt/clients/registry.py b/core/dbt/clients/registry.py\n--- a/core/dbt/clients/registry.py\n+++ b/core/dbt/clients/registry.py\n@@ -28,11 +28,10 @@\n attempt += 1\n try:\n return fn(*args, **kwargs)\n- except requests.exceptions.ConnectionError as exc:\n+ except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as exc:\n if attempt < max_attempts:\n time.sleep(1)\n continue\n-\n raise RegistryException(\n 'Unable to connect to registry hub'\n ) from exc\n@@ -43,7 +42,7 @@\n def _get(path, registry_base_url=None):\n url = _get_url(path, registry_base_url)\n logger.debug('Making package registry request: GET {}'.format(url))\n- resp = requests.get(url)\n+ resp = requests.get(url, timeout=30)\n logger.debug('Response from registry: GET {} {}'.format(url,\n resp.status_code))\n resp.raise_for_status()\n", "issue": "git package: warn if specified revision master, main\nFollow up from #3057, #3104\r\n\r\nPreviously, if a git package was unpinned, dbt tried to install it from `master` (hard coded). In #3104, this was fixed to used `HEAD` (default branch, regardless of name) and continue to warn if unpinned.\r\n\r\nWe'd like to warn as well if the user specifies the package `revision` as `'master'` or `'main'`, since either almost certainly represents the default branch. Users can disable the warning with `warn-unpinned: false`.\r\n\r\n### Example\r\n\r\n```yml\r\npackages:\r\n - git: https://github.com/fishtown-analytics/dbt-codegen \r\n revision: master\r\n - git: https://github.com/tailsdotcom/dbt_artifacts \r\n revision: main\r\n - git: https://gitlab.com/gitlab-data/snowflake_spend\r\n - package: fishtown-analytics/audit_helper\r\n version: 0.3.0\r\n```\r\n\r\n<details>\r\n<summary> <code>$ dbt deps</code> </summary>\r\n\r\n```\r\nRunning with dbt=0.19.0\r\nWARNING: The git package \"https://github.com/fishtown-analytics/dbt-codegen\"\r\n\tis pinned to the \"master\" branch.\r\n\tThis can introduce breaking changes into your project without warning!\r\n\r\nSee https://docs.getdbt.com/docs/package-management#section-specifying-package-versions\r\nWARNING: The git package \"https://github.com/tailsdotcom/dbt_artifacts\"\r\n\tis pinned to the \"main\" branch.\r\n\tThis can introduce breaking changes into your project without warning!\r\n\r\nSee https://docs.getdbt.com/docs/package-management#section-specifying-package-versions\r\nWARNING: The git package \"https://gitlab.com/gitlab-data/snowflake_spend\"\r\n\tis not pinned, using HEAD (default branch).\r\n\tThis can introduce breaking changes into your project without warning!\r\n\r\nSee https://docs.getdbt.com/docs/package-management#section-specifying-package-versions\r\nWARNING: The git package \"https://github.com/fishtown-analytics/dbt-codegen\"\r\n\tis pinned to the \"master\" branch.\r\n\tThis can introduce breaking changes into your project without warning!\r\n\r\nSee https://docs.getdbt.com/docs/package-management#section-specifying-package-versions\r\nWARNING: The git package \"https://github.com/tailsdotcom/dbt_artifacts\"\r\n\tis pinned to the \"main\" branch.\r\n\tThis can introduce breaking changes into your project without warning!\r\n\r\nSee https://docs.getdbt.com/docs/package-management#section-specifying-package-versions\r\nWARNING: The git package \"https://gitlab.com/gitlab-data/snowflake_spend\"\r\n\tis not pinned, using HEAD (default branch).\r\n\tThis can introduce breaking changes into your project without warning!\r\n\r\nSee 
https://docs.getdbt.com/docs/package-management#section-specifying-package-versions\r\nInstalling https://github.com/fishtown-analytics/dbt-codegen@master\r\n Installed from revision master\r\n\r\nInstalling https://github.com/tailsdotcom/dbt_artifacts@main\r\n Installed from revision main\r\n\r\nInstalling https://gitlab.com/gitlab-data/snowflake_spend@HEAD\r\n Installed from HEAD (default branch)\r\n\r\nInstalling fishtown-analytics/[email protected]\r\n Installed from version 0.3.0\r\n\r\nInstalling fishtown-analytics/[email protected]\r\n Installed from version 0.6.4\r\n```\r\n\r\n</details>\r\n\r\n```yml\r\npackages:\r\n - git: https://github.com/fishtown-analytics/dbt-codegen \r\n revision: master\r\n warn-unpinned: false\r\n - git: https://github.com/tailsdotcom/dbt_artifacts \r\n revision: main\r\n warn-unpinned: false\r\n - git: https://gitlab.com/gitlab-data/snowflake_spend\r\n warn-unpinned: false\r\n - package: fishtown-analytics/audit_helper\r\n version: 0.3.0\r\n```\r\n\r\n<details>\r\n<summary> <code>$ dbt deps</code> </summary>\r\n\r\n```\r\nRunning with dbt=0.19.0\r\nInstalling https://github.com/fishtown-analytics/dbt-codegen@master\r\n Installed from revision master\r\n\r\nInstalling https://github.com/tailsdotcom/dbt_artifacts@main\r\n Installed from revision main\r\n\r\nInstalling https://gitlab.com/gitlab-data/snowflake_spend@HEAD\r\n Installed from HEAD (default branch)\r\n\r\nInstalling fishtown-analytics/[email protected]\r\n Installed from version 0.3.0\r\n\r\nInstalling fishtown-analytics/[email protected]\r\n Installed from version 0.6.4\r\n```\r\n</details>\r\n\r\n### Checklist\r\n - [x] I have signed the [CLA](https://docs.getdbt.com/docs/contributor-license-agreements)\r\n - [x] I have run this code in development and it appears to resolve the stated issue\r\n - ~This PR includes tests, or tests are not required/relevant for this PR~\r\n - [x] I have updated the `CHANGELOG.md` and added information about my change to the \"dbt next\" section.\r\n\n", "before_files": [{"content": "from functools import wraps\nimport requests\nfrom dbt.exceptions import RegistryException\nfrom dbt.utils import memoized\nfrom dbt.logger import GLOBAL_LOGGER as logger\nimport os\nimport time\n\nif os.getenv('DBT_PACKAGE_HUB_URL'):\n DEFAULT_REGISTRY_BASE_URL = os.getenv('DBT_PACKAGE_HUB_URL')\nelse:\n DEFAULT_REGISTRY_BASE_URL = 'https://hub.getdbt.com/'\n\n\ndef _get_url(url, registry_base_url=None):\n if registry_base_url is None:\n registry_base_url = DEFAULT_REGISTRY_BASE_URL\n\n return '{}{}'.format(registry_base_url, url)\n\n\ndef _wrap_exceptions(fn):\n @wraps(fn)\n def wrapper(*args, **kwargs):\n max_attempts = 5\n attempt = 0\n while True:\n attempt += 1\n try:\n return fn(*args, **kwargs)\n except requests.exceptions.ConnectionError as exc:\n if attempt < max_attempts:\n time.sleep(1)\n continue\n\n raise RegistryException(\n 'Unable to connect to registry hub'\n ) from exc\n return wrapper\n\n\n@_wrap_exceptions\ndef _get(path, registry_base_url=None):\n url = _get_url(path, registry_base_url)\n logger.debug('Making package registry request: GET {}'.format(url))\n resp = requests.get(url)\n logger.debug('Response from registry: GET {} {}'.format(url,\n resp.status_code))\n resp.raise_for_status()\n return resp.json()\n\n\ndef index(registry_base_url=None):\n return _get('api/v1/index.json', registry_base_url)\n\n\nindex_cached = memoized(index)\n\n\ndef packages(registry_base_url=None):\n return _get('api/v1/packages.json', registry_base_url)\n\n\ndef package(name, 
registry_base_url=None):\n return _get('api/v1/{}.json'.format(name), registry_base_url)\n\n\ndef package_version(name, version, registry_base_url=None):\n return _get('api/v1/{}/{}.json'.format(name, version), registry_base_url)\n\n\ndef get_available_versions(name):\n response = package(name)\n return list(response['versions'])\n", "path": "core/dbt/clients/registry.py"}], "after_files": [{"content": "from functools import wraps\nimport requests\nfrom dbt.exceptions import RegistryException\nfrom dbt.utils import memoized\nfrom dbt.logger import GLOBAL_LOGGER as logger\nimport os\nimport time\n\nif os.getenv('DBT_PACKAGE_HUB_URL'):\n DEFAULT_REGISTRY_BASE_URL = os.getenv('DBT_PACKAGE_HUB_URL')\nelse:\n DEFAULT_REGISTRY_BASE_URL = 'https://hub.getdbt.com/'\n\n\ndef _get_url(url, registry_base_url=None):\n if registry_base_url is None:\n registry_base_url = DEFAULT_REGISTRY_BASE_URL\n\n return '{}{}'.format(registry_base_url, url)\n\n\ndef _wrap_exceptions(fn):\n @wraps(fn)\n def wrapper(*args, **kwargs):\n max_attempts = 5\n attempt = 0\n while True:\n attempt += 1\n try:\n return fn(*args, **kwargs)\n except (requests.exceptions.ConnectionError, requests.exceptions.Timeout) as exc:\n if attempt < max_attempts:\n time.sleep(1)\n continue\n raise RegistryException(\n 'Unable to connect to registry hub'\n ) from exc\n return wrapper\n\n\n@_wrap_exceptions\ndef _get(path, registry_base_url=None):\n url = _get_url(path, registry_base_url)\n logger.debug('Making package registry request: GET {}'.format(url))\n resp = requests.get(url, timeout=30)\n logger.debug('Response from registry: GET {} {}'.format(url,\n resp.status_code))\n resp.raise_for_status()\n return resp.json()\n\n\ndef index(registry_base_url=None):\n return _get('api/v1/index.json', registry_base_url)\n\n\nindex_cached = memoized(index)\n\n\ndef packages(registry_base_url=None):\n return _get('api/v1/packages.json', registry_base_url)\n\n\ndef package(name, registry_base_url=None):\n return _get('api/v1/{}.json'.format(name), registry_base_url)\n\n\ndef package_version(name, version, registry_base_url=None):\n return _get('api/v1/{}/{}.json'.format(name, version), registry_base_url)\n\n\ndef get_available_versions(name):\n response = package(name)\n return list(response['versions'])\n", "path": "core/dbt/clients/registry.py"}]} | 1,983 | 232 |
gh_patches_debug_4023 | rasdani/github-patches | git_diff | pymedusa__Medusa-7842 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
API v2 - Authenticate Fail
I'm trying to use the APIv2 but the authorization fails.
What is wrong with my requests?
#############Request#############
POST /api/v2/authenticate HTTP/1.1
Host: 192.168.17.204:8081
User-Agent: FlexGet/3.1.15 (www.flexget.com)
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
Content-Length: 46
Content-Type: application/json
{"username": "medusa", "password": "password"}
#############Response#############
HTTP/1.1 200 OK
Server: TornadoServer/5.1.1
Content-Type: application/json; charset=UTF-8
Date: Thu, 06 Feb 2020 15:58:10 GMT
X-Medusa-Server: 0.3.11
Access-Control-Allow-Origin: *
Access-Control-Allow-Headers: Origin, Accept, Authorization, Content-Type, X-Requested-With, X-CSRF-Token, X-Api-Key, X-Medusa-Server
Access-Control-Allow-Methods: OPTIONS, POST
Content-Length: 297
Vary: Accept-Encoding
{"token": "b'eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJNZWR1c2EgMC4zLjExIiwiaWF0IjoxNTgxMDA0NjkwLCJqdGkiOiJwZ25rdXI2WDZrOEdRZjBleGc1OCIsImV4cCI6MTU4MTA5MTA5MCwidXNlcm5hbWUiOiJtZWR1c2EiLCJhcGlLZXkiOiI3NWVhYWM4ZTY3YzRhNWIyODQ5MmZmZjk3ODRjNDZhMCJ9.LU7fdfIU9wFVvg_nsJpPzUgOUQ8juPR0t6_uACfr3Zc'"}
#############Request#############
GET /api/v2/series?limit=1000 HTTP/1.1
Host: 192.168.17.204:8081
User-Agent: FlexGet/3.1.15 (www.flexget.com)
Accept-Encoding: gzip, deflate
Accept: */*
Connection: keep-alive
authorization: Bearer b'eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJNZWR1c2EgMC4zLjExIiwiaWF0IjoxNTgxMDA0NjkwLCJqdGkiOiJwZ25rdXI2WDZrOEdRZjBleGc1OCIsImV4cCI6MTU4MTA5MTA5MCwidXNlcm5hbWUiOiJtZWR1c2EiLCJhcGlLZXkiOiI3NWVhYWM4ZTY3YzRhNWIyODQ5MmZmZjk3ODRjNDZhMCJ9.LU7fdfIU9wFVvg_nsJpPzUgOUQ8juPR0t6_uACfr3Zc'
#############Response#############
HTTP/1.1 401 Unauthorized
Server: TornadoServer/5.1.1
Content-Type: application/json; charset=UTF-8
Date: Thu, 06 Feb 2020 15:58:10 GMT
X-Medusa-Server: 0.3.11
Access-Control-Allow-Origin: *
Access-Control-Allow-Headers: Origin, Accept, Authorization, Content-Type, X-Requested-With, X-CSRF-Token, X-Api-Key, X-Medusa-Server
Access-Control-Allow-Methods: OPTIONS, GET, POST, PATCH, DELETE
Content-Length: 27
Vary: Accept-Encoding
{"error": "Invalid token."}
--- END ISSUE ---
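The `b'...'` wrapper visible in both the token response and the Authorization header is the tell-tale sign of a `bytes` value being coerced with `str()`. A minimal reproduction, assuming PyJWT 1.x (which returns `bytes` from `jwt.encode`; PyJWT 2.x returns `str` directly):

```python
import jwt

token = jwt.encode({'some': 'payload'}, 'secret', algorithm='HS256')
print(type(token))            # <class 'bytes'> under PyJWT 1.x
print(str(token))             # "b'eyJ0eXAi...'" -- the malformed value above

# Decoding to text yields a token the server-side check can match:
print(token.decode('utf-8'))  # eyJ0eXAi...
```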
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `medusa/server/api/v2/auth.py`
Content:
```
1 # coding=utf-8
2 """Request handler for authentication."""
3 from __future__ import unicode_literals
4
5 import logging
6 import random
7 import string
8 import time
9 from builtins import range
10
11 import jwt
12
13 from medusa import app, helpers, notifiers
14 from medusa.logger.adapters.style import BraceAdapter
15 from medusa.server.api.v2.base import BaseRequestHandler
16
17 from six import text_type
18
19 from tornado.escape import json_decode
20
21 log = BraceAdapter(logging.getLogger(__name__))
22 log.logger.addHandler(logging.NullHandler())
23
24
25 class AuthHandler(BaseRequestHandler):
26 """Auth request handler."""
27
28 #: resource name
29 name = 'authenticate'
30 #: allowed HTTP methods
31 allowed_methods = ('POST', )
32
33 def _check_authentication(self):
34 """Override authentication check for the authentication endpoint."""
35 return None
36
37 def post(self, *args, **kwargs):
38 """Request JWT."""
39 username = app.WEB_USERNAME
40 password = app.WEB_PASSWORD
41
42 # If the user hasn't set a username and/or password just let them login
43 if not username.strip() or not password.strip():
44 return self._login()
45
46 if not self.request.body:
47 return self._failed_login(error='No Credentials Provided')
48
49 if self.request.headers['content-type'] != 'application/json':
50 return self._failed_login(error='Incorrect content-type')
51
52 request_body = json_decode(self.request.body)
53 submitted_username = request_body.get('username')
54 submitted_password = request_body.get('password')
55 submitted_exp = request_body.get('exp', 86400)
56 if username != submitted_username or password != submitted_password:
57 return self._failed_login(error='Invalid credentials')
58
59 return self._login(submitted_exp)
60
61 def _login(self, exp=86400):
62 self.set_header('Content-Type', 'application/json')
63 if app.NOTIFY_ON_LOGIN and not helpers.is_ip_private(self.request.remote_ip):
64 notifiers.notify_login(self.request.remote_ip)
65
66 log.info('{user} logged into the API v2', {'user': app.WEB_USERNAME})
67 time_now = int(time.time())
68 return self._ok(data={
69 'token': jwt.encode({
70 'iss': 'Medusa ' + text_type(app.APP_VERSION),
71 'iat': time_now,
72 # @TODO: The jti should be saved so we can revoke tokens
73 'jti': ''.join(random.choice(string.ascii_letters + string.digits) for _ in range(20)),
74 'exp': time_now + int(exp),
75 'username': app.WEB_USERNAME,
76 'apiKey': app.API_KEY
77 }, app.ENCRYPTION_SECRET, algorithm='HS256')
78 })
79
80 def _failed_login(self, error=None):
81 log.warning('{user} attempted a failed login to the API v2 from IP: {ip}', {
82 'user': app.WEB_USERNAME,
83 'ip': self.request.remote_ip
84 })
85 return self._unauthorized(error=error)
86
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/medusa/server/api/v2/auth.py b/medusa/server/api/v2/auth.py
--- a/medusa/server/api/v2/auth.py
+++ b/medusa/server/api/v2/auth.py
@@ -74,7 +74,7 @@
'exp': time_now + int(exp),
'username': app.WEB_USERNAME,
'apiKey': app.API_KEY
- }, app.ENCRYPTION_SECRET, algorithm='HS256')
+ }, app.ENCRYPTION_SECRET, algorithm='HS256').decode('utf-8')
})
def _failed_login(self, error=None):
| {"golden_diff": "diff --git a/medusa/server/api/v2/auth.py b/medusa/server/api/v2/auth.py\n--- a/medusa/server/api/v2/auth.py\n+++ b/medusa/server/api/v2/auth.py\n@@ -74,7 +74,7 @@\n 'exp': time_now + int(exp),\n 'username': app.WEB_USERNAME,\n 'apiKey': app.API_KEY\n- }, app.ENCRYPTION_SECRET, algorithm='HS256')\n+ }, app.ENCRYPTION_SECRET, algorithm='HS256').decode('utf-8')\n })\n \n def _failed_login(self, error=None):\n", "issue": "API v2 - Authenticate Fail\nI'm trying to use the APIv2 but the authorization fails.\r\nWhat is wrong with my resquests?\r\n\r\n#############Request#############\r\nPOST /api/v2/authenticate HTTP/1.1\r\nHost: 192.168.17.204:8081\r\nUser-Agent: FlexGet/3.1.15 (www.flexget.com)\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\nContent-Length: 46\r\nContent-Type: application/json\r\n\r\n{\"username\": \"medusa\", \"password\": \"password\"}\r\n\r\n#############Response#############\r\nHTTP/1.1 200 OK\r\nServer: TornadoServer/5.1.1\r\nContent-Type: application/json; charset=UTF-8\r\nDate: Thu, 06 Feb 2020 15:58:10 GMT\r\nX-Medusa-Server: 0.3.11\r\nAccess-Control-Allow-Origin: *\r\nAccess-Control-Allow-Headers: Origin, Accept, Authorization, Content-Type, X-Requested-With, X-CSRF-Token, X-Api-Key, X-Medusa-Server\r\nAccess-Control-Allow-Methods: OPTIONS, POST\r\nContent-Length: 297\r\nVary: Accept-Encoding\r\n\r\n{\"token\": \"b'eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJNZWR1c2EgMC4zLjExIiwiaWF0IjoxNTgxMDA0NjkwLCJqdGkiOiJwZ25rdXI2WDZrOEdRZjBleGc1OCIsImV4cCI6MTU4MTA5MTA5MCwidXNlcm5hbWUiOiJtZWR1c2EiLCJhcGlLZXkiOiI3NWVhYWM4ZTY3YzRhNWIyODQ5MmZmZjk3ODRjNDZhMCJ9.LU7fdfIU9wFVvg_nsJpPzUgOUQ8juPR0t6_uACfr3Zc'\"}\r\n\r\n\r\n\r\n\r\n#############Request#############\r\nGET /api/v2/series?limit=1000 HTTP/1.1\r\nHost: 192.168.17.204:8081\r\nUser-Agent: FlexGet/3.1.15 (www.flexget.com)\r\nAccept-Encoding: gzip, deflate\r\nAccept: */*\r\nConnection: keep-alive\r\nauthorization: Bearer b'eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9.eyJpc3MiOiJNZWR1c2EgMC4zLjExIiwiaWF0IjoxNTgxMDA0NjkwLCJqdGkiOiJwZ25rdXI2WDZrOEdRZjBleGc1OCIsImV4cCI6MTU4MTA5MTA5MCwidXNlcm5hbWUiOiJtZWR1c2EiLCJhcGlLZXkiOiI3NWVhYWM4ZTY3YzRhNWIyODQ5MmZmZjk3ODRjNDZhMCJ9.LU7fdfIU9wFVvg_nsJpPzUgOUQ8juPR0t6_uACfr3Zc'\r\n\r\n#############Response#############\r\nHTTP/1.1 401 Unauthorized\r\nServer: TornadoServer/5.1.1\r\nContent-Type: application/json; charset=UTF-8\r\nDate: Thu, 06 Feb 2020 15:58:10 GMT\r\nX-Medusa-Server: 0.3.11\r\nAccess-Control-Allow-Origin: *\r\nAccess-Control-Allow-Headers: Origin, Accept, Authorization, Content-Type, X-Requested-With, X-CSRF-Token, X-Api-Key, X-Medusa-Server\r\nAccess-Control-Allow-Methods: OPTIONS, GET, POST, PATCH, DELETE\r\nContent-Length: 27\r\nVary: Accept-Encoding\r\n\r\n{\"error\": \"Invalid token.\"}\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"Request handler for authentication.\"\"\"\nfrom __future__ import unicode_literals\n\nimport logging\nimport random\nimport string\nimport time\nfrom builtins import range\n\nimport jwt\n\nfrom medusa import app, helpers, notifiers\nfrom medusa.logger.adapters.style import BraceAdapter\nfrom medusa.server.api.v2.base import BaseRequestHandler\n\nfrom six import text_type\n\nfrom tornado.escape import json_decode\n\nlog = BraceAdapter(logging.getLogger(__name__))\nlog.logger.addHandler(logging.NullHandler())\n\n\nclass AuthHandler(BaseRequestHandler):\n \"\"\"Auth request handler.\"\"\"\n\n #: resource name\n name = 'authenticate'\n #: allowed HTTP methods\n allowed_methods = ('POST', )\n\n def 
_check_authentication(self):\n \"\"\"Override authentication check for the authentication endpoint.\"\"\"\n return None\n\n def post(self, *args, **kwargs):\n \"\"\"Request JWT.\"\"\"\n username = app.WEB_USERNAME\n password = app.WEB_PASSWORD\n\n # If the user hasn't set a username and/or password just let them login\n if not username.strip() or not password.strip():\n return self._login()\n\n if not self.request.body:\n return self._failed_login(error='No Credentials Provided')\n\n if self.request.headers['content-type'] != 'application/json':\n return self._failed_login(error='Incorrect content-type')\n\n request_body = json_decode(self.request.body)\n submitted_username = request_body.get('username')\n submitted_password = request_body.get('password')\n submitted_exp = request_body.get('exp', 86400)\n if username != submitted_username or password != submitted_password:\n return self._failed_login(error='Invalid credentials')\n\n return self._login(submitted_exp)\n\n def _login(self, exp=86400):\n self.set_header('Content-Type', 'application/json')\n if app.NOTIFY_ON_LOGIN and not helpers.is_ip_private(self.request.remote_ip):\n notifiers.notify_login(self.request.remote_ip)\n\n log.info('{user} logged into the API v2', {'user': app.WEB_USERNAME})\n time_now = int(time.time())\n return self._ok(data={\n 'token': jwt.encode({\n 'iss': 'Medusa ' + text_type(app.APP_VERSION),\n 'iat': time_now,\n # @TODO: The jti should be saved so we can revoke tokens\n 'jti': ''.join(random.choice(string.ascii_letters + string.digits) for _ in range(20)),\n 'exp': time_now + int(exp),\n 'username': app.WEB_USERNAME,\n 'apiKey': app.API_KEY\n }, app.ENCRYPTION_SECRET, algorithm='HS256')\n })\n\n def _failed_login(self, error=None):\n log.warning('{user} attempted a failed login to the API v2 from IP: {ip}', {\n 'user': app.WEB_USERNAME,\n 'ip': self.request.remote_ip\n })\n return self._unauthorized(error=error)\n", "path": "medusa/server/api/v2/auth.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"Request handler for authentication.\"\"\"\nfrom __future__ import unicode_literals\n\nimport logging\nimport random\nimport string\nimport time\nfrom builtins import range\n\nimport jwt\n\nfrom medusa import app, helpers, notifiers\nfrom medusa.logger.adapters.style import BraceAdapter\nfrom medusa.server.api.v2.base import BaseRequestHandler\n\nfrom six import text_type\n\nfrom tornado.escape import json_decode\n\nlog = BraceAdapter(logging.getLogger(__name__))\nlog.logger.addHandler(logging.NullHandler())\n\n\nclass AuthHandler(BaseRequestHandler):\n \"\"\"Auth request handler.\"\"\"\n\n #: resource name\n name = 'authenticate'\n #: allowed HTTP methods\n allowed_methods = ('POST', )\n\n def _check_authentication(self):\n \"\"\"Override authentication check for the authentication endpoint.\"\"\"\n return None\n\n def post(self, *args, **kwargs):\n \"\"\"Request JWT.\"\"\"\n username = app.WEB_USERNAME\n password = app.WEB_PASSWORD\n\n # If the user hasn't set a username and/or password just let them login\n if not username.strip() or not password.strip():\n return self._login()\n\n if not self.request.body:\n return self._failed_login(error='No Credentials Provided')\n\n if self.request.headers['content-type'] != 'application/json':\n return self._failed_login(error='Incorrect content-type')\n\n request_body = json_decode(self.request.body)\n submitted_username = request_body.get('username')\n submitted_password = request_body.get('password')\n submitted_exp = request_body.get('exp', 86400)\n if 
username != submitted_username or password != submitted_password:\n return self._failed_login(error='Invalid credentials')\n\n return self._login(submitted_exp)\n\n def _login(self, exp=86400):\n self.set_header('Content-Type', 'application/json')\n if app.NOTIFY_ON_LOGIN and not helpers.is_ip_private(self.request.remote_ip):\n notifiers.notify_login(self.request.remote_ip)\n\n log.info('{user} logged into the API v2', {'user': app.WEB_USERNAME})\n time_now = int(time.time())\n return self._ok(data={\n 'token': jwt.encode({\n 'iss': 'Medusa ' + text_type(app.APP_VERSION),\n 'iat': time_now,\n # @TODO: The jti should be saved so we can revoke tokens\n 'jti': ''.join(random.choice(string.ascii_letters + string.digits) for _ in range(20)),\n 'exp': time_now + int(exp),\n 'username': app.WEB_USERNAME,\n 'apiKey': app.API_KEY\n }, app.ENCRYPTION_SECRET, algorithm='HS256').decode('utf-8')\n })\n\n def _failed_login(self, error=None):\n log.warning('{user} attempted a failed login to the API v2 from IP: {ip}', {\n 'user': app.WEB_USERNAME,\n 'ip': self.request.remote_ip\n })\n return self._unauthorized(error=error)\n", "path": "medusa/server/api/v2/auth.py"}]} | 1,990 | 138 |
gh_patches_debug_20683 | rasdani/github-patches | git_diff | cornellius-gp__gpytorch-133 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
RBF Kernel Change Breaks Testing Code
The change to RBFKernel in 84fccd898c45c08279fb5c109e6e234f3a47588a may break something about our prediction code.
I am not totally sure what the problem is yet, but I isolated this commit as the cause with `git bisect`, and I have a reasonable test case where results are significantly worse with the commit applied than after a revert commit.
It seems like the stability issues we encountered when making this change in the past don't come up in the unit tests, but do on some real datasets.
I can try to push my test case to a branch as well, although it relies on a UCI dataset.
@Balandat @gpleiss
--- END ISSUE ---
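One plausible mechanism, sketched numerically (this is an illustration of the eps-handling difference, not a reproduction of the actual commit): adding `eps` to the lengthscale and merely clamping it behave identically for moderate values but diverge once the learned log-lengthscale is very negative, a regime that toy unit tests rarely reach:

```python
import math

# Effective lengthscale under two stabilization schemes.
for log_ls in (0.0, -5.0, -15.0):
    ls = math.exp(log_ls)
    added = ls + 1e-5        # eps added unconditionally
    clamped = max(ls, 1e-6)  # eps only as a floor
    print(f"log_ls={log_ls:6.1f}  exp+eps={added:.3e}  clamped={clamped:.3e}")
# At log_ls = -15 the two effective lengthscales differ by roughly 10x,
# which changes the kernel (and predictions) dramatically on real data.
```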
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gpytorch/kernels/rbf_kernel.py`
Content:
```
1 from __future__ import absolute_import
2 from __future__ import division
3 from __future__ import print_function
4 from __future__ import unicode_literals
5
6 from .kernel import Kernel
7
8
9 class RBFKernel(Kernel):
10
11 def __init__(
12 self,
13 ard_num_dims=None,
14 log_lengthscale_bounds=(-10000, 10000),
15 eps=1e-5,
16 active_dims=None,
17 ):
18 super(RBFKernel, self).__init__(
19 has_lengthscale=True,
20 ard_num_dims=ard_num_dims,
21 log_lengthscale_bounds=log_lengthscale_bounds,
22 active_dims=active_dims,
23 )
24 self.eps = eps
25
26 def forward(self, x1, x2):
27 lengthscales = self.log_lengthscale.exp() + self.eps
28 diff = (x1.unsqueeze(2) - x2.unsqueeze(1)).div_(lengthscales)
29 return diff.pow_(2).sum(-1).mul_(-0.5).exp_()
30
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gpytorch/kernels/rbf_kernel.py b/gpytorch/kernels/rbf_kernel.py
--- a/gpytorch/kernels/rbf_kernel.py
+++ b/gpytorch/kernels/rbf_kernel.py
@@ -3,6 +3,7 @@
from __future__ import print_function
from __future__ import unicode_literals
+import math
from .kernel import Kernel
@@ -12,7 +13,7 @@
self,
ard_num_dims=None,
log_lengthscale_bounds=(-10000, 10000),
- eps=1e-5,
+ eps=1e-6,
active_dims=None,
):
super(RBFKernel, self).__init__(
@@ -24,6 +25,6 @@
self.eps = eps
def forward(self, x1, x2):
- lengthscales = self.log_lengthscale.exp() + self.eps
+ lengthscales = self.log_lengthscale.exp().mul(math.sqrt(2)).clamp(self.eps, 1e5)
diff = (x1.unsqueeze(2) - x2.unsqueeze(1)).div_(lengthscales)
- return diff.pow_(2).sum(-1).mul_(-0.5).exp_()
+ return diff.pow_(2).sum(-1).mul_(-1).exp_()
| {"golden_diff": "diff --git a/gpytorch/kernels/rbf_kernel.py b/gpytorch/kernels/rbf_kernel.py\n--- a/gpytorch/kernels/rbf_kernel.py\n+++ b/gpytorch/kernels/rbf_kernel.py\n@@ -3,6 +3,7 @@\n from __future__ import print_function\n from __future__ import unicode_literals\n \n+import math\n from .kernel import Kernel\n \n \n@@ -12,7 +13,7 @@\n self,\n ard_num_dims=None,\n log_lengthscale_bounds=(-10000, 10000),\n- eps=1e-5,\n+ eps=1e-6,\n active_dims=None,\n ):\n super(RBFKernel, self).__init__(\n@@ -24,6 +25,6 @@\n self.eps = eps\n \n def forward(self, x1, x2):\n- lengthscales = self.log_lengthscale.exp() + self.eps\n+ lengthscales = self.log_lengthscale.exp().mul(math.sqrt(2)).clamp(self.eps, 1e5)\n diff = (x1.unsqueeze(2) - x2.unsqueeze(1)).div_(lengthscales)\n- return diff.pow_(2).sum(-1).mul_(-0.5).exp_()\n+ return diff.pow_(2).sum(-1).mul_(-1).exp_()\n", "issue": "RBF Kernel Change Breaks Testing Code\nThe change to RBFKernel in 84fccd898c45c08279fb5c109e6e234f3a47588a may break something about our prediction code. \r\n\r\nI am not totally sure what the problem is yet, but I isolated this as the problem with `git bisect` and have a reasonable test case where results are significantly worse with the commit in compared to after a revert commit. \r\n\r\nIt seems like the stability issues we encountered when making this change in the past don't come up in the unit tests, but do on some real datasets.\r\n\r\nI can try to push my test case to a branch as well, although it relies on a UCI dataset.\r\n\r\n@Balandat @gpleiss \n", "before_files": [{"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nfrom .kernel import Kernel\n\n\nclass RBFKernel(Kernel):\n\n def __init__(\n self,\n ard_num_dims=None,\n log_lengthscale_bounds=(-10000, 10000),\n eps=1e-5,\n active_dims=None,\n ):\n super(RBFKernel, self).__init__(\n has_lengthscale=True,\n ard_num_dims=ard_num_dims,\n log_lengthscale_bounds=log_lengthscale_bounds,\n active_dims=active_dims,\n )\n self.eps = eps\n\n def forward(self, x1, x2):\n lengthscales = self.log_lengthscale.exp() + self.eps\n diff = (x1.unsqueeze(2) - x2.unsqueeze(1)).div_(lengthscales)\n return diff.pow_(2).sum(-1).mul_(-0.5).exp_()\n", "path": "gpytorch/kernels/rbf_kernel.py"}], "after_files": [{"content": "from __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport math\nfrom .kernel import Kernel\n\n\nclass RBFKernel(Kernel):\n\n def __init__(\n self,\n ard_num_dims=None,\n log_lengthscale_bounds=(-10000, 10000),\n eps=1e-6,\n active_dims=None,\n ):\n super(RBFKernel, self).__init__(\n has_lengthscale=True,\n ard_num_dims=ard_num_dims,\n log_lengthscale_bounds=log_lengthscale_bounds,\n active_dims=active_dims,\n )\n self.eps = eps\n\n def forward(self, x1, x2):\n lengthscales = self.log_lengthscale.exp().mul(math.sqrt(2)).clamp(self.eps, 1e5)\n diff = (x1.unsqueeze(2) - x2.unsqueeze(1)).div_(lengthscales)\n return diff.pow_(2).sum(-1).mul_(-1).exp_()\n", "path": "gpytorch/kernels/rbf_kernel.py"}]} | 702 | 293 |
gh_patches_debug_706 | rasdani/github-patches | git_diff | deepset-ai__haystack-3705 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bad Semaphore initialization in RequestLimiter
**Describe the bug**
RequestLimiter takes a number as a parameter and uses it to set up a Semaphore. The issue is that the environment variable indicates the allowed concurrent requests per worker, but when the semaphore is created (https://github.com/deepset-ai/haystack/blob/6790eaf7d8be05c5674d97a75cc5783e00a66875/rest_api/rest_api/controller/utils.py#L13), this value is decremented by 1. This is clearly not what the project tried to achieve (at least per the naming).
**Error message**
The REST API will always return 503 ("busy") once the number of in-flight requests reaches CONCURRENT_REQUEST_PER_WORKER - 1, i.e. one request too early. When the user sets the concurrency to 1, the API can never be called at all, since the semaphore declaration becomes Semaphore(0).
**Expected behavior**
Being able to set the request limits using the env variable CONCURRENT_REQUEST_PER_WORKER
**Additional context**
**To Reproduce**
**FAQ Check**
- [x] Have you had a look at [our new FAQ page](https://haystack.deepset.ai/overview/faq)?
**System:**
- OS: Ubuntu
- GPU/CPU: i7/ Nvidia
- Haystack version (commit or version number): 1.9
- DocumentStore:
- Reader:
- Retriever:
--- END ISSUE ---
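The core claim is easy to verify in isolation; a minimal sketch:

```python
from threading import Semaphore

limit = 1                    # CONCURRENT_REQUEST_PER_WORKER = 1
sem = Semaphore(limit - 1)   # Semaphore(0), as in the buggy initializer
print(sem.acquire(blocking=False))  # False -> every request is answered 503
```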
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `rest_api/rest_api/controller/utils.py`
Content:
```
1 from typing import Type, NewType
2
3 import inspect
4 from contextlib import contextmanager
5 from threading import Semaphore
6
7 from fastapi import Form, HTTPException
8 from pydantic import BaseModel
9
10
11 class RequestLimiter:
12 def __init__(self, limit):
13 self.semaphore = Semaphore(limit - 1)
14
15 @contextmanager
16 def run(self):
17 acquired = self.semaphore.acquire(blocking=False)
18 if not acquired:
19 raise HTTPException(status_code=503, detail="The server is busy processing requests.")
20 try:
21 yield acquired
22 finally:
23 self.semaphore.release()
24
25
26 StringId = NewType("StringId", str)
27
28
29 def as_form(cls: Type[BaseModel]):
30 """
31 Adds an as_form class method to decorated models. The as_form class method
32 can be used with FastAPI endpoints
33 """
34 new_params = [
35 inspect.Parameter(
36 field.alias,
37 inspect.Parameter.POSITIONAL_ONLY,
38 default=(Form(field.default) if not field.required else Form(...)),
39 )
40 for field in cls.__fields__.values()
41 ]
42
43 async def _as_form(**data):
44 return cls(**data)
45
46 sig = inspect.signature(_as_form)
47 sig = sig.replace(parameters=new_params)
48 _as_form.__signature__ = sig # type: ignore
49 setattr(cls, "as_form", _as_form)
50 return cls
51
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/rest_api/rest_api/controller/utils.py b/rest_api/rest_api/controller/utils.py
--- a/rest_api/rest_api/controller/utils.py
+++ b/rest_api/rest_api/controller/utils.py
@@ -10,7 +10,7 @@
class RequestLimiter:
def __init__(self, limit):
- self.semaphore = Semaphore(limit - 1)
+ self.semaphore = Semaphore(limit)
@contextmanager
def run(self):
| {"golden_diff": "diff --git a/rest_api/rest_api/controller/utils.py b/rest_api/rest_api/controller/utils.py\n--- a/rest_api/rest_api/controller/utils.py\n+++ b/rest_api/rest_api/controller/utils.py\n@@ -10,7 +10,7 @@\n \n class RequestLimiter:\n def __init__(self, limit):\n- self.semaphore = Semaphore(limit - 1)\n+ self.semaphore = Semaphore(limit)\n \n @contextmanager\n def run(self):\n", "issue": "Bad Semaphore initialization in RequestLimiter\n**Describe the bug**\r\nRequestLimiter takes a number as parameter and use it to set up a Semaphore. The issue is that the environment variable indicates the concurrent allowed requests per worker. When the semaphore is created (https://github.com/deepset-ai/haystack/blob/6790eaf7d8be05c5674d97a75cc5783e00a66875/rest_api/rest_api/controller/utils.py#L13), this value is set down by 1. This is clearly not what the project tried to achieve (at least per naming). \r\n\r\n**Error message**\r\nREST API will always return it's busy, error 503 when CONCURRENT_REQUEST_PER_WORKER is equal to CONCURRENT_REQUEST_PER_WORKER -1. When user set the concurrency to 1, it will never be able to call the API, since the Semaphore declaration will be Semaphore(0)\r\n\r\n**Expected behavior**\r\nBeing able to set the request limits using the env variable CONCURRENT_REQUEST_PER_WORKER\r\n\r\n**Additional context**\r\n\r\n\r\n**To Reproduce**\r\n\r\n\r\n**FAQ Check**\r\n- [x] Have you had a look at [our new FAQ page](https://haystack.deepset.ai/overview/faq)?\r\n\r\n**System:**\r\n - OS: Ubuntu\r\n - GPU/CPU: i7/ Nvidia\r\n - Haystack version (commit or version number): 1.9\r\n - DocumentStore:\r\n - Reader:\r\n - Retriever:\r\n\n", "before_files": [{"content": "from typing import Type, NewType\n\nimport inspect\nfrom contextlib import contextmanager\nfrom threading import Semaphore\n\nfrom fastapi import Form, HTTPException\nfrom pydantic import BaseModel\n\n\nclass RequestLimiter:\n def __init__(self, limit):\n self.semaphore = Semaphore(limit - 1)\n\n @contextmanager\n def run(self):\n acquired = self.semaphore.acquire(blocking=False)\n if not acquired:\n raise HTTPException(status_code=503, detail=\"The server is busy processing requests.\")\n try:\n yield acquired\n finally:\n self.semaphore.release()\n\n\nStringId = NewType(\"StringId\", str)\n\n\ndef as_form(cls: Type[BaseModel]):\n \"\"\"\n Adds an as_form class method to decorated models. 
The as_form class method\n can be used with FastAPI endpoints\n \"\"\"\n new_params = [\n inspect.Parameter(\n field.alias,\n inspect.Parameter.POSITIONAL_ONLY,\n default=(Form(field.default) if not field.required else Form(...)),\n )\n for field in cls.__fields__.values()\n ]\n\n async def _as_form(**data):\n return cls(**data)\n\n sig = inspect.signature(_as_form)\n sig = sig.replace(parameters=new_params)\n _as_form.__signature__ = sig # type: ignore\n setattr(cls, \"as_form\", _as_form)\n return cls\n", "path": "rest_api/rest_api/controller/utils.py"}], "after_files": [{"content": "from typing import Type, NewType\n\nimport inspect\nfrom contextlib import contextmanager\nfrom threading import Semaphore\n\nfrom fastapi import Form, HTTPException\nfrom pydantic import BaseModel\n\n\nclass RequestLimiter:\n def __init__(self, limit):\n self.semaphore = Semaphore(limit)\n\n @contextmanager\n def run(self):\n acquired = self.semaphore.acquire(blocking=False)\n if not acquired:\n raise HTTPException(status_code=503, detail=\"The server is busy processing requests.\")\n try:\n yield acquired\n finally:\n self.semaphore.release()\n\n\nStringId = NewType(\"StringId\", str)\n\n\ndef as_form(cls: Type[BaseModel]):\n \"\"\"\n Adds an as_form class method to decorated models. The as_form class method\n can be used with FastAPI endpoints\n \"\"\"\n new_params = [\n inspect.Parameter(\n field.alias,\n inspect.Parameter.POSITIONAL_ONLY,\n default=(Form(field.default) if not field.required else Form(...)),\n )\n for field in cls.__fields__.values()\n ]\n\n async def _as_form(**data):\n return cls(**data)\n\n sig = inspect.signature(_as_form)\n sig = sig.replace(parameters=new_params)\n _as_form.__signature__ = sig # type: ignore\n setattr(cls, \"as_form\", _as_form)\n return cls\n", "path": "rest_api/rest_api/controller/utils.py"}]} | 975 | 99 |
gh_patches_debug_468 | rasdani/github-patches | git_diff | chainer__chainer-1562 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
inconsistency between cupy.vstack and numpy.vstack
```
In [1]: import cupy, numpy
In [2]: a = cupy.arange(12).reshape(3, 4)
In [3]: cupy.vstack([a])
Out[3]: array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11])
In [4]: numpy.vstack([a.get()])
Out[4]:
array([[ 0, 1, 2, 3],
[ 4, 5, 6, 7],
[ 8, 9, 10, 11]])
```
--- END ISSUE ---
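The discrepancy traces back to how `atleast_2d` behaves with a single argument; NumPy (whose convention CuPy follows) demonstrates it:

```python
import numpy

a = numpy.arange(12).reshape(3, 4)
print(type(numpy.atleast_2d(a)))     # <class 'numpy.ndarray'> (single argument)
print(type(numpy.atleast_2d(a, a)))  # <class 'list'> (multiple arguments)

# Concatenating a bare 2-D array along axis 0 joins its *rows*,
# which is exactly how vstack([a]) collapses to a 1-D result.
print(numpy.concatenate(numpy.atleast_2d(a), 0))  # [ 0  1 ... 11]
```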
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cupy/manipulation/join.py`
Content:
```
1 import numpy
2 import six
3
4 import cupy
5
6
7 def column_stack(tup):
8 """Stacks 1-D and 2-D arrays as columns into a 2-D array.
9
10 A 1-D array is first converted to a 2-D column array. Then, the 2-D arrays
11 are concatenated along the second axis.
12
13 Args:
14 tup (sequence of arrays): 1-D or 2-D arrays to be stacked.
15
16 Returns:
17 cupy.ndarray: A new 2-D array of stacked columns.
18
19 .. seealso:: :func:`numpy.column_stack`
20
21 """
22 if any(not isinstance(a, cupy.ndarray) for a in tup):
23 raise TypeError('Only cupy arrays can be column stacked')
24
25 lst = list(tup)
26 for i, a in enumerate(lst):
27 if a.ndim == 1:
28 a = a[:, cupy.newaxis]
29 lst[i] = a
30 elif a.ndim != 2:
31 raise ValueError(
32 'Only 1 or 2 dimensional arrays can be column stacked')
33
34 return concatenate(lst, axis=1)
35
36
37 def concatenate(tup, axis=0):
38 """Joins arrays along an axis.
39
40 Args:
41 tup (sequence of arrays): Arrays to be joined. All of these should have
42 same dimensionalities except the specified axis.
43 axis (int): The axis to join arrays along.
44
45 Returns:
46 cupy.ndarray: Joined array.
47
48 .. seealso:: :func:`numpy.concatenate`
49
50 """
51 ndim = None
52 shape = None
53 for a in tup:
54 if not isinstance(a, cupy.ndarray):
55 raise TypeError('Only cupy arrays can be concatenated')
56 if a.ndim == 0:
57 raise TypeError('zero-dimensional arrays cannot be concatenated')
58 if ndim is None:
59 ndim = a.ndim
60 shape = list(a.shape)
61 axis = _get_positive_axis(a.ndim, axis)
62 continue
63
64 if a.ndim != ndim:
65 raise ValueError(
66 'All arrays to concatenate must have the same ndim')
67 if any(i != axis and shape[i] != a.shape[i]
68 for i in six.moves.range(ndim)):
69 raise ValueError(
70 'All arrays must have same shape except the axis to '
71 'concatenate')
72 shape[axis] += a.shape[axis]
73
74 if ndim is None:
75 raise ValueError('Cannot concatenate from empty tuple')
76
77 dtype = numpy.find_common_type([a.dtype for a in tup], [])
78 ret = cupy.empty(shape, dtype=dtype)
79
80 skip = (slice(None),) * axis
81 i = 0
82 for a in tup:
83 aw = a.shape[axis]
84 ret[skip + (slice(i, i + aw),)] = a
85 i += aw
86
87 return ret
88
89
90 def dstack(tup):
91 """Stacks arrays along the third axis.
92
93 Args:
94 tup (sequence of arrays): Arrays to be stacked. Each array is converted
95 by :func:`cupy.atleast_3d` before stacking.
96
97 Returns:
98 cupy.ndarray: Stacked array.
99
100 .. seealso:: :func:`numpy.dstack`
101
102 """
103 return concatenate(cupy.atleast_3d(*tup), 2)
104
105
106 def hstack(tup):
107 """Stacks arrays horizontally.
108
109 If an input array has one dimension, then the array is treated as a
110 horizontal vector and stacked along the first axis. Otherwise, the array is
111 stacked along the second axis.
112
113 Args:
114 tup (sequence of arrays): Arrays to be stacked.
115
116 Returns:
117 cupy.ndarray: Stacked array.
118
119 .. seealso:: :func:`numpy.hstack`
120
121 """
122 arrs = [cupy.atleast_1d(a) for a in tup]
123 axis = 1
124 if arrs[0].ndim == 1:
125 axis = 0
126 return concatenate(arrs, axis)
127
128
129 def vstack(tup):
130 """Stacks arrays vertically.
131
132 If an input array has one dimension, then the array is treated as a
133 horizontal vector and stacked along the additional axis at the head.
134 Otherwise, the array is stacked along the first axis.
135
136 Args:
137 tup (sequence of arrays): Arrays to be stacked. Each array is converted
138 by :func:`cupy.atleast_2d` before stacking.
139
140 Returns:
141 cupy.ndarray: Stacked array.
142
143 .. seealso:: :func:`numpy.dstack`
144
145 """
146 return concatenate(cupy.atleast_2d(*tup), 0)
147
148
149 def stack(tup, axis=0):
150 """Stacks arrays along a new axis.
151
152 Args:
153 tup (sequence of arrays): Arrays to be stacked.
154 axis (int): Axis along which the arrays are stacked.
155
156 Returns:
157 cupy.ndarray: Stacked array.
158
159 .. seealso:: :func:`numpy.stack`
160 """
161 return concatenate([cupy.expand_dims(x, axis) for x in tup], axis)
162
163
164 def _get_positive_axis(ndim, axis):
165 a = axis
166 if a < 0:
167 a += ndim
168 if a < 0 or a >= ndim:
169 raise IndexError('axis {} out of bounds [0, {})'.format(axis, ndim))
170 return a
171
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cupy/manipulation/join.py b/cupy/manipulation/join.py
--- a/cupy/manipulation/join.py
+++ b/cupy/manipulation/join.py
@@ -143,7 +143,7 @@
.. seealso:: :func:`numpy.dstack`
"""
- return concatenate(cupy.atleast_2d(*tup), 0)
+ return concatenate([cupy.atleast_2d(m) for m in tup], 0)
def stack(tup, axis=0):
| {"golden_diff": "diff --git a/cupy/manipulation/join.py b/cupy/manipulation/join.py\n--- a/cupy/manipulation/join.py\n+++ b/cupy/manipulation/join.py\n@@ -143,7 +143,7 @@\n .. seealso:: :func:`numpy.dstack`\n \n \"\"\"\n- return concatenate(cupy.atleast_2d(*tup), 0)\n+ return concatenate([cupy.atleast_2d(m) for m in tup], 0)\n \n \n def stack(tup, axis=0):\n", "issue": "inconsistency between cupy.vstack and numpy.vstack\n```\nIn [1]: import cupy, numpy\nIn [2]: a = cupy.arange(12).reshape(3, 4)\nIn [3]: cupy.vstack([a])\nOut[3]: array([ 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11])\nIn [4]: numpy.vstack([a.get()])\nOut[4]: \narray([[ 0, 1, 2, 3],\n [ 4, 5, 6, 7],\n [ 8, 9, 10, 11]])\n```\n\n", "before_files": [{"content": "import numpy\nimport six\n\nimport cupy\n\n\ndef column_stack(tup):\n \"\"\"Stacks 1-D and 2-D arrays as columns into a 2-D array.\n\n A 1-D array is first converted to a 2-D column array. Then, the 2-D arrays\n are concatenated along the second axis.\n\n Args:\n tup (sequence of arrays): 1-D or 2-D arrays to be stacked.\n\n Returns:\n cupy.ndarray: A new 2-D array of stacked columns.\n\n .. seealso:: :func:`numpy.column_stack`\n\n \"\"\"\n if any(not isinstance(a, cupy.ndarray) for a in tup):\n raise TypeError('Only cupy arrays can be column stacked')\n\n lst = list(tup)\n for i, a in enumerate(lst):\n if a.ndim == 1:\n a = a[:, cupy.newaxis]\n lst[i] = a\n elif a.ndim != 2:\n raise ValueError(\n 'Only 1 or 2 dimensional arrays can be column stacked')\n\n return concatenate(lst, axis=1)\n\n\ndef concatenate(tup, axis=0):\n \"\"\"Joins arrays along an axis.\n\n Args:\n tup (sequence of arrays): Arrays to be joined. All of these should have\n same dimensionalities except the specified axis.\n axis (int): The axis to join arrays along.\n\n Returns:\n cupy.ndarray: Joined array.\n\n .. seealso:: :func:`numpy.concatenate`\n\n \"\"\"\n ndim = None\n shape = None\n for a in tup:\n if not isinstance(a, cupy.ndarray):\n raise TypeError('Only cupy arrays can be concatenated')\n if a.ndim == 0:\n raise TypeError('zero-dimensional arrays cannot be concatenated')\n if ndim is None:\n ndim = a.ndim\n shape = list(a.shape)\n axis = _get_positive_axis(a.ndim, axis)\n continue\n\n if a.ndim != ndim:\n raise ValueError(\n 'All arrays to concatenate must have the same ndim')\n if any(i != axis and shape[i] != a.shape[i]\n for i in six.moves.range(ndim)):\n raise ValueError(\n 'All arrays must have same shape except the axis to '\n 'concatenate')\n shape[axis] += a.shape[axis]\n\n if ndim is None:\n raise ValueError('Cannot concatenate from empty tuple')\n\n dtype = numpy.find_common_type([a.dtype for a in tup], [])\n ret = cupy.empty(shape, dtype=dtype)\n\n skip = (slice(None),) * axis\n i = 0\n for a in tup:\n aw = a.shape[axis]\n ret[skip + (slice(i, i + aw),)] = a\n i += aw\n\n return ret\n\n\ndef dstack(tup):\n \"\"\"Stacks arrays along the third axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked. Each array is converted\n by :func:`cupy.atleast_3d` before stacking.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.dstack`\n\n \"\"\"\n return concatenate(cupy.atleast_3d(*tup), 2)\n\n\ndef hstack(tup):\n \"\"\"Stacks arrays horizontally.\n\n If an input array has one dimension, then the array is treated as a\n horizontal vector and stacked along the first axis. Otherwise, the array is\n stacked along the second axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. 
seealso:: :func:`numpy.hstack`\n\n \"\"\"\n arrs = [cupy.atleast_1d(a) for a in tup]\n axis = 1\n if arrs[0].ndim == 1:\n axis = 0\n return concatenate(arrs, axis)\n\n\ndef vstack(tup):\n \"\"\"Stacks arrays vertically.\n\n If an input array has one dimension, then the array is treated as a\n horizontal vector and stacked along the additional axis at the head.\n Otherwise, the array is stacked along the first axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked. Each array is converted\n by :func:`cupy.atleast_2d` before stacking.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.dstack`\n\n \"\"\"\n return concatenate(cupy.atleast_2d(*tup), 0)\n\n\ndef stack(tup, axis=0):\n \"\"\"Stacks arrays along a new axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked.\n axis (int): Axis along which the arrays are stacked.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.stack`\n \"\"\"\n return concatenate([cupy.expand_dims(x, axis) for x in tup], axis)\n\n\ndef _get_positive_axis(ndim, axis):\n a = axis\n if a < 0:\n a += ndim\n if a < 0 or a >= ndim:\n raise IndexError('axis {} out of bounds [0, {})'.format(axis, ndim))\n return a\n", "path": "cupy/manipulation/join.py"}], "after_files": [{"content": "import numpy\nimport six\n\nimport cupy\n\n\ndef column_stack(tup):\n \"\"\"Stacks 1-D and 2-D arrays as columns into a 2-D array.\n\n A 1-D array is first converted to a 2-D column array. Then, the 2-D arrays\n are concatenated along the second axis.\n\n Args:\n tup (sequence of arrays): 1-D or 2-D arrays to be stacked.\n\n Returns:\n cupy.ndarray: A new 2-D array of stacked columns.\n\n .. seealso:: :func:`numpy.column_stack`\n\n \"\"\"\n if any(not isinstance(a, cupy.ndarray) for a in tup):\n raise TypeError('Only cupy arrays can be column stacked')\n\n lst = list(tup)\n for i, a in enumerate(lst):\n if a.ndim == 1:\n a = a[:, cupy.newaxis]\n lst[i] = a\n elif a.ndim != 2:\n raise ValueError(\n 'Only 1 or 2 dimensional arrays can be column stacked')\n\n return concatenate(lst, axis=1)\n\n\ndef concatenate(tup, axis=0):\n \"\"\"Joins arrays along an axis.\n\n Args:\n tup (sequence of arrays): Arrays to be joined. All of these should have\n same dimensionalities except the specified axis.\n axis (int): The axis to join arrays along.\n\n Returns:\n cupy.ndarray: Joined array.\n\n .. seealso:: :func:`numpy.concatenate`\n\n \"\"\"\n ndim = None\n shape = None\n for a in tup:\n if not isinstance(a, cupy.ndarray):\n raise TypeError('Only cupy arrays can be concatenated')\n if a.ndim == 0:\n raise TypeError('zero-dimensional arrays cannot be concatenated')\n if ndim is None:\n ndim = a.ndim\n shape = list(a.shape)\n axis = _get_positive_axis(a.ndim, axis)\n continue\n\n if a.ndim != ndim:\n raise ValueError(\n 'All arrays to concatenate must have the same ndim')\n if any(i != axis and shape[i] != a.shape[i]\n for i in six.moves.range(ndim)):\n raise ValueError(\n 'All arrays must have same shape except the axis to '\n 'concatenate')\n shape[axis] += a.shape[axis]\n\n if ndim is None:\n raise ValueError('Cannot concatenate from empty tuple')\n\n dtype = numpy.find_common_type([a.dtype for a in tup], [])\n ret = cupy.empty(shape, dtype=dtype)\n\n skip = (slice(None),) * axis\n i = 0\n for a in tup:\n aw = a.shape[axis]\n ret[skip + (slice(i, i + aw),)] = a\n i += aw\n\n return ret\n\n\ndef dstack(tup):\n \"\"\"Stacks arrays along the third axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked. 
Each array is converted\n by :func:`cupy.atleast_3d` before stacking.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.dstack`\n\n \"\"\"\n return concatenate(cupy.atleast_3d(*tup), 2)\n\n\ndef hstack(tup):\n \"\"\"Stacks arrays horizontally.\n\n If an input array has one dimension, then the array is treated as a\n horizontal vector and stacked along the first axis. Otherwise, the array is\n stacked along the second axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.hstack`\n\n \"\"\"\n arrs = [cupy.atleast_1d(a) for a in tup]\n axis = 1\n if arrs[0].ndim == 1:\n axis = 0\n return concatenate(arrs, axis)\n\n\ndef vstack(tup):\n \"\"\"Stacks arrays vertically.\n\n If an input array has one dimension, then the array is treated as a\n horizontal vector and stacked along the additional axis at the head.\n Otherwise, the array is stacked along the first axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked. Each array is converted\n by :func:`cupy.atleast_2d` before stacking.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.dstack`\n\n \"\"\"\n return concatenate([cupy.atleast_2d(m) for m in tup], 0)\n\n\ndef stack(tup, axis=0):\n \"\"\"Stacks arrays along a new axis.\n\n Args:\n tup (sequence of arrays): Arrays to be stacked.\n axis (int): Axis along which the arrays are stacked.\n\n Returns:\n cupy.ndarray: Stacked array.\n\n .. seealso:: :func:`numpy.stack`\n \"\"\"\n return concatenate([cupy.expand_dims(x, axis) for x in tup], axis)\n\n\ndef _get_positive_axis(ndim, axis):\n a = axis\n if a < 0:\n a += ndim\n if a < 0 or a >= ndim:\n raise IndexError('axis {} out of bounds [0, {})'.format(axis, ndim))\n return a\n", "path": "cupy/manipulation/join.py"}]} | 2,007 | 123 |
gh_patches_debug_30321 | rasdani/github-patches | git_diff | pwndbg__pwndbg-433 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
bug in function get_load_segment_info in readelf.py
when I got a readelf result like the one below
```
readelf --program-headers /bin/ls | grep "LOAD" -A 10
LOAD 0x0000000000000000 0x0000000000400000 0x0000000000400000
0x000000000001da64 0x000000000001da64 R E 200000
```
the function crashed at line 65
```
65 fsize, msize, read, write, execute, align = re_secnd.match(line).groups()
```
the reason is in the regex format
```python
re_secnd = re.compile(r"\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\s+(0x[0-9A-Fa-f]+)")
```
I mean, "0x" should not a absolute prefix of align number
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwndbg/wrappers/readelf.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3 from __future__ import absolute_import
4 from __future__ import division
5 from __future__ import print_function
6 from __future__ import unicode_literals
7
8 import re
9
10 import pwndbg.wrappers
11
12 cmd_name = "readelf"
13
14 @pwndbg.wrappers.OnlyWithCommand(cmd_name)
15 def get_jmpslots():
16 local_path = pwndbg.file.get_file(pwndbg.proc.exe)
17 cmd = [get_jmpslots.cmd_path, "--relocs", local_path]
18 readelf_out = pwndbg.wrappers.call_cmd(cmd)
19
20 return filter(_extract_jumps, readelf_out.splitlines())
21
22 def _extract_jumps(line):
23 '''
24 Checks for records in `readelf --relocs <binary>` which has type e.g. `R_X86_64_JUMP_SLO`
25 NOTE: Because of that we DO NOT display entries that are not writeable (due to FULL RELRO)
26 as they have `R_X86_64_GLOB_DAT` type.
27
28 It might be good to display them seperately in the future.
29 '''
30 try:
31 if "JUMP" in line.split()[2]:
32 return line
33 else:
34 return False
35 except IndexError:
36 return False
37
38 @pwndbg.wrappers.OnlyWithCommand(cmd_name)
39 def get_load_segment_info():
40 '''
41 Looks for LOAD sections by parsing the output of `readelf --program-headers <binary>`
42 '''
43 local_path = pwndbg.file.get_file(pwndbg.proc.exe)
44 cmd = [get_jmpslots.cmd_path, "--program-headers", local_path]
45 readelf_out = pwndbg.wrappers.call_cmd(cmd)
46
47 segments = []
48 load_found = False
49
50 # Output from readelf is
51 # Type Offset VirtAddr PhysAddr
52 # FileSiz MemSiz Flags Align
53 # LOAD 0x0000000000000000 0x0000000000000000 0x0000000000000000
54 # 0x0000000000000830 0x0000000000000830 R E 0x200000
55 # Account for this using two regular expressions
56 re_first = re.compile(r"\s+LOAD\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+)")
57 re_secnd = re.compile(r"\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\s+(0x[0-9A-Fa-f]+)")
58 hex2int = lambda x: int(x, 16)
59
60 for line in readelf_out.splitlines():
61 if "LOAD" in line:
62 load_found = True
63 offset, vaddr, paddr = map(hex2int, re_first.match(line).groups())
64 elif load_found:
65 fsize, msize, read, write, execute, align = re_secnd.match(line).groups()
66 fsize, msize, align = map(hex2int, (fsize, msize, align))
67 read = read == "R"
68 write = write == "W"
69 execute = execute == "E"
70
71 segments.append({"Offset": offset,
72 "VirtAddr": vaddr,
73 "PhysAddr": paddr,
74 "FileSiz": fsize,
75 "MemSiz": msize,
76 "FlagsRead": read,
77 "FlagsWrite": write,
78 "FlagsExecute": execute})
79
80 load_found = False
81
82 return segments
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pwndbg/wrappers/readelf.py b/pwndbg/wrappers/readelf.py
--- a/pwndbg/wrappers/readelf.py
+++ b/pwndbg/wrappers/readelf.py
@@ -52,9 +52,15 @@
# FileSiz MemSiz Flags Align
# LOAD 0x0000000000000000 0x0000000000000000 0x0000000000000000
# 0x0000000000000830 0x0000000000000830 R E 0x200000
+ #
+ ############################################################################
+ #
+ # NOTE: On some readelf versions the Align column might not be prefixed with 0x
+ # See https://github.com/pwndbg/pwndbg/issues/427
+ #
# Account for this using two regular expressions
re_first = re.compile(r"\s+LOAD\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+)")
- re_secnd = re.compile(r"\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\s+(0x[0-9A-Fa-f]+)")
+ re_secnd = re.compile(r"\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\s+(0x)?([0-9A-Fa-f]+)")
hex2int = lambda x: int(x, 16)
for line in readelf_out.splitlines():
@@ -62,8 +68,8 @@
load_found = True
offset, vaddr, paddr = map(hex2int, re_first.match(line).groups())
elif load_found:
- fsize, msize, read, write, execute, align = re_secnd.match(line).groups()
- fsize, msize, align = map(hex2int, (fsize, msize, align))
+ fsize, msize, read, write, execute, _optional_prefix, align = re_secnd.match(line).groups()
+ fsize, msize, align = map(hex2int, (fsize, msize, '0x' + align))
read = read == "R"
write = write == "W"
execute = execute == "E"
| {"golden_diff": "diff --git a/pwndbg/wrappers/readelf.py b/pwndbg/wrappers/readelf.py\n--- a/pwndbg/wrappers/readelf.py\n+++ b/pwndbg/wrappers/readelf.py\n@@ -52,9 +52,15 @@\n # FileSiz MemSiz Flags Align\n # LOAD 0x0000000000000000 0x0000000000000000 0x0000000000000000\n # 0x0000000000000830 0x0000000000000830 R E 0x200000\n+ #\n+ ############################################################################\n+ #\n+ # NOTE: On some readelf versions the Align column might not be prefixed with 0x\n+ # See https://github.com/pwndbg/pwndbg/issues/427\n+ #\n # Account for this using two regular expressions\n re_first = re.compile(r\"\\s+LOAD\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+)\")\n- re_secnd = re.compile(r\"\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\\s+(0x[0-9A-Fa-f]+)\")\n+ re_secnd = re.compile(r\"\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\\s+(0x)?([0-9A-Fa-f]+)\")\n hex2int = lambda x: int(x, 16)\n \n for line in readelf_out.splitlines():\n@@ -62,8 +68,8 @@\n load_found = True\n offset, vaddr, paddr = map(hex2int, re_first.match(line).groups())\n elif load_found:\n- fsize, msize, read, write, execute, align = re_secnd.match(line).groups()\n- fsize, msize, align = map(hex2int, (fsize, msize, align))\n+ fsize, msize, read, write, execute, _optional_prefix, align = re_secnd.match(line).groups()\n+ fsize, msize, align = map(hex2int, (fsize, msize, '0x' + align))\n read = read == \"R\"\n write = write == \"W\"\n execute = execute == \"E\"\n", "issue": "bug of function get_load_segment_info in readelf.py\nwhen I got readelf result like below\r\n```\r\nreadelf --program-headers /bin/ls | grep \"LOAD\" -A 10\r\n LOAD 0x0000000000000000 0x0000000000400000 0x0000000000400000\r\n 0x000000000001da64 0x000000000001da64 R E 200000\r\n```\r\nthe function crashed at line 65\r\n```\r\n65 fsize, msize, read, write, execute, align = re_secnd.match(line).groups()\r\n```\r\nthe reason is in the regex format\r\n```python\r\nre_secnd = re.compile(r\"\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\\s+(**0x**[0-9A-Fa-f]+)\"\r\n```\r\nI mean, \"0x\" should not a absolute prefix of align number\n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport re\n\nimport pwndbg.wrappers\n\ncmd_name = \"readelf\"\n\[email protected](cmd_name)\ndef get_jmpslots():\n local_path = pwndbg.file.get_file(pwndbg.proc.exe)\n cmd = [get_jmpslots.cmd_path, \"--relocs\", local_path]\n readelf_out = pwndbg.wrappers.call_cmd(cmd)\n\n return filter(_extract_jumps, readelf_out.splitlines())\n\ndef _extract_jumps(line):\n '''\n Checks for records in `readelf --relocs <binary>` which has type e.g. 
`R_X86_64_JUMP_SLO`\n NOTE: Because of that we DO NOT display entries that are not writeable (due to FULL RELRO)\n as they have `R_X86_64_GLOB_DAT` type.\n\n It might be good to display them seperately in the future.\n '''\n try:\n if \"JUMP\" in line.split()[2]:\n return line\n else:\n return False\n except IndexError:\n return False\n\[email protected](cmd_name)\ndef get_load_segment_info():\n '''\n Looks for LOAD sections by parsing the output of `readelf --program-headers <binary>`\n '''\n local_path = pwndbg.file.get_file(pwndbg.proc.exe)\n cmd = [get_jmpslots.cmd_path, \"--program-headers\", local_path]\n readelf_out = pwndbg.wrappers.call_cmd(cmd)\n\n segments = []\n load_found = False\n\n # Output from readelf is \n # Type Offset VirtAddr PhysAddr\n # FileSiz MemSiz Flags Align\n # LOAD 0x0000000000000000 0x0000000000000000 0x0000000000000000\n # 0x0000000000000830 0x0000000000000830 R E 0x200000\n # Account for this using two regular expressions\n re_first = re.compile(r\"\\s+LOAD\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+)\")\n re_secnd = re.compile(r\"\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\\s+(0x[0-9A-Fa-f]+)\")\n hex2int = lambda x: int(x, 16)\n\n for line in readelf_out.splitlines():\n if \"LOAD\" in line:\n load_found = True\n offset, vaddr, paddr = map(hex2int, re_first.match(line).groups())\n elif load_found:\n fsize, msize, read, write, execute, align = re_secnd.match(line).groups()\n fsize, msize, align = map(hex2int, (fsize, msize, align))\n read = read == \"R\"\n write = write == \"W\"\n execute = execute == \"E\"\n\n segments.append({\"Offset\": offset,\n \"VirtAddr\": vaddr,\n \"PhysAddr\": paddr,\n \"FileSiz\": fsize,\n \"MemSiz\": msize,\n \"FlagsRead\": read,\n \"FlagsWrite\": write,\n \"FlagsExecute\": execute})\n\n load_found = False\n\n return segments\n", "path": "pwndbg/wrappers/readelf.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\nfrom __future__ import unicode_literals\n\nimport re\n\nimport pwndbg.wrappers\n\ncmd_name = \"readelf\"\n\[email protected](cmd_name)\ndef get_jmpslots():\n local_path = pwndbg.file.get_file(pwndbg.proc.exe)\n cmd = [get_jmpslots.cmd_path, \"--relocs\", local_path]\n readelf_out = pwndbg.wrappers.call_cmd(cmd)\n\n return filter(_extract_jumps, readelf_out.splitlines())\n\ndef _extract_jumps(line):\n '''\n Checks for records in `readelf --relocs <binary>` which has type e.g. 
`R_X86_64_JUMP_SLO`\n NOTE: Because of that we DO NOT display entries that are not writeable (due to FULL RELRO)\n as they have `R_X86_64_GLOB_DAT` type.\n\n It might be good to display them seperately in the future.\n '''\n try:\n if \"JUMP\" in line.split()[2]:\n return line\n else:\n return False\n except IndexError:\n return False\n\[email protected](cmd_name)\ndef get_load_segment_info():\n '''\n Looks for LOAD sections by parsing the output of `readelf --program-headers <binary>`\n '''\n local_path = pwndbg.file.get_file(pwndbg.proc.exe)\n cmd = [get_jmpslots.cmd_path, \"--program-headers\", local_path]\n readelf_out = pwndbg.wrappers.call_cmd(cmd)\n\n segments = []\n load_found = False\n\n # Output from readelf is \n # Type Offset VirtAddr PhysAddr\n # FileSiz MemSiz Flags Align\n # LOAD 0x0000000000000000 0x0000000000000000 0x0000000000000000\n # 0x0000000000000830 0x0000000000000830 R E 0x200000\n #\n ############################################################################\n #\n # NOTE: On some readelf versions the Align column might not be prefixed with 0x\n # See https://github.com/pwndbg/pwndbg/issues/427\n #\n # Account for this using two regular expressions\n re_first = re.compile(r\"\\s+LOAD\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+)\")\n re_secnd = re.compile(r\"\\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\\s+(0x)?([0-9A-Fa-f]+)\")\n hex2int = lambda x: int(x, 16)\n\n for line in readelf_out.splitlines():\n if \"LOAD\" in line:\n load_found = True\n offset, vaddr, paddr = map(hex2int, re_first.match(line).groups())\n elif load_found:\n fsize, msize, read, write, execute, _optional_prefix, align = re_secnd.match(line).groups()\n fsize, msize, align = map(hex2int, (fsize, msize, '0x' + align))\n read = read == \"R\"\n write = write == \"W\"\n execute = execute == \"E\"\n\n segments.append({\"Offset\": offset,\n \"VirtAddr\": vaddr,\n \"PhysAddr\": paddr,\n \"FileSiz\": fsize,\n \"MemSiz\": msize,\n \"FlagsRead\": read,\n \"FlagsWrite\": write,\n \"FlagsExecute\": execute})\n\n load_found = False\n\n return segments\n", "path": "pwndbg/wrappers/readelf.py"}]} | 1,574 | 625 |
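For readers who want to poke at the fix above in isolation: the essential change is making the `0x` prefix optional in the second regex and re-adding it before the base-16 conversion. The sketch below is self-contained; the sample lines are hand-written stand-ins for readelf output, not captured output.

```python
import re

# Same pattern as the patched re_secnd: the trailing Align field may or
# may not carry a 0x prefix depending on the readelf version.
re_secnd = re.compile(
    r"\s+(0x[0-9A-Fa-f]+) (0x[0-9A-Fa-f]+) (.)(.)(.)\s+(0x)?([0-9A-Fa-f]+)")

samples = [
    "                 0x0000000000000830 0x0000000000000830 R E    0x200000",
    "                 0x000000000001da64 0x000000000001da64 R E    200000",
]

for line in samples:
    fsize, msize, read, write, execute, _prefix, align = \
        re_secnd.match(line).groups()
    # int(x, 16) accepts both "200000" and "0x200000"; prepending "0x"
    # unconditionally, as the patch does, keeps a single code path.
    print(int(fsize, 16), int(msize, 16), int("0x" + align, 16),
          read == "R", write == "W", execute == "E")
```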
gh_patches_debug_17862 | rasdani/github-patches | git_diff | kivy__kivy-2700 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SDL2 - crash on AsyncImage loading a gif?
relevant log:
Traceback (most recent call last):
File "/home/chozabu/git/KivEntEd/main.py", line 1289, in <module>
KivEntEd().run()
File "/usr/local/lib/python2.7/dist-packages/kivy/app.py", line 825, in run
runTouchApp()
File "/usr/local/lib/python2.7/dist-packages/kivy/base.py", line 484, in runTouchApp
EventLoop.window.mainloop()
File "/usr/local/lib/python2.7/dist-packages/kivy/core/window/window_pygame.py", line 364, in mainloop
self._mainloop()
File "/usr/local/lib/python2.7/dist-packages/kivy/core/window/window_pygame.py", line 268, in _mainloop
EventLoop.idle()
File "/usr/local/lib/python2.7/dist-packages/kivy/base.py", line 324, in idle
Clock.tick()
File "/usr/local/lib/python2.7/dist-packages/kivy/clock.py", line 482, in tick
self._process_events()
File "/usr/local/lib/python2.7/dist-packages/kivy/clock.py", line 614, in _process_events
event.tick(self._last_tick, remove)
File "/usr/local/lib/python2.7/dist-packages/kivy/clock.py", line 373, in tick
ret = callback(self._dt)
File "/home/chozabu/git/KivEntEd/ui_elements.py", line 121, in initUI
self.screenShot.source = serverURL+"/downloadSS?fullname="+self.info['filename']+".png"
File "kivy/properties.pyx", line 377, in kivy.properties.Property.__set__ (kivy/properties.c:4346)
File "kivy/properties.pyx", line 409, in kivy.properties.Property.set (kivy/properties.c:4861)
File "kivy/properties.pyx", line 460, in kivy.properties.Property.dispatch (kivy/properties.c:5437)
File "kivy/_event.pyx", line 1046, in kivy._event.EventObservers.dispatch (kivy/_event.c:10980)
File "/usr/local/lib/python2.7/dist-packages/kivy/uix/image.py", line 327, in _load_source
anim_delay=self.anim_delay)
File "/usr/local/lib/python2.7/dist-packages/kivy/loader.py", line 432, in image
client = ProxyImage(self.loading_image,
File "/usr/local/lib/python2.7/dist-packages/kivy/loader.py", line 163, in _get_loading_image
self._loading_image = ImageLoader.load(filename=loading_png_fn)
File "/usr/local/lib/python2.7/dist-packages/kivy/core/image/__init__.py", line 385, in load
im = loader(filename, **kwargs)
File "/usr/local/lib/python2.7/dist-packages/kivy/core/image/**init**.py", line 164, in **init**
self._data = self.load(filename)
File "/usr/local/lib/python2.7/dist-packages/kivy/core/image/img_sdl2.py", line 34, in load
raise Exception('SDL2: Unable to load image')
Exception: SDL2: Unable to load image
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kivy/core/image/img_sdl2.py`
Content:
```
1 '''
2 SDL2 image loader
3 =================
4 '''
5
6 __all__ = ('ImageLoaderSDL2', )
7
8 from kivy.compat import PY2
9 from kivy.logger import Logger
10 from kivy.core.image import ImageLoaderBase, ImageData, ImageLoader
11 from kivy.core.image import _img_sdl2
12
13
14 class ImageLoaderSDL2(ImageLoaderBase):
15 '''Image loader based on the PIL library'''
16
17 def _ensure_ext(self):
18 _img_sdl2.init()
19
20 @staticmethod
21 def extensions():
22 '''Return accepted extensions for this loader'''
23 return ('bmp', 'gif', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',
24 'webp', 'xcf', 'xpm', 'xv')
25
26 @staticmethod
27 def can_save():
28 return True
29
30 def load(self, filename):
31 info = _img_sdl2.load(filename)
32 if not info:
33 Logger.warning('Image: Unable to load image <%s>' % filename)
34 raise Exception('SDL2: Unable to load image')
35
36 w, h, fmt, pixels, rowlength = info
37
38 # update internals
39 self.filename = filename
40 return [ImageData(
41 w, h, fmt, pixels, source=filename,
42 rowlength=rowlength)]
43
44 @staticmethod
45 def save(filename, width, height, fmt, pixels, flipped):
46 # TODO implement the save for sdl2
47 #surface = SDL2.image.fromstring(
48 # pixels, (width, height), fmt.upper(), False)
49 #SDL2.image.save(surface, filename)
50 _img_sdl2.save(filename, width, height, fmt, pixels, flipped)
51 return True
52
53
54 # register
55 ImageLoader.register(ImageLoaderSDL2)
56
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kivy/core/image/img_sdl2.py b/kivy/core/image/img_sdl2.py
--- a/kivy/core/image/img_sdl2.py
+++ b/kivy/core/image/img_sdl2.py
@@ -20,7 +20,7 @@
@staticmethod
def extensions():
'''Return accepted extensions for this loader'''
- return ('bmp', 'gif', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',
+ return ('bmp', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',
'webp', 'xcf', 'xpm', 'xv')
@staticmethod
@@ -43,10 +43,6 @@
@staticmethod
def save(filename, width, height, fmt, pixels, flipped):
- # TODO implement the save for sdl2
- #surface = SDL2.image.fromstring(
- # pixels, (width, height), fmt.upper(), False)
- #SDL2.image.save(surface, filename)
_img_sdl2.save(filename, width, height, fmt, pixels, flipped)
return True
| {"golden_diff": "diff --git a/kivy/core/image/img_sdl2.py b/kivy/core/image/img_sdl2.py\n--- a/kivy/core/image/img_sdl2.py\n+++ b/kivy/core/image/img_sdl2.py\n@@ -20,7 +20,7 @@\n @staticmethod\n def extensions():\n '''Return accepted extensions for this loader'''\n- return ('bmp', 'gif', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',\n+ return ('bmp', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',\n 'webp', 'xcf', 'xpm', 'xv')\n \n @staticmethod\n@@ -43,10 +43,6 @@\n \n @staticmethod\n def save(filename, width, height, fmt, pixels, flipped):\n- # TODO implement the save for sdl2\n- #surface = SDL2.image.fromstring(\n- # pixels, (width, height), fmt.upper(), False)\n- #SDL2.image.save(surface, filename)\n _img_sdl2.save(filename, width, height, fmt, pixels, flipped)\n return True\n", "issue": "SDL2 - crash on loading asyncimage loading gif?\nrelevant log:\n\nTraceback (most recent call last):\n File \"/home/chozabu/git/KivEntEd/main.py\", line 1289, in <module>\n KivEntEd().run()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/app.py\", line 825, in run\n runTouchApp()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/base.py\", line 484, in runTouchApp\n EventLoop.window.mainloop()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/core/window/window_pygame.py\", line 364, in mainloop\n self._mainloop()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/core/window/window_pygame.py\", line 268, in _mainloop\n EventLoop.idle()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/base.py\", line 324, in idle\n Clock.tick()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/clock.py\", line 482, in tick\n self._process_events()\n File \"/usr/local/lib/python2.7/dist-packages/kivy/clock.py\", line 614, in _process_events\n event.tick(self._last_tick, remove)\n File \"/usr/local/lib/python2.7/dist-packages/kivy/clock.py\", line 373, in tick\n ret = callback(self._dt)\n File \"/home/chozabu/git/KivEntEd/ui_elements.py\", line 121, in initUI\n self.screenShot.source = serverURL+\"/downloadSS?fullname=\"+self.info['filename']+\".png\"\n File \"kivy/properties.pyx\", line 377, in kivy.properties.Property.__set__ (kivy/properties.c:4346)\n File \"kivy/properties.pyx\", line 409, in kivy.properties.Property.set (kivy/properties.c:4861)\n File \"kivy/properties.pyx\", line 460, in kivy.properties.Property.dispatch (kivy/properties.c:5437)\n File \"kivy/_event.pyx\", line 1046, in kivy._event.EventObservers.dispatch (kivy/_event.c:10980)\n File \"/usr/local/lib/python2.7/dist-packages/kivy/uix/image.py\", line 327, in _load_source\n anim_delay=self.anim_delay)\n File \"/usr/local/lib/python2.7/dist-packages/kivy/loader.py\", line 432, in image\n client = ProxyImage(self.loading_image,\n File \"/usr/local/lib/python2.7/dist-packages/kivy/loader.py\", line 163, in _get_loading_image\n self._loading_image = ImageLoader.load(filename=loading_png_fn)\n File \"/usr/local/lib/python2.7/dist-packages/kivy/core/image/__init__.py\", line 385, in load\n im = loader(filename, **kwargs)\n File \"/usr/local/lib/python2.7/dist-packages/kivy/core/image/**init**.py\", line 164, in **init**\n self._data = self.load(filename)\n File \"/usr/local/lib/python2.7/dist-packages/kivy/core/image/img_sdl2.py\", line 34, in load\n raise Exception('SDL2: Unable to load image')\nException: SDL2: Unable to load image\n\n", "before_files": [{"content": "'''\nSDL2 image loader\n=================\n'''\n\n__all__ = ('ImageLoaderSDL2', )\n\nfrom kivy.compat import PY2\nfrom kivy.logger import 
Logger\nfrom kivy.core.image import ImageLoaderBase, ImageData, ImageLoader\nfrom kivy.core.image import _img_sdl2\n\n\nclass ImageLoaderSDL2(ImageLoaderBase):\n '''Image loader based on the PIL library'''\n\n def _ensure_ext(self):\n _img_sdl2.init()\n\n @staticmethod\n def extensions():\n '''Return accepted extensions for this loader'''\n return ('bmp', 'gif', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',\n 'webp', 'xcf', 'xpm', 'xv')\n\n @staticmethod\n def can_save():\n return True\n\n def load(self, filename):\n info = _img_sdl2.load(filename)\n if not info:\n Logger.warning('Image: Unable to load image <%s>' % filename)\n raise Exception('SDL2: Unable to load image')\n\n w, h, fmt, pixels, rowlength = info\n\n # update internals\n self.filename = filename\n return [ImageData(\n w, h, fmt, pixels, source=filename,\n rowlength=rowlength)]\n\n @staticmethod\n def save(filename, width, height, fmt, pixels, flipped):\n # TODO implement the save for sdl2\n #surface = SDL2.image.fromstring(\n # pixels, (width, height), fmt.upper(), False)\n #SDL2.image.save(surface, filename)\n _img_sdl2.save(filename, width, height, fmt, pixels, flipped)\n return True\n\n\n# register\nImageLoader.register(ImageLoaderSDL2)\n", "path": "kivy/core/image/img_sdl2.py"}], "after_files": [{"content": "'''\nSDL2 image loader\n=================\n'''\n\n__all__ = ('ImageLoaderSDL2', )\n\nfrom kivy.compat import PY2\nfrom kivy.logger import Logger\nfrom kivy.core.image import ImageLoaderBase, ImageData, ImageLoader\nfrom kivy.core.image import _img_sdl2\n\n\nclass ImageLoaderSDL2(ImageLoaderBase):\n '''Image loader based on the PIL library'''\n\n def _ensure_ext(self):\n _img_sdl2.init()\n\n @staticmethod\n def extensions():\n '''Return accepted extensions for this loader'''\n return ('bmp', 'jpg', 'jpeg', 'lbm', 'pcx', 'png', 'pnm', 'tga', 'tiff',\n 'webp', 'xcf', 'xpm', 'xv')\n\n @staticmethod\n def can_save():\n return True\n\n def load(self, filename):\n info = _img_sdl2.load(filename)\n if not info:\n Logger.warning('Image: Unable to load image <%s>' % filename)\n raise Exception('SDL2: Unable to load image')\n\n w, h, fmt, pixels, rowlength = info\n\n # update internals\n self.filename = filename\n return [ImageData(\n w, h, fmt, pixels, source=filename,\n rowlength=rowlength)]\n\n @staticmethod\n def save(filename, width, height, fmt, pixels, flipped):\n _img_sdl2.save(filename, width, height, fmt, pixels, flipped)\n return True\n\n\n# register\nImageLoader.register(ImageLoaderSDL2)\n", "path": "kivy/core/image/img_sdl2.py"}]} | 1,532 | 276 |
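Why dropping `'gif'` from the extension list fixes the crash: Kivy picks the first registered loader whose `extensions()` contains the file suffix, and the SDL2 backend was claiming `.gif` files (including the loading spinner used by `AsyncImage`) that it could not actually decode. The sketch below strips that dispatch down to a toy registry; the class and method names are simplified stand-ins, not Kivy's real API.

```python
class ImageLoader:
    """Toy registry mimicking kivy.core.image.ImageLoader's dispatch."""
    loaders = []

    @classmethod
    def register(cls, loader_cls):
        cls.loaders.append(loader_cls)

    @classmethod
    def load(cls, filename):
        ext = filename.rsplit('.', 1)[-1].lower()
        for loader_cls in cls.loaders:
            if ext in loader_cls.extensions():
                return loader_cls().load(filename)  # first claimant wins
        raise Exception('no loader for %r' % filename)


class SDL2Loader:
    @staticmethod
    def extensions():
        # After the patch 'gif' is gone, so .gif falls through...
        return ('bmp', 'jpg', 'jpeg', 'png')

    def load(self, filename):
        return 'sdl2 decoded %s' % filename


class PILLoader:
    @staticmethod
    def extensions():
        return ('gif', 'png')  # ...to a later loader that can handle it.

    def load(self, filename):
        return 'pil decoded %s' % filename


ImageLoader.register(SDL2Loader)
ImageLoader.register(PILLoader)
print(ImageLoader.load('loading.gif'))  # pil decoded loading.gif
```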
gh_patches_debug_19620 | rasdani/github-patches | git_diff | sopel-irc__sopel-2166 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
announce error on python3
<!-- Before reporting a bug, please search both open *and closed* issues to
see if it has already been reported. If you can, try to reproduce the problem
on an unmodified copy of the `master` branch first, as sometimes bugs are found
and fixed without a report. If the problem is unreported and persists in
`master`, please help us fix it quickly by filling out as much of this
information as you can. Thanks! -->
### Description
.announce results in an error on Python 3
### Reproduction steps
1. Set up an instance on Python 3 (specifically, I got the error on 3.7)
2. Try to use .announce
3. there will be a error
### Expected behavior
Works without errors.
### Logs
```
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: [2021-07-22 17:04:51,684] sopel.bot ERROR - Unexpected error ('dict_keys' object is not subscriptable) from MacFan4000 at 2
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: Traceback (most recent call last):
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File "/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/bot.py", line 757, in call_rule
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: rule.execute(sopel, trigger)
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File "/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/plugins/rules.py", line 1057, in execute
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: exit_code = self._handler(bot, trigger)
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File "/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/plugin.py", line 1071, in guarded
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: return function(bot, trigger, *args, **kwargs)
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File "/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/modules/announce.py", line 44, in announce
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: for cgroup in channels:
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File "/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/modules/announce.py", line 24, in _chunks
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: yield items[delim:delim + size]
Jul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: TypeError: 'dict_keys' object is not subscriptable
```
### Environment
- Sopel `.version`: [e.g. 7.0.0 or d416e19] 7.1.2
- Sopel installed via: [apt, pip, `setup.py install`, source, ?] pip
- Python version: [e.g. 3.6.9] 3.7
- Operating system: [e.g. Debian 10] Debian Buster
- IRCd `/version`: [e.g. InspIRCd 3.0.1] Libera Chat
- Relevant plugins: [adminchannel, weather, custom\_thing.py, ?] announce
### Notes
I believe https://github.com/sopel-irc/sopel/commit/b7b6b46a84e29e26a6a6b921debf57735661a4c0#diff-a9aa50736c17c299dac1ad9cb5ea1b835fb638c91bbd8c547990ffd9d67daa74 broke it due to .keys() not working the same way on python3 as it does on python2.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `sopel/modules/announce.py`
Content:
```
1 """
2 announce.py - Sopel Announcement Plugin
3 Sends announcements to all channels the bot has joined.
4 Copyright © 2013, Elad Alfassa, <[email protected]>
5 Licensed under the Eiffel Forum License 2.
6
7 https://sopel.chat
8 """
9 from __future__ import generator_stop
10
11 from sopel import plugin
12
13
14 def _chunks(items, size):
15 """Break a list of items into groups.
16
17 :param items: the collection of items to chunk
18 :type items: :term:`iterable`
19 :param int size: the size of each chunk
20 """
21 # from https://stackoverflow.com/a/312464/5991 with modified names for readability
22 for delim in range(0, len(items), size):
23 yield items[delim:delim + size]
24
25
26 @plugin.command('announce')
27 @plugin.example('.announce Some important message here')
28 @plugin.require_admin('Sorry, I can\'t let you do that', reply=True)
29 @plugin.output_prefix('[ANNOUNCEMENT] ')
30 def announce(bot, trigger):
31 """Send an announcement to all channels the bot is in."""
32 if trigger.group(2) is None:
33 bot.reply('Announce what? I need a message to say.')
34 return
35
36 size = 1
37 try:
38 size = bot.isupport.TARGMAX.get('PRIVMSG', size)
39 except AttributeError:
40 pass
41
42 channels = _chunks(bot.channels.keys(), size)
43 for cgroup in channels:
44 bot.say(trigger.group(2), ','.join(cgroup))
45
46 bot.reply('Announce complete.')
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/sopel/modules/announce.py b/sopel/modules/announce.py
--- a/sopel/modules/announce.py
+++ b/sopel/modules/announce.py
@@ -8,6 +8,8 @@
"""
from __future__ import generator_stop
+import itertools
+
from sopel import plugin
@@ -17,10 +19,18 @@
:param items: the collection of items to chunk
:type items: :term:`iterable`
:param int size: the size of each chunk
+ :return: a :term:`generator` of chunks
+ :rtype: :term:`generator` of :class:`tuple`
"""
- # from https://stackoverflow.com/a/312464/5991 with modified names for readability
- for delim in range(0, len(items), size):
- yield items[delim:delim + size]
+ # This approach is safer than slicing with non-subscriptable types,
+ # for example `dict_keys` objects
+ iterator = iter(items)
+ # TODO: Simplify to assignment expression (`while cond := expr`)
+ # when dropping Python 3.7
+ chunk = tuple(itertools.islice(iterator, size))
+ while chunk:
+ yield chunk
+ chunk = tuple(itertools.islice(iterator, size))
@plugin.command('announce')
| {"golden_diff": "diff --git a/sopel/modules/announce.py b/sopel/modules/announce.py\n--- a/sopel/modules/announce.py\n+++ b/sopel/modules/announce.py\n@@ -8,6 +8,8 @@\n \"\"\"\n from __future__ import generator_stop\n \n+import itertools\n+\n from sopel import plugin\n \n \n@@ -17,10 +19,18 @@\n :param items: the collection of items to chunk\n :type items: :term:`iterable`\n :param int size: the size of each chunk\n+ :return: a :term:`generator` of chunks\n+ :rtype: :term:`generator` of :class:`tuple`\n \"\"\"\n- # from https://stackoverflow.com/a/312464/5991 with modified names for readability\n- for delim in range(0, len(items), size):\n- yield items[delim:delim + size]\n+ # This approach is safer than slicing with non-subscriptable types,\n+ # for example `dict_keys` objects\n+ iterator = iter(items)\n+ # TODO: Simplify to assignment expression (`while cond := expr`)\n+ # when dropping Python 3.7\n+ chunk = tuple(itertools.islice(iterator, size))\n+ while chunk:\n+ yield chunk\n+ chunk = tuple(itertools.islice(iterator, size))\n \n \n @plugin.command('announce')\n", "issue": "announce error on python3\n<!-- Before reporting a bug, please search both open *and closed* issues to\r\nsee if it has already been reported. If you can, try to reproduce the problem\r\non an unmodified copy of the `master` branch first, as sometimes bugs are found\r\nand fixed without a report. If the problem is unreported and persists in\r\n`master`, please help us fix it quickly by filling out as much of this\r\ninformation as you can. Thanks! -->\r\n\r\n### Description\r\n.announce results in an error on python 3\r\n\r\n### Reproduction steps\r\n1. Setup a instance on python3 (specifically I got the error on v3.7)\r\n2. Try to use .announce\r\n3. there will be a error\r\n\r\n### Expected behavior\r\nWorks without errors.\r\n\r\n### Logs\r\n```\r\nIf applicable, add logs to help us figure out what's happening. Raw logs are\r\nsuper helpful! 
Logs are usually found in ~/.sopel/logs, depending on your\r\nconfiguration.\r\n```\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: [2021-07-22 17:04:51,684] sopel.bot ERROR - Unexpected error ('dict_keys' object is not subscriptable) from MacFan4000 at 2\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: Traceback (most recent call last):\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File \"/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/bot.py\", line 757, in call_rule\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: rule.execute(sopel, trigger)\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File \"/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/plugins/rules.py\", line 1057, in execute\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: exit_code = self._handler(bot, trigger)\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File \"/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/plugin.py\", line 1071, in guarded\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: return function(bot, trigger, *args, **kwargs)\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File \"/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/modules/announce.py\", line 44, in announce\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: for cgroup in channels:\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: File \"/srv/sopelbots/prodvenv/lib/python3.7/site-packages/sopel/modules/announce.py\", line 24, in _chunks\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: yield items[delim:delim + size]\r\nJul 22 17:04:51 bots1.mirahezebots.org sopel[31484]: TypeError: 'dict_keys' object is not subscriptable\r\n### Environment\r\n- Sopel `.version`: [e.g. 7.0.0 or d416e19] 7.1.2\r\n- Sopel installed via: [apt, pip, `setup.py install`, source, ?] pip\r\n- Python version: [e.g. 3.6.9] 3.7\r\n- Operating system: [e.g. Debian 10] Debian Buster\r\n- IRCd `/version`: [e.g. InspIRCd 3.0.1] Libera Chat\r\n- Relevant plugins: [adminchannel, weather, custom\\_thing.py, ?] announce\r\n\r\n### Notes\r\nI believe https://github.com/sopel-irc/sopel/commit/b7b6b46a84e29e26a6a6b921debf57735661a4c0#diff-a9aa50736c17c299dac1ad9cb5ea1b835fb638c91bbd8c547990ffd9d67daa74 broke it due to .keys() not working the same way on python3 as it does on python2.\r\n\n", "before_files": [{"content": "\"\"\"\nannounce.py - Sopel Announcement Plugin\nSends announcements to all channels the bot has joined.\nCopyright \u00a9 2013, Elad Alfassa, <[email protected]>\nLicensed under the Eiffel Forum License 2.\n\nhttps://sopel.chat\n\"\"\"\nfrom __future__ import generator_stop\n\nfrom sopel import plugin\n\n\ndef _chunks(items, size):\n \"\"\"Break a list of items into groups.\n\n :param items: the collection of items to chunk\n :type items: :term:`iterable`\n :param int size: the size of each chunk\n \"\"\"\n # from https://stackoverflow.com/a/312464/5991 with modified names for readability\n for delim in range(0, len(items), size):\n yield items[delim:delim + size]\n\n\[email protected]('announce')\[email protected]('.announce Some important message here')\[email protected]_admin('Sorry, I can\\'t let you do that', reply=True)\[email protected]_prefix('[ANNOUNCEMENT] ')\ndef announce(bot, trigger):\n \"\"\"Send an announcement to all channels the bot is in.\"\"\"\n if trigger.group(2) is None:\n bot.reply('Announce what? 
I need a message to say.')\n return\n\n size = 1\n try:\n size = bot.isupport.TARGMAX.get('PRIVMSG', size)\n except AttributeError:\n pass\n\n channels = _chunks(bot.channels.keys(), size)\n for cgroup in channels:\n bot.say(trigger.group(2), ','.join(cgroup))\n\n bot.reply('Announce complete.')\n", "path": "sopel/modules/announce.py"}], "after_files": [{"content": "\"\"\"\nannounce.py - Sopel Announcement Plugin\nSends announcements to all channels the bot has joined.\nCopyright \u00a9 2013, Elad Alfassa, <[email protected]>\nLicensed under the Eiffel Forum License 2.\n\nhttps://sopel.chat\n\"\"\"\nfrom __future__ import generator_stop\n\nimport itertools\n\nfrom sopel import plugin\n\n\ndef _chunks(items, size):\n \"\"\"Break a list of items into groups.\n\n :param items: the collection of items to chunk\n :type items: :term:`iterable`\n :param int size: the size of each chunk\n :return: a :term:`generator` of chunks\n :rtype: :term:`generator` of :class:`tuple`\n \"\"\"\n # This approach is safer than slicing with non-subscriptable types,\n # for example `dict_keys` objects\n iterator = iter(items)\n # TODO: Simplify to assignment expression (`while cond := expr`)\n # when dropping Python 3.7\n chunk = tuple(itertools.islice(iterator, size))\n while chunk:\n yield chunk\n chunk = tuple(itertools.islice(iterator, size))\n\n\[email protected]('announce')\[email protected]('.announce Some important message here')\[email protected]_admin('Sorry, I can\\'t let you do that', reply=True)\[email protected]_prefix('[ANNOUNCEMENT] ')\ndef announce(bot, trigger):\n \"\"\"Send an announcement to all channels the bot is in.\"\"\"\n if trigger.group(2) is None:\n bot.reply('Announce what? I need a message to say.')\n return\n\n size = 1\n try:\n size = bot.isupport.TARGMAX.get('PRIVMSG', size)\n except AttributeError:\n pass\n\n channels = _chunks(bot.channels.keys(), size)\n for cgroup in channels:\n bot.say(trigger.group(2), ','.join(cgroup))\n\n bot.reply('Announce complete.')\n", "path": "sopel/modules/announce.py"}]} | 1,880 | 310 |
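Here is the patched `_chunks` in isolation, runnable as-is. The point is that `itertools.islice` consumes any iterator, so non-subscriptable views like `dict_keys` work where the old slicing version raised `TypeError`.

```python
import itertools

def _chunks(items, size):
    """Yield tuples of at most `size` items from any iterable."""
    iterator = iter(items)
    chunk = tuple(itertools.islice(iterator, size))
    while chunk:
        yield chunk
        chunk = tuple(itertools.islice(iterator, size))

channels = {'#dev': None, '#ops': None, '#chat': None}
# dict_keys has no __getitem__, which is exactly what the old
# items[delim:delim + size] slice tripped over on Python 3.
print(list(_chunks(channels.keys(), 2)))
# [('#dev', '#ops'), ('#chat',)]
```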
gh_patches_debug_54184 | rasdani/github-patches | git_diff | pyro-ppl__pyro-2846 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[bug] Runtime error during SVI inference when using poutine.do()
### Issue Description
Setting: a simple model with 2 latent Gaussians z1 and z2, giving rise to x ~ N(z1 + z2, I).
In this setting p(z2 | x, z1) should be the same as p(z2 | x, do(z1)).
I wanted to check whether the current Pyro interface reflects this and it seems it does not.
My initial thought is that there is a difference in how .do() and .condition() broadcast the constants across the plate context.
### Environment
- OS and python version: MacOS 10.14.6, Python: 3.8.6
- PyTorch version: 1.9.0.dev20210502 (nightly version)
- Pyro version: 1.6.0.
### Code Snippet
Replication code:
https://pastebin.com/Ki2PYX7z
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pyro/poutine/do_messenger.py`
Content:
```
1 # Copyright (c) 2017-2019 Uber Technologies, Inc.
2 # SPDX-License-Identifier: Apache-2.0
3
4 import numbers
5 import warnings
6
7 import torch
8
9 from .messenger import Messenger
10 from .runtime import apply_stack
11
12
13 class DoMessenger(Messenger):
14 """
15 Given a stochastic function with some sample statements
16 and a dictionary of values at names,
17 set the return values of those sites equal to the values
18 as if they were hard-coded to those values
19 and introduce fresh sample sites with the same names
20 whose values do not propagate.
21
22 Composes freely with :func:`~pyro.poutine.handlers.condition`
23 to represent counterfactual distributions over potential outcomes.
24 See Single World Intervention Graphs [1] for additional details and theory.
25
26 Consider the following Pyro program:
27
28 >>> def model(x):
29 ... s = pyro.param("s", torch.tensor(0.5))
30 ... z = pyro.sample("z", dist.Normal(x, s))
31 ... return z ** 2
32
33 To intervene with a value for site `z`, we can write
34
35 >>> intervened_model = pyro.poutine.do(model, data={"z": torch.tensor(1.)})
36
37 This is equivalent to replacing `z = pyro.sample("z", ...)` with
38 `z = torch.tensor(1.)`
39 and introducing a fresh sample site pyro.sample("z", ...) whose value is not used elsewhere.
40
41 References
42
43 [1] `Single World Intervention Graphs: A Primer`,
44 Thomas Richardson, James Robins
45
46 :param fn: a stochastic function (callable containing Pyro primitive calls)
47 :param data: a ``dict`` mapping sample site names to interventions
48 :returns: stochastic function decorated with a :class:`~pyro.poutine.do_messenger.DoMessenger`
49 """
50 def __init__(self, data):
51 super().__init__()
52 self.data = data
53 self._intervener_id = str(id(self))
54
55 def _pyro_sample(self, msg):
56 if msg.get('_intervener_id', None) != self._intervener_id and \
57 self.data.get(msg['name']) is not None:
58
59 if msg.get('_intervener_id', None) is not None:
60 warnings.warn(
61 "Attempting to intervene on variable {} multiple times,"
62 "this is almost certainly incorrect behavior".format(msg['name']),
63 RuntimeWarning)
64
65 msg['_intervener_id'] = self._intervener_id
66
67 # split node, avoid reapplying self recursively to new node
68 new_msg = msg.copy()
69 apply_stack(new_msg)
70
71 # apply intervention
72 intervention = self.data[msg['name']]
73 msg['name'] = msg['name'] + "__CF" # mangle old name
74
75 if isinstance(intervention, (numbers.Number, torch.Tensor)):
76 msg['value'] = intervention
77 msg['is_observed'] = True
78 msg['stop'] = True
79 else:
80 raise NotImplementedError(
81 "Interventions of type {} not implemented (yet)".format(type(intervention)))
82
83 return None
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pyro/poutine/do_messenger.py b/pyro/poutine/do_messenger.py
--- a/pyro/poutine/do_messenger.py
+++ b/pyro/poutine/do_messenger.py
@@ -66,6 +66,7 @@
# split node, avoid reapplying self recursively to new node
new_msg = msg.copy()
+ new_msg["cond_indep_stack"] = () # avoid entering plates twice
apply_stack(new_msg)
# apply intervention
| {"golden_diff": "diff --git a/pyro/poutine/do_messenger.py b/pyro/poutine/do_messenger.py\n--- a/pyro/poutine/do_messenger.py\n+++ b/pyro/poutine/do_messenger.py\n@@ -66,6 +66,7 @@\n \n # split node, avoid reapplying self recursively to new node\n new_msg = msg.copy()\n+ new_msg[\"cond_indep_stack\"] = () # avoid entering plates twice\n apply_stack(new_msg)\n \n # apply intervention\n", "issue": "[bug] Runtime error during SVI inference when using poutine.do()\n### Issue Description\r\n\r\nSetting: a simple model with 2 latent Gaussians z1 and z2, giving rise to x ~ N( z1+z2, I).\r\n\r\nIn this setting p(z2 | x, z1) should be the same as p(z2 | x, do(z1)). \r\n\r\nI wanted to check whether the current Pyro interface reflects this and it seems it does not.\r\n\r\nMy initial thought is that there is a difference in how .do() and .condition() broadcast the constants across the plate context.\r\n\r\n### Environment\r\n\r\n - OS and python version: MacOS 10.14.6, Python: 3.8.6\r\n - PyTorch version: 1.9.0.dev20210502 (nightly version)\r\n - Pyro version: 1.6.0.\r\n\r\n### Code Snippet\r\n\r\nReplication code:\r\nhttps://pastebin.com/Ki2PYX7z\r\n\n", "before_files": [{"content": "# Copyright (c) 2017-2019 Uber Technologies, Inc.\n# SPDX-License-Identifier: Apache-2.0\n\nimport numbers\nimport warnings\n\nimport torch\n\nfrom .messenger import Messenger\nfrom .runtime import apply_stack\n\n\nclass DoMessenger(Messenger):\n \"\"\"\n Given a stochastic function with some sample statements\n and a dictionary of values at names,\n set the return values of those sites equal to the values\n as if they were hard-coded to those values\n and introduce fresh sample sites with the same names\n whose values do not propagate.\n\n Composes freely with :func:`~pyro.poutine.handlers.condition`\n to represent counterfactual distributions over potential outcomes.\n See Single World Intervention Graphs [1] for additional details and theory.\n\n Consider the following Pyro program:\n\n >>> def model(x):\n ... s = pyro.param(\"s\", torch.tensor(0.5))\n ... z = pyro.sample(\"z\", dist.Normal(x, s))\n ... return z ** 2\n\n To intervene with a value for site `z`, we can write\n\n >>> intervened_model = pyro.poutine.do(model, data={\"z\": torch.tensor(1.)})\n\n This is equivalent to replacing `z = pyro.sample(\"z\", ...)` with\n `z = torch.tensor(1.)`\n and introducing a fresh sample site pyro.sample(\"z\", ...) 
whose value is not used elsewhere.\n\n References\n\n [1] `Single World Intervention Graphs: A Primer`,\n Thomas Richardson, James Robins\n\n :param fn: a stochastic function (callable containing Pyro primitive calls)\n :param data: a ``dict`` mapping sample site names to interventions\n :returns: stochastic function decorated with a :class:`~pyro.poutine.do_messenger.DoMessenger`\n \"\"\"\n def __init__(self, data):\n super().__init__()\n self.data = data\n self._intervener_id = str(id(self))\n\n def _pyro_sample(self, msg):\n if msg.get('_intervener_id', None) != self._intervener_id and \\\n self.data.get(msg['name']) is not None:\n\n if msg.get('_intervener_id', None) is not None:\n warnings.warn(\n \"Attempting to intervene on variable {} multiple times,\"\n \"this is almost certainly incorrect behavior\".format(msg['name']),\n RuntimeWarning)\n\n msg['_intervener_id'] = self._intervener_id\n\n # split node, avoid reapplying self recursively to new node\n new_msg = msg.copy()\n apply_stack(new_msg)\n\n # apply intervention\n intervention = self.data[msg['name']]\n msg['name'] = msg['name'] + \"__CF\" # mangle old name\n\n if isinstance(intervention, (numbers.Number, torch.Tensor)):\n msg['value'] = intervention\n msg['is_observed'] = True\n msg['stop'] = True\n else:\n raise NotImplementedError(\n \"Interventions of type {} not implemented (yet)\".format(type(intervention)))\n\n return None\n", "path": "pyro/poutine/do_messenger.py"}], "after_files": [{"content": "# Copyright (c) 2017-2019 Uber Technologies, Inc.\n# SPDX-License-Identifier: Apache-2.0\n\nimport numbers\nimport warnings\n\nimport torch\n\nfrom .messenger import Messenger\nfrom .runtime import apply_stack\n\n\nclass DoMessenger(Messenger):\n \"\"\"\n Given a stochastic function with some sample statements\n and a dictionary of values at names,\n set the return values of those sites equal to the values\n as if they were hard-coded to those values\n and introduce fresh sample sites with the same names\n whose values do not propagate.\n\n Composes freely with :func:`~pyro.poutine.handlers.condition`\n to represent counterfactual distributions over potential outcomes.\n See Single World Intervention Graphs [1] for additional details and theory.\n\n Consider the following Pyro program:\n\n >>> def model(x):\n ... s = pyro.param(\"s\", torch.tensor(0.5))\n ... z = pyro.sample(\"z\", dist.Normal(x, s))\n ... return z ** 2\n\n To intervene with a value for site `z`, we can write\n\n >>> intervened_model = pyro.poutine.do(model, data={\"z\": torch.tensor(1.)})\n\n This is equivalent to replacing `z = pyro.sample(\"z\", ...)` with\n `z = torch.tensor(1.)`\n and introducing a fresh sample site pyro.sample(\"z\", ...) 
whose value is not used elsewhere.\n\n References\n\n [1] `Single World Intervention Graphs: A Primer`,\n Thomas Richardson, James Robins\n\n :param fn: a stochastic function (callable containing Pyro primitive calls)\n :param data: a ``dict`` mapping sample site names to interventions\n :returns: stochastic function decorated with a :class:`~pyro.poutine.do_messenger.DoMessenger`\n \"\"\"\n def __init__(self, data):\n super().__init__()\n self.data = data\n self._intervener_id = str(id(self))\n\n def _pyro_sample(self, msg):\n if msg.get('_intervener_id', None) != self._intervener_id and \\\n self.data.get(msg['name']) is not None:\n\n if msg.get('_intervener_id', None) is not None:\n warnings.warn(\n \"Attempting to intervene on variable {} multiple times,\"\n \"this is almost certainly incorrect behavior\".format(msg['name']),\n RuntimeWarning)\n\n msg['_intervener_id'] = self._intervener_id\n\n # split node, avoid reapplying self recursively to new node\n new_msg = msg.copy()\n new_msg[\"cond_indep_stack\"] = () # avoid entering plates twice\n apply_stack(new_msg)\n\n # apply intervention\n intervention = self.data[msg['name']]\n msg['name'] = msg['name'] + \"__CF\" # mangle old name\n\n if isinstance(intervention, (numbers.Number, torch.Tensor)):\n msg['value'] = intervention\n msg['is_observed'] = True\n msg['stop'] = True\n else:\n raise NotImplementedError(\n \"Interventions of type {} not implemented (yet)\".format(type(intervention)))\n\n return None\n", "path": "pyro/poutine/do_messenger.py"}]} | 1,317 | 110 |
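The replication code is only linked above, so the following is a hand-written model in the spirit of the issue text (two latent Gaussians with x ~ N(z1 + z2, I)) that shows how the two handlers are attached. It is an illustrative sketch, not the pastebin snippet, and the plate size is an arbitrary choice.

```python
import torch
import pyro
import pyro.distributions as dist
from pyro import poutine

def model():
    z1 = pyro.sample("z1", dist.Normal(0., 1.))
    z2 = pyro.sample("z2", dist.Normal(0., 1.))
    with pyro.plate("data", 3):
        pyro.sample("x", dist.Normal(z1 + z2, 1.).expand([3]),
                    obs=torch.zeros(3))

# p(z2 | x, z1=1) should equal p(z2 | x, do(z1=1)) in this model, so
# inference against these two programs should agree once the handler
# no longer re-enters enclosing plates for the counterfactual node.
conditioned = poutine.condition(model, data={"z1": torch.tensor(1.)})
intervened = poutine.do(model, data={"z1": torch.tensor(1.)})
```

The fix itself is the one-liner in the diff: clearing `cond_indep_stack` on the copied message so that `apply_stack` does not push the counterfactual site through the same plates a second time.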
gh_patches_debug_58564 | rasdani/github-patches | git_diff | codespell-project__codespell-2626 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`python setup.py check` → `twine check`
Because `setup.py ...` is deprecated, we need an alternative to `setup.py check` such as `twine`.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #! /usr/bin/env python
2
3 from setuptools import setup
4
5 if __name__ == "__main__":
6 setup()
7
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
deleted file mode 100755
--- a/setup.py
+++ /dev/null
@@ -1,6 +0,0 @@
-#! /usr/bin/env python
-
-from setuptools import setup
-
-if __name__ == "__main__":
- setup()
| {"golden_diff": "diff --git a/setup.py b/setup.py\ndeleted file mode 100755\n--- a/setup.py\n+++ /dev/null\n@@ -1,6 +0,0 @@\n-#! /usr/bin/env python\n-\n-from setuptools import setup\n-\n-if __name__ == \"__main__\":\n- setup()\n", "issue": "`python setup.py check` \u2192 `twine check`\nBecause `setup.py ...` is deprecated, we need an alternative to `setup.py check` such as `twine`.\n", "before_files": [{"content": "#! /usr/bin/env python\n\nfrom setuptools import setup\n\nif __name__ == \"__main__\":\n setup()\n", "path": "setup.py"}], "after_files": [{"content": null, "path": "setup.py"}]} | 326 | 68 |
gh_patches_debug_5766 | rasdani/github-patches | git_diff | napari__napari-4259 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Previously selected point deleted when deleting layer
## 🐛 Bug
Recently selected points are erroneously removed when deleting new layers with the delete key (reproduced with both points and labels layers).
## To Reproduce
Steps to reproduce the behaviour:
1. Create a point on a points layer
2. Create a new points layer
3. Select the newly created points layer from the layer list (visually deselecting the point)
4. Delete newly created layer using the delete key, the last selected point will also be deleted
Please note that this issue does not occur when the layer is deleted using the bin icon, leading me to believe it is a keybinding issue (and the point must still be 'selected' in some capacity)
<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->
https://user-images.githubusercontent.com/95660545/156966137-b2a645a6-25ae-42b4-baf7-137e7506e20a.mp4
## Expected behaviour
It is expected that only the newly created points layer (with no points assigned to it) should be deleted, not the point as well.
<!-- A clear and concise description of what you expected to happen. -->
## Environment
napari: 0.4.15.dev68+gdd3a2afd
Platform: Windows-10-10.0.19044-SP0
Python: 3.9.7 (default, Sep 16 2021, 16:59:28) [MSC v.1916 64 bit (AMD64)]
Qt: 5.15.2
PyQt5: 5.15.6
NumPy: 1.21.5
SciPy: 1.7.3
Dask: 2022.01.0
VisPy: 0.9.6
OpenGL:
- GL version: 4.6.0 - Build 26.20.100.7372
- MAX_TEXTURE_SIZE: 16384
Screens:
- screen 1: resolution 1920x1080, scale 1.0
Plugins:
- console: 0.0.4
- scikit-image: 0.4.15.dev68+gdd3a2afd
- svg: 0.1.6
napari contributors (2019). napari: a multi-dimensional image viewer for python. doi:10.5281/zenodo.3555620
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `napari/_qt/containers/qt_layer_list.py`
Content:
```
1 from __future__ import annotations
2
3 from typing import TYPE_CHECKING
4
5 from qtpy.QtCore import QSortFilterProxyModel, Qt
6
7 from ...layers import Layer
8 from ...utils.translations import trans
9 from ._base_item_model import SortRole, _BaseEventedItemModel
10 from ._layer_delegate import LayerDelegate
11 from .qt_list_view import QtListView
12
13 if TYPE_CHECKING:
14 from qtpy.QtGui import QKeyEvent
15 from qtpy.QtWidgets import QWidget
16
17 from ...components.layerlist import LayerList
18
19
20 class ReverseProxyModel(QSortFilterProxyModel):
21 """Proxy Model that reverses the view order of a _BaseEventedItemModel."""
22
23 def __init__(self, model: _BaseEventedItemModel) -> None:
24 super().__init__()
25 self.setSourceModel(model)
26 self.setSortRole(SortRole)
27 self.sort(0, Qt.DescendingOrder)
28
29 def dropMimeData(self, data, action, destRow, col, parent):
30 """Handle destination row for dropping with reversed indices."""
31 row = 0 if destRow == -1 else self.sourceModel().rowCount() - destRow
32 return self.sourceModel().dropMimeData(data, action, row, col, parent)
33
34
35 class QtLayerList(QtListView[Layer]):
36 """QItemView subclass specialized for the LayerList.
37
38 This is as mostly for targetting with QSS, applying the delegate and
39 reversing the view with ReverseProxyModel.
40 """
41
42 def __init__(self, root: LayerList, parent: QWidget = None):
43 super().__init__(root, parent)
44 self.setItemDelegate(LayerDelegate())
45 self.setToolTip(trans._('Layer list'))
46 font = self.font()
47 font.setPointSize(12)
48 self.setFont(font)
49
50 # This reverses the order of the items in the view,
51 # so items at the end of the list are at the top.
52 self.setModel(ReverseProxyModel(self.model()))
53
54 def keyPressEvent(self, e: QKeyEvent) -> None:
55 """Override Qt event to pass events to the viewer."""
56 if e.key() != Qt.Key_Space:
57 super().keyPressEvent(e)
58
59 e.ignore() # pass key events up to viewer
60
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/napari/_qt/containers/qt_layer_list.py b/napari/_qt/containers/qt_layer_list.py
--- a/napari/_qt/containers/qt_layer_list.py
+++ b/napari/_qt/containers/qt_layer_list.py
@@ -53,7 +53,7 @@
def keyPressEvent(self, e: QKeyEvent) -> None:
"""Override Qt event to pass events to the viewer."""
- if e.key() != Qt.Key_Space:
+ if e.key() != Qt.Key.Key_Space:
super().keyPressEvent(e)
-
- e.ignore() # pass key events up to viewer
+ if e.key() not in (Qt.Key.Key_Backspace, Qt.Key.Key_Delete):
+ e.ignore() # pass key events up to viewer
| {"golden_diff": "diff --git a/napari/_qt/containers/qt_layer_list.py b/napari/_qt/containers/qt_layer_list.py\n--- a/napari/_qt/containers/qt_layer_list.py\n+++ b/napari/_qt/containers/qt_layer_list.py\n@@ -53,7 +53,7 @@\n \n def keyPressEvent(self, e: QKeyEvent) -> None:\n \"\"\"Override Qt event to pass events to the viewer.\"\"\"\n- if e.key() != Qt.Key_Space:\n+ if e.key() != Qt.Key.Key_Space:\n super().keyPressEvent(e)\n-\n- e.ignore() # pass key events up to viewer\n+ if e.key() not in (Qt.Key.Key_Backspace, Qt.Key.Key_Delete):\n+ e.ignore() # pass key events up to viewer\n", "issue": "Previously selected point deleted when deleting layer\n## \ud83d\udc1b Bug\r\n\r\nRecently selected points are erroneously removed when deleting new layers with the delete key. (reproduced with points and labels layer)\r\n\r\n## To Reproduce\r\n\r\nSteps to reproduce the behaviour:\r\n\r\n1. Create a point on a points layer\r\n2. Create a new points layer\r\n3. Select the newly created points layer from the layer list (visually deselecting the point)\r\n4. Delete newly created layer using the delete key, the last selected point will also be deleted\r\n\r\nPlease note that this issue does not occur when the layer is deleted using the bin icon, leading me to believe it is a keybinding issue (and the point must still be 'selected' in come capacity)\r\n<!-- If you have a code sample, error messages, stack traces, please provide it here as well -->\r\n\r\n\r\nhttps://user-images.githubusercontent.com/95660545/156966137-b2a645a6-25ae-42b4-baf7-137e7506e20a.mp4\r\n\r\n\r\n## Expected behaviour\r\nIt is expected that only the newly created points layer (with no points assigned to it) should be deleted, not the point as well.\r\n\r\n<!-- A clear and concise description of what you expected to happen. -->\r\n\r\n## Environment\r\n\r\nnapari: 0.4.15.dev68+gdd3a2afd\r\nPlatform: Windows-10-10.0.19044-SP0\r\nPython: 3.9.7 (default, Sep 16 2021, 16:59:28) [MSC v.1916 64 bit (AMD64)]\r\nQt: 5.15.2\r\nPyQt5: 5.15.6\r\nNumPy: 1.21.5\r\nSciPy: 1.7.3\r\nDask: 2022.01.0\r\nVisPy: 0.9.6\r\n\r\nOpenGL:\r\n- GL version: 4.6.0 - Build 26.20.100.7372\r\n- MAX_TEXTURE_SIZE: 16384\r\n\r\nScreens:\r\n- screen 1: resolution 1920x1080, scale 1.0\r\n\r\nPlugins:\r\n- console: 0.0.4\r\n- scikit-image: 0.4.15.dev68+gdd3a2afd\r\n- svg: 0.1.6\r\n\r\nnapari contributors (2019). napari: a multi-dimensional image viewer for python. 
doi:10.5281/zenodo.3555620\r\n\n", "before_files": [{"content": "from __future__ import annotations\n\nfrom typing import TYPE_CHECKING\n\nfrom qtpy.QtCore import QSortFilterProxyModel, Qt\n\nfrom ...layers import Layer\nfrom ...utils.translations import trans\nfrom ._base_item_model import SortRole, _BaseEventedItemModel\nfrom ._layer_delegate import LayerDelegate\nfrom .qt_list_view import QtListView\n\nif TYPE_CHECKING:\n from qtpy.QtGui import QKeyEvent\n from qtpy.QtWidgets import QWidget\n\n from ...components.layerlist import LayerList\n\n\nclass ReverseProxyModel(QSortFilterProxyModel):\n \"\"\"Proxy Model that reverses the view order of a _BaseEventedItemModel.\"\"\"\n\n def __init__(self, model: _BaseEventedItemModel) -> None:\n super().__init__()\n self.setSourceModel(model)\n self.setSortRole(SortRole)\n self.sort(0, Qt.DescendingOrder)\n\n def dropMimeData(self, data, action, destRow, col, parent):\n \"\"\"Handle destination row for dropping with reversed indices.\"\"\"\n row = 0 if destRow == -1 else self.sourceModel().rowCount() - destRow\n return self.sourceModel().dropMimeData(data, action, row, col, parent)\n\n\nclass QtLayerList(QtListView[Layer]):\n \"\"\"QItemView subclass specialized for the LayerList.\n\n This is as mostly for targetting with QSS, applying the delegate and\n reversing the view with ReverseProxyModel.\n \"\"\"\n\n def __init__(self, root: LayerList, parent: QWidget = None):\n super().__init__(root, parent)\n self.setItemDelegate(LayerDelegate())\n self.setToolTip(trans._('Layer list'))\n font = self.font()\n font.setPointSize(12)\n self.setFont(font)\n\n # This reverses the order of the items in the view,\n # so items at the end of the list are at the top.\n self.setModel(ReverseProxyModel(self.model()))\n\n def keyPressEvent(self, e: QKeyEvent) -> None:\n \"\"\"Override Qt event to pass events to the viewer.\"\"\"\n if e.key() != Qt.Key_Space:\n super().keyPressEvent(e)\n\n e.ignore() # pass key events up to viewer\n", "path": "napari/_qt/containers/qt_layer_list.py"}], "after_files": [{"content": "from __future__ import annotations\n\nfrom typing import TYPE_CHECKING\n\nfrom qtpy.QtCore import QSortFilterProxyModel, Qt\n\nfrom ...layers import Layer\nfrom ...utils.translations import trans\nfrom ._base_item_model import SortRole, _BaseEventedItemModel\nfrom ._layer_delegate import LayerDelegate\nfrom .qt_list_view import QtListView\n\nif TYPE_CHECKING:\n from qtpy.QtGui import QKeyEvent\n from qtpy.QtWidgets import QWidget\n\n from ...components.layerlist import LayerList\n\n\nclass ReverseProxyModel(QSortFilterProxyModel):\n \"\"\"Proxy Model that reverses the view order of a _BaseEventedItemModel.\"\"\"\n\n def __init__(self, model: _BaseEventedItemModel) -> None:\n super().__init__()\n self.setSourceModel(model)\n self.setSortRole(SortRole)\n self.sort(0, Qt.DescendingOrder)\n\n def dropMimeData(self, data, action, destRow, col, parent):\n \"\"\"Handle destination row for dropping with reversed indices.\"\"\"\n row = 0 if destRow == -1 else self.sourceModel().rowCount() - destRow\n return self.sourceModel().dropMimeData(data, action, row, col, parent)\n\n\nclass QtLayerList(QtListView[Layer]):\n \"\"\"QItemView subclass specialized for the LayerList.\n\n This is as mostly for targetting with QSS, applying the delegate and\n reversing the view with ReverseProxyModel.\n \"\"\"\n\n def __init__(self, root: LayerList, parent: QWidget = None):\n super().__init__(root, parent)\n self.setItemDelegate(LayerDelegate())\n self.setToolTip(trans._('Layer 
list'))\n font = self.font()\n font.setPointSize(12)\n self.setFont(font)\n\n # This reverses the order of the items in the view,\n # so items at the end of the list are at the top.\n self.setModel(ReverseProxyModel(self.model()))\n\n def keyPressEvent(self, e: QKeyEvent) -> None:\n \"\"\"Override Qt event to pass events to the viewer.\"\"\"\n if e.key() != Qt.Key.Key_Space:\n super().keyPressEvent(e)\n if e.key() not in (Qt.Key.Key_Backspace, Qt.Key.Key_Delete):\n e.ignore() # pass key events up to viewer\n", "path": "napari/_qt/containers/qt_layer_list.py"}]} | 1,456 | 175 |
gh_patches_debug_16311 | rasdani/github-patches | git_diff | spotify__luigi-368 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
LuigiConfigParser::add_config_path() raises if instance() hasn't been accessed
To add a path to the list of config paths, one currently has to do:
``` python
LuigiConfigParser.instance() # remove this and get an exception
LuigiConfigParser.add_config_path(my_path)
```
because `add_config_path` tries to reload `cls._instance`, which is initialized to `None`. Wouldn't it be cleaner to do a check there and only reload a non-null instance?
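To make that suggestion concrete, here is a minimal sketch of the null check, shown as a drop-in replacement for the method; this is not necessarily the fix the maintainers would choose.

```python
# Inside LuigiConfigParser (luigi/configuration.py); hypothetical guard,
# for illustration only.
@classmethod
def add_config_path(cls, path):
    cls._config_paths.append(path)
    if cls._instance is not None:  # skip the reload until instance() has run
        cls._instance.reload()
```

An alternative with the same effect would be to route the reload through `cls.instance()` so the singleton is created on demand.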
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `luigi/configuration.py`
Content:
```
1
2 import os
3 import logging
4 from ConfigParser import ConfigParser, NoOptionError, NoSectionError
5
6
7 class LuigiConfigParser(ConfigParser):
8 NO_DEFAULT = object()
9 _instance = None
10 _config_paths = ['/etc/luigi/client.cfg', 'client.cfg']
11 if 'LUIGI_CONFIG_PATH' in os.environ:
12 _config_paths.append(os.environ['LUIGI_CONFIG_PATH'])
13
14 @classmethod
15 def add_config_path(cls, path):
16 cls._config_paths.append(path)
17 cls._instance.reload()
18
19 @classmethod
20 def instance(cls, *args, **kwargs):
21 """ Singleton getter """
22 if cls._instance is None:
23 cls._instance = cls(*args, **kwargs)
24 loaded = cls._instance.reload()
25 logging.getLogger('luigi-interface').info('Loaded %r', loaded)
26
27 return cls._instance
28
29 def reload(self):
30 return self._instance.read(self._config_paths)
31
32 def _get_with_default(self, method, section, option, default, expected_type=None):
33 """ Gets the value of the section/option using method. Returns default if value
34 is not found. Raises an exception if the default value is not None and doesn't match
35 the expected_type.
36 """
37 try:
38 return method(self, section, option)
39 except (NoOptionError, NoSectionError):
40 if default is LuigiConfigParser.NO_DEFAULT:
41 raise
42 if expected_type is not None and default is not None and \
43 not isinstance(default, expected_type):
44 raise
45 return default
46
47 def get(self, section, option, default=NO_DEFAULT):
48 return self._get_with_default(ConfigParser.get, section, option, default)
49
50 def getboolean(self, section, option, default=NO_DEFAULT):
51 return self._get_with_default(ConfigParser.getboolean, section, option, default, bool)
52
53 def getint(self, section, option, default=NO_DEFAULT):
54 return self._get_with_default(ConfigParser.getint, section, option, default, int)
55
56 def getfloat(self, section, option, default=NO_DEFAULT):
57 return self._get_with_default(ConfigParser.getfloat, section, option, default, float)
58
59 def set(self, section, option, value):
60 if not ConfigParser.has_section(self, section):
61 ConfigParser.add_section(self, section)
62
63 return ConfigParser.set(self, section, option, value)
64
65 def get_config():
66 """ Convenience method (for backwards compatibility) for accessing config singleton """
67 return LuigiConfigParser.instance()
68
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/luigi/configuration.py b/luigi/configuration.py
--- a/luigi/configuration.py
+++ b/luigi/configuration.py
@@ -14,7 +14,7 @@
@classmethod
def add_config_path(cls, path):
cls._config_paths.append(path)
- cls._instance.reload()
+ cls.reload()
@classmethod
def instance(cls, *args, **kwargs):
@@ -26,8 +26,9 @@
return cls._instance
- def reload(self):
- return self._instance.read(self._config_paths)
+ @classmethod
+ def reload(cls):
+ return cls.instance().read(cls._config_paths)
def _get_with_default(self, method, section, option, default, expected_type=None):
""" Gets the value of the section/option using method. Returns default if value
| {"golden_diff": "diff --git a/luigi/configuration.py b/luigi/configuration.py\n--- a/luigi/configuration.py\n+++ b/luigi/configuration.py\n@@ -14,7 +14,7 @@\n @classmethod\n def add_config_path(cls, path):\n cls._config_paths.append(path)\n- cls._instance.reload()\n+ cls.reload()\n \n @classmethod\n def instance(cls, *args, **kwargs):\n@@ -26,8 +26,9 @@\n \n return cls._instance\n \n- def reload(self):\n- return self._instance.read(self._config_paths)\n+ @classmethod\n+ def reload(cls):\n+ return cls.instance().read(cls._config_paths)\n \n def _get_with_default(self, method, section, option, default, expected_type=None):\n \"\"\" Gets the value of the section/option using method. Returns default if value\n", "issue": "LuigiConfigParser::add_config_path() raises if instance() hasn't been accessed\nTo add a path to the list of config paths, one currently has to do:\n\n``` python\nLuigiConfigParser.instance() # remove this and get an exception\nLuigiConfigParser.add_config_path(my_path)\n```\n\nbecause `add_config_path` tries to reload `cls._instance` which is initialized with `None`. Wouldn't it be cleaner to do a check there and only reload a non-null instance?\n\n", "before_files": [{"content": "\nimport os\nimport logging\nfrom ConfigParser import ConfigParser, NoOptionError, NoSectionError\n\n\nclass LuigiConfigParser(ConfigParser):\n NO_DEFAULT = object()\n _instance = None\n _config_paths = ['/etc/luigi/client.cfg', 'client.cfg']\n if 'LUIGI_CONFIG_PATH' in os.environ:\n _config_paths.append(os.environ['LUIGI_CONFIG_PATH'])\n\n @classmethod\n def add_config_path(cls, path):\n cls._config_paths.append(path)\n cls._instance.reload()\n\n @classmethod\n def instance(cls, *args, **kwargs):\n \"\"\" Singleton getter \"\"\"\n if cls._instance is None:\n cls._instance = cls(*args, **kwargs)\n loaded = cls._instance.reload()\n logging.getLogger('luigi-interface').info('Loaded %r', loaded)\n\n return cls._instance\n\n def reload(self):\n return self._instance.read(self._config_paths)\n\n def _get_with_default(self, method, section, option, default, expected_type=None):\n \"\"\" Gets the value of the section/option using method. Returns default if value\n is not found. 
Raises an exception if the default value is not None and doesn't match\n the expected_type.\n \"\"\"\n try:\n return method(self, section, option)\n except (NoOptionError, NoSectionError):\n if default is LuigiConfigParser.NO_DEFAULT:\n raise\n if expected_type is not None and default is not None and \\\n not isinstance(default, expected_type):\n raise\n return default\n\n def get(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.get, section, option, default)\n\n def getboolean(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.getboolean, section, option, default, bool)\n\n def getint(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.getint, section, option, default, int)\n\n def getfloat(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.getfloat, section, option, default, float)\n\n def set(self, section, option, value):\n if not ConfigParser.has_section(self, section):\n ConfigParser.add_section(self, section)\n\n return ConfigParser.set(self, section, option, value)\n\ndef get_config():\n \"\"\" Convenience method (for backwards compatibility) for accessing config singleton \"\"\"\n return LuigiConfigParser.instance()\n", "path": "luigi/configuration.py"}], "after_files": [{"content": "\nimport os\nimport logging\nfrom ConfigParser import ConfigParser, NoOptionError, NoSectionError\n\n\nclass LuigiConfigParser(ConfigParser):\n NO_DEFAULT = object()\n _instance = None\n _config_paths = ['/etc/luigi/client.cfg', 'client.cfg']\n if 'LUIGI_CONFIG_PATH' in os.environ:\n _config_paths.append(os.environ['LUIGI_CONFIG_PATH'])\n\n @classmethod\n def add_config_path(cls, path):\n cls._config_paths.append(path)\n cls.reload()\n\n @classmethod\n def instance(cls, *args, **kwargs):\n \"\"\" Singleton getter \"\"\"\n if cls._instance is None:\n cls._instance = cls(*args, **kwargs)\n loaded = cls._instance.reload()\n logging.getLogger('luigi-interface').info('Loaded %r', loaded)\n\n return cls._instance\n\n @classmethod\n def reload(cls):\n return cls.instance().read(cls._config_paths)\n\n def _get_with_default(self, method, section, option, default, expected_type=None):\n \"\"\" Gets the value of the section/option using method. Returns default if value\n is not found. 
Raises an exception if the default value is not None and doesn't match\n the expected_type.\n \"\"\"\n try:\n return method(self, section, option)\n except (NoOptionError, NoSectionError):\n if default is LuigiConfigParser.NO_DEFAULT:\n raise\n if expected_type is not None and default is not None and \\\n not isinstance(default, expected_type):\n raise\n return default\n\n def get(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.get, section, option, default)\n\n def getboolean(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.getboolean, section, option, default, bool)\n\n def getint(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.getint, section, option, default, int)\n\n def getfloat(self, section, option, default=NO_DEFAULT):\n return self._get_with_default(ConfigParser.getfloat, section, option, default, float)\n\n def set(self, section, option, value):\n if not ConfigParser.has_section(self, section):\n ConfigParser.add_section(self, section)\n\n return ConfigParser.set(self, section, option, value)\n\ndef get_config():\n \"\"\" Convenience method (for backwards compatibility) for accessing config singleton \"\"\"\n return LuigiConfigParser.instance()\n", "path": "luigi/configuration.py"}]} | 1,036 | 192 |
gh_patches_debug_8797 | rasdani/github-patches | git_diff | Kinto__kinto-1340 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`kinto create-user` doesn't override the password if the user already exists.
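One guess at the mechanism, sketched under the assumption that the script's storage write happens inside a transaction that nothing ever commits; the helper name is invented for illustration and this is not a confirmed diagnosis.

```python
import transaction  # the transaction package commonly used by Pyramid apps


def write_account(registry, username, record):
    # Hypothetical helper mirroring the tail of create_user(): commit
    # explicitly after the storage write so an update to an existing
    # user is not rolled back when the script exits.
    registry.storage.create(collection_id='account',
                            parent_id=username,
                            record=record,
                            ignore_conflict=True)
    transaction.commit()
```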
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kinto/plugins/accounts/scripts.py`
Content:
```
1 import logging
2 import getpass
3 from pyramid.settings import asbool
4
5 from .utils import hash_password
6 from .views import AccountIdGenerator
7
8
9 logger = logging.getLogger(__name__)
10
11
12 def create_user(env, username=None, password=None):
13 """Administrative command to create a new user."""
14 registry = env['registry']
15 settings = registry.settings
16 readonly_mode = asbool(settings.get('readonly', False))
17 if readonly_mode:
18 message = 'Cannot create a user with a readonly server.'
19 logger.error(message)
20 return 51
21
22 if 'kinto.plugins.accounts' not in settings['includes']:
23 message = 'Cannot create a user when the accounts plugin is not installed.'
24 logger.error(message)
25 return 52
26
27 try:
28 validator = AccountIdGenerator()
29 if username is None:
30 username = input('Username: ')
31 while not validator.match(username):
32 print('{} is not a valid username.')
33 print('Username should match {0!r}, please try again.'.format(validator.regexp))
34 username = input('Username: ')
35
36 if password is None:
37 while True: # The user didn't entered twice the same password
38 password = getpass.getpass('Please enter a password for {}: '.format(username))
39 confirm = getpass.getpass('Please confirm the password: '.format(username))
40
41 if password != confirm:
42 print('Sorry, passwords do not match, please try again.')
43 else:
44 break
45 except EOFError:
46 print('User creation aborted')
47 return 53
48
49 print("Creating user '{}'".format(username))
50 record = {'id': username, 'password': hash_password(password)}
51 registry.storage.create(collection_id='account',
52 parent_id=username,
53 record=record,
54 ignore_conflict=True)
55 registry.permission.add_principal_to_ace('/accounts/{}'.format(username),
56 'write',
57 'account:{}'.format(username))
58
59 return 0
60
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kinto/plugins/accounts/scripts.py b/kinto/plugins/accounts/scripts.py
--- a/kinto/plugins/accounts/scripts.py
+++ b/kinto/plugins/accounts/scripts.py
@@ -1,5 +1,7 @@
import logging
import getpass
+
+import transaction as current_transaction
from pyramid.settings import asbool
from .utils import hash_password
@@ -56,4 +58,6 @@
'write',
'account:{}'.format(username))
+ current_transaction.commit()
+
return 0
| {"golden_diff": "diff --git a/kinto/plugins/accounts/scripts.py b/kinto/plugins/accounts/scripts.py\n--- a/kinto/plugins/accounts/scripts.py\n+++ b/kinto/plugins/accounts/scripts.py\n@@ -1,5 +1,7 @@\n import logging\n import getpass\n+\n+import transaction as current_transaction\n from pyramid.settings import asbool\n \n from .utils import hash_password\n@@ -56,4 +58,6 @@\n 'write',\n 'account:{}'.format(username))\n \n+ current_transaction.commit()\n+\n return 0\n", "issue": "`kinto create-user` doesn't override the password if the user already exists.\n\n", "before_files": [{"content": "import logging\nimport getpass\nfrom pyramid.settings import asbool\n\nfrom .utils import hash_password\nfrom .views import AccountIdGenerator\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef create_user(env, username=None, password=None):\n \"\"\"Administrative command to create a new user.\"\"\"\n registry = env['registry']\n settings = registry.settings\n readonly_mode = asbool(settings.get('readonly', False))\n if readonly_mode:\n message = 'Cannot create a user with a readonly server.'\n logger.error(message)\n return 51\n\n if 'kinto.plugins.accounts' not in settings['includes']:\n message = 'Cannot create a user when the accounts plugin is not installed.'\n logger.error(message)\n return 52\n\n try:\n validator = AccountIdGenerator()\n if username is None:\n username = input('Username: ')\n while not validator.match(username):\n print('{} is not a valid username.')\n print('Username should match {0!r}, please try again.'.format(validator.regexp))\n username = input('Username: ')\n\n if password is None:\n while True: # The user didn't entered twice the same password\n password = getpass.getpass('Please enter a password for {}: '.format(username))\n confirm = getpass.getpass('Please confirm the password: '.format(username))\n\n if password != confirm:\n print('Sorry, passwords do not match, please try again.')\n else:\n break\n except EOFError:\n print('User creation aborted')\n return 53\n\n print(\"Creating user '{}'\".format(username))\n record = {'id': username, 'password': hash_password(password)}\n registry.storage.create(collection_id='account',\n parent_id=username,\n record=record,\n ignore_conflict=True)\n registry.permission.add_principal_to_ace('/accounts/{}'.format(username),\n 'write',\n 'account:{}'.format(username))\n\n return 0\n", "path": "kinto/plugins/accounts/scripts.py"}], "after_files": [{"content": "import logging\nimport getpass\n\nimport transaction as current_transaction\nfrom pyramid.settings import asbool\n\nfrom .utils import hash_password\nfrom .views import AccountIdGenerator\n\n\nlogger = logging.getLogger(__name__)\n\n\ndef create_user(env, username=None, password=None):\n \"\"\"Administrative command to create a new user.\"\"\"\n registry = env['registry']\n settings = registry.settings\n readonly_mode = asbool(settings.get('readonly', False))\n if readonly_mode:\n message = 'Cannot create a user with a readonly server.'\n logger.error(message)\n return 51\n\n if 'kinto.plugins.accounts' not in settings['includes']:\n message = 'Cannot create a user when the accounts plugin is not installed.'\n logger.error(message)\n return 52\n\n try:\n validator = AccountIdGenerator()\n if username is None:\n username = input('Username: ')\n while not validator.match(username):\n print('{} is not a valid username.')\n print('Username should match {0!r}, please try again.'.format(validator.regexp))\n username = input('Username: ')\n\n if password is None:\n while True: # The 
user didn't entered twice the same password\n password = getpass.getpass('Please enter a password for {}: '.format(username))\n confirm = getpass.getpass('Please confirm the password: '.format(username))\n\n if password != confirm:\n print('Sorry, passwords do not match, please try again.')\n else:\n break\n except EOFError:\n print('User creation aborted')\n return 53\n\n print(\"Creating user '{}'\".format(username))\n record = {'id': username, 'password': hash_password(password)}\n registry.storage.create(collection_id='account',\n parent_id=username,\n record=record,\n ignore_conflict=True)\n registry.permission.add_principal_to_ace('/accounts/{}'.format(username),\n 'write',\n 'account:{}'.format(username))\n\n current_transaction.commit()\n\n return 0\n", "path": "kinto/plugins/accounts/scripts.py"}]} | 805 | 112 |
gh_patches_debug_31523 | rasdani/github-patches | git_diff | freedomofpress__securedrop-3884 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove unnecessary Ansible callback for profile_tasks
# Feature request
## Description
The file at `install_files/ansible-base/callback_plugins/profile_tasks.py` was added via #1196, to provide additional information on task performance, with the goal of aiding developers in improving the server config workflow. Since we moved to Ansible v2 in #1146, the hardcoded plugin is no longer necessary.
Instead, we can simply add a line to `ansible.cfg` under `[defaults]`:
```
callback_whitelist = profile_tasks
```
The simplification is possible because task profiling was [added to Ansible core as of v2](https://docs.ansible.com/ansible/devel/plugins/callback/profile_tasks.html).
## User Stories
As a maintainer, I want to delete redundant code wherever possible, and lean on upstream to handle core functionality when appropriate.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `install_files/ansible-base/callback_plugins/profile_tasks.py`
Content:
```
1 # Source: https://github.com/jlafon/ansible-profile
2 # License: MIT
3 # More info: http://jlafon.io/ansible-profiling.html
4 # The profiling functionality will be provided by Ansible v2,
5 # since this callback_plugin has been merged into core,
6 # but we're including here to support older versions of Ansible.
7 import datetime
8 import os
9 import time
10
11
12 class CallbackModule(object):
13 """
14 A plugin for timing tasks
15 """
16 def __init__(self):
17 self.stats = {}
18 self.current = None
19
20 def playbook_on_task_start(self, name, is_conditional):
21 """
22 Logs the start of each task
23 """
24
25 if os.getenv("ANSIBLE_PROFILE_DISABLE") is not None:
26 return
27
28 if self.current is not None:
29 # Record the running time of the last executed task
30 self.stats[self.current] = time.time() - self.stats[self.current]
31
32 # Record the start time of the current task
33 self.current = name
34 self.stats[self.current] = time.time()
35
36 def playbook_on_stats(self, stats):
37 """
38 Prints the timings
39 """
40
41 if os.getenv("ANSIBLE_PROFILE_DISABLE") is not None:
42 return
43
44 # Record the timing of the very last task
45 if self.current is not None:
46 self.stats[self.current] = time.time() - self.stats[self.current]
47
48 # Sort the tasks by their running time
49 results = sorted(
50 self.stats.items(),
51 key=lambda value: value[1],
52 reverse=True,
53 )
54
55 # Just keep the top 10
56 results = results[:10]
57
58 # Print the timings
59 for name, elapsed in results:
60 print(
61 "{0:-<70}{1:->9}".format(
62 '{0} '.format(name),
63 ' {0:.02f}s'.format(elapsed),
64 )
65 )
66
67 total_seconds = sum([x[1] for x in self.stats.items()])
68 print("\nPlaybook finished: {0}, {1} total tasks."
69 " {2} elapsed. \n".format(
70 time.asctime(),
71 len(self.stats.items()),
72 datetime.timedelta(seconds=(int(total_seconds)))
73 )
74 )
75
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/install_files/ansible-base/callback_plugins/profile_tasks.py b/install_files/ansible-base/callback_plugins/profile_tasks.py
deleted file mode 100644
--- a/install_files/ansible-base/callback_plugins/profile_tasks.py
+++ /dev/null
@@ -1,74 +0,0 @@
-# Source: https://github.com/jlafon/ansible-profile
-# License: MIT
-# More info: http://jlafon.io/ansible-profiling.html
-# The profiling functionality will be provided by Ansible v2,
-# since this callback_plugin has been merged into core,
-# but we're including here to support older versions of Ansible.
-import datetime
-import os
-import time
-
-
-class CallbackModule(object):
- """
- A plugin for timing tasks
- """
- def __init__(self):
- self.stats = {}
- self.current = None
-
- def playbook_on_task_start(self, name, is_conditional):
- """
- Logs the start of each task
- """
-
- if os.getenv("ANSIBLE_PROFILE_DISABLE") is not None:
- return
-
- if self.current is not None:
- # Record the running time of the last executed task
- self.stats[self.current] = time.time() - self.stats[self.current]
-
- # Record the start time of the current task
- self.current = name
- self.stats[self.current] = time.time()
-
- def playbook_on_stats(self, stats):
- """
- Prints the timings
- """
-
- if os.getenv("ANSIBLE_PROFILE_DISABLE") is not None:
- return
-
- # Record the timing of the very last task
- if self.current is not None:
- self.stats[self.current] = time.time() - self.stats[self.current]
-
- # Sort the tasks by their running time
- results = sorted(
- self.stats.items(),
- key=lambda value: value[1],
- reverse=True,
- )
-
- # Just keep the top 10
- results = results[:10]
-
- # Print the timings
- for name, elapsed in results:
- print(
- "{0:-<70}{1:->9}".format(
- '{0} '.format(name),
- ' {0:.02f}s'.format(elapsed),
- )
- )
-
- total_seconds = sum([x[1] for x in self.stats.items()])
- print("\nPlaybook finished: {0}, {1} total tasks."
- " {2} elapsed. \n".format(
- time.asctime(),
- len(self.stats.items()),
- datetime.timedelta(seconds=(int(total_seconds)))
- )
- )
| {"golden_diff": "diff --git a/install_files/ansible-base/callback_plugins/profile_tasks.py b/install_files/ansible-base/callback_plugins/profile_tasks.py\ndeleted file mode 100644\n--- a/install_files/ansible-base/callback_plugins/profile_tasks.py\n+++ /dev/null\n@@ -1,74 +0,0 @@\n-# Source: https://github.com/jlafon/ansible-profile\n-# License: MIT\n-# More info: http://jlafon.io/ansible-profiling.html\n-# The profiling functionality will be provided by Ansible v2,\n-# since this callback_plugin has been merged into core,\n-# but we're including here to support older versions of Ansible.\n-import datetime\n-import os\n-import time\n-\n-\n-class CallbackModule(object):\n- \"\"\"\n- A plugin for timing tasks\n- \"\"\"\n- def __init__(self):\n- self.stats = {}\n- self.current = None\n-\n- def playbook_on_task_start(self, name, is_conditional):\n- \"\"\"\n- Logs the start of each task\n- \"\"\"\n-\n- if os.getenv(\"ANSIBLE_PROFILE_DISABLE\") is not None:\n- return\n-\n- if self.current is not None:\n- # Record the running time of the last executed task\n- self.stats[self.current] = time.time() - self.stats[self.current]\n-\n- # Record the start time of the current task\n- self.current = name\n- self.stats[self.current] = time.time()\n-\n- def playbook_on_stats(self, stats):\n- \"\"\"\n- Prints the timings\n- \"\"\"\n-\n- if os.getenv(\"ANSIBLE_PROFILE_DISABLE\") is not None:\n- return\n-\n- # Record the timing of the very last task\n- if self.current is not None:\n- self.stats[self.current] = time.time() - self.stats[self.current]\n-\n- # Sort the tasks by their running time\n- results = sorted(\n- self.stats.items(),\n- key=lambda value: value[1],\n- reverse=True,\n- )\n-\n- # Just keep the top 10\n- results = results[:10]\n-\n- # Print the timings\n- for name, elapsed in results:\n- print(\n- \"{0:-<70}{1:->9}\".format(\n- '{0} '.format(name),\n- ' {0:.02f}s'.format(elapsed),\n- )\n- )\n-\n- total_seconds = sum([x[1] for x in self.stats.items()])\n- print(\"\\nPlaybook finished: {0}, {1} total tasks.\"\n- \" {2} elapsed. \\n\".format(\n- time.asctime(),\n- len(self.stats.items()),\n- datetime.timedelta(seconds=(int(total_seconds)))\n- )\n- )\n", "issue": "Remove unnecessary Ansible callback for profile_tasks\n# Feature request\r\n\r\n## Description\r\n\r\nThe file at `install_files/ansible-base/callback_plugins/profile_tasks.py` was added via #1196, to provide additional information on task performance, with the goal of aiding developers in improving the server config workflow. 
Since we moved to Ansible v2 in #1146, the hardcoded plugin is no longer necessary.\r\n\r\nInstead, we can ansible add a lint to `ansible.cfg` under `[defaults]`:\r\n\r\n```\r\ncallback_whitelist = profile_tasks\r\n```\r\n\r\nThe simplification is possible because task profiling was [added to Ansible core as of v2](https://docs.ansible.com/ansible/devel/plugins/callback/profile_tasks.html).\r\n\r\n## User Stories\r\nAs a maintainer, I want to delete redundant code wherever possible, and lean on upstream to handle core functionality when appropriate.\r\n\n", "before_files": [{"content": "# Source: https://github.com/jlafon/ansible-profile\n# License: MIT\n# More info: http://jlafon.io/ansible-profiling.html\n# The profiling functionality will be provided by Ansible v2,\n# since this callback_plugin has been merged into core,\n# but we're including here to support older versions of Ansible.\nimport datetime\nimport os\nimport time\n\n\nclass CallbackModule(object):\n \"\"\"\n A plugin for timing tasks\n \"\"\"\n def __init__(self):\n self.stats = {}\n self.current = None\n\n def playbook_on_task_start(self, name, is_conditional):\n \"\"\"\n Logs the start of each task\n \"\"\"\n\n if os.getenv(\"ANSIBLE_PROFILE_DISABLE\") is not None:\n return\n\n if self.current is not None:\n # Record the running time of the last executed task\n self.stats[self.current] = time.time() - self.stats[self.current]\n\n # Record the start time of the current task\n self.current = name\n self.stats[self.current] = time.time()\n\n def playbook_on_stats(self, stats):\n \"\"\"\n Prints the timings\n \"\"\"\n\n if os.getenv(\"ANSIBLE_PROFILE_DISABLE\") is not None:\n return\n\n # Record the timing of the very last task\n if self.current is not None:\n self.stats[self.current] = time.time() - self.stats[self.current]\n\n # Sort the tasks by their running time\n results = sorted(\n self.stats.items(),\n key=lambda value: value[1],\n reverse=True,\n )\n\n # Just keep the top 10\n results = results[:10]\n\n # Print the timings\n for name, elapsed in results:\n print(\n \"{0:-<70}{1:->9}\".format(\n '{0} '.format(name),\n ' {0:.02f}s'.format(elapsed),\n )\n )\n\n total_seconds = sum([x[1] for x in self.stats.items()])\n print(\"\\nPlaybook finished: {0}, {1} total tasks.\"\n \" {2} elapsed. \\n\".format(\n time.asctime(),\n len(self.stats.items()),\n datetime.timedelta(seconds=(int(total_seconds)))\n )\n )\n", "path": "install_files/ansible-base/callback_plugins/profile_tasks.py"}], "after_files": [{"content": null, "path": "install_files/ansible-base/callback_plugins/profile_tasks.py"}]} | 1,074 | 610 |
gh_patches_debug_3328 | rasdani/github-patches | git_diff | Mailu__Mailu-1944 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Letsencrypt Force Renewal
Is there a limit on the Subject Alt Name entries?
I have updated my /mailu/mailu.env "HOSTNAMES" variable, but when I restart Mailu it doesn't update the Subject Alt Names on the mailu cert.
Previously it has worked, so I am guessing that I need to force Letsencrypt to refresh, as it isn't within the renewal window. But there is no guidance for the new Letsencrypt certbot.
I am using the latest Mailu version (1.7), and this is the command I am using to restart Mailu: `/mailu/docker-compose -p mailu up -d`
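For context, the certbot invocation lives in `core/nginx/letsencrypt.py`, quoted below. A sketch of how a forced reissue could look; `--force-renewal` is an assumption about the appropriate certbot flag, not Mailu's shipped behaviour, and it replaces `--keep-until-expiring` in this sketch.

```python
import os

# Sketch of the command list from core/nginx/letsencrypt.py with one
# hypothetical change: "--force-renewal" (replacing
# "--keep-until-expiring") makes certbot reissue even when the existing
# certificate is not yet due for renewal.
command = [
    "certbot",
    "-n", "--agree-tos",
    "-d", os.environ["HOSTNAMES"],
    "certonly", "--standalone",
    "--cert-name", "mailu",
    "--preferred-challenges", "http", "--http-01-port", "8008",
    "--force-renewal",  # hypothetical addition
    "--config-dir", "/certs/letsencrypt",
]
```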
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/nginx/letsencrypt.py`
Content:
```
1 #!/usr/bin/python3
2
3 import os
4 import time
5 import subprocess
6
7
8 command = [
9 "certbot",
10 "-n", "--agree-tos", # non-interactive
11 "-d", os.environ["HOSTNAMES"],
12 "-m", "{}@{}".format(os.environ["POSTMASTER"], os.environ["DOMAIN"]),
13 "certonly", "--standalone",
14 "--cert-name", "mailu",
15 "--preferred-challenges", "http", "--http-01-port", "8008",
16 "--keep-until-expiring",
17 "--rsa-key-size", "4096",
18 "--config-dir", "/certs/letsencrypt",
19 "--post-hook", "/config.py"
20 ]
21
22 # Wait for nginx to start
23 time.sleep(5)
24
25 # Run certbot every hour
26 while True:
27 subprocess.call(command)
28 time.sleep(3600)
29
30
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/nginx/letsencrypt.py b/core/nginx/letsencrypt.py
--- a/core/nginx/letsencrypt.py
+++ b/core/nginx/letsencrypt.py
@@ -14,8 +14,8 @@
"--cert-name", "mailu",
"--preferred-challenges", "http", "--http-01-port", "8008",
"--keep-until-expiring",
- "--rsa-key-size", "4096",
"--config-dir", "/certs/letsencrypt",
+ "--renew-with-new-domains",
"--post-hook", "/config.py"
]
| {"golden_diff": "diff --git a/core/nginx/letsencrypt.py b/core/nginx/letsencrypt.py\n--- a/core/nginx/letsencrypt.py\n+++ b/core/nginx/letsencrypt.py\n@@ -14,8 +14,8 @@\n \"--cert-name\", \"mailu\",\n \"--preferred-challenges\", \"http\", \"--http-01-port\", \"8008\",\n \"--keep-until-expiring\",\n- \"--rsa-key-size\", \"4096\",\n \"--config-dir\", \"/certs/letsencrypt\",\n+ \"--renew-with-new-domains\",\n \"--post-hook\", \"/config.py\"\n ]\n", "issue": "Letsencrypt Force Renewal\nIs there a limit on the Subject Alt Name entries?\r\n\r\nI have updated my /mailu/mailu.env \"HOSTNAMES\" variable, but when I restart Mailu it doesn't update the Subject Alt Names on the mailu cert.\r\n\r\nPreviously it has worked, so I am guessing that I need to force Letsencrypt to refresh as it isnt within the renewal window. But there is no guidance for the new letsencrypt certbot.\r\n\r\nI am using the latest Mailu version (1.7) and this is the command I am using to restart mailu '/mailu/docker-compose -p mailu up -d'\n", "before_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport time\nimport subprocess\n\n\ncommand = [\n \"certbot\",\n \"-n\", \"--agree-tos\", # non-interactive\n \"-d\", os.environ[\"HOSTNAMES\"],\n \"-m\", \"{}@{}\".format(os.environ[\"POSTMASTER\"], os.environ[\"DOMAIN\"]),\n \"certonly\", \"--standalone\",\n \"--cert-name\", \"mailu\",\n \"--preferred-challenges\", \"http\", \"--http-01-port\", \"8008\",\n \"--keep-until-expiring\",\n \"--rsa-key-size\", \"4096\",\n \"--config-dir\", \"/certs/letsencrypt\",\n \"--post-hook\", \"/config.py\"\n]\n\n# Wait for nginx to start\ntime.sleep(5)\n\n# Run certbot every hour\nwhile True:\n subprocess.call(command)\n time.sleep(3600)\n\n", "path": "core/nginx/letsencrypt.py"}], "after_files": [{"content": "#!/usr/bin/python3\n\nimport os\nimport time\nimport subprocess\n\n\ncommand = [\n \"certbot\",\n \"-n\", \"--agree-tos\", # non-interactive\n \"-d\", os.environ[\"HOSTNAMES\"],\n \"-m\", \"{}@{}\".format(os.environ[\"POSTMASTER\"], os.environ[\"DOMAIN\"]),\n \"certonly\", \"--standalone\",\n \"--cert-name\", \"mailu\",\n \"--preferred-challenges\", \"http\", \"--http-01-port\", \"8008\",\n \"--keep-until-expiring\",\n \"--config-dir\", \"/certs/letsencrypt\",\n \"--renew-with-new-domains\",\n \"--post-hook\", \"/config.py\"\n]\n\n# Wait for nginx to start\ntime.sleep(5)\n\n# Run certbot every hour\nwhile True:\n subprocess.call(command)\n time.sleep(3600)\n\n", "path": "core/nginx/letsencrypt.py"}]} | 635 | 132 |
gh_patches_debug_29046 | rasdani/github-patches | git_diff | Mailu__Mailu-931 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Admin whitelist_webmail hard coded
Hi,
I'm trying to send transactional email with Mailu SMTP from a backend server and it takes more than 15s.
I tracked down one problem in core/admin/mailu/internal/__init__.py:
def whitelist_webmail() uses socket.gethostbyname("webmail")
In my Docker configuration there is no "webmail" host, so socket.gethostbyname returns nothing after 5 s, which slows down the /internal/auth/email request a lot.
When I set "webmail" to a fake IP on the admin server, /internal/auth/email returns immediately.
Maybe it would be better to define a list of hostnames in the configuration file instead of using a hard-coded "webmail" value. What do you think?
Thanks Mailu for the great work!
JB
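A rough sketch of the configuration-driven approach suggested above, reusing the existing `HOST_WEBMAIL` setting from `core/admin/mailu/configuration.py` (quoted below); resolving the host once at startup rather than per request is the assumption being illustrated.

```python
import flask
from flask import current_app as app

from mailu import utils


@utils.limiter.request_filter
def whitelist_webmail():
    try:
        # Compare against a host taken from configuration (resolved once
        # at startup) instead of calling socket.gethostbyname("webmail")
        # on every authentication request.
        return flask.request.headers["Client-Ip"] == app.config["HOST_WEBMAIL"]
    except Exception:
        return False
```

The trade-off is that a container IP change after startup would require a restart to pick up, which is usually acceptable for a fixed compose stack.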
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `core/admin/mailu/internal/__init__.py`
Content:
```
1 from flask_limiter import RateLimitExceeded
2
3 from mailu import utils
4
5 import socket
6 import flask
7
8
9 internal = flask.Blueprint('internal', __name__, template_folder='templates')
10
11
12 @internal.app_errorhandler(RateLimitExceeded)
13 def rate_limit_handler(e):
14 response = flask.Response()
15 response.headers['Auth-Status'] = 'Authentication rate limit from one source exceeded'
16 response.headers['Auth-Error-Code'] = '451 4.3.2'
17 if int(flask.request.headers['Auth-Login-Attempt']) < 10:
18 response.headers['Auth-Wait'] = '3'
19 return response
20
21
22 @utils.limiter.request_filter
23 def whitelist_webmail():
24 try:
25 return flask.request.headers["Client-Ip"] ==\
26 socket.gethostbyname("webmail")
27 except:
28 return False
29
30
31 from mailu.internal.views import *
32
```
Path: `core/admin/mailu/configuration.py`
Content:
```
1 import os
2 from mailustart import resolve
3
4 DEFAULT_CONFIG = {
5 # Specific to the admin UI
6 'DOCKER_SOCKET': 'unix:///var/run/docker.sock',
7 'BABEL_DEFAULT_LOCALE': 'en',
8 'BABEL_DEFAULT_TIMEZONE': 'UTC',
9 'BOOTSTRAP_SERVE_LOCAL': True,
10 'RATELIMIT_STORAGE_URL': 'redis://redis/2',
11 'QUOTA_STORAGE_URL': 'redis://redis/1',
12 'DEBUG': False,
13 'DOMAIN_REGISTRATION': False,
14 'TEMPLATES_AUTO_RELOAD': True,
15 # Database settings
16 'DB_FLAVOR': None,
17 'DB_USER': 'mailu',
18 'DB_PW': None,
19 'DB_HOST': 'database',
20 'DB_NAME': 'mailu',
21 'SQLITE_DATABASE_FILE':'data/main.db',
22 'SQLALCHEMY_DATABASE_URI': 'sqlite:////data/main.db',
23 'SQLALCHEMY_TRACK_MODIFICATIONS': False,
24 # Statistics management
25 'INSTANCE_ID_PATH': '/data/instance',
26 'STATS_ENDPOINT': '0.{}.stats.mailu.io',
27 # Common configuration variables
28 'SECRET_KEY': 'changeMe',
29 'DOMAIN': 'mailu.io',
30 'HOSTNAMES': 'mail.mailu.io,alternative.mailu.io,yetanother.mailu.io',
31 'POSTMASTER': 'postmaster',
32 'TLS_FLAVOR': 'cert',
33 'AUTH_RATELIMIT': '10/minute;1000/hour',
34 'DISABLE_STATISTICS': False,
35 # Mail settings
36 'DMARC_RUA': None,
37 'DMARC_RUF': None,
38 'WELCOME': False,
39 'WELCOME_SUBJECT': 'Dummy welcome topic',
40 'WELCOME_BODY': 'Dummy welcome body',
41 'DKIM_SELECTOR': 'dkim',
42 'DKIM_PATH': '/dkim/{domain}.{selector}.key',
43 'DEFAULT_QUOTA': 1000000000,
44 # Web settings
45 'SITENAME': 'Mailu',
46 'WEBSITE': 'https://mailu.io',
47 'WEB_ADMIN': '/admin',
48 'WEB_WEBMAIL': '/webmail',
49 'RECAPTCHA_PUBLIC_KEY': '',
50 'RECAPTCHA_PRIVATE_KEY': '',
51 # Advanced settings
52 'PASSWORD_SCHEME': 'BLF-CRYPT',
53 # Host settings
54 'HOST_IMAP': 'imap',
55 'HOST_POP3': 'imap',
56 'HOST_SMTP': 'smtp',
57 'HOST_WEBMAIL': 'webmail',
58 'HOST_FRONT': 'front',
59 'HOST_AUTHSMTP': os.environ.get('HOST_SMTP', 'smtp'),
60 'SUBNET': '192.168.203.0/24',
61 'POD_ADDRESS_RANGE': None
62 }
63
64 class ConfigManager(dict):
65 """ Naive configuration manager that uses environment only
66 """
67
68 DB_TEMPLATES = {
69 'sqlite': 'sqlite:////{SQLITE_DATABASE_FILE}',
70 'postgresql': 'postgresql://{DB_USER}:{DB_PW}@{DB_HOST}/{DB_NAME}',
71 'mysql': 'mysql://{DB_USER}:{DB_PW}@{DB_HOST}/{DB_NAME}'
72 }
73
74 def __init__(self):
75 self.config = dict()
76
77 def resolve_host(self):
78 self.config['HOST_IMAP'] = resolve(self.config['HOST_IMAP'])
79 self.config['HOST_POP3'] = resolve(self.config['HOST_POP3'])
80 self.config['HOST_AUTHSMTP'] = resolve(self.config['HOST_AUTHSMTP'])
81 self.config['HOST_SMTP'] = resolve(self.config['HOST_SMTP'])
82
83 def __coerce_value(self, value):
84 if isinstance(value, str) and value.lower() in ('true','yes'):
85 return True
86 elif isinstance(value, str) and value.lower() in ('false', 'no'):
87 return False
88 return value
89
90 def init_app(self, app):
91 self.config.update(app.config)
92 # get environment variables
93 self.config.update({
94 key: self.__coerce_value(os.environ.get(key, value))
95 for key, value in DEFAULT_CONFIG.items()
96 })
97 self.resolve_host()
98
99 # automatically set the sqlalchemy string
100 if self.config['DB_FLAVOR']:
101 template = self.DB_TEMPLATES[self.config['DB_FLAVOR']]
102 self.config['SQLALCHEMY_DATABASE_URI'] = template.format(**self.config)
103 # update the app config itself
104 app.config = self
105
106 def setdefault(self, key, value):
107 if key not in self.config:
108 self.config[key] = value
109 return self.config[key]
110
111 def get(self, *args):
112 return self.config.get(*args)
113
114 def keys(self):
115 return self.config.keys()
116
117 def __getitem__(self, key):
118 return self.config.get(key)
119
120 def __setitem__(self, key, value):
121 self.config[key] = value
122
123 def __contains__(self, key):
124 return key in self.config
125
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/core/admin/mailu/configuration.py b/core/admin/mailu/configuration.py
--- a/core/admin/mailu/configuration.py
+++ b/core/admin/mailu/configuration.py
@@ -46,6 +46,7 @@
'WEBSITE': 'https://mailu.io',
'WEB_ADMIN': '/admin',
'WEB_WEBMAIL': '/webmail',
+ 'WEBMAIL': 'none',
'RECAPTCHA_PUBLIC_KEY': '',
'RECAPTCHA_PRIVATE_KEY': '',
# Advanced settings
@@ -79,6 +80,8 @@
self.config['HOST_POP3'] = resolve(self.config['HOST_POP3'])
self.config['HOST_AUTHSMTP'] = resolve(self.config['HOST_AUTHSMTP'])
self.config['HOST_SMTP'] = resolve(self.config['HOST_SMTP'])
+ if self.config['WEBMAIL'] != 'none':
+ self.config['HOST_WEBMAIL'] = resolve(self.config['HOST_WEBMAIL'])
def __coerce_value(self, value):
if isinstance(value, str) and value.lower() in ('true','yes'):
diff --git a/core/admin/mailu/internal/__init__.py b/core/admin/mailu/internal/__init__.py
--- a/core/admin/mailu/internal/__init__.py
+++ b/core/admin/mailu/internal/__init__.py
@@ -1,6 +1,7 @@
from flask_limiter import RateLimitExceeded
from mailu import utils
+from flask import current_app as app
import socket
import flask
@@ -23,7 +24,7 @@
def whitelist_webmail():
try:
return flask.request.headers["Client-Ip"] ==\
- socket.gethostbyname("webmail")
+ app.config["HOST_WEBMAIL"]
except:
return False
| {"golden_diff": "diff --git a/core/admin/mailu/configuration.py b/core/admin/mailu/configuration.py\n--- a/core/admin/mailu/configuration.py\n+++ b/core/admin/mailu/configuration.py\n@@ -46,6 +46,7 @@\n 'WEBSITE': 'https://mailu.io',\n 'WEB_ADMIN': '/admin',\n 'WEB_WEBMAIL': '/webmail',\n+ 'WEBMAIL': 'none',\n 'RECAPTCHA_PUBLIC_KEY': '',\n 'RECAPTCHA_PRIVATE_KEY': '',\n # Advanced settings\n@@ -79,6 +80,8 @@\n self.config['HOST_POP3'] = resolve(self.config['HOST_POP3'])\n self.config['HOST_AUTHSMTP'] = resolve(self.config['HOST_AUTHSMTP'])\n self.config['HOST_SMTP'] = resolve(self.config['HOST_SMTP'])\n+ if self.config['WEBMAIL'] != 'none':\n+ self.config['HOST_WEBMAIL'] = resolve(self.config['HOST_WEBMAIL'])\n \n def __coerce_value(self, value):\n if isinstance(value, str) and value.lower() in ('true','yes'):\ndiff --git a/core/admin/mailu/internal/__init__.py b/core/admin/mailu/internal/__init__.py\n--- a/core/admin/mailu/internal/__init__.py\n+++ b/core/admin/mailu/internal/__init__.py\n@@ -1,6 +1,7 @@\n from flask_limiter import RateLimitExceeded\n \n from mailu import utils\n+from flask import current_app as app\n \n import socket\n import flask\n@@ -23,7 +24,7 @@\n def whitelist_webmail():\n try:\n return flask.request.headers[\"Client-Ip\"] ==\\\n- socket.gethostbyname(\"webmail\")\n+ app.config[\"HOST_WEBMAIL\"]\n except:\n return False\n", "issue": "Admin whitelist_webmail hard coded\nHi,\r\nI'm trying to send transactional email with Mailu SMTP from a backend server and it takes more than 15s.\r\nI tracked down one problem in core/admin/mailu/internal/__init__.py:\r\ndef whitelist_webmail() uses socket.gethostbyname(\"webmail\")\r\n\r\nIn my docker configuration there is no \"webmail\" host so the function socket.gethostbyname return nothing after 5s which slows down a lot the request /internal/auth/email\r\nWhen I set \"webmail\" to a fake ip on the admin server the /internal/auth/email returns immediately.\r\n\r\nMaybe it would be better to define a list of hostnames in the configuration file instead of using a hard coded \"webmail\" value. 
What do you think?\r\n\r\nThanks Mailu for the great work!\r\nJB\n", "before_files": [{"content": "from flask_limiter import RateLimitExceeded\n\nfrom mailu import utils\n\nimport socket\nimport flask\n\n\ninternal = flask.Blueprint('internal', __name__, template_folder='templates')\n\n\[email protected]_errorhandler(RateLimitExceeded)\ndef rate_limit_handler(e):\n response = flask.Response()\n response.headers['Auth-Status'] = 'Authentication rate limit from one source exceeded'\n response.headers['Auth-Error-Code'] = '451 4.3.2'\n if int(flask.request.headers['Auth-Login-Attempt']) < 10:\n response.headers['Auth-Wait'] = '3'\n return response\n\n\[email protected]_filter\ndef whitelist_webmail():\n try:\n return flask.request.headers[\"Client-Ip\"] ==\\\n socket.gethostbyname(\"webmail\")\n except:\n return False\n\n\nfrom mailu.internal.views import *\n", "path": "core/admin/mailu/internal/__init__.py"}, {"content": "import os\nfrom mailustart import resolve\n\nDEFAULT_CONFIG = {\n # Specific to the admin UI\n 'DOCKER_SOCKET': 'unix:///var/run/docker.sock',\n 'BABEL_DEFAULT_LOCALE': 'en',\n 'BABEL_DEFAULT_TIMEZONE': 'UTC',\n 'BOOTSTRAP_SERVE_LOCAL': True,\n 'RATELIMIT_STORAGE_URL': 'redis://redis/2',\n 'QUOTA_STORAGE_URL': 'redis://redis/1',\n 'DEBUG': False,\n 'DOMAIN_REGISTRATION': False,\n 'TEMPLATES_AUTO_RELOAD': True,\n # Database settings\n 'DB_FLAVOR': None,\n 'DB_USER': 'mailu',\n 'DB_PW': None,\n 'DB_HOST': 'database',\n 'DB_NAME': 'mailu',\n 'SQLITE_DATABASE_FILE':'data/main.db',\n 'SQLALCHEMY_DATABASE_URI': 'sqlite:////data/main.db',\n 'SQLALCHEMY_TRACK_MODIFICATIONS': False,\n # Statistics management\n 'INSTANCE_ID_PATH': '/data/instance',\n 'STATS_ENDPOINT': '0.{}.stats.mailu.io',\n # Common configuration variables\n 'SECRET_KEY': 'changeMe',\n 'DOMAIN': 'mailu.io',\n 'HOSTNAMES': 'mail.mailu.io,alternative.mailu.io,yetanother.mailu.io',\n 'POSTMASTER': 'postmaster',\n 'TLS_FLAVOR': 'cert',\n 'AUTH_RATELIMIT': '10/minute;1000/hour',\n 'DISABLE_STATISTICS': False,\n # Mail settings\n 'DMARC_RUA': None,\n 'DMARC_RUF': None,\n 'WELCOME': False,\n 'WELCOME_SUBJECT': 'Dummy welcome topic',\n 'WELCOME_BODY': 'Dummy welcome body',\n 'DKIM_SELECTOR': 'dkim',\n 'DKIM_PATH': '/dkim/{domain}.{selector}.key',\n 'DEFAULT_QUOTA': 1000000000,\n # Web settings\n 'SITENAME': 'Mailu',\n 'WEBSITE': 'https://mailu.io',\n 'WEB_ADMIN': '/admin',\n 'WEB_WEBMAIL': '/webmail',\n 'RECAPTCHA_PUBLIC_KEY': '',\n 'RECAPTCHA_PRIVATE_KEY': '',\n # Advanced settings\n 'PASSWORD_SCHEME': 'BLF-CRYPT',\n # Host settings\n 'HOST_IMAP': 'imap',\n 'HOST_POP3': 'imap',\n 'HOST_SMTP': 'smtp',\n 'HOST_WEBMAIL': 'webmail',\n 'HOST_FRONT': 'front',\n 'HOST_AUTHSMTP': os.environ.get('HOST_SMTP', 'smtp'),\n 'SUBNET': '192.168.203.0/24',\n 'POD_ADDRESS_RANGE': None\n}\n\nclass ConfigManager(dict):\n \"\"\" Naive configuration manager that uses environment only\n \"\"\"\n\n DB_TEMPLATES = {\n 'sqlite': 'sqlite:////{SQLITE_DATABASE_FILE}',\n 'postgresql': 'postgresql://{DB_USER}:{DB_PW}@{DB_HOST}/{DB_NAME}',\n 'mysql': 'mysql://{DB_USER}:{DB_PW}@{DB_HOST}/{DB_NAME}'\n }\n\n def __init__(self):\n self.config = dict()\n\n def resolve_host(self):\n self.config['HOST_IMAP'] = resolve(self.config['HOST_IMAP'])\n self.config['HOST_POP3'] = resolve(self.config['HOST_POP3'])\n self.config['HOST_AUTHSMTP'] = resolve(self.config['HOST_AUTHSMTP'])\n self.config['HOST_SMTP'] = resolve(self.config['HOST_SMTP'])\n\n def __coerce_value(self, value):\n if isinstance(value, str) and value.lower() in ('true','yes'):\n return True\n elif 
isinstance(value, str) and value.lower() in ('false', 'no'):\n return False\n return value\n\n def init_app(self, app):\n self.config.update(app.config)\n # get environment variables\n self.config.update({\n key: self.__coerce_value(os.environ.get(key, value))\n for key, value in DEFAULT_CONFIG.items()\n })\n self.resolve_host()\n\n # automatically set the sqlalchemy string\n if self.config['DB_FLAVOR']:\n template = self.DB_TEMPLATES[self.config['DB_FLAVOR']]\n self.config['SQLALCHEMY_DATABASE_URI'] = template.format(**self.config)\n # update the app config itself\n app.config = self\n\n def setdefault(self, key, value):\n if key not in self.config:\n self.config[key] = value\n return self.config[key]\n\n def get(self, *args):\n return self.config.get(*args)\n\n def keys(self):\n return self.config.keys()\n\n def __getitem__(self, key):\n return self.config.get(key)\n\n def __setitem__(self, key, value):\n self.config[key] = value\n\n def __contains__(self, key):\n return key in self.config\n", "path": "core/admin/mailu/configuration.py"}], "after_files": [{"content": "from flask_limiter import RateLimitExceeded\n\nfrom mailu import utils\nfrom flask import current_app as app\n\nimport socket\nimport flask\n\n\ninternal = flask.Blueprint('internal', __name__, template_folder='templates')\n\n\[email protected]_errorhandler(RateLimitExceeded)\ndef rate_limit_handler(e):\n response = flask.Response()\n response.headers['Auth-Status'] = 'Authentication rate limit from one source exceeded'\n response.headers['Auth-Error-Code'] = '451 4.3.2'\n if int(flask.request.headers['Auth-Login-Attempt']) < 10:\n response.headers['Auth-Wait'] = '3'\n return response\n\n\[email protected]_filter\ndef whitelist_webmail():\n try:\n return flask.request.headers[\"Client-Ip\"] ==\\\n app.config[\"HOST_WEBMAIL\"]\n except:\n return False\n\n\nfrom mailu.internal.views import *\n", "path": "core/admin/mailu/internal/__init__.py"}, {"content": "import os\nfrom mailustart import resolve\n\nDEFAULT_CONFIG = {\n # Specific to the admin UI\n 'DOCKER_SOCKET': 'unix:///var/run/docker.sock',\n 'BABEL_DEFAULT_LOCALE': 'en',\n 'BABEL_DEFAULT_TIMEZONE': 'UTC',\n 'BOOTSTRAP_SERVE_LOCAL': True,\n 'RATELIMIT_STORAGE_URL': 'redis://redis/2',\n 'QUOTA_STORAGE_URL': 'redis://redis/1',\n 'DEBUG': False,\n 'DOMAIN_REGISTRATION': False,\n 'TEMPLATES_AUTO_RELOAD': True,\n # Database settings\n 'DB_FLAVOR': None,\n 'DB_USER': 'mailu',\n 'DB_PW': None,\n 'DB_HOST': 'database',\n 'DB_NAME': 'mailu',\n 'SQLITE_DATABASE_FILE':'data/main.db',\n 'SQLALCHEMY_DATABASE_URI': 'sqlite:////data/main.db',\n 'SQLALCHEMY_TRACK_MODIFICATIONS': False,\n # Statistics management\n 'INSTANCE_ID_PATH': '/data/instance',\n 'STATS_ENDPOINT': '0.{}.stats.mailu.io',\n # Common configuration variables\n 'SECRET_KEY': 'changeMe',\n 'DOMAIN': 'mailu.io',\n 'HOSTNAMES': 'mail.mailu.io,alternative.mailu.io,yetanother.mailu.io',\n 'POSTMASTER': 'postmaster',\n 'TLS_FLAVOR': 'cert',\n 'AUTH_RATELIMIT': '10/minute;1000/hour',\n 'DISABLE_STATISTICS': False,\n # Mail settings\n 'DMARC_RUA': None,\n 'DMARC_RUF': None,\n 'WELCOME': False,\n 'WELCOME_SUBJECT': 'Dummy welcome topic',\n 'WELCOME_BODY': 'Dummy welcome body',\n 'DKIM_SELECTOR': 'dkim',\n 'DKIM_PATH': '/dkim/{domain}.{selector}.key',\n 'DEFAULT_QUOTA': 1000000000,\n # Web settings\n 'SITENAME': 'Mailu',\n 'WEBSITE': 'https://mailu.io',\n 'WEB_ADMIN': '/admin',\n 'WEB_WEBMAIL': '/webmail',\n 'WEBMAIL': 'none',\n 'RECAPTCHA_PUBLIC_KEY': '',\n 'RECAPTCHA_PRIVATE_KEY': '',\n # Advanced settings\n 
'PASSWORD_SCHEME': 'BLF-CRYPT',\n # Host settings\n 'HOST_IMAP': 'imap',\n 'HOST_POP3': 'imap',\n 'HOST_SMTP': 'smtp',\n 'HOST_WEBMAIL': 'webmail',\n 'HOST_FRONT': 'front',\n 'HOST_AUTHSMTP': os.environ.get('HOST_SMTP', 'smtp'),\n 'SUBNET': '192.168.203.0/24',\n 'POD_ADDRESS_RANGE': None\n}\n\nclass ConfigManager(dict):\n \"\"\" Naive configuration manager that uses environment only\n \"\"\"\n\n DB_TEMPLATES = {\n 'sqlite': 'sqlite:////{SQLITE_DATABASE_FILE}',\n 'postgresql': 'postgresql://{DB_USER}:{DB_PW}@{DB_HOST}/{DB_NAME}',\n 'mysql': 'mysql://{DB_USER}:{DB_PW}@{DB_HOST}/{DB_NAME}'\n }\n\n def __init__(self):\n self.config = dict()\n\n def resolve_host(self):\n self.config['HOST_IMAP'] = resolve(self.config['HOST_IMAP'])\n self.config['HOST_POP3'] = resolve(self.config['HOST_POP3'])\n self.config['HOST_AUTHSMTP'] = resolve(self.config['HOST_AUTHSMTP'])\n self.config['HOST_SMTP'] = resolve(self.config['HOST_SMTP'])\n if self.config['WEBMAIL'] != 'none':\n self.config['HOST_WEBMAIL'] = resolve(self.config['HOST_WEBMAIL'])\n\n def __coerce_value(self, value):\n if isinstance(value, str) and value.lower() in ('true','yes'):\n return True\n elif isinstance(value, str) and value.lower() in ('false', 'no'):\n return False\n return value\n\n def init_app(self, app):\n self.config.update(app.config)\n # get environment variables\n self.config.update({\n key: self.__coerce_value(os.environ.get(key, value))\n for key, value in DEFAULT_CONFIG.items()\n })\n self.resolve_host()\n\n # automatically set the sqlalchemy string\n if self.config['DB_FLAVOR']:\n template = self.DB_TEMPLATES[self.config['DB_FLAVOR']]\n self.config['SQLALCHEMY_DATABASE_URI'] = template.format(**self.config)\n # update the app config itself\n app.config = self\n\n def setdefault(self, key, value):\n if key not in self.config:\n self.config[key] = value\n return self.config[key]\n\n def get(self, *args):\n return self.config.get(*args)\n\n def keys(self):\n return self.config.keys()\n\n def __getitem__(self, key):\n return self.config.get(key)\n\n def __setitem__(self, key, value):\n self.config[key] = value\n\n def __contains__(self, key):\n return key in self.config\n", "path": "core/admin/mailu/configuration.py"}]} | 2,038 | 379 |
gh_patches_debug_594 | rasdani/github-patches | git_diff | pex-tool__pex-1057 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Release 2.1.17
On the docket:
+ [x] TypeError when resolving local platforms. #1043
+ [x] No such file for interpreter's binary name #1009
+ [x] Pex resources leak while bootstrapping pants #1050
+ [x] Pex PEX perf regression #1054
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pex/version.py`
Content:
```
1 # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
2 # Licensed under the Apache License, Version 2.0 (see LICENSE).
3
4 __version__ = "2.1.16"
5
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pex/version.py b/pex/version.py
--- a/pex/version.py
+++ b/pex/version.py
@@ -1,4 +1,4 @@
# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).
# Licensed under the Apache License, Version 2.0 (see LICENSE).
-__version__ = "2.1.16"
+__version__ = "2.1.17"
| {"golden_diff": "diff --git a/pex/version.py b/pex/version.py\n--- a/pex/version.py\n+++ b/pex/version.py\n@@ -1,4 +1,4 @@\n # Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n # Licensed under the Apache License, Version 2.0 (see LICENSE).\n \n-__version__ = \"2.1.16\"\n+__version__ = \"2.1.17\"\n", "issue": "Release 2.1.17\nOn the docket:\r\n+ [x] TypeError when resolving local platforms. #1043\r\n+ [x] No such file for interpreter's binary name #1009\r\n+ [x] Pex resources leak while bootstrapping pants #1050\r\n+ [x] Pex PEX perf regression #1054\r\n\n", "before_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.16\"\n", "path": "pex/version.py"}], "after_files": [{"content": "# Copyright 2015 Pants project contributors (see CONTRIBUTORS.md).\n# Licensed under the Apache License, Version 2.0 (see LICENSE).\n\n__version__ = \"2.1.17\"\n", "path": "pex/version.py"}]} | 391 | 96 |
gh_patches_debug_3456 | rasdani/github-patches | git_diff | CTFd__CTFd-1827 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Set plugin migration version in between each migration
https://github.com/CTFd/CTFd/blob/e1991e16963b10302baa7cc50d52071a5053bf2f/CTFd/plugins/migrations.py#L72-L77
This code here probably should be setting the plugin version in between each migration so that if a migration fails it doesn't need to be started from the beginning again.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `CTFd/plugins/migrations.py`
Content:
```
1 import inspect
2 import os
3
4 from alembic.config import Config
5 from alembic.migration import MigrationContext
6 from alembic.operations import Operations
7 from alembic.script import ScriptDirectory
8 from flask import current_app
9 from sqlalchemy import create_engine, pool
10
11 from CTFd.utils import get_config, set_config
12
13
14 def current(plugin_name=None):
15 if plugin_name is None:
16 # Get the directory name of the plugin if unspecified
17 # Doing it this way doesn't waste the rest of the inspect.stack call
18 frame = inspect.currentframe()
19 caller_info = inspect.getframeinfo(frame.f_back)
20 caller_path = caller_info[0]
21 plugin_name = os.path.basename(os.path.dirname(caller_path))
22
23 return get_config(plugin_name + "_alembic_version")
24
25
26 def upgrade(plugin_name=None, revision=None, lower="current"):
27 database_url = current_app.config.get("SQLALCHEMY_DATABASE_URI")
28 if database_url.startswith("sqlite"):
29 current_app.db.create_all()
30 return
31
32 if plugin_name is None:
33 # Get the directory name of the plugin if unspecified
34 # Doing it this way doesn't waste the rest of the inspect.stack call
35 frame = inspect.currentframe()
36 caller_info = inspect.getframeinfo(frame.f_back)
37 caller_path = caller_info[0]
38 plugin_name = os.path.basename(os.path.dirname(caller_path))
39
40 # Check if the plugin has migrations
41 migrations_path = os.path.join(current_app.plugins_dir, plugin_name, "migrations")
42 if os.path.isdir(migrations_path) is False:
43 return
44
45 engine = create_engine(database_url, poolclass=pool.NullPool)
46 conn = engine.connect()
47 context = MigrationContext.configure(conn)
48 op = Operations(context)
49
50 # Find the list of migrations to run
51 config = Config()
52 config.set_main_option("script_location", migrations_path)
53 config.set_main_option("version_locations", migrations_path)
54 script = ScriptDirectory.from_config(config)
55
56 # Choose base revision for plugin upgrade
57 # "current" points to the current plugin version stored in config
58 # None represents the absolute base layer (e.g. first installation)
59 if lower == "current":
60 lower = get_config(plugin_name + "_alembic_version")
61
62 # Do we upgrade to head or to a specific revision
63 if revision is None:
64 upper = script.get_current_head()
65 else:
66 upper = revision
67
68 # Apply from lower to upper
69 revs = list(script.iterate_revisions(lower=lower, upper=upper))
70 revs.reverse()
71
72 try:
73 for r in revs:
74 with context.begin_transaction():
75 r.module.upgrade(op=op)
76 finally:
77 conn.close()
78
79 # Set the new latest revision
80 set_config(plugin_name + "_alembic_version", upper)
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/CTFd/plugins/migrations.py b/CTFd/plugins/migrations.py
--- a/CTFd/plugins/migrations.py
+++ b/CTFd/plugins/migrations.py
@@ -73,6 +73,9 @@
for r in revs:
with context.begin_transaction():
r.module.upgrade(op=op)
+ # Set revision that succeeded so we don't need
+ # to start from the beginning on failure
+ set_config(plugin_name + "_alembic_version", r.revision)
finally:
conn.close()
| {"golden_diff": "diff --git a/CTFd/plugins/migrations.py b/CTFd/plugins/migrations.py\n--- a/CTFd/plugins/migrations.py\n+++ b/CTFd/plugins/migrations.py\n@@ -73,6 +73,9 @@\n for r in revs:\n with context.begin_transaction():\n r.module.upgrade(op=op)\n+ # Set revision that succeeded so we don't need\n+ # to start from the beginning on failure\n+ set_config(plugin_name + \"_alembic_version\", r.revision)\n finally:\n conn.close()\n", "issue": "Set plugin migration version in between each migration\nhttps://github.com/CTFd/CTFd/blob/e1991e16963b10302baa7cc50d52071a5053bf2f/CTFd/plugins/migrations.py#L72-L77\r\n\r\nThis code here probably should be setting the plugin version in between each migration so that if a migration fails it doesn't need to be started from the beginning again. \n", "before_files": [{"content": "import inspect\nimport os\n\nfrom alembic.config import Config\nfrom alembic.migration import MigrationContext\nfrom alembic.operations import Operations\nfrom alembic.script import ScriptDirectory\nfrom flask import current_app\nfrom sqlalchemy import create_engine, pool\n\nfrom CTFd.utils import get_config, set_config\n\n\ndef current(plugin_name=None):\n if plugin_name is None:\n # Get the directory name of the plugin if unspecified\n # Doing it this way doesn't waste the rest of the inspect.stack call\n frame = inspect.currentframe()\n caller_info = inspect.getframeinfo(frame.f_back)\n caller_path = caller_info[0]\n plugin_name = os.path.basename(os.path.dirname(caller_path))\n\n return get_config(plugin_name + \"_alembic_version\")\n\n\ndef upgrade(plugin_name=None, revision=None, lower=\"current\"):\n database_url = current_app.config.get(\"SQLALCHEMY_DATABASE_URI\")\n if database_url.startswith(\"sqlite\"):\n current_app.db.create_all()\n return\n\n if plugin_name is None:\n # Get the directory name of the plugin if unspecified\n # Doing it this way doesn't waste the rest of the inspect.stack call\n frame = inspect.currentframe()\n caller_info = inspect.getframeinfo(frame.f_back)\n caller_path = caller_info[0]\n plugin_name = os.path.basename(os.path.dirname(caller_path))\n\n # Check if the plugin has migraitons\n migrations_path = os.path.join(current_app.plugins_dir, plugin_name, \"migrations\")\n if os.path.isdir(migrations_path) is False:\n return\n\n engine = create_engine(database_url, poolclass=pool.NullPool)\n conn = engine.connect()\n context = MigrationContext.configure(conn)\n op = Operations(context)\n\n # Find the list of migrations to run\n config = Config()\n config.set_main_option(\"script_location\", migrations_path)\n config.set_main_option(\"version_locations\", migrations_path)\n script = ScriptDirectory.from_config(config)\n\n # Choose base revision for plugin upgrade\n # \"current\" points to the current plugin version stored in config\n # None represents the absolute base layer (e.g. 
first installation)\n if lower == \"current\":\n lower = get_config(plugin_name + \"_alembic_version\")\n\n # Do we upgrade to head or to a specific revision\n if revision is None:\n upper = script.get_current_head()\n else:\n upper = revision\n\n # Apply from lower to upper\n revs = list(script.iterate_revisions(lower=lower, upper=upper))\n revs.reverse()\n\n try:\n for r in revs:\n with context.begin_transaction():\n r.module.upgrade(op=op)\n finally:\n conn.close()\n\n # Set the new latest revision\n set_config(plugin_name + \"_alembic_version\", upper)\n", "path": "CTFd/plugins/migrations.py"}], "after_files": [{"content": "import inspect\nimport os\n\nfrom alembic.config import Config\nfrom alembic.migration import MigrationContext\nfrom alembic.operations import Operations\nfrom alembic.script import ScriptDirectory\nfrom flask import current_app\nfrom sqlalchemy import create_engine, pool\n\nfrom CTFd.utils import get_config, set_config\n\n\ndef current(plugin_name=None):\n if plugin_name is None:\n # Get the directory name of the plugin if unspecified\n # Doing it this way doesn't waste the rest of the inspect.stack call\n frame = inspect.currentframe()\n caller_info = inspect.getframeinfo(frame.f_back)\n caller_path = caller_info[0]\n plugin_name = os.path.basename(os.path.dirname(caller_path))\n\n return get_config(plugin_name + \"_alembic_version\")\n\n\ndef upgrade(plugin_name=None, revision=None, lower=\"current\"):\n database_url = current_app.config.get(\"SQLALCHEMY_DATABASE_URI\")\n if database_url.startswith(\"sqlite\"):\n current_app.db.create_all()\n return\n\n if plugin_name is None:\n # Get the directory name of the plugin if unspecified\n # Doing it this way doesn't waste the rest of the inspect.stack call\n frame = inspect.currentframe()\n caller_info = inspect.getframeinfo(frame.f_back)\n caller_path = caller_info[0]\n plugin_name = os.path.basename(os.path.dirname(caller_path))\n\n # Check if the plugin has migraitons\n migrations_path = os.path.join(current_app.plugins_dir, plugin_name, \"migrations\")\n if os.path.isdir(migrations_path) is False:\n return\n\n engine = create_engine(database_url, poolclass=pool.NullPool)\n conn = engine.connect()\n context = MigrationContext.configure(conn)\n op = Operations(context)\n\n # Find the list of migrations to run\n config = Config()\n config.set_main_option(\"script_location\", migrations_path)\n config.set_main_option(\"version_locations\", migrations_path)\n script = ScriptDirectory.from_config(config)\n\n # Choose base revision for plugin upgrade\n # \"current\" points to the current plugin version stored in config\n # None represents the absolute base layer (e.g. first installation)\n if lower == \"current\":\n lower = get_config(plugin_name + \"_alembic_version\")\n\n # Do we upgrade to head or to a specific revision\n if revision is None:\n upper = script.get_current_head()\n else:\n upper = revision\n\n # Apply from lower to upper\n revs = list(script.iterate_revisions(lower=lower, upper=upper))\n revs.reverse()\n\n try:\n for r in revs:\n with context.begin_transaction():\n r.module.upgrade(op=op)\n # Set revision that succeeded so we don't need\n # to start from the beginning on failure\n set_config(plugin_name + \"_alembic_version\", r.revision)\n finally:\n conn.close()\n\n # Set the new latest revision\n set_config(plugin_name + \"_alembic_version\", upper)\n", "path": "CTFd/plugins/migrations.py"}]} | 1,129 | 122 |
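
The pattern behind this golden diff generalizes: checkpoint progress after every successful step so a failed run can resume instead of restarting from scratch. Below is a minimal, self-contained sketch of that pattern; `set_config`/`get_config` are in-memory stand-ins for CTFd's config helpers, and `Revision` is a simplified stand-in for an Alembic script revision.

```python
# Resumable step execution: record the last step that succeeded so a
# rerun can pick up from there instead of replaying finished steps.
_config = {}


def set_config(key, value):
    _config[key] = value


def get_config(key, default=None):
    return _config.get(key, default)


class Revision:
    def __init__(self, revision, fails=False):
        self.revision = revision
        self._fails = fails

    def upgrade(self):
        if self._fails:
            raise RuntimeError(f"migration {self.revision} failed")
        print(f"applied {self.revision}")


def run_migrations(plugin_name, revs):
    key = plugin_name + "_alembic_version"
    for r in revs:
        r.upgrade()
        # Checkpoint immediately after each successful revision, mirroring
        # the set_config() call the golden diff adds inside the loop.
        set_config(key, r.revision)


revs = [Revision("a1"), Revision("b2"), Revision("c3", fails=True)]
try:
    run_migrations("myplugin", revs)
except RuntimeError as exc:
    print(exc)
print("resume point:", get_config("myplugin_alembic_version"))  # -> b2
```
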
gh_patches_debug_23213 | rasdani/github-patches | git_diff | microsoft__lisa-1567 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Command not found (PATH does not contain /usr/sbin)
Getting errors when using LISAv3 to deploy and test CentOS 7_9 on Azure
`[ERROR] lisa.env[generated_0].node[0].cmd[7289] not found command: Command not found: modinfo. Check that modinfo is installed and on $PATH`
`[ERROR] lisa.env[generated_0].node[0].cmd[1038] not found command: Command not found: waagent. Check that waagent is installed and on $PATH`
`[ERROR] lisa.env[generated_0].node[0].cmd[8629] not found command: Command not found: lsmod. Check that lsmod is installed and on $PATH`
SSHing into the node confirms that all three of these commands are present and runnable on the node.
The error about modinfo missing appears to occur before any tests start running. These errors do not occur when deploying and testing Ubuntu 18.04-LTS.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lisa/tools/modinfo.py`
Content:
```
1 # Copyright (c) Microsoft Corporation.
2 # Licensed under the MIT license.
3
4 import re
5 from typing import Any
6
7 from lisa.executable import Tool
8 from lisa.util import find_patterns_in_lines
9
10
11 class Modinfo(Tool):
12 __version_pattern = re.compile(r"^version:[ \t]*([^ \n]*)")
13
14 @property
15 def command(self) -> str:
16 return self._command
17
18 def _check_exists(self) -> bool:
19 return True
20
21 def _initialize(self, *args: Any, **kwargs: Any) -> None:
22 self._command = "modinfo"
23
24 def get_info(
25 self,
26 mod_name: str,
27 force_run: bool = False,
28 no_info_log: bool = True,
29 no_error_log: bool = True,
30 ) -> str:
31 result = self.run(
32 mod_name,
33 force_run=force_run,
34 no_info_log=no_info_log,
35 no_error_log=no_error_log,
36 )
37 if result.exit_code != 0:
38 # CentOS may not include the path when started,
39 # specify path and try again.
40 self._command = "/usr/sbin/modinfo"
41 result = self.run(
42 mod_name,
43 force_run=force_run,
44 no_info_log=no_info_log,
45 no_error_log=no_error_log,
46 )
47 return result.stdout
48
49 def get_version(
50 self,
51 mod_name: str,
52 force_run: bool = False,
53 no_info_log: bool = True,
54 no_error_log: bool = True,
55 ) -> str:
56 output = self.get_info(
57 mod_name=mod_name,
58 force_run=force_run,
59 no_info_log=no_info_log,
60 no_error_log=no_error_log,
61 )
62 found_version = find_patterns_in_lines(output, [self.__version_pattern])
63 return found_version[0][0] if found_version[0] else ""
64
```
Path: `lisa/tools/lsmod.py`
Content:
```
1 # Copyright (c) Microsoft Corporation.
2 # Licensed under the MIT license.
3
4 import re
5 from typing import Any
6
7 from lisa.executable import Tool
8 from lisa.util import LisaException, find_patterns_in_lines
9
10
11 class Lsmod(Tool):
12 # The output of lsmod command is in below format -
13 # Module Size Used by
14 # fuse 52176 3
15 # cryptd 14125 0
16 # aes_generic 32970 1 aes_i586
17 __output_pattern = re.compile(
18 r"^(?P<name>[^\s]+)\s+(?P<size>[^\s]+)\s+(?P<usedby>.*)?$", re.MULTILINE
19 )
20
21 @property
22 def command(self) -> str:
23 return self._command
24
25 def _check_exists(self) -> bool:
26 return True
27
28 def _initialize(self, *args: Any, **kwargs: Any) -> None:
29 self._command = "lsmod"
30
31 def module_exists(
32 self,
33 mod_name: str = "",
34 force_run: bool = False,
35 no_info_log: bool = True,
36 no_error_log: bool = True,
37 ) -> bool:
38 result = self.run(
39 force_run=force_run,
40 no_info_log=no_info_log,
41 no_error_log=no_error_log,
42 )
43 if result.exit_code != 0:
44 raise LisaException(
45 f"{self._command} command got non-zero exit code: {result.exit_code}"
46 )
47
48 module_info = find_patterns_in_lines(result.stdout, [self.__output_pattern])
49 if any(mod_name in info for sublist in module_info for info in sublist):
50 return True
51
52 return False
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lisa/tools/lsmod.py b/lisa/tools/lsmod.py
--- a/lisa/tools/lsmod.py
+++ b/lisa/tools/lsmod.py
@@ -36,6 +36,7 @@
no_error_log: bool = True,
) -> bool:
result = self.run(
+ sudo=True,
force_run=force_run,
no_info_log=no_info_log,
no_error_log=no_error_log,
diff --git a/lisa/tools/modinfo.py b/lisa/tools/modinfo.py
--- a/lisa/tools/modinfo.py
+++ b/lisa/tools/modinfo.py
@@ -30,20 +30,13 @@
) -> str:
result = self.run(
mod_name,
+ sudo=True,
force_run=force_run,
no_info_log=no_info_log,
no_error_log=no_error_log,
+ expected_exit_code=0,
+ expected_exit_code_failure_message=f"Modinfo failed for module {mod_name}",
)
- if result.exit_code != 0:
- # CentOS may not include the path when started,
- # specify path and try again.
- self._command = "/usr/sbin/modinfo"
- result = self.run(
- mod_name,
- force_run=force_run,
- no_info_log=no_info_log,
- no_error_log=no_error_log,
- )
return result.stdout
def get_version(
| {"golden_diff": "diff --git a/lisa/tools/lsmod.py b/lisa/tools/lsmod.py\n--- a/lisa/tools/lsmod.py\n+++ b/lisa/tools/lsmod.py\n@@ -36,6 +36,7 @@\n no_error_log: bool = True,\n ) -> bool:\n result = self.run(\n+ sudo=True,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\ndiff --git a/lisa/tools/modinfo.py b/lisa/tools/modinfo.py\n--- a/lisa/tools/modinfo.py\n+++ b/lisa/tools/modinfo.py\n@@ -30,20 +30,13 @@\n ) -> str:\n result = self.run(\n mod_name,\n+ sudo=True,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n+ expected_exit_code=0,\n+ expected_exit_code_failure_message=f\"Modinfo failed for module {mod_name}\",\n )\n- if result.exit_code != 0:\n- # CentOS may not include the path when started,\n- # specify path and try again.\n- self._command = \"/usr/sbin/modinfo\"\n- result = self.run(\n- mod_name,\n- force_run=force_run,\n- no_info_log=no_info_log,\n- no_error_log=no_error_log,\n- )\n return result.stdout\n \n def get_version(\n", "issue": "Command not found (PATH does not contain /usr/sbin)\nGetting errors when using LISAv3 to deploy and test CentOS 7_9 on Azure\r\n\r\n`[ERROR] lisa.env[generated_0].node[0].cmd[7289] not found command: Command not found: modinfo. Check that modinfo is installed and on $PATH`\r\n\r\n`[ERROR] lisa.env[generated_0].node[0].cmd[1038] not found command: Command not found: waagent. Check that waagent is installed and on $PATH`\r\n\r\n`[ERROR] lisa.env[generated_0].node[0].cmd[8629] not found command: Command not found: lsmod. Check that lsmod is installed and on $PATH`\r\n\r\nSSHing into the node confirms that all three of these commands are present and runnable on the node.\r\n\r\nThe error about modinfo missing appears to occur before any tests start running. 
These errors do not occur when deploying and testing Ubuntu 18.04-LTS.\n", "before_files": [{"content": "# Copyright (c) Microsoft Corporation.\n# Licensed under the MIT license.\n\nimport re\nfrom typing import Any\n\nfrom lisa.executable import Tool\nfrom lisa.util import find_patterns_in_lines\n\n\nclass Modinfo(Tool):\n __version_pattern = re.compile(r\"^version:[ \\t]*([^ \\n]*)\")\n\n @property\n def command(self) -> str:\n return self._command\n\n def _check_exists(self) -> bool:\n return True\n\n def _initialize(self, *args: Any, **kwargs: Any) -> None:\n self._command = \"modinfo\"\n\n def get_info(\n self,\n mod_name: str,\n force_run: bool = False,\n no_info_log: bool = True,\n no_error_log: bool = True,\n ) -> str:\n result = self.run(\n mod_name,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n )\n if result.exit_code != 0:\n # CentOS may not include the path when started,\n # specify path and try again.\n self._command = \"/usr/sbin/modinfo\"\n result = self.run(\n mod_name,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n )\n return result.stdout\n\n def get_version(\n self,\n mod_name: str,\n force_run: bool = False,\n no_info_log: bool = True,\n no_error_log: bool = True,\n ) -> str:\n output = self.get_info(\n mod_name=mod_name,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n )\n found_version = find_patterns_in_lines(output, [self.__version_pattern])\n return found_version[0][0] if found_version[0] else \"\"\n", "path": "lisa/tools/modinfo.py"}, {"content": "# Copyright (c) Microsoft Corporation.\n# Licensed under the MIT license.\n\nimport re\nfrom typing import Any\n\nfrom lisa.executable import Tool\nfrom lisa.util import LisaException, find_patterns_in_lines\n\n\nclass Lsmod(Tool):\n # The output of lsmod command is in below format -\n # Module Size Used by\n # fuse 52176 3\n # cryptd 14125 0\n # aes_generic 32970 1 aes_i586\n __output_pattern = re.compile(\n r\"^(?P<name>[^\\s]+)\\s+(?P<size>[^\\s]+)\\s+(?P<usedby>.*)?$\", re.MULTILINE\n )\n\n @property\n def command(self) -> str:\n return self._command\n\n def _check_exists(self) -> bool:\n return True\n\n def _initialize(self, *args: Any, **kwargs: Any) -> None:\n self._command = \"lsmod\"\n\n def module_exists(\n self,\n mod_name: str = \"\",\n force_run: bool = False,\n no_info_log: bool = True,\n no_error_log: bool = True,\n ) -> bool:\n result = self.run(\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n )\n if result.exit_code != 0:\n raise LisaException(\n f\"{self._command} command got non-zero exit code: {result.exit_code}\"\n )\n\n module_info = find_patterns_in_lines(result.stdout, [self.__output_pattern])\n if any(mod_name in info for sublist in module_info for info in sublist):\n return True\n\n return False\n", "path": "lisa/tools/lsmod.py"}], "after_files": [{"content": "# Copyright (c) Microsoft Corporation.\n# Licensed under the MIT license.\n\nimport re\nfrom typing import Any\n\nfrom lisa.executable import Tool\nfrom lisa.util import find_patterns_in_lines\n\n\nclass Modinfo(Tool):\n __version_pattern = re.compile(r\"^version:[ \\t]*([^ \\n]*)\")\n\n @property\n def command(self) -> str:\n return self._command\n\n def _check_exists(self) -> bool:\n return True\n\n def _initialize(self, *args: Any, **kwargs: Any) -> None:\n self._command = \"modinfo\"\n\n def get_info(\n self,\n mod_name: str,\n force_run: bool = False,\n no_info_log: bool = True,\n no_error_log: bool = 
True,\n ) -> str:\n result = self.run(\n mod_name,\n sudo=True,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n expected_exit_code=0,\n expected_exit_code_failure_message=f\"Modinfo failed for module {mod_name}\",\n )\n return result.stdout\n\n def get_version(\n self,\n mod_name: str,\n force_run: bool = False,\n no_info_log: bool = True,\n no_error_log: bool = True,\n ) -> str:\n output = self.get_info(\n mod_name=mod_name,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n )\n found_version = find_patterns_in_lines(output, [self.__version_pattern])\n return found_version[0][0] if found_version[0] else \"\"\n", "path": "lisa/tools/modinfo.py"}, {"content": "# Copyright (c) Microsoft Corporation.\n# Licensed under the MIT license.\n\nimport re\nfrom typing import Any\n\nfrom lisa.executable import Tool\nfrom lisa.util import LisaException, find_patterns_in_lines\n\n\nclass Lsmod(Tool):\n # The output of lsmod command is in below format -\n # Module Size Used by\n # fuse 52176 3\n # cryptd 14125 0\n # aes_generic 32970 1 aes_i586\n __output_pattern = re.compile(\n r\"^(?P<name>[^\\s]+)\\s+(?P<size>[^\\s]+)\\s+(?P<usedby>.*)?$\", re.MULTILINE\n )\n\n @property\n def command(self) -> str:\n return self._command\n\n def _check_exists(self) -> bool:\n return True\n\n def _initialize(self, *args: Any, **kwargs: Any) -> None:\n self._command = \"lsmod\"\n\n def module_exists(\n self,\n mod_name: str = \"\",\n force_run: bool = False,\n no_info_log: bool = True,\n no_error_log: bool = True,\n ) -> bool:\n result = self.run(\n sudo=True,\n force_run=force_run,\n no_info_log=no_info_log,\n no_error_log=no_error_log,\n )\n if result.exit_code != 0:\n raise LisaException(\n f\"{self._command} command got non-zero exit code: {result.exit_code}\"\n )\n\n module_info = find_patterns_in_lines(result.stdout, [self.__output_pattern])\n if any(mod_name in info for sublist in module_info for info in sublist):\n return True\n\n return False\n", "path": "lisa/tools/lsmod.py"}]} | 1,526 | 315 |
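
The root cause in this record is environmental rather than a missing package: on RHEL-family distributions such as CentOS 7.9, non-login (non-interactive SSH) shells typically omit `/sbin` and `/usr/sbin` from `PATH`, so `modinfo`, `lsmod`, and `waagent` resolve for an interactive user but not for the test runner. Running through `sudo` sidesteps this because sudo replaces `PATH` with the sudoers `secure_path`, which includes `/usr/sbin`. A standalone sketch of that behavior (independent of LISA's `Tool` class; the module name `hv_netvsc` is only an example):

```python
import subprocess


def modinfo(module: str) -> str:
    try:
        # In a non-login shell this can raise FileNotFoundError because
        # /usr/sbin is not on PATH, even though the binary exists.
        proc = subprocess.run(["modinfo", module], check=True,
                              capture_output=True, text=True)
    except FileNotFoundError:
        # sudo resets PATH to the sudoers secure_path (which contains
        # /usr/sbin on RHEL-family systems), so the lookup succeeds.
        proc = subprocess.run(["sudo", "modinfo", module], check=True,
                              capture_output=True, text=True)
    return proc.stdout


print(modinfo("hv_netvsc").splitlines()[0])
```
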
gh_patches_debug_26655 | rasdani/github-patches | git_diff | scikit-image__scikit-image-6989 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Existing "inpainting" gallery example could use a better (more specific) title.
Creating this issue so we don't lose track of what's been discussed in the conversation _Originally posted by @lagru in https://github.com/scikit-image/scikit-image/pull/6853#discussion_r1149741067_
> @mkcor, just wondering how this relates to [our existing inpainting example ](https://scikit-image.org/docs/dev/auto_examples/filters/plot_inpaint.html#sphx-glr-auto-examples-filters-plot-inpaint-py). I am assuming that the main benefit here is that it's a real world use case?
[...]
> Which prompts the idea that we should update the title of the existing example, so it's less generic than just "inpainting."
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `doc/examples/filters/plot_inpaint.py`
Content:
```
1 """
2 ===========
3 Inpainting
4 ===========
5 Inpainting [1]_ is the process of reconstructing lost or deteriorated
6 parts of images and videos.
7
8 The reconstruction is supposed to be performed in fully automatic way by
9 exploiting the information presented in non-damaged regions.
10
11 In this example, we show how the masked pixels get inpainted by
12 inpainting algorithm based on 'biharmonic equation'-assumption [2]_ [3]_ [4]_.
13
14 .. [1] Wikipedia. Inpainting
15 https://en.wikipedia.org/wiki/Inpainting
16 .. [2] Wikipedia. Biharmonic equation
17 https://en.wikipedia.org/wiki/Biharmonic_equation
18 .. [3] S.B.Damelin and N.S.Hoang. "On Surface Completion and Image
19 Inpainting by Biharmonic Functions: Numerical Aspects",
20 International Journal of Mathematics and Mathematical Sciences,
21 Vol. 2018, Article ID 3950312
22 :DOI:`10.1155/2018/3950312`
23 .. [4] C. K. Chui and H. N. Mhaskar, MRA Contextual-Recovery Extension of
24 Smooth Functions on Manifolds, Appl. and Comp. Harmonic Anal.,
25 28 (2010), 104-113,
26 :DOI:`10.1016/j.acha.2009.04.004`
27 """
28
29 import numpy as np
30 import matplotlib.pyplot as plt
31
32 from skimage import data
33 from skimage.morphology import disk, binary_dilation
34 from skimage.restoration import inpaint
35
36 image_orig = data.astronaut()
37
38 # Create mask with six block defect regions
39 mask = np.zeros(image_orig.shape[:-1], dtype=bool)
40 mask[20:60, 0:20] = 1
41 mask[160:180, 70:155] = 1
42 mask[30:60, 170:195] = 1
43 mask[-60:-30, 170:195] = 1
44 mask[-180:-160, 70:155] = 1
45 mask[-60:-20, 0:20] = 1
46
47 # add a few long, narrow defects
48 mask[200:205, -200:] = 1
49 mask[150:255, 20:23] = 1
50 mask[365:368, 60:130] = 1
51
52 # add randomly positioned small point-like defects
53 rstate = np.random.default_rng(0)
54 for radius in [0, 2, 4]:
55 # larger defects are less common
56 thresh = 3 + 0.25 * radius # make larger defects less common
57 tmp_mask = rstate.standard_normal(image_orig.shape[:-1]) > thresh
58 if radius > 0:
59 tmp_mask = binary_dilation(tmp_mask, disk(radius, dtype=bool))
60 mask[tmp_mask] = 1
61
62 # Apply defect mask to the image over the same region in each color channel
63 image_defect = image_orig * ~mask[..., np.newaxis]
64
65 image_result = inpaint.inpaint_biharmonic(image_defect, mask, channel_axis=-1)
66
67 fig, axes = plt.subplots(ncols=2, nrows=2)
68 ax = axes.ravel()
69
70 ax[0].set_title('Original image')
71 ax[0].imshow(image_orig)
72
73 ax[1].set_title('Mask')
74 ax[1].imshow(mask, cmap=plt.cm.gray)
75
76 ax[2].set_title('Defected image')
77 ax[2].imshow(image_defect)
78
79 ax[3].set_title('Inpainted image')
80 ax[3].imshow(image_result)
81
82 for a in ax:
83 a.axis('off')
84
85 fig.tight_layout()
86 plt.show()
87
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/doc/examples/filters/plot_inpaint.py b/doc/examples/filters/plot_inpaint.py
--- a/doc/examples/filters/plot_inpaint.py
+++ b/doc/examples/filters/plot_inpaint.py
@@ -1,15 +1,16 @@
"""
-===========
-Inpainting
-===========
+===============================
+Fill in defects with inpainting
+===============================
+
Inpainting [1]_ is the process of reconstructing lost or deteriorated
parts of images and videos.
-The reconstruction is supposed to be performed in fully automatic way by
-exploiting the information presented in non-damaged regions.
+The reconstruction (restoration) is performed in an automatic way by
+exploiting the information present in non-damaged regions.
-In this example, we show how the masked pixels get inpainted by
-inpainting algorithm based on 'biharmonic equation'-assumption [2]_ [3]_ [4]_.
+In this example, we show how the masked pixels get inpainted using an
+inpainting algorithm based on the biharmonic equation [2]_ [3]_ [4]_.
.. [1] Wikipedia. Inpainting
https://en.wikipedia.org/wiki/Inpainting
@@ -44,12 +45,12 @@
mask[-180:-160, 70:155] = 1
mask[-60:-20, 0:20] = 1
-# add a few long, narrow defects
+# Add a few long, narrow defects
mask[200:205, -200:] = 1
mask[150:255, 20:23] = 1
mask[365:368, 60:130] = 1
-# add randomly positioned small point-like defects
+# Add randomly positioned small point-like defects
rstate = np.random.default_rng(0)
for radius in [0, 2, 4]:
# larger defects are less common
| {"golden_diff": "diff --git a/doc/examples/filters/plot_inpaint.py b/doc/examples/filters/plot_inpaint.py\n--- a/doc/examples/filters/plot_inpaint.py\n+++ b/doc/examples/filters/plot_inpaint.py\n@@ -1,15 +1,16 @@\n \"\"\"\n-===========\n-Inpainting\n-===========\n+===============================\n+Fill in defects with inpainting\n+===============================\n+\n Inpainting [1]_ is the process of reconstructing lost or deteriorated\n parts of images and videos.\n \n-The reconstruction is supposed to be performed in fully automatic way by\n-exploiting the information presented in non-damaged regions.\n+The reconstruction (restoration) is performed in an automatic way by\n+exploiting the information present in non-damaged regions.\n \n-In this example, we show how the masked pixels get inpainted by\n-inpainting algorithm based on 'biharmonic equation'-assumption [2]_ [3]_ [4]_.\n+In this example, we show how the masked pixels get inpainted using an\n+inpainting algorithm based on the biharmonic equation [2]_ [3]_ [4]_.\n \n .. [1] Wikipedia. Inpainting\n https://en.wikipedia.org/wiki/Inpainting\n@@ -44,12 +45,12 @@\n mask[-180:-160, 70:155] = 1\n mask[-60:-20, 0:20] = 1\n \n-# add a few long, narrow defects\n+# Add a few long, narrow defects\n mask[200:205, -200:] = 1\n mask[150:255, 20:23] = 1\n mask[365:368, 60:130] = 1\n \n-# add randomly positioned small point-like defects\n+# Add randomly positioned small point-like defects\n rstate = np.random.default_rng(0)\n for radius in [0, 2, 4]:\n # larger defects are less common\n", "issue": "Existing \"inpainting\" gallery example could use a better (more specific) title.\nCreating this issue so we don't lose track of what's been discussed in the conversation _Originally posted by @lagru in https://github.com/scikit-image/scikit-image/pull/6853#discussion_r1149741067_\r\n\r\n> @mkcor, just wondering how this relates to [our existing inpainting example ](https://scikit-image.org/docs/dev/auto_examples/filters/plot_inpaint.html#sphx-glr-auto-examples-filters-plot-inpaint-py). I am assuming that the main benefit here is that it's a real world use case?\r\n\r\n[...]\r\n\r\n> Which prompts the idea that we should update the title of the existing example, so it's less generic than just \"inpainting.\"\n", "before_files": [{"content": "\"\"\"\n===========\nInpainting\n===========\nInpainting [1]_ is the process of reconstructing lost or deteriorated\nparts of images and videos.\n\nThe reconstruction is supposed to be performed in fully automatic way by\nexploiting the information presented in non-damaged regions.\n\nIn this example, we show how the masked pixels get inpainted by\ninpainting algorithm based on 'biharmonic equation'-assumption [2]_ [3]_ [4]_.\n\n.. [1] Wikipedia. Inpainting\n https://en.wikipedia.org/wiki/Inpainting\n.. [2] Wikipedia. Biharmonic equation\n https://en.wikipedia.org/wiki/Biharmonic_equation\n.. [3] S.B.Damelin and N.S.Hoang. \"On Surface Completion and Image\n Inpainting by Biharmonic Functions: Numerical Aspects\",\n International Journal of Mathematics and Mathematical Sciences,\n Vol. 2018, Article ID 3950312\n :DOI:`10.1155/2018/3950312`\n.. [4] C. K. Chui and H. N. Mhaskar, MRA Contextual-Recovery Extension of\n Smooth Functions on Manifolds, Appl. and Comp. 
Harmonic Anal.,\n 28 (2010), 104-113,\n :DOI:`10.1016/j.acha.2009.04.004`\n\"\"\"\n\nimport numpy as np\nimport matplotlib.pyplot as plt\n\nfrom skimage import data\nfrom skimage.morphology import disk, binary_dilation\nfrom skimage.restoration import inpaint\n\nimage_orig = data.astronaut()\n\n# Create mask with six block defect regions\nmask = np.zeros(image_orig.shape[:-1], dtype=bool)\nmask[20:60, 0:20] = 1\nmask[160:180, 70:155] = 1\nmask[30:60, 170:195] = 1\nmask[-60:-30, 170:195] = 1\nmask[-180:-160, 70:155] = 1\nmask[-60:-20, 0:20] = 1\n\n# add a few long, narrow defects\nmask[200:205, -200:] = 1\nmask[150:255, 20:23] = 1\nmask[365:368, 60:130] = 1\n\n# add randomly positioned small point-like defects\nrstate = np.random.default_rng(0)\nfor radius in [0, 2, 4]:\n # larger defects are less common\n thresh = 3 + 0.25 * radius # make larger defects less common\n tmp_mask = rstate.standard_normal(image_orig.shape[:-1]) > thresh\n if radius > 0:\n tmp_mask = binary_dilation(tmp_mask, disk(radius, dtype=bool))\n mask[tmp_mask] = 1\n\n# Apply defect mask to the image over the same region in each color channel\nimage_defect = image_orig * ~mask[..., np.newaxis]\n\nimage_result = inpaint.inpaint_biharmonic(image_defect, mask, channel_axis=-1)\n\nfig, axes = plt.subplots(ncols=2, nrows=2)\nax = axes.ravel()\n\nax[0].set_title('Original image')\nax[0].imshow(image_orig)\n\nax[1].set_title('Mask')\nax[1].imshow(mask, cmap=plt.cm.gray)\n\nax[2].set_title('Defected image')\nax[2].imshow(image_defect)\n\nax[3].set_title('Inpainted image')\nax[3].imshow(image_result)\n\nfor a in ax:\n a.axis('off')\n\nfig.tight_layout()\nplt.show()\n", "path": "doc/examples/filters/plot_inpaint.py"}], "after_files": [{"content": "\"\"\"\n===============================\nFill in defects with inpainting\n===============================\n\nInpainting [1]_ is the process of reconstructing lost or deteriorated\nparts of images and videos.\n\nThe reconstruction (restoration) is performed in an automatic way by\nexploiting the information present in non-damaged regions.\n\nIn this example, we show how the masked pixels get inpainted using an\ninpainting algorithm based on the biharmonic equation [2]_ [3]_ [4]_.\n\n.. [1] Wikipedia. Inpainting\n https://en.wikipedia.org/wiki/Inpainting\n.. [2] Wikipedia. Biharmonic equation\n https://en.wikipedia.org/wiki/Biharmonic_equation\n.. [3] S.B.Damelin and N.S.Hoang. \"On Surface Completion and Image\n Inpainting by Biharmonic Functions: Numerical Aspects\",\n International Journal of Mathematics and Mathematical Sciences,\n Vol. 2018, Article ID 3950312\n :DOI:`10.1155/2018/3950312`\n.. [4] C. K. Chui and H. N. Mhaskar, MRA Contextual-Recovery Extension of\n Smooth Functions on Manifolds, Appl. and Comp. 
Harmonic Anal.,\n 28 (2010), 104-113,\n :DOI:`10.1016/j.acha.2009.04.004`\n\"\"\"\n\nimport numpy as np\nimport matplotlib.pyplot as plt\n\nfrom skimage import data\nfrom skimage.morphology import disk, binary_dilation\nfrom skimage.restoration import inpaint\n\nimage_orig = data.astronaut()\n\n# Create mask with six block defect regions\nmask = np.zeros(image_orig.shape[:-1], dtype=bool)\nmask[20:60, 0:20] = 1\nmask[160:180, 70:155] = 1\nmask[30:60, 170:195] = 1\nmask[-60:-30, 170:195] = 1\nmask[-180:-160, 70:155] = 1\nmask[-60:-20, 0:20] = 1\n\n# Add a few long, narrow defects\nmask[200:205, -200:] = 1\nmask[150:255, 20:23] = 1\nmask[365:368, 60:130] = 1\n\n# Add randomly positioned small point-like defects\nrstate = np.random.default_rng(0)\nfor radius in [0, 2, 4]:\n # larger defects are less common\n thresh = 3 + 0.25 * radius # make larger defects less common\n tmp_mask = rstate.standard_normal(image_orig.shape[:-1]) > thresh\n if radius > 0:\n tmp_mask = binary_dilation(tmp_mask, disk(radius, dtype=bool))\n mask[tmp_mask] = 1\n\n# Apply defect mask to the image over the same region in each color channel\nimage_defect = image_orig * ~mask[..., np.newaxis]\n\nimage_result = inpaint.inpaint_biharmonic(image_defect, mask, channel_axis=-1)\n\nfig, axes = plt.subplots(ncols=2, nrows=2)\nax = axes.ravel()\n\nax[0].set_title('Original image')\nax[0].imshow(image_orig)\n\nax[1].set_title('Mask')\nax[1].imshow(mask, cmap=plt.cm.gray)\n\nax[2].set_title('Defected image')\nax[2].imshow(image_defect)\n\nax[3].set_title('Inpainted image')\nax[3].imshow(image_result)\n\nfor a in ax:\n a.axis('off')\n\nfig.tight_layout()\nplt.show()\n", "path": "doc/examples/filters/plot_inpaint.py"}]} | 1,489 | 454 |
gh_patches_debug_21673 | rasdani/github-patches | git_diff | ivy-llc__ivy-13280 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
unwrap
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py`
Content:
```
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py b/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py
--- a/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py
+++ b/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py
@@ -0,0 +1,48 @@
+# global
+import ivy
+
+# local
+from ivy.functional.frontends.numpy.func_wrapper import (
+ to_ivy_arrays_and_back,
+ handle_numpy_dtype,
+ from_zero_dim_arrays_to_scalar,
+ handle_numpy_out,
+)
+
+
+
+@handle_numpy_out
+@handle_numpy_dtype
+@to_ivy_arrays_and_back
+@from_zero_dim_arrays_to_scalar
+def unwrap(p, discont=None, axis=-1, *, period=2*pi):
+ p = ivy.Array.asarray(p)
+ nd = p.ndim
+ dd = ivy.diff(p, axis=axis)
+ if discont is None:
+ discont = period/2
+ slice1 = [ivy.slice(None, None)]*nd # full slices
+ slice1[axis] = ivy.slice(1, None)
+ slice1 = ivy.tuple(slice1)
+ dtype = ivy.result_type(dd, period)
+ if ivy.issubdtype(dtype, ivy.integer):
+ interval_high, rem = ivy.divmod(period, 2)
+ boundary_ambiguous = rem == 0
+ else:
+ interval_high = period / 2
+ boundary_ambiguous = True
+ interval_low = -interval_high
+ ddmod = ivy.mod(dd - interval_low, period) + interval_low
+ if boundary_ambiguous:
+ ivy.copyto(ddmod, interval_high,
+ where=(ddmod == interval_low) & (dd > 0))
+ ph_correct = ddmod - dd
+ ivy.copyto(ph_correct, 0, where=ivy.abs(dd) < discont)
+ up = ivy.array(p, copy=True, dtype=dtype)
+ up[slice1] = p[slice1] + ph_correct.cumsum(axis)
+ return up
+
+my_list = [24,8,3,4,34,8]
+ans = unwrap(my_list)
+print("After the np.unwrap()")
+print(ans)
\ No newline at end of file
| {"golden_diff": "diff --git a/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py b/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py\n--- a/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py\n+++ b/ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py\n@@ -0,0 +1,48 @@\n+# global\n+import ivy\n+\n+# local\n+from ivy.functional.frontends.numpy.func_wrapper import (\n+ to_ivy_arrays_and_back,\n+ handle_numpy_dtype,\n+ from_zero_dim_arrays_to_scalar,\n+ handle_numpy_out,\n+)\n+\n+\n+\n+@handle_numpy_out\n+@handle_numpy_dtype\n+@to_ivy_arrays_and_back\n+@from_zero_dim_arrays_to_scalar\n+def unwrap(p, discont=None, axis=-1, *, period=2*pi):\n+ p = ivy.Array.asarray(p)\n+ nd = p.ndim\n+ dd = ivy.diff(p, axis=axis)\n+ if discont is None:\n+ discont = period/2\n+ slice1 = [ivy.slice(None, None)]*nd # full slices\n+ slice1[axis] = ivy.slice(1, None)\n+ slice1 = ivy.tuple(slice1)\n+ dtype = ivy.result_type(dd, period)\n+ if ivy.issubdtype(dtype, ivy.integer):\n+ interval_high, rem = ivy.divmod(period, 2)\n+ boundary_ambiguous = rem == 0\n+ else:\n+ interval_high = period / 2\n+ boundary_ambiguous = True\n+ interval_low = -interval_high\n+ ddmod = ivy.mod(dd - interval_low, period) + interval_low\n+ if boundary_ambiguous:\n+ ivy.copyto(ddmod, interval_high,\n+ where=(ddmod == interval_low) & (dd > 0))\n+ ph_correct = ddmod - dd\n+ ivy.copyto(ph_correct, 0, where=ivy.abs(dd) < discont)\n+ up = ivy.array(p, copy=True, dtype=dtype)\n+ up[slice1] = p[slice1] + ph_correct.cumsum(axis)\n+ return up\n+\n+my_list = [24,8,3,4,34,8]\n+ans = unwrap(my_list)\n+print(\"After the np.unwrap()\")\n+print(ans)\n\\ No newline at end of file\n", "issue": "unwrap\n\n", "before_files": [{"content": "", "path": "ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py"}], "after_files": [{"content": "# global\nimport ivy\n\n# local\nfrom ivy.functional.frontends.numpy.func_wrapper import (\n to_ivy_arrays_and_back,\n handle_numpy_dtype,\n from_zero_dim_arrays_to_scalar,\n handle_numpy_out,\n)\n\n\n\n@handle_numpy_out\n@handle_numpy_dtype\n@to_ivy_arrays_and_back\n@from_zero_dim_arrays_to_scalar\ndef unwrap(p, discont=None, axis=-1, *, period=2*pi):\n p = ivy.Array.asarray(p)\n nd = p.ndim\n dd = ivy.diff(p, axis=axis)\n if discont is None:\n discont = period/2\n slice1 = [ivy.slice(None, None)]*nd # full slices\n slice1[axis] = ivy.slice(1, None)\n slice1 = ivy.tuple(slice1)\n dtype = ivy.result_type(dd, period)\n if ivy.issubdtype(dtype, ivy.integer):\n interval_high, rem = ivy.divmod(period, 2)\n boundary_ambiguous = rem == 0\n else:\n interval_high = period / 2\n boundary_ambiguous = True\n interval_low = -interval_high\n ddmod = ivy.mod(dd - interval_low, period) + interval_low\n if boundary_ambiguous:\n ivy.copyto(ddmod, interval_high,\n where=(ddmod == interval_low) & (dd > 0))\n ph_correct = ddmod - dd\n ivy.copyto(ph_correct, 0, where=ivy.abs(dd) < discont)\n up = ivy.array(p, copy=True, dtype=dtype)\n up[slice1] = p[slice1] + ph_correct.cumsum(axis)\n return up\n\nmy_list = [24,8,3,4,34,8]\nans = unwrap(my_list)\nprint(\"After the np.unwrap()\")\nprint(ans)", "path": "ivy/functional/frontends/numpy/mathematical_functions/other_special_functions.py"}]} | 271 | 552 |
gh_patches_debug_18052 | rasdani/github-patches | git_diff | scikit-image__scikit-image-4064 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
travis has random failures on rank filters
## Description
See for ex: https://travis-ci.org/scikit-image/scikit-image/jobs/563363217
## Way to reproduce
```python
# Place the full code we need to recreate your issue here
# upload all necessary images to github too!
```
## Version information
```python
# Paste the output of the following python commands
from __future__ import print_function
import sys; print(sys.version)
import platform; print(platform.platform())
import skimage; print("scikit-image version: {}".format(skimage.__version__))
import numpy; print("numpy version: {}".format(numpy.__version__))
```
```python
# your output here
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `skimage/_shared/_warnings.py`
Content:
```
1 from contextlib import contextmanager
2 import sys
3 import warnings
4 import re
5 import os
6
7 __all__ = ['all_warnings', 'expected_warnings', 'warn']
8
9
10 def warn(message, category=None, stacklevel=2):
11 """A version of `warnings.warn` with a default stacklevel of 2.
12 """
13 if category is not None:
14 warnings.warn(message, category=category, stacklevel=stacklevel)
15 else:
16 warnings.warn(message, stacklevel=stacklevel)
17
18
19 @contextmanager
20 def all_warnings():
21 """
22 Context for use in testing to ensure that all warnings are raised.
23
24 Examples
25 --------
26 >>> import warnings
27 >>> def foo():
28 ... warnings.warn(RuntimeWarning("bar"))
29
30 We raise the warning once, while the warning filter is set to "once".
31 Hereafter, the warning is invisible, even with custom filters:
32
33 >>> with warnings.catch_warnings():
34 ... warnings.simplefilter('once')
35 ... foo()
36
37 We can now run ``foo()`` without a warning being raised:
38
39 >>> from numpy.testing import assert_warns
40 >>> foo()
41
42 To catch the warning, we call in the help of ``all_warnings``:
43
44 >>> with all_warnings():
45 ... assert_warns(RuntimeWarning, foo)
46 """
47 # _warnings.py is on the critical import path.
48 # Since this is a testing only function, we lazy import inspect.
49 import inspect
50 # Whenever a warning is triggered, Python adds a __warningregistry__
51 # member to the *calling* module. The exercize here is to find
52 # and eradicate all those breadcrumbs that were left lying around.
53 #
54 # We proceed by first searching all parent calling frames and explicitly
55 # clearing their warning registries (necessary for the doctests above to
56 # pass). Then, we search for all submodules of skimage and clear theirs
57 # as well (necessary for the skimage test suite to pass).
58
59 frame = inspect.currentframe()
60 if frame:
61 for f in inspect.getouterframes(frame):
62 f[0].f_locals['__warningregistry__'] = {}
63 del frame
64
65 for mod_name, mod in list(sys.modules.items()):
66 try:
67 mod.__warningregistry__.clear()
68 except AttributeError:
69 pass
70
71 with warnings.catch_warnings(record=True) as w:
72 warnings.simplefilter("always")
73 yield w
74
75
76 @contextmanager
77 def expected_warnings(matching):
78 r"""Context for use in testing to catch known warnings matching regexes
79
80 Parameters
81 ----------
82 matching : list of strings or compiled regexes
83 Regexes for the desired warning to catch
84
85 Examples
86 --------
87 >>> import numpy as np
88 >>> image = np.random.randint(0, 2**16, size=(100, 100), dtype=np.uint16)
89 >>> # rank filters are slow when bit-depth exceeds 10 bits
90 >>> from skimage import filters
91 >>> with expected_warnings(['Bad rank filter performance']):
92 ... median_filtered = filters.rank.median(image)
93
94 Notes
95 -----
96 Uses `all_warnings` to ensure all warnings are raised.
97 Upon exiting, it checks the recorded warnings for the desired matching
98 pattern(s).
99 Raises a ValueError if any match was not found or an unexpected
100 warning was raised.
101 Allows for three types of behaviors: `and`, `or`, and `optional` matches.
102 This is done to accommodate different build environments or loop conditions
103 that may produce different warnings. The behaviors can be combined.
104 If you pass multiple patterns, you get an orderless `and`, where all of the
105 warnings must be raised.
106 If you use the `|` operator in a pattern, you can catch one of several
107 warnings.
108 Finally, you can use `|\A\Z` in a pattern to signify it as optional.
109
110 """
111 if isinstance(matching, str):
112 raise ValueError('``matching`` should be a list of strings and not '
113 'a string itself.')
114
115 strict_warnings = os.environ.get('SKIMAGE_TEST_STRICT_WARNINGS', '1')
116 if strict_warnings.lower() == 'true':
117 strict_warnings = True
118 elif strict_warnings.lower() == 'false':
119 strict_warnings = False
120 else:
121 strict_warnings = bool(int(strict_warnings))
122
123 with all_warnings() as w:
124 # enter context
125 yield w
126 # exited user context, check the recorded warnings
127 # Allow users to provide None
128 while None in matching:
129 matching.remove(None)
130 remaining = [m for m in matching if r'\A\Z' not in m.split('|')]
131 for warn in w:
132 found = False
133 for match in matching:
134 if re.search(match, str(warn.message)) is not None:
135 found = True
136 if match in remaining:
137 remaining.remove(match)
138 if strict_warnings and not found:
139 raise ValueError('Unexpected warning: %s' % str(warn.message))
140 if strict_warnings and (len(remaining) > 0):
141 msg = 'No warning raised matching:\n%s' % '\n'.join(remaining)
142 raise ValueError(msg)
143
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/skimage/_shared/_warnings.py b/skimage/_shared/_warnings.py
--- a/skimage/_shared/_warnings.py
+++ b/skimage/_shared/_warnings.py
@@ -79,8 +79,9 @@
Parameters
----------
- matching : list of strings or compiled regexes
+ matching : None or a list of strings or compiled regexes
Regexes for the desired warning to catch
+ If matching is None, this behaves as a no-op.
Examples
--------
@@ -112,6 +113,11 @@
raise ValueError('``matching`` should be a list of strings and not '
'a string itself.')
+ # Special case for disabling the context manager
+ if matching is None:
+ yield None
+ return
+
strict_warnings = os.environ.get('SKIMAGE_TEST_STRICT_WARNINGS', '1')
if strict_warnings.lower() == 'true':
strict_warnings = True
| {"golden_diff": "diff --git a/skimage/_shared/_warnings.py b/skimage/_shared/_warnings.py\n--- a/skimage/_shared/_warnings.py\n+++ b/skimage/_shared/_warnings.py\n@@ -79,8 +79,9 @@\n \n Parameters\n ----------\n- matching : list of strings or compiled regexes\n+ matching : None or a list of strings or compiled regexes\n Regexes for the desired warning to catch\n+ If matching is None, this behaves as a no-op.\n \n Examples\n --------\n@@ -112,6 +113,11 @@\n raise ValueError('``matching`` should be a list of strings and not '\n 'a string itself.')\n \n+ # Special case for disabling the context manager\n+ if matching is None:\n+ yield None\n+ return\n+\n strict_warnings = os.environ.get('SKIMAGE_TEST_STRICT_WARNINGS', '1')\n if strict_warnings.lower() == 'true':\n strict_warnings = True\n", "issue": "travis has random failures on rank filters\n## Description\r\n\r\nSee for ex: https://travis-ci.org/scikit-image/scikit-image/jobs/563363217\r\n\r\n\r\n## Way to reproduce\r\n```python\r\n# Place the full code we need to recreate your issue here\r\n# upload all necessary images to github too!\r\n```\r\n\r\n\r\n## Version information\r\n```python\r\n# Paste the output of the following python commands\r\nfrom __future__ import print_function\r\nimport sys; print(sys.version)\r\nimport platform; print(platform.platform())\r\nimport skimage; print(\"scikit-image version: {}\".format(skimage.__version__))\r\nimport numpy; print(\"numpy version: {}\".format(numpy.__version__))\r\n```\r\n\r\n```python\r\n# your output here\r\n\r\n```\r\n\r\n\n", "before_files": [{"content": "from contextlib import contextmanager\nimport sys\nimport warnings\nimport re\nimport os\n\n__all__ = ['all_warnings', 'expected_warnings', 'warn']\n\n\ndef warn(message, category=None, stacklevel=2):\n \"\"\"A version of `warnings.warn` with a default stacklevel of 2.\n \"\"\"\n if category is not None:\n warnings.warn(message, category=category, stacklevel=stacklevel)\n else:\n warnings.warn(message, stacklevel=stacklevel)\n\n\n@contextmanager\ndef all_warnings():\n \"\"\"\n Context for use in testing to ensure that all warnings are raised.\n\n Examples\n --------\n >>> import warnings\n >>> def foo():\n ... warnings.warn(RuntimeWarning(\"bar\"))\n\n We raise the warning once, while the warning filter is set to \"once\".\n Hereafter, the warning is invisible, even with custom filters:\n\n >>> with warnings.catch_warnings():\n ... warnings.simplefilter('once')\n ... foo()\n\n We can now run ``foo()`` without a warning being raised:\n\n >>> from numpy.testing import assert_warns\n >>> foo()\n\n To catch the warning, we call in the help of ``all_warnings``:\n\n >>> with all_warnings():\n ... assert_warns(RuntimeWarning, foo)\n \"\"\"\n # _warnings.py is on the critical import path.\n # Since this is a testing only function, we lazy import inspect.\n import inspect\n # Whenever a warning is triggered, Python adds a __warningregistry__\n # member to the *calling* module. The exercize here is to find\n # and eradicate all those breadcrumbs that were left lying around.\n #\n # We proceed by first searching all parent calling frames and explicitly\n # clearing their warning registries (necessary for the doctests above to\n # pass). 
Then, we search for all submodules of skimage and clear theirs\n # as well (necessary for the skimage test suite to pass).\n\n frame = inspect.currentframe()\n if frame:\n for f in inspect.getouterframes(frame):\n f[0].f_locals['__warningregistry__'] = {}\n del frame\n\n for mod_name, mod in list(sys.modules.items()):\n try:\n mod.__warningregistry__.clear()\n except AttributeError:\n pass\n\n with warnings.catch_warnings(record=True) as w:\n warnings.simplefilter(\"always\")\n yield w\n\n\n@contextmanager\ndef expected_warnings(matching):\n r\"\"\"Context for use in testing to catch known warnings matching regexes\n\n Parameters\n ----------\n matching : list of strings or compiled regexes\n Regexes for the desired warning to catch\n\n Examples\n --------\n >>> import numpy as np\n >>> image = np.random.randint(0, 2**16, size=(100, 100), dtype=np.uint16)\n >>> # rank filters are slow when bit-depth exceeds 10 bits\n >>> from skimage import filters\n >>> with expected_warnings(['Bad rank filter performance']):\n ... median_filtered = filters.rank.median(image)\n\n Notes\n -----\n Uses `all_warnings` to ensure all warnings are raised.\n Upon exiting, it checks the recorded warnings for the desired matching\n pattern(s).\n Raises a ValueError if any match was not found or an unexpected\n warning was raised.\n Allows for three types of behaviors: `and`, `or`, and `optional` matches.\n This is done to accommodate different build environments or loop conditions\n that may produce different warnings. The behaviors can be combined.\n If you pass multiple patterns, you get an orderless `and`, where all of the\n warnings must be raised.\n If you use the `|` operator in a pattern, you can catch one of several\n warnings.\n Finally, you can use `|\\A\\Z` in a pattern to signify it as optional.\n\n \"\"\"\n if isinstance(matching, str):\n raise ValueError('``matching`` should be a list of strings and not '\n 'a string itself.')\n\n strict_warnings = os.environ.get('SKIMAGE_TEST_STRICT_WARNINGS', '1')\n if strict_warnings.lower() == 'true':\n strict_warnings = True\n elif strict_warnings.lower() == 'false':\n strict_warnings = False\n else:\n strict_warnings = bool(int(strict_warnings))\n\n with all_warnings() as w:\n # enter context\n yield w\n # exited user context, check the recorded warnings\n # Allow users to provide None\n while None in matching:\n matching.remove(None)\n remaining = [m for m in matching if r'\\A\\Z' not in m.split('|')]\n for warn in w:\n found = False\n for match in matching:\n if re.search(match, str(warn.message)) is not None:\n found = True\n if match in remaining:\n remaining.remove(match)\n if strict_warnings and not found:\n raise ValueError('Unexpected warning: %s' % str(warn.message))\n if strict_warnings and (len(remaining) > 0):\n msg = 'No warning raised matching:\\n%s' % '\\n'.join(remaining)\n raise ValueError(msg)\n", "path": "skimage/_shared/_warnings.py"}], "after_files": [{"content": "from contextlib import contextmanager\nimport sys\nimport warnings\nimport re\nimport os\n\n__all__ = ['all_warnings', 'expected_warnings', 'warn']\n\n\ndef warn(message, category=None, stacklevel=2):\n \"\"\"A version of `warnings.warn` with a default stacklevel of 2.\n \"\"\"\n if category is not None:\n warnings.warn(message, category=category, stacklevel=stacklevel)\n else:\n warnings.warn(message, stacklevel=stacklevel)\n\n\n@contextmanager\ndef all_warnings():\n \"\"\"\n Context for use in testing to ensure that all warnings are raised.\n\n Examples\n --------\n >>> import 
warnings\n >>> def foo():\n ... warnings.warn(RuntimeWarning(\"bar\"))\n\n We raise the warning once, while the warning filter is set to \"once\".\n Hereafter, the warning is invisible, even with custom filters:\n\n >>> with warnings.catch_warnings():\n ... warnings.simplefilter('once')\n ... foo()\n\n We can now run ``foo()`` without a warning being raised:\n\n >>> from numpy.testing import assert_warns\n >>> foo()\n\n To catch the warning, we call in the help of ``all_warnings``:\n\n >>> with all_warnings():\n ... assert_warns(RuntimeWarning, foo)\n \"\"\"\n # _warnings.py is on the critical import path.\n # Since this is a testing only function, we lazy import inspect.\n import inspect\n # Whenever a warning is triggered, Python adds a __warningregistry__\n # member to the *calling* module. The exercize here is to find\n # and eradicate all those breadcrumbs that were left lying around.\n #\n # We proceed by first searching all parent calling frames and explicitly\n # clearing their warning registries (necessary for the doctests above to\n # pass). Then, we search for all submodules of skimage and clear theirs\n # as well (necessary for the skimage test suite to pass).\n\n frame = inspect.currentframe()\n if frame:\n for f in inspect.getouterframes(frame):\n f[0].f_locals['__warningregistry__'] = {}\n del frame\n\n for mod_name, mod in list(sys.modules.items()):\n try:\n mod.__warningregistry__.clear()\n except AttributeError:\n pass\n\n with warnings.catch_warnings(record=True) as w:\n warnings.simplefilter(\"always\")\n yield w\n\n\n@contextmanager\ndef expected_warnings(matching):\n r\"\"\"Context for use in testing to catch known warnings matching regexes\n\n Parameters\n ----------\n matching : None or a list of strings or compiled regexes\n Regexes for the desired warning to catch\n If matching is None, this behaves as a no-op.\n\n Examples\n --------\n >>> import numpy as np\n >>> image = np.random.randint(0, 2**16, size=(100, 100), dtype=np.uint16)\n >>> # rank filters are slow when bit-depth exceeds 10 bits\n >>> from skimage import filters\n >>> with expected_warnings(['Bad rank filter performance']):\n ... median_filtered = filters.rank.median(image)\n\n Notes\n -----\n Uses `all_warnings` to ensure all warnings are raised.\n Upon exiting, it checks the recorded warnings for the desired matching\n pattern(s).\n Raises a ValueError if any match was not found or an unexpected\n warning was raised.\n Allows for three types of behaviors: `and`, `or`, and `optional` matches.\n This is done to accommodate different build environments or loop conditions\n that may produce different warnings. 
The behaviors can be combined.\n If you pass multiple patterns, you get an orderless `and`, where all of the\n warnings must be raised.\n If you use the `|` operator in a pattern, you can catch one of several\n warnings.\n Finally, you can use `|\\A\\Z` in a pattern to signify it as optional.\n\n \"\"\"\n if isinstance(matching, str):\n raise ValueError('``matching`` should be a list of strings and not '\n 'a string itself.')\n\n # Special case for disabling the context manager\n if matching is None:\n yield None\n return\n\n strict_warnings = os.environ.get('SKIMAGE_TEST_STRICT_WARNINGS', '1')\n if strict_warnings.lower() == 'true':\n strict_warnings = True\n elif strict_warnings.lower() == 'false':\n strict_warnings = False\n else:\n strict_warnings = bool(int(strict_warnings))\n\n with all_warnings() as w:\n # enter context\n yield w\n # exited user context, check the recorded warnings\n # Allow users to provide None\n while None in matching:\n matching.remove(None)\n remaining = [m for m in matching if r'\\A\\Z' not in m.split('|')]\n for warn in w:\n found = False\n for match in matching:\n if re.search(match, str(warn.message)) is not None:\n found = True\n if match in remaining:\n remaining.remove(match)\n if strict_warnings and not found:\n raise ValueError('Unexpected warning: %s' % str(warn.message))\n if strict_warnings and (len(remaining) > 0):\n msg = 'No warning raised matching:\\n%s' % '\\n'.join(remaining)\n raise ValueError(msg)\n", "path": "skimage/_shared/_warnings.py"}]} | 1,855 | 219 |
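The `matching is None` special case in the patch above is small but easy to get wrong. Below is a minimal, self-contained sketch of the pattern; it is simplified from the real `expected_warnings` (which matches with `re.search` and honors a strictness flag), and the substring check is an assumption made for brevity.

```python
import warnings
from contextlib import contextmanager

@contextmanager
def expected_warnings(matching):
    # Mirrors the patch: None disables the context manager entirely.
    if matching is None:
        yield None
        return
    with warnings.catch_warnings(record=True) as record:
        warnings.simplefilter("always")
        yield record
    # Simplified check: every pattern must occur in some recorded message.
    for pattern in matching:
        assert any(pattern in str(r.message) for r in record), pattern

with expected_warnings(None):            # no-op: nothing is checked
    warnings.warn("anything at all")

with expected_warnings(["performance"]):
    warnings.warn("Bad rank filter performance")
```

Without the early return, `None` falls through to the `while None in matching` loop on exit and raises `TypeError` instead of acting as a no-op.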
gh_patches_debug_25191 | rasdani/github-patches | git_diff | scipy__scipy-6119 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
DeprecationWarnings in stats on python 3.5
```
/home/br/repos/scipy/build/testenv/lib/python3.5/site-packages/scipy/stats/tests/test_stats.py:101: DeprecationWarning: Please use assertRaisesRegex instead.
```
Apparently, `assertRaisesRegexp` was renamed to `assertRaisesRegex`: https://docs.python.org/3/library/unittest.html#unittest.TestCase.assertRaisesRegexp
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `scipy/_lib/_numpy_compat.py`
Content:
```
1 """Functions copypasted from newer versions of numpy.
2
3 """
4 from __future__ import division, print_function, absolute_import
5
6 import warnings
7
8 import numpy as np
9
10 from scipy._lib._version import NumpyVersion
11
12 if NumpyVersion(np.__version__) > '1.7.0.dev':
13 _assert_warns = np.testing.assert_warns
14 else:
15 def _assert_warns(warning_class, func, *args, **kw):
16 r"""
17 Fail unless the given callable throws the specified warning.
18
19 This definition is copypasted from numpy 1.9.0.dev.
20 The version in earlier numpy returns None.
21
22 Parameters
23 ----------
24 warning_class : class
25 The class defining the warning that `func` is expected to throw.
26 func : callable
27 The callable to test.
28 *args : Arguments
29 Arguments passed to `func`.
30 **kwargs : Kwargs
31 Keyword arguments passed to `func`.
32
33 Returns
34 -------
35 The value returned by `func`.
36
37 """
38 with warnings.catch_warnings(record=True) as l:
39 warnings.simplefilter('always')
40 result = func(*args, **kw)
41 if not len(l) > 0:
42 raise AssertionError("No warning raised when calling %s"
43 % func.__name__)
44 if not l[0].category is warning_class:
45 raise AssertionError("First warning for %s is not a "
46 "%s( is %s)" % (func.__name__, warning_class, l[0]))
47 return result
48
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/scipy/_lib/_numpy_compat.py b/scipy/_lib/_numpy_compat.py
--- a/scipy/_lib/_numpy_compat.py
+++ b/scipy/_lib/_numpy_compat.py
@@ -4,8 +4,10 @@
from __future__ import division, print_function, absolute_import
import warnings
+import sys
import numpy as np
+from numpy.testing.nosetester import import_nose
from scipy._lib._version import NumpyVersion
@@ -45,3 +47,28 @@
raise AssertionError("First warning for %s is not a "
"%s( is %s)" % (func.__name__, warning_class, l[0]))
return result
+
+
+def assert_raises_regex(exception_class, expected_regexp,
+ callable_obj=None, *args, **kwargs):
+ """
+ Fail unless an exception of class exception_class and with message that
+ matches expected_regexp is thrown by callable when invoked with arguments
+ args and keyword arguments kwargs.
+ Name of this function adheres to Python 3.2+ reference, but should work in
+ all versions down to 2.6.
+ Notes
+ -----
+ .. versionadded:: 1.8.0
+ """
+ __tracebackhide__ = True # Hide traceback for py.test
+ nose = import_nose()
+
+ if sys.version_info.major >= 3:
+ funcname = nose.tools.assert_raises_regex
+ else:
+ # Only present in Python 2.7, missing from unittest in 2.6
+ funcname = nose.tools.assert_raises_regexp
+
+ return funcname(exception_class, expected_regexp, callable_obj,
+ *args, **kwargs)
| {"golden_diff": "diff --git a/scipy/_lib/_numpy_compat.py b/scipy/_lib/_numpy_compat.py\n--- a/scipy/_lib/_numpy_compat.py\n+++ b/scipy/_lib/_numpy_compat.py\n@@ -4,8 +4,10 @@\n from __future__ import division, print_function, absolute_import\n \n import warnings\n+import sys\n \n import numpy as np\n+from numpy.testing.nosetester import import_nose\n \n from scipy._lib._version import NumpyVersion\n \n@@ -45,3 +47,28 @@\n raise AssertionError(\"First warning for %s is not a \"\n \"%s( is %s)\" % (func.__name__, warning_class, l[0]))\n return result\n+\n+\n+def assert_raises_regex(exception_class, expected_regexp,\n+ callable_obj=None, *args, **kwargs):\n+ \"\"\"\n+ Fail unless an exception of class exception_class and with message that\n+ matches expected_regexp is thrown by callable when invoked with arguments\n+ args and keyword arguments kwargs.\n+ Name of this function adheres to Python 3.2+ reference, but should work in\n+ all versions down to 2.6.\n+ Notes\n+ -----\n+ .. versionadded:: 1.8.0\n+ \"\"\"\n+ __tracebackhide__ = True # Hide traceback for py.test\n+ nose = import_nose()\n+\n+ if sys.version_info.major >= 3:\n+ funcname = nose.tools.assert_raises_regex\n+ else:\n+ # Only present in Python 2.7, missing from unittest in 2.6\n+ funcname = nose.tools.assert_raises_regexp\n+\n+ return funcname(exception_class, expected_regexp, callable_obj,\n+ *args, **kwargs)\n", "issue": "DeprecationWarnings in stats on python 3.5\n```\n/home/br/repos/scipy/build/testenv/lib/python3.5/site-packages/scipy/stats/tests/test_stats.py:101: DeprecationWarning: Please use assertRaisesRegex instead.\n```\n\nApparently, `assertRaisesRegexp` was renamed to `assertRaisesRegex`: https://docs.python.org/3/library/unittest.html#unittest.TestCase.assertRaisesRegexp\n\n", "before_files": [{"content": "\"\"\"Functions copypasted from newer versions of numpy.\n\n\"\"\"\nfrom __future__ import division, print_function, absolute_import\n\nimport warnings\n\nimport numpy as np\n\nfrom scipy._lib._version import NumpyVersion\n\nif NumpyVersion(np.__version__) > '1.7.0.dev':\n _assert_warns = np.testing.assert_warns\nelse:\n def _assert_warns(warning_class, func, *args, **kw):\n r\"\"\"\n Fail unless the given callable throws the specified warning.\n\n This definition is copypasted from numpy 1.9.0.dev.\n The version in earlier numpy returns None.\n\n Parameters\n ----------\n warning_class : class\n The class defining the warning that `func` is expected to throw.\n func : callable\n The callable to test.\n *args : Arguments\n Arguments passed to `func`.\n **kwargs : Kwargs\n Keyword arguments passed to `func`.\n\n Returns\n -------\n The value returned by `func`.\n\n \"\"\"\n with warnings.catch_warnings(record=True) as l:\n warnings.simplefilter('always')\n result = func(*args, **kw)\n if not len(l) > 0:\n raise AssertionError(\"No warning raised when calling %s\"\n % func.__name__)\n if not l[0].category is warning_class:\n raise AssertionError(\"First warning for %s is not a \"\n \"%s( is %s)\" % (func.__name__, warning_class, l[0]))\n return result\n", "path": "scipy/_lib/_numpy_compat.py"}], "after_files": [{"content": "\"\"\"Functions copypasted from newer versions of numpy.\n\n\"\"\"\nfrom __future__ import division, print_function, absolute_import\n\nimport warnings\nimport sys\n\nimport numpy as np\nfrom numpy.testing.nosetester import import_nose\n\nfrom scipy._lib._version import NumpyVersion\n\nif NumpyVersion(np.__version__) > '1.7.0.dev':\n _assert_warns = np.testing.assert_warns\nelse:\n def 
_assert_warns(warning_class, func, *args, **kw):\n r\"\"\"\n Fail unless the given callable throws the specified warning.\n\n This definition is copypasted from numpy 1.9.0.dev.\n The version in earlier numpy returns None.\n\n Parameters\n ----------\n warning_class : class\n The class defining the warning that `func` is expected to throw.\n func : callable\n The callable to test.\n *args : Arguments\n Arguments passed to `func`.\n **kwargs : Kwargs\n Keyword arguments passed to `func`.\n\n Returns\n -------\n The value returned by `func`.\n\n \"\"\"\n with warnings.catch_warnings(record=True) as l:\n warnings.simplefilter('always')\n result = func(*args, **kw)\n if not len(l) > 0:\n raise AssertionError(\"No warning raised when calling %s\"\n % func.__name__)\n if not l[0].category is warning_class:\n raise AssertionError(\"First warning for %s is not a \"\n \"%s( is %s)\" % (func.__name__, warning_class, l[0]))\n return result\n\n\ndef assert_raises_regex(exception_class, expected_regexp,\n callable_obj=None, *args, **kwargs):\n \"\"\"\n Fail unless an exception of class exception_class and with message that\n matches expected_regexp is thrown by callable when invoked with arguments\n args and keyword arguments kwargs.\n Name of this function adheres to Python 3.2+ reference, but should work in\n all versions down to 2.6.\n Notes\n -----\n .. versionadded:: 1.8.0\n \"\"\"\n __tracebackhide__ = True # Hide traceback for py.test\n nose = import_nose()\n\n if sys.version_info.major >= 3:\n funcname = nose.tools.assert_raises_regex\n else:\n # Only present in Python 2.7, missing from unittest in 2.6\n funcname = nose.tools.assert_raises_regexp\n\n return funcname(exception_class, expected_regexp, callable_obj,\n *args, **kwargs)\n", "path": "scipy/_lib/_numpy_compat.py"}]} | 769 | 387 |
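The shim in this record dispatches through `nose.tools`, but the rename it papers over lives in the standard library itself: `unittest` renamed `assertRaisesRegexp` to `assertRaisesRegex` in Python 3.2. A hedged, nose-free sketch of the same version dispatch (illustrative only; `_Probe` is a throwaway name, not part of the patch):

```python
import sys
import unittest

class _Probe(unittest.TestCase):
    def runTest(self):  # needed so a bare TestCase() can be instantiated
        pass

case = _Probe()
assert_raises_regex = (
    case.assertRaisesRegex          # Python 3.2+ spelling
    if sys.version_info >= (3, 2)
    else case.assertRaisesRegexp    # Python 2.7 spelling
)

with assert_raises_regex(ValueError, r"bad .* value"):
    raise ValueError("bad input value")
```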
gh_patches_debug_34963 | rasdani/github-patches | git_diff | adfinis__timed-backend-925 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
bug(auth): requests to the api with an invalid token receive a response status 500 instead of 401
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `timed/authentication.py`
Content:
```
1 import base64
2 import functools
3 import hashlib
4
5 import requests
6 from django.conf import settings
7 from django.core.cache import cache
8 from django.core.exceptions import SuspiciousOperation
9 from django.utils.encoding import force_bytes
10 from mozilla_django_oidc.auth import LOGGER, OIDCAuthenticationBackend
11
12
13 class TimedOIDCAuthenticationBackend(OIDCAuthenticationBackend):
14 def get_introspection(self, access_token, id_token, payload):
15 """Return user details dictionary."""
16
17 basic = base64.b64encode(
18 f"{settings.OIDC_RP_INTROSPECT_CLIENT_ID}:{settings.OIDC_RP_INTROSPECT_CLIENT_SECRET}".encode(
19 "utf-8"
20 )
21 ).decode()
22 headers = {
23 "Authorization": f"Basic {basic}",
24 "Content-Type": "application/x-www-form-urlencoded",
25 }
26 response = requests.post(
27 settings.OIDC_OP_INTROSPECT_ENDPOINT,
28 verify=settings.OIDC_VERIFY_SSL,
29 headers=headers,
30 data={"token": access_token},
31 )
32 response.raise_for_status()
33 return response.json()
34
35 def get_userinfo_or_introspection(self, access_token):
36 try:
37 claims = self.cached_request(
38 self.get_userinfo, access_token, "auth.userinfo"
39 )
40 except requests.HTTPError as e:
41 if not (
42 e.response.status_code in [401, 403] and settings.OIDC_CHECK_INTROSPECT
43 ):
44 raise e
45
46 # check introspection if userinfo fails (confidental client)
47 claims = self.cached_request(
48 self.get_introspection, access_token, "auth.introspection"
49 )
50 if "client_id" not in claims:
51 raise SuspiciousOperation("client_id not present in introspection")
52
53 return claims
54
55 def get_or_create_user(self, access_token, id_token, payload):
56 """Verify claims and return user, otherwise raise an Exception."""
57
58 claims = self.get_userinfo_or_introspection(access_token)
59
60 users = self.filter_users_by_claims(claims)
61
62 if len(users) == 1:
63 user = users.get()
64 self.update_user_from_claims(user, claims)
65 return user
66 elif settings.OIDC_CREATE_USER:
67 return self.create_user(claims)
68 else:
69 LOGGER.debug(
70 "Login failed: No user with username %s found, and "
71 "OIDC_CREATE_USER is False",
72 self.get_username(claims),
73 )
74 return None
75
76 def update_user_from_claims(self, user, claims):
77 user.email = claims.get(settings.OIDC_EMAIL_CLAIM, "")
78 user.first_name = claims.get(settings.OIDC_FIRSTNAME_CLAIM, "")
79 user.last_name = claims.get(settings.OIDC_LASTNAME_CLAIM, "")
80 user.save()
81
82 def filter_users_by_claims(self, claims):
83 username = self.get_username(claims)
84 return self.UserModel.objects.filter(username__iexact=username)
85
86 def cached_request(self, method, token, cache_prefix):
87 token_hash = hashlib.sha256(force_bytes(token)).hexdigest()
88
89 func = functools.partial(method, token, None, None)
90
91 return cache.get_or_set(
92 f"{cache_prefix}.{token_hash}",
93 func,
94 timeout=settings.OIDC_BEARER_TOKEN_REVALIDATION_TIME,
95 )
96
97 def create_user(self, claims):
98 """Return object for a newly created user account."""
99
100 username = self.get_username(claims)
101 email = claims.get(settings.OIDC_EMAIL_CLAIM, "")
102 first_name = claims.get(settings.OIDC_FIRSTNAME_CLAIM, "")
103 last_name = claims.get(settings.OIDC_LASTNAME_CLAIM, "")
104
105 return self.UserModel.objects.create(
106 username=username, email=email, first_name=first_name, last_name=last_name
107 )
108
109 def get_username(self, claims):
110 try:
111 return claims[settings.OIDC_USERNAME_CLAIM]
112 except KeyError:
113 raise SuspiciousOperation("Couldn't find username claim")
114
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/timed/authentication.py b/timed/authentication.py
--- a/timed/authentication.py
+++ b/timed/authentication.py
@@ -8,6 +8,7 @@
from django.core.exceptions import SuspiciousOperation
from django.utils.encoding import force_bytes
from mozilla_django_oidc.auth import LOGGER, OIDCAuthenticationBackend
+from rest_framework.exceptions import AuthenticationFailed
class TimedOIDCAuthenticationBackend(OIDCAuthenticationBackend):
@@ -37,20 +38,29 @@
claims = self.cached_request(
self.get_userinfo, access_token, "auth.userinfo"
)
+ return claims
except requests.HTTPError as e:
- if not (
- e.response.status_code in [401, 403] and settings.OIDC_CHECK_INTROSPECT
- ):
+ if e.response.status_code not in [401, 403]:
raise e
-
- # check introspection if userinfo fails (confidental client)
- claims = self.cached_request(
- self.get_introspection, access_token, "auth.introspection"
- )
- if "client_id" not in claims:
- raise SuspiciousOperation("client_id not present in introspection")
-
- return claims
+ if settings.OIDC_CHECK_INTROSPECT:
+ try:
+ # check introspection if userinfo fails (confidential client)
+ claims = self.cached_request(
+ self.get_introspection, access_token, "auth.introspection"
+ )
+ if "client_id" not in claims:
+ raise SuspiciousOperation(
+ "client_id not present in introspection"
+ )
+ return claims
+ except requests.HTTPError as e:
+ # if the authorization fails it's not a valid client or
+ # the token is expired and permission is denied.
+ # Handing on the 401 Client Error would be transformed into
+ # a 500 by Django's exception handling. But that's not what we want.
+ if e.response.status_code not in [401, 403]: # pragma: no cover
+ raise e
+ raise AuthenticationFailed()
def get_or_create_user(self, access_token, id_token, payload):
"""Verify claims and return user, otherwise raise an Exception."""
| {"golden_diff": "diff --git a/timed/authentication.py b/timed/authentication.py\n--- a/timed/authentication.py\n+++ b/timed/authentication.py\n@@ -8,6 +8,7 @@\n from django.core.exceptions import SuspiciousOperation\n from django.utils.encoding import force_bytes\n from mozilla_django_oidc.auth import LOGGER, OIDCAuthenticationBackend\n+from rest_framework.exceptions import AuthenticationFailed\n \n \n class TimedOIDCAuthenticationBackend(OIDCAuthenticationBackend):\n@@ -37,20 +38,29 @@\n claims = self.cached_request(\n self.get_userinfo, access_token, \"auth.userinfo\"\n )\n+ return claims\n except requests.HTTPError as e:\n- if not (\n- e.response.status_code in [401, 403] and settings.OIDC_CHECK_INTROSPECT\n- ):\n+ if e.response.status_code not in [401, 403]:\n raise e\n-\n- # check introspection if userinfo fails (confidental client)\n- claims = self.cached_request(\n- self.get_introspection, access_token, \"auth.introspection\"\n- )\n- if \"client_id\" not in claims:\n- raise SuspiciousOperation(\"client_id not present in introspection\")\n-\n- return claims\n+ if settings.OIDC_CHECK_INTROSPECT:\n+ try:\n+ # check introspection if userinfo fails (confidential client)\n+ claims = self.cached_request(\n+ self.get_introspection, access_token, \"auth.introspection\"\n+ )\n+ if \"client_id\" not in claims:\n+ raise SuspiciousOperation(\n+ \"client_id not present in introspection\"\n+ )\n+ return claims\n+ except requests.HTTPError as e:\n+ # if the authorization fails it's not a valid client or\n+ # the token is expired and permission is denied.\n+ # Handing on the 401 Client Error would be transformed into\n+ # a 500 by Django's exception handling. But that's not what we want.\n+ if e.response.status_code not in [401, 403]: # pragma: no cover\n+ raise e\n+ raise AuthenticationFailed()\n \n def get_or_create_user(self, access_token, id_token, payload):\n \"\"\"Verify claims and return user, otherwise raise an Exception.\"\"\"\n", "issue": "bug(auth): requests to the api with an invalid token receive a response status 500 instead of 401\n\n", "before_files": [{"content": "import base64\nimport functools\nimport hashlib\n\nimport requests\nfrom django.conf import settings\nfrom django.core.cache import cache\nfrom django.core.exceptions import SuspiciousOperation\nfrom django.utils.encoding import force_bytes\nfrom mozilla_django_oidc.auth import LOGGER, OIDCAuthenticationBackend\n\n\nclass TimedOIDCAuthenticationBackend(OIDCAuthenticationBackend):\n def get_introspection(self, access_token, id_token, payload):\n \"\"\"Return user details dictionary.\"\"\"\n\n basic = base64.b64encode(\n f\"{settings.OIDC_RP_INTROSPECT_CLIENT_ID}:{settings.OIDC_RP_INTROSPECT_CLIENT_SECRET}\".encode(\n \"utf-8\"\n )\n ).decode()\n headers = {\n \"Authorization\": f\"Basic {basic}\",\n \"Content-Type\": \"application/x-www-form-urlencoded\",\n }\n response = requests.post(\n settings.OIDC_OP_INTROSPECT_ENDPOINT,\n verify=settings.OIDC_VERIFY_SSL,\n headers=headers,\n data={\"token\": access_token},\n )\n response.raise_for_status()\n return response.json()\n\n def get_userinfo_or_introspection(self, access_token):\n try:\n claims = self.cached_request(\n self.get_userinfo, access_token, \"auth.userinfo\"\n )\n except requests.HTTPError as e:\n if not (\n e.response.status_code in [401, 403] and settings.OIDC_CHECK_INTROSPECT\n ):\n raise e\n\n # check introspection if userinfo fails (confidental client)\n claims = self.cached_request(\n self.get_introspection, access_token, \"auth.introspection\"\n )\n if 
\"client_id\" not in claims:\n raise SuspiciousOperation(\"client_id not present in introspection\")\n\n return claims\n\n def get_or_create_user(self, access_token, id_token, payload):\n \"\"\"Verify claims and return user, otherwise raise an Exception.\"\"\"\n\n claims = self.get_userinfo_or_introspection(access_token)\n\n users = self.filter_users_by_claims(claims)\n\n if len(users) == 1:\n user = users.get()\n self.update_user_from_claims(user, claims)\n return user\n elif settings.OIDC_CREATE_USER:\n return self.create_user(claims)\n else:\n LOGGER.debug(\n \"Login failed: No user with username %s found, and \"\n \"OIDC_CREATE_USER is False\",\n self.get_username(claims),\n )\n return None\n\n def update_user_from_claims(self, user, claims):\n user.email = claims.get(settings.OIDC_EMAIL_CLAIM, \"\")\n user.first_name = claims.get(settings.OIDC_FIRSTNAME_CLAIM, \"\")\n user.last_name = claims.get(settings.OIDC_LASTNAME_CLAIM, \"\")\n user.save()\n\n def filter_users_by_claims(self, claims):\n username = self.get_username(claims)\n return self.UserModel.objects.filter(username__iexact=username)\n\n def cached_request(self, method, token, cache_prefix):\n token_hash = hashlib.sha256(force_bytes(token)).hexdigest()\n\n func = functools.partial(method, token, None, None)\n\n return cache.get_or_set(\n f\"{cache_prefix}.{token_hash}\",\n func,\n timeout=settings.OIDC_BEARER_TOKEN_REVALIDATION_TIME,\n )\n\n def create_user(self, claims):\n \"\"\"Return object for a newly created user account.\"\"\"\n\n username = self.get_username(claims)\n email = claims.get(settings.OIDC_EMAIL_CLAIM, \"\")\n first_name = claims.get(settings.OIDC_FIRSTNAME_CLAIM, \"\")\n last_name = claims.get(settings.OIDC_LASTNAME_CLAIM, \"\")\n\n return self.UserModel.objects.create(\n username=username, email=email, first_name=first_name, last_name=last_name\n )\n\n def get_username(self, claims):\n try:\n return claims[settings.OIDC_USERNAME_CLAIM]\n except KeyError:\n raise SuspiciousOperation(\"Couldn't find username claim\")\n", "path": "timed/authentication.py"}], "after_files": [{"content": "import base64\nimport functools\nimport hashlib\n\nimport requests\nfrom django.conf import settings\nfrom django.core.cache import cache\nfrom django.core.exceptions import SuspiciousOperation\nfrom django.utils.encoding import force_bytes\nfrom mozilla_django_oidc.auth import LOGGER, OIDCAuthenticationBackend\nfrom rest_framework.exceptions import AuthenticationFailed\n\n\nclass TimedOIDCAuthenticationBackend(OIDCAuthenticationBackend):\n def get_introspection(self, access_token, id_token, payload):\n \"\"\"Return user details dictionary.\"\"\"\n\n basic = base64.b64encode(\n f\"{settings.OIDC_RP_INTROSPECT_CLIENT_ID}:{settings.OIDC_RP_INTROSPECT_CLIENT_SECRET}\".encode(\n \"utf-8\"\n )\n ).decode()\n headers = {\n \"Authorization\": f\"Basic {basic}\",\n \"Content-Type\": \"application/x-www-form-urlencoded\",\n }\n response = requests.post(\n settings.OIDC_OP_INTROSPECT_ENDPOINT,\n verify=settings.OIDC_VERIFY_SSL,\n headers=headers,\n data={\"token\": access_token},\n )\n response.raise_for_status()\n return response.json()\n\n def get_userinfo_or_introspection(self, access_token):\n try:\n claims = self.cached_request(\n self.get_userinfo, access_token, \"auth.userinfo\"\n )\n return claims\n except requests.HTTPError as e:\n if e.response.status_code not in [401, 403]:\n raise e\n if settings.OIDC_CHECK_INTROSPECT:\n try:\n # check introspection if userinfo fails (confidential client)\n claims = self.cached_request(\n 
self.get_introspection, access_token, \"auth.introspection\"\n )\n if \"client_id\" not in claims:\n raise SuspiciousOperation(\n \"client_id not present in introspection\"\n )\n return claims\n except requests.HTTPError as e:\n # if the authorization fails it's not a valid client or\n # the token is expired and permission is denied.\n # Handing on the 401 Client Error would be transformed into\n # a 500 by Django's exception handling. But that's not what we want.\n if e.response.status_code not in [401, 403]: # pragma: no cover\n raise e\n raise AuthenticationFailed()\n\n def get_or_create_user(self, access_token, id_token, payload):\n \"\"\"Verify claims and return user, otherwise raise an Exception.\"\"\"\n\n claims = self.get_userinfo_or_introspection(access_token)\n\n users = self.filter_users_by_claims(claims)\n\n if len(users) == 1:\n user = users.get()\n self.update_user_from_claims(user, claims)\n return user\n elif settings.OIDC_CREATE_USER:\n return self.create_user(claims)\n else:\n LOGGER.debug(\n \"Login failed: No user with username %s found, and \"\n \"OIDC_CREATE_USER is False\",\n self.get_username(claims),\n )\n return None\n\n def update_user_from_claims(self, user, claims):\n user.email = claims.get(settings.OIDC_EMAIL_CLAIM, \"\")\n user.first_name = claims.get(settings.OIDC_FIRSTNAME_CLAIM, \"\")\n user.last_name = claims.get(settings.OIDC_LASTNAME_CLAIM, \"\")\n user.save()\n\n def filter_users_by_claims(self, claims):\n username = self.get_username(claims)\n return self.UserModel.objects.filter(username__iexact=username)\n\n def cached_request(self, method, token, cache_prefix):\n token_hash = hashlib.sha256(force_bytes(token)).hexdigest()\n\n func = functools.partial(method, token, None, None)\n\n return cache.get_or_set(\n f\"{cache_prefix}.{token_hash}\",\n func,\n timeout=settings.OIDC_BEARER_TOKEN_REVALIDATION_TIME,\n )\n\n def create_user(self, claims):\n \"\"\"Return object for a newly created user account.\"\"\"\n\n username = self.get_username(claims)\n email = claims.get(settings.OIDC_EMAIL_CLAIM, \"\")\n first_name = claims.get(settings.OIDC_FIRSTNAME_CLAIM, \"\")\n last_name = claims.get(settings.OIDC_LASTNAME_CLAIM, \"\")\n\n return self.UserModel.objects.create(\n username=username, email=email, first_name=first_name, last_name=last_name\n )\n\n def get_username(self, claims):\n try:\n return claims[settings.OIDC_USERNAME_CLAIM]\n except KeyError:\n raise SuspiciousOperation(\"Couldn't find username claim\")\n", "path": "timed/authentication.py"}]} | 1,383 | 520 |
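The heart of this fix is translating an upstream 401/403 into DRF's `AuthenticationFailed` rather than letting `requests.HTTPError` escape, since Django's exception handling turns an uncaught exception into a 500. A minimal sketch of that pattern, assuming `djangorestframework` is installed; the helper name is illustrative, not part of the codebase:

```python
import requests
from rest_framework.exceptions import AuthenticationFailed

def claims_or_401(fetch_claims, access_token):
    """Return claims, or raise AuthenticationFailed for a rejected token."""
    try:
        return fetch_claims(access_token)
    except requests.HTTPError as exc:
        if exc.response.status_code in (401, 403):
            # Invalid or expired token: a client error, not a server crash.
            raise AuthenticationFailed() from exc
        raise  # any other upstream failure should still surface loudly
```

Applied around both the userinfo call and the introspection fallback, a rejected token now yields a 401 response on either path.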
gh_patches_debug_20812 | rasdani/github-patches | git_diff | ipython__ipython-5202 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
node != nodejs within Debian packages
As part of resolving https://github.com/ipython/nbviewer/issues/196 (and https://github.com/ipython/nbviewer/pull/194), @ahmadia and I ended up finding out that Debian-based Linux distributions build the `node` binary as `nodejs`.
IPython nbconvert defaults to using `node`, which is actually `ax25-node` on Debian-based systems. [See relevant posting on the Debian mailing list for more](https://lists.debian.org/debian-devel-announce/2012/07/msg00002.html).
This won't affect users of nvm (who provide `node`) or those who build from source. This will affect certain strains of Ubuntu (Saucy Salamander was what I used to test).
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `IPython/nbconvert/filters/markdown.py`
Content:
```
1 """Markdown filters
2 This file contains a collection of utility filters for dealing with
3 markdown within Jinja templates.
4 """
5 #-----------------------------------------------------------------------------
6 # Copyright (c) 2013, the IPython Development Team.
7 #
8 # Distributed under the terms of the Modified BSD License.
9 #
10 # The full license is in the file COPYING.txt, distributed with this software.
11 #-----------------------------------------------------------------------------
12
13 #-----------------------------------------------------------------------------
14 # Imports
15 #-----------------------------------------------------------------------------
16 from __future__ import print_function
17
18 # Stdlib imports
19 import os
20 import subprocess
21 from io import TextIOWrapper, BytesIO
22
23 # IPython imports
24 from IPython.nbconvert.utils.pandoc import pandoc
25 from IPython.nbconvert.utils.exceptions import ConversionException
26 from IPython.utils.process import find_cmd, FindCmdError
27 from IPython.utils.py3compat import cast_bytes
28
29 #-----------------------------------------------------------------------------
30 # Functions
31 #-----------------------------------------------------------------------------
32 marked = os.path.join(os.path.dirname(__file__), "marked.js")
33
34 __all__ = [
35 'markdown2html',
36 'markdown2html_pandoc',
37 'markdown2html_marked',
38 'markdown2latex',
39 'markdown2rst',
40 ]
41
42 class NodeJSMissing(ConversionException):
43 """Exception raised when node.js is missing."""
44 pass
45
46 def markdown2latex(source):
47 """Convert a markdown string to LaTeX via pandoc.
48
49 This function will raise an error if pandoc is not installed.
50 Any error messages generated by pandoc are printed to stderr.
51
52 Parameters
53 ----------
54 source : string
55 Input string, assumed to be valid markdown.
56
57 Returns
58 -------
59 out : string
60 Output as returned by pandoc.
61 """
62 return pandoc(source, 'markdown', 'latex')
63
64 def markdown2html_pandoc(source):
65 """Convert a markdown string to HTML via pandoc"""
66 return pandoc(source, 'markdown', 'html', extra_args=['--mathjax'])
67
68 def markdown2html_marked(source, encoding='utf-8'):
69 """Convert a markdown string to HTML via marked"""
70 command = ['node', marked]
71 try:
72 p = subprocess.Popen(command,
73 stdin=subprocess.PIPE, stdout=subprocess.PIPE
74 )
75 except OSError as e:
76 raise NodeJSMissing(
77 "The command '%s' returned an error: %s.\n" % (" ".join(command), e) +
78 "Please check that Node.js is installed."
79 )
80 out, _ = p.communicate(cast_bytes(source, encoding))
81 out = TextIOWrapper(BytesIO(out), encoding, 'replace').read()
82 return out.rstrip('\n')
83
84 def markdown2rst(source):
85 """Convert a markdown string to LaTeX via pandoc.
86
87 This function will raise an error if pandoc is not installed.
88 Any error messages generated by pandoc are printed to stderr.
89
90 Parameters
91 ----------
92 source : string
93 Input string, assumed to be valid markdown.
94
95 Returns
96 -------
97 out : string
98 Output as returned by pandoc.
99 """
100 return pandoc(source, 'markdown', 'rst')
101
102 try:
103 find_cmd('node')
104 except FindCmdError:
105 markdown2html = markdown2html_pandoc
106 else:
107 markdown2html = markdown2html_marked
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/IPython/nbconvert/filters/markdown.py b/IPython/nbconvert/filters/markdown.py
--- a/IPython/nbconvert/filters/markdown.py
+++ b/IPython/nbconvert/filters/markdown.py
@@ -67,7 +67,7 @@
def markdown2html_marked(source, encoding='utf-8'):
"""Convert a markdown string to HTML via marked"""
- command = ['node', marked]
+ command = [node_cmd, marked]
try:
p = subprocess.Popen(command,
stdin=subprocess.PIPE, stdout=subprocess.PIPE
@@ -99,9 +99,18 @@
"""
return pandoc(source, 'markdown', 'rst')
+# prefer md2html via marked if node.js is available
+# node is called nodejs on debian, so try that first
+node_cmd = 'nodejs'
try:
- find_cmd('node')
+ find_cmd(node_cmd)
except FindCmdError:
- markdown2html = markdown2html_pandoc
+ node_cmd = 'node'
+ try:
+ find_cmd(node_cmd)
+ except FindCmdError:
+ markdown2html = markdown2html_pandoc
+ else:
+ markdown2html = markdown2html_marked
else:
markdown2html = markdown2html_marked
| {"golden_diff": "diff --git a/IPython/nbconvert/filters/markdown.py b/IPython/nbconvert/filters/markdown.py\n--- a/IPython/nbconvert/filters/markdown.py\n+++ b/IPython/nbconvert/filters/markdown.py\n@@ -67,7 +67,7 @@\n \n def markdown2html_marked(source, encoding='utf-8'):\n \"\"\"Convert a markdown string to HTML via marked\"\"\"\n- command = ['node', marked]\n+ command = [node_cmd, marked]\n try:\n p = subprocess.Popen(command,\n stdin=subprocess.PIPE, stdout=subprocess.PIPE\n@@ -99,9 +99,18 @@\n \"\"\"\n return pandoc(source, 'markdown', 'rst')\n \n+# prefer md2html via marked if node.js is available\n+# node is called nodejs on debian, so try that first\n+node_cmd = 'nodejs'\n try:\n- find_cmd('node')\n+ find_cmd(node_cmd)\n except FindCmdError:\n- markdown2html = markdown2html_pandoc\n+ node_cmd = 'node'\n+ try:\n+ find_cmd(node_cmd)\n+ except FindCmdError:\n+ markdown2html = markdown2html_pandoc\n+ else:\n+ markdown2html = markdown2html_marked\n else:\n markdown2html = markdown2html_marked\n", "issue": "node != nodejs within Debian packages\nAs part of resolving https://github.com/ipython/nbviewer/issues/196, (and https://github.com/ipython/nbviewer/pull/194), @ahmadia and I ended up finding out that Debian based Linux Distributions build the `node` binary as `nodejs`.\n\nIPython nbconvert defaults to using `node`, which is actually `ax25-node` on Debian based systems. [See relevant posting on the Debian mailing list for more](https://lists.debian.org/debian-devel-announce/2012/07/msg00002.html).\n\nThis won't affect users of nvm (who provide `node`) or those who build from source. This will affect certain strains of Ubuntu (Saucy Salamander was what I used to test).\n\n", "before_files": [{"content": "\"\"\"Markdown filters\nThis file contains a collection of utility filters for dealing with \nmarkdown within Jinja templates.\n\"\"\"\n#-----------------------------------------------------------------------------\n# Copyright (c) 2013, the IPython Development Team.\n#\n# Distributed under the terms of the Modified BSD License.\n#\n# The full license is in the file COPYING.txt, distributed with this software.\n#-----------------------------------------------------------------------------\n\n#-----------------------------------------------------------------------------\n# Imports\n#-----------------------------------------------------------------------------\nfrom __future__ import print_function\n\n# Stdlib imports\nimport os\nimport subprocess\nfrom io import TextIOWrapper, BytesIO\n\n# IPython imports\nfrom IPython.nbconvert.utils.pandoc import pandoc\nfrom IPython.nbconvert.utils.exceptions import ConversionException\nfrom IPython.utils.process import find_cmd, FindCmdError\nfrom IPython.utils.py3compat import cast_bytes\n\n#-----------------------------------------------------------------------------\n# Functions\n#-----------------------------------------------------------------------------\nmarked = os.path.join(os.path.dirname(__file__), \"marked.js\")\n\n__all__ = [\n 'markdown2html',\n 'markdown2html_pandoc',\n 'markdown2html_marked',\n 'markdown2latex',\n 'markdown2rst',\n]\n\nclass NodeJSMissing(ConversionException):\n \"\"\"Exception raised when node.js is missing.\"\"\"\n pass\n\ndef markdown2latex(source):\n \"\"\"Convert a markdown string to LaTeX via pandoc.\n\n This function will raise an error if pandoc is not installed.\n Any error messages generated by pandoc are printed to stderr.\n\n Parameters\n ----------\n source : string\n Input string, assumed to be valid 
markdown.\n\n Returns\n -------\n out : string\n Output as returned by pandoc.\n \"\"\"\n return pandoc(source, 'markdown', 'latex')\n\ndef markdown2html_pandoc(source):\n \"\"\"Convert a markdown string to HTML via pandoc\"\"\"\n return pandoc(source, 'markdown', 'html', extra_args=['--mathjax'])\n\ndef markdown2html_marked(source, encoding='utf-8'):\n \"\"\"Convert a markdown string to HTML via marked\"\"\"\n command = ['node', marked]\n try:\n p = subprocess.Popen(command,\n stdin=subprocess.PIPE, stdout=subprocess.PIPE\n )\n except OSError as e:\n raise NodeJSMissing(\n \"The command '%s' returned an error: %s.\\n\" % (\" \".join(command), e) +\n \"Please check that Node.js is installed.\"\n )\n out, _ = p.communicate(cast_bytes(source, encoding))\n out = TextIOWrapper(BytesIO(out), encoding, 'replace').read()\n return out.rstrip('\\n')\n\ndef markdown2rst(source):\n \"\"\"Convert a markdown string to LaTeX via pandoc.\n\n This function will raise an error if pandoc is not installed.\n Any error messages generated by pandoc are printed to stderr.\n\n Parameters\n ----------\n source : string\n Input string, assumed to be valid markdown.\n\n Returns\n -------\n out : string\n Output as returned by pandoc.\n \"\"\"\n return pandoc(source, 'markdown', 'rst')\n\ntry:\n find_cmd('node')\nexcept FindCmdError:\n markdown2html = markdown2html_pandoc\nelse:\n markdown2html = markdown2html_marked\n", "path": "IPython/nbconvert/filters/markdown.py"}], "after_files": [{"content": "\"\"\"Markdown filters\nThis file contains a collection of utility filters for dealing with \nmarkdown within Jinja templates.\n\"\"\"\n#-----------------------------------------------------------------------------\n# Copyright (c) 2013, the IPython Development Team.\n#\n# Distributed under the terms of the Modified BSD License.\n#\n# The full license is in the file COPYING.txt, distributed with this software.\n#-----------------------------------------------------------------------------\n\n#-----------------------------------------------------------------------------\n# Imports\n#-----------------------------------------------------------------------------\nfrom __future__ import print_function\n\n# Stdlib imports\nimport os\nimport subprocess\nfrom io import TextIOWrapper, BytesIO\n\n# IPython imports\nfrom IPython.nbconvert.utils.pandoc import pandoc\nfrom IPython.nbconvert.utils.exceptions import ConversionException\nfrom IPython.utils.process import find_cmd, FindCmdError\nfrom IPython.utils.py3compat import cast_bytes\n\n#-----------------------------------------------------------------------------\n# Functions\n#-----------------------------------------------------------------------------\nmarked = os.path.join(os.path.dirname(__file__), \"marked.js\")\n\n__all__ = [\n 'markdown2html',\n 'markdown2html_pandoc',\n 'markdown2html_marked',\n 'markdown2latex',\n 'markdown2rst',\n]\n\nclass NodeJSMissing(ConversionException):\n \"\"\"Exception raised when node.js is missing.\"\"\"\n pass\n\ndef markdown2latex(source):\n \"\"\"Convert a markdown string to LaTeX via pandoc.\n\n This function will raise an error if pandoc is not installed.\n Any error messages generated by pandoc are printed to stderr.\n\n Parameters\n ----------\n source : string\n Input string, assumed to be valid markdown.\n\n Returns\n -------\n out : string\n Output as returned by pandoc.\n \"\"\"\n return pandoc(source, 'markdown', 'latex')\n\ndef markdown2html_pandoc(source):\n \"\"\"Convert a markdown string to HTML via pandoc\"\"\"\n return 
pandoc(source, 'markdown', 'html', extra_args=['--mathjax'])\n\ndef markdown2html_marked(source, encoding='utf-8'):\n \"\"\"Convert a markdown string to HTML via marked\"\"\"\n command = [node_cmd, marked]\n try:\n p = subprocess.Popen(command,\n stdin=subprocess.PIPE, stdout=subprocess.PIPE\n )\n except OSError as e:\n raise NodeJSMissing(\n \"The command '%s' returned an error: %s.\\n\" % (\" \".join(command), e) +\n \"Please check that Node.js is installed.\"\n )\n out, _ = p.communicate(cast_bytes(source, encoding))\n out = TextIOWrapper(BytesIO(out), encoding, 'replace').read()\n return out.rstrip('\\n')\n\ndef markdown2rst(source):\n \"\"\"Convert a markdown string to LaTeX via pandoc.\n\n This function will raise an error if pandoc is not installed.\n Any error messages generated by pandoc are printed to stderr.\n\n Parameters\n ----------\n source : string\n Input string, assumed to be valid markdown.\n\n Returns\n -------\n out : string\n Output as returned by pandoc.\n \"\"\"\n return pandoc(source, 'markdown', 'rst')\n\n# prefer md2html via marked if node.js is available\n# node is called nodejs on debian, so try that first\nnode_cmd = 'nodejs'\ntry:\n find_cmd(node_cmd)\nexcept FindCmdError:\n node_cmd = 'node'\n try:\n find_cmd(node_cmd)\n except FindCmdError:\n markdown2html = markdown2html_pandoc\n else:\n markdown2html = markdown2html_marked\nelse:\n markdown2html = markdown2html_marked\n", "path": "IPython/nbconvert/filters/markdown.py"}]} | 1,345 | 293 |
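The fallback order in the patch, `nodejs` first and then `node`, can be expressed with the standard library alone. A hedged sketch using `shutil.which`; the patch itself uses IPython's `find_cmd`/`FindCmdError`, and `resolve_node` is an illustrative name:

```python
import shutil

def resolve_node():
    # Debian/Ubuntu ship the binary as "nodejs" because "node" was already
    # taken by ax25-node; most other platforms install it as "node".
    for candidate in ("nodejs", "node"):
        if shutil.which(candidate):
            return candidate
    return None  # neither found: fall back to the pandoc converter

print(resolve_node())
```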
gh_patches_debug_8504 | rasdani/github-patches | git_diff | Gallopsled__pwntools-218 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
SyntaxWarning in pwnlib.util.web
This line generates a `SyntaxWarning`: https://github.com/Gallopsled/pwntools/blob/master/pwnlib/util/web.py#L27
Either we should use qualified names or only import the names that we need. My vote goes toward the former.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pwnlib/util/web.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import os, tempfile, logging
3 from .misc import size
4 log = logging.getLogger(__name__)
5
6 def wget(url, save=None, timeout=5, **kwargs):
7 """wget(url, save=None, timeout=5) -> str
8
9 Downloads a file via HTTP/HTTPS.
10
11 Args:
12 url (str): URL to download
13 save (str or bool): Name to save as. Any truthy value
14 will auto-generate a name based on the URL.
15 timeout (int): Timeout, in seconds
16
17 Example:
18
19 >>> url = 'http://httpbin.org/robots.txt'
20 >>> with context.local(log_level='ERROR'): result = wget(url)
21 >>> result
22 'User-agent: *\nDisallow: /deny\n'
23 >>> with context.local(log_level='ERROR'): wget(url, True)
24 >>> result == file('robots.txt').read()
25 True
26 """
27 from requests import *
28
29 with log.progress("Downloading '%s'" % url) as w:
30 w.status("Making request...")
31
32 response = get(url, stream=True, **kwargs)
33
34 if not response.ok:
35 w.failure("Got code %s" % response.status_code)
36 return
37
38 total_size = int(response.headers.get('content-length',0))
39
40 w.status('0 / %s' % size(total_size))
41
42 # Find out the next largest size we can represent as
43 chunk_size = 1
44 while chunk_size < (total_size/10):
45 chunk_size *= 1000
46
47 # Count chunks as they're received
48 total_data = ''
49
50 # Loop until we have all of the data
51 for chunk in response.iter_content(chunk_size = 2**10):
52 total_data += chunk
53 if total_size:
54 w.status('%s / %s' % (size(total_data), size(total_size)))
55 else:
56 w.status('%s' % size(total_data))
57
58 # Save to the target file if provided
59 if save:
60 if not isinstance(save, (str, unicode)):
61 save = os.path.basename(url)
62 save = save or tempfile.NamedTemporaryFile(dir='.', delete=False).name
63 with file(save,'wb+') as f:
64 f.write(total_data)
65 w.success('Saved %r (%s)' % (f.name, size(total_data)))
66 else:
67 w.success('%s' % size(total_data))
68
69 return total_data
70
71
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pwnlib/util/web.py b/pwnlib/util/web.py
--- a/pwnlib/util/web.py
+++ b/pwnlib/util/web.py
@@ -24,12 +24,12 @@
>>> result == file('robots.txt').read()
True
"""
- from requests import *
+ import requests
with log.progress("Downloading '%s'" % url) as w:
w.status("Making request...")
- response = get(url, stream=True, **kwargs)
+ response = requests.get(url, stream=True, **kwargs)
if not response.ok:
w.failure("Got code %s" % response.status_code)
| {"golden_diff": "diff --git a/pwnlib/util/web.py b/pwnlib/util/web.py\n--- a/pwnlib/util/web.py\n+++ b/pwnlib/util/web.py\n@@ -24,12 +24,12 @@\n >>> result == file('robots.txt').read()\n True\n \"\"\"\n- from requests import *\n+ import requests\n \n with log.progress(\"Downloading '%s'\" % url) as w:\n w.status(\"Making request...\")\n \n- response = get(url, stream=True, **kwargs)\n+ response = requests.get(url, stream=True, **kwargs)\n \n if not response.ok:\n w.failure(\"Got code %s\" % response.status_code)\n", "issue": "SyntaxWarning in pwnlib.util.web\nThis line generates a `SyntaxWarning`: https://github.com/Gallopsled/pwntools/blob/master/pwnlib/util/web.py#L27\n\nEither we should use qualified names or only import the names that we need. My votes goes toward the former.\n\nSyntaxWarning in pwnlib.util.web\nThis line generates a `SyntaxWarning`: https://github.com/Gallopsled/pwntools/blob/master/pwnlib/util/web.py#L27\n\nEither we should use qualified names or only import the names that we need. My votes goes toward the former.\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport os, tempfile, logging\nfrom .misc import size\nlog = logging.getLogger(__name__)\n\ndef wget(url, save=None, timeout=5, **kwargs):\n \"\"\"wget(url, save=None, timeout=5) -> str\n\n Downloads a file via HTTP/HTTPS.\n\n Args:\n url (str): URL to download\n save (str or bool): Name to save as. Any truthy value\n will auto-generate a name based on the URL.\n timeout (int): Timeout, in seconds\n\n Example:\n\n >>> url = 'http://httpbin.org/robots.txt'\n >>> with context.local(log_level='ERROR'): result = wget(url)\n >>> result\n 'User-agent: *\\nDisallow: /deny\\n'\n >>> with context.local(log_level='ERROR'): wget(url, True)\n >>> result == file('robots.txt').read()\n True\n \"\"\"\n from requests import *\n\n with log.progress(\"Downloading '%s'\" % url) as w:\n w.status(\"Making request...\")\n\n response = get(url, stream=True, **kwargs)\n\n if not response.ok:\n w.failure(\"Got code %s\" % response.status_code)\n return\n\n total_size = int(response.headers.get('content-length',0))\n\n w.status('0 / %s' % size(total_size))\n\n # Find out the next largest size we can represent as\n chunk_size = 1\n while chunk_size < (total_size/10):\n chunk_size *= 1000\n\n # Count chunks as they're received\n total_data = ''\n\n # Loop until we have all of the data\n for chunk in response.iter_content(chunk_size = 2**10):\n total_data += chunk\n if total_size:\n w.status('%s / %s' % (size(total_data), size(total_size)))\n else:\n w.status('%s' % size(total_data))\n\n # Save to the target file if provided\n if save:\n if not isinstance(save, (str, unicode)):\n save = os.path.basename(url)\n save = save or tempfile.NamedTemporaryFile(dir='.', delete=False).name\n with file(save,'wb+') as f:\n f.write(total_data)\n w.success('Saved %r (%s)' % (f.name, size(total_data)))\n else:\n w.success('%s' % size(total_data))\n\n return total_data\n\n", "path": "pwnlib/util/web.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport os, tempfile, logging\nfrom .misc import size\nlog = logging.getLogger(__name__)\n\ndef wget(url, save=None, timeout=5, **kwargs):\n \"\"\"wget(url, save=None, timeout=5) -> str\n\n Downloads a file via HTTP/HTTPS.\n\n Args:\n url (str): URL to download\n save (str or bool): Name to save as. 
Any truthy value\n will auto-generate a name based on the URL.\n timeout (int): Timeout, in seconds\n\n Example:\n\n >>> url = 'http://httpbin.org/robots.txt'\n >>> with context.local(log_level='ERROR'): result = wget(url)\n >>> result\n 'User-agent: *\\nDisallow: /deny\\n'\n >>> with context.local(log_level='ERROR'): wget(url, True)\n >>> result == file('robots.txt').read()\n True\n \"\"\"\n import requests\n\n with log.progress(\"Downloading '%s'\" % url) as w:\n w.status(\"Making request...\")\n\n response = requests.get(url, stream=True, **kwargs)\n\n if not response.ok:\n w.failure(\"Got code %s\" % response.status_code)\n return\n\n total_size = int(response.headers.get('content-length',0))\n\n w.status('0 / %s' % size(total_size))\n\n # Find out the next largest size we can represent as\n chunk_size = 1\n while chunk_size < (total_size/10):\n chunk_size *= 1000\n\n # Count chunks as they're received\n total_data = ''\n\n # Loop until we have all of the data\n for chunk in response.iter_content(chunk_size = 2**10):\n total_data += chunk\n if total_size:\n w.status('%s / %s' % (size(total_data), size(total_size)))\n else:\n w.status('%s' % size(total_data))\n\n # Save to the target file if provided\n if save:\n if not isinstance(save, (str, unicode)):\n save = os.path.basename(url)\n save = save or tempfile.NamedTemporaryFile(dir='.', delete=False).name\n with file(save,'wb+') as f:\n f.write(total_data)\n w.success('Saved %r (%s)' % (f.name, size(total_data)))\n else:\n w.success('%s' % size(total_data))\n\n return total_data\n\n", "path": "pwnlib/util/web.py"}]} | 1,061 | 148 |
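Why a one-line import change fixes the warning: star-imports are only legal at module scope, so `from requests import *` inside `wget()` triggered a `SyntaxWarning` on Python 2 and is rejected outright on Python 3. A small demonstration via `compile` (the snippets below are illustrative, not taken from pwnlib):

```python
bad = "def f():\n    from os.path import *\n"
good = "import os.path\ndef f():\n    return os.path.sep\n"

try:
    compile(bad, "<demo>", "exec")
except SyntaxError as exc:
    print("rejected:", exc.msg)  # e.g. "import * only allowed at module level"

compile(good, "<demo>", "exec")  # the qualified form compiles cleanly
```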
gh_patches_debug_14960 | rasdani/github-patches | git_diff | flairNLP__flair-422 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Don't pin package dependencies in setup.py
To be removed, once it is done: Please add the appropriate label to this ticket, e.g. feature or enhancement.
**Is your feature/enhancement request related to a problem? Please describe.**
It is not considered good practice to pin package dependencies in setup.py (see additional context).
For instance, I'm forced to downgrade certain packages by installing flair.
**Describe the solution you'd like**
Just list the abstract requirements in setup.py with less restrictive version bounds.
**Additional context**
See https://packaging.python.org/discussions/install-requires-vs-requirements/
--- END ISSUE ---
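For reference, the pattern the reporter asks for looks roughly like the sketch below (a minimal, hypothetical excerpt: the package names come from the setup.py shown further down, and the version floors are illustrative rather than prescriptive):

```python
# Hypothetical sketch: declare minimum-version "floors" instead of exact
# pins, so pip's resolver can install flair alongside other packages.
from setuptools import setup, find_packages

setup(
    name='flair',
    packages=find_packages(exclude='test'),
    install_requires=[
        'torch>=1.0.0',   # floor instead of the exact pin 'torch==1.0.0'
        'gensim>=3.4.0',  # same relaxation for the remaining dependencies
    ],
)
```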
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 from setuptools import setup, find_packages
2
3 setup(
4 name='flair',
5 version='0.4.0',
6 description='A very simple framework for state-of-the-art NLP',
7 long_description=open("README.md", encoding='utf-8').read(),
8 long_description_content_type="text/markdown",
9 author='Alan Akbik',
10 author_email='[email protected]',
11 url='https://github.com/zalandoresearch/flair',
12 packages=find_packages(exclude='test'), # same as name
13 license='MIT',
14 install_requires=[
15 'torch==1.0.0',
16 'gensim==3.4.0',
17 'typing==3.6.4',
18 'tqdm==4.26.0',
19 'segtok==1.5.7',
20 'matplotlib==3.0.0',
21 'mpld3==0.3',
22 'sklearn',
23 'sqlitedict==1.6.0',
24 'deprecated==1.2.4',
25 'hyperopt==0.1.1',
26 'pytorch-pretrained-bert==0.3.0'
27 ],
28 include_package_data=True,
29 python_requires='>=3.6',
30 )
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -12,18 +12,17 @@
packages=find_packages(exclude='test'), # same as name
license='MIT',
install_requires=[
- 'torch==1.0.0',
- 'gensim==3.4.0',
- 'typing==3.6.4',
- 'tqdm==4.26.0',
- 'segtok==1.5.7',
- 'matplotlib==3.0.0',
- 'mpld3==0.3',
+ 'torch>=1.0.0',
+ 'gensim>=3.4.0',
+ 'tqdm>=4.26.0',
+ 'segtok>=1.5.7',
+ 'matplotlib>=3.0.0',
+ 'mpld3>=0.3',
'sklearn',
- 'sqlitedict==1.6.0',
- 'deprecated==1.2.4',
- 'hyperopt==0.1.1',
- 'pytorch-pretrained-bert==0.3.0'
+ 'sqlitedict>=1.6.0',
+ 'deprecated>=1.2.4',
+ 'hyperopt>=0.1.1',
+ 'pytorch-pretrained-bert>=0.3.0'
],
include_package_data=True,
python_requires='>=3.6',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -12,18 +12,17 @@\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n- 'torch==1.0.0',\n- 'gensim==3.4.0',\n- 'typing==3.6.4',\n- 'tqdm==4.26.0',\n- 'segtok==1.5.7',\n- 'matplotlib==3.0.0',\n- 'mpld3==0.3',\n+ 'torch>=1.0.0',\n+ 'gensim>=3.4.0',\n+ 'tqdm>=4.26.0',\n+ 'segtok>=1.5.7',\n+ 'matplotlib>=3.0.0',\n+ 'mpld3>=0.3',\n 'sklearn',\n- 'sqlitedict==1.6.0',\n- 'deprecated==1.2.4',\n- 'hyperopt==0.1.1',\n- 'pytorch-pretrained-bert==0.3.0'\n+ 'sqlitedict>=1.6.0',\n+ 'deprecated>=1.2.4',\n+ 'hyperopt>=0.1.1',\n+ 'pytorch-pretrained-bert>=0.3.0'\n ],\n include_package_data=True,\n python_requires='>=3.6',\n", "issue": "Don't pin package dependencies in setup.py\nTo be removed, once it is done: Please add the appropriate label to this ticket, e.g. feature or enhancement.\r\n\r\n**Is your feature/enhancement request related to a problem? Please describe.**\r\n\r\nIt is not considered good practice to pin package dependencies in setup.py (see additional context).\r\n\r\nFor instance, I'm forced to downgrade certain packages by installing flair.\r\n\r\n**Describe the solution you'd like**\r\n\r\nJust list the abstract requirements in setup.py with less restrictive version bounds.\r\n\r\n**Additional context**\r\n\r\nSee https://packaging.python.org/discussions/install-requires-vs-requirements/\n", "before_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='flair',\n version='0.4.0',\n description='A very simple framework for state-of-the-art NLP',\n long_description=open(\"README.md\", encoding='utf-8').read(),\n long_description_content_type=\"text/markdown\",\n author='Alan Akbik',\n author_email='[email protected]',\n url='https://github.com/zalandoresearch/flair',\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n 'torch==1.0.0',\n 'gensim==3.4.0',\n 'typing==3.6.4',\n 'tqdm==4.26.0',\n 'segtok==1.5.7',\n 'matplotlib==3.0.0',\n 'mpld3==0.3',\n 'sklearn',\n 'sqlitedict==1.6.0',\n 'deprecated==1.2.4',\n 'hyperopt==0.1.1',\n 'pytorch-pretrained-bert==0.3.0'\n ],\n include_package_data=True,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}], "after_files": [{"content": "from setuptools import setup, find_packages\n\nsetup(\n name='flair',\n version='0.4.0',\n description='A very simple framework for state-of-the-art NLP',\n long_description=open(\"README.md\", encoding='utf-8').read(),\n long_description_content_type=\"text/markdown\",\n author='Alan Akbik',\n author_email='[email protected]',\n url='https://github.com/zalandoresearch/flair',\n packages=find_packages(exclude='test'), # same as name\n license='MIT',\n install_requires=[\n 'torch>=1.0.0',\n 'gensim>=3.4.0',\n 'tqdm>=4.26.0',\n 'segtok>=1.5.7',\n 'matplotlib>=3.0.0',\n 'mpld3>=0.3',\n 'sklearn',\n 'sqlitedict>=1.6.0',\n 'deprecated>=1.2.4',\n 'hyperopt>=0.1.1',\n 'pytorch-pretrained-bert>=0.3.0'\n ],\n include_package_data=True,\n python_requires='>=3.6',\n)\n", "path": "setup.py"}]} | 712 | 339 |
gh_patches_debug_5563 | rasdani/github-patches | git_diff | mlflow__mlflow-9536 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[BUG] basic-auth init on remote database
### Describe the problem
The same issue as #9399 happens when trying to initialize the database, which invokes the function [migrate_if_needed](https://github.com/mlflow/mlflow/blob/master/mlflow/server/auth/db/utils.py#L30).
Suggestion: apply the same fix as #9410 to force SQLAlchemy to render the unobfuscated URL.
### Suggestion
```
alembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))
```
### What component(s) does this bug affect?
- [ ] `area/artifacts`: Artifact stores and artifact logging
- [ ] `area/build`: Build and test infrastructure for MLflow
- [ ] `area/docs`: MLflow documentation pages
- [ ] `area/examples`: Example code
- [ ] `area/gateway`: AI Gateway service, Gateway client APIs, third-party Gateway integrations
- [ ] `area/model-registry`: Model Registry service, APIs, and the fluent client calls for Model Registry
- [ ] `area/models`: MLmodel format, model serialization/deserialization, flavors
- [ ] `area/recipes`: Recipes, Recipe APIs, Recipe configs, Recipe Templates
- [ ] `area/projects`: MLproject format, project running backends
- [ ] `area/scoring`: MLflow Model server, model deployment tools, Spark UDFs
- [X] `area/server-infra`: MLflow Tracking server backend
- [ ] `area/tracking`: Tracking Service, tracking client APIs, autologging
### What interface(s) does this bug affect?
- [ ] `area/uiux`: Front-end, user experience, plotting, JavaScript, JavaScript dev server
- [ ] `area/docker`: Docker use across MLflow's components, such as MLflow Projects and MLflow Models
- [X] `area/sqlalchemy`: Use of SQLAlchemy in the Tracking Service or Model Registry
- [ ] `area/windows`: Windows support
### What language(s) does this bug affect?
- [ ] `language/r`: R APIs and clients
- [ ] `language/java`: Java APIs and clients
- [ ] `language/new`: Proposals for new client languages
### What integration(s) does this bug affect?
- [ ] `integrations/azure`: Azure and Azure ML integrations
- [ ] `integrations/sagemaker`: SageMaker integrations
- [ ] `integrations/databricks`: Databricks integrations
--- END ISSUE ---
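For context, a minimal demonstration of why the unobfuscated rendering in the suggestion above matters (assumes SQLAlchemy >= 1.4; the connection string is a placeholder): stringifying a `URL` masks the password, so Alembic would receive unusable credentials for a password-protected remote database.

```python
# Minimal sketch: str(url) masks the password, render_as_string(...) does not.
from sqlalchemy.engine import make_url

url = make_url("postgresql://user:s3cret@db.example.com/mlflow")  # placeholder DSN
print(str(url))                                   # postgresql://user:***@db.example.com/mlflow
print(url.render_as_string(hide_password=False))  # postgresql://user:s3cret@db.example.com/mlflow
```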
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mlflow/server/auth/db/utils.py`
Content:
```
1 from pathlib import Path
2
3 from alembic.command import upgrade
4 from alembic.config import Config
5 from alembic.migration import MigrationContext
6 from alembic.script import ScriptDirectory
7 from sqlalchemy.engine.base import Engine
8
9
10 def _get_alembic_dir() -> str:
11 return Path(__file__).parent / "migrations"
12
13
14 def _get_alembic_config(url: str) -> Config:
15 alembic_dir = _get_alembic_dir()
16 alembic_ini_path = alembic_dir / "alembic.ini"
17 alembic_cfg = Config(alembic_ini_path)
18 alembic_cfg.set_main_option("script_location", str(alembic_dir))
19 alembic_cfg.set_main_option("sqlalchemy.url", url)
20 return alembic_cfg
21
22
23 def migrate(engine: Engine, revision: str) -> None:
24 alembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))
25 with engine.begin() as conn:
26 alembic_cfg.attributes["connection"] = conn
27 upgrade(alembic_cfg, revision)
28
29
30 def migrate_if_needed(engine: Engine, revision: str) -> None:
31 alembic_cfg = _get_alembic_config(str(engine.url))
32 script_dir = ScriptDirectory.from_config(alembic_cfg)
33 with engine.begin() as conn:
34 context = MigrationContext.configure(conn)
35 if context.get_current_revision() != script_dir.get_current_head():
36 upgrade(alembic_cfg, revision)
37
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mlflow/server/auth/db/utils.py b/mlflow/server/auth/db/utils.py
--- a/mlflow/server/auth/db/utils.py
+++ b/mlflow/server/auth/db/utils.py
@@ -28,7 +28,7 @@
def migrate_if_needed(engine: Engine, revision: str) -> None:
- alembic_cfg = _get_alembic_config(str(engine.url))
+ alembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))
script_dir = ScriptDirectory.from_config(alembic_cfg)
with engine.begin() as conn:
context = MigrationContext.configure(conn)
| {"golden_diff": "diff --git a/mlflow/server/auth/db/utils.py b/mlflow/server/auth/db/utils.py\n--- a/mlflow/server/auth/db/utils.py\n+++ b/mlflow/server/auth/db/utils.py\n@@ -28,7 +28,7 @@\n \n \n def migrate_if_needed(engine: Engine, revision: str) -> None:\n- alembic_cfg = _get_alembic_config(str(engine.url))\n+ alembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))\n script_dir = ScriptDirectory.from_config(alembic_cfg)\n with engine.begin() as conn:\n context = MigrationContext.configure(conn)\n", "issue": "[BUG] basic-auth init on remote database\n### Describe the problem\r\n\r\nSame issue #9399 happened when trying to initialize database which invokes this function [migrate_if_needed](https://github.com/mlflow/mlflow/blob/master/mlflow/server/auth/db/utils.py#L30)\r\n\r\nSuggestion: Apply the same fix #9410 to force SqlAlchemy to render unobfuscated url\r\n\r\n### Suggestion\r\n```\r\nalembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))\r\n```\r\n\r\n### What component(s) does this bug affect?\r\n\r\n- [ ] `area/artifacts`: Artifact stores and artifact logging\r\n- [ ] `area/build`: Build and test infrastructure for MLflow\r\n- [ ] `area/docs`: MLflow documentation pages\r\n- [ ] `area/examples`: Example code\r\n- [ ] `area/gateway`: AI Gateway service, Gateway client APIs, third-party Gateway integrations\r\n- [ ] `area/model-registry`: Model Registry service, APIs, and the fluent client calls for Model Registry\r\n- [ ] `area/models`: MLmodel format, model serialization/deserialization, flavors\r\n- [ ] `area/recipes`: Recipes, Recipe APIs, Recipe configs, Recipe Templates\r\n- [ ] `area/projects`: MLproject format, project running backends\r\n- [ ] `area/scoring`: MLflow Model server, model deployment tools, Spark UDFs\r\n- [X] `area/server-infra`: MLflow Tracking server backend\r\n- [ ] `area/tracking`: Tracking Service, tracking client APIs, autologging\r\n\r\n### What interface(s) does this bug affect?\r\n\r\n- [ ] `area/uiux`: Front-end, user experience, plotting, JavaScript, JavaScript dev server\r\n- [ ] `area/docker`: Docker use across MLflow's components, such as MLflow Projects and MLflow Models\r\n- [X] `area/sqlalchemy`: Use of SQLAlchemy in the Tracking Service or Model Registry\r\n- [ ] `area/windows`: Windows support\r\n\r\n### What language(s) does this bug affect?\r\n\r\n- [ ] `language/r`: R APIs and clients\r\n- [ ] `language/java`: Java APIs and clients\r\n- [ ] `language/new`: Proposals for new client languages\r\n\r\n### What integration(s) does this bug affect?\r\n\r\n- [ ] `integrations/azure`: Azure and Azure ML integrations\r\n- [ ] `integrations/sagemaker`: SageMaker integrations\r\n- [ ] `integrations/databricks`: Databricks integrations\n", "before_files": [{"content": "from pathlib import Path\n\nfrom alembic.command import upgrade\nfrom alembic.config import Config\nfrom alembic.migration import MigrationContext\nfrom alembic.script import ScriptDirectory\nfrom sqlalchemy.engine.base import Engine\n\n\ndef _get_alembic_dir() -> str:\n return Path(__file__).parent / \"migrations\"\n\n\ndef _get_alembic_config(url: str) -> Config:\n alembic_dir = _get_alembic_dir()\n alembic_ini_path = alembic_dir / \"alembic.ini\"\n alembic_cfg = Config(alembic_ini_path)\n alembic_cfg.set_main_option(\"script_location\", str(alembic_dir))\n alembic_cfg.set_main_option(\"sqlalchemy.url\", url)\n return alembic_cfg\n\n\ndef migrate(engine: Engine, revision: str) -> None:\n alembic_cfg = 
_get_alembic_config(engine.url.render_as_string(hide_password=False))\n with engine.begin() as conn:\n alembic_cfg.attributes[\"connection\"] = conn\n upgrade(alembic_cfg, revision)\n\n\ndef migrate_if_needed(engine: Engine, revision: str) -> None:\n alembic_cfg = _get_alembic_config(str(engine.url))\n script_dir = ScriptDirectory.from_config(alembic_cfg)\n with engine.begin() as conn:\n context = MigrationContext.configure(conn)\n if context.get_current_revision() != script_dir.get_current_head():\n upgrade(alembic_cfg, revision)\n", "path": "mlflow/server/auth/db/utils.py"}], "after_files": [{"content": "from pathlib import Path\n\nfrom alembic.command import upgrade\nfrom alembic.config import Config\nfrom alembic.migration import MigrationContext\nfrom alembic.script import ScriptDirectory\nfrom sqlalchemy.engine.base import Engine\n\n\ndef _get_alembic_dir() -> str:\n return Path(__file__).parent / \"migrations\"\n\n\ndef _get_alembic_config(url: str) -> Config:\n alembic_dir = _get_alembic_dir()\n alembic_ini_path = alembic_dir / \"alembic.ini\"\n alembic_cfg = Config(alembic_ini_path)\n alembic_cfg.set_main_option(\"script_location\", str(alembic_dir))\n alembic_cfg.set_main_option(\"sqlalchemy.url\", url)\n return alembic_cfg\n\n\ndef migrate(engine: Engine, revision: str) -> None:\n alembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))\n with engine.begin() as conn:\n alembic_cfg.attributes[\"connection\"] = conn\n upgrade(alembic_cfg, revision)\n\n\ndef migrate_if_needed(engine: Engine, revision: str) -> None:\n alembic_cfg = _get_alembic_config(engine.url.render_as_string(hide_password=False))\n script_dir = ScriptDirectory.from_config(alembic_cfg)\n with engine.begin() as conn:\n context = MigrationContext.configure(conn)\n if context.get_current_revision() != script_dir.get_current_head():\n upgrade(alembic_cfg, revision)\n", "path": "mlflow/server/auth/db/utils.py"}]} | 1,187 | 140 |
gh_patches_debug_61226 | rasdani/github-patches | git_diff | searxng__searxng-2862 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Bug: bilibili engine is broken
<!-- PLEASE FILL THESE FIELDS, IT REALLY HELPS THE MAINTAINERS OF SearXNG -->
Something has changed, and now some fixes are needed to use the API successfully.
**Version of SearXNG, commit number if you are using on master branch and stipulate if you forked SearXNG**
Repository: https://github.com/searxng/searxng
Branch: master
Version: 2023.9.27+1a66d7467+dirty
<!-- If you are running on master branch using git execute this command
in order to fetch the latest commit ID:
```
git log -1
```
If you are using searxng-docker then look at the bottom of the SearXNG page
and check for the version after "Powered by SearXNG"
Please also stipulate if you are using a forked version of SearXNG and
include a link to the fork source code.
-->
**How did you install SearXNG?**
make run
<!-- Did you install SearXNG using the official wiki or using searxng-docker
or manually by executing the searx/webapp.py file? -->
**What happened?**
<!-- A clear and concise description of what the bug is. -->
**How To Reproduce**
<!-- How can we reproduce this issue? (as minimally and as precisely as possible) -->
**Expected behavior**
<!-- A clear and concise description of what you expected to happen. -->
**Screenshots & Logs**
<!-- If applicable, add screenshots, logs to help explain your problem. -->
**Additional context**
<!-- Add any other context about the problem here. -->
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `searx/engines/bilibili.py`
Content:
```
1 # SPDX-License-Identifier: AGPL-3.0-or-later
2 # lint: pylint
3 """Bilibili is a Chinese video sharing website.
4
5 .. _Bilibili: https://www.bilibili.com
6 """
7
8 import random
9 import string
10 from urllib.parse import urlencode
11 from datetime import datetime, timedelta
12
13 # Engine metadata
14 about = {
15 "website": "https://www.bilibili.com",
16 "wikidata_id": "Q3077586",
17 "official_api_documentation": None,
18 "use_official_api": False,
19 "require_api_key": False,
20 "results": "JSON",
21 }
22
23 # Engine configuration
24 paging = True
25 results_per_page = 20
26 categories = ["videos"]
27
28 # Search URL
29 base_url = "https://api.bilibili.com/x/web-interface/wbi/search/type"
30
31 cookie = {
32 "innersign": "0",
33 "buvid3": "".join(random.choice(string.hexdigits) for _ in range(16)) + "infoc",
34 "i-wanna-go-back": "-1",
35 "b_ut": "7",
36 "FEED_LIVE_VERSION": "V8",
37 "header_theme_version": "undefined",
38 "home_feed_column": "4",
39 }
40
41
42 def request(query, params):
43 query_params = {
44 "__refresh__": "true",
45 "page": params["pageno"],
46 "page_size": results_per_page,
47 "single_column": "0",
48 "keyword": query,
49 "search_type": "video",
50 }
51
52 params["url"] = f"{base_url}?{urlencode(query_params)}"
53 params["cookies"] = cookie
54
55 return params
56
57
58 # Format the video duration
59 def format_duration(duration):
60 minutes, seconds = map(int, duration.split(":"))
61 total_seconds = minutes * 60 + seconds
62
63 formatted_duration = str(timedelta(seconds=total_seconds))[2:] if 0 <= total_seconds < 3600 else ""
64
65 return formatted_duration
66
67
68 def response(resp):
69 search_res = resp.json()
70
71 results = []
72
73 for item in search_res.get("data", {}).get("result", []):
74 title = item["title"]
75 url = item["arcurl"]
76 thumbnail = item["pic"]
77 description = item["description"]
78 author = item["author"]
79 video_id = item["aid"]
80 unix_date = item["pubdate"]
81
82 formatted_date = datetime.utcfromtimestamp(unix_date)
83 formatted_duration = format_duration(item["duration"])
84 iframe_url = f"https://player.bilibili.com/player.html?aid={video_id}&high_quality=1&autoplay=false&danmaku=0"
85
86 results.append(
87 {
88 "title": title,
89 "url": url,
90 "content": description,
91 "author": author,
92 "publishedDate": formatted_date,
93 "length": formatted_duration,
94 "thumbnail": thumbnail,
95 "iframe_src": iframe_url,
96 "template": "videos.html",
97 }
98 )
99
100 return results
101
```
--- END FILES ---
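As a side note, a rough manual probe of the two endpoint variants involved in this record can help confirm the breakage locally (hypothetical debugging snippet only: the real engine also sends the cookie jar built above, and responses may additionally depend on request signing):

```python
# Hypothetical probe: compare the old ("wbi") and plain search endpoints
# with the same query parameters the engine builds in request().
import requests

params = {"__refresh__": "true", "page": 1, "page_size": 20,
          "single_column": "0", "keyword": "test", "search_type": "video"}
for base in ("https://api.bilibili.com/x/web-interface/wbi/search/type",
             "https://api.bilibili.com/x/web-interface/search/type"):
    resp = requests.get(base, params=params)
    print(base, resp.status_code, resp.text[:80])
```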
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/searx/engines/bilibili.py b/searx/engines/bilibili.py
--- a/searx/engines/bilibili.py
+++ b/searx/engines/bilibili.py
@@ -26,7 +26,7 @@
categories = ["videos"]
# Search URL
-base_url = "https://api.bilibili.com/x/web-interface/wbi/search/type"
+base_url = "https://api.bilibili.com/x/web-interface/search/type"
cookie = {
"innersign": "0",
| {"golden_diff": "diff --git a/searx/engines/bilibili.py b/searx/engines/bilibili.py\n--- a/searx/engines/bilibili.py\n+++ b/searx/engines/bilibili.py\n@@ -26,7 +26,7 @@\n categories = [\"videos\"]\n \n # Search URL\n-base_url = \"https://api.bilibili.com/x/web-interface/wbi/search/type\"\n+base_url = \"https://api.bilibili.com/x/web-interface/search/type\"\n \n cookie = {\n \"innersign\": \"0\",\n", "issue": "Bug: bilibili engine is broken\n<!-- PLEASE FILL THESE FIELDS, IT REALLY HELPS THE MAINTAINERS OF SearXNG -->\r\n\r\nSomething has changed, and now some fixes are needed to use the api successfully.\r\n\r\n**Version of SearXNG, commit number if you are using on master branch and stipulate if you forked SearXNG**\r\nRepository: https://github.com/searxng/searxng\r\nBranch: master\r\nVersion: 2023.9.27+1a66d7467+dirty\r\n<!-- If you are running on master branch using git execute this command\r\nin order to fetch the latest commit ID:\r\n```\r\ngit log -1\r\n``` \r\nIf you are using searxng-docker then look at the bottom of the SearXNG page\r\nand check for the version after \"Powered by SearXNG\"\r\n\r\nPlease also stipulate if you are using a forked version of SearXNG and\r\ninclude a link to the fork source code.\r\n-->\r\n**How did you install SearXNG?**\r\nmake run\r\n<!-- Did you install SearXNG using the official wiki or using searxng-docker\r\nor manually by executing the searx/webapp.py file? -->\r\n**What happened?**\r\n<!-- A clear and concise description of what the bug is. -->\r\n\r\n**How To Reproduce**\r\n<!-- How can we reproduce this issue? (as minimally and as precisely as possible) -->\r\n\r\n**Expected behavior**\r\n<!-- A clear and concise description of what you expected to happen. -->\r\n\r\n**Screenshots & Logs**\r\n<!-- If applicable, add screenshots, logs to help explain your problem. -->\r\n\r\n**Additional context**\r\n<!-- Add any other context about the problem here. -->\r\n\n", "before_files": [{"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n# lint: pylint\n\"\"\"Bilibili is a Chinese video sharing website.\n\n.. 
_Bilibili: https://www.bilibili.com\n\"\"\"\n\nimport random\nimport string\nfrom urllib.parse import urlencode\nfrom datetime import datetime, timedelta\n\n# Engine metadata\nabout = {\n \"website\": \"https://www.bilibili.com\",\n \"wikidata_id\": \"Q3077586\",\n \"official_api_documentation\": None,\n \"use_official_api\": False,\n \"require_api_key\": False,\n \"results\": \"JSON\",\n}\n\n# Engine configuration\npaging = True\nresults_per_page = 20\ncategories = [\"videos\"]\n\n# Search URL\nbase_url = \"https://api.bilibili.com/x/web-interface/wbi/search/type\"\n\ncookie = {\n \"innersign\": \"0\",\n \"buvid3\": \"\".join(random.choice(string.hexdigits) for _ in range(16)) + \"infoc\",\n \"i-wanna-go-back\": \"-1\",\n \"b_ut\": \"7\",\n \"FEED_LIVE_VERSION\": \"V8\",\n \"header_theme_version\": \"undefined\",\n \"home_feed_column\": \"4\",\n}\n\n\ndef request(query, params):\n query_params = {\n \"__refresh__\": \"true\",\n \"page\": params[\"pageno\"],\n \"page_size\": results_per_page,\n \"single_column\": \"0\",\n \"keyword\": query,\n \"search_type\": \"video\",\n }\n\n params[\"url\"] = f\"{base_url}?{urlencode(query_params)}\"\n params[\"cookies\"] = cookie\n\n return params\n\n\n# Format the video duration\ndef format_duration(duration):\n minutes, seconds = map(int, duration.split(\":\"))\n total_seconds = minutes * 60 + seconds\n\n formatted_duration = str(timedelta(seconds=total_seconds))[2:] if 0 <= total_seconds < 3600 else \"\"\n\n return formatted_duration\n\n\ndef response(resp):\n search_res = resp.json()\n\n results = []\n\n for item in search_res.get(\"data\", {}).get(\"result\", []):\n title = item[\"title\"]\n url = item[\"arcurl\"]\n thumbnail = item[\"pic\"]\n description = item[\"description\"]\n author = item[\"author\"]\n video_id = item[\"aid\"]\n unix_date = item[\"pubdate\"]\n\n formatted_date = datetime.utcfromtimestamp(unix_date)\n formatted_duration = format_duration(item[\"duration\"])\n iframe_url = f\"https://player.bilibili.com/player.html?aid={video_id}&high_quality=1&autoplay=false&danmaku=0\"\n\n results.append(\n {\n \"title\": title,\n \"url\": url,\n \"content\": description,\n \"author\": author,\n \"publishedDate\": formatted_date,\n \"length\": formatted_duration,\n \"thumbnail\": thumbnail,\n \"iframe_src\": iframe_url,\n \"template\": \"videos.html\",\n }\n )\n\n return results\n", "path": "searx/engines/bilibili.py"}], "after_files": [{"content": "# SPDX-License-Identifier: AGPL-3.0-or-later\n# lint: pylint\n\"\"\"Bilibili is a Chinese video sharing website.\n\n.. 
_Bilibili: https://www.bilibili.com\n\"\"\"\n\nimport random\nimport string\nfrom urllib.parse import urlencode\nfrom datetime import datetime, timedelta\n\n# Engine metadata\nabout = {\n \"website\": \"https://www.bilibili.com\",\n \"wikidata_id\": \"Q3077586\",\n \"official_api_documentation\": None,\n \"use_official_api\": False,\n \"require_api_key\": False,\n \"results\": \"JSON\",\n}\n\n# Engine configuration\npaging = True\nresults_per_page = 20\ncategories = [\"videos\"]\n\n# Search URL\nbase_url = \"https://api.bilibili.com/x/web-interface/search/type\"\n\ncookie = {\n \"innersign\": \"0\",\n \"buvid3\": \"\".join(random.choice(string.hexdigits) for _ in range(16)) + \"infoc\",\n \"i-wanna-go-back\": \"-1\",\n \"b_ut\": \"7\",\n \"FEED_LIVE_VERSION\": \"V8\",\n \"header_theme_version\": \"undefined\",\n \"home_feed_column\": \"4\",\n}\n\n\ndef request(query, params):\n query_params = {\n \"__refresh__\": \"true\",\n \"page\": params[\"pageno\"],\n \"page_size\": results_per_page,\n \"single_column\": \"0\",\n \"keyword\": query,\n \"search_type\": \"video\",\n }\n\n params[\"url\"] = f\"{base_url}?{urlencode(query_params)}\"\n params[\"cookies\"] = cookie\n\n return params\n\n\n# Format the video duration\ndef format_duration(duration):\n minutes, seconds = map(int, duration.split(\":\"))\n total_seconds = minutes * 60 + seconds\n\n formatted_duration = str(timedelta(seconds=total_seconds))[2:] if 0 <= total_seconds < 3600 else \"\"\n\n return formatted_duration\n\n\ndef response(resp):\n search_res = resp.json()\n\n results = []\n\n for item in search_res.get(\"data\", {}).get(\"result\", []):\n title = item[\"title\"]\n url = item[\"arcurl\"]\n thumbnail = item[\"pic\"]\n description = item[\"description\"]\n author = item[\"author\"]\n video_id = item[\"aid\"]\n unix_date = item[\"pubdate\"]\n\n formatted_date = datetime.utcfromtimestamp(unix_date)\n formatted_duration = format_duration(item[\"duration\"])\n iframe_url = f\"https://player.bilibili.com/player.html?aid={video_id}&high_quality=1&autoplay=false&danmaku=0\"\n\n results.append(\n {\n \"title\": title,\n \"url\": url,\n \"content\": description,\n \"author\": author,\n \"publishedDate\": formatted_date,\n \"length\": formatted_duration,\n \"thumbnail\": thumbnail,\n \"iframe_src\": iframe_url,\n \"template\": \"videos.html\",\n }\n )\n\n return results\n", "path": "searx/engines/bilibili.py"}]} | 1,488 | 124 |
gh_patches_debug_57128 | rasdani/github-patches | git_diff | liqd__adhocracy4-58 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Extend linting to javascript and jsx files
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `adhocracy4/reports/emails.py`
Content:
```
1 from django.contrib.auth import get_user_model
2 from django.core import urlresolvers
3
4 from adhocracy4 import emails
5
6 User = get_user_model()
7
8
9 class ReportModeratorEmail(emails.ModeratorNotification):
10 template_name = 'a4reports/emails/report_moderators'
11
12
13 class ReportCreatorEmail(emails.Email):
14 template_name = 'a4reports/emails/report_creator'
15
16 def get_receivers(self):
17 return [self.object.content_object.creator]
18
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/adhocracy4/reports/emails.py b/adhocracy4/reports/emails.py
--- a/adhocracy4/reports/emails.py
+++ b/adhocracy4/reports/emails.py
@@ -1,5 +1,4 @@
from django.contrib.auth import get_user_model
-from django.core import urlresolvers
from adhocracy4 import emails
| {"golden_diff": "diff --git a/adhocracy4/reports/emails.py b/adhocracy4/reports/emails.py\n--- a/adhocracy4/reports/emails.py\n+++ b/adhocracy4/reports/emails.py\n@@ -1,5 +1,4 @@\n from django.contrib.auth import get_user_model\n-from django.core import urlresolvers\n \n from adhocracy4 import emails\n", "issue": "Extend linting to javascript and jsx files\n\n", "before_files": [{"content": "from django.contrib.auth import get_user_model\nfrom django.core import urlresolvers\n\nfrom adhocracy4 import emails\n\nUser = get_user_model()\n\n\nclass ReportModeratorEmail(emails.ModeratorNotification):\n template_name = 'a4reports/emails/report_moderators'\n\n\nclass ReportCreatorEmail(emails.Email):\n template_name = 'a4reports/emails/report_creator'\n\n def get_receivers(self):\n return [self.object.content_object.creator]\n", "path": "adhocracy4/reports/emails.py"}], "after_files": [{"content": "from django.contrib.auth import get_user_model\n\nfrom adhocracy4 import emails\n\nUser = get_user_model()\n\n\nclass ReportModeratorEmail(emails.ModeratorNotification):\n template_name = 'a4reports/emails/report_moderators'\n\n\nclass ReportCreatorEmail(emails.Email):\n template_name = 'a4reports/emails/report_creator'\n\n def get_receivers(self):\n return [self.object.content_object.creator]\n", "path": "adhocracy4/reports/emails.py"}]} | 401 | 83 |
gh_patches_debug_24071 | rasdani/github-patches | git_diff | open-mmlab__mmdetection-5654 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Error get params DETR/ Deformable DETR
Despite my attempts to modify the config, the error also occurs when just testing with the basic DETR config file.
Maybe this issue has already been raised?
mmdet==2.13.0
mmcv==1.3.3
```python
python tools/analysis_tools/get_flops.py configs/detr/detr_r50_8x2_150e_coco.py
```
```python
/home/bluav/mmdetection/mmdet/models/backbones/resnet.py:400: UserWarning: DeprecationWarning: pretrained is a deprecated, please use "init_cfg" instead
warnings.warn('DeprecationWarning: pretrained is a deprecated, '
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Warning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!
Traceback (most recent call last):
File "tools/analysis_tools/get_flops.py", line 81, in <module>
main()
File "tools/analysis_tools/get_flops.py", line 71, in main
flops, params = get_model_complexity_info(model, input_shape)
File "/home/bluav/.conda/envs/open-mmlab/lib/python3.7/site-packages/mmcv/cnn/utils/flops_counter.py", line 104, in get_model_complexity_info
_ = flops_model(batch)
File "/home/bluav/.conda/envs/open-mmlab/lib/python3.7/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
File "/home/bluav/mmdetection/mmdet/models/detectors/single_stage.py", line 48, in forward_dummy
outs = self.bbox_head(x)
File "/home/bluav/.conda/envs/open-mmlab/lib/python3.7/site-packages/torch/nn/modules/module.py", line 889, in _call_impl
result = self.forward(*input, **kwargs)
TypeError: forward() missing 1 required positional argument: 'img_metas'
```
--- END ISSUE ---
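In short, the traceback shows `forward_dummy` in `single_stage.py` calling `self.bbox_head(x)` while DETR's head requires `img_metas`. Below is a sketch of the direction a fix can take; it mirrors the shape of the patch further down and should be read as illustrative, not as the exact upstream implementation:

```python
# Sketch: fabricate per-image metadata so the flops counter can drive a
# DETR-style bbox_head, whose forward() signature requires img_metas.
def forward_dummy(self, img):
    batch_size, _, height, width = img.shape
    dummy_img_metas = [
        dict(batch_input_shape=(height, width), img_shape=(height, width, 3))
        for _ in range(batch_size)
    ]
    x = self.extract_feat(img)
    return self.bbox_head(x, dummy_img_metas)
```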
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mmdet/models/detectors/detr.py`
Content:
```
1 import torch
2
3 from ..builder import DETECTORS
4 from .single_stage import SingleStageDetector
5
6
7 @DETECTORS.register_module()
8 class DETR(SingleStageDetector):
9 r"""Implementation of `DETR: End-to-End Object Detection with
10 Transformers <https://arxiv.org/pdf/2005.12872>`_"""
11
12 def __init__(self,
13 backbone,
14 bbox_head,
15 train_cfg=None,
16 test_cfg=None,
17 pretrained=None,
18 init_cfg=None):
19 super(DETR, self).__init__(backbone, None, bbox_head, train_cfg,
20 test_cfg, pretrained, init_cfg)
21
22 # over-write `onnx_export` because:
23 # (1) the forward of bbox_head requires img_metas
24 # (2) the different behavior (e.g. construction of `masks`) between
25 # torch and ONNX model, during the forward of bbox_head
26 def onnx_export(self, img, img_metas):
27 """Test function for exporting to ONNX, without test time augmentation.
28
29 Args:
30 img (torch.Tensor): input images.
31 img_metas (list[dict]): List of image information.
32
33 Returns:
34 tuple[Tensor, Tensor]: dets of shape [N, num_det, 5]
35 and class labels of shape [N, num_det].
36 """
37 x = self.extract_feat(img)
38 # forward of this head requires img_metas
39 outs = self.bbox_head.forward_onnx(x, img_metas)
40 # get shape as tensor
41 img_shape = torch._shape_as_tensor(img)[2:]
42 img_metas[0]['img_shape_for_onnx'] = img_shape
43
44 det_bboxes, det_labels = self.bbox_head.onnx_export(*outs, img_metas)
45
46 return det_bboxes, det_labels
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mmdet/models/detectors/detr.py b/mmdet/models/detectors/detr.py
--- a/mmdet/models/detectors/detr.py
+++ b/mmdet/models/detectors/detr.py
@@ -1,3 +1,5 @@
+import warnings
+
import torch
from ..builder import DETECTORS
@@ -19,6 +21,27 @@
super(DETR, self).__init__(backbone, None, bbox_head, train_cfg,
test_cfg, pretrained, init_cfg)
+ # over-write `forward_dummy` because:
+ # the forward of bbox_head requires img_metas
+ def forward_dummy(self, img):
+ """Used for computing network flops.
+
+ See `mmdetection/tools/analysis_tools/get_flops.py`
+ """
+ warnings.warn('Warning! MultiheadAttention in DETR does not '
+ 'support flops computation! Do not use the '
+ 'results in your papers!')
+
+ batch_size, _, height, width = img.shape
+ dummy_img_metas = [
+ dict(
+ batch_input_shape=(height, width),
+ img_shape=(height, width, 3)) for _ in range(batch_size)
+ ]
+ x = self.extract_feat(img)
+ outs = self.bbox_head(x, dummy_img_metas)
+ return outs
+
# over-write `onnx_export` because:
# (1) the forward of bbox_head requires img_metas
# (2) the different behavior (e.g. construction of `masks`) between
| {"golden_diff": "diff --git a/mmdet/models/detectors/detr.py b/mmdet/models/detectors/detr.py\n--- a/mmdet/models/detectors/detr.py\n+++ b/mmdet/models/detectors/detr.py\n@@ -1,3 +1,5 @@\n+import warnings\n+\n import torch\n \n from ..builder import DETECTORS\n@@ -19,6 +21,27 @@\n super(DETR, self).__init__(backbone, None, bbox_head, train_cfg,\n test_cfg, pretrained, init_cfg)\n \n+ # over-write `forward_dummy` because:\n+ # the forward of bbox_head requires img_metas\n+ def forward_dummy(self, img):\n+ \"\"\"Used for computing network flops.\n+\n+ See `mmdetection/tools/analysis_tools/get_flops.py`\n+ \"\"\"\n+ warnings.warn('Warning! MultiheadAttention in DETR does not '\n+ 'support flops computation! Do not use the '\n+ 'results in your papers!')\n+\n+ batch_size, _, height, width = img.shape\n+ dummy_img_metas = [\n+ dict(\n+ batch_input_shape=(height, width),\n+ img_shape=(height, width, 3)) for _ in range(batch_size)\n+ ]\n+ x = self.extract_feat(img)\n+ outs = self.bbox_head(x, dummy_img_metas)\n+ return outs\n+\n # over-write `onnx_export` because:\n # (1) the forward of bbox_head requires img_metas\n # (2) the different behavior (e.g. construction of `masks`) between\n", "issue": "Error get params DETR/ Deformable DETR\nDespite my attempts to modify, also just testing with the basic config detr file. \r\nMaybe this issue has already been raised?\r\nmmdet==2.13.0\r\nmmcv=1.3.3\r\n\r\n```python\r\npython tools/analysis_tools/get_flops.py configs/detr/detr_r50_8x2_150e_coco.py\r\n```\r\n\r\n```python\r\n/home/bluav/mmdetection/mmdet/models/backbones/resnet.py:400: UserWarning: DeprecationWarning: pretrained is a deprecated, please use \"init_cfg\" instead\r\n warnings.warn('DeprecationWarning: pretrained is a deprecated, '\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nWarning: variables __flops__ or __params__ are already defined for the moduleReLU ptflops can affect your code!\r\nTraceback (most recent call last):\r\n File \"tools/analysis_tools/get_flops.py\", line 81, in <module>\r\n main()\r\n File \"tools/analysis_tools/get_flops.py\", line 71, in main\r\n flops, params = get_model_complexity_info(model, input_shape)\r\n File 
\"/home/bluav/.conda/envs/open-mmlab/lib/python3.7/site-packages/mmcv/cnn/utils/flops_counter.py\", line 104, in get_model_complexity_info\r\n _ = flops_model(batch)\r\n File \"/home/bluav/.conda/envs/open-mmlab/lib/python3.7/site-packages/torch/nn/modules/module.py\", line 889, in _call_impl\r\n result = self.forward(*input, **kwargs)\r\n File \"/home/bluav/mmdetection/mmdet/models/detectors/single_stage.py\", line 48, in forward_dummy\r\n outs = self.bbox_head(x)\r\n File \"/home/bluav/.conda/envs/open-mmlab/lib/python3.7/site-packages/torch/nn/modules/module.py\", line 889, in _call_impl\r\n result = self.forward(*input, **kwargs)\r\nTypeError: forward() missing 1 required positional argument: 'img_metas'\r\n```\r\n\n", "before_files": [{"content": "import torch\n\nfrom ..builder import DETECTORS\nfrom .single_stage import SingleStageDetector\n\n\[email protected]_module()\nclass DETR(SingleStageDetector):\n r\"\"\"Implementation of `DETR: End-to-End Object Detection with\n Transformers <https://arxiv.org/pdf/2005.12872>`_\"\"\"\n\n def __init__(self,\n backbone,\n bbox_head,\n train_cfg=None,\n test_cfg=None,\n pretrained=None,\n init_cfg=None):\n super(DETR, self).__init__(backbone, None, bbox_head, train_cfg,\n test_cfg, pretrained, init_cfg)\n\n # over-write `onnx_export` because:\n # (1) the forward of bbox_head requires img_metas\n # (2) the different behavior (e.g. construction of `masks`) between\n # torch and ONNX model, during the forward of bbox_head\n def onnx_export(self, img, img_metas):\n \"\"\"Test function for exporting to ONNX, without test time augmentation.\n\n Args:\n img (torch.Tensor): input images.\n img_metas (list[dict]): List of image information.\n\n Returns:\n tuple[Tensor, Tensor]: dets of shape [N, num_det, 5]\n and class labels of shape [N, num_det].\n \"\"\"\n x = self.extract_feat(img)\n # forward of this head requires img_metas\n outs = self.bbox_head.forward_onnx(x, img_metas)\n # get shape as tensor\n img_shape = torch._shape_as_tensor(img)[2:]\n img_metas[0]['img_shape_for_onnx'] = img_shape\n\n det_bboxes, det_labels = self.bbox_head.onnx_export(*outs, img_metas)\n\n return det_bboxes, det_labels\n", "path": "mmdet/models/detectors/detr.py"}], "after_files": [{"content": "import warnings\n\nimport torch\n\nfrom ..builder import DETECTORS\nfrom .single_stage import SingleStageDetector\n\n\[email protected]_module()\nclass DETR(SingleStageDetector):\n r\"\"\"Implementation of `DETR: End-to-End Object Detection with\n Transformers <https://arxiv.org/pdf/2005.12872>`_\"\"\"\n\n def __init__(self,\n backbone,\n bbox_head,\n train_cfg=None,\n test_cfg=None,\n pretrained=None,\n init_cfg=None):\n super(DETR, self).__init__(backbone, None, bbox_head, train_cfg,\n test_cfg, pretrained, init_cfg)\n\n # over-write `forward_dummy` because:\n # the forward of bbox_head requires img_metas\n def forward_dummy(self, img):\n \"\"\"Used for computing network flops.\n\n See `mmdetection/tools/analysis_tools/get_flops.py`\n \"\"\"\n warnings.warn('Warning! MultiheadAttention in DETR does not '\n 'support flops computation! Do not use the '\n 'results in your papers!')\n\n batch_size, _, height, width = img.shape\n dummy_img_metas = [\n dict(\n batch_input_shape=(height, width),\n img_shape=(height, width, 3)) for _ in range(batch_size)\n ]\n x = self.extract_feat(img)\n outs = self.bbox_head(x, dummy_img_metas)\n return outs\n\n # over-write `onnx_export` because:\n # (1) the forward of bbox_head requires img_metas\n # (2) the different behavior (e.g. 
construction of `masks`) between\n # torch and ONNX model, during the forward of bbox_head\n def onnx_export(self, img, img_metas):\n \"\"\"Test function for exporting to ONNX, without test time augmentation.\n\n Args:\n img (torch.Tensor): input images.\n img_metas (list[dict]): List of image information.\n\n Returns:\n tuple[Tensor, Tensor]: dets of shape [N, num_det, 5]\n and class labels of shape [N, num_det].\n \"\"\"\n x = self.extract_feat(img)\n # forward of this head requires img_metas\n outs = self.bbox_head.forward_onnx(x, img_metas)\n # get shape as tensor\n img_shape = torch._shape_as_tensor(img)[2:]\n img_metas[0]['img_shape_for_onnx'] = img_shape\n\n det_bboxes, det_labels = self.bbox_head.onnx_export(*outs, img_metas)\n\n return det_bboxes, det_labels\n", "path": "mmdet/models/detectors/detr.py"}]} | 1,524 | 355 |
gh_patches_debug_12016 | rasdani/github-patches | git_diff | celery__celery-450 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
os.kill is not available in windows before python 2.7
As per the topic, the current Celery implementation (>=2.3.0) crashes on Windows using Python 2.5 and 2.6, because it uses os.kill, which is not available on Windows before Python 2.7.
--- END ISSUE ---
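The portable pattern here is to bind the kill helper behind the existing platform check, importing `os.kill` only on non-Windows interpreters, where it is guaranteed to exist on these Python versions. A minimal sketch of that guard (the `_win` helper is the module already used in the file below; treat this as illustrative):

```python
# Sketch: only touch os.kill on non-Windows Pythons (py2.5/2.6 safe).
import platform

if platform.system() == "Windows":  # pragma: no cover
    from celery.concurrency.processes import _win
    _kill = _win.kill_processtree  # noqa
else:
    from os import kill as _kill
```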
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `celery/concurrency/processes/__init__.py`
Content:
```
1 """
2
3 Process Pools.
4
5 """
6 import platform
7 import signal as _signal
8
9 from os import kill as _kill
10
11 from celery.concurrency.base import BasePool
12 from celery.concurrency.processes.pool import Pool, RUN
13
14 if platform.system() == "Windows": # pragma: no cover
15 # On Windows os.kill calls TerminateProcess which cannot be
16 # handled by # any process, so this is needed to terminate the task
17 # *and its children* (if any).
18 from celery.concurrency.processes import _win
19 _kill = _win.kill_processtree # noqa
20
21
22 class TaskPool(BasePool):
23 """Process Pool for processing tasks in parallel.
24
25 :param processes: see :attr:`processes`.
26 :param logger: see :attr:`logger`.
27
28
29 .. attribute:: limit
30
31 The number of processes that can run simultaneously.
32
33 .. attribute:: logger
34
35 The logger used for debugging.
36
37 """
38 Pool = Pool
39
40 def on_start(self):
41 """Run the task pool.
42
43 Will pre-fork all workers so they're ready to accept tasks.
44
45 """
46 self._pool = self.Pool(processes=self.limit, **self.options)
47 self.on_apply = self._pool.apply_async
48
49 def on_stop(self):
50 """Gracefully stop the pool."""
51 if self._pool is not None and self._pool._state == RUN:
52 self._pool.close()
53 self._pool.join()
54 self._pool = None
55
56 def on_terminate(self):
57 """Force terminate the pool."""
58 if self._pool is not None:
59 self._pool.terminate()
60 self._pool = None
61
62 def terminate_job(self, pid, signal=None):
63 _kill(pid, signal or _signal.SIGTERM)
64
65 def grow(self, n=1):
66 return self._pool.grow(n)
67
68 def shrink(self, n=1):
69 return self._pool.shrink(n)
70
71 def _get_info(self):
72 return {"max-concurrency": self.limit,
73 "processes": [p.pid for p in self._pool._pool],
74 "max-tasks-per-child": self._pool._maxtasksperchild,
75 "put-guarded-by-semaphore": self.putlocks,
76 "timeouts": (self._pool.soft_timeout, self._pool.timeout)}
77
78 @property
79 def num_processes(self):
80 return self._pool._processes
81
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/celery/concurrency/processes/__init__.py b/celery/concurrency/processes/__init__.py
--- a/celery/concurrency/processes/__init__.py
+++ b/celery/concurrency/processes/__init__.py
@@ -6,8 +6,6 @@
import platform
import signal as _signal
-from os import kill as _kill
-
from celery.concurrency.base import BasePool
from celery.concurrency.processes.pool import Pool, RUN
@@ -17,6 +15,8 @@
# *and its children* (if any).
from celery.concurrency.processes import _win
_kill = _win.kill_processtree # noqa
+else:
+ from os import kill as _kill
class TaskPool(BasePool):
| {"golden_diff": "diff --git a/celery/concurrency/processes/__init__.py b/celery/concurrency/processes/__init__.py\n--- a/celery/concurrency/processes/__init__.py\n+++ b/celery/concurrency/processes/__init__.py\n@@ -6,8 +6,6 @@\n import platform\n import signal as _signal\n \n-from os import kill as _kill\n-\n from celery.concurrency.base import BasePool\n from celery.concurrency.processes.pool import Pool, RUN\n \n@@ -17,6 +15,8 @@\n # *and its children* (if any).\n from celery.concurrency.processes import _win\n _kill = _win.kill_processtree # noqa\n+else:\n+ from os import kill as _kill\n \n \n class TaskPool(BasePool):\n", "issue": "os.kill is not available in windows before python 2.7\nAs per the topic, the current celery implementation (>=2.3.0) crashes on windows using python 2.5 and 2.6, because it uses os.kill which is not available in windows before python 2.7\n\n", "before_files": [{"content": "\"\"\"\n\nProcess Pools.\n\n\"\"\"\nimport platform\nimport signal as _signal\n\nfrom os import kill as _kill\n\nfrom celery.concurrency.base import BasePool\nfrom celery.concurrency.processes.pool import Pool, RUN\n\nif platform.system() == \"Windows\": # pragma: no cover\n # On Windows os.kill calls TerminateProcess which cannot be\n # handled by # any process, so this is needed to terminate the task\n # *and its children* (if any).\n from celery.concurrency.processes import _win\n _kill = _win.kill_processtree # noqa\n\n\nclass TaskPool(BasePool):\n \"\"\"Process Pool for processing tasks in parallel.\n\n :param processes: see :attr:`processes`.\n :param logger: see :attr:`logger`.\n\n\n .. attribute:: limit\n\n The number of processes that can run simultaneously.\n\n .. attribute:: logger\n\n The logger used for debugging.\n\n \"\"\"\n Pool = Pool\n\n def on_start(self):\n \"\"\"Run the task pool.\n\n Will pre-fork all workers so they're ready to accept tasks.\n\n \"\"\"\n self._pool = self.Pool(processes=self.limit, **self.options)\n self.on_apply = self._pool.apply_async\n\n def on_stop(self):\n \"\"\"Gracefully stop the pool.\"\"\"\n if self._pool is not None and self._pool._state == RUN:\n self._pool.close()\n self._pool.join()\n self._pool = None\n\n def on_terminate(self):\n \"\"\"Force terminate the pool.\"\"\"\n if self._pool is not None:\n self._pool.terminate()\n self._pool = None\n\n def terminate_job(self, pid, signal=None):\n _kill(pid, signal or _signal.SIGTERM)\n\n def grow(self, n=1):\n return self._pool.grow(n)\n\n def shrink(self, n=1):\n return self._pool.shrink(n)\n\n def _get_info(self):\n return {\"max-concurrency\": self.limit,\n \"processes\": [p.pid for p in self._pool._pool],\n \"max-tasks-per-child\": self._pool._maxtasksperchild,\n \"put-guarded-by-semaphore\": self.putlocks,\n \"timeouts\": (self._pool.soft_timeout, self._pool.timeout)}\n\n @property\n def num_processes(self):\n return self._pool._processes\n", "path": "celery/concurrency/processes/__init__.py"}], "after_files": [{"content": "\"\"\"\n\nProcess Pools.\n\n\"\"\"\nimport platform\nimport signal as _signal\n\nfrom celery.concurrency.base import BasePool\nfrom celery.concurrency.processes.pool import Pool, RUN\n\nif platform.system() == \"Windows\": # pragma: no cover\n # On Windows os.kill calls TerminateProcess which cannot be\n # handled by # any process, so this is needed to terminate the task\n # *and its children* (if any).\n from celery.concurrency.processes import _win\n _kill = _win.kill_processtree # noqa\nelse:\n from os import kill as _kill\n\n\nclass TaskPool(BasePool):\n 
\"\"\"Process Pool for processing tasks in parallel.\n\n :param processes: see :attr:`processes`.\n :param logger: see :attr:`logger`.\n\n\n .. attribute:: limit\n\n The number of processes that can run simultaneously.\n\n .. attribute:: logger\n\n The logger used for debugging.\n\n \"\"\"\n Pool = Pool\n\n def on_start(self):\n \"\"\"Run the task pool.\n\n Will pre-fork all workers so they're ready to accept tasks.\n\n \"\"\"\n self._pool = self.Pool(processes=self.limit, **self.options)\n self.on_apply = self._pool.apply_async\n\n def on_stop(self):\n \"\"\"Gracefully stop the pool.\"\"\"\n if self._pool is not None and self._pool._state == RUN:\n self._pool.close()\n self._pool.join()\n self._pool = None\n\n def on_terminate(self):\n \"\"\"Force terminate the pool.\"\"\"\n if self._pool is not None:\n self._pool.terminate()\n self._pool = None\n\n def terminate_job(self, pid, signal=None):\n _kill(pid, signal or _signal.SIGTERM)\n\n def grow(self, n=1):\n return self._pool.grow(n)\n\n def shrink(self, n=1):\n return self._pool.shrink(n)\n\n def _get_info(self):\n return {\"max-concurrency\": self.limit,\n \"processes\": [p.pid for p in self._pool._pool],\n \"max-tasks-per-child\": self._pool._maxtasksperchild,\n \"put-guarded-by-semaphore\": self.putlocks,\n \"timeouts\": (self._pool.soft_timeout, self._pool.timeout)}\n\n @property\n def num_processes(self):\n return self._pool._processes\n", "path": "celery/concurrency/processes/__init__.py"}]} | 1,009 | 174 |
gh_patches_debug_27317 | rasdani/github-patches | git_diff | bridgecrewio__checkov-3126 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
How to Check a Data Source connected to a Resource
**Describe the issue**
I want to check if the attribute "group" of the resource "azuredevops_group_membership" refers to a data source "azuredevops_group" with the attribute "name" = "Build Administrators" for example.
**Examples**
Snippet from [terraform registry](https://registry.terraform.io/providers/microsoft/azuredevops/latest/docs/resources/group_membership)
```terraform
data "azuredevops_group" "example" {
project_id = azuredevops_project.example.id
name = "Build Administrators"
}
resource "azuredevops_group_membership" "example" {
group = data.azuredevops_group.example.descriptor
members = [
azuredevops_user_entitlement.example.descriptor
]
}
```
I tried creating a custom policy in Python, but I didn't understand how I could make this work; I was only able to create a policy that checks if the attribute name of the data azuredevops_group is equal to "Build Administrators":
```python
from typing import Dict, List, Any
from checkov.terraform.checks.data.base_check import BaseDataCheck
from checkov.common.models.enums import CheckResult, CheckCategories
class NoBuildAdministratorCreated(BaseDataCheck):
def __init__(self) -> None:
name = 'Ensure no build administrator is created on file'
id = "CKV_ADO_9000"
supported_data = ["azuredevops_group"]
categories = [CheckCategories.GENERAL_SECURITY]
super().__init__(name=name, id=id, categories=categories, supported_data=supported_data)
def scan_data_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:
if (conf.get("name", "Build Administrators")):
return CheckResult.FAILED
return CheckResult.PASSED
check = NoBuildAdministratorCreated()
```
**Version (please complete the following information):**
- Checkov Version 2.0.1223
**Additional context**
My goal is to check if people are creating admin groups inside of a terraform file. I'm kinda new to reading documentations and code libraries of open source projects so I'm having a bit of a hard time understanding how to use the checkov python scan functions to create custom policies. So any advice or code example to help me understand better how it works and what is this **conf** would be much appreciated, thanks!
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py`
Content:
```
1 import concurrent.futures
2 import re
3 from typing import List, Tuple, Dict, Any, Optional, Pattern
4
5 from networkx import DiGraph
6
7 from checkov.common.graph.checks_infra.enums import SolverType
8 from checkov.common.graph.checks_infra.solvers.base_solver import BaseSolver
9
10 from concurrent.futures import ThreadPoolExecutor
11
12 from checkov.common.graph.graph_builder import CustomAttributes
13 from checkov.common.graph.graph_builder.graph_components.block_types import BlockType
14 from checkov.common.util.var_utils import is_terraform_variable_dependent, is_cloudformation_variable_dependent
15
16 WILDCARD_PATTERN = re.compile(r"(\S+[.][*][.]*)+")
17
18
19 class BaseAttributeSolver(BaseSolver):
20 operator = ""
21
22 def __init__(self, resource_types: List[str], attribute: Optional[str], value: Any) -> None:
23 super().__init__(SolverType.ATTRIBUTE)
24 self.resource_types = resource_types
25 self.attribute = attribute
26 self.value = value
27
28 def run(self, graph_connector: DiGraph) -> Tuple[List[Dict[str, Any]], List[Dict[str, Any]]]:
29 executer = ThreadPoolExecutor()
30 jobs = []
31 passed_vertices: List[Dict[str, Any]] = []
32 failed_vertices: List[Dict[str, Any]] = []
33 for _, data in graph_connector.nodes(data=True):
34 if (not self.resource_types or data.get(CustomAttributes.RESOURCE_TYPE) in self.resource_types) \
35 and data.get(CustomAttributes.BLOCK_TYPE) == BlockType.RESOURCE:
36 jobs.append(executer.submit(self._process_node, data, passed_vertices, failed_vertices))
37
38 concurrent.futures.wait(jobs)
39 return passed_vertices, failed_vertices
40
41 def get_operation(self, vertex: Dict[str, Any]) -> bool:
42 if self.attribute and re.match(WILDCARD_PATTERN, self.attribute):
43 attribute_patterns = self.get_attribute_patterns(self.attribute)
44 attribute_matches = [
45 attr
46 for attr in vertex
47 if any(re.match(re.compile(attribute_pattern), attr) for attribute_pattern in attribute_patterns)
48 ]
49 if attribute_matches:
50 return self.resource_type_pred(vertex, self.resource_types) and any(
51 self._get_operation(vertex=vertex, attribute=attr) for attr in attribute_matches
52 )
53 return self.resource_type_pred(vertex, self.resource_types) and self._get_operation(
54 vertex=vertex, attribute=self.attribute
55 )
56
57 def _get_operation(self, vertex: Dict[str, Any], attribute: Optional[str]) -> bool:
58 raise NotImplementedError
59
60 def _process_node(
61 self, data: Dict[str, Any], passed_vartices: List[Dict[str, Any]], failed_vertices: List[Dict[str, Any]]
62 ) -> None:
63 if not self.resource_type_pred(data, self.resource_types):
64 return
65 if self.get_operation(vertex=data):
66 passed_vartices.append(data)
67 else:
68 failed_vertices.append(data)
69
70 @staticmethod
71 def get_attribute_patterns(attribute: str) -> Tuple[Pattern[str], Pattern[str]]:
72 index_pattern = r"[\d]+"
73 split_by_dots = attribute.split(".")
74
75 pattern_parts = []
76 pattern_parts_without_index = []
77 for attr_part in split_by_dots:
78 if attr_part == "*":
79 pattern_parts.append(index_pattern)
80 else:
81 attr_part_pattern = f"({attr_part})"
82 pattern_parts.append(attr_part_pattern)
83 pattern_parts_without_index.append(attr_part_pattern)
84
85 pattern = "[.]".join(pattern_parts)
86 pattern_with_index = re.compile(pattern)
87
88 pattern = "[.]".join(pattern_parts_without_index)
89 pattern_without_index = re.compile(pattern)
90
91 return pattern_with_index, pattern_without_index
92
93 @staticmethod
94 def _is_variable_dependant(value: Any, source: str) -> bool:
95 if source == 'Terraform' and is_terraform_variable_dependent(value):
96 return True
97 # TODO add logic for CloudFormation
98 # elif source == 'CloudFormation' and is_cloudformation_variable_dependent(value):
99 # return True
100
101 return False
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py b/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py
--- a/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py
+++ b/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py
@@ -12,7 +12,9 @@
from checkov.common.graph.graph_builder import CustomAttributes
from checkov.common.graph.graph_builder.graph_components.block_types import BlockType
from checkov.common.util.var_utils import is_terraform_variable_dependent, is_cloudformation_variable_dependent
+from checkov.terraform.graph_builder.graph_components.block_types import BlockType as TerraformBlockType
+SUPPORTED_BLOCK_TYPES = {BlockType.RESOURCE, TerraformBlockType.DATA}
WILDCARD_PATTERN = re.compile(r"(\S+[.][*][.]*)+")
@@ -32,7 +34,7 @@
failed_vertices: List[Dict[str, Any]] = []
for _, data in graph_connector.nodes(data=True):
if (not self.resource_types or data.get(CustomAttributes.RESOURCE_TYPE) in self.resource_types) \
- and data.get(CustomAttributes.BLOCK_TYPE) == BlockType.RESOURCE:
+ and data.get(CustomAttributes.BLOCK_TYPE) in SUPPORTED_BLOCK_TYPES:
jobs.append(executer.submit(self._process_node, data, passed_vertices, failed_vertices))
concurrent.futures.wait(jobs)
| {"golden_diff": "diff --git a/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py b/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py\n--- a/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py\n+++ b/checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py\n@@ -12,7 +12,9 @@\n from checkov.common.graph.graph_builder import CustomAttributes\n from checkov.common.graph.graph_builder.graph_components.block_types import BlockType\n from checkov.common.util.var_utils import is_terraform_variable_dependent, is_cloudformation_variable_dependent\n+from checkov.terraform.graph_builder.graph_components.block_types import BlockType as TerraformBlockType\n \n+SUPPORTED_BLOCK_TYPES = {BlockType.RESOURCE, TerraformBlockType.DATA}\n WILDCARD_PATTERN = re.compile(r\"(\\S+[.][*][.]*)+\")\n \n \n@@ -32,7 +34,7 @@\n failed_vertices: List[Dict[str, Any]] = []\n for _, data in graph_connector.nodes(data=True):\n if (not self.resource_types or data.get(CustomAttributes.RESOURCE_TYPE) in self.resource_types) \\\n- and data.get(CustomAttributes.BLOCK_TYPE) == BlockType.RESOURCE:\n+ and data.get(CustomAttributes.BLOCK_TYPE) in SUPPORTED_BLOCK_TYPES:\n jobs.append(executer.submit(self._process_node, data, passed_vertices, failed_vertices))\n \n concurrent.futures.wait(jobs)\n", "issue": "How to Check a Data Source connected to a Resource\n**Describe the issue**\r\nI want to check if the attribute \"group\" of the resource \"azuredevops_group_membership\" refers to a data source \"azuredevops_group\" with the attribute \"name\" = \"Build Administrators\" for example.\r\n\r\n**Examples**\r\nSnippet from [terraform registry](https://registry.terraform.io/providers/microsoft/azuredevops/latest/docs/resources/group_membership)\r\n```terraform\r\ndata \"azuredevops_group\" \"example\" {\r\n project_id = azuredevops_project.example.id\r\n name = \"Build Administrators\"\r\n}\r\n\r\nresource \"azuredevops_group_membership\" \"example\" {\r\n group = data.azuredevops_group.example.descriptor\r\n members = [\r\n azuredevops_user_entitlement.example.descriptor\r\n ]\r\n}\r\n```\r\nI tryed creating a custom policy in python but I didn't understand how I could make this work, I was only able to create a policy to check if the attribute name of the data azuredevops_group was equal to \"Build Administrators\":\r\n\r\n```python\r\nfrom typing import Dict, List, Any\r\n\r\nfrom checkov.terraform.checks.data.base_check import BaseDataCheck\r\nfrom checkov.common.models.enums import CheckResult, CheckCategories\r\n\r\nclass NoBuildAdministratorCreated(BaseDataCheck):\r\n def __init__(self) -> None:\r\n name = 'Ensure no build administrator is created on file'\r\n id = \"CKV_ADO_9000\"\r\n supported_data = [\"azuredevops_group\"]\r\n categories = [CheckCategories.GENERAL_SECURITY]\r\n super().__init__(name=name, id=id, categories=categories, supported_data=supported_data)\r\n\r\n def scan_data_conf(self, conf: Dict[str, List[Any]]) -> CheckResult:\r\n\r\n if (conf.get(\"name\", \"Build Administrators\")):\r\n return CheckResult.FAILED\r\n \r\n return CheckResult.PASSED\r\n\r\ncheck = NoBuildAdministratorCreated()\r\n```\r\n\r\n**Version (please complete the following information):**\r\n - Checkov Version 2.0.1223\r\n\r\n**Additional context**\r\nMy goal is to check if people are creating admin groups inside of a terraform file. 
I'm kinda new to reading documentations and code libraries of open source projects so I'm having a bit of a hard time understanding how to use the checkov python scan functions to create custom policies. So any advice or code example to help me understand better how it works and what is this **conf** would be much appreciated, thanks!\r\n\n", "before_files": [{"content": "import concurrent.futures\nimport re\nfrom typing import List, Tuple, Dict, Any, Optional, Pattern\n\nfrom networkx import DiGraph\n\nfrom checkov.common.graph.checks_infra.enums import SolverType\nfrom checkov.common.graph.checks_infra.solvers.base_solver import BaseSolver\n\nfrom concurrent.futures import ThreadPoolExecutor\n\nfrom checkov.common.graph.graph_builder import CustomAttributes\nfrom checkov.common.graph.graph_builder.graph_components.block_types import BlockType\nfrom checkov.common.util.var_utils import is_terraform_variable_dependent, is_cloudformation_variable_dependent\n\nWILDCARD_PATTERN = re.compile(r\"(\\S+[.][*][.]*)+\")\n\n\nclass BaseAttributeSolver(BaseSolver):\n operator = \"\"\n\n def __init__(self, resource_types: List[str], attribute: Optional[str], value: Any) -> None:\n super().__init__(SolverType.ATTRIBUTE)\n self.resource_types = resource_types\n self.attribute = attribute\n self.value = value\n\n def run(self, graph_connector: DiGraph) -> Tuple[List[Dict[str, Any]], List[Dict[str, Any]]]:\n executer = ThreadPoolExecutor()\n jobs = []\n passed_vertices: List[Dict[str, Any]] = []\n failed_vertices: List[Dict[str, Any]] = []\n for _, data in graph_connector.nodes(data=True):\n if (not self.resource_types or data.get(CustomAttributes.RESOURCE_TYPE) in self.resource_types) \\\n and data.get(CustomAttributes.BLOCK_TYPE) == BlockType.RESOURCE:\n jobs.append(executer.submit(self._process_node, data, passed_vertices, failed_vertices))\n\n concurrent.futures.wait(jobs)\n return passed_vertices, failed_vertices\n\n def get_operation(self, vertex: Dict[str, Any]) -> bool:\n if self.attribute and re.match(WILDCARD_PATTERN, self.attribute):\n attribute_patterns = self.get_attribute_patterns(self.attribute)\n attribute_matches = [\n attr\n for attr in vertex\n if any(re.match(re.compile(attribute_pattern), attr) for attribute_pattern in attribute_patterns)\n ]\n if attribute_matches:\n return self.resource_type_pred(vertex, self.resource_types) and any(\n self._get_operation(vertex=vertex, attribute=attr) for attr in attribute_matches\n )\n return self.resource_type_pred(vertex, self.resource_types) and self._get_operation(\n vertex=vertex, attribute=self.attribute\n )\n\n def _get_operation(self, vertex: Dict[str, Any], attribute: Optional[str]) -> bool:\n raise NotImplementedError\n\n def _process_node(\n self, data: Dict[str, Any], passed_vartices: List[Dict[str, Any]], failed_vertices: List[Dict[str, Any]]\n ) -> None:\n if not self.resource_type_pred(data, self.resource_types):\n return\n if self.get_operation(vertex=data):\n passed_vartices.append(data)\n else:\n failed_vertices.append(data)\n\n @staticmethod\n def get_attribute_patterns(attribute: str) -> Tuple[Pattern[str], Pattern[str]]:\n index_pattern = r\"[\\d]+\"\n split_by_dots = attribute.split(\".\")\n\n pattern_parts = []\n pattern_parts_without_index = []\n for attr_part in split_by_dots:\n if attr_part == \"*\":\n pattern_parts.append(index_pattern)\n else:\n attr_part_pattern = f\"({attr_part})\"\n pattern_parts.append(attr_part_pattern)\n pattern_parts_without_index.append(attr_part_pattern)\n\n pattern = 
\"[.]\".join(pattern_parts)\n pattern_with_index = re.compile(pattern)\n\n pattern = \"[.]\".join(pattern_parts_without_index)\n pattern_without_index = re.compile(pattern)\n\n return pattern_with_index, pattern_without_index\n\n @staticmethod\n def _is_variable_dependant(value: Any, source: str) -> bool:\n if source == 'Terraform' and is_terraform_variable_dependent(value):\n return True\n # TODO add logic for CloudFormation\n # elif source == 'CloudFormation' and is_cloudformation_variable_dependent(value):\n # return True\n\n return False\n", "path": "checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py"}], "after_files": [{"content": "import concurrent.futures\nimport re\nfrom typing import List, Tuple, Dict, Any, Optional, Pattern\n\nfrom networkx import DiGraph\n\nfrom checkov.common.graph.checks_infra.enums import SolverType\nfrom checkov.common.graph.checks_infra.solvers.base_solver import BaseSolver\n\nfrom concurrent.futures import ThreadPoolExecutor\n\nfrom checkov.common.graph.graph_builder import CustomAttributes\nfrom checkov.common.graph.graph_builder.graph_components.block_types import BlockType\nfrom checkov.common.util.var_utils import is_terraform_variable_dependent, is_cloudformation_variable_dependent\nfrom checkov.terraform.graph_builder.graph_components.block_types import BlockType as TerraformBlockType\n\nSUPPORTED_BLOCK_TYPES = {BlockType.RESOURCE, TerraformBlockType.DATA}\nWILDCARD_PATTERN = re.compile(r\"(\\S+[.][*][.]*)+\")\n\n\nclass BaseAttributeSolver(BaseSolver):\n operator = \"\"\n\n def __init__(self, resource_types: List[str], attribute: Optional[str], value: Any) -> None:\n super().__init__(SolverType.ATTRIBUTE)\n self.resource_types = resource_types\n self.attribute = attribute\n self.value = value\n\n def run(self, graph_connector: DiGraph) -> Tuple[List[Dict[str, Any]], List[Dict[str, Any]]]:\n executer = ThreadPoolExecutor()\n jobs = []\n passed_vertices: List[Dict[str, Any]] = []\n failed_vertices: List[Dict[str, Any]] = []\n for _, data in graph_connector.nodes(data=True):\n if (not self.resource_types or data.get(CustomAttributes.RESOURCE_TYPE) in self.resource_types) \\\n and data.get(CustomAttributes.BLOCK_TYPE) in SUPPORTED_BLOCK_TYPES:\n jobs.append(executer.submit(self._process_node, data, passed_vertices, failed_vertices))\n\n concurrent.futures.wait(jobs)\n return passed_vertices, failed_vertices\n\n def get_operation(self, vertex: Dict[str, Any]) -> bool:\n if self.attribute and re.match(WILDCARD_PATTERN, self.attribute):\n attribute_patterns = self.get_attribute_patterns(self.attribute)\n attribute_matches = [\n attr\n for attr in vertex\n if any(re.match(re.compile(attribute_pattern), attr) for attribute_pattern in attribute_patterns)\n ]\n if attribute_matches:\n return self.resource_type_pred(vertex, self.resource_types) and any(\n self._get_operation(vertex=vertex, attribute=attr) for attr in attribute_matches\n )\n return self.resource_type_pred(vertex, self.resource_types) and self._get_operation(\n vertex=vertex, attribute=self.attribute\n )\n\n def _get_operation(self, vertex: Dict[str, Any], attribute: Optional[str]) -> bool:\n raise NotImplementedError\n\n def _process_node(\n self, data: Dict[str, Any], passed_vartices: List[Dict[str, Any]], failed_vertices: List[Dict[str, Any]]\n ) -> None:\n if not self.resource_type_pred(data, self.resource_types):\n return\n if self.get_operation(vertex=data):\n passed_vartices.append(data)\n else:\n failed_vertices.append(data)\n\n @staticmethod\n def 
get_attribute_patterns(attribute: str) -> Tuple[Pattern[str], Pattern[str]]:\n index_pattern = r\"[\\d]+\"\n split_by_dots = attribute.split(\".\")\n\n pattern_parts = []\n pattern_parts_without_index = []\n for attr_part in split_by_dots:\n if attr_part == \"*\":\n pattern_parts.append(index_pattern)\n else:\n attr_part_pattern = f\"({attr_part})\"\n pattern_parts.append(attr_part_pattern)\n pattern_parts_without_index.append(attr_part_pattern)\n\n pattern = \"[.]\".join(pattern_parts)\n pattern_with_index = re.compile(pattern)\n\n pattern = \"[.]\".join(pattern_parts_without_index)\n pattern_without_index = re.compile(pattern)\n\n return pattern_with_index, pattern_without_index\n\n @staticmethod\n def _is_variable_dependant(value: Any, source: str) -> bool:\n if source == 'Terraform' and is_terraform_variable_dependent(value):\n return True\n # TODO add logic for CloudFormation\n # elif source == 'CloudFormation' and is_cloudformation_variable_dependent(value):\n # return True\n\n return False\n", "path": "checkov/common/checks_infra/solvers/attribute_solvers/base_attribute_solver.py"}]} | 1,863 | 323 |
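Condensed from the record above: the attribute solver's node filter admitted only `RESOURCE` vertices, so graph-based custom policies aimed at `data` sources were silently skipped. A sketch of the relaxed predicate, using the names introduced by the patch (`should_process` is a hypothetical wrapper around the inline condition in `run`):

```python
from checkov.common.graph.graph_builder import CustomAttributes
from checkov.common.graph.graph_builder.graph_components.block_types import BlockType
from checkov.terraform.graph_builder.graph_components.block_types import (
    BlockType as TerraformBlockType,
)

SUPPORTED_BLOCK_TYPES = {BlockType.RESOURCE, TerraformBlockType.DATA}


def should_process(data: dict, resource_types: list) -> bool:
    # Before the patch the last clause read:
    #   data.get(CustomAttributes.BLOCK_TYPE) == BlockType.RESOURCE
    type_ok = not resource_types or data.get(CustomAttributes.RESOURCE_TYPE) in resource_types
    return type_ok and data.get(CustomAttributes.BLOCK_TYPE) in SUPPORTED_BLOCK_TYPES
```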
gh_patches_debug_16894 | rasdani/github-patches | git_diff | svthalia__concrexit-2820 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Admin site doesnt show organizers
### Describe the bug
Organizers are not shown in the site admin
### How to reproduce
Steps to reproduce the behaviour:
1. Go to any event
2. See that the organizers field is empty
### Expected behaviour
there should be at least one organizer
### Additional context
multiple organizers broke things again
Admin site doesnt show organizers
### Describe the bug
Organizers are not shown in the site admin
### How to reproduce
Steps to reproduce the behaviour:
1. Go to any event
2. See that the organizers field is empty
### Expected behaviour
there should be at least one organizer
### Additional context
multiple organizers broke things again
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `website/events/emails.py`
Content:
```
1 """The emails defined by the events package."""
2 from django.conf import settings
3 from django.core.mail import EmailMessage
4 from django.template.loader import get_template
5 from django.utils.translation import gettext_lazy as _
6
7
8 def notify_first_waiting(event):
9 """Send an email to the first person on the waiting list when someone cancels their registration.
10
11 :param event: the event
12 """
13 if (
14 event.max_participants is not None
15 and event.eventregistration_set.filter(date_cancelled=None).count()
16 > event.max_participants
17 ):
18 # Prepare email to send to the first person on the waiting list
19 first_waiting = event.eventregistration_set.filter(
20 date_cancelled=None
21 ).order_by("date")[event.max_participants]
22
23 text_template = get_template("events/member_email.txt")
24
25 subject = _("[THALIA] Notification about your registration for '{}'").format(
26 event.title
27 )
28
29 organiser_emails = [
30 organiser.contact_address
31 for organiser in event.organisers.all()
32 if organiser.contact_address is not None
33 ]
34 text_message = text_template.render(
35 {
36 "event": event,
37 "registration": first_waiting,
38 "name": first_waiting.name or first_waiting.member.first_name,
39 "base_url": settings.BASE_URL,
40 "organisers": organiser_emails,
41 }
42 )
43
44 EmailMessage(subject, text_message, to=[first_waiting.email]).send()
45
46
47 def notify_organiser(event, registration):
48 """Send an email to the organiser of the event if someone cancels their registration.
49
50 :param event: the event
51 :param registration: the registration that was cancelled
52 """
53 if not event.organisers.exists():
54 return
55
56 text_template = get_template("events/organiser_email.txt")
57 subject = f"Registration for {event.title} cancelled by member"
58 text_message = text_template.render({"event": event, "registration": registration})
59
60 EmailMessage(
61 subject,
62 text_message,
63 to=[
64 organiser.contact_mailinglist.name + "@" + settings.SITE_DOMAIN
65 for organiser in event.organisers.all()
66 ],
67 ).send()
68
69
70 def notify_waiting(event, registration):
71 text_template = get_template("events/more_places_email.txt")
72 subject = _("[THALIA] Notification about your registration for '{}'").format(
73 event.title
74 )
75 text_message = text_template.render(
76 {
77 "event": event,
78 "registration": registration,
79 "name": registration.name or registration.member.first_name,
80 "base_url": settings.BASE_URL,
81 }
82 )
83 EmailMessage(subject, text_message, to=[registration.email]).send()
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/website/events/emails.py b/website/events/emails.py
--- a/website/events/emails.py
+++ b/website/events/emails.py
@@ -72,12 +72,21 @@
subject = _("[THALIA] Notification about your registration for '{}'").format(
event.title
)
+
+ organiser_emails = [
+ organiser.contact_address
+ for organiser in event.organisers.all()
+ if organiser.contact_address is not None
+ ]
+
text_message = text_template.render(
{
"event": event,
"registration": registration,
"name": registration.name or registration.member.first_name,
"base_url": settings.BASE_URL,
+ "organisers": organiser_emails,
}
)
+
EmailMessage(subject, text_message, to=[registration.email]).send()
| {"golden_diff": "diff --git a/website/events/emails.py b/website/events/emails.py\n--- a/website/events/emails.py\n+++ b/website/events/emails.py\n@@ -72,12 +72,21 @@\n subject = _(\"[THALIA] Notification about your registration for '{}'\").format(\n event.title\n )\n+\n+ organiser_emails = [\n+ organiser.contact_address\n+ for organiser in event.organisers.all()\n+ if organiser.contact_address is not None\n+ ]\n+\n text_message = text_template.render(\n {\n \"event\": event,\n \"registration\": registration,\n \"name\": registration.name or registration.member.first_name,\n \"base_url\": settings.BASE_URL,\n+ \"organisers\": organiser_emails,\n }\n )\n+\n EmailMessage(subject, text_message, to=[registration.email]).send()\n", "issue": "Admin site doesnt show organizers\n### Describe the bug\r\nOrganizers are not shown in the site admin\r\n\r\n### How to reproduce\r\nSteps to reproduce the behaviour:\r\n1. Go to any event\r\n2. See that the organizers field is empty\r\n\r\n### Expected behaviour\r\nthere should be at least one organizer\r\n\r\n### Additional context\r\nmultiple organizers broke things again\r\n\nAdmin site doesnt show organizers\n### Describe the bug\r\nOrganizers are not shown in the site admin\r\n\r\n### How to reproduce\r\nSteps to reproduce the behaviour:\r\n1. Go to any event\r\n2. See that the organizers field is empty\r\n\r\n### Expected behaviour\r\nthere should be at least one organizer\r\n\r\n### Additional context\r\nmultiple organizers broke things again\r\n\n", "before_files": [{"content": "\"\"\"The emails defined by the events package.\"\"\"\nfrom django.conf import settings\nfrom django.core.mail import EmailMessage\nfrom django.template.loader import get_template\nfrom django.utils.translation import gettext_lazy as _\n\n\ndef notify_first_waiting(event):\n \"\"\"Send an email to the first person on the waiting list when someone cancels their registration.\n\n :param event: the event\n \"\"\"\n if (\n event.max_participants is not None\n and event.eventregistration_set.filter(date_cancelled=None).count()\n > event.max_participants\n ):\n # Prepare email to send to the first person on the waiting list\n first_waiting = event.eventregistration_set.filter(\n date_cancelled=None\n ).order_by(\"date\")[event.max_participants]\n\n text_template = get_template(\"events/member_email.txt\")\n\n subject = _(\"[THALIA] Notification about your registration for '{}'\").format(\n event.title\n )\n\n organiser_emails = [\n organiser.contact_address\n for organiser in event.organisers.all()\n if organiser.contact_address is not None\n ]\n text_message = text_template.render(\n {\n \"event\": event,\n \"registration\": first_waiting,\n \"name\": first_waiting.name or first_waiting.member.first_name,\n \"base_url\": settings.BASE_URL,\n \"organisers\": organiser_emails,\n }\n )\n\n EmailMessage(subject, text_message, to=[first_waiting.email]).send()\n\n\ndef notify_organiser(event, registration):\n \"\"\"Send an email to the organiser of the event if someone cancels their registration.\n\n :param event: the event\n :param registration: the registration that was cancelled\n \"\"\"\n if not event.organisers.exists():\n return\n\n text_template = get_template(\"events/organiser_email.txt\")\n subject = f\"Registration for {event.title} cancelled by member\"\n text_message = text_template.render({\"event\": event, \"registration\": registration})\n\n EmailMessage(\n subject,\n text_message,\n to=[\n organiser.contact_mailinglist.name + \"@\" + settings.SITE_DOMAIN\n for 
organiser in event.organisers.all()\n ],\n ).send()\n\n\ndef notify_waiting(event, registration):\n text_template = get_template(\"events/more_places_email.txt\")\n subject = _(\"[THALIA] Notification about your registration for '{}'\").format(\n event.title\n )\n text_message = text_template.render(\n {\n \"event\": event,\n \"registration\": registration,\n \"name\": registration.name or registration.member.first_name,\n \"base_url\": settings.BASE_URL,\n }\n )\n EmailMessage(subject, text_message, to=[registration.email]).send()\n", "path": "website/events/emails.py"}], "after_files": [{"content": "\"\"\"The emails defined by the events package.\"\"\"\nfrom django.conf import settings\nfrom django.core.mail import EmailMessage\nfrom django.template.loader import get_template\nfrom django.utils.translation import gettext_lazy as _\n\n\ndef notify_first_waiting(event):\n \"\"\"Send an email to the first person on the waiting list when someone cancels their registration.\n\n :param event: the event\n \"\"\"\n if (\n event.max_participants is not None\n and event.eventregistration_set.filter(date_cancelled=None).count()\n > event.max_participants\n ):\n # Prepare email to send to the first person on the waiting list\n first_waiting = event.eventregistration_set.filter(\n date_cancelled=None\n ).order_by(\"date\")[event.max_participants]\n\n text_template = get_template(\"events/member_email.txt\")\n\n subject = _(\"[THALIA] Notification about your registration for '{}'\").format(\n event.title\n )\n\n organiser_emails = [\n organiser.contact_address\n for organiser in event.organisers.all()\n if organiser.contact_address is not None\n ]\n text_message = text_template.render(\n {\n \"event\": event,\n \"registration\": first_waiting,\n \"name\": first_waiting.name or first_waiting.member.first_name,\n \"base_url\": settings.BASE_URL,\n \"organisers\": organiser_emails,\n }\n )\n\n EmailMessage(subject, text_message, to=[first_waiting.email]).send()\n\n\ndef notify_organiser(event, registration):\n \"\"\"Send an email to the organiser of the event if someone cancels their registration.\n\n :param event: the event\n :param registration: the registration that was cancelled\n \"\"\"\n if not event.organisers.exists():\n return\n\n text_template = get_template(\"events/organiser_email.txt\")\n subject = f\"Registration for {event.title} cancelled by member\"\n text_message = text_template.render({\"event\": event, \"registration\": registration})\n\n EmailMessage(\n subject,\n text_message,\n to=[\n organiser.contact_mailinglist.name + \"@\" + settings.SITE_DOMAIN\n for organiser in event.organisers.all()\n ],\n ).send()\n\n\ndef notify_waiting(event, registration):\n text_template = get_template(\"events/more_places_email.txt\")\n subject = _(\"[THALIA] Notification about your registration for '{}'\").format(\n event.title\n )\n\n organiser_emails = [\n organiser.contact_address\n for organiser in event.organisers.all()\n if organiser.contact_address is not None\n ]\n\n text_message = text_template.render(\n {\n \"event\": event,\n \"registration\": registration,\n \"name\": registration.name or registration.member.first_name,\n \"base_url\": settings.BASE_URL,\n \"organisers\": organiser_emails,\n }\n )\n\n EmailMessage(subject, text_message, to=[registration.email]).send()\n", "path": "website/events/emails.py"}]} | 1,133 | 188 |
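Spelled out, the change this record lands is in `notify_waiting`: its template expects an `organisers` variable that the function never computed, while `notify_first_waiting` already did. A sketch of the added wiring (Django imports and the `BASE_URL` plumbing are elided; `build_waiting_context` is a hypothetical helper name):

```python
def build_waiting_context(event, registration):
    # Mirror the organiser list notify_first_waiting already builds; without
    # it the rendered email has an empty organisers section.
    organiser_emails = [
        organiser.contact_address
        for organiser in event.organisers.all()
        if organiser.contact_address is not None
    ]
    return {
        "event": event,
        "registration": registration,
        "name": registration.name or registration.member.first_name,
        "organisers": organiser_emails,
    }
```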
gh_patches_debug_28771 | rasdani/github-patches | git_diff | opsdroid__opsdroid-182 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add ssl to the web server
It should be possible to enable ssl on the web server and pass in paths to the ssl keys in the config.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `opsdroid/web.py`
Content:
```
1 """Submodule to handle web requests in opsdroid."""
2
3 import json
4 import logging
5
6 from aiohttp import web
7
8 from opsdroid.const import __version__
9
10
11 _LOGGER = logging.getLogger(__name__)
12
13
14 class Web:
15 """Web server for opsdroid."""
16
17 def __init__(self, opsdroid):
18 """Create web object."""
19 self.opsdroid = opsdroid
20 try:
21 self.config = self.opsdroid.config["web"]
22 except KeyError:
23 self.config = {}
24 self.web_app = web.Application(loop=self.opsdroid.eventloop)
25 self.web_app.router.add_get('/', self.web_index_handler)
26 self.web_app.router.add_get('', self.web_index_handler)
27 self.web_app.router.add_get('/stats', self.web_stats_handler)
28 self.web_app.router.add_get('/stats/', self.web_stats_handler)
29
30 @property
31 def get_port(self):
32 """Return port from config or the default."""
33 try:
34 port = self.config["port"]
35 except KeyError:
36 port = 8080
37 return port
38
39 @property
40 def get_host(self):
41 """Return host from config or the default."""
42 try:
43 host = self.config["host"]
44 except KeyError:
45 host = '127.0.0.1'
46 return host
47
48 def start(self):
49 """Start web servers."""
50 _LOGGER.debug(
51 "Starting web server with host %s and port %s",
52 self.get_host, self.get_port)
53 web.run_app(self.web_app, host=self.get_host,
54 port=self.get_port, print=_LOGGER.info)
55
56 @staticmethod
57 def build_response(status, result):
58 """Build a json response object."""
59 return web.Response(text=json.dumps(result), status=status)
60
61 def web_index_handler(self, request):
62 """Handle root web request."""
63 return self.build_response(200, {
64 "message": "Welcome to the opsdroid API"})
65
66 def web_stats_handler(self, request):
67 """Handle stats request."""
68 stats = self.opsdroid.stats
69 try:
70 stats["average_response_time"] = \
71 stats["total_response_time"] / stats["total_responses"]
72 except ZeroDivisionError:
73 stats["average_response_time"] = 0
74
75 return self.build_response(200, {
76 "version": __version__,
77 "messages": {
78 "total_parsed": stats["messages_parsed"],
79 "webhooks_called": stats["webhooks_called"],
80 "total_response_time": stats["total_response_time"],
81 "total_responses": stats["total_responses"],
82 "average_response_time": stats["average_response_time"]
83 },
84 "modules": {
85 "skills": len(self.opsdroid.skills),
86 "connectors": len(self.opsdroid.connectors),
87 "databases": len(self.opsdroid.memory.databases)
88 }
89 })
90
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/opsdroid/web.py b/opsdroid/web.py
--- a/opsdroid/web.py
+++ b/opsdroid/web.py
@@ -2,6 +2,7 @@
import json
import logging
+import ssl
from aiohttp import web
@@ -33,7 +34,10 @@
try:
port = self.config["port"]
except KeyError:
- port = 8080
+ if self.get_ssl_context is not None:
+ port = 8443
+ else:
+ port = 8080
return port
@property
@@ -45,13 +49,28 @@
host = '127.0.0.1'
return host
+ @property
+ def get_ssl_context(self):
+ """Return the ssl context or None."""
+ try:
+ ssl_config = self.config["ssl"]
+ sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23)
+ sslcontext.load_cert_chain(ssl_config["cert"], ssl_config["key"])
+ return sslcontext
+ except FileNotFoundError:
+ _LOGGER.error("Cannot find ssl cert or key.")
+ return None
+ except KeyError:
+ return None
+
def start(self):
"""Start web servers."""
_LOGGER.debug(
"Starting web server with host %s and port %s",
self.get_host, self.get_port)
web.run_app(self.web_app, host=self.get_host,
- port=self.get_port, print=_LOGGER.info)
+ port=self.get_port, print=_LOGGER.info,
+ ssl_context=self.get_ssl_context)
@staticmethod
def build_response(status, result):
| {"golden_diff": "diff --git a/opsdroid/web.py b/opsdroid/web.py\n--- a/opsdroid/web.py\n+++ b/opsdroid/web.py\n@@ -2,6 +2,7 @@\n \n import json\n import logging\n+import ssl\n \n from aiohttp import web\n \n@@ -33,7 +34,10 @@\n try:\n port = self.config[\"port\"]\n except KeyError:\n- port = 8080\n+ if self.get_ssl_context is not None:\n+ port = 8443\n+ else:\n+ port = 8080\n return port\n \n @property\n@@ -45,13 +49,28 @@\n host = '127.0.0.1'\n return host\n \n+ @property\n+ def get_ssl_context(self):\n+ \"\"\"Return the ssl context or None.\"\"\"\n+ try:\n+ ssl_config = self.config[\"ssl\"]\n+ sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23)\n+ sslcontext.load_cert_chain(ssl_config[\"cert\"], ssl_config[\"key\"])\n+ return sslcontext\n+ except FileNotFoundError:\n+ _LOGGER.error(\"Cannot find ssl cert or key.\")\n+ return None\n+ except KeyError:\n+ return None\n+\n def start(self):\n \"\"\"Start web servers.\"\"\"\n _LOGGER.debug(\n \"Starting web server with host %s and port %s\",\n self.get_host, self.get_port)\n web.run_app(self.web_app, host=self.get_host,\n- port=self.get_port, print=_LOGGER.info)\n+ port=self.get_port, print=_LOGGER.info,\n+ ssl_context=self.get_ssl_context)\n \n @staticmethod\n def build_response(status, result):\n", "issue": "Add ssl to the web server\nIt should be possible to enable ssl on the web server and pass in paths to the ssl keys in the config.\n", "before_files": [{"content": "\"\"\"Submodule to handle web requests in opsdroid.\"\"\"\n\nimport json\nimport logging\n\nfrom aiohttp import web\n\nfrom opsdroid.const import __version__\n\n\n_LOGGER = logging.getLogger(__name__)\n\n\nclass Web:\n \"\"\"Web server for opsdroid.\"\"\"\n\n def __init__(self, opsdroid):\n \"\"\"Create web object.\"\"\"\n self.opsdroid = opsdroid\n try:\n self.config = self.opsdroid.config[\"web\"]\n except KeyError:\n self.config = {}\n self.web_app = web.Application(loop=self.opsdroid.eventloop)\n self.web_app.router.add_get('/', self.web_index_handler)\n self.web_app.router.add_get('', self.web_index_handler)\n self.web_app.router.add_get('/stats', self.web_stats_handler)\n self.web_app.router.add_get('/stats/', self.web_stats_handler)\n\n @property\n def get_port(self):\n \"\"\"Return port from config or the default.\"\"\"\n try:\n port = self.config[\"port\"]\n except KeyError:\n port = 8080\n return port\n\n @property\n def get_host(self):\n \"\"\"Return host from config or the default.\"\"\"\n try:\n host = self.config[\"host\"]\n except KeyError:\n host = '127.0.0.1'\n return host\n\n def start(self):\n \"\"\"Start web servers.\"\"\"\n _LOGGER.debug(\n \"Starting web server with host %s and port %s\",\n self.get_host, self.get_port)\n web.run_app(self.web_app, host=self.get_host,\n port=self.get_port, print=_LOGGER.info)\n\n @staticmethod\n def build_response(status, result):\n \"\"\"Build a json response object.\"\"\"\n return web.Response(text=json.dumps(result), status=status)\n\n def web_index_handler(self, request):\n \"\"\"Handle root web request.\"\"\"\n return self.build_response(200, {\n \"message\": \"Welcome to the opsdroid API\"})\n\n def web_stats_handler(self, request):\n \"\"\"Handle stats request.\"\"\"\n stats = self.opsdroid.stats\n try:\n stats[\"average_response_time\"] = \\\n stats[\"total_response_time\"] / stats[\"total_responses\"]\n except ZeroDivisionError:\n stats[\"average_response_time\"] = 0\n\n return self.build_response(200, {\n \"version\": __version__,\n \"messages\": {\n \"total_parsed\": stats[\"messages_parsed\"],\n 
\"webhooks_called\": stats[\"webhooks_called\"],\n \"total_response_time\": stats[\"total_response_time\"],\n \"total_responses\": stats[\"total_responses\"],\n \"average_response_time\": stats[\"average_response_time\"]\n },\n \"modules\": {\n \"skills\": len(self.opsdroid.skills),\n \"connectors\": len(self.opsdroid.connectors),\n \"databases\": len(self.opsdroid.memory.databases)\n }\n })\n", "path": "opsdroid/web.py"}], "after_files": [{"content": "\"\"\"Submodule to handle web requests in opsdroid.\"\"\"\n\nimport json\nimport logging\nimport ssl\n\nfrom aiohttp import web\n\nfrom opsdroid.const import __version__\n\n\n_LOGGER = logging.getLogger(__name__)\n\n\nclass Web:\n \"\"\"Web server for opsdroid.\"\"\"\n\n def __init__(self, opsdroid):\n \"\"\"Create web object.\"\"\"\n self.opsdroid = opsdroid\n try:\n self.config = self.opsdroid.config[\"web\"]\n except KeyError:\n self.config = {}\n self.web_app = web.Application(loop=self.opsdroid.eventloop)\n self.web_app.router.add_get('/', self.web_index_handler)\n self.web_app.router.add_get('', self.web_index_handler)\n self.web_app.router.add_get('/stats', self.web_stats_handler)\n self.web_app.router.add_get('/stats/', self.web_stats_handler)\n\n @property\n def get_port(self):\n \"\"\"Return port from config or the default.\"\"\"\n try:\n port = self.config[\"port\"]\n except KeyError:\n if self.get_ssl_context is not None:\n port = 8443\n else:\n port = 8080\n return port\n\n @property\n def get_host(self):\n \"\"\"Return host from config or the default.\"\"\"\n try:\n host = self.config[\"host\"]\n except KeyError:\n host = '127.0.0.1'\n return host\n\n @property\n def get_ssl_context(self):\n \"\"\"Return the ssl context or None.\"\"\"\n try:\n ssl_config = self.config[\"ssl\"]\n sslcontext = ssl.SSLContext(ssl.PROTOCOL_SSLv23)\n sslcontext.load_cert_chain(ssl_config[\"cert\"], ssl_config[\"key\"])\n return sslcontext\n except FileNotFoundError:\n _LOGGER.error(\"Cannot find ssl cert or key.\")\n return None\n except KeyError:\n return None\n\n def start(self):\n \"\"\"Start web servers.\"\"\"\n _LOGGER.debug(\n \"Starting web server with host %s and port %s\",\n self.get_host, self.get_port)\n web.run_app(self.web_app, host=self.get_host,\n port=self.get_port, print=_LOGGER.info,\n ssl_context=self.get_ssl_context)\n\n @staticmethod\n def build_response(status, result):\n \"\"\"Build a json response object.\"\"\"\n return web.Response(text=json.dumps(result), status=status)\n\n def web_index_handler(self, request):\n \"\"\"Handle root web request.\"\"\"\n return self.build_response(200, {\n \"message\": \"Welcome to the opsdroid API\"})\n\n def web_stats_handler(self, request):\n \"\"\"Handle stats request.\"\"\"\n stats = self.opsdroid.stats\n try:\n stats[\"average_response_time\"] = \\\n stats[\"total_response_time\"] / stats[\"total_responses\"]\n except ZeroDivisionError:\n stats[\"average_response_time\"] = 0\n\n return self.build_response(200, {\n \"version\": __version__,\n \"messages\": {\n \"total_parsed\": stats[\"messages_parsed\"],\n \"webhooks_called\": stats[\"webhooks_called\"],\n \"total_response_time\": stats[\"total_response_time\"],\n \"total_responses\": stats[\"total_responses\"],\n \"average_response_time\": stats[\"average_response_time\"]\n },\n \"modules\": {\n \"skills\": len(self.opsdroid.skills),\n \"connectors\": len(self.opsdroid.connectors),\n \"databases\": len(self.opsdroid.memory.databases)\n }\n })\n", "path": "opsdroid/web.py"}]} | 1,080 | 387 |
gh_patches_debug_11252 | rasdani/github-patches | git_diff | iterative__dvc-4462 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
experiments: show table includes all staged/stashed experiments instead of only the currently applicable ones
```
example-get-started git:executor-tree py:dvc ❯ dvc exp show --no-pager --include-params=featurize
┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━┓
┃ Experiment ┃ auc ┃ featurize.max_features ┃ featurize.ngrams ┃
┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━┩
│ workspace │ 0.54175 │ 500 │ 5 │
│ bbdfa81 (2020-08-21 11:27:38) │ 0.54175 │ 500 │ 5 │
│ ├── ebbf40d (2020-08-21 11:28:42) │ 0.50822 │ 1500 │ 4 │
│ └── *32c3875 (2020-08-21 12:05:16) │ - │ 1500 │ 7 │
│ ├── *8cb834d (2020-08-21 12:04:59) │ - │ 1500 │ 2 │
│ ├── *32d107b (2020-08-21 12:05:01) │ - │ 1500 │ 5 │
│ └── *4f2c53c (2020-08-21 12:05:04) │ - │ 1500 │ 6 │
└────────────────────────────────────┴─────────┴────────────────────────┴──────────────────┘
```
the last 3 stashed experiments are derived from a different baseline commit and should be excluded by default (unless `--all-commit`/etc are used)
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dvc/repo/experiments/show.py`
Content:
```
1 import logging
2 import re
3 from collections import OrderedDict, defaultdict
4 from datetime import datetime
5
6 from dvc.repo import locked
7 from dvc.repo.metrics.show import _collect_metrics, _read_metrics
8 from dvc.repo.params.show import _collect_configs, _read_params
9
10 logger = logging.getLogger(__name__)
11
12
13 EXP_RE = re.compile(r"(?P<rev_sha>[a-f0-9]{7})-(?P<exp_sha>[a-f0-9]+)")
14
15
16 def _collect_experiment(repo, branch, stash=False):
17 res = defaultdict(dict)
18 for rev in repo.brancher(revs=[branch]):
19 if rev == "workspace":
20 res["timestamp"] = None
21 else:
22 commit = repo.scm.repo.rev_parse(rev)
23 res["timestamp"] = datetime.fromtimestamp(commit.committed_date)
24
25 configs = _collect_configs(repo)
26 params = _read_params(repo, configs, rev)
27 if params:
28 res["params"] = params
29
30 res["queued"] = stash
31 if not stash:
32 metrics = _collect_metrics(repo, None, False)
33 vals = _read_metrics(repo, metrics, rev)
34 res["metrics"] = vals
35
36 return res
37
38
39 @locked
40 def show(
41 repo, all_branches=False, all_tags=False, revs=None, all_commits=False
42 ):
43 res = defaultdict(OrderedDict)
44
45 if revs is None:
46 revs = [repo.scm.get_rev()]
47
48 revs = OrderedDict(
49 (rev, None)
50 for rev in repo.brancher(
51 revs=revs,
52 all_branches=all_branches,
53 all_tags=all_tags,
54 all_commits=all_commits,
55 )
56 )
57
58 for rev in revs:
59 res[rev]["baseline"] = _collect_experiment(repo, rev)
60
61 # collect reproduced experiments
62 for exp_branch in repo.experiments.scm.list_branches():
63 m = re.match(EXP_RE, exp_branch)
64 if m:
65 rev = repo.scm.resolve_rev(m.group("rev_sha"))
66 if rev in revs:
67 exp_rev = repo.experiments.scm.resolve_rev(exp_branch)
68 with repo.experiments.chdir():
69 experiment = _collect_experiment(
70 repo.experiments.exp_dvc, exp_branch
71 )
72 res[rev][exp_rev] = experiment
73
74 # collect queued (not yet reproduced) experiments
75 for stash_rev, (_, baseline_rev) in repo.experiments.stash_revs.items():
76 with repo.experiments.chdir():
77 experiment = _collect_experiment(
78 repo.experiments.exp_dvc, stash_rev, stash=True
79 )
80 res[baseline_rev][stash_rev] = experiment
81
82 return res
83
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dvc/repo/experiments/show.py b/dvc/repo/experiments/show.py
--- a/dvc/repo/experiments/show.py
+++ b/dvc/repo/experiments/show.py
@@ -73,10 +73,11 @@
# collect queued (not yet reproduced) experiments
for stash_rev, (_, baseline_rev) in repo.experiments.stash_revs.items():
- with repo.experiments.chdir():
- experiment = _collect_experiment(
- repo.experiments.exp_dvc, stash_rev, stash=True
- )
- res[baseline_rev][stash_rev] = experiment
+ if baseline_rev in revs:
+ with repo.experiments.chdir():
+ experiment = _collect_experiment(
+ repo.experiments.exp_dvc, stash_rev, stash=True
+ )
+ res[baseline_rev][stash_rev] = experiment
return res
| {"golden_diff": "diff --git a/dvc/repo/experiments/show.py b/dvc/repo/experiments/show.py\n--- a/dvc/repo/experiments/show.py\n+++ b/dvc/repo/experiments/show.py\n@@ -73,10 +73,11 @@\n \n # collect queued (not yet reproduced) experiments\n for stash_rev, (_, baseline_rev) in repo.experiments.stash_revs.items():\n- with repo.experiments.chdir():\n- experiment = _collect_experiment(\n- repo.experiments.exp_dvc, stash_rev, stash=True\n- )\n- res[baseline_rev][stash_rev] = experiment\n+ if baseline_rev in revs:\n+ with repo.experiments.chdir():\n+ experiment = _collect_experiment(\n+ repo.experiments.exp_dvc, stash_rev, stash=True\n+ )\n+ res[baseline_rev][stash_rev] = experiment\n \n return res\n", "issue": "experiments: show table includes all staged/stashed experiments instead of only the currently applicable ones\n```\r\nexample-get-started git:executor-tree py:dvc \u276f dvc exp show --no-pager --include-params=featurize\r\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\r\n\u2503 Experiment \u2503 auc \u2503 featurize.max_features \u2503 featurize.ngrams \u2503\r\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\r\n\u2502 workspace \u2502 0.54175 \u2502 500 \u2502 5 \u2502\r\n\u2502 bbdfa81 (2020-08-21 11:27:38) \u2502 0.54175 \u2502 500 \u2502 5 \u2502\r\n\u2502 \u251c\u2500\u2500 ebbf40d (2020-08-21 11:28:42) \u2502 0.50822 \u2502 1500 \u2502 4 \u2502\r\n\u2502 \u2514\u2500\u2500 *32c3875 (2020-08-21 12:05:16) \u2502 - \u2502 1500 \u2502 7 \u2502\r\n\u2502 \u251c\u2500\u2500 *8cb834d (2020-08-21 12:04:59) \u2502 - \u2502 1500 \u2502 2 \u2502\r\n\u2502 \u251c\u2500\u2500 *32d107b (2020-08-21 12:05:01) \u2502 - \u2502 1500 \u2502 5 \u2502\r\n\u2502 \u2514\u2500\u2500 *4f2c53c (2020-08-21 12:05:04) \u2502 - \u2502 1500 \u2502 6 \u2502\r\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\r\n```\r\n\r\nthe last 3 stashed experiments are derived from a different baseline commit and should be excluded by default (unless `--all-commit`/etc are used)\n", "before_files": [{"content": "import logging\nimport 
re\nfrom collections import OrderedDict, defaultdict\nfrom datetime import datetime\n\nfrom dvc.repo import locked\nfrom dvc.repo.metrics.show import _collect_metrics, _read_metrics\nfrom dvc.repo.params.show import _collect_configs, _read_params\n\nlogger = logging.getLogger(__name__)\n\n\nEXP_RE = re.compile(r\"(?P<rev_sha>[a-f0-9]{7})-(?P<exp_sha>[a-f0-9]+)\")\n\n\ndef _collect_experiment(repo, branch, stash=False):\n res = defaultdict(dict)\n for rev in repo.brancher(revs=[branch]):\n if rev == \"workspace\":\n res[\"timestamp\"] = None\n else:\n commit = repo.scm.repo.rev_parse(rev)\n res[\"timestamp\"] = datetime.fromtimestamp(commit.committed_date)\n\n configs = _collect_configs(repo)\n params = _read_params(repo, configs, rev)\n if params:\n res[\"params\"] = params\n\n res[\"queued\"] = stash\n if not stash:\n metrics = _collect_metrics(repo, None, False)\n vals = _read_metrics(repo, metrics, rev)\n res[\"metrics\"] = vals\n\n return res\n\n\n@locked\ndef show(\n repo, all_branches=False, all_tags=False, revs=None, all_commits=False\n):\n res = defaultdict(OrderedDict)\n\n if revs is None:\n revs = [repo.scm.get_rev()]\n\n revs = OrderedDict(\n (rev, None)\n for rev in repo.brancher(\n revs=revs,\n all_branches=all_branches,\n all_tags=all_tags,\n all_commits=all_commits,\n )\n )\n\n for rev in revs:\n res[rev][\"baseline\"] = _collect_experiment(repo, rev)\n\n # collect reproduced experiments\n for exp_branch in repo.experiments.scm.list_branches():\n m = re.match(EXP_RE, exp_branch)\n if m:\n rev = repo.scm.resolve_rev(m.group(\"rev_sha\"))\n if rev in revs:\n exp_rev = repo.experiments.scm.resolve_rev(exp_branch)\n with repo.experiments.chdir():\n experiment = _collect_experiment(\n repo.experiments.exp_dvc, exp_branch\n )\n res[rev][exp_rev] = experiment\n\n # collect queued (not yet reproduced) experiments\n for stash_rev, (_, baseline_rev) in repo.experiments.stash_revs.items():\n with repo.experiments.chdir():\n experiment = _collect_experiment(\n repo.experiments.exp_dvc, stash_rev, stash=True\n )\n res[baseline_rev][stash_rev] = experiment\n\n return res\n", "path": "dvc/repo/experiments/show.py"}], "after_files": [{"content": "import logging\nimport re\nfrom collections import OrderedDict, defaultdict\nfrom datetime import datetime\n\nfrom dvc.repo import locked\nfrom dvc.repo.metrics.show import _collect_metrics, _read_metrics\nfrom dvc.repo.params.show import _collect_configs, _read_params\n\nlogger = logging.getLogger(__name__)\n\n\nEXP_RE = re.compile(r\"(?P<rev_sha>[a-f0-9]{7})-(?P<exp_sha>[a-f0-9]+)\")\n\n\ndef _collect_experiment(repo, branch, stash=False):\n res = defaultdict(dict)\n for rev in repo.brancher(revs=[branch]):\n if rev == \"workspace\":\n res[\"timestamp\"] = None\n else:\n commit = repo.scm.repo.rev_parse(rev)\n res[\"timestamp\"] = datetime.fromtimestamp(commit.committed_date)\n\n configs = _collect_configs(repo)\n params = _read_params(repo, configs, rev)\n if params:\n res[\"params\"] = params\n\n res[\"queued\"] = stash\n if not stash:\n metrics = _collect_metrics(repo, None, False)\n vals = _read_metrics(repo, metrics, rev)\n res[\"metrics\"] = vals\n\n return res\n\n\n@locked\ndef show(\n repo, all_branches=False, all_tags=False, revs=None, all_commits=False\n):\n res = defaultdict(OrderedDict)\n\n if revs is None:\n revs = [repo.scm.get_rev()]\n\n revs = OrderedDict(\n (rev, None)\n for rev in repo.brancher(\n revs=revs,\n all_branches=all_branches,\n all_tags=all_tags,\n all_commits=all_commits,\n )\n )\n\n for rev in revs:\n 
res[rev][\"baseline\"] = _collect_experiment(repo, rev)\n\n # collect reproduced experiments\n for exp_branch in repo.experiments.scm.list_branches():\n m = re.match(EXP_RE, exp_branch)\n if m:\n rev = repo.scm.resolve_rev(m.group(\"rev_sha\"))\n if rev in revs:\n exp_rev = repo.experiments.scm.resolve_rev(exp_branch)\n with repo.experiments.chdir():\n experiment = _collect_experiment(\n repo.experiments.exp_dvc, exp_branch\n )\n res[rev][exp_rev] = experiment\n\n # collect queued (not yet reproduced) experiments\n for stash_rev, (_, baseline_rev) in repo.experiments.stash_revs.items():\n if baseline_rev in revs:\n with repo.experiments.chdir():\n experiment = _collect_experiment(\n repo.experiments.exp_dvc, stash_rev, stash=True\n )\n res[baseline_rev][stash_rev] = experiment\n\n return res\n", "path": "dvc/repo/experiments/show.py"}]} | 1,546 | 196 |
gh_patches_debug_8445 | rasdani/github-patches | git_diff | zestedesavoir__zds-site-2951 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
L'affichage des pseudos dans la liste des tutoriels / article déconne
L'affichage du pseudo "Bat'" n'est pas correct.

Possible de voir le comportement sur la page: https://zestedesavoir.com/tutoriels/?tag=dot-net
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `zds/utils/templatetags/captureas.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 from django import template
4
5 register = template.Library()
6
7 """
8 Define a tag allowing to capture template content as a variable.
9 """
10
11
12 @register.tag(name='captureas')
13 def do_captureas(parser, token):
14 """
15 Define a tag allowing to capture template content as a variable.
16
17 :param parser: The django template parser
18 :param token: tag token (tag_name + variable_name)
19 :return: Template node.
20 """
21
22 try:
23 _, variable_name = token.split_contents()
24 except ValueError:
25 raise template.TemplateSyntaxError("'captureas' node requires a variable name.")
26
27 nodelist = parser.parse(('endcaptureas',))
28 parser.delete_first_token()
29
30 return CaptureasNode(nodelist, variable_name)
31
32
33 class CaptureasNode(template.Node):
34 """
35 Capture end render node content to a variable name.
36 """
37
38 def __init__(self, nodelist, variable_name):
39 """
40 Create a template node which render `nodelist` to `variable_name`.
41
42 :param nodelist: The node list to capture.
43 :param variable_name: The variable name which will gain the rendered content.
44 """
45 self.__node_list = nodelist
46 self.__variable_name = variable_name
47
48 def render(self, context):
49 """
50 Render the node list to the variable name.
51
52 :param context: Current context.
53 :return: Empty string
54 :rtype: str
55 """
56 output = self.__node_list.render(context)
57 context[self.__variable_name] = output.strip()
58 return ''
59
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/zds/utils/templatetags/captureas.py b/zds/utils/templatetags/captureas.py
--- a/zds/utils/templatetags/captureas.py
+++ b/zds/utils/templatetags/captureas.py
@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
from django import template
+from django.utils.safestring import mark_safe
register = template.Library()
@@ -54,5 +55,5 @@
:rtype: str
"""
output = self.__node_list.render(context)
- context[self.__variable_name] = output.strip()
+ context[self.__variable_name] = mark_safe(output.strip())
return ''
| {"golden_diff": "diff --git a/zds/utils/templatetags/captureas.py b/zds/utils/templatetags/captureas.py\n--- a/zds/utils/templatetags/captureas.py\n+++ b/zds/utils/templatetags/captureas.py\n@@ -1,6 +1,7 @@\n # -*- coding: utf-8 -*-\n \n from django import template\n+from django.utils.safestring import mark_safe\n \n register = template.Library()\n \n@@ -54,5 +55,5 @@\n :rtype: str\n \"\"\"\n output = self.__node_list.render(context)\n- context[self.__variable_name] = output.strip()\n+ context[self.__variable_name] = mark_safe(output.strip())\n return ''\n", "issue": "L'affichage des pseudos dans la liste des tutoriels / article d\u00e9conne\nL'affichage du pseudo \"Bat'\" n'est pas correct.\n\n\n\nPossible de voir le comportement sur la page: https://zestedesavoir.com/tutoriels/?tag=dot-net\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom django import template\n\nregister = template.Library()\n\n\"\"\"\nDefine a tag allowing to capture template content as a variable.\n\"\"\"\n\n\[email protected](name='captureas')\ndef do_captureas(parser, token):\n \"\"\"\n Define a tag allowing to capture template content as a variable.\n\n :param parser: The django template parser\n :param token: tag token (tag_name + variable_name)\n :return: Template node.\n \"\"\"\n\n try:\n _, variable_name = token.split_contents()\n except ValueError:\n raise template.TemplateSyntaxError(\"'captureas' node requires a variable name.\")\n\n nodelist = parser.parse(('endcaptureas',))\n parser.delete_first_token()\n\n return CaptureasNode(nodelist, variable_name)\n\n\nclass CaptureasNode(template.Node):\n \"\"\"\n Capture end render node content to a variable name.\n \"\"\"\n\n def __init__(self, nodelist, variable_name):\n \"\"\"\n Create a template node which render `nodelist` to `variable_name`.\n\n :param nodelist: The node list to capture.\n :param variable_name: The variable name which will gain the rendered content.\n \"\"\"\n self.__node_list = nodelist\n self.__variable_name = variable_name\n\n def render(self, context):\n \"\"\"\n Render the node list to the variable name.\n\n :param context: Current context.\n :return: Empty string\n :rtype: str\n \"\"\"\n output = self.__node_list.render(context)\n context[self.__variable_name] = output.strip()\n return ''\n", "path": "zds/utils/templatetags/captureas.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\nfrom django import template\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\n\"\"\"\nDefine a tag allowing to capture template content as a variable.\n\"\"\"\n\n\[email protected](name='captureas')\ndef do_captureas(parser, token):\n \"\"\"\n Define a tag allowing to capture template content as a variable.\n\n :param parser: The django template parser\n :param token: tag token (tag_name + variable_name)\n :return: Template node.\n \"\"\"\n\n try:\n _, variable_name = token.split_contents()\n except ValueError:\n raise template.TemplateSyntaxError(\"'captureas' node requires a variable name.\")\n\n nodelist = parser.parse(('endcaptureas',))\n parser.delete_first_token()\n\n return CaptureasNode(nodelist, variable_name)\n\n\nclass CaptureasNode(template.Node):\n \"\"\"\n Capture end render node content to a variable name.\n \"\"\"\n\n def __init__(self, nodelist, variable_name):\n \"\"\"\n Create a template node which render `nodelist` to `variable_name`.\n\n :param nodelist: The node list to capture.\n :param variable_name: The variable name which will gain the rendered content.\n \"\"\"\n 
self.__node_list = nodelist\n self.__variable_name = variable_name\n\n def render(self, context):\n \"\"\"\n Render the node list to the variable name.\n\n :param context: Current context.\n :return: Empty string\n :rtype: str\n \"\"\"\n output = self.__node_list.render(context)\n context[self.__variable_name] = mark_safe(output.strip())\n return ''\n", "path": "zds/utils/templatetags/captureas.py"}]} | 834 | 160 |
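What the `mark_safe` fix above addresses, in isolation: `nodelist.render()` returns an already-escaped `SafeString`, but plain `str.strip()` returns an ordinary `str`, so the safe marker is lost and Django's auto-escaping escapes the fragment a second time, turning `Bat'` into `Bat&amp;#x27;`. A standalone sketch follows, assuming a Django version where `escape()` returns a `SafeString` and `str.strip()` drops the safe marker (the behaviour the patch works around); `conditional_escape` models what the template engine does to a variable under auto-escape.

```python
from django.utils.html import conditional_escape, escape
from django.utils.safestring import mark_safe

rendered = escape("Bat'")    # SafeString "Bat&#x27;", like nodelist.render() output
stripped = rendered.strip()  # str.strip() returns a plain str: safety flag is lost

print(conditional_escape(stripped))             # Bat&amp;#x27;  (double-escaped)
print(conditional_escape(mark_safe(stripped)))  # Bat&#x27;      (the patched behaviour)
```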
gh_patches_debug_1113 | rasdani/github-patches | git_diff | Pylons__pyramid-2225 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Update to Sphinx 1.3.4 when released
There is a [bug in Sphinx 1.3.3 and 1.3.1](https://github.com/sphinx-doc/sphinx/issues/2189) (I haven't tried 1.3.2) where next and previous links in Sphinx documentation are broken when going into children and across sibling directories.
When 1.3.4 is released, we need to pin sphinx to 1.3.4, which will include the commit made 8 days after the 1.3.3 release.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 ##############################################################################
2 #
3 # Copyright (c) 2008-2013 Agendaless Consulting and Contributors.
4 # All Rights Reserved.
5 #
6 # This software is subject to the provisions of the BSD-like license at
7 # http://www.repoze.org/LICENSE.txt. A copy of the license should accompany
8 # this distribution. THIS SOFTWARE IS PROVIDED "AS IS" AND ANY AND ALL
9 # EXPRESS OR IMPLIED WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO,
10 # THE IMPLIED WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND
11 # FITNESS FOR A PARTICULAR PURPOSE
12 #
13 ##############################################################################
14
15 import os
16 import sys
17
18 from setuptools import setup, find_packages
19
20 py_version = sys.version_info[:2]
21
22 PY3 = py_version[0] == 3
23
24 if PY3:
25 if py_version < (3, 2):
26 raise RuntimeError('On Python 3, Pyramid requires Python 3.2 or better')
27 else:
28 if py_version < (2, 6):
29 raise RuntimeError('On Python 2, Pyramid requires Python 2.6 or better')
30
31 here = os.path.abspath(os.path.dirname(__file__))
32 try:
33 with open(os.path.join(here, 'README.rst')) as f:
34 README = f.read()
35 with open(os.path.join(here, 'CHANGES.txt')) as f:
36 CHANGES = f.read()
37 except IOError:
38 README = CHANGES = ''
39
40 install_requires=[
41 'setuptools',
42 'WebOb >= 1.3.1', # request.domain and CookieProfile
43 'repoze.lru >= 0.4', # py3 compat
44 'zope.interface >= 3.8.0', # has zope.interface.registry
45 'zope.deprecation >= 3.5.0', # py3 compat
46 'venusian >= 1.0a3', # ``ignore``
47 'translationstring >= 0.4', # py3 compat
48 'PasteDeploy >= 1.5.0', # py3 compat
49 ]
50
51 tests_require = [
52 'WebTest >= 1.3.1', # py3 compat
53 ]
54
55 if not PY3:
56 tests_require.append('zope.component>=3.11.0')
57
58 docs_extras = [
59 'Sphinx >= 1.3.1',
60 'docutils',
61 'repoze.sphinx.autointerface',
62 'pylons_sphinx_latesturl',
63 'pylons-sphinx-themes',
64 'sphinxcontrib-programoutput',
65 ]
66
67 testing_extras = tests_require + [
68 'nose',
69 'coverage',
70 'virtualenv', # for scaffolding tests
71 ]
72
73 setup(name='pyramid',
74 version='1.6',
75 description='The Pyramid Web Framework, a Pylons project',
76 long_description=README + '\n\n' + CHANGES,
77 classifiers=[
78 "Development Status :: 6 - Mature",
79 "Intended Audience :: Developers",
80 "Programming Language :: Python",
81 "Programming Language :: Python :: 2.6",
82 "Programming Language :: Python :: 2.7",
83 "Programming Language :: Python :: 3",
84 "Programming Language :: Python :: 3.2",
85 "Programming Language :: Python :: 3.3",
86 "Programming Language :: Python :: 3.4",
87 "Programming Language :: Python :: 3.5",
88 "Programming Language :: Python :: Implementation :: CPython",
89 "Programming Language :: Python :: Implementation :: PyPy",
90 "Framework :: Pyramid",
91 "Topic :: Internet :: WWW/HTTP",
92 "Topic :: Internet :: WWW/HTTP :: WSGI",
93 "License :: Repoze Public License",
94 ],
95 keywords='web wsgi pylons pyramid',
96 author="Chris McDonough, Agendaless Consulting",
97 author_email="[email protected]",
98 url="http://docs.pylonsproject.org/en/latest/docs/pyramid.html",
99 license="BSD-derived (http://www.repoze.org/LICENSE.txt)",
100 packages=find_packages(),
101 include_package_data=True,
102 zip_safe=False,
103 install_requires = install_requires,
104 extras_require = {
105 'testing':testing_extras,
106 'docs':docs_extras,
107 },
108 tests_require = tests_require,
109 test_suite="pyramid.tests",
110 entry_points = """\
111 [pyramid.scaffold]
112 starter=pyramid.scaffolds:StarterProjectTemplate
113 zodb=pyramid.scaffolds:ZODBProjectTemplate
114 alchemy=pyramid.scaffolds:AlchemyProjectTemplate
115 [pyramid.pshell_runner]
116 python=pyramid.scripts.pshell:python_shell_runner
117 [console_scripts]
118 pcreate = pyramid.scripts.pcreate:main
119 pserve = pyramid.scripts.pserve:main
120 pshell = pyramid.scripts.pshell:main
121 proutes = pyramid.scripts.proutes:main
122 pviews = pyramid.scripts.pviews:main
123 ptweens = pyramid.scripts.ptweens:main
124 prequest = pyramid.scripts.prequest:main
125 pdistreport = pyramid.scripts.pdistreport:main
126 [paste.server_runner]
127 wsgiref = pyramid.scripts.pserve:wsgiref_server_runner
128 cherrypy = pyramid.scripts.pserve:cherrypy_server_runner
129 """
130 )
131
132
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -56,7 +56,7 @@
tests_require.append('zope.component>=3.11.0')
docs_extras = [
- 'Sphinx >= 1.3.1',
+ 'Sphinx >= 1.3.4',
'docutils',
'repoze.sphinx.autointerface',
'pylons_sphinx_latesturl',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -56,7 +56,7 @@\n tests_require.append('zope.component>=3.11.0')\n \n docs_extras = [\n- 'Sphinx >= 1.3.1',\n+ 'Sphinx >= 1.3.4',\n 'docutils',\n 'repoze.sphinx.autointerface',\n 'pylons_sphinx_latesturl',\n", "issue": "Update to Sphinx 1.3.4 when released\nThere is a [bug in Sphinx 1.3.3 and 1.3.1](https://github.com/sphinx-doc/sphinx/issues/2189) (I haven't tried 1.3.2) where next and previous links in Sphinx documentation are broken when going into children and across sibling directories.\n\nWhen 1.3.4 is released, we need to pin sphinx to 1.3.4, which will include the commit made 8 days after the 1.3.3 release.\n\n", "before_files": [{"content": "##############################################################################\n#\n# Copyright (c) 2008-2013 Agendaless Consulting and Contributors.\n# All Rights Reserved.\n#\n# This software is subject to the provisions of the BSD-like license at\n# http://www.repoze.org/LICENSE.txt. A copy of the license should accompany\n# this distribution. THIS SOFTWARE IS PROVIDED \"AS IS\" AND ANY AND ALL\n# EXPRESS OR IMPLIED WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO,\n# THE IMPLIED WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND\n# FITNESS FOR A PARTICULAR PURPOSE\n#\n##############################################################################\n\nimport os\nimport sys\n\nfrom setuptools import setup, find_packages\n\npy_version = sys.version_info[:2]\n\nPY3 = py_version[0] == 3\n\nif PY3:\n if py_version < (3, 2):\n raise RuntimeError('On Python 3, Pyramid requires Python 3.2 or better')\nelse:\n if py_version < (2, 6):\n raise RuntimeError('On Python 2, Pyramid requires Python 2.6 or better')\n\nhere = os.path.abspath(os.path.dirname(__file__))\ntry:\n with open(os.path.join(here, 'README.rst')) as f:\n README = f.read()\n with open(os.path.join(here, 'CHANGES.txt')) as f:\n CHANGES = f.read()\nexcept IOError:\n README = CHANGES = ''\n\ninstall_requires=[\n 'setuptools',\n 'WebOb >= 1.3.1', # request.domain and CookieProfile\n 'repoze.lru >= 0.4', # py3 compat\n 'zope.interface >= 3.8.0', # has zope.interface.registry\n 'zope.deprecation >= 3.5.0', # py3 compat\n 'venusian >= 1.0a3', # ``ignore``\n 'translationstring >= 0.4', # py3 compat\n 'PasteDeploy >= 1.5.0', # py3 compat\n ]\n\ntests_require = [\n 'WebTest >= 1.3.1', # py3 compat\n ]\n\nif not PY3:\n tests_require.append('zope.component>=3.11.0')\n\ndocs_extras = [\n 'Sphinx >= 1.3.1',\n 'docutils',\n 'repoze.sphinx.autointerface',\n 'pylons_sphinx_latesturl',\n 'pylons-sphinx-themes',\n 'sphinxcontrib-programoutput',\n ]\n\ntesting_extras = tests_require + [\n 'nose',\n 'coverage',\n 'virtualenv', # for scaffolding tests\n ]\n\nsetup(name='pyramid',\n version='1.6',\n description='The Pyramid Web Framework, a Pylons project',\n long_description=README + '\\n\\n' + CHANGES,\n classifiers=[\n \"Development Status :: 6 - Mature\",\n \"Intended Audience :: Developers\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2.6\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.2\",\n \"Programming Language :: Python :: 3.3\",\n \"Programming Language :: Python :: 3.4\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Framework :: Pyramid\",\n 
\"Topic :: Internet :: WWW/HTTP\",\n \"Topic :: Internet :: WWW/HTTP :: WSGI\",\n \"License :: Repoze Public License\",\n ],\n keywords='web wsgi pylons pyramid',\n author=\"Chris McDonough, Agendaless Consulting\",\n author_email=\"[email protected]\",\n url=\"http://docs.pylonsproject.org/en/latest/docs/pyramid.html\",\n license=\"BSD-derived (http://www.repoze.org/LICENSE.txt)\",\n packages=find_packages(),\n include_package_data=True,\n zip_safe=False,\n install_requires = install_requires,\n extras_require = {\n 'testing':testing_extras,\n 'docs':docs_extras,\n },\n tests_require = tests_require,\n test_suite=\"pyramid.tests\",\n entry_points = \"\"\"\\\n [pyramid.scaffold]\n starter=pyramid.scaffolds:StarterProjectTemplate\n zodb=pyramid.scaffolds:ZODBProjectTemplate\n alchemy=pyramid.scaffolds:AlchemyProjectTemplate\n [pyramid.pshell_runner]\n python=pyramid.scripts.pshell:python_shell_runner\n [console_scripts]\n pcreate = pyramid.scripts.pcreate:main\n pserve = pyramid.scripts.pserve:main\n pshell = pyramid.scripts.pshell:main\n proutes = pyramid.scripts.proutes:main\n pviews = pyramid.scripts.pviews:main\n ptweens = pyramid.scripts.ptweens:main\n prequest = pyramid.scripts.prequest:main\n pdistreport = pyramid.scripts.pdistreport:main\n [paste.server_runner]\n wsgiref = pyramid.scripts.pserve:wsgiref_server_runner\n cherrypy = pyramid.scripts.pserve:cherrypy_server_runner\n \"\"\"\n )\n\n", "path": "setup.py"}], "after_files": [{"content": "##############################################################################\n#\n# Copyright (c) 2008-2013 Agendaless Consulting and Contributors.\n# All Rights Reserved.\n#\n# This software is subject to the provisions of the BSD-like license at\n# http://www.repoze.org/LICENSE.txt. A copy of the license should accompany\n# this distribution. 
THIS SOFTWARE IS PROVIDED \"AS IS\" AND ANY AND ALL\n# EXPRESS OR IMPLIED WARRANTIES ARE DISCLAIMED, INCLUDING, BUT NOT LIMITED TO,\n# THE IMPLIED WARRANTIES OF TITLE, MERCHANTABILITY, AGAINST INFRINGEMENT, AND\n# FITNESS FOR A PARTICULAR PURPOSE\n#\n##############################################################################\n\nimport os\nimport sys\n\nfrom setuptools import setup, find_packages\n\npy_version = sys.version_info[:2]\n\nPY3 = py_version[0] == 3\n\nif PY3:\n if py_version < (3, 2):\n raise RuntimeError('On Python 3, Pyramid requires Python 3.2 or better')\nelse:\n if py_version < (2, 6):\n raise RuntimeError('On Python 2, Pyramid requires Python 2.6 or better')\n\nhere = os.path.abspath(os.path.dirname(__file__))\ntry:\n with open(os.path.join(here, 'README.rst')) as f:\n README = f.read()\n with open(os.path.join(here, 'CHANGES.txt')) as f:\n CHANGES = f.read()\nexcept IOError:\n README = CHANGES = ''\n\ninstall_requires=[\n 'setuptools',\n 'WebOb >= 1.3.1', # request.domain and CookieProfile\n 'repoze.lru >= 0.4', # py3 compat\n 'zope.interface >= 3.8.0', # has zope.interface.registry\n 'zope.deprecation >= 3.5.0', # py3 compat\n 'venusian >= 1.0a3', # ``ignore``\n 'translationstring >= 0.4', # py3 compat\n 'PasteDeploy >= 1.5.0', # py3 compat\n ]\n\ntests_require = [\n 'WebTest >= 1.3.1', # py3 compat\n ]\n\nif not PY3:\n tests_require.append('zope.component>=3.11.0')\n\ndocs_extras = [\n 'Sphinx >= 1.3.4',\n 'docutils',\n 'repoze.sphinx.autointerface',\n 'pylons_sphinx_latesturl',\n 'pylons-sphinx-themes',\n 'sphinxcontrib-programoutput',\n ]\n\ntesting_extras = tests_require + [\n 'nose',\n 'coverage',\n 'virtualenv', # for scaffolding tests\n ]\n\nsetup(name='pyramid',\n version='1.6',\n description='The Pyramid Web Framework, a Pylons project',\n long_description=README + '\\n\\n' + CHANGES,\n classifiers=[\n \"Development Status :: 6 - Mature\",\n \"Intended Audience :: Developers\",\n \"Programming Language :: Python\",\n \"Programming Language :: Python :: 2.6\",\n \"Programming Language :: Python :: 2.7\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.2\",\n \"Programming Language :: Python :: 3.3\",\n \"Programming Language :: Python :: 3.4\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: Implementation :: CPython\",\n \"Programming Language :: Python :: Implementation :: PyPy\",\n \"Framework :: Pyramid\",\n \"Topic :: Internet :: WWW/HTTP\",\n \"Topic :: Internet :: WWW/HTTP :: WSGI\",\n \"License :: Repoze Public License\",\n ],\n keywords='web wsgi pylons pyramid',\n author=\"Chris McDonough, Agendaless Consulting\",\n author_email=\"[email protected]\",\n url=\"http://docs.pylonsproject.org/en/latest/docs/pyramid.html\",\n license=\"BSD-derived (http://www.repoze.org/LICENSE.txt)\",\n packages=find_packages(),\n include_package_data=True,\n zip_safe=False,\n install_requires = install_requires,\n extras_require = {\n 'testing':testing_extras,\n 'docs':docs_extras,\n },\n tests_require = tests_require,\n test_suite=\"pyramid.tests\",\n entry_points = \"\"\"\\\n [pyramid.scaffold]\n starter=pyramid.scaffolds:StarterProjectTemplate\n zodb=pyramid.scaffolds:ZODBProjectTemplate\n alchemy=pyramid.scaffolds:AlchemyProjectTemplate\n [pyramid.pshell_runner]\n python=pyramid.scripts.pshell:python_shell_runner\n [console_scripts]\n pcreate = pyramid.scripts.pcreate:main\n pserve = pyramid.scripts.pserve:main\n pshell = pyramid.scripts.pshell:main\n proutes = pyramid.scripts.proutes:main\n 
pviews = pyramid.scripts.pviews:main\n ptweens = pyramid.scripts.ptweens:main\n prequest = pyramid.scripts.prequest:main\n pdistreport = pyramid.scripts.pdistreport:main\n [paste.server_runner]\n wsgiref = pyramid.scripts.pserve:wsgiref_server_runner\n cherrypy = pyramid.scripts.pserve:cherrypy_server_runner\n \"\"\"\n )\n\n", "path": "setup.py"}]} | 1,821 | 106 |
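For context on the pin in the row above: `pkg_resources` evaluates the bumped requirement so that the releases carrying the broken next/previous links fall outside it. A small sketch, needing only setuptools (no Sphinx install):

```python
import pkg_resources

# The docs extra now reads "Sphinx >= 1.3.4"; the affected 1.3.1-1.3.3
# releases no longer satisfy it.
req = pkg_resources.Requirement.parse("Sphinx >= 1.3.4")
for candidate in ("1.3.1", "1.3.3", "1.3.4"):
    print(candidate, candidate in req)
# 1.3.1 False
# 1.3.3 False
# 1.3.4 True
```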
gh_patches_debug_5381 | rasdani/github-patches | git_diff | ManimCommunity__manim-1053 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Broken source links on the stable version of the documentation
## Description of bug / unexpected behavior
Source links on the stable version of the documentation do not work. They link to something like this: https://github.com/ManimCommunity/manim/blob/stable/manim/mobject/changing.py, which is a 404 error. 
## Expected behavior
Source links should link to a file containing source code for the stable version.
## How to reproduce the issue
On the documentation website, switch the version to stable. Navigate to and click the source link of any class.
## Additional comments
Perhaps this is an access rights issue, which would explain why it evaded detection by community devs for so long?
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `docs/source/conf.py`
Content:
```
1 # Configuration file for the Sphinx documentation builder.
2 #
3 # This file only contains a selection of the most common options. For a full
4 # list see the documentation:
5 # https://www.sphinx-doc.org/en/master/usage/configuration.html
6
7 # -- Path setup --------------------------------------------------------------
8
9 # If extensions (or modules to document with autodoc) are in another directory,
10 # add these directories to sys.path here. If the directory is relative to the
11 # documentation root, use os.path.abspath to make it absolute, like shown here.
12
13 import os
14 import sys
15 from distutils.sysconfig import get_python_lib
16 from pathlib import Path
17
18 sys.path.insert(0, os.path.abspath("."))
19
20
21 if os.environ.get("READTHEDOCS") == "True":
22 site_path = get_python_lib()
23 # we need to add ffmpeg to the path
24 ffmpeg_path = os.path.join(site_path, "imageio_ffmpeg", "binaries")
25 # the included binary is named ffmpeg-linux..., create a symlink
26 [ffmpeg_bin] = [
27 file for file in os.listdir(ffmpeg_path) if file.startswith("ffmpeg-")
28 ]
29 os.symlink(
30 os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, "ffmpeg")
31 )
32 os.environ["PATH"] += os.pathsep + ffmpeg_path
33
34
35 # -- Project information -----------------------------------------------------
36
37 project = "Manim"
38 copyright = "2020, The Manim Community Dev Team"
39 author = "The Manim Community Dev Team"
40
41
42 # -- General configuration ---------------------------------------------------
43
44 # Add any Sphinx extension module names here, as strings. They can be
45 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
46 # ones.
47 extensions = [
48 "sphinx.ext.autodoc",
49 "recommonmark",
50 "sphinx_copybutton",
51 "sphinx.ext.napoleon",
52 "sphinx.ext.autosummary",
53 "sphinx.ext.doctest",
54 "sphinx.ext.extlinks",
55 "sphinx.ext.linkcode",
56 "sphinxext.opengraph",
57 "manim_directive",
58 ]
59
60 # Automatically generate stub pages when using the .. autosummary directive
61 autosummary_generate = True
62
63 # generate documentation from type hints
64 autodoc_typehints = "description"
65 autoclass_content = "both"
66
67 # controls whether functions documented by the autofunction directive
68 # appear with their full module names
69 add_module_names = False
70
71 # Add any paths that contain templates here, relative to this directory.
72 templates_path = ["_templates"]
73
74 # Custom section headings in our documentation
75 napoleon_custom_sections = ["Tests", ("Test", "Tests")]
76
77 # List of patterns, relative to source directory, that match files and
78 # directories to ignore when looking for source files.
79 # This pattern also affects html_static_path and html_extra_path.
80 exclude_patterns = []
81
82
83 # -- Options for HTML output -------------------------------------------------
84
85 # The theme to use for HTML and HTML Help pages. See the documentation for
86 # a list of builtin themes.
87 #
88 import guzzle_sphinx_theme
89
90 html_theme_path = guzzle_sphinx_theme.html_theme_path()
91 html_theme = "guzzle_sphinx_theme"
92 html_favicon = str(Path("_static/favicon.ico"))
93
94 # There's a standing issue with Sphinx's new-style sidebars. This is a
95 # workaround. Taken from
96 # https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826
97 html_sidebars = {"**": ["logo-text.html", "globaltoc.html", "searchbox.html"]}
98
99 # Register the theme as an extension to generate a sitemap.xml
100 extensions.append("guzzle_sphinx_theme")
101
102 # Add any paths that contain custom static files (such as style sheets) here,
103 # relative to this directory. They are copied after the builtin static files,
104 # so a file named "default.css" will overwrite the builtin "default.css".
105 html_static_path = ["_static"]
106
107 # This specifies any additional css files that will override the theme's
108 html_css_files = ["custom.css"]
109
110 # source links to github
111 def linkcode_resolve(domain, info):
112 if domain != "py":
113 return None
114 if not info["module"]:
115 return None
116 filename = info["module"].replace(".", "/")
117 version = os.getenv("READTHEDOCS_VERSION", "master")
118 if version == "latest":
119 version = "master"
120 return f"https://github.com/ManimCommunity/manim/blob/{version}/{filename}.py"
121
122
123 # external links
124 extlinks = {
125 "issue": ("https://github.com/ManimCommunity/manim/issues/%s", "issue "),
126 "pr": ("https://github.com/ManimCommunity/manim/pull/%s", "pull request "),
127 }
128
129 # opengraph settings
130 ogp_image = "https://www.manim.community/logo.png"
131 ogp_site_name = "Manim Community | Documentation"
132 ogp_site_url = "https://docs.manim.community/"
133
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/docs/source/conf.py b/docs/source/conf.py
--- a/docs/source/conf.py
+++ b/docs/source/conf.py
@@ -115,7 +115,7 @@
return None
filename = info["module"].replace(".", "/")
version = os.getenv("READTHEDOCS_VERSION", "master")
- if version == "latest":
+ if version == "latest" or version == "stable":
version = "master"
return f"https://github.com/ManimCommunity/manim/blob/{version}/{filename}.py"
| {"golden_diff": "diff --git a/docs/source/conf.py b/docs/source/conf.py\n--- a/docs/source/conf.py\n+++ b/docs/source/conf.py\n@@ -115,7 +115,7 @@\n return None\n filename = info[\"module\"].replace(\".\", \"/\")\n version = os.getenv(\"READTHEDOCS_VERSION\", \"master\")\n- if version == \"latest\":\n+ if version == \"latest\" or version == \"stable\":\n version = \"master\"\n return f\"https://github.com/ManimCommunity/manim/blob/{version}/{filename}.py\"\n", "issue": "Broken source links on the stable version of the documentation\n## Description of bug / unexpected behavior\nSource links on the stable version of documentation does not work. It links to something like this: https://github.com/ManimCommunity/manim/blob/stable/manim/mobject/changing.py which is a 404 error. \n\n## Expected behavior\nSource links should link to a file containing source code for the stable version. \n\n## How to reproduce the issue\nOn the documentation website, switch the version to stable. Navigate to and click the source link of any class. \n\n## Additional comments\nPerhaps this is an access rights issue, which explains why it evaded detection from community devs for so long?\n\n", "before_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n\nimport os\nimport sys\nfrom distutils.sysconfig import get_python_lib\nfrom pathlib import Path\n\nsys.path.insert(0, os.path.abspath(\".\"))\n\n\nif os.environ.get(\"READTHEDOCS\") == \"True\":\n site_path = get_python_lib()\n # we need to add ffmpeg to the path\n ffmpeg_path = os.path.join(site_path, \"imageio_ffmpeg\", \"binaries\")\n # the included binary is named ffmpeg-linux..., create a symlink\n [ffmpeg_bin] = [\n file for file in os.listdir(ffmpeg_path) if file.startswith(\"ffmpeg-\")\n ]\n os.symlink(\n os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, \"ffmpeg\")\n )\n os.environ[\"PATH\"] += os.pathsep + ffmpeg_path\n\n\n# -- Project information -----------------------------------------------------\n\nproject = \"Manim\"\ncopyright = \"2020, The Manim Community Dev Team\"\nauthor = \"The Manim Community Dev Team\"\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx.ext.autodoc\",\n \"recommonmark\",\n \"sphinx_copybutton\",\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n \"sphinx.ext.extlinks\",\n \"sphinx.ext.linkcode\",\n \"sphinxext.opengraph\",\n \"manim_directive\",\n]\n\n# Automatically generate stub pages when using the .. 
autosummary directive\nautosummary_generate = True\n\n# generate documentation from type hints\nautodoc_typehints = \"description\"\nautoclass_content = \"both\"\n\n# controls whether functions documented by the autofunction directive\n# appear with their full module names\nadd_module_names = False\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = [\"_templates\"]\n\n# Custom section headings in our documentation\nnapoleon_custom_sections = [\"Tests\", (\"Test\", \"Tests\")]\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nimport guzzle_sphinx_theme\n\nhtml_theme_path = guzzle_sphinx_theme.html_theme_path()\nhtml_theme = \"guzzle_sphinx_theme\"\nhtml_favicon = str(Path(\"_static/favicon.ico\"))\n\n# There's a standing issue with Sphinx's new-style sidebars. This is a\n# workaround. Taken from\n# https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826\nhtml_sidebars = {\"**\": [\"logo-text.html\", \"globaltoc.html\", \"searchbox.html\"]}\n\n# Register the theme as an extension to generate a sitemap.xml\nextensions.append(\"guzzle_sphinx_theme\")\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = [\"_static\"]\n\n# This specifies any additional css files that will override the theme's\nhtml_css_files = [\"custom.css\"]\n\n# source links to github\ndef linkcode_resolve(domain, info):\n if domain != \"py\":\n return None\n if not info[\"module\"]:\n return None\n filename = info[\"module\"].replace(\".\", \"/\")\n version = os.getenv(\"READTHEDOCS_VERSION\", \"master\")\n if version == \"latest\":\n version = \"master\"\n return f\"https://github.com/ManimCommunity/manim/blob/{version}/{filename}.py\"\n\n\n# external links\nextlinks = {\n \"issue\": (\"https://github.com/ManimCommunity/manim/issues/%s\", \"issue \"),\n \"pr\": (\"https://github.com/ManimCommunity/manim/pull/%s\", \"pull request \"),\n}\n\n# opengraph settings\nogp_image = \"https://www.manim.community/logo.png\"\nogp_site_name = \"Manim Community | Documentation\"\nogp_site_url = \"https://docs.manim.community/\"\n", "path": "docs/source/conf.py"}], "after_files": [{"content": "# Configuration file for the Sphinx documentation builder.\n#\n# This file only contains a selection of the most common options. For a full\n# list see the documentation:\n# https://www.sphinx-doc.org/en/master/usage/configuration.html\n\n# -- Path setup --------------------------------------------------------------\n\n# If extensions (or modules to document with autodoc) are in another directory,\n# add these directories to sys.path here. 
If the directory is relative to the\n# documentation root, use os.path.abspath to make it absolute, like shown here.\n\nimport os\nimport sys\nfrom distutils.sysconfig import get_python_lib\nfrom pathlib import Path\n\nsys.path.insert(0, os.path.abspath(\".\"))\n\n\nif os.environ.get(\"READTHEDOCS\") == \"True\":\n site_path = get_python_lib()\n # we need to add ffmpeg to the path\n ffmpeg_path = os.path.join(site_path, \"imageio_ffmpeg\", \"binaries\")\n # the included binary is named ffmpeg-linux..., create a symlink\n [ffmpeg_bin] = [\n file for file in os.listdir(ffmpeg_path) if file.startswith(\"ffmpeg-\")\n ]\n os.symlink(\n os.path.join(ffmpeg_path, ffmpeg_bin), os.path.join(ffmpeg_path, \"ffmpeg\")\n )\n os.environ[\"PATH\"] += os.pathsep + ffmpeg_path\n\n\n# -- Project information -----------------------------------------------------\n\nproject = \"Manim\"\ncopyright = \"2020, The Manim Community Dev Team\"\nauthor = \"The Manim Community Dev Team\"\n\n\n# -- General configuration ---------------------------------------------------\n\n# Add any Sphinx extension module names here, as strings. They can be\n# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom\n# ones.\nextensions = [\n \"sphinx.ext.autodoc\",\n \"recommonmark\",\n \"sphinx_copybutton\",\n \"sphinx.ext.napoleon\",\n \"sphinx.ext.autosummary\",\n \"sphinx.ext.doctest\",\n \"sphinx.ext.extlinks\",\n \"sphinx.ext.linkcode\",\n \"sphinxext.opengraph\",\n \"manim_directive\",\n]\n\n# Automatically generate stub pages when using the .. autosummary directive\nautosummary_generate = True\n\n# generate documentation from type hints\nautodoc_typehints = \"description\"\nautoclass_content = \"both\"\n\n# controls whether functions documented by the autofunction directive\n# appear with their full module names\nadd_module_names = False\n\n# Add any paths that contain templates here, relative to this directory.\ntemplates_path = [\"_templates\"]\n\n# Custom section headings in our documentation\nnapoleon_custom_sections = [\"Tests\", (\"Test\", \"Tests\")]\n\n# List of patterns, relative to source directory, that match files and\n# directories to ignore when looking for source files.\n# This pattern also affects html_static_path and html_extra_path.\nexclude_patterns = []\n\n\n# -- Options for HTML output -------------------------------------------------\n\n# The theme to use for HTML and HTML Help pages. See the documentation for\n# a list of builtin themes.\n#\nimport guzzle_sphinx_theme\n\nhtml_theme_path = guzzle_sphinx_theme.html_theme_path()\nhtml_theme = \"guzzle_sphinx_theme\"\nhtml_favicon = str(Path(\"_static/favicon.ico\"))\n\n# There's a standing issue with Sphinx's new-style sidebars. This is a\n# workaround. Taken from\n# https://github.com/guzzle/guzzle_sphinx_theme/issues/33#issuecomment-637081826\nhtml_sidebars = {\"**\": [\"logo-text.html\", \"globaltoc.html\", \"searchbox.html\"]}\n\n# Register the theme as an extension to generate a sitemap.xml\nextensions.append(\"guzzle_sphinx_theme\")\n\n# Add any paths that contain custom static files (such as style sheets) here,\n# relative to this directory. 
They are copied after the builtin static files,\n# so a file named \"default.css\" will overwrite the builtin \"default.css\".\nhtml_static_path = [\"_static\"]\n\n# This specifies any additional css files that will override the theme's\nhtml_css_files = [\"custom.css\"]\n\n# source links to github\ndef linkcode_resolve(domain, info):\n if domain != \"py\":\n return None\n if not info[\"module\"]:\n return None\n filename = info[\"module\"].replace(\".\", \"/\")\n version = os.getenv(\"READTHEDOCS_VERSION\", \"master\")\n if version == \"latest\" or version == \"stable\":\n version = \"master\"\n return f\"https://github.com/ManimCommunity/manim/blob/{version}/{filename}.py\"\n\n\n# external links\nextlinks = {\n \"issue\": (\"https://github.com/ManimCommunity/manim/issues/%s\", \"issue \"),\n \"pr\": (\"https://github.com/ManimCommunity/manim/pull/%s\", \"pull request \"),\n}\n\n# opengraph settings\nogp_image = \"https://www.manim.community/logo.png\"\nogp_site_name = \"Manim Community | Documentation\"\nogp_site_url = \"https://docs.manim.community/\"\n", "path": "docs/source/conf.py"}]} | 1,767 | 122 |
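A condensed, runnable restatement of the patched `linkcode_resolve` from the row above (the two lookup guards are folded into one line, otherwise the logic matches the diff): on Read the Docs the `READTHEDOCS_VERSION` environment variable is `latest` or `stable`, and the fix maps both to `master`, the branch the sources actually live on.

```python
import os

def linkcode_resolve(domain, info):
    if domain != "py" or not info.get("module"):
        return None
    filename = info["module"].replace(".", "/")
    version = os.getenv("READTHEDOCS_VERSION", "master")
    if version == "latest" or version == "stable":  # the patched condition
        version = "master"
    return f"https://github.com/ManimCommunity/manim/blob/{version}/{filename}.py"

os.environ["READTHEDOCS_VERSION"] = "stable"  # simulate the failing build
print(linkcode_resolve("py", {"module": "manim.mobject.changing"}))
# https://github.com/ManimCommunity/manim/blob/master/manim/mobject/changing.py
```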
gh_patches_debug_7772 | rasdani/github-patches | git_diff | OctoPrint__OctoPrint-3054 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Software update fails on Debian testing/unstable
Tested with OctoPrint 1.3.10
Right now, the software update will not work in Debian testing and unstable. The problem here is that Debian decided to name its python version `2.7.15+` (yes, the '+' is part of the version string returned by `python --version`). OctoPrint's version comparison cannot cope with this and sees it as < 2.7.9 (which leads to a very confusing output of the software update component telling you why it doesn't want to update: `Python: 2.7.9 (you have: 2.7.15+)` ... took me some time to figure out what is actually going on)
There is already a bug report for Debian's python2.7 package: https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=914072
Sadly, there is no feedback from the Debian maintainers on why they named it this way or whether this might be changed again in the future.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/octoprint/util/version.py`
Content:
```
1 # coding=utf-8
2 """
3 This module provides a bunch of utility methods and helpers for version handling.
4 """
5 from __future__ import absolute_import, division, print_function
6
7 __license__ = 'GNU Affero General Public License http://www.gnu.org/licenses/agpl.html'
8
9 import pkg_resources
10 import logging
11
12 from octoprint import __version__
13
14
15 def get_octoprint_version_string():
16 return __version__
17
18
19 def get_octoprint_version(base=False):
20 octoprint_version_string = get_octoprint_version_string()
21 return get_comparable_version(octoprint_version_string, base=base)
22
23
24 def is_released_octoprint_version(version=None):
25 """
26 >>> import pkg_resources
27 >>> is_released_octoprint_version(version=pkg_resources.parse_version("1.3.6rc3"))
28 True
29 >>> is_released_octoprint_version(version=pkg_resources.parse_version("1.3.6rc3.dev2+g1234"))
30 False
31 >>> is_released_octoprint_version(version=pkg_resources.parse_version("1.3.6"))
32 True
33 >>> is_released_octoprint_version(version=pkg_resources.parse_version("1.3.6.post1+g1234"))
34 True
35 >>> is_released_octoprint_version(version=pkg_resources.parse_version("1.3.6.post1.dev0+g1234"))
36 False
37 >>> is_released_octoprint_version(version=pkg_resources.parse_version("1.3.7.dev123+g23545"))
38 False
39 """
40
41 if version is None:
42 version = get_octoprint_version()
43
44 if isinstance(version, tuple):
45 # old setuptools
46 return "*@" not in version
47 else:
48 # new setuptools
49 return "dev" not in version.public
50
51
52 def is_stable_octoprint_version(version=None):
53 """
54 >>> import pkg_resources
55 >>> is_stable_octoprint_version(version=pkg_resources.parse_version("1.3.6rc3"))
56 False
57 >>> is_stable_octoprint_version(version=pkg_resources.parse_version("1.3.6rc3.dev2+g1234"))
58 False
59 >>> is_stable_octoprint_version(version=pkg_resources.parse_version("1.3.6"))
60 True
61 >>> is_stable_octoprint_version(version=pkg_resources.parse_version("1.3.6.post1+g1234"))
62 True
63 >>> is_stable_octoprint_version(version=pkg_resources.parse_version("1.3.6.post1.dev0+g1234"))
64 False
65 >>> is_stable_octoprint_version(version=pkg_resources.parse_version("1.3.7.dev123+g23545"))
66 False
67 """
68
69 if version is None:
70 version = get_octoprint_version()
71
72 if not is_released_octoprint_version(version=version):
73 return False
74
75 if isinstance(version, tuple):
76 return "*a" not in version and "*b" not in version and "*c" not in version
77 else:
78 return not version.is_prerelease
79
80
81 def is_octoprint_compatible(*compatibility_entries, **kwargs):
82 """
83 Tests if the current ``octoprint_version`` is compatible to any of the provided ``compatibility_entries``.
84
85 Arguments:
86 compatibility_entries (str): compatibility string(s) to test against, result will be `True` if any match
87 is found
88 octoprint_version (tuple or SetuptoolsVersion): optional OctoPrint version to match against, if not current
89 base version will be determined via :func:`get_octoprint_version`.
90
91 Returns:
92 (bool) ``True`` if any of the provided compatibility entries matches or there are no entries, else ``False``
93 """
94
95 logger = logging.getLogger(__name__)
96
97 if not compatibility_entries:
98 return True
99
100 octoprint_version = kwargs.get("octoprint_version")
101 if octoprint_version is None:
102 octoprint_version = get_octoprint_version(base=True)
103
104 for octo_compat in compatibility_entries:
105 try:
106 if not any(octo_compat.startswith(c) for c in ("<", "<=", "!=", "==", ">=", ">", "~=", "===")):
107 octo_compat = ">={}".format(octo_compat)
108
109 s = pkg_resources.Requirement.parse("OctoPrint" + octo_compat)
110 if octoprint_version in s:
111 break
112 except:
113 logger.exception("Something is wrong with this compatibility string for OctoPrint: {}".format(octo_compat))
114 else:
115 return False
116
117 return True
118
119
120 def get_comparable_version(version_string, base=False):
121 if "-" in version_string:
122 version_string = version_string[:version_string.find("-")]
123
124 version = pkg_resources.parse_version(version_string)
125
126 # A leading v is common in github release tags and old setuptools doesn't remove it.
127 if version and isinstance(version, tuple) and version[0].lower() == "*v":
128 version = version[1:]
129
130 if base:
131 if isinstance(version, tuple):
132 # old setuptools
133 base_version = []
134 for part in version:
135 if part.startswith("*"):
136 break
137 base_version.append(part)
138 base_version.append("*final")
139 version = tuple(base_version)
140 else:
141 # new setuptools
142 version = pkg_resources.parse_version(version.base_version)
143 return version
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/octoprint/util/version.py b/src/octoprint/util/version.py
--- a/src/octoprint/util/version.py
+++ b/src/octoprint/util/version.py
@@ -121,6 +121,10 @@
if "-" in version_string:
version_string = version_string[:version_string.find("-")]
+ # Debian has the python version set to 2.7.15+ which is not PEP440 compliant (bug 914072)
+ if version_string.endswith("+"):
+ version_string = version_string[:-1]
+
version = pkg_resources.parse_version(version_string)
# A leading v is common in github release tags and old setuptools doesn't remove it.
| {"golden_diff": "diff --git a/src/octoprint/util/version.py b/src/octoprint/util/version.py\n--- a/src/octoprint/util/version.py\n+++ b/src/octoprint/util/version.py\n@@ -121,6 +121,10 @@\n \tif \"-\" in version_string:\n \t\tversion_string = version_string[:version_string.find(\"-\")]\n \n+\t# Debian has the python version set to 2.7.15+ which is not PEP440 compliant (bug 914072)\n+\tif version_string.endswith(\"+\"):\n+\t\tversion_string = version_string[:-1]\n+\n \tversion = pkg_resources.parse_version(version_string)\n \n \t# A leading v is common in github release tags and old setuptools doesn't remove it.\n", "issue": "Software update fails on Debian testing/unstable\nTested with Octoprint 1.3.10\r\n\r\nRight now, the software update will not work in Debian testing and unstable. The problem here is that Debian decided to name its python version `2.7.15+` (yes the '+' is part of the version string returned by `python --version`. Octoprint's version compare cannot cope with this and sees this as < 2.7.9 (which leads to a very confusing output of the software update component telling you why it doesn't want to update: `Python: 2.7.9 (you have: 2.7.15+)` ... took me some time to figure out what is actually going on)\r\n\r\nThere is a bug report for Debian's python2.7 package already here https://bugs.debian.org/cgi-bin/bugreport.cgi?bug=914072\r\nSadly there is no feedback from the Debian maintainers on why they named it this way and if this might be changed again in the future.\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"\nThis module provides a bunch of utility methods and helpers for version handling.\n\"\"\"\nfrom __future__ import absolute_import, division, print_function\n\n__license__ = 'GNU Affero General Public License http://www.gnu.org/licenses/agpl.html'\n\nimport pkg_resources\nimport logging\n\nfrom octoprint import __version__\n\n\ndef get_octoprint_version_string():\n\treturn __version__\n\n\ndef get_octoprint_version(base=False):\n\toctoprint_version_string = get_octoprint_version_string()\n\treturn get_comparable_version(octoprint_version_string, base=base)\n\n\ndef is_released_octoprint_version(version=None):\n\t\"\"\"\n\t>>> import pkg_resources\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3\"))\n\tTrue\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3.dev2+g1234\"))\n\tFalse\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6\"))\n\tTrue\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1+g1234\"))\n\tTrue\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1.dev0+g1234\"))\n\tFalse\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.7.dev123+g23545\"))\n\tFalse\n\t\"\"\"\n\n\tif version is None:\n\t\tversion = get_octoprint_version()\n\n\tif isinstance(version, tuple):\n\t\t# old setuptools\n\t\treturn \"*@\" not in version\n\telse:\n\t\t# new setuptools\n\t\treturn \"dev\" not in version.public\n\n\ndef is_stable_octoprint_version(version=None):\n\t\"\"\"\n\t>>> import pkg_resources\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3\"))\n\tFalse\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3.dev2+g1234\"))\n\tFalse\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6\"))\n\tTrue\n\t>>> 
is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1+g1234\"))\n\tTrue\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1.dev0+g1234\"))\n\tFalse\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.7.dev123+g23545\"))\n\tFalse\n\t\"\"\"\n\n\tif version is None:\n\t\tversion = get_octoprint_version()\n\n\tif not is_released_octoprint_version(version=version):\n\t\treturn False\n\n\tif isinstance(version, tuple):\n\t\treturn \"*a\" not in version and \"*b\" not in version and \"*c\" not in version\n\telse:\n\t\treturn not version.is_prerelease\n\n\ndef is_octoprint_compatible(*compatibility_entries, **kwargs):\n\t\"\"\"\n\tTests if the current ``octoprint_version`` is compatible to any of the provided ``compatibility_entries``.\n\n\tArguments:\n\t\tcompatibility_entries (str): compatibility string(s) to test against, result will be `True` if any match\n\t\t\tis found\n\t\toctoprint_version (tuple or SetuptoolsVersion): optional OctoPrint version to match against, if not current\n\t\t\tbase version will be determined via :func:`get_octoprint_version`.\n\n\tReturns:\n\t\t(bool) ``True`` if any of the provided compatibility entries matches or there are no entries, else ``False``\n\t\"\"\"\n\n\tlogger = logging.getLogger(__name__)\n\n\tif not compatibility_entries:\n\t\treturn True\n\n\toctoprint_version = kwargs.get(\"octoprint_version\")\n\tif octoprint_version is None:\n\t\toctoprint_version = get_octoprint_version(base=True)\n\n\tfor octo_compat in compatibility_entries:\n\t\ttry:\n\t\t\tif not any(octo_compat.startswith(c) for c in (\"<\", \"<=\", \"!=\", \"==\", \">=\", \">\", \"~=\", \"===\")):\n\t\t\t\tocto_compat = \">={}\".format(octo_compat)\n\n\t\t\ts = pkg_resources.Requirement.parse(\"OctoPrint\" + octo_compat)\n\t\t\tif octoprint_version in s:\n\t\t\t\tbreak\n\t\texcept:\n\t\t\tlogger.exception(\"Something is wrong with this compatibility string for OctoPrint: {}\".format(octo_compat))\n\telse:\n\t\treturn False\n\n\treturn True\n\n\ndef get_comparable_version(version_string, base=False):\n\tif \"-\" in version_string:\n\t\tversion_string = version_string[:version_string.find(\"-\")]\n\n\tversion = pkg_resources.parse_version(version_string)\n\n\t# A leading v is common in github release tags and old setuptools doesn't remove it.\n\tif version and isinstance(version, tuple) and version[0].lower() == \"*v\":\n\t\tversion = version[1:]\n\n\tif base:\n\t\tif isinstance(version, tuple):\n\t\t\t# old setuptools\n\t\t\tbase_version = []\n\t\t\tfor part in version:\n\t\t\t\tif part.startswith(\"*\"):\n\t\t\t\t\tbreak\n\t\t\t\tbase_version.append(part)\n\t\t\tbase_version.append(\"*final\")\n\t\t\tversion = tuple(base_version)\n\t\telse:\n\t\t\t# new setuptools\n\t\t\tversion = pkg_resources.parse_version(version.base_version)\n\treturn version\n", "path": "src/octoprint/util/version.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"\nThis module provides a bunch of utility methods and helpers for version handling.\n\"\"\"\nfrom __future__ import absolute_import, division, print_function\n\n__license__ = 'GNU Affero General Public License http://www.gnu.org/licenses/agpl.html'\n\nimport pkg_resources\nimport logging\n\nfrom octoprint import __version__\n\n\ndef get_octoprint_version_string():\n\treturn __version__\n\n\ndef get_octoprint_version(base=False):\n\toctoprint_version_string = get_octoprint_version_string()\n\treturn get_comparable_version(octoprint_version_string, base=base)\n\n\ndef 
is_released_octoprint_version(version=None):\n\t\"\"\"\n\t>>> import pkg_resources\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3\"))\n\tTrue\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3.dev2+g1234\"))\n\tFalse\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6\"))\n\tTrue\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1+g1234\"))\n\tTrue\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1.dev0+g1234\"))\n\tFalse\n\t>>> is_released_octoprint_version(version=pkg_resources.parse_version(\"1.3.7.dev123+g23545\"))\n\tFalse\n\t\"\"\"\n\n\tif version is None:\n\t\tversion = get_octoprint_version()\n\n\tif isinstance(version, tuple):\n\t\t# old setuptools\n\t\treturn \"*@\" not in version\n\telse:\n\t\t# new setuptools\n\t\treturn \"dev\" not in version.public\n\n\ndef is_stable_octoprint_version(version=None):\n\t\"\"\"\n\t>>> import pkg_resources\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3\"))\n\tFalse\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6rc3.dev2+g1234\"))\n\tFalse\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6\"))\n\tTrue\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1+g1234\"))\n\tTrue\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.6.post1.dev0+g1234\"))\n\tFalse\n\t>>> is_stable_octoprint_version(version=pkg_resources.parse_version(\"1.3.7.dev123+g23545\"))\n\tFalse\n\t\"\"\"\n\n\tif version is None:\n\t\tversion = get_octoprint_version()\n\n\tif not is_released_octoprint_version(version=version):\n\t\treturn False\n\n\tif isinstance(version, tuple):\n\t\treturn \"*a\" not in version and \"*b\" not in version and \"*c\" not in version\n\telse:\n\t\treturn not version.is_prerelease\n\n\ndef is_octoprint_compatible(*compatibility_entries, **kwargs):\n\t\"\"\"\n\tTests if the current ``octoprint_version`` is compatible to any of the provided ``compatibility_entries``.\n\n\tArguments:\n\t\tcompatibility_entries (str): compatibility string(s) to test against, result will be `True` if any match\n\t\t\tis found\n\t\toctoprint_version (tuple or SetuptoolsVersion): optional OctoPrint version to match against, if not current\n\t\t\tbase version will be determined via :func:`get_octoprint_version`.\n\n\tReturns:\n\t\t(bool) ``True`` if any of the provided compatibility entries matches or there are no entries, else ``False``\n\t\"\"\"\n\n\tlogger = logging.getLogger(__name__)\n\n\tif not compatibility_entries:\n\t\treturn True\n\n\toctoprint_version = kwargs.get(\"octoprint_version\")\n\tif octoprint_version is None:\n\t\toctoprint_version = get_octoprint_version(base=True)\n\n\tfor octo_compat in compatibility_entries:\n\t\ttry:\n\t\t\tif not any(octo_compat.startswith(c) for c in (\"<\", \"<=\", \"!=\", \"==\", \">=\", \">\", \"~=\", \"===\")):\n\t\t\t\tocto_compat = \">={}\".format(octo_compat)\n\n\t\t\ts = pkg_resources.Requirement.parse(\"OctoPrint\" + octo_compat)\n\t\t\tif octoprint_version in s:\n\t\t\t\tbreak\n\t\texcept:\n\t\t\tlogger.exception(\"Something is wrong with this compatibility string for OctoPrint: {}\".format(octo_compat))\n\telse:\n\t\treturn False\n\n\treturn True\n\n\ndef get_comparable_version(version_string, base=False):\n\tif \"-\" in version_string:\n\t\tversion_string = version_string[:version_string.find(\"-\")]\n\n\t# 
Debian has the python version set to 2.7.15+ which is not PEP440 compliant (bug 914072)\n\tif version_string.endswith(\"+\"):\n\t\tversion_string = version_string[:-1]\n\n\tversion = pkg_resources.parse_version(version_string)\n\n\t# A leading v is common in github release tags and old setuptools doesn't remove it.\n\tif version and isinstance(version, tuple) and version[0].lower() == \"*v\":\n\t\tversion = version[1:]\n\n\tif base:\n\t\tif isinstance(version, tuple):\n\t\t\t# old setuptools\n\t\t\tbase_version = []\n\t\t\tfor part in version:\n\t\t\t\tif part.startswith(\"*\"):\n\t\t\t\t\tbreak\n\t\t\t\tbase_version.append(part)\n\t\t\tbase_version.append(\"*final\")\n\t\t\tversion = tuple(base_version)\n\t\telse:\n\t\t\t# new setuptools\n\t\t\tversion = pkg_resources.parse_version(version.base_version)\n\treturn version\n", "path": "src/octoprint/util/version.py"}]} | 2,044 | 161 |
gh_patches_debug_4319 | rasdani/github-patches | git_diff | electricitymaps__electricitymaps-contrib-5651 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
CA-SK production parser down
## Description
This is an automatic error report generated for Canada Saskatchewan (CA-SK).
Issues:
- No recent data found for `production` parser
- No recent data found for `consumption` parser
## Suggestions
- Try running the parser locally using the command `poetry run test_parser CA-SK production`
- <a href="https://storage.googleapis.com/electricitymap-parser-logs/CA-SK.html">Explore the runtime logs</a>
You can see an overview of all parser issues [here](https://github.com/tmrowco/electricitymap-contrib/wiki/Parser-issues).
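For a quick local check before running the full parser, the consumption endpoint can be probed directly. Note that `fetch_consumption` in the file below builds a `headers` dict but leaves `headers=headers` commented out of the `session.get(CONSUMPTION_URL)` call. A minimal probe (the status codes are assumptions; the endpoint's behaviour is not confirmed here):

```
import requests

URL = "https://www.saskpower.com/ignitionapi/Content/GetNetLoad"
# The same browser-like agent the parser defines but never sends:
UA = {"user-agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                    "AppleWebKit/537.36 (KHTML, like Gecko) "
                    "Chrome/75.0.3770.142 Safari/537.36"}

print(requests.get(URL).status_code)              # assumed: 403 without headers
print(requests.get(URL, headers=UA).status_code)  # assumed: 200 with them
```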
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `parsers/CA_SK.py`
Content:
```
1 from datetime import datetime, timedelta
2 from logging import Logger, getLogger
3 from typing import List, Optional
4
5 from pytz import timezone
6 from requests import Response, Session
7
8 from parsers.lib.exceptions import ParserException
9
10 TIMEZONE = timezone("America/Regina")
11
12 # URLs for the different endpoints.
13 PRODUCTION_URL = (
14 "https://www.saskpower.com/ignitionapi/PowerUseDashboard/GetPowerUseDashboardData"
15 )
16 CONSUMPTION_URL = "https://www.saskpower.com/ignitionapi/Content/GetNetLoad"
17
18 PRODUCTION_MAPPING = {
19 "Hydro": "hydro",
20 "Wind": "wind",
21 "Solar": "solar",
22 "Natural Gas": "gas",
23 "Coal": "coal",
24 "Other": "unknown", # This is internal consumption, losses, heat recovery facilities and small independent power producers.
25 }
26
27 USER_AGENT = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36"
28
29
30 def validate_zone_key(zone_key: str) -> None:
31 if zone_key != "CA-SK":
32 raise ParserException(
33 "CA_SK.py",
34 f"CA_SK.py is not designed to parse zone_key: {zone_key}.",
35 zone_key,
36 )
37
38
39 def validate_no_datetime(target_datetime: Optional[datetime], zone_key) -> None:
40 if target_datetime:
41 raise ParserException(
42 "CA_SK.py",
43 "This parser is unable to fetch historical data.",
44 zone_key,
45 )
46
47
48 def fetch_production(
49 zone_key: str = "CA-SK",
50 session: Optional[Session] = None,
51 target_datetime: Optional[datetime] = None,
52 logger: Logger = getLogger(__name__),
53 ):
54 """This parser function will currently return the daily average of the day in question as hourly data.
55 This is because the API only returns daily data but the backend expects hourly values.
56 This is in order to facilitate the estimation of the hourly values from the daily average.
57 """
58 # Validate that the zone key is equal to CA-SK.
59 validate_zone_key(zone_key)
60 # Validate that the target_datetime is None as this parser is unable to fetch historical data.
61 validate_no_datetime(target_datetime, zone_key)
62
63 session = session or Session()
64
65 # Set the headers to mimic a user browser as the API will return a 403 if not.
66 headers = {"user-agent": USER_AGENT}
67 response: Response = session.get(PRODUCTION_URL, headers=headers)
68
69 if not response.ok:
70 raise ParserException(
71 "CA_SK.py",
72 f"Failed to fetch production data. Response Code: {response.status_code}\nError:\n{response.text}",
73 zone_key,
74 )
75
76 raw_data = response.json()
77 # Date is in the format "Jan 01, 2020"
78 raw_date = raw_data["SupplyDataText"]
79 date = datetime.strptime(raw_date, "%b %d, %Y")
80 production_data = {}
81
82 for value in raw_data["PowerCacheData"]["generationByType"]:
83 production_data[PRODUCTION_MAPPING[value["type"]]] = value[
84 "totalGenerationForType"
85 ]
86
87 data_list: List[dict] = []
88 # Hack to return hourly data from daily data for the backend as it expects hourly data.
89 for hour in range(0, 24):
90 data_list.append(
91 {
92 "zoneKey": zone_key,
93 "datetime": date.replace(hour=hour, tzinfo=TIMEZONE),
94 "production": production_data,
95 "source": "saskpower.com",
96 }
97 )
98
99 return data_list
100
101
102 def fetch_consumption(
103 zone_key: str = "CA-SK",
104 session: Optional[Session] = None,
105 target_datetime: Optional[datetime] = None,
106 logger: Logger = getLogger(__name__),
107 ):
108 # Validate that the zone key is equal to CA-SK.
109 validate_zone_key(zone_key)
110 # Validate that the target_datetime is None as this parser is unable to fetch historical data.
111 validate_no_datetime(target_datetime, zone_key)
112
113 session = session or Session()
114
115 # Set the headers to mimic a user browser as the API will return a 403 if not.
116 headers = {"user-agent": USER_AGENT}
117
118 response: Response = session.get(CONSUMPTION_URL) # , headers=headers)
119
120 if not response.ok:
121 raise ParserException(
122 "CA_SK.py",
123 f"Failed to fetch consumption data. Response Code: {response.status_code}\nError:\n{response.text}",
124 zone_key,
125 )
126
127 raw_data = response.json()
128
129 now = datetime.now(TIMEZONE)
130
131 # Data is updated every 5 minutes so we assume the data is from a multiple of 5 minutes and has a 5 minute delay from that multiple.
132 assumed_datetime = now.replace(second=0, microsecond=0) - timedelta(
133 minutes=(now.minute % 5) + 5
134 )
135
136 return [
137 {
138 "zoneKey": zone_key,
139 "datetime": assumed_datetime,
140 "consumption": int(raw_data),
141 "source": "saskpower.com",
142 }
143 ]
144
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/parsers/CA_SK.py b/parsers/CA_SK.py
--- a/parsers/CA_SK.py
+++ b/parsers/CA_SK.py
@@ -115,7 +115,7 @@
# Set the headers to mimic a user browser as the API will return a 403 if not.
headers = {"user-agent": USER_AGENT}
- response: Response = session.get(CONSUMPTION_URL) # , headers=headers)
+ response: Response = session.get(CONSUMPTION_URL, headers=headers)
if not response.ok:
raise ParserException(
| {"golden_diff": "diff --git a/parsers/CA_SK.py b/parsers/CA_SK.py\n--- a/parsers/CA_SK.py\n+++ b/parsers/CA_SK.py\n@@ -115,7 +115,7 @@\n # Set the headers to mimic a user browser as the API will return a 403 if not.\n headers = {\"user-agent\": USER_AGENT}\n \n- response: Response = session.get(CONSUMPTION_URL) # , headers=headers)\n+ response: Response = session.get(CONSUMPTION_URL, headers=headers)\n \n if not response.ok:\n raise ParserException(\n", "issue": "CA-SK production parser down\n## Description\n\nThis is an automatic error report generated for Canada Saskatchewan (CA-SK).\n\nIssues:\n- No recent data found for `production` parser\n- No recent data found for `consumption` parser\n\n## Suggestions\n- Try running the parser locally using the command `poetry run test_parser CA-SK production`\n- <a href=\"https://storage.googleapis.com/electricitymap-parser-logs/CA-SK.html\">Explore the runtime logs</a>\n\nYou can see an overview of all parser issues [here](https://github.com/tmrowco/electricitymap-contrib/wiki/Parser-issues).\n\n", "before_files": [{"content": "from datetime import datetime, timedelta\nfrom logging import Logger, getLogger\nfrom typing import List, Optional\n\nfrom pytz import timezone\nfrom requests import Response, Session\n\nfrom parsers.lib.exceptions import ParserException\n\nTIMEZONE = timezone(\"America/Regina\")\n\n# URLs for the different endpoints.\nPRODUCTION_URL = (\n \"https://www.saskpower.com/ignitionapi/PowerUseDashboard/GetPowerUseDashboardData\"\n)\nCONSUMPTION_URL = \"https://www.saskpower.com/ignitionapi/Content/GetNetLoad\"\n\nPRODUCTION_MAPPING = {\n \"Hydro\": \"hydro\",\n \"Wind\": \"wind\",\n \"Solar\": \"solar\",\n \"Natural Gas\": \"gas\",\n \"Coal\": \"coal\",\n \"Other\": \"unknown\", # This is internal consumption, losses, heat recovery facilities and small independent power producers.\n}\n\nUSER_AGENT = \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36\"\n\n\ndef validate_zone_key(zone_key: str) -> None:\n if zone_key != \"CA-SK\":\n raise ParserException(\n \"CA_SK.py\",\n f\"CA_SK.py is not designed to parse zone_key: {zone_key}.\",\n zone_key,\n )\n\n\ndef validate_no_datetime(target_datetime: Optional[datetime], zone_key) -> None:\n if target_datetime:\n raise ParserException(\n \"CA_SK.py\",\n \"This parser is unable to fetch historical data.\",\n zone_key,\n )\n\n\ndef fetch_production(\n zone_key: str = \"CA-SK\",\n session: Optional[Session] = None,\n target_datetime: Optional[datetime] = None,\n logger: Logger = getLogger(__name__),\n):\n \"\"\"This parser function will currently return the daily average of the day in question as hourly data.\n This is because the API only returns daily data but the backend expects hourly values.\n This is in order to facilitate the estimation of the hourly values from the daily average.\n \"\"\"\n # Validate that the zone key is equal to CA-SK.\n validate_zone_key(zone_key)\n # Validate that the target_datetime is None as this parser is unable to fetch historical data.\n validate_no_datetime(target_datetime, zone_key)\n\n session = session or Session()\n\n # Set the headers to mimic a user browser as the API will return a 403 if not.\n headers = {\"user-agent\": USER_AGENT}\n response: Response = session.get(PRODUCTION_URL, headers=headers)\n\n if not response.ok:\n raise ParserException(\n \"CA_SK.py\",\n f\"Failed to fetch production data. 
Response Code: {response.status_code}\\nError:\\n{response.text}\",\n zone_key,\n )\n\n raw_data = response.json()\n # Date is in the format \"Jan 01, 2020\"\n raw_date = raw_data[\"SupplyDataText\"]\n date = datetime.strptime(raw_date, \"%b %d, %Y\")\n production_data = {}\n\n for value in raw_data[\"PowerCacheData\"][\"generationByType\"]:\n production_data[PRODUCTION_MAPPING[value[\"type\"]]] = value[\n \"totalGenerationForType\"\n ]\n\n data_list: List[dict] = []\n # Hack to return hourly data from daily data for the backend as it expects hourly data.\n for hour in range(0, 24):\n data_list.append(\n {\n \"zoneKey\": zone_key,\n \"datetime\": date.replace(hour=hour, tzinfo=TIMEZONE),\n \"production\": production_data,\n \"source\": \"saskpower.com\",\n }\n )\n\n return data_list\n\n\ndef fetch_consumption(\n zone_key: str = \"CA-SK\",\n session: Optional[Session] = None,\n target_datetime: Optional[datetime] = None,\n logger: Logger = getLogger(__name__),\n):\n # Validate that the zone key is equal to CA-SK.\n validate_zone_key(zone_key)\n # Validate that the target_datetime is None as this parser is unable to fetch historical data.\n validate_no_datetime(target_datetime, zone_key)\n\n session = session or Session()\n\n # Set the headers to mimic a user browser as the API will return a 403 if not.\n headers = {\"user-agent\": USER_AGENT}\n\n response: Response = session.get(CONSUMPTION_URL) # , headers=headers)\n\n if not response.ok:\n raise ParserException(\n \"CA_SK.py\",\n f\"Failed to fetch consumption data. Response Code: {response.status_code}\\nError:\\n{response.text}\",\n zone_key,\n )\n\n raw_data = response.json()\n\n now = datetime.now(TIMEZONE)\n\n # Data is updated every 5 minutes so we assume the data is from a multiple of 5 minutes and has a 5 minute delay from that multiple.\n assumed_datetime = now.replace(second=0, microsecond=0) - timedelta(\n minutes=(now.minute % 5) + 5\n )\n\n return [\n {\n \"zoneKey\": zone_key,\n \"datetime\": assumed_datetime,\n \"consumption\": int(raw_data),\n \"source\": \"saskpower.com\",\n }\n ]\n", "path": "parsers/CA_SK.py"}], "after_files": [{"content": "from datetime import datetime, timedelta\nfrom logging import Logger, getLogger\nfrom typing import List, Optional\n\nfrom pytz import timezone\nfrom requests import Response, Session\n\nfrom parsers.lib.exceptions import ParserException\n\nTIMEZONE = timezone(\"America/Regina\")\n\n# URLs for the different endpoints.\nPRODUCTION_URL = (\n \"https://www.saskpower.com/ignitionapi/PowerUseDashboard/GetPowerUseDashboardData\"\n)\nCONSUMPTION_URL = \"https://www.saskpower.com/ignitionapi/Content/GetNetLoad\"\n\nPRODUCTION_MAPPING = {\n \"Hydro\": \"hydro\",\n \"Wind\": \"wind\",\n \"Solar\": \"solar\",\n \"Natural Gas\": \"gas\",\n \"Coal\": \"coal\",\n \"Other\": \"unknown\", # This is internal consumption, losses, heat recovery facilities and small independent power producers.\n}\n\nUSER_AGENT = \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/75.0.3770.142 Safari/537.36\"\n\n\ndef validate_zone_key(zone_key: str) -> None:\n if zone_key != \"CA-SK\":\n raise ParserException(\n \"CA_SK.py\",\n f\"CA_SK.py is not designed to parse zone_key: {zone_key}.\",\n zone_key,\n )\n\n\ndef validate_no_datetime(target_datetime: Optional[datetime], zone_key) -> None:\n if target_datetime:\n raise ParserException(\n \"CA_SK.py\",\n \"This parser is unable to fetch historical data.\",\n zone_key,\n )\n\n\ndef fetch_production(\n zone_key: str = \"CA-SK\",\n 
session: Optional[Session] = None,\n target_datetime: Optional[datetime] = None,\n logger: Logger = getLogger(__name__),\n):\n \"\"\"This parser function will currently return the daily average of the day in question as hourly data.\n This is because the API only returns daily data but the backend expects hourly values.\n This is in order to facilitate the estimation of the hourly values from the daily average.\n \"\"\"\n # Validate that the zone key is equal to CA-SK.\n validate_zone_key(zone_key)\n # Validate that the target_datetime is None as this parser is unable to fetch historical data.\n validate_no_datetime(target_datetime, zone_key)\n\n session = session or Session()\n\n # Set the headers to mimic a user browser as the API will return a 403 if not.\n headers = {\"user-agent\": USER_AGENT}\n response: Response = session.get(PRODUCTION_URL, headers=headers)\n\n if not response.ok:\n raise ParserException(\n \"CA_SK.py\",\n f\"Failed to fetch production data. Response Code: {response.status_code}\\nError:\\n{response.text}\",\n zone_key,\n )\n\n raw_data = response.json()\n # Date is in the format \"Jan 01, 2020\"\n raw_date = raw_data[\"SupplyDataText\"]\n date = datetime.strptime(raw_date, \"%b %d, %Y\")\n production_data = {}\n\n for value in raw_data[\"PowerCacheData\"][\"generationByType\"]:\n production_data[PRODUCTION_MAPPING[value[\"type\"]]] = value[\n \"totalGenerationForType\"\n ]\n\n data_list: List[dict] = []\n # Hack to return hourly data from daily data for the backend as it expects hourly data.\n for hour in range(0, 24):\n data_list.append(\n {\n \"zoneKey\": zone_key,\n \"datetime\": date.replace(hour=hour, tzinfo=TIMEZONE),\n \"production\": production_data,\n \"source\": \"saskpower.com\",\n }\n )\n\n return data_list\n\n\ndef fetch_consumption(\n zone_key: str = \"CA-SK\",\n session: Optional[Session] = None,\n target_datetime: Optional[datetime] = None,\n logger: Logger = getLogger(__name__),\n):\n # Validate that the zone key is equal to CA-SK.\n validate_zone_key(zone_key)\n # Validate that the target_datetime is None as this parser is unable to fetch historical data.\n validate_no_datetime(target_datetime, zone_key)\n\n session = session or Session()\n\n # Set the headers to mimic a user browser as the API will return a 403 if not.\n headers = {\"user-agent\": USER_AGENT}\n\n response: Response = session.get(CONSUMPTION_URL, headers=headers)\n\n if not response.ok:\n raise ParserException(\n \"CA_SK.py\",\n f\"Failed to fetch consumption data. Response Code: {response.status_code}\\nError:\\n{response.text}\",\n zone_key,\n )\n\n raw_data = response.json()\n\n now = datetime.now(TIMEZONE)\n\n # Data is updated every 5 minutes so we assume the data is from a multiple of 5 minutes and has a 5 minute delay from that multiple.\n assumed_datetime = now.replace(second=0, microsecond=0) - timedelta(\n minutes=(now.minute % 5) + 5\n )\n\n return [\n {\n \"zoneKey\": zone_key,\n \"datetime\": assumed_datetime,\n \"consumption\": int(raw_data),\n \"source\": \"saskpower.com\",\n }\n ]\n", "path": "parsers/CA_SK.py"}]} | 1,880 | 131 |
gh_patches_debug_18454 | rasdani/github-patches | git_diff | napalm-automation__napalm-514 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[2.0] "pip3 install napalm" doesn't install requirements
Debian 9.2 (Stretch) with python3 v3.5.3, pip3 v9.0.1
With v1.2.0, `pip3 install napalm==1.2.0` also installs the required modules (MarkupSafe, jinja2, netaddr, pyYAML, pyeapi, future, pynacl, bcrypt, paramiko, pyFG, scp, netmiko, lxml, pyIOSXR, ncclient, pyserial, junos-eznc, urllib3, idna, certifi, chardet, requests, pynxos, pan-python, requests-toolbelt, xmltodict, pyPluribus, chainmap, librouteros, vyattaconfparser).
With Napalm v2.0.0, no required modules are installed by `pip3 install napalm`, so napalm won't work.
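A throwaway check to confirm the symptom after `pip3 install napalm`; the module names are taken from the v1.2.0 dependency list above, and it is only an assumption that they are representative of v2.0.0's base requirements:

```
import importlib

for mod in ("netmiko", "pyeapi", "netaddr"):
    try:
        importlib.import_module(mod)
        print(mod, "is present")
    except ImportError:
        # Missing because install_requires=[] in setup.py and the custom
        # install command never processed the requirements files.
        print(mod, "is MISSING")
```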
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 """setup.py file."""
2 import uuid
3 import os
4
5 from distutils.core import Command
6 from setuptools import setup, find_packages
7 from setuptools.command import install
8
9
10 from pip.req import parse_requirements
11
12 import pip
13 import sys
14
15 __author__ = 'David Barroso <[email protected]>'
16
17 # Read SUPPORTED_DRIVERS from file (without importing)
18 _locals = {}
19 filename = os.path.join('napalm', '_SUPPORTED_DRIVERS.py')
20 with open(filename) as supported:
21 exec(supported.read(), None, _locals)
22 SUPPORTED_DRIVERS = _locals['SUPPORTED_DRIVERS']
23
24
25 def process_requirements(dep):
26 print("PROCESSING DEPENDENCIES FOR {}".format(dep))
27 u = uuid.uuid1()
28 iter_reqs = parse_requirements("requirements/{}".format(dep), session=u)
29 [pip.main(['install', (str(ir.req))]) for ir in iter_reqs]
30
31
32 def custom_command_driver(driver):
33 class CustomCommand(Command):
34 """A custom command to run Pylint on all Python source files."""
35 user_options = []
36
37 def initialize_options(self):
38 pass
39
40 def finalize_options(self):
41 pass
42
43 def run(self):
44 """Run command."""
45 process_requirements(driver)
46
47 return CustomCommand
48
49
50 class CustomInstall(install.install):
51 """A custom command to run Pylint on all Python source files."""
52
53 def run(self):
54 """Run command."""
55 if any([d in sys.argv for d in SUPPORTED_DRIVERS]):
56 process_requirements('base')
57 else:
58 process_requirements('all')
59 install.install.run(self)
60
61
62 custom_commands = {d: custom_command_driver(d) for d in SUPPORTED_DRIVERS}
63 custom_commands['install'] = CustomInstall
64
65 setup(
66 cmdclass=custom_commands,
67 name="napalm",
68 version='2.0.0',
69 packages=find_packages(exclude=("test*", )),
70 test_suite='test_base',
71 author="David Barroso, Kirk Byers, Mircea Ulinic",
72 author_email="[email protected], [email protected], [email protected]",
73 description="Network Automation and Programmability Abstraction Layer with Multivendor support",
74 classifiers=[
75 'Topic :: Utilities',
76 'Programming Language :: Python',
77 'Programming Language :: Python :: 2',
78 'Programming Language :: Python :: 2.7',
79 'Programming Language :: Python :: 3',
80 'Programming Language :: Python :: 3.4',
81 'Programming Language :: Python :: 3.5',
82 'Programming Language :: Python :: 3.6',
83 'Operating System :: POSIX :: Linux',
84 'Operating System :: MacOS',
85 ],
86 url="https://github.com/napalm-automation/napalm",
87 include_package_data=True,
88 install_requires=[],
89 entry_points={
90 'console_scripts': [
91 'cl_napalm_configure=napalm.base.clitools.cl_napalm_configure:main',
92 'cl_napalm_test=napalm.base.clitools.cl_napalm_test:main',
93 'cl_napalm_validate=napalm.base.clitools.cl_napalm_validate:main',
94 'napalm=napalm.base.clitools.cl_napalm:main',
95 ],
96 }
97 )
98
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -5,11 +5,12 @@
from distutils.core import Command
from setuptools import setup, find_packages
from setuptools.command import install
+from subprocess import check_call
from pip.req import parse_requirements
-import pip
+import pip # noqa: test pip is installed
import sys
__author__ = 'David Barroso <[email protected]>'
@@ -26,7 +27,9 @@
print("PROCESSING DEPENDENCIES FOR {}".format(dep))
u = uuid.uuid1()
iter_reqs = parse_requirements("requirements/{}".format(dep), session=u)
- [pip.main(['install', (str(ir.req))]) for ir in iter_reqs]
+
+ for ir in iter_reqs:
+ check_call([sys.executable, '-m', 'pip', 'install', str(ir.req)])
def custom_command_driver(driver):
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -5,11 +5,12 @@\n from distutils.core import Command\n from setuptools import setup, find_packages\n from setuptools.command import install\n+from subprocess import check_call\n \n \n from pip.req import parse_requirements\n \n-import pip\n+import pip # noqa: test pip is installed\n import sys\n \n __author__ = 'David Barroso <[email protected]>'\n@@ -26,7 +27,9 @@\n print(\"PROCESSING DEPENDENCIES FOR {}\".format(dep))\n u = uuid.uuid1()\n iter_reqs = parse_requirements(\"requirements/{}\".format(dep), session=u)\n- [pip.main(['install', (str(ir.req))]) for ir in iter_reqs]\n+\n+ for ir in iter_reqs:\n+ check_call([sys.executable, '-m', 'pip', 'install', str(ir.req)])\n \n \n def custom_command_driver(driver):\n", "issue": "[2.0] \"pip3 install napalm\" doesn't install requirements\nDebian 9.2 (Stretch) with python3 v3.5.3, pip3 v9.0.1\r\n\r\nWith v1.2.0 a `pip3 install napalm==1.2.0` installs also the required modules (MarkupSafe, jinja2, netaddr, pyYAML, pyeapi, future, pynacl, bcrypt, paramiko, pyFG, scp, netmiko, lxml, pyIOSXR, ncclient, pyserial, junos-eznc, urllib3, idna, certifi, chardet, requests, pynxos, pan-python, requests-toolbelt, xmltodict, pyPluribus, chainmap, librouteros, vyattaconfparser).\r\n\r\nWith Napalm v2.0.0 no required module is installed with `pip3 install napalm`, so napalm won't work.\n", "before_files": [{"content": "\"\"\"setup.py file.\"\"\"\nimport uuid\nimport os\n\nfrom distutils.core import Command\nfrom setuptools import setup, find_packages\nfrom setuptools.command import install\n\n\nfrom pip.req import parse_requirements\n\nimport pip\nimport sys\n\n__author__ = 'David Barroso <[email protected]>'\n\n# Read SUPPORTED_DRIVERS from file (without importing)\n_locals = {}\nfilename = os.path.join('napalm', '_SUPPORTED_DRIVERS.py')\nwith open(filename) as supported:\n exec(supported.read(), None, _locals)\n SUPPORTED_DRIVERS = _locals['SUPPORTED_DRIVERS']\n\n\ndef process_requirements(dep):\n print(\"PROCESSING DEPENDENCIES FOR {}\".format(dep))\n u = uuid.uuid1()\n iter_reqs = parse_requirements(\"requirements/{}\".format(dep), session=u)\n [pip.main(['install', (str(ir.req))]) for ir in iter_reqs]\n\n\ndef custom_command_driver(driver):\n class CustomCommand(Command):\n \"\"\"A custom command to run Pylint on all Python source files.\"\"\"\n user_options = []\n\n def initialize_options(self):\n pass\n\n def finalize_options(self):\n pass\n\n def run(self):\n \"\"\"Run command.\"\"\"\n process_requirements(driver)\n\n return CustomCommand\n\n\nclass CustomInstall(install.install):\n \"\"\"A custom command to run Pylint on all Python source files.\"\"\"\n\n def run(self):\n \"\"\"Run command.\"\"\"\n if any([d in sys.argv for d in SUPPORTED_DRIVERS]):\n process_requirements('base')\n else:\n process_requirements('all')\n install.install.run(self)\n\n\ncustom_commands = {d: custom_command_driver(d) for d in SUPPORTED_DRIVERS}\ncustom_commands['install'] = CustomInstall\n\nsetup(\n cmdclass=custom_commands,\n name=\"napalm\",\n version='2.0.0',\n packages=find_packages(exclude=(\"test*\", )),\n test_suite='test_base',\n author=\"David Barroso, Kirk Byers, Mircea Ulinic\",\n author_email=\"[email protected], [email protected], [email protected]\",\n description=\"Network Automation and Programmability Abstraction Layer with Multivendor support\",\n classifiers=[\n 'Topic :: Utilities',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 
'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Operating System :: POSIX :: Linux',\n 'Operating System :: MacOS',\n ],\n url=\"https://github.com/napalm-automation/napalm\",\n include_package_data=True,\n install_requires=[],\n entry_points={\n 'console_scripts': [\n 'cl_napalm_configure=napalm.base.clitools.cl_napalm_configure:main',\n 'cl_napalm_test=napalm.base.clitools.cl_napalm_test:main',\n 'cl_napalm_validate=napalm.base.clitools.cl_napalm_validate:main',\n 'napalm=napalm.base.clitools.cl_napalm:main',\n ],\n }\n)\n", "path": "setup.py"}], "after_files": [{"content": "\"\"\"setup.py file.\"\"\"\nimport uuid\nimport os\n\nfrom distutils.core import Command\nfrom setuptools import setup, find_packages\nfrom setuptools.command import install\nfrom subprocess import check_call\n\n\nfrom pip.req import parse_requirements\n\nimport pip # noqa: test pip is installed\nimport sys\n\n__author__ = 'David Barroso <[email protected]>'\n\n# Read SUPPORTED_DRIVERS from file (without importing)\n_locals = {}\nfilename = os.path.join('napalm', '_SUPPORTED_DRIVERS.py')\nwith open(filename) as supported:\n exec(supported.read(), None, _locals)\n SUPPORTED_DRIVERS = _locals['SUPPORTED_DRIVERS']\n\n\ndef process_requirements(dep):\n print(\"PROCESSING DEPENDENCIES FOR {}\".format(dep))\n u = uuid.uuid1()\n iter_reqs = parse_requirements(\"requirements/{}\".format(dep), session=u)\n\n for ir in iter_reqs:\n check_call([sys.executable, '-m', 'pip', 'install', str(ir.req)])\n\n\ndef custom_command_driver(driver):\n class CustomCommand(Command):\n \"\"\"A custom command to run Pylint on all Python source files.\"\"\"\n user_options = []\n\n def initialize_options(self):\n pass\n\n def finalize_options(self):\n pass\n\n def run(self):\n \"\"\"Run command.\"\"\"\n process_requirements(driver)\n\n return CustomCommand\n\n\nclass CustomInstall(install.install):\n \"\"\"A custom command to run Pylint on all Python source files.\"\"\"\n\n def run(self):\n \"\"\"Run command.\"\"\"\n if any([d in sys.argv for d in SUPPORTED_DRIVERS]):\n process_requirements('base')\n else:\n process_requirements('all')\n install.install.run(self)\n\n\ncustom_commands = {d: custom_command_driver(d) for d in SUPPORTED_DRIVERS}\ncustom_commands['install'] = CustomInstall\n\nsetup(\n cmdclass=custom_commands,\n name=\"napalm\",\n version='2.0.0',\n packages=find_packages(exclude=(\"test*\", )),\n test_suite='test_base',\n author=\"David Barroso, Kirk Byers, Mircea Ulinic\",\n author_email=\"[email protected], [email protected], [email protected]\",\n description=\"Network Automation and Programmability Abstraction Layer with Multivendor support\",\n classifiers=[\n 'Topic :: Utilities',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: 3.6',\n 'Operating System :: POSIX :: Linux',\n 'Operating System :: MacOS',\n ],\n url=\"https://github.com/napalm-automation/napalm\",\n include_package_data=True,\n install_requires=[],\n entry_points={\n 'console_scripts': [\n 'cl_napalm_configure=napalm.base.clitools.cl_napalm_configure:main',\n 'cl_napalm_test=napalm.base.clitools.cl_napalm_test:main',\n 
'cl_napalm_validate=napalm.base.clitools.cl_napalm_validate:main',\n 'napalm=napalm.base.clitools.cl_napalm:main',\n ],\n }\n)\n", "path": "setup.py"}]} | 1,366 | 216 |
gh_patches_debug_66489 | rasdani/github-patches | git_diff | aio-libs__aiohttp-1752 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Encoding is always UTF-8 in POST data
## Long story short
I'm doing a `POST` request via `client.post`:
```
data = aiohttp.FormData({
'FindText': name,
}, charset='windows-1251')
client.post(base_url, params={'RowFrom': offset}, data=data)
```
where `name` contains some non-Latin text (`'хан'`)
## Expected behaviour
POST data should contain: `FindText=%D5%E0%ED`
## Actual behaviour
`FindText=%D1%85%D0%B0%D0%BD`
## Steps to reproduce
Looking through the code of `formdata.py:99`
```
urlencode(data, doseq=True).encode(charset),
```
I noticed that `data` is urlencoded in UTF-8 first and only then encoded to `windows-1251` (which has no effect on the `%D1...` escapes).
For now, I just manually do in my code:
```
data = urlencode({
'FindText': name,
}, encoding='windows-1251')
```
And I get the string that I need.
Is it a bug? Or am I doing it wrong?
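A standalone comparison of the two orderings (the cp1251 byte values for `хан` are assumed): the percent-escapes are produced inside `urlencode` itself, so a later `.encode(charset)` only re-encodes an already-ASCII string and cannot change them:

```
from urllib.parse import urlencode

data = {'FindText': 'хан'}

# What formdata.py does today: escapes come from UTF-8 bytes,
# then .encode('windows-1251') is a no-op on the ASCII result.
print(urlencode(data, doseq=True).encode('windows-1251'))
# b'FindText=%D1%85%D0%B0%D0%BD'

# Passing the charset to urlencode() escapes the cp1251 bytes instead.
print(urlencode(data, doseq=True, encoding='windows-1251'))
# FindText=%F5%E0%ED
```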
## Your environment
```
Python 3.6.0 (default, Jan 16 2017, 12:12:55)
[GCC 6.3.1 20170109] on linux
---
aiohttp==2.0.3
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `aiohttp/formdata.py`
Content:
```
1 import io
2 from urllib.parse import urlencode
3
4 from multidict import MultiDict, MultiDictProxy
5
6 from . import hdrs, multipart, payload
7 from .helpers import guess_filename
8
9 __all__ = ('FormData',)
10
11
12 class FormData:
13 """Helper class for multipart/form-data and
14 application/x-www-form-urlencoded body generation."""
15
16 def __init__(self, fields=(), quote_fields=True, charset=None):
17 self._writer = multipart.MultipartWriter('form-data')
18 self._fields = []
19 self._is_multipart = False
20 self._quote_fields = quote_fields
21 self._charset = charset
22
23 if isinstance(fields, dict):
24 fields = list(fields.items())
25 elif not isinstance(fields, (list, tuple)):
26 fields = (fields,)
27 self.add_fields(*fields)
28
29 @property
30 def is_multipart(self):
31 return self._is_multipart
32
33 def add_field(self, name, value, *, content_type=None, filename=None,
34 content_transfer_encoding=None):
35
36 if isinstance(value, io.IOBase):
37 self._is_multipart = True
38 elif isinstance(value, (bytes, bytearray, memoryview)):
39 if filename is None and content_transfer_encoding is None:
40 filename = name
41
42 type_options = MultiDict({'name': name})
43 if filename is not None and not isinstance(filename, str):
44 raise TypeError('filename must be an instance of str. '
45 'Got: %s' % filename)
46 if filename is None and isinstance(value, io.IOBase):
47 filename = guess_filename(value, name)
48 if filename is not None:
49 type_options['filename'] = filename
50 self._is_multipart = True
51
52 headers = {}
53 if content_type is not None:
54 if not isinstance(content_type, str):
55 raise TypeError('content_type must be an instance of str. '
56 'Got: %s' % content_type)
57 headers[hdrs.CONTENT_TYPE] = content_type
58 self._is_multipart = True
59 if content_transfer_encoding is not None:
60 if not isinstance(content_transfer_encoding, str):
61 raise TypeError('content_transfer_encoding must be an instance'
62 ' of str. Got: %s' % content_transfer_encoding)
63 headers[hdrs.CONTENT_TRANSFER_ENCODING] = content_transfer_encoding
64 self._is_multipart = True
65
66 self._fields.append((type_options, headers, value))
67
68 def add_fields(self, *fields):
69 to_add = list(fields)
70
71 while to_add:
72 rec = to_add.pop(0)
73
74 if isinstance(rec, io.IOBase):
75 k = guess_filename(rec, 'unknown')
76 self.add_field(k, rec)
77
78 elif isinstance(rec, (MultiDictProxy, MultiDict)):
79 to_add.extend(rec.items())
80
81 elif isinstance(rec, (list, tuple)) and len(rec) == 2:
82 k, fp = rec
83 self.add_field(k, fp)
84
85 else:
86 raise TypeError('Only io.IOBase, multidict and (name, file) '
87 'pairs allowed, use .add_field() for passing '
88 'more complex parameters, got {!r}'
89 .format(rec))
90
91 def _gen_form_urlencoded(self):
92 # form data (x-www-form-urlencoded)
93 data = []
94 for type_options, _, value in self._fields:
95 data.append((type_options['name'], value))
96
97 charset = self._charset if self._charset is not None else 'utf-8'
98 return payload.BytesPayload(
99 urlencode(data, doseq=True).encode(charset),
100 content_type='application/x-www-form-urlencoded')
101
102 def _gen_form_data(self):
103 """Encode a list of fields using the multipart/form-data MIME format"""
104 for dispparams, headers, value in self._fields:
105 try:
106 if hdrs.CONTENT_TYPE in headers:
107 part = payload.get_payload(
108 value, content_type=headers[hdrs.CONTENT_TYPE],
109 headers=headers, encoding=self._charset)
110 else:
111 part = payload.get_payload(
112 value, headers=headers, encoding=self._charset)
113 except Exception as exc:
114 raise TypeError(
115 'Can not serialize value type: %r\n '
116 'headers: %r\n value: %r' % (
117 type(value), headers, value)) from exc
118
119 if dispparams:
120 part.set_content_disposition(
121 'form-data', quote_fields=self._quote_fields, **dispparams
122 )
123 # FIXME cgi.FieldStorage doesn't likes body parts with
124 # Content-Length which were sent via chunked transfer encoding
125 part.headers.pop(hdrs.CONTENT_LENGTH, None)
126
127 self._writer.append_payload(part)
128
129 return self._writer
130
131 def __call__(self):
132 if self._is_multipart:
133 return self._gen_form_data()
134 else:
135 return self._gen_form_urlencoded()
136
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/aiohttp/formdata.py b/aiohttp/formdata.py
--- a/aiohttp/formdata.py
+++ b/aiohttp/formdata.py
@@ -96,7 +96,7 @@
charset = self._charset if self._charset is not None else 'utf-8'
return payload.BytesPayload(
- urlencode(data, doseq=True).encode(charset),
+ urlencode(data, doseq=True, encoding=charset).encode(),
content_type='application/x-www-form-urlencoded')
def _gen_form_data(self):
| {"golden_diff": "diff --git a/aiohttp/formdata.py b/aiohttp/formdata.py\n--- a/aiohttp/formdata.py\n+++ b/aiohttp/formdata.py\n@@ -96,7 +96,7 @@\n \n charset = self._charset if self._charset is not None else 'utf-8'\n return payload.BytesPayload(\n- urlencode(data, doseq=True).encode(charset),\n+ urlencode(data, doseq=True, encoding=charset).encode(),\n content_type='application/x-www-form-urlencoded')\n \n def _gen_form_data(self):\n", "issue": "Encoding is always UTF-8 in POST data\n## Long story short\r\n\r\nI'm doing a `POST` request via `client.post`:\r\n\r\n```\r\ndata = aiohttp.FormData({\r\n 'FindText': name,\r\n }, charset='windows-1251')\r\n\r\nclient.post(base_url, params={'RowFrom': offset}, data=data)\r\n```\r\n\r\nwhere `name` contains some none-latin text (`'\u0445\u0430\u043d'`)\r\n\r\n## Expected behaviour\r\n\r\nPOST data should contain: `FindText=%D5%E0%ED`\r\n\r\n## Actual behaviour\r\n\r\n`FindText=%D1%85%D0%B0%D0%BD'`\r\n\r\n## Steps to reproduce\r\n\r\nLooking through the code of `formdata.py:99`\r\n\r\n```\r\nurlencode(data, doseq=True).encode(charset),\r\n```\r\n\r\nI noticed, that `data` is urlencoded in UTF-8 first and then encoded to `windows-1251` (and that has no effect on `%D1...`).\r\n\r\nFor now, I just manually do in my code:\r\n\r\n```\r\ndata = urlencode({\r\n 'FindText': name,\r\n }, encoding='windows-1251')\r\n```\r\n\r\nAnd I get the string that I need.\r\n\r\nIs it a bug? Or am I doing it wrong?\r\n\r\n## Your environment\r\n\r\n```\r\nPython 3.6.0 (default, Jan 16 2017, 12:12:55) \r\n[GCC 6.3.1 20170109] on linux\r\n---\r\naiohttp==2.0.3\r\n```\r\n\n", "before_files": [{"content": "import io\nfrom urllib.parse import urlencode\n\nfrom multidict import MultiDict, MultiDictProxy\n\nfrom . import hdrs, multipart, payload\nfrom .helpers import guess_filename\n\n__all__ = ('FormData',)\n\n\nclass FormData:\n \"\"\"Helper class for multipart/form-data and\n application/x-www-form-urlencoded body generation.\"\"\"\n\n def __init__(self, fields=(), quote_fields=True, charset=None):\n self._writer = multipart.MultipartWriter('form-data')\n self._fields = []\n self._is_multipart = False\n self._quote_fields = quote_fields\n self._charset = charset\n\n if isinstance(fields, dict):\n fields = list(fields.items())\n elif not isinstance(fields, (list, tuple)):\n fields = (fields,)\n self.add_fields(*fields)\n\n @property\n def is_multipart(self):\n return self._is_multipart\n\n def add_field(self, name, value, *, content_type=None, filename=None,\n content_transfer_encoding=None):\n\n if isinstance(value, io.IOBase):\n self._is_multipart = True\n elif isinstance(value, (bytes, bytearray, memoryview)):\n if filename is None and content_transfer_encoding is None:\n filename = name\n\n type_options = MultiDict({'name': name})\n if filename is not None and not isinstance(filename, str):\n raise TypeError('filename must be an instance of str. '\n 'Got: %s' % filename)\n if filename is None and isinstance(value, io.IOBase):\n filename = guess_filename(value, name)\n if filename is not None:\n type_options['filename'] = filename\n self._is_multipart = True\n\n headers = {}\n if content_type is not None:\n if not isinstance(content_type, str):\n raise TypeError('content_type must be an instance of str. 
'\n 'Got: %s' % content_type)\n headers[hdrs.CONTENT_TYPE] = content_type\n self._is_multipart = True\n if content_transfer_encoding is not None:\n if not isinstance(content_transfer_encoding, str):\n raise TypeError('content_transfer_encoding must be an instance'\n ' of str. Got: %s' % content_transfer_encoding)\n headers[hdrs.CONTENT_TRANSFER_ENCODING] = content_transfer_encoding\n self._is_multipart = True\n\n self._fields.append((type_options, headers, value))\n\n def add_fields(self, *fields):\n to_add = list(fields)\n\n while to_add:\n rec = to_add.pop(0)\n\n if isinstance(rec, io.IOBase):\n k = guess_filename(rec, 'unknown')\n self.add_field(k, rec)\n\n elif isinstance(rec, (MultiDictProxy, MultiDict)):\n to_add.extend(rec.items())\n\n elif isinstance(rec, (list, tuple)) and len(rec) == 2:\n k, fp = rec\n self.add_field(k, fp)\n\n else:\n raise TypeError('Only io.IOBase, multidict and (name, file) '\n 'pairs allowed, use .add_field() for passing '\n 'more complex parameters, got {!r}'\n .format(rec))\n\n def _gen_form_urlencoded(self):\n # form data (x-www-form-urlencoded)\n data = []\n for type_options, _, value in self._fields:\n data.append((type_options['name'], value))\n\n charset = self._charset if self._charset is not None else 'utf-8'\n return payload.BytesPayload(\n urlencode(data, doseq=True).encode(charset),\n content_type='application/x-www-form-urlencoded')\n\n def _gen_form_data(self):\n \"\"\"Encode a list of fields using the multipart/form-data MIME format\"\"\"\n for dispparams, headers, value in self._fields:\n try:\n if hdrs.CONTENT_TYPE in headers:\n part = payload.get_payload(\n value, content_type=headers[hdrs.CONTENT_TYPE],\n headers=headers, encoding=self._charset)\n else:\n part = payload.get_payload(\n value, headers=headers, encoding=self._charset)\n except Exception as exc:\n raise TypeError(\n 'Can not serialize value type: %r\\n '\n 'headers: %r\\n value: %r' % (\n type(value), headers, value)) from exc\n\n if dispparams:\n part.set_content_disposition(\n 'form-data', quote_fields=self._quote_fields, **dispparams\n )\n # FIXME cgi.FieldStorage doesn't likes body parts with\n # Content-Length which were sent via chunked transfer encoding\n part.headers.pop(hdrs.CONTENT_LENGTH, None)\n\n self._writer.append_payload(part)\n\n return self._writer\n\n def __call__(self):\n if self._is_multipart:\n return self._gen_form_data()\n else:\n return self._gen_form_urlencoded()\n", "path": "aiohttp/formdata.py"}], "after_files": [{"content": "import io\nfrom urllib.parse import urlencode\n\nfrom multidict import MultiDict, MultiDictProxy\n\nfrom . 
import hdrs, multipart, payload\nfrom .helpers import guess_filename\n\n__all__ = ('FormData',)\n\n\nclass FormData:\n \"\"\"Helper class for multipart/form-data and\n application/x-www-form-urlencoded body generation.\"\"\"\n\n def __init__(self, fields=(), quote_fields=True, charset=None):\n self._writer = multipart.MultipartWriter('form-data')\n self._fields = []\n self._is_multipart = False\n self._quote_fields = quote_fields\n self._charset = charset\n\n if isinstance(fields, dict):\n fields = list(fields.items())\n elif not isinstance(fields, (list, tuple)):\n fields = (fields,)\n self.add_fields(*fields)\n\n @property\n def is_multipart(self):\n return self._is_multipart\n\n def add_field(self, name, value, *, content_type=None, filename=None,\n content_transfer_encoding=None):\n\n if isinstance(value, io.IOBase):\n self._is_multipart = True\n elif isinstance(value, (bytes, bytearray, memoryview)):\n if filename is None and content_transfer_encoding is None:\n filename = name\n\n type_options = MultiDict({'name': name})\n if filename is not None and not isinstance(filename, str):\n raise TypeError('filename must be an instance of str. '\n 'Got: %s' % filename)\n if filename is None and isinstance(value, io.IOBase):\n filename = guess_filename(value, name)\n if filename is not None:\n type_options['filename'] = filename\n self._is_multipart = True\n\n headers = {}\n if content_type is not None:\n if not isinstance(content_type, str):\n raise TypeError('content_type must be an instance of str. '\n 'Got: %s' % content_type)\n headers[hdrs.CONTENT_TYPE] = content_type\n self._is_multipart = True\n if content_transfer_encoding is not None:\n if not isinstance(content_transfer_encoding, str):\n raise TypeError('content_transfer_encoding must be an instance'\n ' of str. 
Got: %s' % content_transfer_encoding)\n headers[hdrs.CONTENT_TRANSFER_ENCODING] = content_transfer_encoding\n self._is_multipart = True\n\n self._fields.append((type_options, headers, value))\n\n def add_fields(self, *fields):\n to_add = list(fields)\n\n while to_add:\n rec = to_add.pop(0)\n\n if isinstance(rec, io.IOBase):\n k = guess_filename(rec, 'unknown')\n self.add_field(k, rec)\n\n elif isinstance(rec, (MultiDictProxy, MultiDict)):\n to_add.extend(rec.items())\n\n elif isinstance(rec, (list, tuple)) and len(rec) == 2:\n k, fp = rec\n self.add_field(k, fp)\n\n else:\n raise TypeError('Only io.IOBase, multidict and (name, file) '\n 'pairs allowed, use .add_field() for passing '\n 'more complex parameters, got {!r}'\n .format(rec))\n\n def _gen_form_urlencoded(self):\n # form data (x-www-form-urlencoded)\n data = []\n for type_options, _, value in self._fields:\n data.append((type_options['name'], value))\n\n charset = self._charset if self._charset is not None else 'utf-8'\n return payload.BytesPayload(\n urlencode(data, doseq=True, encoding=charset).encode(),\n content_type='application/x-www-form-urlencoded')\n\n def _gen_form_data(self):\n \"\"\"Encode a list of fields using the multipart/form-data MIME format\"\"\"\n for dispparams, headers, value in self._fields:\n try:\n if hdrs.CONTENT_TYPE in headers:\n part = payload.get_payload(\n value, content_type=headers[hdrs.CONTENT_TYPE],\n headers=headers, encoding=self._charset)\n else:\n part = payload.get_payload(\n value, headers=headers, encoding=self._charset)\n except Exception as exc:\n raise TypeError(\n 'Can not serialize value type: %r\\n '\n 'headers: %r\\n value: %r' % (\n type(value), headers, value)) from exc\n\n if dispparams:\n part.set_content_disposition(\n 'form-data', quote_fields=self._quote_fields, **dispparams\n )\n # FIXME cgi.FieldStorage doesn't likes body parts with\n # Content-Length which were sent via chunked transfer encoding\n part.headers.pop(hdrs.CONTENT_LENGTH, None)\n\n self._writer.append_payload(part)\n\n return self._writer\n\n def __call__(self):\n if self._is_multipart:\n return self._gen_form_data()\n else:\n return self._gen_form_urlencoded()\n", "path": "aiohttp/formdata.py"}]} | 1,943 | 121 |
gh_patches_debug_31145 | rasdani/github-patches | git_diff | archlinux__archinstall-408 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
_gfx_driver_packages not defined when choosing sway option (current master build)
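The cause appears to be an ordering problem in `profiles/sway.py` (shown below): `_prep_function` reads `_gfx_driver_packages` before the `__builtins__['_gfx_driver_packages'] = archinstall.select_driver()` line that would define it, so the first call raises a `NameError`. Reduced to the essentials:

```
def _prep_function():
    if "nvidia" in _gfx_driver_packages:        # read happens first: NameError
        raise SystemExit("nvidia unsupported")
    __builtins__['_gfx_driver_packages'] = []   # ...it is only defined here

_prep_function()
# NameError: name '_gfx_driver_packages' is not defined
```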

_gfx_driver_packages not defined when choosing sway option (current master build)

_gfx_driver_packages not defined when choosing sway option (current master build)

--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `profiles/sway.py`
Content:
```
1 # A desktop environment using "Sway"
2
3 import archinstall
4
5 is_top_level_profile = False
6
7 __packages__ = ["sway", "swaylock", "swayidle", "waybar", "dmenu", "light", "grim", "slurp", "pavucontrol", "alacritty"]
8
9 def _prep_function(*args, **kwargs):
10 """
11 Magic function called by the importing installer
12 before continuing any further. It also avoids executing any
13 other code in this stage. So it's a safe way to ask the user
14 for more input before any other installer steps start.
15 """
16 if "nvidia" in _gfx_driver_packages:
17 choice = input("The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? [y/N] ")
18 if choice.lower() in ("n", ""):
19 raise archinstall.lib.exceptions.HardwareIncompatibilityError("Sway does not support the proprietary nvidia drivers.")
20
21 __builtins__['_gfx_driver_packages'] = archinstall.select_driver()
22
23 return True
24
25 # Ensures that this code only gets executed if executed
26 # through importlib.util.spec_from_file_location("sway", "/somewhere/sway.py")
27 # or through conventional import sway
28 if __name__ == 'sway':
29 # Install the Sway packages
30 installation.add_additional_packages(__packages__)
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/profiles/sway.py b/profiles/sway.py
--- a/profiles/sway.py
+++ b/profiles/sway.py
@@ -4,7 +4,19 @@
is_top_level_profile = False
-__packages__ = ["sway", "swaylock", "swayidle", "waybar", "dmenu", "light", "grim", "slurp", "pavucontrol", "alacritty"]
+__packages__ = [
+ "sway",
+ "swaylock",
+ "swayidle",
+ "waybar",
+ "dmenu",
+ "light",
+ "grim",
+ "slurp",
+ "pavucontrol",
+ "alacritty",
+]
+
def _prep_function(*args, **kwargs):
"""
@@ -13,18 +25,26 @@
other code in this stage. So it's a safe way to ask the user
for more input before any other installer steps start.
"""
- if "nvidia" in _gfx_driver_packages:
- choice = input("The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? [y/N] ")
- if choice.lower() in ("n", ""):
- raise archinstall.lib.exceptions.HardwareIncompatibilityError("Sway does not support the proprietary nvidia drivers.")
-
- __builtins__['_gfx_driver_packages'] = archinstall.select_driver()
+ __builtins__["_gfx_driver_packages"] = archinstall.select_driver()
return True
+
# Ensures that this code only gets executed if executed
# through importlib.util.spec_from_file_location("sway", "/somewhere/sway.py")
# or through conventional import sway
-if __name__ == 'sway':
+if __name__ == "sway":
+ if "nvidia" in _gfx_driver_packages:
+ choice = input(
+ "The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? [y/N] "
+ )
+ if choice.lower() in ("n", ""):
+ raise archinstall.lib.exceptions.HardwareIncompatibilityError(
+ "Sway does not support the proprietary nvidia drivers."
+ )
+
# Install the Sway packages
installation.add_additional_packages(__packages__)
+
+ # Install the graphics driver packages
+ installation.add_additional_packages(_gfx_driver_packages)
| {"golden_diff": "diff --git a/profiles/sway.py b/profiles/sway.py\n--- a/profiles/sway.py\n+++ b/profiles/sway.py\n@@ -4,7 +4,19 @@\n \n is_top_level_profile = False\n \n-__packages__ = [\"sway\", \"swaylock\", \"swayidle\", \"waybar\", \"dmenu\", \"light\", \"grim\", \"slurp\", \"pavucontrol\", \"alacritty\"]\n+__packages__ = [\n+\t\"sway\",\n+\t\"swaylock\",\n+\t\"swayidle\",\n+\t\"waybar\",\n+\t\"dmenu\",\n+\t\"light\",\n+\t\"grim\",\n+\t\"slurp\",\n+\t\"pavucontrol\",\n+\t\"alacritty\",\n+]\n+\n \n def _prep_function(*args, **kwargs):\n \t\"\"\"\n@@ -13,18 +25,26 @@\n \tother code in this stage. So it's a safe way to ask the user\n \tfor more input before any other installer steps start.\n \t\"\"\"\n-\tif \"nvidia\" in _gfx_driver_packages:\n-\t\tchoice = input(\"The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? [y/N] \")\n-\t\tif choice.lower() in (\"n\", \"\"):\n-\t\t\traise archinstall.lib.exceptions.HardwareIncompatibilityError(\"Sway does not support the proprietary nvidia drivers.\")\n-\n-\t__builtins__['_gfx_driver_packages'] = archinstall.select_driver()\n+\t__builtins__[\"_gfx_driver_packages\"] = archinstall.select_driver()\n \n \treturn True\n \n+\n # Ensures that this code only gets executed if executed\n # through importlib.util.spec_from_file_location(\"sway\", \"/somewhere/sway.py\")\n # or through conventional import sway\n-if __name__ == 'sway':\n+if __name__ == \"sway\":\n+\tif \"nvidia\" in _gfx_driver_packages:\n+\t\tchoice = input(\n+\t\t\t\"The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? [y/N] \"\n+\t\t)\n+\t\tif choice.lower() in (\"n\", \"\"):\n+\t\t\traise archinstall.lib.exceptions.HardwareIncompatibilityError(\n+\t\t\t\t\"Sway does not support the proprietary nvidia drivers.\"\n+\t\t\t)\n+\n \t# Install the Sway packages\n \tinstallation.add_additional_packages(__packages__)\n+\n+\t# Install the graphics driver packages\n+\tinstallation.add_additional_packages(_gfx_driver_packages)\n", "issue": "_gfx_driver_packages not defined when choosing sway option (current master build)\n\r\n\n_gfx_driver_packages not defined when choosing sway option (current master build)\n\r\n\n_gfx_driver_packages not defined when choosing sway option (current master build)\n\r\n\n", "before_files": [{"content": "# A desktop environment using \"Sway\"\n\nimport archinstall\n\nis_top_level_profile = False\n\n__packages__ = [\"sway\", \"swaylock\", \"swayidle\", \"waybar\", \"dmenu\", \"light\", \"grim\", \"slurp\", \"pavucontrol\", \"alacritty\"]\n\ndef _prep_function(*args, **kwargs):\n\t\"\"\"\n\tMagic function called by the importing installer\n\tbefore continuing any further. It also avoids executing any\n\tother code in this stage. So it's a safe way to ask the user\n\tfor more input before any other installer steps start.\n\t\"\"\"\n\tif \"nvidia\" in _gfx_driver_packages:\n\t\tchoice = input(\"The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? 
[y/N] \")\n\t\tif choice.lower() in (\"n\", \"\"):\n\t\t\traise archinstall.lib.exceptions.HardwareIncompatibilityError(\"Sway does not support the proprietary nvidia drivers.\")\n\n\t__builtins__['_gfx_driver_packages'] = archinstall.select_driver()\n\n\treturn True\n\n# Ensures that this code only gets executed if executed\n# through importlib.util.spec_from_file_location(\"sway\", \"/somewhere/sway.py\")\n# or through conventional import sway\nif __name__ == 'sway':\n\t# Install the Sway packages\n\tinstallation.add_additional_packages(__packages__)\n", "path": "profiles/sway.py"}], "after_files": [{"content": "# A desktop environment using \"Sway\"\n\nimport archinstall\n\nis_top_level_profile = False\n\n__packages__ = [\n\t\"sway\",\n\t\"swaylock\",\n\t\"swayidle\",\n\t\"waybar\",\n\t\"dmenu\",\n\t\"light\",\n\t\"grim\",\n\t\"slurp\",\n\t\"pavucontrol\",\n\t\"alacritty\",\n]\n\n\ndef _prep_function(*args, **kwargs):\n\t\"\"\"\n\tMagic function called by the importing installer\n\tbefore continuing any further. It also avoids executing any\n\tother code in this stage. So it's a safe way to ask the user\n\tfor more input before any other installer steps start.\n\t\"\"\"\n\t__builtins__[\"_gfx_driver_packages\"] = archinstall.select_driver()\n\n\treturn True\n\n\n# Ensures that this code only gets executed if executed\n# through importlib.util.spec_from_file_location(\"sway\", \"/somewhere/sway.py\")\n# or through conventional import sway\nif __name__ == \"sway\":\n\tif \"nvidia\" in _gfx_driver_packages:\n\t\tchoice = input(\n\t\t\t\"The proprietary Nvidia driver is not supported by Sway. It is likely that you will run into issues. Continue anyways? [y/N] \"\n\t\t)\n\t\tif choice.lower() in (\"n\", \"\"):\n\t\t\traise archinstall.lib.exceptions.HardwareIncompatibilityError(\n\t\t\t\t\"Sway does not support the proprietary nvidia drivers.\"\n\t\t\t)\n\n\t# Install the Sway packages\n\tinstallation.add_additional_packages(__packages__)\n\n\t# Install the graphics driver packages\n\tinstallation.add_additional_packages(_gfx_driver_packages)\n", "path": "profiles/sway.py"}]} | 928 | 558 |
gh_patches_debug_1625 | rasdani/github-patches | git_diff | Lightning-AI__pytorch-lightning-799 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Optional dependencies are required for deprecated logging module
## 🐛 Bug
There is a backwards compatibility issue coming from PR #767. Notably, if a user doesn't have any of the extra logging dependencies, there will be an import error.
### To Reproduce
1. Remove all logging dependencies from your environment (E.g. comet)
2. Depend on the deprecated pytorch_lightning.logging package and run
### Expected behavior
We expect to maintain backwards compatibility here so optional dependencies shouldn't be required.
--- END ISSUE ---
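For orientation, the fix ultimately applied (shown in the diff at the end of this record) simply drops the eager submodule imports; a minimal sketch of an import-safe shim, assuming the wildcard re-export covers the public loggers API, looks like this:
```python
# Sketch of an import-safe deprecation shim. Eagerly importing submodules
# such as `comet` would pull in optional third-party packages (e.g. comet_ml)
# at import time, so the shim re-exports only the wildcard API.
import warnings

warnings.warn("`logging` package has been renamed to `loggers` since v0.6.1"
              " and will be removed in v0.8.0", DeprecationWarning)

from pytorch_lightning.loggers import *  # noqa: F403
```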
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pytorch_lightning/logging/__init__.py`
Content:
```
1 """
2 .. warning:: `logging` package has been renamed to `loggers` since v0.6.1 and will be removed in v0.8.0
3 """
4
5 import warnings
6
7 warnings.warn("`logging` package has been renamed to `loggers` since v0.6.1"
8 " and will be removed in v0.8.0", DeprecationWarning)
9
10 from pytorch_lightning.loggers import * # noqa: F403
11 from pytorch_lightning.loggers import ( # noqa: E402
12 base, comet, mlflow, neptune, tensorboard, test_tube, wandb
13 )
14
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pytorch_lightning/logging/__init__.py b/pytorch_lightning/logging/__init__.py
--- a/pytorch_lightning/logging/__init__.py
+++ b/pytorch_lightning/logging/__init__.py
@@ -8,6 +8,3 @@
" and will be removed in v0.8.0", DeprecationWarning)
from pytorch_lightning.loggers import * # noqa: F403
-from pytorch_lightning.loggers import ( # noqa: E402
- base, comet, mlflow, neptune, tensorboard, test_tube, wandb
-)
| {"golden_diff": "diff --git a/pytorch_lightning/logging/__init__.py b/pytorch_lightning/logging/__init__.py\n--- a/pytorch_lightning/logging/__init__.py\n+++ b/pytorch_lightning/logging/__init__.py\n@@ -8,6 +8,3 @@\n \" and will be removed in v0.8.0\", DeprecationWarning)\n \n from pytorch_lightning.loggers import * # noqa: F403\n-from pytorch_lightning.loggers import ( # noqa: E402\n- base, comet, mlflow, neptune, tensorboard, test_tube, wandb\n-)\n", "issue": "Optional dependencies are required for deprecated logging module\n\ud83d\udc1b Bug\r\n\r\nThere is a backwards compatibility issues coming from PR #767. Notably, if a user doesn't have any of the extra logging dependencies then they'll be an import error.\r\n\r\n### To Reproduce\r\n\r\n1. Remove all logging dependencies from your environment (E.g. comet)\r\n2. Depend on the deprecated pytorch_lightning.logging package and run\r\n\r\n### Expected behavior\r\n\r\nWe expect to maintain backwards compatibility here so optional dependencies shouldn't be required.\n", "before_files": [{"content": "\"\"\"\n.. warning:: `logging` package has been renamed to `loggers` since v0.6.1 and will be removed in v0.8.0\n\"\"\"\n\nimport warnings\n\nwarnings.warn(\"`logging` package has been renamed to `loggers` since v0.6.1\"\n \" and will be removed in v0.8.0\", DeprecationWarning)\n\nfrom pytorch_lightning.loggers import * # noqa: F403\nfrom pytorch_lightning.loggers import ( # noqa: E402\n base, comet, mlflow, neptune, tensorboard, test_tube, wandb\n)\n", "path": "pytorch_lightning/logging/__init__.py"}], "after_files": [{"content": "\"\"\"\n.. warning:: `logging` package has been renamed to `loggers` since v0.6.1 and will be removed in v0.8.0\n\"\"\"\n\nimport warnings\n\nwarnings.warn(\"`logging` package has been renamed to `loggers` since v0.6.1\"\n \" and will be removed in v0.8.0\", DeprecationWarning)\n\nfrom pytorch_lightning.loggers import * # noqa: F403\n", "path": "pytorch_lightning/logging/__init__.py"}]} | 526 | 138 |
gh_patches_debug_6678 | rasdani/github-patches | git_diff | psychopy__psychopy-2734 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[bug] visual.Rect component does not recalculate vertices after change of size, width and height properties
This results in an inability to update the width and height of a Rect during the main loop. I have noticed that the Rect class was updated (commit from 14 October), but this update made it unusable. The fix is simple: update the vertices after setting the size, and set self._needVertexUpdate to True to enable redrawing the updated shape.
--- END ISSUE ---
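A minimal sketch of the corrected `size` setter described above (mirroring the patch shown at the end of this record; the decisive line is the vertex-cache invalidation):
```python
@attributeSetter
def size(self, value):
    """Size of the rectangle (`width` and `height`)."""
    self.__dict__['size'] = np.array(value, float)
    width, height = self.__dict__['size']
    self.__dict__['width'] = width
    self.__dict__['height'] = height
    self._needVertexUpdate = True  # force the vertices to be recomputed on the next draw
```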
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `psychopy/visual/rect.py`
Content:
```
1 #!/usr/bin/env python
2 # -*- coding: utf-8 -*-
3
4 """Creates a rectangle of given width and height as a special case of a
5 :class:`~psychopy.visual.ShapeStim`"""
6
7 # Part of the PsychoPy library
8 # Copyright (C) 2002-2018 Jonathan Peirce (C) 2019 Open Science Tools Ltd.
9 # Distributed under the terms of the GNU General Public License (GPL).
10
11 from __future__ import absolute_import, print_function
12
13 import numpy as np
14
15 import psychopy # so we can get the __path__
16 from psychopy.visual.shape import BaseShapeStim
17 from psychopy.tools.attributetools import attributeSetter, setAttribute
18
19
20 class Rect(BaseShapeStim):
21 """Creates a rectangle of given width and height as a special case of a
22 :class:`~psychopy.visual.ShapeStim`
23
24 (New in version 1.72.00)
25
26 Attributes
27 ----------
28 width, height : float or int
29 The width and height of the rectangle. Values are aliased with fields
30 in the `size` attribute. Use these values to adjust the size of the
31 rectangle in a single dimension after initialization.
32
33 """
34 def __init__(self,
35 win,
36 width=.5,
37 height=.5,
38 autoLog=None,
39 units='',
40 lineWidth=1.5,
41 lineColor='white',
42 lineColorSpace='rgb',
43 fillColor=None,
44 fillColorSpace='rgb',
45 pos=(0, 0),
46 size=None,
47 ori=0.0,
48 opacity=1.0,
49 contrast=1.0,
50 depth=0,
51 interpolate=True,
52 name=None,
53 autoDraw=False):
54 """
55 Parameters
56 ----------
57 win : `~psychopy.visual.Window`
58 Window object to be associated with this stimuli.
59 width, height : float or int
60 The width and height of the rectangle. *DEPRECATED* use `size`
61 to define the dimensions of the rectangle on initialization. If
62 `size` is specified the values of `width` and `height` are
63 ignored. This is to provide legacy compatibility for existing
64 applications.
65 size : array_like, float or int
66 Width and height of the rectangle as (w, h) or [w, h]. If a single
67 value is provided, the width and height will be set to the same
68 specified value. If `None` is specified, the `size` will be set
69 with values passed to `width` and `height`.
70
71 """
72 # width and height attributes, these are later aliased with `size`
73 self.__dict__['width'] = float(width)
74 self.__dict__['height'] = float(height)
75
76 # If the size argument was specified, override values of width and
77 # height, this is to maintain legacy compatibility. Args width and
78 # height should be deprecated in later releases.
79 if size is None:
80 size = (self.__dict__['width'],
81 self.__dict__['height'])
82
83 # vertices for rectangle, CCW winding order
84 vertices = np.array([[-.5, .5],
85 [ .5, .5],
86 [ .5, -.5],
87 [-.5, -.5]])
88
89 super(Rect, self).__init__(
90 win,
91 units=units,
92 lineWidth=lineWidth,
93 lineColor=lineColor,
94 lineColorSpace=lineColorSpace,
95 fillColor=fillColor,
96 fillColorSpace=fillColorSpace,
97 vertices=vertices,
98 closeShape=True,
99 pos=pos,
100 size=size,
101 ori=ori,
102 opacity=opacity,
103 contrast=contrast,
104 depth=depth,
105 interpolate=interpolate,
106 name=name,
107 autoLog=autoLog,
108 autoDraw=autoDraw)
109
110 @attributeSetter
111 def size(self, value):
112 """array-like.
113 Size of the rectangle (`width` and `height`).
114 """
115 # Needed to override `size` to ensure `width` and `height` attrs
116 # are updated when it changes.
117 self.__dict__['size'] = np.array(value, float)
118
119 width, height = self.__dict__['size']
120 self.__dict__['width'] = width
121 self.__dict__['height'] = height
122
123 def setSize(self, size, operation='', log=None):
124 """Usually you can use 'stim.attribute = value' syntax instead,
125 but use this method if you need to suppress the log message
126
127 :ref:`Operations <attrib-operations>` supported.
128 """
129 setAttribute(self, 'size', size, log, operation)
130
131 @attributeSetter
132 def width(self, value):
133 """int or float.
134 Width of the Rectangle (in its respective units, if specified).
135
136 :ref:`Operations <attrib-operations>` supported.
137 """
138 self.__dict__['width'] = float(value)
139 self.size = (self.__dict__['width'], self.size[1])
140
141 def setWidth(self, width, operation='', log=None):
142 """Usually you can use 'stim.attribute = value' syntax instead,
143 but use this method if you need to suppress the log message
144 """
145 setAttribute(self, 'width', width, log, operation)
146
147 @attributeSetter
148 def height(self, value):
149 """int or float.
150 Height of the Rectangle (in its respective units, if specified).
151
152 :ref:`Operations <attrib-operations>` supported.
153 """
154 self.__dict__['height'] = float(value)
155 self.size = (self.size[0], self.__dict__['height'])
156
157 def setHeight(self, height, operation='', log=None):
158 """Usually you can use 'stim.attribute = value' syntax instead,
159 but use this method if you need to suppress the log message
160 """
161 setAttribute(self, 'height', height, log, operation)
162
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/psychopy/visual/rect.py b/psychopy/visual/rect.py
--- a/psychopy/visual/rect.py
+++ b/psychopy/visual/rect.py
@@ -120,6 +120,8 @@
self.__dict__['width'] = width
self.__dict__['height'] = height
+ self._needVertexUpdate = True
+
def setSize(self, size, operation='', log=None):
"""Usually you can use 'stim.attribute = value' syntax instead,
but use this method if you need to suppress the log message
| {"golden_diff": "diff --git a/psychopy/visual/rect.py b/psychopy/visual/rect.py\n--- a/psychopy/visual/rect.py\n+++ b/psychopy/visual/rect.py\n@@ -120,6 +120,8 @@\n self.__dict__['width'] = width\n self.__dict__['height'] = height\n \n+ self._needVertexUpdate = True\n+\n def setSize(self, size, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n", "issue": "[bug] visual.Rect component does not recalculate vertices after change of size, width and height properties\nWhich results in inability to update width and height of Rect during main loop. I have noticed that Rect class was updated (commit from 14 october), but this update made it unusable. Fix is simple, update vertices after setting size and set self._needVertexUpdate to True to enable redrawing updated shape.\r\n \n", "before_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n\"\"\"Creates a rectangle of given width and height as a special case of a\n:class:`~psychopy.visual.ShapeStim`\"\"\"\n\n# Part of the PsychoPy library\n# Copyright (C) 2002-2018 Jonathan Peirce (C) 2019 Open Science Tools Ltd.\n# Distributed under the terms of the GNU General Public License (GPL).\n\nfrom __future__ import absolute_import, print_function\n\nimport numpy as np\n\nimport psychopy # so we can get the __path__\nfrom psychopy.visual.shape import BaseShapeStim\nfrom psychopy.tools.attributetools import attributeSetter, setAttribute\n\n\nclass Rect(BaseShapeStim):\n \"\"\"Creates a rectangle of given width and height as a special case of a\n :class:`~psychopy.visual.ShapeStim`\n\n (New in version 1.72.00)\n\n Attributes\n ----------\n width, height : float or int\n The width and height of the rectangle. Values are aliased with fields\n in the `size` attribute. Use these values to adjust the size of the\n rectangle in a single dimension after initialization.\n\n \"\"\"\n def __init__(self,\n win,\n width=.5,\n height=.5,\n autoLog=None,\n units='',\n lineWidth=1.5,\n lineColor='white',\n lineColorSpace='rgb',\n fillColor=None,\n fillColorSpace='rgb',\n pos=(0, 0),\n size=None,\n ori=0.0,\n opacity=1.0,\n contrast=1.0,\n depth=0,\n interpolate=True,\n name=None,\n autoDraw=False):\n \"\"\"\n Parameters\n ----------\n win : `~psychopy.visual.Window`\n Window object to be associated with this stimuli.\n width, height : float or int\n The width and height of the rectangle. *DEPRECATED* use `size`\n to define the dimensions of the rectangle on initialization. If\n `size` is specified the values of `width` and `height` are\n ignored. This is to provide legacy compatibility for existing\n applications.\n size : array_like, float or int\n Width and height of the rectangle as (w, h) or [w, h]. If a single\n value is provided, the width and height will be set to the same\n specified value. If `None` is specified, the `size` will be set\n with values passed to `width` and `height`.\n\n \"\"\"\n # width and height attributes, these are later aliased with `size`\n self.__dict__['width'] = float(width)\n self.__dict__['height'] = float(height)\n\n # If the size argument was specified, override values of width and\n # height, this is to maintain legacy compatibility. 
Args width and\n # height should be deprecated in later releases.\n if size is None:\n size = (self.__dict__['width'],\n self.__dict__['height'])\n\n # vertices for rectangle, CCW winding order\n vertices = np.array([[-.5, .5],\n [ .5, .5],\n [ .5, -.5],\n [-.5, -.5]])\n\n super(Rect, self).__init__(\n win,\n units=units,\n lineWidth=lineWidth,\n lineColor=lineColor,\n lineColorSpace=lineColorSpace,\n fillColor=fillColor,\n fillColorSpace=fillColorSpace,\n vertices=vertices,\n closeShape=True,\n pos=pos,\n size=size,\n ori=ori,\n opacity=opacity,\n contrast=contrast,\n depth=depth,\n interpolate=interpolate,\n name=name,\n autoLog=autoLog,\n autoDraw=autoDraw)\n\n @attributeSetter\n def size(self, value):\n \"\"\"array-like.\n Size of the rectangle (`width` and `height`).\n \"\"\"\n # Needed to override `size` to ensure `width` and `height` attrs\n # are updated when it changes.\n self.__dict__['size'] = np.array(value, float)\n\n width, height = self.__dict__['size']\n self.__dict__['width'] = width\n self.__dict__['height'] = height\n\n def setSize(self, size, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n\n :ref:`Operations <attrib-operations>` supported.\n \"\"\"\n setAttribute(self, 'size', size, log, operation)\n\n @attributeSetter\n def width(self, value):\n \"\"\"int or float.\n Width of the Rectangle (in its respective units, if specified).\n\n :ref:`Operations <attrib-operations>` supported.\n \"\"\"\n self.__dict__['width'] = float(value)\n self.size = (self.__dict__['width'], self.size[1])\n\n def setWidth(self, width, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n \"\"\"\n setAttribute(self, 'width', width, log, operation)\n\n @attributeSetter\n def height(self, value):\n \"\"\"int or float.\n Height of the Rectangle (in its respective units, if specified).\n\n :ref:`Operations <attrib-operations>` supported.\n \"\"\"\n self.__dict__['height'] = float(value)\n self.size = (self.size[0], self.__dict__['height'])\n\n def setHeight(self, height, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n \"\"\"\n setAttribute(self, 'height', height, log, operation)\n", "path": "psychopy/visual/rect.py"}], "after_files": [{"content": "#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\n\"\"\"Creates a rectangle of given width and height as a special case of a\n:class:`~psychopy.visual.ShapeStim`\"\"\"\n\n# Part of the PsychoPy library\n# Copyright (C) 2002-2018 Jonathan Peirce (C) 2019 Open Science Tools Ltd.\n# Distributed under the terms of the GNU General Public License (GPL).\n\nfrom __future__ import absolute_import, print_function\n\nimport numpy as np\n\nimport psychopy # so we can get the __path__\nfrom psychopy.visual.shape import BaseShapeStim\nfrom psychopy.tools.attributetools import attributeSetter, setAttribute\n\n\nclass Rect(BaseShapeStim):\n \"\"\"Creates a rectangle of given width and height as a special case of a\n :class:`~psychopy.visual.ShapeStim`\n\n (New in version 1.72.00)\n\n Attributes\n ----------\n width, height : float or int\n The width and height of the rectangle. Values are aliased with fields\n in the `size` attribute. 
Use these values to adjust the size of the\n rectangle in a single dimension after initialization.\n\n \"\"\"\n def __init__(self,\n win,\n width=.5,\n height=.5,\n autoLog=None,\n units='',\n lineWidth=1.5,\n lineColor='white',\n lineColorSpace='rgb',\n fillColor=None,\n fillColorSpace='rgb',\n pos=(0, 0),\n size=None,\n ori=0.0,\n opacity=1.0,\n contrast=1.0,\n depth=0,\n interpolate=True,\n name=None,\n autoDraw=False):\n \"\"\"\n Parameters\n ----------\n win : `~psychopy.visual.Window`\n Window object to be associated with this stimuli.\n width, height : float or int\n The width and height of the rectangle. *DEPRECATED* use `size`\n to define the dimensions of the rectangle on initialization. If\n `size` is specified the values of `width` and `height` are\n ignored. This is to provide legacy compatibility for existing\n applications.\n size : array_like, float or int\n Width and height of the rectangle as (w, h) or [w, h]. If a single\n value is provided, the width and height will be set to the same\n specified value. If `None` is specified, the `size` will be set\n with values passed to `width` and `height`.\n\n \"\"\"\n # width and height attributes, these are later aliased with `size`\n self.__dict__['width'] = float(width)\n self.__dict__['height'] = float(height)\n\n # If the size argument was specified, override values of width and\n # height, this is to maintain legacy compatibility. Args width and\n # height should be deprecated in later releases.\n if size is None:\n size = (self.__dict__['width'],\n self.__dict__['height'])\n\n # vertices for rectangle, CCW winding order\n vertices = np.array([[-.5, .5],\n [ .5, .5],\n [ .5, -.5],\n [-.5, -.5]])\n\n super(Rect, self).__init__(\n win,\n units=units,\n lineWidth=lineWidth,\n lineColor=lineColor,\n lineColorSpace=lineColorSpace,\n fillColor=fillColor,\n fillColorSpace=fillColorSpace,\n vertices=vertices,\n closeShape=True,\n pos=pos,\n size=size,\n ori=ori,\n opacity=opacity,\n contrast=contrast,\n depth=depth,\n interpolate=interpolate,\n name=name,\n autoLog=autoLog,\n autoDraw=autoDraw)\n\n @attributeSetter\n def size(self, value):\n \"\"\"array-like.\n Size of the rectangle (`width` and `height`).\n \"\"\"\n # Needed to override `size` to ensure `width` and `height` attrs\n # are updated when it changes.\n self.__dict__['size'] = np.array(value, float)\n\n width, height = self.__dict__['size']\n self.__dict__['width'] = width\n self.__dict__['height'] = height\n\n self._needVertexUpdate = True\n\n def setSize(self, size, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n\n :ref:`Operations <attrib-operations>` supported.\n \"\"\"\n setAttribute(self, 'size', size, log, operation)\n\n @attributeSetter\n def width(self, value):\n \"\"\"int or float.\n Width of the Rectangle (in its respective units, if specified).\n\n :ref:`Operations <attrib-operations>` supported.\n \"\"\"\n self.__dict__['width'] = float(value)\n self.size = (self.__dict__['width'], self.size[1])\n\n def setWidth(self, width, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n \"\"\"\n setAttribute(self, 'width', width, log, operation)\n\n @attributeSetter\n def height(self, value):\n \"\"\"int or float.\n Height of the Rectangle (in its respective units, if specified).\n\n :ref:`Operations <attrib-operations>` supported.\n \"\"\"\n 
self.__dict__['height'] = float(value)\n self.size = (self.size[0], self.__dict__['height'])\n\n def setHeight(self, height, operation='', log=None):\n \"\"\"Usually you can use 'stim.attribute = value' syntax instead,\n but use this method if you need to suppress the log message\n \"\"\"\n setAttribute(self, 'height', height, log, operation)\n", "path": "psychopy/visual/rect.py"}]} | 2,008 | 132 |
gh_patches_debug_22721 | rasdani/github-patches | git_diff | cookiecutter__cookiecutter-825 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
.cookiecutterrc and cookiecutters_dir not working as I expected
Hi,
Here's the setup:
- I have a ~/.cookiecutterrc as talked about here: http://cookiecutter.readthedocs.org/en/latest/advanced_usage.html
- I also have a cookiecutters_dir at ~/.cookiecutters/ with the subdirectory cookiecutter-pypackage/.
When I try to run "cookiecutter cookiecutter-pypackage/" in ~/Projects/, I get the following error
```
Traceback (most recent call last):
File "/opt/anaconda/bin/cookiecutter", line 9, in <module>
load_entry_point('cookiecutter==0.9.0', 'console_scripts', 'cookiecutter')()
File "/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 610, in __call__
return self.main(*args, **kwargs)
File "/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 590, in main
rv = self.invoke(ctx)
File "/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 782, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py", line 416, in invoke
return callback(*args, **kwargs)
File "/Users/alexp/Projects/cookiecutter/cookiecutter/cli.py", line 70, in main
cookiecutter(template, checkout, no_input)
File "/Users/alexp/Projects/cookiecutter/cookiecutter/main.py", line 95, in cookiecutter
extra_context=extra_context,
File "/Users/alexp/Projects/cookiecutter/cookiecutter/generate.py", line 43, in generate_context
file_handle = open(context_file)
IOError: [Errno 2] No such file or directory: u'cookiecutter-pypackage/cookiecutter.json'
```
This error shows up whether I do a pip install or work with the git repo locally. Naturally, it makes a bit of sense: there is no directory ~/Projects/cookiecutter-pypackage/.
However, and perhaps I'm making a poor assumption about usage, it seems to me if I clone or otherwise create a cookiecutter and it's sitting in cookiecutters_dir, it'd be nice to just refer to it as I did above. For my usage, if I create a cookiecutter, I don't particularly want it sitting around a Projects directory, especially if I have multiple project directories for personal and organizational purposes.
In order to do this, I added three lines to main.py in my fork (see below) and it seems to work. I did it as an `elif` to try to preserve the possibility of a lack of a cookiecutters_dir. I have not written a test for this and admittedly I don't really know how. I will likely just use my fork with this modification going forward, but I wanted to let the developer crew know about this.
Cheers.
```
# cookiecutter.main
...
# TODO: find a better way to tell if it's a repo URL
if 'git@' in template or 'https://' in template:
repo_dir = clone(
repo_url=template,
checkout=checkout,
clone_to_dir=config_dict['cookiecutters_dir'],
no_input=no_input
)
#### Added these three lines
elif 'cookiecutters_dir' in config_dict:
cookiecutters_dir = os.path.expanduser(config_dict['cookiecutters_dir'])
repo_dir = os.path.join(cookiecutters_dir,template)
else:
# If it's a local repo, no need to clone or copy to your
# cookiecutters_dir
repo_dir = template
```
--- END ISSUE ---
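The reporter's `elif` above always prefers `cookiecutters_dir` when it is configured; an alternative sketch that tries both locations, which is the shape the accepted patch at the end of this record takes, would be:
```python
# Try the template path as given, then relative to the cookiecutters_dir
# (clone_to_dir); use whichever candidate contains a cookiecutter.json.
repository_candidates = [
    template,
    os.path.join(clone_to_dir, template),
]
for repo_candidate in repository_candidates:
    if repository_has_cookiecutter_json(repo_candidate):
        return repo_candidate
```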
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cookiecutter/repository.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 """Cookiecutter repository functions."""
4 from __future__ import unicode_literals
5 import os
6 import re
7
8 from .exceptions import RepositoryNotFound
9 from .vcs import clone
10
11 REPO_REGEX = re.compile(r"""
12 (?x)
13 ((((git|hg)\+)?(git|ssh|https?):(//)?) # something like git:// ssh:// etc.
14 | # or
15 (\w+@[\w\.]+) # something like user@...
16 )
17 """)
18
19
20 def is_repo_url(value):
21 """Return True if value is a repository URL."""
22 return bool(REPO_REGEX.match(value))
23
24
25 def expand_abbreviations(template, abbreviations):
26 """
27 Expand abbreviations in a template name.
28
29 :param template: The project template name.
30 :param abbreviations: Abbreviation definitions.
31 """
32 if template in abbreviations:
33 return abbreviations[template]
34
35 # Split on colon. If there is no colon, rest will be empty
36 # and prefix will be the whole template
37 prefix, sep, rest = template.partition(':')
38 if prefix in abbreviations:
39 return abbreviations[prefix].format(rest)
40
41 return template
42
43
44 def repository_has_cookiecutter_json(repo_directory):
45 """Determines if `repo_directory` contains a `cookiecutter.json` file.
46
47 :param repo_directory: The candidate repository directory.
48 :return: True if the `repo_directory` is valid, else False.
49 """
50 repo_directory_exists = os.path.isdir(repo_directory)
51
52 repo_config_exists = os.path.isfile(
53 os.path.join(repo_directory, 'cookiecutter.json')
54 )
55 return repo_directory_exists and repo_config_exists
56
57
58 def determine_repo_dir(template, abbreviations, clone_to_dir, checkout,
59 no_input):
60 """
61 Locate the repository directory from a template reference.
62
63 Applies repository abbreviations to the template reference.
64 If the template refers to a repository URL, clone it.
65 If the template is a path to a local repository, use it.
66
67 :param template: A directory containing a project template directory,
68 or a URL to a git repository.
69 :param abbreviations: A dictionary of repository abbreviation
70 definitions.
71 :param clone_to_dir: The directory to clone the repository into.
72 :param checkout: The branch, tag or commit ID to checkout after clone.
73 :param no_input: Prompt the user at command line for manual configuration?
74 :return: The cookiecutter template directory
75 :raises: `RepositoryNotFound` if a repository directory could not be found.
76 """
77 template = expand_abbreviations(template, abbreviations)
78
79 if is_repo_url(template):
80 repo_dir = clone(
81 repo_url=template,
82 checkout=checkout,
83 clone_to_dir=clone_to_dir,
84 no_input=no_input,
85 )
86 else:
87 # If it's a local repo, no need to clone or copy to your
88 # cookiecutters_dir
89 repo_dir = template
90
91 if repository_has_cookiecutter_json(repo_dir):
92 return repo_dir
93
94 raise RepositoryNotFound(
95 'The repository {} could not be located or does not contain '
96 'a "cookiecutter.json" file.'.format(repo_dir)
97 )
98
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cookiecutter/repository.py b/cookiecutter/repository.py
--- a/cookiecutter/repository.py
+++ b/cookiecutter/repository.py
@@ -77,21 +77,27 @@
template = expand_abbreviations(template, abbreviations)
if is_repo_url(template):
- repo_dir = clone(
+ cloned_repo = clone(
repo_url=template,
checkout=checkout,
clone_to_dir=clone_to_dir,
no_input=no_input,
)
+ repository_candidates = [cloned_repo]
else:
- # If it's a local repo, no need to clone or copy to your
- # cookiecutters_dir
- repo_dir = template
+ repository_candidates = [
+ template,
+ os.path.join(clone_to_dir, template)
+ ]
- if repository_has_cookiecutter_json(repo_dir):
- return repo_dir
+ for repo_candidate in repository_candidates:
+ if repository_has_cookiecutter_json(repo_candidate):
+ return repo_candidate
raise RepositoryNotFound(
- 'The repository {} could not be located or does not contain '
- 'a "cookiecutter.json" file.'.format(repo_dir)
+ 'A valid repository for "{}" could not be found in the following '
+ 'locations:\n{}'.format(
+ template,
+ '\n'.join(repository_candidates)
+ )
)
| {"golden_diff": "diff --git a/cookiecutter/repository.py b/cookiecutter/repository.py\n--- a/cookiecutter/repository.py\n+++ b/cookiecutter/repository.py\n@@ -77,21 +77,27 @@\n template = expand_abbreviations(template, abbreviations)\n \n if is_repo_url(template):\n- repo_dir = clone(\n+ cloned_repo = clone(\n repo_url=template,\n checkout=checkout,\n clone_to_dir=clone_to_dir,\n no_input=no_input,\n )\n+ repository_candidates = [cloned_repo]\n else:\n- # If it's a local repo, no need to clone or copy to your\n- # cookiecutters_dir\n- repo_dir = template\n+ repository_candidates = [\n+ template,\n+ os.path.join(clone_to_dir, template)\n+ ]\n \n- if repository_has_cookiecutter_json(repo_dir):\n- return repo_dir\n+ for repo_candidate in repository_candidates:\n+ if repository_has_cookiecutter_json(repo_candidate):\n+ return repo_candidate\n \n raise RepositoryNotFound(\n- 'The repository {} could not be located or does not contain '\n- 'a \"cookiecutter.json\" file.'.format(repo_dir)\n+ 'A valid repository for \"{}\" could not be found in the following '\n+ 'locations:\\n{}'.format(\n+ template,\n+ '\\n'.join(repository_candidates)\n+ )\n )\n", "issue": ".cookiecutterrc and cookiecutters_dir not working as I expected\nHi,\nHere's the setup:\n- I have a ~/.cookiecutterrc as talked about here: http://cookiecutter.readthedocs.org/en/latest/advanced_usage.html\n- I also have a cookiecutters_dir at ~/.cookiecutters/ with the subdirectory cookiecutter-pypackage/.\n\nWhen I try to run \"cookiecutter cookiecutter-pypackage/\" in ~/Projects/, I get the following error\n\n```\nTraceback (most recent call last):\n File \"/opt/anaconda/bin/cookiecutter\", line 9, in <module>\n load_entry_point('cookiecutter==0.9.0', 'console_scripts', 'cookiecutter')()\n File \"/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py\", line 610, in __call__\n return self.main(*args, **kwargs)\n File \"/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py\", line 590, in main\n rv = self.invoke(ctx)\n File \"/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py\", line 782, in invoke\n return ctx.invoke(self.callback, **ctx.params)\n File \"/opt/anaconda/lib/python2.7/site-packages/click-3.3-py2.7.egg/click/core.py\", line 416, in invoke\n return callback(*args, **kwargs)\n File \"/Users/alexp/Projects/cookiecutter/cookiecutter/cli.py\", line 70, in main\n cookiecutter(template, checkout, no_input)\n File \"/Users/alexp/Projects/cookiecutter/cookiecutter/main.py\", line 95, in cookiecutter\n extra_context=extra_context,\n File \"/Users/alexp/Projects/cookiecutter/cookiecutter/generate.py\", line 43, in generate_context\n file_handle = open(context_file)\nIOError: [Errno 2] No such file or directory: u'cookiecutter-pypackage/cookiecutter.json'\n```\n\nThis error shows up if I either do pip install or with the git repo locally. Naturally, it makes a bit of sense. There is no directory ~/Projects/cookiecutter-pypackage/. \n\nHowever, and perhaps I'm making a poor assumption about usage, it seems to me if I clone or otherwise create a cookiecutter and it's sitting in cookiecutters_dir, it'd be nice to just refer to it as I did above. For my usage, if I create a cookiecutter, I don't particularly want it sitting around a Projects directory, especially if I have multiple project directories for personal and organizational purposes.\n\nIn order to do this, I added three lines to main.py in my fork (see below) and it seems to work. 
I did it as an `elif` to try to preserve the possibility of a lack of a cookiecutters_dir. I have not written a test for this and admittedly I don't really know how. I will likely just use my fork with this modificationgoing forward but I wanted to let the developer crew know about this. \n\nCheers.\n\n```\n# cookiecutter.main\n...\n# TODO: find a better way to tell if it's a repo URL\nif 'git@' in template or 'https://' in template:\n repo_dir = clone(\n repo_url=template,\n checkout=checkout,\n clone_to_dir=config_dict['cookiecutters_dir'],\n no_input=no_input\n )\n#### Added these three lines\nelif 'cookiecutters_dir' in config_dict:\n cookiecutters_dir = os.path.expanduser(config_dict['cookiecutters_dir'])\n repo_dir = os.path.join(cookiecutters_dir,template)\nelse:\n # If it's a local repo, no need to clone or copy to your\n # cookiecutters_dir\n repo_dir = template\n```\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n\"\"\"Cookiecutter repository functions.\"\"\"\nfrom __future__ import unicode_literals\nimport os\nimport re\n\nfrom .exceptions import RepositoryNotFound\nfrom .vcs import clone\n\nREPO_REGEX = re.compile(r\"\"\"\n(?x)\n((((git|hg)\\+)?(git|ssh|https?):(//)?) # something like git:// ssh:// etc.\n | # or\n (\\w+@[\\w\\.]+) # something like user@...\n)\n\"\"\")\n\n\ndef is_repo_url(value):\n \"\"\"Return True if value is a repository URL.\"\"\"\n return bool(REPO_REGEX.match(value))\n\n\ndef expand_abbreviations(template, abbreviations):\n \"\"\"\n Expand abbreviations in a template name.\n\n :param template: The project template name.\n :param abbreviations: Abbreviation definitions.\n \"\"\"\n if template in abbreviations:\n return abbreviations[template]\n\n # Split on colon. If there is no colon, rest will be empty\n # and prefix will be the whole template\n prefix, sep, rest = template.partition(':')\n if prefix in abbreviations:\n return abbreviations[prefix].format(rest)\n\n return template\n\n\ndef repository_has_cookiecutter_json(repo_directory):\n \"\"\"Determines if `repo_directory` contains a `cookiecutter.json` file.\n\n :param repo_directory: The candidate repository directory.\n :return: True if the `repo_directory` is valid, else False.\n \"\"\"\n repo_directory_exists = os.path.isdir(repo_directory)\n\n repo_config_exists = os.path.isfile(\n os.path.join(repo_directory, 'cookiecutter.json')\n )\n return repo_directory_exists and repo_config_exists\n\n\ndef determine_repo_dir(template, abbreviations, clone_to_dir, checkout,\n no_input):\n \"\"\"\n Locate the repository directory from a template reference.\n\n Applies repository abbreviations to the template reference.\n If the template refers to a repository URL, clone it.\n If the template is a path to a local repository, use it.\n\n :param template: A directory containing a project template directory,\n or a URL to a git repository.\n :param abbreviations: A dictionary of repository abbreviation\n definitions.\n :param clone_to_dir: The directory to clone the repository into.\n :param checkout: The branch, tag or commit ID to checkout after clone.\n :param no_input: Prompt the user at command line for manual configuration?\n :return: The cookiecutter template directory\n :raises: `RepositoryNotFound` if a repository directory could not be found.\n \"\"\"\n template = expand_abbreviations(template, abbreviations)\n\n if is_repo_url(template):\n repo_dir = clone(\n repo_url=template,\n checkout=checkout,\n clone_to_dir=clone_to_dir,\n no_input=no_input,\n )\n else:\n # If it's a local repo, 
no need to clone or copy to your\n # cookiecutters_dir\n repo_dir = template\n\n if repository_has_cookiecutter_json(repo_dir):\n return repo_dir\n\n raise RepositoryNotFound(\n 'The repository {} could not be located or does not contain '\n 'a \"cookiecutter.json\" file.'.format(repo_dir)\n )\n", "path": "cookiecutter/repository.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n\"\"\"Cookiecutter repository functions.\"\"\"\nfrom __future__ import unicode_literals\nimport os\nimport re\n\nfrom .exceptions import RepositoryNotFound\nfrom .vcs import clone\n\nREPO_REGEX = re.compile(r\"\"\"\n(?x)\n((((git|hg)\\+)?(git|ssh|https?):(//)?) # something like git:// ssh:// etc.\n | # or\n (\\w+@[\\w\\.]+) # something like user@...\n)\n\"\"\")\n\n\ndef is_repo_url(value):\n \"\"\"Return True if value is a repository URL.\"\"\"\n return bool(REPO_REGEX.match(value))\n\n\ndef expand_abbreviations(template, abbreviations):\n \"\"\"\n Expand abbreviations in a template name.\n\n :param template: The project template name.\n :param abbreviations: Abbreviation definitions.\n \"\"\"\n if template in abbreviations:\n return abbreviations[template]\n\n # Split on colon. If there is no colon, rest will be empty\n # and prefix will be the whole template\n prefix, sep, rest = template.partition(':')\n if prefix in abbreviations:\n return abbreviations[prefix].format(rest)\n\n return template\n\n\ndef repository_has_cookiecutter_json(repo_directory):\n \"\"\"Determines if `repo_directory` contains a `cookiecutter.json` file.\n\n :param repo_directory: The candidate repository directory.\n :return: True if the `repo_directory` is valid, else False.\n \"\"\"\n repo_directory_exists = os.path.isdir(repo_directory)\n\n repo_config_exists = os.path.isfile(\n os.path.join(repo_directory, 'cookiecutter.json')\n )\n return repo_directory_exists and repo_config_exists\n\n\ndef determine_repo_dir(template, abbreviations, clone_to_dir, checkout,\n no_input):\n \"\"\"\n Locate the repository directory from a template reference.\n\n Applies repository abbreviations to the template reference.\n If the template refers to a repository URL, clone it.\n If the template is a path to a local repository, use it.\n\n :param template: A directory containing a project template directory,\n or a URL to a git repository.\n :param abbreviations: A dictionary of repository abbreviation\n definitions.\n :param clone_to_dir: The directory to clone the repository into.\n :param checkout: The branch, tag or commit ID to checkout after clone.\n :param no_input: Prompt the user at command line for manual configuration?\n :return: The cookiecutter template directory\n :raises: `RepositoryNotFound` if a repository directory could not be found.\n \"\"\"\n template = expand_abbreviations(template, abbreviations)\n\n if is_repo_url(template):\n cloned_repo = clone(\n repo_url=template,\n checkout=checkout,\n clone_to_dir=clone_to_dir,\n no_input=no_input,\n )\n repository_candidates = [cloned_repo]\n else:\n repository_candidates = [\n template,\n os.path.join(clone_to_dir, template)\n ]\n\n for repo_candidate in repository_candidates:\n if repository_has_cookiecutter_json(repo_candidate):\n return repo_candidate\n\n raise RepositoryNotFound(\n 'A valid repository for \"{}\" could not be found in the following '\n 'locations:\\n{}'.format(\n template,\n '\\n'.join(repository_candidates)\n )\n )\n", "path": "cookiecutter/repository.py"}]} | 2,041 | 309 |
gh_patches_debug_12348 | rasdani/github-patches | git_diff | pyodide__pyodide-74 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Improve parsing of result line
The parsing of the input Python to find the last line which will be evaluated (rather than executed) to provide the result is probably a little brittle in certain corner cases. We should look at what IPython does here and copy that.
--- END ISSUE ---
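One robust approach, and roughly what the accepted patch at the end of this record implements, is to split off the trailing expression with the `ast` module instead of scanning text:
```python
import ast

def eval_code(code, ns):
    # Parse the whole input once; if the final node is an expression,
    # split it off so its value can be returned after executing the rest.
    mod = ast.parse(code)
    expr = None
    if mod.body and isinstance(mod.body[-1], ast.Expr):
        expr = ast.Expression(mod.body[-1].value)
        del mod.body[-1]
    if mod.body:
        exec(compile(mod, '<exec>', mode='exec'), ns, ns)
    if expr is not None:
        return eval(compile(expr, '<eval>', mode='eval'), ns, ns)
```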
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pyodide.py`
Content:
```
1 """
2 A library of helper utilities for connecting Python to the browser environment.
3 """
4
5 from js import XMLHttpRequest
6
7 import io
8
9
10 def open_url(url):
11 """
12 Fetches a given *url* and returns a io.StringIO to access its contents.
13 """
14 req = XMLHttpRequest.new()
15 req.open('GET', url, False)
16 req.send(None)
17 return io.StringIO(req.response)
18
19
20 __all__ = ['open_url']
21
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pyodide.py b/src/pyodide.py
--- a/src/pyodide.py
+++ b/src/pyodide.py
@@ -4,6 +4,7 @@
from js import XMLHttpRequest
+import ast
import io
@@ -17,4 +18,23 @@
return io.StringIO(req.response)
-__all__ = ['open_url']
+def eval_code(code, ns):
+ """
+ Runs a string of code, the last part of which may be an expression.
+ """
+ mod = ast.parse(code)
+ if isinstance(mod.body[-1], ast.Expr):
+ expr = ast.Expression(mod.body[-1].value)
+ del mod.body[-1]
+ else:
+ expr = None
+
+ if len(mod.body):
+ exec(compile(mod, '<exec>', mode='exec'), ns, ns)
+ if expr is not None:
+ return eval(compile(expr, '<eval>', mode='eval'), ns, ns)
+ else:
+ return None
+
+
+__all__ = ['open_url', 'eval_code']
| {"golden_diff": "diff --git a/src/pyodide.py b/src/pyodide.py\n--- a/src/pyodide.py\n+++ b/src/pyodide.py\n@@ -4,6 +4,7 @@\n \n from js import XMLHttpRequest\n \n+import ast\n import io\n \n \n@@ -17,4 +18,23 @@\n return io.StringIO(req.response)\n \n \n-__all__ = ['open_url']\n+def eval_code(code, ns):\n+ \"\"\"\n+ Runs a string of code, the last part of which may be an expression.\n+ \"\"\"\n+ mod = ast.parse(code)\n+ if isinstance(mod.body[-1], ast.Expr):\n+ expr = ast.Expression(mod.body[-1].value)\n+ del mod.body[-1]\n+ else:\n+ expr = None\n+\n+ if len(mod.body):\n+ exec(compile(mod, '<exec>', mode='exec'), ns, ns)\n+ if expr is not None:\n+ return eval(compile(expr, '<eval>', mode='eval'), ns, ns)\n+ else:\n+ return None\n+\n+\n+__all__ = ['open_url', 'eval_code']\n", "issue": "Improve parsing of result line\nThe parsing of the input Python to find the last line which will be evaluated (rather than executed) to provide the result is probably a little brittle in certain corner cases. We should look at what IPython does here and copy that.\n", "before_files": [{"content": "\"\"\"\nA library of helper utilities for connecting Python to the browser environment.\n\"\"\"\n\nfrom js import XMLHttpRequest\n\nimport io\n\n\ndef open_url(url):\n \"\"\"\n Fetches a given *url* and returns a io.StringIO to access its contents.\n \"\"\"\n req = XMLHttpRequest.new()\n req.open('GET', url, False)\n req.send(None)\n return io.StringIO(req.response)\n\n\n__all__ = ['open_url']\n", "path": "src/pyodide.py"}], "after_files": [{"content": "\"\"\"\nA library of helper utilities for connecting Python to the browser environment.\n\"\"\"\n\nfrom js import XMLHttpRequest\n\nimport ast\nimport io\n\n\ndef open_url(url):\n \"\"\"\n Fetches a given *url* and returns a io.StringIO to access its contents.\n \"\"\"\n req = XMLHttpRequest.new()\n req.open('GET', url, False)\n req.send(None)\n return io.StringIO(req.response)\n\n\ndef eval_code(code, ns):\n \"\"\"\n Runs a string of code, the last part of which may be an expression.\n \"\"\"\n mod = ast.parse(code)\n if isinstance(mod.body[-1], ast.Expr):\n expr = ast.Expression(mod.body[-1].value)\n del mod.body[-1]\n else:\n expr = None\n\n if len(mod.body):\n exec(compile(mod, '<exec>', mode='exec'), ns, ns)\n if expr is not None:\n return eval(compile(expr, '<eval>', mode='eval'), ns, ns)\n else:\n return None\n\n\n__all__ = ['open_url', 'eval_code']\n", "path": "src/pyodide.py"}]} | 441 | 247 |
gh_patches_debug_30993 | rasdani/github-patches | git_diff | PaddlePaddle__PaddleSpeech-1609 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[vec][search] update to paddlespeech model
--- END ISSUE ---
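The golden diff at the end of this record replaces the random placeholder embedding with PaddleSpeech's CLI executor; the essential change can be sketched as follows (the 192-dim note reflects the accompanying config change):
```python
# Sketch of the replacement embedding path (see the diff below).
import numpy as np
from paddlespeech.cli import VectorExecutor

vector_executor = VectorExecutor()

def get_audio_embedding(path):
    embedding = vector_executor(audio_file=path)  # speaker embedding (192-dim)
    embedding = embedding / np.linalg.norm(embedding)  # L2-normalise for Milvus
    return embedding.tolist()
```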
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `demos/audio_searching/src/config.py`
Content:
```
1 # Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import os
15
16 ############### Milvus Configuration ###############
17 MILVUS_HOST = os.getenv("MILVUS_HOST", "127.0.0.1")
18 MILVUS_PORT = int(os.getenv("MILVUS_PORT", "19530"))
19 VECTOR_DIMENSION = int(os.getenv("VECTOR_DIMENSION", "2048"))
20 INDEX_FILE_SIZE = int(os.getenv("INDEX_FILE_SIZE", "1024"))
21 METRIC_TYPE = os.getenv("METRIC_TYPE", "L2")
22 DEFAULT_TABLE = os.getenv("DEFAULT_TABLE", "audio_table")
23 TOP_K = int(os.getenv("TOP_K", "10"))
24
25 ############### MySQL Configuration ###############
26 MYSQL_HOST = os.getenv("MYSQL_HOST", "127.0.0.1")
27 MYSQL_PORT = int(os.getenv("MYSQL_PORT", "3306"))
28 MYSQL_USER = os.getenv("MYSQL_USER", "root")
29 MYSQL_PWD = os.getenv("MYSQL_PWD", "123456")
30 MYSQL_DB = os.getenv("MYSQL_DB", "mysql")
31
32 ############### Data Path ###############
33 UPLOAD_PATH = os.getenv("UPLOAD_PATH", "tmp/audio-data")
34
35 ############### Number of Log Files ###############
36 LOGS_NUM = int(os.getenv("logs_num", "0"))
37
```
Path: `demos/audio_searching/src/encode.py`
Content:
```
1 # Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 import os
15
16 import librosa
17 import numpy as np
18 from logs import LOGGER
19
20
21 def get_audio_embedding(path):
22 """
23 Use vpr_inference to generate embedding of audio
24 """
25 try:
26 RESAMPLE_RATE = 16000
27 audio, _ = librosa.load(path, sr=RESAMPLE_RATE, mono=True)
28
29 # TODO add infer/python interface to get embedding, now fake it by rand
30 # vpr = ECAPATDNN(checkpoint_path=None, device='cuda')
31 # embedding = vpr.inference(audio)
32 np.random.seed(hash(os.path.basename(path)) % 1000000)
33 embedding = np.random.rand(1, 2048)
34 embedding = embedding / np.linalg.norm(embedding)
35 embedding = embedding.tolist()[0]
36 return embedding
37 except Exception as e:
38 LOGGER.error(f"Error with embedding:{e}")
39 return None
40
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/demos/audio_searching/src/config.py b/demos/audio_searching/src/config.py
--- a/demos/audio_searching/src/config.py
+++ b/demos/audio_searching/src/config.py
@@ -16,7 +16,7 @@
############### Milvus Configuration ###############
MILVUS_HOST = os.getenv("MILVUS_HOST", "127.0.0.1")
MILVUS_PORT = int(os.getenv("MILVUS_PORT", "19530"))
-VECTOR_DIMENSION = int(os.getenv("VECTOR_DIMENSION", "2048"))
+VECTOR_DIMENSION = int(os.getenv("VECTOR_DIMENSION", "192"))
INDEX_FILE_SIZE = int(os.getenv("INDEX_FILE_SIZE", "1024"))
METRIC_TYPE = os.getenv("METRIC_TYPE", "L2")
DEFAULT_TABLE = os.getenv("DEFAULT_TABLE", "audio_table")
diff --git a/demos/audio_searching/src/encode.py b/demos/audio_searching/src/encode.py
--- a/demos/audio_searching/src/encode.py
+++ b/demos/audio_searching/src/encode.py
@@ -15,7 +15,12 @@
import librosa
import numpy as np
+from config import DEFAULT_TABLE
+
from logs import LOGGER
+from paddlespeech.cli import VectorExecutor
+
+vector_executor = VectorExecutor()
def get_audio_embedding(path):
@@ -23,16 +28,9 @@
Use vpr_inference to generate embedding of audio
"""
try:
- RESAMPLE_RATE = 16000
- audio, _ = librosa.load(path, sr=RESAMPLE_RATE, mono=True)
-
- # TODO add infer/python interface to get embedding, now fake it by rand
- # vpr = ECAPATDNN(checkpoint_path=None, device='cuda')
- # embedding = vpr.inference(audio)
- np.random.seed(hash(os.path.basename(path)) % 1000000)
- embedding = np.random.rand(1, 2048)
+ embedding = vector_executor(audio_file=path)
embedding = embedding / np.linalg.norm(embedding)
- embedding = embedding.tolist()[0]
+ embedding = embedding.tolist()
return embedding
except Exception as e:
LOGGER.error(f"Error with embedding:{e}")
| {"golden_diff": "diff --git a/demos/audio_searching/src/config.py b/demos/audio_searching/src/config.py\n--- a/demos/audio_searching/src/config.py\n+++ b/demos/audio_searching/src/config.py\n@@ -16,7 +16,7 @@\n ############### Milvus Configuration ###############\n MILVUS_HOST = os.getenv(\"MILVUS_HOST\", \"127.0.0.1\")\n MILVUS_PORT = int(os.getenv(\"MILVUS_PORT\", \"19530\"))\n-VECTOR_DIMENSION = int(os.getenv(\"VECTOR_DIMENSION\", \"2048\"))\n+VECTOR_DIMENSION = int(os.getenv(\"VECTOR_DIMENSION\", \"192\"))\n INDEX_FILE_SIZE = int(os.getenv(\"INDEX_FILE_SIZE\", \"1024\"))\n METRIC_TYPE = os.getenv(\"METRIC_TYPE\", \"L2\")\n DEFAULT_TABLE = os.getenv(\"DEFAULT_TABLE\", \"audio_table\")\ndiff --git a/demos/audio_searching/src/encode.py b/demos/audio_searching/src/encode.py\n--- a/demos/audio_searching/src/encode.py\n+++ b/demos/audio_searching/src/encode.py\n@@ -15,7 +15,12 @@\n \n import librosa\n import numpy as np\n+from config import DEFAULT_TABLE\n+\n from logs import LOGGER\n+from paddlespeech.cli import VectorExecutor\n+\n+vector_executor = VectorExecutor()\n \n \n def get_audio_embedding(path):\n@@ -23,16 +28,9 @@\n Use vpr_inference to generate embedding of audio\n \"\"\"\n try:\n- RESAMPLE_RATE = 16000\n- audio, _ = librosa.load(path, sr=RESAMPLE_RATE, mono=True)\n-\n- # TODO add infer/python interface to get embedding, now fake it by rand\n- # vpr = ECAPATDNN(checkpoint_path=None, device='cuda')\n- # embedding = vpr.inference(audio)\n- np.random.seed(hash(os.path.basename(path)) % 1000000)\n- embedding = np.random.rand(1, 2048)\n+ embedding = vector_executor(audio_file=path)\n embedding = embedding / np.linalg.norm(embedding)\n- embedding = embedding.tolist()[0]\n+ embedding = embedding.tolist()\n return embedding\n except Exception as e:\n LOGGER.error(f\"Error with embedding:{e}\")\n", "issue": "[vec][search] update to paddlespeech model\n\n", "before_files": [{"content": "# Copyright (c) 2022 PaddlePaddle Authors. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\n############### Milvus Configuration ###############\nMILVUS_HOST = os.getenv(\"MILVUS_HOST\", \"127.0.0.1\")\nMILVUS_PORT = int(os.getenv(\"MILVUS_PORT\", \"19530\"))\nVECTOR_DIMENSION = int(os.getenv(\"VECTOR_DIMENSION\", \"2048\"))\nINDEX_FILE_SIZE = int(os.getenv(\"INDEX_FILE_SIZE\", \"1024\"))\nMETRIC_TYPE = os.getenv(\"METRIC_TYPE\", \"L2\")\nDEFAULT_TABLE = os.getenv(\"DEFAULT_TABLE\", \"audio_table\")\nTOP_K = int(os.getenv(\"TOP_K\", \"10\"))\n\n############### MySQL Configuration ###############\nMYSQL_HOST = os.getenv(\"MYSQL_HOST\", \"127.0.0.1\")\nMYSQL_PORT = int(os.getenv(\"MYSQL_PORT\", \"3306\"))\nMYSQL_USER = os.getenv(\"MYSQL_USER\", \"root\")\nMYSQL_PWD = os.getenv(\"MYSQL_PWD\", \"123456\")\nMYSQL_DB = os.getenv(\"MYSQL_DB\", \"mysql\")\n\n############### Data Path ###############\nUPLOAD_PATH = os.getenv(\"UPLOAD_PATH\", \"tmp/audio-data\")\n\n############### Number of Log Files ###############\nLOGS_NUM = int(os.getenv(\"logs_num\", \"0\"))\n", "path": "demos/audio_searching/src/config.py"}, {"content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport librosa\nimport numpy as np\nfrom logs import LOGGER\n\n\ndef get_audio_embedding(path):\n \"\"\"\n Use vpr_inference to generate embedding of audio\n \"\"\"\n try:\n RESAMPLE_RATE = 16000\n audio, _ = librosa.load(path, sr=RESAMPLE_RATE, mono=True)\n\n # TODO add infer/python interface to get embedding, now fake it by rand\n # vpr = ECAPATDNN(checkpoint_path=None, device='cuda')\n # embedding = vpr.inference(audio)\n np.random.seed(hash(os.path.basename(path)) % 1000000)\n embedding = np.random.rand(1, 2048)\n embedding = embedding / np.linalg.norm(embedding)\n embedding = embedding.tolist()[0]\n return embedding\n except Exception as e:\n LOGGER.error(f\"Error with embedding:{e}\")\n return None\n", "path": "demos/audio_searching/src/encode.py"}], "after_files": [{"content": "# Copyright (c) 2022 PaddlePaddle Authors. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\n############### Milvus Configuration ###############\nMILVUS_HOST = os.getenv(\"MILVUS_HOST\", \"127.0.0.1\")\nMILVUS_PORT = int(os.getenv(\"MILVUS_PORT\", \"19530\"))\nVECTOR_DIMENSION = int(os.getenv(\"VECTOR_DIMENSION\", \"192\"))\nINDEX_FILE_SIZE = int(os.getenv(\"INDEX_FILE_SIZE\", \"1024\"))\nMETRIC_TYPE = os.getenv(\"METRIC_TYPE\", \"L2\")\nDEFAULT_TABLE = os.getenv(\"DEFAULT_TABLE\", \"audio_table\")\nTOP_K = int(os.getenv(\"TOP_K\", \"10\"))\n\n############### MySQL Configuration ###############\nMYSQL_HOST = os.getenv(\"MYSQL_HOST\", \"127.0.0.1\")\nMYSQL_PORT = int(os.getenv(\"MYSQL_PORT\", \"3306\"))\nMYSQL_USER = os.getenv(\"MYSQL_USER\", \"root\")\nMYSQL_PWD = os.getenv(\"MYSQL_PWD\", \"123456\")\nMYSQL_DB = os.getenv(\"MYSQL_DB\", \"mysql\")\n\n############### Data Path ###############\nUPLOAD_PATH = os.getenv(\"UPLOAD_PATH\", \"tmp/audio-data\")\n\n############### Number of Log Files ###############\nLOGS_NUM = int(os.getenv(\"logs_num\", \"0\"))\n", "path": "demos/audio_searching/src/config.py"}, {"content": "# Copyright (c) 2022 PaddlePaddle Authors. All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\nimport os\n\nimport librosa\nimport numpy as np\nfrom config import DEFAULT_TABLE\n\nfrom logs import LOGGER\nfrom paddlespeech.cli import VectorExecutor\n\nvector_executor = VectorExecutor()\n\n\ndef get_audio_embedding(path):\n \"\"\"\n Use vpr_inference to generate embedding of audio\n \"\"\"\n try:\n embedding = vector_executor(audio_file=path)\n embedding = embedding / np.linalg.norm(embedding)\n embedding = embedding.tolist()\n return embedding\n except Exception as e:\n LOGGER.error(f\"Error with embedding:{e}\")\n return None\n", "path": "demos/audio_searching/src/encode.py"}]} | 1,179 | 511 |
gh_patches_debug_14 | rasdani/github-patches | git_diff | OCHA-DAP__hdx-ckan-2135 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Browse Page Map: opening a country link has different behaviors
From the map: open in new tab
From the list: open in same tab
We should make it the same: open in same tab (unless there was some specification that it should be a new tab that I'm not remembering).
Graphic on the Colombia page: instead of a line (time-series) graph, make it a bar graph.
CJ added current action for this issue:
- Change "Number of IDPs" graph **from** bar graph **to** line graph.
-----------------Original issue text follows---------------------
I think the graph **Number of people with access constraints** would look better if it were a bar graph instead of a line (time-series) graph:

The reason I think that is that lines give the impression the indicator changes significantly every month along a continuum of time. Bar graphs will help the user compare months as nearly independent measurements, which in my opinion leads to better consumption of the data.
I chatted with the Data Team about this (including @JavierTeran) and they've approved this suggestion.
--- END ISSUE ---
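For context on the bar-vs-line distinction the reporter is drawing, here is a minimal illustrative sketch. The library choice (matplotlib) and the sample numbers are assumptions for demonstration only; the actual HDX charts are rendered by the site's front end, not by this code.
```python
import matplotlib.pyplot as plt

# Hypothetical monthly IDP figures, invented purely for illustration.
months = ["Jan", "Feb", "Mar", "Apr"]
idps = [120_000, 125_000, 118_000, 130_000]

fig, (ax_line, ax_bar) = plt.subplots(1, 2, figsize=(8, 3))
# A line graph implies a continuum between measurements...
ax_line.plot(months, idps, marker="o")
ax_line.set_title("Line (time-series)")
# ...while bars read as nearly independent monthly measurements.
ax_bar.bar(months, idps)
ax_bar.set_title("Bar")
plt.show()
```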
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ckanext-hdx_theme/ckanext/hdx_theme/version.py`
Content:
```
1 hdx_version = 'v0.6.1'
2
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ckanext-hdx_theme/ckanext/hdx_theme/version.py b/ckanext-hdx_theme/ckanext/hdx_theme/version.py
--- a/ckanext-hdx_theme/ckanext/hdx_theme/version.py
+++ b/ckanext-hdx_theme/ckanext/hdx_theme/version.py
@@ -1 +1 @@
-hdx_version = 'v0.6.1'
+hdx_version = 'v0.6.2'
| {"golden_diff": "diff --git a/ckanext-hdx_theme/ckanext/hdx_theme/version.py b/ckanext-hdx_theme/ckanext/hdx_theme/version.py\n--- a/ckanext-hdx_theme/ckanext/hdx_theme/version.py\n+++ b/ckanext-hdx_theme/ckanext/hdx_theme/version.py\n@@ -1 +1 @@\n-hdx_version = 'v0.6.1'\n+hdx_version = 'v0.6.2'\n", "issue": "Browse Page Map: opening a country link has different behaviors\nFrom the map: open in new tab\nFrom the list: open in same tab\n\nWe should make it the same: open in same tab (unless there was some specification that it should be a new tab that I'm not remembering. \n\nGraphic in Colombia page: instead of line (time-series) make it a bar graph.\nCJ added current action for this issue:\n- Change \"Number of IDPs\" graph **from** bar graph **to** line graph. \n\n-----------------Original issue text follows---------------------\nI think the graph **Number of people with access constrains** would look better if it was a bar graph instead of a line, time-series: \n\n\n\nThe reason I think that is that the lines give the impression the indicator changes significantly every month, but in a continuum of time. Bar graphs will help the user compare months as nearly independent measurements, which is influences better consumption of the data in my opinion. \n\nI chatted with the Data Team about this (including @JavierTeran) and they've approved this suggestion.\n\n", "before_files": [{"content": "hdx_version = 'v0.6.1'\n", "path": "ckanext-hdx_theme/ckanext/hdx_theme/version.py"}], "after_files": [{"content": "hdx_version = 'v0.6.2'\n", "path": "ckanext-hdx_theme/ckanext/hdx_theme/version.py"}]} | 591 | 106 |
gh_patches_debug_2059 | rasdani/github-patches | git_diff | comic__grand-challenge.org-1062 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
The schema is empty for unauthorised users.
Another problem with this - the schema is empty for unauthorised users. You need to add `public=True` to `get_schema_view`.
_Originally posted by @jmsmkn in https://github.com/comic/grand-challenge.org/issues/1017#issuecomment-567254400_
--- END ISSUE ---
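The fix named in the issue maps directly onto drf-yasg's API: `get_schema_view` takes a `public` flag, and leaving it at the default `False` hides endpoints from unauthenticated users, which yields an empty schema for them. A minimal sketch of the corrected call, trimmed to the relevant arguments:
```python
from drf_yasg import openapi
from drf_yasg.views import get_schema_view
from rest_framework import permissions

schema_view = get_schema_view(
    openapi.Info(title="Example API", default_version="v1"),
    public=True,  # render the full schema even for anonymous users
    permission_classes=(permissions.AllowAny,),
)
```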
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/grandchallenge/api/urls.py`
Content:
```
1 from django.conf import settings
2 from django.conf.urls import include, url
3 from django.urls import path
4 from drf_yasg import openapi
5 from drf_yasg.views import get_schema_view
6 from rest_framework import permissions, routers
7
8 from grandchallenge.algorithms.views import (
9 AlgorithmImageViewSet,
10 AlgorithmViewSet,
11 JobViewSet,
12 ResultViewSet,
13 )
14 from grandchallenge.cases.views import (
15 ImageViewSet,
16 RawImageUploadSessionViewSet,
17 )
18 from grandchallenge.jqfileupload.views import StagedFileViewSet
19 from grandchallenge.reader_studies.views import (
20 AnswerViewSet,
21 QuestionViewSet,
22 ReaderStudyViewSet,
23 )
24 from grandchallenge.retina_api.views import LandmarkAnnotationSetViewSet
25 from grandchallenge.subdomains.utils import reverse_lazy
26 from grandchallenge.workstation_configs.views import WorkstationConfigViewSet
27 from grandchallenge.workstations.views import SessionViewSet
28
29 app_name = "api"
30
31 router = routers.DefaultRouter()
32 router.register(
33 r"cases/upload-sessions",
34 RawImageUploadSessionViewSet,
35 basename="upload-session",
36 )
37 router.register(r"cases/images", ImageViewSet, basename="image")
38 router.register(r"workstations/sessions", SessionViewSet)
39 router.register(
40 r"workstations/configs",
41 WorkstationConfigViewSet,
42 basename="workstations-config",
43 )
44 router.register(r"algorithms/jobs", JobViewSet, basename="algorithms-job")
45 router.register(
46 r"algorithms/results", ResultViewSet, basename="algorithms-result"
47 )
48 router.register(
49 r"algorithms/images", AlgorithmImageViewSet, basename="algorithms-image"
50 )
51 router.register(r"algorithms", AlgorithmViewSet, basename="algorithm")
52
53 router.register(
54 r"reader-studies/answers", AnswerViewSet, basename="reader-studies-answer"
55 )
56 router.register(
57 r"reader-studies/questions",
58 QuestionViewSet,
59 basename="reader-studies-question",
60 )
61 router.register(r"reader-studies", ReaderStudyViewSet, basename="reader-study")
62 router.register(r"chunked-uploads", StagedFileViewSet, basename="staged-file")
63
64 router.register(
65 r"retina/landmark-annotation",
66 LandmarkAnnotationSetViewSet,
67 basename="landmark-annotation",
68 )
69
70 # TODO: add terms_of_service and contact
71 schema_view = get_schema_view(
72 openapi.Info(
73 title=f"{settings.SESSION_COOKIE_DOMAIN.lstrip('.')} API",
74 default_version="v1",
75 description=f"The API for {settings.SESSION_COOKIE_DOMAIN.lstrip('.')}.",
76 license=openapi.License(name="Apache License 2.0"),
77 terms_of_service=reverse_lazy(
78 "policies:detail", kwargs={"slug": "terms-of-service"}
79 ),
80 ),
81 permission_classes=(permissions.AllowAny,),
82 patterns=[path("api/v1/", include(router.urls))],
83 )
84
85 urlpatterns = [
86 url(
87 r"^swagger(?P<format>\.json|\.yaml)$",
88 schema_view.without_ui(),
89 name="schema-json",
90 ),
91 # Do not namespace the router.urls without updating the view names in
92 # the serializers
93 path("v1/", include(router.urls)),
94 path("auth/", include("rest_framework.urls", namespace="rest_framework")),
95 path("", schema_view.with_ui("swagger"), name="schema-docs"),
96 ]
97
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/app/grandchallenge/api/urls.py b/app/grandchallenge/api/urls.py
--- a/app/grandchallenge/api/urls.py
+++ b/app/grandchallenge/api/urls.py
@@ -78,6 +78,7 @@
"policies:detail", kwargs={"slug": "terms-of-service"}
),
),
+ public=True,
permission_classes=(permissions.AllowAny,),
patterns=[path("api/v1/", include(router.urls))],
)
| {"golden_diff": "diff --git a/app/grandchallenge/api/urls.py b/app/grandchallenge/api/urls.py\n--- a/app/grandchallenge/api/urls.py\n+++ b/app/grandchallenge/api/urls.py\n@@ -78,6 +78,7 @@\n \"policies:detail\", kwargs={\"slug\": \"terms-of-service\"}\n ),\n ),\n+ public=True,\n permission_classes=(permissions.AllowAny,),\n patterns=[path(\"api/v1/\", include(router.urls))],\n )\n", "issue": "The schema is empty for unauthorised users.\nAnother problem with this - the schema is empty for unauthorised users. You need to add `public=True` to `get_schema_view`.\r\n\r\n_Originally posted by @jmsmkn in https://github.com/comic/grand-challenge.org/issues/1017#issuecomment-567254400_\n", "before_files": [{"content": "from django.conf import settings\nfrom django.conf.urls import include, url\nfrom django.urls import path\nfrom drf_yasg import openapi\nfrom drf_yasg.views import get_schema_view\nfrom rest_framework import permissions, routers\n\nfrom grandchallenge.algorithms.views import (\n AlgorithmImageViewSet,\n AlgorithmViewSet,\n JobViewSet,\n ResultViewSet,\n)\nfrom grandchallenge.cases.views import (\n ImageViewSet,\n RawImageUploadSessionViewSet,\n)\nfrom grandchallenge.jqfileupload.views import StagedFileViewSet\nfrom grandchallenge.reader_studies.views import (\n AnswerViewSet,\n QuestionViewSet,\n ReaderStudyViewSet,\n)\nfrom grandchallenge.retina_api.views import LandmarkAnnotationSetViewSet\nfrom grandchallenge.subdomains.utils import reverse_lazy\nfrom grandchallenge.workstation_configs.views import WorkstationConfigViewSet\nfrom grandchallenge.workstations.views import SessionViewSet\n\napp_name = \"api\"\n\nrouter = routers.DefaultRouter()\nrouter.register(\n r\"cases/upload-sessions\",\n RawImageUploadSessionViewSet,\n basename=\"upload-session\",\n)\nrouter.register(r\"cases/images\", ImageViewSet, basename=\"image\")\nrouter.register(r\"workstations/sessions\", SessionViewSet)\nrouter.register(\n r\"workstations/configs\",\n WorkstationConfigViewSet,\n basename=\"workstations-config\",\n)\nrouter.register(r\"algorithms/jobs\", JobViewSet, basename=\"algorithms-job\")\nrouter.register(\n r\"algorithms/results\", ResultViewSet, basename=\"algorithms-result\"\n)\nrouter.register(\n r\"algorithms/images\", AlgorithmImageViewSet, basename=\"algorithms-image\"\n)\nrouter.register(r\"algorithms\", AlgorithmViewSet, basename=\"algorithm\")\n\nrouter.register(\n r\"reader-studies/answers\", AnswerViewSet, basename=\"reader-studies-answer\"\n)\nrouter.register(\n r\"reader-studies/questions\",\n QuestionViewSet,\n basename=\"reader-studies-question\",\n)\nrouter.register(r\"reader-studies\", ReaderStudyViewSet, basename=\"reader-study\")\nrouter.register(r\"chunked-uploads\", StagedFileViewSet, basename=\"staged-file\")\n\nrouter.register(\n r\"retina/landmark-annotation\",\n LandmarkAnnotationSetViewSet,\n basename=\"landmark-annotation\",\n)\n\n# TODO: add terms_of_service and contact\nschema_view = get_schema_view(\n openapi.Info(\n title=f\"{settings.SESSION_COOKIE_DOMAIN.lstrip('.')} API\",\n default_version=\"v1\",\n description=f\"The API for {settings.SESSION_COOKIE_DOMAIN.lstrip('.')}.\",\n license=openapi.License(name=\"Apache License 2.0\"),\n terms_of_service=reverse_lazy(\n \"policies:detail\", kwargs={\"slug\": \"terms-of-service\"}\n ),\n ),\n permission_classes=(permissions.AllowAny,),\n patterns=[path(\"api/v1/\", include(router.urls))],\n)\n\nurlpatterns = [\n url(\n r\"^swagger(?P<format>\\.json|\\.yaml)$\",\n schema_view.without_ui(),\n name=\"schema-json\",\n ),\n 
# Do not namespace the router.urls without updating the view names in\n # the serializers\n path(\"v1/\", include(router.urls)),\n path(\"auth/\", include(\"rest_framework.urls\", namespace=\"rest_framework\")),\n path(\"\", schema_view.with_ui(\"swagger\"), name=\"schema-docs\"),\n]\n", "path": "app/grandchallenge/api/urls.py"}], "after_files": [{"content": "from django.conf import settings\nfrom django.conf.urls import include, url\nfrom django.urls import path\nfrom drf_yasg import openapi\nfrom drf_yasg.views import get_schema_view\nfrom rest_framework import permissions, routers\n\nfrom grandchallenge.algorithms.views import (\n AlgorithmImageViewSet,\n AlgorithmViewSet,\n JobViewSet,\n ResultViewSet,\n)\nfrom grandchallenge.cases.views import (\n ImageViewSet,\n RawImageUploadSessionViewSet,\n)\nfrom grandchallenge.jqfileupload.views import StagedFileViewSet\nfrom grandchallenge.reader_studies.views import (\n AnswerViewSet,\n QuestionViewSet,\n ReaderStudyViewSet,\n)\nfrom grandchallenge.retina_api.views import LandmarkAnnotationSetViewSet\nfrom grandchallenge.subdomains.utils import reverse_lazy\nfrom grandchallenge.workstation_configs.views import WorkstationConfigViewSet\nfrom grandchallenge.workstations.views import SessionViewSet\n\napp_name = \"api\"\n\nrouter = routers.DefaultRouter()\nrouter.register(\n r\"cases/upload-sessions\",\n RawImageUploadSessionViewSet,\n basename=\"upload-session\",\n)\nrouter.register(r\"cases/images\", ImageViewSet, basename=\"image\")\nrouter.register(r\"workstations/sessions\", SessionViewSet)\nrouter.register(\n r\"workstations/configs\",\n WorkstationConfigViewSet,\n basename=\"workstations-config\",\n)\nrouter.register(r\"algorithms/jobs\", JobViewSet, basename=\"algorithms-job\")\nrouter.register(\n r\"algorithms/results\", ResultViewSet, basename=\"algorithms-result\"\n)\nrouter.register(\n r\"algorithms/images\", AlgorithmImageViewSet, basename=\"algorithms-image\"\n)\nrouter.register(r\"algorithms\", AlgorithmViewSet, basename=\"algorithm\")\n\nrouter.register(\n r\"reader-studies/answers\", AnswerViewSet, basename=\"reader-studies-answer\"\n)\nrouter.register(\n r\"reader-studies/questions\",\n QuestionViewSet,\n basename=\"reader-studies-question\",\n)\nrouter.register(r\"reader-studies\", ReaderStudyViewSet, basename=\"reader-study\")\nrouter.register(r\"chunked-uploads\", StagedFileViewSet, basename=\"staged-file\")\n\nrouter.register(\n r\"retina/landmark-annotation\",\n LandmarkAnnotationSetViewSet,\n basename=\"landmark-annotation\",\n)\n\n# TODO: add terms_of_service and contact\nschema_view = get_schema_view(\n openapi.Info(\n title=f\"{settings.SESSION_COOKIE_DOMAIN.lstrip('.')} API\",\n default_version=\"v1\",\n description=f\"The API for {settings.SESSION_COOKIE_DOMAIN.lstrip('.')}.\",\n license=openapi.License(name=\"Apache License 2.0\"),\n terms_of_service=reverse_lazy(\n \"policies:detail\", kwargs={\"slug\": \"terms-of-service\"}\n ),\n ),\n public=True,\n permission_classes=(permissions.AllowAny,),\n patterns=[path(\"api/v1/\", include(router.urls))],\n)\n\nurlpatterns = [\n url(\n r\"^swagger(?P<format>\\.json|\\.yaml)$\",\n schema_view.without_ui(),\n name=\"schema-json\",\n ),\n # Do not namespace the router.urls without updating the view names in\n # the serializers\n path(\"v1/\", include(router.urls)),\n path(\"auth/\", include(\"rest_framework.urls\", namespace=\"rest_framework\")),\n path(\"\", schema_view.with_ui(\"swagger\"), name=\"schema-docs\"),\n]\n", "path": "app/grandchallenge/api/urls.py"}]} | 1,219 | 
104 |
gh_patches_debug_12858 | rasdani/github-patches | git_diff | streamlink__streamlink-5616 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
plugins.hls: recognize URLs with uppercase ".M3U8"
### Checklist
- [X] This is a bug report and not [a different kind of issue](https://github.com/streamlink/streamlink/issues/new/choose)
- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)
- [X] [I have checked the list of open and recently closed bug reports](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22bug%22)
- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)
### Streamlink version
streamlink 6.2.1+29.gc82a8535
### Description
Currently, a URL with an uppercase `.M3U8` suffix, e.g. `https://example.com/live.M3U8`, is not recognized as an HLS URL.
### Debug log
```text
>streamlink https://example.com/live.M3U8 best --loglevel=debug
[cli][debug] OS: Windows 10
[cli][debug] Python: 3.11.1
[cli][debug] OpenSSL: OpenSSL 1.1.1q 5 Jul 2022
[cli][debug] Streamlink: 6.2.1+29.gc82a8535
[cli][debug] Dependencies:
[cli][debug] certifi: 2023.5.7
[cli][debug] isodate: 0.6.1
[cli][debug] lxml: 4.9.2
[cli][debug] pycountry: 22.3.5
[cli][debug] pycryptodome: 3.16.0
[cli][debug] PySocks: 1.7.1
[cli][debug] requests: 2.31.0
[cli][debug] trio: 0.22.0
[cli][debug] trio-websocket: 0.9.2
[cli][debug] typing-extensions: 4.4.0
[cli][debug] urllib3: 1.26.15
[cli][debug] websocket-client: 1.5.1
[cli][debug] Arguments:
[cli][debug] url=https://example.com/live.M3U8
[cli][debug] stream=['best']
[cli][debug] --loglevel=debug
[cli][debug] --player=mpv.exe
[cli][debug] --stream-segment-threads=10
[cli][debug] --hls-segment-queue-threshold=0.0
error: No plugin can handle URL: https://example.com/live.M3U8
```
--- END ISSUE ---
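The likely culprit is small: the generic HLS/DASH URL matchers compile their suffix patterns without `re.IGNORECASE`, so only lowercase `.m3u8`/`.mpd` match. A standalone sketch of the case-insensitive pattern:
```python
import re

# Same suffix pattern the generic HLS plugin uses, made case-insensitive.
HLS_URL = re.compile(
    r"(?P<url>\S+\.m3u8(?:\?\S*)?)(?:\s(?P<params>.+))?$",
    re.IGNORECASE,
)

assert HLS_URL.match("https://example.com/live.M3U8")
assert HLS_URL.match("https://example.com/live.m3u8")
```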
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/dash.py`
Content:
```
1 import logging
2 import re
3
4 from streamlink.plugin import Plugin, pluginmatcher
5 from streamlink.plugin.plugin import LOW_PRIORITY, parse_params, stream_weight
6 from streamlink.stream.dash import DASHStream
7 from streamlink.utils.url import update_scheme
8
9
10 log = logging.getLogger(__name__)
11
12
13 @pluginmatcher(re.compile(
14 r"dash://(?P<url>\S+)(?:\s(?P<params>.+))?$",
15 ))
16 @pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(
17 r"(?P<url>\S+\.mpd(?:\?\S*)?)(?:\s(?P<params>.+))?$",
18 ))
19 class MPEGDASH(Plugin):
20 @classmethod
21 def stream_weight(cls, stream):
22 match = re.match(r"^(?:(.*)\+)?(?:a(\d+)k)$", stream)
23 if match and match.group(1) and match.group(2):
24 weight, group = stream_weight(match.group(1))
25 weight += int(match.group(2))
26 return weight, group
27 elif match and match.group(2):
28 return stream_weight(f"{match.group(2)}k")
29 else:
30 return stream_weight(stream)
31
32 def _get_streams(self):
33 data = self.match.groupdict()
34 url = update_scheme("https://", data.get("url"), force=False)
35 params = parse_params(data.get("params"))
36 log.debug(f"URL={url}; params={params}")
37
38 return DASHStream.parse_manifest(self.session, url, **params)
39
40
41 __plugin__ = MPEGDASH
42
```
Path: `src/streamlink/plugins/hls.py`
Content:
```
1 import logging
2 import re
3
4 from streamlink.plugin import Plugin, pluginmatcher
5 from streamlink.plugin.plugin import LOW_PRIORITY, parse_params
6 from streamlink.stream.hls import HLSStream
7 from streamlink.utils.url import update_scheme
8
9
10 log = logging.getLogger(__name__)
11
12
13 @pluginmatcher(re.compile(
14 r"hls(?:variant)?://(?P<url>\S+)(?:\s(?P<params>.+))?$",
15 ))
16 @pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(
17 r"(?P<url>\S+\.m3u8(?:\?\S*)?)(?:\s(?P<params>.+))?$",
18 ))
19 class HLSPlugin(Plugin):
20 def _get_streams(self):
21 data = self.match.groupdict()
22 url = update_scheme("https://", data.get("url"), force=False)
23 params = parse_params(data.get("params"))
24 log.debug(f"URL={url}; params={params}")
25
26 streams = HLSStream.parse_variant_playlist(self.session, url, **params)
27
28 return streams or {"live": HLSStream(self.session, url, **params)}
29
30
31 __plugin__ = HLSPlugin
32
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/dash.py b/src/streamlink/plugins/dash.py
--- a/src/streamlink/plugins/dash.py
+++ b/src/streamlink/plugins/dash.py
@@ -15,6 +15,7 @@
))
@pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(
r"(?P<url>\S+\.mpd(?:\?\S*)?)(?:\s(?P<params>.+))?$",
+ re.IGNORECASE,
))
class MPEGDASH(Plugin):
@classmethod
diff --git a/src/streamlink/plugins/hls.py b/src/streamlink/plugins/hls.py
--- a/src/streamlink/plugins/hls.py
+++ b/src/streamlink/plugins/hls.py
@@ -15,6 +15,7 @@
))
@pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(
r"(?P<url>\S+\.m3u8(?:\?\S*)?)(?:\s(?P<params>.+))?$",
+ re.IGNORECASE,
))
class HLSPlugin(Plugin):
def _get_streams(self):
| {"golden_diff": "diff --git a/src/streamlink/plugins/dash.py b/src/streamlink/plugins/dash.py\n--- a/src/streamlink/plugins/dash.py\n+++ b/src/streamlink/plugins/dash.py\n@@ -15,6 +15,7 @@\n ))\n @pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(\n r\"(?P<url>\\S+\\.mpd(?:\\?\\S*)?)(?:\\s(?P<params>.+))?$\",\n+ re.IGNORECASE,\n ))\n class MPEGDASH(Plugin):\n @classmethod\ndiff --git a/src/streamlink/plugins/hls.py b/src/streamlink/plugins/hls.py\n--- a/src/streamlink/plugins/hls.py\n+++ b/src/streamlink/plugins/hls.py\n@@ -15,6 +15,7 @@\n ))\n @pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(\n r\"(?P<url>\\S+\\.m3u8(?:\\?\\S*)?)(?:\\s(?P<params>.+))?$\",\n+ re.IGNORECASE,\n ))\n class HLSPlugin(Plugin):\n def _get_streams(self):\n", "issue": "plugins.hls: recognize URLs with uppercase \".M3U8\"\n### Checklist\n\n- [X] This is a bug report and not [a different kind of issue](https://github.com/streamlink/streamlink/issues/new/choose)\n- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)\n- [X] [I have checked the list of open and recently closed bug reports](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22bug%22)\n- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)\n\n### Streamlink version\n\nstreamlink 6.2.1+29.gc82a8535\n\n### Description\n\nCurrently a URL with upper case M3U8, e.g. `https://example.com/live.M3U8` would not be recognized as an HLS URL.\r\n\r\n\n\n### Debug log\n\n```text\n>streamlink https://example.com/live.M3U8 best --loglevel=debug \r\n[cli][debug] OS: Windows 10\r\n[cli][debug] Python: 3.11.1\r\n[cli][debug] OpenSSL: OpenSSL 1.1.1q 5 Jul 2022\r\n[cli][debug] Streamlink: 6.2.1+29.gc82a8535\r\n[cli][debug] Dependencies:\r\n[cli][debug] certifi: 2023.5.7\r\n[cli][debug] isodate: 0.6.1\r\n[cli][debug] lxml: 4.9.2\r\n[cli][debug] pycountry: 22.3.5\r\n[cli][debug] pycryptodome: 3.16.0\r\n[cli][debug] PySocks: 1.7.1\r\n[cli][debug] requests: 2.31.0\r\n[cli][debug] trio: 0.22.0\r\n[cli][debug] trio-websocket: 0.9.2\r\n[cli][debug] typing-extensions: 4.4.0\r\n[cli][debug] urllib3: 1.26.15\r\n[cli][debug] websocket-client: 1.5.1\r\n[cli][debug] Arguments:\r\n[cli][debug] url=https://example.com/live.M3U8\r\n[cli][debug] stream=['best']\r\n[cli][debug] --loglevel=debug\r\n[cli][debug] --player=mpv.exe\r\n[cli][debug] --stream-segment-threads=10\r\n[cli][debug] --hls-segment-queue-threshold=0.0\r\nerror: No plugin can handle URL: https://example.com/live.M3U8\n```\n\n", "before_files": [{"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.plugin import LOW_PRIORITY, parse_params, stream_weight\nfrom streamlink.stream.dash import DASHStream\nfrom streamlink.utils.url import update_scheme\n\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"dash://(?P<url>\\S+)(?:\\s(?P<params>.+))?$\",\n))\n@pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(\n r\"(?P<url>\\S+\\.mpd(?:\\?\\S*)?)(?:\\s(?P<params>.+))?$\",\n))\nclass MPEGDASH(Plugin):\n @classmethod\n def stream_weight(cls, stream):\n match = re.match(r\"^(?:(.*)\\+)?(?:a(\\d+)k)$\", stream)\n if match and match.group(1) and match.group(2):\n weight, group = stream_weight(match.group(1))\n weight += int(match.group(2))\n return weight, group\n elif match and match.group(2):\n return stream_weight(f\"{match.group(2)}k\")\n else:\n return 
stream_weight(stream)\n\n def _get_streams(self):\n data = self.match.groupdict()\n url = update_scheme(\"https://\", data.get(\"url\"), force=False)\n params = parse_params(data.get(\"params\"))\n log.debug(f\"URL={url}; params={params}\")\n\n return DASHStream.parse_manifest(self.session, url, **params)\n\n\n__plugin__ = MPEGDASH\n", "path": "src/streamlink/plugins/dash.py"}, {"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.plugin import LOW_PRIORITY, parse_params\nfrom streamlink.stream.hls import HLSStream\nfrom streamlink.utils.url import update_scheme\n\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"hls(?:variant)?://(?P<url>\\S+)(?:\\s(?P<params>.+))?$\",\n))\n@pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(\n r\"(?P<url>\\S+\\.m3u8(?:\\?\\S*)?)(?:\\s(?P<params>.+))?$\",\n))\nclass HLSPlugin(Plugin):\n def _get_streams(self):\n data = self.match.groupdict()\n url = update_scheme(\"https://\", data.get(\"url\"), force=False)\n params = parse_params(data.get(\"params\"))\n log.debug(f\"URL={url}; params={params}\")\n\n streams = HLSStream.parse_variant_playlist(self.session, url, **params)\n\n return streams or {\"live\": HLSStream(self.session, url, **params)}\n\n\n__plugin__ = HLSPlugin\n", "path": "src/streamlink/plugins/hls.py"}], "after_files": [{"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.plugin import LOW_PRIORITY, parse_params, stream_weight\nfrom streamlink.stream.dash import DASHStream\nfrom streamlink.utils.url import update_scheme\n\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"dash://(?P<url>\\S+)(?:\\s(?P<params>.+))?$\",\n))\n@pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(\n r\"(?P<url>\\S+\\.mpd(?:\\?\\S*)?)(?:\\s(?P<params>.+))?$\",\n re.IGNORECASE,\n))\nclass MPEGDASH(Plugin):\n @classmethod\n def stream_weight(cls, stream):\n match = re.match(r\"^(?:(.*)\\+)?(?:a(\\d+)k)$\", stream)\n if match and match.group(1) and match.group(2):\n weight, group = stream_weight(match.group(1))\n weight += int(match.group(2))\n return weight, group\n elif match and match.group(2):\n return stream_weight(f\"{match.group(2)}k\")\n else:\n return stream_weight(stream)\n\n def _get_streams(self):\n data = self.match.groupdict()\n url = update_scheme(\"https://\", data.get(\"url\"), force=False)\n params = parse_params(data.get(\"params\"))\n log.debug(f\"URL={url}; params={params}\")\n\n return DASHStream.parse_manifest(self.session, url, **params)\n\n\n__plugin__ = MPEGDASH\n", "path": "src/streamlink/plugins/dash.py"}, {"content": "import logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.plugin import LOW_PRIORITY, parse_params\nfrom streamlink.stream.hls import HLSStream\nfrom streamlink.utils.url import update_scheme\n\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"hls(?:variant)?://(?P<url>\\S+)(?:\\s(?P<params>.+))?$\",\n))\n@pluginmatcher(priority=LOW_PRIORITY, pattern=re.compile(\n r\"(?P<url>\\S+\\.m3u8(?:\\?\\S*)?)(?:\\s(?P<params>.+))?$\",\n re.IGNORECASE,\n))\nclass HLSPlugin(Plugin):\n def _get_streams(self):\n data = self.match.groupdict()\n url = update_scheme(\"https://\", data.get(\"url\"), force=False)\n params = parse_params(data.get(\"params\"))\n log.debug(f\"URL={url}; params={params}\")\n\n streams = HLSStream.parse_variant_playlist(self.session, url, **params)\n\n return 
streams or {\"live\": HLSStream(self.session, url, **params)}\n\n\n__plugin__ = HLSPlugin\n", "path": "src/streamlink/plugins/hls.py"}]} | 1,649 | 234 |
gh_patches_debug_4119 | rasdani/github-patches | git_diff | mathesar-foundation__mathesar-2424 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
First time password reset should redirect to login page instead of confirmation page
## Description
* After the user resets their password at `/auth/password_reset_confirm`, they are redirected to `/auth/reset/done/`.
* They should be redirected to `/auth/login` instead.
--- END ISSUE ---
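Django's `PasswordResetConfirmView` redirects to its `success_url` after a valid form submission, so pointing that attribute at the login page is the natural one-line fix. A sketch (the URL string assumes Mathesar's `/auth/login` route):
```python
from django.contrib.auth.views import PasswordResetConfirmView


class MathesarPasswordResetConfirmView(PasswordResetConfirmView):
    # Redirect to the login page instead of the default
    # /auth/reset/done/ confirmation page.
    success_url = "/auth/login"
```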
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mathesar/users/password_reset.py`
Content:
```
1 from django.contrib.auth.forms import SetPasswordForm
2 from django.contrib.auth.views import PasswordResetConfirmView
3 from django.utils.decorators import method_decorator
4 from django.views.decorators.cache import never_cache
5 from django.views.decorators.debug import sensitive_post_parameters
6 from django.utils.translation import gettext_lazy as _
7
8
9 class MathesarSetPasswordForm(SetPasswordForm):
10 def save(self, commit=True):
11 password = self.cleaned_data["new_password1"]
12 self.user.set_password(password)
13 # Default password is replaced with a password is set by the user, so change the status
14 self.user.password_change_needed = False
15 if commit:
16 self.user.save()
17 return self.user
18
19
20 class MathesarPasswordResetConfirmView(PasswordResetConfirmView):
21 # Override default form as we need custom save behaviour
22 form_class = MathesarSetPasswordForm
23 template_name = 'users/password_reset_confirmation.html'
24 title = _('Change Default Password')
25
26 @method_decorator(sensitive_post_parameters())
27 @method_decorator(never_cache)
28 def dispatch(self, *args, **kwargs):
29 self.user = self.request.user
30 self.validlink = True
31 # Avoid calling the PasswordResetConfirmView `dispatch` method
32 # as it contains behaviours not suited for our user flow
33 return super(PasswordResetConfirmView, self).dispatch(*args, **kwargs)
34
35 def form_valid(self, form):
36 form.save()
37 return super(PasswordResetConfirmView, self).form_valid(form)
38
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mathesar/users/password_reset.py b/mathesar/users/password_reset.py
--- a/mathesar/users/password_reset.py
+++ b/mathesar/users/password_reset.py
@@ -22,6 +22,7 @@
form_class = MathesarSetPasswordForm
template_name = 'users/password_reset_confirmation.html'
title = _('Change Default Password')
+ success_url = "/auth/login"
@method_decorator(sensitive_post_parameters())
@method_decorator(never_cache)
| {"golden_diff": "diff --git a/mathesar/users/password_reset.py b/mathesar/users/password_reset.py\n--- a/mathesar/users/password_reset.py\n+++ b/mathesar/users/password_reset.py\n@@ -22,6 +22,7 @@\n form_class = MathesarSetPasswordForm\n template_name = 'users/password_reset_confirmation.html'\n title = _('Change Default Password')\n+ success_url = \"/auth/login\"\n \n @method_decorator(sensitive_post_parameters())\n @method_decorator(never_cache)\n", "issue": "First time password reset should redirect to login page instead of confirmation page\n## Description\r\n* After the user resets their password at the page: `/auth/password_reset_confirm`, they are redirected to `/auth/reset/done/`.\r\n* They should be redirected to `/auth/login` instead.\n", "before_files": [{"content": "from django.contrib.auth.forms import SetPasswordForm\nfrom django.contrib.auth.views import PasswordResetConfirmView\nfrom django.utils.decorators import method_decorator\nfrom django.views.decorators.cache import never_cache\nfrom django.views.decorators.debug import sensitive_post_parameters\nfrom django.utils.translation import gettext_lazy as _\n\n\nclass MathesarSetPasswordForm(SetPasswordForm):\n def save(self, commit=True):\n password = self.cleaned_data[\"new_password1\"]\n self.user.set_password(password)\n # Default password is replaced with a password is set by the user, so change the status\n self.user.password_change_needed = False\n if commit:\n self.user.save()\n return self.user\n\n\nclass MathesarPasswordResetConfirmView(PasswordResetConfirmView):\n # Override default form as we need custom save behaviour\n form_class = MathesarSetPasswordForm\n template_name = 'users/password_reset_confirmation.html'\n title = _('Change Default Password')\n\n @method_decorator(sensitive_post_parameters())\n @method_decorator(never_cache)\n def dispatch(self, *args, **kwargs):\n self.user = self.request.user\n self.validlink = True\n # Avoid calling the PasswordResetConfirmView `dispatch` method\n # as it contains behaviours not suited for our user flow\n return super(PasswordResetConfirmView, self).dispatch(*args, **kwargs)\n\n def form_valid(self, form):\n form.save()\n return super(PasswordResetConfirmView, self).form_valid(form)\n", "path": "mathesar/users/password_reset.py"}], "after_files": [{"content": "from django.contrib.auth.forms import SetPasswordForm\nfrom django.contrib.auth.views import PasswordResetConfirmView\nfrom django.utils.decorators import method_decorator\nfrom django.views.decorators.cache import never_cache\nfrom django.views.decorators.debug import sensitive_post_parameters\nfrom django.utils.translation import gettext_lazy as _\n\n\nclass MathesarSetPasswordForm(SetPasswordForm):\n def save(self, commit=True):\n password = self.cleaned_data[\"new_password1\"]\n self.user.set_password(password)\n # Default password is replaced with a password is set by the user, so change the status\n self.user.password_change_needed = False\n if commit:\n self.user.save()\n return self.user\n\n\nclass MathesarPasswordResetConfirmView(PasswordResetConfirmView):\n # Override default form as we need custom save behaviour\n form_class = MathesarSetPasswordForm\n template_name = 'users/password_reset_confirmation.html'\n title = _('Change Default Password')\n success_url = \"/auth/login\"\n\n @method_decorator(sensitive_post_parameters())\n @method_decorator(never_cache)\n def dispatch(self, *args, **kwargs):\n self.user = self.request.user\n self.validlink = True\n # Avoid calling the 
PasswordResetConfirmView `dispatch` method\n # as it contains behaviours not suited for our user flow\n return super(PasswordResetConfirmView, self).dispatch(*args, **kwargs)\n\n def form_valid(self, form):\n form.save()\n return super(PasswordResetConfirmView, self).form_valid(form)\n", "path": "mathesar/users/password_reset.py"}]} | 696 | 104 |
gh_patches_debug_9831 | rasdani/github-patches | git_diff | wemake-services__wemake-python-styleguide-1226 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Annotation complexity should not fail on expressions
# Bug report
This code:
```python
def some() -> 'test expression':
...
```
Makes `flake8-annotation-complexity` fail. We need to ignore this case silently.
Related: #1170
Demo: https://asciinema.org/a/IIjIfkVKytmZ1F5c2YufMORdi
--- END ISSUE ---
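The crash comes from feeding an arbitrary string annotation to `ast.parse`: `'test expression'` is not valid Python, so parsing raises `SyntaxError`. The usual remedy is to treat unparseable string annotations as trivially simple. A self-contained sketch of the mechanism:
```python
import ast


def parse_string_annotation(source: str):
    """Parse a string annotation, silently ignoring invalid expressions."""
    try:
        return ast.parse(source).body[0].value
    except SyntaxError:
        return None  # e.g. 'test expression' is not a Python expression


print(parse_string_annotation("List[int]"))        # an ast.Subscript node
print(parse_string_annotation("test expression"))  # None
```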
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wemake_python_styleguide/logic/complexity/annotations.py`
Content:
```
1 """
2 Counts annotation complexity by getting the nesting level of nodes.
3
4 So ``List[int]`` complexity is 2
5 and ``Tuple[List[Optional[str]], int]`` is 4.
6
7 Adapted from: https://github.com/best-doctor/flake8-annotations-complexity
8 """
9
10 import ast
11 from typing import Union
12
13 _Annotation = Union[
14 ast.expr,
15 ast.Str,
16 ]
17
18
19 def get_annotation_compexity(annotation_node: _Annotation) -> int:
20 """
21 Recursevly counts complexity of annotation nodes.
22
23 When annotations are written as strings,
24 we additionally parse them to ``ast`` nodes.
25 """
26 if isinstance(annotation_node, ast.Str):
27 annotation_node = ast.parse( # type: ignore
28 annotation_node.s,
29 ).body[0].value
30
31 if isinstance(annotation_node, ast.Subscript):
32 return 1 + get_annotation_compexity(
33 annotation_node.slice.value, # type: ignore
34 )
35 elif isinstance(annotation_node, (ast.Tuple, ast.List)):
36 return max(
37 (get_annotation_compexity(node) for node in annotation_node.elts),
38 default=1,
39 )
40 return 1
41
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wemake_python_styleguide/logic/complexity/annotations.py b/wemake_python_styleguide/logic/complexity/annotations.py
--- a/wemake_python_styleguide/logic/complexity/annotations.py
+++ b/wemake_python_styleguide/logic/complexity/annotations.py
@@ -24,9 +24,12 @@
we additionally parse them to ``ast`` nodes.
"""
if isinstance(annotation_node, ast.Str):
- annotation_node = ast.parse( # type: ignore
- annotation_node.s,
- ).body[0].value
+ try:
+ annotation_node = ast.parse( # type: ignore
+ annotation_node.s,
+ ).body[0].value
+ except SyntaxError:
+ return 1
if isinstance(annotation_node, ast.Subscript):
return 1 + get_annotation_compexity(
| {"golden_diff": "diff --git a/wemake_python_styleguide/logic/complexity/annotations.py b/wemake_python_styleguide/logic/complexity/annotations.py\n--- a/wemake_python_styleguide/logic/complexity/annotations.py\n+++ b/wemake_python_styleguide/logic/complexity/annotations.py\n@@ -24,9 +24,12 @@\n we additionally parse them to ``ast`` nodes.\n \"\"\"\n if isinstance(annotation_node, ast.Str):\n- annotation_node = ast.parse( # type: ignore\n- annotation_node.s,\n- ).body[0].value\n+ try:\n+ annotation_node = ast.parse( # type: ignore\n+ annotation_node.s,\n+ ).body[0].value\n+ except SyntaxError:\n+ return 1\n \n if isinstance(annotation_node, ast.Subscript):\n return 1 + get_annotation_compexity(\n", "issue": "Annotation complexity should not fail on expressions\n# Bug report\r\n\r\nThis code:\r\n\r\n```python\r\ndef some() -> 'test expression':\r\n ...\r\n```\r\n\r\nMakes `flake8-annotation-complexity` to fail. We need to ignore this case silently.\r\n\r\nRelated: #1170 \r\n\r\nDemo: https://asciinema.org/a/IIjIfkVKytmZ1F5c2YufMORdi\nAnnotation complexity should not fail on expressions\n# Bug report\r\n\r\nThis code:\r\n\r\n```python\r\ndef some() -> 'test expression':\r\n ...\r\n```\r\n\r\nMakes `flake8-annotation-complexity` to fail. We need to ignore this case silently.\r\n\r\nRelated: #1170 \r\n\r\nDemo: https://asciinema.org/a/IIjIfkVKytmZ1F5c2YufMORdi\n", "before_files": [{"content": "\"\"\"\nCounts annotation complexity by getting the nesting level of nodes.\n\nSo ``List[int]`` complexity is 2\nand ``Tuple[List[Optional[str]], int]`` is 4.\n\nAdapted from: https://github.com/best-doctor/flake8-annotations-complexity\n\"\"\"\n\nimport ast\nfrom typing import Union\n\n_Annotation = Union[\n ast.expr,\n ast.Str,\n]\n\n\ndef get_annotation_compexity(annotation_node: _Annotation) -> int:\n \"\"\"\n Recursevly counts complexity of annotation nodes.\n\n When annotations are written as strings,\n we additionally parse them to ``ast`` nodes.\n \"\"\"\n if isinstance(annotation_node, ast.Str):\n annotation_node = ast.parse( # type: ignore\n annotation_node.s,\n ).body[0].value\n\n if isinstance(annotation_node, ast.Subscript):\n return 1 + get_annotation_compexity(\n annotation_node.slice.value, # type: ignore\n )\n elif isinstance(annotation_node, (ast.Tuple, ast.List)):\n return max(\n (get_annotation_compexity(node) for node in annotation_node.elts),\n default=1,\n )\n return 1\n", "path": "wemake_python_styleguide/logic/complexity/annotations.py"}], "after_files": [{"content": "\"\"\"\nCounts annotation complexity by getting the nesting level of nodes.\n\nSo ``List[int]`` complexity is 2\nand ``Tuple[List[Optional[str]], int]`` is 4.\n\nAdapted from: https://github.com/best-doctor/flake8-annotations-complexity\n\"\"\"\n\nimport ast\nfrom typing import Union\n\n_Annotation = Union[\n ast.expr,\n ast.Str,\n]\n\n\ndef get_annotation_compexity(annotation_node: _Annotation) -> int:\n \"\"\"\n Recursevly counts complexity of annotation nodes.\n\n When annotations are written as strings,\n we additionally parse them to ``ast`` nodes.\n \"\"\"\n if isinstance(annotation_node, ast.Str):\n try:\n annotation_node = ast.parse( # type: ignore\n annotation_node.s,\n ).body[0].value\n except SyntaxError:\n return 1\n\n if isinstance(annotation_node, ast.Subscript):\n return 1 + get_annotation_compexity(\n annotation_node.slice.value, # type: ignore\n )\n elif isinstance(annotation_node, (ast.Tuple, ast.List)):\n return max(\n (get_annotation_compexity(node) for node in annotation_node.elts),\n 
default=1,\n )\n return 1\n", "path": "wemake_python_styleguide/logic/complexity/annotations.py"}]} | 773 | 198 |
gh_patches_debug_10410 | rasdani/github-patches | git_diff | pypa__pip-6731 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Add docs for new pip debug command
This is a follow-up issue to PR #6638 to add docs for the new `pip debug` command. As @xavfernandez said in [this comment](https://github.com/pypa/pip/pull/6638#pullrequestreview-256090004):
> It would also need basic documentation (at least a `docs/html/reference/pip_debug.rst`) and most importantly (IMHO), strongly emphasize that the output and the options of this command are provisional and might change without notice.
--- END ISSUE ---
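Beyond a `docs/html/reference/pip_debug.rst` page, the provisional nature of the command can also be flagged at runtime. A hedged sketch of emitting the caveat when the command runs; the exact wording is an assumption, chosen to mirror how pip logs elsewhere:
```python
import logging

logger = logging.getLogger(__name__)


def warn_provisional() -> None:
    # Emitted at the top of `pip debug` so automation does not come to
    # rely on output that may change without notice.
    logger.warning(
        "This command is only meant for debugging. "
        "Its output and options may change without notice."
    )
```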
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/pip/_internal/commands/debug.py`
Content:
```
1 from __future__ import absolute_import
2
3 import logging
4 import sys
5
6 from pip._internal.cli import cmdoptions
7 from pip._internal.cli.base_command import Command
8 from pip._internal.cli.cmdoptions import make_target_python
9 from pip._internal.cli.status_codes import SUCCESS
10 from pip._internal.utils.logging import indent_log
11 from pip._internal.utils.misc import get_pip_version
12 from pip._internal.utils.typing import MYPY_CHECK_RUNNING
13 from pip._internal.wheel import format_tag
14
15 if MYPY_CHECK_RUNNING:
16 from typing import Any, List
17 from optparse import Values
18
19 logger = logging.getLogger(__name__)
20
21
22 def show_value(name, value):
23 # type: (str, str) -> None
24 logger.info('{}: {}'.format(name, value))
25
26
27 def show_sys_implementation():
28 # type: () -> None
29 logger.info('sys.implementation:')
30 if hasattr(sys, 'implementation'):
31 implementation = sys.implementation # type: ignore
32 implementation_name = implementation.name
33 else:
34 implementation_name = ''
35
36 with indent_log():
37 show_value('name', implementation_name)
38
39
40 def show_tags(options):
41 # type: (Values) -> None
42 tag_limit = 10
43
44 target_python = make_target_python(options)
45 tags = target_python.get_tags()
46
47 # Display the target options that were explicitly provided.
48 formatted_target = target_python.format_given()
49 suffix = ''
50 if formatted_target:
51 suffix = ' (target: {})'.format(formatted_target)
52
53 msg = 'Compatible tags: {}{}'.format(len(tags), suffix)
54 logger.info(msg)
55
56 if options.verbose < 1 and len(tags) > tag_limit:
57 tags_limited = True
58 tags = tags[:tag_limit]
59 else:
60 tags_limited = False
61
62 with indent_log():
63 for tag in tags:
64 logger.info(format_tag(tag))
65
66 if tags_limited:
67 msg = (
68 '...\n'
69 '[First {tag_limit} tags shown. Pass --verbose to show all.]'
70 ).format(tag_limit=tag_limit)
71 logger.info(msg)
72
73
74 class DebugCommand(Command):
75 """
76 Display debug information.
77 """
78
79 name = 'debug'
80 usage = """
81 %prog <options>"""
82 summary = 'Show information useful for debugging.'
83 ignore_require_venv = True
84
85 def __init__(self, *args, **kw):
86 super(DebugCommand, self).__init__(*args, **kw)
87
88 cmd_opts = self.cmd_opts
89 cmdoptions.add_target_python_options(cmd_opts)
90 self.parser.insert_option_group(0, cmd_opts)
91
92 def run(self, options, args):
93 # type: (Values, List[Any]) -> int
94 show_value('pip version', get_pip_version())
95 show_value('sys.version', sys.version)
96 show_value('sys.executable', sys.executable)
97 show_value('sys.platform', sys.platform)
98 show_sys_implementation()
99
100 show_tags(options)
101
102 return SUCCESS
103
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/pip/_internal/commands/debug.py b/src/pip/_internal/commands/debug.py
--- a/src/pip/_internal/commands/debug.py
+++ b/src/pip/_internal/commands/debug.py
@@ -91,6 +91,12 @@
def run(self, options, args):
# type: (Values, List[Any]) -> int
+ logger.warning(
+ "This command is only meant for debugging. "
+ "Do not use this with automation for parsing and getting these "
+ "details, since the output and options of this command may "
+ "change without notice."
+ )
show_value('pip version', get_pip_version())
show_value('sys.version', sys.version)
show_value('sys.executable', sys.executable)
| {"golden_diff": "diff --git a/src/pip/_internal/commands/debug.py b/src/pip/_internal/commands/debug.py\n--- a/src/pip/_internal/commands/debug.py\n+++ b/src/pip/_internal/commands/debug.py\n@@ -91,6 +91,12 @@\n \n def run(self, options, args):\n # type: (Values, List[Any]) -> int\n+ logger.warning(\n+ \"This command is only meant for debugging. \"\n+ \"Do not use this with automation for parsing and getting these \"\n+ \"details, since the output and options of this command may \"\n+ \"change without notice.\"\n+ )\n show_value('pip version', get_pip_version())\n show_value('sys.version', sys.version)\n show_value('sys.executable', sys.executable)\n", "issue": "Add docs for new pip debug command\nThis is a follow-up issue to PR #6638 to add docs for the new `pip debug` command. As @xavfernandez said in [this comment](https://github.com/pypa/pip/pull/6638#pullrequestreview-256090004):\r\n\r\n> It would also need basic documentation (at least a `docs/html/reference/pip_debug.rst`) and most importantly (IMHO), strongly emphasize that the output and the options of this command are provisional and might change without notice.\r\n\r\n\r\n\n", "before_files": [{"content": "from __future__ import absolute_import\n\nimport logging\nimport sys\n\nfrom pip._internal.cli import cmdoptions\nfrom pip._internal.cli.base_command import Command\nfrom pip._internal.cli.cmdoptions import make_target_python\nfrom pip._internal.cli.status_codes import SUCCESS\nfrom pip._internal.utils.logging import indent_log\nfrom pip._internal.utils.misc import get_pip_version\nfrom pip._internal.utils.typing import MYPY_CHECK_RUNNING\nfrom pip._internal.wheel import format_tag\n\nif MYPY_CHECK_RUNNING:\n from typing import Any, List\n from optparse import Values\n\nlogger = logging.getLogger(__name__)\n\n\ndef show_value(name, value):\n # type: (str, str) -> None\n logger.info('{}: {}'.format(name, value))\n\n\ndef show_sys_implementation():\n # type: () -> None\n logger.info('sys.implementation:')\n if hasattr(sys, 'implementation'):\n implementation = sys.implementation # type: ignore\n implementation_name = implementation.name\n else:\n implementation_name = ''\n\n with indent_log():\n show_value('name', implementation_name)\n\n\ndef show_tags(options):\n # type: (Values) -> None\n tag_limit = 10\n\n target_python = make_target_python(options)\n tags = target_python.get_tags()\n\n # Display the target options that were explicitly provided.\n formatted_target = target_python.format_given()\n suffix = ''\n if formatted_target:\n suffix = ' (target: {})'.format(formatted_target)\n\n msg = 'Compatible tags: {}{}'.format(len(tags), suffix)\n logger.info(msg)\n\n if options.verbose < 1 and len(tags) > tag_limit:\n tags_limited = True\n tags = tags[:tag_limit]\n else:\n tags_limited = False\n\n with indent_log():\n for tag in tags:\n logger.info(format_tag(tag))\n\n if tags_limited:\n msg = (\n '...\\n'\n '[First {tag_limit} tags shown. 
Pass --verbose to show all.]'\n ).format(tag_limit=tag_limit)\n logger.info(msg)\n\n\nclass DebugCommand(Command):\n \"\"\"\n Display debug information.\n \"\"\"\n\n name = 'debug'\n usage = \"\"\"\n %prog <options>\"\"\"\n summary = 'Show information useful for debugging.'\n ignore_require_venv = True\n\n def __init__(self, *args, **kw):\n super(DebugCommand, self).__init__(*args, **kw)\n\n cmd_opts = self.cmd_opts\n cmdoptions.add_target_python_options(cmd_opts)\n self.parser.insert_option_group(0, cmd_opts)\n\n def run(self, options, args):\n # type: (Values, List[Any]) -> int\n show_value('pip version', get_pip_version())\n show_value('sys.version', sys.version)\n show_value('sys.executable', sys.executable)\n show_value('sys.platform', sys.platform)\n show_sys_implementation()\n\n show_tags(options)\n\n return SUCCESS\n", "path": "src/pip/_internal/commands/debug.py"}], "after_files": [{"content": "from __future__ import absolute_import\n\nimport logging\nimport sys\n\nfrom pip._internal.cli import cmdoptions\nfrom pip._internal.cli.base_command import Command\nfrom pip._internal.cli.cmdoptions import make_target_python\nfrom pip._internal.cli.status_codes import SUCCESS\nfrom pip._internal.utils.logging import indent_log\nfrom pip._internal.utils.misc import get_pip_version\nfrom pip._internal.utils.typing import MYPY_CHECK_RUNNING\nfrom pip._internal.wheel import format_tag\n\nif MYPY_CHECK_RUNNING:\n from typing import Any, List\n from optparse import Values\n\nlogger = logging.getLogger(__name__)\n\n\ndef show_value(name, value):\n # type: (str, str) -> None\n logger.info('{}: {}'.format(name, value))\n\n\ndef show_sys_implementation():\n # type: () -> None\n logger.info('sys.implementation:')\n if hasattr(sys, 'implementation'):\n implementation = sys.implementation # type: ignore\n implementation_name = implementation.name\n else:\n implementation_name = ''\n\n with indent_log():\n show_value('name', implementation_name)\n\n\ndef show_tags(options):\n # type: (Values) -> None\n tag_limit = 10\n\n target_python = make_target_python(options)\n tags = target_python.get_tags()\n\n # Display the target options that were explicitly provided.\n formatted_target = target_python.format_given()\n suffix = ''\n if formatted_target:\n suffix = ' (target: {})'.format(formatted_target)\n\n msg = 'Compatible tags: {}{}'.format(len(tags), suffix)\n logger.info(msg)\n\n if options.verbose < 1 and len(tags) > tag_limit:\n tags_limited = True\n tags = tags[:tag_limit]\n else:\n tags_limited = False\n\n with indent_log():\n for tag in tags:\n logger.info(format_tag(tag))\n\n if tags_limited:\n msg = (\n '...\\n'\n '[First {tag_limit} tags shown. Pass --verbose to show all.]'\n ).format(tag_limit=tag_limit)\n logger.info(msg)\n\n\nclass DebugCommand(Command):\n \"\"\"\n Display debug information.\n \"\"\"\n\n name = 'debug'\n usage = \"\"\"\n %prog <options>\"\"\"\n summary = 'Show information useful for debugging.'\n ignore_require_venv = True\n\n def __init__(self, *args, **kw):\n super(DebugCommand, self).__init__(*args, **kw)\n\n cmd_opts = self.cmd_opts\n cmdoptions.add_target_python_options(cmd_opts)\n self.parser.insert_option_group(0, cmd_opts)\n\n def run(self, options, args):\n # type: (Values, List[Any]) -> int\n logger.warning(\n \"This command is only meant for debugging. 
\"\n \"Do not use this with automation for parsing and getting these \"\n \"details, since the output and options of this command may \"\n \"change without notice.\"\n )\n show_value('pip version', get_pip_version())\n show_value('sys.version', sys.version)\n show_value('sys.executable', sys.executable)\n show_value('sys.platform', sys.platform)\n show_sys_implementation()\n\n show_tags(options)\n\n return SUCCESS\n", "path": "src/pip/_internal/commands/debug.py"}]} | 1,241 | 175 |
gh_patches_debug_3521 | rasdani/github-patches | git_diff | wagtail__wagtail-2465 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove redundant template debug lines from project template
Ref: https://github.com/torchbox/wagtail/blob/9ff7961a3c8f508ad17735cd815335bad12fd67f/wagtail/project_template/project_name/settings/dev.py#L7-L8
#1688
According to https://docs.djangoproject.com/en/1.9/topics/templates/#django.template.backends.django.DjangoTemplates, the 'debug' option on the DjangoTemplates engine defaults to the global DEBUG setting, so setting this here is apparently redundant. (Also, there's no corresponding option for the Jinja2 backend, so setting this for all engines is not strictly correct.)
So, we just need someone to double-check that with these lines removed, template debug info still displays in development mode but not in production.
--- END ISSUE ---
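The claim from the Django docs is easy to verify in isolation: with no explicit `'debug'` key in `OPTIONS`, the DjangoTemplates backend falls back to `settings.DEBUG`, so the loop in `dev.py` is a no-op. A minimal check, runnable outside a project by configuring settings manually (a sketch, not project code):
```python
import django
from django.conf import settings

settings.configure(DEBUG=True, TEMPLATES=[{
    "BACKEND": "django.template.backends.django.DjangoTemplates",
    "DIRS": [],
    "APP_DIRS": False,
    "OPTIONS": {},  # note: no explicit 'debug' key
}])
django.setup()

from django.template import engines

engine = engines["django"]
print(engine.engine.debug)  # True, inherited from settings.DEBUG
```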
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `wagtail/project_template/project_name/settings/dev.py`
Content:
```
1 from __future__ import absolute_import, unicode_literals
2
3 from .base import *
4
5 # SECURITY WARNING: don't run with debug turned on in production!
6 DEBUG = True
7
8 for template_engine in TEMPLATES:
9 template_engine['OPTIONS']['debug'] = True
10
11 # SECURITY WARNING: keep the secret key used in production secret!
12 SECRET_KEY = '{{ secret_key }}'
13
14
15 EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
16
17
18 try:
19 from .local import *
20 except ImportError:
21 pass
22
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/wagtail/project_template/project_name/settings/dev.py b/wagtail/project_template/project_name/settings/dev.py
--- a/wagtail/project_template/project_name/settings/dev.py
+++ b/wagtail/project_template/project_name/settings/dev.py
@@ -5,9 +5,6 @@
# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = True
-for template_engine in TEMPLATES:
- template_engine['OPTIONS']['debug'] = True
-
# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = '{{ secret_key }}'
| {"golden_diff": "diff --git a/wagtail/project_template/project_name/settings/dev.py b/wagtail/project_template/project_name/settings/dev.py\n--- a/wagtail/project_template/project_name/settings/dev.py\n+++ b/wagtail/project_template/project_name/settings/dev.py\n@@ -5,9 +5,6 @@\n # SECURITY WARNING: don't run with debug turned on in production!\n DEBUG = True\n \n-for template_engine in TEMPLATES:\n- template_engine['OPTIONS']['debug'] = True\n-\n # SECURITY WARNING: keep the secret key used in production secret!\n SECRET_KEY = '{{ secret_key }}'\n", "issue": "Remove redundant template debug lines from project template\nRef: https://github.com/torchbox/wagtail/blob/9ff7961a3c8f508ad17735cd815335bad12fd67f/wagtail/project_template/project_name/settings/dev.py#L7-L8\n#1688\n\nAccording to https://docs.djangoproject.com/en/1.9/topics/templates/#django.template.backends.django.DjangoTemplates, the 'debug' option on the DjangoTemplates engine defaults to the global DEBUG setting, so setting this here is apparently redundant. (Also, there's no corresponding option for the Jinja2 backend, so setting this for all engines is not strictly correct.)\n\nSo, we just need someone to double-check that with these lines removed, template debug info still displays in development mode but not in production.\n\n", "before_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nfrom .base import *\n\n# SECURITY WARNING: don't run with debug turned on in production!\nDEBUG = True\n\nfor template_engine in TEMPLATES:\n template_engine['OPTIONS']['debug'] = True\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = '{{ secret_key }}'\n\n\nEMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'\n\n\ntry:\n from .local import *\nexcept ImportError:\n pass\n", "path": "wagtail/project_template/project_name/settings/dev.py"}], "after_files": [{"content": "from __future__ import absolute_import, unicode_literals\n\nfrom .base import *\n\n# SECURITY WARNING: don't run with debug turned on in production!\nDEBUG = True\n\n# SECURITY WARNING: keep the secret key used in production secret!\nSECRET_KEY = '{{ secret_key }}'\n\n\nEMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'\n\n\ntry:\n from .local import *\nexcept ImportError:\n pass\n", "path": "wagtail/project_template/project_name/settings/dev.py"}]} | 589 | 122 |
gh_patches_debug_3489 | rasdani/github-patches | git_diff | cookiecutter__cookiecutter-1526 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Enforce the minimal coverage to 100%
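A minimal sketch of how such a floor is usually enforced — these are standard coverage.py / pytest-cov options, not taken from this repository's actual configuration:

```ini
; .coveragerc — fail the run if total coverage drops below 100%
[report]
fail_under = 100
```

or, equivalently, `pytest --cov=cookiecutter --cov-fail-under=100` on the command line.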
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `cookiecutter/utils.py`
Content:
```
1 """Helper functions used throughout Cookiecutter."""
2 import contextlib
3 import errno
4 import logging
5 import os
6 import shutil
7 import stat
8 import sys
9
10 from cookiecutter.prompt import read_user_yes_no
11
12 logger = logging.getLogger(__name__)
13
14
15 def force_delete(func, path, exc_info):
16 """Error handler for `shutil.rmtree()` equivalent to `rm -rf`.
17
18 Usage: `shutil.rmtree(path, onerror=force_delete)`
19 From stackoverflow.com/questions/1889597
20 """
21 os.chmod(path, stat.S_IWRITE)
22 func(path)
23
24
25 def rmtree(path):
26 """Remove a directory and all its contents. Like rm -rf on Unix.
27
28 :param path: A directory path.
29 """
30 shutil.rmtree(path, onerror=force_delete)
31
32
33 def make_sure_path_exists(path):
34 """Ensure that a directory exists.
35
36 :param path: A directory path.
37 """
38 logger.debug('Making sure path exists: %s', path)
39 try:
40 os.makedirs(path)
41 logger.debug('Created directory at: %s', path)
42 except OSError as exception:
43 if exception.errno != errno.EEXIST:
44 return False
45 return True
46
47
48 @contextlib.contextmanager
49 def work_in(dirname=None):
50 """Context manager version of os.chdir.
51
52 When exited, returns to the working directory prior to entering.
53 """
54 curdir = os.getcwd()
55 try:
56 if dirname is not None:
57 os.chdir(dirname)
58 yield
59 finally:
60 os.chdir(curdir)
61
62
63 def make_executable(script_path):
64 """Make `script_path` executable.
65
66 :param script_path: The file to change
67 """
68 status = os.stat(script_path)
69 os.chmod(script_path, status.st_mode | stat.S_IEXEC)
70
71
72 def prompt_and_delete(path, no_input=False):
73 """
74 Ask user if it's okay to delete the previously-downloaded file/directory.
75
76 If yes, delete it. If no, checks to see if the old version should be
77 reused. If yes, it's reused; otherwise, Cookiecutter exits.
78
79 :param path: Previously downloaded zipfile.
80 :param no_input: Suppress prompt to delete repo and just delete it.
81 :return: True if the content was deleted
82 """
83 # Suppress prompt if called via API
84 if no_input:
85 ok_to_delete = True
86 else:
87 question = (
88 "You've downloaded {} before. Is it okay to delete and re-download it?"
89 ).format(path)
90
91 ok_to_delete = read_user_yes_no(question, 'yes')
92
93 if ok_to_delete:
94 if os.path.isdir(path):
95 rmtree(path)
96 else:
97 os.remove(path)
98 return True
99 else:
100 ok_to_reuse = read_user_yes_no(
101 "Do you want to re-use the existing version?", 'yes'
102 )
103
104 if ok_to_reuse:
105 return False
106
107 sys.exit()
108
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/cookiecutter/utils.py b/cookiecutter/utils.py
--- a/cookiecutter/utils.py
+++ b/cookiecutter/utils.py
@@ -16,7 +16,7 @@
"""Error handler for `shutil.rmtree()` equivalent to `rm -rf`.
Usage: `shutil.rmtree(path, onerror=force_delete)`
- From stackoverflow.com/questions/1889597
+ From https://docs.python.org/3/library/shutil.html#rmtree-example
"""
os.chmod(path, stat.S_IWRITE)
func(path)
| {"golden_diff": "diff --git a/cookiecutter/utils.py b/cookiecutter/utils.py\n--- a/cookiecutter/utils.py\n+++ b/cookiecutter/utils.py\n@@ -16,7 +16,7 @@\n \"\"\"Error handler for `shutil.rmtree()` equivalent to `rm -rf`.\n \n Usage: `shutil.rmtree(path, onerror=force_delete)`\n- From stackoverflow.com/questions/1889597\n+ From https://docs.python.org/3/library/shutil.html#rmtree-example\n \"\"\"\n os.chmod(path, stat.S_IWRITE)\n func(path)\n", "issue": "Enforce the minimal coverage to 100%\n\n", "before_files": [{"content": "\"\"\"Helper functions used throughout Cookiecutter.\"\"\"\nimport contextlib\nimport errno\nimport logging\nimport os\nimport shutil\nimport stat\nimport sys\n\nfrom cookiecutter.prompt import read_user_yes_no\n\nlogger = logging.getLogger(__name__)\n\n\ndef force_delete(func, path, exc_info):\n \"\"\"Error handler for `shutil.rmtree()` equivalent to `rm -rf`.\n\n Usage: `shutil.rmtree(path, onerror=force_delete)`\n From stackoverflow.com/questions/1889597\n \"\"\"\n os.chmod(path, stat.S_IWRITE)\n func(path)\n\n\ndef rmtree(path):\n \"\"\"Remove a directory and all its contents. Like rm -rf on Unix.\n\n :param path: A directory path.\n \"\"\"\n shutil.rmtree(path, onerror=force_delete)\n\n\ndef make_sure_path_exists(path):\n \"\"\"Ensure that a directory exists.\n\n :param path: A directory path.\n \"\"\"\n logger.debug('Making sure path exists: %s', path)\n try:\n os.makedirs(path)\n logger.debug('Created directory at: %s', path)\n except OSError as exception:\n if exception.errno != errno.EEXIST:\n return False\n return True\n\n\[email protected]\ndef work_in(dirname=None):\n \"\"\"Context manager version of os.chdir.\n\n When exited, returns to the working directory prior to entering.\n \"\"\"\n curdir = os.getcwd()\n try:\n if dirname is not None:\n os.chdir(dirname)\n yield\n finally:\n os.chdir(curdir)\n\n\ndef make_executable(script_path):\n \"\"\"Make `script_path` executable.\n\n :param script_path: The file to change\n \"\"\"\n status = os.stat(script_path)\n os.chmod(script_path, status.st_mode | stat.S_IEXEC)\n\n\ndef prompt_and_delete(path, no_input=False):\n \"\"\"\n Ask user if it's okay to delete the previously-downloaded file/directory.\n\n If yes, delete it. If no, checks to see if the old version should be\n reused. If yes, it's reused; otherwise, Cookiecutter exits.\n\n :param path: Previously downloaded zipfile.\n :param no_input: Suppress prompt to delete repo and just delete it.\n :return: True if the content was deleted\n \"\"\"\n # Suppress prompt if called via API\n if no_input:\n ok_to_delete = True\n else:\n question = (\n \"You've downloaded {} before. 
Is it okay to delete and re-download it?\"\n ).format(path)\n\n ok_to_delete = read_user_yes_no(question, 'yes')\n\n if ok_to_delete:\n if os.path.isdir(path):\n rmtree(path)\n else:\n os.remove(path)\n return True\n else:\n ok_to_reuse = read_user_yes_no(\n \"Do you want to re-use the existing version?\", 'yes'\n )\n\n if ok_to_reuse:\n return False\n\n sys.exit()\n", "path": "cookiecutter/utils.py"}], "after_files": [{"content": "\"\"\"Helper functions used throughout Cookiecutter.\"\"\"\nimport contextlib\nimport errno\nimport logging\nimport os\nimport shutil\nimport stat\nimport sys\n\nfrom cookiecutter.prompt import read_user_yes_no\n\nlogger = logging.getLogger(__name__)\n\n\ndef force_delete(func, path, exc_info):\n \"\"\"Error handler for `shutil.rmtree()` equivalent to `rm -rf`.\n\n Usage: `shutil.rmtree(path, onerror=force_delete)`\n From https://docs.python.org/3/library/shutil.html#rmtree-example\n \"\"\"\n os.chmod(path, stat.S_IWRITE)\n func(path)\n\n\ndef rmtree(path):\n \"\"\"Remove a directory and all its contents. Like rm -rf on Unix.\n\n :param path: A directory path.\n \"\"\"\n shutil.rmtree(path, onerror=force_delete)\n\n\ndef make_sure_path_exists(path):\n \"\"\"Ensure that a directory exists.\n\n :param path: A directory path.\n \"\"\"\n logger.debug('Making sure path exists: %s', path)\n try:\n os.makedirs(path)\n logger.debug('Created directory at: %s', path)\n except OSError as exception:\n if exception.errno != errno.EEXIST:\n return False\n return True\n\n\[email protected]\ndef work_in(dirname=None):\n \"\"\"Context manager version of os.chdir.\n\n When exited, returns to the working directory prior to entering.\n \"\"\"\n curdir = os.getcwd()\n try:\n if dirname is not None:\n os.chdir(dirname)\n yield\n finally:\n os.chdir(curdir)\n\n\ndef make_executable(script_path):\n \"\"\"Make `script_path` executable.\n\n :param script_path: The file to change\n \"\"\"\n status = os.stat(script_path)\n os.chmod(script_path, status.st_mode | stat.S_IEXEC)\n\n\ndef prompt_and_delete(path, no_input=False):\n \"\"\"\n Ask user if it's okay to delete the previously-downloaded file/directory.\n\n If yes, delete it. If no, checks to see if the old version should be\n reused. If yes, it's reused; otherwise, Cookiecutter exits.\n\n :param path: Previously downloaded zipfile.\n :param no_input: Suppress prompt to delete repo and just delete it.\n :return: True if the content was deleted\n \"\"\"\n # Suppress prompt if called via API\n if no_input:\n ok_to_delete = True\n else:\n question = (\n \"You've downloaded {} before. Is it okay to delete and re-download it?\"\n ).format(path)\n\n ok_to_delete = read_user_yes_no(question, 'yes')\n\n if ok_to_delete:\n if os.path.isdir(path):\n rmtree(path)\n else:\n os.remove(path)\n return True\n else:\n ok_to_reuse = read_user_yes_no(\n \"Do you want to re-use the existing version?\", 'yes'\n )\n\n if ok_to_reuse:\n return False\n\n sys.exit()\n", "path": "cookiecutter/utils.py"}]} | 1,133 | 133 |
gh_patches_debug_35096 | rasdani/github-patches | git_diff | kornia__kornia-1687 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
kornia.augmentation.resize does not have antialias flag, unlike kornia.geometry.transform.resize
### Describe the bug
Check
https://kornia.readthedocs.io/en/latest/_modules/kornia/augmentation/_2d/geometric/resize.html#Resize
versus
https://kornia.readthedocs.io/en/latest/geometry.transform.html#kornia.geometry.transform.Resize
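In other words, the report asks for parity along these lines (a sketch; the `antialias` keyword on the augmentation class is the *requested* API, not yet present at the time of the report):

```python
import torch
import kornia.augmentation as K
import kornia.geometry.transform as T

img = torch.rand(1, 3, 128, 128)
out = T.resize(img, (64, 64), antialias=True)  # supported by the functional API
aug = K.Resize((64, 64), antialias=True)       # requested; raises TypeError today
```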
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `kornia/augmentation/_2d/geometric/resize.py`
Content:
```
1 from typing import Dict, Optional, Tuple, Union, cast
2
3 import torch
4 from torch import Tensor
5
6 from kornia.augmentation import random_generator as rg
7 from kornia.augmentation._2d.geometric.base import GeometricAugmentationBase2D
8 from kornia.constants import Resample
9 from kornia.geometry.transform import crop_by_transform_mat, get_perspective_transform, resize
10 from kornia.utils import eye_like
11
12
13 class Resize(GeometricAugmentationBase2D):
14 """Resize to size.
15
16 Args:
17 size: Size (h, w) in pixels of the resized region or just one side.
18 side: Which side to resize, if size is only of type int.
19 resample: Resampling mode.
20 align_corners: interpolation flag.
21 keepdim: whether to keep the output shape the same as input (True) or broadcast it
22 to the batch form (False).
23 """
24
25 def __init__(
26 self,
27 size: Union[int, Tuple[int, int]],
28 side: str = "short",
29 resample: Union[str, int, Resample] = Resample.BILINEAR.name,
30 align_corners: bool = True,
31 p: float = 1.0,
32 return_transform: Optional[bool] = None,
33 keepdim: bool = False,
34 ) -> None:
35 super().__init__(p=1., return_transform=return_transform, same_on_batch=True, p_batch=p, keepdim=keepdim)
36 self._param_generator = cast(rg.ResizeGenerator, rg.ResizeGenerator(resize_to=size, side=side))
37 self.flags = dict(size=size, side=side, resample=Resample.get(resample), align_corners=align_corners)
38
39 def compute_transformation(self, input: Tensor, params: Dict[str, Tensor]) -> Tensor:
40 if params["output_size"] == input.shape[-2:]:
41 return eye_like(3, input)
42
43 transform: Tensor = get_perspective_transform(params["src"], params["dst"])
44 transform = transform.expand(input.shape[0], -1, -1)
45 return transform
46
47 def apply_transform(
48 self, input: Tensor, params: Dict[str, Tensor], transform: Optional[Tensor] = None
49 ) -> Tensor:
50 B, C, _, _ = input.shape
51 out_size = tuple(params["output_size"][0].tolist())
52 out = torch.empty(B, C, *out_size, device=input.device, dtype=input.dtype)
53 for i in range(B):
54 x1 = int(params["src"][i, 0, 0])
55 x2 = int(params["src"][i, 1, 0]) + 1
56 y1 = int(params["src"][i, 0, 1])
57 y2 = int(params["src"][i, 3, 1]) + 1
58 out[i] = resize(
59 input[i : i + 1, :, y1:y2, x1:x2],
60 out_size,
61 interpolation=(self.flags["resample"].name).lower(),
62 align_corners=self.flags["align_corners"],
63 )
64 return out
65
66 def inverse_transform(
67 self,
68 input: Tensor,
69 transform: Optional[Tensor] = None,
70 size: Optional[Tuple[int, int]] = None,
71 **kwargs,
72 ) -> Tensor:
73 size = cast(Tuple[int, int], size)
74 mode = self.flags["resample"].name.lower() if "mode" not in kwargs else kwargs["mode"]
75 align_corners = self.flags["align_corners"] if "align_corners" not in kwargs else kwargs["align_corners"]
76 padding_mode = "zeros" if "padding_mode" not in kwargs else kwargs["padding_mode"]
77 transform = cast(Tensor, transform)
78 return crop_by_transform_mat(input, transform[:, :2, :], size, mode, padding_mode, align_corners)
79
80
81 class LongestMaxSize(Resize):
82 """Rescale an image so that maximum side is equal to max_size, keeping the aspect ratio of the initial image.
83
84 Args:
85 max_size: maximum size of the image after the transformation.
86 """
87
88 def __init__(
89 self,
90 max_size: int,
91 resample: Union[str, int, Resample] = Resample.BILINEAR.name,
92 align_corners: bool = True,
93 p: float = 1.0,
94 return_transform: Optional[bool] = None,
95 ) -> None:
96 # TODO: Support max_size list input to randomly select from
97 super().__init__(
98 size=max_size,
99 side="long",
100 resample=resample,
101 return_transform=return_transform,
102 align_corners=align_corners,
103 p=p,
104 )
105
106
107 class SmallestMaxSize(Resize):
108 """Rescale an image so that minimum side is equal to max_size, keeping the aspect ratio of the initial image.
109
110 Args:
111 max_size: maximum size of the image after the transformation.
112 """
113
114 def __init__(
115 self,
116 max_size: int,
117 resample: Union[str, int, Resample] = Resample.BILINEAR.name,
118 align_corners: bool = True,
119 p: float = 1.0,
120 return_transform: Optional[bool] = None,
121 ) -> None:
122 # TODO: Support max_size list input to randomly select from
123 super().__init__(
124 size=max_size,
125 side="short",
126 resample=resample,
127 return_transform=return_transform,
128 align_corners=align_corners,
129 p=p,
130 )
131
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/kornia/augmentation/_2d/geometric/resize.py b/kornia/augmentation/_2d/geometric/resize.py
--- a/kornia/augmentation/_2d/geometric/resize.py
+++ b/kornia/augmentation/_2d/geometric/resize.py
@@ -18,6 +18,7 @@
side: Which side to resize, if size is only of type int.
resample: Resampling mode.
align_corners: interpolation flag.
+ antialias: if True, then image will be filtered with Gaussian before downscaling. No effect for upscaling.
keepdim: whether to keep the output shape the same as input (True) or broadcast it
to the batch form (False).
"""
@@ -28,13 +29,20 @@
side: str = "short",
resample: Union[str, int, Resample] = Resample.BILINEAR.name,
align_corners: bool = True,
+ antialias: bool = False,
p: float = 1.0,
return_transform: Optional[bool] = None,
keepdim: bool = False,
) -> None:
super().__init__(p=1., return_transform=return_transform, same_on_batch=True, p_batch=p, keepdim=keepdim)
self._param_generator = cast(rg.ResizeGenerator, rg.ResizeGenerator(resize_to=size, side=side))
- self.flags = dict(size=size, side=side, resample=Resample.get(resample), align_corners=align_corners)
+ self.flags = dict(
+ size=size,
+ side=side,
+ resample=Resample.get(resample),
+ align_corners=align_corners,
+ antialias=antialias
+ )
def compute_transformation(self, input: Tensor, params: Dict[str, Tensor]) -> Tensor:
if params["output_size"] == input.shape[-2:]:
@@ -60,6 +68,7 @@
out_size,
interpolation=(self.flags["resample"].name).lower(),
align_corners=self.flags["align_corners"],
+ antialias=self.flags["antialias"]
)
return out
| {"golden_diff": "diff --git a/kornia/augmentation/_2d/geometric/resize.py b/kornia/augmentation/_2d/geometric/resize.py\n--- a/kornia/augmentation/_2d/geometric/resize.py\n+++ b/kornia/augmentation/_2d/geometric/resize.py\n@@ -18,6 +18,7 @@\n side: Which side to resize, if size is only of type int.\n resample: Resampling mode.\n align_corners: interpolation flag.\n+ antialias: if True, then image will be filtered with Gaussian before downscaling. No effect for upscaling.\n keepdim: whether to keep the output shape the same as input (True) or broadcast it\n to the batch form (False).\n \"\"\"\n@@ -28,13 +29,20 @@\n side: str = \"short\",\n resample: Union[str, int, Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n+ antialias: bool = False,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n keepdim: bool = False,\n ) -> None:\n super().__init__(p=1., return_transform=return_transform, same_on_batch=True, p_batch=p, keepdim=keepdim)\n self._param_generator = cast(rg.ResizeGenerator, rg.ResizeGenerator(resize_to=size, side=side))\n- self.flags = dict(size=size, side=side, resample=Resample.get(resample), align_corners=align_corners)\n+ self.flags = dict(\n+ size=size,\n+ side=side,\n+ resample=Resample.get(resample),\n+ align_corners=align_corners,\n+ antialias=antialias\n+ )\n \n def compute_transformation(self, input: Tensor, params: Dict[str, Tensor]) -> Tensor:\n if params[\"output_size\"] == input.shape[-2:]:\n@@ -60,6 +68,7 @@\n out_size,\n interpolation=(self.flags[\"resample\"].name).lower(),\n align_corners=self.flags[\"align_corners\"],\n+ antialias=self.flags[\"antialias\"]\n )\n return out\n", "issue": "kornia.augmentation.resize does not have antialias flag, unlike kornia.geometry.transform.resize\n### Describe the bug\r\n\r\nCheck \r\nhttps://kornia.readthedocs.io/en/latest/_modules/kornia/augmentation/_2d/geometric/resize.html#Resize\r\n\r\nversus \r\nhttps://kornia.readthedocs.io/en/latest/geometry.transform.html#kornia.geometry.transform.Resize\n", "before_files": [{"content": "from typing import Dict, Optional, Tuple, Union, cast\n\nimport torch\nfrom torch import Tensor\n\nfrom kornia.augmentation import random_generator as rg\nfrom kornia.augmentation._2d.geometric.base import GeometricAugmentationBase2D\nfrom kornia.constants import Resample\nfrom kornia.geometry.transform import crop_by_transform_mat, get_perspective_transform, resize\nfrom kornia.utils import eye_like\n\n\nclass Resize(GeometricAugmentationBase2D):\n \"\"\"Resize to size.\n\n Args:\n size: Size (h, w) in pixels of the resized region or just one side.\n side: Which side to resize, if size is only of type int.\n resample: Resampling mode.\n align_corners: interpolation flag.\n keepdim: whether to keep the output shape the same as input (True) or broadcast it\n to the batch form (False).\n \"\"\"\n\n def __init__(\n self,\n size: Union[int, Tuple[int, int]],\n side: str = \"short\",\n resample: Union[str, int, Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n keepdim: bool = False,\n ) -> None:\n super().__init__(p=1., return_transform=return_transform, same_on_batch=True, p_batch=p, keepdim=keepdim)\n self._param_generator = cast(rg.ResizeGenerator, rg.ResizeGenerator(resize_to=size, side=side))\n self.flags = dict(size=size, side=side, resample=Resample.get(resample), align_corners=align_corners)\n\n def compute_transformation(self, input: Tensor, params: Dict[str, Tensor]) -> Tensor:\n if 
params[\"output_size\"] == input.shape[-2:]:\n return eye_like(3, input)\n\n transform: Tensor = get_perspective_transform(params[\"src\"], params[\"dst\"])\n transform = transform.expand(input.shape[0], -1, -1)\n return transform\n\n def apply_transform(\n self, input: Tensor, params: Dict[str, Tensor], transform: Optional[Tensor] = None\n ) -> Tensor:\n B, C, _, _ = input.shape\n out_size = tuple(params[\"output_size\"][0].tolist())\n out = torch.empty(B, C, *out_size, device=input.device, dtype=input.dtype)\n for i in range(B):\n x1 = int(params[\"src\"][i, 0, 0])\n x2 = int(params[\"src\"][i, 1, 0]) + 1\n y1 = int(params[\"src\"][i, 0, 1])\n y2 = int(params[\"src\"][i, 3, 1]) + 1\n out[i] = resize(\n input[i : i + 1, :, y1:y2, x1:x2],\n out_size,\n interpolation=(self.flags[\"resample\"].name).lower(),\n align_corners=self.flags[\"align_corners\"],\n )\n return out\n\n def inverse_transform(\n self,\n input: Tensor,\n transform: Optional[Tensor] = None,\n size: Optional[Tuple[int, int]] = None,\n **kwargs,\n ) -> Tensor:\n size = cast(Tuple[int, int], size)\n mode = self.flags[\"resample\"].name.lower() if \"mode\" not in kwargs else kwargs[\"mode\"]\n align_corners = self.flags[\"align_corners\"] if \"align_corners\" not in kwargs else kwargs[\"align_corners\"]\n padding_mode = \"zeros\" if \"padding_mode\" not in kwargs else kwargs[\"padding_mode\"]\n transform = cast(Tensor, transform)\n return crop_by_transform_mat(input, transform[:, :2, :], size, mode, padding_mode, align_corners)\n\n\nclass LongestMaxSize(Resize):\n \"\"\"Rescale an image so that maximum side is equal to max_size, keeping the aspect ratio of the initial image.\n\n Args:\n max_size: maximum size of the image after the transformation.\n \"\"\"\n\n def __init__(\n self,\n max_size: int,\n resample: Union[str, int, Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n ) -> None:\n # TODO: Support max_size list input to randomly select from\n super().__init__(\n size=max_size,\n side=\"long\",\n resample=resample,\n return_transform=return_transform,\n align_corners=align_corners,\n p=p,\n )\n\n\nclass SmallestMaxSize(Resize):\n \"\"\"Rescale an image so that minimum side is equal to max_size, keeping the aspect ratio of the initial image.\n\n Args:\n max_size: maximum size of the image after the transformation.\n \"\"\"\n\n def __init__(\n self,\n max_size: int,\n resample: Union[str, int, Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n ) -> None:\n # TODO: Support max_size list input to randomly select from\n super().__init__(\n size=max_size,\n side=\"short\",\n resample=resample,\n return_transform=return_transform,\n align_corners=align_corners,\n p=p,\n )\n", "path": "kornia/augmentation/_2d/geometric/resize.py"}], "after_files": [{"content": "from typing import Dict, Optional, Tuple, Union, cast\n\nimport torch\nfrom torch import Tensor\n\nfrom kornia.augmentation import random_generator as rg\nfrom kornia.augmentation._2d.geometric.base import GeometricAugmentationBase2D\nfrom kornia.constants import Resample\nfrom kornia.geometry.transform import crop_by_transform_mat, get_perspective_transform, resize\nfrom kornia.utils import eye_like\n\n\nclass Resize(GeometricAugmentationBase2D):\n \"\"\"Resize to size.\n\n Args:\n size: Size (h, w) in pixels of the resized region or just one side.\n side: Which side to resize, if size is only of type int.\n resample: 
Resampling mode.\n align_corners: interpolation flag.\n antialias: if True, then image will be filtered with Gaussian before downscaling. No effect for upscaling.\n keepdim: whether to keep the output shape the same as input (True) or broadcast it\n to the batch form (False).\n \"\"\"\n\n def __init__(\n self,\n size: Union[int, Tuple[int, int]],\n side: str = \"short\",\n resample: Union[str, int, Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n antialias: bool = False,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n keepdim: bool = False,\n ) -> None:\n super().__init__(p=1., return_transform=return_transform, same_on_batch=True, p_batch=p, keepdim=keepdim)\n self._param_generator = cast(rg.ResizeGenerator, rg.ResizeGenerator(resize_to=size, side=side))\n self.flags = dict(\n size=size,\n side=side,\n resample=Resample.get(resample),\n align_corners=align_corners,\n antialias=antialias\n )\n\n def compute_transformation(self, input: Tensor, params: Dict[str, Tensor]) -> Tensor:\n if params[\"output_size\"] == input.shape[-2:]:\n return eye_like(3, input)\n\n transform: Tensor = get_perspective_transform(params[\"src\"], params[\"dst\"])\n transform = transform.expand(input.shape[0], -1, -1)\n return transform\n\n def apply_transform(\n self, input: Tensor, params: Dict[str, Tensor], transform: Optional[Tensor] = None\n ) -> Tensor:\n B, C, _, _ = input.shape\n out_size = tuple(params[\"output_size\"][0].tolist())\n out = torch.empty(B, C, *out_size, device=input.device, dtype=input.dtype)\n for i in range(B):\n x1 = int(params[\"src\"][i, 0, 0])\n x2 = int(params[\"src\"][i, 1, 0]) + 1\n y1 = int(params[\"src\"][i, 0, 1])\n y2 = int(params[\"src\"][i, 3, 1]) + 1\n out[i] = resize(\n input[i : i + 1, :, y1:y2, x1:x2],\n out_size,\n interpolation=(self.flags[\"resample\"].name).lower(),\n align_corners=self.flags[\"align_corners\"],\n antialias=self.flags[\"antialias\"]\n )\n return out\n\n def inverse_transform(\n self,\n input: Tensor,\n transform: Optional[Tensor] = None,\n size: Optional[Tuple[int, int]] = None,\n **kwargs,\n ) -> Tensor:\n size = cast(Tuple[int, int], size)\n mode = self.flags[\"resample\"].name.lower() if \"mode\" not in kwargs else kwargs[\"mode\"]\n align_corners = self.flags[\"align_corners\"] if \"align_corners\" not in kwargs else kwargs[\"align_corners\"]\n padding_mode = \"zeros\" if \"padding_mode\" not in kwargs else kwargs[\"padding_mode\"]\n transform = cast(Tensor, transform)\n return crop_by_transform_mat(input, transform[:, :2, :], size, mode, padding_mode, align_corners)\n\n\nclass LongestMaxSize(Resize):\n \"\"\"Rescale an image so that maximum side is equal to max_size, keeping the aspect ratio of the initial image.\n\n Args:\n max_size: maximum size of the image after the transformation.\n \"\"\"\n\n def __init__(\n self,\n max_size: int,\n resample: Union[str, int, Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n ) -> None:\n # TODO: Support max_size list input to randomly select from\n super().__init__(\n size=max_size,\n side=\"long\",\n resample=resample,\n return_transform=return_transform,\n align_corners=align_corners,\n p=p,\n )\n\n\nclass SmallestMaxSize(Resize):\n \"\"\"Rescale an image so that minimum side is equal to max_size, keeping the aspect ratio of the initial image.\n\n Args:\n max_size: maximum size of the image after the transformation.\n \"\"\"\n\n def __init__(\n self,\n max_size: int,\n resample: Union[str, int, 
Resample] = Resample.BILINEAR.name,\n align_corners: bool = True,\n p: float = 1.0,\n return_transform: Optional[bool] = None,\n ) -> None:\n # TODO: Support max_size list input to randomly select from\n super().__init__(\n size=max_size,\n side=\"short\",\n resample=resample,\n return_transform=return_transform,\n align_corners=align_corners,\n p=p,\n )\n", "path": "kornia/augmentation/_2d/geometric/resize.py"}]} | 1,829 | 479 |
gh_patches_debug_1356 | rasdani/github-patches | git_diff | kserve__kserve-2103 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Cannot install required version of numpy on M1 mac
/kind bug
Issue:
Installation on python 3.8 or 3.9 (and presumably all versions of Python) of the v0.8.0 release candidate fails due to the pinned requirement of numpy.
Expected behavior:
kserve's release candidate for 0.8 can be installed on an M1 mac.
Extra information:
https://github.com/numpy/numpy/releases/tag/v1.21.0 numpy 1.21+ allows installation on M1 macs.
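A sketch of the resulting constraint (a hypothetical requirements fragment, not from kserve's actual files):

```
numpy>=1.21.0    # first numpy release installable natively on Apple Silicon
pandas>=1.3      # a pandas line compatible with numpy 1.21+
```

i.e. the fix is to lift whatever old pin (here the `pandas == 0.25.3` pin shown in the file below) forces dependency resolution onto a numpy older than 1.21.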
**Environment:**
- OS (e.g. from `/etc/os-release`): M1 mac
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/lgbserver/setup.py`
Content:
```
1 # Copyright 2021 The KServe Authors.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from setuptools import setup, find_packages
16
17 tests_require = [
18 'pytest',
19 'pytest-asyncio',
20 'pytest-tornasync',
21 'mypy'
22 ]
23
24 setup(
25 name='lgbserver',
26 version='0.7.0',
27 author_email='[email protected]',
28 license='../../LICENSE.txt',
29 url='https://github.com/kserve/kserve/python/lgbserver',
30 description='Model Server implementation for LightGBM. \
31 Not intended for use outside KServe Frameworks Images',
32 long_description=open('README.md').read(),
33 python_requires='>3.4',
34 packages=find_packages("lgbserver"),
35 install_requires=[
36 "kserve>=0.7.0",
37 "lightgbm == 3.3.2",
38 "pandas == 0.25.3",
39 "argparse >= 1.4.0",
40 ],
41 tests_require=tests_require,
42 extras_require={'test': tests_require}
43 )
44
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/python/lgbserver/setup.py b/python/lgbserver/setup.py
--- a/python/lgbserver/setup.py
+++ b/python/lgbserver/setup.py
@@ -35,7 +35,7 @@
install_requires=[
"kserve>=0.7.0",
"lightgbm == 3.3.2",
- "pandas == 0.25.3",
+ "pandas == 1.3.5",
"argparse >= 1.4.0",
],
tests_require=tests_require,
| {"golden_diff": "diff --git a/python/lgbserver/setup.py b/python/lgbserver/setup.py\n--- a/python/lgbserver/setup.py\n+++ b/python/lgbserver/setup.py\n@@ -35,7 +35,7 @@\n install_requires=[\n \"kserve>=0.7.0\",\n \"lightgbm == 3.3.2\",\n- \"pandas == 0.25.3\",\n+ \"pandas == 1.3.5\",\n \"argparse >= 1.4.0\",\n ],\n tests_require=tests_require,\n", "issue": "Cannot install required version of numpy on M1 mac\n/kind bug\r\n\r\nIssue:\r\nInstallation on python 3.8 or 3.9 (and presumably all versions of Python) of the v0.8.0 release candidate fails due to the pinned requirement of numpy.\r\n\r\nExpected behavior:\r\nkserve's release candidate for 0.8 can be installed on an M1 mac.\r\n\r\nExtra information:\r\nhttps://github.com/numpy/numpy/releases/tag/v1.21.0 numpy 1.21+ allows installation on M1 macs.\r\n\r\n\r\n**Environment:**\r\n\r\n- OS (e.g. from `/etc/os-release`): M1 mac\r\n\n", "before_files": [{"content": "# Copyright 2021 The KServe Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'pytest',\n 'pytest-asyncio',\n 'pytest-tornasync',\n 'mypy'\n]\n\nsetup(\n name='lgbserver',\n version='0.7.0',\n author_email='[email protected]',\n license='../../LICENSE.txt',\n url='https://github.com/kserve/kserve/python/lgbserver',\n description='Model Server implementation for LightGBM. \\\n Not intended for use outside KServe Frameworks Images',\n long_description=open('README.md').read(),\n python_requires='>3.4',\n packages=find_packages(\"lgbserver\"),\n install_requires=[\n \"kserve>=0.7.0\",\n \"lightgbm == 3.3.2\",\n \"pandas == 0.25.3\",\n \"argparse >= 1.4.0\",\n ],\n tests_require=tests_require,\n extras_require={'test': tests_require}\n)\n", "path": "python/lgbserver/setup.py"}], "after_files": [{"content": "# Copyright 2021 The KServe Authors.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'pytest',\n 'pytest-asyncio',\n 'pytest-tornasync',\n 'mypy'\n]\n\nsetup(\n name='lgbserver',\n version='0.7.0',\n author_email='[email protected]',\n license='../../LICENSE.txt',\n url='https://github.com/kserve/kserve/python/lgbserver',\n description='Model Server implementation for LightGBM. 
\\\n Not intended for use outside KServe Frameworks Images',\n long_description=open('README.md').read(),\n python_requires='>3.4',\n packages=find_packages(\"lgbserver\"),\n install_requires=[\n \"kserve>=0.7.0\",\n \"lightgbm == 3.3.2\",\n \"pandas == 1.3.5\",\n \"argparse >= 1.4.0\",\n ],\n tests_require=tests_require,\n extras_require={'test': tests_require}\n)\n", "path": "python/lgbserver/setup.py"}]} | 823 | 124 |
gh_patches_debug_18073 | rasdani/github-patches | git_diff | kartoza__prj.app-895 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Display the version number in the footer

In the footer, next to "Available on github", we can display the version number from this file: https://github.com/kartoza/projecta/blob/develop/django_project/.version
To be able to know between staging and production which version we are running
Sentry is already reading this file: https://github.com/kartoza/projecta/blob/develop/django_project/core/settings/prod.py#L47
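A minimal sketch of the same read for the footer (editorial; the helper, tag name, and template wiring are assumptions, not from the report):

```python
def current_version(version_file='.version'):
    # Resolve version_file the same way the Sentry config presumably does.
    try:
        with open(version_file) as f:
            return f.read().strip()
    except IOError:
        return 'Unknown'
```

exposed to templates via a simple tag or context processor, so the footer can render it next to the GitHub link.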
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `django_project/lesson/templatetags/lesson_tags.py`
Content:
```
1 # coding=utf-8
2 """Custom tags for lesson app."""
3
4 from django import template
5 from django.utils.safestring import mark_safe
6
7 register = template.Library()
8
9
10 @register.filter(name='is_translation_up_to_date')
11 def is_translation_up_to_date(value):
12 if not value.is_translation_up_to_date:
13 return mark_safe(
14             '<span title="Translation is outdated"><sup>❗</sup></span>')
15 else:
16 return mark_safe('')
17
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/django_project/lesson/templatetags/lesson_tags.py b/django_project/lesson/templatetags/lesson_tags.py
--- a/django_project/lesson/templatetags/lesson_tags.py
+++ b/django_project/lesson/templatetags/lesson_tags.py
@@ -1,8 +1,9 @@
# coding=utf-8
"""Custom tags for lesson app."""
-
from django import template
from django.utils.safestring import mark_safe
+from core.settings.utils import absolute_path
+
register = template.Library()
@@ -14,3 +15,16 @@
     '<span title="Translation is outdated"><sup>❗</sup></span>')
else:
return mark_safe('')
+
+
[email protected]_tag(takes_context=True)
+def version_tag(context):
+ """Reads current project release from the .version file."""
+ version_file = absolute_path('.version')
+ try:
+ with open(version_file, 'r') as file:
+ version = file.read()
+ context['version'] = version
+ except IOError:
+ context['version'] = 'Unknown'
+ return context['version']
| {"golden_diff": "diff --git a/django_project/lesson/templatetags/lesson_tags.py b/django_project/lesson/templatetags/lesson_tags.py\n--- a/django_project/lesson/templatetags/lesson_tags.py\n+++ b/django_project/lesson/templatetags/lesson_tags.py\n@@ -1,8 +1,9 @@\n # coding=utf-8\n \"\"\"Custom tags for lesson app.\"\"\"\n-\n from django import template\n from django.utils.safestring import mark_safe\n+from core.settings.utils import absolute_path\n+\n \n register = template.Library()\n \n@@ -14,3 +15,16 @@\n '<span title=\"Translation is outdated\"><sup>❗</sup></span>')\n else:\n return mark_safe('')\n+\n+\[email protected]_tag(takes_context=True)\n+def version_tag(context):\n+ \"\"\"Reads current project release from the .version file.\"\"\"\n+ version_file = absolute_path('.version')\n+ try:\n+ with open(version_file, 'r') as file:\n+ version = file.read()\n+ context['version'] = version\n+ except IOError:\n+ context['version'] = 'Unknown'\n+ return context['version']\n", "issue": "Display the version number in the footer\n\r\n\r\nIn the footer, next to \"Available on github\", we can display the version number from this file: https://github.com/kartoza/projecta/blob/develop/django_project/.version\r\n\r\nTo be able to know between staging and production which version we are running\r\n\r\nSentry is already reading this file: https://github.com/kartoza/projecta/blob/develop/django_project/core/settings/prod.py#L47\n", "before_files": [{"content": "# coding=utf-8\n\"\"\"Custom tags for lesson app.\"\"\"\n\nfrom django import template\nfrom django.utils.safestring import mark_safe\n\nregister = template.Library()\n\n\[email protected](name='is_translation_up_to_date')\ndef is_translation_up_to_date(value):\n if not value.is_translation_up_to_date:\n return mark_safe(\n '<span title=\"Translation is outdated\"><sup>❗</sup></span>')\n else:\n return mark_safe('')\n", "path": "django_project/lesson/templatetags/lesson_tags.py"}], "after_files": [{"content": "# coding=utf-8\n\"\"\"Custom tags for lesson app.\"\"\"\nfrom django import template\nfrom django.utils.safestring import mark_safe\nfrom core.settings.utils import absolute_path\n\n\nregister = template.Library()\n\n\[email protected](name='is_translation_up_to_date')\ndef is_translation_up_to_date(value):\n if not value.is_translation_up_to_date:\n return mark_safe(\n '<span title=\"Translation is outdated\"><sup>❗</sup></span>')\n else:\n return mark_safe('')\n\n\[email protected]_tag(takes_context=True)\ndef version_tag(context):\n \"\"\"Reads current project release from the .version file.\"\"\"\n version_file = absolute_path('.version')\n try:\n with open(version_file, 'r') as file:\n version = file.read()\n context['version'] = version\n except IOError:\n context['version'] = 'Unknown'\n return context['version']\n", "path": "django_project/lesson/templatetags/lesson_tags.py"}]} | 571 | 264 |
gh_patches_debug_1969 | rasdani/github-patches | git_diff | kserve__kserve-1053 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Tabular Explainer e2e test failing
/kind bug
```
(base) C02YJ034JGH5:~ dsun20$ kubectl logs isvc-explainer-tabular-explainer-default-7cnkj-deployment-4q4hn -n kfserving-ci-e2e-test kfserving-container
[I 200828 13:12:28 font_manager:1423] Generating new fontManager, this may take some time...
Traceback (most recent call last):
File "/usr/local/lib/python3.7/runpy.py", line 183, in _run_module_as_main
mod_name, mod_spec, code = _get_module_details(mod_name, _Error)
File "/usr/local/lib/python3.7/runpy.py", line 142, in _get_module_details
return _get_module_details(pkg_main_name, error)
File "/usr/local/lib/python3.7/runpy.py", line 109, in _get_module_details
__import__(pkg_name)
File "/alibiexplainer/alibiexplainer/__init__.py", line 15, in <module>
from .explainer import AlibiExplainer
File "/alibiexplainer/alibiexplainer/explainer.py", line 21, in <module>
from alibiexplainer.anchor_images import AnchorImages
File "/alibiexplainer/alibiexplainer/anchor_images.py", line 17, in <module>
import alibi
File "/usr/local/lib/python3.7/site-packages/alibi/__init__.py", line 1, in <module>
from . import confidence, datasets, explainers, utils
File "/usr/local/lib/python3.7/site-packages/alibi/explainers/__init__.py", line 11, in <module>
from .kernel_shap import KernelShap
File "/usr/local/lib/python3.7/site-packages/alibi/explainers/kernel_shap.py", line 11, in <module>
from shap.common import DenseData, DenseDataWithIndex
ModuleNotFoundError: No module named 'shap.common'
```
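A note on the failing import above: `shap.common` was dropped when shap was restructured around release 0.36, so until alibi catches up the usual remedy is pinning shap (sketch below; the exact version ceiling is an assumption based on that restructuring, though it matches the eventual fix):

```python
# setup.py sketch — 0.35 is the last shap release that still ships shap.common
install_requires = [
    "shap==0.35",
    "alibi==0.4.0",
]
```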
**What did you expect to happen:**
**Anything else you would like to add:**
[Miscellaneous information that will assist in solving the issue.]
**Environment:**
- Istio Version:
- Knative Version:
- KFServing Version:
- Kubeflow version:
- Kfdef:[k8s_istio/istio_dex/gcp_basic_auth/gcp_iap/aws/aws_cognito/ibm]
- Minikube version:
- Kubernetes version: (use `kubectl version`):
- OS (e.g. from `/etc/os-release`):
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `python/alibiexplainer/setup.py`
Content:
```
1 # Copyright 2019 kubeflow.org.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from setuptools import setup, find_packages
16
17 tests_require = [
18 'pytest',
19 'pytest-tornasync',
20 'mypy'
21 ]
22
23 setup(
24 name='alibiexplainer',
25 version='0.4.0',
26 author_email='[email protected]',
27 license='../../LICENSE.txt',
28 url='https://github.com/kubeflow/kfserving/python/kfserving/alibiexplainer',
29 description='Model Explaination Server. \
30 Not intended for use outside KFServing Frameworks Images',
31 long_description=open('README.md').read(),
32 python_requires='>=3.6',
33 packages=find_packages("alibiexplainer"),
34 install_requires=[
35 "kfserving>=0.4.0",
36 "alibi==0.4.0",
37 "scikit-learn>=0.20.3",
38 "argparse>=1.4.0",
39 "requests>=2.22.0",
40 "joblib>=0.13.2",
41 "pandas>=0.24.2",
42 "numpy>=1.16.3",
43 "dill>=0.3.0",
44 "spacy>=2.1.4"
45 ],
46 tests_require=tests_require,
47 extras_require={'test': tests_require}
48 )
49
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/python/alibiexplainer/setup.py b/python/alibiexplainer/setup.py
--- a/python/alibiexplainer/setup.py
+++ b/python/alibiexplainer/setup.py
@@ -32,6 +32,7 @@
python_requires='>=3.6',
packages=find_packages("alibiexplainer"),
install_requires=[
+ "shap==0.35",
"kfserving>=0.4.0",
"alibi==0.4.0",
"scikit-learn>=0.20.3",
| {"golden_diff": "diff --git a/python/alibiexplainer/setup.py b/python/alibiexplainer/setup.py\n--- a/python/alibiexplainer/setup.py\n+++ b/python/alibiexplainer/setup.py\n@@ -32,6 +32,7 @@\n python_requires='>=3.6',\n packages=find_packages(\"alibiexplainer\"),\n install_requires=[\n+ \"shap==0.35\",\n \"kfserving>=0.4.0\",\n \"alibi==0.4.0\",\n \"scikit-learn>=0.20.3\",\n", "issue": "Tabular Explainer e2e test failing\n/kind bug\r\n\r\n```\r\n(base) C02YJ034JGH5:~ dsun20$ kubectl logs isvc-explainer-tabular-explainer-default-7cnkj-deployment-4q4hn -n kfserving-ci-e2e-test kfserving-container\r\n[I 200828 13:12:28 font_manager:1423] Generating new fontManager, this may take some time...\r\nTraceback (most recent call last):\r\n File \"/usr/local/lib/python3.7/runpy.py\", line 183, in _run_module_as_main\r\n mod_name, mod_spec, code = _get_module_details(mod_name, _Error)\r\n File \"/usr/local/lib/python3.7/runpy.py\", line 142, in _get_module_details\r\n return _get_module_details(pkg_main_name, error)\r\n File \"/usr/local/lib/python3.7/runpy.py\", line 109, in _get_module_details\r\n __import__(pkg_name)\r\n File \"/alibiexplainer/alibiexplainer/__init__.py\", line 15, in <module>\r\n from .explainer import AlibiExplainer\r\n File \"/alibiexplainer/alibiexplainer/explainer.py\", line 21, in <module>\r\n from alibiexplainer.anchor_images import AnchorImages\r\n File \"/alibiexplainer/alibiexplainer/anchor_images.py\", line 17, in <module>\r\n import alibi\r\n File \"/usr/local/lib/python3.7/site-packages/alibi/__init__.py\", line 1, in <module>\r\n from . import confidence, datasets, explainers, utils\r\n File \"/usr/local/lib/python3.7/site-packages/alibi/explainers/__init__.py\", line 11, in <module>\r\n from .kernel_shap import KernelShap\r\n File \"/usr/local/lib/python3.7/site-packages/alibi/explainers/kernel_shap.py\", line 11, in <module>\r\n from shap.common import DenseData, DenseDataWithIndex\r\nModuleNotFoundError: No module named 'shap.common'\r\n```\r\n\r\n\r\n**What did you expect to happen:**\r\n\r\n\r\n**Anything else you would like to add:**\r\n[Miscellaneous information that will assist in solving the issue.]\r\n\r\n\r\n**Environment:**\r\n\r\n- Istio Version:\r\n- Knative Version:\r\n- KFServing Version:\r\n- Kubeflow version:\r\n- Kfdef:[k8s_istio/istio_dex/gcp_basic_auth/gcp_iap/aws/aws_cognito/ibm]\r\n- Minikube version:\r\n- Kubernetes version: (use `kubectl version`):\r\n- OS (e.g. from `/etc/os-release`):\r\n\n", "before_files": [{"content": "# Copyright 2019 kubeflow.org.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'pytest',\n 'pytest-tornasync',\n 'mypy'\n]\n\nsetup(\n name='alibiexplainer',\n version='0.4.0',\n author_email='[email protected]',\n license='../../LICENSE.txt',\n url='https://github.com/kubeflow/kfserving/python/kfserving/alibiexplainer',\n description='Model Explaination Server. 
\\\n Not intended for use outside KFServing Frameworks Images',\n long_description=open('README.md').read(),\n python_requires='>=3.6',\n packages=find_packages(\"alibiexplainer\"),\n install_requires=[\n \"kfserving>=0.4.0\",\n \"alibi==0.4.0\",\n \"scikit-learn>=0.20.3\",\n \"argparse>=1.4.0\",\n \"requests>=2.22.0\",\n \"joblib>=0.13.2\",\n \"pandas>=0.24.2\",\n \"numpy>=1.16.3\",\n \"dill>=0.3.0\",\n \"spacy>=2.1.4\"\n ],\n tests_require=tests_require,\n extras_require={'test': tests_require}\n)\n", "path": "python/alibiexplainer/setup.py"}], "after_files": [{"content": "# Copyright 2019 kubeflow.org.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\nfrom setuptools import setup, find_packages\n\ntests_require = [\n 'pytest',\n 'pytest-tornasync',\n 'mypy'\n]\n\nsetup(\n name='alibiexplainer',\n version='0.4.0',\n author_email='[email protected]',\n license='../../LICENSE.txt',\n url='https://github.com/kubeflow/kfserving/python/kfserving/alibiexplainer',\n description='Model Explaination Server. \\\n Not intended for use outside KFServing Frameworks Images',\n long_description=open('README.md').read(),\n python_requires='>=3.6',\n packages=find_packages(\"alibiexplainer\"),\n install_requires=[\n \"shap==0.35\",\n \"kfserving>=0.4.0\",\n \"alibi==0.4.0\",\n \"scikit-learn>=0.20.3\",\n \"argparse>=1.4.0\",\n \"requests>=2.22.0\",\n \"joblib>=0.13.2\",\n \"pandas>=0.24.2\",\n \"numpy>=1.16.3\",\n \"dill>=0.3.0\",\n \"spacy>=2.1.4\"\n ],\n tests_require=tests_require,\n extras_require={'test': tests_require}\n)\n", "path": "python/alibiexplainer/setup.py"}]} | 1,365 | 123 |
gh_patches_debug_2597 | rasdani/github-patches | git_diff | cookiecutter__cookiecutter-539 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Increase development status to 'beta' or 'stable'.
I think we can say the project is waaaay beyond alpha. :wink:
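Concretely, this is a one-line trove-classifier change in `setup.py` (a sketch; both replacement strings are standard PyPI classifiers, and the choice between them is the maintainers' call):

```python
'Development Status :: 3 - Alpha',            # current value
# -> either of:
'Development Status :: 4 - Beta',
'Development Status :: 5 - Production/Stable',
```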
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2
3 import os
4 import sys
5
6 try:
7 from setuptools import setup
8 except ImportError:
9 from distutils.core import setup
10
11 version = "1.1.0"
12
13 if sys.argv[-1] == 'publish':
14 os.system('python setup.py sdist upload')
15 os.system('python setup.py bdist_wheel upload')
16 sys.exit()
17
18 if sys.argv[-1] == 'tag':
19 os.system("git tag -a %s -m 'version %s'" % (version, version))
20 os.system("git push --tags")
21 sys.exit()
22
23 with open('README.rst') as readme_file:
24 readme = readme_file.read()
25
26 with open('HISTORY.rst') as history_file:
27 history = history_file.read().replace('.. :changelog:', '')
28
29 requirements = [
30 'future>=0.15.2',
31 'binaryornot>=0.2.0',
32 'jinja2>=2.7',
33 'PyYAML>=3.10',
34 'click>=5.0',
35 'whichcraft>=0.1.1'
36 ]
37
38 long_description = readme + '\n\n' + history
39
40 if sys.argv[-1] == 'readme':
41 print(long_description)
42 sys.exit()
43
44
45 setup(
46 name='cookiecutter',
47 version=version,
48 description=('A command-line utility that creates projects from project '
49 'templates, e.g. creating a Python package project from a '
50 'Python package project template.'),
51 long_description=long_description,
52 author='Audrey Roy',
53 author_email='[email protected]',
54 url='https://github.com/audreyr/cookiecutter',
55 packages=[
56 'cookiecutter',
57 ],
58 package_dir={'cookiecutter': 'cookiecutter'},
59 entry_points={
60 'console_scripts': [
61 'cookiecutter = cookiecutter.cli:main',
62 ]
63 },
64 include_package_data=True,
65 install_requires=requirements,
66 license='BSD',
67 zip_safe=False,
68 classifiers=[
69 'Development Status :: 3 - Alpha',
70 'Environment :: Console',
71 'Intended Audience :: Developers',
72 'Natural Language :: English',
73 'License :: OSI Approved :: BSD License',
74 'Programming Language :: Python',
75 'Programming Language :: Python :: 2',
76 'Programming Language :: Python :: 2.7',
77 'Programming Language :: Python :: 3',
78 'Programming Language :: Python :: 3.3',
79 'Programming Language :: Python :: 3.4',
80 'Programming Language :: Python :: 3.5',
81 'Programming Language :: Python :: Implementation :: CPython',
82 'Programming Language :: Python :: Implementation :: PyPy',
83 'Topic :: Software Development',
84 ],
85 keywords=(
86 'cookiecutter, Python, projects, project templates, Jinja2, '
87 'skeleton, scaffolding, project directory, setup.py, package, '
88 'packaging'
89 ),
90 )
91
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -66,7 +66,7 @@
license='BSD',
zip_safe=False,
classifiers=[
- 'Development Status :: 3 - Alpha',
+ 'Development Status :: 5 - Production/Stable',
'Environment :: Console',
'Intended Audience :: Developers',
'Natural Language :: English',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -66,7 +66,7 @@\n license='BSD',\n zip_safe=False,\n classifiers=[\n- 'Development Status :: 3 - Alpha',\n+ 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'Natural Language :: English',\n", "issue": "Increase development status to 'beta' or 'stable'.\nI think we can say the project is waaaay beyond alpha. :wink: \n\n", "before_files": [{"content": "#!/usr/bin/env python\n\nimport os\nimport sys\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\nversion = \"1.1.0\"\n\nif sys.argv[-1] == 'publish':\n os.system('python setup.py sdist upload')\n os.system('python setup.py bdist_wheel upload')\n sys.exit()\n\nif sys.argv[-1] == 'tag':\n os.system(\"git tag -a %s -m 'version %s'\" % (version, version))\n os.system(\"git push --tags\")\n sys.exit()\n\nwith open('README.rst') as readme_file:\n readme = readme_file.read()\n\nwith open('HISTORY.rst') as history_file:\n history = history_file.read().replace('.. :changelog:', '')\n\nrequirements = [\n 'future>=0.15.2',\n 'binaryornot>=0.2.0',\n 'jinja2>=2.7',\n 'PyYAML>=3.10',\n 'click>=5.0',\n 'whichcraft>=0.1.1'\n]\n\nlong_description = readme + '\\n\\n' + history\n\nif sys.argv[-1] == 'readme':\n print(long_description)\n sys.exit()\n\n\nsetup(\n name='cookiecutter',\n version=version,\n description=('A command-line utility that creates projects from project '\n 'templates, e.g. creating a Python package project from a '\n 'Python package project template.'),\n long_description=long_description,\n author='Audrey Roy',\n author_email='[email protected]',\n url='https://github.com/audreyr/cookiecutter',\n packages=[\n 'cookiecutter',\n ],\n package_dir={'cookiecutter': 'cookiecutter'},\n entry_points={\n 'console_scripts': [\n 'cookiecutter = cookiecutter.cli:main',\n ]\n },\n include_package_data=True,\n install_requires=requirements,\n license='BSD',\n zip_safe=False,\n classifiers=[\n 'Development Status :: 3 - Alpha',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'Natural Language :: English',\n 'License :: OSI Approved :: BSD License',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Topic :: Software Development',\n ],\n keywords=(\n 'cookiecutter, Python, projects, project templates, Jinja2, '\n 'skeleton, scaffolding, project directory, setup.py, package, '\n 'packaging'\n ),\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\nimport os\nimport sys\n\ntry:\n from setuptools import setup\nexcept ImportError:\n from distutils.core import setup\n\nversion = \"1.1.0\"\n\nif sys.argv[-1] == 'publish':\n os.system('python setup.py sdist upload')\n os.system('python setup.py bdist_wheel upload')\n sys.exit()\n\nif sys.argv[-1] == 'tag':\n os.system(\"git tag -a %s -m 'version %s'\" % (version, version))\n os.system(\"git push --tags\")\n sys.exit()\n\nwith open('README.rst') as readme_file:\n readme = readme_file.read()\n\nwith open('HISTORY.rst') as history_file:\n history = history_file.read().replace('.. 
:changelog:', '')\n\nrequirements = [\n 'future>=0.15.2',\n 'binaryornot>=0.2.0',\n 'jinja2>=2.7',\n 'PyYAML>=3.10',\n 'click>=5.0',\n 'whichcraft>=0.1.1'\n]\n\nlong_description = readme + '\\n\\n' + history\n\nif sys.argv[-1] == 'readme':\n print(long_description)\n sys.exit()\n\n\nsetup(\n name='cookiecutter',\n version=version,\n description=('A command-line utility that creates projects from project '\n 'templates, e.g. creating a Python package project from a '\n 'Python package project template.'),\n long_description=long_description,\n author='Audrey Roy',\n author_email='[email protected]',\n url='https://github.com/audreyr/cookiecutter',\n packages=[\n 'cookiecutter',\n ],\n package_dir={'cookiecutter': 'cookiecutter'},\n entry_points={\n 'console_scripts': [\n 'cookiecutter = cookiecutter.cli:main',\n ]\n },\n include_package_data=True,\n install_requires=requirements,\n license='BSD',\n zip_safe=False,\n classifiers=[\n 'Development Status :: 5 - Production/Stable',\n 'Environment :: Console',\n 'Intended Audience :: Developers',\n 'Natural Language :: English',\n 'License :: OSI Approved :: BSD License',\n 'Programming Language :: Python',\n 'Programming Language :: Python :: 2',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3',\n 'Programming Language :: Python :: 3.3',\n 'Programming Language :: Python :: 3.4',\n 'Programming Language :: Python :: 3.5',\n 'Programming Language :: Python :: Implementation :: CPython',\n 'Programming Language :: Python :: Implementation :: PyPy',\n 'Topic :: Software Development',\n ],\n keywords=(\n 'cookiecutter, Python, projects, project templates, Jinja2, '\n 'skeleton, scaffolding, project directory, setup.py, package, '\n 'packaging'\n ),\n)\n", "path": "setup.py"}]} | 1,099 | 90 |
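For context on the fix above: trove classifiers are plain strings that PyPI matches against a canonical list, so a typo or stale phase silently mis-categorizes the package. A minimal, self-contained check of the development-status phase (the phase list below is hardcoded for illustration; real tooling such as the `trove-classifiers` package carries the full canonical set):

```python
# Sketch: verify a Development Status classifier against the known phases.
KNOWN_PHASES = {
    "1 - Planning", "2 - Pre-Alpha", "3 - Alpha", "4 - Beta",
    "5 - Production/Stable", "6 - Mature", "7 - Inactive",
}

def check_dev_status(classifiers):
    """Return the development-status phase, raising if it is unknown."""
    prefix = "Development Status :: "
    for c in classifiers:
        if c.startswith(prefix):
            phase = c[len(prefix):]
            if phase not in KNOWN_PHASES:
                raise ValueError(f"unknown development status: {phase!r}")
            return phase
    return None

# Example: the patched cookiecutter metadata now reports a stable phase.
print(check_dev_status(["Development Status :: 5 - Production/Stable"]))
```

Against this check, the pre-patch `3 - Alpha` and the post-patch `5 - Production/Stable` both validate; the issue is purely about which phase describes the project.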
gh_patches_debug_36589 | rasdani/github-patches | git_diff | freedomofpress__securedrop-146 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Journalists should be able to bulk download from document server
Right now, journalists can only download one file at a time, even if there are dozens of new submissions in any given session. The New Yorker team asked if we could enable bulk downloading so that journalists can download multiple files at once.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `modules/deaddrop/files/deaddrop/journalist.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import os
3 from datetime import datetime
4 import uuid
5
6 from flask import Flask, request, render_template, send_file
7 from flask_wtf.csrf import CsrfProtect
8
9 import config, version, crypto, store, background
10
11 app = Flask(__name__, template_folder=config.JOURNALIST_TEMPLATES_DIR)
12 app.secret_key = config.SECRET_KEY
13
14 app.jinja_env.globals['version'] = version.__version__
15
16 def get_docs(sid):
17 """Get docs associated with source id `sid` sorted by submission date"""
18 docs = []
19 for filename in os.listdir(store.path(sid)):
20 os_stat = os.stat(store.path(sid, filename))
21 docs.append(dict(
22 name=filename,
23 date=str(datetime.fromtimestamp(os_stat.st_mtime)),
24 size=os_stat.st_size,
25 ))
26 # sort by date since ordering by filename is meaningless
27 docs.sort(key=lambda x: x['date'])
28 return docs
29
30 @app.after_request
31 def no_cache(response):
32 """Minimize potential traces of site access by telling the browser not to
33 cache anything"""
34 no_cache_headers = {
35 'Cache-Control': 'no-cache, no-store, must-revalidate',
36 'Pragma': 'no-cache',
37 'Expires': '-1',
38 }
39 for header, header_value in no_cache_headers.iteritems():
40 response.headers.add(header, header_value)
41 return response
42
43 @app.route('/')
44 def index():
45 dirs = os.listdir(config.STORE_DIR)
46 cols = []
47 for d in dirs:
48 cols.append(dict(
49 name=d,
50 sid=crypto.displayid(d),
51 date=str(datetime.fromtimestamp(os.stat(store.path(d)).st_mtime)).split('.')[0]
52 ))
53 cols.sort(key=lambda x: x['date'], reverse=True)
54 return render_template('index.html', cols=cols)
55
56 @app.route('/col/<sid>')
57 def col(sid):
58 return render_template("col.html", sid=sid, codename=crypto.displayid(sid),
59 docs=get_docs(sid), haskey=crypto.getkey(sid))
60
61 @app.route('/col/<sid>/<fn>')
62 def doc(sid, fn):
63 if '..' in fn or fn.startswith('/'):
64 abort(404)
65 return send_file(store.path(sid, fn), mimetype="application/pgp-encrypted")
66
67 @app.route('/reply', methods=('POST',))
68 def reply():
69 sid, msg = request.form['sid'], request.form['msg']
70 crypto.encrypt(crypto.getkey(sid), request.form['msg'], output=
71 store.path(sid, 'reply-%s.gpg' % uuid.uuid4()))
72 return render_template('reply.html', sid=sid, codename=crypto.displayid(sid))
73
74 @app.route('/delete', methods=('POST',))
75 def delete():
76 sid = request.form['sid']
77 doc_names_selected = request.form.getlist('doc_names_selected')
78 docs_selected = [doc for doc in get_docs(sid) if doc['name'] in doc_names_selected]
79 confirm_delete = bool(request.form.get('confirm_delete', False))
80 if confirm_delete:
81 for doc in docs_selected:
82 fn = store.path(sid, doc['name'])
83 crypto.secureunlink(fn)
84 return render_template('delete.html', sid=sid, codename=crypto.displayid(sid),
85 docs_selected=docs_selected, confirm_delete=confirm_delete)
86
87 if __name__ == "__main__":
88 # TODO: make sure this gets run by the web server
89 CsrfProtect(app)
90 app.run(debug=True, port=8081)
91
```
Path: `modules/deaddrop/files/deaddrop/store.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 import os
3 import re
4 import config
5
6 VALIDATE_FILENAME = re.compile("^(reply-)?[a-f0-9-]+(_msg|_doc|)\.gpg$").match
7
8 class PathException(Exception):
9 '''An exception raised by `store.verify` when it encounters a bad path. A path
10 can be bad when it is not absolute, not normalized, not within
11 `config.STORE_DIR`, or doesn't match the filename format.
12 '''
13 pass
14
15 def verify(p):
16 '''Assert that the path is absolute, normalized, inside `config.STORE_DIR`, and
17 matches the filename format.
18 '''
19 if not os.path.isabs(config.STORE_DIR):
20 raise PathException("config.STORE_DIR(%s) is not absolute" % (
21 config.STORE_DIR, ))
22
23 # os.path.abspath makes the path absolute and normalizes '/foo/../bar' to
24 # '/bar', etc. We have to check that the path is normalized before checking
25 # that it starts with the `config.STORE_DIR` or else a malicious actor could
26 # append a bunch of '../../..' to access files outside of the store.
27 if not p == os.path.abspath(p):
28 raise PathException("The path is not absolute and/or normalized")
29
30 if os.path.commonprefix([config.STORE_DIR, p]) != config.STORE_DIR:
31 raise PathException("Invalid directory %s" % (p, ))
32
33 filename = os.path.basename(p)
34 ext = os.path.splitext(filename)[-1]
35
36 if os.path.isfile(p):
37 if ext != '.gpg':
38 # if there's an extension, verify it's a GPG
39 raise PathException("Invalid file extension %s" % (ext, ))
40
41 if not VALIDATE_FILENAME(filename):
42 raise PathException("Invalid filename %s" % (filename, ))
43
44 def path(*s):
45 '''Get the normalized, absolute file path, within `config.STORE_DIR`.'''
46 joined = os.path.join(os.path.abspath(config.STORE_DIR), *s)
47 absolute = os.path.abspath(joined)
48 verify(absolute)
49 return absolute
50
51 def log(msg):
52 file(path('NOTES'), 'a').write(msg)
53
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/modules/deaddrop/files/deaddrop/journalist.py b/modules/deaddrop/files/deaddrop/journalist.py
--- a/modules/deaddrop/files/deaddrop/journalist.py
+++ b/modules/deaddrop/files/deaddrop/journalist.py
@@ -71,11 +71,22 @@
store.path(sid, 'reply-%s.gpg' % uuid.uuid4()))
return render_template('reply.html', sid=sid, codename=crypto.displayid(sid))
-@app.route('/delete', methods=('POST',))
-def delete():
+@app.route('/bulk', methods=('POST',))
+def bulk():
+ action = request.form['action']
+
sid = request.form['sid']
doc_names_selected = request.form.getlist('doc_names_selected')
- docs_selected = [doc for doc in get_docs(sid) if doc['name'] in doc_names_selected]
+ docs_selected = [ doc for doc in get_docs(sid) if doc['name'] in doc_names_selected ]
+
+ if action == 'download':
+ return bulk_download(sid, docs_selected)
+ elif action == 'delete':
+ return bulk_delete(sid, docs_selected)
+ else:
+ abort(422)
+
+def bulk_delete(sid, docs_selected):
confirm_delete = bool(request.form.get('confirm_delete', False))
if confirm_delete:
for doc in docs_selected:
@@ -84,6 +95,13 @@
return render_template('delete.html', sid=sid, codename=crypto.displayid(sid),
docs_selected=docs_selected, confirm_delete=confirm_delete)
+def bulk_download(sid, docs_selected):
+ filenames = [store.path(sid, doc['name']) for doc in docs_selected]
+ zip = store.get_bulk_archive(filenames)
+ return send_file(zip, mimetype="application/zip", attachment_filename=crypto.displayid(sid), as_attachment=True)
+
+
+
if __name__ == "__main__":
# TODO: make sure this gets run by the web server
CsrfProtect(app)
diff --git a/modules/deaddrop/files/deaddrop/store.py b/modules/deaddrop/files/deaddrop/store.py
--- a/modules/deaddrop/files/deaddrop/store.py
+++ b/modules/deaddrop/files/deaddrop/store.py
@@ -2,6 +2,9 @@
import os
import re
import config
+import zipfile
+import crypto
+import uuid
VALIDATE_FILENAME = re.compile("^(reply-)?[a-f0-9-]+(_msg|_doc|)\.gpg$").match
@@ -48,5 +51,14 @@
verify(absolute)
return absolute
+def get_bulk_archive(filenames):
+ zip_file_name = os.path.join(config.TEMP_DIR, str(uuid.uuid4()) + '.zip')
+ with zipfile.ZipFile(zip_file_name, 'w') as zip:
+ for filename in filenames:
+ verify(filename)
+ basename = os.path.basename(filename)
+ zip.write(filename, arcname=basename)
+ return zip_file_name
+
def log(msg):
file(path('NOTES'), 'a').write(msg)
| {"golden_diff": "diff --git a/modules/deaddrop/files/deaddrop/journalist.py b/modules/deaddrop/files/deaddrop/journalist.py\n--- a/modules/deaddrop/files/deaddrop/journalist.py\n+++ b/modules/deaddrop/files/deaddrop/journalist.py\n@@ -71,11 +71,22 @@\n store.path(sid, 'reply-%s.gpg' % uuid.uuid4()))\n return render_template('reply.html', sid=sid, codename=crypto.displayid(sid))\n \[email protected]('/delete', methods=('POST',))\n-def delete():\[email protected]('/bulk', methods=('POST',))\n+def bulk():\n+ action = request.form['action']\n+\n sid = request.form['sid']\n doc_names_selected = request.form.getlist('doc_names_selected')\n- docs_selected = [doc for doc in get_docs(sid) if doc['name'] in doc_names_selected]\n+ docs_selected = [ doc for doc in get_docs(sid) if doc['name'] in doc_names_selected ]\n+\n+ if action == 'download':\n+ return bulk_download(sid, docs_selected)\n+ elif action == 'delete':\n+ return bulk_delete(sid, docs_selected)\n+ else:\n+ abort(422)\n+\n+def bulk_delete(sid, docs_selected):\n confirm_delete = bool(request.form.get('confirm_delete', False))\n if confirm_delete:\n for doc in docs_selected:\n@@ -84,6 +95,13 @@\n return render_template('delete.html', sid=sid, codename=crypto.displayid(sid),\n docs_selected=docs_selected, confirm_delete=confirm_delete)\n \n+def bulk_download(sid, docs_selected):\n+ filenames = [store.path(sid, doc['name']) for doc in docs_selected]\n+ zip = store.get_bulk_archive(filenames)\n+ return send_file(zip, mimetype=\"application/zip\", attachment_filename=crypto.displayid(sid), as_attachment=True)\n+\n+\n+ \n if __name__ == \"__main__\":\n # TODO: make sure this gets run by the web server\n CsrfProtect(app)\ndiff --git a/modules/deaddrop/files/deaddrop/store.py b/modules/deaddrop/files/deaddrop/store.py\n--- a/modules/deaddrop/files/deaddrop/store.py\n+++ b/modules/deaddrop/files/deaddrop/store.py\n@@ -2,6 +2,9 @@\n import os\n import re\n import config\n+import zipfile\n+import crypto\n+import uuid\n \n VALIDATE_FILENAME = re.compile(\"^(reply-)?[a-f0-9-]+(_msg|_doc|)\\.gpg$\").match\n \n@@ -48,5 +51,14 @@\n verify(absolute)\n return absolute\n \n+def get_bulk_archive(filenames):\n+ zip_file_name = os.path.join(config.TEMP_DIR, str(uuid.uuid4()) + '.zip')\n+ with zipfile.ZipFile(zip_file_name, 'w') as zip:\n+ for filename in filenames:\n+ verify(filename)\n+ basename = os.path.basename(filename)\n+ zip.write(filename, arcname=basename)\n+ return zip_file_name\n+\n def log(msg):\n file(path('NOTES'), 'a').write(msg)\n", "issue": "Journalists should be able to bulk download from document server\nRight now, journalists can only download one file at a time, even if there are dozens new submissions in any given session. The New Yorker team asked if we can enable bulk downloading so that journalists can download multiple files at once. 
\n\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\nimport os\nfrom datetime import datetime\nimport uuid\n\nfrom flask import Flask, request, render_template, send_file\nfrom flask_wtf.csrf import CsrfProtect\n\nimport config, version, crypto, store, background\n\napp = Flask(__name__, template_folder=config.JOURNALIST_TEMPLATES_DIR)\napp.secret_key = config.SECRET_KEY\n\napp.jinja_env.globals['version'] = version.__version__\n\ndef get_docs(sid):\n \"\"\"Get docs associated with source id `sid` sorted by submission date\"\"\"\n docs = []\n for filename in os.listdir(store.path(sid)):\n os_stat = os.stat(store.path(sid, filename))\n docs.append(dict(\n name=filename,\n date=str(datetime.fromtimestamp(os_stat.st_mtime)),\n size=os_stat.st_size,\n ))\n # sort by date since ordering by filename is meaningless\n docs.sort(key=lambda x: x['date'])\n return docs\n\[email protected]_request\ndef no_cache(response):\n \"\"\"Minimize potential traces of site access by telling the browser not to\n cache anything\"\"\"\n no_cache_headers = {\n 'Cache-Control': 'no-cache, no-store, must-revalidate',\n 'Pragma': 'no-cache',\n 'Expires': '-1',\n }\n for header, header_value in no_cache_headers.iteritems():\n response.headers.add(header, header_value)\n return response\n\[email protected]('/')\ndef index():\n dirs = os.listdir(config.STORE_DIR)\n cols = []\n for d in dirs:\n cols.append(dict(\n name=d,\n sid=crypto.displayid(d),\n date=str(datetime.fromtimestamp(os.stat(store.path(d)).st_mtime)).split('.')[0]\n ))\n cols.sort(key=lambda x: x['date'], reverse=True)\n return render_template('index.html', cols=cols)\n\[email protected]('/col/<sid>')\ndef col(sid):\n return render_template(\"col.html\", sid=sid, codename=crypto.displayid(sid),\n docs=get_docs(sid), haskey=crypto.getkey(sid))\n\[email protected]('/col/<sid>/<fn>')\ndef doc(sid, fn):\n if '..' in fn or fn.startswith('/'):\n abort(404)\n return send_file(store.path(sid, fn), mimetype=\"application/pgp-encrypted\")\n\[email protected]('/reply', methods=('POST',))\ndef reply():\n sid, msg = request.form['sid'], request.form['msg']\n crypto.encrypt(crypto.getkey(sid), request.form['msg'], output=\n store.path(sid, 'reply-%s.gpg' % uuid.uuid4()))\n return render_template('reply.html', sid=sid, codename=crypto.displayid(sid))\n\[email protected]('/delete', methods=('POST',))\ndef delete():\n sid = request.form['sid']\n doc_names_selected = request.form.getlist('doc_names_selected')\n docs_selected = [doc for doc in get_docs(sid) if doc['name'] in doc_names_selected]\n confirm_delete = bool(request.form.get('confirm_delete', False))\n if confirm_delete:\n for doc in docs_selected:\n fn = store.path(sid, doc['name'])\n crypto.secureunlink(fn)\n return render_template('delete.html', sid=sid, codename=crypto.displayid(sid),\n docs_selected=docs_selected, confirm_delete=confirm_delete)\n\nif __name__ == \"__main__\":\n # TODO: make sure this gets run by the web server\n CsrfProtect(app)\n app.run(debug=True, port=8081)\n", "path": "modules/deaddrop/files/deaddrop/journalist.py"}, {"content": "# -*- coding: utf-8 -*-\nimport os\nimport re\nimport config\n\nVALIDATE_FILENAME = re.compile(\"^(reply-)?[a-f0-9-]+(_msg|_doc|)\\.gpg$\").match\n\nclass PathException(Exception):\n '''An exception raised by `store.verify` when it encounters a bad path. 
A path\n can be bad when it is not absolute, not normalized, not within\n `config.STORE_DIR`, or doesn't match the filename format.\n '''\n pass\n\ndef verify(p):\n '''Assert that the path is absolute, normalized, inside `config.STORE_DIR`, and\n matches the filename format.\n '''\n if not os.path.isabs(config.STORE_DIR):\n raise PathException(\"config.STORE_DIR(%s) is not absolute\" % (\n config.STORE_DIR, ))\n\n # os.path.abspath makes the path absolute and normalizes '/foo/../bar' to\n # '/bar', etc. We have to check that the path is normalized before checking\n # that it starts with the `config.STORE_DIR` or else a malicious actor could\n # append a bunch of '../../..' to access files outside of the store.\n if not p == os.path.abspath(p):\n raise PathException(\"The path is not absolute and/or normalized\")\n\n if os.path.commonprefix([config.STORE_DIR, p]) != config.STORE_DIR:\n raise PathException(\"Invalid directory %s\" % (p, ))\n\n filename = os.path.basename(p)\n ext = os.path.splitext(filename)[-1]\n\n if os.path.isfile(p):\n if ext != '.gpg':\n # if there's an extension, verify it's a GPG\n raise PathException(\"Invalid file extension %s\" % (ext, ))\n\n if not VALIDATE_FILENAME(filename):\n raise PathException(\"Invalid filename %s\" % (filename, ))\n\ndef path(*s):\n '''Get the normalized, absolute file path, within `config.STORE_DIR`.'''\n joined = os.path.join(os.path.abspath(config.STORE_DIR), *s)\n absolute = os.path.abspath(joined)\n verify(absolute)\n return absolute\n\ndef log(msg):\n file(path('NOTES'), 'a').write(msg)\n", "path": "modules/deaddrop/files/deaddrop/store.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\nimport os\nfrom datetime import datetime\nimport uuid\n\nfrom flask import Flask, request, render_template, send_file\nfrom flask_wtf.csrf import CsrfProtect\n\nimport config, version, crypto, store, background\n\napp = Flask(__name__, template_folder=config.JOURNALIST_TEMPLATES_DIR)\napp.secret_key = config.SECRET_KEY\n\napp.jinja_env.globals['version'] = version.__version__\n\ndef get_docs(sid):\n \"\"\"Get docs associated with source id `sid` sorted by submission date\"\"\"\n docs = []\n for filename in os.listdir(store.path(sid)):\n os_stat = os.stat(store.path(sid, filename))\n docs.append(dict(\n name=filename,\n date=str(datetime.fromtimestamp(os_stat.st_mtime)),\n size=os_stat.st_size,\n ))\n # sort by date since ordering by filename is meaningless\n docs.sort(key=lambda x: x['date'])\n return docs\n\[email protected]_request\ndef no_cache(response):\n \"\"\"Minimize potential traces of site access by telling the browser not to\n cache anything\"\"\"\n no_cache_headers = {\n 'Cache-Control': 'no-cache, no-store, must-revalidate',\n 'Pragma': 'no-cache',\n 'Expires': '-1',\n }\n for header, header_value in no_cache_headers.iteritems():\n response.headers.add(header, header_value)\n return response\n\[email protected]('/')\ndef index():\n dirs = os.listdir(config.STORE_DIR)\n cols = []\n for d in dirs:\n cols.append(dict(\n name=d,\n sid=crypto.displayid(d),\n date=str(datetime.fromtimestamp(os.stat(store.path(d)).st_mtime)).split('.')[0]\n ))\n cols.sort(key=lambda x: x['date'], reverse=True)\n return render_template('index.html', cols=cols)\n\[email protected]('/col/<sid>')\ndef col(sid):\n return render_template(\"col.html\", sid=sid, codename=crypto.displayid(sid),\n docs=get_docs(sid), haskey=crypto.getkey(sid))\n\[email protected]('/col/<sid>/<fn>')\ndef doc(sid, fn):\n if '..' 
in fn or fn.startswith('/'):\n abort(404)\n return send_file(store.path(sid, fn), mimetype=\"application/pgp-encrypted\")\n\[email protected]('/reply', methods=('POST',))\ndef reply():\n sid, msg = request.form['sid'], request.form['msg']\n crypto.encrypt(crypto.getkey(sid), request.form['msg'], output=\n store.path(sid, 'reply-%s.gpg' % uuid.uuid4()))\n return render_template('reply.html', sid=sid, codename=crypto.displayid(sid))\n\[email protected]('/bulk', methods=('POST',))\ndef bulk():\n action = request.form['action']\n\n sid = request.form['sid']\n doc_names_selected = request.form.getlist('doc_names_selected')\n docs_selected = [ doc for doc in get_docs(sid) if doc['name'] in doc_names_selected ]\n\n if action == 'download':\n return bulk_download(sid, docs_selected)\n elif action == 'delete':\n return bulk_delete(sid, docs_selected)\n else:\n abort(422)\n\ndef bulk_delete(sid, docs_selected):\n confirm_delete = bool(request.form.get('confirm_delete', False))\n if confirm_delete:\n for doc in docs_selected:\n fn = store.path(sid, doc['name'])\n crypto.secureunlink(fn)\n return render_template('delete.html', sid=sid, codename=crypto.displayid(sid),\n docs_selected=docs_selected, confirm_delete=confirm_delete)\n\ndef bulk_download(sid, docs_selected):\n filenames = [store.path(sid, doc['name']) for doc in docs_selected]\n zip = store.get_bulk_archive(filenames)\n return send_file(zip, mimetype=\"application/zip\", attachment_filename=crypto.displayid(sid), as_attachment=True)\n\n\n \nif __name__ == \"__main__\":\n # TODO: make sure this gets run by the web server\n CsrfProtect(app)\n app.run(debug=True, port=8081)\n", "path": "modules/deaddrop/files/deaddrop/journalist.py"}, {"content": "# -*- coding: utf-8 -*-\nimport os\nimport re\nimport config\nimport zipfile\nimport crypto\nimport uuid\n\nVALIDATE_FILENAME = re.compile(\"^(reply-)?[a-f0-9-]+(_msg|_doc|)\\.gpg$\").match\n\nclass PathException(Exception):\n '''An exception raised by `store.verify` when it encounters a bad path. A path\n can be bad when it is not absolute, not normalized, not within\n `config.STORE_DIR`, or doesn't match the filename format.\n '''\n pass\n\ndef verify(p):\n '''Assert that the path is absolute, normalized, inside `config.STORE_DIR`, and\n matches the filename format.\n '''\n if not os.path.isabs(config.STORE_DIR):\n raise PathException(\"config.STORE_DIR(%s) is not absolute\" % (\n config.STORE_DIR, ))\n\n # os.path.abspath makes the path absolute and normalizes '/foo/../bar' to\n # '/bar', etc. We have to check that the path is normalized before checking\n # that it starts with the `config.STORE_DIR` or else a malicious actor could\n # append a bunch of '../../..' 
to access files outside of the store.\n if not p == os.path.abspath(p):\n raise PathException(\"The path is not absolute and/or normalized\")\n\n if os.path.commonprefix([config.STORE_DIR, p]) != config.STORE_DIR:\n raise PathException(\"Invalid directory %s\" % (p, ))\n\n filename = os.path.basename(p)\n ext = os.path.splitext(filename)[-1]\n\n if os.path.isfile(p):\n if ext != '.gpg':\n # if there's an extension, verify it's a GPG\n raise PathException(\"Invalid file extension %s\" % (ext, ))\n\n if not VALIDATE_FILENAME(filename):\n raise PathException(\"Invalid filename %s\" % (filename, ))\n\ndef path(*s):\n '''Get the normalized, absolute file path, within `config.STORE_DIR`.'''\n joined = os.path.join(os.path.abspath(config.STORE_DIR), *s)\n absolute = os.path.abspath(joined)\n verify(absolute)\n return absolute\n\ndef get_bulk_archive(filenames):\n zip_file_name = os.path.join(config.TEMP_DIR, str(uuid.uuid4()) + '.zip')\n with zipfile.ZipFile(zip_file_name, 'w') as zip:\n for filename in filenames:\n verify(filename)\n basename = os.path.basename(filename)\n zip.write(filename, arcname=basename)\n return zip_file_name\n\ndef log(msg):\n file(path('NOTES'), 'a').write(msg)\n", "path": "modules/deaddrop/files/deaddrop/store.py"}]} | 1,853 | 694 |
gh_patches_debug_51622 | rasdani/github-patches | git_diff | akvo__akvo-rsr-3604 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Organisation report shown in project reports page
The "Project overview" report is displayed on the project report page, which is an organisation report and should not be displayed on the project report page.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `akvo/rest/views/report.py`
Content:
```
1 # -*- coding: utf-8 -*-
2
3 # Akvo RSR is covered by the GNU Affero General Public License.
4 # See more details in the license.txt file located at the root folder of the Akvo RSR module.
5 # For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.
6
7 from django.db.models import Q
8 from django.shortcuts import get_object_or_404
9 from rest_framework import status
10 from rest_framework.decorators import api_view
11 from rest_framework.response import Response
12
13 from akvo.rsr.models import Report, ReportFormat, Project
14 from ..serializers import ReportSerializer, ReportFormatSerializer
15 from ..viewsets import BaseRSRViewSet
16
17
18 class ReportViewSet(BaseRSRViewSet):
19 """Viewset providing Result data."""
20
21 queryset = Report.objects.prefetch_related(
22 'organisations',
23 'formats',
24 )
25 serializer_class = ReportSerializer
26
27 def get_queryset(self):
28 """
29 Allow custom filter for sync_owner, since this field has been replaced by the
30 reporting org partnership.
31 """
32 reports = super(ReportViewSet, self).get_queryset()
33 user = self.request.user
34 is_admin = user.is_active and (user.is_superuser or user.is_admin)
35 if not is_admin:
36 # Show only those reports that the user is allowed to see
37 approved_orgs = user.approved_organisations() if not user.is_anonymous() else []
38 reports = reports.filter(
39 Q(organisations=None) | Q(organisations__in=approved_orgs)
40 ).distinct()
41 return reports
42
43
44 @api_view(['GET'])
45 def report_formats(request):
46 """
47 A view for displaying all report format information.
48 """
49 return Response({
50 'count': ReportFormat.objects.all().count(),
51 'results': [ReportFormatSerializer(f).data for f in ReportFormat.objects.all()],
52 })
53
54
55 @api_view(['GET'])
56 def project_reports(request, project_pk):
57 """A view for displaying project specific reports."""
58
59 project = get_object_or_404(Project, pk=project_pk)
60 reports = Report.objects.prefetch_related('formats', 'organisations')\
61 .filter(url__icontains='project')
62
63 user = request.user
64 if not user.has_perm('rsr.view_project', project):
65 return Response('Request not allowed', status=status.HTTP_403_FORBIDDEN)
66
67 is_admin = user.is_active and (user.is_superuser or user.is_admin)
68
69 if not is_admin:
70 partners_org = project.partner_organisation_pks()
71 reports = reports.filter(
72 Q(organisations=None) | Q(organisations__in=partners_org)
73 )
74
75 serializer = ReportSerializer(reports.distinct(), many=True)
76 return Response(serializer.data)
77
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/akvo/rest/views/report.py b/akvo/rest/views/report.py
--- a/akvo/rest/views/report.py
+++ b/akvo/rest/views/report.py
@@ -58,7 +58,7 @@
project = get_object_or_404(Project, pk=project_pk)
reports = Report.objects.prefetch_related('formats', 'organisations')\
- .filter(url__icontains='project')
+ .filter(url__icontains='{project}')
user = request.user
if not user.has_perm('rsr.view_project', project):
| {"golden_diff": "diff --git a/akvo/rest/views/report.py b/akvo/rest/views/report.py\n--- a/akvo/rest/views/report.py\n+++ b/akvo/rest/views/report.py\n@@ -58,7 +58,7 @@\n \n project = get_object_or_404(Project, pk=project_pk)\n reports = Report.objects.prefetch_related('formats', 'organisations')\\\n- .filter(url__icontains='project')\n+ .filter(url__icontains='{project}')\n \n user = request.user\n if not user.has_perm('rsr.view_project', project):\n", "issue": "Organisation report shown in project reports page\nThe \"Project overview\" report is displayed on the project report page, which is an organisation report and should not be displayed on the project report page.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom django.db.models import Q\nfrom django.shortcuts import get_object_or_404\nfrom rest_framework import status\nfrom rest_framework.decorators import api_view\nfrom rest_framework.response import Response\n\nfrom akvo.rsr.models import Report, ReportFormat, Project\nfrom ..serializers import ReportSerializer, ReportFormatSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass ReportViewSet(BaseRSRViewSet):\n \"\"\"Viewset providing Result data.\"\"\"\n\n queryset = Report.objects.prefetch_related(\n 'organisations',\n 'formats',\n )\n serializer_class = ReportSerializer\n\n def get_queryset(self):\n \"\"\"\n Allow custom filter for sync_owner, since this field has been replaced by the\n reporting org partnership.\n \"\"\"\n reports = super(ReportViewSet, self).get_queryset()\n user = self.request.user\n is_admin = user.is_active and (user.is_superuser or user.is_admin)\n if not is_admin:\n # Show only those reports that the user is allowed to see\n approved_orgs = user.approved_organisations() if not user.is_anonymous() else []\n reports = reports.filter(\n Q(organisations=None) | Q(organisations__in=approved_orgs)\n ).distinct()\n return reports\n\n\n@api_view(['GET'])\ndef report_formats(request):\n \"\"\"\n A view for displaying all report format information.\n \"\"\"\n return Response({\n 'count': ReportFormat.objects.all().count(),\n 'results': [ReportFormatSerializer(f).data for f in ReportFormat.objects.all()],\n })\n\n\n@api_view(['GET'])\ndef project_reports(request, project_pk):\n \"\"\"A view for displaying project specific reports.\"\"\"\n\n project = get_object_or_404(Project, pk=project_pk)\n reports = Report.objects.prefetch_related('formats', 'organisations')\\\n .filter(url__icontains='project')\n\n user = request.user\n if not user.has_perm('rsr.view_project', project):\n return Response('Request not allowed', status=status.HTTP_403_FORBIDDEN)\n\n is_admin = user.is_active and (user.is_superuser or user.is_admin)\n\n if not is_admin:\n partners_org = project.partner_organisation_pks()\n reports = reports.filter(\n Q(organisations=None) | Q(organisations__in=partners_org)\n )\n\n serializer = ReportSerializer(reports.distinct(), many=True)\n return Response(serializer.data)\n", "path": "akvo/rest/views/report.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\n# Akvo RSR is covered by the GNU Affero General Public License.\n# See more details in the license.txt file located at the root folder of the Akvo RSR module.\n# For additional details on the GNU license please 
see < http://www.gnu.org/licenses/agpl.html >.\n\nfrom django.db.models import Q\nfrom django.shortcuts import get_object_or_404\nfrom rest_framework import status\nfrom rest_framework.decorators import api_view\nfrom rest_framework.response import Response\n\nfrom akvo.rsr.models import Report, ReportFormat, Project\nfrom ..serializers import ReportSerializer, ReportFormatSerializer\nfrom ..viewsets import BaseRSRViewSet\n\n\nclass ReportViewSet(BaseRSRViewSet):\n \"\"\"Viewset providing Result data.\"\"\"\n\n queryset = Report.objects.prefetch_related(\n 'organisations',\n 'formats',\n )\n serializer_class = ReportSerializer\n\n def get_queryset(self):\n \"\"\"\n Allow custom filter for sync_owner, since this field has been replaced by the\n reporting org partnership.\n \"\"\"\n reports = super(ReportViewSet, self).get_queryset()\n user = self.request.user\n is_admin = user.is_active and (user.is_superuser or user.is_admin)\n if not is_admin:\n # Show only those reports that the user is allowed to see\n approved_orgs = user.approved_organisations() if not user.is_anonymous() else []\n reports = reports.filter(\n Q(organisations=None) | Q(organisations__in=approved_orgs)\n ).distinct()\n return reports\n\n\n@api_view(['GET'])\ndef report_formats(request):\n \"\"\"\n A view for displaying all report format information.\n \"\"\"\n return Response({\n 'count': ReportFormat.objects.all().count(),\n 'results': [ReportFormatSerializer(f).data for f in ReportFormat.objects.all()],\n })\n\n\n@api_view(['GET'])\ndef project_reports(request, project_pk):\n \"\"\"A view for displaying project specific reports.\"\"\"\n\n project = get_object_or_404(Project, pk=project_pk)\n reports = Report.objects.prefetch_related('formats', 'organisations')\\\n .filter(url__icontains='{project}')\n\n user = request.user\n if not user.has_perm('rsr.view_project', project):\n return Response('Request not allowed', status=status.HTTP_403_FORBIDDEN)\n\n is_admin = user.is_active and (user.is_superuser or user.is_admin)\n\n if not is_admin:\n partners_org = project.partner_organisation_pks()\n reports = reports.filter(\n Q(organisations=None) | Q(organisations__in=partners_org)\n )\n\n serializer = ReportSerializer(reports.distinct(), many=True)\n return Response(serializer.data)\n", "path": "akvo/rest/views/report.py"}]} | 1,036 | 127 |
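The one-character-pair change in the fix above matters because `url__icontains='project'` matches any report URL containing that substring — including organisation-level reports such as "Project overview" — whereas filtering on the literal `'{project}'` placeholder selects only project-parameterised reports. A plain-Python illustration of the difference (the URLs are made up for the example):

```python
urls = [
    "/en/reports/project_overview/",         # organisation report about projects
    "/en/reports/results_table/{project}/",  # genuinely project-specific
]

print([u for u in urls if "project" in u])    # matches both entries
print([u for u in urls if "{project}" in u])  # matches only the second
```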
gh_patches_debug_19318 | rasdani/github-patches | git_diff | lightly-ai__lightly-587 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
lightly-download fails for integer tag names
For the `tag_name` `1000`, the following warning appears:
> Possible bug: I get `warnings.warn(f'The specified tag {tag_name} does not exist.')`
The source of the problem is probably in this line:
https://github.com/lightly-ai/lightly/blob/db33b15de6f77e50b0c815c4c405a8fb371d22e7/lightly/cli/download_cli.py#L44
Current guess: either the API sends the tag name as a number, or the command-line tool parses it as a number, which makes the dictionary lookup fail.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lightly/cli/download_cli.py`
Content:
```
1 # -*- coding: utf-8 -*-
2 """**Lightly Download:** Download images from the Lightly platform.
3
4 This module contains the entrypoint for the **lightly-download**
5 command-line interface.
6 """
7
8 # Copyright (c) 2020. Lightly AG and its affiliates.
9 # All Rights Reserved
10
11 import os
12 import shutil
13 import warnings
14
15 import hydra
16 from torch.utils.hipify.hipify_python import bcolors
17 from tqdm import tqdm
18
19 import lightly.data as data
20 from lightly.cli._helpers import fix_input_path, print_as_warning
21
22 from lightly.api.utils import getenv
23 from lightly.api.api_workflow_client import ApiWorkflowClient
24 from lightly.api.bitmask import BitMask
25 from lightly.openapi_generated.swagger_client import TagData, TagArithmeticsRequest, TagArithmeticsOperation, \
26 TagBitMaskResponse
27
28
29 def _download_cli(cfg, is_cli_call=True):
30 tag_name = cfg['tag_name']
31 dataset_id = cfg['dataset_id']
32 token = cfg['token']
33
34 if not tag_name or not token or not dataset_id:
35 print_as_warning('Please specify all of the parameters tag_name, token and dataset_id')
36 print_as_warning('For help, try: lightly-download --help')
37 return
38
39 api_workflow_client = ApiWorkflowClient(
40 token=token, dataset_id=dataset_id
41 )
42
43 # get tag id
44 tag_name_id_dict = dict([tag.name, tag.id] for tag in api_workflow_client._get_all_tags())
45 tag_id = tag_name_id_dict.get(tag_name, None)
46 if tag_id is None:
47 warnings.warn(f'The specified tag {tag_name} does not exist.')
48 return
49
50 # get tag data
51 tag_data: TagData = api_workflow_client.tags_api.get_tag_by_tag_id(
52 dataset_id=dataset_id, tag_id=tag_id
53 )
54
55 if cfg["exclude_parent_tag"]:
56 parent_tag_id = tag_data.prev_tag_id
57 tag_arithmetics_request = TagArithmeticsRequest(
58 tag_id1=tag_data.id,
59 tag_id2=parent_tag_id,
60 operation=TagArithmeticsOperation.DIFFERENCE)
61 bit_mask_response: TagBitMaskResponse \
62 = api_workflow_client.tags_api.perform_tag_arithmetics(body=tag_arithmetics_request, dataset_id=dataset_id)
63 bit_mask_data = bit_mask_response.bit_mask_data
64 else:
65 bit_mask_data = tag_data.bit_mask_data
66
67 # get samples
68 chosen_samples_ids = BitMask.from_hex(bit_mask_data).to_indices()
69 samples = [api_workflow_client.filenames_on_server[i] for i in chosen_samples_ids]
70
71 # store sample names in a .txt file
72 filename = cfg['tag_name'] + '.txt'
73 with open(filename, 'w') as f:
74 for item in samples:
75 f.write("%s\n" % item)
76
77 filepath = os.path.join(os.getcwd(), filename)
78 msg = f'The list of files in tag {cfg["tag_name"]} is stored at: {bcolors.OKBLUE}{filepath}{bcolors.ENDC}'
79 print(msg, flush=True)
80
81 if not cfg['input_dir'] and cfg['output_dir']:
82 # download full images from api
83 output_dir = fix_input_path(cfg['output_dir'])
84 api_workflow_client.download_dataset(output_dir, tag_name=tag_name)
85
86 elif cfg['input_dir'] and cfg['output_dir']:
87 input_dir = fix_input_path(cfg['input_dir'])
88 output_dir = fix_input_path(cfg['output_dir'])
89 print(f'Copying files from {input_dir} to {bcolors.OKBLUE}{output_dir}{bcolors.ENDC}.')
90
91 # create a dataset from the input directory
92 dataset = data.LightlyDataset(input_dir=input_dir)
93
94 # dump the dataset in the output directory
95 dataset.dump(output_dir, samples)
96
97
98 @hydra.main(config_path='config', config_name='config')
99 def download_cli(cfg):
100 """Download images from the Lightly platform.
101
102 Args:
103 cfg:
104 The default configs are loaded from the config file.
105 To overwrite them please see the section on the config file
106 (.config.config.yaml).
107
108 Command-Line Args:
109 tag_name:
110 Download all images from the requested tag. Use initial-tag
111 to get all images from the dataset.
112 token:
113 User access token to the Lightly platform. If dataset_id
114 and token are specified, the images and embeddings are
115 uploaded to the platform.
116 dataset_id:
117 Identifier of the dataset on the Lightly platform. If
118 dataset_id and token are specified, the images and
119 embeddings are uploaded to the platform.
120 input_dir:
121 If input_dir and output_dir are specified, lightly will copy
122 all images belonging to the tag from the input_dir to the
123 output_dir.
124 output_dir:
125 If input_dir and output_dir are specified, lightly will copy
126 all images belonging to the tag from the input_dir to the
127 output_dir.
128
129 Examples:
130 >>> # download list of all files in the dataset from the Lightly platform
131 >>> lightly-download token='123' dataset_id='XYZ'
132 >>>
133 >>> # download list of all files in tag 'my-tag' from the Lightly platform
134 >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag'
135 >>>
136 >>> # download all images in tag 'my-tag' from the Lightly platform
137 >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag' output_dir='my_data/'
138 >>>
139 >>> # copy all files in 'my-tag' to a new directory
140 >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag' input_dir='data/' output_dir='my_data/'
141
142
143 """
144 _download_cli(cfg)
145
146
147 def entry():
148 download_cli()
149
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lightly/cli/download_cli.py b/lightly/cli/download_cli.py
--- a/lightly/cli/download_cli.py
+++ b/lightly/cli/download_cli.py
@@ -27,9 +27,10 @@
def _download_cli(cfg, is_cli_call=True):
- tag_name = cfg['tag_name']
- dataset_id = cfg['dataset_id']
- token = cfg['token']
+
+ tag_name = str(cfg['tag_name'])
+ dataset_id = str(cfg['dataset_id'])
+ token = str(cfg['token'])
if not tag_name or not token or not dataset_id:
print_as_warning('Please specify all of the parameters tag_name, token and dataset_id')
@@ -69,7 +70,7 @@
samples = [api_workflow_client.filenames_on_server[i] for i in chosen_samples_ids]
# store sample names in a .txt file
- filename = cfg['tag_name'] + '.txt'
+ filename = tag_name + '.txt'
with open(filename, 'w') as f:
for item in samples:
f.write("%s\n" % item)
| {"golden_diff": "diff --git a/lightly/cli/download_cli.py b/lightly/cli/download_cli.py\n--- a/lightly/cli/download_cli.py\n+++ b/lightly/cli/download_cli.py\n@@ -27,9 +27,10 @@\n \n \n def _download_cli(cfg, is_cli_call=True):\n- tag_name = cfg['tag_name']\n- dataset_id = cfg['dataset_id']\n- token = cfg['token']\n+\n+ tag_name = str(cfg['tag_name'])\n+ dataset_id = str(cfg['dataset_id'])\n+ token = str(cfg['token'])\n \n if not tag_name or not token or not dataset_id:\n print_as_warning('Please specify all of the parameters tag_name, token and dataset_id')\n@@ -69,7 +70,7 @@\n samples = [api_workflow_client.filenames_on_server[i] for i in chosen_samples_ids]\n \n # store sample names in a .txt file\n- filename = cfg['tag_name'] + '.txt'\n+ filename = tag_name + '.txt'\n with open(filename, 'w') as f:\n for item in samples:\n f.write(\"%s\\n\" % item)\n", "issue": "lightly-download fails for integer tag names\nlightly-download fails for integer tag names\r\n\r\nFor the `tag_name` `1000`, the following warning appears:\r\n> Possible bug: I get `warnings.warn(f'The specified tag {tag_name} does not exist`\r\n\r\nThe source of the problem is probably in this line:\r\nhttps://github.com/lightly-ai/lightly/blob/db33b15de6f77e50b0c815c4c405a8fb371d22e7/lightly/cli/download_cli.py#L44\r\n\r\nCurrent guess: Either the api sends the string as a number or the command-line tool parses the string as a number which makes the lookup fail.\n", "before_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"**Lightly Download:** Download images from the Lightly platform.\n\nThis module contains the entrypoint for the **lightly-download**\ncommand-line interface.\n\"\"\"\n\n# Copyright (c) 2020. Lightly AG and its affiliates.\n# All Rights Reserved\n\nimport os\nimport shutil\nimport warnings\n\nimport hydra\nfrom torch.utils.hipify.hipify_python import bcolors\nfrom tqdm import tqdm\n\nimport lightly.data as data\nfrom lightly.cli._helpers import fix_input_path, print_as_warning\n\nfrom lightly.api.utils import getenv\nfrom lightly.api.api_workflow_client import ApiWorkflowClient\nfrom lightly.api.bitmask import BitMask\nfrom lightly.openapi_generated.swagger_client import TagData, TagArithmeticsRequest, TagArithmeticsOperation, \\\n TagBitMaskResponse\n\n\ndef _download_cli(cfg, is_cli_call=True):\n tag_name = cfg['tag_name']\n dataset_id = cfg['dataset_id']\n token = cfg['token']\n\n if not tag_name or not token or not dataset_id:\n print_as_warning('Please specify all of the parameters tag_name, token and dataset_id')\n print_as_warning('For help, try: lightly-download --help')\n return\n\n api_workflow_client = ApiWorkflowClient(\n token=token, dataset_id=dataset_id\n )\n\n # get tag id\n tag_name_id_dict = dict([tag.name, tag.id] for tag in api_workflow_client._get_all_tags())\n tag_id = tag_name_id_dict.get(tag_name, None)\n if tag_id is None:\n warnings.warn(f'The specified tag {tag_name} does not exist.')\n return\n\n # get tag data\n tag_data: TagData = api_workflow_client.tags_api.get_tag_by_tag_id(\n dataset_id=dataset_id, tag_id=tag_id\n )\n\n if cfg[\"exclude_parent_tag\"]:\n parent_tag_id = tag_data.prev_tag_id\n tag_arithmetics_request = TagArithmeticsRequest(\n tag_id1=tag_data.id,\n tag_id2=parent_tag_id,\n operation=TagArithmeticsOperation.DIFFERENCE)\n bit_mask_response: TagBitMaskResponse \\\n = api_workflow_client.tags_api.perform_tag_arithmetics(body=tag_arithmetics_request, dataset_id=dataset_id)\n bit_mask_data = bit_mask_response.bit_mask_data\n else:\n bit_mask_data = 
tag_data.bit_mask_data\n\n # get samples\n chosen_samples_ids = BitMask.from_hex(bit_mask_data).to_indices()\n samples = [api_workflow_client.filenames_on_server[i] for i in chosen_samples_ids]\n\n # store sample names in a .txt file\n filename = cfg['tag_name'] + '.txt'\n with open(filename, 'w') as f:\n for item in samples:\n f.write(\"%s\\n\" % item)\n\n filepath = os.path.join(os.getcwd(), filename)\n msg = f'The list of files in tag {cfg[\"tag_name\"]} is stored at: {bcolors.OKBLUE}{filepath}{bcolors.ENDC}'\n print(msg, flush=True)\n\n if not cfg['input_dir'] and cfg['output_dir']:\n # download full images from api\n output_dir = fix_input_path(cfg['output_dir'])\n api_workflow_client.download_dataset(output_dir, tag_name=tag_name)\n\n elif cfg['input_dir'] and cfg['output_dir']:\n input_dir = fix_input_path(cfg['input_dir'])\n output_dir = fix_input_path(cfg['output_dir'])\n print(f'Copying files from {input_dir} to {bcolors.OKBLUE}{output_dir}{bcolors.ENDC}.')\n\n # create a dataset from the input directory\n dataset = data.LightlyDataset(input_dir=input_dir)\n\n # dump the dataset in the output directory\n dataset.dump(output_dir, samples)\n\n\[email protected](config_path='config', config_name='config')\ndef download_cli(cfg):\n \"\"\"Download images from the Lightly platform.\n\n Args:\n cfg:\n The default configs are loaded from the config file.\n To overwrite them please see the section on the config file \n (.config.config.yaml).\n \n Command-Line Args:\n tag_name:\n Download all images from the requested tag. Use initial-tag\n to get all images from the dataset.\n token:\n User access token to the Lightly platform. If dataset_id\n and token are specified, the images and embeddings are \n uploaded to the platform.\n dataset_id:\n Identifier of the dataset on the Lightly platform. If \n dataset_id and token are specified, the images and \n embeddings are uploaded to the platform.\n input_dir:\n If input_dir and output_dir are specified, lightly will copy\n all images belonging to the tag from the input_dir to the \n output_dir.\n output_dir:\n If input_dir and output_dir are specified, lightly will copy\n all images belonging to the tag from the input_dir to the \n output_dir.\n\n Examples:\n >>> #\u00a0download list of all files in the dataset from the Lightly platform\n >>> lightly-download token='123' dataset_id='XYZ'\n >>> \n >>> # download list of all files in tag 'my-tag' from the Lightly platform\n >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag'\n >>>\n >>> # download all images in tag 'my-tag' from the Lightly platform\n >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag' output_dir='my_data/'\n >>>\n >>> # copy all files in 'my-tag' to a new directory\n >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag' input_dir='data/' output_dir='my_data/'\n\n\n \"\"\"\n _download_cli(cfg)\n\n\ndef entry():\n download_cli()\n", "path": "lightly/cli/download_cli.py"}], "after_files": [{"content": "# -*- coding: utf-8 -*-\n\"\"\"**Lightly Download:** Download images from the Lightly platform.\n\nThis module contains the entrypoint for the **lightly-download**\ncommand-line interface.\n\"\"\"\n\n# Copyright (c) 2020. 
Lightly AG and its affiliates.\n# All Rights Reserved\n\nimport os\nimport shutil\nimport warnings\n\nimport hydra\nfrom torch.utils.hipify.hipify_python import bcolors\nfrom tqdm import tqdm\n\nimport lightly.data as data\nfrom lightly.cli._helpers import fix_input_path, print_as_warning\n\nfrom lightly.api.utils import getenv\nfrom lightly.api.api_workflow_client import ApiWorkflowClient\nfrom lightly.api.bitmask import BitMask\nfrom lightly.openapi_generated.swagger_client import TagData, TagArithmeticsRequest, TagArithmeticsOperation, \\\n TagBitMaskResponse\n\n\ndef _download_cli(cfg, is_cli_call=True):\n\n tag_name = str(cfg['tag_name'])\n dataset_id = str(cfg['dataset_id'])\n token = str(cfg['token'])\n\n if not tag_name or not token or not dataset_id:\n print_as_warning('Please specify all of the parameters tag_name, token and dataset_id')\n print_as_warning('For help, try: lightly-download --help')\n return\n\n api_workflow_client = ApiWorkflowClient(\n token=token, dataset_id=dataset_id\n )\n\n # get tag id\n tag_name_id_dict = dict([tag.name, tag.id] for tag in api_workflow_client._get_all_tags())\n tag_id = tag_name_id_dict.get(tag_name, None)\n if tag_id is None:\n warnings.warn(f'The specified tag {tag_name} does not exist.')\n return\n\n # get tag data\n tag_data: TagData = api_workflow_client.tags_api.get_tag_by_tag_id(\n dataset_id=dataset_id, tag_id=tag_id\n )\n\n if cfg[\"exclude_parent_tag\"]:\n parent_tag_id = tag_data.prev_tag_id\n tag_arithmetics_request = TagArithmeticsRequest(\n tag_id1=tag_data.id,\n tag_id2=parent_tag_id,\n operation=TagArithmeticsOperation.DIFFERENCE)\n bit_mask_response: TagBitMaskResponse \\\n = api_workflow_client.tags_api.perform_tag_arithmetics(body=tag_arithmetics_request, dataset_id=dataset_id)\n bit_mask_data = bit_mask_response.bit_mask_data\n else:\n bit_mask_data = tag_data.bit_mask_data\n\n # get samples\n chosen_samples_ids = BitMask.from_hex(bit_mask_data).to_indices()\n samples = [api_workflow_client.filenames_on_server[i] for i in chosen_samples_ids]\n\n # store sample names in a .txt file\n filename = tag_name + '.txt'\n with open(filename, 'w') as f:\n for item in samples:\n f.write(\"%s\\n\" % item)\n\n filepath = os.path.join(os.getcwd(), filename)\n msg = f'The list of files in tag {cfg[\"tag_name\"]} is stored at: {bcolors.OKBLUE}{filepath}{bcolors.ENDC}'\n print(msg, flush=True)\n\n if not cfg['input_dir'] and cfg['output_dir']:\n # download full images from api\n output_dir = fix_input_path(cfg['output_dir'])\n api_workflow_client.download_dataset(output_dir, tag_name=tag_name)\n\n elif cfg['input_dir'] and cfg['output_dir']:\n input_dir = fix_input_path(cfg['input_dir'])\n output_dir = fix_input_path(cfg['output_dir'])\n print(f'Copying files from {input_dir} to {bcolors.OKBLUE}{output_dir}{bcolors.ENDC}.')\n\n # create a dataset from the input directory\n dataset = data.LightlyDataset(input_dir=input_dir)\n\n # dump the dataset in the output directory\n dataset.dump(output_dir, samples)\n\n\[email protected](config_path='config', config_name='config')\ndef download_cli(cfg):\n \"\"\"Download images from the Lightly platform.\n\n Args:\n cfg:\n The default configs are loaded from the config file.\n To overwrite them please see the section on the config file \n (.config.config.yaml).\n \n Command-Line Args:\n tag_name:\n Download all images from the requested tag. Use initial-tag\n to get all images from the dataset.\n token:\n User access token to the Lightly platform. 
If dataset_id\n and token are specified, the images and embeddings are \n uploaded to the platform.\n dataset_id:\n Identifier of the dataset on the Lightly platform. If \n dataset_id and token are specified, the images and \n embeddings are uploaded to the platform.\n input_dir:\n If input_dir and output_dir are specified, lightly will copy\n all images belonging to the tag from the input_dir to the \n output_dir.\n output_dir:\n If input_dir and output_dir are specified, lightly will copy\n all images belonging to the tag from the input_dir to the \n output_dir.\n\n Examples:\n >>> #\u00a0download list of all files in the dataset from the Lightly platform\n >>> lightly-download token='123' dataset_id='XYZ'\n >>> \n >>> # download list of all files in tag 'my-tag' from the Lightly platform\n >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag'\n >>>\n >>> # download all images in tag 'my-tag' from the Lightly platform\n >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag' output_dir='my_data/'\n >>>\n >>> # copy all files in 'my-tag' to a new directory\n >>> lightly-download token='123' dataset_id='XYZ' tag_name='my-tag' input_dir='data/' output_dir='my_data/'\n\n\n \"\"\"\n _download_cli(cfg)\n\n\ndef entry():\n download_cli()\n", "path": "lightly/cli/download_cli.py"}]} | 2,024 | 248 |
gh_patches_debug_16617 | rasdani/github-patches | git_diff | OCA__bank-payment-900 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[14.0] account_payment_purchase: changing many2one resets payment_mode
Hi,
We've seen that when the field purchase_vendor_bill_id is changed and a purchase is selected from it, the payment_mode_id is always reset, because it is read from the purchase_id reference.
`new_mode = self.purchase_id.payment_mode_id.id or False`
We've made this change, and it seems to work as it should.
`new_mode = self.purchase_vendor_bill_id.purchase_order_id.payment_mode_id.id or False`
The same goes for the partner_bank_id field.
@MiquelRForgeFlow
--- END ISSUE ---
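A minimal sketch of the proposed lookup, assuming Odoo 14's `account.move` and the field names quoted above; the mismatch-warning branches of the real onchange are omitted for brevity:
```python
from odoo import api, models


class AccountMove(models.Model):
    _inherit = "account.move"

    @api.onchange("purchase_vendor_bill_id", "purchase_id")
    def _onchange_purchase_auto_complete(self):
        # Look behind the vendor-bill selector first, then fall back to
        # purchase_id, so the values survive either way of picking an order.
        new_mode = (
            self.purchase_vendor_bill_id.purchase_order_id.payment_mode_id.id
            or self.purchase_id.payment_mode_id.id
        )
        new_bank = (
            self.purchase_vendor_bill_id.purchase_order_id.supplier_partner_bank_id.id
            or self.purchase_id.supplier_partner_bank_id.id
        )
        res = super()._onchange_purchase_auto_complete() or {}
        self.payment_mode_id = new_mode
        self.partner_bank_id = new_bank
        return res
```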
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `account_payment_purchase/models/account_invoice.py`
Content:
```
1 # Copyright 2016 Akretion (<http://www.akretion.com>).
2 # Copyright 2017 Tecnativa - Vicent Cubells.
3 # License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).
4
5 from odoo import _, api, models
6
7
8 class AccountMove(models.Model):
9 _inherit = "account.move"
10
11 @api.onchange("purchase_vendor_bill_id", "purchase_id")
12 def _onchange_purchase_auto_complete(self):
13 new_mode = self.purchase_id.payment_mode_id.id or False
14 new_bank = self.purchase_id.supplier_partner_bank_id.id or False
15 res = super()._onchange_purchase_auto_complete() or {}
16 if self.payment_mode_id and new_mode and self.payment_mode_id.id != new_mode:
17 res["warning"] = {
18 "title": _("Warning"),
19 "message": _("Selected purchase order have different payment mode."),
20 }
21 return res
22 self.payment_mode_id = new_mode
23 if self.partner_bank_id and new_bank and self.partner_bank_id.id != new_bank:
24 res["warning"] = {
25 "title": _("Warning"),
26 "message": _("Selected purchase order have different supplier bank."),
27 }
28 return res
29 self.partner_bank_id = new_bank
30 return res
31
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/account_payment_purchase/models/account_invoice.py b/account_payment_purchase/models/account_invoice.py
--- a/account_payment_purchase/models/account_invoice.py
+++ b/account_payment_purchase/models/account_invoice.py
@@ -10,8 +10,16 @@
@api.onchange("purchase_vendor_bill_id", "purchase_id")
def _onchange_purchase_auto_complete(self):
- new_mode = self.purchase_id.payment_mode_id.id or False
- new_bank = self.purchase_id.supplier_partner_bank_id.id or False
+
+ new_mode = (
+ self.purchase_vendor_bill_id.purchase_order_id.payment_mode_id.id
+ or self.purchase_id.payment_mode_id.id
+ )
+ new_bank = (
+ self.purchase_vendor_bill_id.purchase_order_id.supplier_partner_bank_id.id
+ or self.purchase_id.supplier_partner_bank_id.id
+ )
+
res = super()._onchange_purchase_auto_complete() or {}
if self.payment_mode_id and new_mode and self.payment_mode_id.id != new_mode:
res["warning"] = {
| {"golden_diff": "diff --git a/account_payment_purchase/models/account_invoice.py b/account_payment_purchase/models/account_invoice.py\n--- a/account_payment_purchase/models/account_invoice.py\n+++ b/account_payment_purchase/models/account_invoice.py\n@@ -10,8 +10,16 @@\n \n @api.onchange(\"purchase_vendor_bill_id\", \"purchase_id\")\n def _onchange_purchase_auto_complete(self):\n- new_mode = self.purchase_id.payment_mode_id.id or False\n- new_bank = self.purchase_id.supplier_partner_bank_id.id or False\n+\n+ new_mode = (\n+ self.purchase_vendor_bill_id.purchase_order_id.payment_mode_id.id\n+ or self.purchase_id.payment_mode_id.id\n+ )\n+ new_bank = (\n+ self.purchase_vendor_bill_id.purchase_order_id.supplier_partner_bank_id.id\n+ or self.purchase_id.supplier_partner_bank_id.id\n+ )\n+\n res = super()._onchange_purchase_auto_complete() or {}\n if self.payment_mode_id and new_mode and self.payment_mode_id.id != new_mode:\n res[\"warning\"] = {\n", "issue": "[14.0] account_payment_purchase: changing many2one resets payment_mode\nHi,\r\nWe've seen that when the field purchase_vendor_bill_id is changed and a purchase is selected from it, the payment_mode_id is always reseted because it is using the reference purchase_id.\r\n`new_mode = self.purchase_id.payment_mode_id.id or False`\r\nWe've made this change, and it seems to work as it should.\r\n`new_mode = self.purchase_vendor_bill_id.purchase_order_id.payment_mode_id.id or False`\r\nThe same goes for the partner_bank_id field.\r\n@MiquelRForgeFlow \n", "before_files": [{"content": "# Copyright 2016 Akretion (<http://www.akretion.com>).\n# Copyright 2017 Tecnativa - Vicent Cubells.\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n\nfrom odoo import _, api, models\n\n\nclass AccountMove(models.Model):\n _inherit = \"account.move\"\n\n @api.onchange(\"purchase_vendor_bill_id\", \"purchase_id\")\n def _onchange_purchase_auto_complete(self):\n new_mode = self.purchase_id.payment_mode_id.id or False\n new_bank = self.purchase_id.supplier_partner_bank_id.id or False\n res = super()._onchange_purchase_auto_complete() or {}\n if self.payment_mode_id and new_mode and self.payment_mode_id.id != new_mode:\n res[\"warning\"] = {\n \"title\": _(\"Warning\"),\n \"message\": _(\"Selected purchase order have different payment mode.\"),\n }\n return res\n self.payment_mode_id = new_mode\n if self.partner_bank_id and new_bank and self.partner_bank_id.id != new_bank:\n res[\"warning\"] = {\n \"title\": _(\"Warning\"),\n \"message\": _(\"Selected purchase order have different supplier bank.\"),\n }\n return res\n self.partner_bank_id = new_bank\n return res\n", "path": "account_payment_purchase/models/account_invoice.py"}], "after_files": [{"content": "# Copyright 2016 Akretion (<http://www.akretion.com>).\n# Copyright 2017 Tecnativa - Vicent Cubells.\n# License AGPL-3.0 or later (http://www.gnu.org/licenses/agpl.html).\n\nfrom odoo import _, api, models\n\n\nclass AccountMove(models.Model):\n _inherit = \"account.move\"\n\n @api.onchange(\"purchase_vendor_bill_id\", \"purchase_id\")\n def _onchange_purchase_auto_complete(self):\n\n new_mode = (\n self.purchase_vendor_bill_id.purchase_order_id.payment_mode_id.id\n or self.purchase_id.payment_mode_id.id\n )\n new_bank = (\n self.purchase_vendor_bill_id.purchase_order_id.supplier_partner_bank_id.id\n or self.purchase_id.supplier_partner_bank_id.id\n )\n\n res = super()._onchange_purchase_auto_complete() or {}\n if self.payment_mode_id and new_mode and self.payment_mode_id.id != new_mode:\n 
res[\"warning\"] = {\n \"title\": _(\"Warning\"),\n \"message\": _(\"Selected purchase order have different payment mode.\"),\n }\n return res\n self.payment_mode_id = new_mode\n if self.partner_bank_id and new_bank and self.partner_bank_id.id != new_bank:\n res[\"warning\"] = {\n \"title\": _(\"Warning\"),\n \"message\": _(\"Selected purchase order have different supplier bank.\"),\n }\n return res\n self.partner_bank_id = new_bank\n return res\n", "path": "account_payment_purchase/models/account_invoice.py"}]} | 715 | 226 |
gh_patches_debug_21021 | rasdani/github-patches | git_diff | liqd__a4-meinberlin-593 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Make creator a readonly field in django admin
With a large user base, the dropdown used for user selection in the Django admin becomes unresponsive. As there is no apparent reason to change the creator of an object (comment, item, rate, poll, ...), the creator field should be made read-only.
The same problem occurs for every model where a user is set as a foreign key (Action: actor, Projects: member/moderator, Organisation: initiator)
The readonly property can either be set on every admin class individually, by adding `readonly_fields = ('creator',)`, or by using a custom `A4Admin(admin.ModelAdmin)` that has to be set as the parent for every admin class used.
--- END ISSUE ---
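A minimal sketch of the shared-base-class variant mentioned above (hypothetical app layout; the actual patch sets `readonly_fields` per admin class instead):
```python
from django.contrib import admin

from . import models  # hypothetical app module whose models have a `creator` FK


class A4Admin(admin.ModelAdmin):
    """Base admin that keeps the unresponsive user dropdown out of edit forms."""

    readonly_fields = ('creator',)


@admin.register(models.OfflineEvent)
class OfflineEventAdmin(A4Admin):
    pass
```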
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `apps/polls/admin.py`
Content:
```
1 from django.contrib import admin
2
3 from . import models
4
5
6 class ChoiceInline(admin.TabularInline):
7 model = models.Choice
8
9
10 class QuestionAdmin(admin.ModelAdmin):
11 inlines = [
12 ChoiceInline
13 ]
14
15
16 class VoteAdmin(admin.ModelAdmin):
17 list_filter = ('choice__question',)
18
19
20 admin.site.register(models.Question, QuestionAdmin)
21 admin.site.register(models.Vote, VoteAdmin)
22
```
Path: `apps/documents/admin.py`
Content:
```
1 from django.contrib import admin
2
3 from . import models
4
5
6 class ParagraphAdmin(admin.ModelAdmin):
7 list_filter = ('chapter',)
8
9
10 admin.site.register(models.Chapter)
11 admin.site.register(models.Paragraph, ParagraphAdmin)
12
```
Path: `apps/offlineevents/admin.py`
Content:
```
1 from django.contrib import admin
2
3 from . import models
4
5 admin.site.register(models.OfflineEvent, admin.ModelAdmin)
6
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/apps/documents/admin.py b/apps/documents/admin.py
--- a/apps/documents/admin.py
+++ b/apps/documents/admin.py
@@ -3,9 +3,12 @@
from . import models
[email protected](models.Paragraph)
class ParagraphAdmin(admin.ModelAdmin):
list_filter = ('chapter',)
+ readonly_fields = ('creator',)
-admin.site.register(models.Chapter)
-admin.site.register(models.Paragraph, ParagraphAdmin)
[email protected](models.Chapter)
+class ChapterAdmin(admin.ModelAdmin):
+ readonly_fields = ('creator', )
diff --git a/apps/offlineevents/admin.py b/apps/offlineevents/admin.py
--- a/apps/offlineevents/admin.py
+++ b/apps/offlineevents/admin.py
@@ -2,4 +2,7 @@
from . import models
-admin.site.register(models.OfflineEvent, admin.ModelAdmin)
+
[email protected](models.OfflineEvent)
+class OfflineEventAdmin(admin.ModelAdmin):
+ readonly_fields = ('creator', )
diff --git a/apps/polls/admin.py b/apps/polls/admin.py
--- a/apps/polls/admin.py
+++ b/apps/polls/admin.py
@@ -7,15 +7,8 @@
model = models.Choice
[email protected](models.Question)
class QuestionAdmin(admin.ModelAdmin):
inlines = [
ChoiceInline
]
-
-
-class VoteAdmin(admin.ModelAdmin):
- list_filter = ('choice__question',)
-
-
-admin.site.register(models.Question, QuestionAdmin)
-admin.site.register(models.Vote, VoteAdmin)
| {"golden_diff": "diff --git a/apps/documents/admin.py b/apps/documents/admin.py\n--- a/apps/documents/admin.py\n+++ b/apps/documents/admin.py\n@@ -3,9 +3,12 @@\n from . import models\n \n \[email protected](models.Paragraph)\n class ParagraphAdmin(admin.ModelAdmin):\n list_filter = ('chapter',)\n+ readonly_fields = ('creator',)\n \n \n-admin.site.register(models.Chapter)\n-admin.site.register(models.Paragraph, ParagraphAdmin)\[email protected](models.Chapter)\n+class ChapterAdmin(admin.ModelAdmin):\n+ readonly_fields = ('creator', )\ndiff --git a/apps/offlineevents/admin.py b/apps/offlineevents/admin.py\n--- a/apps/offlineevents/admin.py\n+++ b/apps/offlineevents/admin.py\n@@ -2,4 +2,7 @@\n \n from . import models\n \n-admin.site.register(models.OfflineEvent, admin.ModelAdmin)\n+\[email protected](models.OfflineEvent)\n+class OfflineEventAdmin(admin.ModelAdmin):\n+ readonly_fields = ('creator', )\ndiff --git a/apps/polls/admin.py b/apps/polls/admin.py\n--- a/apps/polls/admin.py\n+++ b/apps/polls/admin.py\n@@ -7,15 +7,8 @@\n model = models.Choice\n \n \[email protected](models.Question)\n class QuestionAdmin(admin.ModelAdmin):\n inlines = [\n ChoiceInline\n ]\n-\n-\n-class VoteAdmin(admin.ModelAdmin):\n- list_filter = ('choice__question',)\n-\n-\n-admin.site.register(models.Question, QuestionAdmin)\n-admin.site.register(models.Vote, VoteAdmin)\n", "issue": "Make creator a readonly field in django admin\nWith a large user base the dropdown used for the user selection in django admin becomes unresponsive. As there is no apparent reason to change the creator of an object (comment, item, rate, poll, ...) the creator field should be made read_only.\r\n\r\nThe same problem occurs for every model where a user is set as a foreign key (Action: actor, Projects: member/moderator, Organisation: initiator)\r\n\r\nThe readonly property can either be set on every Admin class by setting ` readonly_fields = ('creator',)` individually or by using a custom `A4Admin(admin.ModelAdmin)` which has to be set as the parent for every Admin class used.\n", "before_files": [{"content": "from django.contrib import admin\n\nfrom . import models\n\n\nclass ChoiceInline(admin.TabularInline):\n model = models.Choice\n\n\nclass QuestionAdmin(admin.ModelAdmin):\n inlines = [\n ChoiceInline\n ]\n\n\nclass VoteAdmin(admin.ModelAdmin):\n list_filter = ('choice__question',)\n\n\nadmin.site.register(models.Question, QuestionAdmin)\nadmin.site.register(models.Vote, VoteAdmin)\n", "path": "apps/polls/admin.py"}, {"content": "from django.contrib import admin\n\nfrom . import models\n\n\nclass ParagraphAdmin(admin.ModelAdmin):\n list_filter = ('chapter',)\n\n\nadmin.site.register(models.Chapter)\nadmin.site.register(models.Paragraph, ParagraphAdmin)\n", "path": "apps/documents/admin.py"}, {"content": "from django.contrib import admin\n\nfrom . import models\n\nadmin.site.register(models.OfflineEvent, admin.ModelAdmin)\n", "path": "apps/offlineevents/admin.py"}], "after_files": [{"content": "from django.contrib import admin\n\nfrom . import models\n\n\nclass ChoiceInline(admin.TabularInline):\n model = models.Choice\n\n\[email protected](models.Question)\nclass QuestionAdmin(admin.ModelAdmin):\n inlines = [\n ChoiceInline\n ]\n", "path": "apps/polls/admin.py"}, {"content": "from django.contrib import admin\n\nfrom . 
import models\n\n\[email protected](models.Paragraph)\nclass ParagraphAdmin(admin.ModelAdmin):\n list_filter = ('chapter',)\n readonly_fields = ('creator',)\n\n\[email protected](models.Chapter)\nclass ChapterAdmin(admin.ModelAdmin):\n readonly_fields = ('creator', )\n", "path": "apps/documents/admin.py"}, {"content": "from django.contrib import admin\n\nfrom . import models\n\n\[email protected](models.OfflineEvent)\nclass OfflineEventAdmin(admin.ModelAdmin):\n readonly_fields = ('creator', )\n", "path": "apps/offlineevents/admin.py"}]} | 642 | 320 |
gh_patches_debug_24554 | rasdani/github-patches | git_diff | litestar-org__litestar-174 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Move `requests` to `testing` extra
I was inspecting starlite dependencies and was confused to see `requests` being required without any obvious reason.
I found #13 then, but I disagree with the resolution, since I believe libs required for testing purposes should not be installed for normal use. Right now it is only one lib (with several dependencies), but imagine if more dependencies were added for testing.
#### What I propose:
1. Move requests from required dependencies to the `testing` extra (`pip install starlite[testing]`)
2. Remove the import of `starlite.testing` from the `starlite` package
3. When `starlite.testing` is imported explicitly (`from starlite import testing`), check whether requests is installed; if not, raise `RuntimeError("To access starlite.testing install starlite with [testing] extra")`
This is how an end user's `pyproject.toml` would look:
```toml
[tool.poetry.dependencies]
python = "^3.10"
starlite = "^1.3.9"
[tool.poetry.dev-dependencies]
starlite = {extras = ["testing"], version = "*"} # whatever version is installed + testing dependencies
pytest = "^5.2"
```
I can send a PR if changes are welcomed.
--- END ISSUE ---
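A minimal sketch of the lazy-import guard from point 3, using a PEP 562 module `__getattr__` (illustrative only; the names are assumptions based on the proposal):
```python
# sketch of starlite/__init__.py
from typing import Any

_testing_names = {"TestClient", "create_test_client", "create_test_request"}


def __getattr__(name: str) -> Any:
    if name not in _testing_names:
        raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
    try:
        from . import testing  # only importable with the 'testing' extra
    except ImportError as exc:
        raise RuntimeError(
            "To access starlite.testing install starlite with [testing] extra"
        ) from exc
    attr = globals()[name] = getattr(testing, name)
    return attr
```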
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `starlite/__init__.py`
Content:
```
1 from starlite.datastructures import File, Redirect, State, Stream, Template
2
3 from .app import Starlite
4 from .config import (
5 CacheConfig,
6 CORSConfig,
7 OpenAPIConfig,
8 StaticFilesConfig,
9 TemplateConfig,
10 )
11 from .connection import Request, WebSocket
12 from .controller import Controller
13 from .dto import DTOFactory
14 from .enums import (
15 HttpMethod,
16 MediaType,
17 OpenAPIMediaType,
18 RequestEncodingType,
19 ScopeType,
20 )
21 from .exceptions import (
22 HTTPException,
23 ImproperlyConfiguredException,
24 InternalServerException,
25 MissingDependencyException,
26 NotAuthorizedException,
27 NotFoundException,
28 PermissionDeniedException,
29 ServiceUnavailableException,
30 StarLiteException,
31 ValidationException,
32 )
33 from .handlers import (
34 ASGIRouteHandler,
35 BaseRouteHandler,
36 HTTPRouteHandler,
37 WebsocketRouteHandler,
38 asgi,
39 delete,
40 get,
41 patch,
42 post,
43 put,
44 route,
45 websocket,
46 )
47 from .logging import LoggingConfig, QueueListenerHandler
48 from .middleware import AbstractAuthenticationMiddleware, AuthenticationResult
49 from .openapi.controller import OpenAPIController
50 from .params import Body, Dependency, Parameter
51 from .plugins import PluginProtocol
52 from .provide import Provide
53 from .response import Response
54 from .router import Router
55 from .routes import BaseRoute, HTTPRoute, WebSocketRoute
56 from .testing import TestClient, create_test_client, create_test_request
57 from .types import MiddlewareProtocol, Partial, ResponseHeader
58
59 __all__ = [
60 "ASGIRouteHandler",
61 "AbstractAuthenticationMiddleware",
62 "AuthenticationResult",
63 "BaseRoute",
64 "BaseRouteHandler",
65 "Body",
66 "CORSConfig",
67 "CacheConfig",
68 "Controller",
69 "Dependency",
70 "DTOFactory",
71 "File",
72 "HTTPException",
73 "HTTPRoute",
74 "HTTPRouteHandler",
75 "HttpMethod",
76 "ImproperlyConfiguredException",
77 "InternalServerException",
78 "LoggingConfig",
79 "MediaType",
80 "MiddlewareProtocol",
81 "MissingDependencyException",
82 "NotAuthorizedException",
83 "NotFoundException",
84 "OpenAPIConfig",
85 "OpenAPIController",
86 "OpenAPIMediaType",
87 "Parameter",
88 "Partial",
89 "PermissionDeniedException",
90 "PluginProtocol",
91 "Provide",
92 "QueueListenerHandler",
93 "Redirect",
94 "Request",
95 "RequestEncodingType",
96 "Response",
97 "ResponseHeader",
98 "Router",
99 "ScopeType",
100 "ServiceUnavailableException",
101 "StarLiteException",
102 "Starlite",
103 "State",
104 "StaticFilesConfig",
105 "Stream",
106 "Template",
107 "TemplateConfig",
108 "TestClient",
109 "ValidationException",
110 "WebSocket",
111 "WebSocketRoute",
112 "WebsocketRouteHandler",
113 "asgi",
114 "create_test_client",
115 "create_test_request",
116 "delete",
117 "get",
118 "patch",
119 "post",
120 "put",
121 "route",
122 "websocket",
123 ]
124
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/starlite/__init__.py b/starlite/__init__.py
--- a/starlite/__init__.py
+++ b/starlite/__init__.py
@@ -1,3 +1,5 @@
+from typing import TYPE_CHECKING, Any
+
from starlite.datastructures import File, Redirect, State, Stream, Template
from .app import Starlite
@@ -53,9 +55,12 @@
from .response import Response
from .router import Router
from .routes import BaseRoute, HTTPRoute, WebSocketRoute
-from .testing import TestClient, create_test_client, create_test_request
from .types import MiddlewareProtocol, Partial, ResponseHeader
+if TYPE_CHECKING:
+ from .testing import TestClient, create_test_client, create_test_request
+
+
__all__ = [
"ASGIRouteHandler",
"AbstractAuthenticationMiddleware",
@@ -121,3 +126,17 @@
"route",
"websocket",
]
+
+_dynamic_imports = {"TestClient", "create_test_client", "create_test_request"}
+
+
+# pylint: disable=import-outside-toplevel
+def __getattr__(name: str) -> Any:
+ """Provide lazy importing as per https://peps.python.org/pep-0562/"""
+ if name not in _dynamic_imports:
+ raise AttributeError(f"Module {__package__} has no attribute {name}")
+
+ from . import testing
+
+ attr = globals()[name] = getattr(testing, name)
+ return attr
| {"golden_diff": "diff --git a/starlite/__init__.py b/starlite/__init__.py\n--- a/starlite/__init__.py\n+++ b/starlite/__init__.py\n@@ -1,3 +1,5 @@\n+from typing import TYPE_CHECKING, Any\n+\n from starlite.datastructures import File, Redirect, State, Stream, Template\n \n from .app import Starlite\n@@ -53,9 +55,12 @@\n from .response import Response\n from .router import Router\n from .routes import BaseRoute, HTTPRoute, WebSocketRoute\n-from .testing import TestClient, create_test_client, create_test_request\n from .types import MiddlewareProtocol, Partial, ResponseHeader\n \n+if TYPE_CHECKING:\n+ from .testing import TestClient, create_test_client, create_test_request\n+\n+\n __all__ = [\n \"ASGIRouteHandler\",\n \"AbstractAuthenticationMiddleware\",\n@@ -121,3 +126,17 @@\n \"route\",\n \"websocket\",\n ]\n+\n+_dynamic_imports = {\"TestClient\", \"create_test_client\", \"create_test_request\"}\n+\n+\n+# pylint: disable=import-outside-toplevel\n+def __getattr__(name: str) -> Any:\n+ \"\"\"Provide lazy importing as per https://peps.python.org/pep-0562/\"\"\"\n+ if name not in _dynamic_imports:\n+ raise AttributeError(f\"Module {__package__} has no attribute {name}\")\n+\n+ from . import testing\n+\n+ attr = globals()[name] = getattr(testing, name)\n+ return attr\n", "issue": "Move `requests` to `testing` extra\nI was inspecting starlight dependencies and was confused to see `requests` being required without any obvious reason.\r\n\r\nI found #13 then, but I disagree with the resolution since I believe libs required for testing purposes should not be installed for normal use. Now it only one lib (with several dependencies), but imagine if more dependencies would be added for testing.\r\n\r\n#### What I propose:\r\n1. Move requests from required dependencies to `testing` extra (`pip install starlight[testing]`)\r\n1. Remove import of `starlite.testing` from `starlite` package\r\n2. When starlight is imported explicitly (`from starlight import testint`), check for requests installed. 
if not, raise `RuntimeError(\"To access starlight.testing install starlight with [testing] extra\")`\r\n\r\nHow would `pyproject.toml` of end user look like:\r\n```toml\r\n[tool.poetry.dependencies]\r\npython = \"^3.10\"\r\nstarlite = \"^1.3.9\"\r\n\r\n[tool.poetry.dev-dependencies]\r\nstarlite = {extras = [\"testing\"], version = \"*\"} # whatever version is installed + testing dependencies\r\npytest = \"^5.2\"\r\n```\r\n\r\n\r\nI can send a PR if changes are welcomed.\n", "before_files": [{"content": "from starlite.datastructures import File, Redirect, State, Stream, Template\n\nfrom .app import Starlite\nfrom .config import (\n CacheConfig,\n CORSConfig,\n OpenAPIConfig,\n StaticFilesConfig,\n TemplateConfig,\n)\nfrom .connection import Request, WebSocket\nfrom .controller import Controller\nfrom .dto import DTOFactory\nfrom .enums import (\n HttpMethod,\n MediaType,\n OpenAPIMediaType,\n RequestEncodingType,\n ScopeType,\n)\nfrom .exceptions import (\n HTTPException,\n ImproperlyConfiguredException,\n InternalServerException,\n MissingDependencyException,\n NotAuthorizedException,\n NotFoundException,\n PermissionDeniedException,\n ServiceUnavailableException,\n StarLiteException,\n ValidationException,\n)\nfrom .handlers import (\n ASGIRouteHandler,\n BaseRouteHandler,\n HTTPRouteHandler,\n WebsocketRouteHandler,\n asgi,\n delete,\n get,\n patch,\n post,\n put,\n route,\n websocket,\n)\nfrom .logging import LoggingConfig, QueueListenerHandler\nfrom .middleware import AbstractAuthenticationMiddleware, AuthenticationResult\nfrom .openapi.controller import OpenAPIController\nfrom .params import Body, Dependency, Parameter\nfrom .plugins import PluginProtocol\nfrom .provide import Provide\nfrom .response import Response\nfrom .router import Router\nfrom .routes import BaseRoute, HTTPRoute, WebSocketRoute\nfrom .testing import TestClient, create_test_client, create_test_request\nfrom .types import MiddlewareProtocol, Partial, ResponseHeader\n\n__all__ = [\n \"ASGIRouteHandler\",\n \"AbstractAuthenticationMiddleware\",\n \"AuthenticationResult\",\n \"BaseRoute\",\n \"BaseRouteHandler\",\n \"Body\",\n \"CORSConfig\",\n \"CacheConfig\",\n \"Controller\",\n \"Dependency\",\n \"DTOFactory\",\n \"File\",\n \"HTTPException\",\n \"HTTPRoute\",\n \"HTTPRouteHandler\",\n \"HttpMethod\",\n \"ImproperlyConfiguredException\",\n \"InternalServerException\",\n \"LoggingConfig\",\n \"MediaType\",\n \"MiddlewareProtocol\",\n \"MissingDependencyException\",\n \"NotAuthorizedException\",\n \"NotFoundException\",\n \"OpenAPIConfig\",\n \"OpenAPIController\",\n \"OpenAPIMediaType\",\n \"Parameter\",\n \"Partial\",\n \"PermissionDeniedException\",\n \"PluginProtocol\",\n \"Provide\",\n \"QueueListenerHandler\",\n \"Redirect\",\n \"Request\",\n \"RequestEncodingType\",\n \"Response\",\n \"ResponseHeader\",\n \"Router\",\n \"ScopeType\",\n \"ServiceUnavailableException\",\n \"StarLiteException\",\n \"Starlite\",\n \"State\",\n \"StaticFilesConfig\",\n \"Stream\",\n \"Template\",\n \"TemplateConfig\",\n \"TestClient\",\n \"ValidationException\",\n \"WebSocket\",\n \"WebSocketRoute\",\n \"WebsocketRouteHandler\",\n \"asgi\",\n \"create_test_client\",\n \"create_test_request\",\n \"delete\",\n \"get\",\n \"patch\",\n \"post\",\n \"put\",\n \"route\",\n \"websocket\",\n]\n", "path": "starlite/__init__.py"}], "after_files": [{"content": "from typing import TYPE_CHECKING, Any\n\nfrom starlite.datastructures import File, Redirect, State, Stream, Template\n\nfrom .app import Starlite\nfrom .config import (\n 
CacheConfig,\n CORSConfig,\n OpenAPIConfig,\n StaticFilesConfig,\n TemplateConfig,\n)\nfrom .connection import Request, WebSocket\nfrom .controller import Controller\nfrom .dto import DTOFactory\nfrom .enums import (\n HttpMethod,\n MediaType,\n OpenAPIMediaType,\n RequestEncodingType,\n ScopeType,\n)\nfrom .exceptions import (\n HTTPException,\n ImproperlyConfiguredException,\n InternalServerException,\n MissingDependencyException,\n NotAuthorizedException,\n NotFoundException,\n PermissionDeniedException,\n ServiceUnavailableException,\n StarLiteException,\n ValidationException,\n)\nfrom .handlers import (\n ASGIRouteHandler,\n BaseRouteHandler,\n HTTPRouteHandler,\n WebsocketRouteHandler,\n asgi,\n delete,\n get,\n patch,\n post,\n put,\n route,\n websocket,\n)\nfrom .logging import LoggingConfig, QueueListenerHandler\nfrom .middleware import AbstractAuthenticationMiddleware, AuthenticationResult\nfrom .openapi.controller import OpenAPIController\nfrom .params import Body, Dependency, Parameter\nfrom .plugins import PluginProtocol\nfrom .provide import Provide\nfrom .response import Response\nfrom .router import Router\nfrom .routes import BaseRoute, HTTPRoute, WebSocketRoute\nfrom .types import MiddlewareProtocol, Partial, ResponseHeader\n\nif TYPE_CHECKING:\n from .testing import TestClient, create_test_client, create_test_request\n\n\n__all__ = [\n \"ASGIRouteHandler\",\n \"AbstractAuthenticationMiddleware\",\n \"AuthenticationResult\",\n \"BaseRoute\",\n \"BaseRouteHandler\",\n \"Body\",\n \"CORSConfig\",\n \"CacheConfig\",\n \"Controller\",\n \"Dependency\",\n \"DTOFactory\",\n \"File\",\n \"HTTPException\",\n \"HTTPRoute\",\n \"HTTPRouteHandler\",\n \"HttpMethod\",\n \"ImproperlyConfiguredException\",\n \"InternalServerException\",\n \"LoggingConfig\",\n \"MediaType\",\n \"MiddlewareProtocol\",\n \"MissingDependencyException\",\n \"NotAuthorizedException\",\n \"NotFoundException\",\n \"OpenAPIConfig\",\n \"OpenAPIController\",\n \"OpenAPIMediaType\",\n \"Parameter\",\n \"Partial\",\n \"PermissionDeniedException\",\n \"PluginProtocol\",\n \"Provide\",\n \"QueueListenerHandler\",\n \"Redirect\",\n \"Request\",\n \"RequestEncodingType\",\n \"Response\",\n \"ResponseHeader\",\n \"Router\",\n \"ScopeType\",\n \"ServiceUnavailableException\",\n \"StarLiteException\",\n \"Starlite\",\n \"State\",\n \"StaticFilesConfig\",\n \"Stream\",\n \"Template\",\n \"TemplateConfig\",\n \"TestClient\",\n \"ValidationException\",\n \"WebSocket\",\n \"WebSocketRoute\",\n \"WebsocketRouteHandler\",\n \"asgi\",\n \"create_test_client\",\n \"create_test_request\",\n \"delete\",\n \"get\",\n \"patch\",\n \"post\",\n \"put\",\n \"route\",\n \"websocket\",\n]\n\n_dynamic_imports = {\"TestClient\", \"create_test_client\", \"create_test_request\"}\n\n\n# pylint: disable=import-outside-toplevel\ndef __getattr__(name: str) -> Any:\n \"\"\"Provide lazy importing as per https://peps.python.org/pep-0562/\"\"\"\n if name not in _dynamic_imports:\n raise AttributeError(f\"Module {__package__} has no attribute {name}\")\n\n from . import testing\n\n attr = globals()[name] = getattr(testing, name)\n return attr\n", "path": "starlite/__init__.py"}]} | 1,432 | 340 |
gh_patches_debug_13005 | rasdani/github-patches | git_diff | tensorflow__tfx-25 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
base_component.BaseComponent.__str__ method raising KeyError
I got a `KeyError` when calling this method: `base_component.BaseComponent.__str__`
Here is the code to reproduce:
```python
import os
from tfx.utils.dsl_utils import csv_input
from tfx.components.example_gen.csv_example_gen.component import CsvExampleGen
_taxi_root = os.path.join(os.environ['HOME'], 'taxi')
_data_root = os.path.join(_taxi_root, 'data/simple')
examples = csv_input(_data_root)
example_gen = CsvExampleGen(input_base=examples)
print(example_gen)
```
The error trace is:
```
/Users/alelevier/Documents/github/tfx/tfx/components/base/base_component.pyc in __str__(self)
89 input_dict=self.input_dict,
90 outputs=self.outputs,
---> 91 exec_properties=self.exec_properties)
92
93 def __repr__(self):
KeyError: '\n component_name'
```
I looked at the method; it needs to use double `{{` and `}}`, so change from:
```
def __str__(self):
return """
{
component_name: {component_name},
unique_name: {unique_name},
driver: {driver},
executor: {executor},
input_dict: {input_dict},
outputs: {outputs},
exec_properties: {exec_properties}
}
""".format( # pylint: disable=missing-format-argument-key
component_name=self.component_name,
unique_name=self.unique_name,
driver=self.driver,
executor=self.executor,
input_dict=self.input_dict,
outputs=self.outputs,
exec_properties=self.exec_properties)
```
To:
```
def __str__(self):
return """
{{
component_name: {component_name},
unique_name: {unique_name},
driver: {driver},
executor: {executor},
input_dict: {input_dict},
outputs: {outputs},
exec_properties: {exec_properties}
}}
""".format( # pylint: disable=missing-format-argument-key
component_name=self.component_name,
unique_name=self.unique_name,
driver=self.driver,
executor=self.executor,
input_dict=self.input_dict,
outputs=self.outputs,
exec_properties=self.exec_properties)
```
--- END ISSUE ---
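The failure is reproducible with plain `str.format`, independent of TFX: a literal brace in a template opens a replacement field, so it must be doubled. A tiny self-contained demonstration:
```python
template_bad = "{\n  component_name: {name}\n}"
template_ok = "{{\n  component_name: {name}\n}}"

print(template_ok.format(name="CsvExampleGen"))
# {
#   component_name: CsvExampleGen
# }

try:
    template_bad.format(name="CsvExampleGen")
except KeyError as err:
    # format() treats '\n  component_name' as a field name and fails
    print("KeyError:", err)
```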
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `tfx/components/base/base_component.py`
Content:
```
1 # Copyright 2019 Google LLC. All Rights Reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14 """Base class for all TFX components."""
15
16 from __future__ import absolute_import
17 from __future__ import division
18 from __future__ import print_function
19
20 import abc
21 from six import with_metaclass
22 from typing import Any
23 from typing import Dict
24 from typing import Optional
25 from typing import Text
26
27 from tfx.utils import channel
28
29
30 class ComponentOutputs(object):
31 """Helper class to wrap outputs from TFX components."""
32
33 def __init__(self, d):
34 self.__dict__ = d
35
36 def get_all(self):
37 return self.__dict__
38
39
40 class BaseComponent(with_metaclass(abc.ABCMeta, object)):
41 """Base TFX component.
42
43 This is the parent class of any TFX component.
44
45 Attributes:
46 component_name: Name of the component, should be unique per component class.
47 unique_name: Unique name for every component class instance.
48 driver: Driver class to handle pre-execution behaviors in a component.
49 executor: Executor class to do the real execution work.
50 input_dict: A [Text -> Channel] dict serving as the inputs to the component.
51 exec_properties: A [Text -> Any] dict serving as additional properties
52 needed for execution.
53 outputs: Optional Channel destinations of the component.
54 """
55
56 def __init__(self,
57 component_name,
58 driver,
59 executor,
60 input_dict,
61 exec_properties,
62 unique_name = '',
63 outputs = ComponentOutputs({})):
64 self.component_name = component_name
65 self.driver = driver
66 self.executor = executor
67 self.input_dict = input_dict
68 self.exec_properties = exec_properties
69 self.unique_name = unique_name
70 self.outputs = outputs or self._create_outputs()
71 self._type_check(self.input_dict, self.exec_properties)
72
73 def __str__(self):
74 return """
75 {
76 component_name: {component_name},
77 unique_name: {unique_name},
78 driver: {driver},
79 executor: {executor},
80 input_dict: {input_dict},
81 outputs: {outputs},
82 exec_properties: {exec_properties}
83 }
84 """.format( # pylint: disable=missing-format-argument-key
85 component_name=self.component_name,
86 unique_name=self.unique_name,
87 driver=self.driver,
88 executor=self.executor,
89 input_dict=self.input_dict,
90 outputs=self.outputs,
91 exec_properties=self.exec_properties)
92
93 def __repr__(self):
94 return self.__str__()
95
96 @abc.abstractmethod
97 def _create_outputs(self):
98 """Creates outputs placeholder for components.
99
100 Returns:
101 ComponentOutputs object containing the dict of [Text -> Channel]
102 """
103 raise NotImplementedError
104
105 @abc.abstractmethod
106 def _type_check(self, input_dict,
107 exec_properties):
108 """Does type checking for the inputs and exec_properties.
109
110 Args:
111 input_dict: A Dict[Text, Channel] as the inputs of the Component.
112 exec_properties: A Dict[Text, Any] as the execution properties of the
113 component.
114 """
115 raise NotImplementedError
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/tfx/components/base/base_component.py b/tfx/components/base/base_component.py
--- a/tfx/components/base/base_component.py
+++ b/tfx/components/base/base_component.py
@@ -72,7 +72,7 @@
def __str__(self):
return """
-{
+{{
component_name: {component_name},
unique_name: {unique_name},
driver: {driver},
@@ -80,7 +80,7 @@
input_dict: {input_dict},
outputs: {outputs},
exec_properties: {exec_properties}
-}
+}}
""".format( # pylint: disable=missing-format-argument-key
component_name=self.component_name,
unique_name=self.unique_name,
| {"golden_diff": "diff --git a/tfx/components/base/base_component.py b/tfx/components/base/base_component.py\n--- a/tfx/components/base/base_component.py\n+++ b/tfx/components/base/base_component.py\n@@ -72,7 +72,7 @@\n \n def __str__(self):\n return \"\"\"\n-{\n+{{\n component_name: {component_name},\n unique_name: {unique_name},\n driver: {driver},\n@@ -80,7 +80,7 @@\n input_dict: {input_dict},\n outputs: {outputs},\n exec_properties: {exec_properties}\n-}\n+}}\n \"\"\".format( # pylint: disable=missing-format-argument-key\n component_name=self.component_name,\n unique_name=self.unique_name,\n", "issue": "base_component.BaseComponent.__str__ method raising KeyError\nI got a `KeyError` when calling this method: `base_component.BaseComponent.__str__`\r\n\r\nHere is the code to reproduce:\r\n\r\n```python\r\nimport os\r\nfrom tfx.utils.dsl_utils import csv_input\r\nfrom tfx.components.example_gen.csv_example_gen.component import CsvExampleGen\r\n\r\n_taxi_root = os.path.join(os.environ['HOME'], 'taxi')\r\n_data_root = os.path.join(_taxi_root, 'data/simple')\r\nexamples = csv_input(_data_root)\r\nexample_gen = CsvExampleGen(input_base=examples)\r\nprint(example_gen)\r\n```\r\n\r\nThe error trace is:\r\n\r\n```\r\n/Users/alelevier/Documents/github/tfx/tfx/components/base/base_component.pyc in __str__(self)\r\n 89 input_dict=self.input_dict,\r\n 90 outputs=self.outputs,\r\n---> 91 exec_properties=self.exec_properties)\r\n 92 \r\n 93 def __repr__(self):\r\n\r\nKeyError: '\\n component_name'\r\n```\r\n\r\nI looked at the method, it needs use double `{{` and `}}` so change from:\r\n\r\n```\r\n def __str__(self):\r\n return \"\"\"\r\n{\r\n component_name: {component_name},\r\n unique_name: {unique_name},\r\n driver: {driver},\r\n executor: {executor},\r\n input_dict: {input_dict},\r\n outputs: {outputs},\r\n exec_properties: {exec_properties}\r\n}\r\n \"\"\".format( # pylint: disable=missing-format-argument-key\r\n component_name=self.component_name,\r\n unique_name=self.unique_name,\r\n driver=self.driver,\r\n executor=self.executor,\r\n input_dict=self.input_dict,\r\n outputs=self.outputs,\r\n exec_properties=self.exec_properties)\r\n```\r\n\r\nTo:\r\n\r\n```\r\n def __str__(self):\r\n return \"\"\"\r\n{{\r\n component_name: {component_name},\r\n unique_name: {unique_name},\r\n driver: {driver},\r\n executor: {executor},\r\n input_dict: {input_dict},\r\n outputs: {outputs},\r\n exec_properties: {exec_properties}\r\n}}\r\n \"\"\".format( # pylint: disable=missing-format-argument-key\r\n component_name=self.component_name,\r\n unique_name=self.unique_name,\r\n driver=self.driver,\r\n executor=self.executor,\r\n input_dict=self.input_dict,\r\n outputs=self.outputs,\r\n exec_properties=self.exec_properties)\r\n```\n", "before_files": [{"content": "# Copyright 2019 Google LLC. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Base class for all TFX components.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport abc\nfrom six import with_metaclass\nfrom typing import Any\nfrom typing import Dict\nfrom typing import Optional\nfrom typing import Text\n\nfrom tfx.utils import channel\n\n\nclass ComponentOutputs(object):\n \"\"\"Helper class to wrap outputs from TFX components.\"\"\"\n\n def __init__(self, d):\n self.__dict__ = d\n\n def get_all(self):\n return self.__dict__\n\n\nclass BaseComponent(with_metaclass(abc.ABCMeta, object)):\n \"\"\"Base TFX component.\n\n This is the parent class of any TFX component.\n\n Attributes:\n component_name: Name of the component, should be unique per component class.\n unique_name: Unique name for every component class instance.\n driver: Driver class to handle pre-execution behaviors in a component.\n executor: Executor class to do the real execution work.\n input_dict: A [Text -> Channel] dict serving as the inputs to the component.\n exec_properties: A [Text -> Any] dict serving as additional properties\n needed for execution.\n outputs: Optional Channel destinations of the component.\n \"\"\"\n\n def __init__(self,\n component_name,\n driver,\n executor,\n input_dict,\n exec_properties,\n unique_name = '',\n outputs = ComponentOutputs({})):\n self.component_name = component_name\n self.driver = driver\n self.executor = executor\n self.input_dict = input_dict\n self.exec_properties = exec_properties\n self.unique_name = unique_name\n self.outputs = outputs or self._create_outputs()\n self._type_check(self.input_dict, self.exec_properties)\n\n def __str__(self):\n return \"\"\"\n{\n component_name: {component_name},\n unique_name: {unique_name},\n driver: {driver},\n executor: {executor},\n input_dict: {input_dict},\n outputs: {outputs},\n exec_properties: {exec_properties}\n}\n \"\"\".format( # pylint: disable=missing-format-argument-key\n component_name=self.component_name,\n unique_name=self.unique_name,\n driver=self.driver,\n executor=self.executor,\n input_dict=self.input_dict,\n outputs=self.outputs,\n exec_properties=self.exec_properties)\n\n def __repr__(self):\n return self.__str__()\n\n @abc.abstractmethod\n def _create_outputs(self):\n \"\"\"Creates outputs placeholder for components.\n\n Returns:\n ComponentOutputs object containing the dict of [Text -> Channel]\n \"\"\"\n raise NotImplementedError\n\n @abc.abstractmethod\n def _type_check(self, input_dict,\n exec_properties):\n \"\"\"Does type checking for the inputs and exec_properties.\n\n Args:\n input_dict: A Dict[Text, Channel] as the inputs of the Component.\n exec_properties: A Dict[Text, Any] as the execution properties of the\n component.\n \"\"\"\n raise NotImplementedError\n", "path": "tfx/components/base/base_component.py"}], "after_files": [{"content": "# Copyright 2019 Google LLC. 
All Rights Reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\"\"\"Base class for all TFX components.\"\"\"\n\nfrom __future__ import absolute_import\nfrom __future__ import division\nfrom __future__ import print_function\n\nimport abc\nfrom six import with_metaclass\nfrom typing import Any\nfrom typing import Dict\nfrom typing import Optional\nfrom typing import Text\n\nfrom tfx.utils import channel\n\n\nclass ComponentOutputs(object):\n \"\"\"Helper class to wrap outputs from TFX components.\"\"\"\n\n def __init__(self, d):\n self.__dict__ = d\n\n def get_all(self):\n return self.__dict__\n\n\nclass BaseComponent(with_metaclass(abc.ABCMeta, object)):\n \"\"\"Base TFX component.\n\n This is the parent class of any TFX component.\n\n Attributes:\n component_name: Name of the component, should be unique per component class.\n unique_name: Unique name for every component class instance.\n driver: Driver class to handle pre-execution behaviors in a component.\n executor: Executor class to do the real execution work.\n input_dict: A [Text -> Channel] dict serving as the inputs to the component.\n exec_properties: A [Text -> Any] dict serving as additional properties\n needed for execution.\n outputs: Optional Channel destinations of the component.\n \"\"\"\n\n def __init__(self,\n component_name,\n driver,\n executor,\n input_dict,\n exec_properties,\n unique_name = '',\n outputs = ComponentOutputs({})):\n self.component_name = component_name\n self.driver = driver\n self.executor = executor\n self.input_dict = input_dict\n self.exec_properties = exec_properties\n self.unique_name = unique_name\n self.outputs = outputs or self._create_outputs()\n self._type_check(self.input_dict, self.exec_properties)\n\n def __str__(self):\n return \"\"\"\n{{\n component_name: {component_name},\n unique_name: {unique_name},\n driver: {driver},\n executor: {executor},\n input_dict: {input_dict},\n outputs: {outputs},\n exec_properties: {exec_properties}\n}}\n \"\"\".format( # pylint: disable=missing-format-argument-key\n component_name=self.component_name,\n unique_name=self.unique_name,\n driver=self.driver,\n executor=self.executor,\n input_dict=self.input_dict,\n outputs=self.outputs,\n exec_properties=self.exec_properties)\n\n def __repr__(self):\n return self.__str__()\n\n @abc.abstractmethod\n def _create_outputs(self):\n \"\"\"Creates outputs placeholder for components.\n\n Returns:\n ComponentOutputs object containing the dict of [Text -> Channel]\n \"\"\"\n raise NotImplementedError\n\n @abc.abstractmethod\n def _type_check(self, input_dict,\n exec_properties):\n \"\"\"Does type checking for the inputs and exec_properties.\n\n Args:\n input_dict: A Dict[Text, Channel] as the inputs of the Component.\n exec_properties: A Dict[Text, Any] as the execution properties of the\n component.\n \"\"\"\n raise NotImplementedError\n", "path": "tfx/components/base/base_component.py"}]} | 1,765 | 158 |
gh_patches_debug_31131 | rasdani/github-patches | git_diff | saleor__saleor-2979 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Return user's geolocalization based on IP
The API should return geolocalization data guessed from the user's IP. We already have all the logic in the `saleor.core.middleware.country` function. What needs to be done is to wrap this data in a GraphQL type and tie it into the API.
E.g.:
```
shop {
geolocalization {
countryCode
}
}
```
Should it return DEFAULT_COUNTRY from settings as a fallback?
@mociepka Please provide more information about what data you'd need in the storefront.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `saleor/graphql/shop/types.py`
Content:
```
1 import graphene
2 from django.conf import settings
3 from django_countries import countries
4 from graphql_jwt.decorators import permission_required
5 from phonenumbers import COUNTRY_CODE_TO_REGION_CODE
6
7 from ...core.permissions import get_permissions
8 from ...site import models as site_models
9 from ..core.types.common import (
10 CountryDisplay, LanguageDisplay, PermissionDisplay, WeightUnitsEnum)
11 from ..menu.types import Menu
12 from ..product.types import Collection
13 from ..utils import format_permissions_for_display
14
15
16 class Navigation(graphene.ObjectType):
17 main = graphene.Field(Menu, description='Main navigation bar.')
18 secondary = graphene.Field(Menu, description='Secondary navigation bar.')
19
20 class Meta:
21 description = 'Represents shop\'s navigation menus.'
22
23
24 class AuthorizationKey(graphene.ObjectType):
25 name = graphene.String(description='Name of the key.', required=True)
26 key = graphene.String(description='Value of the key.', required=True)
27
28
29 class Domain(graphene.ObjectType):
30 host = graphene.String(
31 description='The host name of the domain.', required=True)
32 ssl_enabled = graphene.Boolean(
33 description='Inform if SSL is enabled.', required=True)
34 url = graphene.String(
35 description='Shop\'s absolute URL.', required=True)
36
37 class Meta:
38 description = 'Represents shop\'s domain.'
39
40
41 class Shop(graphene.ObjectType):
42 authorization_keys = graphene.List(
43 AuthorizationKey, description='List of configured authorization keys.',
44 required=True)
45 countries = graphene.List(
46 CountryDisplay, description='List of countries available in the shop.',
47 required=True)
48 currencies = graphene.List(
49 graphene.String, description='List of available currencies.',
50 required=True)
51 default_currency = graphene.String(
52 description='Default shop\'s currency.', required=True)
53 default_country = graphene.Field(
54 CountryDisplay, description='Default shop\'s country')
55 description = graphene.String(description='Shop\'s description.')
56 domain = graphene.Field(
57 Domain, required=True, description='Shop\'s domain data.')
58 homepage_collection = graphene.Field(
59 Collection, description='Collection displayed on homepage')
60 languages = graphene.List(
61 LanguageDisplay,
62 description='List of the shops\'s supported languages.', required=True)
63 name = graphene.String(description='Shop\'s name.', required=True)
64 navigation = graphene.Field(
65 Navigation, description='Shop\'s navigation.')
66 permissions = graphene.List(
67 PermissionDisplay, description='List of available permissions.',
68 required=True)
69 phone_prefixes = graphene.List(
70 graphene.String, description='List of possible phone prefixes.',
71 required=True)
72 header_text = graphene.String(description='Header text')
73 include_taxes_in_prices = graphene.Boolean(
74 description='Include taxes in prices')
75 display_gross_prices = graphene.Boolean(
76 description='Display prices with tax in store')
77 track_inventory_by_default = graphene.Boolean(
78 description='Enable inventory tracking')
79 default_weight_unit = WeightUnitsEnum(description='Default weight unit')
80
81 class Meta:
82 description = '''
83 Represents a shop resource containing general shop\'s data
84 and configuration.'''
85
86 @permission_required('site.manage_settings')
87 def resolve_authorization_keys(self, info):
88 return site_models.AuthorizationKey.objects.all()
89
90 def resolve_countries(self, info):
91 return [
92 CountryDisplay(code=country[0], country=country[1])
93 for country in countries]
94
95 def resolve_currencies(self, info):
96 return settings.AVAILABLE_CURRENCIES
97
98 def resolve_domain(self, info):
99 site = info.context.site
100 return Domain(
101 host=site.domain,
102 ssl_enabled=settings.ENABLE_SSL,
103 url=info.context.build_absolute_uri('/'))
104
105 def resolve_default_currency(self, info):
106 return settings.DEFAULT_CURRENCY
107
108 def resolve_description(self, info):
109 return info.context.site.settings.description
110
111 def resolve_homepage_collection(self, info):
112 return info.context.site.settings.homepage_collection
113
114 def resolve_languages(self, info):
115 return [
116 LanguageDisplay(code=language[0], language=language[1])
117 for language in settings.LANGUAGES]
118
119 def resolve_name(self, info):
120 return info.context.site.name
121
122 def resolve_navigation(self, info):
123 site_settings = info.context.site.settings
124 return Navigation(
125 main=site_settings.top_menu, secondary=site_settings.bottom_menu)
126
127 @permission_required('site.manage_settings')
128 def resolve_permissions(self, info):
129 permissions = get_permissions()
130 return format_permissions_for_display(permissions)
131
132 def resolve_phone_prefixes(self, info):
133 return list(COUNTRY_CODE_TO_REGION_CODE.keys())
134
135 def resolve_header_text(self, info):
136 return info.context.site.settings.header_text
137
138 def resolve_include_taxes_in_prices(self, info):
139 return info.context.site.settings.include_taxes_in_prices
140
141 def resolve_display_gross_prices(self, info):
142 return info.context.site.settings.display_gross_prices
143
144 def resolve_track_inventory_by_default(self, info):
145 return info.context.site.settings.track_inventory_by_default
146
147 def resolve_default_weight_unit(self, info):
148 return info.context.site.settings.default_weight_unit
149
150 def resolve_default_country(self, info):
151 default_country_code = settings.DEFAULT_COUNTRY
152 default_country_name = countries.countries.get(default_country_code)
153 if default_country_name:
154 default_country = CountryDisplay(
155 code=default_country_code, country=default_country_name)
156 else:
157 default_country = None
158 return default_country
159
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/saleor/graphql/shop/types.py b/saleor/graphql/shop/types.py
--- a/saleor/graphql/shop/types.py
+++ b/saleor/graphql/shop/types.py
@@ -5,6 +5,7 @@
from phonenumbers import COUNTRY_CODE_TO_REGION_CODE
from ...core.permissions import get_permissions
+from ...core.utils import get_client_ip, get_country_by_ip
from ...site import models as site_models
from ..core.types.common import (
CountryDisplay, LanguageDisplay, PermissionDisplay, WeightUnitsEnum)
@@ -38,7 +39,19 @@
description = 'Represents shop\'s domain.'
+class Geolocalization(graphene.ObjectType):
+ country = graphene.Field(
+ CountryDisplay,
+ description='Country of the user acquired by his IP address.')
+
+ class Meta:
+ description = 'Represents customers\'s geolocalization data.'
+
+
class Shop(graphene.ObjectType):
+ geolocalization = graphene.Field(
+ Geolocalization,
+ description='Customer\'s geolocalization data.')
authorization_keys = graphene.List(
AuthorizationKey, description='List of configured authorization keys.',
required=True)
@@ -102,6 +115,15 @@
ssl_enabled=settings.ENABLE_SSL,
url=info.context.build_absolute_uri('/'))
+ def resolve_geolocalization(self, info):
+ client_ip = get_client_ip(info.context)
+ country = get_country_by_ip(client_ip)
+ if country:
+ return Geolocalization(
+ country=CountryDisplay(
+ code=country.code, country=country.name))
+ return Geolocalization(country=None)
+
def resolve_default_currency(self, info):
return settings.DEFAULT_CURRENCY
| {"golden_diff": "diff --git a/saleor/graphql/shop/types.py b/saleor/graphql/shop/types.py\n--- a/saleor/graphql/shop/types.py\n+++ b/saleor/graphql/shop/types.py\n@@ -5,6 +5,7 @@\n from phonenumbers import COUNTRY_CODE_TO_REGION_CODE\n \n from ...core.permissions import get_permissions\n+from ...core.utils import get_client_ip, get_country_by_ip\n from ...site import models as site_models\n from ..core.types.common import (\n CountryDisplay, LanguageDisplay, PermissionDisplay, WeightUnitsEnum)\n@@ -38,7 +39,19 @@\n description = 'Represents shop\\'s domain.'\n \n \n+class Geolocalization(graphene.ObjectType):\n+ country = graphene.Field(\n+ CountryDisplay,\n+ description='Country of the user acquired by his IP address.')\n+\n+ class Meta:\n+ description = 'Represents customers\\'s geolocalization data.'\n+\n+\n class Shop(graphene.ObjectType):\n+ geolocalization = graphene.Field(\n+ Geolocalization,\n+ description='Customer\\'s geolocalization data.')\n authorization_keys = graphene.List(\n AuthorizationKey, description='List of configured authorization keys.',\n required=True)\n@@ -102,6 +115,15 @@\n ssl_enabled=settings.ENABLE_SSL,\n url=info.context.build_absolute_uri('/'))\n \n+ def resolve_geolocalization(self, info):\n+ client_ip = get_client_ip(info.context)\n+ country = get_country_by_ip(client_ip)\n+ if country:\n+ return Geolocalization(\n+ country=CountryDisplay(\n+ code=country.code, country=country.name))\n+ return Geolocalization(country=None)\n+\n def resolve_default_currency(self, info):\n return settings.DEFAULT_CURRENCY\n", "issue": "Return user's geolocalization based on IP\nAPI should return geolocalization data guessed from users's IP. We already have all the logic in the `saleor.core.middleware.country` function. What needs to be done is to wrap this data in a GraphQL type and tie it up to the API. 
\r\n\r\nE.g.:\r\n```\r\nshop {\r\n geolocalization {\r\n countryCode\r\n }\r\n}\r\n```\r\n\r\nShould it return DEFAULT_COUNTRY from settings as a fallback?\r\n\r\n@mociepka Please provide more information about what data you'd need in the storefront.\n", "before_files": [{"content": "import graphene\nfrom django.conf import settings\nfrom django_countries import countries\nfrom graphql_jwt.decorators import permission_required\nfrom phonenumbers import COUNTRY_CODE_TO_REGION_CODE\n\nfrom ...core.permissions import get_permissions\nfrom ...site import models as site_models\nfrom ..core.types.common import (\n CountryDisplay, LanguageDisplay, PermissionDisplay, WeightUnitsEnum)\nfrom ..menu.types import Menu\nfrom ..product.types import Collection\nfrom ..utils import format_permissions_for_display\n\n\nclass Navigation(graphene.ObjectType):\n main = graphene.Field(Menu, description='Main navigation bar.')\n secondary = graphene.Field(Menu, description='Secondary navigation bar.')\n\n class Meta:\n description = 'Represents shop\\'s navigation menus.'\n\n\nclass AuthorizationKey(graphene.ObjectType):\n name = graphene.String(description='Name of the key.', required=True)\n key = graphene.String(description='Value of the key.', required=True)\n\n\nclass Domain(graphene.ObjectType):\n host = graphene.String(\n description='The host name of the domain.', required=True)\n ssl_enabled = graphene.Boolean(\n description='Inform if SSL is enabled.', required=True)\n url = graphene.String(\n description='Shop\\'s absolute URL.', required=True)\n\n class Meta:\n description = 'Represents shop\\'s domain.'\n\n\nclass Shop(graphene.ObjectType):\n authorization_keys = graphene.List(\n AuthorizationKey, description='List of configured authorization keys.',\n required=True)\n countries = graphene.List(\n CountryDisplay, description='List of countries available in the shop.',\n required=True)\n currencies = graphene.List(\n graphene.String, description='List of available currencies.',\n required=True)\n default_currency = graphene.String(\n description='Default shop\\'s currency.', required=True)\n default_country = graphene.Field(\n CountryDisplay, description='Default shop\\'s country')\n description = graphene.String(description='Shop\\'s description.')\n domain = graphene.Field(\n Domain, required=True, description='Shop\\'s domain data.')\n homepage_collection = graphene.Field(\n Collection, description='Collection displayed on homepage')\n languages = graphene.List(\n LanguageDisplay,\n description='List of the shops\\'s supported languages.', required=True)\n name = graphene.String(description='Shop\\'s name.', required=True)\n navigation = graphene.Field(\n Navigation, description='Shop\\'s navigation.')\n permissions = graphene.List(\n PermissionDisplay, description='List of available permissions.',\n required=True)\n phone_prefixes = graphene.List(\n graphene.String, description='List of possible phone prefixes.',\n required=True)\n header_text = graphene.String(description='Header text')\n include_taxes_in_prices = graphene.Boolean(\n description='Include taxes in prices')\n display_gross_prices = graphene.Boolean(\n description='Display prices with tax in store')\n track_inventory_by_default = graphene.Boolean(\n description='Enable inventory tracking')\n default_weight_unit = WeightUnitsEnum(description='Default weight unit')\n\n class Meta:\n description = '''\n Represents a shop resource containing general shop\\'s data\n and configuration.'''\n\n 
@permission_required('site.manage_settings')\n def resolve_authorization_keys(self, info):\n return site_models.AuthorizationKey.objects.all()\n\n def resolve_countries(self, info):\n return [\n CountryDisplay(code=country[0], country=country[1])\n for country in countries]\n\n def resolve_currencies(self, info):\n return settings.AVAILABLE_CURRENCIES\n\n def resolve_domain(self, info):\n site = info.context.site\n return Domain(\n host=site.domain,\n ssl_enabled=settings.ENABLE_SSL,\n url=info.context.build_absolute_uri('/'))\n\n def resolve_default_currency(self, info):\n return settings.DEFAULT_CURRENCY\n\n def resolve_description(self, info):\n return info.context.site.settings.description\n\n def resolve_homepage_collection(self, info):\n return info.context.site.settings.homepage_collection\n\n def resolve_languages(self, info):\n return [\n LanguageDisplay(code=language[0], language=language[1])\n for language in settings.LANGUAGES]\n\n def resolve_name(self, info):\n return info.context.site.name\n\n def resolve_navigation(self, info):\n site_settings = info.context.site.settings\n return Navigation(\n main=site_settings.top_menu, secondary=site_settings.bottom_menu)\n\n @permission_required('site.manage_settings')\n def resolve_permissions(self, info):\n permissions = get_permissions()\n return format_permissions_for_display(permissions)\n\n def resolve_phone_prefixes(self, info):\n return list(COUNTRY_CODE_TO_REGION_CODE.keys())\n\n def resolve_header_text(self, info):\n return info.context.site.settings.header_text\n\n def resolve_include_taxes_in_prices(self, info):\n return info.context.site.settings.include_taxes_in_prices\n\n def resolve_display_gross_prices(self, info):\n return info.context.site.settings.display_gross_prices\n\n def resolve_track_inventory_by_default(self, info):\n return info.context.site.settings.track_inventory_by_default\n\n def resolve_default_weight_unit(self, info):\n return info.context.site.settings.default_weight_unit\n\n def resolve_default_country(self, info):\n default_country_code = settings.DEFAULT_COUNTRY\n default_country_name = countries.countries.get(default_country_code)\n if default_country_name:\n default_country = CountryDisplay(\n code=default_country_code, country=default_country_name)\n else:\n default_country = None\n return default_country\n", "path": "saleor/graphql/shop/types.py"}], "after_files": [{"content": "import graphene\nfrom django.conf import settings\nfrom django_countries import countries\nfrom graphql_jwt.decorators import permission_required\nfrom phonenumbers import COUNTRY_CODE_TO_REGION_CODE\n\nfrom ...core.permissions import get_permissions\nfrom ...core.utils import get_client_ip, get_country_by_ip\nfrom ...site import models as site_models\nfrom ..core.types.common import (\n CountryDisplay, LanguageDisplay, PermissionDisplay, WeightUnitsEnum)\nfrom ..menu.types import Menu\nfrom ..product.types import Collection\nfrom ..utils import format_permissions_for_display\n\n\nclass Navigation(graphene.ObjectType):\n main = graphene.Field(Menu, description='Main navigation bar.')\n secondary = graphene.Field(Menu, description='Secondary navigation bar.')\n\n class Meta:\n description = 'Represents shop\\'s navigation menus.'\n\n\nclass AuthorizationKey(graphene.ObjectType):\n name = graphene.String(description='Name of the key.', required=True)\n key = graphene.String(description='Value of the key.', required=True)\n\n\nclass Domain(graphene.ObjectType):\n host = graphene.String(\n description='The host name of the 
domain.', required=True)\n ssl_enabled = graphene.Boolean(\n description='Inform if SSL is enabled.', required=True)\n url = graphene.String(\n description='Shop\\'s absolute URL.', required=True)\n\n class Meta:\n description = 'Represents shop\\'s domain.'\n\n\nclass Geolocalization(graphene.ObjectType):\n country = graphene.Field(\n CountryDisplay,\n description='Country of the user acquired by his IP address.')\n\n class Meta:\n description = 'Represents customers\\'s geolocalization data.'\n\n\nclass Shop(graphene.ObjectType):\n geolocalization = graphene.Field(\n Geolocalization,\n description='Customer\\'s geolocalization data.')\n authorization_keys = graphene.List(\n AuthorizationKey, description='List of configured authorization keys.',\n required=True)\n countries = graphene.List(\n CountryDisplay, description='List of countries available in the shop.',\n required=True)\n currencies = graphene.List(\n graphene.String, description='List of available currencies.',\n required=True)\n default_currency = graphene.String(\n description='Default shop\\'s currency.', required=True)\n default_country = graphene.Field(\n CountryDisplay, description='Default shop\\'s country')\n description = graphene.String(description='Shop\\'s description.')\n domain = graphene.Field(\n Domain, required=True, description='Shop\\'s domain data.')\n homepage_collection = graphene.Field(\n Collection, description='Collection displayed on homepage')\n languages = graphene.List(\n LanguageDisplay,\n description='List of the shops\\'s supported languages.', required=True)\n name = graphene.String(description='Shop\\'s name.', required=True)\n navigation = graphene.Field(\n Navigation, description='Shop\\'s navigation.')\n permissions = graphene.List(\n PermissionDisplay, description='List of available permissions.',\n required=True)\n phone_prefixes = graphene.List(\n graphene.String, description='List of possible phone prefixes.',\n required=True)\n header_text = graphene.String(description='Header text')\n include_taxes_in_prices = graphene.Boolean(\n description='Include taxes in prices')\n display_gross_prices = graphene.Boolean(\n description='Display prices with tax in store')\n track_inventory_by_default = graphene.Boolean(\n description='Enable inventory tracking')\n default_weight_unit = WeightUnitsEnum(description='Default weight unit')\n\n class Meta:\n description = '''\n Represents a shop resource containing general shop\\'s data\n and configuration.'''\n\n @permission_required('site.manage_settings')\n def resolve_authorization_keys(self, info):\n return site_models.AuthorizationKey.objects.all()\n\n def resolve_countries(self, info):\n return [\n CountryDisplay(code=country[0], country=country[1])\n for country in countries]\n\n def resolve_currencies(self, info):\n return settings.AVAILABLE_CURRENCIES\n\n def resolve_domain(self, info):\n site = info.context.site\n return Domain(\n host=site.domain,\n ssl_enabled=settings.ENABLE_SSL,\n url=info.context.build_absolute_uri('/'))\n\n def resolve_geolocalization(self, info):\n client_ip = get_client_ip(info.context)\n country = get_country_by_ip(client_ip)\n if country:\n return Geolocalization(\n country=CountryDisplay(\n code=country.code, country=country.name))\n return Geolocalization(country=None)\n\n def resolve_default_currency(self, info):\n return settings.DEFAULT_CURRENCY\n\n def resolve_description(self, info):\n return info.context.site.settings.description\n\n def resolve_homepage_collection(self, info):\n return 
info.context.site.settings.homepage_collection\n\n def resolve_languages(self, info):\n return [\n LanguageDisplay(code=language[0], language=language[1])\n for language in settings.LANGUAGES]\n\n def resolve_name(self, info):\n return info.context.site.name\n\n def resolve_navigation(self, info):\n site_settings = info.context.site.settings\n return Navigation(\n main=site_settings.top_menu, secondary=site_settings.bottom_menu)\n\n @permission_required('site.manage_settings')\n def resolve_permissions(self, info):\n permissions = get_permissions()\n return format_permissions_for_display(permissions)\n\n def resolve_phone_prefixes(self, info):\n return list(COUNTRY_CODE_TO_REGION_CODE.keys())\n\n def resolve_header_text(self, info):\n return info.context.site.settings.header_text\n\n def resolve_include_taxes_in_prices(self, info):\n return info.context.site.settings.include_taxes_in_prices\n\n def resolve_display_gross_prices(self, info):\n return info.context.site.settings.display_gross_prices\n\n def resolve_track_inventory_by_default(self, info):\n return info.context.site.settings.track_inventory_by_default\n\n def resolve_default_weight_unit(self, info):\n return info.context.site.settings.default_weight_unit\n\n def resolve_default_country(self, info):\n default_country_code = settings.DEFAULT_COUNTRY\n default_country_name = countries.countries.get(default_country_code)\n if default_country_name:\n default_country = CountryDisplay(\n code=default_country_code, country=default_country_name)\n else:\n default_country = None\n return default_country\n", "path": "saleor/graphql/shop/types.py"}]} | 1,893 | 388 |
gh_patches_debug_37626 | rasdani/github-patches | git_diff | pre-commit__pre-commit-2774 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
pre-commit can delete/revert unstaged files if error occurs during git diff-index
### search you tried in the issue tracker
diff-index
### describe your issue
I performed a git commit with some modifications unstaged. After the commit, most of the modifications had been reverted and my work was lost. The diff saved in the patch directory contained only a few of the modifications - the ones that survived. The rest were gone.
To reproduce:
- Modify four files and stage one with `git add`
- Use `git status` to determine the order of the three unstaged files.
- Change the permission on the middle one so that git will not be able to read it
- Now do `git commit`: the changes to the first unstaged file will be preserved but the other two will be lost.
The key point, I think, is that the code in `staged_files_only.py` checks that the return code when creating the diff is non-zero, and takes that to mean the code is `1`, i.e. that there were diffs. However, in this case the return code is `128`, which *is* non-zero but does _not_ mean success - it means error. So the code assumes the diff is OK even though it is incomplete.
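
For illustration, here is a minimal sketch of how the three cases could be told apart, following git's documented `--exit-code` semantics. It assumes Python 3.7+ for `subprocess.run(capture_output=...)`, and the helper name is made up:

```python
import subprocess

def diff_index_or_fail(tree: str) -> bytes:
    """Run `git diff-index --exit-code` and treat each exit code distinctly:
    0 = no unstaged changes, 1 = a complete diff was produced, anything
    else (e.g. 128 for an unreadable file) = error, so the output is not
    a trustworthy patch and must not be used for stashing."""
    proc = subprocess.run(
        ['git', 'diff-index', '--binary', '--exit-code', tree, '--'],
        capture_output=True,
    )
    if proc.returncode == 0:
        return b''           # nothing to stash
    if proc.returncode == 1:
        return proc.stdout   # safe to save as a patch
    raise RuntimeError(
        'git diff-index failed (%d): %s'
        % (proc.returncode, proc.stderr.decode(errors='replace'))
    )
```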
### pre-commit --version
2.17.0
### .pre-commit-config.yaml
```yaml
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: 56b4a7e506901ff86f8de5c2551bc41f8eacf717
hooks:
- id: check-yaml
# - id: end-of-file-fixer
- id: trailing-whitespace
- repo: https://github.com/psf/black
rev: 21.11b0
hooks:
- id: black
language_version: python3.6
- repo: https://github.com/PyCQA/isort
rev: 5.10.1
hooks:
- id: isort
args: ["--profile", "black", "--filter-files"]
```
### ~/.cache/pre-commit/pre-commit.log (if present)
_No response_
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `pre_commit/staged_files_only.py`
Content:
```
1 from __future__ import annotations
2
3 import contextlib
4 import logging
5 import os.path
6 import time
7 from typing import Generator
8
9 from pre_commit import git
10 from pre_commit.util import CalledProcessError
11 from pre_commit.util import cmd_output
12 from pre_commit.util import cmd_output_b
13 from pre_commit.xargs import xargs
14
15
16 logger = logging.getLogger('pre_commit')
17
18 # without forcing submodule.recurse=0, changes in nested submodules will be
19 # discarded if `submodule.recurse=1` is configured
20 # we choose this instead of `--no-recurse-submodules` because it works on
21 # versions of git before that option was added to `git checkout`
22 _CHECKOUT_CMD = ('git', '-c', 'submodule.recurse=0', 'checkout', '--', '.')
23
24
25 def _git_apply(patch: str) -> None:
26 args = ('apply', '--whitespace=nowarn', patch)
27 try:
28 cmd_output_b('git', *args)
29 except CalledProcessError:
30 # Retry with autocrlf=false -- see #570
31 cmd_output_b('git', '-c', 'core.autocrlf=false', *args)
32
33
34 @contextlib.contextmanager
35 def _intent_to_add_cleared() -> Generator[None, None, None]:
36 intent_to_add = git.intent_to_add_files()
37 if intent_to_add:
38 logger.warning('Unstaged intent-to-add files detected.')
39
40 xargs(('git', 'rm', '--cached', '--'), intent_to_add)
41 try:
42 yield
43 finally:
44 xargs(('git', 'add', '--intent-to-add', '--'), intent_to_add)
45 else:
46 yield
47
48
49 @contextlib.contextmanager
50 def _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]:
51 tree = cmd_output('git', 'write-tree')[1].strip()
52 retcode, diff_stdout_binary, _ = cmd_output_b(
53 'git', 'diff-index', '--ignore-submodules', '--binary',
54 '--exit-code', '--no-color', '--no-ext-diff', tree, '--',
55 check=False,
56 )
57 if retcode and diff_stdout_binary.strip():
58 patch_filename = f'patch{int(time.time())}-{os.getpid()}'
59 patch_filename = os.path.join(patch_dir, patch_filename)
60 logger.warning('Unstaged files detected.')
61 logger.info(f'Stashing unstaged files to {patch_filename}.')
62 # Save the current unstaged changes as a patch
63 os.makedirs(patch_dir, exist_ok=True)
64 with open(patch_filename, 'wb') as patch_file:
65 patch_file.write(diff_stdout_binary)
66
67 # prevent recursive post-checkout hooks (#1418)
68 no_checkout_env = dict(os.environ, _PRE_COMMIT_SKIP_POST_CHECKOUT='1')
69
70 try:
71 cmd_output_b(*_CHECKOUT_CMD, env=no_checkout_env)
72 yield
73 finally:
74 # Try to apply the patch we saved
75 try:
76 _git_apply(patch_filename)
77 except CalledProcessError:
78 logger.warning(
79 'Stashed changes conflicted with hook auto-fixes... '
80 'Rolling back fixes...',
81 )
82 # We failed to apply the patch, presumably due to fixes made
83 # by hooks.
84 # Roll back the changes made by hooks.
85 cmd_output_b(*_CHECKOUT_CMD, env=no_checkout_env)
86 _git_apply(patch_filename)
87
88 logger.info(f'Restored changes from {patch_filename}.')
89 else:
90 # There weren't any staged files so we don't need to do anything
91 # special
92 yield
93
94
95 @contextlib.contextmanager
96 def staged_files_only(patch_dir: str) -> Generator[None, None, None]:
97 """Clear any unstaged changes from the git working directory inside this
98 context.
99 """
100 with _intent_to_add_cleared(), _unstaged_changes_cleared(patch_dir):
101 yield
102
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/pre_commit/staged_files_only.py b/pre_commit/staged_files_only.py
--- a/pre_commit/staged_files_only.py
+++ b/pre_commit/staged_files_only.py
@@ -7,6 +7,7 @@
from typing import Generator
from pre_commit import git
+from pre_commit.errors import FatalError
from pre_commit.util import CalledProcessError
from pre_commit.util import cmd_output
from pre_commit.util import cmd_output_b
@@ -49,12 +50,16 @@
@contextlib.contextmanager
def _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]:
tree = cmd_output('git', 'write-tree')[1].strip()
- retcode, diff_stdout_binary, _ = cmd_output_b(
+ diff_cmd = (
'git', 'diff-index', '--ignore-submodules', '--binary',
'--exit-code', '--no-color', '--no-ext-diff', tree, '--',
- check=False,
)
- if retcode and diff_stdout_binary.strip():
+ retcode, diff_stdout, diff_stderr = cmd_output_b(*diff_cmd, check=False)
+ if retcode == 0:
+ # There weren't any staged files so we don't need to do anything
+ # special
+ yield
+ elif retcode == 1 and diff_stdout.strip():
patch_filename = f'patch{int(time.time())}-{os.getpid()}'
patch_filename = os.path.join(patch_dir, patch_filename)
logger.warning('Unstaged files detected.')
@@ -62,7 +67,7 @@
# Save the current unstaged changes as a patch
os.makedirs(patch_dir, exist_ok=True)
with open(patch_filename, 'wb') as patch_file:
- patch_file.write(diff_stdout_binary)
+ patch_file.write(diff_stdout)
# prevent recursive post-checkout hooks (#1418)
no_checkout_env = dict(os.environ, _PRE_COMMIT_SKIP_POST_CHECKOUT='1')
@@ -86,10 +91,12 @@
_git_apply(patch_filename)
logger.info(f'Restored changes from {patch_filename}.')
- else:
- # There weren't any staged files so we don't need to do anything
- # special
- yield
+ else: # pragma: win32 no cover
+ # some error occurred while requesting the diff
+ e = CalledProcessError(retcode, diff_cmd, b'', diff_stderr)
+ raise FatalError(
+ f'pre-commit failed to diff -- perhaps due to permissions?\n\n{e}',
+ )
@contextlib.contextmanager
| {"golden_diff": "diff --git a/pre_commit/staged_files_only.py b/pre_commit/staged_files_only.py\n--- a/pre_commit/staged_files_only.py\n+++ b/pre_commit/staged_files_only.py\n@@ -7,6 +7,7 @@\n from typing import Generator\n \n from pre_commit import git\n+from pre_commit.errors import FatalError\n from pre_commit.util import CalledProcessError\n from pre_commit.util import cmd_output\n from pre_commit.util import cmd_output_b\n@@ -49,12 +50,16 @@\n @contextlib.contextmanager\n def _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]:\n tree = cmd_output('git', 'write-tree')[1].strip()\n- retcode, diff_stdout_binary, _ = cmd_output_b(\n+ diff_cmd = (\n 'git', 'diff-index', '--ignore-submodules', '--binary',\n '--exit-code', '--no-color', '--no-ext-diff', tree, '--',\n- check=False,\n )\n- if retcode and diff_stdout_binary.strip():\n+ retcode, diff_stdout, diff_stderr = cmd_output_b(*diff_cmd, check=False)\n+ if retcode == 0:\n+ # There weren't any staged files so we don't need to do anything\n+ # special\n+ yield\n+ elif retcode == 1 and diff_stdout.strip():\n patch_filename = f'patch{int(time.time())}-{os.getpid()}'\n patch_filename = os.path.join(patch_dir, patch_filename)\n logger.warning('Unstaged files detected.')\n@@ -62,7 +67,7 @@\n # Save the current unstaged changes as a patch\n os.makedirs(patch_dir, exist_ok=True)\n with open(patch_filename, 'wb') as patch_file:\n- patch_file.write(diff_stdout_binary)\n+ patch_file.write(diff_stdout)\n \n # prevent recursive post-checkout hooks (#1418)\n no_checkout_env = dict(os.environ, _PRE_COMMIT_SKIP_POST_CHECKOUT='1')\n@@ -86,10 +91,12 @@\n _git_apply(patch_filename)\n \n logger.info(f'Restored changes from {patch_filename}.')\n- else:\n- # There weren't any staged files so we don't need to do anything\n- # special\n- yield\n+ else: # pragma: win32 no cover\n+ # some error occurred while requesting the diff\n+ e = CalledProcessError(retcode, diff_cmd, b'', diff_stderr)\n+ raise FatalError(\n+ f'pre-commit failed to diff -- perhaps due to permissions?\\n\\n{e}',\n+ )\n \n \n @contextlib.contextmanager\n", "issue": "pre-commit can delete/revert unstaged files if error occurs during git diff-index\n### search you tried in the issue tracker\n\ndiff-index\n\n### describe your issue\n\nI performed a git commit with some modifications unstaged. After the commit, most of the modifications had been reverted and my work was lost. The diff saved in the patch directory had only a few of the modifications in - the ones that survived. The rest were gone.\r\n\r\nTo reproduce:\r\n- Modify four files and stage one with `git add`\r\n- Use `git status` to determine the order of the three unstaged files.\r\n- Change the permission on the middle one so that git will not be able to read it\r\n- Now do `git commit`: the changes to the first unstaged file will be preserved but the other two will be lost.\r\n\r\nThe key point, I think, is that the code in `staged_files_only.py` checks that the return code when creating the diff is non-zero which it takes to mean that the code is `1` meaning that there were diffs. However, in this case the return code is `128` which *is* non-zero but does _not_ mean success - it means error. 
So the code assumes the diff is OK even though it is incomplete.\n\n### pre-commit --version\n\n2.17.0\n\n### .pre-commit-config.yaml\n\n```yaml\nrepos:\r\n- repo: https://github.com/pre-commit/pre-commit-hooks\r\n rev: 56b4a7e506901ff86f8de5c2551bc41f8eacf717\r\n hooks:\r\n - id: check-yaml\r\n# - id: end-of-file-fixer\r\n - id: trailing-whitespace\r\n- repo: https://github.com/psf/black\r\n rev: 21.11b0\r\n hooks:\r\n - id: black\r\n language_version: python3.6\r\n- repo: https://github.com/PyCQA/isort\r\n rev: 5.10.1\r\n hooks:\r\n - id: isort\r\n args: [\"--profile\", \"black\", \"--filter-files\"]\n```\n\n\n### ~/.cache/pre-commit/pre-commit.log (if present)\n\n_No response_\n", "before_files": [{"content": "from __future__ import annotations\n\nimport contextlib\nimport logging\nimport os.path\nimport time\nfrom typing import Generator\n\nfrom pre_commit import git\nfrom pre_commit.util import CalledProcessError\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import cmd_output_b\nfrom pre_commit.xargs import xargs\n\n\nlogger = logging.getLogger('pre_commit')\n\n# without forcing submodule.recurse=0, changes in nested submodules will be\n# discarded if `submodule.recurse=1` is configured\n# we choose this instead of `--no-recurse-submodules` because it works on\n# versions of git before that option was added to `git checkout`\n_CHECKOUT_CMD = ('git', '-c', 'submodule.recurse=0', 'checkout', '--', '.')\n\n\ndef _git_apply(patch: str) -> None:\n args = ('apply', '--whitespace=nowarn', patch)\n try:\n cmd_output_b('git', *args)\n except CalledProcessError:\n # Retry with autocrlf=false -- see #570\n cmd_output_b('git', '-c', 'core.autocrlf=false', *args)\n\n\[email protected]\ndef _intent_to_add_cleared() -> Generator[None, None, None]:\n intent_to_add = git.intent_to_add_files()\n if intent_to_add:\n logger.warning('Unstaged intent-to-add files detected.')\n\n xargs(('git', 'rm', '--cached', '--'), intent_to_add)\n try:\n yield\n finally:\n xargs(('git', 'add', '--intent-to-add', '--'), intent_to_add)\n else:\n yield\n\n\[email protected]\ndef _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]:\n tree = cmd_output('git', 'write-tree')[1].strip()\n retcode, diff_stdout_binary, _ = cmd_output_b(\n 'git', 'diff-index', '--ignore-submodules', '--binary',\n '--exit-code', '--no-color', '--no-ext-diff', tree, '--',\n check=False,\n )\n if retcode and diff_stdout_binary.strip():\n patch_filename = f'patch{int(time.time())}-{os.getpid()}'\n patch_filename = os.path.join(patch_dir, patch_filename)\n logger.warning('Unstaged files detected.')\n logger.info(f'Stashing unstaged files to {patch_filename}.')\n # Save the current unstaged changes as a patch\n os.makedirs(patch_dir, exist_ok=True)\n with open(patch_filename, 'wb') as patch_file:\n patch_file.write(diff_stdout_binary)\n\n # prevent recursive post-checkout hooks (#1418)\n no_checkout_env = dict(os.environ, _PRE_COMMIT_SKIP_POST_CHECKOUT='1')\n\n try:\n cmd_output_b(*_CHECKOUT_CMD, env=no_checkout_env)\n yield\n finally:\n # Try to apply the patch we saved\n try:\n _git_apply(patch_filename)\n except CalledProcessError:\n logger.warning(\n 'Stashed changes conflicted with hook auto-fixes... 
'\n 'Rolling back fixes...',\n )\n # We failed to apply the patch, presumably due to fixes made\n # by hooks.\n # Roll back the changes made by hooks.\n cmd_output_b(*_CHECKOUT_CMD, env=no_checkout_env)\n _git_apply(patch_filename)\n\n logger.info(f'Restored changes from {patch_filename}.')\n else:\n # There weren't any staged files so we don't need to do anything\n # special\n yield\n\n\[email protected]\ndef staged_files_only(patch_dir: str) -> Generator[None, None, None]:\n \"\"\"Clear any unstaged changes from the git working directory inside this\n context.\n \"\"\"\n with _intent_to_add_cleared(), _unstaged_changes_cleared(patch_dir):\n yield\n", "path": "pre_commit/staged_files_only.py"}], "after_files": [{"content": "from __future__ import annotations\n\nimport contextlib\nimport logging\nimport os.path\nimport time\nfrom typing import Generator\n\nfrom pre_commit import git\nfrom pre_commit.errors import FatalError\nfrom pre_commit.util import CalledProcessError\nfrom pre_commit.util import cmd_output\nfrom pre_commit.util import cmd_output_b\nfrom pre_commit.xargs import xargs\n\n\nlogger = logging.getLogger('pre_commit')\n\n# without forcing submodule.recurse=0, changes in nested submodules will be\n# discarded if `submodule.recurse=1` is configured\n# we choose this instead of `--no-recurse-submodules` because it works on\n# versions of git before that option was added to `git checkout`\n_CHECKOUT_CMD = ('git', '-c', 'submodule.recurse=0', 'checkout', '--', '.')\n\n\ndef _git_apply(patch: str) -> None:\n args = ('apply', '--whitespace=nowarn', patch)\n try:\n cmd_output_b('git', *args)\n except CalledProcessError:\n # Retry with autocrlf=false -- see #570\n cmd_output_b('git', '-c', 'core.autocrlf=false', *args)\n\n\[email protected]\ndef _intent_to_add_cleared() -> Generator[None, None, None]:\n intent_to_add = git.intent_to_add_files()\n if intent_to_add:\n logger.warning('Unstaged intent-to-add files detected.')\n\n xargs(('git', 'rm', '--cached', '--'), intent_to_add)\n try:\n yield\n finally:\n xargs(('git', 'add', '--intent-to-add', '--'), intent_to_add)\n else:\n yield\n\n\[email protected]\ndef _unstaged_changes_cleared(patch_dir: str) -> Generator[None, None, None]:\n tree = cmd_output('git', 'write-tree')[1].strip()\n diff_cmd = (\n 'git', 'diff-index', '--ignore-submodules', '--binary',\n '--exit-code', '--no-color', '--no-ext-diff', tree, '--',\n )\n retcode, diff_stdout, diff_stderr = cmd_output_b(*diff_cmd, check=False)\n if retcode == 0:\n # There weren't any staged files so we don't need to do anything\n # special\n yield\n elif retcode == 1 and diff_stdout.strip():\n patch_filename = f'patch{int(time.time())}-{os.getpid()}'\n patch_filename = os.path.join(patch_dir, patch_filename)\n logger.warning('Unstaged files detected.')\n logger.info(f'Stashing unstaged files to {patch_filename}.')\n # Save the current unstaged changes as a patch\n os.makedirs(patch_dir, exist_ok=True)\n with open(patch_filename, 'wb') as patch_file:\n patch_file.write(diff_stdout)\n\n # prevent recursive post-checkout hooks (#1418)\n no_checkout_env = dict(os.environ, _PRE_COMMIT_SKIP_POST_CHECKOUT='1')\n\n try:\n cmd_output_b(*_CHECKOUT_CMD, env=no_checkout_env)\n yield\n finally:\n # Try to apply the patch we saved\n try:\n _git_apply(patch_filename)\n except CalledProcessError:\n logger.warning(\n 'Stashed changes conflicted with hook auto-fixes... 
'\n 'Rolling back fixes...',\n )\n # We failed to apply the patch, presumably due to fixes made\n # by hooks.\n # Roll back the changes made by hooks.\n cmd_output_b(*_CHECKOUT_CMD, env=no_checkout_env)\n _git_apply(patch_filename)\n\n logger.info(f'Restored changes from {patch_filename}.')\n else: # pragma: win32 no cover\n # some error occurred while requesting the diff\n e = CalledProcessError(retcode, diff_cmd, b'', diff_stderr)\n raise FatalError(\n f'pre-commit failed to diff -- perhaps due to permissions?\\n\\n{e}',\n )\n\n\[email protected]\ndef staged_files_only(patch_dir: str) -> Generator[None, None, None]:\n \"\"\"Clear any unstaged changes from the git working directory inside this\n context.\n \"\"\"\n with _intent_to_add_cleared(), _unstaged_changes_cleared(patch_dir):\n yield\n", "path": "pre_commit/staged_files_only.py"}]} | 1,794 | 582 |
gh_patches_debug_14845 | rasdani/github-patches | git_diff | HypothesisWorks__hypothesis-1044 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
`escalation.belongs_to` cache is broken
The problem is that the cache is checked with the incoming string type, but [the string type changes](https://github.com/HypothesisWorks/hypothesis-python/blob/3.44.1/src/hypothesis/internal/escalation.py#L41) before the value is inserted:
A simple (but not elegant) fix:
```diff
--- hypothesis/internal/escalation.py
+++ hypothesis/internal/escalation.py
@@ -34,13 +34,14 @@ def belongs_to(package):
cache = {text_type: {}, binary_type: {}}
def accept(filepath):
+ ftype = type(filepath)
try:
- return cache[type(filepath)][filepath]
+ return cache[ftype][filepath]
except KeyError:
pass
filepath = encoded_filepath(filepath)
result = os.path.abspath(filepath).startswith(root)
- cache[type(filepath)][filepath] = result
+ cache[ftype][filepath] = result
return result
accept.__name__ = 'is_%s_file' % (package.__name__,)
return accept
```
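
For illustration, a minimal self-contained reproduction of the broken caching pattern, independent of hypothesis itself (`encode` is a stand-in for `encoded_filepath`, and the `result` computation is a placeholder). Because `filepath` is rebound to the encoded value before insertion, the cache keyed under the original type never fills, so those lookups always miss:

```python
# Minimal reproduction of the caching bug on Python 3.
cache = {str: {}, bytes: {}}

def encode(path):
    return path.encode('utf-8') if isinstance(path, str) else path

def accept(filepath):
    try:
        return cache[type(filepath)][filepath]  # looked up under str...
    except KeyError:
        pass
    filepath = encode(filepath)                 # ...but rebound to bytes here
    result = True                               # placeholder for the real check
    cache[type(filepath)][filepath] = result    # so only cache[bytes] is filled
    return result

accept('a.py')
assert 'a.py' not in cache[str]  # str lookups will miss forever
assert b'a.py' in cache[bytes]
```

Note that the fix suggested above caches only under the original type; the golden patch below additionally stores the result under the encoded key, so either form of the path hits the cache afterwards.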
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/hypothesis/internal/escalation.py`
Content:
```
1 # coding=utf-8
2 #
3 # This file is part of Hypothesis, which may be found at
4 # https://github.com/HypothesisWorks/hypothesis-python
5 #
6 # Most of this work is copyright (C) 2013-2017 David R. MacIver
7 # ([email protected]), but it contains contributions by others. See
8 # CONTRIBUTING.rst for a full list of people who may hold copyright, and
9 # consult the git log if you need to determine who owns an individual
10 # contribution.
11 #
12 # This Source Code Form is subject to the terms of the Mozilla Public License,
13 # v. 2.0. If a copy of the MPL was not distributed with this file, You can
14 # obtain one at http://mozilla.org/MPL/2.0/.
15 #
16 # END HEADER
17
18 from __future__ import division, print_function, absolute_import
19
20 import os
21 import sys
22
23 import coverage
24
25 import hypothesis
26 from hypothesis.errors import StopTest, DeadlineExceeded, \
27 HypothesisException, UnsatisfiedAssumption
28 from hypothesis.internal.compat import text_type, binary_type, \
29 encoded_filepath
30
31
32 def belongs_to(package):
33 root = os.path.dirname(package.__file__)
34 cache = {text_type: {}, binary_type: {}}
35
36 def accept(filepath):
37 try:
38 return cache[type(filepath)][filepath]
39 except KeyError:
40 pass
41 filepath = encoded_filepath(filepath)
42 result = os.path.abspath(filepath).startswith(root)
43 cache[type(filepath)][filepath] = result
44 return result
45 accept.__name__ = 'is_%s_file' % (package.__name__,)
46 return accept
47
48
49 PREVENT_ESCALATION = os.getenv('HYPOTHESIS_DO_NOT_ESCALATE') == 'true'
50
51 FILE_CACHE = {}
52
53
54 is_hypothesis_file = belongs_to(hypothesis)
55 is_coverage_file = belongs_to(coverage)
56
57 HYPOTHESIS_CONTROL_EXCEPTIONS = (
58 DeadlineExceeded, StopTest, UnsatisfiedAssumption
59 )
60
61
62 def mark_for_escalation(e):
63 if not isinstance(e, HYPOTHESIS_CONTROL_EXCEPTIONS):
64 e.hypothesis_internal_always_escalate = True
65
66
67 def escalate_hypothesis_internal_error():
68 if PREVENT_ESCALATION:
69 return
70 error_type, e, tb = sys.exc_info()
71 if getattr(e, 'hypothesis_internal_always_escalate', False):
72 raise
73 import traceback
74 filepath = traceback.extract_tb(tb)[-1][0]
75 if is_hypothesis_file(filepath) and not isinstance(
76 e, (HypothesisException,) + HYPOTHESIS_CONTROL_EXCEPTIONS,
77 ):
78 raise
79 # This is so that if we do something wrong and trigger an internal Coverage
80 # error we don't try to catch it. It should be impossible to trigger, but
81 # you never know.
82 if is_coverage_file(filepath): # pragma: no cover
83 raise
84
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/hypothesis/internal/escalation.py b/src/hypothesis/internal/escalation.py
--- a/src/hypothesis/internal/escalation.py
+++ b/src/hypothesis/internal/escalation.py
@@ -34,13 +34,15 @@
cache = {text_type: {}, binary_type: {}}
def accept(filepath):
+ ftype = type(filepath)
try:
- return cache[type(filepath)][filepath]
+ return cache[ftype][filepath]
except KeyError:
pass
- filepath = encoded_filepath(filepath)
- result = os.path.abspath(filepath).startswith(root)
- cache[type(filepath)][filepath] = result
+ new_filepath = encoded_filepath(filepath)
+ result = os.path.abspath(new_filepath).startswith(root)
+ cache[ftype][filepath] = result
+ cache[type(new_filepath)][new_filepath] = result
return result
accept.__name__ = 'is_%s_file' % (package.__name__,)
return accept
| {"golden_diff": "diff --git a/src/hypothesis/internal/escalation.py b/src/hypothesis/internal/escalation.py\n--- a/src/hypothesis/internal/escalation.py\n+++ b/src/hypothesis/internal/escalation.py\n@@ -34,13 +34,15 @@\n cache = {text_type: {}, binary_type: {}}\n \n def accept(filepath):\n+ ftype = type(filepath)\n try:\n- return cache[type(filepath)][filepath]\n+ return cache[ftype][filepath]\n except KeyError:\n pass\n- filepath = encoded_filepath(filepath)\n- result = os.path.abspath(filepath).startswith(root)\n- cache[type(filepath)][filepath] = result\n+ new_filepath = encoded_filepath(filepath)\n+ result = os.path.abspath(new_filepath).startswith(root)\n+ cache[ftype][filepath] = result\n+ cache[type(new_filepath)][new_filepath] = result\n return result\n accept.__name__ = 'is_%s_file' % (package.__name__,)\n return accept\n", "issue": "`escalation.belongs_to` cache is broken\nThe problem is that the cache is checked with the incoming string type, but [the string type changes](https://github.com/HypothesisWorks/hypothesis-python/blob/3.44.1/src/hypothesis/internal/escalation.py#L41) before the value is inserted:\r\n\r\nA simple (but not elegant) fix:\r\n\r\n```diff\r\n--- hypothesis/internal/escalation.py\r\n+++ hypothesis/internal/escalation.py\r\n@@ -34,13 +34,14 @@ def belongs_to(package):\r\n cache = {text_type: {}, binary_type: {}}\r\n \r\n def accept(filepath):\r\n+ ftype = type(filepath)\r\n try:\r\n- return cache[type(filepath)][filepath]\r\n+ return cache[ftype][filepath]\r\n except KeyError:\r\n pass\r\n filepath = encoded_filepath(filepath)\r\n result = os.path.abspath(filepath).startswith(root)\r\n- cache[type(filepath)][filepath] = result\r\n+ cache[ftype][filepath] = result\r\n return result\r\n accept.__name__ = 'is_%s_file' % (package.__name__,)\r\n return accept\r\n```\n", "before_files": [{"content": "# coding=utf-8\n#\n# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis-python\n#\n# Most of this work is copyright (C) 2013-2017 David R. MacIver\n# ([email protected]), but it contains contributions by others. See\n# CONTRIBUTING.rst for a full list of people who may hold copyright, and\n# consult the git log if you need to determine who owns an individual\n# contribution.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. 
If a copy of the MPL was not distributed with this file, You can\n# obtain one at http://mozilla.org/MPL/2.0/.\n#\n# END HEADER\n\nfrom __future__ import division, print_function, absolute_import\n\nimport os\nimport sys\n\nimport coverage\n\nimport hypothesis\nfrom hypothesis.errors import StopTest, DeadlineExceeded, \\\n HypothesisException, UnsatisfiedAssumption\nfrom hypothesis.internal.compat import text_type, binary_type, \\\n encoded_filepath\n\n\ndef belongs_to(package):\n root = os.path.dirname(package.__file__)\n cache = {text_type: {}, binary_type: {}}\n\n def accept(filepath):\n try:\n return cache[type(filepath)][filepath]\n except KeyError:\n pass\n filepath = encoded_filepath(filepath)\n result = os.path.abspath(filepath).startswith(root)\n cache[type(filepath)][filepath] = result\n return result\n accept.__name__ = 'is_%s_file' % (package.__name__,)\n return accept\n\n\nPREVENT_ESCALATION = os.getenv('HYPOTHESIS_DO_NOT_ESCALATE') == 'true'\n\nFILE_CACHE = {}\n\n\nis_hypothesis_file = belongs_to(hypothesis)\nis_coverage_file = belongs_to(coverage)\n\nHYPOTHESIS_CONTROL_EXCEPTIONS = (\n DeadlineExceeded, StopTest, UnsatisfiedAssumption\n)\n\n\ndef mark_for_escalation(e):\n if not isinstance(e, HYPOTHESIS_CONTROL_EXCEPTIONS):\n e.hypothesis_internal_always_escalate = True\n\n\ndef escalate_hypothesis_internal_error():\n if PREVENT_ESCALATION:\n return\n error_type, e, tb = sys.exc_info()\n if getattr(e, 'hypothesis_internal_always_escalate', False):\n raise\n import traceback\n filepath = traceback.extract_tb(tb)[-1][0]\n if is_hypothesis_file(filepath) and not isinstance(\n e, (HypothesisException,) + HYPOTHESIS_CONTROL_EXCEPTIONS,\n ):\n raise\n # This is so that if we do something wrong and trigger an internal Coverage\n # error we don't try to catch it. It should be impossible to trigger, but\n # you never know.\n if is_coverage_file(filepath): # pragma: no cover\n raise\n", "path": "src/hypothesis/internal/escalation.py"}], "after_files": [{"content": "# coding=utf-8\n#\n# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis-python\n#\n# Most of this work is copyright (C) 2013-2017 David R. MacIver\n# ([email protected]), but it contains contributions by others. See\n# CONTRIBUTING.rst for a full list of people who may hold copyright, and\n# consult the git log if you need to determine who owns an individual\n# contribution.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. 
If a copy of the MPL was not distributed with this file, You can\n# obtain one at http://mozilla.org/MPL/2.0/.\n#\n# END HEADER\n\nfrom __future__ import division, print_function, absolute_import\n\nimport os\nimport sys\n\nimport coverage\n\nimport hypothesis\nfrom hypothesis.errors import StopTest, DeadlineExceeded, \\\n HypothesisException, UnsatisfiedAssumption\nfrom hypothesis.internal.compat import text_type, binary_type, \\\n encoded_filepath\n\n\ndef belongs_to(package):\n root = os.path.dirname(package.__file__)\n cache = {text_type: {}, binary_type: {}}\n\n def accept(filepath):\n ftype = type(filepath)\n try:\n return cache[ftype][filepath]\n except KeyError:\n pass\n new_filepath = encoded_filepath(filepath)\n result = os.path.abspath(new_filepath).startswith(root)\n cache[ftype][filepath] = result\n cache[type(new_filepath)][new_filepath] = result\n return result\n accept.__name__ = 'is_%s_file' % (package.__name__,)\n return accept\n\n\nPREVENT_ESCALATION = os.getenv('HYPOTHESIS_DO_NOT_ESCALATE') == 'true'\n\nFILE_CACHE = {}\n\n\nis_hypothesis_file = belongs_to(hypothesis)\nis_coverage_file = belongs_to(coverage)\n\nHYPOTHESIS_CONTROL_EXCEPTIONS = (\n DeadlineExceeded, StopTest, UnsatisfiedAssumption\n)\n\n\ndef mark_for_escalation(e):\n if not isinstance(e, HYPOTHESIS_CONTROL_EXCEPTIONS):\n e.hypothesis_internal_always_escalate = True\n\n\ndef escalate_hypothesis_internal_error():\n if PREVENT_ESCALATION:\n return\n error_type, e, tb = sys.exc_info()\n if getattr(e, 'hypothesis_internal_always_escalate', False):\n raise\n import traceback\n filepath = traceback.extract_tb(tb)[-1][0]\n if is_hypothesis_file(filepath) and not isinstance(\n e, (HypothesisException,) + HYPOTHESIS_CONTROL_EXCEPTIONS,\n ):\n raise\n # This is so that if we do something wrong and trigger an internal Coverage\n # error we don't try to catch it. It should be impossible to trigger, but\n # you never know.\n if is_coverage_file(filepath): # pragma: no cover\n raise\n", "path": "src/hypothesis/internal/escalation.py"}]} | 1,303 | 221 |
gh_patches_debug_36419 | rasdani/github-patches | git_diff | alltheplaces__alltheplaces-8552 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
house_au: split off new brand for House Bed & Bath
house_au captures two brands:
* [House](https://www.wikidata.org/wiki/Q117921987)
* House Bed & Bath (https://www.wikidata.org/wiki/Q126176210)
Currently the spider doesn't differentiate the two brands. It should though.
Reference: https://globalretailbrands.net/
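
As a rough sketch of one way to split the output, brand attributes could be assigned by matching the store name prefix. `apply_brand` and the sample names here are hypothetical, and the House Bed & Bath QID (Q126176210) is taken from this issue and would need confirming:

```python
# Ordering matters because "House Bed & Bath ..." also starts with "House";
# dicts keep insertion order on Python 3.7+, so list the longer prefix first.
BRANDS = {
    'House Bed & Bath': {'brand': 'House Bed & Bath', 'brand_wikidata': 'Q126176210'},
    'House': {'brand': 'House', 'brand_wikidata': 'Q117921987'},
}

def apply_brand(item: dict) -> dict:
    for prefix, attrs in BRANDS.items():
        if item['name'].startswith(prefix + ' '):
            item.update(attrs)
            item['branch'] = item['name'][len(prefix) + 1:]
            break
    return item

print(apply_brand({'name': 'House Bed & Bath Penrith'}))  # -> House Bed & Bath / Penrith
print(apply_brand({'name': 'House Chatswood'}))           # -> House / Chatswood
```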
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `locations/spiders/house_au.py`
Content:
```
1 import reverse_geocoder
2 from scrapy import Request, Spider
3
4 from locations.categories import Categories
5 from locations.dict_parser import DictParser
6 from locations.hours import OpeningHours
7 from locations.pipelines.address_clean_up import clean_address
8
9
10 class HouseAUSpider(Spider):
11 name = "house_au"
12 item_attributes = {
13 "brand": "House",
14 "brand_wikidata": "Q117921987",
15 "extras": Categories.SHOP_HOUSEWARE.value,
16 }
17 allowed_domains = ["www.house.com.au"]
18 start_urls = ["https://www.house.com.au/api/get-stores"]
19
20 def start_requests(self):
21 for url in self.start_urls:
22 yield Request(url=url, method="POST")
23
24 def parse(self, response):
25 for location in response.json():
26 item = DictParser.parse(location)
27
28 # Some stores have wildly incorrect coordinates for
29 # locations as far away as France. Only add geometry
30 # where coordinates existing within Australia.
31 if result := reverse_geocoder.get((location["latitude"], location["longitude"]), mode=1, verbose=False):
32 if result["cc"] == "AU":
33 item["geometry"] = location["location"]
34
35 item["street_address"] = clean_address([location["address1"], location["address2"]])
36 item["website"] = "https://www.house.com.au/stores/" + location["slug"]
37 item["opening_hours"] = OpeningHours()
38 for day_name, hours in location["storeHours"].items():
39 if hours["open"] == "-" or hours["close"] == "-" or hours["close"] == "17:3016:00":
40 continue
41 item["opening_hours"].add_range(
42 day_name.title(),
43 hours["open"].replace(".", ":"),
44 hours["close"].replace(".", ":").replace(":-", ":"),
45 )
46 yield item
47
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/locations/spiders/house_au.py b/locations/spiders/house_au.py
--- a/locations/spiders/house_au.py
+++ b/locations/spiders/house_au.py
@@ -9,13 +9,16 @@
class HouseAUSpider(Spider):
name = "house_au"
- item_attributes = {
- "brand": "House",
- "brand_wikidata": "Q117921987",
- "extras": Categories.SHOP_HOUSEWARE.value,
- }
allowed_domains = ["www.house.com.au"]
start_urls = ["https://www.house.com.au/api/get-stores"]
+ brands = {
+ "House Bed & Bath": {
+ "brand": "House Bed & Bath",
+ "brand_wikidata": "",
+ "extras": Categories.SHOP_HOUSEHOLD_LINEN.value,
+ },
+ "House": {"brand": "House", "brand_wikidata": "Q117921987", "extras": Categories.SHOP_HOUSEWARE.value},
+ }
def start_requests(self):
for url in self.start_urls:
@@ -25,6 +28,12 @@
for location in response.json():
item = DictParser.parse(location)
+ for brand_name in self.brands.keys():
+ if item["name"].startswith(f"{brand_name} "):
+ item.update(self.brands[brand_name])
+ item["branch"] = item["name"].replace(f"{brand_name} ", "")
+ break
+
# Some stores have wildly incorrect coordinates for
# locations as far away as France. Only add geometry
# where coordinates existing within Australia.
@@ -34,6 +43,7 @@
item["street_address"] = clean_address([location["address1"], location["address2"]])
item["website"] = "https://www.house.com.au/stores/" + location["slug"]
+
item["opening_hours"] = OpeningHours()
for day_name, hours in location["storeHours"].items():
if hours["open"] == "-" or hours["close"] == "-" or hours["close"] == "17:3016:00":
@@ -43,4 +53,5 @@
hours["open"].replace(".", ":"),
hours["close"].replace(".", ":").replace(":-", ":"),
)
+
yield item
| {"golden_diff": "diff --git a/locations/spiders/house_au.py b/locations/spiders/house_au.py\n--- a/locations/spiders/house_au.py\n+++ b/locations/spiders/house_au.py\n@@ -9,13 +9,16 @@\n \n class HouseAUSpider(Spider):\n name = \"house_au\"\n- item_attributes = {\n- \"brand\": \"House\",\n- \"brand_wikidata\": \"Q117921987\",\n- \"extras\": Categories.SHOP_HOUSEWARE.value,\n- }\n allowed_domains = [\"www.house.com.au\"]\n start_urls = [\"https://www.house.com.au/api/get-stores\"]\n+ brands = {\n+ \"House Bed & Bath\": {\n+ \"brand\": \"House Bed & Bath\",\n+ \"brand_wikidata\": \"\",\n+ \"extras\": Categories.SHOP_HOUSEHOLD_LINEN.value,\n+ },\n+ \"House\": {\"brand\": \"House\", \"brand_wikidata\": \"Q117921987\", \"extras\": Categories.SHOP_HOUSEWARE.value},\n+ }\n \n def start_requests(self):\n for url in self.start_urls:\n@@ -25,6 +28,12 @@\n for location in response.json():\n item = DictParser.parse(location)\n \n+ for brand_name in self.brands.keys():\n+ if item[\"name\"].startswith(f\"{brand_name} \"):\n+ item.update(self.brands[brand_name])\n+ item[\"branch\"] = item[\"name\"].replace(f\"{brand_name} \", \"\")\n+ break\n+\n # Some stores have wildly incorrect coordinates for\n # locations as far away as France. Only add geometry\n # where coordinates existing within Australia.\n@@ -34,6 +43,7 @@\n \n item[\"street_address\"] = clean_address([location[\"address1\"], location[\"address2\"]])\n item[\"website\"] = \"https://www.house.com.au/stores/\" + location[\"slug\"]\n+\n item[\"opening_hours\"] = OpeningHours()\n for day_name, hours in location[\"storeHours\"].items():\n if hours[\"open\"] == \"-\" or hours[\"close\"] == \"-\" or hours[\"close\"] == \"17:3016:00\":\n@@ -43,4 +53,5 @@\n hours[\"open\"].replace(\".\", \":\"),\n hours[\"close\"].replace(\".\", \":\").replace(\":-\", \":\"),\n )\n+\n yield item\n", "issue": "house_au: split off new brand for House Bed & Bath\nhouse_au captures two brands:\r\n* [House](https://www.wikidata.org/wiki/Q117921987)\r\n* House Bed & Bath (https://www.wikidata.org/wiki/Q126176210)\r\n\r\nCurrently the spider doesn't differentiate the two brands. It should though.\r\n\r\nReference: https://globalretailbrands.net/\n", "before_files": [{"content": "import reverse_geocoder\nfrom scrapy import Request, Spider\n\nfrom locations.categories import Categories\nfrom locations.dict_parser import DictParser\nfrom locations.hours import OpeningHours\nfrom locations.pipelines.address_clean_up import clean_address\n\n\nclass HouseAUSpider(Spider):\n name = \"house_au\"\n item_attributes = {\n \"brand\": \"House\",\n \"brand_wikidata\": \"Q117921987\",\n \"extras\": Categories.SHOP_HOUSEWARE.value,\n }\n allowed_domains = [\"www.house.com.au\"]\n start_urls = [\"https://www.house.com.au/api/get-stores\"]\n\n def start_requests(self):\n for url in self.start_urls:\n yield Request(url=url, method=\"POST\")\n\n def parse(self, response):\n for location in response.json():\n item = DictParser.parse(location)\n\n # Some stores have wildly incorrect coordinates for\n # locations as far away as France. 
Only add geometry\n # where coordinates existing within Australia.\n if result := reverse_geocoder.get((location[\"latitude\"], location[\"longitude\"]), mode=1, verbose=False):\n if result[\"cc\"] == \"AU\":\n item[\"geometry\"] = location[\"location\"]\n\n item[\"street_address\"] = clean_address([location[\"address1\"], location[\"address2\"]])\n item[\"website\"] = \"https://www.house.com.au/stores/\" + location[\"slug\"]\n item[\"opening_hours\"] = OpeningHours()\n for day_name, hours in location[\"storeHours\"].items():\n if hours[\"open\"] == \"-\" or hours[\"close\"] == \"-\" or hours[\"close\"] == \"17:3016:00\":\n continue\n item[\"opening_hours\"].add_range(\n day_name.title(),\n hours[\"open\"].replace(\".\", \":\"),\n hours[\"close\"].replace(\".\", \":\").replace(\":-\", \":\"),\n )\n yield item\n", "path": "locations/spiders/house_au.py"}], "after_files": [{"content": "import reverse_geocoder\nfrom scrapy import Request, Spider\n\nfrom locations.categories import Categories\nfrom locations.dict_parser import DictParser\nfrom locations.hours import OpeningHours\nfrom locations.pipelines.address_clean_up import clean_address\n\n\nclass HouseAUSpider(Spider):\n name = \"house_au\"\n allowed_domains = [\"www.house.com.au\"]\n start_urls = [\"https://www.house.com.au/api/get-stores\"]\n brands = {\n \"House Bed & Bath\": {\n \"brand\": \"House Bed & Bath\",\n \"brand_wikidata\": \"\",\n \"extras\": Categories.SHOP_HOUSEHOLD_LINEN.value,\n },\n \"House\": {\"brand\": \"House\", \"brand_wikidata\": \"Q117921987\", \"extras\": Categories.SHOP_HOUSEWARE.value},\n }\n\n def start_requests(self):\n for url in self.start_urls:\n yield Request(url=url, method=\"POST\")\n\n def parse(self, response):\n for location in response.json():\n item = DictParser.parse(location)\n\n for brand_name in self.brands.keys():\n if item[\"name\"].startswith(f\"{brand_name} \"):\n item.update(self.brands[brand_name])\n item[\"branch\"] = item[\"name\"].replace(f\"{brand_name} \", \"\")\n break\n\n # Some stores have wildly incorrect coordinates for\n # locations as far away as France. Only add geometry\n # where coordinates existing within Australia.\n if result := reverse_geocoder.get((location[\"latitude\"], location[\"longitude\"]), mode=1, verbose=False):\n if result[\"cc\"] == \"AU\":\n item[\"geometry\"] = location[\"location\"]\n\n item[\"street_address\"] = clean_address([location[\"address1\"], location[\"address2\"]])\n item[\"website\"] = \"https://www.house.com.au/stores/\" + location[\"slug\"]\n\n item[\"opening_hours\"] = OpeningHours()\n for day_name, hours in location[\"storeHours\"].items():\n if hours[\"open\"] == \"-\" or hours[\"close\"] == \"-\" or hours[\"close\"] == \"17:3016:00\":\n continue\n item[\"opening_hours\"].add_range(\n day_name.title(),\n hours[\"open\"].replace(\".\", \":\"),\n hours[\"close\"].replace(\".\", \":\").replace(\":-\", \":\"),\n )\n\n yield item\n", "path": "locations/spiders/house_au.py"}]} | 845 | 544 |
gh_patches_debug_7899 | rasdani/github-patches | git_diff | cloudtools__troposphere-1692 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Implement AWS::CodeStarConnections changes from the May 14, 2020 update
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `troposphere/codestarconnections.py`
Content:
```
1 # Copyright (c) 2012-2020, Mark Peek <[email protected]>
2 # All rights reserved.
3 #
4 # See LICENSE file for full license.
5
6
7 from . import AWSObject
8
9
10 VALID_CONNECTION_PROVIDERTYPE = ('Bitbucket')
11
12
13 def validate_connection_providertype(connection_providertype):
14 """Validate ProviderType for Connection"""
15
16 if connection_providertype not in VALID_CONNECTION_PROVIDERTYPE:
17 raise ValueError("Connection ProviderType must be one of: %s" %
18 ", ".join(VALID_CONNECTION_PROVIDERTYPE))
19 return connection_providertype
20
21
22 class Connection(AWSObject):
23 resource_type = "AWS::CodeStarConnections::Connection"
24
25 props = {
26 'ConnectionName': (basestring, True),
27 'ProviderType': (validate_connection_providertype, True),
28 }
29
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/troposphere/codestarconnections.py b/troposphere/codestarconnections.py
--- a/troposphere/codestarconnections.py
+++ b/troposphere/codestarconnections.py
@@ -4,7 +4,7 @@
# See LICENSE file for full license.
-from . import AWSObject
+from . import AWSObject, Tags
VALID_CONNECTION_PROVIDERTYPE = ('Bitbucket')
@@ -25,4 +25,5 @@
props = {
'ConnectionName': (basestring, True),
'ProviderType': (validate_connection_providertype, True),
+ 'Tags': (Tags, False),
}
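For context, a minimal usage sketch once `Tags` support lands — the template, logical ID, and tag keys below are illustrative and not taken from the troposphere test suite:
```python
from troposphere import Tags, Template
from troposphere.codestarconnections import Connection

template = Template()
template.add_resource(
    Connection(
        "DemoConnection",                  # illustrative logical ID
        ConnectionName="demo-connection",
        ProviderType="Bitbucket",          # the only value the validator accepts
        Tags=Tags(Environment="dev"),      # property added by the patch above
    )
)
print(template.to_json())
```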
| {"golden_diff": "diff --git a/troposphere/codestarconnections.py b/troposphere/codestarconnections.py\n--- a/troposphere/codestarconnections.py\n+++ b/troposphere/codestarconnections.py\n@@ -4,7 +4,7 @@\n # See LICENSE file for full license.\n \n \n-from . import AWSObject\n+from . import AWSObject, Tags\n \n \n VALID_CONNECTION_PROVIDERTYPE = ('Bitbucket')\n@@ -25,4 +25,5 @@\n props = {\n 'ConnectionName': (basestring, True),\n 'ProviderType': (validate_connection_providertype, True),\n+ 'Tags': (Tags, False),\n }\n", "issue": "implement AWS::CodeStarConnections changes from May 14, 2020 update\n\n", "before_files": [{"content": "# Copyright (c) 2012-2020, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\n\nfrom . import AWSObject\n\n\nVALID_CONNECTION_PROVIDERTYPE = ('Bitbucket')\n\n\ndef validate_connection_providertype(connection_providertype):\n \"\"\"Validate ProviderType for Connection\"\"\"\n\n if connection_providertype not in VALID_CONNECTION_PROVIDERTYPE:\n raise ValueError(\"Connection ProviderType must be one of: %s\" %\n \", \".join(VALID_CONNECTION_PROVIDERTYPE))\n return connection_providertype\n\n\nclass Connection(AWSObject):\n resource_type = \"AWS::CodeStarConnections::Connection\"\n\n props = {\n 'ConnectionName': (basestring, True),\n 'ProviderType': (validate_connection_providertype, True),\n }\n", "path": "troposphere/codestarconnections.py"}], "after_files": [{"content": "# Copyright (c) 2012-2020, Mark Peek <[email protected]>\n# All rights reserved.\n#\n# See LICENSE file for full license.\n\n\nfrom . import AWSObject, Tags\n\n\nVALID_CONNECTION_PROVIDERTYPE = ('Bitbucket')\n\n\ndef validate_connection_providertype(connection_providertype):\n \"\"\"Validate ProviderType for Connection\"\"\"\n\n if connection_providertype not in VALID_CONNECTION_PROVIDERTYPE:\n raise ValueError(\"Connection ProviderType must be one of: %s\" %\n \", \".join(VALID_CONNECTION_PROVIDERTYPE))\n return connection_providertype\n\n\nclass Connection(AWSObject):\n resource_type = \"AWS::CodeStarConnections::Connection\"\n\n props = {\n 'ConnectionName': (basestring, True),\n 'ProviderType': (validate_connection_providertype, True),\n 'Tags': (Tags, False),\n }\n", "path": "troposphere/codestarconnections.py"}]} | 512 | 142 |
gh_patches_debug_2344 | rasdani/github-patches | git_diff | ethereum__web3.py-3196 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Remove lru-dict dependency
lru-dict ships no pre-compiled wheel for Python 3.11, so installing it forces a local build.

It is used in only one place, where it should be replaceable with the built-in functools lru cache: https://github.com/ethereum/web3.py/blob/master/web3/middleware/cache.py#L196
Removing this dependency would avoid future compatibility problems as well.
--- END ISSUE ---
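For reference, a minimal sketch of the stdlib substitution the issue proposes; the function name and cache size are illustrative, not web3.py's actual middleware code:
```python
from functools import lru_cache

@lru_cache(maxsize=256)  # pure-stdlib LRU cache; no compiled wheel required
def cached_rpc_result(method: str, params: tuple) -> str:
    # Stand-in for an expensive RPC round trip.
    return f"{method}:{params!r}"

cached_rpc_result("eth_chainId", ())
cached_rpc_result("eth_chainId", ())   # served from the cache
print(cached_rpc_result.cache_info())  # CacheInfo(hits=1, misses=1, ...)
```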
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 #!/usr/bin/env python
2 from setuptools import (
3 find_packages,
4 setup,
5 )
6
7 extras_require = {
8 "tester": [
9 "eth-tester[py-evm]==v0.9.1-b.1",
10 "py-geth>=3.11.0",
11 ],
12 "linter": [
13 "black>=22.1.0",
14 "flake8==3.8.3",
15 "isort>=5.11.0",
16 "mypy==1.4.1",
17 "types-setuptools>=57.4.4",
18 "types-requests>=2.26.1",
19 "types-protobuf==3.19.13",
20 ],
21 "docs": [
22 "sphinx>=5.3.0",
23 "sphinx_rtd_theme>=1.0.0",
24 "towncrier>=21,<22",
25 ],
26 "dev": [
27 "bumpversion",
28 "flaky>=3.7.0",
29 "hypothesis>=3.31.2",
30 "importlib-metadata<5.0;python_version<'3.8'",
31 "pytest>=7.0.0",
32 "pytest-asyncio>=0.18.1,<0.23",
33 "pytest-mock>=1.10",
34 "pytest-watch>=4.2",
35 "pytest-xdist>=1.29",
36 "setuptools>=38.6.0",
37 "tox>=3.18.0",
38 "tqdm>4.32",
39 "twine>=1.13",
40 "when-changed>=0.3.0",
41 "build>=0.9.0",
42 ],
43 "ipfs": [
44 "ipfshttpclient==0.8.0a2",
45 ],
46 }
47
48 extras_require["dev"] = (
49 extras_require["tester"]
50 + extras_require["linter"]
51 + extras_require["docs"]
52 + extras_require["ipfs"]
53 + extras_require["dev"]
54 )
55
56 with open("./README.md") as readme:
57 long_description = readme.read()
58
59 setup(
60 name="web3",
61 # *IMPORTANT*: Don't manually change the version here. Use the 'bumpversion' utility.
62 version="6.14.0",
63 description="""web3.py""",
64 long_description_content_type="text/markdown",
65 long_description=long_description,
66 author="The Ethereum Foundation",
67 author_email="[email protected]",
68 url="https://github.com/ethereum/web3.py",
69 include_package_data=True,
70 install_requires=[
71 "aiohttp>=3.7.4.post0",
72 "eth-abi>=4.0.0",
73 "eth-account>=0.8.0",
74 "eth-hash[pycryptodome]>=0.5.1",
75 "eth-typing>=3.0.0",
76 "eth-utils>=2.1.0",
77 "hexbytes>=0.1.0,<0.4.0",
78 "jsonschema>=4.0.0",
79 "lru-dict>=1.1.6,<1.3.0",
80 "protobuf>=4.21.6",
81 "pydantic>=2.4.0",
82 "pywin32>=223;platform_system=='Windows'",
83 "requests>=2.16.0",
84 "typing-extensions>=4.0.1",
85 "websockets>=10.0.0",
86 "pyunormalize>=15.0.0",
87 ],
88 python_requires=">=3.7.2",
89 extras_require=extras_require,
90 py_modules=["web3", "ens", "ethpm"],
91 entry_points={"pytest11": ["pytest_ethereum = web3.tools.pytest_ethereum.plugins"]},
92 license="MIT",
93 zip_safe=False,
94 keywords="ethereum",
95 packages=find_packages(exclude=["tests", "tests.*"]),
96 package_data={"web3": ["py.typed"]},
97 classifiers=[
98 "Development Status :: 5 - Production/Stable",
99 "Intended Audience :: Developers",
100 "License :: OSI Approved :: MIT License",
101 "Natural Language :: English",
102 "Programming Language :: Python :: 3",
103 "Programming Language :: Python :: 3.7",
104 "Programming Language :: Python :: 3.8",
105 "Programming Language :: Python :: 3.9",
106 "Programming Language :: Python :: 3.10",
107 "Programming Language :: Python :: 3.11",
108 ],
109 )
110
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -76,7 +76,6 @@
"eth-utils>=2.1.0",
"hexbytes>=0.1.0,<0.4.0",
"jsonschema>=4.0.0",
- "lru-dict>=1.1.6,<1.3.0",
"protobuf>=4.21.6",
"pydantic>=2.4.0",
"pywin32>=223;platform_system=='Windows'",
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -76,7 +76,6 @@\n \"eth-utils>=2.1.0\",\n \"hexbytes>=0.1.0,<0.4.0\",\n \"jsonschema>=4.0.0\",\n- \"lru-dict>=1.1.6,<1.3.0\",\n \"protobuf>=4.21.6\",\n \"pydantic>=2.4.0\",\n \"pywin32>=223;platform_system=='Windows'\",\n", "issue": "Remove lru-dict dependency\nlru-dict requires a wheel that is not pre-compiled for Python 3.11.\r\n\r\nIt is only used in 1 place where it should be able to be replaced with the built-in functools lru cache: https://github.com/ethereum/web3.py/blob/master/web3/middleware/cache.py#L196\r\n\r\nRemoving this dependency would avoid future compatibility problems as well.\n", "before_files": [{"content": "#!/usr/bin/env python\nfrom setuptools import (\n find_packages,\n setup,\n)\n\nextras_require = {\n \"tester\": [\n \"eth-tester[py-evm]==v0.9.1-b.1\",\n \"py-geth>=3.11.0\",\n ],\n \"linter\": [\n \"black>=22.1.0\",\n \"flake8==3.8.3\",\n \"isort>=5.11.0\",\n \"mypy==1.4.1\",\n \"types-setuptools>=57.4.4\",\n \"types-requests>=2.26.1\",\n \"types-protobuf==3.19.13\",\n ],\n \"docs\": [\n \"sphinx>=5.3.0\",\n \"sphinx_rtd_theme>=1.0.0\",\n \"towncrier>=21,<22\",\n ],\n \"dev\": [\n \"bumpversion\",\n \"flaky>=3.7.0\",\n \"hypothesis>=3.31.2\",\n \"importlib-metadata<5.0;python_version<'3.8'\",\n \"pytest>=7.0.0\",\n \"pytest-asyncio>=0.18.1,<0.23\",\n \"pytest-mock>=1.10\",\n \"pytest-watch>=4.2\",\n \"pytest-xdist>=1.29\",\n \"setuptools>=38.6.0\",\n \"tox>=3.18.0\",\n \"tqdm>4.32\",\n \"twine>=1.13\",\n \"when-changed>=0.3.0\",\n \"build>=0.9.0\",\n ],\n \"ipfs\": [\n \"ipfshttpclient==0.8.0a2\",\n ],\n}\n\nextras_require[\"dev\"] = (\n extras_require[\"tester\"]\n + extras_require[\"linter\"]\n + extras_require[\"docs\"]\n + extras_require[\"ipfs\"]\n + extras_require[\"dev\"]\n)\n\nwith open(\"./README.md\") as readme:\n long_description = readme.read()\n\nsetup(\n name=\"web3\",\n # *IMPORTANT*: Don't manually change the version here. 
Use the 'bumpversion' utility.\n version=\"6.14.0\",\n description=\"\"\"web3.py\"\"\",\n long_description_content_type=\"text/markdown\",\n long_description=long_description,\n author=\"The Ethereum Foundation\",\n author_email=\"[email protected]\",\n url=\"https://github.com/ethereum/web3.py\",\n include_package_data=True,\n install_requires=[\n \"aiohttp>=3.7.4.post0\",\n \"eth-abi>=4.0.0\",\n \"eth-account>=0.8.0\",\n \"eth-hash[pycryptodome]>=0.5.1\",\n \"eth-typing>=3.0.0\",\n \"eth-utils>=2.1.0\",\n \"hexbytes>=0.1.0,<0.4.0\",\n \"jsonschema>=4.0.0\",\n \"lru-dict>=1.1.6,<1.3.0\",\n \"protobuf>=4.21.6\",\n \"pydantic>=2.4.0\",\n \"pywin32>=223;platform_system=='Windows'\",\n \"requests>=2.16.0\",\n \"typing-extensions>=4.0.1\",\n \"websockets>=10.0.0\",\n \"pyunormalize>=15.0.0\",\n ],\n python_requires=\">=3.7.2\",\n extras_require=extras_require,\n py_modules=[\"web3\", \"ens\", \"ethpm\"],\n entry_points={\"pytest11\": [\"pytest_ethereum = web3.tools.pytest_ethereum.plugins\"]},\n license=\"MIT\",\n zip_safe=False,\n keywords=\"ethereum\",\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n package_data={\"web3\": [\"py.typed\"]},\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: MIT License\",\n \"Natural Language :: English\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n ],\n)\n", "path": "setup.py"}], "after_files": [{"content": "#!/usr/bin/env python\nfrom setuptools import (\n find_packages,\n setup,\n)\n\nextras_require = {\n \"tester\": [\n \"eth-tester[py-evm]==v0.9.1-b.1\",\n \"py-geth>=3.11.0\",\n ],\n \"linter\": [\n \"black>=22.1.0\",\n \"flake8==3.8.3\",\n \"isort>=5.11.0\",\n \"mypy==1.4.1\",\n \"types-setuptools>=57.4.4\",\n \"types-requests>=2.26.1\",\n \"types-protobuf==3.19.13\",\n ],\n \"docs\": [\n \"sphinx>=5.3.0\",\n \"sphinx_rtd_theme>=1.0.0\",\n \"towncrier>=21,<22\",\n ],\n \"dev\": [\n \"bumpversion\",\n \"flaky>=3.7.0\",\n \"hypothesis>=3.31.2\",\n \"importlib-metadata<5.0;python_version<'3.8'\",\n \"pytest>=7.0.0\",\n \"pytest-asyncio>=0.18.1,<0.23\",\n \"pytest-mock>=1.10\",\n \"pytest-watch>=4.2\",\n \"pytest-xdist>=1.29\",\n \"setuptools>=38.6.0\",\n \"tox>=3.18.0\",\n \"tqdm>4.32\",\n \"twine>=1.13\",\n \"when-changed>=0.3.0\",\n \"build>=0.9.0\",\n ],\n \"ipfs\": [\n \"ipfshttpclient==0.8.0a2\",\n ],\n}\n\nextras_require[\"dev\"] = (\n extras_require[\"tester\"]\n + extras_require[\"linter\"]\n + extras_require[\"docs\"]\n + extras_require[\"ipfs\"]\n + extras_require[\"dev\"]\n)\n\nwith open(\"./README.md\") as readme:\n long_description = readme.read()\n\nsetup(\n name=\"web3\",\n # *IMPORTANT*: Don't manually change the version here. 
Use the 'bumpversion' utility.\n version=\"6.14.0\",\n description=\"\"\"web3.py\"\"\",\n long_description_content_type=\"text/markdown\",\n long_description=long_description,\n author=\"The Ethereum Foundation\",\n author_email=\"[email protected]\",\n url=\"https://github.com/ethereum/web3.py\",\n include_package_data=True,\n install_requires=[\n \"aiohttp>=3.7.4.post0\",\n \"eth-abi>=4.0.0\",\n \"eth-account>=0.8.0\",\n \"eth-hash[pycryptodome]>=0.5.1\",\n \"eth-typing>=3.0.0\",\n \"eth-utils>=2.1.0\",\n \"hexbytes>=0.1.0,<0.4.0\",\n \"jsonschema>=4.0.0\",\n \"protobuf>=4.21.6\",\n \"pydantic>=2.4.0\",\n \"pywin32>=223;platform_system=='Windows'\",\n \"requests>=2.16.0\",\n \"typing-extensions>=4.0.1\",\n \"websockets>=10.0.0\",\n \"pyunormalize>=15.0.0\",\n ],\n python_requires=\">=3.7.2\",\n extras_require=extras_require,\n py_modules=[\"web3\", \"ens\", \"ethpm\"],\n entry_points={\"pytest11\": [\"pytest_ethereum = web3.tools.pytest_ethereum.plugins\"]},\n license=\"MIT\",\n zip_safe=False,\n keywords=\"ethereum\",\n packages=find_packages(exclude=[\"tests\", \"tests.*\"]),\n package_data={\"web3\": [\"py.typed\"]},\n classifiers=[\n \"Development Status :: 5 - Production/Stable\",\n \"Intended Audience :: Developers\",\n \"License :: OSI Approved :: MIT License\",\n \"Natural Language :: English\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.7\",\n \"Programming Language :: Python :: 3.8\",\n \"Programming Language :: Python :: 3.9\",\n \"Programming Language :: Python :: 3.10\",\n \"Programming Language :: Python :: 3.11\",\n ],\n)\n", "path": "setup.py"}]} | 1,564 | 130 |
gh_patches_debug_527 | rasdani/github-patches | git_diff | mlflow__mlflow-351 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
UUID dependency breaks python 3 under AWS linux
### System information
- **Have I written custom code (as opposed to using a stock example script provided in MLflow)**: No
- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Amazon linux deep learning AMI 12.0 (like CentOS)
- **MLflow installed from (source or binary)**: source (PyPI)
- **MLflow version (run ``mlflow --version``)**: mlflow, version 0.5.0
- **Python version**: Python 3.6.6
- **npm version (if running the dev UI)**: N/A
- **Exact command to reproduce**: python -c "import mlflow"
### Describe the problem
```pip install mlflow``` also installs uuid==1.30, which breaks under Python 3.

A "uuid" module is already part of the Python standard library. On the AWS instance, the installed package shadows the stdlib module and contains syntax that is only valid in Python 2.
On the machine I connect to the instance from, the same script produces no errors, but ```uuid.__file__``` points to the standard-library version rather than the packaged 1.30.
### Source code / logs
Full reproduction from a newly created instance:
```
source activate tensorflow_p36
virtualenv env --system-site-packages --python=$(which python) env
source env/bin/activate
pip install mlflow
python -c "import mlflow"
```
```
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/__init__.py", line 33, in <module>
import mlflow.projects as projects # noqa
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/projects/__init__.py", line 17, in <module>
import mlflow.tracking as tracking
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/tracking/__init__.py", line 7, in <module>
from mlflow.tracking.service import MLflowService, get_service
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/tracking/service.py", line 13, in <module>
from mlflow.tracking.utils import _get_store
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/tracking/utils.py", line 8, in <module>
from mlflow.store.file_store import FileStore
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/store/file_store.py", line 3, in <module>
import uuid
File "/home/ec2-user/scratch/env/lib/python3.6/site-packages/uuid.py", line 138
if not 0 <= time_low < 1<<32L:
^
SyntaxError: invalid syntax
```
--- END ISSUE ---
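A quick way to confirm which `uuid` module a given interpreter resolves — `find_spec` locates the file without executing it, so this works even when importing the broken 1.30 copy would raise `SyntaxError` (paths below are illustrative):
```python
import importlib.util

spec = importlib.util.find_spec("uuid")
print(spec.origin)
# Healthy: <prefix>/lib/python3.6/uuid.py   (standard library)
# Broken:  <env>/site-packages/uuid.py      (the shadowing uuid==1.30 package)
```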
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `setup.py`
Content:
```
1 import imp
2 import os
3 from setuptools import setup, find_packages
4
5 version = imp.load_source(
6 'mlflow.version', os.path.join('mlflow', 'version.py')).VERSION
7
8
9 # Get a list of all files in the JS directory to include in our module
10 def package_files(directory):
11 paths = []
12 for (path, directories, filenames) in os.walk(directory):
13 for filename in filenames:
14 paths.append(os.path.join('..', path, filename))
15 return paths
16
17
18 # Prints out a set of paths (relative to the mlflow/ directory) of files in mlflow/server/js/build
19 # to include in the wheel, e.g. "../mlflow/server/js/build/index.html"
20 js_files = package_files('mlflow/server/js/build')
21 sagmaker_server_files = package_files("mlflow/sagemaker/container")
22
23 setup(
24 name='mlflow',
25 version=version,
26 packages=find_packages(exclude=['tests', 'tests.*']),
27 package_data={"mlflow": js_files + sagmaker_server_files},
28 install_requires=[
29 'awscli',
30 'click>=6.7',
31 'databricks-cli>=0.8.0',
32 'requests>=2.17.3',
33 'six>=1.10.0',
34 'uuid',
35 'gunicorn',
36 'Flask',
37 'numpy',
38 'pandas',
39 'scipy',
40 'scikit-learn',
41 'python-dateutil',
42 'protobuf>=3.6.0',
43 'gitpython>=2.1.0',
44 'pyyaml',
45 'boto3',
46 'querystring_parser',
47 'simplejson',
48 ],
49 entry_points='''
50 [console_scripts]
51 mlflow=mlflow.cli:cli
52 ''',
53 zip_safe=False,
54 author='Databricks',
55 description='MLflow: An ML Workflow Tool',
56 long_description=open('README.rst').read(),
57 license='Apache License 2.0',
58 classifiers=[
59 'Intended Audience :: Developers',
60 'Programming Language :: Python :: 2.7',
61 'Programming Language :: Python :: 3.6',
62 ],
63 keywords='ml ai databricks',
64 url='https://mlflow.org/'
65 )
66
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/setup.py b/setup.py
--- a/setup.py
+++ b/setup.py
@@ -31,7 +31,6 @@
'databricks-cli>=0.8.0',
'requests>=2.17.3',
'six>=1.10.0',
- 'uuid',
'gunicorn',
'Flask',
'numpy',
| {"golden_diff": "diff --git a/setup.py b/setup.py\n--- a/setup.py\n+++ b/setup.py\n@@ -31,7 +31,6 @@\n 'databricks-cli>=0.8.0',\n 'requests>=2.17.3',\n 'six>=1.10.0',\n- 'uuid',\n 'gunicorn',\n 'Flask',\n 'numpy',\n", "issue": "UUID dependency breaks python 3 under AWS linux\n### System information\r\n- **Have I written custom code (as opposed to using a stock example script provided in MLflow)**: No\r\n- **OS Platform and Distribution (e.g., Linux Ubuntu 16.04)**: Amazon linux deep learning AMI 12.0 (like CentOS)\r\n- **MLflow installed from (source or binary)**: source (PyPI)\r\n- **MLflow version (run ``mlflow --version``)**: mlflow, version 0.5.0\r\n- **Python version**: Python 3.6.6\r\n- **npm version (if running the dev UI): N/A\r\n- **Exact command to reproduce**: python -c \"import mlflow\"\r\n\r\n### Describe the problem\r\n```pip install mlflow``` also installs uuid==1.30 (which breaks under python3)\r\n\r\nThe default \"uuid\" library is included in the python standard library. On the AWS instance, the installed version shadows the default, and includes syntax which is only valid in python2. \r\nOn the computer I'm connecting to the instance from, the same script does not produce any errors, but ```uuid.__file__``` points to a standard library version and not the packaged 1.30\r\n\r\n### Source code / logs\r\nFull reproduction from a newly created instance:\r\n```\r\nsource activate tensorflow_p36\r\nvirtualenv env --system-site-packages --python=$(which python) env\r\nsource env/bin/activate\r\npip install mlflow\r\npython -c \"import mlflow\"\r\n```\r\n```\r\nTraceback (most recent call last):\r\n File \"<string>\", line 1, in <module>\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/__init__.py\", line 33, in <module>\r\n import mlflow.projects as projects # noqa\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/projects/__init__.py\", line 17, in <module>\r\n import mlflow.tracking as tracking\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/tracking/__init__.py\", line 7, in <module>\r\n from mlflow.tracking.service import MLflowService, get_service\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/tracking/service.py\", line 13, in <module>\r\n from mlflow.tracking.utils import _get_store\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/tracking/utils.py\", line 8, in <module>\r\n from mlflow.store.file_store import FileStore\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/mlflow/store/file_store.py\", line 3, in <module>\r\n import uuid\r\n File \"/home/ec2-user/scratch/env/lib/python3.6/site-packages/uuid.py\", line 138\r\n if not 0 <= time_low < 1<<32L:\r\n ^\r\nSyntaxError: invalid syntax\r\n```\n", "before_files": [{"content": "import imp\nimport os\nfrom setuptools import setup, find_packages\n\nversion = imp.load_source(\n 'mlflow.version', os.path.join('mlflow', 'version.py')).VERSION\n\n\n# Get a list of all files in the JS directory to include in our module\ndef package_files(directory):\n paths = []\n for (path, directories, filenames) in os.walk(directory):\n for filename in filenames:\n paths.append(os.path.join('..', path, filename))\n return paths\n\n\n# Prints out a set of paths (relative to the mlflow/ directory) of files in mlflow/server/js/build\n# to include in the wheel, e.g. 
\"../mlflow/server/js/build/index.html\"\njs_files = package_files('mlflow/server/js/build')\nsagmaker_server_files = package_files(\"mlflow/sagemaker/container\")\n\nsetup(\n name='mlflow',\n version=version,\n packages=find_packages(exclude=['tests', 'tests.*']),\n package_data={\"mlflow\": js_files + sagmaker_server_files},\n install_requires=[\n 'awscli',\n 'click>=6.7',\n 'databricks-cli>=0.8.0',\n 'requests>=2.17.3',\n 'six>=1.10.0',\n 'uuid',\n 'gunicorn',\n 'Flask',\n 'numpy',\n 'pandas',\n 'scipy',\n 'scikit-learn',\n 'python-dateutil',\n 'protobuf>=3.6.0',\n 'gitpython>=2.1.0',\n 'pyyaml',\n 'boto3',\n 'querystring_parser',\n 'simplejson',\n ],\n entry_points='''\n [console_scripts]\n mlflow=mlflow.cli:cli\n ''',\n zip_safe=False,\n author='Databricks',\n description='MLflow: An ML Workflow Tool',\n long_description=open('README.rst').read(),\n license='Apache License 2.0',\n classifiers=[\n 'Intended Audience :: Developers',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3.6',\n ],\n keywords='ml ai databricks',\n url='https://mlflow.org/'\n)\n", "path": "setup.py"}], "after_files": [{"content": "import imp\nimport os\nfrom setuptools import setup, find_packages\n\nversion = imp.load_source(\n 'mlflow.version', os.path.join('mlflow', 'version.py')).VERSION\n\n\n# Get a list of all files in the JS directory to include in our module\ndef package_files(directory):\n paths = []\n for (path, directories, filenames) in os.walk(directory):\n for filename in filenames:\n paths.append(os.path.join('..', path, filename))\n return paths\n\n\n# Prints out a set of paths (relative to the mlflow/ directory) of files in mlflow/server/js/build\n# to include in the wheel, e.g. \"../mlflow/server/js/build/index.html\"\njs_files = package_files('mlflow/server/js/build')\nsagmaker_server_files = package_files(\"mlflow/sagemaker/container\")\n\nsetup(\n name='mlflow',\n version=version,\n packages=find_packages(exclude=['tests', 'tests.*']),\n package_data={\"mlflow\": js_files + sagmaker_server_files},\n install_requires=[\n 'awscli',\n 'click>=6.7',\n 'databricks-cli>=0.8.0',\n 'requests>=2.17.3',\n 'six>=1.10.0',\n 'gunicorn',\n 'Flask',\n 'numpy',\n 'pandas',\n 'scipy',\n 'scikit-learn',\n 'python-dateutil',\n 'protobuf>=3.6.0',\n 'gitpython>=2.1.0',\n 'pyyaml',\n 'boto3',\n 'querystring_parser',\n 'simplejson',\n ],\n entry_points='''\n [console_scripts]\n mlflow=mlflow.cli:cli\n ''',\n zip_safe=False,\n author='Databricks',\n description='MLflow: An ML Workflow Tool',\n long_description=open('README.rst').read(),\n license='Apache License 2.0',\n classifiers=[\n 'Intended Audience :: Developers',\n 'Programming Language :: Python :: 2.7',\n 'Programming Language :: Python :: 3.6',\n ],\n keywords='ml ai databricks',\n url='https://mlflow.org/'\n)\n", "path": "setup.py"}]} | 1,520 | 86 |
gh_patches_debug_9590 | rasdani/github-patches | git_diff | HypothesisWorks__hypothesis-2965 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
[Bug?] `.filter` ignored on `just` strategy
Hi,
I hope this has not been reported before. I checked the issues and read the [changelog] ~but could not find anything related~ and this seems to be related to issue https://github.com/HypothesisWorks/hypothesis/issues/2036.
[changelog]: https://hypothesis.readthedocs.io/en/latest/changes.html
I noticed an unexpected difference in behavior when `.filter` is applied to a `just` strategy between Hypothesis 5.43.3 and Hypothesis 6.12.0 (the former ran on my machine, while the remote CI machine had 6.12.0 installed).
With Hypothesis 5.43.3 (expected behavior):
```python
import hypothesis.strategies
import hypothesis.strategies._internal.strategies
strategy = hypothesis.strategies.just(1).filter(lambda x: x > 10)
assert str(strategy) == "just(1).filter(lambda x: x > 10)"
assert type(strategy) == hypothesis.strategies._internal.strategies.FilteredStrategy
```
With Hypothesis 6.12.0 (unexpected behavior):
```python
import hypothesis.strategies
import hypothesis.strategies._internal.misc
strategy = hypothesis.strategies.just(1).filter(lambda x: x > 10)
assert str(strategy) == "just(1)"
assert type(strategy) == hypothesis.strategies._internal.misc.JustStrategy
```
This bug (?) is relevant for [icontract-hypothesis] when we test instance methods automatically and pre-conditions need to be applied to `self`. I'd expect the health check to be raised rather than the pre-conditions being silently ignored.
[icontract-hypothesis]: https://github.com/mristin/icontract-hypothesis
--- END ISSUE ---
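Worth noting against the quoted `JustStrategy.do_draw` below: the stored filter conditions are still applied at draw time via `_transform`, which suggests only the `repr` dropped them. A sketch of the expected draw-time behavior (not verified against 6.12.0 exactly):
```python
from hypothesis import given, strategies as st

@given(st.just(1).filter(lambda x: x > 10))
def test_unsatisfiable(x):
    pass

# When run (e.g. under pytest), hypothesis should abort with Unsatisfiable or a
# filter_too_much health-check failure: the only candidate never passes the filter.
```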
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `hypothesis-python/src/hypothesis/strategies/_internal/misc.py`
Content:
```
1 # This file is part of Hypothesis, which may be found at
2 # https://github.com/HypothesisWorks/hypothesis/
3 #
4 # Most of this work is copyright (C) 2013-2021 David R. MacIver
5 # ([email protected]), but it contains contributions by others. See
6 # CONTRIBUTING.rst for a full list of people who may hold copyright, and
7 # consult the git log if you need to determine who owns an individual
8 # contribution.
9 #
10 # This Source Code Form is subject to the terms of the Mozilla Public License,
11 # v. 2.0. If a copy of the MPL was not distributed with this file, You can
12 # obtain one at https://mozilla.org/MPL/2.0/.
13 #
14 # END HEADER
15
16 from hypothesis.internal.reflection import get_pretty_function_description
17 from hypothesis.strategies._internal.strategies import (
18 FilteredStrategy,
19 SampledFromStrategy,
20 SearchStrategy,
21 T,
22 filter_not_satisfied,
23 is_simple_data,
24 )
25 from hypothesis.strategies._internal.utils import cacheable, defines_strategy
26
27
28 class JustStrategy(SampledFromStrategy):
29 """A strategy which always returns a single fixed value.
30
31 It's implemented as a length-one SampledFromStrategy so that all our
32 special-case logic for filtering and sets applies also to just(x).
33
34 The important difference from a SampledFromStrategy with only one
35 element to choose is that JustStrategy *never* touches the underlying
36 choice sequence, i.e. drawing neither reads from nor writes to `data`.
37 This is a reasonably important optimisation (or semantic distinction!)
38 for both JustStrategy and SampledFromStrategy.
39 """
40
41 @property
42 def value(self):
43 return self.elements[0]
44
45 def __repr__(self):
46 if self.value is None:
47 return "none()"
48 return f"just({get_pretty_function_description(self.value)})"
49
50 def calc_has_reusable_values(self, recur):
51 return True
52
53 def calc_is_cacheable(self, recur):
54 return is_simple_data(self.value)
55
56 def do_draw(self, data):
57 result = self._transform(self.value)
58 if result is filter_not_satisfied:
59 data.note_event(f"Aborted test because unable to satisfy {self!r}")
60 data.mark_invalid()
61 return result
62
63 def do_filtered_draw(self, data, filter_strategy):
64 if isinstance(filter_strategy, FilteredStrategy):
65 return self._transform(self.value, filter_strategy.flat_conditions)
66 return self._transform(self.value)
67
68
69 def just(value: T) -> SearchStrategy[T]:
70 """Return a strategy which only generates ``value``.
71
72 Note: ``value`` is not copied. Be wary of using mutable values.
73
74 If ``value`` is the result of a callable, you can use
75 :func:`builds(callable) <hypothesis.strategies.builds>` instead
76 of ``just(callable())`` to get a fresh value each time.
77
78 Examples from this strategy do not shrink (because there is only one).
79 """
80 return JustStrategy([value])
81
82
83 @defines_strategy(force_reusable_values=True)
84 def none() -> SearchStrategy[None]:
85 """Return a strategy which only generates None.
86
87 Examples from this strategy do not shrink (because there is only
88 one).
89 """
90 return just(None)
91
92
93 class Nothing(SearchStrategy):
94 def calc_is_empty(self, recur):
95 return True
96
97 def do_draw(self, data):
98 # This method should never be called because draw() will mark the
99 # data as invalid immediately because is_empty is True.
100 raise NotImplementedError("This should never happen")
101
102 def calc_has_reusable_values(self, recur):
103 return True
104
105 def __repr__(self):
106 return "nothing()"
107
108 def map(self, f):
109 return self
110
111 def filter(self, f):
112 return self
113
114 def flatmap(self, f):
115 return self
116
117
118 NOTHING = Nothing()
119
120
121 @cacheable
122 def nothing() -> SearchStrategy:
123 """This strategy never successfully draws a value and will always reject on
124 an attempt to draw.
125
126 Examples from this strategy do not shrink (because there are none).
127 """
128 return NOTHING
129
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/hypothesis-python/src/hypothesis/strategies/_internal/misc.py b/hypothesis-python/src/hypothesis/strategies/_internal/misc.py
--- a/hypothesis-python/src/hypothesis/strategies/_internal/misc.py
+++ b/hypothesis-python/src/hypothesis/strategies/_internal/misc.py
@@ -43,9 +43,13 @@
return self.elements[0]
def __repr__(self):
+ suffix = "".join(
+ f".{name}({get_pretty_function_description(f)})"
+ for name, f in self._transformations
+ )
if self.value is None:
- return "none()"
- return f"just({get_pretty_function_description(self.value)})"
+ return "none()" + suffix
+ return f"just({get_pretty_function_description(self.value)}){suffix}"
def calc_has_reusable_values(self, recur):
return True
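A minimal check of the repaired `repr` (the exact lambda text depends on source availability, so treat the output as illustrative):
```python
import hypothesis.strategies as st

strategy = st.just(1).filter(lambda x: x > 10)
print(repr(strategy))
# Expected after the patch: just(1).filter(lambda x: x > 10)
```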
| {"golden_diff": "diff --git a/hypothesis-python/src/hypothesis/strategies/_internal/misc.py b/hypothesis-python/src/hypothesis/strategies/_internal/misc.py\n--- a/hypothesis-python/src/hypothesis/strategies/_internal/misc.py\n+++ b/hypothesis-python/src/hypothesis/strategies/_internal/misc.py\n@@ -43,9 +43,13 @@\n return self.elements[0]\n \n def __repr__(self):\n+ suffix = \"\".join(\n+ f\".{name}({get_pretty_function_description(f)})\"\n+ for name, f in self._transformations\n+ )\n if self.value is None:\n- return \"none()\"\n- return f\"just({get_pretty_function_description(self.value)})\"\n+ return \"none()\" + suffix\n+ return f\"just({get_pretty_function_description(self.value)}){suffix}\"\n \n def calc_has_reusable_values(self, recur):\n return True\n", "issue": "[Bug?] `.filter` ignored on `just` strategy\nHi,\r\nI hope this has not been reported before. I checked the issues and read the [changelog] ~but could not find anything related~ and this seems to be related to issue https://github.com/HypothesisWorks/hypothesis/issues/2036.\r\n\r\n[changelog]: https://hypothesis.readthedocs.io/en/latest/changes.html\r\n\r\nI noticed an unexpected difference in the behavior when `.filter` is applied on `just` strategy between Hypothesis 5.43.3 and Hypothesis 6.12.0 (the former ran on my machine, while the remote CI machine had 6.12.0 installed).\r\n\r\nWith Hypothesis 5.43.3 (expected behavior):\r\n```python\r\nimport hypothesis.strategies\r\nimport hypothesis.strategies._internal.strategies\r\n\r\nstrategy = hypothesis.strategies.just(1).filter(lambda x: x > 10)\r\nassert str(strategy) == \"just(1).filter(lambda x: x > 10)\"\r\nassert type(strategy) == hypothesis.strategies._internal.strategies.FilteredStrategy\r\n```\r\n\r\nWith Hypothesis 6.12.0 (unexpected behavior):\r\n\r\n```python\r\nimport hypothesis.strategies\r\nimport hypothesis.strategies._internal.misc\r\n\r\nstrategy = hypothesis.strategies.just(1).filter(lambda x: x > 10)\r\nassert str(strategy) == \"just(1)\"\r\nassert type(strategy) == hypothesis.strategies._internal.misc.JustStrategy\r\n```\r\n\r\nThis bug (?) is relevant for [icontract-hypothesis] when we test instance methods automatically where pre-conditions need to be applied on `self`. I'd expect the health check to be raised rather than the pre-conditions to be silently ignored.\r\n\r\n[icontract-hypothesis]: https://github.com/mristin/icontract-hypothesis\n", "before_files": [{"content": "# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis/\n#\n# Most of this work is copyright (C) 2013-2021 David R. MacIver\n# ([email protected]), but it contains contributions by others. See\n# CONTRIBUTING.rst for a full list of people who may hold copyright, and\n# consult the git log if you need to determine who owns an individual\n# contribution.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. 
If a copy of the MPL was not distributed with this file, You can\n# obtain one at https://mozilla.org/MPL/2.0/.\n#\n# END HEADER\n\nfrom hypothesis.internal.reflection import get_pretty_function_description\nfrom hypothesis.strategies._internal.strategies import (\n FilteredStrategy,\n SampledFromStrategy,\n SearchStrategy,\n T,\n filter_not_satisfied,\n is_simple_data,\n)\nfrom hypothesis.strategies._internal.utils import cacheable, defines_strategy\n\n\nclass JustStrategy(SampledFromStrategy):\n \"\"\"A strategy which always returns a single fixed value.\n\n It's implemented as a length-one SampledFromStrategy so that all our\n special-case logic for filtering and sets applies also to just(x).\n\n The important difference from a SampledFromStrategy with only one\n element to choose is that JustStrategy *never* touches the underlying\n choice sequence, i.e. drawing neither reads from nor writes to `data`.\n This is a reasonably important optimisation (or semantic distinction!)\n for both JustStrategy and SampledFromStrategy.\n \"\"\"\n\n @property\n def value(self):\n return self.elements[0]\n\n def __repr__(self):\n if self.value is None:\n return \"none()\"\n return f\"just({get_pretty_function_description(self.value)})\"\n\n def calc_has_reusable_values(self, recur):\n return True\n\n def calc_is_cacheable(self, recur):\n return is_simple_data(self.value)\n\n def do_draw(self, data):\n result = self._transform(self.value)\n if result is filter_not_satisfied:\n data.note_event(f\"Aborted test because unable to satisfy {self!r}\")\n data.mark_invalid()\n return result\n\n def do_filtered_draw(self, data, filter_strategy):\n if isinstance(filter_strategy, FilteredStrategy):\n return self._transform(self.value, filter_strategy.flat_conditions)\n return self._transform(self.value)\n\n\ndef just(value: T) -> SearchStrategy[T]:\n \"\"\"Return a strategy which only generates ``value``.\n\n Note: ``value`` is not copied. 
Be wary of using mutable values.\n\n If ``value`` is the result of a callable, you can use\n :func:`builds(callable) <hypothesis.strategies.builds>` instead\n of ``just(callable())`` to get a fresh value each time.\n\n Examples from this strategy do not shrink (because there is only one).\n \"\"\"\n return JustStrategy([value])\n\n\n@defines_strategy(force_reusable_values=True)\ndef none() -> SearchStrategy[None]:\n \"\"\"Return a strategy which only generates None.\n\n Examples from this strategy do not shrink (because there is only\n one).\n \"\"\"\n return just(None)\n\n\nclass Nothing(SearchStrategy):\n def calc_is_empty(self, recur):\n return True\n\n def do_draw(self, data):\n # This method should never be called because draw() will mark the\n # data as invalid immediately because is_empty is True.\n raise NotImplementedError(\"This should never happen\")\n\n def calc_has_reusable_values(self, recur):\n return True\n\n def __repr__(self):\n return \"nothing()\"\n\n def map(self, f):\n return self\n\n def filter(self, f):\n return self\n\n def flatmap(self, f):\n return self\n\n\nNOTHING = Nothing()\n\n\n@cacheable\ndef nothing() -> SearchStrategy:\n \"\"\"This strategy never successfully draws a value and will always reject on\n an attempt to draw.\n\n Examples from this strategy do not shrink (because there are none).\n \"\"\"\n return NOTHING\n", "path": "hypothesis-python/src/hypothesis/strategies/_internal/misc.py"}], "after_files": [{"content": "# This file is part of Hypothesis, which may be found at\n# https://github.com/HypothesisWorks/hypothesis/\n#\n# Most of this work is copyright (C) 2013-2021 David R. MacIver\n# ([email protected]), but it contains contributions by others. See\n# CONTRIBUTING.rst for a full list of people who may hold copyright, and\n# consult the git log if you need to determine who owns an individual\n# contribution.\n#\n# This Source Code Form is subject to the terms of the Mozilla Public License,\n# v. 2.0. If a copy of the MPL was not distributed with this file, You can\n# obtain one at https://mozilla.org/MPL/2.0/.\n#\n# END HEADER\n\nfrom hypothesis.internal.reflection import get_pretty_function_description\nfrom hypothesis.strategies._internal.strategies import (\n FilteredStrategy,\n SampledFromStrategy,\n SearchStrategy,\n T,\n filter_not_satisfied,\n is_simple_data,\n)\nfrom hypothesis.strategies._internal.utils import cacheable, defines_strategy\n\n\nclass JustStrategy(SampledFromStrategy):\n \"\"\"A strategy which always returns a single fixed value.\n\n It's implemented as a length-one SampledFromStrategy so that all our\n special-case logic for filtering and sets applies also to just(x).\n\n The important difference from a SampledFromStrategy with only one\n element to choose is that JustStrategy *never* touches the underlying\n choice sequence, i.e. 
drawing neither reads from nor writes to `data`.\n This is a reasonably important optimisation (or semantic distinction!)\n for both JustStrategy and SampledFromStrategy.\n \"\"\"\n\n @property\n def value(self):\n return self.elements[0]\n\n def __repr__(self):\n suffix = \"\".join(\n f\".{name}({get_pretty_function_description(f)})\"\n for name, f in self._transformations\n )\n if self.value is None:\n return \"none()\" + suffix\n return f\"just({get_pretty_function_description(self.value)}){suffix}\"\n\n def calc_has_reusable_values(self, recur):\n return True\n\n def calc_is_cacheable(self, recur):\n return is_simple_data(self.value)\n\n def do_draw(self, data):\n result = self._transform(self.value)\n if result is filter_not_satisfied:\n data.note_event(f\"Aborted test because unable to satisfy {self!r}\")\n data.mark_invalid()\n return result\n\n def do_filtered_draw(self, data, filter_strategy):\n if isinstance(filter_strategy, FilteredStrategy):\n return self._transform(self.value, filter_strategy.flat_conditions)\n return self._transform(self.value)\n\n\ndef just(value: T) -> SearchStrategy[T]:\n \"\"\"Return a strategy which only generates ``value``.\n\n Note: ``value`` is not copied. Be wary of using mutable values.\n\n If ``value`` is the result of a callable, you can use\n :func:`builds(callable) <hypothesis.strategies.builds>` instead\n of ``just(callable())`` to get a fresh value each time.\n\n Examples from this strategy do not shrink (because there is only one).\n \"\"\"\n return JustStrategy([value])\n\n\n@defines_strategy(force_reusable_values=True)\ndef none() -> SearchStrategy[None]:\n \"\"\"Return a strategy which only generates None.\n\n Examples from this strategy do not shrink (because there is only\n one).\n \"\"\"\n return just(None)\n\n\nclass Nothing(SearchStrategy):\n def calc_is_empty(self, recur):\n return True\n\n def do_draw(self, data):\n # This method should never be called because draw() will mark the\n # data as invalid immediately because is_empty is True.\n raise NotImplementedError(\"This should never happen\")\n\n def calc_has_reusable_values(self, recur):\n return True\n\n def __repr__(self):\n return \"nothing()\"\n\n def map(self, f):\n return self\n\n def filter(self, f):\n return self\n\n def flatmap(self, f):\n return self\n\n\nNOTHING = Nothing()\n\n\n@cacheable\ndef nothing() -> SearchStrategy:\n \"\"\"This strategy never successfully draws a value and will always reject on\n an attempt to draw.\n\n Examples from this strategy do not shrink (because there are none).\n \"\"\"\n return NOTHING\n", "path": "hypothesis-python/src/hypothesis/strategies/_internal/misc.py"}]} | 1,858 | 210 |
gh_patches_debug_2694 | rasdani/github-patches | git_diff | googleapis__google-cloud-python-1347 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Should we compare Entity._meanings in __eq__?
/cc @tseaver @pcostell
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `gcloud/datastore/entity.py`
Content:
```
1 # Copyright 2014 Google Inc. All rights reserved.
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 """Class for representing a single entity in the Cloud Datastore."""
16
17
18 from gcloud._helpers import _ensure_tuple_or_list
19
20
21 class Entity(dict):
22 """Entities are akin to rows in a relational database
23
24 An entity storing the actual instance of data.
25
26 Each entity is officially represented with a
27 :class:`gcloud.datastore.key.Key` class, however it is possible that
28 you might create an Entity with only a partial Key (that is, a Key
29 with a Kind, and possibly a parent, but without an ID). In such a
30 case, the datastore service will automatically assign an ID to the
31 partial key.
32
33 Entities in this API act like dictionaries with extras built in that
34 allow you to delete or persist the data stored on the entity.
35
36 Entities are mutable and act like a subclass of a dictionary.
37 This means you could take an existing entity and change the key
38 to duplicate the object.
39
40 Use :func:`gcloud.datastore.get` to retrieve an existing entity.
41
42 >>> datastore.get(key)
43 <Entity[{'kind': 'EntityKind', id: 1234}] {'property': 'value'}>
44
45 You can the set values on the entity just like you would on any
46 other dictionary.
47
48 >>> entity['age'] = 20
49 >>> entity['name'] = 'JJ'
50 >>> entity
51 <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>
52
53 And you can convert an entity to a regular Python dictionary with the
54 ``dict`` builtin:
55
56 >>> dict(entity)
57 {'age': 20, 'name': 'JJ'}
58
59 .. note::
60
61 When saving an entity to the backend, values which are "text"
62 (``unicode`` in Python2, ``str`` in Python3) will be saved using
63 the 'text_value' field, after being encoded to UTF-8. When
64 retrieved from the back-end, such values will be decoded to "text"
65 again. Values which are "bytes" (``str`` in Python2, ``bytes`` in
66 Python3), will be saved using the 'blob_value' field, without
67 any decoding / encoding step.
68
69 :type key: :class:`gcloud.datastore.key.Key`
70 :param key: Optional key to be set on entity. Required for
71 :func:`gcloud.datastore.put()` and
72 :func:`gcloud.datastore.put_multi()`
73
74 :type exclude_from_indexes: tuple of string
75 :param exclude_from_indexes: Names of fields whose values are not to be
76 indexed for this entity.
77 """
78
79 def __init__(self, key=None, exclude_from_indexes=()):
80 super(Entity, self).__init__()
81 self.key = key
82 self._exclude_from_indexes = set(_ensure_tuple_or_list(
83 'exclude_from_indexes', exclude_from_indexes))
84 # NOTE: This will be populated when parsing a protobuf in
85 # gcloud.datastore.helpers.entity_from_protobuf.
86 self._meanings = {}
87
88 def __eq__(self, other):
89 """Compare two entities for equality.
90
91 Entities compare equal if their keys compare equal, and their
92 properties compare equal.
93
94 :rtype: boolean
95 :returns: True if the entities compare equal, else False.
96 """
97 if not isinstance(other, Entity):
98 return False
99
100 return (self.key == other.key and
101 super(Entity, self).__eq__(other))
102
103 def __ne__(self, other):
104 """Compare two entities for inequality.
105
106 Entities compare equal if their keys compare equal, and their
107 properties compare equal.
108
109 :rtype: boolean
110 :returns: False if the entities compare equal, else True.
111 """
112 return not self.__eq__(other)
113
114 @property
115 def kind(self):
116 """Get the kind of the current entity.
117
118 .. note::
119 This relies entirely on the :class:`gcloud.datastore.key.Key`
120 set on the entity. That means that we're not storing the kind
121 of the entity at all, just the properties and a pointer to a
122 Key which knows its Kind.
123 """
124 if self.key:
125 return self.key.kind
126
127 @property
128 def exclude_from_indexes(self):
129 """Names of fields which are *not* to be indexed for this entity.
130
131 :rtype: sequence of field names
132 """
133 return frozenset(self._exclude_from_indexes)
134
135 def __repr__(self):
136 if self.key:
137 return '<Entity%s %s>' % (self.key.path,
138 super(Entity, self).__repr__())
139 else:
140 return '<Entity %s>' % (super(Entity, self).__repr__())
141
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/gcloud/datastore/entity.py b/gcloud/datastore/entity.py
--- a/gcloud/datastore/entity.py
+++ b/gcloud/datastore/entity.py
@@ -98,6 +98,8 @@
return False
return (self.key == other.key and
+ self._exclude_from_indexes == other._exclude_from_indexes and
+ self._meanings == other._meanings and
super(Entity, self).__eq__(other))
def __ne__(self, other):
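An illustrative consequence of the patch — `_meanings` is private state normally populated by `entity_from_protobuf`, and the meaning entry below is hypothetical:
```python
from gcloud.datastore.entity import Entity

a, b = Entity(), Entity()
a["age"] = b["age"] = 20

a._meanings["age"] = (9, 20)  # assumed (meaning-code, value) pair, for illustration
assert a != b                 # differing _meanings now break equality
assert Entity() == Entity()   # empty entities still compare equal
```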
| {"golden_diff": "diff --git a/gcloud/datastore/entity.py b/gcloud/datastore/entity.py\n--- a/gcloud/datastore/entity.py\n+++ b/gcloud/datastore/entity.py\n@@ -98,6 +98,8 @@\n return False\n \n return (self.key == other.key and\n+ self._exclude_from_indexes == other._exclude_from_indexes and\n+ self._meanings == other._meanings and\n super(Entity, self).__eq__(other))\n \n def __ne__(self, other):\n", "issue": "Should we compare Entity._meanings in __eq__\n/cc @tseaver @pcostell \n\n", "before_files": [{"content": "# Copyright 2014 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Class for representing a single entity in the Cloud Datastore.\"\"\"\n\n\nfrom gcloud._helpers import _ensure_tuple_or_list\n\n\nclass Entity(dict):\n \"\"\"Entities are akin to rows in a relational database\n\n An entity storing the actual instance of data.\n\n Each entity is officially represented with a\n :class:`gcloud.datastore.key.Key` class, however it is possible that\n you might create an Entity with only a partial Key (that is, a Key\n with a Kind, and possibly a parent, but without an ID). In such a\n case, the datastore service will automatically assign an ID to the\n partial key.\n\n Entities in this API act like dictionaries with extras built in that\n allow you to delete or persist the data stored on the entity.\n\n Entities are mutable and act like a subclass of a dictionary.\n This means you could take an existing entity and change the key\n to duplicate the object.\n\n Use :func:`gcloud.datastore.get` to retrieve an existing entity.\n\n >>> datastore.get(key)\n <Entity[{'kind': 'EntityKind', id: 1234}] {'property': 'value'}>\n\n You can the set values on the entity just like you would on any\n other dictionary.\n\n >>> entity['age'] = 20\n >>> entity['name'] = 'JJ'\n >>> entity\n <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>\n\n And you can convert an entity to a regular Python dictionary with the\n ``dict`` builtin:\n\n >>> dict(entity)\n {'age': 20, 'name': 'JJ'}\n\n .. note::\n\n When saving an entity to the backend, values which are \"text\"\n (``unicode`` in Python2, ``str`` in Python3) will be saved using\n the 'text_value' field, after being encoded to UTF-8. When\n retrieved from the back-end, such values will be decoded to \"text\"\n again. Values which are \"bytes\" (``str`` in Python2, ``bytes`` in\n Python3), will be saved using the 'blob_value' field, without\n any decoding / encoding step.\n\n :type key: :class:`gcloud.datastore.key.Key`\n :param key: Optional key to be set on entity. 
Required for\n :func:`gcloud.datastore.put()` and\n :func:`gcloud.datastore.put_multi()`\n\n :type exclude_from_indexes: tuple of string\n :param exclude_from_indexes: Names of fields whose values are not to be\n indexed for this entity.\n \"\"\"\n\n def __init__(self, key=None, exclude_from_indexes=()):\n super(Entity, self).__init__()\n self.key = key\n self._exclude_from_indexes = set(_ensure_tuple_or_list(\n 'exclude_from_indexes', exclude_from_indexes))\n # NOTE: This will be populated when parsing a protobuf in\n # gcloud.datastore.helpers.entity_from_protobuf.\n self._meanings = {}\n\n def __eq__(self, other):\n \"\"\"Compare two entities for equality.\n\n Entities compare equal if their keys compare equal, and their\n properties compare equal.\n\n :rtype: boolean\n :returns: True if the entities compare equal, else False.\n \"\"\"\n if not isinstance(other, Entity):\n return False\n\n return (self.key == other.key and\n super(Entity, self).__eq__(other))\n\n def __ne__(self, other):\n \"\"\"Compare two entities for inequality.\n\n Entities compare equal if their keys compare equal, and their\n properties compare equal.\n\n :rtype: boolean\n :returns: False if the entities compare equal, else True.\n \"\"\"\n return not self.__eq__(other)\n\n @property\n def kind(self):\n \"\"\"Get the kind of the current entity.\n\n .. note::\n This relies entirely on the :class:`gcloud.datastore.key.Key`\n set on the entity. That means that we're not storing the kind\n of the entity at all, just the properties and a pointer to a\n Key which knows its Kind.\n \"\"\"\n if self.key:\n return self.key.kind\n\n @property\n def exclude_from_indexes(self):\n \"\"\"Names of fields which are *not* to be indexed for this entity.\n\n :rtype: sequence of field names\n \"\"\"\n return frozenset(self._exclude_from_indexes)\n\n def __repr__(self):\n if self.key:\n return '<Entity%s %s>' % (self.key.path,\n super(Entity, self).__repr__())\n else:\n return '<Entity %s>' % (super(Entity, self).__repr__())\n", "path": "gcloud/datastore/entity.py"}], "after_files": [{"content": "# Copyright 2014 Google Inc. All rights reserved.\n#\n# Licensed under the Apache License, Version 2.0 (the \"License\");\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# http://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.\n\n\"\"\"Class for representing a single entity in the Cloud Datastore.\"\"\"\n\n\nfrom gcloud._helpers import _ensure_tuple_or_list\n\n\nclass Entity(dict):\n \"\"\"Entities are akin to rows in a relational database\n\n An entity storing the actual instance of data.\n\n Each entity is officially represented with a\n :class:`gcloud.datastore.key.Key` class, however it is possible that\n you might create an Entity with only a partial Key (that is, a Key\n with a Kind, and possibly a parent, but without an ID). 
In such a\n case, the datastore service will automatically assign an ID to the\n partial key.\n\n Entities in this API act like dictionaries with extras built in that\n allow you to delete or persist the data stored on the entity.\n\n Entities are mutable and act like a subclass of a dictionary.\n This means you could take an existing entity and change the key\n to duplicate the object.\n\n Use :func:`gcloud.datastore.get` to retrieve an existing entity.\n\n >>> datastore.get(key)\n <Entity[{'kind': 'EntityKind', id: 1234}] {'property': 'value'}>\n\n You can the set values on the entity just like you would on any\n other dictionary.\n\n >>> entity['age'] = 20\n >>> entity['name'] = 'JJ'\n >>> entity\n <Entity[{'kind': 'EntityKind', id: 1234}] {'age': 20, 'name': 'JJ'}>\n\n And you can convert an entity to a regular Python dictionary with the\n ``dict`` builtin:\n\n >>> dict(entity)\n {'age': 20, 'name': 'JJ'}\n\n .. note::\n\n When saving an entity to the backend, values which are \"text\"\n (``unicode`` in Python2, ``str`` in Python3) will be saved using\n the 'text_value' field, after being encoded to UTF-8. When\n retrieved from the back-end, such values will be decoded to \"text\"\n again. Values which are \"bytes\" (``str`` in Python2, ``bytes`` in\n Python3), will be saved using the 'blob_value' field, without\n any decoding / encoding step.\n\n :type key: :class:`gcloud.datastore.key.Key`\n :param key: Optional key to be set on entity. Required for\n :func:`gcloud.datastore.put()` and\n :func:`gcloud.datastore.put_multi()`\n\n :type exclude_from_indexes: tuple of string\n :param exclude_from_indexes: Names of fields whose values are not to be\n indexed for this entity.\n \"\"\"\n\n def __init__(self, key=None, exclude_from_indexes=()):\n super(Entity, self).__init__()\n self.key = key\n self._exclude_from_indexes = set(_ensure_tuple_or_list(\n 'exclude_from_indexes', exclude_from_indexes))\n # NOTE: This will be populated when parsing a protobuf in\n # gcloud.datastore.helpers.entity_from_protobuf.\n self._meanings = {}\n\n def __eq__(self, other):\n \"\"\"Compare two entities for equality.\n\n Entities compare equal if their keys compare equal, and their\n properties compare equal.\n\n :rtype: boolean\n :returns: True if the entities compare equal, else False.\n \"\"\"\n if not isinstance(other, Entity):\n return False\n\n return (self.key == other.key and\n self._exclude_from_indexes == other._exclude_from_indexes and\n self._meanings == other._meanings and\n super(Entity, self).__eq__(other))\n\n def __ne__(self, other):\n \"\"\"Compare two entities for inequality.\n\n Entities compare equal if their keys compare equal, and their\n properties compare equal.\n\n :rtype: boolean\n :returns: False if the entities compare equal, else True.\n \"\"\"\n return not self.__eq__(other)\n\n @property\n def kind(self):\n \"\"\"Get the kind of the current entity.\n\n .. note::\n This relies entirely on the :class:`gcloud.datastore.key.Key`\n set on the entity. 
That means that we're not storing the kind\n of the entity at all, just the properties and a pointer to a\n Key which knows its Kind.\n \"\"\"\n if self.key:\n return self.key.kind\n\n @property\n def exclude_from_indexes(self):\n \"\"\"Names of fields which are *not* to be indexed for this entity.\n\n :rtype: sequence of field names\n \"\"\"\n return frozenset(self._exclude_from_indexes)\n\n def __repr__(self):\n if self.key:\n return '<Entity%s %s>' % (self.key.path,\n super(Entity, self).__repr__())\n else:\n return '<Entity %s>' % (super(Entity, self).__repr__())\n", "path": "gcloud/datastore/entity.py"}]} | 1,775 | 110 |
gh_patches_debug_3810 | rasdani/github-patches | git_diff | iterative__dvc-3129 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
dvc: fix version generation
Looks like dynamic version got broken https://travis-ci.com/iterative/dvc/jobs/274986530 .
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `dvc/version.py`
Content:
```
1 # Used in setup.py, so don't pull any additional dependencies
2 #
3 # Based on:
4 # - https://github.com/python/mypy/blob/master/mypy/version.py
5 # - https://github.com/python/mypy/blob/master/mypy/git.py
6 import os
7 import subprocess
8
9
10 _BASE_VERSION = "0.81.0"
11
12
13 def _generate_version(base_version):
14 """Generate a version with information about the git repository"""
15 pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
16
17 if not _is_git_repo(pkg_dir) or not _have_git():
18 return base_version
19
20 if _is_release(pkg_dir, base_version) and not _is_dirty(pkg_dir):
21 return base_version
22
23 return "{base_version}+{short_sha}{dirty}".format(
24 base_version=base_version,
25 short_sha=_git_revision(pkg_dir).decode("utf-8")[0:6],
26 dirty=".mod" if _is_dirty(pkg_dir) else "",
27 )
28
29
30 def _is_git_repo(dir_path):
31 """Is the given directory version-controlled with git?"""
32 return os.path.exists(os.path.join(dir_path, ".git"))
33
34
35 def _have_git():
36 """Can we run the git executable?"""
37 try:
38 subprocess.check_output(["git", "--help"])
39 return True
40 except subprocess.CalledProcessError:
41 return False
42 except OSError:
43 return False
44
45
46 def _is_release(dir_path, base_version):
47 try:
48 output = subprocess.check_output(
49 ["git", "describe", "--tags", "--exact-match"],
50 cwd=dir_path,
51 stderr=subprocess.STDOUT,
52 )
53 tag = output.strip()
54 return tag == base_version
55 except subprocess.CalledProcessError:
56 return False
57
58
59 def _git_revision(dir_path):
60 """Get the SHA-1 of the HEAD of a git repository."""
61 return subprocess.check_output(
62 ["git", "rev-parse", "HEAD"], cwd=dir_path
63 ).strip()
64
65
66 def _is_dirty(dir_path):
67 """Check whether a git repository has uncommitted changes."""
68 try:
69 subprocess.check_call(["git", "diff", "--quiet"], cwd=dir_path)
70 return False
71 except subprocess.CalledProcessError:
72 return True
73
74
75 __version__ = _generate_version(_BASE_VERSION)
76
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/dvc/version.py b/dvc/version.py
--- a/dvc/version.py
+++ b/dvc/version.py
@@ -49,7 +49,7 @@
["git", "describe", "--tags", "--exact-match"],
cwd=dir_path,
stderr=subprocess.STDOUT,
- )
+ ).decode("utf-8")
tag = output.strip()
return tag == base_version
except subprocess.CalledProcessError:
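The one-character fix above works because `subprocess.check_output` returns `bytes` on Python 3, while `base_version` is a `str`, so the comparison in `_is_release` could never be true. A minimal, runnable illustration of that pitfall (the literal tag value is made up for the demo):
```python
base_version = "0.81.0"
raw_tag = b"0.81.0\n"  # what check_output hands back on Python 3

print(raw_tag.strip() == base_version)                  # False: bytes != str
print(raw_tag.decode("utf-8").strip() == base_version)  # True once decoded
```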
| {"golden_diff": "diff --git a/dvc/version.py b/dvc/version.py\n--- a/dvc/version.py\n+++ b/dvc/version.py\n@@ -49,7 +49,7 @@\n [\"git\", \"describe\", \"--tags\", \"--exact-match\"],\n cwd=dir_path,\n stderr=subprocess.STDOUT,\n- )\n+ ).decode(\"utf-8\")\n tag = output.strip()\n return tag == base_version\n except subprocess.CalledProcessError:\n", "issue": "dvc: fix version generatio\nLooks like dynamic version got broken https://travis-ci.com/iterative/dvc/jobs/274986530 .\r\n\n", "before_files": [{"content": "# Used in setup.py, so don't pull any additional dependencies\n#\n# Based on:\n# - https://github.com/python/mypy/blob/master/mypy/version.py\n# - https://github.com/python/mypy/blob/master/mypy/git.py\nimport os\nimport subprocess\n\n\n_BASE_VERSION = \"0.81.0\"\n\n\ndef _generate_version(base_version):\n \"\"\"Generate a version with information about the git repository\"\"\"\n pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n\n if not _is_git_repo(pkg_dir) or not _have_git():\n return base_version\n\n if _is_release(pkg_dir, base_version) and not _is_dirty(pkg_dir):\n return base_version\n\n return \"{base_version}+{short_sha}{dirty}\".format(\n base_version=base_version,\n short_sha=_git_revision(pkg_dir).decode(\"utf-8\")[0:6],\n dirty=\".mod\" if _is_dirty(pkg_dir) else \"\",\n )\n\n\ndef _is_git_repo(dir_path):\n \"\"\"Is the given directory version-controlled with git?\"\"\"\n return os.path.exists(os.path.join(dir_path, \".git\"))\n\n\ndef _have_git():\n \"\"\"Can we run the git executable?\"\"\"\n try:\n subprocess.check_output([\"git\", \"--help\"])\n return True\n except subprocess.CalledProcessError:\n return False\n except OSError:\n return False\n\n\ndef _is_release(dir_path, base_version):\n try:\n output = subprocess.check_output(\n [\"git\", \"describe\", \"--tags\", \"--exact-match\"],\n cwd=dir_path,\n stderr=subprocess.STDOUT,\n )\n tag = output.strip()\n return tag == base_version\n except subprocess.CalledProcessError:\n return False\n\n\ndef _git_revision(dir_path):\n \"\"\"Get the SHA-1 of the HEAD of a git repository.\"\"\"\n return subprocess.check_output(\n [\"git\", \"rev-parse\", \"HEAD\"], cwd=dir_path\n ).strip()\n\n\ndef _is_dirty(dir_path):\n \"\"\"Check whether a git repository has uncommitted changes.\"\"\"\n try:\n subprocess.check_call([\"git\", \"diff\", \"--quiet\"], cwd=dir_path)\n return False\n except subprocess.CalledProcessError:\n return True\n\n\n__version__ = _generate_version(_BASE_VERSION)\n", "path": "dvc/version.py"}], "after_files": [{"content": "# Used in setup.py, so don't pull any additional dependencies\n#\n# Based on:\n# - https://github.com/python/mypy/blob/master/mypy/version.py\n# - https://github.com/python/mypy/blob/master/mypy/git.py\nimport os\nimport subprocess\n\n\n_BASE_VERSION = \"0.81.0\"\n\n\ndef _generate_version(base_version):\n \"\"\"Generate a version with information about the git repository\"\"\"\n pkg_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))\n\n if not _is_git_repo(pkg_dir) or not _have_git():\n return base_version\n\n if _is_release(pkg_dir, base_version) and not _is_dirty(pkg_dir):\n return base_version\n\n return \"{base_version}+{short_sha}{dirty}\".format(\n base_version=base_version,\n short_sha=_git_revision(pkg_dir).decode(\"utf-8\")[0:6],\n dirty=\".mod\" if _is_dirty(pkg_dir) else \"\",\n )\n\n\ndef _is_git_repo(dir_path):\n \"\"\"Is the given directory version-controlled with git?\"\"\"\n return os.path.exists(os.path.join(dir_path, 
\".git\"))\n\n\ndef _have_git():\n \"\"\"Can we run the git executable?\"\"\"\n try:\n subprocess.check_output([\"git\", \"--help\"])\n return True\n except subprocess.CalledProcessError:\n return False\n except OSError:\n return False\n\n\ndef _is_release(dir_path, base_version):\n try:\n output = subprocess.check_output(\n [\"git\", \"describe\", \"--tags\", \"--exact-match\"],\n cwd=dir_path,\n stderr=subprocess.STDOUT,\n ).decode(\"utf-8\")\n tag = output.strip()\n return tag == base_version\n except subprocess.CalledProcessError:\n return False\n\n\ndef _git_revision(dir_path):\n \"\"\"Get the SHA-1 of the HEAD of a git repository.\"\"\"\n return subprocess.check_output(\n [\"git\", \"rev-parse\", \"HEAD\"], cwd=dir_path\n ).strip()\n\n\ndef _is_dirty(dir_path):\n \"\"\"Check whether a git repository has uncommitted changes.\"\"\"\n try:\n subprocess.check_call([\"git\", \"diff\", \"--quiet\"], cwd=dir_path)\n return False\n except subprocess.CalledProcessError:\n return True\n\n\n__version__ = _generate_version(_BASE_VERSION)\n", "path": "dvc/version.py"}]} | 944 | 100 |
gh_patches_debug_13402 | rasdani/github-patches | git_diff | ytdl-org__youtube-dl-10971 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
openload.co extractor not working
youtube-dl --get-url --verbose https://openload.co/embed/kUEfGclsU9o/
[debug] System config: []
[debug] User config: []
[debug] Command-line args: [u'--get-url', u'--verbose', u'https://openload.co/embed/kUEfGclsU9o/']
[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8
[debug] youtube-dl version 2016.08.22
[debug] Python version 2.6.6 - Linux-2.6.32-642.1.1.el6.x86_64-x86_64-with-centos-6.8-Final
[debug] exe versions: ffmpeg 0.6.5, ffprobe 0.6.5
[debug] Proxy map: {}
ERROR: Unable to extract link image; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
Traceback (most recent call last):
File "/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py", line 691, in extract_info
ie_result = ie.extract(url)
File "/usr/local/bin/youtube-dl/youtube_dl/extractor/common.py", line 347, in extract
return self._real_extract(url)
File "/usr/local/bin/youtube-dl/youtube_dl/extractor/openload.py", line 62, in _real_extract
r'<img[^>]+id="linkimg"[^>]+src="([^"]+)"', webpage, 'link image')
File "/usr/local/bin/youtube-dl/youtube_dl/extractor/common.py", line 650, in _search_regex
raise RegexNotFoundError('Unable to extract %s' % _name)
RegexNotFoundError: Unable to extract link image; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `youtube_dl/extractor/openload.py`
Content:
```
1 # coding: utf-8
2 from __future__ import unicode_literals, division
3
4 from .common import InfoExtractor
5 from ..compat import (
6 compat_chr,
7 compat_ord,
8 )
9 from ..utils import (
10 determine_ext,
11 ExtractorError,
12 )
13
14
15 class OpenloadIE(InfoExtractor):
16 _VALID_URL = r'https?://openload\.(?:co|io)/(?:f|embed)/(?P<id>[a-zA-Z0-9-_]+)'
17
18 _TESTS = [{
19 'url': 'https://openload.co/f/kUEfGclsU9o',
20 'md5': 'bf1c059b004ebc7a256f89408e65c36e',
21 'info_dict': {
22 'id': 'kUEfGclsU9o',
23 'ext': 'mp4',
24 'title': 'skyrim_no-audio_1080.mp4',
25 'thumbnail': 're:^https?://.*\.jpg$',
26 },
27 }, {
28 'url': 'https://openload.co/embed/rjC09fkPLYs',
29 'info_dict': {
30 'id': 'rjC09fkPLYs',
31 'ext': 'mp4',
32 'title': 'movie.mp4',
33 'thumbnail': 're:^https?://.*\.jpg$',
34 'subtitles': {
35 'en': [{
36 'ext': 'vtt',
37 }],
38 },
39 },
40 'params': {
41 'skip_download': True, # test subtitles only
42 },
43 }, {
44 'url': 'https://openload.co/embed/kUEfGclsU9o/skyrim_no-audio_1080.mp4',
45 'only_matching': True,
46 }, {
47 'url': 'https://openload.io/f/ZAn6oz-VZGE/',
48 'only_matching': True,
49 }, {
50 'url': 'https://openload.co/f/_-ztPaZtMhM/',
51 'only_matching': True,
52 }, {
53 # unavailable via https://openload.co/f/Sxz5sADo82g/, different layout
54 # for title and ext
55 'url': 'https://openload.co/embed/Sxz5sADo82g/',
56 'only_matching': True,
57 }]
58
59 def _real_extract(self, url):
60 video_id = self._match_id(url)
61 webpage = self._download_webpage('https://openload.co/embed/%s/' % video_id, video_id)
62
63 if 'File not found' in webpage or 'deleted by the owner' in webpage:
64 raise ExtractorError('File not found', expected=True)
65
66 # The following decryption algorithm is written by @yokrysty and
67 # declared to be freely used in youtube-dl
68 # See https://github.com/rg3/youtube-dl/issues/10408
69 enc_data = self._html_search_regex(
70 r'<span[^>]*>([^<]+)</span>\s*<span[^>]*>[^<]+</span>\s*<span[^>]+id="streamurl"',
71 webpage, 'encrypted data')
72
73 video_url_chars = []
74
75 for idx, c in enumerate(enc_data):
76 j = compat_ord(c)
77 if j >= 33 and j <= 126:
78 j = ((j + 14) % 94) + 33
79 if idx == len(enc_data) - 1:
80 j += 2
81 video_url_chars += compat_chr(j)
82
83 video_url = 'https://openload.co/stream/%s?mime=true' % ''.join(video_url_chars)
84
85 title = self._og_search_title(webpage, default=None) or self._search_regex(
86 r'<span[^>]+class=["\']title["\'][^>]*>([^<]+)', webpage,
87 'title', default=None) or self._html_search_meta(
88 'description', webpage, 'title', fatal=True)
89
90 entries = self._parse_html5_media_entries(url, webpage, video_id)
91 subtitles = entries[0]['subtitles'] if entries else None
92
93 info_dict = {
94 'id': video_id,
95 'title': title,
96 'thumbnail': self._og_search_thumbnail(webpage, default=None),
97 'url': video_url,
98 # Seems all videos have extensions in their titles
99 'ext': determine_ext(title),
100 'subtitles': subtitles,
101 }
102
103 return info_dict
104
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/youtube_dl/extractor/openload.py b/youtube_dl/extractor/openload.py
--- a/youtube_dl/extractor/openload.py
+++ b/youtube_dl/extractor/openload.py
@@ -70,10 +70,15 @@
r'<span[^>]*>([^<]+)</span>\s*<span[^>]*>[^<]+</span>\s*<span[^>]+id="streamurl"',
webpage, 'encrypted data')
+ magic = compat_ord(enc_data[-1])
video_url_chars = []
for idx, c in enumerate(enc_data):
j = compat_ord(c)
+ if j == magic:
+ j -= 1
+ elif j == magic - 1:
+ j += 1
if j >= 33 and j <= 126:
j = ((j + 14) % 94) + 33
if idx == len(enc_data) - 1:
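For readability, here is the patched decode loop restated as a self-contained Python 3 sketch; `compat_ord`/`compat_chr` reduce to the built-in `ord`/`chr` for `str` input, and the function name is ours, not youtube-dl's:
```python
def decode_video_url_chars(enc_data):
    magic = ord(enc_data[-1])
    out = []
    for idx, c in enumerate(enc_data):
        j = ord(c)
        if j == magic:          # the two new cases swap the magic byte
            j -= 1              # with its predecessor before rotating
        elif j == magic - 1:
            j += 1
        if 33 <= j <= 126:
            j = ((j + 14) % 94) + 33  # rotate within printable ASCII
        if idx == len(enc_data) - 1:
            j += 2
        out.append(chr(j))
    return "".join(out)
```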
| {"golden_diff": "diff --git a/youtube_dl/extractor/openload.py b/youtube_dl/extractor/openload.py\n--- a/youtube_dl/extractor/openload.py\n+++ b/youtube_dl/extractor/openload.py\n@@ -70,10 +70,15 @@\n r'<span[^>]*>([^<]+)</span>\\s*<span[^>]*>[^<]+</span>\\s*<span[^>]+id=\"streamurl\"',\n webpage, 'encrypted data')\n \n+ magic = compat_ord(enc_data[-1])\n video_url_chars = []\n \n for idx, c in enumerate(enc_data):\n j = compat_ord(c)\n+ if j == magic:\n+ j -= 1\n+ elif j == magic - 1:\n+ j += 1\n if j >= 33 and j <= 126:\n j = ((j + 14) % 94) + 33\n if idx == len(enc_data) - 1:\n", "issue": "openload.co extractor not working\n youtube-dl --get-url --verbose https://openload.co/embed/kUEfGclsU9o/\n[debug] System config: []\n[debug] User config: []\n[debug] Command-line args: [u'--get-url', u'--verbose', u'https://openload.co/embed/kUEfGclsU9o/']\n[debug] Encodings: locale UTF-8, fs UTF-8, out UTF-8, pref UTF-8\n[debug] youtube-dl version 2016.08.22\n[debug] Python version 2.6.6 - Linux-2.6.32-642.1.1.el6.x86_64-x86_64-with-centos-6.8-Final\n[debug] exe versions: ffmpeg 0.6.5, ffprobe 0.6.5\n[debug] Proxy map: {}\nERROR: Unable to extract link image; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. Be sure to call youtube-dl with the --verbose flag and include its complete output.\nTraceback (most recent call last):\n File \"/usr/local/bin/youtube-dl/youtube_dl/YoutubeDL.py\", line 691, in extract_info\n ie_result = ie.extract(url)\n File \"/usr/local/bin/youtube-dl/youtube_dl/extractor/common.py\", line 347, in extract\n return self._real_extract(url)\n File \"/usr/local/bin/youtube-dl/youtube_dl/extractor/openload.py\", line 62, in _real_extract\n r'<img[^>]+id=\"linkimg\"[^>]+src=\"([^\"]+)\"', webpage, 'link image')\n File \"/usr/local/bin/youtube-dl/youtube_dl/extractor/common.py\", line 650, in _search_regex\n raise RegexNotFoundError('Unable to extract %s' % _name)\nRegexNotFoundError: Unable to extract link image; please report this issue on https://yt-dl.org/bug . Make sure you are using the latest version; type youtube-dl -U to update. 
Be sure to call youtube-dl with the --verbose flag and include its complete output.\n\n", "before_files": [{"content": "# coding: utf-8\nfrom __future__ import unicode_literals, division\n\nfrom .common import InfoExtractor\nfrom ..compat import (\n compat_chr,\n compat_ord,\n)\nfrom ..utils import (\n determine_ext,\n ExtractorError,\n)\n\n\nclass OpenloadIE(InfoExtractor):\n _VALID_URL = r'https?://openload\\.(?:co|io)/(?:f|embed)/(?P<id>[a-zA-Z0-9-_]+)'\n\n _TESTS = [{\n 'url': 'https://openload.co/f/kUEfGclsU9o',\n 'md5': 'bf1c059b004ebc7a256f89408e65c36e',\n 'info_dict': {\n 'id': 'kUEfGclsU9o',\n 'ext': 'mp4',\n 'title': 'skyrim_no-audio_1080.mp4',\n 'thumbnail': 're:^https?://.*\\.jpg$',\n },\n }, {\n 'url': 'https://openload.co/embed/rjC09fkPLYs',\n 'info_dict': {\n 'id': 'rjC09fkPLYs',\n 'ext': 'mp4',\n 'title': 'movie.mp4',\n 'thumbnail': 're:^https?://.*\\.jpg$',\n 'subtitles': {\n 'en': [{\n 'ext': 'vtt',\n }],\n },\n },\n 'params': {\n 'skip_download': True, # test subtitles only\n },\n }, {\n 'url': 'https://openload.co/embed/kUEfGclsU9o/skyrim_no-audio_1080.mp4',\n 'only_matching': True,\n }, {\n 'url': 'https://openload.io/f/ZAn6oz-VZGE/',\n 'only_matching': True,\n }, {\n 'url': 'https://openload.co/f/_-ztPaZtMhM/',\n 'only_matching': True,\n }, {\n # unavailable via https://openload.co/f/Sxz5sADo82g/, different layout\n # for title and ext\n 'url': 'https://openload.co/embed/Sxz5sADo82g/',\n 'only_matching': True,\n }]\n\n def _real_extract(self, url):\n video_id = self._match_id(url)\n webpage = self._download_webpage('https://openload.co/embed/%s/' % video_id, video_id)\n\n if 'File not found' in webpage or 'deleted by the owner' in webpage:\n raise ExtractorError('File not found', expected=True)\n\n # The following decryption algorithm is written by @yokrysty and\n # declared to be freely used in youtube-dl\n # See https://github.com/rg3/youtube-dl/issues/10408\n enc_data = self._html_search_regex(\n r'<span[^>]*>([^<]+)</span>\\s*<span[^>]*>[^<]+</span>\\s*<span[^>]+id=\"streamurl\"',\n webpage, 'encrypted data')\n\n video_url_chars = []\n\n for idx, c in enumerate(enc_data):\n j = compat_ord(c)\n if j >= 33 and j <= 126:\n j = ((j + 14) % 94) + 33\n if idx == len(enc_data) - 1:\n j += 2\n video_url_chars += compat_chr(j)\n\n video_url = 'https://openload.co/stream/%s?mime=true' % ''.join(video_url_chars)\n\n title = self._og_search_title(webpage, default=None) or self._search_regex(\n r'<span[^>]+class=[\"\\']title[\"\\'][^>]*>([^<]+)', webpage,\n 'title', default=None) or self._html_search_meta(\n 'description', webpage, 'title', fatal=True)\n\n entries = self._parse_html5_media_entries(url, webpage, video_id)\n subtitles = entries[0]['subtitles'] if entries else None\n\n info_dict = {\n 'id': video_id,\n 'title': title,\n 'thumbnail': self._og_search_thumbnail(webpage, default=None),\n 'url': video_url,\n # Seems all videos have extensions in their titles\n 'ext': determine_ext(title),\n 'subtitles': subtitles,\n }\n\n return info_dict\n", "path": "youtube_dl/extractor/openload.py"}], "after_files": [{"content": "# coding: utf-8\nfrom __future__ import unicode_literals, division\n\nfrom .common import InfoExtractor\nfrom ..compat import (\n compat_chr,\n compat_ord,\n)\nfrom ..utils import (\n determine_ext,\n ExtractorError,\n)\n\n\nclass OpenloadIE(InfoExtractor):\n _VALID_URL = r'https?://openload\\.(?:co|io)/(?:f|embed)/(?P<id>[a-zA-Z0-9-_]+)'\n\n _TESTS = [{\n 'url': 'https://openload.co/f/kUEfGclsU9o',\n 'md5': 'bf1c059b004ebc7a256f89408e65c36e',\n 
'info_dict': {\n 'id': 'kUEfGclsU9o',\n 'ext': 'mp4',\n 'title': 'skyrim_no-audio_1080.mp4',\n 'thumbnail': 're:^https?://.*\\.jpg$',\n },\n }, {\n 'url': 'https://openload.co/embed/rjC09fkPLYs',\n 'info_dict': {\n 'id': 'rjC09fkPLYs',\n 'ext': 'mp4',\n 'title': 'movie.mp4',\n 'thumbnail': 're:^https?://.*\\.jpg$',\n 'subtitles': {\n 'en': [{\n 'ext': 'vtt',\n }],\n },\n },\n 'params': {\n 'skip_download': True, # test subtitles only\n },\n }, {\n 'url': 'https://openload.co/embed/kUEfGclsU9o/skyrim_no-audio_1080.mp4',\n 'only_matching': True,\n }, {\n 'url': 'https://openload.io/f/ZAn6oz-VZGE/',\n 'only_matching': True,\n }, {\n 'url': 'https://openload.co/f/_-ztPaZtMhM/',\n 'only_matching': True,\n }, {\n # unavailable via https://openload.co/f/Sxz5sADo82g/, different layout\n # for title and ext\n 'url': 'https://openload.co/embed/Sxz5sADo82g/',\n 'only_matching': True,\n }]\n\n def _real_extract(self, url):\n video_id = self._match_id(url)\n webpage = self._download_webpage('https://openload.co/embed/%s/' % video_id, video_id)\n\n if 'File not found' in webpage or 'deleted by the owner' in webpage:\n raise ExtractorError('File not found', expected=True)\n\n # The following decryption algorithm is written by @yokrysty and\n # declared to be freely used in youtube-dl\n # See https://github.com/rg3/youtube-dl/issues/10408\n enc_data = self._html_search_regex(\n r'<span[^>]*>([^<]+)</span>\\s*<span[^>]*>[^<]+</span>\\s*<span[^>]+id=\"streamurl\"',\n webpage, 'encrypted data')\n\n magic = compat_ord(enc_data[-1])\n video_url_chars = []\n\n for idx, c in enumerate(enc_data):\n j = compat_ord(c)\n if j == magic:\n j -= 1\n elif j == magic - 1:\n j += 1\n if j >= 33 and j <= 126:\n j = ((j + 14) % 94) + 33\n if idx == len(enc_data) - 1:\n j += 2\n video_url_chars += compat_chr(j)\n\n video_url = 'https://openload.co/stream/%s?mime=true' % ''.join(video_url_chars)\n\n title = self._og_search_title(webpage, default=None) or self._search_regex(\n r'<span[^>]+class=[\"\\']title[\"\\'][^>]*>([^<]+)', webpage,\n 'title', default=None) or self._html_search_meta(\n 'description', webpage, 'title', fatal=True)\n\n entries = self._parse_html5_media_entries(url, webpage, video_id)\n subtitles = entries[0]['subtitles'] if entries else None\n\n info_dict = {\n 'id': video_id,\n 'title': title,\n 'thumbnail': self._og_search_thumbnail(webpage, default=None),\n 'url': video_url,\n # Seems all videos have extensions in their titles\n 'ext': determine_ext(title),\n 'subtitles': subtitles,\n }\n\n return info_dict\n", "path": "youtube_dl/extractor/openload.py"}]} | 1,971 | 223 |
gh_patches_debug_31725 | rasdani/github-patches | git_diff | pyca__cryptography-3880 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
RFC 5649 support
RFC 3394 (AES Key Wrap) was added a while back. I'd like to request support for RFC 5649 (AES Key Wrap with Padding), since it builds off of RFC 3394. It looks like OpenSSL handled this back in 2015:
https://rt.openssl.org/Ticket/Display.html?id=3675&user=guest&pass=guest
Is this feasible for cryptography in the not-too-distant future?
Thanks,
Peter
--- END ISSUE ---
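For context on the request: RFC 5649 reuses the RFC 3394 wrapping core but replaces the fixed IV `A6A6A6A6A6A6A6A6` with an alternative IV that encodes the plaintext length, which is what lets it handle keys that are not a multiple of 8 octets. Its layout is easy to show directly (this matches the construction used in the eventual patch below):
```python
import struct

mli = 20  # message length in octets, e.g. a 160-bit key
aiv = b"\xA6\x59\x59\xA6" + struct.pack(">i", mli)
print(aiv.hex())  # a65959a600000014
```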
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/cryptography/hazmat/primitives/keywrap.py`
Content:
```
1 # This file is dual licensed under the terms of the Apache License, Version
2 # 2.0, and the BSD License. See the LICENSE file in the root of this repository
3 # for complete details.
4
5 from __future__ import absolute_import, division, print_function
6
7 import struct
8
9 from cryptography.hazmat.primitives.ciphers import Cipher
10 from cryptography.hazmat.primitives.ciphers.algorithms import AES
11 from cryptography.hazmat.primitives.ciphers.modes import ECB
12 from cryptography.hazmat.primitives.constant_time import bytes_eq
13
14
15 def _wrap_core(wrapping_key, a, r, backend):
16 # RFC 3394 Key Wrap - 2.2.1 (index method)
17 encryptor = Cipher(AES(wrapping_key), ECB(), backend).encryptor()
18 n = len(r)
19 for j in range(6):
20 for i in range(n):
21 # every encryption operation is a discrete 16 byte chunk (because
22 # AES has a 128-bit block size) and since we're using ECB it is
23 # safe to reuse the encryptor for the entire operation
24 b = encryptor.update(a + r[i])
25 # pack/unpack are safe as these are always 64-bit chunks
26 a = struct.pack(
27 ">Q", struct.unpack(">Q", b[:8])[0] ^ ((n * j) + i + 1)
28 )
29 r[i] = b[-8:]
30
31 assert encryptor.finalize() == b""
32
33 return a + b"".join(r)
34
35
36 def aes_key_wrap(wrapping_key, key_to_wrap, backend):
37 if len(wrapping_key) not in [16, 24, 32]:
38 raise ValueError("The wrapping key must be a valid AES key length")
39
40 if len(key_to_wrap) < 16:
41 raise ValueError("The key to wrap must be at least 16 bytes")
42
43 if len(key_to_wrap) % 8 != 0:
44 raise ValueError("The key to wrap must be a multiple of 8 bytes")
45
46 a = b"\xa6\xa6\xa6\xa6\xa6\xa6\xa6\xa6"
47 r = [key_to_wrap[i:i + 8] for i in range(0, len(key_to_wrap), 8)]
48 return _wrap_core(wrapping_key, a, r, backend)
49
50
51 def _unwrap_core(wrapping_key, a, r, backend):
52 # Implement RFC 3394 Key Unwrap - 2.2.2 (index method)
53 decryptor = Cipher(AES(wrapping_key), ECB(), backend).decryptor()
54 n = len(r)
55 for j in reversed(range(6)):
56 for i in reversed(range(n)):
57 # pack/unpack are safe as these are always 64-bit chunks
58 atr = struct.pack(
59 ">Q", struct.unpack(">Q", a)[0] ^ ((n * j) + i + 1)
60 ) + r[i]
61 # every decryption operation is a discrete 16 byte chunk so
62 # it is safe to reuse the decryptor for the entire operation
63 b = decryptor.update(atr)
64 a = b[:8]
65 r[i] = b[-8:]
66
67 assert decryptor.finalize() == b""
68 return a, r
69
70
71 def aes_key_unwrap(wrapping_key, wrapped_key, backend):
72 if len(wrapped_key) < 24:
73 raise ValueError("Must be at least 24 bytes")
74
75 if len(wrapped_key) % 8 != 0:
76 raise ValueError("The wrapped key must be a multiple of 8 bytes")
77
78 if len(wrapping_key) not in [16, 24, 32]:
79 raise ValueError("The wrapping key must be a valid AES key length")
80
81 aiv = b"\xa6\xa6\xa6\xa6\xa6\xa6\xa6\xa6"
82 r = [wrapped_key[i:i + 8] for i in range(0, len(wrapped_key), 8)]
83 a = r.pop(0)
84 a, r = _unwrap_core(wrapping_key, a, r, backend)
85 if not bytes_eq(a, aiv):
86 raise InvalidUnwrap()
87
88 return b"".join(r)
89
90
91 class InvalidUnwrap(Exception):
92 pass
93
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/cryptography/hazmat/primitives/keywrap.py b/src/cryptography/hazmat/primitives/keywrap.py
--- a/src/cryptography/hazmat/primitives/keywrap.py
+++ b/src/cryptography/hazmat/primitives/keywrap.py
@@ -68,6 +68,63 @@
return a, r
+def aes_key_wrap_with_padding(wrapping_key, key_to_wrap, backend):
+ if len(wrapping_key) not in [16, 24, 32]:
+ raise ValueError("The wrapping key must be a valid AES key length")
+
+ aiv = b"\xA6\x59\x59\xA6" + struct.pack(">i", len(key_to_wrap))
+ # pad the key to wrap if necessary
+ pad = (8 - (len(key_to_wrap) % 8)) % 8
+ key_to_wrap = key_to_wrap + b"\x00" * pad
+ if len(key_to_wrap) == 8:
+ # RFC 5649 - 4.1 - exactly 8 octets after padding
+ encryptor = Cipher(AES(wrapping_key), ECB(), backend).encryptor()
+ b = encryptor.update(aiv + key_to_wrap)
+ assert encryptor.finalize() == b""
+ return b
+ else:
+ r = [key_to_wrap[i:i + 8] for i in range(0, len(key_to_wrap), 8)]
+ return _wrap_core(wrapping_key, aiv, r, backend)
+
+
+def aes_key_unwrap_with_padding(wrapping_key, wrapped_key, backend):
+ if len(wrapped_key) < 16:
+ raise ValueError("Must be at least 16 bytes")
+
+ if len(wrapping_key) not in [16, 24, 32]:
+ raise ValueError("The wrapping key must be a valid AES key length")
+
+ if len(wrapped_key) == 16:
+ # RFC 5649 - 4.2 - exactly two 64-bit blocks
+ decryptor = Cipher(AES(wrapping_key), ECB(), backend).decryptor()
+ b = decryptor.update(wrapped_key)
+ assert decryptor.finalize() == b""
+ a = b[:8]
+ data = b[8:]
+ n = 1
+ else:
+ r = [wrapped_key[i:i + 8] for i in range(0, len(wrapped_key), 8)]
+ encrypted_aiv = r.pop(0)
+ n = len(r)
+ a, r = _unwrap_core(wrapping_key, encrypted_aiv, r, backend)
+ data = b"".join(r)
+
+ # 1) Check that MSB(32,A) = A65959A6.
+ # 2) Check that 8*(n-1) < LSB(32,A) <= 8*n. If so, let
+ # MLI = LSB(32,A).
+ # 3) Let b = (8*n)-MLI, and then check that the rightmost b octets of
+ # the output data are zero.
+ (mli,) = struct.unpack(">I", a[4:])
+ b = (8 * n) - mli
+ if (
+ not bytes_eq(a[:4], b"\xa6\x59\x59\xa6") or not
+ 8 * (n - 1) < mli <= 8 * n or not bytes_eq(data[-b:], b"\x00" * b)
+ ):
+ raise InvalidUnwrap()
+
+ return data[:-b]
+
+
def aes_key_unwrap(wrapping_key, wrapped_key, backend):
if len(wrapped_key) < 24:
raise ValueError("Must be at least 24 bytes")
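A quick round-trip check of the new API, assuming the patched module above and a pyca/cryptography backend object; the 20-byte key is chosen deliberately so the RFC 5649 padding path is exercised:
```python
import os

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.keywrap import (
    aes_key_unwrap_with_padding,
    aes_key_wrap_with_padding,
)

kek = os.urandom(16)  # 128-bit wrapping key
key = os.urandom(20)  # not a multiple of 8, so padding is required

wrapped = aes_key_wrap_with_padding(kek, key, default_backend())
assert len(wrapped) == 32  # 24 padded octets plus the 8-octet integrity block
assert aes_key_unwrap_with_padding(kek, wrapped, default_backend()) == key
```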
| {"golden_diff": "diff --git a/src/cryptography/hazmat/primitives/keywrap.py b/src/cryptography/hazmat/primitives/keywrap.py\n--- a/src/cryptography/hazmat/primitives/keywrap.py\n+++ b/src/cryptography/hazmat/primitives/keywrap.py\n@@ -68,6 +68,63 @@\n return a, r\n \n \n+def aes_key_wrap_with_padding(wrapping_key, key_to_wrap, backend):\n+ if len(wrapping_key) not in [16, 24, 32]:\n+ raise ValueError(\"The wrapping key must be a valid AES key length\")\n+\n+ aiv = b\"\\xA6\\x59\\x59\\xA6\" + struct.pack(\">i\", len(key_to_wrap))\n+ # pad the key to wrap if necessary\n+ pad = (8 - (len(key_to_wrap) % 8)) % 8\n+ key_to_wrap = key_to_wrap + b\"\\x00\" * pad\n+ if len(key_to_wrap) == 8:\n+ # RFC 5649 - 4.1 - exactly 8 octets after padding\n+ encryptor = Cipher(AES(wrapping_key), ECB(), backend).encryptor()\n+ b = encryptor.update(aiv + key_to_wrap)\n+ assert encryptor.finalize() == b\"\"\n+ return b\n+ else:\n+ r = [key_to_wrap[i:i + 8] for i in range(0, len(key_to_wrap), 8)]\n+ return _wrap_core(wrapping_key, aiv, r, backend)\n+\n+\n+def aes_key_unwrap_with_padding(wrapping_key, wrapped_key, backend):\n+ if len(wrapped_key) < 16:\n+ raise ValueError(\"Must be at least 16 bytes\")\n+\n+ if len(wrapping_key) not in [16, 24, 32]:\n+ raise ValueError(\"The wrapping key must be a valid AES key length\")\n+\n+ if len(wrapped_key) == 16:\n+ # RFC 5649 - 4.2 - exactly two 64-bit blocks\n+ decryptor = Cipher(AES(wrapping_key), ECB(), backend).decryptor()\n+ b = decryptor.update(wrapped_key)\n+ assert decryptor.finalize() == b\"\"\n+ a = b[:8]\n+ data = b[8:]\n+ n = 1\n+ else:\n+ r = [wrapped_key[i:i + 8] for i in range(0, len(wrapped_key), 8)]\n+ encrypted_aiv = r.pop(0)\n+ n = len(r)\n+ a, r = _unwrap_core(wrapping_key, encrypted_aiv, r, backend)\n+ data = b\"\".join(r)\n+\n+ # 1) Check that MSB(32,A) = A65959A6.\n+ # 2) Check that 8*(n-1) < LSB(32,A) <= 8*n. If so, let\n+ # MLI = LSB(32,A).\n+ # 3) Let b = (8*n)-MLI, and then check that the rightmost b octets of\n+ # the output data are zero.\n+ (mli,) = struct.unpack(\">I\", a[4:])\n+ b = (8 * n) - mli\n+ if (\n+ not bytes_eq(a[:4], b\"\\xa6\\x59\\x59\\xa6\") or not\n+ 8 * (n - 1) < mli <= 8 * n or not bytes_eq(data[-b:], b\"\\x00\" * b)\n+ ):\n+ raise InvalidUnwrap()\n+\n+ return data[:-b]\n+\n+\n def aes_key_unwrap(wrapping_key, wrapped_key, backend):\n if len(wrapped_key) < 24:\n raise ValueError(\"Must be at least 24 bytes\")\n", "issue": "RFC 5649 support\nRFC 3394 (AES Key Wrap) was added a while back. I'd like to request support for RFC 5649 (AES Key Wrap with Padding), since it builds off of RFC 3394. It looks like OpenSSL handled this back in 2015:\r\n\r\nhttps://rt.openssl.org/Ticket/Display.html?id=3675&user=guest&pass=guest\r\n\r\nIs this feasible for cryptography in the not-too-distant future?\r\n\r\nThanks,\r\nPeter\n", "before_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport struct\n\nfrom cryptography.hazmat.primitives.ciphers import Cipher\nfrom cryptography.hazmat.primitives.ciphers.algorithms import AES\nfrom cryptography.hazmat.primitives.ciphers.modes import ECB\nfrom cryptography.hazmat.primitives.constant_time import bytes_eq\n\n\ndef _wrap_core(wrapping_key, a, r, backend):\n # RFC 3394 Key Wrap - 2.2.1 (index method)\n encryptor = Cipher(AES(wrapping_key), ECB(), backend).encryptor()\n n = len(r)\n for j in range(6):\n for i in range(n):\n # every encryption operation is a discrete 16 byte chunk (because\n # AES has a 128-bit block size) and since we're using ECB it is\n # safe to reuse the encryptor for the entire operation\n b = encryptor.update(a + r[i])\n # pack/unpack are safe as these are always 64-bit chunks\n a = struct.pack(\n \">Q\", struct.unpack(\">Q\", b[:8])[0] ^ ((n * j) + i + 1)\n )\n r[i] = b[-8:]\n\n assert encryptor.finalize() == b\"\"\n\n return a + b\"\".join(r)\n\n\ndef aes_key_wrap(wrapping_key, key_to_wrap, backend):\n if len(wrapping_key) not in [16, 24, 32]:\n raise ValueError(\"The wrapping key must be a valid AES key length\")\n\n if len(key_to_wrap) < 16:\n raise ValueError(\"The key to wrap must be at least 16 bytes\")\n\n if len(key_to_wrap) % 8 != 0:\n raise ValueError(\"The key to wrap must be a multiple of 8 bytes\")\n\n a = b\"\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\"\n r = [key_to_wrap[i:i + 8] for i in range(0, len(key_to_wrap), 8)]\n return _wrap_core(wrapping_key, a, r, backend)\n\n\ndef _unwrap_core(wrapping_key, a, r, backend):\n # Implement RFC 3394 Key Unwrap - 2.2.2 (index method)\n decryptor = Cipher(AES(wrapping_key), ECB(), backend).decryptor()\n n = len(r)\n for j in reversed(range(6)):\n for i in reversed(range(n)):\n # pack/unpack are safe as these are always 64-bit chunks\n atr = struct.pack(\n \">Q\", struct.unpack(\">Q\", a)[0] ^ ((n * j) + i + 1)\n ) + r[i]\n # every decryption operation is a discrete 16 byte chunk so\n # it is safe to reuse the decryptor for the entire operation\n b = decryptor.update(atr)\n a = b[:8]\n r[i] = b[-8:]\n\n assert decryptor.finalize() == b\"\"\n return a, r\n\n\ndef aes_key_unwrap(wrapping_key, wrapped_key, backend):\n if len(wrapped_key) < 24:\n raise ValueError(\"Must be at least 24 bytes\")\n\n if len(wrapped_key) % 8 != 0:\n raise ValueError(\"The wrapped key must be a multiple of 8 bytes\")\n\n if len(wrapping_key) not in [16, 24, 32]:\n raise ValueError(\"The wrapping key must be a valid AES key length\")\n\n aiv = b\"\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\"\n r = [wrapped_key[i:i + 8] for i in range(0, len(wrapped_key), 8)]\n a = r.pop(0)\n a, r = _unwrap_core(wrapping_key, a, r, backend)\n if not bytes_eq(a, aiv):\n raise InvalidUnwrap()\n\n return b\"\".join(r)\n\n\nclass InvalidUnwrap(Exception):\n pass\n", "path": "src/cryptography/hazmat/primitives/keywrap.py"}], "after_files": [{"content": "# This file is dual licensed under the terms of the Apache License, Version\n# 2.0, and the BSD License. 
See the LICENSE file in the root of this repository\n# for complete details.\n\nfrom __future__ import absolute_import, division, print_function\n\nimport struct\n\nfrom cryptography.hazmat.primitives.ciphers import Cipher\nfrom cryptography.hazmat.primitives.ciphers.algorithms import AES\nfrom cryptography.hazmat.primitives.ciphers.modes import ECB\nfrom cryptography.hazmat.primitives.constant_time import bytes_eq\n\n\ndef _wrap_core(wrapping_key, a, r, backend):\n # RFC 3394 Key Wrap - 2.2.1 (index method)\n encryptor = Cipher(AES(wrapping_key), ECB(), backend).encryptor()\n n = len(r)\n for j in range(6):\n for i in range(n):\n # every encryption operation is a discrete 16 byte chunk (because\n # AES has a 128-bit block size) and since we're using ECB it is\n # safe to reuse the encryptor for the entire operation\n b = encryptor.update(a + r[i])\n # pack/unpack are safe as these are always 64-bit chunks\n a = struct.pack(\n \">Q\", struct.unpack(\">Q\", b[:8])[0] ^ ((n * j) + i + 1)\n )\n r[i] = b[-8:]\n\n assert encryptor.finalize() == b\"\"\n\n return a + b\"\".join(r)\n\n\ndef aes_key_wrap(wrapping_key, key_to_wrap, backend):\n if len(wrapping_key) not in [16, 24, 32]:\n raise ValueError(\"The wrapping key must be a valid AES key length\")\n\n if len(key_to_wrap) < 16:\n raise ValueError(\"The key to wrap must be at least 16 bytes\")\n\n if len(key_to_wrap) % 8 != 0:\n raise ValueError(\"The key to wrap must be a multiple of 8 bytes\")\n\n a = b\"\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\"\n r = [key_to_wrap[i:i + 8] for i in range(0, len(key_to_wrap), 8)]\n return _wrap_core(wrapping_key, a, r, backend)\n\n\ndef _unwrap_core(wrapping_key, a, r, backend):\n # Implement RFC 3394 Key Unwrap - 2.2.2 (index method)\n decryptor = Cipher(AES(wrapping_key), ECB(), backend).decryptor()\n n = len(r)\n for j in reversed(range(6)):\n for i in reversed(range(n)):\n # pack/unpack are safe as these are always 64-bit chunks\n atr = struct.pack(\n \">Q\", struct.unpack(\">Q\", a)[0] ^ ((n * j) + i + 1)\n ) + r[i]\n # every decryption operation is a discrete 16 byte chunk so\n # it is safe to reuse the decryptor for the entire operation\n b = decryptor.update(atr)\n a = b[:8]\n r[i] = b[-8:]\n\n assert decryptor.finalize() == b\"\"\n return a, r\n\n\ndef aes_key_wrap_with_padding(wrapping_key, key_to_wrap, backend):\n if len(wrapping_key) not in [16, 24, 32]:\n raise ValueError(\"The wrapping key must be a valid AES key length\")\n\n aiv = b\"\\xA6\\x59\\x59\\xA6\" + struct.pack(\">i\", len(key_to_wrap))\n # pad the key to wrap if necessary\n pad = (8 - (len(key_to_wrap) % 8)) % 8\n key_to_wrap = key_to_wrap + b\"\\x00\" * pad\n if len(key_to_wrap) == 8:\n # RFC 5649 - 4.1 - exactly 8 octets after padding\n encryptor = Cipher(AES(wrapping_key), ECB(), backend).encryptor()\n b = encryptor.update(aiv + key_to_wrap)\n assert encryptor.finalize() == b\"\"\n return b\n else:\n r = [key_to_wrap[i:i + 8] for i in range(0, len(key_to_wrap), 8)]\n return _wrap_core(wrapping_key, aiv, r, backend)\n\n\ndef aes_key_unwrap_with_padding(wrapping_key, wrapped_key, backend):\n if len(wrapped_key) < 16:\n raise ValueError(\"Must be at least 16 bytes\")\n\n if len(wrapping_key) not in [16, 24, 32]:\n raise ValueError(\"The wrapping key must be a valid AES key length\")\n\n if len(wrapped_key) == 16:\n # RFC 5649 - 4.2 - exactly two 64-bit blocks\n decryptor = Cipher(AES(wrapping_key), ECB(), backend).decryptor()\n b = decryptor.update(wrapped_key)\n assert decryptor.finalize() == b\"\"\n a = b[:8]\n data = b[8:]\n n = 
1\n else:\n r = [wrapped_key[i:i + 8] for i in range(0, len(wrapped_key), 8)]\n encrypted_aiv = r.pop(0)\n n = len(r)\n a, r = _unwrap_core(wrapping_key, encrypted_aiv, r, backend)\n data = b\"\".join(r)\n\n # 1) Check that MSB(32,A) = A65959A6.\n # 2) Check that 8*(n-1) < LSB(32,A) <= 8*n. If so, let\n # MLI = LSB(32,A).\n # 3) Let b = (8*n)-MLI, and then check that the rightmost b octets of\n # the output data are zero.\n (mli,) = struct.unpack(\">I\", a[4:])\n b = (8 * n) - mli\n if (\n not bytes_eq(a[:4], b\"\\xa6\\x59\\x59\\xa6\") or not\n 8 * (n - 1) < mli <= 8 * n or not bytes_eq(data[-b:], b\"\\x00\" * b)\n ):\n raise InvalidUnwrap()\n\n return data[:-b]\n\n\ndef aes_key_unwrap(wrapping_key, wrapped_key, backend):\n if len(wrapped_key) < 24:\n raise ValueError(\"Must be at least 24 bytes\")\n\n if len(wrapped_key) % 8 != 0:\n raise ValueError(\"The wrapped key must be a multiple of 8 bytes\")\n\n if len(wrapping_key) not in [16, 24, 32]:\n raise ValueError(\"The wrapping key must be a valid AES key length\")\n\n aiv = b\"\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\\xa6\"\n r = [wrapped_key[i:i + 8] for i in range(0, len(wrapped_key), 8)]\n a = r.pop(0)\n a, r = _unwrap_core(wrapping_key, a, r, backend)\n if not bytes_eq(a, aiv):\n raise InvalidUnwrap()\n\n return b\"\".join(r)\n\n\nclass InvalidUnwrap(Exception):\n pass\n", "path": "src/cryptography/hazmat/primitives/keywrap.py"}]} | 1,509 | 885 |
gh_patches_debug_31639 | rasdani/github-patches | git_diff | ESMCI__cime-1136 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
ERR test does not always report failures correctly
The ERR test runs four separate jobs; if one of these jobs completes but the next fails to launch, the test reports PASS. To reproduce this problem it's enough to edit the jobid_pattern field in config_batch so that the dependency is incorrect - this causes the first job to exit and the second to fail to launch. But the TestStatus file indicates all PASS.
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `utils/python/CIME/case_submit.py`
Content:
```
1 #!/usr/bin/env python
2
3 """
4 case.submit - Submit a cesm workflow to the queueing system or run it
5 if there is no queueing system. A cesm workflow may include multiple
6 jobs.
7 """
8 import socket
9 from CIME.XML.standard_module_setup import *
10 from CIME.utils import expect, append_status
11 from CIME.preview_namelists import create_namelists
12 from CIME.check_lockedfiles import check_lockedfiles
13 from CIME.check_input_data import check_all_input_data
14 from CIME.case_cmpgen_namelists import case_cmpgen_namelists
15
16 logger = logging.getLogger(__name__)
17
18 def submit(case, job=None, resubmit=False, no_batch=False):
19 caseroot = case.get_value("CASEROOT")
20
21 if job is None:
22 if case.get_value("TEST"):
23 job = "case.test"
24 else:
25 job = "case.run"
26
27 if resubmit:
28 resub = case.get_value("RESUBMIT")
29 logger.info("Submitting job '%s', resubmit=%d" % (job, resub))
30 case.set_value("RESUBMIT",resub-1)
31 if case.get_value("RESUBMIT_SETS_CONTINUE_RUN"):
32 case.set_value("CONTINUE_RUN", True)
33 else:
34 if job in ("case.test","case.run"):
35 check_case(case, caseroot)
36 check_DA_settings(case)
37 if case.get_value("MACH") == "mira":
38 with open(".original_host","w") as fd:
39 fd.write( socket.gethostname())
40
41 # if case.submit is called with the no_batch flag then we assume that this
42 # flag will stay in effect for the duration of the RESUBMITs
43 env_batch = case.get_env("batch")
44 if not resubmit:
45 case.set_value("IS_FIRST_RUN", True)
46 if no_batch:
47 batch_system = "none"
48 else:
49 batch_system = env_batch.get_batch_system_type()
50 case.set_value("BATCH_SYSTEM", batch_system)
51 else:
52 if env_batch.get_batch_system_type() == "none":
53 no_batch = True
54
55 # This is a resubmission, do not reinitialize test values
56 case.set_value("IS_FIRST_RUN", False)
57
58 #Load Modules
59 case.load_env()
60
61 case.set_value("RUN_WITH_SUBMIT",True)
62 case.flush()
63
64 logger.warn("submit_jobs %s"%job)
65 job_ids = case.submit_jobs(no_batch=no_batch, job=job)
66 msg = "Submitted jobs %s"%job_ids
67 append_status(msg, caseroot=caseroot, sfile="CaseStatus")
68
69 def check_case(case, caseroot):
70 check_lockedfiles(caseroot)
71 create_namelists(case) # Must be called before check_all_input_data
72 logger.info("Checking that inputdata is available as part of case submission")
73 check_all_input_data(case)
74 # Now that we have baselines, do baseline operations
75 if case.get_value("TEST"):
76 case_cmpgen_namelists(case)
77
78 expect(case.get_value("BUILD_COMPLETE"), "Build complete is "
79 "not True please rebuild the model by calling case.build")
80 logger.info("Check case OK")
81
82 def check_DA_settings(case):
83 if case.get_value("DATA_ASSIMILATION"):
84 script = case.get_value("DATA_ASSIMILATION_SCRIPT")
85 cycles = case.get_value("DATA_ASSIMILATION_CYCLES")
86 logger.info("Data Assimilation enabled using script %s with %d cycles"%(script,cycles))
87
88
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/utils/python/CIME/case_submit.py b/utils/python/CIME/case_submit.py
--- a/utils/python/CIME/case_submit.py
+++ b/utils/python/CIME/case_submit.py
@@ -12,10 +12,11 @@
from CIME.check_lockedfiles import check_lockedfiles
from CIME.check_input_data import check_all_input_data
from CIME.case_cmpgen_namelists import case_cmpgen_namelists
+from CIME.test_status import *
logger = logging.getLogger(__name__)
-def submit(case, job=None, resubmit=False, no_batch=False):
+def _submit(case, job=None, resubmit=False, no_batch=False):
caseroot = case.get_value("CASEROOT")
if job is None:
@@ -61,11 +62,27 @@
case.set_value("RUN_WITH_SUBMIT",True)
case.flush()
- logger.warn("submit_jobs %s"%job)
+ logger.warn("submit_jobs %s" % job)
job_ids = case.submit_jobs(no_batch=no_batch, job=job)
- msg = "Submitted jobs %s"%job_ids
+ msg = "Submitted jobs %s" % job_ids
append_status(msg, caseroot=caseroot, sfile="CaseStatus")
+def submit(case, job=None, resubmit=False, no_batch=False):
+ try:
+ _submit(case, job=job, resubmit=resubmit, no_batch=no_batch)
+ except:
+ # If something failed in the batch system, make sure to mark
+ # the test as failed if we are running a test.
+ if case.get_value("TEST"):
+ caseroot = case.get_value("CASEROOT")
+ casebaseid = case.get_value("CASEBASEID")
+ with TestStatus(test_dir=caseroot, test_name=casebaseid, lock=True) as ts:
+ ts.set_status(RUN_PHASE, TEST_FAIL_STATUS, comments="batch system failure")
+
+ append_status("Batch submission failed, TestStatus file changed to read-only", caseroot=caseroot, sfile="TestStatus.log")
+
+ raise
+
def check_case(case, caseroot):
check_lockedfiles(caseroot)
create_namelists(case) # Must be called before check_all_input_data
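Stripped of the CIME specifics, the control flow the patch introduces is just "record the failure, then re-raise so the caller still sees it". A self-contained toy version of that pattern (every name here is a stand-in, not CIME API):
```python
class FakeCase:
    def get_value(self, key):
        return key == "TEST"

def _submit(case):
    raise RuntimeError("batch system refused the job dependency")

def submit(case):
    try:
        _submit(case)
    except Exception:
        if case.get_value("TEST"):
            print("RUN: FAIL recorded")  # stand-in for ts.set_status(...)
        raise  # propagate so nothing upstream mistakes this for success

try:
    submit(FakeCase())
except RuntimeError as err:
    print("submit failed as expected:", err)
```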
| {"golden_diff": "diff --git a/utils/python/CIME/case_submit.py b/utils/python/CIME/case_submit.py\n--- a/utils/python/CIME/case_submit.py\n+++ b/utils/python/CIME/case_submit.py\n@@ -12,10 +12,11 @@\n from CIME.check_lockedfiles import check_lockedfiles\n from CIME.check_input_data import check_all_input_data\n from CIME.case_cmpgen_namelists import case_cmpgen_namelists\n+from CIME.test_status import *\n \n logger = logging.getLogger(__name__)\n \n-def submit(case, job=None, resubmit=False, no_batch=False):\n+def _submit(case, job=None, resubmit=False, no_batch=False):\n caseroot = case.get_value(\"CASEROOT\")\n \n if job is None:\n@@ -61,11 +62,27 @@\n case.set_value(\"RUN_WITH_SUBMIT\",True)\n case.flush()\n \n- logger.warn(\"submit_jobs %s\"%job)\n+ logger.warn(\"submit_jobs %s\" % job)\n job_ids = case.submit_jobs(no_batch=no_batch, job=job)\n- msg = \"Submitted jobs %s\"%job_ids\n+ msg = \"Submitted jobs %s\" % job_ids\n append_status(msg, caseroot=caseroot, sfile=\"CaseStatus\")\n \n+def submit(case, job=None, resubmit=False, no_batch=False):\n+ try:\n+ _submit(case, job=job, resubmit=resubmit, no_batch=no_batch)\n+ except:\n+ # If something failed in the batch system, make sure to mark\n+ # the test as failed if we are running a test.\n+ if case.get_value(\"TEST\"):\n+ caseroot = case.get_value(\"CASEROOT\")\n+ casebaseid = case.get_value(\"CASEBASEID\")\n+ with TestStatus(test_dir=caseroot, test_name=casebaseid, lock=True) as ts:\n+ ts.set_status(RUN_PHASE, TEST_FAIL_STATUS, comments=\"batch system failure\")\n+\n+ append_status(\"Batch submission failed, TestStatus file changed to read-only\", caseroot=caseroot, sfile=\"TestStatus.log\")\n+\n+ raise\n+\n def check_case(case, caseroot):\n check_lockedfiles(caseroot)\n create_namelists(case) # Must be called before check_all_input_data\n", "issue": "ERR test does not always report failures correctly\nThe ERR test runs four separate jobs, if one of these jobs completes but the next fails to launch, the test reports PASS. To reproduce this problem its enough to edit the jobid_pattern field in config batch so that the dependency is incorrect - this causes the first job to exit and the second to fail to launch. But the TestStatus file indicates all PASS. \n", "before_files": [{"content": "#!/usr/bin/env python\n\n\"\"\"\ncase.submit - Submit a cesm workflow to the queueing system or run it\nif there is no queueing system. 
A cesm workflow may include multiple\njobs.\n\"\"\"\nimport socket\nfrom CIME.XML.standard_module_setup import *\nfrom CIME.utils import expect, append_status\nfrom CIME.preview_namelists import create_namelists\nfrom CIME.check_lockedfiles import check_lockedfiles\nfrom CIME.check_input_data import check_all_input_data\nfrom CIME.case_cmpgen_namelists import case_cmpgen_namelists\n\nlogger = logging.getLogger(__name__)\n\ndef submit(case, job=None, resubmit=False, no_batch=False):\n caseroot = case.get_value(\"CASEROOT\")\n\n if job is None:\n if case.get_value(\"TEST\"):\n job = \"case.test\"\n else:\n job = \"case.run\"\n\n if resubmit:\n resub = case.get_value(\"RESUBMIT\")\n logger.info(\"Submitting job '%s', resubmit=%d\" % (job, resub))\n case.set_value(\"RESUBMIT\",resub-1)\n if case.get_value(\"RESUBMIT_SETS_CONTINUE_RUN\"):\n case.set_value(\"CONTINUE_RUN\", True)\n else:\n if job in (\"case.test\",\"case.run\"):\n check_case(case, caseroot)\n check_DA_settings(case)\n if case.get_value(\"MACH\") == \"mira\":\n with open(\".original_host\",\"w\") as fd:\n fd.write( socket.gethostname())\n\n # if case.submit is called with the no_batch flag then we assume that this\n # flag will stay in effect for the duration of the RESUBMITs\n env_batch = case.get_env(\"batch\")\n if not resubmit:\n case.set_value(\"IS_FIRST_RUN\", True)\n if no_batch:\n batch_system = \"none\"\n else:\n batch_system = env_batch.get_batch_system_type()\n case.set_value(\"BATCH_SYSTEM\", batch_system)\n else:\n if env_batch.get_batch_system_type() == \"none\":\n no_batch = True\n\n # This is a resubmission, do not reinitialize test values\n case.set_value(\"IS_FIRST_RUN\", False)\n\n #Load Modules\n case.load_env()\n\n case.set_value(\"RUN_WITH_SUBMIT\",True)\n case.flush()\n\n logger.warn(\"submit_jobs %s\"%job)\n job_ids = case.submit_jobs(no_batch=no_batch, job=job)\n msg = \"Submitted jobs %s\"%job_ids\n append_status(msg, caseroot=caseroot, sfile=\"CaseStatus\")\n\ndef check_case(case, caseroot):\n check_lockedfiles(caseroot)\n create_namelists(case) # Must be called before check_all_input_data\n logger.info(\"Checking that inputdata is available as part of case submission\")\n check_all_input_data(case)\n # Now that we have baselines, do baseline operations\n if case.get_value(\"TEST\"):\n case_cmpgen_namelists(case)\n\n expect(case.get_value(\"BUILD_COMPLETE\"), \"Build complete is \"\n \"not True please rebuild the model by calling case.build\")\n logger.info(\"Check case OK\")\n\ndef check_DA_settings(case):\n if case.get_value(\"DATA_ASSIMILATION\"):\n script = case.get_value(\"DATA_ASSIMILATION_SCRIPT\")\n cycles = case.get_value(\"DATA_ASSIMILATION_CYCLES\")\n logger.info(\"Data Assimilation enabled using script %s with %d cycles\"%(script,cycles))\n\n", "path": "utils/python/CIME/case_submit.py"}], "after_files": [{"content": "#!/usr/bin/env python\n\n\"\"\"\ncase.submit - Submit a cesm workflow to the queueing system or run it\nif there is no queueing system. 
A cesm workflow may include multiple\njobs.\n\"\"\"\nimport socket\nfrom CIME.XML.standard_module_setup import *\nfrom CIME.utils import expect, append_status\nfrom CIME.preview_namelists import create_namelists\nfrom CIME.check_lockedfiles import check_lockedfiles\nfrom CIME.check_input_data import check_all_input_data\nfrom CIME.case_cmpgen_namelists import case_cmpgen_namelists\nfrom CIME.test_status import *\n\nlogger = logging.getLogger(__name__)\n\ndef _submit(case, job=None, resubmit=False, no_batch=False):\n caseroot = case.get_value(\"CASEROOT\")\n\n if job is None:\n if case.get_value(\"TEST\"):\n job = \"case.test\"\n else:\n job = \"case.run\"\n\n if resubmit:\n resub = case.get_value(\"RESUBMIT\")\n logger.info(\"Submitting job '%s', resubmit=%d\" % (job, resub))\n case.set_value(\"RESUBMIT\",resub-1)\n if case.get_value(\"RESUBMIT_SETS_CONTINUE_RUN\"):\n case.set_value(\"CONTINUE_RUN\", True)\n else:\n if job in (\"case.test\",\"case.run\"):\n check_case(case, caseroot)\n check_DA_settings(case)\n if case.get_value(\"MACH\") == \"mira\":\n with open(\".original_host\",\"w\") as fd:\n fd.write( socket.gethostname())\n\n # if case.submit is called with the no_batch flag then we assume that this\n # flag will stay in effect for the duration of the RESUBMITs\n env_batch = case.get_env(\"batch\")\n if not resubmit:\n case.set_value(\"IS_FIRST_RUN\", True)\n if no_batch:\n batch_system = \"none\"\n else:\n batch_system = env_batch.get_batch_system_type()\n case.set_value(\"BATCH_SYSTEM\", batch_system)\n else:\n if env_batch.get_batch_system_type() == \"none\":\n no_batch = True\n\n # This is a resubmission, do not reinitialize test values\n case.set_value(\"IS_FIRST_RUN\", False)\n\n #Load Modules\n case.load_env()\n\n case.set_value(\"RUN_WITH_SUBMIT\",True)\n case.flush()\n\n logger.warn(\"submit_jobs %s\" % job)\n job_ids = case.submit_jobs(no_batch=no_batch, job=job)\n msg = \"Submitted jobs %s\" % job_ids\n append_status(msg, caseroot=caseroot, sfile=\"CaseStatus\")\n\ndef submit(case, job=None, resubmit=False, no_batch=False):\n try:\n _submit(case, job=job, resubmit=resubmit, no_batch=no_batch)\n except:\n # If something failed in the batch system, make sure to mark\n # the test as failed if we are running a test.\n if case.get_value(\"TEST\"):\n caseroot = case.get_value(\"CASEROOT\")\n casebaseid = case.get_value(\"CASEBASEID\")\n with TestStatus(test_dir=caseroot, test_name=casebaseid, lock=True) as ts:\n ts.set_status(RUN_PHASE, TEST_FAIL_STATUS, comments=\"batch system failure\")\n\n append_status(\"Batch submission failed, TestStatus file changed to read-only\", caseroot=caseroot, sfile=\"TestStatus.log\")\n\n raise\n\ndef check_case(case, caseroot):\n check_lockedfiles(caseroot)\n create_namelists(case) # Must be called before check_all_input_data\n logger.info(\"Checking that inputdata is available as part of case submission\")\n check_all_input_data(case)\n # Now that we have baselines, do baseline operations\n if case.get_value(\"TEST\"):\n case_cmpgen_namelists(case)\n\n expect(case.get_value(\"BUILD_COMPLETE\"), \"Build complete is \"\n \"not True please rebuild the model by calling case.build\")\n logger.info(\"Check case OK\")\n\ndef check_DA_settings(case):\n if case.get_value(\"DATA_ASSIMILATION\"):\n script = case.get_value(\"DATA_ASSIMILATION_SCRIPT\")\n cycles = case.get_value(\"DATA_ASSIMILATION_CYCLES\")\n logger.info(\"Data Assimilation enabled using script %s with %d cycles\"%(script,cycles))\n\n", "path": "utils/python/CIME/case_submit.py"}]} | 1,287 
| 520 |
gh_patches_debug_21868 | rasdani/github-patches | git_diff | streamlink__streamlink-4885 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
plugins.btv: No playable streams found
### Checklist
- [X] This is a plugin issue and not a different kind of issue
- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)
- [X] [I have checked the list of open and recently closed plugin issues](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22)
- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)
### Streamlink version
Latest stable release
### Description
The plugin is not functional. I am attaching a log.
### Debug log
```text
streamlink --loglevel debug "https://btvplus.bg/live/" best
[cli][debug] OS: Linux-5.15.0-50-generic-x86_64-with-glibc2.29
[cli][debug] Python: 3.8.10
[cli][debug] Streamlink: 5.0.1
[cli][debug] Dependencies:
[cli][debug] isodate: 0.6.0
[cli][debug] lxml: 4.6.4
[cli][debug] pycountry: 19.8.18
[cli][debug] pycryptodome: 3.9.9
[cli][debug] PySocks: 1.7.1
[cli][debug] requests: 2.26.0
[cli][debug] websocket-client: 1.2.1
[cli][debug] Arguments:
[cli][debug] url=https://btvplus.bg/live/
[cli][debug] stream=['best']
[cli][debug] --loglevel=debug
[cli][info] Found matching plugin btv for URL https://btvplus.bg/live/
[utils.l10n][debug] Language code: bg_BG
error: No playable streams found on this URL: https://btvplus.bg/live/
```
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `src/streamlink/plugins/btv.py`
Content:
```
1 """
2 $description A privately owned Bulgarian live TV channel.
3 $url btvplus.bg
4 $type live
5 $region Bulgaria
6 """
7
8 import logging
9 import re
10
11 from streamlink.plugin import Plugin, pluginmatcher
12 from streamlink.plugin.api import validate
13 from streamlink.stream.hls import HLSStream
14
15 log = logging.getLogger(__name__)
16
17
18 @pluginmatcher(re.compile(
19 r"https?://(?:www\.)?btvplus\.bg/live/?"
20 ))
21 class BTV(Plugin):
22 URL_API = "https://btvplus.bg/lbin/v3/btvplus/player_config.php"
23
24 def _get_streams(self):
25 media_id = self.session.http.get(self.url, schema=validate.Schema(
26 re.compile(r"media_id=(\d+)"),
27 validate.any(None, validate.get(1)),
28 ))
29 if media_id is None:
30 return
31
32 stream_url = self.session.http.get(
33 self.URL_API,
34 params={
35 "media_id": media_id,
36 },
37 schema=validate.Schema(
38 validate.any(
39 validate.all(
40 validate.regex(re.compile(r"geo_blocked_stream")),
41 validate.get(0),
42 ),
43 validate.all(
44 validate.parse_json(),
45 {
46 "status": "ok",
47 "config": str,
48 },
49 validate.get("config"),
50 re.compile(r"src: \"(http.*?)\""),
51 validate.none_or_all(
52 validate.get(1),
53 validate.url(),
54 ),
55 ),
56 ),
57 ),
58 )
59 if not stream_url:
60 return
61
62 if stream_url == "geo_blocked_stream":
63 log.error("The content is not available in your region")
64 return
65
66 return HLSStream.parse_variant_playlist(self.session, stream_url)
67
68
69 __plugin__ = BTV
70
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/src/streamlink/plugins/btv.py b/src/streamlink/plugins/btv.py
--- a/src/streamlink/plugins/btv.py
+++ b/src/streamlink/plugins/btv.py
@@ -44,14 +44,11 @@
validate.parse_json(),
{
"status": "ok",
- "config": str,
+ "info": {
+ "file": validate.url(path=validate.endswith(".m3u8")),
+ },
},
- validate.get("config"),
- re.compile(r"src: \"(http.*?)\""),
- validate.none_or_all(
- validate.get(1),
- validate.url(),
- ),
+ validate.get(("info", "file")),
),
),
),
@@ -63,7 +60,7 @@
log.error("The content is not available in your region")
return
- return HLSStream.parse_variant_playlist(self.session, stream_url)
+ return {"live": HLSStream(self.session, stream_url)}
__plugin__ = BTV
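
Editorial note on the patch above: it assumes the player-config endpoint returns the stream location as plain JSON under `info.file` (an `.m3u8` URL) rather than inside a JavaScript `src: "..."` string, and it returns a single `{"live": HLSStream(...)}` instead of parsing a variant playlist. A minimal illustration of the JSON shape the new schema expects (the payload below is hypothetical):

```python
# Hypothetical payload illustrating what validate.get(("info", "file"))
# extracts under the patched schema.
import json

payload = json.loads('{"status": "ok", "info": {"file": "https://example.com/live.m3u8"}}')
assert payload["status"] == "ok"
stream_url = payload["info"]["file"]  # -> "https://example.com/live.m3u8"
```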
| {"golden_diff": "diff --git a/src/streamlink/plugins/btv.py b/src/streamlink/plugins/btv.py\n--- a/src/streamlink/plugins/btv.py\n+++ b/src/streamlink/plugins/btv.py\n@@ -44,14 +44,11 @@\n validate.parse_json(),\n {\n \"status\": \"ok\",\n- \"config\": str,\n+ \"info\": {\n+ \"file\": validate.url(path=validate.endswith(\".m3u8\")),\n+ },\n },\n- validate.get(\"config\"),\n- re.compile(r\"src: \\\"(http.*?)\\\"\"),\n- validate.none_or_all(\n- validate.get(1),\n- validate.url(),\n- ),\n+ validate.get((\"info\", \"file\")),\n ),\n ),\n ),\n@@ -63,7 +60,7 @@\n log.error(\"The content is not available in your region\")\n return\n \n- return HLSStream.parse_variant_playlist(self.session, stream_url)\n+ return {\"live\": HLSStream(self.session, stream_url)}\n \n \n __plugin__ = BTV\n", "issue": "plugins.btv: No playable streams found\n### Checklist\n\n- [X] This is a plugin issue and not a different kind of issue\n- [X] [I have read the contribution guidelines](https://github.com/streamlink/streamlink/blob/master/CONTRIBUTING.md#contributing-to-streamlink)\n- [X] [I have checked the list of open and recently closed plugin issues](https://github.com/streamlink/streamlink/issues?q=is%3Aissue+label%3A%22plugin+issue%22)\n- [X] [I have checked the commit log of the master branch](https://github.com/streamlink/streamlink/commits/master)\n\n### Streamlink version\n\nLatest stable release\n\n### Description\n\nThe plugin is not functional. I am attaching a log.\n\n### Debug log\n\n```text\nstreamlink --loglevel debug \"https://btvplus.bg/live/\" best\r\n[cli][debug] OS: Linux-5.15.0-50-generic-x86_64-with-glibc2.29\r\n[cli][debug] Python: 3.8.10\r\n[cli][debug] Streamlink: 5.0.1\r\n[cli][debug] Dependencies:\r\n[cli][debug] isodate: 0.6.0\r\n[cli][debug] lxml: 4.6.4\r\n[cli][debug] pycountry: 19.8.18\r\n[cli][debug] pycryptodome: 3.9.9\r\n[cli][debug] PySocks: 1.7.1\r\n[cli][debug] requests: 2.26.0\r\n[cli][debug] websocket-client: 1.2.1\r\n[cli][debug] Arguments:\r\n[cli][debug] url=https://btvplus.bg/live/\r\n[cli][debug] stream=['best']\r\n[cli][debug] --loglevel=debug\r\n[cli][info] Found matching plugin btv for URL https://btvplus.bg/live/\r\n[utils.l10n][debug] Language code: bg_BG\r\nerror: No playable streams found on this URL: https://btvplus.bg/live/\n```\n\n", "before_files": [{"content": "\"\"\"\n$description A privately owned Bulgarian live TV channel.\n$url btvplus.bg\n$type live\n$region Bulgaria\n\"\"\"\n\nimport logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream.hls import HLSStream\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"https?://(?:www\\.)?btvplus\\.bg/live/?\"\n))\nclass BTV(Plugin):\n URL_API = \"https://btvplus.bg/lbin/v3/btvplus/player_config.php\"\n\n def _get_streams(self):\n media_id = self.session.http.get(self.url, schema=validate.Schema(\n re.compile(r\"media_id=(\\d+)\"),\n validate.any(None, validate.get(1)),\n ))\n if media_id is None:\n return\n\n stream_url = self.session.http.get(\n self.URL_API,\n params={\n \"media_id\": media_id,\n },\n schema=validate.Schema(\n validate.any(\n validate.all(\n validate.regex(re.compile(r\"geo_blocked_stream\")),\n validate.get(0),\n ),\n validate.all(\n validate.parse_json(),\n {\n \"status\": \"ok\",\n \"config\": str,\n },\n validate.get(\"config\"),\n re.compile(r\"src: \\\"(http.*?)\\\"\"),\n validate.none_or_all(\n validate.get(1),\n validate.url(),\n ),\n ),\n ),\n ),\n )\n if not stream_url:\n 
return\n\n if stream_url == \"geo_blocked_stream\":\n log.error(\"The content is not available in your region\")\n return\n\n return HLSStream.parse_variant_playlist(self.session, stream_url)\n\n\n__plugin__ = BTV\n", "path": "src/streamlink/plugins/btv.py"}], "after_files": [{"content": "\"\"\"\n$description A privately owned Bulgarian live TV channel.\n$url btvplus.bg\n$type live\n$region Bulgaria\n\"\"\"\n\nimport logging\nimport re\n\nfrom streamlink.plugin import Plugin, pluginmatcher\nfrom streamlink.plugin.api import validate\nfrom streamlink.stream.hls import HLSStream\n\nlog = logging.getLogger(__name__)\n\n\n@pluginmatcher(re.compile(\n r\"https?://(?:www\\.)?btvplus\\.bg/live/?\"\n))\nclass BTV(Plugin):\n URL_API = \"https://btvplus.bg/lbin/v3/btvplus/player_config.php\"\n\n def _get_streams(self):\n media_id = self.session.http.get(self.url, schema=validate.Schema(\n re.compile(r\"media_id=(\\d+)\"),\n validate.any(None, validate.get(1)),\n ))\n if media_id is None:\n return\n\n stream_url = self.session.http.get(\n self.URL_API,\n params={\n \"media_id\": media_id,\n },\n schema=validate.Schema(\n validate.any(\n validate.all(\n validate.regex(re.compile(r\"geo_blocked_stream\")),\n validate.get(0),\n ),\n validate.all(\n validate.parse_json(),\n {\n \"status\": \"ok\",\n \"info\": {\n \"file\": validate.url(path=validate.endswith(\".m3u8\")),\n },\n },\n validate.get((\"info\", \"file\")),\n ),\n ),\n ),\n )\n if not stream_url:\n return\n\n if stream_url == \"geo_blocked_stream\":\n log.error(\"The content is not available in your region\")\n return\n\n return {\"live\": HLSStream(self.session, stream_url)}\n\n\n__plugin__ = BTV\n", "path": "src/streamlink/plugins/btv.py"}]} | 1,245 | 226 |
gh_patches_debug_1546 | rasdani/github-patches | git_diff | lightly-ai__lightly-1450 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
VICReg Loss De-Means Twice?
https://github.com/lightly-ai/lightly/blob/66ad1b40ebf3b53512703c774988211ce283211f/lightly/loss/vicreg_loss.py#L128-L129
I think the VICReg loss removes the mean, then calls `.var()` which also de-means (see: https://pytorch.org/docs/stable/generated/torch.var.html).
If I understand correctly, that seems unnecessary?
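
For reference (an editor's addition, not part of the original report): variance is shift-invariant, so `torch.var` already accounts for the mean and returns the same values with or without the explicit subtraction. A quick check:

```python
# Editor's sketch: torch.var() subtracts the mean internally, so the
# explicit de-meaning before it is redundant (though harmless).
import torch

x = torch.randn(8, 4)
centered = x - x.mean(dim=0)
# Var(X) == Var(X - E[X]); both calls return the same per-dimension values.
assert torch.allclose(x.var(dim=0), centered.var(dim=0))
```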
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `lightly/loss/vicreg_loss.py`
Content:
```
1 import torch
2 import torch.distributed as dist
3 import torch.nn.functional as F
4 from torch import Tensor
5
6 from lightly.utils.dist import gather
7
8
9 class VICRegLoss(torch.nn.Module):
10 """Implementation of the VICReg loss [0].
11
12 This implementation is based on the code published by the authors [1].
13
14 - [0] VICReg, 2022, https://arxiv.org/abs/2105.04906
15 - [1] https://github.com/facebookresearch/vicreg/
16
17 Attributes:
18 lambda_param:
19 Scaling coefficient for the invariance term of the loss.
20 mu_param:
21 Scaling coefficient for the variance term of the loss.
22 nu_param:
23 Scaling coefficient for the covariance term of the loss.
24 gather_distributed:
25 If True then the cross-correlation matrices from all gpus are gathered and
26 summed before the loss calculation.
27 eps:
28 Epsilon for numerical stability.
29
30 Examples:
31
32 >>> # initialize loss function
33 >>> loss_fn = VICRegLoss()
34 >>>
35 >>> # generate two random transforms of images
36 >>> t0 = transforms(images)
37 >>> t1 = transforms(images)
38 >>>
39 >>> # feed through model
40 >>> out0, out1 = model(t0, t1)
41 >>>
42 >>> # calculate loss
43 >>> loss = loss_fn(out0, out1)
44 """
45
46 def __init__(
47 self,
48 lambda_param: float = 25.0,
49 mu_param: float = 25.0,
50 nu_param: float = 1.0,
51 gather_distributed: bool = False,
52 eps=0.0001,
53 ):
54 super(VICRegLoss, self).__init__()
55 if gather_distributed and not dist.is_available():
56 raise ValueError(
57 "gather_distributed is True but torch.distributed is not available. "
58 "Please set gather_distributed=False or install a torch version with "
59 "distributed support."
60 )
61
62 self.lambda_param = lambda_param
63 self.mu_param = mu_param
64 self.nu_param = nu_param
65 self.gather_distributed = gather_distributed
66 self.eps = eps
67
68 def forward(self, z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:
69 """Returns VICReg loss.
70
71 Args:
72 z_a:
73 Tensor with shape (batch_size, ..., dim).
74 z_b:
75 Tensor with shape (batch_size, ..., dim).
76 """
77 assert (
78 z_a.shape[0] > 1 and z_b.shape[0] > 1
79 ), f"z_a and z_b must have batch size > 1 but found {z_a.shape[0]} and {z_b.shape[0]}"
80 assert (
81 z_a.shape == z_b.shape
82 ), f"z_a and z_b must have same shape but found {z_a.shape} and {z_b.shape}."
83
84 # invariance term of the loss
85 inv_loss = invariance_loss(x=z_a, y=z_b)
86
87 # gather all batches
88 if self.gather_distributed and dist.is_initialized():
89 world_size = dist.get_world_size()
90 if world_size > 1:
91 z_a = torch.cat(gather(z_a), dim=0)
92 z_b = torch.cat(gather(z_b), dim=0)
93
94 var_loss = 0.5 * (
95 variance_loss(x=z_a, eps=self.eps) + variance_loss(x=z_b, eps=self.eps)
96 )
97 cov_loss = covariance_loss(x=z_a) + covariance_loss(x=z_b)
98
99 loss = (
100 self.lambda_param * inv_loss
101 + self.mu_param * var_loss
102 + self.nu_param * cov_loss
103 )
104 return loss
105
106
107 def invariance_loss(x: Tensor, y: Tensor) -> Tensor:
108 """Returns VICReg invariance loss.
109
110 Args:
111 x:
112 Tensor with shape (batch_size, ..., dim).
113 y:
114 Tensor with shape (batch_size, ..., dim).
115 """
116 return F.mse_loss(x, y)
117
118
119 def variance_loss(x: Tensor, eps: float = 0.0001) -> Tensor:
120 """Returns VICReg variance loss.
121
122 Args:
123 x:
124 Tensor with shape (batch_size, ..., dim).
125 eps:
126 Epsilon for numerical stability.
127 """
128 x = x - x.mean(dim=0)
129 std = torch.sqrt(x.var(dim=0) + eps)
130 loss = torch.mean(F.relu(1.0 - std))
131 return loss
132
133
134 def covariance_loss(x: Tensor) -> Tensor:
135 """Returns VICReg covariance loss.
136
137 Generalized version of the covariance loss with support for tensors with more than
138 two dimensions. Adapted from VICRegL:
139 https://github.com/facebookresearch/VICRegL/blob/803ae4c8cd1649a820f03afb4793763e95317620/main_vicregl.py#L299
140
141 Args:
142 x:
143 Tensor with shape (batch_size, ..., dim).
144 """
145 x = x - x.mean(dim=0)
146 batch_size = x.size(0)
147 dim = x.size(-1)
148 # nondiag_mask has shape (dim, dim) with 1s on all non-diagonal entries.
149 nondiag_mask = ~torch.eye(dim, device=x.device, dtype=torch.bool)
150 # cov has shape (..., dim, dim)
151 cov = torch.einsum("b...c,b...d->...cd", x, x) / (batch_size - 1)
152 loss = cov[..., nondiag_mask].pow(2).sum(-1) / dim
153 return loss.mean()
154
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/lightly/loss/vicreg_loss.py b/lightly/loss/vicreg_loss.py
--- a/lightly/loss/vicreg_loss.py
+++ b/lightly/loss/vicreg_loss.py
@@ -125,7 +125,6 @@
eps:
Epsilon for numerical stability.
"""
- x = x - x.mean(dim=0)
std = torch.sqrt(x.var(dim=0) + eps)
loss = torch.mean(F.relu(1.0 - std))
return loss
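
Editorial note: the patch is behavior-preserving as well as faster. Variance is invariant under shifting by the mean, Var(X - E[X]) = Var(X), so dropping the subtraction cannot change the value of `variance_loss`; it only removes a redundant pass over the tensor.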
| {"golden_diff": "diff --git a/lightly/loss/vicreg_loss.py b/lightly/loss/vicreg_loss.py\n--- a/lightly/loss/vicreg_loss.py\n+++ b/lightly/loss/vicreg_loss.py\n@@ -125,7 +125,6 @@\n eps:\n Epsilon for numerical stability.\n \"\"\"\n- x = x - x.mean(dim=0)\n std = torch.sqrt(x.var(dim=0) + eps)\n loss = torch.mean(F.relu(1.0 - std))\n return loss\n", "issue": "VICReg Loss De-Means Twice?\nhttps://github.com/lightly-ai/lightly/blob/66ad1b40ebf3b53512703c774988211ce283211f/lightly/loss/vicreg_loss.py#L128-L129\r\n\r\nI think the VICReg loss removes the mean, then calls `.var()` which also de-means (see: https://pytorch.org/docs/stable/generated/torch.var.html). \r\n\r\nIf I understand correctly, that seems unnecessary?\n", "before_files": [{"content": "import torch\nimport torch.distributed as dist\nimport torch.nn.functional as F\nfrom torch import Tensor\n\nfrom lightly.utils.dist import gather\n\n\nclass VICRegLoss(torch.nn.Module):\n \"\"\"Implementation of the VICReg loss [0].\n\n This implementation is based on the code published by the authors [1].\n\n - [0] VICReg, 2022, https://arxiv.org/abs/2105.04906\n - [1] https://github.com/facebookresearch/vicreg/\n\n Attributes:\n lambda_param:\n Scaling coefficient for the invariance term of the loss.\n mu_param:\n Scaling coefficient for the variance term of the loss.\n nu_param:\n Scaling coefficient for the covariance term of the loss.\n gather_distributed:\n If True then the cross-correlation matrices from all gpus are gathered and\n summed before the loss calculation.\n eps:\n Epsilon for numerical stability.\n\n Examples:\n\n >>> # initialize loss function\n >>> loss_fn = VICRegLoss()\n >>>\n >>> # generate two random transforms of images\n >>> t0 = transforms(images)\n >>> t1 = transforms(images)\n >>>\n >>> # feed through model\n >>> out0, out1 = model(t0, t1)\n >>>\n >>> # calculate loss\n >>> loss = loss_fn(out0, out1)\n \"\"\"\n\n def __init__(\n self,\n lambda_param: float = 25.0,\n mu_param: float = 25.0,\n nu_param: float = 1.0,\n gather_distributed: bool = False,\n eps=0.0001,\n ):\n super(VICRegLoss, self).__init__()\n if gather_distributed and not dist.is_available():\n raise ValueError(\n \"gather_distributed is True but torch.distributed is not available. 
\"\n \"Please set gather_distributed=False or install a torch version with \"\n \"distributed support.\"\n )\n\n self.lambda_param = lambda_param\n self.mu_param = mu_param\n self.nu_param = nu_param\n self.gather_distributed = gather_distributed\n self.eps = eps\n\n def forward(self, z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:\n \"\"\"Returns VICReg loss.\n\n Args:\n z_a:\n Tensor with shape (batch_size, ..., dim).\n z_b:\n Tensor with shape (batch_size, ..., dim).\n \"\"\"\n assert (\n z_a.shape[0] > 1 and z_b.shape[0] > 1\n ), f\"z_a and z_b must have batch size > 1 but found {z_a.shape[0]} and {z_b.shape[0]}\"\n assert (\n z_a.shape == z_b.shape\n ), f\"z_a and z_b must have same shape but found {z_a.shape} and {z_b.shape}.\"\n\n # invariance term of the loss\n inv_loss = invariance_loss(x=z_a, y=z_b)\n\n # gather all batches\n if self.gather_distributed and dist.is_initialized():\n world_size = dist.get_world_size()\n if world_size > 1:\n z_a = torch.cat(gather(z_a), dim=0)\n z_b = torch.cat(gather(z_b), dim=0)\n\n var_loss = 0.5 * (\n variance_loss(x=z_a, eps=self.eps) + variance_loss(x=z_b, eps=self.eps)\n )\n cov_loss = covariance_loss(x=z_a) + covariance_loss(x=z_b)\n\n loss = (\n self.lambda_param * inv_loss\n + self.mu_param * var_loss\n + self.nu_param * cov_loss\n )\n return loss\n\n\ndef invariance_loss(x: Tensor, y: Tensor) -> Tensor:\n \"\"\"Returns VICReg invariance loss.\n\n Args:\n x:\n Tensor with shape (batch_size, ..., dim).\n y:\n Tensor with shape (batch_size, ..., dim).\n \"\"\"\n return F.mse_loss(x, y)\n\n\ndef variance_loss(x: Tensor, eps: float = 0.0001) -> Tensor:\n \"\"\"Returns VICReg variance loss.\n\n Args:\n x:\n Tensor with shape (batch_size, ..., dim).\n eps:\n Epsilon for numerical stability.\n \"\"\"\n x = x - x.mean(dim=0)\n std = torch.sqrt(x.var(dim=0) + eps)\n loss = torch.mean(F.relu(1.0 - std))\n return loss\n\n\ndef covariance_loss(x: Tensor) -> Tensor:\n \"\"\"Returns VICReg covariance loss.\n\n Generalized version of the covariance loss with support for tensors with more than\n two dimensions. 
Adapted from VICRegL:\n https://github.com/facebookresearch/VICRegL/blob/803ae4c8cd1649a820f03afb4793763e95317620/main_vicregl.py#L299\n\n Args:\n x:\n Tensor with shape (batch_size, ..., dim).\n \"\"\"\n x = x - x.mean(dim=0)\n batch_size = x.size(0)\n dim = x.size(-1)\n # nondiag_mask has shape (dim, dim) with 1s on all non-diagonal entries.\n nondiag_mask = ~torch.eye(dim, device=x.device, dtype=torch.bool)\n # cov has shape (..., dim, dim)\n cov = torch.einsum(\"b...c,b...d->...cd\", x, x) / (batch_size - 1)\n loss = cov[..., nondiag_mask].pow(2).sum(-1) / dim\n return loss.mean()\n", "path": "lightly/loss/vicreg_loss.py"}], "after_files": [{"content": "import torch\nimport torch.distributed as dist\nimport torch.nn.functional as F\nfrom torch import Tensor\n\nfrom lightly.utils.dist import gather\n\n\nclass VICRegLoss(torch.nn.Module):\n \"\"\"Implementation of the VICReg loss [0].\n\n This implementation is based on the code published by the authors [1].\n\n - [0] VICReg, 2022, https://arxiv.org/abs/2105.04906\n - [1] https://github.com/facebookresearch/vicreg/\n\n Attributes:\n lambda_param:\n Scaling coefficient for the invariance term of the loss.\n mu_param:\n Scaling coefficient for the variance term of the loss.\n nu_param:\n Scaling coefficient for the covariance term of the loss.\n gather_distributed:\n If True then the cross-correlation matrices from all gpus are gathered and\n summed before the loss calculation.\n eps:\n Epsilon for numerical stability.\n\n Examples:\n\n >>> # initialize loss function\n >>> loss_fn = VICRegLoss()\n >>>\n >>> # generate two random transforms of images\n >>> t0 = transforms(images)\n >>> t1 = transforms(images)\n >>>\n >>> # feed through model\n >>> out0, out1 = model(t0, t1)\n >>>\n >>> # calculate loss\n >>> loss = loss_fn(out0, out1)\n \"\"\"\n\n def __init__(\n self,\n lambda_param: float = 25.0,\n mu_param: float = 25.0,\n nu_param: float = 1.0,\n gather_distributed: bool = False,\n eps=0.0001,\n ):\n super(VICRegLoss, self).__init__()\n if gather_distributed and not dist.is_available():\n raise ValueError(\n \"gather_distributed is True but torch.distributed is not available. 
\"\n \"Please set gather_distributed=False or install a torch version with \"\n \"distributed support.\"\n )\n\n self.lambda_param = lambda_param\n self.mu_param = mu_param\n self.nu_param = nu_param\n self.gather_distributed = gather_distributed\n self.eps = eps\n\n def forward(self, z_a: torch.Tensor, z_b: torch.Tensor) -> torch.Tensor:\n \"\"\"Returns VICReg loss.\n\n Args:\n z_a:\n Tensor with shape (batch_size, ..., dim).\n z_b:\n Tensor with shape (batch_size, ..., dim).\n \"\"\"\n assert (\n z_a.shape[0] > 1 and z_b.shape[0] > 1\n ), f\"z_a and z_b must have batch size > 1 but found {z_a.shape[0]} and {z_b.shape[0]}\"\n assert (\n z_a.shape == z_b.shape\n ), f\"z_a and z_b must have same shape but found {z_a.shape} and {z_b.shape}.\"\n\n # invariance term of the loss\n inv_loss = invariance_loss(x=z_a, y=z_b)\n\n # gather all batches\n if self.gather_distributed and dist.is_initialized():\n world_size = dist.get_world_size()\n if world_size > 1:\n z_a = torch.cat(gather(z_a), dim=0)\n z_b = torch.cat(gather(z_b), dim=0)\n\n var_loss = 0.5 * (\n variance_loss(x=z_a, eps=self.eps) + variance_loss(x=z_b, eps=self.eps)\n )\n cov_loss = covariance_loss(x=z_a) + covariance_loss(x=z_b)\n\n loss = (\n self.lambda_param * inv_loss\n + self.mu_param * var_loss\n + self.nu_param * cov_loss\n )\n return loss\n\n\ndef invariance_loss(x: Tensor, y: Tensor) -> Tensor:\n \"\"\"Returns VICReg invariance loss.\n\n Args:\n x:\n Tensor with shape (batch_size, ..., dim).\n y:\n Tensor with shape (batch_size, ..., dim).\n \"\"\"\n return F.mse_loss(x, y)\n\n\ndef variance_loss(x: Tensor, eps: float = 0.0001) -> Tensor:\n \"\"\"Returns VICReg variance loss.\n\n Args:\n x:\n Tensor with shape (batch_size, ..., dim).\n eps:\n Epsilon for numerical stability.\n \"\"\"\n std = torch.sqrt(x.var(dim=0) + eps)\n loss = torch.mean(F.relu(1.0 - std))\n return loss\n\n\ndef covariance_loss(x: Tensor) -> Tensor:\n \"\"\"Returns VICReg covariance loss.\n\n Generalized version of the covariance loss with support for tensors with more than\n two dimensions. Adapted from VICRegL:\n https://github.com/facebookresearch/VICRegL/blob/803ae4c8cd1649a820f03afb4793763e95317620/main_vicregl.py#L299\n\n Args:\n x:\n Tensor with shape (batch_size, ..., dim).\n \"\"\"\n x = x - x.mean(dim=0)\n batch_size = x.size(0)\n dim = x.size(-1)\n # nondiag_mask has shape (dim, dim) with 1s on all non-diagonal entries.\n nondiag_mask = ~torch.eye(dim, device=x.device, dtype=torch.bool)\n # cov has shape (..., dim, dim)\n cov = torch.einsum(\"b...c,b...d->...cd\", x, x) / (batch_size - 1)\n loss = cov[..., nondiag_mask].pow(2).sum(-1) / dim\n return loss.mean()\n", "path": "lightly/loss/vicreg_loss.py"}]} | 2,004 | 118 |
gh_patches_debug_6249 | rasdani/github-patches | git_diff | microsoft__ptvsd-641 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
Errors logged on VSTS
```
2018-07-05T19:13:17.5780150Z .Traceback (most recent call last):
2018-07-05T19:13:17.5795340Z File "/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_process_net_command.py", line 749, in process_net_command
2018-07-05T19:13:17.5813150Z py_db.enable_output_redirection('STDOUT' in text, 'STDERR' in text)
2018-07-05T19:13:17.5831030Z File "/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/pydevd.py", line 361, in enable_output_redirection
2018-07-05T19:13:17.5847040Z init_stdout_redirect()
2018-07-05T19:13:17.5862230Z File "/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/pydevd.py", line 1199, in init_stdout_redirect
2018-07-05T19:13:17.5878570Z sys.stdout = pydevd_io.IORedirector(original, sys._pydevd_out_buffer_, wrap_buffer) #@UndefinedVariable
2018-07-05T19:13:17.5895080Z File "/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py", line 24, in __init__
2018-07-05T19:13:17.5913010Z self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)
2018-07-05T19:13:17.5939400Z AttributeError: '_DuplicateWriter' object has no attribute 'buffer'
```
The same errors are logged for Linux and Mac OS.
I'm using a Mac and cannot replicate this error.
/cc @fabioz
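
A minimal reproduction sketch (editor's addition, assuming pydevd's `_pydevd_bundle` package is importable; `_FakeDuplicateWriter` is a hypothetical stand-in for ptvsd's `_DuplicateWriter`, that is, a text writer with no `buffer` attribute):

```python
# Editor's sketch: any original stream lacking a .buffer attribute
# trips the same AttributeError when wrap_buffer=True.
from _pydevd_bundle.pydevd_io import IOBuf, IORedirector

class _FakeDuplicateWriter:
    def write(self, s): pass
    def flush(self): pass

# IORedirector.__init__ reaches for original.buffer, raising
# AttributeError exactly as in the traceback above.
IORedirector(_FakeDuplicateWriter(), IOBuf(), wrap_buffer=True)
```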
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py`
Content:
```
1 from _pydevd_bundle import pydevd_constants
2
3 IS_PY3K = pydevd_constants.IS_PY3K
4
5 class IORedirector:
6 '''
7 This class works to wrap a stream (stdout/stderr) with an additional redirect.
8 '''
9
10 def __init__(self, original, new_redirect, wrap_buffer=False):
11 '''
12 :param stream original:
13 The stream to be wrapped (usually stdout/stderr).
14
15 :param stream new_redirect:
16 Usually IOBuf (below).
17
18 :param bool wrap_buffer:
19 Whether to create a buffer attribute (needed to mimick python 3 s
20 tdout/stderr which has a buffer to write binary data).
21 '''
22 self._redirect_to = (original, new_redirect)
23 if wrap_buffer:
24 self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)
25
26 def write(self, s):
27 # Note that writing to the original stream may fail for some reasons
28 # (such as trying to write something that's not a string or having it closed).
29 for r in self._redirect_to:
30 r.write(s)
31
32 def isatty(self):
33 return self._redirect_to[0].isatty()
34
35 def flush(self):
36 for r in self._redirect_to:
37 r.flush()
38
39 def __getattr__(self, name):
40 for r in self._redirect_to:
41 if hasattr(r, name):
42 return getattr(r, name)
43 raise AttributeError(name)
44
45 class IOBuf:
46 '''This class works as a replacement for stdio and stderr.
47 It is a buffer and when its contents are requested, it will erase what
48 it has so far so that the next return will not return the same contents again.
49 '''
50 def __init__(self):
51 self.buflist = []
52 import os
53 self.encoding = os.environ.get('PYTHONIOENCODING', 'utf-8')
54
55 def getvalue(self):
56 b = self.buflist
57 self.buflist = [] # clear it
58 return ''.join(b) # bytes on py2, str on py3.
59
60 def write(self, s):
61 if not IS_PY3K:
62 if isinstance(s, unicode):
63 # can't use 'errors' as kwargs in py 2.6
64 s = s.encode(self.encoding, 'replace')
65 else:
66 if isinstance(s, bytes):
67 s = s.decode(self.encoding, errors='replace')
68 self.buflist.append(s)
69
70 def isatty(self):
71 return False
72
73 def flush(self):
74 pass
75
76 def empty(self):
77 return len(self.buflist) == 0
78
79 class _RedirectionsHolder:
80 _stack_stdout = []
81 _stack_stderr = []
82
83
84 def start_redirect(keep_original_redirection=False, std='stdout'):
85 '''
86 @param std: 'stdout', 'stderr', or 'both'
87 '''
88 import sys
89 buf = IOBuf()
90
91 if std == 'both':
92 config_stds = ['stdout', 'stderr']
93 else:
94 config_stds = [std]
95
96 for std in config_stds:
97 original = getattr(sys, std)
98 stack = getattr(_RedirectionsHolder, '_stack_%s' % std)
99 stack.append(original)
100
101 if keep_original_redirection:
102 setattr(sys, std, IORedirector(getattr(sys, std), buf))
103 else:
104 setattr(sys, std, buf)
105 return buf
106
107
108 def end_redirect(std='stdout'):
109 import sys
110 if std == 'both':
111 config_stds = ['stdout', 'stderr']
112 else:
113 config_stds = [std]
114 for std in config_stds:
115 stack = getattr(_RedirectionsHolder, '_stack_%s' % std)
116 setattr(sys, std, stack.pop())
117
118
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py b/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py
--- a/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py
+++ b/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py
@@ -20,7 +20,7 @@
tdout/stderr which has a buffer to write binary data).
'''
self._redirect_to = (original, new_redirect)
- if wrap_buffer:
+ if wrap_buffer and hasattr(original, 'buffer'):
self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)
def write(self, s):
| {"golden_diff": "diff --git a/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py b/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py\n--- a/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py\n+++ b/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py\n@@ -20,7 +20,7 @@\n tdout/stderr which has a buffer to write binary data).\n '''\n self._redirect_to = (original, new_redirect)\n- if wrap_buffer:\n+ if wrap_buffer and hasattr(original, 'buffer'):\n self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)\n \n def write(self, s):\n", "issue": "Errors logged on VSTS \n```\r\n2018-07-05T19:13:17.5780150Z .Traceback (most recent call last):\r\n2018-07-05T19:13:17.5795340Z File \"/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_process_net_command.py\", line 749, in process_net_command\r\n2018-07-05T19:13:17.5813150Z py_db.enable_output_redirection('STDOUT' in text, 'STDERR' in text)\r\n2018-07-05T19:13:17.5831030Z File \"/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/pydevd.py\", line 361, in enable_output_redirection\r\n2018-07-05T19:13:17.5847040Z init_stdout_redirect()\r\n2018-07-05T19:13:17.5862230Z File \"/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/pydevd.py\", line 1199, in init_stdout_redirect\r\n2018-07-05T19:13:17.5878570Z sys.stdout = pydevd_io.IORedirector(original, sys._pydevd_out_buffer_, wrap_buffer) #@UndefinedVariable\r\n2018-07-05T19:13:17.5895080Z File \"/Users/vsts/agent/2.134.2/work/1/s/ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py\", line 24, in __init__\r\n2018-07-05T19:13:17.5913010Z self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)\r\n2018-07-05T19:13:17.5939400Z AttributeError: '_DuplicateWriter' object has no attribute 'buffer'\r\n```\r\n\r\nThe same errors are logged for Linux and Mac OS.\r\nI'm using a Mac and cannot replicate this error.\r\n\r\n/cc @fabioz \n", "before_files": [{"content": "from _pydevd_bundle import pydevd_constants\n\nIS_PY3K = pydevd_constants.IS_PY3K\n\nclass IORedirector:\n '''\n This class works to wrap a stream (stdout/stderr) with an additional redirect.\n '''\n\n def __init__(self, original, new_redirect, wrap_buffer=False):\n '''\n :param stream original:\n The stream to be wrapped (usually stdout/stderr).\n\n :param stream new_redirect:\n Usually IOBuf (below).\n\n :param bool wrap_buffer:\n Whether to create a buffer attribute (needed to mimick python 3 s\n tdout/stderr which has a buffer to write binary data).\n '''\n self._redirect_to = (original, new_redirect)\n if wrap_buffer:\n self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)\n\n def write(self, s):\n # Note that writing to the original stream may fail for some reasons\n # (such as trying to write something that's not a string or having it closed).\n for r in self._redirect_to:\n r.write(s)\n\n def isatty(self):\n return self._redirect_to[0].isatty()\n\n def flush(self):\n for r in self._redirect_to:\n r.flush()\n\n def __getattr__(self, name):\n for r in self._redirect_to:\n if hasattr(r, name):\n return getattr(r, name)\n raise AttributeError(name)\n\nclass IOBuf:\n '''This class works as a replacement for stdio and stderr.\n It is a buffer and when its contents are requested, it will erase what\n it has so far so that the next return will not return the same contents again.\n '''\n def __init__(self):\n self.buflist = []\n import os\n self.encoding = os.environ.get('PYTHONIOENCODING', 'utf-8')\n\n def getvalue(self):\n b = self.buflist\n self.buflist = [] # 
clear it\n return ''.join(b) # bytes on py2, str on py3.\n \n def write(self, s):\n if not IS_PY3K:\n if isinstance(s, unicode):\n # can't use 'errors' as kwargs in py 2.6\n s = s.encode(self.encoding, 'replace')\n else:\n if isinstance(s, bytes):\n s = s.decode(self.encoding, errors='replace')\n self.buflist.append(s)\n\n def isatty(self):\n return False\n\n def flush(self):\n pass\n\n def empty(self):\n return len(self.buflist) == 0\n\nclass _RedirectionsHolder:\n _stack_stdout = []\n _stack_stderr = []\n\n\ndef start_redirect(keep_original_redirection=False, std='stdout'):\n '''\n @param std: 'stdout', 'stderr', or 'both'\n '''\n import sys\n buf = IOBuf()\n\n if std == 'both':\n config_stds = ['stdout', 'stderr']\n else:\n config_stds = [std]\n\n for std in config_stds:\n original = getattr(sys, std)\n stack = getattr(_RedirectionsHolder, '_stack_%s' % std)\n stack.append(original)\n\n if keep_original_redirection:\n setattr(sys, std, IORedirector(getattr(sys, std), buf))\n else:\n setattr(sys, std, buf)\n return buf\n\n\ndef end_redirect(std='stdout'):\n import sys\n if std == 'both':\n config_stds = ['stdout', 'stderr']\n else:\n config_stds = [std]\n for std in config_stds:\n stack = getattr(_RedirectionsHolder, '_stack_%s' % std)\n setattr(sys, std, stack.pop())\n\n", "path": "ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py"}], "after_files": [{"content": "from _pydevd_bundle import pydevd_constants\n\nIS_PY3K = pydevd_constants.IS_PY3K\n\nclass IORedirector:\n '''\n This class works to wrap a stream (stdout/stderr) with an additional redirect.\n '''\n\n def __init__(self, original, new_redirect, wrap_buffer=False):\n '''\n :param stream original:\n The stream to be wrapped (usually stdout/stderr).\n\n :param stream new_redirect:\n Usually IOBuf (below).\n\n :param bool wrap_buffer:\n Whether to create a buffer attribute (needed to mimick python 3 s\n tdout/stderr which has a buffer to write binary data).\n '''\n self._redirect_to = (original, new_redirect)\n if wrap_buffer and hasattr(original, 'buffer'):\n self.buffer = IORedirector(original.buffer, new_redirect.buffer, False)\n\n def write(self, s):\n # Note that writing to the original stream may fail for some reasons\n # (such as trying to write something that's not a string or having it closed).\n for r in self._redirect_to:\n r.write(s)\n\n def isatty(self):\n return self._redirect_to[0].isatty()\n\n def flush(self):\n for r in self._redirect_to:\n r.flush()\n\n def __getattr__(self, name):\n for r in self._redirect_to:\n if hasattr(r, name):\n return getattr(r, name)\n raise AttributeError(name)\n\nclass IOBuf:\n '''This class works as a replacement for stdio and stderr.\n It is a buffer and when its contents are requested, it will erase what\n it has so far so that the next return will not return the same contents again.\n '''\n def __init__(self):\n self.buflist = []\n import os\n self.encoding = os.environ.get('PYTHONIOENCODING', 'utf-8')\n\n def getvalue(self):\n b = self.buflist\n self.buflist = [] # clear it\n return ''.join(b) # bytes on py2, str on py3.\n \n def write(self, s):\n if not IS_PY3K:\n if isinstance(s, unicode):\n # can't use 'errors' as kwargs in py 2.6\n s = s.encode(self.encoding, 'replace')\n else:\n if isinstance(s, bytes):\n s = s.decode(self.encoding, errors='replace')\n self.buflist.append(s)\n\n def isatty(self):\n return False\n\n def flush(self):\n pass\n\n def empty(self):\n return len(self.buflist) == 0\n\nclass _RedirectionsHolder:\n _stack_stdout = []\n _stack_stderr = []\n\n\ndef 
start_redirect(keep_original_redirection=False, std='stdout'):\n '''\n @param std: 'stdout', 'stderr', or 'both'\n '''\n import sys\n buf = IOBuf()\n\n if std == 'both':\n config_stds = ['stdout', 'stderr']\n else:\n config_stds = [std]\n\n for std in config_stds:\n original = getattr(sys, std)\n stack = getattr(_RedirectionsHolder, '_stack_%s' % std)\n stack.append(original)\n\n if keep_original_redirection:\n setattr(sys, std, IORedirector(getattr(sys, std), buf))\n else:\n setattr(sys, std, buf)\n return buf\n\n\ndef end_redirect(std='stdout'):\n import sys\n if std == 'both':\n config_stds = ['stdout', 'stderr']\n else:\n config_stds = [std]\n for std in config_stds:\n stack = getattr(_RedirectionsHolder, '_stack_%s' % std)\n setattr(sys, std, stack.pop())\n\n", "path": "ptvsd/_vendored/pydevd/_pydevd_bundle/pydevd_io.py"}]} | 1,964 | 179 |
gh_patches_debug_32008 | rasdani/github-patches | git_diff | microsoft__AzureTRE-1039 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
API app not reporting requests to AppInsights
**Description**
Ensure opencensus reports http requests to app insights.
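
One plausible shape for this (an editor's sketch, not taken from the issue): wire an OpenCensus tracer into FastAPI middleware. The sketch assumes `app` is the FastAPI instance from `main.py` and that the instrumentation key is exposed via the `APPINSIGHTS_INSTRUMENTATIONKEY` environment variable:

```python
# Editor's sketch: export each HTTP request as a span to Application Insights.
import os

from opencensus.ext.azure.trace_exporter import AzureExporter
from opencensus.trace.samplers import ProbabilitySampler
from opencensus.trace.tracer import Tracer

exporter = AzureExporter(
    connection_string=f'InstrumentationKey={os.getenv("APPINSIGHTS_INSTRUMENTATIONKEY")}')

@app.middleware("http")
async def report_request(request, call_next):
    tracer = Tracer(exporter=exporter, sampler=ProbabilitySampler(1.0))
    with tracer.span("main"):
        response = await call_next(request)
    return response
```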
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `api_app/main.py`
Content:
```
1 import logging
2 import uvicorn
3
4 from fastapi import FastAPI
5 from fastapi.exceptions import RequestValidationError
6 from fastapi_utils.tasks import repeat_every
7 from starlette.exceptions import HTTPException
8 from starlette.middleware.errors import ServerErrorMiddleware
9
10 from api.routes.api import router as api_router
11 from api.routes.api import tags_metadata
12 from api.errors.http_error import http_error_handler
13 from api.errors.validation_error import http422_error_handler
14 from api.errors.generic_error import generic_error_handler
15 from core import config
16 from core.events import create_start_app_handler, create_stop_app_handler
17 from services.logging import disable_unwanted_loggers, initialize_logging
18 from service_bus.deployment_status_update import receive_message_and_update_deployment
19
20
21 def get_application() -> FastAPI:
22 application = FastAPI(
23 title=config.PROJECT_NAME,
24 debug=config.DEBUG,
25 description=config.API_DESCRIPTION,
26 version=config.VERSION,
27 docs_url="/api/docs",
28 swagger_ui_oauth2_redirect_url="/api/docs/oauth2-redirect",
29 swagger_ui_init_oauth={
30 "usePkceWithAuthorizationCodeGrant": True,
31 "clientId": config.SWAGGER_UI_CLIENT_ID,
32 "scopes": ["openid", "offline_access", f"api://{config.API_CLIENT_ID}/Workspace.Read", f"api://{config.API_CLIENT_ID}/Workspace.Write"]
33 },
34 openapi_tags=tags_metadata
35 )
36
37 application.add_event_handler("startup", create_start_app_handler(application))
38 application.add_event_handler("shutdown", create_stop_app_handler(application))
39
40 application.add_middleware(ServerErrorMiddleware, handler=generic_error_handler)
41 application.add_exception_handler(HTTPException, http_error_handler)
42 application.add_exception_handler(RequestValidationError, http422_error_handler)
43
44 application.include_router(api_router, prefix=config.API_PREFIX)
45 return application
46
47
48 app = get_application()
49
50
51 @app.on_event("startup")
52 async def initialize_logging_on_startup():
53 if config.DEBUG:
54 initialize_logging(logging.DEBUG)
55 else:
56 initialize_logging(logging.INFO)
57
58 disable_unwanted_loggers()
59
60
61 @app.on_event("startup")
62 @repeat_every(seconds=20, wait_first=True, logger=logging.getLogger())
63 async def update_deployment_status() -> None:
64 await receive_message_and_update_deployment(app)
65
66
67 if __name__ == "__main__":
68 uvicorn.run(app, host="0.0.0.0", port=8000)
69
```
Path: `api_app/_version.py`
Content:
```
1 __version__ = "0.1.1"
2
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/api_app/_version.py b/api_app/_version.py
--- a/api_app/_version.py
+++ b/api_app/_version.py
@@ -1 +1 @@
-__version__ = "0.1.1"
+__version__ = "0.1.3"
diff --git a/api_app/main.py b/api_app/main.py
--- a/api_app/main.py
+++ b/api_app/main.py
@@ -1,7 +1,8 @@
import logging
+import os
import uvicorn
-from fastapi import FastAPI
+from fastapi import FastAPI, Request
from fastapi.exceptions import RequestValidationError
from fastapi_utils.tasks import repeat_every
from starlette.exceptions import HTTPException
@@ -17,6 +18,16 @@
from services.logging import disable_unwanted_loggers, initialize_logging
from service_bus.deployment_status_update import receive_message_and_update_deployment
+# Opencensus Azure imports
+from opencensus.ext.azure.trace_exporter import AzureExporter
+from opencensus.trace.attributes_helper import COMMON_ATTRIBUTES
+from opencensus.trace.samplers import ProbabilitySampler
+from opencensus.trace.span import SpanKind
+from opencensus.trace.tracer import Tracer
+
+HTTP_URL = COMMON_ATTRIBUTES['HTTP_URL']
+HTTP_STATUS_CODE = COMMON_ATTRIBUTES['HTTP_STATUS_CODE']
+
def get_application() -> FastAPI:
application = FastAPI(
@@ -64,5 +75,19 @@
await receive_message_and_update_deployment(app)
[email protected]("http")
+async def add_process_time_header(request: Request, call_next):
+ tracer = Tracer(exporter=AzureExporter(connection_string=f'InstrumentationKey={os.getenv("APPINSIGHTS_INSTRUMENTATIONKEY")}'), sampler=ProbabilitySampler(1.0))
+ with tracer.span("main") as span:
+ span.span_kind = SpanKind.SERVER
+
+ response = await call_next(request)
+
+ tracer.add_attribute_to_current_span(attribute_key=HTTP_STATUS_CODE, attribute_value=response.status_code)
+ tracer.add_attribute_to_current_span(attribute_key=HTTP_URL, attribute_value=str(request.url))
+
+ return response
+
+
if __name__ == "__main__":
uvicorn.run(app, host="0.0.0.0", port=8000)
| {"golden_diff": "diff --git a/api_app/_version.py b/api_app/_version.py\n--- a/api_app/_version.py\n+++ b/api_app/_version.py\n@@ -1 +1 @@\n-__version__ = \"0.1.1\"\n+__version__ = \"0.1.3\"\ndiff --git a/api_app/main.py b/api_app/main.py\n--- a/api_app/main.py\n+++ b/api_app/main.py\n@@ -1,7 +1,8 @@\n import logging\n+import os\n import uvicorn\n \n-from fastapi import FastAPI\n+from fastapi import FastAPI, Request\n from fastapi.exceptions import RequestValidationError\n from fastapi_utils.tasks import repeat_every\n from starlette.exceptions import HTTPException\n@@ -17,6 +18,16 @@\n from services.logging import disable_unwanted_loggers, initialize_logging\n from service_bus.deployment_status_update import receive_message_and_update_deployment\n \n+# Opencensus Azure imports\n+from opencensus.ext.azure.trace_exporter import AzureExporter\n+from opencensus.trace.attributes_helper import COMMON_ATTRIBUTES\n+from opencensus.trace.samplers import ProbabilitySampler\n+from opencensus.trace.span import SpanKind\n+from opencensus.trace.tracer import Tracer\n+\n+HTTP_URL = COMMON_ATTRIBUTES['HTTP_URL']\n+HTTP_STATUS_CODE = COMMON_ATTRIBUTES['HTTP_STATUS_CODE']\n+\n \n def get_application() -> FastAPI:\n application = FastAPI(\n@@ -64,5 +75,19 @@\n await receive_message_and_update_deployment(app)\n \n \[email protected](\"http\")\n+async def add_process_time_header(request: Request, call_next):\n+ tracer = Tracer(exporter=AzureExporter(connection_string=f'InstrumentationKey={os.getenv(\"APPINSIGHTS_INSTRUMENTATIONKEY\")}'), sampler=ProbabilitySampler(1.0))\n+ with tracer.span(\"main\") as span:\n+ span.span_kind = SpanKind.SERVER\n+\n+ response = await call_next(request)\n+\n+ tracer.add_attribute_to_current_span(attribute_key=HTTP_STATUS_CODE, attribute_value=response.status_code)\n+ tracer.add_attribute_to_current_span(attribute_key=HTTP_URL, attribute_value=str(request.url))\n+\n+ return response\n+\n+\n if __name__ == \"__main__\":\n uvicorn.run(app, host=\"0.0.0.0\", port=8000)\n", "issue": "API app not reporting requests to AppInsights\n**Description**\r\nEnsure opencensus reports http requests to app insights.\n", "before_files": [{"content": "import logging\nimport uvicorn\n\nfrom fastapi import FastAPI\nfrom fastapi.exceptions import RequestValidationError\nfrom fastapi_utils.tasks import repeat_every\nfrom starlette.exceptions import HTTPException\nfrom starlette.middleware.errors import ServerErrorMiddleware\n\nfrom api.routes.api import router as api_router\nfrom api.routes.api import tags_metadata\nfrom api.errors.http_error import http_error_handler\nfrom api.errors.validation_error import http422_error_handler\nfrom api.errors.generic_error import generic_error_handler\nfrom core import config\nfrom core.events import create_start_app_handler, create_stop_app_handler\nfrom services.logging import disable_unwanted_loggers, initialize_logging\nfrom service_bus.deployment_status_update import receive_message_and_update_deployment\n\n\ndef get_application() -> FastAPI:\n application = FastAPI(\n title=config.PROJECT_NAME,\n debug=config.DEBUG,\n description=config.API_DESCRIPTION,\n version=config.VERSION,\n docs_url=\"/api/docs\",\n swagger_ui_oauth2_redirect_url=\"/api/docs/oauth2-redirect\",\n swagger_ui_init_oauth={\n \"usePkceWithAuthorizationCodeGrant\": True,\n \"clientId\": config.SWAGGER_UI_CLIENT_ID,\n \"scopes\": [\"openid\", \"offline_access\", f\"api://{config.API_CLIENT_ID}/Workspace.Read\", f\"api://{config.API_CLIENT_ID}/Workspace.Write\"]\n },\n 
openapi_tags=tags_metadata\n )\n\n application.add_event_handler(\"startup\", create_start_app_handler(application))\n application.add_event_handler(\"shutdown\", create_stop_app_handler(application))\n\n application.add_middleware(ServerErrorMiddleware, handler=generic_error_handler)\n application.add_exception_handler(HTTPException, http_error_handler)\n application.add_exception_handler(RequestValidationError, http422_error_handler)\n\n application.include_router(api_router, prefix=config.API_PREFIX)\n return application\n\n\napp = get_application()\n\n\[email protected]_event(\"startup\")\nasync def initialize_logging_on_startup():\n if config.DEBUG:\n initialize_logging(logging.DEBUG)\n else:\n initialize_logging(logging.INFO)\n\n disable_unwanted_loggers()\n\n\[email protected]_event(\"startup\")\n@repeat_every(seconds=20, wait_first=True, logger=logging.getLogger())\nasync def update_deployment_status() -> None:\n await receive_message_and_update_deployment(app)\n\n\nif __name__ == \"__main__\":\n uvicorn.run(app, host=\"0.0.0.0\", port=8000)\n", "path": "api_app/main.py"}, {"content": "__version__ = \"0.1.1\"\n", "path": "api_app/_version.py"}], "after_files": [{"content": "import logging\nimport os\nimport uvicorn\n\nfrom fastapi import FastAPI, Request\nfrom fastapi.exceptions import RequestValidationError\nfrom fastapi_utils.tasks import repeat_every\nfrom starlette.exceptions import HTTPException\nfrom starlette.middleware.errors import ServerErrorMiddleware\n\nfrom api.routes.api import router as api_router\nfrom api.routes.api import tags_metadata\nfrom api.errors.http_error import http_error_handler\nfrom api.errors.validation_error import http422_error_handler\nfrom api.errors.generic_error import generic_error_handler\nfrom core import config\nfrom core.events import create_start_app_handler, create_stop_app_handler\nfrom services.logging import disable_unwanted_loggers, initialize_logging\nfrom service_bus.deployment_status_update import receive_message_and_update_deployment\n\n# Opencensus Azure imports\nfrom opencensus.ext.azure.trace_exporter import AzureExporter\nfrom opencensus.trace.attributes_helper import COMMON_ATTRIBUTES\nfrom opencensus.trace.samplers import ProbabilitySampler\nfrom opencensus.trace.span import SpanKind\nfrom opencensus.trace.tracer import Tracer\n\nHTTP_URL = COMMON_ATTRIBUTES['HTTP_URL']\nHTTP_STATUS_CODE = COMMON_ATTRIBUTES['HTTP_STATUS_CODE']\n\n\ndef get_application() -> FastAPI:\n application = FastAPI(\n title=config.PROJECT_NAME,\n debug=config.DEBUG,\n description=config.API_DESCRIPTION,\n version=config.VERSION,\n docs_url=\"/api/docs\",\n swagger_ui_oauth2_redirect_url=\"/api/docs/oauth2-redirect\",\n swagger_ui_init_oauth={\n \"usePkceWithAuthorizationCodeGrant\": True,\n \"clientId\": config.SWAGGER_UI_CLIENT_ID,\n \"scopes\": [\"openid\", \"offline_access\", f\"api://{config.API_CLIENT_ID}/Workspace.Read\", f\"api://{config.API_CLIENT_ID}/Workspace.Write\"]\n },\n openapi_tags=tags_metadata\n )\n\n application.add_event_handler(\"startup\", create_start_app_handler(application))\n application.add_event_handler(\"shutdown\", create_stop_app_handler(application))\n\n application.add_middleware(ServerErrorMiddleware, handler=generic_error_handler)\n application.add_exception_handler(HTTPException, http_error_handler)\n application.add_exception_handler(RequestValidationError, http422_error_handler)\n\n application.include_router(api_router, prefix=config.API_PREFIX)\n return application\n\n\napp = get_application()\n\n\[email 
protected]_event(\"startup\")\nasync def initialize_logging_on_startup():\n if config.DEBUG:\n initialize_logging(logging.DEBUG)\n else:\n initialize_logging(logging.INFO)\n\n disable_unwanted_loggers()\n\n\[email protected]_event(\"startup\")\n@repeat_every(seconds=20, wait_first=True, logger=logging.getLogger())\nasync def update_deployment_status() -> None:\n await receive_message_and_update_deployment(app)\n\n\[email protected](\"http\")\nasync def add_process_time_header(request: Request, call_next):\n tracer = Tracer(exporter=AzureExporter(connection_string=f'InstrumentationKey={os.getenv(\"APPINSIGHTS_INSTRUMENTATIONKEY\")}'), sampler=ProbabilitySampler(1.0))\n with tracer.span(\"main\") as span:\n span.span_kind = SpanKind.SERVER\n\n response = await call_next(request)\n\n tracer.add_attribute_to_current_span(attribute_key=HTTP_STATUS_CODE, attribute_value=response.status_code)\n tracer.add_attribute_to_current_span(attribute_key=HTTP_URL, attribute_value=str(request.url))\n\n return response\n\n\nif __name__ == \"__main__\":\n uvicorn.run(app, host=\"0.0.0.0\", port=8000)\n", "path": "api_app/main.py"}, {"content": "__version__ = \"0.1.3\"\n", "path": "api_app/_version.py"}]} | 946 | 501 |
gh_patches_debug_7155 | rasdani/github-patches | git_diff | MongoEngine__mongoengine-873 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
No module named 'django.utils.importlib' (Django dev)
In mongoengine/django/mongo_auth/models.py
See https://github.com/django/django/tree/master/django/utils
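A common compatibility shim (editor's sketch, not from the report) is to import `import_module` from the standard library, where it has lived since Python 2.7, and fall back to the removed Django location only if needed:

```python
# Editor's sketch: django.utils.importlib was a py2.6 shim and is gone
# in Django dev; the stdlib importlib is the stable replacement.
try:
    from importlib import import_module  # standard library, Python 2.7+
except ImportError:  # ancient interpreters only
    from django.utils.importlib import import_module
```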
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `mongoengine/django/mongo_auth/models.py`
Content:
```
1 from django.conf import settings
2 from django.contrib.auth.hashers import make_password
3 from django.contrib.auth.models import UserManager
4 from django.core.exceptions import ImproperlyConfigured
5 from django.db import models
6 from django.utils.importlib import import_module
7 from django.utils.translation import ugettext_lazy as _
8
9
10 __all__ = (
11 'get_user_document',
12 )
13
14
15 MONGOENGINE_USER_DOCUMENT = getattr(
16 settings, 'MONGOENGINE_USER_DOCUMENT', 'mongoengine.django.auth.User')
17
18
19 def get_user_document():
20 """Get the user document class used for authentication.
21
22 This is the class defined in settings.MONGOENGINE_USER_DOCUMENT, which
23 defaults to `mongoengine.django.auth.User`.
24
25 """
26
27 name = MONGOENGINE_USER_DOCUMENT
28 dot = name.rindex('.')
29 module = import_module(name[:dot])
30 return getattr(module, name[dot + 1:])
31
32
33 class MongoUserManager(UserManager):
34     """A User manager which allows the use of MongoEngine documents in Django.
35
36 To use the manager, you must tell django.contrib.auth to use MongoUser as
37 the user model. In you settings.py, you need:
38
39 INSTALLED_APPS = (
40 ...
41 'django.contrib.auth',
42 'mongoengine.django.mongo_auth',
43 ...
44 )
45 AUTH_USER_MODEL = 'mongo_auth.MongoUser'
46
47 Django will use the model object to access the custom Manager, which will
48 replace the original queryset with MongoEngine querysets.
49
50 By default, mongoengine.django.auth.User will be used to store users. You
51 can specify another document class in MONGOENGINE_USER_DOCUMENT in your
52 settings.py.
53
54 The User Document class has the same requirements as a standard custom user
55 model: https://docs.djangoproject.com/en/dev/topics/auth/customizing/
56
57 In particular, the User Document class must define USERNAME_FIELD and
58 REQUIRED_FIELDS.
59
60 `AUTH_USER_MODEL` has been added in Django 1.5.
61
62 """
63
64 def contribute_to_class(self, model, name):
65 super(MongoUserManager, self).contribute_to_class(model, name)
66 self.dj_model = self.model
67 self.model = get_user_document()
68
69 self.dj_model.USERNAME_FIELD = self.model.USERNAME_FIELD
70 username = models.CharField(_('username'), max_length=30, unique=True)
71 username.contribute_to_class(self.dj_model, self.dj_model.USERNAME_FIELD)
72
73 self.dj_model.REQUIRED_FIELDS = self.model.REQUIRED_FIELDS
74 for name in self.dj_model.REQUIRED_FIELDS:
75 field = models.CharField(_(name), max_length=30)
76 field.contribute_to_class(self.dj_model, name)
77
78
79 def get(self, *args, **kwargs):
80 try:
81 return self.get_query_set().get(*args, **kwargs)
82 except self.model.DoesNotExist:
83 # ModelBackend expects this exception
84 raise self.dj_model.DoesNotExist
85
86 @property
87 def db(self):
88 raise NotImplementedError
89
90 def get_empty_query_set(self):
91 return self.model.objects.none()
92
93 def get_query_set(self):
94 return self.model.objects
95
96
97 class MongoUser(models.Model):
98 """"Dummy user model for Django.
99
100 MongoUser is used to replace Django's UserManager with MongoUserManager.
101 The actual user document class is mongoengine.django.auth.User or any
102 other document class specified in MONGOENGINE_USER_DOCUMENT.
103
104 To get the user document class, use `get_user_document()`.
105
106 """
107
108 objects = MongoUserManager()
109
110 class Meta:
111 app_label = 'mongo_auth'
112
113 def set_password(self, password):
114 """Doesn't do anything, but works around the issue with Django 1.6."""
115 make_password(password)
116
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/mongoengine/django/mongo_auth/models.py b/mongoengine/django/mongo_auth/models.py
--- a/mongoengine/django/mongo_auth/models.py
+++ b/mongoengine/django/mongo_auth/models.py
@@ -3,7 +3,11 @@
from django.contrib.auth.models import UserManager
from django.core.exceptions import ImproperlyConfigured
from django.db import models
-from django.utils.importlib import import_module
+try:
+ from django.utils.module_loading import import_module
+except ImportError:
+ """Handle older versions of Django"""
+ from django.utils.importlib import import_module
from django.utils.translation import ugettext_lazy as _
| {"golden_diff": "diff --git a/mongoengine/django/mongo_auth/models.py b/mongoengine/django/mongo_auth/models.py\n--- a/mongoengine/django/mongo_auth/models.py\n+++ b/mongoengine/django/mongo_auth/models.py\n@@ -3,7 +3,11 @@\n from django.contrib.auth.models import UserManager\n from django.core.exceptions import ImproperlyConfigured\n from django.db import models\n-from django.utils.importlib import import_module\n+try:\n+ from django.utils.module_loading import import_module\n+except ImportError:\n+ \"\"\"Handle older versions of Django\"\"\"\n+ from django.utils.importlib import import_module\n from django.utils.translation import ugettext_lazy as _\n", "issue": "No module named 'django.utils.importlib' (Django dev)\nIn mongoengine/django/mongo_auth/models.py\nSee https://github.com/django/django/tree/master/django/utils\n\nNo module named 'django.utils.importlib' (Django dev)\nIn mongoengine/django/mongo_auth/models.py\nSee https://github.com/django/django/tree/master/django/utils\n\n", "before_files": [{"content": "from django.conf import settings\nfrom django.contrib.auth.hashers import make_password\nfrom django.contrib.auth.models import UserManager\nfrom django.core.exceptions import ImproperlyConfigured\nfrom django.db import models\nfrom django.utils.importlib import import_module\nfrom django.utils.translation import ugettext_lazy as _\n\n\n__all__ = (\n 'get_user_document',\n)\n\n\nMONGOENGINE_USER_DOCUMENT = getattr(\n settings, 'MONGOENGINE_USER_DOCUMENT', 'mongoengine.django.auth.User')\n\n\ndef get_user_document():\n \"\"\"Get the user document class used for authentication.\n\n This is the class defined in settings.MONGOENGINE_USER_DOCUMENT, which\n defaults to `mongoengine.django.auth.User`.\n\n \"\"\"\n\n name = MONGOENGINE_USER_DOCUMENT\n dot = name.rindex('.')\n module = import_module(name[:dot])\n return getattr(module, name[dot + 1:])\n\n\nclass MongoUserManager(UserManager):\n \"\"\"A User manager wich allows the use of MongoEngine documents in Django.\n\n To use the manager, you must tell django.contrib.auth to use MongoUser as\n the user model. In you settings.py, you need:\n\n INSTALLED_APPS = (\n ...\n 'django.contrib.auth',\n 'mongoengine.django.mongo_auth',\n ...\n )\n AUTH_USER_MODEL = 'mongo_auth.MongoUser'\n\n Django will use the model object to access the custom Manager, which will\n replace the original queryset with MongoEngine querysets.\n\n By default, mongoengine.django.auth.User will be used to store users. 
You\n can specify another document class in MONGOENGINE_USER_DOCUMENT in your\n settings.py.\n\n The User Document class has the same requirements as a standard custom user\n model: https://docs.djangoproject.com/en/dev/topics/auth/customizing/\n\n In particular, the User Document class must define USERNAME_FIELD and\n REQUIRED_FIELDS.\n\n `AUTH_USER_MODEL` has been added in Django 1.5.\n\n \"\"\"\n\n def contribute_to_class(self, model, name):\n super(MongoUserManager, self).contribute_to_class(model, name)\n self.dj_model = self.model\n self.model = get_user_document()\n\n self.dj_model.USERNAME_FIELD = self.model.USERNAME_FIELD\n username = models.CharField(_('username'), max_length=30, unique=True)\n username.contribute_to_class(self.dj_model, self.dj_model.USERNAME_FIELD)\n\n self.dj_model.REQUIRED_FIELDS = self.model.REQUIRED_FIELDS\n for name in self.dj_model.REQUIRED_FIELDS:\n field = models.CharField(_(name), max_length=30)\n field.contribute_to_class(self.dj_model, name)\n\n\n def get(self, *args, **kwargs):\n try:\n return self.get_query_set().get(*args, **kwargs)\n except self.model.DoesNotExist:\n # ModelBackend expects this exception\n raise self.dj_model.DoesNotExist\n\n @property\n def db(self):\n raise NotImplementedError\n\n def get_empty_query_set(self):\n return self.model.objects.none()\n\n def get_query_set(self):\n return self.model.objects\n\n\nclass MongoUser(models.Model):\n \"\"\"\"Dummy user model for Django.\n\n MongoUser is used to replace Django's UserManager with MongoUserManager.\n The actual user document class is mongoengine.django.auth.User or any\n other document class specified in MONGOENGINE_USER_DOCUMENT.\n\n To get the user document class, use `get_user_document()`.\n\n \"\"\"\n\n objects = MongoUserManager()\n\n class Meta:\n app_label = 'mongo_auth'\n\n def set_password(self, password):\n \"\"\"Doesn't do anything, but works around the issue with Django 1.6.\"\"\"\n make_password(password)\n", "path": "mongoengine/django/mongo_auth/models.py"}], "after_files": [{"content": "from django.conf import settings\nfrom django.contrib.auth.hashers import make_password\nfrom django.contrib.auth.models import UserManager\nfrom django.core.exceptions import ImproperlyConfigured\nfrom django.db import models\ntry:\n from django.utils.module_loading import import_module\nexcept ImportError:\n \"\"\"Handle older versions of Django\"\"\"\n from django.utils.importlib import import_module\nfrom django.utils.translation import ugettext_lazy as _\n\n\n__all__ = (\n 'get_user_document',\n)\n\n\nMONGOENGINE_USER_DOCUMENT = getattr(\n settings, 'MONGOENGINE_USER_DOCUMENT', 'mongoengine.django.auth.User')\n\n\ndef get_user_document():\n \"\"\"Get the user document class used for authentication.\n\n This is the class defined in settings.MONGOENGINE_USER_DOCUMENT, which\n defaults to `mongoengine.django.auth.User`.\n\n \"\"\"\n\n name = MONGOENGINE_USER_DOCUMENT\n dot = name.rindex('.')\n module = import_module(name[:dot])\n return getattr(module, name[dot + 1:])\n\n\nclass MongoUserManager(UserManager):\n \"\"\"A User manager wich allows the use of MongoEngine documents in Django.\n\n To use the manager, you must tell django.contrib.auth to use MongoUser as\n the user model. 
In you settings.py, you need:\n\n INSTALLED_APPS = (\n ...\n 'django.contrib.auth',\n 'mongoengine.django.mongo_auth',\n ...\n )\n AUTH_USER_MODEL = 'mongo_auth.MongoUser'\n\n Django will use the model object to access the custom Manager, which will\n replace the original queryset with MongoEngine querysets.\n\n By default, mongoengine.django.auth.User will be used to store users. You\n can specify another document class in MONGOENGINE_USER_DOCUMENT in your\n settings.py.\n\n The User Document class has the same requirements as a standard custom user\n model: https://docs.djangoproject.com/en/dev/topics/auth/customizing/\n\n In particular, the User Document class must define USERNAME_FIELD and\n REQUIRED_FIELDS.\n\n `AUTH_USER_MODEL` has been added in Django 1.5.\n\n \"\"\"\n\n def contribute_to_class(self, model, name):\n super(MongoUserManager, self).contribute_to_class(model, name)\n self.dj_model = self.model\n self.model = get_user_document()\n\n self.dj_model.USERNAME_FIELD = self.model.USERNAME_FIELD\n username = models.CharField(_('username'), max_length=30, unique=True)\n username.contribute_to_class(self.dj_model, self.dj_model.USERNAME_FIELD)\n\n self.dj_model.REQUIRED_FIELDS = self.model.REQUIRED_FIELDS\n for name in self.dj_model.REQUIRED_FIELDS:\n field = models.CharField(_(name), max_length=30)\n field.contribute_to_class(self.dj_model, name)\n\n\n def get(self, *args, **kwargs):\n try:\n return self.get_query_set().get(*args, **kwargs)\n except self.model.DoesNotExist:\n # ModelBackend expects this exception\n raise self.dj_model.DoesNotExist\n\n @property\n def db(self):\n raise NotImplementedError\n\n def get_empty_query_set(self):\n return self.model.objects.none()\n\n def get_query_set(self):\n return self.model.objects\n\n\nclass MongoUser(models.Model):\n \"\"\"\"Dummy user model for Django.\n\n MongoUser is used to replace Django's UserManager with MongoUserManager.\n The actual user document class is mongoengine.django.auth.User or any\n other document class specified in MONGOENGINE_USER_DOCUMENT.\n\n To get the user document class, use `get_user_document()`.\n\n \"\"\"\n\n objects = MongoUserManager()\n\n class Meta:\n app_label = 'mongo_auth'\n\n def set_password(self, password):\n \"\"\"Doesn't do anything, but works around the issue with Django 1.6.\"\"\"\n make_password(password)\n", "path": "mongoengine/django/mongo_auth/models.py"}]} | 1,390 | 145 |
gh_patches_debug_27750 | rasdani/github-patches | git_diff | fossasia__open-event-server-4399 | We are currently solving the following issue within our repository. Here is the issue text:
--- BEGIN ISSUE ---
User-Session: user/<id>/sessions returns all the sessions in system
**I'm submitting a ...** (check one with "x")
- [x] bug report
- [ ] feature request
- [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-orga-server
The user API returns all the sessions in the system instead of sessions under the user
eg
URL
```
https://open-event-api.herokuapp.com/v1/users/5/sessions
```
Query Params:
```
include:event
sort:starts-at
```
@poush @shubham-padia @enigmaeth @magdalenesuo Please have a look at it
--- END ISSUE ---
Below are some code segments, each from a relevant file. One or more of these files may contain bugs.
--- BEGIN FILES ---
Path: `app/api/sessions.py`
Content:
```
1 from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship
2
3 from app.api.bootstrap import api
4 from app.api.events import Event
5 from app.api.helpers.db import safe_query
6 from app.api.helpers.mail import send_email_new_session, send_email_session_accept_reject
7 from app.api.helpers.notification import send_notif_new_session_organizer, send_notif_session_accept_reject
8 from app.api.helpers.permissions import current_identity
9 from app.api.helpers.query import event_query
10 from app.api.helpers.utilities import require_relationship
11 from app.api.schema.sessions import SessionSchema
12 from app.models import db
13 from app.models.microlocation import Microlocation
14 from app.models.session import Session
15 from app.models.session_type import SessionType
16 from app.models.speaker import Speaker
17 from app.models.track import Track
18 from app.settings import get_settings
19
20
21 class SessionListPost(ResourceList):
22 """
23 List Sessions
24 """
25 def before_post(self, args, kwargs, data):
26 require_relationship(['event'], data)
27 data['creator_id'] = current_identity.id
28
29 def after_create_object(self, session, data, view_kwargs):
30 if session.event.get_organizer():
31 event_name = session.event.name
32 organizer = session.event.get_organizer()
33 organizer_email = organizer.email
34 frontend_url = get_settings()['frontend_url']
35 link = "{}/events/{}/sessions/{}"\
36 .format(frontend_url, session.event_id, session.id)
37 send_email_new_session(organizer_email, event_name, link)
38 send_notif_new_session_organizer(organizer, event_name, link)
39
40 decorators = (api.has_permission('create_event'),)
41 schema = SessionSchema
42 data_layer = {'session': db.session,
43 'model': Session,
44 'methods': {'after_create_object': after_create_object
45 }}
46
47
48 class SessionList(ResourceList):
49 """
50 List Sessions
51 """
52
53 def query(self, view_kwargs):
54 query_ = self.session.query(Session)
55 if view_kwargs.get('track_id') is not None:
56 track = safe_query(self, Track, 'id', view_kwargs['track_id'], 'track_id')
57 query_ = query_.join(Track).filter(Track.id == track.id)
58 if view_kwargs.get('session_type_id') is not None:
59 session_type = safe_query(self, SessionType, 'id', view_kwargs['session_type_id'], 'session_type_id')
60 query_ = query_.join(SessionType).filter(SessionType.id == session_type.id)
61 if view_kwargs.get('microlocation_id') is not None:
62 microlocation = safe_query(self, Microlocation, 'id', view_kwargs['microlocation_id'], 'microlocation_id')
63 query_ = query_.join(Microlocation).filter(Microlocation.id == microlocation.id)
64 query_ = event_query(self, query_, view_kwargs)
65 if view_kwargs.get('speaker_id'):
66 speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')
67 # session-speaker :: many-to-many relationship
68 query_ = Session.query.filter(Session.speakers.any(id=speaker.id))
69 return query_
70
71 view_kwargs = True
72 methods = ['GET']
73 schema = SessionSchema
74 data_layer = {'session': db.session,
75 'model': Session,
76 'methods': {
77 'query': query
78 }}
79
80
81 class SessionDetail(ResourceDetail):
82 """
83 Session detail by id
84 """
85 def before_get_object(self, view_kwargs):
86 if view_kwargs.get('event_identifier'):
87 event = safe_query(self, Event, 'identifier', view_kwargs['event_identifier'], 'identifier')
88 view_kwargs['event_id'] = event.id
89
90 def after_update_object(self, session, data, view_kwargs):
91 """ Send email if session accepted or rejected """
92 if 'state' in data and (session.state == 'accepted' or session.state == 'rejected'):
93 # Email for speaker
94 speakers = session.speakers
95 for speaker in speakers:
96 frontend_url = get_settings()['frontend_url']
97 link = "{}/events/{}/sessions/{}" \
98 .format(frontend_url, session.event_id, session.id)
99 send_email_session_accept_reject(speaker.email, session, link)
100 send_notif_session_accept_reject(speaker, session.title, session.state, link)
101
102 # Email for organizer
103 if session.event.get_organizer():
104 organizer = session.event.get_organizer()
105 organizer_email = organizer.email
106 frontend_url = get_settings()['frontend_url']
107 link = "{}/events/{}/sessions/{}" \
108 .format(frontend_url, session.event_id, session.id)
109 send_email_session_accept_reject(organizer_email, session,
110 link)
111 send_notif_session_accept_reject(organizer, session.title,
112 session.state, link)
113
114
115 decorators = (api.has_permission('is_speaker_for_session', methods="PATCH,DELETE"),)
116 schema = SessionSchema
117 data_layer = {'session': db.session,
118 'model': Session,
119 'methods': {'before_get_object': before_get_object,
120 'after_update_object': after_update_object}}
121
122
123 class SessionRelationshipRequired(ResourceRelationship):
124 """
125 Session Relationship
126 """
127 schema = SessionSchema
128 decorators = (api.has_permission('is_speaker_for_session', methods="PATCH,DELETE"),)
129 methods = ['GET', 'PATCH']
130 data_layer = {'session': db.session,
131 'model': Session}
132
133
134 class SessionRelationshipOptional(ResourceRelationship):
135 """
136 Session Relationship
137 """
138 schema = SessionSchema
139 decorators = (api.has_permission('is_speaker_for_session', methods="PATCH,DELETE"),)
140 data_layer = {'session': db.session,
141 'model': Session}
142
```
--- END FILES ---
Please first localize the bug based on the issue statement, and then generate a patch according to the `git diff` format fenced by three backticks.
Here is an example:
```diff
diff --git a/examples/server_async.py b/examples/server_async.py
--- a/examples/server_async.py
+++ b/examples/server_async.py
@@ -313,4 +313,4 @@
if __name__ == "__main__":
- asyncio.run(run_async_server("."), debug=True)
+ asyncio.run(run_async_server(), debug=True)
diff --git a/examples/server_sync.py b/examples/server_sync.py
--- a/examples/server_sync.py
+++ b/examples/server_sync.py
@@ -313,5 +313,5 @@
if __name__ == "__main__":
- server = run_sync_server(".")
+ server = run_sync_server()
server.shutdown()
```
| diff --git a/app/api/sessions.py b/app/api/sessions.py
--- a/app/api/sessions.py
+++ b/app/api/sessions.py
@@ -15,6 +15,7 @@
from app.models.session_type import SessionType
from app.models.speaker import Speaker
from app.models.track import Track
+from app.models.user import User
from app.settings import get_settings
@@ -61,11 +62,15 @@
if view_kwargs.get('microlocation_id') is not None:
microlocation = safe_query(self, Microlocation, 'id', view_kwargs['microlocation_id'], 'microlocation_id')
query_ = query_.join(Microlocation).filter(Microlocation.id == microlocation.id)
+ if view_kwargs.get('user_id') is not None:
+ user = safe_query(self, User, 'id', view_kwargs['user_id'], 'user_id')
+ query_ = query_.join(User).filter(User.id == user.id)
query_ = event_query(self, query_, view_kwargs)
if view_kwargs.get('speaker_id'):
speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')
# session-speaker :: many-to-many relationship
query_ = Session.query.filter(Session.speakers.any(id=speaker.id))
+
return query_
view_kwargs = True
| {"golden_diff": "diff --git a/app/api/sessions.py b/app/api/sessions.py\n--- a/app/api/sessions.py\n+++ b/app/api/sessions.py\n@@ -15,6 +15,7 @@\n from app.models.session_type import SessionType\n from app.models.speaker import Speaker\n from app.models.track import Track\n+from app.models.user import User\n from app.settings import get_settings\n \n \n@@ -61,11 +62,15 @@\n if view_kwargs.get('microlocation_id') is not None:\n microlocation = safe_query(self, Microlocation, 'id', view_kwargs['microlocation_id'], 'microlocation_id')\n query_ = query_.join(Microlocation).filter(Microlocation.id == microlocation.id)\n+ if view_kwargs.get('user_id') is not None:\n+ user = safe_query(self, User, 'id', view_kwargs['user_id'], 'user_id')\n+ query_ = query_.join(User).filter(User.id == user.id)\n query_ = event_query(self, query_, view_kwargs)\n if view_kwargs.get('speaker_id'):\n speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')\n # session-speaker :: many-to-many relationship\n query_ = Session.query.filter(Session.speakers.any(id=speaker.id))\n+\n return query_\n \n view_kwargs = True\n", "issue": "User-Session: user/<id>/sessions returns all the sessions in system\n**I'm submitting a ...** (check one with \"x\")\r\n- [x] bug report\r\n- [ ] feature request\r\n- [ ] support request => Please do not submit support requests here, instead ask your query in out Gitter channel at https://gitter.im/fossasia/open-event-orga-server\r\n\r\nThe user API returns all the sessions in the system instead of sessions under the user\r\neg\r\nURL\r\n```\r\nhttps://open-event-api.herokuapp.com/v1/users/5/sessions\r\n```\r\n\r\nQuery Params:\r\n```\r\ninclude:event\r\nsort:starts-at\r\n```\r\n\r\n@poush @shubham-padia @enigmaeth @magdalenesuo Please have a look at it\n", "before_files": [{"content": "from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship\n\nfrom app.api.bootstrap import api\nfrom app.api.events import Event\nfrom app.api.helpers.db import safe_query\nfrom app.api.helpers.mail import send_email_new_session, send_email_session_accept_reject\nfrom app.api.helpers.notification import send_notif_new_session_organizer, send_notif_session_accept_reject\nfrom app.api.helpers.permissions import current_identity\nfrom app.api.helpers.query import event_query\nfrom app.api.helpers.utilities import require_relationship\nfrom app.api.schema.sessions import SessionSchema\nfrom app.models import db\nfrom app.models.microlocation import Microlocation\nfrom app.models.session import Session\nfrom app.models.session_type import SessionType\nfrom app.models.speaker import Speaker\nfrom app.models.track import Track\nfrom app.settings import get_settings\n\n\nclass SessionListPost(ResourceList):\n \"\"\"\n List Sessions\n \"\"\"\n def before_post(self, args, kwargs, data):\n require_relationship(['event'], data)\n data['creator_id'] = current_identity.id\n\n def after_create_object(self, session, data, view_kwargs):\n if session.event.get_organizer():\n event_name = session.event.name\n organizer = session.event.get_organizer()\n organizer_email = organizer.email\n frontend_url = get_settings()['frontend_url']\n link = \"{}/events/{}/sessions/{}\"\\\n .format(frontend_url, session.event_id, session.id)\n send_email_new_session(organizer_email, event_name, link)\n send_notif_new_session_organizer(organizer, event_name, link)\n\n decorators = (api.has_permission('create_event'),)\n schema = SessionSchema\n data_layer = {'session': db.session,\n 'model': 
Session,\n 'methods': {'after_create_object': after_create_object\n }}\n\n\nclass SessionList(ResourceList):\n \"\"\"\n List Sessions\n \"\"\"\n\n def query(self, view_kwargs):\n query_ = self.session.query(Session)\n if view_kwargs.get('track_id') is not None:\n track = safe_query(self, Track, 'id', view_kwargs['track_id'], 'track_id')\n query_ = query_.join(Track).filter(Track.id == track.id)\n if view_kwargs.get('session_type_id') is not None:\n session_type = safe_query(self, SessionType, 'id', view_kwargs['session_type_id'], 'session_type_id')\n query_ = query_.join(SessionType).filter(SessionType.id == session_type.id)\n if view_kwargs.get('microlocation_id') is not None:\n microlocation = safe_query(self, Microlocation, 'id', view_kwargs['microlocation_id'], 'microlocation_id')\n query_ = query_.join(Microlocation).filter(Microlocation.id == microlocation.id)\n query_ = event_query(self, query_, view_kwargs)\n if view_kwargs.get('speaker_id'):\n speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')\n # session-speaker :: many-to-many relationship\n query_ = Session.query.filter(Session.speakers.any(id=speaker.id))\n return query_\n\n view_kwargs = True\n methods = ['GET']\n schema = SessionSchema\n data_layer = {'session': db.session,\n 'model': Session,\n 'methods': {\n 'query': query\n }}\n\n\nclass SessionDetail(ResourceDetail):\n \"\"\"\n Session detail by id\n \"\"\"\n def before_get_object(self, view_kwargs):\n if view_kwargs.get('event_identifier'):\n event = safe_query(self, Event, 'identifier', view_kwargs['event_identifier'], 'identifier')\n view_kwargs['event_id'] = event.id\n\n def after_update_object(self, session, data, view_kwargs):\n \"\"\" Send email if session accepted or rejected \"\"\"\n if 'state' in data and (session.state == 'accepted' or session.state == 'rejected'):\n # Email for speaker\n speakers = session.speakers\n for speaker in speakers:\n frontend_url = get_settings()['frontend_url']\n link = \"{}/events/{}/sessions/{}\" \\\n .format(frontend_url, session.event_id, session.id)\n send_email_session_accept_reject(speaker.email, session, link)\n send_notif_session_accept_reject(speaker, session.title, session.state, link)\n\n # Email for organizer\n if session.event.get_organizer():\n organizer = session.event.get_organizer()\n organizer_email = organizer.email\n frontend_url = get_settings()['frontend_url']\n link = \"{}/events/{}/sessions/{}\" \\\n .format(frontend_url, session.event_id, session.id)\n send_email_session_accept_reject(organizer_email, session,\n link)\n send_notif_session_accept_reject(organizer, session.title,\n session.state, link)\n\n\n decorators = (api.has_permission('is_speaker_for_session', methods=\"PATCH,DELETE\"),)\n schema = SessionSchema\n data_layer = {'session': db.session,\n 'model': Session,\n 'methods': {'before_get_object': before_get_object,\n 'after_update_object': after_update_object}}\n\n\nclass SessionRelationshipRequired(ResourceRelationship):\n \"\"\"\n Session Relationship\n \"\"\"\n schema = SessionSchema\n decorators = (api.has_permission('is_speaker_for_session', methods=\"PATCH,DELETE\"),)\n methods = ['GET', 'PATCH']\n data_layer = {'session': db.session,\n 'model': Session}\n\n\nclass SessionRelationshipOptional(ResourceRelationship):\n \"\"\"\n Session Relationship\n \"\"\"\n schema = SessionSchema\n decorators = (api.has_permission('is_speaker_for_session', methods=\"PATCH,DELETE\"),)\n data_layer = {'session': db.session,\n 'model': Session}\n", "path": 
"app/api/sessions.py"}], "after_files": [{"content": "from flask_rest_jsonapi import ResourceDetail, ResourceList, ResourceRelationship\n\nfrom app.api.bootstrap import api\nfrom app.api.events import Event\nfrom app.api.helpers.db import safe_query\nfrom app.api.helpers.mail import send_email_new_session, send_email_session_accept_reject\nfrom app.api.helpers.notification import send_notif_new_session_organizer, send_notif_session_accept_reject\nfrom app.api.helpers.permissions import current_identity\nfrom app.api.helpers.query import event_query\nfrom app.api.helpers.utilities import require_relationship\nfrom app.api.schema.sessions import SessionSchema\nfrom app.models import db\nfrom app.models.microlocation import Microlocation\nfrom app.models.session import Session\nfrom app.models.session_type import SessionType\nfrom app.models.speaker import Speaker\nfrom app.models.track import Track\nfrom app.models.user import User\nfrom app.settings import get_settings\n\n\nclass SessionListPost(ResourceList):\n \"\"\"\n List Sessions\n \"\"\"\n def before_post(self, args, kwargs, data):\n require_relationship(['event'], data)\n data['creator_id'] = current_identity.id\n\n def after_create_object(self, session, data, view_kwargs):\n if session.event.get_organizer():\n event_name = session.event.name\n organizer = session.event.get_organizer()\n organizer_email = organizer.email\n frontend_url = get_settings()['frontend_url']\n link = \"{}/events/{}/sessions/{}\"\\\n .format(frontend_url, session.event_id, session.id)\n send_email_new_session(organizer_email, event_name, link)\n send_notif_new_session_organizer(organizer, event_name, link)\n\n decorators = (api.has_permission('create_event'),)\n schema = SessionSchema\n data_layer = {'session': db.session,\n 'model': Session,\n 'methods': {'after_create_object': after_create_object\n }}\n\n\nclass SessionList(ResourceList):\n \"\"\"\n List Sessions\n \"\"\"\n\n def query(self, view_kwargs):\n query_ = self.session.query(Session)\n if view_kwargs.get('track_id') is not None:\n track = safe_query(self, Track, 'id', view_kwargs['track_id'], 'track_id')\n query_ = query_.join(Track).filter(Track.id == track.id)\n if view_kwargs.get('session_type_id') is not None:\n session_type = safe_query(self, SessionType, 'id', view_kwargs['session_type_id'], 'session_type_id')\n query_ = query_.join(SessionType).filter(SessionType.id == session_type.id)\n if view_kwargs.get('microlocation_id') is not None:\n microlocation = safe_query(self, Microlocation, 'id', view_kwargs['microlocation_id'], 'microlocation_id')\n query_ = query_.join(Microlocation).filter(Microlocation.id == microlocation.id)\n if view_kwargs.get('user_id') is not None:\n user = safe_query(self, User, 'id', view_kwargs['user_id'], 'user_id')\n query_ = query_.join(User).filter(User.id == user.id)\n query_ = event_query(self, query_, view_kwargs)\n if view_kwargs.get('speaker_id'):\n speaker = safe_query(self, Speaker, 'id', view_kwargs['speaker_id'], 'speaker_id')\n # session-speaker :: many-to-many relationship\n query_ = Session.query.filter(Session.speakers.any(id=speaker.id))\n\n return query_\n\n view_kwargs = True\n methods = ['GET']\n schema = SessionSchema\n data_layer = {'session': db.session,\n 'model': Session,\n 'methods': {\n 'query': query\n }}\n\n\nclass SessionDetail(ResourceDetail):\n \"\"\"\n Session detail by id\n \"\"\"\n def before_get_object(self, view_kwargs):\n if view_kwargs.get('event_identifier'):\n event = safe_query(self, Event, 'identifier', 
view_kwargs['event_identifier'], 'identifier')\n view_kwargs['event_id'] = event.id\n\n def after_update_object(self, session, data, view_kwargs):\n \"\"\" Send email if session accepted or rejected \"\"\"\n if 'state' in data and (session.state == 'accepted' or session.state == 'rejected'):\n # Email for speaker\n speakers = session.speakers\n for speaker in speakers:\n frontend_url = get_settings()['frontend_url']\n link = \"{}/events/{}/sessions/{}\" \\\n .format(frontend_url, session.event_id, session.id)\n send_email_session_accept_reject(speaker.email, session, link)\n send_notif_session_accept_reject(speaker, session.title, session.state, link)\n\n # Email for organizer\n if session.event.get_organizer():\n organizer = session.event.get_organizer()\n organizer_email = organizer.email\n frontend_url = get_settings()['frontend_url']\n link = \"{}/events/{}/sessions/{}\" \\\n .format(frontend_url, session.event_id, session.id)\n send_email_session_accept_reject(organizer_email, session,\n link)\n send_notif_session_accept_reject(organizer, session.title,\n session.state, link)\n\n\n decorators = (api.has_permission('is_speaker_for_session', methods=\"PATCH,DELETE\"),)\n schema = SessionSchema\n data_layer = {'session': db.session,\n 'model': Session,\n 'methods': {'before_get_object': before_get_object,\n 'after_update_object': after_update_object}}\n\n\nclass SessionRelationshipRequired(ResourceRelationship):\n \"\"\"\n Session Relationship\n \"\"\"\n schema = SessionSchema\n decorators = (api.has_permission('is_speaker_for_session', methods=\"PATCH,DELETE\"),)\n methods = ['GET', 'PATCH']\n data_layer = {'session': db.session,\n 'model': Session}\n\n\nclass SessionRelationshipOptional(ResourceRelationship):\n \"\"\"\n Session Relationship\n \"\"\"\n schema = SessionSchema\n decorators = (api.has_permission('is_speaker_for_session', methods=\"PATCH,DELETE\"),)\n data_layer = {'session': db.session,\n 'model': Session}\n", "path": "app/api/sessions.py"}]} | 1,965 | 297 |